Last week, Facebook rolled out the biggest change to its News Feed in years. Content from publishers and brands—which have spent the last decade obsessing over even minor changes to the News Feed’s algorithm—is out. Posts shared by friends and family, the foundation of Facebook’s initial village square appeal, are back in. These changes come in response to heightened scrutiny of Facebook’s role in promoting fake news during the 2016 election, and they mark not only a shift in focus for the company, but also a retreat from its ambitions to become one of the primary arbiters of America’s information ecosystem.

On the whole, disentangling Facebook from the news business is a good thing, for media organizations and for readers, and, by extension, for a democracy predicated on an informed voting public. The ultimate goal of the changes, according to CEO Mark Zuckerberg, is to ensure that “time on the site was well spent.” The company that once believed it could change the world—in a good way, not the Donald Trump-becoming-president way—is now apparently more concerned with simply making its users feel good. But this does not mean Facebook has returned to being the warm and fuzzy Facebook of old. The changes are mostly cosmetic, part of a nervous PR campaign in anticipation of expected regulatory challenges and political battles.

Smart publishers had already begun to distance themselves from Facebook, adjusting to a world in which the social network wasn’t the biggest driver of traffic. Facebook was turning the screws, making paid advertising a necessity for publishers that wanted their posts seen by readers. (Zuckerberg did not alter the company’s advertising policies, meaning that publishers with the resources can still pay to play, if they wish.) And even before then, it was clear that Facebook was not a reliable partner. It is fond of 180-degree turns, as any publisher that pivoted to video or adopted Upworthy-style headlines will testify.

The new focus on positive interactions can be read as an acknowledgment that, having moved fast and broken things, Facebook is uncomfortable with the chaos it helped create. Its failure to act on the 2016 disinformation campaign, in particular, looms large in its decision to scale back its dissemination of hard news. Facebook became the most important content delivery system in the world in a very short period of time, but it never took its responsibilities all that seriously. And when the company tried its hand at editorial work, it was a disaster—its small editorial shop was shut down amid reports that it suppressed conservative news.

Facebook says it wants to go back to basics. For one thing, prioritizing content from friends and family is good business. “Facebook has research showing that if the percentage of friend/family content gets too low then people don’t find Facebook valuable anymore,” a former senior employee told BuzzFeed’s Charlie Warzel. Facebook’s stock dropped on the news, based on the assumption that revenue from brands and publishers will fall in the short term, but Zuckerberg has been adamant that these changes are best for the company in the long term.

For another, Facebook is signaling to increasingly suspicious lawmakers that it is absolutely not a dangerous platform with the potential for monopoly-like powers over the media industry. In this sense, these moves could be understood as an extension of Zuckerberg’s misunderstood listening tour. Though the tour was widely interpreted as the foundation of a possible presidential run, Zuckerberg has really been on a PR blitz to protect his company from government scrutiny—highlighting Facebook’s positive contributions to society, as well as its willingness to change in the face of constructive criticism.

Because of the Russian government’s interference in the 2016 election—which involved both disseminating fake news and planting ads on Facebook—the News Feed has become Facebook’s most scrutinized product, particularly on Capitol Hill. So Zuckerberg is attempting to distance the company from that product. Having already acknowledged mistakes during the 2016 election, Facebook is now showing that it’s taking care of those mistakes—as an internal matter.

Whether these fixes are more than a Band-Aid remains to be seen. Backing away from news suggests that Facebook is willing to relinquish some of its power. But there is no real evidence yet that it can provide responsible oversight of its products.