But in its quest to swallow every part of the internet, Facebook incentivizes media outlets to publish their content -- video, images, text -- directly to Facebook. How's that going?
http://www.nytimes.com/2016/08/28/magazine/inside-facebooks-totally-insane-unintentionally-gigantic-hyperpartisan-political-media-machine.html
Not so well. We all see blatantly biased or false stories shared on Facebook, and here, the New York Times’ John Herrman looks at the companies behind many of them. "They have names like Occupy Democrats; The Angry Patriot; US Chronicle; Addicting Info; RightAlerts; Being Liberal; Opposing Views; Fed-Up Americans; American News; and hundreds more. Some of these pages have millions of followers; many have hundreds of thousands," he writes.
And unlike traditional media organizations, which have spent years trying to figure out how to lure readers out of the Facebook ecosystem and onto their sites, these new publishers are happy to live inside the world that Facebook has created. Their pages are accommodated but not actively courted by the company and are not a major part of its public messaging about media. But they are, perhaps, the purest expression of Facebook’s design and of the incentives coded into its algorithm — a system that has already reshaped the web and has now inherited, for better or for worse, a great deal of America’s political discourse.
Importantly, Facebook isn't combating this glut of fake news. You probably remember how, in response to the conservative outcry over alleged bias in its Trending Topics section, Facebook eliminated the feature's human editors altogether.
http://www.latimes.com/business/la-fi-election-media-20161109-story.html
In the Los Angeles Times, David Pierson makes the case that, given Facebook's huge influence, it needs to start taking responsibility for what it publishes. Of course, according to Facebook, its users are doing the publishing -- a stance that shields the company from liability for libel and defamation.
“If their goal is to simply retain user engagement by reaffirming everything users already believe without challenging them, then there are real consequences. They need to own up to that,” said Gabriel Kahn, co-director of the Media, Economics and Entrepreneurship program at USC's Annenberg School for Communication and Journalism.
Kahn said the proliferation of fake news reminded him of what the late Sen. Daniel Patrick Moynihan often liked to say: “Everyone is entitled to his own opinion, but not to his own facts.”
But the fake stories themselves aren't the only problem. There's also what Facebook chooses to show you.
https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles?language=en
The internet was once idealized as the free exchange of different, sometimes uncomfortable ideas; Facebook has made it a constant soothing reflection of one's own beliefs and interests. This idea is not new -- here's Eli Pariser's TED Talk on the online "filter bubble" from five whole years ago. It's very odd to watch this week.
Essentially, Facebook's algorithm takes the posts from our already-curated circle of friends, and then filters them to show us what it thinks we want to see. (The Wall Street Journal created an eye-opening tool that presents "red" and "blue" Facebook feeds side by side; try it out.) One of the narratives that arose quickly after Tuesday night was the so-called elite urban left's inability to engage with the disenfranchised rural right. On an individual level, this means Clinton supporters felt safe in their Facebook bubble, unaware of the size and passion of Trump supporters.
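To make that mechanism concrete, here's a toy sketch in Python. Everything in it is invented for illustration -- the page names are borrowed from Herrman's list above, the scoring function and weights are made up, and Facebook's actual ranking system is proprietary and vastly more complex -- but it shows how ranking purely by predicted engagement turns a mixed pool of posts into wall-to-wall agreement:

    # Toy model of an engagement-ranked feed. All scores and weights
    # are invented; this is not Facebook's actual algorithm.
    from dataclasses import dataclass

    @dataclass
    class Post:
        author: str
        stance: float  # -1.0 = strongly "blue", +1.0 = strongly "red"
        text: str

    def predicted_engagement(user_stance: float, post: Post) -> float:
        # Assume people engage most with posts matching their own stance:
        # 1.0 for perfect agreement, 0.0 for maximal disagreement.
        return 1.0 - abs(user_stance - post.stance) / 2.0

    def build_feed(user_stance: float, candidates: list, k: int = 3) -> list:
        # Rank every candidate by predicted engagement and keep only the top k.
        ranked = sorted(candidates,
                        key=lambda p: predicted_engagement(user_stance, p),
                        reverse=True)
        return ranked[:k]

    posts = [
        Post("Occupy Democrats", -0.9, "Trump can't win."),
        Post("Addicting Info", -0.8, "It's already over."),
        Post("Being Liberal", -0.7, "Polls show a comfortable lead."),
        Post("The Angry Patriot", +0.9, "The silent majority will turn out."),
        Post("US Chronicle", +0.6, "Rallies are overflowing."),
        Post("Wire service", 0.0, "The race is closer than it looks."),
    ]

    # A left-leaning user's top-3 feed is nothing but reassurance;
    # the neutral warning about a close race never makes the cut.
    for post in build_feed(user_stance=-0.8, candidates=posts):
        print(post.author, "-", post.text)

Flip the sign of user_stance and the same code produces the mirror-image feed: no post the user might disagree with ever surfaces, even though it sat in the candidate pool all along.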
So when your Democratic friends take pundits to task for not accurately predicting Trump's chances of winning -- if only we'd known, we might have fought harder, they think -- they also have to rethink Facebook's role in nurturing their sense of security. According to their timeline, there was no threat. Trump was going to lose. Everybody on their timeline said so!
That might have been good news for them. Meanwhile, on the other side of a divide reinforced by the same feel-good algorithm, Trump supporters consumed their own news -- in varying degrees of truthiness -- on Facebook.

UPDATE: Mark Zuckerberg has responded. NPR's Aarti Shahani reports:
http://www.npr.org/sections/alltechconsidered/2016/11/11/501743684/zuckerberg-denies-fake-news-on-facebook-had-impact-on-the-election
And the New York Times reports on the concern within Facebook's upper ranks about the company's influence on the election, and the all-hands meeting held for staff following Trump's victory: