“For me, this isn’t a journalism issue,” said Lynn Walsh, president of the Society of Professional Journalists. “This isn’t even a media issue. To me, it’s about our country, and the future of our country.”
The fake-news scandal played out on Facebook this year against the backdrop of a hotly contested presidential election. With no sign of surrender from the forces of misinformation, Walsh’s organization is set to reach out in the coming days to urge the social-media giant to reconsider how it views itself in the media landscape.
Throughout Facebook’s lifetime, founder Mark Zuckerberg has passionately insisted that Facebook is “not a media company.” Over time, however, that position has become increasingly difficult to defend.
Forbes has called the Silicon Valley giant the “global Editor-in-Chief,” as the primary news source for 44 percent of the public.
If true, that statistic gives Facebook a reach broader than that of every newspaper combined.
Discussing her invitation for a summit in an exclusive interview, Walsh said she wants Facebook to reshape its thinking.
“I think six years ago, if someone said you’d primarily be getting your news from Facebook, they might have thought that was a crazy idea,” said Walsh, who is also an investigative executive producer for NBC 7 San Diego. “But that’s the reality we live in. The media landscape has evolved. Journalism has evolved, and continues to evolve. So I do hope that while it may not be the original thought that Facebook had, I think they should be now.”
As the country prepares for the inauguration of Donald Trump, a cascade of reports suggested that Facebook’s fake and junk news affected the vote earlier this month. BuzzFeed reported days before the election that one hoax-meme factory in Macedonia had hundreds of thousands of people convinced that Hillary Clinton’s indictment was imminent. Before the fake Nov. 1 story on World Politicus was taken down, it generated more than 140,000 shares, reactions, and comments.
The stats align with one analysis showing that readers engaged more with fake news than real news in the three months before Election Day.
One writer of fake news told The Washington Post: “I think Donald Trump is in the White House because of me.”
Zuckerberg initially waved off these concerns.
“Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way — I think is a pretty crazy idea,” he said on Nov. 11. “Voters make decisions based on their lived experience.”
The Facebook chief has since softened that position, however, vowing to right Facebook’s editorial ship with several reforms, including partnering with third-party verification organizations and learning from the traditional press.
“We will continue to work with journalists and others in the news industry to get their input, in particular, to better understand their fact checking systems and learn from them,” Zuckerberg wrote in a post on Saturday.
When pressed on this promise, Facebook’s spokesman Jonny Thaw declined to confirm that the company would accept SPJ’s invitation.
Walsh, for her part, expressed her willingness to explain to Facebook how the SPJ’s code of ethics can improve the platform.
“Don’t we want to make sure that we’re providing everyone – news, social media, social platforms – we’re providing people with the best, most accurate information, so they can make the most informed decisions?” she asked.
‘Facebook Already Makes a Number of Editorial Decisions’
Facebook staffers may not personally generate the content appearing on its site, but a leading analyst of online censorship notes that the website already acts like an editor.
“I think that Facebook already makes a number of editorial decisions on a daily basis,” the Electronic Frontier Foundation’s director Jillian York noted in a phone interview. “They have their own community standards. They enforce them. They ban women from nudity.”
For Facebook, those enforcing these standards work for an offshore company relying on Philippine labor paid below industry standards, Wired reported.
The website’s algorithms also act as an editor, letting users see just 6 percent of what their friends post, according to the Columbia Journalism Review.
“So, I think it’s disingenuous of Mark Zuckerberg to say that Facebook is not a media company,” York said.
The EFF reported just this Wednesday on an uptick in politically motivated censorship.
Scrutinizing 294 reported content take-downs, the study includes eye-popping visualizations showing a chronology of some of these decisions. Facebook has removed the work of journalists from the Philippines, Kazakhstan and the Palestinian territories.
The platform apologized after removing a live video of police shooting Philando Castile, an unarmed black man in Minnesota, calling the removal a “mistake.”
Unfathomably, the report notes that Facebook even removed an image of a cat dressed in a business suit.
“I would say they’re already conducting censorship on a number of levels,” York said, referring to Facebook. “They could be more transparent about their algorithms. They could do a better job of explaining how sites are pulled in. They could whitelist instead of blacklisting sites.”
A “whitelist” refers to a system of verifying and rewarding accurate content.
In May, Facebook replaced the human beings involved in trending news with algorithms, after conservatives claimed their news sources were being suppressed.
“I think that the reports that angered people suggested that they only had 10 or 12 sites that they were pulling from,” York explained. “That’s obviously problematic. I can understand why the pushback happened.”
But experts note that pulling down disinformation is not Facebook’s only option. It can identify dubious or unconfirmed stories.
“I do feel that they do have a responsibility to the public to editorialize their trending topics and to weed out or at least put some sort of content warnings on fake news,” York said.
In a phone interview Thursday, York endorsed the idea of Facebook appointing a public editor, whose role is to respond to criticism and supervise an outlet’s editorial and ethical conduct.
“Look at Margaret Sullivan at The New York Times,” York said. “I thought she’s one of the best public editors that I’ve ever seen. She did such a wonderful job. She’s now at the Washington Post.”
Sullivan put her own stamp of approval on the idea and took it a step further in a Nov. 20 column for The Washington Post.
“Now it’s time for a bolder move: Facebook should hire a top-flight executive editor and give that person the resources, power and staff to make sound editorial decisions,” Sullivan wrote.
BuzzFeed noted that one group of “renegade” staffers had even formed a Facebook Task Force to fix the platform.
York likened such an idea to Twitter’s Trust and Safety Council, made up of more than 60 nongovernmental groups addressing complaints of user-harassment on the site.
“Facebook also does consult with NGOs but they’re a lot more quiet about it,” York said. “I talk to them quite often. I would say that they are open to feedback and input. But I do think making a move to do that on a more public basis would be a good idea for them.”
‘Rumor Theory’ and ‘Rumor Cascades’
Political news aside, one journalism professor scrutinized how fake health stories on Facebook can have life-and-death consequences.
Jeffrey Riley, who teaches at Florida Gulf Coast University, explored this danger at length two years ago in his dissertation, “Shares, Likes, and Endorsement: Examining the Influence of Facebook Friends on Online Distribution of Health-based Misinformation.”
In his 119-page study, Riley started off with the hypothesis: “Individuals will be more likely to endorse a piece of health misinformation on Facebook if the misinformation is concerning a topic they know little about.”
Riley said that premise traces back to “rumor theory,” coined by U.S. sociologist H. Taylor Buckner more than 50 years ago.
“It was the middle name of a good friend of mine,” Riley said.
From there, Riley studied which students were the “power sharers” of their group.
“There was a very clear divide of the people who were much more likely to share it, versus the people who were much more willing to be more critical of it,” he said.
For these “power endorsers,” he said: “They shared the one about the common cold, and the one about Mertenitis, the fake disease, at the exact same level.”
Another piece of research shaped Riley’s work: an internal Facebook investigation, also from 2014, led by the company’s own data scientist Adrien Friggeri.
Friggeri’s research into “rumor cascades” found that Snopes.com, an internet news-verification service, did not slow down a false-news story that had been widely circulated.
“This points to individuals likely not noticing that the rumor was snoped, or intentionally ignoring that fact,” the 10-page study says.
Riley said he was not aware of data studying the toll on public health from Facebook-disseminated misinformation linking vaccines and autism. But he noted that fake news on this issue more generally has been fatal.
“One bad study from one British guy who made up a bunch of numbers has caused this enormous problem, where diseases that we previously thought we had wiped out of the country, like measles, are now making a comeback in places,” he said, referring to Andrew Jeremy Wakefield’s discredited paper on the subject.
“There are actually people, especially children and older people, who are actually dying from this,” Riley added.
The professor noted that these public health crises linked to fake news will outlast any presidential election.
Unlike a traditional newsroom, Facebook does not have editorial staff to respond to press inquiries, and its public-relations office ducked several questions about the revelations in this article with the following email.
“Thanks for your note,” spokesman Jonny Thaw said in an email. “Mark [Zuckerberg]’s post from last Friday covers several of these questions.”
The linked post was largely nonresponsive, and Thaw has not responded to follow-up questions.