Rising Tide of Misinformation, Extremism Puts Big Tech in Crosshairs

Lawmakers pushed for accountability on the spread of misinformation online; the CEOs of Facebook, Twitter and Google pushed back.

Facebook CEO Mark Zuckerberg testifies remotely before the House on Thursday, March 25. (Image via Courthouse News)

WASHINGTON (CN) — Google CEO Sundar Pichai, Twitter CEO Jack Dorsey and Facebook CEO Mark Zuckerberg have all testified before Congress several times before. They did so again Thursday in what seemed to be a last chance for the executives to make their case before Congress imposes sweeping regulations on the industry.

In both the House and the Senate, several bills are already underway that would impose new requirements on everything from content moderation to data privacy. 

Meanwhile at Thursday’s hearing of the House Energy and Commerce Committee, members made little attempt to hold back their frustration with the executives. 

“Gentlemen, let me tell you this,” Representative Bill Johnson of Ohio said at the hearing. “There’s a lot of smugness among you. There’s this air of untouchableness in your response to many of the tough questions you’ve been asked.”

Criticism of how the companies moderate content has ramped up since the discovery that extremists who stormed the U.S. Capitol on Jan. 6 had organized on social media platforms weeks before the riot. But the CEOs showed little appetite this afternoon for the dose of comeuppance that lawmakers were serving. 

“We did our part to secure the integrity of the election. And then on Jan. 6, President Trump gave a speech … calling on people to fight,” Zuckerberg said in his opening remarks. “I believe that the division we see today is primarily the result of a political and media environment that drives Americans apart.”

Each executive kept blame for their own companies to a minimum. “Certainly there was content on our services,” Zuckerberg said.

Google’s Pichai said, “We always feel some sense of responsibility.” 

And Twitter’s Dorsey said: “Yes, but you also have to take into consideration a broader ecosystem. It’s not just the technology platforms we use.”

Lawmakers were relentless in telling the executives to only answer with yes or no during the hours-long hearing, leading Dorsey to create his own yes-no poll on Twitter as the meeting dragged on. 

“Your multitasking skills are impressive,” Representative Kathleen Rice jabbed at Dorsey, after asking him if yes or no was winning.

“Congresswoman, these are nuanced issues,” Zuckerberg said in response to California Representative Anna Eshoo’s irritation at the executives for not answering with one word. 

Earlier this week, Facebook said that it disabled more than 1.3 billion fake accounts between October and December and has more than 35,000 employees working on content moderation. Google-owned YouTube said it removed tens of thousands of videos pushing voter-fraud claims in December, and Twitter said that it has removed 8,400 tweets and challenged 11.5 million accounts that spread misleading Covid-19 information. All three platforms banned former President Donald Trump after the riot. 

Meanwhile, extremist groups continue to thrive, and lawmakers accuse the platforms of not doing enough to silence them. In their memo, committee members cited multiple studies on the continued prevalence of widespread misinformation. 

“Your business model itself has become the problem, and the time for self-regulation is over. It’s time we legislate to hold you accountable,” said Committee Chairman Frank Pallone Jr. 

Zuckerberg proposed raising the bar set by Section 230 of the Communications Decency Act, a legal shield that protects platforms from liability for what their users post. Instead of granting blanket immunity, Zuckerberg said, the law should require companies to have robust systems in place for identifying and removing illegal content.

Zuckerberg said that it’s reasonable to expect large companies to have effective moderation systems, but not reasonable to expect that there are never any errors. 

“The reality is that any system is going to make mistakes,” Zuckerberg said. “There’s going to be content that we take down that we should have left up, and there’s going to be content that we miss that we should have taken down.”

Pichai and Dorsey said they were open to changes to Section 230, but declined to endorse Zuckerberg’s proposal. 

“Our process to moderate content is designed to consistently evolve. We observe what’s happening on our service, we work to understand the ramifications, and we use that understanding to strengthen our operations,” Dorsey said. “We push ourselves to improve based on the best information we have.”

Republican lawmakers also grilled the executives on the negative effects that social media has on children and teens, pointing to cyberbullying, harms to mental health and child pornography. 

“What I do want to hear is what you will do to bring our country back from the fringes and stop the poisonous practices that drive depression, isolation and suicide,” said Representative Gus M. Bilirakis of Florida. “Our kids are being lost, while you say you will try to be better, as we’ve heard countless times already.”

Representative Cathy McMorris Rodgers of Washington questioned Zuckerberg on whether the platform's business model is designed to keep users on the site for as long as possible, but Zuckerberg said that was a common misconception.

“You have broken my trust,” Rodgers told the executives. “Your platforms are my biggest fear as a parent.”

Committee members also grilled the executives on advertisers’ abilities to unfairly target minorities, the use of algorithms to inflame social tensions, the lack of workforce diversity and bias against conservatives. 
