WASHINGTON (CN) — In a move that could upend decades of internet protection by Congress, Senator Lindsey Graham said Tuesday that the risk of child predation demands that social media companies earn the immunity they enjoy today.
The proposal came in the final moments of a hearing by the Senate Judiciary Committee on the exploitation of children by digital-media platforms.
Though the Communications Decency Act of 1996 protects internet service providers from being treated as the publisher of the content on their website, Graham, who chairs the committee, said the bar should be higher for companies like YouTube, Instagram and Snapchat that have hundreds of millions of child users.
“Things would change tomorrow if you could get sued,” Graham said, asserting that these entities should be required to meet more stringent standards known as best business practices.
Testifying before the Senate, a panel of experts drew a harrowing picture of sexual predators using social media platforms as hunting grounds. They said the Federal Trade Commission needs pressure to enforce laws already on the books.
Not finding the current options sufficient, however, Graham said the country needs a new regulatory body to create and enforce child-protection standards, in consultation with social media companies and outside organizations.
“I think that is a good place to start, and I would hate to be the Democrat or Republican who is against that one,” Graham said following the hearing.
Senator Richard Blumenthal, the committee’s ranking member, called out YouTube specifically for its response last month to reports that its recommendation system promotes videos that sexualize minors.
“Parents could post innocent videos of their kids and YouTube’s algorithm might shepherd the wrong people to them,” said Blumenthal, a Democrat from Connecticut.
Senator Josh Hawley, a Republican from Missouri, denounced YouTube’s approach to protecting children as callous.
He said ads on automatically recommended videos generate 70% of the company's business, bringing in revenue that would be lost if the company stepped up enforcement.
Graham and fellow committee members voiced alarm at the testimony by Christopher McKenna, founder and CEO of the organization Protect Young Eyes.
Following reports from CNN in March that Instagram had become the leading social-media platform by which sexual predators groom their victims, Protect Young Eyes created two accounts on the app in the style of a young female user.
“Within a week we had dozens of men sending us images of their penises, telling us they were horny,” McKenna said. “Even after we told all of them that we were ‘only 12.’ They were relentless.”
Even on private accounts, teens are targeted by predators at alarming rates, McKenna said. One poll he cited found that, out of 2,000 teens with private accounts, 75% received pornographic direct messages from strangers.
Asked by Graham what happened to the predators, McKenna said his organization reported the messages but with little impact.
“In our experience, because you can set up a new Instagram account within seconds,” McKenna said, “all that does is create a blip in their behavior.”
The committee also grappled with the lack of descriptions in Apple's App Store for various social media applications that might inform parents about the dangers their children could face.
But experts told the committee that the video game industry has proven it can comply with regulations to clearly communicate whether products are appropriate for underage users.
“There are volumes of information telling you exactly what goes on, what happens to certain body parts and blood and gore,” McKenna said. “You walk away from this description as a parent with a clear view as to what I am going to be putting in front of my children.”
Committee members also inquired how a user's age can be taken into account when accessing certain platforms or features. Both Instagram and Snapchat currently rate their apps as appropriate for users 12 and older.
But the companies fail to program the apps with child-protection settings as the default. McKenna said this helps create a symbiotic relationship that puts kids in danger.
“Many Instagram predators quickly shuttle kids over to Snapchat, where evidence disappears,” McKenna said. “For example, one relentless pedophile on Instagram, who calls himself ‘Daddy,’ invites young girls to join him in Snapchat with Instagram posts like ‘12+ slave girls.’”
Confident that “brilliant organizations” like YouTube and Instagram have the know-how to address each problem set before the committee, McKenna attributed their inaction to a lack of personal responsibility to safeguard the well-being of kids on their platforms.
“It is not until they are pushed, or there is reputational damage that they seem to move,” McKenna said. “And even then, in the case of YouTube, they are still unwilling to do some very simple things to protect children.”