(CN) – Tired of waiting while social media platforms and websites hem and haw over what to do about illegal content, the European Commission on Thursday set the bar high, at least when it comes to terrorist content: Pull it down within an hour.
The commission has been wrangling with Silicon Valley tech giants like Facebook, Google and Twitter since 2016 on how best to combat illegal online content like terror threats, child pornography and hate speech.
After proposing a 24-hour threshold for removing such content, the commission found compliance disappointing by the end of 2016: only 28 percent of content flagged as illegal hate speech was removed during the survey period, and less than 40 percent of flags were even reviewed within the required 24 hours.
The commission spent much of 2017 consulting with the online giants, then issued new guidelines that netted better results. Currently, internet companies remove 70 percent of all flagged material, most of the time within 24 hours. But the commission says the threat to the public continues to grow as hate speech, terrorist threats and illegal material proliferate in online communities.
“Online platforms are becoming people’s main gateway to information, so they have a responsibility to provide a secure environment for their users. What is illegal offline is also illegal online,” digital commissioner Andrus Ansip said in a statement. “While several platforms have been removing more illegal content than ever before – showing that self-regulation can work – we still need to react faster against terrorist propaganda and other illegal content which is a serious threat to our citizens’ security, safety and fundamental rights.”
While Thursday’s recommendations apply to illegal content broadly, the commission focused particularly on terrorist content.
“Terrorist content online poses a particularly grave risk to the security of Europeans, and its proliferation must be treated as a matter of the utmost urgency,” the commission said in a statement, and urged the removal of all terror threats and propaganda within one hour of detection.
Online companies must also come up with ways to keep illegal content from reappearing, and the commission urged them to report on their efforts every 90 days at a minimum.
In a statement, the Computer and Communications Industry Association – a trade group that speaks for tech giants like Facebook and Google – said the commission’s one-hour recommendation is impossible to meet.
“Such a tight time limit does not take due account of all actual constraints linked to content removal and will strongly incentivize hosting services providers to simply take down all reported content,” the group said.
The commission said it will consider proposing legislation if the platforms are unable to adequately comply.