SAN JOSE (CN) — Several black content creators filed a federal class action Wednesday accusing YouTube and Google of using racial profiling in algorithms and artificial intelligence that filter and censor their video content so it reaches fewer viewers, similar to the improper censorship of LGBTQ+ content creators that the tech giants said was fixed years ago.
In the 103-page lawsuit, with 136 pages of attachments, four content creators — Kimberly Carleste Newman, Lisa Cabrera, Catherine Jones and Denotra Nicole Lewis — claim YouTube’s content restrictions “are not the result of an identity and viewpoint-blind review and application of the rules.”
The women — whose YouTube channels have recorded millions of views — say the same “triggers” used by YouTube to profile its users and their viewpoints came under scrutiny in 2017, when LGBTQ+ content creators were censored via the “restricted mode” filtering tool based on the identity and orientation of the creator rather than the content posted.
YouTube and Google were sued in the Northern District of California over their censorship of LGBTQ+ content in 2019.
In the lawsuit filed Tuesday and made available Wednesday, the plaintiffs say YouTube never fixed its “targeting” of minority content creators and restricts access to their content because it competes with sponsored content and “click per minute” (CPM) advertising revenue.
“Instead of ‘fixing’ the digital racism that pervades the filtering, restricting, and blocking of user content and access on YouTube, defendants have decided to double down and continue their racist and identity based practices because they are profitable,” the plaintiffs’ attorney Peter Obstler wrote, with Browne George Ross of San Francisco.
“By utilizing their unilateral control over 95% of the world’s public video content, defendants unlawfully misappropriate viewers, CPM, advertising, and other revenues that belong to, or would otherwise be available to, plaintiffs and other third-party users, but for the discriminatory restrictions that unlawfully restrict and block plaintiffs’ content and access to YouTube services,” the complaint states.
The lawsuit comes a week after YouTube CEO Susan Wojcicki said the company was creating a $100 million fund “dedicated to amplifying and developing the voices of black creators and artists and their stories” after George Floyd’s death and subsequent protests.
But the plaintiffs say they “would prefer that defendants spend their money to stop the racist practices that pervade the YouTube platform,” including restrictions on videos that refer to current events such as “Black Lives Matter,” “police brutality” and the names of people killed by law enforcement.
They say the restrictions placed on their videos prevent them from generating revenue. They also say their accounts are “shadow banned” by not appearing in YouTube searches.
They say YouTube also removes accounts based on unconfirmed complaints, interferes with livestream broadcasts, excludes plaintiffs from “Trending” and “Up Next” recommendations, freezes analytics on subscribers and viewers, and fails to remove hate speech or process the plaintiffs’ appeals.
“Plaintiffs can no longer wait for defendants to implement the ‘fix’ they promised years ago. Nor should they have to. Whether defendants’ ‘motive’ for refusing to do so is based on profit, ideology, or ‘no reason at all,’ the knowing use of a person’s race, skin color or some other immutable personal trait or viewpoint to filter and review access to YouTube is digital racial profiling, redlining, and discrimination. It is illegal,” the women say.
They say YouTube has claimed immunity from liability for its content filtering and alleged discrimination based on Section 230 of the Communications Decency Act of 1996. In addition to claims for breach of contract, race discrimination, unfair business practices and First Amendment violations, they seek declaratory judgment that Section 230 does not shield online platforms from liability for race discrimination.
Section 230 immunity is under attack elsewhere. The lawsuit was released the same day the Justice Department issued recommendations to reform Section 230 so online platforms are not immune from being held accountable for illicit content posted by third parties on their sites.
Facebook took steps Wednesday to reduce hate speech online by removing 900 social media accounts on Facebook and Instagram tied to the Proud Boys and American Guard, two hate groups already banned by the platforms.
People associated with the groups discussed plans to bring weapons to protests of police killings of black people.
YouTube did not immediately return an emailed request for comment sent after office hours Wednesday.