Op-Ed

Moderation for moderators

January 3, 2022

If you get a nice settlement from one big company, you might as well go after another big company for the same thing. Also, how do you moderate something you can't see?

By Milt Policzer

Courthouse News columnist; racehorse owner and breeder; one of those guys who always got picked last.

Happy New Year everyone!

And now on to another of our usual depressing topics...

If a web company — say, a Facebook or a TikTok — can monitor grossness and violence, shouldn’t it be able to monitor falsehoods and defamation?

The answer is yes, it should. It doesn't, but it should.

I thought of this the other day after reading on the Courthouse News website — your primary and best source of news — about a new lawsuit filed against TikTok on behalf of moderators who have bad reactions to the horrible videos they have to moderate.

There’s some pretty gross stuff being posted. Psychological damage is understandable.

Still, horrible skeptic that I am, my first reaction to reading about this was to wonder how someone who wanted a job monitoring bad videos could complain about seeing bad videos. It seems like there’s a certain assumption of risk there.

But the videos are pretty horrifying — I’m not going to repeat the examples. So I guess I should have some empathy.

The lawsuit came after the law firm that filed it got a $52 million settlement from Facebook for the same thing. You all know by now how skeptical I am of most class action settlements, so I'm not that impressed by this.

Yeah, $52 million sounds like a lot until you realize that Facebook's 2020 net income (i.e., the net income it reported) was more than $29 billion. That's less than two-tenths of 1% of a single year's profit. So the settlement was paid out of change the company found under sofa cushions.

On top of that, you know that a big chunk of that money is going to go to the plaintiffs' lawyers. The settlement even specifies that the fee has to come out of the settlement fund.

Do you think that TikTok will get an attorney fee discount when it settles since the firm already did the legal research in the Facebook suit? Don’t count on it.

Facebook moderators — I have no idea how many there are — can claim $1,000 off the bat and get paid for therapy they can prove they needed. I suspect that won't come to too large a sum, while the rest of the $52 million (minus the attorney fee takeout) sits in escrow drawing interest that pays for the settlement. Unclaimed money goes back to Facebook and/or its insurance companies. That's the way these things work.

The Facebook settlement appears a little better than others because the company promises to make its “U.S. Facebook vendors” provide mental health counseling and clinicians, and also to “continue to roll out” tools to make videos easier to look at.

I’m not sure how “blurring images,” which is one of the listed tools, helps. How are you supposed to moderate stuff you don’t see?

But let’s say this stuff solves the problem. Why can’t there be serious monitoring of crackpot conspiracy theories and/or basic lies — and a class action on behalf of moderators being driven insane by nonsense?

For that matter, why hasn’t someone filed a class action on behalf of America for emotional distress? Come on, class action lawyers. There’s money to be made here.

For now, though, big web companies just don't want to moderate content that doesn't drive away viewers. They may say they do, but there's money to be made in crazed clicking. There's also a federal law (47 U.S.C. § 230) that says computer services are not to be treated as "the publisher or speaker of any information provided by another information content provider."

If internet companies were held liable the same way that newspaper publishers and TV and radio broadcasters are, there would suddenly be some serious moderating (i.e. editing). Teams of lawyers would have lots of work poring over content and annoying the heck out of writers. It would be a huge boon to the legal economy.

How do you monitor billions of opinionated posts and vacation pictures for truth?

Texas has the answer — a bounty-hunting law. The free market will provide.

Someone let Governor Newsom know about this.
