The site’s torrid week shows what a challenge moderation has become for big tech firms.
The news
YouTube announced new rules on hate speech on Wednesday, prohibiting videos that promote Nazi ideology or deny well-documented violent events such as the Holocaust and the shooting at Sandy Hook Elementary School.
Thousands of channels are expected to be shut down.
But multiple teachers are already complaining that videos uploaded to educate people about Nazi history have been deleted, the Guardian reported.
The Nazi problem
Hate speech, and how to police it, isn’t a new issue for YouTube or other social platforms. But this episode shows just how fraught and complex that balance is, and it highlights the risk of unintended consequences when policies and algorithms are tweaked.
It’s also a reminder of just how much power big tech companies have as gatekeepers of the material we consume online.
No excuses
This complexity doesn’t excuse YouTube, which is owned by Google, from its responsibilities.
This week the New York Times revealed that YouTube’s recommendation engine (which drives 70% of all views on the platform) has made it easier for pedophiles to find videos of children.
A US senator has said YouTube should simply stop recommending videos of minors altogether, but given that a large chunk of YouTube’s audience is made up of kids, the company is unlikely to take that step.
It also flip-flopped over whether to ban a right-wing personality for harassing a gay Latino journalist.
Source: MIT Technology Review