Once Again, Rather Than Deleting Terrorist Propaganda, YouTube Deletes Evidence Of War Crimes
It really was just last week that we were discussing the problems with telling platforms like YouTube to remove videos concerning "violent extremism," because it's often tough to tell the difference between videos that most people would find acceptable and videos those same people would want taken down. But in that post, we also linked back to a story from 2013 in which -- after getting pressure from then-Senator Joe Lieberman -- YouTube started removing "terrorist" videos, and in the process deleted a channel of people documenting atrocities in Syria.
It appears that history is now repeating itself: YouTube is taking some grief because (you guessed it) its effort to keep extremist content off its platform has resulted in the deletion of channels that were documenting evidence of war crimes in Syria.
YouTube is facing criticism after a new artificial intelligence program monitoring "extremist" content began flagging and removing masses of videos and blocking channels that document war crimes in the Middle East.
Middle East Eye, the monitoring organisation Airwars and the open-source investigations site Bellingcat are among a number of sites that have had videos removed for breaching YouTube's Community Guidelines.
This comes just days after YouTube announced it was expanding its program to remove "terror content" from its platform -- an expansion that was supposed to include better "accuracy." Oops.
Again, there are no easy answers here. You can certainly understand why no platform wants to host actual terrorist propaganda. And platforms should have the right to host, or decline to host, whatever content they want. The real issue is that we have more and more people -- including politicians -- demanding that these platforms regulate, filter, and moderate the content on their platforms to remove "bad" speech. But in the more than four years since we last wrote about the shutdown of a channel documenting atrocities, no one has explained to me how these platforms can distinguish videos celebrating atrocities from videos documenting atrocities. And this gets even more complicated when you realize: sometimes those are the same videos. And sometimes, letting terrorists or others post evidence of what they're doing makes it easier for people to stop that activity.
There is plenty of "bad" content out there, but the knee-jerk reaction that we need to censor it and take it down ignores how frequently that approach is likely to backfire -- as it clearly did in this case.