Thursday, October 6, 2022

Gonzalez v. Google

Interesting case coming before the US Supreme Court:

Nohemi Gonzalez was a 23-year-old American studying in Paris, who was killed after individuals affiliated with the terrorist group ISIS opened fire on a cafĂ© where she and her friends were eating dinner. According to her family’s lawyers, she was one of 129 people killed during a November 2015 wave of violence in Paris that ISIS claimed responsibility for.

In the wake of Gonzalez’s murder, her estate and several of her relatives sued an unlikely defendant: Google. Their theory is that ISIS posted “hundreds of radicalizing videos inciting violence and recruiting potential supporters” to YouTube, which is owned by Google. Significantly, the Gonzalez family’s lawyers also argue that YouTube’s algorithms promoted this content to “users whose characteristics indicated that they would be interested in ISIS videos.”

The case here is not about merely allowing the content to be posted, which is clearly not illegal, but about algorithms directing people to it. That, say the plaintiffs, is effectively an endorsement of the videos by Google, and Google is responsible for that endorsement. So far courts have ruled for Google, but there have been dissents. My feeling is that holding Google liable for this now would amount to ex post facto lawmaking, since nobody knew this might be illegal at the time.

As to whether it should be illegal, that is a harder question. Suppose we passed a law saying Internet providers could be held liable for directing people to content that "promotes violence." Would that apply to any video cheering the progress of the Ukrainian army? If not, why not? What about videos from the Antifa rampage in Seattle? What if there were a tense labor dispute, and Google directed people to a video of a violent strike from the 1930s, and someone ended up dying in a picket line clash? 

Much as I hate extremist violence, and as dubious as I am of the algorithms used by YouTube, Facebook, et al., I think we have to be very careful about holding people liable for the promotion of violence online. It's the same thing with all free-speech issues: it's really hard to write a law that bans content you hate without also banning content you like, or content that scholars and reporters need to be able to find.

1 comment:

  1. Far more Islamic extremists are radicalized by the "collateral damage" of the US military dropping bombs on civilians than are ever radicalized by Youtube videos. Yet which one goes to court, hmm?
