In a defense brief filed with the US Supreme Court this week, Google warned that altering Section 230 of the Communications Decency Act — which protects internet-based companies from being sued for content created by their users — would “upend the internet.”
The brief was filed as part of Google’s defense in a lawsuit brought by the family of Nohemi Gonzalez, a 23-year-old US citizen who was killed by ISIS in Paris in November 2015. Oral arguments for the case are set to be heard on February 21.
The family argues that Google-owned YouTube violated the Anti-Terrorism Act (ATA) when its algorithms recommended ISIS-related content to users. Even if Section 230 shields the company from liability for hosting the ISIS content itself, they argue, it should not shield the algorithmic recommendations of that content.
The Gonzalez family contends that the recommendation algorithms Google and YouTube use to target certain content to users are the creations of the companies themselves, not of users or other third parties, and as such amount to editorial functions for which the companies are responsible, placing the algorithms outside Section 230’s protection.
YouTube uses algorithms to sort and list related videos that may interest viewers, so they do not have to confront billions of unsorted videos. With the world on pace to share 120 zettabytes of data online in 2023, websites use algorithms to sift through billions of pieces of content and publish information in a form most useful to particular users. Websites also allow users to select content for others by liking or sharing pictures, videos, and articles.
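The ranking systems at issue are far more sophisticated than any short example, but a minimal sketch can illustrate the mechanism being litigated: a hypothetical scoring function that orders a catalog of videos by how closely their tags match a user’s viewing history. Every name and the scoring rule below are illustrative assumptions, not YouTube’s actual system.

from collections import Counter

# Hypothetical illustration only: a toy recommender that scores videos by how
# much their tags overlap with the tags of videos a user has already watched.
# Real platforms use learned models and many more signals.

def recommend(videos, watch_history, top_n=3):
    """Return the top_n videos whose tags best match the user's history."""
    # Count how often each tag appears in the user's watch history.
    interest = Counter(tag for video in watch_history for tag in video["tags"])

    def score(video):
        # A video scores higher the more of its tags the user has engaged with.
        return sum(interest[tag] for tag in video["tags"])

    ranked = sorted(videos, key=score, reverse=True)
    return ranked[:top_n]

catalog = [
    {"title": "Sourdough basics", "tags": ["baking", "recipes"]},
    {"title": "Guitar chords 101", "tags": ["music", "tutorial"]},
    {"title": "Weeknight pasta", "tags": ["cooking", "recipes"]},
]
history = [{"title": "Five easy dinners", "tags": ["cooking", "recipes"]}]

for video in recommend(catalog, history, top_n=2):
    print(video["title"])

In this toy version the recommendation is purely a function of third-party content and the user’s own activity, which is the kind of automated sorting the plaintiffs argue constitutes an editorial choice by the platform.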
Lawmakers attack Section 230 internet liability shield
Section 230, however, has faced criticism from lawmakers of all stripes. Republicans have criticized its protections, saying they allow tech platforms to make allegedly biased decisions about what posts to take down, while Democrats want platforms to take greater responsibility and expand their content moderation to make their services safer for users.
President Joe Biden has urged changes to Section 230, and his administration has stated that its protections should not extend to recommendation algorithms.
Google, in its brief, states that YouTube abhors terrorism and has taken increasingly effective actions to remove terrorist and other potentially harmful content, and that weakening Section 230 would make it harder to find and block such content.
The company also argues that if Section 230 and the protections it offers are withdrawn, some companies might comply, while others might seek to evade liability by declining to do any sort of filtering, essentially shutting their eyes and leaving up everything, no matter how objectionable.
“You would be left with a forced choice between overly curated mainstream sites or fringe sites flooded with objectionable content,” the brief states, adding that “legal risk for recommending or organizing content would reduce useful services like showing the best job listings, listing the most relevant products, or displaying the most helpful videos of recipes, songs, or sources of news, entertainment and information.”
Google also states that removing Section 230 would lead to a litigation minefield. “A ruling that undermines Section 230 would have significant unintended and harmful consequences,” according to the brief.
A similar case, Twitter v. Taamneh, is scheduled for oral arguments on February 22. In that case, Twitter, Facebook, and YouTube are alleged to have aided and abetted a different ISIS attack.