
In a defense brief filed with the US Supreme Court this week, Google warned that changing Section 230 of the Communications Decency Act, which protects internet companies from being sued over content created by their users, would “disrupt the internet”.
The brief was filed as part of Google’s defense in a lawsuit brought by the family of Nohemi Gonzalez, a 23-year-old US citizen who was killed in the ISIS attacks in Paris in November 2015. Oral arguments in the case are scheduled for February 21.
The family claims Google-owned YouTube violated the Anti-Terrorism Act (ATA) when its algorithms recommended ISIS-related content to users. They argue that even if a company is not liable for hosting ISIS content itself, its algorithmic recommendations of that content should not be protected by Section 230.
The Gonzalez family argues that the algorithms Google and YouTube use to target content at users are created by the companies themselves, not by users or other third parties. The recommendations, they contend, are therefore essentially editorial functions for which the companies are responsible, and so are not protected by Section 230.
YouTube uses algorithms to sort and surface related videos that might interest viewers so they don’t have to wade through billions of unsorted videos. With the world expected to generate some 120 zettabytes of data in 2023, websites rely on algorithms to sift through billions of pieces of content and present information in the most useful form for specific users. The sites also allow users to curate content for others by liking or sharing images, videos and articles.
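As a rough illustration only (this is a toy sketch, not a description of YouTube’s actual system, and the names `Video`, `score` and `recommend` are invented for the example), a recommender of this kind might rank candidate videos by combining topical relevance to a user with engagement signals contributed by other users:

```python
# Toy illustration of content recommendation -- not YouTube's actual ranking system.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    tags: set[str]
    likes: int = 0   # user curation signals (likes/shares) feed the ranking
    shares: int = 0

def score(video: Video, user_interests: set[str]) -> float:
    # Topical relevance: how many of the video's tags match the user's interests.
    relevance = len(video.tags & user_interests)
    # Engagement: other users' likes and shares nudge the ranking.
    engagement = 0.01 * video.likes + 0.05 * video.shares
    return relevance + engagement

def recommend(candidates: list[Video], user_interests: set[str], k: int = 3) -> list[Video]:
    # Reduce a large unsorted catalog to the k most promising videos for this user.
    return sorted(candidates, key=lambda v: score(v, user_interests), reverse=True)[:k]

if __name__ == "__main__":
    catalog = [
        Video("Guitar lesson", {"music", "guitar"}, likes=900, shares=40),
        Video("Sourdough recipe", {"cooking", "bread"}, likes=1200, shares=80),
        Video("News roundup", {"news"}, likes=300, shares=10),
    ]
    for v in recommend(catalog, {"cooking", "news"}):
        print(v.title)
```

The legal question in Gonzalez is whether this kind of ranking and surfacing of third-party content is itself a protected publishing function or an editorial act the platform can be sued over.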
Lawmakers Attack Section 230 Internet Liability Shield
Section 230 has faced criticism from lawmakers on both sides of the aisle. Republicans have criticized Section 230 protections, saying they allow tech platforms to make potentially biased decisions about which posts to remove, while Democrats want platforms to take more responsibility for content moderation and make their services safer for users.
President Joe Biden has pushed for changes to Section 230, with his administration saying Section 230 protections should not extend to recommendation algorithms.
Google says in its brief that YouTube abhors terrorism, that it has taken increasingly effective steps to remove terrorist and other potentially harmful content, and that weakening Section 230 would make it harder to find and block such content.
The company also argues that if Section 230 and the protections it offers were rolled back, some companies might respond by removing far more content, while others might try to avoid liability by refusing to do any kind of filtering, essentially turning a blind eye and leaving everything up, no matter how unpleasant.
“You would be forced to choose between overly curated top sites or junk sites flooded with junk content,” the brief said, adding that “the legal risk of recommending or curating content would reduce the number of useful services,” such as showing the best job listings, the most relevant products, or the most useful videos with recipes, songs, or sources of news, entertainment and information.
Google also says that removing Section 230 would lead to a minefield of litigation. “A decision undermining Section 230 would have significant unintended and harmful consequences,” the brief said.
A similar case, Twitter v. Taamneh, is scheduled for oral argument on February 22. In that case, Twitter, Facebook and YouTube are alleged to have aided and abetted another ISIS attack.
Copyright © 2023 IDG Communications, Inc.