
Supreme Court Ruling Continues to Protect Google, Facebook and Twitter From What Users Post

WASHINGTON, DC – APRIL 19: The Supreme Court of the United States, on Wednesday, April 19, 2023 in Washington, DC. The High Court is expected to rule on whether to allow restrictions on the drug mifepristone ordered by a lower court to take effect, as abortion opponents seek to roll back FDA approval of mifepristone, which is used in the most common method of abortion in the United States. (Kent Nishimura / Los Angeles Times via Getty Images)
Kent Nishimura | Los Angeles Times | Getty Images
  • The Supreme Court declined to address the legal liability shield that protects tech platforms from being held responsible for their users' posts, the court said in an unsigned opinion on Thursday.
  • The decision leaves in place, for now, a broad liability shield that protects companies like Twitter, Meta's Facebook and Instagram as well as Google's YouTube from being held liable for their users' speech on their platforms.
  • The Supreme Court said it would vacate and remand the decision, sending it back to the Ninth Circuit to reconsider in light of the ruling in a separate but similar case, Twitter v. Taamneh.

The Supreme Court declined to address the legal liability shield that protects tech platforms from being held responsible for their users' posts, the court said in an unsigned opinion Thursday.

The decision leaves in place, for now, a broad liability shield that protects companies like Twitter, Meta's Facebook and Instagram as well as Google's YouTube from being held liable for their users' speech on their platforms.

The court's decisions in these cases come as a sigh of relief for tech platforms for now, but many members of Congress are still itching to reform the legal liability shield.

In the case, Gonzalez v. Google, the court said it would "decline to address the application" of Section 230 of the Communications Decency Act, the law that shields platforms from liability for their users' speech and also allows the services to moderate or remove users' posts. The court said it made that decision because the complaint "appears to state little, if any, plausible claim for relief."

The Supreme Court will send the case back to a lower court to reconsider in light of its decision on a separate but similar case, Twitter v. Taamneh.

In that case, the family of an American victim of a terrorist attack sought to hold Twitter accountable under anti-terrorism law for allegedly aiding and abetting the attack by failing to take enough action against terrorist content on its platform. In a decision written by Justice Clarence Thomas, the court ruled that such a claim could not be brought under that statute.

"As alleged by plaintiffs, defendants designed virtual platforms and knowingly failed to do 'enough' to remove ISIS-affiliated users and ISIS related content—out of hundreds of millions of users worldwide and an immense ocean of content—from their platforms," Thomas wrote in the court's unanimous opinion.

"Yet, plaintiffs have failed to allege that defendants intentionally provided any substantial aid to the Reina attack or otherwise consciously participated in the Reina attack—much less that defendants so pervasively and systemically assisted ISIS as to render them liable for every ISIS attack," he added, referring to the nightclub in Istanbul where the terrorist attack took place.

Many lawmakers see Section 230 as an unnecessary protection for a massive industry, though its proponents say the law also protects smaller players from costly lawsuits, since it helps to dismiss cases about users' speech at an earlier stage. But lawmakers remain divided on what form such changes should take, leaving major hurdles to any reform.

"This decision leaving Section 230 untouched is an unambiguous victory for online speech and content moderation," Jess Miers, legal counsel for Meta and Google-backed Chamber of Progress, said in a statement. "While the Court might once have had an appetite for reinterpreting decades of Internet law, it was clear from oral arguments that changing Section 230's interpretation would create more issues than it would solve. Ultimately, the Court made the right decision. Section 230 has made possible the Internet as we know it."

"This is a huge win for free speech on the internet," Chris Marchese, litigation center director for NetChoice, a group whose members include Google, Meta, Twitter and TikTok, said in a statement. "The Court was asked to undermine Section 230—and declined."

