How the Supreme Court ruling on Section 230 could end Reddit as we know it
As tech companies scramble in anticipation of a major ruling, some experts say community moderation online could be on the chopping block.
When the Supreme Court hears a landmark case on Section 230 later in February, all eyes will be on the biggest players in tech—Meta, Google, Twitter, YouTube.
A legal provision tucked into the Communications Decency Act, Section 230 has provided the foundation for Big Tech’s explosive growth, protecting social platforms from lawsuits over harmful user-generated content while giving them leeway to remove posts at their discretion (though they are still required to take down illegal content, such as child pornography, if they become aware of its existence). The case might have a range of outcomes; if Section 230 is repealed or reinterpreted, these companies may be forced to transform their approach to moderating content and to overhaul their platform architectures in the process.
But another big issue is at stake that has received much less attention: depending on the outcome of the case, individual users of sites may suddenly be liable for run-of-the-mill content moderation. Many sites rely on users for community moderation to edit, shape, remove, and promote other users’ content online—think Reddit’s upvotes, or changes to a Wikipedia page. What might happen if those users were forced to take on legal risk every time they made a content decision?
In short, the court could change Section 230 in ways that won’t just impact big platforms; smaller sites like Reddit and Wikipedia that rely on community moderation will be hit too, warns Emma Llansó, director of the Center for Democracy and Technology’s Free Expression Project. “It would be an enormous loss to online speech communities if suddenly it got really risky for mods themselves to do their work,” she says.
In an amicus brief filed in January, lawyers for Reddit argued that its signature upvote/downvote feature is at risk in Gonzalez v. Google, the case that will reexamine the application of Section 230. Users “directly determine what content gets promoted or becomes less visible by using Reddit’s innovative ‘upvote’ and ‘downvote’ features,” the brief reads. “All of those activities are protected by Section 230, which Congress crafted to immunize Internet ‘users,’ not just platforms.”
At the heart of Gonzalez is the question of whether the “recommendation” of content is different from the display of content; this is widely understood to have broad implications for recommendation algorithms that power platforms like Facebook, YouTube, and TikTok. But it could also have an impact on users’ rights to like and promote content in forums where they act as community moderators and effectively boost some content over other content.
Reddit is questioning where user preferences fit, either directly or indirectly, into the interpretation of “recommendation.” “The danger is that you and I, when we use the internet, we do a lot of things that are short of actually creating the content,” says Ben Lee, Reddit’s general counsel. “We’re seeing other people’s content, and then we’re interacting with it. At what point are we ourselves, because of what we did, recommending that content?”
Reddit currently has 50 million daily active users, according to its amicus brief, and the site sorts its content according to whether users upvote or downvote posts and comments in a discussion thread. Though it does employ recommendation algorithms to help new users find discussions they might be interested in, much of its content recommendation system relies on these community-powered votes. As a result, a change to community moderation would likely drastically change how the site works.
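To make the mechanism concrete: Reddit's exact ranking internals aren't described here, but the "hot" formula from the ranking code Reddit open-sourced years ago shows how community votes, rather than an opaque recommendation model, can drive what surfaces first. The sketch below is a simplified, illustrative version of that idea; the function and constant names are this example's own.

```python
from math import log10
from datetime import datetime, timezone

# Arbitrary reference point; only the *relative* age of posts matters.
EPOCH = datetime(2005, 12, 8, tzinfo=timezone.utc)

def hot(ups: int, downs: int, posted: datetime) -> float:
    """Rank a post by community votes plus recency.

    The log means the first few hundred votes matter most,
    while the time term steadily favors newer posts.
    """
    score = ups - downs
    order = log10(max(abs(score), 1))     # diminishing returns on votes
    sign = 1 if score > 0 else -1 if score < 0 else 0
    seconds = (posted - EPOCH).total_seconds()
    return sign * order + seconds / 45000  # ~12.5 hours per "point" of age
```

A listing sorted by this value descending is "front page" ordering: a heavily upvoted post outranks a lightly upvoted one of the same age, and an old post eventually sinks no matter its score. The point relevant to Gonzalez is that every term here except the timestamp is a direct product of individual users' votes.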
“Can we [users] be dragged into a lawsuit, even a well-meaning lawsuit, just because we put a two-star review for a restaurant, just because like we clicked downvote or upvote on that one post, just because we decided to help volunteer for our community and start taking out posts or adding in posts?” Lee asks. “Are [these actions] enough for us to suddenly become liable for something?”
An “existential threat” to smaller platforms
Lee points to a case in Reddit’s recent history. In 2019, in the subreddit /r/Screenwriting, users started discussing screenwriting competitions they thought might be scams. The operator of those alleged scams went on to sue the moderator of /r/Screenwriting for pinning and commenting on the posts, thus prioritizing that content. The Superior Court of California in LA County excused the moderator from the lawsuit, which Reddit says was due to Section 230 protection. Lee is concerned that a different interpretation of Section 230 could leave moderators, like the one in /r/Screenwriting, significantly more vulnerable to similar lawsuits in the future.
“The reality is every Reddit user plays a role in deciding what content appears on the platform,” says Lee. “In that sense, weakening 230 can unintentionally increase liability for everyday people.”
Llansó agrees that Section 230 explicitly protects the users of platforms, as well as the companies that host them.
“Community moderation is often some of the most effective [online moderation] because it has people who are invested,” she says. “It’s often … people who have context and understand what people in their community do and don’t want to see.”
Wikimedia, the foundation that created Wikipedia, is also worried that a new interpretation of Section 230 might usher in a future in which volunteer editors can be taken to court for how they deal with user-generated content. All the information on Wikipedia is generated, fact-checked, edited, and organized by volunteers, making the site particularly vulnerable to changes in liability afforded by Section 230.
“Without Section 230, Wikipedia could not exist,” says Jacob Rogers, associate general counsel at the Wikimedia Foundation. He says the community of volunteers that manages content on Wikipedia “designs content moderation policies and processes that reflect the nuances of sharing free knowledge with the world. Alterations to Section 230 would jeopardize this process by centralizing content moderation further, eliminating communal voices, and reducing freedom of speech.”
In its own brief to the Supreme Court, Wikimedia warned that changes to liability will leave smaller technology companies unable to compete with the bigger companies that can afford to fight a host of lawsuits. “The costs of defending suits challenging the content hosted on Wikimedia Foundation’s sites would pose existential threats to the organization,” lawyers for the foundation wrote.
Lee echoes this point, noting that Reddit is “committed to maintaining the integrity of our platform regardless of the legal landscape,” but that Section 230 protects smaller internet companies that don’t have large litigation budgets, and any changes to the law would “make it harder for platforms and users to moderate in good faith.”
To be sure, not all experts think the scenarios laid out by Reddit and Wikimedia are the most likely. “This could be a bit of a mess, but [tech companies] almost always say that this is going to destroy the internet,” says Hany Farid, professor of engineering and information at the University of California, Berkeley.
Farid supports increasing liability related to content moderation and argues that the harms of targeted, data-driven recommendations online justify some of the risks that come with a ruling against Google in the Gonzalez case. “It is true that Reddit has a different model for content moderation, but what they aren’t telling you is that some communities are moderated by and populated by incels, white supremacists, racists, election deniers, covid deniers, etc.,” he says.
Brandie Nonnecke, founding director at the CITRIS Policy Lab, a social media and democracy research organization at the University of California, Berkeley, emphasizes a common viewpoint among experts: that regulation to curb the harms of online content is needed but should be established legislatively, rather than through a Supreme Court decision that could result in broad unintended consequences, such as those outlined by Reddit and Wikimedia.
“We all agree that we don’t want recommender systems to be spreading harmful content,” Nonnecke says, “but trying to address it by changing Section 230 in this very fundamental way is like a surgeon using a chain saw instead of a scalpel.”
Has it occurred to anyone that it is totally fricking insane for the Supreme Court to be legislating about how the internet should work? It's a panel of 9 lawyers whose only qualification is that they're Catholic, Jewish, or black. When did we give them the power to decide what kind of society we're going to live in? People whine about the "establishment" or the "deep state" but Jesus Fricking Christ, why do we let the Bar Association just openly dictate every aspect of our lives?
Word. I hate Reddit and want to watch it burn but this is a can of worms that should be left alone.
Dramaphobic
@bbbb is so glad you feel that way! Reddit is a terrible, toxic website and it's only a matter of time before it collapses. Good riddance!
Most of us just got used to the internet being a Wild West, mostly outside of government care, for the first few decades since its inception.
Nowadays, pretty much everyone who got onto the internet post-iPhone 4 treats it as an extension of the real world.
Obviously, the feds will start expecting more control over it, just like they have over the real world.
Yeah we have expectations of how the internet should work but it's mostly just based on attitudes people had in the 1990s. The country has changed so much since then, especially the left flipping from defending free speech at all costs to calling free speech an "alt-right dogwhistle". Most of this stuff wasn't actually laid down in law so it can be changed on a whim by the FCC or the courts.
they're finding a provision of a law unconstitutional not legislating lol
That's what they said about Roe v. Wade, banning the death penalty, banning serious punishments for male feminists and child molesters, banning campaign finance laws...
How many fricking times do the courts have to impose something on the country that's explicitly against the will of voters before you peepeesuckers realize that maybe you shouldn't trust lawyers?
r-slur confirmed. also roe v wade was r-slurred because it invented a new right that wasn't there. i'm not in favor of these things but a court saying "no no this law went too far" is a huge difference from "this law is illegal because it infringes on a right we just made up right here". the court is course correcting nicely
also
nooooooooo what a loss, the justice system never makes mistakes we should kill anyone convicted of bad crimes thereby destroying any chance to compensate people falsely convicted
I wonder why you chose this as the hill you want to die on. Do you have some personal experience you'd like to share with us?
lmao im not playing the misquoting game r-slur, choose to ignore the false convictions argument if you want
That's not very dramatic of you.
it’s more dramatic to watch you seethe and then continue to watch you cope and dilate when called out on your r-slurred seething. low effort high roi
Because voters and lawmakers are too r-slurred to understand how the laws they create actually work?
Because even with those limited qualifications they are still more competent than 99% of people out there?
theyve already decided these platforms should get all the power and none of the responsibility, now they might decide the opposite. either way itll suck and either way i dont really care because itll suck. maybe having it so you cant just edit shit willy nilly will lead to a chudpocalypse, more pc shit or somewhere in the middle. maybe itll make people smarter. idk but it's drama either way
I support it because it disenfranchises jannies on a fundamental level. Or at least, the most annoying aspects of janniehood.
Because in the absence of true worship or culture Americans decided that they'd worship the founding fathers.