Deregulating the Digital Town Square: Who Controls the Conversation?
By Jennifer Monahan
“What’s happened today – which has good elements and sometimes precarious elements for our society – is that social media platforms provide a digital town square,” explained Distinguished Service Professor Dan Green, director of the Master of Entertainment Industry Management (MEIM) program. “People share information through social media in a much quicker way, engaging back and forth online.”
Tech leaders like Elon Musk and Mark Zuckerberg have positioned their platforms – X and Meta’s Facebook, respectively – as town squares, advocating for less intervention in content moderation. That approach could promote the free exchange of ideas, but it could also allow harmful content to spread unchecked, Green said.
At its best, the social media “town square” can serve as a place where people share and debate ideas. As privately owned platforms, however, each town square (e.g., X, Facebook, Reddit, TikTok, Instagram) sets its own content moderation standards. Depending on the standards for that platform, users might feel their speech is being unfairly restricted – or that misinformation, hate speech, and harassment are spreading unchecked.
As the landscape around social media regulation shifts, Green took some time to explain why this situation is so complex.
Increasing Industry Overlap
“Silicon Valley, Hollywood, Washington, D.C., and Wall Street are more interconnected in 2025 than ever before,” Green said.
Lines have blurred in media and technology. Companies such as Apple, Amazon, and Netflix, originally tech firms, now also function as major content creators and distribution platforms. These organizations own media channels, broadcast and stream content, answer to their shareholders, and are subject to government oversight through the Federal Communications Commission (FCC).
“Tech leaders are trying to court favor back in D.C. because they're concerned about Wall Street,” Green said. “Tech leaders don't want to be suppressed by the new administration. They’re trying to balance regulatory concerns with Wall Street pressures.”
This is one reason that tech leaders from Google, Meta, Amazon, and Apple attended President Trump’s most recent inauguration. Many of these same companies also donated millions to the inauguration activities.
Shortly before Trump was inaugurated in 2025, Zuckerberg announced that Meta would cease its fact-checking efforts. That shift illustrates the evolving relationship between tech companies and policymakers. The New York Times reported that the move would likely satisfy Trump, who has accused social media platforms of censoring conservatives.
“It is fascinating how quickly Silicon Valley is falling into line with the new administration,” Green said.
Social Media and the FCC: Deregulation, Misinformation, and Polarization
Most people know the FCC as the watchdog for TV, radio, satellite, cable, and broadband – the agency charged with holding media accountable. “The FCC serves as the eyes and ears of what’s being broadcast, or of what’s being shared in the town square,” Green said.
The FCC operates on the principle that the government should assess whether broadcast content serves the public interest. Advocates of limited government generally support less regulation and oversight, while those who favor a more active government typically want the FCC to play a greater role in content moderation.
Section 230 of the Communications Decency Act of 1996 protects social media companies from liability for user-generated content. In recent years, the FCC has claimed authority to interpret Section 230 as part of its mandate to regulate communications. That claim alarms critics who worry it could lead to excessive government control over online speech.
“I tend to be a fan of regulation despite its shortcomings,” said Kevin Stein, principal and co-founder of the experiential entertainment production company Signal Path Immersive and an adjunct professor in Heinz College’s MEIM program.
“We're clearly at a crossroads. Deregulation is an ideal, but it's not practical when you're talking about monopolies,” Stein continued. “We need some type of structural safeguards against media monopolies in order to sustain a republic.”
Artificial intelligence (AI) is further complicating content regulation on social media platforms, Stein said, because AI-generated content and algorithmic amplification exist in an environment with no clear rules. Regulations – and regulators – are still playing catch-up in the brave new world of AI.
“AI is going to be the force that focuses people on what kind of protections we need to have in place with respect to algorithmically promoted content,” Stein said. “Things are asymmetrical right now – the policies don’t match the tech advancements. Policymakers need to consider how to protect people and adopt certain types of content moderation in this era of AI algorithms.”
This asymmetry in regulation results in uneven playing fields, Stein explained, and contributes to monopolization, misinformation, and challenges in ensuring media serves the public interest.
Brendan Carr, the new FCC chair under Trump, has said that the FCC will rein in Big Tech and also weaken liability protections for online platforms in order to “combat tech censorship,” according to The New York Times.
“Deregulation could mean less moderation oversight,” Green said, “and with less moderation oversight, there's increased distribution of things that are not true. People don't have to be accountable anymore.”
Carr has favored empowering consumers to choose their own moderation tools, if any moderation happens at all. Yet misinformation and disinformation on social media platforms also contribute to polarization in society. “Social media isolates us into our own tribes,” Green said. “Instead of engaging in civil discourse, we retreat into digital echo chambers and don’t consider other viewpoints.”
Unintended Mental Health Consequences
Another aspect of deregulating social media is how the platforms affect mental health – especially for children. Reduced oversight could give younger users greater access to social media platforms, exacerbating issues like loneliness and depression in young people. Then-Surgeon General Vice Admiral Vivek H. Murthy, MD, wrote an opinion piece in The New York Times in June 2024 calling for a warning label on social media platforms because of their association with “significant mental health harms for adolescents.”
“We know for a fact that Instagram use increases loneliness and depression, especially in minority users and teenagers,” Green said. “We need to look at what’s happening because there are a lot of vulnerable constituents out there.”
Awareness of how social media contributes to isolation among young people is growing through research and public discussion. Green contends that increased regulation, in the form of minimum age requirements for accessing social media, could help.
“Critics say that age restrictions won’t actually do anything,” Green said. “But we saw what happened with seatbelt usage increasing when it became law. Government regulation might help parents curb their kids’ social media use.”
Algorithmic Bias
Algorithms built into social media platforms determine the content that users see. Media outlets, non-partisan government agencies, and think tanks including the Bipartisan Policy Center in the U.S. and the Institute for Strategic Dialogue in the U.K. have all reported on how these algorithms work and how they may contribute to or exacerbate societal biases.
Green also cited a 60 Minutes story highlighting the discrepancy between the TikTok content shown to kids in the U.S. and the content shown to kids in China. In China, children younger than 14 see primarily educational videos and face a built-in limit on daily use, whereas children in other parts of the world see more superficial content, with no usage limits unless parents adjust the app’s settings and monitor their children’s time on it.
Final Thoughts
Social media deregulation has benefits as well, Green said. For all the documented harms described above, these platforms also have the potential to amplify voices, foster dialogue, and connect communities.
“Like so many things, there’s a yin and yang,” Green said.
As the digital town square continues to evolve, the questions remain: Who gets to stand on the soapbox, and who decides the rules of the gathering?