Must Internet Platforms Host Objectionable Content? Appeals Courts Consider ‘Must Carry’ Rules

Court decisions on Texas and Florida “must-carry” laws disagreed on whether online platforms should be regulated as common carriers.

WASHINGTON, January 30, 2023 — As the Supreme Court prepares to hear a pair of cases about online platform liability, it is also considering a separate pair of social media lawsuits that aim to push content moderation practices in the opposite direction, adding questions about the First Amendment and common carrier status to an already complicated issue.

The “must-carry” laws in Texas and Florida, both aimed at limiting online content moderation, met with mixed decisions in appeals courts after being challenged by tech industry groups NetChoice and the Computer & Communications Industry Association. The outcomes will likely end up “affecting millions of Americans and their ability to express themselves online,” said Chris Marchese, counsel at NetChoice, at a Broadband Breakfast Live Online event on Wednesday.

In September, a federal appeals court in the Fifth Circuit upheld the Texas law, ruling that social media platforms can be regulated as “common carriers” and required to carry content, much as cable television operators were required to carry broadcast programming under the Turner Broadcasting System v. FCC decisions of the 1990s.

Dueling appeals court interpretations

By contrast, the 11th Circuit judges who blocked the Florida law held that social media platforms are not common carriers. Even if they were, the judges held, “neither law nor logic recognizes government authority to strip an entity of its First Amendment rights merely by labeling it a common carrier.”

Whether social media platforms should be treated like common carriers is “a fair question to ask,” said Marshall Van Alstyne, Questrom chair professor at Boston University. It would be difficult to reach a broad audience online without using one of the major platforms, he said.

However, Marchese argued that in the Texas ruling, the Fifth Circuit “to put it politely, ignored decades of binding precedent.” First Amendment protections have previously been extended to “what we today might think of as common carriers,” he said.

“I think we can safely say that Texas and Florida do not have the ability to force our private businesses to carry political speech or any type of speech that they don’t see fit,” Marchese said.

Ari Cohn, free speech counsel at TechFreedom, disagreed with the common carrier classification altogether, referencing an amicus brief filed by TechFreedom in the Texas case arguing that “social media and common carriage are irreconcilable concepts.”

Similar ‘must-carry’ laws are gaining traction in other states

While the two state laws have the same general purpose of limiting moderation, their specific restrictions differ. The Texas law would ban large platforms from any content moderation based on “viewpoint.” Critics have argued that the term is so vague that it could prevent moderation entirely.

“In other words, if a social media service allows coverage of Russia’s invasion of Ukraine, it would also be forced to disseminate Russian propaganda about the war,” Marchese said. “So if you allow conversation on a topic, then you must allow all viewpoints on that topic, no matter how horrendous those viewpoints are.”

The Florida law “would require covered entities — including ones that you wouldn’t necessarily think of, like Etsy — to host all or nearly all content from so-called ‘journalistic enterprises,’ which is basically defined as anybody who has a small following on the internet,” Marchese explained. The law also prohibits taking down any speech from political candidates.

The impact of the two cases will likely be felt far beyond those two states, as dozens of similar content moderation bills have already been proposed in states across the country, according to Ali Sternburg, vice president of information policy for the CCIA.

But for now, both laws are blocked while the Supreme Court decides whether to hear the cases. On Jan. 23, the court asked for the U.S. solicitor general’s input on the decision.

“I think this was their chance to buy time because in effect, so many of these cases are actually asking the court to do opposite things,” Van Alstyne said.

Separate set of cases calls for more, not less, moderation

In February, the Supreme Court will hear two cases that effectively argue the reverse of the Texas and Florida laws by alleging that social media platforms are not doing enough to remove harmful content.

The cases were brought against Twitter and Google by family members of terror attack victims, who argue that the platforms knowingly allowed terrorist groups to spread harmful content and coordinate attacks. One case specifically looks at YouTube’s recommendation algorithms, asking whether Google can be held liable for not only hosting but promoting terrorist content.

Algorithms have become “the new boogeyman” in ongoing technology debates, but they essentially act like mirrors, determining content recommendations based on what users have searched for, engaged with and said about themselves, Cohn explained.

Reese Schonfeld, President of Cable News Network, and Reynelda Nuse, weekend anchorwoman for CNN, stand at one of the many sets at the broadcast center in Atlanta on May 31, 1980. The network, owned by Ted Turner, began its 24-hour-a-day news broadcasts on Sunday afternoon. (AP Photo/Joe Holloway, used with permission.)

“This has been litigated in a number of different contexts, and in pretty much all of them, the courts have said we can’t impose liability for the communication of bad ideas,” Cohn said. “You hold the person who commits the wrongful act responsible, and that’s it. There’s no such thing as negligently pointing someone to bad information.”

A better alternative to reforming Section 230 would be implementing “more disclosures and transparency specifically around how algorithms are developed and data about enforcement,” said Jessica Dheere, director of Ranking Digital Rights.

Social media platforms have a business incentive to take down terrorist content, and Section 230 is what allows them to do so without over-moderating, Sternburg said. “No one wants to see this horrible extremist content on digital platforms, especially the services themselves.”

Holding platforms liable for all speech that they carry could have a chilling effect on speech by motivating platforms to err on the side of removing content, Van Alstyne said.

Our Broadband Breakfast Live Online events take place on Wednesday at 12 Noon ET. Watch the event on Broadband Breakfast, or REGISTER HERE to join the conversation.

Wednesday, January 25, 2023, 12 Noon ET – Section 230, Google, Twitter and the Supreme Court

The Supreme Court will soon hear two blockbuster cases involving Section 230 of the Communications Decency Act: Gonzalez v. Google on February 21, and Twitter v. Taamneh on February 22. Both cases ask whether tech companies can be held liable for terrorist content on their platforms. Also in play: laws in Florida and Texas (both on hold during the course of litigation) that would limit online platforms’ ability to moderate content. In a recent brief, Google argued that denying Section 230 protections for platforms “could have devastating spillover effects.” In advance of Broadband Breakfast’s Big Tech & Speech Summit on March 9, this Broadband Breakfast Live Online event will consider Section 230 and the Supreme Court.

Panelists:

  • Chris Marchese, Counsel, NetChoice
  • Ari Cohn, Free Speech Counsel, TechFreedom
  • Jessica Dheere, Director, Ranking Digital Rights
  • Ali Sternburg, Vice President of Information Policy, Computer & Communications Industry Association
  • Marshall Van Alstyne, Questrom Chair Professor, Boston University
  • Drew Clark (moderator), Editor and Publisher, Broadband Breakfast

Panelist resources:

Chris Marchese analyzes technology-related legislative and regulatory issues at both the federal and state level. His portfolio includes monitoring and analyzing proposals to amend Section 230 of the Communications Decency Act, antitrust enforcement, and potential barriers to free speech and free enterprise on the internet. Before joining NetChoice in 2019, Chris worked as a law clerk at the U.S. Chamber Litigation Center, where he analyzed legal issues relevant to the business community, including state-court decisions that threatened traditional liability rules.

Ari Cohn is Free Speech Counsel at TechFreedom. A nationally recognized expert in First Amendment law, he was previously the Director of the Individual Rights Defense Program at the Foundation for Individual Rights in Education (FIRE), worked in private practice at Mayer Brown LLP and as a solo practitioner, and was an attorney with the U.S. Department of Education’s Office for Civil Rights. Ari graduated cum laude from Cornell Law School, and earned his Bachelor of Arts degree from the University of Illinois at Urbana-Champaign.

Jessica Dheere is the director of Ranking Digital Rights, and co-authored RDR’s spring 2020 report “Getting to the Source of Infodemics: It’s the Business Model.” An affiliate at the Berkman Klein Center for Internet & Society, she is also founder, former executive director, and board member of the Arab digital rights organization SMEX, and in 2019, she launched the CYRILLA Collaborative, which catalogs global digital rights law and case law. She is a graduate of Princeton University and the New School.

Ali Sternburg is Vice President of Information Policy at the Computer & Communications Industry Association, where she focuses on intermediary liability, copyright, and other areas of intellectual property. Ali joined CCIA during law school in 2011, and previously served as Senior Policy Counsel, Policy Counsel, and Legal Fellow. She is also an Inaugural Fellow at the Internet Law & Policy Foundry.

Marshall Van Alstyne (@InfoEcon) is the Questrom Chair Professor at Boston University. His work explores how IT affects firms, innovation, and society with an emphasis on business platforms. He co-authored the international best seller Platform Revolution and his research influence ranks among the top 2% of all scientists globally.

Drew Clark (moderator) is CEO of Breakfast Media LLC. He has led the Broadband Breakfast community since 2008. An early proponent of better broadband, better lives, he initially founded the Broadband Census crowdsourcing campaign for broadband data. As Editor and Publisher, Clark presides over the leading media company advocating for higher-capacity internet everywhere through topical, timely and intelligent coverage. Clark also served as head of the Partnership for a Connected Illinois, a state broadband initiative.

WATCH HERE, or on YouTube, Twitter and Facebook.

As with all Broadband Breakfast Live Online events, the FREE webcasts will take place at 12 Noon ET on Wednesday.

SUBSCRIBE to the Broadband Breakfast YouTube channel. That way, you will be notified when events go live. Watch on YouTube, Twitter and Facebook.

See a complete list of upcoming and past Broadband Breakfast Live Online events.

Reporter Em McPhie studied communication design and writing at Washington University in St. Louis, where she was a managing editor for the student newspaper. In addition to agency and freelance marketing experience, she has reported extensively on Section 230, big tech, and rural broadband access. She is a founding board member of Code Open Sesame, an organization that teaches computer programming skills to underprivileged children.


12 Days of Broadband: State Regulations and Children’s Safety Online

12-year-olds (and older) having to age-verify on social media may become more common going forward.

Illustration by DALL-E

January 3, 2024 – A nationwide push to restrict teenagers’ online actions gained ground in 2023 as several states implemented stringent laws targeting social media use among youth.

In March, Utah ventured into uncharted territory when Republican Gov. Spencer Cox signed two measures, H.B. 311 and S.B. 152, mandating parental consent for all minors – 17 and under – before they can register for platforms like TikTok and Meta’s Instagram. For decades, the default standard of the 1998 Children’s Online Privacy Protection Act has been no restrictions on social media use by kids 13 and over.

The pair of bills, which do not go into effect until March 2024, require individuals under 18 to gain parental consent to open a social media account, bar minors from accessing social media platforms between the hours of 10:30 p.m. and 6:30 a.m., and grant parents full access to their child’s social media accounts.

In October, Utah announced a lawsuit against TikTok, alleging that the app deploys addictive features to hook young users. The lawsuit raises additional concerns regarding user data and privacy, citing that TikTok’s China-based parent company, ByteDance, is legally bound to the Chinese Communist Party.

Arkansas, Montana may be following Utah

Soon after, Arkansas took a similar step as Republican Gov. Sarah Huckabee Sanders signed Act 689, the Social Media Safety Act, in April 2023. The act, which mandates age verification and parental consent for social media users under 18, was set to take effect on September 1.

However, on that very day, U.S. District Judge Timothy Brooks granted a preliminary injunction at the request of the tech industry trade group NetChoice Litigation Center, which contended that the new law infringed upon the First Amendment’s guarantee of freedom of expression.

In May, Montana Gov. Greg Gianforte signed legislation banning TikTok on all devices statewide, threatening fines of up to $10,000 per violation for app providers like Google and Apple. Shortly after, TikTok filed a lawsuit against the state.

Before the law could take effect on January 1, U.S. District Judge Donald Molloy blocked the ban in late November, ruling that the law exceeded state authority and violated the constitutional rights of users. Molloy found merit in numerous arguments raised by TikTok, including that it has a number of safeguards in place surrounding user data.

Is age verification a First Amendment issue?

Consumer groups, including the American Civil Liberties Union, have objected that many of these bills extend beyond mandating age verification for minors: they require anyone seeking to use social media within the states to verify their age with legal documents.

The issue was much discussed at a Broadband Breakfast Live Online session in November 2023, where child safety advocate Donna Rice Hughes and Tony Allen, executive director of Age Check Certification Scheme, agreed that age verification systems were much more robust than a generation ago, when the Supreme Court struck down one such scheme. They disagreed with civil liberties groups including the Electronic Frontier Foundation.

Thirteen states enacted bans in 2023 on the use of the Chinese-owned platform on government-issued devices, bringing to 34 the total number of states that have banned TikTok on government devices due to national security concerns. Additionally, more than 40 public universities have barred TikTok from their on-campus Wi-Fi and university-owned computers in response to these state-level bans.

See “The Twelve Days of Broadband” on Broadband Breakfast


Diverse Groups File Amicus Briefs Against Florida and Texas Social Media Laws

The Supreme Court will decide whether the social media laws violate the First Amendment.

Photo of Solicitor General Elizabeth Prelogar in June 2023 by Ryland West of ALM

WASHINGTON, December 8, 2023 – Industry, public interest, and conservative groups filed briefs with the Supreme Court this week arguing against the Texas and Florida social media laws.

Drafted to combat what state legislators saw as the unfair treatment of right-wing content online, the 2021 laws would allow residents of those states to sue social media companies for suspending their accounts. Both have been blocked from going into effect after legal challenges from tech industry trade groups. The cases were initially separate, but the Supreme Court agreed in October to hear them together because they raise similar issues.

Industry groups argue the laws violate the First Amendment by forcing platforms to host speech they normally would not. The White House agrees – Solicitor General Elizabeth Prelogar asked the Court in August to take up the issue and strike down Texas’s law.

Consumer protection group Public Knowledge filed an amicus brief on Thursday in support of the tech trade groups, arguing the laws are unconstitutional and “driven by political animus.”

Center-right think tank TechFreedom filed a similar brief on Wednesday. 

“Only the state can ‘censor’ speech,” Corbin Barthold, the group’s director of appellate litigation, said in a statement. “And these states are doing so by trying to co-opt websites’ right to editorial control over the speech they disseminate.”

Both groups also pushed back against the states’ move to treat social media platforms as ‘common carrier’ services, a part of both laws. The legal designation, typically applied to services like railroads or voice telephone calls, requires a carrier to serve the public at just rates without unreasonable discrimination.

The states’ move to designate social media platforms as common carriers would make it more difficult for them to refuse their service to users. But the designation, the groups argued, does not map cleanly onto the service social media provides, as the platforms make editorial decisions about content they transmit – through moderation and recommendation – in a way companies like voice providers do not.

In all, at least 40 similar briefs have been filed arguing against the laws, according to the Computer and Communications Industry Association, one of the parties in the case.

A set of 15 states with Republican-led legislatures and former President Donald Trump, who had multiple social media accounts suspended after the January 2021 attack on the Capitol, have filed amicus briefs in support of Texas and Florida. The Court is expected to hear oral arguments in the cases sometime in 2024.


Improved Age Verification Allows States to Consider Restricting Social Media

Constitutional issues that have led courts to strike down age verification laws are still present, said EFF.

WASHINGTON, November 20, 2023 — A Utah law requiring age verification for social media accounts is likely to face First Amendment lawsuits, experts warned during an online panel Wednesday hosted by Broadband Breakfast.

The law, set to take effect in March 2024, mandates that all social media users in Utah verify their age and imposes additional restrictions on minors’ accounts.

The Utah law raises the same constitutional issues that have led courts to strike down similar laws requiring age verification, said Aaron Mackey, free speech and transparency litigation director at the non-profit Electronic Frontier Foundation.

“What you have done is you have substantially burdened everyone’s First Amendment right to access information online that includes both adults and minors,” Mackey said. “You make no difference between the autonomy and First Amendment rights of older teens and young adults” versus young children, he said.

But Donna Rice Hughes, CEO of Enough is Enough, contended that age verification technology has successfully restricted minors’ access to pornography and could be applied to social media as well.

“Utah was one of the first states [to] have age verification technology in place to keep minor children under the age of 18 off of porn sites and it’s working,” she said.

Tony Allen, executive director of Age Check Certification Scheme, agreed that age verification systems had progressed considerably from a generation ago, when the Supreme Court, in 2002’s Ashcroft v. American Civil Liberties Union, struck down the 1998 Child Online Protection Act. The law had been designed to shield minors from indecent material, but the court ruled that age-verification methods often failed at that task.

Andrew Zack, policy manager at the Family Online Safety Institute, said that his organization welcomed interest in youth safety policies from Utah.

But Zack said, “We still have some concerns about the potential unintended consequences that come with this law,” worrying particularly about teen privacy and expression rights.

Taylor Barkley, director of technology and innovation at the Center for Growth and Opportunity, highlighted the importance of understanding the specific problems the law aims to address. “Policy solutions have trade-offs,” he said, urging that solutions be tailored to the problems identified.

Panelists generally agreed that comprehensive data privacy legislation could help address social media concerns without facing the same First Amendment hurdles.

Our Broadband Breakfast Live Online events take place on Wednesday at 12 Noon ET. Watch the event on Broadband Breakfast, or REGISTER HERE to join the conversation.

Wednesday, November 15, 2023 – Social Media for Kids in Utah

In March 2023, Utah became the first state to adopt laws regulating kids’ access to social media. This legislative stride was rapidly followed by several states, including Arkansas, Illinois, Louisiana, and Mississippi, with numerous others contemplating similar measures. For nearly two decades, social media platforms enjoyed unbridled growth and influence. The landscape is now changing as lawmakers become more active in shaping the future of digital communication. This transformation calls for a nuanced evaluation of the current state of social media in the United States, particularly in light of Utah’s pioneering role. Is age verification the right way to go? What are the broader implications of this regulatory trend for the future of digital communication and online privacy across the country?

Panelists

  • Andrew Zack, Policy Manager, Family Online Safety Institute
  • Donna Rice Hughes, President and CEO of Enough Is Enough
  • Taylor Barkley, Director of Technology and Innovation, Center for Growth and Opportunity
  • Tony Allen, Executive Director, Age Check Certification Scheme
  • Aaron Mackey, Free Speech and Transparency Litigation Director, Electronic Frontier Foundation
  • Drew Clark (moderator), Editor and Publisher, Broadband Breakfast

Panelist resources

Andrew Zack is the Policy Manager for the Family Online Safety Institute, leading policy and research work relating to online safety issues, laws, and regulations. He works with federal and state legislatures, relevant federal agencies, and industry leaders to develop and advance policies that promote safe and positive online experiences for families. Andrew joined FOSI after five years in Senator Ed Markey’s office, where he worked primarily on education, child welfare, and disability policies. Andrew studied Government and Psychology at the College of William and Mary.

Donna Rice Hughes, President and CEO of Enough Is Enough is an internationally known Internet safety expert, author, speaker and producer. Her vision, expertise and advocacy helped to birth the Internet safety movement in America at the advent of the digital age. Since 1994, she has been a pioneering leader on the frontlines of U.S. efforts to make the internet safer for children and families by implementing a three-pronged strategy of the public, the technology industry and legal community sharing the responsibility to protect children online.

Taylor Barkley is the Director of Technology and Innovation at the Center for Growth and Opportunity, where he manages the research agenda and strategy and represents the technology and innovation portfolio. His primary research and expertise are at the intersection of culture, technology, and innovation. Prior roles in tech policy have been at Stand Together, the Competitive Enterprise Institute, and the Mercatus Center at George Mason University.

Tony Allen is a Chartered Trading Standards Practitioner and an acknowledged specialist in age-restricted sales law and practice. He is the Chair of the UK Government’s Expert Panel on Age Restrictions and Executive Director of a UKAS-accredited conformity assessment body specialising in age and identity assurance testing and certification. He is the Technical Editor of the current international standard for Age Assurance Systems.

Aaron Mackey is EFF’s Free Speech and Transparency Litigation Director. He helps lead cases advancing free speech, anonymity, and privacy online while also working to increase public access to government records. Before joining EFF in 2015, Aaron was in Washington, D.C., where he worked on speech, privacy, and freedom of information issues at the Reporters Committee for Freedom of the Press and the Institute for Public Representation at Georgetown Law.

Breakfast Media LLC CEO Drew Clark has led the Broadband Breakfast community since 2008. An early proponent of better broadband, better lives, he initially founded the Broadband Census crowdsourcing campaign for broadband data. As Editor and Publisher, Clark presides over the leading media company advocating for higher-capacity internet everywhere through topical, timely and intelligent coverage. Clark also served as head of the Partnership for a Connected Illinois, a state broadband initiative.

WATCH HERE, or on YouTube, Twitter and Facebook.

As with all Broadband Breakfast Live Online events, the FREE webcasts will take place at 12 Noon ET on Wednesday.

SUBSCRIBE to the Broadband Breakfast YouTube channel. That way, you will be notified when events go live. Watch on YouTube, Twitter and Facebook.

See a complete list of upcoming and past Broadband Breakfast Live Online events.
