
Children’s Online Safety Bills Criticized for Compliance Burden, Plus Speech and Privacy Risks

States are considering measures ranging from age verification to a “duty of care.”


Screenshot of Matthew Feeney, head of technology and innovation at the Centre for Policy Studies, at the Cato Institute event

WASHINGTON, March 17, 2023 — As an increasing number of states start to consider and implement their own laws aimed at protecting children’s online safety, some experts are highlighting concerns about the practical implications of the resulting legislative “patchwork” — as well as concerns that some proposals might actually harm consumers’ digital privacy.

“States have realized that the federal government is going to be very slow in acting in this area,” said James Czerniawski, senior policy analyst at Americans for Prosperity. “So they’re going to try to take the lead here.”

Speaking at a Cato Institute forum on Wednesday, Czerniawski described the two competing approaches that have emerged among the various state laws and proposals.

The first is typified by California’s Age Appropriate Design Code Act, passed in August 2022, which requires that online platforms proactively prioritize the privacy of underage users by default and by design. Many aspects of the law are modeled after the United Kingdom’s Age Appropriate Design Code, and it echoes elements of the U.K.’s Online Safety Bill, a controversial proposal that would establish some of the world’s most stringent internet regulations.

The second approach focuses on age verification, such as Utah legislation that will require social media companies to verify the age of Utah residents before allowing them to create or keep accounts.

In addition to those two core directions, many of the state proposals have their own unique twists, Czerniawski said. For example, the Utah legislation prohibits any design choice that “causes a minor to have an addiction to the company’s social media platform.” While the bill has not yet been signed, Gov. Spencer Cox has previously indicated that he intends to sign it.

For online platforms that operate nationally or internationally, complying with a growing range of disparate state privacy laws will only become more complicated, Czerniawski said. “This patchwork doesn’t work.”

Potential unintended consequences for free speech, competition and privacy

Some experts have raised concerns that legislation intended to protect children online could have unintended consequences for the privacy and speech rights of adult users.

Matthew Feeney, head of technology and innovation at the Centre for Policy Studies, argued that a heavy compliance burden could incentivize online platforms to over-moderate content. “Given the punitive fines attached to the Online Safety Bill, I think they will engage in an abundance of caution and remove a lot of legal and valuable speech.”

The task of determining which users are underage and then figuring out how to prevent them from seeing any harmful content presents a significant challenge for platforms that host a massive amount of user-generated content, Feeney said.

“Something that’s very crucial to understand is that if you require firms to treat children differently, then you’re asking them to find out which of their users are children — and that is not free; that is a cost,” he added. “And for many firms, I think it will just be cheaper to err on the side of caution and assume all users are children.”

In addition to the implications for online speech, Feeney expressed concern that the regulatory burden adds a “very worrying anti-competitive element” to the legislation. “Most of the companies that will be in scope do not have the army of lawyers and engineers that Meta and Google have,” he said.

While the age verification measures might impose a lighter compliance burden, Feeney said, they might ironically create their own risk to children’s online privacy by mandating the collection of highly identifying data.

Czerniawski agreed, specifically pointing to TikTok. “From a privacy standpoint, it seems a little odd that we want to have a company that currently has some security concerns collecting more information on kids in order to continue operating in the country,” he said.

While agreeing that there may be legitimate concerns about TikTok’s privacy practices, Czerniawski argued that many of the proposed solutions, such as a complete national ban, fail to address the underlying problem.

“If you’re truly concerned about the privacy issues that TikTok has raised, that’s why… we need a federal data privacy law passed, right? I think that that can go a long way towards solving a lot of those issues,” he said.

In terms of child-specific legislation, Czerniawski called for a more narrowly targeted approach to address problems such as the proliferation of online child sexual abuse material without risking the privacy and free speech rights of all other internet users. “We have to be very serious when we’re looking at trade-offs that are involved here,” he said.

Reporter Em McPhie studied communication design and writing at Washington University in St. Louis, where she was a managing editor for the student newspaper. In addition to agency and freelance marketing experience, she has reported extensively on Section 230, big tech, and rural broadband access. She is a founding board member of Code Open Sesame, an organization that teaches computer programming skills to underprivileged children.



CES 2024: Industry Wants Federal Data Privacy Law

The current patchwork of state laws makes compliance difficult, said representatives from T-Mobile and Meta.


Photo of the panel by Jake Neenan

LAS VEGAS, January 12, 2024 – Industry stakeholders called for federal data privacy legislation at CES on Thursday.

“I think oftentimes companies can be in the position of opposing additional regulation at the federal level,” said Melanie Tiano, director of federal regulatory affairs at T-Mobile. “But this is probably one of those areas where that’s not the case, in part because of the flurry of activity going on at the state level, which makes compliance in the U.S. marketplace extraordinarily confusing and difficult.”

The New Jersey legislature cleared one such bill on Monday. If signed by the state’s governor, it would bring the number of states with comprehensive privacy laws to 13. Federal efforts, notably the American Data Privacy and Protection Act, have stalled in recent years.

“We will continue to be seriously committed to getting legislation done in a bipartisan way. That’s not always easy right now, but we’re continuing to work on that,” said Tim Kurth, chief counsel for the House Innovation, Data and Commerce Subcommittee.

Simone Hall Wood, privacy and public policy manager at Meta, said “privacy regulation should not inhibit beneficial uses of data.” The company has argued it has a legitimate interest in data use practices that the European Union has found to be out of compliance with its data privacy law, the GDPR.

Industry groups, including the Consumer Technology Association, which runs the CES conference, have advocated for a light-touch privacy law in the United States, in contrast with the more comprehensive European standard.

Kurth had similar thoughts Thursday, saying the GDPR “really hurt startups and really hurt innovations.”

Still, Wood said establishing a uniform standard is something the law does well.

“It sets certainty across the marketplace for what privacy protections look like for consumers. And so that aspect of it is positive,” she said.


CES 2024: Biden Administration Announces Deal with EU on Cyber Trust Mark

The White House is looking to get the mark on products “by next year.”


Deputy National Security Advisor for Cyber and Emerging Technologies Anne Neuberger at CES.

LAS VEGAS, January 11, 2024 – The United States has entered an agreement with the European Union on a “joint roadmap” for standardized cybersecurity labels, a Biden Administration official announced at CES on Thursday.

“We want companies to know when they test their product once to meet the cybersecurity standards, they can sell anywhere,” said Anne Neuberger, the White House’s deputy national security advisor for cyber and emerging technologies. “They can sell in Paris, Texas, or Paris, France.”

Neuberger said the White House is aiming to get its U.S. Cyber Trust Mark, a voluntary certification for internet of things devices, on consumer products by the end of the year. The effort to mark products like routers, baby monitors, and thermostats as safe from hacking was first announced in October 2022.

The Federal Communications Commission voted in August to seek comment on how to implement various parts of the program, including how to develop and ensure compliance with its cybersecurity standards.

What exactly those standards will be is not yet decided, but the Commission has said it will base the program on criteria developed by the National Institute of Standards and Technology. Those criteria include encrypting both stored and transmitted data and the ability to receive software updates.

The measure is not on the FCC’s tentative January meeting agenda, but Neuberger said the agency is “working toward next steps.”


CES 2024: FCC and AT&T Say Collaboration is Key in Combatting Spam

The Commission has been aggressive on spam this year, and AT&T has been working to improve filters on its networks.


Photo of the panel by Jake Neenan

LAS VEGAS, January 10, 2024 – Members of the telecom industry and the Federal Communications Commission emphasized the need for industry and government entities to collaborate in combating scam calls and texts at CES on Tuesday.

“Collaboration is key here,” said Amanda Potter, assistant vice president and senior legal counsel for AT&T.

Current measures

Alejandro Roark, chief of the FCC’s Consumer and Governmental Affairs Bureau, cited Federal Trade Commission data showing that American consumers reported losing $790 million to scam calls and another $396 million to scam texts in 2022.

The Commission took action on both fronts in 2023: expanding its STIR/SHAKEN regime, a set of measures to confirm caller identities, to all providers that handle call traffic; moving to block traffic from non-compliant providers; and issuing multiple fines in the hundreds of millions of dollars. Almost every state has entered an agreement with the agency to collaborate on robocall investigations.

In addition, the FCC adopted its first robotext rules and moved to tighten those rules in December, closing the “lead generator loophole” by requiring affirmative consent for companies to send consumers marketing messages. Comments are being accepted on a proposal to institute a text authentication scheme.

For AT&T’s part, Potter said the company has instituted network filters to block messages that are likely to be illegal.

“We’re not going to claim success by any means, but when we have these robust network defenses, that does a lot,” she said, citing a total of 1 billion blocked texts on the company’s networks in July 2023.

AT&T also worked with device manufacturers on features that allow consumers to report texts as junk when deleting messages, which Potter said has provided extra data for tuning spam filters.

What’s next

“We start from a standpoint of maximum flexibility when it comes to messaging,” Potter said, in contrast to voice calls, which are more tightly regulated and required FCC intervention for providers to block. 

“I’m concerned about that being taken away, or perhaps regulation being something of a distraction,” she said.

Roark agreed that flexibility is preferable to regulation, although the Commission is moving forward with its proceeding on more expansive text authentication rules. The proposed rules would require more providers in the traffic chain to block texts from numbers the FCC has flagged as scam sources, and would mandate measures to verify the identity of texters, similar to the STIR/SHAKEN system for caller authentication.

The FCC is also taking comments on how AI factors into robocalls and robotexts, both how it’s used to perpetrate them and how the Commission might use AI tools to combat them.

At a House oversight hearing in November, FCC Chairwoman Jessica Rosenworcel asked Congress for the authority to collect the fines the Commission imposes – a job currently left to the DOJ – and access to more financial information to help the agency’s robocall prevention efforts.
