Silicon Valley Fights States Over New Kids Online Safety Laws


Last summer, Ohio enacted a social media statute that would require Instagram, Snapchat, TikTok and YouTube to obtain parental consent before allowing children under 16 to use their platforms.

But this month, just before the measure went into effect, a tech industry group called NetChoice, which represents Google, Meta, Snap, TikTok and others, sued to block it on free speech grounds, persuading a Federal District Court judge to temporarily halt the new rules.

The case is part of a broad litigation campaign by NetChoice to block new state laws protecting young people online, an anti-regulation effort that is likely to come under scrutiny on Wednesday as the Senate Judiciary Committee questions social media executives about online child sexual exploitation. NetChoice's lawsuits have irritated state officials and lawmakers who sought input from tech companies as they drafted the new measures.

“I think it's cowardly and disingenuous,” Jon Husted, Ohio's lieutenant governor, said of the industry's lawsuit, noting that he or his staff had met with Google and Meta about the bill last year and had taken the companies' concerns into account. “We tried to be as cooperative as possible, and then at the last minute they filed a lawsuit.”

The social media platforms said some of the state laws contradicted each other and that they would prefer Congress to enact a federal law establishing national standards for children's online safety.

NetChoice said the new state laws infringed on its members' First Amendment rights to freely distribute information, as well as the rights of minors to obtain information.

“There's a reason this is such a sure win for NetChoice every time,” said Carl Szabo, the group's vice president. “And that's because it's obviously unconstitutional.”

Driven by growing public concern about youth mental health, lawmakers and regulators across the United States are making bipartisan efforts to rein in popular social media platforms by enacting a wave of laws, even as tech industry groups work to overturn them.

A first-of-its-kind law passed last spring in Utah would require social media companies to verify users' ages and obtain parental consent before allowing minors to set up accounts. Arkansas, Ohio, Louisiana and Texas subsequently passed similar laws requiring parental consent for social media services.

A landmark new California law, the Age-Appropriate Design Code Act, would require many popular social networks and multiplayer video game apps to turn on the highest privacy settings by default for minors and to disable potentially risky features, such as messaging systems that allow adult strangers to contact young people.

“The intent is to ensure that any technology product accessed by anyone under the age of 18 is, by design and by default, safe for children,” said Buffy Wicks, a California Assembly member who co-sponsored the bill.

But free speech lawsuits brought by NetChoice have dealt a blow to these state efforts.

Last year, in California and Arkansas, judges in NetChoice cases temporarily blocked new state laws from taking effect. (The New York Times and the Student Press Law Center filed a joint friend-of-the-court brief last year in the California case in support of NetChoice, arguing that the law could limit newsworthy content available to students.)

“There has been a lot of pressure on states to regulate social media and protect against its harms, and much of that anxiety is now being channeled into specific laws about children,” said Genevieve Lakier, a professor at the University of Chicago Law School. “What you see here is that the First Amendment continues to be a concern, and that in many cases these laws have been suspended.”

State lawmakers and officials said they viewed the tech industry's pushback as a temporary setback and described their new laws as reasonable measures to ensure children's basic safety online. Rob Bonta, California's attorney general, said the new state law would regulate platform design and the conduct of companies, not content. The California statute, which goes into effect in July, does not explicitly require social media companies to verify the age of each user.

Mr. Bonta recently appealed the ruling that suspended the law.

“NetChoice has a burn-it-all strategy, and they are going to challenge every law and set of regulations to protect children and their privacy in the name of the First Amendment,” Mr. Bonta said in a telephone interview on Sunday.

On Monday, California lawmakers introduced two bills on children's online privacy and safety that Mr. Bonta sponsored.

NetChoice has also sued to block Utah's new social media law, which would require Instagram and TikTok to verify users' ages and obtain parental permission before minors can have accounts.

Civil rights groups have warned that such legislative efforts could stifle free speech by requiring adults, as well as minors, to verify their ages using documents such as driver's licenses just to set up and use social media accounts. They say requiring parental consent for social media could also prevent young people from finding important support groups or resources about reproductive health or gender identity.

The Supreme Court has struck down a series of laws intended to protect minors from potentially harmful content, including violent video games and “indecent” online material, on free speech grounds.

Social media companies said they had instituted many protections for young people and would prefer Congress to enact federal legislation, rather than require companies to comply with a patchwork of sometimes conflicting state laws.

Snap recently became the first social media company to support a federal bill, the Kids Online Safety Act, which has some similarities to California's new law.

In a statement, Snap said many of the provisions in the federal bill reflected the company's existing safeguards, such as setting teen accounts to the strictest privacy settings by default. The statement added that the bill would direct government agencies to study technological approaches to age verification.

Google and TikTok declined to comment.

Meta has called on Congress to pass legislation that would make Apple's and Google's app stores, not social media companies, responsible for verifying a user's age and obtaining a parent's permission before allowing anyone under 16 to download an app. Meta recently began running ads on Instagram saying it supported such federal legislation.

“We support clear and coherent legislation that makes things simpler for parents to help manage their teens' online experiences, and that holds all apps that teens use to the same standard,” Meta said in a statement. “We want to continue working with policymakers to help find more viable solutions.”

But simply requiring parental consent would do nothing to mitigate the potentially harmful effects of social media platforms, the federal judge in the NetChoice case in Ohio said.

“Preventing minors under 16 from accessing all content” on social media websites “is an astonishingly blunt instrument to reduce the harm that social media causes to children,” Judge Algenon L. Marbley of the U.S. District Court for the Southern District of Ohio, Eastern Division, wrote in his ruling temporarily blocking the state's social media law.



