Prompted by teen girls, states move to ban fake nudes

Caroline Mullet, a ninth-grader at Issaquah High School near Seattle, attended her first homecoming dance last fall, a James Bond-themed party with blackjack tables attended by hundreds of students in formal dress.

A few weeks later, she and other students learned that a classmate was circulating fake nude images of girls who had attended the dance: sexually explicit images he had fabricated using an artificial intelligence application designed to automatically “strip” clothed photos of real girls and women.

Ms. Mullet, 15, alerted her father, Mark Mullet, a Democratic state senator from Washington. Although she was not among the girls in the photos, she asked if anything could be done to help her friends, who felt “extremely uncomfortable” knowing that male classmates had seen simulated nude images of them. Soon, Senator Mullet and a colleague in the House of Representatives proposed legislation to ban the sharing of AI-generated sexually explicit depictions of real minors.

“I hate the idea of having to worry about this happening again to any of my friends, or my sisters, or even myself,” Ms. Mullet told state lawmakers during a hearing on the bill in January.

The state legislature passed the bill without opposition. Gov. Jay Inslee, a Democrat, signed it last month.

States are on the front line of a rapidly spreading new form of peer-to-peer sexual exploitation and harassment in schools. Boys across the United States have used widely available “nudification” apps to surreptitiously fabricate sexually explicit images of their female classmates and then circulate the simulated nudes through group chats on apps like Snapchat and Instagram.

Now, spurred in part by troubling accounts from teenagers like Ms. Mullet, federal and state lawmakers are rushing to enact protections in an effort to keep pace with exploitative AI applications.

Since early last year, at least two dozen states have introduced bills to combat AI-generated sexually explicit images, known as deepfakes, of people under 18, according to data compiled by the National Center for Missing and Exploited Children, a nonprofit organization. And several states have enacted the measures.

Among them, South Dakota this year passed a law that makes it illegal to possess, produce or distribute AI-generated sexual abuse material depicting real minors. Last year, Louisiana enacted a deepfake law that criminalizes AI-generated sexually explicit depictions of minors.

“I had a sense of urgency hearing about these cases and seeing how much damage was being done,” said Representative Tina Orwall, a Democrat who wrote Washington state's explicit-deepfake law after learning of incidents like the one at Issaquah High.

Some lawmakers and child protection experts say such rules are urgently needed because the easy availability of AI nudification apps is enabling the mass production and distribution of fake graphic images that can potentially circulate online for a lifetime, threatening girls' mental health, reputations and physical safety.

“A boy with his phone can, in the course of an afternoon, victimize 40 girls, all minors,” said Yiota Souras, legal director of the National Center for Missing and Exploited Children, “and then their images are out there.”

Over the past two months, incidents involving deepfake nudes have emerged at schools including in Richmond, Illinois, and in Beverly Hills and Laguna Beach, California.

However, few laws in the United States specifically protect people under 18 from exploitative AI applications.

That's because many current statutes prohibiting child sexual abuse material or non-consensual adult pornography, which involve real photos or videos of real people, may not cover explicit AI-generated images that use real people's faces, said U.S. Representative Joseph D. Morelle, a Democrat from New York.

Last year he introduced a bill that would make it a crime to disclose AI-generated intimate images of identifiable adults or minors. It would also give deepfake victims, or their parents, the right to sue individual perpetrators for damages.

“We want to make this as painful as possible for anyone who would even consider doing it, because this is damage that simply cannot be undone,” Morelle said. “Although it may seem like a joke to a 15-year-old boy, this is extremely serious.”

U.S. Rep. Alexandria Ocasio-Cortez, another New York Democrat, recently introduced a similar bill to allow victims to file civil lawsuits against deepfake perpetrators.

But neither bill would explicitly give victims the right to sue developers of AI nudification apps, a move that trial lawyers say would help disrupt the mass production of sexually explicit deepfakes.

“Legislation is needed to stop the marketing, which is the root of the problem,” said Elizabeth Hanley, a lawyer in Washington who represents victims in sexual assault and harassment cases.

Federal law prohibits the distribution of computer-generated child sexual abuse material depicting identifiable minors engaged in sexually explicit conduct. Last month, the Federal Bureau of Investigation issued an alert warning that such illegal material includes realistic AI-generated images of child sexual abuse.

However, AI-generated fake depictions of real, unclothed teenagers may not constitute “child sexual abuse material,” experts say, unless prosecutors can show that the fake images meet legal standards for sexually explicit conduct or lewd exhibition of genitals.

Some defense attorneys have tried to take advantage of the apparent legal ambiguity. A lawyer defending a high school student in a deepfake lawsuit in New Jersey recently argued that the court should not temporarily prevent his client, who had created nude AI images of a classmate, from viewing or sharing the images because they were not harmful or illegal. Federal laws, the lawyer argued in a court filing, were not designed to apply “to synthetic computer-generated images that do not even include real human body parts.” (The defendant ultimately agreed not to fight a restraining order on the images.)

Now states are working to pass laws to put an end to exploitative AI images. This month, California lawmakers introduced a bill to update a state ban on child sexual abuse material to specifically cover AI-generated abusive material.

And Massachusetts lawmakers are finalizing legislation that would criminalize the nonconsensual sharing of explicit images, including deepfakes. It would also require a state entity to develop a diversion program for minors who share explicit images, to teach them about topics such as the “responsible use of generative artificial intelligence.”

Punishments can be severe. Under Louisiana's new law, anyone who knowingly creates, distributes, promotes or sells sexually explicit deepfakes of minors can face a minimum prison sentence of five to 10 years.

In December, Miami-Dade County police officers arrested two high school boys for allegedly creating and sharing fake AI-generated nude images of two classmates, ages 12 and 13, according to police documents obtained by The New York Times through a public records request. The boys were charged with third-degree felonies under a 2022 state law prohibiting altered sexual depictions shared without consent. (The Miami-Dade County state attorney's office said it could not comment on an open case.)

The new deepfake law in Washington state takes a different approach.

After learning about the incident at Issaquah High from his daughter, Senator Mullet reached out to Representative Orwall, an advocate for sexual assault survivors and a former social worker. Ms. Orwall, who had worked on one of the state's first revenge porn bills, then drafted a House bill to ban the distribution of AI-generated intimate or sexually explicit images of minors or adults. (Mr. Mullet, who sponsored the Senate companion bill, is now running for governor.)

Under the resulting law, first offenders could face misdemeanor charges, while people with prior convictions for revealing sexually explicit images would face felony charges. The new deepfake statute will come into effect in June.

“It is not surprising that we are behind on protections,” Ms. Orwall said. “That's why we wanted to move so quickly.”
