Biden asks tech companies to help stop AI-generated sexual deepfakes used to harass gay students

The administration of President Joe Biden is urging the tech and financial industries to help stop the spread of abusive, AI-generated “deepfake” sexual images used to harass real-life students and educators, particularly girls, women, and gay kids. These images can ruin victims’ lives, the Biden Administration says, but current school policies and laws don’t provide consistent ways to prevent their dissemination.

“As generative [artificial intelligence] broke on the scene, everyone was speculating about where the first real harms would come. And I think we have the answer,” said Biden’s chief science adviser Arati Prabhakar, director of the White House’s Office of Science and Technology Policy, according to Fortune. “If you’re a teenage girl, if you’re a gay kid, these are problems that people are experiencing right now.”

Sexual deepfakes place an individual’s face onto a naked body or into a sexually explicit scene. The images are then circulated online to humiliate and harass students and educators.

“[Creating sexual deepfakes] used to take roughly between 100-200 photos of the victim’s face; you had to have a high-powered computer; you had to have a good amount of technical ability and skill,” said Omny Miranda Martone, chief of the Virginia-based nonprofit Sexual Violence Prevention Association. “Now … you only need one or two photos.”

The Biden Administration will release a document on Thursday asking AI developers, online payment processors, financial institutions, cloud computing providers, search engines, and Apple and Google to restrict applications that help generate and distribute sexually explicit deepfakes for profit, Politico reported.

The administration has already gotten voluntary promises from Amazon, Google, Meta, Microsoft, and other major tech companies to help minimize any harm caused by new AI systems before they’re publicly released. However, those commitments “[don’t] change the underlying need for Congress to take action here,” said Jennifer Klein, director of the White House Gender Policy Council.

Current laws criminalize the production and possession of sexual images of children, even if the images have been entirely fabricated by AI image-generators. In fact, 20 states have already criminalized the dissemination of nonconsensual AI-generated pornographic images. Some states also have laws forbidding the distribution of “revenge porn” (that is, sexually explicit images released without the photographed individual’s consent). But it can be difficult to identify the individuals and companies behind the fly-by-night online AI image-generating tools that make it easy to spread sexual deepfakes.

Worse yet, no federal laws or guidelines tell school administrators how to respond when such images appear in educational environments, causing the consequences (or lack thereof) to vary wildly depending on where such incidents arise.

Schools can investigate such deepfakes as a violation of Title IX, the federal law banning sex discrimination in schools, according to Esther Warkov, executive director and co-founder of the nonprofit Stop Sexual Assault in Schools. In new Title IX rules released by the Biden Administration earlier this year, online sex-based harassment includes “nonconsensual distribution of intimate images that have been altered or generated by AI technologies.” The rules also require schools to address online and off-campus actions that create a hostile learning environment.

“This points to a larger need, which is to ensure that [a school district’s] Title IX procedures are properly in place,” Warkov told Politico. “Many school districts may not identify this problem as a potential Title IX issue.”

Without a federal law or guidelines, it’s unclear who gets disciplined, how minors get treated, and who must report such images to the police, especially since some school districts don’t require employees to report such images to legal authorities at all. The patchwork of existing policies and statewide laws can leave victims feeling unprotected.

“We’re pushing lawmakers to update [laws] because most protections were written way before AI-generated media,” Ronn Nozoe, CEO of the National Association of Secondary School Principals, said, according to Politico. “We’re also calling on the Department of Education to develop guidance to help schools navigate these situations.”

Earlier this month, the White House Task Force to Address Online Harassment and Abuse released a report explaining prevention, support, and accountability efforts for government agencies combating these images. The report said that the Department of Education will soon issue “resources, model policies, and best practices” for preventing online harassment in schools.

The White House also issued a “call to action” this week, urging Congress to pass legislation providing legal recourse for survivors. In the meantime, a bipartisan group of congressional legislators is scrambling to tackle the issue.

Senator Richard J. Durbin (D-IL) has drafted the DEFIANCE Act, an amendment to the Violence Against Women Act that would give victims of sexual deepfakes the right to sue creators, solicitors, possessors, and distributors of the images for $150,000 in damages and legal fees if the perpetrators “knew or recklessly disregarded” the fact that the victims had not consented before disseminating the images.

Rep. Nancy Mace (R-SC) also recently introduced legislation to fine perpetrators $500,000 for disseminating such images. However, Rep. Alexandria Ocasio-Cortez (D-NY), who is herself a victim of deepfake porn and a supporter of the DEFIANCE Act, has said that some legislators are reluctant to pursue any such legislation for fear that it could infringe on free speech rights or the operation of larger tech companies.

“Going really big, really fast, with something regulatory in an emerging industry space — that can oftentimes run into its challenges,” she said. “Centering the bill on survivors’ rights — particularly the right of action — helps us dodge some of those larger questions in the short term and build a coalition in the immediate term.”
