San Francisco Officials Sue 18 Sites Over AI Porn

By Olivier Acuña | TH3FUS3 Chief Editor

August 16, 2024 07:20 AM

Reading time: 2 minutes, 15 seconds

TL;DR San Francisco has filed a lawsuit against 18 websites and apps for creating unauthorized deepfake nudes. The city attorney calls it 'sexual abuse' and highlights the impact on victims. This legal action seeks to curb the proliferation of non-consensual intimate imagery.

San Francisco Takes Legal Action

The city of San Francisco has filed a sweeping lawsuit against 18 websites and apps that generate unauthorized deepfake nudes of unsuspecting victims. The lawsuit, filed on Thursday, aims to combat the proliferation of these services.

The complaint, published with the defendants' service names redacted, targets the proliferation of websites and apps that offer to 'undress or nudify' women and girls. The lawsuit asserts that these sites have been visited more than 200 million times in the first six months of 2024.

Dark Corners of the Internet

'This investigation has taken us to the darkest corners of the internet, and I am horrified for the women and girls who have had to endure this exploitation,' said San Francisco City Attorney David Chiu in announcing the lawsuit.

'Generative AI has enormous promise, but as with all new technologies, there are unintended consequences and criminals seeking to exploit the new technology. This is not innovation -- this is sexual abuse,' Chiu added.

Although celebrities like Taylor Swift have been frequent targets of such image generation, Chiu pointed to recent cases involving California middle school students.

'These images, which are virtually indistinguishable from real photographs, are used to extort, bully, threaten, and humiliate women and girls,' the city announcement said.

Global Response to NCII

The rapid spread of what is known as non-consensual intimate imagery (NCII) has prompted efforts by governments and organizations worldwide to curtail the practice. 'Victims have little to no recourse, as they face significant obstacles to remove these images once they have been disseminated,' the complaint states.

'They are left with profound psychological, emotional, economic, and reputational harms, and without control and autonomy over their bodies and images.'

Even more troubling, Chiu notes, is that some sites 'allow users to create child pornography.' The use of AI to generate child sexual abuse material (CSAM) is especially harmful because it severely hinders efforts to identify and protect real victims.

The Internet Watch Foundation, which tracks the issue, said known pedophile groups are already embracing the technology and that AI-generated CSAM could 'overwhelm' the internet.

Legal and Technological Challenges

A Louisiana state law specifically banning CSAM created with AI went into effect this month. And although major tech companies have pledged to prioritize child safety as they develop AI, such images have already found their way into AI training datasets, according to Stanford University researchers.

The lawsuit seeks $2,500 for each violation and an order for the services to cease operations. It also demands that domain name registrars, web hosts, and payment processors stop providing services to outfits that create deepfakes.

This legal action marks a significant step in addressing the misuse of AI technology for creating harmful and exploitative content.
