Alert issued in AI Newsletter: Specialist warns that as few as 20 manipulated cloud-stored images may be enough to craft a deepfake video of your child
The U.S. House of Representatives has passed a new bill, the TAKE IT DOWN Act, that aims to criminalize the sharing of non-consensual intimate images, including those generated by artificial intelligence (AI). This recent legislation follows a growing concern over revenge porn and deepfake content online.
Texas high school student Elliston Berry discussed the bill's passage on Fox & Friends. The Act, which addresses non-consensual intimate imagery and AI deepfakes on websites and networks, was a priority for First Lady Melania Trump, who lobbied lawmakers for its passage.
Artificial Intelligence (AI) technology has become increasingly prevalent, with applications ranging from powering phone autocorrect to helping create new recipes. However, a new study from the U.K. reveals that many people's cherished childhood images may be scanned and analyzed by cloud storage services without their awareness.
New York City's subway system is now testing AI to enhance security and reduce crimes. Michael Kemper, the NYPD veteran leading the rollout, explained that AI software will be used to spot suspicious behavior in real-time, improving both passenger safety and law enforcement efforts.
The TAKE IT DOWN Act, signed into law in May 2025, empowers the Federal Trade Commission (FTC) to remove non-consensual intimate images from online platforms. The law also provides support for victims, primarily protecting children and young women, and offers safe harbor provisions to local law enforcement.
Despite widespread support, the Act has faced criticism over its potential threats to free speech and online privacy. Critics warn that the takedown provisions cover such a broad range of intimate images that they may enable overbroad censorship, and that the 48-hour removal deadline will push platforms toward error-prone automated filters.
The Act has been lauded as a significant step in combating revenge porn and AI-generated deepfake exploitation, enabling quicker removal of non-consensual intimate images and supporting victims and law enforcement. However, its broad scope and strict takedown requirements have triggered debate over potential harm to lawful expression and online privacy protections.
President Donald Trump signed the TAKE IT DOWN Act into law on May 19, 2025, formally making it Public Law No: 119-12.
Sources:
1. Melania Trump Signs Take It Down Act Into Law (2025, May 19). Associated Press. https://apnews.com/article/5bfb58b03b3458ac6b5e4ebabd17cd6a
2. Cruz, Klobuchar Reintroduce TAKE IT DOWN Act to Protect Online Privacy and Remove Non-Consensual Intimate Images (2025, January 13). Office of Senator Ted Cruz. https://www.cruz.senate.gov/news/releases/cruz-klobuchar-reintroduce-take-it-down-act-to-protect-online-privacy-and-remove-non-consensual-intimate-images
3. House Passes Bill to Halt Nonconsensual Porn Sharing (2025, April 28). The Hill. https://thehill.com/homenews/house/3939624-house-passes-bill-to-halt-nonconsensual-porn-sharing
4. The TAKE IT DOWN Act: A New Frontier in Online Privacy Debate (2025, June 5). Electronic Frontier Foundation. https://www.eff.org/deeplinks/2025/06/take-it-down-act-new-frontier-online-privacy-debate
5. TAKE IT DOWN Act (S. 146) (2025). GovTrack.us. https://www.govtrack.us/congress/bills/119/s146
- The criticism of the TAKE IT DOWN Act extends beyond its implications for non-consensual intimate images, with some expressing concerns about its potential impact on free speech and technology advancements in areas like AI.
- As AI technology continues to expand into areas such as politics, healthcare, sports, and weather forecasting, the TAKE IT DOWN Act serves as a reminder that AI-related fields also need careful regulation and attention to privacy concerns.
- In the realm of sports, some argue that the TAKE IT DOWN Act's broad scope could lead to the removal of legitimate content, such as locker room conversations or behind-the-scenes footage, if it inadvertently contains intimate images or deepfakes.