Child abuse images removed from AI image-generator training source, researchers say


By Matt O'Brien, Associated Press

Published Aug 30, 2024  •  1 minute read

A woman types on a computer keyboard. Photo by Ping Li / iStock / Getty Images

Artificial intelligence researchers said Friday they have deleted more than 2,000 web links to suspected child sexual abuse imagery from a database used to train popular AI image-generator tools.


The LAION research database is a huge index of online images and captions that’s been a source for leading AI image-makers such as Stable Diffusion and Midjourney.

But a report last year by the Stanford Internet Observatory found it contained links to sexually explicit images of children, contributing to the ease with which some AI tools have been able to produce photorealistic deepfakes that depict children.

That December report led LAION, a nonprofit whose name stands for Large-scale Artificial Intelligence Open Network, to immediately take down its dataset. Eight months later, LAION said in a blog post that it had worked with the Stanford University watchdog group and anti-abuse organizations in Canada and the United Kingdom to fix the problem and release a cleaned-up database for future AI research.


But one of the LAION-based tools that Stanford identified as the “most popular model for generating explicit imagery” — an older and lightly filtered version of Stable Diffusion — remained publicly accessible until Thursday, when the New York-based company Runway ML removed it from the AI model repository Hugging Face. Runway said in a statement Friday it was a “planned deprecation of research models and code that have not been actively maintained.”

The cleaned-up version of the LAION database comes as governments around the world are taking a closer look at how some tech tools are being used to make or distribute illegal images of children.

San Francisco’s city attorney earlier this month filed a lawsuit seeking to shut down a group of websites that enable people to make AI-generated nudes of women and girls. The alleged distribution of child sexual abuse images on the messaging app Telegram is part of what led French authorities to bring charges on Wednesday against the platform’s founder and CEO, Pavel Durov.

