Artificial Intelligence | Tech companies to remove nude images from their AI training databases

(Washington) Several major artificial intelligence (AI) companies pledged Thursday to remove nude images from the databases they use to train their products and to take other steps to curb the spread of harmful, fabricated sexual images known as “deepfakes.”

As part of a deal brokered by the Biden administration, tech companies Adobe, Anthropic, Cohere, Microsoft and OpenAI said they would voluntarily commit to removing nude images from their AI training databases “where appropriate and consistent with the purpose of the model.”

The announcement is part of a broader campaign against image-based sexual abuse of children and the AI-enabled creation of intimate images of adults without their consent.

These images have “exploded, disproportionately targeting women, children, and LGBTQI+ people, and constitute one of the fastest-growing harmful uses of AI to date,” the White House Office of Science and Technology Policy said.

Joining the tech companies in part of the commitment is Common Crawl, a nonprofit that crawls the web and makes its archives and datasets freely available to the public, a critical source of training data for AI chatbots and image generators.

It committed to responsibly sourcing its data and safeguarding its datasets from image-based sexual abuse material.

On Thursday, another group of companies — including Bumble, Discord, Match Group, Meta, Microsoft and TikTok — announced a set of voluntary principles aimed at preventing image-based sexual abuse.

