What exactly is "undress AI"? Tips for parents and caregivers.

What is 'undress AI'?

Undress AI refers to a category of software that uses artificial intelligence to digitally remove clothing from people depicted in images.

While each app or website works slightly differently, they all offer a similar service. Although the manipulated images do not show the victim's actual nude body, they can imply that they do.

Perpetrators using undress AI tools may keep the images for personal use or distribute them more widely. The images can be used for sexual coercion (sextortion), harassment, or revenge pornography.

Children and adolescents are particularly vulnerable to harm if subjected to 'undressing' via this technology. According to a report by the Internet Watch Foundation, over 11,000 potentially illegal AI-generated images of children were discovered on a single dark web forum dedicated to child sexual abuse material (CSAM). Approximately 3,000 of these images were deemed criminal.

Furthermore, the IWF noted the presence of "numerous instances of AI-generated images featuring identifiable victims and prominent minors." Generative AI can only produce convincing images when trained on accurate source material. In effect, an AI tool that generates CSAM would need to have been trained on real images depicting child abuse.

How undress AI tools can be risky for kids

• Temptation Through Enticing Language: These tools use suggestive language and phrasing that's designed to be eye-catching. This can be particularly appealing to children's curiosity, potentially leading them to engage with the tool without fully understanding its purpose or potential consequences.

• Difficulty Distinguishing Harm from Fun: For younger users, it can be challenging to tell the difference between a harmless app and one that generates inappropriate content. This lack of discernment could lead them to unknowingly encounter content they're not ready for.

• Exposure to Inappropriate Content: The very nature of these tools revolves around revealing or altering images in a suggestive way. This can expose children to content that isn't age-appropriate, potentially causing confusion or distress.

Inappropriate content and behaviour

The curiosity and novelty of an undress AI tool could expose children to inappropriate content. Because the tool doesn't show a 'real' nude image, children might think it's okay to use. If they then share the image with friends 'for a laugh', they are likely breaking the law without knowing it.

Without intervention from a parent or carer, they might continue the behaviour, even if it hurts others.

Privacy and security risks

Many legitimate generative AI tools require payment or a subscription to create images. So, if a deepnude website is free, it might produce low-quality images or have lax security. If a child uploads a clothed image of themselves or a friend, the site or app might misuse it, including the 'deepnude' it creates.

Children using these tools are unlikely to read the Terms of Service or Privacy Policy, so they face risks they might not understand.

Creation of child sexual abuse material (CSAM)

The IWF also reported that cases of ‘self-generated’ CSAM circulating online increased by 417% from 2019 to 2022. Note that the term ‘self-generated’ is imperfect as, in most cases, abusers coerce children into creating these images.

However, with the use of undress AI, children might unknowingly create AI-generated CSAM. If they upload a clothed picture of themselves or another child, someone could ‘nudify’ that image and share it more widely.

Cyberbullying, abuse and harassment

Just like other types of deepfakes, people can use undress AI tools or 'deepnudes' to bully others. This could include claiming a peer sent a nude image of themselves when they didn't, or using AI to create a nude with features that bullies then mock.

It’s important to remember that sharing nude images of peers is both illegal and abusive.

How widespread is 'deepnude' technology?

Research indicates that use of these AI tools is growing, particularly to digitally 'undress' women and girls.

One platform offering AI 'undressing' explicitly states that its technology works primarily on female subjects because it was trained on female imagery, a characteristic shared by most similar AI tools. Notably, a study conducted by the Internet Watch Foundation found that 99.6% of AI-generated CSAM depicted female minors.

Research from Graphika documented a 2,000% surge in referral link spam for undressing AI services in 2023. The report also found that 34 such providers collectively attracted over 24 million unique visitors in a single month, and it anticipates a rise in online harms such as sextortion and the spread of CSAM.

Because these AI systems learn predominantly from female images, perpetrators are likely to continue targeting females, putting girls and women at greater risk than boys and men.

What are the legal implications in the UK? 

Until recently, creating sexually explicit deepfake images was lawful unless the images involved minors. However, the Ministry of Justice has since announced a new offence: individuals who create sexually explicit deepfake images of adults without their consent will face prosecution and may receive an "unlimited fine" upon conviction.

This reverses an earlier position from early 2024, which held that creating deepfake intimate images was not serious enough to warrant criminalization. Until the change, individuals could produce and distribute such images of adults without legal consequence.

Separately, the implementation of the Online Safety Act in January 2024 made it illegal to share AI-generated intimate images without consent. Broadly, the law covers any sexually suggestive imagery, whether it depicts nudity or partial nudity. However, a key element of the legislation is intent: the creator must intend to humiliate or harm the victim. Proving intent is difficult, which may complicate the prosecution of those who produce sexually explicit deepfakes.
