What is Undress AI? Guidance for Parents and Caregivers
Sheena was once an English and PSHE educator. She now serves as the Digital Content Manager at Internet Matters, providing support for online safety advice content on the website, with partners, and for the Digital Matters learning materials.
The capabilities and technologies of artificial intelligence are constantly advancing. AI-driven undressing is an example that could potentially expose young people to harm.
Understand what it is so you can better protect your children online.
Summary
- Undress AI is an application that uses artificial intelligence to alter images or videos.
- Risks include exposure to inappropriate content, bullying, abuse, and impacts on mental health.
- Research by Graphika shows an increase of more than 2,000% in spam referral links directing to 'deepnude' websites.
- Now, the creation and distribution of 'intimate' deepfakes are illegal.
- Preventive conversations about undress AI can deter children from engaging with it.
What is "Undress AI"?
Undress AI describes a tool that uses artificial intelligence to remove clothes from images of individuals.
Although the operation of each app or website may vary, they all offer a similar service. Even though the processed images do not actually show the victims' real nudity, they can imply it.
Criminals using undress AI tools may keep these images for themselves or share them more broadly. They can use these images for sextortion, bullying/abuse, or as a form of revenge pornography.
If someone uses this technology to "undress" children and adolescents, it causes additional harm. A report by the Internet Watch Foundation (IWF) found over 11,000 potentially criminal AI-generated images of children on a dark web forum dedicated to child sexual abuse material (CSAM). Around 3,000 of those images were assessed as criminal.
The IWF noted that it also found "many examples of AI-generated images featuring known victims and famous children." Generative AI can only create convincing images if it learns from accurate source material. Essentially, AI tools that generate CSAM need to learn from real images of child abuse.
Risks to Be Aware Of
Undress AI tools use suggestive language to attract users, so children may follow their curiosity based on this language. Children and adolescents may not yet understand the law, so they might struggle to distinguish harmful tools from those offering harmless fun.
Inappropriate Content and Behavior
The curiosity and novelty of undress AI tools may expose children to inappropriate content. Because these tools do not display "real" nudity, children might think it is okay to use them. If they then share the resulting image "for a laugh" with friends, they are likely breaking the law without realising it.
Without intervention from parents or caregivers, they might continue such behavior, even though it could harm others.
Privacy and Security Risks
Many legitimate generative AI tools require payment or a subscription to create images. So, if a deepnude website is free, it might produce low-quality images or have lax security. If a child uploads a clothed photo of themselves or a friend, the website or app could misuse it, along with the "deepnudes" it creates.
Children using these tools are unlikely to read the terms of service or privacy policy, so they face risks they may not understand.
Production of Child Sexual Abuse Material (CSAM)
The Internet Watch Foundation also reports that "self-generated" CSAM circulating online increased by 417% from 2019 to 2022. Note that the term "self-generated" is imperfect because, in most cases, an abuser coerces the child into creating these images.
However, with the use of undress AI, children might unwittingly create AI-generated CSAM. If they upload their own or another child's clothed photo, someone can "nudify" the image and share it more widely.
Cyberbullying, Abuse, and Harassment
Just like other types of deepfakes, people can use undress AI tools or "deepnudes" to bully others. This might include claiming that a peer sent a nude photo of themselves when they did not, or using AI to create a nude image with features the bully can then mock.
It's important to remember that sharing a peer's nude photos is both illegal and abusive.
How Common Is "Deepnude" Technology?
Studies show that use of this type of AI tool is increasing, particularly to "undress" female victims.
An undress AI website states that their technology "does not work on male subjects." This is because they use female images to train the tool, which is the case for most such AI tools. In the Internet Watch Foundation's survey of AI-generated CSAM, 99.6% of the content was of female children.
Research by Graphika found that spam referral links to undress AI services increased by more than 2,000% in 2023. The report also found that 34 such providers received over 24 million unique visitors in a single month. Graphika predicts "further instances of online harm," including sextortion and CSAM.
Criminals are likely to continue targeting girls and women, rather than boys and men, especially if these tools primarily learn from female images.
What Does UK Law Say?
Until recently, those who created explicit deepfakes were not breaking the law, unless the images were of children.
However, the Ministry of Justice recently announced new legislation that changes this. Under the new law, those who create sexually explicit deepfakes of adults without their consent will face prosecution, and convicted individuals will also face "an unlimited fine."
This reverses a position set out in a statement published in early 2024, which argued that creating deepfake intimate images "is not sufficiently harmful or culpable to constitute a criminal offense."
Until recently, offenders could still create and share these (adult) images without breaking the law. However, as of January 2024, the Online Safety Act made it illegal to share AI-generated intimate images without consent.
In general, the law should cover any images with a sexual nature. This includes those with nudity or partial nudity themes.
However, one thing to note is that the law relies on the intent to cause harm. Thus, those who create explicit deepf