Artificial intelligence continues to grow in capability and reach. Undress AI is one example of this technology that could leave young people open to harm.
Learn what it is so you can better protect your child online.
What is ‘undress AI’?
Undress AI describes a category of tools that use artificial intelligence to remove clothing from individuals in images.
While each app or website works slightly differently, all of them offer this same service. Although the edited photo does not show the victim’s actual naked body, it can imply that it does.
Users of undress AI tools or Perchance AI generators might keep the images for personal use or circulate them more widely. They could use these images for sexual extortion, bullying, or revenge porn.
Children and young people suffer even more when someone uses this technology to ‘undress’ them. The Internet Watch Foundation (IWF) found at least 11,000 suspected AI-generated Child Sexual Abuse Material (CSAM) images on a single dark web forum dedicated to such content. Of these, around 3,000 were assessed as criminal.
The IWF also reported that many of the images depicted known victims and famous children. Generative AI can only produce convincing images if suitable source material was fed in during model training; in other words, AI tools that generate CSAM would need to have been trained on images of child abuse.
Risks to look out for
Undress AI tools use suggestive, graphic language to entice users. Exposed to this language, children are more likely to give in to curiosity.
Children and young people may also not fully grasp the legal implications. As a result, they may struggle to tell abusive tools apart from those that promote harmless fun.
Violating boundaries: inappropriate content and behavior
An AI tool that can ‘undress’ an image of a child exposes children to inappropriate content. Because they might not consider the result a real nude image, they may feel it is acceptable to use such tools; by sharing such an image with friends ‘for a laugh’, they may break the law without realizing it.
The behavior may then persist even when it hurts other people.
Privacy and security risks
Some generative AI tools require payment or a subscription to create images. If a deepnude website is free, however, it may produce substandard images and have weak security. If a child uploads a clothed image of themselves or a friend, the app or website may misuse that image, including the ‘deepnude’ it creates.
Children who use these tools are unlikely to read the Terms of Service or Privacy Policy, so they face risks they do not fully understand.
Child sexual abuse material (CSAM) creation
The IWF also reported that ‘self-generated’ CSAM circulating online grew by 417% from 2019 to 2022. The term ‘self-generated’ is misleading, however: in most instances it involves abuse in which children are coerced into making the images.
Undress AI makes it possible for children to generate AI-based CSAM without realizing it. If a child uploads a clothed photo of themselves or of other children, someone could ‘undress’ that photo and distribute it more widely.
Cyberbullying, abuse and harassment
Like any other type of deepfake, undress AI tools or deepnudes can be used to humiliate or bully someone. This might take the form of a peer claiming the victim sent them a nude image that they never sent, or of bullies using AI-generated nudes to demean an individual.
Moreover, sharing nude images of peers without their permission, or under coercion, is not only abusive but also a crime.
How widespread is ‘deepnude’ technology?
Research shows that use of these AI tools is on the rise, especially for removing clothing from female victims.
One undress AI site says its technology is ‘not intended for male subjects’ because the tool was trained on female imagery, which is true of most such tools. Of the AI-generated CSAM the Internet Watch Foundation examined, 99.6% also depicted female children.
Research from Graphika highlighted a 2,000% increase in referral link spam for undress AI services in 2023. The same report found that 34 of these providers’ websites received over 24 million unique visits in a single month. Graphika predicts further online harm as a result, including sextortion and CSAM.
Because these tools learn mainly from female images, perpetrators are far more likely to target girls and women than boys and men.
What does US law say?
It is illegal to make, share and possess sexually explicit deepfake images of children.
Undress AI, which has the capacity to ‘undress’ people in images, raises both legal and moral issues. At the federal level, no current law bans the production of sexual deepfake content depicting adults; the creation and distribution of such content involving minors, however, is strictly prohibited. In the UK, by comparison, the Online Safety Act 2023 outlaws the distribution of private intimate images, including deepfakes, without consent.
The law on deepfakes is still developing. Some states have passed deepfake legislation, while others have yet to begin the process. Consult an attorney to understand all the laws that apply in your region.
The moral implications of undress AI are also significant. The technology can be used to produce explicit, non-consensual imagery that does serious damage to victims, so it is important to approach it with an awareness of how destructive it can be.
How to keep children safe from undress AI
Whether you’re concerned about your child using undress AI tools or becoming a victim, here are some actions to take to protect them.
Start the conversation early
More than 1 in 4 children in the US report having seen pornography by the age of 11, and 10% say they first saw it by the age of 9. Children may feel similarly curious about undress AI tools and try to search for them. With that in mind, educate them about respectful content, healthy relationships, and good behavior well before they reach that stage.
Restrict access to apps and websites
Set content filtering on devices, apps, mobile networks, and home broadband to reduce the chance of stumbling on inappropriate search results. Filtering helps prevent children from being exposed to unsuitable material while online. You can lift these restrictions later, once you have had further conversations with your child.
Build children’s digital resilience
Digital resilience is a skill children must build. It means they can identify potential online harm and take action when needed: they know how to report and block, and they think critically about the content they come across. It also includes knowing when to get help from a parent, carer, or other trusted adult.