Apple has removed the apps from its App Store after the BBC found they were being advertised on TikTok as tools for creating nude images of people without their consent.
"Nudify" or "declothing" apps digitally remove clothing from photographs to create nude or partially nude images or videos.
Official App Store pages and the apps' own websites described them as tools to "spice up selfies".
But on TikTok, the apps' makers used paid advertising to promote them as a way to create sexual images of women without their permission.
Users were invited to download the apps to "get 'hot' pictures of their favorite women".
Some ads boasted that the apps would work on "anyone".
Others said they made it possible to put "anyone in a bikini".
We are not naming the apps to avoid promoting them.
Creating sexually explicit images of someone without their consent is illegal in some countries, including the United Kingdom.
"Nudify" apps are a growing problem worldwide, with major scandals erupting in Spain, South Korea and the US after groups of schoolgirls were found to have been victims of the technology.
In some cases, victims have been blackmailed with the deepfake images.
Technology companies have a moral obligation to restrict deepfake and "nudify" apps, says Professor Arvind Narayanan of Princeton's Center for Information Technology Policy.
He says governments around the world should ban these apps from online stores and ensure that social media companies cannot "make money from advertising them".
Demos in some of the TikTok ads used photos of Game of Thrones stars Lena Headey and Maisie Williams to show how the apps could fake images of women stripped down to their underwear.
The BBC identified four such apps paying for advertising on TikTok, which has now removed the ads.
But similar videos, apparently posted by accounts linked to the app makers, remained on the platform.
A TikTok spokesperson said the ads violated the video-sharing platform's policy, which prohibits paid advertising of sexually suggestive content.
The company did not explain how the ads passed its moderation checks.
Apple also responded to the BBC investigation by removing the apps from the App Store.
It said three of the apps had been removed "until the developers fix the issues".
A fourth, which specializes in creating semi-nude dance videos from photos of women, "has been removed and the developer's account has been terminated".
Google said it was reviewing the apps on its Google Play Store.
One of the most popular apps investigated by the BBC has been downloaded more than a million times, according to Google Play figures.
Apple does not publish download data.
The app does not appear to allow fully nude images, but sexually suggestive images of anyone can easily be created.
It offers ready-made "packages" of poses, including images of women lying on the floor with their mouths open, or in a bathtub.
On TikTok, the app is advertised as a way to "get hot pictures of your favorite woman."
The app's maker responded by email, saying it had not intended to advertise itself as a tool for non-consensual deepfakes and that it would review the app.