The DeepNude app used AI to produce a realistic nude version of a photo of a clothed woman, preferably one wearing a tight-fitting bikini.
There was a free version that placed a large "FAKE" watermark across the image, and a premium subscription version that produced better-quality (higher-resolution) results with only a small watermark in the corner, which could easily be removed.
However, hours after the app was exposed, the site went offline, reportedly due to overwhelming demand (or so it was said at the time), and the developers have since announced on Twitter that all distribution and development activity is to stop immediately.
The reason given was that they “greatly underestimated” interest in the project and that “the probability that people will misuse it is too high.”
As there was the potential, nay certainty, of making lots of money from it, one can only suspect that the developers got scared of possible legal action against them by celebrities and feminists. As someone put it on Twitter:
“yep, a tsunami of lawsuits from wealthy celebrities ready to say they felt raped because the app removed that thin almost non existent bikini they wore on the beach 1 year ago hahahah”
To which one may add, "…the same celebs who have already appeared nude on screen, but that was for...artistic purposes."
Well, it seems the world wasn’t ready for such an app…
...However, that’s only the story so far, because how can you make something so popular, and already downloaded many thousands of times, just disappear from the web? I consider it a certainty that there will be:
pirated versions of the app available for download, and
plenty of developers who will start work on improving it.
In fact, there are already claims of cracked premium versions without the watermark on offer, e.g. on Reddit, if anybody is interested, for...educational purposes.
https://www.theverge.com/2019/6/27/18761496/deepnude-shuts-down-deepfake-nude-ai-app-women