How was an app used to 'undress' women even created in the first place?
It's 2019.
Technology is progressing at such a terrifying rate that updates on the subject rarely surprise me.
I was stopped in my tracks this weekend, however, when, during a Saturday morning news scour, I came across a story about how an app used to 'undress' women had been taken offline by its creators.
Sorry, what?
How was such an app even allowed to be made in the first place?
With the tagline 'the superpower you always wanted', the app allowed users (for a fee, of course) to 'undress' photographs of clothed women using a machine-learning algorithm.
According to tech news site Motherboard, the program takes a photo of a clothed woman and transforms it into a nude image, superimposing body parts onto it.
And in case you were wondering, the algorithm reportedly only works on images of women.
I repeat. Sorry, what?
The creators have recently taken the app offline, but reading the statement they released didn't make me feel much better.
‘Here is the brief history, and the end of DeepNude,’ the app announced in a statement. ‘We created this project for user’s entertainment a few months ago. We thought we were selling a few sales every month in a controlled manner. Honestly, the app is not that great, it only works with particular photos. We never thought it would become viral and we would not be able to control the traffic. We greatly underestimated the request.’
Hmmmmmm. Not sure the inability to keep up with requests is the issue here, but go on.
The statement continues: ‘Despite the safety measures adopted (watermarks) if 500,000 people use it, the probability that people will misuse it is too high. We don’t want to make money this way. Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones who sell it. Downloading the software from other sources or starting it by any other means would be against the terms of our website. From now on, DeepNude will not release other versions and does not grant anyone its use. Not even the licenses to activate the Premium version.’
The statement concludes: ‘People who have not yet upgraded will receive a refund,’ before ending on the line: ‘The world is not yet ready for DeepNude’.
No. The world is not ready for DeepNude, nor should it even be ready for DeepNude.
It is programs like this that make revenge porn and similar dangerous phenomena a very real threat.
And while the app has now been taken down, we can't stop there. We need to be asking why (and how) the app was allowed to be created, and change the laws around it.
If we don't start taking a stand against things like this now, the future promises to be very bleak.
Jenny Proudfoot is an award-winning journalist, specialising in lifestyle, culture, entertainment, international development and politics. She has worked at Marie Claire UK for seven years, rising from intern to Features Editor and is now the most published Marie Claire writer of all time. She was made a 30 under 30 award-winner last year and named a rising star in journalism by the Professional Publishers Association.