Apple and Google Broke Their Own Rules by Promoting ‘Nudify’ Apps, Report Says


If you want an app you’ve created to be downloadable from the Apple App Store or Google Play Store, it must meet many criteria, including security standards.

But a new report released Wednesday claims that Apple and Google broke their own rules by promoting “nudify” apps that are banned in their app store policies.

The Tech Transparency Project (TTP), an initiative of a nonprofit tech watchdog, first reported in January that the Apple and Google app stores offer more than 100 nudification, or "undressing," apps. These apps exist solely to take images of people, usually women, and edit them to make the person appear nude, creating what are known as non-consensual intimate images. Many of these apps use generative AI to produce deepfakes.

Apple subsequently removed some of the flagged apps. But many remain available, as a follow-up investigation demonstrated.

In April, TTP discovered that Apple and Google still allowed users to search for a number of troubling keywords, including "nudify," "undress" and "deepnude." After an in-depth review of the top 10 apps returned by those searches in both app stores, TTP found that 40% of the apps presented themselves as being able to "make women nude or scantily clad," according to the report.

The new report also reveals that Google and Apple were indeed promoting such apps in their stores, thereby increasing their visibility, with Google notably creating “a carousel of ads for some of the most sexually explicit apps encountered during the investigation.”


Apple and Google both have language in their policies that prohibits apps containing “overtly sexual or pornographic material” (Apple) and “sexually suggestive poses in which the subject is nude, blurred, or scantily clad” (Google). And they have both enforced these policies in the past, including going after pornographic apps.

But Apple and Google make money from app developers by serving advertising and taking a cut of paid app subscriptions. Analytics firm AppMagic found that these “nudify” apps were downloaded 483 million times and generated more than $122 million in revenue over their lifetime.

“This revenue stream could be why both companies have been less than vigilant when it comes to nudifying apps that violate their policies,” TTP writes.


Google told CNET that Google Play does not allow apps containing sexual content and that many of the apps referenced in the report have been suspended for violating its policies.

Apple told CNET that it removed 15 of the apps flagged in the report and contacted six other app developers, informing them they had to fix the issues or risk being removed from the store. It also blocked several additional search terms reported by TTP.

Non-consensual sexually explicit content is a growing problem, driven in part by AI. We saw with stark clarity how AI-enabled apps can be used to create this illegal and abusive content earlier this year, when Grok users made 1.4 million sexualized deepfakes over a nine-day period.

Some US senators then asked Apple and Google to remove Grok from their app stores, but neither removed it.

We learned this week that Apple had privately contacted Grok's maker with concerns about the app's abusive AI capabilities and threatened to remove it. Grok is still available in the Apple and Google app stores and is reportedly still capable of creating abusive sexual images with AI, despite the company's claims to the contrary.
