Banned yet still rampant! Apple Inc. (AAPL.US) and Alphabet Inc. (GOOGL.US) app stores are flooded with "AI undressing" applications, quietly collecting millions in commissions.

09:09 16/04/2026
GMT Eight
According to a report released on Wednesday by the Tech Transparency Project, the research arm of the non-profit Campaign for Accountability, Apple Inc. and Google both have policies in place that explicitly prohibit non-consensual nudity, yet both companies continue to offer related mobile applications in their app stores.

The report points out that by searching for keywords such as "nudify" and "undress" in Apple's App Store and Google Play, users can find multiple apps that alter photos of celebrities and ordinary people to make them appear nude or partially nude. Both companies also surface advertisements for similar "undressing" apps in their search results. According to revenue estimates from the market research firm AppMagic cited in the report, the apps the organization identified have been downloaded a combined 483 million times and have generated $122 million in revenue. A spokesperson for AppMagic said the Tech Transparency Project's report had already resulted in the removal of multiple apps and prompted others to modify their user agreements.

Over the past year, calls from politicians around the world to crack down on the proliferation of such apps have grown louder. Earlier this year, Apple and Google removed problematic apps flagged by the Tech Transparency Project, but researchers said that within just a few months, dozens of similar apps had reappeared. "The problem is not only that these companies have failed to properly vet these apps and continue to allow them to be published and profit from them," project director Katie Paul said in an interview. "They are even actively promoting these apps to users."

Through app store searches, the organization found 18 apps with "undressing" features in Apple's App Store and 20 in Google Play. Researchers said both companies' search autocomplete also suggests more similar apps, effectively steering users toward them. Some apps carry names and imagery with overt sexual implications; others do not market themselves for such purposes but can easily be used for them, with a lower barrier to entry than traditional photo-editing software. The Tech Transparency Project noted that some of these apps also sell subscriptions.

Apple's developer guidelines explicitly prohibit overtly pornographic or obscene content. Google Play's policies prohibit apps that degrade or objectify people, including those claiming to undress individuals or see through clothing, even when they are labeled as pranks or entertainment. On Google's side, multiple apps mentioned in the report have been removed from Google Play for policy violations, and related investigations are ongoing. "After receiving reports of violations, we investigate and take appropriate action," Google said in an emailed statement.

Apple said that after being asked by the media about such apps, it removed 15 of those identified by the organization. Researchers said the removed apps include PicsVid AI Hot Video Generator; the PicsVid developers did not respond to requests for comment. Another flagged app, Uncensored AI No Filter Chat, was able to remove clothing from images of women uploaded by researchers. A representative for the developer said the app has since removed its "undressing" feature. Apple also said it has contacted six app developers to notify them of issues that must be addressed or risk removal, that other apps cited by the Tech Transparency Project did not violate its guidelines, and that it has proactively rejected many app submissions and removed other violating apps.

Anne Helmond, a professor at Utrecht University in the Netherlands, commented that the two tech giants' enforcement actions are "inconsistent and extremely low in transparency." "If an app presents itself as a general image generator, it may pass the review process even if it can be abused in practice," said Helmond, who also heads the international research organization App Studies Initiative. "The visibility of apps is determined by ranking and search systems, and these mechanisms are engagement-driven, meaning that controversial uses may actually increase an app's exposure."

One app researchers found on Google Play, Video Face Swap AI: DeepFace, advertised the ability to swap the face of actress Anya Taylor-Joy onto the character Daenerys Targaryen from "Game of Thrones." The investigation found, however, that under the app's "Girls" category, users could superimpose other people's faces onto explicit videos. The app, rated "E for Everyone," has been downloaded more than a million times and surfaces readily in searches for "face swap." Its developer, Okapi Software, said it has launched an investigation into the reported issues and removed certain user-uploaded content. "This app does not provide 'undressing' features or allow the generation of nude or explicit content," Okapi said. "We prioritize content safety and compliance."

Regulators are increasingly calling on both companies to step up enforcement of their own policies. Last year, President Trump signed the Take It Down Act, which makes the dissemination of non-consensual intimate images a criminal offense and compels social media platforms and websites to remove such content. In April of this year, the UK government said it plans to introduce legislation that would provide a pathway for prosecuting tech company executives who fail to effectively remove such images.
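The keyword-search step the report describes is partially reproducible by outsiders, since Apple exposes a documented public search endpoint (the iTunes Search API); Google Play has no equivalent official search API, so that side is omitted here. Below is a minimal Python sketch, assuming the two keywords the report cites; the endpoint and response fields (trackName, sellerName) are part of Apple's documented API, but the keyword list and output format are illustrative assumptions, not the Tech Transparency Project's actual tooling.

```python
import json
import urllib.parse
import urllib.request

# Keywords the report says researchers searched for (illustrative list).
KEYWORDS = ["nudify", "undress"]

# Apple's public, documented iTunes Search API endpoint.
SEARCH_URL = "https://itunes.apple.com/search"


def search_app_store(term: str, country: str = "us", limit: int = 25) -> list[dict]:
    """Return App Store software results for a search term."""
    query = urllib.parse.urlencode({
        "term": term,
        "entity": "software",  # restrict results to apps
        "country": country,    # storefront to query
        "limit": limit,
    })
    with urllib.request.urlopen(f"{SEARCH_URL}?{query}") as resp:
        return json.load(resp).get("results", [])


if __name__ == "__main__":
    for keyword in KEYWORDS:
        apps = search_app_store(keyword)
        print(f"'{keyword}': {len(apps)} results")
        for app in apps:
            # trackName and sellerName are standard fields in this API's response.
            print(f"  {app.get('trackName')} ({app.get('sellerName')})")
```

A script like this only shows what the search endpoint returns for a given storefront; it cannot capture the autocomplete suggestions or advertised placements that the report also criticizes, which are visible only inside the store apps themselves.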