Apple and Google have come under renewed fire after the non-profit Tech Transparency Project uncovered dozens of so-called “nudify” apps in their official stores. These apps use artificial intelligence to create nude images from ordinary photos, often without the knowledge or consent of the person being depicted.
Researchers found that simply entering words like “undress” or “nudify” into a search engine surfaced apps that remove clothing from photos or place people's faces on naked bodies. Testing confirmed that the purpose of these applications is strongly sexualized, not merely “entertainment” or photo editing.
After being notified, Apple removed several apps from the App Store and sent warnings to their developers. However, some of them reappeared after being “edited.” Google stated that it had deleted several apps from Google Play and continues to monitor the store. Both companies distance themselves from tools that create such inappropriate content and promise to take action against their spread.
However, critics point out that both companies reacted too late. Many of the apps already had millions of downloads and generated significant revenue, on which Apple and Google collected commissions. This raises serious questions about privacy, consent, and the effectiveness of oversight on digital platforms.
Similar controversies aren't limited to mobile apps. Grok, the AI tool from xAI, recently faced a wave of criticism after producing sexualized outputs for months; its developers restricted the chatbot only after public pressure. It seems that most companies act only when confronted with a scandal, not preventively.
The public and governments around the world are therefore calling for stricter oversight of technologies that can abuse AI to violate dignity and privacy. Apple and Google now face the question: how did these apps get this far in the first place?