According to AppleInsider, despite Apple's long-standing claims about the security and ethical standards of its ecosystem, a recent report from the Tech Transparency Project (TTP) has revealed serious oversight failures in the App Store.

The report found that dozens of so-called "AI nudification" apps are openly available on both Apple's and Google's app stores. These apps use artificial intelligence to generate fake nude or pornographic images from ordinary photos without the subject's consent, and victims have even included minors.


The researchers found that users can easily locate these infringing apps by searching for relevant keywords in the stores. According to the report, such apps have been downloaded more than 700 million times worldwide and have generated over $100 million in revenue, with Apple and Google taking millions of dollars in commissions on those sales.

Although Apple's review guidelines explicitly prohibit such pornographic and offensive content, many of these apps evade review by disguising themselves as legitimate video-creation tools. After investigators submitted a list of offending apps, Apple did remove some of them, but a large number of similar apps remain available on the platform.

Critics argue that while Apple collects high commissions, it has failed to build effective safeguards for the AI era. This inaction and silence are seriously undermining trust in Apple's professed "user-first" principle.