A recent investigation conducted by child safety organization Heat Initiative has exposed alarming gaps in Apple’s App Store rating system, revealing that numerous apps containing inappropriate content are labeled as safe for children as young as four years old.
Collaborating with a researcher, the group analyzed as many apps as possible within a 24-hour timeframe, uncovering more than 200 questionable apps with content or features unsuitable for young audiences. Some of these apps included stranger chat platforms, AI-generated girlfriend simulators, sexualized or violent games, and AI tools for rating users' appearance—all accessible under age ratings of 4+, 9+, and 12+.
Investigation Details for Unsafe Apps
The investigation specifically targeted high-risk categories, including chat apps, body image tools, diet and weight loss platforms, unrestricted internet access, and violent games. The findings revealed:
- 24 sexualized games labeled safe for young children.
- 9 stranger chat platforms marked appropriate for kids, despite potential risks of online predators.
- 40 apps providing unfiltered internet access, bypassing restrictions often set by parents or schools.
- 75 apps promoting beauty standards, weight loss, and body image concerns rated as child-friendly.
- 28 shooter and crime-themed games labeled as suitable for children.
Collectively, these problematic apps have been downloaded more than 550 million times, underscoring the vast reach of potentially harmful content and the urgent need to reform app rating standards.
Categories Most Affected by Lax Ratings
While analyzing approximately 800 apps, the researchers identified trends in rating discrepancies across categories. Stranger chat apps and violent games were more frequently rated 17+, reflecting slightly stricter standards in these categories.
Weight loss apps and platforms offering unfiltered internet access, however, were often assigned ratings of 4+, exposing young children to content that could negatively impact their mental health and development.
The inconsistent and lax rating system raises serious concerns about the App Store’s safety protocols, particularly for vulnerable children who rely on parental controls and ratings to navigate online content safely.
How Apple Needs to Tighten Safety Standards
The report, titled “Rotten Ratings: 24 Hours in Apple’s App Store,” has prompted urgent calls for Apple to strengthen its content evaluation and app rating policies.
The report's authors and other critics are calling on Apple to:
- Implement third-party reviewers to independently assess app content and ratings before approval.
- Increase transparency regarding how age ratings are assigned, enabling parents to make more informed decisions.
- Establish stricter content filtering systems to detect and prevent inappropriate apps from being misclassified.