Apple’s New Child Safety Measures: What Parents and Developers Need to Know
In an era where children are increasingly immersed in the digital world, Apple is taking significant steps to create a safer online environment for minors. On February 27, 2025, the tech giant unveiled a series of new child safety initiatives aimed at empowering parents and holding app developers accountable for age-appropriate experiences. These changes, detailed in a white paper published by Apple, represent a major shift in how online safety for children is addressed, sparking both praise and debate within the tech industry.
Central to Apple’s strategy is the introduction of the Declared Range API, set to launch later in 2025. This tool allows app developers to request an approximate age range for users with Child Accounts. Parents will receive a notification to approve the age range, similar to prompts for App Tracking or Location Services. While this feature can be turned off, it offers a streamlined way for developers to tailor content without relying on invasive age verification methods like government IDs.
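Apple’s white paper describes this flow only at a high level, and the API had not shipped at the time of writing, so the following Swift sketch is purely illustrative: the framework name `DeclaredAgeRange`, the `AgeRangeService` type, and the `requestAgeRange(ageGates:)` call are assumptions about how a developer might consume a parent-approved age range, not confirmed signatures.

```swift
// Hypothetical sketch of consuming the Declared Range API.
// Framework, type, and method names are assumptions, not Apple's
// confirmed API surface.
import DeclaredAgeRange  // assumed framework name

// Placeholder tiers representing an app's own content gating.
func enableFullExperience() { /* 18+ content */ }
func enableTeenExperience() { /* 13–17 content */ }
func enableChildSafeExperience() { /* most restrictive defaults */ }

func configureContent() async {
    do {
        // Ask the system for an approximate age range; on a Child Account,
        // the parent is prompted to approve sharing it, similar to the
        // App Tracking or Location Services prompts.
        let declaration = try await AgeRangeService.shared
            .requestAgeRange(ageGates: 13, 16, 18)  // assumed signature

        switch declaration.lowerBound {
        case .some(let age) where age >= 18:
            enableFullExperience()
        case .some(let age) where age >= 13:
            enableTeenExperience()
        default:
            // Range withheld, or below the youngest gate: default to the
            // most restrictive, age-appropriate experience.
            enableChildSafeExperience()
        }
    } catch {
        // If the request fails (e.g., the feature is turned off), fall back
        // to the safest defaults rather than blocking the user.
        enableChildSafeExperience()
    }
}
```

The key design point the sketch captures is the one Apple emphasizes: the app receives only an approximate range, never a birthdate or government ID, and must degrade gracefully to safe defaults when no range is shared.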
However, Apple’s decision to place the responsibility for age verification on app developers has stirred controversy. Unlike companies such as Meta, which advocate for app marketplaces to handle age verification, Apple believes the burden should fall on the apps themselves. As Apple stated, “The right place to address the dangers of age-restricted content online is the limited set of websites and apps that host that kind of content.” This stance puts Apple at odds with state legislatures and other tech giants pushing for stricter controls at the marketplace level.
For parents, Apple is simplifying the process of setting up and managing Child Accounts. During device setup, parents can choose pre-selected safety settings based on age ranges or accept the default child settings, and these can be customized later as children grow. Apple is also introducing more specific age ratings for apps—4+, 9+, 13+, 16+, and 18+—so that apps rated above a child’s age don’t appear in the Today, Games, or Apps tabs on devices with youth accounts.
Apple’s announcement comes amid ongoing discussions about the Kids Online Safety Act, proposed in 2023. This legislation would require platforms to enforce the strongest privacy settings for underage users and hold social media companies accountable for harmful content. Apple’s stance emphasizes minimizing data collection while empowering parents and developers to take action.
As Chase DiBenedetto, a Social Good Reporter at Mashable, highlights, protecting children online requires constant vigilance. Apple’s new measures are a step in the right direction, but they also raise important questions about privacy, responsibility, and the effectiveness of age verification.
For parents, these changes mean more tools to safeguard their children’s online experiences. For developers, it’s a call to action to prioritize age-appropriate content and safety features. As the digital world evolves, so too must our efforts to protect its youngest users.
What do you think about Apple’s new child safety measures? Are they enough, or is more needed to ensure a safer online environment for kids? Share your thoughts in the comments below.