One of the most notable changes to Google’s policy, announced in a blog post, is that YouTube videos created by 13-17-year-olds will be set to private by default. Other YouTube changes include automatic break and bedtime reminders for the same age group and the removal of “overly commercial content” that might encourage kids to spend money.

Another significant change is the ability for minors or their parents to request the removal of their images from Google’s image results. Google notes this feature won’t remove the image from the web entirely, but it should give young people more control over their pictures online.

Finally, Engadget notes other changes, such as automatically enabling SafeSearch and disabling location history for users under 18, with no option for minors to turn location history back on. Google also said it would block ad targeting for minors based on their age, gender, or interests.

Google said these policies and updates address concerns from parents, educators, and privacy experts. “Having an accurate age for a user can be an important element in providing experiences tailored to their needs. Yet, knowing the accurate age of our users across multiple products and surfaces, while at the same time respecting their privacy and ensuring that our services remain accessible, is a complex challenge,” Google wrote in its blog post. “It will require input from regulators, lawmakers, industry bodies, technology providers, and others to address it—and to ensure that we all build a safer internet for kids.”

Google isn’t the only company to prioritize underage users’ experience in recent months. Instagram announced updates in July that automatically default any new user under the age of 16 to a private account. The social network is also implementing new technology designed to weed out accounts that have shown suspicious behavior toward younger users, so those accounts won’t appear in the Explore tab or in Reels.