Google introduced a new policy Wednesday that allows minors or their caregivers to request that their images be removed from the company’s search results, saying that “kids and teens have to navigate some unique challenges online, especially when a picture of them is unexpectedly available on the internet.”
The policy follows through on Google’s announcement in August that it would take a number of steps aimed at protecting minors’ privacy and mental well-being, giving them more control over how they appear online.
You can fill out a form to ask that an image be removed
Google says the process for taking a minor’s image out of its search results starts with filling out a form that asks for the URL of the target image. The form also asks for the URL of the Google search page used to find the image, and the search terms that were used. The company will then evaluate the removal request.
While the request could wind up scrubbing problematic images from Google’s search tools, “It’s important to note that removing an image from Google results doesn’t remove it from the internet,” the company said as it announced the policy.
The changes come after Google and other tech companies have faced intense criticism for their policies toward children, who now live in the public eye more than any previous generation — facing the prospect of having any moment in their lives shared and preserved online, regardless of their own wishes.
The tool states that it is intended for cases in which the subject is under 18. Google says that if adults want material related to them to be removed, they should use a separate set of options.
Google has faced pressure to protect children and privacy
In 2019, Google paid a $170 million settlement to federal and state regulators over allegations that its YouTube subsidiary had collected personal information from children without their parents’ knowledge or consent.
“Our children’s privacy law doesn’t allow companies to track kids across the internet and collect individual data on them without their parents’ consent,” then-FTC Commissioner Rohit Chopra told NPR at the time. “And that’s exactly what YouTube did, and YouTube knew it was targeting children with some of these videos.”
When Google first announced the image-removal initiative in August, it also pledged to block ads that target people younger than 18 based on their age, gender or interests. The company also said its YouTube division would change the default privacy setting on video uploads from teens ages 13 to 17 to the most restrictive option.
One of the biggest early adjustments to Google’s search tools stemmed from Europe, where a Spanish man’s case established the “right to be forgotten” in 2014. In the four years that followed, Google said, people made more than 650,000 requests to remove specific websites from its search results.