Google has debuted a new default SafeSearch setting, somewhere between “on” and “off,” that automatically blurs explicit images in search results for most people.
In a blog post timed to Safer Internet Day, Google outlined a number of measures it plans to implement to “protect democracies worldwide,” secure high-risk individuals, improve password management, and protect credit card numbers. Tucked into a series of small-to-medium announcements is a notable change to Search, Google’s second core product after advertising.
A new setting, rolling out “in the coming months,” “will blur explicit imagery if it appears in Search results when SafeSearch filtering isn’t turned on,” writes Google’s Jen Fitzpatrick, senior vice president of Core Systems & Experiences. “This setting will be the new default for people who don’t already have the SafeSearch filter turned on, with the option to adjust settings at any time.”
Google’s explanatory image (seen above) shows someone logged in and searching for images of “Injury.” A notice shows that “Google turned on SafeSearch blurring,” which “blurs explicit images in your search results.” One of the example image results, “Dismounted Complex Blast Injury (DCBI)” from ResearchGate, is indeed quite explicit, as far as human viscera and musculature go. Google provides one last check if you click on that blurred image: “This image may contain explicit content. SafeSearch blurring is on.”
If you click “View image,” you see life’s frail nature. If you click “Manage setting,” you can choose from three options: Filter (explicit results don’t show up at all), Blur (explicit images are blurred behind an are-you-sure click), and Off (you see “all relevant results, even if they’re explicit”).
Signed-in users under the age of 18 automatically have SafeSearch enabled, blocking content including “pornography, violence, and gore.” With this change, Google will blur explicit content by default for everyone using Google who doesn’t log in, stay logged in, and specifically ask to see it. It’s a way to prevent children from getting access to explicit images, but also, notably, a means of ensuring people are logged in to Google if they’re looking for something… very specific. An incognito window, it seems, just won’t do.
Google turned on SafeSearch as its default for under-18 users in August 2021, having been pressured by Congress to better protect children across its services, including search and YouTube.