Google will blur explicit images in search by default

Google Search is about to get a little more PG. The company announced today that, “in the coming months,” it will turn on a setting that blurs potentially explicit images in its search results.
The move is likely to be popular with parents and others who don’t want to see nudity or other potentially offensive content returned when they search for, say, “cat.” But it may not sit well with some who feel that Google is censoring its results.
The change will only affect image results, not webpages or other types of content. And it will only affect results shown in the US, at least initially.
To be clear, not every potentially offensive image will be blurred. Google will still return some results that it deems safe for work; it just won’t show the most explicit ones by default.
Users will still be able to see such results if they manually turn off the setting. And, of course, they can always find them by searching on another site, such as Bing or DuckDuckGo.
Still, the change is likely to reduce the visibility of sexually explicit and violent images, at least for those who don’t know about the setting or don’t bother to change it.
The setting will be turned on by default for signed-in users who have “SafeSearch” turned on. SafeSearch is a setting that filters out explicit results for all Google services, not just Search.
Google is making the change because “we’ve heard feedback that people would like more control over the images that appear in their search results,” a company spokesperson said.
The company isn’t saying how it will decide which images to blur. But it’s likely that it will use a combination of automated systems and manual review by staff.
The move comes as Google is facing increasing scrutiny over its handling of offensive content. In December, for example, the company was criticized for featuring an image of a bomb in its image search results for the word “burqa.”
Last year, the company also came under fire for its handling of “revenge porn” – images that are posted without the subject’s consent, often by an ex-partner.
Google has been working to remove such images from its search results and has said it will take down such images if they are reported by the subject. But some have accused the company of not doing enough to prevent such images from being uploaded in the first place.
The company is now testing a feature that would allow users to flag images that they believe to be revenge porn. It’s not clear if or when that feature will be rolled out more broadly.
The new setting for image search results is likely to be greeted with mixed reactions. Some will see it as a welcome step by a company that is finally taking responsibility for the content it puts in front of users.
Others will argue that the company is censoring its results, while still others will say the step should have been taken sooner. And some will no doubt see it as a pointless exercise that will do little to stop people from seeing offensive content.