Google says that its search suggestions haven’t been altered to help Hillary Clinton. Nor, it says, are they programmed to help any other candidate. “Google autocomplete does not favor any candidate or cause,” a Google spokesperson says in a statement.
The statement comes in response to a YouTube video by SourceFed claiming that Google is hiding negative autocomplete results for Hillary Clinton. The video pointed out that Google doesn’t suggest “Hillary Clinton crimes” when you type in “Hillary Clinton cri,” whereas both Yahoo and Bing do. It found the same thing when typing “Hillary Clinton ind” in search of “Hillary Clinton indictment.” It argued that both ought to show up, given each search’s popularity in Google Trends.
There appear to be a couple of different reasons why this is happening. The first is that Google does, for all people, filter out “offensive or disparaging” search predictions — and “crime” is one of the words Google consistently removes. This is true no matter who you’re searching for, be it Al Capone or, as Vox points out, Bernie Madoff. So Google is hiding the autocomplete suggestion “Hillary Clinton crime,” but it’s also doing that for everyone else, including people who have actually been charged and convicted.
The other reason this is happening involves how people search for negative news about Clinton. Matt Cutts, who is currently on leave from his position as head of Google’s webspam team, explained why this might be on Twitter: those looking for negative stories don’t type Clinton’s last name.
Basically, you can still find negative suggestions about Clinton, so long as you only type her first name.
It’s not clear exactly how far Google goes in removing potentially offensive autocomplete results. As SourceFed’s video points out, many results pop up when you type in “Donald Trump racist” — but results also appear for “Hillary Clinton racist.” And it’s worth noting that Google does filter out both of those phrases; it only returns suggestions with additional words at the end (“racist rally” in Trump’s case, and “racist costume” in Clinton’s case).
Google is likely trying to walk the line here between providing as much information as it can (that generally being its reason for existence, beyond profit) and not stepping into a thorny legal area by, say, suggesting someone is a criminal.
So while it’s true that Google is filtering out negative suggestions from autocomplete, it both claims and appears to be doing so consistently across the board. Here’s a spokesperson’s full statement on the matter:
Google Autocomplete does not favor any candidate or cause. Claims to the contrary simply misunderstand how autocomplete works. Our autocomplete algorithm will not show a predicted query that is offensive or disparaging when displayed in conjunction with a person’s name. More generally, our autocomplete predictions are produced based on a number of factors including the popularity of search terms.
SourceFed says it’s currently working on an update to its video, following Google’s response.