Tag Archives: Google

Google Autocomplete revisited

Google autocomplete suggestions for the search query "Google autocomplete re"

«Did Google Manipulate Search for [presidential candidate]?» was the title of a video that showed up in my facebook feed. In it, the video host argued that upon entering a particular presidential candidate’s name into Google’s query bar, very specific autocomplete suggestions do not show up although – according to the host – they should.

I will address the problems with this claim at a later point, but let’s start by noting that the argument was quickly picked up (and sometimes transformed) by blogs and news outlets alike, inspiring titles such as «Google searches for [candidate] yield favorable autocomplete results, report shows», «Did [candidate]’s campaign boost her image with a Google bomb?», «Google is manipulating search results in favor of [candidate]», and «Google Accused of Rigging Search Results to Favor [candidate]». (Perhaps the most accurate title from the first wave of reporting comes from the Washington Times: «Google accused of manipulating searches, burying negative stories about [candidate]».)

I could not help but notice the shift of focus from Google Autocomplete to Google Search results in some of the reporting, and there is of course a link between the two. But it is important to keep in mind that manipulating autocomplete suggestions is not the same as manipulating search results, and careless sweeping statements are no help if we want to understand what is going on, and what is at stake – which is what I had set out to do for the first time almost four years ago.

Indeed, Google Autocomplete is not a new topic. For me, it started in 2012, when my transition from entrepreneurship and consulting into academia was smoothed by a temporary appointment at the extremely dynamic, innovative DHLab. My supervising professor was a very rigorous mentor while giving me great freedom to explore the topics I cared about. Between his expertise in artificial intelligence and digital humanities and my background in sociology, political economy and information management, we identified a shared interest in researching Google Autocomplete algorithms. I presented the results of our preliminary study in Lincoln, NE, at DH2013, the annual Digital Humanities conference. We argued that autocompletions can be considered “linguistic prosthesis” because they mediate between our thoughts and how we express these thoughts in written language. Furthermore, we underlined how mediation by autocompletion algorithms acts in a particularly powerful way because it intervenes before we have completed formulating our thoughts in writing and may therefore have the potential to influence actual search queries. A great paper by Baker & Potts, published in 2013, came to the same conclusion and questions “the extent to which such algorithms inadvertently help to perpetuate negative stereotypes”.

Back to the video and its claim that, upon entering a particular presidential candidate’s name into Google’s query bar, very specific autocomplete suggestions do not show up although they should. But why should they show up? The explanation … Continue reading

Google’s autocompletion: algorithms, stereotypes and accountability

“questions” by xkcd

Women need to be put in their place. Women cannot be trusted. Women shouldn’t have rights. Women should be in the kitchen. …

You might have come across the latest UN Women awareness campaign. Originally in print, it has been spreading online for almost two days. It shows four women, each “silenced” by a screenshot of a particular Google search and its suggested autocompletions.

Researching interaction with Google’s algorithms for my PhD, I cannot help but add my two cents and further reading suggestions in the links …

UN Women campaign ad: “Women should have the right to make their own decisions”

Guess what people’s most common reaction was?

They headed over to Google to check the “veracity” of the screenshots and test the autocompletions suggested for “Women should …” and other expressions. I have seen this done all around me, on sociology blogs as well as by people I know.
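For anyone who wants to repeat that check without opening a browser: the little Python sketch below queries Google’s unofficial suggest endpoint (suggestqueries.google.com). To be clear, this endpoint is undocumented and not an official API, and what it returns depends on language, region and time – so take it as an illustrative assumption, not a stable interface, and certainly not the setup used in my research.

```python
# Minimal sketch: fetch Google's autocompletion suggestions for a given prefix.
# Relies on the unofficial suggest endpoint, which is undocumented and may change.
import requests


def google_suggestions(prefix, lang="en"):
    """Return the list of suggestions Google's suggest endpoint offers for `prefix`."""
    response = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "hl": lang, "q": prefix},
        timeout=10,
    )
    response.raise_for_status()
    # The endpoint answers with JSON of the form [query, [suggestion, suggestion, ...]]
    return response.json()[1]


if __name__ == "__main__":
    for suggestion in google_suggestions("women should"):
        print(suggestion)
```

The results will of course not match the campaign screenshots exactly: what is suggested depends on when, where and in which language you ask – which is precisely part of the point.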

In terms of an awareness campaign, this is a great success.

And more awareness is a good thing. As the video “autofill: a gender study” concludes, “The first step to solving a problem is recognizing there is one.” However, people’s reactions have reminded me, once again, how little the autocompletion function had been problematized, in general, before the UN Women campaign. Which, in turn, makes me realize how much of the web search engine research knowledge I have acquired over these last months I already take for granted… but I digress.

This awareness campaign has been very successful in making people more aware of the sexism in our world – or, at the very least, of Google’s autocomplete function.

UN Women campaign ad: “Women need to be seen as equal”

Google’s autocompletion algorithms

At DH2013, the annual Digital Humanities conference, I presented a paper I co-authored with Frederic Kaplan about ongoing research at the DHLab on Google autocompletion algorithms. In this paper, we explained why autocompletions are “linguistic prosthesis”: they mediate between our thoughts and how we express these thoughts in (written) language. So do related searches, or the suggestion “Did you mean … ?” But of all these algorithmic mediations, the mediation by autocompletion algorithms acts in a particularly powerful way because it doesn’t correct us afterwards. It intervenes before we have completed formulating our thoughts in writing. Before we hit ENTER. Continue reading
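To make the “before we hit ENTER” point tangible, here is a second small sketch – again built on the unofficial suggest endpoint assumed above, so purely illustrative – that fetches suggestions for each successive prefix of a query, the way the interface does while you type.

```python
# Illustrative sketch: watch how suggestions shift as a query is typed character by character.
# Uses the same unofficial, undocumented suggest endpoint as the earlier example.
import requests


def suggestions_per_keystroke(query, lang="en"):
    """Yield (prefix, suggestions) for each successive prefix of `query`."""
    for end in range(1, len(query) + 1):
        prefix = query[:end]
        response = requests.get(
            "https://suggestqueries.google.com/complete/search",
            params={"client": "firefox", "hl": lang, "q": prefix},
            timeout=10,
        )
        response.raise_for_status()
        yield prefix, response.json()[1]


if __name__ == "__main__":
    for prefix, suggestions in suggestions_per_keystroke("women need"):
        print(f"{prefix!r:>14} -> {suggestions[:3]}")
```

Already after a few characters the algorithm proposes complete queries, i.e. it starts mediating well before the searcher has finished formulating her own.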