Jews are no longer ‘evil’ but Muslims are still ‘bad’, according to Google


Cii Radio| Ayesha Ismail| 06 December 2016| 06 Rabi ul Awal 1438

The search engine has now removed offensive phrases from its autocomplete feature following complaints it was leading people to controversial sites.

Up until now, if you googled ‘Are Jews..’, you’d get a bunch of choices to click on, including: ‘Are Jews a race?’, ‘Are Jews white?’ and finally, ‘Are Jews evil?’.


If you clicked on the latter, you’d be shocked to find that there were dozens of websites answering ‘Yes’ to your question, with the top link being ‘Top 10 Major Reasons Why People Hate Jews.’

Similarly, if you searched for ‘are women evil..?’, you’d get a highlighted box stating that ‘every woman has some degree of prostitute in her’ from a website called ‘Shedding of the ego’.

Journalist Carole Cadwalladr, who highlighted the issue, wrote in The Observer: ‘[Google] strives to give you the best, most relevant results.

‘And in this instance the third-best, most relevant result to the search query “are Jews evil” is a link to an article from a neo-Nazi website.

‘I feel like I’ve fallen down a wormhole, entered some parallel universe where black is white, and good is bad.’

Following the claims, it appears Google has now removed the phrases from its suggestions (at least when we tried to search them). Phew.

However, it’s a different story when you try googling ‘Are Muslims..’, as the first suggestion that appears is ‘Are Muslims bad?’.


When you click on it, you’re immediately given a whole page of links to reasons why Muslims are ‘bad’, with the fourth link being ‘Most Muslims Are Bad People’ and the fifth, ‘Ten Reasons Why Islam is Not A Religion Of Peace’.

Of course, these are just suggestions, generated by an algorithm without human involvement, yet it is surprising that a search engine handling 5.5bn searches a day allowed such suggestions to appear in the first place.

According to Google’s Search Help, the autocomplete feature is ‘based on objective factors, including how often others have searched for a word’ and is ‘designed to reflect the range of info on the web.’

‘Because of this range, the search terms you see might sometimes seem strange or surprising.’
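To see why high search volume alone can surface such phrases, the mechanism Google describes can be reduced to a very rough sketch: rank every logged query matching the typed prefix by how often it has been searched. This is an illustrative assumption, not Google’s actual system, and the query log below is invented.

```python
# Minimal sketch (an assumption, not Google's real pipeline): suggest
# completions for a prefix, ranked purely by past search frequency.
from collections import Counter

# Hypothetical query log with made-up frequencies.
query_counts = Counter({
    "how to tie a tie": 300,
    "how to train a dog": 150,
    "how to toast bread": 40,
})

def autocomplete(prefix, counts, k=3):
    """Return up to k logged queries starting with prefix, most-searched first."""
    matches = [q for q in counts if q.startswith(prefix)]
    return sorted(matches, key=lambda q: -counts[q])[:k]

print(autocomplete("how to t", query_counts))
```

Under a popularity-only ranking like this, any frequently searched phrase becomes a suggestion, however offensive, unless a separate filter removes it — which is the gap the article describes.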

The approach has backfired before, as it did last year when Google removed an anti-Semitic answer to the search term ‘Who runs Hollywood?’ after it showed up with the word ‘Jews’.

Ms Cadwalladr said: ‘Google is where the world is going for its information and the information it is giving is absolutely toxic.

‘There’s an absolutely massive problem here that it is refusing to acknowledge let alone fix.’

A Google spokesperson told Metro: ‘Our search results are a reflection of the content across the web. This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what search results appear for a given query. These results don’t reflect Google’s own opinions or beliefs – as a company, we strongly value a diversity of perspectives, ideas and cultures.

‘Autocomplete predictions are algorithmically generated based on users’ search activity and interests. Users search for such a wide range of material on the web — 15% of searches we see every day are new.

‘Because of this, terms that appear in Autocomplete may be unexpected or unpleasant. We do our best to prevent offensive terms, like porn and hate speech, from appearing, but we acknowledge that autocomplete isn’t an exact science and we’re always working to improve our algorithms.’

Source – Metro