Google has come under heavy fire in the last couple of years because its search engine and apps have categorized or labeled photos in ways that some users find racist. A 2015 example shows the Google Photos app labeling two Black people as ‘gorillas’.

At the time, Google’s Yonatan Zunger acknowledged:

“This is 100% not OK. It was high on my list of bugs you ‘never’ want to see happen.” Mr Zunger said Google had already taken steps to avoid others experiencing a similar mistake. He added it was “also working on longer-term fixes around both linguistics – words to be careful about in photos of people – and image recognition itself – eg better recognition of dark-skinned faces”.

A 2016 Google image query example posted on THEVGC displays mug shots of Black men when a common Black male name (DeShawn) is entered. They compared this to searching with the name ‘Scott’ (a common White male name). Mug shot images were not returned at anywhere near the same rate as with ‘DeShawn’, even though logic and search results tell us there are plenty of White male criminals named Scott.

The writer’s findings state:

“Does this mean that Google is being racist? Not exactly. To be fair to Google (or any other search engine company), image results in a search engine are based on a number of complex factors and the results change every day. The images that are shown are based on Search Engine Optimization results, which combines factors about site content, number of visits, links to the site, and so forth. Google has control over the algorithm that creates the results, but they have little or no control over the people who run the websites and tweak them to climb search engine results lists.”
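To make the quoted explanation concrete, here is a toy sketch of how a ranking score might combine signals like those. This is my own illustration, not Google’s actual algorithm; the signal names and weights are invented:

```python
# Toy illustration of SEO-style ranking -- NOT Google's real algorithm.
# Signal names and weights are invented for this example.

def rank_score(page):
    """Combine a few hypothetical relevance signals into one score."""
    return (
        3.0 * page["content_match"]    # how well page text matches the query
        + 2.0 * page["inbound_links"]  # links from other sites (normalized)
        + 1.0 * page["visits"]         # traffic signal (normalized)
    )

pages = [
    {"url": "a.example", "content_match": 0.9, "inbound_links": 0.2, "visits": 0.1},
    {"url": "b.example", "content_match": 0.4, "inbound_links": 0.9, "visits": 0.8},
]

# Site owners can tweak these signals ("SEO"), which is why results can
# shift daily even if the scoring formula itself never changes.
for page in sorted(pages, key=rank_score, reverse=True):
    print(page["url"], round(rank_score(page), 2))
```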

Another 2016 example that went viral featured entering ‘three black teenagers’ juxtaposed with ‘three white teenagers’ in Google image search. The results returned were faces of alleged or convicted juvenile prisoners for Black, and wholesome, smiling, cheerful faces for White.

From ‘unprofessional hair’ to Michelle Obama, Google image search results are very suspect. In Google’s defense, the company does have a logical, apologetic explanation. In 2016 Google stated:

“we’re merely reflecting back the biases that exist in society and that show up in what and how people search online…image search results are a reflection of what’s on the Web, including the frequency with which certain types of images appear and how they are described.

“This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what image search results appear for a given query… These results don’t reflect Google’s own opinions or beliefs — as a company, we strongly value a diversity of perspectives, ideas and cultures.”

OK Google, can you explain to me what a cum shot has to do with a young Muslim girl?

How did I come across this, you’re wondering? Like many millions of other people online, I search Google for pornographic videos and images. You guessed it, ‘Cum Shots’ was one of my recent searches (don’t judge me). I was completely surprised when a selfie of a young Muslim girl was displayed on Google’s main search results page.

Confused, I thought of the time LGBT activist Dan Savage, in protest against far-right politician Rick Santorum, created a site to manipulate Google’s algorithm so that ‘anal froth’ would appear in search results for Rick Santorum. Could the same type of protest, hacking, or doxing be the reason why an innocent-looking young Muslim girl is now associated with cum shots on the largest search engine in the world?

From my bit of research, this may have something to do with Flickr Hive Mind, a Flickr (photo) data-mining site, which links to a Flickr group called ‘Hijab Cum’. Search engines like Google index parts of Flickr Hive Mind, and those photos then show up in image search queries. Nonetheless, how does this particular image show up on Google’s main search results page? My search query had no mention of hijabs.
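Here is a minimal sketch of why that can happen, assuming ordinary keyword (inverted-index) indexing of page titles and tags; the image names and tags below are hypothetical:

```python
# Toy inverted index, assuming standard keyword indexing of titles/tags.
# Image names and tags are hypothetical examples.
from collections import defaultdict

index = defaultdict(set)

def index_image(image_id, tags):
    """File the image under every word it is tagged with."""
    for tag in tags:
        index[tag].add(image_id)

# An image posted to a page titled 'Hijab Cum' gets indexed under BOTH words.
index_image("selfie_123", ["hijab", "cum"])
index_image("explicit_456", ["cum", "shot"])

def search(query):
    """Return every image indexed under any word in the query."""
    results = set()
    for word in query.lower().split():
        results |= index.get(word, set())
    return results

# The query never mentions 'hijab', yet the selfie still matches via its
# other tag -- which is how an unrelated image can surface for this search.
print(search("cum shot"))  # {'selfie_123', 'explicit_456'}
```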

If you click on the actual ‘Images’ tab, the query results are completely different and align with the text (cum shot/s) typed in the search field. Because these images are explicit and pornographic in nature, it seems they would have more hits or views. Regardless, I wasn’t expecting to see an actual photo of a ‘cum shot’ on the main search results page, only hyperlinks, but I absolutely wasn’t expecting an unrelated, unfitting selfie of a child.

I can only assume millions of people have seen this picture, but I wonder if anyone else finds it odd or has attempted to notify Google. Either way, I notified Google via a feedback message for the image search result. Under the “What is wrong with this?” heading, I entered, “Not appropriate for search query result/s.”

I’m not sure if it will change anything, but I feel bad for the girl in the pic and find the query result photo a bit creepy and disturbing.