“Someone sent me one of those ‘conspiracy theory’ emails with an attached video. I usually laugh these off (and indeed I did with this one); however, in the video they mentioned how manipulated a GOOGLE search can be, and gave an example.
They stated that if you type in the words WHITE COUPLES and do a picture search… over 70% of the pictures shown will be of MULTI-RACIAL couples. I did this search, and this is absolutely correct. I would even estimate that it is more like 75% skewed.
Now, to give impartial balance and the benefit of the doubt, I also did a picture search for BLACK COUPLES. I was stunned! Over 90% of the pictures were of black or non-white couples. I then extended my search and typed PICTURES OF WHITE PEOPLE. There was almost the same skewed result… especially after also comparing results for PICTURES OF BLACK PEOPLE. Next I typed WHITE DRESSES and then BLACK DRESSES, and got a 100% correct image search for each colour!
I then tried WHITE SKIN and BLACK SKIN. The white skin pictures were only about 50% correct, as opposed to 90% for black skin. I tried many variations using white and black as descriptive words for “non-human” objects, and got results very similar to the accurate ones from my DRESSES search. This, to my mind, suggests that when the word WHITE is associated with people, GOOGLE deliberately skews the results.
I am not racist… but surely the GOOGLE algorithms are racially skewed against white people? An algorithm that simply picks up the word “white” on the associated page or article should, by the same logic, behave the same way with a “black” picture search. Unless, of course, there are far more racist comments on the “white” people pics than on the “black” people pics? That does not seem logical either, as it would itself suggest racism against white people. Try this experiment yourself and make up your own mind. – Mike Walsh