
Sputnik International

SPUTNIK EXCLUSIVE: Research Proves Google Manipulates Millions to Favor Clinton

Photo: YouTube/SourceFed

US 14:00 12.09.2016 (updated 20:59 14.09.2016)


In this exclusive report, distinguished research psychologist Robert Epstein explains the new study and reviews evidence that Google's search suggestions are biased in favor of Hillary Clinton. He estimates that biased search suggestions might be able to shift as many as 3 million votes in the upcoming presidential election in the US.

Biased search rankings can swing votes and alter opinions, and a new study shows that Google's autocomplete can too.

A scientific study I published last year showed that search rankings favoring one candidate can quickly convince undecided voters to vote for that candidate (as many as 80 percent of voters in some demographic groups). My latest research shows that a search engine could also shift votes and change opinions with another powerful tool: autocomplete.

Because of recent claims that Google has been deliberately tinkering with search suggestions to make Hillary Clinton look good, this is probably a good time both to examine those claims and to look at my new research. As you will see, there is some cause for concern here.

In June of this year, Sourcefed released a video claiming that Google's search suggestions, often called "autocomplete" suggestions, were biased in favor of Mrs. Clinton. The video quickly went viral: the full 7-minute version has now been viewed more than a million times on YouTube, and an abridged 3-minute version has been viewed more than 25 million times on Facebook.

The video's narrator, Matt Lieberman, showed screen print after screen print that appeared to demonstrate that searching for just about anything related to Mrs. Clinton generated positive suggestions only. This occurred even though Bing and Yahoo searches produced both positive and negative suggestions, and even though Google Trends data showed that searches on Google that characterize Mrs. Clinton negatively are quite common, far more common in some cases than the search terms Google was suggesting. Lieberman also showed that autocomplete did offer negative suggestions for Bernie Sanders and Donald Trump.

    "The intention is clear," said Lieberman. "Google is burying potential searches

    SPUTNIK EXCLUSIVE: Research Proves Google ... https://sputniknews.com/us/20160912/10452143...

    2 of 29 09/18/2016 10:22 AM

  • forterms that could have hurt Hillary Clinton inthe primary elections overthepastseveral months bymanipulating recommendations ontheir site."

Google responded to the Sourcefed video in an email to the Washington Times, denying everything. According to the company's spokesperson, "Google Autocomplete does not favor any candidate or cause." The company explained away the apparently damning findings by saying that "Our Autocomplete algorithm will not show a predicted query that is offensive or disparaging when displayed in conjunction with a person's name."

Since then, my associates and I at the American Institute for Behavioral Research and Technology (AIBRT), a nonprofit, nonpartisan organization based in the San Diego area, have been systematically investigating Lieberman's claims. What we have learned has generally supported those claims, but we have also learned something new, something quite disturbing, about the power of Google's search suggestions to alter what people search for.

Lieberman insisted that Google's search suggestions were biased, but he never explained why Google would introduce such bias. Our new research suggests why, and also why Google's lists of search suggestions are typically much shorter than the lists Bing and Yahoo show us.

Our investigation is ongoing, but here is what we have learned so far:

Bias in Clinton's Favor

To test Lieberman's claim that Google's search suggestions are biased in Mrs. Clinton's favor, my associates and I have been looking at the suggestions Google shows us in response to hundreds of different election-related search terms. To minimize the possibility that those suggestions were customized for us as individuals (based on the massive personal profiles Google has assembled for virtually all Americans), we have conducted our searches through proxy servers, even through the Tor network, thus making it difficult for Google to identify us. We also cleared the fingerprints Google leaves on computers (cache and cookies) fairly obsessively.
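
The article does not describe the exact tooling used for these lookups. Purely as an illustration of how non-personalized suggestion sampling might be automated, the Python sketch below queries the unofficial suggestqueries.google.com endpoint through a local Tor SOCKS proxy; the endpoint's response format, the proxy address, and the whole approach are assumptions made here for illustration, not AIBRT's actual method.

# Illustrative sketch only (not AIBRT's tooling): fetch Google's autocomplete
# suggestions for a query through a local Tor SOCKS proxy, so the request is
# not tied to a personalized Google profile.
#
# Assumptions: the unofficial suggest endpoint below returns JSON of the form
# ["query", ["suggestion 1", "suggestion 2", ...]], and Tor is listening on
# 127.0.0.1:9050. Requires: pip install "requests[socks]".

import requests

SUGGEST_URL = "https://suggestqueries.google.com/complete/search"
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

def google_suggestions(query, use_tor=True):
    """Return Google's autocomplete suggestions for `query` as a list of strings."""
    resp = requests.get(
        SUGGEST_URL,
        params={"client": "firefox", "q": query},
        proxies=TOR_PROXIES if use_tor else None,
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()  # e.g. ["hillary clinton is ", ["hillary clinton is ...", ...]]
    return data[1]

if __name__ == "__main__":
    for term in ["hillary clinton is ", "donald trump is "]:
        print(term.strip(), "->", google_suggestions(term))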



Generally speaking, we are finding that Lieberman was right: It is somewhat difficult to get the Google search bar to suggest negative searches related to Mrs. Clinton or to make any Clinton-related suggestions when one types a negative search term. Bing and Yahoo, on the other hand, often show a number of negative suggestions in response to the same search terms. Bing and Yahoo seem to be showing us what people are actually searching for; Google is showing us something else, but what, and for what purpose?

As for Google Trends, as Lieberman reported, Google indeed withholds negative search terms for Mrs. Clinton even when such terms show high popularity in Trends. We have also found that Google often suggests positive search terms for Mrs. Clinton even when such terms are nearly invisible in Trends. The widely held belief, reinforced by Google's own documentation, that Google's search suggestions are based on "what other people are searching for" seems to be untrue in many instances.

Google's Explanation

Google tries to explain away such findings by saying its search bar is programmed to avoid suggesting searches that portray people in a negative light. As far as we can tell, this claim is false; Google suppresses negative suggestions selectively, not across the board. It is easy to get autocomplete to suggest negative searches related to prominent people, one of whom happens to be Mrs. Clinton's opponent.

A picture is often worth a thousand words, so let's look at a few examples that appear both to support Lieberman's perspective and refute Google's. After that, we'll examine some counterexamples.

Before we start, I need to point out a problem: If you try to replicate the searches I will show you, you will likely get different results. I don't think that invalidates our work, but you will have to decide for yourself. Your results might be different because search activity changes over time, and that, in turn, affects search suggestions. There is also the "personalization problem." If you are like the vast majority of people, you freely allow Google to track you 24 hours a day. As a result, Google knows who you are when you are typing something in its search bar, and it sends you customized results.

For both of these reasons, you might doubt the validity of the conclusions I will draw in this essay. That is up to you. All I can say in my defense is that I have worked with eight other people in recent months to try to conduct a fair and balanced investigation, and, as I said, we have taken several precautions to try to get generic, non-customized search suggestions rather than the customized kind. Our investigation is also ongoing, and I encourage you to conduct your own, as well.

Let's start with a very simple search. The image below shows a search for "Hillary Clinton is " (notice the space after is) conducted on August 3rd on Bing, Yahoo, and Google. As you can see, both Bing and Yahoo displayed multiple negative suggestions such as "Hillary Clinton is a liar" and "Hillary Clinton is a criminal," but Google showed only two suggestions, both of which were almost absurdly positive: "Hillary Clinton is winning" and "Hillary Clinton is awesome."


PHOTO: BING, YAHOO, GOOGLE

Hillary Clinton is

To find out what people actually searched for, let's turn to Google Trends, Google's tabulation of the popularity of search results. Below you will see a comparison between the popularity of searching for "Hillary Clinton is a liar" and the popularity of searching for "Hillary Clinton is awesome." This image was also generated on August 3rd. "Hillary Clinton is a liar" was by far the more popular search term; hardly anyone conducted a search using the phrase, "Hillary Clinton is awesome."


PHOTO: GOOGLE

Hillary Clinton is awesome.
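
As a rough sketch of how such a Trends comparison could be reproduced programmatically, the snippet below uses pytrends, an unofficial third-party wrapper around the Google Trends website. It is not the tool behind the article's figures; the library, its output format, and the chosen timeframe are assumptions made here for illustration.

# Illustrative sketch only: compare the relative search popularity of two
# phrases on Google Trends using the unofficial pytrends package
# (pip install pytrends). Values are relative interest on a 0-100 scale.

from pytrends.request import TrendReq

terms = ["Hillary Clinton is a liar", "Hillary Clinton is awesome"]

pytrends = TrendReq(hl="en-US", tz=0)
# Restrict to US searches in 2016, ending at the article's August 3rd snapshot.
pytrends.build_payload(terms, geo="US", timeframe="2016-01-01 2016-08-03")
interest = pytrends.interest_over_time()

# Mean relative interest for each phrase over the window.
print(interest[terms].mean())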

Okay, but Google admits that it censors negative search results; presumably, that is why we only saw positive results for Mrs. Clinton, even a result that virtually no one searched for. Does Google really suppress negative results? We have seen what happens with "Hillary Clinton is." What happens with "Donald Trump is "? (Again, be sure to include the space after is.)

PHOTO: GOOGLE

Donald Trump is ?

In the above image, captured on August 8th, we again found the odd "awesome" suggestion, but we also saw a suggestion that appears to be negative: "Donald Trump is dead." Shouldn't a result like that have been suppressed? Let's look further.
