How Google tells you what you want to hear

Some experts say that Google is just parroting your own beliefs and feeding them back to you


"We're at the mercy of Google." Undecided voters in the US who turn to Google for help could see dramatically different worldviews, even when asked the exact same question.

Type in "Is Kamala Harris a good Democratic candidate" and Google will present you with a pink image.

The search results change constantly, but last week the first link was a Pew Research Center survey showing that Harris is energising Democrats.

Next comes an Associated Press article titled "Most Democrats Think Kamala Harris Would Make a Good President", and the links that follow are much the same.

But if you've been hearing negative things about Kamala Harris, you might instead question whether she's a "bad" Democratic candidate.

It's essentially the same question, but Google's results in that case are much more pessimistic.

"It's easy to forget how bad Kamala Harris is," the text claims Rizon magazine in the first place of the search.

Then US News & World Report offers the lukewarm view that Harris "isn't the worst thing to happen to America", but every result after that is critical.

An Al Jazeera article explains "Why I won't vote for Kamala Harris", followed by an endless Reddit thread about why she's no good.

You can see the same dichotomy with questions about Donald Trump, conspiracy theories, controversial political debates, and even medical information.

Some experts say that Google just parrots your own beliefs and feeds them back to you.

It could be exacerbating your own biases and deepening the social divide along the way.

"We're at the mercy of Google when it comes to what information we can find," says Warrol Keihan, assistant professor of information systems at the University of South Florida, in the US.

Bias machine

"Google's whole mission is to give people the information they're looking for, but sometimes the information people think they want isn't actually the most useful," says Sara Presh, director of digital marketing at Dragon Metrix, a platform that helps companies align their pages for better visibility on Google with using methods known as "search engine optimization", or SEO.

It's a job that requires meticulously combing through Google results, and a few years ago Presch noticed a problem.

"I started watching how Google positioned itself on hotly debated topics," she says.

"In many cases, the results were shocking."

Some of the most striking examples have dealt with how Google handles certain health-related issues.

Google often pulls information from the web and displays it at the top of search results to provide a quick answer, in what it calls a "Featured Snippet".

Presch searched for "the link between coffee and hypertension".

The snippet quoted an article from the Mayo Clinic, highlighting the words "Caffeine can cause a brief but dramatic spike in your blood pressure."

But when she searched for "no link between coffee and hypertension", the snippet pulled a contradictory sentence from the same Mayo Clinic article: "Caffeine has no long-term effects on blood pressure and is not associated with a higher risk of high blood pressure."

The same thing happened when Presch searched "is ADHD caused by sugar" and "ADHD is not caused by sugar".

Google presented an excerpt representing both sides of the issue, again taken from the same article.

(In reality, there is little evidence that sugar affects ADHD symptoms, and it certainly does not cause the disorder.)
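
To see how opposite queries can surface opposite sentences from the same page, here is a minimal sketch of query-biased snippet selection in Python. It is an illustration only, not Google's actual pipeline: the article sentences and queries below are invented, each sentence is scored simply by how many words it shares with the query, and the best match is surfaced as the snippet.

```python
import re

def words(text: str) -> set[str]:
    """Lowercase a text and split it into a set of words."""
    return set(re.findall(r"\w+", text.lower()))

def score(sentence: str, query: str) -> int:
    """Count how many query words also appear in the sentence."""
    return len(words(sentence) & words(query))

# Two sentences standing in for the same hypothetical health article.
article = [
    "Caffeine can cause a brief but dramatic spike in your blood pressure.",
    "Caffeine has no link to hypertension and no long-term effect on blood pressure.",
]

def pick_snippet(query: str) -> str:
    """Return the article sentence that best matches the query's wording."""
    return max(article, key=lambda s: score(s, query))

print(pick_snippet("caffeine blood pressure spike"))
# -> the sentence saying caffeine raises blood pressure
print(pick_snippet("no link between caffeine and hypertension"))
# -> the sentence saying there is no link
```

Both answers come from the same source; the phrasing of the question alone decides which one the searcher sees.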


She encountered the same problem with political issues.

Ask if "the UK's tax system is fair" and Google will pull up a quote from Conservative MP Nigel Huddleston, arguing that it really is.

Ask if "the British tax system is unfair" and Google's Featured Paragraph will explain how British taxes benefit the rich and deepen inequality.

"What Google does is it pulls out pieces of text based on what people are looking for and gives them what they want to read," says Presh.

"He's one big bias machine."

For its part, Google claims to provide users with unbiased results that simply connect people with the kind of information they're looking for.

"As a search engine, Google wants to surface high-quality results that are relevant to the search you entered," says a Google spokesperson.

"We provide open access to a wide range of viewpoints from across the web and give people useful tools to evaluate the information and sources they find."

When the filter bubble bursts


According to one estimate, Google handles some 6.3 million queries every minute, which adds up to more than nine billion searches per day.

The vast majority of internet traffic starts with a Google search, and people rarely click on anything beyond the first five links, let alone go to another page of search results.

One study that tracked users' eye movements showed that people often don't look past the main result.

The system that arranges links in Google search has enormous power over our perception of the world.

According to Google, the company handles this responsibility well.

"Independent academic research has rejected the idea that Google search is pushing people into a filter bubble," says a spokesperson.

The issue of so-called "filter bubbles" and "echo chambers" on the Internet is a hot topic, although some research has questioned whether the effects of online echo chambers have been exaggerated.

But Kayhan, who studies how search engines affect confirmation bias, the natural impulse to seek out information that confirms our beliefs, says there's no doubt that our beliefs, and even our political identities, are being altered by the systems that control what we see online.

"We are dramatically affected by the way we get information," he says.

A Google spokesperson points to a 2023 study which concluded that people's exposure to partisan news has more to do with what they choose to click on than with what Google serves them in the first place.

In one sense, this is how confirmation bias works in general: people look for evidence that supports their views and ignore evidence that challenges them.

But even in that study, the researchers said their findings do not imply that Google's algorithms are unproblematic.

"In some cases, our participants were exposed to extremely partisan and unreliable news in Google searches," the researchers said, "and previous work suggests that even a limited number of such exposures can have significant negative consequences."

Despite this, you can choose to reject the information that keeps you trapped in your own bubble, "but there is only a select set of messages put in front of you to choose from in the first place," says Silvia Knobloch-Westerwick, a professor of mediated communication at the Technische Universität Berlin, in Germany.

"Algorithms play a significant role in this problem."

Google did not respond to the BBC's question about whether there is a person or team specifically tasked with dealing with the issue of confirmation bias.

"We don't understand documents - we argue"


"In my opinion, this whole problem stems from the technical limitations of search engines and the fact that people don't understand what those limitations are," says Mark Williams-Cook, founder of AlsoAsked, another search engine optimization tool that analyzes Google results.

A recent US antitrust case against Google brought to light internal documents in which employees discuss some of the techniques the search engine uses to answer your questions.

"We don't understand documents - we pretend," wrote an engineer in a slideshow used during a 2016 presentation at this company.

"A billion times a day, people ask us to find documents relevant to their query... Beyond some of the most basic things, we barely look at those documents. We look at people. If the document gets positive reactions, we consider it good. If the reaction is negative, it is probably bad. Horribly simplistic, but that's the source of Google's magic."

"That's how we serve the next person, continue the induction and maintain the illusion that we understand."

In other words, Google tracks which links people click on when they enter a given search term.

When people seem satisfied with a certain type of information, Google is more likely to promote that type of search result for similar queries in the future.
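
A toy version of that loop, sketched below in Python, shows how click data alone can entrench whatever users already favour. This is an assumption-laden illustration of the general mechanism, not Google's real ranking system; the result names are invented.

```python
from collections import defaultdict

# Accumulated click counts per result; a click is read as a positive signal.
click_score: defaultdict[str, float] = defaultdict(float)

def record_click(url: str) -> None:
    """Treat a click as evidence that the result was 'good'."""
    click_score[url] += 1.0

def rank(results: list[str]) -> list[str]:
    """Order results by accumulated click score, highest first."""
    return sorted(results, key=lambda url: click_score[url], reverse=True)

# Two hypothetical pages answering the same query from opposite angles.
results = ["critical-take.example", "positive-take.example"]

# If users who already hold a negative view keep clicking the critical link...
for _ in range(5):
    record_click("critical-take.example")

# ...the ranker learns to show it first to everyone asking similar questions.
print(rank(results))  # ['critical-take.example', 'positive-take.example']
```

Nothing in the loop checks whether the promoted page is accurate; it only measures whether people clicked.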

A Google spokesperson says those documents are out of date, and that the systems used to interpret queries and web pages have become far more sophisticated.

"That presentation is from 2016, so you have to take it with a grain of salt, but the core concept is true. Google builds models to try to predict what people like, but the problem is that it creates a feedback loop," says Williams-Cook.

If confirmation bias leads people to click on links that reinforce their beliefs, it can teach Google to show people links that lead to confirmation bias itself.

"It's like saying you're going to let a child choose their diet based on what they like. "Eventually you end up with fast food," he says.

Williams-Cook also worries that people might not understand that when you ask something like "is Trump a good candidate," Google might not interpret that as a question.

Instead, it often just pulls up documents that relate to keywords like "Trump" or "good candidate".

That gives people wrong expectations about what they'll get in a search and can lead people to misinterpret what those search results mean, he says.
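
A simplified keyword-matching sketch in Python makes the point; it is an invented illustration of bag-of-words retrieval, not how Google actually parses queries. Treated purely as keywords, the question matches pages on both sides of the argument equally well.

```python
import re

def keyword_overlap(query: str, document: str) -> int:
    """Score a document purely by how many words it shares with the query."""
    q = set(re.findall(r"\w+", query.lower()))
    d = set(re.findall(r"\w+", document.lower()))
    return len(q & d)

# Two invented pages taking opposite positions.
docs = {
    "pro-page": "Why Trump is a good candidate for president",
    "anti-page": "Trump is not a good candidate, critics argue",
}

query = "is Trump a good candidate"
for name, text in docs.items():
    print(name, keyword_overlap(query, text))
# pro-page 5
# anti-page 5
```

Neither score reflects whether the page actually answers the yes/no question the user thought they were asking.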

If users had a better understanding of the search engine's limitations, Williams-Cook believes, they would be able to look more critically at the content they are shown.

"Google should do more to inform the public about how search actually works." "But I don't believe they will, because to do that you have to admit some imperfections about what doesn't work," he says.

Google is open about the fact that search is never a solved problem, a company spokesperson says, and the company is tirelessly trying to solve deep technical challenges in the field as they arise.

Google also highlights features it offers to help users better evaluate information, such as the "About This Result" tool and announcements that let the user know when results on a topic related to current news are changing rapidly.

Philosophical problems

A Google spokesperson says it's easy to find results that reflect a range of viewpoints from sources around the web, if you want to.

The spokesperson claims this applies even to some of the results Presch singled out.

Scroll down past questions like "is Kamala Harris a good Democratic candidate" and you'll find links criticizing her.

The same goes for "is the UK tax system fair" - you'll find search results saying it isn't.

As for the query about "the link between coffee and hypertension," a Google spokesperson says the question is complicated, but that the search engine pulls authoritative sources that address the nuanced differences.

Of course, this relies on people going beyond the first few results, and the further down the results page you go, the less likely users are to click those links.

In the case of coffee and hypertension and the UK's tax system, Google also summarizes the results and provides its own answer in a prominent position via Featured Snippets, which reduces the chance that people will follow links further down the page.

For a long time, observers have described how Google is transforming from a search engine into an "answer machine," where the company simply gives you information, instead of directing you to external sources.

The clearest example is the introduction of AI Overviews, a feature where Google uses artificial intelligence to answer searches for you, instead of pointing you to links.

As the company puts it, you can now "let Google do the searching for you".

"In the past, Google would show you something that someone else had written, but now it's writing the answer itself," says Williams-Cook.

"It brings all these problems together, because Google now has only one chance to get the answer right. It's a difficult move."

But even if Google has the technical ability to address all of these problems, it's not necessarily clear when or how it should intervene.

Perhaps you want information that supports a particular belief, and if so, Google provides a valuable service by providing it to you.

Many people don't like the idea of one of the richest and most powerful companies in the world making decisions about what's true, Kayhan says.

"Is it Google's job to fix it? Can we trust Google to solve it? And is it even solvable? These are all difficult questions, and I don't believe anyone has the answer," he says.

"One thing I can tell you for sure is that I don't think they're doing enough on it."

* Thomas Germain is a senior technology reporter for the BBC. He has been writing about artificial intelligence, privacy and the farthest reaches of internet culture for most of the past decade. You can find him on X and TikTok @thomasgermain.
