Recently, Google started rolling out a feature that shows AI-generated answers when you search for something. There doesn't seem to be a way to disable this.
I've searched for how to disable it, but the solutions don't work. It seems that besides the AI-generated overviews shown above the results, Google may have some OTHER AI-generated answers, and the solutions only apply to those.
In particular, disabling AI overviews from Google Labs doesn't actually disable them.
This is very frustrating because I absolutely hate this feature. It's very obvious that this is a sleazy feature that should NEVER be trusted.
When Google provides search results in a typical SERP list, it shows a snippet that may even answer your query fully, but it also includes the name of the website, so you at least have an idea of where the answer came from.
When Google shows its "AI overview," even though it seems able to somehow attribute sources for the answers, it doesn't immediately show where those claims come from, nor is it obvious how claims from multiple websites are merged into one single answer.
To see the source(s) of an answer, you have to click on the answer.
This may sound innocuous, but hiding things behind clicks is a deliberate design choice, one that, surely, was thoroughly reviewed by developers at Google before being released to millions of users worldwide.
Why are the names of the websites in the SERP list not hidden behind a click, but in the AI overview they are? Why can you see that an article from Wikipedia comes from Wikipedia, but an AI answer is essentially "trust the AI" at first glance? What if the AI is quoting a terrible source, or even a parody website, that you would immediately disregard if you knew what the source was?
Observe that the "AI overview" displays a list of MULTIPLE answers. You aren't going to click every single one of them, but I assume if any of these answers came from a joke website, you wouldn't trust the rest of them either, would you?
The fact that the source is hidden from you is very telling, worrying, and disappointing.
The only way I could trust something like this is if it quoted, VERBATIM, the source, using double quotes. I want the AI to say something like:
"Albert Einstein was a [...] theoretical physicist" (Wikipedia) that "won the Nobel Prize for Physics in 1921" (Encyclopædia Britannica).
This is what you're supposed to do in school. This is software written by engineers and computer scientists. I think they know how they're supposed to do it, and why. Anything less than this, or rather, trying to be "smarter" than this, is completely unacceptable and untrustworthy.
This isn't even bias against AI. This is just basic academic rigor.
If someone can't quote the source, they're making it up. Might as well be hallucinating. It doesn't matter if it's AI, a researcher, or even myself. If I can't quote something, it didn't happen.
You either write down what you found, or you quote someone who did. AI isn't writing down anything, so I expect it to quote. I wouldn't waste my time verifying random claims on the Internet. I'm not going to waste my time verifying that the AI isn't making things up.
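To make concrete how little I'm asking for, here is a rough sketch, my own illustration with placeholder snippets and sources, and nothing to do with how Google actually builds these overviews, of an answer assembled purely from verbatim quotes with their sources kept visible:

```python
# A sketch of the only kind of AI answer I would trust: verbatim snippets,
# each one labelled with the website it came from. The snippets and sources
# below are placeholders typed in by hand, not output from any real API.

snippets = [
    ("Albert Einstein was a [...] theoretical physicist", "Wikipedia"),
    ("won the Nobel Prize for Physics in 1921", "Encyclopædia Britannica"),
]

def render_answer(snippets, joiner=" that "):
    """Join verbatim quotes into one sentence, keeping each source visible."""
    return joiner.join(f'"{quote}" ({source})' for quote, source in snippets)

print(render_answer(snippets))
# "Albert Einstein was a [...] theoretical physicist" (Wikipedia) that
# "won the Nobel Prize for Physics in 1921" (Encyclopædia Britannica)
```

The point is not that this particular function is what Google should ship; the point is that keeping a quote glued to its source is the easy part.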
Solution
For now, the solution seems to be using a different search engine.
- See Web Search Engines that have their Own Index for a useful list of alternatives to Google.
I recommend Bing. Although it has an AI chat that is enabled by default and appears on top of the search results as well, you can disable it. That's right, Bing is better than Google now, and this isn't because Bing got better, it's because Google got worse! (Amusingly, you could say the same about Linux and Windows.)
I recommend against setting Bing's homepage to be your default new tab page, however, because it's full of [redacted].