Google's AI Overviews sometimes acts like a lost man who won't ask for directions: It would rather confidently make a mistake than admit it doesn't know something.
We know this because folks online have noticed you can ask Google about any faux idiom — any random, nonsense saying you make up — and Google AI Overviews will often invent an explanation for it. That's not exactly surprising, as AI has shown a penchant for hallucinating answers rather than admitting it doesn't have enough data.
In the case of made-up idioms, it's kind of funny to see how Google's AI responds to idiotic sayings like "You can't lick a badger twice." On X, SEO expert Lily Ray dubbed the phenomenon "AI-splaining."
Someone on Threads noticed you can type any random sentence into Google, then add “meaning” afterwards, and you’ll get an AI explanation of a famous idiom or phrase you just made up. Here is mine
— Greg Jenner (@gregjenner.bsky.social) April 23, 2025 at 6:15 AM
Fantastic technology, glad society spent a trillion dollars on this instead of sidewalks.
— Dan Olson (@foldablehuman.bsky.social) April 21, 2025 at 12:01 AM
New game for you all: ask google what a made-up phrase means.
— Crab Man (@crabman.bsky.social) April 18, 2025 at 1:40 AM
I tested the "make up an idiom" trend, too. One phrase — "don't give me homemade ketchup and tell me it's the good stuff" — got the response "AI Overview is not available for this search." However, my next made up phrase — "you can't shake hands with an old bear" — got a response. Apparently Google's AI thinks this phrase suggests the "old bear" is an untrustworthy person.
In this instance, Google AI Overviews' penchant for making stuff up is kind of funny. In other instances — say, getting the NFL's overtime rules wrong — the mistakes are relatively harmless. And when the feature first launched, it was telling folks to eat rocks and put glue on pizza. Other AI hallucinations are less amusing. Keep in mind that Google warns users that AI Overviews can get facts wrong, yet the feature still sits at the top of many search results.
So, as the old, time-honored idiom goes: Be wary of search with AI; what you see may be a lie.