You can trick Google's AI Overviews into explaining made-up idioms

by lucky

With Big Tech pouring countless dollars and resources into AI and preaching the gospel of its utopia-producing talents, here's a reminder that the algorithms can still spew nonsense. The latest evidence: you can get Google's AI Overviews (the automated answers at the top of your search queries) to explain fictional, nonsensical idioms as if they were real.

According to Google's AI Overviews (via @gregjenner on Bluesky), "You can't lick a badger twice" means you can't trick or deceive someone a second time after they've already been tricked once.

It sounds like a logical attempt to explain the idiom, if only it weren't poppycock. Google's Gemini-powered feature assumed the question referred to an established phrase rather than absurd mumbo jumbo designed to trick it. In other words, AI hallucinations are alive and well.

Screenshot of a Google AI Overview explaining a made-up idiom. Credit: Google / Engadget

We plugged a few silly phrases of our own into Google Search and got similar results.

Google's response claimed that "you can't golf without a fish" is a riddle or play on words, suggesting you can't play golf without the necessary equipment, specifically a golf ball. Amusingly, the AI Overview added the clause that the golf ball "might be seen as a 'fish' due to its shape." Hmm.

Next up: "you can't open a peanut butter jar with two feet." According to the AI Overview, this means you can't do something that requires skill or dexterity. Again, it's a noble attempt at assigning meaning to nonexistent content without pausing to fact-check whether it exists.

Then there's "You can't marry pizza," a playful way of expressing the concept of marriage as a commitment between two people, not a food item. (Naturally.) "Rope won't pull a dead fish" means that something can't be achieved through force or effort alone; it requires cooperation or a natural progression. (Of course!) "Eat the biggest chalupa first" is a playful way of saying that when facing a big challenge or a large meal, you should start with the most important part or item first. (Sage advice.)

Screenshot of a Google AI Overview explaining a (nonexistent) idiom. Credit: Google / Engadget

This is hardly the first example of AI hallucinations that, if not fact-checked by the user, could lead to misinformation or real-life consequences. Just ask the ChatGPT lawyers, Steven Schwartz and Peter LoDuca, who were fined $5,000 in 2023 for using ChatGPT to research a brief in a client's litigation. The AI chatbot hallucinated nonexistent cases cited by the pair, which the opposing attorneys could not locate.

The pair's response to the judge's discipline? "We made a good faith mistake in failing to believe that a piece of technology could be making up cases out of whole cloth."

This article originally appeared on Engadget.
