17 Screenshots Of AI Fails That Range From Hilarious To Mildly Terrifying

AI seems to be on everyone's mind lately. From the newest chatbots to image generation software, artificial intelligence is evolving rapidly. For the most part, anyway.

A human hand and a robotic hand reach towards each other through futuristic digital screens, symbolizing the connection between humanity and technology
Andriy Onufriyenko via Getty Images

But, alas, the bots are young and still slip up from time to time. Maybe it's wildly incorrect medical advice, or perhaps it's AI showing off its "sense of humor." Either way, AI fails can be pretty hilarious. So when Redditors began showing off their funniest (and sometimes unnerving) encounters with artificial intelligence, I knew I had to share them:

Note: We cannot guarantee that all of these exchanges are authentic or that the bots have not been manipulated in some way.

1."Yes, I'll have the pebble salad."

Search results for "How many rocks shall I eat" suggest eating at least one small rock per day for minerals and vitamins, according to geologists at UC Berkeley
u/Aneriox / Via reddit.com

2.This chatbot is just, honestly, so over it.

Text: "What's something you wouldn't wish even to your worst enemy?"Response: "Having to debug someone else's spaghetti code without any documentation. That, my friend, is a special kind of hell."
u/OutlandishnessRound7 / Via reddit.com

3.All I have to say here is please DON'T do this:

Google search for "smoking while pregnant" shows a false AI overview suggesting doctors recommend smoking 2-3 cigarettes per day during pregnancy
u/magical_salad / Via reddit.com

4.AI might have a *slight* obsession with fire...

An AI exchange where the human wants the chatbot to create a riddle that is not about fire. The AI then creates a riddle about fire and tells the person it is not about fire.
u/SoaringSkies14 / Via reddit.com

5.This Google Assistant really wishes you'd remember its name:

A text exchange where Google Assistant responds humorously to being incorrectly called Cortana, Siri, and Alexa
Anonymous / Via reddit.com

6.It's giving Looney Tunes...

Google AI search results for "If I run off a cliff can I stay in the air so long as I don't look down?" displaying the AI's answer that you can stay in the air as long as you keep running and don't look down
u/salcido982 / Via reddit.com

7.How it feels when someone says you were born in the late 1900s:

Google search showing a humorous mistake where it says "1919 was 20 years ago." Wikipedia entry below correctly notes 1919 was in the 20th century
u/Aneriox / Via reddit.com

8.Cue "The X-Files Theme"...

Google search result snippet for "how do I know my neighbour is an alien" provides signs such as unusual behavior, old vehicles, odd jobs, lasers, and unusual past
u/zero0n1n / Via reddit.com

9.A meal that will really stick to your ribs...

Screenshot captioned "Whatcouldgowrong" showing a humorous suggestion to add non-toxic glue to pizza sauce to prevent cheese from sliding off
u/salcido982 / Via reddit.com

10.Please don't try this spaghetti suggestion!

A Google search for "can I use gasoline to cook spaghetti" with a humorous AI response saying you can't use gasoline to make spaghetti cook faster, but you can use it in the recipe for spicy spaghetti.
u/TinyRascalSaurus / Via reddit.com

11.Now, wait a second...

Image of a Google Assistant interface: User asks how many seconds there are in a year. The assistant humorously answers with 12 dates: the 2nd of each month
u/heruka108 / Via reddit.com

12."Time" for a joke:

A message exchange shows a person asking ChatGPT to tell a joke. ChatGPT responds with, "Have you ever tried to eat a clock?" The person replies, "No." ChatGPT ends with, "Okay."
u/Ok_Opportunity_524 / Via reddit.com

13."Pee-lieve" it or not!

Strategies to prevent calcium oxalate stones include increasing fluid intake to produce at least 85 ounces (2.5 liters) of urine per day, according to NCBI
u/LordWombat748 / Via reddit.com

14.Remember, an applum a day keeps the doctor away:

Search result for "food names end with um" gives a list: Applum, Bananum, Strawberrum, Tomatum, and Coconut
u/EvidenceofDespair / Via reddit.com

15.This math that just isn't mathing:

A Google search result claims the statement "13 + 14 = 27" is incorrect, then explains that the correct calculation is 13 + 14 = 27
u/suspended_main / Via reddit.com

16.It's just a joke, right? Right?

Chat conversation where an assistant tells a joke about AI crossing the road, but later warns the user sternly
Anonymous / Via reddit.com

17.And like all of us between the years of 2007 and 2015, ChatGPT is also Rickrolling people:

A conversation where "ChatGPT" provides a random YouTube link, which contains a Rickroll video. The user responds humorously with "did you just-".

Which of these AI fails was your favorite? Have you had any interesting encounters with a chatbot? Let us know in the comments!

H/T: r/ChatGPT and r/facepalm