Scientists Say: Hallucination

Hallucination (noun, “huh-loo-sin-AY-shun”)

The word “hallucination” can mean different things depending on the context. In psychology, a hallucination is when someone senses something that is not there. In computer science, it refers to a type of artificial intelligence, or AI, error. It’s when an AI system generates an incorrect or misleading response that may appear plausible.

In psychology, hallucinations are a form of psychosis. That is a mental state in which a person becomes disconnected from reality. Hallucinations can involve any of the senses. A person may see, hear, smell, taste or feel things that are not there.

Some mental conditions can come with hallucinations. Schizophrenia, for example, is a mental condition marked by hallucinations — especially sound-related ones. About 75 percent of people with this condition report hearing things that do not exist. Drug use, poor sleep or illness — especially with a high fever — can trigger hallucinations, too.

In computer science, a hallucination refers to an AI mistake. Sometimes an AI system will generate incorrect or misleading responses. These errors can be hard to catch, even for experts.

Here’s an example. In 2023, Google released ads showing off an AI chatbot called Bard. However, the ads contained errors. Google asked Bard, “What new discoveries from the James Webb Space Telescope (JWST) can I tell my 9-year-old about?” Bard claimed that the JWST had taken the first-ever picture of an exoplanet. That’s a planet outside our solar system. But this was not true. The first such image was captured by the VLT. That’s the European Southern Observatory’s Very Large Telescope.

In a sentence

AI experts warn that chatbot hallucinations may mislead you.

Check out the full list of Scientists Say.
