
Hallucination

When an AI model produces a confident, fluent response that is factually wrong or entirely fabricated, such as citing a source that does not exist.

Last updated 2026-05-12