Hallucinations occur when generative AI tools and applications produce incorrect or misleading content. The fabricated content is presented in a way that appears authentic, which can make the errors difficult to identify. A common example arises when generative AI is prompted for research citations in a given area: the citations it produces may or may not point to actual literature.
Image- and sound-based AI tools are also subject to hallucinations. Generative AI may add pixels in ways that do not accurately reflect the object being depicted. This is why, for example, image-generation tools are notorious for adding extra fingers to hands!
Another significant issue over the past two years has been the deliberate misrepresentation of images, audio, and text.