[SystemSafety] AI Hallucination Cases

Derek M Jones derek at knosof.co.uk
Fri Jul 18 12:18:31 CEST 2025


Les,

> Please excuse the length of my response. Google Gemini AI had a lot to say on
> the subject and put it so well I felt it was worth including.

Many LLM interfaces provide an option to create a link to
the text of an interaction.

Using these links simplifies the process of sharing
responses from multiple LLMs to the same question.

-- 
Derek M. Jones           Evidence-based software engineering
blog:https://shape-of-code.com
