Notes on AI Hallucination
Hallucination evaluation lab
Message
08:32, 26 April 2024 (UTC) Hi Don, for info only...
The link above gives a good general overview of AI at the present time. It also notes that LLMs are very prone to hallucinations...
https://arxiv.org/abs/2305.11747
The link above leads to a paper on how to check for these...
Best wishes,
Hugh😑