Notes on AI Hallucination
[[Category:AI hallucination]]
== Hallucination evaluation lab ==
* https://github.com/RUCAIBox/HaluEval
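HaluEval is a benchmark that tests whether a model can recognise hallucinated answers. Below is a minimal Python sketch of that kind of check; the file name and the field names ("knowledge", "question", "right_answer", "hallucinated_answer") are assumptions about the repo's QA data format and should be verified against it, and the judge here is a crude stand-in rather than the paper's LLM-based recognition prompt.
<syntaxhighlight lang="python">
# Minimal sketch of a HaluEval-style recognition check.
# Assumes a local JSON-lines file where each line has "knowledge",
# "question", "right_answer" and "hallucinated_answer" fields
# (field names are an assumption -- check the repo's data files).
import json
import random


def load_samples(path):
    """Read one JSON object per line."""
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]


def judge(knowledge, question, answer):
    """Placeholder judge: flag an answer as hallucinated if it is not
    supported by the knowledge text. A real evaluation would prompt an
    LLM instead of using this crude substring test."""
    return answer not in knowledge


def evaluate(samples):
    """Show the judge either the right or the hallucinated answer and
    count how often it labels the answer correctly."""
    correct = 0
    for s in samples:
        is_hallucinated = random.random() < 0.5
        answer = s["hallucinated_answer"] if is_hallucinated else s["right_answer"]
        if judge(s["knowledge"], s["question"], answer) == is_hallucinated:
            correct += 1
    return correct / len(samples)


if __name__ == "__main__":
    # "qa_data.json" is a hypothetical local copy of the repo's QA split.
    samples = load_samples("qa_data.json")
    print(f"recognition accuracy: {evaluate(samples):.2%}")
</syntaxhighlight>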
== Message ==
<blockquote>
08:32, 26 April 2024 (UTC)
Hi Don, for info only...
https://newatlas.com/technology/ai-index-report-global-impact/?fbclid=IwZXh0bgNhZW0CMTEAAR1834vD7p9PmK-8PZDABMgVyCuz2n3-qnQixGOSmhakdrM2gLMTD4sv6pg_aem_AdMSn9x1AROE7v2M_2lPwG6FXJf_FRJryYsNSKTEKxfI7VPuBcH74RLwEtvYojyDRe7Ea-7R9VahaZarNv-C65vV
The link above gives a good general overview of AI at the current time. It also mentions that LLMs are very prone to hallucinations...
https://arxiv.org/abs/2305.11747
The link above leads on to an article about how to check for these...
Best wishes,
Hugh😑
</blockquote>