Hacker News
lyu07282 on May 31, 2024 | on: Legal models hallucinate in 1 out of 6 (or more) b...
Yeah, I'd imagine the problem is not verifying the output against the retrieved documents. If it just hallucinates, it would ignore the given context, something that can absolutely be verified by another LLM.
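A minimal sketch of the check described above, assuming the OpenAI Python client; the function name, prompt wording, and model choice are illustrative, not from the thread:

    # Ask a second LLM whether an answer is actually supported by the
    # retrieved documents, i.e. whether the first model ignored its context.
    from openai import OpenAI

    client = OpenAI()

    def verify_grounding(answer: str, retrieved_docs: list[str]) -> bool:
        """Return True if the verifier LLM judges `answer` to be
        fully supported by `retrieved_docs` (names are hypothetical)."""
        context = "\n\n".join(retrieved_docs)
        prompt = (
            "You are a strict fact-checker. Given the CONTEXT and the "
            "ANSWER, reply with exactly SUPPORTED if every claim in the "
            "ANSWER is backed by the CONTEXT, otherwise reply "
            "UNSUPPORTED.\n\n"
            f"CONTEXT:\n{context}\n\nANSWER:\n{answer}"
        )
        resp = client.chat.completions.create(
            model="gpt-4o",  # any capable chat model works here
            messages=[{"role": "user", "content": prompt}],
            temperature=0,  # deterministic judgment, no sampling noise
        )
        return resp.choices[0].message.content.strip() == "SUPPORTED"

Constraining the verifier to a single token-like verdict (SUPPORTED / UNSUPPORTED) at temperature 0 makes the check easy to parse programmatically, though it only catches answers that contradict or go beyond the context, not errors already present in the retrieved documents.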