Every single time an LLM hallucinates, I am grateful:
Grateful that I spotted it, and grateful for the reminder that any and all LLM output needs to be validated. You can never trust these things 100% unless you have additional validation in place that is itself 100% reliable.
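As a minimal sketch of what I mean by validation (the function name, expected fields, and JSON shape here are hypothetical, not from any particular system): before trusting model output, parse it, check its structure, and recompute anything you can verify deterministically.

```python
import json

def validate_llm_answer(raw_output: str) -> dict:
    """Parse and verify a model response that was asked to return JSON
    with an 'items' list and a 'total' field. Raise on any mismatch
    instead of trusting the model."""
    try:
        data = json.loads(raw_output)
    except json.JSONDecodeError as exc:
        raise ValueError(f"Model did not return valid JSON: {exc}") from exc

    # Structural checks: the fields we asked for must exist and have the right types.
    if not isinstance(data.get("items"), list):
        raise ValueError("Expected 'items' to be a list")
    if not all(isinstance(x, (int, float)) for x in data["items"]):
        raise ValueError("Expected every item to be numeric")

    # Deterministic cross-check: recompute the total ourselves rather than
    # trusting the model's arithmetic.
    expected_total = sum(data["items"])
    if data.get("total") != expected_total:
        raise ValueError(
            f"Model claimed total={data.get('total')}, recomputed {expected_total}"
        )
    return data

# Anything that fails validation gets rejected or retried, never silently accepted.
print(validate_llm_answer('{"items": [1, 2, 3], "total": 6}'))
```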