> We found that GPT-3, GPT-Neo/J/X, and similar autoregressive language models that predict text from left to right are prone to “hallucinating” and generating text inconsistent with the “ground truth” document.
The term hallucinating is brilliant for how these AI systems seem to generate output.
Your product is very interesting; it seems to work nicely on easy queries like "how do I sort an array of objects in JavaScript", but it was quite confusing for complex queries.
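For context, the easy query above has a well-known one-liner answer; a minimal sketch (the `users` data and the `age` sort key are just illustrative) looks like:

```javascript
// Sort an array of objects by a numeric property using Array.prototype.sort.
const users = [
  { name: "Bea", age: 30 },
  { name: "Al", age: 25 },
];

// The comparator returns a negative, zero, or positive number;
// subtracting the numeric keys sorts ascending by age.
users.sort((a, b) => a.age - b.age);

console.log(users.map((u) => u.name)); // → ["Al", "Bea"]
```

Queries like this have a single canonical snippet, which is probably why they work well; complex queries rarely do.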
The UI doesn't work too well on mobile, but it's a beta, and software is mostly written at a desktop anyway.
I also think making this a dedicated search engine for a company's internal, messy data would be very useful.
Thanks for trying it out! We've gotten some solid feedback about the UI and will be working to improve it. Could you tell us a bit more about what you were trying to do and how it was confusing for complex queries?