
I sometimes catch myself telling the model that it is confused, that it should just assume what I am writing is true, and continue reasoning from there.

Sometimes I am actually right, but sometimes I am not. I'm not sure what this does to any future RL: does it lean toward constantly treating whatever the user writes as true, and then have to wiggle out of it later?


