Hacker News

>And I do not agree. LLMs are literally incapable of understanding the concept of truth, right/wrong, knowledge and not-knowledge. It seems pretty crucial to be able to tell if you know something or not for any level of human-level intelligence.

How are you so sure about this?

> If one believes LLMs are capable of cognition,

Honestly asking: what formal proof is there for our own cognition?




