LLMs don’t “feel”, “know”, or “understand” anything. They spit out the statistically most likely answer based on their training data; that is all they do.
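To make the point concrete, here is a toy sketch of the core idea: at each step the model just picks the most probable next token from a learned distribution. The vocabulary and probabilities below are invented for illustration; real models sample from distributions over tens of thousands of tokens.

```python
# Toy illustration of greedy next-token selection.
# These probabilities are made up; a real LLM computes them
# from its weights, conditioned on the preceding context.
next_token_probs = {
    "understand": 0.05,
    "predict": 0.62,
    "feel": 0.08,
    "compute": 0.25,
}

# Greedy decoding: choose the highest-probability token. No feeling,
# no knowing, just an argmax over a probability distribution.
chosen = max(next_token_probs, key=next_token_probs.get)
print(chosen)  # -> predict
```

Real decoders often sample (temperature, top-k, nucleus) instead of always taking the argmax, but the underlying operation is still selection from a statistical distribution.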