Thanks! A lot of people don’t seem to realize that GPT doesn’t actually have any idea what words mean. It just outputs tokens based on how likely each one is to show up after the previous ones. This results in very interesting behavior, but there’s nothing conceptually “there”: there’s no thinking, and so no conversation to be had.
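
To make that concrete, here’s a toy sketch of the same loop in Python. The vocabulary and probabilities are made up for illustration; GPT learns its distribution over tens of thousands of tokens with a big neural net and conditions on a long context, but the generation step has the same shape: sample whatever is likely next, with no notion of what the words denote.

    import random

    # A toy "language model": for each context word, a made-up
    # distribution over possible next words. (Real GPT conditions on
    # thousands of tokens, not one, but the loop is the same shape.)
    next_word_probs = {
        "the":  {"cat": 0.5, "dog": 0.3, "idea": 0.2},
        "cat":  {"sat": 0.6, "ran": 0.4},
        "dog":  {"sat": 0.3, "ran": 0.7},
        "idea": {"sat": 0.1, "ran": 0.9},
        "sat":  {"the": 1.0},
        "ran":  {"the": 1.0},
    }

    def generate(start, n_words):
        """Autoregressive generation: repeatedly sample the next word
        from the distribution conditioned on the current one."""
        out = [start]
        for _ in range(n_words):
            probs = next_word_probs[out[-1]]
            words, weights = zip(*probs.items())
            out.append(random.choices(words, weights=weights)[0])
        return " ".join(out)

    print(generate("the", 8))
    # e.g. "the cat sat the dog ran the idea ran" -- locally plausible,
    # no meaning attached to any of it
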