For one, ChatGPT has no idea what a cat or a dog looks like. It has no understanding of how their movement differs in character. Lacking that kind of non-verbal understanding, even when analysing art that’s actually in its domain – that is, poetry – it couldn’t begin to make sense of the question “does this poem have feline or canine qualities?” The best it can do is recognise that there are neither cats nor dogs in it and, being stumped, make up some utter nonsense. Maybe it has heard of “catty” and that dogs are loyal, and it will go looking for those themes, but feline and canine as in elegance? Forget it, unless it has read a large corpus of poetry analysis that uses those terms: it can parrot that pattern matching, but it can’t do the pattern matching itself – it cannot transfer knowledge from one domain to another when it has no access to one of those domains.
And that’s just the tip of the iceberg. As humans, we’re not really capable of purely symbolic thought, so it’s practically impossible for us to appreciate just how limited these systems are by virtue of not being embodied.
(And, yes, Stable Diffusion has some understanding of feline vs. canine as in elegance – but it’s an utter moron in other areas. It can’t even count to one.)
Then, all that said, and even more fundamentally: ChatGPT (like every other current AI algorithm we have) is a T2 system, not a T3 system. It comes with rules for how to learn; it doesn’t come with rules enabling it to learn how to learn. As such it never thinks – it cannot think, as in “mull over”. It reacts with what passes for a gut in AI land, and never with “oh, I’m not sure about this, so let me mull it over”. It is in principle capable of not being sure, but that doesn’t mean it can do anything to rectify the situation.