Are aphantasic minds similar to ChatGPT?
I recently had an enlightening conversation with ChatGPT, an advanced AI language model, on the topic of aphantasia. Out of curiosity, I asked it to think of a horse.
Its answer was both intriguing and oddly familiar to me. Here’s what I got:
ChatGPT: As an artificial intelligence, I don’t have the ability to visualize or imagine in the way humans do. I don’t have the ability to form mental images. So, when you ask me to think of a horse, I don’t actually conjure up an image of a horse. Instead, I access my training, which includes information about horses, their characteristics, behaviors, and the context in which horses are typically mentioned.
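If you’re curious to try this yourself, the exchange is easy to reproduce. Here’s a minimal sketch using OpenAI’s official Python client; the model name and prompt wording are my assumptions for illustration, not a transcript of my exact session, and you’ll need your own API key.

```python
# Minimal sketch: posing the "think of a horse" question to the API.
# Assumes the `openai` package (v1+) is installed and an OPENAI_API_KEY
# environment variable is set. The model name is illustrative.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4",  # assumption: any chat-capable model would do
    messages=[
        {
            "role": "user",
            "content": "Think of a horse. What happens in your 'mind' when you do?",
        }
    ],
)

print(response.choices[0].message.content)
```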
This response from ChatGPT, devoid of mental imagery, mirrors the experience of those of us with aphantasia. While we may not “see” in the mind’s eye, we still process and understand concepts fully; we simply do it without the pictures.
When asked to “think of a horse,” we rely on our training (learning, experiences, knowledge) and implicitly call up associated information about horses: their characteristics, behaviors, and the contexts in which horses typically appear.
But there’s more to this story. This parallel between AI and aphantasia opens up fascinating discussions about both the nature of human cognition and the capabilities of artificial intelligence.
I had a fantastic discussion with neuroscientist Mac Shine about this very topic. We explored the unexpected ways in which the cognitive functions of aphantasics and AI systems like GPT-4 overlap. It’s a conversation I think you’ll find both surprising and enlightening.
Watch Mac Shine’s presentation here and my extended interview here.