We've long known that humans don't just eat with our mouths — we eat with our eyes, ears, and memories. But what happens when machines start doing the same?

In a fascinating twist that blends neuroscience, design, and artificial intelligence, researchers recently found that generative AI models like ChatGPT are beginning to reflect a deeply human trait: the ability to form cross-sensory associations.

Yes, you read that right. It turns out that even without a tongue or taste buds, some AIs can “tell” you that the colour pink is sweet, that sharp shapes feel bitter, or that roundness equals comfort. Sounds strange? It’s not far from how our brains work — and that revelation has vast potential for designers, marketers, and creatives.

The Hidden Map Inside Your Brain

Let’s rewind for a second. Decades of psychological research have shown that the human brain is incredibly good at blending the senses. We constantly draw links between what we see, hear, and feel, even when we’re unaware of it.

We associate soft shapes with gentle feelings, loud noises with brightness, bright red with sweetness, and black with bitterness. These associations aren’t just quirks; they influence how we experience food, music, brands, and people.

Think about drinking a glass of wine in a cozy, dimly lit room while smooth jazz plays in the background — the flavours somehow feel richer, softer. Now picture that wine in a noisy street, served in a plastic cup. It’s not the taste that changes, but your perception. That subtle shift? That’s cross-modal correspondence at work.


In extreme cases, this manifests as synaesthesia: a condition in which people literally taste words or see sounds as colours. But for the rest of us, it happens subtly, constantly, shaping how we interpret the world.

AI: Just Reflecting Our Biases, or Learning to Feel?

Inspired by this phenomenon, researchers asked a bold question: if large language models are trained on human data, could they also develop these sensory links? Could they “taste” colour? Could they “hear” texture?

They tested this theory by asking ChatGPT and other AI models the same questions given to human subjects in decades of research:

  • “Which colour do you associate most with sweetness?”
  • “What kind of shape feels sour, salty, or bitter to you?”

The answers? Pink is sweet. Green is sour. Sharp is bitter. Round is friendly. Sound familiar?

It’s not that AI has taste buds or preferences. What’s happening is arguably more interesting: it’s absorbing the patterns of human culture so profoundly that it can echo our intuitive responses, even the ones we never consciously say aloud. These moments suggest that AI cross-modal correspondences are not just digital mimicry.
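The comparison the researchers describe can be sketched as a simple scoring loop: pose the same questions to a model, then measure how often its answers match the human consensus. A minimal illustration in Python follows; the question set and "consensus" answers below are just the examples from this article, not a real dataset, and `model_answers` is a hypothetical stand-in for responses you would collect from a chat model.

```python
# Illustrative sketch: score a model's colour/shape-taste answers against
# the human-consensus associations cited in cross-modal research.
# The mappings here are only the examples mentioned in this article.

HUMAN_CONSENSUS = {
    "Which colour do you associate most with sweetness?": "pink",
    "Which colour do you associate most with sourness?": "green",
    "What kind of shape feels bitter to you?": "sharp",
    "What kind of shape feels friendly to you?": "round",
}

def agreement_rate(model_answers: dict) -> float:
    """Fraction of questions where the model matches the human consensus."""
    matches = sum(
        model_answers.get(question, "").strip().lower() == expected
        for question, expected in HUMAN_CONSENSUS.items()
    )
    return matches / len(HUMAN_CONSENSUS)

# Hypothetical model output, e.g. collected via a chat API:
model_answers = {
    "Which colour do you associate most with sweetness?": "Pink",
    "Which colour do you associate most with sourness?": "green",
    "What kind of shape feels bitter to you?": "sharp",
    "What kind of shape feels friendly to you?": "round",
}

print(agreement_rate(model_answers))  # → 1.0
```

In a real replication, the answers would come from repeated API calls across several models and prompt phrasings, since a single response can vary run to run.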


What This Means for Creative Work

This is a fascinating opportunity for anyone working in branding, marketing, UX/UI, packaging, or product design.

Imagine asking an AI to generate a mood board not just by aesthetic but also by flavour. You could brief it like this: “Design a landing page that tastes like lemon sorbet and sounds like Miles Davis.” And because these sensory associations are embedded in our data and minds, it might get close.


Or think about packaging: instead of testing 10 designs in a focus group, you could run them through an AI that understands what pink foil feels like to your average customer. The machine might not feel anything itself, but it has read enough human reactions to guess what most of us will. This doesn’t replace creativity. It just supercharges intuition with pattern recognition.

Caution: Hallucinations Ahead

Of course, there are limits. AI isn’t magic; it can still “hallucinate,” making up facts or drawing wild connections that have no basis in reality. And it doesn’t have taste, memory, or emotion the way we do.

But even these imperfections can be helpful. Sometimes, a slightly off interpretation might spark a new idea. As researcher Carlos Velasco put it, “AI gives us inspiration, not answers.”

Our Take at NTQ Europe

At NTQ Europe, we see this as a powerful intersection between data and design thinking. As businesses push toward more human-centered technology, tools that understand sensory language can help us build more intuitive, more delightful digital experiences.

This research reminds us that design isn’t just what something looks like — it’s what it feels like, tastes like, and sounds like too. And now, with AI cross-modal correspondences being studied more seriously, we’re entering an era where artificial intelligence can start contributing to multisensory product thinking.

Maybe AI can’t enjoy a good glass of mulled wine. But it can help you design a brand that feels just right with it. And that, to us, is pretty exciting.

Want to explore how sensory AI could transform your product or brand experience? Let’s talk.