This is Your Brain on Feels
Experimenting with how to visualize emotion.
What's going on inside when you're feeling upset? What does being in love look like? What happens when all those feelings under the surface are made visible? Soon, we'll be getting a sense of that.
Machine perception lends a new perspective on people's emotional reactions by decoding facial expressions, analyzing voice patterns, scanning text for tone, and measuring neurological immersion. People may not pick up on emotion from moment to moment. We may not always be aware of our own emotional expressions or how they are received. And we certainly won't see millions of faces from around the world or register shifts in vocal intonation on a massive scale. Emotional artificial intelligence can pick up on these signals and reveal meaningful patterns, weird anomalies, and troubling stereotypes.
But when it comes to emotional experience, there's a disconnect. Machines speak numbers; humans speak metaphors. Machines see micro-expressions; humans pick up vibes. Machines know just a few emotion words; humans have wide-ranging emotional vocabularies that grow over a lifetime. Machines look for universals; humans perceive nuance.
This design exploration experiments with ways of visualizing emotion that don't oversimplify its nuance or reduce artful expression to graphs and charts. In this first installment, we focused on brainwaves as a pattern of feeling, as in the sketch below.
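To give a rough sense of the raw material involved, here is a minimal sketch of rendering a brainwave-like signal as a pattern rather than a chart. The signal is synthetic, the sample rate and smoothing window are assumptions, and nothing here reflects the project's actual data or pipeline; it only illustrates the idea of showing the shape of a signal while stripping away the numeric scaffolding.

```python
# Hypothetical sketch: render a synthetic "brainwave" as a pattern of feeling.
# The signal, sample rate, and styling are illustrative stand-ins, not the
# project's real data or method.
import numpy as np
import matplotlib.pyplot as plt

fs = 256                      # assumed sample rate (Hz), typical for consumer EEG
t = np.arange(0, 10, 1 / fs)  # ten seconds of "recording"

# Synthetic EEG-like mixture: a slowly waxing 10 Hz rhythm, a faster
# 20 Hz component, and noise.
signal = (
    1.0 * np.sin(2 * np.pi * 10 * t) * (1 + 0.5 * np.sin(2 * np.pi * 0.2 * t))
    + 0.4 * np.sin(2 * np.pi * 20 * t)
    + 0.3 * np.random.randn(t.size)
)

# Crude amplitude envelope: rectify and smooth with a moving average,
# standing in for the features a real system might extract.
window = int(0.5 * fs)
envelope = np.convolve(np.abs(signal), np.ones(window) / window, mode="same")

fig, ax = plt.subplots(figsize=(10, 3))
ax.plot(t, signal, linewidth=0.5, alpha=0.4, label="raw signal")
ax.plot(t, envelope, linewidth=2, label="smoothed envelope")
ax.set_xlabel("time (s)")
ax.set_yticks([])             # drop the numeric axis: pattern over precision
ax.legend(frameon=False)
plt.tight_layout()
plt.show()
```

The design choice the sketch gestures at is the same one the exploration pursues: keeping the rhythm and texture of the signal visible while removing the grid-and-numbers framing that makes it read as a chart.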