The Body, The Environment, and The Machine
Designing for the Senses
For decades, we have treated the human brain like a computer sitting in a jar, processing data objectively and completely detached from the physical world. But science tells a very different story...
I was first introduced to the theory of Embodied Cognition back in 2016, or perhaps earlier, while reading a book: Sensation by Thalma Lobel.
According to Lobel and the theory of Embodied Cognition, our thoughts, emotions, and decisions are deeply anchored in our physical bodies and sensory environments. When we feel physical warmth, we perceive people as kinder; when we hold something heavy, we view situations as more serious.
I found this fascinating back in 2016 (and I still do), so it became the foundation of my senior project, an interactive animation study titled “Embodied Cognition”. Using tools like P5.js and real-time audio and visual sensors, I set out to demonstrate that our feelings are rooted in tangible biological feedback. I created a series of interactive case studies that acted as a “digital mirror” for the subconscious...
Projection 1 (Weight): an experimental interactive animation that mapped digital tethers to the user’s joints.
As they tried to jump or move, the lines stretched with simulated physical tension, visually manifesting the heavy burden of anxiety or responsibility.
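The tether behavior can be sketched in plain JavaScript. The original piece was built in P5.js; the function names and constants below are hypothetical, meant only to illustrate the spring-like logic:

```javascript
// Hypothetical sketch of the tether logic: each joint is anchored to a
// fixed point by a virtual spring. The further the user pulls a joint
// away, the more "tension" the drawn line carries, rendered as a
// thicker stroke. Names and constants are illustrative, not the
// original P5.js source.

// Hooke's-law style tension for a tether stretched past its rest length.
function tetherTension(jointY, anchorY, restLength, stiffness = 0.05) {
  const stretch = Math.max(0, Math.abs(anchorY - jointY) - restLength);
  return stiffness * stretch; // grows linearly as the user pulls away
}

// Map tension to a stroke weight so the line visibly "strains".
function strokeWeightFor(tension, min = 1, max = 8) {
  return Math.min(max, min + tension);
}
```

In a P5.js draw loop, each frame would recompute the tension from the tracked joint positions and redraw the lines accordingly.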



Projection 2 (Speed): an experimental interactive animation that used computer vision to map a user’s movement velocity to a digital brush.
Rushing through the space left big, round, chaotic dots, forcing the user to witness the visual trace created by their own frantic pace.
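The velocity-to-brush mapping can be sketched as a pair of small functions. The original used computer vision in P5.js; these names and scaling constants are illustrative assumptions:

```javascript
// Hypothetical sketch of the speed mapping: compare the tracked
// position between frames, and let faster movement produce larger,
// more scattered dots.

// Distance the tracked point moved between two frames (pixels/frame).
function speedBetween(prev, curr) {
  const dx = curr.x - prev.x;
  const dy = curr.y - prev.y;
  return Math.sqrt(dx * dx + dy * dy);
}

// Faster motion -> bigger dot, capped so the canvas stays legible.
function dotRadius(speed, base = 2, scale = 0.5, max = 40) {
  return Math.min(max, base + scale * speed);
}

// Faster motion -> more positional jitter, so the trail reads as chaotic.
function jitter(speed, factor = 0.1) {
  return (Math.random() - 0.5) * speed * factor;
}
```

Each frame, the sketch would stamp a dot of `dotRadius(...)` at the tracked position plus a little `jitter(...)`, leaving the frantic trail behind.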



Projection 3 (Sound): an experimental interactive animation that translated the volume and pitch of a room into falling circles, which expanded exponentially with loud noises.
It visualized the feeling of a crowded mind, showing how auditory chaos leaves no room for internal reflection, and vice versa.
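The audio mapping can be sketched as two small functions: microphone level drives an exponentially growing radius, and pitch could pick the horizontal spawn position. The original used real-time audio input in P5.js; the names and ranges below are illustrative assumptions:

```javascript
// Hypothetical sketch of the audio mapping: quiet rooms spawn small
// circles, loud noise spawns circles that exponentially crowd the
// canvas.

// Exponential growth of circle radius from a normalized mic level [0, 1].
function circleRadius(level, base = 4, growth = 5) {
  return base * Math.exp(growth * level);
}

// Map a detected pitch (Hz) onto the canvas width, clamped to range.
function spawnX(pitchHz, minHz = 80, maxHz = 1000, canvasWidth = 800) {
  const t = Math.min(1, Math.max(0, (pitchHz - minHz) / (maxHz - minHz)));
  return t * canvasWidth;
}
```

A draw loop would spawn a circle of `circleRadius(level)` at `spawnX(pitch)` each frame and let gravity carry it downward.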


Through user interviews compiled into a physical booklet, it became clear that this work was a profound validation of the human condition: when we say we are weighed down, our bodies literally behave as if they are.
And so... if our environment so drastically shapes our thoughts, shouldn’t our digital environments respond to our physical reality? This is where the concept of Ambient Intelligence (AmI) comes in.
No book introduced me to this concept. It was AI itself. If you are not familiar with it: AmI is a technological paradigm in which our environments are enriched with networks of unobtrusive sensors and artificial intelligence.
Basically, the idea is that instead of forcing humans to stare at flat screens everywhere, the tech weaves seamlessly into the fabric of everyday life, acting as an invisible butler that perceives our activities, reasons about our needs, and adapts the environment to support us.
And why am I even writing about this?
(Besides sharing my cool senior project)
Well, AmI is already beginning to profoundly affect design, med-tech, and human life. I want to briefly mention two major examples...
AmI systems can continuously and passively monitor physiological data (like heart rate variability and sleep patterns) alongside behavioral data (like movement and social withdrawal) through wearables and smart space sensors. Because AmI is context-aware, it can detect the physical markers of an impending panic attack or depressive episode and proactively intervene... perhaps by dimming the lights, playing calming music, or triggering a virtual agent to guide you through breathing exercises.
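A toy version of such a context-aware rule can be sketched in a few lines. Real AmI systems use far richer models; every name and threshold below is invented for illustration only:

```javascript
// Hypothetical sketch of a proactive intervention rule: combine a few
// passively sensed signals into a simple risk score, then pick calming
// actions when it crosses a threshold. All cutoffs are illustrative.

function assessState({ heartRateVariability, hoursSlept, movementLevel }) {
  let riskScore = 0;
  if (heartRateVariability < 30) riskScore += 1; // ms; low HRV ~ stress
  if (hoursSlept < 6) riskScore += 1;            // poor sleep
  if (movementLevel < 0.2) riskScore += 1;       // withdrawal proxy, [0, 1]
  return riskScore;
}

function chooseInterventions(riskScore) {
  if (riskScore >= 2) {
    return ["dimLights", "playCalmingMusic", "offerBreathingExercise"];
  }
  if (riskScore === 1) return ["dimLights"];
  return [];
}
```

The point is not the specific rules but the shape of the loop: sense passively, reason about context, and adapt the environment without asking the user to look at a screen.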
Imagine this intervention for astronauts deep in space!

In exhibition spaces like museums, traditional audio guides are being replaced by AI-driven digital guides... By using precise gestures, eye contact, and multimodal interactions, these digital guides activate our mirror neurons, creating an empathetic emotional resonance that transforms a dry history lesson into a deeply immersive, embodied learning experience.
So let’s talk about sensations and the future of design...
But first let’s put it all together. I want to point out (after all that) that the core connection between my interactive project, Embodied Cognition, and technology ideas like Ambient Intelligence is the realization that: We are biological beings first and digital users second.
For too long, technology has relied on a disembodied design paradigm that ignores the physical sensations of being human. And we could be doing so much better!
When we synthesize these concepts, we pave the way for a future of better design.
If a smart environment or digital interface knows that we associate rough textures with adversarial interactions or the color red with avoidance, it can dynamically alter its properties to soothe us. Imagine an application that senses your high-velocity, frantic scrolling (much like my senior project) and responds by physically slowing down the interface or shifting its color palette to cool, calming blues to lower your social temperature.
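Such an adaptive interface could be sketched as follows. Everything here, from the smoothing constant to the hue values, is a hypothetical illustration rather than any real framework's API:

```javascript
// Hypothetical sketch of a calming interface: smooth the scroll
// velocity over time, and let high values shift the UI toward cooler
// blues and slower animation.

// Exponential moving average keeps the response calm, not jittery.
function smoothVelocity(prevAvg, sample, alpha = 0.2) {
  return prevAvg + alpha * (sample - prevAvg);
}

// Map smoothed velocity (px/s) onto a "frantic factor" in [0, 1].
function franticFactor(velocity, threshold = 500, span = 1500) {
  return Math.min(1, Math.max(0, (velocity - threshold) / span));
}

// Interpolate from a neutral warm hue (60°) toward cool blue (220°)
// as scrolling grows frantic, and slow animations proportionally.
function adaptUI(factor) {
  return {
    hue: 60 + factor * (220 - 60),     // degrees on the color wheel
    animationSpeed: 1 - 0.5 * factor,  // down to half speed
  };
}
```

Calm scrolling leaves the interface untouched; frantic scrolling eases it, gently, toward blue and slow.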
My senior project wanted to show how our movement, volume, and emotions are part of a single, unified system. Ambient Intelligence takes that truth and scales it to the world around us. A smart home or device that isn’t aware of the physical weight or anxiety of its inhabitants isn’t truly smart. By combining the psychological insights of embodied cognition with the ubiquitous sensors of AmI, we are moving toward a future where technology is no longer a cold, flat tool, but a highly empathetic, physical extension of the human experience.
And I can’t wait!
Now, if you have the tech, let’s work together! We offer the 90-Minute Strategy Session, where we will identify what your work needs and provide a strategic playbook to make it credible, cohesive, and market-ready. Start here!



