Mood Paint: From Signal to Canvas | Interactive art generated from biometric data¶
The proposed project is a hybrid art–technology system that translates physiological data from the human body into generative visual environments. Using skin conductance (GSR) and/or ECG heartbeat data captured via wearable sensors, the system produces real-time visual output.
A painting system that translates emotional states into visual compositions in real time. Mood Paint is not about expressing emotion intentionally. It bypasses intention.
The 5 Ws: Who, What, When, Where, Why¶
Who¶
Open to all audiences, with particular resonance for visual artists and for individuals who find verbal or traditional visual forms of emotional expression challenging.
What¶
A hybrid art–technology project that translates physiological signals associated with emotional states into evolving visual environments in real time—bypassing intention, cognition, and conscious control.
When¶
Whenever an emotional state emerges and seeks expression.
Where¶
The project takes form as an interactive wearable bracelet within installations for galleries, museums, festivals, and art-and-technology contexts. A complementary glove version is designed for intimate, personal use.
Why¶
This project seeks to expand access to emotional expression by offering a system that enables anyone — regardless of artistic training — to transform their inner states into visual form.
How¶
The project uses wearable sensors embedded in a bracelet and glove to capture physiological signals related to emotional states, such as skin conductance and micro-movements. These signals are processed in real time by a computational system that translates raw data into generative visual parameters—color, movement, density, rhythm, and spatial behavior.
Rather than attempting to label or interpret emotions, the system allows the data itself to drive visual change, creating environments that evolve directly from the body’s responses. The result is a feedback loop in which internal states are externalized as visual forms, enabling participants to perceive, reflect on, and inhabit their emotions as they unfold.
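The pipeline described above can be sketched in code. The following is a minimal illustration, not the project's actual implementation: function names, the smoothing constant, and the 0–1 arousal range are assumptions made for the example.

```python
# Hypothetical sketch: turn a raw skin-conductance (GSR) stream into a
# smoothed, normalized 0-1 "arousal" value that can drive visual
# parameters (color, motion, density). All constants are illustrative.

def smooth(samples, alpha=0.1):
    """Exponential moving average over raw sensor samples,
    damping sensor noise before it reaches the visuals."""
    level = samples[0]
    out = []
    for s in samples:
        level = alpha * s + (1 - alpha) * level
        out.append(level)
    return out

def normalize(value, lo, hi):
    """Clamp a smoothed reading into the 0-1 range the
    generative system expects."""
    if hi <= lo:
        return 0.0
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

# Example: a steady signal stays steady; a mid-range reading maps to 0.5.
steady = smooth([1.0, 1.0, 1.0])
arousal = normalize(5.0, lo=0.0, hi=10.0)  # -> 0.5
```

In practice the smoothing constant and the calibration range (`lo`, `hi`) would be tuned per participant, since baseline skin conductance varies widely between individuals.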
User Experience¶
Mood Paint is activated when the participant wears the biometric bracelet. The participant stands or moves naturally in front of a screen or projection while the painting continuously evolves based on their internal state. No instructions are required.
Emotional State Model¶
The system operates using a small set of non-clinical affective categories:
- Calm
- Alert
- Active
- Stressed
These categories represent degrees of physiological activation rather than discrete emotions.
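Since the four categories are degrees of activation rather than discrete emotions, they can be derived from a single arousal value. The sketch below assumes a normalized 0–1 arousal input and evenly spaced thresholds; both are illustrative assumptions, not part of the project specification.

```python
# Hypothetical sketch: map a normalized 0-1 arousal value onto the four
# non-clinical activation categories. Threshold values are assumptions
# and would need tuning against real sensor data.

def classify_arousal(arousal):
    """Return the activation category for a 0-1 arousal value."""
    if arousal < 0.25:
        return "Calm"
    if arousal < 0.5:
        return "Alert"
    if arousal < 0.75:
        return "Active"
    return "Stressed"

# Example: low values read as Calm, sustained high values as Stressed.
classify_arousal(0.1)   # -> "Calm"
classify_arousal(0.9)   # -> "Stressed"
```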
Visual Translation¶
Each emotional state has a distinct visual language.
| State | Visual Representation | Signal |
|---|---|---|
| Calm | Soft gradients, slow motion, low contrast | Low arousal, stable signal |
| Alert | Pulsation, rhythmic motion, mild contrast | Moderate arousal, small fluctuations |
| Active | Bright colors, bursts | High arousal, frequent SCR peaks |
| Stressed | Saturation, noise | Sustained high arousal |
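The table above amounts to a lookup from state to rendering parameters. A minimal sketch, assuming hypothetical parameter names (`palette`, `motion_speed`, `contrast`) that stand in for whatever the generative renderer actually exposes:

```python
# Hypothetical sketch: the state-to-visuals table as a lookup.
# Parameter names and numeric values are illustrative assumptions.

VISUALS = {
    "Calm":     {"palette": "soft gradients",  "motion_speed": 0.2, "contrast": 0.2},
    "Alert":    {"palette": "pulsation",       "motion_speed": 0.5, "contrast": 0.4},
    "Active":   {"palette": "bright bursts",   "motion_speed": 0.8, "contrast": 0.7},
    "Stressed": {"palette": "saturated noise", "motion_speed": 1.0, "contrast": 0.9},
}

def visual_params(state):
    """Return the rendering parameters for a given activation state."""
    return VISUALS[state]

# Example: Calm yields slow, low-contrast motion.
visual_params("Calm")  # -> {"palette": "soft gradients", ...}
```

Keeping this mapping in one table makes the visual language easy to iterate on during installation testing without touching the signal-processing code.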
References: projects, research papers, expos, performances, etc.¶
Murals as Motion: interactive art installations using sensors (capacitive, motion, LiDAR) and embedded LEDs or projection to create dynamic, glowing walls that react to touch or presence.
Siro Installation: This interactive installation uses a Muse brainwave sensor to generate colors, shapes, and sounds based on the participant's thoughts and emotions.
Mi.Mu Gloves: wireless, sensor-equipped wearable controllers that transform physical expression into a dynamic, visual, and intuitive musical instrument, allowing performers to create and manipulate music through hand and arm gestures.
AmalGAN: In 2019, MIT engineer and artist Alexander Reben created a machine that used body signals (brainwaves, pulse, eye gaze), turning his emotional responses into AI-generated art. The results were sent to be painted on canvas by anonymous painters.