Mood Paint: From Signal to Canvas | Interactive art generated from biometric data

The proposed project is a hybrid art–technology system that translates physiological data from the human body into generative visual environments. Using skin conductance (GSR) and/or ECG heartbeat data captured via wearable sensors, the system produces real-time visual output.

A painting system that translates emotional states into visual compositions in real time. Mood Paint is not about expressing emotion intentionally; it bypasses intention.

The 5 Ws: Who, What, When, Where, Why

Who

Open to all audiences, with particular resonance for visual artists and for individuals who find verbal or traditional visual forms of emotional expression challenging.

What

A hybrid art–technology project that translates physiological signals associated with emotional states into evolving visual environments in real time—bypassing intention, cognition, and conscious control.

When

Whenever an emotional state emerges and seeks expression.

Where

The project takes form as an interactive wearable bracelet within installations for galleries, museums, festivals, and art-and-technology contexts. A complementary glove version is designed for intimate, personal use.

Why

This project seeks to expand access to emotional expression by offering a system that enables anyone — regardless of artistic training — to transform their inner states into visual form.

How

The project uses wearable sensors embedded in a bracelet and glove to capture physiological signals related to emotional states, such as skin conductance and micro-movements. These signals are processed in real time by a computational system that translates raw data into generative visual parameters—color, movement, density, rhythm, and spatial behavior.

Rather than attempting to label or interpret emotions, the system allows the data itself to drive visual change, creating environments that evolve directly from the body’s responses. The result is a feedback loop in which internal states are externalized as visual forms, enabling participants to perceive, reflect on, and inhabit their emotions as they unfold.
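As a minimal sketch of this pipeline, the snippet below smooths a raw GSR stream and normalizes it into per-frame visual parameters. All names, sensor ranges, and parameter mappings here are illustrative assumptions, not the project's actual implementation.

```python
# Hypothetical sketch: map a raw GSR sample stream to generative
# visual parameters. Sensor range and mappings are illustrative.

def smooth(samples, alpha=0.2):
    """Exponential moving average to tame sensor noise."""
    out, acc = [], samples[0]
    for s in samples:
        acc = alpha * s + (1 - alpha) * acc
        out.append(acc)
    return out

def to_visual_params(gsr_microsiemens):
    """Normalize a smoothed GSR value (assumed 1-20 uS range)
    into parameters a renderer could consume each frame."""
    level = min(max((gsr_microsiemens - 1.0) / 19.0, 0.0), 1.0)
    return {
        "hue": 220 - 180 * level,    # calm blue -> intense red
        "speed": 0.2 + 2.0 * level,  # motion-speed multiplier
        "density": 0.1 + 0.9 * level,
    }

stream = smooth([2.0, 2.4, 3.1, 5.5, 8.0, 7.2])
params = to_visual_params(stream[-1])
```

Because no emotion label is assigned, the renderer responds continuously to the signal itself, which is what produces the feedback loop described above.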

User Experience

The Mood Paint experience is intended to operate in two modes: an immersive painting booth for a more intimate experience, and a gallery setup where the visuals are displayed on a screen, allowing the user to simply wear the device and stand in front of it.

Mood Paint is activated when the participant puts on the biometric bracelet. The participant stands or moves naturally in front of a screen or projection while the painting continuously evolves based on their internal state. No instructions are required.


Emotional State Model

The system operates using a small set of non-clinical affective categories:

  • Calm

  • Focused

  • Stable

  • Elevated

  • Intense

These categories represent degrees of physiological activation rather than discrete emotions.

Visual Translation

Each emotional state has a distinct visual language.

Emotion    Visual Representation                           BPM Range
Calm       Soft gradients, slow motion, low contrast       BPM < 60
Focused    Pulsation, rhythmic motion, mild contrast       60 < BPM < 80
Stable     Bright colors, bursts                           80 < BPM < 95
Elevated   High saturation, noise, expansion, luminosity   95 < BPM < 110
Intense    Concentrated power, structured turbulence       BPM > 110
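The table above can be read as a simple lookup from heart rate to affective category. The thresholds below come straight from the table; treating a reading of exactly 110 as Elevated is an assumption, since the table leaves the boundary open.

```python
# BPM -> affective category, per the visual-translation table.
# Boundary handling (lower bounds inclusive) is an assumption.

def classify(bpm):
    if bpm < 60:
        return "Calm"
    if bpm < 80:
        return "Focused"
    if bpm < 95:
        return "Stable"
    if bpm <= 110:
        return "Elevated"
    return "Intense"
```

In practice a classifier like this would be applied to a smoothed average rather than to single beats, so the visuals do not flicker between states.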

Average Adult Resting BPM

• 55–65 → trained / athletic
• 60–75 → common adult
• 70–85 → slightly elevated baseline
• 80–95 → anxious baseline or caffeine

(BPM = beats per minute)
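For reference, a wearable ECG sensor typically reports inter-beat (R-R) intervals rather than BPM directly; BPM is 60 divided by the average interval in seconds. The helper below sketches that conversion (the function name and millisecond units are assumptions).

```python
# Hypothetical helper: derive BPM from a window of inter-beat
# (R-R) intervals, as an ECG-based wearable might report them.

def bpm_from_rr(rr_intervals_ms):
    """Average beats per minute from R-R intervals in milliseconds."""
    mean_rr_s = sum(rr_intervals_ms) / len(rr_intervals_ms) / 1000.0
    return 60.0 / mean_rr_s

# e.g. steady 800 ms intervals correspond to 75 BPM,
# a common adult resting rate per the list above
```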

References

Reference projects, research papers, exhibitions, and performances.

Murals as Motion: interactive art installations using sensors (capacitive, motion, LiDAR) and embedded LEDs or projection to create dynamic, glowing walls that react to touch or presence.


Siro Installation: This interactive installation uses a Muse brainwave sensor to generate colors, shapes, and sounds based on the participant's thoughts and emotions.


MiMu Gloves: wireless, sensor-equipped wearable controllers that transform physical expression into a dynamic, visual, and intuitive musical instrument, allowing performers to create and manipulate music through hand and arm gestures.


AmalGAN: In 2019, MIT engineer and artist Alexander Reben created a machine that used body signals (brainwaves, pulse, eye gaze) to turn his emotional responses into AI-generated art. The results were sent to anonymous painters to be rendered on canvas.


Moodboard

Inspiration images: Rachel Freire's MiMu gloves; AI-generated bracelet image.