
Mood Paint: From Signal to Canvas | Interactive art generated from biometric data

The proposed project is a hybrid art–technology system that translates physiological data from the human body into generative visual environments. Using skin conductance (GSR) and/or ECG heartbeat data captured via wearable sensors, the system produces real-time visual output.

A painting system that translates emotional states into visual compositions in real time. Mood Paint is not about expressing emotion intentionally; it bypasses intention.

This project explores:

  • The body as a generator of art
  • Emotion without language
  • The translation of invisible states into visible form

The output

A portrait, but not of your appearance. What you are seeing is not a picture of you. It is not your face or your body. It is a portrait of your internal state in this exact moment.


Who

Open to all audiences, with particular resonance for visual artists and for individuals who find verbal or traditional visual forms of emotional expression challenging.

What

A hybrid art–technology project that translates physiological signals associated with emotional states into evolving visual environments in real time—bypassing intention, cognition, and conscious control.

When

Whenever an emotional state emerges and seeks expression.

Where

The project takes form as an interactive wearable bracelet within installations for galleries, museums, festivals, and art-and-technology contexts. A complementary glove version is designed for intimate, personal use.

Why

This project seeks to expand access to emotional expression by offering a system that enables anyone — regardless of artistic training — to transform their inner states into visual form.

How

The project uses wearable sensors embedded in a bracelet and glove to capture physiological signals related to emotional states, such as skin conductance and micro-movements. These signals are processed in real time by a computational system that translates raw data into generative visual parameters—color, movement, density, rhythm, and spatial behavior.

Rather than attempting to label or interpret emotions, the system allows the data itself to drive visual change, creating environments that evolve directly from the body’s responses. The result is a feedback loop in which internal states are externalized as visual forms, enabling participants to perceive, reflect on, and inhabit their emotions as they unfold.
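The mapping described above can be sketched in code. The following Python fragment is a minimal, hypothetical illustration of translating raw signals into generative visual parameters; the calibration ranges, parameter names, and the specific mappings (arousal to hue and density, heart rate to speed) are assumptions for the sketch, not the project's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class VisualParams:
    hue: float      # 0-360 degrees on the color wheel
    density: float  # 0-1, how crowded the visual field is
    speed: float    # 0-1, how fast forms move


def clamp01(x: float) -> float:
    """Constrain a value to the 0-1 range."""
    return max(0.0, min(1.0, x))


def map_signals(gsr_us: float, heart_rate_bpm: float) -> VisualParams:
    """Map raw biometric readings to visual parameters.

    Assumed (hypothetical) calibration ranges:
    - skin conductance: 1-20 microsiemens
    - heart rate: 50-120 bpm
    """
    # Normalize each signal to 0-1 within its assumed range.
    arousal = clamp01((gsr_us - 1.0) / (20.0 - 1.0))
    tempo = clamp01((heart_rate_bpm - 50.0) / (120.0 - 50.0))

    # Higher arousal shifts hue from cool blue (240) toward warm red (0),
    # and fills the canvas more densely; heart rate drives motion speed.
    return VisualParams(hue=240.0 * (1.0 - arousal),
                        density=arousal,
                        speed=tempo)


# Example: a mid-range reading produces mid-range visual parameters.
params = map_signals(gsr_us=10.5, heart_rate_bpm=85.0)
print(params)
```

In a live installation, a function like this would be called on every sensor frame, with the resulting parameters fed to the rendering engine; because the data itself drives the change, no emotion labels are ever assigned.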

User Experience

The Mood Paint experience is intended to operate in two modes: an immersive painting booth for a more intimate experience, and a gallery setup where the visuals are displayed on a screen, allowing the user to simply wear the device and stand in front of it.

Mood Paint is activated when the participant puts on the biometric bracelet. The participant stands or moves naturally in front of a screen or projection while the painting continuously evolves based on their internal state. No instructions are required.

Two Modes of Interaction

Exhibition Mode — The Bracelet

In galleries, participants wear a discreet bracelet that captures biometric signals. As they stand in front of a screen or projection, their internal state generates a unique visual composition in real time.

Personal Mode — The Glove

For intimate, personal use, the glove extends the system.


Prototype

Bracelet & Glove


References: projects, research papers, expos, performances, etc.

Murals as Motion: interactive art installations that use sensors (capacitive, motion, LiDAR) and embedded LEDs or projection to create dynamic, glowing walls that react to touch or presence.



Siro Installation: This interactive installation uses a Muse brainwave sensor to generate colors, shapes, and sounds based on the participant's thoughts and emotions.



Mi.Mu Gloves: wireless, sensor-equipped wearable controllers that transform physical expression into a dynamic, visual, and intuitive musical instrument, allowing performers to create and manipulate music through hand and arm gestures.



Pulse & Bloom: an interactive biofeedback installation by Saba Ghole, consisting of 20 interactive lotus flowers made of steel and Rowlux, ranging from 8 to 18 feet tall, each of which lights up with your pulse. You and another person can place your hands on a pair of Hamsa hands at the base of a lotus flower, and your respective heartbeats will light up the flower.



Heartsync, by Nino Basilashvili: live heartbeat data and the synchronization patterns of the group are translated into dynamic visuals, complemented by unique low-frequency sounds driven by the participants' heartbeats, blending into a collective symphony. Each person has a unique heartbeat rhythm; despite this uniqueness, people's heart rhythms can synchronize.



Moodboard

Inspiration images: Rachel Freire's Mi.Mu gloves; bracelet (AI-generated image).

