
11. Open Source Hardware - From Fibers to Fabric

Pattaraporn (Porpla) Kittisapkajon’s Voice-Activated Gratitude Weaving Loom

After the endless trials and errors of the previous weeks’ assignments, I was honestly the most nervous about this one. Despite having an architectural background, I have never thought of myself as a builder—and suddenly, here I was, making a machine!

To stay grounded, I decided to choose the simplest machine I could realistically achieve within one week. That’s when I came across Fabricademy alumna Kae Nagano’s voice-activated weaving loom. I was immediately fascinated by the human–machine interaction she created—how the voice becomes a physical force that activates the loom.

Research & Ideation

Kae Nagano’s Voice-Activated Weaving Loom

Inspired by Kae Nagano’s work, I chose not just to build a loom—but to ritualize one. Through my own weaving loom, I explored human–artificial intelligence interaction as a living feedback loop between voice, computation, and material. By framing the interaction through gratitude, the machine becomes a device for consciousness—an interface for reflection. This became the Gratitude Loom.

How Does Weaving Work

I had never woven before, so my first step was simply to understand how weaving actually works. At its core, weaving is created by passing a weft (filling) yarn over and under a set of warp yarns that are held under tension on a loom.


Weaving Diagram — Lauren Nishizaki

Each new row of weft alternates its over–under pattern from the previous row, creating a stable interlaced structure. This repetitive interlocking produces a high number of intersections, which gives woven fabric its strength, flexibility, and structural integrity.
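Purely as an illustration (not part of the project code), this over–under alternation can be sketched in a few lines of JavaScript, printing a small interlacement map where each row's pattern is offset by one:

// Illustration only: print a small plain-weave interlacement map
// "X" = weft passes over the warp thread, "." = weft passes under it
const rows = 4;
const cols = 8;
for (let r = 0; r < rows; r++) {
  let line = "";
  for (let c = 0; c < cols; c++) {
    line += (r + c) % 2 === 0 ? "X" : ".";
  }
  console.log(line);
}
// X.X.X.X.
// .X.X.X.X
// X.X.X.X.
// .X.X.X.X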

Weaving Machine Animation and Basic Principle of Weaving — Textile Explained

Double Warping Frame Loom

Next, I needed to understand how the machine itself works so I could design my loom accordingly. I began by studying the double warping frame loom and the role of the heddle in weaving.

The heddle contains alternating slots and holes, which separate the warp yarns into two groups. This structure allows one set of warp threads to be lifted while the other remains stationary. When the heddle rotates or shifts forward and backward, these two groups of warp yarns switch positions—one moving above the weft (filling yarn), and the other moving below it.

This alternating motion creates the essential over–under pattern of weaving. Each time the heddle changes position, the relationship between warp and weft is reversed, allowing the fabric to build up row by row in a stable, interlaced structure.

Double Warping Frame Loom — Fibers and Design Weaving

Concept Development

System Design — Human-AI-Material Feedback Loop


Human-AI-Material Feedback System — Pattaraporn (Porpla) Kittisapkajon

This diagram illustrates the closed-loop interaction between the human, the AI system, and the physical loom. Spoken gratitude is captured through speech recognition, interpreted by an AI reflection engine, and translated into serial movement commands that drive the stepper motor. The resulting woven output provides visual and tactile feedback, forming a continuous embodied feedback loop.
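Summarized stage by stage (the concrete code for each stage follows in the Programming section):

// Voice    -> Web Speech API (Chrome)        : transcript of spoken gratitude
// AI       -> server.mjs -> Groq API         : short gratitude reflection
// Serial   -> Web Serial API -> Arduino Uno  : a single "M" command
// Motion   -> ULN2003 -> 28BYJ-48 -> heddle  : one weaving gesture
// Feedback -> the growing weave returns the loop to the human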


Gratitude Loom Web Interface (Chrome) — Real-time Ritual Control and AI Reflection Display — Pattaraporn (Porpla) Kittisapkajon

Mechanical Design


Mechanical Assembly — Pattaraporn (Porpla) Kittisapkajon

This exploded assembly diagram shows the core mechanical components of the Gratitude Loom, including the frame, heddle shaft, motor housing, end cap, and stepper motor. The stepper motor directly drives the rotation of the heddle, enabling controlled alternation of warp threads during weaving. This design translates digital commands into precise physical motion.

Assembly Video — Pattaraporn (Porpla) Kittisapkajon

Circuit Design


Stepper Motor Control Circuit Diagram — Pattaraporn (Porpla) Kittisapkajon

This stepper motor control circuit uses an Arduino Uno to send digital movement commands to a ULN2003 motor driver, which amplifies the signals and drives the 28BYJ-48 stepper motor. A shared 5V and GND supply ensures synchronized operation across all components. The stepper motor’s rotation directly actuates the heddle, enabling precise digital-to-physical translation for the weaving system.
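For reference, this is the wiring assumed by the pin constants in the Arduino sketch below (Part 2 of the Programming section):

Arduino D8  → ULN2003 IN1
Arduino D9  → ULN2003 IN2
Arduino D10 → ULN2003 IN3
Arduino D11 → ULN2003 IN4
Arduino 5V  → ULN2003 + (VCC)
Arduino GND → ULN2003 − (GND)
28BYJ-48 connector → ULN2003 motor socket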

Tools

Digital Tools

  • Rhinoceros — 3D modeling of loom components
  • Arduino IDE — writing and uploading Arduino motor code
  • p5.js — browser-based interaction for voice input and serial control
  • Text Editor (Notepad) — editing local files (index.html, sketch.js, server.mjs, .env)
  • Web Browser (Google Chrome) — running the local p5.js interface with Web Speech + Web Serial
  • Node.js — running the local AI server (server.mjs)
  • .env file — secure storage of the AI API key

Fabrication Tools

  • Bambu Lab P1S — rapid prototyping of loom parts
  • PLA Matte Filament — structural loom components

Electronics

  • Arduino Uno — main system controller
  • 28BYJ-48 Stepper Motor — loom rotation
  • ULN2003 Driver Board — motor control interface
  • Breadboard — rapid circuit prototyping
  • Jumper Wires (male–male, male–female) — electrical connections
  • USB Cable — Arduino power & serial communication

AI & Web Stack

  • Groq API — AI-generated gratitude reflection
  • Express.js — lightweight local server framework for server.mjs
  • dotenv — loading the API key securely from the .env file
  • Web Speech API — voice input for gratitude detection
  • Web Serial API — sending AI-triggered motor commands to Arduino

Programming

Overview

The goal of the code is simple:

  1. Listen to my voice.
  2. Ask the AI to respond.
  3. Turn the motor so the loom moves.

This project uses three layers of code working together:

Layer                      What It Does
Arduino Code               Rotates the motor
Browser Code (p5.js)       Listens to my voice + sends commands
Server Code (server.mjs)   Talks to the AI

PART 1 — What Software I Used

You only need these:

  • Arduino IDE → for uploading motor code to the Arduino
  • Google Chrome → for speech recognition + serial communication
  • Notepad → for writing the project files (index.html, sketch.js, server.mjs)
  • Node.js → for running the local AI server

Note: This project could be built using p5.js entirely in the browser, but I chose to run everything locally instead. Running the system locally allows me to (1) safely store my AI API key using server.mjs, (2) avoid browser security restrictions, and (3) create a more reliable real-time connection between voice, AI, and the physical motor system.

PART 2 — Arduino Programming (Motor Control)

This part makes sure:

When the computer sends the letter “M”, the motor rotates 180 degrees (1040 steps).

Step 1: Plug in the Arduino

  • Connect Arduino to your computer
  • Open Arduino IDE

Step 2: Select Board + Port

In Arduino IDE:

  • Tools → Board → Arduino Uno
  • Tools → Port → Select your USB port

Step 3: Copy & Paste This Arduino Code

Create a new sketch and paste this:

// ---------------------------------------------------------
// Gratitude Loom – Stepper Back-and-Forth Gesture Controller
// Motor: 28BYJ-48 + ULN2003 driver
// Behavior: On each 'M' from Serial, move in one direction,
//           next 'M' move back (toggle direction).
// ---------------------------------------------------------

// Pin connections from Arduino to ULN2003 IN1–IN4:
const int IN1 = 8;
const int IN2 = 9;
const int IN3 = 10;
const int IN4 = 11;

// Half-step sequence for 28BYJ-48 (8 steps per full electrical cycle)
const int STEPS_IN_SEQUENCE = 8;
const int SEQ[STEPS_IN_SEQUENCE][4] = {
  {1, 0, 0, 0},
 {1, 1, 0, 0},
 {0, 1, 0, 0},
 {0, 1, 1, 0},
 {0, 0, 1, 0},
 {0, 0, 1, 1},
 {0, 0, 0, 1},
 {1, 0, 0, 1}
};

// 🎚️ TUNABLE SETTINGS
const int STEPS_PER_GESTURE = 1040;  // how big each swing is (try 120–260)
const int STEP_DELAY_MS     = 3;    // lower = faster, higher = smoother / stronger torque
const unsigned long COOLDOWN_MS = 400; // minimum time between moves (prevents spam)

// Internal state
int stepIndex = 0;   // where we are in SEQ
int dir = 1;         // +1 or -1; toggles every command
unsigned long lastMoveTime = 0;  // for cooldown timing

// ---------------------------------------------------------
// Low-level helper: energize one step of the motor
// ---------------------------------------------------------
void setStep(int a, int b, int c, int d) {
 digitalWrite(IN1, a);
 digitalWrite(IN2, b);
 digitalWrite(IN3, c);
 digitalWrite(IN4, d);
}

// Move a single half-step in given direction (+1 or -1)
void stepMotor(int direction) {
 stepIndex += direction;

 if (stepIndex >= STEPS_IN_SEQUENCE) {
    stepIndex = 0;
  } else if (stepIndex < 0) {
    stepIndex = STEPS_IN_SEQUENCE - 1;
  }

 setStep(
    SEQ[stepIndex][0],
   SEQ[stepIndex][1],
   SEQ[stepIndex][2],
   SEQ[stepIndex][3]
  );
}

// ---------------------------------------------------------
// Higher-level gesture: one smooth swing in given direction
// ---------------------------------------------------------
void moveOneGesture(int direction) {
 for (int i = 0; i < STEPS_PER_GESTURE; i++) {
    stepMotor(direction);
   delay(STEP_DELAY_MS);
  }
}

// ---------------------------------------------------------
// Setup
// ---------------------------------------------------------
void setup() {
 pinMode(IN1, OUTPUT);
 pinMode(IN2, OUTPUT);
 pinMode(IN3, OUTPUT);
 pinMode(IN4, OUTPUT);

 Serial.begin(9600);
 Serial.println("✨ Gratitude Loom stepper ready ✨");
}

// ---------------------------------------------------------
// Main loop: listen for 'M' commands and gesture back & forth
// ---------------------------------------------------------
void loop() {
  if (Serial.available() > 0) {
    // Read up to newline (or timeout) so we treat each line as one command
    String cmd = Serial.readStringUntil('\n');
    cmd.trim();  // removes spaces, \r, etc.

    if (cmd.length() > 0) {
      Serial.print("Got command: [");
      Serial.print(cmd);
      Serial.println("]");
    }

    // If there is an 'M' in the command, we trigger one gesture
    if (cmd.indexOf('M') != -1) {
      unsigned long now = millis();

      // Simple cooldown so multiple 'M' in a burst don’t spam the motor
      if (now - lastMoveTime >= COOLDOWN_MS) {
        moveOneGesture(dir);   // move in current direction
        dir = -dir;            // flip for next time
        lastMoveTime = now;

        Serial.print("New dir = ");
        Serial.println(dir);
      } else {
        Serial.println("⏳ Ignored: cooldown active");
      }
    }
  }
}

Step 4: Upload to Arduino

Click Upload

If it finishes with no errors → ✅ Arduino is ready

Now your Arduino understands only one command:

If it receives the letter “M” → it moves the heddle one gesture (1040 steps), alternating direction each time.
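A quick sanity check before wiring up the browser: open Tools → Serial Monitor in the Arduino IDE, set the baud rate to 9600 and the line ending to Newline, then type M and press Enter. The motor should perform one gesture, and the monitor should print the “Got command” and “New dir” messages.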

PART 3 — Browser Programming (Voice Input)

This part listens to your voice and prepares the AI request.

Step 1: Create a Project Folder

Create a folder called:

GratitudeLoom

Inside the folder create three files using Notepad:

index.html
sketch.js
server.mjs
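The resulting folder layout (the .env file is added in Part 4) looks like this:

GratitudeLoom/
  index.html
  sketch.js
  server.mjs
  .env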

Step 2: index.html (Basic Interface)

Open Notepad → Save As → index.html

Paste:

<!DOCTYPE html>
<html>
 <head>
   <title>Gratitude Loom</title>
 </head>
 <body>
   <button onclick="connectSerial()">Connect Arduino</button>
   <button onclick="startListening()">Speak Gratitude</button>

   <p>Gratitude:</p>
   <div id="gratitudeText"></div>

    <p>AI Reflection:</p>
   <div id="aiText"></div>

   <script src="sketch.js"></script>
  </body>
</html>

Step 3: sketch.js (Voice + Serial Control)

Open Notepad → Save as sketch.js

Paste:

let port;
let writer;

// CONNECT TO ARDUINO
async function connectSerial() {
 port = await navigator.serial.requestPort();
 await port.open({ baudRate: 9600 });
 writer = port.writable.getWriter();
 alert("Arduino Connected");
}

// LISTEN TO VOICE
function startListening() {
 const recognition = new webkitSpeechRecognition();
 recognition.lang = "en-US";

 recognition.onresult = (event) => {
    const text = event.results[0][0].transcript;
    document.getElementById("gratitudeText").innerText = text;
    sendToAI(text);
 };

 recognition.start();
}

// SEND TO AI SERVER
async function sendToAI(text) {
 const response = await fetch("http://localhost:3000/groq_reflection", {
   method: "POST",
   body: JSON.stringify({ text }),
   headers: { "Content-Type": "application/json" }
  });

 const data = await response.json();
 document.getElementById("aiText").innerText = data.reflection;

  sendMoveCommand();
}

// SEND 'M' TO ARDUINO
async function sendMoveCommand() {
  if (!writer) return;
  const data = new TextEncoder().encode("M");
  await writer.write(data);
}

Now:

To the Arduino, the browser only ever sends one letter: "M"
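One optional refinement: the Arduino sketch reads commands with readStringUntil('\n'), so a bare "M" is only processed after the serial read timeout (about one second by default). Appending a newline makes the response immediate:

const data = new TextEncoder().encode("M\n");  // newline lets readStringUntil return right away

Either form works; the newline only removes the short wait.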

PART 4 — AI Server (server.mjs) + .env

Step 1: Create .env

Open Notepad → Save as .env and paste:

GROQ_API_KEY=PASTE_YOUR_API_KEY_HERE

Step 2: server.mjs

Open Notepad → Save as server.mjs

import express from "express";
import fetch from "node-fetch";
import cors from "cors";
import "dotenv/config";

const app = express();
app.use(cors());
app.use(express.json());

const GROQ_API_KEY = process.env.GROQ_API_KEY;

app.post("/groq_reflection", async (req, res) => {
  const userText = req.body.text;

  const response = await fetch("https://api.groq.com/openai/v1/chat/completions", {
   method: "POST",
   headers: {
      "Authorization": `Bearer ${GROQ_API_KEY}`,
      "Content-Type": "application/json"
    },
    body: JSON.stringify({
      model: "llama3-8b-8192",
     messages: [
       { role: "system", content: "Respond with a calm gratitude reflection only." },
        { role: "user", content: userText }
     ]
   })
 });

 const data = await response.json();

  res.json({
    reflection: data.choices[0].message.content
 });
});

app.listen(3000, () => {
 console.log("Server running at http://localhost:3000");
});

Step 3: Install Server Packages + Start Server

Open Command Prompt inside the folder and run:

npm install express cors node-fetch dotenv
node server.mjs

When you see:

Server running at http://localhost:3000

the AI server is live.
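To check the AI server independently of the browser, a small test script can be run from a second Command Prompt (assuming Node 18 or newer, where fetch is built in; the file name test.mjs is just an example):

// test.mjs: send a sample gratitude phrase to the local server
const response = await fetch("http://localhost:3000/groq_reflection", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ text: "I am grateful for my loom" })
});
console.log(await response.json());  // should print { reflection: "..." }

Run it with: node test.mjs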

FINAL TEST SEQUENCE

  1. Plug in Arduino
  2. Upload Arduino code
  3. Start server.mjs
  4. Open index.html in Chrome
  5. Click Connect Arduino
  6. Click Speak Gratitude
  7. Speak gratitude
  8. AI responds
  9. Browser sends "M"
  10. Motor rotates

Fabrication

All structural parts of the Gratitude Loom were fabricated using FDM 3D printing. The loom frame, motor housing, and end cap were modeled in Rhinoceros and exported as STL files for printing.

The parts were printed on a Bambu Lab P1S using PLA matte filament, which provided enough strength and stiffness for the rotating mechanism while still allowing for fast iteration. After printing, the components were lightly sanded where needed and assembled using press-fits.

The final assembly aligns the stepper motor directly with the heddle shaft, allowing digital movement commands from the Arduino to be translated into smooth physical rotation of the loom.

One key lesson I learned during Fabricademy is to test while designing. Before committing to full-size prints, I produced several smaller test parts to check fit, alignment, and tolerances.


3D Printing Tests — Pattaraporn (Porpla) Kittisapkajon

To improve the accuracy of the narrow slot details, I reduced the print speed and slightly lowered the nozzle temperature, adjusted the wall order, and tested different part orientations. Although these changes increased the overall print time, they resulted in noticeably cleaner edges, better dimensional accuracy, and improved surface quality—especially in tight geometries.


3D Print Setting Tests — Pattaraporn (Porpla) Kittisapkajon

Quick reference: print settings used

  • Layer height: 0.2 mm
  • Nozzle temperature: 195–200 °C
  • Bed temperature: 55–60 °C
  • Print speed: 30–40 mm/s
  • Outer wall speed: 20–25 mm/s
  • Wall order: Outer → Inner
  • Number of walls: 4
  • Infill: 15–20% (grid or gyroid)
  • Cooling fan: 100% after the first few layers
  • Orientation: Slots printed vertically where possible to preserve edge sharpness

Fabrication files


  1. File: xxx 

  2. File: xxx