Process¶
Midterm Presentation¶
Maddie Olsen | Prairie Interfaces Midterm by Maddie Olsen
Ideation & sketches¶
Sketches for the tapestry... It's hard to draw a latch hook tapestry. I referenced many maps of the Blackland Prairie ecoregion: land usage, disturbance of native plant communities, and urban development. Combining areas from the maps helped me decide where the sensor areas would be placed.
I also created graphics to upload in my Processing sketch using Adobe Photoshop and Illustrator. At least I could take advantage of a discounted rate for Creative Cloud for a few months ;)
Design & Fabrication¶
Sensors¶
I referenced *PileUp: A Tufting Approach to Soft, Tactile, and Volumetric E-Textile Interfaces* when designing the tapestry. It suggests construction methods, variables, and expected sensor behavior. It also explains how the sensor can detect variable resistance when connected to an analog input on a microcontroller.
I observed that with a voltage divider, the sensor has both digital and analog qualities. These qualities helped inform how the output could look and function in my Processing sketch.
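To make the dual behavior concrete, here is a small sketch of the voltage-divider math in plain Java (rather than Arduino code), with assumed, illustrative values: a 10 kΩ fixed resistor, a 10-bit ADC, and a made-up press threshold. The same raw reading can be treated digitally (pressed / not pressed against a threshold) or analogically (converted into an estimate of the sensor's resistance).

```java
// Hypothetical voltage-divider math for a resistive textile sensor.
// Assumed wiring: sensor between VCC and the analog pin, fixed 10 kΩ
// resistor from the pin to ground; ADC is 10-bit (0-1023), as on an UNO.
class DividerSketch {
    static final double R_FIXED = 10000.0;  // ohms, assumed pull-down resistor
    static final int ADC_MAX = 1023;        // 10-bit ADC full scale

    // Analog view: estimate the sensor's resistance from the raw reading.
    // V_pin = VCC * R_FIXED / (R_FIXED + R_sensor)  =>  solve for R_sensor.
    static double sensorResistance(int adc) {
        return R_FIXED * ((double) ADC_MAX / adc - 1.0);
    }

    // Digital view: the same reading collapsed to pressed / not pressed.
    static boolean pressed(int adc, int threshold) {
        return adc > threshold;
    }

    public static void main(String[] args) {
        int reading = 512;  // roughly mid-scale: sensor ≈ fixed resistor
        System.out.printf("R_sensor ≈ %.0f ohms%n", sensorResistance(reading));
        System.out.println("pressed: " + pressed(reading, 300));
    }
}
```

The threshold and resistor values here are placeholders; in practice they would come from watching the serial monitor while pressing the actual tufted sensor.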
I made several other small sensors to test the gauge and type of fibers and the distribution of conductive material, but this one worked best. The other layouts of embedded conductive materials weren't successful at a small scale during tests. However, the first two sensors that were made into the final piece were too conductive, and I removed a decent amount of the conductive material (probably 40-50%, though it's hard to quantify). From there, the rest of the sensors did not receive conductive thread in every stitch. Instead, I opted for irregular groups of conductive-thread stitches in proximity to each other, with some buffer stitches in between.
Biochromes & Tapestry¶
I was able to use yarns I had collected over the years for my dye work. I used wool and cotton yarns, scoured and mordanted according to their respective processes (see my Biochromes week documentation).
Dyes used:
- Logwood - 30% wof
- Madder - 150% wof
- Coreopsis - 7 grams
- Cochineal - 5-6 grams
- Weld - 100% wof
- Black Hollyhock - 100% wof
To supplement the less saturated areas, I purchased some white and undyed cotton yarn from a local fabric shop (see B.O.M.).
I wrapped my yarn around acrylic scraps to measure and cut many even pieces at a time. The longer threads (in conjunction with the higher-saturation, more distinct color areas) distinguish the sensors from the filler areas.
Arduino & Processing¶
1st Draft¶
A first draft of a test sensor and processing sketch was presented at Midterms:
Processing first draft by Maddie Olsen: https://www.canva.com/design/DAHAv4hEB2s/pS0H4p_nHIwHGWDN0ICJrA/watch?utm_content=DAHAv4hEB2s&utm_campaign=designshare&utm_medium=embeds&utm_source=link
I started learning Processing from the beginning, with the Processing reference and resources including The Coding Train (Daniel Shiffman) and Happy Coding.
It became clear that to execute my idea, I needed to work with classes and objects, which is where I got overwhelmed. It's one thing to read the Processing reference and follow online tutorials, but it's another thing to understand what, where, and how to apply these concepts to my own idea.
While I achieved some of these steps on my own, I take ownership of my use of AI to help write my code for Processing. It's a part of my project I do not feel proud of, but it must be disclosed. AI was not used for any imagery, electronics, dye recipes, or writing related to Prairie Interfaces.
First Draft Processing Sketch:
import processing.serial.*;

Serial myPort;  // the serial port
PImage plant;   // rattlesnake master PNG, placed in the data folder
ArrayList<PVector> plants = new ArrayList<PVector>();  // remembers where plants were placed
int sensorValue = 0;
int lastSpawnTime = 0;
int spawnInterval = 300;  // milliseconds between plant spawns
float opacity = 255;

void setup() {
  size(800, 800);
  println(Serial.list());
  myPort = new Serial(this, Serial.list()[3], 9600);
  myPort.bufferUntil('\n');  // fire serialEvent() once per complete line
  plant = loadImage("rattlesnake_master.png");
}

void draw() {
  background(200);
  opacity = map(sensorValue, 300, 700, 150, 255);
  opacity = constrain(opacity, 100, 255);
  // while the sensor reads in range, spawn a new plant every spawnInterval ms
  if ((sensorValue > 299) && (sensorValue < 701)) {
    if (millis() - lastSpawnTime > spawnInterval) {
      plants.add(new PVector(random(width), random(height)));
      lastSpawnTime = millis();
    }
  }
  for (PVector p : plants) {
    tint(255, opacity);      // opacity of plant PNGs
    image(plant, p.x, p.y);  // location and population of plants
  }
}

void serialEvent(Serial myPort) {
  String val = myPort.readStringUntil('\n');
  if (val != null) {
    println(val);
    sensorValue = int(trim(val));
  }
}
Second Draft with Updated Graphics & Correct Value Mapping¶
After the first draft of the code, I felt there were only a few things that needed to be adjusted in the behavior of the images. For example, here the opacity is responsive to the pressure, but it affects all of the images in each frame at the same time. Instead, I needed to move the mapped opacity values into the plant class, so each object would store the pressure data from the moment it was generated. I also wanted the plants to fade out after existing for a certain amount of time.
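That refactor can be sketched as a class, written here in plain Java (Processing's underlying language) with hypothetical names and an assumed 5-second lifetime: each plant captures the pressure-mapped opacity at the moment it spawns, then fades linearly toward zero as it ages.

```java
// Hypothetical Plant class: each instance stores the opacity mapped from the
// sensor pressure at spawn time, and fades out over a fixed lifetime.
// Names and the LIFETIME value are illustrative, not from the final sketch.
class Plant {
    final float x, y;
    final float baseOpacity;  // opacity mapped from pressure when spawned
    final int spawnMillis;    // timestamp of when this plant appeared
    static final int LIFETIME = 5000;  // ms until fully faded (assumed)

    Plant(float x, float y, float baseOpacity, int spawnMillis) {
        this.x = x;
        this.y = y;
        this.baseOpacity = baseOpacity;
        this.spawnMillis = spawnMillis;
    }

    // Current opacity: the stored base value, scaled down linearly with age.
    float opacityAt(int nowMillis) {
        float age = nowMillis - spawnMillis;
        float fade = 1.0f - age / LIFETIME;
        return baseOpacity * Math.max(0.0f, Math.min(1.0f, fade));
    }

    // True once the plant has fully faded and can be removed from the list.
    boolean dead(int nowMillis) {
        return nowMillis - spawnMillis >= LIFETIME;
    }
}
```

In draw(), each plant would then call something like tint(255, p.opacityAt(millis())) before image(), so a hard press leaves a long-lasting opaque plant while a light touch leaves a fainter one, and dead plants can be pruned from the ArrayList.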
I’m glad that I was able to describe enough of what I wanted, with the correct vocabulary, that it didn’t take tons of prompting to get to a workable sketch. I am, however, grappling with the moral dilemma of using AI to help tutor me and build a part of my project. I consistently reference primary sources that explain all the parts of my Processing sketch, but the details of syntax and structure are still becoming clear to me.
I updated the files to the transparent PNGs I made in Photoshop and tested the code with the sensors on my tapestry. This took some debugging, because at first the Arduino sketch was printing serial data in a format the Processing sketch was not expecting to read.
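The mismatch comes down to line endings: Serial.println() on the Arduino terminates each reading with "\r\n", so the Processing side has to trim before parsing or the conversion fails. A minimal sketch of that parsing step, in plain Java with a hypothetical helper (the real sketch does this inline in serialEvent()):

```java
// Hypothetical parser for the serial protocol: one ASCII integer per line.
// Arduino's Serial.println() appends "\r\n", so the string must be trimmed
// before parsing, or Integer.parseInt() throws a NumberFormatException.
class SerialParse {
    // Returns the parsed reading, or the previous value if the line is junk.
    static int parseReading(String line, int previous) {
        if (line == null) return previous;  // no complete line buffered yet
        String trimmed = line.trim();       // strip '\r', '\n', stray spaces
        try {
            return Integer.parseInt(trimmed);
        } catch (NumberFormatException e) {
            return previous;  // partial or garbled line: keep the old value
        }
    }

    public static void main(String[] args) {
        System.out.println(parseReading("512\r\n", 0));  // a clean reading
        System.out.println(parseReading(null, 512));     // nothing buffered yet
    }
}
```

Keeping the previous value on a bad line means a half-transmitted reading at startup doesn't make the sketch crash or flicker.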
I used the digital qualities of the sensor (on/off) to make a plant appear in the scene every X milliseconds while the sensor was pressed.
By connecting the sensors to analog pins on the Arduino UNO, the board reads a range of values based on how each sensor is interacted with. Low-pressure and high-pressure touches were mapped to the opacity of the images that appeared in the frame.
Prairie Interfaces | Tapestry Sensor Test In Progress by Maddie Olsen
Version 3, Simulated¶
I added a background image and limited the plants to appearing only in the bottom half of the window. The one limitation here is that with an image loaded as the background, the window is fixed to the size of the image and cannot use the fullScreen() command.
Prairie Interfaces | Tapestry Sensor Simulation w/ Background by Maddie Olsen: https://www.canva.com/design/DAHDFiz31oE/hRcSmt0um67XUX7_d9WTNg/watch?utm_content=DAHDFiz31oE&utm_campaign=designshare&utm_medium=embeds&utm_source=link