Grasshopper Exploration
Transforming emotions into tangible matter that can be physically sewn into fabric and worn sounds like magic, right?
But how do you actually do that?
I wanted to explore different ways of expressing emotions, and I found myself drawn to the idea of translating sound into visual form. After all, as I discussed on my research page, everything is connected, and music is one of the most powerful emotional triggers.
So, I chose to translate the data behind emotions, particularly the types of music that strike a certain feeling in me, in others, or universally, into visual elements.
The Firefly plugin was my savior: it contains components that allow sound data to be translated and connected with parametric systems inside Grasshopper. That made the whole thing personal and meaningful to me.
My initial experimentation focused on testing how waveforms respond to specific sound inputs.
To install the Firefly Plugin, follow this tutorial on YouTube:
▶ Watch the Tutorial
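Firefly does this wiring inside Grasshopper, but the underlying idea can be sketched outside it too. The snippet below is only a rough illustration of that idea, not my Grasshopper definition: the file name, point count, and height scale are placeholder assumptions. It samples the amplitude of an audio file and turns it into points that could drive a waveform curve.

```python
# Rough stand-in for what Firefly's audio components do inside Grasshopper:
# sample a sound file's amplitude and use it to displace points along a line,
# producing a waveform curve. The file name and numbers below are placeholders.
import wave
import numpy as np

with wave.open("emotion_track.wav", "rb") as wav:      # hypothetical input file
    frames = wav.readframes(wav.getnframes())
    samples = np.frombuffer(frames, dtype=np.int16)    # assumes 16-bit PCM audio

# Normalize to -1..1 and thin the signal to a manageable number of points
amplitude = samples / np.iinfo(np.int16).max
points_per_curve = 512
step = max(1, len(amplitude) // points_per_curve)
amplitude = amplitude[::step][:points_per_curve]

# Each (x, y) pair is one control point of the waveform;
# in Grasshopper these would feed a Polyline or Interpolate component.
x = np.linspace(0.0, 100.0, len(amplitude))
waveform_points = list(zip(x, amplitude * 10.0))       # 10.0 = arbitrary height scale
```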
I also followed Rico’s advice and explored the waveform path. It felt like a full-circle moment: I had initially planned to visualize brainwaves, but I found beautiful overlaps between sound waves, Chladni patterns, and emotional resonance.
So, I dove into multiple tutorials on sine waves and then connected that knowledge to the sound visualizer.
The final breakthrough came when I merged the sound visualization with a wave simulation.
⚠️ A Friendly Warning
Make sure to pray a lot when running these simulations in Rhino: your laptop will crash, and you will cry!
This fusion generated beautiful, dynamic contour lines that felt alive, fluid and expressive, like the emotions they represented. These contours became the foundation for further exploration, offering endless ways to develop the forms into physical, wearable elements.
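To give a sense of why the output looks the way it does without opening Grasshopper, here is a loose Python approximation of the same idea (using numpy and matplotlib; the random amplitudes stand in for sound data, and none of the numbers come from my actual definition): a few sine ripples are summed into a surface, and its contour lines are extracted.

```python
# Loose approximation of the merged definition: build a wave-like surface whose
# ripple heights are driven by (stand-in) sound amplitudes, then extract the
# contour lines. All values here are placeholders, not Firefly output.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
amplitudes = rng.uniform(0.2, 1.0, size=6)         # stand-in for sampled sound levels

x = np.linspace(-10, 10, 400)
y = np.linspace(-10, 10, 400)
X, Y = np.meshgrid(x, y)

# Superimpose a few radial sine ripples, one per amplitude sample
Z = np.zeros_like(X)
for i, a in enumerate(amplitudes):
    cx, cy = rng.uniform(-6, 6, size=2)             # ripple center
    R = np.sqrt((X - cx) ** 2 + (Y - cy) ** 2)
    Z += a * np.sin(R * (1.0 + 0.3 * i))

# Each new set of amplitudes gives a different family of contour curves,
# which is why no two runs of the algorithm ever look identical.
plt.contour(X, Y, Z, levels=15, colors="black", linewidths=0.6)
plt.gca().set_aspect("equal")
plt.axis("off")
plt.show()
```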
The beautiful thing about this algorithm is that every time you play music, you get a completely new set of waves. Each output is unique, like an emotional fingerprint: they may look similar, but no two are ever truly identical.
When you bake the geometry in Rhino, you’re essentially capturing a split second, a millisecond, of an ongoing, dynamic experience. Just like emotions, what you freeze in that moment is only a fragment of a much bigger, richer, and more complex picture.
As seen in the videos, the contour lines looked almost 3D, as they outline the surface of the wave. Once you bake the geometry, you're left with a collection of lines that feel dimensional and dynamic.
Here’s what happens next:
- Create a plane in Rhino: this will serve as your base.
- Use the Project command: this projects the 3D curves onto the plane as flat 2D lines.
- Save and export: in my case, I saved everything in Rhino 8 and SVG formats to keep the files versatile for future use with different digital fabrication techniques (a scripted version of these steps is sketched below).
Now you have a series of curves ready to be used for fabric manipulation, 3D printing, or motion graphics.
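If you prefer scripting these steps instead of clicking through them, here is a small RhinoPython sketch of the same workflow. It assumes rhinoscriptsyntax inside Rhino's script editor; the plane size, file name, and projection direction are placeholder assumptions, and the SVG export options may vary with your Rhino 8 setup.

```python
# RhinoPython sketch of the plane / Project / export steps above.
# Run inside Rhino's script editor; path, plane size, and direction are placeholders.
import rhinoscriptsyntax as rs

# 1. Select the baked contour curves
curves = rs.GetObjects("Select baked contour curves", filter=4)   # 4 = curves

# 2. Create a plane (base surface) large enough to catch the projection
base = rs.AddPlaneSurface(rs.WorldXYPlane(), 200, 200)            # size is arbitrary

# 3. Project the 3D curves straight down onto the plane
#    (assumes the curves sit above the world XY plane)
flat = rs.ProjectCurveToSurface(curves, base, (0, 0, -1))

# 4. Select the flattened result and export it (SVG here; saving as .3dm works too)
rs.UnselectAllObjects()
rs.SelectObjects(flat)
rs.Command('-_Export "contours.svg" _Enter', echo=False)
```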
This is the point where data becomes form—and emotions begin to take physical shape.