Since week 1, week 2, and week 3, I’ve been focused on fabrication and making the Record function work. I also refined the interface, but I still don’t think it’s a very smooth or clear UI. If I continued this project, I’d want to work on that more.
I started putting the electronics together and playing with more reliable webcam structures.
I still wanted “record custom color” to work, but abandoned it after learning that a) it was harder than I thought, and b) it was confusing.
Above: building portholes and thinking how to design an easily debuggable, wire-friendly enclosure.
A lot of this kind of fabrication felt new to me: laser cutting, working with wood, working (very little) with metal. This week was about making it look polished and putting on functional finishing touches (making the switches work and fit better).
Spent this weekend at home recuperating from the week.
It was a rainy few days where I only left home once per day.
I was so tired. I moved in slow leaps and small bounds on fabrication… something that is very new and daunting to me, but it felt very good when I reflected on my work Sunday.
Also, at this point, p5.js finally works!
and fabrication is coming along… (cardboard mockup)
trying different interfaces for record dashboard
workspace circa 5pm today
to-do list, circa 4pm
This week, time spent playing with dissected webcam…
…and trying to embed it in a cardboard box. It didn’t work too great:
- Confirmed a 1-foot throw distance, which felt unwieldy.
- Underneath lighting is too dark for computer eyes.
- Lighting reflects off the plastic.
So instead, a pivot from the underneath camera to an overhead one:
Suspending it with a braided wire works OK, but the camera is prone to rotating in space. Danny suggests a clampable rod.
I need to tweak the code so that the virtual grid aligns better with the lego grid.
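Roughly, that alignment boils down to mapping camera pixels into LEGO grid cells. Here’s a minimal sketch of the idea, assuming a one-time calibrated bounding box for where the baseboard sits in the camera frame — all names and pixel values below are hypothetical, not my actual code:

```javascript
// Hypothetical calibration: where the baseboard appears in camera pixels.
const CAL = { x0: 80, y0: 60, width: 480, height: 480 };
const GRID = 8; // number of cells per side on the virtual grid

// Map a camera pixel to a grid cell, or null if it falls off the board.
function pixelToCell(px, py) {
  const u = (px - CAL.x0) / CAL.width;   // normalize to 0..1 across the board
  const v = (py - CAL.y0) / CAL.height;
  if (u < 0 || u >= 1 || v < 0 || v >= 1) return null;
  return { col: Math.floor(u * GRID), row: Math.floor(v * GRID) };
}
```

In practice, the calibration numbers could come from clicking the baseboard’s corners in the camera feed once at startup, instead of hard-coding them.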
I sketched out an idea for a “recording” interface I liked. Here’s my to-do list for next week, which includes building that and creating a nice wood enclosure for the whole thing:
pcom final project boss battle
progress: week 1 —> week 2 —> week 3 👾
I’ve got a working prototype! (turn on sound)
playing with light
The transparent baseboards came in:
I bought a webcam, a Logitech C270 that records at 720p.
Needs more throw depth right now. Maybe like 1 foot.
There could be opportunities for playing with mirrors to help solve this problem.
computer vision magic
Meanwhile, experimenting with p5.js this week in the lemony glow of my home:
I got a grid of colors to show:
Then I wrote a function that will ONLY display what is a brick, and NOT the white background.
The blue has been the hardest to pick up consistently. Sometimes it blends in with the shadows.
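The brick-vs-background test comes down to per-pixel color classification. Here’s a rough sketch of that logic as a plain function, assuming RGB values 0–255 sampled from the webcam feed — all thresholds are hypothetical stand-ins, not my actual numbers. The blue/shadow confusion shows up as needing a stricter rule before calling a pixel blue:

```javascript
// Classify one pixel as background, a brick color, or unknown (shadow/noise).
// All thresholds are hypothetical.
function classifyPixel(r, g, b) {
  const max = Math.max(r, g, b);
  const min = Math.min(r, g, b);
  // The white baseboard reads as bright and low-saturation.
  if (max > 200 && max - min < 30) return "background";
  if (r > g && r > b) return "red";
  if (g > r && g > b) return "green";
  // Blue is the tricky one: shadows are also dark and bluish, so require
  // blue to clearly dominate before counting it as a brick.
  if (b > r + 40 && b > g + 40) return "blue";
  return "unknown"; // likely a shadow, skip it
}
```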
I recorded some sounds with my phone. They’re kinda crude, and I like them that way. I’ve been really into the idea of low-polish, found sounds, and was inspired by this video:
pcom final project boss battle
progress: week 1 —> week 2 👾
Last last week, I built a simple cardboard prototype of my lego music sequencer idea.
I was having some existential ruminations about this idea…. is it interesting enough? unique enough? fun enough? is it even helpful to the world?
In the end, 6 weeks is a short amount of time. I finally decided to just go for it. I ended up getting some nice feedback from a lot of folks.
I got some great feedback from one of the residents, Davíd. (thank you!!)
- it would be conceptually stronger if you could stack the bricks. that’s one of the inherent strengths of the LEGO language
- the button interface almost seems like it could be a distraction. Or at least an add on. Save that for later
- think about the 2 axes. different musical instruments? different pitches?
- found sounds feel a bit more interesting than computer sounds
- should I even do LEGOs? should I consider making my own pieces for special extra functionality?
- what if I want to make this into a more open system? should I abandon the grid then?
for the purposes of testing, I made a simple cardboard box (a sandwich, if you will) with printed “lego” paper on top as an affordance.
There is a single row of LEDs that lights up in sequence as a sort of time indicator.
For sound, I used the website “Patatap,” which is a lovely interactive soundkit. I pre-recorded loops and then layered them in real time as peeps added new bricks to the canvas.
I was really curious to know:
- what was people’s initial interaction with it?
- do they follow the grid… try to stack the LEGOs…
- what kind of sounds do people prefer?
- what did people assume the axes were for? do the LEDs get the point across?
Only one person really pursued stacking the LEGOs. I think my user group also knows too much by now of how this would work. (“Oh, p5.js feed with computer vision component, using a bottom camera? of course!”)
Maybe since they’re aware of technical constraints, they’re less tempted to try it.
- try using a shift register instead of all these LEDs
- the horizontal plane is nice (as opposed to stacking vertically.)
- “[building] height might really limit what I can do, limit my musical expression. I’d feel like I’m losing something.”
- many people thought axes corresponded with pitch and volume
- most people understood that color mapped to sound quality, or instrument
- homemade, “found” sounds are more appealing with the LEGO aesthetic
- people liked the idea of recording
what am i making?
Here’s a braindump of some recent ideas…
…like an object-oriented, physical programming interface. (too vague, too ambitious)
…like a sequencer which uses legos.
…like extended interfaces for cam doodler, where you can draw with your camera.
…or this soft jacket where kids can play-test their vocabulary for emotions.
inspiration + values
These are some references I’ve been thinking about as recently as last month, and as long ago as 4 years…!
Eventually, these will coalesce in the direction of my thesis. If anything, I’d like to think of this as a first pass in thesis mindset.
What’s important for me:
- it’s an open system. You don’t need to buy the starter kit or the expansion pack. You can introduce new materials from your world. The entire world is your palette, with all its atomic, material richness. Create possibilities that the inventor never thought of!
- it’s low cost. This relates to openness: I don’t want it to feel too precious. If your materials are lost or broken, they can be replaced. I don’t want this to be some project in an ivory institutional tower that is expensive and/or looks expensive.
- it’s humane. Which doesn’t mean it can’t be challenging. But it acknowledges what people already tend to do.
- it’s fun!!!
Your plant can express itself to you. You can alter the shape of its expression with fundamental parameters for growth: light and water.
I was inspired by the writer Ursula K. Le Guin, whom I deeply admire. She wrote a fictional piece called “The Author of the Acacia Seeds,” about the hidden languages of animal and plant organisms. The whole thing is very poetic. It ponders the perspectives of linguists from the future:
- Do non-human organisms communicate through language?
- Does language then enable new forms of art & expression?
I was interested in this idea of non-animal, carbon-based life talking directly to YOU. I imagined the computer as a third-party translator between the PLANT and oneself.
- What parameters might it take?
- How is this translated to humans?
For this, I used a light sensor and a water sensor. A corresponding p5.js sketch “grows” text slowly over time.
It’s hard to demo this because it happens on a very slow timescale. But it successfully does the following:
- Text stops when the plant is dry.
- The more water the plant has, the faster the text generates.
- The more light the plant has, the larger each string is.
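The three behaviors above can be sketched as two small mapping functions, assuming 10-bit analog readings (0–1023) arriving over serial — all ranges, thresholds, and names here are hypothetical, not the actual sketch:

```javascript
// Hypothetical dryness cutoff: below this reading, the plant is "dry".
const DRY_THRESHOLD = 100;

// More water -> shorter interval between new characters (faster growth).
// Returns Infinity when the plant is dry, so the text stops entirely.
function growthIntervalMs(waterReading) {
  if (waterReading < DRY_THRESHOLD) return Infinity;
  // map DRY_THRESHOLD..1023 onto 2000..200 ms
  const t = (waterReading - DRY_THRESHOLD) / (1023 - DRY_THRESHOLD);
  return 2000 - t * 1800;
}

// More light -> larger text, mapped from 0..1023 onto 12..48 px.
function textSizePx(lightReading) {
  return 12 + (lightReading / 1023) * 36;
}
```

In the p5.js sketch, `growthIntervalMs` would gate how often `draw()` appends a character, and `textSizePx` would feed `textSize()`.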
Here are links to the three related prototypes:
- new toys: rain sensor, LCD
- LCD works!