Final Project Documentation: The Wobble Box 2.0

Arduino,Assignment,Audio,Final Project,Instrument,Max,Sensors,Software — Jake Berntsen @ 9:46 pm

Presenting my original “wobble box” to the class and Ali’s guests was a valuable experience.  The criticisms I received were relatively consistent, and I have summarized them to the best of my ability below:

  • The box cannot be used to create music as an independent object.  When I performed for the class at the critique, I was using an Akai APC40 alongside the wobble box: the APC launched musical ideas that were then altered by the wobble box, which I had synced to a variety of audio effects.  The complaint here was that it was unclear exactly how much I was doing to create what the audience was hearing in real time, and very clear that I wasn’t controlling 100% of the noises coming out of my computer.  At any rate, it was impossible to trigger MIDI notes using the wobble box, which meant the melody had to come from an external source.
  • The box only has one axis to play with.  At the time of the critique, the wobble box had only one working distance sensor attached to the Teensy, which meant I could only control one parameter at a time with my hand.  Many spectators commented that it seemed logical to have at least two, allowing me to get more sounds out of various hand motions, or even to use two hands at once.
  • The box doesn’t look any particular way, and isn’t built particularly well.  The wobble box was much bigger than it needed to be to fit the parts inside, and little to no thought went into the design and placement of the sensors.  It was sometimes difficult to know whether it was working, and some of the connections weren’t very stable.  Furthermore, the mini-USB plug on the side of the device sometimes shifted around when you tried to plug in the cord.

In the interest of addressing the concerns above, I completely redesigned the wobble box, abandoning the old prototype for a new model.

IMG_1636

The most obviously improved element of the new box is the design.  Now that I knew exactly which electronic parts were necessary, I removed all the extra space in the box.  The new design conserves about three square inches of space, and the holes cut for the distance sensors are much neater.

IMG_1643

I applied three layers of surface treatment: a green primer, a metallic overcoat, and a clear glaze.  The result is a luminescent coloring and a rubber-like texture that prevents the box from sliding around when placed on a wooden surface.  In my opinion, it looks nice.

IMG_1639IMG_1645

A strong LED was placed exactly between the two distance sensors, illuminating the ideal place for the user to put his or her hand.  This also provides a clue for the audience, making it clearer exactly what the box does by illuminating the hand of the user.  The effect can be rather eerie in dark rooms.  Perhaps most importantly, the LED indicates that the Teensy microcontroller has been recognized by Max, a feature lacking in the last prototype.  This saved me many headaches the second time around.

IMG_1640

IMG_1644

The new box has two new distance sensors with differing ranges: one transmits very fine values between about 2 and 10 inches, the other coarser values between about 4 and 18 inches.  Staggering the ranges like this allows a whole new world of control for the user, such as tilting the hand from front to back or using two hands with complete independence.
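The staggered two-sensor mapping can be sketched as a short simulation in Python (this is not the actual Max patch; the Sharp-style inverse-distance response and all calibration constants here are assumptions for illustration):

```python
# Sketch: turn two analog IR ranger readings into normalized control values.
# Assumes Sharp-style sensors whose output is roughly inversely
# proportional to distance; `scale` is a hypothetical calibration constant.

def adc_to_inches(raw, scale):
    # Guard against divide-by-zero on an open (zero) reading.
    return scale / max(raw, 1)

def normalize(d, lo, hi):
    # Map a distance in [lo, hi] inches to a 0..1 control value, clamped.
    return min(max((d - lo) / (hi - lo), 0.0), 1.0)

# Fine sensor: ~2-10 in.  Coarse sensor: ~4-18 in.
fine   = normalize(adc_to_inches(600, 3000), 2, 10)
coarse = normalize(adc_to_inches(300, 5400), 4, 18)
```

Because the two ranges overlap but differ in resolution, a single hand height moves both values at different rates, which is what makes gestures like front-to-back tilting expressive.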

IMG_1642

Finally, I moved the entire USB connection to the interior of the device, instead creating a hole for the cord to exit.  After securing the Teensy within the box, the connection was much stronger than in the previous prototype.

In addition to fixing the hardware, I created a few new software environments between Max and Ableton that allow for more expressive use of the box.

The first environment used both Max and Ableton Live to create an interactive art piece: as the user stimulated the two distance sensors, a video captured by the laptop camera was distorted along with an audio track of the user talking into the computer microphone.

Moving forward, my goal was to extend the box into a true instrument by providing a way to trigger pitches using only the box and a computer.  To achieve this, I wrote a Max for Live patch that pairs a note step-sequencer with a microphone.  Every time the volume of the signal picked up by the microphone exceeds a certain threshold, the melody advances by one step.  Using this, the user can simply snap or clap to progress the melody while using the box to control the timbre of the sound.  I then randomized the melody so that it selects random notes from specific scales, allowing for improvisation.

The final software environment I wrote, shown below, lets the user trigger notes with a MIDI keyboard and affect the sounds in a variety of ways using the box.  To show how this method can be combined with any hardware the user desires, I created a few sounds on an APC40 that I then manipulated with the box.
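The clap-to-advance, randomized step sequencer is a Max for Live patch, but its logic can be sketched in Python (the scale choice, MIDI note numbers, and threshold value below are assumptions, not values from the patch):

```python
import random

C_MINOR_PENT = [60, 63, 65, 67, 70]  # hypothetical scale, as MIDI notes
THRESHOLD = 0.3                      # assumed mic amplitude threshold, 0..1

def make_sequencer(scale, threshold):
    """Return a step function: each time the input level crosses the
    threshold from below (a clap or snap), emit a random note from the
    scale; otherwise emit None."""
    state = {"above": False}
    def step(level):
        triggered = level >= threshold and not state["above"]
        state["above"] = level >= threshold
        return random.choice(scale) if triggered else None
    return step

seq = make_sequencer(C_MINOR_PENT, THRESHOLD)
levels = [0.0, 0.5, 0.6, 0.1, 0.9]   # e.g. two claps with decay between
notes = [seq(l) for l in levels]     # notes fire only on the rising edges
```

Tracking the rising edge (rather than the raw level) is what keeps one sustained loud sound from advancing the melody many steps at once.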

Final Presentation – Spencer Barton

The Black Box

Put your hand into the black box. Inside you will find something to feel. Now take a look through the microscope. What do you feel? What do you see?

The Box and Microscope

2013-11-19 20.00.16

Inside the Box

2013-11-19 19.43.03

Under the Microscope

2013-11-17 00.03.12

When we interact with small objects we cannot feel them.  I can hold the spider, but I cannot feel it.  The goal here is to enable you to feel the spider, to hold it in your hand.  Our normal interaction with small things is in 2D: we see through photographs or a lens.  Now I can experience the spider through touch and feel its detail.  I have not created caricatures of spiders; I copied a real one.  There is loss of detail, but the overall form is recreated and speaks to the complexity of living organisms at a scale that is hard to appreciate.

The box enables the exploration of the spider model before the unveiling of the real spider under the microscope.  The box can sense the presence of a hand; after a short delay, which gives the viewer time to get a good feel of the model, a light turns on to reveal the spider under the microscope.
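The delay-then-reveal behavior amounts to a small state machine, sketched here in Python (the 3-second delay and the function names are assumed for illustration; the actual implementation is not described in the post):

```python
def make_reveal(delay_s=3.0):
    """Turn the microscope light on only after a hand has been
    continuously present for `delay_s` seconds (assumed value).
    Removing the hand resets the timer and switches the light off."""
    state = {"since": None}
    def update(hand_present, now):
        if not hand_present:
            state["since"] = None
            return False                     # light off
        if state["since"] is None:
            state["since"] = now             # hand just arrived
        return now - state["since"] >= delay_s
    return update

reveal = make_reveal(3.0)
# called periodically with the hand sensor reading and the current time
```

Resetting the timer whenever the hand leaves ensures a viewer who only briefly reaches in never triggers the unveiling.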

Explanation of the Set-up

The Evolution of Ideas

As I created the models I found that my original goal of recreation was falling short.  Instead of perfect representations of the creatures under the microscope, I had white plastic models that looked fairly abstract.  The 123D models were much more realistic representations because of their color.  My original presentation ideas focused on this loss of detail and the limits of the technology.  However, I came to realize where the strengths of the technology lay: the recreation of the basic form of the object at a larger scale.  For example, someone could hold the spider model and get a sense of abdomen versus leg size.  Rather than letting someone view the model, I decided to only let them feel it.

Feedback and Moving Forward

The general feedback that I got was to explore the experience of the black box in more depth.  There were two key faults with the current set-up.  First, the exposure of the bug under the microscope happened too soon.  Time is needed for the viewer to form a question of what is inside the black box; only after that question is created should the answer be shown under the microscope.  The experience in the box could also be augmented: the groping hand inside the box could be exposed to other touch sensations, or it could activate sound or trigger further actions.  The goal would be to lead the experience toward the unveiling.  For example, sounds of scuttling could be triggered for the spider model.

The second piece of feedback concerned the models themselves.  It was tough to tell that the model in the box was an exact replica of the bug under the microscope.  The capture process loses detail, and model creation through 3D printing adds new textures.  The plastic 3D models in particular were not as interesting to touch, as the experience was akin to playing with a plastic toy.

To address these concerns, this project can be improved in a few directions.  First, I will improve the box with audio and a longer exposure time.  Rather than looking through the microscope, viewers will see a laptop that displays the actual images used to make the model.  The user’s view of this model will then be controlled by how they have rotated the model inside the box.

I will try another microscope and different background colors to experiment with the capture process and hopefully improve accuracy. I will redo the model slightly larger with the CNC. MDF promises to be a less distracting material to touch. Additionally the fuzziness of MDF is closer to the texture of a hairy spider.

Final project milestone 3 – M. Haris Usmani & Robert Kotcher

Assignment,Final Project,Instrument — rkotcher @ 2:31 pm

Spatianator (v1.0) week 3 progress

In this video we demonstrate a beta version of a single “cricket”. The cricket has four actuators, which we talk about individually. Finally, we demonstrate sounds that we can achieve and plans for the next week of development.

Final Project Milestone 3 – Spencer Barton

3D Printer,Final Project,Instrument,Rhino3D,Scanning — spencer barton @ 8:47 pm

Model Making

I have begun to create models.  The current models utilize additive methods: one with plaster printing (thanks to dFab) and one with PLA printing on a Makerbot in Codelab.  I also utilized the Art Fab CNC to make a slightly larger roly-poly.  Some of the models below are shown with the original object that I used for the capture.

2013-11-17 17.53.01

2013-11-17 18.06.09

2013-11-17 17.55.04

2013-11-17 18.08.14

2013-11-23 13.54.47

2013-11-23 13.54.20

2013-11-18 22.59.09

Final Project Proposal- Robert Kotcher, M. Haris Usmani

Final Project,Instrument — rkotcher @ 4:20 pm

LittleBits

Instrument,Reference — haochuan @ 12:46 am

littleBits is a system of electronic modules that snap together with magnets. We built littleBits to break the boundaries between the products we consume and the things we make, and to make everyone into an inventor.

Each littleBit has one unique function (light, sound, sensors, buttons), and with different combinations you can make large circuits.  littleBits allows you to create interactive projects in seconds, without any background in engineering, programming, or wiring.  It’s as easy as snapping LEGO bricks together.  And the best part is, it’s open source!

What is littleBits? from littleBits on Vimeo.

‘SketchSynth’ by Billy Keyes

Audio,Instrument,Reference — haochuan @ 12:45 am

SketchSynth: A Drawable OSC Control Surface

SketchSynth lets anyone create their own control panels with just a marker and a piece of paper. Once drawn, the controller sends Open Sound Control (OSC) messages to anything that can receive them; in this case, a simple synthesizer running in Pure Data. It’s a fun toy that also demonstrates the possibilities of adding digital interaction to sketched or otherwise non-digital interfaces.
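Under the hood, an OSC message is just a null-padded address string, a type-tag string, and binary-packed arguments.  A minimal encoder can be sketched in Python (the address `/fader1` is hypothetical; Pure Data can decode such UDP packets with its [udpreceive] and [oscparse] objects):

```python
import struct

def osc_pad(b):
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, value):
    """Encode an OSC message carrying a single float32 argument."""
    return (osc_pad(address.encode()) +  # address pattern, e.g. "/fader1"
            osc_pad(b",f") +             # type tags: one float
            struct.pack(">f", value))    # big-endian float32 payload

msg = osc_message("/fader1", 0.5)
# send over UDP, e.g. socket.sendto(msg, ("127.0.0.1", 9000))
```

This is only the single-float case; a full OSC implementation also handles ints, strings, blobs, and bundles, per the OpenSoundControl 1.0 specification.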

Final Project Proposal: Haochuan Liu

Final Project Proposal: JaeWook Lee

Assignment,Instrument,Reference,Submission — jwleeart @ 10:35 pm

Interesting Project with Ferro Liquid

Instrument,Reference — lianghe @ 8:32 pm

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.
(c) 2024 Hybrid Instrument Building 2014 | powered by WordPress with Barecity