Final Project: Through the Lens – Patra Virasathienpornkul

Assignment,Final Project — Patt @ 3:01 am

Through the Lens

Through the Lens is a hybrid instrument that involves a piece of paper, a pen, and an OLED transparent display. My goal for this project is to understand the possibilities and the limitations of the device, and to come up with applications that are interesting, educational, and entertaining.

Bouncing Ball from Patt Vira

Steps: 

  • Draw inside the pre-calibrated section on a piece of paper (that is placed on top of a Wacom tablet) using a Wacom Inkling Sketch Pen.
  • Place the OLED transparent display on top of the paper.
  • Watch the graphics on the display interact with the drawings.
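The post doesn't include code, but the "Bouncing Ball" demo above suggests a simple collision scheme: treat each captured pen stroke as a chain of short line segments and reflect the ball's velocity when it touches one. A minimal sketch in Python (all names and the segment representation are my assumptions, not the author's implementation):

```python
# Hypothetical sketch: a virtual ball bouncing off captured pen strokes.
# Each stroke is approximated as a list of line segments (ax, ay, bx, by).

import math

def closest_point_on_segment(px, py, ax, ay, bx, by):
    """Project point P onto segment AB, clamped to the segment ends."""
    abx, aby = bx - ax, by - ay
    ab2 = abx * abx + aby * aby
    if ab2 == 0:
        return ax, ay
    t = max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / ab2))
    return ax + t * abx, ay + t * aby

def step_ball(x, y, vx, vy, radius, segments, dt=1.0):
    """Advance the ball one frame, reflecting its velocity off any stroke it touches."""
    x, y = x + vx * dt, y + vy * dt
    for (ax, ay, bx, by) in segments:
        cx, cy = closest_point_on_segment(x, y, ax, ay, bx, by)
        dx, dy = x - cx, y - cy
        dist = math.hypot(dx, dy)
        if 0 < dist < radius:
            nx, ny = dx / dist, dy / dist                   # collision normal
            dot = vx * nx + vy * ny
            vx, vy = vx - 2 * dot * nx, vy - 2 * dot * ny   # reflect velocity
            x, y = cx + nx * radius, cy + ny * radius       # push ball out of the stroke
    return x, y, vx, vy
```

Each frame, the display would redraw the ball at the returned position while the strokes themselves stay fixed on the paper beneath the screen.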

What did I learn? 

From this project, I realized that I spent the majority of my time understanding the basic use of the transparent display and getting all the technology to work properly. Even though I wish I could have created more applications and taken the project beyond a proof of concept, I am now at a comfortable point where I can use the knowledge I have to create interesting applications based on my own imagination and the feedback I received from an outsider's perspective. The comments I received during both the final presentation and the show are invaluable. One important point I took away is that no one cares about the technology – what matters is what you do with the technology.

How can the project be improved? 

The 4D Systems transparent display has a lot of potential, and I believe I have only tackled a small section of the possibilities. The feedback I received during the final presentation and the show was very helpful, and widened the scope of project ideas I can pursue with the knowledge I currently have. Here are the two directions I'd like to explore further.

1) Increase the calibrated area on the paper to give people a bigger space to draw.

2) Use the display as a lens (think Google Glass).

I’d also like to get rid of the graphics tablet and make the display portable by exploring alternative ways of acquiring the pen strokes.

Acknowledgement: 

  • This project is inspired by Glassified by the Fluid Interfaces Group at the MIT Media Lab.
  • Special thanks to Ali Momeni and Anirudh Sharma.
  • Thanks to Golan Levin and the Frank-Ratchye STUDIO for Creative Inquiry for the grant that allowed me to purchase the 4D Systems OLED transparent display.

Final Project(revised): JaeWook Lee

Uncategorized — jwleeart @ 9:20 pm


Ideasthesia
video, 1:56″
2013
“Ideasthesia” is a phenomenon in which the activation of ideas evokes perception-like experiences. The term is etymologically derived from the ancient Greek idea (idea) and aisthesis (sensation), referring to “sensing concepts.” The project explores how we sense things without actual stimuli, through intensive imagination and association on both the visual and auditory levels. It is composed of two video works in which a cellist plays the cello without the actual instrument, an “air cello,” using her imagination. It was installed as a video installation in front of The Studio for Creative Inquiry in CFA.

Ideasthesia from JaeWook Lee on Vimeo.

Final Project: Jake Marsico

Final Project,Submission,Uncategorized — jmarsico @ 11:45 pm

The final deliverable of these two instruments (video portrait register and reactive video sequencer) was a series of two installations on the CMU campus.

Learnings:

The version shown in both installations had major flaws. The installation was meant to show a range of clips that varied in emotion and flowed seamlessly together. Because I shot the footage before completing the software, it wasn’t clear exactly what I needed from the actor (exact length of each clip, precision of face registration, number of clips for each emotion). After finishing the playback software, it became clear that the footage on hand didn’t work as well as it could have. Most importantly, the majority of the clips lasted for more than 9 seconds. In order to really nail the fluid transitions, I had to play each clip forward and then in reverse, so as to ensure each clip finished in the same position it started. Doing that with each 9-second clip would have meant that each clip lasted a total of 18 seconds (9 forward, 9 backward). These 18-second clips would eliminate any responsiveness to the movements of viewers.

As a result, I chose to only use the first quarter of each clip and play that forward and back. Although this made the program more responsive to viewers, it cut off the majority of the subject’s motions and emotions, rendering the entire piece almost emotionless.
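The forward-then-reverse trick described above can be sketched as a frame-indexing function. This is my own minimal illustration in Python, not the project's actual playback code; the function name and `fraction` parameter are assumptions:

```python
def boomerang_frames(n_frames, fraction=0.25):
    """Frame indices for playing the first `fraction` of a clip forward and
    then in reverse, so playback ends on the same frame it started on."""
    cut = max(2, int(n_frames * fraction))     # number of frames actually used
    forward = list(range(cut))                 # 0, 1, ..., cut-1
    backward = list(range(cut - 2, -1, -1))    # cut-2, ..., 0 (don't repeat the turning frame)
    return forward + backward
```

With `fraction=1.0` a 9-second clip doubles to 18 seconds of playback, which is why only the first quarter of each clip was used: the mirrored sequence stays short enough to react to viewers.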

Another major flaw was that the transitions between clips were very noticeable as a result of imperfect face registration. In hindsight, it would require an actor or actress with extreme dedication and patience to register their face perfectly at the beginning of each clip. It might also require some sort of physical body-registration hardware. A guest critic suggested that a better solution might be to pair the current face-registration tool with a face-tracking and frame re-alignment application in post-production.

If this piece were to be shown outside the classroom, I would want to re-shoot the video with a more explicit “script” and look into building a software face-alignment tool using existing face-tracking tools such as ofxFaceTracker for openFrameworks.

Code:

github.com/jmarsico/Woo/tree/master

 

Final Project: Note Cubes – Wanfang Diao

Assignment,Final Project,Submission — Wanfang Diao @ 3:45 pm

Idea

How do we learn about our physical world? How do we learn about light? About sound? How do we form the basic concepts of space and time?

As for me, I learned from experience. I learned from stacking toy bricks and tearing them down. I learned from tapping a stainless steel plate with a wooden spoon. I learn by doing, through trial and error. Once I grasp the rules of the game, I begin to create.

In this project, I want to make music notes more tangible and touchable, so they can be experienced in a more intuitive way. I aim to build a very straightforward mapping between “time/sound” and “space/light (or color)”, which can not only give children a sense of the structure of a melody but also give them a way to create a piece of music.

Therefore, I designed the Note Cubes, a set of tangible cubes for kids to explore sound, notes, and rhythm. By putting them in a line or stacking them (just like playing with toy bricks), kids can let the cubes trigger their “neighbor cubes” with colorful LEDs to play notes, and then arrive at a piece of sound or melody after a few trials.

Note Cubes from Wanfang Diao on Vimeo.

 

This project was shown at Assemble ( assemblepgh.org/ ) in Dec. 2013. Here is a video of how kids play with the Note Cubes! What I learned from the show is that there should be more obvious signs indicating the trigger direction of the cubes, and that more shapes could be explored.

 

Public Show for Note Cubes2 from Wanfang Diao on Vimeo.

About tech:
Each cube contains a microcontroller (a Trinket), photosensors, a speaker, and LEDs. Under the control of the microcontroller, each speaker plays a note when the cube's photosensors are triggered by the LED light of a neighboring cube.
The cubes' shells are made of laser-cut hardboard.
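The LED-to-photosensor trigger chain can be pictured with a small simulation. This is only a toy model I wrote to illustrate the propagation, not the actual Trinket firmware; the class and note names are assumptions:

```python
import time

class NoteCube:
    """Toy model of one cube: when its photosensor 'sees' a neighbor's LED,
    it plays its note and flashes its own LED to trigger the next cube."""
    def __init__(self, note):
        self.note = note

    def trigger(self):
        print(f"cube plays {self.note}")   # stands in for the speaker
        return True                        # stands in for the LED flash

def play_row(cubes, delay=0.0):
    """Light the first cube and let the trigger propagate down the line."""
    played = []
    lit = True                  # the first cube is triggered by hand
    for cube in cubes:
        if lit:                 # photosensor detects the previous cube's LED
            lit = cube.trigger()
            played.append(cube.note)
            time.sleep(delay)   # rhythm comes from the spacing between triggers
    return played
```

Rearranging the list (or stacking, in the physical version) changes the order in which the notes sound, which is exactly the time-to-space mapping the project is after.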
Acknowledgements
Thanks to Ali Momeni, Dale Clifford, Zack Jacobson-Weaver, Madeline Gannon, and my friends in CoDelab for their help!

 

Final Project Presentation – Job Bedford

Assignment,Final Project — jbedford @ 3:39 pm

SoundWaves – a wearable wireless instrument that transforms dance into rhythmic sounds.

Idea: Give dance an audio synthesis, in order to create a unique form of performance based art.

Basic Implementation:

Implementing different Sounds:

Crow:

Bell:

Test:

Final Presentation is a performance with SoundWaves.
Video:

 

The performance did not showcase the complete potential or vision of SoundWaves. The long, drawn-out noises made it hard for the audience to tell what was going on. The first act's interaction with sound was too discrete. The chirping of the second act overpowered the background sounds that would have showcased the manipulation of ongoing noise. The totem in the center is there to introduce dialogue into the performance and create an interaction to be witnessed. In the future, the totem will most likely be a small speaker from which the sound is emitted.

Performance Logistics:

The performance consists of multiple phases:

Phase 1.) Pure_FX. Discrete, long sounds that project an eerie mood, great for storytelling or interpretive performance. The sound is selected from a wide array, based on foot orientation.


Phase 2.) FailSafe. Hard-coded 808 drum sounds corresponding to the ball and heel of the foot. Optional record and playback. Also includes two quick-motion switches to change from one drum-sound mapping to another.


 

Phase 3.) Groove_FX. Utilizes shin movement to oscillate the frequency of a continuous sound. Great for swinging movement and ground work.


 

Phase 4.) Sequence_ZF. A foot-controlled sequencer of 808 drum sounds. Feet and dance steps orchestrate the beat, adding and removing triggers based on the timing of each step. Useful in combination with the FailSafe phase to add a background beat to dance to.
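One way a step-timed sequencer like this can work is by quantizing each foot-step timestamp onto a slot of a repeating drum loop. A minimal sketch of that idea in Python (my own illustration under assumed parameters, not the SoundWaves code; the loop length and 16-slot grid are hypothetical):

```python
def quantize_steps(step_times, loop_len=2.0, n_slots=16):
    """Map foot-step timestamps (in seconds) onto the slots of a repeating
    n_slots-step sequence lasting loop_len seconds; each hit arms a drum trigger."""
    slots = [False] * n_slots
    for t in step_times:
        slot = int(round((t % loop_len) / loop_len * n_slots)) % n_slots
        slots[slot] = True   # a step near this point in the loop adds a trigger
    return slots
```

On every pass through the loop, the armed slots would fire their 808 samples, so the dancer's step timing becomes the background beat.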


SoundWaves is a wireless wearable instrument that synthesizes dance into coordinated sound.

 

 

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.
(c) 2025 Hybrid Instrument Building 2014 | powered by WordPress with Barecity