“Rehab Touch” by Meng Shi (2014)

rehab touch from Meng Shi on Vimeo.

The idea of this final project is to explore possible ways to detect people’s gestures when they touch an object.

Background:

Stroke affects the majority of survivors’ ability to live independently and their quality of life.

In-hospital rehabilitation is extremely expensive and cannot be covered by medical insurance.

What we are doing is providing them with a possible low-cost solution: a home-based rehabilitation system.

Touché:

www.disneyresearch.com/project/touche-touch-and-gesture-sensing-for-the-real-world/

Touch and Activate:

dl.acm.org/citation.cfm?id=2501989


These two projects realize gesture detection in different ways: Touché measures how the capacitance of the circuit changes across a sweep of excitation frequencies, while Touch and Activate injects a sweep of vibration into the object and detects changes in its acoustic frequency response.
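Both approaches end up with the same kind of data: a profile of the measured response across the swept frequencies, which is then classified against previously recorded examples (the original papers use SVM classifiers). As a rough illustration only, and not the code from either paper or tutorial, the plain-Java sketch below classifies a hypothetical 8-point sweep profile by nearest-neighbor comparison; all profiles, labels, and sizes are made up.

    import java.util.HashMap;
    import java.util.Map;

    // Minimal sketch: classify a touch gesture from a swept-frequency
    // response profile by nearest-neighbor comparison against stored
    // example profiles. (Touché and Touch and Activate use an SVM;
    // nearest-neighbor is used here only to keep the idea visible.)
    public class SweepClassifier {

        // Euclidean distance between two response profiles
        static double distance(double[] a, double[] b) {
            double sum = 0;
            for (int i = 0; i < a.length; i++) {
                double d = a[i] - b[i];
                sum += d * d;
            }
            return Math.sqrt(sum);
        }

        // Return the label of the stored profile closest to the live one
        static String classify(double[] live, Map<String, double[]> examples) {
            String best = "none";
            double bestDist = Double.MAX_VALUE;
            for (Map.Entry<String, double[]> e : examples.entrySet()) {
                double d = distance(live, e.getValue());
                if (d < bestDist) {
                    bestDist = d;
                    best = e.getKey();
                }
            }
            return best;
        }

        public static void main(String[] args) {
            // Hypothetical 8-point profiles (response at 8 swept frequencies),
            // recorded once per gesture during a calibration step.
            Map<String, double[]> examples = new HashMap<>();
            examples.put("no touch",   new double[]{0.1, 0.2, 0.9, 0.3, 0.2, 0.1, 0.1, 0.1});
            examples.put("one finger", new double[]{0.1, 0.3, 0.7, 0.6, 0.3, 0.2, 0.1, 0.1});
            examples.put("full grasp", new double[]{0.2, 0.4, 0.5, 0.8, 0.6, 0.4, 0.2, 0.1});

            double[] live = {0.1, 0.3, 0.8, 0.5, 0.3, 0.2, 0.1, 0.1};
            System.out.println("detected gesture: " + classify(live, examples));
        }
    }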

My exploration of these two approaches was based on two tutorials:

Instructable: www.instructables.com/id/Touche-for-Arduino-Advanced-touch-sensing/

Ali: http://artfab.art.cmu.edu/touch-and-activate/

 

At first I explored “Touché” and showed it at the critique; after the final critique, I moved on to “Touch and Activate”.


In order to provide visual feedback, I connected Max to Processing and did the visualization in Processing, since Max does not seem well suited to data visualization. (I did not find a good library for data visualization in Max, though that may also be due to my limited knowledge of Max.)
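One common way to make this connection is to send OSC messages from Max with [udpsend 127.0.0.1 12000] and receive them in Processing with the oscP5 library; I do not know exactly how this project wired the two together, so the port number and address pattern below are placeholders. The Processing sketch just draws the incoming value as a bar.

    // Processing sketch (Java): receive one float from Max over OSC
    // and draw it as a bar. Assumes Max sends [udpsend 127.0.0.1 12000]
    // with messages like "/touch 0.42"; port and address are placeholders.
    import oscP5.*;
    import netP5.*;

    OscP5 osc;
    float value = 0;

    void setup() {
      size(600, 200);
      osc = new OscP5(this, 12000);   // listen on UDP port 12000
    }

    void oscEvent(OscMessage msg) {
      if (msg.checkAddrPattern("/touch") && msg.checkTypetag("f")) {
        value = msg.get(0).floatValue();
      }
    }

    void draw() {
      background(0);
      fill(255);
      rect(0, height - 20, value * width, 20);  // simple bar visualization
    }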

 

Limitations of the system:

“Touch and Activate” does not seem to work well in a noisy environment such as the final demo environment, even though it only uses high frequencies.

The connection between Max and Processing is also not very robust; if I want to handle a more complicated pattern, the signal is too weak to support it.

 

Immersive Experience: “Echo Chamber” by Connor Brem, Emma Steuer, Chris Williamson (2014)

In a dark room, words, motions, and even thoughts are amplified.

Echo Chamber places its audience inside a room where the only sound is the sound that they make, cycled through an audio feedback loop, and the only light is light that follows this sound’s pitch and volume.


The piece places its audience unexpectedly in control of this room, and explores how they react. Do they explore its sounds and lights? Do they try to find order in the feedback? Or do they shrink back, afraid that the feedback will grow out of control?
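The installation itself was presumably built with its own audio tools, but the core mapping it describes (light that follows the sound’s pitch and volume) is easy to sketch. The Processing sketch below, using the minim library, is only an illustration of that mapping under my own assumed ranges: it estimates a rough pitch from the loudest FFT band, reads the input level, and maps them to hue and brightness.

    // Processing sketch (Java): map microphone pitch and volume to light.
    // Illustrative only; ranges and mappings are assumptions, not the piece's.
    import ddf.minim.*;
    import ddf.minim.analysis.*;

    Minim minim;
    AudioInput in;
    FFT fft;

    void setup() {
      size(400, 400);
      colorMode(HSB, 360, 100, 100);
      minim = new Minim(this);
      in = minim.getLineIn(Minim.MONO, 1024);
      fft = new FFT(in.bufferSize(), in.sampleRate());
    }

    void draw() {
      fft.forward(in.mix);
      // loudest FFT band as a rough pitch estimate
      int peakBand = 0;
      for (int i = 1; i < fft.specSize(); i++) {
        if (fft.getBand(i) > fft.getBand(peakBand)) peakBand = i;
      }
      float pitchHz = fft.indexToFreq(peakBand);
      float level = in.mix.level();   // RMS volume, roughly 0..1

      float hue = map(constrain(pitchHz, 50, 2000), 50, 2000, 240, 0);
      float brightness = constrain(map(level, 0, 0.5, 5, 100), 0, 100);
      background(hue, 80, brightness); // the "room light"
    }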

Max Extensions: Part 1

Arduino, Hardware, Max, OpenCV, Sensors — Ali Momeni @ 12:17 am

In order to add extra functionality to Max, you can download and install “3rd party externals”. These are binaries that you download, unzip, and place within your MAX SEARCH PATH (i.e. in Max, go to Options > File Preferences… and add the folder where you’ll put your 3rd party extensions; I recommend a folder called “_for-Max” in your “Documents” folder).

Some helpful examples:

  • Physical Computing: Maxuino
    • how to connect motors/lights/solenoids/leds to Max with Maxuino
  • Machine Vision: OpenCV for Max (cv.jit)
  • Audio Analysis for Max: Zsa Objects (Emmanuel Jourdan) and analyzer~ (Tristan Jehan)

Streaming Audio from/to Max

Audio, Max — Ali Momeni @ 2:49 pm

audio streaming related:

Intro to Max

Max, Software — Ali Momeni @ 12:30 pm

Class resources:

  • SIS Courses drop box folder: includes Max abstractions and externals
  • Cycling ’74 Max Licenses: CMU has a site license now; email Ali for authorization

Max Resources:

  • Max Forum: active discussion forums maintained by Cycling74
  • MaxObjects.com: database of 3rd party “externals” for Max; search here by function (e.g. “spectral analysis”)