“CMU View from Afar” by Claire Hentschker and Priya Ganadas (2014)

‘CMU View from Afar’ is a remote viewing tool for spaces at Carnegie Mellon University.

Why
In our initial research, we spoke with many current CMU students who had taken a campus tour before arriving at school. They all mentioned that the tour took place outside of the buildings on campus, so they were unable to get a good look inside some of the labs, lecture halls, and eating areas. Many international students also mentioned that they were unable to visit CMU for a campus tour before their academic career here began, because they could not travel to Pittsburgh just for a short visit. With this information, we attempted to create a tool for potential students, or anyone else who is interested, to view the inside of rooms at CMU without leaving the comfort of their home, wherever that may be.

How
The project aims to provide an ‘inside’ view of Carnegie Mellon’s facilities. Our website, cmuviewfromafar.tumblr.com, provides this experience. It is a three-step process:

1. A PDF is downloaded and printed out. This becomes the interface for the tool. (See fig. 1)

2. An instructional video on the website explains how the printed PDF is folded into a paper box. (See fig. 2)

3. An .apk file (an Android application) can be downloaded from the website and installed on the Android device of your choice. (See fig. 3)

Now, when the application is opened and the device’s camera is held over the box, a [glitchy but recognizable] room within Carnegie Mellon University is augmented onto the box on the device’s screen. Different rooms can be toggled between using a small menu located on the side of the screen.
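The post doesn’t say which AR toolkit the .apk uses. As a rough desktop analogue of the idea (not the project’s code), the sketch below uses Python with OpenCV’s ArUco module to detect a printed fiducial marker in a webcam feed; the marker stands in for the folded paper box, and a real AR app would go on to estimate the marker’s pose and render a room model on top of it. OpenCV ≥ 4.7 with the contrib modules is assumed.

```python
# Rough desktop analogue (not the project's Android app): detect a printed
# ArUco fiducial marker with OpenCV and show where the "box" sits in the frame.
# Requires opencv-contrib-python >= 4.7.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)                      # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        # Outline the detected marker; a full AR app would instead estimate
        # the marker's pose and draw a 3D room model on top of it.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("target preview", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```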

 

Fig. 1

Fig. 2

Fig. 3

In this prototype, three rooms are available within the menu: an art fabrication lab located in Doherty, a standard lecture hall, and the Zebra Lounge Cafe, located within the School of Art.

Video

 

 

Looking Ahead
For future iterations, we hope to incorporate a process for recording 3D video that will allow animated 3D interactions to occur within the box in real time. This is possible using the Kinect and a custom-made plugin from Quaternion Software.

The Big Picture
CMU View From Afar is a tangible interface for remotely navigating a space, using Augmented Reality technology. Our goal was to make this process as accessible and simple as possible for the user. We set up the website and chose a single folded piece of paper as the image target because internet access and a printer are the only things needed to begin using this tool. We can also continuously update the .apk file for users to download while the box that serves as the image target stays the same; this lets the same box function as a platform for many methods of information distribution, with only the .apk file needing to change.

“Rehab Touch” by Meng Shi (2014)

rehab touch from Meng Shi on Vimeo.

The idea of this final project is to explore possible ways to detect people’s gestures when they touch something.

Background:

Stroke affects the majority of survivors’ ability to live independently and their quality of life.

Rehabilitation in a hospital is extremely expensive and cannot be covered by medical insurance.

What we are doing is providing a possible low-cost solution: a home-based rehabilitation system.

Touche:

www.disneyresearch.com/project/touche-touch-and-gesture-sensing-for-the-real-world/

Touch and activate:

dl.acm.org/citation.cfm?id=2501989


These two use different approaches to gesture detection: Touché measures changes in the capacitance of the circuit across a swept range of frequencies, while Touch and Activate excites the object across a range of frequencies and detects changes in its frequency response.
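For context, the gesture recognition in a Touché-style setup happens on the host: the microcontroller sweeps through its drive frequencies and reports one reading per step, producing a frequency-response profile that is matched against profiles recorded for known gestures. Below is a minimal Python sketch of that matching step only; the profile values and gesture names are made up for illustration, and the published systems use a proper classifier (such as an SVM) rather than nearest-template matching.

```python
# Minimal sketch of host-side gesture matching for swept-frequency sensing:
# each gesture is a stored frequency-response profile, and an incoming reading
# is assigned to the nearest template by Euclidean distance.
import math

def distance(profile_a, profile_b):
    """Euclidean distance between two frequency-response profiles."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(profile_a, profile_b)))

def classify(reading, templates):
    """templates: dict mapping gesture name -> recorded profile (same length)."""
    return min(templates, key=lambda name: distance(reading, templates[name]))

# Hypothetical example profiles recorded for three touch gestures.
templates = {
    "no touch":   [12, 14, 30, 55, 31, 15],
    "one finger": [12, 18, 44, 70, 40, 16],
    "full grasp": [13, 25, 60, 90, 52, 20],
}
print(classify([12, 17, 45, 68, 41, 15], templates))   # -> "one finger"
```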

My exploration of these two approaches is based on two sets of instructions:

Instructable: www.instructables.com/id/Touche-for-Arduino-Advanced-touch-sensing/

Ali: http://artfab.art.cmu.edu/touch-and-activate/

 

At first I explored “Touché” and showed it at the critique; after the final critique, I moved on to “Touch and Activate”.


In order to provide feedback, I connect Max to Processing and do the visualization in Processing, since Max does not seem very good at data visualization. (I didn’t find a good library for data visualization in Max, though that may also be due to my limited knowledge of Max.)
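The post doesn’t say how Max and Processing were linked; sending values over the network (for example as UDP or OSC messages) is one common way to connect them. As a rough, stand-alone analogue in Python rather than the author’s actual patch, this sketch listens on a UDP port and keeps a rolling window of the most recent values, which is the kind of buffer a visualization loop would draw from. The port number and plain-text message format are assumptions.

```python
# Stand-alone analogue of the "sensor data -> visualization buffer" link:
# listen for plain-text numbers over UDP and keep a rolling window of samples.
import socket
from collections import deque

HOST, PORT = "127.0.0.1", 9000      # assumed port, for illustration only
window = deque(maxlen=200)          # last 200 samples for plotting

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind((HOST, PORT))
print("listening on %s:%d" % (HOST, PORT))

while True:
    data, _ = sock.recvfrom(1024)
    try:
        window.append(float(data.decode().strip()))
    except ValueError:
        continue                     # ignore packets that aren't plain numbers
    print("latest: %.2f  (buffered: %d)" % (window[-1], len(window)))
```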

 

Limitations of the system:

“Touch and Activate” does not seem to work very well in a noisy environment (like the final demo environment), even though it only uses high frequencies.

The connection between Max and Processing is not very robust, so if I want to do a more complicated pattern, the signal is too weak for that.

 

“Look Inside!” by Connor Brem (2014)

What happens if I look inside?

Curiosity is a powerful force. Look Inside! explores how people react when presented with a nondescript box and an offer to look inside.


If people choose to peer inside the box, they find a mechanical drum set, which begins to play a simple but incessant beat. The beat grows in speed and intensity, and the box begins to shake!


At this point, people interacting with the piece have a choice: back away and attempt to dissociate themselves from the commotion they’ve caused, or keep watching. When Look Inside! was displayed, some people ran, and some stayed. Some even came back to play again.


Look Inside! explores curiosity, responsibility, and control. Would you look inside?

Smart 3D Pen Research

“FreeD” by Amit Zoran (2013)


Amit references

“Haptic Intelligentsia” by Joong Han Lee (2012)


Amit uses magnets for 3D tracking. I found this website that appears to be a DIY version. Maybe I can work on this for another project if the Leap works.

I was worried it wouldn’t, so I did a lot of research on 3D magnet tracking, which is still an option… maybe I’ll save it for another project. I’ll post some links just so people can see what I was thinking about, if you’re interested.

DIY Magnet Tracker Sites
1 2 3 4 5

Understanding the limitations of the pen. Make sure it can work with how I want to use it.

THE LEAP WORKS!

Pen over Leap
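As a sketch of what reading the pen from the Leap can look like, assuming the Leap Motion v2 SDK’s Python bindings (Leap.py and the native LeapPython module on the import path), which reported thin, stylus-shaped objects as “tools” with a tracked tip position:

```python
# Minimal sketch: poll the Leap for pen-like "tools" and print their tip positions.
# Assumes the Leap Motion v2 SDK Python bindings are installed and importable.
import time
import Leap

controller = Leap.Controller()

while True:
    frame = controller.frame()
    for tool in frame.tools:
        tip = tool.tip_position            # millimetres, relative to the Leap
        print("pen tip: x=%.1f  y=%.1f  z=%.1f" % (tip.x, tip.y, tip.z))
    time.sleep(0.05)                       # ~20 polls per second
```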

“Quake” by Amy Friedman (2014)


Quake uses CSV files from the USGS website to simulate the magnitude of earthquakes that have occurred over the past 7 days. The USGS updates its system every 5-15 minutes to map earthquakes that have happened recently, over the past few months, and further back. Quake brings awareness to how common earthquakes are, and acknowledges the 10 levels of earthquake magnitude. Some quakes occur without anyone feeling them, but they still happen, and they happen more often around the world than most people are aware of.

Using a solenoid and dividing

USGS Magnitude Scale of Earthquakes

The CSV file can be found at USGS Spreadsheet Applications.
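As a rough illustration of how the past-seven-days data can be pulled in and grouped by whole-number magnitude, here is a minimal Python sketch; the feed URL and the “mag” column name are assumptions based on the publicly documented USGS CSV feeds, not taken from the project itself.

```python
# Download the (assumed) USGS "past 7 days, all earthquakes" CSV feed and
# count events per whole-number magnitude bin.
import csv
import io
import urllib.request

FEED_URL = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_week.csv"

def magnitude_counts(url=FEED_URL):
    """Return a dict mapping whole-number magnitude bin -> event count."""
    with urllib.request.urlopen(url) as response:
        text = response.read().decode("utf-8")
    counts = {}
    for row in csv.DictReader(io.StringIO(text)):
        mag = row.get("mag")
        if not mag:
            continue
        bin_ = int(float(mag))              # e.g. a magnitude 4.6 event -> bin 4
        counts[bin_] = counts.get(bin_, 0) + 1
    return counts

if __name__ == "__main__":
    for magnitude, count in sorted(magnitude_counts().items()):
        print("M%d.x: %d events in the past 7 days" % (magnitude, count))
```

Counts like these could then be mapped to how hard or how often the solenoid strikes.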

“Ships to Sail the Void” by Connor Brem (2014)

Arduino,Python,Robotics,Software,Student Work — connorbrem @ 10:31 am

“When ships to sail the void between the stars have been built, there will step forth men to sail these ships.”

― Johannes Kepler

Satellites are funny things. It’s easy to forget how rapidly they circle the earth: it takes us a day to make a full rotation, but some satellites can complete an orbit in an hour and a half. At those speeds, reality is warped, and time bends.

What’s even stranger is that one of these satellites, the International Space Station, has people in it.


“Ships to Sail the Void” plots the trajectory of the Space Station with a laser that it shines onto nearby surfaces. When it is activated, it takes one minute to show the path that the Space Station will take over the next one hundred minutes.

Every time that it runs, it pulls fresh data from the satellite database at n2yo.


It brings the strange, distant concept that is space travel down from the void, into your room.
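The piece compresses time by a factor of one hundred: one real minute of playback traces one hundred minutes of predicted orbit. Here is a minimal Python sketch of that mapping only; the position samples (however they are fetched from n2yo) and the point_laser callback that aims the hardware are hypothetical stand-ins, not the project’s code.

```python
# Play back a predicted orbit at 100x speed: one real minute covers 100 minutes
# of trajectory.
import time

PLAYBACK_SECONDS = 60
ORBIT_SECONDS = 100 * 60
SPEEDUP = ORBIT_SECONDS / PLAYBACK_SECONDS        # 100x time compression

def play_trajectory(samples, point_laser):
    """samples: list of (lat_deg, lon_deg) pairs taken at 1-second intervals
    along the predicted orbit (e.g. fetched from n2yo ahead of time).
    point_laser: hypothetical callback that aims the laser at a lat/lon."""
    start = time.monotonic()
    while True:
        elapsed = time.monotonic() - start
        orbit_t = int(elapsed * SPEEDUP)          # map real time -> orbit time
        if orbit_t >= len(samples):
            break
        lat, lon = samples[orbit_t]
        point_laser(lat, lon)
        time.sleep(0.05)                          # ~20 updates per second
```

With 6,000 one-second samples, the loop finishes in about a minute of real time.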

“Dragonstone” by Jólan van der Wiel (2014)

I envision myself using some unpredictable-yet-predictable type of material. This next project’s play with material properties is particularly fascinating to me, as I have always been interested in new applications of materials.

Dragonstone | Jólan van der Wiel from Mir Motion on Vimeo.

Working with magnetic clay allows for a high level of unpredictability due to the complex inherent nature of clay, yet also a level of predictability due to its magnetic properties. Knowing the behavioral limits allows some aspects of the fabrication to happen naturally without interference, such as the way the clay falls out, while staying within boundaries so that the designer can control the general shape.


Can I get some advice from you guys on what I could do? It doesn’t have to be cymatics, necessarily. I think I was interested in cymatics for the reasons I mentioned above, where there is a level of control and un-control: a compromise between the natural effect and the designer’s intentions. However, I feel unhappy with how my project’s limitations constrain what I imagine it could “be”, and a performance installation for a concert isn’t what I really want.

My one sentence description would be something more like:

Compromising the designer’s control with the unpredictable nature of [??a material??] accomplished by implementing [??a hybrid instrument??]

“Computer Augmented Craft” by Christian Fiebig (2012)

Computer Augmented Craft – Christian Fiebig from The Machine on Vimeo.

This kind of intelligent learning used to aid in the design process is what I was really interested in when deciding to pursue tangible interaction design. I want to design a design process. I want to design a tool that aids in the design process.


Proposal: “Noise” by Connor Brem (2014)

Arduino,Audio,Final Project,Robotics,Sensors — connorbrem @ 11:09 pm

“Noise” will be a collaborative, physical music-looping device.

“runScribe” by Scribe Labs (2010)

3_Embedded and Wearable,Scanning,Sensors — mbparker @ 4:56 pm

