“Kim vs. Comet” by Rachel Ciavarella

“Kim vs. Comet” is a real-time comparison of the volume of data created via social media about Kim Kardashian and the Rosetta comet landing.

How do online interactions differ from real-world interactions? People often create digital personalities for themselves. These alternate selves make it easier to dissociate the physical self from digital interactions, and that mental distance can lead to uncharacteristic behavior in the digital world.

How does media influence public perception? Kim’s “internet breaking” magazine cover was published at about the same time that scientists landed on the Rosetta comet. However, Kim seemed to get most of the media attention. Does that mean people care less about humans landing on a comet? Or does the media only allow the public to see whatever will earn the most views and publicity?

This piece physicalizes the digital interaction of tweeting, forcing people to confront their own and others’ digital interactions in the real world.

Inside the sculpture are two thermal printers, two Arduinos, and a laptop running two Processing sketches. Each sketch scrapes the most recent tweet from Twitter matching the search query for either @KimKardashian or @ESA_Rosetta, then compares it to the previously fetched tweet. If it is different, it is printed from one of the thermal printers; if it is the same, the sketch waits a few seconds (to avoid exceeding rate limits) before scraping again. The printed sheets are allowed to hang from the printers as a visual comparison of the volume of data created about each topic.
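The compare-and-print loop each sketch runs can be outlined as follows. This is an illustrative Python sketch of the logic only, not the project’s Processing code; the Twitter query and thermal printer are abstracted into injected callables with hypothetical names.

```python
import time

def poll_and_print(fetch_latest, print_tweet, delay_s=5.0, max_polls=None):
    """Repeatedly fetch the most recent tweet for a query; print it
    only when it differs from the last tweet seen.

    fetch_latest: callable returning the newest matching tweet text
    (assumed to wrap a Twitter search; injected here for testability).
    print_tweet:  callable sending text to a thermal printer.
    """
    last = None
    polls = 0
    while max_polls is None or polls < max_polls:
        tweet = fetch_latest()
        if tweet != last:
            # A new tweet arrived: physicalize it on paper.
            print_tweet(tweet)
            last = tweet
        else:
            # Same tweet as before: wait to respect rate limits.
            time.sleep(delay_s)
        polls += 1
```

Running one such loop per query (one per printer) yields the side-by-side paper streams described above.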


In the future I would like to improve the piece by consolidating the electronics and code onto a Raspberry Pi running a single Processing sketch.

“CMU View from Afar” by Claire Hentschker and Priya Ganadas (2014)

‘CMU View from Afar’ is a remote viewing tool for spaces at Carnegie Mellon University.

Why
In our initial research we spoke with many current CMU students who had taken a campus tour before arriving at school. They all mentioned that the tour took place outside the buildings on campus, so they were unable to get a good look inside some of the labs, lecture halls, and eating areas. Many international students also mentioned that they were unable to visit CMU for a campus tour before their academic careers began, since they could not travel to Pittsburgh just for a short visit. With this information, we set out to create a tool that lets prospective students, or anyone interested for that matter, view the inside of rooms at CMU without leaving the comfort of their home, wherever that may be.

How
The project aims to provide an ‘inside’ view of Carnegie Mellon’s facilities. Our website, cmuviewfromafar.tumblr.com, provides this experience. It is a three-step process:

1. A PDF is downloaded and printed out. This becomes the interface for the tool. (see fig. 1)

2. An instructional video can be watched on the website, explaining how the printed PDF is then folded into a paper box. (see fig. 2)

3. A .apk file (an application for Android devices) can be downloaded from the website and installed on the desired Android device. (see fig. 3)

Now, when the application is opened and the device’s camera is held over the box, a [glitchy but recognizable] room within Carnegie Mellon University is augmented onto the box on the device’s screen. Different rooms can be toggled between using a small menu on the side of the screen.

 

Fig. 1

Fig. 2

Fig. 3

In this prototype, three rooms are available in the menu: an arts fabrication lab located in Doherty, a standard lecture hall, and the Zebra Lounge Cafe, located within the School of Art.

Video

Looking Ahead
For future iterations, we hope to incorporate a process for recording 3D video that will allow animated 3D interactions to occur within the box in real time. This is possible using the Kinect and a custom-made plugin from Quaternion Software.

The Big Picture
CMU View From Afar is a tangible interface for remotely navigating a space using augmented-reality technology. Our goal was to make the process as accessible and simple as possible for the user. We set up the website and chose a single folded piece of paper as the image target because internet access and a printer are then the only things needed to begin using the tool. We can also continuously update the .apk file for users to download, while the box remains the same image target. This lets the same box function as a platform for many methods of information distribution; all that has to change is the .apk file.

Vox Proprius

Vox Proprius (source) is an iPhone app that harmonizes with you while you sing. Running on the openFrameworks platform, it uses the ofxiOS addon combined with the ofxPd addon to generate sound and visuals.

All of the extra parts are generated live from your own voice using a pitch shifter in Pd. Songs can be written in any number of composition programs (I used MuseScore) and exported as a MusicXML file for import and synthesis in the app.
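Since the harmonies are produced by pitch-shifting the live voice, each harmony line reduces to a transposition ratio in equal temperament. A minimal sketch of that math follows; this is illustrative Python, not the app’s Pd patch.

```python
def semitone_ratio(semitones):
    """Equal temperament: each semitone multiplies frequency by 2^(1/12)."""
    return 2.0 ** (semitones / 12.0)

def harmony_frequencies(base_hz, intervals):
    """Frequencies a pitch shifter would target for each harmony
    interval (in semitones) above or below the sung note."""
    return [base_hz * semitone_ratio(s) for s in intervals]
```

For a sung A at 440 Hz, intervals of [0, 4, 7] semitones give a major triad at roughly 440, 554, and 659 Hz.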

“Haptic 3D” Amy Friedman (2014)


Haptic 3D is a wearable bracelet and Unity application. The bracelet syncs with Unity to give haptic feedback when a user’s hand picks up an object in 3D space and connects it to another part in 3D space. The demo has three cars, each needing two tires connected to its axles. The tool is meant to be a starting point for better understanding virtual 3D objects and 3D space through haptic feedback.

This project uses a Leap Motion to detect hand location in 3D space, Unity to model the scene in 3D (with plugins to connect to the Leap Motion), and Uniduino to connect to the Arduino for haptic feedback.
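One way the bracelet’s vibration level could be derived from the interaction state is sketched below. This is an assumed mapping written in Python for illustration, not the project’s Unity/Uniduino code; the function name and the distance-based scaling are hypothetical.

```python
def haptic_command(is_holding, distance_to_target, max_distance=0.5):
    """Map interaction state to a 0-255 vibration intensity byte.

    is_holding:         True while the Leap-tracked hand grips a virtual tire.
    distance_to_target: metres from the tire to its axle mount point.
    The buzz grows stronger as the part nears its mount (an assumed
    mapping; the actual pipeline may differ).
    """
    if not is_holding:
        return 0  # nothing gripped: motor off
    closeness = 1.0 - min(distance_to_target, max_distance) / max_distance
    # Faint grip buzz (64) ramping up to full strength (255) at the mount.
    return int(round(64 + 191 * closeness))
```

The returned byte is the kind of value that would be sent over serial for the Arduino to drive a vibration motor with PWM.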


Future developments would allow low-cost learning: students could explore complex systems, and how to build them, through virtual reality. They would learn to build mechanical objects with their hands even when they don’t have access to the physical components of a mechanical system, because it no longer exists or is too expensive to access.


“Rehab Touch” By Meng Shi (2014)

rehab touch from Meng Shi on Vimeo.

The idea of the final project is to explore possible ways to detect people’s gestures when they touch an object.

Background:

Stroke affects most survivors’ ability to live independently and their quality of life.

Rehabilitation in a hospital is extremely expensive and cannot be covered by medical insurance.

What we are doing is providing a possible low-cost solution: a home-based rehabilitation system.

Touché:

www.disneyresearch.com/project/touche-touch-and-gesture-sensing-for-the-real-world/

Touch & Activate:

dl.acm.org/citation.cfm?id=2501989


These two projects use different methods to realize gesture detection: Touché measures changes in a capacitive signal swept across a range of frequencies, while Touch & Activate vibrates the object and detects changes in its acoustic frequency response.
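In the Touché approach, a gesture is recognized by comparing the measured swept-frequency profile against previously recorded training profiles, typically with a nearest-neighbour match. A minimal Python sketch of that classification step (the data shapes are assumed, not taken from either paper’s code):

```python
def classify_profile(profile, training):
    """Nearest-neighbour gesture classification over swept-frequency
    profiles, in the spirit of Touché.

    profile:  list of sensor readings, one per swept frequency.
    training: dict mapping gesture names to recorded profiles of the
              same length.
    Returns the name of the closest training profile.
    """
    def distance(a, b):
        # Squared Euclidean distance between two profiles.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(training, key=lambda name: distance(profile, training[name]))
```

With one recorded profile per gesture (no touch, one finger, full grasp, and so on), each new sweep is labeled with whichever stored profile it most resembles.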

My exploration of these two techniques was based on two guides:

Instructable: www.instructables.com/id/Touche-for-Arduino-Advanced-touch-sensing/

Ali: http://artfab.art.cmu.edu/touch-and-activate/

 

At first I explored Touché and showed it at the critique; after the final critique, I moved on to Touch & Activate.


To provide visual feedback, I connected Max to Processing and did the visualization in Processing, since Max does not seem well suited to data visualization. (I didn’t find a good library for data visualization in Max, though that may also reflect my limited knowledge of it.)

 

Limitations of the system:

“Touch and Activate” does not seem to work well in noisy environments (like the final demo environment), even though it uses only high frequencies.

The connection between Max and Processing is not very robust, so for more complicated patterns the signal is too weak to be usable.

 

Immersive Experience: “Echo Chamber” by Connor Brem, Emma Steuer, Chris Williamson (2014)

In a dark room, words, motions, and even thoughts are amplified.

Echo Chamber places its audience inside a room where the only sound is the sound that they make, cycled through an audio feedback loop, and the only light is light that follows this sound’s pitch and volume.
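The sound-to-light mapping can be illustrated with a small sketch: RMS energy drives brightness, and a crude zero-crossing count estimates pitch. This Python version shows only the mapping logic under those assumptions; it does not reproduce the installation’s actual audio and lighting chain.

```python
import math

def sound_to_light(samples, sample_rate=44100):
    """Map an audio buffer to (brightness, pitch_hz).

    brightness follows RMS volume (0.0-1.0 for samples in [-1, 1]);
    pitch_hz is a rough zero-crossing-rate estimate, adequate for a
    light that merely tracks the voice going up and down.
    """
    n = len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)
    # Count sign changes between consecutive samples.
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    # Two crossings per period of a (roughly) periodic signal.
    pitch_hz = crossings * sample_rate / (2.0 * n)
    brightness = min(1.0, rms)
    return brightness, pitch_hz
```

Louder sound then maps directly to a brighter room, and the pitch estimate can steer a second parameter such as hue.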


The piece places its audience unexpectedly in control of this room, and explores how they react. Do they explore its sounds and lights? Do they try to find order in the feedback? Or do they shrink back, afraid that the feedback will grow out of control?

“Look Inside!” by Connor Brem (2014)

What happens if I look inside?

Curiosity is a powerful force. Look Inside! explores how people react when presented with a nondescript box and an offer to look inside.


If people choose to peer inside the box, they find a mechanical drum set, which begins to play a simple but incessant beat. The beat grows in speed and intensity, and the box begins to shake!


At this point, people interacting with the piece have a choice: back away and attempt to dissociate themselves from the commotion they’ve caused, or keep watching. When Look Inside! was displayed, some people ran, and some stayed. Some even came back to play again.


Look Inside! explores curiosity, responsibility, and control. Would you look inside?

Assignment, Final Project, Student Work — John Mars @ 4:27 pm

“Quake” by Amy Friedman (2014)


Quake utilizes CSV files from the USGS website to simulate the magnitude of earthquakes occurring throughout the past 7 days. The USGS updates its feeds every 5-15 minutes to map earthquakes that have happened recently, over the past few months, and further back. Quake brings awareness to how common earthquakes are and to the 10 levels of the earthquake magnitude scale. Some quakes occur that no one feels, but they happen, and more often around the world than most people are aware.

Using a solenoid and dividing

USGS Magnitude Scale of Earthquakes

The CSV file can be found at USGS Spreadsheet Applications.
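As a sketch of the data path: the USGS feed is plain CSV with a `mag` column, so extracting magnitudes takes only a few lines. The strike mapping below is an assumed example of how a solenoid hit count might scale with magnitude; the piece’s exact scaling is not described in the post.

```python
import csv
import io

def magnitudes(csv_text):
    """Extract earthquake magnitudes from a USGS feed CSV.
    The feed includes a 'mag' column; rows with no magnitude
    are skipped."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [float(row["mag"]) for row in reader if row.get("mag")]

def strike_count(mag):
    """Assumed physicalization: strike the solenoid once per whole
    unit of magnitude, so a M4.6 quake gets 4 strikes."""
    return int(mag)
```

Polling the feed every few minutes and feeding each new magnitude through a mapping like `strike_count` would drive the solenoid in near real time.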

“FbStalker” by Priya Ganadas (2014)

Have you ever stalked anyone on Facebook?

Do you anticipate a comment or a like from ‘that’ specific person?

Do you constantly check your notifications, get disappointed, and end up wasting too much time on Facebook for no reason?

What if you could get a notification from Facebook if and only if that someone special was active?

fbstalker from Priya Ganadas on Vimeo.

FbStalker!!

It lets you pick the person whose activity on your profile you want to be notified about. A simple, elegant light attached to your desktop blinks to let you know that someone special just commented on your status.


I tracked my friend Raj, printing “Wohoo” every time I got a like from him.

Serial communication between Processing and Arduino is used to blink the LED.

Back-end
I used Temboo to get data from Facebook.
Here is the link

You will have to get access from Facebook using Facebook Developers.

It was a good learning experience to use the Temboo API by writing custom code rather than using the Choreos.
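The core filtering step, deciding which polled events should trigger the light, can be sketched like this. The event shape here is an assumption for illustration; Temboo’s actual Facebook responses are JSON with different field names.

```python
def new_target_events(events, target, seen_ids):
    """Filter a polled batch of profile events down to unseen events
    by the one person being watched.

    events:   list of dicts with 'id', 'actor', 'action' (assumed shape).
    target:   name of the person to watch (e.g. "Raj").
    seen_ids: set of already-notified event ids, updated in place so
              repeated polls do not re-notify.
    """
    fresh = [
        e for e in events
        if e["actor"] == target and e["id"] not in seen_ids
    ]
    seen_ids.update(e["id"] for e in fresh)
    return fresh
```

Each fresh event would then print “Wohoo” and send a byte over serial for the Arduino to blink the LED.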

Github repository is here

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.
(c) 2024 Hybrid Instrument Building | powered by WordPress with Barecity