Final project: Sharing My World

Final update and wrap-up

 

Project summary

Summary for those of you just joining us: this semester I worked with Brothers Keepers, an after-school program focused on social, spiritual, and intellectual enrichment for young men, housed in Hosanna House in Wilkinsburg, PA. Together, the students and I built a short wraparound VR experience documenting the activities, discussions, and spirit that typify an after-school session of Brothers Keepers. Students worked on storyboarding, capturing still and video images in 360º, and recording and producing audio, and learned about the Unity VR pipeline for stitching all of that material together into a VR experience.

A web version of our final product, “Brothers Keepers Journey,” is available for viewing here. See below for much more detail about each aspect of this project. The Android app version is available for direct download here. (Please give your phone permission to install the app, and for best results use a viewer such as the Google Cardboard.) The project’s source code is available here on GitHub.

BKgrouppose

For more detailed discussion of steps along the way, see below. (This is the top of a stack of reverse-chronological updates completed along the journey.)

 

Final Brothers Keepers session

On Thursday, April 28th, we had our last Brothers Keepers session and showcase at Hosanna House in Wilkinsburg. For the event, I borrowed equipment from ArtFab and CREATE Lab so that Brothers Keepers Journey could be viewed on three headsets; we passed them around the room to give everyone time to go through the whole experience, which takes three to five minutes.

The plan of the evening was pretty simple: we had some informal gathering time when I passed around the Ricoh Theta camera and headsets with BK Journey loaded on them, and I spoke with students and their families about the project. Pastor Gill (one of the group leaders) called everyone to order and spoke about the importance of seizing learning opportunities, and then I presented the final work that we’d all done together. Unfortunately, only a few families were able to come to the event, but everyone present, including the young men who’d helped create BK Journey, was interested to see the final product.

Pastor Gill asked me to speak about the technical process and skills necessary to bring the final BK Journey piece together. I showed everyone some of the C# lurking behind the final project—which I explained to the students was especially ugly code—and explained that you start simpler and then work your way up to writing this sort of code only after gaining experience through a lot of trial and error. The main lesson I tried to repeat during the evening was that for me, the technical side of this experience, like many others, was all about facing a long string of confusing problems and persisting until I’d been able to finally push through and solve (or work around) them.

After the presentation was over, Pastor Gill thanked me again and I took some questions about the Unity backend, among other things. I was also able to have longer conversations with a few parents. One woman’s son was very interested in technical learning at the expense of all the rest of his schoolwork, so we discussed how important it is for him to get the chance to build interesting things in code while still keeping up with other subjects if he wants to get into a strong college where he can advance his learning in programming. Another mother works for the VA and was interested in using virtual reality to provide virtual field trips for incapacitated veterans in their care; I recommended she consider looking into using Gigapan to richly document sites of interest, and made myself available to help with the project.

 

Summary project timeline

An overview timeline of what happened when (for more detail, see prior updates below):

  • February through March 24th: develop project goals. Establish contact with the primary after-school org; when they fell off the radar, make new contact with a secondary possibility (Brothers Keepers). Ongoing development in Unity throughout.
  • March 30th: first meeting with students at Brothers Keepers. I introduce the 360º capture camera and the Cardboard-like viewing device, and demonstrate the workflow from image capture to viewable experience.
  • April 6th: second BK session. We have an ideation brainstorming process where we decide to stitch multiple video/image bubbles together into a “journey” as the group project. Students break into groups, each of which has specific responsibility for one of these still/video bubbles.
  • April 13th: third BK session. In the first half of the session, groups plan out on paper what we’ll record, and in the second half, we record stills, audio, and video. (Planning document blank.)
  • April 20th: fourth BK session. All students present view and comment on the “rough cut” of the work from the previous session, and we discuss improvements, reshoots, and edits. We decide to reshoot one scene.
  • April 28th: fifth and final BK session. Presentation of the final project to students and families.

 

Dissemination difficulties

Brothers Keepers hopes to use BK Journey as a recruiting tool to attract more young men to their program. At the moment, original interactive VR content made by a small group (i.e. not by a game studio) isn’t so easily disseminated, unfortunately. I expect (and hope) that within a year or two this will be a very different situation, and that this type of content will be third-party hosted and shareable as easily as YouTube content is today. I include this section as a sort of historical document showing the paucity of good options at this moment.

There are four modes I can think of for distributing the final experience for more people to see, though none is ideal:

  • Web-playable Unity module. Good because it’s broadly accessible to anybody on a computer. Problems: 1) The experience is not full screen, which of course takes away a lot of the sense of presence. 2) Unity warns that its web player is deprecated, so it’s not likely to keep working reliably in the future. Even now, it’s only supported in some browsers.
  • 360º movie. Good in that it’s now supported on YouTube, for one, so it’s widely accessible. However, there’s no interactivity beyond the user choosing where to look. The interactively triggered sound in BK Journey wouldn’t work, the user couldn’t choose their own pace, etc.
  • Google Play Store. Good platform to get an app out there and downloadable by anybody, but at the moment it won’t accept an APK bigger than 100MB. Partly because BK Journey contains a fullscreen video, it’s ~120MB in total. (Having the phone load remote assets at playback time would probably work, but would be a pretty heavy data pull. Preloading assets in a non-live way is of course feasible, but I don’t know how, or whether, Unity itself could download and save files to a phone’s storage for this purpose.)
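For what it’s worth, a script probably could pull the big video down once and cache it in the phone’s storage. A rough, untested sketch of what that download-and-cache step might look like using Unity’s (2016-era) WWW class — the URL and filename here are placeholders, not real hosting locations:

```csharp
using System.Collections;
using System.IO;
using UnityEngine;

// Downloads the large video once and caches it in the phone's
// persistent storage, so later launches skip the heavy data pull.
public class AssetPreloader : MonoBehaviour
{
    // Placeholder URL: wherever the video would actually be hosted.
    public string videoUrl = "https://example.com/bkjourney/gym.mp4";

    IEnumerator Start()
    {
        string localPath = Path.Combine(Application.persistentDataPath,
                                        "gym.mp4");

        if (!File.Exists(localPath))
        {
            WWW download = new WWW(videoUrl);
            yield return download;  // wait for the transfer to finish

            if (string.IsNullOrEmpty(download.error))
                File.WriteAllBytes(localPath, download.bytes);
        }

        // From here, the scene would load the video from localPath
        // instead of bundling it inside the APK.
    }
}
```

That would shrink the APK itself under the 100MB cap, at the cost of a one-time download on first launch.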
  • Posting the APK (Android executable) online for direct user download. Requires that end users are able to download an APK onto their phone and convince their OS that it should execute rather than discard the code from an unknown developer. Certainly possible, but security settings make it hard to do and warn users away from this, and less sophisticated users are likely to hit trouble.

Of these options, I’m posting the Unity web playable version as well as the APK. And I’m waiting (impatiently) for easier sharing possibilities so that a broader base of people starting to make this sort of experience won’t need to jump through so many hoops!

 

Concluding thoughts and future directions

Some challenges:

The biggest frustration I had was not being able to hand over more of the Unity development process to the BK students. If the project had consisted mostly of drag-and-drop processes, or if I’d had the scripting completed and ready to go ahead of time, I might have been able to involve students more deeply in building the final experience. Unfortunately, nearly all of the heavy lifting in Unity was in C# scripting rather than in the main Unity GUI, and for students who’d never done any coding, that just wasn’t an accessible point of entry. There’s also no easy way to collaborate on a Unity project other than passing around files via a VCS like GitHub, so even giving students the chance to work on the project at home (provided they had computers, could get Unity up and running, etc.) wouldn’t have been a very good option.

Unfortunately, time was a major limiting factor in what I was able to do with the students. At each evening session, I had only about 1.5–2 hours, depending on when students showed up and what other activities or discussions the leaders wanted to spend time on that night.

The greatest shortcoming of this project, I felt, was that owing to time and group constraints, I wasn’t able to work with just a few students one-on-one to give them a real chance to express their own stories individually. That was my original goal for the project. I was facing a roomful of young men who were enthusiastic about participating, and it became apparent that filming in each of their respective spaces and then producing an experience individually with them would’ve been prohibitively time consuming. I pivoted and worked on a group project instead. The reach this achieved was broader but necessarily shallower, unfortunately. Furthermore, while it incorporated many students into the production, the final experience we made was less personal narrative and more design-by-committee.

Some successes:

I was happy to give students the opportunity to see and learn about some new (emerging?) media forms. As I repeatedly expressed to the young men, I believe that new versions of these tools, accessible to novices out of the box, will be arriving soon on the market, and perhaps they’ll be a few steps ahead of their peers in at least knowing some of the possibilities for the format.

I did get many one-on-one hours (~eight) with one student, Jaeden, who was among the older students and was able to come to campus to work with me for an afternoon. He learned the basics of Audacity and edited all of the sound files used in the final project, and his parents told me (as did he) that he was really excited to have learned sound editing because he was interested in getting into music production.

I think this project may have put some of the students on a path towards realizing that they can make creative electronic media rather than just consuming it. I don’t really know how much of an effect the project had along these lines—I didn’t do any surveys or even informally ask this sort of question of the students—but I believe that at least some seeds were planted. Given less than ten total contact hours, I’ll count that as a good start.

Final thoughts:

An observation: I noticed that even though I’d demonstrated the 360º capture camera on the first day, and repeatedly explained and shown its affordances, students still didn’t seem to totally grasp the implications of producing a 360º feature. At times they would pose as a group in one area, for instance, rather than ringing the camera, or they wouldn’t express particular interest in what was situated to the “sides” or “back” of a scene as they thought of it. To me this is just a demonstration of the power of the framing paradigm that we’re all so accustomed to, which these new tools break. I imagine that soon, with broader adoption and more exposure to 360º media, people will start to think more natively about taking maximum advantage of the format.

I was very frustrated trying to set up Unity with Cardboard to export VR-ready content from my laptop to an Android phone. In March I documented the byzantine installation process, partly because I assumed that even once I’d done all those steps it still wasn’t going to work, and I wanted an auditable trail to see where I might’ve gone wrong. Apparently the process has already changed since I went through it. As I’ve mentioned above, I recognize that some of this post will serve as a record of how frustrating it once was to build these fairly simple experiences for VR, because the tools are evolving very quickly. I feel a bit like I’m staring at a stack of punch cards I poured hours into making, knowing that very soon they will be quaintly useless museum pieces.

With this project, I explored some new expressive means recently made widely available by low cost immersive wraparound media recording and playback technologies. This summer I’ll be working with CREATE Lab to continue and extend some of this work with the goal of bringing it to high school teachers from across the country, who will hopefully go on to inspire interesting and groundbreaking work when they return to their classrooms. More than anything else, I get the sense we’re working on the tip of an iceberg of creative expression that’s so big, I don’t trust myself to even meaningfully guess at where it’s going to lead. But in due time I’m excited to find out.

 

Project thanks

Thanks to Professor Ali Momeni for helping with project ideation, technical assistance along the way, and lending and purchasing needed equipment to complete the project.

Thanks also to Professor Illah Nourbakhsh and CREATE lab staff, Jessica P. in particular, for lending equipment and discussion about further extension of the project goals.

Special thanks to my classmate John Choi, who sat with me for a lot of hours helping me understand how to convince Unity to do what we wanted it to do. Truly couldn’t have done it without his help!

And finally, thanks to the adult leaders and young men of Brothers Keepers, who allowed me to take over their afterschool program for more than a month in the service of this project. I hope the experience was as valuable to them as it has been to me.


April 14th update

We got a lot done at our session last night, which was our third. Reviewing: in the first session, I introduced the technology. In the second, we ideated to plan out what we’d be filming. In the third session (last night) we planned a bit further and then took still images, and recorded video and audio to make four complete scenes. All in the space of about an hour and a half!

We’d formed student teams at the end of our second session, and though attendance was lower last night (only about 9 or 10 students), I had students sit in their groups to plan at the start of the session. I handed out a planning document that’s modeled on a storyboard blank that would be used for a regular film, but with a twist: instead of blank boxes allowing for one shot followed by another, this 360º storyboarding document provides space to sketch out the perspectives on all six of the cube sides surrounding the camera. Here is how one group filled it out:

brothersKeepersCapturePlan

They used the star annotations to mark where a user’s gaze should trigger embedded audio. (By way of explanation: this scene, “Pledge, Prayer, Proverb,” documents the Brothers Keepers’ usual invocation at the start of a session. They always start meetings with these three things, in that order.)

Today, Jaeden, one of the group leaders, will be coming to campus. He and I will work in Unity on stitching the four scenes together into a rough cut of the whole Brothers Keepers experience.

Download link for 360º planning document blank.


April 8th update

Not an especially important matter, but I wanted to make a maximally slender, yet usefully stable, stand for the 360º camera for still and video capture. The prior version used a chunk of hardwood as the base, and that was OK, but it would show up if you looked directly down in the captured image.

Mostly because I love TIG welding, I made a new tripod base in ArtFab’s metal shop. Here’s an in-progress image (prior to welding on the last leg):

And here’s a closeup of the junction after welding:

You’ll still be able to see this base when looking down, but you probably will only see a tiny bit of tripod. Good enough.


April 6th update

Had another really good session with the Brothers Keepers group at Hosanna House last night. This time there were a few more boys present than last week—14 instead of 10—though according to some of the program leaders, that’s probably the usual week-to-week ebb and flow of students rather than expanding numbers.

We had an organizing session this time, discussing as a group the possibilities for what sorts of VR experiences students might want to build together. As we discussed and I noted students’ ideas down in a Google doc we could all see, I began trying to combine and synthesize ideas into 3 or 4 “scenes” that the group could produce—the thinking being that we’d form teams of interested students who would collaborate on individual VR scenes. (As a group, we are using “scene” to refer to a single VR sphere with the viewer at its center. It may contain 360º video or 360º still photo content.)
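For the technically curious: the basic trick for displaying one of these scenes in Unity is to put the camera at the center of a sphere, turn the sphere inside out, and texture it with the equirectangular image. A rough sketch of the inside-out step, assuming a standard sphere mesh (this is the common idiom, not necessarily our project’s exact code):

```csharp
using System.Linq;
using UnityEngine;

// Attach to a sphere that surrounds the Main Camera.
// Flips the mesh inside-out so the equirectangular texture
// (assigned to the sphere's material) is visible from within.
public class InsideOutSphere : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;

        // Reverse the triangle winding so the faces point inward...
        mesh.triangles = mesh.triangles.Reverse().ToArray();

        // ...and flip the normals to match.
        mesh.normals = mesh.normals.Select(n => -n).ToArray();
    }
}
```

With the viewer’s head pinned at the sphere’s center, the flat equirectangular image wraps back into a convincing surround.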

Much student ideation aims to simply recreate extant places

In our ideation session at the start of the evening, I noticed that many of the students’ suggestions aimed to simply recreate existing places they’d experienced. Suggestions included:

  • downtown city scene with hustle and bustle,
  • classroom with kids talking, teacher talking, people throwing papers, etc.,
  • a basketball game,
  • recreating a group field trip to Detroit,
  • kids in a schoolyard, and
  • a swim meet, track meet, or other big public event.

The purely transportative ideation is interesting: it suggests to me that it’s difficult for students to see beyond the literal use of the device to make a record of a place. I tried to push them to instead imagine it as a tool that can be used to make a mystery for the viewer to solve (look around the room for clues), or perhaps a study of an internal monologue (the original look-around-my-bedroom-and-hear-my-thoughts idea).

A few different stray ideas, however, diverged from this

Several suggestions didn’t fit neatly into the place-recreation trope, and I encouraged students to expand on these when they spoke about them.

One student suggested a memory game: photograph a room with many people in it, and then take a second photograph with some of them missing, or objects rearranged, etc. The viewer sees the first and then the second, and must try to recall what is missing or changed. I think this is an interesting idea because memorizing an entire environment is much harder than memorizing a fixed-frame image, and I think the experience would be pretty fun and challenging.

Another student, one of the older ones, came prepared with a drawing he’d carefully made on graph paper. He had an idea for a scene involving the entire group: the viewer starts by looking at a wall of motivational posters, then hears the Brothers Keepers saying a few motivational sentences in unison. A computer voice asks the viewer to turn around, and all of the Brothers Keepers are standing behind where the viewer had been looking; when the viewer faces the group, a recording plays of them reciting a sort of group creed in solidarity.

Synthesis

As discussion progressed over the course of about a half hour, I realized that perhaps the best way forward was to make separate scenes that would then be stitched together into one sequence. The viewer would start in one scene and then, by signaling that they were ready to move on (perhaps by staring at a doorway or portal, perhaps by some other mechanism), proceed to the next, until they had completed the whole “journey.” We’ve got four scenes, each led by an older BK student. In order, they are:

  1. Pledge, Prayer, Proverb (this is their typical opening sequence at a weekly meeting)
  2. Group discussions (groups arrayed around the room; when the viewer looks at one of the groups directly, an audio recording of an answer to an interesting question plays)
  3. Brothers Keepers group (this is the group creed scene described above)
  4. Gym/pool scene (probably will just be a fun free-for-all)
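The gaze-dwell “portal” idea could look roughly like this in Unity C# — the scene names and dwell time are placeholders of mine, not settled project code:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Attach to a "doorway" object with a collider. When the viewer's
// gaze (a ray from the camera) rests on it long enough, load the
// next Unity scene in the journey.
public class GazePortal : MonoBehaviour
{
    public string nextSceneName;     // placeholder: set per scene
    public float dwellSeconds = 2f;  // how long the gaze must rest

    private float gazeTimer = 0f;

    void Update()
    {
        Ray gaze = new Ray(Camera.main.transform.position,
                           Camera.main.transform.forward);
        RaycastHit hit;

        // Is the center of the viewer's gaze on this portal?
        if (Physics.Raycast(gaze, out hit) && hit.transform == transform)
        {
            gazeTimer += Time.deltaTime;
            if (gazeTimer >= dwellSeconds)
                SceneManager.LoadScene(nextSceneName);
        }
        else
        {
            gazeTimer = 0f;  // reset when the gaze wanders off
        }
    }
}
```

The dwell timer matters because Cardboard-style viewers have no controller: staring is the only input we can count on.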

Next week we’re planning to film all of this. Looking forward!


April 1st update

The one and only John Choi helped me again. I’d been trying to get the C# script to trigger only one sound when the user was looking at an object, but I didn’t understand a script’s scope (children vs. same-level objects). Here’s the new script, which plays only one sound at a time (as is appropriate):
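The gist of the fix is to silence every other audio source before starting the one being gazed at, rather than letting each source toggle independently. A minimal sketch of that approach (component names are my own placeholders, not the exact project code):

```csharp
using UnityEngine;

// Attach to an object with an AudioSource and a collider. When the
// viewer gazes at the object, its clip plays after all other sources
// are silenced, so only one sound plays at a time.
public class GazeAudioTrigger : MonoBehaviour
{
    private AudioSource source;

    void Start()
    {
        source = GetComponent<AudioSource>();
    }

    void Update()
    {
        Ray gaze = new Ray(Camera.main.transform.position,
                           Camera.main.transform.forward);
        RaycastHit hit;

        if (Physics.Raycast(gaze, out hit)
            && hit.transform == transform && !source.isPlaying)
        {
            // Stop every other source in the scene first.
            foreach (AudioSource other in FindObjectsOfType<AudioSource>())
                if (other != source) other.Stop();

            source.Play();
        }
    }
}
```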

I’ll be going for my second session with the Brothers Keepers students Wednesday 4/6 and am looking forward!


March 30th update

Had a great ~1.5 hour meeting with the Brothers Keepers group at Hosanna House in Wilkinsburg tonight. There were 10 young men present, ages roughly 13–17, and many of them were really interested in the 360º capture camera and the possibility of making their own VR environments. We stayed after the meeting was supposed to end because of their enthusiasm and all the possibilities they kept asking about.

In demonstrating the workflow with the camera, I took an image from the center of the room (pasted below in its native equirectangular format), and after demonstrating how I used Unity to load that image onto the phone, I passed the phone around, loaded into the ViewMaster. One student, after taking the goggles away from his eyes, said, “It’s like being in another world!” I think they’re understanding the transportative power of the technology and getting excited about it, which was really the point of this first meeting.

HHlowres

Interested students will be coming to our next meeting with written ideas and sketches. Looking forward!

In other news, I’ve been able to install multiple sound targets into the Unity scene, but it’s got a bug I haven’t been able to squash yet: when any sound is triggered, all of them play simultaneously. Some unique identifiers may need to be added to the sound-playing script; in any case, I expect to have this fixed before our meeting next week at Hosanna House so I can show students a fully fledged, operational version of the software.


March 28th update

I fabricated a simple, very thin monopod to support the Theta camera from spare parts lying around Codelab. I was planning to weld 1/4″ rod together to make a tall, thin monopod with a very small (~6″ radius) metal tripod for a base, and also to weld a small 1/4″-20 bolt to the top of the rod for the camera to screw into. However, I found out it will take until at least the end of this week to get access to a welding machine on campus.

Then I realized that I could simply thread the end of the 1/4″ rod I had with a die, and voilà: a very long 1/4″-20 support that the Theta would securely screw into. I drilled a hole in the center of a dense block of hardwood to serve as the base, then sawed the long rod in half, threaded the middle-facing ends, and joined them with a coupling nut so the assembly can be easily disassembled and transported.

1/4″ steel rod pieces, with the threading die and tool and hacksaw. I used the nice vises in the architecture wood shop to secure the work.

My slim monopod, which separates into three parts (two rods and the base) for easy transportation. Note the coupling nut towards the middle.

I took a few pictures in my room using the setup and it did the job pretty well. Further modification is needed, but for now, it’s a serviceable monopod for my purposes.

Here’s the equirectangular version of the picture I took:

rzRoomDownsample

You can’t tell without projecting this onto a sphere, but the monopod is essentially invisible save for a small wooden square directly at the south pole of the image (which is fine).

In other news, today I confirmed I’ll be meeting with Brothers Keepers students on Wednesday night in Wilkinsburg, so I’m looking forward to that.


March 25th update

I owe an enormous debt of gratitude to my friend and classmate John Choi, who really knows his way around Unity. John walked me through the somewhat tortuous process of assembling the ingredients of an equirectangular image, sound clips, and scripting necessary to make a completely functioning interactive VR experience that runs on a compatible cell phone.

It’s weeks overdue, but now that that technical hurdle is cleared, I’m very much looking forward to meeting prospective student participants through the Hosanna House connection this week, and showing them a few sample projects. (I’ll need to produce these, but with the technical pieces in place it should take only about a half hour to actually make the sausage.)

I also completed a short and sweet pitch slideshow to use for recruiting:


March 24th update

It took about nine days after the original contact to get in touch with a program manager from Hosanna House, but that’s just how organizational contacts flow. We connected today for a very productive 30-minute phone call about the program, and he is interested. It’s a question of scheduling and logistics now; he’ll be getting back to me in the next few days about working with students in the Brothers Keepers program. (They typically meet once a week on Wednesdays, but are not meeting this coming week. We’re working on scheduling an extra meeting so we can start moving forward without having to wait until the next regular meeting on Wednesday, April 6th.)


March 22nd update

I’ve done more looking into scripting Unity to streamline the production of a wraparound VR experience that includes sound files. Ideally, the final product would be as simple as a drag-and-drop application or add-on that would permit anybody to easily assemble a See My World–style experience with their own captured images and sound. However, as far as I’ve gathered, all of Unity’s scripting facilities are built to operate inside the game environment with assets that are already locally loaded. I don’t think the scripting is set up to allow creating a new game from the ground up using assets loaded externally by script. It’s still possible, of course, to use an external scripting system (on a Mac, Automator) to simplify the game-building process, but this could be quite unwieldy given the relative complexity of assembling a game.


March 18th update

Having had a difficult time maintaining communication with the original organization I wanted to work with, I am now hoping to work with a second: Hosanna House. I have been in touch with their Executive Director this week and am hoping to attend a session of their Brothers Keepers program next Wednesday night.

Also met with Jessica P. and Alex at CREATE Lab to continue our discussion about the possibility of integrating my findings into their Fluency project. We talked about different cameras that might be used, as well as different media hosting services for web-based access to the final product, like Littlstar (CREATE has used this service in the past) or just YouTube’s 360º video feature. Jessica P. offered to connect me with some ETC students who had done a prior project using Oculus and Unity to develop work that encouraged viewers to explore racial prejudices. Unfortunately, three emails later, I’ve not heard back from the faculty member in question, but I’m going to keep pushing!

Ali notes:

  • Get the Samsung GearVR environment working (uninstall all, use IDeATe laptop, ….)
  • Secure 1-3 students (ages 14-18): Ali will connect you to Herman and Jasiri
  • Create mock-up (with actual 360º still and video recordings) of the process and product
  • Create formal 5-presentation recruitment presentation

February 19th, 2016 presentation


One-sentence project summary: The See My World project aims to help young people living in the city share what their world looks like, using new 360º capture cameras and immersion devices.
