Final Project Milestone 3 – Mauricio Contreras

Assignment, Final Project, Robotics, Submission, Technique — mauricio.contreras @ 10:26 pm

My original third milestone had to do with connecting a haptic feedback controller to the simulation of robotic motion, which by this time had turned into real motion. The device chosen, as described before, is a smartphone, since it provides both an IMU and a vibrator behind a standard, well-proven programming API.

Limitations of standalone IMU motion tracking

My original intent was to use the device's IMU to track position and orientation, each in 3 axes, effectively providing 6 degrees of freedom. I am after a very natural gestural interaction that mimics one's own hand orientation and position in space, to be imitated by the robot changing its own head position and orientation. My assumption was that standalone positioning based on integrating the accelerometer's readings twice must have been solved by now, and I started searching for code.

Yet, to my surprise, this is not the case, and the constraint lies mainly in the double integration: any constant error (bias) in the accelerometer reading integrates into a velocity error that grows linearly with time, and the second integration turns that into a position error that grows quadratically with time. The drift of most algorithms (at least the ones available on the web) is on the order of tens of centimeters per second, which is completely unusable for the application in mind. Orientation is a completely different story, because only one integration is needed, and at any given time there are two reference vectors against which to correct: gravity and magnetic north. To sum up, whereas one can get very accurate orientation from an IMU, standalone linear positioning is still very much a work in progress, and the underlying reasons are physical more than technological.
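To put numbers on that drift, here is a minimal sketch (with made-up but plausible values, not measurements from my phone) that double-integrates a small constant accelerometer bias:

```java
// Minimal sketch: how fast position drift grows when a constant accelerometer
// bias is integrated twice. Bias and sample rate are hypothetical.
public class DriftDemo {
    public static void main(String[] args) {
        double bias = 0.05;   // residual bias in m/s^2 after calibration (assumed)
        double dt   = 0.01;   // 100 Hz sample rate
        double v = 0.0, x = 0.0;

        for (int i = 1; i <= 1000; i++) {   // simulate 10 seconds
            v += bias * dt;                 // first integration: velocity error grows linearly
            x += v * dt;                    // second integration: position error grows quadratically
            if (i % 100 == 0) {
                System.out.printf("t = %4.1f s  drift = %6.3f m%n", i * dt, x);
            }
        }
        // After 10 s this tiny 0.05 m/s^2 bias has already drifted ~2.5 m (x ≈ ½·b·t²),
        // which is why standalone IMU positioning is unusable for this application.
    }
}
```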

This immediately cut 3 degrees of freedom from my ideal application, and the most important ones at that (the assumption being that one can probably use tricks to change the head's orientation but should spend real degrees of freedom on its position, rather than the other way round; this is pure intuition though). I faced the decision of switching to visual tracking or keeping the IMU, now for orientation only. Even though Kinect-based motion tracking seems pretty plug-and-play these days, I had no previous experience with it, and I decided the semester was too far along to risk a setback like not being able to show anything functional at the end, whereas I was already somewhat acquainted with the smartphone workflow I had developed. I decided to stay on this path.

Orientation based linear motion control, first tests

I devised a TCP/IP socket based client (Android smartphone) – server (robot controller) application. It uses the smartphone's orientation (a software sensor provided by Android, based on fusing the raw readings from the accelerometer, gyroscope and magnetometer/compass) in each axis to generate steps that offset the robot's head position along each axis.
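As a rough illustration of the client side, here is a minimal Android sketch; the host address, port and the one-comma-separated-sample-per-line message format are placeholders rather than the exact values I used.

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;

import java.io.IOException;
import java.io.PrintWriter;
import java.net.Socket;

public class OrientationClient extends Activity implements SensorEventListener {
    private SensorManager sensorManager;
    private volatile PrintWriter out;   // stream to the robot controller, set once connected

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Run sensor callbacks on a background thread so socket writes stay off the UI thread.
        HandlerThread sensorThread = new HandlerThread("sensor-net");
        sensorThread.start();

        // Android's fused "rotation vector" software sensor combines accelerometer,
        // gyroscope and magnetometer readings into a single orientation estimate.
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        Sensor rotation = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        sensorManager.registerListener(this, rotation, SensorManager.SENSOR_DELAY_GAME,
                new Handler(sensorThread.getLooper()));

        // Connect to the server; host and port are placeholders.
        new Thread(new Runnable() {
            @Override
            public void run() {
                try {
                    Socket socket = new Socket("192.168.1.10", 5000);
                    out = new PrintWriter(socket.getOutputStream(), true);
                } catch (IOException e) {
                    Log.e("OrientationClient", "could not reach robot controller", e);
                }
            }
        }).start();
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float[] rotationMatrix = new float[9];
        float[] orientation = new float[3];   // azimuth, pitch, roll in radians
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);

        // One sample per line; samples before the connection is up are simply dropped.
        if (out != null) {
            out.println(orientation[0] + "," + orientation[1] + "," + orientation[2]);
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* unused */ }
}
```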

The resulting motion was pretty much just as choppy as what I had obtained with hardcoded targets, which left me with a feeling of disappointment. See the video below, and please notice how unnatural this piecewise movement feels.

This piecewise motion is not related to the smartphone input, but to the way the robot is controlled. Through these trials, through becoming acquainted with people who have done extensive work with the robots (Mike Jeffers, Madeline Gannon, Zack Jacobson-Weaver, Ali Momeni, Jaremy Ficca, Josh Bard, Kevyn McPhail, up to then), and through online research, I came to learn that ABB robots at least, being developed for industrial use, are aimed at rigid precision. This means motion commands are target-based and are not meant to be interrupted mid-move, yet interrupting them in real time is exactly what responsive gestural control requires. The sketch below illustrates the pattern that produces the piecewise motion; the next and final milestone shows how I have dealt with this limitation.
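To make the limitation concrete, here is a minimal sketch of the server-side loop under my own assumptions about message format, dead zone and step size; moveToTarget() is a placeholder for whatever blocking, target-based move command the controller exposes, not actual ABB code.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.Socket;

public class OrientationServer {
    static final double STEP_MM = 5.0;     // hypothetical step size per accepted sample
    static final double DEAD_ZONE = 0.2;   // radians of tilt before a step is taken

    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(5000);
             Socket client = server.accept();
             BufferedReader in = new BufferedReader(new InputStreamReader(client.getInputStream()))) {

            double x = 0, y = 0, z = 0;    // accumulated head-position offsets
            double[] ref = null;           // first sample becomes the neutral reference pose
            String line;
            while ((line = in.readLine()) != null) {
                String[] parts = line.split(",");
                double az    = Double.parseDouble(parts[0]);
                double pitch = Double.parseDouble(parts[1]);
                double roll  = Double.parseDouble(parts[2]);
                if (ref == null) { ref = new double[] { az, pitch, roll }; continue; }

                // Tilting the phone past a dead zone nudges the head along that axis.
                if (Math.abs(pitch - ref[1]) > DEAD_ZONE) x += STEP_MM * Math.signum(pitch - ref[1]);
                if (Math.abs(roll - ref[2])  > DEAD_ZONE) y += STEP_MM * Math.signum(roll - ref[2]);
                if (Math.abs(az - ref[0])    > DEAD_ZONE) z += STEP_MM * Math.signum(az - ref[0]);

                // Each sample becomes its own move target. The call only returns once the
                // robot has reached that target, so samples pile up and the head moves in
                // visible start-stop segments instead of one continuous gesture.
                moveToTarget(x, y, z);
            }
        }
    }

    // Placeholder for the controller's blocking, target-based move command.
    static void moveToTarget(double x, double y, double z) { }
}
```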
