Final Project Milestone 2 – Mauricio Contreras

Assignment, Final Project, Robotics, Submission, Technique — mauricio.contreras @ 10:02 pm

Log

My second milestone was about simulating the motion of a robotic arm within the software workflow explored in the first milestone (Rhino + Grasshopper + HAL). Upon getting acquainted with the capabilities of these pieces of software and understanding more about the possible constraints and needs of the instrument, I realized that REAL TIME driving of the robotic arm was a major requirement. Just imagine sculpting with your arm moving a few seconds behind your intended movement and you'll see why. The workflow described above is great for offline materialization of 3D designs, but not necessarily for real time control. Even though real time control is feasible, people from the lab commented about possible lag issues, which made me want to try out the real motion of the robot, even with simple commands, as soon as possible.

Procuring the tools to run on my own machine proved rather difficult: all of them are Windows only, so at first I got a virtual machine from Ali Momeni with everything preloaded, but it ran excruciatingly slowly (even after I switched to the latest MacBook Pro). I then tried creating my own virtual machine from scratch and installed Rhino and Grasshopper successfully, yet HAL's developer webpage was down and I had trouble finding training material for it. When I asked for help with this, people recommended learning the former two tools first and then moving on to HAL. That seemed reasonable, but I was under time constraints (by choice): I wanted to test the robot's motion as soon as possible with the configuration that would generate the least lag, and evaluate whether that optimal setup would prove responsive enough for the instrument's target application, sculpting.

Early motion tests

I then turned to writing my own RAPID code and was quickly able to put together a routine that moves the head of the robot in a square in the air, as shown in the following video.
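For reference, here is a minimal RAPID sketch of the kind of routine I mean: a square traced as a series of small linear offsets from a taught start target, with each step gated on a digital input (the reasoning behind that gating is explained below). The target values, the signal name di1, the step size, and the speed data are placeholders for illustration, not the actual code:

    MODULE SquareTest
        ! Taught start target; the values below are placeholders.
        CONST robtarget pStart := [[600, 0, 800], [0, 0, 1, 0], [0, 0, 0, 0], [9E9, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST num side := 200;  ! square side length in mm (assumed)
        CONST num step := 20;   ! size of each small step in mm (assumed)

        PROC main()
            MoveL pStart, v100, fine, tool0;
            ! Trace a square in the XZ plane, one small offset at a time,
            ! waiting for the digital input to read 0 before each step.
            FOR i FROM 1 TO side/step DO
                WaitDI di1, 0;
                MoveL Offs(pStart, i*step, 0, 0), v100, fine, tool0;
            ENDFOR
            FOR i FROM 1 TO side/step DO
                WaitDI di1, 0;
                MoveL Offs(pStart, side, 0, i*step), v100, fine, tool0;
            ENDFOR
            FOR i FROM 1 TO side/step DO
                WaitDI di1, 0;
                MoveL Offs(pStart, side - i*step, 0, side), v100, fine, tool0;
            ENDFOR
            FOR i FROM 1 TO side/step DO
                WaitDI di1, 0;
                MoveL Offs(pStart, 0, 0, side - i*step), v100, fine, tool0;
            ENDFOR
        ENDPROC
    ENDMODULE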

The routine was based on offsetting the current location by small steps along each axis, but also waiting for a digital input state before each step. Since the robot accepts 24 V digital inputs, I would have had to use a power source or build a conversion circuit from a standard microcontroller's 5/3.3 V outputs. That is not difficult, but instead I assumed that the robot's digital inputs had pull-down resistors and simply made the routine wait for DI = 0 before each motion. Since the robot completed the shape, that assumption proved correct.

However, the motion looked "cut", as if the robot did a start-pause-restart at every step, rather than the seamless, continuous motion that would result either from very low lag in processing the digital input or from a different motion configuration (i.e. there may be other motion commands that produce less of a "cut" motion). I removed the wait-for-DI statements with no appreciable effect, so the motion commands themselves were the issue.

To see the effect of this when driving the robot with gestures, and building on previous motion code (FUTURE CNC LINK), I started writing a TCP/IP socket based client (Android smartphone) – server (robot controller) application, which will be outlined in the next milestone posting.
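Regarding the "cut" motion, one likely culprit, assuming the moves were programmed with fine stop points as in the sketch above: in RAPID, the zone data argument of a move instruction decides whether the robot stops exactly at each target (fine) or flies through it within a blending radius (z10, z50, etc.). Switching to a fly-by zone should blend consecutive steps into a continuous path, though I haven't verified this on our controller yet:

    ! fine forces a complete stop at every target: start-pause-restart
    MoveL Offs(pStart, i*step, 0, 0), v100, fine, tool0;
    ! a fly-by zone blends through the target within a 10 mm radius
    MoveL Offs(pStart, i*step, 0, 0), v100, z10, tool0;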
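As a preview of that client–server application, a rough sketch of the server side running on the robot controller might look as follows. The RAPID socket instructions require the controller's PC Interface option, and the IP address, port, and message handling below are placeholders rather than the actual application:

    MODULE GestureServer
        VAR socketdev server_socket;
        VAR socketdev client_socket;
        VAR string msg;

        PROC main()
            SocketCreate server_socket;
            SocketBind server_socket, "192.168.125.1", 1025;
            SocketListen server_socket;
            ! Block until the Android client connects.
            SocketAccept server_socket, client_socket;
            WHILE TRUE DO
                ! Each message would encode a small offset derived from a gesture.
                SocketReceive client_socket \Str:=msg;
                ! ...parse msg into x/y/z offsets and issue a MoveL here...
                SocketSend client_socket \Str:="ack";
            ENDWHILE
        ERROR
            SocketClose client_socket;
            SocketClose server_socket;
        ENDPROC
    ENDMODULE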

Assessment

Up to this milestone, just getting access to the robots themselves and being able to move them in a hardcoded fashion is a success in itself, yet it is clear that new, unforeseen difficulties have appeared.

