Friday, February 11, 2011

Introducing Maxwell, Part 2

Brief video of Maxwell driving, moving his head, and moving his arm. Now onto touching up the IK and getting the Kinect calibrated to the body.



-Fergs

Sunday, February 6, 2011

Introducing Maxwell

I've been working hard to finish up (at least from a mechanical and electrical standpoint) my latest robot for a demo day tomorrow. I thought I would post a few details here.

Maxwell is my latest attempt at a low-cost, human-scale mobile manipulator using an ArbotiX and ROS. The design guidelines were pretty straightforward: it needed an arm that could manipulate things on a tabletop, a Kinect as the primary sensor on the head, and a mobile base that kept all that stuff upright. Additionally, I wanted the robot to be easy to transport and/or ship.

Maxwell sports a larger, 16x16" version of the Armadillo base, with motors that should support a 20lb payload at speeds up to 0.5m/s. Not shown in these images is a Hokuyo URG-04LX-UG01, which will be mounted on the base just in front of the column. The head has two AX-12 servos for pan and tilt. Eventually, the head will also include a shotgun microphone.

Maxwell's arm is constructed from 2 EX-106s (shoulder lift and elbow flex), 2 RX-64s (shoulder pan and wrist flex), and 3 AX-12s (one for wrist roll, and two to form a gripper). In all honesty, it only needs the one EX-106 in the shoulder, but I didn't have any brackets that fit the RX-64 on hand. The actual gripper fingers are temporary; I have much better ones in the works.

His central column consists of 3 sections of 8020 aluminum extrusion, allowing all pieces to break down into lengths under 20" long. This allows Maxwell to be disassembled and put into a reasonably sized Pelican case for shipping. It only takes about 5 minutes to break down or set up Maxwell, as you only have to loosen 6 screws. Eventually, I'll have nicely cut foam, but for now I've hastily packed Maxwell for his journey to the state capital tomorrow:

On the software side, Maxwell has been a driving force behind the development of the ArbotiX stack's v0.4.0 release. I'll be posting video shortly of smooth arm interpolation being done by the ArbotiX and ROS wrappers (I'm still fighting a bit with IK issues). In the meantime, here is a view of Maxwell's URDF-based robot model in RViz:

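The smooth interpolation mentioned above is conceptually simple: step each joint a fraction of the way toward its goal on every update, so all joints arrive at the same time. Here's a minimal Python sketch of that idea (the function name and interface are made up for illustration; this is not the actual ArbotiX firmware or ROS wrapper code):

```python
def interpolate_joints(start, goal, steps):
    """Linearly interpolate each joint from start to goal over `steps` updates.

    start, goal: lists of joint positions (radians); steps: number of updates.
    Returns the list of intermediate poses, ending exactly at the goal.
    """
    poses = []
    for i in range(1, steps + 1):
        t = i / float(steps)
        poses.append([s + (g - s) * t for s, g in zip(start, goal)])
    return poses

# Move a 2-joint arm from [0.0, 0.0] to [1.0, -0.5] in 4 steps
trajectory = interpolate_joints([0.0, 0.0], [1.0, -0.5], 4)
```

Running something like this at a fixed rate is what keeps all the servos moving together instead of each racing to its goal at full speed.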
All of the configuration and launch files are now working. However, I still need to work on a calibration routine for the Kinect positioning: there is a bit of error somewhere in the head that causes the point cloud not to align with the URDF model of the base when visualized in RViz (I unfortunately did not grab a screen capture of it; that will have to be a later post).
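To give a sense of what such a calibration would correct: a small tilt error in the head shows up as the whole point cloud rotated about the camera's X axis. A toy numpy sketch (the function and the example 2-degree error are made up for illustration; this is not the planned calibration routine):

```python
import numpy as np

def apply_tilt_correction(points, tilt_err):
    """Rotate a point cloud about the camera X axis by -tilt_err radians.

    points: Nx3 array of (x, y, z) points in the camera frame.
    tilt_err: estimated tilt error of the head, in radians.
    """
    c, s = np.cos(-tilt_err), np.sin(-tilt_err)
    R = np.array([[1.0, 0.0, 0.0],
                  [0.0,   c,  -s],
                  [0.0,   s,   c]])
    return points @ R.T

cloud = np.array([[0.0, 0.0, 1.0]])  # one point, 1 m straight ahead
corrected = apply_tilt_correction(cloud, np.deg2rad(2.0))
```

In practice the error could just as easily be a translation or a servo zero-offset, which is why a proper calibration routine beats eyeballing a fixed correction.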

So what will Maxwell be up to? His first task will hopefully be competing in the AAAI Small Scale Manipulation Challenge. Once arm IK is functioning and the Kinect->arm transformation has been corrected, we'll be working on recognizing and moving chess pieces. After the AAAI event, he'll probably get an upgrade for a vertical lift on the arm (similar to the one on Georgia Tech's EL-E), and possibly a conversion to two more anthropomorphic arms. Speaking of EL-E, I couldn't help but replicate that classic pose:

Another great pose:

And that's all for now...

-Fergs

Wednesday, February 2, 2011

Snow Day!

I've been a bit quiet on this blog lately -- it's been a busy start to the new semester. Today, however, the University closed due to inclement weather. So, I'll post a few updates on what I've been working on:

First, I just barely finished an entry in time for the ROS 3D contest. I had insisted on using the new OpenNI-based drivers, which had issues under 32-bit systems until about a week before the contest deadline. This was also my first attempt at using the Point Cloud Library (PCL) in ROS. My entry extends the ar_pose package to use depth data from the Kinect to improve the localization of the AR markers. It also uses surface normal estimation from PCL to improve the orientation. While it's probably not the most technically interesting entry, I definitely learned a bunch about PCL while working on it. You can read more about the entry here. Or watch the really horrible video (my normal camera stopped working about 12 hours before the deadline):



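For anyone curious how the surface normals help: PCL's normal estimation boils down to a least-squares plane fit over each point's depth neighborhood, taking the direction of least variance as the normal. A minimal numpy sketch of that idea (a toy illustration, not the entry's actual code or PCL's implementation):

```python
import numpy as np

def estimate_normal(neighbors):
    """Estimate a surface normal from a small patch of 3D points.

    Fits a plane by taking the eigenvector of the neighborhood covariance
    matrix with the smallest eigenvalue -- the least-squares plane-fit idea
    behind PCL-style normal estimation.
    """
    pts = np.asarray(neighbors, dtype=float)
    centered = pts - pts.mean(axis=0)
    cov = centered.T @ centered
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    normal = eigvecs[:, 0]                  # smallest-eigenvalue direction
    return normal / np.linalg.norm(normal)

# Points sampled from the z = 0 plane should give a normal along +/- z
patch = [[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0], [0.5, 0.5, 0]]
n = estimate_normal(patch)
```

A marker's normal estimated this way from real depth data is much more stable than an orientation recovered from the camera image alone, which is what makes it useful for fixing up ar_pose.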
In creating the entry, I also quickly constructed another robot. This guy used an iRobot Create, Kinect, and a tripod to get the Kinect up to the "Standard Social Robot Minimum Height" that we've been applying to all our robots at Albany:

This tripod worked surprisingly well -- far better than anything I've previously built out of conduit.

Finally, I'm working on putting together a new robot based on an EX-106/RX-64 arm, a Kinect, and a somewhat larger mobile base. Pictures and video shortly (his parts are stuck on UPS trucks somewhere in a snowstorm).

-Fergs