Friday, December 23, 2011

Navigating Apartments with ROS

While Maxwell tends to reside at the office most days, I've recently brought him home a few times. I'm now working towards some cool demos of mobile manipulation in a home environment, but the first step is of course to navigate that environment.

I've been trying for some time to use only the ASUS on the head; however, the thick carpeting and carpet/linoleum transitions really throw off my odometry at times. I finally dropped back and just used my low-cost Hokuyo. This evening I created a new map, without much effort at all, using gmapping and the Hokuyo:

Some preliminary tests show that my planner parameters will probably work well, even in the smaller environment (the map above is 10m x 10m). What is very apparent is that my laser doesn't pick up much of my furniture -- the larger room is actually mostly filled with tables at various heights, which show up only as stray points to the laser. Adding the ASUS for 3D obstacle avoidance should help quite a bit.
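For anyone wanting to reproduce this kind of map, the setup boils down to a pretty small launch file. The node and parameter names below are the standard hokuyo_node/gmapping ones, but the port, frame, and range values are assumptions for illustration rather than my exact configuration:

```xml
<launch>
  <!-- Low-cost Hokuyo publishing LaserScans on /scan -->
  <node name="hokuyo" pkg="hokuyo_node" type="hokuyo_node">
    <param name="port" value="/dev/ttyACM0" />
    <param name="frame_id" value="base_laser" />
  </node>

  <!-- Build the map from /scan plus odometry -->
  <node name="slam_gmapping" pkg="gmapping" type="slam_gmapping">
    <param name="odom_frame" value="odom" />
    <!-- the URG-04LX doesn't see far; keep maxUrange modest -->
    <param name="maxUrange" value="4.0" />
  </node>
</launch>
```

Drive the robot around slowly, then save the result using map_saver from the map_server package.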

-Fergs

Wednesday, December 21, 2011

Maxwell (Finally) Gets a Vertical Arm Lift

I've been saying for quite some time that I wanted to add a vertical lift to Maxwell. Earlier this month, he finally got that lift:


The linear actuator gives the arm 20" of additional vertical mobility, making it possible to grab objects at floor or table height, or hit most light switches. The lift is now integrated into ROS arm_navigation; I'll have some awesome videos soon.

In addition to the arm upgrades, Maxwell recently got a new ASUS Xtion Pro Live to replace his Kinect. The ASUS device requires only USB (no 12V power) and is significantly lighter, reducing neck and back stress (a common problem amongst aging robots). Speaking of aging robots, Maxwell turns one early next month.

-Fergs

Update Palooza

This blog went a bit quieter earlier this year. That is about to change.

First up is a series of awesome pictures of a new 3D-printed gripper for Maxwell. One of the cool tools that I have access to these days is a ProJet HD 3000 printer. Here are the grippers straight off the printer:
After removing the wax:
One side installed:
And a grasp:

Monday, August 15, 2011

Thursday, August 11, 2011

Maxwell Wins at AAAI

Maxwell won 1st place in the 2011 AAAI Small Scale Manipulation Challenge! Video and photos soon.

-Fergs

Friday, July 22, 2011

ROSSerial Released

My first project of the summer here at Willow is now released: rosserial. (It was actually released about 2 weeks ago.)

This is a new library for connecting the Arduino platform to ROS, allowing an Arduino to directly publish and subscribe to ROS messages. There are also demos ranging from controlling servos to using an Arduino and rxplot as an oscilloscope to reading temperature sensors into ROS.

In addition to supporting integration with the Arduino platform, the rosserial library provides a general point-to-point transport for ROS communication over serial, intended for hardware that cannot support the full ROS TCP/IP network stack. This library can be used to easily integrate a wide variety of low-cost hardware into ROS.
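To give a flavor of what such a serial transport involves, each message gets wrapped with sync bytes, a topic ID, a length, and a checksum before going over the wire. This is a simplified illustration of the idea, not rosserial's actual wire format (the real framing differs in detail):

```python
def checksum(data: bytes) -> int:
    """Simple complement checksum: sum of body plus checksum wraps to 0xFF."""
    return 255 - (sum(data) % 256)

def frame(topic_id: int, payload: bytes) -> bytes:
    """Wrap a serialized message for transport over a serial link."""
    sync = bytes([0xFF, 0xFF])                   # start-of-packet marker
    body = (topic_id.to_bytes(2, "little")
            + len(payload).to_bytes(2, "little")
            + payload)
    return sync + body + bytes([checksum(body)])

def unframe(packet: bytes):
    """Validate sync bytes and checksum; return (topic_id, payload)."""
    assert packet[:2] == b"\xff\xff", "lost sync"
    body, chk = packet[2:-1], packet[-1]
    assert checksum(body) == chk, "corrupt packet"
    topic_id = int.from_bytes(body[0:2], "little")
    length = int.from_bytes(body[2:4], "little")
    return topic_id, body[4:4 + length]
```

On the PC side, a bridge node unwraps frames like these and republishes the payloads as normal ROS messages, which is what lets tiny microcontrollers look like first-class ROS nodes.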

Debian packages are now available for diamondback and unstable (and shortly for electric).

See more at: http://www.ros.org/wiki/rosserial

-Fergs

Friday, July 1, 2011

A New Gripper for Mini Max

I was never satisfied with the simple gripper we had put on the Mini Maxwell prototypes. It took a bit of time to actually sit down and create the gripper I wanted, but I've now printed up a couple of prototypes (the final versions will be laser cut):
The design was done in Autodesk Inventor:
There is a simple mechanism for converting the motion of a single AX-12 into parallel motion for both jaws:
Videos of Mini Max manipulating things shortly.

-Fergs

Sunday, June 26, 2011

Introducing Mini Maxwell

While I am a strong proponent of low-cost, human-scale mobile manipulators, I also realize it's impractical for everyone to be playing with such large robots. Earlier this year I designed a small robot using an iRobot Create, some custom ABS, a Kinect, and an AX-12 based arm and neck. This small mobile manipulation platform had no name for some time, but every time I showed it to people they exclaimed "Oh! A mini version of Maxwell!". And so, this robot got the name "Mini Maxwell".


There are now about half a dozen Mini Maxes running around across the United States. As our software development is starting to calm down, I thought I would take a moment to showcase a few of the robot's features.

As you probably have guessed, this robot runs ROS (on a netbook). We're using an ArbotiX to control the arm and neck servos (7 AX-12s total), while the Create base connects directly to the netbook. Two demo buttons are attached through the ArbotiX allowing users to quickly add user input to programs.


The arm has 4 degrees of freedom, losing the wrist roll found in Maxwell. A single servo gripper is installed, although a more sophisticated one is in the works:


I demoed Mini Max earlier this week at the HBRC meeting, showing how we can easily move around toy blocks. I've posted a number of tutorials and other documentation on the ROS wiki as well.

-Fergs

Tuesday, May 24, 2011

New release of ar_kinect

I had a bit of downtime yesterday afternoon, which meant I finally had a chance to do some updates to the ar_kinect package. This package localizes augmented reality (AR) markers using the ARtoolkit -- but it has the special ability to get highly accurate pose information using the point cloud data from the Kinect. This was originally my entry into the ROS3D contest hosted by Willow Garage, and has found use in many labs.

The new version includes a number of updates. First and foremost, the algorithm for using the cloud data is completely new. Instead of estimating surface normals for orientation (a slow and somewhat unreliable process), the new ar_kinect package finds orientation and position at the same time by registering the four corners of the marker (as found by ARtoolkit) against an ideal marker. This is both faster and far more accurate.

The ar_kinect package is part of the albany_vision stack, and can be found in the albany-ros-pkg repository.

Friday, April 29, 2011

Navigation with Maxwell

Earlier today I gave a brief talk about our robotics program in the College-wide showcase. In preparing the talk I recorded a small video of Maxwell navigating, under voice control. This demo uses the ROS pocketsphinx wrappers that I previously developed to send a set of predefined waypoints to the move_base node:
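The pattern here is simple: pocketsphinx publishes each recognized phrase as a string, and a small node looks the phrase up in a table of predefined poses and ships the matching goal off to move_base. A sketch of just the lookup (the phrases, coordinates, and names below are made up for illustration; the actual ROS plumbing via actionlib is omitted):

```python
# Hypothetical waypoint table: recognized phrase -> (x, y, yaw) in the map frame.
WAYPOINTS = {
    "go to the kitchen": (3.2, 1.5, 0.0),
    "go to the desk":    (0.8, 4.1, 1.57),
    "go home":           (0.0, 0.0, 0.0),
}

def waypoint_for(utterance):
    """Map a recognized phrase to a goal pose, or None if unknown.

    In the real system the returned pose would be wrapped in a
    MoveBaseGoal and sent to the move_base node.
    """
    return WAYPOINTS.get(utterance.strip().lower())
```

The nice part of this design is that adding a new destination is just a new table entry plus a word in the speech corpus.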



I'm going to try and post some updates from RoboGames tonight (as usual, about 2-4 weeks after the event....)

-Fergs

Wednesday, April 20, 2011

National Robotics Week, Day 6 - Stanford Robot Block Party

Unfortunately, there was a small delay in my posting anything from out on the West Coast. This was for two reasons:
  1. The Internet connection at the hotel kinda sucked; this seems to be a recurring theme this month.
  2. I was out far too late most nights with the Mech Warfare crowd.
I did manage to make it to the Stanford Robot Block Party this year. Maxwell did as well, at least in pieces. The battery strap broke during shipment (into about 6 pieces), requiring that Maxwell be opened up for repairs:


Overall, the sliding battery didn't do much damage, breaking only the charge connector and yanking out a few wires. It took about 25 minutes to fix things, and then Maxwell took a tour around the facility.

We set up with the Home Brew Robotics Club guys, and next to Pi Robot:

(Photo from Palo Alto Patch article)

-Fergs

Thursday, April 14, 2011

National Robotics Week, Days 3 & 4

I spent Monday and Tuesday at the IEEE Technologies for Practical Robotics Applications (TePRA) conference. This is quite a different conference, with a lot of industry and military backing. Keynote speakers included funding managers from DARPA, the CEO of iRobot, the CEO of Vgo (a telepresence bot manufacturer), etc. If you haven't seen the Vgo yet, it's a ~$5k telepresence bot that is fairly stylish and lightweight:
Paper authors included the usual crowd of academics and industry experts, but also people from other areas, such as industrial design. One such person was Robert Antonuccio of Antonuccio Design, presenting the X9 Minion walker (as seen in Robot Magazine). While the robot is pretty cool looking, I was actually even more impressed with the incredible demo station he had set up. The day before demos, I had heard Robert talking with the organizers about getting set up the night before -- and when I saw the demo booth, it suddenly made sense why he needed lots of time to prep. This thing has to be a beast to haul around:
I don't have time to cover all of the interesting papers, but I thought I would highlight two. The first paper that really caught my eye was presented by Aaron Dollar of Yale, "Practical Aerial Grasping of Unstructured Objects". I'm not particularly interested in aerial platforms, but I am interested in grasping when your localization sucks.

Also present at the conference was Zachary Dodds of Harvey Mudd College. Dodds is probably the most energetic professor I have ever met, and he was presenting, quite enthusiastically, the work he and his students did last summer on texture- and machine-learning-based monocular ranging (a system they call PixelLaser), which might be of interest to readers. I believe I've come across the source in an online repository at some point. Here's a quick image from the abstract:
This presentation was RIGHT before my own (on the PML), and was a hard act to follow... Later this year, several students are going to be working to adapt this to the AR.Drone quadrotor platform (and probably moving to ROS as well).

Anyways, I board a plane to California in about 6 hours, to continue National Robotics Week from the other coast. I'll be stopping by the Stanford Robot Block Party tomorrow afternoon before continuing on to Robogames. If you're going to be in the area, swing by and check out Maxwell (and his lovely symposium poster) at Robogames.

Current tally for NRW: 2 events, 2 states, ~700mi of travel (by car).

-Fergs

Sunday, April 10, 2011

National Robotics Week, Day 2

Day 2 of National Robotics Week is over for me. I hit the Trinity Fire Fighting Contest again this afternoon before continuing on to Boston. The guys from Shepherd University managed a third place in the Senior division, a real improvement over where they were just two weeks ago at the ShepRobo Fest. Congrats, guys! We also got Seth's mech walking.

I've now made it to Boston for the TePRA conference. Interestingly, they are apparently shutting off power to the hotel in a few hours.... although it is supposed to be back in the morning. Upside: I got a free flashlight. Yay.

In other robot news, the BSA has announced the official requirements for the long awaited Robotics merit badge.

-Fergs

Saturday, April 9, 2011

National Robotics Week, Day 1

It's National Robotics Week!

Today I ventured down to the Trinity Fire Fighting Home Robot Contest at Trinity College in Hartford, CT. This is something like the 17th or 18th contest (they've actually stopped numbering them as such... I think I won the 15th with Crater though). There are two main contests: the Fire Fighting contest, in which robots autonomously navigate an 8x8' maze laid out like a small house and put out a fire (a candle), and a new RoboWaiter contest, in which robots have to fetch a small plate and move it to a table in a similarly sized arena. It seems that the RoboWaiter contest is finally catching on, as they had about 20 entries this year.

Maxwell also made the trip, and moved around a few chess pieces in between lots of question answering about ROS and developing larger mobile manipulators:



Seth (sthmck over at TRC) is pretending to be diligently at work on his mech in the above photo. Earlier in the day the Mech got an upgrade to a new ArbotiX2 prototype:



I actually spent so much time talking to people that I forgot to walk around and get many photos of the actual fire-fighting robots... but I did snap a shot of this bot from NYU-Poly, which has a very western-shoot-em-up fire-fighter look going on:



Finally, as I packed Maxwell up for transit to California, I snapped a few photos of his new foam inserts, and how nicely he is tucked into the Pelican case:


On top of the arm/neck/riser layer of foam is a nearly full sheet which supports the mobile base and holds the Kinect in place:


Maxwell won't be making the trip back down tomorrow, he's packed and ready to go to the Robot Block Party and RoboGames.

-Fergs

Thursday, April 7, 2011

National Robotics Week

Next week is National Robotics Week and is going to be quite a busy week for me. Maxwell will actually be touring the country for NRW this year, making appearances at the following events:
Additionally, I'll be presenting a paper on a separate topic at the IEEE TePRA conference on Monday and Tuesday the 11th/12th, while Maxwell is in transit across the country.

I'll be trying to upload photos/recap as the week progresses.

-Fergs

Sunday, April 3, 2011

Localizing the Chess Board

The AAAI Small Scale Manipulation challenge is just a few months away now. I've been working on a slightly different approach to chess board localization over the past few days. We had previously been using a Canny edge detector, followed by OpenCV's probabilistic Hough transform, to find lines. From this, we iterated over hypotheses about which detected lines corresponded to particular lines on the board.

Recently, I had a different thought: forget the lines, let's look at points alone. I'm now finding the intersections of the lines, projecting those points to 3D using the point cloud, and then running ICP against an ideal set of intersections:


You can see fairly large red spheres inserted where each of the detected intersections is, and a TF frame being localized into the lower corner of the board.

This is working OK so far. One issue is that the ICP occasionally decides it has converged when it is actually quite far off. I think the next step will be creating a different method for finding the correspondence hypotheses.
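For reference, the intersection-finding step is just a little linear algebra per line pair. The probabilistic Hough transform returns each line as a pair of endpoints; treating those as infinite lines, two of them cross where the standard cross-product formula says. A minimal sketch (not the actual competition code):

```python
def line_intersection(seg_a, seg_b):
    """Intersect two lines given as ((x1, y1), (x2, y2)) endpoint pairs.

    Treats each Hough segment as an infinite line. Returns (x, y),
    or None if the lines are (near-)parallel.
    """
    (x1, y1), (x2, y2) = seg_a
    (x3, y3), (x4, y4) = seg_b
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(denom) < 1e-12:
        return None                      # parallel lines: no corner here
    a = x1 * y2 - y1 * x2                # cross product of line A's endpoints
    b = x3 * y4 - y3 * x4                # cross product of line B's endpoints
    x = (a * (x3 - x4) - (x1 - x2) * b) / denom
    y = (a * (y3 - y4) - (y1 - y2) * b) / denom
    return x, y
```

Running this over every roughly perpendicular pair of detected lines yields the candidate corner points that then get projected into 3D for ICP.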

-Fergs

Tuesday, March 29, 2011

Maxwell Moves A Chess Piece

Earlier today we got Maxwell moving chess pieces around a bit. The gripper is still going to need some improvements for a better grasp, and the gripper control software needs some work, but:



Should be a bit more progress later this week.

-Fergs

A Look Inside Maxwell

Recently, I rebuilt Maxwell's base using a new design which allows the front plate to drop down for easy access to the ArbotiX2 prototype being used in Maxwell:



Here you can see the ArbotiX2 with a dual 30A motor driver board. On the left is a step-up/down regulator for the Kinect; it takes 9-18VDC in and outputs 12VDC @ 1.25A. The back area of the base is pretty much unchanged, housing the SLA battery and power terminals as seen in the initial post about Maxwell.

-Fergs

Wednesday, March 23, 2011

New Mech Warfare Micro Transponders

The new micro transponders are finally assembled and tested and heading out the door today!



-Fergs

Monday, March 21, 2011

Maxwell Gets an Emergency Stop

I had ordered the emergency stop when I bought the first round of parts for Maxwell, but I hadn't actually designed a location on the robot to attach it to. This weekend I rebuilt a number of components on Maxwell, installing a new base, upgrading the torso wiring harnesses, and installing the E-stop (seen behind the arm, protected by the torso column):


The E-stop is from Digikey and is the type that latches when pressed, such that you need to twist it to release it and restart the current flow.

Maxwell's ArbotiX is powered off the USB connection, allowing all 12V power to run through the E-Stop. Pressing the E-Stop then stops all the motors and servos, while the ArbotiX continues to run and interact with the PC. On releasing the E-stop, everything comes back to life nicely.

-Fergs

Sunday, March 20, 2011

Sensors Hate Dark Surfaces

Previously I've mentioned my issues with the low-cost Hokuyo URG-04LX-UG01, even going so far as to say that the laser in the Neato XV-11 robot was better. One of the major differences between Maxwell and his little Armadillo cousin is that on Maxwell the laser is mounted about 6" off the ground, versus 2" on the Armadillo. Interestingly, this has had a huge impact on performance -- our walls here in the lab are mostly white, and the laser works much better on the walls than on the black base trim:



I've also found that the same black paint is on our doors, and it wreaks havoc on the Kinect. The image below shows the RGB point cloud -- it's a Sunday and both doors are shut, yet they are invisible:



Luckily, it looks like the laser picks up the doors when close, so I won't have to worry about Maxwell driving through an invisible doorway.

-Fergs

Saturday, March 19, 2011

Mapping with Maxwell

I've spent a bit of time tuning the PID and cleaning up the control and feedback from the motion system on Maxwell lately. This afternoon I collected a very nice dataset and map of our lab at ILS:



As has been seen before, gmapping "shortened" the hallways (that door at the end of the hallway on the left should be about 2 meters farther down the hallway).

-Fergs

Friday, February 11, 2011

Introducing Maxwell, Part 2

Brief video of Maxwell driving, moving his head, and moving his arm. Now onto touching up the IK and getting the Kinect calibrated to the body.



-Fergs

Sunday, February 6, 2011

Introducing Maxwell

I've been working hard to finish up (at least from a mechanical and electrical standpoint) my latest robot for a demo day tomorrow. I thought I would post a few details here.

Maxwell is my latest attempt at a low-cost, human-scale mobile manipulator using an ArbotiX and ROS. The design guidelines were pretty straightforward: it needed an arm that could manipulate things on a table top, a Kinect as the primary sensor on the head, and a mobile base that kept all that stuff upright. Additionally, I wanted the robot to be easy to transport and/or ship.

Maxwell sports a larger, 16x16" version of the Armadillo base, with motors that should support a 20lb payload at speeds up to 0.5m/s. Not shown in these images is a Hokuyo URG-04LX-UG01, which will be mounted on the base just in front of the column. The head has two AX-12 servos for pan and tilt. Eventually, the head will also include a shotgun microphone.

Maxwell's arm is constructed from 2 EX-106s (shoulder lift and elbow flex), 2 RX-64s (shoulder pan and wrist flex), and 3 AX-12s (one for wrist roll, and two to form a gripper). In all honesty, it only needs the one EX-106 in the shoulder, but I didn't have any brackets that fit the RX-64 on hand. The actual gripper fingers are temporary; I have much better ones in the works.

His central column consists of 3 sections of 8020 aluminum extrusion, breaking all pieces down into lengths under 20" long. This allows Maxwell to be disassembled and put into a reasonably sized Pelican case for shipping. It only takes about 5 minutes to break down or set up Maxwell, as you only have to loosen 6 screws. Eventually I'll have nicely cut foam, but for now I've hastily packed Maxwell for his journey to the state capital tomorrow:

On the software side, Maxwell has been a driving force behind development for the ArbotiX stack v0.4.0 release. I'll be posting video shortly of smooth arm interpolation being done by the ArbotiX and ROS wrappers (I'm still fighting a bit with IK issues); in the meantime, here is a view of Maxwell's URDF-based robot model in RViz:

All of the configuration and launch files are now working; however, I need to work on a calibration routine for the Kinect positioning. There is a bit of error somewhere in the head that causes the point cloud not to align with the URDF model of the base when visualized in RViz (I unfortunately did not grab a screen capture of it; that will have to be a later post).
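As for the smooth arm interpolation mentioned above, the core idea is just to step every joint toward its goal so that all joints arrive at the same time, then execute each intermediate waypoint. A minimal sketch of that idea (not the actual ArbotiX firmware or wrapper code):

```python
def interpolate_joints(start, goal, steps):
    """Yield joint-space waypoints moving all joints from start to goal together.

    start, goal: lists of joint angles (radians). Every joint reaches its
    goal on the final step, so joints with farther to travel simply take
    bigger per-step increments.
    """
    for i in range(1, steps + 1):
        t = i / steps
        yield [s + (g - s) * t for s, g in zip(start, goal)]
```

In practice the number of steps would be chosen from the largest joint displacement and a per-joint speed limit, so the slowest joint sets the pace.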

So what will Maxwell be up to? His first task will hopefully be competing in the AAAI Small Scale Manipulation Challenge. Once arm IK is functioning and the kinect->arm transformation has been corrected, we'll be working on recognizing and moving chess pieces. After the AAAI event, he'll probably get an upgrade for a vertical lift on the arm (similar to the one on Georgia Tech's EL-E) and possibly a conversion to two more anthropomorphic arms. Speaking of EL-E, I couldn't help but replicate that classic pose:

Another great pose:

And that's all for now...

-Fergs

Wednesday, February 2, 2011

Snow Day!

I've been a bit quiet on this blog lately -- it's been a busy start to the new semester. Today, however, the University closed due to inclement weather. So, I'll post a few updates on what I've been working on:

First, I just barely finished an entry in time for the ROS 3D contest. I had insisted on using the new OpenNI-based drivers, which had some issues under 32-bit systems until about a week before the contest deadline. This was also my first attempt at using the ROS Point Cloud Library (PCL). My entry improves the ar_pose package to use depth data from the Kinect to improve the localization of the AR markers. It also uses surface normal calculation from PCL to improve the orientation. While it's probably not the most technically interesting entry, I definitely did learn a bunch about PCL while working on it. You can read more about the entry here. Or watch the really horrible video (my normal camera stopped working about 12 hours before the deadline):



In creating the entry, I also quickly constructed another robot. This guy used an iRobot Create, Kinect, and a tripod to get the Kinect up to the "Standard Social Robot Minimum Height" that we've been applying to all our robots at Albany:

This tripod worked surprisingly well -- far better than anything I've previously built out of conduit.

Finally, I'm working on putting together a new robot based on an EX-106/RX-64 arm, a Kinect, and a somewhat larger mobile base... pictures and video shortly. (His parts are stuck on UPS trucks somewhere in a snow storm.)

-Fergs