Jonathan Beall - Robin Schriebman - Whitney Buchanan
We started the semester with just a Barbie Jammin' Jeep from Walmart. It had its Jammin' radio, two DC motors, a steering wheel, and speed control at the resolution of slow, fast, and backward, selected with a lever over a couple of toggle switches.
The first thing we knew we needed to do was get moving and get steering. There were multiple options for each of these problems. We started with the moving problem. Before we got a motor controller, we built a servo with a Lego actuator to automatically switch between the jeep's built-in speed settings of slow, fast, and backward. Ideally, though, we wanted a motor controller. We needed one that could deliver between 0 and 12 volts, but more importantly, testing while driving the jeep around showed that we needed a motor controller that could handle 30 A. That's quite a lot of current. We acquired an HB-25 motor controller. This controller, however, did not allow us to sweep the speed, so we had forward fast, backward fast, and stop, but nothing in between. We then acquired an MC-7 motor controller. We ran into some trouble wiring it up; in the end we were successful, but in the process we burned our jeep and blew a half-inch trace on the MC-7 board. We were able to repair the trace with some stranded wire and solder. The MC-7 board gave us much finer resolution of the speed control and is what the final robot uses to control its speed. The MC-7 controller is screwed into the back vertical wall of the jeep under the seats.
In order to solve the problem of steering, we started with a couple of bike gears, a bike chain, and a really hefty servo. A large gear was attached to a sheet metal base and then screwed onto our steering wheel. A smaller gear was affixed to a HiTec HS-805 servo, the beefiest servo we could find in terms of torque. The idea was to couple the steering wheel and the servo with the bike chain and then control the steering with the servo, an approach based on a successful implementation by the original Mini-Grand-Challenge demo robot. This solution proved insufficient to turn the wheels at any useful speed. We probably needed more torque from our servo, but we couldn't find one with more. From here, we moved to a linear actuator to control the steering. We removed the steering column and affixed the linear actuator to the underside of the jeep with a bolt in the steering column's place. The actuator is rated at 200 pounds and successfully moves the axle and achieves steering, though it is still rather slow. To control the linear actuator, we actually ended up using our HB-25 motor controller, since all we needed was forward or back.
Because the linear actuator only allowed us to go forward or back and gave us no indication of where it was, we used a potentiometer as an angular encoder to measure where our wheels are. This is what allowed us to go in a straight line fairly reliably.
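Turning the pot reading into a wheel angle is just a linear interpolation between two calibrated endpoints. A minimal sketch of that mapping, where the ADC values and angle range are invented for illustration (real numbers come from measuring the pot at full lock each way):

```python
def pot_to_angle(adc_value, adc_left=180, adc_right=840,
                 angle_left=-30.0, angle_right=30.0):
    """Linearly map a raw potentiometer ADC reading to a wheel angle
    in degrees.  All four calibration constants here are made-up
    placeholders, not the jeep's actual values."""
    t = (adc_value - adc_left) / (adc_right - adc_left)
    return angle_left + t * (angle_right - angle_left)
```

With this, "drive straight" becomes a simple feedback loop: nudge the actuator until `pot_to_angle(read_adc())` is near zero.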
The two motor controllers (for steering and going) were commanded by an Arduino I/O board located under the seats with the motor controllers. At this point we had the ability to drive the jeep around. From here, it was time to integrate sensors.
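The laptop-to-Arduino link amounts to sending small serial commands. As an illustration only, here is a hypothetical encoder for a combined drive command; the real Arduino sketch defines its own framing and value ranges, which may look nothing like this:

```python
def drive_command(speed, steer):
    """Encode a hypothetical drive command for the Arduino.
    speed and steer are normalized to -1.0 .. 1.0; the single-line
    'D <speed> <steer>\n' protocol is an invented example."""
    clamp = lambda v: max(-1.0, min(1.0, v))
    return f"D {clamp(speed):+.2f} {clamp(steer):+.2f}\n".encode("ascii")
```

The bytes would then be written to the Arduino's USB serial port (e.g. with pyserial).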
The jeep currently has three types of sensors on it: vision from an iSight camera, sonars, and a GPS receiver. The robot has been running off of Jonathan's MacBook Pro. The first sensor added to the jeep was the iSight camera.
The camera is mounted at about hood height for the jeep. This location allows a reasonable view of the ground in front of the jeep. We currently have no autonomous control of the direction the camera points.
The sonars are mounted in the front of the jeep inside the little fake KC lights. They are used to stop in front of obstacles, including people and walls. If the robot detects an obstacle, it stops, waits, and asks the obstacle to move. If, after several requests, the obstacle still hasn't moved, the robot reverses and tries naively to get around it. The sonars are controlled through a second Arduino board located under the dash. Both Arduinos and the GPS sensor are plugged into a USB hub in the dash, so the laptop only needs one USB cable plugged into it, plus the FireWire cable for the iSight.
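The stop, ask, and reverse behavior boils down to a small decision rule. A sketch of that rule, with a made-up distance threshold and retry count (the jeep's actual tuning may differ):

```python
MAX_ASKS = 3  # illustrative retry count, not the jeep's actual value

def obstacle_response(distances_cm, asks_so_far=0, threshold_cm=60):
    """Decide what to do given the latest sonar readings (in cm).
    The 60 cm threshold is an invented example."""
    if min(distances_cm) > threshold_cm:
        return "drive"                  # path is clear
    if asks_so_far < MAX_ASKS:
        return "stop_and_ask"           # halt and ask the obstacle to move
    return "reverse_and_go_around"      # give up and try to route around it
```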
We discovered that if objects get too close to the sonars, they give spurious readings which have the unfortunate property of being the same as the reading for "there's nothing in front of me for a long way." To deal with this, we added an aftermarket bumper made of PVC pipe to the front of the jeep, which keeps obstacles from coming within the sonars' minimum range.
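A software-side mitigation for the same ambiguity is to distrust isolated "clear" readings, since a too-close object and an empty corridor produce the same max-range value. This filter is only a sketch of that idea, not what the jeep actually does (the bumper is the real fix); the max-range sentinel and window size are assumptions:

```python
from collections import deque

class SonarFilter:
    """Only believe 'nothing ahead' after several consecutive
    max-range readings; a single one could be a too-close object.
    The 765 cm sentinel and window of 3 are illustrative."""
    def __init__(self, max_range=765, window=3):
        self.max_range = max_range
        self.history = deque(maxlen=window)

    def update(self, reading):
        self.history.append(reading)
        if reading < self.max_range:
            return reading              # a real echo: trust it
        if (len(self.history) == self.history.maxlen
                and all(r >= self.max_range for r in self.history)):
            return self.max_range       # consistently clear
        return 0                        # suspect: treat as blocked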
The final sensor, the GPS, is not working robustly. The gpsd daemon gives us easy access to the information from the GPS receiver. There is code in place to use a GPS coordinate as a stop point. However, drift inherent in the GPS signal, combined with a resolution several times the characteristic length of the jeep, makes it hard for the jeep to find the end point.
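Checking arrival at the stop point reduces to a distance test against the goal coordinate. A sketch using the standard haversine great-circle formula; the 5 m stop radius is an arbitrary illustration, and with several metres of consumer-GPS drift it cannot be made much tighter than the jeep itself:

```python
import math

def gps_distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points
    (haversine formula, spherical Earth of radius 6371 km)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def at_stop_point(here, goal, radius_m=5.0):
    """True once we are within radius_m of the goal (illustrative radius)."""
    return gps_distance_m(*here, *goal) <= radius_m
```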
The original vision algorithm used blob detection as implemented in OpenCV to find the largest yellow blob in the field of vision. The color was tuned to the smiley-face beach ball in the robotics lab. The algorithm would then determine which direction to go to center the blob and approach it. A naive attempt at following a trail of yellow construction paper dots on the ground with this algorithm worked surprisingly well. The nearest blob was the largest, so the jeep would center and approach each dot in order. We were even able to get it to turn a corner in the Libra complex, so long as we chose the dot path correctly; around corners, the dots must be fairly densely spaced. At this point, the algorithm simply decided left or right, and the robot responded by moving a fixed distance left or right. This meant that even if the nearest blob was only a little left of center, the jeep would still go left quite a lot, which often resulted in us zig-zagging down to our target. The algorithm used discrete stepping to achieve its goal: it would look for the blob, decide left, right, or straight, make the wheel adjustment, go for a small fixed amount of time, then stop, look for the blob again, and start over.
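In outline, the blob stage reduces each frame to the largest connected region of the tuned color and its centroid. A toy, dependency-free sketch of just the "largest blob" step, operating on a pre-thresholded binary mask (the actual code uses OpenCV's blob detection on the live camera frame):

```python
def largest_blob(mask):
    """Return (size, (row, col) centroid) of the largest 4-connected
    region of 1s in a binary mask, or (0, None) if the mask is empty.
    A stand-in for OpenCV's blob detector on a color-thresholded frame."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    best = (0, None)
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                stack, cells = [(r, c)], []
                seen[r][c] = True
                while stack:                      # iterative flood fill
                    y, x = stack.pop()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(cells) > best[0]:
                    cy = sum(y for y, _ in cells) / len(cells)
                    cx = sum(x for _, x in cells) / len(cells)
                    best = (len(cells), (cy, cx))
    return best
```

The centroid's horizontal position relative to frame center is what drives the left/right decision.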
The vision algorithm was then modified to give a continuous range of left or right. Because of slip in the steering, we still only moved the wheels to hard left or hard right; instead, we varied the time for which the jeep moved hard left or hard right. This allowed us to do less zig-zagging. We also designated regions in the bottom left and right of the vision field as reverse regions. If the blob was in one of those regions, the jeep could not reach it by turning hard, and so it would turn the wheels accordingly and back up to try to center the blob. We also added a memory of which edge the blob was lost off of, so if the jeep turns too far to the left and loses the blob off the right side of the vision field, it will stop and back up in the correct direction to try to relocate the blob.
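The continuous decision logic can be sketched as a single function from blob centroid to command. The frame size, region boundaries, and return encoding below are all illustrative assumptions, not the actual code's values:

```python
def steer_decision(blob, frame_w=640, frame_h=480, last_lost_side="left"):
    """Map a blob centroid (x, y), or None if the blob was lost, to an
    (action, magnitude) pair; magnitude scales how long the jeep holds
    a hard turn.  All thresholds here are invented for illustration."""
    if blob is None:
        # back up toward the side the blob was last lost off of
        return ("reverse_" + last_lost_side, 1.0)
    x, y = blob
    offset = (x - frame_w / 2) / (frame_w / 2)   # -1 (far left) .. +1
    if y > 0.75 * frame_h and abs(offset) > 0.6:
        # bottom-corner reverse region: unreachable by turning; back up
        return ("reverse_left" if offset < 0 else "reverse_right", 1.0)
    if abs(offset) < 0.1:
        return ("straight", 0.0)
    side = "left" if offset < 0 else "right"
    return ("turn_" + side, abs(offset))         # turn time scales with offset
```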
The software is available for download and should compile and run on a Mac with OpenCV and gpsd installed, as well as an iSight, a GPS, and two Arduinos hooked up.
While the jeep had been running around indoors following yellow dots on the ground for quite some time, when we took it outside to try out the GPS for the final demo, we discovered that foliage in the sun was the same color as yellow construction paper in the shade. We then tried to switch to red construction paper, but given the limited time in the morning, our color tuning was less than ideal. We also ran into the problem of the construction paper fading surprisingly quickly in the sun. We set up the path for the demo early, but by the time we ran it, the paper had faded to a color that was visibly different from the color we had tuned the algorithm to. The final demo did not go as well as we would have hoped, but given that it was the first day we had tried to run the jeep outside and that we changed colors that morning, it didn't go too badly.
It should also be noted that there were significant aesthetic modifications to the jeep including but not limited to puffy glitter paint, D20s hung from the rearview mirror, and a custom license plate.