Update 4/22/13

April 22, 2013

Some recent developments:

1. Cruz, Sagar, and I had the opportunity to test the robot out on the intramural fields on Friday. After some work, the robot was able to autonomously navigate to three GPS waypoints while avoiding traffic barrels. We used the encoders and GPS updates to localize, and the sonar array to avoid obstacles. It seemed to be able to get within a meter of each waypoint, which was better than we were expecting.
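
For reference, the within-a-meter check comes down to projecting a GPS fix into local meters around an origin. A minimal sketch (the function names are illustrative, not from our codebase) using an equirectangular approximation, which is plenty accurate over a field-sized area:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius

def gps_to_local_xy(lat, lon, origin_lat, origin_lon):
    """Equirectangular approximation: x is meters east, y is meters north
    of the origin. Fine over the ~100 m scale of a field test."""
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    x = EARTH_RADIUS_M * d_lon * math.cos(math.radians(origin_lat))
    y = EARTH_RADIUS_M * d_lat
    return x, y

def reached(robot_xy, waypoint_xy, tol_m=1.0):
    """True once the robot is within tol_m meters of the waypoint."""
    dx = robot_xy[0] - waypoint_xy[0]
    dy = robot_xy[1] - waypoint_xy[1]
    return math.hypot(dx, dy) <= tol_m
```

At our latitude a degree of latitude is roughly 111 km, so meter-level waypoint tolerances are well within what this approximation can resolve.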

We chose not to use yaw updates from the VN-200 IMU because of a small reference-frame conflict that we didn't have time to fix on the field. Yaw = 0 indicates that the VN-200 is pointing north (so the x-axis of its global reference frame points north). This conflicts with updates from the GPS offset node, which publishes (x, y) pairs in which positive y points north. This has since been fixed.
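
The fix amounts to a one-line frame conversion. Assuming the VN-200 reports yaw in the usual NED sense (0 = north, clockwise-positive; worth double-checking against the datasheet), converting it to a heading in the GPS offset node's frame (+y = north, so heading 0 = +x = east, counterclockwise-positive) looks like:

```python
import math

def ned_yaw_to_enu_heading(yaw_ned):
    """Map NED yaw (radians, 0 = north, clockwise-positive) to an ENU
    heading (0 = east = +x, counterclockwise-positive), wrapped to (-pi, pi]."""
    h = math.pi / 2.0 - yaw_ned
    return math.atan2(math.sin(h), math.cos(h))  # wrap into (-pi, pi]
```

The atan2 trick handles the angle wrap so the EKF never sees a spurious 2*pi jump at the seam.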

Despite not having yaw updates, the EKF was able to accurately estimate the robot's heading after moving for a bit and incorporating GPS updates. It took a few seconds of driving in the wrong direction, but it converged eventually without any help. This gives us confidence that we won't have to rely too heavily on our compass at the competition; magnetometers are known to have issues on the competition field because of the generators used to provide power. I will make another post detailing the EKF results with pretty charts.
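
Intuitively, the filter is recovering heading from the direction of travel: once the robot has moved a meaningful distance between GPS fixes, the displacement vector pins down the orientation. The core idea reduces to something like this (a sketch of the principle, not our EKF):

```python
import math

def heading_from_fixes(p_prev, p_curr, min_dist=0.25):
    """Estimate heading (radians, 0 = +x) from two (x, y) GPS offsets.
    Returns None until the robot has moved far enough that the baseline
    is longer than the GPS noise."""
    dx = p_curr[0] - p_prev[0]
    dy = p_curr[1] - p_prev[1]
    if math.hypot(dx, dy) < min_dist:
        return None
    return math.atan2(dy, dx)
```

The EKF does this implicitly and continuously, weighting each fix by its covariance, which is why it needed a few seconds of motion before the heading estimate settled.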

From the data we collected on the intramural fields' painted white lines, it seems we have some work to do in lane detection. Actual painted lines are a lot harder to detect than the boards we were testing with last week. Sagar is experimenting with different grayscaling techniques to make the lines stand out better.
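
As an illustration of why the choice of grayscaling matters (a generic per-pixel example, not necessarily one of the techniques Sagar is trying): a standard luminance conversion still leaves green grass fairly bright, while taking the channel minimum darkens anything that isn't close to white.

```python
def luminance(r, g, b):
    """Standard Rec. 601 grayscale weighting."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def white_emphasis(r, g, b):
    """Channel minimum: white paint stays bright, while green grass goes
    dark because its red and blue components are low."""
    return min(r, g, b)
```

On a grass-green pixel like (60, 160, 60) the luminance is about 119 while the channel minimum is 60, so a subsequent threshold has much more margin to separate paint from grass.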

2. It turns out that our sonar array is suffering from a couple of issues. First, the calibration image shown in the last post is in fact evidence of "cross-talk" between the sensors, which cannot easily be accounted for. We've also found that the gap between sensors is too large, leaving big blind spots. This is especially a problem with the tall, thin cones, which the robot can sometimes run into. The array works for avoiding large obstacles, but it is far too noisy and inaccurate to be used for mapping. I've sent emails to a few companies that sell Hokuyo and SICK laser rangefinders to see if we can get another product donation.
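
Until we can replace the array, one standard mitigation for cross-talk is to avoid firing adjacent sensors in the same round. A sketch of an interleaved trigger schedule (hypothetical; this is not what our firmware currently does):

```python
def firing_groups(num_sensors, spacing=2):
    """Split sensor indices into interleaved groups so that no two
    adjacent sensors are triggered in the same round, reducing the
    chance that one sensor hears its neighbor's ping."""
    return [[i for i in range(num_sensors) if i % spacing == phase]
            for phase in range(spacing)]
```

The trade-off is update rate: with a spacing of 2, each sensor only fires every other round, so the effective scan rate is halved.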

3. The PSoC-computer connection has become much more robust. Frank added a diode to the power distribution circuit, which appears to be preventing the surge we were observing before. He also has the PSoC's fault handler reset the chip, so if other hard faults come up they won't take the whole system down; things will only pause for a second or so. PSoC_Listener has also gotten better. We're migrating all of our drivers so that they don't crash if they momentarily lose connection to their sensors, and instead wait for them to reconnect.
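
The reconnect pattern we're converging on looks roughly like this (a sketch; `open_sensor` and `SensorDisconnected` stand in for whatever connection and error types each driver actually uses):

```python
import time

class SensorDisconnected(Exception):
    """Raised by a driver when its device momentarily drops off the bus."""

def read_samples(open_sensor, n, retry_s=1.0):
    """Collect n readings, reopening the sensor on disconnect instead of
    letting the driver process die."""
    samples = []
    sensor = open_sensor()
    while len(samples) < n:
        try:
            samples.append(sensor.read())
        except SensorDisconnected:
            time.sleep(retry_s)      # back off briefly, then reconnect
            sensor = open_sensor()
    return samples
```

The key design point is that the open/reconnect logic lives in one loop rather than being scattered through the read path, so a transient drop costs a short pause instead of a crashed node.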

4. We’ve decided to try to have both the EnterpRAS (a computer we used the last time RAS competed at IGVC) and DoloRAS (our current computer, built around an Atom processor) on the robot at the same time. The reasons for adding the EnterpRAS are that it can fit the Quadro GPU and the HDMI capture card, and its i5 processor is faster than our Atom. A reason for keeping DoloRAS is that Intel gave us $3000 for this project on the condition that we use their Atom processor. Hopefully the power draw won’t be too much, and we can communicate between the two computers easily with ROS.
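
On the ROS side, wiring two machines together is mostly environment configuration: one machine runs the master and the other points at it. A sketch with placeholder hostnames and IPs (assuming DoloRAS hosts the master):

```shell
# On DoloRAS, which we'll assume runs roscore:
export ROS_MASTER_URI=http://doloras:11311
export ROS_IP=192.168.1.2   # an address the other machine can reach
roscore &

# On the EnterpRAS, point at the same master:
export ROS_MASTER_URI=http://doloras:11311
export ROS_IP=192.168.1.3
```

As long as both machines can resolve each other, topics published on one computer are visible to nodes on the other, so splitting vision onto the EnterpRAS shouldn't require any code changes.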


