As an organizer of MIT's 6.270 Autonomous Robot Design Competition, I've been working on an improved vision-based system for tracking the contestants' robots on the playing field. (Sidenote: I competed in 6.270 last January, and at some point I'll write a whole post or two about my experience. To sum it up, I had an awesome time competing, which is why I'm now an organizer of the competition.)

The basic concept is that we wirelessly feed each robot its coordinates throughout the round, acting like GPS to help the robots navigate. However, this isn't as easy as it sounds. Our approach is to mount an overhead camera facing down at the playing field and then analyze the video to find special "fiducial" patterns on the robots. This isn't too hard in a controlled environment, but it gets tricky when the system has to ignore other objects on the field or when pieces of a robot go flying after it slams into a wall (which happens quite often during testing).
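To give a sense of what the fiducial-finding step looks like in code, here's a minimal sketch using OpenCV's built-in ArUco markers as a stand-in for our patterns. Everything in it, from the marker dictionary to the camera index, is an assumption for illustration rather than a description of the actual 6.270 system:

```python
# Sketch: track overhead fiducials with OpenCV's ArUco module
# (requires opencv-contrib-python 4.7+; our real patterns are custom).
import math
import cv2

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)  # overhead camera; device index is an assumption
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = detector.detectMarkers(gray)
    if ids is not None:
        for marker_corners, marker_id in zip(corners, ids.flatten()):
            pts = marker_corners[0]       # the marker's 4 corners, shape (4, 2)
            cx, cy = pts.mean(axis=0)     # marker center in pixel coordinates
            # heading from the marker's top edge (corner 0 -> corner 1)
            dx, dy = pts[1] - pts[0]
            theta = math.atan2(dy, dx)
            print(f"robot {marker_id}: ({cx:.0f}, {cy:.0f}) px, {theta:.2f} rad")
```

The nice thing about an ID-carrying pattern like this is that each detection gives you the robot's identity, position, and orientation in one shot, which is exactly the packet you'd want to beam back to the robot. The hard part, as noted above, is making that detection robust to clutter and debris on the field.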