If you watched the iPhone Mars Rover video, you might be wondering how we are able to control a LEGO Mindstorms robot with an iPhone. In fact, we’re not controlling the robot directly, but rather connecting to a fairly complex chain of technology prescribed by the competition rules. Interested in learning more? Read on!
While it may seem that we’re directly controlling the Mars Rover with the iPhone, this isn’t exactly true. We’re using a rather sophisticated chain of technology between the iPhone and the Rover:
The iPhone uses Wi-Fi to send commands to an OSGi-based web service hosted on an Amazon EC2 instance “in the cloud”. The main purpose of this service is to queue the large number of waiting players before one of them actually gets to play. It also stores and distributes a tremendous amount of image data to every connected client simultaneously. In fact, one can roll back in time to look at every single second of the available data. Hey, it’s built by the NASA JPL!
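We don’t know how the JPL service is implemented internally, but the queueing behavior it exposes could be sketched roughly like this (class and method names are our own invention, not the actual API):

```java
import java.util.ArrayDeque;
import java.util.Deque;

/** Toy sketch of a server-side player queue; all names are hypothetical. */
public class PlayerQueue {
    private final Deque<String> waiting = new ArrayDeque<>();
    private String activePlayer;

    /** New clients join the back of the queue. */
    public void join(String playerId) {
        waiting.addLast(playerId);
    }

    /** Clients that quit before their turn are dropped from the queue. */
    public void leave(String playerId) {
        waiting.remove(playerId);
    }

    /** When a session ends, the next waiting player becomes active. */
    public String promoteNext() {
        activePlayer = waiting.pollFirst();
        return activePlayer;
    }

    /** Only the active player's commands get forwarded to the rover. */
    public boolean mayDrive(String playerId) {
        return playerId.equals(activePlayer);
    }
}
```

The important property is the last method: everyone can watch the image stream, but only one queued player at a time may actually drive.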
As soon as a player becomes the active player, the web service dispatches that player’s commands to another service, which runs on a MacBook Pro located near the Mars Rover. This computer is connected to the LEGO Mindstorms Brick via Bluetooth; the service on the Mac sends the commands on to the Brick and receives status updates in return.
There’s also a USB webcam mounted above the Mars arena, capturing the action. It is connected to the MacBook Pro, which uses image recognition algorithms to derive basic telemetry data such as the rover’s current heading.
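We don’t know the exact algorithm used here, but a simple way to get a heading from an overhead camera is to track two markers on the robot (say, one at the front and one at the rear) and take the angle between them. A sketch, with the marker detection itself omitted:

```java
/** Sketch: derive a compass-style heading from two tracked markers on the
 *  rover, as seen by an overhead camera. How the markers are found in the
 *  image (color blob tracking etc.) is omitted here. */
public class Telemetry {
    /** Heading in degrees: 0 = "up" in the image, clockwise positive.
     *  (rearX, rearY) and (frontX, frontY) are pixel coordinates;
     *  note that image y grows downward. */
    public static double heading(double rearX, double rearY,
                                 double frontX, double frontY) {
        double dx = frontX - rearX;
        double dy = rearY - frontY;      // flip y: image origin is top-left
        double deg = Math.toDegrees(Math.atan2(dx, dy));
        return (deg + 360.0) % 360.0;    // normalize to [0, 360)
    }
}
```

Two markers are enough for a heading; with the arena geometry known, the same image also yields the rover’s position.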
The Brick itself is programmed with LeJOS, a Java-based VM made especially for the LEGO Mindstorms Brick (it should be noted that the Brick only has 256kB of flash memory and 64kB of RAM!). The program on the Brick receives commands and sends its current state (such as battery level) back to the Mac.
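With only 64kB of RAM to work with, the on-Brick program has every reason to keep its wire format tiny. The actual competition protocol isn’t ours to publish, but a compact scheme of the kind one would use — one opcode byte plus one signed argument byte — might look like this (opcodes and names invented for illustration):

```java
/** Hypothetical sketch of a compact Brick command protocol: one opcode
 *  byte plus one signed argument byte. Not the actual competition
 *  protocol; opcode values and names are made up. */
public class BrickProtocol {
    public static final byte CMD_DRIVE = 0x01;  // arg: speed -100..100
    public static final byte CMD_TURN  = 0x02;  // arg: rate  -100..100
    public static final byte CMD_STOP  = 0x03;  // arg: ignored

    /** Decode a two-byte command into a human-readable description. */
    public static String decode(byte opcode, byte arg) {
        switch (opcode) {
            case CMD_DRIVE: return "drive " + arg;
            case CMD_TURN:  return "turn " + arg;
            case CMD_STOP:  return "stop";
            default:        return "unknown";
        }
    }
}
```

On the real Brick, the decode step would drive the LeJOS motor classes instead of returning a string, and the reply path would carry readings such as the battery voltage back over the same Bluetooth link.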
The telemetry data, image data and battery state are collected by the MacBook Pro and then sent back to the web service running on EC2, which provides them to all connected clients.
Our iPhone client connects to the web service running on EC2 via RESTful web service calls. Apart from sending command requests, we also use the web service to retrieve the current image data as well as the current target (i.e., the rock to drive the rover to and the tool to present at that rock).
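The service’s actual endpoints aren’t public, so the base URL and resource paths below are invented — but a RESTful mapping of the three things the client needs (commands, images, target) could look like this:

```java
/** Sketch of how the client's REST calls might map onto resources.
 *  The base URL and all paths are invented for illustration; they are
 *  not the real service's endpoints. */
public class RoverApi {
    private final String base;

    public RoverApi(String base) {
        this.base = base;
    }

    /** POST target: enqueue a drive command for the given player. */
    public String commandUrl(String playerId) {
        return base + "/players/" + playerId + "/commands";
    }

    /** GET target: the latest overhead camera frame. */
    public String currentImageUrl() {
        return base + "/images/current";
    }

    /** GET target: the current mission target (rock plus tool). */
    public String currentTargetUrl() {
        return base + "/target/current";
    }
}
```

The client then simply polls the image and target resources while it waits, and POSTs commands once it is the active player.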
As can be seen in the video, we use the iPhone’s accelerometer to determine the direction of the rover: tilting the phone forward or backward will make the rover move forward or backward, respectively. Tilting the phone to the left or right will make the rover perform a left or right turn. You can even make the rover drive on an arc by tilting the iPhone at an angle. The navigation area gives you visual feedback about the direction in which the rover is driving.
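The tilt-to-drive mapping above amounts to differential steering: forward tilt sets a base speed for both motors, sideways tilt adds a differential, and combining the two produces the arc. A sketch of that mapping — the scaling and clamping here are made up, not the app’s actual tuning:

```java
/** Sketch of mapping accelerometer tilt onto differential drive power.
 *  pitch and roll are normalized tilt values in [-1, 1]; the scaling
 *  and clamping are illustrative, not the app's real tuning. */
public class TiltDrive {
    /** Returns {leftPower, rightPower}, each in [-100, 100]. Forward
     *  tilt drives both motors equally; sideways tilt adds a
     *  differential, so combined tilt yields an arc. */
    public static int[] motorPowers(double pitch, double roll) {
        double forward = clamp(pitch);
        double turn    = clamp(roll);
        int left  = (int) Math.round(clamp(forward + turn) * 100.0);
        int right = (int) Math.round(clamp(forward - turn) * 100.0);
        return new int[] { left, right };
    }

    private static double clamp(double v) {
        return Math.max(-1.0, Math.min(1.0, v));
    }
}
```

Full forward tilt gives both motors full power; pure sideways tilt spins the motors in opposite directions for a turn in place; anything in between drives an arc.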
All this is spiced up with common iPhone sense: as you start the app, you are automatically subscribed to the player queue. The phone vibrates a few seconds before you get control of the robot – and if you leave the app before it’s your turn, you are removed from the queue.