There’s a lot of buzz around multi-screen experiences lately, where an application or game is available to you on multiple devices (e.g. web, PC, mobile, TV). Recently I’ve been fascinated by the idea of using these “multiple screens” together to create a single experience.
The first result was my Nexus One Gas Pedal which I posted last week. Following that, I wanted to do something that was more accessible, and allowed for more players, which led to the concept for Androideroids.
Androideroids is a prototype multiplayer asteroids game in which each player uses an Android phone as their game controller. Players see a top-down view of the game on the main screen (which could be a PC, TV, or projected in a public space), and a first-person view with their health and score on their phone. The top-down view is great for navigating, and the first-person view comes in handy for aiming during dogfights. Player-specific sounds are played on the phone, whereas general sounds are played on the host.
This has led to the development of what I feel is a strong framework for creating public installations and experiences that will allow smartphone users (on multiple platforms) to connect in an ad hoc fashion.
When players launch the generic LANPad application on their phone, it finds and connects to the game host. The host sends the client a “surface” SWF which defines how the game will look and act on the phone.
The surface SWF sends messages to the host indicating UI input; the host interprets these, runs the game logic, and sends back messages updating the game state on the phone.
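To make that loop concrete, here’s a rough sketch of what the message round trip might look like. This is purely illustrative: it’s Python rather than ActionScript, and the JSON wire format, message fields, and function names are all invented for the example (the post doesn’t document the actual protocol).

```python
import json

def encode_input(player_id, control, value):
    """Client -> host: a UI input event from the phone's surface SWF.
    The {"type", "player", "control", "value"} shape is hypothetical."""
    return json.dumps({"type": "input", "player": player_id,
                       "control": control, "value": value})

def apply_input(state, message):
    """Host side: interpret an input message and update the game state."""
    msg = json.loads(message)
    player = state.setdefault(msg["player"], {"thrust": 0.0, "turn": 0.0})
    player[msg["control"]] = msg["value"]
    return state

def encode_update(player_id, state):
    """Host -> client: the per-player slice of game state
    (e.g. health and score) to render on the phone."""
    return json.dumps({"type": "update", "player": player_id,
                       "state": state.get(player_id, {})})

# Example round trip: the phone reports a thrust input, the host applies
# it to the game state, then replies with that player's updated state.
game_state = {}
wire = encode_input("phone-1", "thrust", 0.8)
apply_input(game_state, wire)
reply = encode_update("phone-1", game_state)
print(reply)
```

In the real system these messages would travel over the LAN socket between the LANPad client and the game host, with the host running the full simulation loop in between.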
Here’s a rough video on the project:
I posted the video to Vimeo last night so it would encode before I wrote this post, and was thrilled to see that it got picked up by Engadget this morning while I was eating breakfast. Please excuse any typos in the post – it was written in a hurry!