The near-geriatric broncomatic (breath-controlled bucking bronco – originally designed by Brendan Walker) was wheeled out of the mixed reality laboratory last week, and cobbled back together again by Joe Marshall and me for the National Videogame Arcade in Nottingham. It’ll be there for the next few weeks if you want a go. Moving with the times, it now tweets live photos of the riders (@BroncomaticRide if you’re interested).
We also installed the brand new Touchomatic arcade machine, a two-player game that demands its users touch each other. It plays a game called “Astonishing Airship Adventures” in which you fly a digital me in an airship over increasingly rugged terrain, picking up coins and balloons and unintentionally dancing to the best music ever put in a videogame cabinet. It’s quite fun if you fancy a go. Just be aware it records you as it goes along.
Recently, we performed an experiment dressed up as a gameshow with the title Lies, Damned Lies and Biodata (a nod to “Lies, damned lies and statistics” – Benjamin Disraeli, for those who failed to pick up the reference). In practice we aimed to explore to what extent lay people could “read” biodata when it is presented to them in a fairly raw form (just some line graphs and an EEG heat map).
The aim of the game was for the audience to guess whether a person was lying or telling the truth by looking at the presented biodata. The liar (or not) stood at the front of the stage and a picture was shown. They were asked a question about the picture and secretly informed whether they should lie or tell the truth. So for example the picture might be of their house – and they might be asked “is this your house?” and told to lie; alternatively they might be asked to tell the truth. Or they might be shown a completely different house and asked the same question under the same conditions. It was a little confusing, but ultimately quite a successful experiment.
We dressed it up as a gameshow, complete with prizes, cheesy music and all the verbal silliness implied in the title. On the whole, the day seemed to go down fairly well. All the players did pretty much as asked, with a couple of accidental exceptions, and the audience, few as they were in the end, did get behind the game.
I take from this experience several key findings:
- Biodata is difficult to read without some explanation
- Playing with truth and lies is fun as long as everybody gets in on the game
- I’m not a natural born presenter
Using the recorded data, we hope to explore which subsets of the information increase or decrease our facility for lie detection. Obviously a standard lie detector uses many of these biosensors, so there is a precedent there – my interest is in whether there is a way to present this stuff that allows it to be read without any special training.
All in all a fun day out.
Recently I’ve been spending some time experimenting with Brain-Computer Interfaces (BCIs). I wanted to develop a very simple, accessible game to make use of the Emotiv EEG headsets that the lab recently purchased. These are great off-the-shelf EEG sensors – not medical grade, but fairly quick to set up, pretty cheap, and supplied with a fairly decent SDK. Tug-o-matic relies on the vicarious architecture, meaning it’s feasible to control it with any two-dimensional sensor input, but it was intended to be used with cognitive EEG. The principle is that you train it on what “pulling” feels like (i.e. think about pulling) and then, to play the game, you must simply replicate that thought pattern.
It’s pretty fun, not too easy and good as a two player game (though that does necessitate two players). There’s nothing especially new here, other than the input reusability, but I’m still fairly proud of it. The name is not too good though. It smacks disturbingly of teledildonics…
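The input-reusable design could be sketched roughly as below. This is a minimal illustration, not the actual Tug-o-matic code: the Emotiv SDK details are omitted, and `left_pull` / `right_pull` are hypothetical stand-ins for any scalar sensor reading in [0, 1] – trained EEG “pull” intensity here, but any one-dimensional input per player would do.

```python
# Hypothetical sketch of an input-agnostic tug-of-war loop.
# Each player supplies a scalar "pull" strength in [0, 1] from any
# sensor (cognitive EEG, breath, a slider...); the game only sees numbers.

def step_rope(position, left_pull, right_pull, dt, gain=0.5):
    """Move the rope marker: -1.0 means the left player has won, +1.0 the right."""
    position += gain * (right_pull - left_pull) * dt
    return max(-1.0, min(1.0, position))  # clamp to the play area

def winner(position):
    """Return which side has won, or None if the game is still going."""
    if position <= -1.0:
        return "left"
    if position >= 1.0:
        return "right"
    return None
```

Decoupling the game logic from the sensor like this is what makes the input reusable: swapping headsets for any other pair of scalar inputs needs no change to the game itself.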
I’m just back from ACE2011, which had its usual array of the bizarre, the entertaining and the occasionally downright ridiculous. Met some nice people, ate some nice food and generally had fun. Hiroshi Ishii (pictured above) gave the first keynote, nicely circular since he also gave the keynote at the first ACE I went to, in Valencia.
I was presenting a paper on the various breath-based games we’ve been developing, in particular talking about the different ways breath can fit into an interface stack. The paper was called “Breathalising games: understanding the potential of breath control in game interfaces” and is available through the publications part of this site.
Today, Perping was on display at Innovate – a TSB-sponsored exhibition in London. I wasn’t there myself to run it (being a little busy with another project), leaving it in the capable hands of Stuart Reeves and Bronya Norton, though I’ll admit it felt a little odd having other people running my baby. One source of great amusement to me was the call I got from Bronya in the morning asking how to put up the step-ladder that’s part of the set (it holds up all the hardware). I explained it over the phone to both Bronya and Stuart to no avail. Confused, I thought to myself, how on earth can they fail to put up a step-ladder? Then I sent them a diagram (quality drawing), and they still couldn’t do it.
In the end it turned out they’d only brought half of it with them, so I’m unsure what the set would have looked like. By all accounts the game went down well though, so all is good. Nice to see it out and about.
This week I got to take my breath-controlled tennis game (Perping) to the Cheltenham Science Festival. We set up a booth and convinced several hundred people to have a go at the game. Doing all this in the day while manically writing an ACE paper at night has been fairly exciting. On top of all that, I got to dress up as some kind of historical tennis umpire in order to run the event (with help from Matthew Olden), including wearing an original 1920s rowing blazer (with only a small amount of blood on it and minor repairs). Taken together with Brendan’s brilliant set design it was a real case of Game, Set and Patch (ouch).
Perping involves wearing our respiration-monitor gas masks and breathing to control your paddle; it feels like the arcade classic Pong. It allows two players to control the paddles in a basic tennis-like simulation. In PerPing breath is the only control interface used. Players are required to accurately manipulate their paddles at increasing speed in order to successfully score points. Players see a representational chart of their breathing (or, more accurately, their paddle position) displayed continuously on their side of the screen.
At its most basic level, PerPing makes use of a direct mapping from breath flow to paddle movement: if a player is breathing in, their paddle moves up; if they are breathing out, it moves down; and if they hold their breath, the paddle remains fixed. The speed of the paddle is mapped to the flow rate measured by the gas mask canister, so breathing hard moves the paddle faster.
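That mapping amounts to integrating the breath flow rate into a paddle position. The sketch below is a minimal illustration, not the actual PerPing code; `flow_rate` is an assumed signed sensor value – positive while inhaling, negative while exhaling, zero when holding breath – and the function name and constants are mine.

```python
# Hypothetical sketch of the breath-to-paddle mapping: the paddle's
# velocity is the (signed) breath flow rate, so harder breathing moves
# the paddle faster, and holding your breath keeps it still.

def update_paddle(paddle_y, flow_rate, dt, gain=1.0, top=0.0, bottom=1.0):
    """Integrate breath flow into paddle position (screen coords: top = 0)."""
    paddle_y -= gain * flow_rate * dt  # inhale (positive flow) moves the paddle up
    return max(top, min(bottom, paddle_y))  # clamp to the play area
```

Called once per frame, this gives exactly the behaviour described: a held breath (zero flow) leaves the paddle where it is, and a sharp exhale sends it downwards quickly.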
This has got to be one of my favourite uses yet of the gas masks. It was designed to be a portable and easy-to-demonstrate prototype for the breath-sensing technology, in contrast to Breathless, which, while fascinating, lacked somewhat on the portability front.
Another game with jaw-dropping photo-realistic graphics: I’ve recently been working on Hyperventilation Sports (HVS). It’s a multiplayer game using the gas mask respiration monitors. The concept is pretty simple – you each have a runner on a 100-metre track. Each breath makes him take a step, and the size of the step is mediated by the volume of the breath. Basically, the more air you pass through your lungs, and the faster you do it, the faster your avatar runs. It’s timed carefully to take about as long as a real (professional) 100m sprint (sub 10 seconds is really hard but just possible), so hopefully won’t actually kill anybody who plays it. It’s definitely the most frantic of the breath games and I find it pretty good fun. Certainly not one I’d do a large-scale public demo of though – I don’t need members of the public dropping dead at my feet…
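The breath-to-step mechanic could be sketched as below. This is an illustrative approximation, not the HVS code: it assumes each detected breath reports its volume, and the `metres_per_litre` scaling constant is an invented placeholder (in the real game the pacing is tuned so a maximal effort finishes in roughly 10 seconds).

```python
# Hypothetical sketch of the Hyperventilation Sports mechanic: each
# breath advances the runner one step, and bigger breaths mean bigger
# steps, so moving more air faster covers the track sooner.

TRACK_LENGTH = 100.0  # metres

def step_length(breath_volume, metres_per_litre=2.0):
    """Convert one breath's volume (litres) into a stride length (metres)."""
    return breath_volume * metres_per_litre

def run_race(breaths):
    """Sum step lengths over a sequence of breaths; True if the line is crossed."""
    distance = 0.0
    for volume in breaths:
        distance += step_length(volume)
        if distance >= TRACK_LENGTH:
            return True
    return False
```

Because speed depends only on how much air you shift per unit time, shallow rapid panting and deep slow breaths can both work – which is exactly what makes the game so frantic.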