The near-geriatric Broncomatic (breath-controlled bucking bronco, originally designed by Brendan Walker) was wheeled out of the Mixed Reality Laboratory last week and cobbled back together by Joe Marshall and me for the National Videogame Arcade in Nottingham. It’ll be there for the next few weeks if you want a go. Moving with the times, it now tweets line-video photos of the riders (@BroncomaticRide if you’re interested).
We also installed the brand-new Touchomatic arcade machine, a two-player game that demands its users touch each other. It plays a game called “Astonishing Airship Adventures”, in which you fly a digital me in an airship over increasingly rugged terrain, picking up coins and balloons and unintentionally dancing to the best music ever put in a videogame cabinet. It’s quite fun if you fancy a go. Just be aware it records you as it goes along.
Today we had an exhibition and workshop around the Performing Data project, at which work by Brendan Walker, Rachel Jacobs, Di Wiltshire, Caroline Locke and Richard Ramchurn (all of which I’ve been involved in to some degree) was exhibited and discussed. We then let a bunch of makers loose on the data to see what they’d come up with. The results were little short of fantastic.
The concept of Performing Data has emerged from multidisciplinary engagements between artists, social scientists and technologists. Through performance, data is revealed to people in various material and embodied ways: sometimes slowly, sometimes as if live, sometimes in tangible forms, and sometimes by requiring them to enact being sensors.
Here’s what somebody else thought about it.
Brendan Walker’s amazing new artwork Oscillate is on display at Sheffield International Documentary Festival as of today. In Brendan’s words, Oscillate is:
“An immersive interactive artwork based on two popular entertainment technologies: the multi millennia-old swing and the 21st century Oculus Rift – the former designed to excite the vestibular system, the latter designed to excite the visual cortex”.
My involvement was to try and make the technology keep up with Brendan’s vision. In the end it was a wonderful thing that induced less sickness than you might expect. I put this down to your physical motion matching what you’re experiencing in the VR, which is of course unlike most VR experiences.
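To illustrate the motion-matching idea (this is a toy sketch with hypothetical names, not the actual Oscillate code): if the virtual camera is driven directly from the swing’s measured angle, the visual motion the rider sees agrees with what their vestibular system feels, which is the mismatch most VR sickness comes from.

```python
import math

def swing_angle(t, amplitude=0.6, period=3.5):
    """Toy pendulum model: swing angle in radians at time t (seconds)."""
    return amplitude * math.cos(2 * math.pi * t / period)

def camera_pose(angle, rope_length=2.5):
    """Map a measured swing angle onto a virtual camera pose, so the
    visual motion tracks the rider's real vestibular sensation."""
    x = rope_length * math.sin(angle)   # forward/back displacement
    y = -rope_length * math.cos(angle)  # height relative to the pivot
    pitch = angle                       # tilt the view with the swing
    return x, y, pitch

# At the bottom of the arc the camera hangs directly below the pivot, level.
x, y, pitch = camera_pose(0.0)
```

In a real system the angle would come from an encoder or IMU on the swing rather than a model, but the principle is the same: one shared signal drives both the physical and the virtual motion.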
The work appeared in the Guardian and on Radio 4’s Film Programme.
For most of the past year I’ve been supporting artist Di Wiltshire in the development of her latest artwork, “Sentiment”. Today she was showing it at Eastside Projects in Birmingham, where I had the great pleasure of also giving a talk about the technology behind the project. In Di’s words:
“Sentiment Art is an interactive soundscape with a wearable sensation device, created from the voices and emotional responses of fifty people: a chorus of humanity through sound and haptics.

Existence is intrinsically a holistic system. We are connected through our senses, emotions, sexuality, spirituality, our bodies. We are wellbeing and illbeing. We are part of, and absorbed by, the way we inhabit space.

Humans need spiritual spaces and thinking places, alternative realities, quirks of nature to create glitches in self-absorbed projects. The everyday illusion of meaning in narrative: we have life spaces, not life stories.

Our perceptions shape our reality and in turn our motives. Our everyday is formed through our interactions. Everybody worries about the same things, the insignificant to the catastrophic. How we think is what we live by. Our choices have brought us to a place where we exist.

Sentiment Art is an intrinsically multi-modal experience, comprising rich dynamic audio narratives from multiple speakers with related biodata delivered through a wearable haptic interface. The audio of the interviews and the haptic biodata are dynamically shifted based on the attention of the viewer, and this has been enabled through the use of the Performing Data toolkit. The data, whether audio recordings or biodata, heard or felt, or the captured attention of the viewer, is at the heart of the experience.”
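The attention-driven mixing Di describes might be sketched like this (hypothetical names and weighting scheme; the actual Performing Data toolkit may do this quite differently): channels nearer the viewer’s current focus are boosted, and the whole mix is scaled by overall engagement.

```python
def attention_mix(levels, attention, focus, sharpness=4.0):
    """Crossfade voice channels by proximity to the viewer's focus.

    levels    -- base gain per channel
    attention -- 0..1 overall engagement of the viewer
    focus     -- index of the channel the viewer is attending to
    Returns normalised per-channel output gains.
    """
    weights = [1.0 / (1.0 + sharpness * abs(i - focus))
               for i in range(len(levels))]
    total = sum(weights)
    return [attention * base * w / total
            for base, w in zip(levels, weights)]
```

For example, with three equal channels and full attention on channel 1, the middle voice dominates the mix while its neighbours fall to a murmur; with attention at zero, everything falls silent.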
Thrill Laboratory took our recent 3D film experiment out to MCM Comic Con London. This was to help publicise some work done with RealD and Vue, where we looked at what changed in people’s brains between watching 2D and 3D films. This time we were looking at the differences between highly emotional content (the first ten minutes of Pixar’s Up) and high-action content (a fight scene from Guardians of the Galaxy). People could come along, get their brain scanned and take home a souvenir video showing what their brain got up to. Needless to say, some of the costumes made effective EEG, um… challenging, but a great time was had all round.
Also, I had the opportunity to have my photo taken by SyFy’s incredible 360-degree camera. It would have been nice if I’d been a little thinner at the time, but dem’s the breaks. Here’s the result:
You can view the rotatable image here – sadly there’s no way to embed it 😦
Once more, Thrill Lab takes to the road to do some science, this time at Vue Piccadilly in partnership with Vue Cinemas and RealD. We’ve been exploring what changes in your brain between watching a film in 2D vs watching it in 3D. It’s been a fascinating study, though I’ll admit I’ve now seen Disney’s Big Hero 6 rather more times than I’d like, though I actually still haven’t seen the ending. You can see below how the division of labour works for Thrill Lab. Only kidding, the poor Prof was just all tuckered out 🙂
In one of our biggest outings yet, I’ve been involved in the development and deployment of the world’s first brain-controlled thrill ride: Neurosis. Neurosis features a six-degree-of-freedom motion simulator and a virtual reality headset to immerse the rider in a surreal environment, controlled not by a ride operator but by the rider’s own brain activity. This activity generates an audio-visual virtual world where pathways emerge, tumbling, twisting and twirling the rider through a psychedelic landscape. The rider’s real-time neurological responses to music, motion and visible wonders activate fairground lighting; this spectacular neurodata constantly transforms the futuristic ride artwork. Music pumps as the simulator mechanism undulates and sways.
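As a rough illustration of how brain activity can steer a ride (a toy sketch under my own assumptions, not the Neurosis implementation): one common approach is to compare EEG band powers, letting relative engagement versus relaxation set the ride’s intensity.

```python
def ride_intensity(alpha_power, beta_power):
    """Toy mapping from two EEG band powers to a 0..1 ride intensity.

    More beta (engagement) relative to alpha (relaxation) steers
    the rider down a wilder path through the virtual landscape.
    """
    total = alpha_power + beta_power
    if total == 0:
        return 0.5  # no usable signal: hold a neutral intensity
    return beta_power / total
```

Equal band powers give a neutral 0.5, pure beta pushes intensity to its maximum, and in practice the value would be smoothed over time before driving the motion platform and lighting.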
This is the brainchild of Prof Walker, with some help from The Mighty Jungulator (Matthew Olden) and the good folks of Nottingham and Middlesex Universities. And yes, that is George Clinton riding it in the picture.
Here’s what the Observer had to say: Observer article
Here’s a video made by Middlesex about the project: