Recently I was invited to give a talk about Vicarious at the Technical University of Vienna (TUWIEN)’s Summercamp event. It proved to be an interesting day all round, though sadly my time commitments only allowed me one day of a fascinating-looking week.
The abstract of my talk was as follows:
The aim of this session will be to explore the types of available biosensors and methods of presenting the data those sensors generate. We will begin with a short talk, accompanied by live biodata from a speaker’s assistant. The talk will be presented using the vicarious biodata visualisation system, with data collected from a number of separate sensors. Part of the talk will involve an interactive game, with the audience attempting to decipher the biodata for a specific secret event. After the presentation we will provide an opportunity for people to explore the practicalities of the hardware in more detail, including trying it for themselves and examining the data produced. The group will select a task (or tasks) for a rigged-up person to perform while their live biodata is displayed, including looking at the differences and assessing the quality of processed biodata, such as the Emotiv (EEG headset) “affective suite”, compared to more “raw” data. The data from the performed task may then be explored as a group. Ultimately the aim is to give a feel for the practicalities of collecting and handling biodata, as well as to demonstrate the open source vicarious platform for early adopters.
A little dry-sounding perhaps, but in practice I gave the talk while wired up with ECG, EEG, GSR and facial EMG sensors, showing all the data live on screen as a sort of HUD to my presentation. This allowed me to show off an interesting facet of vicarious: it can be used as an augmented presentation tool simply by embedding one’s slides as changeable images, something we had previously developed for a different event.
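For readers curious how a live biodata HUD like this works underneath, the core idea is simply a fixed-length rolling window over each incoming sensor stream, redrawn on every screen refresh. The sketch below is not the actual vicarious code and the `LiveTrace` name is hypothetical; it just illustrates the rolling-buffer technique with Python's standard library.

```python
from collections import deque


class LiveTrace:
    """Fixed-length rolling buffer for one biosignal channel,
    e.g. the last few seconds of ECG shown in a HUD overlay."""

    def __init__(self, capacity):
        # deque with maxlen discards the oldest sample automatically
        self.samples = deque(maxlen=capacity)

    def push(self, value):
        self.samples.append(value)

    def window(self):
        # snapshot of the current window, oldest sample first
        return list(self.samples)


# tiny demonstration: a 4-sample window fed 5 samples
trace = LiveTrace(capacity=4)
for v in [0.1, 0.2, 0.3, 0.4, 0.5]:
    trace.push(v)
print(trace.window())  # → [0.2, 0.3, 0.4, 0.5]
```

In a real presentation overlay, each channel (ECG, EEG, GSR, EMG) would get its own buffer, with a timer redrawing the windows over the slide image on every frame.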