The Eye as Witness

I’ve been working for a while now on the development of PhotoRealiserVR, a constituent part of The Eye As Witness, a new exhibition about victim photography created by the National Holocaust Centre and Museum. Working closely with Professor Maiken Umbach, from The University of Nottingham’s history department, I devised and developed PhotoRealiser – an early prototype of which appeared last year in this blog.

It takes the form of a VR experience where a visitor “walks into a photograph”. In this case, the photograph in question is one from Jürgen Stroop’s report on the destruction of the Warsaw ghetto in 1943. The installation features a large projected curtain which floats the photograph in mid-air, so that spectators see visitors really walking through the image. Once through the curtain, visitors find themselves in a recreation of the moment the photograph was taken, including the photographer himself and much else. The fabulous architectural CGI for the scene was created by VMI Studios, with whom I worked previously on Thresholds.

There are various interesting aspects of this work from a design standpoint. First, the topic itself is of course challenging. This is a propaganda photograph designed to “other” the Jews living in Warsaw, and to suppress the heroism of those involved in the ghetto uprising. Stroop’s report was supposed to show how easily the Nazi soldiers captured their prisoners, and to serve as a matter of record for future historians, but it is what is NOT shown in the photos that matters here. PhotoRealiser allows us to look beyond the frame of this image, to an interpretation of what may have been behind the camera and out of frame. Using evidence gathered from other photos in the report, we have placed a number of objects in the scene – a machine gun nest, burning buildings, prisoners being marched down the street, many more soldiers, and of course what those prisoners are looking at. We can never be sure what that was, but we have chosen to place a transport truck as the focus of their attention, which is certainly within the realms of possibility. A very large number of these people were transported to death camps and subsequently murdered.

The second aspect I’d like to draw attention to is the characters in the scene. They are carefully NOT an accurate reproduction of the exact people in the photograph, though they are similarly posed and bear a reasonable resemblance. We don’t know who these people were, and I feel it would not have been appropriate to recreate them exactly. Instead, I use representative characters.

Third is the notion of transition. Unlike many VR recreations, PhotoRealiser is designed to stay very rooted in the original photograph. As such, we begin in a modern gallery with the photograph in front of us. We are then asked to “step through”, voluntarily stepping into the space. This part of the design is about ensuring that we’re not experiencing any photo, we’re experiencing THIS photo. We can return to the gallery and look back, perhaps seeing the image with a fresh perspective, and, we hope, also looking at other images with the same more inquiring eyes.

Back in the gallery, you can also look out of the window and see the same location where the photograph was taken as it is now. Again, this is about grounding this historical recreation in the current time.

Much of the rest of the exhibition is concerned with showing us photographs not from the perpetrators, but rather those rarer photographs taken by the victims – photographs with an entirely different agenda. These photographs, taken at great personal risk, show their subjects as people doing normal things, and are an incredibly powerful reminder of the real lives that were shattered by the Nazi regime.


Technically, the system is a multi-user VR experience, built on wireless HTC Vive Pro Eye headsets. Using the Vive Pro Eye gives us the opportunity to apply foveated rendering, which increases the rendering effort spent on what a user is looking at and decreases it in the peripheral vision. This keeps quality and framerates high. Three visitors at a time can experience the exhibit, without the need for bulky backpacks as in Thresholds, thanks to the multi-channel WiGig-based wireless. The scene is a little too complex for standalone headsets, so a rack-mounted server suite does the legwork. As a professional nerd, I have to say that box looks pretty damn cool.
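For the curious, the essence of foveated rendering fits in a few lines. The sketch below is mine, in Python rather than shader code, and the eccentricity thresholds and shading rates are purely illustrative (the real system does this per-fragment on the GPU):

```python
import math

def eccentricity_deg(gaze_dir, frag_dir):
    """Angle in degrees between the gaze ray and a fragment's view ray."""
    dot = sum(g * f for g, f in zip(gaze_dir, frag_dir))
    norm = (math.sqrt(sum(g * g for g in gaze_dir))
            * math.sqrt(sum(f * f for f in frag_dir)))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def shading_rate(gaze_dir, frag_dir):
    """Full resolution in the foveal region, progressively coarser outside."""
    e = eccentricity_deg(gaze_dir, frag_dir)
    if e < 5.0:      # foveal: full quality
        return 1.0
    if e < 20.0:     # mid-periphery: half resolution
        return 0.5
    return 0.25      # far periphery: quarter resolution
```

In practice the fall-off is smoother than this three-step function, but the principle – spend the pixels where the eye is actually looking – is the same.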

Response to the exhibition has been incredibly positive, both from individuals and from the media, with great reviews from ITV Central, BBC Front Row, BBC Sunday, BBC Online, The Telegraph, The Times of Israel and many others.

This may be the piece of work of which I’m most proud. This picture is not 🙂

L-R: Henry Grunwald, Chair, NHC; Dr Paul Tennent; Prof Maiken Umbach, University of Nottingham; and Marc Cave, CEO, NHC, at the launch of the National Holocaust Centre’s Eye as Witness exhibition at the South Hampstead United Synagogue.

You can see The Eye as Witness in a number of locations around the UK throughout 2020. See the website for details of the tour.

Photocredits: David Parry, Wikicommons, Paul Tennent, Google, Henryk Ross

Thresholds

Thresholds, by artist Mat Collishaw, represents perhaps my biggest virtual/mixed reality project to date. Developed with Pete James, who sadly passed away earlier this year, Thresholds is a multi-user physical virtual reality experience, where visitors get to explore one of the first exhibitions of photography: William Henry Fox Talbot’s 91 ‘photogenic drawings’ on display at King Edward’s School in Birmingham in 1839. Visitors experience a lovingly recreated instance of the great hall (CGI work by VMI Studios), which has since been demolished, built with the guidance of architectural and social historians.

What makes Thresholds unique, however, is that unlike most VR, it doesn’t take place in an empty ‘real space’. Rather, a physical representation of the room was created, with the VR experience overlaid on that physical reality. This is a technique called substitutional reality, but done here to an unparalleled fidelity. Essentially, if you can reach a thing, you can touch it. The system also uses Leap Motion cameras to give you virtual representations of your hands, allowing you both to interact with the environment and to manipulate the images on display. Thresholds stretches the boundaries of what we can do right now with consumer VR technology (literally, in the case of the scale of the room we track with HTC Vive base stations!).
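To give a flavour of the registration problem substitutional reality poses – lining the virtual room up with the physical set so that reachable things really are where they appear – here is a minimal 2D calibration sketch. It assumes two floor markers measured in both coordinate spaces; the actual Thresholds calibration pipeline was more involved than this:

```python
import math

def calibrate(phys_a, phys_b, virt_a, virt_b):
    """Derive a 2D rigid transform (rotation + translation) mapping
    physical floor coordinates onto the virtual room, from two markers
    whose positions are known in both spaces."""
    ang_p = math.atan2(phys_b[1] - phys_a[1], phys_b[0] - phys_a[0])
    ang_v = math.atan2(virt_b[1] - virt_a[1], virt_b[0] - virt_a[0])
    theta = ang_v - ang_p
    c, s = math.cos(theta), math.sin(theta)
    # rotate phys_a, then translate so it lands exactly on virt_a
    tx = virt_a[0] - (c * phys_a[0] - s * phys_a[1])
    ty = virt_a[1] - (s * phys_a[0] + c * phys_a[1])
    return theta, (tx, ty)

def to_virtual(p, theta, t):
    """Map a tracked physical point into virtual-room coordinates."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + t[0], s * p[0] + c * p[1] + t[1])
```

Once the transform is known, every tracked position – headsets, hands, props – goes through the same mapping, so a physical table edge and its virtual twin coincide.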

Virtual exhibitions need not simply be recreations of existing museums – here in the virtual space we can break the social rules, so in Thresholds you can pick up the images to take a closer look. If you wanted to look at these images in real life, well, in many cases you simply couldn’t, as many of them are now too light-sensitive to be on display.

Thresholds has been on tour around the UK, premiering at Somerset House, before visiting Birmingham Museum and Art Gallery, Lacock Abbey, and The National Science and Media Museum, Bradford. It’s currently on display at Yapi Kredi Kültür Sanat Yayincilik A.S. Istanbul, and will be returning to the UK to go on display at the Bodleian Library, Oxford later this year. To date more than eight thousand visitors have experienced it, and it has been a resounding critical success, with a number of TV and national print media articles.

Thresholds is currently shortlisted for the 2018 South Bank Sky Arts Award for Visual Art.

The physical space where you experience Thresholds.

An etching and early photograph of the room where the exhibition took place

The experience is not just visual – for example the virtual fire is really hot!

Mat has created a viewing portal which allows for extraordinary images like these.

VR Playground

For over a year now, I’ve been working with Joe Marshall on the development of Brendan Walker‘s fabulous VR Playground. The spiritual successor to Oscillate, VR Playground takes the idea of VR on swings and runs with it (or swings with it). A set of up to eight swings, with each rider wearing a Samsung Gear VR, takes its riders on a journey through one of a series of beautiful abstract environments. In each world, the motion of the swing is re-mapped in a different way. High Roller sees the rider zooming through a cityscape, as if trapped in a giant hamster-ball; Jellyfish sees them jetting upwards from the abyss through a coral reef and into the open water, encountering various creatures; Shuttlecock sees you springing from building-roof to building-roof among the zeppelins; while Walker has you stomping through another city as if you were a giant robot. The worlds are brought to life with Brendan’s unique art style and a stunning responsive soundtrack by Matthew Olden.
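To give a sense of what “re-mapped in a different way” means in practice, here is a toy sketch of per-world swing mappings. The world names are from the piece, but the specific mappings and numbers below are purely illustrative – they are not the actual ones used in VR Playground:

```python
import math

def remap_swing(angle_rad, world):
    """Map the swing's pendulum angle to virtual motion, differently per
    world. Mappings and constants are invented for illustration only."""
    if world == "high_roller":     # rolling ball: angle -> forward speed
        return {"forward_speed": 8.0 * math.sin(angle_rad)}
    if world == "jellyfish":       # upward jet only on the forward swing
        return {"vertical_speed": max(0.0, 5.0 * math.sin(angle_rad))}
    if world == "shuttlecock":     # springing: amplitude -> jump strength
        return {"jump_strength": 10.0 * abs(math.sin(angle_rad))}
    raise ValueError(f"unknown world: {world}")
```

The interesting design point is that the same physical input – a rider pumping a swing – produces completely different sensations depending on which mapping is active.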

VR Playground has been exhibited at many locations, including contemporary art galleries, shopping centres, national museums, public parks and squares. It can be presented as visual artwork, performance, a technological marvel, or simply as an amusement ride. A unique visual-kinaesthetic experience – part thrill-ride, part art-exhibit, part showmanship – VR Playground represents a major outing for me and the Mixed Reality Lab. Now on an international tour, it has been deployed all over the UK, as well as in Munich, Seoul and, as of this weekend, Philadelphia, and has thrilled some 13,500 riders to date.

VR Playground is commissioned by Horizon Digital Economy Research, Norfolk & Norwich Festival, Greenwich + Docklands International Festival, and Without Walls. It is supported by Arts Council England and produced by Thrill Laboratory, Horizon Digital Economy Research, and Norfolk & Norwich Festival.


Integrated Immersive Inclusiveness

Working with Jo Robinson from English and Alex Mevel from Translation Studies, I’ve recently been involved in a project with Red Earth Theatre called Integrated Immersive Inclusiveness. This is a project which seeks to explore tech interventions for deaf, deafened and hearing-impaired theatre-goers. In conjunction with Red Earth Theatre, who are already leaders in creating inclusive theatre, we have now run two of our three workshops exploring this area.

In particular we have been looking into captioning, and how to make it less rubbish than it currently tends to be. We’re addressing a few things: the process of actually generating captions, which is currently decidedly painful for producers; projection mapping, to present them in a more dynamic way; and device selection – delivering targeted captions wirelessly to known screens on stage.
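As a flavour of the device-selection idea, here is a minimal sketch of routing caption cues to named on-stage screens. The screen names and cue format are invented for illustration – this is not Red Earth Theatre's actual setup:

```python
class CaptionRouter:
    """Route caption cues to named on-stage screens: a cue can be
    broadcast to every screen, or targeted at specific ones."""

    def __init__(self):
        self.screens = {}  # screen name -> list of delivered captions

    def register(self, name):
        """Make a screen known to the router."""
        self.screens[name] = []

    def send(self, text, targets=None):
        """Deliver a caption to the named screens, or to all if none given."""
        for name in (targets or self.screens):
            if name not in self.screens:
                raise KeyError(f"unknown screen: {name}")
            self.screens[name].append(text)

router = CaptionRouter()
router.register("boat_sail")
router.register("door_frame")
router.send("[distant thunder]")                    # broadcast to all screens
router.send("ANNA: Who's there?", ["door_frame"])   # targeted at one screen
```

The point of targeting is that a line can appear on the screen nearest the actor speaking it, rather than in one fixed caption box at the edge of the stage.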

This project is an ongoing concern and I will likely be writing more about it in the future, not least because the pictures are so fabulous!

Photocredit: Red Earth Theatre

Sentiment-Lite on Tour

Diane Wiltshire has taken her work Sentiment back on tour in a new, slightly less bulky format (very sensible in my book!). Originally designed to be taken to Jakarta, Indonesia – a trip for which the original installation would have been impractical – Sentiment Lite features the same soundscape as her original experience, but instead of seven speakers that the visitor walks around, it uses noise-cancelling headphones and a gyroscope, allowing the user to rotate in place to choose their audio narrative. Biodata from the interviewees (whose stories form the soundscape) is still presented to the user in the unique form of a back-caressing waistcoat.
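The gyroscope trick can be sketched quite simply: arrange the narratives in a circle around the listener and crossfade based on which way they are facing. The code below is my illustration of the idea, not Diane's implementation (seven channels here, mirroring the seven speakers of the original Sentiment):

```python
def channel_gains(yaw_deg, n_channels=7):
    """Volume for each of n audio narratives arranged in a circle around
    the listener; the narrative nearest the facing direction is loudest,
    with a linear crossfade to its neighbours."""
    sector = 360.0 / n_channels
    gains = []
    for i in range(n_channels):
        centre = i * sector
        # shortest angular distance from facing direction to this channel
        d = abs((yaw_deg - centre + 180.0) % 360.0 - 180.0)
        gains.append(max(0.0, 1.0 - d / sector))
    return gains
```

Facing a narrative directly gives it full volume; turning towards the next one fades the two against each other, which is roughly what walking between speakers did in the original installation.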

Sentiment Lite was at Camden People’s Theatre this weekend, and is heading to Liverpool next weekend.

Photocredit: Elly Clarke

Unity agents for ARIA platform

As part of the demonstrator build for ARIA-VALUSPA’s final review to the European Commission, I have been involved in developing the capability of running the ARIA-VALUSPA Platform in Unity3D. This allows you to create and place AVP characters (virtual humans) in any game environment, and to use appearances and backgrounds from huge existing libraries. Previously, ARIA agents lacked front-end usability, meaning that most work would necessarily focus on the agent rather than other aspects of the experience. Using Unity as the front end offers designers the full scope of that mature experience-development platform to create interesting applications for virtual humans.

To demonstrate this, I made a few simple VR and AR examples of the use of ARIA agents. Below are two different AR sample videos.


This is just one tiny part of the extremely impressive ARIA-VALUSPA project, directed by Michel Valstar, which I urge you to check out in more detail.

Photo Realiser VR

I’ve been exploring the use of VR to reveal the context in which photographs were taken. Working with Maiken Umbach, Professor of Modern History and Research Director of the Faculty of Arts, we developed a simple proof-of-concept system that allows users to walk through a photograph and look around, with a view to understanding more about the image. In this case we just simulate a photograph from an existing 3D model. The intention is to build the model from existing photographs, using historical knowledge to provide the context.

The experience, even of this simple proof-of-concept, seems to be quite powerful.

Warning: video includes holocaust imagery that some people might find disturbing.

An updated version of this project, now in a much more complete state can be found here.

Corrupt Kitchen VR

I’ve been involved in developing a rather fun little game called Paradise Cafe, for a project called Corrupt Kitchen VR. Working with Martin Flintham of the Mixed Reality Lab, Richard Hyde from the School of Law and Jan Mayer-Sahling, Professor of Political Science in the School of Politics, this cross-disciplinary project aims to explore people’s attitudes to corruption in the workplace, as well as health and safety in commercial environments. You’d think this would be an opportunity to develop a serious game, but since I was involved, players spend their time feeding burgers to a never-ending queue of hungry cartoon Elvises, all the while slavishly helped out by a team of grumpy robots. While the game itself is anything but serious, and owes a debt to the humour of games like Job Simulator, the topic it is exploring and the research it is enabling genuinely are.

Unusually (and through a certain amount of coercion), I have written a separate set of development blog-posts about this one for the university’s digital research team. You can find them here, including a post from Richard about his experience of the collaboration.

Unnecessary Violence

Amongst the various hats I wear, I have been working on and off for the past couple of years on a roomscale VR combat game called Unnecessary Violence. I’ve done a lot of game design work over the years, most of which has remained off the record, but this one is probably the biggest. You can read more about it on its own dedicated site. I had stalled somewhat when some system updates broke my previous engine quite badly and I knew I was going to have to do a ground-up rewrite of the mechanics, but recently I’ve found time to pick it up again, both because of offers of help from a couple of colleagues, and because those mechanics are actually proving useful in my work life. Since I’m now actually using some of this code in an academic setting, I conclude that the open secret of the game’s existence (and specifically my authorship of it) can lapse.

I’m not going to go into much detail about the game here, but suffice to say it’s a visceral experience. There are actually a decent number of interesting gameplay innovations hiding behind the gore, particularly in the way it does inventory management. It’s essentially the fighting game I wanted to play when I first put on a Vive. That game still doesn’t exist, so I think there’s still a space to publish this one.

Planking Simulator 2018

I’ve been doing a lot of work recently exploring sensation in virtual reality. One of the classics in this area is to make somebody stand on a plank, then in the VR world place that plank somewhere perilous. It’s such an engaging demo that I thought I’d have a go at making my own. Below is a screenshot from that work.

What sets mine apart from most (aside from the terrible graphics) is simple. I ask people to take their shoes off (it improves proprioception for their feet) and point a whacking great fan at them. Combined with the rushing wind noise they hear and the simple visuals (though I really do like the edge-detection, comic-book effect – not bad for an afternoon’s work), it’s enough to really set some folks off. A gentle tap when they’re already off balance is usually enough to get a scream or two. The brain is a wonderful thing. It knows perfectly well it’s four inches off the ground, but show it a yawning chasm and all of a sudden it’s convinced it’s in a life or death situation. I shall be exploiting this particular human weakness more soon.

It has occurred to me that I don’t have a photograph of somebody doing this. I shall remedy that presently.