At USC’s Institute of Creative Technologies in LA, the future is here – and it’s virtual
After writing about Google Cardboard and its predecessor – a device made by the University of Southern California’s Institute of Creative Technologies – last week, I realised I had a layover of several hours in Los Angeles. Seeing as the Institute lies only a 15-minute drive from the airport, I couldn’t resist a visit. What I found strolling around the Institute’s Mixed Reality Research Lab was a view of the future I’d seen before only in books.
In one room I donned a set of Virtual Reality goggles and entered a military training exercise in Afghanistan where I wandered down a dusty street, in and out of buildings (appropriately accessorised with Kilim rugs) looking for a cache of hidden weapons. When I took the goggles off I found I was in a room with black curtains and a concrete floor rather than a dusty back street of Lashkar Gah.
What was fascinating was that the area I’d explored was much larger than the room I was in. This was possible thanks to a technique called “change blindness” redirection, developed by Evan Suma, a research assistant professor at ICT and the USC Department of Computer Science.
As I entered a virtual room, the location of the door I’d walked through switched from one wall to another. This meant that when I turned around to walk back out of the door, I was heading in a different physical direction. This idea of “stretching space” has profound implications – it means the virtual worlds you can explore are no longer limited by the physical space you have.
Mark Bolas, the Mixed Reality Research lab’s associate director, told me this “redirected walking” can give you an essentially limitless area to explore.
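To make the trick concrete, here is a minimal sketch – my own illustration, not ICT’s actual software – of the door-relocation idea under a simplifying assumption that rooms are square and doors sit on one of four walls:

```python
# Change-blindness redirection, sketched: while the user explores a
# virtual room, the exit door is silently moved to a different wall.
# Walking back "out" then points the user in a new physical direction,
# so a long virtual route folds into a small tracked space.

WALLS = ["north", "east", "south", "west"]

def relocate_door(entry_wall: str, turn_quarters: int = 1) -> str:
    """Return the wall the door is quietly moved to while the user looks away."""
    i = WALLS.index(entry_wall)
    return WALLS[(i + turn_quarters) % 4]

# The user enters through the north wall; while they search the room,
# the door jumps to the east wall.
print(relocate_door("north"))  # -> east

# Each relocated exit rotates the user's physical heading by 90 degrees,
# so four virtual rooms laid out in a straight line actually walk the
# user in a loop around the same physical space.
heading = 0  # physical heading in degrees
for _ in range(4):
    heading = (heading + 90) % 360
print(heading)  # -> 0: the user ends up facing where they started
```

The point of the sketch is the loop at the end: the virtual path keeps going forward while the physical path keeps curling back on itself.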
In another room I looked at a series of iPads through 3D-printed virtual reality goggles which cleverly hook onto the iPad. I was able to drive a rover over Mars and, by tapping the screen, switch between the rover’s view and an aerial one. I read an organic chemistry book and was then able to look through the bonds of a particularly complex chemical compound.
I entered a game in which I explored different areas of a room just by turning my head, and a comic book in which a few of the characters leapt off the page. This device has been named the “inVRse” by the lab’s Special Project Manager, David Nelson, who likes it because, as he describes it, you can “enter the content”. While an Oculus Rift headset will set you back £200, these lenses cost less than £20 if you have access to a 3D printer.
Then I entered something called Project BlueShark. It’s a crazy codename given to ‘the Enhanced Environment for Communication and Collaboration’, which is funded principally by the Navy. In essence, it is a virtual control room to train sailors. I walked in, donned goggles and hand sensors (see photograph above) and was guided through the training by a camouflage-attired virtual human avatar called Chief Foster.
By touching the glass panels around me I could relocate to the Crow’s Nest Cam, the Bridge or the Combat Decision Centre. LED lights on my headset and hands enabled the cameras around me to tell where I was looking or reaching.
In reality I was touching a piece of glass – but when I looked through the goggles I saw my hands touching control panels. When looking through the Crow’s Nest Cam I was up on the ship’s mast looking at other ships in the water ahead of me. On the Bridge I felt a little bit like Captain Kirk from Star Trek.
Most of the lab’s funding comes from the US Department of Defense; it is based in Los Angeles because the city is home to film and games industry artists as well as computer and social scientists. More than 75,000 soldiers have trained with technologies that the Institute has invented.
Virtual Iraq and Virtual Afghanistan recreate the sights, sounds and sensations of Iraq and Afghanistan and are used to treat soldiers for Post Traumatic Stress Disorder in 55 military and university-based clinical sites. The first version of Virtual Iraq came from a game developed by the ICT called Full Spectrum Warrior.
The ICT’s work with ‘virtual humans’ has led to some remarkable discoveries. The Institute’s Orli Belman told me about an experiment they’d conducted recently, in which they discovered that people are more likely to be honest in disclosing medical issues to virtual doctors than to real ones.
The experiment involved 239 people interacting with a virtual human doctor on a computer screen. The ‘doctor’ asked a series of questions, first getting to know the patient (“Where are you from?”), before asking them about medical issues. The virtual human was programmed to give both verbal empathy (“I’m sorry to hear that”) and empathetic listening through nodding and facial expressions.
Half of the participants were told their conversation was entirely computer-driven and not being observed. The others were told that the avatars were being manipulated, and prompted to ask questions, by a real human in an adjacent room. The result? People were more likely to be open and honest when they believed they were speaking exclusively to a computer.
The future is here, and it is virtual.
Article featured in The Week