Speaker Q&A: Nathaniel Guy discusses how NASA has utilized MR

Nat Guy is Lead Interface Developer at NASA Jet Propulsion Laboratory and will be at VRDC Fall 2017 to present the talk “Mixed Reality for Space Exploration”, which will discuss how virtual and mixed reality are currently being used to deal with problems encountered in space exploration. Here, Guy gives us some information about himself and how NASA has utilized MR.

Attend VRDC Fall 2017 to learn about immersive games & entertainment, brand experiences, and innovative use cases across industries.

Tell us about yourself and your work in VR/MR.

I’m a developer at the NASA Jet Propulsion Laboratory. I’m in a lab called the OpsLab, which specializes in building modern user interfaces to solve problems related to space exploration. We do a lot of work in virtual and mixed reality, and in my time at the lab I’ve worked on MR interfaces to navigate around Mars, VR interfaces to explore Earth science data in innovative ways, and a VR/MR toolkit for assisting astronauts with complex assembly procedures, as well as 2D and 3D interfaces for working with spacecraft data on a computer screen. Before working at JPL, I worked in the video game industry for 8 years, building skills which translate very well to displaying and manipulating real NASA data in 3D.

Without spoiling it too much, tell us what you’ll be talking about at VRDC.

I’ll be talking about some of my lab’s most exciting projects in the domain of VR and MR. We have a project called OnSight which assists Mars scientists by putting them “in the field” on the Martian surface, letting them do their science in a way that feels natural and intuitive to them. We also have a tool for CAD model viewing and interaction called ProtoSpace, which engineers on multiple flight projects are using to review their design data and make improvements to it in a collaborative, mixed reality setting. I’ll also give a look into some other projects we have in the works, and show some exciting research we’ve done which supports the effectiveness of immersion for accomplishing complicated data analysis tasks.

What excites you most about VR/MR?

Comparatively speaking, VR/MR is a very young field, and it’s beginning to take off in a big way. I remember the first time I tried what I’d call “modern VR.” I ran through a number of spectacular demos that various developers had prepared, taking advantage of the uniquely emotive and immersive powers of the medium. It was nothing short of magical, and I simply couldn’t stop thinking about it or talking about the experience to others. The ability to put users into a virtual space and give them memorable, emotionally powerful, and realistic experiences is something that NASA, which explores some of the most fantastic places in the Solar System, is uniquely positioned to take advantage of.

What do you think is the biggest challenge to realizing VR/MR’s potential?

MR/VR is still in a bit of a “wild west” stage of development. There are several pieces of innovative software that have begun to take advantage of these unique media, but we still don’t really have “standard practice” or “common sense” for how to use VR/MR to display data or provide a seamless user experience. This can make for a lot of clunky and vertiginous experiences, which can actually turn people off using MR/VR again. We hire UX designers and do countless hours of user research and design iteration to try to conquer these challenges, but it’s still very difficult. I think conferences like VRDC are necessary for sharing information and building up a common knowledge of proper design practices and techniques to make this new medium a success.

When did the NASA Jet Propulsion Laboratory start utilizing VR applications, and what have been the benefits/drawbacks of the technology?

We’ve experimented with user control via game controllers, hand tracking, and body tracking, as well as with immersive interfaces, to support JPL’s various missions. Our lab started working with VR technology several years ago, and we’d been building prototypes with immersive technologies long before that. Ever since then, we’ve been convinced of the effectiveness of VR and MR for tackling space exploration problems. In 2012, we released Mars Rover Landing, which used body-tracked motion controls to guide the Curiosity rover to a safe landing on the Martian surface. Since then, we’ve made a number of prototypes that have turned into actual MR/VR projects, actively used by several current and future missions.

As I see it, the unique immersive characteristics of VR offer a few advantages that are otherwise impossible to achieve. The way the technology engages our natural sense of space and proprioception is amazing, and is a key reason why we try to put our scientists and engineers in virtual environments to understand data. We’ve witnessed these interfaces engaging instincts and learned techniques in ways that data manipulation on a 2D screen cannot: the first time we put Mars geologists on a virtual Mars, we saw them immediately “run” up the virtual hills to get the lay of the land, just like they do in real life. We’ve also seen the powerful emotional effects that immersion can provide, which we believe are very important for public outreach.

On the other hand, we’ve seen that it can be difficult to get adoption from users who have spent decades looking at data in different ways, and integration with traditional, trusted tools can be a major technical challenge. We’ve been working closely with user researchers and the developers of existing tools to overcome these challenges.

Where can VR/MR be the most helpful for NASA in the future? For example, could it be used to train astronauts?

Absolutely. Our team sent two HoloLenses up to the International Space Station in 2015, in order to assist astronauts with performing complex tasks on orbit. I’ll be talking about this and follow-up work during my VRDC presentation. VR is actually already used within NASA to train astronauts; check out Matt Noyes’ talk from last year’s VRDC.

One of VR/MR’s most exciting possibilities is that it can let us take our scientists and engineers to far-off places throughout the Solar System: anywhere a robot can go, we can virtually send humans. This has the potential to make robotic exploration much more effective, giving humans telenautic superpowers they would never have otherwise.

Join experts behind innovative VR, AR, and mixed reality technologies at VRDC Fall 2017. Learn more and register today.