I co-led a team of designers, artists, and programmers to develop an AR application for EVA assistance, which we tested at NASA Johnson Space Center. As a Design Engineer on the team, I co-designed the HUD for Moon Buddy, wrote our modular UI front end, and helped build back-end functionality such as a listener for NASA's telemetry server.
Designing for Astronauts
“Closing your hand in Space is like squeezing a jug of milk.” — Jay Apt
While designing Moon Buddy, we interviewed ex-astronaut Jay Apt to learn about EVAs, how information is conveyed, and space navigation. From our conversation, we learned that body movement is extremely limited, that astronauts need help performing mechanical tasks (e.g., geology sampling and equipment repair), which environmental hazards they need to be aware of, and how emergencies are handled.
Using this information, I led design sessions where we created a HUD with these design pillars:
· Use voice as the primary input, since hand gestures are difficult for astronauts to perform.
· Important elements must sit in the periphery of the user's FOV, so they are never in the way but remain accessible.
· Each UI element must have an expandable version that shows more detail.
· Hazards hidden beneath harsh shadows must be visualized.
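The expandable-element pillar can be illustrated with a toy sketch (all names here are hypothetical, not the actual Moon Buddy code): each HUD element carries a compact view for the periphery and a detailed view, and a voice command toggles between them.

```python
class HudElement:
    """A HUD widget with a compact peripheral view and an expanded view."""

    def __init__(self, summary: str, details: str):
        self.summary = summary    # compact text shown in the periphery
        self.details = details    # full text shown when expanded
        self.expanded = False

    def toggle(self):
        # Triggered by a voice command such as "expand vitals".
        self.expanded = not self.expanded

    def render(self) -> str:
        return self.details if self.expanded else self.summary


vitals = HudElement("O2 87%", "O2 87% | CO2 0.4% | Suit 4.3 psi")
print(vitals.render())  # O2 87%
vitals.toggle()
print(vitals.render())  # O2 87% | CO2 0.4% | Suit 4.3 psi
```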
Programming Moon Buddy
For Moon Buddy, I was responsible for building the VUI (Voice User Interface) backend, which updated every HUD element with the latest data from the telemetry server and with the user's commands. Other notable features I wrote include a real-time map driven by the user's GPS position and the telemetry client interface.
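The update loop described above can be sketched as a simple publish/subscribe dispatcher (a minimal illustration with hypothetical names, not the actual Moon Buddy implementation): telemetry readings arrive as key/value pairs, and each HUD element registers a callback for the keys it displays.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Telemetry:
    """One reading from the telemetry stream, e.g. ('oxygen', 87.0)."""
    key: str
    value: float


class HudDispatcher:
    """Fans incoming telemetry out to the HUD elements bound to each key."""

    def __init__(self):
        self._subscribers: Dict[str, List[Callable[[float], None]]] = {}

    def subscribe(self, key: str, callback: Callable[[float], None]):
        self._subscribers.setdefault(key, []).append(callback)

    def handle(self, reading: Telemetry):
        # Push the latest value to every element watching this key.
        for cb in self._subscribers.get(reading.key, []):
            cb(reading.value)


# Usage: an oxygen gauge re-renders whenever a new reading arrives.
display = {}
dispatcher = HudDispatcher()
dispatcher.subscribe("oxygen", lambda v: display.update(oxygen=f"O2: {v:.0f}%"))
dispatcher.handle(Telemetry("oxygen", 87.0))
print(display["oxygen"])  # O2: 87%
```

Decoupling the telemetry source from the widgets this way is what lets voice commands and server updates drive the same HUD elements through one code path.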
Since we would be rapidly iterating on the app while on-site in Houston, I built our VUI to be modular and easily reconfigurable by designers. This paid off when we had to change the HUD's color during a break in testing to improve UI visibility at night.
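One way that kind of designer-driven reconfiguration can work, sketched here with hypothetical field names rather than the actual Moon Buddy config, is to keep the HUD theme in a plain JSON file the app can reload at runtime, so swapping to a night palette is a data edit rather than a code change.

```python
import json

# Hypothetical theme files a designer could edit between test runs.
DAY_THEME = '{"hud_color": "#FFFFFF", "opacity": 0.8}'
NIGHT_THEME = '{"hud_color": "#FF6A00", "opacity": 0.6}'


class Hud:
    """HUD whose visual parameters come from designer-editable JSON."""

    def __init__(self, theme_json: str):
        self.theme = json.loads(theme_json)

    def reload(self, theme_json: str):
        # Re-read the theme at runtime, e.g. during a break in testing.
        self.theme = json.loads(theme_json)


hud = Hud(DAY_THEME)
hud.reload(NIGHT_THEME)
print(hud.theme["hud_color"])  # #FF6A00
```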