My capstone project team, Parallax, worked with the NASA HCI Group to research and build a wearable augmented reality (AR) system that assists astronauts in executing procedures aboard the International Space Station. We began with intensive research to understand the context of the user, then iterated on prototypes alongside user testing.





Lo-fi Prototype

Med-fi Prototype

Hi-fi Prototype

Final Prototype

User Interface View




As the Visual Design Lead, I was responsible for delivering all visual aspects of the project, including iterations on physical prototypes, user interfaces, presentations, the website, booklets, photographs, process documentation, and client materials. I also collaborated closely with the team throughout the UX research and design processes.

Problem Space

During regular missions aboard the International Space Station (ISS), NASA astronauts are tasked with conducting scientific experiments and performing a variety of maintenance operations. Carrying out these operational tasks poses memory recall challenges because of the long gap between when astronauts receive training and when they execute tasks aboard the ISS. As NASA enters a new era of human space exploration that extends beyond low-Earth orbit, an intelligent system that provides instructions can empower astronauts to execute procedures autonomously.


Mission Control on Ground

Astronaut working inside the ISS

Astronaut on Extravehicular Activity




Over the spring semester, my team conducted research to better understand the context in which an astronaut works, review existing literature in relevant areas, and understand applicable technology.

We performed contextual inquiries at the Arc Jet Complex at Ames Research Center and the Thermal Protection System Facility at Kennedy Space Center, which let us observe two NASA procedures as they were being executed. Because access to current astronauts is extremely limited, we also researched nine analogous domains whose conditions map closely to those of astronauts. This research included two additional contextual inquiries, twenty-one interviews, and three experiential learning sessions in which we performed the domain activities ourselves to gain hands-on knowledge of the experience.



Finally, we modeled all our research data in sequence models, flow models, and ultimately a 1,000+ note affinity diagram. Analyzing these data, we found that many of our behavioral findings related to procedure execution and pointed toward feasible solutions in NASA's context. Below are our research insights:

  1. Coordination between planning and execution can be improved by worker suggestions.
  2. Spatial references help guide workers through a procedure.
  3. Workers track status within a procedure.
  4. Adjusting instructions for a user can improve understanding.




After presenting our research, we engaged with our clients and other HCI professionals at NASA Ames Research Center in brainstorming activities centered on our four research findings. We crafted many visions and, after discussing the merits and drawbacks of each with our clients, settled on a single one: leveraging connected devices to improve procedure execution.


Sketching Visioning Ideas

Impact vs. Feasibility Matrix 
Brainstorming potential features

Body Storming Visioning Ideas



Our first set of prototypes aimed to test different form factors and strategies for guiding users spatially through an environment and cognitively through a procedure.

The three low fidelity prototypes we explored were a wearable with audio, a tablet, and augmented reality.

We used a basic cooking recipe as the backdrop of our usability testing with advanced and novice cooks.


Medium Fidelity Prototype

Augmented reality, voice commands, and guidance arrows worked best in our low fidelity usability tests, and so we further explored these facets in a digital system.

To mimic augmented reality, we mounted an iPhone on a bike helmet. This prototype also used the Wizard-of-Oz technique to manipulate the interface shown to the user.

Our usability testing participants were five users from NASA (mission planners, developers, and user experience designers), all of whom were unfamiliar with this procedure.



Improving on previous iterations, we refined the technology and physical construction of our system for the high fidelity prototype. Voice commands were used to navigate the interface and move through the test procedure, while wireless Bluetooth beacon technology helped users determine their proximity to tools.
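To make the beacon idea concrete, below is a minimal sketch of proximity ranging using Apple's CoreLocation iBeacon APIs, assuming each tool is tagged with a beacon whose minor value identifies it. The class name, UUID, and tagging scheme are illustrative assumptions, not the prototype's actual implementation.

```swift
import CoreLocation

/// Ranges iBeacons attached to tools and reports a rough distance estimate.
final class ToolBeaconRanger: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    // Hypothetical UUID shared by all tool beacons in this sketch.
    private let toolUUID = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!

    /// Called with the tool's beacon minor value and estimated distance in meters.
    var onDistanceUpdate: ((UInt16, Double) -> Void)?

    override init() {
        super.init()
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
    }

    func startRanging() {
        manager.startRangingBeacons(satisfying: CLBeaconIdentityConstraint(uuid: toolUUID))
    }

    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        // `accuracy` is CoreLocation's rough distance estimate in meters;
        // negative values mean the distance could not be determined.
        for beacon in beacons where beacon.accuracy >= 0 {
            onDistanceUpdate?(beacon.minor.uint16Value, beacon.accuracy)
        }
    }
}
```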

In terms of physical form, we mounted an iPad mini on a construction helmet and displayed the interface on teleprompter glass.

We tested this prototype with six NASA users working in human systems engineering, quality assurance, user experience, and interaction design.



Final Design


ANDRW 3000 (AugmeNteD Reality Wearable, whose name is a nod to the founders of Carnegie Mellon University) is the culmination of the prototyping and usability testing of our prior design iterations.

ANDRW is an intelligent, connected device that leverages various technologies to empower astronauts performing unfamiliar procedures while maximizing their mobility. By overlaying written and visual procedure information onto their real-world field of view, ANDRW allows users to stay focused on the task at hand.

A carefully designed AR user interface supports intuitive use and clear legibility against various background colors and textures. ANDRW’s speech recognition feature accepts verbal commands to navigate the user interface while both hands are free to perform work. The system also aids the user by automatically providing guidance to tool locations, reducing the time spent looking for misplaced tools.
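As a sketch of how hands-free command handling might look on the iOS hardware we prototyped with, the snippet below streams microphone audio through Apple's Speech framework and matches the live transcript against a small command list. The vocabulary and class name are assumptions for illustration, not ANDRW's actual grammar; permission prompts and error handling are omitted for brevity.

```swift
import AVFoundation
import Speech

/// Listens continuously and fires a callback when a known phrase is heard.
final class VoiceCommandListener {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()

    func start(onCommand: @escaping (String) -> Void) throws {
        let commands = ["next step", "previous step", "show tools", "hide procedure"]
        request.shouldReportPartialResults = true

        // Feed microphone buffers into the recognition request.
        let input = audioEngine.inputNode
        input.installTap(onBus: 0, bufferSize: 1024,
                         format: input.outputFormat(forBus: 0)) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        recognizer?.recognitionTask(with: request) { result, _ in
            guard let spoken = result?.bestTranscription.formattedString.lowercased()
            else { return }
            // Fire on the first registered command found in the transcript.
            // A real system would debounce repeated partial-result matches.
            if let match = commands.first(where: { spoken.contains($0) }) {
                onCommand(match)
            }
        }
    }
}
```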

Additionally, by natively storing multiple procedures and contingencies, ANDRW enables greater autonomy by quickly providing critical information that astronauts would otherwise have to request from ground support.


ANDRW System
A typical usage flow for a user completing a procedure step using the ANDRW system



The user interface was designed for the screen size of the iPad mini. We chose black for the background because black reflects very little on teleprompter glass, making the interface appear more transparent. Against the black background, we used high-contrast colors so interface elements stand out, and a thicker sans-serif typeface to increase legibility on the glass.
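A small sketch of how these styling rules might be encoded on iOS; the specific color and font values are assumptions, while the principle, that black reads as transparent on the glass and heavy, high-contrast strokes stay legible, comes from the design above.

```swift
import UIKit

/// Illustrative styling constants for the teleprompter-glass display.
enum HUDStyle {
    static let background = UIColor.black      // reads as transparent on the glass
    static let bodyText   = UIColor.white      // high contrast against black
    static let commandCue = UIColor.green      // available verbal commands
    // A heavier sans-serif weight keeps strokes legible through the glass.
    static let stepFont   = UIFont.systemFont(ofSize: 28, weight: .bold)
}
```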


The current step in the procedure is displayed in the middle of the screen along with a descriptive image. The screen also shows the previous step above the current step and the next step below the current step in order to give the user context for where they are within the procedure. Text listed in green is an available verbal command.
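This three-row display implies a simple procedure data model; here is a minimal sketch with hypothetical type and property names.

```swift
/// One step of a procedure, with an optional descriptive image and an
/// optional beacon-tagged tool to locate (both hypothetical fields).
struct ProcedureStep {
    let text: String
    let imageName: String?
    let toolBeaconMinor: UInt16?
}

/// Exposes previous/current/next for the three-row step display and is
/// driven by the "next"/"previous" verbal commands.
struct Procedure {
    let title: String
    var steps: [ProcedureStep]
    private(set) var index = 0

    var previous: ProcedureStep? { index > 0 ? steps[index - 1] : nil }
    var current:  ProcedureStep  { steps[index] }
    var next:     ProcedureStep? { index + 1 < steps.count ? steps[index + 1] : nil }

    mutating func advance() { if index + 1 < steps.count { index += 1 } }
    mutating func goBack()  { if index > 0 { index -= 1 } }
}
```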

When the current step in a procedure involves tool finding, navigation arrows are displayed to guide users to the location of the tool.

When the user is in close proximity to the tool, the arrow information becomes more detailed and a distance reading appears in the middle of the circle.

The start screen is a list of all available procedures. Commands that are not applicable to this screen are hidden.

The Procedure Overview is available at any time for the user to survey all sections of the procedure. The current section and step number are highlighted, and users can jump between sections using the verbal command visible in the top right corner.

When the system is running, a user can access the list of all verbal commands from any system state. Throughout the system, verbal command cues are highlighted in green. This removes the need to memorize each of the commands.
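One way to model the context-sensitive commands described above is a simple state-to-commands map, so each screen surfaces (and highlights in green) only the commands that apply there; the state and command names in this sketch are assumptions.

```swift
/// Hypothetical system states and the verbal commands available in each.
enum SystemState: Hashable { case startScreen, step, overview, hidden }

let availableCommands: [SystemState: [String]] = [
    .startScreen: ["open procedure", "show commands"],
    .step:        ["next step", "previous step", "show tools", "overview", "hide procedure"],
    .overview:    ["jump to section", "back"],
    .hidden:      ["show procedure"],
]
```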

The Required Items list is available at any time during a procedure. 

Tool images are also available at any time during a procedure; users can issue voice commands to display an image of any required item.

The procedure screen can be hidden at any time during a procedure, giving users a clear view of their main focus region.


For more details, check out the Research Report and Design Report.




Our working prototype demonstrates how an intelligent system can improve procedure execution for astronauts. It was developed over three months under the constraints of current technology, so there is still much room for improvement. Possible future extensions include object recognition, integrated speech recognition, tool sensing, and context-aware guidance.

During the spring semester we also envisioned many ideas that could improve the experience of astronauts working through procedures. We implemented only a few of them in order to deliver an inspirational prototype within a short timeframe. Some ideas for future research are:

  1. A feedback system that captures an astronaut's performance during a procedure as an indicator of where the procedure itself could be improved.
  2. Complex progress indication, such as secondary progress bars, to model deviations in the natural flow of a procedure.
  3. Modulating information granularity to personalize a procedure to an astronaut's needs: an experienced astronaut might not need to see every step, while another may need additional detail for a particular step.