Visual Design Lead
Jan – Aug 2015
My capstone project team, Parallax, worked with the NASA HCI Group to research and build a wearable augmented reality (AR) system that assists astronauts in executing procedures aboard the International Space Station. We began with intensive research to understand the user's context, then iterated on prototypes through successive rounds of user testing.
As the Visual Design Lead, I was responsible for delivering all the visual aspects of the project including iterations on physical prototypes, user interfaces, presentations, website, booklets, photographs, process documentation, client materials, and more. I also collaborated closely with the team in UX research and UX design processes.
During regular missions aboard the International Space Station (ISS), NASA astronauts are tasked with conducting scientific experiments and performing a variety of maintenance operations. However, carrying out these operational tasks involves memory recall challenges due to the length of time that elapses between when astronauts receive training and when they execute tasks aboard the ISS. As NASA undertakes the next leap in a new era of human space exploration that extends beyond low-Earth orbit, an intelligent system that provides instructions can empower astronauts to autonomously execute procedures.
UNDERSTANDING THE CONTEXT
Over the spring semester, my team conducted research to better understand the context in which an astronaut works, review existing literature in relevant areas, and understand applicable technology.
We performed contextual inquiries at the Arc Jet Complex at the Ames Research Center and the Thermal Protection System Facility at Kennedy Space Center, which allowed us to observe two NASA procedures as they were being executed. Additionally, because access to active astronauts is extremely limited, we researched nine analogous domains whose conditions map closely to those of astronauts. This research included two additional contextual inquiries, twenty-one interviews, and three experiential learning sessions in which we performed the domain activities ourselves to gain hands-on knowledge of the experience.
Finally, we modeled all our research data in sequence models, flow models, and ultimately a 1000+ note affinity diagram. Analyzing these data surfaced behavioral findings about procedure execution that could lead to feasible solutions in NASA's context. Our research insights were:
- Coordination between planning and execution can be improved by worker suggestions.
- Spatial references help guide workers through a procedure.
- Workers track status within a procedure.
- Adjusting instructions for a user can improve understanding.
After presenting our research, we engaged with our clients and other HCI professionals at NASA Ames Research Center in brainstorming activities centered on our four research findings. We crafted many visions and ultimately, after discussing the merits and drawbacks of each with our clients, settled on a single vision: leveraging connected devices to improve procedure execution.
LOW FIDELITY PROTOTYPES
Our first set of prototypes aimed to test different form factors and strategies for guiding users spatially through an environment and cognitively through a procedure.
We explored three low fidelity prototypes: a wearable with audio, a tablet, and an augmented reality display.
We used a basic cooking recipe as the backdrop of our usability testing with advanced and novice cooks.
MEDIUM FIDELITY PROTOTYPE
Augmented reality, voice commands, and guidance arrows worked best in our low fidelity usability tests, and so we further explored these facets in a digital system.
To mimic augmented reality, we mounted an iPhone on a bike helmet. This prototype also used a Wizard-of-Oz technique, in which a team member manipulated the interface shown to the user behind the scenes.
Our usability testing participants were five users from NASA (mission planners, developers, and user experience designers), all of whom were unfamiliar with this procedure.
HIGH FIDELITY PROTOTYPE
Building on previous iterations, we refined the technology and physical construction of our system for the high fidelity prototype. Voice commands were used to navigate the interface and move through the test procedure, while wireless Bluetooth beacon technology helped users determine their proximity to tools.
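To illustrate how beacon-based tool proximity could work, here is a minimal sketch using the standard log-distance path-loss model to turn a beacon's received signal strength (RSSI) into an approximate distance. The function names, calibration values, and tool readings are illustrative assumptions, not details of the actual prototype.

```python
def estimate_distance_m(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Estimate distance in meters from a BLE beacon's RSSI using the
    log-distance path-loss model. tx_power_dbm is the beacon's calibrated
    RSSI measured at 1 meter (a typical default is around -59 dBm)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def nearest_tool(readings):
    """Given a mapping of tool name -> latest RSSI reading (dBm),
    return the tool the user is currently closest to."""
    return min(readings, key=lambda tool: estimate_distance_m(readings[tool]))

# Example: a stronger (less negative) RSSI indicates a closer beacon.
readings = {"torque wrench": -62, "multimeter": -75}
print(nearest_tool(readings))  # → torque wrench
```

In practice raw RSSI is noisy, so a real system would smooth readings (for example with a moving average) before estimating distance.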
In terms of physical form, we mounted an iPad mini on a construction helmet and displayed the interface on teleprompter glass.
We tested this prototype with six NASA users working in human systems engineering, quality assurance, user experience, and interaction design.
ANDRW 3000 (AugmeNteD Reality Wearable, whose name is a nod to the founders of Carnegie Mellon University) is the culmination of prototype and usability testing of prior design iterations.
ANDRW is an intelligent, connected device that leverages various technologies to empower astronauts performing unfamiliar procedures while maximizing their mobility. By overlaying written and visual procedure information over their real world field of view, ANDRW allows users to stay focused on the task at hand.
A carefully designed AR user interface supports intuitive use and clear legibility against various background colors and textures. ANDRW’s speech recognition feature accepts verbal commands to navigate the user interface while both hands are free to perform work. The system also aids the user by automatically providing guidance to tool locations, reducing the time spent looking for misplaced tools.
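The hands-free navigation described above can be sketched as a simple dispatch from recognized speech transcripts to procedure actions. This is a hypothetical illustration; the command phrases and the `Procedure` class are assumptions for the sake of the example, not ANDRW's actual implementation.

```python
class Procedure:
    """Minimal model of a procedure as an ordered list of steps."""

    def __init__(self, steps):
        self.steps = steps
        self.index = 0

    def current_step(self):
        return self.steps[self.index]

    def next(self):
        # Advance one step, stopping at the final step.
        if self.index < len(self.steps) - 1:
            self.index += 1
        return self.current_step()

    def previous(self):
        # Go back one step, stopping at the first step.
        if self.index > 0:
            self.index -= 1
        return self.current_step()

def handle_command(procedure, transcript):
    """Map a recognized speech transcript to a navigation action,
    keeping the user's hands free to perform work."""
    commands = {
        "next step": procedure.next,
        "previous step": procedure.previous,
        "repeat step": procedure.current_step,
    }
    action = commands.get(transcript.strip().lower())
    # Unrecognized commands leave the display unchanged.
    return action() if action else procedure.current_step()
```

A dictionary dispatch like this keeps the command vocabulary small and predictable, which matters when speech recognition must work over ambient station noise.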
Additionally, by natively storing multiple procedures and contingencies, ANDRW enables greater autonomy by quickly providing critical information that astronauts would otherwise have to request from ground support.
The user interface was designed for the screen size of the iPad mini. We chose black as the background color because black is not reflected by the teleprompter glass, so those areas of the interface appear transparent. Against this background we used high-contrast colors to make content stand out, and a thicker sans-serif font to increase legibility on the glass.
For more details, check out the Research Report and Design Report.
Our working prototype demonstrates how an intelligent system can improve procedure execution for astronauts. It was developed over three months within the constraints of today's technology, so there is much room for improvement. Possible future extensions include object recognition, integrated speech recognition, tool sensing, and context-aware guidance.
Also, over the course of our spring semester, we envisioned various ideas that could improve the experience of astronauts working through procedures. We implemented only a few of them so that we could still deliver an inspirational prototype within a short timeframe. Some ideas for future research are:
- A feedback system that captures an astronaut's performance during a procedure as an indicator of potential procedural improvements.
- Complex progress indication, such as secondary progress bars, to model deviations from the natural flow of a procedure.
- Modulating information granularity to personalize a procedure to an astronaut's needs; an astronaut with prior experience might not need to see every step, while another may need additional detail for a particular step.