TL;DR: we designed an XR app system to improve accessibility in the great outdoors.
How do we empower persons with motor impairments to better plan for and engage with local parks?
Enjoying a weekend trip to the beach or a hike through the local park is commonly a quick Google search away, but for persons with motor impairments, this is often not the case. The major information disconnect between outdoor parks and their visitors causes undue hardship for those with motor needs (such as wheelchair users, elderly users, or parents with strollers) and leaves them vulnerable to harsh consequences when accessibility data is inaccurate. Our goal was to better define the needs of this community and examine methods of bridging the information gap.
This project was completed with a 4-person team. My role focused on User Research and Design Prototyping.
My team and I were initially assigned to explore methods of “helping persons with a disability visit and engage with the great outdoors”. We immediately recognized the need to narrow this problem statement in both its use case and its user population. Over the course of our background research phase, we explored how people with different impairments interact with different types of outdoor spaces, tested current market solutions, and arrived at our focused problem statement.
For the initial research, we focused on incorporating users from a wide spectrum of geographic, impairment, demographic, and activity backgrounds to ensure our product could be universally designed. We used the following User Research Methods:
Using what we learned from these 6 avenues of user research, we created 4 Personas with Journey Maps and Empathy Maps to drive our design implications.
Using Design Implications derived from our preliminary User Research, we decided to test designs across different modalities to determine which could offer the greatest inclusiveness. We produced 3 low-fidelity prototypes in total: a voice-based design, a smartwatch and companion app design, and an augmented reality (AR)-based design.
Each prototype was evaluated extensively for accessibility against WCAG 2.1 and through 1-on-1 feedback sessions with expert evaluators. Additional focus was placed on increasing customizability, reducing the fine motor control required, and ensuring multi-modal input options in each of the designs.
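One part of a WCAG 2.1 audit is checkable programmatically: Success Criterion 1.4.3 requires a contrast ratio of at least 4.5:1 for normal text, computed from the relative luminance formula the standard defines. As a sketch of how palette checks like our colorblindness and contrast testing can be automated (the color values below are illustrative, not from our prototypes):

```python
# Sketch of a WCAG 2.1 contrast-ratio check (SC 1.4.3).
# Formulas follow the WCAG 2.1 definitions; example colors are invented.

def relative_luminance(rgb):
    """Relative luminance per WCAG 2.1, from an (R, G, B) tuple in 0-255."""
    def channel(c):
        c = c / 255
        # Linearize the sRGB channel value, per the WCAG 2.1 definition.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter color over darker."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Level AA requires >= 4.5:1 for normal-size text.
print(contrast_ratio((255, 255, 255), (0, 0, 0)))  # black on white: 21.0
```

A check like this catches only measurable contrast failures; the expert feedback sessions covered the criteria that need human judgment.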
Smartwatch Prototype Colorblindness Testing
AR Prototype Sketch
After analyzing findings from the low-fidelity prototype feedback sessions, we decided to move forward with only 2 of our originally prototyped ideas. We determined that the Voice-based design simply could not fulfill all critical user needs on its own, so we instead integrated what we learned from its evaluation into the voice-input features of the AR and Smartwatch prototypes.
With this feedback applied, we updated our prototypes.
Feedback sessions for the updated prototypes were conducted with both users and experts. The feedback data was then compiled and analyzed again as a group, which led us to combine the AR prototype with the Smartwatch prototype's companion app. Annotated Wireframes were compiled to highlight these next steps and help us identify key areas of improvement.
The final prototype resulting from our research is a platform with two distinct touchpoints:
To evaluate this newest iteration, we conducted User Evaluations with prominent accessibility advocates and professional athletes, as well as with caretakers and limb-different users. We evaluated using planned activities including:
Expert Evaluations were also conducted using:
Data was analyzed per evaluation group using comment counts, and plans for next steps were drafted from these findings.
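The comment-count analysis amounts to tallying coded feedback comments by evaluation group and theme. A minimal sketch of that tally (the groups, themes, and data below are invented for illustration):

```python
# Hypothetical comment-count tally: each coded comment is a (group, theme) pair.
from collections import Counter

coded_comments = [
    ("users", "navigation"), ("users", "voice input"), ("users", "navigation"),
    ("experts", "contrast"), ("experts", "navigation"),
]

# Count themes separately within each evaluation group.
counts = {}
for group, theme in coded_comments:
    counts.setdefault(group, Counter())[theme] += 1

for group, tally in counts.items():
    print(group, tally.most_common())
```

Ranking themes by frequency within each group is what surfaces the priorities for the next iteration.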
This project is ongoing, but please feel free to check out some of the newest iterations.