Navi AR

Most people today use their smartphones to search for destinations and navigate to them while walking or using public transportation.

Navigating with a smartphone app demands constant attention to the screen. While users' attention is on the phone, they are no longer present in their physical surroundings.

The Challenge

Our goal for this project was to design an application for AR glasses that enables users to search for and navigate to their desired location on foot or by public transportation, without having to frequently look down at their phones for directions.

Our intention was to increase user safety while navigating and, at the same time, free the user's hands from holding the phone for long periods.

My Role

User Research

  • I conducted observational and participatory field studies.
  • I collaborated on semi-structured interviews.

User Flows

I created user flows for two scenarios, covering voice interaction and a smartphone fallback.

Led Co-Design Workshops

I led two co-design workshops to generate divergent ideas for Navi AR.

Augmented Reality Prototype 

I worked on an augmented reality prototype of the navigation experience.

Voice User Interface Design

I collaborated on sample VUI dialogs, a basic grammar of acceptable user responses, and a dialog flow for a low-fidelity VUI prototype.

The Approach

The Discovery

Observational Field Study

  • I conducted an observational study of how people behave on the streets while walking with their attention on their mobile phones. Many people do this on sidewalks and while crossing streets: they glance ahead for a second, then look at their phones for much longer, so their situational awareness reflects what they saw several seconds ago. This can lead to collisions with other pedestrians, accidents while crossing streets, and delayed responses to green lights.
  • I observed people walking with multiple bags on their shoulders, one hand carrying an item and the other holding a mobile phone, with their attention mostly on the screen.

Participatory Field Study

  • I did a participatory study: I used a navigation app on my phone to set a destination, walked to a bus stop, rode a public bus, got off at the correct stop, and walked to my destination.
  • In bright daylight, directions on the mobile screen were hard to see.
  • The directions in the navigation app were visual only, with no voice alerts for turns. This required me to pay close attention to the screen so I wouldn't miss a turn or my bus stop, which meant I paid less attention to what was going on around me. I felt that if something unexpectedly came in front of me, there would be a collision.
    • On one occasion, while walking through a parking lot and looking at the directions, I didn't notice a car in front of me until it was very close.
    • I also missed my bus stop even though I had noted its name in the app, because neither the bus nor the app announced it. This cost me a 15-minute walk back from the next stop.
    • I didn't realize there were people walking toward me as I was trying to figure out my directions on the phone.
    • While waiting at a crossing, I tried to figure out my next turn, so I was slow to notice the light had turned green.
  • At big intersections with multiple turns, it is difficult to tell which turn to take when street signs aren't visible from a distance.
  • It was also difficult to tell which direction I was facing on the map; I had to physically turn and check the map to orient myself in the app.

"I felt if something unexpectedly came in front of me there would be a collision. "

Semi-Structured Interviews

  • Participants liked the detailed information provided by the navigation app such as the price of the transportation ride, directions to their destination, bus numbers, and transfers.
  • When the phone has no signal, it is difficult to find directions, especially after missing a bus stop.
  • Sometimes the navigation app does not detect where the user is until they walk a few meters ahead.

The Journey

Journey Map of a Traveller using Navi AR

Co-Designing

I led two co-design workshops with the goal of generating divergent ideas for Navi AR.

After reflecting on the sessions with my project partner, we distilled some key points.

  • It would be beneficial to have multiple modes of interaction with Navi AR; depending on the context, this would make the app more accessible.
    • For search: voice-to-text, sign-to-text, virtual keyboards, and integration with a smartphone or smartwatch.
    • For navigation: basic hand gestures and voice commands.
  • The system should adjust the UI based on the user's movement: only necessary information should be displayed while the user is walking, and other UI elements should be minimized to avoid distraction and ensure safety.
  • The UI elements can be translucent to increase safety, and there should be an option to switch to an audio-only mode.
  • Basic information like arrival time for the next bus, next steps towards the destination, estimated time of arrival, walking distance, alerts for weather conditions, and route conditions should be available.  
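The movement-based rule above can be sketched in code. This is a minimal, hypothetical illustration, not the Navi AR implementation: the element names, the speed threshold, and the two-state model are all assumptions made for the example.

```python
# Illustrative sketch of movement-adaptive UI: show only essential
# navigation info while the user is walking, the full set when stopped.
# Element names and the 0.5 m/s walking threshold are assumptions.

ESSENTIAL = {"next_step", "eta", "alerts"}
FULL = ESSENTIAL | {"next_bus_arrival", "walking_distance", "weather", "route_conditions"}

def visible_elements(speed_m_s: float, walking_threshold: float = 0.5) -> set:
    """Return the set of UI elements to render for the current motion state."""
    return ESSENTIAL if speed_m_s >= walking_threshold else FULL
```

In a real system the speed estimate would come from the glasses' motion sensors, and the transition would likely be smoothed to avoid the UI flickering between states.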

Voice User Interface Design

From our co-design workshops, it was clear that voice would be an important part of Navi AR. So, we collaborated on sample VUI dialogs for five scenarios, along with a basic grammar of acceptable user responses in each. We created a dialog flow before working on a low-fidelity VUI prototype.

Dialog Flow for VUI
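A dialog flow like this can be modeled as a small state machine: each (state, intent) pair maps to a next state, and anything outside the accepted grammar keeps the current state so the system can reprompt. The states and intents below are illustrative, not the actual Navi AR grammar.

```python
# Hypothetical encoding of a VUI dialog flow as a state machine.
# Transitions map (current state, recognized intent) -> next state.

DIALOG_FLOW = {
    ("idle", "search_destination"): "confirm_destination",
    ("confirm_destination", "yes"): "navigating",
    ("confirm_destination", "no"): "idle",
    ("navigating", "cancel"): "idle",
    ("navigating", "arrived"): "idle",
}

def next_state(state: str, intent: str) -> str:
    """Advance the dialog; utterances outside the grammar keep the state (reprompt)."""
    return DIALOG_FLOW.get((state, intent), state)
```

Keeping the flow as data rather than branching code makes it easy to review with non-programmers in a workshop and to extend with new scenarios.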

User Flow

We wanted users to have multiple ways to interact with Navi AR, so I worked on user flows for two scenarios. In the first, the user completes the task with voice commands; in the second, where there is a lot of background noise, the user switches to an integrated smartphone app. This integration also makes the app more accessible for people who have speech disabilities, don't speak English, or have a strong accent.
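The noise-based fallback in the second flow could be triggered automatically. A tiny sketch, with an assumed (not user-tested) noise threshold:

```python
# Illustrative modality fallback: prefer voice input, but suggest the
# paired smartphone app when ambient noise would make recognition
# unreliable. The 70 dB threshold is an assumption for this sketch.

def choose_input_mode(ambient_noise_db: float, voice_limit_db: float = 70.0) -> str:
    """Return which input modality to offer for the current noise level."""
    return "voice" if ambient_noise_db < voice_limit_db else "smartphone"
```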

Prototype

Lessons Learnt

  • The background of an augmented reality interface changes with the user's physical environment, so the color behind the UI varies wherever the glasses are used. The color of the UI elements therefore needs to be dynamic, so there is always enough contrast for the user to read text and see other elements.
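One standard way to make that choice is the WCAG relative-luminance and contrast-ratio formulas: sample the background behind a UI element, then pick whichever text color (here, simply white or black) contrasts more. This is a sketch of the general technique, not what we implemented.

```python
# Pick a legible text color for a sampled AR background using the
# WCAG 2.x relative-luminance and contrast-ratio definitions.

def relative_luminance(rgb):
    """WCAG relative luminance of an sRGB color given as 0-255 ints."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, from 1:1 up to 21:1 (white on black)."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

def pick_text_color(background_rgb):
    """Choose white or black text, whichever contrasts more with the background."""
    white, black = (255, 255, 255), (0, 0, 0)
    if contrast_ratio(white, background_rgb) >= contrast_ratio(black, background_rgb):
        return white
    return black
```

A production AR renderer would sample luminance continuously per region and could also fall back to a translucent backing plate when neither text color reaches a comfortable contrast ratio.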