Team

Leila Kazemzadeh, Kavya Jade, Adam Whitney

Duration

Feb 2024 - April 2024

Type

UX/UI design, User research, Usability Testing

Overview

An external human-machine interface (eHMI) for autonomous vehicles that communicates the vehicle's intent visually and aurally to nearby pedestrians, especially in scenarios where traffic lights and crosswalk signs are absent.

Research

Context

What year has our group chosen as the context for our project?

2040

Why?

By 2040, driver roles will be redefined by autonomous vehicles

Currently, drivers are vital for:

  • operating vehicles

  • managing safety and legal responsibilities

However, as autonomous technology advances, the role of the driver will diminish.

Problem

By 2040, over 4 million fully autonomous vehicles are expected to be sold.

To better understand the relationship between pedestrians and cars, as well as the specific challenges both face once the driver's role disappears, we conducted observations, interviews, and secondary research.

22

Observation Studies

11

Interviews

60

Secondary Research Articles

Interview Demographics

Insights

Design goals derived from research

Promote safe pedestrian behavior by building trust between the pedestrian and the vehicle, communicating vehicle intent through implicit interaction that supports pedestrian behavior rather than dictating it.

Aspirational Journey Map

Our aspirational journey map outlines what should be communicated between the AV and the pedestrian, as well as how that communication would ideally affect pedestrian behavior.

Concept Development

User testing

Objectives

Minimizing Surface Display Area

Hot Spot Testing

Selecting Appropriate Indicators

What we did

To determine where on the AV our interface would make the most impact, we conducted a test in which users acted as pedestrians crossing in front of and beside a moving car, and we recorded where their gazes most often rested.

Eye Tracking Glasses

Users wore these glasses throughout the test in order to track and record the direction of their gaze as they crossed in front of the car.

Simulating Driverless AV Scenarios

Tracing paper covered the front of the vehicle, obscuring the driver from users while still giving the driver visibility during the test.

Results

We found that users primarily looked at the hood, grill, and above the front tire on the side where they started crossing, regardless of whether they were crossing in front of the car or walking beside it.
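To make the analysis concrete, here is a minimal sketch of how gaze samples from the glasses could be aggregated into a hot-spot grid over the car's front plane. The CSV format, column names, and grid dimensions are assumptions for illustration, not our actual tooling.

```python
# Minimal sketch: aggregate gaze fixations into a hot-spot grid over the
# front of the vehicle. File format, column names, and grid resolution
# are hypothetical.
import csv
from collections import Counter

GRID_COLS, GRID_ROWS = 10, 4          # assumed resolution of the hot-spot grid
CAR_WIDTH_M, CAR_HEIGHT_M = 1.8, 1.5  # assumed visible front area of the car

def load_gaze_samples(path):
    """Read gaze fixations as (x, y) positions in metres on the car's front plane."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield float(row["x_m"]), float(row["y_m"])

def hotspot_grid(samples):
    """Count fixations per grid cell; high-count cells mark candidate display areas."""
    counts = Counter()
    for x, y in samples:
        col = min(int(x / CAR_WIDTH_M * GRID_COLS), GRID_COLS - 1)
        row = min(int(y / CAR_HEIGHT_M * GRID_ROWS), GRID_ROWS - 1)
        counts[(row, col)] += 1
    return counts

if __name__ == "__main__":
    grid = hotspot_grid(load_gaze_samples("gaze_samples.csv"))
    for (row, col), n in grid.most_common(5):
        print(f"cell ({row}, {col}): {n} fixations")
```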

Visual Indicator Testing

We tested animations of varying size, dynamism, and directionality with users via a paper car prototype with a projector inside. The projector mapped our animations onto the outside of the car using MadMapper, allowing us to act out crossing scenarios that included our animated signifiers and determine which were most effective.
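A staged scenario like this can also be cued programmatically. Below is a hedged sketch of triggering MadMapper scenes over OSC with the python-osc library; the port, OSC addresses, and scene names are hypothetical, since MadMapper's incoming OSC mappings are configured by the user.

```python
# Minimal sketch: a test facilitator steps through animated signifier states
# during a staged crossing by sending OSC messages to MadMapper. The port
# and addresses are placeholders that would be mapped inside MadMapper.
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 8010)  # assumed MadMapper OSC input port

CROSSING_SEQUENCE = [
    ("/scene/cruising", 2.0),   # hypothetical scene names for each indicator state
    ("/scene/yielding", 3.0),
    ("/scene/waiting",  4.0),
    ("/scene/resuming", 2.0),
]

for address, hold_seconds in CROSSING_SEQUENCE:
    client.send_message(address, 1.0)  # 1.0 = trigger the mapped scene
    time.sleep(hold_seconds)           # hold while the pedestrian reacts
```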

Methodology

62

Users Tested

30

Indicator States

40

Animation Types

Audio Indicator Testing

Audio indicators were tested similarly to the animations, using a paper car and a speaker to act out pedestrian crossing scenarios with users. We observed how our "pedestrians" responded to various sound patterns and how they interpreted the signals, to determine the most appropriate audio cue for each stage of our interface.
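As a rough illustration of the kind of stimuli involved, the sketch below generates simple pulsed-tone patterns as WAV files using only the Python standard library. The frequencies, pulse rates, and state names are illustrative assumptions, not the cues we settled on.

```python
# Minimal sketch: render pulsed-tone audio patterns for playback through the
# speaker, with pitch and pulse rate varying per indicator state.
import math
import struct
import wave

SAMPLE_RATE = 44100

def tone(freq_hz, duration_s, volume=0.5):
    """Render one sine-wave pulse as floating-point samples."""
    n = int(SAMPLE_RATE * duration_s)
    return [volume * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE) for i in range(n)]

def silence(duration_s):
    return [0.0] * int(SAMPLE_RATE * duration_s)

def write_pattern(path, pulses):
    """pulses: list of (frequency_hz, on_s, off_s) tuples defining the pattern."""
    samples = []
    for freq, on_s, off_s in pulses:
        samples += tone(freq, on_s) + silence(off_s)
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)  # 16-bit PCM
        w.setframerate(SAMPLE_RATE)
        w.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))

# Hypothetical examples: a slow, low "yielding" pattern vs. a faster,
# higher "resuming" one.
write_pattern("yielding.wav", [(440, 0.3, 0.7)] * 4)
write_pattern("resuming.wav", [(880, 0.15, 0.15)] * 8)
```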

52

Users Tested

30

Indicator States

30

Sound Patterns

Results

Final Design

E-ink Panel

States

Scenarios

Implementation