Keeping the momentum.

For the last project of my Information Architecture class, we were to take the previous project (iOS Notification Redesign) and evolve it into an advanced adaptive interface. I initially thought about further developing the Siri Wrap Up feature, but after discussing the project with my professor, I decided to explore what the notification system could look like in the context of autonomous cars.

Under the hood.

As part of the project brief, we needed to pick from a list of emerging technologies. For my project I selected sensors and actuators for location data, machine learning to surface notifications in contextual situations, and natural language processing to proactively provide users with information.

Getting back in the flow of things.

To start off, I thought of a few situations that could improve the user’s journey when they aren’t in control of the car. I thought it important to keep the smartphone out of the user’s hands in case of an emergency where the car needed the driver to take over. After making a list of different situations and features, I mapped them out in a user flow diagram. I made one diagram without the implementation of AI and one with it, to show how much AI can change the system as a whole.

Customized Constraints.

From the user flow diagrams, I then went into screen ideation. For this I created my own sketch paper so I could ideate directly on the form factors I’d be designing for. This sketch paper includes three parts: the windshield-projected HUD, the steering-wheel-embedded display, and the car’s instrument panel screen.


HUD Sketches

Instrument Panel Sketches

Steering Wheel Display Sketches

Heads up.

In an effort to keep a driver’s eyes on the road, this new system would utilize heads-up display technology. The interaction for each of the actionable elements would be similar to that of the Apple TV and its remote. In the case of the car, the “remote” would be located on the left and right sides of the steering wheel for easy thumb access. When the car is in manual driving mode, this area would display things such as incoming calls, media controls, and essential driving information. Swiping all the way to the right of the first menu would cause the HUD to show the actions that would be displayed on the steering wheel (see the steering wheel explanation below).

Re-route
Call
Make a reservation
Invite a contact

Hold me, squeeze me.

For the sake of effortless interaction, I decided to incorporate a Touch Bar-like display in the steering wheel so users could interact with their notifications quickly and easily. The commands would mimic those on the HUD, so users don’t need to take their eyes off the road to know what they’re selecting. Additionally, the areas to the left and right of the center of the steering wheel would act as a touchpad, similar to the Apple TV remote, for interacting with the HUD. To combat accidental taps while driving, the user would need to grab a command on the steering wheel and squeeze to confirm it.
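The tap-to-highlight, squeeze-to-confirm pattern can be modeled as a tiny bit of logic. This is only an illustrative sketch of the idea, not a real CarPlay API; all of the class and method names here are hypothetical.

```python
# Illustrative sketch (hypothetical names): a steering-wheel command strip
# where a tap only highlights a command and a squeeze confirms it,
# filtering out accidental touches while driving.

class WheelCommandStrip:
    def __init__(self, commands):
        self.commands = commands      # e.g. ["Re-route", "Call", ...]
        self.highlighted = None       # command currently under the thumb

    def touch(self, index):
        """A tap only highlights a command; it never triggers it."""
        if 0 <= index < len(self.commands):
            self.highlighted = self.commands[index]
        return self.highlighted

    def squeeze(self):
        """The squeeze gesture confirms the highlighted command."""
        command, self.highlighted = self.highlighted, None
        return command                # None if nothing was highlighted

strip = WheelCommandStrip(["Re-route", "Call", "Make a reservation"])
strip.touch(1)                        # brushing the strip highlights "Call"...
assert strip.squeeze() == "Call"      # ...and only a squeeze fires it
assert strip.squeeze() is None        # a stray squeeze alone does nothing
```

The point of the two-step gesture is that a brush of the thumb is never destructive: the only way to commit an action is the deliberate squeeze.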

Look Ma, no hands.

This new version of CarPlay would allow users to pin CarPlay-enabled apps to the instrument panel so they can interact with their content easily. To change an app, a user can force touch the side trackpads to bring up a menu and pick a different one. When a destination is entered in the car’s navigation system and the user removes their hands from the wheel for 3 seconds, the car will transition into its autonomous mode and change the interface. This interface would contextually display a cluster of notifications and information based on factors like the time of day, the destination, or the car’s current location.
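The 3-second hands-off rule above is essentially a small state machine. Here is a minimal sketch of that behavior, assuming a hypothetical hands-on-wheel sensor; the names, threshold constant, and mode strings are my own illustration, not part of any real system.

```python
# Illustrative sketch: switch to autonomous mode after the driver's hands
# have been off the wheel for 3 seconds with a destination set.
# Sensor inputs and names are assumptions for the example.

HANDS_OFF_THRESHOLD = 3.0  # seconds before handing control to the car

class DriveModeController:
    def __init__(self):
        self.mode = "manual"
        self.hands_off_since = None   # timestamp when hands left the wheel

    def update(self, hands_on_wheel, destination_set, now):
        if hands_on_wheel or not destination_set:
            # Hands back on the wheel (or no route): driver is in control.
            self.hands_off_since = None
            self.mode = "manual"
        else:
            if self.hands_off_since is None:
                self.hands_off_since = now
            if now - self.hands_off_since >= HANDS_OFF_THRESHOLD:
                self.mode = "autonomous"  # swap the interface here
        return self.mode

car = DriveModeController()
assert car.update(hands_on_wheel=False, destination_set=True, now=0.0) == "manual"
assert car.update(hands_on_wheel=False, destination_set=True, now=3.0) == "autonomous"
assert car.update(hands_on_wheel=True, destination_set=True, now=4.0) == "manual"
```

Note that grabbing the wheel at any time drops the car straight back to manual mode, which matches the emergency-takeover scenario described earlier.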

The theory of relativity.

In addition to serving the user with relevant notifications based on the context of their trip, the system would actively listen for a prompt from the user. Say “Hey Siri, where’s the nearest ice cream place?” and the car will surface ice cream places nearby. Additionally, the user can use the steering wheel, which switches from squeeze gestures to touch gestures in autonomous mode, to re-route the car, call ahead, or schedule a reservation with an app like OpenTable.