


In-Vehicle and Accessory Feature Management





Oct - Jun 2023 (8 months)

UX/UI Design


2 Designers, 2 PMs, 6 Engineers

DISCLAIMER: In order to ensure the confidentiality, integrity, and availability of Ford’s internal processes, some information and visuals throughout the study have been altered and/or cannot be shown. All data on the prototype screens is randomly generated.


The Problem

We identified early on that users often had trouble onboarding and managing their features due to poor error recovery and difficulty understanding the terminology used on the existing site. This drained our team's resources, as users regularly reached out to have us manually onboard features and update feature metadata on their behalf.

Throughout this case study, we will explore each phase of our design process, including research, ideation, prototyping, and user testing. We will shed light on the challenges we encountered and the iterative approach we adopted to continuously refine our solutions.

How might we give users an efficient and engaging process to manage their features?


Listening to our Users

We dug into the problem space by conducting several user interviews to understand our users, our product, and why the relationship between the two was crucial for business success. Because adoption was so low, site admins were frequently forced to modify feature information on users' behalf upon request.

By applying Jakob Nielsen's 10 Usability Heuristics and a SWOT analysis, we identified the tool's most inconsistent and fragile areas: match between the system and the real world, consistency and standards, and error prevention.


Most users were unfamiliar with the terminology used, causing them to abandon tasks almost immediately or hesitate out of fear of making a mistake.


Our product lacked error recovery states when users input information, often leaving them at risk of losing existing data.

When users are able to find, access, and utilize the tool to complete tasks without reaching out to our team directly for help, we will know that our redesign was successful.


The Best Solution

Our users made it very apparent that the tool was difficult to use and understand, and that it offered no protection against errors. These were the major problems we set out to solve.

As mentioned, the terminology used on the site was not always immediately understood, leaving users to guess at meanings. By incorporating tooltips throughout, we enabled users to engage with the product and learn along the way.

We facilitated several usability feedback sessions before finalizing designs to ensure our patterns were intuitive and efficient for our users. Feedback is a gift, so I will never pass up an opportunity to hear from a user of my product!

  • Site terminology was not easy to understand, often resulting in the abandonment of tasks.

  • Error handling was minimal to nonexistent, putting our users' data at risk.

  • Users had difficulty navigating the site, especially when errors surfaced.

  • Introduced tooltips to offer insight and definitions of words.

  • Implemented recovery mechanisms to ensure safe data practices.

  • By using patterns and a design system, users were able to get a consistent, usable experience.

Errors happen, and allowing users to recover from them, whether caused by the system or the user, is crucial. By improving the error protocol and standards, we increased our users' task success rate by 30%.

With consistency and standards being another area of the tool that needed attention, we made maintaining a design system in Figma a top priority. Using shared patterns and libraries, we were able to design and gather user feedback efficiently, and our users got a recognizable, intuitive experience in return.


Improving Users' Lives

We measured a 40% increase in user engagement and adoption across attitudinal and behavioral metrics, supported by a Qualtrics survey sent to all users.

Below are the findings we delivered, quantified with the System Usability Scale (SUS): a 10-item questionnaire in which each item is rated from 1 to 5, with the responses then converted into a single usability score from 0 to 100.
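For context, the standard SUS calculation works like this: odd-numbered items contribute their rating minus 1, even-numbered items contribute 5 minus their rating, and the sum is multiplied by 2.5 to reach the 0-100 range. A minimal sketch (the example ratings are illustrative, not our actual survey data):

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100).

    `responses` is a list of 10 ratings, each 1-5, in questionnaire order.
    Odd-numbered items (1st, 3rd, ...) contribute (rating - 1);
    even-numbered items contribute (5 - rating). The sum is scaled by 2.5.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected 10 ratings between 1 and 5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd-numbered item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Example: a fairly positive (hypothetical) respondent
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))  # 80.0
```

A team-level SUS result is typically the mean of these per-respondent scores; 68 is the commonly cited average benchmark.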

  • Defining clear outcomes and scope early on helped us reduce churn.

  • Recognize and confront the challenge: there is no right or wrong answer, just better or worse solutions.

  • Conducting small research activities throughout the process allowed us to challenge our assumptions.

  • Design a flow for users to download feature information to work offline with it.

  • Incorporate an activity log to foster shared ownership.

  • Implement a social aspect that would allow users to efficiently connect with one another in case of a service failure.

Hannah Shink

Co-Designer on the project

 "Jacob is one of the hardest working, positive, kindest humans I've ever met. I always felt comfortable delegating to him as he continuously showed his critical thinking skills, made his work visible, and thoroughly communicated throughout the design process. "

Want to see more of my work?

Check out another case study regarding the remediation of another internal enterprise-wide tool used for governing vehicle cloud data.

Contact Me

Let's Get in Touch!

