Company
Hewlett-Packard
Role
Interaction Designer and Researcher
Deliverables
Research reports, Logic flows, Prototypes, Interaction design documentation
Tools
Adobe XD, UserTesting.com, Excel (data analysis)
Background
The desktop version of the HP Smart App had historically lagged behind the mobile version, a byproduct of the company's "mobile first" business strategy. Once our team shipped a printer built around mobile-first setup, leadership asked us to address the desktop app, since a large population of legacy users still relied on that platform.
As part of the Setup Design team, a lead designer and I set out to audit and upgrade the desktop setup experience, improving the app's ratings, interactions, and visual design.
Goals and Impact
Goals
Improve usability and ease of use
Improve consistency
Improve voice and tone
Impact
Changes made to the desktop experience helped increase the Microsoft Store rating from 2.8 to 4.4 stars.
Improvement across all UX metrics (confidence, ease of use, and visual appeal) and in NASA TLX workload.


Process
Path to a more scalable, compelling desktop experience
01
Expert Evaluation
Mapping emotional response ratings in a user journey
02
Prioritization workshop
Aligning with stakeholders
03
Baseline Study
Gathering baseline UX metrics
04
Design + Iteration
Competitive analysis, design, internal reviews
05
Delta Study + Impact Evaluation
Delta study and reflection
Expert Evaluation
Journey mapping to identify hero moments and pain points
The evaluation of the current experience started with creating a journey map. This allowed us to look at the steps and micro steps users needed to take along the setup journey so that we could empathize with users.

Prioritization Workshop
Aligning within the organization
Approach
Our team ran an exercise to identify pain points along the software setup journey.
A second workshop with partner teams both validated those pain points and prioritized the issues.
Goals + Rationales
Leverage our tribal knowledge of known customer problems to help inform the redesign
Create stakeholder buy-in for backlog prioritization

Baseline Study
Incorporating best practices
Objective
Establish a baseline of the current Windows 10 setup experience to inform redesign efforts and validate journey map pain points
Approach
Conducted qualitative sessions in the San Diego office
Targeted end-to-end setup flow: from initial instructions to first print
Developed and used a discussion guide and embedded micro-surveys

Methods
Paused users at key journey points to capture immediate feedback
Collected both qualitative insights and UX metric ratings (time, confidence, ease of use, number of steps, success)

Outcomes
Clearer understanding of user frustrations
Validation of journey map pain points
Actionable input for redesign prioritization
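To show how micro-survey ratings like these can be rolled up, here is a minimal sketch of per-step aggregation. All step names, metric names, and rating values below are hypothetical illustrations, not the study's actual data or scale:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical micro-survey records: (journey_step, metric, rating on a 1-7 scale).
responses = [
    ("download_app", "ease_of_use", 6), ("download_app", "confidence", 5),
    ("install_driver", "ease_of_use", 3), ("install_driver", "confidence", 2),
    ("install_driver", "ease_of_use", 2),
    ("first_print", "ease_of_use", 6), ("first_print", "confidence", 6),
]

def summarize(records):
    """Average each metric per journey step to surface the roughest moments."""
    buckets = defaultdict(list)
    for step, metric, rating in records:
        buckets[(step, metric)].append(rating)
    return {key: mean(vals) for key, vals in buckets.items()}

summary = summarize(responses)
print(summary[("install_driver", "ease_of_use")])  # 2.5 — lowest-rated step
```

A roll-up like this makes it easy to cross-check which journey steps (here, the driver install) line up with the pain points flagged in the journey map.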
Install Driver Redesign
One piece of the redesign
Background
One of the highest-priority items I redesigned was the driver install portion of the desktop setup. Data from the expert evaluation and the baseline testing informed the designs delivered for this feature.
Preceding Steps
As part of the design process, I looked at where users were coming from in the journey to see whether anything upstream could affect the experience once they reached the driver step. In this case, the copy on the loading screen preceding the driver was misleading, and we addressed it in the design updates.

Backend Work
We first worked with development to see whether the driver could be downloaded automatically in the background for the user. Negotiation led to this becoming the hero case, with the manual instruction screen accounting for only ~30% of users. That removed one step in the setup journey that required user action, reducing overall fatigue and friction.

Competitive Analysis
I also reviewed step-by-step instructions across other platforms, focusing especially on products that pair software with a hardware component, to identify best practices for the driver install screen redesign.

Iterations
I went through around seven versions of the install driver design, continuously reviewing them with our design and development teams.



Final Design
Ultimately quick hallway testing helped inform the final direction that was delivered to development.
Final deliverable can be found here: https://drive.google.com/file/d/1DVZM2Is14AsbAKEUrqDW2LyB8Wz8gLS3/view?usp=sharing

First Print Redesign
One piece of the redesign
Background
I also redesigned First Print, the last step of the printer setup process. The page's information hierarchy, visuals, and voice and tone needed updating to ensure users ended on a positive note. The overall intent was to celebrate the moment and assure users that their printer was ready to use.
Objectives
Emphasize celebration of the completed setup
Show only the printer's name, with a picture if possible
Make the text more prominent in the information architecture
Old Design

New Design

Link to design documentation:
https://drive.google.com/file/d/1cgfzejuBS17ljlhJGNNBSgloRuxZdMDx/view?usp=sharing
Delta Study
Measuring the impact of the updates
Objective
To gauge whether our improvements had a positive impact on experience metrics, we ran a second study comparing the current Win 10 experience with the updated Win 10 designs.
I wrote the test, created the stimuli (two prototypes), and analyzed the qualitative and quantitative data.
Approach
Online unmoderated UserTesting.com test
Stimuli included two click through XD prototypes representing the current and new Win 10 workflows

Methods
26 users
PC users who were the typical person in their household to set up a printer and preferred the PC setup path
A/B preference test
NASA TLX workload metrics

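For context on how the workload comparison works, here is a minimal sketch of raw (unweighted) NASA TLX scoring: the workload score is simply the mean of the six subscale ratings. The participant ratings below are hypothetical illustrations, not the study's actual data:

```python
from statistics import mean

# The six NASA TLX subscales, each rated 0-100 (raw, unweighted "RTLX" scoring).
SUBSCALES = [
    "mental_demand", "physical_demand", "temporal_demand",
    "performance", "effort", "frustration",
]

def raw_tlx(ratings: dict) -> float:
    """Raw TLX workload: the unweighted mean of the six subscale ratings."""
    return mean(ratings[s] for s in SUBSCALES)

# Hypothetical ratings for one participant on the current vs. updated flow.
current = {"mental_demand": 60, "physical_demand": 20, "temporal_demand": 55,
           "performance": 40, "effort": 65, "frustration": 70}
updated = {"mental_demand": 30, "physical_demand": 15, "temporal_demand": 25,
           "performance": 15, "effort": 30, "frustration": 20}

print(raw_tlx(updated))  # 22.5 — lower workload score is better
```

Comparing per-participant scores like these between the two prototypes is what lets a delta study claim the new design reduced workload.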
Outcomes + Impact
The overall conclusion was that the new design performed better across all key UX metrics.
The new Win 10 desktop experience performed better on almost all NASA TLX workload metrics.
Users preferred the new Win 10 experience across all UX metrics: confidence, number of steps, ease of use, use without assistance, and visual appeal.
Reflections
Project challenges and mitigations
Overall, our team deemed the project a success. That didn't mean it was without challenges, however.
Challenge #1:
Developer resistance to delivery of design changes
Mitigation: Developer education on our team's interaction design documentation and the rationale behind design changes. The same year this project ran, our team had established a new delivery structure, and this project was the first time we used that format to deliver to the Win 10 and Mac development teams. The more detailed, unfamiliar documentation initially caused friction, so it took extra time to walk developers through the delivery format and where to find things. The relationships built with the two development teams set the stage for future work.
Challenge #2:
Deliberation between 2 possible design solutions
Mitigation: Hallway testing. In the end, my preferred design and my manager's preferred design roughly tied in user preference, so one was chosen over the other based on expert review and standard button placement.