A rework of Pulse's existing usage dashboard
Rethinking the data visualization experience in Pulse
COMPANY
Microsoft
ROLE
Lead Designer
DURATION
3 months
YEAR
2024
A comprehensive usage data dashboard that transformed scattered telemetry into a single, intuitive view
Problem
HR and IT admins could only guess how employees were engaging with the platform. Adoption felt like a black box with no clear signals or way to measure ROI.
User goals
Have guidance to make informed decisions
Freedom to choose granularity of data views
Track app setup, user engagement, and survey topics
Business goals
Drive user insight to user action, targeting a 30% click-through rate
Reduce external actions by 50%
Generate Copilot interest and adoption
Solution
Put the most relevant insights directly in front of users, right when they need them. Now, users can see the data they care about AND act on it instantly, without leaving their flow of work.
Impact
40% repeat use of the insight-to-action feature
5% drop in data exports
01. Users want more
This MVP version was falling short of what users needed due to its rigid data displays and lack of actionable guidance
Auditing customer feedback
I started by auditing feedback submissions to understand users' frustrations and needs. This quote captured how most users were feeling:
"The metrics help on a surface level, I can't do much with it though…"
02. Teamwork makes the product work
User interviews + workshopping created strong team alignment
"Creating alignment? How did he do it?!" you might ask…
To start, I conducted user interviews to get a deeper sense of users' current pain points with the dashboard
Key user frustrations learned from interviews
Through interviews with six users, I identified the core frustrations they were experiencing with the current product.
Data dead-ends
Users can't close the loop on the insights they see, or don't have the metrics they want
External action agitation
The added work from having to go outside the dashboard to view or parse metrics
Overly broad data
Data provided is too broad and can’t be applied effectively to smaller groups
Establishing goals and generating early ideas through workshopping
So I finished up user interviews and now I've got all of this qualitative data… But what am I supposed to do with it?
Well, first I set up a cross-functional workshop that progressed through generating HMWs to frame the problem for users, then helped us establish goals to focus on and generate ideas through a Crazy 8s exercise.
An alignment on our goals and design direction
While that post-it note was a joke, in all seriousness this workshop was impactful: it helped us define and refine user and business goals and align on an early design direction
03. First pass on high-fidelity designs
With alignment on design direction and the goals the team wanted to focus on, I moved into wireframes and then onto a high-fidelity design.
First high-fidelity iteration breakdown
04. Pivot!
Users felt the in-app email wouldn't help them
While it's every designer's dream that the first iteration will be perfect, putting it in front of users usually reveals a different reality. In this case, testing revealed that users felt the in-app emailing feature wouldn't make their tasks any faster or easier.
So how did I handle this dose of user-fed reality? Re-review findings, re-iterate, and re-test!
Back to the drawing board
Through low-fidelity designs, I explored a few different ideas that would give users more targeted actions while keeping things simple, confirming with engineering that each idea was technically feasible within our timeline
Lightweight testing the new designs
I ran lightweight tests using clickable low-fidelity versions of three different ideas in front of users.
The strongest "wow!" reactions came for a design that let users pick an audience grouped by response rate level and write a message to them.
Shift in design visualized
This update let users target only low-response-rate groups, and its design kept them within the experience
05. A dashboard that delights
The shipped design that helped boost user engagement
After implementing updates from usability testing, I put together a final prototype and got final buy-in from all stakeholders… and soon after, a new dashboard rolled out to users. Exciting!
Empower users with data and ability to take action
Put the most relevant insights directly in front of users, right when they need them. Now, users can see the data they care about AND act on it instantly, without ever leaving their flow of work.
06. So… did it work?
Impact metrics post-launch
Using success metrics that product, engineering, and I had defined together before launch, along with user satisfaction surveys and feedback submissions, we felt the launch was (mostly) a success!
40%
Click-through on metric health
40% of users had repeat, sustained use of the new metric health buttons
20%
Response rate increase
Of 40 randomly surveyed users, respondents reported an average 20% increase in user engagement at their orgs
-5%
Drop in external actions
Fewer users were relying on external actions such as data export, though we had hoped for a larger drop
07. Learning is power
Thank you for reading through this case study!
Below I've shared a couple of moments that felt worth reflecting on from this project:
Missing the mark the first go around can happen
Despite having plenty of data going into the design, the first iteration may not give users what they were hoping for to solve the problem.
I learned to validate designs with users at multiple stages if possible and how to best include cross-functional partners if there is a need to pivot the design.
Running workshops
Workshops are excellent tools for gaining alignment. I learned a lot about how to smoothly run a multi-stage workshop and how to make the most of its outcomes.
Next step - investigating additional external actions users take
Given the smaller-than-expected drop in external actions, the next step for this project would be talking to users to drill down on which specific actions they take outside the dashboard, then determining how those actions can be better incorporated.