A metrics & onboarding dashboard built from the ground up

A data visualization tool that keeps admins in the loop

COMPANY

Microsoft

ROLE

Lead Designer

DURATION

4 months

YEAR

2024

Project TLDR

Designing a usage-data dashboard that gives HR and IT Admins comprehensive insight into user telemetry and usage, ensuring effective adoption and ROI within Pulse

Problem

How might we help admins track user telemetry and usage so they can confirm Pulse is being used effectively, address adoption issues, and understand the value they're receiving?

User goals

  • Quickly see the data needed without having to leave the app

  • Easily understand data with limited expertise

  • Have guidance for solving adoption & setup issues

Business goals

  • Unblock users that may be having issues with adoption

  • Be able to gather feature telemetry not available in PowerBI

  • Display core metrics

Solution

A metrics dashboard that visualizes core data points across a 6-month timeframe while also providing guidance on onboarding and deployment issues

Impact

  • 36% growth in users within 4 weeks of launch, with positive user feedback on meeting initial needs

  • Access for the internal product team to telemetry that was previously inaccessible

01. The case of the missing metrics

Users were struggling with missing metrics and guidance

We had heard from some customers that they wanted more metrics to track usage in their organizations.

Talking more in depth with customers

To drill down further, we used existing customer conversation programs, hoping to understand more about what problems customers were facing, which metrics mattered to them, and how they monitored those metrics.

“We want to know are users actually engaging with the app… are there areas that are being over or under used”

Pulse user, IT Admin

Additionally, we found that:

  • Users want guidance to address or understand adoption blockers

  • Users want to be able to understand the data they need

  • Users may not be tech savvy or have access to admin controls

  • Users prefer not to leave the app to monitor usage data

  • The MAC (Microsoft Admin Center) can be lacking for granular usage data

02. It's all about the journey

Journey mapping illustrated how users struggled without usage data

Getting feedback from users and understanding their experiences helped us plot their journey and identify where we could best support them.

Summarized journey map findings

Low points 🤨

  • Challenges with complicated dashboards

  • Available data is limited and lacks granularity

  • Unable to fully diagnose why users in their org aren't using Pulse

High points 😄

  • Available data helps to see basic app usage

  • HR/IT Admins can understand Pulse's high-level ROI


03. Feasibilities & Feedback

Collaboration with PM and devs helped to define realistic metrics and put the focus on product onboarding

The journey map helped to pinpoint the biggest friction areas and further understand where to focus to help users best track data and interpret the value from Pulse.

After aligning with the product team to narrow down key requirements, I was ready to begin early designs that could help get cross-functional buy-in.

Understanding current approaches

Since users of our dashboard were also users of other internal product dashboards, I looked to existing designs in other internal products to gather inspiration for what the dashboard could look like.

Initial design (low fidelity)

The data visualizations used were chosen to help users track trends and identify problem areas. A metric summary was included to give an instant overview, along with setup help.

Feedback and data constraints

Getting an early design in front of engineering and PM was crucial to help them understand feasibility, see how data would be displayed, and align with business goals. After review, I received the following feedback:

The metric summary was a nice-to-have; based on customer feedback, the product team now wanted a larger focus on adoption and deployment guidance


Engineering had been investigating sourcing the data for "Assigned Licenses" and discovered that it couldn't be carried over from its original source in the MAC (Microsoft Admin Center)

Design updates to adapt feedback and constraints

Updates done

  • Removed the metric summary and the "Assigned Licenses" metric, replacing them with additional onboarding and help material

  • Clarified with PM and Engineering which metrics we could display

  • Added color-coded % badges to each metric tile as a compromise for removing the metric summary (a rough sketch of the badge logic follows this list)
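To make the badge behavior concrete, here is a minimal sketch of how a color-coded % badge could be derived from two reporting periods. It is purely illustrative: the names, thresholds, and data shapes are assumptions for this case study, not the actual Pulse implementation.

```typescript
// Illustrative only: hypothetical badge logic, not the shipped Pulse code.
type BadgeTone = "positive" | "negative" | "neutral";

interface MetricBadge {
  label: string;   // e.g. "+12%"
  tone: BadgeTone; // drives the badge color on the metric tile
}

// Compare the current period to the previous one and express the change
// as a signed percentage with a tone for color coding.
function buildBadge(current: number, previous: number): MetricBadge {
  if (previous === 0) {
    // No baseline to compare against, so show a neutral placeholder.
    return { label: "–", tone: "neutral" };
  }
  const change = Math.round(((current - previous) / previous) * 100);
  return {
    label: `${change >= 0 ? "+" : ""}${change}%`,
    tone: change > 0 ? "positive" : change < 0 ? "negative" : "neutral",
  };
}

// Example: 460 Pulses sent this period vs. 410 last period -> "+12%", positive tone.
const pulsesSentBadge = buildBadge(460, 410);
console.log(pulsesSentBadge.label, pulsesSentBadge.tone);
```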

04. When in doubt, ask a user

Showing the designs to customers clarified how to improve readability and support for setup and adoption

Leveraging the same customer program we used previously, along with 1:1 meetings with a few key customers, I previewed the updated, high-fidelity design to check alignment with customer needs.

This allowed us to move quickly while still getting actionable feedback.

User/Customer feedback

Users appreciated the simple layout of the dashboard, clear data visualizations, and setup guidance


However, they did point out the following:

Metric uncertainty

Users asked for help understanding how each metric was calculated and what each one indicates

Expanded setup checklist

Multiple users wanted to have a more detailed checklist rather than just key points

Unclear percentage indicator

Users found the percentage indicator next to each metric title confusing and unclear

No intro video

Feedback around the integrated tutorial video was mostly negative, as multiple users said they wouldn't stop to watch a video while in the dashboard

05. Just a speed bump

More data discoveries led to rethinking design elements

A meeting with engineering brought new constraints to light around data sourcing and data collection frequency with APIs (application programming interfaces).

This meant we couldn't include an "Active Users" metric in the dashboard and also had to update the date filtering available to users.

Below were my proposed revisions to adapt:

Rethinking the dashboard without the "Active Users" metric

While the product team and I worked closely with engineering to see if there was a way to still include the "Active Users" metric, there wasn't an available workaround due to the newly understood API limitations.

Although this was a setback, I discussed options with the team to see if there was a middle ground.

An in-line video used the original space, but was viewed by users as not adding much value

Moving the video into the "Learn" documentation freed up the space it had occupied, making room for a direct in-app link to the MAC where users can view both removed metrics

Rethinking date range filtering

An engineering investigation into how often we could collect data revealed that, to avoid overwhelming the data retrieval system, metric data could only be collected once a week rather than continuously in real time, as originally proposed.

The original date filter allowed for custom date selection with only a few guardrails

A simplified set of fixed ranges that fits within engineering's data constraints, plus a "last updated" indicator so users know how fresh their data is
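As a rough sketch of how the simplified filter can sit on top of weekly snapshots, the example below offers a few fixed ranges measured in whole weeks and derives a "last updated" label from the most recent snapshot. All names and values are assumptions for illustration, not the shipped implementation.

```typescript
// Illustrative only: fixed ranges sized in whole weeks so they align with the
// weekly data-collection cadence, plus a "last updated" freshness label.
const RANGE_OPTIONS = [
  { id: "4w", label: "Last 4 weeks", weeks: 4 },
  { id: "12w", label: "Last 12 weeks", weeks: 12 },
  { id: "6m", label: "Last 6 months", weeks: 26 },
] as const;

// Start of a range, counted back from the latest weekly snapshot (not "now"),
// so the chart never implies data that hasn't been collected yet.
function rangeStart(latestSnapshot: Date, weeks: number): Date {
  const start = new Date(latestSnapshot);
  start.setDate(start.getDate() - weeks * 7);
  return start;
}

// Human-readable freshness indicator shown next to the filter.
function lastUpdatedLabel(latestSnapshot: Date, now: Date = new Date()): string {
  const days = Math.floor((now.getTime() - latestSnapshot.getTime()) / 86_400_000);
  if (days <= 0) return "Updated today";
  return days === 1 ? "Updated 1 day ago" : `Updated ${days} days ago`;
}
```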

06. Coming in for a landing

Design handoff

Having adapted the design to the new constraints and secured cross-functional buy-in, I completed the high-fidelity screens and handed off the designs

Below is the final design that shipped, with a breakdown of each section:

Revised to adapt to data-based constraints

Additional space was added to guide users to where "Active Users" and "Assigned Licenses" are stored.

Date filtering was updated to use fixed timeframes, and text was added to keep users in the loop about the freshness of their data

Repositioned UI

The second most requested metric, "Pulses sent", was given greater visual hierarchy in place of the removed "Active Users"

Percent indicator badges were moved to improve readability

Informative tooltips & hover states

To improve user comprehension of badges and metrics, explainer text and tooltips were added. These were written in conjunction with the content design team to ensure guidance that is to the point but still clear.

07. Looking back

Project reflections

Thank you for reading through this case study!

Below I've shared a couple of moments that felt worth reflecting on from this project:

Managing unexpected constraints

Updating the design late in the project was a great lesson in how unexpected speed bumps like data constraints can happen.

While earlier collaboration with engineering for greater clarity would have been useful, I ultimately learned how to be more adaptable and work cross-functionally to find a solution and push forward.

Balancing levels of research with product timelines

Conducting an in-depth, structured usability test is always ideal; however, doing so can sometimes delay the product timeline.

This project taught me how to judge when the information gathered is strong enough to move forward, striking the best balance between user needs and project timelines and business goals.
