Improving Admin Center setup and deployment times

Revising the Viva product suite's setup and management dashboard to improve setup times and streamline the process

Overview

The Viva app management dashboard in the Microsoft admin center allows admin users to set up, deploy, and manage the Viva apps purchased by their company.

The driving principle of the dashboard is to help customers ensure all app features are fully set up and running. Tasks for users include granting proper admin permissions, assigning audiences and roles, configuring content sources, and completing other app-specific steps.

Solution

An updated setup dashboard that reports status, paired with a consolidated, more standardized settings configuration process.

Results

60% (20-minute) reduction in setup and deployment times

Constraints

Working within a rigid design system
Short design, build, and ship timeline

Timeline

6 weeks

The problem

Based on submitted feedback and interview data, users were having difficulty with the app setup process. Without a single place to check app status and complete the setup process, users were missing key steps and constantly switching between tasks.


This risked low usage rates and hurt customer retention by creating a perception of lengthy time-to-value.

By surfacing relevant info upfront, as well as providing a consistent experience across apps, users will complete setup faster and be confident that they completed their task.


This will improve users' setup times and app usage rates, and boost retention across the Viva product suite.

01. What's your (user) problem?

Interviews showed that users were struggling in key areas

Auditing the design

Based on what we learned in user interviews, the key areas where users struggled were:

1

A lack of information left users in the dark about what was complete, in progress, or not yet started

2

The look and feel of setup pages varied wildly, and some setup happened almost entirely outside of the Admin Center

3

Users were generally uncertain about which steps needed to be done first, or whether certain steps needed to be done at all

02. Collaborative Creation

Using a collaborative approach to design

Design approach

I used the audit results to help support a collaborative session with my PM and engineering partners. We discussed what our key considerations should be and then dot-voted to narrow down our top three. From there, we generated some early ideas via a crazy 8s session.

Sorting ideas

We then sorted the crazy 8s ideas into three categories of potential approaches:

1

Notify

Ideas that focused on improving how users were informed of any action needed

2

Organize

Approaches that could present info more consistently to streamline the process

3

Consolidate

Ways to bring all setup steps into the same admin center dashboard

After discussing, we decided our focus should be on the areas where we could most easily effect change (+ minimize engineering effort) and generate noticeable impact.

These were identified as the notify and organize levels. Putting our effort into these areas first would let us quickly improve the experience and generate insights to guide future iterations of the dashboard.


The hypothesis

By surfacing relevant info upfront and providing a consistent experience across apps, users will finish setup faster and be confident that they completed their task.


This will improve setup times and setup success rates. As a result, app usage will increase and customer churn will decrease across the Viva product suite.

03. Design tryouts

Exploring different design options for A/B testing

Design focus

With our main focus established, I kept the following in mind as I explored designs:

  • Minimal disruption to users' typical workflows

  • Staying within Microsoft Admin Center's rigid design system guidelines

  • Creating a scalable, flexible design for different app setup needs and future updates

Design option A

  • Integrate a status column into the landing page

  • Move the description to the app-specific page

  • Separate settings out in a table similar to the landing page

  • Surface the priority of each setting on the app setup page

  • Place each setting's individual steps in a scalable fly-out component (modeled in the data sketch below)
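To make option A's structure concrete, here is a minimal TypeScript sketch of the data model it implies. Every name here is a hypothetical illustration on my part, not the Admin Center's actual code:

```typescript
// Hypothetical model of option A's dashboard data (illustrative names only).
// Each app reports a status for the landing-page column; each setting carries
// a priority and the individual steps rendered inside the fly-out.

type SetupStatus = "not-started" | "in-progress" | "complete";
type Priority = "required" | "recommended" | "optional";

interface SetupStep {
  title: string;
  done: boolean;
}

interface AppSetting {
  name: string;
  priority: Priority; // surfaced on the app setup page
  steps: SetupStep[]; // listed in the scalable fly-out component
}

interface VivaAppSetup {
  appName: string; // e.g. "Pulse"
  description: string; // moved from the landing page to the app-specific page
  settings: AppSetting[];
}

// Roll a setting's steps up into the status shown in the settings table.
function settingStatus(setting: AppSetting): SetupStatus {
  const done = setting.steps.filter((step) => step.done).length;
  if (done === 0) return "not-started";
  return done === setting.steps.length ? "complete" : "in-progress";
}

// Roll all settings up into the app-level status for the landing-page column.
function appStatus(app: VivaAppSetup): SetupStatus {
  const statuses = app.settings.map(settingStatus);
  if (statuses.every((s) => s === "complete")) return "complete";
  return statuses.every((s) => s === "not-started") ? "not-started" : "in-progress";
}
```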

Design option B

  • Leave the landing page unchanged

  • Sort each setting into a guided wizard

  • Mark "required" steps with a relevant icon

04. Testing 1…2…3

A/B testing helped to provide clarity on user preference

Measuring success

To ensure that we gained value from the test, my project PM and I met to establish what would define a “successful” test:

1

Setup time improvement

The metric should show a noticeable improvement of at least 5 minutes in per-app setup time over the control version

2

Success rate

Users in the non-control groups should be able to set up an app successfully more often than the control group

3

Negative qualitative feedback

Whether we received any notably negative qualitative feedback from users during the test

Testing

Once engineering built the new designs, they were pushed to production, where user traffic was split across the three options (control vs. the A/B variations). The experiment ran for close to 2 weeks.
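As a side note on mechanics: deterministic splits like this are commonly done by hashing a stable user ID into buckets, so the same admin always lands in the same variant. The sketch below is my illustration of that general pattern in TypeScript, not the actual experimentation service we used:

```typescript
// Illustrative three-way traffic split (control vs. the A/B variations).
// Assumes a stable admin user ID; this is a generic pattern, not the
// production experimentation infrastructure.

type Variant = "control" | "A" | "B";

// FNV-1a: a simple, fast 32-bit string hash.
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193);
  }
  return hash >>> 0; // force unsigned 32-bit
}

// A given admin always hashes to the same bucket, keeping their experience stable.
function assignVariant(adminUserId: string): Variant {
  const bucket = fnv1a(adminUserId) % 3;
  return bucket === 0 ? "control" : bucket === 1 ? "A" : "B";
}
```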

I also worked with engineering to surface our feedback widget more often and more prominently, gathering qualitative info alongside the quantitative data.

The test stats

Over the 2-week testing period, our experiment covered approximately 1,000 customers and 3,000 admin users.

Results

10

Minute decrease (Version A)

Version A decreased users' setup time by over 10 minutes: a 60% reduction in setup time per app!

2.5

Minute decrease (Version B)

Version B decreased users' setup time by 2.5 minutes. While positive, users did better with version A.

90%

Success rate

Users had a success rate of over 90% when setting up any app!

Based on the results from the test, we determined that the version that focused on showing the status of all apps upfront provided the best time improvement for users and also returned more positive qualitative feedback.
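As a quick sanity check on the headline numbers (assuming, on my part, that the 60% figure applies to per-app setup time), a 10-minute drop at 60% implies a per-app baseline of roughly 17 minutes:

$$
t_{\text{baseline}} \approx \frac{10\ \text{min}}{0.60} \approx 16.7\ \text{min},
\qquad
t_{\text{new}} \approx 16.7 - 10 \approx 6.7\ \text{min}
$$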

05. A tweak

Qualitative feedback from A/B testing showed users likely needed some guidance

Following user feedback

While the results from the status and table version already showed our designs were successful, some users mentioned that the sudden shift from the old design confused them.

Finding a solution

We hypothesized that the new design may have strayed too far from users' existing mental model for setup in the admin center. To counter this, we added an onboarding process to guide users through the new system.

This shift in product strategy helped to anticipate adoption risks and mitigate them through design.

Key Impacts

20 min

Setup & deployment time reduction

Users were able to set up and deploy Viva apps 60% faster than with the old design, leading to fewer setup-related help ticket submissions.

Improved TTV

Time-to-value improvements

Post-launch, users mentioned they felt increased value from Pulse. This resulted in increased adoption rates.

Project takeaways

Areas to improve on

While we had strong metrics to understand the strength of the designs presented in A/B testing, it would have been worth setting up methods to gain additional qualitative feedback. This would have allowed us to better measure usability.

Next steps - Implementing opportunities to promote premium plans

While not within the scope of the project, I made a suggestion to leverage unused white space to promote premium features (known as "add-ons"). This would not only make users aware of available features that could improve their experience, but could also notify them when an "add-on" was purchased and needed setup.

This idea could boost sales and adoption rates, while also further increasing value customers received from their Viva apps!
