Sector
Technology
Client
Autodesk
Timeline
12 weeks
Tools
Mural, Zoom, Figma, Dovetail
Problem
Goal
Redesign the dashboard interface to reduce cognitive load and enable confident data analysis.
PROCESS
DISCOVER
Led requirements gathering workshops
Along with the Senior Experience Designer on the team, I co-designed 4 workshops built around prioritization activities intended to reveal the specific areas of the dashboard that required a design adjustment. To work efficiently, we adapted existing Mural templates to fit our discovery needs. 10 internal employees (sales reps and dashboard subject matter experts, or SMEs) participated in the 2 workshops I facilitated.
Rose, thorn, bud
We asked sales reps to brainstorm and discuss aspects of using the dashboard they appreciated, struggled with, and wished existed. Some feedback concerned technical performance, but we focused on comments related to data presentation and obfuscation.
Bull's-eye diagramming
We encouraged sales reps to work together to organize any themes identified in the previous exercise into three circles indicating level of priority. Issues that impacted how long it would take to find specific data points were often prioritized as most critical.
Frequency vs. difficulty matrix
We asked dashboard SMEs to describe the types of support tickets they received from sales reps, then map each issue on a matrix measuring how often related tickets came in and how difficult they were to resolve.
DEFINE
Translated findings into design guidelines
From the workshop findings, I created a set of guidelines that any new design decisions should follow so that key pain points would not re-emerge. I later used these guidelines to identify existing Power BI functionality that could be leveraged for new designs. To support further technical feasibility analysis, I shared a features list with our developer that distinguished between must-have and nice-to-have features, along with the Power BI functions I believed they could use.
DEVELOP
Rapidly created and iterated on designs
Once we better understood Power BI and could reference its capabilities, we moved quickly into exploring potential designs. My design process involved:
10+ low-fidelity wireframes
Explored different layout options for main screens in Mural, each designed to optimize a different key task
Power BI UI kit on Figma
Leveraged existing component set to ensure designs were within Power BI constraints
Micro-interactions
Prototyped hover states, modals, and overlays to more accurately replicate the experience
Conducted 4 usability tests
To test the efficacy of our designs, I drafted a usability testing plan and moderated all 4 tests. Each test asked users to complete three tasks they performed daily on the existing dashboard and knew well; their performance on the current dashboard served as our baseline for success. For example, could users reach the intended screen in fewer clicks? Did users recognize data points that were visualized differently?
Details
Tested with 2 power users and 2 novice users
Asked post-task questions, such as rating level of difficulty
Required participants to complete a post-test SUS questionnaire, which yielded an average score above 68, the commonly cited benchmark for above-average usability
Overcoming challenges
To recruit sales reps who fit our target user group, we initially asked team leads to share our testing calendar with their direct reports so they could book an interview time. After a week without any bookings, I volunteered to identify 16 sales reps we would contact directly and invite to participate. I selected them based on how recently they had joined Autodesk and how often they opened the dashboard, ensuring diverse representation across geographical regions.
Our first few outreach emails still drew little interest. Our product manager learned, and shared, that our testing timeline overlapped with the end of the fiscal quarter, a busy period when sales reps had little time for optional requests. With this information, we shifted our approach to make participation as easy as possible: I used Outlook's Scheduling Assistant to see their availability and sent optional calendar invites accordingly. This resulted in a higher response rate, especially from sales reps who were enthusiastic about sharing feedback.
Implemented design changes
Before wrapping up my internship, I used Dovetail to consolidate participant quotes and feedback. The Senior Experience Designer and I collaborated to address patterns (such as areas of low exploration or misguided navigation) in our next iteration. All data portrayed in the images below, including customer identifying information, has been replaced with placeholder data to protect the privacy and confidentiality of Autodesk users.

Version used during usability testing

Version created after usability testing
LESSONS LEARNED
Fresh eyes offer unique perspectives
What made this project especially interesting was that every team member was new to it. Our Senior Experience Designer, who had the most familiarity among us, had only been at Autodesk for two years. This initially posed a challenge: the knowledge gap meant it took longer than usual for me to grasp our goals, since I first had to understand how Power BI dashboards worked, as well as the compliance concepts relevant to this use case. I relied heavily on existing documentation to familiarize myself, and the dashboard SMEs became valuable partners in knowledge-sharing. However, it was that clean slate that allowed me to ask questions that might otherwise have been overlooked once complex processes became automatic. I was able to step back and ask: is a simpler alternative possible?
This is an industry project. Get in touch for more details!