Data as a Product
Democratizing Analytics for API Users
Strategy & Planning | User Research | Wireframing | User Flows | UI Design | Interaction Design | User Testing
Project goal: Empower developers with analytics and insights on what is happening in their live video sessions and how they can optimize the quality of experience
Process: IDENTIFY the problem -> HIGHLIGHT key opportunities -> MAP the user's journey -> LAY OUT components and VISUALIZE DATA for balance & focus -> DEFINE a flexible filtering experience -> TEST with users -> ITERATE & LAUNCH (and iterate again)
IDENTIFYING the Problem
After the launch of the API dashboard, we targeted a cohort of over 200 users (about 10% of our monthly active user base) to measure their experience. After an initial spike in usage at launch, daily log-ins dropped after two months and monthly engagement plateaued after three. We spoke with over 150 customers, who noted that the launch helped them get started easily and find their way to useful tools and documentation, but once they had started building their application, the dashboard didn't provide enough value. It was only once they were testing, and after launch, that they checked back in more regularly, usually to debug issues or to review their usage trends.
We surveyed 140+ TokBox users (active = credit card on file and > 500 video minutes streamed in the last billing cycle) and conducted interviews with a cohort of 15 active developers. The data captured covered dashboard visit frequency, ease of use, value, performance, and areas for improvement.
HIGHLIGHTING Key Opportunities & MAPPING the User's Journey
Our question became: "How might we build a dashboard that serves as a central hub for the TokBox experience throughout the journey?" What users consistently highlighted in interviews was needing more data about their application, even in early development. Being able to see when errors were happening, and where, would unblock them faster. Users with an app in production were frustrated by not having enough context for the data we did provide. Usage metrics alone weren't enough to support product or business decisions about how their app should evolve.
The customer journey map, updated with Thinking, Feeling, Doing actions and customer satisfaction with verbatims.
LAYING OUT Components & VISUALIZING Data for Balance and Focus
What jobs do developers need to get done, and what data do they need to be successful?
Once developers start building, they need to see which video sessions had issues and get recommendations on how to fix them. Pinpointing the root of an issue quickly, without technical support, is a must. After launch, developers and PMs need to know the most common issues as well as usage trends. For PMs, being able to map those trends to their end users’ demographics (location, network, device) helps them make better product decisions for future iterations.
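Those two jobs reduce to a simple data question: which sessions had issues, and how does usage break down by demographic? A minimal sketch of that logic in Python, where the session records, field names, and quality thresholds are all illustrative assumptions rather than TokBox's actual data model:

```python
from collections import Counter

# Hypothetical session records -- fields and thresholds are illustrative,
# not TokBox's real schema.
sessions = [
    {"id": "s1", "error_count": 3, "avg_bitrate_kbps": 180, "country": "US", "device": "mobile"},
    {"id": "s2", "error_count": 0, "avg_bitrate_kbps": 950, "country": "DE", "device": "desktop"},
    {"id": "s3", "error_count": 1, "avg_bitrate_kbps": 420, "country": "US", "device": "desktop"},
]

def has_issue(session, min_bitrate_kbps=300):
    """Flag a session if it logged errors or fell below a bitrate floor."""
    return session["error_count"] > 0 or session["avg_bitrate_kbps"] < min_bitrate_kbps

# The "which sessions had issues" job: a filtered list to investigate.
problem_sessions = [s["id"] for s in sessions if has_issue(s)]

# The PM job: roll usage up by a demographic dimension (here, country).
usage_by_country = Counter(s["country"] for s in sessions)
```

The same rollup works for any dimension (device, network), which is what makes the filtering experience described below flexible rather than a fixed set of reports.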
A strong color story and logic are critical for successful data visualization - users need to be able to disambiguate between units and understand at a glance what each graphical element means.
DEFINING a Flexible Filtering Experience
More data can create more problems. Without context, clear next steps, and the ability to filter information, piling on more data can cause confusion rather than help solve problems. By curating the default data shown based on the health of a user’s application and giving users the right filters to drill down further, Insights helped accelerate developers’ workflow from Build to Launch by 23% (compared to the previous year’s funnel metrics).
Since end-user quality of experience was often connected to browser or SDK version (especially if they were no longer supported), one of the most common frustrations that our developers had was not knowing the breakdown of browser and SDK version in their usage metrics. Providing this info, and highlighting time periods where unsupported endpoints were being used, helped solve a major debugging blocker.
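The breakdown itself is a straightforward aggregation. A sketch of what it might look like, assuming hypothetical usage records and a supported-version list (neither reflects TokBox's real schema or support matrix):

```python
from collections import defaultdict

# Hypothetical usage records and supported-SDK set -- illustrative only.
usage = [
    {"browser": "Chrome 71", "sdk": "2.16", "minutes": 1200},
    {"browser": "IE 11",     "sdk": "2.9",  "minutes": 300},
    {"browser": "Chrome 71", "sdk": "2.16", "minutes": 800},
]
supported_sdks = {"2.14", "2.15", "2.16"}

# Break streamed minutes down by (browser, SDK version) pair.
breakdown = defaultdict(int)
for row in usage:
    breakdown[(row["browser"], row["sdk"])] += row["minutes"]

# Surface the unsupported endpoints -- the debugging blocker above.
unsupported = {key: mins for key, mins in breakdown.items()
               if key[1] not in supported_sdks}
```

Highlighting the `unsupported` slice directly in the usage charts is what turns a raw metric into a next step for the developer.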
Testing with Users
At this point in the process, we felt confident in the proposal but needed to validate with developers that our assumptions were correct. This helped us refine much of the data visualization and interaction details, content hierarchy, and overall user flows before the beta launch.
All users tested an InVision prototype of the end-to-end flow and were asked to perform specific tasks and share impressions of ease of use and feature value. The biggest takeaways were around making the secondary tabs (quality, errors) more discoverable, providing more clarity and definitions for terminology, enabling drill-down into individual sessions from project-level insights, and surfacing more direct action items from the dashboard. The geographic visualization was a big point of value for about 60% of testers. Errors and quality statistics were universally valuable. Overall, users were pleased with the data they saw, and all but one noted that they would log in to the dashboard more if it meant they would have this view.