MathWorks Test Results Database (TRDB)
Internal Web app for investigating software test failures
The Test Results Database (TRDB) is a key internal app used by 1500+ members of the MathWorks development organization to review and investigate failures in automated software tests. It serves several distinct user groups with very different usage patterns and levels of domain expertise. The domain is complex and technical, involving many concepts unique to MathWorks' build and test environment.
When I joined the project, the team was beginning a major effort to redesign the app and release an improved 2.0 version. The scope included revamping the architecture and UI, as well as broader user experience improvements.
Because TRDB is an internal app, we were blessed with easy access to users. We interviewed and observed them to understand how they used the existing version and to identify pain points. I ran several collaborative workflow mapping exercises to help us understand key workflows.
We discovered that many key workflows required an excessive number of manual steps. We also found advanced features that even experienced users didn't know existed.
I ran meetings for the cross-functional team to prioritize features for the upcoming release. We broke these features down into phases and reviewed the plan with our stakeholders.
Early on we decided to center the main page design on a toolstrip. The pattern is a standard in MathWorks' internal and customer-facing apps, which meant it would be familiar to all of the app's users. It would also address some of the discoverability problems by putting most of the app's features in a central location.
Next, I started fleshing out the ideas into Axure wireframes. Rapid prototyping allowed us to work through many different ideas very quickly, with plenty of design review and iteration along the way. The details evolved a lot as the project moved forward, but these early concepts were immensely helpful in developing a big-picture vision of what we wanted the final product to be.
We also mapped the proposed workflow to make sure our solution would reduce steps and address users' pain points.
I planned and moderated two rounds of usability testing, one at the wireframe stage and one when the design was close to complete. I worked with the team to structure each study around the questions the team most needed answered. We kept the tasks as open-ended as possible so that we could learn about users' natural mental models and workflows.
The complete cross-functional team attended the tests and took notes on Post-Its, which we then affinity-mapped as a team after the sessions. We emerged happy with the overall direction, but with lots of ideas for improvements. Back to the drawing board!
I wrote a detailed functional spec for the new design. I also ran design reviews and roadshows with important user groups to collect additional feedback and socialize the design.
I helped the team collect, analyze, and prioritize feedback during the pilot phase. The application is data-intensive, and usage patterns vary widely across user groups, so the pilot let us gather feedback from people using the new version for their day-to-day work with their actual data.
I also helped the team with the final rollout, including UI testing and drafting documentation.
The final version has been very well-received by users, and the UI is now serving as a model for redesigns of several other key internal applications. In addition, the user research we did as part of this project has continued to serve as a foundation for numerous later improvements.
I've continued to work with the TRDB team after the release of TRDB 2.0. For later projects, I've facilitated braindrawing and paper prototyping sessions. I have also worked with the team to plan and facilitate several week-long offsite events to rapidly design and implement incremental improvements.