Redesign of a Search Platform for Public Health Data Discovery

UX Research, Product Design, Agile Project Management

Project Overview

I led the redesign of a key FDA search tool that consolidates multiple data sources for pharmaceutical researchers, including chemists, toxicologists, and public health regulators. My role focused on improving user experience, defining functional requirements, and setting acceptance criteria for testing. As Scrum Master, I coordinated development and release efforts through launch and subsequent application enhancements.

Methods and Tools Used

Methods

  • Visual design and establishing a style guide
  • Wireframing
  • Interactive prototyping
  • Agile requirements (epics, user stories, acceptance criteria)
  • Moderated usability testing (10 participants)
  • System Usability Scale (SUS) questionnaire
  • Functional testing and User Acceptance Testing (UAT)

Tools

  • Axure (for design, wireframing, and prototyping)
  • Jira (for epics, user stories, and bug fix tickets)
  • Confluence (for release management and documentation)
  • SortSite5 (for automated Section 508 accessibility testing)

My Contributions

For this project, I was the Business Analyst, UX Designer, and Scrum Master on an agile team of four developers and one quality assurance (QA) tester. My responsibilities spanned requirements definition, UX design, usability testing, and agile delivery coordination.

Background

Context

My work for this project was done during a multi-year consulting engagement through my employer, Booz Allen Hamilton, with the FDA's Division of Research Science Informatics within the Center for Tobacco Products (CTP). For this engagement, I was the Business Analyst, Scrum Master, and UI designer for the CTP Integrated Research Data System (CIRDS), a search and retrieval tool that allowed scientists within the FDA to easily access external and internal toxicology and public health data through a single user interface.

The Challenge

My team inherited CIRDS as a proof of concept and was tasked with a full data-center migration, modernization, and database integration within CTP. The outdated Angular framework required major reengineering, but tight deadlines prevented prior discovery workshops or exploratory research. Instead, I worked within a 2-week sprint cadence to define requirements, create wireframes, and secure stakeholder approval for designs and acceptance criteria. Usability testing and research had to wait until after the first production release.

Redesign Goals

My goals for this redesign are described in the methodology sections that follow.

Methodology: Design

Process Definition

Before wireframing, I consulted clients and end-users to understand their processes and pain points. In our weekly Backlog Refinement sessions, we dedicated time to the redesign while addressing other backlog items. I used Axure to create as-is and future-state flow diagrams for key features.
An example flow of our Advanced Search feature, one of the many features we needed to document, is shown below: 

Wireframing and Requirements

After core processes were agreed upon, I created wireframes in Axure. I started with greyscale, low-fidelity wireframes so that, at least in the early phases, customer attention would stay on the important user interface elements rather than on aesthetics or color. I went through multiple rounds of wireframing during this phase, continuously presenting changes during our sprint review and backlog refinement agile ceremonies.

Once the low-fidelity wireframes were stable, I switched to high-fidelity wireframes, adding color and interaction elements for demos and presentations. As I designed the high-fidelity wireframes, I concurrently built out a basic style guide and documented design patterns for elements such as modals and buttons.

The Design Process for CIRDS

Wireframe: A mockup of the redesigned CIRDS search result view.

Wireframe: A mockup of a detailed view for a research article

Sample from Style Guide: Colors & Font

Methodology: Agile Requirements, Testing, and Deployment

Agile Requirements

Once the visual identity was largely finalized with key decision makers and the engineering team, I translated each component of the new design into user stories with acceptance criteria for functional testing. As developers built out the features for the new interface, I conducted design reviews to ensure that what was built matched the style guide and mockups. I also assisted with functional testing each sprint.

Sample User Story:
"As a toxicologist, I want to view all substances mentioned within a research article on the search results view, so that I can easily identify articles and adverse reports that mention specific chemicals for research and regulation."

Sample Acceptance Criteria:

  1. User can see the "Substances" label within the expanded view of a search results card.
  2. The "Substances" area lists all substances in-line as links, separated by commas (e.g., Acetate, Ammonia, Nicotine, Propanetriol).
  3. Clicking on a substance link should direct the user to that substance's detailed view in CIRDS.
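The second criterion, substances rendered in-line as comma-separated links, could translate into a unit-style check during functional testing. A minimal sketch in Python follows; the route pattern, function names, and markup are hypothetical (CIRDS itself used an Angular front end), so this illustrates the criterion rather than the actual implementation:

```python
def render_substances(substances):
    """Render substance names as comma-separated in-line links, each pointing
    to a hypothetical detailed-view route (e.g., /substances/nicotine)."""
    links = [
        '<a href="/substances/{slug}">{name}</a>'.format(
            slug=name.lower().replace(" ", "-"), name=name
        )
        for name in substances
    ]
    # Acceptance criterion 2: links appear in-line, separated by commas.
    return ", ".join(links)

print(render_substances(["Acetate", "Ammonia", "Nicotine"]))
```

A tester could assert on output like this directly, turning the written acceptance criteria into repeatable checks rather than one-off manual verification.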

Testing and Deployment

Our development team operated with four environments:

  1. Developer Sandbox (where developer testing took place)
  2. Test (where functional testing and design reviews took place)
  3. Preproduction (staging environment where UAT took place)
  4. Production (live environment)

When CIRDS was ready for the full production release, I worked with our tester to develop a full set of User Acceptance Testing (UAT) scripts for our customer stakeholders to execute. The purpose of UAT was simply to verify that core business and technical requirements were met. Once UAT passed, I coordinated the full production release of the redesign with my team.

Methodology: Usability Testing

Due to project constraints, I did not get a chance to conduct usability testing until after the redesign was fully implemented in our application's production environment. Even so, the testing provided useful insights for enhancements and fixes that we incorporated into subsequent releases. I conducted moderated usability testing sessions with 10 FDA scientists in the live application.

After each session, I administered the System Usability Scale (SUS) questionnaire to supplement qualitative insights with quantitative data. Across the 10 scientists, the redesign received an average SUS score of 71, slightly above the commonly cited benchmark of 68 for usable software systems.
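For reference, SUS scoring follows a fixed formula: each of the 10 items is answered on a 1-5 Likert scale, positively worded (odd) items contribute the response minus 1, negatively worded (even) items contribute 5 minus the response, and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch (the sample responses are hypothetical):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from one participant's
    10 responses, each on a 1-5 Likert scale (1 = strongly disagree)."""
    if len(responses) != 10:
        raise ValueError("SUS uses exactly 10 items")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded; even items are negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 sum to 0-100

# One hypothetical participant: positive on odd items, low on even items.
print(sus_score([4, 2, 4, 2, 5, 1, 4, 2, 4, 2]))  # → 80.0
```

A study-level score like the 71 reported above is simply the mean of the per-participant scores.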

Outcomes and Lessons Learned

Outcomes

The redesign of the application was successfully implemented and deployed to production, and it is still in use by the FDA as of the time of this case study.

Lessons Learned

If I could do this project over again, I would advocate harder for usability testing upfront. While the redesign was received positively by my clients and users, I could have achieved even better results by running earlier rounds of usability testing with the greyscale wireframes, or at least with the higher-fidelity prototypes, before proceeding with development.

However, I was ultimately able to adapt to the constraints, and I gained valuable experience by doing the extra work to weave UX design into the agile development process with a small team.
