Role: Lead Product Designer
Tools: usertesting.com, Miro, Figma
Key Contributions: Applied user research to inform UX strategy, improving content clarity and task flow organization.
The existing recommendation system surfaced helpful actions, but the experience lacked clarity and engagement. Users often ignored the recommendations, putting adoption at risk. The challenge was to redesign the system so it motivated users to act, while also supporting integration into a new, merged product platform.
I led the UX strategy and design. I partnered with data scientists to reorganize recommendations by risk category, collaborated with our UX researcher to synthesize findings, and worked with subject matter experts to ensure accuracy. I also facilitated design reviews with the UX team, ran user testing to validate improvements, and presented design directions in stakeholder sessions for alignment and feasibility.
Delivered a redesigned Action Plan that positioned the feature as a key differentiator in the product. The work contributed to a major partner deal and established a framework for measuring engagement once the feature launches. The final solution created a more motivating, structured experience that gave users clearer next steps and increased the feature’s business value.
Personally identifiable information is continuously tracked through various monitoring methods, including dark web and credit monitoring.
Users are alerted when their information is detected in databases of dark web exposures.
Using these exposure data points, a score is calculated to show users their estimated risk level, providing context for the recommended actions (a simplified sketch of this kind of calculation appears below).
Recommendations are provided to mitigate identified risks and improve the user’s ID Safety Score.
This feature is the focus of this case study.
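To make the relationship between exposures, score, and risk level concrete, here is a minimal sketch of how such a score might be derived. The exposure types, severity weights, and thresholds are illustrative assumptions only; the actual scoring model belonged to the data science team and is not reproduced here.

```typescript
// Hypothetical sketch: derive an ID safety score from exposure data points.
// Exposure types, weights, and thresholds are illustrative assumptions,
// not the production scoring model.

type ExposureType = "password" | "ssn" | "creditCard" | "email" | "phone";

interface Exposure {
  type: ExposureType;
  detectedAt: Date;
  resolved: boolean; // has the user completed the recommended action?
}

// Assumed severity weights per exposure type (illustrative only).
const SEVERITY: Record<ExposureType, number> = {
  ssn: 40,
  creditCard: 30,
  password: 20,
  email: 5,
  phone: 5,
};

// Start from a perfect score and subtract a penalty for each unresolved exposure.
function idSafetyScore(exposures: Exposure[]): number {
  const penalty = exposures
    .filter((e) => !e.resolved)
    .reduce((sum, e) => sum + SEVERITY[e.type], 0);
  return Math.max(0, 100 - penalty);
}

// Map the numeric score to the risk level shown alongside recommendations.
function riskLevel(score: number): "low" | "medium" | "high" {
  if (score >= 80) return "low";
  if (score >= 50) return "medium";
  return "high";
}

// Example: two unresolved exposures drop the score to 40, a high risk level.
const score = idSafetyScore([
  { type: "ssn", detectedAt: new Date(), resolved: false },
  { type: "password", detectedAt: new Date(), resolved: false },
]);
console.log(score, riskLevel(score)); // logs: 40 high
```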
A diary study is a longitudinal research method where participants document their thoughts, behaviors, and experiences over time. This approach provides deep insights into user sentiment, pain points, and behavioral patterns in real-world contexts.
This method was chosen to capture users’ initial sentiments before interacting with the product, observe real-time reactions during first-time use, and track behavioral shifts over an extended period, uncovering how perceptions and engagement evolve.
While alerts were fulfilling the “inform me” need, the Action Plan (meant to guide users) was unclear, unmotivating, and untrusted after initial exposure. Engagement dropped sharply, leaving users without a sense of ongoing protection.
This study was designed and led by our UX researcher; I contributed by synthesizing findings and ensuring the implications shaped the alert redesign.
This is the most critical problem. If users don’t understand why these actions matter, they won’t trust the Action Plan. How might we make it clear why a recommendation is relevant to the user?
To understand the context behind the Action Plan, users must navigate to separate tabs to access their ID Safety Score, Top Risks, exposure history, and exposed credentials. This separation makes it hard for users to see why a recommendation is relevant or how it relates to their unique exposure history.
While understanding the “why” behind actions is important, clear, digestible “how” steps are essential too. How might we make guidance easy to scan and act on?
A long, unorganized list of security tasks can feel daunting. Without structure, prioritization, or clear explanations, the Action Plan feels like an endless to-do list rather than a guided path to improving security. How might we structure the tasks to provide clear guidance and reduce overwhelm?
To better understand why certain recommendations surfaced as higher priority, I collaborated with our data science team. They explained how risk categories were assigned, which helped me reorganize the Action Plan into clearer groupings. This framing gave users a more structured view of their next steps without requiring them to interpret the underlying data model.
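As an illustration of that reorganization, the sketch below buckets a flat recommendation list into risk-category sections and surfaces unfinished items first. The category names and the Recommendation shape are hypothetical; in practice the category assignment came from the data science model.

```typescript
// Illustrative sketch: group a flat recommendation list into the risk
// categories used to structure the redesigned Action Plan. Category names
// and the Recommendation shape are assumptions, not the production model.

type RiskCategory = "Identity" | "Financial" | "Devices & Accounts";

interface Recommendation {
  id: string;
  title: string;
  category: RiskCategory; // assigned upstream by the data science model
  completed: boolean;
}

// Group recommendations by category, listing incomplete items first so each
// section reads as a short, prioritized checklist rather than one long list.
function groupByCategory(
  recs: Recommendation[]
): Map<RiskCategory, Recommendation[]> {
  const groups = new Map<RiskCategory, Recommendation[]>();
  for (const rec of recs) {
    const bucket = groups.get(rec.category) ?? [];
    bucket.push(rec);
    groups.set(rec.category, bucket);
  }
  for (const bucket of groups.values()) {
    bucket.sort((a, b) => Number(a.completed) - Number(b.completed));
  }
  return groups;
}

// Example usage with placeholder data.
const grouped = groupByCategory([
  { id: "1", title: "Freeze your credit", category: "Financial", completed: false },
  { id: "2", title: "Update reused passwords", category: "Devices & Accounts", completed: true },
  { id: "3", title: "Enable two-factor authentication", category: "Devices & Accounts", completed: false },
]);
console.log(grouped);
```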
Users struggled to locate the Action Plan due to its placement on the dashboard. How might we ensure the Action Plan is easy to find and access?
The highlighted area shows the visible portion of the screen on initial load. In the 'before' design, the call-to-action for the Action Plan feature extends below the fold. The 'after' design ensures it is fully visible and prominently emphasized, reducing friction in accessing essential tasks.
Although this feature has not yet been built, the design strategy was grounded in user insights and business needs. Testing highlighted why users struggled to engage with the original version, and the redesign directly addresses those barriers.
Once implemented, success can be measured through: