Action Plan

Role: Lead Product Designer
Tools: usertesting.com, Miro, Figma
Key Contributions: Applied user research to inform UX strategy, improving content clarity and task flow organization.

Problem ⚡

The existing recommendation system surfaced helpful actions, but the experience lacked clarity and engagement. Users often ignored the recommendations, putting adoption at risk. The challenge was to redesign the system so it motivated users to act, while also supporting integration into a new, merged product platform.

Role & Collaboration 🎨

I led the UX strategy and design. I partnered with data scientists to reorganize recommendations by risk category, collaborated with our UX researcher to synthesize findings, and worked with subject matter experts to ensure accuracy. I also facilitated design reviews with the UX team, ran user testing to validate improvements, and presented design directions in stakeholder sessions for alignment and feasibility.

Outcome & Impact 🚀

Delivered a redesigned Action Plan that positioned the feature as a key differentiator in the product. The work contributed to a major partner deal and established a framework for measuring engagement once launched. The final solution created a more motivating, structured experience that gave users clearer next steps and increased the feature’s business value.

Platform Context: Key Components

Monitoring

Personally identifiable information is continuously tracked through various monitoring methods, including dark web and credit monitoring.

Alerts

Users are alerted when their information is detected in databases of dark web exposures.

ID Safety Score

Using these exposure data points, a score is calculated to show users their estimated risk level, providing context for the recommended actions.
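
The scoring model itself sits with the data science team and is outside the scope of this case study. Purely as an illustration of the idea, a score like this can be thought of as a weighted penalty over unresolved exposures; the category names, weights, and 0–100 scale in the sketch below are assumptions, not the product’s actual model.

```typescript
// Hypothetical sketch of an exposure-weighted safety score.
// Category names, weights, and the 0-100 scale are illustrative
// assumptions only, not the product's real scoring model.

type ExposureCategory = "darkWeb" | "credit" | "credentials";

interface Exposure {
  category: ExposureCategory;
  resolved: boolean; // whether the user completed the recommended action
}

// Assumed severity weights per exposure category.
const WEIGHTS: Record<ExposureCategory, number> = {
  darkWeb: 15,
  credit: 10,
  credentials: 20,
};

// Start from a perfect score and subtract a weighted penalty for each
// unresolved exposure, clamping the result to the 0-100 range.
function idSafetyScore(exposures: Exposure[]): number {
  const penalty = exposures
    .filter((e) => !e.resolved)
    .reduce((sum, e) => sum + WEIGHTS[e.category], 0);
  return Math.max(0, 100 - penalty);
}
```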

Action Plan

Recommendations are provided to mitigate identified risks and improve the user’s ID Safety Score.

This feature is the focus of this case study.

Research and Insights

Method: Diary Study

A diary study is a longitudinal research method where participants document their thoughts, behaviors, and experiences over time. This approach provides deep insights into user sentiment, pain points, and behavioral patterns in real-world contexts.

Study timeline:
  • Day 1: Kick-off (1:1 interview to identify needs)
  • Day 2: First-time impressions of the product
  • Day 3: Regular usage begins
  • Day 30: Final sentiment survey
  • Post-research: Analysis

This method was chosen to capture users’ initial sentiments before interacting with the product, observe real-time reactions during first-time use, and track behavioral shifts over an extended period, uncovering how perceptions and engagement evolve.

It involved:
  • 1:1 Interviews
  • Unmoderated Testing
  • Affinity Map Analysis

Key Insight
Through diary study interviews, we identified the core Job To Be Done:

"Assure me that my identity is as protected as possible"

This assurance depends on two things:
  1. "Inform me when significant or suspicious events occur."
  2. Guide me on how to respond effectively.”
The Problem: Guidance Falls Short

This study was designed and led by our UX researcher; I contributed by synthesizing the findings and ensuring their implications shaped the Action Plan redesign.

Motivational Issue #1: Context for Recommendations Unclear

This is the most critical problem. If users don’t understand why these actions matter, they won’t trust the Action Plan. How might we make it clear why a recommendation is relevant to the user?

Before
Context missing from action plan page
Before
Key info on separate pages

To understand the context behind the Action Plan, users must navigate to separate tabs to access their ID Safety Score, Top Risks, exposure history, and exposed credentials. This separation makes it difficult for users to see why the recommendations are relevant and how they relate to their unique exposure history.

After
Context for Recommendations Made Clear

Motivational Issue #2: Poorly Written Action Steps

While understanding the “why” behind actions is important, clear and digestible “how” steps are essential too. How might we make guidance easy to scan and act on?

Before
Why this falls short:
  • Does not explain what “monitoring” actually entails.
  • No direct call to action.
  • Assumes prior knowledge of identity protection services.
  • Leaves users uncertain about what steps to take.

After
Why this is better:
  • Clarity: Clearly defines what the action is and why it matters.
  • Actionability: Provides a step-by-step breakdown of exactly what the user needs to do.
  • Accessibility: Uses plain language, removing assumptions about prior knowledge.

Motivational Issue #3: Overwhelming Task List

A long, unorganized list of security tasks can feel daunting. Without structure, prioritization, or clear explanations, the Action Plan feels like an endless to-do list rather than a guided path to improving security. How might we structure the tasks to provide clear guidance and reduce overwhelm?

Before
Long, unorganized list
After
Clear and Structured Workflow

To better understand why certain recommendations surfaced as higher priority, I collaborated with our data science team. They explained how risk categories were assigned, which helped me reorganize the action plan into clearer groupings. This framing gave users a more structured view of their next steps without requiring them to interpret the underlying data model.
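
As a rough sketch of that reorganization (the category names and severity order here are my illustrative assumptions, not the data science team’s actual model), recommendations tagged with a risk category can be bucketed and ordered before rendering:

```typescript
// Hypothetical grouping of recommendations by assigned risk category.
// Category names and their display order are assumptions for illustration.

interface Recommendation {
  title: string;
  riskCategory: string; // assigned upstream by the risk model
}

// Assumed display order, from highest to lowest urgency.
const CATEGORY_ORDER = ["Critical", "High", "Moderate", "Low"];

// Turn a flat list into ordered sections so users see a structured
// workflow instead of one long, unprioritized task list.
function groupByRisk(recs: Recommendation[]): Map<string, Recommendation[]> {
  const groups = new Map<string, Recommendation[]>();
  for (const category of CATEGORY_ORDER) {
    const matching = recs.filter((r) => r.riskCategory === category);
    if (matching.length > 0) groups.set(category, matching);
  }
  return groups;
}
```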

Feature Discoverability

Users struggled to locate the Action Plan due to its placement on the dashboard. How might we ensure the Action Plan is easy to access?

Before
After

The highlighted area shows the visible portion of the screen on initial load. In the 'before' design, the call-to-action for the Action Plan feature extends below the fold. The 'after' design ensures it is fully visible and prominently emphasized, reducing friction in accessing essential tasks.

Reflections & Next Steps

Although this feature has not yet been built, the design strategy was grounded in user insights and business needs. Testing highlighted why users struggled to engage with the original version, and the redesign directly addresses those barriers.

Once implemented, success can be measured through the following metrics (a measurement sketch follows the list):

  • Page engagement → frequency and repeat visits to the Action Plan.
  • Task completion → percentage of recommended actions users follow through on.
  • Feature adoption → overall proportion of users engaging compared to the prior version.
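
As a hypothetical illustration of how these metrics could be instrumented once the feature ships, the sketch below computes them from simple analytics events; the event names and shapes are assumptions, not an existing schema.

```typescript
// Hypothetical metric calculations over analytics events.
// Event types and field names are assumptions for illustration.

interface ActionPlanEvent {
  userId: string;
  type: "page_view" | "task_recommended" | "task_completed";
}

// Task completion: share of recommended actions users followed through on.
function taskCompletionRate(events: ActionPlanEvent[]): number {
  const recommended = events.filter((e) => e.type === "task_recommended").length;
  const completed = events.filter((e) => e.type === "task_completed").length;
  return recommended === 0 ? 0 : completed / recommended;
}

// Feature adoption: proportion of all users who engaged with the Action Plan.
function adoptionRate(events: ActionPlanEvent[], totalUsers: number): number {
  const engaged = new Set(
    events.filter((e) => e.type === "page_view").map((e) => e.userId)
  ).size;
  return totalUsers === 0 ? 0 : engaged / totalUsers;
}
```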