
Case Study — Improving Benefits Findability for Senior Members

By Philip Burgess - UX Research Leader


Role: Manager, User Experience Research (player/coach)
Company: Fortune 50 healthcare organization (multi‑line)
Timeline: 12 weeks (Phase 1)
Team: 1 PM, 2 Product Designers, 1 Content Designer, 1 Data Analyst, 1 UXR (me), Eng Lead


Headline Outcome

We improved task success and perceived ease/satisfaction for “Find Your Benefits” by ~20–22% and lifted engagement with the benefits area to a top‑five feature within a month of release. 


1) Problem

Members—especially seniors—struggled to find and understand their plan benefits in the secure portal. Support call drivers and previous usability findings pointed to confusing navigation, unhelpful labels, and inconsistent search/filter behavior. The business needed a quick path to measurable improvement ahead of AEP, without breaking the information architecture.

Success looked like:

  • Faster path to Benefits from the global nav

  • Higher completion on “Find a specific benefit”

  • Reduced friction indicators (errors, backtracks)

  • Lift in Ease + Satisfaction on post‑visit pulse

Internally, this tied directly to the objectives and KPIs the team already tracked in readouts (Ease/Satisfaction/Trust, task success).
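
To make criteria like these decision‑ready, they can be written down as explicit pass/miss thresholds. Below is a minimal Python sketch of that idea; the metric names and targets are invented for illustration, not the program's actual numbers.

```python
# Hypothetical decision criteria for "Find Your Benefits" (not the real targets).
# Each entry: metric -> (comparison, threshold).
TARGETS = {
    "task_success_rate": (">=", 0.80),  # completion on "Find a specific benefit"
    "ease_score":        (">=", 4.0),   # 5-point post-visit pulse item
    "satisfaction":      (">=", 4.0),   # 5-point post-visit pulse item
    "backtrack_rate":    ("<=", 0.15),  # friction indicator: lower is better
}

def evaluate_pulse(observed: dict) -> dict:
    """Return a pass/miss verdict per metric for one pulse reading."""
    verdicts = {}
    for metric, (op, threshold) in TARGETS.items():
        value = observed[metric]
        ok = value >= threshold if op == ">=" else value <= threshold
        verdicts[metric] = ("PASS" if ok else "MISS", value, threshold)
    return verdicts

pulse = {"task_success_rate": 0.84, "ease_score": 4.2,
         "satisfaction": 4.1, "backtrack_rate": 0.11}  # made-up reading
for metric, (verdict, value, threshold) in evaluate_pulse(pulse).items():
    print(f"{metric}: {verdict} (observed {value}, target {threshold})")
```

Writing the criteria this way forces the "what counts as success" conversation before fielding, which is exactly what the one‑pager review is meant to settle.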


2) Constraints

  • Time‑boxed to one quarter (content updates + targeted IA changes, no wholesale redesign)

  • Regulatory and content accuracy constraints (healthcare terminology, accessibility)

  • Cross‑platform parity: desktop and mobile web

  • Must align with the org’s standardized one‑pager plan + Insights & Recommendations readout format to speed decisions 


3) My Role & Approach

I led research end‑to‑end and coached the team on a consistent, lightweight operating rhythm:

  1. Initialize & Gather — Synthesized prior insights (readouts, call drivers, analytics), defined hypotheses, and aligned on decision criteria.

  2. Plan — Created a one‑pager (objectives, KPIs, methods, risks) and secured fast peer review.

  3. Create — Built study materials (tasks, success criteria, proto instrumentation) and pilot‑tested.

  4. Facilitate/Launch — Executed a mixed‑methods plan (below).

  5. Insights & Recommendations — Prioritized design moves with an explicit “Decision/Owner/Due” table.

  6. Follow‑through — Partnered on release validation and metrics readout.


4) Methods

  • Heuristic review of the current benefits flow with a content lens

  • Unmoderated usability testing (desktop + mobile): task success, time on task, error/backtrack counts, SUS‑like ease (see the analysis sketch at the end of this section)

  • Targeted moderated sessions with seniors to probe labels/mental models

  • Clickstream analytics dip to validate discoverability changes (pre/post)

  • Micro‑surveys post‑visit to track Ease + Satisfaction deltas

  • Accessibility spot checks (color, contrast, focus order, semantics); a contrast‑check sketch follows
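
The color‑contrast portion of those spot checks can be scripted with the WCAG 2.x relative‑luminance formula. This is a generic sketch, not the team's actual tooling:

```python
# WCAG 2.x color-contrast spot check (generic sketch, not the team's tooling).

def _linearize(channel: int) -> float:
    """Convert an 8-bit sRGB channel to linear light per WCAG 2.x."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """Contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter color on top."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Example: dark-gray text on white; WCAG AA requires 4.5:1 for body text.
ratio = contrast_ratio((85, 85, 85), (255, 255, 255))
print(f"{ratio:.2f}:1 -> {'AA pass' if ratio >= 4.5 else 'AA fail'}")
```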

All of these methods map to the plan components and reporting shells the team used across one‑pagers and readouts, keeping the workflow consistent and predictable for stakeholders.
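
For the quantitative reads (pre/post task success from unmoderated tests, Ease deltas from micro‑surveys), here is a minimal analysis sketch with made‑up counts. A two‑proportion z‑test is one reasonable way to check that a task‑success lift is real rather than noise; it is not necessarily the exact test used in the study.

```python
# Pre/post task-success comparison with a two-proportion z-test.
# All counts are made up for illustration; this is not the study's real data.
from math import sqrt, erfc

def two_proportion_ztest(success_a: int, n_a: int, success_b: int, n_b: int):
    """Return (z, two-sided p) for H0: the two success rates are equal."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided normal tail probability
    return z, p_value

# Hypothetical unmoderated-test results for "Find a specific benefit"
pre_success, pre_n = 62, 100    # baseline flow
post_success, post_n = 81, 100  # revised nav + microcopy

z, p = two_proportion_ztest(pre_success, pre_n, post_success, post_n)
lift = (post_success / post_n - pre_success / pre_n) / (pre_success / pre_n)
print(f"Task success: {pre_success/pre_n:.0%} -> {post_success/post_n:.0%} "
      f"(+{lift:.0%} relative), z={z:.2f}, p={p:.3f}")
```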


5) What We Changed

  • Navigation: Added a clearly labeled “Benefits” entry in the primary nav and breadcrumbs

  • Search microcopy: Clarified scope and gave examples (“Type ‘dental cleaning’ or ‘vision exam’…”)

  • Filters: Reduced options to the essentials; grouped by user intent

  • Result density: Improved scannability with plain‑language benefit titles + structured snippets

  • Empty states: “Try: preventive | dental | vision” quick chips to guide next steps (sketched below)
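
To illustrate the empty‑state behavior, here is hypothetical fallback logic: when a query returns nothing, surface a few high‑traffic benefit categories as one‑tap chips instead of a dead end. The catalog and matching rule are invented for the sketch.

```python
# Illustrative empty-state fallback for benefits search (names invented).
FALLBACK_CHIPS = ["preventive", "dental", "vision"]  # top benefit categories

def results_or_chips(query: str, search_fn) -> dict:
    """Run the search; on zero results, return guidance chips instead."""
    results = search_fn(query)
    if results:
        return {"state": "results", "items": results}
    return {
        "state": "empty",
        "message": f'No matches for "{query}". Try one of these:',
        "chips": FALLBACK_CHIPS,
    }

# Tiny stand-in search over plain-language benefit titles
CATALOG = ["Dental cleaning", "Vision exam", "Preventive screening"]
fake_search = lambda q: [t for t in CATALOG if q.lower() in t.lower()]

print(results_or_chips("dental", fake_search))        # normal results
print(results_or_chips("chiropracter", fake_search))  # misspelled -> chips
```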


6) Impact

  • Ease/Satisfaction: Up ~20–22% in the first post‑release pulse (desktop + mobile)

  • Findability: Benefits area rose to a top‑five destination within weeks

  • Qualitative: Fewer “Where do I start?” comments; stronger sense of control and clarity

  • Operational: Stakeholders adopted the standard one‑pager and I&R formats for subsequent work, shortening plan‑to‑field time by ~20% in the next project

Note: Exact figures are anonymized; the ranges reflect the direction and magnitude observed in real releases I led.
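
For transparency on how a figure like “~20–22%” is derived: relative lift is (post − pre) / pre. A tiny worked example with made‑up pulse scores:

```python
# How the relative lift is computed (scores below are made up, not real data).
pre_ease, post_ease = 3.6, 4.35  # mean 5-point Ease, pre vs. post release
lift = (post_ease - pre_ease) / pre_ease
print(f"Ease: {pre_ease} -> {post_ease} = +{lift:.1%} relative lift")  # +20.8%
```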


7) What I Did Specifically (Manager + IC)

  • Framed the problem in business terms and set decision criteria tied to KPIs

  • Designed the study (tasks, success definitions, sampling strategy) and ran the pilot

  • Facilitated cross‑functional alignment (PM, Design, Content, Analytics) with a standard one‑pager and a tight RACI

  • Synthesized insights into decision‑ready recommendations with an explicit owner/timeline

  • Closed the loop post‑release: validated metrics, shared a 1‑page wins recap, and archived assets for reuse in the insights library

This approach reinforces the process‑refresh and metrics mindset (planning rigor, peer review, timely I&R) that I advocate as a research leader.


8) Artifacts (public‑safe)

  • Before/After nav and results (key screens, redacted)

  • Task flow with friction points + “after” flow

  • Decision log (owner, due date, status)

  • Mini‑readout (5 slides): Objectives → What we tested → What we found → What we changed → Results


9) Lessons Learned

  • Language wins trust. Clear microcopy and examples outperformed novel UI affordances.

  • Decision readiness beats volume. A concise I&R with an action table moved design and dev faster than a lengthy deck.

  • Process creates speed. Standardizing on a one‑pager + peer review reduced back‑and‑forth and helped us ship measurable improvements inside a quarter. 


10) What I’d Do Next

  • Fold the learnings into reusable patterns (“recipes”) for labeling, filters, and empty states so other teams inherit the wins.

  • Extend into benefit comprehension with progressive disclosure and plain‑language summaries; A/B test on top drivers.

  • Expand the metrics set from perception to behavior (repeat task time, deflection proxies, feature reuse) to strengthen the evidence chain. 

 
