Rejoovinii

Designing for "First-Session Confidence" in a Clinical Trial

Designing for "First-Session Confidence" in a Clinical Trial

Task completion

82%

Area

Research, UX, UI

This UK MRC-funded project (MR/W029421/1) is currently undergoing a Randomised Clinical Trial (RCT)

Context

Rejoovinii is a home-based electrotherapy system for adults with chronic Knee Osteoarthritis (OA). It uses a textile-based knee sleeve paired with a mobile app to deliver controlled electrical pulses for pain relief and muscle recovery.

Key area

Detail

Project scope

Design and validate the end-to-end therapy experience for the mobile app, with a critical focus on ensuring first-session success.

Trial goals

Validating real-world usability for clinical adoption (NHS) + ensuring reliable clinical data capture for future research.

Constraints

Uncertainty in NHS identity and clinical integration models forced the design to be functional regardless of the onboarding method.

Target group

Adults aged 45–70 with knee osteoarthritis (this demographic includes a high proportion of users with low digital confidence).

Timeline

8 months (Research + Design)

Unpacking the problem: The first 15 minutes determine 6 months

While clinical evidence shows that TENS (electrotherapy) is effective for pain relief, the real-world data reveals a critical flaw in home use:

Adherence to unsupervised home therapy falls to around 4%.

1 in 5 adults over the age of 45 in the UK are affected by knee OA.

Confidence as the key barrier

"I wasn’t sure if I was doing it right,

so it was easier to just stop."

Karen Gillan

Electrotherapy user

Note: 'Karen Gillan' is a representative persona synthesised from insights collected across prior adherence studies.

Business Challenge

Low adherence (4%) posed a direct risk to Rejoovinii’s clinical trial validity and NHS adoption. If the early breakdown in confidence during setup and control persists, inconsistent usage will continue to undermine data reliability and clinical outcomes.

The business challenge was to ensure users could confidently complete their first therapy session independently, as first-run success was critical to long-term adherence and the product’s clinical and commercial pathway.

My role

My core work involved reconciling complex and often conflicting priorities (from clinical safety and engineering constraints to user psychology) into one effective therapy experience. I led the shift in research scope (to include novices) and in the delivery pipeline (to remove low-value features), ensuring the final product mitigated the core risk of adherence failure.

Key contributions:

  • Drove the research strategy, leading secondary analysis and collaborating on primary user studies through concept and usability testing.

  • Defined the end-to-end flow by aligning user journeys, task sequences and interaction behaviours.

  • Aligned clinicians, engineers and product stakeholders around a single first-session experience.

  • Built the foundation of the app’s design system for clinical consistency and scalability.

Outcome

The final design delivered strong, quantifiable results:

  • Completion Rate: We achieved 82% unaided first-session completion in usability tests.

  • Crucial Impact: The flow increased users’ confidence in continuing therapy independently, directly addressing the primary barrier to long-term adherence needed for the clinical trial.

Discovery

Inside the drop-off: Where and why adherence fails

Home electrotherapy for chronic knee OA loses most patients within days. To move beyond assumptions, I worked with clinical heads to define three research questions that became the backbone of my secondary research, guiding every decision about where and why adherence breaks down:

Where do patients struggle first during therapy setup?

When do they lose confidence in effectiveness?

What motivates or prevents sustained use over time?

Mapping the problem space through evidence

Competitor and analogous product review

Initial discussions with stakeholders focused on reviewing global electrotherapy apps to understand industry standards. I realised that relying on these comparisons risked misaligning our design with our low-digital-confidence user base, as the few existing UK products only tangentially addressed chronic knee pain and most international apps targeted fitter, more confident users.

To mitigate this risk, I shifted the team’s focus from feature benchmarking to behavioural friction. I worked with product, clinical and engineering stakeholders to broaden the scope and realign our efforts around how low-confidence users build trust in unsupervised therapy.

  • Indirect Competitors: Rehabilitation exercise apps that successfully engage chronic pain patients.

  • Analogous Competitors: Accessible therapy for people facing physical movement challenges.


Findings

We gained three crucial insights into the behavioural and psychological engagement models required to build confidence, all of which were entirely absent from the direct competitor landscape.

High friction during setup

One-size-fits-all therapy reducing trust

Session data shown without meaningful context

Clinical studies review

I conducted a structured review of over 40 clinical studies on home-based TENS and adjacent OA self-management. While clinical protocols focused heavily on stimulation parameters and treatment schedules, the research consistently showed that patient drop-off was driven far more by psychological and behavioural friction than by technical inefficiency.

Findings

The findings were clear and were immediately used to focus the team's attention:

Setup anxiety and electrode placement uncertainty

Loss of confidence when progress wasn’t visible

Drop-off due to weak behavioural reinforcement

Based on the clinical evidence, it became clear that adherence failures were not primarily caused by poor protocols or insufficient instructions. They were due to psychological breakdowns in confidence during the therapy.

I therefore realigned the team’s focus away from stricter guidance and towards designing for confidence, clarity, and reinforcement during the first therapy experience.

Affinity mapping: From findings to patterns

Following the secondary and expanded competitive review, we synthesised the collective findings by organising them along the patient journey. This allowed us to distil five recurring themes, all of which reflected repeated breakdown points related to confidence and adherence:

Guidance: Users need reassurance, not just instructions

Feedback: Without clear signals, they start guessing

Customisation: When therapy doesn’t adapt, trust breaks

Tracking: Data without context feels meaningless

Motivation: Repetition without reinforcement leads to drop-off

We prioritised these themes based on their direct impact on the 4% adherence rate, resulting in three insights for our design solution:

Insight 1

The Trust-Building Moment: The first therapy experience is not a learning moment but a trust-building moment. If confidence breaks during setup, retry probability drops sharply.

Insight 2

Control and Clarity: Patients lose confidence when they are unsure what the device is doing or whether they are in control.

Insight 3

Motivation through Change: Sessions feel repetitive without meaningful reinforcement. Motivation links to seeing change, not just completing sessions.

🧭

Plan Broken: What I did next

Original plan

Recruit experienced electrotherapy users to reduce trial noise and ensure clean data capture.

What went wrong

Multiple participants dropped out due to illness and availability conflicts, putting our research timeline at risk.

My call

Instead of delaying the study, I proposed including a cohort of complete electrotherapy novices. This decision provided essential usability data and protected the overall project timeline.

Why this mattered

The trial goal was clean data, but most future NHS users would be first-timers. If novices failed, passing the trial would not prove real-world viability.

Designing confidence across the first session

With backend system integration and NHS data-sharing policies still under internal review, we couldn't finalise a login strategy. Rather than letting this delay the project, I chose to mitigate the risk of a 'fragile first run' by strategically pivoting the focus. We ran concept tests to evaluate users' emotional experience and confidence at three critical points in the application.

Participants included in concept testing: a mix of electrotherapy novices and experienced electrotherapy users, aged 48 and over.

  1. Setup - Bluetooth pairing + sleeve setup

What we tested (Insight 1)

The Trust-Building Moment: whether first-time users could connect the device and wear the sleeve with confidence, without feeling they were doing something wrong.

What we observed:

  1. Participants needed to see whether the device was paired while viewing sleeve instructions.

  2. Dense text and sleeve-positioning instructions slowed first-time users.

  3. Participants preferred a sequenced setup with visible progress steps (e.g. Step 1 of 5; see the sketch below).

  4. Some users preferred video instructions.

User signal (paraphrased)

Steps like 1 of 4 would reassure me.

Show a clear paired state.
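
To make these observations concrete, here is a minimal TypeScript sketch of the kind of setup model they point to: sequenced steps with a visible "Step N of M" label and a pairing state that stays on screen while the sleeve instructions are shown. The names and steps are illustrative only, not the production code.

```typescript
// Illustrative sketch only (hypothetical names): a setup model where the
// Bluetooth pairing state stays visible alongside sequenced sleeve
// instructions, and progress is always shown as "Step N of M".

type PairingState = "not_paired" | "pairing" | "paired" | "failed";

interface SetupStep {
  id: string;
  title: string;     // short, plain-language instruction
  videoUrl?: string; // optional video alternative some users preferred
}

class SetupFlow {
  private current = 0;

  constructor(
    private steps: SetupStep[],
    public pairing: PairingState = "not_paired",
  ) {}

  // "Step 1 of 5" style label, shown on every setup screen
  progressLabel(): string {
    return `Step ${this.current + 1} of ${this.steps.length}`;
  }

  // Pairing status stays visible while the user reads sleeve instructions
  statusBanner(): string {
    return this.pairing === "paired" ? "Sleeve connected" : "Sleeve not connected yet";
  }

  next(): SetupStep {
    if (this.current < this.steps.length - 1) this.current += 1;
    return this.steps[this.current];
  }
}

// Usage: five short steps, progress and pairing state always visible
const flow = new SetupFlow([
  { id: "unbox", title: "Unfold the knee sleeve" },
  { id: "position", title: "Slide the sleeve over your knee" },
  { id: "power", title: "Press the power button once" },
  { id: "pair", title: "Pair the sleeve with your phone" },
  { id: "check", title: "Check the fit feels snug but comfortable" },
]);
console.log(flow.progressLabel(), "|", flow.statusBanner());
```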

  2. In-session - Control + pain logging

What we tested (Insight 2)

Whether users could clearly understand what the device was doing during a live session and feel they remained in control of it, rather than just passively letting it run.

What we observed:

  1. Participants felt unsettled when controls looked dense or overly technical.

  2. Participants saw pain logging as meaningful feedback (see the sketch below).

  3. All participants valued seeing what was happening in real time.

User signal (paraphrased)

Recording pain score will be brilliant.

I would go for a personalised programme, but I’m open to customisation.
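
A minimal sketch of what guided control could look like in code, assuming an illustrative 0–10 self-reported pain scale and a 1–10 device intensity range: the pre-session pain score produces a suggested starting intensity with a plain-language explanation, and manual adjustments stay inside safe bounds. This is a hedged illustration, not the device's actual control logic.

```typescript
// Illustrative sketch only: guided control rather than raw control.
// A pre-session pain score (0–10) maps to a suggested starting intensity
// with a plain-language explanation, and manual changes are clamped.

interface IntensitySuggestion {
  level: number;       // device intensity on an assumed 1–10 scale
  explanation: string; // why the system suggests this level
}

const MIN_INTENSITY = 1;
const MAX_INTENSITY = 10;

function suggestIntensity(painScore: number): IntensitySuggestion {
  // Higher reported pain starts gentler, so the first session never feels harsh.
  const level = painScore >= 7 ? 2 : painScore >= 4 ? 3 : 4;
  return {
    level,
    explanation:
      `Starting at level ${level} because you rated your pain ${painScore}/10. ` +
      `You can raise it slowly if it feels too gentle.`,
  };
}

// Guardrail: adjustments are clamped rather than rejected, so the user
// always feels in control but can never reach an unsafe value.
function adjustIntensity(current: number, delta: number): number {
  return Math.min(MAX_INTENSITY, Math.max(MIN_INTENSITY, current + delta));
}

const suggestion = suggestIntensity(6);
console.log(suggestion.explanation);
console.log("After +3:", adjustIntensity(suggestion.level, 3)); // still within 1–10
```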

  3. Post-session - Results + progress

What we tested (Insight 3)

Whether showing outcomes and progress after a session helped users make sense of the therapy and feel motivated to continue, instead of treating each session as an isolated event.

What we observed:

  1. Rewards were less motivating than seeing actual progress trends.

  2. Participants cared more about pain trends than one-off session results (see the sketch below).

User signal (paraphrased)

Often the reward is just relief from pain and seeing the information.

It would be about seeing what patterns arise.
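
A small illustrative sketch (hypothetical data model, not the trial's actual analytics) of how one-off session results can be turned into the plain-language trends participants asked for, comparing the latest session with the previous one instead of handing users raw numbers.

```typescript
// Illustrative sketch only (hypothetical data model): turning raw session
// data into a plain-language trend, because participants cared about pain
// trends rather than one-off results or badges.

interface SessionRecord {
  date: string;       // ISO date, e.g. "2024-03-01"
  painBefore: number; // 0–10 self-reported pain before the session
  painAfter: number;  // 0–10 self-reported pain after the session
}

function trendSummary(history: SessionRecord[]): string {
  if (history.length === 0) {
    return "Complete your first session to start seeing your progress.";
  }
  const latest = history[history.length - 1];
  if (history.length === 1) {
    const relief = latest.painBefore - latest.painAfter;
    return relief > 0
      ? `Your pain eased by ${relief} point${relief === 1 ? "" : "s"} during this session.`
      : "Your first session is logged. Changes often show up over the next few sessions.";
  }
  const previous = history[history.length - 2];
  const change = previous.painAfter - latest.painAfter;
  if (change > 0) {
    return `Your post-session pain is ${change} point${change === 1 ? "" : "s"} lower than last time.`;
  }
  if (change < 0) {
    return "Pain can vary day to day. Your overall trend is what matters most.";
  }
  return "Your post-session pain is about the same as last time.";
}

console.log(trendSummary([
  { date: "2024-03-01", painBefore: 7, painAfter: 5 },
  { date: "2024-03-03", painBefore: 6, painAfter: 4 },
]));
```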

🚫

Why I killed a feature to protect clinical trust

I initially explored light gamification to support long-term adherence, but concept testing revealed a mismatch with the clinical context. Participants were not motivated by rewards or badges. They were motivated by relief, visible improvement, and understanding their own progress. Introducing gamification risked trivialising the experience and undermining trust, especially during the first session where anxiety is highest.

So, I made a conscious decision to kill overt rewards and instead design motivation around clinical progress, using pain trends, comparative insights, and subtle reinforcement messaging. This aligned better with both patient psychology and trial goals, strengthening first-session confidence and supporting sustained adherence through meaningful, not artificial, motivation.

From insights to archetypes: Creating a single archetype for first-session confidence

To ensure the design stayed grounded in real behaviour, I collaborated with clinicians, product leads, and engineers to synthesise research findings into a single primary user archetype. This archetype represented the most common behavioural and attitudinal characteristics of the target user for this critical phase.

Linda Theobald: The face of our NHS trial

Linda is a 58-year-old with chronic knee OA and moderate digital confidence. She wants to manage her chronic knee pain independently, but her biggest barrier is confidence.


She uses everyday apps like messages and banking but feels anxious around medical technology, especially when she doesn’t understand whether she’s using it “correctly”.

Every critical design decision mapped back to one question:

Would Linda feel confident completing her first session alone, without external help?

The Confidence Checkpoints

I translated Linda's anxieties and needs into first-session user stories across setup, in-session, and post-session. These stories served as confidence checkpoints, representing moments where her belief in the therapy could either strengthen or permanently break.

Instead of asking

“What should the app do?”,

I reframed the question to:

“What must Linda feel and understand at this exact moment to keep going?”

Checkpoint

Linda's Anxiety

Design Bet & Strategic Outcome

Setup

As Linda, I want clear guidance to pair and prepare the garment, so I can start therapy without early frustration or second-guessing myself.

Design Bet 1:

Breaks setup into clear, sequenced steps with visible progress, lowering cognitive and emotional load at the most fragile moment.

In-session

As Linda, I want to log my current pain and start the session at a safe intensity using simple controls and clear feedback, so I feel in control while the therapy is happening.

Design Bet 2:

Simple controls paired with clear, real-time feedback reassure her that she is using the device correctly and safely.

Post-session

As Linda, I want to log my pain after the session and see a summary and pain trends, so I feel motivated to continue rather than treat this as a one-off.

Design Bet 3:

Visible progress provides tangible proof of change, reinforcing motivation through understanding rather than superficial rewards.

These stories gave the team a shared, time-based view of where confidence needed active design intervention.

Design

From flows to an end-to-end experience

Mapping Linda’s anxieties and design bets into the flow

Using Linda’s anxieties and the three design bets above, I mapped how she would actually move through the product across the key stages of the therapy experience. These flows structured the product around behaviour and decision points, not screens.

The Constraint Shift: From Ideal to Complex (V1 - V2)

The initial flow (V1) relied on pre-personalised devices. This allowed a clean, low-friction path into therapy, with only minimal setup and identity confirmation.

However, this ideal scenario broke once we modelled the real deployment context. In reality, devices would ship unregistered and unpersonalised, requiring full account creation, profile setup and NHS verification inside the app. This created critical user risks: delayed gratification and a far higher chance of drop-off, severely jeopardising first-session success.

The Strategic Solution: Post-Value Registration (V3)

Working cross-functionally with engineering and clinical leads, I reframed NHS login from a product dependency into a verification layer.

Instead of full NHS integration, users were verified externally and allowed to proceed with therapy setup immediately. Account creation and data linking were postponed until after first-session value was delivered. This protected first-session confidence while still meeting governance and compliance requirements.
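
A simplified sketch of this sequencing, with hypothetical names and no real NHS integration: verification alone unlocks therapy setup, and account creation is only prompted once the first session has delivered value.

```typescript
// Illustrative sketch only (hypothetical flow, not the actual NHS integration):
// external verification is treated as a lightweight gate so therapy setup can
// begin immediately, while account creation and data linking are deferred
// until after the first session has delivered value.

type Stage = "verified" | "first_session_complete" | "account_linked";

interface PatientState {
  stage: Stage;
  verificationToken: string; // issued by the external verification step
}

// Verification alone is enough to begin the first session
function canStartTherapy(p: PatientState): boolean {
  return p.verificationToken.length > 0;
}

function nextPrompt(p: PatientState): string {
  switch (p.stage) {
    case "verified":
      return "Let's set up your sleeve and start your first session.";
    case "first_session_complete":
      // Only now do we ask for account creation and data linking
      return "Great first session. Create your account to keep your progress safe.";
    case "account_linked":
      return "Your sessions are now securely linked to your record.";
  }
}

const patient: PatientState = { stage: "verified", verificationToken: "abc123" };
console.log(canStartTherapy(patient), nextPrompt(patient));
```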

🚫

Why I avoided heavy feature-based onboarding

There was a push to introduce a detailed onboarding sequence explaining all features and system behaviours upfront. I challenged this strategy.

Our user, Linda, was already anxious about "doing it wrong." Too much upfront information increases hesitation and cognitive load, making her feel overwhelmed before she has even begun therapy. Instead, I deliberately avoided a long onboarding flow.

I designed confidence to be built in-context through progressive guidance, just-in-time instructions, and real-time feedback during her first session. This strategic approach achieved a rapid Time to Value (TTV) by prioritising immediate success for better adherence.

Designing for the patient’s physical and mental context

Rather than designing screens in isolation, I focused on how each moment would feel across the entire therapy journey. This is where usability, semantics, ergonomics and emotional reassurance converged.


A therapy session is not just a digital interaction. It happens in a physical, vulnerable and often cognitively loaded context. Linda is wearing a medical garment, managing pain, and using a device that she fears getting wrong. Designing for this meant optimising not just for usability, but for confidence and error prevention.

Core page layout and interaction

Designing for a medical device requires touch targets that are easy to use, especially for patients who may have reduced dexterity. My research found that a major barrier to therapy adherence is the difficulty users have in accurately selecting small targets. Adequate spacing between touch targets is also critical, as it prevents accidental taps on nearby buttons, making the interface reliable and trustworthy.
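
As an illustration, here is a tiny TypeScript sketch of how such constraints can be encoded as design-system constants; the 48dp minimum and 8dp gap follow common mobile accessibility guidance and stand in for the project's actual values.

```typescript
// Illustrative sketch only: encoding touch-target rules as design-system
// constants. The specific numbers are common accessibility guidance, not the
// project's final spec.

const TOUCH_TARGET_MIN = 48; // density-independent pixels; comfortable for reduced dexterity
const TOUCH_TARGET_GAP = 8;  // minimum spacing to prevent accidental taps on neighbours

interface ButtonSpec {
  label: string;
  width: number;  // dp
  height: number; // dp
}

function meetsTouchTargetRules(button: ButtonSpec, gapToNeighbour: number): boolean {
  const bigEnough = button.width >= TOUCH_TARGET_MIN && button.height >= TOUCH_TARGET_MIN;
  const spacedEnough = gapToNeighbour >= TOUCH_TARGET_GAP;
  return bigEnough && spacedEnough;
}

const pauseButton: ButtonSpec = { label: "Pause therapy", width: 88, height: 56 };
console.log(meetsTouchTargetRules(pauseButton, 12)); // true
```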

From design decisions to testable reality

These design decisions shaped the working version of the interface. The next step was to validate whether these ideas actually reduced hesitation, improved confidence, and made the therapy experience easier for Linda. This led into multiple moderated usability sessions and iterative refinement across multiple versions.

⚙️ 🤝

Device UI shift: From dedicated tablet to user’s own phone

In the initial phase, Rejoovinii was designed for a dedicated 7-inch tablet. During technical planning, this approach became unfeasible due to hardware cost, firmware reliability and long-term support risks, and the product pivoted to the user’s own mobile phone.

When this shift was made, the product was already mid-build and timelines were tight.
I focused on minimising disruption for both product and the engineering team. Rather than redesigning end-to-end, I identified high-impact pressure points where mobile constraints would increase cognitive load and technical risk, and restructured only those flows.
This helped stabilise development, reduced implementation complexity, and preserved first-session confidence for low-digital users.

Key decisions under test

In this phase, I focused on how the design held up under real use. These decisions were evaluated through moderated usability sessions, assessed against three criteria: setup hesitation, in-session confidence and clarity of post-session results.

I'm presenting selected critical decision points where user behaviour and design intent collided. Each example shows how the design evolved through iterations based on observed friction, confidence breakdowns, and task performance.

Simplifying results & trends visualisation to reduce misinterpretation

These iterations addressed Insight 3: motivation comes from seeing change, not just completing sessions. Early versions exposed raw data, but users struggled to interpret it. I shifted toward clearer progress summaries and comparisons that helped users quickly understand whether the therapy was working for them. This reduced doubt after each session and directly supported the core goal of turning first-session confidence into long-term adherence.

Impact

It increased patients’ confidence that the therapy was actually working, reducing doubt after sessions and supporting continued use.

Shifting from raw control to guided control to reduce settings anxiety

This iteration directly addressed Insight 2 by shifting from raw control to guided control. Instead of letting users guess or override blindly, I broke the flow into pain input followed by explained recommendations, making the system’s logic visible. This reduced anxiety around settings, prevented unsafe adjustments, and built trust in the device during the first session, which is essential for improving long-term adherence.

Impact

This design reduced settings-related anxiety in the first session and helped users trust the system instead of fighting it, increasing the likelihood they would return for subsequent sessions.

Reframing the post-session summary as a confidence and continuity moment

In the first version, the summary only confirmed that a session was completed. It showed the outcome, but didn’t help users understand what it meant or what to do next. That gap weakened both perceived control and motivation.

In the second version, I reframed the summary as a confidence and continuity moment, not a closing screen. By comparing today’s outcome with the previous session and inviting light contextual reflection through notes, the interface gives users both clarity on what happened and agency over what happens next. This directly supports Insight 2 by reducing uncertainty, and reinforces first-session success as a foundation for continued adherence rather than a one-off event.

Impact

This shift helped users move from “I finished” to “I understand what happened and what to do next”. By strengthening post-session clarity and control, we reduced early drop-off risk and reinforced the habit loop after the first session.

Results at a glance

Task completion: 9 of 11 participants completed the full path unaided, an 82% success rate.

The remaining two participants needed help with parameter adjustment or missed a setup instruction.

Key Takeaways

1. Trust mattered more than functionality.

The biggest learning was that adherence failures came from uncertainty, not missing features. Designing for confidence became as important as designing the flow itself.

2. Constraints clarified priorities.

Participant drop-outs, shifting from a tablet to a phone, and tight trial timelines forced essential prioritisation. These restrictions sharpened the product and prevented unnecessary complexity.

3. Safety and simplicity must coexist in medical UX.

Collaborating with clinicians showed that guardrails, not extensive customisation, create a safer and more reassuring experience for patients.