Case Study — Healthcare UX

Mental Health Platform

Designing for people in vulnerable moments requires a different kind of rigour — where what you withhold is as important as what you show.

My role: UX/UI Designer
Year: 2024
Client: Action Talk
Platform: Mobile App
Deliverables: Research · User flows · Hi-fi Prototype · Accessibility audit
Mental health platform UI screens

The design of a mental health platform is the design of a conversation.

Action Talk is a platform that connects people experiencing mental health challenges with peer supporters and professional counsellors. The existing product had strong clinical intentions but a design that inadvertently created friction at the exact moments users needed to feel safe.

People in distress do not interact with digital products the same way as people in their baseline state. Cognitive load is higher. Tolerance for ambiguity is lower. The stakes of a confusing interface — an unclear call-to-action, a form that feels clinical rather than supportive, a crisis flow that buries the most important option — are genuinely serious.

This was not unfamiliar territory for me. I've spent years designing hospital environments — spaces where people are often frightened, disoriented, and under stress. The design principles for those environments translate directly: clear wayfinding, reduced cognitive load, and a visual language that communicates safety rather than bureaucracy.

Research lead and interaction designer.

I was responsible for user research, journey mapping, interaction design, and the final hi-fi prototype. I also conducted an accessibility audit against WCAG 2.1 criteria, with particular attention to the needs of users experiencing anxiety, cognitive overload, or situational impairments.

What users need isn't always what they say they want.

Research for a mental health product requires care. I used a combination of secondary research (reviewing published studies on mental health app engagement and dropout rates), interviews with people who had used peer support platforms, and expert consultation with a counsellor who specialised in digital wellbeing services.

"The moment of seeking help is not the same as the moment of readiness to engage. The design has to bridge that gap."

Key research findings were organised into themes — not features — because features are the answer to a question that the research phase should not yet be asking. The themes that emerged drove all subsequent design decisions.

User emotional journey map — awareness through to ongoing support engagement

↑ 60%
Dropout at onboarding
Research showed the majority of mental health app abandonments occur during registration — not from lack of motivation, but from friction.
3×
Re-engagement
Users who felt "seen" in the first session were three times more likely to return, regardless of outcome quality.
#1
Fear of judgement
The primary barrier to seeking help was not cost or availability — it was fear of how the information would be used and who would see it.

Before wireframes, a set of commitments.

Rather than moving directly into interface design, I established a set of design principles specific to this context. These acted as decision filters throughout the project — when two options were both technically feasible, the principles determined which to pursue.

Presence before process
The first interaction a user has with the platform should feel human, not administrative. Avoid clinical form language. The onboarding sequence should feel like a conversation, not a registration.
One thing at a time
Never ask users to process more than one piece of information per screen. Each step should have a single clear purpose and a clear next action. Ambiguity is not neutral — it generates anxiety.
Consent at every step
Users should always know what they're agreeing to and retain the ability to pause, go back, or exit without penalty. No design should create the feeling of being "trapped" in a flow.
Safety surfaced, not buried
Crisis resources and the option to speak to a human should be accessible from anywhere in the product — never more than one tap away. Safety cannot be a feature buried in a settings menu.
WCAG is a floor, not a ceiling
Technical accessibility compliance is required, not celebrated. Cognitive accessibility — clear language, reduced load, predictable patterns — goes beyond what WCAG measures and is equally critical in this context.
Trust through transparency
Privacy language should be plain, specific, and prominent — not buried in a terms page. Users should always know who sees their data and for what purpose.

Where the principles were tested.

Decision 1 — Resequencing the onboarding flow
The original onboarding asked users to create an account (name, email, password) before they had any sense of whether the platform was right for them. I redesigned the sequence: show what the platform does and how it protects privacy first, invite users to explore a session structure, then ask for account creation only when the user has signalled intent to continue.
Why: Friction at the moment of account creation is the primary driver of dropout. By moving account creation to after the user has experienced value, we reduce the number of people who leave before they've had a chance to benefit.
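The resequenced flow can be thought of as a gated step sequence: account creation is unreachable until the user has signalled intent. The sketch below is illustrative only — the step names and the intent signal are my assumptions, not the shipped implementation:

```python
from enum import Enum, auto

class OnboardingStep(Enum):
    VALUE_OVERVIEW = auto()    # what the platform does, how privacy is protected
    EXPLORE_SESSION = auto()   # sample session structure, no account required
    ACCOUNT_CREATION = auto()  # requested only after intent is signalled

# Redesigned order: value first, exploration second, registration last.
REDESIGNED_FLOW = [
    OnboardingStep.VALUE_OVERVIEW,
    OnboardingStep.EXPLORE_SESSION,
    OnboardingStep.ACCOUNT_CREATION,
]

def next_step(current, signalled_intent):
    """Advance the flow; account creation is gated on an explicit intent signal."""
    i = REDESIGNED_FLOW.index(current)
    upcoming = REDESIGNED_FLOW[min(i + 1, len(REDESIGNED_FLOW) - 1)]
    if upcoming is OnboardingStep.ACCOUNT_CREATION and not signalled_intent:
        return current  # user may keep exploring; registration is never forced
    return upcoming
```

The gate makes the principle testable: no path reaches registration before the user opts in.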
Decision 2 — Persistent, non-intrusive crisis access
I designed a persistent "need immediate help" element that appears in the navigation structure on every screen — not as an alert or a modal, but as a quiet, always-available option. It does not interrupt; it simply exists. Tapping it surfaces crisis resources and a direct connection to a human, bypassing all other flows.
Why: Crisis moments are unpredictable. A user might be browsing session notes and suddenly need immediate help. Requiring them to navigate to a specific page creates a barrier that, in a genuine crisis, may not be overcome.
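The "never more than one tap away" rule is easy to regress as new screens are added, so it is worth enforcing as an automated check. A minimal sketch, using a hypothetical screen registry (all screen and action names here are illustrative):

```python
# Hypothetical registry of screens and the actions each exposes directly.
SCREENS = {
    "home":          {"actions": ["mood_check_in", "crisis_help"]},
    "session_notes": {"actions": ["edit_note", "crisis_help"]},
    "booking":       {"actions": ["confirm_slot", "crisis_help"]},
    "settings":      {"actions": ["privacy", "crisis_help"]},
}

def crisis_help_is_one_tap(screens):
    """True only if every screen surfaces the crisis action directly (one tap)."""
    return all("crisis_help" in s["actions"] for s in screens.values())
```

Run as part of a design-system or UI test suite, a check like this turns the safety principle into a guarantee rather than a convention.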
Decision 3 — Mood check-in as entry point, not feature
The original design treated the mood check-in as one of several optional features. I repositioned it as the primary entry point for returning users — the first thing they encounter each session. The check-in data personalises what the platform surfaces next, but more importantly, it signals that the platform is paying attention.
Why: Feeling seen is a stronger driver of re-engagement than content quality. A 10-second check-in that changes what the user sees next communicates that the platform responds to them as an individual, not as a generic user.
Onboarding flow comparison — original 7-step registration vs value-first redesign

A platform that leads with warmth and follows with rigour.

The final design balances clinical reliability with emotional accessibility. The visual language is deliberately warm — not clinical. Typography is generous, not condensed. The colour palette avoids the cold blues and sharp contrasts common in healthcare products, using instead soft warm neutrals with a single considered accent.

Home screen — mood check-in entry state
Session booking flow — therapist matching and appointment

Core app screens — entry state and session flow

The accessibility audit identified 14 issues with the original product, 6 of which were critical (WCAG 2.1 AA failures). All were resolved in the redesign, including: insufficient colour contrast in the mood selector, missing focus states on interactive elements, and form fields that could not be operated by keyboard alone.
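Contrast failures like the mood selector's can be caught programmatically. The following is a minimal sketch of the WCAG 2.1 relative-luminance and contrast-ratio formulas; the thresholds come from the specification, while the function names are my own:

```python
def _channel(c8):
    """Linearise one 8-bit sRGB channel per the WCAG 2.1 luminance formula."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_colour):
    """Relative luminance of a '#rrggbb' colour, in the range 0.0 to 1.0."""
    h = hex_colour.lstrip("#")
    r, g, b = (int(h[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _channel(r) + 0.7152 * _channel(g) + 0.0722 * _channel(b)

def contrast_ratio(fg, bg):
    """Contrast ratio between two colours: (L_lighter + 0.05) / (L_darker + 0.05)."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

def passes_aa(fg, bg, large_text=False):
    """WCAG 2.1 AA minimum contrast: 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)
```

For example, `#767676` on white just clears the 4.5:1 AA threshold while `#777777` does not — exactly the kind of near-miss an automated check catches and a visual review does not.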

What the research and testing showed.

Usability testing with five participants across two rounds confirmed the redesigned onboarding significantly reduced friction. Participants who tested the original flow averaged 4.2 minutes to complete registration; participants testing the redesigned flow averaged 2.1 minutes and reported higher confidence that they understood what they were signing up for.

50%
Reduction in onboarding time
6→0
Critical WCAG failures resolved
100%
Test participants found crisis help within 10 seconds

The most significant qualitative finding: all five test participants in the redesigned flow spontaneously mentioned that the platform felt "trustworthy" or "safe" during the debrief. None of the participants who tested the original flow used these words.
