Case Study — Healthcare UX
Mental Health Platform
Designing for people in vulnerable moments requires a different kind of rigour — where what you withhold is as important as what you show.
01 — The Situation
The design of a mental health platform is the design of a conversation.
Action Talk is a platform that connects people experiencing mental health challenges with peer supporters and professional counsellors. The existing product had strong clinical intentions but a design that inadvertently created friction at the exact moments users needed to feel safe.
People in distress do not interact with digital products the same way as people in their baseline state. Cognitive load is higher. Tolerance for ambiguity is lower. The stakes of a confusing interface — an unclear call-to-action, a form that feels clinical rather than supportive, a crisis flow that buries the most important option — are genuinely serious.
This was not unfamiliar territory for me. I've spent years designing hospital environments — spaces where people are often frightened, disoriented, and under stress. The design principles for those environments translate directly: clear wayfinding, reduced cognitive load, and a visual language that communicates safety rather than bureaucracy.
02 — My Role
Research lead and interaction designer.
I was responsible for user research, journey mapping, interaction design, and the final high-fidelity prototype. I also conducted an accessibility audit against WCAG 2.1 criteria, with particular attention to the needs of users experiencing anxiety, cognitive overload, or situational impairments.
03 — Research & Discovery
What users need isn't always what they say they want.
Research for a mental health product requires care. I used a combination of secondary research (reviewing published studies on mental health app engagement and dropout rates), interviews with people who had used peer support platforms, and expert consultation with a counsellor who specialised in digital wellbeing services.
Key research findings were organised into themes — not features — because features are the answer to a question that the research phase should not yet be asking. The themes that emerged drove all subsequent design decisions.
Emotional journey map — from first awareness of a problem to ongoing support
04 — Design Principles
Before wireframes, a set of commitments.
Rather than moving directly into interface design, I established a set of design principles specific to this context. These acted as decision filters throughout the project — when two options were both technically feasible, the principles determined which to pursue.
05 — Key Decisions
Where the principles were tested.
Onboarding redesign — original 7-step registration vs. redesigned value-first sequence
06 — Solution Highlights
A platform that leads with warmth and follows with rigour.
The final design balances clinical reliability with emotional accessibility. The visual language is deliberately warm — not clinical. Typography is generous, not condensed. The colour palette avoids the cold blues and sharp contrasts common in healthcare products, instead using soft, warm neutrals with a single considered accent.
Core app screens — entry state and session flow
The accessibility audit identified 14 issues with the original product, 6 of which were critical (WCAG 2.1 AA failures). All were resolved in the redesign, including: insufficient colour contrast in the mood selector, missing focus states on interactive elements, and form fields that could not be operated by keyboard alone.
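The contrast failures were measured against the WCAG 2.1 definition of relative luminance and contrast ratio, which can be checked programmatically. A minimal sketch in Python — the hex values in the example are illustrative, not the product palette:

```python
def _linearise(channel: int) -> float:
    """Convert an 8-bit sRGB channel to linear light (WCAG 2.1 definition)."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_colour: str) -> float:
    """Relative luminance of a #rrggbb colour per WCAG 2.1."""
    r, g, b = (int(hex_colour.lstrip("#")[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linearise(r) + 0.7152 * _linearise(g) + 0.0722 * _linearise(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """Contrast ratio between two colours.

    WCAG 2.1 AA requires at least 4.5:1 for body text
    and 3:1 for large text.
    """
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black on white is the maximum possible ratio, 21:1.
print(round(contrast_ratio("#000000", "#ffffff"), 1))  # 21.0
```

Running a check like this over every text/background pair in the palette is how a failure such as the mood selector's contrast can be caught before visual QA.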
07 — Outcome
What the research and testing showed.
Usability testing with five participants across two rounds showed the redesigned onboarding reduced friction. Participants who tested the original flow averaged 4.2 minutes to complete registration; participants testing the redesigned flow averaged 2.1 minutes and reported higher confidence that they understood what they were signing up for.
The most significant qualitative finding: all five test participants in the redesigned flow spontaneously mentioned that the platform felt "trustworthy" or "safe" during the debrief. None of the participants who tested the original flow used these words.