SudoStudy physics dashboard on an iPad

Redesigning SudoStudy, an AI-powered ed-tech platform, to help students study smarter

Role:
Senior Product Designer
Timeframe:
6 months (then ongoing product consulting)
Business goals:
Lift WAU, increase sessions per user, and decrease first-session drop-off

Business impact:

+70.9%

WAU
(from 1,280 to 2,187)

+27%

Sessions per user (practice mode usage)

-31%

In first-session drop-off


Tyler opens SudoStudy after class. So many subjects to choose from.

None answer the only question he cares about:
what should I study next?

He clicks a few menus.
Closes the tab.
Plans to “study later.”
I saw versions of this across interviews:
I struggle to stay consistent with my schedule… sometimes I don’t even have one… it’s hard to figure out which topics I should focus on
- Pugh, 16 years old, O Levels
I get overwhelmed by all the topics and menus… I often skip sessions…
I don’t know where to start next.
- Cole, 17 years old, A Levels
I am a bit lenient when self-assessing; I prefer external feedback to gauge progress… a delay in feedback makes me lose interest
- Grace, 16 years old, O Levels
User interview snapshots
Grid of user interview video call snapshots
This became the brief:
make the next step obvious and rewarding - fast.
Research synthesis
Research synthesis whiteboard with sticky notes and insights

What I walked into

  • No onboarding. New users hit a combined auth modal: sign up, sign in, magic link, password, everything at once. It sometimes failed. Rage-clicks were common. People bounced.
  • Practice mode was cluttered. Competing CTAs (“Give feedback,” “Buy subscription”). Redundant choices. No clear exit. Multi-select topics that didn’t match how students actually study.
  • Home didn’t guide the next session. Students couldn’t see strengths, gaps, or coverage. They felt lost between subjects and topics.
  • Desktop first, but mobile behavior mattered. The UI didn’t respect quick, focused sessions.

The bet:
If we reduce choice, surface the next best action, and fix first-run friction, students will activate faster and come back more often.

My role and how we worked

I led end-to-end design across research, IA, UI, prototyping, and developer handoff. I joined sprint planning, wrote user stories with acceptance criteria, and partnered tightly with the founder, PMs, and a small dev team. We shipped in two-week increments, with clear “measure next” hooks in each release. Marketing later used the new visuals in campaigns and on the website.

Evidence that shaped the plan

Session replays & funnels (PostHog):

  • Confusing auth modal → immediate drop-offs after sign-up/sign-in
  • Unclear navigation in practice and quizzes
  • “Too much, too soon” options causing paralysis
  • UI elements (buttons, instructional cues) needed to be more explicit

Interviews (20 students):

  • Students prefer one topic per session.
  • They want external feedback and quick wins.
  • They need a sense of progress (how am I doing?) and coverage (what have I touched?).

These insights drove three product bets:

Fix first-run friction - authentication and onboarding

Problems

New users landed in a catch-all modal that mixed sign up, sign in, magic link, and password flows. It was unreliable and unclear. Many bounced before seeing value.

Decisions

  • Separated authentication paths and hardened error states.
  • Introduced a short onboarding that sets intent and momentum: pick subject(s), set a starting point, get a clear “Start practicing” CTA.
  • Wrote crisp, action-first copy.

Trade-offs

We kept onboarding short to ship fast and measure impact. Deeper personalization moved later.

Results

Immediate drop-offs fell 31%. More students reached their first real practice session.

Old ‘Getting Started’ modal
Old SudoStudy ‘Getting Started’ modal
New ‘Getting Started’ modal
New SudoStudy ‘Getting Started’ modal
Old onboarding user flow
Old SudoStudy onboarding user flow diagram
New onboarding user flow
New SudoStudy onboarding user flow diagram
Video walkthrough of onboarding design prototype

Make “what’s next” obvious - the Home reframe

Problems

Home didn’t guide the next action. Students couldn’t see progress or gaps.

Decisions

  • Added Health Score (how often you answer correctly) and Coverage Score (how much of a topic you’ve touched).
  • Surfaced Recommended Topics using those two signals.
  • Added a daily streak to encourage frequent, shorter sessions.
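To make the two signals concrete, here is a toy sketch of how Health and Coverage could combine into a topic ranking. This is illustrative only: the weights, the `TopicStats` shape, and all names are my assumptions, not SudoStudy's actual recommendation logic.

```python
from dataclasses import dataclass

@dataclass
class TopicStats:
    name: str
    health: float    # 0-1: share of recent answers that were correct
    coverage: float  # 0-1: share of the topic the student has touched

def recommend(topics: list[TopicStats], k: int = 3) -> list[str]:
    """Surface weak, under-covered topics first.

    Lower health and lower coverage both push a topic up the list;
    the 0.6/0.4 weighting is an arbitrary choice for illustration.
    """
    ranked = sorted(topics, key=lambda t: t.health * 0.6 + t.coverage * 0.4)
    return [t.name for t in ranked[:k]]

stats = [
    TopicStats("Kinematics", health=0.9, coverage=0.8),
    TopicStats("Waves", health=0.4, coverage=0.2),
    TopicStats("Electricity", health=0.6, coverage=0.5),
]
print(recommend(stats, k=2))  # the two weakest topics, weakest first
```

The point of the sketch is the product decision, not the math: a single, explainable score per topic lets the dashboard answer "what should I study next?" with one recommendation instead of a menu of choices.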

Trade-offs

We deferred advanced recommendation tuning to focus on activation and return usage first.

Results

Students had a clear next step, and weekly users rose from 1,280 to 2,187.

Old ‘Home Dashboard’
Old SudoStudy home dashboard
New ‘Home Dashboard’
New SudoStudy home dashboard

Reduce cognitive load - Practice mode, before & after

Problems

  • Competing CTAs distracted from the task.
  • No obvious exit from a session.
  • Multi-select topics didn’t match real study behavior.
  • Redundant options added friction before the first question.

Decisions

  • Added a clear ‘Exit practice’ control to restore a sense of safety.
  • Put difficulty and topic choice first, in a simple left-to-right flow.
  • Switched topic selection to single-select (radio) based on interviews: one topic per session.
  • Used progressive disclosure. Secondary actions live behind an options menu.
  • Kept the question area clean: answer → feedback → next.

Trade-offs

Single-select limits “batching,” but it aligns with how students actually work and reduces choice paralysis. We bookmarked multi-topic practice as a future advanced mode.

Results

Students spent less time deciding and more time practicing, and practice per user rose 27%.

Old ‘Practice Mode’
Old SudoStudy practice mode screen
New ‘Practice Mode’
New SudoStudy practice mode screen
Old ‘Subject Dashboard’
Old SudoStudy subject dashboard
New ‘Subject Dashboard’
New SudoStudy subject dashboard

What didn’t work (and what I learned)

The initial subject empty state CTA under-performed until I swapped a descriptive card for a clear primary button. Simple beats clever when users are anxious to start.

Old ‘Enter practice mode’ card design
New ‘Start practicing’ card design

Outcomes and measurement

+70.9%

WAU
(from 1,280 to 2,187)

+27%

Sessions per user (practice mode usage)

-31%

In first-session drop-off

How I measured

  • Post-release window: ~2–4 weeks, all active users.
  • Baseline: the prior 2–4 weeks.
  • No major promos or unrelated product changes during this period.

How I work

  • Translate user signal into product bets with clear guardrails.
  • Ship in small, measurable slices; instrument each release to answer the next question.
  • Keep copy, layout, and controls brutally simple, especially for first-run flows.
  • Treat design as a team sport: I co-planned sprints and wrote user stories with acceptance criteria; handoffs were detailed and predictable.

🤔 Reflections

This project reminded me that great UX isn’t just about cleaner screens; it’s about building systems that support users end-to-end.

Grounding every decision in research and real behavior helped turn scattered features into a more focused, usable learning experience.

Even small friction points can pile up, but thoughtful design rooted in user needs makes a measurable difference.
