Stop calling us — we'll keep you posted

Varsity Tutors (Nerdy Inc.) · Product Designer · 2024

UX redesign

Interaction design

Wizard flow

Progress tracker

Varsity Tutors had trained its customers to call

When members wanted to request a new tutor, most of them picked up the phone. Not because they preferred it, but because that's what the platform had taught them to do. A self-service request flow existed, but it was hard to find, visually outdated, poorly adapted for mobile, and presented every question at once in a single overwhelming page that felt more like a survey than a guided experience.


Fewer than 45% of users who found it and started it actually finished it. The rest called.


Around 15% of all inbound contacts to our support team were tutor requests, and a further ~10% of support tickets were either directly related to the request flow or from parents simply asking where their request stood. This wasn't just a UX problem; it was a behavioral one. The goal wasn't only to redesign a flow — it was to redesign a habit.

No time for new research — but we didn't need it

Given the potential operational impact, the team wanted to move fast. Rather than running a new research cycle, I audited the studies and continuous research the team had already conducted, and combined that with direct input from the Customer Success and Customer Experience teams, who were hearing from frustrated users every day.


The evidence was clear: users were overwhelmed by the volume of questions presented at once, uncertain which fields actually mattered, and had no feedback on what happened after they submitted. That was enough to design from.

Early explorations before landing on the final flow

More screens don't equal worse UX

The core design decision was moving from a single-page form to a 5-step wizard flow. Stakeholders pushed back. More clicks meant a longer journey, they argued, and a more tedious experience. I disagreed. A single page with ten questions creates cognitive overload and a sense of never-ending commitment. Five focused screens, each asking only what's needed at that moment, feel faster — even if the total number of interactions is similar. The data backed it up, but the argument had to be won on principle first.


The redesign made several other deliberate changes:


  • One question per step — with the final step grouping two naturally related fields, keeping the experience focused throughout while still capturing useful information.


  • Clear distinction between required and optional fields — users always knew what was essential and what was supplementary, reducing decision fatigue.


  • Smart pre-population — if a user arrived from a specific subject page, relevant fields were pre-filled automatically, removing friction at the very start.


  • A visible progress indicator — so users always knew where they were and how close they were to finishing.


  • Full mobile parity — the old flow was essentially desktop-only. The new wizard was designed in parallel for both platforms from day one, not adapted after the fact.
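The structure above can be sketched as a simple step configuration. This is a hypothetical illustration, not the actual Varsity Tutors schema — the step names, fields, and helper functions are all assumptions made for the sake of the example:

```typescript
// Hypothetical 5-step wizard: one question per step, with the final
// step grouping two naturally related fields. Field names are illustrative.
interface WizardStep {
  id: string;
  required: string[]; // fields the user must answer on this step
  optional: string[]; // clearly marked supplementary fields
}

const steps: WizardStep[] = [
  { id: "subject",      required: ["subject"],        optional: [] },
  { id: "grade-level",  required: ["gradeLevel"],     optional: [] },
  { id: "goals",        required: ["primaryGoal"],    optional: [] },
  { id: "availability", required: ["preferredTimes"], optional: [] },
  // Final step groups two naturally related fields: one required, one optional.
  { id: "details",      required: ["startDate"],      optional: ["notes"] },
];

// Smart pre-population: if the user arrived from a subject page,
// the subject field is pre-filled automatically.
function prefill(entrySubject?: string): Record<string, string> {
  return entrySubject ? { subject: entrySubject } : {};
}

// Visible progress indicator: users always know where they are.
function progress(stepIndex: number): string {
  return `Step ${stepIndex + 1} of ${steps.length}`;
}
```

The point of the shape, rather than the specifics: each screen carries only the fields it needs, required and optional are distinguished in data rather than by convention, and progress is derivable at every step.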

Solving the anxiety of waiting

Submitting the request was only half the problem. The matching process itself could take up to 48 hours — not because it was slow, but because it was genuinely complex. Our algorithm was finding the best tutor based on rating, educational background, and alignment with the student's specific problem areas, while also matching schedules and waiting for the tutor to accept the offer. If they didn't, the offer moved to the next best match.


Parents didn't know any of that. So they called to ask.


I designed a tutor match tracker to sit at the end of the new request flow and follow the family through the entire matching window. The concept was grounded in two of Nielsen's usability heuristics: Visibility of System Status and Match Between the System and the Real World — using the mental model of delivery and food tracking, experiences people already interact with and instinctively understand.


The tracker showed parents exactly where their request stood, with email updates at each stage. It was deliberately kept visible on the homepage and other key surfaces until the student completed their first session — not just until the tutor was assigned. That decision was intentional: the first session is the most critical moment for a new student's engagement, and the tracker gave parents a natural reason to stay connected to the platform through that crucial first week.
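The tracker's lifecycle can be sketched as a small state machine. The state names below are assumptions modeled on the delivery-tracking mental model described above, not the production implementation:

```typescript
// Hypothetical tracker states, in the order a family would see them.
type MatchStatus =
  | "request-received"
  | "finding-tutor"          // algorithm ranks candidates
  | "awaiting-tutor"         // offer sent, waiting for the tutor to accept
  | "tutor-matched"
  | "first-session-scheduled"
  | "first-session-complete"; // tracker is dismissed only after this

const order: MatchStatus[] = [
  "request-received",
  "finding-tutor",
  "awaiting-tutor",
  "tutor-matched",
  "first-session-scheduled",
  "first-session-complete",
];

// How many stages the request has passed, for a progress display.
function stagesComplete(status: MatchStatus): number {
  return order.indexOf(status);
}

// If a tutor declines, the request returns to matching rather than failing —
// the offer simply moves to the next best match.
function onTutorDeclined(status: MatchStatus): MatchStatus {
  return status === "awaiting-tutor" ? "finding-tutor" : status;
}

// The tracker stays visible on the homepage until the first session
// is complete — not merely until a tutor is assigned.
function trackerVisible(status: MatchStatus): boolean {
  return status !== "first-session-complete";
}
```

The design decision is encoded in `trackerVisible`: dismissal is tied to the first completed session, which keeps parents connected to the platform through the most critical week.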

A support team that could finally breathe

The new request flow shipped in June 2024. The tracker followed in August.

45% → 70%

Request flow completion rate

A more than 55% relative increase in the number of parents successfully submitting a tutor request on their own, without calling.

40%+

Reduction in support tickets related to tutor requests

Well above our projected target. The impact was immediate and visible across the support team.

The tutor match tracker became one of the most celebrated features of the year internally. Parents were no longer in the dark during the matching window, and it showed in the sharp drop in status-check calls.


A variation of the redesigned flow was also subsequently launched for tutor replacement requests, extending the same UX improvements to the other major use case.

Simplicity is about cognitive load, not click count

Stakeholders often equate simplicity with fewer steps. But simplicity is about cognitive load, not click count. The most important design decision I made on this project wasn't visual — it was structural. And making that case clearly, early, was what allowed everything else to follow.
