Prototype Check
Let's make sure everyone's ready to test
- Open your AI prototype on your phone or laptop
- Walk through your core flow once — does it work?
- Note any bugs or broken states you need to warn testers about
- If your prototype isn't ready, pair with someone whose prototype is
Prayer
Lucy W., will you pray for us today?
Testing answers two questions
"Can they use it?" and "Should we build it?"
Today
Writing Good Tasks & Questions
Test Planning + In-Class Testing
By the end of today...
Test Types
You'll understand the difference between usability testing and solution validation
Tasks & Questions
You'll write test tasks and questions that reveal truth
Facilitation
You'll learn to run a test without leading the participant
Practice
You'll test classmates' prototypes and have yours tested
Why this matters
You have a working prototype — but you don't know if it works.
Most designers only test usability: "Can users tap the right buttons?"
That's necessary but not sufficient. You also need to test solution fit.
Testing validates both
Testing is how you check your work:
Forming Intent
Did you understand the right problem?
→ Validation testing
Rendering Intent
Did you communicate your solution clearly?
→ Usability testing
Two Types of Testing
Both are necessary — neither is sufficient alone
Every test should answer two questions
| Question | Type | What you learn |
| --- | --- | --- |
| "Can they use it?" | Usability | Can users accomplish tasks? Where do they get stuck? |
| "Should we build it?" | Validation | Does this solve their problem? Would they use it? |
Why you need both
- A usable product that solves the wrong problem = failure
- A great solution that's unusable = failure
- You need signal on BOTH to succeed
You can't A/B test your way to product-market fit. And you can't validate your way to a usable product.
Different questions for different goals
Usability Questions
"Show me how you would..."
"What do you expect to happen?"
"What are you looking for?"
Validation Questions
"How does this compare to what you do now?"
"Would this change your behavior?"
"Tell me about a time you had this problem"
Quick exercise: Which type?
Categorize each question — usability or validation?
- "Walk me through logging your first workout"
- "How do you currently track your workouts?"
- "What would you tap to see your progress?"
- "Would you use this instead of what you do now? Why or why not?"
- "What do you expect that button to do?"
Writing Good Tasks & Questions
The quality of your test depends on this
What makes a good task?
| Principle | Bad | Good |
| --- | --- | --- |
| Realistic | "Use the app" | "You want to track a 20-min run you just completed" |
| Specific | "Try the features" | "Log your breakfast from this morning" |
| No hints | "Tap the + button to add" | "Log your workout" |
Tasks are based on actual user goals — not feature demonstrations.
What makes a good question?
| Principle | Bad | Good |
| --- | --- | --- |
| Open-ended | "Would you tap this button?" | "What would you do next?" |
| Past behavior | "Would you use this?" | "How do you currently handle this?" |
| Non-leading | "Don't you think this is easier?" | "What do you think of this?" |
The Mom Test applies here too
People will be nice. They'll say they'd use it.
Your job is to design questions that reveal truth, not confirm your hopes.
You can test with anyone — friends, roommates, family — if you ask the right questions.
"Would you use this?" → People say yes to be nice
"How did you handle this last time?" → Reveals actual behavior
Mom Test style questions for prototypes
Ask about past behavior and current reality — not future intentions
- "How do you currently handle [this problem]?" → reveals if problem is real
- "What happened the last time you tried to [task]?" → reveals pain points
- "What would you do if this app didn't exist?" → reveals alternatives
- "Who else has this problem?" → reveals if they're representative
Let's rewrite some bad questions
Turn these into good questions:
- "Don't you think this workout tracker is easy to use?"
- "Would you pay $5/month for this app?"
- "Do you like the color scheme?"
Hint: What's leading? What's hypothetical? How can you ground it in real behavior?
The 5 Act Method
Structure for a productive test session
The 5 Act Method
- Friendly welcome (1 min) — Make them comfortable, explain think-aloud
- Context questions (3 min) — Current behavior, relevant habits
- Introduce prototype (1 min) — "This is early. Some things won't work."
- Tasks (10-15 min) — Give tasks one at a time. Observe. Don't help.
- Debrief (5 min) — Overall impressions, comparison to current solutions
Act 1: Friendly Welcome (1 min)
- Make them comfortable — you want honest feedback
- Explain think-aloud: "Tell me what you're thinking as you use this"
- Reassure them: "We're testing the design, not you"
- Permission: "It's totally fine to be confused or critical"
Act 2: Context Questions (3 min)
Before showing the prototype, understand their world:
- "How do you currently [relevant behavior]?"
- "Tell me about the last time you [relevant situation]"
- "What tools do you use for [relevant task]?"
This sets up your validation questions later.
Act 3: Introduce the Prototype (1 min)
Set expectations:
- "This is an early prototype"
- "Some things won't work — that's okay"
- "Think out loud as you explore"
Don't explain what it does. Let them discover.
Act 4: Tasks (10-15 min)
- Give tasks one at a time
- Observe — don't interrupt
- Note where they hesitate, tap wrong things, look confused
- When they get stuck, ask "What are you looking for?" — don't rescue them
This is where most usability insights come from.
Act 5: Debrief (5 min)
Now ask validation questions:
- "How does this compare to what you do now?"
- "What would make you switch to this?"
- "What's missing that you'd need?"
- "Would you recommend this to someone like you?"
This is where solution fit insights come from.
Facilitation
Your job: observe and learn — not explain or defend
Four principles of good facilitation
- Silence is your friend — Let them struggle. That's data.
- Don't explain — If they're confused, note it. Don't rescue them.
- Don't defend — If they criticize, say "Tell me more." Not "Well, actually..."
- Ask "Why?" — When they do something unexpected, ask what they were thinking.
Common facilitation mistakes
| Mistake | What it sounds like | Why it's bad |
| --- | --- | --- |
| Helping too quickly | "Oh, you tap here" | You don't learn where UI fails |
| Leading questions | "You found that easy, right?" | You get false confirmation |
| Defending choices | "That's because we wanted to..." | You stop learning |
| Hypothetical questions | "Would you use this?" | People say yes to be nice |
D5: Usability Testing Round 1
Due Monday, Mar 16 @ 5:15pm
Test at least 5 people outside of class
Test anyone — friends, roommates, family, strangers. The skill is asking good questions.
Your test should include:
- 4-5 task-based questions (realistic scenarios)
- 1+ follow-up question per task (dig into confusion)
- 4-5 Mom Test style questions (past behavior, not hypotheticals)
D5: What to Document
For each of your 5 tests:
- Who they are (relationship to you, relevant background)
- What happened (observations, not interpretations)
- Key quotes
- Usability issues discovered
- Solution validation signals
D5: Synthesis
After your 5 tests, synthesize:
- Patterns across tests
- Top 3 usability issues to fix
- Top 3 solution fit insights
- Question reflection: Which questions got honest answers? Which invited polite non-answers?
- What you'll change in next iteration
Submit: PDF (2 pages — key insights + raw data)
Test Planning Workshop (20 min)
First 10 min: Write your test plan
• 4-5 realistic task-based questions
• A follow-up question for each task
• 4-5 Mom Test style questions
• What you're specifically looking for
Next 10 min: Peer review — your partner looks for:
• Leading questions
• Tasks that give hints
• Hypotheticals that should be past behavior
In-Class Testing (100 min)
Three rounds of testing (~30 min each):
- Round 1: You test a classmate's prototype (you facilitate)
- Round 2: A classmate tests yours (you observe silently)
- Round 3: Switch partners, repeat
Prototype owner: no helping! Take notes.
Testing ground rules
For facilitators
• Run the 5 Act Method
• Don't explain or rescue
• Take notes
For prototype owners
• Observe silently — no helping!
• Take notes on issues
• Save questions for debrief
What to document during testing
| Capture this | Not this |
| --- | --- |
| "User tapped menu icon 3 times" | "User was confused" |
| "User said 'Where do I go?'" | "UI was unclear" |
| "User looked for search, couldn't find it" | "Search is broken" |
Observations, not interpretations. Quotes, not summaries.
After each round, reflect
- What surprised you?
- What usability issues emerged?
- What did you learn about solution fit?
- What will you change?
Next up
Today
- 20 min: Test Planning Workshop
- 100 min: In-Class Testing — 3 rounds
- 10 min: Reflection
D5: Usability Testing Round 1 — due Mon, Mar 16 @ 5:15pm
Test 5 people (anyone!), 4-5 tasks + follow-ups + 4-5 Mom Test questions, synthesize patterns