UserTesting for Designers: Video-Based Remote User Research Platform
Remote user research platform that delivers video recordings of real people testing your designs and prototypes
UserTesting is a remote research platform that records videos of real people using your website, app, or prototype while thinking aloud. Unlike analytics that show what users do, UserTesting shows why they do it, capturing facial expressions, tone of voice, and moments of confusion that numbers can’t reveal.
Key Specs
| Spec | Details |
|---|---|
| Price | Custom pricing; typically $20,000-30,000/year |
| Platform | Web-based platform; tests on desktop, mobile, tablet |
| Best for | Video feedback, moderated interviews, large-scale studies |
| Learning curve | 1-2 hours to launch a first test; days to master study design |
How Designers Use UserTesting
UserTesting fits into different stages of the design process. Here’s when designers turn to video-based user research.
For Prototype Validation
Add a Figma prototype link, an InVision mockup, or a staging URL. Write 3-5 tasks like “Find and purchase a red sweater in size medium.” Select your target audience (e.g., “Women aged 25-40 who shop online monthly”). Launch the test and wait 1-2 hours. Watch videos of participants attempting your tasks, noting where they hesitate, click the wrong thing, or express frustration. Use these insights to revise before developers write code.
For Competitive Analysis
Create a test that asks participants to complete the same task on your product and a competitor’s. Watch how users compare the experiences in their own words. Designers use this to identify features worth copying and pain points to avoid. It’s more honest than a focus group because people react in real time, not based on what they think you want to hear.
For Messaging and Content Testing
Test landing pages, onboarding flows, or marketing copy before launch. Ask participants “What do you think this product does?” or “Who is this for?” Their answers reveal whether your message lands. Designers use this to catch confusing headlines, unclear value propositions, or jargon that alienates non-experts.
For Accessibility Testing
Request participants with specific accessibility needs (screen reader users, low vision, motor impairments). Watch how they navigate your interface with assistive technology. This reveals issues automated checkers miss, like poor heading structure or confusing focus order. UserTesting’s diverse panel makes it easier to find participants than recruiting independently.
UserTesting vs. Alternatives
How does UserTesting compare to other user research platforms?
| Feature | UserTesting | Maze | Lyssna |
|---|---|---|---|
| Video feedback | ✅ Core feature | ❌ No | ❌ No |
| Participant panel size | ✅ Millions | ✅ 3M+ | ⚠️ 530K+ |
| Moderated testing | ✅ Yes | ❌ No | ❌ No |
| Prototype integrations | ⚠️ Adobe XD plugin only; Figma via shared link | ✅ Figma, Sketch | ✅ Figma, Sketch, more |
| Quantitative testing | ⚠️ Basic | ✅ Strong | ✅ Very strong |
| Pricing transparency | ❌ Contact sales | ✅ $99+/month | ✅ $75+/month |
| Best for small teams | ❌ Too expensive | ✅ Yes | ✅ Yes |
Choose UserTesting if: You need rich video feedback, want to see facial reactions and hear tone of voice, or need both moderated and unmoderated sessions for complex research.
Choose Maze if: You test Figma prototypes frequently, want quantitative metrics (completion rates, misclick rates), or have a limited budget under $10,000/year.
Choose Lyssna if: You want the most affordable option with strong quantitative methods (preference tests, card sorting, tree testing) and don’t need video feedback.
Getting Started with UserTesting
A quick start to running your first unmoderated test:
Step 1: Create a test plan
Click “Create Test” and choose an unmoderated usability test. Add your prototype or website URL. Write 3-5 tasks that represent realistic user goals. Start each task with “Imagine you want to [goal]. Please attempt to do this now.” Keep tasks open-ended so participants think aloud naturally rather than following step-by-step instructions.
Step 2: Define your audience
Use the demographic filters to select age, location, income, and device type. Add screening questions to narrow further (e.g., “Do you use project management software weekly?”). Start with 5 participants for small tests, 10-15 for broader validation. More participants don’t always equal better insights; 5 users often reveal 80% of usability issues.
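Why five? The rule of thumb traces to the Nielsen/Landauer discovery model: if each participant independently surfaces a given issue with probability p (Nielsen estimated p ≈ 0.31 averaged across studies), then n participants surface about 1 − (1 − p)^n of all issues. A minimal sketch of that arithmetic, assuming the classic p value holds for your product:

```python
def issues_found(n_participants: int, p_detect: float = 0.31) -> float:
    """Expected share of usability issues surfaced by n participants,
    per the Nielsen/Landauer model: 1 - (1 - p)^n."""
    return 1 - (1 - p_detect) ** n_participants

for n in (1, 3, 5, 10, 15):
    print(f"{n:>2} participants -> ~{issues_found(n):.0%} of issues")
# 5 participants reach ~84%; doubling to 10 adds only ~14 points,
# which is why several small rounds beat one big study.
```

Keep in mind that p varies with task complexity and how distinct your user segments are, so treat the output as a planning heuristic, not a guarantee.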
Step 3: Review and share insights
Watch videos as they arrive (usually within 1-2 hours). Use timestamps to mark important moments. Create highlight reels by clipping key quotes and sharing them with your team. UserTesting auto-generates transcripts, making it easy to search for keywords like “confusing” or specific feature names. Download clips for stakeholder presentations or design reviews.
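Because transcripts are searchable text, a few lines of scripting can pre-screen a batch of sessions before anyone watches a video. A minimal sketch, assuming you have exported transcripts as plain-text files into a local transcripts/ folder; the folder name and watch-words below are illustrative, not part of UserTesting’s product:

```python
from pathlib import Path

# Phrases that tend to mark moments worth reviewing in the video.
KEYWORDS = ["confusing", "confused", "where is", "can't find", "frustrat"]

def flag_transcripts(folder: str = "transcripts") -> None:
    """Print every transcript line containing a watch-word, so you can
    jump straight to those timestamps in the session recording."""
    for path in sorted(Path(folder).glob("*.txt")):
        for line_no, line in enumerate(path.read_text(encoding="utf-8").splitlines(), 1):
            if any(kw.lower() in line.lower() for kw in KEYWORDS):
                print(f"{path.name}:{line_no}: {line.strip()}")

if __name__ == "__main__":
    flag_transcripts()
```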
UserTesting in Your Design Workflow
UserTesting rarely stands alone. Here’s where it fits in relation to other research and design tools.
- Before UserTesting: Create prototypes in Figma, define research questions in Notion or Confluence
- During testing: UserTesting for video insights, analytics tools for quantitative data
- After UserTesting: Synthesize findings in Dovetail or Miro, update designs in Figma based on feedback
Common tool pairings:
- UserTesting + Dovetail for organizing video clips into themes and sharing insights across teams
- UserTesting + Figma for testing prototypes before development, then iterating based on video feedback
- UserTesting + Maze for combining qualitative video insights with quantitative prototype metrics
- UserTesting + FullStory or Hotjar for following up on analytics anomalies with video evidence of why users behave the way they do
Common Problems (and How to Fix Them)
These issues come up frequently when teams use UserTesting.
“Participants aren’t following my tasks”
Your task instructions might be too vague or too specific. Bad: “Explore the homepage.” Better: “Imagine you’re looking for a blue jacket in size large. Try to find one now.” Give context (the “imagine” scenario) followed by a clear goal. Avoid telling participants exactly where to click. You want to see their natural navigation patterns, not compliance.
“Videos show problems but stakeholders don’t act on them”
Create highlight reels instead of expecting stakeholders to watch full sessions. Clip 3-5 moments that show the same issue, add a title card explaining the problem and impact, export as a 2-minute video. Share this in Slack or design reviews. Brief, themed clips get action; 45-minute raw sessions get ignored.
“The cost is unpredictable”
UserTesting charges per session, which can balloon costs if you test frequently. Negotiate session bundles upfront (e.g., 100 sessions/year at a discounted rate). For continuous testing, consider switching to subscription-based tools like Maze or Lyssna. Use UserTesting for high-stakes projects where video feedback justifies the cost, and cheaper tools for quick validation.
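The break-even math is worth running before you negotiate. A back-of-envelope sketch with hypothetical figures (none of these rates come from UserTesting’s price list; plug in the numbers from your own quote):

```python
# Hypothetical rates -- replace with your actual quote.
PER_SESSION = 250          # ad-hoc price per recorded session ($)
BUNDLE_PRICE = 20_000      # negotiated annual bundle ($)
BUNDLE_SESSIONS = 100      # sessions included in the bundle
SUBSCRIPTION = 1_200 * 12  # flat-rate alternative tool, per year ($)

def annual_cost(sessions_per_year: int) -> dict[str, float]:
    """Compare the three purchasing models at a given testing volume."""
    return {
        "ad-hoc": sessions_per_year * PER_SESSION,
        "bundle": BUNDLE_PRICE
        + max(0, sessions_per_year - BUNDLE_SESSIONS) * PER_SESSION,
        "subscription": SUBSCRIPTION,
    }

for n in (20, 60, 100, 200):
    costs = annual_cost(n)
    cheapest = min(costs, key=costs.get)
    detail = ", ".join(f"{name} ${cost:,.0f}" for name, cost in costs.items())
    print(f"{n:>3} sessions/yr: {detail} -> cheapest: {cheapest}")
```

Note the subscription row only bounds spend: a flat-rate tool that wins on price trades away the video feedback this section is about, so weigh the output accordingly.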
“Participants give surface-level feedback”
This happens when tasks are too simple or participants aren’t in the right mindset. Add a warm-up task to get them talking. Use scenarios that create realistic motivation: “Your current tool just shut down and you need a replacement today” makes participants more critical than “Browse this app.” Ask follow-up questions in the test script: “How does this compare to your current solution?”
“We’re not finding our target users in the panel”
UserTesting’s panel is broad but not infinite. If you need niche audiences (pediatric oncologists, CFOs at Fortune 500 companies), you’ll struggle. Use UserTesting’s “bring your own users” feature to upload email lists of customers or prospects. Or combine tools: recruit through LinkedIn or user communities, then conduct the session via UserTesting’s interview platform.