# User Research & Validation Plan

*Systematic validation of PromptVault's core assumptions before engineering investment*
## 1. Key Assumptions to Validate
| Assumption | Risk | Validation Method | Target Evidence |
|---|---|---|---|
| **PROBLEM ASSUMPTIONS** | | | |
| AI practitioners manage 20+ prompts across multiple platforms | High | User interviews + screen recording analysis | ≥80% of practitioners confirm managing 15+ prompts |
| Teams waste ≥5 hours/week finding and recreating prompts | High | Time tracking diary study (5 teams, 1 week) | Avg. 4+ hours/week lost to prompt management |
| Version control is a critical pain point (can't find "what worked") | Critical | Scenario-based interview questions | ≥70% report losing good prompt versions |
| **SOLUTION ASSUMPTIONS** | | | |
| Git-like versioning is intuitive for non-developers | Medium | Figma prototype usability testing (10 users) | ≥80% complete core tasks without guidance |
| Multi-model testing saves significant manual effort | High | Wizard of Oz MVP with manual execution | Users report ≥50% time savings vs manual testing |
| **BUSINESS ASSUMPTIONS** | | | |
| Teams will pay $49/user/month for collaboration features | Critical | Pricing page A/B test + pre-order commitment | ≥3% conversion to Team plan at target price |
| CAC < $150 for Pro users via content marketing | Medium | $1,000 test campaigns across 3 channels | CAC < $120 in at least 2 channels |
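The channel economics in the table above reduce to a one-line formula: CAC = spend ÷ paying conversions. A minimal sketch of the pass/fail check follows; the $150 ceiling and $120 pass bar come from the table, while the reading of "$1,000 test campaigns" as $1,000 per channel, the channel names, and the conversion counts are hypothetical illustrations.

```python
# CAC check per channel: CAC = spend / paying conversions.
# The $150 ceiling and $120 pass bar come from the assumptions table;
# channel names, budgets, and conversion counts below are made up.

CAC_CEILING = 150   # assumption is invalidated above this
CAC_PASS = 120      # a channel "passes" the test below this

channels = {                 # name: (spend_usd, paying_conversions)
    "content_seo":   (1000, 9),
    "linkedin_ads":  (1000, 7),
    "ai_newsletter": (1000, 11),
}

def cac(spend, conversions):
    """Customer acquisition cost; infinite if nothing converted."""
    return spend / conversions if conversions else float("inf")

# Validation target from the table: CAC < $120 in at least 2 channels.
passing = [name for name, (spend, conv) in channels.items()
           if cac(spend, conv) < CAC_PASS]
print(f"passing channels: {passing}, target met: {len(passing) >= 2}")
```

With these sample numbers, two of three channels clear the bar, so the assumption would hold; swap in real campaign results to run the same check.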
## 2. Customer Discovery Interview Guide (75 Minutes)

### Interview Targets
- AI Engineers (8-10)
- Product Managers using LLMs (6-8)
- Content Creators/Analysts (4-6)
- Consultants/Agency leads (4-6)
### Logistics
- Recruitment: LinkedIn, AI Discord communities
- Incentive: $75 Amazon gift card
- Tools: Zoom + Otter.ai + Airtable notes
- Target: 25 interviews minimum
## 3. Validation Experiments & Timeline

### 8-Week Validation Timeline
- **Weeks 1-2: Problem Discovery.** Validate core pain points, document current workflows, and identify the most frustrated user segments.
- **Weeks 3-4: Solution Testing.** Test messaging, measure demand, build the waitlist, and validate user interest.
- **Weeks 5-6: Pricing Validation.** Test price sensitivity, validate willingness to pay, and optimize pricing tiers.
- **Weeks 7-8: Prototype Validation.** Test the core workflow, collect qualitative feedback, and measure time savings.
## 4. Go/No-Go Decision Criteria
| Metric | Target | Minimum Threshold | Weight | Status |
|---|---|---|---|---|
| Problem Validation Score | ≥80% confirm pain points | 70% | 30% | ● Not Tested |
| Waitlist Signup Rate | ≥5% conversion | 3% | 25% | ● Not Tested |
| Price Acceptance | ≥60% at target price | 40% | 20% | ● Not Tested |
| Pre-Orders Secured | 10+ customers | 5 | 15% | ● Not Tested |
| Prototype NPS | ≥40 | 30 | 10% | ● Not Tested |
### Go Decision Criteria

Proceed if the total weighted score is ≥ 70% AND at least 3 of 5 metrics meet their targets. At minimum, validation requires confirmed pain points plus demonstrated willingness to pay (price acceptance or pre-orders).
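The decision rule above can be sketched as a small calculator. The weights and the 70% bar come from the table; scoring each metric 1.0 if it meets its target, 0.5 if it only clears the minimum threshold, and 0 otherwise is an illustrative assumption, as are the sample results.

```python
# Go/no-go calculator for the validation scorecard above.
# Weights and the 70% bar come from the decision table; the 1.0 / 0.5 / 0.0
# per-metric scoring and the sample results are illustrative assumptions.

METRICS = {
    # name: (weight, target_met, threshold_met) -- sample results
    "problem_validation": (0.30, True, True),
    "waitlist_signup":    (0.25, False, True),
    "price_acceptance":   (0.20, True, True),
    "pre_orders":         (0.15, False, False),
    "prototype_nps":      (0.10, True, True),
}

def go_no_go(metrics):
    """Return (go?, weighted score, number of metrics hitting target)."""
    score = sum(w * (1.0 if target else 0.5 if threshold else 0.0)
                for w, target, threshold in metrics.values())
    targets_met = sum(target for _, target, _ in metrics.values())
    return score >= 0.70 and targets_met >= 3, score, targets_met

go, score, hits = go_no_go(METRICS)
print(f"weighted score: {score:.1%}, targets met: {hits}/5, go: {go}")
```

With the sample results, the weighted score is 72.5% with 3 of 5 targets met, so both conditions hold and the decision is "go"; replacing any target hit with a miss shows how quickly the rule flips.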
## 5. Research Synthesis Template

### Problem Validation Summary
- Top 3 validated pain points
- User quotes as evidence
- Unexpected findings
- Invalidated assumptions
### Solution Validation Summary
- Most compelling features
- Features users don't care about
- UX concerns raised
- Integration needs identified
### Pricing Validation Summary
- Optimal price point: $______
- Price sensitivity by segment
- Value anchors (what they compare to)
- Preferred pricing model
### Go-to-Market Insights
- Where target users hang out
- How they discover solutions
- Decision-making process
- Key buying objections
## Recommended Next Steps

This validation plan requires 8 weeks and approximately $3,000 in incentives and ad spend ($75 × 25 interview gift cards plus the test-campaign budgets). Success criteria are based on common industry benchmarks for B2B SaaS products.