Section 05: User Research & Validation Plan
A systematic, evidence-based approach to validating VendorShield's core assumptions before significant technical investment, focusing on problem existence, solution appeal, and willingness to pay.
1. Key Assumptions to Validate
🚨 Problem Assumptions
Assumption 1: Security teams spend 40+ hours per vendor assessment manually.
Target Evidence: 80% of CISOs confirm >30 hours/vendor with current process.
Assumption 2: Manual questionnaires are considered "security theater" and unreliable.
Target Evidence: 70% express low confidence in vendor self-assessment accuracy.
💡 Solution Assumptions
Assumption 1: Automated continuous monitoring is preferable to periodic reviews.
Target Evidence: 90% of beta users prefer real-time alerts over quarterly reviews.
Assumption 2: Composite risk score (0-100) is more actionable than separate metrics.
Target Evidence: Users make faster decisions with composite score vs. detailed reports.
💰 Business Assumptions
Assumption 1: Mid-market companies (500-5k employees) will pay $999/mo for 200 vendors.
Target Evidence: Price acceptance from 10+ target companies in pre-orders.
Assumption 2: Procurement teams will collaborate with security on vendor risk.
Target Evidence: 60% of procurement leaders express willingness to use shared platform.
2. Customer Discovery Interview Guide
60-Minute Framework for Security Leaders
Target: 25 interviews (15 CISOs, 5 Procurement, 5 Compliance)
🎯 Part 1: Context (10 min)
- "Walk me through your vendor risk management process today."
- "How many vendors do you manage? How many are considered 'high risk'?"
- "Who's involved? Security, procurement, legal?"
🔥 Part 2: Pain Points (15 min)
- "Tell me about the last vendor-related security incident."
- "What's the most time-consuming part of vendor assessments?"
- "How confident are you in vendor self-reported questionnaires?"
🛠️ Part 3: Current Solutions (15 min)
- "What tools do you use? Spreadsheets, GRC platforms?"
- "What do they do well? Where do they fall short?"
- "Have you evaluated automated solutions? Why/why not?"
🚀 Part 4: Solution Reaction (15 min)
- "If we could automatically monitor vendor security posture..."
- "Would real-time alerts or quarterly reports be more valuable?"
- "What would be your biggest concern about automated monitoring?"
📊 Interview Success Metrics
3. Validation Experiments
🎯 Landing Page Test
Goal: Validate demand before building
A/B test messaging with $750 ad spend
Headlines to Test:
Success Criteria:
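Whatever success criteria we set, the landing page result should clear the bar with statistical confidence, not just nominally. A minimal sketch of that check (assuming the 7% signup benchmark from the go/no-go table, and an illustrative ~$1.50 CPC so $750 buys roughly 500 visitors; both figures are assumptions, not commitments):

```python
from math import sqrt

def signup_z_score(signups: int, visitors: int, benchmark: float = 0.07) -> float:
    """One-proportion z-statistic: observed signup rate vs. the 7% benchmark."""
    observed = signups / visitors
    std_err = sqrt(benchmark * (1 - benchmark) / visitors)
    return (observed - benchmark) / std_err

# Hypothetical outcome: 45 signups from ~500 paid visitors (9% observed).
z = signup_z_score(signups=45, visitors=500)
# z > 1.645 clears the 7% bar at one-sided 95% confidence.
```

With ~500 visitors, a 9% observed rate only just clears the 7% target, which is why the ad budget (and thus sample size) matters as much as the headline variants.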
🔮 Fake Door & Pre-Order Test
Fake Door Test
Test demand for premium features before building.
Pre-Order Test
Collect refundable deposits to validate willingness to pay.
🧪 Prototype Testing Options
Wizard of Oz
Manual backend with automated frontend
Concierge MVP
High-touch manual service
Clickable Prototype
Figma mockups with user flows
4. 8-Week Validation Timeline
📊 Go/No-Go Decision Criteria
| Metric | Target | Weight | Decision Rule |
|---|---|---|---|
| Problem Validation Rate | ≥80% confirm pain | 25% | GO if ≥80% |
| Landing Page Conversion | ≥7% signup rate | 20% | GO if ≥7% |
| Price Acceptance | ≥60% at target price | 25% | GO if ≥60% |
| Pre-Orders Secured | ≥10 customers | 20% | GO if ≥10 |
| Prototype NPS | ≥40 | 10% | GO if ≥40 |
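The weighted criteria above can be rolled into a single scorecard. A sketch, assuming an overall pass bar of 70% of total weight (the table gives per-metric GO rules but no combined threshold, so that bar is an illustrative assumption):

```python
def go_no_go(results: dict[str, bool], weights: dict[str, float]) -> tuple[float, bool]:
    """Sum the weights of criteria that hit their GO threshold.

    Returns (weighted score, overall decision). The >= 0.70 overall
    bar is an assumption for illustration, not from the table.
    """
    score = sum(weights[name] for name, passed in results.items() if passed)
    return score, score >= 0.70

WEIGHTS = {
    "problem_validation": 0.25,  # >=80% confirm pain
    "landing_conversion": 0.20,  # >=7% signup rate
    "price_acceptance":   0.25,  # >=60% at target price
    "pre_orders":         0.20,  # >=10 customers
    "prototype_nps":      0.10,  # NPS >= 40
}

# Hypothetical outcome: everything passes except pre-orders.
results = {
    "problem_validation": True,
    "landing_conversion": True,
    "price_acceptance": True,
    "pre_orders": False,
    "prototype_nps": True,
}
score, decision = go_no_go(results, WEIGHTS)  # score 0.80 -> GO
```

A design note: weighting makes the decision robust to a single miss on a lower-weight metric, while failing both 25%-weight criteria (problem validation and price acceptance) drops the score to 0.50 and forces a NO-GO under this bar.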