Section 05: User Research & Validation Plan
Before investing in a full platform build, LocalPerks must validate three critical dimensions: (1) that independent businesses genuinely struggle with loyalty programs and want a coalition solution, (2) that the value proposition resonates with both businesses and consumers, and (3) that the economics work for all parties. This 8-week validation plan tests core assumptions through customer interviews, surveys, landing page experiments, and prototype testing.
Key Assumptions to Validate
Problem Assumptions (Business Side)
Assumption 1: Independent businesses spend 3-5 hours/month managing customer relationships with no systematic tool
Risk: High — If businesses don't struggle with this, the problem isn't pressing
Validation: Customer interviews (Q: "How much time do you spend managing customer loyalty?")
Target Evidence: 75%+ of interviewed businesses confirm 3+ hours/month on manual tracking
Assumption 2: Building an individual loyalty program costs $3K-$10K to launch and $500-$1K/month to maintain
Risk: High — Validates why DIY isn't viable for small businesses
Validation: Interviews + pricing research on existing tools (Fivestars, Toast, custom builds)
Target Evidence: Documented quotes on cost barriers; competitor pricing audit
Assumption 3: Consumers forget punch cards within 2-3 months; 70%+ physical cards are abandoned
Risk: High — Validates the need for digital solution
Validation: Interviews + consumer survey on loyalty program usage patterns
Target Evidence: 60%+ of interviewed consumers confirm punch card abandonment
Assumption 4: Local retail spending in target neighborhoods is $200K-$500K/day, with 40%+ going to chains
Risk: Medium — Validates market size
Validation: Public spending data + interviews with business associations
Target Evidence: Census data and local chamber reports confirm neighborhood spending baseline
Solution Assumptions (Business Side)
Assumption 5: Businesses will enroll if setup takes <15 minutes and requires no hardware integration
Risk: High — Frictionless onboarding is critical
Validation: Landing page signup rate + Wizard of Oz prototype trial
Target Evidence: >7% landing page conversion; 80%+ of trial users complete signup
Assumption 6: Businesses will participate if they believe coalition creates new customer access (cross-shopping)
Risk: Critical — Value prop must be clear, not just cost savings
Validation: Deep interviews (Q: "What would make you switch from your current system?")
Target Evidence: 70%+ cite "access to nearby customers" as primary driver
Assumption 7: Businesses will accept 5% fee on redemptions if it reduces their operational burden
Risk: High — Pricing model viability
Validation: Pricing interviews + pre-order test
Target Evidence: 60%+ accept 5% fee; 10+ pre-orders at stated price
Solution Assumptions (Consumer Side)
Assumption 8: Consumers will download the app and enable location if it shows nearby local businesses
Risk: High — Consumer adoption is prerequisite for business value
Validation: Landing page + fake door test with app preview
Target Evidence: >8% of visitors click "Download App"; >5% enable location in prototype
Assumption 9: Consumers will actively use the app (1+ transaction/week) once they understand the value
Risk: High — Engagement is essential for repeat business value
Validation: Prototype testing + early user metrics
Target Evidence: 50%+ of beta users have 2+ transactions in first month
Assumption 10: Consumers value cross-business rewards over single-brand programs
Risk: Medium — Core differentiator validation
Validation: Consumer interviews + survey on app feature preferences
Target Evidence: 65%+ prefer unified app over 5 separate loyalty apps
Business Model Assumptions
Assumption 11: Businesses will stay enrolled if first-month redemption rate is 40%+ (proving consumer interest)
Risk: Critical — First month experience drives retention
Validation: Prototype/pilot redemption data
Target Evidence: Pilot shows 40%+ redemption rate in month 1
Assumption 12: Coalition of 30 businesses can acquire 2,000+ active consumers in 3 months through word-of-mouth + marketing
Risk: High — Network effect depends on density
Validation: Pilot cohort performance + consumer acquisition cost testing
Target Evidence: Pilot achieves 1,500+ downloads; CAC <$8 via marketing
Assumption 13: $29/month basic tier will be affordable for 80%+ of independent retailers
Risk: Medium — Pricing must align with perceived value
Validation: Van Westendorp pricing survey + interviews
Target Evidence: Survey shows $25-$35 as acceptable range for 70%+ of businesses
Customer Discovery Interview Guide (60-90 Minutes)
For Business Owners / Managers (30-40 interviews target)
PART 1: ROLE & CONTEXT (10 min)
- Tell me about your business—type, size, customer base, years in operation
- How do you currently think about customer loyalty and repeat business?
- What's your biggest business challenge right now?
- Are you a solo operator or do you have a team?
PART 2: LOYALTY PROBLEM (20 min)
- How do you currently track or reward repeat customers? (punch cards, app, memory, etc.)
- How much time do you spend managing customer relationships each week?
- What frustrates you most about your current approach?
- Have you ever tried a digital loyalty tool? Why did you switch or not adopt?
- What would an ideal loyalty system look like for your business?
- How much are you losing to chain competitors who have better rewards?
PART 3: COALITION CONCEPT (20 min)
- Imagine your customers could earn points at your shop AND at the coffee shop next door, the bookstore, and the restaurant. How would that change their loyalty to you?
- What other businesses in your area would make sense to partner with?
- Would you be willing to share customer data (anonymized) with partner businesses to understand shopping patterns?
- What's the biggest concern you'd have about joining a coalition like this?
PART 4: PRICING & ECONOMICS (15 min)
- How much are you currently spending (if anything) on loyalty or marketing per month?
- If a coalition loyalty system cost $29/month and 5% on each customer's redemption, would that be worth it?
- What would the system need to save you or earn you for it to be a clear "yes"?
- Are there other businesses (chamber of commerce, local group) who might subsidize this?
PART 5: NEXT STEPS (5 min)
- How likely are you to try something like this (1-10 scale)?
- What would you need to see to commit to a 3-month pilot?
- Who else in your network should I talk to about this?
For Consumers (15-20 interviews target)
PART 1: LOYALTY BEHAVIOR (10 min)
- How many loyalty programs are you currently active in? (Apps, cards, both)
- Which ones do you actually use regularly and why?
- How often do you forget about rewards cards or apps?
- How much do loyalty rewards influence where you shop?
PART 2: LOCAL SHOPPING PREFERENCES (10 min)
- Do you prefer shopping at local businesses or chains? Why?
- What barriers prevent you from shopping more locally?
- If local businesses offered rewards as good as Starbucks or CVS, would it change your behavior?
PART 3: COALITION CONCEPT TEST (10 min)
- Imagine one app where you earn points at the local coffee shop, bookstore, restaurant, and boutique—all in your neighborhood. How does that feel?
- Would you download and use an app like that? Why or why not?
- What features would matter most? (Earning flexibility, easy redemption, discovery, etc.)
PART 4: USAGE INTENT (5 min)
- On a scale 1-10, how likely would you be to actively use this app?
- How often would you expect to use it (daily, weekly, monthly)?
Survey Designs
Screening Survey (5-minute, goal: 300+ responses)
| Question | Response Type |
|---|---|
| 1. What best describes your business or role? | Solo business owner / Manager / Chain employee / Consumer / Other |
| 2. Do you currently operate a loyalty program of any kind? | Yes / No / Considering |
| 3. How much time per week do you spend managing customer relationships/loyalty? (if business owner) | 0 hrs / 1-3 hrs / 3-5 hrs / 5+ hrs |
| 4. How many loyalty apps/cards do you actively use? (if consumer) | 0 / 1-2 / 3-5 / 5+ |
| 5. How often do you prefer shopping at local independent businesses vs. chains? | Always local / Usually local / Mix equally / Usually chains / Always chains |
| 6. Would you be interested in a 30-minute research conversation? ($50 gift card incentive) | Yes, contact: ___ / No thanks |
Problem Validation Survey (10 questions, goal: 500+ responses)
| Question | Why This Matters |
|---|---|
| Frequency: "How often do you visit the same coffee shop (or local business)?" | Establishes baseline for repeat behavior |
| Pain: "Rate the importance: A loyalty program would make me visit more often" (1-10) | Measures problem severity |
| Adoption: "What prevents you from using loyalty programs more? (check all)" | Identifies barriers: forget, don't find valuable, too many apps, etc. |
| Switching: "If local businesses had rewards as good as chains, would you shift spending?" | Tests if rewards drive local preference |
| Coalition Interest: "How interested in ONE app for all your local spots?" (1-10) | Core value prop validation |
Landing Page & Fake Door Experiments
Experiment 1: Business Landing Page (Week 2)
Goal: Validate business problem and solution interest
URL: localperks.co/for-business
Copy: "Stop losing customers to chain loyalty programs. Join 30+ local businesses building a community rewards network."
CTA: "Get early access" → email signup
Success Criteria: >7% signup rate (70+ emails from 1,000 visitors)
Experiment 2: Consumer App Fake Door (Week 2)
Goal: Validate consumer interest in unified loyalty app
URL: localperks.co/app
Visual: Mockup showing points earned across 5 local businesses, redemption flow
CTA: "Download app" button → shows "Coming soon, join waitlist"
Success Criteria: >8% click-through; 500+ waitlist signups
Experiment 3: Van Westendorp Pricing Survey (Week 3-4)
Goal: Identify optimal business subscription price
Sample: 200+ small business owners
Method: 4 pricing questions (too cheap, bargain, expensive, too expensive)
Deliverable: Optimal price point chart + acceptable range
Success Criteria: Identify price with 60%+ acceptance within $20-$50 range
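Once the four-question responses are in, the Optimal Price Point (OPP) falls where the "too cheap" and "too expensive" curves intersect. The sketch below shows one way to compute that from raw survey answers; the function name and the sample data are illustrative, not part of the plan.

```python
def optimal_price_point(too_cheap, too_expensive, grid):
    """Van Westendorp OPP sketch: for each candidate price on `grid`,
    compute the share of respondents who consider it too cheap
    (their 'too cheap' answer is at or above it) and too expensive
    (their 'too expensive' answer is at or below it), then return the
    first price where the two curves are closest."""
    n = len(too_cheap)
    best_p, best_gap = None, float("inf")
    for p in grid:
        pct_too_cheap = sum(1 for a in too_cheap if a >= p) / n
        pct_too_expensive = sum(1 for a in too_expensive if a <= p) / n
        gap = abs(pct_too_cheap - pct_too_expensive)
        if gap < best_gap:
            best_p, best_gap = p, gap
    return best_p

# Hypothetical answers from five respondents (dollars/month):
opp = optimal_price_point(
    too_cheap=[20, 22, 25, 28, 30],
    too_expensive=[28, 32, 35, 38, 45],
    grid=list(range(20, 46)),
)
```

With a real sample of 200+ owners, the same curve data also yields the acceptable range (the other two questions bound it), which is what the $20-$50 success criterion is checked against.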
Prototype Testing (Wizard of Oz MVP)
Week 4-6: Manual Prototype with Early Businesses
- Recruit 5-10 businesses from target neighborhoods (existing connections, business association referrals)
- Create simple onboarding: Google Form with business name, location, categories
- Manually set up: Custom QR codes, basic dashboard (Google Sheets), simple email confirmations
- Launch soft consumer test: Friends + family in same neighborhood; seeded waitlist signups
- Track 3 weeks of live data: Transactions, points earned/redeemed, business feedback, consumer feedback
- Measure success: 20+ consumer downloads, 5+ transactions per business, redemption rate 30%+
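Since the Wizard of Oz dashboard is just a spreadsheet, the success metrics above can be rolled up with a few lines of code. This is a sketch assuming a transaction log with one row per visit; the field names are hypothetical.

```python
def pilot_metrics(log):
    """Roll up pilot success metrics from a transaction log.
    Each entry: {"business", "consumer", "points_earned", "points_redeemed"}.
    Returns the overall redemption rate and transaction count per business."""
    issued = sum(t["points_earned"] for t in log)
    redeemed = sum(t["points_redeemed"] for t in log)
    per_business = {}
    for t in log:
        per_business[t["business"]] = per_business.get(t["business"], 0) + 1
    return {
        "redemption_rate": redeemed / issued if issued else 0.0,
        "transactions_per_business": per_business,
    }

# Hypothetical two-row log: 100 points issued, 35 redeemed overall.
metrics = pilot_metrics([
    {"business": "cafe", "consumer": "a", "points_earned": 60, "points_redeemed": 20},
    {"business": "books", "consumer": "b", "points_earned": 40, "points_redeemed": 15},
])
```

The redemption-rate output maps directly onto the 30%+ success bar for the pilot (and the 40%+ bar in the go/no-go table).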
Week 5-6: Consumer Prototype Feedback Sessions
- Interview 10 beta consumers on app experience, feature clarity, willingness to use regularly
- Measure NPS: Target 40+ (measuring likelihood to recommend)
- Identify friction: Where do users get confused? What features matter most?
- Test messaging: Show different value props, measure resonance
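The NPS target above follows the standard formula (promoters scoring 9-10 minus detractors scoring 0-6, as a percentage of all respondents). A minimal sketch, with hypothetical beta scores:

```python
def nps(scores):
    """Net Promoter Score from 0-10 'likelihood to recommend' answers.
    Promoters score 9-10, detractors 0-6; passives (7-8) only widen the base."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Ten hypothetical beta users: 6 promoters, 2 passives, 2 detractors.
score = nps([10, 9, 9, 8, 7, 10, 6, 9, 10, 5])
```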
8-Week Validation Timeline
WEEK 1-2: Assumption Identification & Interview Launch
☐ Finalize the 13 key assumptions across problem/solution/business model
☐ Recruit and conduct 10-15 business owner interviews
☐ Recruit and conduct 8-10 consumer interviews
☐ Launch landing page + fake door experiments (low-cost, high-learning)
Output: Interview notes, initial problem validation, traffic data
WEEK 3-4: Survey & Pricing Validation
☐ Distribute screening survey (target 300+ responses)
☐ Distribute problem validation survey (target 500+ responses)
☐ Conduct Van Westendorp pricing study with 200+ businesses
☐ Analyze landing page & fake door performance
Output: Quantified problem severity, pricing data, cost structure validated
WEEK 5-6: Prototype & Pre-Order Validation
☐ Build Wizard of Oz MVP with 5-10 pilot businesses
☐ Launch consumer beta (target 100-200 downloads)
☐ Track 3 weeks of live transaction data
☐ Test pre-order: Offer 3-month pilot at $79 (50% discount) to 10 businesses
☐ Conduct prototype feedback interviews with 10 consumers
Output: Live usage data, redemption rates, NPS score, pre-order traction
WEEK 7-8: Synthesis & Go/No-Go Decision
☐ Compile all validation data against go/no-go criteria
☐ Document learnings: confirmed assumptions, invalidated assumptions, unexpected insights
☐ Refine value props based on feedback
☐ Make pivot/persevere/scale decision
☐ Plan Phase 2 (if go): expanded pilot or full build
Output: Comprehensive validation report + strategic decision + roadmap
Go/No-Go Decision Criteria
| Validation Area | Success Target | Red Flag (No-Go) |
|---|---|---|
| Business Problem | 75%+ of interviewed businesses confirm 3+ hours/month on loyalty management | <20% confirmation rate |
| Business Interest | 7%+ landing page signup; 10+ pre-order commitments | <3% signup; 0 pre-orders |
| Business Pricing | 60%+ accept $29/month; Van Westendorp range $20-$50 | Optimal price >$50 or <$15 |
| Consumer Problem | 65%+ have 2+ active loyalty apps they struggle to remember | <40% report loyalty app friction |
| Consumer Interest | 8%+ fake door CTR; 500+ app waitlist | <4% CTR; <200 waitlist |
| Prototype Redemption | 40%+ of issued points redeemed in month 1 (pilot) | <25% redemption rate |
| Prototype NPS | NPS >40 (promoters 9-10 minus detractors 0-6) | NPS <20 |
| Repeat Usage | 50%+ of pilot consumers use app 2+ times in first month | <25% repeat rate |
| Cross-Business Behavior | 40%+ of redemptions are at different business than where points earned | <15% cross-redemption (kills coalition value) |
| Network Density | Pilot achieves 30+ committed businesses for Phase 2 | <15 businesses interested |
Decision Framework:
- SCALE: 8+ criteria met → Proceed to full MVP build and city expansion
- PERSEVERE: 6-7 criteria met → Run additional experiments, refine value prop, extend timeline
- PIVOT: 4-5 criteria met → Fundamentally change approach (B2B vs B2C, different coalition model, etc.)
- KILL: <4 criteria met → Unvalidated assumptions too numerous; insufficient traction
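The framework above maps the count of criteria met (out of the 10 in the table) to a decision. Encoding it keeps the week-8 review mechanical; a sketch, not a substitute for judgment on which criteria failed:

```python
def decision(criteria_met: int) -> str:
    """Map the number of go/no-go criteria met (0-10) to the
    SCALE / PERSEVERE / PIVOT / KILL decision framework."""
    if criteria_met >= 8:
        return "SCALE"
    if criteria_met >= 6:
        return "PERSEVERE"
    if criteria_met >= 4:
        return "PIVOT"
    return "KILL"
```

Note the thresholds are exhaustive and non-overlapping, so every review outcome lands in exactly one bucket.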
Research Synthesis Output
After 8 weeks, document findings in three sections: (1) Validated Problems: Specific quotes and patterns confirming pain points; (2) Solution Learnings: Which features resonated, what surprised us, integration needs; (3) Market & Economics: Pricing acceptance, CAC channels, retention risks, competitive positioning. This synthesis becomes the input for either proceeding to full MVP build or pivoting based on evidence.