MeetingMeter - Meeting Cost Calculator


User Research & Validation Plan

MeetingMeter - Data-Driven Experimentation Framework

Section 05

1 Key Assumptions to Validate

The following assumptions represent the critical path to viability. We will prioritize testing "Privacy/Salary Data" and "Willingness to Change Behavior" as they pose the highest risk.

| Category | Assumption | Risk | Validation Method | Target Evidence |
|----------|------------|------|-------------------|-----------------|
| Problem | Ops leaders lack visibility into aggregate meeting costs and cannot currently calculate ROI on collaboration time. | HIGH | Discovery Interviews | 80%+ cannot answer "What did meetings cost last week?" |
| Problem | Employees are willing to reduce meeting time if provided data, but lack the social permission to do so. | MED | Survey + Interviews | 70% say they want to attend fewer meetings. |
| Problem | Current tools (Clockwise, Reclaim) do not solve the "cost" pain point for finance/ops leaders. | HIGH | Competitive Analysis | Confirmed absence of a "Cost View" in competitor stacks. |
| Solution | Users will allow integration with calendar/salary data despite privacy concerns. | CRITICAL | Landing Page Test | <20% drop-off at the "Connect Calendar" step. |
| Solution | "Nudges" (pre-meeting cost popups) will change behavior without causing resentment. | HIGH | Prototype Testing | Users rate the nudge "Helpful", not "Annoying", in post-test surveys. |
| Solution | Role-based salary estimates are accurate enough to drive action without exact salary data. | MED | A/B Testing (Exact vs. Estimate) | No significant difference in user decision-making. |
| Business | Ops leaders have budget authority or influence for "Productivity Software" ($4k-$12k/yr). | HIGH | Pricing Interviews | 60%+ confirm budget exists for this category. |
| Business | CAC can be kept below $100 via content marketing and organic virality. | MED | Ad Spend Test | Cost per lead <$15 in initial campaigns. |

2 Customer Discovery Interview Guide

Part 1: Context & Role (10 min)

  • "Walk me through your role. What does 'operational excellence' look like for your team?"
  • "What are the top 3 metrics you are judged on this year?"
  • "How do you currently track 'time spent' vs. 'output produced'?"

Part 2: The Problem - Meeting Culture (20 min)

  • "Tell me about the last time you looked at your calendar and felt frustrated." (Follow up: What specifically frustrated you? The volume? The people? The lack of outcome?)
  • "If you had to put a dollar amount on the waste in your organization's meetings, what would it be?"
  • "Who is responsible for meeting efficiency in your company right now?" (Look for: "No one" or "It's cultural")
  • "Have you tried to fix meeting culture before? What happened?"

Part 3: The Elephant in the Room - Privacy & Data (15 min)

  • "Imagine I showed you a report listing exactly how much money 'Marketing Meetings' cost last month. How would you use that?"
  • "What if I showed you the cost of *individual* meetings? Would that be useful or invasive?"
  • "To calculate cost, we need salary data. Would you prefer: A) Uploading exact salaries, or B) Using role-based estimates (e.g., 'Senior Engineer = $150k')?"
  • "What would your employees say if they saw a '$500' price tag on the weekly all-hands invite?"

Part 4: Solution & Pricing (15 min)

  • "If we could save your company 10% on meeting costs, what is that worth to you annually?"
  • "We are thinking of pricing this at $8/employee/month. Is that a 'no-brainer', 'needs approval', or 'too expensive'?"
  • "What would need to be true for you to buy this tomorrow?"

Logistics: Target 20 interviews (10 Ops, 5 HR, 5 Dept Heads). Incentive: $50 Amazon gift card. Tools: Zoom + Otter.ai.

3 Survey Design

Screening Survey (Typeform)

1. Company Size: [ ] 10-50 [ ] 51-200 [ ] 201-1000 [ ] 1000+

2. Role: [ ] Ops/COO [ ] HR [ ] Engineering Mgr [ ] Finance [ ] Other

3. Top Productivity Challenge: [ ] Too many meetings [ ] Tool switching [ ] Async comms [ ] Hiring

4. Meeting Pain Scale (1-10): [ Slider ]

5. Willingness to talk: "We're researching meeting culture. Want a $50 gift card for 30 mins?" [Yes/No]

Validation Survey (Post-Interview)

1. Current Cost Visibility: "Do you know how much your team spends on meetings weekly?" [Yes/No/Roughly]

2. Feature Priority (Rank):

  • Aggregate Cost Dashboard
  • Pre-meeting Cost Nudges
  • Meeting ROI Reports
  • Salary Benchmarking

3. Pricing Sensitivity:

  • Too Expensive: > $15/user/mo
  • Expensive but worth it: $8-$15
  • Bargain: $4-$8
  • Too Cheap/Concerned: < $4

4 Validation Experiments

Experiment A: Landing Page & Fake Door

Goal: Validate demand and willingness to connect calendar.

Headline Variants (A/B/n):
  • V1 (Fear): "Your company is wasting $50k/month on meetings. Find out where."
  • V2 (Curiosity): "See the real-time cost of every meeting on your calendar."
  • V3 (Benefit): "Give your team 10% of their time back. Automatically."
Success Criteria:
  • 1,000 unique visitors (Ads)
  • >5% conversion to "Connect Calendar"
  • < 10% bounce rate on pricing page
Budget/Traffic:
  • Spend: $800 (2 weeks)
  • Channels: LinkedIn (B2B), Reddit (r/ops)
  • Target CPC: <$1.50
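At this traffic level, raw signup counts between headline variants can differ by chance alone, so it is worth checking significance before declaring a winner. A minimal two-proportion z-test sketch in Python (stdlib only; the counts below are hypothetical, not targets from this plan):

```python
from math import sqrt, erf

def conversion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for comparing signup rates between
    two landing-page variants. Returns (z, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal CDF via erf, folded into a two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: V1 (Fear) 40/500 signups vs V2 (Curiosity) 22/500
z, p = conversion_z_test(40, 500, 22, 500)
print(f"z={z:.2f}, p={p:.4f}")
```

A p-value below 0.05 would suggest the headline difference is real rather than noise; with only ~500 visitors per variant, small differences will not reach significance, which is itself useful to know before scaling spend.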

Experiment B: Wizard of Oz MVP (Concierge)

Goal: Validate the "Aha!" moment of seeing meeting cost data without building the integration yet.

The Process:
  • User exports their calendar to a .ics file and emails it to us
  • We manually calculate costs with a Python script
  • We send back a polished PDF report
  • We ask for a $50 payment
Success Criteria:
  • 20 users complete the flow
  • NPS > 40 on the report
  • 3 users pay $50 for the manual report
Learning Focus:
  • Is the data accurate enough?
  • What insights are most surprising?
  • Do they share the report with their boss?
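The manual calculation step could look something like the sketch below: a deliberately simplified .ics parser plus the core cost formula (duration × attendees × blended hourly rate). The role rates are hypothetical, and a real script should use a library such as icalendar to handle time zones, recurrence, and folded lines.

```python
from datetime import datetime

# Hypothetical role-based hourly rates, derived from annual estimates
# (e.g., "Senior Engineer = $150k" / ~2,080 work hours ≈ $72/hr)
HOURLY_RATES = {"senior_engineer": 72.0, "manager": 85.0, "default": 60.0}

def parse_ics_events(ics_text):
    """Minimal .ics parser: extracts start/end times and attendee count
    per VEVENT. Ignores time zones, recurrence, and line folding."""
    events, current = [], None
    for line in ics_text.splitlines():
        line = line.strip()
        if line == "BEGIN:VEVENT":
            current = {"attendees": 0}
        elif line == "END:VEVENT" and current is not None:
            events.append(current)
            current = None
        elif current is not None:
            if line.startswith("DTSTART"):
                current["start"] = datetime.strptime(line.split(":")[-1], "%Y%m%dT%H%M%SZ")
            elif line.startswith("DTEND"):
                current["end"] = datetime.strptime(line.split(":")[-1], "%Y%m%dT%H%M%SZ")
            elif line.startswith("ATTENDEE"):
                current["attendees"] += 1
    return events

def meeting_cost(event, rate=HOURLY_RATES["default"]):
    """Cost = duration in hours x attendee count x blended hourly rate."""
    hours = (event["end"] - event["start"]).total_seconds() / 3600
    return round(hours * event["attendees"] * rate, 2)

sample = """BEGIN:VEVENT
DTSTART:20260105T150000Z
DTEND:20260105T160000Z
ATTENDEE:mailto:a@example.com
ATTENDEE:mailto:b@example.com
ATTENDEE:mailto:c@example.com
END:VEVENT"""

events = parse_ics_events(sample)
print(meeting_cost(events[0]))  # 1 hr x 3 attendees x $60/hr = 180.0
```

Even this rough version is enough to produce the concierge report and directly tests the role-based-estimate assumption from the table above.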

5 8-Week Validation Timeline

Weeks 1-2: Problem Discovery

Focus: Do we have the right problem?

  • Conduct 15 Customer Interviews
  • Launch Screening Survey (Target 200 responses)
  • Update Assumption Log (Kill/Keep)

Weeks 3-4: Demand Testing

Focus: Will they click?

  • Build Landing Page (Webflow/Framer)
  • Run $800 Ad Test (LinkedIn/Reddit)
  • A/B Test Headlines
  • Collect 50 Waitlist Emails

Weeks 5-6: Value Delivery

Focus: Is the data useful?

  • Onboard 10 "Concierge" users (Wizard of Oz)
  • Manually generate Meeting Cost Reports
  • Conduct "Reaction" interviews post-report
  • Test Pricing ($50 one-time vs sub)

Weeks 7-8: Synthesis & Decision

Focus: Go or No-Go?

  • Compile all data into "Investor Memo"
  • Run Go/No-Go Criteria Analysis
  • If Go: Begin MVP Engineering sprint

6 Go/No-Go Decision Matrix

| Metric | Green Light (Go) | Yellow Light (Pivot) | Red Light (Stop) |
|--------|------------------|----------------------|------------------|
| Problem Validation | 80%+ confirm "Cost Visibility" gap | 50-79% confirm gap | <50% confirm gap |
| Landing Page Conversion | >5% signup rate | 2-5% signup rate | <2% signup rate |
| Willingness to Pay | >3 users pay $50 for manual report | Users say they will pay, but don't | "This should be free" feedback |
| Privacy Sentiment | Accept role-based estimates | Hesitant, need strict permissions | Refuse to share salary/calendar data |
| NPS (Wizard of Oz) | >40 | 20-40 | <20 |
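The numeric rows of the matrix can be encoded as a simple threshold function so the week-8 decision is mechanical rather than debated. A sketch (the input results are hypothetical, and the aggregation rule is an assumption, not part of the matrix):

```python
def traffic_light(value, green, yellow):
    """Map a numeric metric to Go / Pivot / Stop using the matrix's
    lower bounds (green >= yellow)."""
    if value >= green:
        return "Go"
    if value >= yellow:
        return "Pivot"
    return "Stop"

# Hypothetical week-8 results scored against the matrix thresholds
results = {
    "problem_validation_pct": traffic_light(82, green=80, yellow=50),
    "landing_conversion_pct": traffic_light(3.4, green=5, yellow=2),
    "nps": traffic_light(45, green=40, yellow=20),
}

# Aggregation rule (an assumption): any Stop kills the project,
# any Pivot forces a pivot, otherwise Go.
verdicts = set(results.values())
decision = "Stop" if "Stop" in verdicts else "Pivot" if "Pivot" in verdicts else "Go"
print(results, "->", decision)
```

The qualitative rows (Willingness to Pay, Privacy Sentiment) still need human judgment; the point of scripting the numeric rows is to keep the Go/No-Go call honest when the data is mixed.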

Research Synthesis Template

To be completed after Week 8.

  • Validated Pain Points: List top 3 confirmed problems...
  • Unexpected Findings: Surprises or counter-intuitive data...
  • Pricing Verdict: Optimal price point & model...
  • Acquisition Channels: Where users hang out & best CAC...