0. How to use this playbook
The Forum AI Venture Studio has launched 17 companies over the past three years. From our first batch of 13, seven have already raised more than $2 million.
Our secret to producing great startups is simple: combine great founders with critical problems in massive markets.
One of the biggest reasons we’ve been successful is volume. We look at a huge number of ideas. We analyze 10 to 20 every week and have reviewed more than 1,000 in total. To make that possible, we built a framework that lets us evaluate ideas fast, kill the bad ones quickly, and let the best rise to the top.
This playbook is that framework. If you’re a founder searching for an idea and need a clear way to decide if it’s worth building, this is for you.
At the end, you’ll find templates and documents for every step of the process to help you move fast and stay consistent.
Link to templates
1. Desk research sprint
This phase is about speed and focus. You will kill weak ideas quickly and highlight a shortlist worth customer conversations. You will capture the logic in a concise discovery doc and back it with sources so decisions are easy to audit later.
Here is a sample doc we use to conduct research. We call these discovery docs, and completing this doc is the first step in the validation process.
1.1 Venture-scalable market check
Validate that the upside fits a venture outcome or has a credible path. You are checking size, growth, and the catalyst that will change spending behavior. If you cannot explain the catalyst and the buyer who feels pain today, the idea likely does not pass.
What we look for at Forum:
- Market size and scalability: The target market should exceed $10B in total addressable value, or demonstrate a credible path to that scale within 5–7 years. We calculate this using bottom-up unit economics (number of potential buyers × expected annual contract value), not top-down estimates.
- Market growth and catalysts: The market must be expanding at a measurable rate (CAGR >15%) or show evidence of near-term acceleration driven by a structural catalyst—such as regulation, technological adoption, or budget reallocation.
- Small markets can still qualify if they show high compounding potential. For example, AI security and model governance tools started with limited spend but grew exponentially as AI adoption created mandatory compliance needs.
Pass if: you can describe the catalyst, the rate of growth, and the buyer who feels the pain today.
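The bottom-up math above (potential buyers × expected annual contract value, projected forward at the market's growth rate) can be sanity-checked in a few lines. The figures below are hypothetical placeholders for illustration, not Forum data.

```python
# Bottom-up TAM sketch. All figures are hypothetical placeholders.
def bottom_up_tam(potential_buyers: int, annual_contract_value: float) -> float:
    """TAM = number of potential buyers x expected annual contract value."""
    return potential_buyers * annual_contract_value

def projected_tam(tam_today: float, cagr: float, years: int) -> float:
    """Project today's TAM forward at a constant compound growth rate."""
    return tam_today * (1 + cagr) ** years

tam_today = bottom_up_tam(potential_buyers=40_000, annual_contract_value=25_000)
print(f"TAM today: ${tam_today / 1e9:.1f}B")  # $1.0B
# A $1B market compounding at 40% crosses the $10B bar within 7 years.
print(f"TAM in 7 years at 40% CAGR: ${projected_tam(tam_today, 0.40, 7) / 1e9:.1f}B")
```

This is how a "small market with high compounding potential" can still pass the check: the current number misses the $10B bar, but the catalyst-driven growth rate puts it there within the 5 to 7 year window.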
1.2 Competition and funding heat
You want proof of appetite without an entrenched wall of modern lookalikes. Early rounds indicate demand and investor interest. A crowd of well-funded, recent companies solving the same thing signals a hard path to wedge in.
What we look for at Forum:
- Signal-to-noise in funding activity: We look for early-stage investment velocity—a handful of pre-seed and seed rounds, possibly a single Series A, which signals validated demand and investor curiosity without market saturation.
- Market saturation threshold: Avoid spaces dominated by Series B–D players with deep capital and modern product parity. A high concentration of well-funded companies with overlapping value propositions indicates limited entry surface and high CAC pressure.
- Legacy market conditions: The presence of outdated incumbents (founded pre-2010) is a positive signal when they rely on manual processes, service-heavy delivery, or low customer NPS, as it implies latent demand for workflow automation and AI leverage.
Advance if there’s measurable funding momentum indicating demand but still whitespace for entry.
Kill if there’s no capital activity, no clear catalyst, or the market is crowded with well-funded modern competitors.
1.3 Right to win
State the edge that gets your first ten customers to choose you. This should be specific, testable, and tied to distribution, data, or workflow advantage. If your advantage is just “we execute better,” you do not have one yet.
Spell out your edge:
- Proprietary access. Customers. Data. Distribution.
- Domain expertise that shortens sales or implementation.
- Technical or workflow insight that creates a 10 times better experience.
Advance: if you can state the edge in one sentence and show how it helps the first ten customers.
Kill: if the plan depends on being generally smarter or faster than everyone else.
At the end of this process you should have a completed discovery doc. You should feel solid about the market and competition, and confident moving into customer interviews with a base understanding of the problem and industry.
2. Customer problem discovery
This phase turns guesses into evidence from budget owners. You are validating pain, frequency, and cost, and you are mapping the buying motion. The goal is not friendly feedback. The goal is design partners and pre-commit signals.
2.1 Start with an initial ICP
Narrow to a reachable buyer where the pain shows up in a specific workflow. Write down firmographics, the economic owner, and where the pain is visible in daily work. This keeps calls focused and comparable.
Define a narrow buyer you can actually reach:
- Firmographics: industry, size, geography, compliance regime.
- Role and P&L ownership: who feels the pain and controls the budget.
- Workflow where the pain shows up.
- Example: Mid-market health systems in the US. 5 to 20 clinics. COO or RevOps. Prior authorization workflow with multi-system data pulls.
2.2 Book the meetings
Run targeted outreach that mixes cold and warm so signals are unbiased. Use your network for speed but hold yourself to a cold quota to avoid friend bias. Track your hit rates to improve copy and targeting.
Channels we use:
Founder outbound: 50 to 100 targeted LinkedIn messages across two weeks.
Personal networks: We tap our own network and those close to us for friendly intros.
Targets:
- 10 to 20 conversations in two weeks.
- Per 100 messages we like to see at least 3 generated meetings; anything over 10 meetings is a huge success.
- We recommend 30 percent of meetings coming from cold outreach to avoid friend bias.
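The targets above reduce to a quick funnel check. The helper below is a hypothetical sketch that just encodes the thresholds from this section: at least 3 meetings per 100 messages, over 10 meetings as a huge success, and 30 percent of meetings from cold outreach.

```python
# Quick outreach funnel check against the targets above (hypothetical helper).
def outreach_check(messages_sent: int, cold_meetings: int, warm_meetings: int) -> dict:
    total = cold_meetings + warm_meetings
    per_100 = total / messages_sent * 100            # meetings per 100 messages
    cold_share = cold_meetings / total if total else 0.0
    return {
        "meetings_per_100": per_100,
        "hits_floor": per_100 >= 3,                  # at least 3 per 100 messages
        "huge_success": total > 10,                  # anything over 10 meetings
        "enough_cold": cold_share >= 0.30,           # 30% cold to avoid friend bias
    }

print(outreach_check(messages_sent=100, cold_meetings=4, warm_meetings=8))
```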
Outreach best practices
Small tweaks compound your acceptance rate. Be clear you are not selling. Connect before messaging. Be specific about the workflow and personalize to prove you did the work. These details separate you from spam.
Best practices:
- Say you are not selling anything. Include a line like “I should mention I am not trying to sell anything.”
- Send a connect request without a message. After they connect, follow with a message. We have tested this and it performs better.
- Be direct and specific. Reference the exact workflow and problem.
- Personalize where possible. Mutual connections. School. A recent post. Anything that proves this is not spam.
Sample message
Use clear, non-sales language to open the door. Anchor on their workflow and your intent to learn. Keep it short. Ask for a specific time window.
“Hi [Name].
I am interviewing operators about [specific problem] in [industry] as part of a problem discovery sprint. I am not selling anything. Given your experience with [specific aspect], could we do a 20 to 30 minute call next week to walk through your current workflow?
Best,
[Your name]”
2.3 Sample questions for the interviews
These prompts quantify pain and surface buying motion. Push for specifics. Ask about frequency, cost, and ownership.
Questions we like to ask
- Walk me through your current workflow for [problem] from trigger to completion. Where does it break most often.
- In the past 30 days, how many times did this issue occur. What did each incident cost in time or dollars.
- What workarounds do you use today. What makes them fail or create risk.
- Who feels this pain the most and who owns the budget to fix it. How does a purchase like this usually get approved.
- Have you purchased any software or invested resources in solving this problem before.
After 10 to 20 conversations, you should have an understanding of the problem. What we’re looking for is:
- A big pain point that is costing time and money
- A similar problem across the stakeholders
- Previous investment into a solution
How to track
Keep evidence organized and searchable from day one. Use a meeting recorder for accuracy and a simple spreadsheet to aggregate signals across calls. This lets you spot patterns fast and keep track of conversations.
Step 1. Record every conversation
- Ask for permission at the start. One sentence is enough.
- Use a meeting recorder. Fireflies. Grain. Fathom. Zoom native.
- Title the recording with a standard format: YYYY-MM-DD. Company. Name. Role. Topic.
- Add a 3 to 5 sentence summary immediately after the call while details are fresh.
Step 2. Log the data in a spreadsheet
Create a single source of truth. One row per conversation. Add columns that map to your discovery goals so you can sort and filter.
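One way to bootstrap that spreadsheet is a plain CSV with one row per conversation. The column names below are an illustrative schema mapped to the discovery questions in 2.3, not a Forum template; adapt them to your own goals.

```python
import csv

# Illustrative column schema -- one row per conversation, adapt as needed.
COLUMNS = ["date", "company", "contact", "role", "source",  # source: cold or warm
           "pain_severity_1_5", "incidents_last_30_days", "cost_per_incident",
           "budget_owner", "prior_spend", "summary"]

with open("discovery_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=COLUMNS)
    writer.writeheader()
    # Hypothetical example row from a discovery call.
    writer.writerow({
        "date": "2024-05-02", "company": "Acme Health", "contact": "J. Doe",
        "role": "COO", "source": "cold", "pain_severity_1_5": 4,
        "incidents_last_30_days": 12, "cost_per_incident": "2 hours",
        "budget_owner": "yes", "prior_spend": "internal tool", "summary": "...",
    })
```

Because every row shares the same columns, you can sort by severity or filter to cold-sourced calls when you check for friend bias.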
Investment Memo
You can find the first draft investment memo template here. Use it to structure your findings, clarify your reasoning, and ensure each decision is backed by evidence from your research and customer conversations.
3. Customer solution discovery
Now you move from problem evidence to a testable workflow and commitment. You will present a simple flow, test data access, and ask for a design-partner cadence. The goal is feasibility on real data and willingness-to-pay signals.
3.1 Create a solution hypothesis
Make a simple, testable first slice tied to one measurable outcome. Name the user, the buyer, the inputs, and the constraints. Include a price anchor to test commercial reality early.
Define a simple, testable approach before you show anything.
- Problem slice. Pick one workflow you can improve end to end.
- User and buyer. Name the operator who uses it and the budget owner who signs.
- Outcome. One measurable result. Time saved. Errors reduced. Revenue unlocked.
- Inputs and data. Exact sources. Access method. Legal path.
- Constraints. What you will not do in this first slice.
- Price anchor. A draft package and monthly price tied to the outcome.
Example: For prior authorization teams in mid-market health systems. Automate document gathering and form population. Inputs are EHR exports and payer PDFs. Target 40 percent time reduction per case. Pilot at 2,500 dollars per month for two clinics.
3.2 Book the meetings
Reach back out to the buyers who showed real pain and walk them through a simple version of your proposed workflow. Ask if they’d be open to joining a short biweekly working session to refine it together. Keep running outbound and talking to new people too; you want fresh, unbiased signals, not just validation from the same friendly faces.
Best practices:
- State you are not selling a finished product. You are validating a workflow.
- Bring a one-page or three-slide deck that shows before and after.
- Be explicit about the time ask. Biweekly 30 minute working session for four to six weeks.
- Personalize with proof you understand their workflow and constraints.
3.3 Sample message
Frame this as workflow validation, not a sales pitch. Promise a short review and propose a recurring cadence if it fits. Make it easy to say yes.
Hi [Name].
I’m currently working on a solution to [specific problem] in [industry]. I am not selling anything. I want your feedback on a simple workflow that could reduce [time or cost] for your team. Open to a 20 to 30 minute call next week to review the flow and discuss a short biweekly working session if it fits?
3.4 Run solution discovery calls
Show the flow, pressure test reality, and ask for commitment. Keep the demo short and use their language. Close with a concrete request for cadence, data samples, and pilot scope.
Keep it concrete. Show the flow. Ask for commitment.
Call flow:
- Recap their problem in their words. One minute.
- Walk through the proposed workflow. 5 to 7 minutes.
- Pressure test data access and edge cases. 10 minutes.
- Show a draft price card. 2 minutes.
- Give time for Q&A.
- Close with next steps. Design partner cadence. Data samples. Pilot scope.
Sample questions for solution discovery.
- In this workflow, which step feels most unrealistic. What would break first in your environment.
- What data can you actually provide in week one. How would you export it. Who needs to approve access.
- If we delivered a 40 to 50 percent reduction in time on three live cases, what would be the path to a paid pilot.
- What risks would block you from testing this. Compliance. Accuracy. Change management.
- Which metric matters most for you to justify a pilot to your leadership. Time saved. Throughput. Error rate. Dollars.
Post-sprint scoring. How we decide what lives
This is the final gate after the two-week sprint. The default posture is kill. Assume the idea is flawed. Only advance if it clears a high bar: urgent pain. Clear buyer. Large market. Fast GTM. Fast execution. If in doubt, pause or kill.
How to run the scoring
- First, do your own scoring.
- Then run the scoring through GPT. You can see our full prompt here.
1. Problem clarity. Score 1–10
Plain English definition: What job or workflow is broken. Who feels it. Why it hurts now.
Questions to answer.
- What is broken in simple terms.
- Who has the pain. Title. Company size. Situation.
- Why it is painful. Time lost. Money lost. Fines. Missed deadlines.
- Current workaround. Spreadsheets. Consultants. Internal tools.
- Why now. Regulation. Turnover. Recession. New law or vendor change.
We like. Problems triggered by new regulations or macro shifts. A sharp wedge into a larger market.
We do not like. Generic problems like “AI writes better emails.” Nice-to-haves.
Scoring help.
1–3: Hand-wavy pain. Conflicting users. No buyer.
4–6: Some pain stories but inconsistent. Workarounds are “fine.”
7–8: Repeated, quantified pain from budget owners.
9–10: Urgent pain tied to deadlines or penalties with clear evidence.
2. Solution fit. Score 1–10
What the product actually does and why it is a must-have.
Questions to answer.
- What is automated or massively improved. No jargon.
- Role of AI. What is newly possible now. Humans in the loop today.
- Can an MVP be built in six months by one product lead and one engineer.
- What scope can be cut to reach usefulness fast.
We like. AI unlocks a previously impossible or impractical step. Humans in the loop now with a path to more automation.
We do not like. Deep tech that needs years of R&D. Horizontal tools with no clear buyer.
Scoring help.
1–3: Vague feature list. Magic AI claims.
4–6: Clear concept but either nice-to-have or too big for six months.
7–8: Crisp workflow. Thin slice MVP is feasible in six months.
9–10: Must-have. Thin slice demonstrably removes a painful step.
3. Market potential. Score 1–10
Plain English definition: Who pays. How much. How many exist.
Questions to answer.
- Bottom-up math. Count of accounts in the ICP times expected price.
- Who signs the check.
- Replacement spend versus net-new budget.
- Path to a $100 million outcome.
We like. Clear budget line and economic buyer.
We do not like. Small, fragmented niches with no budget owner.
Scoring help.
1–3: TAM hand-waving. “Lots of companies.”
4–6: Some math but soft on pricing or buyer.
7–8: Solid bottom-up math with buyer and price validated in calls.
9–10: Large and growing market with a strong catalyst.
4. Competitive landscape. Score 1–10
Plain English definition: Who else is solving this and why we can win.
Questions to answer.
- Legacy players. Growth-stage startups. Early startups.
- Any well-funded or fast-growing players.
- Their strengths and weaknesses.
- Our angle. Faster. Cheaper. Simpler. More focused.
We like. A few early companies raising funding. Weak legacy adoption.
We do not like. Series A to C players with strong AI adoption and big war chests. No activity at all or a single dominant player.
Scoring help.
1–3: Either a wall of capable competitors or no activity plus no catalyst.
4–6: Some room to wedge but not obvious.
7–8: Evident angles against incumbents and early peers.
9–10: Clear, defensible wedge with proof prospects prefer it.
5. Go-to-market. Score 1–10
Plain English definition: Who buys and how we reach them fast.
Questions to answer.
- Buyer title and department.
- Outreach channels that already worked. Outbound. Niche communities. Partners.
- Do they already buy software here.
- Will they pilot without polish or brand trust.
- Sales cycle length.
We like. Clear buyer. Easy to find ICP.
We do not like. Unclear buyer in a horizontal market. Crowded categories like AI sales or marketing.
Kill if. Buyer is unclear. Unreachable. Or the sales cycle is obviously slow and political.
Scoring help.
1–3: “We’ll try content and PLG.” No buyer defined.
4–6: Some meetings from friends. Weak cold results.
7–8: Repeatable channel with cold reply data and design-partner interest.
9–10: Short path to pilots. Early deposits or LOIs from outbound.
6. Execution risk. Score 1–10
Plain English definition: What might break in build, data, AI, legal, or customer operations.
Questions to answer.
- Missing data. Edge cases. AI failure modes.
- Regulatory or legal red tape. Example: “SOX requires documented controls for financial workflows.”
- Customer friction. Data silos. No internet. Paper workflows.
We like. Risks named with mitigation and proof from the spike.
We do not like. Hidden dependencies on unavailable data or approvals.
Kill if. Risks are high and cannot be mitigated quickly.
Scoring help.
1–3: Unknowns across data, AI, and legality.
4–6: Known risks but thin mitigation.
7–8: Risks bounded by feasibility spike and access path.
9–10: Low risk. Clear legal path. Proven on live samples.
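The six rubric dimensions above can be aggregated with a short script. The floor and bar thresholds below are illustrative assumptions, not Forum's actual cutoffs; the one thing taken directly from this section is the default posture, which is kill.

```python
# Aggregate the six rubric scores (each 1-10). Thresholds are illustrative
# assumptions, not Forum's actual cutoffs. Default posture is kill.
DIMENSIONS = ["problem_clarity", "solution_fit", "market_potential",
              "competitive_landscape", "go_to_market", "execution_risk"]

def decide(scores: dict, floor: int = 5, bar: float = 7.0) -> str:
    if set(scores) != set(DIMENSIONS):   # incomplete scoring -> kill by default
        return "kill"
    if min(scores.values()) < floor:     # one weak dimension sinks the idea
        return "kill"
    avg = sum(scores.values()) / len(DIMENSIONS)
    return "advance" if avg >= bar else "pause"

print(decide({"problem_clarity": 8, "solution_fit": 7, "market_potential": 8,
              "competitive_landscape": 7, "go_to_market": 9,
              "execution_risk": 7}))  # advance
```

Requiring both a per-dimension floor and a high average mirrors the "if in doubt, pause or kill" posture: a single hand-wavy dimension cannot be averaged away by strong scores elsewhere.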
Run this playbook
At Forum Ventures, we’ve built this framework to help founders move fast, think critically, and validate ideas with evidence, not hope. The best companies don’t start with perfect ideas. They start with a clear problem, a sharp insight, and a team willing to test, kill, and refine relentlessly until something undeniable emerges.
If you’re a founder who thrives in that environment, who wants to go from concept to company with real support, capital, and a full team behind you, we’d love to hear from you.
You can learn more about our venture studio and apply to build with us here, or reach out to me directly on LinkedIn.
We’re always looking for founders who think deeply, move fast, and want to build something that actually matters.
Link to templates