
The Behavioral Health Crisis Is a Data Problem
with Lauren Larsen, Videra Health
Show Notes
The average patient gets seen and then disappears. No data, no signal, no clue what happens next. That gap is where people spiral, where diagnoses get delayed for years, where $280 billion quietly burns every year. The system is not missing doctors. It is missing continuity.
Lauren Larsen is the CEO of Videra Health. Before that she co-built HireVue, the company that let employers interview a hundred thousand candidates in a week and used AI to surface the best ones. When HireVue was sold in 2019, Lauren took the core insight — that you can understand a person deeply from audio and video — and applied it to the most broken system she could find: behavioral healthcare.
Videra Health builds AI that checks in on patients between appointments. It hears their voice, sees their face, tracks tiny changes over time — not what they say, but how they say it. The breakthrough is not diagnosis. It is continuous context. And this episode is about what founders can steal from that insight for any product driven by human decision-making.
From HireVue to Videra: The Same Insight, a Harder Problem
HireVue proved one thing at massive scale: you can understand a person from a short video clip more accurately than most humans can from a resume. By 2019, that system had interviewed over a hundred million people for jobs. Lauren sold the company and immediately started asking a harder question: if AI could see through bias in hiring, what else could it see?
The answer was behavioral health. When you leave a doctor, no one thinks about you again until your next appointment. That gap is where substance use relapses happen, where mental health crises deepen, where medication stops working and nobody knows. Lauren and her co-founder — a clinical psychologist — set out to close that gap with AI check-ins that are consistent, structured, and deeply personalized over time.
Videra Health launched in 2019. Today it works with large health systems treating substance use, eating disorders, and mental health conditions. It has reduced admin time by up to 88 percent and turned routine check-ins into a predictive data layer. None of its customers have voluntarily churned.
Frameworks from This Episode
These frameworks have been added to the AI for Founders Frameworks Library. Filter by Healthcare or Lauren Larsen to find them.
The Continuity Gap
The most valuable data in healthcare — and in most SaaS products — exists between appointments, not during them. Building for the gap is the actual product opportunity.
- Healthcare sees the people who are good at making appointments, not the people who actually need to be seen. Most SaaS products have the same problem: they only hear from engaged users.
- The gap between one patient visit and the next is where all the signal lives: medication adherence, side effects, behavioral changes, early warning signs.
- An AI that checks in consistently gives you longitudinal data — not snapshots, but trends. A baseline deviation is far more informative than any single reading.
- The principle applies everywhere: medication management, employee burnout, customer health scores, post-onboarding drop-off. Behavior is the ultimate data set.
- The question for every founder: where does your product go dark on your user? That dark period is your product gap and your next feature.
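The baseline-deviation idea above can be sketched in a few lines of plain Python. This is a hypothetical illustration, not Videra's actual model: compare each new check-in score against that user's own rolling baseline, and flag it when it drifts more than a couple of standard deviations.

```python
def flag_deviations(readings, window=7, threshold=2.0):
    """Flag readings that deviate from the user's own rolling baseline.

    readings: chronological list of numeric check-in scores for ONE user.
    A reading is flagged when it sits more than `threshold` standard
    deviations from the mean of the previous `window` readings.
    """
    flags = []
    for i, x in enumerate(readings):
        history = readings[max(0, i - window):i]
        if len(history) < 3:  # not enough baseline yet
            flags.append(False)
            continue
        mean = sum(history) / len(history)
        sd = (sum((h - mean) ** 2 for h in history) / len(history)) ** 0.5
        dev = abs(x - mean)
        # With a perfectly flat baseline (sd == 0), any movement is a signal.
        flags.append(dev > threshold * sd if sd > 0 else dev > 0)
    return flags
```

The point is the trend, not the reading: a 20 means little on its own, but a 20 from a user who has reported 5s for a month is exactly the kind of deviation a single snapshot would miss.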
The Paying Problem Filter
Cool technology that no one will pay for is not a product. Before you fall in love with your insight, find the customer whose budget line it solves.
- Videra's early pitch generated universal enthusiasm and almost zero signed contracts. People would say "everyone should do this" and then decline to buy it themselves.
- The problem: continuous patient monitoring does not have a direct billing code. Providers do not get paid more for following up with patients. They get paid for visits.
- Lauren had to add features with clearer ROI — intake automation, clinical documentation — to fund the vision. The mission needed a business model attached to it.
- The filter is simple: not "is this valuable?" but "who has a budget line for this and what does it replace?"
- For healthcare specifically: find the person who benefits from more patients being diagnosed when a treatment already exists. That is how Videra cracked pharmaceutical partnerships.
Build for Signal, Not Assumption
When training models for human behavior, go in with as few assumptions as possible. Let the data surface the features. Your intuitions about what matters are often wrong.
- The temptation is to say "squinting eyes mean X" or "flat voice tone means Y" before you have the data to back it up. Starting from assumptions bakes bias in from the beginning.
- Videra's approach: collect voice, video, and language features broadly, run structured clinical screeners alongside them, and let the model find which features actually predict the outcome.
- Then apply the sniff test: does it make sense that this feature is predictive? And does it hold across demographic groups?
- The bias version of this failure is Amazon's hiring model — trained only on high-performing men, it learned to predict that men perform better. The data was wrong because the training set was wrong.
- One critical behavioral predictor Videra found: whether a substance use patient sleeps at night predicts treatment completion and sobriety at 12 months better than almost any other signal. Nobody assumed that. The data surfaced it.
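"Let the data surface the features" can start as something very simple: rank every candidate feature by how strongly it tracks the outcome, with no feature privileged up front. The sketch below is a minimal, hypothetical version using absolute Pearson correlation; the feature names (`sleeps`, `sessions`) are illustrative, not Videra's actual feature set.

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def rank_features(rows, features, outcome):
    """Rank candidate features by |correlation| with the outcome.

    rows: list of dicts, one per patient, holding feature values and
    the screener outcome. No assumption about which feature matters --
    the data does the ordering.
    """
    ys = [r[outcome] for r in rows]
    score = {f: abs(pearson([r[f] for r in rows], ys)) for f in features}
    return sorted(features, key=score.get, reverse=True)
```

The sniff test comes after the ranking: rerun the same ranking within each demographic group and check that the top features hold up everywhere before trusting them.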
The Bias Audit Triad
Three practices that prevent AI models from encoding discrimination: clean and representative data, demographic testing before launch, and continuous monitoring for drift.
- Step 1 — Clean data: make sure the training set is not obviously wrong. A model trained only on one demographic will encode that demographic's patterns as universal truth.
- Step 2 — Model cards: for every model you ship, build a table showing false positive and false negative rates across demographic groups and intersectional groups. Not just "Black" and "women" — Black women is a separate intersectional group.
- Step 3 — Monitoring: models drift as populations shift. A model that is fair on launch can become biased as the user mix changes. Set up ongoing drift detection.
- In hiring, discrimination is legally regulated. In healthcare, it is just unethical — and often invisible. That absence of regulation means the responsibility falls entirely on the builder.
- The goal is not equal treatment across groups. It is precise treatment. Men and women need different things at different points in their healthcare journey. Precision requires knowing which group you are serving.
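The model-card step reduces to a small piece of bookkeeping. A minimal sketch (group labels and data are hypothetical): tally false positives and false negatives separately for every group, using tuples so intersectional groups like ("Black", "woman") get their own row in the table.

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Build a model-card table of per-group error rates.

    records: iterable of (group, y_true, y_pred) tuples, where group is
    an intersectional label such as ("black", "woman") and y_true/y_pred
    are 0/1 outcome and prediction. All data here is illustrative.
    Returns {group: {"fpr": ..., "fnr": ...}}.
    """
    counts = defaultdict(lambda: {"fp": 0, "fn": 0, "neg": 0, "pos": 0})
    for group, y_true, y_pred in records:
        c = counts[group]
        if y_true:
            c["pos"] += 1
            c["fn"] += 0 if y_pred else 1  # missed a true positive
        else:
            c["neg"] += 1
            c["fp"] += 1 if y_pred else 0  # flagged a true negative
    return {
        group: {
            "fpr": c["fp"] / c["neg"] if c["neg"] else 0.0,
            "fnr": c["fn"] / c["pos"] if c["pos"] else 0.0,
        }
        for group, c in counts.items()
    }
```

Read the table for gaps before launch, then keep generating it on live traffic — the same function doubles as the drift monitor in Step 3.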
Founder Experiment: Run a Disappearance Audit
Videra was built to solve the moment a patient walks out of a doctor's office and disappears. Your product has the same problem. Users onboard, engage for a while, and then go quiet — and you have no idea whether they churned, got busy, or silently hit a wall. Here is how to find your version of the gap and build toward it.
1. Pull your last 90 days of user activity and identify the moment each user went silent. Not churned — just went dark. What was the last action they took before disappearing? That last action is your signal.
2. Segment the disappeared users by what they did before going dark. Group by feature used, step completed, or message received. Look for patterns. If 60 percent of users disappeared after the same workflow step, that step is your gap.
3. For your highest-value users who have gone quiet, design a single automated check-in that asks one specific question about their experience — not a satisfaction survey, a product-specific question that requires a real answer. Measure how many respond and what they say.
4. Pick the segment with the highest LTV who went quiet earliest. What would it take to build a lightweight signal from them between their active sessions? A weekly prompt, a behavioral trigger, a passive usage indicator? Spec out the smallest possible version of that system.
5. Run Lauren's test: take one behavioral signal you currently ignore — login time of day, scroll depth, response latency — and check whether it predicts churn before the user knows they are about to churn. You may already have a predictive model hiding in your data.
Stretch goal: Find one metric in your product that nobody tracks because it is inconvenient to measure — the behavioral equivalent of "is the patient sleeping?" Lauren's team discovered sleep was the single best predictor of substance use treatment success. Your version of that metric is sitting in your data right now, unasked. Find it and build a dashboard around it before your next board meeting.
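Steps 1 and 2 of the audit can be prototyped in plain Python before touching your analytics stack. A hedged sketch, assuming your event log exports as (user_id, timestamp, action) tuples — the format and thresholds here are assumptions, adjust to your own data:

```python
from collections import Counter
from datetime import datetime, timedelta

def disappearance_audit(events, now, quiet_after=timedelta(days=14)):
    """Group 'gone dark' users by the last action they took.

    events: iterable of (user_id, timestamp, action) tuples from an
    event-log export (hypothetical format). A user is dark when their
    most recent event is older than `quiet_after`.
    Returns a Counter mapping last-action -> number of dark users.
    """
    last_seen = {}
    for user, ts, action in events:
        if user not in last_seen or ts > last_seen[user][0]:
            last_seen[user] = (ts, action)
    return Counter(action for ts, action in last_seen.values()
                   if now - ts > quiet_after)
```

If one last-action dominates the counter, that workflow step is your gap — the place to aim the check-in from step 3.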
Key Terms
These terms have been added to the AI for Founders Glossary. Search by Lauren Larsen to filter them.
Tools from This Episode
Videra Health
AI-powered behavioral health platform that checks in on patients between appointments via audio and video. Detects observational biomarkers — voice, facial movement, language — to surface patients who need attention before they reach crisis. Also offers clinical documentation automation (reducing admin time by up to 88%) and intake automation. Sells to large health systems and pharmaceutical companies. HIPAA compliant.