Legal AI: Why Lawyers Are Finally Free to Think
April 15, 2026 · 00:49:54

with Devansh, Irys


Show Notes

Devansh walked into the legal tech market and saw a graveyard of point solutions. Word doc plugins. Document hosting tools. Niche contract reviewers. Each one promising to make attorneys more efficient, and each one adding another tab to an already fragmented workflow.

That is the problem Irys was built to eliminate. Devansh, co-founder of Irys and creator of the Chocolate Milk Cult, did not set out to make a better legal AI tool. He set out to rebuild the infrastructure underneath legal work entirely.

Frameworks from This Episode

The Fragmentation Problem

Most legal AI today is what Devansh calls "a system prompt wearing a trench coat": a niche product wrapping a general-purpose model and charging per word or per page for the privilege. Small and mid-sized law firms end up stitching together ten point solutions that share no context with one another.

Irys attacks this from the foundation:

  • Built ground-up as a full end-to-end legal platform, not a wrapper.
  • Processes unlimited documents without vector search limitations.
  • Builds entity maps and relationship graphs across the entire document set.
  • Flags contradictions, jurisdictional mismatches, and contextual gaps that RAG-based systems miss.
  • Delivers a transparent, auditable thinking trace so attorneys can verify every recommendation.
  • Runs 50 to 60 argument simulations and identifies which ones are likely to succeed.

The Three Categories of Hallucination

Devansh breaks legal AI hallucinations into three categories. The third is the most dangerous, and the hardest to catch with traditional vector search.

1. Citation hallucinations: The AI cites a case that does not exist.
2. Applicability hallucinations: The case exists, but the jurisdiction, domain, or context makes it inapplicable.
3. Context hallucinations: The AI misses a relationship between documents, where one document modifies, contradicts, or conditionally applies to another. Irys addresses this with a self-updating knowledge graph that links entities, propositions, and assertions across the entire document set.
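To make the knowledge-graph idea concrete, here is a minimal sketch of how linked propositions can surface a context hallucination that chunk-level retrieval would miss. The documents, entities, and attribute names are invented for illustration; this is a toy, not Irys's actual graph.

```python
from collections import defaultdict

# Toy propositions extracted from a document set:
# (entity, attribute, value, source document). All names are hypothetical.
propositions = [
    ("Lease-2024", "termination_notice_days", "30", "master_agreement.pdf"),
    ("Lease-2024", "termination_notice_days", "60", "amendment_2.pdf"),
    ("Lease-2024", "governing_law", "New York", "master_agreement.pdf"),
]

def find_contradictions(props):
    """Group assertions by (entity, attribute) and flag any that disagree across sources."""
    graph = defaultdict(list)
    for entity, attr, value, source in props:
        graph[(entity, attr)].append((value, source))
    conflicts = []
    for (entity, attr), assertions in graph.items():
        if len({value for value, _ in assertions}) > 1:
            conflicts.append((entity, attr, assertions))
    return conflicts

for entity, attr, assertions in find_contradictions(propositions):
    print(f"{entity}.{attr} disagrees across sources: {assertions}")
```

Because the two notice-period assertions are linked to the same entity and attribute, the conflict is visible even though the two source documents share almost no vocabulary, which is exactly what similarity search struggles with.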

The Democratization Mission

Devansh grew up watching legal inaccessibility cause real harm: civil cases in India carry a ten-year backlog, and New York City tenants get bullied by landlords because they cannot afford to fight back. His co-founder, a former Big Law attorney, had lived the inefficiency from the inside.

  • Irys is free to sign up; the access barrier is the first problem to eliminate.
  • Devansh open-sources parts of the stack, including latent space reasoning work he believes will define the next generation of AI reasoning models.
  • The platform is positioned not just as a tool for firms, but as infrastructure for justice.
  • Their shared conviction: there is no technical reason legal work has to take this long or cost this much.

Founder Experiment: Build a Document Contradiction Detector

Take the core insight from this episode and build a lightweight version of Irys's cross-document reasoning layer.

  1. Using Cursor or Replit, prompt an LLM to ingest two or more documents and output a structured JSON list of any propositions in Document A that are modified, contradicted, or conditioned by Document B.
  2. Start with two simple contracts. Note how well the model tracks cross-references and conditional clauses.
  3. Try two legal briefs. Watch for applicability errors: does the model catch when one brief cites a case the other brief disputes?
  4. Extend it to three or more documents and observe where the model starts to fail.

The lesson: That failure point, where the model loses track of cross-document relationships, is exactly where a knowledge graph starts earning its keep. You will understand Irys's architecture intuitively after running this experiment once.
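A minimal scaffold for step 1 might look like the sketch below. The prompt wording, JSON schema, and helper names are all assumptions, and the model call is stubbed so the sketch runs offline; swap in your provider's actual client (OpenAI, Anthropic, etc.) when you run the experiment for real.

```python
import json

# Hypothetical prompt for the experiment; the doubled braces escape the JSON example.
PROMPT_TEMPLATE = """You are a contract analyst.
Document A:
{doc_a}

Document B:
{doc_b}

List every proposition in Document A that Document B modifies, contradicts,
or conditions. Respond with JSON only, shaped like:
[{{"proposition": "...", "relation": "modifies|contradicts|conditions", "evidence": "..."}}]"""

def call_llm(prompt: str) -> str:
    # Stub standing in for a real completion call, so the sketch is self-contained.
    return json.dumps([{
        "proposition": "Tenant may terminate with 30 days notice.",
        "relation": "contradicts",
        "evidence": "Amendment requires 60 days written notice.",
    }])

def detect_conflicts(doc_a: str, doc_b: str) -> list:
    raw = call_llm(PROMPT_TEMPLATE.format(doc_a=doc_a, doc_b=doc_b))
    findings = json.loads(raw)  # fails loudly if the model drifts from pure JSON
    # Keep only findings with a relation the experiment defines.
    return [f for f in findings if f["relation"] in {"modifies", "contradicts", "conditions"}]

print(detect_conflicts("30-day termination clause...", "60-day amendment..."))
```

Forcing structured JSON output, then validating the `relation` field, is what makes the failure point observable: when the model loses track of cross-document relationships, the parse or the filter breaks first.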

Key Terms

These terms have been added to the AI for Founders Glossary. Search "Devansh" to filter them.

Point Solution: A narrow software tool designed for a single task within a broader workflow. Devansh's diagnosis: law firms drowning in point solutions is the core problem Irys was built to solve.
RAG (Retrieval-Augmented Generation): A technique where an LLM retrieves relevant chunks of text before generating a response, typically using vector similarity search. Effective for single-document lookup; prone to missing cross-document relational context.
Vector Search: A method of finding relevant documents by measuring mathematical similarity between text embeddings. Effective but blind to relationships between documents that don't share identical language.
Knowledge Graph: A structured representation of entities and the relationships between them, allowing reasoning across large, complex data sets. The architectural core of how Irys tracks contradictions and conditions across document sets.
Context Hallucination: A hallucination that occurs not from false information, but from failure to account for relationships between documents, where one document modifies, contradicts, or conditionally applies to another.
Auditable Thinking Trace: A transparent log of how an AI system arrived at its conclusions, including citations and reasoning steps. Required for legal AI deployments where attorneys must be able to verify and defend every recommendation.
Latent Space Reasoning: A technique for structuring AI inference that Devansh and the Irys team open-sourced, positioned as a foundational improvement over standard autoregressive generation.
Agentic System: An AI architecture that takes sequences of actions autonomously, replanning based on new information rather than waiting for human input at each step.
OCR (Optical Character Recognition): Technology that converts images of text, including handwritten annotations, into machine-readable text. Critical for legal work, where documents often include handwritten notes and stamped modifications.
Zero Data Retention: A data policy under which LLM providers do not store or train on user inputs. A table-stakes requirement for any legal AI handling privileged client communications.

Q&A

What is Irys and how is it different from other legal AI tools?

Irys is a full end-to-end legal platform built from the ground up, not a wrapper around a general-purpose model. Where most legal AI tools are point solutions that address one task, Irys processes entire document sets, builds entity and relationship graphs, flags cross-document contradictions, and delivers an auditable thinking trace so attorneys can verify every recommendation.

What are the three types of legal AI hallucinations?

Devansh breaks them into citation hallucinations (the AI cites a case that doesn't exist), applicability hallucinations (the case exists but the jurisdiction or context makes it inapplicable), and context hallucinations (the AI misses a relationship between documents where one modifies or contradicts another). The third is the most dangerous and the hardest to catch with traditional RAG systems.

Why does RAG fail for complex legal work?

RAG retrieves relevant chunks based on vector similarity, which works well when the answer is contained within a single passage. Legal work often requires reasoning across relationships between documents, where one contract modifies another, or where one brief disputes a case cited in a second brief. Those cross-document relationships don't surface reliably through similarity search alone.
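The failure mode above can be demonstrated with a toy similarity function. Bag-of-words cosine is a crude stand-in for embedding search (real embeddings capture more than word overlap), but the clauses below are chosen to show the same gap: the chunk that merely restates the query scores high, while the amendment that actually overrides it shares no vocabulary and scores near zero. All example text is invented.

```python
from collections import Counter
from math import sqrt

def cosine(a: str, b: str) -> float:
    """Bag-of-words cosine similarity -- a crude stand-in for vector search."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = sqrt(sum(v * v for v in va.values())) * sqrt(sum(v * v for v in vb.values()))
    return dot / norm if norm else 0.0

query = "termination notice period for the tenant"
chunk_restates = "The tenant must give a termination notice period of 30 days."
chunk_overrides = "Amendment 2 supersedes Section 4(b); sixty days written warning applies."

print(cosine(query, chunk_restates))   # high: many shared words
print(cosine(query, chunk_overrides))  # zero: the amendment that matters is never retrieved
```

Retrieval returns the restating chunk and skips the amendment entirely, so the model answers from a clause that has been superseded. That is the relational context a knowledge graph is built to preserve.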

What is the Chocolate Milk Cult?

Devansh's community nickname for the audience of his AI Made Simple newsletter, which reaches over 1.5 million people monthly. The name reflects the newsletter's core mission: making AI concepts genuinely accessible and enjoyable rather than intimidating or technical.

Why is Irys free to sign up?

Devansh's mission is to democratize access to legal infrastructure. He watched legal inaccessibility cause real harm, decade-long civil case backlogs in India, tenants in New York unable to fight landlords because they can't afford attorneys. Removing the signup barrier is the first step toward making legal-grade AI available to people who need it most, not just large firms who can afford enterprise contracts.