Stop designing screens. Start designing patterns.
January 5, 2026 · 00:59:21


with Chris Strahl, Knapsack



Show Notes

Chris Strahl is the co-founder and CEO of Knapsack, a design systems and intelligent product engineering platform that has raised a $10M Series A and counts T. Rowe Price, Edward Jones, AP, and major healthcare and banking institutions among its enterprise customers. Before Knapsack, Chris and his co-founder Evan spent two decades doing large-scale digital product builds for the NFL, NBA, Major League Soccer, Estée Lauder, and Johnson & Johnson - and watched the same painful process repeat at every one of them: six to twelve months from concept to user contact, with the person who identified the problem separated from the thing that gets built by a gauntlet of handoffs and rewrites.

Knapsack exists to collapse that distance. The core insight is that digital products are not unique snowflakes - they are built from patterns. Buttons, navs, cards, heroes, prompts: all of these can be described as systems with rules and constraints. When those patterns are structured correctly, AI can understand them, generate from them, and build enterprise-appropriate product experiences in minutes rather than months.

Pattern Language: Stop Designing Screens

The philosophical foundation of Knapsack comes from Christopher Alexander's book A Pattern Language - originally written about architecture. Every house has walls, floors, windows, and doors: common patterns that describe a living space. Every digital product has the same kind of commonalities. These aren't just visual elements; they are systems of patterns with a set of rules and constraints that govern them.

The shift Chris is arguing for is from instance-level thinking to system-level thinking: stop designing "this screen at this URI will show this thing" and start designing the system that generates those experiences. In an AI-enabled world, that distinction is everything. The person who can think at the system level - who sees their job as composing patterns rather than crafting screens - can now build what no one could build before: 25 segmented experiences for 25 different audiences, shipped from the same underlying pattern system, in a fraction of the time.

The Intelligent Product Engine: Three Parts

Knapsack's core is what Chris calls the Intelligent Product Engine, and it has three components. First: context and archetypes. Every generation process starts with a context - and different archetypes (creative exploration vs. compliance review, for example) set different rules engines that govern what kind of output comes back. If you're running a loose brainstorm, you don't need WCAG AA checking. If you're doing a final release review, you need legal sign-off, accessibility validation, and enterprise release gate compliance. The archetype controls which rules apply.
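The archetype-selects-rules idea can be sketched in a few lines. This is a hypothetical illustration, not Knapsack's actual API; the archetype names and rule identifiers are made up to mirror the examples in the conversation.

```python
# Hypothetical sketch: a context archetype determines which rule sets
# govern a generation pass. A loose brainstorm activates few gates;
# a release review activates all of them. Names are illustrative.
ARCHETYPES = {
    "creative_exploration": {"rules": ["brand_colors"]},
    "compliance_review": {
        "rules": ["brand_colors", "wcag_aa", "legal_signoff", "release_gate"]
    },
}

def rules_for(archetype: str) -> list[str]:
    """Return the rule sets a given archetype activates."""
    return ARCHETYPES[archetype]["rules"]
```

The same underlying model, wrapped in a different archetype, is checked against a different set of gates - which is how "where you are in the product cycle" changes what the engine will accept.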

Second: real-time artifact monitoring. The system constantly pulls from Figma files and git repositories, building and updating artifacts - design systems, brand books, Storybook stories - in real time. These artifacts serve double duty: they're browsable for humans (browse your Legos) and are formatted for AI consumption, since AI does not examine things the way humans do and needs its own purpose-built representations.

Third: the rules engine. Generation rules enforce brand governance, compliance, accessibility, and enterprise release processes. They prevent hallucinated brand colors, circumvented release gates, and accessibility failures. Together these three components make AI generation predictable rather than probabilistic - you can know that the output of the engine will look like a banking application if you work at a bank.
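A single generation rule from that engine might look like the following sketch - a brand-governance check that rejects hallucinated colors. The palette values are invented for illustration; a real rules engine would read them from the design system's tokens.

```python
# Hypothetical sketch of one generation rule: flag any color in a
# generated component that is not in the brand palette. The palette
# tokens here are made-up examples, not a real brand's values.
BRAND_PALETTE = {"#0a2540", "#ffffff", "#f5a623"}

def check_brand_colors(output_colors: set[str]) -> list[str]:
    """Return the off-brand colors found in a generated component."""
    return sorted(output_colors - BRAND_PALETTE)
```

An empty result means the output passes this gate; anything returned is a hallucinated color the engine refuses to ship.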

Prototype and Prune: AI as Skill Proxy, Not Taste Proxy

One of the more resonant ideas in this conversation comes from Scott Belsky (former CPO of Adobe), reiterating a concept coined by Jules at Meta: AI is a great proxy for skill but a poor proxy for taste. The practical implication for product design is what Chris calls "prototype and prune" - the ability to generate multiple real, enterprise-appropriate prototypes quickly using actual design system context, and then have a human with taste select and refine from those options.

This is meaningfully different from traditional comp-to-handoff workflows. In the old model, a designer produces a single artifact (a comp) that represents an abstraction of the thing to be built, which gets handed to engineers who build an approximation of it. In the prototype-and-prune model, AI generates multiple buildable candidates using the real source code of the actual design system - and a human with taste prunes. The bottleneck moves from production to curation. Brad Frost (author of Atomic Design) described the endpoint: sketch an app idea on a cocktail napkin in 30 seconds, hold it to a camera, and have the design system and AI make it real in minutes.

The Role of the Modern Designer (and Why It's a Great Time to Be One)

Chris pushes back firmly on the narrative that AI is removing creativity from product development. His argument: it is a great time to be a designer - as long as you are thinking in systems. The modern designer is a composer: someone who understands how patterns fit together, who can set the rules that govern a system of experiences, and who exercises taste in pruning from the options AI generates.

The design plane that is opening up is one that has never existed before. Every feature can be segmented; 25 different experiences can be exposed to 25 different audiences. Accessibility controls, dark mode, dyslexic-friendly fonts - these things exist today but are expensive and hard to ship. When patterns make them cheaper to produce, there is no reason not to make far more of them. That is a design opportunity, not a threat.

Forward Deployment and Consumption Pricing

Knapsack shifted from self-serve SaaS toward a forward deployed engineering model for enterprise - engineers embedded with the customer to drive implementation and get to value faster. The reason: enterprise clients frequently stalled on implementation 4–6 months post-sale despite organizational willingness, simply because they didn't know how to use the product in a way that generates value. Forward deployment is the solution. The target is getting every customer to value within 60 days.

Simultaneously, Knapsack is moving to consumption pricing: customers pay based on actual usage rather than seats or licenses. This changes everything - speed of adoption becomes the only metric that matters. If customers aren't using it, revenue declines. The model forces Knapsack to care relentlessly about time-to-value and ongoing engagement, rather than just closing the license.
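The incentive shift is simple arithmetic. The sketch below contrasts the two models with made-up numbers: under seat licensing, revenue is fixed regardless of adoption; under consumption pricing, zero usage means zero revenue.

```python
# Illustrative contrast between the two pricing models (all numbers
# are invented). Seat revenue is insensitive to whether the product
# is actually used; consumption revenue tracks usage directly.
def seat_revenue(seats: int, per_seat: float) -> float:
    """Fixed revenue: seats sold times price per seat."""
    return seats * per_seat

def consumption_revenue(usage_units: int, per_unit: float) -> float:
    """Usage-based revenue: if usage is zero, revenue is zero."""
    return usage_units * per_unit
```

A vendor on the second model has no choice but to care about time-to-value and ongoing engagement - the point Chris makes about adoption speed becoming the only metric.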

Frameworks from this episode
  • Prototype and Prune - AI is a great proxy for skill but a poor proxy for taste. Use AI to generate multiple real, enterprise-appropriate product prototypes quickly using actual design system context, then have a human with taste prune and refine. Replaces the comp-to-handoff waterfall with a generation-and-curation loop. See Frameworks.
  • The Intelligent Product Engine - A three-part system for making AI generation predictable at enterprise scale: (1) context archetypes that set different rules engines for different generation modes, (2) real-time artifact monitoring from Figma and git repos, (3) a rules engine enforcing brand governance, compliance, and accessibility. Together they ensure outputs look like your product, not generic slop. See Frameworks.
  • Design Systems Thinking - Stop designing individual screens; start designing systems of experiences. Every product element is a pattern with rules and constraints. The shift from instance-level to system-level abstraction is what unlocks AI-generated product at scale - and is what separates senior systems thinkers from early-career screen designers. See Frameworks.
Tools mentioned
  • Knapsack - Chris's company and the Intelligent Product Engine at the center of this episode. Stores design system patterns as structured data, monitors Figma and git in real time, and provides a rules engine for enterprise-grade AI generation. Customers include T. Rowe Price, Edward Jones, and major healthcare and banking institutions. See the Tools page for details.
  • Claude Code - Chris's primary tool for Knapsack product creation (alongside Cursor), used in day-to-day engineering workflow. See the Tools page for details.
  • Cursor - AI-powered code editor Chris uses alongside Claude Code for day-to-day development. Mentioned alongside Codex, Lovable, and Linear as examples of tools accelerating individual developer and PM productivity.
  • Perplexity - Chris uses Perplexity for synthetic research and answer-seeking, especially for business questions. Cites it as particularly strong for making better-informed business decisions quickly.
  • DeepSeek - Chris runs DeepSeek locally via Ollama primarily to search through TTRPG rulebooks quickly. Cited as a convenient local option for reference lookups without sending data externally.
Glossary terms from this episode
  • Pattern Language - Christopher Alexander's concept, originally about architecture, applied to digital products: every interface has a set of common patterns (buttons, navs, cards, heroes, prompts) with rules and constraints that govern them. Building from patterns rather than designing from scratch enables consistency, system-wide changes, and AI-generated product experiences that match enterprise brand standards. See Glossary.
  • Design System - A central system of record for all the building blocks of a digital product - described by Chris as a well-organized bucket of Lego bricks. Contains structured data describing every component, its variants, its rules, and its constraints. The foundation that makes AI generation enterprise-appropriate rather than generic. See Glossary.
  • Prototype and Prune - A design workflow enabled by AI: generate multiple real, enterprise-appropriate prototypes quickly using actual design system context, then have a human with taste select and refine. Coined by Jules at Meta, reiterated by Scott Belsky (former CPO of Adobe). Contrasted with the traditional comp → handoff → build waterfall where AI is asked to produce a single abstraction. See Glossary.
  • Forward Deployed Engineering - An enterprise go-to-market model in which engineers are embedded with the customer to accelerate implementation and drive time-to-value. Knapsack adopted this after discovering that enterprise clients frequently stall on implementation 4–6 months post-sale despite organizational willingness - not due to lack of motivation but lack of hands-on guidance. See Glossary.
  • Consumption Pricing - A pricing model in which customers pay based on actual usage rather than seats or licenses. Forces the vendor to care about adoption speed and ongoing engagement: if the customer isn't using the product, revenue declines. Changes the fundamental business metrics - time-to-value and retention become the primary drivers, not contract size. See Glossary.
  • Context Archetype - A named configuration of rules that governs how an AI generation system behaves in a specific mode. Knapsack uses archetypes to differentiate exploration contexts (creative, loose, compliance-light) from review contexts (WCAG AA, legal sign-off, enterprise release gate). Different archetypes applied to the same underlying model produce reliably different outputs. See Glossary.
  • AI as Proxy for Skill, Not Taste - Scott Belsky's framing: AI can replicate technical skill (produce well-structured code, generate brand-consistent components) but cannot replicate taste - the human judgment that knows which of five correct options is the right one for this audience, this context, this moment. The role of the modern designer is therefore curation and taste, not production. See Glossary.
Q&A

What problem does Knapsack solve in traditional product development?

The distance between the person who identified the user problem and the thing users can actually touch. Traditional product development runs concept → UX research → comps → engineering → executive review → rework → ship, a process that routinely takes 6–12 months from initial idea to user contact. Every handoff introduces abstraction and information loss. Knapsack collapses this by storing the building blocks of a product as structured data (a well-organized bucket of Lego bricks) and using AI to generate enterprise-appropriate product experiences directly from those patterns - so the person who conceived the problem can conceivably ship something without running the full waterfall gauntlet.

What is Pattern Language, and why does it matter for digital products?

Pattern Language is Christopher Alexander's framework - originally from architecture - describing how every built environment is composed of common patterns with rules and constraints. Walls, floors, windows, doors. Digital products have the same structure: buttons, navs, cards, heroes, prompts. The insight is that you can describe all of these as systems with explicit rules rather than designing each instance from scratch. When patterns are structured correctly, AI can understand them and generate compliant instances automatically. The practical implication: stop designing 'this screen at this URI will show this thing' and start designing the system that generates those experiences. That shift is what makes AI-enabled product creation at enterprise scale possible.

What are the three parts of the Intelligent Product Engine?

First: context archetypes. Different generation modes (creative exploration, compliance review) use different archetypes that set different rules engines - so the same underlying model produces reliably different outputs depending on where you are in the product cycle. Second: real-time artifact monitoring. The system continuously pulls from Figma files and git repos, builds and updates artifacts (design systems, brand books, Storybook stories) in real time, and generates both human-browsable and AI-readable versions. Third: rules engine. Generation rules enforce brand governance, WCAG compliance, legal sign-off requirements, and enterprise release gate processes. Together they make outputs predictable: if you work at a bank, your generated application looks like a banking application.

What is 'prototype and prune,' and what role does taste play?

Prototype and prune is the design workflow enabled by AI: generate multiple real, enterprise-appropriate prototypes using actual design system context, then have a human with taste select and refine from those options. Scott Belsky's framing is the key: AI is a great proxy for skill but a poor proxy for taste. AI can produce five technically correct options; it can't know which one is right for this audience in this context. That judgment - taste - is what the modern designer provides. Brad Frost put the endpoint bluntly: sketch an app idea on a cocktail napkin in 30 seconds, hold it to a camera, and have the design system and AI make it real in minutes. That is where the industry is headed.

What does it mean to design a system of experiences rather than a screen?

It means thinking at the level of the rules and patterns that govern all your interfaces rather than at the level of any individual interface. A line chart used in a retail banking app (one line, simple) and the same control used by a market trader (30–100 overlapping lines with interaction) are the same component with completely different contexts, purposes, and interaction requirements. A systems designer knows how to structure that component so it can serve both use cases correctly - because they've defined the rules governing how it adapts. A screen designer builds the retail banking chart and never thinks about the trader. The bottleneck to systems thinking in 2026 is that it's a senior skill: most practitioners who know how to think this way have spent decades getting there, and college programs haven't caught up yet.
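The line-chart example above can be sketched as one component whose rules adapt it to its audience context, rather than two hand-built screens. Everything here is hypothetical - the context names, field names, and limits are illustrative stand-ins for the rules a systems designer would actually define.

```python
# Hypothetical sketch: one chart component, two audience contexts.
# A systems designer defines the rules; the component adapts.
# All names and limits are illustrative.
CHART_RULES = {
    "retail_banking": {"max_series": 1, "interactions": []},
    "market_trading": {
        "max_series": 100,
        "interactions": ["zoom", "crosshair", "compare"],
    },
}

def configure_chart(context: str, requested_series: int) -> dict:
    """Clamp a chart request to the rules of its audience context."""
    rules = CHART_RULES[context]
    return {
        "series": min(requested_series, rules["max_series"]),
        "interactions": rules["interactions"],
    }
```

The retail banking context collapses any request down to a single series with no interaction; the trading context permits up to 100 overlapping series with rich interaction - same component, different rules.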

Why did Knapsack move to forward deployed engineering and consumption pricing?

Two related discoveries. First: enterprise clients were stalling on implementation 4–6 months post-sale despite organizational willingness. They couldn't get to value because they didn't know how to use the product effectively. Forward deployed engineering - embedding engineers with the customer to drive implementation - was the solution. The target is getting every customer to 60-day value. Second: the natural extension of that is consumption pricing. If the only thing that matters is whether customers are actually using the product, charge based on usage. Consumption pricing changes the fundamental business metrics: speed of adoption becomes the only metric, because if the customer isn't using it, revenue declines. It forces alignment between vendor success and customer success that seat-based licensing doesn't.

Links & Resources