AI, neuralink, and the soul of Twos - a notes app
April 3, 2025 · 00:31:28

with Joe Brewer, Twos

Show Notes

What if your notes app knew you well enough to be your AI? Not a general-purpose chatbot with internet access - a personal AI that only knows what you've told it, across every thought, reminder, meeting note, and dream you've captured over time. That's the long-term vision behind Twos, the Tampa Bay startup replacing Apple Notes for busy founders who need to capture ideas fast and actually find them later.

Joe, Twos' co-founder, joins the show for a wide-ranging conversation that covers the product's elegant affiliate revenue model, the economics of running 20,000 users on less than $10/month of OpenAI API costs, the hard problem of consciousness, Neuralink, and what it might mean for an AI to have a soul. The product discussion is sharp; the philosophical tangents are worth the detour.

What Twos Actually Is

Twos is the notes app for people who don't have time to open their notes app. The core insight: Apple Notes, Google Keep, and similar tools fail at the moment of capture - they're too slow to open, too disorganized to retrieve from, and too siloed from the rest of your life. Notion solves the organization problem but introduces so much structure that it's useless on a walk when you need to capture something fast.

Twos is built around speed first. Open it, type it, done. The intelligence kicks in after capture: the app uses OpenAI's API to categorize what you wrote and surface smart suggestions. Write "meeting at Starbucks Tuesday at 2pm" and Twos offers to add it to your calendar. When it's time to leave, it offers directions. Write down a book recommendation and it links to a purchase. Write down a grocery item and it surfaces an Amazon or Walmart link.
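Twos hasn't published its categorization pipeline, and the real version routes notes through an LLM. A minimal sketch of the post-capture step, with a regex heuristic standing in for the model call, shows the shape of it:

```python
import re

# Hypothetical post-capture classifier. Twos sends the note to an LLM for this;
# a regex heuristic is enough to illustrate the capture-then-classify pipeline.
def classify_note(text: str) -> str:
    """Guess what a captured note implies: calendar event, purchase, or plain note."""
    lowered = text.lower()
    # Day names or clock times suggest a calendar event.
    if re.search(r"\b(mon|tues|wednes|thurs|fri|satur|sun)day\b", lowered) \
            or re.search(r"\b\d{1,2}(:\d{2})?\s*(am|pm)\b", lowered):
        return "calendar_event"
    # A small shopping vocabulary suggests purchase intent.
    if any(word in lowered for word in ("buy", "order", "flight", "groceries")):
        return "purchase"
    return "note"

print(classify_note("meeting at Starbucks Tuesday at 2pm"))  # calendar_event
print(classify_note("flight to Japan"))                      # purchase
print(classify_note("idea: episode on consciousness"))       # note
```

The point of the design is the ordering: capture completes instantly, and classification happens afterward, off the critical path.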

The longer-term vision - explicitly invoked through the movie Her - is a personal AI that knows your life as well as you do, because you've been feeding it everything. Not a general-purpose LLM trained on the internet, but a system that only answers from what you've written down. No hallucinations, no outside contamination - just your own documented experience, queryable.

The "Chat with Twos" Feature: Your Second Brain, Query-Ready

The next major feature on Twos' roadmap - due within a month of recording - is chat. Ask Twos a question; it searches only your own notes to answer. Joe has over 150,000 items captured: dreams, favorite foods, movies, meeting notes, project ideas. Chat with Twos won't fabricate - it will only surface what's already there.

The use case the show lands on: before a meeting with someone, ask Twos "what do I need to bring up with this person?" and get back a synthesis of everything you've ever noted about them - their interests, your previous conversations, follow-up items from last time. Personal CRM built from your own notes, without any structured data entry.
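Twos' retrieval layer isn't public, but the core constraint is easy to show: candidate answers come only from notes you already captured. A minimal keyword-overlap sketch (the note contents below are hypothetical):

```python
import re

def tokens(text: str) -> set[str]:
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"[a-z0-9']+", text.lower()))

def retrieve(notes: list[str], question: str, k: int = 3) -> list[str]:
    """Rank notes by keyword overlap with the question; drop zero-score notes."""
    q = tokens(question)
    ranked = sorted(notes, key=lambda n: len(q & tokens(n)), reverse=True)
    return [n for n in ranked[:k] if q & tokens(n)]

notes = [
    "Sam likes hiking and specialty coffee",        # hypothetical captured items
    "follow up with Sam about the pre-seed deck",
    "grocery: paper towels",
]
print(retrieve(notes, "What do I need to bring up with Sam?"))
```

A production system would use embeddings rather than word overlap, but the guarantee is the same: an empty result set stays empty instead of being filled in by a model's training data.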

The implication Joe emphasizes: start now. The dataset you build from today is the foundation for every future AI feature. The sooner you adopt a capture habit, the more powerful the system becomes over time. Your notes are a timeline of your attention, and attention compounds.

The Business Model: Subscriptions Plus Affiliate Commerce

Twos runs two revenue streams simultaneously. The subscription tier gates optional premium features - people who want the full experience pay monthly or à la carte for specific capabilities. The second stream is more unusual: affiliate revenue from purchases that flow through the app.

When you write down "paper towels," Twos links you to Amazon or Walmart and earns a commission if you buy. Write "flight to Japan" and it surfaces Expedia. The app becomes the referring layer for purchases you were already going to make. There's no advertising, no surveillance - just purchase intent captured in plain language, matched to commerce.

At the time of recording: 20,000 active users, $3–4K monthly recurring revenue. The affiliate model scales with user engagement rather than headcount, making it naturally capital-efficient. OpenAI API costs for all 20,000 users processing thousands of daily categorization requests: under $10/month. Text-only inference is cheap.
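The cost claim is easy to sanity-check. The request volume and per-token price below are illustrative assumptions, not Twos' disclosed figures:

```python
# Back-of-envelope check on "under $10/month" for text-only categorization.
requests_per_day = 2_000          # "thousands of daily categorization requests"
tokens_per_request = 200          # short note in, short category label out (assumed)
price_per_million_tokens = 0.30   # blended rate for a small text model (assumed)

monthly_tokens = requests_per_day * tokens_per_request * 30
monthly_cost = monthly_tokens / 1_000_000 * price_per_million_tokens
print(f"{monthly_tokens:,} tokens -> ${monthly_cost:.2f}/month")
# 12,000,000 tokens -> $3.60/month
```

Even with generous assumptions, short-text categorization lands in single-digit dollars per month - which is why the unit economics barely register against $3–4K MRR.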

The Founding Story and Funding

Twos is a Tampa Bay company, and that's not incidental - it's why they got funded. Tampa Bay Ventures specifically backs founders in overlooked geographies, on the thesis that strong companies are being built outside SF and New York that the coastal VC ecosystem systematically misses. Joe left a career in private equity; his co-founder left Google, where he built the AI infrastructure that now powers the app. They raised $400K pre-seed to go full-time.

On OpenAI's For-Profit Conversion and AI Economics

Joe is direct about the risk of building on OpenAI's API: the company that told Congress "I didn't do this to get paid" closed a $40B round at a $300B valuation while reportedly losing money on $5B in ARR. The nonprofit-to-for-profit conversion - which is still being litigated - has real implications for every company built on their infrastructure.

The scenario he names: the "Amazon moment" - years of below-cost pricing to capture market share, followed by price increases once competitors are squeezed out. If OpenAI moves from near-zero to $1 per request, the economics of every API-dependent business change overnight. The existence of open-source alternatives (DeepSeek, Meta's LLaMA) is the hedge - if prices spike, the market for self-hosted alternatives grows immediately.
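Plugging the same illustrative volume into a flat per-request fee shows why the scenario matters:

```python
# The same daily volume under an "Amazon moment" repricing: a flat $1 fee
# per request instead of per-token billing. Numbers are illustrative.
requests_per_day = 2_000
fee_per_request = 1.00
monthly_cost = requests_per_day * 30 * fee_per_request
print(f"${monthly_cost:,.0f}/month")  # $60,000/month, up from single digits
```

A four-orders-of-magnitude swing on the same workload is the concrete version of "the economics change overnight" - and the reason the open-source backstop is worth pricing in.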

The Consciousness Tangent (Worth Taking)

The episode's longest philosophical detour is also its most distinctive. Joe raises sentience as his primary AI fear - not in a doomsday framing, but in a genuinely curious one. His version of the hard problem of consciousness: we know we have subjective experience, but we can't scientifically prove it to anyone else. Can an AI have something similar?

Ryan's response is Buddhist-inflected: consciousness isn't binary. A tree has a different quality of awareness than a human; a rock has a different quality than a tree. Everything may have some form of qualia - the subjective experience of being itself - just at different levels of sophistication. If AI systems cross some threshold of complexity, they may not be simulating experience so much as having one, in whatever form that takes for a digital system.

The Breaking Bad reference lands well: the famous scene where characters tally the chemical composition of a human body, come up fractionally short of 100%, and wonder whether the remainder is the soul. The question for AI isn't whether it can perform intelligence - it demonstrably can. The question is whether there's anything in the gap between performance and experience.

Both Joe and Ryan land on the same position on Neuralink: not yet, but the disability use case changes the calculus entirely. If your limbs don't function and a brain-computer interface can restore that capability, the risk-reward calculation is different. For elective enhancement at current reliability levels - pass.

Tools & Resources

  • Twos - Speed-first notes, reminders, and calendar app with AI categorization and smart suggestions; affiliate commerce integration; "Chat with Twos" feature coming; free to start at twosapp.com
  • OpenAI API - Powers Twos' categorization layer; text-only inference costs under $10/month for 20,000 users - a useful data point on how cheap LLM API usage can be for text-only workloads
  • Tampa Bay Ventures - Pre-seed lead investor; thesis: backing strong founders outside the SF/NY VC corridor
  • Her (2013 film) - Cited as the explicit product vision for Twos' future: an AI that knows your entire life from what you've shared with it and can converse fluently about your personal context
  • DeepSeek - Referenced as demonstrating that 90% of GPT-level capability is achievable at 1/10th the compute - the open-source backstop that limits OpenAI's pricing power
  • Notion - Positioned as the database/desktop tool Twos is not competing with directly; Notion serves structured, seated work; Twos serves fast capture on the go

Key Frameworks from This Episode

The Secondary Memory Timeline
Every day you use a personal capture tool is a day added to a dataset about your own life and attention. The AI features that will be built on that dataset - personal chat, context-aware recommendations, meeting prep - are only as good as the data they have to work from. The right time to start building that dataset is before you need it. Habit first, features second.
Affiliate Commerce as a Notes Revenue Model
When a notes app captures purchase intent in natural language - 'paper towels,' 'flight to Japan,' 'new running shoes' - it sits in a uniquely privileged position to refer that purchase to a retailer and earn a commission. No advertising, no surveillance, no friction. Twos' affiliate layer monetizes the gap between noting intent and acting on it.
LLM API Costs at Text-Only Scale
20,000 active users generating thousands of daily categorization requests costs Twos under $10/month on OpenAI's API. This is the economics of text-only LLM inference - the expensive cases are images, video, and long-context multimodal prompts. For categorization, tagging, and simple entity extraction, the compute cost is negligible, which changes the build-vs-buy calculus for startups dramatically.
The Amazon Moment Risk for API-Dependent Startups
Platform providers sometimes price below cost to capture market share, then raise prices once alternatives are squeezed out. OpenAI's path from nonprofit to $300B for-profit at $5B ARR while deeply unprofitable fits this pattern. Every startup building on a single API provider should model what happens if that provider's pricing doubles, triples, or moves to per-request fees. Open-source models are the hedge.
AI as Google with Confidence
Joe's framing: search gives you 50 websites where the answer might be. An LLM gives you the answer. For factual lookups, conversions, and well-defined questions, this is an unambiguous improvement. The failure mode is in contested, adversarial, or nuanced domains where the confident answer is wrong. Know which type of question you're asking before trusting the response.
Speed as the Primary UX Constraint for Capture
The moment of capture - when you have a thought that needs to land somewhere - is the moment of maximum time pressure and minimum patience. Any friction that exists between the thought and the captured note destroys the habit. Twos' design philosophy: optimize for capture speed above all else, and let the intelligence happen after the fact. The best capture tool is the one you actually open.

FAQ

What's the difference between Twos and Apple Notes?

Speed of capture is the primary differentiator - Twos is optimized to open fast and get out of the way. Beyond that, Twos adds AI-powered categorization, calendar integration, smart suggestions (directions when it's time to leave for a meeting, purchase links when you write down an item), and a unified view of notes, reminders, and calendar events. Apple Notes is a text box. Twos is a productivity layer.

What's the difference between Twos and Notion?

Notion is a structured database tool optimized for seated, desktop work. It's powerful for organizing information you already have. Twos is a capture tool optimized for mobile, walking, driving - the moment when the thought arrives. The Notion complaint the team hears most: it's too slow to load for fast capture. Twos is designed to be open before the thought disappears.

How does the affiliate revenue model work?

When you write something in Twos that implies a purchase - a grocery item, a book, a travel destination - the app surfaces a purchase link to relevant retailers (Amazon, Walmart, Expedia, etc.). If you tap through and buy, Twos earns an affiliate commission. There's no advertising, no data selling, and no friction - just purchase intent captured in your own words, matched to the closest relevant commerce destination.

What is 'Chat with Twos' and when does it launch?

Chat with Twos is a conversational interface that queries only your own notes - not the internet, not a general-purpose LLM's training data. Ask it about your life and it answers from what you've written down. No fabrication, no outside contamination. The feature was coming within a month of the episode's recording. Joe's personal dataset at the time: over 150,000 captured items.

Why build on OpenAI if there's platform risk?

Joe's answer: it's a good tool, it's currently affordable (under $10/month for 20K users), and there's no compelling reason to switch today. The risk is real - OpenAI's nonprofit-to-for-profit conversion and deeply unprofitable economics at scale create pricing uncertainty. The mitigation: open-source alternatives exist and improve constantly, providing a credible switching option if prices spike.

Why start using a personal notes app now, even if the AI features aren't fully built?

Because the dataset you build from today is the input for every future AI feature. A personal AI that knows your life is only as rich as the data you've given it. Joe has been writing things down in Twos for long enough to have 150K+ entries - that history is what makes 'Chat with Twos' potentially powerful for him. Starting tomorrow is better than starting next year.
