
Good morning. It’s Monday, March 23rd.
I’m heading to the AI Tinkerers Denver Meetup on Wednesday evening. If you’re in the Denver area and would like to attend, RSVP and I hope to see you there!
-Jeff
AI Breakfast
You read. We listen. Let us know what you think by replying to this email.
Speak your prompts. Get better outputs.
The best AI outputs come from detailed prompts. But typing long, context-rich prompts is slow, so most people don't bother.
Wispr Flow turns your voice into clean, ready-to-paste text. Speak naturally into ChatGPT, Claude, Cursor, or any AI tool and get polished output without editing. Describe edge cases, explain context, walk through your thinking - all at the speed you talk.
Millions of people use Flow to give AI tools 10x more context in half the time. 89% of messages sent with zero edits.
Works system-wide on Mac, Windows, iPhone, and now Android (free and unlimited on Android during launch).

OpenAI to double its workforce, advance autonomous research plans
OpenAI is moving forward with its autonomous research plans, building a multi-agent AI researcher to handle complex problems in math, science, coding, and policy with minimal human oversight. An AI research intern will take on discrete multi-day tasks starting in September, with a full AI Researcher scheduled for 2028. These systems use Codex, reasoning models, and chain-of-thought monitoring, with sandboxing to control risks.
The company will nearly double its workforce to 8,000 by year-end, adding engineers, researchers, and technical ambassadors to deploy its Frontier platform and a desktop super app unifying ChatGPT, Codex, and Atlas.
Students in the U.S. and Canada can access $100 in Codex credits to build hands-on AI experience through the OpenAI Developer Community. Read more.
Musk launches Terafab, a terawatt-scale chip factory in Austin
Terafab is a massive new chip manufacturing plant in Austin, announced by Elon Musk, that Tesla, SpaceX, and xAI are building together. The ultimate goal is to stop depending on foreign chipmakers and start producing their own advanced processors at enormous scale, enough to deliver over a terawatt of AI computing power every year once it's fully running.
These chips will power Tesla's self-driving tech and Optimus robots, xAI's AI models, and SpaceX's satellites plus future data centers in orbit. In plain terms, it's Musk betting big that owning the full chip-making process rather than relying on NVIDIA will let his companies move much faster on AI, robotics, and space tech.
SpaceX is planning a huge public offering later this year (potentially this summer) that could raise around $50 billion and value the company at $1.5–2 trillion or more. Read more.
Claude Cowork adds 'Projects' for local task organization and one-click imports
Anthropic added ‘Projects’ to Claude Cowork on Desktop, letting users create persistent workspaces that link local folders, instructions, and ongoing tasks. The update incorporates scheduled tasks from February, keeping context across sessions and supporting team workflows. Users can control file permissions, automate browser actions, and generate documents.
An intermediate permission mode for Claude Code is in development to balance automation with oversight. Claude Max costs $100–$200/month on macOS for professionals, with Windows and hybrid connectors coming. Read more.
AI generates ideas at near-zero cost, but humans still judge what matters
Andrej Karpathy found autonomous agents outperformed manual GPT-2 hyperparameter tuning, uncovering complex interactions humans often overlook. This efficiency applies mainly to tasks with clear, measurable outcomes, while less quantifiable problems still demand human oversight.
Terence Tao notes AI slashes the cost of generating mathematical ideas nearly to zero, yet verification now constrains progress. Traditional journals, conferences, and mentoring cannot handle AI-assisted proofs, prompting Tao to advocate machine-friendly infrastructures, formal proof assistants, and AI planning systems to manage large volumes of automated hypotheses. Read more.


Naive lets you hire autonomous agents with dedicated compute and financial stacks; these independent entities execute recursive workflows to run businesses.
Design Agent by Lokuma is an AI design layer that structures, refines, and visually organizes outputs from coding agents.
Silicon Friendly scores your website’s AI-agent compatibility across 30 checks, from readability to full autonomous operation.
Context.dev is an API that retrieves, structures, and enriches real-time web data for AI models and applications.
FigPrompt generates production-ready Figma plugins from natural language prompts, turning design intent into working code instantly.

Thank you for reading today’s edition.

Your feedback is valuable. Respond to this email and tell us how you think we could add more value to this newsletter.
Interested in reaching smart readers like you? To become an AI Breakfast sponsor, reply to this email or DM us on 𝕏!
Thinking of starting your own newsletter? AI Breakfast readers who sign up with Beehiiv receive a 14-day free trial and 20% off for 3 months.
