
Altman readies Manhattan Project–scale AI build-out

Good morning. It’s Monday, September 22nd.

On this day in tech history: In 2017, Quanta Magazine published “New theory cracks open the black box of deep learning,” highlighting the information-bottleneck view that sees training as compressing representations until only predictive bits remain. It marked a rare moment when physics-style theory intersected with deep learning, sparking debate on generalization, implicit regularization, and the dynamics of SGD.

In today’s email:

  • Altman’s Manhattan Project–scale AI build-out

  • xAI slashes 98% of token costs

  • Meta + Oracle $20B Cloud Deal

  • Google Home powered by Gemini

  • 5 New AI Tools

  • Latest AI Research Papers

You read. We listen. We heard your requests to send out the email a bit earlier, so from now on it will go out at 6:00am Eastern Time. Let us know what you think by replying to this email.

In partnership with WorkOS

The MCP Registry makes it easy for LLMs to discover tools, but discovery alone isn’t enough.

Tools still need to act on behalf of users, and that requires secure, delegated access. API keys don’t cut it. They’re hard to scope, break user flows, and undermine the promise of seamless integration.

WorkOS Connect delivers a fully compliant OAuth 2.1 flow. It handles PKCE, scopes, user consent, and secure token issuance out of the box.

The WorkOS advantage:
- Compliant with MCP OAuth 2.1
- Handles redirects, consent, and scopes
- Easy to drop in and fast to ship

Ship MCP Auth the right way with WorkOS Connect.
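
For readers curious what that flow looks like in practice, here’s a minimal, generic sketch of the PKCE step of an OAuth 2.1 authorization-code exchange in Python, using only the standard library. It isn’t WorkOS- or MCP-specific; the authorization endpoint, client ID, redirect URI, and scope are placeholders.

```python
import base64, hashlib, secrets

# PKCE: the client invents a one-time secret (verifier) and sends only its hash (challenge)
code_verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
code_challenge = base64.urlsafe_b64encode(
    hashlib.sha256(code_verifier.encode()).digest()
).rstrip(b"=").decode()

# Authorization request carries the challenge (placeholder endpoint, client ID, and scope)
auth_url = (
    "https://auth.example.com/oauth2/authorize"
    "?response_type=code"
    "&client_id=YOUR_CLIENT_ID"
    "&redirect_uri=https%3A%2F%2Fyourapp.example%2Fcallback"
    "&scope=mcp.tools.read"
    f"&code_challenge={code_challenge}"
    "&code_challenge_method=S256"
)

# After the user consents and is redirected back with a code, the client exchanges
# that code plus the original verifier for tokens; the server hashes the verifier
# and checks it against the earlier challenge, so an intercepted code alone is useless.
token_request = {
    "grant_type": "authorization_code",
    "code": "<code from redirect>",
    "redirect_uri": "https://yourapp.example/callback",
    "client_id": "YOUR_CLIENT_ID",
    "code_verifier": code_verifier,
}
```

A hosted service like WorkOS Connect wraps the redirects, consent screens, and token issuance around this exchange so you don’t have to run the server side yourself.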

Thank you for supporting our sponsors!

Today’s trending AI news stories

Altman readies Manhattan Project–scale AI build-out

OpenAI is scaling on two axes: compute and hardware. On the infrastructure side, it’s committing an extra $100B to reserve server rentals, pushing total spend toward $350B by 2030. CFO Sarah Friar concedes compute shortages have already throttled feature rollouts; Altman insists the AI race is a contest of raw scale as much as algorithms. Forecasts peg annual server bills at $85B, nearly half of hyperscaler cloud revenue in 2024, putting enormous pressure on chipmakers, integrators, and utilities. These “standby” clusters won’t just idle: they’re designed for instant spin-up of training runs or demand spikes, giving OpenAI an always-ready compute buffer.

At the same time, Jony Ive’s io project is pulling Apple’s supply chain and design bench into AI-native hardware. Prototypes under discussion include a screenless smart speaker, glasses, recorder, even a wearable pin. Luxshare is already contracted; Goertek is in talks. Launch window: 2026–27. More than two dozen Apple veterans, including Tang Tan and Evans Hankey, have defected to OpenAI, citing fewer bureaucratic bottlenecks and the chance to ship category-defining products.

OpenAI is making a Manhattan Project-scale wager. At $20B a year on training alone, it is betting that future AI leadership will hinge on brute-force compute combined with devices built from the ground up for conversational, multimodal intelligence. Read more.

xAI bends the price-performance curve, slashing 98% of token costs; Neuralink eyes October trial

xAI’s Grok 4 Fast collapses the trade-off between scale and cost. The model delivers GPT-5-class reasoning on AIME (92%) and HMMT (93.3%) while using roughly 40% fewer “thinking” tokens, slashing task costs by up to 98%. It fuses fast-answer and deep-reasoning modes into a single prompt-driven system, cutting reasoning overhead and latency, which is critical for real-time, compute-sensitive use cases.


With native tool use for web, code, and search, Grok 4 Fast even displaced OpenAI’s o3-search at the top of LMArena. A 2M-token context and pricing from $0.05/M tokens position it squarely for mass deployment. Wharton’s Ethan Mollick called it another reset of the AI cost curve, noting that benchmarks like GPQA Diamond may now be functionally maxed out.
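
For anyone who wants to kick the tires, xAI exposes an OpenAI-compatible API, so a quick test needs nothing beyond the standard OpenAI Python SDK pointed at xAI’s endpoint. This is a rough sketch; the base URL and model name below are assumptions, so confirm them against xAI’s current documentation.

```python
from openai import OpenAI  # pip install openai

# Assumed xAI endpoint and placeholder key; verify both in xAI's docs
client = OpenAI(
    base_url="https://api.x.ai/v1",
    api_key="YOUR_XAI_API_KEY",
)

resp = client.chat.completions.create(
    model="grok-4-fast",  # hypothetical model id, used here for illustration
    messages=[
        {"role": "user", "content": "In two sentences, why do fewer thinking tokens cut latency?"}
    ],
)
print(resp.choices[0].message.content)
```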

Meanwhile, Neuralink heads into human trials this October with its thought-to-text implant, FDA-cleared for investigational use. The device decodes neural activity directly into text, with near-term applications for speech-impaired patients and longer-term ambitions to let healthy users query AI systems by thought alone. Read more.

Oracle, Meta near $20B AI cloud deal as bubble fears rise

Oracle’s push into AI cloud infrastructure is accelerating as the company reportedly nears a $20 billion multi-year deal with Meta to host its Llama models. The agreement, if finalized, would expand Meta’s compute capacity across Facebook, Instagram, and WhatsApp while reducing its dependence on Microsoft Azure. It also builds on Oracle’s recent momentum: a record-breaking $300 billion contract with OpenAI and a partnership with xAI. Oracle is betting its cheaper, faster Cloud Infrastructure can undercut AWS and Google Cloud, a strategy that has helped drive its stock up 85% this year.

But the AI boom is showing signs of strain. MIT research finds 95% of AI pilots fail to deliver ROI, echoing warnings from OpenAI’s Sam Altman and Meta’s Mark Zuckerberg. Speaking on the Access podcast, Zuckerberg cautioned that “collapse is definitely a possibility.” Even so, Meta is committing $600 billion through 2028 to AI data centers and infrastructure; Zuckerberg argues that overbuilding is safer than missing the window for superintelligence.

Meta also faces reputational headwinds. Strike 3 Holdings has filed a $350 million lawsuit alleging Meta pirated 2,396 of its pornographic videos via BitTorrent to enrich training data, some with titles suggesting underage performers. Legal scholars warn such practices could contaminate AI outputs and trigger public backlash. Meta denies the charges, noting its V-JEPA 2 model trained on “1 million hours of internet video,” though critics say that description of the dataset remains conspicuously vague. Read more.

First look at the Google Home app powered by Gemini

Google is turning the Home app into a true AI command center. Version 3.41.50.3 embeds Gemini, letting you query your smart home via the new “Ask Home” bar, by voice or text, and get contextual insights, not just device toggles. Gemini taps into device states, sensor histories, and activity logs, synthesizing data to anticipate needs, track trends, or summarize past events.

UI changes streamline control. The old Favorites tab is now Home, and Devices/Settings live behind a grid icon. You can pin live environmental metrics like outdoor air quality and temperature, while new video and thermometer icons hint at next-gen Nest hardware. Read more.

5 new AI-powered tools from around the web

Latest AI research papers

arXiv is a free online library where researchers share pre-publication papers.

Thank you for reading today’s edition.

Your feedback is valuable. Respond to this email and tell us how you think we could add more value to this newsletter.

Interested in reaching smart readers like you? To become an AI Breakfast sponsor, reply to this email or DM us on 𝕏!