Meta Partners with Midjourney

In partnership with

Good morning. It’s Monday, August 25th.

On this day in tech history: In 1991, on comp.os.minix, Linus Torvalds posted about his new Unix-like operating system, initially a monolithic kernel written in C and x86 assembly with multitasking, virtual memory, and terminal I/O support. This sparked the open-source software revolution and launched the now-ubiquitous Linux kernel that powers everything from servers to smartphones.

In today’s email:

  • Meta Partners With Midjourney

  • Grok 2 Goes Open Source

  • Siri to be rebuilt with Gemini?

  • 5 New AI Tools

  • Latest AI Research Papers

You read. We listen. Let us know what you think by replying to this email.

Turn AI Into Your Income Stream

The AI economy is booming, and smart entrepreneurs are already profiting. Subscribe to Mindstream and get instant access to 200+ proven strategies to monetize AI tools like ChatGPT, Midjourney, and more. From content creation to automation services, discover actionable ways to build your AI-powered income. No coding required, just practical strategies that work.

Today’s trending AI news stories

Meta Partners with Midjourney

Meta has just licensed Midjourney’s generative image and video models, bringing the startup’s distinctive “aesthetic technology” into Meta’s future AI stack. Announced by Chief AI Officer Alexandr Wang, the deal complements Meta’s in-house work on Imagine, Movie Gen, and DINOv3 while positioning it to compete more directly with OpenAI’s Sora, Google’s Veo, and Black Forest Labs’ Flux.

Midjourney, with 20 million users and a reputation for design-forward outputs, remains independent and community-backed, but its touch may soon shape Instagram tools, VR assets, or even Meta’s widely mocked chatbots. Founder David Holz emphasized continuity for its subscription services, though industry watchers see the deal as Meta’s bid to fuse compute scale with generative creativity.

That compute scale is taking form in Richland Parish, Louisiana, where Meta is investing $10 billion to construct “Hyperion,” a four-million-square-foot data center designed to deliver up to 5 GW of AI capacity, surpassing any existing site. The nine-building campus will feed Meta’s next open-source language models. Powering it means three new gas plants and 1.5 GW of solar and storage, with Meta committing $3.2B to keep the lights on. Regulators tout it as a blueprint; critics see a future of stranded energy assets if AI efficiency races ahead. Read more.

Grok 2 goes open source and Musk’s new ‘Macrohard’ project bets on an AI-only software stack

Elon Musk’s xAI just sketched its most aggressive play yet. The “Macrohard” project is pitched as a software company made entirely of AI agents. No humans in the loop, no hardware required. The plan is to use Grok as a controller to spawn fleets of specialized models for coding, design, speech, video, and testing, then run them inside virtual machines where simulated “AI users” stress-test the results until they’re production ready.

No human dev cycles, no traditional stack - just Colossus, xAI’s Memphis supercomputer scaling toward millions of Nvidia GPUs, as the factory floor. Macrohard was trademarked this month with coverage ranging from text generation to game design.
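On paper, the Macrohard pipeline is a controller-and-specialists pattern: one model routes tasks, specialized agents produce artifacts, and sandboxed testers hammer on the output. Nothing below is xAI code or a real API - it’s a toy Python sketch, with made-up agent names and stubbed outputs, only meant to make the routing idea concrete.

```python
# Toy sketch of the controller-and-specialists pattern described above.
# All names and outputs are illustrative assumptions, not xAI's implementation.
from dataclasses import dataclass

@dataclass
class Task:
    kind: str      # e.g. "code", "design", "test"
    prompt: str

def specialist(kind: str, prompt: str) -> str:
    # Stand-in for a specialized model (coder, designer, tester, ...).
    return f"[{kind} agent] draft output for: {prompt}"

def controller(tasks: list[Task]) -> dict[str, str]:
    # The controller (Grok, in xAI's framing) routes each task to a specialist
    # and gathers results; the real plan also runs the output inside VMs where
    # simulated "AI users" stress-test it before it ships.
    return {t.kind: specialist(t.kind, t.prompt) for t in tasks}

if __name__ == "__main__":
    plan = [Task("code", "build a spreadsheet app"),
            Task("test", "click every menu item")]
    for kind, result in controller(plan).items():
        print(kind, "->", result)
```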

At the same time, xAI has made Grok 2 fully open, releasing its weights on Hugging Face - effectively handing developers a production-grade LLM to dissect, benchmark, or fine-tune. This isn’t a free sandbox: the xAI Community License bans using Grok 2 to bootstrap competing foundation models and requires redistribution to carry the “Powered by xAI” mark. Non-commercial tinkering is wide open, while commercial deployments must clear xAI’s terms.
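For anyone who wants to poke at the release, pulling the checkpoint with the huggingface_hub client is straightforward. The repo id below is an assumption - confirm the exact name, the hardware requirements, and the Community License terms on xAI’s Hugging Face page before building anything on it.

```python
# Minimal sketch: download the Grok 2 checkpoint from Hugging Face.
# "xai-org/grok-2" is an assumed repo id; verify it (and the license) first.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="xai-org/grok-2",       # assumed repo id
    local_dir="./grok-2-weights",   # checkpoint shards land here (hundreds of GB)
)
print("Weights saved to:", local_dir)
```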

Musk also confirmed Grok 2.5 is already open-sourced, with Grok 3 set to follow in six months. Strategically, this move plants xAI firmly in the open-model camp alongside Meta and Mistral, while keeping tight control over competitive spillover. Read more.

Apple bends its AI walls without breaking them

Bloomberg reports that Apple is exploring Google’s Gemini to power a rebuilt Siri, even as it tests two internal systems: Linwood, its trillion-parameter in-house LLM, and Glenwood, a hybrid setup integrating external providers. The target is now 2026, a year later than planned, after scaling stumbles and leadership churn that saw AI architect Ruoming Pang defect to Meta on a $200M deal. The Gemini talks, and parallel feelers to OpenAI and Anthropic, signal Apple’s recognition that privacy-driven, on-device AI alone won’t close its competitive gap.

That same pragmatism is shaping Apple’s enterprise push. September’s updates will let IT teams toggle which external models employees can access, starting with ChatGPT Enterprise, while deciding whether data stays in Apple’s Private Cloud Compute or moves to external clouds. The company is also shipping infrastructure upgrades, from Apple Business Manager APIs to streamlined device migrations and Vision Pro fleet support, that underscore a new strategy: Apple isn’t surrendering control, but selectively opening gates where leverage matters. Read more.

5 new AI-powered tools from around the web

Latest AI Research Papers

arXiv is a free online library where researchers share pre-publication papers.

Thank you for reading today’s edition.

Your feedback is valuable. Respond to this email and tell us how you think we could add more value to this newsletter.

Interested in reaching smart readers like you? To become an AI Breakfast sponsor, reply to this email or DM us on 𝕏!