
Meta Just Went Closed Source. Nobody Should Be Surprised.

Meta just released Muse Spark, its first major AI model built under Alexandr Wang after the $14.3 billion Scale AI acquisition. The model itself is interesting: small, efficient, runs directly inside WhatsApp, Instagram, and Facebook, uses an “order of magnitude less compute” than Llama 4’s midsize models, and features a multi-agent reasoning mode called “Contemplating” that scored 58% on Humanity’s Last Exam.

But the model is not the story. The story is that Meta released it closed source.

No open weights. No model card with downloadable parameters. No community access to the architecture. After three years of positioning itself as the champion of open AI, Meta locked the door.

The open-source era was never about philosophy

Let me be direct about this. Meta did not open-source Llama because they believed in open science. They open-sourced Llama because they were behind.

When Llama 1 dropped in February 2023, Meta was losing the AI race badly. OpenAI had ChatGPT. Google had years of transformer research and was scrambling to ship Bard. Meta had an AI lab that published good papers but had no competitive product. They were a social media company trying to convince the market they belonged in the AI conversation.

Open-sourcing was the rational strategy. It built goodwill with the developer community. It attracted talent who wanted to work on open models. It created an ecosystem of fine-tuned Llama variants that extended Meta’s reach without Meta spending the compute. And most importantly, it commoditized the layer where Meta’s competitors had an advantage. If everyone has access to a good-enough base model, then the model itself stops being the competitive moat. The moat moves to distribution, data, and product integration, which is exactly where Meta is strongest.

This was not generosity. It was competitive positioning. And it worked brilliantly. Llama became the default open model. Meta became the “good guy” of AI. Developers built on Meta’s foundation instead of building their own.

What changed

Two things changed. First, Meta now has a model they believe is genuinely competitive at the frontier. Muse Spark is not a research release or a community project. It is a product, built to run inside Meta’s applications for Meta’s three billion users. The value of that model is directly proportional to how exclusively Meta controls it.

Second, the threat landscape shifted. Bloomberg reported that Meta’s “superintelligence group”, the team Wang now leads, was specifically created to build models that would not be open-sourced. The decision was made before Muse Spark was even designed. This was not a last-minute change. It was the plan from the beginning of this new era.

The Alexandr Wang hire tells you everything. Wang built Scale AI into a $14 billion company by providing data infrastructure to the most secretive AI labs in the world. His entire career is built on proprietary advantage. Hiring him to lead your AI strategy and then open-sourcing the results would be like hiring a vault designer and leaving the door open.

The Contemplating mode is the real tell

Muse Spark’s “Contemplating” feature is worth understanding because it reveals where the industry is heading. It is not a single model thinking harder. It is multiple AI agents collaborating on a problem, breaking it into subproblems, solving them in parallel, and assembling the results.
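The decompose-solve-assemble loop described above can be sketched in a few lines. This is a hypothetical illustration of the general multi-agent pattern, not Meta's actual architecture; every function name here is invented, and in a real system each step would be an LLM call rather than a stub.

```python
from concurrent.futures import ThreadPoolExecutor

def decompose(problem: str) -> list[str]:
    # A planner agent would split the problem into subproblems.
    # Stubbed here: just produce three labeled parts.
    return [f"{problem} :: part {i}" for i in range(3)]

def solve(subproblem: str) -> str:
    # A specialist agent (a model call in a real system) handles one part.
    return f"answer({subproblem})"

def assemble(partials: list[str]) -> str:
    # An aggregator agent merges the partial answers into one result.
    return " | ".join(partials)

def contemplate(problem: str) -> str:
    # Orchestration layer: fan out the subproblems in parallel,
    # then fan the partial answers back in.
    subproblems = decompose(problem)
    with ThreadPoolExecutor(max_workers=len(subproblems)) as pool:
        partials = list(pool.map(solve, subproblems))
    return assemble(partials)

print(contemplate("hard question"))
```

The point of the sketch is that the value lives in `contemplate`, the orchestration, not in any single `solve` call, which is exactly why such systems are harder to release openly.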

This scored 58% on Humanity’s Last Exam, a benchmark specifically designed to be unsolvable by current AI. For context, most frontier models score in the 10-20% range. 58% is a significant jump, and if the benchmark holds up to scrutiny, it suggests that multi-agent architectures may be the next performance frontier.

And multi-agent systems are inherently harder to open-source. You are not releasing one model. You are releasing an orchestration system, a routing layer, specialized sub-models, and the logic that ties them together. The complexity makes open release harder, and the competitive value of the system architecture makes it more costly to share.

This is probably the future: base models get commoditized and eventually open-sourced, while the orchestration layer that makes them useful stays proprietary. The model is the engine. The system is the car. You might open-source the engine. You will never open-source the car.

What this means for the open-source community

The Llama community is not going to disappear overnight. Meta still has open models out there, and they are still useful. But the signal is clear: the best stuff is no longer going to be shared.

This matters because a significant portion of the AI startup ecosystem was built on the assumption that Meta would keep shipping open frontier models. Fine-tuning shops, hosting providers, and application developers all bet on Llama as a foundation. If Meta’s best models are now closed, those businesses need to find a new foundation or accept that they are building on yesterday’s technology.

The broader lesson is one the software industry has learned before. Open source is a strategy, not a commitment. Companies open-source when it serves their interests and close up when it does not. IBM did it with Linux. Google did it with Android. Red Hat, Elastic, MongoDB, and HashiCorp all eventually changed the terms when the economics shifted.

Meta is doing the same thing. The only difference is that the AI community believed this time would be different because Zuckerberg said so. It was never going to be different. The incentives always pointed here.

Where this goes

Meta now has closed models running inside three billion user accounts across WhatsApp, Instagram, and Facebook. They have a $14.3 billion investment in the team building those models. They have distribution that no other AI company can match.

The open-source era gave Meta legitimacy, talent, and an ecosystem. The closed-source era will give them revenue, control, and a moat. From Meta’s perspective, this is not a betrayal. It is the logical next chapter.

From the community’s perspective, the lesson is simple: if a corporation’s open-source strategy does not have a binding legal commitment, assume it is temporary. Because it always is.


Sources: CNBC, TechCrunch, Bloomberg, Simon Willison


