☁️ Llama 2 is China Cloud's Best Friend

When Meta released its large language model, Llama 2, a couple of weeks ago, it chose Microsoft’s cloud platform, Azure, as its preferred partner. Since the model is open source (though people dispute how open source it really is, given its licensing terms), AWS soon made it available to its customers.

Another group of cloud providers jumping on the Llama 2 opportunity is the major cloud platforms from China. AliCloud was the first to market, announcing its Llama 2 offerings just 8 days after the model’s initial release. Baidu followed suit and announced its offering this week. Because there is no legal barrier to entry, I expect other leading Chinese clouds, like Huawei Cloud and Tencent Cloud, to jump on the Llama 2 train soon.
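There is no technical barrier either. The Llama 2 weights are openly downloadable, so any cloud can wrap them in a managed service. Here is a minimal sketch, assuming the Hugging Face transformers library is installed and Meta’s license terms have been accepted for the meta-llama/Llama-2-7b-chat-hf checkpoint; the prompt and generation settings are purely illustrative.

```python
# A minimal sketch of how anyone, hobbyist or hyperscaler, can pull and run
# the openly released Llama 2 weights. Assumes the transformers library is
# installed and Meta's license has been accepted on Hugging Face.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # the 7B chat checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Once the weights are local, serving them behind an API endpoint is ordinary
# infrastructure work, exactly the kind of work cloud platforms excel at.
inputs = tokenizer("Why are clouds racing to offer open source models?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```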

Both AliCloud and Baidu AI Cloud have their own LLMs to offer, Tongyi Qianwen and ERNIE, respectively. But offering Llama 2, arguably the best-known if not also the best-performing open source AI model on the market, is a godsend, making them instantly more competitive internationally. This is an important development in what may lie ahead in the cloud proxy war, as I call it, between the American cloud and the Chinese cloud – a development that would not be possible without Llama 2.

The Open Source to Cloud Playbook

Before folks get up in arms about how this is yet another case of China stealing American innovation, let me explain just how common this practice is in the cloud infrastructure industry across all companies.

Bringing popular open source technologies onto the cloud and offering them as managed services is a tried and true playbook. In general, open source technology is easy to customize, easy to trust, but hard to use. Closed source technology is oftentimes the polar opposite: hard to customize, hard to trust, but easy to use. What major cloud platforms are good at is making open source easy to use.

Here’s an example. You could download Postgres, a very popular open source database, for free to run on your own laptop or servers. And indeed, many hackers, hobbyists, and SMBs do exactly that. But as your side project or business grows, scaling and managing a database in-house becomes a lot of work (and not the kind of work that grows your business). You might also want to add a data visualization and analysis tool to analyze your data, and there are many open source tools for that, but running those on your own is even more work. So more often than not, you end up paying AWS for its managed Postgres offering to handle all that for you, and you can easily add Redshift or something else for the data analysis work, all on the cloud. (I’m simplifying here. Managing a bunch of cloud services is also not easy work, just easier than managing pure open source projects.)
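To put the “hard to use” versus “easy to use” contrast in concrete terms, here is a minimal sketch of the two routes, assuming Python with the psycopg2 and boto3 libraries and AWS credentials already configured; the instance names, sizes, and passwords are hypothetical, chosen only for illustration.

```python
# A minimal sketch of self-hosted versus managed Postgres.
# Assumes psycopg2 and boto3 are installed and AWS credentials are configured.
# All identifiers, sizes, and passwords below are hypothetical.
import boto3
import psycopg2

# Self-hosted route: you run Postgres yourself, so connecting is trivial,
# but backups, replication, failover, and upgrades are all on you.
conn = psycopg2.connect(
    host="localhost", port=5432, dbname="app", user="app", password="example-password"
)

# Managed route: one API call asks AWS RDS to run Postgres for you,
# and AWS takes over the patching, backups, and failover.
rds = boto3.client("rds", region_name="us-east-1")
rds.create_db_instance(
    DBInstanceIdentifier="my-app-db",      # hypothetical instance name
    Engine="postgres",
    DBInstanceClass="db.t3.micro",         # small instance class, for illustration
    AllocatedStorage=20,                   # storage in GiB
    MasterUsername="app",
    MasterUserPassword="example-password",
)
```

The trade-off described above lives in that second call: you give up some control (and pay AWS) in exchange for not carrying the operational burden yourself.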

This “open source to cloud” business practice is commonplace. Every cloud does it. Every cloud has its own managed Postgres or MySQL (another popular open source database) service, whether it is an American cloud, a Chinese cloud, or a French cloud. This business practice has, however, stirred up controversy in the last five years. Many of the people and companies who have created open source technologies (e.g. MongoDB, Elasticsearch, Kafka/Confluent) feel like cloud hyperscalers, particularly AWS, have been “strip mining” their creations to make billions without giving anything back. The hyperscalers often retort that the engineers on their payroll do plenty to give back and contribute to the open source projects upstream, and work to foster the community and ecosystem. Cloud vendors also use this issue as a wedge to drive large open source communities away from the cloud platforms they compete with, most notably GCP against AWS.

For what it’s worth, the large Chinese clouds have been mostly spared this industry drama and controversy, likely because their scale is still too small to matter. (AWS’s quarterly revenue is 2x AliCloud’s annual revenue.) They have also built more win-win partnerships with some of the largest commercial open source companies – MongoDB’s four-year partnership extension with AliCloud in May being the latest example.

In some ways, the Chinese clouds have even done more than their American counterparts to share their own technologies as open source projects. In the distributed database layer – a big money maker for cloud vendors – Huawei Cloud open sourced its GaussDB in 2019, and AliCloud open sourced its PolarDB in 2021. By contrast, none of the equivalent distributed database products from the Big Three American Clouds – AWS’s Aurora, Azure’s CosmosDB, GCP’s Spanner – is open source.

These actions are not done for purely magnanimous reasons, of course. Both AliCloud and Huawei Cloud want to use open source to get more developers onto their underlying infrastructure and to compete with their stronger American rivals. In general, open sourcing one’s own technologies has been a well-practiced defensive business strategy to erode the moat and advantage of competitors. Facebook, before it was called Meta, did exactly that with its data center technology. Making Llama 2 open is Meta dusting off this playbook once again – a topic I explored in detail in a previous post.

All in all, if you understand enough of the business dynamic between open source and cloud, then there is nothing terribly remarkable about AliCloud (or any cloud) offering Llama 2. What is remarkable is that, intentionally or not, Llama 2 just gave Zuckerberg a second shot at entering the Chinese market.

Zuck’s “China Dream”

When Zuckerberg made his infamous “smog jog” across Tiananmen Square in 2016, he was hell-bent on expanding his social network to the world’s largest population, to no avail. Fast forward to today: he has skillfully used China’s rise as a powerful weapon to avoid US antitrust regulations, while still indirectly generating a modest business from China (those Temu ads have to go somewhere).

Llama 2’s entry into the Chinese market via the cloud platforms might have given his “China Dream” a second life.

In section 2 of Llama 2’s license (and the reason many people dispute the project’s open source self-characterization), there is a clause stating that if a company offers Llama 2, or derivatives of Llama 2, as a product or service that reaches more than 700 million monthly active users, it must seek a business license from Meta.

On Meta’s most recent earnings call, for Q2 2023, Zuckerberg expanded on this clause and even expressed some long-term expectation that it could become a revenue generator. Here’s what he said in full:

“But one of the things that you might have noticed is, in addition to making [Llama 2] open through the open source license, we did include a term that for the largest companies, specifically ones that are going to have public cloud offerings, that they don't just get a free license to use this. They'll need to come and make a business arrangement with us. And our intent there is we want everyone to be using this. We want this to be open. But if you're someone like Microsoft or Amazon or Google, and you're going to basically be reselling these services, that's something that we think we should get some portion of the revenue for. So those are the deals that we intend to be making, and we've started doing that a little bit. I don't think that's going to be a large amount of revenue in the near term. But over the long term, hopefully, that can be something.”

(Meta’s CFO, Susan Li, quickly tried to temper analyst expectations after Zuckerberg’s answer, stating that “we don't expect that Llama is going to result in a separate top line enterprise revenue line, so just making sure we're totally clear.”)

While Zuckerberg only mentioned the Big Three American clouds, it’s clear that if Llama 2 generates any revenue, he expects it to come from some type of public cloud offering. For Chinese companies shut out of both GPT-4 and PaLM 2 and stuck with less advanced domestic options, Llama 2 is now easier to access than ever via AliCloud, Baidu AI Cloud, and likely others soon. And given the size of China’s internet economy, it is highly likely that an AliCloud customer building on top of Llama 2 will reach 700 million MAUs before an Azure customer does.

Chinese cloud platforms may be the first to write a check to Meta for Llama 2. And they will be happy to do so, since Llama 2 will make them more successful domestically and more relevant internationally.

Seven years after that run across Tiananmen Square, Zuckerberg may finally fulfill his “China Dream” with the help of Llama 2, open source, and the AI platform shift.