What happens when audio recordings of 80 internal meetings of a company are leaked to the media? A serious credibility problem.

That’s what happened to TikTok last week when audio from, indeed, more than 80 internal meetings was leaked to Buzzfeed, which led to a bombshell story exposing the hard truth that TikTok’s promise of protecting US users’ data from China is falling short.

Now, I am not a “TikTok basher”. Over the last two years, I’ve written many posts on ByteDance – from its aggressive global expansion tactics and fledgling cloud product, to its “trust building” problems with regulators and its fascinating, enigmatic founder, Zhang Yiming. I do my best to be fair when discussing ByteDance and TikTok – give credit where credit is due, give critique where critique is due.

(To read all of our previous writings on TikTok, see “Zhang Yiming’s Last Speech” (premium content) and the ByteDance tag.)

In ByteDance’s quest to become an enduring global tech company, TikTok must succeed in its own quest to become “less Chinese” and “more American”. Thus far, it is failing.

TikTok has an “American Credibility” problem – four problems to be exact: a “whistleblower” problem, a “master admin” problem, a “protected data” problem, and an “Oracle” problem.

(To jump ahead, see "Solutions to TikTok's American Credibility Problem" on my proposed solutions to these problems.)

The “Whistleblower” Problem

Shortly before the Buzzfeed exposé was published, TikTok pre-empted the story with its own official blog post. However, both the timing and the content of the blog post show that TikTok has a whistleblower problem.

Having worked in multiple communications offices in the past, the curious timing – publishing hours before a (really) bad media story is about to drop – looks all too familiar to me. This is how a blindsided communications department scrambles to stay ahead of a story that is both bad and true.

The timing of the blog post implicitly admits that the audio recordings Buzzfeed obtained were genuine. Additionally (and unfortunately for TikTok), the content of the blog post almost explicitly admits that the Buzzfeed story, alleging that ByteDance’s China HQ has been accessing TikTok’s US user data, is true. This phrase in the blog post gave it away:

“We've now reached a significant milestone in that work: we've changed the default storage location of US user data. Today, 100% of US user traffic is being routed to Oracle Cloud Infrastructure.”

Given that the blog post’s timing was forced upon TikTok, saying that “we’ve now” fixed the problem clearly indicates that the data access problem persisted during the last two-plus years, despite TikTok executives denying as much in sworn congressional testimony.

This internal leak is not isolated, but one among many. In the last two months, both the Wall Street Journal and the Financial Times have published long stories detailing the toxic work culture at TikTok’s US and UK offices. The FT article led a TikTok executive to “take some time off” and “step back”.

When a company’s public posture contradicts its internal operations, employee morale sags, trust is lost, and whistleblowing of all types starts to occur. The first type tends to be about work culture, which is the most personal and easiest to corroborate. As media stories get published and executives get punished, employees feel more emboldened, and leaks of more problematic internal practices start to happen – like TikTok not delivering on its data governance promise and possibly lying to Congress.

We are seeing this downward spiral of continuous whistleblowing unfold in front of our eyes – similar in some ways to what TikTok’s archrival Facebook/Meta was dealing with last year with Frances Haugen. TikTok’s “whistleblower” problem may continue for a while, until either ByteDance leadership can re-establish internal trust or TikTok simply runs out of things worth whistleblowing about.

Facebook/Meta’s most recent, most high-profile whistleblower, Frances Haugen

The “Master Admin” Problem

If the “whistleblower” problem is bad, the “master admin” problem is arguably more pernicious.

The most damaging revelation of the Buzzfeed story, in my view, is this part:

“Everything is seen in China,” said a member of TikTok’s Trust and Safety department in a September 2021 meeting. In another September meeting, a director referred to one Beijing-based engineer as a “Master Admin” who “has access to everything.”

Building a US-based Trust and Safety department is a well-meaning and important step in establishing a separate data governance structure. But this function is of no practical use if a rank-and-file “Beijing-based engineer” who happens to have “master admin” access privileges can reach any data.

Now, having a “master admin” as a function is not, in and of itself, a problem. Every company, tech or non-tech, does and should have a group of highly-skilled, highly-responsible engineers who have access to everything, so that when a serious problem arises, these engineers can get to the root cause of the problem and fix it. The people who hold this privileged access and power are the real-life embodiments of Spider-Man’s “with great power comes great responsibility” adage.

So TikTok’s “master admin” problem lies in the fact that its “Spider-Man” appears to be only in China. And as long as this “master admin” role sits in China, and only in China, it really doesn’t matter where TikTok’s US user data is physically stored or by whom.
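To make this concrete, here is a minimal sketch – in Python, with entirely hypothetical roles and regions, not TikTok’s actual access-control code – of why a “master admin” role nullifies any region-based storage policy:

```python
# Illustrative sketch (hypothetical roles/regions): a region-based access
# policy is meaningless if a "master admin" role bypasses it entirely.

def can_access(user_role: str, user_region: str, data_region: str) -> bool:
    """Return True if a user may read data stored in data_region."""
    if user_role == "master_admin":
        # A master admin bypasses every regional restriction --
        # where the data is physically stored no longer matters.
        return True
    # Ordinary staff may only read data stored in their own region.
    return user_region == data_region

# An ordinary Beijing-based analyst cannot read US-stored data...
assert can_access("analyst", "CN", "US") is False
# ...but a Beijing-based master admin can.
assert can_access("master_admin", "CN", "US") is True
```

The sketch is deliberately simple, but it captures the structural point: the storage-location rule only binds users who don’t hold the bypass role.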

The “Protected Data” Problem

TikTok’s “protected data” problem is more complicated to dissect, because the responsibility for fixing it falls not solely on TikTok, but also on US regulators, as well as every lawyer and lobbyist in between.

Despite all the rage and rancor about how scary it would be for the Chinese Communist Party to access US user data via TikTok, the Buzzfeed story revealed that what kind of data is considered “protected” data is still “being negotiated” – a euphemistic way of saying the regulation is being “watered down”.

While we don’t know what types of data are considered “protected”, it appears that at least one type of data, the UID, is not:

“In a recorded January 2022 meeting, the company’s head of product and user operations announced with a laugh that unique IDs (UIDs) will not be considered protected information under the CFIUS agreement: “The conversation continues to evolve,” they said. “We recently found out that UIDs are things we can have access to, which changes the game a bit.” (Bold emphasis mine)

I emphasized the “with a laugh” detail, because it underscores just how core UID is when it comes to data access control. Any regulation that does not include core data types, like the UID, within the “protected data” definition is basically toothless.

A quick technical aside: UID stands for “unique identifier”. It is a string of numbers or alphanumeric characters, often automatically generated, that serves as the “locator” of a record in a database or spreadsheet. To find a piece of data, you typically query by the UID plus a column or field name, like “age” or “email”.
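To illustrate that lookup pattern, here is a small sketch using Python’s built-in sqlite3 module; the table, UID value, and email are made up purely for illustration:

```python
import sqlite3

# Hypothetical table and values, illustrating the "UID + field name"
# lookup pattern: the UID locates the record, the column names the field.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (uid TEXT PRIMARY KEY, age INTEGER, email TEXT)")
conn.execute("INSERT INTO users VALUES ('a1b2c3', 29, 'jane@example.com')")

# To fetch a piece of data, query the UID plus a column name:
row = conn.execute("SELECT email FROM users WHERE uid = ?", ("a1b2c3",)).fetchone()
print(row[0])  # jane@example.com
```

Whoever holds a user’s UID can run this kind of query against any column – age, email, or anything else the table stores – which is exactly why the UID is so sensitive.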

The UID is basically the key to finding all the data within a database. If the UID is accessible and not considered “protected data” by US regulators, then it is almost impossible to guarantee that TikTok’s US user data is shielded from access by engineers or data analysts located in China, let alone people with “master admin” privileges.

And since TikTok has already announced it is routing all user traffic to Oracle Cloud, it’s worth noting that in the official Oracle Database documentation, it states that a “UID returns an integer that uniquely identifies the session user (the user who logged on).”

In Oracle land, a UID most certainly leads to user data…which brings us to our last problem.

The “Oracle” Problem

Oracle has somehow cornered the cloud infrastructure business of “tech companies with a China problem” – first with Zoom, then with TikTok. (My post “Why Zoom Chose Oracle” from two years ago goes into Oracle’s unique positioning in the US-China tech war.)

However, precisely because the business opportunity is so big for an otherwise small Oracle Cloud unit, it is not certain whether Oracle can be trusted either.

This problem was hinted at in this passage of the Buzzfeed report:

“It also appears that Oracle is giving TikTok considerable flexibility in how its data center will be run. In a recorded conversation from late January, TikTok’s head of global cyber and data defense made clear that while Oracle would be providing the physical data storage space for Project Texas, TikTok would control the software layer: “It’s almost incorrect to call it Oracle Cloud, because they’re just giving us bare metal, and then we're building our VMs [virtual machines] on top of it.” Oracle did not respond to a request for comment.” (Bold emphasis mine)

Another quick technical aside: “bare metal” refers to the raw servers, microprocessors, networking cables, and storage units in a cloud data center, literally the metal hardware. VMs or virtual machines refer to the low-level software that is one layer above the hardware – a crucial layer that controls the logic of how the hardware will be used and accessed, for example, who can access the data saved on the storage units.
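To see why controlling the software layer matters, here is a toy sketch – hypothetical classes, not Oracle’s or TikTok’s actual architecture – in which the hardware provider supplies only raw storage, while the tenant’s VM layer decides who may read it:

```python
# Illustrative sketch (hypothetical names): on bare metal, the hardware
# provider supplies storage with no opinion about who reads what; whoever
# controls the VM/software layer writes the access-control logic.

class BareMetal:
    """Raw hardware storage: no access rules of its own."""
    def __init__(self):
        self.disk = {}

class VirtualMachineLayer:
    """The software layer the tenant controls -- including access rules."""
    def __init__(self, hardware: BareMetal, access_policy):
        self.hw = hardware
        self.access_policy = access_policy  # tenant-supplied function

    def read(self, requester: str, key: str):
        if not self.access_policy(requester):
            raise PermissionError(f"{requester} denied")
        return self.hw.disk.get(key)

# The tenant, not the hardware provider, decides the policy:
vm = VirtualMachineLayer(BareMetal(), access_policy=lambda who: who == "tenant_engineer")
vm.hw.disk["user:a1b2c3"] = "user data"
print(vm.read("tenant_engineer", "user:a1b2c3"))  # user data
```

In this arrangement, the party renting the bare metal writes the `access_policy` – which is why “they’re just giving us bare metal” means the hardware provider has little visibility into who is actually accessing the data.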

It’s possible that Oracle’s interest is sandwiched between US regulators and TikTok. On the one hand, Oracle obviously wants to earn the trust of Washington, by closely monitoring TikTok’s US user data access. But on the other hand, TikTok is also an important source of business and validation for Oracle Cloud – a tiny player in the cloud industry.

How important?

When Oracle released its full fiscal 2022 earnings last week, the stock price spiked 12% while the NASDAQ dropped 4.7%, because Oracle Cloud grew 20+% and its CEO projects the unit will grow 30+% in fiscal 2023.

TikTok is a huge part of that growth. When the TikTok blog post that was meant to pre-empt the Buzzfeed story was published, the part about routing all US traffic to Oracle Cloud was well-received by Wall Street analysts who cover Oracle.

How much “flexibility” Oracle will continue to give TikTok to keep its business, while assuaging Washington regulators’ concerns, remains a big question mark.

All in all, TikTok has a serious “American Credibility” problem – some of it is within its control to fix, some of it requires both Oracle and the relevant US regulators to act responsibly as well.

How can this credibility problem be resolved? How can TikTok be trusted as truly “American”?

See "Solutions to TikTok's American Credibility Problem" on how I think TikTok can fix its credibility problems.



