Last week, I wrote a detailed post breaking down TikTok’s immense credibility challenges in America, as revealed by Buzzfeed’s reporting based on more than 80 leaked internal meeting recordings.

In today’s post, I propose some viable solutions – some technical, some organizational – to each of TikTok’s four main credibility problems:

  • “Whistleblower” problem
  • “Master Admin” problem
  • “Protected data” problem
  • “Oracle” problem

I conclude today’s post with some additional thoughts on TikTok’s own leverage in the US, which complicates how the app could be regulated under the Biden Administration. If you have not read last week's post yet, I highly suggest you do that first to build some context around each of these problems.

“Whistleblower” Problem: RBAC Bifurcation

With a full-blown “whistleblower” problem on ByteDance’s hands, there are only two ways this problem can end – either ByteDance leadership earns the trust of its US employees, or the leaking continues until there is nothing left worth whistleblowing.

Obviously, from ByteDance’s perspective, it would prefer the first outcome. But how can it rebuild that elusive internal trust?

I propose a technical solution: RBAC bifurcation.

RBAC stands for Role-Based Access Control. It is a set of standard operational policies that describes and enforces what type of employee can access what data or information. Are you a junior-level software engineer? You can access this codebase but not others. Are you a manager of a data analytics team? You can access these tables in your data warehouse to do analytics but not other transactional databases. Are you a senior-level site reliability engineer? You can access these logs of user traffic but not the data warehouse that the manager of that data analytics team has access to.
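The core of RBAC can be sketched in a few lines of Python. The roles, resources, and policy table below are purely illustrative – they are not ByteDance's actual roles or systems – but they show how a policy both describes and enforces access:

```python
# Minimal RBAC sketch: a policy table maps each role to the set of
# resources it may access. All names here are hypothetical examples.

POLICIES = {
    "junior_engineer":   {"app-codebase"},
    "analytics_manager": {"app-codebase", "analytics-warehouse"},
    "senior_sre":        {"app-codebase", "traffic-logs"},
}

def can_access(role: str, resource: str) -> bool:
    """Grant access only if the role's policy explicitly lists the resource."""
    return resource in POLICIES.get(role, set())

# The senior SRE can read traffic logs, but not the warehouse that
# the analytics manager uses:
assert can_access("senior_sre", "traffic-logs")
assert not can_access("senior_sre", "analytics-warehouse")
```

Real-world RBAC systems layer on groups, inheritance, and audit trails, but the principle is the same: access is granted by explicit policy, not by default.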

Having a set of RBAC policies that constantly evolves as a company grows is industry best practice. I’m sure ByteDance’s vast global operation already has some level of RBAC in place. In order to rebuild internal trust, ByteDance needs to take its current RBAC a step further – completely bifurcate TikTok’s US operation’s RBAC implementation from ByteDance China.

Replicating then bifurcating RBAC between a China operation and a US operation is no easy feat, but it is very doable and a step that ByteDance can no longer avoid or delay. The bifurcated RBAC policies should also be made available for US regulators to audit. I advocated for this approach two years ago in a post called “Can ByteDance Build Trust”. Back then, I went as far as suggesting that ByteDance’s internal RBAC policies should be open sourced too! Open sourcing would do wonders for TikTok in its quest to build public trust, but that is much easier said than done (just look at Twitter’s journey to open source its algorithms – a GitHub repo that is now deleted). At the very least, ByteDance’s RBAC policies should be made auditable.

Without audited RBAC policies to back up TikTok’s public promises of protecting US user data, no well-meaning, well-crafted internal emails and all-hands meetings from ByteDance leadership can rebuild trust. Without that trust, no matter how quickly TikTok can hire people in the US, many of these employees will continue to churn, whistleblow, or do both.

“Master Admin” Problem: Empower Trust and Safety Team

Related to the RBAC bifurcation approach, the way to solve TikTok’s “Master Admin” problem lies in empowering the already-created but relatively powerless US Trust and Safety team. This solution begins with giving this team the same “master admin” power and access privilege, then separating it from the China team’s “master admin”. In this organizational setup, whenever a “master admin” based in China wants to access something – anything – in TikTok US, this person must gain permission from the US Trust and Safety team’s “master admin”.

Like I explained in last week’s post, there is nothing wrong with having “master admin” as a function or role. The problem with ByteDance’s current organizational setup is that this role sits in China and only in China. If this power is split between the US and China, where each side’s “master admin” is treated equally and serves as the “gatekeeper” of its own respective operation’s data, codebase, and all user information, then suspicious requests can be reviewed and rejected, while legitimate requests can still be granted.

There may very well be good reasons why a ByteDance China engineering team needs access to certain parts of TikTok US. The legitimacy of these requests can and should be determined by the US Trust and Safety team, in concert with its US leadership team, not China’s. Plain and simple. A reciprocal process should also be applied to any TikTok US engineering team that wants to access some parts of ByteDance China as well.
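The gatekeeping process above can be sketched as a simple approval gate: same-region access flows through normal RBAC, while any cross-region request is blocked until the target region's "master admin" signs off. The data model here is a hypothetical illustration, not ByteDance's actual workflow:

```python
# Hypothetical sketch of the dual "master admin" gate. A cross-region
# request is only granted after the target region's admin approves it;
# the reciprocal rule applies in both directions.

from dataclasses import dataclass

@dataclass
class AccessRequest:
    requester_region: str            # e.g. "CN" or "US"
    target_region: str
    resource: str
    approved_by_target_admin: bool = False

def grant(request: AccessRequest) -> bool:
    # Same-region access goes through the region's own RBAC as usual.
    if request.requester_region == request.target_region:
        return True
    # Cross-region access additionally requires the target region's
    # master admin to review and approve.
    return request.approved_by_target_admin

req = AccessRequest("CN", "US", "us-user-table")
assert not grant(req)                 # blocked until the US admin reviews it
req.approved_by_target_admin = True
assert grant(req)                     # legitimate requests can still proceed
```

The key design choice is symmetry: neither side's admin outranks the other, so suspicious requests can be rejected in either direction without halting legitimate collaboration.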

It is only fair.

“Protected Data” Problem: Regulators Publish Definitions and Reasoning

While both the “whistleblower” and “master admin” problems are issues that ByteDance, and ByteDance alone, can tackle, resolving the “protected data” problem is more in the hands of the US regulators.

It is incumbent upon these regulators to not water down how TikTok (and in effect, all Chinese tech companies) are regulated, especially the definition of “protected data”. When the “protected data” definitions are determined, these definitions should go through a public comment period (typical of any new Federal regulation), then be published with the reasoning behind what type of data is and is not considered “protected” clearly explained.

As Buzzfeed’s reporting revealed, this definition is still “being negotiated”. Meanwhile, data types like UID (unique identifier) are still accessible. (Please see my post last week, where I provide a technical primer on UID and why it could lead to user data access.) Whether UID should or should not be considered “protected data” in TikTok’s case requires knowledge of how TikTok’s internal database system is designed, architected, and operationalized, which I, of course, do not have. But regardless of how UID (or any other data type) is treated, I, as an American citizen (and occasional TikTok user), deserve to know why.

This type of disclosure is not new or unprecedented. One prior art to reference is how personal healthcare data is regulated, protected, defined, and publicly explained under HIPAA (Health Insurance Portability and Accountability Act). In HIPAA, there is a concept called PII (personally identifiable information) that encompasses a sub-concept called PHI (protected health information), which together explain what kind of personal data and information is considered “protected” in the American healthcare system.
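Once a regulator publishes a definition, it can be encoded as an explicit, auditable classification – something any compliance system can check mechanically. A minimal sketch, where the field names and the protected list are entirely hypothetical stand-ins for whatever the published definition would contain:

```python
# Illustrative sketch: a published "protected data" definition encoded
# as an explicit list that auditors and engineers can both read.
# The field names below are made up for illustration.

PROTECTED_FIELDS = {"phone_number", "email", "device_id", "uid"}

def classify(field: str) -> str:
    """Classify a data field per the (hypothetical) published definition."""
    return "protected" if field in PROTECTED_FIELDS else "unprotected"

assert classify("uid") == "protected"
assert classify("video_like_count") == "unprotected"
```

The value of such a list is less the code itself than the fact that it is explicit: anyone can look up why a given field is, or is not, “protected”.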

What counts as PII or PHI is easily Google-able. The same clarity and transparency should exist when regulating TikTok.

“Oracle” Problem: Flow Log Audit

Interestingly, the “Oracle” problem can be solved by one of Oracle’s own products – VCN (Virtual Cloud Network) flow log. This product just became generally available in Oracle Cloud in early 2021.

A flow log is a common tool, not specific to Oracle Cloud. Every cloud platform, from AWS to AliCloud, has a similar tool. A flow log lets you view every connection that goes in and out of your cloud infrastructure, i.e. a log of the “flow” of your entire web traffic. And as Oracle’s own product announcement of its VCN flow log clearly states, this feature serves two purposes:

  • Troubleshooting and Monitoring
  • Regulatory and Compliance

Currently, Oracle Cloud gives TikTok too much flexibility in how it uses its cloud infrastructure, offering bare metal machines while TikTok designs its own software layer on top of these machines – thus its “Oracle” problem. In order to verify whether TikTok is complying with US regulations on Oracle Cloud, Oracle should make its TikTok-related flow logs available for auditing.

With a flow log audit, there will be no need to guess whether TikTok allows access from China or not – every traffic request from anywhere will be plainly visible in TikTok’s flow log. In another post from two years ago, I specifically called out using flow logs as part of a framework to “(dis)trust and verify” TikTok, back when TikTok was using a combination of AWS, GCP, and its own Virginia-based data centers as its cloud. The only difference now is that we know which cloud will have TikTok’s flow log going forward – Oracle Cloud.
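What a flow log audit looks like mechanically can be sketched in a few lines: scan each connection record and flag any source address that falls outside an allow-listed set of networks. The record format and CIDR ranges below are invented for illustration – real VCN flow logs carry more fields (ports, protocol, action, byte counts) – but the auditing logic is the same:

```python
# Hypothetical flow-log audit: flag any connection whose source IP
# falls outside every allow-listed network. Record format and the
# allow-list are illustrative, not Oracle's actual log schema.

import ipaddress

ALLOWED_NETWORKS = [
    ipaddress.ip_network("10.0.0.0/8"),       # internal infrastructure
    ipaddress.ip_network("203.0.113.0/24"),   # approved partner range
]

def flag_suspicious(flow_records):
    """Return records whose source IP is outside every allowed network."""
    flagged = []
    for rec in flow_records:
        src = ipaddress.ip_address(rec["src"])
        if not any(src in net for net in ALLOWED_NETWORKS):
            flagged.append(rec)
    return flagged

logs = [
    {"src": "10.1.2.3",     "dst": "10.4.5.6"},   # internal, fine
    {"src": "198.51.100.7", "dst": "10.4.5.6"},   # outside allow-list
]
assert [r["src"] for r in flag_suspicious(logs)] == ["198.51.100.7"]
```

An auditor running checks like this over the complete log, rather than trusting verbal assurances, is the practical meaning of “(dis)trust and verify”.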

TikTok’s Leverage

TikTok’s American credibility problems may be difficult, but they are by no means insurmountable, if ByteDance, Oracle, and the US government all act responsibly and in good faith. As an entrepreneur and operator, I don’t just like to talk about problems, but also possible solutions, and I hope I presented a few viable options here for all relevant parties to consider.

Whether it is RBAC or flow log, longtime readers of Interconnected will notice that many of these solutions are similar to the ones I wrote about in mid-2020. That’s because TikTok, despite its many promises to improve, has not changed. What has changed is that President Biden, not Trump, is now in charge.

Is that a good thing for regulating TikTok? I’m honestly not sure.

A recent issue of POLITICO's West Wing Playbook revealed that the Democratic Party has started its own official TikTok account, in order to appeal to young voters between the ages of 18-34, whose approval rating of Biden is low. Two rising stars of the Democratic Party, Stacey Abrams and Jon Ossoff, are both active on TikTok with large followings. During the early days of the Russia-Ukraine war, the Biden White House specifically gathered a group of 30 TikTok influencers – not YouTube, not Instagram – to brief them on the administration’s core message, so that these influencers could help spread the word.

Like it or not, TikTok’s traction among American youth is strong, thus it has real leverage. As the political party that historically relies more on youth votes to win, the Democrats need TikTok, which puts the Biden administration in a tricky spot of having to both regulate TikTok and use it too!

Does this dynamic automatically mean the Biden administration will do a poor job of regulating TikTok?

No. But it does make an already complicated problem even more complicated.
