TikTok opens transparency center as lawmakers weigh US ban

TikTok is staring down an outright ban in the US. It’s already been banned on federal employees’ devices and blocked by dozens of universities across the country, and lawmakers are calling for its removal from US app stores.

With that in mind, I and a handful of other journalists were invited to the company’s Los Angeles headquarters earlier this week for the first media tour of its Transparency and Accountability Center. It’s a space that, like the political discussion of TikTok these days, is more about signaling than substance. Company officials say the center is designed for regulators, academics, and auditors to learn more about how the app works and about its security practices. We were told that a politician, who wished not to be named, had toured it the day before. TikTok eventually plans to open more centers in Washington, DC, Dublin, and Singapore.

Our tour was part of a multi-week TikTok press blitz to promote Project Texas, a novel proposal to the US government that would wall off American user data as an alternative to an outright ban. TikTok CEO Shou Zi Chew was in DC last week delivering a similar pitch to policymakers and think tanks. He is scheduled to testify before Congress for the first time in March.

Photo by Allison Zaucha for The Verge

TikTok isn’t the first embattled tech company to lean on the spectacle of a physical space during a PR crisis. In 2018, Facebook invited journalists to tour its election war room, which was actually just a glorified conference room full of employees staring at social media feeds and dashboards. Photos were taken, stories were written, and then the War Room closed about a month later.

Similarly, TikTok’s transparency center is mostly smoke and mirrors intended to give the impression that the company really cares. Large touchscreens explain how TikTok works at a high level, along with a broad overview of the kinds of trust and safety efforts that have become table stakes for every major platform.

A crucial difference, however, is a room that my tour group was not allowed to enter. Behind a wall with Death Star-like mood lighting, TikTok officials said a server room houses the app’s source code for external auditors to review. Anyone entering must sign a non-disclosure agreement, go through metal detectors, and lock their phone in a locker. (It wasn’t clear who exactly was allowed into the room.)

A room where you can interact with a mock version of the moderation software used by TikTok.

Photo by Allison Zaucha for The Verge

The interactive part of the center I got to experience included a room with iMacs running a dummy version of the software that TikTok says its moderators use to review content. There was another room with iMacs running “code simulators.” While that sounded intriguing, it was really just a basic explanation of TikTok’s algorithm that seemed designed for a typical member of Congress. Close-ups of the computer screens were not allowed. And although it’s been dubbed a transparency center, TikTok’s PR department asked that we not quote or directly attribute comments from the staff running the tour.

On the moderator workstation, I was shown some potentially harmful videos to review, along with basic information like the accounts that posted them and the number of likes and reshares each video received. When I pulled up a video of a man speaking into the camera with the caption “World Brings Up 9/11 to Justify Muslims as Terrorists,” the moderation system asked me to decide whether it violated one of three policies, including one on “threats and incitement to violence.”

On the code simulator iMac in the other room, I was hoping to learn more about how TikTok’s recommendation system actually works. This was, after all, a physical place I had traveled to. Surely there was information here that I couldn’t find anywhere else?

What I got was this: TikTok first uses a “coarse machine learning model” to select “a subset of a few thousand videos” from the billions hosted on the app. Then a “moderate machine learning model further narrows the fetch pool to a smaller pool of videos” it believes you will be interested in. Finally, a “fine machine learning model” makes the final pass before serving up videos it thinks you’ll like on your For You page.
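To make the funnel concrete: the three stages described above amount to re-ranking a shrinking candidate pool with progressively more expensive models. The sketch below is purely illustrative; the scoring functions, pool sizes, and structure are my own invention, not anything TikTok disclosed.

```python
import random

def score(video_id: int, stage: int) -> float:
    """Stand-in for a learned relevance model; each stage gets its own
    deterministic pseudo-score. A real system would run an actual model here."""
    return random.Random(video_id * 1000 + stage).random()

def cascade(candidates: list[int]) -> list[int]:
    # Stage 1: a "coarse" model trims the full catalog to a few thousand.
    pool = sorted(candidates, key=lambda v: score(v, stage=1), reverse=True)[:2000]
    # Stage 2: a "moderate" model narrows that to a smaller pool.
    pool = sorted(pool, key=lambda v: score(v, stage=2), reverse=True)[:200]
    # Stage 3: a "fine" model makes the final pass for the feed.
    return sorted(pool, key=lambda v: score(v, stage=3), reverse=True)[:10]

feed = cascade(list(range(100_000)))  # a toy candidate set
print(len(feed))  # 10 videos survive the funnel
```

The design point is cost: the cheap model sees everything, while the expensive model only ever sees a pool small enough to rank in real time.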

The information displayed was frustratingly vague. One slide read that TikTok “recommends content by ranking videos based on a combination of factors, including the interests that new users convey to TikTok when they first interact with the app, as well as preferences that change over time.” This is exactly how you would expect it to work.

Eric Han, Head of USDS Trust and Safety at TikTok.

Photo by Allison Zaucha for The Verge

TikTok first attempted to open this transparency center in 2020, when then-President Donald Trump was trying to ban the app and Kevin Mayer was its CEO for a full three months. But the pandemic delayed the center’s opening until now.

Over the past three years, TikTok’s trust deficit in DC has only deepened, fueled by growing anti-China sentiment that started on the right and has since become more bipartisan. The worst revelation came in late December, when the company confirmed that employees had improperly accessed the location data of several US journalists as part of an internal leak investigation. That same month, FBI Director Chris Wray warned that China could use TikTok to “manipulate content” and, if it chooses, use it for influence operations.

TikTok’s answer to these concerns is Project Texas, a high-tech, unprecedented plan that would wall off most of TikTok’s US operations from its Chinese parent company, ByteDance. To make Project Texas a reality, TikTok is relying on Oracle, whose billionaire founder Larry Ellison used his connections as an influential Republican donor to personally secure Trump’s blessing early in the negotiations. (No one from Oracle was present at the briefing I attended, and my request to speak with someone there for this story went unanswered.)

Before the tour, I was given a brief overview of Project Texas, though I was asked not to quote or directly attribute the staff who presented it. A graphic I was shown depicted a Supreme Court-like building with five pillars representing the issues Project Texas aims to address: organizational design, privacy and access control, technical assurance, content assurance, and compliance and auditing.

TikTok says Project Texas has already required thousands of people and over $1.5 billion to create. The effort involves TikTok forming a separate legal entity called USDS, with a board of directors independent of ByteDance that reports directly to the US government. More than seven external auditors, including Oracle, will review all data flowing in and out of the US version of TikTok. Only American user data will be used to train the algorithm in the US, and TikTok says there will be strict compliance requirements for any internal access to US data. If the government approves the proposal, TikTok estimates it will cost between $700 million and $1 billion per year to maintain.

Whether or not Project Texas appeases the government, it certainly seems like it’s going to make things harder for TikTok. The US version of TikTok will need to be fully deconstructed, rebuilt, and published to US app stores by Oracle, and Oracle must also review each app update. Duplicate roles will be created for TikTok in the US, even though the same roles already exist for TikTok elsewhere. And app performance could suffer when Americans interact with users and content in other countries, since American user data must be managed within the country.

One name went unspoken throughout the briefing: ByteDance. I got the impression that TikTok employees were uncomfortable talking about their relationship with their parent company.

While ByteDance wasn’t directly acknowledged, its ties to TikTok weren’t hidden, either. The Wi-Fi network in the building I was in was named ByteDance, and the conference room screens in the transparency center featured Lark, the internal communication tool ByteDance developed for its employees around the world. At one point during the tour, I tried to ask what would hypothetically happen if, once Project Texas got the green light, a ByteDance employee in China made an inappropriate request of an employee in TikTok’s US unit. I was quickly told by a member of TikTok’s PR team that the question wasn’t appropriate for the tour.

Ultimately, I came away feeling that, like its powerful algorithm, TikTok built its transparency center to show people what it thinks they want to see. The company seems to have realized that the technical merits of its Project Texas proposal alone won’t save it from a US ban. The debate is now a matter of politics and optics. And unlike the tour I took, that’s something TikTok can’t control.
