Oracle has begun vetting TikTok's algorithms and content moderation models to ensure they have not been manipulated by Chinese authorities, according to Axios. The move is meant to give lawmakers further assurance that TikTok's US platform is being operated independently of influence from the Chinese Communist Party.
TikTok is owned by Chinese tech giant ByteDance, which bought the US lip-syncing app musical.ly in 2017 and later folded it into TikTok. Since then, the app has skyrocketed in popularity in the US. Facing longstanding pressure from the U.S. government, TikTok said in June that it had begun routing all of its U.S. user data to Oracle's cloud infrastructure. In a response to a letter from Republican senators inquiring about the protection of U.S. user data, which was obtained by The New York Times, the company also hinted that it would partner with an outside firm to oversee its algorithms. Those moves are part of a broader TikTok effort called Project Texas, which is meant to assure U.S. TikTok users and lawmakers that U.S. user data is safe and that content recommendations aren't being manipulated. The project's name refers to Oracle's headquarters in Texas.
New changes
TikTok has been preparing Project Texas for over a year by separating the backend functions and code of its U.S. operations. The new arrangement gives Oracle "regular vetting and validation" of TikTok's content recommendation and moderation models, according to Axios. The reviews, a source told Axios, officially began last week, now that all new U.S. user traffic is being routed to Oracle's cloud infrastructure. (It's still unclear when TikTok will finish migrating its existing U.S. user data to Oracle's cloud, as planned.)
The reviews give Oracle visibility into how TikTok's algorithms surface content "to ensure that outcomes are in line with expectations and that the models have not been manipulated in any way," a TikTok spokesperson said. A 2019 Guardian report suggested that TikTok had in the past censored content in a way that aligned with Beijing's foreign policy messaging; TikTok has said it has since changed its content moderation guidelines. Oracle will also conduct regular audits of TikTok's content moderation processes, covering both its automated systems and its human moderators. Giving Oracle access "will ensure that content continues to be flagged and actioned appropriately based on our Community Guidelines and no other factors," the spokesperson added.