The main reason behind the United States' push to ban the social media application TikTok is Israel's image rather than fears of Chinese infiltration, congressional insiders have revealed.
At the Munich Security Conference, US Senator Mark Warner, the top Democrat on the Senate Intelligence Committee, said he wished to reveal what he called the "real story" behind the recent legislation to restrict the Chinese-owned application.
"So we had a bipartisan consensus," said Mike Gallagher, the former Republican congressman who co-authored the bill. "We had the executive branch, but the bill was still dead until October 7th. And people started to see a bunch of antisemitic content on the platform, and our bill had legs again."
A memo produced by the State Department for its Near East Affairs diplomats, obtained by journalist Ken Klippenstein, describes how Emmanuel Nahshon, Israel's deputy director general for public diplomacy at the foreign ministry, blamed young people's opposition to the war on Gaza on TikTok's algorithm.
"Some wonder why there was such overwhelming support for us to shut down potentially TikTok or other entities of that nature," former Republican presidential candidate Mitt Romney said in May. "If you look at the postings on TikTok and the number of mentions of Palestinians, relative to other social media sites, it's overwhelmingly so among TikTok broadcasts."
Nikki Haley, another former Republican presidential candidate, who last year infamously inscribed "finish them" on an Israeli artillery shell bound for Gaza, said she believed that simply watching videos on TikTok would make a user antisemitic, claiming during a 2023 primary debate that every 30 minutes of daily viewing makes a user "17 percent more antisemitic".
Can you explain a bit more what you mean by this? As I understand it, this article does not suggest the algorithm is favoring pro-Palestinian narratives. If anything, the extreme imbalance in post counts between the two sides suggests a preexisting pro-Palestinian viewpoint among the user base rather than one that was artificially boosted.
Well, when they say "user engagement" they seem to be talking about users taking explicit actions to engage with content. But they say elsewhere that the TikTok algorithm doesn't generally seem to respond to these actions, so it's unsurprising that this isn't the cause. Instead, the algorithm appears to optimize for viewing time, which unfortunately isn't part of the available data here.
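To illustrate why engagement stats can miss what a watch-time optimizer actually promotes, here's a toy comparison. All numbers are invented and nothing here reflects TikTok's real ranking; it just shows that ranking by explicit actions (likes, comments) and ranking by watch time can order the same videos oppositely:

```python
# Hypothetical videos with invented stats: clip_1 gets lots of explicit
# engagement but is skipped quickly; clip_2 gets few actions but is
# watched nearly to the end.
videos = [
    {"id": "clip_1", "likes": 900, "comments": 120, "avg_watch_seconds": 4.0},
    {"id": "clip_2", "likes": 150, "comments": 10,  "avg_watch_seconds": 27.0},
]

# Rank by explicit engagement actions vs. by average watch time.
by_engagement = sorted(videos, key=lambda v: v["likes"] + v["comments"], reverse=True)
by_watch_time = sorted(videos, key=lambda v: v["avg_watch_seconds"], reverse=True)

print([v["id"] for v in by_engagement])  # ['clip_1', 'clip_2']
print([v["id"] for v in by_watch_time])  # ['clip_2', 'clip_1']
```

So a dataset that only records likes, shares, and comments can't tell you what a system optimizing viewing time would surface.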
However, if we start with a strongly pro-Palestinian user base (as the initial post counts suggest), it's not surprising that these users would be more inclined to watch content that shares their political views, and the algorithm would therefore boost these more popular videos to more people. So these numbers really don't show anything unusual that I can identify.
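Here's a minimal simulation of that feedback loop, under assumptions I'm making up for illustration: a 70/30 user split, users watching agreeable content longer, and a recommender that samples videos purely in proportion to accumulated watch time (a stand-in, not TikTok's actual system):

```python
import random

random.seed(0)

USER_SKEW = 0.70       # invented: fraction of users preferring viewpoint "A"
MATCH_WATCH = 0.9      # invented: watch fraction when content matches views
MISMATCH_WATCH = 0.2   # invented: watch fraction when it doesn't

# 50 videos per viewpoint, all starting with equal accumulated watch time.
videos = [{"view": side, "watch_time": 1.0} for side in ("A", "B") for _ in range(50)]

for _ in range(10_000):
    # Recommender: pick a video proportional to its accumulated watch time.
    # Note there is no political weighting anywhere in this rule.
    total = sum(v["watch_time"] for v in videos)
    r = random.uniform(0, total)
    for v in videos:
        r -= v["watch_time"]
        if r <= 0:
            break
    # A random user watches more or less depending on viewpoint match.
    user_pref = "A" if random.random() < USER_SKEW else "B"
    v["watch_time"] += MATCH_WATCH if v["view"] == user_pref else MISMATCH_WATCH

share_a = sum(v["watch_time"] for v in videos if v["view"] == "A")
share_a /= sum(v["watch_time"] for v in videos)
print(f"viewpoint A's share of total watch time: {share_a:.0%}")
```

Run this and viewpoint A's share climbs past the 70% user split and keeps rising: more watch time means more recommendations means more watch time. The point is that a lopsided view count is exactly what a neutral watch-time optimizer produces from a skewed audience, so the count alone can't distinguish that from deliberate boosting.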
Maybe that's all you were saying initially, but I was more wondering whether there is evidence that TikTok is boosting certain content above and beyond what interests its users in an attempt to influence them. Given the lack of key data, this analysis can't directly answer that question, but the patterns here strike me as fairly organic-looking.