TikTok said Wednesday it pulled four million “violative” videos in the EU in September, in its first transparency report since a new law against illegal and harmful content came into force across the bloc.
The Chinese-owned video-sharing platform, favored by younger online users, also stressed that it has 6,125 employees dedicated to moderating content in the European Union, across all of the bloc's official languages.
The figures were given as part of TikTok’s obligation under the EU’s new Digital Services Act (DSA) for major online platforms to provide a transparency report every six months.
TikTok did not previously release monthly removal data for the bloc as a whole, leaving the significance of the September figure unclear until compared with future reports.
The DSA, which came into effect in August, threatens very large online platforms and search engines with fines of up to six percent of global turnover for violations.
TikTok and 18 other platforms fall into that category for heightened EU scrutiny, because they have at least 45 million monthly users in the bloc.
Others include: Meta’s Facebook and Instagram; Alphabet’s YouTube and Google Search; X, formerly known as Twitter; Microsoft’s Bing search engine and LinkedIn; Apple’s App Store; Alibaba’s AliExpress; and Wikipedia.
The European Commission last week announced it had opened probes into TikTok and Meta, asking them for more details on the measures they have taken to stop the spread of “illegal content and disinformation” after the Hamas attack on Israel.
TikTok said that, as of September 2023, it had 134 million users in the European Union.
More work to do
It said it was “proud” of the efforts it has made so far but recognized that “we still have work to do”.
TikTok said in its report it “proactively” looks for content deemed illegal or harmful under its policies, using automated systems in the first instance, backed up where necessary by human review.
It said that the amount removed on its own initiative was “seven times more than the volume of violative content removed following a user report”.
The company said it has created a new in-app channel for users to report suspected illegal content, in line with its obligations under the DSA.
It also said that when it receives removal requests from authorities in the EU, it reviews the content in light of its policies and of national and EU laws.
TikTok said that, in September, it received 17 removal requests from governments in the EU.
It also received 452 requests from governments in the bloc for information about users and accounts, which it weighed “on a case-by-case basis” to respect users’ privacy and other rights.
The median time to act on a reported video was 13 hours, it said, explaining that weighing legal obligations alongside issues such as freedom of expression takes time.