Short-form video platform TikTok pulled down over 360,000 videos uploaded by Kenyans after backlash over its content moderation.
This was about 0.3 per cent of the total videos uploaded in the country, according to the firm's Q2 2024 Community Guidelines Enforcement Report, implying roughly 120 million uploads over the quarter.
The Chinese-owned platform had come under scrutiny over content shared on the app, with Kenya threatening to shut down its operations in the country.
The spread of adult content, misinformation, and hate speech on the app in Kenya had raised questions about the effectiveness of its content moderation efforts, which rely partly on artificial intelligence.
However, in the report the firm says that 99.1 per cent of the videos deleted in Kenya were proactively removed before users reported them, with 95 per cent taken down within 24 hours.
“A total of 60,465 accounts were banned for violating TikTok’s Community Guidelines, with 57,262 accounts being removed because they were suspected to be under the age of 13,” reads the report in part.
With over a billion users and millions of pieces of content posted daily, TikTok says it continues to invest in advanced technologies that enhance content understanding and assess potential risks.
“These innovations allow the platform to detect and remove harmful content before it reaches viewers,” the report says.
In June 2024 alone, the platform removed over 178 million videos globally, with 144 million of those removed through automation.
“These technical advancements significantly reduce the volume of content that human moderators need to review, thereby minimising their exposure to violative material.”
With a proactive detection rate now at 98.2 per cent globally, the firm says it is more effective than ever at addressing harmful content before users encounter it.
TikTok adds that it will continue to invest in moderation technologies, citing its commitment to transparency and platform safety for its user base in Kenya and worldwide.