
In the second quarter of 2025, TikTok removed approximately 600,000 videos in Kenya for violating its community guidelines. This information was shared in the platform’s recent quarterly Community Guidelines Enforcement Report (CGER).
The Q2 figure marks a notable increase over previous periods: 450,000 videos were removed in Q1 2025 and over 360,000 in Q2 2024. This rising trend reflects not only a surge in content creation on the platform but also TikTok's intensified content moderation, driven in part by pressure from Kenyan regulators.
In March 2025, the Communications Authority of Kenya (CA) issued five official demands to TikTok after a BBC investigation exposed the platform's failure to prevent the exploitation of minors in sexualized livestreams. These demands included the removal of all sexual content involving minors, an investigation into the allegations, an explanation of how the offending material bypassed moderation, and a commitment to step up public education on child online safety.
TikTok's Q2 enforcement report points to a largely proactive approach: 92.9% of the 592,037 videos removed in Kenya were taken down before receiving any views, and 96.3% were removed within 24 hours of upload, which the company attributes to the effectiveness of its automated moderation systems.
Globally, TikTok removed over 189 million videos in the same quarter, representing just 0.7% of all uploaded content. Of these removals, 99.1% were initiated proactively, 94.4% occurred within 24 hours, and automated systems accounted for 163.9 million of them. The company also reported removing 77 million fake accounts and 26 million accounts suspected of belonging to users under 13.
TikTok asserts that removing content that violates its community guidelines is essential for curbing the negative impacts of misinformation, hate speech, and other harmful material on the platform.