TikTok announces significant layoffs in Malaysia, targeting content moderation roles, as part of its strategy to improve AI-driven content management and adapt to new regulatory frameworks.
TikTok, a popular social media platform owned by Chinese tech giant ByteDance, has announced the layoff of hundreds of employees as it aims to refine its content moderation processes using artificial intelligence (AI). The company confirmed these changes on Friday, marking a major shift in its operational strategy.
The layoffs, which affect nearly 500 employees in Malaysia, primarily target content moderation roles. The cuts form part of a broader effort by TikTok to bolster its moderation capabilities with more advanced automated systems. A TikTok spokesperson explained that the workforce reduction is intended to strengthen the company’s global content moderation framework and keep the platform safe and user-friendly.
This operational shift highlights TikTok’s commitment to integrating advanced technology into its content management processes. The company has long used a blend of human moderators and automated tools to identify and remove content that violates its community guidelines, and AI systems currently remove approximately 80% of such content. By relying more heavily on AI, TikTok aims to make its moderation faster and more efficient.
ByteDance employs over 110,000 people across the globe, and further job cuts are anticipated next month as part of a strategy to streamline and consolidate regional operations. Despite the layoffs, the company reassures stakeholders of its continued investment in trust and safety measures, emphasising the importance of maintaining a secure online environment for its users through innovative solutions.
The timing of the layoffs coincides with regulatory changes in Malaysia concerning social media operations. The Malaysian government has recently tightened its rules, requiring all social media platforms to obtain an operating licence by January. The licensing requirement is part of a broader initiative to combat cybercrime. Earlier this year, the government asked social media companies, including TikTok, to step up content monitoring in response to a marked rise in malicious content.
As TikTok moves towards its new AI-powered model, the company is navigating internal restructuring and external regulatory pressure at the same time. The intersection of these factors underscores how continuously social media companies must adapt to technological and regulatory developments. As TikTok advances down this path, its approach to content moderation is likely to serve as a case study in balancing technological integration with regulatory compliance.
Source: Noah Wire Services