
From Office Tour to Safety Talks: My Adventure at TikTok Singapore HQ

It was a calm, peaceful evening when my editor called to say, “SURPRISE! You’re going to Singapore 2 days from now!”

(Okay, so maybe that’s not exactly how it went, but that’s what happened anyway.)

So in 48 hours, I hopped on a plane and off I went to visit the TikTok HQ in Singapore.

We arrived in Singapore early in the afternoon with just enough time to check into our hotel and grab a quick bite at a nearby restaurant. Afterwards, we immediately rushed over to the TikTok office.

The TikTok TAC Tour

The TikTok HQ in Singapore, also known as the Transparency and Accountability Centre (TAC), is located in One Raffles Quay.

Visiting the offices of tech companies has always excited me. Being the techie that I am, I’ve always expected that working at a tech company means working in an office full of cool tech.

Well, the TikTok Singapore office did not disappoint.

There were a lot of touchscreen displays all around, and there was definitely no lack of the RGB lighting that techies seem to prefer.

The first item on the agenda was a quick but interactive office tour. It began with a short AV presentation introducing what TikTok is all about, its vision and mission, and its plans for community safety in 2024.

We were also given a quick peek at how things work behind the scenes at TikTok, specifically the different processes for moderating content.

Content moderation is a two-step process: the first step is done automatically by the system, or bots, and the second step involves manual review by human moderators.
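
To make the idea concrete, here’s a rough sketch of what a two-stage pipeline like that could look like in code. This is purely my own illustration: the keyword scoring, thresholds, and function names are assumptions for the sketch, not TikTok’s actual system.

```python
# Illustrative two-stage moderation pipeline (my own simplified sketch,
# not TikTok's actual implementation).

from dataclasses import dataclass

# Toy stand-in for an ML classifier: flag a few obviously risky keywords.
RISKY_KEYWORDS = {"scam", "violence", "weapon"}


@dataclass
class Video:
    video_id: str
    description: str


def automated_check(video: Video) -> str:
    """Stage 1: the system/bots score the video against the guidelines."""
    hits = sum(word in video.description.lower() for word in RISKY_KEYWORDS)
    if hits >= 2:
        return "remove"           # clear violation, removed automatically
    if hits == 0:
        return "approve"          # clearly fine, published
    return "needs_human_review"   # borderline case, escalate to a person


def human_review(video: Video) -> str:
    """Stage 2: a human moderator watches the video and makes the final call."""
    print(f"Moderator reviewing {video.video_id}: {video.description}")
    return "approve"  # placeholder decision for the sketch


def moderate(video: Video) -> str:
    decision = automated_check(video)
    if decision == "needs_human_review":
        decision = human_review(video)
    return decision


print(moderate(Video("v1", "Fun dance challenge")))         # approved automatically
print(moderate(Video("v2", "How to spot an online scam")))  # escalated to a human
```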

My favorite part of the tour was when we entered the Moderation Workstation and were given the chance to experience being a moderator.

Let me tell you, those 5 minutes of make-believe made me realize that being a TikTok moderator is HARD.

Aside from having a whole lot of patience to watch a TikTok video over and over again, you’ll also need strong critical thinking to discern whether a piece of content is appropriate for the platform.

You need to have an eye for detail to catch any subtle violations of guidelines and strong emotional resilience to handle potentially disturbing content. 

It’s a role that demands both technical skills and a high level of mental and emotional fortitude.

Definitely not for the weak of heart.

Aside from content moderation, we were also shown the process and the algorithm TikTok uses to customize content for every user.
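
For a rough feel of the general idea, here’s a toy illustration of interest-based feed ranking. The signals, weights, and helper names below are my own assumptions for the sketch, not the actual algorithm we were shown.

```python
# Toy illustration of interest-based feed ranking (my own assumptions,
# not TikTok's actual algorithm).

def score_video(video_tags: set[str], user_interests: dict[str, float]) -> float:
    """Score a candidate video by how well its tags match the user's interests."""
    return sum(user_interests.get(tag, 0.0) for tag in video_tags)


def build_feed(candidates: list[tuple[str, set[str]]],
               user_interests: dict[str, float],
               top_n: int = 3) -> list[str]:
    """Rank candidate videos for one user and return the top picks."""
    ranked = sorted(candidates,
                    key=lambda item: score_video(item[1], user_interests),
                    reverse=True)
    return [video_id for video_id, _ in ranked[:top_n]]


# Example: a user who watches a lot of cooking and tech content.
user_interests = {"cooking": 0.9, "tech": 0.7, "sports": 0.1}
candidates = [
    ("v1", {"cooking", "baking"}),
    ("v2", {"sports"}),
    ("v3", {"tech", "gadgets"}),
    ("v4", {"travel"}),
]
print(build_feed(candidates, user_interests))  # ['v1', 'v3', 'v2']
```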

The Talk – All About Safety for 2024

Once we were done with the office tour, we were ushered into a conference room for the second part of the agenda: a talk about the ways TikTok is making the platform safer for users in 2024.

Let me share with you some of the takeaways.

#1 – TikTok has comprehensive Community Guidelines

TikTok’s guidelines are crafted with input from experts and diverse communities. They cover everything from youth safety and sensitive themes to privacy and security for all users. It’s a holistic approach designed to make the platform welcoming for everyone.

#2 – TikTok has a robust system for Content Moderation

They remove content that breaks the rules, age-restrict mature content, and maintain clean and appropriate For You Feed (FYF) standards. With a 99.7% success rate in the Philippines, they’re very serious about keeping things safe.

#3 – TikTok is strict about age restrictions

Only users aged 16 and over can send or receive direct messages. Plus, they have strong measures in place to ensure underage accounts are removed swiftly.

#4 – TikTok is making it easier for parents to keep their children safe online

The Guardian’s Guide in the Safety Center offers all the info parents need. Plus, with the Family Pairing feature, they can manage their kid’s content and privacy settings.

#5 – TikTok has a new “For You” feed Refresh feature

If your “For You” feed starts feeling stale, you can now hit refresh! This feature lets you reset your recommendations to get a fresh batch of content, just like when you first joined TikTok. It’s like a clean slate for your content preferences.

#6 – TikTok is cracking down on harmful and misleading AI-generated content

They’re banning AI-generated fake news, AI-generated content involving minors, and much more. They’re also using advanced technology to detect and automatically label AI-generated content.

#7 – TikTok has zero tolerance for child sexual abuse material

They remove it immediately, terminate accounts, and report cases to authorities. They’re also part of global alliances like WeProtect to combat this issue.

#8 – TikTok works with NGOs and safety experts around the world to enhance safety on the platform

They’re involved in campaigns for mental health, online safety for women, and promoting digital well-being. These collaborations help ensure that TikTok remains a safe space for all its users.

Final TikTalk

(from left) Kit Cruz, TikTok; KhanJi Weerachaising, Policy Outreach, TikTok; Tyrone Piad, Philippine Daily Inquirer; Bernadette Santos, ABS-CBN News; Lia Espina, The Philippine Star; Iane Macasieb, Manila Bulletin; Bea Bautista, Communications Lead, Philippines and Malaysia, TikTok; Roy Teo, TikTok; Candera Chan, Public Policy, TikTok; Nathaniel Ong, TikTok

Leaving the TikTok Singapore HQ, I felt like I had just been given a sneak peek into the future of digital safety, especially with the rise of AI-generated content. It was pretty interesting to see a tech company taking such proactive steps to create a secure and welcoming space for all its users.

Ultimately though, I still believe that as users, it is our own responsibility to keep ourselves safe from malicious content.

While TikTok provides robust tools and systems to protect its community, staying informed and being vigilant are key. We should definitely be taking advantage of safety features like Family Pairing to improve online safety.

Remember, a safer digital experience starts with us.
