Policy Analyst, Search - Trust & Safety

TikTok

Dublin, Ireland · On-site · Full-time · 2mo ago

Benefits

Learning

Equity

Flexible Hours

Healthcare

Required Skills

Python

Node.js

PostgreSQL

Responsibilities

TikTok is the leading destination for short-form video. Our mission is to inspire creativity and bring joy.

Our Trust & Safety team's commitment is to keep our online community safe. We have invested heavily in human and machine-based moderation to remove harmful content quickly and often before it reaches our general community.

As Policy Analyst, Search, in our Trust & Safety team, you will focus on improving content moderation accuracy, conducting deep dives into policy operability and enforcement, supporting content labeling initiatives, analyzing moderation quality metrics, and reviewing user feedback to enhance policy effectiveness. You will collaborate cross-functionally with policy, operations, policy training, and T&S product teams to refine enforcement strategies and ensure a safer platform for users.

This role may involve limited exposure to harmful or distressing content, which includes but is not limited to: bullying; hate speech; child abuse; sexual assault; torture; bestiality; self-harm; suicide; or murder.

What will I be doing?

Moderation Accuracy & Policy Enforcement:

  • Analyze and assess content moderation decisions to identify accuracy gaps and provide recommendations for improvement
  • Support the development and refinement of enforcement guidelines to ensure clear, consistent, and fair application of policies
  • Work closely with operations and data teams to monitor and improve human and AI-based moderation quality
  • Collaborate with training and development teams to provide vetted cases for use in instructional materials
  • Cultivate a deep understanding of Search features and enforcement requirements to support the development, launch, and training of effective UGC content policies, guidance, and safety strategies

Content Review, Metrics Analysis & Quality Assurance:

  • Work closely with moderation teams to provide policy clarifications and training support
  • Analyze key moderation quality metrics
  • Assess user feedback trends related to moderation decisions, identifying areas where enforcement may need adjustment or clarification
  • Help develop reports and insights on policy effectiveness, enforcement trends, and areas for improvement

Content Labeling & Categorization:

  • Support the development and implementation of content labeling strategies to improve content classification and policy enforcement
  • Work with data teams to analyze content trends and ensure labels align with evolving policies and enforcement needs
  • Contribute to training datasets for AI-driven moderation tools and ensure policy intent is accurately reflected

Escalation Handling:

  • Lead immediate mitigation measures, such as search query takedowns, feature restrictions, or content takedowns, following policies and escalation playbooks
  • Ensure adherence to content policies and regulatory requirements while balancing enforcement effectiveness
  • Partner with cross-functional teams (e.g., Policy, Engineering, Legal) by providing data and context as directed
  • Assess and report on the effectiveness of containment actions, iterating on strategies for continuous improvement

Qualifications

Minimum Qualifications

  • 2+ years of work experience in Trust & Safety, product policy, or other product safety roles in a new media, technology, or entertainment company
  • Strong analytical skills with the ability to conduct deep dives into policy enforcement data, moderation quality metrics, and user feedback trends
  • Team player and ability to collaborate with different teams
  • Excellent communication skills with the ability to clearly articulate policy nuances to diverse stakeholders
  • Excellent time management and problem-solving skills
  • Ability to work in a high-tempo environment and adapt and respond to the day-to-day challenges of the role
  • High flexibility regarding working hours and days

Preferred Qualifications

  • 5+ years of work experience in Trust & Safety, product policy, or other product safety roles in a new media, technology, or entertainment company
  • Proven ability to develop sound research methodologies and collect, synthesize, analyze, and interpret data
  • Experience working in a start-up, or forming new teams in established companies
  • Experience handling content-related escalations and assessing complex moderation decisions
  • Experience with processes such as appeals management, final arbitration, and data analysis
  • Experience with escalation or crisis mitigation


About TikTok

TikTok · Late Stage

A short-form video entertainment app and social network platform

Employees: 10,001+
Headquarters: Los Angeles
Valuation: $220B

Reviews

3.8 (10 reviews)

Work-life balance: 2.8
Compensation: 3.7
Company culture: 4.1
Career growth: 3.2
Management: 2.9
68% would recommend to a friend

Pros

Great team dynamics and support

Innovative and creative culture

Good learning opportunities

Cons

Work-life balance challenges

Fast-paced and stressful environment

High expectations and tight deadlines

Salary Range

49 data points

Mid/L4 · 2D 3D Artist (1 report)

Total compensation: $195,000
Base salary: $150,000
Stock: -
Bonus: -

Interview Experience

2 interviews

Difficulty: 4.0 / 5
Duration: 21-35 weeks
Experience: 0% positive, 0% neutral, 100% negative

Interview Process

1. Application Review
2. Recruiter Screen
3. Online Assessment
4. Behavioral Interview
5. Final Round
6. Offer

Frequently Asked Interview Topics

Coding/Algorithm

Behavioral/STAR

Technical Knowledge

Culture Fit