Job Posting
Benefits & Perks
• Learning
• Equity
• Flexible Hours
• Healthcare
Required Skills
Python
Node.js
PostgreSQL
Responsibilities
TikTok is the leading destination for short-form video. Our mission is to inspire creativity and bring joy.
Our Trust & Safety team is committed to keeping our online community safe. We have invested heavily in human and machine-based moderation to remove harmful content quickly, often before it reaches our general community.
As Policy Analyst - Search on our Trust & Safety team, you will focus on improving content moderation accuracy, conducting deep dives into policy operability and enforcement, supporting content labeling initiatives, analyzing moderation quality metrics, and reviewing user feedback to enhance policy effectiveness. You will collaborate cross-functionally with policy, operations, policy training, and T&S product teams to refine enforcement strategies and ensure a safer platform for users.
This role may involve limited exposure to harmful or distressing content, which includes but is not limited to: bullying; hate speech; child abuse; sexual assault; torture; bestiality; self-harm; suicide; or murder.
What will I be doing?
Moderation Accuracy & Policy Enforcement:
- Analyze and assess content moderation decisions to identify accuracy gaps and provide recommendations for improvement
- Support the development and refinement of enforcement guidelines to ensure clear, consistent, and fair application of policies
- Work closely with operations and data teams to monitor and improve human and AI-based moderation quality
- Collaborate with training and development teams to provide vetted cases for use in instructional materials
- Cultivate a deep understanding of Search features and enforcement requirements to support the development, launch, and training of effective UGC content policies, guidance, and safety strategies
Content Review, Metrics Analysis & Quality Assurance:
- Work closely with moderation teams to provide policy clarifications and training support
- Analyze key moderation quality metrics
- Assess user feedback trends related to moderation decisions, identifying areas where enforcement may need adjustment or clarification
- Help develop reports and insights on policy effectiveness, enforcement trends, and areas for improvement
Content Labeling & Categorization:
- Support the development and implementation of content labeling strategies to improve content classification and policy enforcement
- Work with data teams to analyze content trends and ensure labels align with evolving policies and enforcement needs
- Contribute to training datasets for AI-driven moderation tools and ensure policy intent is accurately reflected
Escalation Handling:
- Lead immediate mitigation measures, such as search query takedowns, feature restrictions, or content takedowns, following policies and escalation playbooks
- Ensure adherence to content policies and regulatory requirements while balancing enforcement effectiveness
- Partner with cross-functional teams (e.g., Policy, Engineering, Legal) by providing data and context as directed
- Assess and report on the effectiveness of containment actions, iterating on strategies for continuous improvement
Qualifications
Minimum Qualifications
- 2+ years of work experience in Trust & Safety, product policy, or other product safety roles in a new media, technology, or entertainment company
- Strong analytical skills with the ability to conduct deep dives into policy enforcement data, moderation quality metrics, and user feedback trends
- Team player with the ability to collaborate across different teams
- Excellent communication skills with the ability to clearly articulate policy nuances to diverse stakeholders
- Excellent time-management and problem-solving skills
- Ability to work in a high-tempo environment and to adapt and respond to the day-to-day challenges of the role
- High flexibility regarding working hours and days
Preferred Qualifications
- 5+ years of work experience in Trust & Safety, product policy, or other product safety roles in a new media, technology, or entertainment company
- Proven ability to develop sound research methodologies and collect, synthesize, analyze, and interpret data
- Experience working in a start-up, or forming new teams in established companies
- Experience handling content-related escalations and assessing complex moderation decisions
- Experience with processes such as appeals management, final arbitration, and data analysis
- Experience with escalation or crisis mitigation
About TikTok

TikTok
Late Stage · A short-form video entertainment app and social network platform
Employees: 10,001+
Headquarters: Los Angeles
Valuation: $220B
Reviews
Overall: 3.8 (10 reviews)
Work-life balance: 2.8
Compensation: 3.7
Culture: 4.1
Career: 3.2
Management: 2.9
68% would recommend to a friend
Pros
Great team dynamics and support
Innovative and creative culture
Good learning opportunities
Cons
Work-life balance challenges
Fast-paced and stressful environment
High expectations and tight deadlines
Salary Information
49 data points
Mid/L4 · 2D 3D Artist (1 report)
Total compensation: $195,000
Base salary: $150,000
Stock: -
Bonus: -
Interview Experience
2 interviews
Difficulty: 4.0 / 5
Duration: 21-35 weeks
Experience: Positive 0% · Neutral 0% · Negative 100%
Interview Process
1. Application Review
2. Recruiter Screen
3. Online Assessment
4. Behavioral Interview
5. Final Round
6. Offer
Frequently Asked Question Types
Coding/Algorithm
Behavioral/STAR
Technical Knowledge
Culture Fit
News & Buzz
Hollywood wants to be TikTok. TikTok wants to be TV - Axios · 3d ago
Hundreds of Fake Pro-Trump Avatars Emerge on Social Media - The New York Times · 3d ago
Firefighters warn parents about dangerous TikTok trends sending kids to hospitals - WFSB · 4d ago
QVC prepares for bankruptcy protection in the era of influencers, TikTok and Temu - Chicago Tribune · 4d ago