
Multinational biopharmaceutical company.
Principal Data Engineer – Data & Analytics (Global Supply Chain)
Required Skills
- Databricks
- Apache Spark
- PySpark
- SparkSQL
- Delta Lake
- AWS
- S3
- EMR
- Lambda
- Glue
- Athena
- Redshift
- EKS
- SQL
- Python
- Tableau
- Power BI
- Git
- Jenkins
- Infrastructure as Code
Career Category: Information Systems
Job Description
Role Description:
This role acts as the technical architect and hands-on lead for Data Engineering practices across the Smart Supply Chain initiative within Amgen. It is also responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions.
The role involves working with large datasets, developing reports, supporting and executing data governance initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes, and will architect, build, and optimize enterprise-grade data pipelines using Databricks and AWS-native services.
Roles & Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member assisting in the design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS, Databricks preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that can improve ETL platform performance
- Participate in sprint planning meetings and provide estimates for technical implementation
- Continuously monitor data governance activities and report on compliance, data quality issues, and the effectiveness of governance initiatives
Basic Qualifications and Experience:
- 12-17 years of experience in Computer Science, IT, or a related field
Functional Skills:
Must-Have Skills:
- Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, SparkSQL, Delta Lake) and AWS services (S3, EMR, Lambda, Glue, UC, Athena, Redshift, EKS); workflow orchestration; performance tuning of big data processing; and the ability to work with large, complex datasets
- Hands-on experience in orchestrating large-scale data pipelines, performance tuning, lineage tracking, and observability frameworks
- Proficiency in data analysis tools (e.g., SQL, Python) and experience with data visualization tools (Tableau, Power BI)
- Excellent problem-solving skills
- Experience with DevOps practices, version control (Git), CI/CD (Jenkins), and Infrastructure as Code
Good-to-Have Skills:
- Experience with ETL tools such as Apache Spark and with Python packages for data processing and machine learning model development
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Working knowledge of unstructured data processing, vector stores, and AI enablement for downstream analytics
- Strong understanding of SAP data models (ECC tables) and Supply Chain data domains
- Experience working in Agile/SAFe environments with distributed global teams
Professional Certifications:
- Certified Data Engineer (Databricks or cloud-environment certifications preferred)
- Machine Learning certification (preferred)
Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated ability to function effectively in a team setting
- Demonstrated presentation skills
About Amgen
Amgen is a public biotechnology company that develops and manufactures human therapeutics for various illnesses and diseases. The company has 10,001+ employees, is headquartered in Thousand Oaks, and has a valuation of $138B.