
Vice President, Data Quality Lead Engineer at BlackRock
About this role
BlackRock is seeking a Data Quality Framework Lead to drive the strategy, architecture, and delivery of a core capability within the Enterprise Data Platform in Aladdin Data. This role combines platform engineering, data governance, and stakeholder leadership to build a scalable, trusted, and transparent framework for data quality across the firm.
The platform ensures that the data BlackRock relies on is fit for purpose across key dimensions, including completeness, accuracy, timeliness, consistency, validity, and integrity. It provides clear, actionable quality signals to upstream producers, downstream systems, and end users so data can be used confidently for decisions at scale.
The framework uses custom Python operators, Great Expectations, and Airflow-orchestrated pipelines to perform quality checks as data moves through the ecosystem. The ideal candidate brings strong technical depth, sound architectural judgment, hands-on execution, and the ability to align stakeholders around a common platform vision.
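To make the pattern concrete: the posting names Great Expectations and custom Python operators, but as a minimal, stdlib-only sketch (function, column, and record names here are illustrative assumptions, not BlackRock's actual framework), a batch check across two of the listed quality dimensions might look like:

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    """Outcome of a single quality check, as a pipeline task might report it."""
    name: str
    passed: bool
    detail: str

def check_completeness(rows, column):
    """Completeness: no null or empty values in `column`."""
    missing = sum(1 for r in rows if r.get(column) in (None, ""))
    return CheckResult(f"completeness:{column}", missing == 0,
                       f"{missing} missing of {len(rows)} rows")

def check_validity(rows, column, predicate):
    """Validity: every present value satisfies `predicate`."""
    bad = [r[column] for r in rows
           if r.get(column) not in (None, "") and not predicate(r[column])]
    return CheckResult(f"validity:{column}", not bad, f"{len(bad)} invalid values")

# Hypothetical batch of security records flowing through a pipeline stage.
batch = [{"isin": "US0378331005", "price": 189.5},
         {"isin": "", "price": -1.0}]
results = [check_completeness(batch, "isin"),
           check_validity(batch, "price", lambda p: p > 0)]
# Both checks fail on the second record, producing actionable signals
# instead of letting the bad row flow downstream silently.
```

In the framework described here, checks like these would run as Airflow tasks at control points in a pipeline and feed alerting and scorecards rather than printing results locally.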
What You Will Do
- Lead the evolution of BlackRock’s data quality framework as a strategic platform capability for validating and monitoring data across the Aladdin Data ecosystem.
- Define the technical direction for a metadata-driven framework that supports reusable quality rules, policy enforcement, exception handling, quality scoring, and domain-level service standards.
- Design and deliver controls that run within Airflow-orchestrated pipelines, enabling early detection of issues before they affect downstream systems or clients.
- Build a strong operating model for observability, transparency, and remediation so producers and consumers can identify and resolve issues quickly.
- Partner with engineering, product, governance, and business stakeholders to drive adoption, prioritization, and long-term roadmap execution.
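The metadata-driven framework described above can be sketched as rules declared as data and interpreted by a shared engine, so adding a rule is a catalog entry rather than a code deployment. This is a hedged illustration with hypothetical rule IDs, columns, and bounds, not the firm's actual rule catalog:

```python
# Rules live as metadata (ownership, versioning, and parameters belong here),
# which lets them be reused, audited, and updated without redeploying the engine.
RULES = [
    {"id": "DQ-001", "version": 2, "type": "not_null", "column": "trade_id"},
    {"id": "DQ-002", "version": 1, "type": "in_range", "column": "quantity",
     "params": {"min": 0, "max": 1_000_000}},
]

def evaluate(rule, row):
    """Interpret one metadata rule against one record."""
    value = row.get(rule["column"])
    if rule["type"] == "not_null":
        return value is not None
    if rule["type"] == "in_range":
        p = rule["params"]
        return value is not None and p["min"] <= value <= p["max"]
    raise ValueError(f"unknown rule type: {rule['type']}")

def quality_score(rows, rules):
    """Quality scoring: the fraction of (row, rule) evaluations that pass."""
    outcomes = [evaluate(rule, row) for row in rows for rule in rules]
    return sum(outcomes) / len(outcomes) if outcomes else 1.0
```

A real engine would add exception workflows, execution history, and per-domain service standards on top of this dispatch core.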
Key Responsibilities
- Own the target-state architecture for the Data Quality Framework, including rule execution patterns, validation layers, quality gates, exception workflows, and extensibility standards.
- Build and scale platform services, libraries, and APIs for rule authoring, execution, scoring, auditability, and quality SLA and SLO reporting across datasets and domains.
- Develop controls across core quality dimensions, including completeness, accuracy, timeliness, consistency, validity, uniqueness, and referential integrity.
- Design and implement profiling, anomaly detection, and drift detection capabilities covering schema changes, null patterns, distribution shifts, outliers, volume trends, and freshness checks.
- Implement reconciliation and financial control patterns such as source-to-target checks, row-count balancing, aggregate validation, hashes, and critical total checks.
- Drive adoption of Great Expectations and custom Python operators to standardize how assertions are defined, executed, versioned, and reused across pipelines.
- Integrate the framework into Airflow-based data pipelines so checks run at the right control points with meaningful alerting and triage.
- Establish metadata-driven rule management, including ownership, lineage, versioning, parameterization, execution history, and audit-ready evidence.
- Optimize framework performance across high-volume environments, particularly Snowflake and MSSQL, balancing control rigor with runtime efficiency.
- Create clear visibility for downstream platforms, internal users, and clients through dashboards, scorecards, status indicators, and actionable exception reporting.
- Mentor engineers and act as a senior technical leader who can make pragmatic architecture decisions while staying hands-on when needed.
- Influence enterprise standards for trusted data consumption in partnership with data governance, platform engineering, and product teams.
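The reconciliation and financial control patterns listed above (row-count balancing, aggregate validation, hashing) can each be reduced to a small stdlib-only function. The sketch below uses hypothetical field names and is illustrative only, not the actual control implementation:

```python
import hashlib

def row_counts_balance(source_rows, target_rows):
    """Row-count balancing: source and target carry the same number of rows."""
    return len(source_rows) == len(target_rows)

def aggregates_match(source_rows, target_rows, column, tolerance=1e-9):
    """Aggregate validation: column totals agree within a tolerance."""
    diff = abs(sum(r[column] for r in source_rows)
               - sum(r[column] for r in target_rows))
    return diff <= tolerance

def content_hash(rows):
    """Order-independent hash of a dataset for source-to-target comparison."""
    row_digests = sorted(
        hashlib.sha256(repr(sorted(r.items())).encode()).hexdigest()
        for r in rows
    )
    return hashlib.sha256("".join(row_digests).encode()).hexdigest()
```

Hashing each row and sorting the digests makes the comparison insensitive to row order, which matters when source and target systems return data in different sequences.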
Required Qualifications
- 10+ years of experience in backend, data platform, or data engineering roles, with a strong record of hands-on technical delivery.
- Deep expertise in Python and experience building reusable engineering frameworks, services, or platform capabilities.
- Strong experience with workflow orchestration and pipeline integration, ideally with Airflow in complex enterprise environments.
- Proven experience designing and implementing data quality controls across batch and/or near-real-time data pipelines.
- Strong understanding of enterprise data quality operating models, including SLAs and SLOs, exception handling, issue triage, and remediation workflows.
- Hands-on experience with Great Expectations or similar data quality frameworks, with the ability to extend them through custom engineering patterns.
- Strong proficiency with Snowflake and/or MSSQL, including query tuning, scalable control execution, and performance optimization.
- Experience with metadata-driven platform design, data modeling, lineage, traceability, and audit-ready control frameworks.
- Demonstrated ability to make architecture decisions, influence platform direction, and communicate effectively with senior technical and business stakeholders.
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related discipline, or equivalent practical experience.
Preferred Qualifications
- Experience building enterprise data quality, observability, or control frameworks in financial services or similarly regulated environments.
- Exposure to FastAPI or similar API frameworks for building platform services and developer-facing capabilities.
- Experience with event-driven or streaming architectures such as Kafka for near-real-time quality detection patterns.
- Familiarity with Docker, Kubernetes, CI/CD pipelines, and modern software delivery practices.
- Understanding of domain-driven design, data product ownership, and platform adoption across large organizations.
- Experience leading cross-functional initiatives that require both deep technical execution and strong stakeholder management.
Technical Skills
Languages & Frameworks:
Python, custom Python operators, FastAPI, Great Expectations, dbt
Orchestration & Pipelines:
Airflow, ETL / ELT pipelines, batch and near-real-time validation patterns
Data Platforms:
Snowflake, MSSQL, large-scale relational and analytical data stores
Quality Capabilities:
Profiling, validation, reconciliation, anomaly detection, quality gates, scorecards, SLAs / SLOs
Platform Design:
Metadata-driven frameworks, rule catalogs, versioning, audit trails, observability, lineage integration
Engineering Practices:
API design, distributed systems, performance tuning, Docker, Kubernetes, CI/CD
Why This Role Matters
- This role is central to strengthening trust in the data that powers Aladdin and BlackRock’s broader data ecosystem.
- You will shape a platform that improves decision quality, reduces operational risk, and increases transparency for internal consumers and clients.
- You will define standards, influence architecture, and build a durable capability that scales with the firm’s growing data needs.
Our benefits
To help you stay energized, engaged and inspired, we offer a wide range of benefits including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.
Our hybrid work model
BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.
About BlackRock
At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s education, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress.
This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.
For additional information on BlackRock, please visit Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock
BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
Required skills
Data engineering
Data quality
Platform architecture
Python
Data governance
Stakeholder management