
Databricks Architect

Wipro

Bangalore, India · On-site · Full-time · 2w ago

Required Skills

Databricks · Apache Spark · PySpark · Scala · Delta Lake · SQL · Apache Kafka · DevOps · CI/CD

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Job Description:

Role Purpose:

The purpose of this role is to create exceptional architectural solution designs, provide thought leadership, and enable delivery teams to achieve outstanding client engagement and satisfaction.

Key Responsibilities:

  • Design, develop, and optimize ETL/ELT pipelines in Databricks.
  • Implement real-time and batch data processing solutions using Apache Spark and Delta Lake.
  • Develop PySpark/Scala-based data transformation scripts for large-scale data processing.
  • Ensure data quality, performance tuning, and cost optimization within Databricks.
  • Implement CI/CD pipelines for Databricks workflows using Terraform, GitHub Actions, or Azure DevOps.
  • Monitor, troubleshoot, and optimize Databricks clusters, jobs, and queries.
  • Collaborate with Data Architects, Business Analysts, and DevOps teams to align solutions with business needs.

Required Skills:

  • Strong expertise in Databricks (AWS) and Apache Spark.
  • Proficiency in PySpark for data engineering workflows.
  • Experience with Delta Lake, Unity Catalog, and Databricks SQL.
  • Hands-on experience with Kafka, APIs, and streaming data processing.
  • Proficiency in SQL for querying and performance tuning.
  • Experience in DevOps and CI/CD pipelines for Databricks.
  • Good understanding of Data Governance, Security, and Access Control in Databricks.

Good to Have:

  • Experience with AWS Glue, Azure Data Factory, or Snowflake.
  • Familiarity with Terraform, Databricks CLI, and automation frameworks.
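The pipeline work described above centers on incremental loads into Delta Lake, where updated or late-arriving records are typically applied with MERGE INTO (upsert) semantics. As a rough plain-Python illustration of that logic only (the function name `merge_upsert` and the sample records are hypothetical, and this is not the Delta or Spark API):

```python
# Plain-Python sketch of the upsert semantics behind Delta Lake's MERGE INTO:
# rows in `updates` overwrite matching rows in `target` by key; unmatched
# rows are inserted. Illustrative only -- not the Delta/Spark API.

def merge_upsert(target, updates, key):
    """Return target rows with updates applied by `key`
    (matched -> update, not matched -> insert)."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = row  # update if key exists, otherwise insert
    return sorted(merged.values(), key=lambda r: r[key])


if __name__ == "__main__":
    target = [
        {"id": 1, "amount": 100},
        {"id": 2, "amount": 200},
    ]
    updates = [
        {"id": 2, "amount": 250},  # matched -> updated
        {"id": 3, "amount": 300},  # not matched -> inserted
    ]
    print(merge_upsert(target, updates, key="id"))
```

In Databricks SQL the same operation would be a `MERGE INTO target USING updates ON target.id = updates.id` statement; the point here is only the matched/not-matched branching that such pipelines rely on for idempotent re-runs.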


Mandatory Skills: Databricks, Data Engineering

Experience: 8-10 Years
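The posting also calls for CI/CD pipelines for Databricks workflows using GitHub Actions. One common approach is deploying a Databricks Asset Bundle from a workflow; the sketch below is a hypothetical config, not part of the posting — the workflow name, secret names, target name, and bundle layout are all assumptions:

```yaml
# Hypothetical GitHub Actions workflow (sketch, not from the posting).
# Assumes the repo contains a Databricks Asset Bundle (databricks.yml) and
# that DATABRICKS_HOST / DATABRICKS_TOKEN exist as repository secrets.
name: deploy-databricks-workflows

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    env:
      DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
      DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}
    steps:
      - uses: actions/checkout@v4

      # Install the Databricks CLI via the official setup action
      - uses: databricks/setup-cli@main

      # Validate, then deploy the bundle (jobs, pipelines, cluster configs)
      - run: databricks bundle validate
      - run: databricks bundle deploy --target prod
```

Equivalent pipelines can be built with Terraform's Databricks provider or Azure DevOps, as the responsibilities above note; the bundle-based flow is just one option.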

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention.

About Wipro

Wipro

A technology services and consulting company focused on building solutions that address clients' digital transformation needs.

Employees: 10,001+
Headquarters: Bengaluru
Valuation: $8.5B

Reviews: 3.4 (4 reviews)

Work Life Balance: 1.5
Compensation: 2.0
Culture: 1.5
Career: 2.0
Management: 1.5
Recommend to a Friend: 15%

Pros

Good for resume/brand name

Broad technical experience

Exposure to multiple tech stacks

Cons

Poor management quality

Low compensation

Toxic work environment

Salary Ranges (41,395 data points)

Mid/L4 · Analyst - Business Process L2 (1 report)
Total: $128,283 / year
Base: $111,550 · Stock: - · Bonus: -

Interview Experience (5 interviews)

Difficulty: 2.0 / 5
Duration: 14-28 weeks
Offer Rate: 40%
Experience: 100% Positive, 0% Neutral, 0% Negative

Interview Process

1. Application Review
2. Online Assessment/Aptitude Test
3. Technical Interview
4. HR Interview
5. Offer

Common Questions

Coding/Algorithm

Technical Knowledge

Behavioral/STAR

Past Experience

Culture Fit