OneMain Financial Jobs


NucleusTeq Data Engineer - PySpark, Airflow, AWS in Indore, India

Key Responsibilities:

● Develop Apache Airflow DAGs and PySpark ETL pipelines for high-volume data processing.

● Write optimized SQL queries for data transformation and aggregation.

● Build data products serving Business Process, Executive KPIs, and Product Analytics.

● Implement data quality and monitoring solutions.

● Optimize pipeline performance and troubleshoot production issues.

● Collaborate with cross-functional teams.

● Monitor production pipelines (KLO, i.e. keep-the-lights-on support).
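The first responsibility above pairs Airflow orchestration with PySpark transforms. As a rough configuration sketch of what such a DAG file looks like (assumes Airflow 2.x is installed; the DAG id, schedule, and task names are invented for illustration, and the Spark submission is stubbed out):

```python
# Minimal Airflow 2.x DAG sketch -- names are illustrative, not from the posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_spark_etl():
    # In a real pipeline this would submit a PySpark job, e.g. via
    # SparkSubmitOperator or a spark-submit call; stubbed here.
    pass


with DAG(
    dag_id="example_etl",           # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="run_spark_etl",
        python_callable=run_spark_etl,
    )
```

In production, the Python stub would typically be replaced by a dedicated Spark operator so Airflow only orchestrates while the cluster does the heavy lifting.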

Required Skills:

● 10+ years of data engineering experience, including 7+ years dedicated to the big-data stack.

● Expert in Python and PySpark (DataFrame API, Spark SQL).

● Advanced SQL skills (window functions, complex queries).

● Production experience with Apache Airflow.

● Solid background in data warehousing and dimensional modelling.
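As a rough illustration of the window-function fluency the role calls for, here is a minimal sketch using Python's built-in sqlite3 as a lightweight stand-in for Spark SQL (the table and column names are invented; SQLite 3.25+ is assumed for window-function support):

```python
import sqlite3

# Toy payments table -- names are invented for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE payments (customer_id INTEGER, paid_on TEXT, amount REAL);
    INSERT INTO payments VALUES
        (1, '2024-01-05', 100.0),
        (1, '2024-02-05', 150.0),
        (2, '2024-01-10', 200.0),
        (2, '2024-02-10',  50.0);
""")

# Per-customer running total: a typical window-function transformation.
rows = conn.execute("""
    SELECT customer_id,
           paid_on,
           amount,
           SUM(amount) OVER (
               PARTITION BY customer_id
               ORDER BY paid_on
           ) AS running_total
    FROM payments
    ORDER BY customer_id, paid_on
""").fetchall()

for row in rows:
    print(row)
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` pattern carries over directly to Spark SQL and the PySpark DataFrame API (`Window.partitionBy(...).orderBy(...)`).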

Preferred Skills:

● Experience with Trino and Apache Iceberg.

● Knowledge of Tableau CRM, Tableau Cloud, and the Salesforce platform.

● AWS/cloud data services experience.

Why Join Us?

● Work in an innovative environment with a business that is shaping the future of data migration.

● Be part of a dynamic, high-growth team at NucleusTeq.

● Competitive salary and comprehensive benefits package.
