
Job Information

Eaton Corporation Specialist - Data Engineering - IT in Pune, India

What you’ll do:

At Eaton, we are modernizing how we use data and how our global teams work to accelerate digital transformation. Our Global IT organization is leveraging agile and lean six sigma practices and building a product-centric organization to adapt and respond more quickly to changing business needs. Join our Information Technology team to build data domains across various functions, leading transformative changes across our organization, global data processes, and platforms.

Within our Integrations Data Management organization, we are seeking a data engineer to build a data domain on the Snowflake platform for EHS, EU Taxonomy, and EPM 2.0.

You will partner closely with key business stakeholders to drive requirements gathering, development, testing and production support. This position is based in Eaton India (Pune). Only local candidates within 50 miles will be considered.

• Thinks about and promotes AI-powered data engineering

• Develops, builds, and tests data solutions independently to deliver business value while meeting quality and technical requirements. Builds and maintains ELT/ETL pipelines in Azure Data Factory feeding Snowflake, with robust orchestration and monitoring.

• Designs layered data models (Bronze → Silver → Gold) and optimizes SQL for performance and scalability.

• Integrates domain source data into DataMesh and creates reusable ingestion playbooks. Assembles large, complex datasets that meet functional and non-functional requirements as well as enterprise technology and data protection standards.

• Collaborates with EHS reporting teams to expose curated datasets for Power BI.

• Implements data quality checks, runbooks, and mapping documentation; participates in code reviews.

• Deploys complex enterprise solutions across a wide range of data technology patterns and platforms.

• Demonstrates and documents solutions using flowcharts, diagrams, code comments, data lineage, and technical and business metadata.

• Operates within ServiceNow workflows for requests and incidents.

• Accountable for end-to-end delivery, from source data acquisition through complex transformation and orchestration pipelines, AI/ML engineering, and front-end visualization.

• Collaborates directly with business stakeholders to deliver rapid, incremental business value

• Collaborates across complex arrays of people, technologies, and environments.

• Resolves problems that differ but are related. Issues require advanced analytical or problem-solving techniques to identify the cause. Develops solutions based on limited information and past experience, adapts existing approaches, and anticipates future issues.

Qualifications:

  • 12+ years of overall experience, including 5-9+ relevant years in data engineering with hands-on experience in AI-powered data engineering using GitHub Copilot, Azure Data Factory, Snowflake, and advanced SQL.

  • Bachelor's degree from an accredited institution; a master's or advanced degree is preferred.

Skills:

• Strong understanding of data modeling, performance tuning, and ETL best practices.

• Experience integrating external APIs/vendor data securely into enterprise platforms.

• Experience working with various AI tools, including but not limited to Snowflake Intelligence, Cortex, GitHub Copilot, Zenlytics, and Streamlit.

• Proven collaboration with analytics/reporting teams and ability to document processes clearly.

Preferred Qualifications:

• Familiarity with Snowflake OAuth integrations and Power BI connectivity patterns.

• Knowledge of ServiceNow/JIRA workflows for change and incident management.
