Job Information

Ford Motor Company Data Engineer in Naucalpan de Juarez, Mexico

As a Data Engineer for the WORKS team, you will play a critical role in the modernization and re-platforming of the WORKS Data Warehouse. This position is part of a strategic initiative to transition from legacy data structures (Oracle Database) to a modern Data Mart on Google Cloud infrastructure (BigQuery and Cloud Run). You will not just be "moving data"; you will be a key architect in reducing years of technical debt by refactoring complex table structures, eliminating redundancies, and building our first formal ETL processes to ensure a high-performance, cost-effective data environment.

Responsibilities:

  • Reverse Engineering & Logic Extraction: Analyze legacy Oracle data structures and SAP Business Objects Universes to identify and document the "hidden" ETL logic, complex joins, and calculated measures currently used for business reporting.

  • Data Mart Design: Design and implement a clean, high-performance Data Mart in BigQuery (Star/Snowflake schema) that eliminates the redundancies and "spaghetti joins" of the legacy environment.

  • Source-to-Target Mapping: Map upstream interfaces and Oracle data structures to the new BigQuery environment, ensuring data integrity and consistency during the transition.

  • ETL Development: Design, develop, and implement high-quality ETL pipelines to automate data movement and transformation to replace manual or non-existent processes.

  • Technical Debt Reduction: Actively simplify the data architecture by consolidating redundant tables and optimizing query paths for cloud-native performance.

  • Data Model Enablement: Build and maintain the core data layer (tables, views, and curated datasets) that will enable other Software Engineers to successfully transition reports from SAP Business Objects to Power BI.

  • Documentation: Create clear technical documentation of the new Data Mart schema and the logic used to transform legacy data.

  • Collaboration: Work closely with a global team of developers (located in Mexico, the US, and India) and business stakeholders to simplify the WORKS data architecture and ensure data availability and integrity for reporting and analytics.

  • Ensure on-time delivery using Agile, engaging in practices such as pair programming and automated testing for data pipelines.

  • Conduct code and design reviews to ensure adherence to data standards, patterns, and architecture principles.

  • Perform and participate in load/volume testing to ensure the new platform can handle global scale.

Required Skills and Experience:

  • Bachelor’s Degree in Computer Science, Computer Engineering or a related field.

  • English proficiency (written and verbal).

  • Data Engineering or Database Development experience (3-5 years minimum).

  • Excellent communication skills, with the ability to articulate complex technical concepts to global stakeholders.

  • Advanced proficiency in SQL and PL/SQL, with a strong ability to read and interpret complex legacy stored procedures and view logic.

  • Proven experience in Data Modeling, specifically designing Data Marts and optimizing schemas for analytical workloads.

  • Familiarity with SAP Business Objects (Universes/Web Intelligence) with the ability to navigate and extract transformation logic from the semantic layer.

  • Experience with Google Cloud Platform (GCP) and BigQuery.

  • Experience with ETL tools (such as Cloud Data Fusion, dbt, or Dataform).

  • Experience in relational database management systems (RDBMS) like Oracle or PostgreSQL.

  • Willingness to challenge the "status quo" to eliminate redundant table structures and unnecessary joins.

  • Ability to work in a dynamic environment, handling multiple assignments and prioritizing work appropriately.

  • Strong collaboration skills and ability to work across regions (US, Mexico, India).

  • Attention to detail and a strong "detective" mindset for solving data redundancy problems.

  • Experience working with Agile methodologies (Scrum, Kanban).

Skills/Experience Preferred:

  • Domain knowledge of the vehicle order-to-delivery process.

  • Specific experience with BigQuery SQL and performance tuning (partitioning/clustering).

  • Experience migrating from SAP Business Objects to Power BI (data layer focus).

  • Knowledge of Unix shell scripting.

  • Experience with GitHub or other version control tools.

  • Knowledge of Unix/AutoSys to understand legacy job scheduling.

Requisition ID : 59902
