Job Information
UPS Lead Applications Developer – GCP, BigQuery, Pub/Sub, Kafka, GKE, Java/Python/C# in Chennai, India
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill and passion. If you have the qualities, motivation, autonomy, or leadership to lead teams, there are roles ready to cultivate your skills and take you to the next level.
Job Description:
Job Summary:
We are seeking a GCP-focused Data Engineer to build scalable, high‑quality data pipelines supporting our Data Maturity initiative for Logistics/Parcel Services. The ideal candidate has strong experience in GCP data services, data modeling, data quality frameworks, and understands logistics domain data such as shipment tracking, routing, and warehouse operations.
Key Responsibilities:
Core Engineering (All Levels)
Pipeline Development: Design and develop scalable ETL/ELT pipelines using BigQuery, Pub/Sub, and Dataflow/Dataproc.
Microservices: Build and deploy APIs using Python/Java/C# to integrate enterprise and external logistics systems.
Orchestration: Orchestrate workloads via Composer (Airflow) or GKE using Docker and Kubernetes.
Data Quality: Implement validation checks, lineage tracking, and monitoring for pipeline SLAs (freshness, latency).
Modeling: Model logistics and supply chain data in BigQuery for analytics and operational insights.
DataOps: Apply CI/CD, automated testing, and versioning best practices.
Intermediate / Senior additions
System Design: Take ownership of end-to-end technical design for complex data modules.
Mentorship: Actively mentor junior engineers and conduct rigorous code reviews to ensure high engineering standards.
Best Practices: Establish and document DataOps standards and reusable patterns for the team.
Lead additions
POD Leadership: Act as the technical head of the data pod, ensuring sprint goals are met and unblocking the team.
Architecture: Define the high-level architecture and long-term technical roadmap for the logistics data platform.
Stakeholder Management: Partner with business leaders to translate complex logistics requirements into technical specifications.
Negotiation: Manage requirements scoping and prioritize backlogs by balancing technical debt with business value.
Coaching: Drive the professional growth of the entire engineering team through structured coaching and performance feedback.
Required Skills:
- Relevant experience:
Lead – minimum 7+ years of relevant experience.
Strong hands-on experience with GCP BigQuery, Pub/Sub, GCS, and Dataflow/Dataproc.
Proficiency in Python/Java/C#, RESTful APIs, and microservice development.
Experience with Kafka for event-driven ingestion.
Strong SQL skills and experience with data modeling.
Expertise in Docker/Kubernetes (GKE) and CI/CD tools (Cloud Build, GitHub Actions, or ADO).
Experience implementing Data Quality, Metadata Management, and Data Governance frameworks.
Preferred Qualifications:
Experience with Terraform and Cloud Composer (Airflow).
Experience with Azure Databricks, Delta Lake, ADLS, and Azure Data Factory.
Experience in Knowledge Graph engineering using Neo4j and/or Stardog.
Familiarity with Data Governance tools or Cataloging systems (Informatica AXON).
Logistics domain experience.
Education
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
Contract type:
Permanent (CDI)
At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.