Job Information
UPS Data Engineer - GCP, SQL, BigQuery, Dataflow, Pub/Sub, Logging, IAM in Chennai, India
Before applying for a job, select your preferred language from the options available at the top right of this page.
Explore your next opportunity at an organization that ranks among the world's 500 largest companies. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, self-direction, or leadership to guide teams, there are roles suited to your aspirations and skills, today and tomorrow.
Job Description
We're seeking a talented Data Engineer with hands-on experience in the Google Cloud data ecosystem and a proven track record. You'll be instrumental in designing, building, and maintaining our data infrastructure and pipelines, enabling critical insights and supporting data-driven initiatives across the organization.
Responsibilities
Data Pipeline Development: Design, build, and optimize robust and scalable data pipelines to ingest, transform, and load data from various sources into our data warehouse and knowledge graphs.
Cloud Data Stack Expertise: Implement and manage data solutions using Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, Spanner, and Dataproc, as well as Azure cloud services.
Knowledge Graph Engineering: Develop and maintain data models, ingest data, and create efficient queries within Neo4j and/or Stardog. Leverage your expertise to build and expand our enterprise knowledge graph.
Data Quality & Governance: Implement best practices for data quality, data validation, and data governance, ensuring data accuracy, consistency, and reliability.
Performance Optimization: Continuously monitor and optimize the performance of data pipelines and database queries, identifying and resolving bottlenecks.
Collaboration: Work closely with data scientists, analysts, and other engineering teams to understand data requirements and deliver effective data solutions.
Documentation: Create and maintain comprehensive documentation for data pipelines, data models, and knowledge graph schemas.
Required Qualifications
Education:
Bachelor’s degree in Computer Science, Engineering, or a related quantitative field.
Experience:
5+ years of professional experience as a Data Engineer or in a similar role.
Strong hands-on experience with Google Cloud Platform (GCP) data services, including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Composer (Apache Airflow).
Proficiency in Python for data manipulation and pipeline orchestration.
Experience with SQL and data warehousing concepts.
Familiarity with data modeling techniques for relational databases.
Experience with version control systems (e.g., Git).
Experience with application support.
Experience with Terraform or other Infrastructure-as-Code (IaC) tools.
Preferred Qualifications
Experience with other GCP services.
Knowledge of streaming data technologies (e.g., Kafka, Google Cloud Dataflow streaming).
Familiarity with data governance tools and principles.
Certifications in Google Cloud Platform data engineering.
Contract Type:
Permanent (CDI)
At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.