Job Information
Publicis Groupe Data Engineer Specialist in Bogota, Colombia
Overview
We are seeking a skilled Data Engineer Specialist with strong SQL abilities, familiarity with the AWS environment, and knowledge of S3 buckets, access keys, and secret keys for a global pharmaceutical company. This role involves working with large volumes of data related to media metrics, such as impressions, clicks, and spend, in order to provide accurate data to clients. The ideal candidate is self-directed, detail-oriented, and able to document their work. They should be able to answer technical questions clearly and participate actively in meetings with our global clients.
Responsibilities
Maintain ETL reports.
Ensure the quality of information for the client.
Document processes in Jira/Confluence.
Collaborate with internal and external stakeholders to resolve data-related issues.
Establish strong communication and actively participate in meetings.
Evaluate new technologies and present findings to the team.
Contribute to client status and reporting calls, including presentation of reporting as required.
Develop subject matter expertise in ETL, API development, and Business Intelligence platforms.
Clearly define project deliverables, timelines, and dependencies for junior team members, internal stakeholders, and clients.
Load and validate data into the data warehouse from various source systems.
Analyze, develop, test, review, and deploy functionality and bug fixes in ETL data pipelines.
Tune queries and diagnose and resolve performance issues, leveraging ELT and push-down optimization where required.
Build data mappings between source and target systems.
Provide support for technical issues and ensure system availability.
Work with business customers to identify and develop additional data and reporting needs.
Understand how business intelligence platforms and data technologies work, and explain technical concepts in plain terms (be technically savvy: understand both the opportunities and the limitations).
Establish and manage data integrations using a consistent taxonomy and nomenclature, and recommend data visualizations that showcase media performance in Datorama or Tableau.
Perform regular quality assurance/quality control checks on assigned client campaigns to ensure data is processed accurately.
Qualifications
Advanced English and communication skills.
Education: Bachelor’s Degree in Computer Science, Mathematics, or a related engineering field.
Experience: 4+ years in data engineering roles with a focus on large-scale data processing.
Advanced Python skills for data manipulation and pipeline development.
Strong SQL expertise, particularly with BigQuery.
Hands-on experience with Databricks or Spark for big data processing.
Experience with integration of data from diverse sources.
Proficiency with marketing data and concepts.
Strong attention to detail.
A highly responsible and organized individual, capable of managing a high volume of requests.
Cloud experience (AWS, GCP, or Azure).
Experience with APIs (consuming and development).
Nice to Have:
Familiarity with Google Cloud Platform (GCP) and Azure for cloud-based data workflows.
Knowledge of ETL tools such as Alteryx.
Media space experience.