Job Information
Publicis Groupe Data Engineer- Senior Specialist- Financial Services in United States
Company description
Tremend is the newest global software engineering hub for Publicis Sapient. For over 20 years, the company has been infusing its advanced technical expertise into complex and innovative solutions that meet today's digital transformation needs and pave the way for a better and smarter future. By joining forces with Publicis Sapient, we're accelerating our impact, bringing together talented engineers, technology, continuous improvement, innovation, and R&D. Here, you'll have the opportunity to unleash your potential, powering up advanced software solutions for some of the world's most iconic brands. Embrace your passion for technology, creativity, and continuous improvement, and join us in making a difference through engineering.
Overview
Tremend is looking for a seasoned Senior Data Engineer to join our growing Data Engineering practice. In this role, you will design and deliver enterprise-grade cloud data solutions across modern cloud platforms, while providing technical leadership to a team of engineers working on complex migration and greenfield initiatives.
You will be a key technical voice in client engagements, shaping data platform strategies, driving engineering best practices, and bridging the gap between data infrastructure and AI/ML capabilities. If you thrive at the intersection of robust data engineering, cloud architecture, and applied intelligence, this is your role.
Responsibilities:
Lead the design and delivery of enterprise cloud data platforms, with hands-on experience on at least one major cloud provider (Azure, GCP, or AWS)
Architect and optimize large-scale data pipelines and lakehouse solutions using Databricks, Fabric, and Apache Spark
Provide technical leadership and mentoring to a team of mid-level and junior data engineers — conducting code reviews, guiding architecture decisions, and ensuring engineering quality
Drive end-to-end data engineering initiatives: from requirements and design through to deployment, monitoring, and handover
Define and enforce data engineering standards including naming conventions, CI/CD practices, data quality frameworks, and documentation
Collaborate closely with data architects, ML engineers, and business stakeholders to translate complex requirements into scalable, maintainable solutions
Design and implement medallion architecture (Bronze / Silver / Gold) and Data Vault 2.0 patterns on cloud platforms
Support AI/ML integration patterns — building feature pipelines, inference data flows, and MLOps-adjacent infrastructure
Contribute to pre-sales and client scoping conversations, producing technical proposals and effort estimates for data platform engagements
Qualifications:
7+ years of hands-on data engineering experience, including at least 2 years in a tech lead or senior individual contributor role
Strong Databricks expertise, including Delta Lake, Unity Catalog, Spark optimization, and cluster management
Strong cloud experience, ideally on GCP; extensive Azure or AWS experience is also highly valued given the similarities between platforms
Advanced SQL and PySpark skills; ability to design performant transformations at scale
Experience implementing medallion architecture and/or Data Vault 2.0 methodologies in a cloud context
Solid understanding of data pipeline patterns: batch, micro-batch, and streaming (Kafka, Spark Structured Streaming, Pub/Sub, or equivalent)
Experience with CDC (Change Data Capture) patterns and tools
Experience using Terraform and deploying solutions to the cloud using Infrastructure as Code (IaC) practices
Proven hands-on experience with Snowflake and big data platforms in general
Strong communication skills — comfortable presenting architecture decisions to both technical teams and business stakeholders
Proven track record of mentoring engineers and contributing to team capability growth
Nice to Have:
Hands-on experience integrating data pipelines with AI/ML workloads — feature engineering, model serving data flows, or MLOps pipelines (MLflow, Vertex AI, Azure ML, or equivalent)
Exposure to LLM-based applications and the data infrastructure they require (vector stores, RAG pipelines, embedding pipelines)
Experience with legacy-to-cloud migration projects (IBM DataStage, Informatica, SSIS, Talend → cloud-native platforms)
Relevant cloud or Databricks certifications (Databricks Certified Data Engineer Professional, AWS/GCP/Azure Data certifications)
Knowledge of EU data regulations relevant to data platform design (GDPR, EU AI Act)
Experience working in financial services, retail, or other regulated industries
Additional information
Besides an exciting job in a tremendous team, here's what you can expect:
A fast-paced tech environment
Continuous growth & learning
Open feedback culture
Room for own initiative & ideas
Transparency about results & strategy
Recognition & reward for hard work
A flexible working schedule
Medical subscription
Meal tickets
Extra vacation days, starting from 25 days
Many other perks