Job Information

Kohl's Senior Data Engineer (Remote) in Menomonee Falls, Wisconsin

About the Role

As a Senior Software Engineer, you will collaborate closely with design, product, and engineering experts to tackle real-world challenges and deliver innovative solutions that elevate Kohl's retail offerings.

What You’ll Do

  • Lead the development of high-quality applications that are robust, observable and measurable using extreme programming (XP) practices and a user-centric approach

  • Participate in the entire application lifecycle in collaboration with designers, product managers, and other engineers on the product team

  • Leverage critical thinking, experimentation, data, and industry best practices to implement desired business outcomes

  • Facilitate group discussions and team ceremonies and develop a shared context

  • Give and receive feedback that’s empathetic, actionable and specific

  • Practice emergent architecture with sane defaults and build software that is easy to use and easy to modify

  • Establish and lead product engineering and software standards

  • Ideate a new product from a user perspective, starting with one or more problem spaces and ending with a stack-ranked list of feasible solutions to test

  • Research and stay up to date on tech market trends and practices

  • Lead technical initiatives not only on the team but also across the department

  • Additional tasks may be assigned

Addendum

SENIOR BIG DATA SOFTWARE ENGINEER

  • Develop, automate, and maintain batch and streaming ETL pipelines using Apache Airflow, Apache Spark, Python, and Scala (see the Airflow sketch after this list).

  • Build and manage cloud-based data ecosystems on GCP (BigQuery, Bigtable, Dataproc, Pub/Sub, Cloud Storage, IAM, VPC).

  • Design and optimize SQL and NoSQL data models for data lakes and warehouses (BigQuery, MongoDB, Snowflake).

  • Write complex SQL queries for advanced data transformation, aggregation, and analytics optimization within BigQuery or equivalent platforms (see the BigQuery sketch after this list).

  • Apply modern Test-Driven Development (TDD) methodologies for big data pipelines, ensuring test automation across Airflow workflows, Spark jobs, and transformation logic (see the pytest sketch after this list).

  • Apply data mesh and data-as-a-product principles to enable reusable and domain-driven datasets.

  • Implement real-time ingestion with Kafka Connect and process streaming data using Spark Streaming, Apache Flink, or similar technologies (see the streaming sketch after this list).

  • Optimize data performance, scalability, and cost efficiency across GCP components.

  • Ensure that handling of PCI and PII data complies with standards such as GDPR, PCI DSS, SOX, and CCPA.

  • Integrate GenAI tools such as OpenAI, Gemini, and Anthropic LLMs for intelligent data quality and analytics enhancement (see the LLM sketch after this list).

  • Collaborate with stakeholders, data scientists, and full-stack engineers to deliver trusted, documented, and reusable data products.
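
The Airflow sketch referenced above follows. It is a minimal illustration of an orchestrated batch pipeline, not Kohl's actual setup: it assumes a recent Airflow 2.x deployment with the apache-airflow-providers-apache-spark package installed, and the DAG id, schedule, application path, and connection id are all hypothetical.

    # Nightly batch ETL: Airflow submits a Spark transformation job.
    # DAG id, application path, and connection id are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

    with DAG(
        dag_id="nightly_sales_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        transform = SparkSubmitOperator(
            task_id="transform_sales",
            application="gs://example-bucket/jobs/transform_sales.py",  # hypothetical GCS path
            conn_id="spark_default",
            conf={"spark.executor.memory": "4g"},
        )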
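The BigQuery sketch: an aggregation query of the kind the SQL bullet describes, run through the google-cloud-bigquery client. The project, dataset, table, and column names are hypothetical.

    # Monthly revenue rollup in BigQuery; table and columns are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default GCP credentials

    query = """
        SELECT store_id,
               TIMESTAMP_TRUNC(order_ts, MONTH)  AS month,
               SUM(amount)                       AS revenue,
               COUNT(DISTINCT customer_id)       AS customers
        FROM `example-project.sales.orders`
        WHERE order_ts >= TIMESTAMP('2024-01-01')
        GROUP BY store_id, month
    """
    for row in client.query(query).result():
        print(row.store_id, row.month, row.revenue, row.customers)

In practice a query like this would live in a versioned SQL file or a dbt-style model rather than an inline string, but the shape of the transformation is the same.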
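The pytest sketch: one common way to apply TDD to pipeline code is to keep transformation logic in small pure functions that can be tested in isolation before they are wired into Spark or Airflow. The function and field names here are hypothetical.

    # A pure transformation plus its pytest test; names are hypothetical.
    def normalize_amount(record: dict) -> dict:
        """Convert a cents-denominated amount to dollars, keeping other fields."""
        out = dict(record)
        out["amount"] = out.pop("amount_cents") / 100
        return out

    def test_normalize_amount():
        raw = {"order_id": "A1", "amount_cents": 1999}
        assert normalize_amount(raw) == {"order_id": "A1", "amount": 19.99}

Because the function touches no Spark or Airflow APIs, the test runs in milliseconds under plain pytest, which is what makes test automation across whole pipelines tractable.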
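The streaming sketch: real-time ingestion could take the shape of Spark Structured Streaming reading JSON events from a Kafka topic and landing them in cloud storage. The broker address, topic, event schema, and output paths are hypothetical.

    # Read JSON order events from Kafka and land them in cloud storage.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StringType, StructField, StructType, TimestampType

    spark = SparkSession.builder.appName("orders-stream").getOrCreate()

    schema = StructType([
        StructField("order_id", StringType()),
        StructField("event_ts", TimestampType()),
    ])

    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
        .option("subscribe", "orders")                     # hypothetical topic
        .load()
        .select(from_json(col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    (
        events.writeStream.format("parquet")
        .option("path", "gs://example-bucket/streams/orders/")
        .option("checkpointLocation", "gs://example-bucket/checkpoints/orders/")
        .start()
        .awaitTermination()
    )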
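The LLM sketch: the GenAI bullet is open-ended, so this shows just one plausible shape, using the OpenAI Python SDK to ask a model whether a record looks anomalous during a data-quality check. The model choice, prompt, and record are hypothetical, and a production check would add structured output, batching, and rule-based prefiltering.

    # Ask an LLM to flag an anomalous record; prompt and model are hypothetical.
    import json

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    record = {"order_id": "A1", "amount": -19.99, "country": "US"}
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": "Does this order record look anomalous? Answer YES or NO "
                       "with one sentence of reasoning: " + json.dumps(record),
        }],
    )
    print(resp.choices[0].message.content)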

What Skills You Have

Required

  • 4+ years of experience in software development

  • Understanding of application design patterns, event-driven architecture, database schemas, and testing strategies

  • In-depth knowledge and experience with continuous integration, continuous deployment and test-driven development

Preferred

  • Bachelor's Degree or equivalent in MIS, Computer Science or related field

  • Experience with large-scale application troubleshooting and performance tuning

  • Exposure to major cloud platforms (GCP, AWS, or Azure)

  • Familiarity and experience with XP (Extreme Programming)
