
Job Information

Costco Wholesale Corporation Data Engineer in Issaquah, Washington

Costco Wholesale Corporation is seeking a Data Engineer for its Issaquah, Washington office.

Job Duties:

  • Implement big data and NoSQL solutions by developing scalable data processing platforms to drive high-value insights for the organization.
  • Identify, design, and implement internal process improvements, including automating manual processes and optimizing data delivery and orchestration.
  • Develop data pipelines to store data in defined data models or structures.
  • Develop complex SQL and Python against a variety of data sources.
  • Implement streaming data pipelines using event- or message-based architectures.
  • Manage database configuration, including installing and upgrading software and maintaining relevant documentation.
  • Work in tandem with data architects and data or BI engineers to design data pipelines and recommend ongoing optimization of data storage, data ingestion, data quality, and orchestration.
  • Design, develop, and implement ETL/ELT processes using IICS (Informatica Cloud).
  • Improve and speed up delivery of data products and services.
  • Support development of Data Dictionaries and Data Taxonomy for product solutions.

Employees may work from their preferred Costco corporate office location in Issaquah, WA; Dallas, TX; or Schaumburg, IL. Telecommuting is permitted 2 days per week.

Minimum requirements: Bachelor's or foreign equivalent degree in Computer Science, Electronic Engineering, or a related field, plus one year of experience in:

  • Data Modeling, ETL, and Data Warehousing;
  • Cloud technologies including ADLS, Azure Data Factory, Azure Databricks, Databricks Live Table, Spark, Azure Synapse, and Cosmos DB;
  • Operationalizing data pipelines with large datasets;
  • SQL;
  • Working with data sources such as Oracle database, flat files, Web API, or XML.

Alternative requirements: Three years of experience in:

  • Data Modeling, ETL, and Data Warehousing;
  • Cloud technologies including ADLS, Azure Data Factory, Azure Databricks, Databricks Live Table, Spark, Azure Synapse, and Cosmos DB;
