Job Information

Zebra Technologies Business Analyst, Senior in Brno, Czech Republic

Overview:

At Zebra, we are a community of innovators who come together to create new ways of working. United by curiosity and a culture of caring, we develop smart solutions that anticipate our customers' and partners' needs and solve their challenges.

Being part of Zebra Nation means you are seen, heard, valued, and respected. Drawing from our unique perspectives, we collaborate to deliver on our purpose. Here you are part of a team pushing boundaries today to redefine the work of tomorrow for organizations, their employees, and those they serve.

You’ll have opportunities to learn and lead in a forward-thinking environment, defining your path to a fulfilling career while channeling your skills toward causes you care about—locally and globally.

Come make an impact every day at Zebra.

What We're Looking For:

We are seeking a highly skilled and motivated Senior Business Analyst to join our GSCR Data Strategy and Management team. The ideal candidate will be a highly technical and analytical professional who can independently navigate our data ecosystem, directly translating business requirements into operational dashboards, structural data quality improvements, and tactical data advisory. In this hands-on role, you will perform deep-dive data analysis, develop custom data workflows, and support the continuous enhancement of our data architecture. This position requires strong proficiency in SQL and Python, with proven experience using modern data platforms like Databricks and Google BigQuery.

Responsibilities

  • End-to-End Data Execution: Collaborate directly with business stakeholders to understand their needs and independently build the technical solutions to meet those requirements, from data extraction to final visualization.

  • Data Analysis & Insights: Perform complex data analysis using SQL and Python to extract, clean, and analyze data from our Data Lake and Warehouse environments within Databricks and Google BigQuery.

  • Analytics & Visualization: Develop and maintain insightful operational dashboards, ad-hoc analysis reports, and key metrics to support data-driven decision-making across the GSCR organization (utilizing tools like Power BI).

  • Data Quality & Master Data Management (MDM): Lead data cleanup campaigns, perform "5 Whys" root cause investigations for data anomalies, and assist in standardizing core business entities (e.g., survivorship logic and "Golden Record" creation).

  • Technical Collaboration & Architecture: Work closely with team leadership and technical peers to optimize data pipelines, develop certified views (e.g., "Gold" layer creation), and enhance the overall data flow and lifecycle management.

  • Metadata & Documentation: Contribute to the maintenance of the data catalog (e.g., Alation) and business glossary. Formally map data workflows and manage Standard Operating Procedures (SOPs).

  • Data Advisory: Provide tactical, Subject Matter Expert (SME) support to business users, helping them navigate the data landscape, discover attributes, and clarify definitions.

Qualifications

  • Bachelor’s degree in Business, Computer Science, Statistics, or a related field.

  • 5+ years of experience in a Data Analyst, Business Analyst, or a similar highly analytical and technical role.

  • Strong proficiency in writing, executing, and optimizing complex SQL queries.

  • Hands-on experience with Python for data analysis and automation, including libraries such as Pandas.

  • Demonstrated experience working with large-scale data platforms, specifically Databricks and Google BigQuery.

  • Excellent analytical, problem-solving, and critical thinking abilities.

  • Strong communication and interpersonal skills, with the ability to effectively translate complex data concepts to non-technical audiences.

Preferred Skills

  • Experience with data visualization and automation tools (e.g., Power BI, Power Automate).

  • Knowledge of data warehousing concepts, ETL/ELT processes, and data modeling.

  • Familiarity with cloud environments (Google Cloud Platform, Azure).

  • Experience in data governance, master data management (MDM), and metadata management tools (like Alation).

Day in the Life

Morning: Operations, Triage, and Advisory

  • 9:00 AM – System Check & Pipeline Monitoring: Start the day by checking the automated daily job schedules and data pipelines running in Databricks and BigQuery. Ensure last night’s data loads finished successfully and that the data lake is up to date.

  • 9:30 AM – Data Quality & Triage: Review any automated data quality alerts. Spot an anomaly in a recent dataset and kick off a quick "5 Whys" root cause investigation to figure out why a specific field is showing up as null.

  • 10:15 AM – Data Advisory (Lane 1 Support): Check emails and Teams messages. A business stakeholder is asking, "Where can I find the most accurate supplier lead-time attribute?" Act as the SME, pointing them to the correct certified view and clarifying the business definition.

Mid-Day: Deep Technical Execution

  • 10:45 AM – Hands-On Development (SQL/Python): Transition to deep-focus work. Open Databricks/BigQuery to write and optimize complex SQL queries. The goal today is to take raw, messy data from a new source, clean it using Python (Pandas), and transform it into a certified "Gold" layer view that the business can actually use.
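To illustrate the kind of cleanup step described here, the sketch below shows a minimal Pandas pass over a hypothetical raw supplier extract (the column names and values are invented for illustration, not taken from Zebra's data):

```python
import pandas as pd

# Hypothetical raw supplier data, as it might arrive from a new source.
raw = pd.DataFrame({
    "supplier_id": [" S001 ", "S002", "S002", None],
    "lead_time_days": ["14", "7", "7", "21"],
})

# Typical cleanup before publishing a certified "Gold" view:
# trim whitespace, drop rows missing the key, deduplicate, cast types.
gold = (
    raw.assign(supplier_id=raw["supplier_id"].str.strip())
       .dropna(subset=["supplier_id"])
       .drop_duplicates()
       .astype({"lead_time_days": int})
       .reset_index(drop=True)
)

print(gold)
```

In practice this logic would run inside Databricks or BigQuery against the real source tables; the sketch only shows the shape of the transformation.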

  • 12:00 PM – Team Sync: Join a quick sync with the team's manager and other technical peers to discuss a Master Data Management (MDM) challenge. Brainstorm the survivorship logic needed to create a "Golden Record" for a specific set of messy supplier records.
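One common survivorship pattern is to rank duplicate records by source reliability and recency, then keep the best-ranked record per entity. A toy sketch, with invented fields and an assumed "prefer ERP, then most recent" rule:

```python
import pandas as pd

# Hypothetical duplicate supplier records from two source systems.
records = pd.DataFrame({
    "supplier_id": ["S001", "S001", "S002"],
    "source":      ["ERP", "CRM", "ERP"],
    "updated":     pd.to_datetime(["2024-05-01", "2024-06-15", "2024-03-10"]),
    "address":     ["1 Main St", "1 Main Street", "9 Oak Ave"],
})

# Toy survivorship rule: prefer the ERP source, then the latest update.
source_rank = {"ERP": 0, "CRM": 1}
golden = (
    records.assign(rank=records["source"].map(source_rank))
           .sort_values(["supplier_id", "rank", "updated"],
                        ascending=[True, True, False])
           .drop_duplicates("supplier_id", keep="first")
           .drop(columns="rank")
           .reset_index(drop=True)
)

print(golden)
```

Real survivorship rules are usually attribute-level (best value per field, not best whole record), but the ranking-and-keep-first pattern above is the common starting point.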

  • 12:30 PM – Lunch

Afternoon: Stakeholder Collaboration & Visualization

  • 1:30 PM – Stakeholder Requirements Meeting: Meet with supply chain managers who need a new operational dashboard. Instead of just taking notes to pass on to engineers, actively consult with them, ask probing questions about their metrics, and scope out exactly what data needs to be pulled.

  • 2:30 PM – Dashboard Development (Power BI): Take the requirements from the earlier meeting and start mocking up the visual layer in Power BI. Connect Power BI directly to the "Gold" layer view built that morning to ensure the dashboard runs efficiently.

  • 3:30 PM – Metadata & Documentation: Spend some time updating the data catalog (Alation). Tag the newly created tables, update the business glossary with the agreed-upon definitions from the stakeholder meeting, and quickly map out the data workflow for the team's SOPs.

  • 4:30 PM – Wrap Up & Automation: Spend the last part of the day writing a quick Python script or setting up a Power Automate flow to automate a manual data-entry task a stakeholder was struggling with earlier in the week. Set priorities for tomorrow and log off.

Incentive Compensation:

In addition to base pay, Zebra offers this role the opportunity to earn a performance-based annual cash incentive, at a target equal to 10% of base pay, in accordance with the terms of the applicable incentive plan.

Job Posting Statement:

To protect candidates from falling victim to online fraudulent activity involving fake job postings and employment offers, please be aware our recruiters will always connect with you via @zebra.com email accounts. Applications are accepted only through our applicant tracking system, and personal identifying information is collected only through that system. Our Talent Acquisition team will not ask you to provide personal identifying information via e-mail or outside of the system. If you are a victim of identity theft, contact your local police department.

AI Technology Statement:

Zebra Technologies leverages AI technology to evaluate job applications using objective, job-relevant criteria. This approach enhances efficiency and promotes fairness in the hiring process. However, every decision regarding interviews and hiring is made by our dedicated team, because we believe people make the best decisions about people. For more on how we use technology in hiring and how we process applicant data, see our Zebra Privacy Policy (https://www.zebra.com/us/en/about-zebra/company-information/legal/privacy-statement.html) .
