Job Information
Insight Global Data Engineer - Databricks in Sacramento, California
Job Description
Insight Global is seeking a Data Engineer to design, build, and maintain production-grade data systems that support analytics and downstream applications. This role focuses on building reliable, scalable pipelines and data services while applying modern software engineering practices.
Key Responsibilities
• Design and implement scalable data ingestion and transformation pipelines
• Build APIs and services that support data consumption
• Develop and maintain streaming and batch workflows
• Optimize data jobs for performance, reliability, and cost
• Debug distributed data systems and resolve performance issues
• Collaborate with analysts, data scientists, and platform teams
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances.
If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com.
To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Skills and Requirements
• 5+ years of experience in data engineering
• Experience designing and building scalable data pipelines
• Strong experience with Databricks
• Strong software engineering fundamentals applied to data systems
• Experience developing ETL/ELT workflows and microservices
• Familiarity with streaming and batch data processing
• Experience implementing monitoring, error handling, and reliability practices
• Exposure to real-time or event-driven data systems
• Experience working in regulated or compliance-heavy environments
• Familiarity with CI/CD and test-driven development for data platforms