Job Information
Insight Global REMOTE Data Engineer in Bloomington, Minnesota
Job Description
Insight Global is seeking a Data Engineer to join a large payor/provider. This person will be responsible for:
• Design, develop, and implement end-to-end data solutions using Azure Databricks.
• Convert current SQL to Python code in Databricks.
• Write, test, and optimize PySpark and SQL scripts to transform and load high volumes of structured data.
• Update or maintain existing data pipelines in a production setting.
• Ensure data quality and integrity by implementing data validation and cleansing processes.
• Demonstrate strong verbal communication and critical-thinking skills, collaborating well within a team and speaking up when needed.
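The validation-and-cleansing responsibility above can be sketched in plain Python. This is a minimal illustration only: the record schema (member_id, ndc, fill_date, qty) is hypothetical, and in the role described this logic would typically run as a PySpark filter or UDF inside Databricks rather than as standalone Python:

```python
# Minimal sketch of row-level validation and cleansing.
# The pharmacy-claim field names below are hypothetical examples,
# not taken from the employer's actual schema.
from datetime import datetime
from typing import Optional

REQUIRED_FIELDS = ("member_id", "ndc", "fill_date", "qty")

def clean_record(rec: dict) -> Optional[dict]:
    """Return a normalized record, or None if it fails validation."""
    # Drop rows with any missing or empty required field.
    if any(not rec.get(f) for f in REQUIRED_FIELDS):
        return None
    try:
        qty = int(rec["qty"])                              # coerce quantity to int
        datetime.strptime(rec["fill_date"], "%Y-%m-%d")    # validate date format
    except (ValueError, TypeError):
        return None
    if qty <= 0:
        return None  # quantities must be positive
    return {**rec, "qty": qty, "ndc": rec["ndc"].strip()}

rows = [
    {"member_id": "M1", "ndc": " 0002-8215 ", "fill_date": "2024-05-01", "qty": "30"},
    {"member_id": "",   "ndc": "0002-8215",   "fill_date": "2024-05-01", "qty": "30"},
    {"member_id": "M2", "ndc": "0002-8215",   "fill_date": "bad-date",   "qty": "30"},
]
cleaned = [r for r in (clean_record(r) for r in rows) if r is not None]
```

Of the three sample rows, only the first passes validation; the same filter-and-normalize pattern translates directly to a PySpark DataFrame pipeline.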
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to HR@insightglobal.com. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Skills and Requirements
2+ years of data engineering experience in an Azure environment
Strong SQL and Python skills
Databricks or Snowflake experience
Pharmacy data experience
2+ years of experience with other technologies in the tech stack (PySpark, GitHub, Power BI, Airflow, Fabric)
Relevant certifications (Azure, Databricks, etc.)