
Job Information

HCA Healthcare Senior Data Engineer - APIs in Nashville, Tennessee

This is OUR story... and YOUR next chapter

At HCA Healthcare, our Digital Transformation and Innovation (DT&I) team is redefining what’s possible in patient care. By leveraging the power of artificial intelligence, automation, and digital technologies, DT&I is helping drive meaningful improvements in clinical outcomes, reduce manual workload, and expand the reach of our care teams. If you're passionate about using technology to improve human life, this is where your work truly matters.

What you will accomplish in this role

The Senior Data Engineer serves as a primary development resource for the design, build, implementation, and support of ITG Data Management enterprise application initiatives. The role requires working closely with data teams, frequently in a matrixed environment as part of a broader project team. As a senior-level position, the role requires self-starters who are proficient in problem solving and capable of bringing clarity to complex situations. The culture of the organization places an emphasis on teamwork, so social and interpersonal skills are as important as technical capability. Due to the emerging and fast-evolving nature of Big Data/GCP technology and practice, the position requires staying well-informed of technological advancements and being proficient at putting new innovations into effective practice.

In addition, this candidate will have a history of increasing responsibility on a small multi-role team. This position requires a candidate who can analyze business requirements and perform design tasks, then construct, test, and implement solutions with minimal supervision. This candidate will have a track record of participation in successful projects in a fast-paced, mixed-team (consultant and employee) environment. The applicant must also be willing to mentor other developers to prepare them for assuming these responsibilities.

What you will do:

The following are core competencies and expectations for the role:

  • Communication and interpersonal skills

  • Problem-solving and critical thinking skills

  • Understanding of strategic imperatives

  • Technology and business knowledge

This role will provide application development for specific business environments. The focus is on setting technical direction for groups of applications and similar technologies, as well as taking responsibility for technically robust solutions that satisfy all business, architecture, and technology constraints.

  • Responsible for building and supporting a GCP-based ecosystem designed for enterprise-wide analysis of structured, semi-structured, and unstructured data.

  • Build APIs, data pipelines, and systems to access and process data.

  • Implement and test data pipelines.

  • Build analytics on raw data.

  • Troubleshoot data issues.

  • Communicate with other teams to identify and solve problems.

  • Set coding standards and perform code reviews.

  • Mentor junior developers.

  • Apply experience with APIs, microservices, and modern software patterns using containerized environments (e.g., Kubernetes) or serverless compute services.

  • Ensure service levels are maintained and any interruptions are resolved in a timely fashion.

What qualifications you will need:

  • Bachelor's degree in computer science, related technical field, or equivalent experience preferred

  • Master's degree in computer science or related field preferred

  • 3+ years of experience as a Data Engineer required

  • 1+ year(s) of experience in Healthcare preferred

  • 5+ years of experience in Information Technology required

  • Strong understanding of best practices and standards for GCP Data process design and implementation

  • 2+ years of hands-on experience with the GCP platform, including experience with many of the following components:

Cloud Run, GKE, Cloud Functions

Spark Streaming, Kafka, Pub/Sub

Bigtable, Firestore, Cloud SQL, Cloud Spanner

JSON, Avro, Parquet

BigQuery, Dataflow, Data Fusion

Cloud Composer, Dataproc, CI/CD, Cloud Logging

Vertex AI, NLP, GitHub

  • 3+ years of hands-on experience with many of the following components:

Spark Streaming, Kafka

SQL, JSON, Avro, Parquet

Java, Python, or Scala

  • Ability to multitask and to balance competing priorities.

  • Ability to define and utilize best practice techniques and to impose order in a fast-changing environment. Must have strong problem-solving skills.

  • Strong verbal, written, and interpersonal skills, including a desire to work within a highly matrixed, team-oriented environment.

  • CI/CD deployment experience

A successful candidate may have:

  • Experience in the healthcare domain

  • API development and integration experience

  • Hardware/operating systems experience: Linux, UNIX, GCP

  • Experience with distributed, highly scalable processing environments

Certifications (a plus, but not required):

Google Cloud Professional Data Engineer

PHYSICAL DEMANDS/WORKING CONDITIONS

  • Prolonged sitting or standing at computer workstation including use of mouse, keyboard, and monitor.

  • Requires ability to provide after-hours support.

At HCA Healthcare, we are committed to fostering a culture of growth that allows you to build the career of a lifetime. We encourage you to apply for our Senior Data Engineer - APIs today. We review all applications promptly, and qualified candidates will be contacted to continue the process. Join us!

We are an equal opportunity employer. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
