
Job Information

Burns & McDonnell Confluent Kafka Developer / Platform Administrator in Bengaluru, India

Description

Burns & McDonnell India is seeking a skilled Confluent Kafka Developer / Platform Administrator to join our Enterprise Integration & Automation team. This role will be pivotal in designing, implementing, and maintaining our event streaming platform built on Confluent Kafka. The ideal candidate will have a strong background in Kafka cluster administration, data streaming, DevOps practices, and enterprise system integration.

Key Responsibilities:

Platform Administration:

  • Install, configure, and manage Confluent Kafka clusters (on-premises and/or cloud environments such as Confluent Cloud or Kubernetes).

  • Manage Kafka brokers, ZooKeeper, Schema Registry, Kafka Connect, ksqlDB, Control Center, and REST Proxy.

  • Implement and enhance monitoring, alerting, and logging for Kafka components using tools like Prometheus, Grafana, or Datadog.

  • Ensure high availability, scalability, and fault tolerance of Kafka clusters.

  • Perform capacity planning, performance tuning, and disaster recovery setup.
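Monitoring duties like those above are commonly implemented by exporting broker JMX metrics to Prometheus and alerting on cluster health signals such as under-replicated partitions. A hedged sketch of one such alert rule follows; the metric name assumes the Prometheus JMX exporter's default naming and should be verified against the actual scrape output:

```yaml
groups:
  - name: kafka-health
    rules:
      - alert: KafkaUnderReplicatedPartitions
        # Metric name assumes the JMX exporter's default mapping;
        # confirm against your exporter configuration.
        expr: kafka_server_replicamanager_underreplicatedpartitions > 0
        for: 5m
        labels:
          severity: critical
        annotations:
          summary: "Kafka broker reports under-replicated partitions"
```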

Development & Integration:

  • Design and develop Kafka topics, producers, consumers, and stream-processing applications.

  • Build Kafka Connect pipelines for integrations with systems such as Oracle, Azure, Salesforce, MuleSoft, Apigee, and cloud storage.

  • Implement schema management and data governance using Confluent Schema Registry and Avro/JSON/Protobuf.

  • Collaborate with integration teams to establish best practices for event-driven architecture.
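The topic, producer, and consumer concepts above can be sketched with a minimal in-memory stand-in. This is a conceptual illustration only, in pure Python with no broker; `Topic` and `Consumer` here are illustrative names, not the confluent-kafka client API:

```python
# Conceptual sketch of Kafka's topic/partition/offset model: keyed messages
# map deterministically to partitions, and consumers advance per-partition
# offsets. A real application would use the confluent-kafka client instead.

class Topic:
    def __init__(self, name, num_partitions=3):
        self.name = name
        self.partitions = [[] for _ in range(num_partitions)]

    def produce(self, key, value):
        # Kafka's default partitioner hashes the message key;
        # Python's hash() stands in for it here.
        p = hash(key) % len(self.partitions)
        self.partitions[p].append((key, value))
        return p, len(self.partitions[p]) - 1  # (partition, offset)

class Consumer:
    def __init__(self, topic):
        self.topic = topic
        # One committed offset per partition.
        self.offsets = [0] * len(topic.partitions)

    def poll(self):
        records = []
        for p, log in enumerate(self.topic.partitions):
            while self.offsets[p] < len(log):
                records.append((p, self.offsets[p], log[self.offsets[p]]))
                self.offsets[p] += 1
        return records

orders = Topic("orders")
orders.produce("customer-42", {"amount": 10})
orders.produce("customer-42", {"amount": 25})  # same key -> same partition

c = Consumer(orders)
first = c.poll()   # both records delivered, offsets advance
again = c.poll()   # empty: nothing new past the committed offsets
```

Note that messages sharing a key land in the same partition, which is what preserves per-key ordering in real Kafka deployments.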

Security & Compliance:

  • Configure RBAC, ACLs, and encryption (TLS/SASL/OAuth) for Kafka components.

  • Partner with IT Security to ensure compliance with enterprise security standards.

  • Maintain audit logs and implement data retention and governance policies.

DevOps & Automation:

  • Support and maintain CI/CD pipelines for Kafka using GitHub Actions and Terraform.

  • Automate provisioning and deployment of Kafka resources.

  • Contribute to Infrastructure as Code (IaC) and self-service automation tooling initiatives.
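Provisioning Kafka resources as code, as described above, might look like the following Terraform fragment. This is a hypothetical sketch assuming the `confluentinc/confluent` provider; resource and attribute names should be checked against the provider documentation for the version in use:

```hcl
# Hypothetical sketch: declaring a Kafka topic as Infrastructure as Code.
resource "confluent_kafka_topic" "orders" {
  kafka_cluster {
    id = confluent_kafka_cluster.main.id  # assumes a cluster declared elsewhere
  }
  topic_name       = "orders"
  partitions_count = 6
  config = {
    "retention.ms" = "604800000"  # 7-day retention, chosen for illustration
  }
}
```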

Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.

  • 4 years of experience with Apache Kafka (2 years with Confluent Kafka preferred).

  • Hands-on experience in Kafka cluster administration, operations, and performance tuning.

  • Strong knowledge of Kafka Connect, ksqlDB, and Schema Registry.

  • Proficiency with Linux, shell scripting, and automation tools.

  • Familiarity with container orchestration (Kubernetes / Docker).

  • Experience integrating Kafka with MuleSoft, APIs, or cloud data pipelines is a plus.

  • Exposure to Google BigQuery or other cloud-based data services related to event streaming.

  • Experience with Git, CI/CD pipelines, and monitoring frameworks.

  • Excellent communication, problem-solving, and documentation skills.

Preferred Skills:

  • Confluent Certified Administrator / Developer certification.

  • Experience with Apache Flink, Spark Streaming, or Debezium CDC connectors.

  • Strong understanding of event-driven microservices and loosely coupled architectures.

  • Prior experience in a large enterprise or consulting environment.

This job posting will remain open for a minimum of 72 hours and on an ongoing basis until the position is filled.

Job: Engineering

Primary Location: India-Karnataka-Bengaluru

Schedule: Full-time

Travel: No

Req ID: 254536

Job Hire Type: Experienced
