535F: Kafka Specialist / Streaming Engineer
Indeed
Full-time
Onsite
Any experience level
No degree required
Description

Job Summary: The Apache Kafka Specialist will design, implement, and maintain high-scale data streaming architectures, ensuring the availability, reliability, and performance of the company's messaging platforms.

Key Highlights:
1. Work on high-scale data streaming architectures.
2. Own critical messaging and event platforms.
3. Collaborate with development and infrastructure teams.

**Apache Kafka Specialist**

You will design, implement, and maintain high-scale data streaming architectures, joining the team responsible for the availability, reliability, and performance of the company's messaging and event platforms, directly contributing to critical business systems.

Responsibilities

* Design, implement, and administer Apache Kafka clusters (on-premise or cloud).
* Create and maintain streaming topologies using Kafka Connect, Kafka Streams, or ksqlDB.
* Ensure platform observability (metrics, logs, traces, alerts).
* Perform performance tuning, throughput optimization, partitioning, and replication.
* Implement resilient and scalable data pipelines.
* Manage schemas using Confluent Schema Registry.
* Troubleshoot complex issues involving producers, consumers, offsets, and performance.
* Collaborate with development, infrastructure, DevOps, and security teams.
* Create and maintain technical documentation and internal standards.
* Participate in deployment cycles, CI/CD pipelines, and automation (Terraform / Ansible / Helm, per the available stack).
* Actively participate in related training sessions.

Required Skills

* Practical production experience with Apache Kafka.
* Mastery of core concepts: topics, partitions, replication, offsets, consumer groups, retention, compaction.
* Hands-on experience with Kafka Connect and connectors.
* Solid knowledge of Kafka Streams or ksqlDB.
* Cluster administration (ZooKeeper or KRaft/KIP-500, depending on version).
* Experience in Linux environments.
* Container (Docker) and Kubernetes/OpenShift experience (desirable).
* Familiarity with security mechanisms: SSL/TLS, SASL, ACLs, RBAC.
* CI/CD knowledge (GitLab, Jenkins, GitHub Actions, etc.).

Desirable Skills

* Experience with Confluent Platform or Red Hat AMQ Streams.
* Experience with Terraform or Ansible for automation.
* Knowledge of a cloud provider (AWS MSK / Azure Event Hubs / GCP Pub/Sub + Confluent Cloud).
* Experience with observability tools (Prometheus, Grafana, Datadog, ELK, OpenTelemetry).
* Knowledge of other messaging technologies (RabbitMQ, Pulsar, ActiveMQ).
* Experience with backend programming languages such as Java, Python, or Go.
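On the "resilient and scalable data pipelines" point: an illustrative (not prescriptive) set of Kafka producer properties often used for durability; the exact values depend on the workload and broker setup:

```properties
# Illustrative producer settings for a durability-focused pipeline
acks=all                                   # wait for all in-sync replicas
enable.idempotence=true                    # no duplicates on retry
retries=2147483647                         # retry transient failures
max.in.flight.requests.per.connection=5    # max allowed with idempotence
delivery.timeout.ms=120000                 # overall send deadline
compression.type=lz4                       # cheap throughput win
```

Durability settings like these trade latency for delivery guarantees, so a low-latency metrics pipeline would tune them differently.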
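As an aside for candidates reviewing the concepts listed above: log compaction keeps only the newest record per key (a record with a null value acts as a tombstone that deletes the key), while preserving offset order among the survivors. A minimal Python sketch of that semantics, with hypothetical example data:

```python
def compact(log):
    """Simulate Kafka log compaction: keep only the latest record
    per key (a None value is a tombstone that deletes the key),
    preserving the original offset order of the survivors."""
    latest = {}  # key -> offset of its most recent record
    for offset, (key, _value) in enumerate(log):
        latest[key] = offset
    return [
        (key, value)
        for offset, (key, value) in enumerate(log)
        if latest[key] == offset and value is not None
    ]

# Three updates to "user1"; only the newest survives, and the
# tombstone for "user2" removes that key entirely.
log = [("user1", "a"), ("user2", "b"), ("user1", "c"),
       ("user2", None), ("user1", "d")]
print(compact(log))  # [('user1', 'd')]
```

Real brokers compact segment by segment in the background, so the actual log is only eventually this tidy; the sketch shows the end-state semantics only.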
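Consumer groups, also listed above, divide a topic's partitions among group members. A simplified Python sketch of how Kafka's range assignor distributes one topic's partitions (member names are hypothetical; the real assignor also handles multiple topics and rebalance metadata):

```python
def range_assign(num_partitions, consumers):
    """Sketch of range-style assignment for a single topic: members
    are sorted, each gets a contiguous chunk of partitions, and the
    first (num_partitions % num_consumers) members get one extra."""
    members = sorted(consumers)
    per, extra = divmod(num_partitions, len(members))
    assignment, start = {}, 0
    for i, member in enumerate(members):
        count = per + (1 if i < extra else 0)
        assignment[member] = list(range(start, start + count))
        start += count
    return assignment

print(range_assign(7, ["c1", "c2", "c3"]))
# {'c1': [0, 1, 2], 'c2': [3, 4], 'c3': [5, 6]}
```

This is why running more consumers than partitions leaves some members idle: there are no partitions left to hand out.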

Source: Indeed
João Silva
Indeed · HR

Company

Indeed
© 2025 Servanan International Pte. Ltd.