Full Time
Data Engineer – Remote
Job Description
Job Type: Data Engineer (work from home)
Location: Alabama (work from home)
Company: Robert Half International
Your duties as an Apache Kafka Engineer will include designing, implementing, and managing our Kafka-based data streaming platform. You will collaborate closely with cross-functional teams to enable real-time analytics, ensure reliable and efficient data flow, and promote data-driven decision-making throughout the company.
Responsibilities:
- Design, deploy, and manage Apache Kafka clusters in a distributed, high-availability environment.
- Collaborate with software developers, data scientists, and data engineers to integrate Kafka into applications and data pipelines.
- Build and tune Kafka producers and consumers to stream data at high throughput and low latency (see the producer sketch after this list).
- Monitor Kafka cluster health, performance, and resource utilization, and take proactive measures to ensure system reliability.
- Diagnose and resolve system faults, performance bottlenecks, and other Kafka-related issues.
- Establish and manage Kafka cluster security and access controls.
- Ensure that data retention policies are configured correctly and enforced.
- Stay current with industry developments and Kafka best practices, and recommend changes that improve system scalability and efficiency.
- Create and maintain up-to-date documentation of Kafka architecture, configurations, and operational procedures.
- Participate in the on-call rotation supporting the Kafka infrastructure.
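To give a concrete sense of the producer work described above, here is a minimal sketch of a Java producer tuned for the throughput/latency trade-off, using the standard Apache Kafka client. The broker address, topic name, key, and payload are placeholders, and the specific settings are illustrative rather than prescriptive.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class EventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; a real deployment lists several brokers.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Throughput vs. latency: batch records for up to 10 ms and
        // compress batches to cut network overhead.
        props.put(ProducerConfig.LINGER_MS_CONFIG, "10");
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, Integer.toString(64 * 1024));
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");
        // Durability: wait for all in-sync replicas and avoid duplicates on retry.
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Hypothetical "events" topic, key, and JSON payload.
            ProducerRecord<String, String> record =
                new ProducerRecord<>("events", "user-42", "{\"action\":\"click\"}");
            // Asynchronous send; the callback reports delivery or failure.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Delivered to %s-%d@%d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        } // close() flushes any buffered records before returning
    }
}
```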
Requirements:
- A bachelor’s degree in information technology, computer science, or a similar discipline (or comparable professional experience).
- Proven experience running Kafka in production, for example as a Kafka Engineer.
- Strong understanding of Apache Kafka architecture, including topics, partitions, brokers, and ZooKeeper.
- Experience with the broader Kafka ecosystem, including Confluent Platform, Kafka Connect, and Kafka Streams (see the Streams sketch after this list).
- Proficiency in configuring and tuning Kafka for optimal reliability and performance.
- Familiarity with shell scripting and Linux-based systems.
- Scripting and automation skills with tools such as Terraform, Puppet, or Ansible.
- Knowledge of data encryption and Kafka security best practices.
- Strong problem-solving and troubleshooting abilities.
- Exceptional teamwork and communication abilities.
- A certification in Kafka development or administration, such as the Confluent Certified Developer or Administrator designation, is advantageous.
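As a rough illustration of the Kafka Streams familiarity listed above, here is a minimal Streams topology using the standard Java API. The application id, broker address, topic names, and filter condition are all placeholders chosen for the example, not details of this role's actual pipelines.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class ClickFilterApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder application id and broker address.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "click-filter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read from a hypothetical "events" topic, keep only click events,
        // and write them to a downstream "clicks" topic.
        KStream<String, String> events = builder.stream("events");
        events.filter((key, value) -> value.contains("\"action\":\"click\""))
              .to("clicks");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        // Close the topology cleanly on shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```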