Kafka Fundamentals for Java Developers

Apache Kafka is an open-source distributed event streaming platform. Its design makes it useful in a wide range of applications that deal with large streams of data. This course introduces Apache Kafka to Java developers, covering not only how to interact with the broker programmatically, but also its internal design. This knowledge is crucial for grasping the responsibilities that must be addressed to create safe, robust, and high-performance Kafka-based systems.

  • Duration: 24 hours
  • Location: Online
  • Language: English
  • Code: JVA-083
  • Price: € 650 *

Available sessions

To be determined




Description

“Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications.” This tagline from the main Apache Kafka website may sound vague, yet when we learn that more than 80% of Fortune 100 companies use Kafka extensively, we begin to understand how important the platform is for modern software development.

This course will introduce Apache Kafka to Java developers. We will focus on:

  • Understanding how asynchronous messaging empowers software systems of any size;
  • The main messaging models: point-to-point and publish/subscribe;
  • The advantages of asynchronous messaging;
  • The building blocks of Kafka – let’s recreate the thought process behind designing Kafka;
  • The design decisions that distinguish Kafka from other message brokers and allow it to achieve its remarkable performance;
  • The broad set of responsibilities Kafka places on developers during the design and implementation phases;
  • Tools and libraries for interacting with Kafka, and how they help solve real-life challenges and obstacles.

After completing the course, participants receive a certificate issued by Luxoft Training.

Objectives

  • Understand Apache Kafka’s internal architecture and behavior;
  • Design a Kafka topic with data retention, client application performance, latency, and throughput in mind;
  • Run a local Kafka cluster for development purposes;
  • Use the CLI (command line tools) and AKHQ (Web UI) for simple Kafka management;
  • Create Kafka Java API clients: producers, consumers, processors;
  • Manage the data formats used in Kafka with the help of Apache Avro, and evolve those formats safely;
  • Solve real-world problems of message production and consumption, serialization, error handling, and transaction management (using the Spring for Kafka integration);
  • Enter the world of stream processing with the Kafka Streams library, getting to know a range of stateless and stateful event stream transformations.

Target Audience

Java Developers, System Architects

Prerequisites

Development experience in Java (2 years)


Roadmap

  1. Introduction
    1. Asynchronous message passing
    2. Messaging models
  2. Setting up a local Kafka installation using Docker
  3. Design your own Kafka – let’s recreate the thought process behind creating Kafka
  4. Designing a safe and performant data exchange
  5. Command Line Interface
  6. Java Client API
    1. Project structure and dependencies
    2. Producer API and configuration (see the producer sketch after this roadmap)
    3. Consumer API and configuration (see the consumer sketch after this roadmap)
    4. Admin API
    5. Enabling more manual control over production/consumption process
    6. Polling and heartbeat for ensuring client application liveness and readiness
    7. High-throughput setup – programmatic example
    8. Transactions
  7. Data Formats
    1. The need for a managed data format
    2. Introduction to Apache Avro
    3. Building Avro records using generated DTO classes and the dynamic GenericRecord approach (see the Avro sketch after this roadmap)
    4. Schema evolution and compatibility types
    5. Introduction to Schema Registry
    6. How to create a Kafka application backed by Avro and Schema Registry
      1. Avro powered producer
      2. SpecificRecord consumer
      3. GenericRecord consumer
    7. AKHQ support for Schema Registry – interacting with Web UI
  8. Spring Framework Integration
    1. Topic creation
    2. KafkaTemplate for sending messages (see the Spring sketch after this roadmap)
    3. No-code setup using application.properties
    4. Simplified consumer using @KafkaListener
    5. Configuring Kafka clients from code for advanced setups
    6. Handling concurrency
    7. JSON support
    8. Avro support
    9. Transactions
    10. Implementing processors
    11. Request/response pattern using ReplyingKafkaTemplate
    12. Error handling, retries, dead letter topic recoverer
  9. Kafka Streams
    1. Introduction
    2. Stateless processing (see the Kafka Streams sketch after this roadmap)
    3. Avro support
    4. Stateful processing
      1. Processing operators
      2. Data locality principle
      3. Windowing
      4. Joins
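
Illustrative code sketches

To give a flavor of the material, the sketches below illustrate a few techniques named in the roadmap. They are minimal illustrations, not course materials: the broker address (localhost:9092), topic names (orders, orders-uppercase), and group id (orders-reader) are assumptions invented for these examples.

First, a producer sketch for the Producer API topic of the Java Client API module. It connects to an assumed local development broker and sends one record with a completion callback:

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed local dev broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // "orders" is a hypothetical topic name used only for illustration
            producer.send(new ProducerRecord<>("orders", "order-1", "created"),
                    (metadata, exception) -> {
                        if (exception != null) {
                            exception.printStackTrace(); // real error handling is a course topic
                        } else {
                            System.out.printf("Written to %s-%d at offset %d%n",
                                    metadata.topic(), metadata.partition(), metadata.offset());
                        }
                    });
        } // close() flushes any buffered records
    }
}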
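
A matching consumer sketch for the Consumer API topic. The subscribe/poll loop shown here is also central to the liveness checks mentioned in the roadmap: a consumer that stops polling is eventually removed from its group.

import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class SimpleConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed local dev broker
        props.put("group.id", "orders-reader");           // hypothetical consumer group
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());
        props.put("auto.offset.reset", "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders"));
            while (true) {
                // poll() fetches records and keeps the client active in its group
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s-%d@%d: %s%n",
                            record.topic(), record.partition(), record.offset(), record.value());
                }
            }
        }
    }
}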
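
For the Data Formats module, a sketch of the dynamic GenericRecord approach to building Avro records. The Order schema is invented for this example; the course also covers generated DTO classes and Schema Registry integration.

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class AvroRecordSketch {
    public static void main(String[] args) {
        // A hypothetical "Order" schema, defined inline for illustration
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
                + "{\"name\":\"id\",\"type\":\"string\"},"
                + "{\"name\":\"amount\",\"type\":\"double\"}]}");

        // GenericRecord builds records dynamically, without generated DTO classes
        GenericRecord order = new GenericData.Record(schema);
        order.put("id", "order-1");
        order.put("amount", 42.0);
        System.out.println(order); // prints the record as JSON-like text
    }
}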
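
For the Spring Framework Integration module, a sketch combining KafkaTemplate for sending with a @KafkaListener consumer. It assumes spring-kafka is on the classpath and spring.kafka.bootstrap-servers is set in application.properties; topic and group names are again hypothetical.

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class OrderMessaging {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public OrderMessaging(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate; // auto-configured by Spring Boot
    }

    public void publish(String orderId, String payload) {
        // "orders" is a hypothetical topic name
        kafkaTemplate.send("orders", orderId, payload);
    }

    @KafkaListener(topics = "orders", groupId = "orders-reader")
    public void onMessage(String payload) {
        System.out.println("Received: " + payload);
    }
}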
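
Finally, for the Kafka Streams module, a stateless processing sketch: values read from one topic are uppercased and written to another. The application id and topic names are invented for the example.

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class UppercaseStream {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "orders-uppercase-app"); // hypothetical id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Stateless transformation: read "orders", uppercase values, write elsewhere
        KStream<String, String> orders = builder.stream("orders");
        orders.mapValues(value -> value.toUpperCase())
              .to("orders-uppercase");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}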

