Kafka Streams Fundamentals

The "Kafka Streams Fundamentals" course introduces participants to the essential concepts of stream data processing using Kafka Streams. Participants will learn to design and develop streaming applications, work with Kafka Streams API, and integrate it with other technologies. The course balances theoretical knowledge and hands-on exercises in Java, providing practical experience with real-world data processing scenarios.
  • Duration: 9 hours
  • Location: Online
  • Language: English
  • Code: EAS-036
  • Price: € 300 *

Available sessions

To be determined



Training for 7-8 or more people?
Customize training for your specific needs

Description

The "Kafka Streams Fundamentals" course is designed to equip participants with the skills to process and analyze streaming data effectively. This course provides an in-depth exploration of the Kafka Streams library, focusing on its powerful capabilities for processing real-time data at scale. Through a mix of theoretical lessons and hands-on projects, participants will gain the expertise required to build, deploy, and optimize Kafka Streams applications.

What You’ll Learn:


• Understand the architecture and core concepts of Kafka Streams.
• Develop streaming applications using the Kafka Streams Java API.
• Process and analyze streaming data in real time.
• Integrate Kafka Streams with external systems such as databases and cloud services.
• Debug, monitor, and optimize Kafka Streams applications.

This course is suitable for:
• Developers and Data Engineers aiming to master stream processing.
• Solution Architects looking to integrate streaming solutions into enterprise architectures.
• Technical professionals transitioning into real-time data processing roles.

Course Program


1. Introduction to Stream Processing and Kafka Streams (1 hour - Theory)

• Overview of stream processing concepts
• Role of Kafka Streams in the Apache Kafka ecosystem
• Core architecture of Kafka Streams
• Introduction to Apache Kafka: topics, producers, consumers, and brokers
• Differences between batch and stream processing
2. Fundamentals of Kafka Streams API (3 hours - 2 hours theory, 1 hour practice)

• Managing topics: reading and writing data
• Basic operations: filtering, mapping, and aggregation
• Managing stream state: state stores

Practical Exercise:
• Set up and run a basic Kafka Streams application
• Implement basic operations using the Kafka Streams Java API
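The basic operations in this module can be illustrated with a minimal Kafka Streams application. This is a sketch, not course material: the topic names (`input-topic`, `output-topic`) and broker address are illustrative assumptions. The topology filters out empty values and maps the rest to upper case:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KStream;

public class BasicStreamsApp {

    // Build the topology separately so it can be unit-tested with
    // TopologyTestDriver, without a running broker.
    public static Topology buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> input = builder.stream("input-topic"); // hypothetical topic
        input.filter((key, value) -> value != null && !value.isEmpty()) // drop empty records
             .mapValues(value -> value.toUpperCase())                   // map: normalize case
             .to("output-topic");                                       // hypothetical topic
        return builder.build();
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "basic-streams-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        KafkaStreams streams = new KafkaStreams(buildTopology(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Keeping topology construction in its own method is the pattern used with `kafka-streams-test-utils`, which lets the practical exercises be verified locally before deployment.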
3. Advanced Event Processing (2 hours - 1 hour theory, 1 hour practice)

• Windowed operations
• Joins and aggregations
• Real-time data transformations

Practical Exercise:
• Create an application implementing windowed operations
• Perform joins between multiple data streams
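A windowed aggregation of the kind practiced in this module can be sketched as follows; the topic names (`clicks`, `click-counts`) and the 5-minute tumbling window are illustrative assumptions. The topology counts events per key within each window and re-keys the result as `<key>@<windowStartMs>` so it can be written as a plain String/Long stream:

```java
import java.time.Duration;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.TimeWindows;

public class WindowedCountApp {

    // Counts events per key in 5-minute tumbling windows. The windowed
    // count is maintained in a state store managed by Kafka Streams.
    public static Topology buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> clicks = builder.stream("clicks"); // hypothetical topic
        clicks.groupByKey()
              .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(5)))
              .count()
              // Flatten the windowed key into a readable String key.
              .toStream((windowedKey, count) ->
                  windowedKey.key() + "@" + windowedKey.window().start())
              .to("click-counts", Produced.with(Serdes.String(), Serdes.Long()));
        return builder.build();
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "windowed-count-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        KafkaStreams streams = new KafkaStreams(buildTopology(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Stream-stream joins follow the same shape: both sides are grouped by the join key and combined over a `JoinWindows` interval rather than a tumbling window.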
4. Integration and Deployment of Kafka Streams Applications (3 hours - 2 hours theory, 1 hour practice)

• Integrating with external systems (e.g., databases, cloud storage)
• Monitoring and debugging Kafka Streams applications
• Deployment practices for containerized environments

Practical Exercise:
• Deploy a Kafka Streams application using Docker
• Set up monitoring with Kafka-compatible tools

By the end of the course, participants will:


• Have a solid understanding of stream processing and Kafka Streams.
• Be able to build, debug, and deploy Kafka Streams applications.
• Confidently manage and analyze real-time data streams.

After completing the course, participants receive a certificate issued by Luxoft Training.

Objectives

  • By the end of the course, students should be comfortable applying what they have learned (building their own Kafka Streams applications and processing real-time data with them) and confident in extending their knowledge in this domain.

Target Audience

  • Software developers and specialists interested in Big Data real-time processing: Apache Flink and Apache Spark Streaming users, Kafka developers.

Prerequisites

  • Kafka Development Experience

