Kafka for Developers Online Course
This course introduces how data serialization fits into Kafka, comparing formats such as AVRO, Protobuf, and Thrift and focusing on AVRO given its popularity in Kafka workflows. You’ll gain hands-on experience by setting up Kafka locally, producing and consuming messages, and generating Java classes from AVRO schemas using Gradle and Maven. You’ll learn schema evolution techniques, build a Spring Boot Kafka application that uses AVRO with Schema Registry, and develop a RESTful service that publishes events into Kafka. By the end, you’ll be confident using AVRO for serialization and managing schema evolution effectively.
Who should take this Course?
The Kafka for Developers Online Course is designed for software developers, data engineers, and backend programmers who want to build scalable and real-time applications using Apache Kafka. It is also suitable for students, architects, and professionals looking to gain practical knowledge of event-driven systems, stream processing, and messaging frameworks to efficiently manage high-volume data pipelines.
What you will learn
- Understand the fundamentals of data serialization
- Understand the different serialization formats available
- Consume AVRO records using Kafka Consumer
- Publish AVRO records using Kafka Producer
- Enforce data contracts using Schema Registry
- Use Schema Registry to register the AVRO Schema
Course Outline
Getting Started with the Course
- Introduction
- Prerequisites
Data Contract and Serialization in Kafka
- Data Contract and Serialization in Kafka
- Serialization Formats
Introduction to AVRO - A Data Serialization System
- Introduction to AVRO - What Is AVRO and Why AVRO?
- Build a Simple AVRO Schema
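The lessons in this module define the schema in an .avsc file; purely as a hedged illustration, the same kind of simple greeting schema can also be built programmatically with Avro's SchemaBuilder (the namespace and field name below are assumptions, not the course's exact schema):

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;

public class GreetingSchema {
    public static void main(String[] args) {
        // Programmatic equivalent of a simple greeting.avsc file
        Schema schema = SchemaBuilder.record("Greeting")
                .namespace("com.learnavro")         // hypothetical namespace
                .fields()
                .requiredString("greeting")         // hypothetical field name
                .endRecord();

        // Prints the JSON that would otherwise live in the .avsc file
        System.out.println(schema.toString(true));
    }
}
```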
Kafka Setup and Demo in Local Using Docker
- Set Up Kafka Broker and Zookeeper Using Docker Compose
- Produce and Consume Messages Using CLI
- Produce and Consume Using AVRO Console Producer and Consumer
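The demos in this module are driven by Docker Compose and the Kafka CLI tools; as a small, hedged sanity check from Java, something like the following can confirm the local broker is reachable (it assumes the compose file exposes the broker on localhost:9092 and uses a hypothetical topic name):

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class LocalKafkaCheck {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Assumes the docker-compose file maps the broker to localhost:9092
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Create a topic for the CLI demos (name is an assumption; fails if it already exists)
            admin.createTopics(List.of(new NewTopic("greetings", 1, (short) 1))).all().get();
            System.out.println("Topics: " + admin.listTopics().names().get());
        }
    }
}
```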
Greeting App - Base AVRO Project Setup - Gradle
- Base Project Setup for Greeting App
- Generate AVRO Java Records Using AVRO Schema Files
Greeting App - Base AVRO Project Setup - Maven
- Base Project Setup for Greeting App - Maven
- Generate AVRO Java Records Using AVRO Schema Files - Maven
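Whether the Gradle or the Maven plugin does the code generation, the result is a SpecificRecord class with a builder API. A minimal sketch of using such a generated class (the package and field name are assumptions based on the greeting schema above):

```java
import com.learnavro.Greeting;   // generated by the Avro Gradle/Maven plugin; package is an assumption

public class GreetingRecordDemo {
    public static void main(String[] args) {
        // Generated AVRO classes expose a builder for constructing records
        Greeting greeting = Greeting.newBuilder()
                .setGreeting("Hello, AVRO!")
                .build();

        System.out.println(greeting);              // prints the record as JSON-like text
        System.out.println(greeting.getSchema());  // the schema is embedded in the generated class
    }
}
```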
Build AVRO Producer and Consumer in Java
- Let's Build AVRO Kafka Producer
- Let's Build AVRO Kafka Consumer
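A rough sketch of what a plain Java AVRO producer can look like at this point in the course, before Schema Registry is introduced: the generated record is hand-serialized to AVRO binary and sent as a byte[] (package, topic name, and field name are assumptions):

```java
import java.io.ByteArrayOutputStream;
import java.util.Properties;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.specific.SpecificDatumWriter;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.ByteArraySerializer;
import org.apache.kafka.common.serialization.StringSerializer;
import com.learnavro.Greeting;   // generated class; package is an assumption

public class GreetingProducer {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class);

        Greeting greeting = Greeting.newBuilder().setGreeting("Hello, Kafka!").build();

        // Serialize the SpecificRecord to AVRO binary by hand (no Schema Registry yet)
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new SpecificDatumWriter<>(Greeting.class).write(greeting, encoder);
        encoder.flush();

        try (KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("greetings", out.toByteArray())).get();
        }
    }
}
```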
Coffee Shop Order Service Using AVRO - A Real-Time Use Case
- Application Overview
- Project Setup for Coffee Shop - Gradle
- Project Setup for Coffee Shop - Maven
- Build a Coffee Order Schema Using AVRO
- Generating AVRO Classes Using Gradle
- Generating AVRO Classes Using Maven
- Build a Coffee Shop Order Producer
- Build a Coffee Shop Order Consumer
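On the consuming side, a matching hedged sketch: the consumer reads the raw byte[] payload and decodes it back into the generated CoffeeOrder class with a SpecificDatumReader (package, topic name, and group id are assumptions):

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;
import com.learnavro.CoffeeOrder;   // generated class; package is an assumption

public class CoffeeOrderConsumer {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "coffee-order-consumer");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class);

        SpecificDatumReader<CoffeeOrder> reader = new SpecificDatumReader<>(CoffeeOrder.class);

        try (KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("coffee-orders"));   // topic name is an assumption
            while (true) {
                ConsumerRecords<String, byte[]> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, byte[]> record : records) {
                    // Decode the AVRO binary payload back into the generated class
                    CoffeeOrder order = reader.read(null,
                            DecoderFactory.get().binaryDecoder(record.value(), null));
                    System.out.println("Received order: " + order);
                }
            }
        }
    }
}
```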
Logical Schema Types in AVRO
- Introduction to Logical Types in AVRO
- Add Timestamp and Decimal Logical Types to the CoffeeOrder Schema
- Adding the UUID as Key for CoffeeOrder
- Date Logical Type
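For reference, Avro exposes these logical types through org.apache.avro.LogicalTypes; a small sketch of the primitive base type each one annotates (the precision and scale are arbitrary example values):

```java
import org.apache.avro.LogicalTypes;
import org.apache.avro.Schema;

public class LogicalTypesDemo {
    public static void main(String[] args) {
        // Each logical type annotates a primitive base type with extra meaning;
        // generated classes typically surface them as Instant, BigDecimal, UUID, and LocalDate
        Schema orderedAt = LogicalTypes.timestampMillis().addToSchema(Schema.create(Schema.Type.LONG));
        Schema amount    = LogicalTypes.decimal(6, 2).addToSchema(Schema.create(Schema.Type.BYTES));
        Schema orderId   = LogicalTypes.uuid().addToSchema(Schema.create(Schema.Type.STRING));
        Schema orderDate = LogicalTypes.date().addToSchema(Schema.create(Schema.Type.INT));

        System.out.println(orderedAt);  // {"type":"long","logicalType":"timestamp-millis"}
        System.out.println(amount);     // {"type":"bytes","logicalType":"decimal","precision":6,"scale":2}
        System.out.println(orderId);    // {"type":"string","logicalType":"uuid"}
        System.out.println(orderDate);  // {"type":"int","logicalType":"date"}
    }
}
```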
AVRO Record - Under the Hood
- What's Inside an AVRO Record?
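A short sketch that makes the "under the hood" point concrete: encoding a one-field record to AVRO binary shows that only the field values go on the wire, which is why a reader always needs the writer's schema to make sense of the bytes:

```java
import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;

public class AvroBinaryDemo {
    public static void main(String[] args) throws Exception {
        Schema schema = SchemaBuilder.record("Greeting").fields()
                .requiredString("greeting").endRecord();

        GenericRecord record = new GenericData.Record(schema);
        record.put("greeting", "Hi");

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(record, encoder);
        encoder.flush();

        // Only field values are on the wire: a length byte followed by "Hi".
        // Field names and types live in the schema, not in the payload.
        System.out.println("Encoded bytes: " + out.size());   // 3 bytes for this record
    }
}
```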
Schema Changes in AVRO - Issues without Schema Registry
- Evolving the Schema - Consumer Fails to Read the New Schema
Introduction to Schema Registry
- Introduction
- Publish and Consume Records Using Schema Registry
- Schema Registry Internals and Interacting with Schema Registry Using REST Endpoint
- Publish and Consume "Key" as an AVRO Record
Data Evolution Using Schema Registry
- Data Evolution and Schema Evolution
- Update the Code to Interact with Maven Local Repository - Gradle
- Update the Code to Interact with Maven Local Repository - Maven
- Deleting a Field in Schema - BACKWARD Compatibility
- Adding a New Field in Schema - FORWARD Compatibility
- Add/Delete Optional Fields - FULL Compatibility
- Modify Field Names - NONE Compatibility
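The course demonstrates these rules through Schema Registry's compatibility settings; as a local illustration of the same idea, Avro's own SchemaCompatibility helper can show why deleting a required field is BACKWARD but not FORWARD compatible (the schemas below are simplified stand-ins, not the course's exact CoffeeOrder schema):

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.SchemaCompatibility;

public class CompatibilityDemo {
    public static void main(String[] args) {
        // v1: writer schema with two required fields
        Schema v1 = SchemaBuilder.record("CoffeeOrder").fields()
                .requiredString("id")
                .requiredString("name")
                .endRecord();

        // v2: the "name" field has been deleted
        Schema v2 = SchemaBuilder.record("CoffeeOrder").fields()
                .requiredString("id")
                .endRecord();

        // BACKWARD: can a v2 reader read data written with v1? Yes - the extra field is skipped.
        System.out.println(SchemaCompatibility
                .checkReaderWriterCompatibility(v2, v1).getType());   // COMPATIBLE

        // FORWARD: can a v1 reader read data written with v2? No - "name" is missing
        // from the writer schema and has no default value in v1.
        System.out.println(SchemaCompatibility
                .checkReaderWriterCompatibility(v1, v2).getType());   // INCOMPATIBLE
    }
}
```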
Schema Naming Strategies
- Different Types of Naming Strategies
- Coffee Update Event AVRO Schema
- Publish and Consume CoffeeOrder UpdateEvent Using RecordNameStrategy
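Switching strategies is a serializer configuration concern; a minimal sketch of producer properties that opt into RecordNameStrategy so that the order event and the new update event can share a topic (broker and registry URLs are assumptions):

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.ProducerConfig;
import io.confluent.kafka.serializers.KafkaAvroSerializer;
import io.confluent.kafka.serializers.subject.RecordNameStrategy;

public class NamingStrategyConfig {
    public static Properties producerProps() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        props.put("schema.registry.url", "http://localhost:8081");

        // Default is TopicNameStrategy: subject = "<topic>-value".
        // RecordNameStrategy uses the record's fully qualified name instead,
        // letting different event types (order and update events) share one topic.
        props.put("value.subject.name.strategy", RecordNameStrategy.class.getName());
        return props;
    }
}
```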
Build a Coffee Order Service Using Spring Boot and Schema Registry
- Overview of the App
- Setting Up the Base Project - Gradle
- Setting Up the Base Project - Maven
- Build the DTOs for CoffeeOrderService
- Build the POST Endpoint for the CoffeeOrderService - /coffee_orders
- Build the Service Layer to Map the DTO to AVRO Domain Object
- Configure the Kafka Producer Properties in Coffee Order Service
- Build Kafka Producer to Publish the CoffeeOrder Events
- Build the Coffee Order Consumer
- Build the PUT Endpoint for the CoffeeOrderService - PUT /coffee_orders/{id}
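Pulling the pieces together, a hedged sketch of what the controller and producer side of such a Spring Boot service can look like; the DTO, field names, and topic are assumptions, and the AVRO value serializer plus schema.registry.url would be configured under spring.kafka.producer in application.yml:

```java
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.*;
import com.learnavro.CoffeeOrder;   // generated AVRO class; package is an assumption

@RestController
public class CoffeeOrderController {

    // Hypothetical DTO exposed by the REST layer; the AVRO class stays internal
    public record CoffeeOrderDto(String id, String name) {}

    private final KafkaTemplate<String, CoffeeOrder> kafkaTemplate;   // auto-configured by Spring Boot

    public CoffeeOrderController(KafkaTemplate<String, CoffeeOrder> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping("/coffee_orders")
    public ResponseEntity<CoffeeOrderDto> newOrder(@RequestBody CoffeeOrderDto dto) {
        kafkaTemplate.send("coffee-orders", dto.id(), toAvro(dto));
        return ResponseEntity.status(HttpStatus.CREATED).body(dto);
    }

    @PutMapping("/coffee_orders/{id}")
    public ResponseEntity<CoffeeOrderDto> updateOrder(@PathVariable String id,
                                                      @RequestBody CoffeeOrderDto dto) {
        kafkaTemplate.send("coffee-orders", id, toAvro(dto));
        return ResponseEntity.ok(dto);
    }

    // Service-layer mapping from the DTO to the generated AVRO domain object (field names are assumptions)
    private CoffeeOrder toAvro(CoffeeOrderDto dto) {
        return CoffeeOrder.newBuilder()
                .setId(dto.id())
                .setName(dto.name())
                .build();
    }
}
```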