Description
Introduction
Originally developed at LinkedIn and written in Scala and Java, Apache Kafka is a fast, horizontally scalable, fault-tolerant platform for distributed data streaming. It provides a publish-subscribe mechanism for processing and storing data streams in a fault-tolerant way, and it is used to build real-time data pipelines that stream social, geospatial, or sensor data from a wide range of devices.
Kafka integrates with Spark, Hadoop, Storm, HBase, Flink and many other frameworks for big data analytics.
Using Kafka for real-time data streaming
- To build real-time streaming applications that react to incoming streams and perform real-time data analytics.
- To transform, react, aggregate, and join real-time data flows.
- To perform complex event processing.
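As a sketch of how such a streaming application connects to Kafka, the snippet below builds a minimal producer configuration. It is an illustration only: the broker address localhost:9092 and the helper class name are assumptions for a local single-node set-up, and a real application would pass these Properties to a KafkaProducer from the kafka-clients library.

```java
import java.util.Properties;

// Sketch: the minimal configuration a Java producer needs before
// publishing messages to a Kafka topic. The broker address below is
// an assumption for a local single-node set-up.
public class ProducerConfigSketch {

    static Properties producerProps(String bootstrapServers) {
        Properties props = new Properties();
        // Broker(s) used to bootstrap the initial connection.
        props.put("bootstrap.servers", bootstrapServers);
        // How keys and values are serialized to bytes on the wire.
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // Wait for all in-sync replicas to acknowledge each write.
        props.put("acks", "all");
        return props;
    }

    public static void main(String[] args) {
        Properties props = producerProps("localhost:9092");
        System.out.println(props.getProperty("bootstrap.servers"));
    }
}
```

In a real application these properties would be handed to `new KafkaProducer<String, String>(props)` before sending records.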
The most common uses for Kafka include stream processing, messaging, website activity tracking, log aggregation and operational metrics.
Prerequisites
- Basic Linux commands
- Basic Java (or another programming language)
Course overview
Highlights
- Instructor-led live sessions
- 20+ Lessons
- Real-time streaming apps workshops & projects
- Access to self-paced course contents
- Post-training mentorship and guidance
- Quizzes & assignments
- Once your order is confirmed, the live-meeting details can be found under the “My Orders” section of your account.
What will you learn from this program?
Lessons
- Introduction to data streaming & subsequent processing
- Introduction to Apache Kafka
- Installation & Set-up of Apache Kafka
- Setting up a single-node cluster
- Command line interface (CLI)
- Writing Java producer & consumer
- Publishing messages to topics & reading from them
- Kafka property list
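For the command-line lessons above, a minimal CLI session looks like the sketch below. It assumes a single-node broker is already running on localhost:9092 and that Kafka's bin/ scripts are on the PATH; the topic name demo-events is purely illustrative.

```shell
# Create a topic on the local single-node cluster (assumed broker address).
kafka-topics.sh --create --topic demo-events \
  --bootstrap-server localhost:9092 \
  --partitions 1 --replication-factor 1

# Publish messages: each line typed into the console is sent to the topic.
kafka-console-producer.sh --topic demo-events \
  --bootstrap-server localhost:9092

# Read the messages back from the beginning of the topic.
kafka-console-consumer.sh --topic demo-events --from-beginning \
  --bootstrap-server localhost:9092
```

This command fragment requires a running broker, so it is shown as a how-to transcript rather than a standalone script.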
Know your instructor
Gautam Goswami
- Role: Solution Architect
- Experience: 22 years
- Specialist in: Kafka Streaming, Big Data, Hadoop, Druid