In this course, you will learn to build streaming data analytics solutions using AWS services, including Amazon Kinesis and Amazon Managed Streaming for Apache Kafka (Amazon MSK). Amazon Kinesis is a massively scalable and durable real-time data streaming service. Amazon MSK offers a secure, fully managed, and highly available Apache Kafka service. You will learn how Amazon Kinesis and Amazon MSK integrate with AWS services such as AWS Glue and AWS Lambda. The course covers the streaming data ingestion, stream storage, and stream processing components of the data analytics pipeline, and you will learn to apply security, performance, and cost management best practices to the operation of Amazon Kinesis and Amazon MSK.
Who should attend
This course is intended for:
- Data engineers and architects
- Developers who want to build and manage real-time applications and streaming data analytics solutions
We recommend that attendees of this course have:
- At least one year of data analytics experience, or direct experience building real-time applications or streaming analytics solutions. We suggest the Streaming Data Solutions on AWS whitepaper for those who need a refresher on streaming concepts.
- Completed either Architecting on AWS or Data Analytics Fundamentals
- Completed Building Data Lakes on AWS
In this course, you will learn to:
- Understand the features and benefits of a modern data architecture, and how AWS streaming services fit into it
- Design and implement a streaming data analytics solution
- Identify and apply appropriate techniques, such as compression, sharding, and partitioning, to optimize data storage
- Select and deploy appropriate options to ingest, transform, and store real-time and near real-time data
- Choose the appropriate streams, clusters, topics, scaling approach, and network topology for a particular business use case
- Understand how data storage and processing affect the analysis and visualization mechanisms needed to gain actionable business insights
- Secure streaming data at rest and in transit
- Monitor analytics workloads to identify and remediate problems
- Apply cost management best practices
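To give a flavor of the sharding and partitioning techniques the objectives above mention: Kinesis routes each record to a shard by taking the MD5 hash of its partition key and finding the shard whose hash-key range contains that value, so all records sharing a partition key stay on one shard and keep their ordering. The sketch below is illustrative only (the function name is ours, not part of any AWS SDK) and assumes shards split the 128-bit keyspace into equal-width ranges, the default when a stream is created.

```python
import hashlib

def shard_for_key(partition_key: str, shard_count: int) -> int:
    """Illustrative sketch: map a partition key to a shard index the way
    Kinesis does, by MD5-hashing the key into the 128-bit keyspace and
    locating it within equal-width shard hash-key ranges."""
    keyspace = 2 ** 128
    hash_value = int(hashlib.md5(partition_key.encode("utf-8")).hexdigest(), 16)
    return hash_value * shard_count // keyspace

# Records with the same partition key always land on the same shard,
# which is what preserves per-key ordering.
assert shard_for_key("sensor-42", 4) == shard_for_key("sensor-42", 4)
```

Choosing high-cardinality, evenly distributed partition keys is what spreads load across shards; a single hot key funnels all of its traffic to one shard regardless of how many shards the stream has.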