Databricks Fundamentals
This training course covers the key concepts and methods for developing data processing applications on Databricks with Apache Spark.
To be determined
Databricks Fundamentals is a comprehensive course designed to introduce you to the powerful Databricks platform, a leading cloud-based solution for big data processing and machine learning. This course is ideal for data engineers, data scientists, and analysts who want to leverage Databricks for their data-driven projects.
The course begins with an introduction to Databricks, where you’ll learn how to create a Databricks service and explore its architecture and key components, including Databricks Notebooks. You’ll gain hands-on experience in setting up and configuring Databricks clusters and jobs, understanding the various cluster types, and managing workflows using Databricks Notebooks.
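As a hedged illustration of what configuring a Databricks job looks like, the sketch below builds a job definition of the shape accepted by the Databricks Jobs API (`POST /api/2.1/jobs/create`). The job name, notebook path, node type, and schedule are illustrative assumptions, not course defaults.

```python
import json

# Hypothetical job specification for the Databricks Jobs API (2.1).
# All concrete values below are assumptions for illustration only.
job_spec = {
    "name": "nightly-etl",  # hypothetical job name
    "tasks": [
        {
            "task_key": "ingest",
            # Hypothetical workspace notebook to run as the task
            "notebook_task": {"notebook_path": "/Shared/ingest"},
            # A job cluster: created for the run, terminated afterwards
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
        }
    ],
    # Optional cron-style trigger (runs daily at 02:00 UTC here)
    "schedule": {
        "quartz_cron_expression": "0 0 2 * * ?",
        "timezone_id": "UTC",
    },
}

print(json.dumps(job_spec, indent=2))
```

In practice this payload would be sent to the workspace's REST endpoint with a personal access token, or defined equivalently through the Jobs UI covered in the module.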
You'll then dive into the Databricks File System (DBFS), learning how to store and manage data efficiently within the Databricks environment. The course also covers the integration of Databricks with Apache Spark, where you’ll learn about data formats, transformations, joins, aggregations, and SQL queries within Databricks.
A significant portion of the course is dedicated to Delta Lake, a powerful storage layer that brings reliability and performance to data lakes. You’ll explore the pitfalls of traditional data lakes and learn how Delta Lake overcomes these challenges with features like time travel, updates, deletes, and data ingestion. The course also covers advanced topics such as Delta Lake’s transaction log and converting Parquet files to Delta format.
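As a hedged sketch of the Delta Lake operations named above (the `events` table and path are hypothetical), the SQL cells below show an update, a delete, a time-travel query against the transaction log, and a Parquet-to-Delta conversion; they require a Databricks or Delta Lake runtime to execute.

```sql
-- Illustrative only; `events` is a hypothetical Delta table.
CREATE TABLE events (id BIGINT, status STRING) USING DELTA;

UPDATE events SET status = 'done' WHERE id = 1;   -- transactional in-place update
DELETE FROM events WHERE status = 'stale';        -- transactional delete

-- Time travel: query an earlier snapshot recorded in the transaction log
SELECT * FROM events VERSION AS OF 0;

-- Convert existing Parquet data in place to Delta format (path is hypothetical)
CONVERT TO DELTA parquet.`/mnt/raw/events`;
```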
In addition, you’ll explore visualization techniques in Databricks, collaboration tools that enhance teamwork within the platform, and the deployment of Databricks on both Azure and AWS. This comprehensive course ensures you can manage data securely and optimize workflows in a collaborative environment.
By the end of this course, participants will:
- Understand the Databricks architecture and its key components, including Notebooks and DBFS
- Be able to set up and manage Databricks clusters, jobs, and workflows
- Use Apache Spark within Databricks for data formats, transformations, joins, aggregations, and SQL queries
- Apply Delta Lake features such as time travel, updates, deletes, data ingestion, and the transaction log
- Build visualizations, collaborate within the platform, and deploy Databricks on Azure and AWS
This course offers a balanced mix of theory and practice across a total of 32 hours. Each module is designed to provide you with both the conceptual understanding and hands-on experience needed to effectively use Databricks in your data projects.
Developers, Architects
At least 3 months of development experience in Scala, Java, Python, and SQL.
1. Introduction to Databricks - Theory 60% / Practice 40% - 4h
2. Databricks Cluster and Jobs - Theory 60% / Practice 40% - 4h
3. DBFS - Theory 60% / Practice 40% - 4h
4. Databricks and Spark - Theory 60% / Practice 40% - 4h
5. Delta Lake - Theory 60% / Practice 40% - 4h
6. Visualizations in Databricks - Theory 60% / Practice 40% - 2h
7. Collaboration in Databricks - Theory 60% / Practice 40% - 2h
8. Deploying Databricks on Azure - Theory 60% / Practice 40% - 2h
9. Deploying Databricks on the AWS Marketplace - Theory 60% / Practice 40% - 2h
10. Data Protection Use Cases - 4h
Oleksandr Holota
Big Data and ML Trainer