This introductory workshop caters to data engineers seeking hands-on experience and data architects looking to deepen their knowledge.
The workshop builds a solid understanding of the following fundamental data engineering and streaming concepts:
- Introduction to Lakeflow and the Data Intelligence Platform
- Getting started with Declarative Pipelines, Streaming Tables, and Materialized Views (a minimal code sketch follows this list)
- Working with a realistic data streaming use case
- Mastering Lakeflow Jobs with advanced control flow and triggers
- Generative AI for Data Engineers: Genie and Databricks Assistant
- Understanding data governance and lineage with Unity Catalog
- Benefits of Serverless Compute
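
To make the Declarative Pipelines item concrete, here is a minimal sketch of a pipeline definition in Python, assuming the classic `dlt` module; the source path and the column names `user_id` and `event_type` are hypothetical placeholders for your own data.

```python
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Streaming table: incrementally ingests raw JSON events with Auto Loader")
def raw_events():
    # `spark` is provided by the pipeline runtime; `cloudFiles` is Auto Loader.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/main/default/raw_events/")  # hypothetical source path
    )

@dlt.table(comment="Materialized view: event counts per user and event type")
def events_per_user():
    # Reading the streaming table as a batch input makes this dataset
    # a materialized view that the pipeline keeps up to date.
    return (
        dlt.read("raw_events")
        .groupBy(col("user_id"), col("event_type"))
        .count()
    )
```

When the pipeline runs, the streaming table picks up only new files, while the pipeline engine refreshes the dependent materialized view, which is the pattern the hands-on exercises explore in depth.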
We believe you only become an expert by working on real problems and gaining hands-on experience.
Therefore, this workshop equips you with your own lab environment and guides you through practical exercises such as working with GitHub, ingesting data from various sources, and building batch and streaming data pipelines.
We've also set aside a 10-minute Q&A with our expert team at the end of the workshop to answer any questions about its content.