Data engineering doesn’t have to be a patchwork of tools and handoffs. In this session, we’ll introduce you to Lakeflow, Databricks’ unified solution for building reliable, scalable data pipelines with less friction and more control. Whether you’re just getting started or managing complex workflows, Lakeflow brings together ingestion, transformation, and orchestration into one cohesive experience.
We’ll walk through the key components, including Lakeflow Connect, Declarative Pipelines, Jobs, and the new Lakeflow Designer, a visual interface that makes it even easier to build and manage pipelines with minimal code. You’ll see live demos of no-code ingestion, code-optional transformation, and unified orchestration across your data estate.
We’ll also share what’s coming next, including support for open-source tooling and expanded no-code capabilities. You’ll leave with a clear understanding of how Lakeflow simplifies your stack, boosts productivity, and provides a strong foundation for building high-performance, governed data pipelines at scale.