A major challenge data practitioners face is maintaining data ingestion pipelines—from on-prem database connectivity, to evolving SaaS APIs, to holistic governance. Join the Databricks ingestion team to learn more about LakeFlow Connect, a new set of native ingestion connectors for popular SaaS applications, databases, and file sources. These connectors are auto-optimized, low-maintenance, and integrated with the rest of the Data Intelligence Platform—from governance via Unity Catalog to orchestration via Databricks Workflows. And you can do it all via a no-code UI or an API. In this demo-packed session, you’ll learn what’s new, what’s on the roadmap, and how to take full advantage of these new ingestion capabilities. You’ll also get your questions answered by the R&D leads who built the product.