Join us at the Data + AI World Tour and help shape the future of data and AI
Registration for the event is now closed
Join us at the Beurs van Berlage Amsterdam
Thursday, November 23, 2023 | 8:30 AM-6:30 PM
Explore the latest advancements, hear real-world case studies and discover best practices that deliver data and AI transformation. From the Databricks Lakehouse Platform to open source technologies including LLMs, Apache Spark™, Delta Lake, MLflow and more — World Tour has the info you need to accelerate and enhance your work.
Organizations across nearly every industry are striving to stay ahead by embracing automation, expediting innovation and enhancing operational efficiency. We believe that generative AI plays a significant role in helping organizations achieve these goals.
Join Generation AI at the Data + AI World Tour to learn more about the transformative potential of AI. Hear from top Fortune 500 companies and Databricks customers on how they have transformed their business through the Lakehouse.
From inspiring keynotes to practical demos and insightful sessions, the Data + AI World Tour Amsterdam has something for you.
Thursday 23 November
Breakouts, Lightning Talks and Industry Tracks
Drinks and Networking
Welcome Generation AI
How does real-time data processing help to reduce food waste?
From grass to glass, from data to dairy, from numbers to nutrition
Data Leaders Panel
Databricks: Welcome to Generation AI
Visionary leaders will share insights on:
VP, Northern Europe
Marijse van den Berg
Manager, Field Engineering
Albert Heijn: How does real-time data processing help to reduce food waste?
Albert Heijn is committed to reducing its food waste by 50% by 2030 through various sustainability initiatives. One of these is the dynamic markdown project, in which we dynamically reduce the prices of products as they approach their expiry date. This way we minimize unsold items, thus reducing product waste. Calculating the most effective markdown requires very fresh data. We will explain how we process data in real time with a distributed lakehouse architecture to enable real-time use cases like dynamic markdown.
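To make the dynamic markdown idea concrete, here is a minimal sketch of an expiry-based markdown rule. This is a hypothetical linear policy for illustration only, not Albert Heijn's actual pricing model; the five-day window and 50% cap are assumptions.

```python
from datetime import date

def dynamic_markdown(base_price: float, expiry: date, today: date,
                     max_discount: float = 0.5) -> float:
    """Illustrative markdown: the discount grows linearly as expiry approaches.

    Hypothetical policy: full price at five or more days to expiry,
    up to `max_discount` off on the expiry day itself.
    """
    days_left = (expiry - today).days
    if days_left >= 5:
        return round(base_price, 2)
    discount = max_discount * (5 - max(days_left, 0)) / 5
    return round(base_price * (1 - discount), 2)

# A EUR 2.00 item, two days before expiry: 30% off.
print(dynamic_markdown(2.00, date(2023, 11, 25), date(2023, 11, 23)))  # prints 1.4
```

In a production setting a rule like this would be evaluated continuously against streaming inventory and expiry data, which is why the session stresses data freshness.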
Ton van Rijn
Platform Architect
Mariska van Willigen
Machine Learning Engineer
Data Drives FrieslandCampina's Strategy: From grass to glass, from data to dairy, from numbers to nutrition
Burce Gultekin
Interim CIO
Director Architecture and Engineering
Fireside Chat: With Ahold Delhaize
Global Head of Data & Analytics
Director, Benelux & Nordics
Data Leaders Panel: Generation AI
Join some of the biggest data powerhouses in the industry as we discuss the evolution of data culture resulting from the GenAI movement. The panellists will discuss the implications of generative AI on various aspects of organisations, including workforce dynamics, decision-making processes, and the role of human creativity. They will explore how generative AI can disrupt traditional organisational structures and challenge existing power hierarchies. Additionally, the panel aims to provide insights and share strategies for leading organisations through this transformative period, including the importance of fostering a culture of innovation, up-skilling employees to work alongside AI systems, and establishing ethical guidelines for AI usage.
Head of Data Technologies
Databricks: Introduction to Data Engineering on the Lakehouse
Data engineering is a requirement for any data, analytics or AI workload. With the increased complexity of data pipelines, the need to handle real-time streaming data and the challenges of orchestrating reliable pipelines, data engineers require the best tools to help them achieve their goals. The Databricks Lakehouse Platform offers a unified platform to ingest, transform and orchestrate data and simplifies the task of building reliable ETL pipelines. This session will provide an introductory overview of the end-to-end data engineering capabilities of the platform, including Delta Live Tables and Databricks Workflows. We’ll see how these capabilities come together to provide a complete data engineering solution, and how they are used in the real world by organizations leveraging the lakehouse to turn raw data into insights.
Can Köklü
Solutions Architect
Bart Van Der Vurst
Databricks: What's your IT & Data worth? How to communicate Business Value to secure investment
63% of CDOs struggle to effectively communicate the business value their IT and data deliver to the business and, as a result, miss out on up to 60% higher funding. Communicating IT's business impact is critical to securing the success of your future roadmap and delivering innovation. In this session we’ll explore how IT delivers value, as well as frameworks for measuring impact and best practices for communicating value.
Sr Manager, Business Value Consulting
Databricks: More Responsible Generative AI
Generative AI promises to deliver tremendous benefits. But it also comes with equally tremendous risks.
This session sets out goals, principles, and some starter techniques to keep generative AI more responsible. Help your organization maximize value and minimize problems as you embrace an AI-driven future.
Lead Data & AI Strategist
Microsoft: Modern Analytics with Azure Databricks and Power BI
Enterprises trying to become agile by accelerating data-driven decision-making often get slowed down by data-estate and toolset fragmentation. The lakehouse provides an architectural pattern that dramatically reduces time to value by enabling rapid capability build-out and helping democratize analytics across all levels. Join this demo-rich session to discover how the Microsoft cloud provides the best destination for building a lakehouse, and analytics on top of it, powered by Azure Databricks and Power BI.
Data & AI Specialist
Cloud Solution Architect
Databricks: Databricks SQL: Why the Best Serverless Data Warehouse is a Lakehouse
Many organizations rely on complex cloud data architectures that create silos between applications, users and data. This fragmentation makes it difficult to access accurate, up-to-date information for analytics, often resulting in the use of outdated data. Enter the lakehouse, a modern data architecture that unifies data, AI, and analytics in a single location. This session explores why the lakehouse is the best data warehouse, featuring success stories, use cases and best practices from industry experts. You'll discover how to unify and govern business-critical data at scale to build a curated data lake for data warehousing, SQL and BI. Additionally, you'll learn how Databricks SQL can help lower costs and get started in seconds with on-demand, elastic SQL serverless warehouses, and how to empower analytics engineers and analysts to quickly find and share new insights using their preferred BI and SQL tools such as Fivetran, dbt, Tableau, or Power BI.
Franco Patano
Lead Product Specialist
Databricks: What’s New in Unity Catalog - With Live Demos
Join the Unity Catalog product team and dive into the cutting-edge world of data, analytics and AI governance. With Unity Catalog’s unified governance solution for data, analytics and AI on any cloud, you’ll discover the latest and greatest enhancements we’re shipping, including fine-grained governance with row/column filtering, new enhancements to automated data lineage, and governance for ML assets. In this demo-packed session, you’ll learn how new capabilities in Unity Catalog can further simplify your data governance and accelerate your analytics and AI initiatives. Plus, get an exclusive sneak peek at our upcoming roadmap. And don’t forget, you’ll have the chance to ask the product teams themselves any burning questions you have about the best governance solution for the lakehouse. Don’t miss out on this exciting opportunity to level up your data game with Unity Catalog.
Dalya Al-taha
Solutions Architect
Zen van Gaever
Albert Heijn: Albert Heijn's Journey in Continuous Data Processing
Join us for a presentation on the journey in Continuous Data Processing of the Netherlands' leading food retailer. Discover how a microbatching data architecture fuels Albert Heijn's data science success, improving customer satisfaction, operational efficiency, and supporting waste reduction.
One of our key enablers is the meta-driven Delta Lake, a pivotal component of our data infrastructure that empowers us to effectively manage and process data from our vast range of sources.
In this presentation, we will walk you through our journey, sharing our insights, challenges, and the success story of dynamic markdown in the realm of continuous data processing.
Davey van Hennik
Product Architect
Databricks: Deep Dive into the Latest Lakehouse AI Capabilities
With its unified data and ML platform, Databricks takes a unique approach to machine learning. By breaking down the silos between the data stack, ML stack and DevOps stack, Databricks offers a simplified, faster, and better-governed way to do ML, including integrated feature engineering and governance tooling, end-to-end tracking and lineage of models and data, automatic monitoring, and root cause analysis. All these capabilities, which typically require six different tools and vendors, are unified on one platform: Databricks. This saves significant time and resources while accelerating innovation. In this session you will learn about:
- Built-in governance, lineage and monitoring across models, features and data
- Unified feature engineering
- Monitoring across your data and ML assets, with automatic reporting and root cause analysis
- Deep integration with the data in the lakehouse to accelerate development
Sergio Ballesteros Solanas
Senior Solutions Architect
Databricks: Earth Observation and Geospatial Data Processing with the Databricks Lakehouse
This session covers all the fundamental aspects of processing Geospatial and Earth Observation Data using the Databricks Lakehouse. Throughout our session, we will present characteristic geospatial use cases, followed by an interactive demo showcasing the integration of Satellite Imagery with Artificial Intelligence to address an Earth Observation Image segmentation problem.
Konstantina Fotiadou
Solutions Architect
Avanade: Road to adopting Unity Catalog
During this presentation, we will walk you through the journey from an idea to a Unity Catalog implementation. You will learn about possible paths towards using Unity Catalog, and considerations when deciding where Unity Catalog sits in the organization. Along the way you will face challenges; we will discuss them and how they can be mitigated.
Senior Data Engineer
Databricks: Delta Live Tables A to Z: Best Practices for Modern Data Pipelines
Join Databricks for a technical deep dive into how Delta Live Tables (DLT) reduces the complexity of data transformation and ETL.
Learn what’s new, what’s coming, and how to easily master the ins and outs of DLT.
Kivanc Urgancioglu
Solutions Architect
Databricks: Introduction to Data Streaming on the Lakehouse
Streaming is the future of all data pipelines and applications. It enables businesses to make data-driven decisions sooner and react faster, develop data-driven applications considered previously impossible, and deliver new and differentiated experiences to customers. However, many organizations have not realized the full promise of streaming because it requires them to completely redevelop their data pipelines and applications on new, complex, proprietary, and disjointed technology stacks. The Databricks Lakehouse Platform is a simple, unified, and open platform that supports all streaming workloads, ranging from ingestion and ETL to event processing, event-driven applications, and ML inference. In this session, we will discuss the streaming capabilities of the Databricks Lakehouse Platform and demonstrate how easy it is to build end-to-end, scalable streaming pipelines and applications, to fulfill the promise of streaming for your business.
Stefan van Wouw
Sr. Manager, Specialist Solutions Architects
Databricks: Observability of your Data Platform (Cost and Performance Management)
Improve your data-driven spending decisions and get maximum business value out of your data platform.
Manage your costs and keep them under control. The focus of this session is granular cost and performance reporting for your Databricks data platform.
Heineken: Scaling data platforms at HEINEKEN
At HEINEKEN we operate globally and each of our markets provides its own challenging analytics use cases. To unlock data from all these markets and have a common analytics toolset for any use case, a highly standardized blueprint and global governance framework is deployed. This allowed us to scale globally, greatly reducing time to market of new analytics use cases and becoming the Best Connected Brewer.
Jelle van Etten
Databricks: Maximizing Value From Your Data with Lakehouse AI
Ready to get started with lakehouses, especially around DS/ML? This session is for you. The journey from training models to taking them to production can be quite challenging. Teams face a large variety of issues: siloed data, inconsistent tools, complex infrastructure, and a lack of visibility into model performance. Learn how the Databricks Lakehouse Platform provides a unified, data-centric ML environment that accelerates and simplifies the machine learning lifecycle, with a standardized set of tools, frameworks and governance used across your lakehouse data. In this session, you’ll learn about lakehouses, and how to:
- Ingest, prepare and process data on a platform designed to handle production-scale ML training, including LLMs
- Leverage data science notebooks and MLflow to manually train and track your ML experiments, or let AutoML do the experimentation for you
- Use real-time serving with fast autoscaling to save cost and maintain SLAs
- Monitor your deployed models for drift and accuracy
- Manage and govern all data and ML assets with Unity Catalog
Databricks: Simplifying Lakehouse Observability: Databricks Key Design Goals and Strategies
In this session, we'll explore Databricks' vision for simplifying lakehouse observability, a critical component of any successful data, analytics and machine learning initiative. By directly integrating observability solutions within the lakehouse, Databricks aims to provide users with the tools and insights needed to run a successful business on top of the lakehouse. Our approach is designed to leverage existing expertise and simplify the process of monitoring and optimizing data and ML workflows, enabling teams to deliver sustainable and scalable data and AI applications. Join us to learn more about our key design goals and how Databricks is streamlining lakehouse observability to support the next generation of data-driven applications.
Senior Specialist Solutions Architect
Nederlandse Loterij: A Data-Driven Commitment: Innovating for Responsible Gaming
Head of Data
Databricks: 5 Guiding Principles to Choose the right Data & AI Technology
Thanks to ChatGPT's publicity and LLMs' ever-increasing impact, data is a principal asset for companies and AI is a paramount technical and business capability. But an old problem persists in a new context: how do stakeholders make the right strategic decision about which technology to invest in? This session presents five guiding principles to help stakeholders assess technology criteria, with an example of how these can accelerate your AI journey.
Data & AI Strategist
AWS: Building Your Lakehouse on AWS to Expand Your Business Capabilities
In this session, AWS will share the collaborative solutions offered by Databricks and AWS, enabling customers to overcome data challenges, implement a robust Lakehouse architecture, and harness the distinct advantages of AWS services for data and analytics. Whether you are a data engineer, data scientist, or analytics professional, this session aims to equip you with the knowledge and insights needed to drive innovation and efficiency in your data initiatives.
Jagdeep Singh Soni
Partner Sales Solutions Architect
Celebal: SAP Modernization: The AI Journey with Databricks
Prepare to unlock the full potential of your SAP investments and propel your business into Industry 4.0. Explore essential strategies to leverage Generative AI and integrate advanced analytics with Databricks, yielding actionable insights from both SAP and non-SAP data. Boost your growth by tapping into the transformative power of generative AI for workflow automation and decision enhancement, supported by our accelerators for Industry Lakehouses.
Co-Founder and President
Databricks: What is the EU AI Act, and how does Databricks help with compliance?
The European AI Act is expected to pass in early 2024.
What does it mean to your organization?
How do your investments in Databricks today help you with compliance when the act goes into effect in 2026?
Victor van den Broek
Databricks: Next Gen CX
In today’s data-driven world, teams need to better understand their audience so they can more effectively find and activate them online. If you’re stitching together a bunch of platforms, you might find that fragmented views, legacy technologies and the shift from descriptive to predictive insights are holding your team back. With the Databricks unified data and AI platform, we give you a single copy of data with unified governance and an ever-expanding ecosystem of partners to build the next generation of your customer experience.
Mapiq: Creating a workplace experience platform that scales - Lessons learned from Mapiq’s transition to Databricks
On a daily basis, Mapiq's workplace experience platform combines millions of data points from workplaces all over the globe, to provide workplace leaders with purposeful insights on their workplaces and real estate. In this session, Mapiq's CTO and Tech Lead Data and AI will share their experiences from Mapiq's transition to the Databricks platform, and explain how they built a scalable product on top of Databricks.
Co-founder and CTO
Tech Lead Data and AI
Neo4j: The Art of the Possible - Databricks and Graph Technology at Rabobank
Are you drowning in data but lacking in insight? 80% of business leaders say data is critical in decision-making, yet 41% cite a lack of understanding of data because it is too complex or not accessible enough. You’ll learn how companies are using graph technology to leverage the relationships in their connected data to reveal new ways of solving their most pressing business problems and creating new business value for their enterprises. During this talk, Rabobank will explain how Databricks and Neo4j go hand in hand to create improved insights from their connected data.
Sales Director Benelux
Capgemini: How to optimize a milk factory using a Digital Twin
Bathing in massive amounts of milk and data, our factories are challenged to optimize across all the levers you can turn. Using Azure Databricks we created a digital twin of the factory to gain further material-flow insights. This gives technologists improved control of their production-line performance and enables the first steps towards a "lights-out factory".
Product Owner & Data Product Analyst
Astellas: How Databricks is Helping Astellas to Build the Future of Healthcare
Director Technology Business Information
LTIMindtree: Echo Delta Charlie, Data and AI Strategy for Amplifying Business Insights
Nandakumar Krishnamurthy
Senior Principal Databricks Champion - Europe
Manas Ranjan Nayak
Business Head – Data and Analytics, UK & Europe
Global GTM Leader - Databricks
Kenvue: Cutting AI Product costs and run time by up to 90%
Data Science & AI Products Director
RTL: Building a local streaming hero with Data & AI
Head of Data & AI
Financial Services Track
Financial Services and Insurance Networking Lunch
Financial Risk Avoidance with Bridgefund & Rabobank
Advantage Lakehouse: CZ & VGZ
Data Leaders Panel: The Future of Financial Services and Insurance
Rabobank: Building a team from scratch that implements and maintains a Transaction Monitoring solution processing billions of records every day
For Rabobank, transaction monitoring is a vital part of our responsibilities as a bank. Three years ago we started a project to improve our monitoring solution. Today, almost 100 colleagues are working together on a transaction monitoring solution developed on Databricks. With an ever-expanding scope, we have a continuous drive and need to improve our way of working. I'll talk you through how we started with a small group of people and were able to grow to our current size in such a short time, with billions of records passing through our solution every day.
Tom Piening
Data Scientist
Bridgefund: How Bridgefund builds AI-driven lending and credit risk management solutions on the Lakehouse
In 2018, together with a diverse group of passionate entrepreneurs, we launched the FinTech startup BridgeFund. BridgeFund uses its own technology and smart credit models to evaluate the financial health of businesses. With this, we are able to efficiently offer capital to a market that is not well served by banks.
We have automated the entire process. As a result, entrepreneurs can secure a loan more quickly and efficiently than with traditional lenders. The outcome: a perfect digital experience that truly makes entrepreneurs happy.
Julian van de Steeg
CEO
CZ: Data Lakehouse at CZ Health Insurances: a transition towards a federated Data & Analytics model
As a leading health insurer in the Netherlands, CZ holds a fundamental belief that every insured individual in need of medical care should have access to it, all while maintaining an affordable premium. Yet today's healthcare landscape faces challenges around diminishing accessibility of healthcare services and escalating costs of medical care. Data, in all its facets, has become an indispensable asset in addressing the access and affordability concerns that continue to shape the healthcare sector.
Dave van den Hurck
Senior Product Owner, Data
In order to harness the power of data as a key driving force, CZ decided to develop a new, state-of-the-art cloud data platform. This Data Platform is based on the Lakehouse concept and enables the main Data Mesh principles around federated development and ownership of data products. In this presentation, we will shed light on the motivation for our Data Platform, the key technical components the Data Platform consists of and the journey and findings we had along the way.
VGZ: The journey within VGZ to the value-based data platform
Data insights are crucial for effective business operations in every organization. Data volumes are growing exponentially, users want faster access to data, and the way they gather insights from it continuously evolves. This leaves us with new challenges! How have we, with the help of Databricks and Unity Catalog, responded to this changing need and accelerated time-to-market along the way?
Technology Officer Data Products
Chief Analytics Officer
Director, Platform Services
Chief Data Officer
Manufacturing and Energy Track
Accelerating Manufacturing on the Data Intelligence Platform
IoT on the Lakehouse for Manufacturing
Shell: Data Democratization
Xebia: Applying Databricks for your Datamesh
Data Leaders Panel: The Future of Supply Chain
Databricks: Accelerating Manufacturing on the Data Intelligence Platform
Global Technical Director, Manufacturing
Databricks: IoT on the Lakehouse for Manufacturing
Within the manufacturing industry, substantial volumes of IoT data are consistently generated yet remain isolated and underutilized. At the same time, manufacturers grapple with the challenge of integrating advanced technologies such as AI and predictive analytics. To re-imagine their business and enable transformation, they require a platform capable of accommodating diverse data types and delivering insights at the speed of data.
In this session we will address how the streaming of IoT data is a key function of Databricks's Lakehouse for Manufacturing. We will also share some of our recent partnerships that have enabled access to the diverse data sources.
Shell: Data Democratization
Discover the transformative power of Shell's Next-Gen Data Platform in shifting business intelligence into intelligent business. Seamlessly integrating traditional BI with cutting-edge tech, Shell redefines its data journey. Unveil the evolution of data, its strategic potential, and the fostering of an engineering culture. Join us to navigate Shell's data-driven revolution at its core.
Xebia: Applying Databricks for your Datamesh
During this session, we will discuss how Xebia utilized Databricks to implement a Datamesh architecture. We will explore how this architecture can effectively scale within complex organizations, allowing decentralized self-service teams to develop data transformations and models while also facilitating collaboration.
CTO, Xebia Base
CTO, Xebia Data
Head of Data Platforms
Joëlle van der Bijl
Interim Director Global Data & Analytics
Start-Ups and Digital Natives Track
At the first-ever Startups and Digital Natives forum in Amsterdam, we will run a highly interactive set of sessions designed for startup and digital-native businesses, including a guided demo on how to build your own LLM.
This track is invitation-only and has limited space; please reach out to the Databricks team if you are interested in joining.
From Raw Data to Data Product in Under 30 Minutes
Empowering Renewable Electricity Grids with Flexibility Using Data & AI
Deploy your LLM Bot - Part 1
Deploy your LLM Bot - Part 2
Databricks: From Raw Data to Data Product in Under 30 Minutes
In this session our Databricks experts will run a guided demo to show how you can go from raw data to data product in just a few steps, all without writing a single line of code.
Panos Athanasiou
Sr Manager, Field Engineering
Principal Specialist Solutions Architect
Sympower: Empowering Renewable Electricity Grids with Flexibility Using Data & AI
Sympower accelerates the global net-zero transition by balancing electricity supply and demand across European networks. Using IoT assets, they control flexible electricity consumption, countering the grid imbalances whose resolution is crucial for renewable energy growth. Data and AI challenges include predicting controllable electricity and integrating diverse data. Ellissa explains how Sympower's data team delivers AI forecasting and analytics for bidding in flexibility markets, discussing the setup and its challenges.
Data Team Lead
Databricks: Deploy your LLM Bot - Part 1
Delve into the world of Generative AI, exploring tokenization, word embedding, LLMs, and their applications.
Databricks: Deploy your LLM Bot - Part 2
In Part 2 - we'll explore Retrieval Augmented Generation, where you can learn how to build a Q&A system using technologies like Hugging Face and Vector Databases.
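The core idea of Retrieval Augmented Generation is to retrieve the most relevant documents for a question and place them in the LLM's prompt as context. As a minimal sketch of that flow, the example below substitutes simple word overlap for the dense embeddings and vector database a real system (like the one built in this session with Hugging Face models) would use; the function and document names are illustrative.

```python
def tokenize(text: str) -> set[str]:
    # Lowercase words with surrounding punctuation stripped.
    return {w.strip(".,?!").lower() for w in text.split()}

def retrieve(question: str, documents: list[str], k: int = 1) -> list[str]:
    """Rank documents by word overlap with the question.

    A production RAG system would embed both sides with a model and
    query a vector database by cosine similarity instead.
    """
    q = tokenize(question)
    ranked = sorted(documents, key=lambda d: len(q & tokenize(d)), reverse=True)
    return ranked[:k]

def answer_prompt(question: str, documents: list[str]) -> str:
    """Assemble the augmented prompt an LLM would receive."""
    context = "\n".join(retrieve(question, documents))
    return f"Context:\n{context}\n\nQuestion: {question}"

docs = [
    "Delta Lake is an open table format for the lakehouse.",
    "MLflow tracks machine learning experiments and models.",
]
print(retrieve("Which tool tracks machine learning experiments?", docs))
# → ['MLflow tracks machine learning experiments and models.']
```

Swapping the overlap score for real embedding similarity leaves the retrieve-then-prompt structure unchanged, which is why this pattern scales from toy examples to production Q&A systems.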
Burce Gultekin
Interim CIO
© Databricks. All rights reserved. Apache, Apache Spark, Spark and the Spark logo are trademarks of the Apache Software Foundation.