The speakers for World of Data 2026 will be announced here gradually.
Discover our inspiring line-up:




Tim is an experienced data engineer with many years in the field, working across various industries and technologies. He is skilled in leading teams, building data solutions, and driving insights to support business goals. He is also passionate about streamlining and automating team processes, DataOps, and DevOps.

Germany’s “Energiewende” represents an ambitious and necessary transformation. This shift brings both significant technological innovation and digital challenges. 50Hertz is committed to leading the integration of renewable energy and is therefore developing a new control system. Data is essential to achieving this vision and enabling a sustainable, future-proof power grid.




Jochen has been with Atruvia AG for several years, where his responsibilities include implementing the data strategy and further developing analytical platforms. Prior to that, he gained diverse professional experience as a full-stack software developer, data engineer, DevOps engineer, and IT architect.

The topic of digital sovereignty has gained enormously in importance across Europe in recent years. In light of recent history, geopolitical risks, dependencies arising from international entanglements, global crises, or even wars have long ceased to be abstract threat scenarios. This talk describes possible approaches for companies that are looking for strategies to reduce dependencies, strengthen control over their data, and design resilient data architectures, and that want to make targeted technology decisions on that basis.




Arndt specializes in advanced analytics, business intelligence, and cloud initiatives. With expertise in building modern data stacks, he leverages technologies like SAP S/4HANA, BigQuery, and Google Cloud Platform for data-driven solutions, including statistical forecasting and ML applications.

What happens when you combine a massive Google Cloud transformation with a simultaneous S/4 HANA migration? You get a masterclass in complexity—and a lot of lessons learned the hard way. We’re pulling back the curtain on our journey, moving beyond the slides to discuss the pitfalls of dual transformations and the "scars" we earned along the way. We focus specifically on the technical integration and strategic data usage that bridge these two worlds. From the failures that stalled us to the high-impact use cases that saved us, this is a story of heartbreak, hard work, and—eventually—a data-driven transformation that actually works.




Bilal is a renowned keynote speaker, entrepreneur, and AI expert. He studied business administration in Düsseldorf, travels regularly to Silicon Valley, and completed AI courses in the United States at the elite universities Harvard and Stanford. He is one of the most sought‑after speakers in Germany and across Europe and is regularly booked by international companies. Bilal is also known from various TV formats. His first book, “Der klügste Freund, den wir je hatten”, appeared in bookstores in September.

„Jahrhundert-Chance: KI“ (AI: a once-in-a-century opportunity) is an interactive keynote offering a compact overview of the current state of artificial intelligence worldwide, with fresh insights from Silicon Valley, China, and Europe. Using concrete examples, it highlights possible applications of AI in companies and in everyday life, as well as its social impact. The keynote also shows how AI is changing medicine and healthcare and offers personal insights into the underlying technologies: understandable, inspiring, and practical.




Michael is responsible for cloud program management at Controlware in the Business Development division. He shapes and manages partnerships with AWS, HashiCorp (an IBM company), OVHcloud, and Google Cloud (GCP). He has more than 25 years of international ICT experience across the IT, financial, automotive, manufacturing, and aviation industries.





Andrea is a Marketing and Sales Solutions Consultant at Cornelsen, specializing in requirements analysis and system implementation. She has many years of experience in marketing services, planning processes, and process consulting and implementation. In addition, she has experience in project management and in process optimization in line with system support.

This talk shows how the publisher Cornelsen established MARMIND as the central control instrument for its marketing activities and integrated it seamlessly with SAP. The focus is on the redesign of planning and approval processes, a unified data basis, and the design of the interfaces between the two systems. The project journey, from initial requirements to live operation, delivers insights, success factors, and lessons learned for similar integration projects.




Carsten Schweiger has more than 20 years of professional experience, including 15 years in consulting with a focus on Business Intelligence and Data Warehousing. After completing his degree in Business Information Systems at HP, he gained his first Business Intelligence experience in controlling and led numerous data warehouse projects, including in roles such as Requirements Engineer, Architect, and Database and ETL Developer. His professional focus lies in automation, API development, operations and support, as well as building sustainable and scalable data architectures. Over the past ten years, he has supported more than 25 teams through transformation processes as a Scrum Master and Agile Coach. Since 2024, he has been working as Data & AI Enabler at 2150 Datavault Builder AG. In this role, he is responsible for GenAI-based knowledge management, the integration of modern AI architectures including MCP servers, and the online training and certification program.

Data modeling often depends on experience, contextual knowledge, and many manual steps. AI can support this process. In this talk, I will show an end-to-end live demo of how AI agents support the path from business requirement to data query across the DWH process. You will see specifically: 1) how a Data Vault model is created directly from a business description and existing source data models; 2) how the model is implemented on the database in a compliant, consistent, and traceable way via an MCP server; 3) how data can be queried using natural language, without SQL skills. The Model Context Protocol (MCP) connects AI agents directly with modeling tools and databases. It turns a chatbot into a real working assistant. One thing upfront: AI takes over the craftsmanship. Architectural decisions remain, for now, with humans. See a live demo based on Datavault Builder and database MCP servers.




As Chief Solution Architect, Johannes has been designing future-proof solutions for sovereign data infrastructures, data platforms, and data-driven decision support at disy for over 12 years. His focus is on combining domain expertise, modern data analysis, and location intelligence with disy Cadenza to create a powerful, sovereign analytics platform that empowers organizations to take sustainable action and make informed decisions.









Sönke has been leading data and analytics organizations for more than 15 years across a range of industries (including Telefónica, Handelsblatt, XING, Fitness First, and HRS). His areas of focus include developing digitalization and data strategies and putting them into practice through cloud-based analytics platforms and ML/AI solutions. He regularly presents his innovative work at national and international conferences.





Yannick leads the AI & Data team at the German Press Agency (dpa), Germany’s largest news agency. After earning his Master’s degree in Management & Data Science, Yannick gained several years of experience as a data scientist and team lead in IT consulting. Since early 2025, his team has been developing innovative AI solutions for the media and information ecosystem.

AI systems are only as good as the information on which they base their decisions. With dpa-iq, we are building a retrieval infrastructure on AWS and Databricks that delivers verified information and content in real time to LLM-based agents, serving internal product teams and external customers alike. A look behind the scenes: the challenges, architecture, and implementation of a platform that serves as the grounding backbone for agentic workflows and products.




Alex Merced is Head of DevRel at Dremio with experience as a developer and instructor. His professional journey includes roles at GenEd Systems, Crossfield Digital, CampusGuard, and General Assembly. He has co-authored "Apache Iceberg: The Definitive Guide" and "Apache Polaris: The Definitive Guide", published by O'Reilly, as well as "Architecting an Apache Iceberg Lakehouse" with Manning. Alex has spoken globally at notable events such as SQLBits, DataEngBytes, Data Day Texas, Data Council, and more. He is passionate about technology, sharing his expertise through blogs, videos, and his podcast "Alex Merced's Tech Podcast", and has contributed to the JavaScript, Rust, and Python communities with libraries like SencilloDB and CoquitoJS. He is also the creator of the Pangolin open source lakehouse catalog.

Teams invested in lakehouses, pipelines, and dashboards. Yet trust in data remains low. Analysts still duplicate logic. AI systems struggle to act on incomplete context. The root cause is simple: semantics never kept up. This session breaks down how fragmented definitions and siloed logic limit both analytics and AI. It then introduces a unified semantic layer that spans all data sources without duplication. Built on open architecture, this layer brings consistent meaning to every query, model, and application. When semantics are shared, data stops being a bottleneck and becomes a reliable foundation for action. Takeaways: 1) Why semantic drift creates hidden data debt. 2) How poor context weakens AI outcomes. 3) Where semantics should live in a modern architecture. 4) How to create reusable, governed business logic at scale.




Claudia has more than 20 years of professional experience in the fields of data science, artificial intelligence, and innovation management. As Chief Data Officer at FIEGE Logistics, she is responsible for the company’s data and AI topics. Until summer 2024, she led the Data Intelligence Center at Deutsche Bahn and was responsible for the group-wide data and AI strategy. Prior to that, she headed the machine learning team at T-Labs, the research unit of Deutsche Telekom. In addition to her corporate roles, Claudia is an angel investor. She is a speaker and author and is actively committed to advancing education for children and young people in the areas of data and AI. Claudia has received numerous awards, including “Global Women Leader in AI” in 2019, “40 over 40 – Germany’s Most Inspiring Women 2022,” recognition as one of the “30 Outstanding Women in Data – 2023 Global,” and most recently the ILD Leadership Award 2025 in the category “Inspiring Leader.”

AI is racing ahead of us all, yet we are still tinkering with our data. Claudia Pohlink provides answers from the engine room: the place where data quality, governance, and operational reality determine whether AI projects scale or fail. She shows how a traditional logistics company makes its way from data strategy to measurable AI value creation, with all the detours, setbacks, and genuine successes along the way. No gloss, no buzzword bingo. Just plain talk and the anti-hype.





Sascha has more than 20 years of professional experience in the fields of data management, data governance, analytics, and digital transformation. He currently serves as Head of Data & AI Foundations at FIEGE Logistics. In this role, he is responsible for the central data platform, data management, data governance, data quality, and self-service analytics, among other topics. His focus lies on scalable, economically viable data architectures and the sustainable establishment of data literacy across the organization. Previously, Sascha worked as Data Officer for the Group Executive Board at Deutsche Bahn and served as Chief Data Officer at Berlin Hyp AG. His key priorities include the practical implementation of data and AI strategies, optimizing the cost-benefit ratio of data investments, and the successful development of high-performing data organizations.






Simon is an experienced transformation leader specializing in data strategy, digital initiatives, and process optimization. As Global Head of Data Excellence at Giesecke+Devrient, he is responsible for the company’s global data governance and oversees multiple transformation programs. Previously, he held senior roles at UniCredit and PwC, where he focused on strategy development, target operating models, and the digital transformation of banking.





Thomas is Head of Data-Driven Applications & Services at GRENKE in Baden-Baden. GRENKE is a financial solution provider, offering services in more than 30 countries around the world. His department aims to deliver business value through data and data-related services and applications. He has spent the last 20 years with data, databases, IT services, and related infrastructure. Before that, he was a paramedic with the German Red Cross.

Organizations sit on mountains of data, yet without a strategy, they risk sinking into a swamp of disorganized information, where potential insights are lost. Many companies treat data as a commodity rather than a strategic driver, leaving untapped potential on the table. This session explores how organizations can move beyond reactive data management and establish a forward-looking strategy that aligns with business goals. Whether you're a DBA, data scientist, or manager, the first step is asking why your data matters, what value it brings, and how it can empower decision-making. We will identify why data initiatives fail and data strategies fall short. You'll see how defining a clear vision anchors your strategy. We'll discuss how to build a culture that embraces data as a strategic asset, not just a technical tool. Crucially, we will explore the human side of data because data is only as valuable as the impact it creates for the people who use it. Discover why your data strategy might be doomed before it starts and what you can do about it before you're knee-deep in the swamp.




Tamara is a data analyst in the e‑mobility field, specializing in the digitalization of production lines. She focuses on making machine and product data available to users in real time to enable data‑driven process optimization and sustainably increase transparency during line commissioning. Her work builds on experience in coordinating projects related to the maturity development of electric drives.

The talk describes the data-architecture transformation of a machinery manufacturer that, together with b.telligent, implemented Microsoft Fabric to replace SAP BW. What began as an SAP-focused project grew into a multi-source setup with REST APIs, Kafka event streams, SQL databases, and more, built as a cross-domain hub-and-spoke system and aligned toward a single point of truth. The architecture connects the SAP and lakehouse worlds with high-frequency engineering data streams. In engineering in particular, the focus is on self-service via real-time dashboards with KQL querysets, to sustainably empower the business departments.





Thomas is a controller at GROB‑Werke with a focus on reporting and project analysis. In his role as project manager, he is also responsible for the BI strategy of the GROB Group. He brings many years of experience in controlling, project management, and management reporting, and has in‑depth, cross‑module SAP expertise as well as extensive experience in advancing data‑driven management and control processes.

The talk describes the data-architecture transformation of a machinery manufacturer that, together with b.telligent, implemented Microsoft Fabric to replace SAP BW. What began as an SAP-focused project grew into a multi-source setup with REST APIs, Kafka event streams, SQL databases, and more, built as a cross-domain hub-and-spoke system and aligned toward a single point of truth. The architecture connects the SAP and lakehouse worlds with high-frequency engineering data streams. In engineering in particular, the focus is on self-service via real-time dashboards with KQL querysets, to sustainably empower the business departments.





As a mountaineer, Alexander Huber has pushed himself to his limits—physically and mentally—in the steep world of the mountains. What defines a successful climber is not just physical strength; on the contrary, the true pioneers have always been those with the greatest visionary power. It is more than clear that ideas and visions can only be realized successfully when smart strategy and careful planning form the foundation—only then can existing potential be fully tapped. The motivation required for this always springs from a fundamental joy in creating: passion as the source of strength. The well‑attuned team of Alexander and Thomas began to form in their childhood. Together they sought out challenges and, as roped partners, each still had to take personal responsibility. Their passion for the mountains serves as a metaphor for life and opens up perspectives from strikingly powerful vantage points. After all, it is not the mountain we conquer, but always ourselves.

More info coming soon.




As Senior Director, Marco oversees the Digital & Innovation Hub at Infineon, where he has been driving the company’s digital transformation and innovation initiatives since 2017. His previous roles include senior positions in partnership, product, and project management at ADAC Financial Services and ADAC Car Rental, as well as consulting experience at EY.





Daniel has been working in the IT industry for more than 20 years, consistently focusing on partner management and business development. For the past 5 years, he has been driving the expansion of the partner ecosystem at IONOS, overseeing joint project business and developing partner-managed PaaS and SaaS services as a strategic extension of the IONOS product portfolio.









Kai has been a Solution Architect for Data & Analytics at Karl Storz since mid‑2025. A holistic, outcome‑driven architect, he focuses on concepts, strategy, and governance, anchored in a strong technical core. He bridges AI, BI, and business needs to deliver real value. Earlier roles spanned customer‑facing consulting and architecture in data‑driven businesses.

Self-service analytics promises speed and freedom, yet with growing success it quickly leads to a loss of overview. At Karl Storz, too, a multitude of tools and an ever-growing number of dashboards mean that orientation and trust suffer. We present our current approach: the step from individual dashboards to clearly defined, certified analytics products.





Livia is an IT architect specializing in modern data platforms and scalable analytics architectures. She is currently contributing to the development of a global analytics platform, shaping architectural principles and governance standards. Previously, she worked as a Data Engineer with a focus on requirements engineering and close collaboration with business stakeholders. Her work included integrating and modeling SAP and CRM data for analytical data platforms.

Self-service analytics promises speed and freedom, yet with growing success it quickly leads to a loss of overview. At Karl Storz, too, a multitude of tools and an ever-growing number of dashboards mean that orientation and trust suffer. We present our current approach: the step from individual dashboards to clearly defined, certified analytics products.





Jonathan directs MarTech architectural design and technical oversight for Mytheresa, Net-A-Porter, and Mr Porter. He leads the department for tracking and data integrity while bridging marketing goals with technical strategy. His background includes CRM technology, performance marketing, and tracking. He is responsible for the high-level governance and strategic alignment of the MarTech stack.

Achieving a precise understanding of marketing impact often means moving beyond the limitations of standardized platforms. In this session, I will share how we designed and deployed a full-scale internal attribution framework at Mytheresa. I’ll walk through our approach to connecting scattered data sources, the engineering behind our unified tracking layer, and why we prioritized full ownership of our data and processes. We’ll look at the technical architecture and the practical steps we took to build a centralized system for a complete view of the customer journey.




Caroline is Chief Data Officer at Münchener Hypothekenbank eG. After completing her training as a banking professional and studying economics in Munich and Madrid, she began her career in credit risk management. She subsequently spent several years in banking consulting and risk controlling, where she established a central enterprise data platform and led the implementation of BCBS 239. Since January 2024, she has been responsible for business data management at MHB.

Many projects to improve data architectures in large companies have a concrete regulatory starting point or begin with defined technical goals, and then struggle with problems during implementation. Why? Because they are treated as pure implementation projects, even though their core lies in changing the organization, its culture, and its ways of working. This talk examines why establishing sustainable data management is far more than introducing tools and processes. The focus is on the often underestimated change character of such initiatives, for example the implementation of BCBS 239 in the banking world: from establishing clear responsibilities and dealing with resistance to anchoring a sustainable approach to data quality in day-to-day work. Drawing on practical experience, the talk shows where data programs typically fail and which success factors can be decisive for implementing them effectively and sustainably. Participants will take away concrete ideas for actively shaping the necessary change and thereby leading the implementation to success.




Jonas has been a data scientist and product owner at OTTO since 2023, where he has worked on various products in the realm of marketing optimization and customer analytics. He is currently responsible for the development and application of customer lifetime value models. Before OTTO, he worked at L’Oréal for five years on analytics topics as a data analyst and data scientist.

To deploy budgets according to customer potential, an individual profitability assessment at the level of each customer is needed. Since this fiscal year, OTTO has been using a new statistical CLV model that defines customer value concretely as the expected profit of the coming year and predicts it individually. This enables differentiated targeting according to potential, whether in CRM, customer care, or marketing decisions. In this talk we cover the technical background, but above all our practical experience in applying the model. Particular attention is paid to change management: steering by a predicted profit CLV requires a genuine rethink in several places, and that is exactly where one of the most exciting challenges of this project lies.




Kolja is responsible for the data strategy, the data team and the greenfield Data & Analytics platform at Rameder. Before joining Rameder, he led the Data Foundation team at MediaMarktSaturn, where he was in charge of the migration from the legacy DWH to the new cloud data platform. Kolja started his career in IT consulting, where over the course of 10 years he developed data solutions for numerous clients.

As at many smaller mid-sized companies, the data landscape at Rameder was, until autumn 2024, a heterogeneous collection of islands: SAP at the center, several cloud services such as Google Analytics around it, and individual databases, built up by different departments, in between. To enable high-quality cross-system analyses, the decision was made at the end of 2024 to build a greenfield data platform on Google Cloud. The talk presents the platform architecture and the approach used to connect all data sources to the new platform. The biggest challenge was the integration of SAP BW, which also contained most of the business logic to be migrated. We therefore look at the details there, including a first failed pilot project and the successful second solution.




After completing his master’s degrees in Mechanical Engineering, Economics and Automotive Management, Detlev started his career with Vodafone and Colt Telecom in the service provider space. From 1999 onwards, he held different international sales engagements at Cisco, covering global accounts and leading the Service Provider team. In 2017, he joined Arista Networks, driving the Emerging EMEA Sales team and focusing on AI-driven network automation and management. Since 2025, he has been with Red Hat in the EMEA AI Platform team as a Sales Specialist.









With 15 years of experience in the IT industry, Nicolas currently serves as a Senior Platform Strategist at Red Hat, where he is responsible for app platforms and the FSI sector. He has extensive experience in platform implementation, bridging the gap between business and IT, and driving innovation in hybrid cloud and AI—always in line with the open-source philosophy.









With over five years of experience at the intersection of AI strategy and technical solution engineering, Christoph has helped organizations across industries turn data into real business value — from advising on enterprise AI governance to building and deploying production-grade AI applications. Today he specializes in Generative AI and helps companies cut through the complexity of enterprise AI.





Stefan is Head of Data Strategy & Analytics at the SPAR Austria Group. In this role, he shapes the further development of the data analytics strategy and the use of modern data platforms. In doing so, he supports the transformation of the analytics landscape as well as the implementation of SPAR’s data strategy within the organization and its culture.

More info coming soon.




Sönke brings years of hands-on experience in Big Data, supporting customers in designing and implementing solutions while actively contributing to open source projects. His deep involvement in building and maintaining infrastructures has given him broad insight into the diverse needs and conditions that shape effective Big Data architectures. As CPO at Stackable, he channels this expertise into the product to ensure it fully meets the needs of every customer.








Nancy is a Senior Data Engineer at Stadtwerke Düsseldorf. She supports business use cases end to end – from initial requirements to productive ETL pipelines – with a strong focus on Data Mesh, Data Vault 2.0, and measurable business value. Her technology stack includes Databricks, dbt Labs, and Azure services.

Data decides. But who is allowed to see it? In many companies the honest answer is: IT, and hardly anyone else. This talk is for everyone who wants to change that. I will tell you why self-service analytics initially failed at our company, what we had to overcome internally, and how it ultimately worked after all. Not a technical deep dive, but an honest story about data, responsibility, and the question: who actually owns the insights in your company?




Ulisse is Tealium's Regional VP of Solution Consulting for EMEA, with over 15 years of experience in software engineering and solution consulting. He specialises in helping enterprises harness AI and real-time data orchestration to build trusted, governed data foundations that power smarter, more personalised customer experiences. With expertise spanning CDP, MarTech, and AdTech, Ulisse is a recognised thought leader and speaker who drives innovation at the intersection of AI and data activation.





Birgitta works at Telefónica Germany in the areas of AI, analytics, and BI platforms. She is responsible for cloud-native AI architectures using Azure and Databricks and demonstrates how generative AI can be operated productively in highly regulated environments through DevOps, governance, and security.

Successfully deploying AI in highly regulated environments requires more than technology. In this talk we present our field-tested framework, which looks at AI holistically: from strategy and use-case prioritization through governance, compliance, and architecture to secure operations. Using concrete examples, we show how organizations can enable innovation with clear guardrails, meet regulatory requirements, and create sustainable business value from AI.





Franz is a Senior Process Manager in the field of DevOps, with a focus on cloud, machine learning, and AI. He designs strategic cloud-native transformations, optimizes end-to-end DevOps processes, and works closely with security, data protection, and works councils to anchor modern AI and cloud architectures in organizations in a secure, compliant, and sustainable way.

Successfully deploying AI in highly regulated environments requires more than technology. In this talk we present our field-tested framework, which looks at AI holistically: from strategy and use-case prioritization through governance, compliance, and architecture to secure operations. Using concrete examples, we show how organizations can enable innovation with clear guardrails, meet regulatory requirements, and create sustainable business value from AI.





As Head of Digital People & Culture at WACKER Chemie AG in Munich, Julia is responsible for the successful implementation of data, analytics and AI solutions, with a clear focus on people as the decisive success factor. Her work centers on sustainable usage, measurable impact, and the development of a strong data and change culture.

Companies invest in data, AI, and digital tools, yet the impact depends on an actual change in employee behavior. In this talk, Julia Pogorzelski, user adoption expert at WACKER, explains what user adoption really means: not merely the rollout of tools, but a people-centered cultivation of awareness, skills, and sustained usage. You will learn how to measure success, align KPIs with adoption outcomes, and address the fears and psychological biases that hinder digital adoption. Through real-world cases, such as the rollout of MS Copilot, you will see how user adoption principles are applied in a targeted way to foster competence, transformation, and culture.




Tim has more than 15 years of experience in Business Intelligence. Over the years, his work has evolved from creating reports and dashboards to managing projects focused on new information systems – and ultimately to building a full BI platform. This includes the successful global rollout and adoption of Power BI as Weidmüller’s analytics and reporting platform. Most recently, he has been working on developing a data platform based on Microsoft Fabric. His current focus is on defining and establishing the platform strategy and platform governance.

The talk „Schatzkarte statt Datendschungel – Organisation & Data Catalog als Erfolgsfaktor“ (a treasure map instead of a data jungle: organization and the data catalog as success factors) begins with a journey into the past, outlining the evolution of the data and analytics platform at Weidmüller. It then presents the current organization and names the success factors in its realization. Finally, the data catalog, developed in-house with Microsoft tools and aligned with the company's needs, is introduced.
