
Online Talk

Powering AI Agents with Anthropic’s MCP and Confluent

Register now

Thursday, June 5, 2025

10 AM PDT | 1 PM EDT | 10 AM BST | 10:30 AM IST | 1 PM SGT | 3 PM AEST

AI agents are only as effective as the data they access. Stale, fragmented information limits their ability to make intelligent real-time decisions, while traditional request-response patterns, rigid APIs, and batch data processing fail to scale. Anthropic’s Model Context Protocol (MCP) changes this by providing an open standard for agents to securely connect with external tools and data sources. Combined with the Confluent Data Streaming Platform, AI agents gain access to real-time, contextualized, and governed data streams, so they can execute automated workflows and scale safely across enterprise environments.
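To illustrate the pattern MCP standardizes, here is a minimal, hypothetical sketch in plain Python (not the official MCP SDK; all names are illustrative): an agent first discovers which tools a server exposes, then invokes one to fetch fresh data instead of relying on a stale snapshot.

```python
import json
from typing import Callable, Dict


class ToolServer:
    """Toy stand-in for an MCP-style server: exposes named tools an agent can discover and call."""

    def __init__(self) -> None:
        self._tools: Dict[str, Callable[..., dict]] = {}

    def tool(self, name: str):
        """Register a function as a callable tool."""
        def decorator(fn: Callable[..., dict]):
            self._tools[name] = fn
            return fn
        return decorator

    def list_tools(self) -> list:
        """Discovery step: the agent asks which tools exist."""
        return sorted(self._tools)

    def call_tool(self, name: str, **kwargs) -> str:
        """Invocation step: the agent calls a tool and receives structured JSON back."""
        return json.dumps(self._tools[name](**kwargs))


server = ToolServer()


@server.tool("get_order_status")
def get_order_status(order_id: str) -> dict:
    # In a real deployment this would read fresh state, e.g. from a Kafka-backed view.
    return {"order_id": order_id, "status": "shipped"}


# The agent's loop: discover tools, then call one with arguments derived from the user's request.
print(server.list_tools())
print(server.call_tool("get_order_status", order_id="A-42"))
```

The point of the open standard is that the discovery and invocation steps look the same for every server, so any MCP-aware agent can use any MCP server without custom glue code.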

Join AI experts Sean Falconer and Edward Vaisman as they walk through an agentic AI tutorial using Anthropic’s MCP, Claude LLM, and Confluent.

Register now to learn:

  • What MCP is and why it matters
  • Step-by-step implementation of Confluent’s MCP Server to enable agents to operate on real-time data
  • How to build event-driven agents using connectors, Flink stream processing and AI Model Inference, and Stream Governance (e.g., schema registry, stream lineage, data portal)
  • How to use Claude and natural language to configure Kafka topics, execute Flink SQL, enforce data policy, tag PII, and more
  • Plus: get all your questions answered during live Q&A

Sean is an AI Entrepreneur in Residence at Confluent where he works on AI strategy and thought leadership. Sean's been an academic, startup founder, and Googler. He has published works covering a wide range of topics from AI to quantum computing. Sean also hosts the popular engineering podcasts Software Engineering Daily and Software Huddle.

Edward is a Staff Partner Innovation Engineer at Confluent who specializes in distributed systems and event-driven architectures. He designs and implements enterprise-scale Kafka ecosystems, cloud-native solutions, and real-time data pipelines across cloud providers. He is known for architecting sophisticated systems, from protocol-level engineering to AI applications, while championing technical excellence and automating workflows.