In this episode, we’re joined by Marc Klingen, Co-founder & CEO of Langfuse, an open-source observability platform helping teams run LLM and agentic systems reliably in production. Marc shares the story behind Langfuse, from getting accepted into Y Combinator with an entirely different idea to discovering firsthand why LLM applications are easy to demo but hard to run in production. We dive deep into the real challenges teams face when deploying AI systems at scale: unconstrained inputs, non-deterministic outputs, evaluation, debugging, and building effective feedback loops.

We also discuss:
- Why observability is critical for production LLM and agentic systems
- How teams debug real-world AI failures using tracing, evals, datasets, and user feedback
- Best practices for building and maintaining golden evaluation datasets
- Why Langfuse chose to be fully open source and self-hostable
- The thinking behind Langfuse joining ClickHouse, and how it accelerates performance, reliability, and scale
- The future of AI observability, long-running agents, and automated optimization loops
- Career advice for engineers and builders entering an AI-driven job market

This conversation is a must-watch for AI engineers, founders, platform teams, and anyone building real production AI systems.

Resources:
https://langfuse.com/blog/joining-clickhouse