Self-Hosted LLMs: A Practical Guide - DevConf.US 2024
Speaker(s): Hema Veeradhi, Aakanksha Duggal

Have you ever considered deploying your own large language model (LLM), but found that the seemingly complex process held you back? Deploying and managing LLMs often poses significant challenges. This talk provides a comprehensive introductory guide to hosting your own models on your laptop using open source tools and frameworks. We will discuss selecting appropriate open source LLM models from HuggingFace, containerizing the models with Podman, and creating model serving and inference pipelines.

For newcomers and developers delving into LLMs, self-hosted setups offer several advantages, such as increased flexibility in model training, enhanced data privacy, and reduced operational costs. These benefits make self-hosting an appealing option for those seeking a user-friendly way to explore AI infrastructure. By the end of this talk, attendees will have the skills and knowledge needed to navigate the exciting path of self-hosting LLMs.

Full schedule, including slides and other resources: https://pretalx.com/devconf-us-2024/s...
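The workflow the abstract outlines (pull a model from HuggingFace, serve it from a Podman container, query the inference endpoint) can be sketched as a small client. This is a minimal, hypothetical example, not the speakers' setup: it assumes the model is exposed through an OpenAI-compatible `/v1/completions` endpoint (as, for example, the llama.cpp server provides) published on `localhost:8080`; the port, endpoint path, and payload fields are assumptions.

```python
import json
import urllib.request

# Assumed endpoint: an OpenAI-compatible completion server running in a
# local Podman container, with port 8080 published to the host.
ENDPOINT = "http://localhost:8080/v1/completions"


def build_request(prompt, max_tokens=64, temperature=0.7):
    """Build the JSON payload for an OpenAI-style completion call."""
    return {
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }


def complete(prompt):
    """Send the prompt to the local model server and return its text."""
    payload = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["text"]


if __name__ == "__main__":
    print(complete("What are the benefits of self-hosting an LLM?"))
```

With a serving container started first (for instance, an image publishing its API with `podman run -p 8080:8080 ...`), running the script prints the model's completion; without a running server the request will simply fail to connect.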