Want to chat with an AI privately on your own computer? This video is an updated walkthrough of one of my favorite programs for running large language models locally. I'll take you through the process step by step, starting with the installation on a Linux system.

You'll learn how to find and download models from Hugging Face. I cover both the standard full-precision models and the compressed (quantized) GGUF files, and I explain the different steps for each.

After that, we'll look at the settings. You'll see how to load a model, adjust how much work your GPU does (using GPU layers), and tweak parameters like "temperature" to control the AI's creativity. We also look at one of the most powerful features: setting your own custom system instructions and even building different "characters" for the AI to use.

Finally, we test it out with a couple of different models. I show the difference between an "instruct" model and one built for conversation. You can even get a peek into a chain-of-thought (CoT) model's reasoning before it gives an answer.

If you've had any experience with this program, drop a note in the comments. If you try this out, come back and let me know what you think. For reference, a couple of example snippets for downloading a GGUF file and chatting with a loaded model from a script are sketched after the timestamps.

Timestamps
0:00 - Intro
0:06 - Installation & Setup
1:03 - Downloading Full Precision Models
1:49 - Downloading GGUF Models
2:39 - Loading the Model
3:46 - Model Settings
4:44 - Character Cards
5:46 - Quick Note About Vision LLM
6:08 - Chat Settings
8:56 - Chatting with the Model
11:31 - Ending
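A minimal sketch of pulling a quantized GGUF file from Hugging Face with the huggingface_hub Python package (pip install huggingface_hub), as an alternative to the web UI's built-in downloader. The repo ID, filename, and destination folder are examples only; substitute the quantization that fits your hardware.

```python
from huggingface_hub import hf_hub_download

# Example repo and filename only -- pick the GGUF quantization that fits your VRAM.
model_path = hf_hub_download(
    repo_id="TheBloke/Mistral-7B-Instruct-v0.2-GGUF",
    filename="mistral-7b-instruct-v0.2.Q4_K_M.gguf",
    local_dir="text-generation-webui/models",  # folder the web UI scans for models
)
print("Saved to:", model_path)
```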
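And a rough sketch of talking to a loaded model from a script instead of the browser, assuming the web UI was started with its OpenAI-compatible API enabled (the --api flag, default port 5000). The system prompt and temperature here just mirror the kinds of settings covered in the video; the prompt text and values are illustrative.

```python
import requests

# Assumes text-generation-webui is running with --api and a model is already loaded.
url = "http://127.0.0.1:5000/v1/chat/completions"

payload = {
    "messages": [
        # Custom system instructions, like the ones set in the video.
        {"role": "system", "content": "You are a helpful, concise assistant."},
        {"role": "user", "content": "In one sentence, what do GPU layers control?"},
    ],
    "temperature": 0.7,  # lower = more predictable, higher = more creative
    "max_tokens": 200,
}

resp = requests.post(url, json=payload, timeout=120)
print(resp.json()["choices"][0]["message"]["content"])
```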