In this video, I will show you how to run the new Mixtral LLM, which matches or even outperforms Llama 2 70B and GPT-3.5 on many benchmarks. To this end, I will first show you the most important features of the Mixtral model, which is a Mixture-of-Experts LLM. Then I will briefly explain Mixture-of-Experts networks and how they differentiate the Mixtral model from other LLMs such as Llama 2, Falcon, etc. Next, I will show you how to run Mixtral on your own computer using a user interface (a short loading sketch is included at the end of this description). However, this requires at least 23 GB of VRAM, which makes it infeasible for many people. For this reason, I will give an outlook on future progress with sparse Mixture-of-Experts LLMs. Finally, I will also provide valuable information about fine-tuning sparse Mixture-of-Experts models and explain why this is still challenging. As always, I hope you enjoyed this video and learned something new! :-)

Code Used in This Video
https://github.com/thisserand/mixtral

My Medium Article for This Video
https://medium.com/@martin-thissen/mi...

My Workstation
GPU: NVIDIA RTX 6000 Ada https://nvda.ws/47U7wmA
CPU: Intel Core i9-13900K https://amzn.to/47qDQgp
RAM: Corsair Vengeance 64 GB https://amzn.to/47o4S8e
Motherboard: ASRock Z790M PG https://amzn.to/3SxvtLS
Storage: Samsung 980 PRO 2 TB https://amzn.to/3u8X23Y
PSU: Corsair RM 850x https://amzn.to/3uhTNXS
Case: Fractal Design Meshify 2 Mini https://www.fractal-design.com/produc...
CPU Cooler: Noctua NH-U12A https://amzn.to/3Qpv4IM
Case Fan: Noctua NF-A12x25 https://amzn.to/3srf1lE

Timestamps
00:00:00 Intro
00:01:25 Mixtral 8x7B
00:04:27 Mixture of Experts
00:08:58 Run the Mixtral Model
00:16:10 Fine-Tuning Mixtral & MoEs
00:21:30 Outro

References
https://mistral.ai/news/mixtral-of-ex...
https://huggingface.co/blog/moe
https://huggingface.co/blog/mixtral
https://arxiv.org/pdf/2311.01964.pdf
https://arxiv.org/pdf/2208.02813.pdf
https://arxiv.org/pdf/2006.16668.pdf
https://arxiv.org/pdf/2101.03961.pdf
https://arxiv.org/pdf/1701.06538.pdf
https://arxiv.org/pdf/2202.08906.pdf
https://arxiv.org/pdf/2305.14705.pdf

Stay in Touch
Medium: / martin-thissen
LinkedIn: / mthissen135
YouTube: Of course, feel free to subscribe to my channel! :-)

Of course, financial support is completely voluntary, but I was asked for it:
/ martinthissen
https://ko-fi.com/martinthissen
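
Loading Sketch
The video walks through running Mixtral with the code from the repository above. As a rough orientation only, here is a minimal sketch, not the exact code from the video or the repo, that loads the checkpoint mistralai/Mixtral-8x7B-Instruct-v0.1 (an assumed model ID) with 4-bit quantization via Hugging Face transformers and bitsandbytes, which is one way to fit the model into roughly the 23 GB VRAM budget mentioned above.

```python
# Minimal sketch: load Mixtral 8x7B in 4-bit with Hugging Face transformers.
# Model ID, prompt, and generation settings are assumptions, not the exact
# setup shown in the video.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # assumed checkpoint

# 4-bit quantization keeps the ~47B total parameters within roughly 23 GB of VRAM.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # place layers on the available GPU(s)
)

# Build a prompt in Mixtral's instruction format via the tokenizer's chat template.
messages = [{"role": "user", "content": "Explain Mixture-of-Experts in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Running this assumes the bitsandbytes and accelerate packages are installed alongside transformers; on GPUs with less memory, CPU offloading or a quantized GGUF build would be the usual fallback.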