Understanding the Llama 3 Tokenizer | Llama for Developers
Download Meta Llama 3 ➡️ https://go.fb.me/kbpn54

Aston Zhang, a research scientist working on Llama at Meta, discusses the improvements made to the tokenizer in the Llama 3 models. The new tokenizer uses Tiktoken instead of SentencePiece and has a larger vocabulary of 128k tokens, resulting in better performance on coding, reasoning, and more. The larger vocabulary allows for more specific and nuanced encoding of inputs, while the higher compression ratio reduces the number of tokens required to represent a given input. Additionally, Grouped Query Attention (GQA) helps balance out the increased memory and compute needs, so the model can process larger batches without increasing latency. (Two short illustrative sketches of these points follow at the end of this description.)

Timestamps
00:00 Introduction
00:25 What's new in the Llama 3 tokenizer?
01:58 Vocabulary size and compression ratio
13:01 Performance, efficiency and improving costs
17:46 Recap and resources

Additional Resources
• Dive into Deep Learning ebook: https://go.fb.me/ao405f
• Getting Started Guide: https://go.fb.me/xucc2m

#llama3 #llm #opensource

- -

Subscribe: https://www.youtube.com/aiatmeta?sub_...
Learn more about our work: https://ai.meta.com

Follow us on social media
Follow us on Twitter: / aiatmeta
Follow us on LinkedIn: / aiatmeta
Follow us on Threads: https://threads.net/aiatmeta
Follow us on Facebook: / aiatmeta

Meta AI focuses on bringing the world together by advancing AI, powering meaningful and safe experiences, and conducting open research.
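
The compression-ratio point is easy to probe empirically. Below is a minimal sketch, assuming the tiktoken package is installed; it uses the cl100k_base encoding purely as a stand-in, since Llama 3's actual 128k-token BPE vocabulary ships with the model files rather than with tiktoken itself. The compression_ratio helper and the sample string are illustrative, not part of the Llama release.

```python
# Minimal sketch: measure how many characters a tokenizer packs into each token.
# A larger vocabulary tends to raise this ratio, i.e. fewer tokens per input.
import tiktoken

def compression_ratio(text: str, encoding_name: str = "cl100k_base") -> float:
    enc = tiktoken.get_encoding(encoding_name)  # stand-in encoding, not Llama 3's vocab
    tokens = enc.encode(text)
    return len(text) / len(tokens)

sample = "def fibonacci(n):\n    return n if n < 2 else fibonacci(n - 1) + fibonacci(n - 2)"
print(f"{compression_ratio(sample):.2f} characters per token")
```

Comparing the ratio of an older, smaller vocabulary against a newer, larger one on the same text is one way to see the "fewer tokens for the same input" effect described in the video.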
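
The grouped-query-attention point can also be sketched in a few lines. This is a minimal PyTorch illustration under assumed tensor shapes, not Meta's implementation: queries have n_heads heads while keys and values have only n_kv_heads shared heads, so the KV cache (and its memory traffic) shrinks by a factor of n_heads / n_kv_heads.

```python
import torch

def grouped_query_attention(q, k, v, n_kv_heads):
    """Illustrative GQA: q is (batch, n_heads, seq, head_dim);
    k and v are (batch, n_kv_heads, seq, head_dim).
    Each group of n_heads // n_kv_heads query heads shares one KV head."""
    n_heads = q.shape[1]
    group = n_heads // n_kv_heads
    # Repeat the shared KV heads so shapes line up with the query heads.
    k = k.repeat_interleave(group, dim=1)
    v = v.repeat_interleave(group, dim=1)
    scores = (q @ k.transpose(-2, -1)) / (q.shape[-1] ** 0.5)
    weights = scores.softmax(dim=-1)
    return weights @ v

q = torch.randn(1, 8, 16, 64)   # 8 query heads
k = torch.randn(1, 2, 16, 64)   # 2 shared KV heads
v = torch.randn(1, 2, 16, 64)
out = grouped_query_attention(q, k, v, n_kv_heads=2)
print(out.shape)  # torch.Size([1, 8, 16, 64])
```

Because only n_kv_heads key/value heads are cached per layer, larger batches fit in the same memory budget, which is the trade-off the video describes.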