Byte Latent Transformer (BLT) by Meta AI - A Tokenizer-free LLM

In this video, we dive into the Byte Latent Transformer (BLT), a new Large Language Model (LLM) architecture presented in a recent research paper by Meta AI, titled "Byte Latent Transformer: Patches Scale Better Than Tokens". BLT represents a significant advance as a tokenizer-free architecture that learns directly from raw byte data. Remarkably, it matches the performance of tokenization-based models at scale, sparking considerable interest for future research in this area. BLT does not process bytes one by one; instead, it groups them into dynamically sized patches. This approach allocates more computational power to bytes that are harder to predict than to bytes that are easier to predict (see the sketch below). In this video, we break down essential topics from the paper, including the entropy-based mechanism for grouping bytes into patches, and provide an in-depth exploration of the BLT architecture.

Paper - https://arxiv.org/abs/2412.09871
Code - https://github.com/facebookresearch/blt

-----------------------------------------------------------------------------------------------
✉️ Join the newsletter - https://aipapersacademy.com/newsletter/
👍 Please like & subscribe if you enjoy this content
Support us - https://paypal.me/aipapersacademy
The video was edited using VideoScribe - https://tidd.ly/44TZEiX
-----------------------------------------------------------------------------------------------

Chapters:
0:00 Introduction
1:33 Patching Strategies
5:11 BLT High-Level Architecture
6:36 BLT Encoder & Decoder
8:47 Results
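To make the entropy-based patching idea above concrete, here is a minimal Python sketch of a global-threshold variant: a small byte-level language model estimates how hard each next byte is to predict, and a new patch starts whenever that entropy crosses a fixed threshold. The function names, the threshold, and the toy entropy values below are illustrative assumptions, not the reference implementation from the facebookresearch/blt repository.

```python
import math
from typing import List

def next_byte_entropy(probs: List[float]) -> float:
    """Shannon entropy (in bits) of a next-byte distribution.
    In BLT this distribution comes from a small byte-level LM."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def entropy_patches(data: bytes, entropies: List[float],
                    threshold: float) -> List[bytes]:
    """Group bytes into dynamically sized patches: open a new patch
    whenever the model's entropy for predicting byte i exceeds the
    threshold, i.e. that byte is hard to predict."""
    patches, current = [], bytearray()
    for byte, h in zip(data, entropies):
        if h > threshold and current:
            patches.append(bytes(current))
            current = bytearray()
        current.append(byte)
    if current:
        patches.append(bytes(current))
    return patches

# Toy demo: hand-picked entropy values (an assumption, not model output).
# Entropy is high at the start of a new word and low inside it.
text = b"Hello world!"
ent = [3.5, 0.4, 0.3, 0.2, 0.3, 0.8, 3.2, 0.5, 0.4, 0.3, 0.4, 1.0]
print(entropy_patches(text, ent, threshold=2.0))  # [b'Hello ', b'world!']
```

With these toy values, the high-entropy byte at the start of "world" opens a new patch, so the hard-to-predict word boundary gets its own compute step while the easy continuation bytes share a patch.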
