phi-1 is an LLM for code: a 1.3B-parameter, 24-layer Transformer decoder trained with FlashAttention. It is pretrained on CodeTextbook, a dataset of filtered and synthetically generated textbook-quality code, and finetuned on textbook-style exercises synthetically generated with GPT-3.5. phi-1 outperforms several competing models on HumanEval and MBPP (Mostly Basic Python Programs) despite being 10x smaller in model size and trained on 100x less data. The key takeaway: high-quality data improves the learning efficiency of LLMs for code because it provides clear, self-contained, instructive, and balanced examples of coding concepts and skills.

Agenda for this video:
00:00:00 How has progress in LLMs for code unfolded?
00:03:18 How important is training on high-quality data?
00:07:28 How is phi-1 trained?
00:09:46 How was the high-quality training data for phi-1 obtained?
00:13:23 How does phi-1 perform on a variety of code-related tasks?
00:15:12 What are some limitations of phi-1?

For more details, see: Gunasekar, Suriya, Yi Zhang, Jyoti Aneja, Caio César Teodoro Mendes, Allie Del Giorno, Sivakanth Gopi, Mojan Javaheripi, et al. "Textbooks Are All You Need." arXiv:2306.11644 (2023).
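As a sanity check on the "1.3B parameters, 24 layers" figure, here is a minimal back-of-the-envelope sketch of a decoder's parameter count. The layer count and total come from the description above; the hidden size, MLP width, and vocabulary size are assumed values for illustration, not confirmed phi-1 hyperparameters.

```python
# Rough parameter-count estimate for a phi-1-style Transformer decoder.
# n_layers=24 matches the video; d_model, d_mlp, and vocab are assumptions.
def decoder_params(n_layers=24, d_model=2048, d_mlp=8192, vocab=51200):
    embed = vocab * d_model       # token embedding table
    attn = 4 * d_model * d_model  # Q, K, V, and output projections
    mlp = 2 * d_model * d_mlp     # up- and down-projections in the MLP
    return embed + n_layers * (attn + mlp)

print(f"{decoder_params() / 1e9:.2f}B")  # lands near the quoted 1.3B
```

With these assumed widths the estimate comes out close to 1.3B, which is consistent with a 24-layer decoder of this scale.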