Balancing Sustainability and Performance: The Role of Small-Scale LLMs in Agentic AI Systems
This research investigates the viability of deploying small-scale, open-weights large language models (LLMs) within agentic AI systems to address the sustainability challenges posed by the high energy demands of large closed-source models such as GPT-4o. Through an analysis of 28 models, the study evaluates the trade-offs among environmental impact, user experience (measured as latency), and output quality.

The results indicate that optimizing model selection can yield significant benefits: the Qwen3 family and mixture-of-experts architectures matched the output quality of GPT-4o while reducing energy consumption by approximately 70%. The authors found that increasing model size often produces exponential growth in energy use without proportional gains in performance, whereas smaller models combined with optimized batch configurations and 4-bit quantization can effectively balance efficiency and responsiveness.

The study concludes that transitioning to smaller, optimized open-weights models offers a practical pathway toward scalable and environmentally responsible enterprise AI agents without compromising task execution.

https://arxiv.org/pdf/2601.19311
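The selection strategy the summary describes — preferring the least energy-hungry model that still clears a quality bar — can be sketched as a simple constrained search. All model names and numbers below are hypothetical placeholders, not figures from the paper:

```python
# Hypothetical energy-aware model selection: among candidates that meet a
# quality floor, pick the one with the lowest energy cost. The names and
# numbers are illustrative only, not measurements from the study.

CANDIDATES = [
    # (model, quality score 0-100, energy in Wh per 1k requests)
    ("large-closed-baseline", 90, 1000.0),
    ("mid-open-32b",          89,  420.0),
    ("small-open-8b-4bit",    88,  300.0),  # 4-bit quantized small model
    ("tiny-open-1b",          70,   60.0),
]

def select_model(candidates, min_quality):
    """Return the lowest-energy candidate whose quality meets the floor,
    or None if no candidate qualifies."""
    viable = [c for c in candidates if c[1] >= min_quality]
    return min(viable, key=lambda c: c[2]) if viable else None

choice = select_model(CANDIDATES, min_quality=88)
# With these made-up numbers the quantized 8B model is chosen, cutting
# energy by (1000 - 300) / 1000 = 70% versus the large baseline while
# staying within 2 quality points of it.
```

This mirrors the paper's headline finding in miniature: once a small model clears the quality threshold, further scale mostly buys energy cost, not capability.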