Caching Strategies | Caching Techniques | Caching Patterns

In a read-heavy system, loading data from the actual data source on every request can slow responses and increase overall latency. At a high level this may not look like a problem, but it leads to a poor user experience. Caching improves page load times and reduces the load on backend servers and databases. In the caching model, a temporary in-memory store sits between the server (or client) and the actual source of data (the database). The request dispatcher first checks whether the same request has been served before and, if so, returns the previous result instead of executing the full lookup again.

In this TechTalk, we discuss why caching is needed, the different types of caching in a typical distributed system, and the main caching patterns. We explore caching techniques such as cache-aside, write-through, write-back, and refresh-ahead caching, and cover the advantages and disadvantages of each pattern. A minimal cache-aside sketch follows the chapter list below.

#cache #caching #systemdesign #distributedsystems

Chapters:
0:00 - Introduction
0:30 - Agenda
0:51 - Why Caching?
1:59 - Typical distributed system architecture
3:59 - Caching types in typical distributed system
6:46 - Caching Techniques
7:25 - Cache-aside caching
10:48 - Write-through caching
13:41 - Write-back (write-behind) caching
16:11 - Refresh-ahead caching
18:48 - Closing Note
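To make the lookup-first model concrete, here is a minimal cache-aside sketch in Python. The in-process dict, the TTL handling, and the FakeDb helper are illustrative assumptions, not part of the talk; in a real deployment the cache would typically be an external store such as Redis or Memcached.

import time

class CacheAside:
    def __init__(self, db, ttl_seconds=60):
        self.db = db                  # the actual source of data
        self.ttl = ttl_seconds
        self.cache = {}               # key -> (value, expires_at)

    def get(self, key):
        entry = self.cache.get(key)
        if entry is not None:
            value, expires_at = entry
            if time.time() < expires_at:
                return value          # cache hit: skip the database
            del self.cache[key]       # expired entry, fall through to reload
        value = self.db.load(key)     # cache miss: go to the source
        self.cache[key] = (value, time.time() + self.ttl)
        return value

    def put(self, key, value):
        self.db.save(key, value)      # write to the source of truth
        self.cache.pop(key, None)     # invalidate so the next read reloads

class FakeDb:
    """Stand-in database used only to make the sketch runnable."""
    def __init__(self):
        self.rows = {}
    def load(self, key):
        return self.rows.get(key)
    def save(self, key, value):
        self.rows[key] = value

if __name__ == "__main__":
    store = CacheAside(FakeDb(), ttl_seconds=30)
    store.put("user:1", {"name": "Alice"})
    print(store.get("user:1"))   # miss -> loads from FakeDb, then caches
    print(store.get("user:1"))   # hit  -> served from the in-memory cache

In this pattern the application code owns the cache logic; write-through and write-back differ mainly in where and when the write to the database happens, which the video walks through in the later chapters.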