Welcome to Level Up: From Zero - the show where we demonstrate, hands-on, how to build solutions with Google Cloud Platform. In this episode, Solution Architect Anant Damle discusses using Dataflow and the Data Loss Prevention (DLP) API to automatically tokenize and encrypt sensitive data.

One of the daunting challenges of migrating data to the cloud is managing sensitive data. That data can be structured, such as analytics tables, or unstructured, such as chat history or transcription records. You can use Cloud DLP to identify the sensitive data in both kinds of sources and then tokenize the sensitive parts (a minimal sketch of the underlying API call follows at the end of this description).

Have a look at the written version: https://cloud.google.com/community/tu...

00:35 - Introduction and explanation of tokenization
01:35 - Using encryption with tokenized data
02:45 - Automatic tokenization architecture overview
04:26 - Step 1: Flatten & sample
06:26 - Step 2: Batch & identify
09:20 - Step 3: Tokenize
09:42 - Hands-on demo
13:03 - Wrap-up and resource links

Free Trial: Google Cloud Platform → https://goo.gle/2u5itEB

Hands-on training with Qwiklabs → Google Cloud Platform Essentials - https://goo.gle/2vdcFcf

Follow us on Twitter →
/ googlecloud_anz
/ googlecloud_in
/ googlecloud_sg
/ googlecloud_id

Follow us on Facebook → / googlecloud

Subscribe to our Google Cloud APAC channel for more episodes where we dive deeper and build on previous examples → https://goo.gle/2EsiSCC
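The episode builds a Dataflow pipeline that samples data, identifies sensitive infoTypes, and tokenizes them with encryption. As a hedged illustration of just the core DLP call, here is a minimal Python sketch using the google-cloud-dlp client's deidentify_content method. It uses the simpler replace-with-infoType transformation rather than the encryption-based transform shown in the video, the infoType list is hard-coded instead of auto-detected as in the episode's pipeline, and names like tokenize_text and "my-project" are hypothetical.

# A minimal sketch, not the episode's Dataflow pipeline.
# Assumes: pip install google-cloud-dlp, and application default credentials.
import google.cloud.dlp_v2


def tokenize_text(project_id: str, text: str) -> str:
    """Replace detected sensitive values in text with their infoType names."""
    dlp = google.cloud.dlp_v2.DlpServiceClient()
    parent = f"projects/{project_id}"

    # Detect a few common identifiers. The episode's pipeline derives
    # this list automatically from a sample of the data instead.
    inspect_config = {
        "info_types": [
            {"name": "EMAIL_ADDRESS"},
            {"name": "PHONE_NUMBER"},
            {"name": "CREDIT_CARD_NUMBER"},
        ]
    }

    # Replace each finding with its infoType name, e.g.
    # "jane@example.com" becomes "[EMAIL_ADDRESS]". The video instead
    # uses an encryption-based (reversible) tokenization transform.
    deidentify_config = {
        "info_type_transformations": {
            "transformations": [
                {
                    "primitive_transformation": {
                        "replace_with_info_type_config": {}
                    }
                }
            ]
        }
    }

    response = dlp.deidentify_content(
        request={
            "parent": parent,
            "inspect_config": inspect_config,
            "deidentify_config": deidentify_config,
            "item": {"value": text},
        }
    )
    return response.item.value


# Example usage with a hypothetical project ID:
print(tokenize_text("my-project", "Email jane@example.com or call 555-0100."))

At scale, the same deidentify_content call sits inside a Dataflow transform so that tokenization happens in parallel across the whole dataset, which is the architecture the episode walks through.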