The Hard Parts of Using Differential Privacy in Deep Learning
In this video, I take you through what I think are some of the more difficult parts of differential privacy. It's not (usually) the technical questions, but the larger questions: how we architect the system, how we apply what we know about privacy, and how we reach across disciplines and expertise to appropriately shape and define solutions that make sense for our organization and use case.

Recommended video on learning theory and privacy: • What Is The Sample Complexity of Different...

Related blog post and all the references from these slides: https://blog.kjamistan.com/differenti...

What is differential privacy in ML post: https://blog.kjamistan.com/differenti...

Or check out my book: https://practicaldataprivacybook.com

You can hire me to help advise, train, or work on differential privacy in deep learning: https://kjamistan.com

If you've applied DP before to ML problems, do you agree? What other problems have you run into? Any tips you can share?
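For context on what the "technical part" the video contrasts against looks like, here is a minimal, hypothetical sketch of the core DP-SGD aggregation step (per-example gradient clipping plus calibrated Gaussian noise) in plain NumPy. The function name, parameters, and defaults are illustrative assumptions, not code from the video or blog posts; in practice you would use a vetted library rather than hand-rolling this.

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Illustrative DP-SGD aggregation step (not from the video):
    clip each example's gradient to clip_norm, sum the clipped
    gradients, add Gaussian noise scaled to the clip bound, and
    average over the batch."""
    rng = rng if rng is not None else np.random.default_rng(0)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose L2 norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    total = np.sum(clipped, axis=0)
    # Noise standard deviation is tied to the clipping bound, so the
    # privacy guarantee is independent of any single example's gradient.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(per_example_grads)
```

The hard parts the video focuses on sit around code like this, not inside it: choosing the clipping bound and noise multiplier, tracking the privacy budget across epochs, and deciding who in the organization signs off on those choices.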