All links are available in the blog post: https://www.itzikbs.com/lipschitz-mlp

In this episode of the Talking Papers Podcast, I hosted Hsueh-Ti Derek Liu to chat about his paper "Learning Smooth Neural Functions via Lipschitz Regularization", published at SIGGRAPH 2022. In this paper, they took on the task of enforcing smoothness on neural fields (modelled as neural networks). They do this by introducing a regularization term that penalizes an upper bound on the network's Lipschitz constant, encouraging it to be small. They demonstrate their method on shape interpolation, extrapolation, and partial shape reconstruction from 3D point clouds. What I like most is that it can be implemented in only four lines of code (see the sketch at the end of these notes).

PAPER TITLE
"Learning Smooth Neural Functions via Lipschitz Regularization"
https://arxiv.org/abs/2202.08345

AUTHORS
Hsueh-Ti Derek Liu, Francis Williams, Alec Jacobson, Sanja Fidler, Or Litany

ABSTRACT
Neural implicit fields have recently emerged as a useful representation for 3D shapes. These fields are commonly represented as neural networks which map latent descriptors and 3D coordinates to implicit function values. The latent descriptor of a neural field acts as a deformation handle for the 3D shape it represents. Thus, smoothness with respect to this descriptor is paramount for performing shape-editing operations. In this work, we introduce a novel regularization designed to encourage smooth latent spaces in neural fields by penalizing the upper bound on the field's Lipschitz constant. Compared with prior Lipschitz regularized networks, ours is computationally fast, can be implemented in four lines of code, and requires minimal hyperparameter tuning for geometric applications. We demonstrate the effectiveness of our approach on shape interpolation and extrapolation as well as partial shape reconstruction from 3D point clouds, showing both qualitative and quantitative improvements over existing state-of-the-art and non-regularized baselines.

RELATED PAPERS
📚 DeepSDF: https://bit.ly/3tAjwI4
📚 Neural Fields (collection of works): https://neuralfields.cs.brown.edu/
📚 Sorting Out Lipschitz Function Approximation: http://proceedings.mlr.press/v97/anil...

LINKS AND RESOURCES
💻 Project Page: https://nv-tlabs.github.io/lip-mlp/
💻 Code: https://github.com/ml-for-gp/jaxgptoo...

To stay up to date with Derek's latest research, follow him on:
👨🏻‍🎓 Derek's personal page: https://www.dgp.toronto.edu/~hsuehtil/
🎓 Google Scholar: https://bit.ly/3RKqiW0
🐦 Twitter: / htderekliu

TIME STAMPS
-----------------------
00:00 Intro
00:53 Authors
01:13 Abstract / TLDR
01:36 Motivation
07:15 Related Work
09:21 Approach
18:02 Results
30:53 Conclusions and future work
33:16 What did reviewer 2 say?
35:29 Outro

Recorded on May 30th, 2022.

CONTACT
If you would like to be a guest, a sponsor, or just share your thoughts, feel free to reach out via email: talking.papers.podcast@gmail.com

SUBSCRIBE AND FOLLOW
🎧 Subscribe on your favorite podcast app: https://talking.papers.podcast.itzikb...
📧 Subscribe to our mailing list: http://eepurl.com/hRznqb
🐦 Follow us on Twitter: / talking_papers
🎥 YouTube Channel: https://bit.ly/3eQOgwP

#talkingpapers #SIGGRAPH2022 #LipschitzMLP #NeuralFields #3DVision #ComputerVision #AI #DeepLearning #MachineLearning #neuralnetworks #research #artificialintelligence #podcasts
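-----------------------
CODE SKETCH (UNOFFICIAL)
For the curious, here is a minimal sketch of the core idea in JAX (the authors' released code is also JAX-based). Per layer, a trainable scalar c is passed through softplus, the weight rows are rescaled so the layer's matrix infinity-norm stays at or below softplus(c), and the product of the per-layer softplus(c) terms, which upper-bounds the network's Lipschitz constant, is added to the training loss. This is my own reconstruction from the paper: names such as weight_normalization, lipschitz_mlp, and the weight alpha are illustrative assumptions, not identifiers from the official repository.

import jax
import jax.numpy as jnp

def weight_normalization(W, softplus_c):
    # Rescale each row of W so its absolute row sum (the matrix inf-norm
    # is the max of these) does not exceed softplus_c.
    absrowsum = jnp.sum(jnp.abs(W), axis=1)
    scale = jnp.minimum(1.0, softplus_c / absrowsum)
    return W * scale[:, None]

def lipschitz_mlp(params, x):
    # Forward pass of an MLP whose per-layer Lipschitz constant is bounded
    # by softplus(c); tanh is 1-Lipschitz, so the per-layer bounds multiply.
    for W, b, c in params[:-1]:
        x = jnp.tanh(weight_normalization(W, jax.nn.softplus(c)) @ x + b)
    W, b, c = params[-1]
    return weight_normalization(W, jax.nn.softplus(c)) @ x + b

def lipschitz_bound(params):
    # Upper bound on the whole network's Lipschitz constant.
    return jnp.prod(jnp.array([jax.nn.softplus(c) for _, _, c in params]))

def init_params(key, sizes):
    # One (W, b, c) triple per layer; c starts near the layer's inf-norm
    # (the paper initializes via inverse softplus, omitted here for brevity).
    params = []
    for m, n in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        W = jax.random.normal(sub, (n, m)) * jnp.sqrt(2.0 / m)
        c = jnp.max(jnp.sum(jnp.abs(W), axis=1))
        params.append((W, jnp.zeros(n), c))
    return params

def loss_fn(params, x, y, alpha=1e-6):
    # Task loss plus the Lipschitz regularizer; alpha is an illustrative weight.
    return jnp.mean((lipschitz_mlp(params, x) - y) ** 2) + alpha * lipschitz_bound(params)

As written, loss_fn operates on a single input/target pair; in training one would jax.vmap lipschitz_mlp over a batch. The "four lines" the authors refer to are essentially the row rescaling in weight_normalization plus the lipschitz_bound product added to the loss.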