Valence Labs is a research engine within Recursion committed to advancing the frontier of AI in drug discovery.
Learn more about our open roles: https://www.valencelabs.com/careers
Also consider joining the M2D2 Slack: https://m2d2group.slack.com/join/shar...

Abstract: This talk will discuss recent developments in scientific language models for molecular design. Despite tremendous progress of generative models in the natural sciences, controllability remains challenging, and a fundamentally missing aspect is an inductive bias that reflects continuous properties of interest. To that end, we propose the Regression Transformer (RT), a method that abstracts regression as a conditional sequence modelling problem. The RT introduces a new direction for multitask language models by seamlessly bridging regression and conditional sequence generation. Interestingly, in molecular, protein or reaction property prediction tasks, the RT matches conventional regression models despite using a cross-entropy loss. But the RT is dichotomous: priming it with continuous properties yields a competitive conditional generative model that outperforms specialized approaches on a substructure-constrained, property-driven molecule generation benchmark. As we showcase, the RT enabled the discovery of novel catalysts and block co-polymers for ring-opening polymerisation through property-driven, local chemical space exploration. Intensifying our efforts in multitask chemical language models, we present a "Text & Chemistry T5" that solves not only tasks interfacing textual and molecular representations (e.g., molecule captioning, text-based molecule design) but also unimodal tasks such as forward/backward reaction prediction in a truly multitask, prompt-based manner. All presented methodology is open-sourced in GT4SD, the Generative Toolkit for Scientific Discovery, which distributes 30+ state-of-the-art molecular generative models in a harmonised manner (see the usage sketch after the chapter list).

Speaker: Jannis Born - https://research.ibm.com/people/janni...
Twitter Prudencio: / tossouprudencio
Twitter Jonny: / hsu_jonny

Chapters:
00:00 - Identifiability Background
05:03 - Structural Causal Models
07:19 - Interventions
11:08 - Identifiability in Causality
20:55 - Learning From Unknown-Target Interventions
30:53 - Learning in the Presence of Unobserved Variables
35:33 - Treks
38:58 - Latent Factor Causal Models (LFCMs)
44:39 - Causal Disentanglement Models
50:13 - Linear Causal Disentanglement via Intervention
01:01:00 - Ongoing Work
01:05:59 - Q+A
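Usage sketch: the abstract describes GT4SD as distributing the Regression Transformer (and 30+ other generative models) behind a harmonised interface, following a configuration -> algorithm -> sample pattern. The sketch below illustrates property-driven, local exploration around a seed molecule with that pattern. It is a minimal, non-authoritative example: the module path, class names, checkpoint name ("solubility"), and keyword arguments (temperature, tolerance, sampling_wrapper, property_goal, fraction_to_mask) are assumptions from memory and may differ across GT4SD releases; consult the GT4SD documentation for the exact API.

    # Sketch of the GT4SD configuration -> algorithm -> sample pattern
    # for the Regression Transformer; names below are assumptions and
    # may differ between GT4SD versions.
    from gt4sd.algorithms.conditional_generation.regression_transformer import (
        RegressionTransformer,
        RegressionTransformerMolecules,
    )

    # Ask for local variants of ethanol ("CCO") whose predicted aqueous
    # solubility (ESOL) is steered towards a target value.
    configuration = RegressionTransformerMolecules(
        algorithm_version="solubility",   # pretrained checkpoint (assumed name)
        search="sample",                  # stochastic decoding
        temperature=1.4,
        tolerance=20,                     # allowed % deviation from the goal
        sampling_wrapper={
            "property_goal": {"<esol>": -3.5},
            "fraction_to_mask": 0.2,      # how much of the seed to resample
        },
    )
    algorithm = RegressionTransformer(configuration=configuration, target="CCO")

    # Draw candidate SMILES strings conditioned on the property goal.
    candidates = list(algorithm.sample(8))
    print(candidates)

The same two-object pattern (a model-specific configuration plus a generic algorithm wrapper exposing sample()) is what lets GT4SD expose its different generative models in a harmonised manner, as described in the abstract.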