This video is being presented at the Humans at the Cutting Edge of Robotic Surgery Symposium 2024, Jaipur, India. It was produced by Dr Umar Gaffar & Dr Andrew J Hung, Cedars-Sinai Medical Center.

Abstract: Using AI and surgical gestures to predict outcomes

Introduction
How well a surgery is performed impacts a patient's outcomes. However, objective quantification of performance remains an unsolved challenge. Deconstructing a procedure into discrete instrument-tissue "gestures" is an emerging way to understand surgery. Our objective is to establish this paradigm, particularly for the nerve-sparing step of prostatectomy, where performance is the most important factor for patient outcomes.

Methods
We identified 34,323 individual gestures performed in 80 nerve-sparing robot-assisted radical prostatectomies from two international medical centers. Gestures were classified into nine distinct dissection gestures (e.g., hot cut) and four supporting gestures (e.g., retraction). Our primary outcome was to identify factors impacting a patient's 1-year erectile function (EF) recovery after radical prostatectomy.

Results
We found that less use of hot cut and more use of peel/push was statistically associated with a better chance of 1-year EF recovery. Our results also show interactions between surgeon experience and gesture types: similar gesture selection resulted in different EF recovery rates depending on surgeon experience. Furthermore, two teams independently constructed machine learning models using gesture sequences vs. traditional clinical features to predict 1-year EF. Gesture sequences predicted 1-year EF better (Team 1: AUC 0.77, 95% CI 0.73–0.81; Team 2: AUC 0.68, 95% CI 0.66–0.70) than traditional clinical features (Team 1: AUC 0.69, 95% CI 0.65–0.73; Team 2: AUC 0.65, 95% CI 0.62–0.68).

Conclusions
Gestures provide a granular method to objectively indicate surgical performance and outcomes. Applying this methodology to other surgeries may lead to discoveries on how to improve surgery.

See more at: http://vattikutifoundation.com/
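For readers curious about the modeling comparison described in the Results, the following is a minimal, illustrative sketch only; it is not the authors' code. The study's actual feature engineering, models, and data are not shown in this abstract, so the sketch uses synthetic gesture counts, placeholder clinical features, and a plain logistic regression to illustrate how two feature sets can be compared by AUC for a binary 1-year EF outcome.

    # Hypothetical sketch (Python, scikit-learn): compare AUC of a model built
    # on gesture-derived features vs. one built on clinical features.
    # All data below are synthetic; variable names are assumptions for illustration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 80  # number of cases (the study analyzed 80 prostatectomies)

    # Synthetic counts for 13 gesture types (9 dissection + 4 supporting)
    gesture_counts = rng.poisson(lam=30, size=(n, 13))
    # Placeholder clinical features (e.g., age, baseline function scores)
    clinical = rng.normal(size=(n, 4))
    # Synthetic 1-year EF recovery label, loosely tied to relative gesture usage
    signal = 0.05 * (gesture_counts[:, 1] - gesture_counts[:, 0])
    y = (signal + rng.normal(scale=1.0, size=n) > 0).astype(int)

    def auc_for(X, y):
        # Hold out 30% of cases and score a logistic regression by ROC AUC
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=0.3, random_state=0, stratify=y)
        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        return roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])

    print("gesture-feature AUC:", round(auc_for(gesture_counts, y), 2))
    print("clinical-feature AUC:", round(auc_for(clinical, y), 2))

In practice, sequence-aware models (e.g., recurrent or transformer architectures over ordered gesture sequences) and bootstrapped confidence intervals would be needed to reproduce the kind of comparison reported in the abstract; the snippet above only illustrates the evaluation idea.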