Vibe Coding Meets Accessibility Part 2: Can AI Build an Accessible Modal?
Last week, we tested whether AI tools could build an accessible dropdown. This week, we're raising the stakes with a modal dialog, one of the trickiest components to get right for accessibility. Modals have a lot of requirements that trip up even experienced developers: trapping focus inside the dialog, returning focus to the trigger when it closes, keeping background content inert, and making proper announcements for screen readers. Let's see how AI handles it.

I'll be using acceptance criteria from Charlie Triplett's Atomic Accessibility project to evaluate the results. We'll ask Codex and Claude Code to build a modal, then see if Lovable can do it from scratch. I'll test everything with keyboard navigation and NVDA on Firefox (switching it up from last week's Mac/VoiceOver combo) to see exactly what each tool gets right and where it falls short.

Whether you caught last week's session or this is your first one, you'll see real-time testing of AI-generated code and learn what to look for when evaluating accessibility claims from these tools.
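For reference, the requirements listed above (focus trapping, focus return, inert background, screen reader announcements) map closely onto the behavior of the native HTML `<dialog>` element. The sketch below is illustrative and not from the video; all ids and labels are invented for the example. Opened with `showModal()`, a `<dialog>` is placed in the top layer, the rest of the page is made inert, and keyboard focus is trapped inside the dialog. This is one baseline an AI-generated modal can be measured against.

```html
<!-- Minimal sketch of an accessible modal using the native <dialog> element.
     Element ids and text are hypothetical, for illustration only. -->
<button id="open-settings">Open settings</button>

<dialog id="settings-modal" aria-labelledby="settings-title">
  <h2 id="settings-title">Settings</h2>
  <p>Dialog content goes here.</p>
  <button id="close-settings">Close</button>
</dialog>

<script>
  const opener = document.getElementById('open-settings');
  const modal = document.getElementById('settings-modal');

  opener.addEventListener('click', () => {
    // showModal() (not show()) gives modal semantics: top layer,
    // inert background, and focus trapped within the dialog.
    modal.showModal();
  });

  document.getElementById('close-settings').addEventListener('click', () => {
    // Esc also closes a modal dialog by default; on close, browsers
    // return focus to the element that was focused before it opened.
    modal.close();
  });
</script>
```

Note that custom modals built from `<div>`s (which AI tools often generate) get none of this for free: they need `role="dialog"`, `aria-modal="true"`, a scripted focus trap, and manual focus restoration, which is exactly where the acceptance criteria tend to catch failures.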