In this webinar, Ed Biden, co-founder of Hustle Badger, breaks down how to run A/B tests effectively as a product manager. He covers the full process, from picking the right success metric and writing a strong hypothesis through to designing the experiment, interpreting the results, and knowing when A/B testing is not the right approach. This session is packed with practical guidance for product managers, growth leads, and anyone working on digital products who wants to make better, evidence-based decisions about what to build and ship.

KEY TAKEAWAYS
- A/B testing lets you validate that product changes have the impact you expected, and deploy new code safely by controlling how much traffic each feature receives
- A good success metric is intuitive, measurable, actionable, and within your team's control: a directionally correct metric that everyone understands beats a precise one that half the team cannot explain
- Always write a hypothesis before running a test: without one you will not understand why a result was positive or negative, and you cannot build on it
- A 50/50 traffic split gets you results fastest: putting only 10% of traffic into the variant does not reduce risk, it just spreads the same exposure over a longer time
- A/B test results do not aggregate the way you expect: running five positive tests on the same metric will almost never produce a cumulative uplift equal to their sum, due to conservation of intent and novelty effects wearing off
- A/B testing works best for established consumer products with high traffic, an existing experimentation platform, and a data-driven internal culture
- For startups, B2B products, or teams with low traffic, alternatives such as before-and-after analysis, split-by-population testing, SEO testing, session recordings, and user interviews are often more practical
- Inconclusive and negative results are still valuable: they tell you that your hypothesis about what users care about was wrong, which should reshape your priorities

CHAPTERS
00:00 Intro – Why A/B testing matters for product teams
01:36 Why run an A/B test: validation, safe deployment, and speed
04:58 The four steps of running an A/B test
06:06 Step 1: Picking a success metric and what makes a good one
11:49 Step 2: Writing a strong hypothesis with a reusable template
14:03 Step 3: Designing the experiment – confidence, sample size, and runtime
21:24 Step 4: Running the test and interpreting positive, negative, and inconclusive results
26:12 The downsides of A/B testing and when not to use it

LINKS AND RESOURCES
Course: A/B Testing: https://www.hustlebadger.com/courses/...
Article: How to run A/B tests: https://www.hustlebadger.com/metrics/...
Article: When to run A/B tests: https://www.hustlebadger.com/metrics/...
Article: Quantitative Testing Methods: https://www.hustlebadger.com/what-do-...
Webinar recordings: / @hustlebadger
Register for future Hustle Badger webinars: https://luma.com/hustlebadger
Hustle Badger LinkedIn: / hustle-badger
Ed Biden LinkedIn: / edbiden

HASHTAGS
#ABTesting #ProductManagement #ExperimentationCulture #GrowthStrategy #ConversionOptimisation #DataDrivenProduct #ProductMetrics #DigitalProduct #UserExperience #ProductDiscovery #StartupStrategy #PMSkills #ProductOps #HypothesisDriven #ExperimentDesign #GrowthHacking #ProductAnalytics #CRO #WebOptimisation #ProductLeadership
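The takeaways on sample size and the 50/50 split can be sketched with the standard two-proportion sample-size formula: runtime is limited by whichever arm fills up more slowly, so an uneven split only stretches the same exposure over more days. This is a minimal illustration, not material from the webinar; the function names, the 5% to 6% conversion rates, and the 10,000 visitors/day figure are all assumptions chosen for the example.

```python
from math import sqrt
from statistics import NormalDist

def sample_size_per_arm(p_base, p_variant, alpha=0.05, power=0.8):
    """Approximate per-arm sample size for a two-proportion z-test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for significance
    z_b = NormalDist().inv_cdf(power)           # critical value for power
    p_bar = (p_base + p_variant) / 2
    numerator = (z_a * sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * sqrt(p_base * (1 - p_base)
                              + p_variant * (1 - p_variant))) ** 2
    return numerator / (p_base - p_variant) ** 2

def days_to_result(daily_traffic, variant_share, p_base, p_variant):
    """Runtime is set by the slower-filling arm, so skewed splits run longer."""
    n = sample_size_per_arm(p_base, p_variant)
    return max(n / (daily_traffic * variant_share),
               n / (daily_traffic * (1 - variant_share)))

# Hypothetical test: hoping to lift conversion from 5% to 6% on 10,000 visitors/day
print(days_to_result(10_000, 0.5, 0.05, 0.06))  # 50/50 split
print(days_to_result(10_000, 0.1, 0.05, 0.06))  # 90/10 split takes roughly 5x longer
```

Under these assumptions the 90/10 split needs the same number of variant users as the 50/50 split, but collects them at one-fifth the rate, which is the point of the "does not reduce risk, just spreads it out" takeaway.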