On December 9th, Dentons’ cross-border litigation team, in partnership with the Association of Corporate Counsel (ACC) Ontario, hosted a panel tailored for in-house counsel on the latest developments in privacy class actions in Canada and the US, offering practical insights on how to anticipate risk, mitigate legal exposure, and position their organizations for success amid a rapidly evolving privacy and class action landscape. The session, moderated by Matthew Fleming, featured Dentons lawyers Emma Irving, Alexandra Quigley and Peter Stockburger, as well as guest speaker Shreya Gupta, Director, Legal Counsel (Technology, AI and Privacy), Loblaw Companies Limited, and drew over 600 registrants. Key takeaways:

Shift from data breaches to data misuse: Across Canada, Quebec, and the U.S., plaintiff firms are moving beyond traditional data breach cases to attack how organizations collect, retain, share, and repurpose data, often reframing routine practices as “misuse” of personal information. Canadian courts are more skeptical of breach-only class actions without real damages, which is pushing plaintiffs toward theories around over-collection, over-retention, and unauthorized secondary uses instead.

Consent, scope creep, and sensitive data: “Scope creep” on consent is one of the main emerging risks: organizations collect data for one stated purpose, then later expand use (or share with new third parties) in ways that no longer match the original consent. This is especially risky with sensitive categories like health and biometric data, where courts and regulators scrutinize whether downstream uses truly fall within what individuals were told at the outset.

Third-party and AI supply chain risk: Third-party technology (pixels, cookies, analytics, marketing tools, AI chatbots/call centers, and SaaS vendors) is now a primary class action trigger, with U.S. claims framed as wiretapping/eavesdropping and Canadian cases focusing on disclosures and vendor terms that go beyond the granted consent. The broader “technology supply chain” (cloud, sub-processors, integrators) is a soft underbelly: attackers increasingly target vendors rather than the primary brand, and plaintiffs and regulators are probing how thoroughly companies’ diligence and contracts constrain those vendors.

In-house counsel upskilling and contracting: Every in-house lawyer effectively needs to become a technology/AI lawyer, because even non-tech businesses procure, embed, or rely on technology and data-driven tools. Practical priorities for the next 6–12 months include building compliance into pre-contract due diligence, clearly allocating risk in AI/tech contracts, strengthening data governance (including anonymization/re-identification risk), and differentiating playbooks for incidents in your own environment versus vendor environments.

Building a global, future-proof program: For multijurisdictional businesses, perfect compliance everywhere is unrealistic, so the recommended approach is a common global baseline (clear and upfront privacy notices, appropriate consent, reasonable security, and robust vendor diligence), supplemented by 10–20% jurisdiction-specific adjustments (e.g., Quebec Law 25, U.S. state AI and privacy rules, limits on arbitration/class action waivers in Canadian consumer contexts).
Because technology and AI are evolving faster than legislation, organizations should lean on existing consumer protection, human rights, and privacy principles now rather than waiting for AI-specific statutes to drive their compliance posture.