We are delighted to announce that legal advisor Bram Hoovers will speak at PrivSec Global, opening soon.

Streaming live May 22 and 23, PrivSec Global unites experts from both Privacy and Security, providing a forum where professionals across both fields can listen, learn and debate the central role that Privacy, Security and GRC play in business today.

Bram Hoovers is Director of Legal at Considerati. A pragmatic legal advisor, he leverages broad experience in privacy laws and the implementation of privacy compliance programs.

Bram appears exclusively at PrivSec Global to discuss the lessons that organisations can take from GDPR compliance as they prepare to adhere to the EU’s forthcoming AI Act. 

Below, Bram answers questions on his professional journey and the themes of his PrivSec Global session.

Could you briefly outline your career pathway so far?

It’s important to reiterate the seminar’s title because I feel well-equipped to discuss GDPR, given my career trajectory. I’m a lawyer; I started in criminal law but transitioned to in-house legal counsel roles at multinationals. My foray into privacy law began around 2012 or 2013 when I tackled assignments on the GDPR, like examining consumer domains’ data practices for Philips. Initially, my focus wasn’t crafting privacy policies but rather ensuring businesses comprehended their legal obligations.

Over a decade, I’ve navigated the privacy field, spending about seven years with Philips, where I also contributed to the GDPR Readiness Program. Privacy law’s appeal lies in the presence of personal data all across organisations, making it a fascinating and complex legal topic. Acting as a liaison between business and legal realms has been my strength, helping to bridge understanding and compliance.

Nearly four years ago, I transitioned to consultancy, drawing on my in-house experience to advise diverse clientele on GDPR compliance. This shift provided insights into different companies’ operations and challenges, enriching my perspective. Progressing from a senior legal consultant to a directorial role, I now oversee legal advisory services while also refining our internal operations and service offerings, keeping on top of evolving legislation such as AI regulation.

The insights gained from years in privacy law offer valuable lessons applicable to AI governance, eliminating the need to reinvent strategies. Moving forward, I see my role evolving to encompass both GDPR and AI compliance, leveraging the parallels between the two domains.

What are the broad steps that organisations need to take to expedite their compliance with the forthcoming EU AI Act?

One crucial initial step, as many in the privacy domain would agree, is to inventory what you have. Without this knowledge, steering a compliance program or embedding legislation becomes challenging. Mapping all AI systems that a company uses or develops is essential. Classifying them according to the AI Act’s three risk layers (prohibited, high risk, and low risk) provides clarity on their status.

Understanding if forbidden AI systems are being developed is vital, as they must be halted. While low-risk systems have fewer requirements, the EU’s AI Act primarily focuses on high-risk ones. Having a clear inventory akin to the GDPR’s register of processing activities aids in this process. Defining roles—whether as a data controller or processor under the GDPR, or as a provider or deployer under the AI Act—is crucial for determining necessary actions. This comprehensive approach facilitates risk assessment and sets the foundation for subsequent steps.
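To make the inventory step concrete, below is a minimal sketch of what such a register could look like in practice, loosely modelled on the GDPR’s register of processing activities that Bram mentions. The class names, fields and risk labels are illustrative assumptions for this article, not anything prescribed by the AI Act or used at Considerati.

```python
from dataclasses import dataclass
from enum import Enum

# Risk tiers as described in the interview (labels are hypothetical).
class RiskTier(Enum):
    PROHIBITED = "prohibited"
    HIGH_RISK = "high_risk"
    LOW_RISK = "low_risk"

# Role under the AI Act, mirroring the controller/processor split under the GDPR.
class ActRole(Enum):
    PROVIDER = "provider"
    DEPLOYER = "deployer"

@dataclass
class AISystemRecord:
    """One entry in an AI-system inventory, analogous to a GDPR register of processing activities."""
    name: str
    owner: str                 # accountable business owner
    purpose: str               # what the system is used for
    risk_tier: RiskTier
    act_role: ActRole
    uses_personal_data: bool   # flags overlap with existing GDPR obligations

def flag_for_action(inventory: list[AISystemRecord]) -> list[AISystemRecord]:
    """Return systems needing immediate attention: prohibited systems must be halted,
    and high-risk systems carry the bulk of the AI Act's requirements."""
    return [r for r in inventory
            if r.risk_tier in (RiskTier.PROHIBITED, RiskTier.HIGH_RISK)]
```

Even a register this simple supports the points above: it shows whether anything prohibited is in development, and it records the organisation’s role for each system so the right obligations can be attached.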

What are the main pitfalls that lie within GDPR compliance? And what common challenges do you see organisations facing as they move to achieve compliance with the EU AI Act?

Yes, the GDPR, compared to the earlier Data Protection Directive, emphasises not just compliance but also the ability to prove it, which entails extensive documentation.

This includes policies, document management systems, and data governance, necessitating significant paperwork. Clearly defined roles and responsibilities are crucial, ensuring accountability. Without coherent policies, training efforts lack direction, akin to building a house without a foundation.

For instance, training staff on data breach protocols is essential, but you need to build your house before you can live in it. If you teach your entire staff what a data breach is, and that such an event needs to be reported within 72 hours, that’s fine. But if there is a data breach and the trained people don’t know who to contact, or how to contact them, then the process is doomed to fail, I would say.
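As a toy illustration of that failure mode, a breach-awareness programme is only as good as the escalation path behind it. Everything in the snippet below (contacts, structure, wording) is a hypothetical sketch used to show the point, not a real process.

```python
# Hypothetical escalation routing: training staff to "report within 72 hours"
# only works if this mapping exists and is kept up to date.
ESCALATION_CONTACTS = {
    "data_breach": {
        "first_contact": "privacy.office@example.com",  # who staff report to
        "dpo": "dpo@example.com",                        # Data Protection Officer
        "deadline_hours": 72,                            # GDPR notification window
    },
}

def report_breach(incident_type: str) -> str:
    """Look up who to notify; an empty mapping is exactly the 'no foundation' problem."""
    route = ESCALATION_CONTACTS.get(incident_type)
    if route is None:
        raise RuntimeError("No escalation path defined - trained staff have nowhere to report.")
    return f"Notify {route['first_contact']} within {route['deadline_hours']} hours."
```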

Similarly, the AI Act, spanning around 500 pages, mandates robust data governance, risk management, monitoring and complaint procedures. Implementing these procedures once the mapping exercise is done is what allows organisations to avoid the pitfalls.

Further pitfalls lie in starting AI projects without established procedures and delineated roles. Even more than privacy regulation, the AI Act involves various stakeholders, including data scientists, developers and legal teams, necessitating alignment across departments.

This complexity makes it all the more important to have cohesive collaboration within organisations, bridging legal, business, and technical domains for effective AI governance.

Don’t miss Bram Hoovers debating these issues in depth in the PrivSec Global panel: Preparing for a new regulation: Applying GDPR lessons to prepare businesses for the EU AI Act

As we approach the sixth anniversary of the GDPR, professionals in risk, privacy, and technology eagerly await the implementation of the EU AI Act, set to become the world’s first comprehensive framework regulating AI usage.

While GDPR aimed to harmonise EU laws, the AI Act addresses the burgeoning AI landscape, seeking to regulate its development and utilisation in Europe. Businesses now turn to risk and privacy professionals to navigate this new regulatory landscape, leveraging their expertise from prior regulatory implementations.

With GDPR, businesses had two years to integrate this impactful regulation into their privacy management programs on a global scale. Now, the countdown begins for the AI Act, requiring businesses to apply its provisions to their AI programs, including AI impact assessments, risk measurement, and understanding third-party AI usage, within the next two years.

In this session, the panel will discuss:

  • The relationship between GDPR and the AI Act: Key legal differentiators and similarities
  • Best practices for preparing for the imminent AI Act
  • Strategies to promote responsible AI use and development within your organisation

Details

Session: Preparing for a new regulation: Applying GDPR lessons to prepare businesses for the EU AI Act

  • Time: 11:00 – 11:30 GMT
  • Date: Wednesday 22 May 2024.

The session sits within a packed two-day agenda of insight and guidance at PrivSec Global, livestreaming through Wednesday 22 and Thursday 23 May, 2024.

Discover more at PrivSec Global

As regulation gets stricter – and data and tech become more crucial – it’s increasingly clear that the skills required in each of these areas are not only connected, but inseparable.

Exclusively at PrivSec Global on 22 & 23 May 2024, industry leaders, academics and subject-matter experts unite to explore these skills and the central role they play in privacy, security and GRC.

Click here to register