Data Privacy Week 2026: The boardroom moment for privacy leadership

 

Why Data Privacy Week 2026 marks a leadership turning point

 

In Data Privacy Week 2026, senior technology leaders find themselves at a pivotal moment. 

 

Data privacy is no longer just a regulatory obligation or IT concern. It has become a defining leadership issue, reshaping how organisations approach trust, innovation, and long-term value.

 

This article draws on insights from executive roundtable debates across The Studio, HotTopics’ flagship event series for the C-suite, as well as the top news stories of the past month. From record-setting breaches and global regulatory reform to the rapidly evolving risks of AI-driven data use, these signals point to a clear conclusion: data privacy has decisively entered the boardroom.

 

From technical risk to leadership responsibility

 

January 2026 alone has delivered a sobering reminder of the scale of today’s data privacy challenge. 

 

A massive exposure of over 149 million credentials across major platforms (including Gmail, Facebook, TikTok, Netflix and Instagram) has reinforced a reality that many executives already recognise: breaches are no longer exceptional events.

 

During a HotTopics Studio debate on “Driving business impact through data-empowered experience”, Ian Cohen, then-Chief Product and Information Officer at Acacium Group, captured the leadership shift succinctly: “We need to use data to anticipate and plan, and react.”

 

In today’s environment, that capability must extend beyond security teams alone. Boards, CEOs, and executive committees are increasingly accountable not just for operational resilience, but for how transparently and responsibly their organisations respond when customer data is put at risk.

 

Data Privacy Week and the global regulatory reset

 

Data Privacy Week 2026 arrives amid one of the most transformative regulatory moments in recent memory. 

 

From the UK’s strengthened enforcement powers under the Data (Use and Access) Act, to California’s DELETE Act enabling consumers to erase their data from all registered brokers, and Vietnam’s first comprehensive personal data protection law, the direction of travel is unmistakable: stronger rights, tougher enforcement, and higher expectations.

 

Norma Dove-Edwin, former Chief Data and Information Officer at Places for People, spoke directly to the leadership implications of this shift during a HotTopics roundtable on “Data visionaries in the age of privacy”:

 

“You need policies and ways of working set up before you embark on a data journey. That way everyone understands what your company’s stance on ethics is, which is important in building trust and transparency with the consumer.”

 

This is no longer about compliance checklists. It is about embedding data privacy into organisational culture, operating models, and leadership decision-making.

 

The rise of the privacy-conscious customer

 

Data privacy today is as much about customer expectations as it is about regulation. As a result, consumers are more informed, more empowered, and more willing to act on their values when it comes to how their data is collected, used, and protected.

 

This shift reflects a broader recognition that breaches are no longer anomalies; they are operational realities. Organisations must now design for resilience and trust, not just prevention.

 

Dove-Edwin reinforced the opportunity this creates:

 

“The control is going back to the customer, and it is in fact a win-win scenario for the industry. If they trust you and you can demonstrate how transparent you are in how you use their data, they will become more loyal.”

 

AI: Where innovation and data privacy collide

 

Nowhere is the tension between innovation and data privacy more visible than in artificial intelligence. As organisations scale AI beyond pilot projects into core business operations, questions around governance, consent, training data, and model accountability have become unavoidable.

 

At a HotTopics roundtable on “Keeping pace with AI: Risk versus reward”, Xuyang Zhu, Senior Counsel at Taylor Wessing, reframed the leadership challenge: “The risk of not using AI is greater ultimately than the risks of using it.”

 

Yet that opportunity comes with complexity. Kate Sargeant, Chief Data Officer at the Financial Times, addressed the industry’s divided response to AI content use: “Half of them are doing deals, while the other half are suing big technology companies for using their content without permission.”

 

Explaining the FT’s approach, she said: “We would rather be part of those conversations than outside of the room.”

 

This reflects a broader leadership reality: organisations should engage proactively with AI governance rather than avoid it, while ensuring that data privacy, consent, and intellectual property rights remain protected.

 

Guardrails, ethics and responsible innovation

 

AI is not the only frontier requiring stronger guardrails. 

 

As organisations embed emerging technologies across customer experience, product development, and internal operations, ethical and legal considerations are becoming inseparable from innovation strategy.

 

Rob O’Brien, Head of International Technology at ITV Studios, warned: “When we’re talking about emerging technologies as part of customer experience, it’s important that we start thinking about our guardrails.”

 

Similarly, Paul Davison, Head of Data at Royal Mail, grounded this conversation in everyday leadership judgment: “As we utilise technologies such as machine learning and AI, the ethics of their use becomes simultaneously more important and more interesting to more people, which it should be… I’m a great believer in sense checking everything you do: if I can do this, should I?”

 

Data privacy and transparency as the new trust signal

 

Forward-looking leaders increasingly recognise that data privacy is not just about reducing risk; it is about enabling growth.

 

Maritza Curry, Head of Data at BNP Paribas Personal Finance SA, framed this clearly: “We operate in the knowledge economy. Within this digital and knowledge economy, data is the currency.”

 

But she also underscored the non-negotiable role of protection: “An experience I don't want is a data breach and to know that my data is out there.” This duality (using data to drive insight, while safeguarding it to preserve trust) defines the modern executive mandate.

 

However, without strong data privacy foundations, that value remains fragile.

 

As data-driven personalisation and AI-powered services become more pervasive, transparency is emerging as a core trust signal.

 

Sabah Carter, Chief Data, Intelligence and Technology Officer at FSCS, articulated this principle during the HotTopics debate “Customer experience in a digital revolution”: 

 

“We are very transparent from the start. We will say what tools we are using, who you are speaking to, how we are going to use the data almost to the extent of too much disclosure.”

 

In an era of increasing scepticism, this level of openness is becoming not just best practice, but a competitive advantage.

 

Why Data Privacy Week 2026 matters more than ever

 

Data Privacy Week 2026 is not just a moment for reflection; it is a call to leadership action for the technology C-suite. The convergence of:

 

  • Record-setting breaches
  • Expanding global regulation
  • Rising consumer activism
  • The rapid scaling of AI

…means that data privacy is no longer a technical issue. It is a strategic, cultural, and reputational one.

 

As Ian Cohen reminded HotTopics audiences, invoking a familiar maxim: “With great power comes great responsibility.”

 

That responsibility now sits firmly with the C-suite.

 

The organisations that will lead in 2026 and beyond are not those with the longest policies, but those with the strongest confidence. This means confidence in their data governance, confidence in their AI guardrails, confidence in their transparency, and confidence in their ability to respond when (not if) things go wrong.

 

Or, as one HotTopics panellist put it: “In order to innovate you need to be able to allow some failure … it comes hand in hand.”

 

But failure without trust is fatal. Innovation with trust is transformational.
