Building trust in the age of AI: when innovation and regulation intertwine

Humanity, trust and transparency are key factors in integrating artificial intelligence (AI) and new technologies into business plans and systems. These qualities build confidence with customers and increase the freedom to innovate, says Deloitte’s digital trust and transparency lead, Nicola Flannery

Organisations are grappling with how to balance innovation and regulation as we enter the age of artificial intelligence. Nicola Flannery, digital trust and transparency lead at Deloitte, suggests that the biggest obstacle to broad customer acceptance of technology is their hesitation to share data and engage with new technologies.

According to Deloitte’s recent Digital Consumer Trends survey, over one third of those surveyed would like to have their passports, driving licences and other forms of ID integrated into their smartphones, indicating that customers want to do more with their devices. To achieve this, Flannery believes a trust by design approach is key to balancing regulation and innovation.

The concept of trust by design entails building trust and privacy into systems, products and processes. Using this approach, Flannery and her Deloitte team bring together experts from diverse disciplines across the business to collaborate on projects.

“We help our clients understand how they can strategically bring all of these areas together, by working holistically and building in compliance from the outset,” she explains. “It makes things a lot easier in the long run, especially when the regulator knocks on your door, to have a future-proofed solution in place.”

Having first encountered the “trust by design” approach before the General Data Protection Regulation (GDPR) came into force, Flannery felt it could positively impact her own clients. “It’s an effective and efficient way of embedding the building blocks of regulation into an organisation. The approach ensures that companies move away from a box-ticking approach to regulation, instead embedding trust and transparency from the outset,” she says.

Harnessing the knowledge and expertise of Deloitte’s global team and its partnership with the world’s leading privacy by design expert, Dr Ann Cavoukian, Flannery was able to bring the trust by design approach to her own clients at an early stage, to strengthen their approach to regulation.

“I built a lot of the work I did for clients around that concept – even if they weren’t ready for it,” she says.

“Through my own client interactions, the concept of trust by design is rapidly growing as a strategic response to increasing regulation. Prioritising the embedding of responsible design, safety, integrity, cybersecurity, privacy and trustworthy AI, at the engineering stage of product development, is key to this approach.”

Flannery likens her team to the missing piece of a jigsaw that helps her clients connect the internal developers and operations on one side with the compliance and legal people on the other side.

While developers can be laser-focused on pushing the envelope to create efficiencies, compliance requirements can appear to restrict that innovation. Her message is simple: companies that ignore compliance do so at their peril.

Bypassing trust can erode confidence

When fed large amounts of diverse data, AI offers opportunities for good, such as genome sequencing in life sciences, she explains, adding that AI will become more powerful and impactful as those diverse data sets grow.

“There is a concept in privacy called the mosaic effect, where organisations can pull data from multiple sources and combine it, and this can then lead to a risk of re-identification of an individual, which can become a privacy and cybersecurity risk,” she says.

“This is also a concern with AI, especially considering it is the power of these data sets that is used to train generative AI. We need to create boundaries around where companies are pulling that data from, how they are accessing it and how aware the individual who owns that data is of all the potential uses of their data.”

In the race to be first to market with leading AI technologies, there is also uncertainty in people’s minds about the risks and harms associated with the technology’s fast evolution. Bypassing a trust by design ethos can result in a company bringing products to market that erode consumer confidence.

Creating a jargon-free, human-centric approach to the use of AI to create operational efficiencies will allow organisations and consumers to understand and question the technology, Flannery explains. “If AI is making decisions for me, such as determining my eligibility for a mortgage, job or medical diagnosis, there must be a point where I can pause the process and ask how the decision was reached.”

Ultimately, if an individual feels that they have control over their own data, they will trust the organisation they give that data to. The fact that this data belongs to a human, with human rights, also needs to be factored in, says Flannery.

While she works to support organisations as they steam ahead with innovation and realise the possibilities that can be achieved, “it is the individual consumer that’s going to be impacted, and it is they who will ultimately decide whether they trust you or not, which inevitably impacts on brand and revenue”, she says. She adds: “You have to feel that you’re in control, and that involves knowing what the organisation is doing with your data. In return, the organisation has to be open, transparent and feel trustworthy to you as their customer, and that’s where the boundaries and regulations come in.”

Concepts of trust, safety and clarity are crucial

From the inception of the GDPR to internet regulations such as the Digital Services Act (DSA), the Digital Markets Act (DMA) and the recent AI Act, the concepts of trust, clarity and safety are now fundamental requirements for any organisation’s digital footprint, regardless of the industry it operates in.

“When you think about AI, digital trust and transparency, you automatically think about tech, telecoms and media, and, while the tech giants may be front-runners, the use of AI is huge and will be far-reaching within life sciences, financial services, insurance and the consumer business,” Flannery points out. “Every organisation is trying to push innovative boundaries with AI. Every industry is going to be impacted by this.”

Moving fast with AI technologies may seem complex when regulation must be factored in, but ensuring that all the right decision-makers are together early on is crucial to success. “Rather than working in complete silos and hitting roadblocks, you’re joining the dots and working together, to bring in privacy, security and trust elements at the very beginning,” Flannery says. Ultimately, building in trust by design in this way opens up more opportunities for creative freedom once those foundations are in place.

She adds: “Taking the time to lay such strong business foundations benefits every aspect of the company. By building in compliance, assisting clients to identify risks, implementing mitigation measures, and enhancing governance structures, our team ensures our clients are better positioned to enjoy first mover advantage.

“Trust by design is becoming the most practical and efficient approach to future-proofing against digital risks, and we are already implementing this with our clients as standard.”

To learn more, visit Deloitte.ie