Trustworthy AI Data Governance around Covid-19 Could Help Unlock Innovation


The Centre for Data Ethics and Innovation (CDEI) has published new research on the use of AI and data-driven technology in the UK’s Covid-19 response, highlighting insights into public attitudes, as well as trends it has identified.

A major CDEI poll has found that the public believe digital technology has a role to play in tackling the pandemic, but that its potential is not yet being fully realised.

Public support for greater use of digital technology depends on trust in how it is governed. According to the poll, the single biggest predictor of support for greater use of digital technology was an individual's belief that 'the right rules and regulations are in place'. This was a stronger predictor than demographic factors such as age.

Trend analysis of the use of AI and data-driven technologies in the same period has revealed that conventional data analysis has been more widely used in the Covid-19 response than AI.

The longitudinal study, with a representative sample of over 12,000 people, ran from June to December 2020. The results show significant public support over that period for the use of data-driven technology to tackle the Covid-19 pandemic. Almost three-quarters (72 per cent) of the UK population felt that digital technology had the potential to be used in response to the outbreak – a belief shared across all demographic groups. A majority of the public (69 per cent on average) also showed support, in principle, for a number of specific use-cases, including technologies that have not yet been widely adopted, such as wearable technology to aid social distancing in the workplace.

However, many respondents also felt that the potential of data-driven technology was not being fully realised. Fewer than half (42 per cent) said digital technology was making the situation in the UK better (7 per cent said it was making matters worse). Respondents cited concerns about whether people and organisations would be able to use the technology properly (39 per cent). This was more than double the number who pointed to problems with the technology itself (17 per cent). This points to a potential opportunity: a disconnect between support for the technology's potential and the extent of its current application. The results may also reflect limited public awareness of which AI technologies are actually available and how to deploy them ethically.

Overall, the research uncovered a clear relationship between trustworthy governance and support for the adoption of new technologies. When controlling for all other variables, the CDEI found that 'trust that the right rules and regulations are in place' is the single biggest predictor of whether someone will support the use of digital technology. This was substantially more predictive than attitudinal variables, such as people's level of concern about the pandemic and belief that the technology would be effective, and than demographic variables such as age and education. Fewer than half (43 per cent) said existing rules and regulations were sufficient to ensure the technology is used responsibly, while almost a quarter (24 per cent) disagreed. Older respondents tended to have lower levels of trust in the existing rules and regulations than younger participants.
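The CDEI has not published the underlying model behind this finding, but a claim that one variable is the strongest predictor "when controlling for all other variables" is typically the result of a regression on the survey data. The sketch below is purely illustrative: the file name, column names and logistic-regression setup are assumptions rather than the CDEI's actual methodology. It simply shows how standardised predictors of support for digital technology could be compared in this way.

```python
# Illustrative sketch only: dataset, column names and model choice are assumed,
# not taken from the CDEI's published work.
import pandas as pd
import statsmodels.api as sm

# Hypothetical survey extract: one row per respondent.
df = pd.read_csv("cdei_covid_tracker.csv")  # hypothetical file name

outcome = "supports_greater_tech_use"       # 1 = supports greater use of digital technology
predictors = [
    "trusts_rules_and_regulations",         # attitudinal: 'right rules and regulations are in place'
    "concern_about_pandemic",
    "believes_tech_effective",
    "age",
    "education_level",
]

# Standardise predictors so coefficient magnitudes are roughly comparable.
X = df[predictors].apply(lambda col: (col - col.mean()) / col.std())
X = sm.add_constant(X)
y = df[outcome]

# Logistic regression: which variable most strongly predicts support,
# holding the others constant?
model = sm.Logit(y, X).fit()
print(model.params.drop("const").abs().sort_values(ascending=False))
```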

The CDEI has urged action to build trustworthy governance that earns the confidence of citizens over the long term, pointing to principles outlined in its 'Trust Matrix', such as enhancing accountability and transparency. Public awareness of where to seek recourse when data-driven technology causes harm remains relatively low: 45 per cent of respondents did not know where to raise concerns if they were unhappy with the way digital technology was being used. This finding is consistent with other CDEI-commissioned research, in which 68 per cent of people reported that they would not know who to complain to if they felt an unfair automated decision had been made about them in response to a job application, for example.

The report also highlights trends and patterns relating to the use of AI and data-driven technology during the pandemic. One of these is that, aside from advancing vaccine research, AI did not play the outsized role many thought it would in relief efforts, in part due to a lack of access to data on Covid-19 to train algorithms. Instead, conventional data analysis, underpinned by new data-sharing agreements, has made the biggest difference to the work of health services and public authorities.

Edwina Dunn, deputy chair at the CDEI, said: “Data-driven technologies including AI have great potential for our economy and society. We need to ensure that the right governance regime is in place if we are to unlock the opportunities that these technologies present. The CDEI will be playing its part to ensure that the UK is developing governance approaches that the public can have confidence in.”

John Whittingdale, minister of state for media and data at the Department for Digital, Culture, Media and Sport, said: “We are determined to build back better and capitalise on all we have learnt from the pandemic, which has forced us to share data quickly, efficiently and responsibly for the public good. This research confirms that public trust in how we govern data is essential.

“Through our National Data Strategy we have committed to unlocking the huge potential of data to tackle some of society’s greatest challenges, while maintaining our high standards of data protection and governance.”

The CDEI was set up in 2018 to advise on the governance of AI and data-driven technology. The Centre is overseen by an independent board, made up of experts from across industry, civil society, academia and government.

The CDEI has collated examples of novel use-cases of AI and data specifically being used to counter and mitigate the effects of the pandemic in its Covid-19 repository. The database highlights the breadth of applications, ranging from the piloting of drones that delivered medical supplies to remote regions, to the creation of health equipment databases that monitored the availability of assets in the NHS. It comprises 118 individual use-cases, spanning an array of locations and sectors.

In June 2020, the CDEI published its 'AI Barometer', an analysis of the most pressing opportunities, risks and governance challenges associated with AI and data use in the UK. The CDEI convened over 120 experts to generate a community-driven view of AI and data use in the UK. The analysis identified barriers to innovation, such as poor data quality and availability, a lack of coordinated policy and practice, and a lack of transparency around AI and data use, arguing that these barriers contribute to public distrust, which acts as a more fundamental brake on innovation.

Recently, Covid-19 vaccine maker Moderna announced that it was teaming up with IBM to work on technologies to track Covid-19 vaccine administration. Leveraging the potential of ‘Big Data’ and AI, the two companies will focus on using technology to help governments and healthcare providers address potential supply chain disruptions through information sharing.

Such issues are already coming into play in international vaccine distribution: this week Italy invoked an EU export-control mechanism to block Australia's request for a new shipment of the AstraZeneca vaccine.

The move is intended to preserve supplies of the AstraZeneca vaccine for countries within the EU. Millions of EU-made vaccines have already been exported to dozens of countries worldwide.

The EU executive backed Italy's decision to block a shipment of 250,000 doses. A large share of AstraZeneca's Covid-19 vaccine doses are bottled in Italy. The Australia decision has given Japan cause for concern about future vaccine shipments from the EU.

The refusal of the export request to Australia is the first since the mechanism to monitor vaccine flows was established in late January 2021. The decision was a reaction to AstraZeneca's delays in delivering vaccines to the EU, with the Anglo-Swedish company saying it could supply only around 40 million of the 90 million doses foreseen in its contract for the first quarter of this year.

When asked about the situation, the health ministers of France and Germany both signalled tacit agreement, in principle, with Italy’s decision.

While seeking the European Commission's intervention, Australian Prime Minister Scott Morrison said he could understand the reasons for Italy's objection. Speaking to reporters in Sydney, Morrison said: "In Italy people are dying at the rate of 300 a day and so I can certainly understand the high level of anxiety that would exist in Italy and in many countries across Europe".