Contact-Tracing Technologies and the Problem of Trust—Framing a Right of Social Dialogue for an Impact Assessment Process in Pandemic Times

Abstract


Introduction
While technologies offer potentially powerful tools to help address complex social challenges, experience shows that they may fail to meet expectations and may also raise challenges of their own, including for privacy and other data rights. To what extent can these difficulties be ascribed to a lack of public trust undermining the technologies' effectiveness and disputing their legitimacy? The Australian and Dutch pandemic contact-tracing apps considered in this article suggest part of an answer to this question. As we explain below, the case studies were selected inter alia because of the very different treatment of human rights in the Netherlands and Australia, despite other similarities, for instance, in terms of development and commitments to democracy.1 In particular, the Netherlands is bound by the Charter of Fundamental Rights, which forms part of the constitutional law of the European Union (EU),2 and is a signatory member of the European Convention on Human Rights.3 Meanwhile, Australia largely relies on formal adherence to international rights instruments such as the International Covenant on Civil and Political Rights.4

As our case studies show, the greater efforts made by the Dutch Government to address a range of rights and provide for wide consultation in the CoronaMelder app's various impact assessments paid off in terms of a better-designed app that was more broadly conversant with human rights than its Australian COVIDSafe counterpart, and was also more trusted, even if these benefits were still marginal compared to manual contact-tracing, especially in already marginalised communities. We argue that the Dutch experience should now be taken further to frame a right of social dialogue allowing data rights subjects to participate fully in the impact assessment process. We hope (and expect) this would result in better decision-making and improved public trust in 'truly trustworthy' technologies developed and deployed in response to a pandemic.5 However, ultimately our more basic argument is that rights premised on dignity and liberty are of value and should be respected. In short, respect for human rights should be not only an end goal but part of the process leading up to that end, including, indeed especially, in pandemic times.

Contact Tracing Meets Impact Assessment
As Karen Yeung and Lee Bygrave point out,6 providing for impact assessments as a way of addressing risks associated with the deployment of novel technologies is one marker of a 'modern' data protection regime that functions as a progressive instrument of regulation. An example is art.35(1) of the EU General Data Protection Regulation 2016 (GDPR),7 which states that:

Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data.
The terms of the impact assessment process are prescribed in the rest of the article, including provision for public participation in the process in art.35(9), which specifies that:

Where appropriate, the controller shall seek the views of data subjects or their representatives on the intended processing, without prejudice to the protection of commercial or public interests or the security of processing operations.
The GDPR process offers an example of a regulatory mechanism that aims to identify and mitigate potential risks, balance them fairly against prospective benefits and, at the same time, provide for clear public accountability.8 Yet, in practice, assessments often tend to be dealt with in a rather narrow technical or legal compliance-oriented manner,9 with their focus more on the procedural analysis and risk management dimensions of the GDPR than on forensic discussion of the GDPR's safeguarded rights and freedoms, which are instead left to data protection authorities and courts to elaborate.10 As to the scope for public participation provided for in art.35(9), the Article 29 Working Party, in its 2017 guidance, stressed that consultation need not occur in every case.11 Even so, the combined emphasis on rights and public accountability in the GDPR provisions, as well as the Charter (and more generally the Convention), clearly informed the Dutch Government's conduct in framing and deploying its impact assessments in the process of rolling out its CoronaMelder app in the challenging environment of pandemic technologies being pitted against publics' concerns about privacy and data protection along with other rights and freedoms.
By contrast, Australia, with its more limited experience of human rights and impact assessments, was less well prepared when it came to conducting an impact assessment for its COVIDSafe app under the emergency conditions of the pandemic. Although the International Covenant on Civil and Political Rights is mentioned in the preamble to the Privacy Act 1988 (Cth), that is as far as it goes. And minimal provision is made for impact assessments in § 33D(1) of the Act, which states that:

If: (a) an agency proposes to engage in an activity or function involving the handling of personal information about individuals; and (b) the Commissioner considers that the activity or function might have a significant impact on the privacy of individuals; the Commissioner may, in writing, direct the agency to give the Commissioner, within a specified period, a privacy impact assessment about the activity or function.
It is clear from the terms of § 33D that the Act's focus here is limited to government agencies, and the requirement for, as well as the nature and scope of, an assessment is left to the discretion of the Commissioner. For instance, the Commissioner, in the exercise of the power to register a code under the Act, prescribes that impact assessments should be conducted for 'all high privacy risk projects' in aid of good management and trust-building.12 Rights hardly feature in legal discourses around impact assessments under the Act, including in the Office of the Australian Information Commissioner's published guidance on the conduct of privacy impact assessments, which characterises these assessments as 'reasonable steps to implement practices, procedures and systems that will ensure compliance with the APPs [the Act's Australian Privacy Principles] and enable them to deal with enquiries or complaints about privacy compliance', thus forming part of the 'risk management and planning processes' that entities should undertake.13 On the other hand, it is also clear that the publics in Australia, as in Europe, value privacy highly even when weighed against other rights and freedoms such as security and freedom of expression, and indeed want to find ways to provide for all of these at the same time. For instance, an Oxford Internet Institute-INSEAD survey conducted in 2011 shows that Australian and European countries are relatively close in terms of their citizens' preferences to have 'it all' when it comes to these rights and freedoms.14 Indeed, the authors find 'a global culture developing around the Internet, in which users worldwide share similar values and attitudes related to online freedom of expression, privacy, trust, and security'.15

These concerns, especially centred on the processing of personal data, have only increased in recent years. For instance, the Office of the Australian Information Commissioner's (OAIC) Australian Community Attitudes to Privacy Survey 2020 records that 'seventy percent [of those surveyed] consider the protection of their personal information to be a major concern in their life'.16 And a 2019 Eurobarometer survey finds that (even with the GDPR's protections) 'more than six out of ten are concerned about not having complete control over the information they provide online'.17 The Dutch and Australian governments' awareness of such public concerns no doubt lay behind their readiness to conduct very public impact assessments when it came to the introduction of their contact-tracing apps, which, by their nature, required a high degree of public trust to succeed as networked technologies (meaning that their value depended on how many people downloaded and used the app).18 As we will see, the Dutch assessments of the CoronaMelder app were more squarely focused on rights. But, even in Australia, the government was eager to assuage public fears about the privacy impacts of its COVIDSafe app and appreciated that its public messaging around its impact assessment could not be restricted to narrow compliance with prevailing legal standards but should also take into account public concerns about rights and freedoms. Even so, query whether its impact assessment went far enough in assuaging these concerns, as shown by the low trust ultimately placed in the app by prospective Australian users. That the Dutch app did better is a testament to its more rigorous rights-oriented impact assessments (although we might still query whether it did well enough compared to manual contact tracing to justify its deployment and use, or whether the resources might have been better expended in other ways).
Related to the above was the lack of public participation and engagement provided for in the Australian impact assessment, certainly in comparison to the Dutch impact assessments. As Yeung and Bygrave explain, if the task of a privacy or data protection impact assessment is not just one of ensuring compliance with standards but of maintaining public 'legitimacy',19 then 'community-based governance' plays a distinct role here.20 In our discussion below, we suggest that the effort made by the Dutch Government to provide for wide consultation in its impact assessments paid off in terms of a better-designed app that was more broadly conversant with human rights (in other words, the assessments represented a step in the right direction), and was also more trusted. This experience should now be taken further in framing a future impact assessment process for pandemic times that recognises and incorporates a right of social dialogue as an essential feature of the process.

The COVIDSafe 'Experiment in Coercion'
Australia rolled out its COVIDSafe Bluetooth contact-tracing app a few months into the pandemic as part of its effort to 're-open Australian society after the national and state lockdowns occasioned by the "first wave" of infections from March to May 2020'.21 As with other contact-tracing apps, the intention was to have a digital contact-tracing app that complemented manual contact tracing in the shortest time, involving as many people as possible. What may be less obvious, given the high level of public take-up needed for the app to succeed, is that it was not mandated or strongly pushed as, for instance, in Singapore.22 Rather, the government relied on voluntary take-up of the app, which, in turn, required a focus on public trust.23 That this would be a particular challenge in Australia was partly because, as part of its 'distinctive experimental path to combatting the pandemic, which involves considerable amounts of coercion',24 the government determined early on that it would adopt a centralised reporting architecture, here taking the Singapore TraceTogether app as its model. As foreshadowed above, the 'solution' found to the challenge of ensuring sufficient public engagement and trust for the app to succeed was to undertake and publicly vaunt a privacy impact assessment.25 But this did little to forestall, and may even have helped to engender, the 'furious debate' surrounding the app.26 The privacy impact assessment, broadly speaking, followed the style of the impact assessment process set down in § 33D of the Privacy Act 1988, further elaborated in the OAIC Code and Guidance (noted earlier).27 However, it had its own sui generis character: rather than being directed by the Privacy Commissioner, it was directed by the government agency involved, namely the Department of Health. The impact assessment prepared by the law firm Maddocks made 18 recommendations highlighting issues of legal and technical compliance, which the government implemented.28

Even so, the app had a relatively low uptake of 6.3 million downloads over the first few months,29 that is, less than 25% of the population and well below the government's stated goal of 40%.30 In part, this could be attributed to technical problems with its usability (the app required a smartphone with an Australian SIM card, was widely considered not easy to use and had a tendency to provide 'bad app' data, especially at the start).31 There were also factors to do with low rates of COVID-19 infection in Australia in 2020, border closure policies and the alternative option of QR code check-ins in states and territories from around the end of 2020, beginning in New South Wales (September) and Victoria (November). But, as Paul Garrett and Simon Dennis point out, there were broader problems with the app: '[a] "social license" is necessary for any voluntary measure to be effective, and right now COVIDSafe doesn't have it'.32 Public perceptions of issues inter alia about the app's ability to offer effective protection of privacy and security indeed presented significant challenges in rolling out the app (potentially impacting both download and use rates). As reported in The Lancet, concerns about the app's reliance on a centralised database were 'identified early by many commentators as potential barriers for acceptance of the COVIDSafe app'.33 Further, community experts openly complained that, although the app's source code was shared in May 2020 via GitHub, it was not properly open-sourced and feedback was blocked. This, of course, did not prevent them from publishing their own concerns relating to the app's protection of privacy and security (based on what was publicly available).34 Legal scholars also raised concerns about the available protection of privacy under the current terms of the Act and argued for law reform to increase transparency and accountability obligations.35

The government, in its publicity, insisted that the app met high standards in both privacy and security. And in an effort to strengthen public confidence along with expert involvement, amendments to the Privacy Act were introduced to provide an explicit oversight role for the OAIC, limit the app's personal data collection, use and disclosure to the specific purposes of contact tracing, and state that data collected must be held in Australia and deleted when the app was no longer in use.36 The OAIC also conducted its own review of various aspects of COVIDSafe, finding only a handful of medium- and low-risk deficiencies that it suggested could be relatively easily addressed.37 Even so, the public's distrust in the app's protection of privacy and security persisted over the life of the app, along with rising 'fear[s] of the normalisation of governmental tracking',38 as well as digital exclusion, with different levels of access and affordability detected for Australians with low incomes, education and employment levels, those aged 65 years and older, Indigenous Australians, those with disabilities and Australians living in remote and regional areas.39 In the final wrap-up, according to Department of Health data, only two positive COVID-19 cases were identified through the app that were not found by manual contact tracers.40 And, despite 7.9 million registrations of the COVIDSafe app between April 2020 and May 2022, only 792 users consented to their data being added to the centralised data store for contact tracing.41 The Department of Health did not reflect on the reasons for the app's failure in reporting on it.42 But, following a change of government, the new Health Minister Mark Butler, on announcing the app's decommissioning in August 2022, scathingly denounced the 'wasteful and ineffective COVIDSafe app' as a 'colossal waste of more than $21 million of taxpayers money'.43

The CoronaMelder Exemplar of 'Technology Theatre'
By contrast with the failed Australian COVIDSafe app and its rather limited impact assessment, the Dutch CoronaMelder app, supported and tested by its impact assessments, presents an exemplar of technical competence and human rights compliance. Yet, as Rosamunde van Brakel et al. point out, there was a certain amount of 'technology theatre' going on here, which may help to explain the app's ultimately limited success in tackling a pandemic that required a massive human effort in trust and cooperation.44 Despite all the laudable efforts in creating and deploying a well-designed contact-tracing app that the public would readily accept and use, and with a reasonable level of public participation, with some 4.6 million users downloading the app (i.e., about 30% of the country's population),45 the app enhanced the effectiveness of manual contact tracing by a mere 6%.46 Even those taking a positive view describe its effects as 'small'.47 Meanwhile, according to ex post efficiency assessments, manual contact tracing remained a significantly preferable alternative in terms of its effectiveness in detecting infections.48 This leaves the question of what type of impact assessment would achieve a better outcome in terms of effectiveness without unduly derogating from rights, or whether the project should have been abandoned to start with.
As a technology, CoronaMelder seems hard to fault. It combined a decentralised client-server Bluetooth architecture with the Decentralised Privacy-Preserving Proximity Tracing (DP-3T) protocol, working on both iOS and Android smartphones. From inception to rollout and use, its development was an open, transparent and collaborative open-source work in progress, with all intermediate app versions shared on GitHub. Its design was fully aligned with the GDPR, meeting the required robustness and data protection requirements such as data minimisation, data protection by design and default, and storage limitation. Moreover, it was fully tested throughout the app's design, development and rollout. Large-scale field testing involved 2,000 users mid-design, followed by final practice testing in different regions two weeks before rollout. And consideration was given to users with restrictions such as age (the 60+ group) and visual, minor mental or motor impairments, with usability testing aimed at making the app user-friendly for these users.49

Likewise, the various impact assessments were also generally well designed and conducted. Two official data protection impact assessments were conducted pursuant to art.35 of the GDPR, both made publicly available. The first, from July 2020, was conducted under the auspices of the Ministry of Health, Welfare and Sport. It revealed no major risks associated with the app's operation, and the app was judged a necessary and proportionate measure in the circumstances.50 The second, from August 2020, was conducted by the law firm Privacy Management Partners (a firm with extensive expertise in data protection),51 and found that the app was largely sufficient in addressing data protection standards. Nor were the assessments confined to technical and legal compliance. Ethical and social implications also received attention in the above reviews. And in a further review of the app, a range of core ethical values compiled by a panel of professionals and citizens coordinated by technology ethicist Peter-Paul Verbeek was emphasised as key to the 'social embedding of the app', with 'inclusiveness' and 'solidarity' highlighted as among the important values here.52 As can be seen from the above discussion, expertise and diverse participation were features of these assessments: the teams involved included officials, scientists and technologists, legal experts, behavioural scientists advising on how the app could support control and follow-up of infections, and ethicists such as Verbeek (as noted above), along with diverse other individuals and community representatives who participated in the app's testing phases to improve its functionality, effectiveness and acceptability in ethical and social terms.

Nevertheless, despite the extraordinary level and breadth of the impact assessments conducted on the CoronaMelder app, there were some omissions. In particular, the second impact assessment, conducted by Privacy Management Partners, noted some broader social issues as falling outside the scope of its terms of reference, such as the fact that those at the frontline, such as healthcare workers, shop assistants and transport workers, may be disproportionately burdened by notifications and may ultimately turn off rather than embrace the app, as well as the uncertain ramifications of introducing population monitoring technologies in the longer term.53 An independent study conducted by social researchers from the University of Twente and North-West University further found that the CoronaMelder app was generally easy to use, but that some participants, particularly older adults, young people with more limited education or a disability and adults of migrant background, found it more difficult to deal with. There were also various concerns expressed about the app's usefulness and privacy-preserving mechanisms, with the researchers suggesting the need for better and more targeted communication.54 Might these challenges have been overcome with even wider and more inclusive consultation in the impact assessments conducted on the app, or might the app itself have come more into question, with those consulted arguing for more emphasis on manual contact tracing and human support, possibly in combination with the app and/or other measures? At the very least, it can be concluded that even in the Dutch case, for all its advantages over the Australian one, '[c]areful assessments of technological solutions in crisis situations are needed', involving a full range of participants.55

Framing a Data Rights Impact Assessment Process
The above case studies reinforce the point made by Yeung and Bygrave that paying attention to ethics and social norms, including around human rights, and establishing community-based governance need to be central features of an impact assessment process geared to establishing and maintaining public 'legitimacy'.56 In the words of the EU's European Group on Ethics in Science and New Technologies, 'human beings ought to be able to determine which values are served by technology, what is morally relevant and which final goals and conceptions of the good are worthy to be pursued'.57 There are also more practical reasons for taking this line. As Charles Raab explains, at their best, impact assessments serve as 'mechanisms for learning, not only because of the information they bring to light and disseminate about the functioning of technologies and systems, but because of the way such information might generate better handles on what works or what is more likely to work better'.58 Or, as Alessandro Mantelero puts it, 'participation can give voice to the different groups of persons potentially affected by the use of data-intensive systems and different stakeholders (eg NGOs, public bodies) facilitating a human-centred approach to AI design'; while, at the same time, it 'reduces the risk of under-representing certain groups and may also flag up critical issues that have been underestimated or ignored'.59

For instance, as noted above, it may be hoped that (even) wider and more inclusive public consultation would have 'flagged up' concerns around inter alia the privacy and other data rights implications of the Australian and Dutch contact-tracing apps, as well as assisting with the development of effective responses, including deciding whether the app was worthwhile (in its current state) and, if so, on what terms; whether communication around the app might be better delivered to diverse communities; or whether resources should be allocated in some other way. More overt attention paid to accommodating public concerns, including with respect to rights, may also have improved the prospects of public trust in these technologies that relied on public trust to succeed. But ultimately, we think the value of these inclusive initiatives should not depend solely on such immediate practical considerations. If 'legitimacy' means 'social licence',60 that is, public acceptability in a society that is grounded in democracy, then paying attention to what relevant publics consider important is of value. Likewise, it can reasonably be argued that a 'truly trustworthy' technology is one in which 'trust is based on respect for human rights, democracy and the rule of law'.61 And if 'rights' are grounded in notions of human dignity and liberty, these should be understood to include 'the right to be free to set one's own standards and choose one's own goals'62 and 'to participate in decision-making in matters which affect [one's] rights',63 including with respect to data, and in balance with other rights, freedoms and interests applying a proportionality analysis.64

In short, what we are arguing for is a right of social dialogue in relation to the deployment and use of novel pandemic technologies that affect the rights of humans with respect to their data. This is a right that can be viewed as implicit in the rights-centric character of a modern privacy/data protection regime like the EU GDPR, which (as Yeung and Bygrave argue) should be broadly construed as a progressive instrument in line with its character and goals,65 and like the Australian Privacy Act, which, although less modern than the GDPR, specifies the right to privacy in the United Nations' International Covenant on Civil and Political Rights (to which Australia is a party) as a guiding principle, along with the rest of that covenant, which includes provisions on freedom of expression, freedom of association and self-determination for 'all peoples'.66 It is a right that we have in previous writing identified as important in relation to algorithmic decision-making technologies, where the points of view of data subjects (as voiced by them or their chosen representatives, or both) need to be taken into account.67 But it also makes sense in relation to the deployment and use of a wider range of pandemic technologies that impact data rights, given their application across large sections of the community, their implications for human welfare, their potential health and life-giving aspects and the emergency conditions under which decisions are made and applied, including a temptation for governments to turn to 'new technologies as a policy response to crisis'.68 We see it as part and parcel of a general approach to pandemic decision-making that reflects values such as 'preparedness and management',69 transparency and trustworthiness,70 inclusiveness and solidarity71 and respect for human rights.72

We would hope (and expect) it would lead to better decision-making accommodating diverse individuals and communities, at the same time enhancing their trust in 'truly trustworthy' technologies. The right to engage in social dialogue on the deployment and use of pandemic technologies that impact humans and their data is a matter of human dignity and liberty and of respect for rights, including rights around data. But its basic value lies in the recognition that this right should be not only an end goal but part of the process leading up to that end, including, indeed especially, in pandemic times.

Conclusion
In this article, we have argued that an impact assessment process for pandemic technologies that impact data rights should provide for a right of social dialogue as part of the process. With Australia coming to recognise the value of impact assessments in addressing prospective social harms associated with novel technologies,73 the question is whether policymakers of the future will learn from the salutary experience of the (rather limited) impact assessment commissioned and conducted with respect to the COVIDSafe contact-tracing app, as detailed in our first case study, which failed to address and assuage public concerns about the treatment of sensitive personal data, the prospect of unwanted surveillance and the discriminatory effects for marginalised individuals and communities. And we have suggested that there is much to be learned from the more broadly expert and consultative Dutch impact assessments for the CoronaMelder app considered in our second case study, even if we also suggested there was scope for improvement there as well, with some further guidance sketched out in the appendix below. Hopefully, next time around, a wide range of expertise and experience of diverse individuals and communities will be drawn on in the impact assessments for technologies designed to alleviate a pandemic's threats of harm to life and health, including in asking at what cost to privacy and other data rights.

Appendix: Guidance for a Data Rights Impact Assessment Process for Pandemic Technologies
Key elements: (A) Official input, expertise; (B) Social dialogue

Technical proficiency
(A) Ensure expert and timely assessment of technical aspects of the technology throughout its lifecycle, including with respect to data minimisation, security and safety, transparency and consent, encryption, coding and tracking schemes and the storage of data.
(B) Encourage and facilitate wide public consultation with diverse individuals and groups who will be affected by the technology, as well as community experts, representatives and civil society activists, with a particular emphasis on the technical features of the technology.

Legal requirements (including with respect to rights)
(A) Ensure expert assessment of compliance with legal requirements in relation to the deployment and continued use of the technology, including with respect to privacy, data and other human rights standards broadly construed.
(B) Encourage and facilitate wide public consultation with diverse individuals and groups who will be affected by the technology, as well as community experts, representatives and civil society activists, with a particular emphasis on legal standards that, broadly construed, protect rights, freedoms and interests.

Broader ethical and social standards (including with respect to rights)
(A) Ensure expert assessment of the extent to which the technology meets ethical standards and social norms, including with respect to privacy, data and other human rights aspects broadly construed.
(B) Encourage and facilitate wide public consultation with diverse individuals and groups who will be affected by the technology, as well as community experts, representatives and civil society activists, with a particular emphasis on ethical standards and social norms that speak to rights, freedoms and interests.