From Shadow Profiles to Contact Tracing: Qualitative Research into Consent and Privacy

For many privacy scholars, consent is on life support, if not dead. In July 2020, we held six focus groups in Australia to test this claim by gauging attitudes to consent and privacy, with a spotlight on smartphones. These focus groups included discussion of four case studies: 'shadow profiles', eavesdropping by companies on smartphone users, non-consensual government surveillance of citizens and contact tracing apps developed to combat COVID-19. Our participants expressed concerns about these practices, and they said they valued individual consent and saw it as a key element of privacy protection. However, they also saw the limits of individual consent, saying that the law and the design of digital services have key roles to play. Building on these findings, we argue for a blend of good law, good design and an appreciation that individual consent is still valued and must be fixed rather than discarded, ideally in ways that are also collective. In other words, consent is dead; long live consent.


Introduction
In recent years, privacy has emerged as a critical social and policy issue. In 2013, Edward Snowden revealed the widespread surveillance of US citizens by US government agencies. In 2014, Shoshana Zuboff invoked the phrase 'surveillance capitalism' to denote 'an extractive variant of information capitalism', 1 before later arguing that internet companies have been effectively granted 'a license to steal human experience and render it as proprietary data'. 2 And in 2018, the Cambridge Analytica scandal taught us that the misuse of Facebook data had potentially compromised democratic processes in several countries. 3 These developments made it clear that people's data and privacy need better protection, for the sake of individuals, society and democracy.
Then, in early 2020, came the global COVID-19 pandemic. In a matter of weeks, questions of privacy took on a new complexion as governments released technology to collect data about people's movements and interactions in the attempt to stem the spread of COVID-19. This technology was led by contact tracing apps. Alongside the new technology, laws were passed to safeguard privacy, including spelling out what data could be collected, under which conditions, and what sorts of consent were required. 4 Aspects of the privacy debate that had previously seemed esoteric and remote were suddenly playing out at speed, with life and death at stake. In this way, the pandemic, and the technology developed to thwart it, had the effect of crystallising several privacy and data issues and raising the prospect of potential solutions.
notice and consent to find that 'democratically determined standards and redlines regarding the generation, collection, storage and use of data need our focus more than notice and consent schemes do '. 17 However, the anti-consent position is not unanimous, with some scholars contending that consent should be retained in some form. One position is that consent frameworks are outdated but that the act of providing notice is still a critical aspect of preserving privacy. 18 Despite the reservations noted above, Solove believes that 'privacy self-management should not be abandoned', proposing that the process 'is key to facilitating some autonomy'. 19 Even Barocas and Nissenbaum argue that consent is important but that it should play only a supporting role: when it comes to the collection and analysis of large amounts of data, they suggest that 'rights and obligations' are more important; 20 notice and consent would only come into play when individuals are required to waive these rights and obligations, as is standard practice in scientific and medical research. 21 Ryan Calo goes further still, arguing that we have not experimented enough; he thus calls for experiential and personalised notice processes, including in the form of emerging strategies of 'visceral' notice. 22 Like these scholars, regulators and policymakers are also keen to retain notice and consent. Since coming into effect in 2018, the European Union's General Data Protection Regulation (GDPR) has been the most significant privacy reform of the past decade, with informed consent being one of its key elements. Article 4 prescribes that consent must be 'freely given, specific, informed and [an] unambiguous indication'. 23 Article 7 then prescribes that 'the request for consent shall be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language'. 
Article 7 also requires companies to make it just 'as easy to withdraw as to give consent'. Meanwhile, the California Consumer Privacy Act mandates that people have a 'meaningful understanding' of how their data is used and have the right to opt out of the sale of their personal information. 24 In Australia, notice and consent are also central to the way the law addresses data issues. In October 2020, the Australian Attorney-General's Department released an Issues Paper signalling the start of a major privacy law reform process, in which 23 of the 68 questions raised for comment concerned notice and consent. 25 Regulators and courts have also been active in policing breaches of consent provisions. In 2019, France's data privacy body, the Commission Nationale de l'Informatique et des Libertés, fined Google €50 million because of a flawed consent process for new Android users. The Commission found that 'essential information ... was excessively disseminated over several documents', opt-outs were not pre-ticked by default and people were asked to give broad consent, all of which contravene the GDPR. 26 More recently, in 2019 and 2020, the Australian competition regulator, the Australian Competition and Consumer Commission (ACCC), launched three legal actions, two against Google and one against Facebook, for allegedly misleading people about the consent given for data collection and processing. One case concerns Google's DoubleClick. In 2016, Google combined data from people using its products with data from its advertising service DoubleClick to provide better advertising services. 27 The ACCC alleges that Google did not obtain valid consent from consumers because 'consumers could not have properly understood the changes Google was making nor how their data would be used, and so did not - and could not - give informed consent'. 28

In another case, the ACCC scored a partial victory in mid-2020 when the Federal Court held that Google had misled Australians in its collection of personal location data. 29 Individuals are litigating too. In early 2021, a class action brought against Facebook in the United States under Illinois's stringent biometric privacy law was finally settled, with Facebook ordered to pay US$650 million for not obtaining the requisite user consent in its use of facial recognition and tagging technology. 30 These developments reveal that governments, regulators and courts are actively working to improve the consent process rather than do away with it. 31 This is in line with privacy scholars keen to retain certain elements of the notice and consent process. In this vein, one of the authors of this paper has contributed by applying a Kantian framework, which begins with a robust conception of individual consent. First, 'if consent is required because a morally significant privacy issue is at stake, if there is competence to consent, and if there are no relevant conditions that are ethically required or forbidden, then actual individual consent must be obtained'. 32

This approach places the onus on the data collector to secure appropriate consent, one that keeps in mind people's 'strengths and weaknesses' and ensures that consent is an 'ongoing process'. The proposed approach is, thus, less caveat emptor, and more caveat venditor. 33 Such a robust conception of individual consent, it is argued, must then be supported and sometimes limited by the law. 34 This proposed approach is compatible with the findings of our focus groups, as described below.

17 Bietti, "Consent as a Free Pass," 392. 18 Susser, "Notice After Notice-and-Consent." 19 Solove, "Introduction," 1899. 20 Barocas, "Big Data's End Run around Anonymity and Consent," 54. 21 Barocas, "Big Data's End Run around Anonymity and Consent," 66. 22 Calo, "Against Notice Skepticism in Privacy (and Elsewhere)." 23 General Data Protection Regulation 2016/679 (EU). 24 O'Connor, "(Un)clear and (In)conspicuous." 25 Attorney-General's Department, "Privacy Act Review." 26 Dillet, "French." 27 Clark, "Google's Ad-Tracking." 28 ACCC, "Correction"; Zhou, "ACCC Sues." 29 ACCC, "Google Misled." 30 Channick, "Nearly 1.6 Million." 31 Tene, "The Draft EU General Data Protection Regulation."

a. Methods
To test our hypothesis, we conducted six focus groups in Australia during July 2020. 35 We deployed a 'co-design' method, which is a 'design-led process that uses creative participatory methods' 36 and that differs markedly from the more traditional behavioural or survey-based studies found in the literature. 37 The method allowed us to involve participants 'in the design process, with the idea that this will ultimately lead to improvements and innovation'. 38 In practical terms, it meant that we did not just ask our participants what they thought about consent but also asked them to imagine what consent could look like in an ideal world.
Using local Facebook groups, we recruited 26 participants: 15 from Sydney (metropolitan) and 11 from Coffs Harbour (regional). We then held six two-hour focus groups, each with three to five participants. As far as possible, we deliberately selected participants to ensure a diverse sample. Of our participants, 11 identified as male and 15 as female. The participants' ages ranged from 19 to 65 years. The majority were under 35 years of age, but there were also several middle-aged participants and a small group of retirees. Throughout this report, we use pseudonyms for our participants. As is common in qualitative research, we do not claim that this sample is representative of Australians generally, or even of views in these two cities. Instead, these focus groups allow us to capture opinions and beliefs at a granular level, complementing the wider representative surveys conducted by the Office of the Australian Information Commissioner.
As our research was conducted during the COVID-19 outbreak, we held focus groups on the Zoom video-conferencing platform. Focus groups were divided into discrete thematic sections and semi-directed, with most focus groups consisting of open discussions. At some points, participants were asked to use Google Jamboard (a collaborative software tool) to share perspectives; at other points, participants were asked to respond to smartphone screenshots featuring various examples of notice and consent. This final method allowed us to mimic the standard user flows that people engage with when using smartphones and signing up to apps. To analyse our findings, we conducted an inductive thematic analysis. 39 We collated the data from the Google Jamboard responses into a Microsoft Excel spreadsheet, then clustered and colour-coded comments thematically. Coding the interview transcripts involved a similar approach in which relevant quotes were grouped together to identify themes. In this way, a range of key themes emerged around governance, design and consent. Although not every view presented in the focus groups could be included, the quotes selected for this paper provide a fair representation of the tone and the views that prevailed.
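As an illustration only, the clustering step of this thematic analysis can be sketched in code. The actual analysis was performed manually in a spreadsheet; the codes and quote fragments below are hypothetical stand-ins for the coded data.

```python
from collections import defaultdict

# Hypothetical coded excerpts: (assigned code, quote) pairs, as produced
# by a manual first pass over the Jamboard responses and transcripts.
coded_quotes = [
    ("law", "consent has to be policy-based rather than individual-based"),
    ("design", "videos and graphs are more catchy"),
    ("consent", "unless it's relevant, it's a big fat no from me"),
    ("design", "make people scroll through a summary before accepting"),
    ("law", "more of a legislative thing rather than an individual thing"),
]

# Cluster coded quotes into themes (here, one theme per code for simplicity);
# this mirrors the colour-coding and grouping step described above.
themes = defaultdict(list)
for code, quote in coded_quotes:
    themes[code].append(quote)

for theme, quotes in sorted(themes.items()):
    print(f"{theme}: {len(quotes)} quote(s)")
```

In practice, inductive coding is iterative: codes are revised and merged as themes emerge, rather than fixed in advance as in this sketch.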
Participants proved to be largely critical of current efforts to protect their privacy. Given that representative surveys have revealed that most Australians consider social media sites and smartphones to pose major privacy risks, this strong agreement across the sample should not come as a surprise. 40 The coherence in responses across participants also meant that we reached saturation with our original recruitment drive.

32 Molitorisz, Net Privacy, 216. To give an example of 'ethically required', someone's emails may justifiably be read without their consent to save that person's life, or to save the life of others, such as if that person is reasonably suspected of conspiring to murder. To give an example of 'ethically forbidden', some images, such as sexual images of a minor, may not be shared or accessed even with consent. 33 Molitorisz, Net Privacy, 220-223. 34 In fact, this approach prescribes a two-tier model of consent, in which individual consent is overarched by the law, which is characterised as 'collective consent'. Molitorisz, Net Privacy, 224-237. 35 For more on the methods employed, see Molitorisz, The Consent Trap, 6-7. 36 McKercher, "Beyond Sticky Notes," 15. 37 See Obar, "The Biggest Lie on the Internet"; McDonald, "The Cost of Reading Privacy Policies"; Martin, "Privacy Notices as Tabula Rasa." 38 Burkett, "An Introduction to Co-Design," 3. 39 Charmaz, Constructing Grounded Theory. 40 Office of the Australian Information Commissioner, "2020 Australian Community Attitudes to Privacy Survey."

b. General Findings
Considering the anti-consent literature cited in the previous section, it was somewhat surprising that many participants tried seriously to engage with the informed consent process. Many participants said they genuinely considered the relevance of requests for data. This often meant making a quick calculation about whether the request was in line with the stated purpose of the service or app. As Felicity (Sydney, 34) explained: It depends what the app is and why they need it [the data]. The thing that just came to mind, [was] apps like Beat the Queue. When I'm at work, for example, I have to allow my location just so it brings up the coffee shops near me.
Felicity viewed this as a logical data request, and most of our participants were happy to allow this sort of transactional data exchange. They were also happy to reject requests from inequitable or illogical exchanges. As Iris (Sydney, 45) said, 'unless it's relevant, it's a big fat "no" from me'.
While we know that people mostly do not read terms and conditions, researchers are only just starting to account for smartphones in their analyses. 41 This is significant as these devices have a range of different affordances, from smaller screens to more pop-ups, that affect the consent process. We also know that more and more data is collected as smartphone technology advances, with people now able to unlock their phones with their faces and use them as repositories for health data. Overwhelmingly and unsurprisingly, people agreed that notice and consent was, in practice, a failing model. Sally (Coffs Harbour, 36) complained that despite some improvements, the 'permissions [were] too much', and the fact that 'they need permission for everything' annoyed her.
Crucially, however, our participants did not think that informed consent was unfixable. They even offered iterative solutions. They wanted companies to be clearer about how their data was being used and said that more was needed than a standard contract. For example, Maddie (Sydney, 35-40) said that videos, graphs and pictures were 'more catchy, and you [the user] want to focus on that'. Our participants recognised the complexity of the current data economy but still wanted to be part of the conversation about solutions.
To this end, participants also explored potential design solutions. One group of solutions focused on ensuring that people actually read the terms and conditions. Many participants noted that sign-up screens often do not force you to look at the terms and conditions before signing up. Suggested options included having to type in an answer or tracking whether or not people had opened the terms and conditions (Rosie, Coffs Harbour, 42), requiring people to 'tick a box' (Uma, Coffs Harbour, 46) or making people 'scroll through a summary' of terms and conditions before accepting them (Vincent, Coffs Harbour, 19). Another group of solutions suggested moving towards granular or personalised consent. One suggestion was that people should be able to use some of the elements of an app if they agree to certain aspects of the terms and conditions (Felicity, Sydney, 34). Indeed, several participants were concerned about the 'all or nothing' approach of most terms and conditions and suggested that people should have more options. As Beth (Sydney, 47) said, 'an option to withdraw it or change the conditions would be useful'. Other participants said that it was hard to agree to terms about the use of a service when they had not used it before. They suggested a preliminary period where people could have 'the option to use once' (Ellie, Sydney, 19). This 'try before you buy' consent model would potentially give people a better sense of the value they might get out of an app before committing to signing up.
Of course, our participants also recognised that there were genuine limitations to the individual consent process. There was a clear sense that while consent should be retained at certain points, the law needed to step in. People wanted more assistance from the government and regulators so they could take more responsibility for their privacy. Participants were particularly focused on transparency and wanted to know more about the companies collecting their data. However, there was also a clear expectation that the government and regulators would step in at critical points to protect citizens. Our participants did not come to a consensus about the precise role of citizens vis-à-vis governments, but it was clear to them that, in many cases, consent had 'to be policy-based rather than individual-based' (Rosie, Coffs Harbour, 42) and 'more of a legislative thing rather than an individual thing' (Sally, Coffs Harbour, 36).

a. Shadow Profiles
In our desktop research and focus groups, we concentrated on four case studies that present particular challenges for issues of consent and privacy. One of these is the 'shadow profile', by which we mean digital platforms building profiles of individuals who do not use that platform. Research shows that shadow profiles are possible. 42 In a 2019 paper, James P. Bagrow et al. used 'information-theoretic estimators to study information and information flow' on Twitter to show that '95% of the potential predictive accuracy for an individual is achievable using their social ties only, without requiring that individual's data'. 43 Working with a dataset of nearly 14,000 users, the researchers found that as few as 8-9 of an individual's contacts are enough to reveal that individual, without that individual needing to share anything at all. This is largely due to the phenomenon of 'homophily', which describes how people tend to socialise with like-minded others. The researchers concluded that 'information is so strongly embedded in a social network that in principle one can profile an individual from their available social ties even when the individual forgoes the platform completely'. 44 Other studies have reached the same conclusion. 45 Do companies such as Facebook actually build profiles of non-users? The answer is not clear. In US Congressional hearings in 2018 following the Cambridge Analytica scandal, Facebook's Mark Zuckerberg was pressed on the issue. He denied knowledge of 'shadow profiles', saying, 'I'm not familiar with that', but admitted that Facebook collects some data on non-users so that it can keep its users safe. 46 While researchers have repeatedly demonstrated that shadow profiles are possible and journalists have shed some light on the practice, the extent of the practice is hard to gauge. 47 To raise the issue of shadow profiles in our focus groups, we asked participants to imagine five friends, four of whom are on a social network.
The fifth, by contrast, does not want a social media presence. However, that fifth person can still be revealed thanks to the homophily described above, as well as other means. When presented with this scenario, most participants objected strongly, using words such as 'unacceptable', 'abhorrent', 'unethical' and 'illegal'. 'That would be like me spying on my neighbour and keeping a diary and photos of them', said Rosie (Coffs Harbour, 42). 'You'd be so furious if a person was doing that to you, so for a company to be doing that without your knowledge is just appalling.' As Aaron (Sydney, 28) said, 'Why should that organisation get to use your information for their gain? That's where it's a kind of a theft in a way to me. I find that really unethical'. Wendy (Coffs Harbour, 19) said that people's choice not to use a service should be respected: 'If you haven't got that social media, it's obviously for a reason. You don't trust it or something'. However, one participant felt resigned. Karl (Sydney, 62) said: I think we have come into that era now. Digital marketing, digital profiling, these things are happening, and we have to move with the world. We cannot go back 20, 30 years and get the sales journal, purchases journal, ledgers; those things are gone. We have to move with the time. It is reality. You have to suck it up, whether you like it or not. You just have to live in this world.
Karl was alone in saying people needed to accept the practice. However, a number of others did point out that the practice would be very hard to eliminate. As Yves (Coffs Harbour, 52) said: Obviously, it's unethical. But in this day and age of everyone having a camera in their pocket, 24/7, and by a majority of people using one social network or another, as unethical and abhorrent as it may seem, to put in place rules and laws to force those companies to not store that information, is a big job. Implicit in Yves's response is that 'rules and laws' do have a job to perform here, even if it is 'a big job'.
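The homophily effect that underpins shadow profiles can be illustrated with a deliberately simple sketch. Bagrow et al. used information-theoretic estimators; the toy code below instead uses a crude majority vote among a non-user's contacts, with entirely hypothetical data, purely to show how inference from social ties alone can work.

```python
from collections import Counter

# Hypothetical network: the 'fifth friend' has no profile, but their four
# contacts do. Homophily suggests the fifth friend's interest will tend to
# match that of their social ties.
contacts_interests = {
    "friend_1": "cycling",
    "friend_2": "cycling",
    "friend_3": "gardening",
    "friend_4": "cycling",
}

def predict_from_ties(interests: dict) -> str:
    """Majority-vote prediction for a non-user, using only their ties' data."""
    counts = Counter(interests.values())
    return counts.most_common(1)[0][0]

print(predict_from_ties(contacts_interests))  # predicts 'cycling'
```

The point is that the fifth friend shared nothing at all: every input to the prediction came from other people's disclosures, which is precisely why individual consent cannot reach this practice.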

b. Eavesdropping
Another major challenge for informed consent concerns tech companies listening in, or 'eavesdropping', on people's conversations. Anecdotal evidence of such practices is widespread. 48 As with shadow profiles, the available literature has revealed that eavesdropping is technically possible. One 2020 paper focused on eavesdropping by accelerometers, the sensors in a smartphone that measure how that phone is vibrating or changing in motion. The researchers did not show that accelerometers do eavesdrop, but that they can, by demonstrating that accelerometers can capture the audio of adult speech. 49 Much of the literature concerns security risks, such as the risk of hacking and malware attacks by third parties, rather than the inadequacy of consent/permissions, or the extent to which such eavesdropping does, in fact, occur.

42 Sarigol, "Online Privacy as a Collective Phenomenon"; Garcia, "Leaking Privacy and Shadow Profiles in Online Social Networks." 43 Bagrow, "Information Flow Reveals Prediction Limits in Online Social Activity." 44 Bagrow, "Information Flow Reveals Prediction Limits in Online Social Activity." 45 Garcia, "Privacy Beyond the Individual"; Garcia, "Collective Aspects of Privacy in the Twitter Social Network." 46 Molitorisz, Net Privacy, 53-55. 47 Hill, "How Facebook." 48 BBC News, "Is Your Phone?"
Several digital platforms have admitted to some form of eavesdropping. In 2019, Facebook admitted paying contractors to transcribe audio clips from users of its services; 50 also in 2019, following adverse reports, Google, Apple and Amazon all admitted to winding back their practice of having people analyse recordings captured by voice assistants. 51 That same year, one academic study concluded that 'we cannot rule out the possibility of sophisticated large-scale eavesdropping attacks being successful and remaining undetected'. 52 In our focus groups, many participants were deeply concerned that their digital devices were listening in without consent. Iris (Sydney, 45) said she was: ... disturbed about Google hearing everything that we say and listening to our phones. I was with a group of friends a couple of weeks ago, and we were talking about placement of ads. And we talked about BMW, and then a couple of days later, one of the women said, "Oh look, BMW popped up on Facebook on my phone." Even if your phone's not in use, if you've got the Hey Google activated, it's basically listening to you, which is a little disturbing to me. That's definitely not consensual.
Felicity (Sydney, 34) agreed: It's disturbing. I've been in that scenario, copious amounts of times, having conversations, and you'd be scrolling on Facebook or Instagram or something, and it'll just pop up as an ad, what we've been talking about. It's inappropriate, unethical, and it's also scary.

c. Government Surveillance
The past decade has also revealed that some governments are engaged in widespread non-consensual surveillance of their citizens. In the United States, extensive practices of domestic surveillance by the National Security Agency (NSA) and other agencies were exposed by Edward Snowden in 2013. 53 Some of these practices were subsequently declared unconstitutional or illegal. 54 In other countries, there have been no analogous whistle-blowers, so less is known about whether such practices occur. Government surveillance can take different forms. On the one hand, a surveillance technique can be surreptitious, widespread and unauthorised by law, as was the case with some of the NSA's practices. On the other hand, a surveillance technique can be known, targeted and authorised by law. In our focus groups, we did not try to pinpoint exactly which practices the participants found unacceptable; rather, we initiated open-ended discussions to gauge our participants' general responses.
Several participants said that sometimes the government would be justified in non-consensually accessing photos, emails or other personal data, as in cases of national security, but that such access would only be justified with a warrant and only in extreme circumstances. As Sally (Coffs Harbour, 36) said, 'If they have a warrant, if they've gone through the legal channels and there's evidence that says I might be about to blow something up ... They need to do it in that way'. This prompted several replies:

Rosie (Coffs Harbour, 42): 'You'd have to be planning something pretty horrendous.'
Patrick (Coffs Harbour, 54): 'If it was national security, I'd be okay with it.'
Sally (Coffs Harbour, 36): 'If it was just trying to find out who votes left or right or whatever, then I would have a problem with that.'
Uma (Coffs Harbour, 46) said that it's a fine line. If a government accesses a person's email, they might find that the person has been doing paid work but not declaring it for income tax purposes. However, they might also find that the person has been having an affair. As Uma said, 'That's none of the government's business.' Beth (Sydney, 47) noted that if the government can non-consensually access information about you, then it could use that information maliciously without you ever knowing. This would mean that you have no ability to appeal: 'You've lost opportunities or you're suffering some sort of penalty, without any ...'. Some participants were fatalistic about non-consensual government surveillance. 'I think we don't really have a choice when we talk about a government agency,' said Karl (Sydney, 62). 'Regardless if we like it or not, they still have our data. So, I cannot really comment if it's fair or not, because I think there's no choice at all.' However, most participants were adamant that governments should not be able to surreptitiously access all our correspondence and data. As Xavier (Coffs Harbour, 50) said, 'There are countries in this world that do operate that way. We don't want to be ... I don't want to be there'.

49 Ba, "Accelerometer-Based Smartphone Eavesdropping." 50 Frier, "Facebook Paid." 51 Metz, "Yes, Tech."

d. Contact Tracing Technology
The issue of government surveillance was suddenly cast in a fresh light following the outbreak of COVID-19, which was declared a global health emergency by the World Health Organization in January 2020. 55 To combat the global pandemic, many governments released contact tracing apps. 56 While contact tracing has traditionally been conducted by interviewing infectious people about their contacts and movements, contact tracing apps automate the process, relying on Bluetooth and other technology to collect personal data. In a sense, the boundaries of individual privacy are being redrawn in the interests of public health, with people being asked to make 'trade-offs' by relinquishing some privacy to save lives. 57 In countries including Australia, these apps are supported by legislation that attempts to minimise the risk of privacy harms.
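At a high level, the Bluetooth mechanism behind many of these apps can be sketched as an exchange of short-lived random identifiers. The sketch below is a simplified illustration of the decentralised variant, in which exposure matching happens on the device; it is not the protocol of COVIDSafe or any particular app, and all names in it are hypothetical.

```python
import secrets

class Phone:
    """Simplified contact tracing device: broadcasts rotating ephemeral IDs
    and logs the IDs it hears from nearby devices."""
    def __init__(self):
        self.my_ids = []      # IDs this phone has broadcast
        self.heard_ids = []   # IDs heard from nearby phones

    def new_ephemeral_id(self) -> str:
        eid = secrets.token_hex(8)  # fresh random ID, rotated regularly
        self.my_ids.append(eid)
        return eid

    def hear(self, eid: str):
        self.heard_ids.append(eid)

def exchange(a: Phone, b: Phone):
    # Two phones within Bluetooth range swap their current ephemeral IDs.
    b.hear(a.new_ephemeral_id())
    a.hear(b.new_ephemeral_id())

alice, bob = Phone(), Phone()
exchange(alice, bob)

# If Alice later tests positive, her broadcast IDs are published; Bob's
# phone checks locally whether any of them appear in his heard_ids.
exposed = any(eid in bob.heard_ids for eid in alice.my_ids)
print(exposed)  # True: Bob was in contact with Alice
```

The privacy-relevant design choice is where this matching occurs: in a decentralised scheme the comparison runs on each phone, whereas a centralised scheme, such as the one COVIDSafe inherited from TraceTogether, uploads contact data to a government-held server.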
Internationally, governments have adopted a variety of contact tracing technologies, which 'are generally not mandatory and work on an opt-in basis'. 58 To this end, they tend to rely on notice and consent. However, researchers have also found that the consent mechanisms are imperfect. One study analysed factors such as the word count and word complexity of the privacy policies of seven contact tracing apps, finding that those policies required a reading ability considerably higher than that of the average adult. 59 In July 2020, a comprehensive review of contact tracing apps and their attributes specifically identified the difficulty of 'consent withdrawal' as a concern for app users; it also found that app adoption rates increase significantly with greater transparency and legislative guarantees against data misuse. 60 One specific concern among researchers involves the role played by private companies in these apps, which can increase the risk of illegal data collection and sharing. 61 An attendant privacy issue concerns whether data is stored on a centralised database or in a decentralised manner. 62 Several scholars have also expressed concern about the normalisation of this type of surveillance. 63 In Australia, the Federal Government launched the COVIDSafe contact tracing app in April 2020. It was based on Singapore's TraceTogether app, which relies on Bluetooth technology, Amazon Web Services and a centralised server. 64 In May, Australia's Privacy Act was amended specifically to protect data collected by the app (see discussion below). 65 In our focus groups, more than half the participants had not downloaded the app because they had concerns about the app's privacy or efficacy. However, for Beth (Sydney, 47), public health trumped privacy concerns in this case. When we walked participants through the consent process by showing them smartphone screenshots, they were generally positive.
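Readability analyses of the kind cited above typically compute scores such as Flesch Reading Ease from sentence length and syllable counts. The sketch below is illustrative only: it uses a crude vowel-group heuristic for syllables rather than the cited study's method, and the two sample texts are invented.

```python
import re

def count_syllables(word: str) -> int:
    """Crude heuristic: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Flesch Reading Ease: higher scores indicate easier text.
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

simple = "We use your data. We keep it safe."
legalese = ("Notwithstanding the foregoing, the licensee irrevocably "
            "authorises the aggregation, dissemination and monetisation "
            "of personally identifiable information.")
print(flesch_reading_ease(simple) > flesch_reading_ease(legalese))  # True
```

A policy written in the second style scores far lower, which is the quantitative core of the finding that these documents demand reading ability well above that of the average adult.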
Overall, despite some reservations along the lines of those detailed above, our participants viewed the Australian app as an improvement on corporate terms and conditions. This was because participants felt they had a choice about whether to participate and were being presented with a relatively comprehensive and comprehensible consent process. They were also pleased that associated legislative provisions had been passed to protect their personal data. One issue of concern, however, was that even those who had downloaded the app had limited knowledge of key information, including the existence of the legislation and the fact that Amazon Web Services was providing the infrastructure for the app.

Discussion: A Tripartite Response
The participants in our focus groups confirmed what the academic literature reveals: there is an intractable problem when it comes to consent and data. In addition, the case studies discussed in this article show that there are clearly situations where the process of notice and consent is particularly fraught. However, just like our focus group participants, we think there is scope to improve the mechanism of notice and consent. Indeed, we propose that consent remains core to the protection of privacy but needs to be construed in fresh terms. Current approaches focus on the individual. Given the relational and collective nature of our data and, indeed, our lives, a better approach recognises that individual consent often requires the support of good design and good law. This is the model that emerged most clearly from our consideration of contact tracing apps, which sought to balance individual consent with legal protections and clear design. We will first clarify the role of law and design, before returning finally to the role of consent.

a. The Law
As our research shows, there are clear instances where individual consent is inadequate and legal protections are needed. This is most obvious with shadow profiles and eavesdropping. Both practices are ethically suspect: it is hard to imagine a clearer failure of individual consent than shadow profiles; and if people have, in fact, consented to eavesdropping, they have done so unwittingly. Our participants were outraged by these practices and wanted the law to protect their privacy more effectively. Two examples stand out. First, privacy laws ought to rigorously restrict, if not outlaw, shadow profiles and eavesdropping, as there is no meaningful way for people to consent to these experiences, even if companies attempt to provide notice. This is clearly what most of our focus group participants wanted. Second, participants had strong views about the circumstances in which government surveillance of citizens is acceptable and unacceptable. 66 Good law, however, must be accompanied by effective enforcement, and there is certainly scope to enforce existing laws more vigilantly. As noted above, Australia's competition watchdog (the ACCC) has launched three lawsuits (and at the time of writing has scored one victory) in a 'world-first enforcement action' against Google and Facebook for the alleged non-consensual collection of data under longstanding prohibitions on misleading and deceptive conduct. 67 In early 2021, in the United States, a judgment was finalised in a class action under the biometric privacy law of Illinois, with Facebook ordered to pay US$650 million for its non-consensual use of facial recognition and tagging technology. 68 Meanwhile, in Norway in 2021, Grindr was fined an estimated 10% of its global revenue for not 'gaining a valid consent' from users to sell data to advertisers. 69
We do not view this reliance on law as a move away from consent. We regard law as a space where societies can address issues that extend beyond the individual to establish collectively agreed boundaries through democratic mechanisms. The law is, to quote Immanuel Kant, an expression of the 'united will of the people'. 70 It is the law that draws the line on behalf of individuals, particularly in cases where the individual is powerless to draw the line themselves. Often the united will of the law works to support individual consent, as is evident in the cases we have just described. It is also evident in the GDPR, which stipulates (in Article 4) that individuals must actively consent to the sharing of their data, meaning that a pre-ticked box is legally insufficient. (This leads to the larger point that the default position for users ought to be privacy, not sharing.) Further, the right to erasure contained in the GDPR (Article 17) enables people to request that, for instance, specific links be removed from results returned by a search engine in certain circumscribed conditions. Meanwhile, facial recognition software has been banned in various jurisdictions, including San Francisco, given the inaccuracy and inequity of the technology but also the way it can be used for unforeseen and non-consensual purposes. 71 We support each of these provisions (among others) and hope they are more widely adopted.
The key role of law becomes yet more apparent when we recognise the collective nature of privacy. As researchers are increasingly demonstrating, privacy transcends the individual. 72 This is particularly evident with developments such as shadow profiles: as detailed above, a social network can build a profile of a person who never uses that network simply by accessing information from other sources (including offline sources) and by inferring information about someone. This prompted researchers to coin the phrase 'privacy leak factor', 73 and led others to begin to conceptualise the notions of 'collective privacy', 'networked privacy' and 'relational privacy'. 74 The issue of inferred data sits right at the heart of notions of collective/networked/relational privacy and is one of the most vexing issues confronting data regulators. When should inferred data count as personal data? What sort of protections should it attract? When should it be outlawed? 75 It is beyond the scope of this paper to address these questions. However, we propose that the best hope for a just response to the challenges of collective privacy generally and inferred data more specifically is likely to emerge in the shape of good law.
Fundamentally, the law can address problems with informed consent: it can specify when individual consent is required, detail what elements are needed for individual consent to be legally valid and provide top-down regulations when individual consent cannot do the work. The GDPR provides a good template. Admittedly, its implementation has not been perfect, with website operators often following the letter rather than the spirit of the law. As Bornschein et al. note, 'Most cookie notices are hardly visible and/or do not offer consumers a choice regarding information collection practices'. 76 This reinforces our point: law and regulation can only be effective when effectively enforced.

b. The Supporting Role of Design
Consent must be understood as more than just the written terms of agreement; it also encompasses people's experience of those terms. In other words, we need to ask: How are these terms presented to individuals? How are consent mechanisms designed?
In our focus groups, participants noted many instances where design fell short, including limited/persuasive buttons or menu options, small typefaces, simplistic text and a lack of consistency in presentation across apps and platforms, all of which led participants to feel a lack of control when consenting. 77 Our participants consistently wanted simplicity and clarity in design. Natasha (Sydney, 30) said that 'being able to explain complex things in simple terms is an art form', adding that 'it's doable, and if you can't do it then find someone who can'. Olive (Sydney, 25) previously worked in the health sector, where her professional experience with informed consent convinced her that 'there was definitely a way to present complicated information in a way that is simple and understandable'. As she said, informed consent was 'not something that you can just skim through' and it should be the same with the tech sector. She reiterated a powerful point made by several participants: tech companies could make consent work if they wanted to.
When options are not presented clearly and dispassionately, the risk of manipulation or deception is high. 78 Manipulative design choices are called 'dark patterns', a term coined by user experience (UX) and user interface (UI) practitioner Harry Brignull, who compiled a list of established design patterns and user behaviours that are employed within an interface design to manipulate or deceive users into agreeing to particular terms and conditions. 79 Subsequently, Colin M. Gray et al. developed a comprehensive framework to articulate the various UX patterns that can be deployed to persuade users to engage in specific behaviours; these patterns include 'nagging', 'obstruction', 'sneaking', 'interface interference' and 'forced action'. 80 Representing a subversion of user-centred design principles, these techniques see designers use knowledge about users (and society more broadly) against them. 81 While these techniques are not necessarily intended to be 'dark', they 'have the potential to produce poor user outcomes, or force users to interact in ways that are out of alignment with their goals'. 82 In the absence of adequate regulatory standards, UX and UI designers lack appropriate guidance on best practice when it comes to informed consent. Recent research by Cherie Lacey et al. exploring the privacy decision-making processes of designers revealed that design choices involving privacy were often 'like the Wild West': 83 (1) designers feel motivated to act ethically by their 'moral compasses'; (2) designers are restricted in their ability to act ethically by commercial pressures and a limited purview of the project; and (3) designers' understanding of the ethics of their practice does not currently match determinations made by international privacy and design scholars, and demonstrates a limited understanding of how user behaviour can be shaped, which, in turn, obfuscates beneficial privacy outcomes for users. 84

As awareness of these issues grows, the 'privacy by design' movement is gaining momentum. 85 This involves the recognition that protecting privacy properly means coding it into the very architecture of digital services and platforms. Rather than a mere afterthought or add-on, privacy must be embedded in software and hardware design. One innovation here is Google's Federated Learning of Cohorts, which aims to replace data tracking by cookies with a system that is less privacy invasive. 86 Design and law can work together, as they do in the GDPR's prescription for 'privacy by design and by default'. 87 Overall, stronger collaboration between the design community and lawmakers is required so that UX and UI design standards implement and mandate privacy by design strategies, including by prioritising consent at the development stage, rather than on an ad hoc basis or retrospectively. Considerable work has already been done here: the World Wide Web Consortium (W3C) develops design and accessibility standards for the web and fosters the development of privacy by design for web standards. 88 This shows how design, law and consent can complement one another. To give users more autonomy and agency around consent, designers need to develop experiential and personalised notice processes that build in ongoing opportunities for users to confirm or retract consent. This, in turn, needs to be combined with good law and robust enforcement provisions to ensure that these standards are maintained and upheld.

c. Fixing Individual Consent
The participants in our focus groups tried hard to make careful judgments about their data. For instance, they were unlikely to accept consent requests that were seen as irrelevant for the service they were seeking to access. Clearly, people still highly value individual consent, even as they recognise that consent mechanisms often fail them. What's more, companies and governments seem to be working on the assumption that individual consent can be fixed. In response to Apple's plans to introduce a notification asking iOS users whether or not they want to be tracked by apps, Facebook objected that this would cost it revenue from personalised advertising. 89 Facebook's fears were seemingly based on a presumption that millions of people would choose privacy; Facebook's concern, in other words, was that Apple was giving people the genuine opportunity to withhold consent. In 2020, consent also played a central role in the development of Australia's contact tracing app. In April, the COVIDSafe app was launched; in May, the Privacy Act was amended specifically to protect the data it collected, including by creating a series of serious offences for unauthorised access, use or disclosure of data collected. 90 The consent mechanisms, and the law underpinning them, were not perfect. 91 However, privacy experts had reason to be positive about protections contained in the legislation, 92 just as our participants were positive about the app's design. Let's hope this interplay of consent, law and design is a portent of things to come.
Yes, there are serious limits to the efficacy of individual consent. Many of these stem from its focus on the individual. The complexity of online data flows means that no individual can hope to monitor all relevant data, let alone the inferences that might be drawn from that data. Indeed, emerging work in law and philosophy has challenged the normative basis upon which individual approaches to data governance are founded, arguing that this basis is no longer viable in a world where data is increasingly used as an economic resource. 93 The advent of machine learning and algorithmic governance only increases the challenges facing consent, with emerging harms from predictive decision-making likely to be not only significant but inequitably distributed. 94 In response, scholars have proposed novel solutions that include collective data governance mechanisms, such as data trusts and data cooperatives, which allow large groups of persons to have their interests represented by an intermediary body. 95 This line of thinking is encouraging: perhaps in time consent can be re-imagined and implemented in a way that is less individualistic, more collective and, at the same time, more effective. We hope so. For now, at least, individual consent has a key role to play. And that's unequivocally what our participants want. Despite its limitations, consent needs fixing, not discarding.

Conclusion
Our focus groups showed that people value consent but also recognise that major work is needed to improve the process in practice. As focus group participant Maddie (Sydney, 35-40) said, 'Consent is a trap … but it's still useful. It's a tool somehow to protect ourselves as well. If it can be made more simple, that's better. But now it's better than nothing'. The limitations of consent became particularly apparent from our discussion of shadow profiles, eavesdropping and government surveillance, which confirmed systemic problems around data collection and processing. In response, the academic literature has proposed innovative solutions such as the articulation of forms of consent that are less individualistic and more collective (e.g., data trusts). We encourage these developments. More immediately, however, there is a pressing need to fix individual consent, where considerable scope for improvement and innovation exists. And there is a complementary and similarly pressing need to recognise and improve the role played by law and the role played by design. With law, for instance, one oft-overlooked focus is enforcement. Recent activity in multiple jurisdictions has shown that existing laws can be applied to protect privacy (often by policing consent), as long as there is regulatory capacity and an appetite to take on offending parties. Our approach thus combines three elements: the key prescriptions set by the law; the supporting role of design; and the core component of individual consent. Each of these elements demands attention if our consent is to protect us and not entrap us.