On 10 July, the Human Rights, Big Data, and Technology Project (HRBDT) co-hosted an event in London with the Law Society of England and Wales, entitled ‘Free, informed and unambiguous consent in the digital age: fiction or possibility?’. The event focused on the interpretation of consent, and on whether consent can be considered an adequate and meaningful safeguard for individuals’ control over the collection, retention, or processing of their personal data.

Tony Fisher, Chair of the Human Rights Committee of the Law Society, opened the event by introducing the speakers and emphasising the significance and timeliness of the topic following the Cambridge Analytica revelations and the entry into force of the General Data Protection Regulation (GDPR).

Professor Sheldon Leader (Co-Investigator on the HRBDT Project) was the first panellist to speak. He introduced the issues concerning consent online by first identifying how consent fits into human rights guarantees. Citing Article 8(2) of the Charter of Fundamental Rights of the European Union, he noted that data “can be processed on the basis of the consent of the person concerned or some other legitimate basis laid down by law” and that “consent is recognised as an essential aspect of the fundamental right to the protection of data.”

He then moved from the meaning to the functions of consent, explaining that it can serve as: (1) a device for controlling risks, and (2) a device for governance. These two functions can be found within the GDPR, which sets out that:

‘Consent’ of the data subject means any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her. (Article 4(11), GDPR)

Sheldon Leader expanded on this by identifying the core elements of consent, which are that it is “freely given”, “specific”, “informed”, and “unambiguous”. Following consideration of these four elements, he focused on one core problem, namely the imbalance of power and knowledge between individuals and institutions asking for their consent, and the extent to which this imbalance vitiates the four elements that make consent by the data subject meaningful.

He suggested that a potential solution might be found by drawing on other areas of law and policy in which collective representation underpinning collective agreements is deployed by bodies such as trade unions – organisations potentially possessing greater expertise and bargaining strength than individuals. Drawing from employment and labour law, as well as examples from other fields, he suggested ways in which such models could be translated to the digital world.

The second speaker at the event was Dr Chris Fox (Co-Investigator on the HRBDT Project), who presented on big data and contemporary concerns about consent. He first posed some questions to the audience that illustrate the complexity of the issues around consent. These included:

  • What is being consented to?
  • What kind of data is concerned?
  • How is consent managed?
  • What are reasonable exceptions to consent?

Chris Fox covered the different consent models currently in use, with particular attention to research-related concerns, focusing on the “opt-in” and “opt-out” models. The models discussed are listed below, and a short sketch modelling them follows the list. All of these models exhibit flaws in consent design, despite the GDPR having come into force.

  • Obligatory opt-in - In this model, use of a service is itself treated as consent. No real choice is offered, such as “Yes, I’d like to opt in” or the alternative “Yes, but only email me once a month”. This is not an adequate choice.
  • Default opt-in - In this model consent is implied: users are opted in to data sharing by default. This model is often paired with confusing opt-out procedures and misleading options.
  • Pay to opt-out - This model allows paying account holders to opt out, but not those who use the service for free.
  • Delegation of opt-out - In this model a site shares data with third parties automatically and requires users to visit each third party to “opt out”, typically requiring an opt-in form to do so.
  • Restricted opt-out - This model allows a user to opt out of being targeted by algorithms, but not out of having their data used, analysed, or shared.
  • Misleading opt-out - As with the “restricted opt-out” model, this model uses misleading wording and usually allows for opt-out of one source of data while other sources can still be used.
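
To make the comparison concrete, the following is a minimal sketch of how these patterns, and the GDPR’s four elements of valid consent, might be modelled. All type and function names are hypothetical, and the element-by-element assessments are illustrative rather than legal conclusions.

```typescript
// Hypothetical model of the flawed consent patterns listed above; the
// names and the assessments are illustrative, not legal conclusions.

type ConsentPattern =
  | "obligatory-opt-in"   // use of the service is itself treated as consent
  | "default-opt-in"      // users are opted in unless they find the opt-out
  | "pay-to-opt-out"      // only paying account holders may opt out
  | "delegated-opt-out"   // users must opt out separately with each third party
  | "restricted-opt-out"  // opting out of targeting, not of data use or sharing
  | "misleading-opt-out"; // opting out of one data source while others remain

// GDPR Article 4(11) requires consent to be freely given, specific,
// informed, and unambiguous.
interface ConsentAssessment {
  freelyGiven: boolean;
  specific: boolean;
  informed: boolean;
  unambiguous: boolean;
}

// Illustrative assessment: an obligatory opt-in offers no real choice, so
// the consent it produces is neither freely given nor unambiguous.
function assess(pattern: ConsentPattern): ConsentAssessment {
  switch (pattern) {
    case "obligatory-opt-in":
      return { freelyGiven: false, specific: false, informed: false, unambiguous: false };
    case "default-opt-in":
      return { freelyGiven: false, specific: true, informed: false, unambiguous: false };
    default:
      // Each remaining pattern undermines at least the "freely given"
      // element in some way; a full mapping is omitted for brevity.
      return { freelyGiven: false, specific: true, informed: true, unambiguous: true };
  }
}

console.log(assess("obligatory-opt-in"));
```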

Chris Fox then proceeded to cover some of the general problems and areas of concern. Among these were apparent breaches of the assumptions and regulations around consent, the effectiveness of enforcement, users’ expectations and understanding, and prioritisation. Many of these problems are connected with ownership and control rights and with the permanence of a decision to consent, where the user is given the opportunity to consent in the first place. Derived information can be particularly problematic because consent may be granted for a specific purpose without the user understanding how sources can be combined to derive “new” information, as the toy example below illustrates.
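
As a toy illustration of that point, with entirely invented data, the snippet below joins two datasets that might each have been collected with consent for a narrow purpose; combining them derives a new fact, namely where a named person was at a given time, that neither consent covered.

```typescript
// Toy illustration with invented data: each dataset on its own may seem
// innocuous, but joining them derives "new" personal information.

const fitnessCheckIns = [
  { deviceId: "dev-42", location: "Riverside Gym", time: "2018-07-10T07:30" },
];

const deviceRegistrations = [
  { deviceId: "dev-42", owner: "A. Example" },
];

// Join the two sources on the shared device identifier.
const derived = fitnessCheckIns.map((checkIn) => ({
  ...checkIn,
  owner: deviceRegistrations.find((r) => r.deviceId === checkIn.deviceId)?.owner,
}));

console.log(derived);
// -> [{ deviceId: "dev-42", location: "Riverside Gym",
//       time: "2018-07-10T07:30", owner: "A. Example" }]
```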

He also spoke about users having no effective choice, as many companies hold de facto monopolies or offer “essential” services, such as those arguably essential to maintaining a social life. The question arose: how do we avoid social or material exclusion while preserving meaningful consent? In addition to the issues listed above, technical challenges, responsibility for compliance, and jurisdictional issues were also discussed.

To conclude his presentation, Chris Fox spoke about the challenges around consent exemptions for the purposes of research and journalism, explaining the ambiguities in the relevant guidance, including the GDPR itself, and the potential for these exemptions to be abused.

Dr Rachel O’Connell (Founder and CEO of TrustElevate) was the final panellist, presenting on age verification and parental consent in the digital age. She began by speaking about regulatory compliance with the GDPR in regard to children, specifically Article 8 of the GDPR. She spoke of the problems of age verification online and presented figures indicating that many people lie about their age. As a proposed solution, she presented the next generation of consent standards now becoming operational, supported by companies ranging from those that facilitate consent withdrawal to those that codify and automate legal documents and consent.

Rachel O’Connell explained how such processes function, from the exchanges between online users, platforms, and content providers, to the role of a “federated attribute provider” and how it operates. When a user connects to a service that requires age checks, the consent provider uses the federated “Age Check” scheme to verify the age of the user through a minimal attribute exchange, with vectors of trust used to achieve this. She explained that while data from school databases contains sensitive information, this information is encoded to ensure that no personal data is provided to third parties.
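
The following is a minimal sketch of what such a minimal attribute exchange might look like; the AttributeProvider interface and all names here are invented for illustration, not drawn from TrustElevate’s actual design. The point is that the relying service receives only a yes/no assertion, never the underlying record.

```typescript
// Hypothetical sketch of a federated age check via minimal attribute
// exchange; interfaces and names are invented for illustration.

interface AgeCheckAssertion {
  meetsAgeRequirement: boolean; // the only attribute released to the service
  vectorOfTrust: string;        // e.g. a confidence level attached to the check
}

interface AttributeProvider {
  // Holds verified records (e.g. derived from school databases) but
  // releases no personal data such as names or dates of birth.
  checkAge(userHandle: string, minimumAge: number): Promise<AgeCheckAssertion>;
}

// A service that requires age checks asks a question rather than
// requesting the underlying data itself.
async function admitUser(
  provider: AttributeProvider,
  userHandle: string,
): Promise<boolean> {
  const assertion = await provider.checkAge(userHandle, 13);
  return assertion.meetsAgeRequirement;
}
```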

She outlined a trust framework that could enable one site to trust the attribute, security, and privacy assurances from another site (the “identity provider”) acting on behalf of a user. This consists of two parts:

  1. The tools: the technical standards and protocols to be implemented by the members of a trust community; and
  2. The rules: the business, legal, and operational policies to be followed in order to achieve the levels of security, privacy, and other trust assurances that the participants in the trust framework desire.

She noted that under the GDPR, data controllers must have a contractual relationship with data processors. The TrustElevate Trust Framework will include the contractual relationships between all parties and will govern how the federated model operates. A federated model works on the premise of “verify once, use many times”, which drives down the costs to businesses of conducting verified parental consent and age checks.
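
One way to read the “verify once, use many times” premise is as a cached assertion: the first check is performed against source records, and subsequent requests from any relying party reuse the stored result instead of repeating the costly verification. A rough sketch, with invented names:

```typescript
// Rough sketch of "verify once, use many times": the first check runs
// against source records; later requests reuse the stored assertion.

const verifiedAssertions = new Map<string, boolean>();

async function verifyOnce(
  userHandle: string,
  expensiveCheck: (handle: string) => Promise<boolean>, // e.g. a records lookup
): Promise<boolean> {
  const cached = verifiedAssertions.get(userHandle);
  if (cached !== undefined) return cached; // reuse: no repeated verification cost

  const result = await expensiveCheck(userHandle); // verify once
  verifiedAssertions.set(userHandle, result);
  return result;
}
```

Each reuse avoids a fresh verification, which is where the claimed cost saving for businesses comes from.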

She concluded her presentation by urging the audience to familiarise themselves with the following initiatives and guidance to increase awareness of online security:

  • Kantara Initiative’s Consent Receipt specification (a sketch of such a receipt follows this list)
  • User-Managed Access (UMA) protocol
  • NIST eID guidelines
  • PAS 1296 Age Checking code of practice
  • eIDAS regulation
  • OIX Trust Framework
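
On the first item, below is a rough sketch of the kind of machine-readable record the Consent Receipt specification describes; the field names here are simplified for illustration and should be checked against the specification itself.

```typescript
// Rough, illustrative sketch of a machine-readable consent receipt of
// the kind the Kantara specification describes. Field names here are
// simplified and should be checked against the specification itself.

interface ConsentReceiptSketch {
  receiptId: string;        // unique identifier for this receipt
  timestamp: string;        // when consent was given
  dataController: string;   // who is collecting the data
  purposes: string[];       // what the data will be used for
  dataCategories: string[]; // what kinds of personal data are covered
  withdrawalUrl: string;    // where the user can withdraw consent
}

const example: ConsentReceiptSketch = {
  receiptId: "rcpt-0001",
  timestamp: "2018-07-10T18:00:00Z",
  dataController: "Example Service Ltd",
  purposes: ["newsletter"],
  dataCategories: ["email address"],
  withdrawalUrl: "https://example.com/withdraw",
};
```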

Overall, the event brought together a wide range of perspectives covering the legal, technical, and human rights aspects of current consent issues in the digital age. The presentations were followed by a Q&A session, which allowed for further discussion and enriched the dialogue on these complex issues.