Privacy & AI in the time of Coronavirus: reflections on geolocation and predictive systems

06/04/2020
The Covid-19 epidemic has brought even more attention to the two hottest topics of 2020: Artificial Intelligence and Privacy.
One month after the launch of the European policy on Artificial Intelligence (you can read about it here), Europe is facing a crucial challenge, which will undoubtedly define future policies.

In the blink of an eye we were faced with countless articles and interviews in which "privacy" is treated as a wholly abstract concept that has to be sacrificed in order to achieve a common goal. We are constantly bombarded by news in which we are asked to choose between privacy and technology, privacy and health, privacy and security, as if one necessarily excluded the other. At the same time, we hear about the creation of apps and other Artificial Intelligence-based systems dedicated to contact tracing and to monitoring the spread of the virus.

The right to privacy is a fundamental right linked to the notion of human dignity: it entails the right of everyone to a private life, free from unlawful interference. It is distinct from, albeit related to, the right to data protection, which aims to ensure that information about an individual is processed correctly. The GDPR is the heart of the European rules on the processing of personal data.

This being the case, as with other rights and freedoms, in certain situations the individual rights to privacy and data protection may be "limited" as a result of a balancing exercise against other public interests, such as public health. However, any such derogation must be grounded in a legitimate legal basis, which must also define the limits of the derogation and ensure its proportionality to the intended purpose.

The President of the Italian Data Protection Authority, Antonello Soro, stated:

It is not true that privacy is a luxury we cannot afford in this difficult time, because it allows everything that it is reasonable, appropriate and advisable to do in order to defeat the coronavirus. The key lies in the proportionality, foresight and reasonableness of the intervention, as well as in its temporariness.

In this article we aim to provide some clarity on the most relevant data protection aspects of applying an AI system in this dramatic historical period.

In particular, we will tackle AI systems for screening, contact tracing and infection risk assessment. We will also summarize the latest criteria issued by the Italian Data Protection Authority regarding the geolocation of people infected with the coronavirus.

We can identify two key aspects of the application of AI to counter the coronavirus outbreak:

  1. transparency on how data is processed and adequate information to data subjects;
  2. temporariness, proportionality and accuracy of data processing.

 

Transparency on how data is processed and adequate information to data subjects: GDPR rules on AI

Where the processing is the result of an automated decision-making process (which includes processing carried out through the use of Artificial Intelligence technologies), the GDPR imposes on controllers additional information and transparency requirements that must necessarily be respected.

In particular, Article 13(2)(f) provides that the data subject must be informed of

[…] the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.

In addition, under Article 15(1)(h) GDPR the data subject is entitled to obtain from the controller the same information referred to in Article 13(2)(f) GDPR.

Finally, Article 22 GDPR provides that:

  1. The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.
  2. Paragraph 1 shall not apply if the decision:

(a) is necessary for entering into, or performance of, a contract between the data subject and a data controller;

(b) is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or

(c) is based on the data subject's explicit consent.

  3. In the cases referred to in points (a) and (c) of paragraph 2, the data controller shall implement suitable measures to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.

The data subject, therefore, must be able to give informed consent to the processing of his or her data: the language used must be immediate and clear, especially since in most cases data is collected through apps installed on personal smartphones.

The information notice must always be the result of a balancing act by whoever distributes the software to the public: on the one hand, the data subject's right to receive the most accurate information possible; on the other, the need to simplify complex concepts and to build citizens' confidence in the technology, given the impact it could have on their psychophysical well-being.
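To make these obligations more concrete, the following minimal sketch (in Python, with entirely hypothetical names, scores and thresholds) shows one way a predictive infection-risk system could be structured so that no decision with significant effects for the data subject is taken on the basis of the automated score alone, in line with the human-intervention safeguard of Article 22(3) GDPR.

```python
from dataclasses import dataclass

@dataclass
class RiskAssessment:
    """Hypothetical output of a predictive model (illustrative fields only)."""
    subject_id: str      # pseudonymous identifier
    risk_score: float    # 0.0 - 1.0, produced by the model
    explanation: str     # "meaningful information about the logic involved"

def decide_isolation(assessment: RiskAssessment, reviewed_by_human: bool) -> str:
    """Apply a high-impact decision only after human intervention (Art. 22(3) GDPR)."""
    if assessment.risk_score < 0.7:        # illustrative threshold
        return "no action"
    if not reviewed_by_human:
        return "pending human review"      # the automated score alone never decides
    return "recommend self-isolation (decision contestable by the data subject)"

# Usage: the same high score leads to different outcomes depending on human review.
case = RiskAssessment("pseudonym-42", 0.85, "prolonged proximity to a confirmed case")
print(decide_isolation(case, reviewed_by_human=False))  # -> pending human review
print(decide_isolation(case, reviewed_by_human=True))   # -> recommend self-isolation ...
```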

Temporariness, proportionality and accuracy of data processing: the Italian Data Protection Authority's criteria for the geolocation of individuals infected with the coronavirus

In an interview published on Agenda Digitale on March 29, 2020, the Italian Data Protection Authority outlined the criteria to be followed by governments wishing to implement software for the geolocation of positive subjects in order to analyze the epidemiological trend of Covid-19 or to reconstruct the chain of infection.

  • Graduality: the government must first assess whether less invasive solutions may be sufficient for prevention purposes;
  • The acquisition of anonymous mobility trends is allowed (a minimal sketch of such aggregation follows this list);
  • If, on the other hand, the government intends to acquire identified data, it must first issue a regulatory provision of limited temporal effect and with adequate guarantees. In particular, the Authority highlights the need for such legislation to comply with the principle of proportionality, especially with regard to the purpose of the data collection;
  • The government must then carry out a preliminary analysis of the actual suitability of the chosen technological solution to achieve useful results in fighting the spread of the virus, proportionate to the aims pursued and provided that less invasive measures are not deemed suitable to achieve the desired results;
  • The Authority further established that the processing of geolocation data must necessarily be linked to health data on whether the traced subjects have tested positive.
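By way of illustration only, the short Python sketch below shows one possible way of turning raw location pings into the anonymous, aggregate mobility trends mentioned above: device identifiers are used only to count distinct devices and are never emitted, and cells with very few observations are suppressed to reduce the risk of re-identification. The field layout and the suppression threshold are assumptions made for this example, not requirements set by the Authority.

```python
from collections import Counter
from typing import Iterable, Tuple

# A raw ping: (device_id, area_code, date). The schema is an assumption for this sketch.
Ping = Tuple[str, str, str]

MIN_COUNT = 10  # illustrative suppression threshold to limit re-identification risk

def anonymous_mobility_trends(pings: Iterable[Ping]) -> dict:
    """Aggregate pings into per-area, per-day counts of distinct devices."""
    distinct = {(area, day, device) for device, area, day in pings}
    counts = Counter((area, day) for area, day, _device in distinct)
    # Device identifiers never appear in the output; sparse cells are dropped.
    return {cell: n for cell, n in counts.items() if n >= MIN_COUNT}

# Usage: only aggregate counts survive, and this sparse cell is suppressed entirely.
sample = [("dev-1", "MI-01", "2020-04-06"), ("dev-2", "MI-01", "2020-04-06")]
print(anonymous_mobility_trends(sample))  # -> {}
```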

As regards the private entities that will develop and operate the software, the Authority established that:

  • Private technology infrastructure developers should be able to make their information assets available to the public authority;
  • The public authority should be the entity conducting the data analysis (and any re-identification of the data), given the heightened risk involved in this activity, for which governmental bodies can provide adequate safeguards;
  • The companies involved in the project must meet appropriate requirements of reliability and transparency.

 

The intervention of the Council of Europe

On 30 March 2020, the Chair of the Committee of Convention 108 and the Data Protection Commissioner of the Council of Europe issued a joint statement on the processing of personal data in the context of combating the spread of COVID-19.

With specific regard to the use of AI software, the statement indicates the following key points to be taken into account in the development of predictive systems:

  • Transparency and "explainability" of the technical analysis carried out by the AI;
  • A precautionary approach and a risk-management strategy (including the risk of re-identification in the case of anonymous data);
  • Data quality and data minimisation (see the short sketch after this list);
  • The role of human supervision.
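As a purely illustrative aid, the sketch below shows what data minimisation could look like at the point of collection for a location-based record: direct identifiers are simply not copied, coordinates and timestamps are coarsened, and a deletion date is attached. Field names, precisions and the retention period are assumptions made for this example, not values taken from the joint statement.

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 14  # illustrative retention limit

def minimise_record(raw: dict) -> dict:
    """Keep only coarse space/time information needed for epidemiological analysis."""
    collected = datetime.fromisoformat(raw["timestamp"])
    return {
        # Direct identifiers (name, phone number, device id) are deliberately not copied.
        "area_lat": round(raw["lat"], 2),  # roughly 1 km resolution
        "area_lon": round(raw["lon"], 2),
        "hour": collected.replace(minute=0, second=0, microsecond=0).isoformat(),
        "delete_after": (collected + timedelta(days=RETENTION_DAYS)).date().isoformat(),
    }

# Usage: the stored record contains no identifier and only coarse space/time data.
raw = {"name": "Mario Rossi", "device_id": "dev-1", "lat": 45.46421, "lon": 9.18854,
       "timestamp": "2020-04-06T10:37:21"}
print(minimise_record(raw))
```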

 

Conclusion

It is absolutely possible to develop advanced technologies without inevitably eroding the rights of individuals to data protection and privacy. On the contrary, it is necessary - in an emergency situation such as this - that these technologies be used for the common good.

Human intervention is fundamental: it is through human input that the characteristics of the software are defined and, consequently, how the software will ultimately be used by data subjects.

This is why market players need to know how to navigate the privacy and data protection rules at a sensitive time like this.

 

Sources:

Coronavirus is forcing a trade-off between privacy and public health, Karen Hao, MIT Technology Review

The Public Interest and Personal Privacy in a Time of Crisis, Hu Yung