PEOPLE | May 27, 2019

The GDPR a year later: an interview with Luca Bolognini

A year after the GDPR became applicable, what is the situation for companies and public administrations?

On 25 May the GDPR, the General Data Protection Regulation, blew out its first candle. A year has passed since it became fully applicable and, according to the Osservatorio Information Security & Privacy, during its first six months 59% of organizations had a structured adaptation project under way and 23% had already completed one. By the end of 2018, almost a quarter of companies declared themselves compliant with the requirements imposed by the GDPR, only 10% had not yet addressed the issue at an organizational level, and 8% were still analyzing the requirements and planning the activities to be carried out.

“A year on, we can thank the media, which managed to highlight the problems involved in adapting to the Regulation and thereby raised awareness of the need for data protection,” comments Luca Bolognini, chairman of the Istituto Italiano Privacy and author of the book Follia artificiale (Artificial Insanity). Coverage whose side effect was to educate many citizens, entrepreneurs and public officials on a subject as delicate as it is little known.

What is the situation of our companies today? Compliant or not compliant?

“The biggest companies are certainly ahead, as they started working on the adaptation in good time so as to be compliant right away. From my point of view, there are still problems for SMEs, which are generally more focused on business than on the organizational aspects of compliance, and for Public Administrations, which very often lack the financial and human resources to deal with the adaptation”.

What can we hope for during this second year of life?

“Unlike the many who loudly demand exemplary sanctions, which are certainly necessary in cases of full-blown violations, I am convinced that we need to think about how the general principles laid down by the GDPR will be applied. Some of them are difficult to “solve”: storage limitation, for example, is almost impossible to apply in practice, since it forces companies into diabolical searches for documents and other materials that might contain personal data and that would have to be classified according to variable criteria. Other general principles, such as purpose limitation and data minimization, are often hard to reconcile with business needs. In an economy increasingly driven by Big Data and AI, believing you are fully compliant with these principles is likely to prove fatal. What we can hope for, therefore, is not a rigid, literal application of the GDPR but a balanced one, i.e. one that takes into account the principles of realism, proportionality and reasonableness. One of the best passages of the Regulation, Recital 4, states that privacy is not an absolute right but one to be balanced against other rights and freedoms, such as the freedom to conduct a business. Only if the European Privacy Authorities are able to look reality in the face and ensure that the law does not become an obstacle to innovation, research and business will we be able to say that we have done a good job of protecting citizens”.

What are the worst things seen in this first year of GDPR?

“Certainly the proliferation of people who, taking advantage of companies’ need to adapt, pass themselves off as experts on the matter even though they are not. Rather dishonest market practices have become widespread, such as external DPO appointments with working hours that are sometimes ridiculous and certainly not commensurate with the work required, given the size and complexity of the companies in question. As always happens when a new regulation comes into force and must be complied with, we have witnessed a rush to jump on the bandwagon, which has been good neither for companies nor for the professionals who work seriously in this sector”.

Has users’ awareness of the data they hand over to big players such as social network platforms changed compared with a year ago?

“Here too I believe that, thanks to disturbing episodes such as Cambridge Analytica, users have understood the risks of behavioral profiling aimed at advertising or at spreading news, often false or unduly suggestive, capable of unfairly shifting public opinion and people’s convictions and desires. It is also true that the large social network platforms have become more transparent in recent months, sometimes to comply with orders from the authorities, sometimes spontaneously: in short, it is easier today than it used to be to understand how our data are processed. Objecting to their collection, moreover, would be as useless as it is impossible in the digital age. But this does not mean that nothing can be done to improve the current situation: we should reflect, for example, on the fairness of the data-for-services exchange, involving the different authorities that protect the different rights and interests at stake, from privacy to consumer protection, antitrust and media pluralism. Data can be monetized, but the exchange must be fair, i.e. the personal information a user hands over should be matched by a real, measurable value in return. Determining that value as fairly as possible requires an approach that takes different rights, freedoms and interests into account”.

During this last year, has anything changed in relation to the need to stem “artificial insanity”?

“We certainly talk more about the risks of Artificial Intelligence, with attention now also being paid to the ethical aspects, but we need to take a further step, i.e. to establish a constitutional principle that reaffirms the need for human involvement in certain decisions. A soft approach that regulates a few technical aspects through guidelines, as has been done so far at the European level, is not enough. The serious risks that AI brings with it must be addressed with incisive legislative measures, writing barriers to the non-human directly into our Constitutions”.

 

Sonia Montegiove