

In the document Artificial intelligence and labour law (pages 137-142)

G. Problem areas under labour law

V. Data protection

3. Lawfulness of the processing of personal data

The core question of data protection law is, evidently, under which conditions data processing is lawful.

a) Lawfulness of the processing according to Article 6 GDPR

Article 6(1) GDPR links the lawfulness of the processing of personal data to one of two conditions: that consent has been given (Article 6(1)(a)) or that the processing serves one of the purposes mentioned in the provision (Article 6(1)(b)-(e)).

Finally, under Article 6(1)(f) GDPR, processing is also permissible if it is necessary to protect the legitimate interests of the controller or of a third party. In this context,

683 Cf. Roßnagel in Simitis/Hornung/Spiecker gen. Döhmann (ed.), Datenschutzrecht, 1st ed. 2019, Art. 5 marginal no. 166.

684 Cf. Weichert, ZD 2013, 251 (258).

685 Cf. also e.g. Weichert, in: Däubler/Wedde/Weichert/Sommer (eds.), EU-DSGVO und BDSG, Art. 22 DSGVO marginal no. 5, according to whom all data-protection principles are being put to the test ("sämtliche Datenschutzprinzipien werden auf den Prüfstand gestellt").

686 Holthausen, RdA 2021, 19 (25) with reference to Roßnagel, ZD 2013, 562 (564).


HSI-Working Paper No. 17 December 2022

in the case of further processing, not only must there be a further processing purpose compatible with the original purpose, but any new processing that goes beyond mere collection must be fully covered by a legal basis in paragraph 1.687 As far as consent is concerned, it should again be recalled that this requirement inevitably reaches its limits in the era of ubiquitous computing and Big Data.688 In contrast, as far as the grounds in Article 6(1)(b)-(e) are concerned, these are often unlikely to be relevant in the present context.

Irrespective of this, processing of personal data is, as stated, also permissible if it is "necessary for the purposes of the legitimate interests pursued by the controller or by a third party". However, this is not the case where "such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data". In this respect, the literature argues that it is in principle harmless if personal data are further processed in order to develop an algorithmic model or to "train" the AI system. Greater concerns arise, however, if personal data are processed for the purpose of deriving concrete conclusions about the data subject. Accordingly, for example, the "assessment of the performance of employees on the basis of extensive monitoring" is said to be inadmissible.689 However, it is difficult to determine where the boundaries lie:

Article 6(1)(f) GDPR is worded very openly, which on the one hand guarantees a certain flexibility, but on the other hand makes the results of the balancing exercise hard to predict.690 There is also little that is tangible in the recitals. Efforts by the European Parliament to concretise the text were just as unsuccessful as the Commission's proposal to concretise the text by means of delegated acts.691 There have also been repeated calls for special legal regulations, in the literature but also from business circles,692 but these have not yet come about.693 In addition to all this, data processing is based solely on the initiative of the data controllers, who

687 Cf. only Albrecht in Simitis/Hornung/Spiecker gen. Döhmann (ed.), Datenschutzrecht, 1st ed. 2019, Art. 5 marginal no. 13 with further references.

688 Roßnagel in Simitis/Hornung/Spiecker gen. Döhmann (ed.), Datenschutzrecht, 1st ed. 2019, Art. 5 marginal no. 40.

689 Thus Sartor, The impact of the General Data Protection Regulation (GDPR) on artificial intelligence, 2020, p. 50.

690 More closely Schantz in Simitis/Hornung/Spiecker gen. Döhmann (ed.), Datenschutzrecht, 1st ed. 2019, Art. 6 para. 1 marginal no. 86.

691 Cf. Schantz in Simitis/Hornung/Spiecker gen. Döhmann (ed.), Datenschutzrecht, 1st ed. 2019, Art. 6 para. 1 marginal no. 103 with further references.

692 Cf. also Schantz in Simitis/Hornung/Spiecker gen. Döhmann (ed.), Datenschutzrecht, 1st ed. 2019, Art. 6 para. 1 marginal no. 86 (with further references), who explicitly mentions the case of "augmented reality such as [e.g.] Google Glass".

693 Nevertheless, a certain concretisation in the form of approved codes of conduct of associations is conceivable; cf. Art. 40 para. 2(b) GDPR.


initially assess its permissibility independently, but often tend to assume that the processing is lawful.694

The re-purposing already mentioned above has been regulated separately. According to Article 6(4) of the GDPR, a number of criteria must be taken into account when answering the question of whether "processing for another purpose is compatible with the purpose for which the personal data are initially collected".

According to the (non-exhaustive) catalogue of criteria, the following must be taken into account: any link between the purposes; the context in which the personal data were collected, in particular with regard to the relationship between the data subjects and the controller; the nature of the personal data; the possible consequences of the intended further processing for the data subjects695 and the existence of appropriate safeguards (e.g. encryption or pseudonymisation). There is no standard procedure approved by the supervisory authorities, especially with regard to the latter criterion. Accordingly, with regard to ascertaining when the conditions for processing for compatible purposes are fulfilled, there are complaints of widespread legal uncertainty.696

b) Prohibition of the processing of sensitive data

The GDPR classifies certain data as particularly sensitive: According to Article 9(1) GDPR, the processing of personal data "revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade-union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation shall be prohibited".

In this respect, too, AI applications pose particular challenges. One challenge is that data that are not initially linked to a specific person can be linked to that person through further processing. The second challenge is that certain observable behaviour or known characteristics of individuals (e.g. their online activities) can

694 Cf. on this Schantz in Simitis/Hornung/Spiecker gen. Döhmann (ed.), Datenschutzrecht, 1st ed. 2019, Art. 6 para. 1 marginal no. 87 with further references.

695 Cf. on this Roßnagel in Simitis/Hornung/Spiecker gen. Döhmann (ed.), Datenschutzrecht, 1st ed. 2019, Art. 6 para. 4 marginal no. 58 with the assessment that "under the conditions of modern data processing in the context of Big Data, artificial intelligence, self-learning systems, context detection, Internet of Things and other applications of ubiquitous computing [...] the consideration of possible consequences of changes of purpose can only have a restrictive effect"; likewise Frenzel, in: Paal/Pauly, DS-GVO BDSG, 3rd ed. 2021, Art. 6 marginal no. 49.

696 Cf. Dehmel, ZD 2020, 62 (65).


allow conclusions to be drawn about attitudes, state of health or sexual orientation, that is, can lead to the generation of sensitive data.697 However, the prevailing view is that Article 9 GDPR does not require that the data be sensitive in and of themselves. Rather, the provision should apply even if their sensitivity becomes apparent only indirectly from the overall context of the processing.698 There are, however, also other voices according to which "revealing" within the meaning of Article 9(1) GDPR is to be denied in the case of "automatically attached data". If one considers, for example, the analytical potential of new AI systems, which can analyse photographs and voice recordings to identify emotional states and health data, it becomes clear "how much territory the concept of 'revealing' can cover when simple processing is being performed if the specific purpose of the processing is not concretised".699

c) Specific regulations in Section 26 BDSG

Specific regulations on employee data protection are contained in Section 26 of the BDSG, the wording of which has been adapted to the GDPR.700 According to Section 26(1), first sentence, BDSG: “Personal data of employees may be processed for employment-related purposes where necessary for hiring decisions or, after hiring, for carrying out or terminating the employment contract or to exercise or satisfy rights and obligations of employees' representation laid down by law or by collective agreements or other agreements between the employer and staff council”. The central criterion for the lawfulness of processing – and one that applies equally to all the grounds for authorisation in paragraph 1 – is therefore "necessity".701 This concretises the general provision of Article 6(1)(b) of the GDPR, which in particular binds the processing of personal data for the

697 Cf. Sartor, The impact of the General Data Protection Regulation (GDPR) on artificial intelligence, 2020, p. 53; fundamental criticism in Barczak, DÖV 2020, 997 with the observation that "hard data mining (is) currently being carried out in this area as well".

698 Petri in Simitis/Hornung/Spiecker gen. Döhmann (eds.), Datenschutzrecht, 1st ed. 2019, Art. 9 marginal no. 11 (so-called "context-related or indirectly sensitive data"), who at the same time objects to the fact that such data "are only cautiously assigned to the processing prohibition under para. 1"; ibid., marginal no. 12 with further references.

699 Thus Matejek/Mäusezahl, ZD 2019, 551 (553): "The scope of application of Art. 9 GDPR with regard to the first category [data that "reveal" certain sensitive characteristics] is only opened when the source data are used to obtain such sensitive data on the said characteristics and not already when it is established that the data are in principle suitable for disclosing such characteristics. Instead of the passive formulation "to emerge", the provision should therefore be understood in a more narrowed way in relation to the concrete intention to process (in the sense of an active "bringing forth") by way of a teleological reduction."

700 Cf. also BT-Drucks. 18/11325, p. 97. General information on the relationship between Section 26 BDSG and the GDPR: Gräber/Nolden, in: Paal/Pauly, DS-GVO BDSG, 3rd ed. 2021, Section 26 BDSG, para. 8 ff.

701 In more detail Seifert, in Simitis/Hornung/Spiecker gen. Döhmann (ed.), Datenschutzrecht, 1st ed. 2019, Art. 88 marginal no. 56 f.


performance of a contract to the principle of necessity.702 Section 26(2), first sentence, BDSG provides for the possibility of consent, but at the same time stipulates that "in assessing whether such consent was freely given", the "employee's level of dependence in the employment relationship and the circumstances under which consent was given shall be taken into account".

Section 26(3), first sentence, BDSG stipulates that – by derogation from Article 9(1) GDPR – the processing of special categories of personal data as referred to in Article 9(1) GDPR for purposes of the employment relationship is permissible primarily "if it is necessary to exercise rights or comply with legal obligations derived from labour law [...] and there is no reason to believe that the data subject has an overriding legitimate interest in not processing the data". According to Section 26(4), first sentence, BDSG, the processing of personal data is also permitted on the basis of collective agreements. Thus, works agreements and collective agreements can also permit the processing of employee data.703

The core prerequisite for the lawfulness of processing is, as stated, its "necessity". In the examination of necessity that follows from this, "the conflicting fundamental rights positions must be weighed up in order to establish practical concordance". Specifically, "the employer's interests in data processing and the employee's right to privacy must be balanced in a way that takes both interests into account as far as possible".704 The discussion about the permissibility of predictive policing of employees, for example, shows that the legality of the processing will often be a matter of dispute.705 According to some views, this is inadmissible if it involves the creation of a personality profile.706 However, the limits are sometimes drawn differently. For example, it is sometimes argued that predictive policing is inadmissible only if it involves a "screening of the personality", the borderline to such a screening being crossed only "if the collected data are close to the core of the right of personality and these are processed into a prognosis that is also close to the core of the right of personality".707 The different views illustrate the imponderables of a proportionality test, in which not only data minimisation under

702 Seifert, in Simitis/Hornung/Spiecker gen. Döhmann (ed.), Datenschutzrecht, 1st ed. 2019, Art. 88 marginal no. 21.

703 See only Wybitul, NZA 2017, 413 (417).

704 BT-Drs. 18/11325 of 24 Feb 2017, p. 97.

705 Already today, so-called fraud monitoring is sometimes carried out, in which the probability of committing crimes against the employer's assets is determined for certain activities. Predictive policing, on the other hand, is about using AI to forecast which employees are particularly susceptible to committing compliance violations; see Rudkowski, NZA 2020, 72 (73); Haußmann/Thieme, NZA 2019, 1612 (1615).

706 Rudkowski, NZA 2020, 72 (74).

707 Thus Dzida, NZA 2017, 541 (545), who therefore considers a Big Data analysis to be "legally problematic" if it is intended to determine whether employees are "generally inclined to commit criminal offences".


Article 5(1)(c) GDPR but also the employer's entrepreneurial freedom, within which economic aspects may also come into play, must be taken into account.708
