The Hungarian Data Protection Authority (Nemzeti Adatvédelmi és Információszabadság Hatóság, NAIH) has recently published its annual report, in which it presented a case where it imposed its highest fine to date, approximately EUR 670,000 (HUF 250 million).

The case involved the personal data processing of a bank (acting as data controller) which automatically analysed the recorded audio of its customer service calls. Using artificial intelligence-based speech signal processing software, the bank analysed each call against a list of keywords and assessed the caller's emotional state. The software then ranked the calls, serving as a recommendation as to which callers should be called back as a priority.
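For illustration only, the following is a minimal sketch of how a keyword-and-emotion ranking of this kind could work in principle. Every identifier, the keyword list, the weights, and the scoring rule are hypothetical assumptions made for explanation; the report does not detail the bank's actual software, and nothing below describes it.

```python
from dataclasses import dataclass

# Hypothetical illustration of keyword-plus-emotion call ranking.
# All names, weights, and the scoring rule are assumptions for
# explanatory purposes only; they do not describe the bank's system.

NEGATIVE_KEYWORDS = {"complaint", "cancel", "terminate", "lawyer"}  # assumed list

@dataclass
class CallAnalysis:
    call_id: str
    transcript_words: list[str]  # assumed output of a speech-to-text step
    emotion_score: float         # assumed scale: 0.0 (calm) to 1.0 (distressed)

def priority_score(call: CallAnalysis) -> float:
    """Combine keyword hits and emotional state into one ranking score."""
    keyword_hits = sum(
        1 for word in call.transcript_words if word.lower() in NEGATIVE_KEYWORDS
    )
    # The weighting is arbitrary here: both keyword hits and a distressed
    # emotional state raise the callback priority.
    return keyword_hits + 2.0 * call.emotion_score

def rank_for_callback(calls: list[CallAnalysis]) -> list[str]:
    """Return call IDs ordered from highest to lowest callback priority."""
    return [c.call_id for c in sorted(calls, key=priority_score, reverse=True)]

if __name__ == "__main__":
    calls = [
        CallAnalysis("A", ["I", "want", "to", "cancel"], 0.8),
        CallAnalysis("B", ["thanks", "for", "the", "help"], 0.1),
    ]
    print(rank_for_callback(calls))  # ['A', 'B']
```

As the Authority's findings below make clear, the legal problem was not the ranking mechanics as such, but that profiling of this kind was carried out without transparency or a valid legal basis.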

The bank determined the purposes of the processing as quality control based on variable parameters, the prevention of complaints and customer churn, and improving the efficiency of its customer support. According to the Authority, however, the bank's privacy notice referred to these processing activities only in general terms, and no material information was made available about the voice analysis itself. Furthermore, the privacy notice indicated only quality control and complaint prevention as purposes of the processing.

The bank based the processing on its legitimate interests in retaining its clients and enhancing the efficiency of its internal operations. However, the data processing activities connected to these interests were not separated in the privacy notice or in the legitimate interest tests, and so became blurred together.

In the course of the procedure before the Authority, it became evident from the bank's own statements that for years it had failed to provide data subjects with proper notice and the right to object, having determined that it was not able to do so. The Authority emphasised that the only lawful legal basis for emotion-based voice analysis is the freely given, informed consent of the data subjects.

Additionally, the Authority highlighted that although the bank had carried out a data protection impact assessment (DPIA) and identified the processing as high risk to data subjects, involving profiling and scoring, the DPIA failed to present substantive solutions to address these risks. Furthermore, the legitimate interest test performed by the bank failed to take into account proportionality or the interests of the data subjects; it merely established that the processing was necessary to achieve the purposes pursued. The Authority further emphasised that the legitimate interest legal basis cannot serve as a 'last resort' when all other legal bases are inapplicable, and data controllers therefore cannot invoke it at any time and for any reason. Consequently, in addition to imposing a record fine, the Authority ordered the bank to cease the analysis of emotions in the course of its voice analysis.

In conclusion, the Authority highlighted that “artificial intelligence is generally difficult to understand and monitor due to the way it works, and even new technologies pose particular privacy risks. This is one of the reasons why the use of artificial intelligence in data management requires special attention, not only on paper but also in practice.”