Algorithms and Artificial Intelligence: The Garante takes action on algorithmic discrimination

Thanks to Giulia Conforto for collaborating on this article

The Italian Data Protection Authority (Garante per la protezione dei dati personali, “Garante”) is taking a closer look at Foodinho S.r.l., an Italian company in which the Spanish GlovoApp23 S.L. holds a controlling stake. The company makes home deliveries using staff specifically dedicated to this purpose (known as “riders”).

The Garante unearthed a number of violations, particularly with respect to the algorithms used to manage riders, as well as several breaches of the data protection rules, the Workers’ Statute (Law No. 300/1970), and recent rules protecting people working through digital platforms. As a result of injunction order No. 234, issued by the Garante on June 10, 2021 (the first such order about riders), among other things the company must modify the processing of riders’ data carried out through its digital platform and verify that the algorithms for booking and assigning orders to riders do not discriminate. Further, the company has been fined EUR 2.6 million.


This was not the first time Italian authorities ruled on the use of algorithms for handling riders’ work. For example, in the Deliveroo case, decided on December 31, 2020, the Court of Bologna deemed Deliveroo’s reputational ranking algorithm unfair to riders because it breached Italian law prohibiting discrimination against employees and the self-employed.[1] However, this is the first Garante decision about riders.

Although the Garante has identified several breaches, our analysis below focuses on the part of the injunction that deals with the use of the algorithm.


Foodinho assigns shifts to riders through a shift reservation system based on what is called an “excellence score.”

The Garante’s inspection activity revealed that the excellence score is determined by applying an algorithm that weighs five elements:

  • the score assigned by the client (15%);
  • the score assigned by the partner (5%);
  • the provision of services during high-demand hours, provided that the rider has selected at least 5 high-demand shifts over 7 consecutive days (35%);
  • orders actually delivered (10%); and
  • platform productivity (35%), based on the number of orders offered to the rider (who can opt for automatic assignment to drive up the score) upon check-in for a scheduled shift, “a few minutes after the start” of the shift, and on acceptance within a short period of time (30 seconds) after the order is offered.

This algorithm penalizes riders who do not accept orders quickly, who turn them down, or who do not complete a certain number of deliveries. Moreover, the score varies based only on negative feedback (positive feedback does not count). Through the score (and the underlying algorithm), Foodinho evaluates a rider’s performance and thus exerts a significant effect on the rider by offering or denying access to certain shifts, which in turn determines the rider’s opportunities to perform the service under the contract. Workers are not informed of how this algorithm works, yet based on their scores they may be denied access to the most “valuable” shifts or even to the platform itself, and consequently they may miss out on work opportunities.
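As a rough illustration, the weighting described in the order amounts to a simple weighted sum of the five components. The sketch below is ours, not Foodinho’s actual implementation: the field names, the 0–100 scale, and the example values are assumptions; only the five components and their percentages come from the Garante’s order.

```python
# Weights of the five components, as reported in injunction order No. 234.
WEIGHTS = {
    "customer_feedback": 0.15,   # score assigned by the client
    "partner_feedback": 0.05,    # score assigned by the partner
    "high_demand_hours": 0.35,   # service during high-demand shifts
    "orders_delivered": 0.10,    # orders actually delivered
    "productivity": 0.35,        # check-in and 30-second acceptance behavior
}

def excellence_score(components: dict) -> float:
    """Combine the five component scores (assumed here to be on a
    0-100 scale) into a single weighted excellence score."""
    return sum(WEIGHTS[name] * components[name] for name in WEIGHTS)

# Hypothetical rider: strong feedback, but a low productivity component
# (e.g., from declining orders) drags the overall score down, limiting
# access to the most "valuable" shifts.
rider = {
    "customer_feedback": 90.0,
    "partner_feedback": 80.0,
    "high_demand_hours": 70.0,
    "orders_delivered": 85.0,
    "productivity": 40.0,
}
print(excellence_score(rider))  # 64.5
```

The sketch makes the order’s point concrete: with productivity and high-demand availability together carrying 70% of the weight, a rider’s score is dominated by acceptance behavior rather than delivery quality.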

Based on the above, the Garante affirmed that the company, using a digital platform that operates through algorithms, makes decisions based solely on automated processing, including profiling, of riders’ personal data, without adopting the proper guarantees required by Article 22 and Recital 71 GDPR. Per the Garante, one of the exceptions in Article 22 applies in this case with respect to the right not to be subject to a decision based solely on automated processing, including profiling, that produces legal effects or significantly affects the data subject: specifically, the processing was deemed necessary for the performance of a contract entered into by the parties.

The Garante stated that the technology used by Foodinho can be considered innovative for two main reasons: (1) work activities are managed by the company through a digital platform that functions based on complex algorithms, and (2) automated processing, including profiling, that significantly affects data subjects is used to process various types of data, including geolocation data, excluding some riders from work opportunities. In light of this and taking into account other elements of the riders’ data processing, the Garante stated that such processing represents “a high risk to the rights and freedoms of natural persons,” which in turn means an impact assessment pursuant to Article 35 GDPR must be carried out prior to the start of processing.

Furthermore, the Garante challenged several types of conduct in relation to the algorithm. Indeed, the company did not:

  • properly inform workers about the functioning of the algorithm;
  • adopt measures relating to the exercise of riders’ rights, in particular the right to obtain human intervention, express their opinions, and challenge the decisions of the algorithm through access to dedicated channels (e.g., chat accessible through the application, dedicated desks, email); the company also did not inform riders about the possibility of exercising such rights in relation to the decisions made using the platform;
  • adopt technical and organizational measures aimed at periodically checking that the results of the algorithmic system were correct and accurate and that the data used by the system with respect to the purposes pursued were accurate, relevant, and adequate and reducing as much as possible the risk of distorted or discriminatory effects, with reference to the functioning of the digital platform, including the scoring system and the order allocation system;
  • take appropriate measures to avoid improper or discriminatory use of reputation-scoring mechanisms based on feedback (which determines 20% of the excellence score).


In addition to issuing a fine of EUR 2.6 million, the Garante ordered Foodinho to implement a number of corrective measures, including the following:

  • take measures to safeguard riders’ rights, especially their rights to obtain human intervention, to express their opinions, and to challenge the algorithm’s decisions;
  • check that data used by the system (e.g., chats, emails, and phone calls between riders and customer service, geolocation at 15-second intervals, route mapping and estimated and actual delivery times, details on the handling of current and past orders, feedback from customers and partners) is accurate and fair, partly in order to minimize the risk of errors and bias; and
  • take suitable measures to prevent the inappropriate and/or discriminatory use of reputation-scoring mechanisms based on customer and partner feedback.

The Garante set a 60-day deadline for Foodinho to start implementing the required measures to remedy the serious shortcomings it had found, and it granted an additional 90 days to finish overhauling the algorithms.


The order shows once again that riders’ personal data processing (as well as the ranking of riders) continues to attract regulatory scrutiny.

The Foodinho case may be the first in a long line of such cases. Indeed, the Garante is investigating the behavior of other food delivery companies as well.

[1] Please refer to our article “The current scenario for delivery workers in Italy: does a new future await?”
