GDPR complaint: Airbnb hosts at the mercy of algorithms

22 December 2021

Today, noyb.eu filed a GDPR complaint against Airbnb. The online marketplace for vacation rentals downgraded the complainant’s rating as a host solely through automated decision-making. While everyone has the right not to be subject to automated decision-making, Airbnb relies on exactly these practices. By denying the complainant the opportunity to contest the automated decision and to obtain human intervention, Airbnb acted in clear violation of the GDPR.

Safeguards against automated decision-making. Automated decision-making (ADM) is a type of processing that is regulated under the GDPR to protect people against unfair decisions taken by automated means, such as algorithms. The rights provided under Article 22(3) GDPR include the right for the data subject to express his or her point of view, the right to contest the automated decision, and the right to obtain meaningful human intervention. In this case, Airbnb not only failed to properly inform the host about the existence of such ADM, it also denied her the possibility of effectively contesting the decision, leading to substantial financial losses for hosts.

Reviews essential to become ‘Superhost’. Airbnb connects hosts renting out living spaces with potential guests. After each stay, guests provide a rating of one to five stars. Naturally, rentals with good reviews are more likely to be booked, which makes reviews particularly important for Airbnb hosts. If hosts reach and maintain an average score of 4.8 stars, they become “Superhosts” – which brings a number of advantages, such as enhanced visibility on the platform, higher overall rental income, and a €100 travel voucher every year.

Algorithms to check reviews. Since thousands of reviews are left every day, Airbnb relies on algorithms to check whether these reviews comply with its Review Policy. Reviews deemed biased or irrelevant are automatically deleted by these algorithms. Errors can have dramatic consequences for Airbnb hosts: if five-star reviews are deleted in error, this can deprive hosts of their Superhost status and of all the advantages that come with it – as was the case for the complainant.
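
To illustrate the arithmetic, here is a minimal Python sketch with invented numbers (they are not figures from the complaint). It assumes that an average of 4.8 stars or higher qualifies a host as a Superhost, as described above, and shows how the erroneous deletion of a single five-star review can pull the average below that threshold; Airbnb’s actual Superhost calculation may of course involve further criteria.

    # Illustrative sketch only: the review counts below are invented to show how
    # deleting one five-star review in error can push a host's average rating
    # below the 4.8-star Superhost threshold mentioned in the article.

    SUPERHOST_THRESHOLD = 4.8

    def average_rating(ratings):
        """Arithmetic mean of the star ratings a host has received."""
        return sum(ratings) / len(ratings)

    def is_superhost(ratings):
        """Assumption for this example: an average of 4.8 or higher qualifies."""
        return average_rating(ratings) >= SUPERHOST_THRESHOLD

    # Hypothetical host: ten reviews, eight five-star and two four-star.
    reviews = [5] * 8 + [4] * 2
    print(average_rating(reviews), is_superhost(reviews))            # 4.8 True

    # The review-checking algorithm deletes one five-star review in error.
    reviews_after_deletion = [5] * 7 + [4] * 2
    print(round(average_rating(reviews_after_deletion), 2),
          is_superhost(reviews_after_deletion))                      # 4.78 False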

"A distant and unreachable decision-maker can delete a "simple" review on a platform. But in other cases algorithms automatically lower someone’s financial rating, assess job performances or even fire workers. That is what we risk if we do not establish the principle that algorithms must be transparent and their decisions accountable. " - Stefano Rossetti, data protection lawyer at noyb.eu

Reviews automatically deleted by Airbnb. In the case of the complainant, a five-star review was automatically deleted by Airbnb. As a consequence, her overall rating was downgraded. She immediately contacted Airbnb to explain the situation and contest the decision, but Airbnb merely replied that the deletion was “final”. Consequently, the host contacted Airbnb’s Data Protection Officer to get more information about the processing of her personal data and the automated deletion of reviews. Even today, 1.5 years after her access request, she still hasn’t received any answer from Airbnb. Under the GDPR, Airbnb had one month to respond.

"For Airbnb, deleting reviews is a mere routine. But for me, it can have a serious effect. If they were a real partner, they would not treat their hosts like that... I don't want to live in a world where part of my livelihood depends on the decision of an algorithm" -July (alias), the complainant in this case.

Hosts at the mercy of algorithms. Airbnb’s system – where algorithms patrol the platform and automatically delete reviews without human intervention – is a type of automated decision-making (ADM) which can have a significant impact on Airbnb hosts. Under the GDPR, any individual who is subject to ADM has the right to contest the decision and to obtain a meaningful human review of his or her case, in which all relevant circumstances are considered. Airbnb clearly does not comply with these standards. Hosts are therefore left at the mercy of algorithms, without any real possibility to fight back. Similar algorithms also dominate the life and work of many other workers in the “gig economy”, from Uber drivers to delivery workers.