EPIC files complaint with FTC about Airbnb’s secret customer-scoring algorithm

On February 26, 2020, the Electronic Privacy Information Center (EPIC) submitted a complaint to the US Federal Trade Commission (FTC) about Airbnb’s secret customer-scoring algorithm. According to EPIC, Airbnb “has failed to show that its technique meets the fairness, transparency and explainability standards for AI-based decision-making set out in the OECD AI Principles and the Universal Guidelines for AI”.

According to the complaint, "Airbnb uses this algorithm to score its customers' 'trustworthiness' based on personal information obtained [by] third-parties", including web pages, social networks, and comments, among other sources.

Airbnb's algorithm is supposed to evaluate customers' "negative traits", ranging from involvement with hate websites to having "interests that indicate negative personality or behaviour traits". The algorithm is also supposed to "evaluate the individual's relationship with others", EPIC says.

EPIC notably contests the reliability of the algorithm. For instance, it argues that the Airbnb algorithm "identifies use of words associated with criminal activity […] to categorize an individual as having criminal behaviour traits", even though "there are many reasons why a person might use words associated with criminal activity: they could be a victim of a crime, a journalist", and so on.
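To illustrate the kind of false positive EPIC describes, here is a minimal, purely hypothetical sketch of naive keyword-based flagging. It is not Airbnb's code; the keyword list, function name, and example sentences are invented for this illustration.

```python
# Purely hypothetical illustration -- not Airbnb's actual system.
# A naive flag that marks text as showing "criminal behaviour traits"
# whenever it merely contains a crime-related word, ignoring context.

CRIME_KEYWORDS = {"robbery", "assault", "fraud", "burglary"}  # invented list

def naive_criminal_trait_flag(text: str) -> bool:
    """Return True if the text contains any crime-related keyword."""
    words = {word.strip(".,!?").lower() for word in text.split()}
    return bool(words & CRIME_KEYWORDS)

examples = [
    "I was the victim of a robbery last year.",        # crime victim
    "As a journalist, I investigate insurance fraud.", # journalist
    "Looking forward to a quiet weekend stay.",        # neutral guest
]

for sentence in examples:
    print(naive_criminal_trait_flag(sentence), "-", sentence)

# The first two sentences are flagged (True) even though they describe no
# criminal behaviour -- exactly the kind of misclassification EPIC points to.
```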

From a legal point of view, the complaint argues that Airbnb violates both the Federal Trade Commission Act and the Fair Credit Reporting Act. It further argues that the Airbnb algorithm does not comply with the OECD Principles on AI or the Universal Guidelines for AI.

Source: https://epic.org/privacy/ftc/airbnb/EPIC_FTC_Airbnb_Complaint_Feb2020.pdf
