How does Airbnb use artificial intelligence to screen its guests?

The tool, developed by Trooly, a startup Airbnb acquired in 2017, scans the Internet to judge the suitability of customers. As excited as I am about the use of artificial intelligence in fields like healthcare, where it has the potential to revolutionize disease detection and prevention, it frightens me a bit when comparable tools are used to replace human analytical decision-making. A good example of the latter is businesses using AI to interview and hire you. After the meltdown we saw in the IPO market with the various unicorns, Airbnb may come as a savior of the market in 2020, as it plans to list on the New York Stock Exchange.

For one, it has a much more competent management bench to deal with the pressures of business. More recently, the company has been in the news for an entirely different reason. According to the patent documents, the online rental service is using an AI tool to analyze guests' online personalities and estimate the risk of them ransacking a host's home. The tool would determine whether a guest is suitable to rent the property. According to the Evening Standard, this comes after complaints from London hosts, including a woman who rented out her £2.5 million apartment for a baby shower, only to have it wrecked by a rowdy party crowd.

“Every Airbnb reservation is scored for risk before it’s confirmed. We use predictive analytics and machine learning to instantly evaluate hundreds of signals that help us flag and investigate suspicious activity before it happens.” ~ Airbnb website

The tool was developed by Trooly, a background-check startup that Airbnb acquired in 2017. Airbnb stated that the patent was a continuation of its work after acquiring Trooly. Nevertheless, the company declined to comment on the extent to which the AI tool is used. The machine learning algorithm marks down people found to be connected with fake social networking accounts or who give out bogus information. Images or videos showing people associated with drugs, alcohol, hate speech, and the like also result in poor scores. The algorithm takes crime-related articles and posts on news websites into account to build a person graph and determine the suitability of the guest. How exactly the AI can predict a person's behavior from their online presence remains an open question. Airbnb has been dealing with guests trashing hosts' homes for quite some time now, and the company seems keen to deploy a system that weeds out at least some of the troublemakers, if not all. My concern is the same as before: I hope we are not rushing into relying on machines to make judgment calls that have long been the prerogative of humans.
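Neither the patent coverage nor Airbnb discloses the actual model, but the signals described above (fake accounts, bogus personal details, flagged media, crime-related posts) suggest a weighted risk-scoring pipeline. The following is a minimal illustrative sketch of that idea; the signal names, weights, and threshold are entirely hypothetical and are not Airbnb's or Trooly's actual system:

```python
# Toy risk scorer: each detected signal contributes a weight (points out of 100),
# and a booking is flagged for review when the total crosses a threshold.
# All names and numbers here are made up for illustration.

SIGNAL_WEIGHTS = {
    "fake_social_account": 40,   # linked to fake social networking accounts
    "false_personal_info": 30,   # gave out bogus information
    "flagged_media": 20,         # images/video tied to drugs, alcohol, hate speech
    "crime_related_posts": 35,   # crime-related articles or posts in the person graph
}

def risk_score(signals: dict) -> int:
    """Sum the weights of triggered signals, capped at 100 (higher = riskier)."""
    score = sum(w for name, w in SIGNAL_WEIGHTS.items() if signals.get(name))
    return min(score, 100)

def flag_reservation(signals: dict, threshold: int = 50) -> bool:
    """Flag a booking for manual review when the risk score meets the threshold."""
    return risk_score(signals) >= threshold

guest = {"fake_social_account": True, "crime_related_posts": True}
print(risk_score(guest))        # 75
print(flag_reservation(guest))  # True
```

A real system would replace these hand-set weights with a trained model over hundreds of signals, but the shape of the decision — score first, then gate the reservation — matches the process Airbnb describes.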