AI in the ranks of Facebook: a new chatbot to fight… criticism?

Since spring this year, Facebook has been testing a new tool: a chatbot called Liam Bot. It is not intended for public use, however, but only for internal use by employees. What exactly is its role, and why was it created at all?
In recent years, Facebook has been struggling with something of an image crisis, brought on by a series of missteps and scandals large and small. A few months ago, news broke of a quiet change to the slogan on the login page – you can read about it here. In 2018, Facebook was sued for allowing the data of as many as 87 million users to be harvested by Cambridge Analytica. On top of that, the giant stands accused of doing too little to stem the spread of fake news. It is not only users who object: current employees, worn down by the accumulating negatives, have begun to consider changing jobs, and prospective candidates, faced with the allegations against Facebook, decline to submit applications.
Among those who have distanced themselves from Facebook are the family members and friends of its current employees, who find themselves fending off the allegations; protecting the company's good name is, after all, part of their job. It is difficult to be completely fair to one's relatives and one's employer at the same time: to avoid slandering the company without misleading family and friends. These awkward situations arise mainly during the holiday season, when people gather in larger groups and catch up on what is going on in their lives. Facebook employees reported the problem to their employer, admitting that they could not give balanced, diplomatic answers to relatives' questions about these crises, questions that tend to come loaded with accusations. The people at the top therefore decided to deploy artificial intelligence as a help center for employees.
Since Liam Bot was released just before the American Thanksgiving, interested employees had their first real chance to test its capabilities. When an inquisitive cousin asks what Facebook is finally going to do about ubiquitous hate speech, the chatbot suggests the answer: “Facebook has already hired more moderators to more effectively control content on the platform.” In some cases the bot instead points employees to articles on the Facebook blog, the help center, or the FAQ, where they can quickly find information on a given issue. From there the path is easy: the employee can either study the matter and give the family member a thorough answer, or send them to the source to draw their own conclusions.
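The behavior described above – matching a sensitive question to a canned answer, with a fallback to the help center – can be sketched in a few lines. To be clear, this is purely illustrative: Facebook has not published Liam Bot's implementation, and the questions, answers, and keyword-matching logic below are invented assumptions, not the company's actual code.

```python
# Hypothetical sketch of a keyword-based FAQ responder, loosely
# illustrating the kind of lookup the article describes.
# All entries and the matching strategy are invented for illustration.

FAQ = {
    "hate speech": (
        "Facebook has already hired more moderators to more "
        "effectively control content on the platform."
    ),
    "fake news": "See the company blog post on misinformation policy.",
}

def answer(question: str) -> str:
    """Return a canned reply whose keyword appears in the question,
    or fall back to pointing the employee at the help center / FAQ."""
    q = question.lower()
    for keyword, reply in FAQ.items():
        if keyword in q:
            return reply
    return "Please check the internal help center / FAQ."

print(answer("What is Facebook finally doing about hate speech?"))
```

A real system of this kind would more plausibly use intent classification or semantic search over a knowledge base rather than literal keyword matching, but the overall shape – question in, curated talking point or FAQ pointer out – is the same.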
Facebook has therefore answered its employees’ request, it seems, even more fully than anyone expected. Whether the chatbot actually does its job will probably become clear only after Christmas – a season particularly rich in family gatherings and heated discussions at the table.