A whistleblower complaint by a former Facebook employee revealed how the company struggles to monitor dangerous content far from Silicon Valley. Internal concerns were raised that the moderation algorithms for languages spoken in Pakistan and Ethiopia were inadequate, and that there was too little training data to tune the system for different dialects of Arabic.

In response, Meta has announced that a new AI (artificial intelligence) moderation system has been deployed. Because it requires far less training data than conventional systems, it can be put to work quickly when moderation guidelines change. The new system, called Few-Shot Learner (FSL), supports more than 100 languages and works with images as well as text, Meta said. According to the company, FSL will cut the time it takes to begin automatically enforcing a new moderation rule from roughly six months to roughly six weeks. One example is a rule introduced in September this year that bans content discouraging COVID-19 vaccination, even when the posts are not outright lies. Meta also said that FSL, which was first deployed earlier this year, has helped reduce hate speech on the platform, with the prevalence of hate speech reportedly declining from mid-2020 through October of this year. The company did not provide details on how the new system works.
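Meta has not published FSL's internals, but the general idea behind few-shot learning can be sketched in a toy example. Here a hypothetical new moderation rule is defined by only a handful of labeled example posts, and new posts are classified by similarity to those examples; the crude bag-of-words "embedding" stands in for the large multilingual encoder a real system would use. Everything here (function names, example data) is illustrative, not Meta's actual implementation.

```python
# Toy few-shot text classifier. This is a hypothetical sketch of the
# general few-shot idea, NOT Meta's actual Few-Shot Learner.
from collections import Counter
import math

def embed(text):
    """Crude bag-of-words vector; a real system would use a trained
    multilingual encoder here."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def few_shot_classify(post, examples):
    """examples maps label -> list of example texts; only a few examples
    per label are needed, which is the point of the few-shot setup."""
    post_vec = embed(post)
    best_label, best_score = None, -1.0
    for label, texts in examples.items():
        score = max(cosine(post_vec, embed(t)) for t in texts)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# A handful of labeled examples stands in for the months of training
# data a conventional classifier would need for a new policy.
examples = {
    "violating": ["vaccines are a scam avoid them",
                  "do not get vaccinated ever"],
    "benign": ["I got my vaccine appointment today",
               "the weather is nice"],
}
```

The contrast with a conventional classifier is the amount of supervision: instead of collecting and labeling a large corpus for every new rule, the rule is specified with a few exemplars, which is why such systems can respond to new guidelines in weeks rather than months.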
The new system won't solve all of Facebook's content moderation problems, but it illustrates how heavily the company relies on AI to tackle them. Facebook built a global platform on its promise to bring people together, yet the network also became a hotbed of hatred and harassment. A United Nations report blamed Facebook for facilitating the massacre of Rohingya Muslims in Myanmar. Facebook has long argued that AI is the only practical way to monitor its vast network, but despite recent technological advances, AI remains far from understanding the subtleties of human communication. Facebook recently announced automated systems for detecting hate speech and terrorist content in more than 50 languages, yet its services are offered in more than 100.