Twitter announced on Monday it has acquired Fabula AI, a London-based machine learning research company. Fabula AI’s team will join Twitter and work alongside Sandeep Pandey as part of Twitter’s research group focused on natural language processing, reinforcement learning, machine learning ethics, recommendation systems and graph deep learning.
Why we should care
Fabula’s graph deep learning research is used to detect network manipulation. With this acquisition, Twitter aims to use Fabula’s capabilities to better identify bad actors and malicious behavior on the platform, in addition to enhancing its recommendations processes.
While the Fabula research team will initially focus on improving the health of conversations happening on Twitter, the company said the team’s efforts will expand in the future, aiming to help stop spam and abuse, as well as improve recommendations, the Explore tab and the onboarding experience.
“This strategic investment in graph deep learning research, technology and talent will be a key driver as we work to help people feel safe on Twitter and help them see relevant information,” wrote Twitter CTO Parag Agrawal on the company’s blog.
The more Twitter is able to safeguard its app from abusive content and malicious actors, the better it will serve marketers and advertisers. A healthy platform that can quickly and easily weed out intrusive, harmful content creates a more inviting experience for users — opening up lines of communication between marketers and their audiences on the app.
More on the news
- Fabula’s founders Michael Bronstein, Damon Mannion and Federico Monti and the rest of the team will become part of Twitter’s Cortex team of machine learning engineers, data scientists and researchers.
- In addition to joining Twitter, Michael Bronstein will retain his position as the chair in machine learning and pattern recognition at Imperial College London.
- Twitter has been talking about the health of its platform for more than a year now — in March 2018, CEO Jack Dorsey addressed the issue during a live Q&A, saying the company was working on how it could measure the health of the platform in a way that was public and accountable.