Meta, formerly known as Facebook, said it has increased its efforts in combating misinformation by investing in teams, technology and partnerships to ensure the safety of people using its platforms ahead of the upcoming presidential elections in Brazil.
The company said that since 2016 it has quadrupled its security and integrity workforce to more than 40,000 people globally. Last year alone, it invested nearly $5 billion in both areas.
“We know that local knowledge is essential for this work to be effective, so we also have a large team of specialists based in Brazil who have a deep understanding of the situation,” the company said in a statement.
“These efforts are intensified as the election approaches, and our work to protect the integrity of our platforms will continue after the vote,” it added.
Among its many responses to potential interference in the electoral process, the company will remove content that violates its policies on voter suppression, such as posts that discourage people from voting.
Meta will also take action to prevent hate speech and the incitement of violence on its platforms.
“Currently, 99.7 per cent of the fake accounts we remove from Facebook are deleted by artificial intelligence before users report them. We also investigate and disrupt networks that use fake accounts in a coordinated way to influence public debate,” the company said.
Closer to October, Meta will activate an Elections Operations Center focused on Brazil, an initiative it has run since 2018, bringing together experts from across the company, including its intelligence, data science, engineering, research, operations, public policy and legal teams.
These teams work together to identify potential threats on the company's platforms in real time, accelerating its response.