Facebook removed 1.9 million pieces of terrorist content in Q1 2018, vows to improve the platform

Facebook today detailed its approach to terrorism on its platform. The company says it defines terrorism as any non-governmental organization that engages in premeditated acts of violence against persons or property to intimidate a civilian population, government, or international organization in order to achieve a political, religious, or ideological aim.

This covers everything from religious extremists and violent separatists to white supremacists and militant environmental groups, though Facebook clarified that its counterterrorism policy does not apply to governments. The company's policy prohibits terrorists from using Facebook and its services, and its detection technology currently focuses on ISIS, al-Qaeda, and their affiliates.

The counterterrorism team currently has over 200 members, up from 150 in June. In Q1 2018, Facebook took action on 1.9 million pieces of ISIS and al-Qaeda content, almost twice as many as in the previous quarter. It also states that 99% of the ISIS and al-Qaeda content it took action on was not reported by users but was detected by its internal reviewers. When a profile, Page, or Group posts content related to terrorism but does not itself violate Facebook's policies, the company removes only the offending post rather than the entire profile, Page, or Group.

Facebook is also vowing to remove terrorism-related content as soon as it is posted. It says it has improved enforcement and prioritized identifying newly uploaded material. In addition to new uploads, Facebook says it has built specialized techniques to surface and remove older content; in Q1 2018, more than 600,000 pieces were identified through these mechanisms.

Facebook also says it is constantly improving, and that its teams of researchers and reviewers regularly find new ways in which the platform is being misused.
