The Met’s firearms officers have begun filming their training sessions as part of an initiative to help stop terrorists live-streaming attacks online.
The training footage will be shared with Facebook to help it develop technology that can identify live-streaming of shootings on its website.
Such technology could help the company notify police of an attack early on and prevent the broadcast of such atrocities on the social media platform.
The footage will also be provided to the Home Office, so that it can be shared with other technology companies to develop similar methods to stop live-streaming of firearms attacks elsewhere online.
Commander Richard Smith, head of the Met’s Counter Terrorism Command, said: “Facebook reached out to the Met as we have worked with them on numerous occasions before to remove online terrorist propaganda.
“The live-streaming of terrorist attacks is an incredibly distressing method of spreading toxic propaganda, so I am encouraged by Facebook’s efforts to prevent such broadcasts. Stopping this kind of material being published will potentially prevent the radicalisation of some vulnerable adults and children.
“The footage we are capturing shows our highly skilled firearms officers training to respond with the utmost expertise to a wide range of scenarios, including the kind of attacks we want to stop terrorists broadcasting.”
Officers from the Met’s world-renowned specialist Firearms Command regularly train in how to respond to myriad scenarios, from terrorist incidents to hostage situations, on land, public transport and water. The footage they provide – captured on cameras attached to their bodies – will show a “shooter” perspective in a broad range of situations.
This varied imagery – combined with video from law enforcement in other countries – will help Facebook gather the volume of footage needed so its technology can learn to identify live footage of an attack, and subsequently remove it.
The Met became involved in the project as a direct result of the world-leading national Counter Terrorism Internet Referral Unit’s (CTIRU) long-standing relationship with Facebook. This national team based within the Met – the first of its kind globally – works with service providers and social media companies like Facebook to ensure the removal of harmful terrorist material from the internet. The CTIRU actively assists hundreds of national counter-terrorism investigations, identifying specific UK-based threats and then supporting investigations into the individuals or networks behind them.
It was through this relationship that Facebook approached the Met to seek assistance with the idea.
Erin Saltman, Counter-Terrorism Policy Manager at Facebook, said: “Violent extremist and hate-based content has no place on our platforms, and in the last two years we have removed 26 million pieces of content from global terrorist groups. We are investing heavily in people and technology to keep this content off our platforms, but this industry-wide problem is not a fight we can win on our own. The footage from this partnership with the Met Police will improve our artificial intelligence technology, helping us more quickly identify and remove dangerous content. Crucially, we will make this technology available to the wider tech industry so that, collectively, we can prevent the spread of harmful content.”