How a “new kind of physics” could track down extremists online

When online extremist groups get broken up by the authorities, their members just regroup elsewhere. Here’s how one physics professor thinks we could end this game of digital whack-a-mole once and for all.

Published: May 19, 2021 at 11:00 am

I started researching extremist activity online with an Intelligence Advanced Research Projects Activity (IARPA) competition in 2011. The challenge was: given publicly available information, what can you predict about future events in the real world?

At that stage, we were looking at Twitter. Twitter is the equivalent of the guy standing outside with a megaphone. Everybody stops and takes selfies, and it attracts attention, but I’ve never seen anyone go, “you’re right, you’ve changed my thinking”. That happens in other places online, in communities where trust is built. That doesn’t exist on Twitter.

In 2014, a Russian speaker in my group hit upon activity around Islamic State (ISIS) on VKontakte, the Russian social media platform. We tried to find them on Facebook, but Facebook had already shut them down. VKontakte, though, would only occasionally shut them down. That was interesting, because if VKontakte had just let them go and not interrupted them, ISIS would have grown into one big blob of ice, easy to get rid of. But it didn’t.

We traced the ISIS communities as they formed a little group here and a little group there. I think of it like bugs in the yard: if you stand in one corner with a shoe and hit them one at a time, you’ll disrupt them, but you won’t stop them. It’s the same with deplatforming. Sure, some groups just won’t get their act together afterwards, but the rest will still find one another. And that’s how we came up with the gelation theory of online extremist support.

The idea is that people who appear to be different from one another can ‘gel’ into hate groups online. And it’s not only the obvious hate communities you need to worry about and shut down. For every one of those, there are about 5 to 10 less obvious communities linked to it.
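To make that gelation picture concrete, here is a minimal toy simulation. It is not the author’s actual model: it simply assumes that clusters merge with probability proportional to their sizes (a multiplicative, Smoluchowski-style kernel, which is known to produce a sudden ‘gel’ of one dominant cluster) and that an occasional shutdown scatters a cluster back into individuals who can regroup. The population size and shutdown rate are illustrative assumptions.

```python
# Toy coalescence-fragmentation sketch of online 'gelation'.
# NOT the author's model; all parameters are assumptions for illustration.
import random

N = 1000           # total individuals (assumed)
SHUTDOWN_P = 0.01  # chance per step that one cluster is broken up (assumed)
STEPS = 20_000

clusters = [1] * N  # everyone starts as an isolated individual

for _ in range(STEPS):
    if len(clusters) > 1:
        # Pick two clusters with probability proportional to size, so the
        # merge rate scales as size_i * size_j (a multiplicative kernel).
        a, b = random.choices(range(len(clusters)), weights=clusters, k=2)
        if a != b:
            clusters[a] += clusters[b]  # the two clusters 'gel' into one
            clusters.pop(b)
    if random.random() < SHUTDOWN_P:
        # A platform shuts one cluster down: its members scatter back
        # into isolated individuals, free to regroup later.
        i = random.randrange(len(clusters))
        clusters.extend([1] * clusters.pop(i))

print(f"{len(clusters)} clusters; largest = {max(clusters)}")
```

In typical runs a dominant cluster re-forms soon after each break-up, which is the whack-a-mole dynamic described above.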

The ISIS communities never called themselves ‘ISIS Online’, and they never showed the flag on their pages. Extremist organisations go through groups devoted to pet lovers, parents, or people interested in alternative health. They target specific words and titles. That’s the bait. Those sites are watched by the hate groups, which then direct recruits to where the core discussion is – usually on a private server or an encrypted site.

That makes them hard to track, and it makes them resilient to efforts to stop them. Facebook have hit a lot of the communities that we would call ‘the bad ones’, but they’ve missed the other, less obvious communities. And we don’t have a map to see where they’ve intervened and why it isn’t working.

The links between people, groups and clusters change all the time. Hyperlinks appear and are taken down; a network can look one way one day, and different the next. I need to know how these clusters connect and how they help certain content survive and get more exposure.

I need to create new physics that can model how these communities behave. If we can understand and predict the dynamics of these networks – like we would a cluster of cosmic dust, or a glass of curdling milk – then we can see where intervention can take place to curb extremism.
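As a rough way of asking ‘where should intervention take place?’, the same toy model can compare two policies: always breaking up the current largest cluster versus breaking up clusters at random. Again, the merge rule, rates and population size are illustrative assumptions, not the author’s model.

```python
# Compare two intervention policies in the toy gelation model above.
# NOT the author's model; parameters are assumptions for illustration.
import random

def simulate(policy, n=1000, steps=20_000, hit_p=0.01, seed=0):
    rng = random.Random(seed)
    clusters = [1] * n
    for _ in range(steps):
        if len(clusters) > 1:
            # Size-weighted merging, as in the earlier sketch.
            a, b = rng.choices(range(len(clusters)), weights=clusters, k=2)
            if a != b:
                clusters[a] += clusters[b]
                clusters.pop(b)
        if rng.random() < hit_p:
            # Intervention: break one cluster back into individuals.
            i = (clusters.index(max(clusters)) if policy == "largest"
                 else rng.randrange(len(clusters)))
            clusters.extend([1] * clusters.pop(i))
    return max(clusters)  # size of the biggest surviving cluster

for policy in ("largest", "random"):
    print(policy, simulate(policy))
```

In runs of this sketch, targeting the largest cluster tends to keep the giant component smaller than random hits at the same rate; a real ‘map’ of the network would aim to identify such leverage points before intervening.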

The future of tracking extremist clusters isn’t machine learning or AI. Humans will find ways to work around the bots. But it will be some combination of people and machines. It’s a work in progress. Humans have intuition. Nobody’s built a machine with intuition. Yet.

Interviewed by Aleks Krotoski
