“Virtual reality is going to become a surveillance universe”

Future thinker Sophie Hackford talks VR, satellites and artificial intelligence at this year’s Power Of Film And Moving Image event.

Published: February 16, 2017 at 2:00 pm

This is an edited version of a lecture delivered by Sophie Hackford as part of Power Of Film And Moving Image, an annual cultural event and digital platform to expose and explore the ever-growing power of film and moving image to define and influence the modern world. See the talk in full at the bottom of the page.

We might think today that we are connected to each other but actually, we have really only just begun. In the 90s, I think you could have described the world as connected. Today, I think we need to think about it more as a kind of entangled concept. Entangled not just with each other but with machines, and I think it’s becoming impossible to opt out of that tangling, which brings with it a whole set of new opportunities but also new challenges.

I’m really excited about technology. I’m a huge optimist when it comes to a lot of the things that I like to think about: robotics, AI, gene sequencing, that sort of thing. But it does cause us to rethink every single one of the assumptions that we have, about our lives, about our businesses and about the world more broadly, including fundamental assumptions about film, how it’s made and how it’s used. Because the difficulty today is that we are suffering not from a lack of technology, but from a lack of imagination about what to do with it.

I want to talk a little bit today about a couple of technologies that I think are super powerful and are possibly being underestimated in their power, but also to help you ask perhaps better questions about what we might be sleepwalking into. I’m going to talk about the frontiers of artificial intelligence, because I believe we’re moving towards the sort of searchable real-time CCTV of the planet that I’ll discuss, and secondly about the development of virtual worlds or virtual universes and the implications I think that will have for us as a society.

I’m going to start by talking a little bit about this real-time CCTV of the planet. You can today look at the port of Long Beach, one of the biggest container ports in the United States, and watch everything that’s happening in real time, or, if you’re interested in the refugee crisis, map refugee camps from space. We suddenly bring an incredible real-time imagery and video layer to our understanding of the world that we’ve never really had before.

I think that’s really important, but the reason I think it is exciting is that artificial intelligence gives us the ability to find insight in some of these images, to spot weird anomalies that we would never have been able to see with our small, wet organic brains. I’m excited about solving equations to tell stories and I want to talk a bit about what I mean by that. But first, let’s talk about what AI actually is, because I think it’s something we talk about a lot and perhaps a lot of people don’t necessarily understand the concept.

Really all you need to know about it is that it’s maths, a bunch of algorithms - it’s not always going to be robots walking around talking to us. It’s about not just teaching the machines or the algorithms how to do what we already know how to do, it’s about finding things that we can’t already do that we can now solve using artificial intelligence.

Algorithms learning how to play games.

And like us they learn very quickly, once they’ve been thrown into a new environment. This (above image) is a set of algorithms playing lots of different computer games. They become very good at those computer games much quicker than we ever would, because they evolve just like we did over the last 50,000 years since we emerged from the savannah. They thrive, they reproduce, they die, they get rewards, and as they become more sophisticated, they learn to dominate whatever environment they’ve been released into. I read on the front page of the Financial Times this morning that there are now more algorithmic traders than there are humans trading in the US financial markets, and I would imagine that there might be some interesting challenges with that over the next few years.

But AI, I think, often sounds like a bit of an obscure technology, something that operates in its own niche or should have its own department within your companies or wherever. But I think that every company or individual should think about it like a commodity, like electricity or Wi-Fi, that we just stream and point at whatever challenges we have. I think that means every company or every country should have a legitimate artificial intelligence strategy, and unfortunately, most of the people that I talk to don’t have that in place. I’m very excited about how we can apply those learning algorithms, that mathematics, to the kind of signals that we’re drowning in today – and I’m sure you all feel, like I do, that we’re getting pummelled by data from every quarter, whether that’s smartphones or drones or what will become autonomous vehicles or satellites – because they’re constantly throwing data at us.

Real-time CCTV

I’m going to start with satellites because I’m particularly passionate about them, which I know sounds a bit of an obscure thing to be quite passionate about, but I’m passionate because I think these small satellites are going to give us real-time CCTV of the planet, and I want to explain what I mean by that and why that’s important. But the most important thing, I think, to bear in mind is the fact that it’s going to become searchable. We’re going to be able to search this CCTV of the planet in the same way that we can search the internet today to get insight from it.

So what might you want to search for? You might want to search using some of these satellites that are orbiting our planet with great regularity. You might be interested in where all of the 200,000 cargo ships around the world are. At the moment, they tell us that information by pinging a transponder. Now, if they are doing something they shouldn’t be doing, they turn their transponder off. From space, you can’t hide, and so there’s been some incredible transparency in supply chains with platforms like Windward, for example, who can demonstrate exactly where people are and what they are doing. You might be interested in how many people are at the Dakota Access pipeline today, yesterday, last month - maybe you want to extrapolate future information from that. You may be interested in how they’re mapping the evacuation of Aleppo.

If you happen to be interested in mining, or indeed gold, you can again use satellite imagery to track one of the world’s biggest mines that is being built in the Gobi desert at the moment, which apparently is going to represent almost a third of Mongolian GDP by the time it’s finished, so you can get incredible insights from that. Or if you’re cheeky and you want to understand what competitors are doing, you could watch their facilities, like Elon Musk’s Gigafactory in the Nevada desert, from space. You can see how much inventory is going in one end and how many finished products are coming out the other, and maybe use that for your competitive advantage.

Now Elon Musk was very generous to his competitors when he situated his factory in the Nevada desert, because there is very little cloud cover there, which means you can watch with ease. But the newest wave of satellites that are coming out now can see through clouds. So, as North Korea likes to move its sensitive things around under cloud cover because it knows no one can see, perhaps that sort of thing won’t be possible in the future.

Or Uber! Uber is using really, really detailed satellite imagery now to establish where road works pop up, which is much more accurate than waiting for the city to tell them what’s going on. And as we have autonomous vehicles, they’re going to have to rely very heavily on this kind of satellite imagery in order to be able to work out where to take us and how to get us there.

Satellite images showing the path of a tornado.

If you’re in the insurance business, the yellow dots on this map (above) are the trail of a tornado that went through Oklahoma – it’s kind of difficult to claim for something if you’re not on the map to demonstrate that you actually were damaged by the tornado, so this is a sort of verification that can be done from space.

Then there’s deforestation, which you can now map from space using satellites. For example, the platform Global Forest Watch basically aggregates lots of different satellite and other material to be able to monitor deforestation in real time.

So the sky is the limit, as they say. There are plenty of companies that are helping us understand this sort of thing, and perhaps the ultimate end lies in fact-checking in our post-truth world. Perhaps we could finally work out how many people there were at the inauguration, which obviously people are very upset about trying to establish at the moment. But of course in the future, without trying to freak you guys out too much, we would probably be able to work out who was at the inauguration as well, given that faces are just a collection of pixels. Pixels are pretty easy to search for. And so, I think we could imagine, as I said, that this searchable CCTV moment that we’re moving into is not without controversy.

So, you can layer all kinds of things. When we have these autonomous vehicles, like Google’s Waymo, they’re going to have to scan the environment at all times in 360 degrees, in very high-resolution, high-definition video. That means that every one of these vehicles that’s moving around our cities will be constantly scanning the area around it. Indeed, we can layer that satellite imagery not just with the autonomous vehicle data, but also with drones like the Hover Passport drone, which can recognise your face and follow its owner around because it has very sophisticated machine vision and facial recognition technology on board. All of this stuff, when you start layering it on top of each other, brings a very comprehensive view of our world that crucially, I think, isn’t just about giving you the answers to questions you’ve asked, it’s about finding things that we may never have thought to ask about in the first place.

We could also, when we get more and more of this imagery, scroll back in time, look over certain areas and see what has been going on, because this kind of super intelligence can basically process millions of these images in a heartbeat. That’s very exciting and very interesting, but could be a little bit scary as well.

Of course, just to be clear, we are at the absolute beginning of this journey. These are all the earliest iterations. Even this drone is the earliest iteration of these technologies. This is just with the technology we have available today, not considering what’s going to be coming in the future.

So, I hear you all ask: “Well, this all sounds very thrilling, but is it legal?” – which would be a very sensible question to be asking. And the answer is: “At the moment? Yes, it is.”

Obviously, there are huge privacy and security implications in all of this, but none of this sort of technology has been tested in court yet. And I think we can imagine a lot of precedents are going to be tested, but at the moment it’s a bit of a free-for-all up there and we really have very little regulation on a lot of these things.

Surveillance Universe

So if you think that smartphones, smart vehicles, satellites and autonomous drones are surveillance devices, I’m now going to argue, in the second part of my speech, that virtual reality is going to become a surveillance universe - something that we willingly move into and in which every aspect of our virtual experiences is monitored and tracked.

I’m going to talk about synthetic realities. So first, much like my primer on artificial intelligence, I’m just going to give you a two-second overview of the difference between augmented and virtual reality, so you can snooze for a couple of seconds if you already know.

Virtual reality is when you wear a headset. It covers your eyes so you disappear into a completely synthetic, simulated virtual world, a bit like using Google’s Daydream or the HTC Vive headset. Augmented reality is like Microsoft’s HoloLens, which senses the environment around you and puts on top of that environment maybe audio, maybe some visual stuff - basically information on top of your real world in real time. So it’s constantly working out where you are so it can augment what you can see, a bit like a really, really sophisticated version of Pokémon Go.

So the really important thing that I want you guys to take away today about virtual reality is the depth of experience that you get from it, and I want to talk a bit about that. I think sometimes people think that virtual reality is a gaming technology or a gimmick or something for marketing, and not a profound technology that we’re all going to be using. I hope to change your mind a little bit about that in my remaining minutes.

I believe that it’s a really powerful environment and a powerful experience because it makes you feel like you’re actually in it. Google calls the people who use its platform “visitors” rather than “viewers”, just to distinguish between the experience you might get in the virtual world and watching a movie or a 3D movie. The key discovery has been that even though the environments that you visit are fake, the experiences that you have within them are real - this is really, really important.

People remember virtual reality experiences not as something that they saw but as something that happened to them, which is, again, very, very different from a 2D or 3D movie experience. There aren’t many good examples of this, so it’s always difficult when I give talks to try and show how this is going to play out, because, again, like the drones and others, we are at day one of this technology. But I’m going to try now with a couple of examples to give you a sense of what that’s going to feel like and just how powerful it really is.

A burns victim uses a virtual world to help alleviate pain © U.W. HITLab

The idea for a lot of medical interventions using virtual reality is that because the brain is so convinced that what it is experiencing is actually happening, it can actually be used for pain relief. The above image shows a burns victim (unfortunately, when you’ve had serious burns you have to remain awake while you’re being treated), so what they have done is create a virtual environment. It’s actually not a particularly well animated thing - it looks a bit sketchy from a graphics perspective as far as I am concerned - but it’s a very cold environment. They’ve actually found that for people who are having those sorts of experiences while they’re undergoing this treatment, it was about 60 or 70 per cent more effective than morphine in reducing their pain and distracting them from what was going on on their bodies. As a result, they’ve started using virtual reality platforms to treat PTSD and to treat paranoia (they put you in a sort of virtual train carriage underground and start putting more and more humans into that carriage to help you with your anxiety issues or whatever else, which sounds like a horrific thing to do).

They also call virtual reality an “empathy machine”, so that you can experience what it’s like to be an asylum seeker. It’s also, interestingly, able to show people what it’s like to have dementia or migraine, so hopefully treatments could be made better.

A more light-hearted example, which is one of my favourites, is this snooker player - and I fear I have forgotten his name, which is really embarrassing, but he’s quite famous in the UK [Ed: it was Ronnie O’Sullivan FYI, and the video is rather funny]. He was convinced that he was at a real snooker table even though the graphics aren’t actually terribly good. That a professional snooker player can be, sort of, foxed by the technology is quite extraordinary. Again, we’re really just at the beginning of this.

And because we have this massive leap at the moment in facial and emotion recognition – I talked about that drone which can recognise your face, and we’ve got eye-tracking technology, and platforms that can recognise the emotions flickering across your face even if you don’t realise that’s how you’re feeling - they’re already being used to tailor advertising to whoever is around them, for example to whatever gender they are and whatever age they might be. Even amazing people like Fadel Adib at the MIT Media Lab can do remote sensing of your deep physiological responses to things. Below is an example of measuring the heart rate of a child in a cot, but remotely - there are no sensors on the child or anything else, he’s just using wireless signals or indeed video.

https://youtu.be/-HcKRBKlilg

But I think emotion recognition is going to be heavily used in virtual reality because - as I promised I would talk about it being a surveillance universe - no two virtual reality experiences are going to be the same, because they’ll be tailored to us. AI will understand exactly how we’re feeling, what our responses to a movie or an advert or a virtual world or a virtual store might be. Every store is going to be our favourite store. It will be tailored to us. There’s a video from Alibaba where you can see that they’re creating virtual store experiences. But I think this is going to make advertising incredibly tailored, because you can’t really look anywhere else when you’re in that virtual environment. You’re going to have to look at whatever adverts are put in front of you.

Once technology allows it, I think you’re going to be able not just to change the ending of whatever movie you’re in or whatever else, but it’s going to become so tailored that you become the character in the film and the story is happening to you. The movie is watching you. And I think this is a really important thing we need to think about. You might be dancing along with Ryan Gosling in La La Land or holding the same objects as the characters in Game of Thrones, but these virtual platforms are going to collect such personal data about us - how we look, how we’re feeling, what we choose, who we hang out with, the content we consume, how we feel when we’re doing that – that I think virtual reality companies will become the biggest data companies in the future. And I’m sure they will resell this incredibly personal data about us, as Google and Snapchat and others do already today.

Understanding the ethics of VR

But I’d like to ask some very big questions about VR in the dying moments of my talk, because I don’t think we even understand the benefits and realities of today’s internet, let alone what we’re entering: a new internet full of incredibly synthetic and powerful experiences.

I want to talk about security and privacy. Data is collected on our actions in virtual reality, and it could actually be used in the future to determine our bank loans or our dating profiles or an adoption application or whatever you might be doing – that’s kind of scary, because we don’t know how our behaviour in virtual worlds will actually impact real life.

With social conventions, people are going to be beamed into our lives, using things like the Microsoft HoloLens. This is an incredible step, but what does it mean when you have someone who’s a virtual avatar of themselves in your life? Can you put your hand through their body? That would be kind of weird. What’s the concept of personal space in platforms like this? Even with good intentions, you can feel pretty uncomfortable or unsafe in these environments, so we’ve got to be very careful about how we think about that.

Of course, the manipulation of the real world is quite a big deal. The fact that they’re putting new data into our real world is kind of important. And my big question is, if this can fix PTSD, what could it do to healthy brains? How can we design this very, very carefully, given there are no rules to these kinds of environments?

So, a couple of things I want to finish with. This is a grand social experiment. I think it’s really important that we think about how our culture will evolve when reality becomes something we could perhaps check out of. Indeed, there are many different types of reality, and I don’t think we’ve really thought that through.

Also, AI isn’t neutral; it’s kind of imbued with the biases of the people who have programmed it. Given that algorithms are being used for predictive policing, or by judges to determine sentence lengths, it can be very scary if those biases are fed throughout the entire system.

Of course, my biggest question is: can we opt out? And - I’m sure you’re all asking this - is there a way that we can avoid being monitored at all times? Well, the short answer, I’m afraid, is no, but artists, helpfully, have been doing things like creating make-up that you can use in order to scramble the kind of image recognition and emotion recognition cameras that are out there. I’m not sure how practical it is to have a crazy asymmetrical hairstyle and put make-up on your face in order to avoid being followed, but at the moment, that’s about the best that we can do. I’ve also seen anti-drone hoodies and anti-facial-recognition burqas, but these are all artists putting these ideas out there to help us confront some of the really big questions that we have about technology.

I’m going to stop there, but I do think this is one of the most interesting times to be creating using these new media. At the same time, I think we’ve got some pretty important questions to ask ourselves about how okay we are with these new realities that we are creating. Thank you.
