What’s in store for the future of TV and film, from the godfather of streaming tech

Emmy Award winner Alan Bovik’s work affects most of the images and videos you see online. Here is what he predicts for the future of streaming.

Published: September 1, 2022 at 8:00 am

Streaming has become a huge part of our everyday lives. Whether you’re sitting down to watch Netflix in the evening, or simply scrolling through social media, most people are seeing digital content continually throughout the day.

While many people are involved in producing the content you see, one person has had a hand in a large portion of it: roughly 80 per cent of all visual content you see online, in fact.

That person is Alan Bovik, an image-processing researcher who, along with his team, has created algorithms that affect how large portions of TV, film and online media are compressed, work that has earned him both an Emmy and the 2022 IEEE Edison Medal.

We spoke to Bovik to better understand streaming, the neuroscience behind it and the future of video.

Changing the world of streaming

Bovik works in the field of image and video processing, creating theories and algorithms to modify visual signals such as television pictures, smartphone video or even medical images. He also works in visual neuroscience.

“I started working with visual psychologists and neuroscientists trying to understand how we see, and bringing those theories into image and video processing algorithms. Digital images are intended for human eyeballs, so the better job we can do at processing them in a way that matches how we see them, the more efficient we can be about everything involving images and videos,” says Bovik.


When Bovik started in this field, he was tackling a problem seen as unsolvable: predicting how a human would rate the quality of an image. It is easy to say whether something looks blurry or noisy, but pinning down the exact level of quality is a challenge; even blur occurs on a long sliding scale.

“We tried to model visual perception of distortion as it occurs inside the visual parts of the brain. A student of mine and I developed an algorithm known as 'structural similarity'. We just needed to assess quality for something else we were doing but the TV industry quickly picked up on it, and soon it was used throughout TV and now the internet,” said Bovik.
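For readers who want to see the idea in concrete terms, here is a minimal sketch of the structural similarity (SSIM) calculation in Python. It computes the statistic once over a whole pair of images, whereas the published algorithm averages it over many local windows; the constants follow the commonly cited formulation and the function name is just illustrative.

```python
import numpy as np

def ssim_global(reference, distorted, data_range=255.0):
    """Simplified, single-window structural similarity score.

    The full SSIM algorithm averages this statistic over local windows;
    computing it once over the whole image keeps the idea visible.
    """
    x = reference.astype(np.float64)
    y = distorted.astype(np.float64)

    # Stabilising constants from the commonly cited formulation
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2

    mu_x, mu_y = x.mean(), y.mean()            # average brightness (luminance)
    var_x, var_y = x.var(), y.var()            # contrast
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()  # structural agreement

    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )
```

A score close to 1 means the two images look nearly identical; the score falls as distortion becomes more visible.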

“Every video you watch, on your devices, on your television, is compressed, but it's hard to know how much to compress the signal without distorting it, especially perceptibly distorting it. You need to compress as much as possible to reduce the bandwidth of a video, but not so much that consumers begin to notice the reduction in quality.”

The algorithms Bovik and his student developed provided a way to handle this compression problem optimally. Every single picture on Facebook, and every major streaming platform, uses Bovik’s algorithms; in fact, about 80 per cent of all visual content online passes through them.
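To illustrate how a perceptual score can steer compression, here is a hedged sketch: it re-encodes a frame at progressively lower JPEG quality settings and keeps the smallest version that still scores above a chosen similarity threshold. The threshold, the use of JPEG, and the Pillow and scikit-image libraries are assumptions for the example, not a description of any platform's actual pipeline.

```python
import io

import numpy as np
from PIL import Image
from skimage.metrics import structural_similarity

def smallest_acceptable_jpeg(frame: np.ndarray, min_ssim: float = 0.95) -> bytes:
    """Return the most heavily compressed JPEG that still looks acceptable.

    `frame` is an 8-bit RGB image; `min_ssim` is an illustrative threshold,
    not a value used by any real streaming service.
    """
    original = Image.fromarray(frame)
    best = None

    # Sweep from light to heavy compression
    for quality in range(95, 4, -5):
        buffer = io.BytesIO()
        original.save(buffer, format="JPEG", quality=quality)
        decoded = np.asarray(Image.open(io.BytesIO(buffer.getvalue())).convert("RGB"))

        score = structural_similarity(frame, decoded, channel_axis=-1)
        if score >= min_ssim:
            best = buffer.getvalue()   # smaller file, still looks fine
        else:
            break                      # the drop in quality would be noticeable

    return best
```

A real encoder works on motion-compensated video rather than single JPEG frames, but the principle of trading file size against a perceptual score is the same.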

Diving into the metaverse

The metaverse is a buzzword that seems inescapable today. Whether it’s Facebook’s transition to being a metaverse-first company, or the many announcements of companies building their own digital worlds, it really does seem to be everywhere. But achieving these digital worlds isn’t easy.

“Achieving the metaverse is challenging because it will be so data intensive. With virtual reality (VR), most of the video is gameplay and games require huge amounts of bandwidth. To reproduce those videos inside a VR helmet in a realistic way, you need more resolution than 4K,” says Bovik.

“Already the amount of data is shooting up, and because your eyes are darting around a lot, video has to be responsive to that. We’d need to increase frame rates more for VR, which again means even more data.”
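A rough back-of-envelope calculation shows why the data climbs so quickly. The resolutions, frame rates and the assumption of uncompressed 24-bit colour below are illustrative figures, not the specifications of any particular headset or service.

```python
# Rough, uncompressed data rates; all figures are illustrative assumptions,
# not headset or broadcast specifications.

def raw_gbps(width: int, height: int, fps: int, bits_per_pixel: int = 24) -> float:
    """Uncompressed video bit rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

scenarios = {
    "4K TV at 60 fps":             (3840, 2160, 60),
    "8K-class VR view at 90 fps":  (7680, 4320, 90),
    "8K-class VR view at 120 fps": (7680, 4320, 120),
}

for name, (width, height, fps) in scenarios.items():
    print(f"{name}: ~{raw_gbps(width, height, fps):.0f} Gbit/s before compression")
```

Even before rendering separate views for each eye, the jump from 4K television to high-frame-rate VR multiplies the raw data several times over, which is why perceptually tuned compression matters so much.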


Along with the data-intensive aspect of VR-led metaverses, there is also people’s health to consider: feeling ill or motion-sick is a problem that frequently occurs in VR.

These feelings come from conflicts in your oculomotor system – the neural architecture that drives the movement of the eyes.

When you're in a VR helmet, your eyes are only an inch or two away from a display, and that fixed screen is what they focus on.

But if something in the content appears to move back and forth in depth, your eyes will try to verge, rotating towards or away from each other to track it. Vergence tells your brain how far away things are, and that signal clashes with the focus cue saying the screen is right in front of your face.

"To solve this, we can try to create algorithms that change the content so it doesn't happen. There might be an evolution of VR devices so that displays work a bit differently," says Bovik.

"Maybe you could control your VR where you turn off or adjust the amount of 3D, or it could happen automatically to aid the eyes and reduce sickness."

How will streaming change in the future?

The future of streaming is full of possibilities. While science-fiction often depicts a world with screens plastering every wall, Bovik doesn’t think that will be the full picture.

“In Ray Bradbury's Fahrenheit 451, he describes scenes where screens plaster all of the walls. I don't think it's going to come to that, but I do think he was correct that televisions got huge and immersive,” says Bovik.

“We have 4K today, soon we'll have 8K at home. Those screens are so expensive right now, but soon they'll be affordable. Truly immersive experiences in your home that are of unbelievable clarity, quality, sharpness, depth of colours, everything you want, high speed, everything. It'll change our experience into a truly immersive one.”

Bovik believes that we are quickly heading towards a world where the cinema will be in danger. With home streaming constantly improving, cinemas could come under pressure to compete.
