Deepfakes – videos where a person’s face or body has been digitally altered to mimic someone else – are getting disturbingly good: they now even have heartbeats.
This finding, detailed in a new study published in the journal Frontiers in Imaging, could render some of the most advanced deepfake detectors – which rely on analysing consistent patterns of blood flow across a person's face – essentially useless, making dangerous content even harder to spot.
Deepfakes are usually created from ‘driving videos’, real footage that artificial intelligence manipulates to alter the features or even the entire identity of a person on film.
Not all uses are malicious: smartphone apps that age your face or turn you into a cartoon cat, for example, use similar underlying techniques for harmless entertainment.
At their worst, however, deepfakes can be weaponised to create non-consensual sexual content, spread misinformation or frame innocent people.

In the study, researchers used a cutting-edge deepfake detector built on medical imaging technology.
Remote photoplethysmography (rPPG) estimates pulse by detecting tiny changes in how light passes through the skin as blood flows beneath it – the same principle behind the pulse oximeters used in hospitals.
rPPG-based detectors are impressively accurate: when compared against electrocardiogram (ECG) recordings, their pulse estimates differed by just two to three beats per minute.
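To make the idea concrete, here is a minimal Python sketch of the general rPPG principle – not the detector used in the study – that estimates a pulse rate from the average green-channel brightness of a face region across video frames. The function name and thresholds are illustrative, and it assumes NumPy and SciPy are available.

import numpy as np
from scipy.signal import butter, filtfilt

def estimate_pulse_bpm(green_means, fps=30.0):
    # Illustrative sketch only: green_means is the per-frame mean green-channel
    # value of a face region; fps is the video frame rate.
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()  # remove the constant brightness offset

    # Keep only frequencies in a plausible human heart-rate range
    # (0.7-4 Hz, roughly 42-240 beats per minute).
    b, a = butter(3, [0.7, 4.0], btype="band", fs=fps)
    filtered = filtfilt(b, a, signal)

    # Take the dominant remaining frequency as the pulse.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    in_band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_hz = freqs[in_band][np.argmax(spectrum[in_band])]
    return peak_hz * 60.0  # convert Hz to beats per minute

The band-pass step matters: slow lighting drift and fast flicker would otherwise swamp the faint colour changes caused by each heartbeat.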
Until now, it was thought that deepfakes couldn’t replicate these subtle signals well enough to fool rPPG-based detectors. But that assumption no longer holds.
“If your driving video is of a real person, this can now be transferred to the deepfake video,” Prof Peter Eisert, one of the study’s co-authors, told BBC Science Focus. “I guess that’s the fate of all deepfake detectors – the deepfakes get better and better until a detector that worked nicely two years ago begins to completely fail today.”
When the team tested their detector against the latest deepfake videos, it frequently picked up an extremely realistic heartbeat, even though none had been deliberately added – the pulse appears to be carried over from the real person in the driving video.

So, is all hope lost? Are we doomed never to trust a video online again? Not quite yet.
Eisert’s team believes new detection strategies could help. Instead of measuring a simple global pulse rate, future detectors could track the detailed flow of blood across the face.
“As your heart beats, blood flows through blood vessels and into your face,” Eisert said. “It’s then distributed over the entire facial area, and there is a small time lag in that movement that we can pick up in genuine footage.”
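To picture that strategy, imagine extracting an rPPG signal separately for the forehead and a cheek: in genuine footage the pulse wave should reach each region with a slightly different delay. The hypothetical helper below – again an illustrative sketch under that assumption, not the team's method – estimates the delay by cross-correlating the two signals.

import numpy as np

def region_lag_ms(forehead_signal, cheek_signal, fps=30.0):
    # Illustrative sketch: both inputs are per-frame rPPG traces for one facial region.
    a = np.asarray(forehead_signal, dtype=float)
    b = np.asarray(cheek_signal, dtype=float)
    # Normalise so the correlation compares waveform shape, not amplitude.
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)

    # Find the frame offset at which the two pulse signals line up best.
    corr = np.correlate(a, b, mode="full")
    lags = np.arange(-len(b) + 1, len(a))
    best_lag = lags[np.argmax(corr)]
    return best_lag * 1000.0 / fps  # delay between regions in milliseconds

A deepfake that merely carries over a single global heartbeat from its driving video might reproduce the rhythm but not this spatial propagation pattern – which is why mapping blood flow across the face looks like a plausible next step for detectors.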
Ultimately, though, Eisert doubts that deepfake detectors alone will win the race. Instead, he points towards ‘digital fingerprints’ – cryptographic proofs that show footage hasn’t been tampered with – as a more sustainable solution.
“I fear there will be an end to the deepfake race not too far in the future,” Eisert said. “Personally, I think deepfakes will get so good that they’ll be hard to detect unless we focus more on technology that proves something hasn’t been altered, rather than detecting if something is fake.”
About our expert
Peter Eisert is head of the Vision & Imaging Technologies Department and chair of visual computing at Humboldt University of Berlin, Germany. He has published more than 200 conference and journal papers, serves as an associate editor for the International Journal of Image and Video Processing and sits on the editorial board of the Journal of Visual Communication and Image Representation.