This ‘digital brain’ could soon simulate ethically forbidden experiments

This very complicated computer model was designed to help teach us about our own brains

A new computer model could help neuroscientists and psychologists to gain a better understanding of our thinking and behaviour. The model, called Centaur, was designed to mimic the human brain.

Unveiled in a recent Nature paper, Centaur is a computer simulation, driven by artificial intelligence (AI) and trained on more than 10 million choices made by real people.

“Our ultimate goal here at the institute is to understand human cognition,” lead author Dr Marcel Binz, deputy head of the Institute for Human-Centred AI, told BBC Science Focus. “One way to do this is building computational models that can artificially reproduce whatever is going on in the human mind.”

The human brain is incredibly complex, so to make it more manageable to study, previous computational models focused on small, individual processes.

These models replicated how a person would respond to a particular type of problem. But Centaur is much more sophisticated than that.

“Typically, people build these computational models for one specific kind of problem – but the human mind doesn’t work like this,” said Binz. “It’s a single system that can do all of these amazing things at once.”

Binz and his colleagues aimed to build “a model that captures the complexity and variety of human cognition,” he continued.

“Hopefully, later on, we can use these models to learn something about the human mind,” Binz said.

A blue-light brain becomes computer nodes
Neuroscientists can't track all the inner workings of the brain with absolute accuracy, but computer scientists can track exactly how their model works - Credit: Yuichiro Chino via Getty

How the scientists built Centaur

Centaur was developed by scientists at the Institute for Human-Centred AI at the Helmholtz Centre in Germany, along with 40 researchers in Germany, the UK, the US and Switzerland.

The team began with a large language model: an AI system that analyses huge swathes of text and data, so it can predict and generate human language.

Famous examples of large language models include ChatGPT and Gemini, but the scientists used one called Llama.
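For a concrete sense of what “predicting and generating language” means, the toy sketch below loads a small, freely available language model (GPT-2, used purely as a stand-in here; it is not the model behind Centaur) and asks which words it thinks are most likely to come next in a sentence.

```python
# Illustrative only: a small pretrained model (GPT-2) standing in for a
# large language model. Given a prompt, the model scores every possible
# next token; the highest-scoring ones are its "predictions".
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits[0, -1]   # scores for the next token

# Print the five continuations the model rates as most likely.
top_ids = torch.topk(logits, 5).indices
print([tokenizer.decode(int(i)) for i in top_ids])
```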

Then the team collected a large dataset, called Psych-101: a collection of records from 160 psychology experiments, involving more than 10 million choices made by more than 60,000 study participants.

These experiments were set up to study human memory, decision making, problem solving and more.

After that, the scientists fine-tuned their model: they trained it further on the Psych-101 data, so that it learned to respond the way real participants in those psychology studies did. The result was Centaur.
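To give a rough sense of what fine-tuning involves, here is a minimal, illustrative sketch (in Python, using the open-source transformers and peft libraries) of how a Llama-style model could be trained further on text transcripts of psychology experiments. The model name, the file psych_transcripts.jsonl and the training settings are placeholders invented for this example; they are not the recipe the Centaur team used.

```python
# Illustrative sketch only: fine-tuning an open Llama-style model on text
# transcripts of psychology experiments. Model name, data file and settings
# are placeholders, not the Centaur team's actual recipe.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base_model = "meta-llama/Llama-3.1-8B"      # placeholder base model
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token   # Llama has no pad token by default
model = AutoModelForCausalLM.from_pretrained(base_model)

# Attach small low-rank adapters (LoRA), so only a tiny fraction of the
# weights is trained while the original model stays frozen.
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))

# Each record is assumed to be a plain-text transcript of one participant's
# trials, e.g. "You see two slot machines ... You choose machine B."
data = load_dataset("json", data_files="psych_transcripts.jsonl")["train"]
data = data.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
                remove_columns=data.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="centaur-sketch",
                           per_device_train_batch_size=1,
                           num_train_epochs=1),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The real model was, of course, trained at far larger scale; the point of the sketch is simply that fine-tuning means continuing to train an existing language model on new, specialised data.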

What Centaur can do

Centaur could have a variety of uses. Binz explained it could be used as a prediction machine, to simulate how humans might behave – especially in situations that psychologists can’t currently test with real people.

Experiments take time and cost money, said Binz, and there are some things that psychologists can’t ethically do, such as experiment on children or damage the mental health of their participants.

But in a computational model like Centaur, scientists could simulate psychological experiments without those logistical or ethical concerns.
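To make that idea concrete, the hypothetical snippet below writes a single decision-making trial out as text and asks a behaviourally fine-tuned language model to fill in the choice a simulated participant would make. The checkpoint name and the prompt wording are invented for illustration; repeating the sampling step many times would give a distribution of simulated choices rather than a single answer.

```python
# Hypothetical example of using a behaviourally fine-tuned language model as
# a "prediction machine". A trial is written out as text; the model's next
# token stands in for one simulated participant's choice. The checkpoint
# name below is a placeholder, not a published Centaur model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "centaur-sketch"          # placeholder fine-tuned model
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

prompt = ("You are choosing between two gambles.\n"
          "Option A: a certain gain of 30 points.\n"
          "Option B: a 50% chance of 70 points, otherwise nothing.\n"
          "You choose option ")

inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=1, do_sample=True)

# Decode only the newly generated token; sampling repeatedly simulates many
# different participants facing the same decision.
choice = tokenizer.decode(out[0, inputs["input_ids"].shape[1]:])
print(choice)
```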

A woman thinks while completing a sudoku puzzle
The psychology studies asked participants to remember objects, solve logic puzzles and make difficult decisions

The model has its limits, though. Binz said it was currently better at replicating human responses to challenges that were included in – or are at least similar to – the experiments in the Psych-101 dataset.

“The further you go outside of this collection of experiments, eventually it breaks down,” he said. “It can do well in completely new experiments, but there are no guarantees on that, so you have to check on a case-by-case basis for now.”

But in future, the team hoped to make Centaur even more sophisticated, and more similar to a human brain.

“We are planning to incorporate processes that we know are happening in human cognition: like specific types of memory and learning, forgetting, planning, reasoning, and so on,” Binz explained. “And we want to investigate: how do these mechanisms look when they are playing out in our mind?”

Criticism for Centaur

There has been some debate about how closely Centaur actually resembles the inner workings of a human mind.

The authors of the paper wrote that, as they fine-tuned their model to behave like a test subject in a psychology experiment, its internal activity gradually came to look more similar to real human brain activity.

“What is happening inside the model might be, to some extent, capturing actual cognitive processes that happen in people,” said Binz. “If that were the case, that would be quite exciting, because then we could poke inside the model, look at what is going on there, and then use it to understand human cognition.”

But some scientists weren’t convinced. Dr Samuel Forbes – an associate professor in developmental science at Durham University, UK, who was not involved with this research – told BBC Science Focus that the brain and this model were very different.

“Having a model that outputs answers in the way a human might doesn’t ensure that the underlying processes in any way resemble a human brain,” he said.

Forbes explained that it was similar to teaching a robot to play cello music and then trying to learn about cellists from that robot.

“Nothing about that model would tell you how I, or any cellist, play – or anything about the thought or emotional processes that go into playing,” he said.

Forbes continued that, while the results of this team’s study were impressive, “linking them to human thought and the underlying processes remains a much harder task.”

A brain with binary numbers to represent artificial intelligence
As the scientists fine-tuned their model to behave like a test subject in a psychology experiment, its internal data gradually looked more similar to authentic human brain activity - Credit: Yuichiro Chino via Getty

Dr Di Fu, a cognitive neuroscience lecturer at the University of Surrey who was also not involved with this research, told BBC Science Focus that Centaur “tells us ‘what humans might do’, not ‘how the brain produces it’.”

And she warned that this technology could be dangerous in the wrong hands. For instance, she said: “It could be misused for behavioural manipulation, targeted persuasion, or surveillance if deployed outside science.”

For now, Binz said that scientists would need to investigate further the extent to which Centaur and similar models reproduced human thought patterns.

Mimicking the complexity of the brain

One of Centaur’s biggest selling points – its complexity – could in fact be one of its biggest weaknesses. Some scientists have warned that the model might be too large to analyse usefully, and Binz said this was a fair criticism.

“It’s true,” he said. “We have a very complicated model and analysing that is difficult. But there are a lot of advances happening in machine learning, and some of those insights could be transferred to Centaur.”

But Binz also said that the size of Centaur was precisely what made it uniquely useful.

“It is true that the model is insanely complicated, but that is also true for the human brain,” he said. Neuroscientists have developed methods to measure brain activity, Binz explained, and could create similar tools for its computer counterpart. “That is not an impossible task.”

And the advantage of models like Centaur, added Binz, was that scientists could precisely measure everything that happened inside it. “That is just impossible to do with neuroscience,” he said.

Whatever Centaur’s current limitations, Binz and his team hoped its creation marked the first step towards using computational models to replicate and learn about the brain – and potentially developing new theories for how humans think.
