Power corrupts, crowds are violent and depression is just a chemical imbalance. Right? Classic psychology theories often have a nice ring to them, creating myths that persist throughout media, cinema and literature. But new research is revealing that the human mind isn’t as simple as we’d like to think.


When eye-catching theories emerge in the field of psychology, they often take on a life of their own. Just look at the idea that oxytocin is the ‘cuddle hormone’. This captures our imaginations, but research has shown that oxytocin can also increase feelings of intolerance and aggression.

We are all amateur psychologists, and the field provides an appealing way for us to make sense of our feelings and behaviour. If it can confirm our own beliefs about human nature, then even better.

But just like every science, psychology is a messy, ongoing process, and many headline-grabbing results have not been replicated, or are far more nuanced than first realised.

What is pop psychology?

Psychology itself is the study of human behaviour, covering everything from conscious to unconscious thought, feelings, emotions and intelligence. Understandably, this makes it a highly nuanced subject, and experiments can often be difficult to replicate.

Even so, headlines from these studies can live long in the memory. Over time, these ideas become embedded in the public consciousness, joining the ranks of other such ideas in the realm of ‘popular psychology’ (also known as pop psych, cod psychology and pseudo psychology).

Just like urban myths, these theories are picked up and repeated in self-help books, advice columns and on social media. They appeal to us because they help us understand and explain the world around us – even though the science often doesn’t hold up.


Here are eight widely believed pop psychology ideas that are probably wrong, or at least overly simplistic.


Power corrupts

Does evil reside within us, or are we corrupted by circumstances? In 1971, the Stanford University psychologist Philip Zimbardo sought to demonstrate the potential power of situations and social roles to corrupt individual morality.

Anticipating the scenarios dreamt up by reality TV decades later, Zimbardo and colleagues created a mock prison and recruited 12 male college students to play the role of guards and 12 to play the role of prisoners.

The idea was to study their interactions for two weeks, but the ‘Stanford Prison Experiment’ had to be aborted after just six days, such were the levels of cruelty perpetrated by the ‘guards’ upon the ‘prisoners’, including forcing them to clean toilets with their bare hands.

© James Minchall

To Zimbardo, the shocking lesson was clear – powerful situations can overwhelm our individuality, turning good people bad. His interpretation chimed with ideas about the roots of evil, apparently helping to explain atrocities past and future – Zimbardo would later invoke his research while testifying in defence of one of the US guards accused of cruelty towards prisoners at Abu Ghraib prison in Iraq in 2003-04.

Over the years, Zimbardo’s study has been subject to intense criticism and reinterpretation. In 2002, the British social psychologists Alex Haslam and Stephen Reicher conducted a similar experiment called the ‘BBC Prison Study’. In their version, the prisoners united and overthrew the guards, showing that the events of the Stanford experiment were far from inevitable.


Footage has also emerged of Zimbardo – in the role of ‘prison superintendent’ – instructing his guards on how to behave, which seems to undermine the spontaneity of the events that unfolded. More recently, an audio recording was uncovered that revealed one of Zimbardo’s collaborators, in the role of ‘prison warden’, persuading one of the ‘guards’ to treat the prisoners more cruelly, including telling him that, if he did a good enough job, the experiment could lead to real-life prison reform.

Critics like Haslam say the recording shows the Stanford study was more akin to a form of live theatre than a science experiment.

Zimbardo and his defenders counter that, whether the guards’ sadism was inevitable or not, the study’s message still holds – that, in the wrong circumstances, otherwise ‘normal’ people are capable of extreme cruelty.


Children with more willpower are more successful in later life

In the 1960s, the American psychologist Walter Mischel began a series of iconic experiments that involved challenging several dozen young children to sit alone with a marshmallow for around 15 minutes and resist eating it. Their reward, if they waited, was to eat the first marshmallow, plus another.

Famously, the researchers caught up with the same kids in the 1980s and 1990s, by which time they were adults, and found that those who’d been successful at this ‘delayed gratification’ task had subsequently done better in life, in terms of exam results and avoiding getting into trouble. The results appeared to suggest that if we could teach kids to have stronger willpower, their lives would benefit.

© Alamy

However, in 2018 psychologists at New York University and the University of California, Irvine, conducted the first replication attempt of the marshmallow study, but this time using data from hundreds of children.

Unlike in the original research, Tyler Watts and his colleagues also controlled for a host of social and situational variables, such as parental educational background and how responsive parents were to their kids.

The team found that the correlation between delay of gratification and later success (in this case, in adolescence) was far weaker than in the original research. Moreover, the correlations became statistically non-significant when the researchers factored in the social and family variables.


Watts and his colleagues’ interpretation was that a child’s ability to resist the marshmallow has less to do with their inherent willpower, and more to do with their family circumstances – for instance, whether the child had learned to trust being promised greater rewards in the future or not. This chimes with other research that’s found that adults succeed at their goals through forward planning and avoiding temptation, rather than through brute willpower.


Crowds make people mindless and violent

Media accounts of riots often imply that a mob mentality has taken over. Such reports reflect a commonly held belief that when large groups of people get together, people lose their individual morality and run amok with the herd. Similarly, newspaper reports of disasters often describe crowds as if they are mindless, with talk of ‘stampedes’ and blind panic.

The reality, according to many contemporary social psychologists, is that there is a logic and purpose to much crowd behaviour. Violence is far from inevitable when large groups assemble – just look at the restraint shown on American civil rights marches in the 1960s.

© James Minchall

Even in the case of rioters, while they might often be violent and destructive, they usually have a shared purpose and a clear sense of identity. During the English riots of 2011, for example, the damage was aimed mainly at targets seen as symbolising inequality, such as high-end shops.

Also, it wasn’t the case that anyone who saw the riots on television, or encountered them in the street, was sucked zombie-like into ‘the mob’ – rather it was in neighbourhoods where there was already a strong sense of disenfranchisement that people were far more likely to join in.


It’s a similar story for crowds in emergencies. Analysis of real-life events, such as the Hillsborough disaster of 1989 and the overcrowding at a Brighton beach concert in 2002, suggests that blind panic is rare and that people will often stop to help one another. This altruistic behaviour is perhaps due to a sense of togetherness that’s forged as groups of strangers go through a common experience.


Depression is due to a chemical imbalance

The most commonly used antidepressant drugs increase the availability in the brain of a chemical called serotonin. Whatever the rights and wrongs of antidepressants (some credit the drugs for saving their lives, while critics fear the overmedicalisation of emotional problems that have complex roots), their rising use has fed the notion that depression is caused by some kind of chemical imbalance in the brain that requires correction.

The reality is that most psychiatrists believe that the chemical imbalance idea is a gross oversimplification. Part of the issue is that it’s based on flawed logic. Just because these drugs increase serotonin levels, it doesn’t mean that a lack of serotonin is the cause of depression (after all, your headache is not caused by a lack of paracetamol).

Beyond that, post-mortem research has failed to show that people with depression have lower levels of serotonin, and studies that have artificially lowered people’s serotonin levels have not induced depression.

The truth is that there is no psychiatrist or neuroscientist who could honestly say what the healthy or correct levels of brain chemicals should be.


Many mental health campaigners have embraced the chemical imbalance idea, believing that it will help to reduce stigma by showing that depression has a clear physical cause.

Sadly, if anything, biological explanations of mental illness seem to have increased stigma, perhaps because they cause people to perceive mental health conditions as being more fundamental to the sufferer and more difficult to treat.


Firstborns are natural leaders

What do Emmanuel Macron, Angela Merkel and Boris Johnson have in common? How about Jeff Bezos and Elon Musk? They are all the eldest among their siblings – providing anecdotal evidence to back up the popular idea that firstborns have distinct personalities that help them become leaders.

This rationale has a logical appeal – after all, the eldest child enjoys the undivided attention of their parents for a time, after which they get to boss their younger siblings around.

© James Minchall

However appealing this pop-psych theory, the evidence largely doesn’t support it. In 2015, when psychologists carefully analysed the personality traits of hundreds of thousands of people and then correlated them with people’s birth-order position in their family, no clear associations were found.

A subsequent Swedish study did find that firstborns were more likely to end up in leadership roles, but the correlation was weak. If there is a link, then it probably has more to do with opportunity than aptitude, such as being the one chosen to take over the family business.


We all have a preferred learning style

Do you find it easier to learn by reading an article or listening to a podcast? Maybe you prefer images over text? Surveys suggest that most of us believe we have a preferred ‘learning style’, be that visual, auditory, kinaesthetic (learning by doing), or something else.

A majority of teachers believe it, too. In fact, a whole industry has built up around finding ways to measure people’s learning styles and guiding teachers on how to teach to those different styles. However, this is probably the most striking example of where folk wisdom clashes with psychological science.

© James Minchall

Time and again, carefully controlled studies have failed to find evidence to support the ‘learning styles’ approach. Most studies in this area follow a similar format – volunteers report their preferred learning style, and then some of them are presented with material in their favoured modality while others are not. A test then ensues.

Nearly every study has found that those who learn via their preferred style perform no better than a comparison group not taught to their preference. What’s more, participants rarely show much insight into their supposed best learning style – they often perform better when taught via their non-favoured methods.


Critics of learning styles point out that the optimum way to teach often depends on the nature of the topic, not the preferences of the individual students. Others say that even if you do have a weakness in learning a particular way, it is better if teachers help you improve in that area, rather than avoiding it.


Smiling will make you feel happier

The roots of this idea date back to Charles Darwin, who proposed that the outward expression of emotions can feed back and affect our feelings – what is now known as the ‘facial-feedback’ hypothesis.

The esteemed 19th-century philosopher and psychologist William James proposed a similar idea – that it’s the physical changes associated with fear that lead you to feel afraid, not the other way around.

These theories inspired a ‘modern classic’ of pop psychology, published in 1988. Researchers led by the psychologist Fritz Strack asked volunteers to watch cartoons, either while holding a pen between their teeth, thus forcing a smile, or with the pen held between their lips, preventing a smile.

The smilers found the cartoons funnier, suggesting that the mere act of grinning could have a positive effect on feelings. This result, and later variations, soon led self-help authors to propose that you could simply smile your way to greater happiness.

© Getty Images

But in 2016, a collective of 17 separate research labs recruited nearly 2,000 participants in an attempt to replicate the cartoon study. The findings were inconsistent across the labs and, when they were pooled together, the result was negative – smilers were no more amused than frowners.

However, it may be premature to write off the facial-feedback theory. Strack pointed out that the modern replications videoed the participants, whereas he hadn’t, which might have interfered with the results, perhaps by making participants self-conscious.


Also, other research findings, such as those involving Botox patients, are consistent with the facial-feedback hypothesis. Botox treatment interferes with facial expressions, and those who’ve had it seem to experience emotions differently from other volunteers.


Oxytocin is the ‘cuddle hormone’

Especially around Valentine’s Day, the popular media gets excited about oxytocin, often referred to as the ‘love hormone’. It’s absolutely true that this chemical is released in the brain when women give birth and breastfeed, and also when people cuddle and have sex – hence the media nicknames.

There were also studies conducted in the early 2000s that suggested sniffing oxytocin made people more trusting, generous and better at empathising with others. Subsequently, the molecule has been mooted as a breakthrough intervention for various conditions, from autism to schizophrenia.

© James Minchall

If it all sounds too good to be true, that’s because it is. More recent research has questioned those early findings on the chemical’s effects, both failing to replicate them and painting a more nuanced picture.

For instance, while oxytocin might increase feelings of bonding with friends and family, it can also sharpen dislike for outsiders. It can even heighten aggression in those with violent tendencies. In short, oxytocin is certainly an intriguing chemical, but it’s far more than a cuddle hormone.



Dr Christian Jarrett is a cognitive neuroscientist, science writer and author. He is the Deputy Editor of Psyche, the sister magazine to Aeon that illuminates the human condition through psychology, philosophy and the arts. Jarrett also created the British Psychological Society's Research Digest blog and was the first ever staff journalist on the Society's magazine, The Psychologist. He is author of Great Myths of The Brain and Be Who You Want: Unlocking the Science of Personality Change.