Book snobs who insist that reading literature is superior to listening to an audiobook may want to look away now.
Neuroscientists have discovered that the same cognitive and emotional parts of the brain are stimulated whether a person hears words or reads them on a page.
A YouGov study carried out in 2016 found that just 10 per cent of Britons believed that listening to an audiobook was the same as having read the physical version, with the majority believing it was a lesser form of culture.
But experts at the University of California, Berkeley, disagree.
Lead author Dr Fatma Deniz, a researcher in neuroscience said: “At a time when more people are absorbing information via audiobooks, podcasts and even audio texts, our study shows that, whether they're listening to or reading the same materials, they are processing semantic information similarly.
“We knew that a few brain regions were activated similarly when you hear a word and read the same word, but I was not expecting such strong similarities in the meaning representation across a large network of brain regions in both these sensory modalities.”
Language is a complex process that involves many regions of the brain, and it was previously thought the brain dealt with spoken and written information differently.
For the study, nine volunteers listened to stories from The Moth Radio Hour, a popular American public radio programme in which people tell true stories. The study participants were then asked to read the same stories.
Researchers scanned the volunteers' brains in both the listening and reading conditions to compare brain activity and found the patterns were virtually identical. They also found that words activated specific parts of the brain depending on whether they were visual, tactile, numeric, locational, violent, mental, emotional or social.
The maps, which covered at least one-third of the cerebral cortex, enabled the researchers to predict with accuracy which words would activate which parts of the brain.
Scientists believe the word maps could have clinical applications, such as comparing language processing for people with stroke, epilepsy, impaired speech, brain damage or dyslexia.
“If, in the future, we find that the dyslexic brain has rich semantic language representation when listening to an audiobook or other recording, that could bring more audio materials into the classroom,” added Dr Deniz.
“It would be very helpful to be able to compare the listening and reading semantic maps for people with auditory processing disorder.”
The research was published in the Journal of Neuroscience.