Careful testing of these hypotheses, however, finds little to no evidence for them. An investigation published this year found no support for the posited effect of birth order on the propensity to take risks. In 2015, German psychologists analyzed data from thousands of people in the U.S., U.K., and Germany and found no significant correlations between birth order and traits such as agreeableness, conscientiousness, or imagination. In another study that year, psychologists Rodica Damian and Brent Roberts found only very small associations between birth order and personality, and some of them contradicted previous theorizing (later-borns, for example, were not more agreeable than firstborns).
One birth order finding that might actually hold up is a slight IQ advantage in firstborns. The German team found a roughly 1.5 IQ-point increase, on average, for each older birth position. They also found somewhat higher self-reported intellect ratings in firstborns. Why this might be is not yet clear. And even this finding might not be universal: A recent study of an Indonesian sample did not find any link between birth order and intelligence.
The right and left hemispheres do specialize in different mental functions. But the notion that individuals rely more heavily on one or the other glosses over the complexity of the left-right relationship.
A knack for writing musical hooks is a valuable gift, and there is no doubt that it relies on cognitive ability. But attributing that skill to a specific, musical form of intelligence muddies the well-established construct of general intelligence. General intelligence, which IQ tests reliably assess, has proven a robust predictor of such life outcomes as educational attainment and later success.
Women tend to engage in more altruistic behavior and rate higher on certain measures of empathy. Men, on average, perform better on tasks in which they mentally rotate an object, while women can better remember the location of objects. Evolutionary theorists postulate that sex differences arose because male and female hominids faced different reproductive and survival pressures.
As the architecture of complex disorders is mapped, the many genetic variants associated with each disorder are weighted to create polygenic risk scores. Someday, when these predictive scores are refined and widely used, the candidate gene myth will be dispelled once and for all.
Attachment theory started as an exploration of the relationship between infants and caregivers, and studies suggested that some children show markedly anxious or avoidant behaviors after being separated from their parents. Given the appearance of such differences in early life, a common misconception about adult attachment styles is that they are essentially based on how one related to and was treated by parents as a child. But the connection between how one is raised and how one turns out is not as simple as it might look.
The 10% of the brain myth states that humans generally use only one-tenth (or some other small fraction) of their brains. It has been misattributed to many famous scientists and historical figures, notably Albert Einstein.[1] By extrapolation, it is suggested that a person may 'harness' or 'unlock' this unused potential and increase their intelligence.
A likely origin for the "10% myth" is the reserve energy theories of Harvard psychologists William James and Boris Sidis. In the 1890s, they tested the theory in the accelerated raising of the child prodigy William Sidis. Thereafter, James told lecture audiences that people only meet a fraction of their full mental potential, which is a plausible claim.[5]
The concept gained currency by circulating within the self-help movement of the 1920s; for example, the book Mind Myths: Exploring Popular Assumptions About the Mind and Brain includes a chapter on the 10% myth that shows a self-help advertisement from the 1929 World Almanac with the line "There is NO LIMIT to what the human brain can accomplish. Scientists and psychologists tell us we use only about TEN PERCENT of our brain power."[6]
This became a particular "pet idea"[7] of science fiction writer and editor John W. Campbell, who wrote in a 1932 short story that "no man in all history ever used even half of the thinking part of his brain".[8]
In 1936, American writer and broadcaster Lowell Thomas popularized the idea, in a foreword to Dale Carnegie's How to Win Friends and Influence People, by including the falsely precise percentage: "Professor William James of Harvard used to say that the average man develops only ten percent of his latent mental ability".[9]
In the 1970s, the Bulgarian-born psychologist and educator Georgi Lozanov proposed the teaching method of suggestopedia, believing "that we might be using only five to ten percent of our mental capacity".[10][11]
According to a related origin story, the ten percent myth most likely arose from a misunderstanding (or misrepresentation) of neurological research in the late 19th or early 20th century. For example, the functions of many brain regions (especially in the cerebral cortex) are complex enough that the effects of damage are subtle, leading early neurologists to wonder what these regions did.[13] The brain was also discovered to consist mostly of glial cells, which seemed to have very minor functions. James W. Kalat, the author of the textbook Biological Psychology, points out that neuroscientists in the 1930s knew about the large number of "local" neurons in the brain. The misunderstanding of the function of local neurons may have led to the ten percent myth.[14] The myth might also have been propagated simply by a truncation of the idea that some people use a small percentage of their brains at any given time.[1] In the same article in Scientific American, John Henley, a neurologist at the Mayo Clinic in Rochester, Minnesota, states: "Evidence would show over a day you use 100 percent of the brain".[1]
Although parts of the brain have broadly understood functions, many mysteries remain about how brain cells (i.e., neurons and glia) work together to produce complex behaviors and disorders. Perhaps the broadest, most mysterious question is how diverse regions of the brain collaborate to form conscious experiences. So far, there is no evidence that there is one site for consciousness, which leads experts to believe that it is truly a collective neural effort. Therefore, as with James's idea that humans have untapped cognitive potential, it may be that a large number of questions about the brain have not been fully answered.[1]
Neurologist Barry Gordon describes the myth as false, adding, "we use virtually every part of the brain, and (most of) the brain is active almost all the time."[1] Neuroscientist Barry Beyerstein sets out six kinds of evidence refuting the ten percent myth.[15]
In debunking the ten percent myth, Knowing Neurons editor Gabrielle-Ann Torre writes that using all of one's brain would not be desirable either. Such unfettered activity would almost certainly trigger an epileptic seizure.[19] Torre writes that, even at rest, a person likely uses as much of his or her brain as reasonably possible through the default mode network, a widespread brain network that is active and synchronized even in the absence of any cognitive task. Thus, "large portions of the brain are never truly dormant, as the 10% myth might otherwise suggest."
Some proponents of the "ten percent of the brain" belief have long asserted that the "unused" nine-tenths is capable of exhibiting psychic powers and can be trained to perform psychokinesis and extra-sensory perception.[3][15] This concept is especially associated with the proposed field of "psionics" (psychic + electronics), a favorite project of the influential science fiction editor John W. Campbell, Jr. in the 1950s and '60s. There is no scientifically verified body of evidence supporting the existence of such powers.[15] Such beliefs remain widespread among New Age proponents to the present day.
In 1980, Roger Lewin published an article in Science, "Is Your Brain Really Necessary?",[20] about studies by John Lorber on cerebral cortex losses. He reports the case of a Sheffield University student who had a measured IQ of 126 and completed a degree in mathematics, but who had hardly any discernible brain matter at all, since his cortex was extremely reduced by hydrocephalus. The article led to the broadcast of a Yorkshire Television documentary of the same title, though it was about a different patient, who had normal brain mass distributed in an unusual way in a very large skull.[21] Explanations were proposed for the first student's situation, with reviewers noting that Lorber's scans showed that the subject's brain mass was not absent but compacted into the small space available, possibly compressed to a greater density than regular brain tissue.[22][23]
The myth was examined on a 27 October 2010 episode of MythBusters. The hosts used magnetoencephalography and functional magnetic resonance imaging to scan the brain of someone attempting a complicated mental task, and found that as much as 35% was used during their test.[25]
The graphic novel Scott Pilgrim and the Infinite Sadness parodies the myth, along with the justification that the other 90% is "filled with curds and whey," as the explanation for why vegans, such as antagonist Todd Ingram, possess psychic powers.
In the basement of the Bureau International des Poids et Mesures (BIPM) headquarters in Sèvres, France, a suburb of Paris, there lies a piece of metal that has been secured since 1889 in an environmentally controlled chamber under three bell jars. It represents the world standard for the kilogram, and all other kilo measurements around the world must be compared and calibrated to this one prototype. There is no such standard for the human brain. Search as you might, there is no brain that has been pickled in a jar in the basement of the Smithsonian Museum or the National Institutes of Health or anywhere else in the world that represents the standard to which all other human brains must be compared. Given that this is the case, how do we decide whether any individual human brain or mind is abnormal or normal? To be sure, psychiatrists have their diagnostic manuals. But when it comes to mental disorders, including autism, dyslexia, attention deficit hyperactivity disorder, intellectual disabilities, and even emotional and behavioral disorders, there appears to be substantial uncertainty concerning when a neurologically based human behavior crosses the critical threshold from normal human variation to pathology.