How the brain remembers, learns and tests hypotheses

The Terror of Critical Thinking (Johns Hopkins School of Education)
by John Medina, a developmental molecular biologist

It may seem a bit odd that a scientist used to training graduate students, medical students and post-docs would write an article for educators and policymakers. I am simply a developmental molecular biologist, not even the most accomplished of molecular biologists, looking at the world of a few genes and how they interact with even fewer molecules. I've had a long-standing interest in how tiny embryos grow into big babies, and for a long time now my focus has been on psychiatric disorders and how brain development in the womb and beyond can create them. I am also deeply involved in teaching and guiding the next set of researchers. Thus, my perspective on education really is as a research-oriented "brain watcher" and a consumer of the end-product of the American educational system.

Herewith some thoughts about what I would do with my observations on education were I some kind of policymaker, along with some concrete action steps that might be useful to educators from that perch. I offer them even though I freely admit that, as a scientist, I am considerably out of my depth in writing for educators and don't really know what "useful" means. I tucked myself into my safe ivory tower, really a mixture of careers, a long time ago, and I did so for some very good reasons, mostly related to cowardice.

And not only am I a coward, I am also very wary of becoming just another voice in the cacophony of people who think they have the answer to all the questions of the American classroom. Unless you really have taught 30 kids, 25% of whom are on some kind of mind-altering drug and 50% of whom have parents in the middle of a divorce, I doubt that you have earned the right to be heard. And I have never taught 30 kids under the age of 9 in a classroom of any kind, so I will tell you flatly that I have not earned the right to be heard.

The teaching profession has been offered so many "solutions" for the things that ail our classrooms – and teachers are so poorly paid as they listen to the added responsibilities they are usually told to fold into their job descriptions – that I am convinced even the most responsive and progressive members of the educational culture start to get jaded when someone comes along with an "I'm gonna fix what ails ya" attitude. Especially if that attitude doesn't include a raise and a respectful countenance. Let me state from the beginning that I am not going to offer you any such solution from the world of science, nor do I propose a vaccination that hopes to cure everything, especially considering my lack of experience. I will simply fantasize.

And from such a fantasy perch, I will outline my perspective, for I am happy to inflict as many of my prejudices on you as I can. It is actually pretty straightforward, and I will give the summary before we get into anything. Here's the summary: in 2000, education is not about a fad. It is about learning how to input and then use information inside a living organ, the brain. Similarly, education in 3000 will not be about a fad; it will still be about learning how to input and then use information inside a living brain. We are learning how the organ works, and what is so frightening to me is to observe how little our educational systems consider how the brain acquires information. To expand on parts of that last sentence, I would like to divide this article into three parts.

PART I) THE BRAIN AND THE POWER OF CRITICAL THINKING
PART II) THE DISTANCE BETWEEN A NEURON AND A CHALKBOARD
PART III) MEDINA’S LEGISLATIVE FANTASY

——————————————————————————–

PART I) THE BRAIN AND THE POWER OF CRITICAL THINKING

The human brain is an amazing organ, as you all know. It can send a signal to the big toe and back of an average-sized adult 177,000 times a second. And it has to do that through an unbelievable thicket of neural cells. Consider that the average brain contains between 10^12 and 10^14 nerve cells. Laid end to end, to employ the overused science metaphor of bigness, that's enough nerve to circle the globe 20,000 times. The human cortex, that talented rind of gray matter that makes our beautiful symphonies and plans our horrible wars, is also quite complex. Take a cubic millimeter of the stuff, about the size of a small dot, and begin unwinding it: you'd unravel about two miles' worth of brain cells.

So there's a lot of material to work with. And that's fortunate, because the organ has a lot to do. One of the most important tasks it performs is taking in information and organizing it once it has been received. And so we have spent a lot of time trying to figure out how it does all that. There have been a number of unintended consequences of extracting data in this effort.

We have found, for example, that specific groups and configurations of nerve cells in our heads hold specific pieces of information. And if we can figure out how those groups work together, we will go a long way towards unlocking the secrets those neurons hold. We are beginning to do just that. This can be illustrated in dramatic fashion by looking at an experiment not done on us, but on birds, specifically, young chicks and young quail.

As every farmer knows, young chickens have a crowing sound consisting of solitary and quite loud squeaks lasting about half a second. Young quail also emit sounds at birth, but these are much more complex, consisting of one or two introductory notes, followed by a complex trill. A while back, we discovered that the neurons that controlled this complex quail-sound were in a particular configuration in a particular region of their brains.

So how important was the configuration of the neurons? Researchers have actually transplanted the part of the brain of the quail that controlled their song into the brains of young chicks that could only emit their single crowing sound. Lo and behold, when the operation was done, these roosters in training could still crow, but now they did so with one or two introductory notes, followed by a more complicated trill.

In other words, the transplanted neurons, in their own specific configuration, were affecting the outward behavior of a chick. This was unknown, and quite unintentional. It was the first time we understood that we could transfer actual living thought from one creature to the next.

One can see this principle of specific neurons holding specific pieces of information in humans as well. Consider the following story, which was eventually written up in a research journal. An older man came into the ER with a stroke in a very interesting part of the brain. He had lost the ability to recognize nouns, and had a particularly hard time with animals. You might show him a picture of a rhinoceros and ask him what it was. He would respond, "I don't know, it looks like uh, uh, well, it has this funny-looking nose . . . I don't know what it is." This was odd, because if you held up a flash card with the word R-H-I-N-O spelled out and asked him what he was observing, he would respond instantly. "That's an animal living in parts of Africa," he might say. "I think it's on the endangered list, right? Something about its horn."

It was clear that even though he could not recognize the picture, he could certainly recognize the word. If you asked him to draw a picture of a rhino, he could actually do it, as long as it didn't take him more than about two minutes. If you then grabbed the piece of paper, held up the picture of the rhinoceros he had just drawn and asked him what it was, his reply would be extraordinary. He would say something like before: "I don't know what it is, it looks like uh, uh, uh, well, . . . it has this funny . . . "

This stroke victim's lack of recognition pointed to something very interesting, and quite mysterious, about the brain. We began to understand that this organ – like many personal computers – is capable of separating text from graphics, and stores concepts like 'rhinoceros' in redundant areas of the brain. It also appeared that the specific neural cells holding the routing information that connected these concepts in this man's head had been destroyed by the stroke. You see? Discrete sets of neurons actually control discrete kinds of information. We have known for a long time that the brain follows the rules of chemistry and physics. We were beginning to understand it might follow the rules of information transfer as well.
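One minimal way to picture that routing idea is as a tiny graph of cues, routing cells and concepts. The Python sketch below is purely illustrative – the node names and links are my own assumptions, not anything taken from the clinical report – but it shows how destroying a single routing link can leave the word pathway intact while the picture pathway fails.

    # A toy graph of how 'rhinoceros' might be stored redundantly,
    # with separate routes from the picture and from the written word.
    # Entirely illustrative; node names and links are assumptions.
    links = {
        "picture:rhino": {"visual-routing"},
        "visual-routing": {"concept:rhinoceros"},
        "word:RHINO": {"concept:rhinoceros"},
    }

    def can_name(start, target="concept:rhinoceros"):
        """Follow links from a starting cue; True if the concept is reachable."""
        seen, frontier = set(), [start]
        while frontier:
            node = frontier.pop()
            if node == target:
                return True
            if node not in seen:
                seen.add(node)
                frontier.extend(links.get(node, ()))
        return False

    print(can_name("picture:rhino"))  # True: picture -> routing -> concept
    print(can_name("word:RHINO"))     # True: word -> concept

    # Simulate the stroke: the routing neurons for pictures are destroyed.
    links["visual-routing"] = set()
    print(can_name("picture:rhino"))  # False: he can no longer name the picture
    print(can_name("word:RHINO"))     # True: the word route still works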

We’ve gone much further now. We can literally grow neurons in a dish, stimulate them, watch them change, teach them things and look at how their genes are responding. We have even recently taken a tiny silicon chip and bored very small channels into its surface and then seeded the now-carved out chip with living neurons.

Lo and behold, the neurons grew, and followed the pattern of the channels we had bored into the chip. That gives us the power to deliberately create the three-dimensional architecture of neurons on our own, and maybe one day drive a deliberately constructed silicon semiconductor with a just-as-deliberately constructed human thought.

Now, I did not give you those pieces of information so that you could run back to your subcommittees and draft legislation banning this stuff. Rather, I just wanted to use this to illustrate how amazing it can be when one employs scientific principles to understand something that is essentially physical in nature. Because the brain follows the rules of chemistry and physics, it is possible to bring the awesome power of natural philosophy into this infinite neural world and actually learn something.

Though we take the power of the scientific method for granted, we do so only because we are used to it. It is rather delicate, and you have only to observe what has happened in our history when we haven't applied critical thinking skills – which has been most of it – to see how fragile it is. The history of humankind is filled with such follies and fads, and I would like to summarize a few of my favorites to underscore the contrast before moving on to our discussion of education issues.

Consider ancient physics for a moment. For hundreds of years, people wrote treatises trying to answer a single question Aristotle originally posed: "Why does a big rock, when dropped from the mast of a ship, reach the deck before a little rock dropped at the same time?" Absolutely nobody was doing the experiments to see whether that statement was true.

Human biology did not always fare any better. Consider the brain's ultimate achievement, the establishment and origin of human behavior. Its explanation hasn't always been well thought through, even amongst our greatest minds. Consider, for example, how we once viewed certain types of behavior in relation to gender.

#1) The male is by nature superior, and the female inferior; and the one rules, and the other is ruled; this principle, of necessity, extends to all mankind . . . The lower sort are by nature slaves, and it is better for them (women) as for all inferiors that they should be under the rule of a master. — Aristotle

To which Timothy Leary has replied: Women who seek to be equal with men lack ambition.

#2) A woman preaching is like a dog walking on his hind legs. It is not done well, but you are surprised to find it done at all. –Samuel Johnson, English essayist, author and critic.

To which Charlotte Whitton, former mayor of Ottawa Canada, has replied: Whatever women do they must do twice as well as men to be thought half as good. Luckily, this is not difficult.

#3) Nature doth paint (women) further to be weak, frail, impatient, feeble and foolish; and experience hath declared them to be inconstant, variable, cruel and lacking the spirit of counsel. –John Knox, Theologian and Presbyterian

To which an anonymous woman by the name of Jill, who signed some graffiti in Kentish Town, UK, in 1986, has responded: if they can send a man to the moon, why can't they send them all there?

So we haven't always used our heads to understand what was inside them. Such prejudices can die hard. Even when we began to apply the scientific method to new technologies, we had a hard time reconciling them with superstitions. Did you know that when anesthetics were first introduced in the early part of the 19th century, there were groups of people who thought they would not work on women during childbirth? The reason was a Biblical scripture which said that women would experience pain – in fact increased pain – during childbirth.

So then someone tried it on women during childbirth, and guess what? It worked just fine. So then some people actually tried to stop doctors from using the anesthetic on women during childbirth, in order to keep the scripture inviolate. You can imagine that most of the people who advocated this were not women.

After the Greek civilization flourished and then died, the West in effect stopped asking questions. And we became suspicious of anyone who did, often using capital punishment and torture as a form of theological cleansing.

The idea was that we could allow someone else to do our thinking for us because that someone else had so much more experience and was so much wiser, perhaps so divinely inspired, that to question the prevailing wisdom was a reflection not of a stimulating exchange of ideas but of the inquirer's spiritual degradation.

Fortunately, we began to wake up, and the seeds of that good morning were sown during the Renaissance, my history colleagues tell me, and even earlier, as a result of the cultural mixing during the Crusades. We also started becoming braver, because we started making inquiries.

And when we started asking questions, we started getting into real trouble. And the reason we started getting into real trouble is that we started getting real answers. Answers that were sometimes very different than what the prevailing wisdom had established.

For example: there was a Flemish physician named Vesalius. He found an extraordinary thing. Vesalius found out that men and women have the same number of ribs. The church had taught that men have one less rib than women because of what had occurred in the Garden of Eden. Vesalius was made to give up his post, his books were banned and he was forced to work as a physician, never again doing research, because of his radical beliefs.

Nonetheless there was a great change coming – critical thinking can be quite infectious, after all – and Galileo finally dropped two rocks from a building and found that, little or big, they fell to the ground at the same time (not taking into account wind resistance). Guess what? Aristotle was wrong.

Galileo also backed the Copernican idea that the sun, rather than the earth, lies at the center, against the Ptolemaic idea of the earth being the center. As you might recall, it was only in the latter half of the 20th century that the church acknowledged this error and sought at least a paper restoration.

So natural philosophy in the form of the scientific method would not be stopped. One by one this kind of thinking began to take on a form of organization and compelling fascination that destroyed old ideas as fast as it came up with new ones.

Like some adolescent child discovering that their parents were fallible, we found out that our ancestors could be mistaken. That laws existed that could destroy the power of fads, and bring us to a certain sensibility. And as we staggered away from our older ways of thinking, like we had just been sucker-punched by rationality, we began to learn the terror of critical thinking.

——————————————————————————–

PART II) THE DISTANCE BETWEEN A NEURON AND A CHALKBOARD

The notion that the brain follows the rules of chemistry and physics has lots of unintended consequences, ones we could not have easily foreseen. But one of the greatest of these unintended consequences is a certain degree of liberation. As history progressed, critical thinking began to wind its way through every facet of biology, including neurobiology. This physical point of view greatly freed the researcher to do specific experiments on living brains or even groups of neurons, and to know that the results would be falsifiable or reproducible or whatever, but that they could be scientific. And it has given us scientists a context upon which to hang our observations and, most magic of all, the great power of prediction.

Such interesting results have come from applying this discipline to the workings of the brain that you'd think we might often run into the danger of over-interpreting a given set of data. You are probably used to hearing this next sentence from a scientist: "Be careful, for this work is preliminary and more studies need to be done."

As I have talked to lay audiences over the years, I have actually found the opposite to be true. Over-interpretation is what the media does. In the general populace I have found no interpretation at all, and this is due to a single problem – a lack of knowledge of the research. I would like to tell you a little bit about what we are learning about how neurons and brains acquire information, and then speculate on both the limits and strengths of applying such data to the classroom.

To start with, I must tell you I am happy to be consistent with the rest of my cautious colleagues. We really do have to be careful not to graft isolated facts from neurobiology onto some piece of legislation and think we have established some truth – when all we've really done is oversimplify an issue. But real work has been done, and it is to some of this stuff that I would like to turn. I would like to give two points of view: first looking from the inside of a cell outward to a behavior, and then going in the opposite direction, from an outwardly observed behavior back into a cell.

We will start with something deeply relevant to education: the ability to store and recall data over time. The formal term, as I am sure many of you are aware, is long-term memory. Believe it or not, we are actually beginning to understand some of the molecules involved in long-term memory, and that includes processes occurring in humans. To relate this accurately, I need to give a metaphor about trees, and then tell you something about what happened when I first started dating the person who would later become my wife.

We'll start with trees. As you undoubtedly know, there is a gap where neurons connect, called a synapse. Many people, when they consider neurons, tend to think of the easy thing: that one neuron connects to another neuron via a single synapse, like two wires not yet soldered together.

Nothing could be further from the truth, actually. A better analogy is this: pretend you are Paul Bunyan, 60 feet tall, with the ability to uproot trees with your bare hands. You take one tree in one hand and uproot it, then take another tree in the other hand and uproot it. Now turn each tree ninety degrees so that their roots are touching each other. Literally, thousands of roots from one tree are touching thousands of roots from another.

That's how two neurons interact. They look like trees with their roots touching, which means there can be thousands of connections between just two of them. Now here's a weird thing. There are different flavors of connections. Some of those connections are wild about being with each other – we call these excitatory – and they talk to each other like teenagers on a telephone. Some of those connections are like guests that have stayed too long in your house. They're localized together, but they aren't crazy about being in the same place and in fact refuse to talk to each other. We call these connections, these synapses, inhibitory. The sum total of the excitatory and inhibitory connections actually creates a language between two neurons. It's something like zeroes and ones in a computer. Which means the wire metaphor is pretty lousy. Better to think of the connections as a computer.
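To make that zeroes-and-ones analogy concrete, here is a tiny sketch (in Python, and entirely my own toy model rather than anything from the research described here) in which a receiving neuron simply adds up its active excitatory and inhibitory synapses and fires only if the net input crosses a threshold.

    # Toy model: a neuron summing excitatory (+1) and inhibitory (-1) synapses.
    # Purely illustrative; real synapses are graded, not binary.

    def fires(excitatory_active, inhibitory_active, threshold=3):
        """Return True if the net input from active synapses reaches the threshold."""
        net_input = excitatory_active - inhibitory_active
        return net_input >= threshold

    print(fires(excitatory_active=5, inhibitory_active=1))  # True: enough net drive
    print(fires(excitatory_active=5, inhibitory_active=4))  # False: inhibition wins out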

So what does this interesting computer have to do with learning, with how we store stuff in long-term memory? To understand the full process, I now need to tell you a little bit about our second metaphor, how my wife and I first met.

When I first met Kari, the woman who would later become my wife, I was dating someone else. And so was she. But I did not forget Kari; she is physically very beautiful, a talented Emmy-nominated composer and one of the nicest people I have ever met.

When both of us found ourselves “available” 6 months later, I immediately asked her out. We had a great time, and I began thinking about her more and more. Turns out she was feeling the same. I asked her out again, and soon we were seeing each other regularly. It got so that every time we met my heart would pound, my stomach would do flip-flops, sweat would appear on the back of my palms. I knew I was falling in love.

Eventually, I didn't even have to see her in order to get the rise in pulse. Just a picture would do, or the smell of her perfume (Chanel #5), or the building that housed the music studio where she practiced. Eventually just a thought was enough. That was 18 years ago, and I have to admit, when I pick her up at the airport after she has made some trip, I still get those same sweaty palms and that same elevated pulse. Indeed, after all these years she has had to endure living with me, I consider myself the luckiest man in the world!

What was happening here to effect such change? With increased exposure to this wonderful lady, I became increasingly sensitive to her presence, my reactions growing stronger over time and needing steadily smaller cues to elicit stronger and stronger responses (perfume, for heaven's sake?). Moreover, the effect has been long-lasting, having had the tenure of almost two decades. Leaving the whys of the heart to the poets and the psychiatrists, this idea – that increased exposure results in stronger reactions, requiring less and less input to elicit them – lies at the heart of how neurons learn things. When two neurons, one on each side of a synapse, decide to remember a piece of information, it's just as if the two of them fall in love. At first it takes a fair amount of exposure, one neuron giving out a large amount of energy to the other, in order to keep the connection alive.

But when learning takes place, an astonishing thing occurs. The connection is strengthened, and after a while it takes only a little bit of input from one neuron to make the other fire excitedly, as though it were some star-crossed lover. This is just like my dating Kari. At first, dating her was a nice idea. But soon, even the memory of the smell of her perfume, a tiny little input, could elicit all the palpitations of a full-blown lover's kiss. In other words, it didn't take very much after a while to get me extremely excited. We call this process synaptic strengthening, and when neurons learn something, they strengthen their connections with each other.

Now here's the kicker. Once this excitability has been established, it will go away after a short period of time. In fact, after a piece of information has been learned and synapses have been strengthened, the strengthening is only temporary. You have to restimulate those neurons again, do it in exactly the same way, and do it within 90 minutes of the first stimulation, or the memory is lost. That's why you have to repeat things in order to learn something. You have to continually refire that rootful of connections.
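Here is a hedged sketch of that strengthening-and-decay story: a toy synaptic weight that grows with each well-timed repetition, needs a smaller cue to drive its partner as it grows, and falls back toward baseline if it is not restimulated within the window mentioned above. The numbers and the update rule are my own illustrative assumptions, not measured values.

    # Toy model of synaptic strengthening and decay.
    # All constants are illustrative assumptions, not experimental values.

    BASELINE = 1.0        # resting synaptic strength
    BOOST = 0.5           # how much each repetition strengthens the connection
    WINDOW_MINUTES = 90   # the restimulation window mentioned in the text

    def restimulate(strength, minutes_since_last):
        """Strengthen if restimulated inside the window, otherwise fall back to baseline."""
        if minutes_since_last <= WINDOW_MINUTES:
            return strength + BOOST   # the connection gets stronger
        return BASELINE               # the trace fades; start over

    def cue_needed(strength):
        """The stronger the synapse, the smaller the cue needed to fire the partner."""
        return 1.0 / strength

    s = BASELINE
    for rep in range(4):  # four well-timed repetitions
        s = restimulate(s, minutes_since_last=30)
        print(f"rep {rep + 1}: strength={s:.1f}, cue needed={cue_needed(s):.2f}")

    s = restimulate(s, minutes_since_last=300)  # wait too long: back to baseline
    print(f"after a long gap: strength={s:.1f}")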

You might right now be thinking: does that mean anything to the real world of classrooms and education subcommittees? The answer is emphatic, and I will say it five times: no no no no no. There is a great deal of distance between the concept of synaptic strengthening and curriculum design. Here is the reason I am telling you this anyway. What I find so astonishing is that we are actually creating technologies that will eventually be able to address and ultimately bridge this large gap, and turn my five no's into a hundred yeses. In other words, science is providing a context, a framework, that allows us the opportunity to understand the physics and chemistry behind the extraordinary ability of humans to acquire information.

I wish I could live to be 200 years old, just to see how this gap will close. We see this not only at the level of the molecule, looking outward to external behaviors, but also in the opposite direction, with external observation giving us hints as to what happens deep in the interior of our brains. Whenever anyone starts looking at the research on human brains and learning, the first impression one gets is that the organ is enormously sensitive to environmental cues. So sensitive is it to outside inputs that neurobiologists use the word "plastic" to describe its responsive abilities. Let me give you an example.

This work comes from Tad Tsunoda at the Medical and Dental School in Tokyo. I have never seen it published in an English-language journal, though MIT Press quotes from this work in the book Science of Mind.

A group of linguists were interested in the effects of language sounds on brain development. Specifically, they looked at people who learned languages heavy in vowel sounds – like Hawaiian – as their primary language, and compared them with people who learned languages heavy in consonant sounds – like Russian. Using PET scans and MRI technology, they did things like play a violin for the two groups of people. Or have them learn something. Or give them a stressful situation. Their results were astonishing. It appeared that the people who learned Hawaiian first used different parts of their brains to hear violins, learn or become stressed than people who heard Russian first. And the difference was consistent – i.e., there was a Hawaiian pattern and a Russian pattern in each of these studies. The upshot of the work was that brains could rewire themselves based on – what? – syllables, vowels and consonants? There is no such thing as vowels and consonants in nature. Those are just man-made ideas. There are simply compressions and rarefactions banging on people's eardrums. But the brain is so sensitive to outside input, so plastic, in the words of the neuro guys, that – what? – you can actually rewire it simply by speaking different kinds of textured languages? Is that true? And if it is, the great question becomes: what other environmental inputs help rewire the brain?

The point here is that the brain is enormously plastic, and we are only just discovering the kinds of inputs it is sensitive to. We not only know that it is very flexible; we know that the human brain is very finicky about when it wants to be flexible, and about what kinds of talents will come out as a result of its plasticity. In other words, there are the famous critical periods of learning. Consider how deeply these things are etched.

Any self-respecting neurobiologist will tell you that it is an absolute disaster if a kid is born with cataracts and the cataracts are not removed within several weeks after birth. Why? Because there are nerves deep within the infant brain that will not hook up correctly – in fact may not hook up at all – unless the eyes are exposed to light. If those nerves are not given photons, they will not receive their final wiring instructions. A baby can literally go blind, even if everything is anatomically perfect, if it cannot acquire a few photons. We say that those connections are experience-dependent.

Most of you know the most famous example of a critical period of development, which is language acquisition. But a colleague of mine, Patricia Kuhl, also at the University of Washington, has discovered just how early this begins and ends. She has focused on language acquisition. Here's a synopsis of her work, beginning with her research rationale.

Language allows predictability to come into our world very quickly. Language gives people the common ability to assess what is in each other's brains, and therefore allows the magic word of human survival – learning – to take place rapidly and predictably.

Here's an astonishing fact. We know that a baby, right out of the box, comes pre-loaded with the ability to make every sound known in human language. We are pre-loaded to talk, and we may even be pre-loaded to structure our talk in a logical fashion. But not just any talk. We are pre-loaded to talk in the sounds we are hearing around us, and we are just as pre-loaded to filter out those sounds which are not around us, and in so doing, create an increasingly predictable verbal world.

Pat Kuhl has been able to show that this amazing talent to discriminate between all sounds is short-lived. Within a year, a baby can no longer discriminate between all the sounds of all the world's languages, though she could when she was six months old.

Rather, by 12 months of age this talent is lost; she can only discriminate and hear – that's right, I said hear – the sounds of the languages to which she has previously been exposed. The critical period is between the ages of 6 and 12 months. Thus, even though we are born able to distinguish and hear all human sounds, by 12 months of age we can only hear our own languages. In other words, exposure to a particular language alters our brains and shapes our minds, so that we perceive sounds differently.

One of the great examples of this: native Japanese speakers raised without exposure to English as infants cannot tell the difference between the words rake and lake. They hear them as exactly the same word. That was not true when the native Japanese speaker was a baby. At 6 months, a Japanese baby can hear the difference. But at 12 months, that talent is gone, and it is gone forever. Don't feel too smug. A native English speaker can only hear a single transition on a tape of the sound of b gradually changing into the sound of p. Thai speakers can hear three separate stages of that movement.

So you have a visual critical period of learning. You have an auditory critical period of learning. It appears as if you are born with various stopwatches inside your head that actually started ticking away even before you were born. These stopwatches govern your ability to acquire various pieces of information, and once they have ticked down to zero, it is impossible to acquire new information.
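A purely illustrative way to think about those stopwatches (my own toy sketch, not Kuhl's model or data): treat each sound contrast, like rake versus lake, as something every infant can hear at first, and then, once the critical window closes, as something heard only if the home language exercised it.

    # Toy model of a perceptual critical period.
    # The window and the rule are illustrative assumptions, not Kuhl's findings.

    WINDOW_END_MONTHS = 12  # after this age, unexercised contrasts are lost

    def can_discriminate(contrast, age_months, home_language_contrasts):
        """Before the window closes every contrast is heard; after, only exercised ones."""
        if age_months < WINDOW_END_MONTHS:
            return True
        return contrast in home_language_contrasts

    japanese_contrasts = set()     # assume the r/l contrast is not exercised at home
    english_contrasts = {"r/l"}

    print(can_discriminate("r/l", 6, japanese_contrasts))   # True: a 6-month-old hears it
    print(can_discriminate("r/l", 14, japanese_contrasts))  # False: the window has closed
    print(can_discriminate("r/l", 14, english_contrasts))   # True: kept through exposure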

Here's another talent babies possess. You might call it another piece of pre-loaded software, something babies appear to be born with, right out of the box. It turns out that babies are great hypothesis testers. What do I mean by hypothesis testing? You are very familiar with it, though you may not know you do it. In fact, hypothesis testing may be the single greatest characteristic of human intelligence.

..

