NOTICE: Posting schedule is irregular. I hope to get back to a regular schedule as the day-job allows.

Thursday, March 31, 2011


Sorry about no blog on Wed., Mar. 30th - I did promise posts every even-numbered date, but I got off by a day, thinking I had until April 2nd.  Blame it on the day job.  I'll post something tonight or tomorrow to fill in.

Monday, March 28, 2011

"you're getting sleeeeeepy....."

Hypnosis, suggestion and false memory.

A question prompted by my last entry prompted today's blog... so I suppose this could be considered a "Mailbag" blog.


No, Ratface, this is not the usual mail-day blog.  You and the rest of the LabRats can get back to work.


No, Ratface.  Not today.


Um... hey, Ratface!  Watch this... [pocket watch swings back and forth on its chain]  You're getting sleepy, very sleeeeepy.

Now, close your eyes, dream of cheese, and let me finish writing.


So, here we have the perfect cliché - a weak mind, easily influenced, the swinging watch, and *presto*!  Hypnosis can solve all of our problems!

So is it real?  And what does this have to do with recent blogs on memory?

To really cover hypnosis would require delving into psychology, which is not the primary subject of The Lab Rats' Guide to the Brain.  However, experts agree that the hypnotic (or, more correctly: autohypnotic) trance is a state in which the conscious mind is less active, and one can interact with the subconscious.  Most people are familiar with stage illusionists and hypnotists, and quite frequently doubt the truth of the results.  We do know that a person cannot enter an autohypnotic state unless they *want* to (hence the "auto-" prefix), and they likewise would not act on a suggestion unless they were willing to do so.  Thus the doubters claim that a hypnosis stage act is all about placebo and peer pressure, while the believers cite brainwave studies and cases in which hypnosis has real, lasting effects on behavior.

A true hypnotic trance or "state" looks very much like a dream (or daydream).  The EEG (brain waves) shows many characteristics of sleep - first the low-power alpha rhythm of meditation and relaxation, then the very slow delta and theta rhythms of deep sleep, followed by the brief, fast rhythms of REM sleep.  Psychologists consider the state to be akin to the "conscious mind" sleeping, while the "subconscious mind" is free to listen and interact with the hypnotist.  We actually enter such a state many times during the day while doing tasks that are quiet, possibly boring, repetitive and automatic (driving! mowing the lawn, reading, etc.).

The state is also characterized by a lack of the "executive" governance of conscience, inhibition, and anxiety - hence the child-like nature and willingness to act on suggestion.  A major clinical/therapeutic benefit of autohypnotic trance is access to memory.  To understand why this is so, we must look back at what we know of how memory is converted from short-term to long-term storage:  If you want to remember something, you repeat it.  First we repeat the phone number to ourselves, then we write it down, often repeating it again.  Then we read it back.  The brain does the same thing, shuffling short-term memories from prefrontal cortex to hippocampus and back again - many times!  Then we have to let the remembered item sit idle, and come back and repeat it again at a later date.  In our prior discussion of dreams, I mentioned that experiences are most likely repeated in the next sleep or dream period - in fact, interrupting sleep is a good way to interrupt long-term memory.

The long-term storage process is called "consolidation" (or "re-consolidation" as it affects existing patterns) and it helps to store memories by building *associations* with other memories.  I discussed this in the blog on "flashbulb" and PTSD memory, but it is just as important to normal memory storage.  Association is the key, or combination, that allows us to retrieve memory when it is needed.  What hypnosis does is allow access to the association and consolidation aspects of memory, much the same as during sleep and dreaming.  Thus a "post-hypnotic suggestion," one which lasts or takes effect long after the end of the hypnotic trance, forms a new associational relationship to existing memory.  This is useful in therapy for anxiety, weight loss, smoking cessation, or in setting up beneficial mental states for surgery or exercise.

Some therapists utilize hypnosis to access repressed memories.  However, from an ethical psychological perspective, such use is not without dangers.  Because the brain is in the "consolidation/association" state during a hypnotic trance, any memory accessed is also subject to creation of new associations, much the same as flashbulb memories are subject to conflation with other events.  In fact, the use of hypnotic regression to uncover repressed memories in children and abuse cases is increasingly condemned.  Essentially, every time a memory with strong psychological components is recalled, it can be modified.  In the case of PTSD, the emotional content produces physiological changes - rapid heartbeat, shortness of breath, sweating, anxiety.  The resulting emotional state then re-consolidates with the original memory, strengthening the associated trauma and fear.  Use of hypnosis to treat PTSD requires careful manipulation to *reduce* emotional triggering as the memory reconsolidates.

The greatest risk of memory manipulation in this state is that of false memory.  A therapist, trusted by the patient, who consistently questions a patient about a repressed memory, can easily set up associations in the patient's subconscious such that the patient believes the *suggestion* strongly enough to associate it with memory.  Suggestion and reconsolidation do not require hypnosis, but they are certainly aided by the autohypnotic trance.  In fact, all that is really needed for formation of a false memory is to repeatedly recall an existing memory under conditions that allow addition of false elements - such as constantly relating an incident with few original details, but strong emotional content (as with flashbulb memories) - and the additions take on the seeming of the original memory.
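Since recall-as-rewrite is essentially a feedback loop, it can be caricatured in a few lines of code.  This is strictly a toy model - the "details," the 50% contamination probability, and the one-false-detail-per-retelling rule are invented for illustration, not drawn from any study - but it shows why accuracy can only drift downward when every recall is also a rewrite:

```python
import random

def recall(memory, true_details, context, contamination_p=0.5, rng=random):
    """One recall/reconsolidation cycle.

    The memory is "rewritten" as it is replayed; with some probability a
    detail from the *current* context is stored alongside the originals.
    Returns the fraction of stored details that are actually true.
    """
    if context and rng.random() < contamination_p:
        memory.add(rng.choice(sorted(context)))   # a false detail sneaks in
    return len(memory & true_details) / len(memory)

rng = random.Random(42)                           # fixed seed for the demo
true_details = {"lab meeting", "professor", "news of the accident"}
memory = set(true_details)                        # the freshly stored memory
context = {"documentary", "color TV", "funeral", "a second accident"}

accuracies = [1.0]
for _ in range(10):                               # ten emotional retellings
    accuracies.append(recall(memory, true_details, context, rng=rng))
```

Because the toy memory only ever gains details and never verifies them, the fraction of true details is non-increasing - a crude analog of why repeated emotional retelling degrades, rather than preserves, a memory.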

A key sign of false memory is when the physiological reactions do not match the remembered events.  A soldier remembering combat will react to the sights, sounds and smells of combat; a young man remembering a fleeting encounter with a beautiful young lady will remember the warmth of her touch, the smell of her perfume; in each case, the brain and body areas involved will be active and engaged in recall of the memory.  Absent or inappropriate emotion will result when the associated sensory information is separated from memory - for the good, as in PTSD and abuse therapy, and for the bad, as in implantation of false memory.

Memory is fragile.  We can impair the mechanism for making new memories (amnesia) and for recalling existing ones (Alzheimer's disease); we can even alter the memories we think are solid with the addition of unrelated information.  Hypnosis can work by adding beneficial suggestions, but it can also irretrievably alter the real memory and obscure the truth.  Truly, memory is something to be tampered with only with extreme care and skill.


At the count of three, Ratface, you will wake up, and go help Ratley take out the garbage, and forget about the cheese I promised you.





Yeah, yeah.  Here's your cheese.  I never said I was a *good* hypnotist!

Saturday, March 26, 2011

It came to me in a flash...

Memory, “Flashbulb” memory and PTSD.

In general, memory is a weak process. Learning and remembering factual items requires repetition and retesting. If you want to remember a phone number, you repeat it - and if you can’t write it down, interrupting the repetition makes you forget the information. Skills and more complex facts require more repetition, usually spread over days. The overnight period with REM and nonREM sleep is very important to shuttling memory from short-term temporary memory to long-term memory (this is called “consolidation”).

Standard memories are hard to record and easy to erase (or at the very least, easy to lose track of). There is a subset of memory and learning that is permanently stored after only a brief or even single exposure. What we as scientists know is that strongly recording a memory requires the input of several brain areas. At the same time, memory is stronger when it incorporates associated information such as sensation (light/sound/smell) and emotion.

There are three main areas of the brain involved in memory: the “Frontal Lobe” area above and behind the eyes, the “Medial Temporal Lobe” just inward from the ears, and the “Basal Ganglia”, which are the deep structures in the middle, bottom of the brain. A key structure that I will mention a lot is the Hippocampus, which is a major portion of the Medial Temporal Lobe. The hippocampus is the main processing area for memory. Along those same lines, the Frontal Lobes are commonly thought of as the brain region we “think” with, while the Basal Ganglia are involved in motivation and reward.

Repeating and rehearsing memory involves passing the information back and forth between the Frontal and Medial Temporal Lobes. Frontal areas can retain information for about 10 seconds to a minute; the Hippocampus can hold information for 10 minutes to an hour. Learning also requires an evaluation of *reward* - the relative value of the information to be stored - thus incorporating connections with the Basal Ganglia. We are finding that the greater the value of the “reward” reinforcement, the easier a memory is to store, and the harder it is to erase. Remembering usually invokes a repeated shuffle of information from Frontal to Hippocampus and back again, with the Basal Ganglia adjusting the strength and speed of the process. By the way, this process *is* tied to sleep and dreaming, but some parts we don’t really know, and the parts we do aren’t really relevant to this discussion.
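The rehearsal shuffle just described can be sketched as a toy loop. The gain, decay, and reward numbers here are invented purely for illustration (they are not measured values), but the sketch captures the claim: each Frontal-to-Hippocampus pass strengthens storage in proportion to the Basal Ganglia "reward" signal, so a high-reward event can stick after a single exposure while a neutral fact needs many repetitions.

```python
def rehearse(strength, reward=1.0, passes=1, gain=0.3, decay=0.1):
    """Toy consolidation loop; gain/decay/reward values are invented.

    Each pass shuttles the item between the short-term ("Frontal") and
    intermediate ("Hippocampus") stores: strength grows in proportion to
    the Basal Ganglia "reward" signal, then decays a little while idle.
    """
    for _ in range(passes):
        strength = min(1.0, strength + gain * reward * (1.0 - strength))
        strength *= (1.0 - decay)   # loss between rehearsals
    return strength

# A neutral fact (low reward) needs many repetitions to stick...
neutral = 0.0
for _ in range(8):
    neutral = rehearse(neutral, reward=0.2)

# ...while a strongly rewarded (emotionally charged) event sticks at once.
charged = rehearse(0.0, reward=3.0)
```

With these made-up constants, one strongly rewarded pass ends up stronger than eight weakly rewarded ones - the same asymmetry that single-event "flashbulb" memories exploit.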

Now, on to the unusually strong, single-event memories. There are three main types of events that lead to strong memories: very strong emotion, stress/danger and drug abuse. The examples of remembering events associated with marriage proposals, birth of a child, the Moon Landing, etc. fall into the strong-emotion category. They tend to be clear memories with a lot of associated sensory information – day/night, temperature, cooking smells, music on the radio. The emotions can be positive or negative, and the accuracy of these memories is usually around 80% correct (compared to a well-rehearsed “learned” accuracy in the mid 90’s). They can be this accurate, despite being based on only a single event with no “repetition,” because the emotional feedback strengthens the “reward value” inputs to Hippocampus and does not require repetition. The reason the accuracy is not 100% is that every time a person “relives” a memory, it is in fact “rewritten,” and there is a chance for *current* information to be written along with the memory. More on this later.

Stress/danger memories include the category of “Flashbulb” memories. Strong negative emotion, fear of death, loss of a loved one, traumatic circumstances – all invoke extremely strong stimulation from the Basal Ganglia to the Hippocampus. It’s a survival mechanism: Touch the fire and feel pain – store the memory “Do Not Touch Fire!” and don’t require it to be repeated in order to be remembered. One of the features of these “pathological” memories is that replaying the memory invokes all of the emotional sensation again. Since strong emotion affects the ease of memory storage, these memories are in even greater danger of having other, similar circumstances conflated with the original memory.

Hence, my Flashbulb Memory of hearing about the Challenger Explosion in 1986: I was present at a lab meeting discussing the unfortunate accident of a fellow student when a professor burst in to tell us the Space Shuttle had exploded. The reality is that the two events were separate, but only a month or two apart. There was a lab meeting setting each time, and both were strong negative emotional events. Somehow, the retelling of the Challenger event invoked the similar memory of the student’s accident and the two became conflated. [Likewise Sarah remembers viewing the JFK assassination and funeral on a color TV when there was no TV, let alone color, in her house at the time. Seeing a JFK documentary at a later time likely caused her to recall the emotional memory of the Portuguese President’s funeral and the two became associated. ] Flashbulb memories are strong and easily recalled, but their accuracy is typically less than 50% because the same mechanism which makes them easy to recall – that is, the emotional content – is also replayed and makes the memory sensitive to contamination. Generally the only way to detect these errors is by some external event that causes us to check the facts – such as being challenged by another person who was present at the same events and then consulting a factual source to realize the inconsistencies in our memories. It is also noteworthy that strong emotion memories are *very* hard to erase.

This brings us to PTSD: the extreme end of stress and emotionally charged memories. An enhanced sense of personal danger or trauma is the strongest modulator of memory. Battlefield and trauma memories are particularly vivid, incredibly easy to recall, and charged with multiple associations – in fact, the associations are so strong that a PTSD sufferer can have flashbacks triggered by a sound or smell that might seem to have *no* relationship to the event, but have become indelibly linked with the trauma. PTSD memories are nearly impossible to erase, and the current treatment options involve finding a way to interrupt flashbacks and prevent the replay/rewrite consolidation cycle. Note they also are *not* limited to battlefield incidents! Although pathological memory does not *require* repetition in order to store the memory, it *does* get repeated every time the subject has a flashback. Thus two memory mechanisms are invoked, contributing to the difficulty in breaking the cycle. There are not many good indicators of how accurate PTSD memories are, but it is likely, given the ease with which flashbacks can be triggered by associative memory, that there is a considerable amount of contamination.

OK, I mentioned drug abuse at the beginning, and flashbacks in conjunction with PTSD. How do these tie in? Well, a lot of what we know about pathological memory is now starting to tie in with drug withdrawal and relapse. We know that drugs such as cocaine, Meth, LSD, etc. all operate on neurotransmitters that are most prominent in the Basal Ganglia. Marijuana and narcotics do as well, but not as strongly (in fact, any euphoric sensation alters emotion and Basal Ganglia activity). Conditions underlying relapse and drug cravings turn out to have a strong component of pathological memory. The drugs actually *trigger* the neurotransmitters in the Basal Ganglia to promote pathological memory associating the “high” with the surrounding conditions. Thus pictures of drug paraphernalia really *do* promote craving, because they associate with a pathological memory. Cocaine and Meth are the worst offenders, and not coincidentally have the greatest effect on the Basal Ganglia. Like PTSD, these are pathological memories, but now we are learning the actual physiological and pharmacological basis for the memory, and starting to look into ways to prevent or degrade the memories.

Current treatments for drug abuse use many of the same relapse-prevention techniques as PTSD therapy, but researchers are now beginning to look into ways to use the physiology and pharmacology to best advantage and truly *erase* these memories.

Both fields stand to benefit.

Thursday, March 24, 2011

Amnesia who?

Note: The following appeared last year as a guest post to "Mad Genius Club", a daily blog which rotates between science fiction/fantasy authors. This take on amnesia, as well as the appeal to authors to *not* overuse the cliché, seemed appropriate to the current topic being covered in The Lab Rats' Guide to the Brain, so I present it as my last "canned" post while on vacation.


"Mrs. Smith?"

"Yes Doctor?"

"Your husband suffered a terrible head injury. He's in a coma."

"Oh, Doctor, will he be all right?"

"We'll only know once he wakes up."

"Ashley? It's me, Melissa!"

"Where am I? Who are you? What happened? Who am I?"

"Oh, no!"

It's a familiar theme, amnesia as a plot device. Overused, trite, cliché, yes; but also terribly *mis*-used.

Hi, the bloggers of the Mad Genius Club have asked me to contribute a series on the science behind science fiction/fantasy. I don't claim to be a Mad Genius, nor am I necessarily a Mad Scientist – a bit upset at times, but not truly Mad! Bwahahahaha! (I think we can safely save that label for Dr. Freer.) However, I am a neuroscientist, currently employed as a faculty member at a medical school.


What? Oh, yeah. This is Ratley, an intelligent lab rat. Actually that's LabRat, they insist on the capitals. Ratley and his friends will help me with these blogs.

So, on to today's topic: SF/F clichés regarding the brain with particular emphasis on amnesia.

Amnesia is little understood by the lay public. The most common experience of amnesia is the soap-opera scene with which this column opened. But what is amnesia, and how does it *really* happen?

OK, classroom time.


What? I said, class…

[Squeak, squeak, squee…]

OK, if you insist,*you* tell them.

[Ahem. OK, y'all, I got the stuffy Doc out of the way. As the Doc said, I'm Ratley, and I've *experienced* amnesia in the lab. Let me tell ya, it ain't no picnic. Amnesia means "without memory," and there's two typical types – retrograde amnesia, meaning a loss of memory from the past. The other kind is called anterograde amnesia and it means loss of memory "forward" into the future. I've had 'em both, and they result when a part of the brain that processes memory ain't workin'. ]

Excuse me, Ratley?


Are you going to explain what you mean by "future" memory? Shouldn't you tell them that anterograde amnesia is a lack of ability to make *new* memories?


I will, thanks.

In fact, anterograde amnesia is the most common form of amnesia, even though the retrograde form (poor Ashley, above) is better known. Imagine trying to remember a phone number but never quite managing it; reading the same newspaper over and over again, never recalling the previous read; or never being able to remember where you'd left your keys, your car, your kids, your wife…

How can this be? Well, let's start by looking at how amnesia happens.

Take Ratley for instance.


Not literally, calm down, please! I'm just giving an example!

When Ratley said he had experienced amnesia, he meant that in the lab, scientists use a chemical to temporarily put part of the brain to sleep, causing amnesia – which type depends on the brain area affected. In humans, amnesia usually results from damage to the brain. Oh, but not just any damage! It has to be a specific type of damage to specific areas of the brain. Damage can be a traumatic head injury: Ashley's tragic soap-opera car crash, or the angsty teen's headfirst dive into an empty swimming pool. Damage to specific brain areas can also occur due to epilepsy, stroke, tumor, hemorrhage, infection (meningitis or encephalitis) or drug interactions.


Yes, Ratley, just like Ratface. See folks, Ratface did a bit too much LSD in the 60's. He's harmless – really – but not all there.

And what are those brain areas? Well, in scientist language, they are the pre-frontal and frontal cortex (for retrograde amnesia); hippocampus, medial temporal lobe and diencephalon (for anterograde amnesia). Traumatic injury, tumor and stroke can affect any of these areas; infection and hemorrhage are most likely to involve the frontal and prefrontal cortex, while epilepsy and drugs are most likely to affect the hippocampus, temporal lobe and diencephalon.


Yes, I know. Go ahead. Ratley wants to show you how to tell the brain areas apart.

[[sq…] Oh, sorry about that. Okay, humans. You've got those big hands with nice opposable thumbs. So, unhand that mouse and keyboard! Now, place your index fingers on your temples, yes, the soft areas at the side of the forehead. Feel that? It is the most direct access to your brain except from the inside. From your fingers to the center of your forehead is frontal cortex. If you draw a line between your forefingers across the top of your head – that's the prefrontal cortex. Move your fingers straight back until they are directly in front of your ears – that's the medial temporal lobe and hippocampus. Move the fingers below and behind the ears – straight in from there at the center, bottom of the brain is the brain stem, also known as the diencephalon.

[What about the other areas, the top of the head, the back, the base of the skull? You humans just *love* to make movies where the bad guy hits the hero on the top or back of the skull with the butt of a gun – he (or she) loses consciousness and wakes up in the hospital with amnesia. Silly humans. Listen to the rat, now: it's not gonna happen that way.

[Next exercise, put your thumbs directly in front of your ears and lace your other fingers over the top of your head. That's the sensory and motor areas – controlling all sense of touch, position and pain, and moving the various muscles of the body. From there to the back of the skull is the visual area, responsible not only for sight, but also for interpreting what you see. Run your fingers down from the top, center of your head, to the very back of the skull. Feel that slight dimple? The bony area right below it protects the cerebellum, responsible for coordinating all of the muscles involved in any movement. ]

Thanks, Ratley!

So, in our story brave Ashley foils the terrorist, gets cold-cocked at the base of his skull for his troubles, and wakes up with amnesia, right? Well, no. He might wake up with some coordination problems, blurred vision or possibly "agnosia", a specific type of amnesia for words or faces, but not full-scale retrograde amnesia.

What about that mysterious alien parasite that "wraps itself around the cerebral cortex" and takes over its host, leaving total amnesia in its path?


No, Ratley, I know what you're going to say, but I'm *not* talking about Ratfink!

Leaving aside the fact that there is no *room* for such a parasite without sacrificing so much brain tissue that the host is clearly impaired in more than just memory, the description is not specific enough to suggest any particular type of amnesia. No, the more likely result will be pressure on the other parts of the brain causing the hapless host to stop thinking and breathing well before any amnesia could set in.

On the other hand, just about any surgery on the brain carries a risk of damage to neighboring areas. Anterograde amnesia is a common side effect, although retrograde amnesia is rarer. In fact, total retrograde amnesia requires trauma – massive infection, crushing injury to the central-to-frontal part of the skull, concussive blast injury. Anything less is unlikely to give total amnesia. Oh, sure, falling off a horse and hitting your head on a curb will likely cause a bit of amnesia – certainly for the 10 minutes or so immediately preceding the injury – but not the total "Who am I?" kind. Typically the amnesia lasts as long as the brain swelling that accompanies the concussion (about 24-48 hours) but usually only extends to memories from a few hours to a few months prior to the accident.
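The retrograde-vs-anterograde distinction is easy to get backwards, so here is a toy sketch in code. Everything in it is invented for illustration (the characters, the "facts," and the five-step loss window standing in for the real few-hours-to-few-months span); it only encodes the two rules from the text: retrograde amnesia erases recent, pre-injury memories while sparing old ones, and anterograde amnesia blocks new storage while sparing recall.

```python
class ToyPatient:
    """Toy model of amnesia types; all names and numbers are invented."""

    def __init__(self, window=5):
        self.facts = {}         # fact -> "time" it was learned
        self.clock = 0
        self.kind = None        # None, "retrograde", or "anterograde"
        self.injury_time = None
        self.window = window    # how far back retrograde loss reaches

    def learn(self, fact):
        # Anterograde amnesia: no *new* memories can be stored.
        if self.kind != "anterograde":
            self.facts[fact] = self.clock
        self.clock += 1

    def head_injury(self, kind):
        self.injury_time, self.kind = self.clock, kind

    def recalls(self, fact):
        t = self.facts.get(fact)
        if t is None:
            return False
        # Retrograde amnesia: memories from just before the injury are
        # lost, but older, well-consolidated ones (and new ones) survive.
        if (self.kind == "retrograde"
                and self.injury_time - self.window <= t < self.injury_time):
            return False
        return True

ashley = ToyPatient()
ashley.learn("name of 10th-grade crush")    # long-consolidated memory
for day in range(10):
    ashley.learn("uneventful day %d" % day)
ashley.learn("the crash itself")            # minutes before the injury
ashley.head_injury("retrograde")
```

After the injury, `ashley` still recalls the 10th-grade crush but not the crash, and (unlike an anterograde patient) can still store new memories - matching the soap-opera "Who am I?" version only if the loss window were implausibly wide.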

So, how *do* you incorporate brain damage and/or amnesia into a plot? Ratley?

[First, keep it simple. If the big dumb hero takes a glancing blow to the head, he's not gonna have total amnesia and lead a complete second life for 20 years. Keep it simple, and keep it short. Give the big dummy amnesia for the day leading up to the accident, and only lasting a few days to a week. However, you *can* leave the actual events of the accident permanently forgotten.

[Second, avoid the obvious. Instead of giving the dude full amnesia, consider an alternative.


Right, Ratley. Alternatives to amnesia might be: (1) agnosia – the inability to remember faces or the names of common objects, (2) aphasia – the inability to speak certain words or names (we also call this the "tip of the tongue" phenomenon), or (3) neglect – an apparent inability to consciously notice objects that occur in particular places in our field of vision.

Back to you, Ratley.

[Thanks, Doc.

[Third, remember that total retrograde amnesia is *rare*. Instead of amnesia, give your character vision, hearing or coordination problems. There's a bunchaton of other stuff that happens after head injury.

[Fourth, keep in mind the differences in those different types of amnesia: give a human anterograde amnesia and they may not remember that they had the exact same conversation 15 minutes ago, but they can still remember the name of their 10th grade crush. Likewise, retrograde amnesia still leaves the ability to make new memories.

[Ah, excuse me a sec…

[HEY RATFACE! The cheese is over THERE!

[Ah, sorry about that, maybe Doc needs to finish this while I go clean up a mess...]

So, folks, Ratley's final point is to keep perspective. Amnesia typically means loss of memory for facts. Skills such as reading, riding a bike, speaking a foreign language, or solving complex logic puzzles – those memories are processed and stored in a different manner and are not subject to the same injuries. Even though Ratface can't remember where he left the cheese just now, he still remembers how to run mazes and get under Ratley's fur.

Finally, don't be too stuck on the rules (or clichés). Try something new, or figure out a way to let your protagonist function with just a partial injury. If you want help, don't hesitate to contact an expert –


- Or a LabRat! In fact, many scientists would be flattered to help out.

Tuesday, March 22, 2011

Hansen's Disease

(updated with pictures 3/23/11)

I am currently traveling on vacation. Today we visited Kalaupapa, on the island of Molokai, Hawaii, the site of the leper colony established by King Kamehameha V in 1865. The settlement is separated from the rest of the island by high cliffs, and is only accessible by sea, by air, or by a mule track down the cliffs. It made the perfect place to isolate inhabitants once thought to be infectious by their mere presence among undiseased persons. The colony was the site of the wonderful work of Jozef De Veuster – Father Damien of Molokai – the Belgian priest who ministered to the colony from 1873 to 1889, transforming it from anarchy to order, from despair to hope.

I offer this post to the blog, both in the interest of history, and to present a misunderstood disease in the context of neuroscience. Hansen's Disease – infection by Mycobacterium leprae and/or Mycobacterium lepromatosis – is not primarily a skin disease, nor does it result in "limbs falling off" as popular belief would have it. Leprosy is primarily a disease of the nervous system. The mycobacterium damages the sensory nerves of the periphery that lie close under the skin, first affecting tactile sense, then the fibers that transmit pain. The pale, blotchy skin lesions that have been the hallmark of leprosy since ancient times appear as the "dermatomes" of the skin lose their neuron connections.

Dermatomes are regions of the skin that are served by a common source of "afferent" nerves ascending to the spinal cord and brain, and "efferent" nerves returning to the skin. Severing a single nerve will result in loss of sensation in a patch of skin from <1 to >10 cm square, depending on location on the body. Dermatomes on the fingertips are quite small, reflecting the many densely packed nerve endings that provide fine touch and sensitivity. Dermatomes over the ribs, hips and thighs are quite large, since tactile sense in those regions does not need to be as precise. Hence dermatomes are a necessary feature of the brain's ability to localize *where* a sensation is coming from.

With the most common form of Hansen's Disease, the loss of sensation occurs first, and the skin lesions appear later as the dermatome loses all neural connections. A rarer variety exhibits the pale lesions, raised patches, nodules and bumps first, with the numbness and sensory loss occurring much later. Ironically, the loss of neurons is due to the body's own immune system, much the same as in other neuron diseases such as myasthenia gravis, multiple sclerosis (MS) and amyotrophic lateral sclerosis (ALS – Lou Gehrig's Disease). Mycobacteria infect the neurons and change the outer membrane. Immune cells recognize the neurons as infected and damaged and "remove" them, resulting in the loss of neural connections between skin and brain. As more neural connections are lost, they also include the neurons returning from brain to skin that regulate blood flow, perspiration, and other factors, producing the lesions normally associated with the disease. Severe lesions and loss of limbs result not from the disease itself, but from untreated infections that are (A) undetected due to lack of pain, and (B) slow to heal due to loss of neural control of blood and lymph flow. Untreated infections result in gangrene, cartilage damage and loss, and bone loss, leading to lost or shortened joints and digits.

We now know that leprosy is *not* very contagious. The amount of contact required to be infected is usually only encountered by family members or caregivers, such as Father Damien. It is likely *not* transferred via the skin or by the lesions, but by nasal secretions and mucus, much the same as influenza. There appears to be a genetic susceptibility, resulting in the disease occurring within families (as well as due to the close, repeated contact), and we now also know that >95% of humans are naturally immune.

Contrary to the Ancient Greeks who first described what we know as Hansen's disease, it is not about being "unclean." Instead, it is all about the neurons, but then, isn't everything?

Update 3/23/11:  For my Facebook friends, pictures are up on my Speaker to Lab Animals page.  I actually wrote this column a week ago, and just returned from the actual tour a few hours ago.  Kalaupapa Peninsula is a study in contrasts, incredible beauty for a site of such despair.  The greatest contribution of Father Damien was that he brought hope to the patients and their families.  It is an incredible experience.

The world turned on edge...

This is the second of the "special" off-topic travelogue posts from my vacation. 

Several decades ago I read Robert Silverberg's "Lord Valentine's Castle" and was struck by the imagery of a mountain so vast it stuck up out of the atmosphere.  Likewise Niven's "Ringworld" painted a word-picture of a landscape curving up and over, to vertiginous effect.  I don't know if either author spent time in Hawaii, but I now know the nearest we can come to the experience.  Sure, thanks to the "Halo" games, we have artists' concepts, but on the island of Maui, the shield volcano Haleakala rises so gently out of the landscape that it appears to simply be level ground that has been turned on edge.  I've felt this feeling a few times before, in countryside where the locals build and use the slopes as if they were level ground.  The feeling of vertigo - as if the slope is level and *you* are at an angle - can be especially acute.  On Haleakala, one can just imagine a massive mountain that rises above all, and pilgrims journeying years to reach the summit.

On another part of the journey...

Maui has a famous highway - 50-ish miles long, with hundreds of curves, one-lane bridges, and countless waterfalls and scenic views - considered one of the most lush, beautiful drives on earth... and on any given day, it seems as if the whole earth is driving that same road.  On the opposite end of Maui is another road, just as curvy, half as long, and with *way* less traffic.  While not as lush, it contains some of the most beautiful stark views of the ocean and rugged West Maui coastline.  Well worth the journey, either one, but I know which I prefer.  Sorry, Hana. 

Sunday, March 20, 2011

Out on a limb...

Hippocampus and the limbic system.

The chart at left shows the comparative organization and complexity (if not size) of the brain as developed through different species. One of the things to which I'd like to call attention is the cerebrum and that little bulb sticking out (and left) from the bottom of the brain. That's the olfactory bulb, and it is connected to the most primitive part of the brain – pyriform cortex, which ultimately ends up on the inner surface of the human Temporal Lobe. The cerebrum of fish, reptiles and birds contains a structure which is functionally analogous to the mammalian hippocampus, entorhinal cortex, and other elements which come to form the "limbic system." In fact, if one presupposes a connection between olfaction/nose/rhinal, it may suggest the origins of the rhinal, perirhinal and entorhinal cortices. It would be *wrong*, but a fanciful association nonetheless. [The "rhinal" areas surround the "Rhinal Sulcus," a groove on the surface of the Temporal Lobe which an early anatomist thought looked like a nose!]

The most primitive brain had a single layer of neurons on the outer surface, compared to the 6-layer neocortex of mammals, and was termed "archicortex" or primitive cortex. Development of the "new" neocortex over the archicortex resulted in the primitive areas folding under and into the ventral (bottom) brain regions. This is one reason why the hippocampus – the most notable of the archicortex structures – is very large and near the top of the brain in rodents (and thus, *very* easy to study), but rather small and convoluted into the inner Temporal Lobe in humans.

The limbic system (below right) connects olfactory senses to the rest of the brain, is intricately involved in memory processing, provides a "timing" signal or oscillation that is critical to coordinating memory storage, recall, movement and sense of time, and processes some aspects of emotion.

One of the first things I recall learning about the Limbic System came from the results of lesions – cuts or damage to the pathway. Fornix lesions resulted in loss of memory and of the rat's ability to navigate a maze. Lesions of the septum led to "anger management issues." Amygdala lesions resulted in either loss of fear or uncontrollable fear.

The second thing I recall learning is the importance of the sense of smell to memory. Olfactory inputs are the single most numerous (and only *direct*) sensory inputs to hippocampus. The sense of smell is a strong trigger of association in memory (and we will get to that in a couple of days) – and is thought to be responsible for the sense of déjà vu, in which a smell triggers memory even in a novel situation, providing a sense of *almost* recalled memory and familiarity. We now know that smell, fear, stress and emotion are all intricately linked, and that linkage is processed through the limbic system, providing emotional context for memory.

Which brings us to the hippocampus. While not the root of all memory (there is extremely short-term memory processing in the Frontal Lobe, as well as in the basal ganglia), it is the site of most of the memory processing for working memory (i.e. long enough to complete a task) and for conversion to long-term memory. One of the important features of hippocampal neurons is theta rhythm. Theta is a 6-to-12 cycles-per-second oscillation driven by groups of neurons in the medial septum (in the basal forebrain) that acts as a "clock" for a lot of activities in hippocampus. Recording electrodes placed in rat hippocampus reveal a strong wave of activity every 80 to 200 milliseconds as a volley of neurons fire action potentials down the long axons terminating in hippocampus. Hippocampal neurons fire in various relationships to this rhythm, and the actual "phase" of firing can be used to represent specific information.
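As a quick back-of-the-envelope check on those numbers, a rhythm's frequency in cycles per second converts to an interval between volleys of 1000/f milliseconds. A minimal sketch (the function name is mine, not from the post):

```python
# Convert an oscillation frequency (Hz) to the interval between
# successive volleys of firing, in milliseconds.
def volley_interval_ms(freq_hz):
    return 1000.0 / freq_hz

# The 6-to-12 Hz theta band corresponds to one volley every ~83-167 ms,
# in line with the 80-to-200 ms intervals seen in the recordings.
for f in (6, 8, 12):
    print(f"{f} Hz -> one volley every {volley_interval_ms(f):.0f} ms")
```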

For many years it was known that the human hippocampus processes *new* memory. The famous case of patient H.M., who had the medial temporal lobe (including hippocampus) removed from both hemispheres to halt his epilepsy, showed that without a hippocampus, a patient was unable to make and hold memory for more than 10-15 minutes. However, in rodents, the primary type of memory processed by the hippocampus was thought to be spatial (i.e. running a maze). Given that theta rhythm increases in power (and slightly in frequency) with movement, and that hippocampal neurons fire only in certain places within the environment (and at certain phases of the Theta oscillation), the hippocampus *must* be the site of a cognitive *map* of the environment! We have since discovered that such a map is merely one form of *association* between the animal's environment and its behavior, and hence an important, but not exclusive, function of hippocampus. Indeed, the Theta Rhythm appears to be crucial in setting the sequence of associations, and hence the sense of time and order of memories.

The hippocampus receives inputs from all of the sensory association areas of the brain, as well as prefrontal cortex and striatum. Outputs from hippocampus lead back to the same areas, plus motor planning and interpretation. There are some who *still* wonder if hippocampus isn't the missing "Director" of all brain function, although as mentioned in last Monday's blog, that role is best filled by the Frontal Lobe. Still, the hippocampus is placed to receive and *associate* most, if not all, of the sensory information in the brain, and use it in the encoding of memory. Without hippocampus, there is no short-term memory (longer than a minute) and no new long-term memory. If the hippocampus is damaged or inactivated (using anesthetics) between a behavior and the next sleep period, working memory is not converted to permanent long-term storage. While memory recall is still possible without the hippocampus, the ability to use associations to assist in recall is impaired.

From here, the next logical step is to talk further about memory and amnesia, and we will do that in the next series of blogs on amnesia, memory, and abnormal memory processes.

So *remember* to tune in next time!

Friday, March 18, 2011

The Temple of the Mind

At least one source says that the Temporal Lobe is so named because its forward edge lies just below the temple. More likely, the name refers to proximity to the temporal bone, just below the temple and forward of the ear. As you may recall, in the naming exercise for lobes of the brain, I referred to the Temporal Lobe as the *thumb* of a mitten or baseball glove. Like a glove, the inside of the Temporal Lobe is as important as the outside.

We have come to realize that the only lobe with a single function in the human brain is the Occipital Lobe (and that is due to the dominant role of vision in primates). Like the Parietal Lobe, the Temporal Lobe fulfills many duties: sensory (hearing), executive (memory) and association (auditory and memory). It should be noted that these divisions of the brain are largely based on surface landmarks, and bear much less relationship to function. If pure function were invoked, there would be many more than just five lobes to the brain (and in fact, some catalogs list a “Limbic Lobe” which resides entirely on the inner surface between hemispheres and on the inner Temporal Lobe). The dorsal (upper) surface of the Temporal Lobe contains auditory cortex and association areas. The lateral surface actually contains some of the memory storage of the brain, and is responsible for names and proper nouns (which makes sense in proximity to hearing). The ventral (lower) regions contain entorhinal cortex – one of the first relays in the memory system. The inner surface contains the hippocampus (see diagram, right), which is the primary relay for formation of new memory.

One of the terms most frequently associated with this region is “rhinal” (nose) named for the rhinal fissure, one of the deep sulci or grooves in the lower surface of the Temporal Lobe. The surrounding areas of Temporal Lobe contain the entorhinal and perirhinal cortex. On the medial (inner) surface is hippocampus and parahippocampal areas. Together, these areas serve as coordination and association areas for memory. In fact, most memory storage in the brain is via association. Related facts are stored together, so that recall of one memory can trigger recall of another.

Another feature of the Temporal Lobe is that the hippocampus is the terminal point of a circuit termed the “Limbic System” which, along with the amygdala, is commonly misstated as being responsible for emotion. Rather than being the embodiment of emotion (that role is reserved for Frontal and Prefrontal areas), the amygdala is involved in evaluating the emotional content of events – and particularly memory. However, those are topics for the next blog post on hippocampus and the limbic system.

If “newness” (in terms of development) of brain areas is measured by the degree of folding and convolution, then the Temporal Lobe is among the “newest” areas of the primate brain. Yet the hippocampus and limbic system are some of the “oldest” structures in the brain, dating back to the rudimentary brains of lizards and birds. More likely, the Temporal Lobe is an old structure with new functions added on as the brain developed. In some ways it is the most simplified, and in others, the most complex. For this reason we will spend several days on its structures and functions as we explore the implications of memory in the mammalian brain.

Next up: Hippocampus and limbic system. The primitive brain.

If I had wings...

As promised, this is special surprise post #1.  I have scheduled updates to this blog on even numbered days, thus on March 18, 20, 22, 24, 26th at 8 PM EDT, there will be a new blog related to the Lab Rats' Guide to the Brain. 

This is not one of those blogs.

It has been said that the best way to see the Hawaiian Islands is by air.  Yet, the quality of air tours varies from island to island.  Helicopter and fixed wing tours of Maui, Molokai, Hawaii (the Big Island) can be spectacular, but Oahu, despite being the most populous, and having the most divergent scenery, does not have the most spectacular air tours.

Instead, Oahu has spectacular aerial views from Terra Firma.

What few people understand about Hawaii is that all of the island residents and tourists are sharing the same few roads and highways.  Honolulu and Waikiki traffic can be quite heavy and slow.  Want to know why the islands "overreacted" to the recent tsunami?  Because (A) they knew that it *could* be bad, and more importantly, (B) they knew that it would take time to get residents to high ground.  The moderately populated islands of Kauai, Maui and Hawaii all have essentially one road.  Oahu has more, but Honolulu County contains 2/3 of the entire state population on 1/10th the total land area.

But the highways in Oahu offer some of the most spectacular views in the state (top billing goes to the Hana Highway on Maui).  The picture above was the second best view of my day.  Unfortunately, the best had to be captured in the mind's eye, as there was no good way to stop and take the picture. 

To reach the inland  areas of Oahu you climb up from Honolulu along a curving highway that passes among and over steep valleys between abrupt peaks.  You're still in the residential regions until you reach the upland plateau between the Ko'olau Range (to your east) and Waianae (to your west).  These two ranges are fragments of volcanic craters, and surprisingly, the bulk of what is now Oahu was *between* the craters.  The bulk of the original volcanoes slid down the slopes of the undersea mountain that is Oahu, leaving only the two disconnected mountains and a valley between.  This land is rich volcanic soil, and grows delicious pineapples, the lure that caused us to brave the traffic on this beautiful day.  After a visit to the James Dole Plantation, we returned toward the city, getting a good glimpse of Pearl Harbor as we descended back toward the coastline, then a quick turn onto H3 for our "aerial" tour. 

Among the shortest of all Interstate Highways, H3 runs about 15 miles from Pearl Harbor to the Marine Corps Base at Kaneohe Bay.  To get there requires crossing the Ko'olau - the only practical way is through.  Three highways have tunnels cut through the Ko'olau in this region, and each has a distinctive view.  The Likelike and Pali Highways pass further south than H3, and most of the view from those roads is at somewhat lower altitude.  The Pali Highway *does* have access to the Pali Lookout, from which the picture was taken.

However, H3, once through the tunnel, provides a view of Kaneohe Bay from above that is truly worth the drive.  You look down on the sparkling bay, shades of bright blue reveal the depth of the water and coral formations below the surface.  The slopes of the Ko'olau are bright green, the houses appear to be just toys below.  You also get a head-on view of the Marine Corps base runway, which further adds to the impression that you are airborne, hovering above ground, perhaps riding the thermals and onshore wind forcing a powerful updraft along the windward side of the island.

It is still the winter season in the islands, with strong northeasterly winds, and today was no exception.  Winds at the Nu'uanu Pali Lookout, when the above picture was taken, were *steady* at over 50 mph, and gusting much higher.  In fact, the area is so well known for its wind that in 1931, a sailplane pilot achieved a record 21+ hour continuous glider flight along the same cliffs.

And if I only had wings, I too, could have flown.  Instead I settled for a car and the best substitute for an airborne tour in Oahu.

Wednesday, March 16, 2011

Give me air!

Today I am traveling. In fact, I will be away for the next 8 days, and so have prepared a number of posts in advance to be able to continue the Guide while on the road. In addition, we should have a guest post by K. Mata, Goddess of Lab Rats, coming up soon.

In today's society, we take long distance travel for granted. One steps onto a jet in cool, dry Dallas, and deplanes 8 hours later in warm, humid Hawaii. In the meantime, we have traveled nearly 7000 kilometers at an altitude of over 10,000 meters. Of course that beats the Google Maps directions to "kayak across the Pacific Ocean" for 3500 kilometers, but it is still a trip through an inhospitable environment! We are sheltered by virtue of the heated, pressurized cabin of the airliner, providing us with a key indispensable component of survival – oxygen.

The neurons that comprise the brain are unique compared to the vast majority of cells in the body – they cannot store essential energy-producing compounds (glucose, glycogen, fats), nor can they utilize the "anaerobic" (non-oxygen-requiring) enzymes to provide energy during short periods of activity. Muscle cells store glucose; they can synthesize a polymer-like derivative of glucose – glycogen – that provides a reserve supply for exercise and exertion. Excess glucose is converted to fatty acids and eventually stored as fat in adipose tissue. For slow, steady exertion, glucose is "burned" – in fact, broken down enzymatically – via a process called the "Krebs cycle" or "citric acid cycle," which converts glucose to carbon dioxide, water, and energy (in the form of the energy transport molecule ATP). This cycle is "aerobic": it requires oxygen for completion. For brief bursts of intense activity, glucose can also be broken down to lactic acid in the absence or low availability of oxygen, via anaerobic catalysis. The lactic acid builds up and is responsible for muscle fatigue; oxygen is required to convert the lactic acid back to intermediate molecules, and then to CO2 and water.
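To put rough numbers on the aerobic/anaerobic difference, here is a sketch using commonly cited textbook ATP yields (the figures are mine, not from the post, and the exact aerobic count varies by source from roughly 30 to 38):

```python
# Approximate net ATP yield per molecule of glucose (standard
# textbook figures; the aerobic number varies ~30-38 by source).
ATP_ANAEROBIC = 2   # glycolysis alone: glucose -> lactic acid
ATP_AEROBIC = 32    # glycolysis + Krebs cycle + oxidative phosphorylation

ratio = ATP_AEROBIC / ATP_ANAEROBIC
print(f"Aerobic metabolism extracts ~{ratio:.0f}x more ATP per glucose.")
```

The order-of-magnitude gap is the point: a tissue that cannot run anaerobically, like the brain, is utterly dependent on a continuous oxygen supply.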

The brain, however, does none of this. It does not store glucose, nor does it manufacture glycogen or fatty acids. Neurons are very limited in anaerobic capabilities; sufficient oxygen for aerobic processing of glucose is necessary at all times – a build-up of lactic acid is damaging at best, and fatal at worst. For this reason, the brain is the organ most sensitive to low oxygen, low blood glucose and dehydration/low blood flow. Often the first symptom of any of these conditions is an altered state of consciousness, then delirium, then coma. It is the primary reason why hospitals monitor the O2 saturation of a patient's blood, and why diabetics must watch out for low blood sugar as well as high. Restricted blood flow – "ischemia" – and the resulting oxygen starvation are the main culprits in stroke and brain injury.

Fortunately, restoring blood flow and oxygen to the neurons of the brain usually results in a rapid restoration of consciousness and normal function. One of the first things taught in first aid is to treat the body for shock, which is primarily due to low blood flow to the brain. Keep the patient warm, lower the head and raise the feet to keep the brain well fed. The body's normal reaction to extreme conditions is to preserve blood flow to the brain at the expense of the body, but shock represents the opposite, and it is necessary to *help* the body and brain along.

Fortunately the body and brain are pretty adaptable. If a person moves to a high altitude, with lower oxygen content in the air, the body starts making more hemoglobin – the oxygen carrying molecule in the blood – and more red blood cells to carry the oxygen. However, the process takes several days, so the immediate adaptation is to increase the density of red blood cells by concentrating (dehydrating) the blood. Dehydration is a definite risk for high altitude athletes and hikers, so acclimating for about a week is often necessary to ensure peak capabilities.

The necessity for oxygen and glucose to be constantly supplied to the brain is also the basis for two of the most important means of viewing brain activity from the outside. Functional magnetic resonance imaging relies on the movement of oxygen into neurons, while some versions of positron emission tomography use an isotope-labeled version of glucose to track the most active neurons. Both methods give us a picture of the active regions of the brain on the basis of the uptake of their most essential nutrients.

So, next time you take a breath – appreciate it. Your brain is depending on it!

Monday, March 14, 2011

The Executive

The public perception is that emotion and personality reside in the temporal lobe, and in the brainstem/deep brain structures of the limbic system.  Memory is thought to be strictly confined to the temporal lobe.  Decisions regarding reward and value are the province of the basal ganglia – in particular, the striatum.

… and, so the perception goes, *somewhere* in the brain, probably down deep in the middle, is a "Director" that tells the other parts of the brain when and how to do their jobs.

The truth is that all of these functions are fulfilled in part by the Frontal Lobe.  Emotions appear to be processed on the dorsal Frontal Lobe, just forward of where the Temporal Lobe ends.  Extremely short-term memory occurs in the orbitofrontal cortex – just above the eyes (orbits) – and coordinates with the successively longer memory processed by hippocampus and stored with the assistance of the Temporal Lobe.  Decision-making quite frequently involves the "prefrontal" areas (the anterior, or forward-most, extent of the brain), including moral and value judgments.
The Director.  The executive function that makes the human brain… human… is the prefrontal region of the Frontal Lobes. 

Probably the least understood in terms of *how* it arises, Executive Function is at the leading edge of the Mind-Brain question:  How does the operation of billions of neurons and trillions of synapses become the conscious brain?  This blog does not have an easy answer for that.  However, what *can* be said is that neural modeling has been surprising: quite frequently, properties and features arise in a model that were not programmed in by the modelers.  We call these "emergent properties."  While we know a lot about how consciousness can be controlled or altered, we don't really know how it comes into existence, except to hand-wave it away as an emergent property.
Rather than engage in that time-honored Science Fiction tradition of "handwavium," let's talk about what we, as neuroscientists, do know:

(1) The Frontal Lobe is most active during tasks which require very short-term memory and decisions.

If you want to remember a phone number, how do you do it?  Repeat it to yourself.  Mentally *say* it over and over again.  That involves the speech centers in Frontal Lobe, and memory.

What do you need to make a decision?  Memory, knowledge about the environment, pre-programmed motor sequences to carry out actions, value of the decision, and evaluation of consequences.  Frontal Lobe connects to all of the association cortices, to the memory centers of temporal Lobe, to the basal ganglia for reward association, to amygdala and limbic system for emotional content, and most importantly, the motor association areas of the Frontal Lobe "rehearse" motor movements and provide feedback regarding the consequences of the action.  All of these functions are integrated via connections between the Frontal Lobe and the other brain areas. 

(2) Damage to Frontal and Prefrontal areas alters personality, emotion and the ability to make decisions.

The classic head injury cliché is that an impact to the back of the skull causes unconsciousness and an impact to the front causes amnesia.  The truth is much more complicated – a blow to the *base* of the skull compresses the brainstem and subcortical areas and can cause unconsciousness – but it can just as easily cause permanent damage.  The front of the skull is well armored.  An impact here (or anywhere on the skull) is more likely to have its effect via concussion, which results when the brain *sloshes* in its protective cushions (the outer membrane dura mater, the inner membrane pia mater, and the cerebrospinal fluid between the two) and bruises the cortex on impact with the interior of the skull.  Outright damage to prefrontal and frontal areas can alter consciousness, result in blurred vision, cause an inability to track moving (or stationary) objects, and impair the ability to move limbs to a defined goal.  In addition, frontal damage can alter personality, emotion and the ability to make appropriate decisions.

One of the most fascinating mental and cognitive tests is called "moral decisions."  A subject is given a situation and one of two choices.  The situation often involves something like death or serious injury to self, friends or strangers.  The decision will test whether the subject places a higher value on their own life, that of friends, strangers, one or many.  The *purpose* of the test is not *what* decision the subject makes, rather how *quickly* they make the decision (if at all) and whether they are content to stick with the choices given by the examiner or even make up their own alternatives within the rules of the test.  *Impaired* cognitive and moral decision making is characterized by snap decisions with no evaluation of alternatives or consequences.  You may have guessed that this is precisely the type of change observed in persons with Frontal Lobe brain damage.

The irony, of course, is that there may be no change at all.  The Frontal Lobe is highly redundant, and not as much lateralization of function is observed.  Damage to one side can be compensated.  The remarkable healing progress of Arizona Representative Gabrielle Giffords is testament to this ability to shift functions and heal.


As we move on from Frontal Lobe, we are increasingly encountering functions that require massive connections to other brain areas, and further invoking the subcortical areas.  It may come to your attention that of the 5 primary senses, we have only discussed three – sight, sound and touch – and have not mentioned smell and taste.  The observant reader may notice two small extensions below (ventral to) the Frontal Lobe labeled "Olfactory Bulb" or "Olfactory Cortex" and figure those are part of the Frontal Lobe. 

Actually, they are not.  Olfaction and gustation – smell and taste – are the "oldest" senses from an evolutionary and developmental perspective, and project directly to the most primitive portions of the brain – the pyriform cortex (present in amphibians, reptiles and mammals), and the entorhinal cortex, amygdala and hippocampus in the Temporal Lobe.  As such, the olfactory system is part of the "limbic system" to be discussed later this week.  The sense of taste is tightly tied to the senses of smell and touch (the physical "feel" of objects tasted), and shares connections to the limbic system, as well as direct projections to somatosensory cortex.


In the weeks to come, I will be traveling on vacation.  Blogs have been prepared and uploaded to continue The Guide throughout the next two weeks, but I may be unable to respond to comments until after my return.

Also, watch for a few "surprise" columns over the next two weeks as well.

See y'all on the flip-side!

Sunday, March 13, 2011

*IS* the hand quicker than the eye?

We have sort of backed into a description of the Frontal Lobe via the description of motor cortex.  The Frontal Lobe is largely responsible for the *output* functions of the brain: muscle movements, speech, and decision-making.  I will cover the latter, termed “executive function,” in the next blog, and concentrate on …


What is it Ratley?




Oh.  Excuse me, folks, but I have to go rescue Ratface.  He’s gotten his paws caught in a mousetrap again.  The lad has *no* paw-eye coordination.  Ratley?  Want to take over for a minute?

(squeak.  Sque-eek. Squick.  Squeek.)


Hah.  Well, Ratface has no paw-eye coordination, then again, he’s participated in a few too many of those head-injury experiments. 

See, there actually *is* a place in the brain that controls eye movements, assists in tracking moving objects and provides signals for coordinating movement with respect to those eye movements.  The figure that Dr. Tedd has so thoughtfully provided at the right shows the location of the “Frontal Eye Fields,” whose primary job is moving the eyes to track moving objects.  In the previous discussion of the visual system, Teddy mentioned that there are neurons in the thalamus and brainstem that detect moving objects.  However, in order to continue to follow a moving object, it is necessary to move the eyes to keep the object centered on the retina. Neurons in the Frontal Eye Fields are active in conjunction with the location, direction and speed of *eye* motion, and provide control of the eye muscles through Cranial Nerve III, the oculomotor nerve.  Incidentally, neurons in the surrounding areas are also associated with light sensitivity, pupil dilation and focusing the lens – the latter an ability that Teddy is severely lacking as he gets older, thus the need for bifocals.

You will note that the eye regions of the frontal lobe are quite close to the hand areas of motor cortex.  Yes, there *is* a logic to this, since a major element of muscle movement is what we LabRats like to call “goal acquisition.”  Like the cheese at the end of the maze, each muscle movement has a goal.  If muscle movement used *only* the position feedback from the joints and muscles, each movement would end with a lot of minor movements (tremors or oscillations).  Instead, visual feedback from your eyes provides information that, yes, your foot *is* on the stair, your hand *is* on the door handle, etc.  This is very important to *stopping* movement when the goal is reached.

Hand-eye coordination?  Yeah, all Frontal Lobe functions.  Feedback from motor cortex to cerebellum, to muscles – coupled with visual feedback – helps guide the hand to the appropriate place.  Rapid eye movement makes sure that all of the objects are in view.  The process is extremely fast, and simply requires practice to get used to it.


Oh, hi, Boss!

Thanks, Ratley.  Now that Ratface is out of the trap, the key question:  *Is* the hand quicker than the eye? 


Visual tracking is extremely fast, and the eye is extremely sensitive to movement.  In fact, we know that it requires over 16 image changes per second before we stop seeing the individual images and start seeing continuous video motion.  We can still detect discontinuities at up to 60 frames per second, which is one reason why the best high-definition and IMAX films are projected at over 70 frames per second.  In the famous “three-card monte” card trick, the hands are certainly not moving faster than the eye can track.  Rather, the “trick” is distraction and deception.  The con-man's speech, and certain hand passes that block the victim's view of the other hand, are the key elements of deception that allow the operator to relieve the “mark” of his money more often than not.
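Those frame-rate figures translate into on-screen time per frame like so (a minimal sketch; the specific rates chosen are illustrative):

```python
# On-screen duration of a single frame at various frame rates.
def frame_duration_ms(fps):
    return 1000.0 / fps

# At the ~16 fps fusion threshold each image lingers over 60 ms;
# by 70+ fps (high-end projection) a frame lasts under 15 ms.
for fps in (16, 24, 60, 72):
    print(f"{fps:2d} fps -> {frame_duration_ms(fps):5.1f} ms per frame")
```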

Hand-eye coordination is just another example of one of the associative functions of the brain, and as has been repeatedly mentioned in other blogs, the associative functions are often more important than the simple sensory inputs.  And of course, association is the key element of memory, which are the topics for next week.

On behalf of myself and the LabRats --- Ratface, NO!  (snap) – uh, excuse me, gotta go, see you next time!

Wednesday, March 9, 2011

…a little man who wasn't there…

The poem by William Hughes Mearns observes: "Last night I saw upon the stair/a little man who wasn't there/He wasn't there again today/Oh, how I wish he'd go away…"  While commonly read as a ghost story, the little man in the context of this blog is a homunculus.  The "little human," often grossly misshapen, is a common feature of functional descriptions of the brain.

We have seen over the past weeks how each lobe of the brain often has one or two key functions:  occipital contains vision, temporal contains memory and hearing, parietal combines senses into associations, and frontal serves executive function.  However, one of the most astounding features of the brain lies directly at the junction of the Parietal and Frontal lobes.  In keeping with the general scheme of Occipital/Temporal/Parietal as input structures and Frontal as output structures, the Parietal side of the Central Sulcus (figure right) is the somatosensory cortex and represents the tactile sensory inputs from the body – pretty much everything that isn't vision or hearing.  The Frontal side of the Central Sulcus contains the primary motor cortex – the region responsible for enabling the movement of limbs and joints.

The organization of this "Sensorimotor Complex" as it is sometimes called, is fascinating.  Our old friend, Dr. Wilder Penfield, created the idea of the cortical homunculus after determining that electrical stimulation in particular regions of motor cortex caused specific muscle movements, while stimulation of the corresponding somatosensory cortex was reported by the patients as a tactile sensation in the relevant part of the body.  Penfield's motor (right) and sensory (left) homunculi are diagrammed below.

The positions and proportions of the homunculi are distorted (a feature which was critical in the original use of the term) because of the density of nerve endings and sensory neurons.  Regions with very precise, fine sensation – face, lips, tongue, fingertips – have large representations, proportional to both the number of sensory neurons (left) and independent muscle control nerves (right).  Regions with low sensitivity (large "dermatomes," as will be discussed in a later blog), or with just a few large muscles, have small representations.  It should be noted that the ear, eye, nose and tongue representations are for the muscles and surrounding tissues, and not for vision, hearing, smell or taste.  As mentioned in yesterday's blog, the regions involved in speech and language map quite closely adjacent to tongue and pharynx motor regions.

This simplified structure is one reason why groups such as the Johns Hopkins University consortium, under the direction of the Defense Advanced Research Projects Agency (DARPA), have been successful in developing a complete arm, wrist and hand prosthetic limb with movement controlled strictly by recording the brain signals from the appropriate region of motor cortex.  Providing sensory/tactile feedback is taking a bit longer, so we are not quite at the level of "The Six Million Dollar Man," but science is progressing, and we are learning a lot about what it takes to map the outside world onto the brain.
Continuing to move forward from the Sensorimotor Complex, we will start to combine the multiple senses with motor control to allow for tracking of touch, sight and sound – and the decision-making of *when* to do so – in the Frontal Lobe.  Of course I would be remiss if I didn't remind readers that all of the signals – both motor and sensory – are relayed through the thalamus.  In fact, even the projections to other regions of the brain for association with other information are routed through cortico-thalamic and thalamocortical projections.  Other key contributors include the cerebellum, which receives and provides information relevant to coordinated, smooth muscle movement, and the basal ganglia, which maintain the signals required to keep the muscles ready to move on command.  These areas will be covered much more extensively in the sections on diseases and disorders yet to come in The Lab Rats' Guide to the Brain.

Thanks for tuning in, and welcome to any new readers who may have been referred from other blogs and via contacts made at Stellarcon!   This would be a good time to remember to take care of your brain – for now, it's the only one you've got!

Tuesday, March 8, 2011

In One Ear and Out the Other

Speech, reading and language

Back to The Guide after a bit of time off, thanks for understanding.  For a glimpse of what I've been up to, check out Sunday's blog and the new videos posted at

Following the discussions of the visual and auditory systems, I promised a discussion of speech and language.  In truth this should wait until *after* discussing the motor and sensory cortices, but it seems more appropriate here.  To fill in that extra information, refer to the figure at right, which labels several key sensory and motor areas of the brain.  At the back of the brain is the Occipital lobe, which exclusively contains the primary visual system.  Forward (anterior) and to the top is the Parietal lobe, which contains many of the association cortices that perform advanced sensory processing as well as combine and process sensory information from different areas.  To the bottom is the Temporal lobe, with the memory areas (inside) and auditory cortex.  At the anterior edge of the Parietal lobe is the somatosensory cortex - the sensory (touch and position) cortex for most of the body (soma).  Forward of the somatosensory cortex is the motor cortex in the Frontal Lobe.

Two primary areas have historically been associated with speech and language.  Damage to Wernicke's Area results in loss of language comprehension, while damage to Broca's Area results in loss of speech.  Both areas are found in the *dominant* hemisphere - the left side (controlling the right side of the body) in 90% of humans.  Interestingly, damage in either area can result in loss of reading comprehension, but we will get to that later.

We now know that these regions are not exclusive.  It is true that sudden injury frequently results in these losses, but slow-growing tumors and gradual degeneration can leave speech and language intact despite serious tissue damage, thanks to the brain's own "plasticity" - the ability to shift functions to adjacent areas or even the opposite hemisphere.  However, the location of these key areas is telling.  Wernicke's area, the blue region in the figure at left, lies firmly in the Parietal Lobe association areas, near the auditory cortex.  Broca's area (orange) lies in the motor association areas just anterior to the region that controls the muscles of the face and mouth.

While these areas were originally identified as a result of lesions, or damage to the respective brain area, their approximate locations have been confirmed by imaging that identifies active brain regions during listening, reading and speaking.  With these imaging studies, though, came the realization that speech and language were not *limited* to these regions, and we find a lot of involvement of the association cortices surrounding Broca's and Wernicke's areas.  It is also not entirely true that these functions are strictly lateralized to one half of the brain.  Imaging also shows *some* involvement of the corresponding areas in the nondominant hemisphere, especially if the person is not really paying attention to what they are hearing, or is hearing/speaking nonsense syllables.

*Reading* is an interesting specialization of language, and it would appear to make sense for it to be localized near the intersection of vision and hearing - but it is not restricted to those areas.  Reading also seems to invoke Broca's Area and the eye muscle control areas in the Frontal Lobe.  The distribution between the two areas is also curious - poor readers have more activity surrounding Broca's Area (perhaps explaining people whose lips move while reading), while good readers have more activity surrounding Wernicke's Area.  Current explanations indicate that good readers "hear" words as they are read, while poor readers have to sound them out.

Thus language really does involve the areas of the brain that process the mechanisms of language - hearing and speaking.  Even the written word harkens back to oral tradition in the involvement of Broca's and Wernicke's Areas.  Of course the ultimate brain activity is reading and reciting - as in poetry and author readings at a con (been there, done that, no thanks).  Such activity activates the visual cortex and association areas for detection of words on the page (or screen); hearing and Wernicke's Area; the somatosensory and motor cortex areas for face, mouth, tongue, pharynx and larynx; Broca's Area; the Frontal Lobe eye fields that control the eye muscles scanning across the text; the memory areas of the temporal lobe; and the executive function areas of the prefrontal (most anterior) region of the Frontal Lobe for decisions of what to read and when to speak.

In all, reading and reciting activates between 25% and 50% of the whole brain in one activity.

Take *that*, ten-percenters!

Sunday, March 6, 2011

Failure of the Imagination

Quick blog post since I am running behind...

Stellarcon 35 was this weekend in High Point, NC.  Yours truly was a science guest, and had a chance to talk about the good and bad with respect to the science that gets written into Science Fiction.

However, I am proudest of the panel I moderated, in which I (hopefully) spoke the least.  It was Science vs. Story, and featured Toni Weisskopf, Publisher of Baen Books; Gray Rinehart, engineer, writer and "Slushmaster General" of Baen Books; and Christiana Ellis, writer and chemical engineer.  We talked about the science in SF - how, for many people, the very appeal of SF is the way it incorporates *real* science into a story and fires the imagination.

Questions and discussion started with how and whether a scientific "infodump" enhances or detracts from a story - does it add credibility?  Or does it jar the reader loose from the story?  Toni Weisskopf and the panel felt that scientific detail which *advances* a story is a good thing, and harkens back to the works of Campbell, Asimov, de Camp, Heinlein and Clement - who never shied from including real science in their stories.   Audience members concurred, including those with no scientific background, who affirmed that reading about something in a good story is often a good way to increase knowledge.  Toni continued that SF really doesn't have *enough* of the hard science adventure that it once did.

Up next, we discussed the difference in writing mindset and style between scientific/technical writing and writing fiction.  The best fiction is written in an active voice that draws the reader in, while precise technical writing is passive, dry and often in past tense.  The two styles have their place, but rare is the writer who can do *both* at the same time.  Notably, I myself find that I can write science, or fiction, but cannot intermix the two - writing fiction requires that I stop all scientific writing and write only the fiction until complete, then resume the science.  [Here's hoping this blog will prove me wrong.]  Gray Rinehart concurred that there are different mindsets, but there *are* authors who can both entertain and inform with precision, and it is a hallmark of *good* writing that the reader can hardly tell the difference.

I next asked Christiana Ellis how she applies her engineering/scientific background in writing fantasy.  Her great answer - even magic has rules.  A good engineer knows that consistency is important, and a good writer knows that inconsistency in the details - whether science or magic - will lose a reader. Discussion continued with ways SF/F gets its science right and wrong, and whether the science even needs to be stated, as long as the author has it worked out and applies it consistently throughout the story.  For a reader, sometimes *not* knowing the science is more fun, as long as the reader knows the science is there and has clues to figure it out.  (Star Trek communicators, anyone?)

We included questions from the audience throughout the panel, but set aside enough time for people to ask specific questions.  I was most struck by the following: at the time of the Apollo lunar landing in 1969, an SF writer was asked what they would now write, since space travel had been accomplished.  The questioner asked whether we were limiting our writers with all of science's modern findings - after all, we now know there are no jungles on Venus or canals on Mars!   Amid much great discussion, it was noted that much of modern SF seems to have turned inward, but Toni Weisskopf feels that trend is turning around, and we will see more of the "hard" SF and faithful science which brought many of us to the fields of both science and writing.

I was struck by both the direction of conversation, and the parallel with the Apollo program.  I recalled the testimony of Astronaut Frank Borman to the U.S. Congress regarding the fire in the Apollo 1 capsule which took the lives of astronauts Grissom, White and Chaffee.  I closed the panel with Borman's words when asked what had caused the tragedy.  Borman simply answered:

"Failure of the Imagination."

Thursday, March 3, 2011


OK.  This is a cop-out.  No blog today.

"Whine, whine, whine, Would you like some cheese with that?" You say.

Well, let me put it this way.  Stellarcon starts tomorrow, and I need to prepare for a reading and panels.  To top it off, I need to complete a research progress report tonight, and I *really* can't do justice to Speech and Language without a little bit more preparation.

On the flip side, I will be moderating a panel on Science vs. the Story at Stellarcon, with panelists including Baen Books Publisher Toni Weisskopf and Baen Slushmaster General Gray Rinehart.  I'll be reporting on the fun stuff from that panel over the weekend.  The Guide will resume next week with Speech & Language - that is, unless the LabRats get loose and party too much.

... and thanks to the readers for putting up with me!

Wednesday, March 2, 2011

"Do you hear what I hear?"

The sense of hearing is unique among all the senses and motor pathways of the body in that it doesn't cross over to project primarily to brain areas on the opposite side. This does not mean it doesn't "decussate," or divide and send projections to the opposite side of the brain – in fact, these crossover projections are essential to localizing and tracking sound sources.

The diagram at right shows the stages in the auditory processing pathway from cochlea to auditory cortex. In computer/electronic terms, the process described in yesterday's blog essentially acts as an analog to digital converter. The analog sound frequency waveform is turned into a parallel digital signal by activating the hair cells at particular points along the basilar membrane, thus separating the sound waveform into the essentially "digital" detection of discrete frequencies. Neurons attached to the hair cells project to the cochlear nucleus, and from there to the superior olive and inferior colliculus of the brainstem. It is at this level that most of the crossover of signals occurs – but not to project the sensation to the opposite side of the brain! No, the decussation of neuron projections in the brainstem is important for comparison of different signals between the ears. Differences in sound intensity as well as differences in *phase* are used to determine whether a sound occurred on the left or right.
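To make the analog-to-digital analogy concrete, here is a minimal sketch (my own illustration, not part of the original post): a naive discrete Fourier transform reports how strongly each candidate frequency is present in a sampled waveform, much as hair cells at different points along the basilar membrane each report one frequency band. All names and numbers below are invented for the example.

```python
import math

def dft_magnitudes(samples, sample_rate, freqs):
    """Naive discrete Fourier transform: report the amplitude of each
    candidate frequency in the sampled waveform, analogous to hair cells
    reporting deflection at frequency-specific points on the membrane."""
    n = len(samples)
    mags = {}
    for f in freqs:
        re = sum(s * math.cos(2 * math.pi * f * i / sample_rate)
                 for i, s in enumerate(samples))
        im = sum(-s * math.sin(2 * math.pi * f * i / sample_rate)
                 for i, s in enumerate(samples))
        mags[f] = 2 * math.hypot(re, im) / n  # scaled so a pure tone reads as its amplitude
    return mags

rate = 8000  # samples per second
# One second of a "chord": 440 Hz (A4) at full volume plus 660 Hz (E5) at half volume.
signal = [math.sin(2 * math.pi * 440 * i / rate) + 0.5 * math.sin(2 * math.pi * 660 * i / rate)
          for i in range(rate)]
strengths = dft_magnitudes(signal, rate, [440, 550, 660])
# strengths[440] comes out near 1.0, strengths[660] near 0.5,
# and the absent 550 Hz tone near zero.
```

The ear does not literally compute a Fourier transform, of course - the mechanics of the membrane perform the separation physically - but the input/output behavior is broadly similar.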

Most people are familiar with the Doppler Effect for sound. A sound source coming toward you seems higher pitched because the sound waves are compressed by the motion of the source; as the source moves away from you, the sound waves get longer, and the pitch drops. The outer ear, or "pinna," is shaped to funnel sound – but it also introduces minor differences in *phase*. Phase is similar to the Doppler Effect, but is more an effect of the *distance* that sound travels to reach the ear. The best example of a phase difference is to have a person in the same room with you call you on the phone. The voice you hear through the phone is delayed compared to the voice you hear directly via the ear; the delay is due to the electrical relays and the *distance* that the signal must travel. A sound source to your right reaches the right ear sooner than it does the left ear. The pitch – therefore the waveform – is exactly the same, but the *phase* is different due to the additional travel time around the head to the opposite ear. Likewise, the volume of the sound is slightly louder in the right ear.
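To put rough numbers on that arrival-time difference, here is a hedged sketch (my own example, not a formula from the post): a simplified geometric model in which the extra path to the far ear is the head width times the sine of the source's angle off center. The head width and the simple geometry are both assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C
HEAD_WIDTH = 0.18       # m, an assumed ear-to-ear distance

def interaural_time_difference(azimuth_deg):
    """Simplified model: the extra path length to the far ear is about
    head_width * sin(azimuth), so the arrival-time difference is that
    distance divided by the speed of sound. 0 degrees = straight ahead,
    90 degrees = directly to one side."""
    path_difference = HEAD_WIDTH * math.sin(math.radians(azimuth_deg))
    return path_difference / SPEED_OF_SOUND

itd_front = interaural_time_difference(0)   # straight ahead: no delay
itd_side = interaural_time_difference(90)   # directly to the side: maximum delay
```

Even the maximum delay works out to only about half a millisecond, which gives a sense of how finely the brainstem comparison circuits must discriminate timing.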

The process of comparing the sounds received at each ear – what we call "binaural comparison" – is necessary to determine the direction from which a sound originates. However, we live in a 3-D world; left or right is not enough. Again, the shape of the pinna, with the auditory canal at the bottom and the folds at the back, further deflects the trajectory of sound waves, introducing differences that can be used to detect up-down and front-back. Of course, to further localize the source of a sound, we can always move the head. In that case, we need to include the additional information of which direction the head is pointing, requiring the thalamic relays and somatosensory feedback in the parietal lobe. To further aid our search, we can add visual information from V4 and the frontal eye fields and point our eyes at the suspected source of sound for positive identification. Full auditory localization requires a number of different pathways and integrates information from all over the brain, but the initial processing is done in the brainstem and doesn't even require conscious attention!

This pathway, however, does not get the auditory information all the way to the auditory cortex on the dorsal (top) surface of the temporal lobe (see figure, left). Projections from the cochlear nucleus travel through the brainstem to the major relay nucleus in the thalamus: the medial geniculate nucleus. In the MGN, the tonotopic organization is preserved for eventual projection to auditory cortex. However, much like the lateral geniculate is involved in the pre-processing of vision, the medial geniculate does much of the preprocessing of sound. MGN neurons sort by intensity, duration and more complex attributes of sound – what an audiophile might term "tone" or a sound technician would call the "envelope" of the sound. An important feature of this pre-cortical processing is a reflex that protects the auditory system from excessive sound levels. As sound intensity increases, muscles in the middle ear tighten the eardrum and the "ossicles" (bones), reducing the motion and thus the sensitivity of the ear to sound. Again, this reflex is a feature of the pre-processing of sound prior to the auditory cortex.

The final relay from thalamus to cortex presents information organized by pitch (frequency), intensity, location, and "tone," which forms a map not only of sound quality, but also of that sound's location in space, in a manner similar to, but not as detailed as, the visual map in area V1. Like the visual system, there is a lot of preprocessing of information in the brainstem and subcortical areas; and again like the visual system, damage to the primary cortical areas may leave the subject unaware that a stimulus was received, yet still able to react (with a "startle" response) to the presence of loud or sudden noises.

It's a fascinating process, and this description *still* does not address the more complex modes of auditory integration. The primary auditory cortex receives information about pitch and loudness, but the elements that comprise "music" – harmony, melody, rhythm – are properties of the secondary (association) auditory cortex. Further integration of sound into "music," "speech," and natural sounds occurs in the tertiary auditory cortex and association areas located in... the parietal lobe, adjacent to the visual association areas. The involvement of these areas in speech and language, as well as their integration with vision for tracking, reading, etc., will be explored in the next edition of this blog.

Tuesday, March 1, 2011

"... And Ears to Hear..."

The auditory system is interesting because it essentially maintains a "tonotopic" (i.e. organized by sound frequency) map all the way from the cochlea to the region of the Temporal Lobe that contains the auditory cortex. For that reason, this discussion will start with the neural receptors and move to the sensory portions of the brain, rather than working from the brain back to the ear.

Externally, ears are shaped to funnel sound into the auditory canal, and then to the tympanum (ear drum). The eardrum vibrates in accordance with the received sound much like a drum head or the diaphragm on a speaker. In fact, if you have ever listened carefully to a snare drum, you can hear the vibration of the drum head in response to sound by the way the drum head in turn vibrates and produces the characteristic rattling sound of the snare.

The Malleus, Incus and Stapes (hammer, anvil and stirrup) are small bones that transfer the vibratory movements of the tympanic membrane to the fluid-filled cochlea. Inside the cochlea is a flexible membrane, the basilar membrane, which flexes and moves with the frequency of vibration. The figure at right shows the arrangement of the tympanum (lower left) and cochlea (unrolled, right). The basilar membrane of the cochlea vibrates in a standing wave corresponding to the frequency of sound detected by the ear. Thus the system to this point simply acts as a “transducer,” reproducing the sound waves from the air into the fluid and membrane environment of the cochlea.

This is the point at which the “tonotopic map” starts. The next figure, at left, shows how different frequencies are detected at different points along the basilar membrane. As in the standing wave shown above, the basilar membrane reaches maximum deflection for specific sound frequencies at different points along its length. Thus, all that is needed to encode a particular frequency is to place detectors at the appropriate points along the basilar membrane. In the Mailbag a couple of weeks ago, we looked into *how* a receptor neuron could detect sound when the neuron is *smaller* than the wavelength of the sound wave to be detected. This is how it works.
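For readers who like formulas, the position-to-frequency mapping has a commonly cited empirical fit, the Greenwood function. The post doesn't give this equation; the sketch below uses the published human constants, which should be treated as approximate rather than authoritative.

```python
def greenwood_frequency(position_fraction):
    """Greenwood's empirical fit for the human cochlea: map a position
    along the basilar membrane (0 = apex, 1 = base) to the characteristic
    frequency detected there, in Hz. Constants are the commonly cited
    human values; treat the exact numbers as illustrative."""
    return 165.4 * (10 ** (2.1 * position_fraction) - 0.88)

low = greenwood_frequency(0.0)   # near the apex: roughly 20 Hz
mid = greenwood_frequency(0.5)   # mid-membrane: roughly 1.7 kHz
high = greenwood_frequency(1.0)  # near the base: roughly 20 kHz
```

The exponential form means each equal step along the membrane roughly *multiplies* the frequency, which fits nicely with the roughly logarithmic way we perceive pitch.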

Our final figure, from Gray’s Anatomy, shows the “Organ of Corti,” which runs along the basilar membrane and detects the amount by which the membrane is deflected. The neural receptors are the “hair cells,” with the base of each neuron attached to the basilar membrane and the top attached to the overhanging tectorial membrane. Movement of the basilar membrane compresses or stretches the hair cells, opening and closing ion channels and thus changing the neuron’s action potential firing rate – the greater the deflection, the greater the change in firing rate.
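The deflection-to-firing-rate relationship can be caricatured in a few lines. This is a toy model: the baseline, gain, and ceiling below are invented numbers, not measurements, and the sketch is only meant to illustrate rate coding with saturation, not real hair-cell physiology.

```python
def firing_rate(deflection, baseline=50.0, gain=400.0, max_rate=300.0):
    """Toy rate-coding model (illustrative numbers only): start from a
    spontaneous baseline firing rate in spikes/sec, shift it up or down
    in proportion to membrane deflection, and clip the result to the
    physiological range [0, max_rate]."""
    rate = baseline + gain * deflection
    return max(0.0, min(max_rate, rate))

# No deflection: the cell fires at its spontaneous baseline rate.
# Strong deflection one way saturates the rate; the other way silences the cell.
resting = firing_rate(0.0)
saturated = firing_rate(1.0)
silenced = firing_rate(-1.0)
```

The key idea the model captures is that the signal is carried by a *change* in an ongoing firing rate, so deflection in either direction is detectable.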

The cochlea is organized in a manner that detects different sound wave frequencies at positions along its length. Therefore, to keep those frequencies separate in the brain, it is only necessary to keep an accurate wiring diagram for the auditory nerve (Cranial Nerve VIII) from cochlea to auditory cortex. But keeping sound frequencies straight is not the only thing the auditory system does. Differences in tone, pitch, frequency *change* are all necessary for localization and creating a three-dimensional *sound* map of the environment – but more on that in the next blog!

Tune in next time, same blog, same internet!