Categories
News

Hamster study shows how our brains recognize other individuals

Professor Robert Johnston studies hamsters to better understand how mammalian brains recognize other individuals. Different areas of the brain react differently when recognizing others, depending on the emotions attached to the memory, a team of Cornell University research psychologists has found. The team, led by Johnston, a professor of psychology, has been conducting experiments to study individual recognition. Last year the team conducted the first experiment to demonstrate the neural basis of individual recognition in hamsters and to identify which areas of the brain play a role. The results were published in the Dec. 7, 2005, issue of the Journal of Neuroscience.

Better understanding these mechanisms, Johnston said, may be of central importance in treating certain forms of autism, Asperger syndrome, psychopathy and social anxiety disorders. “This ability to recognize individuals underlies social behavior in virtually all vertebrates and some invertebrates as well,” explained Johnston. “Humans clearly have an incredible ability to recognize, remember and store huge amounts of information about individuals — even individuals we have never actually met. This ability is the core of circuits that one might call the social brain.”

Johnston’s team uses hamsters to study recognition because their brains are strikingly similar to ours. “They are more sophisticated than you might think,” he noted. In the latest experiment, a male hamster encountered two individuals that he knew equally well but with whom he had had different interactions the previous day: a male that had defeated him in a fight and a male that he had never fought. The encounters mimicked those that occur in the wild. The hamster fled from the winning male but was attracted to the neutral male — suggesting that he both recognized the individuals and remembered his experiences with them.

An hour later, the researchers removed the hamster’s breath-mint-sized brain and injected it with antibodies and enzymes. The antibodies bind to specific proteins produced by recently active brain cells, and the enzymes convert chemicals in the cells into colored dyes, leaving behind a map of where the action was. This technique, called immunohistochemistry, is also used to diagnose cancerous cells in humans. Next the brain was frozen with dry ice, shaved into very thin slices using a miniature slow-moving guillotine and then studied under a microscope to determine where the dyes appeared.

“Functional MRIs provide similar information from human brains, but those images are relatively fuzzy and lack the spatial resolution necessary for small animals,” explained Johnston. “With immunohistochemistry, on the other hand, we can see each individual cell that was activated.” The researchers found activity in the brain’s anterior dorsal hippocampus and amygdala, among other areas. They then repeated the experiment with another hamster whose anterior dorsal hippocampus was numbed with lidocaine, a local anesthetic, and found that the animal did not avoid the individual who had defeated him. “It showed us that this region is necessary for recognition memory,” said Johnston. “The hippocampus has also been implicated in recognition memory in humans.”

Although hamsters recognize individuals by smell, whereas humans use largely sight and sound, Johnston said that the underlying mechanism is the same.

Science Daily
March 28, 2006

Original web page at Science Daily

Researchers to study effectiveness of stem cell transplant in human brain

Researchers at Doernbecher Children’s Hospital at Oregon Health & Science University will begin a Phase I clinical trial using stem cells to treat infants and children with a rare neurodegenerative disorder. The groundbreaking trial will test whether HuCNS-SC(TM), a proprietary human central nervous system stem cell product developed by StemCells, Inc., is safe, and whether it can slow the progression of two forms of neuronal ceroid lipofuscinosis (NCL), a devastating disease that is always fatal. NCL is part of a group of disorders often referred to as Batten disease.

“NCL is a heartbreaking and devastating diagnosis for children and their families,” said Robert D. Steiner, M.D., F.A.A.P., F.A.C.M.G., vice chairman of pediatric research, head of the Division of Metabolism and the study’s principal investigator at Doernbecher Children’s Hospital, OHSU. Steiner also is an associate professor of pediatrics, and molecular and medical genetics in the OHSU School of Medicine. “While the preclinical research in the laboratory and in animals is promising, it is important to note that this is a safety trial and, to our knowledge, purified neural stem cell transplantation has never been done before. It is our hope that stem cells will provide an important therapeutic advance for these children who have no other viable options.”

NCL is caused by mutations or changes in the genes responsible for teaching the body how to make certain enzymes. Without these enzymes or proteins, material builds up inside brain neurons and other brain cells, causing a rapidly progressive decline in mental and motor function, blindness, seizures and early death. This study addresses two forms of NCL: infantile neuronal ceroid lipofuscinosis (INCL) and late-infantile neuronal ceroid lipofuscinosis (LINCL). Tragically, children with INCL typically die before age 5 and those with LINCL typically do not live past age 12.

“If delivering stem cells directly into the human brain is safe and effective, it will, in my opinion, be a major step forward in the efforts of scientists and clinicians around the country to find new treatments with the potential to help tens of thousands of patients with degenerative brain diseases,” said co-investigator Nathan Selden, M.D., Ph.D., F.A.C.S., F.A.A.P. “I am proud that Doernbecher Children’s Hospital will be part of this effort.” Selden is Campagna Associate Professor of Pediatric Neurological Surgery and head of the Division of Pediatric Neurological Surgery, Doernbecher and OHSU School of Medicine.

Up to six children from Oregon or around the country will undergo HuCNS-SC transplantation at Doernbecher. Previous studies of mice missing one of the enzymes whose deficiency causes NCL have shown that HuCNS-SC increases the amount of the missing enzyme, reduces the amount of abnormal material in the brain and prevents the death of some brain cells. No major side effects have been reported in animals. StemCells, Inc. received clearance from the U.S. Food and Drug Administration to initiate a Phase I clinical trial of HuCNS-SC in October 2005. The company believes this will be the first trial using a purified composition of neural stem cells as a potential therapeutic agent in humans.

Science Daily
March 28, 2006

How new neurons find their way

Neurons born near the brain’s ventricles travel out to the olfactory bulb, where they function in olfaction. A steady stream of migrating neurons makes the journey not only in early development, but also during adulthood. Sawamoto et al. (Science p. 629, published online 12 January) now provide insight into how these neurons find their way in mice. The ventricles of the brain are lined with cells bearing cilia on their surface. The coordinated beating of these cilia creates a stream of fluid coursing through the ventricles that carries signaling factors guiding the traveling neurons. Mutations that disrupt the cilia also disrupt the establishment of the signaling gradient and the migration of the neurons to the olfactory bulb.

Science Magazine
February 28, 2006

Original web page at Science Magazine

Researchers discover a natural defence mechanism for Alzheimer’s disease

A team from the Faculty of Medicine at Université Laval and the research centre at CHUQ (Centre hospitalier universitaire de Québec) has discovered a natural defence mechanism that the body deploys to combat the nerve cell degeneration observed in persons with Alzheimer’s disease (AD). Investigators Alain R. Simard, Denis Soulet, Genevieve Gowing, Jean-Pierre Julien and Serge Rivest describe this major discovery in the February 16th issue of the scientific journal Neuron. Alzheimer’s disease is characterized by the accumulation of amyloid proteins in the brain. These proteins form plaques around which microglia, the central nervous system’s immune cells, aggregate. These microglia appear to be incapable of eliminating the plaques, which has led some researchers to postulate that microglial action produces an inflammation that causes neuronal death. The practice of prescribing anti-inflammatory drugs to Alzheimer’s patients stems from this concept of the disease.

For Serge Rivest and his team, whose research is funded by the Canadian Institutes of Health Research (CIHR), microglia are not part of the problem but part of the solution. These investigators have observed that, although the brain’s resident microglia do appear to be poorly equipped for combating amyloid plaques, an entirely different situation prevails for another type of microglia: those derived from bone marrow stem cells. In tests conducted with transgenic mouse models of AD, the investigators demonstrated that bone marrow-derived microglia infiltrate amyloid plaques and destroy them highly efficiently. These newly recruited immune cells are specifically attracted by the amyloid proteins that are the most toxic to nerve cells. “The discovery made by Dr. Rivest and his team is an important step towards a new therapeutic approach to Alzheimer’s disease,” states Dr. Rémi Quirion, Scientific Director of the Institute of Neurosciences, Mental Health and Addiction (INMHA). “It is the perfect example of the potential social benefits of investing in health research.”

According to Dr. Rivest, anti-inflammatory drugs should not be administered in cases of Alzheimer’s disease, as they interfere with this natural defence mechanism. On the contrary, he adds, a way must be found to stimulate the recruitment of a greater number of bone marrow-derived microglia. “Statistics show that 280,000 Canadians aged 65 and over have Alzheimer’s,” says Anne Martin-Matthews, Scientific Director of the CIHR Institute of Aging (IA). “This project gives hope to seniors, families and caregivers who are concerned by this disease. It illustrates the role health research can play in improving the health of Canadians.”

Dr. Serge Rivest’s team also used genetic engineering to produce microglia that can anchor themselves more firmly to plaques and that are equipped with enzymes better able to destroy them. “Stem cells should be harvested from the patients themselves, thus limiting the risks of both rejection and adverse effects,” says Dr. Rivest. “While this cellular therapy will not prevent Alzheimer’s, by curbing plaque development, we believe that it will help patients prolong their autonomy and cognitive capacity. We believe that this is a new and powerful weapon in the fight to conquer Alzheimer’s.”
Source: The Canadian Institutes of Health Research

Bio.com
February 28, 2006

Original web page at Bio.com

Researchers prove a single memory is processed in three separate parts of the brain

UCI researchers have found that a single brief memory is actually processed differently in separate areas of the brain – an idea that until now scientists have only suspected to be true. The finding will influence how researchers examine the brain and could have implications for the treatment of memory disorders caused by disease or injury. The results were published this week in the early online edition of the Proceedings of the National Academy of Sciences.

In a study using rats, researchers Emily L. Malin and James L. McGaugh of UCI’s Center for the Neurobiology of Learning and Memory demonstrate that while one part of the brain, the hippocampus, is involved in processing memory for context, the anterior cingulate cortex, a part of the cerebral cortex, is responsible for retaining memories involving unpleasant stimuli. A third area, the amygdala, located in the temporal lobe, consolidates memories more broadly and influences the storage of both contextual and unpleasant information.

“These results are highly intriguing,” said McGaugh, a member of the National Academy of Sciences who pioneered the study of drug and stress-hormone influences on memory. “It is the first time we have found this fragmentation in the brain of what we would think of as a single experience. For example, different aspects of an experience, such as a car accident, would be processed by different parts of the brain. The experience is fragmented in our brain, even though we think of it as one event.”

According to Thomas J. Carew, Donald Bren Professor and chair of UCI’s Department of Neurobiology and Behavior, understanding which parts of the brain process which types of memories gives scientists a better grasp on why particular types of memory impairment can occur and why, for example, different types of strokes might affect different memory systems. “This study is a terrific demonstration of how different components of our neural real estate can be allocated to process different aspects of memory,” said Carew. “The more we know about the specialization of memories, the more we can understand how and why the processing of memory can go awry, which in turn can critically inform clinical problems involving a wide range of cognitive deficits.” McGaugh’s previous work has shown the key role emotional arousal and the accompanying release of stress hormones play in creating lasting memories. The amygdala has been shown to be activated by the release of these hormones.

In the study, the rats were placed inside a box to familiarize themselves with that context. On the second day, they were confined to a dark compartment of the same box for only a few seconds and given a mild foot shock. The drug oxotremorine, which mimics the neurotransmitter acetylcholine in the brain and enhances memory retention, was injected into the hippocampus, the anterior cingulate cortex or the amygdala immediately after either the contextual training on day one or after the foot-shock training on day two. All the rats were then tested two days later to see how quickly they would return to the chamber where they had received the foot shock, an indication of how well they remembered the previous training.

Rats given oxotremorine in the hippocampus after just the contextual training stayed out of the foot-shock chamber longer, meaning that they remembered the past event. But the injections into the hippocampus after the foot-shock training had no effect on memory retention. This is consistent with evidence that the hippocampus is involved in contextual memory consolidation but not with consolidation of unpleasant information. Likewise, those rats given injections into the anterior cingulate cortex had enhanced memory when the drug was administered after the foot-shock training but not after the contextual experience. In contrast, the rats with injections in the amygdala showed better memory retention regardless of whether they had received the drug after the context training or the foot-shock training. The results support the hypothesis that the amygdala is involved in overall consolidation of memories of different kinds of experiences.

Science Daily
February 14, 2006

Use your brain, halve your risk of dementia

Research from UNSW provides the most convincing evidence to date that complex mental activity across people’s lives significantly reduces the risk of dementia. The researchers found that such activity almost halves the incidence of dementia. The paper, which has just been published in Psychological Medicine, is the first comprehensive review of the research in the field of ‘brain reserve’, which looks at the role of education, occupational complexity and mentally stimulating lifestyle pursuits in preventing cognitive decline. The paper integrates data from 29,000 individuals across 22 studies from around the world.

“Until now there have been mixed messages about the role of education, occupation, IQ and mentally stimulating leisure activities, in preventing cognitive decline. Now the results are much clearer,” said the lead author, Dr Michael Valenzuela, from the School of Psychiatry at UNSW. “It is a case of ‘use it or lose it’. If you increase your brain reserve over your lifetime, you seem to lessen the risk of Alzheimer’s and other neurodegenerative diseases.” The key conclusion is that individuals with high brain reserve have a 46 percent decreased risk of dementia, compared to those with low brain reserve. All the studies assessed agreed that mentally stimulating leisure activities, even in late life, are associated with a protective effect.

“This suggests that brain reserve is not a static property determined by early life experiences such as level of education, socio-economic deprivation or poor nutrition,” said Dr Valenzuela. “It is never too late to build brain reserve.” Dr Valenzuela’s previous research showed that after five weeks of memory-based mental exercise, participants’ brain chemistry markers changed in the opposite direction to that seen in Alzheimer’s disease. “The interesting point here is that this change was concentrated in the hippocampus, a part of the brain first affected in dementia,” said Dr Valenzuela. This is consistent with studies of brain reserve in mice, where some of the animals were ‘hothoused’ in stimulating environments. These mice showed changes in the microstructure of their brains compared with the controls. “We now need a clinical trial to improve our neurobiological understanding of the brain-reserve effect in humans,” said Dr Valenzuela. “Perhaps this could be based around mentally stimulating leisure activities that are fun.”

Science Daily
February 14, 2006

A bouquet of responses: Olfactory nerve cells expressing same receptor display a varied set of reactions

In a mouse model, University of Pennsylvania School of Medicine researchers discovered that olfactory sensory neurons expressing the same receptor responded to a specific odor with an array of speeds and sensitivities, a phenomenon previously not detected in the mammalian sense of smell. The group published their findings this week in the online edition of the Proceedings of the National Academy of Sciences.

Ma’s group measured responses from 53 different olfactory neurons that express the MOR23 odor receptor. As a group, the neurons reacted differently from one another in their response to lyral, an artificial odor used in fragrances and flavoring. After subjecting all cells to a short pulse (200-300 milliseconds) of lyral, the researchers measured the cells’ sensitivity to the odor. Some cells responded to very low concentrations of lyral; others, only to higher concentrations. Regarding the cells’ reaction time, some neurons finished firing within 500 milliseconds, but for others, the response time was up to five seconds.

Detection of odor molecules depends on about 1000 different odor receptors in the rodent nose. Different sets of receptors respond to different sets of odors. Until now, no one had been able to record electrical impulses from a specific subtype of olfactory sensory neuron expressing a known receptor. This is important, says Ma, because prior to this paper, when researchers worked with olfactory cells, there was no way to know which odor receptor a given cell expressed. “It could literally be one out of 1000,” she says.

All the sensory neurons expressing the same receptor converge on a common region called a glomerulus, a region in the brain’s olfactory bulb. In one bulb there are about 2000 glomeruli. (The brain has two olfactory bulbs.) Thousands of sensory neurons express the same receptor, and in the case of MOR23 they all converge on two glomeruli. The researchers used genetically engineered mice, generated by colleagues at Rockefeller University, that express MOR23 together with green fluorescent protein (GFP). The GFP allows the investigators to visualize the MOR23 cells separately from other neurons. They also recorded their measurements using cells still intact within the lining of the nose, allowing them to study these cells in their natural biochemical environment.

The researchers made their measurements from the endings of olfactory neuron dendrites. A single dendrite extends from the cell body of the olfactory neuron into the nasal cavity. The dendrite has a swelling at the end called the knob, where about 10 to 15 hair-like extensions called cilia contain the odor receptors. Ma and colleagues are now working out the implications of their findings. She says this study points to a more finely tuned response in the brain to odors than previously thought. “Olfactory neurons may be able to respond to an even wider range of odor concentrations than we realized,” she says. The heterogeneity in odor sensitivity and the wide response range of single cells provide new insight into why mammals, including humans, perceive odors with unchanged quality over a broad concentration range.

Science Daily
February 14, 2006

Master genetic switch found for chronic pain

In experiments with mice, researchers have found that eliminating what appears to be a master genetic switch for the development of pain-sensing neurons knocks out the animals’ response to “neuropathic pain.” Such pain is abnormal pain that outlasts the injury and is associated with nerve and/or central nervous system changes. The animals rendered deficient in the gene, called Runx1, also showed a lack of response to discomfort caused by heat, cold and inflammation. The researchers said that their findings, reported in the February 2, 2006, issue of Neuron, could have implications for the design of improved pain therapies.

In their experiments, Qiufu Ma and colleagues studied the Runx1 gene because past research had shown it to code for a “transcription factor,” a protein that acts as a master regulator of multiple genes. Runx1 is one of a group of proteins that are key players in transmitting external sensory information, such as pain and the perception of movement, to the spinal cord. In two other related papers in the same issue, Silvia Arber and colleagues and Tom Jessell and colleagues examine related aspects of the biology of the Runx transcription factors.

Runx1 was known to be expressed only in sensory nerve cells called “nociceptive” cells, involved in sensing pain. Such pain-sensing cells function by translating painful stimuli into nerve signals via specialized pores called “ion channels” in the neurons, as well as specialized receptors. The researchers’ studies of Runx1 in these cells revealed that during embryonic development, the gene is characteristically expressed in pain-receptor cells involved in neuropathic pain. When they knocked out the gene, they found that the normal development of such specialized nerve cells was impaired. The animals had lost ion channels known to be involved in reacting to painful heat or cold, as well as those involved in pain due to damaged tissue. The researchers also found that the Runx1-deficient animals showed deficient wiring of certain types of pain neurons. In key experiments, the researchers measured the Runx1-deficient animals’ response to four types of pain: thermal, mechanical, inflammatory, and neuropathic.

The researchers produced a pain response by subjecting the animals’ hindpaw to either the cold of acetone or an uncomfortably warm plate (thermal); the uncomfortable prick of a filament (mechanical); an injection of an inflammation-inducing chemical (inflammatory); or nerve damage (neuropathic). They quantified the animals’ response by measuring how long the animals lifted or licked their affected paw in response to the treatments. Ma and his colleagues found that, while the deficient animals showed normal response to mechanical pain, they showed significantly lowered thermal, neuropathic, and inflammatory pain response.

The researchers concluded that while the diverse specialized components of the pain-sensing machinery could be established in a piecemeal fashion, “our data, however, provide strong evidence that Runx1 is required to specify the receptive properties of a large cohort of nociceptive sensory neurons.” They also concluded that the dual functions they discovered for Runx1, controlling the specification of sensory neurons and regulating how they target their wiring, “form a genetic basis for the assembly of specific neural circuits for nociceptive information processing.” “Finally, the identification of a core transcriptional control program for many of the ion channels and receptors known to transduce noxious stimuli has intriguing implications for the design of more effective pain therapies,” they wrote.

Science Daily
February 14, 2006

Long-term memory controlled by molecular pathway at synapses

Harvard University biologists Sam Kunes, professor of molecular and cellular biology in Harvard’s Faculty of Arts and Sciences, and postdoctoral fellow Sovon Ashraf have identified a molecular pathway active in neurons that interacts with RNA to regulate the formation of long-term memory in fruit flies. The same pathway is also found at mammalian synapses, and could eventually present a target for new therapeutics to treat human memory loss. The findings will be presented this week on the web site of the journal Cell.

Even for a fruit fly, learning and memory are important adaptive tools that facilitate survival in the environment. A fly can learn to avoid what may do it harm, such as a flyswatter, or in the laboratory, an electric shock that happens when it smells a certain odor. “It has been known for some time that learning and long-term memory require synthesis of new proteins, but exactly how protein synthesis activity relates to memory creation and storage has not been clear,” says Kunes. “We have been able to monitor, for the first time, the synthesis of protein at the synapses between neurons as an animal learns, and we found a biochemical pathway that determines if and where this protein synthesis happens. This pathway, called RISC, interacts with RNA at synapses to facilitate the protein synthesis associated with forming a stable memory. In fruit flies, at least, this process makes the difference between remembering something for an hour and remembering it for a day or more.”

By manipulating the RISC pathway, Kunes and colleagues were able to alter flies’ memory, changing their response to stimuli in subsequent behavioral tests. Using a classical learning test that simultaneously exposes the insects to an odor and an electric shock, the researchers found that long-term memory could be greatly increased by adjusting the activity of the RISC pathway in the fruit flies. “In essence, these flies had twice the memory of their normal counterparts,” Kunes says. “When RISC was knocked out, so was long-term memory, and flies would remember to alter their behavior in the presence of the shock-linked odor for perhaps an hour; that is, they only had short-term memory. When the pathway was normally active, the flies remained averse to the odor for a day or more.” Kunes says the various proteins that comprise the RISC pathway are also found at synapses in mice and humans, suggesting the pathway has been conserved by evolution and that it could be a target for new medications to boost human memory.

Science Daily
January 31, 2006

Toward a better understanding of human prion diseases

Misfolding of a single protein, the cellular prion protein (PrPc), into the disease-associated form PrPSc is believed to cause fatal prion diseases in humans and other mammals, including sheep, cattle, and deer. The misfolding can occur sporadically or after contact—through inoculation or ingestion—with PrPSc from an external source. There are also familial forms of prion diseases that are associated with certain mutations in PRNP, the gene encoding PrPc. These abnormal PrPc proteins are thought to have a higher probability of misfolding than normal PrPc. The idea is that a few misfolded molecules can initiate a chain reaction and cause transformation of many of the other PrPc molecules into harmful PrPSc versions. The presence of PrPSc proteins causes widespread cell death, leading to the characteristic spongiform degeneration of the brain that kills patients, most of them within a matter of months.

The most common human prion disease is sporadic Creutzfeldt–Jakob disease (sCJD). The disease is rare (affecting roughly one to two individuals per million people worldwide), and its etiology is unclear; neither exogenous nor endogenous causes have been identified. sCJD is inevitably fatal, but the disease is clinically, pathologically, and genetically heterogeneous. Most patients have rapidly progressing dementia, often accompanied by involuntary muscle spasms, and death occurs within months of the first clinical symptoms. However, for some patients ataxia is the first clinical sign, while others develop sight problems, and for some the disease duration can be more than two years. In the hope that understanding this heterogeneity will help them understand what causes sCJD, researchers are trying to systematically collect and catalog data from patients. To do this, standardized assays are needed that allow results from different patients and different laboratories to be compared in a meaningful way.

Markus Glatzel and colleagues have developed such an assay and applied it, as part of a detailed molecular characterization, to autopsy samples from 50 patients with sCJD. The new assay, which the researchers call PrPSc profiling, measures the amount of PrPSc in defined brain regions. In the past, PrPSc amounts were routinely measured in only one or two regions, by a variety of assays. PrPSc profiling quantifies the amount of PrPSc in nine defined brain regions relative to internal standards, and thereby allows direct comparison of individual profiles. The researchers determined the PrPSc profiles of 50 patients and tried to correlate the profiles with information on the patients’ disease types and the prion types present in the different brain areas. sCJD types are determined by a patient’s genotype at the polymorphic position 129 of PRNP and by the relative resistance of PrPSc to proteolytic degradation. It is thought that most patients have only one prion type, but previous reports have described the coexistence of two different types in some samples.
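The normalization idea behind such a profile can be sketched in code. This is a hypothetical illustration, not the study’s method: the region names, signal values, and distance measure below are all assumptions; only the idea of expressing nine region-wise PrPSc levels relative to an internal standard, so that profiles from different patients and laboratories become directly comparable, comes from the description above.

```python
# Hypothetical sketch of "PrPSc profiling": quantify PrPSc in nine brain
# regions relative to an internal standard, so normalized profiles from
# different patients can be compared directly. Names and values are invented.

REGIONS = [
    "frontal_cortex", "parietal_cortex", "temporal_cortex",
    "occipital_cortex", "hippocampus", "basal_ganglia",
    "thalamus", "cerebellum", "brainstem",
]

def prpsc_profile(raw_signal, standard):
    """Normalize per-region PrPSc signal to the internal standard."""
    return {region: raw_signal[region] / standard for region in REGIONS}

def profile_distance(p1, p2):
    """Mean absolute difference between two normalized profiles."""
    return sum(abs(p1[r] - p2[r]) for r in REGIONS) / len(REGIONS)

# Two hypothetical patients, each measured against the same internal standard:
patient_a = prpsc_profile(
    dict(zip(REGIONS, [3.1, 2.8, 3.0, 1.2, 4.5, 2.2, 2.0, 0.8, 0.5])),
    standard=2.0,
)
patient_b = prpsc_profile(
    dict(zip(REGIONS, [1.0, 1.1, 0.9, 0.8, 4.8, 2.1, 1.9, 0.7, 0.4])),
    standard=2.0,
)

print(profile_distance(patient_a, patient_b))
```

Because every level is expressed relative to the same standard, the distance between two profiles reflects differences in regional PrPSc distribution rather than differences in assay calibration between laboratories.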

Analysis of this wealth of data revealed correlations between distinct PrPSc distribution patterns and sCJD subtypes. These results have implications for confirmation of sCJD by brain biopsy. Before doing such biopsies, Glatzel and colleagues suggest, the sCJD subtype should be determined so that the correct brain area is examined. The researchers also found coexistence of two different prion types in 20% of their overall samples, and in more than 50% of the samples from patients who were heterozygous for the 129 polymorphism in the PRNP gene. These data lend further support to a link between molecular signature and clinical heterogeneity of the disease.

While many questions remain, this study underlines that the systematic analysis of prion cases can reveal links between molecular pathology, genetic makeup of patients, and disease symptoms. Glatzel and colleagues believe that “PrPSc profiling will be a valuable tool for prion research.” In the hope that it will “facilitate comparisons of PrPSc quantities present in defined samples,” the researchers will make their PrPSc standard available to the scientific community.

Science Daily
January 31, 2006

Original web page at Science Daily

Categories
News

MicroRNA may modulate memory

There’s a new potential role for microRNAs, according to a report in this week’s Nature: modulating dendritic protein building in the hippocampus, a process known to be involved in learning and memory. The findings suggest that microRNAs may help regulate synaptic strength at individual synapses, the authors say. This study is one of the first to examine microRNAs in mature neurons, said Kenneth Kosik of the Neuroscience Research Institute at the University of California, Santa Barbara, who did not participate in the research. Almost all data on microRNAs in the nervous system come from development, Kosik added.

MicroRNAs are small, non-coding RNA molecules that exert control over many developmental pathways in animals and plants by preventing the translation of messenger RNA into protein. Previous work has shown that mRNAs can be translated locally at ribosomes found in dendrites, first author Gerhard Schratt of Harvard Medical School and Children’s Hospital Boston told The Scientist. Since microRNAs can suppress mRNA translation by binding to complementary sequences, researchers have suggested that they may inhibit translation at synapses, Schratt said, but no data have confirmed this hypothesis.

Schratt and his colleagues found that overexpressing a brain-specific microRNA called miR-134 in a culture of mature rat hippocampal neurons caused a significant decrease in the size of dendritic spines, the neuron’s major sites of excitatory synaptic contact. Expression of an antisense inhibitor of miR-134, on the other hand, caused an increase in dendritic spine size. To see how miR-134 might prevent the growth of these dendritic spines, the researchers searched for possible target mRNAs of miR-134. Among 48 genes suspected to be involved in dendritic protein synthesis, they found three that contained sequences partially complementary to miR-134. They chose to focus on one called Lim-domain-containing protein kinase 1 (Limk1), whose protein product is involved in building dendritic spines. Limk1 knockout mice show abnormalities in dendritic spine structure that are “almost a phenocopy of miR-134 overexpression,” Schratt said.

The authors found several pieces of evidence suggesting that miR-134 limits dendritic spine growth by binding Limk1 mRNA: miR-134 and Limk1 interact in vitro and co-localize within dendrites, and miR-134 overexpression in neurons reduces translation of Limk1. These interactions disappear if Limk1 is engineered to lack the sequence complementary to miR-134. To see if miR-134 and Limk1 interact locally at dendrites, Schratt and colleagues imaged fluorescently tagged Limk1. They found that, in dendrites, wild-type Limk1 is expressed at levels 18 to 28% lower than those of a mutant Limk1 that cannot interact with miR-134. Since miR-134 does not block translation of Limk1 completely, however, other microRNAs or proteins must also help repress Limk1, Schratt said.

When the researchers replaced wild-type Limk1 with the mutant form, the reduction in spine size caused by miR-134 was rescued, demonstrating that miR-134 regulates dendritic spine size through wild-type Limk1, the researchers note. Finally, they found that treatment with brain-derived neurotrophic factor (BDNF) can release some of miR-134’s repression of Limk1, suggesting that normal neuronal activity could contribute to Limk1 translation, and thus spine synthesis and memory formation, Schratt said. The study is “exciting but it leaves a lot of questions open,” said Eric Miska, of the Gurdon Institute at the University of Cambridge, who was not a co-author of the study. Overexpression studies using short, non-coding RNA can suffer from off-target effects, Miska said, so it’s hard to pin down exactly how a microRNA is interacting with a specific mRNA. “There may be up to a thousand targets for a given microRNA,” he said. “We don’t know which are the real targets of this microRNA in vivo.”

Conversely, Limk1 mRNA has binding sites for several microRNAs besides miR-134, said James Eberwine of the University of Pennsylvania Medical Center, also not a co-author. “Various cellular functions could be attributed to the other microRNAs.” Also, the paper doesn’t prove that miR-134 is acting in dendritic spines, Miska told The Scientist. “They show localization, but there’s no functional data that suggests it has to be in the spine,” he said. “It would really be nice to know whether it’s translation in the dendrite or translation in the cell soma — or both — that’s being modulated by the microRNA,” Eberwine agreed. Either way, miR-134 interactions with Limk1 could be affecting dendrite outgrowth, he said, but follow-up experiments could isolate dendrites from the rest of the neuron to prove that miR-134 suppresses translation of Limk1 specifically in dendrites. “I think it’s going to be very difficult to prove what a microRNA does in vivo,” Miska said. Although still unavailable, mouse microRNA knockouts will likely be “the gold standard to prove the function of a particular microRNA” in mammals, he said.

The Scientist
January 31, 2006

Original web page at The Scientist

Categories
News

Imaging study links key genetic risk for Alzheimer’s disease to myelin breakdown

A new UCLA imaging study shows that age-related breakdown of myelin, the fatty insulation coating the brain’s internal wiring, correlates strongly with the presence of a key genetic risk factor for Alzheimer disease. The findings are detailed in the January edition of the peer-reviewed journal Archives of General Psychiatry and add to a growing body of evidence that myelin breakdown is a key contributor to the onset of Alzheimer disease later in life. In addition, the study demonstrates how genetic testing coupled with non-invasive evaluation of myelin breakdown through magnetic resonance imaging (MRI) may prove useful in assessing treatments for preventing the disease.

“Myelination, a process uniquely built up in humans, arguably is the most important and most vulnerable process of brain development as we mature and age. These new findings offer, for the first time, compelling genetic evidence that myelin breakdown underlies both the advanced age and the principal genetic risks for Alzheimer disease,” said Dr. George Bartzokis, professor of neurology at UCLA’s David Geffen School of Medicine. “The human brain functions as a high-speed Internet system,” said Bartzokis, director of the UCLA Memory Disorders and Alzheimer Disease Clinic and Clinical Core director of the UCLA Alzheimer Disease Research Center. “The quality of the brain’s connections is key to its speed, bandwidth, fidelity and overall on-line capability.”

Myelin is a sheet of lipid, or fat, with very high cholesterol content — the highest of any brain tissue. The high cholesterol content allows myelin to wrap tightly around axons, speeding messages through the brain by insulating these neural “wire” connections. As the brain continues to develop in adulthood and as myelin is produced in greater and greater quantities, cholesterol levels in the brain increase and eventually promote the production of a toxic protein that attacks the brain. The protein attacks myelin, disrupts message transfer through the axons and eventually can lead to the brain/mind-destroying plaques and tangles visible years later in the cortex of Alzheimer patients.

The Apolipoprotein E (ApoE) genotype is the second most influential Alzheimer risk factor, after only advanced age. The study used MRI to assess myelin breakdown in 104 healthy individuals between ages 55 and 75 and determine whether the shift in the age at onset of Alzheimer disease caused by the ApoE genotype is associated with age-related myelin breakdown. The results show that in later-myelinating regions of the brain, the severity and rate of myelin breakdown in healthy older individuals is associated with ApoE status. Thus both age, the most important risk factor for Alzheimer disease, and ApoE status, the second-most important risk factor, seem to act through the process of myelin breakdown.

Source: University of California – Los Angeles

Bio.com
January 17, 2006

Original web page at Bio.com

Categories
News

The left brain may view the world through the prism of language

The language-loving left hemisphere of the brain can spot different colours faster than it can identify different shades of the same colour. Our perception of colours can depend on whether we view them from the left or the right, scientists have found. They say this demonstrates how language can alter the way we see the world. The idea that language can affect cognition is not new. In the 1930s, the American linguist Benjamin Lee Whorf proposed the controversial hypothesis that the structure of language affects the way people think. Later studies have hinted that this may be true in some circumstances. But whether language affects our perception of the world has remained an open question.

Richard Ivry of the University of California, Berkeley, and colleagues suspected that separating out the effects of visual input to the right and left brain hemispheres might yield some clues. Language is processed mainly in the left hemisphere of the brain, which also deals with signals from the left side of the retinas in both our eyes. Because light from objects to our right falls mainly onto the left-hand area of our retinas, the researchers hypothesized that colours to the right would feel the influence of language more keenly. Conversely, objects on our left side activate the right hemisphere of the brain, so the effect of language would be minimal. In the team’s tests, subjects spotted a blue square among green squares faster than a square in a different shade of green, but only when the odd square appeared on the right. The researchers say this is because the colour blue has a distinct name, and so the language-loving left hemisphere could perceive that colour difference faster than it could a mere difference in shade.

“You get this language-based enhancement of differences or similarities,” says Ivry. “But you only get that in one half of the brain.” Ivry and his colleagues went on to test their theory by asking the subjects to memorize a series of words during the visual tests. With their left brain’s language centre otherwise occupied, there would be less opportunity for it to influence visual perception, and so, as expected, the subjects picked out blue or green squares on the right-hand side of the picture in the same amount of time.
Ivry’s team is now investigating whether the same effect is seen with everyday objects, such as cats or cars, rather than colours. Early results suggest that we do indeed see these everyday objects differently depending on their position and our vocabulary for them. Ivry adds that it would be interesting to examine paintings to find out whether artists use colours differently on the left and right sides of the canvas.

Nature
January 17, 2006

Original web page at Nature

Categories
News

Neuron growth in adult brain

Despite the prevailing belief that adult brain cells don’t grow, a researcher at MIT’s Picower Institute for Learning and Memory reports in the Dec. 27 issue of Public Library of Science (PLoS) Biology that structural remodeling of neurons does in fact occur in mature brains. In one imaged inhibitory neuron, a dendritic branch tip grew out of the microscope’s imaging area in just two weeks. This finding suggests that it may one day be possible to grow new cells to replace ones damaged by disease or spinal cord injury, such as the one that paralyzed the late actor Christopher Reeve. “Knowing that neurons are able to grow in the adult brain gives us a chance to enhance the process and explore under what conditions — genetic, sensory or other — we can make that happen,” said study co-author Elly Nedivi, the Fred and Carole Middleton Assistant Professor of Neurobiology.

While scientists have focused mostly on trying to regenerate the long axons damaged in spinal cord injuries, the new finding suggests targeting a different part of the cell: the dendrite. A dendrite, from the Greek word for tree, is a branched projection of a nerve cell that conducts electrical stimulation to the cell body. “We do see relatively large-scale growth” in the dendrites, Nedivi said. “Maybe we would get some level of improvement (in spinal cord patients) by embracing dendritic growth.” The growth is affected by use, meaning the more the neurons are used, the more likely they are to grow, she said.

The study’s co-authors — Nedivi; Peter T. So, an MIT professor of mechanical and biological engineering; Wei-Chung Allen Lee, an MIT brain and cognitive sciences graduate student; and Hayden Huang, a mechanical engineering research affiliate — used a method called two-photon imaging to track specific neurons over several weeks in the surface layers of the visual cortex in living mice.
While many studies have focused on the pyramidal neurons that promote firing, this work looked at all types of neurons, including interneurons, which inhibit the activity of cortical neurons. With the help of technology similar to magnetic resonance imaging (MRI), but at a much finer, cellular resolution, the researchers were able to stitch together two-dimensional slices to create the first 3-D reconstruction of entire neurons in the adult cortex. Dendritic branch tips were measured over weeks to evaluate physical changes.

In 3-D time-lapse images, the brain cells look like plants sprouting together. Some push out tentative tendrils that grow around or retract from contact with neighboring cells. Dendrite tips that look like the thinnest twigs grow longer. Of several dozen branch tips, sometimes only a handful changed; in all, 14 percent showed structural modifications. Sometimes no change for weeks was followed by a growth spurt. There were incremental changes, some as small as seven microns, the largest a dramatic 90 microns. “The scale of change is much smaller than what goes on during the critical period of development, but the fact that it goes on at all is earth-shattering,” Nedivi said. She believes the results will force a change in the way researchers think about how the adult brain is hard-wired.

Nedivi had previously identified 360 genes regulated by activity in the adult brain that she termed candidate plasticity genes or CPGs. Her group found that a surprisingly large number of CPGs encode proteins in charge of structural change. Why are so many of these genes “turned on” in the adult well after the early developmental period of dramatic structural change? The neuroscience community has long thought that whatever limited plasticity existed in the adult brain did not involve any structural remodeling, mostly because no such remodeling was ever detected in excitatory cells. Yet evidence points to the fact that adult brains can be functionally plastic. In response to the CPG data, Nedivi and Lee revisited this question with the help of So and Huang.

By applying an innovative new imaging technology that allows monitoring of neuronal structural dynamics in the living brain, they found evidence for adult neuronal restructuring in the less-known, less-accessible inhibitory interneurons. “Maybe the inhibitory network is where the capacity is for large-scale changes,” Nedivi said. “What’s more, this growth is tied to use, so even as adults, the more we use our minds, the more robust they can be.”

Science Daily
January 17, 2006

Original web page at Science Daily

Categories
News

Scientists discover mechanism tying obesity to Alzheimer’s disease

If heart disease and diabetes aren’t bad enough, now comes another reason to watch your weight. According to a study just released, packing on too many pounds can increase the risk of developing Alzheimer’s disease. A team led by researchers at the Farber Institute for Neurosciences at Thomas Jefferson University in Philadelphia and Edith Cowan University in Joondalup, Western Australia has shown that being extremely overweight or obese increases the likelihood of developing Alzheimer’s. They found a strong correlation between body mass index and high levels of beta-amyloid, the sticky protein substance that builds up in the Alzheimer’s brain and is thought to play a major role in destroying nerve cells and in cognitive and behavioral problems associated with the disease.

“We looked at the levels of beta-amyloid and found a relationship between obesity and circulating amyloid,” says Sam E. Gandy, M.D., Ph.D., director of the Farber Institute for Neurosciences. “That’s almost certainly why the risk for Alzheimer’s is increased,” says Dr. Gandy, who is also professor of neurology, and biochemistry and molecular biology at Jefferson Medical College of Thomas Jefferson University. “Heightened levels of amyloid in the blood vessels and the brain indicate the start of the Alzheimer’s process.” The scientists reported their findings this month in the Journal of Alzheimer’s Disease. According to Dr. Gandy, evidence has emerged over the last five years that many of the conditions that raise the risk for heart disease, such as obesity, uncontrolled diabetes, hypertension and hypercholesterolemia, also increase the risk for Alzheimer’s. Yet exactly how such factors made an individual more likely to develop Alzheimer’s remained a mystery.

Dr. Gandy, Ralph Martins, Ph.D., of Edith Cowan University and their colleagues measured body mass index and beta-amyloid levels in the blood. They also looked at several other factors associated with heart disease and diabetes, such as the inflammatory marker C-reactive protein, insulin, and high density lipoprotein in 18 healthy adults who were either extremely overweight or obese. They found a “statistically significant correlation” between body mass index and beta-amyloid.

“Ours is one of the first attempts to try to find out on both the pathological and the molecular levels how obesity was increasing the risk of Alzheimer’s,” says Dr. Gandy, who serves as chairman of the Alzheimer’s Association’s Medical and Scientific Advisory Council. One implication of these findings could be that by losing excess weight and maintaining normal body weight, an individual might reduce the risk of developing Alzheimer’s. However, this has not been proven, notes Dr. Gandy.

“What’s especially interesting about this is that several studies are showing that even medical conditions in midlife may predispose to Alzheimer’s later on,” he says. “The baby boomers today should pay attention to this. Their medical risk factors today will play a role 30 years later. Think about weight, cholesterol, blood pressure, which could affect you long-term. In terms of Alzheimer’s, another protective factor is maintaining an active mental lifestyle.” The next step is to follow such patients over the long term to see how many do indeed develop Alzheimer’s. “We need to first develop a medicine that is effective in humans in lowering amyloid accumulation or generation,” says Dr. Gandy. “We have those now in mice and we are testing them in humans. If we can develop such a medicine, then the question will be, if we can lower amyloid, will that in fact prevent Alzheimer’s?”

Science Daily Health & Medicine
January 17, 2006

Original web page at Science Health & Medicine

Categories
News

Violence and the brain: An urgent need for research

While the social sciences have devoted much attention to the origin and prevention of violence, relatively little biomedical study has been conducted. Human behavior is determined by a combination of genetic and environmental influences governing brain structure and function. Violence, therefore, ultimately derives from the operations of the brain, and recognizing the importance of neurobiology will inform and invigorate study of this urgent problem. A working group under the auspices of the Aspen Neurobehavioral Conference, a meeting devoted to the exploration of the impact of brain function on human behavior, developed a consensus statement on the neurobehavioral aspects of violence. As a result of two sessions in 1998 and 1999, we and our colleagues prepared a critical summary on what is known and what remains to be learned about the contribution of brain dysfunction to violent behavior.

Violence is ubiquitous in society: just consider action movies, video games, graphic news reports, or professional sports such as boxing. We need to study the implications of these public displays, not to mention warfare. Attention to violence among young people is particularly critical, because effective prevention can yield extended benefits. Even in the absence of brain dysfunction, exposure to violent events may exert profound behavioral consequences, presumably through effects on the developing brain. Thus not only can violence result from acquired brain lesions, but adverse childhood experiences may engender antisocial behavior. In adults, the role of brain damage in violence remains unclear. A brain lesion by itself is rarely sufficient to cause violent behavior; most individuals with brain damage do not commit criminal acts. However, we cannot assume that the brains of violent individuals are invariably normal. The neurologic status of violent persons thought to be normal has not been adequately assessed by detailed neurological examination, neuropsychological testing, magnetic resonance imaging, or functional neuroimaging techniques. Studies of murderers have suggested a high prevalence of neurologic dysfunction, and some individuals with traumatic brain injury, epilepsy, dementia, and sleep disorders have been observed to exhibit excessive violence. Violence is more likely among those with severe mental illness, particularly psychosis, and is exacerbated by the use of alcohol and other psychoactive substances.

The cause of violence is multifactorial, and a direct correlation between brain dysfunction and a violent act is rarely possible. Most published studies are retrospective and anecdotal, with small sample sizes and often inconsistent results. Identification of brain lesions is imperfect given the limitations of diagnostic classifications, the neurologic examination, neuroimaging technologies, neuropsychological assessment, and neurochemical analysis. Some study samples, such as prisoners or patients with severe neurologic or psychiatric disease, are necessarily limited to violent persons who have been apprehended or hospitalized; conclusions are therefore based only on those whose records are analyzed, and the potential for violence in the general population remains unknown. Despite these obstacles, several particularly promising research areas emerge. First, there is the possibility of a neurogenetic contribution to violent behavior. Whereas no single gene for human violence has been discovered, data from molecular genetics suggest that multiple genes may interact to predispose individuals to this behavior. Observations in mouse knockout models have suggested that targeted disruptions of single genes can induce aggressiveness in males and diminish nurturing in females. Aggression in animals and humans is also likely related to genes regulating central serotonin metabolism.

Males are much more likely to commit violent crimes than are females, but genetic factors may not explain this discrepancy. Socioeconomic and cultural influences play a major role; unemployment, lower educational level, alcohol use, and access to firearms all contribute to violent crime among males. The XYY chromosomal disorder also serves to highlight difficulties in establishing an influence of gender on violence. A sensitive issue in research on violence is the impact of race. In the United States, African Americans are more often the victims of violent crime than are whites, and homicide rates among African Americans are higher. Available data, however, do not permit a conclusion about the role of genetics. Criteria for inclusion in a racial group are not clear, and many individuals have mixed racial background that confounds analysis. Another important factor is the disproportionate arrest rate of African Americans by law enforcement officials. Finally, substantial evidence supports the negative impact of socioeconomic and cultural factors on African Americans and other minorities. Racial differences in the rate of violent crime diminish when socioeconomic status is considered.

Whereas gender and racial differences cannot be used to explain a possible genetic basis of violence, data from behavioral genetic investigations have been provocative. Three main approaches have been employed: twin studies, studies of identical twins reared apart, and adoption studies. These studies typically use statistics on criminality, which can be considered a surrogate marker for violence. Genetic influences have been suggested to account for approximately half the variance in violence as assessed by criminal behavior.
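How twin data yield such variance estimates can be shown with a back-of-the-envelope calculation. Falconer's classic approximation is an assumption here (the article does not say which method the cited studies used), and the twin correlations below are hypothetical: heritability is taken as roughly twice the gap between identical-twin and fraternal-twin correlations for a trait.

```python
# Falconer's approximation: h2 ~ 2 * (r_mz - r_dz), where r_mz and r_dz
# are trait correlations for identical (MZ) and fraternal (DZ) twin pairs.
# Identical twins share ~100% of segregating genes, fraternal twins ~50%,
# so doubling the correlation gap estimates the genetic share of variance.
def falconer_h2(r_mz, r_dz):
    """Rough heritability estimate from twin-pair correlations."""
    return 2.0 * (r_mz - r_dz)

# Hypothetical correlations for criminality used as a violence surrogate:
h2 = falconer_h2(r_mz=0.70, r_dz=0.45)
print(round(h2, 2))  # 0.5 -> about half the variance, as the text states
```

This is only the simplest of the three designs the article mentions; twins-reared-apart and adoption studies estimate the same quantity while controlling differently for shared environment.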

Another promising research avenue is the investigation of brain anatomy relevant to the expression of violence. These studies use the lesion method of behavioral neurology to find associations between structural brain damage and a behavioral pattern. Whereas no “violence center” exists in the brain, the limbic system and the frontal lobes are areas most implicated. The limbic system is the neuroanatomic substrate for many aspects of emotion. The structure most often implicated in violent behavior is the amygdala; placidity has been described in humans with bilateral amygdalar damage, whereas violence has been observed in those with abnormal electrical activity in the amygdala. The frontal lobes are regarded as the repository of the most advanced functions of the brain. In particular, the orbitofrontal cortices allow for the inhibition of aggression. Individuals with orbitofrontal injury have been found to display antisocial traits (disinhibition, impulsivity, lack of empathy) that justify the diagnosis of “acquired sociopathy,” and some have an increased risk of violent behavior. A balance thus exists between the potential for impulsive aggression mediated by limbic structures, and the control of this drive by the influence of the orbitofrontal regions.

A third research area likely to yield useful results is the neurochemistry of violence. A wide variety of hormones and neurotransmitters have attracted interest. Testosterone merits strong consideration in view of the much higher incidence of violent behavior among males. In nonhuman animals, evidence for a causal link between testosterone and aggression exists, but a similar association in humans has not been demonstrated. Neurotransmitters including acetylcholine, dopamine, and gamma-aminobutyric acid have been considered, but data on serotonin and norepinephrine are most convincing. These neurochemicals appear to act in a reciprocal fashion with regard to impulsive aggression: serotonin is inhibitory and norepinephrine facilitatory. The association between reduced central serotonin function and impulsive aggression, the “low serotonin syndrome,” has received substantial support. Conversely, several lines of evidence have linked aggression with increased activity of norepinephrine.

Clinical and postmortem study of violent individuals is critical, particularly regarding its neuroanatomic, neurochemical, and genetic determinants. Equally desirable is large-scale, long-term, and population-based research to help clarify the epidemiology of violence and the relative importance of its risk factors. The Framingham Heart Study serves as a useful model for a project of this kind. Prospective studies could provide data on cognitive, behavioral, neuroanatomic, and genetic predisposition, early brain injury, socioeconomic disadvantage, drug abuse, familial disruption, psychological stress, and exposure to violence. Such studies could also address whether education and socialization influence the brain to promote compassion and empathy. The most efficacious and cost-effective measures in social policy, public health, and medical treatment could then be implemented. Whereas dysfunction of a discrete brain region, isolated neurochemical system, or single gene will not likely emerge as a direct cause of violence, all may contribute. Hence, the need for this type of research.

The Scientist
January 17, 2006

Original web page at The Scientist

Categories
News

Blocking nerve receptor cuts stroke damage

Johns Hopkins scientists say blocking the nerve receptor EP1 in mouse models reduces brain damage caused by stroke. The researchers discovered how to block a molecular switch that triggers brain damage caused by the lack of oxygen during a stroke. The Hopkins study is believed to be the first to demonstrate that a protein on the surface of nerve cells, called the EP1 receptor, is the switch, and that a specific compound, known as ONO-8713, turns it off. The lead author, Sylvain Dore, an associate professor in the university’s school of medicine, said the finding holds promise for developing effective alternatives to anti-inflammatory drugs called COX inhibitors, which have potentially lethal side effects.

Receptors are protein-docking sites on cells into which “signaling” molecules such as nerve chemicals or hormones insert themselves. The binding activates the receptor, which transfers the signal into the cell to produce a specific response. Dore has applied for a patent covering the prevention and/or treatment of neurodegenerative diseases by administering agents that block the EP1 receptor.

Science Daily
January 3, 2006

Original web page at Science Daily

Categories
News

Mental time travel

Neuroscientists at Princeton University have developed a new way of tracking people’s mental state as they think back to previous events — a process that has been described as “mental time travel”. The findings, detailed in the Dec. 23 issue of Science, will aid efforts to learn more about how people mine the recesses of memory and could have a wide-ranging impact in the field of neuroscience, including studies of brain disorders such as Alzheimer’s disease.

The researchers showed nine participants a series of pictures and then asked them to recall what they had seen. By applying a computerized pattern-recognition program to brain scanning data, the researchers were able to show that the participants’ brain state gradually aligned with their brain state from when they first studied the pictures. This supports the theory that memory retrieval is a form of mental time travel. In addition, by measuring second-by-second changes in how well participants were recapturing their previous brain state, the researchers were able to predict what kind of item the subjects would recall next, several seconds before they actually remembered that item.

“When you try to remember something that happened in the past, what you do is try to reinstate your mental context from that event,” said Kenneth Norman, a Princeton psychology professor who helped lead the research. “If you can get yourself into the mindset that you were in during the event you’re trying to remember, that will allow you to remember specific details. The techniques that we used in this study allow us to visualize from moment to moment how well subjects are recapturing their mindset from the original event.” In the experiment, participants studied a total of 90 images in three categories — celebrity faces, famous locations and common objects — and then attempted to recall the images. Norman and his colleagues used Princeton’s functional magnetic resonance imaging (fMRI) scanner to capture the participants’ brain activity patterns as they studied the images. They then trained a computer program to distinguish between the patterns of brain activity associated with studying faces, locations or objects.

The computer program was used to track participants’ brain activity as they recalled the images to see how well it matched the patterns associated with the initial viewing of the images. The researchers found that patterns of brain activity for specific categories, such as faces, started to emerge approximately five seconds before subjects recalled items from that category — suggesting that participants were bringing to mind the general properties of the images in order to cue for specific details. “What we have learned over the years is that what you get out of memory depends on how you cue memory. If you have the perfect cue, you can remember things that you had no idea were floating around in your head,” Norman said. “Our method gives us some ability to see what cues participants are using, which in turn gives us some ability to predict what participants will recall. We are hopeful that, in the long run, this kind of work will help psychologists develop better theories of how people strategically cue memory, and also will suggest ways of making these cues more effective.”
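
The decoding logic can be sketched with a toy nearest-centroid classifier. The study used a more sophisticated pattern-recognition program on real fMRI data; everything below (the voxel count, noise level, and the classifier itself) is an illustrative simplification:

```python
import random

random.seed(0)
CATEGORIES = ["face", "location", "object"]
N_VOXELS = 50  # illustrative; real fMRI patterns span many more voxels

# Each category gets a characteristic activity template (a stand-in for
# the category-specific fMRI patterns the classifier was trained on).
templates = {c: [random.gauss(0, 1) for _ in range(N_VOXELS)]
             for c in CATEGORIES}

def simulate_pattern(category, noise=0.5):
    """A noisy activity pattern evoked by studying one item."""
    return [v + random.gauss(0, noise) for v in templates[category]]

def correlation(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def train(study_patterns):
    """Average the study patterns per category (nearest-centroid decoding)."""
    return {c: [sum(col) / len(col) for col in zip(*pats)]
            for c, pats in study_patterns.items()}

def decode(centroids, pattern):
    """Score each category by similarity to the current pattern; a rising
    score for one category is the kind of signal that preceded recall."""
    scores = {c: correlation(pattern, centroids[c]) for c in CATEGORIES}
    return max(scores, key=scores.get), scores

# "Study phase": ten noisy patterns per category, then train the decoder.
study = {c: [simulate_pattern(c) for _ in range(10)] for c in CATEGORIES}
centroids = train(study)

# "Recall phase": decode a fresh pattern as the subject cues memory.
guess, scores = decode(centroids, simulate_pattern("face"))
print(guess)  # the face centroid should win
```

The same idea scales to real data by replacing the simulated patterns with voxel time courses and the centroid match with a trained classifier.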

Princeton University
January 3, 2006

Original web page at Princeton University

Growth factor protects brain against damage from stroke

A naturally occurring growth factor called neuregulin-1 protects brain cells from damage resulting from stroke, according to an animal study conducted by researchers at Morehouse School of Medicine (MSM) and the Atlanta-based Center for Behavioral Neuroscience (CBN). The finding, reported in the online edition of the Journal of Cerebral Blood Flow and Metabolism, could lead to the development of new stroke treatments. Stroke, the third leading cause of death in adults in the United States, occurs when blood flow to the brain is interrupted. Deprived of oxygen, brain cells die within minutes, causing inflammation and further damage to tissue surrounding the site where blood flow is obstructed. In the study, a research team led by Byron Ford, PhD, of the MSM Neuroscience Institute and Department of Anatomy and Neurobiology and CBN, examined the effects of administering neuregulin-1 to rats after surgically induced strokes. The scientists discovered the compound reduced cell death by 90 percent compared to rats that did not receive it. Neuregulin-1 also protected neurons from damage even when administered as long as 13 hours after the stroke’s onset.

In DNA microarray analysis of the affected brain tissue, Ford and his team determined neuregulin-1 produces its protective effects by turning on or off nearly 1,000 genes that regulate cell death and inflammation. Neuregulin-1 also blocks the production of free radicals, compounds that have been implicated in cell injury and aging. Currently, a drug called TPA is the only available stroke treatment, and must be administered within three hours of stroke onset to be effective. “The biggest potential benefit of neuregulin-1 is that its therapeutic window is much longer than TPA, potentially up to 48 hours,” said Ford. “It also appears to easily cross the blood-brain barrier and does not produce any obvious side effects in rats.”
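
A microarray screen of this kind boils down to flagging genes whose expression changes beyond some fold-change cutoff between treated and untreated tissue. The sketch below is a generic illustration of that on/off call; the gene names, expression values, and cutoff are invented, not taken from Ford's data:

```python
from math import log2

# Hypothetical expression values (arbitrary units) in untreated vs.
# neuregulin-1-treated tissue; a real microarray covers thousands
# of genes at once.
expression = {
    "gene_a": (800, 150),   # strongly turned down by treatment
    "gene_b": (60, 540),    # strongly turned up
    "gene_c": (500, 510),   # essentially unchanged
}

def regulated_genes(expression, fold=2.0):
    """Flag genes whose expression changes at least `fold` in either
    direction, the kind of on/off call a microarray screen makes."""
    hits = {}
    for gene, (untreated, treated) in expression.items():
        lfc = log2(treated / untreated)  # log2 fold change
        if abs(lfc) >= log2(fold):
            hits[gene] = lfc
    return hits

print(sorted(regulated_genes(expression)))  # gene_a and gene_b only
```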

Ford has filed two provisional patents covering the use of neuregulin-1 as a stroke treatment and as a means of promoting the growth of endogenous neural stem cells to replace damaged neurons. Ford was recently awarded a five-year R01 grant from the National Institute of Neurological Disorders and Stroke to begin pre-clinical studies of neuregulin-1 as a stroke therapy. As part of this project, Ford will test neuregulin-1 in additional animal models and conduct imaging studies to determine the optimal therapeutic window for the compound to be protective. He also intends to better characterize the molecular processes involved in stroke to facilitate the development of novel therapies.

Science Daily
January 3, 2006

Original web page at Science Daily

New procedure reveals the secrets of the brain

Scientists from the MPI for Biological Cybernetics in Tübingen have developed a new procedure which accurately maps the activity in primate brains by means of the BOLD signal (Blood Oxygen Level Dependent signal). The combination of electrical microstimulation and fMRI promises substantially more precise insights into the functional organisation of the brain and its circuitry (Neuron, December 22, 2005). Activity patterns in the brain elicited by electrical microstimulation are observed around the electrode and in other functionally connected visual areas. Functional magnetic resonance imaging was used to measure activation. Over the last two centuries electrical microstimulation has often been used to demonstrate causal links between neural activity and specific behaviors or cognitive functions. It has also been used successfully for the treatment of several neurological disorders, most notably Parkinson’s disease. However, to understand the mechanisms by which electrical microstimulation can cause alterations in behaviors and cognitive functions, it is imperative to characterize the cortical activity patterns that are elicited by stimulation locally around the electrode and in other functionally connected areas.

To this end, in a new study published in the December 2005 issue of Neuron, Andreas S. Tolias and Fahad Sultan, under the guidance of Prof. Nikos K. Logothetis of the Max Planck Institute for Biological Cybernetics in Tübingen, have for the first time developed a technique to record brain activity using the blood oxygen level dependent (BOLD) signal while applying electrical microstimulation to the primate brain. They found that the spread of activity around the electrode in macaque area V1 is larger than expected from calculations based on passive spread of current, and therefore may reflect functional spread by way of horizontal connections. Consistent with this functional transsynaptic spread, they also obtained activation in expected projection sites in extrastriate visual areas, demonstrating the utility of their technique in uncovering in vivo functional connectivity maps. Using the microstimulation/fMRI technique in conscious, alert primates holds great promise for determining the causal relationships between activation patterns across distributed neuronal circuits and specific behaviors. Finally, this method could also prove useful in understanding and optimising intra-cranial electrical stimulation in the treatment of neurological diseases.

Science Daily
January 3, 2006

Original web page at Science Daily

Interactive 3-D atlas of mouse brain now available on web

Researchers at the U.S. Department of Energy’s Brookhaven National Laboratory have just launched a web-based 3-D digital atlas browser and database of the brain of a popular laboratory mouse. “Neuroscientists around the world can now download these extremely accurate anatomical templates and use them to map other data — such as which parts of the brain are metabolically active and where particular genes are expressed — and for making quantitative anatomical comparisons with other, genetically engineered mouse strains,” said project leader Helene Benveniste, who is a researcher in Brookhaven’s medical department and a professor of anesthesiology at Stony Brook University.

The database was created using high-resolution magnetic resonance (MR) microscopy at the University of Florida in collaboration with researchers from Brookhaven Lab’s Center for Translational Neuroimaging. The work was done in parallel with an international collaboration, the Mouse Phenome Database (MPD) project, which was created to establish a collection of baseline phenotypic data from commonly used inbred mice. The new brain atlas database consists of 3-D anatomical data from 10 adult male mice of the strain C57BL/6J, and contains data on 20 segmented structures, including variability of brain structures across the strain, and downloadable visualization tools. The research that makes up this database was published as a cover article in the October 2005 issue of the journal Neuroscience.

Science Daily
January 3, 2006

Original web page at Science Daily

Pain research using electronic diaries helps identify who responds to ‘placebo effect’

Studies involving people who suffer from chronic pain often give some of them placebos, “sugar pills” with no medicinal value, to show whether the treatment has real value. Little is known, however, about the types of people who tend to respond positively to placebos, a mystery that places a hurdle before researchers who want to learn the best way to treat people’s pain. A new study by researchers at the University of Michigan Health System sheds some light on one group of people that seems to experience the “placebo effect.” The researchers found that people with one type of chronic pain whose pain levels fluctuate more widely tend to be more likely to respond to placebos.

The study of people with fibromyalgia — a type of chronic pain affecting several million people that typically involves tenderness, stiffness and fatigue — appears in the current edition of the journal Arthritis & Rheumatism. “There is substantial evidence that the placebo effect has strong biological underpinnings, and that some individuals are more likely than others to demonstrate this effect,” says Daniel J. Clauw, M.D., director of the U-M Chronic Pain and Fatigue Research Center and professor of rheumatology at the U-M Medical School. “This study suggests that individuals with greater hour-to-hour and day-to-day variability in their pain may be more likely to be placebo responders,” says Clauw, senior author of the paper. “When such individuals are placed in clinical trials of new interventions, the strong placebo effect they experience can make it difficult to determine if there is a superimposed effect of the treatment being tested.”

“This finding is important because so far, nobody has been able to fingerprint placebo responders,” adds Richard A. Harris, Ph.D., lead author of the paper, research investigator in the Division of Rheumatology at the U-M Medical School’s Department of Internal Medicine and a researcher at the Chronic Pain and Fatigue Research Center. “The research is helping us gain a better understanding of who responds to placebos. That’s especially important to the research community as we design clinical trials.” It is not clear if these findings are only present in fibromyalgia, or may also be seen in other chronic pain conditions, the researchers said.

Each of the 125 participants in the study carried a Palm-based electronic diary and was prompted at random intervals to record his or her pain level. Participants were prompted on average about three and a half times per day. The patients were enrolled in a multicenter drug trial of the anti-depressant milnacipran versus a placebo. Some of the participants experienced a large range of pain variability, while others experienced pain at fairly constant levels. Those whose pain levels varied widely were more likely to respond to the placebo. They were not, the researchers found, necessarily more likely to respond to milnacipran, which suggests that high pain variability may be a predictor of a placebo response. “These results have direct implications for drug trials in fibromyalgia and perhaps broader implications for other pain syndromes,” the researchers note.
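
The key measurement, within-subject pain variability computed from diary entries, can be sketched in a few lines. The subject IDs, ratings, and variability threshold below are invented for illustration; the paper's actual analysis was more involved:

```python
from statistics import stdev

# Invented diary data: each subject's pain ratings on a 0-10 scale,
# collected a few times a day as the random prompts fired.
diaries = {
    "s01": [7, 7, 6, 7, 7, 6, 7],   # steady pain
    "s02": [2, 9, 4, 8, 1, 7, 3],   # widely swinging pain
}

def pain_variability(ratings):
    """Within-subject variability: standard deviation of diary ratings."""
    return stdev(ratings)

def flag_high_variability(diaries, threshold=2.0):
    """Subjects whose swings exceed an (illustrative) threshold; the study
    found such subjects were more likely to respond to placebo."""
    return {s for s, r in diaries.items() if pain_variability(r) > threshold}

print(flag_high_variability(diaries))  # only the swinging subject is flagged
```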

The researchers also found that the electronic recording of pain levels was much more accurate than earlier pencil-and-paper recordings. The electronic system contained a time stamp when the participants recorded the data, removing some of the uncertainty and inaccuracy that often accompanied pencil-and-paper diaries. Fibromyalgia is a common condition that affects 2 to 4 percent of the U.S. population and is more commonly diagnosed in women. Clauw and his colleagues at the Chronic Pain and Fatigue Research Center are conducting extensive research into the causes and treatments of fibromyalgia.

Science Daily Health & Medicine
December 20, 2005

Original web page at Science Daily Health & Medicine

Focusing: brain, not age, important

University of Illinois scientists say they’ve determined that when it comes to focusing on a task, the brain, not age, is what matters most. The researchers at the university’s Beckman Institute for Advanced Science and Technology said some people older than 60 are as mentally sharp as 22-year-olds. Others struggle.

The differences became apparent through the use of functional magnetic resonance imaging of the brains of 40 individuals ranging in age from 19 to 87. The scientists found less white matter in the frontal lobes of those who struggle with focusing. “We found that both performance and brain-activation differences of older good performers and the older poor performers are predicted by changes in brain structure, specifically by the volume of white matter connecting the right and left hemispheres of the frontal lobes,” said Arthur Kramer, a professor of psychology. The study is reported in the current issue of the quarterly journal Psychology and Aging.

Science Daily
December 6, 2005

Original web page at Science Daily

Restricting diet may reverse early-stage Parkinson’s disease

A new Oregon Health & Science University and Portland Veterans Affairs Medical Center study suggests that early-stage Parkinson’s disease patients who lower their calorie intake may boost levels of an essential brain chemical lost from the neurodegenerative disorder. The study by Charles Meshul, Ph.D., associate professor of behavioral neuroscience in the OHSU School of Medicine and the VAMC’s Neurocytology Lab, shows that dietary restriction reverses a Parkinson’s-induced drop in glutamate, a brain neurotransmitter important for motor control, function and learning, in a mouse model for the disease’s early stages.

The results are the first to show that a restricted diet can reverse neurochemical changes in the brain occurring in early-stage Parkinson’s even after those changes are observed. “In the early stages of the disease, we see certain markers in the brain that are changing that may be indicative that dietary restriction is helpful,” Meshul said. Parkinson’s disease is a progressive, degenerative disorder affecting a region of the brain called the substantia nigra, where movement is controlled. Symptoms such as tremor or shaking, muscular stiffness or rigidity, slowness of movement and difficulty with balance appear when about 80 percent of the brain’s dopamine-producing cells die or become impaired. Incidence increases with age, and the disease is uncommon in people younger than 40. According to the OHSU Parkinson Center of Oregon, the disease affects both men and women across all ethnic lines and occurs in about two of every 100 people older than 55. About 1.5 million Americans suffer from the disease.

Meshul’s lab compared two groups of mice with 60 percent to 75 percent loss of dopamine in the brain, representing early-stage Parkinson’s: One had access to food every day while the other had access every other day, and both were fed over a 21-day period. The mice that ate less often lost 10 percent to 15 percent of their body weight compared to their counterparts. “Dietary restriction appears to be normalizing the levels of glutamate,” Meshul said. “The fact that we’re getting the levels of glutamate back to, essentially, control levels may indicate there are certain synapse changes going on in the brain to counteract the effects of Parkinson’s. In fact, what this may indicate is a reversal of locomotor deficits associated with the disease.”

In addition to the rise in glutamate, Meshul’s group, using a dopamine-synthesizing enzyme called tyrosine hydroxylase as a marker for dopamine nerve terminals, found that dietary restriction caused a drop in the number of dopamine terminals in the mouse model for early-stage Parkinson’s. “As it turns out, dietary restriction, in and of itself, had an effect. It actually caused a small but significant decrease in the numbers of these dopamine terminals. So in other words, dietary restriction really is doing something to the brain,” Meshul said. “It could very well be that what dietary restriction is doing is trying to protect the system somehow. And one of the reasons dietary restriction is protective may be that it’s reducing the activity of particular synapses. That’s actually what the data indicates.”

Matching the upturn in glutamate levels with positive behavioral changes is difficult at this point in the research, Meshul said. “One of the unfortunate problems with this model is it’s tough to do any behavioral measures. We see a reversal of the effect of glutamate in the brain due to the dietary restriction, but what does that actually mean in terms of the behavior of the animal? Unfortunately, we don’t know. We didn’t measure that.” But a similar primate study at the University of Southern California that Meshul is associated with is testing the hypothesis that glutamate does have an effect on behavior. “It turns out that, in time, these animals recover behaviorally from all of the motor deficits that are associated with (early-stage Parkinson’s),” he said. “Our hypothesis is there may be changes in glutamate that account for these behavioral changes.”

Dietary restriction’s beneficial effect on neurological function has been studied in primates by scientists at the National Institutes of Health for 30 years, Meshul said. Researchers found that animals whose calorie intake was lowered by 20 percent aged better, suffered from fewer immunological disorders, displayed healthier hair and skin tone, and “looked significantly better than a counterpart that hasn’t had a restricted diet.” “They live longer,” Meshul said. “It’s been known for many, many years that dietary restriction is good.”

Scientists already have shown that dietary restriction initiated before the onset of early Parkinson’s can protect against neurochemical changes in the brain caused by the disease. In 1999, researchers found that mice placed on restricted diets for three months before early Parkinson’s-like damage was induced lost fewer dopamine-synthesizing neurons. “There’s not as much loss of dopamine if you restrict their diets ahead of time,” Meshul noted. Meshul’s lab is finding that dietary restriction isn’t the only way to boost neurological function in Parkinson’s disease. Early results of another study the group is conducting have shown that rats with 90 percent loss of dopamine in the brain — or full-blown Parkinson’s disease — under a four-week exercise regimen can run twice as long as parkinsonian rats that didn’t exercise. “We’re trying to make the correlation that exercise definitely helps in terms of the parkinsonian animal and, in fact, in human studies it’s been shown that any sort of exercise helps patients,” Meshul said.

Science Daily
December 6, 2005

Original web page at Science Daily

Statins to overcome learning disabilities in mice

In a surprise twist that recalls the film classic “Flowers for Algernon,” but adds a happy ending, UCLA scientists used statins, a popular class of cholesterol drugs, to reverse the attention deficits linked to the leading genetic cause of learning disabilities. The Nov. 8 issue of Current Biology reports the findings, which were studied in mice bred to develop the disease, called neurofibromatosis 1 (NF1).

Statins helped reverse attention deficits in a mouse bred to develop the leading genetic cause of learning disabilities. (Credit: UCLA/Weidong Li)

The results proved so promising that the Food and Drug Administration approved the use of the drugs in three clinical trials currently under review to test the effect of statins in children and adults born with NF1. The findings could help the estimated 35 million Americans who struggle with learning disabilities. “Learning disabilities and mental retardation each affect five percent of the world population,” said Dr. Alcino Silva, professor of neurobiology, psychiatry and psychology at the David Geffen School of Medicine at UCLA. “Currently, there are no treatment options for these people. That’s why our findings are so exciting from a clinical perspective.”

In an earlier study, Silva and his colleagues linked NF1’s learning problems to Ras, a protein that regulates how brain cells talk to each other. This communication is what enables learning to take place. The NF1 mutation creates hyperactive Ras, which disrupts cellular conversation and undermines the learning process. “The act of learning creates physical changes in the brain, like grooves on a record,” said Silva. “But surplus Ras tips the balance between switching signals on and off in the brain. This interrupts the delicate cell communication needed by the brain to record learned information.” The UCLA team began searching for a safe drug that would zero in on Ras and overcome its hyperactivity without causing harmful side effects over long-term use.

“It became something of a Quixotic quest — an impossible dream,” Silva admitted. “We thought, ‘Wouldn’t it be nice to find a drug that is already FDA-approved, safe for lifetime use and could be tested in mice and humans with NF1?’ Fortunately, our optimism was rewarded.” It took a medical student in Silva’s lab to identify the drug and connect it with NF1. Steve Kushner, a scholar in UCLA’s MD/PhD program, learned in a clinical rotation about statins, the drugs already prescribed to millions of people worldwide to lower cholesterol. “Steve raced into my lab and shared what he’d learned: statins work on the Ras protein that is altered by NF1 and play a key role in learning and memory,” recalled Silva. “It was the researcher’s equivalent of finding a suitcase stuffed with a million dollars.”

Statin drugs lower cholesterol by blocking the effects of certain fats. Because Ras requires fat to function, less fat results in less Ras. With reduced Ras activity, the brain cells are able to communicate properly in mice with NF1, allowing normal learning to take place. “NF1 interrupts how cells talk to each other, which results in learning deficits,” said Silva. “Statins act on the root of the problem and reverse these deficits. This enables the process of learning to physically change the brain and create memory.” Silva’s lab tested the effects of statins on mice that were bred with the NF1 mutation. The animals displayed the same symptoms as people with NF1: attention deficits, learning problems and poor physical coordination. First author Weidong Li, a UCLA postdoctoral fellow, ran three tests to compare the behavior of NF1 mice treated with statins to NF1 mice that received a placebo. Then he compared both groups to normal mice.

First, he trained the mice to follow a blinking light in order to find a food reward. The NF1 mice on statins showed a 30 percent improvement in their ability to pay attention, outperforming the normal mice. Second, he trained the mice to memorize spatial clues in order to navigate a water maze and swim to a platform. The normal animals learned to find the platform in seven days; the NF1 mice took 10. After receiving statins, the NF1 mice outraced the normal mice.
Third, Li tested coordination by training the mice to balance while running on a rotating log, which gradually increased in speed. At first, the NF1 mice would jump off as the log spun faster. But statin therapy enabled the NF1 mice to perform as well as their normal counterparts. “This is mind-blowing — we think we have a real fundamental reason to be optimistic,” explained Silva. “Here is a drug that affects a key learning and memory pathway, and completely rescues the most common genetic cause for learning disabilities. We don’t have to do extensive clinical trials for toxicity or safety — these were already completed for other uses.”

Science Daily Health & Medicine
December 6, 2005

Original web page at Science Daily Health & Medicine

The brain needs the inner ear to track depth

When you jaywalk, your ability to keep track of that oncoming truck despite your constantly changing position can be a lifesaver. But scientists do not understand how such constant updating of depth and distance takes place, suspecting that the brain receives information not just from the eye but also from the motion-detecting vestibular system in the inner ear. In studies with monkeys reported in the October 6, 2005, issue of Neuron, Nuo Li and Dora Angelaki of Washington University School of Medicine in St. Louis have demonstrated how such depth motion is updated and strongly implicated the vestibular system in that process.

In their experiments, the researchers trained the monkeys to perform memory-guided eye movements. The animals were first shown a light a fixed distance away from their head. Then the researchers flashed one of eight other, closer “world-fixed” target lights. Next, with the room lights turned off, the monkeys were moved either forward or backward and the fixed-distance light flashed, signaling the monkeys that they should look at where they remembered the world-fixed light had flashed. Finally, the room lights and target light were turned on, so the monkey could make any corrective eye movement to the re-lit target. For comparison, the researchers also conducted experiments in which the monkeys were not moved. Such an experimental design using passive motion enabled the researchers to study depth-tracking in the absence of any clues the monkeys might have gleaned from their own motor movements, leaving the vestibular system as the most likely source of information.
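
The geometry the monkeys must solve can be made concrete: after a forward translation in darkness, the remembered target is nearer, so the eyes must converge more to fixate it. A minimal sketch, assuming a straight-ahead target and an illustrative interocular distance:

```python
import math

IOD = 0.03  # interocular distance in metres; an illustrative macaque value

def vergence_deg(distance_m):
    """Vergence angle (degrees) needed to fixate a straight-ahead target."""
    return math.degrees(2 * math.atan((IOD / 2) / distance_m))

def updated_vergence(initial_distance_m, forward_translation_m):
    """Vergence required after the head moves toward the remembered target;
    in darkness this geometric update must be driven by non-visual
    (e.g. vestibular) motion signals."""
    remaining = initial_distance_m - forward_translation_m
    if remaining <= 0:
        raise ValueError("head has passed the remembered target")
    return vergence_deg(remaining)

# A target remembered at 20 cm, with the head then translated 10 cm forward:
before = vergence_deg(0.20)
after = updated_vergence(0.20, 0.10)
print(round(before, 2), round(after, 2))  # required vergence roughly doubles
```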

Finally, the researchers eliminated the vestibular systems in two of the monkeys and performed the same eye-movement experiments. They found that the eye motion of monkeys in the first experiments indicated that they were clearly able to update their perception of the depth of the target, even in the absence of information from their own motor movements. By contrast, the monkeys that lacked vestibular systems showed compromised ability in the task. “These results demonstrate not only that monkeys can update retinal disparity information but also that intact vestibular motion cues are critical in reconstructing three-dimensional visual space during motion in depth,” concluded Li and Angelaki.

Science Daily
November 8, 2005

Original web page at Science Daily

A new analysis of a standard brain test may help predict dementia

Although Alzheimer’s disease affects millions of people worldwide, there is no way to identify this devastating brain disease at its earliest stages when there still may be time to delay or even prevent the downward spiral into dementia. In research settings, scientists are using sophisticated tools like MRI and PET to distinguish characteristics of brain function and anatomy that indicate future problems, providing a sort of screening test for the brain. Now a new study by a research group at NYU School of Medicine demonstrates that the earliest manifestations of Alzheimer’s, when the first signs of memory loss appear, can be screened with a relatively inexpensive, painless, and easy-to-use tool called an EEG (electroencephalograph).

In the study, published in the upcoming on-line issue of the journal Neurobiology of Aging, the researchers demonstrate that a computer analysis of the EEG, which measures the brain’s electrical activity, accurately predicted healthy people in their 60s and 70s who would develop dementia over the next 7 to 10 years. It also identified individuals who would remain virtually unchanged over the same time span. The EEGs were almost 95 percent accurate in identifying those who would decline cognitively and those who would not, according to the study.

“Our results suggest that quantitative analysis of the EEG is sensitive to the earliest signs of the dementing process,” says Leslie S. Prichep, Ph.D., Associate Director of the Brain Research Laboratories of the Department of Psychiatry, who led the study. Some day, she says, it may be used as one of the tools to evaluate a person’s propensity for developing Alzheimer’s, the most common form of dementia affecting people over 65. But for now the results need to be replicated and validated in much larger prospective studies before they can be applied to screen large populations. It takes about 30 minutes to perform an EEG, which involves placing recording electrodes on the scalp. The test is performed with the patient seated comfortably. There are no injections and the scalp is not shaved.

The NYU researchers, led by Dr. Prichep and Roy John, Ph.D., Professor in the Department of Psychiatry, evaluated a group of 44 individuals between the ages of 64 and 79 who felt that their memories were faltering. These people enrolled voluntarily in a long-term study at NYU’s Silberstein Aging and Dementia Research Center where they underwent a battery of neuropsychiatric and other tests, which revealed that their brain function was normal for their age. At the beginning of the testing process each volunteer was also given a baseline EEG test at the Brain Research Laboratories at NYU School of Medicine. They were tested there several more times over the next 7 to 10 years. Over this period, 27 of the 44 subjects developed mild cognitive impairment or full-blown dementia, and 17 remained stable. Applying a mathematical algorithm to the EEG recordings, Drs. Prichep and John showed that certain characteristics of the pattern of brain waves on the baseline EEG were associated with future cognitive deterioration.

To the untrained eye EEGs look like a confusing thicket of squiggly lines. But the lines are actually waves that have been described mathematically by their amplitude and frequency composition as a function of age, based on data collected over the last 30 years by Drs. Prichep and John. They and their NYU colleagues obtained this data from some 12,000 healthy people and psychiatric patients who had been given EEGs. About 3,500 of the EEGs were from aging and dementia patients. “We probably have the largest electrophysiological database of this kind in the world,” says Dr. Prichep. “Since we can compare each individual’s quantitative EEG to age-expected normal values, we were able to describe which features reflected expected changes occurring with normal aging and which might be associated with future decline,” she says.

A prominent feature associated with cognitive deterioration on the baseline EEG was a brain wave called theta, which was excessive in people who would eventually decline, according to the study. This band was particularly abnormal in the frontal regions, along the lateral regions and in the right posterior region of the brain in those people who went on to decline. Another feature was a slowing in the mean frequency of the EEG, which is measured in cycles per second. Yet another distinctive feature of those who declined was a change in the synchronization between the two sides of the brain. The source of the theta has been shown to be the hippocampus, a brain region demonstrated in imaging studies with MRI and PET to be impaired in dementia, notes Dr. Prichep.
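
Quantitative EEG features of this kind, relative theta-band power and the power-weighted mean frequency, can be computed from a power spectrum. The sketch below runs a plain DFT on a synthetic one-second signal; the sampling rate, band edges, and the signal itself are illustrative, not the NYU group's actual pipeline:

```python
import cmath
import math

FS = 128  # sampling rate in Hz (illustrative)

def power_spectrum(signal):
    """Magnitude-squared DFT up to the Nyquist frequency."""
    n = len(signal)
    spec = []
    for k in range(n // 2 + 1):
        z = sum(x * cmath.exp(-2j * math.pi * k * t / n)
                for t, x in enumerate(signal))
        spec.append((k * FS / n, abs(z) ** 2))
    return spec  # list of (frequency_hz, power) pairs

def relative_theta(spec):
    """Share of total power in the theta band (4-8 Hz), the feature that
    was excessive in subjects who later declined."""
    total = sum(p for _, p in spec[1:])  # skip the DC term
    theta = sum(p for f, p in spec if 4 <= f < 8)
    return theta / total

def mean_frequency(spec):
    """Power-weighted mean frequency; EEG slowing lowers this value."""
    total = sum(p for _, p in spec[1:])
    return sum(f * p for f, p in spec[1:]) / total

# A synthetic one-second "EEG": a 10 Hz alpha rhythm plus a stronger 6 Hz
# theta rhythm, mimicking the theta excess described above.
t = [i / FS for i in range(FS)]
eeg = [math.sin(2 * math.pi * 10 * x) + 1.5 * math.sin(2 * math.pi * 6 * x)
       for x in t]
spec = power_spectrum(eeg)
print(round(relative_theta(spec), 2), round(mean_frequency(spec), 1))
```

In practice these features would be computed per electrode and compared against the age-expected norms described above, e.g. as z-scores against the normative database.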

Science Daily
November 8, 2005

Original web page at Science Daily

Scientists uncover new clues to how crucial molecular gatekeepers work

One of the biggest mysteries in molecular biology is exactly how ion channels — tiny protein pores through which molecules such as calcium and potassium flow in and out of cells — operate. Such channels can be extremely important; members of the voltage-gated ion channel family are crucial to generating electrical pulses in the brain and heart, carrying signals in nerves and muscles. When channel function goes awry, the resulting diseases — known as channelopathies, including epilepsy, a number of cardiomyopathies and cystic fibrosis — can be devastating.

Ion channels are also controversial, with two competing theories of how they open and close. Now, scientists at Jefferson Medical College, reporting October 6, 2005 in the journal Neuron, have detailed a part of this intricate process, providing evidence to support one of the theories. A better understanding of how these channels work is key to developing new drugs to treat ion channel-based disorders. According to Richard Horn, Ph.D., professor of physiology at Jefferson Medical College of Thomas Jefferson University in Philadelphia, voltage-gated ion channels are large proteins with a pore that pierces the cell membrane. They open and close in response to voltage changes across the membrane, determining which ions may cross, and when.

In the conventional theory, when an electrical impulse called an action potential travels along a nerve, the charge across the cell membrane changes: the inside of the cell, normally electrically negative, becomes more positive. In turn, the voltage sensor, a positively charged transmembrane segment called S4, moves toward the outside of the cell through a small molecular gasket called a gating pore. This movement somehow causes the ion channel to open, allowing positively charged ions to flow across the cell membrane. After the action potential is over, the cell’s inside becomes negative again, and the membrane returns to its normal resting state.
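The voltage dependence described above is conventionally summarized by a Boltzmann curve for the probability that a channel is open. The sketch below is a generic textbook illustration, not the Jefferson group's model; the half-activation voltage and slope factor are made-up illustrative values.

```python
import math

def p_open(v_mV, v_half=-40.0, slope=7.0):
    """Boltzmann open probability: depolarization (more positive voltage)
    drives the S4 sensor outward and raises the odds the channel is open."""
    return 1.0 / (1.0 + math.exp(-(v_mV - v_half) / slope))

print(round(p_open(-70.0), 3))  # near resting potential: mostly closed
print(round(p_open(0.0), 3))    # near an action-potential peak: mostly open
```

The steepness of this curve reflects how much gating charge moves through the electric field, which is exactly the quantity the tether experiment below probes.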

The more recent and controversial theory, proposed by Nobel laureate Roderick MacKinnon of Rockefeller University, holds that a kind of molecular paddle composed of the S4 segment and part of the S3 segment moves through the cell membrane, carrying S4’s positive charges with it across the lipid. As in the conventional theory, the S4 movement controls the channel’s opening and closing. The two theories differ chiefly in how far the charges travel: the paddle model requires them to move all the way across the cell membrane, whereas the conventional theory says they move only a short distance through the gating pore.

In the current work, Dr. Horn and colleague Christopher Ahern, Ph.D., a research assistant in the Department of Physiology at Jefferson Medical College, showed that the electric field through which the voltage sensor’s charges move spans a very short distance, lending support to the conventional model. “Using a molecular tape measure with a very fine resolution — 1.24 Angstroms — we tethered charges to the voltage sensor,” Dr. Horn explains. “When the tether is too long, the voltage sensor can’t pull it through the electric field,” meaning the electric field is highly focused. “This is another nail in the coffin of the paddle model,” he says, “because the thickness of the electric field is much smaller than predicted by that model. The measurement is unambiguous in terms of the relationship between the length of the tether and how much charge gets pulled through the electric field.” Next, the researchers are tackling the relationship between S4’s movement and the gates that open and close the channels.

Science Daily
November 8, 2005

Original web page at Science Daily

Categories
News

Stimulating the brain makes the fingers more sensitive

Repetitive transcranial magnetic stimulation (rTMS) has emerged as an intriguing technique for exploring brain function, and a promising, though still unproven, form of therapy. This week, in the open-access journal PLoS Biology, Hubert Dinse and colleagues show that a short course of rTMS can increase finger sensitivity for up to two hours after treatment ends, and that this change corresponds to an increase in the size of the brain map representing the finger. rTMS is applied with an electromagnetic coil in the shape of a figure eight, placed on the scalp directly over the targeted portion of the brain. Short bursts of a strong magnetic field induce electrical currents in the brain tissue beneath. Sensory input from each region of the body is represented on the surface of the brain, and the location of any region (in this case, the right index finger) can be mapped to allow precise targeting of the rTMS. The authors adjusted the strength of the magnetic field to just below the level that triggered a sensory response in the finger, and then applied intermittent pulses of stimulation over the course of about ten minutes.

They tested the sensitivity of the index finger by determining how far apart two simultaneously applied pinpricks needed to be for the subject to distinguish them as separate stimuli. rTMS improved this two-point discrimination by about 15% (subjects could tell apart pinpricks roughly 15% closer together) immediately after stimulation, an effect that gradually diminished but remained significant over the course of the next two hours. The effect was fairly specific to the right index finger: there was no effect on the left index finger, which is represented in the opposite hemisphere, and only a small effect on the right ring finger, which is represented several millimeters away from the index finger in the same hemisphere. When stimulation was applied over the area representing the lower leg, the index finger did not become more sensitive.
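A two-point discrimination threshold of this kind can be scored from trial data. The sketch below uses hypothetical trial numbers and a simplified criterion; it is not the study's actual psychophysics procedure.

```python
from collections import defaultdict

def two_point_threshold(trials, criterion=0.75):
    """Smallest pinprick separation (mm) reported as 'two points'
    on at least `criterion` of its trials."""
    responses = defaultdict(list)
    for separation_mm, judged_two in trials:
        responses[separation_mm].append(judged_two)
    for sep in sorted(responses):
        if sum(responses[sep]) / len(responses[sep]) >= criterion:
            return sep
    return None

# Hypothetical pre-rTMS trials: (separation in mm, reported as two points?)
pre = [(1.0, 0), (1.0, 0), (1.5, 0), (1.5, 1),
       (2.0, 1), (2.0, 1), (2.0, 1), (2.5, 1)]
baseline = two_point_threshold(pre)   # 2.0 mm in this toy data set
improved = baseline * (1 - 0.15)      # a ~15% lower threshold after rTMS
print(baseline, round(improved, 2))
```

Note that improved sensitivity means a smaller threshold: the closer together two pinpricks can be while still feeling like two, the sharper the fingertip.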

The authors used functional magnetic resonance imaging (fMRI) to see how the brain changed in response to the stimulation. They found that the region representing the index finger got larger, and that the degree of increase in any one subject corresponded to the degree of increased sensitivity in that same subject. As the sensory effect faded, so too did the fMRI changes. Thus, the cortex itself undergoes changes as a result of rTMS.

Science Daily Health & Medicine
November 8, 2005

Original web page at Science Daily

Categories
News

Cerebellum found to be important in cognition and behavior

Higher cognitive functions, like language and visual processing, have long been thought to reside primarily in the brain’s cerebrum. But a body of research in premature infants at Children’s Hospital Boston is documenting an important role for the cerebellum — previously thought to be principally involved in motor coordination — and shows that cerebellar injury can have far-reaching developmental consequences.

The latest study, in the October issue of Pediatrics, also demonstrates that the cerebrum and cerebellum are tightly interconnected. Sophisticated MRI of 74 preterm infants’ brains revealed that when there was injury to the cerebrum, the cerebellum failed to grow to a normal size. When the cerebral injury was confined to one side, it was the opposite cerebellar hemisphere that failed to grow normally. The reverse was also true: when injury occurred in one cerebellar hemisphere, the opposite cerebral hemisphere was smaller than normal. “There seems to be an important developmental link between the cerebrum and the cerebellum,” says Catherine Limperopoulos, PhD, in Children’s Department of Neurology, the study’s lead author. “We’re finding that the two structures modulate each other’s growth and development. The way the brain forms connections between structures may be as important as the injury itself.”

As neuroimaging becomes more sophisticated, cerebellar injury is increasingly recognized as a complication of premature birth. Improved survival of fragile preemies, coupled with a surge in premature births, has left more and more families to deal with the damage to their babies’ brains — including cerebellar damage. In March, Limperopoulos and colleagues published a study in Pediatrics showing that the cerebellum grows rapidly late in gestation — much faster than the cerebral hemispheres — and that premature birth arrests this surge in development. In another study, published in Pediatrics in September, they found that the incidence of cerebellar hemorrhage in extremely premature infants rose significantly, by about 44 percent a year, from 1998 through 2002 — an increase they attribute to improved survival and improved diagnostic techniques. By 2002, cerebellar hemorrhage was identified in 15 percent of surviving infants weighing less than 750 grams. “Until recently, cerebellar injury was underrecognized,” says Limperopoulos. “Doctors downplayed it, saying, ‘Oh, maybe Johnny will be a little clumsy.’ Our research has made us aware that cerebellar injury is not a benign finding. We now know to look for it, and can counsel families that their children are likely to have deficits that extend beyond motor skills, and that may benefit from early intervention.”
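Read as a year-over-year rate, a rise of about 44 percent a year compounds quickly. The small sketch below, using a hypothetical index value of 1.0 for 1998, shows the incidence index roughly quadrupling between 1998 and 2002:

```python
rate = 0.44        # reported rise of about 44% per year
incidence = 1.0    # hypothetical index value for 1998
for year in range(1998, 2003):
    print(year, round(incidence, 2))
    incidence *= 1 + rate
# 1.44**4 is about 4.3, so the printed 2002 value is roughly
# four times the 1998 starting value
```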

A study presented in mid-September at the American Academy of Cerebral Palsy and Developmental Medicine meeting documents the magnitude of these problems. Limperopoulos and colleagues compared 31 toddlers, born prematurely and identified at birth as having cerebellar hemorrhage (but no cerebral injury), with 31 controls who were also born prematurely but whose brain imaging studies were normal. In addition to motor problems, over half the children with cerebellar injury had functional limitations in daily living, communication, and socialization skills, compared with only 3 percent of controls. Sixty-one percent, versus 3 percent of controls, had global developmental delays. Deficits included delays in both expressive and receptive language, visual reception delays, and impaired social and behavioral function.

Limperopoulos and colleagues continue to follow children who were identified at birth as having cerebellar injury. The children undergo a battery of wide-ranging developmental tests including assessment of motor, cognitive, language, social and behavioral skills and tests of functional abilities in self-care and day-to-day activities. Comprehensive MRI studies are paying particular attention to structural connections between the cerebrum and cerebellum — how the nerve fibers run and connect, and where they might be disrupted. “We want to understand what happens over time,” Limperopoulos says. “The way the brain adapts and reorganizes after injury may be the best indicator of how a child will do.”

Science Daily
October 25, 2005

Original web page at Science Daily