Researchers identify where emotional fear memory and pain become permanently etched in the brain

A team of researchers led by the University of Toronto has charted how and where a painful event becomes permanently etched in the brain — a discovery that has implications for pain-related emotional disorders such as anxiety and post-traumatic stress. University of Toronto physiology professor Min Zhuo and his colleagues, Professor Bong-Kiun Kaang of Seoul National University in South Korea and Professor Bao-Ming Li of Fudan University in China, have traced where emotional fear memory and pain begin by studying biochemical processes in a part of the brain not previously linked to the origin of fear. In a paper published in the Sept. 15 issue of Neuron, the researchers use mice to show how receptors activated in the pre-frontal cortex, the portion of the brain believed to be involved with higher intellectual functions, play a critical role in the development of fear. Previous research had pointed to activation in the hippocampus, an area buried in the forebrain that regulates emotion and memory, as the origin of fear memory.

“This is critical as it changes how and where scientists thought fear was developed,” says Zhuo, the EJLB-CIHR Michael Smith Chair in Neurosciences and Mental Health. “By understanding the biomolecular mechanisms behind fear, we could potentially create therapeutic ways to ease emotional pain in people. Imagine reducing the ability of distressing events, such as amputations, to be permanently imprinted in the brain.” Zhuo says that fear memory does not occur immediately after a painful event; rather, it takes time for the memory to become part of our consciousness. The initial event activates NMDA receptors — molecules on cells that receive messages and then produce a specific physiological effect in the cell — which are normally quiet but are triggered when the brain receives a shock. Over time, the receptors leave their imprint on brain cells.

By delivering shocks to mice, the researchers activated the NMDA receptors and traced a subunit of the molecule — a protein called NR2B — long believed to be associated with fear memory in the hippocampus and the amygdala, an almond-shaped structure in front of the hippocampus. To further test the protein’s influence, the researchers reduced its amount in mice and found that the animals failed to develop normal fear responses to shocks. “We tested the animals using both spatial and auditory cues,” Zhuo says. “In one experiment, the mice received small shocks when entering a chamber and they developed fear memory. In another experiment, we paired sound tones with the shocks. When NR2B was blocked, they no longer avoided the chamber or reacted to the tone.”

Zhuo and his team then studied the mice’s brain slices and discovered traces of NR2B in the pre-frontal cortex, supporting their theory that fear memory develops in that region. “By identifying NR2B in the pre-frontal cortex of the brain, we propose that fear memory originates from a network of receptors, rather than one simple area,” Zhuo says. “It is more complex than previously thought.”

The next step, according to Zhuo, is to determine how NR2B directly affects memory formation and storage in the brain. “While we know it exists in the hippocampus, amygdala and the pre-frontal cortex, we don’t know exactly how it alters them,” Zhuo says. “Once we understand the implications for each part, we will be able to reduce levels of NR2B accordingly and effectively reduce fear memory. In the future, perhaps people can take therapeutic measures before experiencing a particularly discomforting situation.” The University of Toronto Innovations Foundation is currently working with Zhuo to push for the translation of this finding into treatments.

Science Daily
October 11, 2005

Original web page at Science Daily

‘Noise’ affects how brain directs body to move

A University of California, San Francisco (UCSF) study has revealed new information about how the brain directs the body to make movements. The key factor is “noise” in the brain’s signaling, and it helps explain why all movement is not carried out with the same level of precision. Understanding where noise arises in the brain has implications for advancing research in neuromotor control and in developing therapies for disorders where control is impaired, such as Parkinson’s disease.

The new study was developed “to understand the brain machinery behind such common movements as typing, walking through a doorway or just pointing at an object,” says Stephen Lisberger, PhD, senior study investigator who is director of the W.M. Keck Center for Integrative Neuroscience at the University of California, San Francisco. The study findings, reported in the September 15 issue of the journal Nature, are part of ongoing research by Lisberger and colleagues on the neural mechanisms that allow the brain to learn and maintain skills and behavior. These basic functions are carried out through the coordination of different nerve cells within the brain’s neural circuits. “To make a movement, the brain takes the electrical activity of many neurons and combines them to make muscle contractions,” Lisberger explains. “But the movements aren’t always perfect. So we asked, what gets in the way?”

The answer, he says, is “noise,” which is defined as the difference between what is actually occurring and what the brain perceives. He offers making a foul shot in basketball as an example. If there were no noise in the neuromotor system, a player would be able to perform the same motion over and over and never miss a shot. But noise prevents even the best players in the NBA from having perfect foul-shooting percentages, he says. “Neuroscientists are interested in what limits virtuosity. Our finding is significant because it demonstrates that errors in what is seen can have a bigger impact on motor performance than errors in controlling muscles,” says co-investigator Osborne, who conducted the research.
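
To make the idea concrete, here is a minimal toy simulation, not the study's actual analysis, in which each movement error is the sum of an error in sensing the target and an error in executing the motor command. The noise levels and target speed below are invented purely to illustrate how, when sensory noise dominates, most movement variability traces back to perception rather than to the muscles.

```python
# Toy variance decomposition of movement errors (illustrative values only).
import numpy as np

rng = np.random.default_rng(1)
n_trials = 100_000
target_speed = 10.0                                # deg/s; arbitrary target velocity

sensory_noise = rng.normal(0.0, 1.0, n_trials)     # error in the brain's estimate of the target
motor_noise = rng.normal(0.0, 0.3, n_trials)       # error added when executing the command

perceived = target_speed + sensory_noise           # what the brain thinks it saw
movement = perceived + motor_noise                 # what the eyes (or limb) actually do

total_var = movement.var()
print(f"movement variability explained by sensing errors: {sensory_noise.var() / total_var:.0%}")
print(f"movement variability explained by motor errors:   {motor_noise.var() / total_var:.0%}")
```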

“By studying how the brain reduces noise, we can learn more about how it processes sensory inputs, makes decisions and executes them. Understanding how noise is reduced to very precise commands helps us understand how those commands are created,” adds Lisberger, who also is a Howard Hughes Medical Institute investigator and a UCSF professor of physiology. In the study, the research team focused on a movement that all primates, including humans, are very skilled at: an eye movement known as “smooth pursuit” that allows the eyes to track a moving target.

In a series of exercises with rhesus monkeys in which the animals would fixate on and track visual targets, the researchers measured neural activity and smooth pursuit eye movements. From these data, the team analyzed the difference between how accurately the animals actually tracked a moving object and how accurately the brain perceived the trajectory. The findings showed that the smooth pursuit system and the brain’s perceptual system were nearly equally precise. “This teaches us that these very different neural processes are limited to the same degree by the same noise sources,” says Lisberger. “And it shows that both processes are very good at reducing noise. The differences that exist are likely caused by the separate parts of the brain that are responsible for the separate processes.”

He concludes, “Because the brain is noisy, our motor systems don’t always do what it tells them to. Making precise movements in the face of this noise is a challenge. This study gives us new insights into how the brain works to do that.”

Science Daily
October 11, 2005

Original web page at Science Daily

Brain structures contribute to asthma

The mere mention of a stressful word like “wheeze” can activate two brain regions in asthmatics during an attack, and this brain activity may be associated with more severe asthma symptoms, according to a study by University of Wisconsin-Madison researchers and collaborators. Functional MRI (fMRI) scans revealed that asthma-related terms stimulated robust responses in two brain regions, the anterior cingulate cortex and the insula, and that these responses were strongly correlated with measures of lung function and inflammation. Responses to other types of words were not strongly associated with lung function or inflammation.

The two brain structures are involved in transmitting information about the physiological condition of the body, such as shortness of breath and pain levels, says Davidson, and they have strong connections with other brain structures essential in processing emotional information. “In asthmatics, the anterior cingulate cortex and the insula may be hyper-responsive to emotional and physiological signals, like inflammation, which may in turn influence the severity of symptoms,” says Davidson. The researchers suspect that other brain regions may also be involved in the asthma-stress interaction.

Science Daily
September 27, 2005

Original web page at Science Daily

Monkey brain’s ‘pitch center’ discovered

Johns Hopkins University scientists have discovered a discrete region of the monkey brain that processes pitch — the relative high and low points of sound. By recording the activity of individual brain cells as monkeys listened to musical notes, the scientists identified single neurons that recognize a middle-C as a middle-C even when played by two different instruments. “Pitch perception is a basic function of human and animal auditory systems, yet its source has remained elusive to researchers for decades,” said Xiaoqin Wang, associate professor of biomedical engineering and neuroscience. “The discovery of a pitch-processing area in the brain solves an age-old mystery of auditory research.”

Wang said pitch matters to humans because it lets us follow a sequence of sounds as a melody and hear combinations of sounds as harmony. As a result, pitch gives meaning to the patterns, tones and emotional content of speech. Given the similarities between monkey and human brains, the researchers say humans may have a similar pitch-processing region in the brain. Such a discovery might help people with hearing and speech problems.

Source: Nature

Science Daily
September 27, 2005

Original web page at Science Daily

Baby comes with brain repair kit for mum

Everyone knows that kids get their brains, or lack of them, from their parents. But it now seems that the reverse is also true. Stray stem cells from a growing fetus can colonise the brains of mothers during pregnancy, at least in mice. If the finding is repeated in humans, the medical implications could be profound. Initial results suggest that the fetal cells are summoned to repair damage to the mother’s brain. If this is confirmed, it could open up new, safer avenues of treatment for brain damage caused by strokes and Alzheimer’s disease, for example.

This is a long way off, but there are good reasons for thinking that fetal stem cells could one day act as a bespoke brain repair kit. It is already well known that during pregnancy a small number of fetal stem cells stray across the placenta and into the mother’s bloodstream, a phenomenon called microchimerism. They can survive for decades in tissues such as skin, liver and spleen, where they have been shown to repair damage.

Nature’s ploy to “treat mother” makes evolutionary sense too, because the fetus has a better chance of survival if the mother is fit and healthy both during and after pregnancy. But nobody had seen this effect in the brain until now. “This is the first study to show conclusively that fetal cells cross the blood-brain barrier,” says Diana Bianchi, a world authority on microchimerism at Tufts University School of Medicine in Boston, Massachusetts. A team led by Gavin Dawe of the National University of Singapore and Xiao Zhi-Cheng of Singapore’s Institute of Molecular and Cell Biology showed that once the stowaways enter mouse brains, they mature into different cell types.

These include cells resembling neurons, which transmit electrical impulses; astrocytes, which support neurons; and oligodendrocytes, which ensheath and protect nerve cells. “They can become almost all major cell types found in the brain,” Dawe says. The researchers have not yet demonstrated whether the cells are functional, however. “We need to know, for example, whether fetal cells expressing characteristics of neuronal cells can actually fire action potentials and synapse with native cells in the mother’s brain,” he adds. To make fetal stowaways easy to spot in samples from the mother’s brain, Dawe and Xiao mated normal female mice with male mice genetically engineered so that their cells contained a fluorescing protein derived from jellyfish, making the cells glow bright green.

This revealed that the fetal cells did not spread evenly. When the researchers induced stroke-like injuries to the brains of some of the mother mice, the fetal cells became six times more concentrated at the damaged areas, suggesting they may be involved in repair. Dawe says it is not yet clear how they are summoned to the sites of injury, but he suspects they are drawn there by “SOS-like” signalling factors from damaged tissue. The team is also trying to identify surface molecules unique to the brain-bound fetal cells, and hopes to isolate human counterparts from umbilical blood or bone marrow. This would be vital for medical applications, as a large number of cells might be needed to have any medical effect. “It would be important to enrich for ones that can cross the blood-brain barrier,” Dawe says.

A big potential advantage of using fetal cells as a treatment is that they could simply be injected into the bloodstream and left to find their own way into the brain. This would make it possible to treat conditions with diffuse injury, such as Alzheimer’s disease. The only existing way of getting cells into the brain to treat injured or defective areas is to inject them directly through the skull into the area where they are required. Parkinson’s disease, for example, has been treated by injecting cells that make the neurotransmitter dopamine into the region of the brain that fails to make the substance in Parkinson’s patients. Some researchers have shown, however, that injected fetal cells are capable of migrating across the brain to sites of damage.

Dawe and Xiao warn that it could take anything from five to 20 years to develop treatments, not least because in some cases, fetal cells have been shown to aggravate immunological disease. “It’s important we know it’s safe and of benefit before we try it in patients,” Dawe cautions. One key step will be to establish beyond doubt that the effect seen in mice happens in humans too. Dawe says this can be done by looking for cells containing a Y chromosome in post-mortem brain tissue from mothers of boys. “We’ve already started work on acquiring tissue to answer this question,” he says.

Bianchi says that most research on microchimerism in mice has later been borne out in humans. “In every aspect, the trend has been the same,” she says. But even if the phenomenon does occur in people, there are many hurdles to be cleared on the way to developing treatments. “It’s unclear how long the newly arrived cells will live, and how well they would integrate into the specialised functional networks of the brain,” says Jakub Tolar, a specialist in microchimerism at the University of Minnesota in Minneapolis. “Predictions of clinical use must come with caution and reflection.”

New Scientist
September 13, 2005

Original web page at New Scientist

New research suggests heart bypass surgery increases risk of Alzheimer’s disease

Boston University School of Medicine (BUSM) researchers have discovered that patients who have either coronary artery bypass graft surgery or coronary angioplasty are at an increased risk of developing Alzheimer’s disease. “The coronary bypass patients had a 70 percent increased risk of developing Alzheimer’s disease,” said Wolozin, co-author of the study. “This increased incidence of neurocognitive degeneration associated with heart bypass surgery provides further incentive for more studies to better characterize the risks of cardiac surgery on the brain.” According to Wolozin, previous studies show some heart surgery patients experience memory problems immediately following the procedure. However, at a one-year follow-up most patients regain cognitive function.

Researchers believe this early cognitive impairment is an immediate reaction to the stress of surgery. “Heart bypass surgery represents a traumatic insult to the brain, particularly by reducing oxygen supply to the brain and increasing the stress response,” said Wolozin. “We believe that the compensation that occurs by one year masks an underlying deficit in the central nervous system caused by the heart surgery. As individuals age, this underlying deficit might exacerbate progressive cognitive deficits associated with mild cognitive impairment, a precursory phase before diagnosis of Alzheimer’s.”

Wolozin and his researchers are currently working with researchers from the Framingham Heart Study to determine if these same observations can be duplicated in their studies. “If these observations are confirmed, there are measures that can be taken to protect the brain during heart bypass surgery,” explained Wolozin. “Antioxidants might offer some protection, as well as memantine, a medication that helps slow the progression of Alzheimer’s disease. There may also be other neuroprotective agents still in development that could shield the brain from cognitive degeneration during and following surgery.”

Science Daily
September 13, 2005

Original web page at Science Daily

Neurotransmitter orexin tied to pleasure

University of Pennsylvania School of Medicine researchers have discovered the neurotransmitter orexin influences reward-processing in the brain. By identifying the relationship between orexin neurons and behaviors associated with reward seeking, drug relapse and addiction, researchers hope to find new treatments for drug addiction.

Communication between the lateral hypothalamus, where orexin neurons reside, and the brain’s reward-related regions suggests orexin neurons may have a role in motivation and reward-seeking behavior. Glenda Harris and Gary Aston-Jones examined orexin function in rats using a behavioral test aimed at mimicking food and drug reward-seeking and drug relapse. They found a strong association between the activation of orexin neurons in the lateral hypothalamus and reward-seeking for morphine, cocaine and food. “The lateral hypothalamus has been tied to reward and pleasure for decades, but the specific circuits and chemicals involved have been elusive,” said Aston-Jones.

“This is the first indication that the neuropeptide orexin is a critical element in reward-seeking and drug addiction,” he said. “These results provide a novel and specific target for developing new approaches to treat addiction, obesity and other disorders associated with dysfunctional reward processing.” The findings appear online in Nature.

Science Daily
September 13, 2005

Original web page at Science Daily

Neuroscientists identify how trauma triggers long-lasting memories in the brain

A research team led by UC Irvine neuroscientists has identified how the brain processes and stores emotional experiences as long-term memories. The research, performed on rats, could help neuroscientists better understand why emotionally arousing events are remembered over longer periods than emotionally neutral events, and may ultimately find application in treatments for conditions such as post-traumatic stress disorder.

The study shows that emotionally arousing events activate the brain’s amygdala, the almond-shaped portion of the brain involved in emotional learning and memory, which then increases a protein called “Arc” in the neurons in the hippocampus, a part of the brain involved in processing and enabling the storage of lasting memories. The researchers believe that Arc helps store these memories by strengthening the synapses, the connections between neurons. The study will appear in today’s issue of the Proceedings of the National Academy of Sciences.

“Emotionally neutral events generally are not stored as long-term memories,” said Christa McIntyre, the first author of the paper and a postdoctoral researcher in the Department of Neurobiology and Behavior in UCI’s School of Biological Sciences, working with James L. McGaugh, research professor and a fellow at the Center for the Neurobiology of Learning and Memory. “On the other hand, emotionally arousing events, such as those of September 11, tend to be well-remembered after a single experience because they activate the amygdala.” In their experiments, the researchers placed a group of rats in a well-lit compartment with access to an adjacent dark compartment.
Because rats are nocturnal and prefer dark environments, they tended to enter the dark compartment. Upon doing so, however, they were each given a mild foot-shock — an emotional experience that, by itself, was not strong enough to become a long-lasting memory. Some of the rats then had their amygdala chemically stimulated in order to determine what role it played in forming a memory of the experience.

When they placed the rats that received both the mild foot-shock and the amygdala stimulation back in the well-lit compartment, the researchers found the rats tended to remain there, demonstrating a memory for the foot shock they had received in the dark compartment. These rats, the researchers found, also showed an increase in the amount of the Arc protein in the hippocampus. On the other hand, rats that received only the mild foot-shock and no amygdala stimulation showed no increase in Arc protein. When placed in the well-lit compartment, they tended to enter the dark compartment, suggesting they didn’t remember the foot shock. “In a separate experiment, we chemically inactivated the amygdala in rats very soon after they received a strong foot-shock,” McIntyre said. “We found the increase in Arc was reduced and these rats showed poor memory for the foot shock despite its high intensity. This also shows that the amygdala is involved in forming a long-term memory.”

The brain is extremely dynamic, McIntyre explained, with some genes in the brain, called “immediate early genes,” changing after every experience. “We know the level of the immediate early gene that makes the Arc protein increases in the brain, simply in response to an exposure to a new environment,” she said. “Our findings show that this gene makes more Arc protein in the hippocampus only if the experience is emotionally arousing or important enough to activate the amygdala and to be remembered days later.”

The researchers were surprised to find no change in the gene that produced the Arc protein when the rat’s amygdala was stimulated. “We weren’t expecting the gene to be uncoupled from the Arc protein,” McIntyre said. “We thought an activation of the amygdala would create more gene activation in the hippocampus. But we saw the same amount of the gene in the rats, regardless of the amygdala treatment. It was the Arc protein, created by the gene, that was different. This gives us new insight into the way lasting memories are stored.”

Science Daily
September 13, 2005

Original web page at Science Daily

New clues into the aging brain

Researchers have discovered a potential reason why learning and memory function declines with age: aging brains produce lower levels of critical growth factors that fuel the birth of new neurons in the hippocampus, the brain’s learning and memory center, according to a study in rats. The researchers said their findings suggest that drugs to enhance such growth factors, or other preventive therapies, might sustain neuronal growth and thus maintain learning and memory in older people. The production of new neurons in the hippocampus was known to slow dramatically by middle age in rats — the equivalent of 45 to 50 years in humans — said the researchers from Duke University Medical Center and the Durham Veterans Affairs Medical Center (VAMC). But the molecular basis for this decline has remained a mystery, they said.

In the Aug. 15, 2005, issue of the journal GLIA, published early online, the Duke/VAMC team reported that the levels of three critical growth factors — fibroblast growth factor-2 (FGF-2), insulin-like growth factor-1 (IGF-1) and vascular endothelial growth factor (VEGF) — decline dramatically in the middle-aged hippocampus of rats. These growth factors are secreted mostly by supporting cells in the brain, called astrocytes, and they are critical for enabling stem cells to produce new neurons. Their results illuminate the mechanism behind the declining production of new neurons in the dentate gyrus region of the hippocampus, where learning and memory occur, said Dr. Ashok K. Shetty, a research professor of neurosurgery and lead author of the study.

Scientists had previously speculated that newly born cells in the aging hippocampus were failing to reach their potential for one of three reasons: they were differentiating into mostly non-neuronal cells; they were not migrating to the proper brain regions; or they were failing to survive long enough. However, the Duke/VAMC team showed in an earlier study that newly born cells in middle-aged and aged rats demonstrated none of these defective behaviors. “We determined that there is no major, fundamental defect in how newly born cells behave in the aging hippocampus,” said Shetty. “There is simply less of the growth factors that drive stem cells to produce new neurons. This is encouraging news because it means we can employ strategies to increase the levels of these growth factors and see whether an increased production of new neurons can be sustained in the aging hippocampus.”

For example, regular physical exercise and exposure to enriching environments have both been shown to boost new neuron production in the hippocampus, said Shetty. While these strategies will not halt the decline, they may slow it considerably, he said. Young adult rat brains (equivalent to 20-35 years of age in humans) produce approximately 2,000-3,000 new neurons per day in the hippocampus. In contrast, by middle age (45-50 years of age), only 500-700 new neurons are born each day. From that point on, there is little further decline in neuron production, the study showed. However, the number of supporting astrocytes that produce the growth factor FGF-2 continues to decline with advancing age. Moreover, a fraction of newly born neurons in older brains show retarded growth of dendrites, the tentacle-like structures that reach out to and connect with other neurons to exchange messages. Such changes in new neuron numbers and growth may contribute to delayed memory processes as the brain ages, said Shetty.

Shetty said his research is the first to examine long-term survival of newly born neurons in aging brains. Most studies have focused on the production of new neurons at a specific age rather than neuronal behavior over a period of time. In their study, Shetty’s team tracked new neurons in young, middle-aged and old rats for five months as the cells divided, matured, differentiated and migrated. They observed the neurons’ behavior and measured the levels of growth factors at each age to determine how new neuron production and development progressed in each age group.

Future studies will focus on developing strategies to sustain increased neuron production in the aged brain and examining whether increased production of new neurons in the senescent hippocampus will improve learning and memory function in the aged.

Source: Duke University Medical Center

Bio.com
August 30, 2005

Original web page at Bio.com

Nerve cells’ power plants caught in a traffic jam

Nerve cells need lots of energy to work properly, and the energy needs to be delivered to the right place at the right time. By inducing a mutation in fruit flies, researchers have figured out that a particular gene governs the movement of cells’ energy-producing units, called mitochondria. Rather than moving to the ends of the cells, or synapses, where cell-to-cell communication takes place, mitochondria in mutant fruit flies just piled up in the center of the cell. Even so, the mutant cells could still transmit signals, although not as well.

The findings are surprising because scientists had thought any disruption in normal mitochondrial behavior would be lethal in the embryo stage. Instead, the mutant fruit fly larvae survive for five days, although they don’t live to adulthood. “Everyone believed that mitochondria are essential at synapses — and this is wrong,” said Konrad E. Zinsmaier, the University of Arizona associate professor of neuroscience who led the research team. “The mutation allows us to study what mitochondria are really good for.” The finding provides scientists with additional insight into how nerve cells work and provides a basis for understanding how such dysfunctions cause neurodegenerative diseases.

The researchers will publish their findings in the August 4 issue of the journal Neuron. Little is known about what causes mitochondria to become dysfunctional and how they contribute to neurological disorders. To learn more about what could go wrong with the energy units, Zinsmaier and his colleagues induced a mutation in the gene encoding the fruit fly mitochondrial protein dMiro, which stands for Drosophila mitochondrial Rho-like GTPase. Molecular motors shuttle mitochondria within cells along cellular highways called microtubules. Normally, the mitochondria travel the length of the neuron until they reach the synapse. The mutation in the dMiro protein disabled the motor, disrupting the normal pattern of mitochondrial distribution.

The nerves’ synapses are where one nerve cell connects and communicates with other cells. For example, muscle cells contract when they receive the proper signals from nerve cells. Abnormal mitochondrial distribution within a neuron alters its ability to signal properly to adjoining muscle or nerve cells. Instead of cruising smoothly along the microtubules, the mitochondria in mutant cells become caught in a traffic jam at the entrance ramp, located in the cell’s center. Even though the synapses of the mutants are entirely devoid of mitochondria, the neuronal function remained intact at low levels of stimulation. But at high levels of stimulation, the mutated nerve cells failed.

Zinsmaier is now questioning the purpose of the mitochondria at the synapse. “How important are mitochondria?” he said. “We were surprised at how long the system could survive without them.” Zinsmaier explained that there may be a compensatory mechanism in place that is able to deal with minor mitochondrial dysfunction within the nerve. Besides providing energy, mitochondria carry out other tasks important for cell survival. One important task is taking up excess calcium: calcium is essential for proper neuron function, but too much of it can lead to cell death. Zinsmaier hypothesizes that there could be a specialized communication system within neurons in which another cell component cooperates with mitochondria to properly store calcium.

While he has begun to piece together several theories, Zinsmaier explained that it remains unclear exactly how the compensation occurs. “The real surprise is that there are mechanisms in place that can manage the system somehow,” he said. “We didn’t know about them.” The findings made by Zinsmaier and his colleagues have significant implications for neurobiologists, who may now begin looking more closely at defects in mitochondrial transport. Alterations in this process may help explain how and why human neurological diseases, such as muscular dystrophy and spastic paraplegia, develop.

Science Daily
August 30, 2005

Original web page at Science Daily

Nighttime dying linked to sleep apnea from brain cell loss

Aim to grow old and die peacefully in your sleep? Be careful what you wish for. A new UCLA study suggests that some people die in their sleep because they stop breathing due to a cumulative loss of cells in the brain’s breathing command-post. The online edition of Nature Neuroscience reports the findings on Aug. 7. “We wanted to reveal the mechanism behind central sleep apnea, which most commonly affects people after age 65,” explained Jack Feldman, principal investigator and Distinguished Professor of Neurobiology at the David Geffen School of Medicine at UCLA. “Unlike obstructive sleep apnea – in which a person stops breathing when their airway collapses — central sleep apnea is triggered by something going awry in the brain’s breathing center.”

Feldman’s team had earlier pinpointed a brainstem region they dubbed the preBötzinger complex (preBötC) as the command post for generating breathing in mammals, and identified a small group of preBötC neurons responsible for issuing the commands. This time, the researchers studied the role of the preBötC neurons in generating breathing during sleep, and what would happen if these brain cells were destroyed. The scientists injected adult rats with a cell-specific compound to target and kill more than half of the specialized preBötC neurons. Then the team monitored the rats’ breathing patterns. After four or five days, the results proved visibly dramatic.

“We were surprised to see that breathing completely stopped when the rat entered REM sleep, forcing the rat to wake up in order to start breathing again,” said Leanne McKay, postdoctoral fellow in neurobiology. “Over time, the breathing lapses increased in severity, spreading into non-REM sleep and eventually occurring when the rats were awake, as well.” Because mammals’ brains are organized in a similar fashion, the scientists believe that the rat findings are relevant to the human brain. Rats possess 600 specialized preBötC cells, and Feldman theorizes that humans have a few thousand, which are slowly lost over a lifetime.

“Our research suggests that the preBötzinger complex contains a fixed number of neurons that we lose as we age,” said Feldman. “Essentially, we sped up these cells’ aging process in the rats over several days instead of a lifetime.” Long before the rats had difficulty breathing when awake, they developed a breathing problem during sleep. The UCLA team suspects the same thing happens as people grow older. “We speculate that our brains can compensate for up to a 60 percent loss of preBötC cells, but the cumulative deficit of these brain cells eventually disrupts our breathing during sleep. There’s no biological reason for the body to maintain these cells beyond the average lifespan, and so they do not replenish as we age,” said Feldman. “As we lose them, we grow more prone to central sleep apnea.”

When elderly but otherwise healthy people die during sleep, physicians commonly record the cause of death as heart failure. The UCLA team believes that the loss of preBötC neurons sparks central sleep apnea, causing elderly people, whose lungs and hearts are already weakened by age, to stop breathing and die in their sleep, with the true cause of death going unrecognized. The scientists suspect central sleep apnea also strikes people suffering the late stages of neurodegenerative disorders, such as Parkinson’s disease, Lou Gehrig’s disease and multiple system atrophy, all serious conditions that lead to movement problems.

“People with these diseases breathe normally when they are awake, but many of them have breathing difficulties during sleep,” said Wiktor Janczewski, assistant researcher in neurobiology. “When central sleep apnea strikes, they are already very ill and their sleep-disordered breathing may go unnoticed. As the patients grow sicker, their nighttime threshold for wakefulness rises,” he added. “Eventually, their bodies reach a point when they are unable to rouse themselves from sleep when they stop breathing, and they die from lack of oxygen.”

The UCLA team will repeat their research with elderly rats in order to learn why central sleep apnea first strikes during REM sleep. The group also plans to analyze the brains of people who die from neurodegenerative diseases to determine whether these patients show damage in their preBötzinger complexes.

Science Daily
August 30, 2005

Original web page at Science Daily

Gradient guides nerve growth down spinal cord

The same family of chemical signals that attracts developing sensory nerves up the spinal cord toward the brain serves to repel motor nerves, sending them in the opposite direction, down the cord and away from the brain, report researchers at the University of Chicago in the September 2005 issue of Nature Neuroscience (available online August 14). The finding may help physicians restore function to people with paralyzing spinal cord injuries.

Growing nerve cells send out axons, long narrow processes that search out and connect with other nerve cells. Axons are tipped with growth cones, bearing specific receptors, which detect chemical signals and then grow toward or away from the source. In 2003, University of Chicago researchers reported that a gradient of biochemical signals known as the Wnt proteins acted as a guide for sensory nerves. These nerves have a receptor on the tips of their growth cones, known as Frizzled3, which responds to Wnts.

In this paper, the researchers show that the nerves growing in the opposite direction are driven down the cord, away from the brain, under the guidance of a receptor, known as Ryk, with very different tastes. Ryk sees Wnts as repulsive signals. “This is a remarkable example of the efficiency of nature,” said Yimin Zou, Ph.D., assistant professor of neurobiology, pharmacology and physiology at the University of Chicago. “The nervous system is using a similar set of chemical signals to regulate axon traffic in both directions along the length of the spinal cord.”
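
As a rough illustration of the logic, the sketch below simulates growth cones reading a single Wnt gradient in one dimension. The exponential concentration profile, step sizes and starting positions are invented for illustration only; all that follows the article is the sign of the response, with Frizzled3-bearing axons attracted up the gradient toward the brain and Ryk-bearing axons repelled down the cord.

```python
# Schematic 1-D sketch of attraction versus repulsion along one Wnt gradient.
import math

def wnt(z_mm, decay_mm=2.0):
    # Assumed Wnt concentration profile: highest at the top of the cord (z = 0 mm)
    return math.exp(-z_mm / decay_mm)

def grow(z_start_mm, receptor, steps=60, step_mm=0.1):
    z = z_start_mm
    attracted = receptor == "Frizzled3"            # Frizzled3 attracts, Ryk repels
    for _ in range(steps):
        slope = wnt(z + 1e-3) - wnt(z - 1e-3)      # local change of Wnt with depth
        climb = math.copysign(step_mm, slope)      # step that moves toward higher Wnt
        z += climb if attracted else -climb        # repelled growth cones go the other way
        z = max(z, 0.0)                            # the cord ends at the brain end (z = 0)
    return z

print("sensory axon (Frizzled3), started 5 mm down:", round(grow(5.0, "Frizzled3"), 1), "mm from the top")
print("motor axon (Ryk), started 0.5 mm down:      ", round(grow(0.5, "Ryk"), 1), "mm from the top")
```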

It may also prove a boon to clinicians, offering clues about how to grow new connections among neurons to repair or replace damaged nerves. Unlike many other body components, damaged axons in the adult spinal cord cannot adequately repair themselves. An estimated 250,000 people in the United States suffer from permanent spinal cord injuries, with about 11,000 new cases each year.

This study focused on corticospinal neurons, which control voluntary movements and fine-motor skills. These are some of the longest cells in the body. The corticospinal neurons connect to groups of neurons along the length of the spinal cord, some of which reach out of the spinal cord. They pass out of the cord between each pair of vertebrae and extend to different parts of the body, for example the hand or foot. Zou and colleagues studied the guidance system used to assemble this complex network in newborn mice, where corticospinal axon growth is still underway. Before birth, axons grow out from the cell body of a nerve cell in the motor cortex. The axons follow a path down through the brain to the spinal cord.

By the time of birth, the axons are just growing into the cord. During the first week after birth they grow down the cervical and thoracic spinal cord until they reach their proper position, usually after seven to ten days. From previous studies, Zou and colleagues knew that a gradient of various Wnt proteins, including Wnt4, formed along the spinal cord around the time of birth. Here they show that two other proteins, Wnt1 and Wnt5a, are produced at high concentrations at the top of the cord and at progressively lower levels farther down.

They also found that motor nerves are guided by Wnts through a different receptor, called Ryk, that mediates repulsion by Wnts. Antibodies that blocked the Wnt-Ryk interaction blocked the downward growth of corticospinal axons when injected into the space between the dura and spinal cord in newborn mice. This knowledge, coupled with emerging stem cell technologies, may provide the most promising current approach to nervous system regeneration. If Wnt proteins could be used to guide transplanted nerve cells — or someday, embryonic stem cells — to restore the connections between the body and the brain, “it could revolutionize treatment of patients with paralyzing injuries to these nerves,” Zou suggests.

“Although half the battle is acquiring the right cells to repair the nervous system,” he said, “the other half is guiding them to their targets where they can make the right connections.” “Understanding how the brain and the spinal cord are connected during embryonic development could give us clues about how to repair damaged connections in adults with traumatic injury or degenerative disorders,” Zou added.

Science Daily
August 30, 2005

Original web page at Science Daily

Scientists link vascular gene to Alzheimer’s disease

Scientists at the University of Rochester Medical Center have discovered a link between a prominent developmental gene and neurovascular dysfunction in Alzheimer’s disease. The gene plays a major role in the growth and remodeling of vascular systems. But, in brain cells of people with Alzheimer’s disease, expression of the gene is low, the scientists found, revealing a new piece of the Alzheimer’s puzzle. In laboratory studies, the scientists also showed that restoration of the gene expression level in the human brain cells stimulated the formation of new blood vessels. It also increased the level of a protein that removes amyloid beta peptide, the toxin that builds up in brain tissue in Alzheimer’s disease.

In further studies, the scientists, led by Berislav Zlokovic, M.D., Ph.D., deleted one copy of the gene in mice, creating echoes of the damage of Alzheimer’s, including reduced ability to grow blood vessels in the brain and impaired clearance of amyloid beta. “This is a new pathway for the study and treatment of Alzheimer’s disease,” said Zlokovic. “This gene could be a therapeutic target. If we can stop this cycle, we could slow or stop the progression of the neuronal component of this disease.”

An article by Zlokovic and his team detailing the research findings appears Sunday Aug. 14 in the online version of Nature Medicine. The article will be published in the September print edition of Nature Medicine. Zlokovic is a professor in the University of Rochester Medical Center’s Department of Neurosurgery and director of the Frank P. Smith Laboratories for Neuroscience and Neurosurgical Research. The gene targeted in the research is a homeobox gene known as MEOX2 and also as GAX. A homeobox gene encodes proteins that determine development. Zlokovic calls it a “big boss.”

The scientists studied human brain endothelial cells taken from autopsy samples from people with Alzheimer’s. They found that expression of MEOX2, or mesenchyme homeobox 2, is low in the cells of those with Alzheimer’s. “The cells with low levels can’t form any kind of vascular system or any kind of network,” Zlokovic said. “They just start dying.”

By restoring expression of the gene, the Rochester scientists showed for the first time that it suppresses a specific transcription factor. When the expression of MEOX2 is low, that factor “rampages” and allows apoptosis, or programmed cell death, in the brain vascular system, Zlokovic said. The research also showed that when MEOX2 expression is low, a protein that helps with the clearance of amyloid beta is suppressed.

Zlokovic views the findings reported in Nature Medicine as support for his belief that Alzheimer’s is a neurovascular disease. “If you find a problem in the brain, it doesn’t necessarily mean that it started in the brain,” he said. “It’s not that neuronal injury is not important. It’s that other things are more important.” But Zlokovic said that it is not clear yet whether the low expression of the gene results in the death of brain cells and Alzheimer’s disease, or whether disease in the neurons results in the low expression of the gene. “But if we can restore the dysfunctional gene, we might be able to slow or stop the disease wherever it started,” Zlokovic said.

Science Daily
August 30, 2005

Original web page at Science Daily

Brain scientists offer insight into vision

A team of neuroscientists report in the July 21 issue of the science journal Neuron how neuron clusters in the brain overlap to communicate such combined visual information as a flower’s color, shape and distance. The team, including Dezhe Z. Jin, Penn State assistant professor of physics and an affiliate of the Penn State Neuroscience Institute, performed the research at the Picower Center for Learning and Memory at the Massachusetts Institute of Technology. The team’s research suggests that multitasking may be fundamental to the way the brain works. “Since every part of the cortex has neurons that are involved in multiple tasks, there is every reason to think that this is a deep principle of brain organization,” said Mriganka Sur, the Sherman Fairchild professor of neuroscience and head of MIT’s Department of Brain and Cognitive Sciences.

In the visual cortex, neighboring neurons detect objects in neighboring regions of space, creating an image or map of the visual scene. Neurons are clustered according to their ability to detect different properties — such as the vertical or horizontal edge of an object or whether the object is being seen by the left eye or the right — but they need to overlap so each combination of features can be represented by the cortex. If the clusters did not overlap with each other the correct way, then we would have “blind spots” for certain feature combinations. For example, in certain regions of the visual scene we might detect vertical edges with only the left eye, or horizontal edges with only the right eye. In some species’ brains, a square region of the visual image is represented by a square region of the cortex. But in other species, the visual cortex is distorted, causing a square region in the visual image to be represented by a rectangular region of cortex.

“Our study shows that the distortion in the mapping of the visual scene onto the cortex has an influence on clustering that Teuvo Kohonen’s formulas predicted,” Jin said. “The shapes of the clusters of neurons representing similar orientations, and of those representing each eye, are distorted in such a way that each feature combination still can be detected in each part of space.” The researchers comment that the visual cortex’s solution to accommodating several parameters probably holds true for other brain regions, such as those involving hearing. “Hearing, like seeing, has multiple parameters: location of a sound in space, frequency and relative activation of the two ears,” Farley said. “Maybe mapping multiple dimensions this way is a general strategy the brain uses when it faces this problem.”
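
The clustering principle the article attributes to Kohonen’s formulas can be illustrated with a generic self-organizing map, sketched below. This is not the researchers’ model; the grid size, learning schedule and toy feature set (retinal position, edge orientation, eye of origin) are arbitrary choices meant only to show how one cortical sheet can map several stimulus dimensions at once, with feature clusters overlapping across the same grid.

```python
# Minimal Kohonen self-organizing map (SOM) sketch on toy visual features.
import numpy as np

rng = np.random.default_rng(0)
grid_h, grid_w, n_features = 20, 20, 4            # model cortical sheet, 4 stimulus features
weights = rng.random((grid_h, grid_w, n_features))
ii, jj = np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij")

def sample_stimulus():
    # Toy feature vector: retinal x, retinal y, edge orientation, eye of origin
    return np.array([rng.random(), rng.random(), rng.random(), float(rng.integers(0, 2))])

def train(weights, n_steps=20000, lr0=0.5, sigma0=5.0):
    for t in range(n_steps):
        x = sample_stimulus()
        # Best-matching unit: the grid cell whose weight vector is closest to the input
        d = np.linalg.norm(weights - x, axis=2)
        bi, bj = np.unravel_index(np.argmin(d), d.shape)
        # Learning rate and neighbourhood radius shrink as training proceeds
        frac = t / n_steps
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 0.5
        # Gaussian neighbourhood: units near the winner are pulled toward the stimulus
        h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2.0 * sigma ** 2))
        weights += lr * h[:, :, None] * (x - weights)
    return weights

weights = train(weights)
# After training, neighbouring units prefer neighbouring retinal positions, while
# orientation and eye preference form overlapping patches across the same sheet.
print(np.round(weights[..., 2], 2))    # crude view of the resulting orientation map
```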

Science Daily
August 16, 2005

Original web page at Science Daily

Cracking the perception code

The brain may interpret the information it receives from sensory neurons using a code more complicated than scientists previously thought, according to new research from the National Autonomous University of Mexico and Cold Spring Harbor Laboratory. By studying how monkeys perceive a vibrating object when it touches the skin, scientists found that changes in an animal’s attention over time influence how a sensory signal is interpreted.

Neuroscientists already knew that touching the skin with a vibrating object causes specialized sensory neurons in the brain to fire, and that firing of these neurons, which are found in a region of the brain known as the primary somatosensory cortex, is directly related to monkeys’ ability to tell how fast something is vibrating, Romo said. But the neurons’ firing patterns are complex, and it’s been tricky to tease out “which component of the neuronal activity was more likely associated with behavioral performance,” he explained. Theoretically, there are many ways in which neurons could relay information about stimulus frequency, Romo said. Frequency information might be encoded in the time between consecutive neuronal firings, the overall rate of firing, or the number of times a neuron fires.

To distinguish among these possibilities, Romo and his colleagues designed an experiment in which they touched the monkeys’ fingertips with a vibrating but painless probe for varying lengths of time. The monkeys were first taught to respond to varying vibration frequencies; in a training session, the scientists touched the monkeys twice in a row, with the probe vibrating at a different frequency each time. The monkeys signaled to the experimenters which stimulus was vibrating faster, and, when they were correct, they were rewarded with a treat.

The standard stimulus that the scientists trained the monkeys to respond to lasted 500 milliseconds (half a second). They found that when they used a stimulus that lasted 750 milliseconds instead, the monkeys consistently thought the probe was vibrating with a higher frequency than it actually was. The same thing happened in reverse; if a stimulus was given for only 250 milliseconds, the monkeys thought it was vibrating at a lower frequency. The effect was stronger for the shortened stimulus than for the lengthened stimulus, Romo noted.

Based on this experiment, it seemed most likely that the monkeys were determining the vibration frequency by the number of times the neurons fired, Romo said, since the firing rate and time between firings wouldn’t change just because the stimulus duration changed. The scientists knew they hadn’t quite cracked the neural code, though, because the magnitude of the effect wasn’t right; the monkeys judged a stimulus that was 50 percent shorter to be vibrating at only a slightly lower frequency than it was, not 50 percent lower.

To find the cause of this discrepancy, they recorded electrical activity in single neurons of the primary somatosensory cortex. Since the shortened stimulus had produced a greater effect than the lengthened stimulus, the researchers wondered if the first part of the response might be more significant in determining vibration frequency. They explored two possible mechanisms of action: the neural firing response could adapt to the stimulus over time, making the neurons more sensitive at the beginning than at the end, or a perceptual process after neuronal firing could give more subjective weight to the beginning of the response.

Looking at the electrical responses from single neurons, Romo and his colleagues determined that, if all the neuronal firings were treated equally, these responses could not explain the monkeys’ perception of the signal. If the researchers assumed that the monkeys paid more attention to the beginning of the response, however, the neural activity perfectly explained the monkeys’ errors when judging different durations of stimuli.

Romo suggested that the best explanation for the behavioral data was to assume that the monkeys pay the most attention to the first 250 milliseconds of neural firing, and that their attention falls off exponentially from there. The longer the stimulus, the less important additional neuronal firings become to the monkeys’ perception of how fast the stimulus is vibrating, even though they continue to pay some attention throughout.
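
A back-of-the-envelope calculation shows how such a front-weighted readout reproduces the pattern of errors described above. The exponential weighting, its time constant and the assumption that the readout is calibrated to the 500-millisecond training stimulus are illustrative guesses, not parameters taken from the study.

```python
# Toy model: perceived frequency read from an exponentially weighted spike count.
import math

def weighted_count(freq_hz, duration_s, tau_s=0.25):
    # Expected spike count when firing rate is proportional to frequency and each
    # spike is weighted by exp(-t / tau) measured from stimulus onset
    return freq_hz * tau_s * (1.0 - math.exp(-duration_s / tau_s))

def perceived_frequency(freq_hz, duration_s, reference_s=0.5, tau_s=0.25):
    # Read the weighted count against the 500 ms stimulus the monkeys were trained on
    return freq_hz * weighted_count(1.0, duration_s, tau_s) / weighted_count(1.0, reference_s, tau_s)

for duration in (0.25, 0.5, 0.75):
    print(f"a 20 Hz vibration lasting {int(duration * 1000)} ms reads as "
          f"{perceived_frequency(20.0, duration):.1f} Hz")
# The 250 ms stimulus reads noticeably low and the 750 ms stimulus only slightly
# high, matching the asymmetry in the monkeys' errors described above.
```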

Figuring out how the brain codes sensory information into neuronal firing and how the firing patterns are interpreted by perceptual areas of the brain is a huge challenge in neurophysiology, one that’s often overlooked, said Romo. “The neuronal correlates reported in most of the neurophysiological studies in the different sensory modalities simply do not pay attention to this,” he noted. “They assume that variation in firing rate is enough as a measure.”

Source: Howard Hughes Medical Institute

Bio.com
August 16, 2005

Original web page at Bio.com

Echolocation calls and communication calls are controlled differentially in the brainstem of the bat Phyllostomus discolor

Echolocating bats emit vocalizations that can be classified either as echolocation calls or communication calls. Neural control of both types of calls must govern the same pool of motoneurons responsible for vocalizations. Electrical microstimulation in the periaqueductal gray matter (PAG) elicits both communication and echolocation calls, whereas stimulation of the paralemniscal area (PLA) induces only echolocation calls. In both the PAG and the PLA, the current thresholds for triggering natural vocalizations do not habituate to stimuli and remain low even for long stimulation periods, indicating that these structures have relatively direct access to the final common pathway for vocalization. This study aimed to clarify whether echolocation calls and communication calls are controlled differentially below the level of the PAG via separate vocal pathways before converging on the motoneurons used in vocalization.

Both structures were probed simultaneously in a single experimental approach. Two stimulation electrodes were chronically implanted within the PAG in order to elicit either echolocation or communication calls. Blockade of the ipsilateral PLA site with iontophoretic application of the glutamate antagonist kynurenic acid did not impede either echolocation or communication calls elicited from the PAG. However, blockade of the contralateral PLA suppressed PAG-elicited echolocation calls but not communication calls. In both cases the blockade was reversible.

The neural control of echolocation and communication calls seems to be differentially organized below the level of the PAG. The PLA is an essential functional unit for echolocation call control before the descending pathways converge again on the final common pathway for vocalization.

BioMed Central
August 16, 2005

Original web page at BioMed Central

Memory loss reversed in mice

Mice with memory loss have had their condition reversed, a discovery that should help refine the search for a cure for Alzheimer’s disease and other dementias. The study also helps clarify the actual cause of dementia, which should give more focus to drug studies. The brains of people with Alzheimer’s and some 50 other forms of dementia are known to have certain characteristic features, including messy bundles of fibres in nerve cells called neurofibrillary tangles. But no one has been sure whether the tangles are a cause or symptom of dementia.

Mice engineered to massively overproduce a protein called tau tend to grow more of the tangles and display the same problems with memory and learning as humans with dementia. Researchers think that it is a certain version of the tau protein, rather than a simple over-abundance, that leads to the tangles. It has been speculated that these tau proteins, rather than the tangles, kill nerve cells. Karen Ashe, a neurobiologist at the University of Minnesota Medical School in Minneapolis, and her colleagues hoped to untangle this mystery. They trained mice to navigate a maze partly submerged in water, and watched for signs of memory loss. By the age of three months, mice genetically engineered to express 13 times too much tau protein couldn’t remember the route to dry land, and had developed tangles in their brains. But surprisingly, when the researchers turned off the switch promoting tau expression, the mice began to gain back some lost memory.

The team reports in Science that the performance of the ‘switched-off’ engineered mice was roughly half as good as their normal counterparts, and twice as good as those that continued to overproduce tau. And their performance improved even though the tangles in their brains remained. The results indicate that some form of the tau protein, and not the tangles it promotes, is responsible for dementia-related memory loss. But researchers are not yet sure which version of the tau protein causes problems in the brain. That’s the next step, says Ashe: “We have to figure out the molecular form of tau that is poisoning the neurons.” That should give drug developers a better understanding of the molecules they should target. But researchers caution that in the context of dementia, this is just half the story.
Alzheimer’s patients also have plaques in their brains made of a protein compound called beta-amyloid. Most think that this also plays a role in causing memory loss. “This is a two-protein disease,” says Ashe.

Nature
August 2, 2005

Original web page at Nature

Categories
News

Targeting neural stem cell treatments

University of California Irvine School of Medicine researchers have discovered how new neurons born from endogenous neural stem cells are sent to regions of the brain where they can replace old and dying cells, a finding that suggests how stem cell therapies can be specifically targeted to brain regions affected by neurodegenerative diseases or by stroke. Associate Professor Qun-Yong Zhou and graduate student Kwan L. Ng in the UCI Department of Pharmacology have identified a protein that guides these new neurons to a particular brain region. The protein, a small peptide called prokineticin 2 (PK2), was found to play a key regulatory role for the proper functional integration of these new neurons in the brain. A few years ago, PK2 was shown by the same research group to be an important regulator of circadian rhythms. The current study appeared in the June 24 issue of the journal Science.

“One of the keys to developing promising new therapies for debilitating neurodegenerative diseases lies in our understanding of how new neurons are created and integrated into mature brain tissue,” Zhou said. “This protein is an attractive drug target for either boosting neuron-forming processes or stem cell-based therapies for diseases like Alzheimer’s and Parkinson’s, or for stroke and other brain injuries.” While all neurons are originally born and differentiated from their stem cell progenitors during development, the adult brain maintains at least some regions where neural stem cells create new neurons to replace old and dying ones. One area is the subventricular zone of the lateral ventricles, which are fluid-filled cavities in both brain hemispheres connected to the central canal of the spinal cord.

Zhou and his colleagues discovered how PK2 guides the migration of neurons born from neural stem cells from the subventricular zone in the brain’s core through mature tissue to reach the olfactory bulb, the “smell” part of the brain located above the sinus cavity. PK2 allows these new neurons to settle into the proper areas of the olfactory bulb, thus permitting these neurons to function normally. “Our findings identify one of the first endogenous guidance molecules for migrating neurons in the adult brain,” Zhou said. “We are learning that molecules, like PK2, which direct the movement of neurons are crucial for neuronal replacement, and they demonstrate how adult stem cells might be manipulated for this process.”

PK2 accomplishes this task by working with its corresponding cell receptors, which are part of the G protein-coupled receptor (GPCR) family. GPCRs are proteins found in a cell’s membrane and play a critical role in transferring signals from outside of a cell to the molecular machinery within the cell. GPCRs are the largest family of proteins that serve as drug targets. It has been estimated that at least 40 percent of all medicines on the market act on this family of receptors.

Source: University of California – Irvine

Bio.com
July 19, 2005

Original web page at Bio.com

Categories
News

Rat olfaction molded early

The odors that newborn rats are exposed to appear to govern the development of synapses that carry information into the rat olfactory cortex, the seat of odor perception, researchers report in Neuron this week. Kevin Franks and Jeffry Isaacson of the University of California, San Diego, found that in newborn rats, early olfactory experiences caused changes in the relative amounts of two types of glutamate receptors in lateral olfactory tract fibers. Specifically, they observed a decrease in the number of NMDA receptors, which are believed to be important in synaptic plasticity and long-term changes, relative to AMPA receptors, which mediate fast synaptic transmission. The researchers suggest this phenomenon might be associated with “olfactory imprinting,” the strong attachment to maternal odors that occurs early in mammalian development.

“The ability of the animal to smell caused downregulation in the number of NMDA receptors,” Isaacson told The Scientist. “Very early in rat development, there is quite robust NMDA receptor-mediated, long-term potentiation from the sensory synapses into the cortex, but later in life, after the animal has had time to smell, the loss of NMDA receptors makes it difficult to induce any long-term changes in the strength of the synaptic transmission.” To study the synaptic modifications that occur during development, the authors took advantage of the layered architecture of the rat’s olfactory cortex. “The stratification makes this a nice experimental model to use,” said Kurt Illig of the University of Virginia, who did not participate in the research. “The authors were able to selectively stimulate different types of cells and look at the development of the responses for each of those layers independently in an in vitro preparation. The loss of NMDA receptors they observed could be a mechanism by which early olfactory experience shapes the cortex to respond to particular odors.”

To test for the role of sensory experience in the synaptic changes, the authors occluded one of the nostrils in newborn rats, depriving one side of the brain of olfactory stimulation, and compared the two sides of the brain in each animal. “This is a great technique because the olfactory information in the brain is ipsilateral,” said Isaacson. The results pointed to the existence of a critical period during which sensory synapses are especially plastic, a phenomenon that also has been shown in the visual, auditory, and somatosensory systems. “Olfactory deprivation caused loss of NMDA receptors in young rats, but not in rats 2 months old or so,” he said. “There have been very few researchers who have looked at how experience can modify the olfactory cortex,” said Ben Philpot of the University of North Carolina at Chapel Hill, who wrote a related preview. “This work shows that you can have changes with olfactory experience in the cortex very early on.”

Philpot added that the changes in the receptor levels may also be involved in the pruning back of exuberant projections from the olfactory bulb to the olfactory cortex. “This would be a nonexclusive possibility, equally exciting,” he said. According to Takao Hensch of the Riken Brain Science Institute in Saitama, Japan, who was not part of the research team, the results bode well for identifying a critical period in the olfactory system. “What Isaacson et al. will need to show in the future is that the NMDA-mediated events at lateral olfactory tract synapses indeed have behavioral consequences. Does nostril occlusion delay the critical period for imprinting, as they would suggest?”

The Scientist Daily
July 19, 2005

Original web page at The Scientist

Categories
News

Can a jab keep brain trouble away?

A vaccine developed to fight brain disorders such as Parkinson’s disease has shown promise in preliminary animal trials. But experts caution that the positive results may not translate into an effective treatment for humans. The formation of abnormal protein aggregates in the brain, known as Lewy bodies, has been linked to several neurological disorders in adults. These include Parkinson’s disease, a condition that can involve slow movements, tremors and impaired coordination. Genes may predispose someone to the disease, say researchers. Others point to environmental toxins as a potential trigger.

Whatever the cause, doctors currently lack a cure for Parkinson’s disease and related Lewy-body illnesses. Many think that getting the immune system to attack the protein aggregates is a good step towards finding a treatment. So several research teams have been pursuing therapeutic vaccines. Biologists have already succeeded in giving mice specially designed immune cells to save them from neurological damage. Now they have gone a step further by getting mice to produce their own immune protection through a series of injections.

The vaccine, developed by Leslie Crews of the University of California, San Diego, and colleagues, is based on the protein in Lewy bodies, known as alpha-synuclein. An overabundance of this protein, which acts at the tips of nerve cells, apparently creates these aggregates in mice. Animals genetically engineered to overproduce the protein also exhibit Parkinsonian symptoms. The team gave such mice monthly injections of their vaccine, and monitored the response. About half of the animals produced high levels of antibodies that fought the creation of Lewy bodies. “The vaccine helped to clear the alpha-synuclein from the brain,” Crews explains. After eight months of treatment, the older mice that received the vaccine showed a 47% decrease in alpha-synuclein protein compared with their control counterparts. The findings from the study appear in the journal Neuron. But researchers caution that the vaccine is far from a sure bet to battle Parkinson’s disease. “We don’t want to get ahead of ourselves. This requires more investigation,” says Crews.

Experts point out that the mice do not develop exactly the same condition that appears in humans, so it is hard to extrapolate to people. “The biggest challenges in developing vaccines for conditions like Parkinson’s disease include the availability of animal models that best reflect human disease,” says Howard Gendelman of the University of Nebraska Medical Center in Omaha, who is also working on a vaccine approach. Kieran Breen, director of research at the Parkinson’s Disease Society in London, also notes that the Lewy bodies’ role in the disorder remains unclear; preventing them from forming may not prevent illness. “It’s not known whether they are a cause or an effect,” he says.

Nature
July 5, 2005

Original web page at Nature

Categories
News

Brain imaging works to capture the many natures of the pain experience

Just before the holidays, Kenneth Casey’s young granddaughter accidentally slammed the tip of her finger in a car door. Not surprisingly, she reacted quite strongly. Casey, a long-time pain researcher at the University of Michigan, had a look and saw there was little damage; he repeatedly reassured her that though it would hurt, she’d be OK. Very soon she calmed down. “It was amazing,” says Casey of the change in her mood.

Pain encompasses multiple components–sensory, cognitive, and emotional. The intensity of one’s pain experience stems not only from the inner workings of biological pathways, but also from one’s emotional state, expectations, and previous experiences. From nociception to transmission, the brain is the ultimate interpreter and modulator of the pain experience. Therefore, it’s not surprising that researchers believed they’d found, in brain-imaging technologies, the ultimate tool to decipher the neural correlates of pain. Positron emission tomography in the 1980s, and later functional magnetic resonance imaging (fMRI) in the 1990s, promised to elucidate the intricacies of pain perception where electrophysiological recordings and lesion studies had fallen short.

Imaging has proved a boon for pain research, especially the easily accessible, noninvasive technology of fMRI. Nevertheless, the models used to study pain have their limitations. Progress has been made in finding brain areas that correlate with experimental pain experiences, which has provided hints about acute pain. Chronic pain has proven a greater challenge. It’s not clear how informative experimental acute pain models–popular, laboratory-friendly paradigms that involve brief pain exposures–will be for real-world, everyday pain. Jon-Kar Zubieta, director of the psychiatry division at the University of Michigan’s Depression Center, explains that there’s a significant difference between acute pain, which is a warning signal, and sustained, prolonged, unrelenting pain, which becomes a stressor that can spur more complex reactions, like depression. But some have begun to seek new methods to account for the broader dimensions of the painful experience.

“There’s definitely a well established pattern of brain activity specifically for acute, experimental, painful stimuli,” says Northwestern University associate professor of physiology Apkar Apkarian, a pain and imaging researcher. “The pattern of activity for clinical pain conditions has been studied much less and has been studied much less specifically.” Researchers investigating acute pain employ a bevy of methods for inflicting harmless pain on subjects–everything from saline injections to hot plates to lasers to the extreme cold of ice baths to balloons inflated in the subject’s esophagus. “There are a variety of torture techniques,” quips Stuart Derbyshire, an assistant professor of anesthesiology at the University of Pittsburgh. Typically, the pain is inflicted via, say, a hot plate, and the brain is imaged to reveal areas potentially involved. The hope of such acute pain experience studies is to decipher which parts of the brain are responsible for different pain components.

Twenty years ago, the thalamus was commonly believed to be necessary and sufficient for the pain experience. Lesions of the thalamus, investigators had observed, sometimes produced spontaneous pain, whereas lesions of, say, the cerebral cortex, rarely seemed to affect pain perception.
But with imaging, researchers discovered that other areas, including the anterior cingulate, the prefrontal cortex, and the parietal cortex, also have roles. “That makes sense from the point of view of pain being a complex experience with an emotional component, a sensory component, cognitive components,” says Derbyshire.
Indeed, there is no actual “seat” of pain processing in the brain. Several areas are activated systematically, none of them alone being sufficient for the painful experience.

According to Robert Coghill, an assistant professor of neurobiology at Wake Forest University, functional imaging studies can provide hints about dimensions of pain–such as intensity, location, and unpleasantness–that are likely processed by the same mechanisms involved in the processing of acute pain. “Having an acute, experimental model to explore a very poorly understood system is extremely helpful,” he says.

A host of studies have tried to elucidate the subjective experience of pain and, potentially, clinically relevant pain networks in the process. One of several researchers investigating how expectations can change the experience of pain, Derbyshire recently used the increasingly popular technique of hypnosis to demonstrate significant brain changes in hypnotized subjects (see Approaching Pain’s Layers through Hypnosis). Each was told to expect pain from a hot plate when, in fact, none was presented. Five of eight subjects reported definite pain, even though they were not exposed to an actual hot stimulus. Those subjects had significant changes within the thalamus and anterior cingulate, insula, prefrontal, and parietal cortices.

Several studies have also attempted to identify brain areas for pain with no obvious physical locale, sometimes a feature of chronic pain. In one recent study, Catherine Bushnell, director of the center for research on pain at McGill University in Montreal and among the first to widely study pain with imaging techniques, compared cutaneous and visceral pain. Generally, when pain is evoked in visceral tissue, it is referred to skin or muscle elsewhere in the body. Visceral pain is not easily localized. When Bushnell inflated balloons in a subject’s esophagus, the pain presented as an aching muscle in the chest or back.
When she applied an equally intense painful stimulus to the chest, she got the same overall pattern of activation at the secondary somatosensory and parietal cortices, thalamus, basal ganglia, and cerebellum. But she saw differences at the insular, primary somatosensory, motor, and prefrontal cortices, suggesting somewhat different brain correlates for cutaneous versus visceral pain experiences. Despite such progress, the neural correlates of chronic pain remain elusive–although studies have begun to target the topic. The physiological underpinnings of disorders like chronic low back pain and fibromyalgia syndrome, a musculoskeletal pain and fatigue disorder, still present a challenge for researchers and physicians.

Hoping to show not simply the acute, but the long-term physiological and cognitive effects of pain, Apkarian’s group compared the brains of 26 chronic back-pain patients with those of 26 age- and sex-matched normal subjects. They found regional decreases in brain gray matter mass and density that correlated with pain duration. The longer the pain persisted, the more the brain atrophied–specifically in the lateral prefrontal cortex and the thalamus. Apkarian examined these regions over time and with multiple modalities–fMRI, MR spectrometry, which details brain chemistry, and MR morphology, to observe morphological changes. “Our running hypothesis is that suffering with the pain is impacting the brain itself,” says Apkarian. “It’s making neurons dysfunctional, specifically in brain areas that are most involved in suffering with the pain, coping with the pain.” But a causal relationship between the pain experience and the morphological changes has not been established, nor have potential confounding effects of genetic predisposition been ruled out.
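As a minimal illustration of the kind of analysis described above (relating a regional gray-matter measure to how long pain has persisted), the sketch below computes a simple correlation. The numbers and variable names are hypothetical placeholders for illustration only, not the study’s data.

```python
# Minimal sketch (hypothetical values, not the study's data): correlating a
# regional gray-matter measure with chronic-pain duration.
import numpy as np
from scipy import stats

pain_duration_years = np.array([1, 2, 4, 5, 7, 9, 12, 15, 18, 22])
gray_matter_density = np.array([0.82, 0.81, 0.79, 0.80, 0.77,
                                0.76, 0.74, 0.73, 0.71, 0.69])

# A negative correlation would mirror the reported trend: longer pain, less gray matter.
r, p = stats.pearsonr(pain_duration_years, gray_matter_density)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
```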

Other recent data about pain transmission at the molecular level further reveal the complexities of pain regulation. Zubieta has done work suggesting that individuals differ in their endogenous neurotransmitter responses to pain, observations that may help explain not only clinical pain, but also the different pain experiences in men versus women. In a 2001 study, Zubieta examined the effects of endogenous, pain-suppressing opioid peptides, which interact in particular with the µ-opioid receptor and hence modulate the pain experience. Employing a pain technique that’s been around for 20 years, Zubieta’s group injected a 5% saline solution into the jaw muscles of subjects (much higher than the typical 0.9% solution used to rehydrate); then, over a period of 20 minutes, they asked subjects to rate the pain experience. Based on that rating, computers sustained the pain by modulating the saline infusion rate–it’s a method that more closely mimics a chronically painful state than those methods employing acute stimuli, says Zubieta. Subjects were scanned with and without pain. An opioid receptor marker, administered to subjects, revealed the density of opioid receptors in each case. Zubieta discovered, to his surprise, that several brain regions were releasing endogenous opioid peptides.
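The rating-driven infusion described above is, in essence, a small feedback loop: the subject’s reported pain is compared with a target level, and the infusion rate is adjusted to hold the pain roughly constant. The sketch below is only a conceptual illustration of such a loop, not the actual experimental software; the pain scale, target, gain, and rate limits are assumptions.

```python
# Conceptual sketch of a closed-loop infusion controller of the kind described
# above. All parameters (0-100 pain scale, target of 40, gain, rate limits)
# are illustrative assumptions, not the experimental protocol.

def update_infusion_rate(current_rate_ml_per_min: float,
                         reported_pain: float,
                         target_pain: float = 40.0,
                         gain: float = 0.01) -> float:
    """Raise the rate when reported pain falls below target, lower it when above."""
    error = target_pain - reported_pain
    new_rate = current_rate_ml_per_min + gain * error
    return max(0.0, min(new_rate, 0.5))  # clamp to a plausible range (ml/min)

# Example: ratings below the 40-point target nudge the rate upward until pain stabilises.
rate = 0.1
for rating in [30, 32, 35, 38, 41]:
    rate = update_infusion_rate(rate, rating)
    print(f"rating={rating:3d}  new rate={rate:.3f} ml/min")
```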

He also observed a high degree of inter-individual variability. In follow-up studies, Zubieta showed that hormonal differences played a major role. Women with low estrogen and progesterone levels released fewer endogenous peptides and therefore could tolerate less pain than men. Increase the estrogen and progesterone levels, and their pain tolerance increases. “The bottom line, women are more complex in the way they modulate the pain through these mechanisms,” says Zubieta. Such studies, says Zubieta, are more illustrative than acute pain challenges. “Acute pain studies are telling us about a different kind of condition,” says Zubieta. “It’s not that it’s not relevant; it’s just different.”

The Scientist Daily
May 24, 2005

Original web page at The Scientist

Categories
News

Cerebral navigation: How do nerve fibers know what direction to grow in?

Nervous system development requires billions of neurons to migrate to the appropriate locations in the brain and grow nerve fibers (axons) that connect to other nerve cells in an intricate network. Growth cones, structures in the tips of growing axons, are responsible for steering axons in the right direction, guided by a complex set of signals from cells they encounter along the way. Some signals lure the axons to extend and grow in a particular direction; others are inhibitory, making the axon turn away or stop growing. In two papers in the April 21 Neuron, researchers from Children’s Hospital Boston reveal important insights into how inhibitory cues affect the growth cone, and identify possible targets within axons that could be blocked to overcome this inhibition. Such intervention could possibly enable damaged axons to regenerate (normally impossible in a mature nervous system) and ultimately restore nerve function.

It’s been known that cells synthesize an inhibitory protein called ephrin, which binds to a receptor called Eph on the axon’s growth cone. But how this triggers the axon to change course or stop growing has been a mystery. “Very little has been known about the inner workings of the cell that govern axon guidance,” says Michael Greenberg, PhD, Director of the Neurobiology Program at Children’s and senior author on both studies. “These studies begin to give insight into how the various steps of axon guidance are controlled.” The first paper found that when ephrin binds to Eph receptors on the axon, it activates a protein called Vav2 in the cell’s growth cone. Activation of Vav2 induces the cell to engulf the ephrin-Eph complex, breaking the bond between the two and repelling the axon, causing it to turn away. When mice were genetically modified to lack Vav2 and the related Vav3, thereby eliminating this repellent signal, the mice had abnormal axon projections and defects in neural circuitry formation.

The second paper demonstrates the role of a protein called ephexin1 in axon guidance. By itself, ephexin1 promotes axon growth; neurons from mice genetically modified to lack ephexin1 had significantly shorter axons. But when ephrin is present and binds to Eph receptors, ephexin1 is chemically modified, causing it to alter the cell’s cytoskeleton, or internal scaffolding. This alteration makes the growth cone collapse, steering the axon in a new direction or halting its growth. In chicken motor neurons whose ephexin1 was inactivated, the axons grew into the hind limb prematurely, indicating faulty axon guidance.

“Understanding these pathways could help in understanding the process of nerve regeneration,” says Greenberg, who is also Professor of Neurology and Neurobiology at Harvard Medical School. “The mechanisms we’ve uncovered could provide opportunities for the development of therapies for spinal cord injury, targeting ephexin and possibly Vav,” he speculates, “but much more needs to be known about how ephexin, Vav and other proteins work together to coordinate axon guidance.”

Source: Children’s Hospital Boston

Bio.com
May 10, 2005

Original web page at Bio.com

Categories
News

A question of chimeras

Scientists say ruling on protest patent won’t have an impact on future chimeric-animal patents. Looking to cure a host of neurodegenerative diseases, StemCells, a Palo Alto, Calif.-based company, has transplanted human neural stem cells into the brains of thousands of mice. The mice are technically chimeras, a mix of two or more species. (The word “chimera” refers to the Greek mythological creature that has a lion’s head, a goat’s body, and a serpent’s tail.) President and CEO Martin McGlynn says his biotech company is now waiting for the FDA’s permission to test human neural stem cells – the ones already tested in mice – in human patients.

Such animals, especially mice, have been used to search for ways to cure human diseases including Parkinson’s and Alzheimer’s disease. “Having the ability to evaluate human cells in a mouse or other animal is critical to translating scientific discoveries into therapeutic medicine,” says McGlynn. “It’s the key. It’s the bridge to the clinic.” However, the use of such chimeric animals is the focus of a complicated patent case that is raising legal and ethical questions. In this case, opponents to the patenting of living things applied for a chimera patent. The US Patent and Trademark Office (USPTO) recently refused to issue a patent for the human-animal chimera in the application, on the grounds that it would have been too nearly human.

In the volatile debate over bioengineered life forms, many disagree about the ramifications of the recent case. The critics of the biotechnology industry who applied for the patent say the case has serious business and research implications. But some leading scientists and industry observers say the case is just another effort to grab attention in a field rife with more heat than rational discussion. Stuart Newman, a professor of cell biology and anatomy at New York Medical College in Valhalla, says he opposes the patenting of living things. Newman, working with Washington, DC, activist Jeremy Rifkin, filed a patent application in 1997 for a theoretical creature he never actually made. For “tactical reasons,” Newman says he eventually split his patent application into two: one involving primates and the other focused on other animals.

Using what he calls the “embryo chimera technique,” Newman sought to patent a creature combining human embryo cells with cells from the embryo of a monkey, ape, or other animal to create a blend of both. Other scientists have used similar methods to create a “geep” (part goat, part sheep), says Newman, adding that his chimera could be used for drug testing and as a source of organs to transplant into humans. After seven years and several rejections and appeals, the USPTO turned down both of Newman’s patent applications in August 2004, saying, among other things, that his creatures would be too close to human. Newman and Rifkin let the six-month appeals period lapse and declared victory in February 2005. Both Rifkin and Newman say they expect the ruling to prevent scientists and biotechs from obtaining similar patents for 20 years, the time a patent is usually viable. Rifkin says crossing species boundaries is a form of animal abuse and a violation of nature and human dignity.

“The ruling has significant implications for the future of the biotech industry,” says Rifkin, president of the nonprofit Foundation on Economic Trends, and one of the most vocal critics of biotechnology products such as genetically engineered organisms. “The implications for commercial interests are far-reaching. It means anyone applying for a patent for human-animal chimeras ought to be turned down.” Newman says he expects the ruling to affect stem cell researchers, too. “There are people who are producing or who express their intention to produce mixtures of humans and mice for research purposes in order to test the potential of human stem cells. This decision does not block their ability to do that in their labs,” says Newman, “but if they wanted to patent and market these mixed human and animal organisms, it would be more difficult for them to commercialize it.” However, some leading stem cell researchers say the case is unlikely to stop work on chimeric animals.

Twenty-five years ago, US scientist Ananda Chakrabarty, who worked for General Electric at the time, obtained the first patent on a living organism, a genetically engineered bacterium that consumes oil spills. The patent office originally denied the application, believing it could not patent living organisms, according to Brigid Quinn, USPTO spokesperson. The case landed in the US Supreme Court, which in 1980 ruled that patents could be awarded on anything that was human-made.

Since then, some 436 transgenic or bioengineered animals have been patented, including 362 mice, 26 rats, 19 rabbits, 17 sheep, 24 pigs, two chickens, 20 cows, three dogs, and many more. Many say the 1980 ruling led to the birth of biotechnology in the United States. However, Quinn notes that US law clearly prohibits the patenting of people. “One reason we denied the case was the examiner believed one or more of the claims encompassed human beings.” Asked whether the case will affect future patent applications for chimeric lab animals, Quinn says examiners always decide first if it is patentable subject matter. “Humans aren’t. Anything found in nature is not patentable subject matter,” says Quinn. “It has to be new, useful, nonobvious, and fully disclosed in writing.” Quinn wouldn’t comment on whether the case will affect future chimera patent applications. “Each patent application is reviewed on its own merits.”

Irving L. Weissman, a professor of cancer biology, pathology, and developmental biology at Stanford University, has created mice with brains that contain about 1% human tissue. Weissman says recent news reports that he plans to create a mouse with a 100% human brain are “inaccurate.” A pioneer in the field of stem cell research, Weissman is credited as being the first scientist to identify and isolate hematopoietic stem cells from mice and humans. He says that the news reports were fueled by an academic inquiry he made to find out, in theory, what his university ethics panel thought of the idea. He says he has no current plans to create such a mouse.

The Newman/Rifkin patent is “a new attempt to block science,” while the “use of human-mouse chimeras is old,” Weissman says. In 1988, J. Michael McCune patented the SCID-hu mouse, “a severe combined immunodeficient mouse with human organs, bones, lymphoid tissue, thymus, and liver,” says Weissman, who is also director of Stanford’s Institute of Cancer/Stem Cell Biology and Medicine and a cofounder of StemCells and other companies. “The precedent is there, the discoveries are long published, and people’s lives have been affected by those discoveries. Would they take back all those discoveries and be happy if the therapies discovered through them were taken away?” Weissman dismisses the Newman/Rifkin case as “typical Rifkin,” adding that “one example doesn’t hold. It doesn’t invalidate the others, so it’s a hollow victory. The case is not the precedent they think.”

McGlynn says chimeric animals, and patents, are crucial to a biotech’s ability to develop cures for human diseases. To protect its investment, for example, StemCells has more than 43 US patents on its stem cell technology, though none are on bioengineered mice. “If the private sector cannot receive a patent on all its work and invention,” he says, “it’s unlikely to engage in the work because it takes so much time and effort and money.” “The ability to retain a return on your investment is crucial,” says McGlynn, adding: “Mice are the backbone of biotechs, pharmaceuticals, and drug development.”

The Scientist
May 10, 2005

Original web page at The Scientist

Categories
News

Mechanosensation solved?

TRP channels are involved in nearly every sense, including pain, taste, touch, and vision. Therefore, scientists pursuing the long-awaited “tip link” channel, which converts mechanical stimuli into hearing in stereocilia of the inner ear, have long expected a TRP channel to be at the root of the process. In December 2004, David Corey and collaborators at Harvard University identified TRPA1 as a strong candidate for the mechanotransducing tip link channel.1

“It’s headline news,” says Clifford Woolf of Harvard University. “It suggests TRPA1 as a serious candidate for the mechanosensor in the somatosensory system as well.” Others are more cautious. “It needs further genetic and pharmacological testing,” says David Julius of the University of California, San Francisco. He suggests that the development of better blockers for the TRPA1 channel will be needed before definitive proof can be obtained.

The Scientist Daily
May 10, 2005

Original web page at The Scientist

Categories
News

Soccer link to motor neuron disease

A rigorous study in Italy has confirmed claims that professional soccer players have a higher than normal risk of developing a type of motor neuron disease, also known as amyotrophic lateral sclerosis. The reason remains a mystery. ALS involves the death of motor neurons, the nerve cells responsible for voluntary movement, and eventually leads to paralysis and death. Adriano Chiò’s team at the University of Turin surveyed the medical records of 7000 professional footballers who played in Italy’s first or second division between 1970 and 2001.

Based on the normal incidence of the disease and the players’ ages, the researchers calculated that there should have been 0.8 cases of ALS in this group. Instead, there were five. The study was prompted by what the Italian press dubbed “the motor neuron mystery” – the discovery a few years ago of 33 cases of ALS during an investigation of illicit drug use among 24,000 pro and semi-pro players in Italy. Dubious about the methodology of that initial investigation, Chiò’s group applied stricter diagnostic criteria to their data, such as only including players born in Italy. “I think the researchers have been conservative,” says Ammar Al-Chalabi of the Institute of Psychiatry in London, who has written a commentary on the study in Brain.
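As a rough indication of the size of that excess, the comparison can be expressed as a standardised incidence ratio, the observed number of cases divided by the expected number; the notation below is illustrative, not the study’s.

```latex
% Standardised incidence ratio from the figures quoted above
% (observed O = 5 cases, expected E = 0.8 cases); notation is ours.
\mathrm{SIR} = \frac{O}{E} = \frac{5}{0.8} \approx 6.3
```

In other words, the cohort showed roughly six times the expected number of cases.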

The researchers found that the mean age of onset was just 41. “They develop it about 20 years earlier than usual,” says Chiò. He also found that the longer people play football, the greater the risk. In the US, ALS is known as Lou Gehrig’s disease after the baseball legend diagnosed with it in 1939. Clusters of cases have been reported in American football, but until now no large-scale studies have found any clear link between sport and ALS. The cause of ALS remains unknown, as does the reason for the higher rate among footballers. Genes undoubtedly contribute, but the disease could be triggered by head trauma, performance-enhancing drugs or some other toxin to which footballers are exposed. Certain viruses are also being investigated as potential causes.

Although the disease is certainly not limited to sportspeople, Al-Chalabi says it could also be that people prone to ALS are drawn to sport. “There could be some quality in their neuromuscular make-up that not only makes them good at sport, football particularly, but also makes them susceptible to ALS.”

Journal reference: Brain (vol 128, p 472)

New Scientist
March 15, 2005

Original web page at New Scientist

Categories
News

Life expectancy in epilepsy

Does a diagnosis of epilepsy reduce a person’s life expectancy? According to Athanasios Gaitatzis and colleagues the answer is yes, but under certain circumstances and to a variable extent. These authors recently estimated life expectancy in people with epilepsy, with data from the prospective community-based UK National General Practice Study of Epilepsy (NGPSE), and made comparisons with the general population. Life expectancy can be reduced by up to 10 years when there is a known cause of the epilepsy, the estimated reduction being highest at the time of diagnosis. These observations are hardly surprising considering the wealth of literature that shows increased mortality rates in people with epilepsy.

Traditionally, mortality has been expressed as the ratio of the observed and expected numbers of deaths: the standardised mortality ratio. Expected deaths are calculated by applying the death rates of an external reference population to the age distribution of the study population. The epilepsy population has a standardised mortality ratio of 2-3 (ie, a mortality 2-3 times that of the general population). People with epilepsy of unknown cause have at most only a slight increase in mortality, while those with epilepsy as a symptom of a known underlying cause largely account for the overall increased mortality. The increase is evident during the first years after the onset of epilepsy, and mortality then declines to levels close to those in the general population. However, some studies show an increase in mortality later, a decade or more after disease onset. Relative survivorship, defined as the proportion of observed to expected numbers of survivors, provides a different perspective on mortality. The relative survivorship at 5, 10, and 15 years after diagnosis was 91%, 85%, and 83%, respectively, in a pioneering study. In epilepsy of unknown cause, relative survivorship is as high as 96% at 25 years after diagnosis.
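To make those definitions concrete, the sketch below shows how an expected-death count, and hence a standardised mortality ratio, can be derived by applying a reference population’s age-specific death rates to a cohort’s age distribution. The age bands, rates, person-years, and death counts are invented placeholders, not NGPSE data.

```python
# Illustrative SMR calculation: expected deaths come from applying an external
# reference population's age-specific death rates to the cohort's person-years.
# All numbers are invented placeholders, not NGPSE data.

# (age band, reference death rate per person-year, cohort person-years, observed deaths)
age_bands = [
    ("20-39", 0.001, 4000, 10),
    ("40-59", 0.004, 3000, 30),
    ("60-79", 0.020, 2000, 95),
]

expected = sum(rate * person_years for _, rate, person_years, _ in age_bands)
observed = sum(deaths for _, _, _, deaths in age_bands)
smr = observed / expected

# An SMR of about 2-3 means mortality 2-3 times that of the reference population.
print(f"expected deaths = {expected:.1f}, observed = {observed}, SMR = {smr:.2f}")
```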

Gaitatzis and colleagues introduce yet another way of analysing data on mortality in patients with epilepsy. The estimated life expectancy of people with new-onset non-febrile seizures from the UK, followed up for a median of 15 years, was compared with that of people of the same age and sex in the general population. The difference in life expectancy between these populations gives the estimated years of life lost, which is presented at different intervals after the diagnostic seizure–for men and women, young and old, and for epilepsy of known and unknown cause. With this analysis, the authors have expressed earlier knowledge in a new form, which serves to refine our understanding of an important prognostic aspect of epilepsy.
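In notation (ours, not the authors’), the reported quantity at a given time t after diagnosis is simply the gap between two remaining life expectancies for a person of the same age and sex:

```latex
% Estimated years of life lost at time t after diagnosis (illustrative notation)
\mathrm{YLL}(t) = e_{\text{general}}(t) - e_{\text{epilepsy}}(t)
```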

The critical question is what the reasons might be for the reduced life expectancy in people with new-onset epilepsy. Gaitatzis’ data, like those of other population-based studies, strongly suggest that the increased mortality is related to the underlying cause of epilepsy rather than to the seizures. One may therefore ask, for example, how different life expectancy would be for a person with poststroke epilepsy compared with someone who has had a stroke but has not developed epilepsy. Indeed, such patients might constitute the most relevant control group to clarify the contribution of the epilepsy itself. In fact, the reported causes of death in the NGPSE cohort, mainly cancer, ischaemic heart disease, cerebrovascular disease, and pneumonia, also indicate that mortality as a consequence of seizures is rare in newly diagnosed patients.

Although the estimates of life expectancy are of considerable interest for researchers and physicians, they are probably less useful in counselling patients and relatives. First, it is difficult to generalise from these data because Gaitatzis and colleagues included patients with single seizures and acute symptomatic seizures, while excluding those with brain tumours. Second, epilepsy is a heterogeneous disorder with multiple causes, each affecting the prognosis differently. Because of the limited size of the cohort, patients with different causes were lumped into one group: symptomatic epilepsy. However, it does not make much sense to estimate life expectancy for an individual with epilepsy after a traumatic brain injury on the basis of data from patients with underlying cerebrovascular disease. Third, the concept of years of lost life might be difficult to communicate to patients. A person with idiopathic/cryptogenic epilepsy would probably consider an estimated reduction in life expectancy of up to 2 years highly significant rather than minimal.

Although this extended analysis of the NGPSE data provides a new and interesting perspective on mortality risks, we now need to find ways to estimate the contribution of the epilepsy and seizures themselves and the causes of such deaths, and to analyse to what extent treatment can minimise risks and reduce the years of life lost for people with epilepsy. Finally, we should acknowledge that the available data on mortality in epilepsy derive almost exclusively from western industrialised countries. We lack information from the rest of the world, where the vast majority of the global epilepsy population resides and where epilepsy-related mortality and life expectancy are likely to be different.

Source: Torbjörn Tomson, Lars Forsgren

The Lancet
March 15, 2005

Original web page at The Lancet