Ever-happy mice may hold key to new treatment of depression

A new breed of permanently ‘cheerful’ mouse is providing hope of a new treatment for clinical depression. TREK-1 is a gene that can affect transmission of serotonin in the brain. Serotonin is known to play an important role in mood, sleep and sexuality. By breeding mice with an absence of TREK-1, researchers were able to create a depression-resistant strain. The details of this research, which involved an international collaboration with scientists from the University of Nice, France, are published in Nature Neuroscience this week.

“Depression is a devastating illness, which affects around 10% of people at some point in their life,” says Dr. Guy Debonnel, an MUHC psychiatrist, professor in the Department of Psychiatry at McGill University, and principal author of the new research. “Current medications for clinical depression are ineffective for a third of patients, which is why the development of alternate treatments is so important.” Mice without the TREK-1 gene (‘knock-out’ mice) were created and bred in collaboration with Dr. Michel Lazdunski, co-author of the research, in his laboratory at the University of Nice, France. “These ‘knock-out’ mice were then tested using separate behavioral, electrophysiological and biochemical measures known to gauge ‘depression’ in animals,” says Dr. Debonnel. “The results really surprised us; our ‘knock-out’ mice acted as if they had been treated with antidepressants for at least three weeks.”

This research represents the first time depression has been eliminated through genetic alteration of an organism. “The discovery of a link between TREK-1 and depression could ultimately lead to the development of a new generation of antidepressant drugs,” noted Dr. Debonnel. According to Health Canada and Statistics Canada, approximately 8% of Canadians will suffer from depression at some point in their lifetime. Around 5% of Canadians seek medical advice for depression each year, a figure that has almost doubled in the past decade. Figures in the U.S. are comparable, with approximately 18.8 million American adults (about 9.5% of the population) suffering from depression during their life.

Science Daily
September 12, 2006

Oxygen deprived brains repaired and saved

Scientists from Melbourne’s Howard Florey Institute have found special proteins that protect the brain after it has been damaged by a lack of oxygen, which occurs in conditions such as stroke, perinatal asphyxia, near-drowning and traumatic brain injury. Dr Nicole Jones and her team discovered that during oxygen deprivation, or ‘hypoxia’, these proteins (HIF1 and PHD2) increase. These proteins regulate processes like the production of red blood cells and new blood vessels, and the flow of glucose to the brain. Therefore they are involved in preventing further brain damage and repairing damage caused by the initial injury. This discovery takes the Howard Florey Institute’s scientists closer to developing preventative and regenerative treatments for brain damage caused by hypoxia.

Dr Jones said her discovery resulted from looking at how the body tries to protect itself and how the brain reacts when it experiences mild, non-damaging hypoxia. “I found that mild, non-damaging hypoxia actually protected the brain against a subsequent injury by activating certain proteins,” Dr Jones said. “Mild hypoxia appears to pre-condition neural tissues against a mass ‘suicide’ of healthy neurons after a stroke or other brain trauma. In an experiment in rats, mild hypoxia followed by a major stroke resulted in less brain damage than if the rat experienced just a major stroke – all because these protective proteins were increased by the first non-damaging exposure to hypoxia. I am now looking at developing both preventative and regenerative treatments that mimic these proteins’ protective and repairing effects,” she said. Dr Jones is now testing drug candidates, and would like to develop new drugs that activate these protective proteins in the brain.

While further research is required, Dr Jones and her team are hopeful that their investigations will lead to effective treatments for people experiencing hypoxia and improve recovery from hypoxia-induced brain damage. Dr Jones’ research has recently been published in the Journal of Cerebral Blood Flow and Metabolism and Neuroscience Letters. The Howard Florey Institute is Australia’s leading brain research centre. Its scientists undertake clinical and applied research that can be developed into treatments to combat brain disorders, and new medical practices. Their discoveries will improve the lives of those directly and indirectly affected by brain and mind disorders in Australia and around the world. The Florey’s research areas cover a variety of brain and mind disorders including Parkinson’s disease, stroke, motor neuron disease, addiction, epilepsy, multiple sclerosis, autism and dementia.

Science Daily
September 12, 2006

Mapping the neural landscape of hunger

The compelling urge to satisfy one’s hunger enlists structures throughout the brain, as might be expected in a process so necessary for survival. But until now, studies of those structures and of the feeding cycle have been only fragmentary, measuring brain regions only at specific times in the feeding cycle. Now, however, Ivan de Araujo, Duke University Medical Center, and colleagues report they have mapped the activity of whole ensembles of neurons in multiple feeding-related brain areas across a full cycle of hunger-satiety-hunger. Their findings, reported in the August 17, 2006, issue of the journal Neuron, published by Cell Press, open the way to understanding how these ensembles of neurons integrate to form a sort of distributed “code” that governs the motivation that drives organisms to satisfy their hunger.

In their paper, de Araujo and colleagues implanted bundles of tiny recording electrodes in areas of the rat brain known to be involved in feeding, motivation, and behavior. Those areas include the lateral hypothalamus, orbitofrontal cortex, insular cortex, and amygdala. The researchers then recorded neuronal activity in those regions through a feeding cycle, in which the rats became hungry, fed on sugar water to satisfy that hunger, and then grew hungry again. “This allowed us to measure both the ability of single neurons to encode for specific phases of a feeding cycle and how neuronal populations integrate information conveyed by these phase-specific neurons in order to reflect the animal’s motivational state,” wrote the researchers.

By isolating and comparing signals from particular neurons in the various regions at various times in the cycle, the researchers gained insight into the roles neurons in those regions played in feeding motivation and satisfaction, they wrote. The researchers found that they could, indeed, distinguish neurons that were sensitive to changes in satiety states as the animals satisfied their hunger. They could also measure how populations of neurons changed their activity over the different phases of a feeding cycle, reflecting the physiological state of the animals. Importantly, they found that measuring the activity of populations of neurons was a much more effective way of measuring the satiety state of an animal than measuring activity of only individual neurons in an area. And the more neurons they included in such populations, the more accurate the measure of that satiety state, they found.
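
The population-size effect is easy to see in a toy simulation. The sketch below is our illustration, not the authors' analysis: each simulated neuron carries only a small, noisy firing-rate difference between the hungry and sated states (all numbers are invented), and the state is decoded from the ensemble average.

```python
# Toy sketch of population decoding (not the authors' analysis): many weakly
# state-sensitive neurons, pooled, classify hunger vs. satiety better than
# any single neuron. All parameter values are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

N_TRIALS = 1000   # simulated observations per state
SIGNAL = 0.3      # assumed mean firing-rate shift between states
NOISE = 1.0       # assumed trial-to-trial variability per neuron

def decode_accuracy(n_neurons):
    """Classify the state from the ensemble-averaged firing rate."""
    hungry = (NOISE * rng.standard_normal((N_TRIALS, n_neurons))).mean(axis=1)
    sated = (SIGNAL + NOISE * rng.standard_normal((N_TRIALS, n_neurons))).mean(axis=1)
    threshold = SIGNAL / 2
    correct = (hungry < threshold).sum() + (sated >= threshold).sum()
    return correct / (2 * N_TRIALS)

for n in (1, 4, 16, 64):
    print(f"{n:3d} neurons -> {decode_accuracy(n):.1%} correct")
# Independent noise shrinks as 1/sqrt(n) under averaging while the shared
# state signal does not, so accuracy climbs with ensemble size, mirroring
# the reported population-code advantage.
```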

De Araujo and colleagues concluded that their analysis showed that while single neurons were preferentially responsive to particular phases in the metabolic status of the animal as it went through a hunger-satiety-hunger cycle, “when combined as ensembles, however, these neurons gained the ability to provide a population code that allows for predictions on the current behavioral state (hunger/satiety) of the animal by integrating information conveyed by its constituent units.” “Our results support the hypothesis that while single neurons are preferentially responsive to variations in metabolic status, neural ensembles appear to integrate the information provided by these neural sensors to maintain similar levels of activity across comparable behavioral states,” they concluded. “This distributed code acting across separate hunger phases might constitute a neural mechanism underlying meal initiation under different peripheral and metabolic environments,” they wrote.

Science Daily
August 29, 2006

Stimulation of the semicircular canals can artificially control human walking and balance

By applying electrical currents across the heads of people while they walk, researchers have improved our understanding of how our vestibular system helps us maintain upright posture; at the same time, the researchers found that the stimulus could be applied in a way that allowed a person who was walking straight ahead to be steered by “remote control” without her balance being affected. The findings are reported by Richard Fitzpatrick and Jane E. Butler of the Prince of Wales Medical Research Institute and the University of New South Wales, Australia, and Brian L. Day of University College London in the August 8th issue of Current Biology, published by Cell Press.

To investigate how the body’s ability to sense head movements can contribute to balance control and guidance control – two critical aspects of bipedal locomotion – the researchers stimulated nerves that normally communicate signals from the so-called semicircular canals, structures that are part of the vestibular system that assists in orientation and balance. The researchers found that artificial stimulation of semicircular canal nerves afforded “remote control” that was accurate enough to keep subjects on pathways and away from obstacles while walking blindfolded through botanical gardens. The researchers also found that with a subject’s head in another position, exactly the same stimulus could be used to disturb upright balance, causing the subject to lean in one direction or the other, but without having any effect on steering his walking.

Known as bipedalism, our habitual upright posture is unique in the animal kingdom and has arisen through specific complementary adaptations of the body and brain. It has long been believed that the key to human balance is a precise sense of – and ability to align the body to – the direction of gravity. However, the semicircular canals that the researchers stimulated to control walking and balance detect rotational movements of the head, not the direction of gravity. These findings therefore show that sensing movement is crucial for our upright posture. The findings support interpretations made from fossil evidence of an evolutionary change in the development of the human semicircular canals. These evolutionary changes would allow for enhanced movement detection, and therefore indicate that controlled movement, rather than alignment to gravity, has been important for the development of modern human bipedalism.

This new work has important implications for understanding how the brain processes sensory signals. According to the researchers, the findings indicate that from the single sensory organ that signals the movement of the head, the brain makes instant complex “mathematical” calculations to discard the parts not important to balance or steering, such as the movements we make when looking around, and then transforms the remaining signal into two components. One component is used to control steering, and the other to control balance. In a more practical view, this ability to produce illusions of movement, and then steer and balance the body by external control, leads the researchers to expect that stimulation techniques developed from the approach used in the new study will lead the way to diagnostic, therapeutic, and virtual-reality applications.
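
As a rough geometric illustration of that two-way split (our simplification, not the authors' model), a head-fixed rotation signal can be re-expressed in world coordinates and then projected onto the gravity axis, yielding a steering component, and onto the horizontal plane, yielding a balance component; pitching the head converts one into the other.

```python
# Illustrative decomposition (our simplification, not the authors' model):
# the same head-fixed canal signal splits into a steering component (rotation
# about gravity) and a balance component (rotation in the horizontal plane)
# depending on head pitch.
import numpy as np

def split_canal_signal(rotation_head, pitch_deg):
    """Express a head-fixed rotation vector in world coordinates, then
    project it onto the gravity axis (steering) and the horizontal
    plane (balance)."""
    p = np.radians(pitch_deg)
    # Rotation of the head frame about the interaural (y) axis by the pitch.
    R = np.array([[np.cos(p), 0.0, np.sin(p)],
                  [0.0,       1.0, 0.0      ],
                  [-np.sin(p), 0.0, np.cos(p)]])
    w_world = R @ np.asarray(rotation_head, dtype=float)
    g = np.array([0.0, 0.0, 1.0])            # vertical (gravity) axis
    steering = float(w_world @ g)            # turning about the vertical
    balance = w_world - steering * g         # horizontal remainder = sway
    return steering, float(np.linalg.norm(balance))

# Assume the stimulus acts like a fixed rotation signal about the head's
# front-back (roll) axis -- an assumption made here purely for illustration.
stim = [1.0, 0.0, 0.0]
for pitch in (0, 90):    # head level vs. pitched 90 degrees
    s, b = split_canal_signal(stim, pitch)
    print(f"pitch {pitch:3d} deg: steering={s:+.2f}, balance={b:.2f}")
# Head level: the signal is a pure balance disturbance. Head pitched 90 deg:
# the identical stimulus becomes a pure steering command.
```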

Science Daily
August 29, 2006

New brain cells die without a job to do

When it comes to brainpower they say you either use it or lose it. Now a study in mice suggests that the survival of newly formed adult brain cells depends on the amount of input they receive. Fred Gage of the Salk Institute for Biological Studies in La Jolla, California, and his colleagues infected genetically engineered mice with a virus that stops new brain cells from producing NMDA receptors – proteins that sit on the surface of brain cells and help them communicate with each other. The virus used infects only newly generated cells, leaving other cells untouched. Infected cells that lacked NMDA receptors died sooner than their normal counterparts, suggesting that communication is essential for survival (Nature, DOI: 10.1038/nature05028).

To confirm this the researchers injected some of the virus-infected mice with a compound that blocks all NMDA receptors. They found this increased the survival rate of the brain cells infected by the virus, and lowered that of the normal, uninfected cells. Gage speculates that preventing any brain cell communication via NMDA receptors levels the playing field, giving all the brain cells an equal chance of survival – indirect evidence that activation of NMDA receptors affects the survival of brain cells. Since the cells the team studied were in the hippocampus, a brain region involved in learning and memory, Gage suggests that the fate of brain cells generated there helps guide the formation of memories and skills.
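
Gage's "level playing field" idea can be caricatured as a relative-competition rule. The sketch below is our illustration only; the dynamics are invented, not measured: a new cell's survival odds depend on its activity relative to its neighbors, so silencing everyone restores equal odds.

```python
# Toy competition rule for the "level playing field" speculation (invented
# dynamics, not data): survival chance is proportional to each new cell's
# share of total NMDA-mediated activity.
def survival_odds(activities, baseline=0.1):
    """Odds proportional to each cell's activity share, with a small
    activity-independent baseline so a fully silent group divides evenly."""
    boosted = [a + baseline for a in activities]
    total = sum(boosted)
    return [b / total for b in boosted]

normal, knockout = 1.0, 0.0
print(survival_odds([normal, knockout]))  # knockout cell at a disadvantage
print(survival_odds([0.0, 0.0]))          # all receptors blocked: equal odds
```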

New Scientist
August 29, 2006

Shared ancestor to humans, present-day non-human primates may be linchpin in evolution of language

When contemplating the coos and screams of a fellow member of its species, the rhesus monkey, or macaque, makes use of brain regions that correspond to the two principal language centers in the human brain, according to research conducted by scientists at the National Institute on Deafness and Other Communication Disorders (NIDCD) and the National Institute of Mental Health (NIMH), two of the National Institutes of Health. The finding, published July 23 in the advance online issue of Nature Neuroscience, bolsters the hypothesis that a shared ancestor to humans and present-day non-human primates may have possessed the key neural mechanisms upon which language was built. Principal collaborators on the study are Allen Braun, M.D., chief of NIDCD’s Language Section, Alex Martin, Ph.D., chief of NIMH’s Cognitive Neuropsychology Section, and Ricardo Gil-da-Costa, Gulbenkian Science Institute, Oeiras, Portugal, who conducted the study during a three-year joint appointment at the NIDCD and NIMH.

“This intriguing finding brings us closer to understanding the point at which the building blocks of language appeared on the evolutionary timeline,” says James F. Battey, Jr., M.D., Ph.D., director of the NIDCD. “While the fossil record cannot answer this question for us, we can turn to the here and now – through brain imaging of living non-human primates – for a glimpse into how language, or at least the neural circuitry required for language, came to be.” While non-human primates do not possess language, they are able to communicate about such things as food, identity, or danger to members of their species by way of vocalizations that are interpreted and acted upon. In humans, the two main regions of the brain that are involved in encoding this type of information in language are known as Broca’s area and Wernicke’s area, named for the physician-researchers who discovered them. Both areas are located along the Sylvian fissure (and are therefore referred to as perisylvian areas), with Broca’s area located in the frontal lobe and Wernicke’s area located behind it in the temporal and parietal lobes. Scientists once believed that Broca’s area was chiefly involved in language production while Wernicke’s area dealt more with comprehension; however, current thinking suggests that the two areas work in tandem with one another. Although monkeys are not able to perform the mental activities required for language, their brains possess regions that are structurally similar to the perisylvian areas in humans in both hemispheres. The functional significance of such similarities, however, has been unclear up to this point.

To measure brain activity, the researchers injected water labeled with oxygen-15, a biologically safe, fast-degrading radioisotope, into the bloodstream of three adult macaques. As neural activity increases in a given region of the brain, blood – and the radioactive water it carries – rushes into that region. Using the brain imaging technology known as positron emission tomography (PET), researchers capture an image of the radioactive areas, thus highlighting the regions of heightened activity. In this way, brain scans were taken of the monkeys as they listened to three types of sounds: the recorded coos and screams of other rhesus monkeys, and assorted non-biological sounds, such as musical instruments and computer-synthesized sounds, which matched the vocalizations in frequency, rate, scale, and duration. For each monkey, 16 scans were recorded for each sound type and compared.

Although the coo of a monkey is acoustically very different from a high-pitched scream, the researchers found that both of these meaningful species-specific sounds elicited significantly more activity than the non-biological control stimuli in the same three regions of the macaque’s brain. Moreover, these regions correspond to the key language centers in humans, with the ventral premotor cortex (PMv) corresponding to Broca’s area, and the temporoparietal area (Tpt) and posterior parietal cortex (PPC) corresponding to Wernicke’s area. In contrast, the non-biological sounds – which were acoustically similar to the coos and screams but had no meaning for the animals – elicited significantly less activity in these regions; rather, they were associated with greater activation of the brain’s primary auditory areas. (The reason for this, the researchers suggest, is that these sounds were new to the monkeys and the primary auditory areas may be especially attuned to novel stimuli.)
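
The comparison behind these results is, at heart, a condition contrast: average the 16 scans per sound type in each region, then subtract control activity from vocalization activity. The sketch below rehearses that logic on invented numbers chosen only to mimic the qualitative pattern; it is not the study's data or its statistical method.

```python
# Sketch of the PET contrast logic on invented numbers (not study data):
# average 16 scans per sound type per region, then contrast vocalizations
# against acoustically matched controls.
import numpy as np

rng = np.random.default_rng(1)
regions = ["PMv", "Tpt", "PPC", "primary auditory"]

# Assumed mean activity per region: vocalizations drive the language-like
# regions, matched controls drive primary auditory cortex (qualitative only).
condition_means = {
    "coo":     np.array([1.0, 1.0, 1.0, 0.4]),
    "scream":  np.array([1.0, 1.0, 1.0, 0.4]),
    "control": np.array([0.4, 0.4, 0.4, 1.0]),
}
scans = {cond: mean + 0.1 * rng.standard_normal((16, len(regions)))
         for cond, mean in condition_means.items()}

vocal = np.concatenate([scans["coo"], scans["scream"]]).mean(axis=0)
control = scans["control"].mean(axis=0)
for region, diff in zip(regions, vocal - control):
    print(f"{region:16s} vocalization-minus-control: {diff:+.2f}")
# Positive values in PMv/Tpt/PPC and a negative value in primary auditory
# cortex reproduce the reported direction of the effect.
```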

Based on these findings, the researchers suggest that the communication centers in the brain of the last common ancestor to macaques and humans – particularly those centers used for interpreting species-specific vocalizations – may have been recruited during the evolution of language in humans. In the macaque, these areas may currently play a parallel, prelinguistic function, in which monkeys are able to assign meaning to species-specific sounds. In addition, in light of an earlier study published by the same group, in which species-specific vocalizations of macaques activated brain regions that process higher-order visual and emotional information, the researchers suggest that the language areas of the brain may have evolved from a much larger system used to extract meaning from socially relevant situations – a system in which humans and non-human primates may share similar neural pathways.

Further studies to be conducted include investigating which regions of the non-human primate brain are activated when animals listen to meaningful auditory stimuli other than species-specific vocalizations, such as predators’ calls, sounds made by humans, or other relevant environmental stimuli. In addition, the researchers are interested in studying the pattern of brain activation elicited by non-auditory stimuli that convey the same meaning, such as visual images of monkeys producing vocalizations.

Science Daily
August 14, 2006

Timing of food consumption activates genes in specific brain area

Giving up your regular late-night snack may be hard, and not just because it’s a routine. The habit may change gene activity in an area of the brain so that it expects food at that time, researchers at UT Southwestern Medical Center have discovered. By training mice to eat at a time when they normally wouldn’t, the researchers found that food turns on body-clock genes in a particular area of the brain. Even when the food stopped coming, the genes continued to activate at the expected mealtime. “This might be an entrance to the whole mysterious arena of how metabolic conditions in an animal can synchronize themselves with a body clock,” said Dr. Masashi Yanagisawa, professor of molecular genetics and senior author of the study. The UT Southwestern researchers report their findings in the Aug. 8 issue of the Proceedings of the National Academy of Sciences.

The daily ups-and-downs of waking, eating and other bodily processes are known as circadian rhythms, which are regulated by many internal and external forces. One class of genes involved in these cycles is known as Period or Per genes. When food is freely available, the strongest controlling force is light, which sets a body’s sleep/wake cycle, among other functions. Light acts on an area in the brain called the suprachiasmatic nucleus, or SCN. But because destroying the SCN doesn’t affect the body clock that paces feeding behavior, the circadian pacemaker for feeding must be somewhere else, Dr. Yanagisawa said. To find the answer, his group did a simple but labor-intensive experiment. The scientists set the mice on a regular feeding schedule, then examined their brain tissue to find where Per genes were turned on in sync with feeding times.

The researchers put the mice on a 12-hour light/dark cycle, and provided food for four hours in the middle of the light portion. Because mice normally feed at night, this pattern is similar to humans eating at inappropriate times. Dysfunctional eating patterns play a role in human obesity, particularly in the nocturnal eating often seen in obese people, the researchers noted. The mice soon fell into a pattern of searching for food two hours before each feeding time. They also flipped their normal day/night behavior, ignoring the natural cue that day is their usual time to sleep. After several days, the researchers found that the daily activation cycle of Per genes in the SCN was not affected by the abnormal feeding pattern. However, in a few different areas of the brain, particularly a center called the dorsomedial hypothalamic nucleus, or DMH, the Per genes turned on strongly in sync with feeding time after seven days.

When the mice subsequently went two days without food, the genes continued to turn on in sync with the expected feeding time. “They started to show the same pattern of anticipatory behaviors several hours before the previously scheduled time of feeding,” said Dr. Yanagisawa, a Howard Hughes Medical Institute investigator. “So somewhere in the body, they clearly remembered this time of day.” Upcoming research will focus on how the centers that control various body clocks communicate with each other, Dr. Yanagisawa said.
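
What a food-entrained oscillator implies can be caricatured in a few lines (purely illustrative; the underlying circuitry, as the article notes, is still being worked out): once entrained, the gene's activity peak is phase-locked to the old mealtime and stays there even on fasting days.

```python
# Cartoon of food-entrained Per-gene activity (illustrative only; parameters
# invented): a 24-hour rhythm phase-locked to the scheduled mealtime keeps
# peaking there even when feeding stops.
import math

MEALTIME_H = 6.0   # assumed hours into the day when food used to arrive
PERIOD_H = 24.0

def per_activity(t_hours):
    """Idealized entrained rhythm: a cosine peaking at the old mealtime."""
    phase = 2.0 * math.pi * (t_hours - MEALTIME_H) / PERIOD_H
    return 0.5 * (1.0 + math.cos(phase))

# Two fasting days: activity still peaks at the remembered mealtime.
for day in (1, 2):
    at_mealtime = (day - 1) * 24.0 + MEALTIME_H
    off_mealtime = at_mealtime + 12.0
    print(f"fasting day {day}: activity {per_activity(at_mealtime):.2f} at the "
          f"old mealtime vs {per_activity(off_mealtime):.2f} twelve hours later")
```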

Science Daily
August 14, 2006

Tiny inhaled particles take easy route from nose to brain

In a continuing effort to find out if the tiniest airborne particles pose a health risk, University of Rochester Medical Center scientists showed that when rats breathe in nano-sized materials, the particles follow a rapid and efficient pathway from the nasal cavity to several regions of the brain, according to a study in the August issue of Environmental Health Perspectives. Researchers also saw changes in gene expression that could signal inflammation and a cellular stress response, but they do not know yet if a buildup of ultrafine particles causes brain damage, said lead author Alison Elder, Ph.D., research assistant professor of Environmental Medicine.

The study tested manganese oxide ultrafine particles at a concentration typically inhaled by factory welders. The manganese oxide particles were the same size as manufactured nanoparticles, which are being intensively investigated because they are the key ingredient of a growing industry, despite concerns about their safety. Nanotechnology is a new wave of science that deals with particles engineered from materials such as carbon, zinc and gold that are less than 100 nanometers in diameter. The manipulation of these materials into bundles or rods helps in the manufacturing of smaller-than-ever electronic, optical and medical equipment. The sub-microscopic particles are also used in consumer products such as toothpaste, lotions and some sunscreens.

Some doctors and scientists are concerned about what happens at the cellular level after exposure to the ultrafine or nano-sized particles, and the University of Rochester is at the forefront of this type of environmental health research. In 2004 the Defense Department selected the University Medical Center to lead a five-year, $5.5 million investigation into whether the chemical characteristics of nanoparticles determine how they will interact with or cause harm to animal and human cells.

In the current study, the particles passed quickly through the rats’ nostrils to the olfactory bulb, a region of the brain near the nasal cavity. They settled in the striatum, frontal cortex, cerebellum, and lungs. After 12 days, the concentration of ultrafine particles had risen 3.5-fold in the olfactory bulb and doubled in the lungs, the study found. Although the ultra-tiny particles did not cause obvious lung inflammation, several biomarkers of inflammation and stress response, such as tumor necrosis factor and macrophage inflammatory protein, increased significantly in the brain, according to gene and protein analyses. “We suggest that despite differences between human and rodent olfactory systems, this pathway is likely to be operative in humans,” the authors conclude.

Science Daily
August 14, 2006

A smell like no other

Unlike mice and many other animals, humans seem to lack the biological hardware for detecting pheromones, airborne molecules that carry information about sex and status. Now, researchers have discovered receptors in mice that may enable them to detect pheromones even without this specialized hardware. If people use a similar system, we may not be completely shut out of the pheromone game. Pheromones are detected by the vomeronasal organ, a small opening at the back of the nasal cavity of many mammals. Most scientists doubt that humans have a working vomeronasal organ. Still, recent research indicates that people can respond to pheromones (ScienceNOW, 9 May), and animal studies have suggested that some creatures can detect pheromones with their olfactory epithelium, a sheet of neurons that picks up ordinary odors.

Searching for an explanation for this alternative pheromone-detection system, neuroscientists Stephen Liberles and Linda Buck of the Fred Hutchinson Cancer Research Center in Seattle scanned the mouse olfactory epithelium for potential pheromone receptors. Focusing their search on receptors not known to respond to any odor, the researchers identified one family of genes that appeared to be expressed in olfactory neurons but not in other tissues. The pattern of gene expression is strikingly similar to that for individual olfactory receptor genes, Liberles says. Each of these genes, which encode trace amine-associated receptors (TAARs), was expressed by only a smattering of olfactory neurons, and each of these neurons appeared to make only one type of TAAR, the researchers report online 30 July in Nature.

Although little is known about TAARs, previous studies have found that some of them are activated by compounds in mouse urine – a major source of pheromones. Two of these compounds, isoamylamine and trimethylamine, appear in different concentrations in male and female mouse urine, and some evidence suggests that isoamylamine acts as a pheromone by accelerating the onset of puberty in female mice. Together with the new findings, this suggests (but doesn’t prove) that TAARs are pheromone receptors, Liberles says. Humans have TAAR genes too, raising the possibility that we use them to detect pheromones.

The work contributes to a growing appreciation of the molecular diversity of olfactory and pheromone receptors, says Catherine Dulac, a neuroscientist at Harvard University. The next step, she says, will be to determine whether TAARs actually work like pheromone receptors in live animals.

ScienceNow
August 14, 2006

Neurons return in damaged brains

A drug that triggers the birth of neurons in rat brains has opened up the possibility of a new treatment for Parkinson’s disease. Animals given the drug generated dopamine-producing neurons in the substantia nigra, the area of the brain where cells are lost in people with Parkinson’s.

Christopher Eckman and Jackalina Van Kampen at the Mayo Clinic College of Medicine in Jacksonville, Florida, found that after infusing the drug 7-OH-DPAT into damaged rats’ brains for eight weeks, the numbers of neurons in the damaged region and the connections they made had both returned to near normal (The Journal of Neuroscience, vol 26, p 7272). “The recovery in the animals was nothing short of profound,” says Eckman.

New Scientist
August 1, 2006

Researchers link newly discovered gene to hereditary neurological disease

Scientists have linked a recently discovered gene to a rare nervous system disease called hereditary spastic paraplegia, for which there is no cure. The discovery could lead to development of drugs that target the defective gene, said the researchers at Duke University Medical Center who discovered the mutation. The gene defect accounts for 6 percent to 7 percent of all cases of hereditary spastic paraplegia, they said. The discovery of the gene defect will provide important insights into the causes of other major neurodegenerative diseases, including amyotrophic lateral sclerosis or Lou Gehrig’s disease, said Stephan Züchner, M.D., assistant professor at the Duke Center for Human Genetics and the Department of Psychiatry. “Patients with these genetic diseases now have no real treatment options,” said Züchner, co-leader of the study team. “Our discovery will open up a new opportunity to study these diseases from a different angle so we can better understand what is causing them and which genes to target in developing treatments to manage them.”

The researchers report their findings in the August 2006 issue of the American Journal of Human Genetics, which is now available online. The research was funded by the National Institutes of Health and by donations to the Duke Center for Human Genetics from individuals and families affected by hereditary spastic paraplegia. Hereditary spastic paraplegia, one of a number of related inherited disorders, causes progressive limb weakness and stiffness, often resulting in paralysis. As with many neurodegenerative diseases, patients typically begin to show symptoms during their mid-20s to mid-50s, and the symptoms grow progressively more debilitating with time. With no cure available, physicians can only treat symptoms with physical therapy to improve muscle strength and preserve range of motion.

In their study, the Duke researchers found that one form of hereditary spastic paraplegia is linked to a gene called REEP1. The gene normally produces proteins that support the cell’s energy source, the mitochondria. But a defect in the gene may prevent its proteins from performing their normal functions in mitochondria – most notably the mitochondria within the nervous system’s cellular pathways. Precisely how this protein malfunction occurs is still unknown, said Margaret Pericak-Vance, Ph.D., director of Duke’s Center for Human Genetics and co-leader of the study. The Duke scientists began their search for genes associated with the disease by studying two families whose members had hereditary spastic paraplegia.

Using gene-mapping techniques, the researchers identified a small stretch of DNA on chromosome 2, where the disease-causing gene was thought to reside. The researchers screened nine candidate genes that play a potential role in governing the cellular pathways of neurodegenerative disease. By meticulously examining the DNA sequence of those genes, the researchers located mutations — changes in the DNA sequence — in the REEP1 gene among people who have hereditary spastic paraplegia but not in their unaffected relatives. Pericak-Vance and team member Allison Ashley-Koch, Ph.D., said that the discovery of REEP1’s role in hereditary spastic paraplegia strengthens the evidence that defects in mitochondria are responsible for many types of neurodegenerative diseases. For example, scientists have discovered that Lou Gehrig’s disease is caused by mutations in SOD1, a gene whose protein is also expressed in mitochondria. With REEP1’s role now identified, scientists are developing a genetic test to identify patients who have the defect, Züchner said. The Duke team has licensed its gene discovery to Athena Diagnostics Inc. to develop a genetic test for patients at risk for the disease.

Science Daily
August 1, 2006

Scientists reverse muscle contractions in mouse model of muscular dystrophy

University of Florida scientists have used gene therapy to eliminate disabling muscle contractions in a mouse model of the most common form of adult-onset muscular dystrophy. The inherited disorder, known as myotonic dystrophy, is found in one of every 8,000 people and causes skeletal muscles to lose the ability to relax once they contract. “One of the principal manifestations of the disease is myotonia, or muscle hyperexcitability,” said Maurice Swanson, Ph.D., the paper’s senior author and a professor of molecular genetics and microbiology at UF’s College of Medicine and the UF Genetics Institute. “So when patients with myotonic dystrophy contract one of the muscles in their arm, it’s very difficult for them to release that contraction.” The muscles progressively weaken and eventually waste away. The disease also affects the heart muscle and is associated with irregular heart rhythms that can lead to sudden death. It also can result in cataracts, premature hair loss and mild to moderate mental retardation.

The work, to be published this week in the Proceedings of the National Academy of Sciences, builds on previous research at UF and the University of Rochester School of Medicine and Dentistry that revealed myotonic dystrophy is caused by malfunctioning genes that block the action of key proteins in cells, including one known as the muscleblind protein. These proteins, which help muscle and eye cells mature, stick to warped copies of RNA molecules that build up in a cell’s nucleus and prevent the proteins from working properly. In the current study, UF researchers used mice that carry the mutated genes and develop the muscle problems characteristic of myotonic dystrophy.

The scientists equipped the adeno-associated virus, or AAV – a safe and widely used vector in gene therapy – to express extra copies of the muscleblind protein. They then injected it into a muscle in the shin of the mutant mice. “We simply tried to correct some of the problems that arise by flooding the muscle with extra copies of the muscleblind protein,” Swanson said. “We were able to correct the myotonia as early as four weeks after injection, and at 23 weeks it was completely eliminated in the muscle that was injected with AAV carrying this muscleblind protein.” Another six mice were in the control group and received injections of green fluorescent protein. Their muscle function did not improve. In effect, patients with myotonic dystrophy retain many of the newborn versions of all the proteins the body makes, Swanson said.

“We all know newborn muscle is very different than adult muscle,” he said. “It’s not just that adults have more muscle, but in adults, proteins are being expressed that have changed between the time we were newborns and the time we became adults. That transition to adult proteins is prevented in myotonic dystrophy.” “Basically, these fetal forms of proteins that are expressed during embryonic and neonatal life are present in adult myotonic dystrophy patients and are incompatible with adult function of muscle,” he added. “The reason that’s true is muscleblind proteins are factors that regulate this transition from newborn to adult proteins. The muscleblind proteins’ responsibility in cells is to make that transition, to force the production of the adult proteins.”

In the next phase of the research, the scientists plan to inject the gene therapy solution directly into the bloodstream. “Myotonic dystrophy patients want all their muscles corrected, not just one,” Swanson said. “One way to get around this problem is to try systemic injections in this mouse model. We’d like to correct all abnormal muscle contractions, not just in a specific muscle group.” “About 30 percent of myotonic dystrophy patients succumb to heart problems, so theoretically systemic injections might also prevent that,” he added.

Scientists eventually hope to find out whether correcting myotonia early by restoring normal levels of functioning muscleblind protein might prevent at least some of the muscle loss that characterizes the adult-onset disease. But researchers are years away from testing the gene therapy approach in people.

“Basically we have to make sure everything works correctly in mice before we can proceed to human trials,” Swanson said. “That’s a long way off.” Stephen Tapscott, M.D., Ph.D., a professor of neurology at the University of Washington and a researcher at the Center on Human Development and Disability at the Fred Hutchinson Cancer Research Center in Seattle, called the findings “an important advance for developing therapies for myotonic dystrophy.” “The demonstration that muscleblind can be delivered to diseased muscle and reverse the disease process in this mouse model achieves an important landmark step that will inform future preclinical and, ultimately, clinical studies in myotonic dystrophy,” he said.

Until now it was difficult to even contemplate a way of treating the disease because it is extraordinarily complex, said John Day, M.D., Ph.D., a professor of neurology at the University of Minnesota School of Medicine, but the research has identified a common element that underlies many of the disease’s different features. “A means of delivering the treatment to humans still needs to be developed, but this now provides proof of principle that the approach is effective in this important mouse model,” Day said. “For the first time this really raises the hope of people suffering from this common form of muscular dystrophy that a treatment could someday be forthcoming that will address the many serious components of this disease.”

Science Daily
August 1, 2006

Sleep strengthens memories and makes them resistant to interfering information

Researchers have uncovered new evidence that sleep improves the brain’s ability to remember information. Their findings demonstrate that memories of recently learned word pairs are improved if sleep intervenes between learning and testing and that this benefit is most pronounced when memory is challenged by competing information. The findings are reported in the July 12th issue of Current Biology by Jeffrey Ellenbogen, of Harvard Medical School, and his colleagues. Whether sleep facilitates memory consolidation is a question as old as the experimental study of memory itself. In recent years, there has been a resurgence of experiments exploring this relationship. Although there is near-consensus that sleep promotes learning of certain types of perceptual memories (for example, learning to tap numeric sequences on a keyboard), there is ongoing debate about whether sleep benefits so-called declarative memory, a key type of memory that is based in the brain’s hippocampus.

In the new work, the researchers studied the influence of sleep on declarative memory in healthy, college-aged adults. The results demonstrated a robust effect: compared to participants who did not sleep during the trials, those who slept between learning and testing were able to recall more of the original words they had learned earlier. The beneficial influence of sleep was particularly marked when participants were presented with the challenge of “interference” – competing word-pair information – just prior to testing. A follow-up experiment further demonstrated that this sleep benefit for memory persists over the subsequent waking day. This work clarifies and extends previous study of sleep and memory by demonstrating that sleep does not just passively and transiently protect memories; rather, sleep plays an active role in memory consolidation.

Science Daily
August 1, 2006

‘Miracle recovery’ shows brain’s resilience

The amazing recovery of a man who had spent almost two decades in a barely conscious state has revealed the brain’s previously unrecognized powers of recovery. Terry Wallis became a media star in 2003 when he emerged from the minimally conscious state (MCS) in which he had spent 19 years, since suffering severe brain damage in a motor accident. At the time, his ‘miracle’ recovery was a mystery. Researchers who have examined his brain now think that his emergence was due to painstaking regrowth of the affected areas that ultimately allowed him to regain some of his faculties. Patients in an MCS are ‘awake’, but cannot produce coordinated movements or speech, and are unable to express their thoughts and feelings. But after his recovery, Wallis regained the power of speech, and his movements, although still severely hampered, showed a dramatic and remarkable improvement. It happened because his brain slowly regrew the nerve connections that were devastated as a result of his accident, say researchers led by Henning Voss of Cornell University’s Weill Medical College in New York.

Voss’s team compared scans of Wallis’s brain with those of another MCS patient who had shown no improvement during six years following traumatic brain injury, and of 20 healthy subjects. The scans revealed the levels of metabolic activity in specific parts of the brain’s ‘white matter’, which is largely composed of cellular tendrils called axons that form connections between cells in different brain regions. Although not up to the level of a healthy subject, activity in these areas was significantly higher in Wallis’s brain than in the other MCS patient’s, suggesting that Wallis’s brain cells had partially restored the connections that allow him to move and speak, the researchers report in The Journal of Clinical Investigation.

Among the regions that showed this improved activity was an area known as the precuneus, which is important for wakeful consciousness, the researchers add. In healthy brains, activity in this region declines during sleep and under general anaesthetic. Without similar scans from earlier in Wallis’s condition, the researchers don’t know at what rate his brain has recovered, or how much further recovery might be possible. However, the damage was so severe that Voss says, “I think there is no prospect of a full recovery.” The discovery highlights how much there remains to learn about the brain’s response to trauma, comments Steven Laureys, a neurologist at the University of Liège in Belgium. Although Wallis had been in an MCS for almost two decades, the condition was only formally classified in 2002.

Doctors are often tempted to assume that the brain can only recover during the first few days or months after injury, leading them to give up on long-term patients. “They are thought to be hopeless cases,” says Laureys. “But this forces us to revise the old dogma. Although this is clearly an exceptional patient, it shows that it is worth studying further.” Patients with limited consciousness should not be denied care and therapy that could help them, Laureys argues.

Although the new discovery does not suggest any way to assist or accelerate the brain’s recovery, it should remind doctors that such recoveries are possible, Laureys adds. “There’s too much therapeutic nihilism,” he says. “It’s hard to find centres that are willing to accept patients and give them rehabilitation and aggressive therapy.” It also remains unclear whether the brain is capable of similar resilience in patients with more serious conditions, such as coma or persistent vegetative state (PVS), in which the patient is completely uncommunicative and displays no more than reflex responses. “There’s a big difference between these two conditions,” notes Voss.

Neurologists are reluctant to declare that PVS, the condition at the centre of the controversial debate over US sufferer Terri Schiavo, can ever be truly permanent. Earlier this year, researchers made the bizarre discovery that some PVS patients could be roused with a simple sleeping pill. But the tendency is to assume that the chances of recovery trail off with time, an assumption that will be overturned by the latest discovery, Laureys hopes. “That’s the real message,” he says.

Nature
July 18, 2006

Blood analysis may reveal Alzheimer’s risk

A blood test might one day predict a person’s risk of developing dementia, a new study suggests. It links certain levels of two proteins in blood samples with a 10-fold increase in the risk of developing Alzheimer’s disease and other forms of dementia. Alzheimer’s disease has no known cure, and its causes are also mysterious. The disease typically progresses over years, gradually robbing elderly patients of memory, language and other crucial mental skills. In very rare cases, people in their 30s or 40s can develop “early-onset Alzheimer’s”. Notably, those who carry the genetic mutations known to encourage early-onset Alzheimer’s have elevated levels of certain protein fragments, called amyloid fragments, in their blood.

This finding caught the attention of epidemiologist Monique Breteler of Erasmus Medical Centre in Rotterdam, The Netherlands, because the same amyloid fragments form plaques in the brains of patients with Alzheimer’s, regardless of the age at which they develop the disease. Breteler and colleagues studied how blood levels of two slightly different amyloid fragments – Beta 1-40 and the slightly longer Beta 1-42 – relate to the risk of dementia among people who do not have early-onset Alzheimer’s. The team took blood samples from 6713 volunteers over the age of 55 years who showed no signs of dementia. They analysed the blood for levels of the two amyloid fragments, and monitored the cognitive function of 1756 volunteers randomly selected from the sample group over an average of 8.6 years.

Of the closely monitored subgroup, 9% developed dementia. While most of these developed Alzheimer’s, others had dementia relating to other illnesses, such as Parkinson’s disease. To strengthen their analysis, the researchers also included data from 230 patients in the initial study group who developed dementia. The study showed that those with the highest Beta 1-40 levels combined with the lowest Beta 1-42 levels were 10 times more likely to develop dementia than those who had the lowest amounts of both fragment types. Subjects with high levels of both amyloid Beta 1-40 and amyloid Beta 1-42 had about the same risk of developing dementia as those with low levels of both. As a result, Breteler believes it is the ratio of these amyloid fragments that matters. She speculates that lower levels of amyloid Beta 1-42 – the predominant component of Alzheimer’s plaques in the brain – may indicate that the fragment is being deposited in the brain instead of circulating in the body. Previous studies have suggested that amyloid Beta 1-40 is deposited in plaques during the later stages of Alzheimer’s. But Breteler admits that it remains “debatable” how much protein levels in the blood reflect those in the brain.
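
Breteler's ratio interpretation can be made concrete with a toy calculation (invented concentrations, not study data): profiles with matched high or matched low levels of the two fragments yield the same Beta 1-42 to Beta 1-40 ratio, while the high-1-40/low-1-42 profile, the one carrying the 10-fold risk, is the only one that shifts it.

```python
# Toy illustration of the amyloid-ratio idea (invented concentrations, not
# the study's data or its risk model).
def ab42_to_ab40_ratio(ab40, ab42):
    """Ratio of the plaque-prone fragment (1-42) to the shorter one (1-40)."""
    return ab42 / ab40

profiles = {
    "low 1-40, low 1-42":   (150.0, 15.0),
    "high 1-40, high 1-42": (300.0, 30.0),
    "high 1-40, low 1-42":  (300.0, 15.0),  # the ~10-fold-risk pattern
}
for label, (ab40, ab42) in profiles.items():
    print(f"{label:22s} ratio = {ab42_to_ab40_ratio(ab40, ab42):.3f}")
# The matched profiles give the same ratio (0.100) despite very different
# absolute levels, echoing the report that high/high and low/low groups
# carried similar risk; only the mismatched profile lowers the ratio (0.050).
```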

While she says that this is by far the largest study to see how well the ratio of these amyloid fragments in the blood predict dementia, she also cautions that it is too soon to consider the method a reliable test. “I don’t think that we are at a test yet,” she stresses. “I think that’s really the next step.”
Source: The Lancet Neurology

New Scientist
July 18, 2006

Alzheimer’s vaccine shows success in mice

A DNA vaccine has successfully reduced the symptoms of Alzheimer’s disease in mice. The result could point to the first vaccine able to prevent and treat Alzheimer’s without serious side effects. Alzheimer’s disease progresses as small proteins called amyloid beta (Ab) peptides are overproduced, forming plaques in the brain that interfere with its function. Memory loss and mental deterioration follow. A vaccination approach – getting the immune system to clean up the plaques – has been considered the most promising way to tackle the disease, but its success has been limited, until now. In 2002, for example, the US pharmaceutical company Elan halted trials of a vaccine that raised antibodies against Ab peptides, after some patients suffered brain inflammation.

The new vaccine is different because instead of using the Ab peptide itself to stimulate antibody production, it uses a stretch of DNA that codes for the Ab peptide, says Yoh Matsumoto, at the Tokyo Metropolitan Institute for Neuroscience, Japan, who led the research. Since DNA vaccination stimulates the immune system more gently than peptide vaccination, it should also avoid the brain swelling seen in the Elan trial. The researchers use mice engineered to develop Alzheimer’s-like symptoms by producing Ab peptides in the brain, which in turn form the plaques that lead to cognitive impairment.

Matsumoto’s team injected the mice with the DNA vaccine before Ab peptides had started to build up. The DNA is read by the muscle cells into which it is injected, and Ab peptide is produced. Antibodies to it are then raised by the immune system and cross into the brain. Mice treated preventatively, at 7 or 18 months of age, accumulated 15.5% or 38.5% less Ab peptide, respectively, than untreated mice, suggesting that the vaccine had a protective effect. When the DNA vaccine was used as a treatment in mice that had already started producing the Ab peptides, their Ab peptide burden was reduced by about 50%. “Ab peptide reduction to 50% that of unvaccinated mice is sufficient [for the return of normal cognitive function],” says Matsumoto. If these results are replicated in monkeys, he hopes that clinical trials in humans could start within 3 years. Nick Fox, of the Institute of Neurology at University College London, UK, says that DNA vaccination is an extremely promising way of slowing the progression of Alzheimer’s. “The key will be what happens in humans,” he says.

New Scientist
July 3, 2006

Researchers show how brain decodes complex smells

Duke University Medical Center researchers have discovered how the brain creates a scent symphony from signals sent by the nose. In studies in mice, the researchers found that nerve cells in the brain’s olfactory bulb — the first stop for information from the nose — do not perceive complex scent mixtures as single objects, such as the fragrance of a blooming rose. Instead, these nerve cells, or neurons, detect the host of chemical compounds that comprise a rose’s perfume. Smarter sections of the brain’s olfactory system then categorize and combine these compounds into a recognizable scent. According to the researchers, it’s as if the brain has to listen to each musician’s melody to hear a symphony.

Humans may rely on the same smell decoding system, because mice and men have similar brain structures for scent, including an olfactory bulb, the researchers said. “We wanted to understand how the brain puts together scent signals to make an odor picture. We discovered the whole is the sum of its parts,” said Da Yu Lin, Ph.D., who conducted the research as a graduate student studying with neurobiologist Lawrence Katz, Ph.D., a Howard Hughes Medical Institute investigator at Duke. The research appears June 16, 2006, in the journal Neuron. The study was supported by the National Institutes of Health, the Howard Hughes Medical Institute and the Ruth K. Broad Biomedical Research Foundation.

Scientists have long debated how the brain makes order out of the hundreds of volatile chemical compounds that assault the nose. Is the brain’s odor code redundant, with single cells responding to multiple components in the smell of a freshly baked cookie? Or does the brain process each scent component like a jigsaw puzzle piece, assembling the signals until it recognizes the picture is a cookie? To find answers, the Duke researchers exposed mice to different odors and measured the responses of neurons across the olfactory bulb with intrinsic signal imaging. The imaging technique maps brain activity by detecting changes in reflected light from the brain with a sensitive camera.

To start, the researchers separated and identified the volatile compounds in each odor with gas chromatography. “A complex mixture like urine has at least a hundred separate compounds in it,” Lin said. They analyzed scents as diverse as peanut butter, coffee and fresh bobcat urine shipped to the laboratory on dry ice. The researchers then exposed the mice to the original odor and its individual compounds. “We found that glomeruli, the functional units of the olfactory bulb, act as detectors for individual compounds,” Lin said. “There are no single detectors for complete smells.” Thus, to distinguish different scents, the brain must integrate the signals of multiple chemical components into an odor “picture.” The researchers suspect that this integration doesn’t happen in the olfactory bulb. Instead, the bulb likely passes the data to more advanced brain structures where it is assembled and recognized as a specific scent.
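
The division of labor the researchers describe can be sketched as two stages: detection of individual compounds, then pattern matching downstream. The toy code below is our schematic, with invented compound lists; it is not a model of real glomerular responses.

```python
# Schematic of the "whole is the sum of its parts" odor code (invented
# compound lists; not real chemistry or real glomerular data).
ODOR_COMPONENTS = {
    "coffee":        {"furan", "pyrazine", "guaiacol"},
    "peanut butter": {"pyrazine", "hexanal"},
}
ALL_COMPOUNDS = set().union(*ODOR_COMPONENTS.values())

def glomerular_layer(volatiles):
    """Stage 1: one detector per compound; no detector for a whole smell."""
    return {c for c in ALL_COMPOUNDS if c in volatiles}

def downstream_recognition(volatiles):
    """Stage 2: a higher area matches the activation pattern to known odors."""
    pattern = glomerular_layer(volatiles)
    for odor, components in ODOR_COMPONENTS.items():
        if pattern == components:
            return odor
    return "unrecognized"

print(downstream_recognition({"furan", "pyrazine", "guaiacol"}))  # coffee
print(downstream_recognition({"pyrazine"}))  # shared part alone: unrecognized
```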

Understanding how the olfactory system works in mice may also provide broader insights into human perception, said Stephen Shea, Ph.D., a Duke University Medical Center research associate who participated in the study. Perception relies on combining multiple components, whether the input is smell, sight or sound. Shea suggested that probing the olfactory system could help scientists better understand, for example, how the various biological and neurological components underlying perception formed and evolved.

Science Daily
July 3, 2006

If the chemistry is right … you might remember this

A young Australian scientist has made an important discovery about how brain cells communicate. This finding is central to understanding all brain function – from laying down memory to being able to walk. The groundbreaking research has been published in the latest edition of the world-leading journal Nature Neuroscience. Victor Anggono, a PhD student at the Children’s Medical Research Institute (CMRI), set out to identify the molecular partners of a key protein called dynamin, and how their partnership allows neurons to send messages.

The result was astounding. A protein called syndapin, previously thought to have no major role in nerve communication, was shown to be the molecule that works with dynamin to allow the transmission of messages between nerve cells. The brain functions by sending chemical messages between nerves. The messages, or neurotransmitters, are held in tiny packages at the nerve terminal, where they are released to send a signal. The packages then return to the cell and are re-filled so that brain function can continue.

In collaboration with researchers from the University of Edinburgh, further studies revealed that blocking the interaction of these two proteins shuts down nerve communication. “The partnership between dynamin and syndapin is crucial for the continuous cycle of neurotransmission. This makes syndapin a very specific target for medicines that could treat conditions where there is an overload of nerve activity, such as during seizures,” said Dr. Phil Robinson, leader of the research at the CMRI.

The relationship between dynamin and syndapin is also crucial to understanding other processes where there is a high level of brain activity and nerve transmission, such as when forming memories and during learning. Dr. Robinson says, “A discovery like this will be vital for future research into many neurological disorders, such as epilepsy, conditions of memory loss and schizophrenia. It is only through research like this that medical science can target specific problems and develop improved treatments.”

Science Daily
June 20, 2006

Original web page at Science Daily

Categories
News

Neurons find their place in the developing nervous system with the help of a sticky molecule

The brain, that exquisite network of billions of communicating cells, starts to take form with the genesis of nerve cells. Most newborn nerve cells, also called neurons, must travel from their birthplace to the position they will occupy in the adult brain. Researchers at the Salk Institute for Biological Studies have identified a molecule expressed on the surface of certain migrating neurons that helps them find their correct position along the way. Decreasing levels of that protein, an adhesion molecule called MDGA1, prevents neurons that normally make this protein from assuming their proper position, resulting in brain malformation, researchers report in the April 26th issue of the Journal of Neuroscience.

As Dennis D. M. O’Leary, Ph.D., senior author of the study and a Professor in the Molecular Neurobiology Laboratory, put it, “proper neuronal positioning is essential for development of appropriate wiring, which is in turn critical for establishing a normal, functioning nervous system.” Neurons migrate throughout the brain, but migration is particularly important for development of part of the brain known as the cerebral cortex. The cortex sits like a skullcap over the rest of the brain and is responsible for sensory perception, higher-level reasoning, and, in humans, language. In mammals, the largest and evolutionarily newest part of the cortex, the neocortex, is recognized anatomically by its six horizontal layers.

The neocortex develops outward from an underlying zone of cells. From that zone, crawling neurons migrate radially out toward the surface or “superficial” part of the developing cortex, giving rise to a laminar structure. Neurons forming layers 2 and 3, the focus of the current study, are born last and so must elbow their way through cells lying in earlier formed layers to reach what will become the outermost layers. Without MDGA1, these neurons begin to migrate but get stuck before they reach their normal destination.

The MDGA1 gene was first cloned and characterized in the rat by O’Leary and two former postdoctoral fellows, E. David Litwack, Ph.D., and Matthias Gesemann, Ph.D. They showed that MDGA1 is a cell adhesion molecule — a protein enabling cells to attach to other surfaces, something they must do both to move and to stay in place while elaborating connections. They also showed that MDGA1 is expressed on subpopulations of migrating neurons throughout the developing nervous system, including layer 2/3 neurons in the neocortex, suggesting that MDGA1 may actually be required for migration.

In the current study, O’Leary and Akihide Takeuchi, M.D., Ph.D., a postdoctoral fellow and the study’s first author, tested this hypothesis. They first showed that layer 2/3 neurons make MDGA1 protein as they migrate to their destination. Then, utilizing a cutting-edge molecular technique called RNA interference, the Salk researchers silenced the MDGA1 gene. To do this, they painstakingly performed in utero surgery on embryonic mice — injecting an interfering RNA molecule into the lateral ventricle, a fluid-filled space next to the neocortex. Application of an electrical current forced the RNA into neural progenitor cells; it was subsequently inherited by the neuronal progeny that form layer 2/3, blocking their ability to make MDGA1 protein.

When Takeuchi and O’Leary examined the neocortex a few days later, when the mice were born, they discovered that nearly all neurons containing the interfering RNA were stalled in aberrant, deeper locations, indicating that loss of MDGA1 protein had stymied their attempt to travel the full distance to layer 2/3 and supporting the original hypothesis. The goal now is to determine how MDGA1 controls neuronal migration and what the long-term consequences of its loss are. Impaired function of neuronal adhesion molecules has previously been linked to neurological defects in humans. A cell adhesion molecule known as L1 has been shown to affect cell migration and positioning in other parts of the nervous system. Numerous mutations in the human L1 gene have been uncovered; individuals with these mutations often show severe defects in neuronal positioning and connectivity, which are clinically manifested in conditions such as hydrocephalus, mental retardation and spastic paraplegia.

Whether mutations in MDGA1 lead to brain disorders remains to be seen. “Much work needs to be done, and the appropriate tools need to be developed to do this work,” said O’Leary, “but we feel that these studies will eventually provide insight into neurological disorders that have their basis in malpositioning of neurons.”

Science Daily
June 6, 2006

Original web page at Science Daily

Categories
News

Lead exposure leads to brain cell loss and damage years later

An average of 18 years after their last exposure, people who worked with lead show significant loss of brain cells and damage to brain tissue, according to a new study published in the May 23, 2006, issue of Neurology, the scientific journal of the American Academy of Neurology. The study examined 532 former employees of a chemical manufacturing plant who had not been exposed to lead for an average of 18 years. The workers had been employed at the plant for an average of more than eight years. The researchers measured the amount of lead accumulated in the workers’ bones and used MRI scans to measure the workers’ brain volumes and to look for white matter lesions, or small areas of damage in the brain tissue.

The higher the workers’ lead levels were, the more likely they were to have smaller brain volumes and greater amounts of brain damage. A total of 36 percent of the participants had white matter lesions. Those with the highest levels of lead were more than twice as likely to have brain damage as those with the lowest lead levels. Those with the highest levels of lead had brain volumes 1.1 percent smaller than those with the lowest lead levels. “The effect of the lead exposure was equivalent to what would be expected for five years of aging,” said study author Walter F. Stewart, PhD, of the Center for Health Research of the Geisinger Health System in Danville, PA, and the Johns Hopkins Bloomberg School of Public Health in Baltimore, MD. Stewart said the results confirm earlier findings in this same population that people with occupational lead exposure experience declines in their thinking and memory skills years after their exposure. “The effect of lead on the brain is progressive,” Stewart said. “These effects are the result of persistent changes in the structure of the brain, not short-term changes in the brain’s neurochemistry.”
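
As a rough consistency check on the “five years of aging” comparison, the arithmetic below infers the normal-aging rate that the comparison implies from the article’s own figures; the assumption of a linear age-related decline is ours for illustration, not the study’s.

```python
# Back-of-envelope check of the "five years of aging" equivalence,
# assuming (for illustration only) a linear age-related volume decline.

volume_deficit_pct = 1.1       # brain volume gap, highest vs. lowest lead group
aging_equivalent_years = 5.0   # the comparison quoted by Stewart

implied_rate = volume_deficit_pct / aging_equivalent_years
print(f"Implied aging rate: {implied_rate:.2f}% of brain volume per year")
# -> 0.22% per year, the rate the comparison implicitly assumes
```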

The findings raise new questions, according to Andrew S. Rowland, PhD, of the University of New Mexico in Albuquerque, who wrote an editorial accompanying the article. “There have been many studies done on the effects of lead on children’s IQ, but the possible effects in other areas, such as attention, aggression, or any mental disorders, have gotten less attention. Exposure to inorganic lead, like that found in paint, remains an important public health problem. And those of us who grew up before the late 1970s still carry high lead levels in our bodies. We need more studies addressing the potential chronic health effects of those exposures.”

Science Daily
June 6, 2006

Original web page at Science Daily

Categories
News

High resolution light microscope reveals the fundamental mechanisms of nerve communication

The development of STED microscopy has allowed researchers at the Max-Planck Institute for Biophysical Chemistry to image, for the first time, proteins from single synaptic vesicles, answering long-standing questions of neurocommunication (Nature, 13th April 2006). In a simultaneous publication (Science Express, 13th April 2006), STED microscopy revealed the spatial distribution of the bruchpilot protein and aided neurobiologists from the European Neuroscience Institute and the University of Würzburg in understanding the protein’s central role in the formation of active synaptic zones. STED microscopy radically distinguishes itself from conventional far-field light microscopy in the fact that its resolution is no longer fundamentally limited by the wavelength of light used. Using STED, nanoscale optical studies are now possible inside cells.

Since its invention in the 17th century, the light microscope has been the key to new biological and medical discoveries. Light, however, propagating as a wave, is subject to the phenomenon of diffraction, whose resolution-limiting effects were first described by Ernst Abbe in 1873. Abbe observed that structures closer to each other than ~200 nm could not be visually separated when observed using visible light; viewed through the optical microscope, they are perceived as a single, blurred entity. Abbe’s resolution limit was long thought to be an unalterable law of far-field light imaging. Achieving higher resolution required the use of an electron microscope.

Despite the fact that electron beams can be more tightly focused, it is often difficult to efficiently label the proteins of a cell to render them visible with an electron microscope. Moreover, electron beams are only able to penetrate the first several micrometers of a biological sample. For these reasons, among others, many questions of nerve function remained unanswered despite the use of electron microscopy for high-resolution cell imaging. In contrast, using fluorescent molecules as markers, one can specifically label individual proteins with high efficiency, rendering them visible with the conventional fluorescence microscope. Unfortunately, the high resolution required to separate nanoscale structures was lacking due to the diffraction barrier.

In recent years researchers in the department of NanoBiophotonics at the MPI for Biophysical Chemistry in Göttingen have been able to break the Abbe resolution limit of far-field optical microscopy, as applied to fluorescent imaging, using a technique known as Stimulated Emission Depletion (STED) microscopy. The STED microscope used to obtain data for both publications is able to attain a resolution of 50-70 nm; the original fluorescent spot, roughly 200 nm in diameter, is reduced in surface area within the imaging plane by roughly an order of magnitude using the STED technique.
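
The “order of magnitude” figure follows directly from the quoted spot sizes, since the area of a focal spot scales with the square of its diameter; the short computation below just makes that arithmetic explicit.

```python
# Area reduction of the fluorescent focal spot under STED, using the
# spot diameters quoted in the text (area scales as diameter squared).

d_conventional_nm = 200.0  # diffraction-limited spot, per Abbe's ~200 nm limit
for d_sted_nm in (50.0, 70.0):
    area_ratio = (d_conventional_nm / d_sted_nm) ** 2
    print(f"{d_sted_nm:.0f} nm STED spot: area reduced {area_ratio:.0f}x")
# -> 16x at 50 nm and 8x at 70 nm, i.e. "roughly an order of magnitude"
```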

This resolution was sufficient for researchers from the neurobiology department to visualize, for the first time, individual synaptic vesicles – more precisely, to visualize the protein synaptotagmin, which is embedded in the membranes of individual vesicles. Vesicles are membrane ‘bubbles’ roughly 40 nm in diameter that carry neurotransmitters, the chemical messenger molecules, to synapses, the contact points between nerve cells, enabling nerve signals to pass between cells. Their contents are released at the synapse when the vesicle membranes fuse with the membrane of the nerve cell. Previously it was unclear whether the proteins embedded in the vesicle membrane (e.g. synaptotagmin) spread out over the cell membrane after the fusion event or whether they remained together, localized in the membrane patch which previously formed the vesicle. With the aid of STED microscopy the researchers in Göttingen were able to show that the synaptotagmin molecules of a single vesicle remain together after fusion. The membrane of the nerve cell thus behaves in an ‘economical’ fashion: the vesicle proteins released onto the membrane of the nerve cell can be collectively reabsorbed to form another vesicle.

Neural vesicles do not fuse with the cell membrane with equal probability at all locations along the synapse junction, but preferentially at so-called ‘active zones.’ The bruchpilot protein discovered in fruit flies plays a decisive role in the formation of these active zones, as explained in the Science publication by Kittel et al. With STED imaging the scientists discovered that the bruchpilot protein is distributed in rings of ca. 150 nm diameter, forming the active zones. In these areas it appears that bruchpilot establishes the proximity between the calcium channels and the vesicles, enabling the efficient release of neurotransmitters.

These studies demonstrate for the first time that resolution below a half-wavelength of visible light is no longer reserved for the electron microscope when observing cells. As demonstrated by recently completed research, the resolution of STED microscopy can be further increased, in principle to reach molecular scales. The STED microscope has opened a new chapter in the story of light microscopy, one in which the fundamental questions of biological processes at the nanoscale can potentially be answered with focused light.

Science Daily Health & Medicine
May 23, 2006

Original web page at Science Daily Health & Medicine

Categories
News

Research shows how visual stimulation turns up genes to shape the brain

Scientists have long known that brains need neural activity to mature and that sensory input is most important during a specific window of time called the “critical period” when the brain is primed for aggressive learning. Vision, hearing and touch all develop during such critical periods, while other senses, such as the olfactory system, maintain lifelong plasticity. The visual system provides an exemplary model for studying developmental plasticity, however, because of the pioneering work of Nobel prize-winning HMS researchers David Hubel and Torsten Wiesel describing the visual system’s structure, prerequisite knowledge for investigating its flexibility. Although visual plasticity has been studied for over 40 years, exactly how sensory experience interacts with the built-in machinery that permits the brain to change its circuits is only beginning to be understood.

A new study focusing on the molecular roots of plasticity has found that visual stimulation turns up the expression of some genes and turns down the expression of others, somewhat like a conductor cueing the members of an orchestra. The study also found that during different stages of life in rodents, distinct sets of genes spring into action in response to visual input. These gene sets may work in concert to allow synapses and neural circuits to respond to visual activity and shape the brain, the researchers report in the May issue of Nature Neuroscience.

The investigators’ identification of many distinct sets of activity-dependent genes follows a shift in neuroscience research toward a more holistic view of the role of genes in neural development and plasticity. “What we found opens science up to a more global look at genes, from studying one gene at a time to looking at families of genes acting together,” said first author Marta Majdan, Harvard Medical School research fellow in neurobiology. These findings suggest that genetic therapies for neurodegenerative diseases, some of which are largely limited to treatment focused on a single gene, will require more extensive knowledge of molecular pathways and gene interactions to be successful.

Majdan and co-author Carla Shatz, department chair and HMS Nathan Marsh Pusey Professor of Neurobiology, studied rodents during the critical period in which visual input stimulates aggressive plasticity, shaping the mesh of neural connections in the cortex and tuning the strengths of messages relayed by synapses. In mice, this period begins shortly after they open their eyes and begin to see. Previous research had determined that visual activity changes the level of expression of, or regulates, individual genes such as Brain-derived neurotrophic factor (Bdnf).

To determine whether vision regulates other genes in these rodents, Majdan and Shatz imposed abnormal visual experiences on the rodents at a variety of ages, including the critical period, by removing one eye and leaving the other intact. They then compared gene expression profiles of the cortex supporting the open eye to that of the missing eye. They found that Bdnf is not alone: visual input changes the levels of expression of ten additional genes, dubbed the “common set,” at all ages investigated. By chemically inhibiting a MAP kinase already known to be linked to several common set genes, they found that this kinase acts as a relay, regulating these genes in response to visual activity.

The researchers found other sets of genes superimposed on this core pathway, but these sets are turned on and off by vision at specific ages before, during and after the critical period and into adulthood. “This suggests that sensory experience regulates different genes in your brain depending on your age and past experience,” said Shatz. “Thus, nurture, our experience of the world via our senses, acts through nature, sets of genes, to alter brain circuits.” These discoveries may lead to new ways of thinking about genetic therapies to correct early vision disorders. Because the brain is so altered by abnormal vision, restoring vision to a child afflicted with cataracts or strabismus, an eye misalignment which can impair vision, may not be enough to correct the damage. Nor will treatment involving single gene replacement.

“We need to try to find the major switches that turn on genes in the downstream network as opposed to looking at each element of the network and designing therapy based on each gene,” said Shatz. This study helps explain why it is that children learn so quickly and easily, and it lends credence to the idea that, in adults, mental activity leads to mental agility. “It is amazing that, even in our oldest mice we saw genes regulated by vision. Genes in the brain change with experience at every age, forming a basis for our ability to learn and remember even in adulthood,” said Shatz.

Science Daily
May 23, 2006

Original web page at Science Daily

Categories
News

Death of Alzheimer victim linked to aluminium pollution

Fears of a link between aluminium and Alzheimer’s disease have been reignited by the case of a British woman who died of the illness 16 years after an industrial accident polluted her local drinking water. An autopsy on Carole Cross’s brain showed that she was suffering from a rare form of early-onset Alzheimer’s when she died in May 2004, and also revealed the presence of high levels of aluminium in her tissues. The researchers who investigated her brain cannot say whether the aluminium was the cause, but point out that the woman had no family history of dementia. The polluting incident occurred in 1988 when a truck driver mistakenly emptied some 20 tonnes of aluminium sulphate — used in the early stages of wastewater treatment — into a tank containing drinking water destined for the village of Camelford in Cornwall, UK. An estimated 20,000 people may have been exposed to high levels of the chemical for several weeks. Concerned residents are waiting to see whether more people will be similarly affected. Anecdotal reports state that several other villagers are suffering from dementia.

Although only a single case, the discovery has reopened the possibility that aluminium could be linked to Alzheimer’s disease, say Christopher Exley, a chemist at Keele University, UK, and Margaret Esiri, a University of Oxford neurologist, who publish details of their investigation on Cross in the Journal of Neurology, Neurosurgery and Psychiatry. Aluminium is firmly linked to some temporary forms of dementia, Esiri says. Kidney dialysis patients living in areas where water is high in aluminium, for example, sometimes experience ‘dialysis dementia’, as a result of the large quantities of contaminated water passing through their bodies. But the link between aluminium and Alzheimer’s has been more controversial, says Daniel Perl, a neuropathologist at Mount Sinai School of Medicine in New York, who has written a commentary on the Camelford case.

Aluminium is often found in the twists of deformed protein, called ‘neurofibrillary tangles’, that characterize the disease. But there is no strong evidence that it is involved in the disease’s onset, Perl cautions. “I realize that’s quite a conservative answer,” he says. “But show me a couple more cases like this and I might have to change it.” Perl points out that, of the 20 most common elements on Earth, aluminium is the only one not involved in any essential biological process. That’s because of its feisty chemistry, he explains. When in solution, aluminium ions are small and highly charged, making them very reactive. “Once aluminium binds to proteins, it sticks for good,” he says. “It’s like trying to use superglue to mend a Swiss watch.”

What makes Cross’s case interesting is that she had succumbed to a very rare form of Alzheimer’s, but had a genetic predisposition, through a gene called APOE, to developing a more common form of the disease later in life, says Esiri. This raises the possibility that her aluminium exposure may have accelerated the onset of disease. Previous studies of transgenic mice expressing a similar gene have shown that feeding them aluminium in drinking water can cause similar symptoms at a young age. Cross’s protein tangles were found in the blood vessels rather than in the brain tissue itself. This is consistent with the idea that the cause of the disease could have originated in the gut, reaching the brain through the bloodstream, Esiri explains. Combined with the unusually young age at which she died (aged 58), this puts her in a category shared by only a handful of known cases worldwide, Esiri says.

The discovery may also rekindle fears over cooking with aluminium pots and pans, although Perl says that most aluminium is found in an insoluble form and is therefore not dangerous. The main way to ingest aluminium from cookware would be by cooking acidic foods such as rhubarb or tomatoes, which react with the metal. The news is worrying for Camelford’s residents, says Exley, who carried out the chemical analysis that detected the aluminium in the autopsy samples. “There are still 20,000 people thinking about whether they’re susceptible to this chronic disease,” he says. “We can’t do anything to help them.”

Nature
May 9, 2006

Original web page at Nature

Categories
News

New hope for Alzheimer’s patients

MIT brain researchers have developed a “cocktail” of dietary supplements, now in human clinical trials, that holds promise for the treatment of Alzheimer’s disease. For years, doctors have encouraged people to consume foods such as fish that are rich in omega-3 fatty acids because they appear to improve memory and other brain functions. The MIT research suggests that a cocktail treatment of omega-3 fatty acids and two other compounds normally present in the blood could delay the cognitive decline seen in Alzheimer’s disease, which afflicts an estimated 4 million to 5 million Americans. “It’s been enormously frustrating to have so little to offer people that have (Alzheimer’s) disease,” said Richard Wurtman, the Cecil H. Green Distinguished Professor of Neuropharmacology at MIT, who led the research team. The study appears in the May 9 issue of Brain Research.

Wurtman will present the research at the International Academy of Nutrition and Aging 2006 Symposium on Nutrition and Alzheimer’s Disease/Cognitive Decline in Chicago on Tuesday, May 2. The three compounds in the treatment cocktail – omega-3 fatty acids, uridine and choline – are all needed by brain neurons to make phospholipids, the primary component of cell membranes. After adding those supplements to the diets of gerbils, the researchers observed a dramatic increase in the amount of membrane that forms brain cell synapses, where messages between cells are relayed. Damage in brain synapses is believed to cause the dementia that characterizes Alzheimer’s disease.

If the successful results obtained in gerbils can be duplicated in the ongoing human trials, the new treatment could offer perhaps not a cure but a long-term Alzheimer’s treatment similar to what L-dopa, a dopamine precursor, does for Parkinson’s patients, said Wurtman, a professor in the Department of Brain and Cognitive Sciences. “It doesn’t cure Parkinson’s, but what it does do is to help replace something that’s missing. It’s not permanent, but it has had an enormous impact on people who have Parkinson’s,” he said.

The new potential treatment offers a different approach from the traditional tactic of targeting the amyloid plaques and tangles that develop in the brains of Alzheimer’s patients. Until recently, most researchers believed these plaques and tangles caused the cognitive decline. But the failure of this hypothesis to lead to an effective treatment for Alzheimer’s disease has caused some scientists to theorize that, though the plaques and tangles are always associated with the disease, they may not be the main cause of the dementia, nor the best target for treating it.

Instead, the new research focuses on brain synapses, where neurotransmitters such as dopamine, acetylcholine, serotonin and glutamate carry messages from presynaptic neurons to receptors in the membranes of postsynaptic neurons. In Alzheimer’s patients, synapses in the cortex and hippocampus, which are involved in learning and memory, are damaged. After the dietary supplements were given, the researchers detected a large increase in the levels of specific brain proteins known to be concentrated within synapses, indicating that more synaptic membranes had formed, Wurtman said. Synaptic membrane protein levels rose if the gerbils were given either omega-3 fatty acids or uridine plus choline. However, the most dramatic upsurge was observed in gerbils fed all three compounds.

“To my knowledge, this is the first concrete explanation for the behavioral effects of taking omega-3 fatty acids,” said Wurtman. Choline can be found in meats, nuts and eggs, and omega-3 fatty acids are found in a variety of sources, including fish, eggs, flaxseed and meat from grass-fed animals. Uridine, which is found in RNA and produced by the liver and kidney, is not obtained from the diet. However, uridine is found in human breast milk, which is a good indication that supplementary uridine is safe for humans to consume, Wurtman said. Recent studies by the researchers at MIT, and by scientists at Cambridge University in England, showed that either uridine or omega-3 fatty acids can promote the growth of neurites, which are small outgrowths of neuronal cell membranes. That further supports the hypothesis that omega-3 fatty acids increase synaptic membrane formation, said Wurtman.

Alzheimer’s patients in the clinical trials, which will involve multiple medical centers, are being given a drink that contains the compounds under study, or a taste-matched placebo. “If it works as well on the brains of people with Alzheimer’s disease as it does in laboratory animals, I think there will be a lot of interest,” Wurtman said.

Science Daily
May 9, 2006

Original web page at Science Daily

Categories
News

Ways to regenerate the ear’s hearing cells

Massachusetts General Hospital (MGH) researchers have made important progress in their ongoing effort to regenerate the inner ear’s hair cells, which convert sound vibrations to nerve impulses. In an upcoming issue of Proceedings of the National Academy of Sciences, they report successfully creating a mouse model that allows them to build on earlier findings about the effect of deactivating a protein that controls the growth and division of hair cells. The paper, which is receiving early online publication, also finds that suppressing the retinoblastoma (Rb) protein has different effects in specific parts of the inner ear.

“In these first studies of the role of the Rb protein in the ears of postnatal mice, we have confirmed that — under the right conditions — mature hair cells can go through the cell cycle and produce new, functioning hair cells. But we’ve also confirmed that you need to block Rb reversibly and at an early stage of development, otherwise the hair cells will die,” says Zheng-Yi Chen, DPhil, of the MGH Neurology Service, the study’s senior author. In 2005 Chen was named to the Scientific American 50, the magazine’s annual list of outstanding leaders, for this continuing research project. Named for the hair-like projections on their surfaces, hair cells form a ribbon of vibration sensors along the length of the cochlea — the organ of the inner ear that senses sound — where they convert sonic vibrations to electrical signals that are carried to the brain. The cells are very sensitive to damage from excessive noise, infections and toxins. Once damaged, hair cells do not naturally regenerate in mammals, and their death accounts for most types of acquired hearing loss.

All cells grow and divide through a process called the cell cycle, and many proteins have been identified that control different cell cycle phases. In 2005 Chen’s group published a paper in the journal Science reporting that the Rb protein, known to suppress the cell cycle, could be important for halting the cell cycle in hair cells. They used a genetically modified mouse strain in which Rb was no longer made in the inner ear. By examining the inner ears of mouse embryos — that strain did not survive past birth — the researchers found more hair cells in the knockout mice than in the ears of normal mice at the same stage of development. The additional cells looked and functioned like normal hair cells and appeared to be actively regenerating.

For this followup study, the researchers developed a new strain of inner-ear Rb-knockout mice that survive for up to six months past birth. Their investigation of the effects of Rb deletion on the hair cells of the inner ear finds differences between the auditory portion of the organ, which controls hearing, and the vestibular area, which is involved with balance. While the Rb-negative auditory hair cells in early postnatal mice are dividing and growing, the cells do not mature properly and eventually die, resulting in the mice becoming deaf by the age of 3 months. Vestibular hair cells, however, appear to grow and mature relatively normally and continue cell division even in mature mice. Adult Rb-knockout mice maintain some vestibular function, indicating that those hair cells are contributing to their sense of balance at the system level.

“We’ve shown that vestibular hair cell regeneration may be achieved and may be less of an obstacle than auditory cell regeneration,” Chen says. “Now we need to find ways to create a similar system in the auditory cells, and this new model will help us better understand the mechanisms behind functional hair cell regeneration. Our next step will be developing a transient, reversible block of Rb function to assess its role in both types of hair cell.” Chen is an assistant professor of neurology at Harvard Medical School (HMS).

Science Daily
May 9, 2006

Original web page at Science Daily

Categories
News

Possible brain hormone may unlock mystery of hibernation

The discovery of a possible hibernation hormone in the brain may unlock the mystery behind the dormant state, researchers reported in the April 7, 2006 issue of Cell. Hibernation allows animals from bears to rodents to survive unscathed–in a state of suspended animation–under the harshest of winter conditions. If the findings in chipmunks are confirmed, the hormone would represent the first essential brain signal governing the seasonal adaptation, according to the researchers. As hibernation factors endow animals with an incredible ability to cope under otherwise lethal conditions–ratcheting down their metabolic rate to survive on limited energy reserves and withstanding extreme cardiovascular and oxygen stresses–the candidate hormone might also pave the way toward clinical therapies that lend humans the same kind of protection, they added.

The researchers earlier found that concentrations of “hibernation-specific protein” complex (HPc) decline in the blood of hibernating chipmunks. The team now reports evidence that the level of HPc in the brain increases at the onset of hibernation independently of changes in body temperature. Moreover, treatments that block HP activity in the animals’ brains cut hibernation short. “One of the most curious biological phenomena in mammals is their ability to hibernate circannually, which allows them to survive unusually low body temperatures at or near freezing,” said study author Takashi Ohtsu of Kanagawa Academy of Science and Technology in Japan. Scientists have attempted for decades to identify substances responsible for hibernation in the blood and organs of hibernating animals but have met with little success, the researchers said. “Although the functions of HP remain to be clarified, the current observations lead us to propose the involvement of the protein complex in the regulation of energy metabolism and/or biological defenses during hibernation–crucial events for adapting to the severe physiological state,” Ohtsu said.

In the current study, the researchers first demonstrated that hibernation in chipmunks is strictly controlled by an individual’s internal circannual rhythm even under conditions of constant cold. In 20 hibernators examined throughout their lives, concentrations of HPc in the blood started to decrease prior to hibernation and remained low throughout the inactive state. Hibernation ended after blood HPc levels rose. Further study revealed an inverse relationship between HPc levels in the blood and brain. While HPc levels dipped in blood, the putative hormone rose dramatically in cerebrospinal fluid, they reported. Likewise, HPc levels decreased abruptly in spinal fluid when hibernation terminated.

The researchers also found that blocking the activity of one of the HP complex proteins in the brain with an antibody greatly decreased the hibernation time during which the chipmunks maintained a lowered body temperature, suggesting its critical role in the brain’s capacity for dormancy. The researchers propose that HPc in the blood is actively transported into the spinal fluid in response to the animals’ natural rhythm. The hibernation complex might also play a role in the seasonal behavior changes of animal species that do not hibernate, the researchers suggested. For example, the complex could moderate physiological events such as reproduction in seasonally breeding mammals and migration in birds, they said. Even humans can maintain seasonal rhythms, as exhibited by seasonal affective disorder, a recurrent depression characterized by increased sleep, overeating, and weight gain–behaviors similar to those seen in hibernators, Ohtsu noted.

“Hibernation is an extreme response to a seasonal environment, yet we knew almost nothing about how it is timed, nor how vital cellular functions are sustained in the face of plummeting body temperature,” wrote Michael Hastings in a preview. The researchers now “identify a liver-derived protein complex as an essential coordinator of this adaptation to the depredations of winter.” “The finding has more than passing biological interest because understanding how tissues cope with the cardiovascular and oxidative stresses associated with hibernation or torpor may have direct clinical relevance,” he added. For example, he wrote, such a protective program might be exploited in transplant and vascular surgery. Scientists have suggested that hibernation therapy might effectively preserve donor organs for weeks or months. Hibernation has also been found to protect animals from a wide range of potential threats, from muscle disuse to cancer, the study authors said. Therefore, hibernation therapy might confer protective effects in other clinical arenas as well.

The new findings could lead to “potential pharmacological applications in humans to the prevention of lethal diseases, such as hypothermia, ischemia, muscle atrophy, bacterial infection, and tumorigenesis, which has been observed during hibernation in hibernators,” the researchers said. “These studies may further stimulate the exploration of new techniques for cryosurgery of the heart and brain, as well as the development of hypothermia treatment that is effective for preventing brain ischemic damage.” In cryosurgery, physicians use extreme cold to destroy abnormal tissue, such as cancerous tumors.

Science Daily
April 25, 2006

Original web page at Science Daily

Categories
News

Researchers observe a new mechanism by which receptors enter hippocampal neurons

Scientists have observed a new mechanism of receptor insertion in hippocampal neurons, which they term ‘kiss and wait’: receptors emerge at the surface of the cell and appear to pause for up to thirty seconds before spreading laterally, according to a study appearing this week in Nature Neuroscience. The images produced in the study are the first to show individual trafficking events of receptor insertion into the plasma membrane of any cell type. “We’re proposing, although this remains for further study, that these events then represent a distinct mode of membrane protein insertion that can occur in the plasma membrane,” lead author Mark von Zastrow at the University of California, San Francisco, told The Scientist.

Until now, scientists have struggled to elucidate the details of receptor insertion, because fluorescence techniques often have too low a signal-to-noise ratio to image single insertions. To overcome this technical obstacle, von Zastrow’s team used total internal reflection fluorescence (TIRF) microscopy, along with a pH-sensitive variant of GFP that fluoresces at the plasma membrane but is silenced when exposed to an acidic environment during endocytosis. The researchers fused this GFP variant to the extracellular domain of the human β2-adrenergic receptor, and monitored receptor insertions in pyramidal neurons exposed to an adrenergic agonist.

They observed that the receptor appeared on the membrane via two different mechanisms — a transient mode, in which the receptor remains on the surface for only several seconds, and a persistent mode, which lasts for tens of seconds before the receptors spread laterally throughout the membrane. Receptor activation inhibited the transient insertion events but enhanced the persistent events. In addition, the frequency of transient events increased in response to a protein kinase A inhibitor, whereas the persistent events did not.
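
A minimal sketch of how the two modes could be separated in such recordings, using only the dwell-time distinction reported above (seconds versus tens of seconds); the 10-second cutoff and the event list are invented for illustration and are not the authors' analysis.

```python
# Toy classification of receptor insertion events by surface dwell time.
# The cutoff and the example dwell times are assumptions for illustration.

TRANSIENT_MAX_S = 10.0  # assumed boundary between "seconds" and "tens of seconds"

def classify(dwell_s: float) -> str:
    return "transient" if dwell_s < TRANSIENT_MAX_S else "persistent"

events_s = [2.1, 4.8, 31.0, 3.3, 18.5, 27.9]  # invented dwell times, seconds

counts = {"transient": 0, "persistent": 0}
for dwell in events_s:
    counts[classify(dwell)] += 1
    print(f"{dwell:5.1f} s -> {classify(dwell)}")

print(counts)  # the study reports these modes shift with receptor activation
```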

The benefit of TIRF is that it enables researchers to observe events very close to the cell surface, without interference from out-of-focus background light, von Zastrow noted — a “critical” feature, he added, given that the light from a single inserting vesicle is very dim. The fact that the two modes of insertion respond differently to receptor activation could be a basis for activity-dependent control, he noted, and the cell could be “switching” the signal from one pathway to another in response to prolonged receptor activation. Although the idea remains speculative, the theory could mean a “completely new level of plasticity in neural signaling.”

Tim Ryan of Weill Medical College of Cornell University, who did not participate in the study, agreed that the researchers’ method was critical to finding their result. “The techniques are now allowing people to look at things directly happening at the membrane…This is the first time someone has been able to visualize this class of receptor dynamics.” Still, Ryan noted that it is difficult to gauge the implications of the new mode of insertion without knowing whether the cell signals differently during the persistent state. Roberto Malinow of Cold Spring Harbor Laboratory in New York, who was also not involved in the study, cautioned against putting too much stock in data from dissociated cells. Specifically, “there is an issue about how much the mechanisms that are identified in dissociated culture cells are applicable to neurons in the brain,” he told The Scientist. Von Zastrow said the fact that a protein kinase A inhibitor and receptor activation affect insertion into the membrane suggests this is a tightly regulated process, a new idea in receptor research. “What we’ve learned over the past ten years is that receptors are actually very dynamic proteins.”

The Scientist
April 25, 2006

Original web page at The Scientist

Categories
News

Brain compensates for aging by becoming less “specialized”

One of two separate areas of the brain lights up when younger people look at a house or a face, but each image activates both areas of the brain at the same time in older persons, according to a study published by Yale University and the University of Illinois, Urbana-Champaign, this month in NeuroReport. Although the researchers cannot say for sure, one theory that needs further study is that the extra activity in older adults is compensation for age-related changes in brain volume or efficiency, according to Christy Marshuetz, assistant professor in the Department of Psychology and a co-author of the study.

The study included a dozen people aged 18 to 27 and an equal number aged 61 to 80. They were asked to remember three images of houses or three images of faces and then asked to decide whether another image was from the original set. Functional magnetic resonance imaging was used to track neural changes during these tasks. Marshuetz said it has been known for some time that there are different regions in the inferior temporal lobes of the brain that respond to faces and to photographs of houses. It is also well established that as humans age, both neural and cognitive function become less differentiated. But the data are sparse, and previous studies have examined neural activity only during passive viewing.

In this study, the researchers examined age differences in neural specialization for “faces and places” in a working memory task. They hypothesized that even when consciously remembering specific items, older adults would show decreased specialization in the fusiform face area of the brain and the parahippocampal place area of the brain when compared with younger adults. The researchers also expected, and found, more activity in older adults in the frontal cortex and believe this activity is compensation for less differentiation in the visual cortex at the back of the brain. “Our findings are the first to demonstrate decreased neural specialization in the ventral visual cortex in older adults, along with increased activations in the prefrontal cortex,” Marshuetz said. “This underscores the importance of taking into account the connected and networked nature of the brain and its function in understanding human neural aging.”
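
One standard way to quantify the specialization at issue, though not a formula taken from this paper, is a selectivity index comparing a region's response to its preferred and non-preferred categories; the sketch below runs the arithmetic on invented response values that mimic the reported pattern.

```python
# Selectivity index: (preferred - nonpreferred) / (preferred + nonpreferred).
# 1.0 means fully category-selective; 0.0 means no preference.
# Response values are invented to mimic the pattern the study reports.

def selectivity(preferred: float, nonpreferred: float) -> float:
    return (preferred - nonpreferred) / (preferred + nonpreferred)

# Hypothetical face-area responses (arbitrary fMRI signal units):
groups = {
    "younger": {"faces": 1.8, "houses": 0.4},
    "older":   {"faces": 1.5, "houses": 1.1},  # houses now also drive the area
}

for label, resp in groups.items():
    idx = selectivity(resp["faces"], resp["houses"])
    print(f"{label}: selectivity = {idx:.2f}")
# -> younger ~0.64, older ~0.15: weaker specialization with age
```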

Yale University
April 25, 2006

Original web page at Yale University

Categories
News

DNA gene vaccine protects against harmful protein of Alzheimer’s disease

Dr. Roger Rosenberg, director of the Alzheimer’s Disease Center, led a research team, including Dr. Bao-Xi Qu, assistant professor of neurology, that investigated in mice whether a specific antibody is therapeutically effective as a means to inhibit the formation of amyloid-beta deposits in the brain. By pressure-injecting the gene responsible for producing the specific protein — called amyloid-beta 42 — the researchers caused the mice to make antibodies and greatly reduce the protein’s build-up in the brain. Accumulation of amyloid-beta 42 in humans is a hallmark of Alzheimer’s disease.

“The whole point of the study is to determine whether the antibody is therapeutically effective as a means to inhibit the formation of amyloid-beta storage in the brain, and it is,” said Dr. Roger Rosenberg, the study’s senior author and director of the Alzheimer’s Disease Center at UT Southwestern. The gene injection avoids a serious side-effect that caused the cancellation of a previous multi-center human trial with amyloid-beta 42, researchers said. UT Southwestern did not participate in that trial. In that earlier study, people received injections of the protein itself and some developed dangerous brain inflammation. The new study is available online and appears in an upcoming issue of the Journal of the Neurological Sciences.

The researchers used mutant mice with two defective human genes associated with Alzheimer’s, genes that produce amyloid-beta 42. “By seven months, the mice are storing abundant amounts of amyloid-beta 42,” said Dr. Rosenberg, who holds the Abe (Brunky), Morris and William Zale Distinguished Chair in Neurology. While the mice were young, the scientists coated microscopically small gold particles with human amyloid-beta 42 genes attached to other genes that program cells to make the protein. The particles were then injected with a gene gun into the skin cells of the mice’s ears using a blast of helium.

After receiving 11 injections over several months, the mice showed a high level of antibodies to amyloid-beta 42, and a 60 percent to 77.5 percent reduction of plaques in their brains. As controls, the researchers also either injected mutant mice with the gene for a related but harmless protein, amyloid-beta 16, or with a gene vaccine that lacked any amyloid genes. These treatments did not cause antibody production, and the mice showed the large amounts of amyloid-beta 42 brain plaques normally seen in animals with these mutations.

The gene injection showed superior results compared with a previous human study in which the amyloid-beta 42 protein itself was injected into muscle, Dr. Rosenberg said. That study was halted when a small percentage of participants developed inflammation of the brain and spinal cord. Injecting the gene, in contrast, caused no brain inflammation in the mice. Dr. Rosenberg said the difference was partly because in the human trial, the protein was injected along with a substance called an adjuvant, which increased the immune response to abnormally high levels, causing the dangerous brain inflammation. In addition, the immune response in humans may have been of the Th1 type, which was probably partly responsible for the inflammation. The gene injection in the mouse study produced a Th2-type response, which has a low probability of causing brain inflammation. Furthermore, no adjuvant was needed for antibody production.

The gene immunization is now undergoing further animal studies, with the ultimate goal being a clinical trial in humans. The researchers also plan to see whether it can shrink established plaques in the brains of mice.

Science Daily
April 11, 2006

Original web page at Science Daily

Categories
News

Seed of Alzheimer’s spotted

A US team has identified what could be the earliest indication of Alzheimer’s, a discovery that may help to diagnose the disease and perhaps stop it progressing. Researchers believe that some people show signs of memory loss years before they develop Alzheimer’s. But they are not sure what causes these problems, or how they turn into full-blown dementia. To find out, Karen Ashe at the University of Minnesota, Minneapolis, and her team studied a strain of mice that, like people, develop mild memory problems in middle age before getting more severe Alzheimer’s symptoms. The mice were genetically engineered to make a version of a human protein called amyloid-β. Researchers know that this protein clogs the brains of Alzheimer’s patients late in the disease.

By extracting amyloid-β from the animals’ brains, the team discovered a knot of 12 amyloid-β molecules that appears outside brain cells just as memory loss occurs. These clusters, which it calls Aβ*56, are different from the large plaques of amyloid-β that form later in Alzheimer’s patients. The more of these clusters mice have, the worse their memories are, the team reports in Nature. And injecting clusters into the brains of rats causes temporary amnesia. The tactic of seeking the molecular changes that occur at the very first signs of memory loss is “somewhere between ingenious and lateral thinking”, says neuroscientist Richard Morris, who studies memory formation at the University of Edinburgh, UK. Before this, a lot of research focused on the later stages of the disease, when neurons are dying and it is hard to work back to the initial problem.

Ashe proposes that the clusters of amyloid-β could interrupt memory by jamming communication between neurons. She suggests that they could be one cause of the mild memory loss widely associated with old age. They could also prefigure the death of neurons and severe cognitive problems. “This may be the seed that defines who gets Alzheimer’s,” she says. The study adds to evidence that Alzheimer’s starts out when molecules of amyloid-β start to clump together. “These are the bad guys,” says Alzheimer’s specialist Dennis Selkoe of Brigham and Women’s Hospital in Boston, Massachusetts. The idea needs testing with additional experiments, Ashe says. She is scrutinizing the preserved brains of people who displayed early signs of Alzheimer’s and then died from other causes, to see whether they contain Aβ*56. Another key test is whether drugs that stop the protein clusters forming can halt memory loss in mice. “That’s the gold-standard proof of this hypothesis,” Ashe says.

Alzheimer’s is difficult to diagnose and incurable; there are an estimated 18 million sufferers worldwide. Spotting clusters of amyloid-β in the blood might help doctors to identify who is at risk of disease, helping them to target possible treatments. Drugs or vaccines that stop the clusters forming might block the brain’s decline. Researchers are already testing several methods to block the production or accumulation of amyloid-β in humans. “We’re in good shape in what needs to be done therapeutically,” Selkoe says.

Nature
March 28, 2006

Original web page at Nature