Categories
News

Malaria, Irish potato famine pathogen share surprising trait

Two wildly different pathogens – one that infects plants, the other humans – essentially use the same protein code to get their disease-causing proteins into the cells of their respective hosts. That’s what researchers from Ohio State and Northwestern universities report in a study published in the current issue of the journal PLoS Pathogens. The scientists were surprised to learn that the pathogen that causes malaria in humans and the microbe that caused the Irish potato famine use identical protein signals to start an infection.

“I don’t think anyone expected this,” said Sophien Kamoun, a study co-author and an associate professor of plant pathology at Ohio State’s Ohio Agricultural Research and Development Center in Wooster. “These are very different pathogens, and we never realized that there might be some similarities between them.” Kamoun says not to worry – there’s no chance that the potato pathogen will jump to humans, nor is it likely that the malaria parasite will start infecting plants.

However, he said it’s feasible to think that one day researchers could develop a drug with a dual purpose – one that would stop both Plasmodium falciparum, which causes malaria, and Phytophthora infestans, the microbe that triggers late blight in potatoes and tomatoes. “It sounds crazy, but it’s not totally ridiculous to consider such a drug,” said Kamoun, who is an expert on the Phytophthora group of pathogens. He conducted the study with lead author Kasturi Haldar of Northwestern and with colleagues from both Ohio State and Northwestern.

Each year, malaria kills more than one million people – mostly young African children – and Phytophthora pathogens devastate a wide range of food and commercial crops. The researchers swapped a small protein segment, called the leader sequence, in P. falciparum for the leader sequence of P. infestans. A leader sequence is a group of about 20 to 30 amino acids on a protein secreted by the parasite. This sequence contains instructions on how to enter, and therefore start infecting, a plant or animal cell. In laboratory experiments, the researchers infected human red blood cells with the modified malarial pathogen. Results showed that malaria proteins could enter and infect a cell just as effectively when they carried the P. infestans leader sequence instead of their own.
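Conceptually, the swap treats each secreted effector as a short N-terminal leader followed by the effector cargo. The sketch below uses invented placeholder sequences and an assumed leader length; it illustrates the design of the experiment, not real parasite sequences:

```python
# Toy sketch of the leader-swap experiment. A secreted effector protein is
# modeled as a short N-terminal leader (the host-targeting signal) followed
# by the effector "cargo". All sequences here are invented placeholders.

def swap_leader(protein: str, new_leader: str, leader_len: int = 30) -> str:
    """Replace the first `leader_len` residues (the leader sequence)."""
    return new_leader + protein[leader_len:]

falciparum_effector = "M" + "A" * 29 + "EFFECTORCARGO"   # hypothetical
infestans_leader    = "M" + "G" * 29                      # hypothetical

chimera = swap_leader(falciparum_effector, infestans_leader)
assert chimera.endswith("EFFECTORCARGO")      # the cargo is unchanged
assert chimera.startswith(infestans_leader)   # only the leader is replaced
```

The experimental result amounts to saying that a protein like `chimera` still reaches the host cell: the cargo does not care which of the two leaders delivers it.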

“Our findings show that very distinct microbes can share similar strategies for delivering toxic proteins to their targets,” Kamoun said. About a year and a half ago, Kamoun and his group at Ohio State read studies conducted by the Northwestern researchers that described the leader sequence of the malaria parasite. The similarity between this and the leader sequence of P. infestans was remarkable, he said. “So we decided to collaborate and see if the sequences were not just similar, but also functionally the same,” he said. “It turned out that they were. But although the mechanism of getting virulence proteins into a host cell is very similar, the infection-causing proteins that are delivered to a host are completely different.”

To Kamoun’s knowledge, this is the first paper to show that such dissimilar pathogens of this type – both are eukaryotic organisms – share a remarkably similar trait. He and his colleagues aren’t sure how to explain this phenomenon, as these pathogens belong to distinctly different evolutionary groups.

Science Daily
June 20, 2006

Original web page at Science Daily


RNA interference gene therapy takes two steps forward, one step back

Three years ago Mark Kay, MD, PhD, published the first results showing that a hot new biological phenomenon called RNA interference was an effective gene-therapy technique in mice. That finding kicked off a flurry of RNAi gene therapy research among both academic and industry groups. Now, with three human RNAi gene therapy trials under way, Kay’s initial excitement is proving to be on target. However, reaching this point hasn’t been without challenges. In the latest twist, Kay, professor of genetics and of pediatrics at the Stanford University School of Medicine, and postdoctoral fellow Dirk Grimm, PhD, report an unexpected side effect of another type of RNAi gene therapy not yet in trials – mice in that study suffered liver toxicity from the treatment, and some died. Despite that initial result, to be published in the May 25 issue of Nature, Kay and Grimm went on to find a promising way to resolve this side effect. “Just like any other new drug, it is just going to mean that we need to proceed cautiously,” Kay said.

In traditional gene therapy the inserted DNA produces a gene to replace one that carries a mutation. In hemophilia, for example, the inserted gene makes a protein that is missing in the blood of people with the disease. RNAi gene therapy has the opposite effect. The inserted DNA produces a molecule called an shRNA, which turns off an overactive gene. With key genes shut off, viruses such as hepatitis B, hepatitis C or HIV are unable to multiply and cause disease. However, some reports had suggested that RNAi gene therapy might induce an immune reaction or switch off the wrong gene or genes. As these concerns faded, things began looking up for RNAi with three RNAi therapies now in human trials – two for macular degeneration and one for a type of pneumonia. However, these studies involve simply infusing the RNAi molecules into the eye or lung. The RNAi effects in these therapies aren’t permanent. Instead, patients may need to receive repeat doses of the RNAi.
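The silencing step rests on base-pairing: a guide strand derived from the shRNA pairs with a complementary messenger RNA, marking it for destruction or blocking its translation. A minimal sketch of the complementarity logic, with an invented sequence rather than a real viral one:

```python
# Minimal illustration of the base-pairing logic behind RNAi: a guide strand
# silences a transcript it is exactly antisense to. The sequence below is an
# invented placeholder, not a real viral transcript.

COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def reverse_complement(rna: str) -> str:
    """Return the reverse complement of an RNA sequence."""
    return "".join(COMPLEMENT[b] for b in reversed(rna))

target_mrna = "AUGGCUUACGGA"                 # hypothetical viral transcript
guide = reverse_complement(target_mrna)      # a perfect-match guide strand

# Applying the operation twice recovers the original: the guide is exactly
# antisense to the message it silences.
assert reverse_complement(guide) == target_mrna
```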

If RNAi is going to be viable as a therapy for organ-wide diseases such as hepatitis B or C, it will have to stick around. Kay and Grimm felt they needed to show that the shRNA molecule made by the therapeutic gene would continue to be safe if it existed in high levels in a tissue over long periods of time. Instead of proving the safety of RNAi gene therapy, the pair found that persistent, high levels of the shRNA made the mice sick, and in some cases the mice even died. The problem, it seems, is that in the process of shutting down the viral genes, therapeutic shRNA molecules also hijack the cell’s normal RNAi machinery. With that machinery otherwise engaged, it’s not available to carry out its normal role in the cell.

“One benefit of RNAi gene therapy is that it uses the body’s own machinery, making it an effective approach,” Kay said. “However, the detriment of RNAi gene therapy turns out to be that it uses the body’s own machinery.” Nonetheless, Grimm and Kay bypassed the toxic effects by producing the therapeutic shRNA molecule at lower levels. They were able to prevent the human hepatitis B virus from replicating in mouse liver for more than half a year after a single therapy using this technique. Kay and Grimm said they have more work to do to learn the best way of making shRNA at levels high enough to be effective as gene therapy but low enough to avoid toxicity in humans.

Kay said that cancer and viral diseases such as AIDS or hepatitis B and C are likely targets for future RNAi therapies. In order to get to these trials, Kay said he and Grimm would need to work out what caused the toxic effects in mice and further develop strategies for circumventing that reaction. He expects that trials already under way will help him and others figure out the best way to bring RNAi gene therapy safely to humans.

Science Daily
June 20, 2006

Original web page at Science Daily


Model reveals how cells avoid becoming cancerous

Scientists at the University of California, San Diego (UCSD) and three other institutions have described for the first time a web of inter-related responses that cells use to avoid becoming diseased or cancerous after being exposed to a powerful chemical mutagen. The group led by UCSD bioengineering professor Trey Ideker describes in the May 19 issue of Science an elaborate system of gene control that was triggered by chemical damage to DNA. The information could be used eventually to develop drugs to boost DNA repair and possibly treat xeroderma pigmentosum, a disease in which the body’s ability to repair DNA damage caused by ultraviolet light is disabled; Werner syndrome, a premature aging disorder; and certain immune deficiencies and other degenerative diseases.

“Response to DNA damage is a basic physiological process that is important to coping with environmental toxins and a number of congenital diseases,” said Ideker, the senior author of the paper. “Over the past several decades, scientists have discovered many parts of the DNA-damage-repair machinery, but what has been missing until now is a ‘systems biology’ approach that explains how all the parts function together to enable a cell to repair its DNA while under routine assault.” UCSD post-doctoral fellow Christopher T. Workman, Ph.D. candidate Craig Mak, and technicians Scott McCuine and Maya Agarwal analyzed the effect of exposure of yeast cells to MMS (methyl-methanesulfonate), a chemical known to cause DNA damage in a manner similar to that of certain mutagens in tobacco smoke. The alkylation injury caused by MMS results in small kinks in the otherwise smoothly curving double helix of DNA. Cells rapidly identify the damage, stop dividing, excise the damaged DNA, and use several alternate methods to substitute a clean copy of genetic material.

“It’s almost as if cells have something akin to a computer program that becomes activated by DNA damage, and that program enables the cells to respond very quickly,” said Mak. “And this program is easily recognizable as operating in everything from yeasts to humans and mice to fruit flies.” Researchers have previously identified hundreds of genes involved in repairing MMS damage. However, they have been mystified by another group of genes whose expression is sharply affected by DNA damage, but which appear to play no role in repairing the damage itself. Ideker’s team uncovered a tangled network of interactions for 30 transcription factors with hundreds of yeast genes. A transcription factor is a protein that, either alone or in combination with other transcription factors, binds to one or more genes to affect the expression of that gene or genes. The discovery by Ideker’s group of a huge network of transcription factor-gene interactions was made possible by new biotechnology tools that provide comprehensive analysis of cells, like a passerby suddenly being able to monitor all the telephone calls made within a city.
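A transcription factor-to-gene map of this kind is naturally represented as a bipartite network. The sketch below uses invented names to illustrate the data structure only; they are not the study’s actual factors or genes.

```python
# Toy transcription-factor -> target-gene network of the kind described
# above. All names are invented placeholders, not the study's data.

network = {
    "TF_A": {"geneRepair1", "geneCellCycle1"},
    "TF_B": {"geneCellCycle1", "geneStress1", "geneMetab1"},
    "TF_C": {"geneRepair1"},
}

# Invert the mapping to ask: which factors regulate a given gene?
regulators = {}
for tf, targets in network.items():
    for gene in targets:
        regulators.setdefault(gene, set()).add(tf)

assert regulators["geneCellCycle1"] == {"TF_A", "TF_B"}
```

At the study’s scale (30 factors, hundreds of genes) the same inverted index is what lets one trace how a single damage signal fans out to repair, cell-cycle, and housekeeping genes at once.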

The team discovered that part of the interaction network was involved, as expected, in repairing damaged DNA. However, they were surprised to find that a much larger part of the network is involved in modulating the expression of genes not directly related to DNA repair, such as genes involved in cell growth and division, protein degradation, responses to stress, and other metabolic functions. Ideker and others have theorized that when a cell’s DNA is damaged, the cell may be programmed to also stop dividing and perform a variety of housekeeping chores while it repairs its DNA. If true, the model may demystify the long-standing question of why DNA damage influences the expression of hundreds of genes not involved in the actual repair process.

“What we quickly realized is that we had uncovered not just a model of DNA repair, but a blueprint of how the initial event of DNA damage is transmitted by these transcription factors to repair processes and all the other important functions of the cell,” said Ideker. “With this model now in hand, we’d like to take a much closer look at the cell’s response to environmental toxins. We’d like to understand what goes wrong in certain congenital diseases involving DNA repair, and we’d also like to understand how the model plays a role in various cancers.”

“This research sheds light on the complexity of DNA repair, and offers an example of how the cellular process stimulates other pathways,” said David Schwartz, director of the National Institute of Environmental Health Sciences (NIEHS), one of the agencies that funded the study. Schwartz agreed that the new findings could have practical benefits: “This new knowledge has great potential for the development of new therapeutic agents to combat a broad spectrum of diseases, including cancer, neurodegenerative diseases, and premature aging.”

Science Daily
June 6, 2006

Original web page at Science Daily


Clues to how B cells establish affinity

In the early moments of an immune response, B lymphocytes spread around cell membranes containing foreign antigens and gather these antigens into aggregates, according to a paper in this week’s Science. The authors also found that B cells spread farther around membranes containing high-affinity antigen, which leads to increased antigen accumulation and B cell activation. “This could be a mechanism that would allow for affinity maturation,” in which B cells that produce antibodies with high antigen affinity are selected for survival by the immune system, said Anthony DeFranco of the University of California, San Francisco, who was not involved in the study.

Previous work by senior author Facundo Batista of the Cancer Research UK London Research Institute and his colleagues revealed that when a B cell recognizes an antigen embedded in a cell membrane, an immunological “synapse” — composed of clusters of B cell receptors, antigens, and adhesion molecules — forms between the two cells, and the B cell acquires the antigen for processing and presentation to T cells. But how B cells discriminate between low- and high-affinity antigens remained unclear, Batista said. Led by Sebastian Fleire, also of the London Research Institute, the researchers used scanning electron microscopy to watch cells containing surface lysozyme molecules interact with transgenic B cells carrying a receptor specific for this lysozyme. By labeling the antigen molecules with GFP, the researchers could watch as the B cells rapidly spread over the cell membranes, collecting lysozyme antigen, and then slowly contracted again, gathering the ligands into a central cluster.

By using a panel of mutant lysozymes with various affinities for the B cell receptor, the researchers next showed that the affinity between the receptor and antigen correlated directly with the degree of B cell spreading. To help explain this correlation, they created a mathematical model based on their experimental measurements of receptor-antigen interactions. The model revealed that high-affinity antigens occupy many B cell receptors, leading to cell attachment and B cell spreading. This spreading exposes more B cell receptors, which can then bind even more ligand and continue the process. Low-affinity ligands, on the other hand, occupy too few B cell receptors to perpetuate contact and spreading between the two cells.
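The feedback the model describes (more binding leads to more spreading, which exposes more receptors and allows more binding) can be caricatured in a few lines. This is an illustrative toy with arbitrary parameters, not the authors’ actual mathematical model:

```python
# Toy caricature of the spreading feedback: bound receptors drive spreading,
# and spreading exposes more receptors. Occupancy above a threshold sustains
# growth; below it, the contact stalls at its initial size. All parameters
# are arbitrary.

def simulate_spreading(affinity, steps=50, growth=0.2, threshold=0.3):
    """Final contact area (arbitrary units) after `steps` rounds of feedback.

    `affinity` (0-1) stands in for the fraction of exposed receptors bound.
    """
    area = 1.0
    for _ in range(steps):
        bound = affinity * area            # exposed receptors scale with area
        if affinity > threshold:           # enough engagement: keep spreading
            area += growth * bound
        else:                              # too little: the contact cannot grow
            area = max(1.0, area - growth * (threshold - affinity) * area)
    return area

high_affinity_area = simulate_spreading(0.9)
low_affinity_area = simulate_spreading(0.1)
assert high_affinity_area > low_affinity_area   # high affinity spreads more
```

The qualitative behavior matches the paper’s account: above the occupancy threshold the loop is self-reinforcing, below it the contact never takes off.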

The amount of B cell spreading determines the amount of antigen collected, and therefore establishes the ability of B cells to present antigen-derived peptides to T cells, Batista said. In this way, the immune system could distinguish which B cells have the highest affinity for a particular antigen, he added. “Figuring out how it’s going to apply to real immune responses is still a bit of a conjecture,” DeFranco told The Scientist. While it’s possible that this mechanism could be used by B cells mounting a primary immune response against antigen embedded in a bacterial cell or virion, this response generally requires only low-affinity interactions, DeFranco said. The ability to discriminate between high- and low-affinity interactions would be more useful when cells in the lymph nodes display previously encountered antigen. In that case, B cells with “the higher affinity would take up more of the antigen and be able to present it to the T cell,” DeFranco said. “That’s exactly where this phenomenon would come into play most effectively.”

Batista and his colleagues also used their mathematical model to make a prediction about the B cell system. If they inactivated the spreading mechanism in the model — so that cells simply contacted each other at a fixed spot — they found that the amount of antigen accumulated by the B cell was similar for both low- and high-affinity antigens. When they then performed experiments using transgenic B cells with defective spreading, they found the same result.
Validating the model with an experimental prediction makes their model more convincing than those that simply describe biological observations, said Ronald Germain of the National Institute of Allergy and Infectious Diseases in Bethesda, Md., who was not involved in the study. “For B cells, this is a pretty new way of thinking, and it is very new to combine modeling to try to really work out mathematically how this works.”

The Scientist
May 23, 2006

Original web page at The Scientist


New math model finds that the cochlea’s spiral shape enhances low frequencies

The next time someone whispers in your ear, think “cochlea.” The cochlea is the marvelous structure in the inner ear that is shaped like a snail shell and transforms sounds into the nerve impulses that your brain can process and interpret. You may remember learning about it in elementary school anatomy. This critical hearing organ consists of a fluid-filled tube about a cubic centimeter (three hundredths of a fluid ounce) in volume. For decades, hearing experts thought that its spiral shape was simply an efficient packing job that had no effect on how it functions. But a recent study headed by Vanderbilt mathematician Daphne Manoussaki calls this conventional wisdom into question. She and her colleagues, Richard Chadwick and Emilios Dimitriadis of the National Institutes of Health, have created a mathematical model of the cochlea which finds that the spiral shape acts to enhance the low frequency sounds that we use to communicate with one another. According to the new model, as sound waves travel around the cochlea’s spiral, the distribution of the energy that they carry changes: it becomes weaker along the inside wall while growing stronger and more concentrated along the outside wall. This shift is most pronounced in the regions of highest curvature, which correspond to the low frequency regions of the ear, and so it amplifies low frequency sounds. They published the results recently in the journal Physical Review Letters.

If the new model is correct, then the cochlea is more sophisticated than researchers have thought. “This would indicate we need to take a step back from the cell biology and see how the cochlea works as an integrated system,” says Karl Grosh, who studies the ear’s structure at the University of Michigan in Ann Arbor. “The more we understand how the cochlea works, the more success we will have in building signal processing systems that mimic its auditory characteristics, an important aspect in designing cochlear implants and analog cochlear processors.” According to the National Institute on Deafness and Other Communications Disorders, about 59,000 people have received cochlear implants worldwide and about 250,000 are potential candidates.

Manoussaki is an assistant professor of mathematics whose main interest is modeling cell movements; she got involved in studying the cochlea inadvertently. After finishing her doctorate at the University of Washington, she was looking for work. At NIH, Chadwick was looking for someone with computer skills. He saw her website, was impressed by the computer model of blood vessel formation that she had created, and offered her a job as a visiting fellow. That is where she got involved in the workings of the inner ear.

“I knew nothing about cochlea mechanics and I think that was to my advantage,” Manoussaki says. “I looked at this organ that was shaped like a snail but that everyone was modeling as if it were a straight duct and I asked the obvious question.” Chadwick informed her that it was well established that the spiral shape did not affect the way that the cochlea functions. In order to motivate her, however, he proposed that they review the papers that came to this conclusion. So that is what they did, one paper after another. It took them more than two years but they finally concluded that none of the existing proofs were persuasive.

That realization led them to develop a mathematical model of the cochlea that included its helical structure. Their first model, which portrayed the cochlea as a helix of constant radius, did not show that the shape had any effects. At the end of her fellowship, Manoussaki returned home to Greece. In 2004, as she was preparing to return to the United States, she reconnected with her collaborators and began working on the problem once again.

This time they developed a more sophisticated model. When sound waves enter the ear, they strike the ear drum and cause it to vibrate. Tiny bones in the ear transmit these vibrations to the fluid in the cochlea, where they travel along the narrowing tube that winds into a spiral. The tube is divided into two chambers by an elastic membrane that runs down its length. The mechanical properties of this “basilar” membrane vary from very stiff at the outer end to increasingly flexible as the chambers narrow.
These changing properties cause the waves to grow and then die away, much as ocean waves get taller and narrower in shoaling water. Different frequency waves peak at different positions along the tube. Hair cells sitting on the basilar membrane sense these motions by bending against the membrane and produce electrical signals that feed into the auditory nerve. Hair cells near the large end of the cochlea detect high-pitched sounds, such as the notes of a piccolo, while those at the narrow end of the tube detect lower frequency sounds, like the oompah of a tuba.
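The place-to-frequency map described above is often summarized by Greenwood’s empirical function for the human cochlea. The constants below are the standard published human fit from the general hearing literature, not a result of the Manoussaki model:

```python
# Greenwood's empirical place-frequency function for the human cochlea,
# using the standard published human constants (A=165.4 Hz, a=2.1, k=0.88).

def greenwood_hz(x, A=165.4, a=2.1, k=0.88):
    """Characteristic frequency at relative position x (0 = apex, 1 = base)."""
    return A * (10 ** (a * x) - k)

apex = greenwood_hz(0.0)   # narrow, coiled center: lowest frequencies
base = greenwood_hz(1.0)   # wide, outer end: highest frequencies
assert apex < 25           # ~20 Hz, tuba territory
assert base > 20000        # ~20 kHz, the upper limit of human hearing
```

The exponential form is why the low-frequency octaves are crowded into the tightly curved center of the spiral, exactly the region where the new model predicts the curvature boost is strongest.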

This basic frequency sorting works in the same fashion whether the cochlear tube is laid out straight or coiled in a spiral. That observation, in fact, was the major reason that the researchers studying cochlear mechanics concluded its shape didn’t matter. Manoussaki’s model comes to the same conclusion, but her calculations also reveal that the spiral shape causes the energy in the waves to accumulate against the outside edge of the chamber. She likens this to the “whispering gallery mode” effect where whispers traveling along curved walls of a large chamber can remain strong enough so they can be heard clearly on the opposite side of the room.

This uneven energy distribution, in turn, causes the fluid to slosh higher on one side of the chamber, forcing the basilar membrane to tilt to one side, the direction to which the hair cells are most sensitive. The effect is strongest in the center of the spiral, where the lowest frequencies are detected. The researchers calculate the sensitivity increase can be as much as 20 decibels. That corresponds to the difference between the ambience of a quiet restaurant and the noise of a busy street. It could easily make the difference between understanding that whisper in the ear — or not.
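For scale, standard decibel arithmetic (independent of the cochlea model itself) shows how large a 20 dB gain is in linear terms:

```python
# Convert a decibel gain to linear ratios: dB = 10*log10(power ratio),
# and for pressure amplitude, dB = 20*log10(amplitude ratio).

def db_to_power_ratio(db: float) -> float:
    return 10 ** (db / 10)

def db_to_amplitude_ratio(db: float) -> float:
    return 10 ** (db / 20)

assert db_to_power_ratio(20) == 100.0       # 100x the acoustic power
assert db_to_amplitude_ratio(20) == 10.0    # 10x the pressure amplitude
```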

Science Daily
May 23, 2006

Original web page at Science Daily


The 6th European Colloquium on Acute Phase Proteins

The 6th European Colloquium on Acute Phase Proteins is an international symposium dealing with all aspects of the acute phase response and the acute phase proteins. The colloquium will take place at The Royal Veterinary and Agricultural University, Copenhagen, Denmark, August 24–26, 2006.

Acute phase proteins are useful markers of inflammatory activity in patients and experimental models. Research in acute phase proteins has continuously expanded in human and veterinary medicine within the last few decades, and acute phase proteins are now well-established diagnostic markers in human medicine. Evidence-based use of acute phase proteins in a wide range of veterinary applications has emerged within the last few years, and the field is continuously expanding.

The European Colloquium on Acute Phase Proteins is the only international conference dedicated to all aspects of animal acute phase proteins. Progress in the pathophysiology and genetics of acute phase proteins, analytical methodologies, diagnostic applicability and translational modeling will be reported.

The previous colloquia have been attended by an increasing number of delegates and the colloquium in 2006 is expected to attract a total of 120 to 150 scientists and administrators from universities, research organisations, state veterinary administration and diagnostic and medicine/vaccine producing institutions.

As research in acute phase proteins is an important field in a wide range of species, it is expected that delegates with interests in poultry, cattle, pigs, horses, dogs, cats, experimental animals and fish will attend to get an update on the recent research in this area. Moreover, the comparative aspect of this research will be emphasised in order to attract an increasing number of researchers from human medical sciences.

Abstract deadline is May 15, 2006.

For more information, please visit the colloquium homepage at www.appcolloquium2006.com

We hope to see many familiar as well as new faces at the 6th European Colloquium on Acute Phase Proteins in Denmark.

On behalf of the Organizing Committee,

Stine Jacobsen, Chairman, The Royal Veterinary and Agricultural University


The fight against disease mongering: Generating knowledge for action

Disease mongering turns healthy people into patients, wastes precious resources, and causes iatrogenic harm. Like the marketing strategies that drive it, disease mongering poses a global challenge to those interested in public health, demanding in turn a global response. This theme issue of PLoS Medicine is explicitly designed to help provoke and inform that response. The problem of disease mongering is attracting increasing attention, though an adequate working definition remains elusive. In our view, disease mongering is the selling of sickness that widens the boundaries of illness and grows the markets for those who sell and deliver treatments. It is exemplified most explicitly by many pharmaceutical industry–funded disease-awareness campaigns—more often designed to sell drugs than to inform or educate about the prevention of illness or the maintenance of health. In this theme issue and elsewhere, observers have described different forms of disease mongering: aspects of ordinary life, such as menopause, being medicalised; mild problems being portrayed as serious illnesses, as has occurred in the drug-company-sponsored promotion of irritable bowel syndrome; and risk factors, such as high cholesterol and osteoporosis, being framed as diseases.

Drug companies are by no means the only players in this drama. Through the work of investigative journalists, we have learned how informal alliances of pharmaceutical corporations, public relations companies, doctors’ groups, and patient advocates promote these ideas to the public and policymakers—often using mass media to push a certain view of a particular health problem. While these different stakeholders may come to these alliances with different motives, there is often a confluence of interests—resulting in health problems routinely being framed as widespread, severe, and treatable with pills, as has happened recently with social anxiety disorder. Currently, these alliances are working with the media to popularize little-known conditions, such as restless legs syndrome and female sexual dysfunction, in each case lending credence to inflated prevalence estimates. In the case of female sexual dysfunction, there has been a serious, though heavily contested, attempt to convince the public in the United States that 43% of women live with this condition. This is happening at a time when pharmaceutical companies perceive a need to build and maintain markets for their big-selling products and when pipelines for new and genuinely innovative medicines are perceived as being weak.

It can also be argued that disease mongering is the opportunistic exploitation of both a widespread anxiety about frailty and a faith in scientific advance and “innovation”—a powerful economic, scientific, and social norm. In many nations, government policy priority is to secure market-based economic development, while more equitable social policies, such as public health strategies, can become subordinate or redundant. Disease mongering can thrive in such a normative environment. The practical consequences are that many of the so-called disease-awareness campaigns that inform our contemporary understanding of illness—whether as citizens, journalists, health professionals, industry leaders, academics, or policymakers—are now underwritten by the marketing departments of large drug companies rather than by organizations with a primary interest in public health. And it is no secret that those same marketing departments contract advertising agencies with expertise in “condition branding,” whose skills include “fostering the creation” of new medical disorders and dysfunctions. As a recent Reuters Business Insight report on so-called lifestyle drugs— designed to be read by pharmaceutical industry leaders—pointed out, “The coming years will bear greater witness to the corporate sponsored creation of disease”. We hope the coming years will also bear witness to a much more vigorous effort from within civil society to understand and to challenge that corporate process.

PLoS Medicine
May 9, 2006

Original web page at PLoS Medicine


How embryos differentiate left from right

Researchers at the Forsyth Institute have discovered a new mechanism responsible for early left/right patterning, the process by which organs locate themselves on the left or right side of the body. The discovery of this novel mechanism, garnered through the study of three different vertebrates (frogs, chickens and zebrafish), marks the first time that a single common mechanism has been identified in left-right patterning in three distinct species. Such a discovery may have far-reaching implications for the understanding of craniofacial development, right-left hand preference, right/left brain dominance and a variety of birth defects in humans.

A team of Forsyth Institute scientists, led by Michael Levin, PhD, Director of the Forsyth Center for Regenerative and Developmental Biology, examined the molecular and genetic factors that control left/right asymmetry and identified a novel component: an ion transporter that creates strong natural voltage gradients and pH changes.
The pump that normally acidifies subcellular compartments was shown to control embryonic laterality at very early stages. Their findings further challenged the previously held hypothesis that cilia (short hair-like structures on a cell) were the primary agents allowing an embryo to correctly position its internal organs along the left-right axis. Instead, their research showed a single asymmetry mechanism linking ciliary, serotonergic (serotonin is the chemical substance involved in transmitting signals between neurons), and ion flow mechanisms. The data were strengthened by the fact that this mechanism operates in all three vertebrates. This is important because prior data were very fragmented, and different asymmetry-controlling systems appeared to be operating in frog/chick embryos vs. human/mouse/zebrafish embryos.

“In our previous research we showed that this developmental event happens earlier than expected in frogs by identifying an ion transporter that generates natural bioelectrical signals that ultimately control gene expression and the position of the heart and visceral organs,” Levin said. “We have now identified and explored an additional component of this novel mechanism – a protein pump that generates voltage and pH gradients. For the first time, we have a glimpse of how three different vertebrates utilize such ion flows in concert with ciliary movement and the function of pre-nervous neurotransmitters.”

The findings, to be published in the May 1 issue of Development (available online April 18), are key to understanding human development. According to Dr. Levin, this work presents a unified model of embryonic development and is therefore likely to provide important insight into human development. “Biased left-right asymmetry is both a fascinating and medically important phenomenon,” said Levin. “Problems with left/right asymmetry are responsible for a wide range of birth defects in humans, including conditions that affect the heart, the digestive system, the lungs and the brain. Building on our earlier research, we are gaining a significant understanding of asymmetry and getting closer to understanding its impact on humans. This fascinating ion pump has additional roles during development that are a goldmine of novel cellular control mechanisms.”

Dr. Levin’s team carried out a molecular-genetic and physiological characterization of a novel, early biophysical event that is crucial for correct asymmetry: the flow of hydrogen ions (H+ flux). A pharmacological screen implicated the H+ pump H+-V-ATPase in Xenopus (frog) embryo asymmetry, where it directs left- and right-sided gene expression. The cell cytoskeleton is responsible for the LR-asymmetric localization of this pump during the first few cell cleavages in frog embryos. H+ flux across plasma membranes is thus asymmetric at the four- and eight-cell stages, and this asymmetry requires H+-V-ATPase activity. Artificially equalizing the H+ flux, by increasing or decreasing it on both sides equally, randomized the location of the viscera without causing any other defects. To understand the mechanism of action of H+-V-ATPase, the researchers isolated its two physiological functions: regulation of cytoplasmic pH and of the membrane voltage gradient (Vmem). Varying either pH or Vmem, independently of direct manipulation of H+-V-ATPase, disrupted the normal LR pattern, suggesting important roles for both physiological parameters. V-ATPase inhibition also abolished the normal localization of serotonin at the 16-cell stage, suggesting that the pump helps to regulate the early flow of this important neurotransmitter. These data implicate H+-V-ATPase activity in patterning the left-right axis of three different vertebrates, reveal mechanisms both upstream and downstream of its activity, and identify a novel role for this important ion transporter. Based on these observations, the team proposed a detailed pH- and Vmem-dependent model of the early physiology of left/right patterning.

Michael Levin, PhD, is an Associate Member of the Staff in The Forsyth Institute Department of Cytokine Biology and the Director of the Forsyth Center for Regenerative and Developmental Biology, http://www.cellregeneration.org/. Through experimental approaches and mathematical modeling, Dr. Levin and his team examine the processes governing large-scale pattern formation and biological information storage during animal embryogenesis. The lab’s investigations are directed toward understanding the mechanisms of signaling between cells and tissues that allow a living system to reliably generate and maintain a complex morphology. The Levin team studies these processes in the context of embryonic development and regeneration, with a particular focus on the biophysics of cell behavior.

Science Daily
May 9, 2006

Original web page at Science Daily

Categories
News

How odors are sensed: A complex system clarified

John Carlson, the Eugene Higgins Professor of Molecular, Cellular and Developmental Biology and Elissa Hallem, his former graduate student in the Interdepartmental Neuroscience Program, published the comprehensive study in the journal Cell. “The results of our analysis allow us to make predictions about which odors smell alike to an animal, and which smell different,” said Carlson. “These predictions can now be tested in behavioral experiments and may help point us to insect attractants and repellants that are highly effective.”

This paper provides particular insight into the understanding of how animals perceive environmental smells that are often complex mixtures of molecular structures. The study identifies compounds that both stimulate and inhibit response in odor neurons, and the differences in response that are due to concentration and duration of exposure to a compound. “We were surprised to find that inhibitory responses are widespread among odor receptors,” said Carlson. “Most receptors are inhibited by at least one odor, and most odors inhibit at least one receptor. Although previous work has been concerned mainly with excitatory responses of receptors, our results suggest that inhibition may play a major role in how odors are identified.”

The paper further shows how the responses of the receptor repertoire map into the brain. Surprisingly, receptors with similar odor specificities often map to widely separated locations and receptors with very different specificities often map to neighboring locations. Carlson and Hallem, who have published extensively in this system over the past several years, found that individual receptors range along a continuum from “narrowly tuned” to “broadly tuned” to odorants. This finding moves the concept of the system beyond the previous picture of a simple dichotomy between specialist and generalist receptors.

Source: Cell

Science Daily
May 9, 2006

Original web page at Science Daily

Categories
News

Confronting zoonoses, linking human and veterinary medicine

Many of the emerging infectious diseases, including those caused by bioterrorist agents, are zoonoses. Since zoonoses can infect both animals and humans, the medical and veterinary communities should work closely together in clinical, public health, and research settings. In the clinical setting, input from both professions would improve assessments of the risk-benefit ratios of pet ownership, particularly for pet owners who are immunocompromised. In public health, human and animal disease surveillance systems are important in tracking and controlling zoonoses such as avian influenza virus, West Nile virus, and foodborne pathogens. Comparative medicine is the study of disease processes across species, including humans. Physician and veterinarian comparative medicine research teams should be promoted and encouraged to study zoonotic agent-host interactions. These efforts would increase our understanding of how zoonoses expand their host range and would, ultimately, improve prevention and control strategies.

Emerging Infectious Diseases
April 25, 2006

Original web page at Emerging Infectious Diseases

Categories
News

Researchers trace molecular evolution

Researchers report finding for the first time a step-by-step process explaining the evolutionary creation of molecules. The scientists, whose findings are published Friday, April 7, 2006 in the journal Science, say they reconstructed ancient genes from long-extinct animals to show how evolution reuses existing parts to recreate new pieces of molecular machinery. “We wanted to understand how this system evolved at the molecular level,” said Joseph W. Thornton, professor of biology at the University of Oregon and lead author of the paper. “There’s no scientific controversy over whether this system evolved. The question for scientists is how it evolved and that’s what our study showed.” Researchers say the findings offer a strong argument against evolution doubters who question how a progression of small changes could produce the intricate mechanisms found in living cells, The New York Times reported.

Science Daily
April 25, 2006

Original web page at Science Daily

Categories
News

Fourteen states want US bio-agro lab

Public and private institutions in at least 14 states have applied to the US Department of Homeland Security (DHS) to build and operate its proposed $451 million National Bio-and Agro-Defense Facility (NBAF), the replacement for the department’s aging Plum Island Animal Disease Center (PIADC) near Long Island, New York. DHS has said it will choose a “short list” of candidates by this fall. The 14 applicants are located in seven Southern states (Alabama, Florida, Georgia, Mississippi, North Carolina, Tennessee and Texas), five Midwestern states (Iowa, Kansas, Kentucky, Missouri, and Wisconsin) and two Western states (California and Colorado). Bidders range in size from a consortium of California public and private universities that have paired with the Department of Energy’s (DOE) Lawrence Livermore Laboratory, down to the governments of St. Lucie County in Florida and Bent County in rural Colorado.

The 500,000-square-foot lab (about 46,450 square meters) will expand on Plum Island’s mission, conducting research to create vaccines and drugs to fight human diseases, foreign animal diseases, and animal diseases humans can catch. It will house researchers from DHS, the Department of Agriculture (USDA), and the Department of Health and Human Services (HHS). Wherever the new lab is located, it will have an enormous local economic impact, experts say. According to a study by the University of Georgia’s Carl Vinson Institute of Government, the NBAF will bring in between $3.5 billion and $6 billion over 20 years. Salaries alone will reach up to $2.5 billion, the report says. The NBAF will also be extremely important for training a larger corps of agricultural researchers, according to Bennie Osburn, dean of the School of Veterinary Medicine at the University of California at Davis.

The Scientist compiled its list via confirmations from individual schools. Applications had to be postmarked — not received — by March 31, so additional institutions, still unknown, may have applied for the NBAF site. DHS continues to leave the future of Plum Island uncertain. The department web site notes that “it is anticipated that existing programs and PIADC would transfer to the new facility,” but adds that the DHS might build the new NBAF there or keep the facility at Plum Island, currently equipped to handle biosafety level (BSL) 3 experiments, “and recapitalize [the facility to] meet 21st century laboratory standards.” Most of the buildings at Plum Island — the only US laboratory capable of handling large cattle in BSL-3 security — were constructed in the early 1950s, and are aging badly.

Some experts said they are concerned that politics, not merit, might decide the winner. They worry that some states might have a leg up, given their political connections — for instance, Texas is President Bush’s home state, and Gov. Haley Barbour of Mississippi was in the Bush Administration. Congressman Hal Rogers (R-KY), chairman of the House Homeland Security Appropriations Subcommittee, also wants to build the NBAF in his district. Still, George Stewart, a professor at applicant University of Missouri, said that his state’s Congressional delegation has assured him that the vetting process is “above board.”

When asked which applicants most deserve the NBAF, agricultural experts The Scientist interviewed mentioned the California combination, because it includes the state’s university system and DOE’s Lawrence Livermore lab; the Athens, Georgia group, partly because it is near USDA labs; and the North Carolina Research Triangle consortium, because it includes Duke, the University of North Carolina, and the vet school at North Carolina State University. Experts also gave nods to Iowa State University, which is near a main USDA lab complex; Kansas State University, because the state is building its own National Agricultural Biosecurity Laboratory; and the group led by Texas A&M, partly because it already houses the National Center for Foreign Animal and Zoonotic Disease Defense.

The Scientist
April 11, 2006

Original web page at The Scientist

Categories
News

Genome sequencing is for ecologists, too

An organism widely used for genetics-versus-environment studies has joined the panoply of mice, rats, dogs, humans and other species whose entire genomes have been sequenced. At the Daphnia Genomics Consortium’s annual meeting in Bloomington this week, Indiana University and Joint Genome Institute scientists announced they’ve completed a “shotgun” sequence for Daphnia pulex, or the water flea, as it’s better known to high school biology students. “Daphnia is important to the environmental sciences, where the goal is to understand the complexities of ecosystems by getting a handle on how species in natural settings respond genetically to their environments,” said Daphnia Genomics Project leader John Colbourne. “Ecologists and evolutionary biologists would also want to learn more about how genetic variation is important for adaptation and how populations survive in a changing world.”

Colbourne is a founding member of the Daphnia Genomics Consortium and the genomics director of the Center for Genomics and Bioinformatics at IU Bloomington. The U.S. Department of Energy and the National Science Foundation funded the Daphnia project. Most of the sequencing work was done at DOE’s Joint Genome Institute Production Genomics Facility in Walnut Creek, Calif. Shotgun sequencing involves breaking a whole genome into smaller, more digestible DNA segments, then sequencing each one. The Daphnia genome was sequenced over eight times to ensure better coverage of all 12 pairs of chromosomes. Daphnia’s short generation time and small genome (a mere 200 million base pairs) make it an ideal organism for laboratory and field studies of how environments influence — and how they’re influenced by — an organism’s genetics. The animals are common in lakes and ponds and have been used to monitor the health of aquatic environments. Members of the species can reproduce both with and without sex, which has important implications in evolutionary biology and ecology.
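The “over eight times” figure can be put in perspective with the Lander-Waterman model, a standard back-of-envelope estimate for shotgun sequencing: if reads land uniformly at random, the read depth at any given base is approximately Poisson-distributed, so the fraction of the genome left uncovered at coverage c is about e^(-c). A minimal sketch (the 200-million-base-pair genome size is from the article; the coverage values and the uniform-sampling assumption are illustrative):

```python
import math

def uncovered_fraction(coverage: float) -> float:
    """Lander-Waterman estimate: with reads placed uniformly at random,
    depth at a base is ~Poisson(coverage), so the probability that a
    base receives zero reads is exp(-coverage)."""
    return math.exp(-coverage)

GENOME_SIZE = 200_000_000  # Daphnia pulex: ~200 million base pairs

for c in (1, 4, 8):
    gap_bases = GENOME_SIZE * uncovered_fraction(c)
    print(f"{c}x coverage -> ~{gap_bases:,.0f} bases expected uncovered")
```

At 8x coverage the expected uncovered fraction is e^(-8), roughly 0.03%, still tens of thousands of bases in a 200 Mb genome, which is why deep oversampling is needed to cover all 12 pairs of chromosomes.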

Species whose genomes have been sequenced are generally used for experiments in physiology and in developmental and cell biology, but rarely in ecology. Scientists are eager to exploit genomic technologies and genomic experimental approaches that have already revolutionized research in the human health sciences, with the goal of diagnosing the state of aquatic environments. Despite their common name, water fleas are not insects but crustaceans, like lobsters and crabs. Daphnia is the first crustacean genome to be sequenced. Information from its genome will help biologists make sense of similarities and differences among the intensively studied genetic models of insects, which are evolutionary relatives of crustaceans.

“The genome sequences are being completed for several insects because they are important model organisms — like fruit flies — or because they are important in disease or agriculture,” said Jeffrey Boore, head of the evolutionary genomics program at the Joint Genome Institute. “And the Daphnia genome sequence will illuminate all of this by allowing us to infer the ground state from which the insect genomes evolved.”

Science Daily
February 14, 2006

Original web page at Science Daily

Categories
News

The lungs of the planet are belching methane

It’s not just farting cows and belching sheep that spew out methane. Living plants have been disgorging millions of tonnes of the potent greenhouse gas into the atmosphere every year – without anybody noticing. The concentration of methane in the atmosphere has almost tripled since pre-industrial times. Environmental scientists thought they had identified all natural sources where bacteria convert organic plant matter to methane, such as swamps, wetlands and rice paddies. These bacteria only thrive in wet, oxygen-poor environments; they cannot survive in air.

So Frank Keppler, an environmental engineer at the Max Planck Institute for Nuclear Physics in Heidelberg, Germany, was surprised when he saw signs of methane being emitted by plants he was examining in normal air. “If we were following the textbook, we would have ignored it as a mistake,” he says. But Keppler and his colleagues decided to investigate further. They measured the amount of methane given off by plant debris – bits of grass and leaves from local and tropical plants – in methane-free chambers. To rule out the possibility that bacteria were at work, they bombarded the plants with gamma radiation to sterilise them. The team saw similar levels of methane produced by both sterilised and un-sterilised leaves. “We realised that we were looking at a previously unrecognised process,” Keppler says. They still don’t know exactly what is happening, but believe that pectin, a substance contained in plant cell walls, plays a part in the methane-making mechanism (Nature, vol 439, p 187).

When the group repeated the tests with living plants they were stunned by the amount of methane created. They estimate that, globally, living plants produce between 63 and 236 million tonnes of methane per year, with plant debris adding another 1 to 7 million tonnes. This would make plants responsible for roughly 10 to 30 per cent of global methane production. “This effect is completely missing from climate change and biogeochemical models,” says Peter Cox of the Centre for Ecology and Hydrology at Winfrith in Dorset, UK. He suggests that a new source of methane could help solve some climate mysteries. One such puzzle was posed last year when satellite observations revealed that tropical rainforests are emitting more methane than expected (New Scientist, 26 March 2005, p 20). Thomas Wagner from the University of Heidelberg in Germany, who led last year’s study, thinks Keppler may have found the culprit. “This new source is in good agreement with our results,” he says. The finding should also encourage new strategies for cutting man-made methane emissions, says Chris Jardine of the University of Oxford’s UK methane project.
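The quoted figures can be cross-checked with one line of arithmetic: dividing the plant emission estimates by the corresponding share of global production gives the implied global total. A quick sketch (pairing the low tonnage with the low percentage is my assumption, not stated in the article):

```python
# Figures from the article: living plants emit 63-236 million tonnes
# (Mt) of methane per year, said to be roughly 10-30% of global
# methane production.
plants_low, plants_high = 63.0, 236.0   # Mt/yr from living plants
share_low, share_high = 0.10, 0.30      # assumed matching shares

# Implied global methane production under each pairing:
total_if_low = plants_low / share_low     # ~630 Mt/yr
total_if_high = plants_high / share_high  # ~787 Mt/yr
print(f"implied global total: {total_if_low:.0f}-{total_if_high:.0f} Mt/yr")
```

Both pairings imply a global total in the same 600-800 Mt/yr ballpark, so the tonnage range and the percentage range in the article are mutually consistent.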

New Scientist
January 31, 2006

Original web page at New Scientist

Categories
News

New analytical tool helps detect cancer

Scientists have long used ultra-fine glass tubes known as capillaries to analyze the chemical makeup of substances. Called capillary electrophoresis, or CE, the method applies high voltage to the capillaries, and by measuring the rate at which the various materials move through the capillaries, researchers are able to identify individual compounds.

A group of researchers at the U.S. Department of Energy’s Ames Laboratory has developed a method called dynamic multiple equilibrium gradients, DMEG for short, that dramatically fine-tunes the process, allowing for a significant increase in resolution over previous methods. Potential applications span the chemical, biological and biomedical sciences, as well as environmental monitoring, biological warfare detection, drug discovery, and more. “This method is hyperselective and we can design it to target specific analytes for separation,” said Ryszard Jankowiak, an Ames Lab senior scientist. “Running multiple electric field gradients can focus and move the analytes to the detection window at precisely defined times, creating signature ‘fingerprints’, which minimizes the probability of false positives.”

The advance makes it possible to detect the smallest traces of substances, such as the estrogen-derived conjugates and DNA adducts in human fluid samples that could serve as biomarkers in risk assessment of breast and prostate cancers. In fact, this and other technologies being developed at the Ames Laboratory – biosensors and fluorescence-based imaging – have been used in work with cancer researchers at the University of Nebraska Medical Center and Johns Hopkins University to identify a specific adduct in the urine of prostate and breast cancer patients, and could lead to even earlier detection or indication of cancer risk.

Unlike traditional capillary electrophoresis, the method developed by Jankowiak’s team, which includes Yuri Markushin and graduate student Abdulilah Dawoud, uses only low voltage, around 2 kV or less. Another difference is in the way the voltage is applied. Tiny electrodes are microfabricated along the walls of the hair-like capillaries (or channels), in essence creating a complex grid of electrodes. “Saw-tooth type waves are applied along the channel outfitted with electrodes,” Jankowiak explains. “The electrodes act as capacitors and the applied waveforms generate electric fields. The moving variable electric field gradients induce very efficient focusing and separation of analytes. The analytes move along the capillary and tend to concentrate at the various electric field gradients. By varying the amplitude of the electric field gradients, these concentration points can be fine-tuned, making it easy to separate and identify the specific analytes.”

While the ability to design and test for specific analytes with greater accuracy marks a large leap forward in separation technology, DMEG has another, possibly even greater capability. Because the system can be fine-tuned to separate specific substances and concentrate them at particular points as they move through the capillaries, it can be used to create crystals. “To achieve crystallization, we created multiple moving electric field gradients along the crystallization channel that can trap, concentrate, and move charged molecules (e.g. proteins) of interest,” Jankowiak said. “In other words, using the DMEG approach, we can create and electronically control many localized regions of supersaturation which can be used to produce crystals.”

One potential application for this new crystal growth method is photosynthetic complexes for use in solar/photovoltaic cells. The major stumbling block in using these materials is that they must be arranged in architectures that promote electron transport and prevent energy wasting recombination. The complexes must also be interfaced with a conducting material in order to harvest the energy. The controlled growth offered by DMEG can help overcome these hurdles. Another possible application is for desalinization of seawater, using DMEG to extract the salt.

Science Daily
September 13, 2005

Original web page at Science Daily

Categories
News

Scientific knowledge as a public good

Life scientists are accustomed to thinking about quantifying the products of their knowledge in terms of such things as papers published, discoveries made, or, in the case of applied science, diseases treated. But there is another useful way to think about the value of scientific knowledge, which is as a public good. The public goods characteristic of ideas and knowledge – that they are freely available to all and are not diminished by use – can be traced to St. Augustine (circa 400). Adam Smith laid the conceptual economic basis for public goods in 1776, but economists did not give much attention to them until the mid-1950s. However, it has been difficult to reduce knowledge to numerical form and measurement, particularly in the basic sciences, so that there is little hard data on the linkage between scientific knowledge and growth.

Still, it is safe to say that scientific knowledge in its pure form is a classic public good. As such, it is a keystone for innovation and in its more applied forms is a basic component of the economy. The problem, however, is that the production of such knowledge has a cost, and the results are not necessarily available to all. In his presidential address at the National Academy of Sciences in 2002, Bruce Alberts noted that “the efforts that many scientists are making to strengthen world science by disseminating both knowledge and research tools…are being counteracted by several forces.”

Methods of communicating knowledge have vastly improved, but patents and copyrights have existed in various forms since the late 1400s and have become more pervasive. Social and political inhibitions and constraints on some forms of science and technology provide other barriers. Thus the notion of pure public goods is seldom realized in the real world. Scientific public goods are, like others, most often impure public goods. They increasingly represent a mixture, in quite varied proportions, of public and private efforts. At one time the common, but not exclusive, pattern was for the government to sponsor the more basic research and the private sector the more applied, particularly as knowledge was embodied in a process or product. Yet in recent years an increasing number of firms have delved into more basic areas of research, particularly as they relate to the biological sciences.

The latter efforts are not always commercially successful, but there can be a silver lining for the public. Celera Genomics, which could not make a profit based on its efforts to sequence the human genome, recently decided to put its database into the public domain. And in more applied areas, firms have shown a willingness to make their less commercially valuable knowledge available for public use: IBM, for example, recently announced that it would release 500 patents. Similarly, some multinational biotechnology firms have adopted a market segmentation strategy that permits the use of certain food and agricultural technologies in developing countries.

How do we encourage such efforts? Alberts proposed asking scientific publishers to make their journals freely available on the Web after a delay of not more than a year, and changing the intellectual property protections that are arranged by public sector research institutions. In what might be considered a model of such a change, the University of California, Berkeley, has a “socially responsible licensing program” designed to cover technologies that promise exceptional benefit to the developing world; licenses are provided on a royalty-free basis. Several other universities have similar programs.
This kind of practice will hopefully become more widely adopted.

In the case of developing nations, efforts have long been under way to provide global, international, and regional public goods through the Consultative Group on International Agricultural Research [http://www.cgiar.org]. The system was initially best known for the development of semidwarf wheat and rice varieties that laid the basis for the Green Revolution and continuing increases in yields, but their research efforts now extend to a wide variety of other crops and activities. Multilateral efforts of a somewhat different nature have also been undertaken to sponsor research on infectious diseases such as malaria.

Some in more scientifically advanced nations are also beginning to recognize the importance of global public goods and the role that they might play in generating them. Two of the key steps to realizing such goals are a more widespread understanding and appreciation of the benefits that science can provide society through the generation of public goods, and a realization that the more widely the goods can be used – either directly or as an input into further research – the greater the social benefits and returns to investment. Such public goods perspectives might well play a more important role in establishing public, and even some private, research priorities and the allocation of scientific resources.

The Scientist
July 5, 2005

Original web page at The Scientist

Categories
News

Wellcome insists on open access

As of October, all new grant recipients will be required to deposit papers with PubMed Central, reports Stephen Pincock. Britain’s Wellcome Trust said that after October 1 of this year, all new grant recipients will be required to post any papers arising from the funded research in an open-access repository. The trust, the United Kingdom’s biggest nongovernment funder of life sciences research, said that grant recipients would need to deposit their papers within 6 months of publication, either with the National Institutes of Health’s (NIH’s) PubMed Central or with a UK version the Wellcome Trust and others want to establish. “Everyone, everywhere will be able to read the results of the research that we fund,” said Mark Walport, director of the trust, in a statement. “PubMed Central provides a link from research to other papers and sources of data and greatly improves the power and efficiency of research. Digital archives are only as good as the information stored in them. That’s why we feel it’s important to encourage our researchers along this path—one I hope others will follow.”

Existing grant holders have until October 1, 2006, to comply with the open-access requirement, Wellcome said. The lag is to give them time to adjust to the new policy. Earlier this year, the NIH came under criticism when it announced a similar policy. Under NIH rules, funded investigators are “requested” to deposit papers arising from the research within 12 months. Advocates of open access said the policy was not strong enough; opponents said the new rule would not improve access to research, but place an unreasonable burden on researchers. “The Wellcome Trust policy is superior to the NIH policy in two key respects,” said Peter Suber, a proponent of open access at Earlham College in Richmond, Ind. “First, it’s a requirement, not a request. Second, it does not permit delays longer than 6 months. Assuring early, widespread access to important research results is in the funder’s interest, the researcher’s interest, and the public interest.” There may be a third respect in which the Wellcome policy is superior, Suber said via E-mail. “I’d have to see more details on the policy to be sure, [but] it appears that the Wellcome Trust is making deposit in PMC (or UK PMC) a simple condition of funding. If so, it’s a contractual obligation of the grantee made prior to any copyright transfer agreement with a publisher. Hence, the grantee’s publisher would have no standing to interfere.”

A spokesman for the trust confirmed that the new policy would make archiving within 6 months a grant condition. Roughly 3500 papers each year arise from Wellcome Trust–funded research, and the new policy means that after October 2006, all of those will be freely available within 6 months of publication. “If journals want to publish some of those… they’ll have to accept that,” he said. Stevan Harnad, an advocate of open access at the University of Southampton, UK, said there were problems with the Wellcome approach. “Wellcome’s policy of requiring self-archiving is a great improvement over NIH’s requesting it,” Harnad said in an E-mail. “However, requiring it to be deposited in PMC or UKPMC is a big and unnecessary strategic mistake.”

Wellcome should have required researchers to deposit articles in repositories held by their own institutions, from which they could be harvested by PubMed Central or its UK version, Harnad said. “That would have greatly increased the influence of the Wellcome policy, touching on all disciplines and all institutions, not just the biomedical research that Wellcome funds. It would have helped propagate a standard, universal practice in all researchers’ institutions, one that could be performed by all researchers in all fields at all institutions.” Harnad also said the Wellcome Trust should have required immediate deposit upon acceptance for publication. “Research progress is not based on 6 or 12 months delay in access to research findings,” he said.

Responding to this criticism, the Wellcome Trust spokesman told The Scientist: “We would prefer immediate release, but we’re allowing a 6-month delay because we realize this is a big step and we have to approach it in a pragmatic way.” Meanwhile, Research Councils UK (RCUK), an umbrella group for Britain’s government science funding councils, is expected to announce its own open-access policy in coming days. People following the development of that policy have told The Scientist that it requires author archiving in repositories where they exist, but does not require the establishment of such archives. “In contrast to the NIH/Wellcome embargo policy… RCUK looks as if it might adopt the optimal policy (the one recommended by the UK Select Committee last year and already adopted by Scotland),” Harnad said. That policy is “to require immediate institutional self-archiving [with central self-archiving as an option only if the institution does not yet have an institutional repository].”

However, Suber pointed out that when researchers do not have deposit rights at either kind of repository, then a policy to mandate open-access archiving will simply not apply, creating gaps in the policy’s coverage. “That’s a reason for a funding agency to use a central repository—if not for all grantees, then at least as a fallback,” he said.

The Scientist
June 7, 2005

Original web page at The Scientist

Reef fish lives fastest, dies youngest

If there is a one in 10 chance you will be eaten before the day is out, you have to live life fast. That could be why a tiny fish found on the Great Barrier Reef has the dubious distinction of being the shortest-lived vertebrate known. The pygmy goby’s life span is a mere eight weeks, researchers in Australia have discovered.

The previous record holder, with a life span of 12 weeks, was the African turquoise killifish, which lives in seasonal waterholes and must mature and breed before they dry up. Now, studies of the pygmy goby, Eviota sigillata, have shown that its life span is even shorter. “I had an intuition that their life span might be towards the extremes,” says Martial Depczynski of James Cook University in Queensland. He and colleague David Bellwood collected 300 pygmy gobies and studied their ear stones, which accumulate daily rings. Counting these rings is a standard way of judging the age of most fish species.

They found that pygmy gobies live for 59 days at most. The larvae spend three weeks in the open ocean and mature within two weeks of settling on a reef, leaving the adults – which are 1 to 2 centimetres long – just three weeks to reproduce. “Completing all the demands of life within eight weeks is a considerable evolutionary achievement for a vertebrate,” the pair writes in Current Biology.

Depczynski thinks their “live fast, die young” strategy is a response to intense predation. A follow-up study involving the tagging of around 150 small reef fish, including 30 pygmy gobies, and trying to recapture them 9 days later suggests that the daily mortality rate for the pygmy gobies might be as high as 8%. It is possible that other small fish have even shorter life spans, Depczynski admits, but he thinks 59 days is approaching the lower limit for vertebrates. “You might be able to shave a couple of days off, but not much more.”
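The implied odds can be checked with a quick calculation. The sketch below is an illustration, not from the study itself; it assumes each day's mortality risk is independent and uses the 8% upper estimate from the tagging experiment:

```python
# Cumulative survival of a pygmy goby under constant daily predation.
# Assumes (as an illustration) that each day's 8% risk is independent.
daily_mortality = 0.08           # upper estimate from the tagging study
max_lifespan_days = 59           # maximum age recorded from ear-stone rings

survival = (1 - daily_mortality) ** max_lifespan_days
print(f"Chance of reaching day {max_lifespan_days}: {survival:.2%}")
```

Under these assumptions fewer than 1% of gobies would ever reach the maximum recorded age, consistent with the “live fast, die young” interpretation.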

Journal reference: Current Biology

New Scientist
May 24, 2005

Original web page at New Scientist

Biology transforms

Biology is undergoing a fundamental shift from a descriptive to a quantitative, and ultimately predictive, science. This transition is being driven by the development of computer-based models of complex biological processes that can both capture and recapitulate the central metabolic and information-processing networks of cells. Receptor tyrosine kinases (RTKs) are central components of cell signaling networks and play crucial roles in normal physiological processes, such as embryogenesis, cell proliferation, and cell death (apoptosis). RTK networks function to detect, amplify, filter, and process a variety of environmental and intercellular cues. Although these networks have been intensively studied for the last 20 years, only since the completion of the Human Genome Project has the breadth and scope of their regulatory function become apparent.

RTK networks regulate human development, embryogenesis, and aging. Malfunction of RTK networks is a leading cause of major human diseases, such as developmental defects, cancer, and diabetes. Moreover, several innovative drugs that target various RTKs (for example, Herceptin, Cetuximab, Iressa, and Gleevec) have been approved by regulatory agencies in the last few years. These new drugs can be very potent and exert minimal adverse clinical effects in a well-defined group of patients, indicating the great therapeutic potential of RTK targeting.

Nevertheless, there is a lack of in-depth understanding of RTK networks because of their enormous complexity and multiplicity. This is a major obstacle to designing improved and more targeted therapies. The application of systems biology is a promising approach for improving our understanding of RTK networks. However, it requires developing very sophisticated mathematical models and acquiring large amounts of appropriate quantitative experimental data. These data must be acquired using cellular systems that are relevant for human disease, under conditions that are highly controlled and reproducible. The technological resources needed to create a systems-level model of RTK signaling networks with significant predictive power are well beyond the research capability of any single institute or nation. To address this daunting yet critical goal, scientific groups at the forefront of experimental, clinical, and computational molecular cell biology met in Yokohama, Japan in January 2005 to launch the International RTK Consortium.

The vision of the Consortium is to facilitate and coordinate international efforts to understand RTK signaling and its relationship to human pathologies.
Experimental data derived from various diseases and tissue settings, as well as high-throughput experiments, will be combined into a shared database. The database will be used to build a constantly improving and expanding global model that will be supervised by the Consortium. Understanding mechanisms underlying RTK network function will provide breakthroughs in the diagnosis and treatment of major human diseases, lead to the design of new therapeutic drugs, and decrease the attrition rate of new drugs in development.

Because of the complexity of RTK networks, they can only be understood by rigorous and extensive experimental analyses of a limited number of well-defined experimental systems and equally rigorous and extensive mathematical analyses.
The required mathematical and experimental expertise for understanding RTK networks cannot be found in any single laboratory or even country. Thus, the Consortium is dedicated to a strategy in which individual research groups can focus on their own area of excellence, and the data and models can be integrated at the community level.
This requires a community infrastructure that can support distributed research. The Consortium will provide the standards that will allow multiple labs across the world to work together productively. The RTK Consortium is organized from a systems biology perspective to develop the framework necessary to understand RTK signaling networks central to major human pathologies and therapies.
Systems biology is the study of living organisms in terms of their underlying network structure rather than simply their individual molecular components.

The Scientist
May 24, 2005

Original web page at The Scientist

Journal prints rejected paper – as ad

If you don’t like getting your paper rejected before it even reaches peer review, ask David Egilman how to get around the process: In what may be an unprecedented move, when the Brown University researcher’s paper was recently rejected from an occupational medicine journal, he simply bought two pages of ad space and printed the entire article in the same journal.

Two years ago, Egilman submitted an editorial to the Journal of Occupational and Environmental Medicine (JOEM) that critiqued a 2003 Dow-funded paper in Texas Medicine that said 11 cases of mesothelioma among Dow workers exposed to asbestos did not “suggest an occupational etiology”—even though mesothelioma typically strikes only 1 to 2 people per million, Egilman said. He received an E-mail with comments from editor Paul Brandt-Rauf, who said the material was “not likely to be a high priority for the majority of JOEM readers.”

Egilman told The Scientist he believed the article was rejected unfairly, and he wanted to “see what would happen” if he submitted the rejected paper as an advertisement. When he did, it was published in its entirety as a two-page ad in JOEM, along with his survey asking if readers believed this material was a “priority” to them. Egilman said he chose to publish the paper as an advertisement in JOEM, rather than get it peer reviewed at another journal, because he became more interested in finding out if the paper was interesting to JOEM readers. Egilman said he received 33 responses to the survey, all saying the material was of interest. “I was testing [Brandt-Rauf’s] assertion, in a semi-scientific way,” he said.

Drummond Rennie, deputy editor of the Journal of the American Medical Association, said he had never heard of a researcher who published a paper as an ad in the journal that rejected it. He told The Scientist the incident raises a number of issues, such as only the “haves” being able to publish their work. “And if it’s an ad, what’s it an ad for?” asked Rennie, also a professor of medicine at the University of California, San Francisco. “The research? The idea? The author?”

Brandt-Rauf, who is based at Columbia University in New York, told The Scientist that the paper was rejected for its own lack of merit and not out of any allegiance to Dow. He said that his note saying the material was not a “priority” was a stock rejection response and that the journal gives preference to original research. Approximately 25% of JOEM submissions are rejected before peer review, he said.
Brandt-Rauf added he was surprised by Egilman’s reaction. “I don’t know where he gets this idea that he gets to publish anything he wants in the journal of his choice,” Brandt-Rauf said. “If that were true, I’d publish all of my pieces in Nature and Science.” Brandt-Rauf noted that he normally reviews advertisements before they are published, but could not in this instance because Egilman’s paper replaced another ad that was cancelled at the last minute. If he had, he would have removed the ad—but not out of allegiance to Dow, which, to his knowledge, has never given money to him or the journal.

However, according to an editorial in the April/June 2005 issue of another journal, the International Journal of Occupational and Environmental Health, Brandt-Rauf and the American College of Occupational and Environmental Medicine (ACOEM), for which JOEM is the official journal, have indirect ties to Dow Chemical and its strategic partner, GlaxoSmithKline. The ACOEM gave Dow Chemical its “Corporate Health Achievement Award” in 2000; the award is funded by GlaxoSmithKline, and Brandt-Rauf has served as a reviewer for it. And Columbia University, Brandt-Rauf’s employer, receives “significant funds” from GlaxoSmithKline, according to the editorial.

Editorial coauthor Lee Friedman cautioned that practically every researcher and professional organization has ties—at least, indirect ones—to corporate sponsors, but “just the fact that there is an association raises concerns,” he told The Scientist. Dow did not respond to requests for an interview.

While many worry about researchers’ funding sources, relatively few investigate corporate connections to editors, publishers, and journal parent organizations, Friedman said. “There is a need for greater transparency,” he added. Friedman, director of the Social Policy Research Institute in Illinois, cited a 2002 study in the journal Science and Engineering Ethics showing that 42% of the editors of 33 medical journals owned by professional associations said they had recently received pressure from the association’s leadership over content. Furthermore, editors are not supposed to be able to veto ads, Friedman added. At many major biomedical journals, such as the Journal of the American Medical Association and the New England Journal of Medicine, editors are “blinded” to which ads are going into which issue, to keep editorial decisions separate from advertising.

The Scientist Daily
May 24, 2005

Original web page at The Scientist

Quality of veterinary education in the EU

Over the past few months many people, both from within and outside the Federation, have participated in the development of our future strategy. People from all over Europe enthusiastically exchanged opinions and gave valuable input. Through this process alone we have started to achieve one of our goals – better communication and better cooperation – even before the strategy was put on paper! One of the issues that kept coming back in these strategy discussions is the quality of veterinary education.
Adequate pre- and postgraduate training and continuous professional development are essential to fulfil the task society has conferred on the veterinary profession: the care of animal health, animal welfare and veterinary public health. Nevertheless, from the visitations and evaluations of veterinary schools and faculties, we know that a number of establishments do not meet the minimum requirements set out by EU legislation and suffer from serious deficiencies. In some Member States the situation is of grave concern to the profession. This is one of the main reasons why we have put a lot of effort into convincing members of the European Parliament of the need to incorporate the evaluation of veterinary schools in the proposed Directive on the recognition of professional qualifications. While they accepted our arguments and tabled amendments on this item, other legislators remain against such amendments. In their view the quality of education is the responsibility of the competent authorities in the Member States. They are of the opinion that it is up to each individual Member State to make sure that the minimum requirements for veterinary training are met.

An alternative way to approach the problem is to make the Commission more aware of the fact that some Member States seriously fail to implement EU legislation on veterinary education. The Commission, in its role as guardian of the treaties, can then take the necessary steps against these Member States. Furthermore, FVE, together with the European Association of Establishments for Veterinary Education (EAEVE), has started to review the current system for the evaluation of veterinary schools. The objective is to make the system clearer, more transparent and genuinely focused on the quality of the young veterinary graduates produced. Their knowledge, skills, attitude and ability to foresee the consequences of their work are of paramount importance. A more transparent system will be more credible and more widely accepted, so that veterinary schools will no longer be able to evade their responsibilities. Finally, the system should evolve into a central accreditation system.

Source : Jan Vaarten, FVE Executive Director

FVE Newsletter
May 24, 2005

Original web page at FVE Newsletter

Threats of emerging animal diseases transmissible to humans

The significant potential impact of emerging animal diseases transmissible to humans (zoonoses) and their pathogens on public health is a growing concern to all. Globalisation, industrialisation, the restructuring of agricultural systems and consumerism, among other factors, will certainly change animal health policies. Along with its historical missions, the OIE has been asked by its Member Countries to play a greater role in confronting the challenges of such zoonoses. In fact, emerging and re-emerging zoonotic diseases will progressively become a greater factor in public demands on Veterinary Services at national and international levels, thus affecting future partnerships, resources and programmes. In this context, the Director General of the OIE, Dr Bernard Vallat, was invited last week to a meeting convened by the World Bank on “Emerging Zoonoses and Pathogens: Global Public Goods Concern; implications for the World Bank”.

It was an opportunity to discuss and enhance mutual understanding of the importance of the early detection, prevention and control of animal diseases for public health and global trade. It was also an opportunity to agree on the role of Veterinary Services worldwide as a Global Public Good, and to consider the compliance of veterinary infrastructures with OIE international standards for evaluation and quality as a public investment priority.

“Veterinary Services and laboratories of developing countries are in urgent need of support to be able to meet these new challenges” Dr Vallat said. “This meeting was another step forward in sensitizing partner organizations, such as the World Bank, that emerging zoonoses and pathogens are a global public good concern. Capacity building and strengthening of Veterinary Services in terms of surveillance, rural network of veterinarians, early detection and rapid response capabilities, and legislation will provide the basis for better crisis prevention”, Dr Vallat added.

Boglet.com
May 24, 2005

Original web page at World Veterinary Association

No art please, we’re scientists

As one of the world’s biggest funders of biomedical science, the Wellcome Trust can usually gather scientists to a party like honey attracts bees. But researchers were strangely lacking from a recent bash in the atrium of the Trust’s swank new Gibbs Building, 215 Euston Road, London. Those who were present were there to celebrate a major new sculpture designed especially for the building’s cavernous atrium by the artist Thomas Heatherwick. “Bleigiessen” (which means lead-pouring in German) is a staggering assemblage of 142,000 shimmering glass spheres suspended over a height of 7 stories on 840 kilometers of wire. The beads are arranged to mimic the form taken by molten lead when dropped in water. The piece is supposed to be a powerful reminder of the Trust’s growing dedication to fostering work that takes place at the interface of art and science.

Scientists are often skeptical about the value of sci-art, but the Heatherwick work impressed me as extraordinarily beautiful. I wandered through the party crowd in search of a scientist’s view, but I struck out again and again. A choreographer, a video artist, two sculptors, a curator, and an architect all thought the work “striking,” “amazing,” “marvelous,” and so on, but no researchers could be found. I discovered later that it was an artists-only affair; scientists were not invited. The research community needn’t worry, however, that their source of free booze at the Wellcome Trust has dried up. “We will very much be having more events focused at scientists,” says Ken Arnold, the Trust’s head of public programs. “It’s just that this celebration was conceived around talking to the arts community.”

The Trust began funding such projects in 1996, and so far has generated more than 50 works. Nowadays, some £500,000 is allocated to such collaborations each year, and the Wellcome has plans to make its “public engagement” aims even more visible by providing a prominent public space in its old headquarters down the road from the new building. The Wellcome is not alone in pursuing this arena. The UK government has also recently devoted specific money to public engagement with science.

What would Henry Wellcome have made of his Trust spending half a million pounds each year on art? I asked Arnold. “I don’t think he would have minded,” he says. “After all, Henry spent a good deal of his own money buying ethnographic paraphernalia and etchings and so on.”
Wellcome didn’t buy the art for art’s sake, and neither does the Trust, Arnold says. “It very much reflects our view that science is part of a broader cultural picture, and it is possible to take a different view of science by looking at it through art.”

Taking that approach inevitably produces some dubious art portraying a superficial view of science. Arnold concedes that perhaps 30% of the results are disappointing, although another third are “absolutely extraordinary.” Mark Lythgoe, a neuroscientist from London’s Institute for Child Health, has been involved in a variety of sci-art collaborations. He says that “the only time the collaboration really works is when both sides take the time to understand the other’s language.” Lythgoe confesses, though, that he can’t comment specifically on the new glass-bead sculpture, as he wasn’t invited to the event. “For scientists not to be invited to the celebrations,” he says, “is a little sad.”

The Scientist Daily
April 26, 2005

Original web page at The Scientist

(Mis)leading Open Access myths

(BioMed Central responds to some of the most prevalent and most misleading anti-Open Access arguments)

Myth 1: The cost of providing Open Access will reduce the availability of funding for research
“There is also the question of the impact on the funding of research by charities, particularly those without the considerable resources of the Wellcome Trust. The Royal Society, for example, runs a number of funding schemes for scientists. Perhaps the best known are the University Research Fellowships (URFs), most of which are funded by our Parliamentary Grant in Aid (PGA). Our 300 University Research Fellows publish an average of about four papers per year. Based on an estimated fee of USD 3,000 per article (which we believe is realistic if the current high standards in publishing are to be maintained), an extra USD 3.6M or £1.96M per year would need to be found to fund our URFs alone. In the absence of an increase to our PGA we would be faced with the choice of reducing the amount of research funding allocated to our URFs, reducing the total number of URFs that we could support, or diverting funds from our other activities to compensate.”
Written submission to inquiry, February 2004, Royal Society
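The submission's arithmetic can be reproduced directly from the figures it quotes. The snippet below is a quick sanity check, not part of the original document:

```python
# Reproduce the Royal Society's cost estimate from the quoted figures.
fellows = 300            # University Research Fellows
papers_per_year = 4      # average papers per fellow per year
fee_usd = 3_000          # assumed Open Access fee per article

extra_cost_usd = fellows * papers_per_year * fee_usd
print(f"Extra annual cost: USD {extra_cost_usd:,}")   # USD 3,600,000

# The quoted £1.96M implies an exchange rate of about 1.84 USD/GBP,
# roughly the prevailing rate at the time of the February 2004 submission.
implied_rate = extra_cost_usd / 1_960_000
print(f"Implied exchange rate: {implied_rate:.2f} USD/GBP")
```

So the USD 3.6M figure follows straightforwardly from the stated assumptions; the debate below is about whether those costs are new money or a replacement for existing subscription spending.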

Response:
It is clear that at an overall macro-economic level, a switch to Open Access publishing would not negatively impact research funding.
The cost of the present system of biomedical research publishing, with all its inefficiencies and overly generous profit margins, still only amounts to about 1-2% of the overall funding for biomedical research (estimate from the Wellcome Trust, cited by Public Library of Science in their submission to the House of Commons inquiry). There is no reason why the cost of Open Access publishing should exceed the cost of the current system, since the fundamental process is the same. In fact, Open Access publishers are leading the way in using web technology to reduce costs further, so the cost of Open Access publishing to the scientific community will be significantly less than the cost of the system that it replaces.
Meanwhile, the vastly increased access to research that is delivered by Open Access will greatly increase the effectiveness of the research money that is spent, since all research builds on what has gone before it, and is needlessly handicapped if access to previous research is inconvenient, slow, or impossible. In short, funders will get more “bang for their buck”.
At the micro-economic level, there will certainly be transitions that need to be carefully managed as the Open Access publishing model grows in economic significance. For example, since the total cost of publishing scientific articles is roughly proportional to the amount of research to be published, it may well make sense for the costs of publishing to be incorporated into research funding grants, rather than being covered by library budgets. These are important issues, which deserve attention. But such transitional challenges should not be allowed to obscure the overall picture, which is that with the Open Access publishing model the scientific community will pay significantly less, yet receive vastly more (in terms of access and usability).

Update
On 29th April 2004 the Wellcome Trust published a report on the economic implications of Open Access publishing. The report (Costs and Business Models in Scientific Research Publishing) indicates that Open Access publishing could offer savings of up to 30%, compared to traditional publishing models, whilst also vastly increasing the accessibility of research.

Myth 2: Access is not a problem – virtually all UK researchers have the access they need
“All of us are committed to increasing accessibility of scientific content. I would argue that in the last ten years we have made a huge contribution to that, and I think 90 per cent worldwide of scientists and 97 per cent in the UK are exceptionally good numbers.”
Oral evidence to Inquiry, March 1st 2004, Crispin Davis (CEO, Reed Elsevier)

Response:
Elsevier’s figure of 97% of researchers in the UK having access to Elsevier content is misleading. As explained in the small print of their written submission, this refers to researchers at UK Higher Education institutions only, many of which have indeed taken out ScienceDirect subscriptions as a part of JISC’s “big deal” agreement.
However, these researchers do not have access to all ScienceDirect content by any means – the subset of journals that is accessible varies widely from institution to institution, meaning that access barriers are frequently a problem, even for researchers.
The access situation at institutions which focus primarily on teaching rather than research is particularly bad, but Elsevier disguises this by weighting each institution according to the number of ‘researchers’ employed, to come up with the 97% figure.
More fundamentally, the Higher Education sector is only one of several sectors carrying out biomedical research in the UK. Much medical research in the UK goes on within the NHS. Lack of online access to subscription-only research content within the NHS is a major problem, as detailed in a separate report. Similarly, Elsevier’s figures conveniently omit researchers employed at institutes funded by charities such as the Wellcome Trust and Cancer Research UK, and in industry.

Myth 3: The public can get any article they want from the public library via interlibrary loan
“I think the mechanisms are in place for anybody in this room to go into their public library, and for nothing, through inter-library loan, get access to any article they want.”
Oral evidence to inquiry, March 1st 2004, John Jarvis (Managing Director, Wiley Europe)
“Incidentally, any member of the public can access any of our content by going into a public library and asking for it. There will be a time gap but they can do that.”
Oral evidence to Inquiry, March 1st 2004, Crispin Davis (CEO, Reed Elsevier)

Response:
To say that being able to go to the library and request an interlibrary loan is a substitute for having Open Access to research articles online is rather like saying that a carrier pigeon is a substitute for the Internet. Yes, both can convey information, but attempting to watch a live video stream with data delivered by carrier pigeon would be a frustrating business.
Practically, the obstacles to obtaining an article via the interlibrary loan route are so huge that all but the most determined members of the public are put off. For those who persist, after a time lag that will typically be several weeks, their article may (if they are lucky) finally arrive in the form of a photocopy. What the user can do with that photocopy is extremely restricted compared to what they can do with an Open Access article.
With an online Open Access article, you can cut and paste information from the article into an email. With a photocopy you cannot.
With an Open Access online article, the license agreement explicitly allows you to print out as many copies as you like and distribute them as you see fit. But if you copy and distribute the article you received by Interlibrary Loan without seeking appropriate permission from the publisher, you may well be in violation of copyright law. It is also worth noting that an increasing fraction of public libraries now offer free or low-cost Internet access, making it even more convenient for the public to view Open Access research.

Myth 4: Patients would be confused if they were to have free access to the peer-reviewed medical literature on the web
“Without being pejorative or elitist, I think that is an issue that we should think about very, very carefully, because there are very few members of the public, and very few people in this room, who would want to read some of this scientific information, and in fact draw wrong conclusions from it […] Speak to people in the medical profession, and they will say the last thing they want are people who may have illnesses reading this information, marching into surgeries and asking things. We need to be careful with this very, very high-level information.”
Oral evidence to inquiry, March 1st 2004, John Jarvis (Managing Director, Wiley Europe)

Response:
This position is extremely elitist. It also defies logic. There is already a vast amount of material on medical topics available on the Internet, much of which is junk.
Can it really be beneficial for society as a whole that patients should have access to all the dubious medical information on the web, but should be denied access to the scientifically sound, peer-reviewed research articles?
In some cases, to be sure, comprehending a medical research study can be a demanding task, requiring additional background reading. But patients suffering from diseases are understandably motivated to put in the effort to learn more about their conditions, as the success of patient advocacy groups in the USA has shown. Patients absolutely should have the right to see the results of the medical research that their taxes have paid for.

Editor: Jonathan B Weitzman.

Myths 5 to 11 are addressed in the full response at BioMed Central.

BioMed Central
March 15, 2005

Original web page at BioMed Central