
Trial and ethics

There is no better example of the problems of running trials in developing countries than the drug tenofovir. This reverse-transcriptase inhibitor is effective in combination therapy for HIV infection. Six randomised placebo-controlled trials were planned to test the drug for preventing HIV infection. The US Centers for Disease Control and Prevention (CDC) planned trials in Botswana and Thailand, and Family Health International (FHI), also based in the USA, planned trials in Ghana, Cameroon, Nigeria, and Malawi.

In mid-March, FHI decided to cancel its trial in Nigerian prostitutes after local researchers failed to meet “the necessary scientific standards”. In February, Cameroonian authorities stopped the tenofovir trial in prostitutes there. Act Up-Paris, perhaps the most vigorous of the HIV/AIDS lobby groups, accused the trialists of acting unethically by not supplying treatment after the study, and by choosing to do their trial in women at high risk in a part of the world where a study is cheap. The study is expected to restart soon. The Thai trial in intravenous drug users was approved in early March, but an AIDS lobby group castigated the trialists for not providing treatment after the study and for not supplying free clean syringes and needles.

Such accusations are not wholly fair. The Thai Government will not allow clean injecting equipment to be distributed, and in any case the CDC is barred from distributing it by the US Government, which opposes such programmes. Trials in developing countries are cheaper than those in the western world, and, in a prevention study, recruiting participants at high risk of HIV infection does mean that a smaller sample is needed and that the study finishes sooner. However, tenofovir’s manufacturer has said it will supply the drug at cost if it is approved for HIV prevention in the developing world.

Last week, the UK’s Wellcome Trust released ethical guidelines for clinical trials in developing countries. The release was timed to coincide with a discussion paper from the Nuffield Council on Bioethics, also in the UK. Access to care after a study is a crucial difference between trials in the western world and in a developing country, where access may be limited or absent. The Wellcome Trust guidelines include a “reasonable expectation” that there will be such access and “encourage” grant applicants to think about this point, while noting that “in some cases it may be difficult to estimate . . . how financially or logistically feasible it would be for a successful intervention to become available to patients within the host country”. But guaranteeing treatment after a study, especially costly drugs for a chronic disease, is probably unfeasible, the Nuffield discussion paper says. It suggests instead that negotiations about aftercare begin early in the planning phase. And, it asks, who is to decide whether a treatment is effective, and what about the delay between that decision and regulatory approval of the drug?

The tension lies between the imperative about aftercare in the Declaration of Helsinki and the softer suggestions in other ethical guidelines. The 2000 Declaration of Helsinki states: “At the conclusion of the study, every patient entered into the study should be assured of access to the best proven prophylactic, diagnostic and therapeutic methods identified in the study”. The World Medical Association, which issues the Declaration, affirmed this position in May last year, after strongly felt representations had been made to it. Other guidelines, for instance those from UNAIDS in 2000, call for the highest locally available level of care for participants in HIV vaccine trials. UNAIDS is planning consultations to reach a consensus on the ethics of prevention trials.

CDC and FHI investigators did meet local community groups and potential participants before starting their trials. In a comment on HIV vaccine trials in today’s Lancet, Naihua Duan highlights the importance of consumer research before a study. “HIV vaccine and vaccine trials, as public-health products that require consumer uptake and participation, also need to be grounded in consumer research to be successful in the public-health marketplace”, Duan writes.

In another comment in today’s Lancet, Peter Lurie and Dirceu Greco lambast the US Food and Drug Administration for trying to subvert the Declaration of Helsinki. But a debate in the Journal of Medical Ethics last year shows that there is no consensus about “standard of care”.

Clearly, this dilemma needs sorting out, and the sorting out will need to start with funders. The Bill & Melinda Gates Foundation funded the FHI trials with US$6.6 million. Such funders need to think about also paying for and providing treatment after a trial ends.

The Lancet
April 12, 2005



Could dandruff be altering the world’s climate?

Along with fur, algae, pollen, fungi, bacteria, viruses and various other “bio-aerosols” wafting around in the atmosphere, it may well be. A global study has found that tiny fragments of biological detritus are a major component of the atmosphere, controlling the weather and forming a previously hidden microbial metropolis in the skies. Besides their climatic influence, they may even be spreading diseases across the globe.

Scientists have known for some time that aerosols of soot, dust and ash can influence climate by reflecting or absorbing the Sun’s rays and by providing the condensation nuclei necessary for clouds to form. Recent research suggests that aerosols are also responsible for “global dimming”, which may be shading us from the full force of warming from greenhouse gases. But new findings from Ruprecht Jaenicke, of the Institute of Atmospheric Physics at the University of Mainz, Germany, show that a large fraction of the aerosols in the atmosphere are biological in origin.

Air samples collected by Jaenicke from over Germany, Siberia, the Amazon rainforest, Greenland and remote oceans found that tiny particles of organic detritus, much of it in the form of biological cells, make up about 25% of the atmospheric aerosol. Jaenicke estimates that around a billion tonnes of bio-aerosols enter the atmosphere every year from fields and forests, animal pastures and cities. That is twenty times previous estimates and similar in scale to mineral dust. Many of the tiny organic particles have shapes and structures that help form clouds and create rain. Particles made up of biological cells in particular are good at absorbing moisture in the air to form cloud condensation nuclei, says Jaenicke.

But the impact of bio-aerosols on global temperatures could be harder to predict, says Tim Lenton, an Earth systems researcher at the University of East Anglia in the UK. By dispersing solar radiation and shading the planet’s surface, “dry bio-aerosols will have a cooling effect on climate. But wet bio-aerosols could warm the Earth’s surface, especially at night, by contributing to fog and low-level cloud.” Some researchers believe that certain bacteria may have evolved to spend time in the air and create clouds and rain, as a Darwinian ploy. “Organisms are probably using winds and rain created by clouds as an effective means of dispersing themselves or their spores,” says Lenton.

More worryingly, some researchers spoken to by New Scientist argue that Jaenicke’s findings raise a new health threat. Gene Shinn, a marine biologist with the US Geological Survey in St Petersburg, Florida, US, says: “Jaenicke’s list of bio-aerosols includes proteins that are well-known allergens, especially to people with asthma.” Shinn believes that dust storms spreading across the Atlantic Ocean from the Sahara desert contain bacteria and proteins that have caused epidemics of coral disease in Caribbean reefs and widespread asthma on some islands.

Journal reference: Science (vol 308, p 73)

New Scientist
April 12, 2005



Electronic tags for eggs, sperm and embryos

In 2002, two proud and relieved parents, Mr and Mrs A, saw their newborn twins for the first time, conceived after a long and difficult course of IVF treatment. At last it all seemed worthwhile. Except the babies were of mixed race, while both parents were white. The IVF clinic had blundered, and used the wrong sperm to fertilise Mrs A’s eggs. The twins’ biological father was Mr B, a man the couple had never met and who, with his partner, was also trying for a family using IVF. Similar accidents have happened in the US and the Netherlands.

Now, in a bid to stop such mistakes happening again, the UK’s regulatory body, the Human Fertilisation and Embryology Authority (HFEA), is considering labelling all embryos, eggs and sperm with barcodes or electronic ID tags. The idea, discussed at the HFEA’s annual conference in London last month, is that an alarm will sound if the wrong eggs and sperm are brought close to one another, for instance, or if a doctor attempts to collect the wrong embryo to implant into a mother-to-be. In June 2004, an independent report commissioned by the UK’s chief medical officer suggested clinics use a system of double-witnessing, which requires an embryologist to ask a colleague to witness and document every procedure in which an error could occur. But with 25 such procedures required for each round of IVF, the system is laborious. And it still leaves room for human error.

Steve Troup, an embryologist on the HFEA’s advisory group on safety and new technologies, is looking into alternatives. Barcoding has been used for more than a decade in the UK’s blood transfusion service, where it has slashed the error rate. Now IMT International, in Chester, UK, is developing barcodes for IVF procedures. Digital cameras built into the clinic’s benches capture the barcodes on the bottom of labelled dishes containing eggs; a computer then decodes them and sounds an alarm if they do not match the patient. “Our system is incredibly safe,” says Tim Haywood, director of IMT International.

The electronic tags, known as RFID tags, work in a similar way. They can be placed on the bottom of a dish containing an embryo, and are activated by radio waves transmitted across a clinic’s designated work areas. When activated, RFID tags respond by transmitting a unique ID code. “If the samples don’t match [the patient], or you bring together two things that shouldn’t be in the same work area, the alarms will sound,” Troup says.
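Neither company’s implementation is described in detail, but the matching rule both systems rely on is simple to state. The sketch below is a hypothetical illustration only (the `check_work_area` function and the patient identifiers are invented, not from IMT International or Research Instruments): an alarm is raised if a work area contains material from more than one patient, or material that does not belong to the patient whose procedure is under way.

```python
def check_work_area(scheduled_patient, tag_reads):
    """Return a list of alarm messages for one designated work area.

    scheduled_patient: ID of the patient whose procedure is under way.
    tag_reads: patient IDs decoded from the barcodes or RFID tags of
    every dish currently detected in the work area.
    """
    alarms = []
    # Alarm if dishes from two different patients share a work area.
    if len(set(tag_reads)) > 1:
        alarms.append("mixed patient material in work area")
    # Alarm for each dish that does not belong to the scheduled patient.
    for tag in tag_reads:
        if tag != scheduled_patient:
            alarms.append(f"dish {tag!r} does not match patient {scheduled_patient!r}")
    return alarms

# Example: Mrs A's procedure, but a dish tagged for patient B is present.
print(check_work_area("patient-A", ["patient-A", "patient-B"]))  # prints two alarms
```

In a real system the hard part is reliably reading every tag in the area; the comparison itself, as above, is trivial.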

The HFEA is investigating whether such a system would be safe, as there are concerns that radio waves might harm embryos. IVF Witness, an RFID system being developed by Research Instruments, in Falmouth, UK, has been tested on mouse embryos. The embryos are placed in Petri dishes which have tags attached to the bottom, and placed in an incubator with an antenna that activates the tags.

In Research Instruments’ tests, the tags transmitted continuously for four days without any perceptible effect on the embryos. Though the tests are not complete, “it looks very, very good that there’s going to be no problem with it,” David Lansdowne, technical director at the company, told New Scientist. Troup’s personal view is that RFID tags will be safe for in vitro procedures. The tags only transmit when activated by an external signal, and they work at the low frequency of 13.5 megahertz, compared with the 900 to 1900 megahertz used by cellphones.

The advisory group will need to be satisfied that such an RFID system would not significantly heat up an embryo, or cause other as-yet-unknown problems. Troup’s research team at the University of Liverpool, together with researchers at the University of Manchester, will be carrying out more work looking at the effect of radio waves on mouse embryos. Lansdowne says his team will be measuring the field strength from the RFID tags when the embryos are being worked on, and comparing this with background levels of radio waves.

New Scientist
April 12, 2005



US Food and Drug Administration and ethics

In June, 2004, the US Food and Drug Administration (FDA) proposed that clinical research projects abroad that are not conducted under an application for an investigational new drug (IND) need no longer comply with the Declaration of Helsinki, a document described by the Canadian Medical Association as “the stone tablet of medical research ethics”.

At issue are certain studies outside the USA without an IND application that might not come to the attention of the FDA until a drug company later seeks approval in the USA by filing a new drug application (NDA). Current FDA regulations require studies submitted in support of such an NDA to have been done in a manner consistent either with the Declaration of Helsinki or with local laws, whichever is more protective of patients. The FDA’s proposal would remove these requirements entirely and mandate only that the submitted studies be consistent with the Good Clinical Practices (GCP) guidelines of the International Conference on Harmonisation. However, the GCP guidelines were not developed transparently, and they mainly address procedural issues, not overarching ethical ones. They do not, for example, address conflict of interest, the need to publish results, or post-trial availability of successful treatments to study participants or community members, topics all included in the Declaration of Helsinki. The agency states that it is concerned with “ensuring quality of data” and that the GCP guidelines are therefore necessary. Why not simply have FDA regulations refer to both the GCP guidelines and the complementary Declaration of Helsinki?

The FDA also worries that the Declaration of Helsinki could be modified “independent of FDA authority”, although the GCP guidelines themselves are not immutable, and the agency does acknowledge that any revisions “could not supersede US laws and regulations”. Ironically, the FDA has already deftly evaded the 2000 modifications to the Declaration of Helsinki by declaring in 2001 that the reference to the declaration in FDA regulations was not actually to the current version, but rather to the weaker, now defunct 1989 version.

The FDA and other agencies within its parent agency, the Department of Health and Human Services, and the US drug industry have led the charge against many substantive improvements in several international ethics documents. Their efforts have been least successful with the Declaration of Helsinki, providing an alternative motivation for the FDA’s proposal. In reports dating back to 1996, and in meetings about the Declaration of Helsinki and a related document prepared by the Council for International Organizations of Medical Sciences, the Department of Health and Human Services and its agencies have spearheaded efforts to limit the rights of participants in clinical trials and their communities, particularly in developing countries.

The FDA’s concerns have focused on two areas, both conveniently absent from the GCP guidelines. For placebo use, the FDA complained that the language in the 2000 Declaration of Helsinki precludes the use of placebos in studies of minor conditions. By exerting influence unavailable to the less influential, the agency succeeded in forcing through a “clarification” that permits such use. And the FDA defends the use of placebos in studies of treatable life-threatening conditions in developing countries, and fears that the Declaration of Helsinki precludes such trials.

The Department of Health and Human Services and the FDA have also argued forcefully against the requirement in the Declaration of Helsinki that effective drugs be provided to all study participants at the conclusion of the research. This requirement is particularly critical in developing countries, where, due to poverty, those in the active study arm might see their drugs abruptly discontinued and those in the control group might be denied the benefits of the very therapies whose efficacy they have just helped prove. The FDA representative at the September, 2003, meeting of the World Medical Assembly failed to get this provision reversed. The same kinds of pressure were applied to the revision of the guidelines from the Council for International Organizations of Medical Sciences with more success.

The Department of Health and Human Services and its daughter agencies, ably assisted at times by the US drug industry, have crafted a strategy that has reached its apotheosis in the current FDA proposal. In a particularly Orwellian touch, the National Institutes of Health (also part of the Department of Health and Human Services) recently relied heavily on ethics documents developed in the USA and the UK, as well as the heavily US-influenced document from the Council for International Organizations of Medical Sciences, to declare that the Declaration of Helsinki does not represent the “consensus view” on the use of placebos in developing-country trials.

The Declaration of Helsinki is not a perfect document. But at least it has the virtue of being the product of a quasi-democratic process. The declaration is produced by the World Medical Association, a body of 82 national medical associations (government agencies cannot be members). It can be amended only by a formal vote before the full World Medical Assembly, which meets annually. By contrast, the guidelines from the International Conference on Harmonisation are the product of negotiations by just six parties: the regulatory authorities and drug industries of the USA, the European Union, and Japan. Input from consumers and developing countries asymptotically approaches zero.

The Declaration of Helsinki is the standard-bearer for international research ethics and enjoys particular respect in the developing world. It would be tragic if the US tendency to arrogantly flout international mores claimed the declaration as another victim, even as the President touts the universalism of human rights.

Source: Peter Lurie, Dirceu B Greco

The Lancet
April 12, 2005



Lessons in senescence

The cells stop dividing; the studies keep multiplying.
[Figure: Exogenous expression of the ras oncogene triggers senescence in human fibroblasts. Staining with DAPI reveals the gradual appearance of senescence-associated heterochromatin foci, transcriptionally silent regions of DNA. Four days after cells are infected with a ras-containing retrovirus, nuclei also show a marked colocalization between heterochromatin protein 1β and promyelocytic leukemia protein; ten days after infection, colocalization decreases.]

When aging and damaged cells undergo apoptosis or malignant transformation, their lives reach a dramatic dénouement: suicide or cancer. But when they undergo senescence, their destiny seems drab in comparison. They irreversibly exit the cell cycle, linger indefinitely, and die of undetermined causes. Senescent fibroblasts, in particular, “get big and flat, and they look ugly,” resembling fried eggs, notes Scott W. Lowe, a professor at Cold Spring Harbor Laboratory on Long Island.

For decades, senescent human cells were found only in culture. But since the 1990s, investigators have detected senescence in T lymphocytes extracted from HIV-infected people and in tumors exposed to cancer treatments. These findings should have softened persistent doubts about the phenomenon’s relevance in vivo. Yet the senescent cell remained a cytological ugly duckling. Not as scientifically glamorous, as revealing, or as well defined as the apoptotic or transformed cell, it received far less attention.

Lately, however, that ugly duckling has matured into a more formidable bird. Accumulating evidence suggests that even if few senescent cells normally exist in vivo, their secretions promote diseases and aging, says Felipe Sierra, director of the cell structure and function program at the US National Institute on Aging (NIA). As a result, he senses a modest resurgence in the senescence field after what he calls “a very dark period a few years ago.” Last January, an NIA-sponsored workshop explored possible interactions between senescent cells and the extracellular matrix. The meeting, held at the Buck Institute for Age Research in Novato, Calif., was intended as a wake-up call to its 18 participants to “start worrying” about the physiological effects of these interactions, says Sierra.

Another boost to senescence studies comes from new findings on intracellular mechanisms. Earlier research established that the p53 and Rb tumor-suppressor pathways are vital to the process, and the cyclin-dependent kinase inhibitors p16 and p21 also play roles. Yet senescence is far from fully characterized on the molecular level. One recent paper identifies proteins and mechanisms in a novel pathway, and another report implicates an enzyme not previously linked to senescence.

A deeper understanding of tumorigenesis is the likeliest outcome of these in vitro advances. René Bernards, a professor at the Netherlands Cancer Institute in Amsterdam, acknowledges that cell-culture senescence could be an experimental artifact. But Bernards, who conducts RNA interference screens for senescence-related genes, contends, “It’s a useful artifact because it involves many of the players that are normally deregulated in cancer.”

Links between senescence and the organism-wide aging process, on the other hand, are more tenuous and their therapeutic lessons more problematic. Efforts to reduce senescence in aging tissues “might end up promoting cancer,” cautions Peter D. Adams, a biologist at Fox Chase Cancer Center in Philadelphia.

Scientists originally induced senescence by serially passaging cells in culture. After a cell line has replicated several dozen times, telomeric erosion leads to mitotic arrest. In the past decade, studies have established that activated oncogenes, DNA damage, or oxidative stress can also trigger senescence. Different experimental protocols and culture conditions yield subtly different types of senescence. Moreover, its manifestations vary between cell types and even within a broad class of cells such as the fibroblasts.

Researchers agree nevertheless that all senescent cells probably undergo chromatin remodeling that permanently prevents their reentry into the mitotic cycle. Estela E. Medrano, a biology and dermatology professor at Baylor College of Medicine in Houston, studies histone acetylation and deacetylation, chromatin changes that respectively enhance and repress gene transcription. Her specialty is the melanocyte whose senescence might cause aging human skin to become mottled, and whose inability to senesce could foster melanomas.

Medrano postulates that senescence is mediated by a competition between histone acetyltransferases (HATs) and histone deacetylases (HDACs) to bind to promoters of cell-cycle genes. Excessive HAT or HDAC levels can each trigger senescence, she maintains. Her lab is investigating cellular complexes that sense these levels. “We want to know how and when the complex formations occur when the cells are aging in culture,” says Medrano.

In 2003, Lowe’s lab reported a striking chromatin development in some senescent fibroblasts. After DNA staining, their nuclei displayed many small, distinct spots containing heterochromatin, which is transcriptionally inactive. In contrast, DNA staining and heterochromatin markers were more uniformly distributed both in quiescent cells, which temporarily forgo mitosis under low-serum conditions, and in senescent fibroblasts that Lowe now hypothesizes lack a robust p16 response.

Formation of these small spots, which Lowe called senescence-associated heterochromatic foci (SAHF), was linked to repression of genes targeted by the transcription factor E2F; these genes encode mitogenic proteins. Lowe recognizes that SAHF might merely be a consequence, not a cause, of senescence. Yet he plans to explore “how these genes get silenced” and particularly how p16 and Rb contribute to the process.

Adams has already uncovered a pathway to SAHF that he says might operate parallel to the Rb pathway. By his own admission, Adams is not a senescence expert. But he recalls wondering, in reaction to Lowe’s paper, whether SAHF creation and exit from the cell cycle might be promoted by human homologs of certain yeast proteins; other researchers had found that the yeast proteins contribute to gene-silencing by helping form heterochromatin. Adams’ group subsequently uncovered several landmark events in a SAHF pathway. Participants include a histone H2A variant and nuclear bodies containing promyelocytic leukemia protein, a tumor suppressor.

Adams views his task now as filling in the gaps between these events. “It’s like digging a tunnel from England to France,” he explains. “You dig from England and you dig from France. And, hopefully at some point, you meet in the middle.” Lowe says that follow-up experiments should test whether manipulations to the molecules in Adams’ model would, in the long term, make cells resistant to senescence or, if not, would enable senescent cells to proliferate again.

Heterochromatin represses gene activity, but many genes in senescent cells actually display higher expression levels. The repression of repressors could account for some of this increased activity. Senescence experts insist, however, that euchromatin, which facilitates gene transcription, must also be involved. Data from a cDNA microarray study provide tentative support for this contention. Hong Zhang, a postdoc working with genetics professor Stanley N. Cohen at Stanford University School of Medicine, examined the gene-expression profiles of human fibroblasts and mammary epithelial cells. By filtering out genes upregulated in quiescence, the study identified transcriptional fingerprints unique to senescence, not those merely correlated with cell-cycle arrest. It also found that upregulated senescence-specific genes were physically clustered, an arrangement consistent with euchromatin formation.
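The quiescence-filtering step described above is, at heart, a set operation on gene lists. The sketch below is a minimal illustration under invented assumptions (the gene names, fold-change values, and threshold are made up for demonstration; Zhang and Cohen’s actual analysis pipeline is not described in this article):

```python
# Hypothetical expression fold-changes relative to proliferating cells.
senescent = {"smurf2": 4.1, "p21": 3.0, "mmp3": 2.8, "cycA": 0.3}
quiescent = {"p21": 2.7, "cycA": 0.4, "geneX": 1.0}

UP = 2.0  # fold-change threshold for calling a gene upregulated

up_in_senescence = {g for g, fc in senescent.items() if fc >= UP}
up_in_quiescence = {g for g, fc in quiescent.items() if fc >= UP}

# Genes unique to senescence, not merely correlated with cell-cycle
# arrest: subtract everything also upregulated in quiescent cells.
senescence_specific = up_in_senescence - up_in_quiescence
print(sorted(senescence_specific))  # ['mmp3', 'smurf2']
```

Here p21 drops out because it rises in both states, leaving only the senescence-specific signature, which is the logic behind the “transcriptional fingerprint” claim.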

“The clustering is sort of an in silico experiment,” notes Zhang. “You do it computationally, and it’s very exciting. But I think the first thing I need to do is to confirm it experimentally, to see whether there is a chromatin-structure alteration that occurs during senescence.” One possible approach, he adds, is a so-called “ChIP-Chip” experiment. Antibodies that bind, for example, to acetylated histones could be used to immunoprecipitate euchromatin-associated genomic regions, which then would be fragmented and identified on a genomic microarray.

Instead, Zhang has focused on smurf2, a ubiquitin ligase whose gene is upregulated in senescent cells. He and Cohen induced high levels of smurf2 expression in early-passage human fibroblasts, whose telomeres presumably were not exhausted. Showing no stress response or detectable DNA damage, the cells entered senescence if their p53 or Rb pathway was functioning. Intriguingly, the smurf2-induced senescence did not appear to depend on the enzyme’s ubiquitin ligase activity.

Protein upregulation, not as a cause but as an effect of senescence, is the bailiwick of Judith Campisi, a senior scientist at Lawrence Berkeley National Laboratory in Berkeley, Calif. A theory that she and others have touted is that senescent cells, far from being physiologically inert, secrete proteins that stimulate tissue aging and tumorigenesis. These secretions include degradative enzymes, inflammatory cytokines, and growth factors. Campisi estimates that 30 to 40 proteins are involved.

In a study published in February, members of Campisi’s lab irradiated human fibroblasts, causing them to senesce. The researchers injected these cells into mice together with immortal but nontumorigenic mouse mammary epithelial (MME) cells, and the murine cells formed malignant tumors. Further experiments suggested that this tumorigenic conversion was partly mediated by matrix metalloproteinase-3, an enzyme secreted by the senescent cells. In other experiments, the senescent fibroblasts were cultured with another nontumorigenic MME cell line. The MME cells formed abnormal alveolar structures and produced twofold less of a major milk protein.

Campisi draws two lessons from the study. The first is that senescent cells in vitro can disrupt a normal tissue’s function and structure, a process that she suggests might similarly occur during aging. The second is that, as experiments increasingly reveal which secreted factors yield particular outcomes, “we might be able to modify the senescent phenotype in a tissue-specific and situation-specific manner” so as “to intervene in an intelligent way.”

[Figure: Chromatin remodeling is thought to mediate senescence in human melanocytes. According to one hypothesis, the histone acetyltransferase p300 is removed from promoters of certain cell-cycle regulatory genes, and histone deacetylases (HDACs) are added. The resulting repression of those genes leads to a halt in mitosis.]

Investigations of senescence in vivo are also continuing. Rita B. Effros, a pathology and laboratory medicine professor at the University of California, Los Angeles, has long taken a leading role in characterizing senescence in CD8+ T lymphocytes. In 1996, she and colleagues reported that, in HIV-infected people, some of these so-called cytotoxic or killer T cells displayed short telomeres and could no longer proliferate. To combat the virus, which infects CD4+ helper T cells, these HIV-targeted CD8+ cells presumably divided so much that they became senescent. A similar phenomenon occurs in elderly people harboring cytomegalovirus, another latent virus.

Senescent T cells, which resist apoptosis, accumulate over time. The immune response eventually suffers, according to Effros, possibly because the cells secrete certain cytokines or because their overwhelming presence depresses the generation of T cells that target other antigens. In a recent study, Effros’ lab inserted a gene encoding human telomerase into a culture of HIV-targeted killer T cells taken from people infected with the virus. This manipulation kept the cells from senescing but failed to improve their cytotoxic efficiency. Acknowledging the impracticality, if not danger, of gene therapy, Effros says she is collaborating with Geron Corporation, of Menlo Park, Calif., to test the effects of a telomerase-activating compound on immune cells. The goal, she adds, is a “pharmacologic way of manipulating telomerase that would selectively affect normal T cells and improve their function.”

While Effros is trying to prevent senescence, Igor B. Roninson, director of the cancer center at Ordway Research Institute in Albany, NY, hopes to impose a safe form of senescence on the wildly proliferating cells of malignant tumors. In the 1990s, Roninson’s lab discovered that various chemotherapeutic drugs could induce terminal proliferation arrest in different human tumor cell lines. This effect, which also occurs after radiation treatment, was later detected in breast carcinomas excised from patients who underwent chemotherapy.

Cancer treatments appear to cause cellular senescence, Roninson explains, by damaging DNA and thereby activating various signal-transduction pathways. He observes that the process is often not immediate; video microscopy indicates that some senescing cells first pass through a state called mitotic catastrophe. Induction of senescence is also not without its hazards. A cDNA microarray study by Roninson’s lab showed that senescent cancer cells upregulate genes that encode tumor-promoting, as well as tumor-suppressive, secreted factors.

Roninson’s newly formed company, Senex Biotechnology, also in Albany, is seeking drugs that stimulate the beneficial side of this process. One class of compounds would “induce senescence with minimal cytotoxicity and with preferential expression of growth-inhibitory genes over tumor-promoting genes,” he says. Retinoids belong to this class, but Roninson notes that their utility is limited because many tumor cells lose their retinoid receptors. Another class of compounds would prevent the induction of tumor-promoting genes in cells that have senesced as a result of other treatments. Roninson reports that such activity is faintly displayed by nonsteroidal anti-inflammatory drugs that inhibit the transcription factor NF-κB. Senex is developing more efficient compounds, he adds.

Some researchers have qualms about fighting cancer by promoting senescence. Given the many factors secreted by senescent cells, Campisi observes, “If I had a tumor and I was being treated by chemo, I would want those tumor cells to die.” And noting that cultured senescent cells can be genetically tweaked to reengage in mitosis, Bernards asserts, “The only good tumor cell is a dead tumor cell, as they say.”

Roninson responds that cytostatic drugs would avoid the cytotoxicity that “is the principal cause of the negative side effects” experienced by cancer patients. Stressing scientists’ longstanding quest for such drugs, he says that he is “not in the minority” in his goal of forcing tumor cells to undergo permanent growth arrest, rather than only apoptosis. Instead, he maintains, “it’s the aficionados of apoptosis who are in the minority by denying this as a goal.”

The Scientist
April 12, 2005

Original web page at The Scientist

European Center for Disease Prevention and Control begins recruiting

The head of the new European Centre for Disease Prevention and Control (ECDC) has begun the process of hiring staff. Zsuzsanna Jakab, who took up her post as director of the new agency on March 1, told The Scientist that 29 people, including 16 scientists, will be recruited in the current financial year, and the European Commission is expected to start advertising the posts immediately. By 2010, Jakab predicted, the total number of staff could be 300.

Jakab has also been putting together an advisory panel of approximately 50 experts from various member states to give expert guidance. The panel is scheduled to hold its first meeting on April 28 at the ECDC’s Stockholm headquarters. She will likely hire a deputy in May. The European Union has given Jakab until the autumn to draw up a strategy document that sets out the new organization’s objectives and how it plans to achieve them.

To buttress the new body’s limited funding of €4 million ($5.2 million) this year, €14 million ($18.2 million) in 2006, and €29 million ($37.6 million) in 2007, Jakab is lobbying member states to pay the salaries of high-ranking experts who actually work at the center. Sweden and France have each already committed one. “They are called detached national experts,” Jakab said. “They are funded by their national governments but they will be working at the ECDC under my authority. I’ve been seeing lots of health ministers from around the EU and I’ve been asking them all to consider seconding additional experts in either the short or long term.”

One of the biggest obstacles facing Jakab is the improvement of data quality so that the ECDC can provide an early warning system on emerging infectious diseases. “I want to see the ECDC as a one-stop shop for EU countries, allowing experts in one member state to compare statistics with the other 24,” she said. “I strongly believe this is a major step forward, and if, in 2 or 3 years’ time, we have a surveillance system that is the best in Europe, I would be very pleased.”

Angus Nicoll, head of the Centre for Infectious Diseases at the Health Protection Agency in the United Kingdom, told The Scientist he believes the ECDC could play a crucial role in protecting the health of an expanding EU population. “This project is very exciting because it’s much easier for infections to move around inside Europe than it used to be,” Nicoll said. “Having a center that can control that is potentially very useful.” Nicoll, who is a member of Jakab’s advisory panel, believes the ECDC is a chance to avoid previous mistakes in deciding who’s responsible for dealing with outbreaks in Europe. “The EU is not always great at clarifying things. It did not do terribly well during the SARS [severe acute respiratory syndrome] scare [of 2003] because there was no central body organizing the information. Instead, we relied almost entirely on the World Health Organization in Geneva,” Nicoll said. “Although EU countries do work well together, we don’t have a ‘ringmaster’ who can get things working when needed.”

“The other issue is, will it be fully operational or just an observatory of the EU? If one country reports an outbreak and the ECDC did nothing, it would not be very useful,” Nicoll said. “Equally, you don’t want it just going into a country and saying ‘We’ve come to sort it out.’” Nicoll pointed out that current EU surveillance relies on a network of strategic hubs that feed into the EU from member states. The United Kingdom, for example, has centers that monitor gastrointestinal infections and Legionnaires’ disease; France has hubs for tuberculosis and HIV. “One controversial question is, will the new center suck up all the EU money that comes to these hubs?” Nicoll wondered. “I hope it doesn’t, because my worry would be that it would put up a barrier between us and them.”

The Scientist
April 12, 2005

Original web page at The Scientist

Biotechnology reenergized

[Figure caption: Advanced biotechnology can reduce stem length to make the trunk shorter and thicker, reduce the number of branches and leaves, increase growth rate, improve adaptation to hostile climates, reduce negative responses to competition, increase carbon sequestration, and improve the partitioning of biomass into components more favorable for subsequent conversion.]

The completion of the Human Genome Project (HGP) symbolizes the entry of biology into the “big science” arena. What constitutes big science may be in the eye of the beholder, but the term generally means using high technology on a large scale and mobilizing substantial teams to tackle problems in a highly organized and structured way. This is an exciting time, and there is much talk, mostly justified, that this new century will be the century of biology.

For obvious reasons the major focus of this new biology has been on medicine and human health. Faster and cheaper sequencing and genotyping are expected to usher in an age of early diagnostics, individually tailored treatments, and effective gene therapies. But another quiet revolution is taking place in biotechnology for energy, industrial, and environmental applications. Although some preliminary attempts had been made in the past, for example in bioremediation, the tools were never adequate to tackle problems in a rigorous way.

The Department of Energy (DOE) is credited with launching the Human Genome Project and, as a founding member of the International Human Genome Sequencing Consortium, determined the sequences of chromosomes 5, 16, and 19. The DOE also established the Microbial Genome Program (MGP), which since 1995 has supported microbial biology research and sequenced nearly 200 microbial and related genomes and 12 microbial community genomes. Much of the success of the DOE’s relatively modest biological program rested on its position among the behemoths of the physical sciences, including scientific computing. That close relationship emboldened the DOE’s biology managers to access many of the physical and computational science tools that helped fuel this revolution.

With the completion of the HGP and the evolution of the MGP, the new engine of the DOE’s biotechnology effort is the Genomes To Life (GTL) program. Conceived and designed by our scientific advisors, GTL is a joint undertaking by the biology and high-performance computing science offices of the DOE. GTL focuses primarily on microbes and microbial communities and is aimed at applications of clean energy production, bioremediation of mixed (radioactive and chemical) wastes, and enhanced carbon sequestration by the biosphere.

The scientific focus of GTL is on the multiprotein molecular machines within cells and the regulatory networks that drive them. All experimental and high-end computing efforts are driving toward a robust predictive capability that can help design the systems that will deliver applications important to the DOE. Existing and new tools will be absolutely necessary to realize the GTL goals, including the DOE sequencing factory at Walnut Creek, Calif.; the X-ray, neutron, and nuclear magnetic resonance sources; the DOE supercomputers; and a new generation of proposed user facilities such as those for protein and molecular tag production, imaging, and proteomics.

Derek Lovley at University of Massachusetts at Amherst and his team focus on microbial Geobacter species. The team has demonstrated how these microbes can effectively reduce soluble uranium (VI) to insoluble uranium (IV) and thus immobilize these radionuclides and other toxic metals from contaminated groundwater. Subsurface uranium is a serious environmental problem at many of the DOE sites around the United States where nuclear weapons were manufactured during the cold war. The capacity of naturally occurring Geobacter species to transfer electrons to oxidized metals can be stimulated simply with the addition of an acetate solution to the groundwater.

The second example comes from the laboratory of the J. Craig Venter Science Foundation. Venter and his team are inventing ways to produce a synthetic genome. Having achieved their goal for a virus, they now have their sights set on a microbial genome. One of their objectives is to synthesize the “minimal” microbial genome in order to focus investigations solely on aspects of microbial biochemistry that are of interest to GTL. Creating the minimal genome of a hydrogen-producing extremophile, for example, may provide valuable insights into ways to produce hydrogen on an industrial scale.

[Figure caption: Cellulases are a broad family of enzymes that convert cellulose to glucose. The exoglucanase starts at a nick in the cellulose and proceeds down the track, converting the polysaccharide to double glucose molecules that are then split by another cellulase. The exoglucanase has three distinct domains: catalytic, linker peptide, and binding. The binding domain extracts the track of polysaccharide from the cellulose and presents it to the active site within the catalytic domain.]

The third example started at the DOE Joint Genome Institute with the sequencing of a Populus tree, a common biomass crop. Plans are in place to also sequence the genomes of the microbial communities in the root zone of the Populus. Discovering the genes and gene-regulatory networks that control the shape of the tree and the partitioning of carbon throughout it, as well as the principal drivers of growth, will create exciting opportunities for enhancing its biomass potential and increasing its carbon sequestration capacity in ways that can help reduce the carbon loading of the atmosphere from fossil-fuel burning. Such advances may soon aid in realizing a consolidated bioprocessing system for the production of ethanol, as described by analyst Charles Mann in his report to the National Commission on Energy Policy. This system would move several processes associated with converting biomass to ethanol into a single (genetically modified) organism that both breaks down the cellulose and other biomass constituents into sugars and ferments those sugars to ethanol. Mann estimates that this could increase the conversion efficiency from 36% to 70%. Steven J. Smith and colleagues at Pacific Northwest National Laboratory in Richland, Wash., describe how the generation of bioethanol may evolve over a decade, leading to the displacement of 35 billion gallons of gasoline per year without impinging on land currently dedicated to food production. If Mann’s suggested improvements can be made by 2020, this technology could displace 22% of U.S. petroleum imports.
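As a rough sanity check, the figures quoted above can be combined with a few lines of arithmetic. This is an illustrative sketch only: the 36%, 70%, 35-billion-gallon, and 22% values come from the text, while the derived numbers are back-of-the-envelope results of our own, not figures from Mann’s or Smith’s reports.

```python
# Back-of-the-envelope arithmetic on the bioethanol figures quoted above.

current_eff = 0.36   # conversion efficiency, separate-step processing
cbp_eff = 0.70       # estimated efficiency with consolidated bioprocessing

# Relative improvement in conversion efficiency.
relative_gain = cbp_eff / current_eff
print(f"Efficiency improvement: {relative_gain:.2f}x")   # ~1.94x

# If 35 billion gallons/year of displaced gasoline is 22% of imports,
# the implied total (gasoline-equivalent) imports follow directly.
displaced_gal = 35e9
implied_imports_gal = displaced_gal / 0.22
print(f"Implied imports: {implied_imports_gal / 1e9:.0f} billion gal/year")
```

Roughly a doubling of conversion efficiency, against an implied import base of about 160 billion gasoline-equivalent gallons per year.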

Although there is no experience yet comparing the cost and effectiveness of bioremediation of metals and radionuclides with that of traditional methods, cost savings for bioremediation of organics are estimated to range from 30% to 95%. In situ bioremediation, which takes advantage of natural microbial populations in the subsurface, has the potential to reduce costs and increase the efficiency of groundwater treatment as compared with conventional pump-and-treat technology. More than one billion cubic meters of water and 55 million cubic meters of solid media at DOE sites in 29 states are contaminated with radionuclides. Cost estimates for restoration of these sites start at about $200 billion, so the potential savings accrued by the use of innovative technologies may amount to many billions of dollars.

Fast-growing biomass plantations will take advantage of biotechnology to increase the efficiency of biomass production without constraining the production of food and fiber. An additional benefit will be the augmentation of greenhouse-gas mitigation through increased carbon sequestration in the soils of these managed terrestrial ecosystems. Stan Wullschleger at Oak Ridge National Laboratory in Tennessee estimates that by increasing the lignin concentration of roots, and thereby their turnover time, an additional 0.35 GtC/year (gigatons of carbon per year) could be sequestered globally during a 30-year poplar rotation.
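Taken at face value, Wullschleger’s rate implies a simple cumulative total over one rotation. The multiplication below is illustrative; only the 0.35 GtC/year rate and the 30-year rotation length come from the text.

```python
# Cumulative carbon sequestered over one poplar rotation, assuming the
# quoted rate holds constant for the full 30 years (an illustrative
# simplification, not a claim from the study).

rate_gtc_per_year = 0.35   # additional carbon sequestered globally, GtC/year
rotation_years = 30        # length of one poplar rotation

total_gtc = rate_gtc_per_year * rotation_years
print(f"Cumulative over one rotation: {total_gtc:.1f} GtC")   # 10.5 GtC
```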

Although some genetic engineering may be pursued in laboratory studies, it is entirely possible that no genetically engineered organism will ultimately be released to the environment for the GTL-derived applications. Through environmental genomics we are discovering such diversity in nature’s tool-kit that it may suffice to simply “pick and choose and encourage” natural systems to realize the GTL applications in energy and environment. Advances such as the ones described are indicative of the acceleration in scientific innovation fueled by high-throughput processes. I believe that GTL can be the engine for innovation.

The Scientist Daily
April 12, 2005

Original web page at The Scientist

Australia scientists grow stem cells from nose

With the help of the Catholic Church, Australian researchers have successfully grown adult stem cells harvested from the human nose, avoiding the ethical and legal problems associated with embryonic stem cells. Australia bans creating human embryos to harvest stem cells, but scientists may use embryos left over from IVF (in-vitro fertilisation) treatment. Harvesting stem cells by other means, such as from the nose, is legal. Head researcher Alan Mackay-Sim of Griffith University said the adult stem cells taken from inside the nose could potentially be used to grow nerve, heart, liver, kidney and muscle cells. “We have got an adult stem cell which is accessible in everybody and we can grow lots of these cells and turn them into many other cell types,” Mackay-Sim told Reuters.

“Apart from neural and brain cells, they look like they can turn into blood cells, heart muscle and skeletal muscle,” he said in an interview. Scientists believe stem cell research could eventually lead to cures for a range of serious ailments, including Parkinson’s disease and spinal cord injuries. The Catholic Church, which views the use of embryonic stem cells as a form of destruction of human life, helped fund the research through a A$50,000 ($39,500) grant, which was approved by Sydney’s Catholic Archbishop George Pell. “The significance of this is manifold. This represents a significant advance and I think this will bring a great blessing for people,” Pell told Reuters on Thursday.

Australian Health Minister Tony Abbott said the new nose adult stem cells avert the ethical problems surrounding embryonic stem cell research. “It seems at least on the basis of this research that we may well be able to obtain multi-potent stem cells from adults and that we don’t need to use embryos to obtain these important cells,” Abbott told reporters.

Yahoo
April 12, 2005

Original web page at Yahoo

Marburg virus outbreak in Africa

An outbreak of rare and deadly Marburg haemorrhagic fever has claimed more than 100 lives in central Africa. Where does the virus come from, and how serious a potential killer is it likely to be?

Marburg is a rare and deadly virus, of the same family as Ebola, that triggers haemorrhagic fever. It infects the cells lining the blood vessels and a subset of the body’s immune cells, causing capillaries to leak fluid. The first signs of infection are fever and aches, making the disease difficult to distinguish from malaria or other viral illnesses. Although Marburg can cause severe bleeding, in most cases patients die because the circulatory system collapses, triggering shock and multiple organ failure.

Marburg is dangerous because it has a high mortality rate, is very contagious and has no effective treatment. The virus kills at least 25-30% of the people it infects, although its deadlier cousin Ebola kills up to 90%. Nevertheless, “that’s a scary proportion”, says Sina Bavari, who studies the virus at the US Army Medical Research Institute of Infectious Diseases in Fort Detrick, Maryland.

The current outbreak, which originated in Uíge province in northern Angola, has killed more people than any before it. The disease first appeared there in October 2004 but was identified as Marburg virus only last week. According to the World Health Organization (WHO), health authorities had reported 132 cases as of 30 March, of which 127 had been fatal. The virus has also attracted attention in recent years because it is viewed as a prime candidate for a bioterror agent: it is easy to mass-produce and is stable as a powder. “It’s really increased awareness,” says Tom Ksiazek, who works in the Special Pathogens Branch of the US Centers for Disease Control and Prevention in Atlanta, Georgia, where experts carried out recent testing on specimens from Marburg-infected patients.

The virus was first recognized in 1967, after a shipment of monkeys from Uganda triggered outbreaks of disease among laboratory workers in Marburg and Frankfurt, Germany, as well as in Belgrade in the former Yugoslavia. Thirty-seven people fell ill. Since then, there have been only a handful of recorded outbreaks, all originating in sub-Saharan Africa. Before the most recent outbreak, the worst epidemic was in the Democratic Republic of Congo between 1998 and 2000, during which around 150 people were infected.

Scientists have no clear idea about where the virus originated. It is unlikely to stem from monkeys or other primates, because they are also killed quickly by the disease. Scientists suspect that it lives in another animal, which could be anything from bats to mosquitoes to birds, and occasionally jumps into humans and other primates. Because outbreaks are so rare, researchers have had little opportunity to trace the source.

There are no vaccines or drugs that fight off Marburg fever. The virus spreads through close contact with infected people, their body fluids or the tiny water droplets from coughs and sneezes. But its spread can be curbed using standard infection-control measures, such as the use of gloves and face masks, and the isolation of those infected. Because of concerns that the virus could spread further in Angola, infectious-disease experts from the WHO and other medical organizations are working with local health authorities to isolate patients, trace their contacts and raise local awareness of the disease. A smattering of groups around the world is working on prototype drugs or vaccines to combat the virus, which requires the strictest level of laboratory safety standards. A study last year showed that disabled virus particles stripped of their genetic material and injected into guinea pigs protected them from infection.

Marburg is a dangerous virus, and an outbreak is clearly devastating for the affected communities. But in terms of death toll, it cannot rival the harm caused by persistent and widespread killers such as malaria or HIV. “It does tend to get people’s attention,” Ksiazek says, “but perhaps more than it deserves.”

Nature
April 12, 2005

Original web page at Nature

X-chromosome sequenced

The complete sequence of the human X chromosome was published in Nature this week. The work, led by Mark Ross at the Sanger Institute in Cambridgeshire, UK, shows that large segments of it match parts of normal chromosomes in birds, confirming the X chromosome’s “non-sex” origins.

Despite the fact that X is much larger than the tiny Y, it seems that both evolved from a pair of conventional chromosomes in early mammals sometime in the past 300 million years – an idea first proposed in 1967. Previously, our main clue that X and Y had a common ancestry was that they swap a few small sections during one kind of cell division, just as pairs of ordinary chromosomes swap much larger chunks.

After X and Y had taken up their role in sex determination, their paths diverged. We already know that the Y shrank and lost almost all of its genes (New Scientist, 24 August 2002). Non-sex chromosomes, too, have changed greatly over this period, acquiring or losing huge chunks. But sequence comparisons with rats, mice and dogs now show that the X chromosome seems to have changed little since the evolution of placental mammals, supporting the idea that once genes are transferred to X, they stay there. This is thought to be a result of X inactivation, the process whereby most of the genes on one X chromosome are switched off to prevent an “overdose” of X genes.

New Scientist
March 28, 2005

Original web page at New Scientist

‘Safer’ stem cells bring therapies closer

Completely fresh supplies of human embryonic stem cells have been created for the first time without having to grow them on potentially contaminating mouse “feeder” cells. Nor do they need to be nourished with serum derived from animals. The breakthrough boosts the prospects of growing safe tissues for transplant from embryonic stem cells – the unspecialised, primitive cells in the embryo from which all tissues originate. “The ability to generate new stem cell lines in completely [mouse-]cell-free and serum-free conditions solves a major problem associated with the use of stem cells in the treatment of human medical conditions,” says Bob Lanza of Advanced Cell Technology (ACT), the company in Worcester, Massachusetts, US, which pioneered the new production system.

Recent experiments showed that all lines of human embryonic stem cells grown on a scaffolding of mouse feeder cells may be contaminated with animal substances and therefore unsafe for treatment. This includes the cell lines approved by President George W. Bush in 2001, to which federally funded US scientists are restricted. Now, the option of dispensing with mouse feeder cells and animal serum could at last generate stem cells safe enough for therapy.

However, although the fresh stem cells were not grown on feeder cells, there is still a risk of contamination because they were raised on coatings of animal origin. But ACT says that making the same coatings entirely from the corresponding human ingredients is now a formality. “We’re now working on lines which will be completely human,” says Lanza, ACT’s president of scientific development. “There’s nothing in the animal layer that can’t be replaced with a human equivalent.”

He says that although several groups of researchers now grow their human embryonic stem cells on human instead of mouse feeder cells, there is still the possibility that harmful viruses can lurk in the human feeder cells and pass into the stem cells. By dispensing completely with feeder cells – whether human or animal – ACT’s system gets round this once and for all, he says.

First, he and his colleagues created the coating on which the stem cells would grow. To do this, they grew mouse feeder cells, then washed them away to leave an “extracellular matrix” – a natural coating of material that had accumulated around the growing cells. The matrix contains substances such as growth factors and nutrients that support the growth of any living cells deposited on the coating. Once sterilised of any living matter, the coating was ready to support the growth of stem cells.

To obtain human stem cells, Lanza first procured donated human embryos left over from infertility treatments. By “feeding” an embryo with specific ingredients usually found in animal serum, Lanza’s team grew it into a blastocyst, a ball of cells containing a precious “inner cell mass” of stem cells. After physically extracting the stem cells, the team transferred them into a dish containing the extracellular matrix, where they started multiplying indefinitely.

“This is a terrific breakthrough in the development of defined cultures for stem cell isolation, which is of critical importance to the safety and efficacy of embryo stem cell technology,” says Paul De Sousa, a stem cell researcher at the Roslin Institute in Edinburgh, UK.

Journal reference: The Lancet

New Scientist
March 29, 2005

Original web page at New Scientist

The placebo effect

Don’t try this at home. Several times a day, for several days, you induce pain in someone. You control the pain with morphine until the final day of the experiment, when you replace the morphine with saline solution. Guess what? The saline takes the pain away.

This is the placebo effect: somehow, sometimes, a whole lot of nothing can be very powerful. Except it’s not quite nothing. When Fabrizio Benedetti of the University of Turin in Italy carried out the above experiment, he introduced a final twist: he added naloxone, a drug that blocks the effects of morphine, to the saline. The shocking result? The pain-relieving power of the saline solution disappeared.

So what is going on? Doctors have known about the placebo effect for decades, and the naloxone result seems to show that the placebo effect is somehow biochemical. But apart from that, we simply don’t know.

Benedetti has since shown that a saline placebo can also reduce tremors and muscle stiffness in people with Parkinson’s disease (Nature Neuroscience, vol 7, p 587). He and his team measured the activity of neurons in the patients’ brains as they administered the saline. They found that individual neurons in the subthalamic nucleus (a common target for surgical attempts to relieve Parkinson’s symptoms) began to fire less often when the saline was given, and with fewer “bursts” of firing – another feature associated with Parkinson’s. The neuron activity decreased at the same time as the symptoms improved: the saline was definitely doing something.

We have a lot to learn about what is happening here, Benedetti says, but one thing is clear: the mind can affect the body’s biochemistry. “The relationship between expectation and therapeutic outcome is a wonderful model to understand mind-body interaction,” he says. Researchers now need to identify when and where placebo works. There may be diseases in which it has no effect. There may be a common mechanism in different illnesses. As yet, we just don’t know.

Source: 13 things that do not make sense

New Scientist
March 29, 2005

Original web page at New Scientist

Embryo cells not like peas in a pod

In a mammalian embryo, all cells are equal – or so biologists believed. But a series of studies suggest that the fate of individual cells might be determined much sooner after conception than previously thought. In some non-mammals, such as fruit flies, there are different concentrations of certain molecules in different parts of the egg. When the egg cell divides, the “daughter” cells use this as a kind of grid reference to work out where in the egg cell they have come from and what they should become. This pattern is inflexible: split an insect egg by pinching it in the middle and you don’t get twins; you get a front end and a back end of the insect.

Mammalian embryos appear to be much more flexible. If you take a mouse embryo at the two-cell stage and destroy one of the cells, you still get a complete mouse. This originally led to the idea that cells in early mammal embryos are totipotent – able to form any cell type. But recent studies have raised doubts. And now, after a painstaking set of experiments, Magdalena Zernicka-Goetz of the Wellcome Trust/Cancer Research UK Gurdon Institute in Cambridge and her team have shown that cells in four-cell mouse embryos are different from one another (Development vol 132, p 479).

The team took mouse embryos at the two-cell stage and labelled one cell with dye so they could track its fate. In most embryos, they found, one of the cells divided down the longitudinal “meridian” line of the egg, while the other divided across the equator. If the longitudinal division happened first, cells from that division went on to form the body of the mouse, while those from the later equatorial division made the placenta. Next, the team separated the cells of four-cell embryos and made new embryos using cells either from the longitudinal division alone or from the equatorial division alone. After implantation, 85 per cent of the longitudinal embryos developed to term; only 30 per cent of equatorial embryos did.

Although this does not prove that mouse eggs contain molecular grid coordinates like those of flies, it does suggest cells in early mammalian embryos are not totipotent after all. “There are differences in their developmental properties at the four-cell stage,” Zernicka-Goetz concludes. It may be that mammals have something in their eggs that influences the fate of early embryo cells, but those cells are more capable of adapting if things go wrong.

Richard Gardner, an expert in early mouse development at the University of Oxford, UK, agrees that the studies show a difference between the two cell types. But he argues that manipulating the embryos in this way makes it hard to be sure what it means for normal development. If eggs do influence the fate of early embryo cells, this would be another reason for caution in the use of IVF, Gardner says. For example, it raises questions about the removal of cells from embryos for preimplantation genetic testing. But Zernicka-Goetz thinks the flexibility of the longitudinal cells means you can remove one or two cells from an eight-cell embryo without causing any problems.

New Scientist
March 29, 2005

Original web page at New Scientist

Amplification of acetylcholine-binding catenanes from dynamic combinatorial libraries

Directed chemical synthesis can produce a vast range of molecular structures, but the intended product must be known at the outset. In contrast, evolution in nature can lead to efficient receptors and catalysts whose structures defy prediction. To access such unpredictable structures, researchers have prepared dynamic combinatorial libraries, in which reversibly binding building blocks assemble around a receptor target. They selected for an acetylcholine receptor by adding the neurotransmitter to chloroform/dimethylsulfoxide solutions of dipeptide hydrazones [proline-phenylalanine or proline-(cyclohexyl)alanine], which combine reversibly through hydrazone linkages. At thermodynamic equilibrium, the dominant receptor structure was an elaborate [2]-catenane consisting of two interlocked macrocyclic trimers. This complex receptor, with a 100-nanomolar affinity for acetylcholine, could be isolated on a preparative scale in 65% yield.

Science
March 29, 2005

Original web page at Science

Looking at variation in numbers

The massive efforts to systematically find and catalog single nucleotide polymorphisms (SNPs) bear witness to the conviction that small genomic changes may provide clues to the origins of such things as heart problems, obesity, and pharmacologic responses.

But another type of variation, largely overlooked by the genetics community, might ultimately make equally important contributions to health. Large, submicroscopic rearrangements make up about 5%-10% of the human genome. Many of these contain duplications that vary in the number of times and the ways they are repeated: tandemly, at distal parts of the same chromosome, or even on other chromosomes. At least three papers last summer used new technologies to discover the extent to which these polymorphic rearrangements are duplications of genes found elsewhere in the genome. And research published in January documented the first instance of a resulting gene-dosage effect on disease susceptibility: the effect of copy number of the CCL3L1 chemokine gene on susceptibility to HIV infection and progression to AIDS.

“In a sense, we’re just seeing the tip of the iceberg here in terms of the potential importance of copy-number variation to human disease, to human health, and to evolutionary change,” says James Sikela, professor in human medical genetics at the University of Colorado Health Sciences Center in Aurora. “I think there’s a newfound appreciation for how important these copy numbers can be.”

HIV-1 uses the CCR5 chemokine receptor like a keyhole to unlock and enter a cell. “There’s a battle going on for this keyhole,” says Sunil Ahuja, director of the Veterans Administration Center for AIDS and HIV-1 Infection in San Antonio, and one of the recent study’s corresponding authors. The most potent endogenous key is CCL3L1, a chemokine with strong anti-HIV-1 activity, whose gene is found anywhere from zero to at least 14 times in a normal diploid genome. “You can imagine that if there are individuals who produce high amounts of the chemokines, there might be a possibility that they would gum up this keyhole and prevent the entry of the virus,” says Ahuja.

Robert Nibbs and his group at Beatson Institute for Cancer Research in Glasgow, Scotland, discovered the CCL3L1 copy-number polymorphism (CNP) and the copy-number correlation with production of the chemokine. “The prediction would be,” says Nibbs, “that those individuals with high copy number would be protected from HIV infection, and also be protected from progression once they have become infected.” He adds, “What Professor Ahuja has done is to prove that that hypothesis is correct.”

Ahuja teamed up with Matthew Dolan, who oversees the US military’s Tri-Service AIDS Clinical Consortium (TACC) cohort. A unique feature of that cohort, comprising 1,300 HIV-positive individuals who were in the US Air Force, is that the participants “all had uniform access to health care and were a racially balanced population,” says Dolan. “You can look at questions that are constrained by ethnicity, and you can also eliminate the factor of adequate access to health care,” he explains, pointing out that patients with HIV often do not join a prevalent cohort until years after they become infected. As predicted, Ahuja and Dolan found that individuals who had a higher number of CCL3L1 copies were less likely to become infected with HIV, or to progress to AIDS once they were infected. But there was a twist. Individuals of African origin had an average of about six copies per genome, whereas those of European ancestry had an average of two per genome. They found that a person’s copy number relative to the population average matters more than absolute gene dosage. The contribution of CCL3L1 gene dosage could be teased apart from that of the noncopy-dependent variant of CCR5, already known to confer some resistance to HIV infection and HIV progression.

Such an analysis could not have been done in the United Kingdom, says Nibbs. “We don’t have the access to the kind of cohorts that Professor Ahuja and his collaborators do.”
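The study's "copy number relative to the population average" point can be sketched numerically. The following toy illustration uses the approximate group averages quoted above; the helper function and the four-copy example are invented, not the study's actual model:

```python
# Toy illustration of population-relative CCL3L1 dosage. The group averages
# are the approximate figures quoted in the article; everything else here
# is invented for the example.
POPULATION_AVERAGE = {"African": 6, "European": 2}

def relative_dosage(copies: int, ancestry: str) -> float:
    """Copy number expressed relative to the group average. Values below 1
    indicate below-average dosage, which the study associated with higher
    susceptibility to HIV infection and progression."""
    return copies / POPULATION_AVERAGE[ancestry]

# Four copies is double the average for a European-ancestry genome...
print(relative_dosage(4, "European"))           # 2.0
# ...but below average for an African-ancestry genome.
print(round(relative_dosage(4, "African"), 2))  # 0.67
```

The same absolute gene dosage can thus sit on opposite sides of the population average, which is why absolute copy number alone was a poor predictor across ancestries.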

Indeed, in a paper last year that examined a different cohort, the researchers found no correlation between the absence of CCL3L1 and susceptibility to HIV infection or the rate of its progression. Graeme Stewart, corresponding author from Westmead Millennium Institute at Westmead Hospital in Sydney, writes in an e-mail: “Our study doesn’t contradict their results, as we only examined the proportion of people with HIV who fail to express any CCL3L1.” He adds, “Since such people are few we did not have the power to detect a partial effect.”

The TACC study may be the first to correlate CNPs with disease susceptibility, says Evan Eichler, associate professor of genome science at the University of Washington, Seattle. He calls it “a beautiful piece of work … a Christmas present,” a sentiment shared by many. But it was not the first time that researchers have seen a gene-dosage effect. Genes for the Rhesus factor blood group, cytochrome P450, glutathione S-transferase, and drug susceptibility, for example, “are all known to be copy-number variants in the population,” Eichler says.

Baylor College of Medicine pediatrician and genetics professor James Lupski argued in 1991 that a common inherited neuropathy, Charcot-Marie-Tooth disease, was due to the duplication of a large segment of chromosome 17. The 1.5-megabase segment comprises 21 different genes, and only the one for peripheral myelin protein 22 is gene-dosage sensitive. “At the time, there was a lot of resistance to the idea that you could get clinical phenotype related to just gene dosage, not having an aberrant protein or abnormal gene,” Lupski says. Yet it was already well known that Down syndrome, caused by duplication of an entire chromosome 21, is the most common genetic disease, affecting one in 600 live births. “We were so fixated on mutations,” Lupski says. “Dosage can obviously have phenotypic consequences.”

The field of human genetics had focused on characterizing single Mendelian traits found in very small portions of the population, Eichler says. But now geneticists are shifting that focus to more complex diseases: those with multiple, smaller contributions from several factors. With the TACC study, he notes, it became apparent that the “ability to become infected with HIV, or develop AIDS, [is] a complex interplay between the environment, single base-pair mutations, as well as copy-number variation.”

Eichler has been working on mapping genetic duplications, though he won’t discuss details. “We can safely say that we’ve mapped all the sites of duplication in at least three or four individuals.” To determine which of these sites have variants across the population, or the extent of that variation, requires a far larger and broader sampling. Those are the more interesting duplications, medically speaking: “If everyone has the same copy number, even though it’s duplicated, it probably doesn’t mean that there’s going to be any association there with disease,” Eichler notes. Determining the extent of variable duplication is another matter. At least three studies have used microarray screening, each in a different way, to screen the human genome for large-scale variation.

A group led by Michael Wigler of Cold Spring Harbor Laboratories in New York used representational oligonucleotide microarray analysis (ROMA) to measure the relative concentration of DNA segments in the population. About 85,000 oligonucleotides were printed onto a glass microarray and then hybridized with differentially labeled genomic digests from 20 different individuals. The experiment found 76 unique CNPs of about 100 kilobases and greater, with 70 genes among them, “including genes involved in neurological function, regulation of cell growth, regulation of metabolism, and several genes known to be associated with disease.”
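The logic of such ratio-based copy-number calls can be sketched in a few lines. This is a deliberately simplified toy, not Wigler's actual ROMA pipeline; the probe names, intensities, and cutoffs are invented for illustration:

```python
import math

# Toy sketch of array-based copy-number calling: compare hybridization
# intensities of a test genome against a reference genome, probe by probe.
# A log2 ratio near 0 means equal copy number; about +0.58 suggests a gain
# (3 copies vs. 2), about -1.0 suggests a loss (1 copy vs. 2).
def call_cnps(intensities, gain_cutoff=0.3, loss_cutoff=-0.3):
    """Return {probe: 'gain' | 'loss'} for probes whose log2(test/ref)
    ratio exceeds the cutoffs; probes near 0 are left uncalled."""
    calls = {}
    for probe, (test, ref) in intensities.items():
        log_ratio = math.log2(test / ref)
        if log_ratio >= gain_cutoff:
            calls[probe] = "gain"
        elif log_ratio <= loss_cutoff:
            calls[probe] = "loss"
    return calls

probes = {
    "probe_A": (1500.0, 1480.0),  # ~equal copy number, no call
    "probe_B": (2300.0, 1500.0),  # elevated in test genome
    "probe_C": (700.0, 1450.0),   # reduced in test genome
}
print(call_cnps(probes))  # {'probe_B': 'gain', 'probe_C': 'loss'}
```

Real pipelines additionally normalize intensities, average replicate probes, and segment along the chromosome before calling a variant, but the underlying ratio comparison is the same.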

In another study, Stephen Scherer at the Hospital for Sick Children in Toronto and Charles Lee at Harvard University led a group that used array-based comparative genomic hybridization (aCGH) to look for CNPs. Arrayed BAC-derived genomic clones were hybridized with the labeled genome digests of 55 individuals. They found 255 variable loci, half of which overlap with genes. Of those 255 variable regions, only 11 were detected by both the Wigler group and the Lee/Scherer group.

[Figure: Great ape and human lineage-specific gene copy-number variations revealed by genome-wide cDNA array comparative genomic hybridization. Each horizontal row represents aCGH data for one cDNA clone on the microarray; each vertical column represents data from one experiment (H = human, B = bonobo, C = chimpanzee, G = gorilla, O = orangutan). Regions shown contain lineage-specific genes (vertical black lines) and adjacent flanking genes ordered by chromosome map position using the UCSC Golden Path genome assembly (November 2002 sequence freeze). Arrows denote the hominoid lineage in which the copy-number change is unique.]

Methodological differences, such as different density and scope of the arrays used, and perhaps the use of different builds of the reference genome, probably account for much of the discrepancy between the studies, notes Nigel Carter in a commentary to the Lee/Scherer study. Carter writes: “It is common practice in selecting clones or probes for [aCGH] to avoid regions that hybridize to more than one genomic location or show variation,” concluding that “many more [large-scale copy-number variations] probably remain to be discovered.”

Also, only about half of the variable regions found in either study were detected in more than a single individual. This leaves open the question of whether these duplications arose uniquely in the individuals screened, or whether the sample size was simply too small to detect a more common polymorphism. At about the time those papers appeared last summer, another group, from Stanford University and the University of Colorado Health Science Center, published work that used cDNA arrays to undertake what they call the first genome-wide gene-based survey of gene duplication across hominoid species. cDNAs representing nearly 30,000 human genes were spotted onto glass slides and then hybridized with human and either gorilla, chimpanzee, bonobo, or orangutan genomic digests. In all, the researchers found more than 1,000 genes that showed copy-number changes unique to one or more of the human and great ape lineages. Of these, 134 showed increases in copy number specific to the human lineage, including a number of genes thought to be involved in the structure and function of the brain.

One advantage of using cDNA instead of genomic oligonucleotides or BAC clones is that “we’re actually getting gene-specific information when we use these chips,” says co-corresponding author Sikela. “We’re really excited by some of the genes we’ve found that are either increased or decreased specifically in human, where you could relate it to cognition, language, those kind of things,” he adds. “Many different traits distinguish these organisms, and it’s plausible that copy-number change could be a major reason for that.”

Genomes are dynamic, fluctuating entities, which “evolved by duplicating and by inducing variation when they duplicate,” says Wigler. Focusing only on SNPs and point mutations, he says, is not enough. “There’s a big picture here that people are missing.” In some cases though, the extent of variation has long been under scrutiny. Barbara Trask, director of the human biology division at the Fred Hutchinson Cancer Research Center in Seattle, invokes the collection of human olfactory receptor genes and pseudogenes. Exactly how many, and which, of the 800 or so related sequences (grouped in 17 clades) individuals have varies throughout the population and affects the ability to detect and differentiate smells.

When genes duplicate, the selective pressure to keep them from mutating may no longer be present. Over time, copies may be rendered nonfunctional, or they may take on a new function. “There are going to be cases where additional copies themselves might have a phenotypic effect, and therefore might confer a selective advantage or disadvantage,” says Trask. Now, the technology is available to interrogate the genome for CNPs of such duplications, says Wigler. “What you’re going to see is that … some of the more subtle differences between humans – disease susceptibility genes, and the rates at which people age – are going to be caused by [gene-]dosage effects.”

The Scientist Daily
March 29, 2005

Original web page at The Scientist


From SARS to Avian flu: Vaccines on the scene

When SARS struck more than 8,000 people and killed nearly 800 in the spring of 2003, the world clamored to know when a vaccine against the deadly virus would come to the rescue. Vaccine manufacturers and health institutes in Asia, the United States, and Europe rose to the challenge and began to work on vaccine candidates.

Two years later, there are no SARS vaccines on the shelves. None have come to market, nor are they likely to any time soon. Public attention has moved on, and the H5N1 avian flu virus emerging in Vietnam, Thailand, and Cambodia has eclipsed SARS. The threat of SARS has been overshadowed by the possibility that the bird virus, which rarely jumps from human to human, could reassort with a human strain and become highly transmissible, unleashing a worldwide pandemic to rival the deadly Spanish influenza of 1918.

The 2003 SARS outbreak has not repeated, but the threat hasn’t disappeared. Last year sporadic cases of SARS were traced to laboratories working on the virus in Beijing, Singapore, and Taipei, and a limited outbreak occurred in the Guangdong province of China, the source of the 2003 epidemic. An effective vaccine is still needed, should the disease reemerge. Several factors, including the lack of a clear market, have slowed research, though work is ongoing.

“In the early days of the SARS outbreak everyone felt there was going to be a big market for a vaccine,” says Gary Nabel, director of the Vaccine Research Center (VRC) at the National Institutes of Health in Bethesda, Md. “When it was relatively well controlled, there was a reticence from companies to dive in. It’s not clear how one would license it, and we know from animal coronaviruses that vaccines can be ineffective and even exacerbate the coronavirus infections in cats, for example.”

Vaccine producers are keener to tackle H5N1 than SARS. Whereas SARS was a new and unknown disease, influenza is not. Vaccine constituent strains change every year, so regulatory procedures are already established for timely marketing. However, as with SARS, there is a catch-22: A vaccine cannot be fully developed unless an outbreak occurs. Nor can anyone predict in advance whether a reassortment will produce a pandemic H5 flu strain.

Given that market forces alone are unlikely to spur the production of a mock vaccine, the World Health Organization (WHO) urges governments to make it easier to develop potential pandemic-preventing vaccines by offering tax incentives, financing clinical trials, and waiving fees associated with licensing. But is it enough? “At the end of the day we want vaccine manufacturers with the capability to produce vaccines that can be commercialized to take on challenges such as SARS and H5N1 flu,” says Linda Lambert, acting chief of the Influenza, SARS, and Related Viral Respiratory Diseases Section of the National Institute of Allergy and Infectious Diseases (NIAID). “Both H5N1 and SARS have great potential to be needed in large quantities, and the development of vaccines is quite appropriate given the unprecedented outbreaks in Asia.”

Preparing vaccines to combat mutable, global public health threats is, of course, the bread and butter of influenza vaccine researchers and manufacturers. They are on much more familiar ground when it comes to developing a vaccine for H5N1 avian influenza than for SARS.
Moreover, WHO has a well-established mechanism for the sharing of data about flu strains. “Every year we have to make a new flu vaccine, and the nature of doing that means there is a very integrated process of open exchanges of information between manufacturers, the WHO, and governments. So, for example, when the NIH has clinical data we will be sharing it,” says Lambert.

In May 2004 Chiron and Sanofi Pasteur, both suppliers of the annual influenza vaccine, were contracted by NIAID to prepare 16,000 doses of an investigational H5N1 avian influenza vaccine. To make the vaccine, virus was taken from a patient who died in February 2004 in Vietnam and altered with reverse genetics to reduce pathogenicity. In March, Sanofi Pasteur had 8,000 doses ready to be shipped to the NIH to begin clinical trials. Chiron’s half of the vaccine supply has been delayed by problems at the Liverpool facility used to produce its commercial flu vaccine. “We are manufacturing the clinical supply of H5N1 in Liverpool, UK, in the same location that makes our commercial vaccine, Fluvirin, but in a different part of the facility,” says a Chiron spokesperson. “Production is now underway.” The US and France have each contracted with Sanofi Pasteur to produce 2 million doses of the prototype vaccine.

“For flu every year we change strains used in the vaccine, and it would be exactly the same for pandemic flu,” explains Marie-Jose Quentin-Millet, vice president of research and development at Sanofi Pasteur in France. “The only difference is that when we vaccinate with annual flu, people have one shot because they already have some background immunity. Here, we know the population is totally naïve, so it’s difficult to raise a protective immune response.” Sanofi Pasteur is the world’s largest vaccine supplier, but it is only one of several vaccine manufacturers working on an H5N1 avian influenza vaccine.

In Asia, Beijing-based Sinovac Biotech has signed a deal with the Chinese Center for Disease Control and Prevention to work on a vaccine. The company anticipates completing preclinical trials by the end of May 2005. The Japanese government has increased funding for influenza research, and pandemic flu vaccines will be fast-tracked through the licensing process. Clinical trials by Japan’s National Institute of Infectious Diseases in collaboration with the country’s four influenza vaccine manufacturers are expected to start later this year, according to a WHO report.

ID Biomedical Corporation in Vancouver announced in January that it had begun development of a mock vaccine against H5N1 using the genetically modified rH5N1 reference strain from the UK’s National Institute for Biological Standards and Control. “There is a growing consensus among experts supported by the WHO that the development and testing of a mock pandemic vaccine is a critical component of pandemic preparedness, because it will allow manufacturers to shorten production times, thereby providing the general public with a vaccine more quickly,” says Anthony Holler, CEO of ID Biomedical.

The only case of human-to-human transmission of the H5N1 virus occurred in Thailand in 2004, but fear is growing that it will become a highly infectious pandemic strain. Since January 2004, about 50 people have died, and the virus has an estimated mortality of 72%. A study by the US Centers for Disease Control and Prevention estimates that 2 million to 7 million people could die if such a pandemic occurs. Others have higher estimates, but whether 7 million or 100 million people are at risk, the precautions are the same.

The picture being painted for SARS isn’t nearly as scary, which may explain why many vaccine candidates are still in the preclinical stage. NIAID contracted with Sanofi Pasteur and Austrian manufacturer Baxter Healthcare to produce inactivated virus vaccine candidates, which are slated for Phase I clinical trials in 2005. Sanofi Pasteur, which has completed its work for the NIAID, appears to have closed the door on SARS research for the time being. “We’ve developed the technology, we think we know how to make a SARS vaccine, but there are no plans for us to do anything more on SARS at this stage,” says Quentin-Millet.

A number of other vaccines are at a similar stage of development. Researchers at the Hong Kong University-Pasteur Research Center are working on a recombinant protein-based vaccine candidate, and Connecticut-based Protein Sciences is doing the same under contract with NIAID. The VRC is working on a DNA vaccine, and clinical trials began in December. Chiron has also done some early work on an inactivated virus, and a Canadian network of 40 scientists in the SARS Accelerated Vaccine Initiative has developed four vaccine candidates. So far, two have been tested in animal trials.

Only one company, Sinovac, has taken a SARS vaccine to the clinical trial stage. A Phase I trial in which an inactivated virus vaccine was given to 24 people will be completed by the end of March 2005. Data from the trial is expected to be compiled by May. Sinovac’s research, conducted in collaboration with the Chinese Academy of Medical Sciences, has been funded with $2.2 million in grants from the Chinese government. The State Drug Administration approval process was fast-tracked in order to get work on a vaccine started as soon as possible.

Whether Sinovac’s work will progress to Phase II trials depends on approval from the Chinese government. Whether it can conduct Phase III trials may depend on nature: Large-scale, conventional Phase III trials can begin only if another outbreak creates a large pool of infected people.
It is theoretically possible to make a vaccine available without going through large-scale efficacy trials if it can be tested successfully in animal models, but none has so far been found. “As to whether or not there will be a market for it, there is still a question mark, although if we can come up with a vaccine the government will probably stockpile it,” says Yang Guang, spokesperson for Sinovac. “The virus is still alive. It won’t just disappear for no reason, so for us the job is to do clinical trials. It’s not to make money, it’s to prove that we have the capability to do world-class research,” she says.

Sinovac’s rapid move to clinical trials has raised eyebrows in the vaccine research community, not least because none of its data have yet been made available through peer-reviewed scientific publications. Guang cites the difficulty of translating the research into English as one obstacle to publication, but she says Sinovac aims to publish its results in internationally recognized journals this year. Sharing the data collected from its SARS vaccine research will go a long way toward allaying skepticism over the company’s research and fears that it may inadvertently develop a vaccine that exacerbates rather than protects against SARS, says one observer. “It isn’t an international competition; there ought to be international collaboration,” he says. “If they keep the data to themselves, it won’t help them or the world.”

Scientists working on SARS vaccines argue that even taking the research only part of the way is valuable, not least because SARS is one of several examples of coronaviruses becoming more deadly to humans. Whereas coronaviruses were previously associated only with the common cold, recently they have been linked to pneumonia cases and Kawasaki disease, a childhood ailment characterized by high fever, sloughing skin, and vascular complications. “Our decision to pursue a SARS vaccine was part of a larger effort to develop antivirals and therapies. When we decided to do so in 2003 we had no sense of the natural progression of the epidemic. But there are still places where it can break out, and an effective vaccine is a useful insurance policy,” Nabel says.

Another outbreak of the disease would change the perspective on the business case for a SARS vaccine, potentially making it a must-have for travelers to the Asian region, healthcare professionals, and those in close contact with animals. “The SARS virus is probably living in such a harmony within an animal reservoir that we still don’t understand very well,” says Ralf Altmeyer, scientific director of the HKU-Pasteur Research Centre in Hong Kong. “The SARS virus probably became a killer when a mutant accidentally jumped to humans.”

Altmeyer notes that SARS could be a potential bioterrorism threat, as the virus, like others, could be synthesized from scratch. The NIAID acknowledges the bioterrorism aspect of SARS. “We did not develop vaccine candidates because of the bioterrorism threat per se, but SARS was added to the NIH list of pathogens for biodefense research,” says Lambert. The stockpiling of a SARS vaccine would be useless if the SARS virus mutates and an epidemic strain differs from the vaccine strain. Data recently published by Nabel’s team at the VRC suggest that the virus samples from early and late 2003 were different. The latter probably represented a fresh jump from animals to humans and was not sensitive to neutralizing antibodies. This was not a cause for concern for Sinovac. “We collected the virus antigen from different locations in China, both north and south, and found the SARS virus very stable and therefore very good for the development of a vaccine,” says Guang. The reality is, no one knows how much SARS may mutate in the future. “We have to observe and test whether candidate vaccines developed are effective against new strains,” says Altmeyer.

The chance always exists that viruses such as SARS and H5N1 will mutate and make newly developed vaccines obsolete, but that is one of the challenges of vaccine science. Moreover, popular demand for vaccines will wax and wane as viruses hit the headlines. For scientists in the field, the research goes on regardless of popular sentiment and which virus seems to be an imminent threat. “It’s shortsighted to say there’s no market for a SARS vaccine. I’m not so sure we will be safe from SARS,” says Altmeyer. “You have to see a vaccine from a longer perspective. If we acquire enough knowledge of predeveloped vaccine candidates, then the Phase I or even Phase II data [are] there. Developed countries cannot just manage a crisis. The long-term investment is to prevent a crisis.”

The Scientist
March 29, 2005

Original web page at The Scientist


The effects of cholesterol lowering with simvastatin

There have been concerns that low blood cholesterol concentrations may cause non-vascular mortality and morbidity. Randomisation of large numbers of people to receive a large, and prolonged, reduction in cholesterol concentrations provides an opportunity to address such concerns reliably.

20,536 UK adults (aged 40-80 years) with vascular disease or diabetes were randomly allocated to receive 40 mg simvastatin daily or matching placebo. Prespecified safety analyses were of cause-specific mortality, and of total and site-specific cancer incidence. Comparisons between all simvastatin-allocated versus all placebo-allocated participants (ie, “intention-to-treat”) involved an average difference in blood total cholesterol concentration of 1.2 mmol/L (46 mg/dL) during the scheduled 5-year treatment period.

There was a highly significant 17% (95% CI 9-25) proportional reduction in vascular deaths, along with a non-significant reduction in all non-vascular deaths, which translated into a significant reduction in all-cause mortality (p=0.0003). The proportional reduction in the vascular mortality rate was about one-sixth in each subcategory of participant studied, including: men and women; under and over 70 years at entry; and total cholesterol below 5.0 mmol/L or LDL cholesterol below 3.0 mmol/L. No significant excess of non-vascular mortality was observed in any subcategory of participant (including the elderly and those with pretreatment total cholesterol below 5.0 mmol/L), and there was no significant excess in any particular cause of non-vascular mortality. Cancer incidence rates were similar in the two groups, both overall and in particular subcategories of participant, as well as at particular primary sites. There was no suggestion that any adverse trends in non-vascular mortality or morbidity were beginning to emerge with more prolonged treatment.
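The figures quoted above can be checked with simple arithmetic. The sketch below uses the standard conversion factor of roughly 38.67 mg/dL per mmol/L for total cholesterol; the 8% placebo mortality rate is an invented illustrative number, not a figure from the trial:

```python
# Back-of-envelope check of the quoted figures (a sketch, not the trial's
# actual statistical analysis).
MGDL_PER_MMOLL = 38.67  # standard conversion factor for total cholesterol

# The reported cholesterol difference of 1.2 mmol/L in mg/dL:
diff_mmol = 1.2
diff_mgdl = round(diff_mmol * MGDL_PER_MMOLL)
print(diff_mgdl)  # 46, matching the value quoted above

# A 17% proportional reduction means the treated group's vascular mortality
# rate is 0.83 times the placebo rate. With an illustrative (invented)
# placebo rate of 8% over 5 years:
placebo_rate = 0.08
treated_rate = placebo_rate * (1 - 0.17)
print(round(treated_rate, 4))  # 0.0664
```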

These findings, which are based on large numbers of deaths and non-fatal cancers, provide considerable reassurance that lowering total cholesterol concentrations by more than 1 mmol/L for an average of 5 years does not produce adverse effects on non-vascular mortality or cancer incidence. Moreover, among the many different types of high-risk individual studied, simvastatin 40 mg daily consistently produced substantial reductions in vascular (and, hence, all-cause) mortality, as well as in the rates of non-fatal heart attacks, strokes and revascularisation procedures.

BioMed Central
March 29, 2005

Original web page at BioMed Central


President Bush nominates veterinarian to head FDA

President George W. Bush announced Feb. 14 that he is nominating Dr. Lester M. Crawford Jr. to be commissioner of the Food and Drug Administration. Dr. Crawford has been acting commissioner of the FDA since March 2004 after Mark B. McClellan, MD, stepped down to head the federal Centers for Medicare and Medicaid Services.

In a statement, Secretary of Health and Human Services Mike Leavitt called Dr. Crawford an “outstanding” choice for FDA commissioner. “Dr. Crawford has dedicated his career to advancing the nation’s public health and will lead the way as we enter a new era of individualized medicine and rapidly developing science,” Leavitt said. “With Dr. Crawford’s leadership, FDA will provide the world’s safest drugs and empower citizens with the tools they need to make informed choices about their health.”

The AVMA Board of Governors, which comprises the Executive Board chair, president, and president-elect, sent a letter to Michael B. Enzi of Wyoming, chairman of the Senate Committee on Health, Education, Labor, and Pensions supporting Dr. Crawford’s nomination. “Dr. Crawford has an exemplary record of public service and leadership to our country in public health, food safety, and regulatory medicine, and will bring to the commissioner’s office invaluable experience and accomplishments in government, academia, and industry,” the letter stated. At press time, no date had been set for Dr. Crawford’s confirmation hearing.

Prior to joining the FDA as deputy commissioner in February 2002, Dr. Crawford led the Center for Food and Nutrition Policy at Virginia Polytechnic Institute and State University. He previously oversaw the Department of Agriculture’s Food Safety and Inspection Service, as well as the FDA Center for Veterinary Medicine.

Dr. Crawford is a member of the National Academy of Sciences’ Institute of Medicine and a fellow of the United Kingdom’s Royal Society of Medicine. In addition, Dr. Crawford has served as an adviser to the World Health Organization for nearly 20 years.

Dr. Crawford earned his DVM degree from Auburn University and his PhD degree in pharmacology from the University of Georgia. He is a former executive director of the Association of American Veterinary Medical Colleges, executive vice president of the National Food Processors Association, and chairman of the University of Georgia’s Department of Physiology-Pharmacology.

JAVMA
March 29, 2005

Original web page at JAVMA


Soccer link to motor neuron disease

A rigorous study in Italy has confirmed claims that professional soccer players have a higher than normal risk of developing a type of motor neuron disease, also known as amyotrophic lateral sclerosis. The reason remains a mystery. ALS involves the death of motor neurons, the nerve cells responsible for voluntary movement, and eventually leads to paralysis and death. Adriano Chiò’s team at the University of Turin surveyed the medical records of 7,000 professional footballers who played in Italy’s first or second division between 1970 and 2001.

Based on the normal incidence of the disease and the players’ ages, the researchers calculated that there should have been 0.8 cases of ALS in this group. Instead, there were five. The study was prompted by what the Italian press dubbed “the motor neuron mystery” – the discovery a few years ago of 33 cases of ALS during an investigation of illicit drug use among 24,000 pro and semi-pro players in Italy. Dubious about the methodology of that initial investigation, Chiò’s group applied stricter diagnostic criteria to their data, such as only including players born in Italy. “I think the researchers have been conservative,” says Ammar Al-Chalabi of the Institute of Psychiatry in London, who has written a commentary on the study in Brain.
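The gap between 0.8 expected and 5 observed cases can be quantified with a standardized incidence ratio and an exact Poisson tail probability. This is a sketch of the kind of calculation involved; the published study's actual statistical method may differ:

```python
import math

# Standardized incidence ratio (SIR) and exact one-sided Poisson p-value
# for the counts quoted above: 0.8 cases expected, 5 observed.
expected, observed = 0.8, 5

sir = observed / expected
print(round(sir, 2))  # 6.25 — roughly six times the expected incidence

# Under the null hypothesis the case count X follows Poisson(0.8).
# P(X >= 5) = 1 - sum_{k=0}^{4} e^(-mu) * mu^k / k!
p_below = sum(math.exp(-expected) * expected**k / math.factorial(k)
              for k in range(observed))
p_value = 1 - p_below
print(round(p_value, 5))  # 0.00141
```

Observing five cases when 0.8 were expected would happen by chance in only about 0.14% of comparable cohorts, which is why the excess is taken seriously despite the small absolute numbers.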

The researchers found that the mean age of onset was just 41. “They develop it about 20 years earlier than usual,” says Chiò. He also found that the longer people play football, the greater the risk. In the US, ALS is known as Lou Gehrig’s disease after the baseball legend diagnosed with it in 1939. Clusters of cases have been reported in American football, but until now no large-scale studies have found any clear link between sport and ALS. The cause of ALS remains unknown, as does the reason for the higher rate among footballers. Genes undoubtedly contribute, but the disease could be triggered by head trauma, performance-enhancing drugs or some other toxin to which footballers are exposed. Certain viruses are also being investigated as potential causes.

Although the disease is certainly not limited to sportspeople, Al-Chalabi says it could also be that people prone to ALS are drawn to sport. “There could be some quality in their neuromuscular make-up that not only makes them good at sport, football particularly, but also makes them susceptible to ALS.”

Journal reference: Brain (vol 128, p 472)

New Scientist
March 15, 2005

Original web page at New Scientist

Categories
News

Closing in on a vaccine for breast cancer

Progress toward development of a breast cancer vaccine has been reported by researchers at Washington University School of Medicine and the Siteman Cancer Center in St. Louis.

Cancer-fighting vaccines stimulate immune cells to recognize tumor cells as foreign and destroy them. Physicians believe a vaccine-induced immune response could be used to supplement other cancer therapies or to immunize high-risk people against cancer. “We’ve been studying a protein called mammaglobin-A found in 80 percent of breast tumors,” says Thalachallour Mohanakumar, Ph.D., the Jacqueline G. and William E. Maritz Professor of Immunology and Oncology in the Department of Surgery and at the Siteman Cancer Center. “The protein is especially interesting for cancer immunotherapy because of its frequent occurrence and because breast tumors express it at high levels.”

In articles in the Journal of the National Cancer Institute and Breast Cancer Research and Treatment, the researchers report that they constructed a vaccine consisting of copies of the DNA sequence that makes mammaglobin-A in humans. The researchers theorized the DNA vaccine would rev up special immune cells called T cells to recognize mammaglobin-A as a foreign molecule when it is displayed on the surface of cells as an antigen (a small protein that the immune system may recognize). The primed T cells would then proliferate and attack when they met with mammaglobin-A antigens.

“Mammaglobin-A is involved in breast development and secreted in breast milk,” Mohanakumar says. “So we had to prove first that we could elicit an immune response to a protein that is in the body normally.” They injected the DNA vaccine under the skin of test mice that had been engineered so that their immune systems would react to human mammaglobin-A as a human immune system would. The researchers loaded specific cells in the mice with mammaglobin-A antigens and found that the vaccine-primed T cells attacked those loaded cells.

The research team also transferred vaccine-primed T cells into mice with growing tumors that had or didn’t have mammaglobin-A antigens. Tumors with mammaglobin-A antigens stopped growing and shrank in volume, while those without the antigens continued to grow at the usual pace. “The results demonstrated that the vaccine-primed immune response is specific to mammaglobin-A antigens,” Mohanakumar says.

Breast tumors with mammaglobin-A antigens on their surface also may display antigens that come from multiple parts of the mammaglobin-A molecule. Further experiments confirmed the importance of generating T cells that can react to a variety of different mammaglobin-A antigens. When the research team tested a DNA vaccine containing the DNA code for just one part of the mammaglobin-A molecule, they found T cells reacting to only that antigen, indicating that the method can generate immune cells that target specific parts of the mammaglobin-A protein.

“Now that we’ve found how effectively an immune response can be generated to mammaglobin-A, we plan to conduct clinical trials in patients who are at very high risk for breast cancer and in patients who have had a relapse after initial treatment,” Mohanakumar says. “We want to see if giving patients the DNA vaccine can prevent or eliminate breast cancer or at least slow its growth.”

Source: Washington University in St. Louis

Bio.com
March 15, 2005

Original web page at Bio.com

Categories
News

Ice age bacteria brought back to life

A bacterium that sat dormant in a frozen pond in Alaska for 32,000 years has been revived by NASA scientists. Once scientists thawed the ice, the previously undiscovered bacteria started swimming around on the microscope slide. The researchers say it is the first new species of microbe found alive in ancient ice. Now named Carnobacterium pleistocenium, it is thought to have lived in the Pleistocene epoch, a time when woolly mammoths still roamed the Earth.

NASA astrobiologist Richard Hoover, who led the team, said the find bolsters the case for finding life elsewhere in the universe, particularly given this week’s news, broken by New Scientist, of frozen lakes just beneath the surface of equatorial Mars. The team initially set out to find bacteria that thrived at extremely low temperatures, so it was a surprise to find organisms that tolerated the cold, but preferred room temperature. “I think the most important thing from this observation is that microorganisms can be preserved in ice for long periods of time,” Hoover told New Scientist.

He retrieved the bacteria from a tunnel in the Alaskan permafrost, carved by the US Army’s Cold Regions Research and Engineering Laboratory. Walking through the tunnel, Hoover saw a fossilised mammoth tusk protruding from one side and an ancient jawbone on the other. The bacteria came from a cross-section of a preserved pond. The bottom of the pond was a brownish hue, which Hoover thought might be caused by diatoms – single-celled algae. “Frankly, I was disappointed that there weren’t any diatoms at all,” he says. Instead, he saw a host of pigmented bacteria that started swimming as soon as the ice melted. He took the samples back to Marshall Space Flight Center in Alabama, US, and cultured them. Initially, the team thought it might be an existing bacterium, but gene sequencing revealed it as a new species.

Another group of researchers from West Chester University in Pennsylvania, US, claimed in 2000 that they had isolated a 250-million-year-old bacterium. But other scientists disputed that the microbes could be that old. For example, those particular microbes like salty environments. And salt deposits tend to have water moving through them, potentially bringing contamination, says Robert Hazen, a geophysicist with the Carnegie Institution of Washington and president of the Mineralogical Society of America. “The fact you have extracted microbes from the salt doesn’t really tell you the microbes are as old as the salt,” he says. While Hazen says that 250-million-year-old microbes seemed unlikely, “I wouldn’t be surprised at microbes that are a few tens of thousands of years old”.

Journal reference: International Journal of Systematic and Evolutionary Microbiology (vol 55, p 473)

New Scientist
March 15, 2005

Original web page at New Scientist

Categories
News

The OIE’s involvement in the field of food safety

International standards on food safety are established by the Codex Alimentarius Commission, as stated in the Agreement on the Application of Sanitary and Phytosanitary Measures (SPS Agreement) of the World Trade Organization (WTO). For its part, the World Organisation for Animal Health (OIE) is responsible, under the terms of the SPS Agreement and the mandate given to it by its Member Countries, for standards relating to animal health and zoonoses. Since many zoonoses (animal diseases transmissible to humans) can be transmitted to humans through food, OIE standards therefore also apply to animal products that could spread pathogens via international trade.

It has become apparent that the new global concept of implementing sanitary controls “from the stable to the table”, aimed at improving the level of consumer protection, requires the OIE and the Codex Alimentarius Commission to work more closely together and collaborate on a permanent basis. This will ensure that the standards issued by the two Organisations cover all potential hazards throughout the food chain and that standards on topics of common interest do not prove contradictory for want of coordination.

Moreover, virtually all the official Representatives of the 167 OIE Member Countries (the Delegates), irrespective of the Ministry to which they belong, are the Head of their country’s Veterinary Services and are responsible, within the framework of their national duties, for sanitary controls at the animal production (i.e. farm) level and very often for the transport, slaughter and processing of animals. In some countries they are also responsible for controls on products in supermarkets, catering and restaurants. They are also very often responsible for verifying that sanitary requirements are respected at the national level and for preparing sanitary certificates for the export of animals and animal products. In most cases their responsibilities also include the sanitary inspection of animal products imported into their country.

The OIE and the Codex Alimentarius Commission are now working much more closely together than in the past to try to develop the synergy needed to ensure better consumer protection through the international standards and guidelines that each Organisation adopts and publishes. In 2001, I had the opportunity to discuss these issues with Mr Tom Billy, the then Chairman of the Codex Alimentarius Commission, and to decide with him to set up a permanent Working Group involving both the Codex Alimentarius Commission and the OIE.

Once the necessary mandate had been obtained from our Member Countries, the Working Group was officially launched in 2002 as the “Working Group on Animal Production Food Safety”. Its members include the Chairman and the Secretary of the Codex Alimentarius Commission, the Chairman of the Codex Committee on Meat Hygiene, the Director of the Food Safety Department of the World Health Organization (WHO) and representatives from among the Delegates of OIE Member Countries, most of whom are Heads of the Veterinary Services, from several different continents.

The Working Group’s first main role was to help the two Organisations to define more precisely their future policies on the development of standards aimed at protecting consumers, with regard to precautions to be taken throughout the entire “production-to-consumption” continuum. This task, involving both the Codex and the OIE, notably helped to pinpoint areas in which international standards had not yet been prepared and adopted.
The Working Group then proposed that the majority of its work should be devoted to identifying measures to be taken at the production level, prior to the slaughter of animals for food: for example, how to prevent pathogens that generally have no visible effect on animals (Listeria, salmonellae, Trichinella, etc.) from being present in food products.

The Working Group is also trying to ensure that there are no inconsistencies or gaps in standards on topics falling within the scope of both our Organisations. This is for example the case with standards relating to antimicrobial resistance.
Within the framework of existing OIE standards on food-borne zoonoses (brucellosis, tuberculosis, etc., already dealt with in the Terrestrial Animal Health Code), it is also considering the elaboration of new standards aimed at strengthening consumer safety with regard to products presenting a risk of transmitting these diseases.

All this work will give rise to proposals that will first be examined by the OIE Terrestrial Animal Health Standards Commission before being submitted for adoption to the General Assembly of our Member Countries.
However, the scope of the Working Group is not limited to proposing new international standards. It also includes the drafting of guidelines on control procedures throughout the food chain. Within this context the Working Group is currently preparing several documents of particular interest for the application of controls throughout the food chain, namely:

the preparation of “Guidelines on good farming practices”, containing detailed guidelines aimed at protecting consumers from hazards that could adversely affect the safety of end products of animal origin;
a scoping paper on the role and functionality of the Veterinary Services in food safety throughout the food chain;
another scoping paper on abattoir inspection procedures for animals and meat, taking into account the duality of objectives of ante- and post-mortem inspection in areas relating both to animal health and public health.

Furthermore, the Working Group is invited to consult, whenever necessary, representatives of private sector industries from around the world. All the proceedings of the Working Group are of course brought to the attention of the relevant Committees of the Codex Alimentarius Commission. Every year, the Chairman or the Secretary of the Codex Alimentarius Commission and the Director General of the OIE are invited to address the Member Countries of each other’s Organisation. Representatives of the OIE and of the Codex Alimentarius Commission participate in the work of each other’s Committees and expert groups of mutual interest. Formal rules regarding the way in which the two Organisations take into account each other’s standard-setting work from the earliest stages in its preparation are currently being adopted by the Codex Alimentarius Commission.

The rich diversity of the professional, scientific, administrative and regional backgrounds of the members of the Working Group is a source of inspiration and progress. It nevertheless gives rise to a lengthy process of arbitration before a consensus can be reached, which explains why it may take some time for the Working Group to reach a common position. There have also been cases where, during the General Assembly of the Codex Alimentarius Commission and the General Assembly of the OIE, the respective Delegations of a given Member Country have adopted divergent positions, a situation that is hardly conducive to achieving rapid results in the work being carried out jointly by our two Organisations. It is therefore to be hoped that national consultative mechanisms, involving the Veterinary Services and all the other relevant Administrations and sectors, can be established within the Administrations of all our Member Countries. Such mechanisms could be along the lines of the Working Group established by the OIE with the support of the Codex Alimentarius Commission.

Source: Bernard Vallat

OIE
March 15, 2005

Original web page at OIE

Categories
News

Timing xenotransplants

Israeli researchers report in PNAS this week that embryonic pig tissues used for liver, pancreas, and lung transplants need to come from very specific windows of time in embryonic development. The findings offer new insights into organogenesis and may help explain past failures in xenotransplantation, coauthor Yair Reisner of the Weizmann Institute of Science in Rehovot, Israel, told The Scientist.

Reisner explained that although research into using embryonic pig tissues as a source of transplantable organs has gone on for more than two decades, timing of the transplant is a challenge. Too early, and undifferentiated embryonic tissue may develop into a teratoma, he said. Transplant too late, and embryonic tissues may have been marked with identifiers that trigger rejection by the new host. “Studying these windows provides a great assay system for basic research regarding the timing of developmental events at the molecular level,” Reisner said.

In 2003, Reisner and colleagues defined the gestational windows for the maximum capacity of human and pig kidneys to grow and differentiate into functional tissues, with minimal risk for teratoma formation. In the most recent study, they transplanted pig tissue precursors from embryonic day (E)21 to E100 into immune-deficient mice to identify the optimal windows for the growth of liver, pancreas, and lung. The researchers saw optimal liver growth and function at E28, with enzyme-linked immunosorbent assay (ELISA) showing rapid decrease of secreted albumin levels from implants past that stage. In contrast, development of mature lung tissue was not observed via stained micrographs until relatively late in gestation at E56. Maximal pancreas growth and function were seen from E42 to E56, with ELISA revealing reduced insulin secretion capacity before and afterward. “Disappointing results in past transplantation trials may be explained, at least in part, by these results,” Reisner said. “Early studies that attempted to cure diabetic patients by implantation of pig embryonic pancreas made use of late gestation tissue, which is now shown to be inferior compared to the optimal 6 weeks gestational time.”

The authors also found that teratomas were common after liver implants younger than E28, but they saw no potential for teratoma in pancreas or lung tissue at any time point. This could be explained by different relative amounts of pluripotent and committed stem cells in each developing organ or perhaps the differentially expressed restricting activity of stromal elements. “We all look for genes which characterize pluripotent stem cells,” Reisner said. “Considering that the ability to form teratoma reflects the presence of pluripotentiality, we have now defined the loss of such stem cells within E24 and E28 in the liver. Clearly, this will allow us now to probe genes which disappear within these narrow gestational time points.”

Marc Hammerman at Washington University in St. Louis, who did not participate in this study, noted that information gathered from studies in immunodeficient mice might not correlate with what goes on in immunocompetent hosts. “But I think these observations are still very valuable in better understanding organogenesis,” he told The Scientist. “This study has identified critical checkpoints in [organ] development, and it has important implications to help identify what transcription factors get activated here and there, and what factors are needed to move from one stage to the next,” said Bernhard Hering of the University of Minnesota in Minneapolis, who did not participate in this study.

The Scientist
March 15, 2005

Original web page at The Scientist

Categories
News

Life Sciences in the 21st Century

Peer M. Schatz is CEO of Qiagen, a supplier of life science research tools and employer of 1,400 people in 12 countries. Schatz has a Master’s degree in finance from the University of St. Gall, Switzerland, and an MBA in finance from the University of Chicago. He is also a member of the Advisory Board of the Frankfurt Stock Exchange.

In many ways the laboratory tools we use today may remind us of computers in the late 1970s. In those days, systems were mostly incompatible and dedicated to specific tasks. When the first personal computers emerged, these systems were integrated: “cut and paste” became ubiquitous, and it became possible to share and compare data over multiple and geographically dispersed platforms. A main driver for this development was the standardization of interfaces and communication protocols. The more complex and interrelated the applications, the more important it becomes to remove as much analytical risk as possible, so that various data contents can be compared and exchanged.

This is comparable to what we are seeing in the life sciences. Systems biology is a key development in academic and industrial research. In systems biology, the knowledge taken from different disciplines, including genomics, proteomics, glycomics, metabolomics, and others is brought together to help unveil the regulatory network of a biological system. Scientists, often dispersed in networks, are increasingly combining the analytical results of various analytes to understand basic biological principles and interactions on a cellular level. This makes it an exciting time to be serving the life sciences industry. Our customers are entering into new areas of research at a rapid pace, and the demands of the community are changing as well. Systems biology and the trend towards standardization of tools are some of the most significant changes taking place in science in the early 21st century.

Analyses are often linked, compared, and performed with various analyte formats such as DNA genotyping, expression analysis on RNA and protein analysis. We and other companies have invested significantly to address this trend in the life sciences. For example, Qiagen offers a deep and broad range of preanalytical tools for nucleic acid analysis. These can be used alongside a large and advanced protein fractionation product portfolio, which prepares various analytes simultaneously from the same sample. In all applications, the processing (both automated and manual) is designed to be as similar as possible, regardless of what analyte is prepared.

While the value of long-term standardization of platform tools is often questioned, the standardization of interfaces can dramatically enhance the flow of information and therefore of innovation. Anyone who has traveled the world can recall the frustrating array of phone and power plugs that can be found, with each country requiring a different adaptor. Computers in the 1970s were similar, and the emergence of interface standards such as the USB port significantly increased the number of products and value of PCs. Preanalytical tools for the collection, stabilization, purification, and handling of laboratory samples are comparable to an interface. They allow samples to be transformed into standardized formats, which make downstream analysis start with the same material, every time, and in every sample. Standardization therefore increases the value of information and ultimately, innovation.

In academia, we are seeing an increase in networked research and a greater demand for accelerating preclinical research. Preanalytical processing is a major influence on the quality of analytical results. This influence increases exponentially when the research results from different laboratories drive joint research efforts in networks. Roadmaps and regulatory frameworks increasingly reach into pre-clinical research. This trend leaves no room for uncertainty; full regulatory compliance of any tool is therefore critical. At Qiagen, we try to ensure the removal not only of quality risk in sample processing, but also all uncertainty as to compliance with regulations, frameworks, or roadmaps.

This trend towards making research comparable across analyses and geographic boundaries also confirms the need to be present in all major countries, to have the highest quality standards, to meet all regulatory requirements, and most importantly, to have a total commitment to focus. Almost 25% of Qiagen’s sales are generated from clinical diagnostics, and Qiagen has a range of products that are compliant with key regulatory frameworks in the United States or Europe. In addition to the diagnosis of infectious diseases and disease susceptibility, the market is facing another area of increasing importance: the screening of patients for clinical trials of targeted drugs, and ultimately for personalized medicine.

AstraZeneca’s lung-tumor drug, Iressa, according to results published in April 2004, showed a significantly higher-than-normal response rate in patients with a certain enzyme pattern. In July 2004, NitroMed announced that the Phase III clinical trial of its heart failure treatment, BiDil, had been stopped early because of the significant survival benefit seen in African-American heart-failure patients. On the other hand, in late September 2004, Merck had to withdraw its COX-2 inhibitor drug, Vioxx, because data compiled by Kaiser Permanente suggested that patients who took Vioxx had a higher cardiovascular risk than those who did not take the drug.

In November 2003, the US Food and Drug Administration published the draft of new guidelines accelerating the use of molecular biology in clinical research, and the preselection of patients to increase clinical trial safety. The FDA recommended genotyping and gene expression profiling of patients in clinical trials to allow for the selection of specific patients for trial enrollment, and to help in determining the correct dose. In addition, the FDA expects further detailed information on specific drug metabolisms (pharmacogenomics), pharmacokinetics, and subject stratification, to support scientific arguments and the validation of biomarkers.

The proven ability to seamlessly integrate sample collection, stabilization, purification, and handling technologies into complex diagnostic workflows has become increasingly important as patient samples are used for larger numbers of tests and in more diversified settings. Integrated technologies such as PreAnalytiX’s PAXgene (a joint venture between Qiagen and Becton Dickinson) allow consistent and easy blood collection, stabilization, and purification of nucleic acids for large-scale analyses. Such products have become essential standards for the further development of these markets and will ensure that survey data from pharmacogenomic trials and Phase I-III clinical trials are not biased. The use of universally available, standardized tools in clinical research allows greater speed and flexibility as well as a diagnostic perspective. This streamlines and increases the reliability of clinical assay development and drug-development programs.

Consistent or standardized processes are also the bedrock for the commercialization of molecular biology technologies in molecular diagnostics, which stands to benefit in a very significant way from the advancements and successes in molecular biology-related research. Molecular biological methods are already an integral part of everyday laboratory routine, including genetic identification in forensics and in paternity testing. They are also today’s standard in the diagnosis of infectious diseases such as HIV, hepatitis B and C, as well as human papillomavirus (HPV). In addition, molecular diagnostics companies have been successful in developing specific standardized tests for predisposition to cancers of the lung, intestine, prostate, pancreas, liver, stomach, and skin. The list of such predisposition tests is getting longer every day.

With an ever increasing number of samples and the dissemination of molecular biology approaches in the life sciences and health care, we observe a very significant need for consistent and comparable solutions that offer the highest performance but are also simple. The pharmaceutical and diagnostics industries are demanding less complex analytical platforms to increase ease of use and decentralize the analytical work between different laboratories and hospitals.

The breadth of sample types that can be analyzed is increasing both in research and in diagnostics and now includes more “live” sample material such as various tissue formats and whole blood. This significantly increases the range of available sample quality. At the same time, the sensitivity and cost of downstream analysis have increased. Samples that need processing are often limited in amount or partially degraded. Tool providers must cover a wide spectrum of different products and technologies to meet all the different requirements in these markets and to find the best solution for future needs. This can be done only with commitment and focus. By delivering innovative technologies and solutions, tool providers continue to advance standards and enable researchers in academic and industrial environments to achieve breakthroughs in healthcare. This leads to an improvement in living conditions and, ultimately, in people’s lives.

Science has achieved such incredible successes in the last few years, and the advancements seem to be accelerating. It was very fitting that the scientific world started the new millennium with the successful completion of one of the greatest scientific challenges of recent years: the publication of the sequence of the human genome. While in itself a dramatic and hugely significant event, it stands for the spectacular speed at which science overall is advancing.

The Scientist Daily
March 15, 2005

Original web page at The Scientist

Categories
News

Life expectancy in epilepsy

Does a diagnosis of epilepsy reduce a person’s life expectancy? According to Athanasios Gaitatzis and colleagues the answer is yes, but under certain circumstances and to a variable extent. These authors recently estimated life expectancy in people with epilepsy, with data from the prospective community-based UK National General Practice Study of Epilepsy (NGPSE), and made comparisons with the general population. Life expectancy can be reduced by up to 10 years when there is a known cause of the epilepsy, the estimated reduction being highest at the time of diagnosis. These observations are hardly surprising considering the wealth of literature that shows increased mortality rates in people with epilepsy.

Traditionally, mortality has been expressed as the ratio of the observed and expected numbers of deaths: the standardised mortality ratio. Expected deaths are calculated by applying the death rates of an external reference population to the age distribution of the study population. The epilepsy population has a standardised mortality ratio of 2-3 (ie, a mortality that is 2-3 times higher than that of the general population). People with epilepsy of unknown cause have at most only a slight increase in mortality, while those with epilepsy as a symptom of a known underlying cause largely account for the overall increased mortality. The increase is evident during the first years after the onset of epilepsy, and mortality then declines to levels close to those in the general population. However, some studies show an increase in mortality later, a decade or more after disease onset. Relative survivorship, defined as the proportion of observed to expected number of survivors, provides a different perspective of mortality. The relative survivorship at 5, 10, and 15 years after diagnosis was 91%, 85% and 83%, respectively, in a pioneering study. In epilepsy of unknown cause relative survivorship is as high as 96% 25 years after diagnosis.
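Both quantities just defined are simple ratios, and the arithmetic can be sketched briefly. All numbers below are invented for illustration (they are not the NGPSE figures), chosen only so that the resulting SMR falls in the 2-3 range described above:

```python
# Illustrative standardised mortality ratio (SMR) and relative survivorship.
# Age bands, rates, and counts are hypothetical.

# Study cohort: person-years at risk per age band
person_years = {"0-39": 12000.0, "40-64": 8000.0, "65+": 3000.0}
# Reference population death rates (deaths per person-year) per age band
reference_rate = {"0-39": 0.0005, "40-64": 0.004, "65+": 0.03}
observed_deaths = 290

# Expected deaths: apply reference rates to the cohort's age structure
expected_deaths = sum(person_years[a] * reference_rate[a] for a in person_years)
smr = observed_deaths / expected_deaths  # > 1 means excess mortality

# Relative survivorship: observed vs expected survivors at a follow-up point
observed_survivors = 910
expected_survivors = 1000
relative_survivorship = observed_survivors / expected_survivors

print(f"expected deaths = {expected_deaths:.0f}, SMR = {smr:.2f}")
print(f"relative survivorship = {relative_survivorship:.0%}")
```

Here 128 deaths would be expected from the reference rates, so 290 observed deaths give an SMR of about 2.3, while 910 survivors against 1000 expected give a relative survivorship of 91%.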

Gaitatzis and colleagues introduce yet another way of analysing data on mortality in patients with epilepsy. The estimated life expectancy of people with new-onset non-febrile seizures from the UK followed up for a median of 15 years was compared with that in people of the same age and sex in the general population. The difference in life expectancy between these populations gives the estimated years of life lost, which is presented at different intervals after the diagnostic seizure, for men and women, young and old, and for epilepsy of known and unknown cause. With this analysis, the authors have expressed earlier knowledge in a new form, which serves to refine our understanding of an important prognostic aspect of epilepsy.
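Estimated years of life lost is then just a subtraction, repeated at each interval after diagnosis. A toy illustration with hypothetical values (not the NGPSE estimates), shaped to mirror the pattern reported above of a reduction of up to 10 years that is largest at diagnosis:

```python
# Years of life lost (YLL) = reference life expectancy - cohort life expectancy,
# evaluated at matching intervals (years) after diagnosis. Values hypothetical.

reference_le = {0: 38.0, 5: 34.0, 10: 30.0}   # years remaining, general population
cohort_le    = {0: 30.0, 5: 29.5, 10: 27.5}   # years remaining, epilepsy cohort

years_lost = {t: reference_le[t] - cohort_le[t] for t in reference_le}
print(years_lost)
```

In this toy data the reduction is 8 years at diagnosis, shrinking to 2.5 years a decade later, echoing the observation that the estimated reduction is highest at the time of diagnosis.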

The critical question is what the reasons might be for the reduced life expectancy in people with new-onset epilepsy. Gaitatzis’ data, as indeed other population-based studies, strongly suggest that the increased mortality is related to the underlying cause of epilepsy rather than to the seizures. One may therefore ask, for example, how different life expectancy would be for a person with poststroke epilepsy compared with someone who has had a stroke but not developed epilepsy. Indeed, such patients might constitute the most relevant control group to clarify the contribution of the epilepsy itself. In fact, the reported causes of death in the NGPSE cohort, mainly cancer, ischaemic heart disease, cerebrovascular disease, and pneumonia, also indicate that mortality as a consequence of seizures is rare in newly diagnosed patients.

Although the estimates of life expectancy are of considerable interest for researchers and physicians, they are probably less useful in counselling patients and relatives. First, it is difficult to generalise from these data because Gaitatzis and colleagues included patients with single seizures and acute symptomatic seizures, while excluding those with brain tumours. Second, epilepsy is a heterogeneous disorder with multiple causes, each affecting the prognosis differently. Because of the limited size of the cohort, patients with different causes were lumped into one group: symptomatic epilepsy. However, it does not make much sense to estimate life expectancy for an individual with epilepsy after a traumatic brain injury on the basis of data from patients with underlying cerebrovascular disease. Third, the concept of years of lost life might be difficult to communicate to patients. A person with idiopathic/cryptogenic epilepsy would probably consider an estimated reduction in life expectancy of up to 2 years highly significant rather than minimal.

Although this extended analysis of the NGPSE data provides a new and interesting perspective on mortality risks, we now need to find ways to estimate the contribution of the epilepsy and seizures themselves and the causes of such deaths, and to analyse to what extent treatment can minimise risks and reduce years of lost life for people with epilepsy. Finally, we should acknowledge that the available data on mortality in epilepsy derive almost exclusively from western industrialised countries. We lack information from the rest of the world, where the vast majority of the global epilepsy population resides and where epilepsy-related mortality and life expectancy are likely to be different.

Source: Torbjörn Tomson, Lars Forsgren

The Lancet
March 15, 2005


The oldest humans just got older — by 35,000 years

Two Homo sapiens skulls, originally dated as 130,000 years old when they were unearthed at Kibish, Ethiopia, in 1967 and later revised to 160,000 years, have now been declared 195,000 years old on the basis of geological evidence. “It pushes back the beginning of the anatomically modern humans,” said geologist Frank Brown, dean of the University of Utah’s College of Mines and Earth Sciences and co-author of a new study of the skulls, known as Omo I and Omo II.

The results of the study, conducted with New York’s Stony Brook University and the Australian National University, were published in the journal Nature. After examining the volcanic ash in which the skulls were found along the Omo river, the researchers not only dated the two sets of remains as the same age but also pushed back the date of their existence, making them by far the oldest known modern humans. “On this basis we suggest that hominid fossils Omo I and Omo II are relatively securely dated to 195 +/- 5 (thousand) years old … making Omo I and Omo II the oldest anatomically modern human fossils yet recovered,” the study concluded.

The new dating firmly underpins the “out of Africa” theory of the origin of modern humans. Brown said the redating was culturally important because it pushed back the known dawn of mankind, whose cultural record in most cases starts only about 50,000 years ago. “Which would mean 150,000 years of Homo sapiens without cultural stuff such as evidence of eating fish, of harpoons, anything to do with music, needles, even tools,” he said. “This stuff all comes in very late except for stone knife blades, which appeared between 50,000 and 200,000 years ago, depending on whom you believe,” he added in a commentary.

The skulls were first discovered just 200 meters apart on the shores of what was formerly a lake, by a team led by the renowned fossil hunter and wildlife expert Richard Leakey. They bear cut marks made by stone tools, which have been taken as evidence of prehistoric mortuary practices. Ever since the discovery of the fossil skulls, scientists have been locked in debate not only over the dating but also over the physical types, because Omo I has more modern features than Omo II.

The new dating suggests that modern man and his older precursor existed side by side. “It dates the fossil record almost exactly concordant with the dates suggested by genetic studies for the origin of our species,” said Stony Brook anthropologist John Fleagle. “Second, it places the first appearance of modern Homo sapiens in Africa many more thousands of years before our species appears on any other continent. It lengthens the gap,” he added.

Yahoo
March 15, 2005


Hopes for male contraception

More than four decades after the introduction of female oral contraceptives there is still no comparable pharmacological method for men. Female methods have helped couples to achieve the desired family size and have contributed, in demographic terms, to the slowing of world-population growth. Nevertheless, the population continues to explode, and 8 billion people will inhabit the world by 2020, endangering medical, social, and economic progress in developing countries. Male contraceptive methods would help to decrease population growth further. Although such a decrease could be achieved by wider accessibility of female contraceptives, women (and often politicians) increasingly demand that men share not only the blessings of contraception but also its burdens and risks. And men would be willing to use pharmacological methods if available.

Research efforts since the 1970s have not yet led to a male contraceptive for general use, beyond the condom and vasectomy. The willingness to use new male methods might be the reason why a non-human primate study testing male immunocontraception drew much public interest. Michael O’Rand and colleagues found that seven of nine bonnet monkeys (Macaca radiata) immunised against human eppin developed sufficient antibody titres and none of them impregnated fertile female partners, while four of the six control animals sired offspring. Sperm motility, but not count, was affected in the immunised animals. Although effective at first glance, the study highlights several of the shortcomings and problems that immunological approaches to contraception harbour.

Eppin, encoded by a single-copy gene, is a protease with antibacterial activity that is claimed to be expressed solely in the testis, the epididymis, and on sperm. Eppin thus appears to be a valid target for contraception. High antibody titres in serum and semen were achieved, yet the mechanism of action remains unclear. Immunological neutralisation of eppin could lead to local infection and inflammation, and consequently to infertility, which would be undesirable. Immunisation against a body constituent such as eppin could also lead to local reactions, and to the formation of immune complexes and autoimmune disease. When rhesus monkeys were immunised against the luteinising hormone β-subunit, granular deposits, indicating immunoglobulin penetration, were observed in pituitary cells of one animal, and monkeys immunised with ovine luteinising hormone developed alopecia. A WHO-sponsored phase II trial immunising against human chorionic gonadotropin to prevent pregnancy in women was discontinued because of severe pain at the injection sites and tissue reactions in the first volunteers. Immunisation against eppin (or other antigens from the male reproductive tract) could lead to immune orchitis and epididymitis, causing irreversible infertility. In O’Rand and colleagues’ study, only five of seven monkeys regained fertility. Reversibility is a prerequisite for any pharmacological male method, which would otherwise have no advantage over vasectomy. After antibody titres have developed, it is unclear when they drop to a level no longer biologically active enough to neutralise the respective antigen. Because the antigen is of endogenous origin, it could also cause perpetual boosting. On the other hand, induction of antibodies is unpredictable, as O’Rand and colleagues also showed. Two of the first six animals did not form antibodies efficiently, and complete Freund’s adjuvant (unacceptable for human use) had to be co-administered in further immunisations to achieve sufficient titres.

Because of these vagaries and the unpredictable side-effects surrounding immunocontraception, the WHO Task Force on Male Fertility Regulation never favoured this approach. However, immunisation of monkeys against follicle-stimulating hormone showed that elimination of this hormone alone would not suppress spermatogenesis to a degree compatible with contraception, as had been hoped. To achieve this goal, both follicle-stimulating hormone and luteinising hormone need to be suppressed. Because lack of luteinising hormone leads to atrophy of Leydig cells and insufficient testosterone production, testosterone needs to be replaced exogenously. Suppression of luteinising and follicle-stimulating hormones, combined with replacement of testosterone, has become the principle of hormonal male contraception.

The first efficacy study with testosterone applied this principle and showed that contraceptive protection can be achieved. Further studies showed that the success rate is even better when gestagens or gonadotropin-releasing-hormone antagonists are added to testosterone. Researchers urged the drug industry, still reluctant to venture into male contraception, to translate this principle into a marketable steroid combination. This pressure culminated in the 1997 Weimar Manifesto on Male Contraception, and at last two drug companies in the field of contraception have started trials, which might eventually result in a marketable pill or injection. Regulatory agencies are deliberating the requirements for licensing a male contraceptive.

Meanwhile a search for the second generation of male contraceptive drugs is attracting the ingenuity of researchers worldwide, supported by governmental and non-governmental organisations such as WHO, Population Council, CONRAD, National Institutes of Health, medical research councils, and the Rockefeller, Mellon, and Gates Foundations. Molecular biologists have helped to identify non-hormonal targets for male contraception, as the Rockefeller/Schering Foundation Network Symposium on Male Contraception in Bellagio in 2002 and the 2004 National Institutes of Health conference (The future of male contraception) in Seattle showed. Indeed, there are more targets than current research funds can explore. Incidentally, neither of these top-level meetings featured immunological approaches.

The road to male contraception appears to be long and winding. Often researchers and the public are frustrated by lack of progress and inflated promises. Admittedly, the need for male contraception is not as urgent as before the advent of female oral contraceptives, the search for which was driven by proponents such as Margaret Sanger and Katherine Dexter McCormick, who understood the miserable situation of women constantly threatened by pregnancy. Male contraceptive research is driven more by intellect and less by emotion. A bit more emotion and a few prominent proponents would hasten progress in male contraception and complete this chapter of gender equality.

Source: Eberhard Nieschlag, Alexander Henke

The Lancet
March 15, 2005


Painful lessons

The serious side effects of pain relievers have been in the news lately. The increased risk of heart attacks and strokes associated with rofecoxib (Vioxx, Merck) and celecoxib (Celebrex, Pfizer) led to the withdrawal of rofecoxib in late 2004 (the largest drug withdrawal in history) and has reduced sales of celecoxib by 50%. In fact, the risk may be associated with the entire class of COX-2 selective inhibitors, and an advisory committee of the US Food and Drug Administration (FDA) will meet in mid-February 2005 to discuss the cardiovascular safety profile of these drugs. Because rofecoxib and celecoxib are top-selling prescription medicines, the issue has drawn a widespread response. At a US congressional hearing, we heard testimony that the FDA failed to protect the public from unsafe drugs. In newspapers, we read accusations that pharmaceutical companies knew of the increased cardiovascular risk associated with these drugs but delayed action. Calls for reforming the FDA abound, and, not surprisingly, lawyers across the country are drumming up clients for litigation against Merck and Pfizer. Added to all of this, the FDA recently issued a warning about cardiovascular risks associated with naproxen (Aleve, Bayer), and a study claimed that ibuprofen can damage the small intestine. Although the naproxen warning has been questioned, even by the fiercest critics of COX-2 inhibitors, as a premature overreaction to the Vioxx debacle, it seems that no drug is safe when it comes to pain relief.

While the social and economic implications of the fall of COX-2 selective inhibitors continue to unfold, far less has been said about what it may mean for the scientific and drug discovery communities. After all, the development of rofecoxib and celecoxib represented a success of modern structure-based drug design. It also validated the general belief, at least until recently, that a detailed understanding of the players in biological processes, especially those involved in disease pathogenesis, would lead to better medicine.

COX-2 selective inhibitors belong to the family of nonsteroidal anti-inflammatory drugs (NSAIDs). NSAIDs reduce pain, fever, and inflammation, but many can also cause gastric bleeding, a side effect that can be life-threatening if left untreated (the cause of an estimated 16,000 deaths a year in the United States alone). Developing a pain reliever without this potentially serious problem has thus been a long-standing goal.

Both molecular and structural studies contributed to the development of such a drug. The first breakthrough was the identification of the molecular target of NSAIDs. One of the earliest NSAIDs was aspirin, a compound related to a substance in aspen tree bark. Although aspirin has been widely used for a long time, its molecular target—an enzyme called cyclooxygenase (COX)—was only identified in the 1970s. Cyclooxygenase catalyzes the first step of the synthesis of prostaglandins, a group of compounds involved in a variety of signaling processes, such as the inflammation response and the activation of platelet clumping.

In the 1980s, it was discovered that COX activity was induced at or near the site of inflammation. Further characterization showed that the activity is from the expression of a form of the enzyme (now called COX-2) distinct from the constitutively expressed enzyme (now called COX-1). The correlation of COX-2 expression with inflammation led to the hypothesis that selective inhibition of COX-2 may be a way to reduce inflammation without the undesirable gastric bleeding side effect.

Another important step forward in the development came from the crystal structures of both COX-1 and COX-2 in complex with inhibitors in the mid-1990s. These structures revealed that the active sites of the two enzymes are very similar. This observation is not surprising because they catalyze the same reaction. Nevertheless, there is one crucial difference: COX-2 has a side pocket extending from the active site, and this difference was exploited for the design of selective inhibitors.

Indeed, the designed inhibitors preferentially inhibit COX-2 over COX-1 at the molecular level; they are as effective as nonselective NSAIDs; and they cause less damage to the stomach when viewed by endoscopy. All of these indications suggested that the drugs would be good for those who suffer from chronic inflammation and pain, such as arthritis patients. In the late 1990s, rofecoxib and celecoxib were approved to treat osteoarthritis and rheumatoid arthritis. The success also prompted pharmaceutical companies to further pursue the next generation of inhibitors with even higher specificity toward COX-2.

Like the development of imatinib (Gleevec, Novartis), the success of rofecoxib and celecoxib validated the concept that drugs targeted to specific molecules have fewer side effects, at least until recently. The problem facing rofecoxib and celecoxib (that they avoided gastric bleeding only to incur the severe side effects of heart attacks and strokes) seems to contradict this particular strategy. However, one need only look closely at the initial assumption about COX-2 inhibition to realize that perhaps inhibiting COX-2 was not specific enough. It is now known that the roles of COX-1 and COX-2 may overlap. In addition, COX-2 has been found to be constitutively expressed in several other tissues besides being induced at sites of inflammation. This suggests that the enzyme's activity is involved in other processes, and it is consistent with the observation that some COX-2 selective inhibitors seem to affect the levels of prostaglandins involved in dilating blood vessels or activating platelet clumping.

The fall of COX-2 selective inhibitors may be discouraging news, but it highlights the importance of a full understanding of the target in question. Basic research is now necessary to understand the complex roles of COX enzymes in various biological processes and to identify new anti-inflammation targets other than COX-2. Perhaps specific inhibitors downstream of the prostaglandin pathway would alleviate the unacceptable cardiovascular risks. Nevertheless, the case is a clear reminder that all drugs have side effects; the question is whether their benefits outweigh their risks.

Nature
March 15, 2005


The brain of Homo floresiensis

The brain of Homo floresiensis is assessed by comparing a virtual endocast from the type specimen (LB1) with endocasts from great apes, Homo erectus, Homo sapiens, a human pygmy, a human microcephalic, Sts 5 (Australopithecus africanus), and WT 17000 (Paranthropus aethiopicus). Morphometric, allometric, and shape data indicate that LB1 is neither a microcephalic nor a pygmy. LB1’s brain size versus body size scales like an australopithecine, but its endocast shape resembles that of Homo erectus. LB1 has derived frontal and temporal lobes and a lunate sulcus in a derived position, which are consistent with capabilities for higher cognitive processing.

Science
March 15, 2005


Fast track to longevity

Researchers have moved a step forward in understanding how calorie restriction is linked to lifespan extension in mammals. In this week’s issue of Nature, a group from the United States reports that SIRT1—the mammalian version of a protein linked to longevity in simpler organisms—controls glucose metabolism in mice in response to fasting.

Pere Puigserver of Johns Hopkins University and colleagues found that fasting signals induce the SIRT1 protein in the liver. This protein is one of the mammalian homologues of Sir2, known to extend lifespan in yeast and worms. SIRT1 then interacts with the coactivator PGC-1alpha, which, in turn, triggers glucose production, a key metabolic change associated with extended lifespan. “Our work provides a novel connection between PGC-1alpha, a protein involved in the food-deprivation response, and SIRT1, a protein linked to aging in lower organisms,” Puigserver told The Scientist.

SIRT1, which is an NAD+-dependent histone deacetylase, had already been associated with calorie restriction and longevity in mammals. Induced by food deprivation, it inhibits stress-induced apoptotic cell death in vitro and promotes fat mobilization in vitro and in vivo. However, it was unclear how SIRT1 might be involved in pathways such as gluconeogenesis and glycolysis, which are directly affected by calorie restriction in mammals.

In the Nature paper, the research team provides a connection between SIRT1 and these pathways. Moreover, they show that SIRT1 acts as a sensor of food deprivation. “During starvation, there is an increase in pyruvate, a nutrient signal that induces translation of SIRT1, and an increase in NAD+, which functions as a substrate and as an activator of SIRT1. The active SIRT1 interacts with PGC-1alpha, deacetylates it, and keeps it active, promoting glucose production in the liver,” explained Puigserver. With these results, the researchers showed that besides the hormonal control of PGC-1 through glucocorticoids and glucagon during fasting, there is a nutrient control as well, which targets SIRT1.

Marc Tatar of Brown University, who did not participate in the research, found the role of SIRT1 in nutrient sensing impressive. “There are hormonal inputs for sensing nutrients that are released systemically and circulate throughout the body,” Tatar told The Scientist. “But what we are beginning to see is that there are also systems in which every cell can sense the nutrient condition in their own neighborhood and adjust their metabolism to their local nutrient conditions.”

Tatar said this type of autonomous nutrient sensing could date back to times when organisms were only single celled and didn’t have hormone signals. “These are probably the roots, and the reason that you find [this sensing system] in yeast, nematodes and mammals, is because it is very ancestral. We are looking at it in flies,” said Tatar.

According to Leonard Guarente of the Massachusetts Institute of Technology, who was not involved in the study, the Nature paper provides a good example in which SIRT1 is influencing a key physiological aspect of calorie restriction in a mammal. “Although this is not the first example, it’s an important one,” he said. Guarente’s group recently reported how SIRT1 influences fat mobilization in mammals. “In fat cells, the target that SIRT1 is acting on is the nuclear hormone receptor PPAR-gamma, a critical regulator of fat; in this system, it’s PGC-1, which is a cofactor for PPAR-gamma. This suggests we are converging in a critical pathway here.”

“Calorie restriction really mitigates many diseases. Once we understand these pathways, we can think about developing drugs that can intervene pharmacologically and have implications to specific diseases,” explained Guarente. “The hypothesis linking low food to longevity and disease resistance through Sir2 is robust. The testing of the hypothesis is just beginning.”

BioMed Central
March 15, 2005
