Categories
News

Heart-healthy yak cheese

In a finding likely to get cheese lovers talking, researchers in Nepal and Canada report that yak cheese contains higher levels of heart-healthy fats than cheese from dairy cattle. Producers make the cheese from the milk of yaks, the long-haired, humped animals that are fixtures in Tibet and throughout the Himalayan region of south central Asia, Mongolia, and a few other countries. Yak cheese has only recently become available in the United States, where it is sold in select gourmet food stores. Studies by others have shown that certain types of dairy-derived fatty acids, particularly conjugated linoleic acid (CLA), may help fight heart disease, cancer and even diabetes. However, little was known about the fatty acid composition of yak cheese. In the new study, Brian W. McBride and colleagues compared the fatty acid composition of yak cheese from Nepal with that of cheddar cheese obtained from Canada. They found that levels of CLAs were four times higher in the yak cheese than in the dairy cow cheese. Levels of polyunsaturated fatty acids, which are healthy for the heart, were also significantly higher in the yak cheese, the researchers say.

Science Daily
April 1, 2008

Original web page at Science Daily

Categories
News

Saving the planet one farm at a time

There’s good news on the agricultural front: Plants grown as food crops on small farms show a surprising amount of biodiversity worldwide, an international team has found. The findings bode well for efforts to preserve diversity as a hedge against plant diseases, insect pests, and global climate change. Biodiversity is the great bulwark against crop loss. If a certain kind of plant becomes vulnerable to a particular disease, plant breeders can work with related varieties to develop a hardier strain. The more varieties there are to choose from, the greater the chances scientists have of strengthening a crop’s defenses. But as family farms are replaced by large agricultural businesses, which cultivate less diverse crops, scientists have become concerned that the trend endangers agricultural biodiversity.

To gauge the importance of family farms, botanist Devra Jarvis of Bioversity International in Rome and colleagues examined 27 agricultural crops on more than 2000 small farms on five continents. In all, the study took nearly 10 years and covered 63,600 hectares. To the team’s surprise, every farmer grew more than one crop variety, and in some cases, such as rice farms in Vietnam and cassava farms in Peru, they grew more than 60 varieties side by side. “There still is a lot of diversity left in farmers’ fields,” says Jarvis, whose team reports its findings online this week in the Proceedings of the National Academy of Sciences.

The research is important because it presents “the overall message that [small] farms of the world continue to maintain a considerable crop genetic diversity,” says plant geneticist Jean-Louis Pham of the Institute of Research for Development in Montpellier, France. The paper’s relatively simple data required painstaking efforts to obtain, he says, and no doubt the results will be widely used as a reference, “providing us with a kind of state of the world of crop diversity at the beginning of the 21st century.”

ScienceNow
April 1, 2008

Original web page at ScienceNow

Categories
News

Eating chocolate and the risk of dying

It was a sorry end. Cut down in his prime, the cunning thief lay on the slab, his cold body offering pathologist Brett Gartrell no outward sign of how he had met his maker. Once Gartrell had wielded his scalpel, however, the cause became clear: a belly stuffed with sticky brown gunk. Diagnosis? Death by chocolate. Divine – yes. Delicious – absolutely. But deadly? For some it certainly is. The corpse on Gartrell’s slab belonged not to a human but to a kea, an endangered New Zealand parrot. Like many animals, keas are acutely sensitive to chemicals in chocolate that are harmless to humans in all but huge doses. Scientists are now studying these chemicals, along with other substances in cocoa, hoping to exploit their toxic effects to control pests or microbes. If you’re reading this after scoffing your fifteenth chocolate Santa, don’t panic: we humans have been safely enjoying the beans of the cacao plant, Theobroma cacao, for millennia. Theobroma is Greek for “food of the gods”, reflecting the Mayan belief that cocoa had divine origins. Every April, the Maya sacrificed a dog with cacao-coloured markings in honour of Ek Chuah, the god of cacao.

Knife-wielding priests aside, chocolate is still bad news for many animals. Cocoa beans are naturally rich in caffeine and its chemical relatives theobromine and theophylline, collectively called methylxanthines. To humans these are little more than benign stimulants, but to a number of animals they are highly toxic. Just 240 grams of unsweetened dark chocolate contains enough methylxanthines to kill a 40-kilogram dog, about the size of a German shepherd. It was methylxanthines that did for the kea too. Gartrell, a wildlife pathologist at Massey University in Palmerston North, New Zealand, is wearily familiar with keas’ propensity to poison themselves. Besides being arguably the world’s smartest birds, keas are extraordinarily inquisitive foragers, using their beaks to rip open tents and backpacks, open garbage bins and even pry pieces off cars in their quest for food. “They’ll try anything that is vaguely edible, which is part of the reason they get into trouble,” says Gartrell.

The dead kea was found outside a hotel kitchen in the holiday resort of Mount Cook Village in the Southern Alps. It had eaten more than 20 grams of dark chocolate, presumably pilfered from the kitchen garbage (New Zealand Veterinary Journal, vol 55, p 149). “He’d really pigged out,” says Gartrell. The ill-fated kea was by no means alone in its folly. Veterinary journals are peppered with stories of dogs, cats, parrots, foxes, badgers and other animals dropping dead after finding chocolate or being fed it by well-meaning humans. The reason humans don’t turn up their toes after bingeing on chocolate is largely down to the speed at which our bodies metabolise theobromine, the most abundant methylxanthine in chocolate. Rats metabolise it much more slowly than humans, and dogs are slower still. There are no reliable figures for theobromine toxicity in humans, but based on caffeine toxicity an average adult would have to gorge on around 50 kilograms of milk chocolate in a single sitting to get anywhere near a lethal dose.
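The dose arithmetic in these passages can be made explicit with a quick back-of-the-envelope sketch. The dog figures come straight from the article; the kea body mass used below is an illustrative assumption, not a figure from the article:

```python
# Dose scaling using the article's figures: 240 g of unsweetened dark
# chocolate holds enough methylxanthines to kill a 40 kg dog.
LETHAL_DOG_CHOCOLATE_G = 240
DOG_MASS_KG = 40
lethal_g_per_kg = LETHAL_DOG_CHOCOLATE_G / DOG_MASS_KG  # 6 g chocolate per kg

# The kea ate more than 20 g of dark chocolate; its body mass here
# (~0.9 kg) is an assumed, hypothetical figure for illustration only.
KEA_INTAKE_G = 20
ASSUMED_KEA_MASS_KG = 0.9
kea_g_per_kg = KEA_INTAKE_G / ASSUMED_KEA_MASS_KG

print(f"Dog threshold: {lethal_g_per_kg:.1f} g chocolate per kg body mass")
print(f"Kea intake:    {kea_g_per_kg:.1f} g chocolate per kg body mass")
```

Even before accounting for differences in how birds metabolise theobromine, the kea's intake per kilogram of body mass was several times the dose that fells a dog.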

The observation that methylxanthines are highly toxic to animals, with dogs being especially vulnerable, prompted John Johnston, a chemist at the US Department of Agriculture in Fort Collins, Colorado, to investigate chocolate as a more selective way of controlling coyotes (Journal of Agricultural and Food Chemistry, vol 53, p 4069). Coyotes are a serious pest in the US, killing $44 million worth of livestock each year, damaging property and attacking people and pets. Measures such as fences are often ineffective. Sometimes culling them is the only option, but unfortunately the poisons now used, such as sodium cyanide, are toxic to humans and most other animals too. “If we can come up with something that is more selective, it offers an advantage,” says Johnston. “It’s a more responsible approach.” Methylxanthines looked as though they might fit the bill. After testing the toxicity of several different types of chocolate, Johnston came up with a mixture of theobromine and caffeine that killed coyotes quickly and with minimal distress. The mixture can be hidden in bait and is currently undergoing field tests.

Methylxanthines are also shaping up as a way to dispatch other pests. Earl Campbell of the US Pacific Basin Agricultural Research Center in Hilo, Hawaii, discovered that caffeine sprays could kill two species of noisy and ecologically damaging Caribbean tree frogs that have plagued the island since they were accidentally introduced in the 1980s. Campbell noticed that the spray also killed slugs. His colleague Robert Hollingsworth then found that caffeine spray made snails kick the bucket too (Nature, vol 417, p 915). Hollingsworth is now developing caffeine as an alternative to conventional pesticides, such as those used in slug pellets. “There’s a huge amount of interest in using botanical extracts,” he says. “People are more comfortable with things that are natural.” The methylxanthines are just a start. “Cocoa is a real gold mine of different components,” says Herwig Bernaert, research manager at Barry Callebaut, a chocolate manufacturer in Zurich, Switzerland. Cocoa contains more than 700 compounds and there is a great deal of research on which of these can affect people or other creatures. Some of these compounds, such as the flavonoids, have commanded a lot of attention for their apparent health benefits, but researchers are also keen to exploit chocolate’s more sinister side.

Some studies suggest cocoa extracts can prevent Helicobacter pylori, the bacterium that causes stomach ulcers, from setting up shop in the lining of the gut. Others suggest that the extracts block the growth of disease-causing strains of the gut bacterium E. coli. Compounds isolated from chocolate could even be used to prevent tooth decay. Phil Marsh, a microbiologist at the Leeds Dental Institute in the UK, reported last year that a cocoa extract rich in polyphenols discouraged mouth bacteria from sticking to teeth and forming plaque. It also reduced the tooth-rotting powers of the bacterium Streptococcus mutans by reducing its ability to produce acid (European Journal of Oral Sciences, vol 114, p 343). But if you think that means it’s time to ditch the toothpaste and reach for that jumbo jar of chocolate spread instead, forget it. There’s far too little of these polyphenols in chocolate to outweigh the damage that will be done by all the sugar it contains. “There are many other compounds in a chocolate bar that would overwhelm any benefit of the polyphenol,” Marsh warns. As manufacturers fall over themselves to advertise chocolate’s health-boosting potential, there’s a wry satisfaction to be had in knowing that its sinister side is being put to good use too. While this may be good news for us, just make sure your furry or feathered friends can’t get their sticky paws or claws on it.

New Scientist
January 8, 2008

Original web page at New Scientist

Categories
News

When it comes to vitamins, more is not always better

Researchers funded by the Agricultural Research Service (ARS) looked into links between dietary intakes of two B vitamins—folate and vitamin B12—and mental agility among seniors. Folate and B12 are important nutrients for the development of healthy nerves and blood cells. The study, published in 2007, was led by epidemiologist Martha Morris and colleagues at the ARS Jean Mayer USDA Human Nutrition Research Center on Aging (HNRCA) in Boston, Mass. It was based on an analysis of data collected from the U.S. population for the National Health and Nutrition Examination Survey between 1999 and 2002. Blood tests were used to determine the volunteers’ folate and vitamin B12 levels.

U.S.-enriched cereal grain products have been fortified with the synthetic form of folate (folic acid) since 1998. The HNRCA’s Paul Jacques and Jacob Selhub, coauthors on the 2007 study, had previously published papers with Silvina Choumenkovitch, reporting that folate levels have become extremely high in the U.S. population since fortification began. The researchers found an interesting association among seniors aged 60 and older whose vitamin B12 blood levels were low. Aging and taking stomach-acid blockers can contribute to a gradual lessening of B12 absorption in the body. People with high folate and low B12 status were found to be at a disadvantage when compared to those with normal folate and low B12 status; the former group was more likely to exhibit both anemia and cognitive impairment, according to Jacques. A single cognitive function test was used to assess aptitudes such as response speed, sustained attention, visual-spatial skills, associative learning and memory. Scientists have long known that being seriously deficient in vitamin B12 leads to impaired cognitive function caused by neurological complications. The researchers recommend future studies that look into the implications of having too much folic acid, due to fortification, and too little vitamin B12, due to poor absorption.

Science Daily
December 11, 2007

Original web page at Science Daily

Categories
News

Choosing dry or wet food for cats makes little difference when it comes to feline diabetes

Although society is accustomed to seeing Garfield-sized cats, obese, middle-aged cats can have a variety of problems including diabetes mellitus, which can be fatal. The causes of diabetes mellitus in cats remain unknown, although there has been a strong debate about whether a dry food diet puts cats at greater risk for diabetes. A new study from a University of Missouri-Columbia veterinarian suggests that weight gain, not the type of diet, is more important when trying to prevent diabetes in cats. Because dry cat food contains more starch and more carbohydrates than canned cat food, some have argued that a diet containing large amounts of carbohydrates is unnatural for a cat that is anatomically and physiologically designed to be a carnivore. Carbohydrates constitute between 30 percent and 40 percent of dry cat food. Some have been concerned that this unnatural diet is harmful to cats and leads to increased incidence of diabetes. Wet cat food, on the other hand, is high in protein and more similar to a natural carnivore diet. In the study, Robert Backus, assistant professor and director of the Nestle Purina Endowed Small Animal Nutrition Program at MU, and his team of researchers compared a colony of cats in California raised on dry food with a colony of cats in New Zealand raised on canned food. After comparing glucose-tolerance tests, which measure blood samples and indicate how fast glucose is being cleared from the blood after eating, researchers found no significant difference between a dry food diet and a wet food diet.

They also compared the results between cats less than three years of age and cats older than three. The MU veterinarian indicated that allowing cats to eat enough to become overweight is more detrimental to their health than the type of food they eat. “Little bits of too much energy lead to weight gain over time,” Backus said. “We did find that cats on canned or wet food diets have less of a tendency towards obesity than cats on dry food diets.” Forty percent of all cats in middle age are overweight or obese. According to Backus, male cats should weigh between 10 and 11 pounds, and female cats should weigh between 5.5 and 7.7 pounds. Besides diabetes, overweight cats are prone to other conditions such as skin diseases, oral diseases and certain cancers. When cats are spayed and neutered, they have a tendency to eat more and gain weight. Backus suggests monitoring the food even more closely at this time and not allowing the cat to eat in excess. “The most effective thing you can do is be the one who determines how much your cat eats,” Backus said. “We have been conditioned to fat cats, but cats should have only between 18 and 20 percent body fat.” Backus’ research was presented recently at the American College of Veterinary Internal Medicine Conference in Seattle.

Science Daily
December 11, 2007

Original web page at Science Daily

Categories
News

Eating your greens could prove life-saving if a heart attack strikes

A diet rich in leafy vegetables may minimize the tissue damage caused by heart attacks, according to researchers at the Albert Einstein College of Medicine of Yeshiva University. Their findings suggest that the chemical nitrite, found in many vegetables, could be the secret ingredient in the heart-healthy Mediterranean diet. “Recent studies show that administering nitrite to animals, either intravenously or orally, can greatly limit the damage caused by a heart attack and the stress to tissue that follows due to reperfusion–the return of blood to oxygen-starved heart muscle,” says Dr. David Lefer, the study’s senior author and professor of medicine and of pathology at Einstein. “We wondered if feeding animals much lower levels of nitrite and nitrate–equivalent to what people can readily obtain from their diets–could also provide protection from heart-attack injury.”

Nitrite and its “chemical cousin” nitrate are important because of their role in producing nitric oxide gas. In 1986, researchers made the remarkable finding that nitric oxide–famous until then mainly as an air pollutant–is produced by cells lining healthy arteries and plays a crucial role in cardiovascular health by dilating arteries and aiding blood flow. Damage to the artery lining (in atherosclerosis, for example) impairs nitric oxide production and leads to cardiovascular disease and, ultimately, to heart attacks and strokes. Researchers now have good evidence that hearts undergoing heart attacks have a “backup” pathway for making nitric oxide. Triggered by falling oxygen levels, enzymes in heart muscle convert nitrite stored there into nitric oxide that can then help minimize tissue damage. Nitrite in the diet comes mainly from vegetables–celery, beets, spinach, lettuce and other leafy types. Once consumed, nitrite exits the bloodstream and becomes stored in organs such as the heart, kidney and brain. But it wasn’t clear whether boosting nitrite in the diet could actually translate into better protection from heart-attack damage.

To find out, the Einstein researchers administered nitrite (50 mg/liter) in the drinking water of mice for seven days, while a comparison group of mice received a standard diet that was not supplemented with nitrite. Then, to simulate a heart attack, blood flow to the animals’ hearts was stopped for 30 minutes, followed by 24 hours of reperfusion. Examination revealed that the hearts of the nitrite-supplemented mice were significantly richer in nitrite, and heart-muscle damage was reduced by an impressive 48 percent compared with the controls. In contrast to nitrite, nitrate in the diet comes mainly from cured meats such as bacon, sausage and luncheon meats. Consuming nitrate augments our nitrite supply: Once absorbed in the bloodstream, nitrate circulates to the salivary glands where bacteria convert it to nitrite, which is then swallowed in our saliva. About 10 percent of dietary nitrate is converted to nitrite in this way. As with the mice and nitrite, the Einstein researchers spiked drinking water with nitrate and then induced heart attacks. A protective effect was found yet again: Compared with the control animals, the nitrate-supplemented mice had greater stores of nitrite in their heart muscle along with significantly less heart-muscle damage, although the reduction was not as impressive as in the nitrite-fed mice.

“This new appreciation of the health benefits of nitrite and nitrate is ironic,” says Dr. Lefer. “They’ve traditionally been regarded as toxic because they tend to form chemicals called nitrosamines, some of which are carcinogenic. But recent research has found no convincing evidence that nitrite and nitrate pose a cancer risk.” Dr. Lefer notes that Europeans’ copious consumption of vegetables puts them far ahead of us in terms of nitrite and nitrate intake: On average, Europeans consume 76 mg of nitrite and nitrate daily compared with the average American intake of 0.77 mg–nearly a 100-fold difference. “This large intake of nitrite and nitrate poses no known risks and could certainly help explain why the Mediterranean diet is heart-healthy despite its relatively high fat content,” says Dr. Lefer. Dr. Lefer says that the nitrite levels found to be cardioprotective in his study can easily be achieved by consuming more vegetables containing the chemical. That dietary change, he says, might be especially helpful for people at increased heart-attack risk–those who’ve already suffered a heart attack, have been diagnosed with cardiovascular disease or have a family history of it. “Our study suggests that building up nitrite stores in heart muscle could spell the difference between a mild heart attack and one that causes lasting heart damage or death,” says Dr. Lefer. “And since nitrite also accumulates in the brain, it could potentially help minimize the damage from strokes as well.”
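The fold-difference quoted above follows directly from the two intake figures; a one-line check:

```python
# Average daily nitrite + nitrate intake figures quoted by Dr. Lefer.
european_mg = 76.0
american_mg = 0.77

fold_difference = european_mg / american_mg  # 76 / 0.77 ≈ 98.7
print(f"Europeans consume roughly {fold_difference:.0f} times more per day")
```

The exact ratio is about 98.7, which the article rounds to "nearly a 100-fold difference".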

Science Daily
November 27, 2007

Original web page at Science Daily

Categories
News

Africa aims to halt brain drain of crop experts

If Africa is to solve the problem of getting crops to survive future floods and droughts, it will need local knowledge developed on home turf, experts say. Now that challenge is set to be tackled, with a US$ 4.9-million effort to overcome Africa’s brain drain of plant scientists. The scheme will see a new programme established in Ghana, to allow the next generation of West African crop experts to be home-grown, rather than trained abroad on foreign crops at foreign universities. The scheme, which aims to hand out around 40 doctoral degrees in plant breeding over the next five years, will create African scientists trained to study their own native crops and crop diseases. Most African crop scientists go to Europe or the United States to earn their PhD, says Eric Danquah, director of the newly established West Africa Centre for Crop Improvement at the University of Ghana, near the capital, Accra. This means that they often come home without expertise in indigenous crops such as sorghum, millet and cassava, and therefore have to learn from scratch how to breed new varieties of these species.

Danquah knows the feeling — he studied the genetics of barley at the University of Cambridge, UK. But barley is not grown anywhere in Africa. “When you go abroad, the work you do depends on the history of the department where you are working,” Danquah says. “You are compelled to study what is there. A few foreign universities provide facilities to study African crops, but it would be better to have the facilities to study African crops in Africa.” The Ghanaian centre will offer a five-year PhD programme including both research work and, unusually for a doctoral course, two years of teaching. The aim is to produce crop experts who are equipped to breed crop strains that can resist drought, disease and other environmental stresses. The centre will train eight students per year, with the first intake starting work in January 2008. A sister scheme that has been running at the University of KwaZulu-Natal, South Africa, since 2000 has also received an $8.1-million grant to continue its work. Overall, the centres should produce 120 trained scientists over the coming decade. Training crop experts in Africa will help stem the flow of talent away from the continent, says Peter Matlon of the Alliance for a Green Revolution in Africa in Nairobi, Kenya, one of the organizations that has sponsored the grants. Foreign-trained scientists often don’t stick, he says. “Either they don’t return, or they return briefly and look for the first opportunity to leave.”

Nature
October 2, 2007

Original web page at Nature

Categories
News

Futile protein cycle keeps mice thin

Mice engineered to lack a single gene set up a futile cycle of making and breaking unnecessary proteins, burning fat along the way. As a result they eat more food but weigh less than normal mice. The discovery has raised hopes of novel ways to tackle obesity and diabetes. The missing gene codes for an enzyme needed to chemically digest some amino acids — the building blocks of proteins. This results in the build-up of an amino acid called leucine, which in turn tricks cells into making new, unnecessary proteins and then destroying them. This pointless cycle burns up excess calories so the mice stay trim, regardless of the extra food they munch. The notion that making and breaking down biological molecules can waste spare calories is not new to scientists; it has been posited as an explanation for why some lucky people can naturally eat more but stay slim. But this study, published in Cell Metabolism today, is the first time wasteful protein turnover has been shown in practice.

Previous research has hinted that high-protein diets, or leucine supplements, cause weight loss — although the mechanism for how this happens is not well understood. To find out what happens when leucine is permanently increased, Christopher Lynch of the Pennsylvania State University College of Medicine in Hershey and his colleagues inactivated a gene that normally clears this amino acid from the blood. Lynch says the genetically modified mice “seemed more hungry than the other mice” and ate more, relative to their body weight. But even when they were put on a high-fat diet, which makes normal mice obese, these super-slim rodents stayed lean, carrying about half the body fat of their cage-mates. Their body temperature was increased slightly as all the extra food they craved was burnt off.

On a normal diet, the modified mice seemed healthy, weighed about 10% less than normal mice, and had a lower risk of developing diabetes. But Lynch says the animals “went crazy” when the researchers altered their diet to lower the amount of leucine they could eat. “They were sitting next to their food, panting and eating,” he says. The animals on this reduced-leucine diet were wet from sweat as they ate more and more, but still didn’t gain weight. He speculates that the animals might be using the amount of leucine in their systems to determine how much they ‘ought’ to eat to reach their optimum weight. Lynch is hopeful that harnessing wasteful protein-making in humans could be a novel way to treat obesity.

Nature
September 17, 2007

Original web page at Nature

Categories
News

The gene that makes your mouth water

Spit might have helped human evolution by enabling our ancestors to harvest more energy from starch than their primate cousins. Compared with chimpanzees, humans boast many more copies of the gene that makes salivary amylase — a saliva enzyme that breaks down starch into digestible sugars. And carbohydrate-loving societies carry more copies of the gene than those that follow low-carbohydrate diets, claims a new study in Nature Genetics. This strongly implies that people have adapted to their local environment. “High starch foods and a high starch diet have been an important evolutionary force for humans,” says George Perry, an anthropologist at Arizona State University in Tempe, who led the new analysis. The change could possibly have supported the growth in hominid brains that occurred some two million years ago, says Nate Dominy, an anthropologist at the University of California in Santa Cruz involved in the study. “Our diet must have had some shift to feed that brain,” says Dominy, who thinks root vegetables like African tubers allowed large-brained humans to flourish.

Starch, which helps to make a baked potato mushy, is an important source of food for modern humans. But without amylase in the saliva, man can make little use of such complex carbohydrates – enzymes elsewhere in the body are not as good at breaking the compounds down. Previous studies suggested that some people have more copies of the gene for amylase than others, but little was known about the importance of the extra copies. They could have been insignificant: the duplication of some genes has little or no effect on gene expression, says Dominy. To investigate, the team tested people with different numbers of amylase genes. “We took a population of undergraduates and asked them to spit into tubes, then measured the amount of amylase in their saliva,” says Dominy. Cheek swabs were used to measure the number of amylase genes. The conclusion: extra copies of the gene make more amylase — and so an enhanced ability to break down starches. When the researchers ventured beyond university campuses to sample populations in Africa, Asia, Europe and the Arctic, they noticed a trend. Cultures with diets that included high levels of starch tended to have more copies of the amylase gene than cultures that consumed few starches.

Starch-loving cultures such as the Hadza of Tanzania, who rely heavily on tubers and other root vegetables, have 6.7 copies of the amylase gene on average, while people like the Mbuti, pygmy rainforest hunter-gatherers from central Africa who eat little starch, have 5.4 copies on average. In contrast, chimpanzees, dining on fruit and little else, have just two copies of the salivary amylase gene. Comparing the human and chimp genomes hints that the multiplication of this gene in humans came hundreds of thousands of years ago, or more. Dominy speculates that perhaps the change propelled our ancestors to new heights by fuelling the evolution of large brains more than two million years ago. Alternatively, the new copies may have coincided with the rise of agriculture around 10,000 years ago, he says. More complete human genome sequences from diverse cultures are needed to firm up when and why this change took place, Dominy says.
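The copy-number comparison can be tabulated and expressed relative to the chimpanzee baseline; a minimal sketch using only the averages quoted above:

```python
# Mean salivary amylase gene copy numbers reported in the study.
mean_copies = {
    "Hadza (high-starch diet)": 6.7,
    "Mbuti (low-starch diet)": 5.4,
    "Chimpanzee (fruit diet)": 2.0,
}

chimp = mean_copies["Chimpanzee (fruit diet)"]
for group, n in mean_copies.items():
    print(f"{group:26s} {n:.1f} copies ({n / chimp:.2f}x chimpanzee)")
```

Laid out this way, the gradient the researchers describe is clear: even the low-starch Mbuti carry well over twice the chimpanzee copy number, with starch-reliant groups higher still.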

The ability to digest starch may have had the added benefit of cutting down on diarrhoea — still a major cause of death in children. “It might pay to start digesting things a bit earlier in the process to get what you can before it’s shot out of your body,” says Dominy. Such studies, linking human evolution to genetic changes, are certain to become more common, says James Sikela, a biologist at the University of Colorado Health Sciences Center in Aurora. “It’s a great example of what can be learned about our past via evolutionary genomics,” he says.

Nature
September 17, 2007

Original web page at Nature

Categories
News

Contamination of wheat could be the cause of deadly kidney failure among thousands of people in the Balkans

The contamination of the region’s wheat by the birthwort plant seems to be the source of an unusual form of kidney failure and urinary tract cancer that afflicts many people in countries such as Croatia and Serbia. Researchers believe the finding should spark action by public health officials to save lives by clearing the birthwort plant from grain fields across the region. Since it was first formally recognised in 1956, the disease called endemic Balkan nephropathy has perplexed experts, who have considered various explanations, including groundwater contamination. Unlike most patients with kidney failure, people with the Balkan illness often have healthy blood pressure. Nonetheless, as their kidneys begin to fail they require dialysis and about half of them eventually develop a rare cancer of the upper urinary tract.

Arthur Grollman, at the University at Stony Brook, New York, US, did not expect to discover birthwort as the cause of this kidney disease when he set out for the region a few years ago. Instead, he had hypothesised that herbal remedies were to blame for this nephropathy. He knew about the disastrous mix-up in which thousands of healthy Belgian women developed severe kidney failure requiring transplants after accidentally receiving the Chinese herbal drug guang fang ji instead of the similar-sounding han fang ji. Grollman suspected a similar cause behind the endemic Balkan nephropathy, so he surveyed patients in dialysis clinics in the region on whether they had taken any herbal medicines. But none reported taking such supplements. Disappointed his theory had proved wrong, Grollman headed for home – but not before killing a final afternoon in a library in Zagreb, the Croatian capital.

There, he came across a striking description from the 1930s about how horses in the region had developed kidney failure after grazing on Aristolochia clematitis, the plant known as birthwort. Grollman immediately cancelled his flight and set off to meet Balkan farmers. A survey of their fields and mills revealed that some of their wheat was indeed contaminated with Aristolochia clematitis seed. Back in the lab, Grollman and his colleagues examined kidney samples from Croatian nephropathy patients. They found the same telltale signs of DNA damage linked to Aristolochia clematitis as seen in animal studies. For example, they found specific cancerous mutations in a gene called p53 known to arise from aristolochic acid, a chemical found in birthwort. And chemical analysis revealed metabolites of the plant stuck to the patients’ DNA in a disruptive way. Grollman says 100,000 people are at risk of nephropathy in the Balkans as a result of inadvertently eating birthwort. He says he would “like to make sure that the public health officials take action on this”. Local farmers need government assistance to weed out the plant as many of them cannot afford herbicide, he adds. Grollman also notes that birthwort has been used as a herbal remedy, being given to mothers following birth because it reputedly helped to expel the placenta. But he doubts birthwort has any genuine medicinal value and stresses that “the toxic effects far outweigh any potential good effects”.

New Scientist
August 21, 2007

Original web page at New Scientist

Categories
News

Old eggs show Adélie penguins switched from fish to shrimp 200 years ago

Ancient eggshell fragments show that Adélie penguins living in Antarctica switched from eating fish to krill around the time that humans began hunting seals and whales. The finding suggests that when humans removed krill-eating predators the penguins exploited the resulting shrimp surplus. Steven Emslie of the University of North Carolina in Wilmington, and William Patterson of the University of Saskatchewan in Saskatoon, analysed more than 220 fossil eggshell pieces ranging from 100 to 38,000 years old, and compared them with samples from modern nests. By comparing the proportion of certain forms of carbon and nitrogen in the shells with the proportions found in fish and krill, the researchers could tell what the birds had been eating. Emslie expected to find changes in diet matching climate change. Instead, the penguin menu remained biased towards fish until about 200 years ago, when the birds switched to krill. Recent global warming and the rise in krill fisheries have reduced krill stocks and could be contributing to the decline in Adélie penguin populations on the Antarctic Peninsula, says Emslie. The study is published in the Proceedings of the National Academy of Sciences.

From 1793 to 1807, an estimated 3.2 million seals were taken from the Southern Ocean. The resulting crash in the seal population — including the Antarctic fur seal Arctocephalus gazella, which fed primarily on krill — caused the industry to collapse. Whaling took off in the 1800s and continued until the mid-twentieth century, eventually depleting baleen whale populations by more than 90%. It’s estimated that the combined harvest of seals and whales resulted in more than 150 million tonnes of extra krill each year. Krill is an attractive food for penguins because it is high in protein and tends to travel in swarms. “The birds can capture lots of high-energy prey in a short time,” says Emslie. “This implies a huge ecological dietary response by the penguins in relation to some change in their environment,” says Keith Hobson of the Canadian Wildlife Service in Saskatchewan. But the reasons behind this switch are less obvious, he says.

“Abundance of a secondary food item does not necessarily explain this unless it was accompanied by a reduction in fish,” says Hobson. “Why does it matter that krill became more abundant to a predator that previously happily made eggs from fish?” The switch might be explained by an increase in fishing, says David Ainley of the California ecological consulting firm H. T. Harvey & Associates. “Not only were whales and seals removed,” he says, “there was a massive removal of fish from the Scotia Sea and western Antarctic Peninsula region at the same time.” Many of these fish ate krill, so their removal would have further boosted krill supplies. Their dietary flexibility demonstrates the penguins’ ability to adjust to large ecological changes, but that doesn’t mean they’ll survive the changes to come, says Hobson. “I remain a pessimist when it comes to how they may now cope with the onslaught of climate change,” he says.

Nature
August 21, 2007

Original web page at Nature

Categories
News

US vets seeing more horses with nutritional issues this year

While much of the Midwest has recovered from the drought that parched the area last year, horses are continuing to experience effects from the hot dry summer of 2006. Due to a bad hay crop, University of Missouri-Columbia veterinarians are reporting an increased number of horses with chronic selenosis and vitamin E deficiency, problems that can be fatal. “Last year’s drought meant that Missouri’s hay crop, which is usually balanced very well for a horse’s nutrition, was much poorer than usual,” said Philip Johnson, professor of veterinary medicine and surgery. “Because of the poor Missouri hay crop, horse owners imported hay from other states nearby and possibly fed their horses hay that was too high in selenium. This can have very grave consequences for horses. Owners also may have fed their horses poor quality hay from Missouri or other places, which led to deficiencies in vitamin E, another very dangerous problem for horses.”

Selenium is a naturally occurring element and is an essential part of horse diets. However, too much or too little can create problems for a horse. When chronic selenosis, or selenium poisoning, occurs from eating too much of the element, horses can lose the hair in the mane and tail and develop a form of laminitis, a painful condition that affects the hoof. If left untreated for too long, a horse with chronic selenosis may require euthanasia as a result of severe laminitis. Johnson said that the amount of selenium in hay can vary by county throughout the nation, but that Missouri hay typically has just the right amount of the essential element. For a small fee, horse owners can have their hay tested to determine if it has the right amount of selenium in it.

In addition, hay that is not fresh can lack vitamin E, an antioxidant that is important for nerve health in a horse. Some horse owners unknowingly compensate for this deficiency by feeding their animals nutritional supplements. Those horses that suffer from a vitamin E deficiency typically show symptoms that include weakness, loss of weight, trembling and changes in the retina at the back of the eyeball. A quick blood test can determine if the animal is suffering from a vitamin E deficiency. Johnson recommends that horse owners who imported hay from unknown sources last year either have the hay tested or keep a close watch on their horses. Horses that do not have access to green grass and that are being fed old yellow hay are at risk.

“Usually, by the time the horse is showing symptoms of either problem, it may be too late to reverse the disease completely,” Johnson said. “However, if a horse owner has other horses that are feeding from the same food source, it’s important to have those animals treated before the damage is permanent.” Craig Roberts, a professor of agronomy in the College of Agriculture, Food and Natural Resources at MU, says the quantity of this year’s hay crop will be down 50 percent to 75 percent from normal, but the nutritional value will be good. “Last year, we had the drought, which affected both the quantity and the quality of the hay,” Roberts said. “This year, we had a late freeze, which mainly affects the yield. Overall, we will be down, but the drought last year was far worse.”

Science Daily
June 26, 2007

Original web page at Science Daily

Categories
News

For combating cholera, rice is nice

Japanese researchers say they have used a staple of their nation’s diet–rice–to develop what could become an effective, safe, and inexpensive vaccine against cholera. The new vaccine, which would be taken as a pill and does not require refrigeration, could pave the way for similar vaccines for diseases that also affect the body’s mucous tissues, such as influenza, botulism, and even anthrax. Cholera, a disease of the intestinal lining, continues to ravage populations in the developing world–even though it is easily treated with fluids and antibiotics. Caused by consuming water or food contaminated with Vibrio cholerae bacteria, the disease unleashes severe diarrhea and accompanying dehydration. Researchers at the University of Tokyo engineered two strains of domestic rice to carry the gene for CTB, one of the primary cholera proteins, to help the body develop immunity to infection. Tests with laboratory mice fed varying doses of the genetically altered rice showed that the new vaccine, called MucoRice, immediately protected them from cholera, whereas the control group became infected, the team reports online this week in the Proceedings of the National Academy of Sciences.

The team selected rice as the vaccine vector because the grain survives digestion in the stomach, which breaks down most other foods. That allows the CTB protein to travel to the intestinal mucosa, where it activates the body’s immune system, says microbiologist and team member Hiroshi Kiyono. Ground up as a powder and delivered in tablet or capsule form, the rice-based vaccine could easily be taken orally to eliminate the need for syringes–which can cause secondary infections and create disposal problems. People cannot eat the rice directly, however, Kiyono says, because they would overdose on the vaccine. “We do not have any plans to deliver the vaccine as a form of steamed rice,” he jokes. Other advantages of the vaccine are that it can be stored without refrigeration for at least a year and a half, and rice is easily grown in areas where cholera is rampant.

The research follows other efforts over the past decade to develop oral vaccines that can bypass the stomach to treat diseases attacking the intestinal lining, says plant biotechnologist Hugh Mason of Arizona State University in Tempe. He thinks using engineered rice offers promise against cholera and similar pathogens for two reasons. First, Mason says, rice is edible, so the vaccine requires little purification, and second, proteins like CTB are fairly resistant to the body’s digestive actions, whereas others are less effective in the harsh environment of the stomach.

ScienceNow
June 26, 2007

Original web page at ScienceNow

Categories
News

Fruit proves better than vitamin C alone

If you’re in the market for an antioxidant to keep your body young and healthy, new research suggests you’d be much better off with oranges than vitamin C tablets. Although vitamin C is best known for its protection against scurvy and, possibly, the common cold, fruits rich in vitamin C are also powerful antioxidants that protect cellular DNA from being damaged by oxidation. Going without such foods leads to DNA damage long before the iconic bleeding gums of scurvy are seen. But do vitamin C pills on their own have the same protective effect as fruit? Serena Guarnieri and a team of researchers in the Division of Human Nutrition at the University of Milan, Italy, designed a simple experiment to find out.

The team gave test subjects a single glass of blood-orange juice, vitamin-C-fortified water, or sugar water to drink. The blood-orange juice and the fortified water had 150 milligrams of vitamin C each, whereas the sugar water had none. Blood samples were taken from the test subjects 3 hours and 24 hours after their drink. Unsurprisingly, blood plasma vitamin C levels went up after drinking both the juice and the fortified water. The blood samples were then exposed to hydrogen peroxide, a substance known to cause DNA damage through oxidation. The damage was significantly less in samples taken from volunteers who had ingested orange juice, both 3 hours and 24 hours after consumption. Unsurprisingly, the sugar water had no protective effect. But neither did the vitamin-C-fortified water.

At least one other study, which looked at larger quantities of vitamin C, has shown a protective effect from the vitamin alone. But the fact that it doesn’t show up here indicates that something more complicated is going on, says Guarnieri. “It appears that vitamin C is not the only chemical responsible for antioxidant protection; there is something more at work here,” she says. The finding is reported in the British Journal of Nutrition. “It is an important observation,” says David Heber, director of the Center for Human Nutrition at the University of California, Los Angeles. It suggests that people studying the effects of the vitamin should be careful to note where in the diet it comes from. “Vitamin C is provided in a matrix in fruits with many other beneficial substances,” he says, and all of these may interact with each other.

Nature
May 1, 2007

Original web page at Nature

Categories
News

Calorie restriction increases muscle mitochondrial biogenesis in healthy humans

Life expectancy (the average life span) greatly increased during the 20th century in most countries, largely due to improved hygiene, nutrition, and health care. One possible approach to further increase human life span is “caloric restriction.” A calorie-restricted diet provides all the nutrients necessary for a healthy life but minimizes the energy (calories) supplied in the diet. This type of diet increases the life span of mice and delays the onset of age-related chronic diseases such as heart disease and stroke. There are also hints that people who eat a calorie-restricted diet might live longer than those who overeat. People living in Okinawa, Japan, have a lower energy intake than the rest of the Japanese population and an extremely long life span. In addition, calorie-restricted diets beneficially affect several biomarkers of aging, including insulin sensitivity (decreased sensitivity to insulin is a precursor to diabetes). But how might caloric restriction slow aging? A major factor in the age-related decline of bodily functions is the accumulation of “oxidative damage” in the body’s proteins, fats, and DNA. Oxidants—in particular, chemicals called “free radicals”—are produced when food is converted to energy by cellular structures called mitochondria. One theory for how caloric restriction slows aging is that it lowers free-radical production by inducing the formation of efficient mitochondria.

Despite hints that caloric restriction might have similar effects in people as in rodents, there have been few well-controlled studies on the effect of good quality calorie-reduced diets in healthy people. It is also unknown whether an energy deficit produced by increasing physical activity while eating the same amount of food has the same effects as caloric restriction. Finally, it is unclear how caloric restriction alters mitochondrial function. The Comprehensive Assessment of Long-term Effects of Reducing Intake of Energy (CALERIE) organization is investigating the effect of caloric restriction interventions on physiology, body composition, and risk factors for age-related diseases. In this study, the researchers have tested the hypothesis that short-term caloric deficit (with or without exercise) increases the efficiency of mitochondria in human muscle.

The researchers enrolled 36 healthy overweight but non-obese young people into their study. One-third of them received 100% of their energy requirements in their diet; the caloric restriction (CR) group had their calorie intake reduced by 25%; and the caloric restriction plus exercise (CREX) group had their calorie intake reduced by 12.5% and their energy expenditure increased by 12.5%. The researchers found that a 25% caloric deficit for six months, achieved by diet alone or by diet plus exercise, decreased 24-hour whole body energy expenditure (i.e., overall calories burned for body function), which suggests improved mitochondrial function. Their analysis of genes involved in mitochondria formation indicated that CR and CREX both increased the number of mitochondria in skeletal muscle. Both interventions also reduced the amount of DNA damage—a marker of oxidative stress—in the participants’ muscles.
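
The arithmetic behind the two intervention arms is simple enough to sketch. The following Python snippet shows why the diet-only and diet-plus-exercise arms impose the same net energy deficit; the function name and the 2,400 kcal/day baseline are illustrative assumptions, not figures from the study:

```python
def daily_energy_balance(requirement_kcal, diet_cut=0.0, exercise_increase=0.0):
    """Return (intake, expenditure, deficit) for one study arm.

    diet_cut and exercise_increase are fractions of the baseline
    energy requirement (e.g. 0.25 for a 25% reduction in intake).
    """
    intake = requirement_kcal * (1 - diet_cut)
    expenditure = requirement_kcal * (1 + exercise_increase)
    return intake, expenditure, expenditure - intake

# Hypothetical 2,400 kcal/day baseline requirement:
baseline = 2400
control = daily_energy_balance(baseline)                       # 100% of requirements
cr = daily_energy_balance(baseline, diet_cut=0.25)             # CR: 25% diet cut
crex = daily_energy_balance(baseline, diet_cut=0.125,
                            exercise_increase=0.125)           # CREX: 12.5% + 12.5%

# Both interventions produce the same net deficit (25% of baseline):
assert cr[2] == crex[2] == 0.25 * baseline
```

The point of the sketch is that a 600 kcal/day deficit can be reached either entirely through the diet or split evenly between eating less and moving more, which is what lets the study compare the two routes to the same caloric deficit.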

These results indicate that a short-term caloric deficit, whether achieved by diet or by diet plus exercise, induces the formation of “efficient mitochondria” in people just as in rodents. The induction of these efficient mitochondria in turn reduces oxidative damage in skeletal muscles. Consequently, this adaptive response to caloric restriction might have the potential to slow aging and increase longevity in humans as in other animals. However, this six-month study obviously provides no direct evidence for this, and, by analogy with studies in rodents, an increase in longevity might require lifelong caloric restriction. The results here suggest that even short-term caloric restriction can produce beneficial physiological changes, but more research is necessary before it becomes clear whether caloric restriction should be recommended to healthy individuals.

PLoS Medicine
March 20, 2007

Original web page at PLoS Medicine

Categories
News

Study shows hops helps broiler growth

University of Arkansas poultry scientists say a touch of hops, an herb used in beer, might stimulate broiler growth. The findings, published in the International Journal of Poultry Science, showed how hops might work as a substitute for growth-promoting antibiotics in broiler diets. Although the response from the addition of hops was not as great as that obtained from the antibiotic treatment, it was reported to be significantly greater than that of birds fed a control diet with neither supplement. UA scientists Susan Watkins and Park Waldroup, along with graduate students Jana Cornelison and Frances Yan, conducted the research at the Division of Agriculture’s Center of Excellence for Poultry Science.

Science Daily
April 25, 2006

Original web page at Science Daily

Categories
News

Calcium and dairy unlikely to aid in weight loss

A new study does not support the theory that a boost in calcium intake or dairy consumption is useful for maintaining or losing weight. “Media have been promoting dairy to lose weight and therefore this topic has gained a lot of importance,” Dr. Swapnil N. Rajpathak, who led the study, told Reuters Health. “At this time, there is not enough justification to increase dairy intake to lose weight,” the researcher warned. “Calcium and therefore dairy — the best source of calcium in the diet — may be associated with weight loss, based on some data suggesting that calcium has some role in fat synthesis,” Rajpathak explained. However, the results of studies of calcium and dairy intake in relation to weight loss have been inconsistent.

Rajpathak, from the Albert Einstein College of Medicine of Yeshiva University in New York and colleagues looked for links between baseline calcium intake and change in calcium intake and weight change over time in about 43,000 healthy middle-aged men participating in the Health Professionals Follow-up Study — a prospective study launched in 1986 to evaluate the role of diet in chronic diseases. After taking into account multiple potentially confounding factors, baseline or change in intake of total calcium was not significantly associated with a change in weight, the team reports in the American Journal of Clinical Nutrition. “We found that men who increased their dairy/calcium intake did not lose more weight — in fact, they gained slightly more weight — in the 12-year period,” Rajpathak told Reuters Health. This was primarily due to an increase in high-fat dairy intake.

However, even low-fat dairy intake was not significantly associated with a change in weight. The take home message, said Rajpathak, is that “calcium is important for optimal health and people should have adequate calcium from diet (including dairy) or use supplements if they wish. Importantly, it is advisable to consume dairy in low-fat form.” However, increasing calcium in the diet as a means to lose weight is not advisable, the researcher said.

Source: American Journal of Clinical Nutrition, March 2006.

Reuters
April 11, 2006

Original web page at Reuters

Categories
News

Veggies contain chemicals that boost DNA repair and protect against cancer

Need another reason to eat your vegetables? New research shows that some of them contain chemicals that appear to enhance DNA repair in cells, which could lead to protection against cancer development, say Georgetown University Medical Center researchers. In a study published in the British Journal of Cancer (published by the research journal Nature) the researchers show that in laboratory tests, a compound called indole-3-carbinol (I3C), found in broccoli, cauliflower and cabbage, and a chemical called genistein, found in soybeans, can increase the levels of BRCA1 and BRCA2 proteins that repair damaged DNA.

Although the health benefits of eating your vegetables—especially cruciferous ones, such as broccoli—aren’t particularly new, this study is one of the first to provide a molecular explanation as to how eating vegetables could cut a person’s risk of developing cancer, an association that some population studies have found, says the study’s senior author, Eliot M. Rosen, MD, PhD, professor of oncology, cell biology, and radiation medicine at Georgetown’s Lombardi Comprehensive Cancer Center. “It is now clear that the function of crucial cancer genes can be influenced by compounds in the things we eat,” Rosen says. “Our findings suggest a clear molecular process that would explain the connection between diet and cancer prevention.”

In this study, Rosen exposed breast and prostate cancer cells to increasing doses of I3C and genistein, and found that these chemicals boosted production of BRCA1, as well as its sister repair protein, BRCA2. Mutations in either of these genes can lead to development of breast, prostate and ovarian cancers. Since decreased amounts of the BRCA proteins are seen in cancer cells, higher levels might prevent cancer from developing, Rosen says, adding that the ability of I3C and genistein to increase production of BRCA proteins could explain their protective effects.

Science Daily
February 28, 2006

Original web page at Science Daily

Categories
News

Stick to wild salmon unless heart disease is a risk factor

On the one hand, farmed salmon has more heart-healthy omega-3 fatty acids than wild salmon. On the other hand, it also tends to have much higher levels of chemical contaminants that are known to cause cancer, memory impairment and neurobehavioral changes in children. What’s a consumer to do? In general, a new study shows that the net benefits of eating wild Pacific salmon outweigh those of eating farmed Atlantic salmon, when the risks of chemical contaminants are considered, although there are important regional differences.

Those are the conclusions of Barbara Knuth, Cornell professor of natural resources who specializes in risk management associated with chemical contaminants in fish, and Steven Schwager, Cornell associate professor of biological statistics and computational biology and an expert in sampling design and statistical analysis of comparative data. The two have co-authored a benefit-risk analysis of eating farmed versus wild salmon in the Journal of Nutrition (November, Vol. 135). “None of us [study authors] argues that the benefits of salmon are not real. But the dirty little secret is that there are risks,” said Schwager, noting that even taking into account the risks, the benefits of salmon may be particularly worthwhile for some groups.

“For a middle-aged guy who has had a coronary and doesn’t want to have another one, the risks from pollutants are minor ones, and the omega-3 benefits him in a way that far outstrips the relatively minor risks of the pollutants,” he said. “But for people who are young — and they’re at risk of lifetime accumulation of pollutants that are carcinogenic — or pregnant women — with the risks of birth defects and IQ diminution and other kinds of damage to the fetus — those risks are great enough that they outweigh the benefits.” Knuth added: “Because we found regional differences in contaminants in farmed salmon, with Chilean salmon showing the lowest levels and European (particularly Scottish) farmed salmon showing the highest levels, careful consumers with a history of heart disease could choose farmed salmon from Chile for their high omega-3 content and relatively lower level of contaminants.” She noted that farmed salmon from North America would be a better second choice than European farmed salmon.

The researchers’ benefit-risk analysis showed that consumers should not eat farmed fish from Scotland, Norway and eastern Canada more than three times a year; farmed fish from Maine, western Canada and Washington state no more than three to six times a year; and farmed fish from Chile no more than about six times a year. Wild chum salmon can be consumed safely as often as once a week; pink, sockeye and coho salmon about twice a month; and chinook just under once a month.
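
As a minimal sketch, that guidance can be encoded as a lookup table. The dictionary keys, the rounding of the “three to six” range up to six, the conversion of “once a week” and “twice a month” into meals per year, and the function name are all my assumptions; only the cutoffs themselves come from the text:

```python
# Approximate maximum salmon meals per year from the benefit-risk analysis,
# transcribed from the article (ranges rounded to their upper bound).
MAX_MEALS_PER_YEAR = {
    # farmed, by region of origin
    "farmed:Scotland": 3,
    "farmed:Norway": 3,
    "farmed:eastern Canada": 3,
    "farmed:Maine": 6,
    "farmed:western Canada": 6,
    "farmed:Washington": 6,
    "farmed:Chile": 6,
    # wild Pacific species
    "wild:chum": 52,      # about once a week
    "wild:pink": 24,      # about twice a month
    "wild:sockeye": 24,
    "wild:coho": 24,
    "wild:chinook": 11,   # just under once a month
}

def within_guideline(kind, meals_per_year):
    """True if a planned consumption rate stays within the guidance."""
    return meals_per_year <= MAX_MEALS_PER_YEAR[kind]

print(within_guideline("wild:chum", 50))      # True
print(within_guideline("farmed:Norway", 12))  # False
```

Encoding the limits this way also makes the study’s central contrast visible at a glance: the ceiling for the most contaminated farmed salmon is roughly an order of magnitude lower than for wild chum.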

In a study published last spring (Environmental Health Perspectives, May 2005), the research team reported that the levels of chlorinated pesticides, dioxins, PCBs and other contaminants are up to 10 times greater in farm-raised salmon than in wild Pacific salmon, and that salmon farmed in Europe are more contaminated than salmon from South and North American farms. The team also published a study this fall (Environmental Science and Technology, Vol. 39:8622) that found that farmed salmon, on average, contain roughly two to three times more beneficial fatty acids than wild salmon, presumably because of the differences in the diet on which the fish are raised. “Our results also support the need for policy and regulatory efforts to limit pollution of our waters and clean up pollution that has occurred, and thus ultimately reduce the risk side of this equation by reducing the potential for human exposure to these contaminants,” said Knuth, adding that the country of origin of fish sold should be clearly labeled so consumers can make informed decisions.

Science Daily
January 17, 2006

Original web page at Science Daily

Categories
News

Rats’ response to ‘stop snacking’ signal diminished by high fat diet

Rats fed a high fat diet were less sensitive to a hormonal ‘stop eating’ signal than rats on a low fat diet when they were given access to a high calorie, high fat snack that the animals find yummy. Dr. Mihai Covasa, assistant professor of nutritional sciences and a member of the Penn State Neuroscience Institute, led the study. He says, “When we gave the rats doses of a ‘stop eating’ hormone, the rats on the low fat diet significantly suppressed their intake of the snack but not the rats on the high fat diet.” Covasa adds, “These results suggest that a long-term, high-fat diet may actually promote short-term overconsumption of highly palatable foods high in dietary fat by reducing sensitivity to at least one important feedback signal which would ordinarily limit eating.” The results are detailed in the current (August) issue of the Journal of Nutrition in a paper, “Adaptation to a High-Fat Diet Leads to Hyperphagia and Diminished Sensitivity to Cholecystokinin in Rats.” The authors are David M. Savastano, who recently earned his master’s degree under Covasa’s direction, and Covasa.

The ‘stop eating’ hormone used in the study was cholecystokinin, or CCK. CCK is released by cells in the small intestine when fat or protein is present. The hormone’s release activates nerves that connect the intestine with the brain, where the decision to stop eating is made. Previous studies with human subjects showed that those on a high fat diet have more CCK in their bloodstream but are less responsive to it. They typically report feeling increased hunger and declining fullness, and eat more. No human study of snacking and CCK has been reported. This study, with rats, is the first to link diminished sensitivity to CCK following exposure to a high fat diet with overconsumption of a high calorie, high fat snack.

In the current study, the rats were only given access to the high calorie, high fat snack for three hours a day. The rest of the time they received either low fat or high fat rat chow. The high and low fat chows were regulated so that they were equivalent in calories, and both groups of rats gained weight at the same rate. Even though the rats on the high fat diet ate, on average, 40 percent more of the high calorie, high fat snack than the rats on the low fat diet, they didn’t gain extra weight. Rats, unlike humans, cut back on their usual chow when they snack. Covasa says, “Rats are notorious in compensating for food to maintain a constant body weight. Although adaptation to the high fat diet led to overconsumption of the high calorie, high fat snack, there was no difference in weight gain between the two groups of rats during the 20 days of testing because the rats compensated by eating less of their maintenance diet.”

Science Daily
August 16, 2005

Original web page at Science Daily