Rats have a double view of the world

Scientists from the Max Planck Institute for Biological Cybernetics in Tübingen, using miniaturised high-speed cameras and high-speed behavioural tracking, discovered that rats move their eyes in opposite directions in both the horizontal and the vertical plane when running around. Each eye moves in a different direction, depending on the change in the animal’s head position. An analysis of both eyes’ fields of view found that the eye movements exclude the possibility that rats fuse the visual information into a single image, as humans do. Instead, the eyes move in such a way that the space above the animal is permanently in view — presumably an adaptation that helps rats deal with the major threat posed by predatory birds in their natural environment.

Like many mammals, rats have their eyes on the sides of their heads. This gives them a very wide visual field, useful for detecting predators. However, three-dimensional vision requires the visual fields of the two eyes to overlap. The visual system of these animals therefore has to meet two conflicting demands at the same time: maximum surveillance on the one hand and detailed binocular vision on the other. The research team from the Max Planck Institute for Biological Cybernetics have now, for the first time, observed and characterised the eye movements of freely moving rats. They fitted minuscule cameras weighing only about one gram to the animals’ heads, which could record the lightning-fast eye movements with great precision. The scientists also used a second new method to measure the position and direction of the head, enabling them to reconstruct the rats’ exact line of view at any given time.

The Max Planck scientists’ findings came as a complete surprise. Although rats process visual information from their eyes through brain pathways very similar to those of other mammals, their eyes evidently move in a totally different way. “Humans move their eyes in a very stereotypical way for both counteracting head movements and searching around. Both our eyes move together and always follow the same object. In rats, on the other hand, the eyes generally move in opposite directions,” explains Jason Kerr from the Max Planck Institute for Biological Cybernetics. In a series of behavioural experiments, the neurobiologists also discovered that the eye movements largely depend on the position of the animal’s head. “When the head points downward, the eyes move back, away from the tip of the nose. When the rat lifts its head, the eyes look forward: cross-eyed, so to speak. If the animal puts its head on one side, the eye on the lower side moves up and the other eye moves down,” says Jason Kerr. In humans, the directions in which the two eyes look must be precisely aligned, otherwise an object cannot be fixated. Even a deviation of less than a single degree in the field of view is enough to cause double vision. In rats, the opposing movements of the left and right eye mean that the lines of vision can differ by as much as 40 degrees in the horizontal plane and up to 60 degrees in the vertical plane.
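
For a concrete feel for the geometry, here is a toy model of the opposing eye movements just described. It is a qualitative sketch only: the gains, clamping ranges, and sign conventions are assumptions chosen to match the directions and the 40- and 60-degree figures reported above, not the measured transfer functions from the study.

```python
# Toy model of the opposing eye movements of a rat, per the description
# above. Gains and limits are illustrative assumptions, not fitted values.

def eye_positions(head_pitch_deg, head_roll_deg):
    """Return (left_eye, right_eye) gaze offsets in degrees.

    Each offset is (horizontal, vertical): positive horizontal points
    toward the nose, positive vertical points upward. Positive pitch is
    head up; positive roll is a tilt to the animal's right.
    """
    # Head up -> both eyes converge toward the nose ("cross-eyed");
    # head down -> both eyes move back, away from the tip of the nose.
    # Clamp at +/-20 deg per eye, ~40 deg divergence between the eyes.
    horiz = max(-20.0, min(20.0, 0.5 * head_pitch_deg))

    # Tilt -> the eye on the lower side moves up, the other moves down.
    # Clamp at +/-30 deg per eye, ~60 deg divergence in the vertical plane.
    vert = max(-30.0, min(30.0, 0.75 * head_roll_deg))

    left_eye = (horiz, -vert)   # upper eye in a rightward tilt: moves down
    right_eye = (horiz, vert)   # lower eye in a rightward tilt: moves up
    return left_eye, right_eye

print(eye_positions(30.0, 20.0))  # head raised 30 deg and tilted right 20 deg
```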

The consequence of these unusual eye movements is that, irrespective of vigorous head movements in all planes, the eyes move in such a way that the area above the animal remains in view of both eyes simultaneously — something that does not occur in any other region of the rat’s visual field. These unusual eye movements appear to be the visual system’s way of adapting to the animals’ living conditions, given that they are preyed upon by numerous species of birds. Although the observed eye movements prevent the fusion of the two visual fields, the scientists postulate that permanent visibility in the direction of potential airborne attackers dramatically increases the animals’ chances of survival.

Science Daily
June 11, 2013

Original web page at Science Daily

Novel disease in songbirds demonstrates evolution in the blink of an eye

A novel disease in songbirds has rapidly evolved to become more harmful to its host on at least two separate occasions in just two decades, according to a new study. The research provides a real-life model to help understand how diseases that threaten humans can be expected to change in virulence as they emerge. “Everybody who’s had the flu has probably wondered at some point, ‘Why do I feel so bad?’” said Dana Hawley of Virginia Tech, the lead author of the study to be published in PLOS Biology on May 28, 2013. “That’s what we’re studying: Why do pathogens cause harm to the very hosts they depend on? And why are some life-threatening, while others only give you the sniffles?” Disease virulence is something of a paradox. In order to spread, viruses and bacteria have to reproduce in great numbers. But as their numbers increase inside a host’s body, the host gets more and more ill. So a highly virulent disease runs the risk of killing or debilitating its hosts before they get a chance to pass the bug along. A pathogen finds the right balance through evolution, and the new study shows this can happen in just a few years.
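
The trade-off described here is often formalized as the classic transmission-virulence model, in which a pathogen’s expected number of secondary infections peaks at intermediate virulence. The sketch below is purely illustrative: the saturating transmission function and all parameter values are assumptions, not estimates from the House Finch study.

```python
# Minimal transmission-virulence trade-off sketch (illustrative values).
import numpy as np

recovery = 0.1                             # host recovery rate, assumed
c, k = 1.0, 0.5                            # transmission parameters, assumed

virulence = np.linspace(0.01, 2.0, 200)    # disease-induced host death rate
beta = c * virulence / (virulence + k)     # replicating harder -> more transmissible
R0 = beta / (recovery + virulence)         # expected secondary infections

best = virulence[np.argmax(R0)]
print(f"R0 peaks at intermediate virulence ~{best:.2f}: too mild spreads "
      "poorly, too harsh kills or sidelines hosts before they can transmit")
```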

Hawley and her coauthors studied House Finch eye disease, a form of conjunctivitis, or pinkeye, caused by the bacterium Mycoplasma gallisepticum. It first appeared around Washington, D.C., in the 1990s. The House Finch is native to the Southwest but has spread to towns and backyards across North America. The bacterium is not harmful to humans, which makes it a good model for studying the evolution of dangerous diseases such as SARS, Ebola, and avian flu. “There’s an expectation that a very virulent disease like this one will become milder over time, to improve its ability to spread. Otherwise, it just kills the host and that’s the end of it for the organism,” said André Dhondt, director of Bird Population Studies at the Cornell Lab of Ornithology and a coauthor of the study. “House Finch eye disease gave us an opportunity to test this—and we were surprised to see it actually become worse rather than milder.” The researchers used frozen bacterial samples taken from sick birds in California and on the Eastern Seaboard at five time points between 1994 and 2010, as the pathogen was evolving and spreading. The samples came from an archive maintained by coauthor David Ley of North Carolina State University, who first isolated and identified the causative organism. The team experimentally infected wild-caught House Finches, allowing them to measure how sick the birds got with each sample. They kept the birds in cages as they fell ill and then recovered (none of the birds died from the disease).

Contrary to expectations, they found that in both regions the disease had evolved to become more virulent over time. Birds exposed to later disease strains developed more swollen eyes that took longer to heal. In another intriguing finding, it was a less-virulent strain that spread westward across the continent. Once established in California, the bacteria again began evolving higher virulence. In evolutionary terms, some strains of the bacteria were better adapted to spreading across the continent, while others were more suited to becoming established in one spot. “For the disease to disperse westward, a sick bird has to fly a little farther, and survive for longer, to pass on the infection. That will select for strains that make the birds less sick,” Hawley said. “But when it gets established in a new location, there are lots of other potential hosts, especially around bird feeders. It can evolve toward being a nastier illness because it’s getting transmitted more quickly.”

House Finch eye disease was first observed in 1994, when bird watchers reported birds with weepy, inflamed eyes to Project FeederWatch, a citizen-science study run by the Cornell Lab. Though the disease does not kill birds directly, it weakens them and makes them easy targets for predators. The disease quickly spread south along the Eastern Seaboard, north and west across the Great Plains, and down the West Coast. By 1998 the House Finch population in the eastern United States had dropped by half—a loss of an estimated 40 million birds. Bird watchers can do their part to help House Finches and other backyard birds by washing their feeders in a 10 percent bleach solution twice a month.

Science Daily
June 11, 2013

Original web page at Science Daily

Eyes work without connection to brain: Ectopic eyes function without natural connection to brain

For the first time, scientists have shown that transplanted eyes located far outside the head in a vertebrate animal model can confer vision without a direct neural connection to the brain. Biologists at Tufts University School of Arts and Sciences used a frog model to shed new light — literally — on one of the major questions in regenerative medicine, bioengineering, and sensory augmentation research. “One of the big challenges is to understand how the brain and body adapt to large changes in organization,” says Douglas J. Blackiston, Ph.D., first author of the paper “Ectopic Eyes Outside the Head in Xenopus Tadpoles Provide Sensory Data For Light-Mediated Learning,” in the February 27 issue of the Journal of Experimental Biology. “Here, our research reveals the brain’s remarkable ability, or plasticity, to process visual data coming from misplaced eyes, even when they are located far from the head.” Blackiston is a post-doctoral associate in the laboratory of co-author Michael Levin, Ph.D., professor of biology and director of the Center for Regenerative and Developmental Biology at Tufts University.

Levin notes, “A primary goal in medicine is to one day be able to restore the function of damaged or missing sensory structures through the use of biological or artificial replacement components. There are many implications of this study, but the primary one from a medical standpoint is that we may not need to make specific connections to the brain when treating sensory disorders such as blindness.” In this experiment, the team surgically removed donor embryo eye primordia, marked with fluorescent proteins, and grafted them into the posterior region of recipient embryos. This induced the growth of ectopic eyes. The recipients’ natural eyes were removed, leaving only the ectopic eyes. Fluorescence microscopy revealed various innervation patterns but none of the animals developed nerves that connected the ectopic eyes to the brain or cranial region. To determine if the ectopic eyes conveyed visual information, the team developed a computer-controlled visual training system in which quadrants of water were illuminated by either red or blue LED lights. The system could administer a mild electric shock to tadpoles swimming in a particular quadrant. A motion tracking system outfitted with a camera and a computer program allowed the scientists to monitor and record the tadpoles’ motion and speed.
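
As a rough illustration of how such a conditioning rig operates, here is a minimal sketch of its control loop. Every class, method, and parameter below is hypothetical; the published system’s hardware, timing, and shock settings are not reproduced here.

```python
# Hypothetical control loop for a quadrant-based visual training arena:
# light one quadrant red, track the tadpole, pair red with a mild shock.
import random
import time

class TrainingArena:
    def __init__(self):
        self.red_quadrant = random.randrange(4)   # red-lit quadrant; others blue

    def locate_tadpole(self):
        # Stand-in for the camera-based motion-tracking system.
        return random.randrange(4)

    def deliver_shock(self):
        print("mild shock delivered")

    def run_trial(self, duration_s=60, poll_s=1.0):
        end = time.time() + duration_s
        while time.time() < end:
            if self.locate_tadpole() == self.red_quadrant:
                self.deliver_shock()              # pair red light with aversive stimulus
            time.sleep(poll_s)
        # Re-randomize the lit quadrant between trials so the animals must
        # learn the light cue rather than a fixed location in the tank.
        self.red_quadrant = random.randrange(4)

TrainingArena().run_trial(duration_s=5)
```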

The team made exciting discoveries: Just over 19 percent of the animals with optic nerves that connected to the spine demonstrated learned responses to the lights. They swam away from the red light, while the blue light stimulated natural movement. Their responses to the lights elicited during the experiments were no different from those of a control group of tadpoles with natural eyes intact. Furthermore, this response was not demonstrated by eyeless tadpoles or by tadpoles that did not receive any electrical shock. “This has never been shown before,” says Levin. “No one would have guessed that eyes on the flank of a tadpole could see, especially when wired only to the spinal cord and not the brain.” The findings suggest a remarkable plasticity in the brain’s ability to incorporate signals from various body regions into behavioral programs that had evolved with a specific and different body plan. “Ectopic eyes performed visual function,” says Blackiston. “The brain recognized visual data from eyes that impinged on the spinal cord. We still need to determine if this plasticity in vertebrate brains extends to different ectopic organs or organs appropriate in different species.” One of the most fascinating areas for future investigation, according to Blackiston and Levin, is the question of exactly how the brain recognizes that the electrical signals coming from tissue near the gut are to be interpreted as visual data.

Science Daily
March 19, 2013

Original web page at Science Daily

Stem cells cruise to clinic

Japanese study of induced pluripotent stem cells aims to demonstrate safety in humans. In the seven years since their discovery, induced pluripotent stem (iPS) cells have transformed basic research and won a Nobel prize. Now, a Japanese study is about to test the medical potential of these cells for the first time. Made by reprogramming adult cells into an embryo-like state that can form any cell type in the body, the cells will be transplanted into patients who have a debilitating eye disease. Masayo Takahashi, an ophthalmologist at the RIKEN Center for Developmental Biology in Kobe, Japan, plans to submit her application for the study to the Japanese health ministry next month, and could be recruiting patients as early as September. Stem-cell researchers around the world hope that if the trial goes forward, it will allay some of the safety concerns over medical use of the cells. And the Japanese government hopes that its efforts to speed iPS cells to the clinic by generously funding such work will be vindicated. “The entire field is very dependent on this group and the Japanese regulatory agencies to ensure that preclinical evidence for safety and efficacy is very strong,” says Martin Pera, a stem-cell expert at the University of Melbourne in Australia.

Takahashi, who has been studying the potential of iPS cells to rebuild diseased tissue for more than a decade, hopes to treat around six people who have severe age-related macular degeneration, a common cause of blindness that affects at least 1% of people aged over 50. The form of the disease that Takahashi will treat occurs when blood vessels invade the retina, destroying the retinal pigment epithelium that supports the light-sensitive photoreceptors. This form can be treated with drugs that block the growth of new blood vessels, but these often have to be injected repeatedly into the eye. Takahashi will take a peppercorn-size skin sample from the upper arm and add proteins that reprogram the cells into iPS cells. Other factors will transform the iPS cells into retinal cells. Then a small sheet of cells will be placed under the damaged area of the retina, where, if things go well, the cells will grow and repair the pigment epithelium. The researchers hope to see the transplants slow or halt the disease, but their main goal is to show that the cells are safe. One concern is that the reprogrammed cells will trigger an immune reaction — as has been seen in mice (T. Zhao et al. Nature 474, 212–215; 2011). But that concern has faded after a recent study suggested that iPS cells did not provoke an immune reaction after all. “Immune compatibility seems to be as expected, so I am not so concerned about that issue,” says stem-cell expert George Daley of Harvard Medical School in Boston, Massachusetts.

A bigger worry is that the reprogrammed cells might multiply uncontrollably and form tumours instead of healthy tissue. But Pera and Daley are reassured by the pre-clinical data that Takahashi has presented at conferences. Takahashi says that these results, submitted for publication, show that her iPS cells do not form tumours in mice and are safe in non-human primates. Pera adds that the procedure to treat macular degeneration requires just a few stem cells, reducing the chances that a tumour will form. Also, any tumours would be relatively easy to remove because the eye is more accessible than some organs. Daley does worry that the treatment, even if it proves harmless, might not be effective. The cells might not engraft properly, for example, or might not integrate with the patients’ own tissue. “I think we will require many years of experience to learn more about how cells integrate,” he says. Pera raises another concern: that the identity of the cells might not be stable, and that over time they would no longer function as retinal epithelium.

According to Robert Lanza, chief scientific officer at biotechnology firm Advanced Cell Technology (ACT) in Santa Monica, California, iPS-cell studies like Takahashi’s could be premature. “I cannot imagine any regulatory agency permitting such a trial without years of extensive pre-clinical testing,” he says. ACT is racing to start a less-ambitious clinical trial of iPS cells for use in other diseases. Its study would inject healthy patients with platelets derived from iPS cells and from embryonic stem cells to see if they act like normal platelets, which could open the way to a treatment for blood-clotting disorders. Because platelets lack a nucleus, there is no risk of forming tumours, explains Lanza. He will meet with the US Food and Drug Administration later this month, and hopes to get approval to start the trial this year. Lanza says that using iPS cells that contain a nucleus in human trials is “a far greater challenge” than his approach. But Takahashi’s team is prepared, counters Pera. They are “among the pioneers in this field”, he says, adding that they are “well placed to undertake these studies”.

Takahashi is carrying out a ‘clinical study’ which, in Japan’s somewhat confusing system, is less tightly regulated than a clinical trial and cannot by itself lead to approval for clinical use of a treatment. The data, if positive, might attract investors or help Takahashi to get approval for a formal clinical trial — required if the cells are ever to be used to treat patients in the clinic. The study was approved by institutional review boards at both the Center for Developmental Biology and the Institute of Biomedical Research and Innovation in Kobe, where the surgical procedures will be carried out. Now, approval depends on a health-ministry committee of 18 physicians, lawyers, administrators and scientists, including three stem-cell specialists. If Takahashi wins approval by September as expected, it will take another eight months to grow the sheets of cells required for the transplants. In Japan, future iPS-cell therapies may have an easier path to the clinic as the government continues its drive to capitalize on the technology, which was first developed there. A revised drug law, expected to be put before the Japanese parliament by late June, would fast-track therapies that seem to be effective in phase II or phase III trials. But the success of that drive, and the prospects for patients with macular degeneration, depend in part on Takahashi and her pioneering patients.

Nature
March 19, 2013

Original web page at Nature

Dogs recognize the dog species among several other species on a computer screen

Dogs pick out the faces of other dogs, irrespective of breed, among human and other domestic and wild animal faces, and can group them into a category of their own. They do so using visual cues alone, according to new research by Dr. Dominique Autier-Dérian from the LEEC and National Veterinary School in Lyon, France, and colleagues. Their work, the first to test dogs’ ability to discriminate between species and form a “dog” category in spite of the huge variability within the dog species, is published online in Springer’s journal Animal Cognition. Individuals of the same species gather together for social life. These gatherings require recognition of the similarities between individuals that belong to the same species and to a certain group. Research to date has shown that in some species, individuals recognize more easily, or are more attracted by images of, individuals belonging to their own species than those belonging to another species. Autier-Dérian and team studied this phenomenon in domestic dogs, which show the largest morphological variety among all animal species. Indeed, more than 400 pure dog breeds have been registered. The authors explored whether this large morphological diversity presented a cognitive challenge to dogs trying to recognize their species, when confronted with other species, using visual cues alone.

On a computer screen, the researchers showed nine pet dogs pictures of faces from various dog breeds and cross-breeds alongside faces of other animal species, including humans. They exposed the dogs to diverse stimuli: images of dog faces and images of faces from 40 other species, both domestic and wild, including humans. Overall, the dogs were shown more than 144 pairs of pictures to choose from. The authors observed whether the nine dogs could discriminate any type of dog from other species and could group all dogs together, whatever their breed, into a single category. The results suggest that dogs can form a visual category of dog faces and group pictures of very different dogs into a single category, despite the diversity in dog breeds. Indeed, all nine dogs were able to group all the images of dogs within the same category. The authors conclude: “The fact that dogs are able to recognize their own species visually, and that they have great olfactory discriminative capacities, ensures that social behavior and mating between different breeds is still potentially possible. Although humans have stretched the Canis familiaris species to its morphological limits, its biological entity has been preserved.”

Science Daily
March 5, 2013

Original web page at Science Daily

The Food and Drug Administration approved the first retinal implant for use in the United States.

FDA’s green light for Second Sight’s Argus II Retinal Prosthesis System gives hope to those blinded by a rare genetic eye condition called advanced retinitis pigmentosa, which damages the light-sensitive cells that line the retina. For Second Sight, FDA approval follows more than 20 years of development, two clinical trials and more than $200 million in funding—half from the National Eye Institute, the Department of Energy and the National Science Foundation, and the rest from private investors. The Argus II has been approved for use in Europe since 2011 and implanted in 30 clinical-trial patients since 2007. The FDA’s Ophthalmic Devices Advisory Panel in September 2012 voted unanimously to recommend approval. The Argus II includes a small video camera, a transmitter mounted on a pair of eyeglasses, a video processing unit and a 60-electrode implanted retinal prosthesis that replaces the function of degenerated cells in the retina, the membrane lining the inside of the eye. Although it does not fully restore vision, this setup can improve a patient’s ability to perceive images and movement, using the video processing unit to transform images from the video camera into electronic data that is wirelessly transmitted to the retinal prosthesis.
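
To make that signal chain concrete, here is a schematic sketch of the image-to-electrode step. The Argus II’s actual video processing is not public; the 6 × 10 arrangement of the 60 electrodes, the brightness-to-current mapping, and every number below are illustrative assumptions.

```python
# Schematic sketch: camera frame -> per-electrode stimulation amplitudes.
import numpy as np

def frame_to_stimulation(frame, grid=(6, 10), max_amp_ua=100.0):
    """Average a grayscale frame down to one amplitude per electrode."""
    h, w = frame.shape
    gh, gw = grid
    tile = frame[: h - h % gh, : w - w % gw]            # crop to a tileable size
    th, tw = tile.shape
    levels = tile.reshape(gh, th // gh, gw, tw // gw).mean(axis=(1, 3))
    return levels / 255.0 * max_amp_ua                  # brightness -> current (uA)

frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)  # fake camera frame
print(frame_to_stimulation(frame).shape)  # (6, 10): one value per electrode
```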

Retinitis pigmentosa—which affects about one in 4,000 people in the US and about 1.5 million people worldwide—kills the retina’s photoreceptors, the rod and cone cells that convert light into electrical signals transmitted via the optic nerve to the brain’s visual cortex for processing. Second Sight plans to adapt its technology to someday assist people afflicted with age-related macular degeneration, a similar but more common disease. The company plans to make the Argus II available later this year in clinical centers throughout the US, cultivate a network of surgeons who can implant the device and recruit hospitals to offer it.

The Argus II is not the only retinal implant under development. Retina Implant AG takes a slightly different approach by making a prosthetic inserted beneath a portion of the retina. The company’s technology is a three- by three-millimeter microelectronic chip (0.1 millimeter thick) containing about 1,500 light-sensitive photodiodes, amplifiers and electrodes surgically inserted beneath the fovea (which contains the cone cells) in the retina’s macula region. The fovea enables the clarity of vision that people rely on to read, watch TV and drive. The chip helps generate at least partial vision by stimulating intact nerve cells in the retina. The nerve impulses from these cells are then led via the optic nerve to the visual cortex, where they create impressions of sight. The chip’s power source is positioned under the skin behind the ear and connected via a thin cable—no glasses or camera required. In May the company announced that the first UK patients participating in its latest trial had successfully received implants. To date surgeons have implanted Retina Implant prosthetics in 36 patients through two clinical trials over six years.

Stanford University researchers are in the early stages of developing self-powered retinal implants in which each pixel in the device is fitted with silicon photodiodes. These sensors detect light and control the output of a pulsed electrical current. Patients would wear goggles that emit near-infrared pulses that transmit both power and data directly to the photodiodes. Other retinal prostheses are powered by inductive coils that, along with other components, must be surgically implanted in the patient’s head. The researchers reported on the plausibility of their design in the May 2012 issue of Nature Photonics, describing in vitro electrical stimulation of healthy and degenerate rat retina by photodiodes powered by near-infrared light. Weill Cornell Medical College researchers in New York City are taking retinal prosthetics in a different direction, having deciphered the neural codes that mouse and monkey retinas use to turn light patterns into patterns of electrical pulses that their brains translate into meaningful images. The researchers programmed this information into an “encoder” chip and combined it with a mini-projector to create an implantable prosthetic. The chip converts images that come into the eye into streams of electrical impulses, and the mini-projector then converts the electrical impulses into light impulses that are sent to the brain. Rather than increasing the number of electrodes placed in an eye to capture more information and send signals to the brain, this work focuses on the quality of the artificial signals themselves so as to improve their ability to carry impulses to the brain.

Nature
March 5, 2013

Original web page at Nature

Color vision: Explaining primates’ red-green vision

Our eyes are complicated organs, with the retina at the back of the eyeball comprising hundreds of millions of neurons that allow us to see, and to do so in color. Scientists have long known that some retinal ganglion cells — neurons connecting the retina to the rest of the brain — are tuned to specific wavelengths of light (colors). In humans and other primates they are excited by red and inhibited by green, for example. An important question is: how are these “color-opponent” cells wired to discriminate wavelengths so that we perceive colors? Scientists in the lab of Thomas Euler, professor at the Werner Reichardt Centre for Integrative Neuroscience and the Institute for Ophthalmology at the University of Tübingen, have been working on the problem of retinal color processing for several years. Their article in the journal Neuron shows that whether or not ganglion cells become color-opponent depends on the chromatic preference of the light-sensitive photoreceptor cells in their vicinity. The research looked at mice, which have a striking distribution of photoreceptors across the retina, with a green-sensitive upper half and a blue-sensitive lower half. This differs from most mammals, yet mice are an excellent model system for studying important aspects of mammalian color processing.

Researchers found that, when stimulated with light, ganglion cells that have never before been implicated in color vision become color-opponent if they are located close to the border between the green- and blue-dominated retinal halves, but nowhere else. The findings show that color vision can arise from neural circuits in the retina that are not specifically “wired” for color processing, but instead draw indiscriminately on whichever photoreceptors happen to lie nearby. Although these findings were made in mice, they represent an important contribution to our understanding of color processing in humans and other primates, which are considered the color specialists among the mammals. Such random wiring has long been proposed for primate red-green color vision, which resulted from a gene duplication event that occurred quite recently on an evolutionary time scale — possibly leaving not enough time for a specific neural circuit to evolve. The new findings support this idea and suggest more similarities in the general principles of color discrimination between mice and primates than previously thought.
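
The wiring argument can be made concrete with a toy simulation: give every model ganglion cell the same indiscriminate center-surround sampling of whatever photoreceptors lie nearby, and check where color opponency appears. The hard green/blue split at y = 0, the receptive-field radii, and the sampling scheme are assumptions for illustration, not the study’s anatomy.

```python
# Toy retina: green-sensitive cones above y=0, blue-sensitive below,
# ganglion cells sampling nearby cones with no color-specific wiring.
import numpy as np

def opponency_index(y_cell, center_r=0.1, surround_r=0.3, n=4000, seed=0):
    """|green fraction in center - green fraction in surround| for one cell."""
    rng = np.random.default_rng(seed)
    ys = y_cell + rng.uniform(-surround_r, surround_r, n)  # sampled cone positions
    in_center = np.abs(ys - y_cell) < center_r
    is_green = ys > 0.0                                    # cone type by position
    return abs(is_green[in_center].mean() - is_green[~in_center].mean())

for y in (-0.8, -0.2, 0.2, 0.8):
    print(f"ganglion cell at y={y:+.1f}: opponency {opponency_index(y):.2f}")
# Cells far from the border see one cone type everywhere (index ~0); cells
# near y=0 get a center dominated by one type and a surround with a different
# mix (index > 0): color opponency without any color-specific wiring.
```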

Science Daily
March 5, 2013

Original web page at Science Daily

Amblyopia in cats: Turn off the lights

A new study in cats reveals that even brief periods in total darkness can correct the vision disorder amblyopia. A stint in the dark may be just what the doctor ordered—at least if you have “lazy eye.” Researchers report that kittens with the disorder, a visual impairment medically known as amblyopia that leads to poor sight or blindness in one eye, can completely recover their vision by simply spending 10 days in total darkness. “It’s a remarkable study, with real potential to change how we think about recovery from amblyopia,” says neuroscientist Frank Sengpiel of Cardiff University in the United Kingdom, who was not involved in the work. Amblyopia affects about 4% of the human population. It’s thought to start with an imbalance in vision early in life: If one eye doesn’t see as well as the other—because, for example, of a cataract or astigmatism—the brain reroutes most of the connections needed for visual processing to the “good” eye. Doctors often treat the condition by patching the good eye and forcing the brain to rely on the other eye, but the treatment risks damaging vision in the good eye, and if it doesn’t succeed or occur early enough in a child’s visual development, the vision loss in the impaired eye can be permanent.

Earlier studies with cats, whose complex visual systems are good stand-ins for human vision, showed that neurons in the brain’s visual centers shrink when the brain decides to disconnect from the bad eye, but that they grow again when the cats are placed in darkness. So neuroscientists Kevin Duffy and Donald Mitchell of Dalhousie University in Halifax, Canada, set out to test darkness itself as a treatment. They first induced amblyopia in 27 kittens by surgically closing one of each animal’s eyes 30 days after birth, when a feline’s visual system plasticity, its ability to change and grow, is at its peak. The eye was kept closed for 7 days, after which each kitten had amblyopia. Then the kittens were divided into two groups: one group that was placed in darkness immediately upon opening the deprived eye, and a second group that waited 3 months before its stint in the darkness. The darkness that both groups experienced was total—”a darkroom inside a darkroom inside a darkroom” that kept out even faint or transient sources of light, Duffy says. When the first group emerged from its 10 days of lights-out, all the kittens were blind in both eyes. But over a 7-week period, each cat’s eyes improved in lockstep, ultimately achieving normal vision in both eyes.

The second group, which during the delay had developed stable and presumably permanent amblyopia, also spent 10 days in the pitch dark. When those kittens emerged, their good eyes could still see and their bad eyes were nearly blind. But within 7 days, each kitten’s bad eye had recovered to the point that it matched the good eye in visual acuity. “This vision impairment that would have lasted a lifetime was completely obliterated by 10 days of darkness,” Duffy says. The results, published today in Current Biology, suggest that darkness restored some of the kittens’ brain plasticity and thus enabled their vision’s recovery. To understand the mechanism behind this, the researchers measured how darkness affects levels of a protein, called NF-L, that helps stabilize the shape and structure of neurons in the brain. These so-called neurofilaments accumulate with age and are thought to be molecular “brakes” that reduce the brain’s plasticity over time. They put a different group of 30-day-old kittens—a group in which the researchers did not induce amblyopia, to make sure they weren’t measuring the effects of the deprivation on neurofilament levels—into the darkness. After 10 days in the dark, these kittens showed 50% lower NF-L levels in their brain tissue than kittens of the same age that had never been kept in the dark. “It was like you were looking at an animal that was much younger than it was,” Duffy says. Sengpiel cautions that it’s too soon to start suggesting doses of dark for humans with amblyopia, in part because the researchers haven’t yet determined the limits of their treatment’s effectiveness. They don’t know, for example, just how dark the room has to be, or whether short breaks from the darkness would destroy its benefits. Duffy and Mitchell hope to answer some of those questions in future studies.

ScienceNow
March 5, 2013

Original web page at ScienceNow

Development of new corneal cell line provides powerful tool

Human corneal endothelial cells (HCEnCs) form a monolayer of hexagonal cells whose main function is to maintain corneal clarity by regulating corneal hydration. Cell loss due to aging or corneal endothelial disorders, such as Fuchs dystrophy, can lead to corneal edema and blindness, resulting in the need for cornea transplants. Studying the human corneal endothelium has been difficult for cell biologists because few cellular model systems exist, and those that do have significant drawbacks. The major drawback is that HCEnCs do not divide, and there is a limited source of these cells both for patient transplantation and for study in the laboratory. This field of study is now easier. Scientists from the Schepens Eye Research Institute, Mass. Eye and Ear, have developed HCEnC-21 and HCEnC-21T, two novel model systems for the human corneal endothelium. Their findings, “Telomerase Immortalization of Human Corneal Endothelial Cells Yield Functional Hexagonal Monolayers,” are online in PLOS ONE.

A research team led by Ula Jurkunas, M.D., developed these first-of-their-kind model systems for the human corneal endothelium. “These models mimic very well the critical characteristics and functionalities known from the tissue in the eye,” Dr. Jurkunas said. “They also fulfill essential technical requirements, e.g., an indefinite number of cell divisions at a high rate, needed to be a powerful tool. They will enable cell biologists to more reliably study the human corneal endothelium in health and disease. The ability to enhance HCEnC self-renewal and growth opens a new window on the development of novel regenerative therapies for corneal swelling, hopefully reducing the need for corneal transplantation in the future.”

Science Daily
January 22, 2013

Original web page at Science Daily

Early predictor for glaucoma identified

A new study finds that certain changes in blood vessels in the eye’s retina can be an early warning that a person is at increased risk for glaucoma, an eye disease that slowly robs people of their peripheral vision. Using diagnostic photos and other data from the Australian Blue Mountains Eye Study, the researchers showed that patients who had abnormally narrow retinal arteries when the study began were also those who were most likely to have glaucoma at its 10-year end point. If confirmed by future research, this finding could give ophthalmologists a new way to identify and treat those who are most vulnerable to vision loss from glaucoma. The study was recently published online by Ophthalmology, the journal of the American Academy of Ophthalmology. Open-angle glaucoma (OAG), the most common form of the disease, affects nearly three million people in the U.S. and 60 million worldwide. Vision loss occurs when glaucoma damages the optic nerve, the part of the eye that transmits images from the retina to the brain. Unfortunately, because glaucoma typically has no early symptoms, many people don’t know they have the disease until a good portion of their sight has been lost. Early detection is critical to treating glaucoma in time to preserve vision.

The findings of the new study, led by Paul Mitchell, M.D., Ph.D., of the Centre for Vision Research, University of Sydney, support the concept that abnormal narrowing of retinal blood vessels is an important factor in the earliest stages of OAG. Tracking nearly 2,500 participants, the study found that the OAG risk at the 10-year mark was about four times higher in patients whose retinal arteries had been narrowest when the study began, compared with those who had had the widest arteries. None of the participants had a diagnosis of OAG at the study’s outset. Compared with the study group as a whole, the patients who were diagnosed with OAG by the 10-year mark were older, had had higher blood pressure or higher intraocular pressure at the study’s baseline, and were more likely to be female. Elevated intraocular pressure, or pressure within the eye, is often found in patients with OAG. Study results were adjusted for age, family history of glaucoma, smoking, diabetes, hypertension, and other relevant factors. “Our results suggest that a computer-based imaging tool designed to detect narrowing of the retinal artery caliber, or diameter, could effectively identify those who are most at risk for open-angle glaucoma,” said Dr. Mitchell. “Such a tool would also need to account for blood pressure and other factors that can contribute to blood vessel changes. Early detection would allow ophthalmologists to treat patients before optic nerve damage occurs and would give us the best chance of protecting their vision.” A symptomless eye disease like glaucoma highlights the importance of regular eye exams. The American Academy of Ophthalmology recommends that everyone have a complete eye exam by an ophthalmologist at age 40 and stick to the follow-up exam schedule advised by their doctor.

Science Daily
January 22, 2013

Original web page at Science Daily

New study shows effects of prehistoric nocturnal life on mammalian vision

Since the age of dinosaurs, most species of day-active mammals have retained the imprint of nocturnal life in their eye structures. Humans and other anthropoid primates, such as monkeys and apes, are the only groups that deviate from this pattern, according to a new study from The University of Texas at Austin and Midwestern University. The findings, published in a forthcoming issue of Proceedings of the Royal Society B, are the first to provide a large-scale body of evidence for the “nocturnal bottleneck theory,” which suggests that mammalian sensory traits have been profoundly influenced by an extended period of adaptation to nocturnality during the Mesozoic Era. This period lasted from 250 million years ago to 65 million years ago. Before adapting to life in the night, the ancestors of mammals had a host of visual capabilities, such as good color vision and high acuity, which were lost as they passed through the nocturnal “bottleneck.” “The fact that nearly all living mammals have eye shapes that appear ‘nocturnal’ by comparison with other amniotes [mammals, reptiles and birds] is a testament to the strong influence that evolutionary history can have on modern anatomy,” says Chris Kirk, associate professor of anthropology at The University of Texas at Austin. According to Kirk, early mammals were predominantly nocturnal during the Mesozoic partly as a strategy for avoiding predation by day-active dinosaurs.

“It’s a bit surprising to still see the effects of this long period of nocturnality on living mammals more than 65 million years after non-avian dinosaurs went extinct, but that’s exactly what we found,” Kirk says. The research team, led by Margaret Hall, an evolutionary biologist at Midwestern University’s Arizona College of Osteopathic Medicine, analyzed one of the largest datasets on eye morphology ever assembled. Using a sample of eyeballs from 266 mammal species, the researchers applied a multivariate statistical method to show that mammals active by day and mammals active by night differ only slightly in eye morphology. The researchers then compared the eyes of mammals, birds and lizards using the ratio of cornea size to eye length — two functionally important measures of the eye’s ability to admit light and form sharp images. These analyses showed that diurnal (only active by day) and cathemeral (active by both day and night) mammals don’t differ in their eye shapes. At the same time, both groups have eye shapes that are very similar to those of nocturnal birds and lizards. These results reveal that most day-active mammals have eye shapes that appear “nocturnal” when compared with other vertebrates.
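
The shape variable at the heart of that comparison is a simple ratio, which is easy to compute. The group values below are invented purely for illustration; the study’s actual measurements covered 266 mammal species plus comparative samples of birds and lizards.

```python
# Corneal diameter relative to axial eye length: a large cornea admits
# more light; a long eye casts a larger, sharper image. Values invented.
corneal_diameter_mm = {"nocturnal mammal": 9.5, "diurnal mammal": 9.0,
                       "diurnal bird": 7.0, "anthropoid primate": 11.0}
axial_length_mm = {"nocturnal mammal": 11.0, "diurnal mammal": 11.5,
                   "diurnal bird": 12.5, "anthropoid primate": 23.0}

for group, cornea in corneal_diameter_mm.items():
    ratio = cornea / axial_length_mm[group]
    print(f"{group:20s} cornea/axial-length ratio = {ratio:.2f}")
# On real data, diurnal and nocturnal mammals cluster at similarly high
# ratios ("nocturnal"-shaped eyes), while diurnal birds, lizards, and
# anthropoid primates sit at markedly lower ratios (small cornea, long eye).
```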

One likely reason for these findings, Kirk says, is that after the extinction of non-avian dinosaurs, some nocturnal mammals became day-active and there was less pressure to evolve eye shapes for acute diurnal vision like those of other day-active vertebrates. Anthropoid primates are the only mammalian group that re-evolved eye shape for fine detailed daytime vision. Like diurnal birds and lizards, most anthropoids have small corneas relative to eye length as an adaptation for enhanced visual acuity. Kirk says the study provides a deeper understanding of human sensory systems and our intrinsic connection with our closest living primate relatives: the monkeys and apes. “Humans and other anthropoid primates are so dependent on vision for everything that they do,” Kirk says. “In this case, we are radically different from other mammals. We found that the distinctive eye shapes that set humans apart from most other mammals evolved a long time ago — way back with the origin of anthropoid primates.”

Science Daily
November 13, 2012

Original web page at Science Daily

An artificial retina with the capacity to restore normal vision

Two researchers at Weill Cornell Medical College have deciphered the neural code of a mouse retina and coupled this information to a novel prosthetic device to restore sight to blind mice. The researchers say they have also cracked the code for a monkey retina — which is essentially identical to that of a human — and hope to quickly design and test a device that blind humans can use. The breakthrough, reported in the Proceedings of the National Academy of Sciences (PNAS), signals a remarkable advance in longstanding efforts to restore vision. Current prosthetics provide blind users with spots and edges of light to help them navigate. This novel device provides the code to restore normal vision. The code is so accurate that it can allow facial features to be discerned and allow animals to track moving images. The lead researcher, Dr. Sheila Nirenberg, a computational neuroscientist at Weill Cornell, envisions a day when the blind can choose to wear a visor, similar to the one used on the television show Star Trek. The visor’s camera will take in light and use a computer chip to turn it into a code that the brain can translate into an image.

“It’s an exciting time. We can make blind mouse retinas see, and we’re moving as fast as we can to do the same in humans,” says Dr. Nirenberg, a professor in the Department of Physiology and Biophysics and in the Institute for Computational Biomedicine at Weill Cornell. The study’s co-author is Dr. Chethan Pandarinath, who was a graduate student with Dr. Nirenberg and is currently a postdoctoral researcher at Stanford University. This new approach provides hope for the 25 million people worldwide who suffer from blindness due to diseases of the retina. Because drug therapies help only a small fraction of this population, prosthetic devices are their best option for future sight. “This is the first prosthetic that has the potential to provide normal or near-normal vision because it incorporates the code,” Dr. Nirenberg explains. Normal vision occurs when light falls on photoreceptors in the surface of the retina. The retinal circuitry then processes the signals from the photoreceptors and converts them into a code of neural impulses. These impulses are then sent up to the brain by the retina’s output cells, called ganglion cells. The brain understands this code of neural pulses and can translate it into meaningful images.

Blindness is often caused by diseases of the retina that kill the photoreceptors and destroy the associated circuitry, but typically, in these diseases, the retina’s output cells are spared. Current prosthetics generally work by driving these surviving cells. Electrodes are implanted into a blind patient’s eye, and they stimulate the ganglion cells with current. But this only produces rough visual fields. Many groups are working to improve performance by placing more stimulators into the patient’s eye. The hope is that with more stimulators, more ganglion cells in the damaged tissue will be activated, and image quality will improve. Other research teams are testing the use of light-sensitive proteins as an alternate way to stimulate the cells. These proteins are introduced into the retina by gene therapy. Once in the eye, they can target many ganglion cells at once. But Dr. Nirenberg points out that there’s another critical factor. “Not only is it necessary to stimulate large numbers of cells, but they also have to be stimulated with the right code — the code the retina normally uses to communicate with the brain.”

Dr. Nirenberg reasoned that any pattern of light falling onto the retina had to be converted into a general code — a set of equations — that turns light patterns into patterns of electrical pulses. “People have been trying to find the code that does this for simple stimuli, but we knew it had to be generalizable, so that it could work for anything — faces, landscapes, anything that a person sees,” Dr. Nirenberg says. In a eureka moment, while working on the code for a different reason, Dr. Nirenberg realized that what she was doing could be directly applied to a prosthetic. She and her student, Dr. Pandarinath, immediately went to work on it. They implemented the mathematical equations on a “chip” and combined it with a mini-projector. The chip, which she calls the “encoder,” converts images that come into the eye into streams of electrical impulses, and the mini-projector then converts the electrical impulses into light impulses. These light pulses then drive the light-sensitive proteins, which have been put in the ganglion cells, to send the code on up to the brain.
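
The article does not reproduce the lab’s equations, but the standard model family for this kind of retinal encoder is a linear-nonlinear-Poisson (LNP) cascade: filter the image, pass the result through a nonlinearity to get a firing rate, then draw stochastic spikes. The sketch below uses random placeholder filters and assumed parameters rather than anything fitted to retinal data.

```python
# Generic LNP-style encoder sketch: image in, retina-like spike trains out.
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_pixels = 8, 16 * 16
filters = rng.normal(0.0, 0.1, (n_cells, n_pixels))  # one spatial filter per cell

def encode(frame, dt=0.005, gain=5.0):
    """Map one grayscale frame to a spike count for each model ganglion cell."""
    drive = filters @ frame.ravel()           # linear stage: filter the image
    rate = gain * np.log1p(np.exp(drive))     # nonlinear stage: softplus firing rate
    return rng.poisson(rate * dt)             # Poisson stage: stochastic spikes

frame = rng.uniform(0.0, 1.0, (16, 16))       # stand-in for one camera frame
print(encode(frame))  # per-cell spike counts; in the prosthetic, pulses like
                      # these drive the light-sensitive proteins in ganglion cells
```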

The entire approach was tested on the mouse. The researchers built two prosthetic systems — one with the code and one without. “Incorporating the code had a dramatic impact,” Dr. Nirenberg says. “It jumped the system’s performance up to near-normal levels — that is, there was enough information in the system’s output to reconstruct images of faces, animals — basically anything we attempted.” In a rigorous series of experiments, the researchers found that the patterns produced by the blind retinas in mice closely matched those produced by normal mouse retinas. “The reason this system works is twofold,” Dr. Nirenberg says. “The encoder — the set of equations — is able to mimic retinal transformations for a broad range of stimuli, including natural scenes, and thus produce normal patterns of electrical pulses, and the stimulator (the light-sensitive protein) is able to send those pulses on up to the brain.” “What these findings show is that the critical ingredients for building a highly effective retinal prosthetic — the retina’s code and a high-resolution stimulating method — are now, to a large extent, in place,” reports Dr. Nirenberg. Dr. Nirenberg says her retinal prosthetic will need to undergo human clinical trials, especially to test the safety of the gene therapy component, which delivers the light-sensitive protein. But she anticipates it will be safe since similar gene therapy vectors have been successfully tested for other retinal diseases.

Science Daily
September 4, 2012

Original web page at Science Daily

Glaucoma as neurologic disorder rather than eye disease?

A new paradigm to explain glaucoma is rapidly emerging, and it is generating brain-based treatment advances that may ultimately vanquish the disease known as the “sneak thief of sight.” A review now available in Ophthalmology, the journal of the American Academy of Ophthalmology, reports that some top researchers no longer think of glaucoma solely as an eye disease. Instead, they view it as a neurologic disorder that causes nerve cells in the brain to degenerate and die, similar to what occurs in Parkinson’s disease and Alzheimer’s disease. The review, led by Jeffrey L. Goldberg, M.D., Ph.D., assistant professor of ophthalmology at the Bascom Palmer Eye Institute and Interdisciplinary Stem Cell Institute, describes treatment advances that are either being tested in patients or are scheduled to begin clinical trials soon. Glaucoma is the most common cause of irreversible blindness worldwide. For many years, the prevailing theory was that vision damage in glaucoma patients was caused by abnormally high pressure inside the eye, known as intraocular pressure (IOP). As a result, lowering IOP was the only goal of those who developed surgical techniques and medications to treat glaucoma. Creating tests and instruments to measure and track IOP was crucial to that effort. Today, a patient’s IOP is no longer the only measurement an ophthalmologist uses to diagnose glaucoma, although it is still a key part of deciding how to care for the patient. IOP-lowering medications and surgical techniques continue to be effective ways to protect glaucoma patients’ eyes and vision. Tracking changes in IOP over time informs the doctor whether the treatment plan is working.

But even when surgery or medication successfully lowers IOP, vision loss continues in some glaucoma patients. Also, some patients find it difficult to use eye drop medications as prescribed by their physicians. These significant shortcomings spurred researchers to look beyond IOP as a cause of glaucoma and focus of treatment. The new research paradigm focuses on the damage that occurs in a type of nerve cell called retinal ganglion cells (RGCs), which are vital to the ability to see. These cells connect the eye to the brain through the optic nerve. RGC-targeted glaucoma treatments now in clinical trials include: medications injected into the eye that deliver survival and growth factors to RGCs; medications known to be useful for stroke and Alzheimer’s, such as cytidine-5-diphosphocholine; and electrical stimulation of RGCs, delivered via tiny electrodes implanted in contact lenses or other external devices. Human trials of stem cell therapies are in the planning stages. “As researchers turn their attention to the mechanisms that cause retinal ganglion cells to degenerate and die, they are discovering ways to protect, enhance and even regenerate these vital cells,” said Dr. Goldberg. “Understanding how to prevent damage and improve healthy function in these neurons may ultimately lead to sight-saving treatments for glaucoma and other degenerative eye diseases.” If this neurologically-based research succeeds, future glaucoma treatments may not only prevent glaucoma from stealing patients’ eyesight, but may actually restore vision. Scientists also hope that their in-depth exploration of RGCs will help them determine what factors, such as genetics, make some people more vulnerable to glaucoma.

Science Daily
April 3, 2012

Original web page at Science Daily

Scientists produce eye structures from human blood-derived stem cells

For the first time, scientists at the University of Wisconsin-Madison have made early retina structures containing proliferating neuroretinal progenitor cells using induced pluripotent stem (iPS) cells derived from human blood. And in another advance, the retina structures showed the capacity to form layers of cells – as the retina does in normal human development – and these cells possessed the machinery that could allow them to communicate information. (Light-sensitive photoreceptor cells in the retina along the back wall of the eye produce impulses that are ultimately transmitted through the optic nerve and then to the brain, allowing you to see.) Put together, these findings suggest that it is possible to assemble human retinal cells into more complex retinal tissues, all starting from a routine patient blood sample. Many applications of laboratory-built human retinal tissues can be envisioned, including using them to test drugs and study degenerative diseases of the retina such as retinitis pigmentosa, a prominent cause of blindness in children and young adults. One day, it may also be possible to replace multiple layers of the retina in order to help patients with more widespread retinal damage.

“We don’t know how far this technology will take us, but the fact that we are able to grow a rudimentary retina structure from a patient’s blood cells is encouraging, not only because it confirms our earlier work using human skin cells, but also because blood as a starting source is convenient to obtain,” says Dr. David Gamm, pediatric ophthalmologist and senior author of the study. “This is a solid step forward.” In 2011, the Gamm lab at the UW Waisman Center created structures from the most primitive stage of retinal development using embryonic stem cells and stem cells derived from human skin. While those structures generated the major types of retinal cells, including photoreceptors, they lacked the organization found in more mature retina. This time, the team, led by Gamm, Assistant Professor of Ophthalmology and Visual Sciences in the UW School of Medicine and Public Health, and postdoctoral researcher and lead author Dr. Joseph Phillips, used their method to grow retina-like tissue from iPS cells derived from human blood gathered via standard blood draw techniques.

In their study, about 16 percent of the initial retinal structures developed distinct layers. The outermost layer primarily contained photoreceptors, whereas the middle and inner layers harbored intermediary retinal neurons and ganglion cells, respectively. This particular arrangement of cells is reminiscent of what is found in the back of the eye. Further, work by Dr. Phillips showed that these retinal cells were capable of making synapses, a prerequisite for them to communicate with one another. The iPS cells used in the study were generated through a collaboration with Cellular Dynamics International (CDI) of Madison, Wis., which pioneered the technique to convert blood cells into iPS cells. CDI scientists extracted a type of blood cell called a T lymphocyte from the donor sample and reprogrammed the cells into iPS cells. CDI was founded by UW stem cell pioneer Dr. James Thomson. “We were fortunate that CDI shared an interest in our work. Combining our lab’s expertise with that of CDI was critical to the success of this study,” added Dr. Gamm.

Science Daily
April 3, 2012

Original web page at Science Daily

Stem cells can repair a damaged cornea

A new cornea may be the only way to prevent a patient going blind — but there is a shortage of donated corneas and the queue for transplantation is long. Scientists at the Sahlgrenska Academy have for the first time successfully cultivated stem cells on human corneas, which may in the long term remove the need for donors. Approximately 500 corneal transplantations are carried out each year in Sweden, and about 100,000 in the world. The damaged and cloudy cornea that is turning the patient blind is replaced with a healthy, transparent one. But the procedure requires a donated cornea, and there is a severe shortage of donated material. This is particularly true in parts of the world where religious or political views hinder the use of donated material. Scientists at the Sahlgrenska Academy, University of Gothenburg, have taken the first step towards replacing donated corneas with corneas cultivated from stem cells. Scientists Charles Hanson and Ulf Stenevi have used defective corneas obtained from the ophthalmology clinic at Sahlgrenska University Hospital in Mölndal. Their study, now published in the journal Acta Ophthalmologica, shows how human stem cells can be induced to develop into what are known as “epithelial cells” after 16 days’ culture in the laboratory and a further 6 days’ culture on a cornea. It is the epithelial cells that maintain the transparency of the cornea.

“Similar experiments have been carried out on animals, but this is the first time that stem cells have been grown on damaged human corneas. It means that we have taken the first step towards being able to use stem cells to treat damaged corneas,” says Charles Hanson. “If we can establish a routine method for this, the availability of material for patients who need a new cornea will be essentially unlimited. Both the surgical procedures and the aftercare will also become much simpler,” says Ulf Stenevi. Only a few clinics are currently able to transplant corneas. Many of the transplantations in Sweden are carried out at the Department of Ophthalmology, Sahlgrenska University Hospital, Mölndal.

Science Daily
March 20, 2012

Original web page at Science Daily

Cornea gene discovery reveals why humans see clearly

A transparent cornea is essential for vision, which is why the eye has evolved to nourish the cornea without blood vessels. But for millions of people around the world, diseases of the eye or trauma spur the growth of blood vessels and can cause blindness. A new Northwestern Medicine study has identified a gene that plays a major role in maintaining clarity of the cornea in humans and mice — and could possibly be used as gene therapy to treat diseases that cause blindness. The paper is published in the Proceedings of the National Academy of Sciences. “We believe we’ve discovered the master regulator gene that prevents the formation of blood vessels in the eye and protects the clarity of the cornea,” said lead author Tsutomu Kume, associate professor of medicine at Northwestern University Feinberg School of Medicine and a researcher at the Feinberg Cardiovascular Research Institute. The existence of the gene, FoxC1, was previously known, but its role in maintaining a clear cornea is a new finding. Working with a strain of mice lacking this gene, Kume and colleagues found abnormal vascular formations, or blood vessels, streaking their corneas and blocking light.

When Kume discovered the corneal blood vessels in the mutant mice, he called a collaborator at the University of Alberta in Canada, Ordan Lehmann, MD, professor of ophthalmology and medical genetics. Lehmann found that his patients who have a single copy of this mutated FoxC1 gene — and who have congenital glaucoma — also have abnormal blood vessel growth in their eyes. “The exciting thing is by showing the loss of FoxC1 causes vascularization of the cornea, it means increasing levels of the gene might help prevent the abnormal growth of blood vessels, potentially in multiple eye disorders that cause blindness,” said Lehmann, a coauthor on the paper. “That’s the hope.” One possible use might be in corneal transplants, he said, where the growth of new blood vessels onto the transplanted cornea is a major problem. Kume next plans to test the gene therapy in mice to see if injecting FoxC1 inhibits the formation of blood vessels in the cornea.

Science Daily
January 10, 2012

Original web page at Science Daily

Gene expression in mouse neural retina sequenced

In a new study, researchers have gained new insights into neural disease genes by sequencing virtually all the gene expression in the mouse neural retina. The technology to obtain such a “transcriptome” has become accessible enough that full-scale sequencing is the preferred method for asking genetics questions. The population of Eric Morrow’s seminar “Neurogenetics and Disease” comprises mainly undergraduates who were skipping down the halls of their elementary schools when the first drafts of human genome sequences were published. When Morrow, assistant professor of biology, recently asked the class how to find the mutation behind a disease, a hand shot up in the back of the class to signal the answer: “Sequence the patient’s genome.” Ah, to be 19 and take gene sequencing for granted. Fifteen years ago the student’s answer would have been unthinkable. Five years ago, it would have been possible, but prohibitively expensive and cumbersome. Now, armed with so-called “next-generation sequencing” technology, which brings the costs down to thousands rather than hundreds of millions of dollars, Morrow and other properly equipped researchers are obtaining detailed and comprehensive genetic sequences of cells from tissues of interest with relative ease.

In a new study, published in the journal Genomics, Morrow led a research group that has for the first time sequenced the entire “transcriptome” — all the messenger RNA transcribed from the DNA that codes for proteins — of the mouse neural retina. Morrow’s overall goal is to investigate the genetic nature of disease in neural tissue, and sure enough, the research has yielded some intriguing clues, Morrow said. He added that he will publicly share the entire dataset. “The reason we studied the neural retina is that we wanted to ask: Is there anything different about those genes that cause disease in the nervous system and all of the other genes in the genome,” Morrow said. “There were some fairly prominent differences.” The study team, including first author and postdoctoral scholar Ece Gamsiz, produced four main insights:

- Although only 114 of the 15,251 genes detected are known to be associated with disease, those disease genes were disproportionately highly expressed to a statistically significant degree. Six disease genes were among the 20 most highly expressed genes, including a sweep of the top three.
- Disease genes had much longer sequences (4,333.4 bases on average, vs. 3,323 for non-disease genes).
- Disease genes were somewhat, but significantly, more likely than non-disease genes to have alternate transcripts, meaning more different versions of RNA were produced from the same information encoded in DNA.
- Neurons in the retina expressed less than a third of the available genes for their synaptic vesicles, which are essential components of how nerve cells transmit signals.
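
These comparisons lend themselves to a simple worked example. The sketch below (Python with scipy, not the authors’ pipeline) shows how one might test whether known disease genes are longer and more highly expressed than other genes using a rank-based test; all gene names and numbers are invented placeholders.

```python
# Illustrative comparison of disease vs. non-disease genes from an
# RNA-seq quantification. All values are invented for demonstration.
from scipy.stats import mannwhitneyu

# {gene_id: (expression_level, transcript_length_in_bases)} -- hypothetical
genes = {
    "DiseaseGene1": (9500.0, 6100),
    "DiseaseGene2": (4100.0, 5200),
    "GeneA": (120.0, 2800),
    "GeneB": (85.0, 3100),
    "GeneC": (230.0, 2500),
}
disease = {"DiseaseGene1", "DiseaseGene2"}  # placeholder disease-gene list

expr_d = [v[0] for g, v in genes.items() if g in disease]
expr_o = [v[0] for g, v in genes.items() if g not in disease]
lens_d = [v[1] for g, v in genes.items() if g in disease]
lens_o = [v[1] for g, v in genes.items() if g not in disease]

# One-sided tests: are disease genes more highly expressed, and longer?
print(mannwhitneyu(expr_d, expr_o, alternative="greater"))
print(mannwhitneyu(lens_d, lens_o, alternative="greater"))
```

On a real dataset, the same two calls fed with the full 15,251-gene table would reproduce the kind of expression and length comparisons reported above.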

To Morrow, whose studies include the genetics of autism, the next steps have involved sequencing the transcriptomes of other neural tissues to see whether his observations from the retina hold more generally. For example, are disease-related genes unusually highly expressed elsewhere, too? And do specific neural tissues use a “synaptic vesicle” code, as the retina appears to, or is this specific to the retina? Morrow said he’s been sharing his raw transcriptome data publicly so that other scientists can ask their own questions. “When we go to meetings, people ask us about their particular genes in their particular networks,” he said. Morrow has all but switched methods of genetic analysis. Before sequencing, scientists would use microarrays, which can be stocked with complementary strands of DNA or RNA to detect thousands of genes if they are present in a sample. But for a cost that’s tens of thousands of dollars and dropping, Morrow said, sequencing allows him to see everything in the cell and learn about the entire picture, whether genes were well-known enough to be on a gene chip or not. “You don’t need to be sitting on a genome center to do this type of work anymore,” Morrow said.

In a similar vein in October, colleagues at Brown and Women & Infants Hospital reported sequencing the transcriptome of human egg cells and, in another first, their sidecar-like “polar bodies.” Their key insight: Expendable polar bodies reflect the gene expression of the precious eggs, making them potentially good, nondestructive indicators of which egg to choose for in vitro fertilization. Only by seeing all the transcripts of all the genes in the mouse retina, and their full sequences, for example, could Morrow and Gamsiz have learned that disease genes are significantly longer, more likely to be transcribed in different ways, and expressed more abundantly. “I think expression microarrays are becoming a little outdated,” Morrow said. “Not that we didn’t do well with them. But with them you are missing what you are not looking for because you don’t know what’s out there.” What makes next-generation sequencing, in this case on an Illumina Genome Analyzer IIx, work better than the slow, labor-intensive and expensive technology used for sequencing just 10 years ago is partly that it sequences a large number of DNA or RNA fragments in parallel, vastly increasing the sequencing system’s throughput. “The technology is amazing, it’s a game changer,” Morrow said. “In terms of molecular approaches to gene expression, genetics, and genomics it’s like a new day. The data are truly beautiful.” As that metaphorical first light shines on the mouse retina transcriptome, an improved understanding of the genetics of neural disease may dawn with it.

Science Daily
January 10, 2012

Original web page at Science Daily

Fast high precision eye-surgery robot developed

Researcher Thijs Meenink at Eindhoven University of Technology (TU/e) has developed a smart eye-surgery robot that allows eye surgeons to operate with increased ease and greater precision on the retina and the vitreous humor of the eye. The system also extends the effective period during which ophthalmologists can carry out these intricate procedures. Meenink defended his PhD thesis on the robot on Oct. 31, and intends later to commercialize his system. Eye operations such as retinal repair, for example treating a detached retina, demand high precision. In most cases surgeons can only carry out these operations for a limited part of their career. “When ophthalmologists start operating they are usually already at an advanced stage in their careers,” says Thijs Meenink. “But at a later age it becomes increasingly difficult to perform these intricate procedures.” The new system can simply filter out hand tremors, which significantly increases the effective working period of the ophthalmologist.

The robot consists of a ‘master’ and a ‘slave’. The ophthalmologist remains fully in control, operating from the master using two joysticks. The master was developed in an earlier PhD project at TU/e by dr.ir. Ron Hendrix. Two robot arms (the ‘slave’, developed by Meenink) copy the movements of the master and carry out the actual operation. The tiny needle-like instruments on the robot arms have a diameter of only 0.5 millimeter and include forceps, surgical scissors and drains. The robot is designed so that the point at which the needle enters the eye is always at the same location, to prevent damage to the delicate eye structures. Meenink has also designed a unique ‘instrument changer’ for the slave, allowing the robot arms to swap instruments, for example from forceps to scissors, within a few seconds. This matters because some eye operations can require as many as 40 instrument changes, normally a time-consuming part of the overall procedure. The surgeon’s movements are scaled down, so that, for example, each centimeter of motion on the joystick is translated into a movement of only one millimeter at the tip of the instrument. “This greatly increases the precision of the movements,” says Meenink.
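
Two of the control ideas described here, tenfold motion scaling and tremor filtering, are easy to sketch. The snippet below is a minimal illustration, not TU/e’s controller: the 10:1 scale factor comes from the article, while the filter cutoff and loop rate are assumed values chosen so that slow, deliberate movements pass through but physiological tremor (roughly 8-12 Hz) is strongly attenuated.

```python
import math

SCALE = 0.1         # 10:1 scaling: 1 cm of joystick travel -> 1 mm at the tip
CUTOFF_HZ = 2.0     # assumed filter cutoff, well below 8-12 Hz tremor
SAMPLE_HZ = 1000.0  # assumed control-loop rate

# Coefficient for a first-order (exponential) low-pass filter
ALPHA = 1.0 - math.exp(-2.0 * math.pi * CUTOFF_HZ / SAMPLE_HZ)

def tip_positions(joystick_mm):
    """Yield instrument-tip positions (mm) from raw joystick samples (mm)."""
    state = joystick_mm[0]
    for x in joystick_mm:
        state += ALPHA * (x - state)  # smooth out fast tremor components
        yield state * SCALE           # scale the motion down tenfold
```

With these numbers, a steady 10-millimeter joystick displacement settles to a 1-millimeter tip displacement, while a 10 Hz tremor component is attenuated roughly fivefold on top of the scaling.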

The master also provides haptic feedback. Ophthalmologists currently work entirely by sight — the forces used in the operation are usually too small to be felt. However, Meenink’s robot can ‘measure’ these tiny forces, which are then amplified and transmitted to the joysticks. This allows surgeons to feel the effects of their actions, which further contributes to the precision of the procedure. The system developed by Meenink and Hendrix also offers ergonomic benefits. Whereas surgeons currently have to bend statically over the patient, they will soon be able to operate the robot from a comfortable seated position. In addition, the slave is so compact and lightweight that operating room staff can easily carry it and attach it to the operating table. Ophthalmologist prof.dr. Marc de Smet (AMC Amsterdam), one of Meenink’s PhD supervisors, is enthusiastic about the system — not only because of the time savings it offers, but also because in his view the limits of manual procedures have now been reached. “Robotic eye surgery is the next step in the evolution of microsurgery in ophthalmology, and will lead to the development of new and more precise procedures,” de Smet explains. Both slave and master are ready for use, and Meenink intends to optimize them in the near future. The first surgery on humans is expected within five years. He also plans to investigate the market opportunities for the robot system. Robotic eye surgery is a new development; eye-surgery robots are not yet available on the market.

Science Daily
November 15, 2011

Original web page at Science Daily

Laser’s precision and simplicity could revolutionize cataract surgery

Two new studies add to the growing body of evidence that a new approach to cataract surgery may be safer and more efficient than today’s standard procedure. The new approach, using a special femtosecond laser, is FDA-approved but not yet widely available in the United States. It’s one of the hottest topics at the 115th Annual Meeting of the American Academy of Ophthalmology. Research reported Oct. 23 by William W. Culbertson, MD, of the Bascom Palmer Eye Institute at the University of Miami School of Medicine, and by Mark Packer, MD, of Oregon Health & Science University, confirms several advantages of laser cataract surgery. Dr. Culbertson’s team studied how pre-treating cataracts with the femtosecond laser affected the level of ultrasound energy needed to soften the cataracts. This emulsification is performed so that the cataracts can be easily suctioned out. Surgeons want to use the lowest possible level of ultrasound energy, since in a small percentage of patients it is associated with slower recovery of good vision after surgery and/or problems with the cornea, the clear outer layer of the eye. Ideally, in appropriate cases, ultrasound use would be eliminated altogether.

In Dr. Culbertson’s prospective, randomized study, 29 patients had laser cataract surgery with a femtosecond laser in one eye and the standard cataract procedure, called phacoemulsification, in the other. Laser surgery included a laser capsulotomy, a circular incision in the lens capsule, followed by laser lens fragmentation, then ultrasound emulsification and aspiration. Lens fragmentation involved using the laser to split the lens into sections and then soften it by etching cross-hatch patterns on its surface. Standard surgery included a manual incision, followed by ultrasound emulsification and aspiration. After cataract removal by either method, intraocular lenses were inserted into the eyes to replace the natural lens and provide appropriate vision correction for each patient. Ultrasound energy use was reduced by 45 percent in the laser pre-treated eyes compared with the eyes that received the standard procedure. Surgical manipulation of the eye was likewise reduced by 45 percent in eyes that received laser pre-treatment. This study involved the most common types of cataracts, those graded 1 to 4; Dr. Culbertson notes that the findings may not apply to higher-grade cataracts.

“In clinical practice, surgeons would expect safer, faster cataract surgery when laser pre-treatment is performed before cataract removal,” said Dr. Culbertson. “The combination of precision and simplification that is possible with the femtosecond laser represents a major advance for this surgery.” Dr. Packer’s team at Oregon Health & Science University in Portland, Oregon, assessed the safety of laser cataract surgery in terms of the loss of corneal endothelial cells measured after cataract surgery. Measuring endothelial cell loss is one of the most important ways to assess the safety of new cataract surgery techniques and technology. These cells preserve the cornea’s clarity, and since they do not regenerate, they must last a lifetime. Dr. Packer’s study found that when laser lens fragmentation was used in 225 eyes, there was no loss of endothelial cells, while the 63 eyes that received standard treatment had cell loss of one to seven percent. “Our finding that laser lens fragmentation appears to protect corneal endothelial cells represents a significant benefit of this new surgery,” said Dr. Packer. “This procedure is safer than standard cataract treatment and is likely to mean better vision and fewer eye health concerns for cataract patients over the long term.”

Earlier studies of femtosecond laser cataract surgery found other benefits. The laser allows the surgeon to make smaller, more precise incisions and to perform improved capsulotomies (removal of part of the lens capsule), which make intraocular lens (IOL) placement more secure. This reduces the chance that an IOL will later become displaced. Laser cataract surgery also appears to improve results in patients who opt for advanced-technology IOLs, plus corrective corneal incisions, to achieve good all-distance vision. Femtosecond lasers have been used by ophthalmologists for years in refractive surgery such as LASIK, in corneal transplants, and in other procedures. In 2009, the FDA approved a new type of femtosecond laser that could reach deep enough into the eye to be used in cataract removal.

Science Daily
November 15, 2011

Original web page at Science Daily

Owl study expands understanding of human stereovision

Using owls as a model, a new research study reveals that the advantage of stereopsis, commonly referred to as stereovision, lies in discriminating objects from their background, not in perceiving absolute depth. The findings were published in a recent Journal of Vision article entitled “Owls see in stereo much like humans do.” The purpose of the study, conducted at RWTH Aachen (Germany) and Radboud University (Nijmegen, Netherlands), was to uncover how depth perception came into existence during the course of evolution. “The reason why studying owl vision is helpful is that, like humans, owls have two frontally placed eyes,” said author Robert F. van der Willigen, PhD, of the Donders Institute for Brain, Cognition and Behavior at Radboud. “As a result, owls, like humans, could appreciate the 3-dimensional shape of tangible objects through simultaneous comparison of the left and right eye.” Van der Willigen studied two trained barn owls (Tyto alba) in a series of six behavioral experiments equivalent to those used on humans. He used computer-generated binocular random-dot patterns to measure stereo performance, and found that the owl’s ability to discriminate random-dot stereograms parallels that of humans despite the owl’s relatively small brain. The results provided unprecedented data on stereovision, with findings that debunk the long-held consensus that the evolutionary advantage of seeing in stereo must be depth vision.
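
The stimuli behind such measurements are easy to reproduce. The sketch below builds a standard binocular random-dot pair in Python with NumPy: a central square of dots is shifted horizontally between the two images to create disparity. The sizes and the 4-pixel shift are arbitrary choices for illustration, not the study’s parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
size, patch, disparity = 128, 40, 4      # image size, square size, pixel shift

left = rng.integers(0, 2, (size, size))  # random black/white dots
right = left.copy()

r0 = c0 = (size - patch) // 2            # centered square region
square = left[r0:r0 + patch, c0:c0 + patch]

# Shift the square horizontally in the right eye's image to create
# binocular disparity, then refill the uncovered strip with fresh dots.
right[r0:r0 + patch, c0 + disparity:c0 + patch + disparity] = square
right[r0:r0 + patch, c0:c0 + disparity] = rng.integers(0, 2, (patch, disparity))
```

Viewed with one eye, each image is indistinguishable from uniform noise; only an observer with working stereopsis sees the shifted square stand out in depth, which is what makes random-dot stereograms a pure test of binocular processing.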

He contends the findings demonstrate that while binocular disparity, the slight difference between the viewpoints of the right and left eyes, does play a role in perceiving depth, it allows owls, like humans, to perceive relative depth rather than absolute distance. “It is useful, therefore, not so much in controlling goal-directed movements as it is in recognition.” Looking to future studies, van der Willigen hopes that scientists will consider that human or primate vision is not the only way to examine the experience of stereovision. “My present work on the owl highlights underappreciated but fundamental aspects of stereopsis,” he says. “Nonetheless, final proof should come with behavioral demonstration of equivalent stereoscopic abilities in animals other than the owl. Hopefully, my current work will encourage scientists to investigate other animal species.”

Science Daily
July 26, 2011

Original web page at Science Daily

Birds’ eye view is far more colorful than our own

The brilliant colors of birds have inspired poets and nature lovers, but researchers at Yale University and the University of Cambridge say these existing hues represent only a fraction of what birds are capable of seeing. The findings, based on a study of the avian visual system and reported in the June 23 issue of the journal Behavioral Ecology, show that over millions of years of evolution plumage colors went from dull to bright as birds gradually acquired the ability to create new pigments and structural colors. “Our clothes were pretty drab before the invention of aniline dyes, but then color became cheap and there was an explosion in the colorful clothes we wear today,” said Richard Prum, chair and the William Robertson Coe Professor in the Department of Ornithology, Ecology and Evolutionary Biology and co-author of the paper. “The same type of thing seemed to have happened with birds.” Scientists have speculated for years about how birds obtained their colors, but the Yale/Cambridge study was the first to ask what the diversity of bird colors actually looks like to birds themselves. Ironically, the answer is that birds not only see many more colors than humans can, they are also capable of seeing many more colors than they display in their plumage. Birds have additional color cones in their retina that are sensitive to the ultraviolet range, so they see colors that are invisible to humans.

Over time, birds have evolved a dazzling palette that includes various melanin pigments, which give human skin its tint; carotenoid pigments, which come from their diets; and structural colors, like the blue of human eyes. The study shows that structural colors produce the lion’s share of the color diversity in bird feathers, even though they are relatively rare among birds. Co-author Mary Caswell Stoddard of Cambridge, who began investigating the avian visual system as an undergraduate at Yale, would like to know why birds have not yet developed the ability to produce, for example, ultraviolet yellow or red colors in their feathers — colors invisible to humans but visible to the birds themselves. “We don’t know why plumage colors are confined to this subset,” Stoddard said. “The out-of-gamut colors may be impossible to make with available mechanisms, or they may be disadvantageous.” “That doesn’t mean that birds’ color palette might not eventually evolve to expand into new colors,” Prum said. “Birds can make only about 26 to 30 percent of the colors they are capable of seeing, but they have been working hard over millions of years to overcome these limitations. The startling thing to realize is that although the colors of birds look so incredibly diverse and beautiful to us, we are color blind compared to birds.”

Science Daily
July 12, 2011

Original web page at Science Daily

Sections of retinas regenerated and visual function increased with stem cells from skin

Scientists from the Schepens Eye Research Institute are the first to regenerate large areas of damaged retinas and improve visual function using induced pluripotent stem (iPS) cells derived from skin. The results of their study, published in PLoS ONE this month, hold great promise for future treatments and cures for diseases such as age-related macular degeneration, retinitis pigmentosa, diabetic retinopathy and other retinal diseases that affect millions worldwide. “We are very excited about these results,” says Dr. Budd A. Tucker, the study’s first author. “While other researchers have been successful in converting skin cells into induced pluripotent stem cells (iPSCs) and subsequently into retinal neurons, we believe that this is the first time that this degree of retinal reconstruction and restoration of visual function has been detected,” he adds. Tucker, who is currently an Assistant Professor of Ophthalmology at the University of Iowa Carver College of Medicine, completed the study at the Schepens Eye Research Institute in collaboration with Dr. Michael J. Young, the principal investigator of the study, who heads the Institute’s regenerative medicine center.

Today, diseases such as retinitis pigmentosa (RP) and age-related macular degeneration (AMD) are the leading causes of incurable blindness in the western world. In these diseases, the retina’s light-sensing cells, the photoreceptors, begin to die, and with them goes the eye’s ability to capture light and transmit this information to the brain. Once destroyed, retinal cells, like other cells of the central nervous system, have limited capacity for endogenous regeneration. “Stem cell regeneration of this precious tissue is our best hope for treating and someday curing these disorders,” says Young, who has been at the forefront of vision stem cell research for more than a decade. While Tucker, Young and other scientists were beginning to tap the potential of embryonic and adult stem cells early in the decade, the discovery that skin cells could be transformed into “pluripotent” cells, nearly identical to embryonic cells, stirred excitement in the vision research community. Since 2006, when researchers in Japan first used a set of four “transcription factors” to signal skin cells to become iPSCs, vision scientists have been exploring ways to use this new technology. Like embryonic stem cells, iPSCs have the ability to become any other cell in the body, but are not fraught with the ethical, emotional and political issues associated with the use of tissue from human embryos.

Tucker and Young harvested skin cells from the tails of red fluorescent mice. They used red mice because the red tissue would be easy to track when transplanted into the eyes of non-fluorescent diseased mice. By forcing these cells to express the four Yamanaka transcription factors (named for their discoverer), the group generated red fluorescent iPSCs and, with additional chemical coaxing, precursors of retinal cells. Precursor cells are immature photoreceptors that mature only in their natural habitat — the eye. Within 33 days the cells were ready to be transplanted and were introduced into the eyes of a mouse model of retinal degenerative disease. Due to a genetic mutation, the retinas of these recipient mice quickly degenerate, the photoreceptor cells die, and at the time of transplantation electrical activity, as detected by electroretinography (ERG), is absent. Within four to six weeks, the researchers observed that the transplanted “red” cells had taken up residence in the appropriate retinal area (the photoreceptor layer) of the eye and had begun to integrate and assemble into healthy-looking retinal tissue.

The team then retested the mice with ERG and found a significant increase in electrical activity in the newly reconstructed retinal tissue. In fact, the amount of electrical activity was approximately half of what would be expected in a normal retina. They also conducted a dark adaptation test to see whether connections were being made between the new photoreceptor cells and the rest of the retina. In brief, the group found that by stimulating the newly integrated photoreceptor cells with light, they could detect a signal in the downstream neurons that was absent in the untreated eye. Based on the results of their study, Tucker and Young believe that skin cells harvested for retinal regeneration are, and will continue to be, a promising resource for the future.

Science Daily
May 31, 2011

Original web page at Science Daily

Homing in on a genetic cause of severe glaucoma, an animal model

More than half of the cases of blindness due to glaucoma are a result of angle-closure glaucoma (ACG), a less common but far more serious form of the disease. Until now, researchers have made little progress toward understanding the molecular cause of ACG. But Howard Hughes Medical Institute researchers have now developed an animal model of ACG that mimics the human disease and pinpointed a gene that may be implicated in this and other eye disorders. “We’ve identified an important new gene that underlies angle-closure glaucoma phenotypes in mice, and conditions affecting ocular size in mice,” says Simon John, the HHMI investigator who led the study. “This gene is going to give us new insight into pathways for understanding these conditions, and we’re following up to understand the gene in human patients.”

“There are many people whose eyes are at risk of ACG, and we might be able to intervene before the problem ever starts.” Angle-closure glaucoma occurs when the cornea and iris of the eye meet at too narrow an angle, blocking drainage structures responsible for drawing off excess fluid. Eyes constantly produce fluid and without adequate drainage, that fluid can build up rapidly and increase internal pressure to a degree that quickly becomes dangerous. Figuring out why some people are more susceptible to the disease and what molecular pathways are responsible could ultimately lead to better treatments and predictive techniques. “Virtually nothing is known about the molecular factors that regulate ACG and so it’s a very poorly understood glaucoma. But there are 16 million people affected by it. Our linking a novel gene to it is a key step towards understanding the molecular processes,” says Sai Nair, a postdoctoral fellow in John’s research group at The Jackson Laboratory.

In research published May 1, 2011, in the journal Nature Genetics, John, Nair, and colleagues describe a mouse mutant whose eyes mimic those of humans with ACG. Like human eyes, they begin to show dangerously high intraocular pressure when the mice are still young. And like humans with ACG, the mice have eyes that are slightly smaller than those of their peers but lenses of normal size — a combination prone to decreased angle size. “What’s fascinating about ACG is that there are a lot of different physiological processes interacting in complex ways, and there hasn’t been an underlying unifying mechanism identified that can explain all these changes,” says John. With their new mouse model of the disease, scientists are now better equipped to tease apart those mechanisms. John is already setting the pace, along with Nair, who is a first author of the Nature Genetics paper. The team first observed the ACG-like symptoms in a mouse carrying an unknown gene mutation. Together with their colleagues, John and Nair mapped the mutation to a specific gene, which encodes a protease, an enzyme responsible for breaking down certain proteins. Mounira Hmani-Aifa at the Université de Sfax in Tunisia, who has been studying the genetics of families with inherited eye disorders, has found that in humans the gene is located in a region of the genome that has been linked to another vision disorder called posterior microphthalmia. Patients with posterior microphthalmia have small eyes with extreme hyperopia, or severe far-sightedness.

The protein may also be important in a process known as emmetropization, in which, as a baby’s brain begins to process the world around it, the eye’s shape is corrected until light focuses precisely enough on the retina to create sharp vision — suggesting that defects associated with ACG may begin during childhood development. To examine the clinical relevance of their mouse studies, John’s team collaborated with Hmani-Aifa’s team. They discovered that mutations in the same gene they identified in their mice are present in some patients with posterior microphthalmia.

Howard Hughes Medical Institute
May 17, 2011

Original web page at Howard Hughes Medical Institute

Gene therapy shows promise against age-related macular degeneration

A gene therapy approach using a protein called CD59, or protectin, shows promise in slowing the signs of age-related macular degeneration (AMD), according to a new in vivo study by researchers at Tufts University School of Medicine. Led by senior author Rajendra Kumar-Singh, PhD, the researchers demonstrated for the first time that CD59 delivered by a gene therapy approach significantly reduced the uncontrolled blood vessel growth and cell death typical of AMD, the most common cause of blindness in the elderly. Activation of the complement system, a part of the immune system, slowly kills cells in the back of the eye, leading to AMD; it does so by generating pores, known as the membrane attack complex (MAC), in cell membranes. CD59 is known to block the formation of MAC. “CD59 is unstable, and hence previous studies using CD59 have had limited success. The gene therapy approach that we developed continuously produces CD59 in the eye and overcomes these barriers, giving us renewed hope that it can be used to fight the progression of AMD and potentially other diseases,” said Kumar-Singh.

Kumar-Singh is associate professor in the department of ophthalmology at Tufts University School of Medicine (TUSM) and member of the genetics; neuroscience; and cell, molecular, and developmental biology program faculties at the Sackler School of Graduate Biomedical Sciences at Tufts. Kumar-Singh and colleagues delivered CD59 to the eye using a deactivated virus similar to one previously shown to be safe in humans. Using an established mouse model of age-related macular degeneration, they found that eyes treated with CD59 had 62 percent less uncontrolled blood vessel growth and 52 percent less MAC than controls. “Treatment was effective when administered at a very specific location beneath the retina, but importantly, also when it was administered to the center of the eye. This finding is especially encouraging because it would allow for a safer and more convenient route of administration of treatment,” said co-first author Siobhan Cashman, PhD, assistant professor in the department of ophthalmology at Tufts University School of Medicine and member of Kumar-Singh’s lab.

The current standard treatment for some forms of AMD requires an injection directly into the eye approximately every four weeks. According to Kumar-Singh, gene therapy approaches to treating AMD are especially attractive because they would allow patients to be treated less frequently, reducing patient discomfort and lowering the chances of infection and other side effects associated with frequent injections into the eye. The researchers, including co-first author Kasmir Ramo, BS, research technician, believe that while CD59 has significant potential as a treatment for AMD, the gene therapy approach also lends itself to application in other eye and systemic disorders in which low-level activation of complement has been implicated. “Prior to initiating human clinical trials, we will need to perform extensive preclinical toxicology studies. In order to advance this study to Phase I clinical trials, we have formed a partnership with Hemera Biosciences Inc. to raise private venture capital,” said Kumar-Singh.

AMD, which results in a loss of sharp, central vision, is the number one cause of visual impairment among Americans age 60 and older. While treatments are available for wet AMD, they do not prevent the progression of dry AMD, the form that affects 90 percent of AMD patients. Kumar-Singh noted, however, that the current study in combination with a previously published study from his laboratory suggests that CD59 may be useful for the treatment of both the dry and wet forms of AMD. The study was published on April 28 in PLoS ONE.

Science Daily
May 17, 2011

Original web page at Science Daily

Pig stem cell transplants: The key to future research into retina treatment

A team of American and Chinese scientists studying the role of stem cells in repairing damaged retinal tissue has found that pigs represent an effective proxy species for researching treatments for humans. The study, published in Stem Cells, demonstrates how stem cells can be isolated and transplanted between pigs, overcoming a key barrier to the research. Repairing the human retina after degenerative disease remains a challenge for medical science: unlike that of lower vertebrates, the human retina lacks a regenerative pathway, meaning that research has focused on cell transplantation. “The retina is the light-sensitive tissue lining the inner surface of the eye. Its outer layer is made up of rod and cone photoreceptor cells, which convert light signals,” said lead author Douglas Dean from the University of Louisville. “Traditionally, transplant studies have focused on mice and other rodents because of the variety of genetic material they represent; however, mouse retinal tissue is rod dominant, which is significantly different from the human eye.” Dr Dean’s team turned their attention to pigs because, as with humans, the swine eye contains a cone-dominant central visual streak, making it a closer anatomical and physiological match.

“Studies into swine models have been hampered in the past,” said Dean, “because the induced pluripotent stem cells (iPSCs) needed for such transplants had not been isolated from pigs, and their compatibility with a host’s photoreceptor cells had not been demonstrated.” Dr Dean’s team generated iPSCs from swine skin fibroblasts and demonstrated that these cells differentiated in culture and could be integrated with the cells of a second pig’s retina. While only a small section of the retina was transplanted for this study, the results could open a new avenue of research into degenerative conditions, as researchers now have a more effective human proxy species to work with. “Our results demonstrate that swine stem cells can be integrated into a damaged swine neural retina,” concluded Dean. “This research now lays a foundation for future studies of retinal stem cell transplantation in a swine model.”

EurekAlert! Medicine
May 3, 2011

Original web page at EurekAlert! Medicine

Spanish researchers replace pig corneal cells with human stem cells

Researchers at the University of Granada have made progress toward bioartificial organs by extracting pig corneal cells and replacing them with human stem cells. This method, known as decellularization and recellularization, allows scientists to maintain the basic structure of the cornea while replacing its cellular components. The research group comprises professors Antonio Campos and Miguel Alaminos (histologists), María del Mar Pérez, Ana Ionescu and Juan de la Cruz Cardona (opticians) and the ophthalmologist Miguel González Andrades of University Hospital San Cecilio, Granada. Their results are published online in the research journal Investigative Ophthalmology & Visual Science. The University of Granada researchers belong to the same group that made an artificial cornea with biomaterials designed at the Tissue Engineering Laboratory of the University of Granada, which is currently in the preparatory stage of a clinical trial. The authors of this study are also promoting the establishment of an Institute for Tissue Engineering in Granada, currently in the feasibility and design phase.

Science Daily
May 3, 2011

Original web page at Science Daily

Stem cells make ‘retina in a dish’

A retina made in a laboratory in Japan could pave the way for treatments for human eye diseases, including some forms of blindness. Created by coaxing mouse embryonic stem cells into a precise three-dimensional assembly, the ‘retina in a dish’ is far and away the most complex biological tissue yet engineered, scientists say. “There’s nothing like it,” says Robin Ali, a human molecular geneticist at the Institute of Ophthalmology in London who was not involved in the study. “When I received the manuscript, I was stunned, I really was. I never thought I’d see the day where you have recapitulation of development in a dish.” If the technique, published today in Nature, can be adapted to human cells and proved safe for transplantation — which will take years — it could offer an unlimited well of tissue to replace damaged retinas. More immediately, the synthetic retinal tissue could help scientists in the study of eye disease and in identifying therapies.

Nature
April 19, 2011

Original web page at Nature

Novel insights into glaucoma pathology following identification of glaucoma gene in beagle dogs and humans

Glaucoma – a leading cause of vision loss and blindness worldwide – runs in families. A team of investigators from Vanderbilt University and the University of Florida has identified a new candidate gene for the most common form of the eye disorder, primary open angle glaucoma (POAG). The findings, reported Feb. 17 in the open-access journal PLoS Genetics, offer novel insights into glaucoma pathology and could lead to targeted treatment strategies. Elevated pressure inside the eye is a strong risk factor for POAG. Pressure increases because of increased resistance to the flow of aqueous humor out of the eye’s front chamber (between the cornea and iris). Current treatments for POAG attempt to reduce intraocular pressure by reducing aqueous humor production or by surgically providing a clear “drain.” “It has been known for decades that the reason the pressure goes up in POAG is because the outflow pathway for aqueous humor is not working,” said Rachel Kuchtey, M.D., Ph.D., assistant professor of Ophthalmology and Visual Sciences and principal investigator of the current studies.

“It seems kind of simple – there’s a decrease in the rate of aqueous humor flowing out of the eye,” said John Kuchtey, Ph.D., research instructor in Ophthalmology and Visual Sciences and first author of the paper. “But the basic mechanisms of aqueous humor outflow at the cellular and molecular level – and how they are disrupted in glaucoma – are not understood. It’s a long-standing puzzle in ophthalmology.” So far, three genes have been associated with human glaucoma, but they account for only a small fraction of cases and have not shed much light on the disease process. The Vanderbilt investigators turned to a model with simpler genetics – a canine model of the disease. Forty years ago, Kirk Gelatt, V.M.D., from the UF College of Veterinary Medicine, came across a litter of beagles that had a high incidence of glaucoma. The finding suggested to Gelatt that the disease was inherited, and he established a colony of POAG-affected beagles to study treatments for the disease. In affected beagles, intraocular pressure begins to increase at 8 to 16 months of age, due to increased resistance to aqueous humor outflow. The clinical course of the disease “absolutely resembles human glaucoma,” Rachel Kuchtey said. The beagles are the only naturally occurring animal model for human POAG.

The Vanderbilt investigators used blood samples to search for genes associated with POAG. They first homed in on a particular spot (locus) on canine chromosome 20, which matched part of human chromosome 19. Previous studies had associated the human region with intraocular pressure, a good sign that they were on the right track, John Kuchtey said. Sequencing of the entire canine locus – over 4 million “letters” of DNA – revealed that a gene called ADAMTS10 was the strongest disease-associated candidate. POAG-affected dogs have a single mutation in the gene, which encodes a protein involved in processing the extracellular matrix (ECM), the connective and structural support tissue around cells. “There is a lot of evidence that proteoglycans (molecules in the ECM) and matrix remodeling might have something to do with aqueous outflow resistance, and so this gene supports that line of investigation,” John Kuchtey said.
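
The segregation logic behind such a candidate hunt can be illustrated with a toy filter (a sketch, not the authors’ actual analysis): within the mapped locus, keep variants for which all affected dogs share a genotype that no unaffected control carries. The positions, gene names and genotypes below are hypothetical.

```python
# Hypothetical variant records within the mapped locus: the set of
# genotypes seen across affected beagles and across unaffected controls.
variants = [
    {"pos": 58_300_121, "gene": "ADAMTS10",
     "affected": {"AA"}, "controls": {"GG", "GA"}},
    {"pos": 58_410_907, "gene": "SomeOtherGene",
     "affected": {"TT", "TC"}, "controls": {"TT"}},
]

def segregates(v):
    """All affected dogs share one genotype that no control carries."""
    return len(v["affected"]) == 1 and not (v["affected"] & v["controls"])

for v in filter(segregates, variants):
    print(v["gene"], v["pos"])   # -> ADAMTS10 58300121
```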

The researchers also demonstrated that the gene is highly expressed in the trabecular meshwork – the specialized filtration tissue through which aqueous humor passes – another piece of evidence that it may have a role in regulating aqueous humor outflow. The investigators are currently exploring ADAMTS10’s normal biological functions, and they have studies in progress to examine whether the human ADAMTS10 gene is mutated in human glaucoma. They are hopeful that understanding this gene will open therapeutic possibilities for glaucoma. “Right now we know that aqueous outflow is impaired in POAG, but we have no way to fix it because we don’t understand how it normally works and what the pathology is in POAG,” Rachel Kuchtey said. “If this gene truly plays a role in aqueous outflow regulation, we can begin to look at it – or its molecular partners – as targets for treatments.” Gene therapy to rescue a defect might also be a possibility: gene therapy for an inherited form of childhood blindness was first validated in dogs and is now in trials in humans.

eBioNews
March 8, 2011

Original web page at eBioNews

Implant appears effective for treating inflammatory disease within the eye

An implant that releases the medication dexamethasone within the eye appears safe and effective for the treatment of some types of uveitis (swelling and inflammation in the eye’s middle layer), according to a report posted online that will appear in the May print issue of Archives of Ophthalmology, one of the JAMA/Archives journals. “Uveitis refers to a group of intraocular inflammatory diseases that cause 10 percent to 15 percent of blindness in the developed world,” the authors write as background information in the article. “Despite advances in immunosuppressive treatments, corticosteroids remain the mainstay of therapy.” However, some patients cannot tolerate or do not respond to systemic corticosteroids, in part because the medications do not cross the blood-retinal barrier. Eye drops containing corticosteroids are effective only for anterior uveitis, closer to the front of the eye. Injections of corticosteroids directly into the eye have been effective in forms of intermediate or posterior uveitis, but require repeated treatments every two to three months and are associated with cataracts and other adverse effects.

To address these issues, an intravitreal (within the vitreous fluid of the eye), bioerodible, sustained-release implant has been developed to deliver a glucocorticoid medication, dexamethasone, to the back of the eye chamber. Careen Lowder, M.D., Ph.D., of the Cleveland Clinic Cole Eye Institute, and colleagues in the Ozurdex HURON Study Group conducted a 26-week randomized controlled trial involving 229 patients with intermediate or posterior uveitis. A total of 77 patients received an implant with a 0.7-milligram dose of dexamethasone, 76 received an implant with a 0.35-milligram dose and 76 underwent a sham procedure which followed the same protocol but used a needleless applicator. After eight weeks, the eyes were evaluated for the presence and degree of vitreous haze, or inflammation that obscures visualization. Vitreous haze was scored from zero to four, with zero indicating no inflammation and four indicating the most severe inflammation, obscuring the optic nerve. At the beginning of treatment, participants had an average vitreous haze score of two.

At the eight-week follow-up, a vitreous haze score of zero was observed in 47 percent of eyes with the 0.7-milligram implant, 36 percent of those with the 0.35-milligram implant and 12 percent of those that underwent the sham procedure. There was no significant difference between the two treatment doses, and the benefit associated with the implant persisted through the 26-week study. In addition, the percentage of eyes that achieved at least a 15-letter improvement in visual acuity was two- to six-fold greater in both implant groups than in the control group throughout the study. “The most common adverse events associated with intravitreal corticosteroids, which may have impacted use in the past, include increases in intraocular pressure [pressure within the eye] and cataract. On any given follow-up visit in the present study, substantial increases in intraocular pressure (to 25 millimeters of mercury or greater) occurred in less than 10 percent of treated eyes,” the authors write. In addition, only one of 62 phakic (with natural lenses) eyes required surgery to remove a cataract.
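
As a back-of-envelope check (a sketch, not the trial’s own statistics), the eight-week response rates reported above already imply a highly significant difference between the 0.7-milligram implant and the sham procedure; the counts below are reconstructed from the published percentages and group sizes.

```python
# 2x2 comparison: eyes reaching vitreous haze score zero vs. not,
# 0.7-mg implant group vs. sham group. Counts approximated from the
# reported 47% of 77 eyes and 12% of 76 eyes.
from scipy.stats import chi2_contingency

implant_clear = round(0.47 * 77)   # ~36 of 77 eyes
sham_clear = round(0.12 * 76)      # ~9 of 76 eyes
table = [[implant_clear, 77 - implant_clear],
         [sham_clear, 76 - sham_clear]]

chi2, p, _, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2g}")  # a strongly significant difference
```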

“In conclusion, the present study demonstrated that in patients with non-infectious intermediate or posterior uveitis, a single dose of the dexamethasone intravitreal implant was well tolerated and produced significant improvements in intraocular inflammation and visual acuity that persisted for six months,” the authors conclude. “Overall, the 0.7-milligram dexamethasone intravitreal implant demonstrated greater efficacy than the 0.35-milligram dexamethasone intravitreal implant, with similar safety.”

Science Daily
January 24, 2011

Original web page at Science Daily

New cell type implicated in vision

Blind mice appear to retain some ability to sense the brightness of their surroundings thanks to cells that contain the light-sensitive protein melanopsin. Rods and cones hog all the credit for allowing us to see. But these light-sensitive neurons get some help from a much rarer kind of cell, according to a new study. If these unheralded cells are as important as the authors suspect, studying them may open the door to new therapies for some forms of blindness. Scientists have known of the existence of these nerve cells, called melanopsin-containing retinal ganglion cells (mRGCs), since 2000. Research over the past decade has shown that they play an important role in reflexive responses to light, such as pupil constriction and regulation of the body’s sleep-wake cycle. But they did not appear to be involved in vision. In July, however, researchers reported in the journal Neuron that the stringy extensions, or axons, of mRGCs extend into parts of the mouse brain involved in conscious vision, not just the parts of the brain that control unconscious responses to light. The latest study confirms that finding and suggests that mRGCs enable mice to sense the brightness of their surroundings.

In the new work, researchers tagged the mRGCs with a blue protein to see where the cells occur in the mouse eye. When they tracked the cells’ axons from the eye into the brain, they saw that many of them terminated in the lateral geniculate nucleus (LGN), the first relay station in the brain for visual information. The researchers posited that if mRGCs are involved in mouse vision, light would produce activity in the visual centers of the brain even in mice that lack rods and cones. To test this, they inserted thin wire electrodes into the LGNs of 18 mice and recorded electrical signals. “What we did is keep the mice in total darkness,” says Timothy Brown, a neuroscientist at the University of Manchester in the United Kingdom. “And then we would switch on a light of a particular brightness for 60 seconds.” The team tested a range of light intensities, from starlight to bright daylight, and found that light as intense as daylight fired up the LGN.

Brown and colleagues also looked at whether mRGCs send information to the LGN in mice with normal vision. “We found that approximately 40% of the brain cells that process visual signals appear to receive information from mRGCs,” says Brown, whose team reports its work today in PLoS Biology. “This is a particularly surprising finding since mRGCs themselves make up only 2% of the retinal cells that communicate to the brain.” What the researchers don’t yet know is whether mRGCs merely signal overall brightness or can sense variations in brightness across the visual field, which might allow a mouse to distinguish between a dark wall and a brightly lit doorway, for example. If it’s the latter, Brown says, the findings may open the door to new therapies for retinal degeneration. He envisions some sort of visual aid designed to maximize the activity of these cells but notes that even if such therapies are possible, they won’t be available anytime soon.

This is not the first paper to suggest that melanopsin cells play a role in conscious vision, says David Berson, a neuroscientist at Brown University and co-author of the Neuron paper, but it is “a significant new addition to a breaking story.” However, he questions how relevant this line of research will be for blind people. The number of individuals with nonfunctional rods and cones that still have the ability to sense light is likely “vanishingly small,” he says. Samer Hattar, a neuroscientist at Johns Hopkins University in Baltimore, Maryland, and lead author on the Neuron paper, says he isn’t convinced that the study proves that mRGCs are a key component of conscious vision in mice with functional rods and cones. Hattar points out that no group has yet shown that mice lacking melanopsin have inferior vision based on their behavior. “Just because you see something doesn’t mean that it’s going to be physiologically relevant,” he says. “The story is not finished.”

ScienceNow
December 21, 2010

Original web page at ScienceNow