Categories
News

Snakes have adapted their vision to hunt their prey day or night

For example, snakes that need good eyesight to hunt during the day have eye lenses that act as sunglasses, filtering out ultraviolet light and sharpening their vision, while nocturnal snakes have lenses that allow ultraviolet light through, helping them to see in the dark.

New insight into the relationship between ultraviolet (UV) filters and hunting methods in snakes is one of the findings of the first major study of visual pigment genes and lenses in snakes — published in the advanced online edition of Molecular Biology and Evolution.

The new research was an international collaboration between snake biologists and vision experts led by David Gower, and included fellow Natural History Museum researchers Bruno Simões and Filipa Sampaio. Much of the research, including most of the DNA analyses, was carried out in the Museum’s laboratories.

Scientists have long known that snakes have highly variable sets of rods and cones — the specialised cells in the retina that an animal uses to detect light. But until now, most modern studies of vision in vertebrates (animals with a backbone) have concentrated on mammals, birds and fish.

To see in different colors, animals use visual pigments in their rods and cones that are sensitive to different wavelengths of light. The researchers examined the genes involved in producing the pigments in a broad genomic survey of 69 different species of snakes. They found that as the genes vary from species to species, so do the exact molecular structure of the pigments and the wavelengths of light they absorb.

The new research discovered that most snakes possess three visual pigments and are likely dichromatic in daylight — seeing two primary colours rather than the three that most humans see.

However, it also discovered that snake visual pigment genes have undergone a great amount of adaptation, including many changes to the wavelengths of light that the pigments are sensitive to, in order to suit the diversity of lifestyles that snakes have evolved.

Most snakes examined in the new study are sensitive to UV light, which likely allows them to see well in low light conditions. For light to reach the retina and be absorbed by the pigments, it first travels through the lens of the eye. Snakes with UV-sensitive visual pigments therefore have lenses that let UV light through.

In contrast, the research showed that those snakes that rely on their eyesight to hunt in the daytime, such as the gliding golden tree snake Chrysopelea ornata and the Montpellier snake Malpolon monspessulanus, have lenses that block UV light. As well as perhaps helping to protect their eyes from damage, this likely helps sharpen their sight — in the same way that skiers’ yellow goggles cut out some blue light and improve contrast.

Moreover, these snakes with UV-filtering lenses have tuned the pigments in their retinas so that they are no longer sensitive to short-wavelength UV light, but instead absorb longer wavelengths.

All nocturnal species examined (such as North America’s glossy snake Arizona elegans) were found to have lenses that do not filter UV. Some snake species active in daylight also lack a UV-filtering lens, perhaps because they are less reliant on very sharp vision or live in places without very bright light.

By analysing how the pigments have evolved in snakes, the new study also concluded that the most recent common ancestor of all living snakes had UV-sensitive vision. “The precise nature of the ancestral snake is contentious, but the evidence from vision is consistent with the idea that it was adapted to living in low light conditions on land,” said corresponding author Gower.

https://www.sciencedaily.com/  Science Daily

https://www.sciencedaily.com/releases/2016/08/160816182620.htm  Original web page at Science Daily

Categories
News

No blue light, please, I’m tired: Light color determines sleepiness versus arousal in mice

Light affects sleep. A study in mice published in Open Access journal PLOS Biology shows that the actual color of light matters; blue light keeps mice awake longer while green light puts them to sleep easily. An accompanying Primer provides accessible context information and discusses open questions and potential implications for “designing the lighting of the future.”

Light shining into our eyes not only mediates vision but also has critical non-image-forming functions such as the regulation of circadian rhythm, which affects sleep and other physiological processes. For humans, light generally keeps us awake, and darkness makes us sleepy. For mice, which are mostly nocturnal, light is a sleep-inducer. Previous studies in mice and humans have shown that non-image-forming light perception occurs in specific photosensitive cells in the eye and involves a light sensor called melanopsin. Mice without melanopsin are slower to fall asleep when exposed to light, pointing to a critical role for melanopsin in sleep regulation.

Stuart Peirson and Russell Foster, both from Oxford University, UK, alongside colleagues from Oxford and elsewhere, investigated this further by studying sleep induction in mice exposed to colored light, i.e., light of different wavelengths. Based on the physical properties of melanopsin, which is most sensitive to blue light, the researchers predicted that blue light would be the most potent sleep inducer.

To their surprise, that was not the case. Green light, it turns out, puts mice to sleep quickly, whereas blue light actually seems to stimulate the mice, though they did fall asleep eventually. Mice lacking melanopsin were oblivious to light color, demonstrating that the protein is directing the differential response.

Both green and blue light elevated levels of the stress hormone corticosterone in the blood of exposed mice compared with mice kept in the dark, the researchers found. Corticosterone levels in response to blue light, however, were higher than levels in mice exposed to green light. When the researchers gave the mice drugs that block the effects of corticosterone, they were able to mitigate the effects of blue light; drugged mice exposed to blue light went to sleep faster than control mice that had received placebos.

Citing previous results that exposure to blue light — a predominant component of light emitted by computer and smart-phone screens — promotes arousal and wakefulness in humans as well, the researchers suggest that “despite the differences between nocturnal and diurnal species, light may play a similar alerting role in mice as has been shown in humans.” Overall, they say their work “shows the extent to which light affects our physiology and has important implications for the design and use of artificial light sources.”

In the accompanying Primer, Patrice Bourgin, from the University of Strasbourg, France, and Jeffrey Hubbard from the University of Lausanne, Switzerland, say the study “reveals that the role of color [in controlling sleep and alertness] is far more important and complex than previously thought, and is a key parameter to take into account.” The study’s results, they say, “call for a greater understanding of melanopsin-based phototransduction and tell us that color wavelength is another aspect of environmental illumination that we should consider, in addition to photon density, duration of exposure and time of day, as we move forward in designing the lighting of the future, aiming to improve human health and well-being.”

https://www.sciencedaily.com/  Science Daily

https://www.sciencedaily.com/releases/2016/08/160815185816.htm Original web page at Science Daily

Categories
News

How birds unlock their super-sense, ultraviolet vision

The ability of finches, sparrows, and many other birds to see a visual world hidden to us is explained in a study published in the journal eLife.

Birds can be divided into those that can see ultraviolet (UV) light and those that cannot. Those that can live in a sensory world apart, able to transmit and receive signals between each other in a way that is invisible to many other species. How they unlock this extra dimension to their sight is revealed in new findings from the Washington University School of Medicine in St. Louis.

The study reveals two essential adaptions that enable birds to expand their vision into the UV range: chemical changes in light-filtering pigments called carotenoids and the tuning of light-sensitive proteins called opsins.

Birds acquire carotenoids through their diets and process them in a variety of ways to shift their light absorption toward longer or shorter wavelengths. The researchers characterized the carotenoid pigments from birds with violet vision and from those with UV vision and used computational models to see how the pigments affect the number of colors they can see.

“There are two types of light-sensitive cells, called photoreceptors, in the eye: rods and cones. Cone photoreceptors are responsible for color vision. While humans have blue, green, and red-sensitive cones only, birds have a fourth cone type which is either violet or UV-sensitive, depending on the species,” says senior author Joseph Corbo, MD, PhD, Associate Professor of Pathology and Immunology.

“Our approach showed that blue-cone sensitivity is fine-tuned through a change in the chemical structure of carotenoid pigments within the photoreceptor, allowing both violet and UV-sighted birds to maximize how many colors they can see.”

The study also revealed that the sensitivity of the violet/UV cone and the blue cone in birds must shift in sync to allow for optimum vision. Among bird species, there is a strong relationship between the light sensitivity of opsins within the violet/UV cone and mechanisms within the blue cone, which coordinate to ensure optimal vision into the UV range.

Taken together, these results suggest that both blue and violet cone cells have adapted during evolution to enhance color vision in birds.

“The majority of bird species rely on vision as their primary sense, and color discrimination plays a crucial role in their essential behaviors, such as choosing mates and foraging for food. This explains why birds have evolved one of the most richly endowed color vision systems among vertebrates,” says first author Matthew Toomey, a postdoctoral fellow at the Washington University School of Medicine.

“The precise coordination of sensitivity and filtering in the visual system may, for example, help female birds discriminate very fine differences in the elaborate coloration of their suitors and choose the fittest mates. This refinement of visual sensitivity could also facilitate the search for hidden seeds, fruits, and other food items in the environment.”

The team now plans to investigate the underlying molecular mechanisms that help modify the carotenoid pigments and light-sensitive protein tuning in a wide range of bird species, to gather further insights into the evolution of UV vision.

https://www.sciencedaily.com/  Science Daily

https://www.sciencedaily.com/releases/2016/07/160712093355.htm  Original web page at Science Daily

Categories
News

Model helps identify drugs to treat cat eye infections

It’s a problem veterinarians see all the time, but there are few treatments. Feline herpes virus 1 (FHV-1) is a frequent cause of eye infections in cats, but the drugs available to treat these infections must be applied multiple times a day and there is scant scientific evidence to support their use.

Now scientists at the Baker Institute for Animal Health at Cornell’s College of Veterinary Medicine have developed a model system that can be used to test drugs for treating these eye infections, and early results have pointed to a new drug for treating FHV-1 that will soon head to clinical trials. The work is reported in the Journal of General Virology.

“Herpes-induced cornea infections are a big problem in cats,” says Dr. Gerlinde Van de Walle, who led the study. Cats infected with FHV-1 will blink continuously, squint and have a teary, sore-looking eye or eyes. “If not treated, FHV-1 infection can eventually lead to blindness,” she says.

“We wanted to develop a model system that could predict whether an antiviral drug would work against FHV-1 in cats,” says Van de Walle. They were also searching for an easy way to identify drugs that could be given only once every 24 hours, because, as vets and many cat owners know, giving medication to a cat multiple times a day can be a difficult, painful thing to accomplish. Smearing ointment in a cat’s eyes might be easy the first and second time, but once the cat learns what you’re up to with that little tube, she will most likely hide or fight.

Van de Walle and her team used tissues donated from cats that died of causes other than eye disease. The outer clear layer of the eye, called the cornea, is shaped like a contact lens but has the consistency of Jell-O. To maintain the natural, dome-shaped structure of these corneas under laboratory conditions, the team gently filled them with agarose, waited for the agarose to firm up, then turned them over and kept them in a liquid medium. The model better resembles what happens in the eyes of a cat compared with using a single layer of cells in a dish and can, therefore, better predict what to expect in the animal.

To use these petri plate corneas as a model of FHV-1 infection, they applied the virus to some of the corneas and left others uninfected. They then tested the effectiveness of two drugs that are used for topical treatment of FHV-1 eye infections in cats: cidofovir, which is frequently used in the clinic, and acyclovir, which has shown some activity when given frequently. Both drugs cleared the infection when applied every 12 hours, but cidofovir was more effective.

Taking it a step further, Van de Walle and her team used the model system to identify another drug for treating FHV-1 infections. The antiretroviral drug raltegravir is commonly used in humans to treat HIV infections, and although some reports indicated it could be effective against herpes viruses, it had never been used to treat FHV-1 in cats before.

“We found that it is very effective against FHV-1. It even worked when we applied the drug only once every 24 hours,” says Van de Walle. This means raltegravir could be just as efficient as the other drugs available for treating FHV-1 infections, but would only have to be administered once daily. Van de Walle says she hopes eventually to see the drug tested in a well-controlled clinical trial.

https://www.sciencedaily.com/  Science Daily

https://www.sciencedaily.com/releases/2016/07/160719123900.htm Original web page at Science Daily

Categories
News

Current stimulation to the brain partially restores vision in patients with glaucoma and optic nerve damage

Vision loss due to glaucoma or optic nerve damage is generally considered irreversible. Now a new prospective, randomized, multi-center clinical trial demonstrates significant vision improvement in partially blind patients after 10 days of noninvasive, transorbital alternating current stimulation (ACS). In addition to activation of their residual vision, patients also experienced improvement in vision-related quality of life such as acuity, reading, mobility or orientation. The results are reported in PLOS ONE.

“ACS treatment is a safe and effective means to partially restore vision after optic nerve damage probably by modulating brain plasticity, re-synchronizing brain networks, which were desynchronized by vision loss. This class 1 evidence is the first ever large-scale multi-center clinical trial in the field of non-invasive brain modulation using electric currents and suggests that visual fields can be improved in a clinically meaningful way,” commented lead investigator Bernhard A. Sabel, PhD, of the Institute of Medical Psychology, Medical Faculty, Otto-von-Guericke University of Magdeburg (Germany).

In a study conducted at three German clinical centers (University of Göttingen, Charité Berlin, and University of Magdeburg), 82 patients were enrolled in a double-blind, randomized, sham-controlled clinical trial, 33 with visual deficits caused by glaucoma and 32 with anterior ischemic optic neuropathy caused by inflammation, optic nerve compression (due to tumors or intracranial hemorrhage), congenital anomalies, or Leber’s hereditary optic neuropathy. Eight patients had more than one cause of optic nerve atrophy.

The groups were randomized so that 45 patients underwent 10 daily applications of ACS for up to 50 minutes per day over a two-week period and 37 patients received sham stimulation. The only difference between groups before treatment was that the stimulation group included more men than the sham group; no other differences were found, including age of the lesion or visual field characteristics. ACS was applied with electrodes on the skin near the eyes. Vision was tested before and 48 hours after completion of treatment, and then again two months later to check if any changes were long-lasting.

Patients receiving ACS showed significantly greater improvements in perceiving objects in the whole visual field than individuals in the sham-treated group. Specifically, when measuring the visual field, a 24% improvement was noted after treatment in the ACS group compared to a 2.5% improvement in the sham group. This was due to significant improvements in the defective visual field sector of 59% in the ACS group and 34% in the sham group, which received a minimal stimulation protocol. Further analyses showed improvements in the ACS group at the edges of the visual field. The benefits of stimulation were found to be stable two months later, as the ACS group showed a 25% improvement in the visual field compared to negligible changes (0.28%) in the sham group.

Patient safety measures were maintained at a high level, in line with previous studies. Current flow was assessed using sophisticated computer simulation models. No participants reported discomfort during stimulation, although temporary dizziness and mild headaches were reported in rare cases.

The study results are in line with previous small sample studies in which efficacy and safety were observed. Those studies revealed that well-synchronized dynamic brain functional networks are critical for vision restoration. Although vision loss leads to de-synchronization, these neural networks can be re-synchronized by ACS via rhythmic firing of the ganglion cells of the retina, activating or “amplifying” residual vision. Dr. Sabel added that “while additional studies are needed to further explore the mechanisms of action, our results warrant the use of ACS treatment in a clinical setting to activate residual vision by brain network re-synchronization. This can partially restore vision in patients with stable vision loss caused by optic nerve damage.”

In summary, vision loss, long considered to be irreversible, can be partially reversed. There is now more light at the end of the tunnel for patients with low vision or blindness following glaucoma and optic nerve damage.

https://www.sciencedaily.com/  Science Daily

https://www.sciencedaily.com/releases/2016/06/160629145208.htm  Original web page at Science Daily

Categories
News

Stem cells used to identify cellular processes related to glaucoma

Using stem cells derived from human skin cells, researchers led by Jason Meyer, assistant professor of biology, along with graduate student Sarah Ohlemacher of the School of Science at Indiana University-Purdue University Indianapolis, have successfully demonstrated the ability to turn stem cells into retinal ganglion cells (RGCs), the neurons that conduct visual information from the eye to the brain. Their goal is the development of therapies to prevent or cure glaucoma.

In addition to glaucoma, a group of degenerative diseases that damage the eye’s optic nerve and can result in vision loss and blindness, this work has potential implications for treatment of optic nerve injuries of the types incurred by soldiers in combat or athletes in contact sports.

In the study, which appears online in advance of publication in the journal Stem Cells, the IUPUI investigators took skin cells biopsied from volunteers with an inherited form of glaucoma and from volunteers without the disease and genetically reprogrammed them to become pluripotent stem cells, meaning they are able to differentiate into any cell type in the body. The researchers then directed the stem cells to become RGCs, at which point the cells began adopting features specific to RGCs — features that were different in the cells of individuals with glaucoma than in the cells that came from healthy individuals.

Glaucoma is the most common disease that affects RGCs, which serve as the connection between the eye and the brain, sending information taken in by the eye to the brain for interpretation. When these cells are damaged or severed, the brain cannot receive critical information, leading to blindness. The National Institutes of Health’s National Eye Institute estimates that glaucoma affects more than 2.7 million people in the United States and more than 60 million worldwide.

“Skin cells from individuals with glaucoma are no different from skin cells of those without glaucoma,” said Meyer, a cell biologist and stem cell researcher, who also holds an appointment as a primary investigator with the Stark Neurosciences Research Institute at the Indiana University School of Medicine. “However, when we turned glaucoma patients’ skin cells into stem cells and then into RGCs, the cells became unhealthy and started dying off at a much faster rate than those of healthy individuals.

“Now that we have produced cells that develop features of glaucoma in culture dishes, we want to see if compounds we add to these RGCs can slow down the degeneration process or prevent these cells from dying off. We already have found candidates that look promising and are studying them. In the more distant future, we may be able to use healthy patient cells as substitute cells as we learn how to replace cells lost to the disease. It’s a significant challenge, but it’s the ultimate — and, we think, not unrealistic — long-range goal.”

https://www.sciencedaily.com/   Science Daily

https://www.sciencedaily.com/releases/2016/03/160321081227.htm  Original web page at Science Daily

Categories
News

Small birds’ vision: Not so sharp but superfast

One may expect a creature that darts around its habitat to be capable of perceiving rapid changes as well. Yet birds are famed more for their good visual acuity. Joint research by Uppsala University, Stockholm University and the Swedish University of Agricultural Sciences (SLU) now shows that, in small passerines (perching birds) in the wild, vision is considerably faster than in any other vertebrates — and more than twice as fast as ours.

The new research findings are published in the scientific journal PLOS ONE.

In behavioural experiments, the scientists have studied the ability to resolve visual detail in time in three small wild passerine species: blue tit, collared flycatcher and pied flycatcher. This ability is the temporal resolution of eyesight, i.e. the number of changes per second an animal is capable of perceiving. It may be compared to spatial resolution (visual acuity), a measure of the number of details per degree in the field of vision.

The researchers trained wild-caught birds to receive a food reward by distinguishing between a pair of lamps, one flickering and one shining a constant light. Temporal resolution was then determined by increasing the flicker rate to a threshold at which the birds could no longer tell the lamps apart. This threshold, known as the CFF (critical flicker fusion rate), averaged between 129 and 137 hertz (Hz). In the pied flycatchers it reached as high as 146 Hz, some 50 Hz above the highest rate encountered for any other vertebrate. For humans, the CFF is usually approximately 60 Hz. For passerines, the world might be said to be in slow motion compared with how it looks to us.
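The threshold procedure described above can be sketched as a short simulation. The starting rate, step size, and the subject model below are hypothetical illustrations for clarity, not the study’s actual protocol:

```python
def find_cff(can_discriminate, start_hz=50.0, step_hz=1.0, max_hz=300.0):
    """Raise the flicker rate until the subject can no longer tell the
    flickering lamp from the constant one; the last rate at which
    discrimination succeeded approximates the critical flicker fusion
    (CFF) threshold."""
    rate = start_hz
    last_success = None
    while rate <= max_hz:
        if not can_discriminate(rate):
            break
        last_success = rate
        rate += step_hz
    return last_success

# Hypothetical subject whose true fusion threshold is 146 Hz,
# the highest rate reported for the pied flycatchers.
cff = find_cff(lambda hz: hz < 146.0)
```

In the real experiments the "subject" is of course a trained bird choosing between lamps for a food reward; the function above only captures the ramp-to-threshold logic.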

It has been argued before, but never investigated, that small and agile wild birds should have extremely fast vision. Nevertheless, the blue tits and flycatchers proved to have higher CFF rates than were predicted from their size and metabolic rates. This indicates an evolutionary history of natural selection for fast vision in these species. The explanation might lie in small airborne birds’ need to detect and track objects whose image moves very swiftly across the retina — for blue tits, for example, to be able to see and avoid all branches when they take cover from predators by flying straight into bushes. Moreover, the three avian species investigated all, to a varying degree, subsist on the insects they catch. Flycatchers, as their name suggests, catch airborne insects. For this ability, aiming straight at the insect is not enough. Forward planning is required: the bird needs high temporal resolution to track the insect’s movement and predict its location the next instant.

The new results give some cause for concern about captive birds’ welfare. Small passerines are commonly kept in cages, and may be capable of seeing roughly as fast as their wild relatives. With the phase-out of incandescent light bulbs for reasons of energy efficiency, tame birds are increasingly often kept in rooms lit with low-energy light bulbs, fluorescent lamps or LED lighting. Many of these flicker at 100 Hz, which is thus invisible to humans but perhaps not to small birds in captivity. Studies have shown that flickering light can cause stress, behavioural disturbances and various forms of discomfort in humans and birds alike.
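The welfare concern above reduces to a simple comparison: a light source’s flicker is perceptible whenever its flicker frequency falls below the observer’s CFF. A minimal sketch using only the frequencies quoted in the text:

```python
def flicker_visible(flicker_hz, cff_hz):
    """Flicker is perceptible when the source flickers more slowly
    than the observer's critical flicker fusion (CFF) rate."""
    return flicker_hz < cff_hz

LAMP_HZ = 100  # low-energy bulbs and fluorescent lamps (see text)

human_sees_flicker = flicker_visible(LAMP_HZ, 60)        # human CFF ~60 Hz
flycatcher_sees_flicker = flicker_visible(LAMP_HZ, 146)  # pied flycatcher, up to 146 Hz
```

A 100 Hz source is fused into steady light for a human observer but may still appear as flicker to a small passerine.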

Of all the world’s animals, the eagle has the sharpest vision. It can discern 143 lines within one degree of the field of vision, while a human with excellent sight manages about 60. The magnitude of this difference is almost exactly the same as between a human’s top vision speed and a pied flycatcher’s: 60 and 146 Hz respectively. Thus, the flycatcher’s vision is faster than human vision to roughly the same extent as an eagle’s vision is sharper. So small passerines’ rapid vision is an evolutionary adaptation just as impressive as the sharp eyesight of birds of prey.
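The “almost exactly the same” comparison in the paragraph above is easy to verify from the quoted figures:

```python
# Spatial resolution: discernible lines per degree of visual field.
EAGLE_ACUITY, HUMAN_ACUITY = 143, 60
# Temporal resolution: critical flicker fusion rate in Hz.
FLYCATCHER_CFF, HUMAN_CFF = 146, 60

acuity_ratio = EAGLE_ACUITY / HUMAN_ACUITY   # eagle vs human sharpness
speed_ratio = FLYCATCHER_CFF / HUMAN_CFF     # flycatcher vs human speed
```

Both ratios come out at roughly 2.4, which is the sense in which the flycatcher’s speed advantage matches the eagle’s sharpness advantage.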

Anders Ödeen, the lecturer at Uppsala University’s Department of Ecology and Genetics who headed the study, puts the research findings in perspective.

‘Fast vision may, in fact, be a more typical feature of birds in general than visual acuity. Only birds of prey seem to have the ability to see in extremely sharp focus, while human visual acuity outshines that of all other bird species studied. On the other hand, there are lots of bird species similar to the blue tit, collared flycatcher and pied flycatcher, both ecologically and physiologically, so they probably also share the faculty of superfast vision.’

https://www.sciencedaily.com/  Science Daily

https://www.sciencedaily.com/releases/2016/03/160318144548.htm  Original web page at Science Daily

Categories
News

Vision restored in rabbits following stem cell transplantation

Scientists have demonstrated a method for generating several key types of eye tissue from human stem cells in a way that mirrors whole eye development.

When transplanted to an animal model of corneal blindness, these tissues are shown to repair the front of the eye and restore vision, which scientists say could pave the way for human clinical trials of anterior eye transplantation to restore lost or damaged vision.

A collaborative team comprising researchers from Cardiff University and Osaka University in Japan describe their findings today in Nature.

The eye is composed of highly specialized tissues that are derived from a variety of cell lineages during development.

Previous studies have demonstrated that particular cell types, such as those that constitute the retina or cornea, can be created in the laboratory from pluripotent stem cells. However, these studies do not represent the complexity of whole eye development.

This latest study reports the generation of multiple cell lineages of the eye, including the lens, cornea, and conjunctiva, using human induced pluripotent stem cells.

The scientists have been able to show that the corneal epithelial cells can be cultivated and transplanted onto the eyes of rabbits with experimentally induced blindness to surgically repair the front of the eye.

Study co-author Professor Andrew Quantock, from Cardiff University’s School of Optometry and Vision Sciences, said: “This research shows that various types of human stem cells are able to take on the characteristics of the cornea, lens and retina.

“Importantly, it demonstrates that one cell type — the corneal epithelium — could be further grown in the lab and then transplanted on to a rabbit’s eye where it was functional, achieving recovered vision.

“Our work not only holds potential for developing cells for treatment of other areas of the eye, but could set the stage for future human clinical trials of anterior eye transplantation to restore visual function.”

Around 4,000 corneal grafts, which rely on human organ donation, are performed by the NHS annually.

https://www.sciencedaily.com/   Science Daily

https://www.sciencedaily.com/releases/2016/03/160309135708.htm  Original web page at  Science Daily

Categories
News

Magnetoreception molecule found in the eyes of dogs, primates

Dog-like carnivores and some primate species may have a magnetic compass similar to that of birds. Cryptochromes are light-sensitive molecules that exist in bacteria, plants and animals. In animals, they are involved in the control of the body’s circadian rhythms. In birds, cryptochromes are also involved in the light-dependent magnetic orientation response based on Earth’s magnetic field: cryptochrome 1a is located in photoreceptors in birds’ eyes and is activated by the magnetic field. Now researchers from the Max Planck Institute for Brain Research in Frankfurt have also detected cryptochrome 1 in photoreceptors in several mammalian species. Therefore, it is possible that these animals also have a magnetic sense that is linked to their visual system.

The perception of Earth’s magnetic field is used by many animal species for orientation and navigation. A magnetic sense is found in some insects, fish, reptiles, birds and mammals, whereas humans do not appear to be able to perceive Earth’s magnetic field.

The magnetic sense in migratory birds has been studied in considerable detail: unlike a boy scout’s compass, which shows the compass direction, a bird’s compass recognizes the inclination of the magnetic field lines relative to Earth’s surface. Surprisingly, this inclination compass in birds is linked to the visual system as the magnetic field activates the light-sensitive molecule cryptochrome 1a in the retina of the bird’s eye. Cryptochrome 1a is located in the blue- to UV-sensitive cone photoreceptors and only reacts to the magnetic field if it is simultaneously excited by light.

Together with colleagues from the Ludwig-Maximilians-University Munich, the Goethe University Frankfurt, and the Universities of Duisburg-Essen and Göttingen, Christine Nießner and Leo Peichl from the Max Planck Institute for Brain Research in Frankfurt investigated the presence of cryptochrome 1 in the retinas of 90 species of mammal. Mammalian cryptochrome 1 is the equivalent of bird cryptochrome 1a. With the help of antibodies against the light-activated form of the molecule, the scientists found cryptochrome 1 only in a few species from the carnivore and primate groups. As is the case in birds, it is found in the blue-sensitive cones in these animals. The molecule is present in dog-like carnivores such as dogs, wolves, bears, foxes and badgers, but is not found in cat-like carnivores such as cats, lions and tigers. Among the primates, cryptochrome 1 is found in the orang-utan, for example. In all tested species of the other 16 mammalian orders, the researchers found no active cryptochrome 1 in the cone cells of the retina.

The active cryptochrome 1 is found in the light-sensitive outer segments of the cone cells. It is therefore unlikely that it controls the animals’ circadian rhythms from there, as this control occurs in the cell nucleus, which is located a considerable distance away. It is also unlikely that cryptochrome 1 acts as an additional visual pigment for colour perception. The researchers thus suspect that some mammals may use cryptochrome 1 to perceive Earth’s magnetic field. In evolutionary terms, the blue cones in mammals correspond to the blue- to UV-sensitive cones in birds. It is therefore entirely possible that the cryptochrome 1 in mammals has a comparable function.

Observations of foxes, dogs and even humans actually indicate that they can perceive Earth’s magnetic field. For example, foxes are more successful at catching mice when they pounce on them in a north-east direction. “Nevertheless, we were very surprised to find active cryptochrome 1 in the cone cells of only two mammalian groups, as species whose cones do not contain active cryptochrome 1, for example some rodents and bats, also react to the magnetic field,” says Christine Nießner.

One possible explanation for this is that animals can also perceive the magnetic field in a different way: for example, with the help of magnetite, microscopic iron-oxide particles in cells. A magnetite-based magnetic sense functions like a pocket compass and does not require any light. Mole rats, which live in lightless tunnel systems, orient themselves using this kind of compass. Birds also have an additional orientation mechanism based on magnetite, which they use to determine their position.

Many fundamental questions remain open in the research on the magnetic sense. Future studies will have to reveal whether the cryptochrome 1 in the blue cones is also part of a magnetic sense in mammals or whether it fulfils other tasks in the retina.

https://www.sciencedaily.com/  Science Daily

https://www.sciencedaily.com/releases/2016/02/160225105221.htm  Original web page at Science Daily


Radiation causes blindness in wild animals in Chernobyl

This year marks 30 years since the Chernobyl nuclear accident. Vast amounts of radioactive particles were spread over large areas of Europe. These particles, mostly Cesium-137, cause low but long-term exposure to ionizing radiation in animals and plants.

This chronic exposure has been shown to decrease the abundance of many animal species after both the Chernobyl and the later Fukushima nuclear accidents. Damage caused by acute exposure to high radiation doses has been demonstrated in numerous laboratory studies, but the effects of chronic exposure to low radiation in the wild remain largely unknown.

New research now suggests that chronic exposure to low radiation can damage the eyes of wild animals. This is shown in an international study led by researchers Philipp Lehmann and Tapio Mappes from the University of Jyväskylä, Finland, which was recently published in the journal Scientific Reports.

In the study, cataracts were found at higher frequencies in the lenses of bank voles that had lived in areas with elevated background radiation levels than in voles from areas with natural radiation levels. Cataract frequency increased with age in the voles, much as it does in humans. In addition, the effects of aging were intensified by the elevated radiation.

Interestingly, the effect of radiation was significant only in female voles. In humans, too, there are indications that the lens is highly radiosensitive. People with occupational exposure to radiation, such as radiology nurses, nuclear power plant workers and airline pilots, have an increased risk of cataracts, but potential gender differences in radiosensitivity should be studied further.

Reasons for the gender differences in wild mammals are still largely hypothetical. However, the present study suggests that increased cataract risk may be associated with reproduction, as female bank voles with severe cataracts produced fewer offspring. Whether the poorer reproductive success was caused by the cataracts or by the radiation itself is still unclear and will require further experimental studies.

Nevertheless, these new results support observations of the negative consequences of chronic low-level radiation exposure for wild animals and whole ecosystems. Studying the effects of chronic exposure to low radiation in natural ecosystems is highly important, as it will help to prepare for new nuclear accidents and predict their consequences, which can entail widespread effects that persist in nature for hundreds of years.


https://www.sciencedaily.com/releases/2016/02/160210110632.htm  Original web page at Science Daily


The magnetic compass of birds is affected by polarized light

The magnetic compass that birds use for orientation is affected by polarised light. This previously unknown phenomenon was discovered by researchers at Lund University in Sweden.

The discovery that the magnetic compass is affected by the polarisation direction of light was made when trained zebra finches were trying to find food inside a maze. The birds were only able to use their magnetic compass when the direction of the polarised light was parallel to the magnetic field, not when perpendicular to the magnetic field.

“We were expecting an effect, but not one so major that it would lead to complete disorientation when the direction of the polarisation of light was perpendicular to the direction of the magnetic field,” says Rachel Muheim, who was in charge of the study.

It is still unclear how the different directions of polarised light in relation to the Earth’s magnetic field affect birds in the wild. The researchers have put forward a hypothesis that the birds use it to accentuate the magnetic field during sunrise and sunset — times of day when migratory birds are believed to determine their direction and calibrate their compasses before migrating.

“In the middle of the day, when the polarised light is approximately perpendicular to the magnetic field, it can be an advantage that the magnetic field is less visible, so that it does not interfere at a time when visibility is important to locate food and to detect predators,” says Rachel Muheim.


http://www.sciencedaily.com/releases/2016/01/160126110912.htm  Original web page at Science Daily


Restoring vision: Retinal nerve cell regeneration

Glaucoma is a leading cause of blindness worldwide. Vision loss from glaucoma occurs when axons in the optic nerve become damaged and can no longer carry visual information to the brain.

Glaucoma is most often treated by lowering pressure in the eye with drugs, laser surgery, or traditional surgery. However, these treatments can only preserve remaining vision; they don’t improve or restore vision that already has been lost due to glaucoma.

The nervous system is divided into the peripheral and the central systems. Damaged peripheral nerves, in your arm for example, can regenerate after injury. However, the optic nerve and the spinal cord are in the central nervous system and unfortunately cannot regenerate after injury. This is why vision loss from glaucoma, like paralysis from spinal cord injury, is permanent. The unique cellular environment of nerve cells in the central nervous system may be why regeneration is prevented.

One strategy to encourage nerve fiber growth is to remove inhibitory factors in the cellular environment. Researchers hope to prevent expression of molecules that suppress axon growth using molecular biology techniques. For example, antibodies may be introduced to block the inhibition and allow nerve fibers to re-grow. Other strategies are in development as well:

  • Nerve grafts have been tried. However, when optic nerves are damaged, they respond by forming scar tissue; nerve fibers have not successfully regenerated across the scarred areas.
  • Nanotechnology has been used to create a protein nanofiber structure through which axons can regenerate.
  • Cellular implants are engineered cells that can give physical support to neuronal fibers and provide regeneration-promoting chemicals to aid in axonal growth.
  • Genetic manipulations might help to stimulate optic nerve regeneration, but there is much research yet to be done in this area.
  • Promoters of nerve growth such as oncomodulin have shown some promise in optic nerve regeneration.
  • Stem cell approaches have shown great promise for regeneration in models of glaucoma. Multipotent stem cells are retrieved from tissues such as bone marrow and fat, thus avoiding ethical concerns.

Researchers have made great progress in understanding the process of optic nerve degeneration and regeneration in glaucoma. Molecular factors have been identified for nerve fiber growth in the central nervous system. New strategies to prevent scar formation and guide nerve fibers are being developed using nanotechnology, gene therapy, and stem cells. The next challenges are to optimize nerve regeneration and test whether it restores functionally meaningful levels of vision for glaucoma patients. This article is based on a recent “Innovations in Glaucoma” Webinar produced by Glaucoma Research Foundation: http://www.glaucoma.org/news/podcasts/audio-podcasts.php


http://www.sciencedaily.com/releases/2015/12/151216150620.htm  Original web page at Science Daily


Study stops vision loss in late-stage canine X-linked retinitis pigmentosa

Three years ago, a team from the University of Pennsylvania announced that they had cured X-linked retinitis pigmentosa, a blinding retinal disease, in dogs. Now they’ve shown that they can cure the canine disease over the long term, even when the treatment is given after half or more of the affected photoreceptor cells have been destroyed.

Because the disease affects humans in almost the same fashion as it does dogs, the results suggest that this treatment could be effective and lasting in humans and could set the stage for safety studies that precede a human clinical trial.

“The 2012 study showed that gene therapy was effective if used as a preventive treatment or if you intervene right after the onset of cell death,” said William A. Beltran, co-lead author and associate professor of ophthalmology at Penn’s School of Veterinary Medicine. “That was obviously very encouraging. But now we’ve gone further, showing that the treatment is long-lasting and effective even when started at mid- and late-stage disease.”

“This happens to be a very severe disease with very early onset in the first two decades of life in humans,” said Artur V. Cideciyan, co-lead author and research professor of ophthalmology in the Scheie Eye Institute at Penn’s Perelman School of Medicine. “Because the progression of disease in dogs matches up with the progression in humans, this gives us a lot of confidence about translating these results to eventually treat humans.”

The work involved a close collaboration between Beltran and Cideciyan as well as Samuel G. Jacobson, professor of ophthalmology at Scheie, and Gustavo D. Aguirre, the paper’s senior author and professor of medical genetics and ophthalmology at Penn Vet. The Penn researchers have also long partnered with University of Florida scientists led by William Hauswirth, Rybaczki-Bullard Professor of Ophthalmology in the College of Medicine. Their work appears in Proceedings of the National Academy of Sciences.

X-linked retinitis pigmentosa, or XLRP, arises primarily from mutations in the RPGR gene, leading to progressive vision loss starting at a young age. Because it is an X chromosome-linked recessive disease, it overwhelmingly affects boys and men. It is one of the most common forms of inherited retinal disease.

Though rigorously studied, little is understood about the function of RPGR. It is believed to play a role in the function of the connecting cilium, a structure that is present in both rod and cone cells, the photoreceptor cells involved in dim-light and bright-light vision, respectively.

In XLRP, these photoreceptor cells progressively degenerate and die. To counter this effect, the Penn group’s earlier gene therapy work used a viral vector to deliver a normal copy of RPGR specifically to rods and cones using a subretinal injection.

In the new publication, the team reports that the therapy, administered when the dogs were 5 weeks old, successfully stopped photoreceptor cell loss and maintained vision in dogs for more than three years of study.

This study also went further, using the same viral vector and the same approach, except this time beginning the gene therapy intervention at two later time points: at 12 weeks of age, which the researchers term “mid-stage disease,” when approximately 40 percent of the eye’s photoreceptor cells have already died, or at 26 weeks of age, “late-stage disease,” when about 50 to 60 percent of the rods and cones have been lost.

The team had concerns about treating at these later stages, both that the retina might not properly reattach following the therapeutic subretinal injection and that there could be toxicity from the viral vector due to the greater extent of photoreceptor cell degeneration. They saw no indications of either being a problem in their follow-up.

“We have spent a lot of time working to make sure the therapeutic gene is tightly regulated in terms of when and where it is expressed,” said Aguirre. “And, thankfully, we have seen that this therapy appears to be well tolerated in the retina.”

Instead, what they saw, using non-invasive tests used in human medicine, including electroretinography and optical coherence tomography imaging, was a remarkable and lasting halt in the degeneration of photoreceptor cells in the treated region of the retina. Dogs treated at these later stages of disease even had some of the structural abnormalities in the rods and cones reversed. And these findings translated to improved performance on visual behavior tests, a Y maze that tested whether the dogs could detect a dim light and an obstacle course that assessed their visual navigational skills. The dogs’ performances endured for at least two and a half years after treatment, the latest time point examined, in the late-stage group.

“What the dog studies show, especially those that are treated at a later stage, is that you can treat a relatively small region — 20 percent or less of the retinal surface, where you already had 50 percent of photoreceptor cells that died before treatment — and still see not only an electrophysiological improvement and rescue but an actual rescue of visual behavior,” Beltran said.

“Based on my experience developing gene therapies in animal models for many other inherited retinal diseases,” said the University of Florida’s Hauswirth, “I believe this report describes perhaps the strongest case yet for eventual successful therapy in humans for XLRP.”

As in their earlier work, the researchers showed that the function of both rods and cones was rescued and that these photoreceptor cells were properly connected to the neurons that transmit visual signals to the brain.

“Because this is a photoreceptor disease that affects both rods and cones, or night and day vision cells, to show that both were rescued was very wonderful to see,” Cideciyan said.

“I worry a lot about my patients who have lost photoreceptor cells and possibly have abnormal connectivity and structure in their retina, whether gene therapy would still work for them at later stages of disease,” Jacobson said. “What we showed here is that the therapy resulted in downstream neurons that were robust and connected, which is exceptionally important for eventual human treatment.”

To move the work into the realm of human treatment, the researchers are examining patients to determine where in the retina may be a suitable place for injection and what patients might qualify for an eventual clinical trial. They are also studying the other genetic “partners” that function along with RPGR in the connecting cilium to see if there could be additional targets for therapy.


http://www.sciencedaily.com/releases/2015/10/151012174519.htm  Original web page at Science Daily


Genetic mutations linked to a form of blindness

Achromatopsia is a rare, inherited vision disorder that affects the eye’s cone cells, resulting in problems with daytime vision, clarity and color perception. It often strikes people early in life, and currently there is no cure for the condition. One of the most promising avenues for developing a cure, however, is through gene therapy, and to create those therapies requires animal models of disease that closely replicate the human condition.

In a new study, a collaboration between University of Pennsylvania and Temple University scientists has identified two naturally occurring genetic mutations in dogs that result in achromatopsia. Having identified the mutations responsible, they used structural modeling and molecular dynamics on the Titan supercomputer at Oak Ridge National Laboratory and the Stampede supercomputer at the Texas Advanced Computing Center to simulate how the mutations would impact the resulting protein, showing that the mutations destabilized a molecular channel essential to light signal transduction.

The findings provide new insights into the molecular cause of this form of blindness and also present new opportunities for conducting preclinical assessments of curative gene therapy for achromatopsia in both dogs and humans.

“Our work in the dogs, in vitro and in silico shows us the consequences of these mutations in disrupting the function of these crucial channels,” said Karina Guziewicz, senior author on the study and a senior research investigator at Penn’s School of Veterinary Medicine. “Everything we found suggests that gene therapy will be the best approach to treating this disease, and we are looking forward to taking that next step.”

The research began with a German shepherd that was brought to Penn Vet’s Ryan Hospital. The owners were worried about its vision. “This dog displayed a classical loss of cone vision; it could not see well in daylight but had no problem in dim light conditions,” said Gustavo Aguirre, professor of medical genetics and ophthalmology at Penn Vet.

The Penn Vet researchers wanted to identify the genetic cause, but the dog had none of the “usual suspects,” the known gene mutations responsible for achromatopsia in dogs. To find the new mutation, the scientists looked at five key genes that play a role in phototransduction, or the process by which light signals are transmitted through the eye to the brain.

They found what they were looking for on the CNGA3 gene, which encodes a cyclic nucleotide channel and plays a key role in transducing visual signals. The change was a “missense” mutation, meaning that the mutation results in the production of a different amino acid. Meanwhile, they heard from colleague Dixon that he had examined Labrador retrievers with similar symptoms. When the Penn team performed the same genetic analysis, they found a different mutation on the same part of the same gene where the shepherd’s mutation was found. Neither mutation had ever been characterized previously in dogs. “The next step was to take this further and look at the consequences of these particular mutations,” Guziewicz said.

The group had the advantage of using the Titan and Stampede supercomputers, which can simulate models of the atomic structure of proteins and thereby elucidate how the protein might function. That work revealed that both mutations disrupted the function of the channel, making it unstable.

“The computational approach allows us to model, right down to the atomic level, how small changes in protein sequence can have a major impact on signaling,” said MacDermaid, assistant professor of research at Temple’s Institute for Computational Molecular Science. “We can then use these insights to help us understand and refine our experimental and clinical work.”

The Temple researchers recreated these mutated channels and showed that one resulted in a loss of channel function. Further in vitro experiments showed that the second mutation caused the channels to be routed improperly within the cell.

Penn Vet researchers have had success in treating various forms of blindness in dogs with gene therapy, setting the stage to treat human blindness. In human achromatopsia, nearly 100 different mutations have been identified in the CNGA3 gene, including the very same one identified in the German shepherd in this study. The results, therefore, lay the groundwork for designing gene therapy constructs that can target this form of blindness with the same approach.


http://www.sciencedaily.com/releases/2015/10/151001165029.htm  Original web page at Science Daily


Pupil shape linked to animals’ ecological niche

While the eyes may be a window into one’s soul, new research led by scientists at the University of California, Berkeley, suggests that the pupils could also reveal whether one is a hunter or hunted.

An analysis of 214 species of land animals shows that a creature’s ecological niche is a strong predictor of pupil shape. Species with pupils that are vertical slits are more likely to be ambush predators that are active both day and night. In contrast, those with horizontally elongated pupils are extremely likely to be plant-eating prey species with eyes on the sides of their heads. Circular pupils were linked to “active foragers,” or animals that chase down their prey.

The study, led by vision scientist Martin Banks, a UC Berkeley professor of optometry, in collaboration with the United Kingdom’s Durham University, presents a new hypothesis as to why pupils are shaped and oriented the way they are. The findings will be published in the journal Science Advances.

This current research builds upon the foundation set by the late Gordon Walls, a UC Berkeley professor of optometry who published “The Vertebrate Eye and Its Adaptive Radiation” in 1942. The classic text on eye physiology put forward the now generally accepted theory that slit-shaped pupils allow for different musculature and a greater range in the amount of light entering the eye.

For example, the vertical slits of domestic cats and geckos undergo a 135- and 300-fold change in area between constricted and dilated states, while humans’ circular pupils undergo a mere 15-fold change.
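These fold changes follow directly from pupil geometry: a circular pupil’s area scales with the square of its diameter, while a slit that opens into a circle can start from a far smaller constricted area. A minimal sketch, using hypothetical pupil dimensions chosen only to illustrate the geometry (the study’s 135-fold and 15-fold figures come from measurements, not these numbers):

```python
import math

def circle_area(d):
    """Area of a circular pupil of diameter d (in mm)."""
    return math.pi * (d / 2) ** 2

def slit_area(width, height):
    """Approximate a constricted slit pupil as an ellipse (axes in mm)."""
    return math.pi * (width / 2) * (height / 2)

# Hypothetical dimensions: a circular pupil going from 2 mm to 8 mm,
# versus a 0.3 mm x 2.5 mm slit that dilates into a 10 mm circle.
human_fold = circle_area(8.0) / circle_area(2.0)
cat_fold = circle_area(10.0) / slit_area(0.3, 2.5)

print(f"circular pupil: {human_fold:.0f}-fold change in area")
print(f"slit pupil: {cat_fold:.0f}-fold change in area")
```

With these illustrative numbers the circular pupil manages about a 16-fold change while the slit achieves well over 100-fold, which is the qualitative point behind the measured figures.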

“For species that are active both night and day, like domestic cats, slit pupils provide the dynamic range needed to help them see in dim light yet not get blinded by the midday sun,” said Banks. “However, this hypothesis does not explain why slits are either vertical or horizontal. Why don’t we see diagonal slits? This study is the first attempt to explain why orientation matters.”

To explain why horizontally elongated pupils, with few exceptions, corresponded to grazing prey animals such as sheep, deer and horses, the researchers turned to computer models to study the effects of different pupil shapes.

They found that the horizontal pupils expanded the effective field of view. When stretched horizontally, the pupils are aligned with the ground, getting more light in from the front, back and sides. The orientation also helps limit the amount of dazzling light from the sun above so the animal can see the ground better, the researchers said.

“The first key visual requirement for these animals is to detect approaching predators, which usually come from the ground, so they need to see panoramically on the ground with minimal blind spots,” said Banks. “The second critical requirement is that once they do detect a predator, they need to see where they are running. They have to see well enough out of the corner of their eye to run quickly and jump over things.”

But what happens to this orientation when the animal lowers its head to graze? If the pupils follow the pitch of the head, they would become more vertical and the theory falters.

“To check this out, I spent hours at the Oakland Zoo, often surrounded by school kids on field trips, to observe the different animals,” said Banks. “Sure enough, when goats, antelope and other grazing prey animals put their head down to eat, their eyes rotated to maintain the pupils’ horizontal alignment with the ground.”

On the other side of the Atlantic, study co-author Gordon Love, a professor of physics at Durham University, found this same pattern when observing sheep and horses at nearby farms. Grazing animals’ eyes can rotate by 50 degrees or more in each eye, a range 10 times greater than that of human eyes, the researchers said.

For ambush predators with vertical-slit pupils, the authors noted the importance of accurately gauging the distance animals would need to pounce on their prey. Researchers identified three cues generally used to gauge distance: stereopsis, or binocular disparity; motion parallax, in which closer objects move farther and faster across our field of vision; and blur, in which objects at different distances are out of focus.

The researchers ruled out motion parallax as a factor since using that cue would require head movement that could reveal the predator’s position. The remaining two cues, binocular disparity and blur, work together with vertically elongated pupils and front-facing eyes, the researchers said.

Binocular vision works better at judging differences when contours are vertical and objects are at a distance, while blur comes into play for horizontal contours and near-field targets. Vertical-slit pupils maximize both cues, the researchers said. Vertical pupils are not equally distributed among ambush predators, however.

“A surprising thing we noticed from this study is that the slit pupils were linked to predators that were close to the ground,” said William Sprague, a postdoctoral researcher in Banks’ lab. “So domestic cats have vertical slits, but bigger cats, like tigers and lions, don’t. Their pupils are round, like humans and dogs.”

Among the 65 frontal-eyed, ambush predators in this study, 44 had vertical pupils, and 82 percent of them had shoulder heights that were less than 42 centimeters (16.5 inches). Vertical pupils appear to maximize the ability of small animals to judge distances of prey.

The authors explained this by calculating that depth-of-field cues based upon blur are more effective for estimating distances for short animals than tall ones.
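The short-animal advantage can be made concrete with a standard optics relation: the defocus between the fixated distance and a target, measured in diopters, is |1/z_focus − 1/z_target|, and retinal blur grows roughly in proportion to that defocus times the pupil aperture. A minimal sketch with assumed, illustrative viewing distances (not values from the study):

```python
def defocus_diopters(focus_dist_m, target_dist_m):
    """Defocus between the fixated distance and a target, in diopters.

    Retinal blur is roughly proportional to this value times the pupil
    aperture, so a larger value means a stronger blur-based depth cue.
    """
    return abs(1.0 / focus_dist_m - 1.0 / target_dist_m)

# An eye close to the ground fixates nearby targets; a taller one
# fixates farther away. Both judge a target 20 percent beyond fixation.
low = defocus_diopters(0.5, 0.6)    # low eye, fixating 0.5 m
high = defocus_diopters(2.5, 3.0)   # high eye, fixating 2.5 m

print(f"low eye:  {low:.3f} D of defocus")
print(f"high eye: {high:.3f} D of defocus")
```

Because defocus depends on differences of reciprocal distances, the same relative distance offset produces about five times more defocus, and hence more usable blur, for the eye working at short range.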

“We are learning all the time just how remarkable the eye and vision are,” said Love. “This work is another piece in the jigsaw puzzle of understanding how eyes work.” The authors noted that this research focused on terrestrial species. They expect to examine associations of aquatic, aerial and arboreal life with eye position and pupil shape in future studies.


http://www.sciencedaily.com/releases/2015/08/150807144334.htm  Original web page at Science Daily


Mobile-phone microscope detects eye parasite

A device that mounts on a mobile phone is used to diagnose African eye worm at a clinic in Cameroon. To diagnose diseases in people living in remote locations, clinicians have traditionally preferred a low-tech approach because battery-powered electronic devices can be too delicate and fussy for clinics in the developing world. But now that mobile phones have penetrated almost every corner of the globe, that aversion is eroding rapidly. In a study in Science Translational Medicine on 6 May, bioengineer Daniel Fletcher of the University of California, Berkeley, and his colleagues give one example of how mobile phones may change medicine in far-flung areas. They describe a camera-phone microscope and app that can immediately detect the presence of the African eye worm parasite Loa loa in a blood sample.

An endemic problem in Central Africa, L. loa grows into a worm that wiggles into the tissue of the eye. The worms are even more problematic when they are picked up along with two other parasitic nematodes, Onchocerca volvulus (which causes river blindness) and Wuchereria bancrofti (which can cause severe limb swelling). This is because one drug typically given to treat those two other parasites, called ivermectin, can cause serious side effects such as brain swelling if a person is also infected with L. loa. Co-parasitism of this kind is common, says infectious-disease specialist Isaac Bogoch of the University of Toronto, Canada. Quickly finding out whether patients infected with O. volvulus or W. bancrofti also have L. loa in their blood is important for deciding whether ivermectin can be safely administered. By converting a phone into a microscope, clinicians have a portable way of checking blood samples.

Mobile-phone microscopes are not new — they can be bought online and have appeared as nifty science-fair projects. And their use in difficult public-health settings has also been proposed previously: in 2009, Fletcher showed with a different group of colleagues that a mobile-phone microscope they designed could identify the bacteria that cause tuberculosis. Other mobile-phone microscopes are being tested to look for infection by blood flukes. But as Samuel Sia, a biomedical engineer at Columbia University in New York, points out, earlier models were not a big improvement over a traditional microscope for use in the field because all they did was magnify. “You’d have to collect a specimen, smear it, stain it and dry it on a slide. Sure, if you have a microscope you can look at it, but what about all those other steps?”

The latest invention by Fletcher and his team, by contrast, avoids that rigmarole — it requires simply loading a blood-containing capillary onto a 3D-printed plastic case containing a lens. The plastic shell slides over an iPhone, aligning the device’s lens to its camera. An app on the phone then takes a video of the magnified blood sample and uses an algorithm to look for movements in the fluid that match up with characteristics of L. loa. Based on this, the app accurately counts how many parasites are present. It has to be used around midday, during the brief period when L. loa typically are active but the other two nematodes are not. This application could be modified to diagnose other parasitic infections, says Fletcher. Researchers are already working on phone software to detect soil-transmitted helminths such as hookworm and whipworm. Devices like these have encouraged more engineers and clinicians to embrace diagnosis tools based on consumer electronics, says Sia. But first the devices must be shown to work in the field. A large experiment to test the L. loa detection system will get under way this year.
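The article does not detail the app’s algorithm, but its core idea — flagging characteristic movement in a short video of the sample and counting the moving objects — can be sketched with simple frame differencing and blob counting. The following is a toy illustration of that general technique, not the published pipeline:

```python
import numpy as np

def count_moving_blobs(frame_a, frame_b, diff_thresh=30, min_pixels=3):
    """Count connected regions that changed between two grayscale frames.

    A crude stand-in for motion-based detection: difference the frames,
    threshold the result, then label 4-connected blobs via flood fill,
    keeping only blobs of at least min_pixels to suppress noise.
    """
    moving = np.abs(frame_a.astype(int) - frame_b.astype(int)) > diff_thresh
    seen = np.zeros_like(moving, dtype=bool)
    h, w = moving.shape
    blobs = 0
    for y in range(h):
        for x in range(w):
            if moving[y, x] and not seen[y, x]:
                # Flood-fill one connected region and measure its size.
                stack, size = [(y, x)], 0
                seen[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    size += 1
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and moving[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if size >= min_pixels:
                    blobs += 1
    return blobs

# Two synthetic 12x12 frames: a blank background, then two "wigglers"
# that have appeared in the second frame, i.e. two moving regions.
a = np.zeros((12, 12), dtype=np.uint8)
b = a.copy()
b[2:4, 2:4] = 200
b[8:10, 7:10] = 200
print(count_moving_blobs(a, b))
```

A real system would also filter blobs by the wriggling frequency and trajectory characteristic of L. loa, which is what distinguishes parasites from drifting cells or debris.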

Nature doi:10.1038/nature.2015.17499

http://www.nature.com/news/index.html  Nature

http://www.nature.com/news/mobile-phone-microscope-detects-eye-parasite-1.17499  Original web page at Nature


Ophthalmologists uncover autoimmune process that causes rejection of secondary corneal transplants

UT Southwestern Medical Center ophthalmologists have identified an important cause of why secondary corneal transplants are rejected at triple the rate of first-time corneal transplants.

The cornea — the most frequently transplanted solid tissue — has a first-time transplantation success rate of about 90 percent. But second corneal transplants undergo a rejection rate three times that of first transplants. “Understanding why these rejections occur is important to further improving the ways in which corneal transplants are performed,” said the study’s senior author Dr. Jerry Niederkorn, Professor and Vice Chair of Research of Ophthalmology, and Professor of Microbiology. “In the future, ophthalmologists may be able to implement processes, and eventually prescribe medications, that can lower the rates of rejection.”

More than 40,000 transplants are performed annually to replace the cornea, the clear outer lens at the front of the eye, with tissue from a donor. Most corneal transplants are done to correct severe visual impairments caused by keratoconus, a condition in which the normally dome-shaped cornea progressively thins and becomes cone-shaped, according to the American Academy of Ophthalmology. The high success rate of first-time corneal transplants is attributed to a process called immune privilege, which allows transplants to be successfully performed without matching the donor tissue to that of the recipient, as is required for organ transplants. Although immune privilege accounts for the initial high success rate, it can occasionally fail, leading to the rejection of corneal transplants in approximately 10 percent of patients. In patients requiring a second transplant, the incidence of immune rejection rises to almost 70 percent.

“We believe that this loss of immune privilege is similar to an alarm that signals the immune system of potential infection, which results in a full-blown immune response at the expense of the corneal transplant,” said Dr. Niederkorn, who holds the Royal C. Miller Chair in Age-Related Macular Degeneration Research and the George A. and Nancy P. Shutt Professorship in Medical Science. Researchers studying mouse models discovered that after the first corneal transplant is accepted, T regulatory cells prevent other types of immune cells from attacking and rejecting the transplant. But severing corneal nerves, which occurs during the first transplantation procedure, releases high levels of the neuropeptide Substance P. The resulting high Substance P levels disable the T regulatory cells needed for acceptance of subsequent corneal transplants. This inactivation results in rejection of more than 90 percent of the second corneal transplants in mice and helps to explain the curiously high risk of corneal graft rejection in patients who receive a second corneal transplant.

Researchers found that the high Substance P levels can be blocked with drugs to restore the eye’s immune privilege and promote the acceptance of second corneal transplants. The study, which appears in the American Journal of Transplantation, is supported by grants from the National Institutes of Health and Research to Prevent Blindness. Future research will focus on pharmacological strategies for restoring T regulatory cell function and promoting the survival of second corneal transplants. Other studies will determine if these findings can be extended to enhancing the immune response to cancer.

http://www.sciencedaily.com/   Science Daily

http://www.sciencedaily.com/releases/2015/04/150416192710.htm  Original web page at Science Daily

Categories
News

Stem cell injection may soon reverse vision loss caused by age-related macular degeneration

An injection of stem cells into the eye may soon slow or reverse the effects of early-stage age-related macular degeneration, according to new research from scientists at Cedars-Sinai. Currently, there is no treatment that slows the progression of the disease, which is the leading cause of vision loss in people over 65. “This is the first study to show preservation of vision after a single injection of adult-derived human cells into a rat model with age-related macular degeneration,” said Shaomei Wang, MD, PhD, lead author of the study published in the journal STEM CELLS and a research scientist in the Eye Program at the Cedars-Sinai Board of Governors Regenerative Medicine Institute. The stem cell injection resulted in 130 days of preserved vision in laboratory rats, which roughly equates to 16 years in humans. Age-related macular degeneration affects upward of 15 million Americans. It occurs when the small central portion of the retina, known as the macula, deteriorates. The retina is the light-sensing nerve tissue at the back of the eye. Macular degeneration may also be caused by environmental factors, aging and a genetic predisposition.

When animal models with macular degeneration were injected with induced neural progenitor stem cells, which derive from the more commonly known induced pluripotent stem cells, healthy cells began to migrate around the retina and formed a protective layer. This protective layer prevented ongoing degeneration of the vital retinal cells responsible for vision. Cedars-Sinai researchers in the Induced Pluripotent Stem Cell (iPSC) Core, directed by Dhruv Sareen, PhD, with support from the David and Janet Polak Foundation Stem Cell Core Laboratory, first converted adult human skin cells into powerful induced pluripotent stem cells (iPSC), which can be expanded indefinitely and then made into any cell of the human body. In this study, these induced pluripotent stem cells were then directed toward a neural progenitor cell fate, known as induced neural progenitor stem cells, or iNPCs.

“These induced neural progenitor stem cells are a novel source of adult-derived cells which should have powerful effects on slowing down vision loss associated with macular degeneration,” said Clive Svendsen, PhD, director of the Board of Governors Regenerative Medicine Institute and contributing author to the study. “Though additional pre-clinical data is needed, our institute is close to a time when we can offer adult stem cells as a promising source for personalized therapies for this and other human diseases.” Next steps include testing the efficacy and safety of the stem cell injection in preclinical animal studies to provide information for applying for an investigational new drug. From there, clinical trials will be designed to test potential benefit in patients with later-stage age-related macular degeneration.

http://www.sciencedaily.com/  Science Daily

http://www.sciencedaily.com/releases/2015/04/150414093554.htm  Original web page at Science Daily

Categories
News

Artificial retina could someday help restore vision

The loss of eyesight, often caused by retinal degeneration, is a life-altering health issue for many people, especially as they age. But a new development toward a prosthetic retina could help counter conditions that result from problems with this crucial part of the eye. Scientists published their research on a new device, which they tested on tissue from laboratory animals, in the ACS journal Nano Letters.

Yael Hanein and colleagues point out that a growing range of medical devices has become available to treat conditions, including visual impairment, that involve sending sensory signals to the brain. Patients with one type of eye disorder called age-related macular degeneration (AMD), for example, could potentially benefit from such a device, they say. AMD usually affects people age 60 or older who have damage to a specific part of the retina, limiting their vision. Scientists are trying different approaches to develop an implant that can “see” light and send visual signals to a person’s brain, countering the effects of AMD and related vision disorders. But many attempts so far use metallic parts or cumbersome wiring, or have low resolution.

The researchers, an interdisciplinary team from Tel Aviv University, the Hebrew University of Jerusalem Centers for Nanoscience and Nanotechnology and Newcastle University, wanted to make a more compact device. They combined semiconductor nanorods and carbon nanotubes to create a wireless, light-sensitive, flexible film that could potentially act in the place of a damaged retina. When they tested it with a chick retina that normally doesn’t respond to light, they found that the film absorbed light and, in response, sparked neuronal activity. In comparison with other technologies, the researchers conclude theirs is more durable, flexible and efficient, as well as better able to stimulate neurons.

http://www.sciencedaily.com/  Science Daily

http://www.sciencedaily.com/releases/2014/11/141112102521.htm  Original web page at Science Daily

Categories
News

First report of long-term safety of human embryonic stem cells to treat human disease

New research published in The Lancet provides the first evidence of the medium- to long-term safety and tolerability of transplanting human embryonic stem cells (hESCs) in humans. hESC transplants used to treat severe vision loss in 18 patients with different forms of macular degeneration appeared safe up to 3 years post-transplant, and the technology restored some sight in more than half of the patients. “Embryonic stem cells have the potential to become any cell type in the body, but transplantation has been complicated by problems including the risk of teratoma formation and immune rejection,” explains lead author Professor Robert Lanza, Chief Scientific Officer at Advanced Cell Technology in the USA. “As a result, immunoprivileged sites (that do not produce a strong immune response) such as the eye have become the first parts of the human body to benefit from this technology.”

In the two phase 1/2 studies, hESCs were differentiated into retinal pigment epithelium cells and transplanted into nine patients with Stargardt’s macular dystrophy and nine patients with dry atrophic age-related macular degeneration, the leading causes of juvenile and adult blindness in the developed world, respectively. No effective treatments exist for either condition, and eventually the light-receiving (photoreceptor) cells of the retina degenerate, leading to complete blindness. All participants were injected with one of three different doses of retinal cells (50,000, 100,000, and 150,000 cells) into the subretinal space (under the retina) of the eye with the worst vision. The hESC-derived cells were well tolerated for up to 37 months after transplantation. No safety concerns (eg, hyperproliferation or rejection) in the treated eyes were detected during a median follow-up of 22 months. Adverse events were associated with vitreoretinal surgery and immunosuppression, but none were deemed to be related to the hESC-derived cells. Follow-up testing showed that 10 out of 18 treated eyes had substantial improvements in how well they could see, with 8 patients reading over 15 additional letters in the first year after transplant. Visual acuity remained the same or improved in seven patients, but decreased by more than 10 letters in one patient. Importantly, untreated eyes did not show similar visual improvements.

According to co-lead author Professor Steven Schwartz from the Jules Stein Eye Institute, Los Angeles, USA, “Our results suggest the safety and promise of hESCs to alter progressive vision loss in people with degenerative diseases and mark an exciting step towards using hESC-derived stem cells as a safe source of cells for the treatment of various medical disorders requiring tissue repair or replacement.” Writing in a linked Comment, Anthony Atala, Director of the Wake Forest Institute for Regenerative Medicine, Wake Forest School of Medicine, Winston-Salem, NC, USA says, “The work by Schwartz and colleagues is a major accomplishment, but the path to get to this point has not been smooth. Since the discovery of hESC in 1998, much has transpired, including political, ethical, and scientific debates, with an overall push to achieve the promise of human therapies. Now, we have follow-up that extends to longer than 3 years in patients treated with hESC-derived stem cells, showing both safety and apparent efficacy… Much work remains to be done before hESC and induced pluripotent stem cell therapies go beyond regulatory trials, but the path is now set in motion.”

http://www.sciencedaily.com/  Science Daily

http://www.sciencedaily.com/releases/2014/10/141014211709.htm  Original web page at Science Daily

Categories
News

Japanese woman is first recipient of next-generation stem cells

Researchers were able to grow sheets of retinal tissue from induced pluripotent stem cells, and have now implanted them for the first time in a patient. A Japanese woman in her 70s is the world’s first recipient of cells derived from induced pluripotent stem cells, a technology that has created great expectations since it could offer the same advantages as embryo-derived cells but without some of the controversial aspects and safety concerns. In a two-hour procedure starting at 14:20 local time today, a team of three eye specialists led by Yasuo Kurimoto of the Kobe City Medical Center General Hospital transplanted a 1.3 by 3.0 millimetre sheet of retinal pigment epithelium cells into an eye of the Hyogo prefecture resident, who suffers from age-related macular degeneration. The procedure took place at the Institute of Biomedical Research and Innovation Hospital, next to the RIKEN Center for Developmental Biology (CDB) where ophthalmologist Masayo Takahashi had developed and tested the epithelium sheets. She derived them from the patient’s skin cells, after producing induced pluripotent stem (iPS) cells and then getting them to differentiate into retinal cells.

Afterwards, the patient experienced no effusive bleeding or other serious problems, RIKEN has reported. The patient “took on all the risks that go with the treatment as well as the surgery”, Kurimoto said in a statement released by RIKEN. “I have deep respect for the bravery she showed in resolving to go through with it.” He struck a somber note in thanking Yoshiki Sasai, a CDB researcher who recently committed suicide: “This project could not have existed without the late Yoshiki Sasai’s research, which led the way to differentiating retinal tissue from stem cells.” Kurimoto also thanked Shinya Yamanaka, a stem-cell scientist at Kyoto University, “without whose discovery of iPS cells, this clinical research would not be possible.” Yamanaka shared the 2012 Nobel Prize in Physiology or Medicine for that work. Kurimoto performed the procedure a mere four days after a health-ministry committee gave Takahashi clearance for the human trials. To earn that, Takahashi and her collaborators had done safety studies in both monkeys and mice. The animal tests found that iPS cells were not rejected and did not lead to the growth of tumours.

Age-related macular degeneration results from the breakdown of retinal epithelium, a layer of cells that support photoreceptors needed for vision. The procedure Kurimoto performed is unlikely to restore his patient’s vision. However, researchers around the world will be watching closely to see whether the cells are able to check the further destruction of the retina while avoiding potential side-effects, such as bringing about an immune reaction or inducing cancerous growth. “We’ve taken a momentous first step toward regenerative medicine using iPS cells,” Takahashi said in a statement. “With this as a starting point, I definitely want to bring [iPS cell-based regenerative medicine] to as many people as possible.”

Nature doi:10.1038/nature.2014.15915

http://www.nature.com/news/index.html  Nature

http://www.nature.com/news/japanese-woman-is-first-recipient-of-next-generation-stem-cells-1.15915  Original web page at Nature

Categories
News

Running cures blind mice

Running helps mice to recover from a type of blindness caused by sensory deprivation early in life, researchers report. The study, published on 26 June in eLife, also illuminates processes underlying the brain’s ability to rewire itself in response to experience — a phenomenon known as plasticity, which neuroscientists believe is the basis of learning.

More than 50 years ago, neurophysiologists David Hubel and Torsten Wiesel cracked the ‘code’ used to send information from the eyes to the brain. They also showed that the visual cortex develops properly only if it receives input from both eyes early in life. If one eye is deprived of sight during this ‘critical period’, the result is amblyopia, or ‘lazy eye’, a state of near blindness. This can happen to someone born with a droopy eyelid, cataract or other defect not corrected in time. If the eye is opened in adulthood, recovery can be slow and incomplete.

In 2010, neuroscientists Christopher Niell and Michael Stryker, both at the University of California, San Francisco (UCSF), showed that running more than doubled the response of mice’s visual cortex neurons to visual stimulation. Stryker says that it is probably more important, and taxing, to keep track of the environment when navigating it at speed, and that lower responsiveness at rest may have evolved to conserve energy in less-demanding situations. “It makes sense to put the visual system in a high-gain state when you’re moving through the environment, because vision tells you about far away things, whereas touch only tells you about things that are close,” he says.

It is generally assumed that activity stimulates plasticity, so Stryker and his colleague Megumi Kaneko, also a neuroscientist at UCSF, wondered whether running might influence the plasticity of the visual cortex. They induced amblyopia in mice by suturing one eye shut for several months, during and after the critical period of visual development. They then re-opened the mice’s eyes and divided them into two groups. Mice in one group were shown a ‘noisy’ visual pattern while running on a treadmill for four hours a day for three weeks. The pattern was chosen to activate nearly all the cells in the mice’s primary visual cortex. The researchers recorded the mice’s brain activity using intrinsic signal imaging, a method similar to functional magnetic resonance imaging.

After a week these mice showed more responsiveness in the part of the cortex corresponding to the eye that had been closed. After two weeks, responses were comparable to those of normal mice that had never been visually deprived. The other group, housed in cages and without extra visual stimulation, had a much slower response to their newly reopened eye and never reached normal response levels. Further experiments revealed that neither running nor visual stimulation alone had this effect. Recovery was also specific to the stimulus. Mice viewing the noise pattern did not show improved responses to a pattern of drifting bars, and vice versa, suggesting that only the visual circuits activated during running recover. “What is amazing is the robustness of this phenomenon,” says Massimo Scanziani, a neurobiologist at the University of California, San Diego. “It’s powerful and highly reproducible, which is ideal for studying the mechanism.” Stryker and his colleagues do not yet know whether their findings apply to humans, but they plan further work to find out.

Nature doi:10.1038/nature.2014.15476

http://www.nature.com/news/index.html  Nature

July 8, 2014

http://www.nature.com/news/running-cures-blind-mice-1.15476 Original web page at Nature

Categories
News

New neural pathway found in eyes that aids in vision

A type of retina cell plays a more critical role in vision than previously known, a team led by Johns Hopkins University researchers has discovered. Working with mice, the scientists found that the ipRGCs — an atypical type of photoreceptor in the retina — help detect contrast between light and dark, a crucial element in the formation of visual images. The key to the discovery is the fact that the cells express melanopsin, a type of photopigment that undergoes a chemical change when it absorbs light. “We are quite excited that melanopsin signaling contributes to vision even in the presence of functional rods and cones,” postdoctoral fellow Tiffany M. Schmidt said. Schmidt is lead author of a recently published study in the journal Neuron. The senior author is Samer Hattar, associate professor of biology in the university’s Krieger School of Arts and Sciences. Their findings have implications for future studies of blindness or impaired vision.

Rods and cones are the most well-known photoreceptors in the retina, activating in different light environments. Rods, of which there are about 120 million in the human eye, are highly sensitive to light and turn on in dim or low-light environments. Meanwhile the 6 million to 7 million cones in the eye are less sensitive to light; they drive vision in brighter light conditions and are essential for color detection.

Rods and cones were thought to be the only light-sensing photoreceptors in the retina until about a decade ago, when scientists discovered a third type of retinal photoreceptor — the ipRGC, or intrinsically photosensitive retinal ganglion cell — that contains melanopsin. Those cells were thought to be needed exclusively for detecting light for non-image-dependent functions, for example, to control synchronization of our internal biological clocks to daytime and the constriction of our pupils in response to light.

“Rods and cones were thought to mediate vision and ipRGCs were thought to mediate these simple light-detecting functions that happen outside of conscious perception,” Schmidt said. “But our experiments revealed that ipRGCs influence a greater diversity of behaviors than was previously known and actually contribute to an important aspect of image-forming vision, namely contrast detection.” The Johns Hopkins team along with other scientists conducted several experiments with mice and found that when melanopsin was present in the retinal ganglion cells, the mice were better able to see contrast in a Y-shaped maze, known as the visual water task test. In the test, mice are trained to associate a pattern with a hidden platform that allows them to escape the water. Mice that had the melanopsin gene intact had higher contrast sensitivity than mice that lacked the gene. “Melanopsin signaling is essential for full contrast sensitivity in mouse visual functions,” said Hattar. “The ipRGCs and melanopsin determine the threshold for detecting edges in the visual scene, which means that visual functions that were thought to be solely mediated by rods and cones are now influenced by this system. The next step is to determine if melanopsin plays a similar role in the human retina for image-forming visual functions.”

http://www.sciencedaily.com/  Science Daily

June 24, 2014

http://www.sciencedaily.com/releases/2014/05/140521162707.htm  Original web page at Science Daily

Categories
News

Wiring of retina reveals how eyes sense motion

It is sometimes said that we see with the brain rather than the eyes, but this is not entirely true. People can only make sense of visual information once it has been interpreted by the brain, but some of this information is processed partly by neurons in the retina. In particular, 50 years ago researchers discovered that the mammalian retina is sensitive to the direction and speed of moving images. This showed that motion perception begins in the retina, but researchers struggled to explain how.

When light enters the eye, it is captured by photoreceptor cells, which convert the information into electrical impulses and transmit them to deeper layers of the retina. Individual photoreceptors are not sensitive to the direction in which an object may be moving, so neuroscientist Jinseop Kim, of the Massachusetts Institute of Technology (MIT) in Cambridge, and his colleagues wanted to test whether the answer to the puzzle could lie in the way various types of cells in the retina are connected. Photoreceptors relay their signals via ‘bipolar neurons’, named this way because they have two stems that jut out of the cell’s body in opposite directions. The signal then transits through ‘starburst amacrine cells’ — which have filaments, or dendrites, that extend in all directions, much as light rays radiate from a star — before reaching the cells that form the optic nerve, which relays them into the brain.

To understand how bipolar and starburst cells are wired together, Kim and his colleagues analysed high-resolution electron microscope images of a mouse retina with the help of nearly 2,200 members of EyeWire, an online ‘citizen-science’ game set up to help with brain-mapping efforts. Players traced the pathways through the layers of cells to create a high-resolution wiring diagram of part of the retina.

The reconstructed map, described in Nature, showed that while one type of bipolar cell connects to the amacrine cells’ filaments close to the cell body, another does so farther away along the length of the filaments. And crucially, the bipolar cells that connect closer to the starburst amacrine cell bodies are known to relay their messages with a time delay, whereas the others transmit theirs immediately. Because of the lag in the first type of connection, signals that hit two nearby locations on the retina at two slightly different times — as would happen when an object moves across the visual field — could reach the same amacrine-cell filament at the same time. This could explain how the retina detects motion, the authors suggest: the amacrine cell might fire only when it receives this combined information, signalling that something is moving in the direction of the filament. Stimuli not moving in the direction of the filament will produce impulses that reach the amacrine cell at different times, so that it will not fire.

Sebastian Seung, a computational neuroscientist at MIT and senior author of the study, says that the results should be interpreted cautiously. Although he and his colleagues have helped to shed light on the anatomy of the retina, only experiments can conclusively prove that the system works as their model suggests. “We’re throwing this over to the physiologists now. They can test the hypothesis easily.” “This is a very nice paper that poses a very clear and testable prediction about direction-selective computation in the retina,” says Botond Roska, a neuroscientist at the Friedrich Miescher Institute for Biomedical Research in Basel, Switzerland. “It’s an exciting idea, and I bet it’ll be followed by research from many labs trying to test this hypothesis.” Seung adds that the wiring diagram represents only a tiny proportion of the total number of connections in the retina. “There are probably other neurons that are a part of this motion-detection circuit,” he says. “We need to map those out and eventually reconstruct the entire retinal connectome.”
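
The delay-and-coincidence mechanism the authors propose resembles a classic Reichardt-style correlation detector, which is simple enough to sketch. In this toy model, not the paper's actual circuit, one of two neighbouring inputs is relayed with an assumed one-time-step delay, and the detector responds only when the delayed and direct signals arrive together:

```python
def coincidence_response(a, b):
    """Reichardt-style sketch: input a (the proximal bipolar cell) is
    relayed with a one-time-step delay, input b (the distal bipolar cell)
    immediately. The detector responds in proportion to how often the
    delayed a signal and the direct b signal coincide, which happens when
    a stimulus sweeps from a to b at the matching speed."""
    delayed_a = [0.0] + a[:-1]  # one-step synaptic delay on the a pathway
    return sum(da * bb for da, bb in zip(delayed_a, b))

# A bright spot moving in the preferred direction hits a at t=0, b at t=1:
preferred = coincidence_response([1, 0, 0], [0, 1, 0])  # -> 1.0 (coincidence)
# The same spot moving the opposite way hits b first, then a:
null = coincidence_response([0, 1, 0], [1, 0, 0])       # -> 0.0 (no coincidence)
```

A spot sweeping from a to b arrives at b just as the delayed a signal does, so the two multiply to a strong response; swept the other way, the signals miss each other and the response is zero, mirroring the direction selectivity the wiring diagram suggests.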

Nature doi:10.1038/nature.2014.15147

http://www.nature.com/news/index.html  Nature

May 27, 2014

http://www.nature.com/news/wiring-of-retina-reveals-how-eyes-sense-motion-1.15147  Original web page at Nature

Categories
News

New structure in dogs’ eye linked to blinding retinal diseases

The fovea-like area in dogs with a form of macular degeneration was affected much as it is in humans with the disease.

In humans, a tiny area in the center of the retina called the fovea is critically important to viewing fine details. Densely packed with cone photoreceptor cells, it is used while reading, driving and gazing at objects of interest. Some animals have a similar feature in their eyes, but researchers believed that among mammals the fovea was unique to primates — until now. University of Pennsylvania vision scientists report that dogs, too, have an area of their retina that strongly resembles the human fovea. What’s more, this retinal region is susceptible to genetic blinding diseases in dogs just as it is in humans. “It’s incredible that in 2014 we can still make an anatomical discovery in a species that we’ve been looking at for the past 20,000 years and that, in addition, this has high clinical relevance to humans,” said William Beltran, an assistant professor of ophthalmology in Penn’s School of Veterinary Medicine and co-lead author of the study with Artur Cideciyan, research professor of ophthalmology in Penn’s Perelman School of Medicine. “It is absolutely exhilarating to be able to investigate this very specialized and important part of canine central vision that has such unexpectedly strong resemblance to our own retina,” Cideciyan added. The paper was published in the journal PLOS ONE.

The word “fovea” comes from the Latin meaning “pit,” owing to the fact that in humans and many other primates, the inner layers of the retina are thin in this area, while the outer layers are packed with cone photoreceptor cells. It is believed that this inner layer thinning allows the foveal cone cells privileged access to light. It is known that dogs have what is called an area centralis, a region around the center of the retina with a relative increase in cone photoreceptor cell density. But dogs lack the pit formation that humans have, and, before this study, it was believed that the increase in cone photoreceptor cell density didn’t come close to matching what is seen in primates. Prior to this study, the highest reported density in dogs was 29,000 cones per square millimeter, compared to more than 100,000 cones per square millimeter seen in the human and macaque foveas. It turns out that previous studies in dogs had missed a minuscule region of increased cell density. In this study, while examining the retina of a dog with a mutation that causes a disease akin to a form of X-linked retinal degeneration in humans, the Penn researchers noticed a thinning of the retinal layer that contains photoreceptor cells. Zeroing in on this region, they examined retinas of normal dogs using advanced imaging techniques, including confocal scanning laser ophthalmoscopy, optical coherence tomography and two-photon microscopy. By enabling the scientists to visualize different layers of the retina, these techniques allowed them to identify a small area of peak cone density and then estimate cone numbers by counting the cells in this unique area.

Based on their observations, the researchers found that cone densities reached more than 120,000 cells per square millimeter in a never-before-described fovea-like region of the area centralis — a density on par with that of primate foveas. “There’s no real landmark for this area like there is in humans,” said study co-author Gustavo Aguirre, “so to discover such a density was unexpected.” They also recognized that the “output side” of this cone-dense region corresponded with an area of dense retinal ganglion cells, which transmit signals to the brain. Human patients with macular degeneration experience a loss of photoreceptor cells — the rods and cones that process light — at or near the fovea, resulting in a devastating loss of central vision. To see whether the fovea-like region was similarly affected in dogs, the Penn researchers used the same techniques they had employed to study normal dogs to examine animals that had mutations in two genes (BEST1 and RPGR) that can lead to macular degeneration in humans. In both cases, the onset of disease affected the fovea-like region in dogs in a very similar way to how the diseases present in humans — with central retinal lesions appearing earlier than lesions in the peripheral retina. “Why the fovea is susceptible to early disease expression for certain hereditary disorders and why it is spared under other conditions is not known,” Cideciyan said. “Our findings, which show the canine equivalent of a human genetic disease affecting an area of the retina that is of extreme importance to human vision, are very promising from the human point of view. They could allow for translational research by allowing us to test treatments for human foveal and macular degenerative diseases in dogs.”

In addition, the discovery offers insight into a rare human condition known as fovea plana, in which people have normal visual acuity but no “pit” in their fovea. In other words, their fovea resembles that of dogs, challenging the previously held assumption that lack of tissue and blood vessels overlaying the fovea is a prerequisite for the high resolution of vision. The fact that dogs have a fovea-like area of dense photoreceptor cells may also indicate that dogs are seeing more acutely than once suspected. “This gives us a structural basis to support the idea that dogs might have a higher visual acuity than has been measured so far,” Beltran said. “It could even be the case that some breeds have an especially high density of cells and could be used as working dogs for particular tasks that require high-level sight function.” Looking ahead, the researchers may focus on this fovea-like area in studies of therapies for not only X-linked retinal degeneration and Best disease but also other sight-related problems affecting the macula and fovea.

http://www.sciencedaily.com/  Science Daily

March 18, 2014

http://www.sciencedaily.com/releases/2014/03/140305191513.htm  Original web page at Science Daily


Lab-grown, virus-free stem cells repair retinal tissue in mice

Retinal vessel being repaired: the white arrow shows iPSC-derived vascular stem cells incorporating into a damaged retinal blood vessel and repairing it.

Investigators at Johns Hopkins report they have developed human induced pluripotent stem cells (iPSCs) capable of repairing damaged retinal vascular tissue in mice. The stem cells, derived from human umbilical cord blood and coaxed into an embryonic-like state, were grown without the conventional use of viruses, which can mutate genes and initiate cancers, according to the scientists. Their safer method of growing the cells has drawn increased support among scientists, they say, and paves the way for a stem cell bank of cord blood-derived iPSCs to advance regenerative medicine research. In a report published Jan. 20 in the journal Circulation, stem cell biologist Elias Zambidis, M.D., Ph.D., and his colleagues describe laboratory experiments with these non-viral, human retinal iPSCs, created using the virus-free method Zambidis first reported in 2011. “We began with stem cells taken from cord blood, which have fewer acquired mutations and little, if any, epigenetic memory, which cells accumulate as time goes on,” says Zambidis, associate professor of oncology and pediatrics at the Johns Hopkins Institute for Cell Engineering and the Kimmel Cancer Center. The scientists converted these cells to a status last experienced when they were part of six-day-old embryos.

Instead of using viruses to deliver a gene package that turns on processes converting the cells back to a stem cell state, Zambidis and his team used plasmids, rings of DNA that replicate briefly inside cells and then degrade. Next, the scientists identified high-quality, multipotent vascular stem cells generated from these iPSCs that can make the type of blood vessel-rich tissue necessary for repairing retinal and other human tissue. They identified these cells by looking for the cell surface proteins CD31 and CD146. Zambidis says they were able to create twice as many well-functioning vascular stem cells as with iPSCs made by other methods, and, “more importantly, these cells engrafted and integrated into functioning blood vessels in damaged mouse retina.” Working with Gerard Lutty, Ph.D., and his team at Johns Hopkins’ Wilmer Eye Institute, Zambidis’ team injected the newly derived stem cells into mice with damaged retinas (the retina being the light-sensitive tissue at the back of the eye). Injections were given into the eye, into the sinus cavity near the eye or into a tail vein. When the scientists imaged the mouse retinas, they found that the cells, regardless of injection location, had engrafted and repaired blood vessel structures in the retina. “The blood vessels enlarged like a balloon in each of the locations where the iPSCs engrafted,” says Zambidis. The scientists say their cord blood-derived cells compared very well with cells derived from human embryonic stem cells in their ability to repair retinal damage.

Zambidis says there are plans to conduct additional experiments with the cells in diabetic rats, whose condition more closely resembles human vascular damage to the retina than the mouse model used in the current study. With mounting requests from other laboratories, Zambidis says he frequently shares his cord blood-derived iPSCs with other scientists. “The popular belief that iPSC therapies need to be specific to individual patients may not be the case,” says Zambidis. He points to the recent success of partially matched bone marrow transplants in humans, shown to be as effective as fully matched transplants. “Support is growing for building a large bank of iPSCs that scientists around the world can access,” says Zambidis, although large resources and intense quality control would be needed for such a feat. However, Japanese scientists led by stem-cell pioneer Shinya Yamanaka are doing exactly that, he says, creating a bank of stem cells derived from cord-blood samples from Japanese blood banks.

http://www.sciencedaily.com/ Science Daily
February 18, 2014

http://www.sciencedaily.com/releases/2014/01/140123221915.htm Original web page at Science Daily


Image perception in the blink of an eye

Imagine seeing a dozen pictures flash by in a fraction of a second. You might think it would be impossible to identify any images you see for such a short time. However, a team of neuroscientists from MIT has found that the human brain can process entire images that the eye sees for as little as 13 milliseconds — the first evidence of such rapid processing speed. That speed is far faster than the 100 milliseconds suggested by previous studies. In the new study, which appears in the journal Attention, Perception, and Psychophysics, researchers asked subjects to look for a particular type of image, such as “picnic” or “smiling couple,” as they viewed a series of six or 12 images, each presented for between 13 and 80 milliseconds. “The fact that you can do that at these high speeds indicates to us that what vision does is find concepts. That’s what the brain is doing all day long — trying to understand what we’re looking at,” says Mary Potter, an MIT professor of brain and cognitive sciences and senior author of the study. This rapid-fire processing may help direct the eyes, which shift their gaze three times per second, to their next target, Potter says. “The job of the eyes is not only to get the information into the brain, but to allow the brain to think about it rapidly enough to know what you should look at next. So in general we’re calibrating our eyes so they move around just as often as possible consistent with understanding what we’re seeing,” she says. After visual input hits the retina, the information flows into the brain, where information such as shape, color, and orientation is processed. In previous studies, Potter has shown that the human brain can correctly identify images seen for as little as 100 milliseconds. In the new study, she and her colleagues decided to gradually increase the speeds until they reached a point where subjects’ answers were no better than if they were guessing. All images were new to the viewers. 
The researchers expected they might see a dramatic decline in performance around 50 milliseconds, because other studies have suggested that it takes at least 50 milliseconds for visual information to flow from the retina to the “top” of the visual processing chain in the brain and then back down again for further processing by so-called “re-entrant loops.” These processing loops were believed necessary to confirm identification of a particular scene or object.

However, the MIT team found that although overall performance declined, subjects continued to perform better than chance as the researchers dropped the image exposure time from 80 milliseconds to 53 milliseconds, then 40 milliseconds, then 27, and finally 13 — the fastest possible rate with the computer monitor being used. “This didn’t really fit with the scientific literature we were familiar with, or with some common assumptions my colleagues and I have had for what you can see,” Potter says. Potter believes one reason for the subjects’ better performance in this study may be that they were able to practice fast detection as the images were presented progressively faster, even though each image was unfamiliar. The subjects also received feedback on their performance after each trial, allowing them to adapt to this incredibly fast presentation. At the highest rate, subjects were seeing new images more than 20 times as fast as vision typically absorbs information. “We think that under these conditions we begin to show more evidence of knowledge than in previous experiments where people hadn’t really been expecting to find success, and didn’t look very hard for it,” Potter says. The findings are consistent with a 2001 study from researchers at the University of Parma and the University of St. Andrews, who found that neurons in the brains of macaque monkeys that respond to specific types of image, such as faces, could be activated even when the target images were each presented for only 14 milliseconds in a rapid sequence. “That was the only background that suggested maybe 14 milliseconds was sufficient to get something meaningful into the brain,” Potter says.
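The odd-looking sequence of exposure times (13, 27, 40, 53 and 80 milliseconds) is a consequence of frame-locked presentation: a monitor can only display an image for a whole number of refresh frames. As a rough illustration, assuming a 75 Hz display (the refresh rate is an assumption, not stated in this summary), each reported duration corresponds to one, two, three, four or six frames:

```python
# Map whole numbers of display frames to exposure durations in milliseconds.
# NOTE: 75 Hz is an assumed refresh rate; the summary does not state the
# actual monitor used, but this rate reproduces the reported durations.
REFRESH_HZ = 75
frame_ms = 1000 / REFRESH_HZ  # about 13.3 ms per frame

for n_frames in (1, 2, 3, 4, 6):
    print(f"{n_frames} frame(s) -> {n_frames * frame_ms:.1f} ms")
```

Running the sketch gives durations of about 13.3, 26.7, 40.0, 53.3 and 80.0 ms, which round to the values reported in the study, including the 13 ms floor described as the fastest possible rate with the monitor being used.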

The study offers evidence that “feedforward processing” — the flow of information in only one direction, from retina through visual processing centers in the brain — is enough for the brain to identify concepts without having to do any further feedback processing. It also suggests that while the images are seen for only 13 milliseconds before the next image appears, part of the brain continues to process those images for longer than that, Potter says, because in some cases subjects weren’t asked whether a specified image was present until after they had seen the sequence. “If images were wiped out after 13 milliseconds, people would never be able to respond positively after the sequence. There has to be something in the brain that has maintained that information at least that long,” she says. This ability to identify images seen so briefly may help the brain as it decides where to focus the eyes, which dart from point to point in brief movements called fixations about three times per second, Potter says. Deciding where to move the eyes can take 100 to 140 milliseconds, so very high-speed understanding must occur before that. The researchers are now investigating how long visual information presented so briefly can be held in the brain. They are also scanning subjects’ brains with a magnetoencephalography (MEG) scanner during the task to see what brain regions are active when a person successfully completes the identification task.

http://www.sciencedaily.com/  Science Daily
February 4, 2014

http://www.sciencedaily.com/releases/2014/01/140116091145.htm  Original web page at Science Daily


Cell death pathway involved in three forms of blindness, study finds

Gene therapies developed by University of Pennsylvania School of Veterinary Medicine researchers have worked to correct different forms of blindness. While effective, the downside to these approaches to vision rescue is that each disease requires its own form of gene therapy to correct the particular genetic mutation involved, a time-consuming and complex process. Hoping to develop a treatment that works more broadly across diseases, a Penn Vet team used canine disease models to closely examine how retinal gene activity varied during the progression of three different forms of inherited vision disease. Their results turned up an unexpected commonality: early on in each of the diseases, genes involved in the same specific pathway of cell death appeared to be activated. These findings point to possible interventions that could curb vision loss across a variety of inherited retinal diseases. The work, published in PLOS ONE, was conducted by Sem Genini, a senior research investigator; William A. Beltran, assistant professor of ophthalmology; and Gustavo D. Aguirre, professor of medical genetics and ophthalmology, all of Penn Vet’s Department of Clinical Studies in Philadelphia. The team examined three forms of retinal degenerative disease: rod-cone dysplasia 1, the most severe (earliest-onset) form, followed by X-linked progressive retinal atrophy 2 and then early retinal degeneration. All of these diseases involve the death of photoreceptor cells, and each is caused by a distinct genetic mutation. But what scientists did not know is how the mutations trigger a molecular signaling pathway that leads to the death of photoreceptor cells.

“What we have in mind is to be able to address multiple forms of disease with one treatment,” Beltran said. “We wanted to get a better understanding of whether there are any common cell death or cell survival pathways that could be targeted in some of these diseases.” The researchers looked at the activity of 112 genes in diseased retinas and compared it to gene activity in normal retinas. They assessed gene activity at time points known to correspond with key phases of disease: the “induction phase,” the time before the peak level of photoreceptor cell death; the “execution phase,” when the highest rates of photoreceptor cell death occur; and the “chronic phase,” during which photoreceptor cell death continues at somewhat reduced levels. During the execution and chronic phases of disease, the researchers identified a number of genes involved in programmed cell death, or apoptosis, that had noticeably different patterns of expression between the diseased and normal dogs. Of note, several proteins involved in the tumor necrosis factor, or TNF, pathway increased in activity during the induction and execution phases. This pathway is implicated in many diseases, from diabetes to cancer to rheumatoid arthritis. “This is quite a new result,” Genini said. “It was not expected to have the TNF pathway upregulated.”
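The phase-by-phase comparison the team describes amounts to asking, for each gene and each disease phase, how expression in diseased retinas differs from normal ones. A minimal sketch of that comparison, using hypothetical expression values (not data from the study) summarized as log2 fold changes:

```python
import math

# Illustrative sketch only: the expression values below are hypothetical,
# not measurements from the study. Differential expression between diseased
# and normal retinas is commonly summarized as a log2 fold change per phase.
phases = {
    # phase: (diseased_expression, normal_expression), arbitrary units
    "induction": (5.2, 2.4),
    "execution": (9.8, 2.5),
    "chronic":   (4.1, 2.3),
}

for phase, (diseased, normal) in phases.items():
    log2fc = math.log2(diseased / normal)
    direction = "up" if log2fc > 0 else "down"
    print(f"{phase}: log2 fold change = {log2fc:+.2f} ({direction}regulated)")
```

A positive log2 fold change marks a gene as more active in diseased tissue during that phase, which is the pattern the researchers report for TNF-pathway genes in the induction and execution phases.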

“We assumed,” Aguirre said, “the diseases would be different from one another and that cells would commit suicide by their own specific pathway and that perhaps quite late they would have a common final pathway. But what this shows is that there is an early trigger that is quite similar among all three diseases.” An additional surprise was that the differentially expressed proteins were present not only in photoreceptor cells but also in other cells in the retina, including horizontal and Müller cells. “We were focusing on what would happen with the photoreceptor cells, the cells that we knew were dying,” Beltran said. “But what our results are telling us is that, sure, they are dying, but there is something else happening with the cells that they talk to.” Pharmaceutical companies have already developed TNF-inhibitors to treat diseases like rheumatoid arthritis. Genini, Beltran and Aguirre say their results suggest that these drugs or similar ones might have a role to play in the retinal diseases they investigated and perhaps in others that their team is currently studying. “On its own,” Beltran said, “a TNF-inhibitor might not be a cure, but it could be used complementary to gene therapy, either by slowing the course of degeneration before the corrective gene therapy is delivered or in combination with the corrective gene therapy.”

http://www.sciencedaily.com/  Science Daily
February 4, 2014

http://www.sciencedaily.com/releases/2014/01/140116162216.htm  Original web page at Science Daily


Snakes control blood flow to aid vision

A new study from the University of Waterloo shows that snakes can optimize their vision by controlling the blood flow in their eyes when they perceive a threat. Kevin van Doorn, PhD, and Professor Jacob Sivak, from the Faculty of Science, discovered that the coachwhip snake’s visual blood flow patterns change depending on what’s in its environment. The findings appear in the most recent issue of the Journal of Experimental Biology. “Each species’ perception of the world is unique due to differences in sensory systems,” said van Doorn, from the School of Optometry & Vision Science. Instead of eyelids, snakes have a clear scale called a spectacle. It works like a window, covering and protecting their eyes. Spectacles are the result of eyelids that fuse together and become transparent during embryonic development. While van Doorn was examining a different part of the eye, the illumination from his instrument revealed something unusual: these spectacles contained a network of blood vessels, much like a blind on a window. To see whether this feature obscured the snake’s vision, van Doorn examined whether the pattern of blood flow changed under different conditions. When the snake was resting, the blood vessels in the spectacle constricted and dilated in a regular cycle, a rhythmic pattern that repeated several times over the span of several minutes. But when the researchers presented the snake with stimuli it perceived as threatening, the fight-or-flight response changed the spectacle’s blood flow pattern. The blood vessels constricted, reducing blood flow for longer periods than at rest, up to several minutes at a time. The absence of blood cells within the vasculature guarantees the best possible visual capacity in times of greatest need.

“This work shows that the blood flow pattern in the snake spectacle is not static but rather dynamic,” said van Doorn. Next, the research team examined the blood flow pattern of the snake spectacle when the snake shed its skin. They found a third pattern. During this time, the vessels remained dilated and the blood flow stayed strong and continuous, unlike the cyclical pattern seen during resting. Together, these experiments show the relationship between environmental stimuli and vision, as well as highlight the interesting and complex effect blood flow patterns have on visual clarity. Future research will investigate the mechanism underlying this relationship. “This research is the perfect example of how a fortuitous discovery can redefine our understanding of the world around us,” said van Doorn.

Science Daily
November 26, 2013

Original web page at Science Daily


A novel locus identified for glaucoma in the Dandie Dinmont Terrier dog breed

Professor Hannes Lohi’s research group at the University of Helsinki and the Folkhälsan Research Center, Finland, has identified a novel locus for glaucoma in the Dandie Dinmont Terrier. The locus on canine chromosome 8 includes a 9.5 Mb region that is associated with glaucoma. The canine locus shares synteny with human chromosome 14, which has previously been associated with different types of human glaucoma. However, the actual glaucoma-causing mutation in Dandies remains unknown. The study was published in the scientific journal PLOS ONE on August 14, 2013. Glaucoma is one of the most common blindness-causing diseases in both humans and dogs. It is an optic neuropathy that destroys the retinal ganglion cells and damages the optic nerve, causing irreversible blindness. Elevation of the intraocular pressure may cause considerable pain. In humans, glaucoma is broadly classified into three types: open-angle, closed-angle and congenital glaucoma. Several loci have been mapped in humans, but only a few causative genes are known, and the genetic basis remains poorly understood. In dogs, several different glaucoma types are diagnosed, but only one causative gene, ADAMTS10, is known, causing open-angle glaucoma in the Beagle; in other breeds the genetic background of glaucoma is still unknown.

Glaucoma research in Dandie Dinmont Terriers was started in the United States by Dr. Gary Johnson at the University of Missouri. At the University of Helsinki, samples from affected and healthy dogs have been collected since 2007. The researchers decided to collaborate in order to collect samples from around the world. “Because the Dandie Dinmont Terrier is globally a small breed, collecting enough samples was quite challenging,” says the research leader, Professor Hannes Lohi. Glaucoma is a fairly common disease in the Dandie Dinmont Terrier, resembling human closed-angle glaucoma. Affected dogs have very narrow or collapsed iridocorneal angles, obstructing the normal outflow of the aqueous humor. This causes elevation of the intraocular pressure (IOP). Elevated IOP can be treated, but usually the most effective treatment is to remove the affected eye. Glaucoma is usually bilateral, so both eyes may be removed. The average age of onset is about 7 years. Because the disease is diagnosed in older dogs, affected dogs may have been used for breeding before disease onset. Abnormal iridocorneal angles are commonly diagnosed in the breed, and many dogs are affected with pectinate ligament dysplasia (PLD). The pectinate ligament forms the internal boundary of the canine iridocorneal angle; in the normal canine eye it appears as pillars of tissue that support the iris against the posterior cornea. As part of the research, 18 healthy Finnish Dandies were clinically examined by veterinary ophthalmologist Elina Pietilä. “Based on the clinical study and the ophthalmological reports collected from the affected dogs, PLD confers an elevated risk of glaucoma. 72.3% of the clinically studied dogs had PLD, but no glaucoma was diagnosed. PLD does not always lead to glaucoma; other factors, e.g. genetic ones, also affect glaucoma development,” says MSc Saija Ahonen. “It is possible that some of the studied dogs will develop glaucoma later in life, so the clinically studied dogs will be followed,” Ahonen continues.

To identify the glaucoma-causing gene, a pedigree was constructed around the affected dogs and a genome-wide association analysis was performed with 23 cases and 23 controls. A locus on canine chromosome 8 was identified, containing 21 genes. “The locus identification was a huge breakthrough in the project. In addition, the same chromosome has been associated with glaucoma in humans, so we can be fairly sure that we have mapped the glaucoma-associated region,” says Professor Hannes Lohi. “The actual causative mutation has not been identified. Based on the results, the genetic background of glaucoma may be complex, meaning that multiple genes or mutations in regulatory regions may affect glaucoma development,” adds Professor Lohi. “Even though we know where the associated region is, we cannot yet develop a gene test for the breed, which would be very helpful for breeders. The locus identification gives us a lot of new information, and we can now concentrate in more detail on the specific region,” comments MSc Saija Ahonen. The research group led by Professor Lohi is based at the Faculty of Veterinary Medicine and the Faculty of Medicine at the University of Helsinki and at the Folkhälsan Research Center.
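At each genetic marker, a genome-wide association analysis of this kind compares allele counts between cases and controls. As a purely illustrative sketch (the allele counts below are hypothetical, not the study's data), a single-marker test with 23 cases and 23 controls can be written as a 2x2 chi-square:

```python
# Minimal single-marker case/control association test. Illustrative only:
# the allele counts below are hypothetical, not from the Dandie study.
def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    # Expected counts assuming row (case/control) and column (allele)
    # are independent, i.e. no association at this marker.
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# 23 cases and 23 controls, two alleles per dog (46 alleles per group):
# hypothetical risk-allele counts of 34/46 in cases vs 15/46 in controls.
stat = chi_square_2x2(34, 12, 15, 31)
print(f"chi-square = {stat:.2f}")  # compare against 3.84 (p = 0.05, 1 df)
```

A statistic well above the critical value flags the marker as associated with disease; in practice the study would apply this kind of test across many thousands of markers, with correction for multiple testing, to localize the chromosome 8 region.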

Science Daily
September 17, 2013

Original web page at Science Daily