The Year of the Genomes

Not too long ago, determining the precise sequence of DNA was slow and tedious. Today, genome sequencing is a billion-dollar, worldwide enterprise. Terabytes of sequence data generated through a melding of biology, chemistry, physics, mathematics, computer science, and engineering are changing the way biologists work and think. And in 2000, biologists deciphered many new genomes, including that of humans. In its 22 December edition, Science marks this torrent of genome data as the Breakthrough of the Year.

A year ago researchers had completely spelled out the genome of only one multicellular organism, a worm called Caenorhabditis elegans. Now sequences exist for the fruit fly, human, and the plant geneticist’s beloved Arabidopsis thaliana. Not far behind are drafts of the mouse, rat, and zebrafish genomes, as well as two species of puffer fish. In addition, some 60 microbial genomes are now on file, including those of the villains that cause cholera and meningitis. Most of these data are accessible to scientists free of charge, catalyzing a vast exploration for new discoveries.

As a result, the study of genome data is now in hyperdrive. By comparing mouse to human, worm to fly, or even mouse to mouse, a new breed of computer-savvy biologists is hacking through the thickets of the DNA code, discovering not just genes, but also other important bits of genetic material, and even evolutionary secrets. We are learning, for example, that we have a lot more in common with more distantly related organisms than we thought.
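To give a flavour of what such comparisons involve at the lowest level, the short sketch below computes the classic Needleman-Wunsch global alignment score for two short DNA fragments in Python. It is purely illustrative: the sequences and scoring values are invented, and real genome comparisons rely on far more sophisticated tools and data.

# Minimal Needleman-Wunsch global alignment, the kind of pairwise comparison
# that underlies cross-species sequence analysis. Sequences and scoring
# values are invented for illustration.

def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
    """Return the optimal global alignment score of sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    # score[i][j] holds the best score for aligning a[:i] with b[:j]
    score = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):
        score[i][0] = i * gap          # a prefix of a aligned against gaps
    for j in range(1, cols):
        score[0][j] = j * gap          # a prefix of b aligned against gaps
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[-1][-1]

if __name__ == "__main__":
    human_fragment = "GATTACAGGT"   # toy sequences, not real genomic data
    mouse_fragment = "GATTTCAGT"
    print(needleman_wunsch(human_fragment, mouse_fragment))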

This explosion of genetic knowledge comes with some heavy ethical and social baggage: It is not clear how society will deal with the growing potential to manipulate genomes, and many governments are grappling with how to protect individual rights once the technology exists for reading each person’s genome. But the allure of the new knowledge has made the quest irresistible. This year’s revolution may well be the breakthrough of the decade, perhaps even the century, for all its potential to alter our view of the world we live in.

Read full review at http://sciencenow.sciencemag.org/cgi/content/full/2000/1221/2 (subscription required).

Internet Computing and the Emerging Grid

This thought-provoking article by IAN FOSTER opens with a quotation: “When the network is as fast as the computer’s internal links, the machine disintegrates across the net into a set of special purpose appliances” (Gilder Technology Report, June 2000). The following extract contains verbatim quotes from http://www.nature.com/nature/webmatters/grid/grid.html#author, to whet the reader’s appetite:

“Internet computing and Grid technologies promise to change the way we tackle complex problems. They will enable large-scale aggregation and sharing of computational, data and other resources across institutional boundaries. And harnessing these new technologies effectively will transform scientific disciplines ranging from high-energy physics to the life sciences…

One solution to the problem of inadequate computer power is to ‘cluster’ multiple individual computers. This technique, first explored in the early 1980s, is now standard practice in supercomputer centres, research labs and industry. The fastest supercomputers in the world are collections of microprocessors, such as the 8,000-processor ASCI White system at Lawrence Livermore National Laboratory in California. Many research laboratories operate low-cost PC clusters or ‘farms’ for computing or data analysis…

Although clustering can provide significant improvements in total computing power, a cluster remains a dedicated facility, built at a single location. Financial, political and technical constraints place limits on how large such systems can become. For example, ASCI White cost $110 million and needed an expensive new building. Few individual institutions or research groups can afford this level of investment…

Rapid improvements in communications technologies are leading many to consider more decentralized approaches to the problem of computing power. There are over 400 million PCs around the world, many as powerful as an early 1990s supercomputer. And most are idle much of the time. Every large institution has hundreds or thousands of such systems. Internet computing seeks to exploit otherwise idle workstations and PCs to create powerful distributed computing systems with global reach and supercomputer capabilities…

What does this all mean for science and the scientist? A simplistic view is that scientists with problems amenable to Internet computing now have access to a tremendous new computing resource. All they have to do is cast their problem in a form suitable for execution on home computers — and then persuade the public (or an Internet computing company) that their problem is important enough to justify the expenditure of “free” cycles…

But the real significance is broader. Internet computing is just a special case of something much more powerful — the ability for communities to share resources as they tackle common goals. Science today is increasingly collaborative and multidisciplinary, and it is not unusual for teams to span institutions, states, countries and continents. E-mail and the web provide basic mechanisms that allow such groups to work together. But what if they could link their data, computers, sensors and other resources into a single virtual laboratory? So-called Grid technologies seek to make this possible, by providing the protocols, services and software development kits needed to enable flexible, controlled resource sharing on a large scale…

The creation of large-scale infrastructure requires the definition and acceptance of standard protocols and services, just as the Internet Protocol (TCP-IP) is at the heart of the Internet. No formal standards process as yet exists for Grids (the Grid Forum is working to create one). Nonetheless, we see a remarkable degree of consensus on core technologies. Essentially all major Grid projects are being built on protocols and services provided by the Globus Toolkit, which was developed by my group at Argonne National Laboratory in collaboration with Carl Kesselman’s team at the University of Southern California’s Information Sciences Institute, and other institutions. This open-architecture and open-source infrastructure provides many of the basic services needed to construct Grid applications such as security, resource discovery, resource management and data access…

Although Internet and Grid computing are both new technologies, they have already proven themselves useful and their future looks promising. As technologies, networks and business models mature, I expect that it will become commonplace for small and large communities of scientists to create “Science Grids” linking their various resources to support human communication, data access and computation. I also expect to see a variety of contracting arrangements between scientists and Internet computing companies providing low-cost, high-capacity cycles. The result will be integrated Grids in which problems of different types can be routed to the most appropriate resource: dedicated supercomputers for specialized problems that require tightly coupled processors and idle workstations for more latency tolerant, data analysis problems.”
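As an editorial aside (not part of the quoted article), the toy Python sketch below illustrates the basic pattern Foster describes: a problem is cut into independent work units that can be handed to otherwise idle machines. Here a local process pool merely stands in for remote volunteer PCs, and the search problem itself is invented for the example.

# Toy illustration of the internet-computing pattern: split a problem into
# independent work units and dispatch them to spare processors. A local
# multiprocessing Pool stands in for idle PCs scattered across the network.
from multiprocessing import Pool

def work_unit(chunk):
    """One self-contained task: count 'hits' in a range of integers.
    A real project would ship such units to remote client machines."""
    start, stop = chunk
    return sum(1 for n in range(start, stop) if n % 7 == 0 and n % 11 == 3)

def split(total, n_units):
    """Cut the search space into independent, equally sized ranges."""
    step = total // n_units
    return [(i * step, (i + 1) * step) for i in range(n_units)]

if __name__ == "__main__":
    chunks = split(1_000_000, 8)
    with Pool() as pool:                       # stand-in for a pool of idle machines
        results = pool.map(work_unit, chunks)  # dispatch units, gather results
    print("total hits:", sum(results))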

Read full article at http://www.nature.com/nature/webmatters/grid/grid.html

Progress in imaging

The first December 2000 issue of TIME Magazine featured a series of articles entitled “Inventors and Inventions of the Year”; in one of them, the ‘winning combination’ of positron emission tomography (PET) and computerized tomography (CT) was celebrated as the major creative achievement in medical science. While the former method can reveal subtle metabolic processes such as tumour growth, the latter shows anatomical detail at very high resolution. The winning combination now allows the precise localisation of, for example, a tumour in relation to an organ. By early next year, the new machines will be installed at Manhattan’s Memorial Sloan-Kettering Cancer Center and other prominent medical facilities (TIME, December 4, 2000).

There is more imaging news to come. Surgeons could soon be manipulating 3D moving images floating in mid-air rather than on computer screens, twisting a brain scan around to locate an injury, say engineers at DERA, Britain’s soon-to-be-privatised defence research lab (http://www.newscientist.com/news/news.jsp?id=ns226914). DERA says it plans to have its first products based on advanced computer-generated holography (CGH) on the market in 2003. Unlike stereography or virtual reality, CGH does not require headgear to see the image: users manipulate images with tools that exist partly as real objects and partly as virtual tools.

CGH is based on the same principle as the holograms invented by Dennis Gabor in 1949. A hologram is essentially an interference pattern generated from the object being depicted. When light strikes the hologram it is diffracted, forming a series of wavelets. Interference between these wavelets produces wavefronts that simulate the light that would have come from the original object.

In a normal hologram, the image appears to be “inside” the hologram that produces it. But with a computer-generated hologram it is possible to produce interference patterns that simulate the waves from an object hanging in empty space. This means an image can be projected in front of the screen. There is another key difference, too: as well as displaying images of real objects, the CGH system can create 3D images of imaginary objects.
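To make the principle concrete, here is a minimal numerical sketch (not DERA’s system; all parameter values are arbitrary) that computes the interference pattern a single point object would record against a plane reference wave, which is exactly the kind of pattern a computer-generated hologram encodes.

# Toy computer-generated hologram: interference of a plane reference wave with
# the spherical wave from one point object. All values are illustrative.
import numpy as np

wavelength = 0.5e-6           # 500 nm (green light)
k = 2 * np.pi / wavelength    # wavenumber
z = 0.2                       # point object 20 cm behind the hologram plane
size, n = 5e-3, 1024          # 5 mm square hologram sampled on a 1024 x 1024 grid

x = np.linspace(-size / 2, size / 2, n)
X, Y = np.meshgrid(x, x)
r = np.sqrt(X**2 + Y**2 + z**2)        # distance from the point object to each pixel
object_wave = np.exp(1j * k * r)       # spherical wavelet (amplitude decay ignored)
reference_wave = 1.0                   # unit plane wave at normal incidence
hologram = np.abs(object_wave + reference_wave) ** 2   # recorded interference pattern

print(hologram.shape, float(hologram.min()), float(hologram.max()))

Displaying such a pattern on a suitable modulator and illuminating it with coherent light would reconstruct the point; a useful 3D scene is built from a vast number of such contributions, which hints at why full-size displays need so many pixels.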

The main problem with previous computer-generated holograms has been that they do not contain enough pixels to produce an image of a useful size; roughly a billion pixels are needed for a usable 3D image. DERA developed the screen on which the hologram is formed. Called an “active tiling modulator”, it uses ferroelectric liquid crystals to create the vast number of pixels that make up the hologram. The system is modular and can be scaled up or down to the required image size.

…and finally

If you want to know about the public image problem of scientists, and whether an image consultant could be of any help, read the interview with Max Clifford, one of the world’s top PR advisers. He has some interesting things to say about risk perception, media handling, and the like. See http://www.newscientist.com/opinion/opinion.jsp?id=ns227031

The BSE zoonosis

The year 2000 will be remembered for the economic and political consequences of a food-borne epidemic in cattle, later suspected and finally confirmed to have zoonotic potential. Around Christmas, Germany announced that beef products would be banned from shop shelves across the country because of the threat of BSE. The removal of German beef products is part of a general recall of such products across Europe.

The BSE crisis is an example of miscommunication between scientists, the media, and politicians. A dozen books have already been published on the subject, and more will be written. Britain’s inquiry into the BSE crisis has revealed significant weaknesses in the way the government used scientific advice and set research priorities on a topic of urgent social concern. The long-awaited report from the public inquiry into the official handling of Britain’s BSE epidemic concluded that ministers and civil servants did not deliberately lie to the public: they genuinely believed that the risks were minimal. There were, however, serious shortcomings in the way the crisis was handled. The report suggests that turf fights between the research councils and government departments may have delayed vital research in the UK in the early 1990s.

An archive on BSE research can be found at: http://www.nature.com/nature/fow/001102b_papers.html

Read also http://www.newscientist.com/news/news.jsp?id=ns227027