Internet Computing and the Emerging Grid

This thought-provoking article by IAN FOSTER opens with a quotation: “When the network is as fast as the computer’s internal links, the machine disintegrates across the net into a set of special purpose appliances. — Gilder Technology Report, June 2000.” The following extract contains verbatim quotes from the article, to whet the reader’s appetite:

“Internet computing and Grid technologies promise to change the way we tackle complex problems. They will enable large-scale aggregation and sharing of computational, data and other resources across institutional boundaries. And harnessing these new technologies effectively will transform scientific disciplines ranging from high-energy physics to the life sciences…

One solution to the problem of inadequate computer power is to ‘cluster’ multiple individual computers. This technique, first explored in the early 1980s, is now standard practice in supercomputer centres, research labs and industry. The fastest supercomputers in the world are collections of microprocessors, such as the 8,000-processor ASCI White system at Lawrence Livermore National Laboratory in California. Many research laboratories operate low-cost PC clusters or ‘farms’ for computing or data analysis…

Although clustering can provide significant improvements in total computing power, a cluster remains a dedicated facility, built at a single location. Financial, political and technical constraints place limits on how large such systems can become. For example, ASCI White cost $110 million and needed an expensive new building. Few individual institutions or research groups can afford this level of investment…

Rapid improvements in communications technologies are leading many to consider more decentralized approaches to the problem of computing power. There are over 400 million PCs around the world, many as powerful as an early 1990s supercomputer. And most are idle much of the time. Every large institution has hundreds or thousands of such systems. Internet computing seeks to exploit otherwise idle workstations and PCs to create powerful distributed computing systems with global reach and supercomputer capabilities…

What does this all mean for science and the scientist? A simplistic view is that scientists with problems amenable to Internet computing now have access to a tremendous new computing resource. All they have to do is cast their problem in a form suitable for execution on home computers — and then persuade the public (or an Internet computing company) that their problem is important enough to justify the expenditure of “free” cycles…

But the real significance is broader. Internet computing is just a special case of something much more powerful — the ability for communities to share resources as they tackle common goals. Science today is increasingly collaborative and multidisciplinary, and it is not unusual for teams to span institutions, states, countries and continents. E-mail and the web provide basic mechanisms that allow such groups to work together. But what if they could link their data, computers, sensors and other resources into a single virtual laboratory? So-called Grid technologies seek to make this possible, by providing the protocols, services and software development kits needed to enable flexible, controlled resource sharing on a large scale…

The creation of large-scale infrastructure requires the definition and acceptance of standard protocols and services, just as the Internet Protocol (TCP/IP) is at the heart of the Internet. No formal standards process as yet exists for Grids (the Grid Forum is working to create one). Nonetheless, we see a remarkable degree of consensus on core technologies. Essentially all major Grid projects are being built on protocols and services provided by the Globus Toolkit, which was developed by my group at Argonne National Laboratory in collaboration with Carl Kesselman’s team at the University of Southern California’s Information Sciences Institute, and other institutions. This open-architecture and open-source infrastructure provides many of the basic services needed to construct Grid applications, such as security, resource discovery, resource management and data access…

Although Internet and Grid computing are both new technologies, they have already proven themselves useful and their future looks promising. As technologies, networks and business models mature, I expect that it will become commonplace for small and large communities of scientists to create “Science Grids” linking their various resources to support human communication, data access and computation. I also expect to see a variety of contracting arrangements between scientists and Internet computing companies providing low-cost, high-capacity cycles. The result will be integrated Grids in which problems of different types can be routed to the most appropriate resource: dedicated supercomputers for specialized problems that require tightly coupled processors and idle workstations for more latency tolerant, data analysis problems.”

Read full article at


Progress in imaging

The first December 2000 issue of TIME Magazine featured a series of articles entitled “Inventors and Inventions of the Year”; in one of them, the ‘winning combination’ of positron emission tomography (PET) and computed tomography (CT) was celebrated as the major creative achievement in the area of medical science. While the former method can reveal subtle metabolic processes such as tumour growth, the latter shows anatomical detail at very high resolution. The winning combination now allows a tumour, for example, to be located precisely in relation to an organ. By early next year, the new machines will be installed at Manhattan’s Memorial Sloan-Kettering Cancer Center and other prominent medical facilities (TIME, December 4, 2000).

There is more imaging news to come. Surgeons could soon be manipulating 3D moving images floating in mid-air rather than on computer screens, twisting a brain scan around to locate an injury, say engineers at DERA, Britain’s soon-to-be-privatised defence research lab. DERA says it plans to have its first products based on advanced computer-generated holography (CGH) on the market in 2003. Unlike stereography or virtual reality, CGH does not require headgear to see the image; users manipulate images using tools that exist partly as real objects and partly as virtual tools.

CGH is based on the same principle as the holograms invented by Dennis Gabor in 1948. A hologram is essentially an interference pattern generated from the object being depicted. When light strikes the hologram it is diffracted, forming a series of wavelets. Interference between these wavelets produces wavefronts that simulate the light that would have come from the original object.
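The interference principle described above can be illustrated numerically. The following is a minimal sketch (not DERA’s method; all parameters are illustrative) that computes the fringe pattern recorded when the spherical wave from a single point object meets a flat reference wave:

```python
import numpy as np

wavelength = 633e-9          # red laser light, metres (illustrative)
k = 2 * np.pi / wavelength   # wavenumber

n = 512                      # pixels per side (a real CGH needs ~1e9 in total)
pitch = 10e-6                # pixel spacing on the hologram plane, metres
coords = (np.arange(n) - n / 2) * pitch
x, y = np.meshgrid(coords, coords)

# Spherical wave from a point object 0.1 m behind the hologram plane
# (constant unit amplitude, for simplicity)
z = 0.1
r = np.sqrt(x**2 + y**2 + z**2)
object_wave = np.exp(1j * k * r)

# Plane reference wave arriving at a slight angle
reference_wave = np.exp(1j * k * x * np.sin(np.radians(1.0)))

# The hologram is the recorded intensity of the superposition;
# re-illuminating this fringe pattern reconstructs the object wave.
hologram = np.abs(object_wave + reference_wave) ** 2
```

Displaying such a pattern on a sufficiently fine modulator and illuminating it with coherent light is, in essence, what a CGH display does.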

In a normal hologram, the image appears to be “inside” the hologram that’s producing it. But with a computer-generated hologram it is possible to produce interference patterns that simulate the waves from an object hanging in empty space. This means an image can be projected in front of the screen. There is another key difference, too: as well as displaying images of real objects, the CGH system can create 3D images of imaginary objects.

The main problem with previous computer-generated holograms has been that they don’t have enough pixels to produce an image of a useful size; roughly a billion pixels are needed to produce a 3D image. DERA developed the screen on which the hologram is formed. Called an “active tiling modulator”, it uses ferroelectric liquid crystals to create the vast numbers of pixels that form a hologram. The system is modular and can be scaled up or down to the required image size.


…and finally

If you want to know about the public image problem of scientists, and whether an image consultant could be of any help, read the interview with Max Clifford, one of the world’s top PR advisers. He says some interesting things on risk perception, media handling etc. Look into


The BSE zoonosis

The year 2000 will be remembered for the economic and political consequences of a food-borne epidemic in cattle, later suspected and finally confirmed to have zoonotic potential. Around Christmas, Germany announced that beef products would be removed from shop shelves across the country because of the threat of BSE. The removal of German beef products is part of a general recall of such products across Europe.

The BSE crisis is an example of miscommunication between scientists, the media and politicians. A dozen books have already been published on the subject, and more will be written. Britain’s inquiry into the BSE crisis has revealed significant weaknesses in the way the government used scientific advice and established research priorities on a topic of urgent social concern. The long-awaited report from the public inquiry into the official handling of Britain’s BSE epidemic concluded that ministers and civil servants did not deliberately lie to the public – they genuinely believed that the risks were minimal. There were, however, serious shortcomings in the way the crisis was handled. The BSE report suggests that turf fights between the research councils and government departments may have delayed vital research in the early 1990s in the UK.

An archive on BSE research can be found at:

Read also


Vaccines are here to stay, but…

Research in recent years has accumulated evidence that vaccination – the financial mainstay of many a companion animal practice – may not be as innocuous as previously thought. Injection-site fibrosarcoma in cats is a case in point. Four years ago, a Vaccine-Associated Feline Sarcoma Task Force was established, consisting of representatives from the American Veterinary Medical Association, the American Animal Hospital Association, the Veterinary Cancer Society, the American Association of Feline Practitioners, the U.S. Department of Agriculture, the Animal Health Institute and the Cornell Feline Health Center. Since 1997, the task force has supported research on different aspects of vaccine-associated sarcomas in cats; the most recent request for grant proposals can be found at. Based on the insight accumulated during the last four years, the practice of wholesale yearly vaccination has come under attack, and less frequent vaccination is now recommended. A synopsis of vaccination risks, with literature references, can be found at

A general reluctance to vaccinate is not only a veterinary problem. A recent Reuters release entitled “Vaccine Exemptions Mean More Sick Children” cited an 11-year study of Colorado schoolchildren aged 3 to 18, which found much higher rates of measles and pertussis among unvaccinated children. Measles rates were 22 times higher and whooping cough rates were six times higher in unvaccinated children than in youngsters who received the vaccinations, said the report from the U.S. Centers for Disease Control and Prevention in Atlanta.
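Figures like “22 times higher” are relative risks: the disease rate in the unvaccinated group divided by the rate in the vaccinated group. A minimal sketch, using hypothetical counts chosen only so the ratio matches the reported 22-fold measles difference (not the study’s actual data):

```python
# Hypothetical case counts and group sizes, for illustration only
cases_unvaccinated, n_unvaccinated = 11, 1_000
cases_vaccinated, n_vaccinated = 50, 100_000

# Rate of disease in each group
rate_unvaccinated = cases_unvaccinated / n_unvaccinated  # 0.011
rate_vaccinated = cases_vaccinated / n_vaccinated        # 0.0005

# Relative risk: how many times more likely disease is without vaccination
relative_risk = rate_unvaccinated / rate_vaccinated      # 22.0
```

Note that a relative risk compares rates, not raw case counts, so it remains meaningful even when the two groups differ greatly in size, as vaccinated and unvaccinated populations usually do.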

An increasing number of parents see less need to have their children vaccinated because these childhood illnesses have become so rare. Many are also aware of media reports and warnings from anti-vaccine groups that the vaccines themselves might pose dangers to their children, such as autism, seizures or diabetes.

The report emphasizes the established fact that besides putting themselves at risk, unvaccinated individuals are a source of infectious agents for others. Most vaccine-preventable diseases are spread from person to person, and "…therefore the health of any individual in the community is intricately dependent on the health of the rest of the community," the study’s author Daniel Feikin was quoted as saying.

Read full review at