
Computers can cure ills faster

Last Updated 21 December 2009, 17:11 IST

How do you predict whether a given patient is likely to die from a heart attack? Conventional medical wisdom would base a risk assessment on factors such as the person’s age, whether they smoke or have diabetes, and the results of cardiac ultrasound and various blood tests. It may be that a better predictor is a computer program that analyses the patient’s electrocardiogram, looking for subtle features within the data provided by the instrument.

A team of researchers at MIT and the University of Michigan analysed a large data-set of 24-hour electrocardiogram recordings collected at a Boston hospital.

Employing a number of computational techniques, the researchers developed a way to analyse how the shape of the electrical waveform varies, a measure they dubbed morphological variability. At the heart of the approach are mathematical techniques used in speech recognition and genome analysis that allow researchers to compare individual beats. “We compute the differences for every pair of beats,” said one researcher. “If there is lots of variability, that patient is in bad shape.”
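To make the idea concrete, here is a minimal sketch of that kind of calculation, not the researchers’ actual code. It assumes each heartbeat has already been segmented from the ECG into its own list of voltage samples, uses dynamic time warping (a standard speech-recognition technique for comparing sequences) to score how differently shaped two beats are, and averages those scores over every pair of beats; the function names are illustrative only.

```python
from itertools import combinations

def dtw_distance(beat_a, beat_b):
    """Dynamic-time-warping distance between two beats (lists of voltages)."""
    n, m = len(beat_a), len(beat_b)
    INF = float("inf")
    # cost[i][j] = best alignment cost of beat_a[:i] against beat_b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(beat_a[i - 1] - beat_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch beat_b
                                 cost[i][j - 1],      # stretch beat_a
                                 cost[i - 1][j - 1])  # match the two samples
    return cost[n][m]

def morphological_variability(beats):
    """Average pairwise beat-shape difference; higher values suggest higher risk."""
    pairs = list(combinations(beats, 2))
    return sum(dtw_distance(a, b) for a, b in pairs) / len(pairs)

# Toy usage with three hand-made "beats" of slightly different shape:
beats = [[0, 1, 5, 1, 0], [0, 1, 4, 2, 0], [0, 2, 6, 1, 0]]
print(morphological_variability(beats))
```

In practice a 24-hour recording contains tens of thousands of beats, which is why this sort of analysis only became feasible with serious computing power behind it.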

The team then applied their algorithm to a second set of electrocardiogram recordings and found that patients with the highest morphological variability were six to eight times more likely to die from a heart attack than those with low variability. They concluded that it consistently predicted as well as, or better than, the indicators commonly used by physicians.

In the same week, researchers at the Sanger Institute in Cambridge announced that they had reconstructed the biological history of two types of cancer. The research revealed 23,000 mutations caused by the 60 or so chemicals in cigarette smoke that stick to the DNA and deform it.

Although these stories are reports about medical research, they are really about computing – in the sense that neither would have been possible without the application of serious computer power to masses of data. They reflect a new – but unacknowledged – reality: that in many important fields, leading-edge scientific research cannot be done without access to vast computational and data-handling facilities, with sophisticated software for analysis. In many significant areas, advanced research is no longer done by individuals looking through microscopes or telescopes, but by computers enabling investigators to collate, visualise and analyse the torrents of data produced by arrays of instruments.

The man who did most to alert the world to the urgent need to take “computational science” seriously was Jim Gray, a much-loved visionary who worked for Microsoft Research. Towards the end of his life, Gray argued that we had moved into what he called “the Fourth Paradigm” of scientific research, which he dubbed “data-intensive scientific discovery”.

His essays remind us that we now have to provide the resources to fund the IT infrastructure that researchers need. If we don’t give them these tools, then we cannot expect them to finish the job.

(Published 21 December 2009, 17:11 IST)

