Computational biomedicine: digital doppelgängers for clinical trials?

According to a report by the British Parliament, 90% of the information that exists in the world today was created in the past two years. DNA analysis is now cheaper than ever, and omics studies have generated a mass of information and empirical evidence that would have been unthinkable a decade ago. Nonetheless, experts such as Peter Coveney, director of the Centre for Computational Science at University College London, warn that “biology is too complex to rely on data that have been blindly harvested.”

This may be one of the most significant criticisms leveled at the generation and management of biological big data, understood as the body of information gathered in recent years from thousands of analyses of genomes, proteomes and transcriptomes. “We need more theory and understanding if we are to make sense of this tsunami of data and omes,” writes Coveney on the blog of the Science Museum in London. He made this statement after publishing a detailed study in the journal Philosophical Transactions of the Royal Society A at the end of 2016, in which he reminded readers that “big data needs big theory too.” With that goal in mind, Coveney has recently launched a research consortium for excellence in computational biomedicine.


Image: Shaury Nash (Flickr)

CompBioMed, a European center of excellence in computational biomedicine, is born

The project has been conceived to develop new predictive and simulation methods with financial support from the Horizon 2020 program, which has provided a total of five million euros, according to a press release from Pompeu Fabra University. The university forms part of this center of excellence through the team of Dr. Gianni de Fabritiis. “From the Computational Biophysics group we would like to contribute our knowledge of molecular dynamics simulations, as well as our experience in managing the computing resources applied to them. We will work with molecular simulations in an attempt to solve a number of modern-day medical problems,” the researcher tells Biocores.
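To give a rough idea of what lies at the heart of a molecular dynamics simulation, the sketch below integrates Newton’s equations of motion for two argon-like atoms interacting through a Lennard-Jones potential, using the standard velocity Verlet scheme in reduced units. It is a minimal, illustrative example only, not code from CompBioMed or the de Fabritiis group; production biomolecular simulations track thousands to millions of atoms, solvent, electrostatics, thermostats and constraints, and run on the kind of HPC infrastructure the consortium targets.

```python
import numpy as np

# Minimal velocity Verlet integration of two Lennard-Jones particles
# (reduced units: epsilon = sigma = mass = 1). Illustrative sketch only.

def lj_force(r_vec):
    """Force on particle 1 due to particle 2 for a 12-6 Lennard-Jones potential."""
    r2 = np.dot(r_vec, r_vec)
    inv_r6 = 1.0 / r2**3
    # F_vec = 24 * (2/r^12 - 1/r^6) * r_vec / r^2 in reduced units
    return 24.0 * (2.0 * inv_r6**2 - inv_r6) * r_vec / r2

dt = 0.001                               # time step (reduced units)
pos = np.array([[0.0, 0.0, 0.0],
                [1.5, 0.0, 0.0]])        # two atoms, initial separation 1.5
vel = np.zeros_like(pos)                 # start at rest
f = lj_force(pos[0] - pos[1])
forces = np.array([f, -f])               # Newton's third law

for step in range(10000):
    # velocity Verlet: advance positions, recompute forces, then velocities
    pos += vel * dt + 0.5 * forces * dt**2
    new_f = lj_force(pos[0] - pos[1])
    new_forces = np.array([new_f, -new_f])
    vel += 0.5 * (forces + new_forces) * dt
    forces = new_forces

print("final separation:", np.linalg.norm(pos[0] - pos[1]))
```

Starting from rest at a separation wider than the potential minimum, the two atoms simply oscillate about it; the point of the sketch is the update loop itself, which is the basic building block that HPC molecular dynamics codes scale up to whole proteins and drug candidates.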

The main goal of CompBioMed is to leverage high-performance computing (HPC) in three main areas of research: cardiovascular, molecular and neuromusculoskeletal medicine. Another organization participating in the consortium is the Barcelona Supercomputing Center-Centro Nacional de Supercomputación (BSC-CNS). Dr. Mariano Vázquez, who will serve as application manager of CompBioMed, tells Biocores that he will be responsible for “coordinating the research tasks that lead to the integration of all the tools we have into a simulation environment capable of providing this service.”

His work will focus on developing parallel software that will later be installed at European supercomputing centers, such as the BSC-CNS in Barcelona, EPCC in Scotland and SURFsara in the Netherlands, so that it can be used by researchers specializing in biomedicine. “The idea is that, working together, in three years’ time we will have developed at least a prototype simulation environment that can be used by any type of user,” says Vázquez. In his opinion, the work done by CompBioMed will help to “produce, at lower cost and a faster pace, more and better medications, stents, catheters, valves or inhalers. It will also help reduce animal testing.”


Source: Pixabay

The biomedical consortium faces two major challenges. First, the project aims to improve the available simulation tools. “If simulating means recreating nature in a computer, we still aren’t simulating too well. There is much room for improvement. We need better physiological, numerical and computational models. We don’t have them because, as nature wasn’t designed by an engineer, it is very difficult for us to understand,” says Vázquez in remarks to Biocores. Second, the dissemination of these advances to society is fundamental, so that “they don’t seem so much like science fiction, but rather a reality that, though there is much to improve, is working better all the time.”

Coveney adds that, “in the longer term, we are also working on virtual humans, so treatments can be tested on a digital doppelgänger of a patient as a trial run before treatment.” This vision is shared by Hiroaki Kitano, president of the Systems Biology Institute in Tokyo, which was behind one of the first initiatives to create something like a “virtual human being” through the Garuda Alliance’s specialized software. Thanks to advances in high-performance computing, medicine is going through a true paradigm shift. “We call this doppelgänger a ‘computational human’. That’s what we’re after. It won’t be soon, but it will happen. And not suddenly, but little by little,” states Vázquez, the BSC-CNS scientist.

Although the greatest challenge is to develop a “digital doppelgänger”, a kind of virtual human being that could improve clinical trials and, ultimately, lead to more accurate, economical and safer treatments, the truth is that we are still far from such a scenario. Nonetheless, the launch of projects like CompBioMed bodes well for the future of personalized medicine, in which computational methods based on human biology have already become a fundamental tool. This is shown by success stories such as the application of computational technology in the fight against multiple myeloma, the use of computational methods to improve drug development, and their application to reduce animal testing.