Influenza is one of the most serious recurring health risks facing the United States, and the world at large, every year. In a particularly bad season, the flu can wreak tremendous havoc. According to the CDC, between the 1976-1977 and 2006-2007 seasons, annual flu-associated deaths ranged from as few as 3,000 to as many as 49,000.
To better understand how influenza behaves and spreads, scientists from several teams are now using high performance computing tools to study the virus, HPCWire reported.
The news source reported that the scientists hail from the Texas Advanced Computing Center at the University of Texas at Austin, the San Diego Supercomputer Center at the University of California, San Diego, the University of Chicago Research Computing Center and the Department of Defense High Performance Computing Center. The team's research centers on how the influenza virus replicates.
The source noted that the most common treatment for influenza A is currently amantadine. However, this treatment's effectiveness is diminishing due to viral mutations. Consequently, these researchers, along with other scientists around the globe, are now working on alternative methods of defeating influenza.
HPC plays a key role in this effort. The team is using four HPC systems to simulate the complex process of proton transfer through the M2 protein channel within the influenza virus. The combined power of these HPC systems delivers an unprecedented degree of detail for multiscale simulations, according to the news source. For the first time, researchers were able to computationally describe the connection between mutations in the M2 protein and increasing drug resistance.
"Computer simulation, when done very well, with all the right physics, reveals a huge amount of information that you can't get otherwise," explained Gregory Voth, the Haig P. Papazian Distinguished Service Professor in Chemistry at the University of Chicago and one of the lead researchers.
According to Peter Preusch of the National Institutes of Health's National Institute for General Medical Sciences, these HPC simulations have the potential to usher in significant progress in the fight against influenza.
"This work helps expand the methods for molecular simulation available to researchers and may eventually lead to new and better drugs to treat influenza infections," said Preusch, the University of Chicago reported.
This collaboration is one of many ongoing efforts to leverage HPC capabilities for scientific computing projects. Thanks to its advanced capabilities, HPC is a critical tool for many scientific research initiatives. In particular, HPC's ability to run complex simulations that are stable and accurate is essential for research in numerous areas.
For example, the University of California, Santa Cruz, recently adopted a new HPC solution to improve the school's existing Hyades supercomputer, used primarily to perform astrophysics-related calculations. By running simulations on these HPC systems, researchers can study astrophysical phenomena that cannot be investigated experimentally.
Stability and accuracy can only be achieved through software that has been validated and tested – a difficult task for complex HPC systems. That's why debugging and memory analysis tools that support HPC's unique environments (multiple processes, multiple GPUs, co-processors, etc.) are key to creating successful simulations.