Can human ingenuity, assisted by new and emerging technologies, overpower Covid-19? Will faster processing of more—and more relevant—data, analyzed with the right models, yield better insights into mitigating the spread of future pandemics, designing effective treatments, and developing successful vaccines? A number of promising initiatives were announced in recent weeks, aiming to enlist data, AI algorithms, supercomputers, and human expertise in the fight against our global predicament.
The Digital Transformation Institute, a new research consortium established by C3.ai, Microsoft, a number of leading universities, and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (UIUC), announced its first call for proposals for “AI techniques to mitigate pandemics.” In addition to a total of $5.8 million in cash awards, Microsoft and C3.ai will provide recipients with significant cloud computing, supercomputing, data access, and AI software resources, along with technical support. Thomas M. Siebel, founder and chief executive of C3.ai, told The New York Times, “I cannot imagine a more important use of AI.”
IBM is sharing supercomputing resources, cloud-based data repositories, and AI-driven search tools. The Allen Institute for AI has partnered with leading research groups to prepare and distribute the Covid-19 Open Research Dataset (CORD-19), a free resource of over 51,000 scholarly articles. Google’s Kaggle has launched a series of data science competitions to answer Covid-19 questions posed by the National Academies of Sciences, Engineering, and Medicine (NASEM) and the World Health Organization (WHO). The Covid-19 High Performance Computing (HPC) Consortium is providing broad access to over 30 supercomputing systems, representing “over 402 petaflops, 105,334 nodes, 3,539,044 CPU cores, 41,286 GPUs, and counting.”
These are just a few examples of recent efforts combining the power of data with computing power to understand and respond to the coronavirus; super-fast computers, or supercomputers, crunching vast amounts of data are at the core of these initiatives. In the future, quantum computers, which promise to outperform today’s supercomputers on certain classes of problems, may speed up how our responses to pandemics are determined and deployed.
D-Wave Systems, a quantum computing startup, recently announced the immediate availability of free access to its cloud computing service, “designed to bring both classical and quantum resources to quickly and precisely solve highly complex problems with up to 10,000 fully connected variables.” Joining this initiative are a number of D-Wave’s partners and customers, including Forschungszentrum Jülich, a German interdisciplinary research center. According to Prof. Dr. Kristel Michielsen from the Jülich Supercomputing Centre, the initiative “is promising to accelerate the solution of complex problems in pharmacology and epidemiology, such as those that have arisen in the unprecedented COVID-19 crisis, by means of hybrid workflows from quantum-classical computer simulations. To make efficient use of D-Wave’s optimization and AI capabilities, we are integrating the system into our modular HPC environment.”
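The “fully connected variables” D-Wave refers to are the binary variables of a quadratic unconstrained binary optimization (QUBO) problem, the formulation its annealers minimize. As a rough illustration of what such a formulation looks like, here is a tiny classical brute-force sketch (the coefficients are invented for the example, and this is not D-Wave’s API; an annealer samples low-energy assignments for thousands of variables where exhaustive search is impossible):

```python
from itertools import product

def qubo_energy(x, Q):
    """Energy of binary assignment x under a QUBO given as {(i, j): weight}."""
    return sum(w * x[i] * x[j] for (i, j), w in Q.items())

def brute_force_minimum(Q, n):
    """Exhaustively find the lowest-energy assignment of n binary variables."""
    best = min(product([0, 1], repeat=n), key=lambda x: qubo_energy(x, Q))
    return best, qubo_energy(best, Q)

# Toy objective: reward selecting variables 0 and 2 (negative diagonal terms),
# penalize selecting 0 and 1 together (positive coupling term).
Q = {(0, 0): -1.0, (1, 1): 0.5, (2, 2): -1.0, (0, 1): 2.0}
assignment, energy = brute_force_minimum(Q, n=3)
print(assignment, energy)  # -> (1, 0, 1) -2.0
```

Brute force scales as 2^n, which is exactly why hybrid quantum-classical workflows like the one Michielsen describes hand the sampling step to specialized hardware.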
While quantum computing is just emerging as a viable technology, it stands in practice on the shoulders of the many scientists who have solved complex problems for years with high-performance computing (HPC). Peter Rutten, Research Director at IDC’s Infrastructure Systems, Platforms and Technologies Group, observes that “you can probably draw a fairly direct line between using HPC in the cloud to attack the Covid-19 problem to using the initial quantum computing capabilities that exist in the cloud to attack the Covid-19 problem.”
Last month, IDC published the results of a survey of 520 IT and business users worldwide, along with in-depth interviews with current quantum computing end-users. A little less than 75% of respondents reported that their organizations are very interested in quantum computing, 52% of the organizations surveyed plan to begin experimenting with the technology in the next 18-24 months, and about 10% indicated that their quantum computing technologies are already being operationalized.
“When we asked why are you investing in technology that may not show ROI in the short term,” says Rutten, “quite a few answered—we’ve run out of capabilities with our HPC environment, we cannot solve these problems with the HPC infrastructure that we have today. That’s really the jump that a lot of businesses are making, the jump from HPC to quantum.”
The healthcare and life sciences industry was found by the IDC survey to be one of the sectors most interested in quantum computing. Developing and distributing drugs faster, drug discovery, and clinical trial enhancements were some of the key motivations for experimenting with quantum computing, according to Heather West, Senior Research Analyst at IDC’s Infrastructure Systems, Platforms and Technologies Group. A pandemic like Covid-19 adds urgency to this interest in a very new technology.
“Quantum computing could identify patterns that will allow us to identify something like Covid-19 earlier. We would be able to work faster to identify compounds, put together a vaccine faster, or determine faster the different ways by which we can slow the transmission,” says West. In addition, she thinks the global supply chain is a prime candidate for the use of quantum computers, allowing for the quick identification of patterns of supply and demand and for swift action in response to sudden shortages or surpluses.
“Drugs today are discovered on a trial and error basis. It takes five or more years to develop a new drug, at a cost of $1 billion,” says Doug Finke, Managing Editor of the Quantum Computing Report. Quantum computers can simulate chemical reactions at the molecular level and quickly narrow down possible candidates for drugs and vaccines from 10,000 compounds to a few dozen. The work can be done “in the computer before the petri dish,” says Finke. But he cautions that “it will take 2-3 years to learn how to use these machines productively.”
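The narrowing-down Finke describes is, in classical terms, a virtual screening funnel: estimate how well each compound in a library binds a target, then keep only the top performers for lab testing. A deliberately simplified sketch of that funnel (the scoring function and compound library here are invented placeholders; real pipelines compute scores with molecular docking or, eventually, quantum chemical simulation):

```python
import random

def binding_score(compound_id):
    """Placeholder for an expensive per-compound simulation that estimates
    binding affinity to the target; seeded RNG stands in for a real score."""
    return random.Random(compound_id).uniform(0.0, 1.0)

def screen(num_compounds=10_000, keep=36):
    """Score every compound in the library and keep the top candidates."""
    ranked = sorted(range(num_compounds), key=binding_score, reverse=True)
    return ranked[:keep]

candidates = screen()
print(len(candidates))  # 10,000 compounds narrowed to a few dozen
```

The funnel itself is trivial; the bottleneck is the scoring step, which is where quantum simulation of molecular interactions could replace coarse classical approximations.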
The IDC survey found that the three biggest challenges for organizations considering the adoption of quantum computing are cost (26%), training resources (22%), and long-term budgets (22%). About a third of respondents expect quantum computing to improve their AI capabilities (32%), accelerate decision making (31%), and increase productivity/efficiency (30%).
“Now is the time to start building the vision, the expertise, dedicating teams and resources” for quantum computing, says Brian Solis, Global Innovation Evangelist at Salesforce. “The stepping stones to get there are building a center of excellence around AI,” he adds, making AI the focal point of the organization’s efforts to become more agile and innovative. “It forces you to get better data, clean the data, and build expertise and key capabilities around the data. Complement that with a smaller set of resources, a Center of Excellence for quantum computing,” says Solis.
The economic consequences of the current global pandemic may reduce corporate and venture capital technology investments in the short term, but we may also see accelerated investment in emerging technologies, particularly those that promise to help prevent and mitigate future pandemics.
Governments worldwide may show a specific interest, even in the short term, in technology investments. Alongside relatively new AI tools (which are all about identifying patterns in large data repositories), quantum computing will probably continue to benefit from public funds. Many governments have established ongoing research programs in both areas, due mostly to concerns about future national competitiveness and, specifically, cybersecurity capabilities (see here for a comprehensive list of government-funded or supported quantum computing initiatives).
Governments, however, should take into consideration the less-convenient truth that investing only in computing power and speed will not raise the bar high enough to defend us from future pandemics. In the US, for example, the healthcare system is still plagued by an antiquated technology infrastructure in which one hospital still can’t communicate electronically with another hospital a few miles down the road (and sometimes, as I experienced myself last year, between medical offices located in the same building and belonging to the same healthcare system…).
More important, antiquated and failed “privacy protection” policies and regulations currently take control of healthcare data out of the hands of the people they are supposed to protect and greatly encumber efforts to use the data for research. Government programs aimed at quickly stopping or overcoming the next pandemic must address—and fix—the regulatory and legal issues of managing data in the 21st century, starting with healthcare.
Read the full article here.