Big data plays a central role in helping to combat COVID-19 by forging partnerships between the private and public sectors.
If we were to ask, “What’s big data used for during the COVID-19 pandemic?” we’d get many different responses.
How companies use big data varies, and while many big data companies are helping governments and health organizations better understand the coronavirus in order to combat it, some of their applications are quite controversial.
Mapping reported cases of the virus, using predictive models to understand how it will affect global healthcare institutions, and enlisting big data for new drug discoveries are all noble pursuits.
At the same time, big data is also being used to track citizens' movements, and many fear that this could usher in an era of mass surveillance unlike anything seen before.
Here we take a look at how big data is helping combat COVID-19 in ways that are in line with the values of democratic societies.
In order to understand where the coronavirus is actually spreading, the reported cases need to be mapped.
But cases can't be mapped until the underlying data has been collected and tallied. Once all the data is available, big data tools can get to work making sense of it in order to produce accurate heat maps.
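Before anything can be mapped, case reports from different feeds have to be cleaned and merged. The sketch below shows that aggregation step in miniature; the region names, counts, and field layout are invented for illustration, not taken from any real data source.

```python
from collections import defaultdict

# Hypothetical case reports pulled from two different feeds. In practice,
# sources disagree on region spellings and report at different times, so
# records must be normalized and de-duplicated before totals are mapped.
source_a = [
    {"region": "Lombardy", "confirmed": 1520},
    {"region": "Hubei", "confirmed": 67800},
]
source_b = [
    {"region": "lombardy ", "confirmed": 1534},  # newer, messier duplicate
    {"region": "New York", "confirmed": 25000},
]

def normalize(record):
    """Canonicalize the region name so duplicates collapse to one key."""
    return record["region"].strip().title(), record["confirmed"]

def aggregate(*sources):
    """Keep the highest (i.e., most recent cumulative) count per region."""
    totals = defaultdict(int)
    for source in sources:
        for record in source:
            region, confirmed = normalize(record)
            totals[region] = max(totals[region], confirmed)
    return dict(totals)

counts = aggregate(source_a, source_b)
print(counts)  # region -> confirmed count, ready to plot as a heat map
```

The output of a step like this — one clean count per region — is what a dashboard such as the Johns Hopkins map can then render geographically.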
Johns Hopkins University was one of the first organizations to create a heat map of all reported coronavirus cases in the world.
Updated in near real time, the map draws on many inputs: "Johns Hopkins experts are aggregating data from multiple credible sources to track the spread of COVID-19," according to the website.
In addition to the heat map, Johns Hopkins researchers are also tracking “how the novel coronavirus is spreading around the globe with up-to-date visuals that give context to the data collected on Johns Hopkins University’s COVID-19 map.”
Knowing where the virus is and where it is spreading is an important first step in understanding the extent of the pandemic.
Next, predictive models are needed to figure out what will happen in the future in order to prepare for it.
Big data only works efficiently when the raw data being analyzed is good to begin with.
When data is faulty, results are skewed, and the same goes for predictive models that try to forecast how many people will be infected and how institutions should respond.
Overshooting infection-rate projections risks squandering precious time and resources on hospital beds, ventilators, and other equipment that ends up unused and collecting dust. Undershooting them could mean overwhelming the entire healthcare system, which was never designed to handle the onrush of patients a pandemic outbreak brings.
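The sensitivity described above can be seen even in the simplest textbook epidemic model. The sketch below is a discrete-time SIR (susceptible-infected-recovered) model — a standard teaching tool, not any forecasting group's actual method — and the parameter values are illustrative only.

```python
def sir_peak(beta, gamma, s0, i0, days):
    """Discrete-time SIR model; returns the peak infected fraction.
    beta is the daily transmission rate, gamma the daily recovery rate.
    Real forecasting models are far more sophisticated than this."""
    s, i, r = s0, i0, 0.0
    peak = i
    for _ in range(days):
        new_infections = beta * s * i
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        peak = max(peak, i)
    return peak

# Two guesses at the transmission rate; populations are fractions of 1.
low  = sir_peak(beta=0.25, gamma=0.1, s0=0.999, i0=0.001, days=200)
high = sir_peak(beta=0.35, gamma=0.1, s0=0.999, i0=0.001, days=200)
print(f"peak infected fraction: {low:.1%} vs {high:.1%}")
```

A modest change in one input produces a very different peak caseload, which is exactly why feeding bad raw data into a model can leave planners drastically over- or under-prepared.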
In the fight against COVID-19, some of the biggest players in the private sector have teamed up with government research labs in an unprecedented collaboration to enlist big data using the most powerful supercomputers on the planet.
Through the newly-established COVID-19 High Performance Computing (HPC) Consortium, companies like IBM, Amazon, Google, Microsoft, and Intel (to name a few) are all working with government and academia in a joint effort “to provide access to the world’s most powerful high-performance computing resources in support of COVID-19 research.”
With over 37 open projects, many are working with big data to create predictive models in “bioinformatics, epidemiology, and molecular modeling to understand the threat we’re facing and to develop strategies to address it.”
Some of these big data projects include:
And this brings us to another way in which big data is being conscripted to combat COVID-19 – drug discovery.
What used to take years to compute can now be done in a matter of minutes thanks to big data and powerful supercomputers.
Thanks to advances in big data, drug discovery has received a massive booster shot in recent years, which is fortunate timing given the circumstances.
As the world races to discover which drugs are most effective against COVID-19, big data is there every step of the way.
At the HPC Consortium, nearly every open project has something to do with drug discovery, including:
And the list of projects goes on and on with more technical applications of big data for drug discovery.
According to Dario Gil, Director of IBM Research, “IBM’s Summit, the most powerful supercomputer on the planet, has already enabled researchers at the Oak Ridge National Laboratory and the University of Tennessee to screen 8,000 compounds to find those that are most likely to bind to the main ‘spike’ protein of the coronavirus, rendering it unable to infect host cells.”
“These experiments would take years to complete if worked by hand, or months if handled on slower, traditional computing platforms,” he added.
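The screening Gil describes follows a filter-and-rank pattern: score thousands of candidate compounds, then shortlist the strongest predicted binders. The toy sketch below shows only that pattern; the compound names and scores are invented, and real screening relies on molecular-docking simulations, not a lookup table.

```python
# Illustrative only: in reality each score comes from a costly docking
# simulation run on a supercomputer. By convention, lower (more negative)
# docking scores indicate stronger predicted binding.
candidates = {
    "compound-001": -9.2,
    "compound-002": -4.1,
    "compound-003": -7.8,
    "compound-004": -10.5,
}

def shortlist(scores, threshold=-7.0):
    """Keep compounds whose predicted binding beats the threshold,
    returned strongest binders first."""
    hits = [name for name, score in scores.items() if score <= threshold]
    return sorted(hits, key=scores.get)

print(shortlist(candidates))
# → ['compound-004', 'compound-001', 'compound-003']
```

Scale this from four compounds to 8,000, with each score requiring heavy simulation, and the value of a machine like Summit becomes clear.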
Looking back on how big data is helping to combat COVID-19, we’ve seen government, industry, and academia all come together towards a common goal.
However, public opinion of the big data industry is in decline, fueled by concerns over infringements of citizens' privacy, especially in responses to COVID-19.
The combined data from contact tracing, computer vision, facial recognition, surveillance footage, and drone cameras can all be aggregated to identify, track, and monitor the movements and behaviors of an entire population.
Experts believe the virus could be a call to action for the industry to demonstrate big data best practices, but customers must be convinced of that, too.
Businesses looking for big data PR should recognize that the public wants to be reassured that their tools are developed and used ethically.
A comprehensive PR strategy can get the right message in front of the right audience that accurately portrays your company’s mission and values.
This is something customers are sure to appreciate, and it delivers a positive return on investment.