By Dave Litman

 

"Data is a precious thing and will last longer than the systems themselves."

— Tim Berners-Lee, inventor of the World Wide Web

 

When Tim Berners-Lee invented the World Wide Web and posted the very first website back in 1990, he was providing just a small glimpse into the future. The page spoke of giving people around the world access “to a large universe of documents.”

It’s hard to know whether even Berners-Lee himself could have imagined the explosion of data his invention would help unleash. But by 2006, he clearly understood the magnitude of what his little W3 project had introduced to the world when he said: “Data is a precious thing and will last longer than the systems themselves.”

How precious? It may be a cliché by now, but as The Economist declared in 2017, data is the new oil.

Today, the amount of data generated worldwide is growing at a staggering pace. In fact, more than 90% of all the data in existence was created in the last two years alone.

In 2020, the world is creating, capturing and consuming about 40 Zettabytes of data. 1 Zettabyte equals 1 trillion gigabytes. By way of reference, that’s equivalent to roughly five centuries’ worth of continuous tweets from every person on the planet! 

That 40-Zettabyte total will grow to an estimated 175 Zettabytes by 2025. How much is 175 Zettabytes? If you used the largest hard drive available today, it would take over 12.5 billion of them to store all that data. 
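A quick back-of-the-envelope check of that figure (assuming a top-end drive capacity of roughly 14 Terabytes, which is the capacity the 12.5 billion number implies — the exact drive model isn't specified above):

```python
# Back-of-the-envelope check of the storage figures above.
ZETTABYTE = 10**21   # bytes in one zettabyte
TERABYTE = 10**12    # bytes in one terabyte
GIGABYTE = 10**9     # bytes in one gigabyte

total_data = 175 * ZETTABYTE     # projected global datasphere by 2025
drive_capacity = 14 * TERABYTE   # assumed largest single hard drive (~14 TB)

drives_needed = total_data / drive_capacity
print(f"Drives needed: {drives_needed / 1e9:.1f} billion")  # → 12.5 billion

# Sanity check: 1 Zettabyte really is 1 trillion gigabytes
print(ZETTABYTE // GIGABYTE)  # → 1000000000000
```

The arithmetic holds: 175 Zettabytes divided by a ~14 TB drive works out to 12.5 billion drives.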

And if you wanted to store that 175 Zettabytes of data on DVDs, that stack of razor-thin discs would circle around the planet 222 times over. 

More mind-boggling 2020 numbers:

  • Internet users will spend 1.25 billion years online
  • Every person will generate 1.7 Megabytes per second
  • It would take one person over 200 million years to download all the data from the internet

 

Big Data – Big Use Cases

 

So why do we need all that precious data? The fact is, the technologies of today and tomorrow require it. 

Big data serves as the foundational element of artificial intelligence (AI) and its offshoot, machine learning (ML). Large data sets and high-density computing are essential for these technologies to learn independently. AI will provide a number of critical benefits for humans, from automating tedious or arduous tasks to disaster response and improved healthcare. 

The global AI market has been growing at a breakneck pace, along with other data-driven platforms like video and the Internet of Things (IoT). IoT-connected devices alone will create 90 Zettabytes of data by 2025.  

For any of these technologies to reach their full potential, and provide maximum benefits for humanity, we need to continue to feed the beast… with more and more computing power, and more and more data.

 

The Evolution of Data Storage

 

All that being said, can there be any doubt of the urgent need to manage and store that massive growing stockpile of data? 

IDC’s Data Age 2025 whitepaper explains how data is stored in three ways:

  1. The Core: Data Centers (traditional & cloud)
  2. The Edge: Cell towers and branch offices
  3. Endpoints: PCs, smartphones, IoT devices

The paper predicts that by 2024, the amount of data stored in the core will be more than double the amount stored in endpoints, as the core becomes the new data repository of choice. 

But data center infrastructure needs an overhaul to handle it all. An estimated 80% of enterprises will have shut down their traditional data centers by 2025. Static layouts, inefficient cooling, and limited power capacity just won’t cut it anymore.

Galaxy Capital Partners is preparing for the future by building the next generation of data centers: hyperscale, hyper-density, immersion-cooled, and cost-efficient. GCP is on the cutting edge of redefining data center infrastructure, ensuring that the core is robust enough to handle the world’s continually expanding data processing and storage needs. 

With GCP’s plan for the future, the planet’s precious datasphere is in good hands.