Top 10 Data Predictions leading to a zero-latency future in 2019

Just like electricity, we will soon start to see real-time computing become pervasively available and invisible at the same time, writes Madhukar Kumar, Head of Developer Advocacy at Redis Labs

The tech industry regularly sees the rise and fall of hype cycles, from the dot-com era to cloud computing, big data and, more recently, artificial intelligence (AI) and blockchain.

Looking back, it is clear that each of these major changes was additive or in some way built on the disruption that came before it. For example, AI would not be where it is today without big data. Big data would not have been possible without the advent of cloud computing, and the cloud itself would not exist without the world wide web boom of the 90s.

Armed with this hindsight, I believe we are about to make technology’s next major leap due to several forces (i.e. disruptions that have already happened or are currently in play) coming together. In a nutshell, I believe we are headed into a zero-latency future.

Now before you raise an eyebrow, let’s define what that means and then look at all the individual trends I believe will together make this phenomenon a reality.

If a machine (hardware plus software) can respond to humans or other machines in less than a second, it is a zero-latency device or app.

When you talk to Alexa or Google Home today, the devices often respond in less than a second, but I think it could be even faster. Think about autonomous vehicles, facial recognition, smart homes etc., where everything needs to come together to make a decision and act based on hundreds of inputs in a few milliseconds.

Now imagine that kind of computing is everywhere around you. That, in my mind, is a zero-latency future. In this future, any response time over one second will be unacceptable.
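To make that one-second budget concrete, here is a minimal sketch in Python; the respond() function is a hypothetical stand-in for any device or service, and the timing check simply compares one round trip against the threshold.

```python
import time

def respond(request):
    # Hypothetical stand-in for any device or service handling a request.
    time.sleep(0.2)  # simulate 200 ms of processing
    return f"handled: {request}"

start = time.perf_counter()
respond("turn on the lights")
elapsed = time.perf_counter() - start

# By the definition above, anything over one second misses the budget.
print(f"response time: {elapsed * 1000:.0f} ms, zero-latency: {elapsed < 1.0}")
```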

So, what are the trends shaping this future? Let’s first look at these emerging megatrends.

Emerging megatrends

Quantum computing

Earlier this year, Intel announced a major milestone in building quantum computing systems: a test chip with 49 qubits, roughly the scale at which simulating a quantum system on classical computers becomes impractical. IBM and Google made similar announcements.

Although quantum machines probably won't replace classical computers in the next year, IBM has already opened up a playground where people can start experimenting with the technology. These developments will fast-track access to exponentially greater processing power.
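To get a feel for that kind of experimentation, here is a minimal sketch using the open-source Qiskit SDK, my choice of tool rather than anything named above, building the sort of two-qubit entangling circuit you can submit to IBM's public quantum hardware.

```python
# pip install qiskit
from qiskit import QuantumCircuit

# A Bell-state circuit: the simplest example of two entangled qubits.
qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # read both qubits into classical bits

print(qc.draw())             # text diagram of the circuit
```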


5G internet connections

Some providers, like Verizon, have already deployed 5G in a few cities in the United States, and the first 5G networks in Britain are expected to go live in 2019.

5G technology builds on lessons learned from 4G and delivers download and upload speeds of around 1Gbps. That's right: on your way to work, a 5G network will let you download, not merely stream, a full HD movie of roughly a gigabyte in less than 10 seconds.
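The arithmetic behind that claim is simple, and the sketch below works it through (the roughly one-gigabyte file size and the 1Gbps effective throughput are illustrative assumptions, not measurements).

```python
# Back-of-envelope download time: file size divided by effective bandwidth.
file_size_gb = 1.0     # illustrative full HD movie file, in gigabytes
bandwidth_gbps = 1.0   # illustrative effective 5G throughput, in gigabits per second

file_size_gbits = file_size_gb * 8           # gigabytes -> gigabits
seconds = file_size_gbits / bandwidth_gbps   # 8 Gbit / 1 Gbit/s = 8 s

print(f"~{seconds:.0f} seconds to download {file_size_gb} GB at {bandwidth_gbps} Gbps")
```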

Persistent memory

Intel recently announced the launch of Optane DC persistent memory, which looks like standard RAM but can store terabytes and, crucially, retains its data when the power is switched off.

I am hopeful this technology will continue to improve and eventually replace hard drives for most use cases. With this increased capacity, vast amounts of data can be processed in real time and persisted without ever touching a disk.
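The programming model is essentially memory-mapped access to a persistent region. Here is a minimal sketch of that idea, using an ordinary memory-mapped file as a stand-in for a DAX-mounted persistent-memory device; the path is hypothetical and the snippet runs on any machine.

```python
# Byte-addressable persistence via a memory-mapped file. On real persistent
# memory the file would live on a DAX-mounted filesystem (the path below is
# hypothetical); on an ordinary machine the same code runs against a disk file.
import mmap
import os

PATH = "/tmp/pmem_demo.bin"   # stand-in for something like /mnt/pmem/data
SIZE = 4096

# Create the backing file once, then map it into the process address space.
if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(b"\x00" * SIZE)

with open(PATH, "r+b") as f:
    mm = mmap.mmap(f.fileno(), SIZE)
    mm[0:13] = b"hello, pmem!\n"   # write directly through memory
    mm.flush()                     # make the write durable
    mm.close()
```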

Real-time data processing at the edge

Due to the trinity of forces above, a lot more data processing will happen in real time at the edge (i.e. in devices for autonomous cars, smart cities, facial recognition, wearable tech and more).

This phenomenon is often described as edge or fog computing, and it will become increasingly practical as processing gets faster, data stays available in memory all the time and network speeds increase exponentially.

Data processing within compute

In traditional big data implementations, we saw programming logic move to the data (think MapReduce and Hadoop). Now, I expect we'll begin to see the reverse. Data, and more importantly data types, will be pulled into compute for near-zero-latency processing, because any latency from seeking data on a disk will no longer be acceptable.
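One way to picture data types living inside compute is an in-memory store that keeps native structures, such as sorted sets, next to the processing logic rather than as rows on disk. The sketch below assumes a local Redis instance and the redis-py client; it is an illustration of the idea, not a prescription.

```python
# pip install redis
import redis

# Assumes a Redis server running locally on the default port.
r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# A sorted set is a native in-memory data type: scores and ordering are
# maintained by the server, so queries never touch a disk.
r.zadd("leaderboard", {"alice": 420, "bob": 315, "carol": 512})
r.zincrby("leaderboard", 10, "bob")

# Top three players, highest score first.
print(r.zrevrange("leaderboard", 0, 2, withscores=True))
```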

In my mind, the five trends above will put us squarely in the middle of a zero-latency future. That said, there are additional trends to watch out for that will either spur or have a major impact on the way we interact with computers in the future. 


Serverless architectures

Serverless processing of large data sets will move more workloads away from traditional big data stacks to functions orchestrated at scale with Kubernetes-like tools. This means more organisations will be able to process big data by using Function-as-a-Service (FaaS) solutions for better speed and affordability.
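As a sketch of what such a function can look like, here is a minimal Python handler following the handle(req) convention of the OpenFaaS python3 template; the word-count logic is just an illustrative stand-in for one slice of a larger data-processing job.

```python
# handler.py: a minimal FaaS-style function, following the handle(req)
# convention of the OpenFaaS python3 template.
import json
from collections import Counter

def handle(req: str) -> str:
    """Count word frequencies in the request body.

    A stand-in for one small slice of a data-processing job; the platform
    scales instances of this function up and down on Kubernetes as
    requests arrive.
    """
    top = Counter(req.split()).most_common(5)
    return json.dumps({"top_words": top})

# Local smoke test; in production the platform invokes handle() directly.
if __name__ == "__main__":
    print(handle("to be or not to be"))
```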

Multi-cloud

Multi-cloud adoption will make data storage agnostic to cloud platforms and providers. Your data could be stored partially on AWS and partially on Google Cloud or even on edge computers, for example. More and more organisations will use technologies like Kubernetes to break away from single provider lock-in.
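A toy illustration of provider-agnostic storage is writing the same object to two clouds. The sketch below assumes the boto3 and google-cloud-storage client libraries; the bucket names and credentials are hypothetical.

```python
# pip install boto3 google-cloud-storage
import boto3
from google.cloud import storage

def put_everywhere(key: str, data: bytes) -> None:
    """Write one object to two clouds so no single provider holds all the data."""
    # AWS S3 (credentials resolved from the usual environment/config chain).
    boto3.client("s3").put_object(Bucket="my-aws-bucket", Key=key, Body=data)

    # Google Cloud Storage (credentials via GOOGLE_APPLICATION_CREDENTIALS).
    storage.Client().bucket("my-gcp-bucket").blob(key).upload_from_string(data)

put_everywhere("reports/2019-01.json", b'{"status": "ok"}')
```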

Eliminating bias in AI/ML

We will also see companies with massive amounts of consumer data (Google, Facebook, etc.) try to sift out bias from their data sets to make their AI and machine learning (ML) models more accurate and bias-free.

Today, for example, it is easy to argue that a great deal of bias exists in how personal loans have been granted over the last 50 years. If ML algorithms learn from that data, you can be almost certain those biases will persist, and this is one challenge all AI and ML providers will have to overcome.
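To make the loan example concrete, here is a minimal sketch of the kind of check a provider might run before training on historical decisions: comparing approval rates across groups in a tiny, made-up dataset using pandas.

```python
# pip install pandas
import pandas as pd

# Tiny made-up sample of historical loan decisions.
history = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1, 1, 1, 0, 1, 0, 0, 0],
})

# Approval rate per group: a simple demographic-parity style check.
rates = history.groupby("group")["approved"].mean()
print(rates)
print("approval-rate gap:", rates.max() - rates.min())
# A model trained naively on this history would tend to reproduce the gap.
```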


Data privacy

Due to today’s large volumes of data collection and instant processing requirements, data privacy will continue to dominate many data storage and processing decisions.

This year, we saw the introduction of the General Data Protection Regulation (GDPR) in the EU, which had far-reaching consequences for how companies collect and use private data. We will see more traction here, both in how data is collected and processed and in the form of additional government rules and regulations.

Event-driven architectures

Microservices architectures will continue to evolve. For instance, as individual services increasingly need to work alongside monolithic applications, the mesh app and service architecture (MASA) is gaining popularity. This approach uses data services to listen for events and react to them in real time.
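As a minimal sketch of that listen-and-react pattern, the snippet below uses a Redis stream as the event channel (my choice of example, not something MASA prescribes): one service appends an event and a data service reads and reacts to it.

```python
# pip install redis
import redis

# Assumes the same local Redis instance as the earlier sketch.
r = redis.Redis(decode_responses=True)

# Producer side: a service appends an event to a stream.
r.xadd("orders", {"order_id": "42", "action": "created"})

# Consumer side: a data service picks up new events and reacts immediately.
events = r.xread({"orders": "0"}, count=10, block=1000)
for stream_name, entries in events:
    for entry_id, fields in entries:
        print(f"reacting to '{fields['action']}' for order {fields['order_id']}")
```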

The single biggest takeaway from these 2019 predictions should be that we are headed to a zero-latency future. This is an exciting future because finally, just like electricity, we will start to see real-time computing become pervasively available and invisible at the same time.

This will require businesses to rethink how they collect and process their data, all over again. Therein lies some of the biggest challenges and opportunities.
