Top 4 Trends in Data and Analytics – 2020 and beyond
IDC says worldwide data will grow from 40 zettabytes (ZB) in 2019 to 175 ZB by 2025, with as much of the data residing in the cloud as in data centers. The datasphere will have three locations:
Core – traditional and cloud data centers
Edge – cell towers and branch offices
Endpoints – PCs, smartphones, and Internet of Things (IoT) devices
While the adoption of cloud-based data lakes is increasing as organizations look to manage this large-scale data they produce or collect, there are still challenges in processing, analyzing and monetizing it quickly. This article is an attempt to demystify the data and analytics trends that help overcome these challenges to scale and manoeuvre a business. We will focus on four key trends in data and analytics listed by Gartner in 2019. They are:
Augmented Analytics
Continuous Intelligence (CI)
Explainable AI (XAI)
Natural Language Processing (NLP)
Augmented Analytics – Augmented analytics uses artificial intelligence and machine learning (AI/ML) techniques to automate end-to-end data preparation, insight discovery and sharing. It automates data science activities, as well as machine learning model development, management and deployment through MLOps, to increase reproducibility. Actionable insights are generated automatically by advanced AI/ML algorithms. Much of the manual data cleaning needed before analysis goes away, reducing human error and speeding up model deployment and decision making. It also reduces the dependence on analysts and database administrators who painstakingly merge data from various sources to find correlations and derive insights.
Example: If you run an e-commerce business, data from departments like marketing, operations, merchandising, design and customer service can be analysed together to quickly see the effect of marketing campaigns on the booking and delivery of orders. You can also identify the zones with the most customer-returned products for your business.
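To make the idea concrete, here is a minimal sketch of automated insight discovery. All data, department names and column names are hypothetical; the point is that the system joins data from two departments and ranks the metrics most correlated with bookings on its own, with no analyst manually merging tables.

```python
# Toy sketch of augmented-analytics-style insight discovery.
# Hypothetical daily metrics from two departments; the "automation" here is
# the join plus the automatic ranking of drivers of bookings.
import math

marketing = {
    "day1": {"campaign_spend": 100, "emails_sent": 500},
    "day2": {"campaign_spend": 150, "emails_sent": 700},
    "day3": {"campaign_spend": 80,  "emails_sent": 400},
    "day4": {"campaign_spend": 200, "emails_sent": 900},
}
operations = {
    "day1": {"bookings": 40}, "day2": {"bookings": 55},
    "day3": {"bookings": 30}, "day4": {"bookings": 70},
}

# Step 1: automated data preparation -- join the departments on the shared key.
merged = {d: {**marketing[d], **operations[d]} for d in marketing}

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Step 2: automated insight discovery -- rank every metric by how strongly
# it moves with bookings.
days = sorted(merged)
target = [merged[d]["bookings"] for d in days]
insights = sorted(
    (abs(pearson([merged[d][col] for d in days], target)), col)
    for col in ("campaign_spend", "emails_sent")
)
strength, best = insights[-1]
print(f"Strongest driver of bookings: {best} (|r| = {strength:.2f})")
```

A real augmented analytics product would of course search far more columns, handle messy data and explain the finding in natural language, but the loop is the same: join, scan, rank, surface.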
Continuous Intelligence (CI) – Gartner predicts that by 2022, more than half of major new business systems will exploit continuous intelligence capabilities in some way. CI systems enable frictionless augmented analytics designed to inform human decisions with the most accurate data possible. Continuous intelligence uses real-time data, automated or semi-automated processes, and machine-driven AI to continuously interpret data, discover patterns and learn what is of value in the data. It is not a quarterly exercise for getting back on track or adjusting strategic direction; it is an inherent part of how a business runs every minute of every day. CI allows business users to mash up and blend disparate data intelligently, with the objective of constantly discovering new insights and revealing them as a data story with complete context. It reduces human bias at each step of the data pipeline, replacing it with machine intelligence that can surface patterns in your data, however complex.
Example: We will continue with the same e-commerce scenario, where data from marketing, operations, merchandising, design and customer service was integrated on a single platform. We already knew the effect of marketing campaigns on the booking and delivery of orders, and the zones with the most returned products. With continuous intelligence we know this in real time, so we can quickly optimise campaigns for maximum performance. We can also look for the underlying reasons customers are returning products, such as faulty items, and pause sales to reduce losses. Teams are empowered to make decisions in real time: we could pause the marketing campaigns for the faulty products and zero out their inventory until they are investigated or replaced.
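The "pause the campaign the moment returns spike" decision above can be sketched as a tiny streaming rule. The event stream, window size and threshold are all invented for illustration; the point is that the decision fires per event, not per quarterly review.

```python
# Toy sketch of a continuous-intelligence-style rule: watch order outcomes
# as they arrive and automatically pause a campaign when the return rate
# over a sliding window of recent orders crosses a threshold.
from collections import deque

class CampaignMonitor:
    def __init__(self, window=5, max_return_rate=0.4):
        self.window = deque(maxlen=window)     # keeps only recent outcomes
        self.max_return_rate = max_return_rate
        self.paused = False

    def on_event(self, returned: bool) -> bool:
        """Feed one order outcome; pause once the recent return rate is too high."""
        self.window.append(returned)
        rate = sum(self.window) / len(self.window)
        if len(self.window) == self.window.maxlen and rate >= self.max_return_rate:
            self.paused = True
        return self.paused

monitor = CampaignMonitor(window=5, max_return_rate=0.4)
stream = [False, False, True, False, True, True]  # True = product was returned
for event in stream:
    if monitor.on_event(event):
        print("Campaign paused: return rate too high in recent orders")
        break
```

A production CI system would sit on a real event stream (orders, returns, campaign spend) and feed many such rules plus learned models, but the shape of the decision loop is this one.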
Explainable AI (XAI) – AI is finding its way into a broad range of industries such as education, construction, healthcare, manufacturing, law enforcement and finance. The decisions and predictions made by AI-enabled systems are becoming much more profound and, in many cases, critical to life, death and personal wellness. This is especially true for AI systems used in healthcare, driverless cars or even drones deployed during war. However, most of us have little visibility into how AI systems make the decisions they do and, as a result, how the results are being applied in these fields.
Many AI/ML algorithms are not easy to examine to understand specifically how and why a decision was made. This is especially true of the most popular algorithms currently in use, namely deep learning neural networks. To trust and control an AI system, we must be able to understand how its decisions are made; the lack of explainability undermines that trust. We want computer systems that work as expected, produce transparent explanations of their outcomes and give reasons for the decisions they make. In a sentence: the models and their outcomes should be easy to interpret without ambiguity. This is known as Explainable AI.
Example: We will continue with the same e-commerce scenario, where data from marketing, operations, merchandising, design and customer service was integrated on a single platform. If the system tells us in real time that a marketing campaign needs to be optimised, we also need to see the performance of all the campaigns and the metrics that drive that performance. We can then see, for instance, that the customer acquisition cost of the poorly performing campaign was the highest, hence the flag for optimisation.
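One common explainability technique is permutation importance: scramble one input column at a time and see how much the model's error grows. Below is a minimal sketch of that idea on a made-up campaign dataset with a made-up stand-in model; none of this is a real trained system.

```python
# Toy sketch of permutation importance as an XAI technique.
# Hypothetical campaign features: (acquisition_cost, impressions) -> conversions.
data = [
    ((5.0, 1000), 80), ((9.0, 1100), 45), ((6.0, 900), 70),
    ((12.0, 1050), 30), ((7.0, 950), 60), ((11.0, 1000), 35),
]

def model(cost, impressions):
    # Stand-in for a trained model: conversions fall as acquisition cost rises.
    return 110 - 7 * cost + 0.01 * impressions

def error(rows):
    """Mean absolute error of the model on the given rows."""
    return sum(abs(model(c, i) - y) for (c, i), y in rows) / len(rows)

baseline = error(data)
importance = {}
for idx, name in enumerate(["acquisition_cost", "impressions"]):
    # Permute one feature column (a simple rotation keeps this deterministic)
    # and measure how much worse the predictions get.
    col = [row[0][idx] for row in data]
    col = col[1:] + col[:1]
    permuted = []
    for ((c, i), y), v in zip(data, col):
        feats = [c, i]
        feats[idx] = v
        permuted.append((tuple(feats), y))
    importance[name] = error(permuted) - baseline

top = max(importance, key=importance.get)
print(f"Prediction is driven mainly by: {top}")
```

Scrambling acquisition cost wrecks the predictions while scrambling impressions barely matters, so the explanation to the business user is concrete: the campaign was flagged because of its acquisition cost, not its reach.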
Natural Language Processing (NLP) – According to Wikipedia, NLP is a subfield of linguistics, computer science, information engineering and artificial intelligence concerned with the interactions between computers and human language. Over the past decade there has been a significant increase in the adoption of NLP for all kinds of unstructured data analysis, including pure text analytics, document classification and interaction between humans and communication devices, and it will continue to grow. NLP techniques also give business users an easier way to ask questions about data and to receive explanations of the insights. The need to analyze complex combinations of structured and unstructured data, and to make intelligence accessible to everyone in the organization, will drive this growth. Conversing with tools will become as easy as talking to a human, just as we do with Siri, Cortana et al., leveraging speech-to-text analytics to get answers from the data beyond predefined rule-based algorithms.
Example: Imagine talking to our software for the answers we are seeking: zones with the most customer-returned products, reasons, solutions, options, details of marketing campaigns; the list goes on. All the answers while conversing with the software instead of viewing dashboards!
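As a deliberately tiny illustration of the idea, here is a keyword-matching question answerer over made-up zone data. Real natural-language interfaces use trained language models rather than keyword rules; this sketch only shows the shape of asking a question in plain English and getting an answer straight from the data.

```python
# Toy sketch of natural-language querying over business data.
# All figures are hypothetical.
returns_by_zone = {"North": 120, "South": 45, "East": 210, "West": 80}
bookings_by_zone = {"North": 900, "South": 700, "East": 650, "West": 820}

def ask(question: str) -> str:
    """Map a plain-English question to a lookup over the data."""
    q = question.lower()
    if "return" in q and "highest" in q:
        zone = max(returns_by_zone, key=returns_by_zone.get)
        return f"{zone} has the most returned products ({returns_by_zone[zone]})."
    if "booking" in q and ("most" in q or "highest" in q):
        zone = max(bookings_by_zone, key=bookings_by_zone.get)
        return f"{zone} has the most bookings ({bookings_by_zone[zone]})."
    return "Sorry, I don't understand that question yet."

print(ask("Which zone has the highest customer returns?"))
print(ask("Where do we get the most bookings?"))
```

Swap the keyword rules for a language model and the dictionaries for a live data platform, and you have the conversational analytics experience the trend describes.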
Of course, there are other trends that will shape the future of data and analytics. We are just covering the tip of the iceberg here!