Data is generated and collected in almost every industry in unimaginable quantities. According to an IDC report, the amount of data worldwide will grow from 33 zettabytes in 2018 to 175 zettabytes in 2025. For perspective: 175 zettabytes equal 175 trillion gigabytes, i.e. 175 followed by twelve zeros!

However, even large companies with big budgets and big teams currently face many unresolved challenges in processing, evaluating and making efficient use of these masses of data. Gartner's 2019 trends point to promising answers to these challenges. The US research and advisory firm Gartner is known for in-depth studies such as the "Magic Quadrant" and technology forecasts such as its "Hype Cycles".

The following 10 Data & Analytics Trends will change your business over the next three to five years. Specialists and managers in these areas or related industries (e.g. e-commerce, marketing, IT or consulting) should therefore examine these trends for their disruption potential within their own market environment or field of activity. Better safe than sorry!




1. Augmented Analytics

Data analysis is complex and generally requires one or more data scientists who can extract value from large amounts of data. The complexity is mostly due to the fact that data is collected from a number of different sources such as web analytics, ERP, PIM, marketing software or social media.

Due to the high manual effort involved in preparing, cleansing and merging data, data scientists spend most of their time on such tasks – estimates go as high as 80%! Augmented analytics uses machine learning to reduce this workload, so data scientists can invest more of their time in the search for actionable insights.
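To make this concrete, here is a minimal sketch of the kind of preparation work augmented analytics aims to automate, assuming pandas and scikit-learn and two hypothetical sources (a web analytics extract and an ERP extract):

```python
# Minimal sketch: merge two hypothetical sources and impute missing values
# automatically instead of fixing them by hand. Column names are invented.
import pandas as pd
from sklearn.impute import SimpleImputer

web_analytics = pd.DataFrame({"customer_id": [1, 2, 3], "sessions": [5, None, 2]})
erp = pd.DataFrame({"customer_id": [1, 2, 3], "revenue": [120.0, 80.0, None]})

# Combine the sources on a shared key.
merged = web_analytics.merge(erp, on="customer_id", how="outer")

# Fill missing numeric values with the column median automatically.
numeric_cols = ["sessions", "revenue"]
merged[numeric_cols] = SimpleImputer(strategy="median").fit_transform(merged[numeric_cols])
print(merged)
```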

By 2020, augmented analytics will be a dominant driver of purchasing decisions for analytics and business intelligence platforms as well as data science and machine learning platforms. In an expert interview, data consultant Marc Preusche also described augmented analytics as one of the key trends in customer experience optimization.

2. Augmented Data Management

Augmented data management can also significantly reduce the manual effort described above - especially when cleansing and merging large amounts of data from different sources. According to Gartner, manual data management tasks could be reduced by 45% by 2022 through machine learning and automated workflows. This support in data preparation is called augmented data management.
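As a small illustration (not a specific product feature), such automated data management could start with simple, machine-executed quality checks like the following sketch, which uses pandas and made-up customer data:

```python
# Sketch of automated data-quality checks: flag duplicates and implausible
# values so they do not have to be hunted down manually. Data is invented.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 2, 3, 4],
    "age": [34, 29, 29, -5, 120],
})

# Records that share a key they should not share.
duplicates = customers[customers.duplicated(subset="customer_id", keep=False)]

# Values outside a plausible range.
implausible_age = customers[(customers["age"] < 0) | (customers["age"] > 110)]

print("duplicate customer_ids:\n", duplicates)
print("implausible ages:\n", implausible_age)
```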


3. Continuous Intelligence

By 2022, more than half of all major new business systems will have continuous intelligence. This means using real-time context data to improve decisions. Continuous intelligence therefore combines raw data and analysis with transactional business processes and other real-time interactions. Methods such as event stream processing (a method for real-time analysis), business rule management (rule-based decision-making systems) and, of course, machine learning are used. Continuous Intelligence can also be described as an evolution of operational intelligence.
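A minimal sketch of this idea, assuming a simulated event stream and an invented business rule (field names and thresholds are hypothetical; a trained ML model could be plugged in at the same decision point):

```python
# Toy continuous-intelligence loop: consume events as they arrive and attach
# a decision to each one in real time. Fields and thresholds are made up.
import random
import time
from dataclasses import dataclass

@dataclass
class OrderEvent:
    customer_id: int
    basket_value: float
    returns_last_90d: int

def event_stream(n: int = 5):
    """Simulate a real-time stream of transactional events."""
    for _ in range(n):
        yield OrderEvent(
            customer_id=random.randint(1, 1000),
            basket_value=round(random.uniform(10, 500), 2),
            returns_last_90d=random.randint(0, 5),
        )
        time.sleep(0.1)  # stand-in for waiting on the next real event

def business_rule(event: OrderEvent) -> bool:
    """Rule-based decision: flag high-value baskets from frequent returners."""
    return event.basket_value > 300 and event.returns_last_90d >= 3

for event in event_stream():
    decision = "manual review" if business_rule(event) else "auto-approve"
    print(f"customer {event.customer_id}: {decision}")
```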

4. Explainable AI

Artificial intelligence (AI) is on everyone's lips. But opaque AI applications can also cause damage. Therefore, by 2023, over 75% of large companies will have hired their own artificial intelligence specialists in areas such as IT forensics and data protection to reduce brand and reputation risks for their business.

With the help of augmented analytics, automatically generated insights and models are increasingly being used. However, the explainability of these insights and models (e.g. how they were derived) is crucial for trust, compliance with legal requirements and the management of brand reputation. After all, decisions made by algorithms that cannot be explained hardly inspire enthusiasm in most people. In addition, some AI applications are said to reinforce biases they "learn" from training data.

Explainable AI describes models whose strengths and weaknesses can be identified, whose probable behavior can be predicted, and whose potential biases can be uncovered. Explainable AI thus makes the decisions of a descriptive, predictive or prescriptive model more transparent, helping to ensure important qualities such as the accuracy, fairness, stability and transparency of algorithmic decision-making.
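One common explainability technique is permutation importance, which measures how much a model's accuracy drops when a single feature is shuffled. The following sketch, assuming scikit-learn and synthetic data, illustrates the principle; it is only one of many possible approaches:

```python
# Sketch of one explainability technique: permutation importance on a
# random forest trained on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each feature and measure how much the score drops:
# large drops indicate features the model actually relies on.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for i, importance in enumerate(result.importances_mean):
    print(f"feature_{i}: {importance:.3f}")
```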


5. Graph Analytics

The use of graph processing and graph databases will grow by 100% annually through 2022. This continuously accelerates data preparation and enables more complex data science tasks to be performed faster.

Graph analytics describes a set of analytical techniques that allow you to explore relationships between companies/organisations, individuals and transactions. A special application of graph analytics are graph-enabled semantic knowledge graphs (a prominent example is the Google Knowledge Graph). These form the basis of many data structures for natural language processing / conversational analytics and greatly enrich many data processes.
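A small sketch of what such relationship queries can look like, assuming the networkx library (our choice for illustration) and invented people, organisations and transactions:

```python
# Toy graph of people, organisations and transactions, then two typical
# graph-analytics questions: connecting paths and centrality.
import networkx as nx

G = nx.Graph()
G.add_edge("Alice", "Acme GmbH", relation="works_at")
G.add_edge("Bob", "Acme GmbH", relation="works_at")
G.add_edge("Alice", "TX-1001", relation="initiated")
G.add_edge("Bob", "TX-1001", relation="approved")
G.add_edge("Bob", "TX-2002", relation="approved")
G.add_edge("Carol", "TX-2002", relation="initiated")
G.add_edge("Carol", "Beta AG", relation="works_at")

# Which chain of relationships connects Alice and Carol?
print(nx.shortest_path(G, "Alice", "Carol"))

# Degree centrality: which entity sits in the most relationships?
print(sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1])[:3])
```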

6. Natural Language Processing / Conversational Analytics

Natural Language Processing (NLP) is a branch of linguistics and computer science dedicated to the interaction between computers and human (natural) language. Companies are currently particularly concerned with the question of how to program computers to process and analyze large amounts of natural language data. This applies to search engines, voice commerce and voice assistants as well as to analytics applications.

According to Gartner, by 2020, 50% of analytics queries will already be generated via search, voice input (NLP) or automatically. The consumer trend of voice control in the car, on the smartphone, via smart speakers and more is increasingly finding its way into B2B analytics applications.

By 2021, natural language processing will increase the adoption of analytics and business intelligence software from 35% of employees to over 50%. This also makes analytics usable for new user groups such as managers, sales representatives or creatives. Providers such as Tableau already offer this kind of NLP functionality today.
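To illustrate the basic idea of conversational analytics, here is a deliberately naive sketch that maps a natural-language question to a pandas aggregation via simple keyword matching; real products such as Tableau use far more sophisticated NLP pipelines, and the data here is invented:

```python
# Toy conversational analytics: pick a grouping column from the question
# and run the corresponding aggregation. Purely illustrative keyword matching.
import pandas as pd

sales = pd.DataFrame({
    "month": ["Jan", "Jan", "Feb", "Feb"],
    "channel": ["web", "store", "web", "store"],
    "revenue": [1200, 800, 1500, 950],
})

def answer(question: str) -> pd.Series:
    """Very rough intent detection: group by the column mentioned in the question."""
    group_col = "channel" if "channel" in question.lower() else "month"
    return sales.groupby(group_col)["revenue"].sum()

print(answer("What is the revenue by month?"))
print(answer("Show me revenue per channel"))
```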

7. Commercial AI & ML

Many popular AI and ML software frameworks are currently still open source (e.g. TensorFlow from Google or Caffe from Berkeley AI Research). By 2022, however, 75% of new end-user software (e.g. apps and websites) using AI and ML techniques will be commercial rather than open source.

Gartner currently predicts that by 2022, commercial cloud-based services from major vendors (notably Amazon, Google and Microsoft) will reach a tipping point of 20% market share in the data science platform market. These large tech groups recognized the potential of data science long ago and are therefore working on commercializing their self-developed frameworks.

8. Blockchain in Analytics

The promise of blockchain is gigantic: it holds cryptographically protected data that cannot be altered. The data can only be shared and extended by a network of participants, so that everyone always has the latest information at the same time. Of course, this involves enormous complexity. Nevertheless, blockchain has for some time been considered one of the technologies that will revolutionize commerce.
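Why such a chain is tamper-evident can be shown with a toy sketch: each block stores the hash of its predecessor, so changing any earlier record breaks the chain. This is purely illustrative and far from a production ledger:

```python
# Minimal hash chain: changing any historical record invalidates the chain.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: dict) -> None:
    prev_hash = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev_hash})

def is_valid(chain: list) -> bool:
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

chain = []
add_block(chain, {"tx": "A pays B 10"})
add_block(chain, {"tx": "B pays C 4"})
print(is_valid(chain))                     # True

chain[0]["data"]["tx"] = "A pays B 1000"   # tamper with history
print(is_valid(chain))                     # False: the hashes no longer match
```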

Analytics use cases include, for example, fraud analysis, auditing processes or data sharing between different organizations. But given the so far unconvincing blockchain-based applications, many experts label this trend as hype. We will keep watching how it develops in the long term.

But now, without further ado, on to the last two trends - watch out: nerd alert!


9. Persistent Memory Servers

By 2021, persistent memory (i.e. non-volatile memory, as opposed to volatile memory such as a cache) will account for over 10% of in-memory computing memory usage. Most database management systems (DBMS) today use in-memory database structures, both row-based and column-based (e.g. Hadoop). In-memory computing makes it possible to significantly reduce computation times for large amounts of data. In the medium to long term, this hardware should make it possible to react in real time to billions of records. Traditional systems work with hard disk storage, where data access takes much longer.

In most cases, memory is currently still limited, so not all data can be held in in-memory storage. Either the DBMS software decides what is kept in memory, or it is user-defined (e.g. by the system administrator). Servers with large in-memory capacity of up to 64 terabytes already exist, but they require multiple processors and are still very expensive. It would be desirable for this hardware to become cheaper in the near future so that more companies could use it.
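A rough way to get a feel for the benefit of keeping data in memory is to compare re-reading a dataset from disk on every query with aggregating it while it is already resident in RAM. The sketch below uses pandas with an arbitrary file name; absolute timings depend entirely on the machine:

```python
# Rough comparison: repeated disk reads vs. data already held in memory.
import time
import numpy as np
import pandas as pd

df = pd.DataFrame({"key": np.random.randint(0, 100, 1_000_000),
                   "value": np.random.rand(1_000_000)})
df.to_csv("events.csv", index=False)  # arbitrary file name for the demo

start = time.perf_counter()
for _ in range(3):
    pd.read_csv("events.csv").groupby("key")["value"].sum()  # disk-bound path
print(f"from disk: {time.perf_counter() - start:.2f}s")

start = time.perf_counter()
for _ in range(3):
    df.groupby("key")["value"].sum()  # data already resident in memory
print(f"in memory: {time.perf_counter() - start:.2f}s")
```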

10. Data Fabric Designs

The value of analytics investments also depends on an agile and trusted data structure. A data fabric is typically a custom design that provides reusable data services, pipelines, semantic layers, or APIs by combining data integration approaches.
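As a minimal sketch of this idea, a data fabric can be thought of as a registry of reusable, named data services that hides how each source is integrated behind one access layer; the source names and loaders below are hypothetical:

```python
# Sketch of a tiny "data fabric" access layer: reusable, named data services
# behind a single API. Sources and loaders are invented for illustration.
import pandas as pd

class DataFabric:
    """Registry of reusable data services (loaders) exposed through one API."""
    def __init__(self):
        self._services = {}

    def register(self, name: str, loader):
        self._services[name] = loader

    def get(self, name: str) -> pd.DataFrame:
        return self._services[name]()

fabric = DataFabric()
fabric.register("orders", lambda: pd.DataFrame({"order_id": [1, 2], "value": [99.0, 45.5]}))
fabric.register("web_sessions", lambda: pd.DataFrame({"session_id": ["a", "b"], "pages": [5, 2]}))

# Consumers request data by name instead of wiring up each source themselves.
print(fabric.get("orders"))
print(fabric.get("web_sessions"))
```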

Data fabric design is another important infrastructure trend, because the data architecture is highly individual and differs depending on the industry, business model and complexity of the data. At the moment, though, this will probably only excite hardcore analytics nerds.

By 2022, tailored data fabric designs will be used primarily as infrastructure, probably especially in analytics platforms such as Webtrekk or AT Internet.
