Companies have been using analysis tools since the 1950s. Although public interest in the topic has grown considerably in recent years, the way data analysis is conducted has changed little in most companies since then: classic analysis systems with reports and dashboards are used, which at their most advanced stage produce operational business intelligence. These systems sit downstream of individual business applications, so each application has its own analyses and conclusions without being linked to other results. In the past this approach may have been sufficient, but today's reality confronts the analytical world with completely new challenges: vast amounts of data are generated, more and more machines are networked, and a large number of processes are computer-controlled. Given these developments, can analytical methods from the 1950s really still keep up? Is it enough to display individual analyses in dashboards or to predict what will happen tomorrow? In fact, we are facing a paradigm shift in analytics, one that companies must embrace to remain competitive in the market. Here we show exactly what is behind it and which stages lie along the way:
Analytics 1.0 - Classic Business Intelligence
The first analysis models were developed in the 1950s. At that time, data volumes were still relatively small, structured, drawn from internal sources, and mostly stored in enterprise data warehouses. The analysis models we summarize under Analytics 1.0 were largely limited to descriptive analyses and reporting on internal activities. The majority of companies still use these (now outdated) analysis tools today, even though the paradigms changed fundamentally with the advent of Big Data in the 2000s.
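To make the contrast with the later stages concrete, here is a minimal sketch of what an Analytics 1.0-style descriptive report can look like: a periodic aggregation over small, structured, internal data in a relational store. An in-memory sqlite3 database stands in for an enterprise data warehouse, and the orders table with its region, month and revenue columns is an illustrative assumption, not a real schema.

```python
# Minimal sketch of Analytics 1.0-style descriptive reporting:
# aggregate structured, internal data in a relational store.
# sqlite3 stands in for an enterprise data warehouse; the table
# and column names are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, region TEXT, month TEXT, revenue REAL);
    INSERT INTO orders VALUES
        (1, 'North', '2023-01', 1200.0),
        (2, 'North', '2023-02',  950.0),
        (3, 'South', '2023-01',  700.0),
        (4, 'South', '2023-02', 1100.0);
""")

# Classic descriptive report: what happened, summarized per region and month.
report = conn.execute("""
    SELECT region, month, SUM(revenue) AS total_revenue, COUNT(*) AS num_orders
    FROM orders
    GROUP BY region, month
    ORDER BY region, month
""").fetchall()

for region, month, total, n in report:
    print(f"{region} {month}: {n} orders, {total:.2f} revenue")
```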
Analytics 2.0 - Era of Big Data
The analytics landscape is now shaped by huge amounts of data. The data comes from a variety of internal and external sources, is unstructured, and is generated in bulk. This requires completely new analysis approaches and systems, summarized under the term Analytics 2.0. To store and process the fast-moving data flow immediately, these systems work with a large number of parallel servers. The analyses are no longer limited to internal business processes but are also applied to customer-facing processes. In addition to descriptive analysis models, the first predictive and prescriptive approaches are developed (prescriptive: "what do I have to do to achieve a certain goal?").
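As an illustration of the parallel-processing idea described above, the following is a minimal Python sketch in which a pool of worker processes stands in for a cluster of parallel servers: the event stream is split into partitions, each partition is summarized independently (the map step), and the partial results are merged (the reduce step). The event format and the partitioning are illustrative assumptions.

```python
# Minimal sketch of the Analytics 2.0 idea of splitting a large,
# unstructured event stream across many workers and merging the
# partial results (map/reduce style). A process pool stands in for
# a cluster of parallel servers; the event format is assumed.
from collections import Counter
from multiprocessing import Pool

def count_events(partition):
    """Map step: count event types within one partition of the stream."""
    counts = Counter()
    for event in partition:
        counts[event["type"]] += 1
    return counts

def merge_counts(partial_counts):
    """Reduce step: merge the per-partition counts into one result."""
    total = Counter()
    for counts in partial_counts:
        total.update(counts)
    return total

if __name__ == "__main__":
    # Four small partitions stand in for shards of a much larger data flow.
    partitions = [
        [{"type": "click"}, {"type": "view"}, {"type": "click"}],
        [{"type": "purchase"}, {"type": "view"}],
        [{"type": "click"}, {"type": "view"}, {"type": "view"}],
        [{"type": "purchase"}, {"type": "click"}],
    ]
    with Pool(processes=4) as pool:
        partial = pool.map(count_events, partitions)
    print(merge_counts(partial))  # Counter({'click': 4, 'view': 4, 'purchase': 2})
```

In production, the process pool would of course be replaced by a distributed framework running on many machines; the sketch only shows the partition-and-merge pattern itself.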
This stage represents a significant investment for most companies: they need specialists, so-called data scientists, who specialize in modelling the data and deriving conclusions from it. However, such specialists are in short supply on the labour market, and companies are therefore looking for a different - technical - solution.
Analytics 3.0 - Analysis at the heart of all corporate activities
Analytics 3.0, the next development stage in data analysis, tries to close this gap. Equipped with the appropriate technologies and tools, Analytics 3.0 systems (see also the Operational Intelligence Platform) can automatically process masses of data into useful knowledge at scale. The 3.0 models combine descriptive, predictive and prescriptive analyses with the ability to evaluate and use big data "on the fly". The knowledge that the systems derive independently from the data is integrated into intelligent front ends and operational processes. This enables fast, meaningful insights into business processes and activities, as well as automated decisions and measures based on them. The vision is to use data enrichment to create real-time, customer-centric and customized offerings in every industry.
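To make this loop more tangible, here is a minimal, hypothetical sketch of how such an "on the fly" pipeline could look: each incoming visitor event is scored by a toy predictive model and immediately translated into a prescriptive action for the front end. The event fields, scoring weights and actions are illustrative assumptions only and do not describe any real Analytics 3.0 system or the odoscope platform.

```python
# Minimal sketch of the Analytics 3.0 loop described above: score each
# incoming event "on the fly" (predictive step) and immediately derive
# an action for the operational front end (prescriptive step).
# All fields, weights and actions below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class VisitorEvent:
    pages_viewed: int
    is_returning: bool
    cart_value: float

def predict_conversion(event: VisitorEvent) -> float:
    """Toy predictive model: a hand-set score capped at 1.0."""
    score = 0.1 * event.pages_viewed + (0.3 if event.is_returning else 0.0)
    score += min(event.cart_value / 500.0, 0.4)
    return min(score, 1.0)

def prescribe_action(probability: float) -> str:
    """Prescriptive step: turn the score into a concrete front-end action."""
    if probability > 0.7:
        return "show_checkout_shortcut"
    if probability > 0.4:
        return "show_personalized_recommendations"
    return "show_default_page"

def handle_stream(events):
    """Operational loop: every event is scored and acted on immediately."""
    for event in events:
        p = predict_conversion(event)
        print(f"p(conversion)={p:.2f} -> {prescribe_action(p)}")

handle_stream([
    VisitorEvent(pages_viewed=1, is_returning=False, cart_value=0.0),
    VisitorEvent(pages_viewed=6, is_returning=True, cart_value=180.0),
])
```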
The first companies are already beginning to implement an Analytics 3.0 strategy, but many systemic challenges remain to be solved and the development of the models is still young. There are, however, promising early approaches - including from odoscope. Read the next article on "Operational Intelligence", the platform for automatic optimization of online offerings based on prescriptive data analysis.
Further literature tips on Analytics 3.0 / Operational Intelligence can be found here.