
4 Business Intelligence Trends for 2020

Being data-driven is no longer an ideal; it is an expectation for anyone who wants to succeed in today’s business landscape. To help you become more data-driven, here are four of the most important BI trends for 2020.


Companies are no longer speculating about data analytics; they are actively looking for the best tool for their specific requirements. As a consequence, BI solutions are evolving and becoming increasingly customized, bringing about other emerging trends.

Also read: Gartner’s top 10 strategic technology trends for 2020.

Let’s dig into the different trends: 

Collaborative Business Intelligence

Collaborative BI is not an entirely new trend, but the growing need for collaboration in the business environment has given it fresh momentum. Defined as a combination of social media, Web 2.0 technologies and online reporting tools, collaborative BI streamlines the sharing of analytics and thus supports joint decision-making.

Decisions are no longer owned by a single person or group; they are shared across multiple departments and, in many cases, coordinated across regions. As a result, reaching common ground has become increasingly important, and this is exactly where collaborative BI tools excel: they enable teams to work towards a conclusion together, exploring data side by side in real time.

Graph Databases

Graph analytics is a set of techniques that shows how entities such as organisations, people and transactions are related to one another. Graph data stores handle all the connections between entities and thus enable users to explore, discover and predict complex interrelationships across data silos more quickly and with more insight.

Business users are asking increasingly complex questions across both structured and unstructured data, often blending data from multiple applications, and increasingly, external data. Analysing this level of data complexity at scale is not practical, or in some cases not possible, using traditional query tools or query languages such as SQL.

Although the need for specialised skills has limited their adoption to date, experts say graph databases are headed for mainstream use. For example, Gartner predicts that the application of graph databases will grow at 100% annually over the next few years to accelerate data preparation and enable more complex and adaptive data science.

Here’s an example of a simple graph data model:

[Figure: a simple graph data model with two Person nodes, Alice and Bob, connected by relationships]

As you can see, this graph contains two nodes (Alice and Bob) that are connected by relationships. Both nodes share the same label, Person. In this example, only Bob’s node has properties, but every node and relationship can have properties.
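
To make that model concrete in code, here is a minimal sketch in Python using the networkx library as a lightweight stand-in for a dedicated graph database. The relationship type and the property values below are illustrative assumptions; only the structure, two Person nodes connected by relationships with properties on Bob, comes from the example.

```python
# Sketch of the Alice/Bob graph from the example above, with networkx
# standing in for a graph database. The relationship type "KNOWS" and
# Bob's property values are hypothetical; only the structure (two Person
# nodes, relationships between them, properties on Bob) is taken from
# the example.
import networkx as nx

graph = nx.MultiDiGraph()  # directed multigraph: allows typed, parallel relationships

# Two nodes sharing the label "Person"; only Bob carries extra properties.
graph.add_node("Alice", label="Person")
graph.add_node("Bob", label="Person", born=1980, city="London")

# Relationships connecting the two nodes, each with its own type and properties.
graph.add_edge("Alice", "Bob", key="KNOWS", since=2015)
graph.add_edge("Bob", "Alice", key="KNOWS", since=2015)

# Traversing relationships is a direct neighbourhood lookup rather than a join.
for source, target, rel_type, props in graph.edges(keys=True, data=True):
    print(f"({source})-[:{rel_type} {props}]->({target})")

print(graph.nodes["Bob"])  # {'label': 'Person', 'born': 1980, 'city': 'London'}
```

A dedicated graph database such as Neo4j would express the same structure through its own query language, but the takeaway is the same: relationships are stored explicitly and can be traversed directly, instead of being reconstructed through joins at query time.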

The real world is interconnected, and graph databases mirror those sometimes consistent, sometimes erratic relationships in an intuitive way, which makes data models easier to interpret. After all, the human brain doesn’t think in terms of tables and rows, but in terms of abstract objects and connections. That’s what makes the graph paradigm different from other database models.

In-Memory Computing

Industry sentiment points clearly towards the adoption of what many consider the holy grail of analytics: In-Memory Computing (IMC). Results from a recent Deloitte survey show that the concept of IMC is well known among IT executives, but productive use is still limited to a few daring innovators.

From a hardware point of view, data analysis involves three components: the processor, which performs the calculations; the storage, which holds the data; and the system that transfers data between the two. The performance bottleneck is storage latency: the processor cannot be used to full capacity because data is not retrieved from hard disks fast enough.

In a nutshell, IMC means moving data that has traditionally been stored on hard disks into main memory. On the hardware side alone, this dramatically reduces latency and accelerates data analysis. For example, tests with Intel’s Nehalem microarchitecture showed that storage latency is reduced from one million CPU cycles to an average of 250 CPU cycles.
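
To illustrate the principle rather than any particular IMC product, here is a small Python sketch: the same aggregation is run by re-reading records from disk on every query, and then against data already held in memory. The file name and record layout are hypothetical; the point is only that the in-memory variant avoids repeated trips to slow storage.

```python
# Illustrative sketch only: contrasts re-reading data from disk on every
# query with keeping it resident in memory, the core idea behind IMC.
# The file name and record format ("amount" column) are hypothetical.
import csv
import time

DATA_FILE = "sales.csv"  # hypothetical CSV with a numeric "amount" column

def total_from_disk() -> float:
    """Re-read the file on every call, paying storage latency each time."""
    with open(DATA_FILE, newline="") as f:
        return sum(float(row["amount"]) for row in csv.DictReader(f))

# Load once into memory, then query as often as needed.
with open(DATA_FILE, newline="") as f:
    in_memory = [float(row["amount"]) for row in csv.DictReader(f)]

def total_in_memory() -> float:
    """Aggregate over data already resident in RAM."""
    return sum(in_memory)

for label, query in (("disk", total_from_disk), ("memory", total_in_memory)):
    start = time.perf_counter()
    for _ in range(100):  # simulate 100 repeated analytical queries
        query()
    print(f"{label:>6}: {time.perf_counter() - start:.3f}s for 100 runs")
```

Real in-memory platforms go much further, adding columnar layouts, compression and distribution across nodes, but the latency argument is the same one described above.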

Data Privacy and Digital Ethics

There is a growing need to extend the definition of corporate responsibility when dealing with data. The question to answer is not only whether something can be done, but whether it should be done in the first place. Shifting the focus from data privacy to digital ethics moves the conversation beyond check-the-box compliance and rule enforcement, towards doing the right thing.

The latest Avanade report, based on responses from 1,200 C-suite, senior IT and business decision-makers in 12 countries, revealed that digital ethics is climbing the boardroom agenda even faster than security. As digital ethics becomes a standing discussion item, more companies are expected to adopt a values-driven approach to data analysis.

Want to read more about technology? Visit our Technology category page here.
