
The big question of Big Data

31 July 2019

Companies wishing to take advantage of data face a major challenge: processing more data, more quickly, and at a lower cost.

The potential of data as a driver of economic development draws the attention of many companies. A difficulty arises, however: raw data does not produce any value on its own. To generate additional revenue from this resource, companies must build structures that allow data to be exploited and processed. In doing so, they face several challenges.

First, data exploitation is an expensive process involving three main costs: personnel, software, and servers. Although server prices have fallen significantly in recent years (notably through the development of Amazon's large-scale hardware offerings), engineering costs remain very high, as the supply of talent on the labour market cannot keep up with ever-increasing demand.

Second, implementing these processes is becoming a necessity for companies' future development. A study conducted by Accenture shows that 4 out of 5 executives believe that companies that do not make the shift to Big Data will lose their competitive advantage, and may even disappear [1].

However, despite understanding these issues, few companies are actually investing in analytics. According to a study conducted by Pure Storage, nearly three quarters of companies are unable to master the volume of data they collect [2]. The reason is simple: current data processing and exploitation technologies still have many limitations.

One of them is the "time-to-data" performance of these tools. This concept refers to the time required to move from unusable raw data to the extraction of accurate, complete and relevant information. The higher a project's time-to-data, the more expensive (and thus riskier) the analytical investment will be. Yet Business Intelligence (BI) or Big Data projects may see their time-to-data range from several hours to several months, depending on the variety and quantity of data to be processed.

Therefore, a major question arises for companies: how can they reduce time-to-data to a minimum, and thus make analytical investments less costly, more efficient and more agile?

Download the white paper "The new challenges of Big Data" here.

[1] Accenture, Big Success with Big Data Survey, 2014
[2] Pure Storage, Le Big Échec du Big Data, 2016
