Can the CDO solve this issue while keeping the data pipeline agile and the budget under control?
Data volumes keep growing: the global volume of data reached 64 zettabytes in 2021 and is expected to exceed 180 zettabytes by 2025.
Because of this data explosion, many companies' current data architectures can no longer handle the load. As a result, service for end users degrades and data can no longer be visualized easily.
Three options are available to the CDO to solve this problem:
- Limit usage by BI users
- Increase computing power
- Ask the data engineering team to build a datamart layer (a minimal sketch follows below)
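To make the third option concrete, here is a minimal sketch of what a datamart layer can look like, assuming a pandas-based pipeline; the `sales` fact table, its column names, and the month-by-region grain are all illustrative assumptions, not part of the original scenario.

```python
import pandas as pd

# Hypothetical raw fact table: one row per transaction (names are illustrative).
sales = pd.DataFrame({
    "order_date": pd.to_datetime(
        ["2021-01-03", "2021-01-04", "2021-02-10", "2021-02-11"]
    ),
    "region": ["EMEA", "EMEA", "AMER", "AMER"],
    "revenue": [120.0, 80.0, 200.0, 150.0],
})

# Datamart layer: pre-aggregate the facts to the grain BI users actually
# query (month x region), so dashboards read a small summary table
# instead of scanning the full fact table on every request.
sales_mart = (
    sales
    .assign(month=sales["order_date"].dt.to_period("M").dt.to_timestamp())
    .groupby(["month", "region"], as_index=False)
    .agg(total_revenue=("revenue", "sum"), order_count=("revenue", "size"))
)

# In production this table would be persisted where the BI tool reads it;
# printing it here keeps the sketch self-contained.
print(sales_mart)
```

The trade-off is deliberate: the mart is refreshed on a schedule, so BI queries get fast answers from a small summary table, at the cost of some data freshness and extra pipeline maintenance.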
Is any of these options sustainable?
Explore the CDO's options in under three minutes.