**London**: A new analysis by McKinsey highlights businesses’ transition towards ‘data ubiquity’ by 2028. It emphasises the necessity of data democratisation, usability, and trust to enhance decision-making and outlines current barriers firms face in integrating data effectively.
A recent analysis by McKinsey predicts that within five years, businesses worldwide will enter a phase of “data ubiquity,” in which data is deeply integrated into every aspect of operations, interactions, and systems. The forecast outlines the transformation companies must undertake to fully harness data in their decision-making.
Over the past few years, McKinsey has observed that progress has lagged behind its predictions, notably the goal that employees across sectors would routinely enhance their work with data and use advanced technologies seamlessly by 2025. Current data suggests a persistent need for leadership on data literacy within the workforce, and that the seamless daily use of data is not advancing at the expected rate.
In 2023, global expenditure on technology remained robust, with 91% of tech decision-makers planning to increase their IT budgets. However, there is a marked shift towards measuring immediate benefits from data management tools and practices. Companies are increasingly questioning whether their investments are driving tangible outcomes aligned with McKinsey’s vision of a data-empowered organisation.
Central to this transition is the concept of “data democratisation,” which emphasises the importance of making relevant information readily accessible to employees across the organisation. According to McKinsey, achieving data democratisation requires a dual focus on usability and trust. Making data user-friendly and reliable will be critical to overcoming current barriers to widespread adoption.
Many firms have fallen into the trap of deploying niche solutions for individual needs, leading to complications such as technology stack overload. This disjointed approach not only causes inefficiencies but also impairs users’ ability to access and utilise data effectively. By 2025, organisations are expected to adopt a comprehensive view of their data management practices, assessing whether they encompass all essential elements for efficient data handling—from data collection to providing analysis-ready datasets.
Furthermore, companies are anticipated to adopt a more rigorous approach when expanding their technology stacks, carefully considering how new solutions will integrate with existing systems and whether they will truly simplify data usage.
Trust in data quality is increasingly paramount as the adoption of artificial intelligence (AI) accelerates. Gartner has projected that one in three AI projects may fail in 2023 due to poor data quality. Consequently, data quality has become one of the two most pressing trends in data and business intelligence, alongside data security and privacy. As businesses assess their data management frameworks, they are expected to implement measures that enhance data accuracy, such as automating data cleansing processes and leveraging machine learning tools for ongoing data integrity assessments.
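The automated cleansing and ongoing integrity checks described above can be sketched as a simple validation gate that screens records before they reach analytics or AI pipelines. This is a minimal illustration only: the field names, rules, and `audit` function below are hypothetical and are not drawn from McKinsey’s or Gartner’s methodology.

```python
# Hypothetical data-quality gate: each rule maps a field name to a
# predicate; records failing any rule are diverted for cleansing
# rather than being passed to downstream decision-making systems.
RULES = {
    "customer_id": lambda v: isinstance(v, str) and v.strip() != "",
    "revenue": lambda v: isinstance(v, (int, float)) and v >= 0,
    "region": lambda v: v in {"EMEA", "APAC", "AMER"},
}

def audit(records):
    """Split records into (clean, rejected); rejects carry failure reasons."""
    clean, rejected = [], []
    for rec in records:
        failures = [field for field, check in RULES.items()
                    if not check(rec.get(field))]
        if failures:
            rejected.append({"record": rec, "failed": failures})
        else:
            clean.append(rec)
    return clean, rejected

# Example: one valid record, one that violates every rule.
sample = [
    {"customer_id": "C-001", "revenue": 1200.0, "region": "EMEA"},
    {"customer_id": "", "revenue": -50, "region": "MOON"},
]
clean, rejected = audit(sample)
print(len(clean), len(rejected))  # → 1 1
```

In practice such rule sets would be far larger and often learned or tuned by machine-learning tools, but the design point stands: rejected records are flagged with explicit reasons, which is what makes cleansing automatable.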
In light of the fast-evolving business landscape, firms recognise that effective decision-making and AI outputs must be underpinned by reliable, up-to-date data. Thus, while the journey towards McKinsey’s “data ubiquity” ideal will differ for each organisation, striving for enhanced usability and accessibility of data has become a critical objective for all companies operating in today’s data-driven environment.
Source: Noah Wire Services



