COVID-19 & the future of data discovery, access and integration
The spread of COVID-19 has challenged governments, businesses and global markets at an unimaginable scale as it continues to disrupt mobility, social interaction and traditional ways of collaborating. For financial professionals looking for new ways to thrive and solve problems remotely, data access, discoverability and consumption are more important than ever.
Operating and innovating with data in a COVID-19 world is far more challenging. Getting value out of data is similar to using assembly lines to make cars – but in the office, the data assembly line was never fully automated; it remained a combination of automated and manual processes. As such, many firms have struggled to adapt the data assembly line to the remote work environment.
Complex manual tasks take longer and are more prone to error, especially when they involve multiple participants. These challenges slow not only the extraction of value from data but also the pace of innovation.
To suit this new work environment and the increasingly complicated demands on data, the future of data will require a move towards extreme digitization.
Re-examining the data assembly line
Organizations can experience increased efficiency by identifying and eliminating the areas of manual work that add little or no value. The re-examined data assembly line can be deconstructed into six essential components that are interchangeable and interconnected. These include:
- Content: This represents the most visible part of data and encompasses primary data, derived data, third-party data and any data an organization generates internally.
- Quality: Data quality is paramount when handling hundreds of billions of data points. Organizations need to ensure that the data they consume is of high quality upon acquisition, to avoid inordinately costly manual cleaning later.
- Access: This refers to an organization’s ability to easily obtain data content regardless of its technology strategy and where it chooses to deploy (on premise, cloud, data centre), and to ensure that performance remains consistent regardless of the volume of data consumed.
- Usability: Data comes in various shapes, and developers must write new code each time a new data shape enters the organization, which imposes high costs and delays. If data shapes rigorously followed an industry standard, and machines could understand the data and its associated relationships through metadata, machines could then consume the data with no manual intervention.
- Operational tools: Whether organizations are managing data consumption, manipulating data or looking to perform analytical and quantitative tasks, they require a mix of internal and third-party tools that are not only well-designed, but also integrated to avoid disruptions and inefficiencies in the data assembly line.
- Data services: This includes ingesting and normalizing multiple flavours of data into an intelligible model, ensuring that downstream applications get access to consistent and linked data.
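The Usability and Data services points above can be sketched in code: if each data shape is described declaratively in metadata, one generic loader can normalize any registered feed into a common model, with a quality check on acquisition, and no shape-specific code. This is a minimal illustration, not a real vendor integration; the shape names, field mappings and record formats below are hypothetical.

```python
from datetime import date

# Hypothetical metadata registry: each data "shape" is described
# declaratively (field mapping, type casts, required fields), so the
# generic loader below can consume it without manual intervention.
SHAPES = {
    "vendor_a_prices": {
        "field_map": {"Ticker": "symbol", "PX_LAST": "price", "AsOf": "as_of"},
        "types": {"price": float, "as_of": date.fromisoformat},
        "required": {"symbol", "price", "as_of"},
    },
    "vendor_b_prices": {
        "field_map": {"id": "symbol", "close": "price", "date": "as_of"},
        "types": {"price": float, "as_of": date.fromisoformat},
        "required": {"symbol", "price", "as_of"},
    },
}

def normalize(record, shape_name):
    """Map a raw record into the common model described by its shape metadata."""
    shape = SHAPES[shape_name]
    out = {}
    for src, dst in shape["field_map"].items():
        if src in record:
            caster = shape["types"].get(dst)
            out[dst] = caster(record[src]) if caster else record[src]
    # Quality check on acquisition: reject records before they flow downstream.
    missing = shape["required"] - out.keys()
    if missing:
        raise ValueError(f"quality check failed, missing fields: {missing}")
    return out

# Two differently shaped feeds normalize to the same consistent model:
a = normalize({"Ticker": "ABC", "PX_LAST": "101.5", "AsOf": "2020-06-01"}, "vendor_a_prices")
b = normalize({"id": "ABC", "close": "101.5", "date": "2020-06-01"}, "vendor_b_prices")
assert a == b
```

Onboarding a new feed then means registering its metadata, not writing new parsing code, which is the cost and delay the Usability point describes.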
In a world where remote work is the new normal, and where innovation depends on obtaining large amounts of data, the need for automation has accelerated. These six components, working in harmony, can create a seamless and resilient workflow that functions regardless of location, ultimately generating value immediately.
Firms that are willing to invest in this modern data supply chain will be quicker to react to automation opportunities and leverage innovations in data science to find transformative solutions to problems.
Cloud data is the future
Looking ahead, technology will continue to evolve at a relentless pace. As a growing number of financial organizations are accelerating their movement to the cloud, the accessibility of data in the cloud becomes even more important. To support this, data needs to become invisible, ubiquitous and available for use at the click of a button. We call this cloud data.
Cloud computing removes much of the burden of data centres, servers and their management. Similarly, with cloud data, a machine can abstract away the work of connecting, ingesting, cleaning, managing and loading data and, in one way or another, make these processes “invisible”, ultimately resulting in an explosion in value generation. Cloud data will create considerable value for the financial world.
As instantaneous, native access to digestible data becomes increasingly important, organizations must not only invest in achieving higher levels of automation but also ensure that all processes are seamlessly interconnected, so they can take full advantage of the increased reliance on data and thrive in the new normal.