Data is an asset that businesses and industries cannot ignore. Used properly, data gives its users a competitive advantage; without a sound approach, it may fall short of its full potential. To help ensure success, organizations should start with the critical first step of identifying what questions need to be answered and what problems need to be solved, then quickly mobilize to identify what data is needed to answer those questions. Data needed for strategic purposes almost always comes from different sources, in different formats, and in varying quality, which can complicate utilization, especially for machine learning applications. As data and analytics evolve, so too do data technology platforms - and keeping up is critical.
The volume of healthcare data is exploding. According to IDC's whitepaper, The Digitization of the World – From Edge to Core, the healthcare sector is primed to grow its data faster than any other industry, surpassing other sectors by 2025. Genomics, newer diagnostics, more exact imaging technology, telehealth, consumer apps and wearable devices, medical records… there is no shortage of healthcare data. The problem is that the complexity and lack of standardization across these myriad sources make it difficult to convert raw data into actionable insights. Data technology platforms help manage and transform all that messy data into insights efficiently. A data technology platform manages data end-to-end, including the important functions of ingesting, cleaning, and processing data in various formats. More mature platforms also provide the data science tools needed to transform data into insights, as well as other important functions such as de-identification of protected health information (PHI/PII). Today's users of data and analytics find themselves at different stages of what is called the Data Platform Maturity Model.
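As a rough illustration of the de-identification step mentioned above, the sketch below replaces direct identifiers with salted one-way hash tokens while preserving clinical fields. This is a minimal sketch with invented field names and values; real-world de-identification must satisfy standards such as HIPAA Safe Harbor or expert determination, which involve far more than hashing.

```python
import hashlib

# Hypothetical raw record; field names and values are invented for illustration.
record = {"patient_id": "MRN-004211", "name": "Jane Doe", "dx_code": "E11.9"}

def deidentify(rec, salt="example-salt", id_fields=("patient_id", "name")):
    """Replace direct identifiers with salted one-way hash tokens.

    The same input always yields the same token, so de-identified
    records from different sources can still be linked to each other.
    """
    out = dict(rec)
    for field in id_fields:
        digest = hashlib.sha256((salt + str(out[field])).encode()).hexdigest()
        out[field] = digest[:16]  # truncated token stands in for the identifier
    return out

clean = deidentify(record)
```

Because the tokens are deterministic for a given salt, downstream analytics can still count and link patients without ever seeing the underlying identifiers.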
The first stage of the Maturity Model is the ability to store an organization's data in one location and make that data available to as many users as possible. This may include proprietary information as well as purchased data investments. It is a step up from storing data in silos, isolated from other data and available only to a subset of users. Having all your data in one place does not necessarily mean you can extract additional, unique value from it, but it is the necessary first step to increasing access to and use of the data. This stage has driven the explosive growth of cloud-based storage. Many organizations stop here, and that is a big problem: data from different sources may overlap, carry persistent quality issues, or remain in too raw a format for business users, causing duplication, allowing inconsistency, and ultimately rendering large portions of the data unusable.
Once data from diverse sources and in different formats, such as medical and prescription claims, wearables, electronic health records, imaging, and genomics, has become available to a wider audience of users, those users climb the Maturity Model by gaining the technology needed to clean, standardize, and combine data, producing congruous formats and consistent quality. Typically, as organizations begin to use data, they realize that extracting better value requires not only cleaning and standardizing the data but also combining and linking it. An efficient way to store, link, and access data is an invaluable tool because it is what makes the data usable at scale. As organizations reach this level, they need data platforms that can accommodate substantial amounts of data and automate the routine functions needed to extract value. As data users become more sophisticated, so do the demands placed on the platform. Moving from fixed, on-premises infrastructure to scalable, cloud-based technology with automated data pipelines is where organizations can exponentially increase their capabilities.
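To make the clean-standardize-link idea concrete, here is a minimal Python sketch, using invented sample rows and field names, that coerces two hypothetical feeds (claims and EHR) into a shared schema and joins them on a (patient, date) key. A production pipeline would handle many more layouts, quality checks, and volumes.

```python
from datetime import datetime

# Invented sample rows; real feeds arrive in many layouts and quality levels.
claims = [{"member": "A1", "svc_date": "03/15/2024", "ndc": "00071-0155-23"}]
ehr = [{"patient": "A1", "visit": "2024-03-15", "dx": "I10"}]

def standardize_date(value):
    """Coerce common date layouts into ISO 8601; return None if unrecognized."""
    for fmt in ("%m/%d/%Y", "%Y-%m-%d"):
        try:
            return datetime.strptime(value, fmt).date().isoformat()
        except ValueError:
            continue
    return None

# Map each cleaned source into a shared schema keyed on (patient, date),
# so records describing the same encounter land in the same linked row.
linked = {}
for row in claims:
    key = (row["member"], standardize_date(row["svc_date"]))
    linked.setdefault(key, {})["ndc"] = row["ndc"]
for row in ehr:
    key = (row["patient"], standardize_date(row["visit"]))
    linked.setdefault(key, {})["dx"] = row["dx"]
```

After this pass, a single key holds both the prescription and the diagnosis for the same encounter, which is the kind of joined, consistent view that automated pipelines produce at scale.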
Approaching the pinnacle of the Data Platform Maturity Model means delivering advanced mechanisms such as packaged data pipelines that automatically standardize and integrate data to produce usable answers. Kythera's Wayfinder platform, for example, accomplishes this with machine learning-driven packaged data and use case-specific analytics packages. These link data sets to mastered directories of known entities, such as practitioners, facilities, patients, payers, and products, ensuring better accuracy and greater completeness, and enabling any data sets and additional directories to be cleanly joined together for integrated use. For healthcare providers and life sciences manufacturers, this means data from electronic health records, claims, direct consumer data, or any other type of transaction, event, or entity can be linked and turned into a unified view through automated, syndicated processes that ensure quality and speed.
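The entity-mastering step can be pictured as matching raw strings against a mastered directory. The sketch below uses Python's standard-library difflib as a simple stand-in for the machine learning-driven matching such platforms perform; the directory, IDs, and names are hypothetical, and this is not a description of Wayfinder's actual methods.

```python
import difflib

# Hypothetical mastered practitioner directory (IDs and names are invented).
directory = {"D001": "Smith, John MD", "D002": "Nguyen, Thanh DO"}

def match_to_master(raw_name, master, cutoff=0.6):
    """Link a raw provider string to the closest mastered entity ID, or None."""
    hit = difflib.get_close_matches(raw_name, list(master.values()), n=1, cutoff=cutoff)
    if not hit:
        return None
    # Return the directory ID whose mastered name was the best match.
    return next(mid for mid, name in master.items() if name == hit[0])
```

Once every source row carries a mastered ID such as D001, data sets can be joined on that ID rather than on inconsistent raw strings, which is what lets additional directories and data sets be cleanly joined for integrated use.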
The promise of big data, machine learning, and other information technologies requires high-quality, interoperable data processed at scale and made available quickly to a wide audience. Data science platforms can help organizations access everything their data investments have to offer.
There is real value to be gained by linking anonymized data: insights such as detailed longitudinal market trends and precise patient journeys. But the process can be challenging, and proper techniques are critical.
Purchasing data can be complicated, with some known pitfalls, including a lack of visibility into the data itself. But we're here with a few tips to make the process easier and more successful.