Greg Moody, Executive Director, Life Science Analytics, PerkinElmer Informatics
Pharmaceutical and biotechnology companies are facing a gigantic increase in data, not just in volume but in complexity. The number of clinical trials in the National Institutes of Health’s registry jumped from 5,635 in 2000 to 183,991 in 2015, a more than 30-fold increase. The data landscape has evolved to include not only traditional clinical measures but also translational and outcomes data. These additional data sources can lead to new insights and offer tremendous opportunity in clinical development, but only if the data can be effectively accessed, aggregated, and analyzed.
The transition from collecting data on paper CRFs to electronic modes of data capture created islands of data. Each function in clinical development selects fit-for-purpose tools that meet one specific need but are poorly suited to other use cases. The result is a proliferation of technologies that don’t necessarily work together, so a great deal of time and effort is required to leverage the data effectively. Add the increasingly outsourced and partnered nature of modern clinical development, and companies can easily become overwhelmed by the volume of data and the challenge of working with it quickly and effectively.
We have reached an inflection point, where companies and regulatory bodies have recognized that this way of doing things isn’t sufficient for moving forward. The result is a big push into analytics-driven decision making, which requires new tools and technologies.
Real-Time Data Access and Interaction
In the highly competitive drug development environment, data need to be accessed and analyzed from first patient in (FPI) onward, rather than at discrete time points when the data are in hand, cleaned, and locked down. It is no longer best practice to wait six months for an analysis or to rely solely on periodic report generation. What’s needed is something faster and more dynamic, incorporating data from multiple sources to create a complete picture of what is occurring. Beyond analyzing what has already happened, solutions must support rapid study adaptation and predictive analytics so that risk is minimized and benefit is maximized. A whole new class of tools is evolving in clinical development that enables users to quickly analyze a wide variety of data without needing to be computer scientists or programmers.
Visual analysis of clinical and other data is proving immensely helpful in quickly identifying outliers, trends, and problems throughout a trial. This allows companies to better manage their resources and investments so that they can focus on those programs that are providing patient benefits and reduce efforts toward those programs that are not. As Anscombe illustrated with his quartet, true insight can be found at the intersection of statistics and visualization.
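As a standalone illustration of Anscombe’s point (this sketch is not part of any vendor tooling described here), the quartet’s four data sets can be summarized in a few lines of Python using only the standard library. All four report essentially identical means, correlation, and regression fit, yet when plotted they show a line, a curve, an outlier, and a vertical cluster.

```python
# Anscombe's quartet: four small data sets whose summary statistics are
# nearly identical, but whose scatter plots look completely different.
from statistics import mean

x_common = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
quartet = {
    "I":   (x_common, [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]),
    "II":  (x_common, [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]),
    "III": (x_common, [7.46, 6.77, 12.74, 7.11, 7.81, 8.84, 6.08, 5.39, 8.15, 6.42, 5.73]),
    "IV":  ([8, 8, 8, 8, 8, 8, 8, 19, 8, 8, 8],
            [6.58, 5.76, 7.71, 8.84, 8.47, 7.04, 5.25, 12.50, 5.56, 7.91, 6.89]),
}

def summary(x, y):
    """Mean, Pearson correlation, and least-squares fit, rounded to 2 places."""
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return {
        "mean_x": round(mx, 2),
        "mean_y": round(my, 2),
        "corr": round(sxy / (sxx * syy) ** 0.5, 2),
        "slope": round(slope, 2),
        "intercept": round(my - slope * mx, 2),
    }

for name, (x, y) in quartet.items():
    print(name, summary(x, y))
# Each set reports mean_x 9.0, mean_y 7.5, corr 0.82, and fit y = 3.0 + 0.5x,
# even though the four scatter plots are strikingly different.
```

The statistics alone cannot distinguish the four sets; only a plot reveals the underlying structure, which is precisely the argument for pairing visualization with statistical analysis.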
Does a Solution Exist?
The chief obstacles to adopting new technology solutions are uncertainty and risk aversion. Clinical development is already challenging, and companies do not want to add a layer of change by introducing different processes and new technologies. In the past, companies attempted to build these new technologies and solutions themselves, requiring investments in software development and infrastructure management. This is no longer necessary.
Commercial, off-the-shelf solutions for self-service data discovery, visualization, and analytics do exist and have proven themselves in many organizations. Choosing an effective tool and solution set that works across the development continuum solves key business needs and reduces technical complexity, training, and process change. Harmonizing its use across clinical development benefits all aspects of development, from pharmacovigilance and risk-based monitoring to clinical trial and clinical supply monitoring, data management operations, and project and portfolio management.
PerkinElmer’s data visualization and analysis solution, based on the industry-leading TIBCO Spotfire data visualization platform, allows data to be aggregated and analyzed from a wide array of data sources and integrates with industry-leading analytics solutions, including SAS, R, and MATLAB, thereby building bridges between those traditional islands of data.
Moving beyond traditional reporting and query tools enables companies to put data to work for them, shifting the decision-making paradigm from one where the majority of time is spent preparing data to one where more time can be devoted to acting on the insights it yields.(PV)
PerkinElmer Inc. is a global leader focused on improving the health and safety of people and the environment. The company’s informatics capabilities enable researchers and clinicians to visualize and analyze a wide range of complex data, identify patterns and trends, convert data into actionable information, and collaborate with peers for insights and answers.
For more information, visit perkinelmer.com.