Data integrity (DI) assessments are frequently commissioned and completed at every level of our organizations—for equipment, for systems, for processes at the global level, at the local level, for some aspect in the quality control labs, for specific scenarios in a GMP production area. The DI assessment list goes on and on (and on). So, too, do the CAPAs for these assessments go on—a CAPA for this process, a CAPA for that system, a CAPA for local compliance to the global CAPA and so on (and on).
Does this sound familiar to you? Do you feel like you and your organization are just spinning in a sea of assessments? Have you been asked to author, review or approve yet another DI assessment? Do you worry about the pile-up of actions or the numerous CAPAs that will come from it? Who is tracking all the assessments and aligning the actions?
At first, the assessments seem right and logical. In practice, however, this approach can make DI compliance seem slow, elusive or even prohibitive. This can become evident during a management review or regulatory inspection when, despite all the assessments, DI issues are identified, or worse, a critical data integrity breach is discovered. Where is the disconnect between the DI assessments and the DI lapses that are found on the shop floor or on the laboratory bench? Were there signals? And, if so, how did we miss them?
For starters, many system assessments are performed in the ideal state, not the practical state of everyday use. Systems and processes don’t operate independently; even manual data must travel within and across processes to be used to render decisions. There are upstream and downstream dependencies, interactions and influences that affect our data. These elements are often missed during our traditional DI assessments and their ensuing CAPAs. As a result, our control strategies may only address “static” issues, when they should reflect and control all the dynamic, influential elements that impact our data.
Go to the Data!
The next time a DI assessment is suggested by your organization, consider a different, broader approach: armed with well-established regulatory requirements and, most likely, your own company’s internal requirements, review the actual data! From schedules and notebooks to audit trails and databases, what does the data look like? What story is the data telling? Walk down the data process. From the perspective of the data, do the established controls facilitate and assure DI compliance, or do they hinder it? Observe and converse with the data generators, data reviewers and data users. Do they understand their roles and responsibilities for the data? Recognize and challenge signals and established ways of working that might impact the data, and ask:
- Are the procedural controls too cumbersome for a high-volume laboratory?
- Is the total data process manual, electronic or a little of both?
- Are GDP issues accepted or are they escalated and tracked?
- Are assumptions made at any point in the data process?
- Are there cultural mantras impacting the data such as “we have always done it this way” or “that is noncritical data”?
- Are individuals apprehensive about raising DI issues or concerns?
Try an Alternative Approach
This next step may seem very unpopular but, if a DI gap or noncompliance is detected, use the quality management system (QMS) and raise a deviation! The QMS is designed to monitor, detect and correct issues that could impact the GxP process, and DI most definitely falls into this category. Using the deviation process will serve to document the impact of the detected gap on the data generated to date, assess the likelihood of similar gaps in other processes, define the fundamental fix of the data process and, ultimately, track and trend the incidence of similar deviations and the effectiveness of the fixes through the management review process. In an ideal state, the QMS alleviates the burden of routine DI assessments and the pile-up of CAPAs by enabling DI compliance by design, predicting and preventing DI issues as part of the normal course of QMS functionality. This is the foundation of data governance.
This approach takes courage. Many organizations seek to reduce the number of deviations. Often, supervisors and managers are measured by how many deviations are attributed to their area and are therefore disincentivized to raise deviations. Organizations can also be apprehensive about presenting a high deviation rate to regulators. Thus, this alternative approach to DI assessments also requires an adjustment to our traditional ways of viewing and managing the QMS. The QMS is a regulatory requirement, and the integration of data governance has become the expectation. Demonstrating that the QMS is being used effectively for its intended purpose, assuring that high-quality, safe products are available to patients, will likely mitigate such concerns about deviation rates during inspections.
By utilizing a QMS integrated with data governance, an organization assures compliance with GMPs while simultaneously advancing practical DI compliance across the organization.
Join us at the 2022 PDA Data Integrity Workshop, as we delve further into these critical topics. In addition to interactive case studies, presentations and panel discussions, each session will hold a Q&A panel with both industry and regulator participation. You will have the opportunity to ask all the hard questions!