Can you believe it has been 24 years since 21 CFR Part 11 became effective? Remember the confusion its introduction caused across the industry? Not only were we all trying to figure out how, and how much of, Part 11 to apply, but it also seemed to usher in a new perspective on, and the need to understand, what it takes to ensure “data integrity.”
While ALCOA and electronic validation requirements have always been part of the regulations, our understanding and application of data integrity principles have evolved and expanded over the past couple of decades. Regulators responded by providing the industry with guidance documents and implementing specialized training for field investigators and reviewers to improve issue detection. Industry implemented data integrity policies and procedures. No longer is data integrity considered an issue owned by, or restricted solely to, IT.
We have traveled so far, and yet we are still here, dealing with the same concerns about data integrity, still searching for ways to identify and eliminate gaps. Where to next? What else can we do to manage the quality of our data and ensure data integrity?
We have learned that the people side, or human factor, is a key contributor to data integrity issues. We now understand that people bring conscious and unconscious biases and blind spots into decision-making and that there is a relationship between data integrity and a company’s quality culture. When we develop tools and systems to support users (i.e., humans), are we cognizant of our potential engineering biases? Are we starting early enough in the product or process lifecycle to understand the depth of data we need to collect to assess product quality and patient safety for the long run?
Critical thinking and quality risk management help to strengthen our connection to our data. Are we capable of shifting to a culture that embraces failure as a step toward continuous improvement? Destigmatizing failure encourages us to be transparent and open, which in turn provides the opportunity to identify systemic root causes and implement effective corrective and preventive measures.
Assigning data stewards to own the data lifecycle for products is another tool for increasing oversight of the quality of our data. Understanding which data to monitor feeds into decisions on how to configure our processes and equipment so that we collect all the data needed to assess product quality and patient safety. This concept is not new, and data analytics and data mining are important tools for regulators (1).
Join us at the 2021 PDA Data Integrity Conference: Approaches to Data Integrity Assurance, to be held 23-24 September, as we dive into these topics with regulatory and pharma experts. In addition to interactive case studies, presentations, and panel discussions, there will be a panel Q&A for each session with industry and regulator panelists.
1. U.S. Food and Drug Administration. Data Mining at FDA — White Paper, Aug. 2018; https://www.fda.gov/media/91848/download (accessed 8 Sept. 2021).