
The Medpoint Blog

Learn best practices and gain new insights on quality assurance, regulatory strategy, and clinical affairs.

What Does the FDA Consider Data Integrity, and Why Does It Matter?

For industries, organizations, and projects regulated by the FDA (U.S. Food and Drug Administration), data integrity is critically important. Beyond being a compliance requirement, data integrity is the foundation of a robust quality management system and paramount for ensuring patient safety.

The integrity, security, and consistency of data lie at the core of FDA-regulated company operations. To ensure compliance, the FDA regulates data integrity in several ways: through the GMP, GCP, and GLP (collectively, GxP) regulations and the requirements of 21 CFR Part 11.

All of these data-related FDA requirements can be addressed by maintaining good data integrity practices. Conversely, a lack of data integrity can result in Form 483 observations, a Warning Letter, and potentially a recall of products that fail to meet quality standards or requirements.

This article defines data integrity and explains how to maintain it effectively.

What Is Data Integrity? 

Data integrity is the accuracy, reliability, and consistency of data throughout its lifecycle. It is critical in FDA-regulated industries because it ensures that products meet the required quality standards.

Poor data integrity doesn't just lead to regulatory non-compliance and penalties. It can harm the end user, lead to incorrect decisions, trigger product recalls, and damage the company's reputation.

Data Integrity vs Data Validity 

While data integrity and data validity may appear similar, they are not interchangeable terms. 

Data Validity — Data validity is the correctness and reasonableness of data. Data validation is the process of checking data quality and accuracy before use.

Data Integrity — Data integrity is the completeness, soundness, and wholeness of data, consistent with the intentions of its creators. It is the assurance that data is unchanged from its source and has not been modified in an unauthorized way.
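To make the distinction concrete, here is a minimal sketch in Python. The field names, acceptable ranges, and use of a checksum are illustrative assumptions only, not an FDA-prescribed method:

```python
import hashlib

def is_valid(record: dict) -> bool:
    """Data VALIDITY: check correctness and reasonableness before use.
    The field names and ranges here are hypothetical examples."""
    return (
        isinstance(record.get("sample_id"), str)
        and record["sample_id"] != ""
        and isinstance(record.get("ph"), (int, float))
        and 0.0 <= record["ph"] <= 14.0  # a pH outside 0-14 is not reasonable
    )

def fingerprint(raw_bytes: bytes) -> str:
    """Data INTEGRITY: a checksum taken at the source lets us verify later
    that the data has not been modified in an unauthorized way."""
    return hashlib.sha256(raw_bytes).hexdigest()

record = {"sample_id": "LOT-042", "ph": 7.2}
assert is_valid(record)  # the values are reasonable, so the record is valid

original = b"LOT-042,ph=7.2"
checksum = fingerprint(original)
# Any later change to the bytes changes the checksum, revealing tampering.
assert fingerprint(b"LOT-042,ph=9.9") != checksum
```

Note that both checks can fail independently: data can be valid yet altered after collection (an integrity failure), or faithfully preserved yet unreasonable (a validity failure).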

Data Integrity vs Data Quality 

Data quality is the reliability of data. To be considered high quality, data must be complete, valid, and consistent.

Data integrity places data quality in context so that the data remains usable to the organization. In short, data quality is an essential component of data integrity.

Why is Data Integrity Important? 

For the past several years, the FDA and other global regulatory bodies have paid increasing attention to data integrity problems across industries. Today, data is the backbone of compliance, efficient innovation, and quality assurance.

The safety, efficacy, and quality of products can be assured with the accurate, reliable, and valid data that data integrity compliance provides. With data integrity in place, regulators do not need to inspect every process involved in drug production and supply, which saves time and money.

Additionally, data integrity fosters trust between regulatory agencies and the industry as a whole.

Overall, data integrity is key to regulatory compliance, quality assurance, and the company's reputation within the industry.   

What Are The Principles That Ensure Data Integrity? 

To ensure integrity, data must exhibit specific quality and integrity attributes. ALCOA+ is a set of such principles introduced by the FDA, and it is central to GxP compliance.

Originally, the FDA used ALCOA — a set of five principles (Attributable, Legible, Contemporaneous, Original, Accurate). Over time, four more attributes were added (Complete, Consistent, Enduring, Available), and the name was changed to ALCOA+.

Let's analyze the nine attributes in more detail. 

  1. Attributable — when creating a record, it's imperative to record who collected or generated the data and when. At all times, data must be attributable to the person or program collecting or generating it. For example, an adjustment to a lab record must be accompanied by initials and a date to show who made the change and when it happened.

  2. Legible — all collected and generated data must be readable (legible) and permanent. The wording, notations, and references shouldn't be time-sensitive, meaning that slang and unusual phraseology should be avoided so the information remains decipherable in the future. Legibility can also involve storing human-readable data to accompany electronic records.

  3. Contemporaneous — all data should be recorded at the time it is collected or generated. Backdating data isn't allowed. With electronic data maintenance, contemporaneity is standard practice, so this principle mostly refers to records created by humans. Adding a time and date stamp to all data is a good documentation practice, which must be followed by all team members involved.

  4. Original — records should be original or certified true copies of the original data. For example, writing test results on a scrap of paper with the plan of completing the official record later is out of the question, since such an approach invites errors. If you need to transfer handwritten records to an electronic database, they must be verified for legibility, contemporaneity, and completeness.

  5. Accurate — data records must be error-free, complete, truthful, and reflective of reality. Any editing should be documented and annotated as an amendment. During editing, the original information shouldn't be removed, blocked, or deleted. The system you use for recording data electronically should be equipped with accuracy checks.

  6. Complete — data must be in a complete state to avoid manipulation and editing errors. It should include all data from all actions taken to collect the required information, including the audit trail and edits. The data should be presented in context so it can be understood in the future.

  7. Consistent — data records have to be consistent internally and in relation to larger bodies of information. A major part of consistency is keeping data in chronological order (adding a time/date stamp, as noted under "Contemporaneous"). To keep data consistent, it's important to implement SOPs (Standard Operating Procedures) on data formats.

  8. Enduring — the data record shouldn't change or disappear over time. Data must remain available long after it's recorded. This mostly refers to handwritten documents; however, an appropriate set of rules is required to keep electronic data accessible in the future as well. As data formats change and software becomes outdated, ensuring endurance is especially important.

  9. Available — the data record should always be available when needed (e.g., for quality control, processing, reporting, quality assurance audits, or inspections). There should be guidance for archiving data while keeping it available. The most efficient way to keep data available and easily accessible is to store it electronically.

Ensuring that data adheres to these nine principles is part of good data integrity practice.
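Several of these practices (attributable entries, time/date stamps, append-only audit trails, corrections that annotate rather than erase) can be illustrated with a short sketch. This is an illustration only, with hypothetical class and field names; real GxP systems rely on validated software, not ad-hoc scripts:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class Entry:
    """One immutable record entry: attributable (who), contemporaneous
    (when), and enduring (entries are never edited in place)."""
    value: str
    recorded_by: str
    recorded_at: str
    reason: str

class AuditedRecord:
    """Append-only record: a correction is a new entry that annotates,
    rather than removes, the original value (the 'Accurate' principle)."""

    def __init__(self) -> None:
        self._entries: list[Entry] = []

    def record(self, value: str, user: str, reason: str = "original entry") -> None:
        # Time/date stamp applied at the moment of recording (no backdating).
        stamp = datetime.now(timezone.utc).isoformat()
        self._entries.append(Entry(value, user, stamp, reason))

    @property
    def current(self) -> str:
        return self._entries[-1].value

    @property
    def audit_trail(self) -> list[Entry]:
        # The complete history, in chronological order ('Complete', 'Consistent').
        return list(self._entries)

rec = AuditedRecord()
rec.record("pH 7.2", user="jd")
rec.record("pH 7.4", user="jd", reason="transcription error corrected")
# The original value survives in the audit trail; only the current view changes.
assert rec.current == "pH 7.4"
assert rec.audit_trail[0].value == "pH 7.2"
```

The key design choice is that nothing is ever overwritten: each change appends a new attributable, time-stamped entry, so the full history stays complete, consistent, and available for review.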

The Takeaway 

For FDA-regulated organizations and projects, maintaining data integrity is the key to compliance, product quality, patient safety, and company image. 

The ALCOA+ principles are designed to help you ensure data integrity within your organization, helping you avoid issues during FDA inspections and follow best-in-class quality management practices.

If you'd like to learn more about data integrity for quality management purposes, subscribe to our blog today. 
