If you’ve had your finger anywhere near the regulatory pulse, you know data integrity is squarely on the regulators’ radar. Some rely on statistical analysis to detect irregularities in the data, while others depend on validation to confirm that a system functions as intended. Consider the three major aspects of the data integrity paradigm: Design, Validation, and Assessments.
Design
Focuses on system design, including physical integrity (protecting data from corruption or loss at the storage level) and logical integrity (ensuring data remains correct and consistent within its context).
Validation
Formal testing to ensure a system performs as intended.
Assessments
Assessments are great, but they come a little too late: the data is already out there and may have been used to make decisions. It’s like proofreading an article after it has been published. Ideally, statistics are used to confirm that all preceding control measures are effective and reliable, demonstrating that the data has attained a high degree of integrity.
Data Assessment
Assess data to verify its accuracy or to identify anomalies; a minimal statistical sketch follows below.
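As a rough illustration of such an assessment (the readings, function name, and threshold are hypothetical assumptions, not a prescribed method), a simple z-score check can flag values that deviate sharply from the rest of a dataset:

```python
import statistics

def flag_outliers(values, z_threshold=2.0):
    """Flag values lying more than z_threshold standard
    deviations from the mean of the dataset."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > z_threshold]

# Hypothetical assay readings with one suspicious value.
readings = [98.1, 99.4, 97.8, 98.6, 99.0, 98.3, 85.2, 98.9]
print(flag_outliers(readings))  # -> [85.2]
```

Note that an extreme value inflates the standard deviation, which is why the threshold here is set low; in practice, robust statistics (e.g., median-based measures) and validated tools are preferable. The sketch simply shows the idea of letting the data itself flag anomalies.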
Each of the above is important, but the critical point is that data integrity should be designed into the system from the very beginning; it belongs at the forefront of system development (i.e., design).
Validation
Validation is a regulatory requirement for GxP systems, so it must be done. However, by the time you validate a system, it has already been developed. Identifying problems such as corrupt data will require re-design, and at that point in the System Development Life Cycle (SDLC) it’s too late: the negative impact on cost and schedule can be exponential. Second, validation confirms that a system performs as intended, which is essentially black-box testing. But what about a system not performing as intended? That is where white-box and grey-box testing come into play.
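To make the distinction concrete, here is a minimal Python sketch; the function under test and both tests are hypothetical, not drawn from any particular system. The black-box test exercises only the documented interface, while the white-box test uses knowledge of the internals to target a specific guard clause:

```python
import pytest

def calculate_dilution(concentration: float, target: float) -> float:
    """Hypothetical function under test: dilution factor
    needed to reach a target concentration."""
    if target <= 0:
        raise ValueError("target concentration must be positive")
    return concentration / target

# Black-box test: asserts only the documented, observable behavior.
def test_dilution_blackbox():
    assert calculate_dilution(concentration=10.0, target=2.0) == 5.0

# White-box test: written with knowledge of the internal guard clause,
# deliberately exercising the error path a black-box suite might miss.
def test_dilution_whitebox_invalid_target():
    with pytest.raises(ValueError):
        calculate_dilution(concentration=10.0, target=0.0)
```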
The reality is you may not be able to use white-box or grey-box methods at all, because they require access to source code. Source code is often the intellectual property of the software vendor, and therefore confidential and proprietary. Furthermore, exposing source code can jeopardize security.
Data Integrity by Design
Designing the system to ensure data integrity from the onset is your first line of defense. This isn’t an easy task, and it requires skilled people. Unless you’re in the business of developing software, avoid building custom software; it’s better to procure systems from qualified vendors who are, in fact, in the software development business.
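To illustrate one way integrity can be designed in from the start (a simplified, hypothetical sketch, not a substitute for a vendor’s validated audit-trail implementation), the snippet below chains each record’s hash to the one before it, so after-the-fact edits become detectable:

```python
import hashlib
import json
from datetime import datetime, timezone

GENESIS = "0" * 64  # placeholder hash for the first record

def append_record(trail, data):
    """Append a record whose hash covers its content and the previous
    record's hash, so silent edits or deletions break the chain."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "data": data,
        "prev_hash": trail[-1]["hash"] if trail else GENESIS,
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    trail.append(entry)

def verify_trail(trail):
    """Recompute every hash in order; any tampering returns False."""
    prev_hash = GENESIS
    for entry in trail:
        body = {k: v for k, v in entry.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if (body["prev_hash"] != prev_hash
                or hashlib.sha256(payload).hexdigest() != entry["hash"]):
            return False
        prev_hash = entry["hash"]
    return True

trail = []
append_record(trail, {"sample": "A-001", "result": 98.6})
append_record(trail, {"sample": "A-002", "result": 99.1})
print(verify_trail(trail))           # True
trail[0]["data"]["result"] = 85.0    # simulate an undocumented edit
print(verify_trail(trail))           # False
```

The design choice here is that integrity is a structural property of how records are stored, not something checked after the fact; that is precisely the difference between design and assessment.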
To qualify a vendor, ensure their team is qualified by education, experience, and training. They should have a Quality Management System that covers the entire system and software development life cycle. Their processes should include practices such as code reviews and white-box and grey-box testing. They should follow the standards, methodologies, and good practices common to software development. They also need the domain expertise to deliver solutions effectively to highly regulated industries such as Life Sciences.
And to the point of this article, their solutions should have been designed in a manner that ensures data integrity.