
This guide stresses the importance of the controls that must be in place around the terms related to the ALCOA concept, in response to the Data Integrity deficiencies being detected in various inspections.

The guide places particular emphasis on aspects such as:

 Definitions of ‘Data Integrity’, ‘Metadata’, and ‘static and dynamic’ data.
 It also insists on computerized system validation: systems must be validated for their
intended use, with specific mention of systems that manage ‘workflows’, confirming the need
to validate every workflow that is relevant from a GMP perspective.
 As has become common across the existing Data Integrity guides, it insists again that
usernames and passwords must not be shared.
 It makes specific mention of ‘audit trail’ review, emphasizing that the audit trail must be
reviewed as an intrinsic part of the documentation under review, as one more step in the
approval process of the documentation that constitutes the GMP record.
 On HPLC, a habitual battleground, it repeats that keeping only the final result is not
acceptable: all of the information generated during the chromatographic process must be
retained.
 Almost imperceptibly, because it appears in a footnote, it also offers interesting
clarifications of the points of the regulation (21 CFR 211) where the ALCOA concepts are
required.
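The ALCOA attributes the guide keeps returning to (Attributable, Legible, Contemporaneous, Original, Accurate) can be pictured as fields on a data record. The sketch below is purely illustrative; the field names and example values are not taken from the guide or from 21 CFR 211.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)  # immutable: the original capture cannot be silently altered
class AlcoaRecord:
    """Minimal GMP-style data record illustrating the ALCOA attributes."""
    value: str              # Accurate: the measured or entered value itself
    recorded_by: str        # Attributable: who generated the data
    recorded_at: datetime   # Contemporaneous: captured at the time of the activity
    source: str             # Original: where the first capture came from
    # Legible is a property of how the record is stored and rendered; a plain,
    # immutable representation keeps it readable over time.

record = AlcoaRecord(
    value="pH 7.2",
    recorded_by="analyst_01",
    recorded_at=datetime.now(timezone.utc),
    source="pH-meter-03",
)
print(record.recorded_by, record.value)
```

Making the record frozen means any correction must be captured as a new record rather than an in-place edit, which is the behavior audit trails are meant to evidence.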
 How can data integrity risks be minimized?
 In today’s marketplace, companies need to feel confident that there is no loss of
quality when using computer systems. To this end, there are effective strategies
companies can implement to manage their data integrity risks and ensure their
data respects the ALCOA principles. By moving from a reactive to a proactive way of
thinking, the following key requirements and controls can be put in place to ensure
data integrity and minimize risk for your organization.
 1. Ensure all computer systems are 21 CFR Part 11 compliant
 21 CFR Part 11 is an FDA regulation that applies to electronic records. Its purpose
is to ensure that electronic records are trustworthy, reliable and equivalent to paper
records. All computer systems that store data used to make quality decisions must
be compliant, making this a natural place to start with data integrity.
 2. Follow a software development lifecycle
 A software development lifecycle methodology helps ensure that quality-related
tasks are performed across the relevant lifecycle phases, from software
development, testing, integration and installation through ongoing system
maintenance. All computer systems should be appropriately developed, qualified,
tested and reassessed on a regular basis.
 3. Validate your computer systems
 Software validation provides documented evidence that a specific process
consistently produces a product meeting its predetermined specifications and
quality attributes. To ensure your system can be validated, it is key to work with
vendors that provide validation support.
 4. Implement audit trails
 A secure, computer-generated, time-stamped audit trail records the identity, date
and time of data entries, changes and deletions. Audit trails support the
trustworthiness of the electronic record, demonstrate data ownership and provide
evidence that records have not been modified or deleted without trace.
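As a sketch of the idea rather than any particular system's implementation, an append-only audit trail that records identity, timestamp and action for every change, and that makes tampering detectable by hash-chaining entries, might look like this:

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Append-only log of data entries, changes and deletions.

    Each entry is chained to the previous one by a SHA-256 hash, so any
    later modification or deletion of an entry breaks the chain.
    """

    def __init__(self):
        self._entries = []

    def record(self, user, action, detail):
        prev_hash = self._entries[-1]["hash"] if self._entries else ""
        entry = {
            "user": user,                                    # identity
            "time": datetime.now(timezone.utc).isoformat(),  # time stamp
            "action": action,                                # entry / change / deletion
            "detail": detail,
            "prev": prev_hash,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self._entries.append(entry)

    def verify(self):
        """Recompute the hash chain; False if any entry was tampered with."""
        prev = ""
        for entry in self._entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if body["prev"] != prev:
                return False
            payload = json.dumps(body, sort_keys=True).encode()
            if hashlib.sha256(payload).hexdigest() != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

trail = AuditTrail()
trail.record("analyst_01", "entry", "sample weight = 10.02 g")
trail.record("reviewer_02", "change", "sample weight corrected to 10.20 g")
print(trail.verify())  # True while the log is intact
```

A real Part 11 audit trail would also persist the log outside the reach of the users whose actions it records; the chaining here only illustrates how after-the-fact edits become detectable.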
 5. Implement error detection software
 Automated inspection software can help verify important documents to ensure their
accuracy. Manual proofreading or inspections are proven to be inefficient and often
cannot assure that files are error-free.
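Commercial inspection software does far more (e.g. text and graphics comparison), but the core automated check can be illustrated with a minimal sketch: a checksum comparison that flags any deviation between an approved master file and a copy in use. The file names here are invented for the example.

```python
import hashlib

def sha256_of(path):
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def files_match(master_path, copy_path):
    """True only when the copy is byte-for-byte identical to the approved master."""
    return sha256_of(master_path) == sha256_of(copy_path)
```

Unlike manual proofreading, a check like this catches every single-byte change, though it cannot tell you *where* the difference is; real inspection tools add that localization.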
 6. Secure your records with limited system access
 All systems should require a login with at least two unique pieces of information and
grant access only to the individuals who need it, to guarantee data integrity.
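As an illustrative sketch (the usernames, roles and password are invented), a login that requires two unique pieces of information and grants access only to authorized roles could look like this:

```python
import hashlib
import hmac
import os

def _hash(password, salt):
    """Salted password hash (PBKDF2-SHA256)."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

# Hypothetical user store: per-user salted password hash and role.
_SALT = os.urandom(16)
USERS = {
    "analyst_01": {"pw": _hash("s3cret!", _SALT), "role": "analyst"},
}
ALLOWED_ROLES = {"analyst", "reviewer"}  # only these roles may access the records

def login(user, password):
    """Grant access only when both credentials match AND the role is authorized."""
    record = USERS.get(user)
    if record is None:
        return False
    if not hmac.compare_digest(record["pw"], _hash(password, _SALT)):
        return False
    return record["role"] in ALLOWED_ROLES
```

Using `hmac.compare_digest` for the comparison avoids timing side channels; the role check enforces the "access only to required individuals" part of the control.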
 7. Maintain backup and recovery procedures
 A backup and recovery strategy is necessary in the event of unexpected data loss
or application errors. The procedure ensures that data can be reconstructed
through media recovery and the restoration of both physical and logical data, and it
safeguards the integrity of your database files.
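A minimal sketch of a verified backup-and-restore procedure (paths, file names and the single-file manifest are illustrative simplifications): the backup records a checksum, and recovery refuses to restore a copy whose checksum no longer matches.

```python
import hashlib
import json
import shutil
from pathlib import Path

def backup(src, backup_dir):
    """Copy src into backup_dir and record its SHA-256 in a manifest."""
    backup_dir = Path(backup_dir)
    backup_dir.mkdir(parents=True, exist_ok=True)
    dest = backup_dir / Path(src).name
    shutil.copy2(src, dest)  # copy2 preserves timestamps/metadata
    digest = hashlib.sha256(dest.read_bytes()).hexdigest()
    # Simplification: one file per manifest; a real tool would merge entries.
    (backup_dir / "manifest.json").write_text(json.dumps({dest.name: digest}))
    return dest

def restore(name, backup_dir, target):
    """Restore a file only if its stored checksum still matches (integrity check)."""
    backup_dir = Path(backup_dir)
    manifest = json.loads((backup_dir / "manifest.json").read_text())
    src = backup_dir / name
    if hashlib.sha256(src.read_bytes()).hexdigest() != manifest[name]:
        raise ValueError(f"backup of {name} is corrupted; refusing to restore")
    shutil.copy2(src, target)
```

The checksum gate is the point of the sketch: restoring from a silently corrupted backup would destroy the very integrity the procedure exists to protect.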
 8. Design a Quality Management System with SOPs and logical controls
 A Quality Management System with Standard Operating Procedures builds quality
into the process by controlling it systematically. It is essential to write and follow
clear, effective procedures to ensure accountability.
 9. Protect the physical and logical security of systems
 Controls are needed to protect the physical and logical security of your systems
and to govern change management, service management and system continuity.
This assures continued development and support of your organization's systems.
 10. Establish a vendor management qualification program
 It is important to evaluate every vendor supplying products to confirm that those
products are of adequate quality and meet your needs (including validation
services). A continuous appraisal is required after the initial evaluation. Asking
what data integrity procedures your vendors have in place will often also strengthen
your own organization's data integrity practices.
 11. Properly train users and maintain training records
 Users should be properly trained so that they have the education and expertise to
perform their jobs competently. Documented training records provide the evidence
of that competence.
 12. Conduct Internal Audits to evaluate controls and procedures
 Internal Audits ensure that all procedures are followed and that continuous
improvement is emphasized.
 Data integrity success
 If you are reading this article, you are most probably aware of how important it is to
ensure your data is not compromised. Compromised data can have resounding
consequences for any organization, no matter its size. However, if data integrity is
treated as a process, your data infrastructure can become an asset instead of a
liability.
