
The New Technologies Changing the Face of Data Integrity in Drug Discovery


Read time: 5 minutes

Automation and drug development are a perfect pair. Improved reliability and increased efficiency are the most obvious benefits of automation, which can overcome the inconsistency and wasteful processes that have long been among the biggest hurdles in the pharmaceutical industry.


The well-worn figures of the field’s failures are stark evidence of the need for updated technology – 90% of drug candidates fail during development and as much as 60% of research and development costs are attributable to attrition.


Bringing in technology that can improve these grim statistics is a business imperative. That’s why the introduction of Industry 4.0 – shorthand for the application of technologies like automation, robotics and machine learning to industrial processes – has been rapid and redefining in drug discovery. One important benefit of these innovations is their effect on the robustness and quality of drug development information – that is, on data integrity.


In this article, we’ll look at how data integrity can be enhanced in a digitalized pharmaceutical field and the nascent risks posed by embracing these technologies.

The benefits of Industry 4.0

Filipa Mascarenhas-Melo, an assistant professor at the Polytechnic Institute of Guarda in Portugal, says that Industry 4.0 innovations offer “unprecedented opportunities” to pharmaceutical companies.



Enhanced tracking is one of the central promises of the switch from analog to digital data. While handwritten experimental reports or assay readouts can be lost, damaged or forged, automated processes can be calibrated to record and save each step of a process, making analysis far easier for companies and regulators. Dr. Stephen Goldrick, an associate professor of digital bioprocess engineering at University College London, called these changes “pivotal”.


“This evolution mitigates concerns related to manual handling errors and enhances the ease of managing, storing, retrieving and querying electronic batch records,” he added.
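The shift Goldrick describes – every process step recorded automatically, then queried rather than leafed through – can be illustrated with a minimal sketch. This is not any specific LIMS or batch-record product; the `AuditTrail` class and its field names are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEntry:
    """One immutable record of a process step: who, what, when."""
    operator: str
    action: str
    timestamp: str

class AuditTrail:
    """Append-only log: entries can be added and queried, never edited in place."""
    def __init__(self):
        self._entries: list[AuditEntry] = []

    def record(self, operator: str, action: str) -> AuditEntry:
        # Each step is timestamped at the moment it happens, not transcribed later.
        entry = AuditEntry(operator, action,
                           datetime.now(timezone.utc).isoformat())
        self._entries.append(entry)
        return entry

    def query(self, operator: str) -> list[AuditEntry]:
        # Retrieval is a filter, not a search through paper records.
        return [e for e in self._entries if e.operator == operator]

trail = AuditTrail()
trail.record("analyst_01", "prepared assay plate")
trail.record("robot_arm_2", "dispensed reagent, 50 uL")
```

Because `AuditEntry` is frozen, a record cannot be silently altered after the fact – the property that makes electronic trails easier to audit than handwritten reports.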


Mascarenhas-Melo points to systems such as electronic data management systems (EDMS) and laboratory information management systems (LIMS), which enable the tracking of electronic documents and samples within a workflow. Blockchain-based systems, which provide immutable ledgers that track each data entry, can be used to follow the entire pharmaceutical supply chain. This boosts “transparency, authenticity and integrity of data related to drug manufacturing, distribution and sales,” says Mascarenhas-Melo.
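The property Mascarenhas-Melo describes – a ledger where past entries cannot be quietly rewritten – comes from hash chaining: each entry is hashed together with the hash of the entry before it. A toy sketch (not a real blockchain platform; the `Ledger` class and batch fields are invented for illustration) shows why tampering is detectable:

```python
import hashlib
import json

def entry_hash(entry: dict, prev_hash: str) -> str:
    """Hash an entry together with the previous hash, chaining the ledger."""
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

class Ledger:
    """Append-only, hash-chained ledger: editing any earlier entry
    changes its hash and breaks every later link."""
    def __init__(self):
        self.chain: list[tuple[dict, str]] = []

    def append(self, entry: dict) -> None:
        prev = self.chain[-1][1] if self.chain else "genesis"
        self.chain.append((entry, entry_hash(entry, prev)))

    def verify(self) -> bool:
        prev = "genesis"
        for entry, stored in self.chain:
            if entry_hash(entry, prev) != stored:
                return False  # this entry no longer matches its recorded hash
            prev = stored
        return True

ledger = Ledger()
ledger.append({"batch": "A-104", "step": "manufactured"})
ledger.append({"batch": "A-104", "step": "shipped to distributor"})
assert ledger.verify()

ledger.chain[0][0]["step"] = "relabelled"  # tamper with history
assert not ledger.verify()                 # the chain exposes the edit
```

A production blockchain adds distribution and consensus on top, but this chaining step is what makes the record of manufacturing, distribution and sales tamper-evident.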


Industry 4.0 tools have also simplified data recovery after system failures, facilitated secure communication and data encryption protocols and made quality management protocols that ensure data integrity easier to implement.


As digitalization improves workflows for pharmaceutical companies, regulators in the space must adapt their own processes to keep on top of these new technologies. Mascarenhas-Melo co-authored a recent review of good automated manufacturing practices, or GAMP, a series of recommendations for the design of digital systems published by the International Society for Pharmaceutical Engineering (ISPE).


The first GAMP was created in response to the emergence of digital workflows in 1991 and the latest version – GAMP 5’s second edition, released in July 2022 – bears little resemblance to its 33-year-old predecessor. “The latest edition of GAMP 5 acknowledges the non-linear, agile and cyclical nature of modern software development, reflecting the industry's shift towards continuous iteration and innovation,” comments Mascarenhas-Melo.


Regulators, she says, have to prioritize “agility and responsiveness”. Despite this, the ISPE took 14 years to release an update to GAMP 5’s first edition, published in 2008. During this period, many companies’ digital infrastructure fundamentally changed – whether you consider the rise of cloud software offerings, which can be updated and scaled more easily, or the proliferation of AI tools in the field.

Evergreen principles guide regulation

Even in the face of these fundamental changes, there are data integrity practices that have proved to be enduring cornerstones of regulation. The ALCOA acronym, which emphasizes that good data should be Attributable, Legible, Contemporaneous, Original and Accurate, was first coined by the FDA’s Stan W. Woollen in the early 1990s. ALCOA (albeit in a modified form that also considers data’s completeness, consistency, endurance and availability) still “stands out for its effectiveness and widespread recognition within the industry,” says Goldrick. By using these agnostic principles as a guide, regulators are trying to make their guidance more adaptable, even in the face of changing technologies.
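Principles like ALCOA translate naturally into automated checks. As a hedged sketch – the field names and the 15-minute contemporaneity window are assumptions for illustration, not any regulator's rule – a record can be screened for whether it is attributable (a named operator and instrument) and contemporaneous (logged close to the time of observation):

```python
from datetime import datetime, timezone, timedelta

# Fields a record needs to be attributable and complete (illustrative set).
REQUIRED = {"operator", "instrument", "value", "observed_at", "recorded_at"}

def alcoa_violations(record: dict,
                     max_delay: timedelta = timedelta(minutes=15)) -> list[str]:
    """Return a list of ALCOA-style problems with a record (empty = clean)."""
    problems = []
    missing = REQUIRED - record.keys()
    if missing:
        problems.append(f"not attributable/complete: missing {sorted(missing)}")
    if {"observed_at", "recorded_at"} <= record.keys():
        delay = record["recorded_at"] - record["observed_at"]
        if delay > max_delay:
            problems.append(f"not contemporaneous: logged {delay} after observation")
    return problems

now = datetime.now(timezone.utc)
good = {"operator": "analyst_01", "instrument": "HPLC-3",
        "value": 0.82, "observed_at": now, "recorded_at": now}
late = dict(good, recorded_at=now + timedelta(hours=2))
```

Checks like these are what let the same principles survive technology shifts: the rule is stated once, and only the systems enforcing it change.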


“The percentage of violations requiring action has shown a decline, possibly indicating an improvement in compliance efforts by pharmaceutical companies,” Mascarenhas-Melo explains.


While regulators and companies have proved nimble enough to navigate these changing technologies while ensuring data integrity, Goldrick suggests making sure staff and users come along for the ride is an essential factor that risks being ignored. “The implementation and maintenance of these systems demand significant training for users, entailing considerable costs and time dedicated to training,” he says.


Mascarenhas-Melo highlights new GAMP 5 guidelines which encourage end users to liaise extensively with suppliers to make use of their expertise in maintaining digital systems. In turn, suppliers will have to adapt to a new, continuous support model where they work closely with end users long after a sale has been completed.

The risks of Industry 4.0

Completely digitalized processes have security vulnerabilities that suppliers and pharmaceutical companies must prepare for, says Goldrick. “The reliance on cloud storage elevates the industry's exposure to cybersecurity threats, including the potential for unauthorized access to sensitive or patient-specific information,” he says.


These risks are already evident. A recent ransomware attack on UnitedHealth, the world’s largest healthcare company by revenue, involved threats to release patient data and saw pharmacy operations seize up for days. This attack happened concurrently with a strike on pharmaceutical giant Cencora. Ensuring critical healthcare data’s integrity, and training those asked to steward that data, is the only way to protect against such attacks.

How to ensure data integrity in an evolving landscape

Pharma’s match-up with Industry 4.0 technologies is here to stay. The questions are no longer whether the field will be changed by the technologies but rather who will be first to leverage the new state of play to their advantage: industry players, regulators or cybercriminals.


To make systems secure and maximize data integrity, pharmaceutical companies will have to invest in both hardware and proper training for their staff. The costs involved may make some hesitate, but Mascarenhas-Melo has a warning for companies who drag their feet: “Those slow to adopt may find themselves at a competitive disadvantage, struggling to catch up as their more digitally mature counterparts forge ahead.”

 

About the interviewees:

Dr. Filipa Mascarenhas-Melo is an assistant professor in the Higher School of Health at the Polytechnic Institute of Guarda, Portugal and an integrated researcher at the Institute’s BRIDGES - Biotechnology Research, Innovation and Design for Health Products – program. She is also a collaborating researcher in the Department of Pharmaceutical Technology at the University of Coimbra.


Dr. Stephen Goldrick is a lecturer and associate professor in Digital Bioprocess Engineering at University College London’s Department of Biochemical Engineering. He specializes in the application of mathematical modelling and advanced data analytics to processes in the biotechnology field.

 

References:

1. van der Graaf PH. Probability of Success in Drug Development. Clin Pharmacol Ther. 2022;111(5):983–985. doi: 10.1002/cpt.2568

2. Pedro F, Veiga F, Mascarenhas-Melo F. Impact of GAMP 5, data integrity and QbD on quality assurance in the pharmaceutical industry: How obvious is it? Drug Discov Today. 2023;28(11):103759. doi: 10.1016/j.drudis.2023.103759

3. Bongiovanni S, Purdue R, Kornienko O, Bernard R. Quality in Non-GxP Research Environment. In: Bespalov A, Michel MC, Steckler T, eds. Good Research Practice in Non-Clinical Pharmacology and Biomedicine. Springer International Publishing; 2020:1-17. doi: 10.1007/164_2019_274