Tell me what you think about Data Integrity and I'll tell you who you are
Many companies in the pharmaceutical sector see Data Integrity legislation as just an obligation to be respected. Indeed, since their core business is drug manufacturing, not data management, this obligation is often seen as a nuisance, an obstacle, a waste of time and money that distracts them from far more important activities. Only the most forward-thinking leaders thank the FDA for the opportunity. Today we will analyze the main approaches to Data Integrity and how they characterize the present and the future of the companies that adopt them.
Level 0
The very first level, which fortunately is no longer easy to find, is a paper-based and not very rigorous (not to say chaotic) management of data and processes. Companies at this level make heavy use of office tools such as Word, Excel, and e-mail. Of course, there is nothing wrong with using them in daily work, for creative tasks and for all those activities that do not have to leave a trace in batch records.
For example, it is perfectly normal to describe production, laboratory, or warehouse processes in SOPs written in Word. The critical issues arise when these procedures come into force. For their approval, by signature, and subsequent distribution, these documents are printed and placed in office cabinets or, in the best cases, in dedicated, protected archives. Scans of the signed documents are then saved using tools, often homegrown, such as shared folders on the company server with restricted access and, therefore, procedural complications.
However effective these methods may be, with due precautions on the security of physical and digital archives, we can imagine their low efficiency, given the high costs of maintaining and using the data. Not to mention the risk of data tampering, in the case of batch records, or the difficulty of consulting the SOPs.
Basic Level
Many Data Integrity projects conducted since the legislation came into force aim, in fact, to reach just this level. Since the authorities do not mandate the adoption of digital tools, even if they strongly recommend them, companies limit themselves to meeting the minimum requirements described in the ALCOA+ principles.
The most commonly adopted approach can be briefly summarized as follows (a sketch of the ranking step follows the list):
list all areas, devices and systems that in some way process company data
evaluate the criticality of the data managed within these systems and order the systems according to the detected criticality
apply to the systems, in descending order of criticality, the solutions that guarantee their compliance with the ALCOA+ principles
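As an illustration, here is a minimal sketch in Python of how such an inventory might be ranked. The systems, scoring criteria, and weights are invented for the example, not taken from any standard:

```python
from dataclasses import dataclass

@dataclass
class System:
    name: str
    handles_gxp_data: bool   # does it process GxP-relevant data?
    data_volume: int         # relative volume of records produced
    has_audit_trail: bool    # an existing audit trail lowers the risk

def criticality(s: System) -> int:
    """Toy scoring: GxP relevance dominates, volume adds weight,
    an existing audit trail lowers the remediation priority."""
    score = 0
    if s.handles_gxp_data:
        score += 100
    score += s.data_volume
    if s.has_audit_trail:
        score -= 20
    return score

inventory = [
    System("HPLC-01", handles_gxp_data=True, data_volume=50, has_audit_trail=False),
    System("Warehouse scale", handles_gxp_data=True, data_volume=10, has_audit_trail=True),
    System("Office printer", handles_gxp_data=False, data_volume=5, has_audit_trail=False),
]

# Remediate in descending order of criticality
for s in sorted(inventory, key=criticality, reverse=True):
    print(f"{s.name}: criticality {criticality(s)}")
```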
For example, to satisfy the 'Attributable' requirement, i.e. that each piece of generated data must be attributable to a context (batch, date/time, and user), we make sure the system can only be used after login, via personal username and password, and that each generated record has associated batch and timestamp fields. Sometimes a proper configuration of the existing system is sufficient. Sometimes a software or firmware upgrade is required. Sometimes, in cases of severe obsolescence, the device must be replaced entirely.
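To make the idea concrete, here is a minimal sketch of what an attributable record might look like once such a configuration is in place. The field and function names are illustrative, not taken from any specific system:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class Measurement:
    """An attributable record: every value carries its context."""
    value: float
    batch_id: str       # which batch the data belongs to
    user_id: str        # the authenticated operator who generated it
    timestamp: datetime

def record_measurement(value: float, batch_id: str, user_id: str) -> Measurement:
    # Reject data that cannot be attributed to a logged-in user
    if not user_id:
        raise PermissionError("No authenticated user: measurement refused")
    return Measurement(value, batch_id, user_id, datetime.now(timezone.utc))

m = record_measurement(7.4, batch_id="B-2023-017", user_id="mrossi")
print(m)
```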
But the main problem is that, in most cases, the solutions adopted are strictly confined to the scope of the single system or device. That is, personal login is activated on each identified system. A local audit trail is implemented in each system. For each system, a method of retrieving, processing, and storing data is set up. And since each system differs from the others by type, manufacturer, and model, the remedial actions must be multiplied by the number of systems/devices.
Better than nothing. But given the cost of the project and the complexity it introduces into day-to-day data management, perhaps it is better to stick with paper-based Data Integrity management. And in fact, many companies prefer this last option. But these are companies whose goal is limited to achieving compliance.
Now let's talk about companies that have seen an opportunity in the Data Integrity obligation, consider it an investment, and aim to obtain some return.
Intermediate Level
If your company is not a start-up, if it has a multi-year history and in recent years has already invested heavily in production lines, laboratory devices, and other digital systems, it is likely that the Data Integrity legislation took you by surprise. You probably built your fleet based on a thousand criteria, such as coverage of needs, the prospect of growth and expansion of departments, and, why not, stable relationships with trusted suppliers. The compliance of the devices with Data Integrity was probably the least of your concerns, and now you find yourself with departments and laboratories that look like "Frankensteins": monsters composed of different components sewn together by company procedures and processes.
In this case, the idea of recontacting all suppliers to request an update of their devices can turn into a nightmare. Unless you delegate Data Integrity compliance to a centralized system that alone does everything you need. The only requirement for existing systems is that they be able to interface with the outside world via any standard protocol.
Of course, there are many nuances here too. For example, how do I ensure that my device is not used outside the intended context? In the case of the analysis laboratory, how can I be sure the operator has not first analyzed the sample to check that it respects the tolerances and then, after having "improved" it, performed the "official" analysis? We are talking about so-called pre-readings.
Well, there are several ways to prevent this. Many devices can be configured so that they only start (unlock) via a command from an external system. In the worst case, it is possible to constantly monitor the data coming from the device and, upon unauthorized use, i.e. use without an official analysis having been started on the laboratory management system, record an alarm in the audit trail and send a notification to whom it may concern.
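A minimal sketch of this second, "worst case" strategy: a watcher that records every reading and flags any that arrives while no official analysis is open on the laboratory system. The function and field names are hypothetical:

```python
from datetime import datetime, timezone

# Device IDs with an analysis officially started on the lab system
open_analyses: set[str] = set()

audit_trail: list[dict] = []

def notify_quality(entry: dict) -> None:
    print(f"ALARM: unauthorized use of {entry['device']} at {entry['time']}")

def on_device_reading(device_id: str, value: float) -> None:
    """Called for every reading the device emits, authorized or not."""
    entry = {
        "device": device_id,
        "value": value,
        "time": datetime.now(timezone.utc).isoformat(),
        "authorized": device_id in open_analyses,
    }
    audit_trail.append(entry)       # nothing goes unnoticed
    if not entry["authorized"]:
        notify_quality(entry)       # alert whom it may concern

# A pre-reading: no official analysis was started, so an alarm is raised
on_device_reading("HPLC-01", 7.4)
```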
Such a system is the only way to ensure 100% Data Integrity compliance because no data goes unnoticed.
But that's not the only advantage.
Adopting it globally, for all the devices in the laboratory, brings strong standardization both in processes, thanks to the single user-facing interface, and in the collection, processing, and management of data. Of course, some initial effort is required to integrate the devices, but it is much easier than bringing every device into compliance individually. Furthermore, as we said before, thanks to the adoption of standard communication protocols, this integration can be carried out by the supplier of the centralized system itself. As a result, the overall effort is amply repaid by the benefits obtained once the solution is fully operational.
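The standardization argument can be pictured as a single interface that every integrated instrument implements, whatever protocol (Modbus, OPC UA, etc.) sits underneath. This is a conceptual sketch, not any vendor's actual API:

```python
from abc import ABC, abstractmethod

class Instrument(ABC):
    """One interface for every device; protocol details stay inside."""
    @abstractmethod
    def start_analysis(self, batch_id: str, user_id: str) -> None: ...
    @abstractmethod
    def read_result(self) -> float: ...

class ModbusBalance(Instrument):
    def start_analysis(self, batch_id: str, user_id: str) -> None:
        pass  # would send an unlock command over Modbus

    def read_result(self) -> float:
        return 12.5  # would poll a Modbus register

class OpcUaChromatograph(Instrument):
    def start_analysis(self, batch_id: str, user_id: str) -> None:
        pass  # would call an OPC UA method

    def read_result(self) -> float:
        return 0.98  # would subscribe to an OPC UA node

# The centralized system treats all devices identically
for device in (ModbusBalance(), OpcUaChromatograph()):
    device.start_analysis("B-2023-017", "mrossi")
    print(device.read_result())
```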
Among the disadvantages, we can imagine that it is difficult to adopt a single system of this type for the entire company. Typically a different system, with a similar concept and objective, is adopted in each business area: an LMS (Laboratory Management System) for the analytical laboratory, one (or more) SCADA for production lines and environmental controls, an ERP, a DMS, a QMS, and so on. However, all these systems process different data relating to the same object: the production batch. And these data can make different journeys involving all these systems, and undergo different transformations within them. How is it possible to ensure data integrity when it is exchanged between so many different actors?
And this is where the story gets complicated.
The "PRO" Level
The basic level, if achieved through a well-defined and structured project, is more than sufficient to pass any inspector's audit. But the only advantage is not incurring sanctions. It therefore cannot be considered a real investment, since it involves no real economic return, only risk mitigation.
Of course, in addition to avoiding sanctions, company data is also secured. But let's face it: often this security is only formal, since it is achieved by introducing ever more stringent procedures that simply place greater responsibility on the staff. For example, a periodic but manual backup of data from a PLC via a USB key is required. This does nothing but introduce new management costs and burden people further. Yes, because in addition to entrusting someone with the execution, it is also necessary to delegate someone to check it. All this to increase a degree of security that in any case never reaches 100%, also because humans, unlike machines, tend to become less strict over time.
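By contrast, the same backup can be automated so that no one has to remember it and no one has to check it. A minimal sketch, assuming the PLC data is reachable as a file export; the paths are invented for the example:

```python
import hashlib
import shutil
from datetime import datetime, timezone
from pathlib import Path

SOURCE = Path("/mnt/plc/export.db")   # hypothetical PLC data export
ARCHIVE = Path("/srv/backups")

def backup() -> None:
    """Copy the export, verify the copy by checksum, record the outcome."""
    ARCHIVE.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    target = ARCHIVE / f"plc-{stamp}.db"
    shutil.copy2(SOURCE, target)
    ok = (hashlib.sha256(SOURCE.read_bytes()).hexdigest()
          == hashlib.sha256(target.read_bytes()).hexdigest())
    print(f"{stamp}: backup {'verified' if ok else 'FAILED'} -> {target}")

# Scheduled e.g. via cron, it runs with the same rigor on day 1 and day 1000
if SOURCE.exists():
    backup()
```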
The intermediate level, in addition to having a lower cost of achievement, is a real investment: thanks to the optimization of processes, the elimination of paper and of the human factor, and the automation of data management, it significantly reduces management costs and the risk of error. But it always does so at the level of the single department. And since appetite comes with eating, the question arises whether the same strategy can be adopted globally.
We addressed this issue in our recent article "L'ostacolo più grande al Data Integrity".
Briefly, the idea is to introduce a digital layer into the IT architecture which, in addition to guaranteeing data integrity at the level of the single system/department, does so at the level of processes at any scale: departmental, plant, corporate, or even intercompany.
Another flaw of classic Data Integrity projects, in fact, is that they underestimate a fundamental requirement: guaranteeing the integrity of data throughout its entire life cycle, from generation, through the moments of transformation and transmission, up to archiving and eventual deletion.
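One common technique for detecting tampering as data moves between systems is to carry a checksum, chained from record to record, alongside the data itself. This is a minimal sketch of the general idea, not the specific mechanism of any product:

```python
import hashlib
import json

def sealed(record: dict, previous_hash: str = "") -> dict:
    """Attach a hash covering the record and its predecessor, so any
    later modification or reordering becomes detectable."""
    payload = json.dumps(record, sort_keys=True) + previous_hash
    return {**record, "hash": hashlib.sha256(payload.encode()).hexdigest()}

def verify(records: list[dict]) -> bool:
    prev = ""
    for r in records:
        body = {k: v for k, v in r.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True) + prev
        if hashlib.sha256(payload.encode()).hexdigest() != r["hash"]:
            return False
        prev = r["hash"]
    return True

# The batch's journey: generated in SCADA, transformed in the LMS, filed in the ERP
chain = []
for step in ({"system": "SCADA", "batch": "B-017", "yield": 98.2},
             {"system": "LMS", "batch": "B-017", "assay": 99.1},
             {"system": "ERP", "batch": "B-017", "status": "released"}):
    chain.append(sealed(step, chain[-1]["hash"] if chain else ""))

print(verify(chain))        # True
chain[1]["assay"] = 101.0   # tampering along the journey...
print(verify(chain))        # ...is detected: False
```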
Companies that reach this level can answer inspectors' questions about Data Integrity without ever leaving the office. All data and business processes are visible from any location, in real time. But, as we said, compliance is the lesser advantage compared to having total control of data and business processes.
Data Integrity God
As the highest level of Data Integrity, we would like to briefly mention a futuristic concept rather than a real case study or an existing system, even if some are already making the first attempts.
The biggest challenge, but also the most exciting one, for any engineer, entrepreneur, or leader is to know in advance what the end result of their ideas will be once they are put into practice. Since only one idea out of ten succeeds, creative minds spend most of their time generating and testing new ideas in order to create as much as possible in their lifetime. The natural desire, creativity being an infinite well, is to know as soon as possible whether an idea is worth the time or whether it is better to move on to the next.
This is the reason for the great excitement in recent years around simulators of all kinds, the so-called Digital Twins: the simulator of a future production line, the simulator of a candidate active-ingredient molecule, the simulator of a prosthesis, and, why not, the simulator of a future digital architecture, complete with evidence of the potential risks related to data integrity.
The difficulty of such a simulator lies not so much in simulating the links and transactions between different components as in simulating the behavior of each of them. Also because there is an infinite variety of heterogeneous components, and with the advent of the IoT the situation will only get worse. In fact, with a middleware such as the one presented in the previous level, it is possible to develop, case by case, ad hoc virtual components that simulate the behavior of real objects and systems.
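Continuing the earlier interface sketch, a virtual component is simply another implementation of the same contract, returning simulated data instead of real readings. Purely illustrative:

```python
import random

class VirtualBalance:
    """A stand-in for a real instrument: same interface,
    simulated behavior, usable before the hardware exists."""
    def start_analysis(self, batch_id: str, user_id: str) -> None:
        self._running = True

    def read_result(self) -> float:
        # Simulated weight with noise; a real digital twin would use
        # a calibrated model of the physical device instead
        return round(random.gauss(12.5, 0.05), 3)

# The architecture under design can be exercised end to end,
# with virtual devices plugged in where the real ones will go.
twin = VirtualBalance()
twin.start_analysis("B-2023-017", "mrossi")
print(twin.read_result())
```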
Of course, inspectors are not interested in such simulations. For now. They are interested in the current state of real data and processes. Simulators simply aim to shorten the design and testing times of future systems, so that adjustments do not have to be made during construction and the system starts out functioning and optimal. It may seem obvious, or useless, but as the complexity and number of data, processes, digital systems/objects, and architectures grow, simulators of this kind will soon become anything but obvious.
Tell us your experience. At what level of our lineup is your company?