Pharma Data Integrity(part 2)
Understanding complex regulations and designing an effective implementation strategy within a short time
28 February, 2020, by Ympronta srl (Info Ympronta)


UNDERSTANDING COMPLEX REGULATIONS AND DESIGNING AN EFFECTIVE IMPLEMENTATION STRATEGY WITHIN A SHORT TIME (PART 2)

In this second part of the article we will delve deeper into the key concepts discussed in the first part, introduce the topic of Data Governance, analyze the main types of data and their sources, and see how each of them should be treated from the perspective of Data Integrity.
Finally, we will try to define the principles on which to base the design of a good strategy to achieve Data Integrity in a pharmaceutical company, identifying the main phases, challenges, opportunities and risks.

    STARTING POINT

    The question that every drug producer must answer concretely is: now that I understand the requirements defined by the regulations, how do I achieve the goal and guarantee the integrity of the data in my company?

    The number and complexity of regulations and guides such as GLP (Good Laboratory Practice), GMP (Good Manufacturing Practice) and, lately, GAMP (Good Automated Manufacturing Practice) suggest that there is no simple and immediate answer. The road to Data Integrity is long and winding, and must be traveled by all departments with GxP impact. Projects of this size require a plan, often a multi-year one: a real strategy.

    DAILY CHALLENGES OF DATA INTEGRITY

    The laboratory analyst and the production operator follow standard operating procedures (SOPs), analysis methods or batch records in their daily activities, documenting the entire process and recording the results. To facilitate these processes, many companies have adopted digital systems such as LIMS and ELN (Laboratory Information Management Systems and Electronic Laboratory Notebooks) in their laboratories and EBR (Electronic Batch Record) in production departments. These systems are undoubtedly of enormous help in automating operations. However, this digitization is often only partial, for two reasons:

    • These solutions are designed mainly to display instruction sheets or analysis methods and to aggregate the results of analytical tests and department operations, which are entered into the system manually by the operator.

    • These solutions are not natively designed to interface with electronic devices for the purpose of collecting process and analysis data.

    Moreover, the data managed in production and in the laboratory cannot be treated as a simple set of parameters, as it has a complex and far from standardized structure:

    • results collected by analytical instruments (scales, titrators, pH-meters)
    • production line sensor readings

    • master data

    • the state of the devices
    • users and their permissions

    • methods applied and metadata
    • ...

    Acquiring electronically only a limited amount of measurement data, without the related metadata (device ID, user, tare, sample quality data, calibration/cleaning/use history, adopted SOP, method version, etc.), leaves the measurement without context.
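    As a sketch of what "measurement with context" means in practice, the snippet below contrasts a bare value with a record carrying the metadata listed above (all class and field names are invented for illustration, not taken from any real system):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)
class Measurement:
    """A raw value, as a stand-alone instrument might export it."""
    value: float
    unit: str

@dataclass(frozen=True)
class ContextualizedMeasurement:
    """The same value with the metadata that gives it ALCOA context."""
    measurement: Measurement
    device_id: str        # which balance / titrator / pH-meter
    operator: str         # attributable: who performed the action
    sop_id: str           # SOP being followed
    method_version: str   # version of the analysis method
    sample_id: str
    tare: Optional[float] = None
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))  # contemporaneous

# A weight of 12.31 g means little on its own...
raw = Measurement(value=12.31, unit="g")
# ...but becomes a defensible record once its context is attached.
record = ContextualizedMeasurement(
    measurement=raw, device_id="BAL-007", operator="m.rossi",
    sop_id="SOP-QC-112", method_version="3.2",
    sample_id="S-2020-0417", tare=0.52)
```

    The frozen dataclass is a deliberate choice here: once recorded, the measurement and its context cannot be silently altered.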

    It soon turns out that the process cannot be managed effectively unless every single interaction with the device is described in the workflow when the process is designed and then tracked during its execution. Such interactions are difficult to describe in a paper SOP or Batch Record. This is already true for simple electronic systems; we can only imagine how the challenge becomes unsustainable as the complexity of digital systems increases.

    The problems caused by manual/paper workflows are:

    • Data transcription is susceptible to errors

    • Lack of data audit trail

    • Reduced accuracy/completeness of records.

    • Incomplete compliance with ER-ES (Electronic Record - Electronic Signature) requirements, even though the systems are validated.

    • Documents, and data in general, are not stored in a single location but are distributed in the various departments of the company and saved on different types of media (electronic and paper).

    • Process managers do not have the necessary knowledge and command of their own processes:

      • paper procedures are subject to interpretation

      • it is easy to underestimate the priority of requirements (what data to store, how long, audit trail, ...)

      • risks are not always adequately managed (impact of failures, durability of magnetic media, ...)

    In short: failure to comply with GLP, GMP and GAMP regulatory requirements.

    A STANDARD APPROACH TO DATA INTEGRITY

    When launching a Data Integrity project it is necessary to take into account some organizational risks. One of these is the question "Who should take responsibility and lead the project?". It is not easy to answer: since DI regulation falls within the scope of quality assurance, one could say the person in charge is the QA Director. 10-15 years ago this might have been the right answer, since at that time a large part of GxP procedures and data were handled on paper. Today, if the company's QA manager has solid experience in digitization and has already led a Data Integrity project, he or she can take on that responsibility. Otherwise, it is better to entrust leadership to the IT systems manager or a specially selected person. We will see why later on.

    Since 2015, when the topic became a priority, many companies have undertaken Data Integrity projects, so it is now possible to collect testimonials and make an evaluation.

    The approaches followed include, in line with the MHRA indications, the identification of the various systems and their evaluation according to their complexity and relevance in order to define the necessary adaptation, validation and operational procedures.

    The most common line followed by companies is, of course, to ensure coverage of the ALCOA principles for data generated during the production, analysis and movement of drugs.


    However, in the context of the high complexity of data and the large number and variety of systems, this approach can prove to be far from easy.

    Digital system and device manufacturers, meanwhile, leveraging the trend and sniffing out new business opportunities, propose interfacing their devices to a PC or, in the worst cases, providing devices with the ability to export data manually. Even where the first method (PC interfacing) is adopted, it may be neither a definitive nor an optimal solution to the problem.

    WHAT DO COMPANIES ACTUALLY DO?

    So, with or without a well-defined strategy, here is the range of actions normally taken by companies:

    • If possible, adaptation (configuration) of existing systems

    • Purchase of new complete systems (hardware-software) to replace existing systems

    • Purchase of new software to upgrade existing systems

    • Review of existing procedures defining the management of:

      • accesses / privileges

      • audit trail

      • data back-up

      • data restore

    • Training and raising awareness of staff on D.I. concepts.

    STRATEGY

    PARETO PRINCIPLE


    Back to the MHRA recommendations, let's focus on the word relevance. There is a strong temptation, in fact, to sort the list of identified systems by increasing complexity and to plan the work in that order, believing that what counts is the number of "regularized" devices.

    However, remembering that the subject is the data and not the system, we can say that with a carefully selected 20% of the systems it is possible to reach a Data Integrity coverage level of 80%.

    The design of the strategy must begin with a detailed mapping of all the data existing in the company and a classification based on how critical each item is to product quality. To be clear, the classic data items are the batch and its expiry date, but a myriad of critical data gravitates around these, and it differs for each company.

    Only after drawing up this TOP 10 (or TOP 20, or TOP 50, depending on the size and complexity of the company) do we proceed to identify the systems involved in managing that data. To prioritize the identified systems it will be necessary to order them, this time according to the timeline of the data's life (generation, transfer, transformation and storage of the data). But this is nothing more than mapping, at a high level, the processes that critical data passes through. So, in order of importance, the sequence of our investigation will be: Data, then Processes, then Systems. The systems, in turn, bring with them the analysis of the infrastructure on which they must run.
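    The prioritization described above can be sketched in a few lines: score each system by how many critical data items it touches, then rank. The inventory below is entirely invented for the example:

```python
from collections import Counter

# Hypothetical critical-data inventory: data item -> systems handling it.
critical_data = {
    "batch_number":   ["ERP", "MES", "LIMS"],
    "expiry_date":    ["ERP", "MES"],
    "assay_result":   ["LIMS", "HPLC-SW"],
    "weighing_value": ["MES", "Balance-01"],
    "ph_value":       ["LIMS", "pH-meter-03"],
}

def rank_systems(inventory):
    """Rank systems by how many critical data items they touch,
    not by how complex they are."""
    counts = Counter(s for systems in inventory.values() for s in systems)
    return counts.most_common()

ranking = rank_systems(critical_data)
# The few systems at the top of the list (here MES and LIMS) cover most of
# the critical data: the Pareto 20/80 of Data Integrity.
```

    The point of the sketch is the ordering criterion: a stand-alone pH-meter may be trivial to validate, but it ranks below the MES because it touches less critical data.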

    All this will help us understand the real data stream, its complexity, and possible inconsistencies and redundancies. Indeed, one of the resulting actions could even be the elimination of a system, if it introduces redundancy and unjustified complexity.

    Only at the end will it be appropriate to evaluate the systems from the point of view of adjustment difficulty, in order to study feasibility and estimate the effort, so as to prepare and possibly rationalize/split the action plan.

    However, Pareto gives us only the principle, not the evaluation method.

    Is there a consistent and valid way to arrive at an operational plan (also known as a remediation plan) in a short time? Yes, there is.

    The objective of this article is precisely the definition of this method. But since it is a method, and not a ready-to-use solution, it is necessary to understand its basic principles.

    TOP-DOWN, BOTTOM-UP... INSIDE-OUT!


    We often hear of a "top-down and bottom-up approach to achieve data integrity" and of "Data Integrity by design". But what do these mean exactly?

    One way to achieve DI could be to identify all gaps in data management, monitor all errors, sporadic or systematic, and then plan corrective actions. But this would mean chasing problems: a path where all actions carry equal weight and which never leads to a conclusion, simply because the conclusion was never defined in advance.

    Moreover, because of the constant "fear" of inspections and notifications, a habit of hiding errors and inconsistencies has taken hold in companies. This definitively eliminates any hope of success if the method described above is adopted.

    We certainly cannot expect the FDA not to detect failures during inspections; that is their job. So it takes a lot of courage and determination at the top level of pharmaceutical companies to take a diametrically opposed approach.


    MONITORING


    First of all, it is important to understand that Data Integrity is not the problem of a single department, such as Quality Assurance or IT. The quality and availability of the drug, which are the primary objectives of any pharmaceutical company as a whole, depend more and more on data, and not only on GxP-critical data. It is therefore important to launch, at company level, a well-defined cultural program of data governance.

    Everyone in the company needs to be aware of and feel the problem of DI. Above all, employees should be encouraged to report problems instead of "burying" them, while management must take responsibility for collecting and analyzing them and implementing a structured plan to resolve them.

    Among the first solutions, therefore, it is essential to implement (or optimize, if they already exist) tools and processes that facilitate the monitoring of systems. And if monitoring is not automatic, as is still the case in the vast majority of companies, then, once the tool is in place, manual reporting procedures must be prepared, with mechanisms that encourage their adoption.


    ROADMAP


    It is also important to have a clear understanding of the complete data stream: how it is shared between the various actors, and which systems make up the company's ecosystem.

    The main steps for the design of the strategy can be summarized in:

    • Assessment work

    • Understanding of the processes and data involved within them

    • Identification of risks and integrity gaps

    • Analysis and design of the remediation plan

    However, before we begin, we need to establish what the real goal is. Alignment with the ALCOA principles, that is, "I do only as much as I need to"? Or do I take full advantage of this opportunity?

    For CEOs and IT Managers, data integrity can be an opportunity to build the digital environment of their dreams. Taking advantage of the current period, characterized by pressure from regulatory agencies, they could finally justify and obtain the budget they have long been waiting for. The entire company benefits, of course. It is just that it is harder for non-experts to imagine these benefits before the whole strategy has been realized.

    Data is king

    DATA INTEGRITY OR DATA GOVERNANCE?

    We have already mentioned Data Governance, and now it is time to structure the ideas. The two concepts are not alternatives to each other: Data Integrity is a key component of Data Governance. Moreover, as you can imagine, they do not belong exclusively to the Pharma world; since data has been crucial for every modern company for years now, Data Governance is vitally important at corporate level.

    DATA GOVERNANCE

    The range of measures put in place to ensure that data, regardless of the format in which it is generated, is recorded, processed, stored and used in a way that ensures its integrity (i.e. completeness, consistency and accuracy) throughout its life cycle.

    The main Data Governance issues include availability, usability, consistency, integrity and security. Data Governance involves the establishment of processes that ensure effective data management throughout the company and the allocation of precise responsibilities for emergency management, data quality and availability within the organization.


    Since this article is dedicated to pharmaceutical companies we will focus on the most critical aspects of Data Governance for the Pharma world.

    DATA GOVERNANCE IN PHARMA

    The Data Governance model in the pharma industry must necessarily be determined by the criticality of the data, rather than the complexity of the system that deals with it.

    Making a complete mapping of all data managed in a company from scratch could be costly.

    The good news is that, for the classic management systems in use in pharmaceutical companies, this analysis should already have been done. Simply ask your colleagues in the Quality Assurance department about the outcome of the risk assessments for ERP and LIMS carried out at the time of their validation. These are likely to contain a criticality ranking for the key data as well as for the processes involved. With a little luck you can even acquire valuable information about the data managed in the laboratory.


    The "bad news" is that, with technological progress and new digital systems, data is increasingly detailed, complex and pervasive. The miniaturization of sensors and their diffusion have made it possible to collect a multiplicity of data that was previously collected manually, and only when strictly necessary.

    Not only that. Applications for the management of workflows are no longer exclusive to accounting processes or Master Data (e.g. multi-role approvals of master data). Thanks to the spread of HMIs, it has been possible to define digital workflows also for those departments that until not so long ago were managed only through paper SOPs and Batch Records. And once the execution of processes becomes digital, numerous opportunities arise for data collection, not only environmental (from sensors) but also operational, such as the execution of process methods and instructions.


    THE IMPORTANCE OF RISK ASSESSMENT AND THE CHALLENGE OF PROPER DATA CLASSIFICATION


    In order to classify data according to its criticality and risk, the procedure provides for the evaluation of the characteristics of raw materials, semi-finished and finished products, and of all their transformation processes, from first to last.
    In technical language, the critical process parameters (CPP) and critical quality attributes (CQA) of the products must be considered.



    By correlating the characteristics thus identified with the work instructions (manufacturing process), and by identifying the ways in which the data makes the transition from one phase to another, we obtain a scheme that represents the journey of critical data through business processes. It is not necessary to analyze the processes in great detail, which would lead to an unjustified waste of time. The real focus is always the data, so it is enough, starting from the identified input attributes and parameters, to track all their steps, enrichments and transformations.


    By doing so we obtain the "data - processes" link. The next step is to cross-reference the results obtained with systems (applications), devices and infrastructures.

    ISA-95

    Another way to speed up the assessment phase of the strategy design is to follow known patterns and try to map one onto the company, using the documentation already available together with the knowledge and experience of the staff.

    We can take as a reference the best-known standard in the automation field. ISA-95 is an international standard from the International Society of Automation for automated interfacing between enterprise systems and control systems. In practice it has proved a very useful tool for understanding and structuring the landscape of business systems.

    The standard does not give priority to any particular type of data but rather focuses on the exchange of data between the different levels of the architecture. As a practical example, consider the flow of master data. Master data could be synchronized between systems in a one-way manner: the idea behind the standard is that it can be advantageous to manage each type of master data (product codes, recipes, methods, MBR, etc.) in a single repository, one database per master data type or per group of types (some companies even opt for a single system dedicated to managing all company master data).

    This facilitates the application of the ALCOA and Data Governance principles, as it allows you to intercept the birth of each data item at a single source and its distribution along all defined paths. Similar reasoning can be applied to transactional data such as batches, process orders, department scheduling, planned and confirmed quantities, etc.

    Following this approach, in a single analysis session it is possible to:

    • classify all existing systems (based on ISA-95 levels and digital/paper type)
    • stratify the data itself on the basis of these levels
    • track the "routes" that each critical data item travels from its birth to archiving
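    A minimal sketch of this single-session analysis, with invented systems and levels, could look like this:

```python
# Sketch: classify systems by ISA-95 level and trace the route of one
# critical data item across levels (all names below are illustrative).
ISA95_LEVELS = {
    1: "intelligent devices (instruments, PLC I/O)",
    2: "control systems (SCADA, HMI)",
    3: "manufacturing operations (MES, LIMS)",
    4: "business planning (ERP)",
}

systems = {
    "Balance-01":  {"level": 1, "medium": "digital"},
    "SCADA-LineA": {"level": 2, "medium": "digital"},
    "LIMS":        {"level": 3, "medium": "digital"},
    "BatchRecord": {"level": 3, "medium": "paper"},
    "ERP":         {"level": 4, "medium": "digital"},
}

# The "route" of one critical data item, from generation to archiving:
weighing_route = ["Balance-01", "LIMS", "ERP"]

def route_levels(route, catalog):
    """Return the ISA-95 levels a data item crosses along its route."""
    return [catalog[name]["level"] for name in route]

# Data generated at level 1 is aggregated upward through levels 3 and 4.
assert route_levels(weighing_route, systems) == [1, 3, 4]
```

    Laying the systems out this way makes it immediately visible where a route passes through a paper medium, which is exactly where Data Integrity gaps tend to hide.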


    Here one must not be misled by the indicators. Theoretically each level, each system, can cover at least the first two phases of the data life cycle:

    1. generation
    2. processing and reporting

    3. storage and recovery

    4. destruction

    Of course, the higher the level a system belongs to, the more the data it manages is structured and aggregated, and the more it includes data from lower levels.

    TYPES OF EQUIPMENT

    In chemical companies, the production department is usually less automated than in purely pharmaceutical companies. As a result, production processes are less complex, and Data Integrity efforts are concentrated in the laboratories, the bulwark of raw material quality. The most complex systems in the laboratory are, of course, the HPLCs.

    Companies whose core business is drug manufacturing have a hybrid situation. Both the production and the laboratory produce and process a lot of critical data.

    A separate case is that of CMO companies (subcontractors) that cover only the packaging processes for companies that have decided to outsource packaging. In their case the greatest complexity is concentrated in the automation of the packaging lines, as they do not have to carry out the in-depth laboratory analyses already performed by their customers.


    IT IS EASY TO UNDERESTIMATE THE COMPLEXITY OF THE SYSTEMS



    In the first version of the "GMP Data Integrity Definitions and Guidance for Industry" the MHRA suggested a classification of systems according to the complexity of their configuration, specifying that the more complex the system, the greater the effort required for its validation. The logic is simple: the more configuration a system requires, the greater the risk of data manipulation (and therefore the Data Integrity risk).

    Unfortunately this concept is often misinterpreted: some companies confuse simplicity of validation with ease of achieving Data Integrity compliance, thus underestimating the risk.

    Paradoxically, the most difficult types of systems to bring into conformity are the simple instruments, such as scales, pH-meters, conductivity meters, etc.

    The MHRA also warns about this risk:

    However, it is common for companies to overlook systems of apparent lower complexity. Within these systems it may be possible to manipulate data or repeat testing to achieve a desired outcome with limited opportunity of detection (e.g. stand-alone systems with a user configurable output such as FT-IR, UV spectrophotometers). 

    Moreover, complexity also comes from the sheer number of these instruments, their variety, and the diversity of data formats each device provides.

    We can also add that, as the complexity of a system increases, so typically does its "paperless" level, i.e. its ability to operate without generating paper reports. This, however, is a rule that now tends to belong to the past, given the spread of digital, integrated and paperless systems also in the left part of the scheme.




    This diagram is particularly useful because it overlays the ISA-95 standard with the types of typical pharma systems/devices, all from a Data Integrity perspective. The focus, therefore, is on the relevance of paper/electronic data for system adaptation and subsequent validation activities.

    WE HAVE STRATEGY. NOW WHAT?

    What we have seen so far is already enough to make an assessment and outline a Data Integrity strategy. The primary objective is to identify critical data and prioritize it. Having done this, we have seen methods that allow us to quickly list and classify company systems and devices. By cross-referencing the results of the two analyses we can prepare a high-level strategic plan that will lead us towards our goal of operating under Data Integrity.

    There is still much to do. The plan obtained must be articulated into an operational plan, also called a Remediation Plan. Since each system, each device, is unique, the remedial action to be taken can also differ for each one. For one system it will be sufficient to define roles and permissions; for another it will be necessary to purchase additional software. Other systems that handle critical data may be so outdated that no remedial action is available, and they will therefore be destined for replacement.

    The transformation of the strategy into a remediation plan is therefore another important and challenging step. In the next chapters we will no longer talk about data, on the understanding that, from the previous chapters, we already know which data must be protected. Instead, we will present elements of system and infrastructure assessment that will help us choose the appropriate remedial action.

    IDEAL DATA INTEGRITY ENVIRONMENT

        Before imagining an ideal Data Integrity solution, let's review the main observations from FDA inspections. The following table pairs each observation with suggested corrective actions.

        Observation: lack of access control
        Suggested corrective actions:
        • assignment of authorizations for the use of automation / production / laboratory systems
        • controlled access to the factory area
        • profile setup with different privileges (segregation of duties)
        • badge, fingerprint scanner, etc. for system access

        Observation: entries not contemporaneous with the activity and backdating; use of unofficial sheets and notes; entry of the same data in different batch records
        Suggested corrective actions:
        • data recording at the same time as the activities
        • regular periodic audit trail review
        • raising awareness of DI, and staff training
        • support for activities by qualified consultants

        Observation: mismatch between reported and recorded data
        Suggested corrective actions:
        • data acquisition via electronic systems and digital tracking of the data flow
        • regular periodic audit trail review
        • interfacing and validation of digital systems
        • staff training


        “Your quality system does not adequately ensure the accuracy and integrity of data to support the safety, effectiveness, and quality of the drugs you manufacture. We strongly recommend that you retain a qualified consultant to assist in your remediation.”

        FDA

        ELECTRONIC OR PAPER?

        Among the first actions during the definition of a remediation plan is a choice: continue to manage the identified data on paper, or implement a digital system? For already installed digital systems, it is possible to consider whether or not to replace them, while it is highly unlikely that anyone would choose to switch from a digital system to a paper one.

        From the regulatory point of view, both types of media are acceptable, as long as they comply with the criteria defined by ALCOA:

        electronically generated raw data can be stored in an acceptable paper or PDF format, where it is justifiable for a "static" record to preserve the integrity of the original data.

        In these cases, however, it must be demonstrated that the data retention process includes verified copies of raw data, metadata, audit trails and verification files, the system configuration, and the specific settings of each production or analytical run, including the data processing steps needed to reconstruct a given raw data set.

        A document certifying that the printouts are an accurate copy of the original data is also required.

        The choice between electronic and paper depends on the criticality of the data and on the complexity/cost of the system needed to manage the data in electronic format. We summarize the main ALCOA recommendations, adapting them to both types.


        For "simple" instruments, such as scales and pH-meters, the WHO guidelines consider it acceptable for the data to be printed on a "receipt". Unfortunately, these printouts are incomplete, and it is often necessary to enrich them with metadata such as the date and time of the event, the signature of the operator who performed the action, and the sample ID. The receipt must then be affixed to the analytical notebook. All these steps require clear and robust procedures and attention from the analysts: attention taken away from their main activity, i.e. carrying out the analysis.

        DIGITAL SYSTEM LIMITATIONS

        To overcome the shortcomings of such instruments, the alternative is to replace them with models featuring an internal clock (synchronized with the company clock) and user authentication tools. Unfortunately, such an expensive solution, however desirable for renewing the toolkit, is not always sustainable.

        It is clear that, if new purchases are planned for the laboratory, it is possible to choose instruments that are already compliant with Data Integrity. But what about the existing stock of instruments?

        Although it is possible to adapt existing instruments or to purchase new instruments that already comply with the requirements, as long as these instruments remain "stand-alone" they cannot be considered fully compliant, as they can still lead to data loss or worse. Just consider the risks of manual back-up via USB stick: this type of manual storage is not sufficient, as it does not guarantee the completeness of each set of results.

        There is another aspect to consider. Laboratory information management systems (LIMS, ELN, LIS) can be easily integrated with each other to work in synergy, thus covering Data Integrity requirements. Integrating analytical instruments, on the other hand, can be a difficult task, as one often operates in an environment with many different instrument brands and many disaggregated software packages. Integrating them all directly into the LIMS means giving it responsibilities and functionality it was not designed for. One solution is the adoption of middleware.
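        The middleware idea can be sketched as follows: one small driver per instrument brand translates its vendor-specific output into a single common record, so the LIMS never deals with raw formats. The formats and field names below are invented for illustration:

```python
# Sketch of instrument-integration middleware: each driver parses its own
# vendor format into one common record that a LIMS could consume.
# All formats and field names are hypothetical.

def parse_balance_csv(line: str) -> dict:
    """Hypothetical balance export: 'S-001;12.31;g'."""
    sample_id, value, unit = line.strip().split(";")
    return {"sample_id": sample_id, "value": float(value), "unit": unit,
            "source": "balance"}

def parse_phmeter_text(line: str) -> dict:
    """Hypothetical pH-meter export: 'sample=S-001 pH=6.82'."""
    fields = dict(part.split("=") for part in line.split())
    return {"sample_id": fields["sample"], "value": float(fields["pH"]),
            "unit": "pH", "source": "ph-meter"}

DRIVERS = {"balance": parse_balance_csv, "ph-meter": parse_phmeter_text}

def normalize(instrument: str, raw_line: str) -> dict:
    """The middleware's job: one uniform record regardless of brand,
    so the downstream system never has to know vendor formats."""
    return DRIVERS[instrument](raw_line)

rec = normalize("ph-meter", "sample=S-001 pH=6.82")
```

        Adding a new instrument then means adding one driver, not reworking the LIMS.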

        As mentioned in the first part, regulatory agencies have recognized both the advantages and limitations of electronic data systems and have increased the controls for their use. Although they do not explicitly push for the adoption of digital systems, they much prefer them to paper systems because of the opportunities they offer for compliance.

        However, official regulations do not extend beyond the coverage of requirements and do not state anything about the optimization of processes that can be achieved with the introduction of these systems, as if this were not relevant to Data Integrity. Let's see how the introduction of digital workflows, in addition to process improvement, can help to achieve excellence in Data Integrity.

        DIGITAL. AUTOMATIC. CENTRALIZED.

        FREE YOUR HANDS. AND YOUR MIND!

        What are the main activities of laboratory analysts or production operators? Manufacturing medicines, performing analyses, or conducting research: high value-added activities that require focus and in-depth analysis, as well as skills and a high degree of specialization. Yet a good part of their time is dedicated to process control and paperwork.

        Unfortunately, the time left for the main tasks is not simply the algebraic difference between the total time available and the time spent managing and controlling the process. Since these are two very different types of task, any person, regardless of experience, needs time to refocus when moving from one to the other and vice versa.

        Studies estimate that 20 to 30 minutes are needed to regain full concentration on a complex activity. This factor is a real burden for any company, even a well-structured one, and is often underestimated by managers.

        That is why the real advantage of going paperless is not just the elimination of paper and the automation of data collection. It goes without saying that removing these routine tasks simplifies core operations; these factors free the operators' hands, but the real process boost is achieved by freeing their minds, when they no longer have to think about "what" and "when" they have to do something, but only about "how".

        There are various possibilities and options to automate workflows:

        • Automatic identification of samples, and materials in general, by bar-code labels.

        • Planning of activities through a dedicated SW and sending the schedule to the workstations/operators.

        • Digital management of analysis methods and process instructions, both preparation and execution.

        • Automatic decisions taken by the systems according to pre-established rules and collected data.

        • Automatic generation of accurate reports.

        • Automatic sending of data to interested parties (people, departments, entities, or digital systems).
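The "automatic decisions" item above can be illustrated with a minimal sketch of a rule engine that dispositions collected results against pre-established specification limits. All test names, limits and statuses here are hypothetical, invented purely for illustration:

```python
# Minimal sketch of rule-based disposition of collected results.
# Test names, specification limits and statuses are hypothetical.
SPEC_LIMITS = {"assay_pct": (98.0, 102.0), "water_pct": (0.0, 0.5)}

def disposition(results: dict) -> str:
    """Return 'PASS' if every result is within its specification,
    otherwise 'OOS' (out of specification) to trigger a review."""
    for test, value in results.items():
        low, high = SPEC_LIMITS[test]
        if not (low <= value <= high):
            return "OOS"
    return "PASS"

print(disposition({"assay_pct": 99.7, "water_pct": 0.3}))  # PASS
print(disposition({"assay_pct": 97.2, "water_pct": 0.3}))  # OOS
```

In a real system such rules would of course live in validated, versioned configuration rather than in code, but the principle is the same: the decision is taken by the system, consistently, and the operator's mind stays free for the "how".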

        In general, looking ahead, the skills required of staff will change, moving towards strong specialization on the one hand and towards the ability to drive continuous improvement and process automation on the other.

        CENTRALIZATION. STANDARDIZATION.

        Is there a way to seize the benefits of digital systems and avoid the disadvantages described above?

        There is another dimension of automation that is worth examining further. Today many manufacturers of laboratory instruments and production lines claim that their equipment complies with CFR21 and Data Integrity standards. Usually this applies to each device individually, which means the company must manage both the master data and the data generated by each device separately, each with its own format.

        Since this fragmented management is extremely expensive, it becomes the subject of continuous internal discussions between departments about who should be responsible for it. At best, each department takes on the burden itself. At worst, it all falls to the IT department, following the simple logic that "digital means IT", without understanding that digital is now everyone's problem. It takes strategic vision and strong leadership from those running the company to turn the problem into an opportunity and make the only right decision: adopt a centralized and automated system onto which to transfer the responsibilities described above.

        Centralization is synonymous with standardization, which in turn means simplification and cost reduction. A single, digital, centralized tool helps, indeed forces, the adoption of uniform templates for master records, methods and batch records.

        Centralization also requires unified user management. A single database containing all the accounts automatically yields a single digital model, a so-called "digital twin", of the entire company organization chart (instead of as many incomplete models as there are systems and devices), with all roles and permissions defined. You can imagine how this simplifies the management model and administrative activities. Above all, since the organization chart is one of the pillars of the company, this approach clarifies and simplifies all processes, helping to detect possible conflicts and redundancies (another important source of inefficiency) in data flows.
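As a sketch of what a single, centralized role/permission model might look like (the users, roles and permission names below are invented for illustration):

```python
# Sketch of a centralized role/permission model: one "digital twin"
# of the org chart instead of per-instrument user lists.
# Users, roles and permission names are hypothetical.
ROLES = {
    "analyst":  {"enter_result", "run_method"},
    "reviewer": {"view_result", "approve_batch"},
}
USERS = {"m.rossi": "analyst", "l.bianchi": "reviewer"}

def can(user: str, permission: str) -> bool:
    """Answer every permission question from the one central model,
    so each connected system asks here instead of keeping its own list."""
    return permission in ROLES.get(USERS.get(user, ""), set())

print(can("m.rossi", "enter_result"))   # True
print(can("m.rossi", "approve_batch"))  # False
```

The design choice is the point: with one authoritative model, adding a device means pointing it at this service, not duplicating accounts, and a conflict or redundancy in roles shows up in one place.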

        The creation of a centralized database makes it possible to draw a clear boundary between operational processes and the processes of verifying and interpreting results. Those who verify collected data and approve batches no longer have desks cluttered with folders of documents from different sources; nor do they need to be physically near the data source. This makes it possible, for example, to have a single remote department for monitoring and verification of results that serves multiple laboratories around the world.

        Finally, centralization and the resulting standardization allow the system validation cycle to be optimized. The centralized software only needs to be validated once; each newly connected analytical instrument then simply requires configuration and creation of its master data, since the system itself, data management, audit trail, user management, result flow, etc. are already validated.

        Centralized software, therefore, can play a key role in a data integrity solution offering:

        • Reducing the cost of analysis per sample

        • Better use of analysts' skills, so that they can focus on the development of new methodologies, data analysis and their clients

        • Better quality and homogeneity of results, for consistent comparison with the results of other laboratories

        • Improvements in accuracy and precision

        • Higher analysis speed

        • A consequent increase in the demand for laboratory services

          All this, in short, leads to a reduction in operating and annual manpower costs, which in the laboratory represent a significant share of total costs, unlike production, where costs are spread between machinery and materials handling.

        Of course this increase in efficiency, and therefore in the value of operations, is a long-term benefit. And as with any long-term return, companies that recognize its strategic importance must be prepared to make major investments both in technology and in the culture and skills of their employees.

        DATA INTEGRITY INFRASTRUCTURE

        Let's imagine we have decided to implement a digital system for the management of Data Integrity in the laboratory and in production. How should we prepare our IT infrastructure to support such a load?

        PCs, database servers, service software, hardware servers and operating systems are the infrastructure components that underpin GMP applications. To these we must add the various IT tasks such as user management, server management, system administration, backups, licensing, etc. Note that the quality of the infrastructure is not optional: in a GMP context the infrastructure itself must be qualified.

        As mentioned in the previous chapters, since the goal is to integrate everything with everything, the crucial role is played by the component that connects different systems speaking different languages: the middleware. Middleware is not tied to any particular system, but is designed to adapt to any communication protocol and method. Its introduction facilitates:

        • direct capture of measurements from the instrument

        • sample identification through integration with a barcode/datamatrix reader

        • date and time retrieved from the network

        • integration with electronic signature software or other user identification tools: fingerprint, heartbeat, facial recognition (in environments that do not require wearing a mask), etc.
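The adapter role of such middleware can be sketched in a few lines: each instrument "speaks" its own format, and the middleware translates every reading into one common record, attaching the sample identity from the barcode and a network-sourced timestamp. The balance message format and all field names below are invented for illustration:

```python
# Sketch of a middleware adapter: per-instrument parsers feed one
# common record format. Message format and field names are hypothetical.
from datetime import datetime, timezone

def parse_balance(line: str) -> dict:
    """Parse a hypothetical balance message, e.g. 'S S 12.3456 g'
    as it might arrive over a serial link."""
    parts = line.split()
    return {"value": float(parts[2]), "unit": parts[3]}

def to_common_record(sample_id: str, raw: dict) -> dict:
    """Wrap a parsed reading with the contextual metadata ALCOA+
    expects: the sample identity (from the barcode reader) and a
    timestamp taken from the network clock, not typed by hand."""
    return {
        "sample_id": sample_id,
        "value": raw["value"],
        "unit": raw["unit"],
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

record = to_common_record("SMP-001", parse_balance("S S 12.3456 g"))
print(record["value"], record["unit"])  # 12.3456 g
```

Adding a new instrument then means writing one small parser, while the common record, and everything validated downstream of it, stays unchanged.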


        CLOUD


        When we talk about the need for a centralized DB, we automatically think of the Cloud, preferably in the form of a Data Lake. Nowadays everyone talks about it, even in the pharmaceutical industry; yet those same companies are still wary of delegating the management of their data to external actors.

        We cannot really know whether data confidentiality is guaranteed once the data reach the cloud. The only protection here is a contract with the supplier and a solid insurance policy that guarantees adequate compensation in the event of a data breach. On the other hand, we cannot deny that, in terms of data backup and availability, no pharmaceutical company can compete with a good cloud service provider. Meanwhile, guidelines for the qualification of these types of infrastructure are still being defined by the regulatory agencies.


        Once the cloud is "cleared", all the opportunities it provides are unlocked: access to external data and services, complex and integrated applications, Big Data with its advanced analytics, machine learning and predictive algorithms. All this, or at least critical services such as production execution, cannot migrate to the cloud until the internet connection is fast, secure and always available. But that is the subject of another article.

        CONCLUSIONS

        So what is the common feature of companies that are successful in Data Integrity?

        Since the biggest risk is that DI is seen as a problem rather than an opportunity, we can say that the winning companies are those that use Data Integrity as a lever to reach the next stage of their growth and market presence.

        Companies that see Data Integrity as an enabling factor for digitization certainly have a far-sighted vision.

         What is the point of investing heavily in making paper-based processes and data compliant with DI if they will have to be digitized soon anyway?


        The budget for DI and digitization can therefore be treated as a single item in the overall budget, and the investments should not only cover compliance but above all aim at process improvement, because compliance is a secondary benefit compared to the broader gains of achieving data integrity.

        This reveals another important requirement. For the company to achieve the structure outlined above, the IT department must be able to make choices that "invade" the territory of other departments. IT can therefore no longer be seen as a mere service department: the "pit stop mechanic" it used to be must now sit beside the driver and suggest the way. And only the driver can allow that, apart from the owner of the racing team, of course.

        If you want to explore the topic further, here is a useful article:

        How To Prevent Data Integrity Issues In Pharma QC Lab?

        Ympronta and Data Integrity

        Our services and products

        At Ympronta we have been working in the pharmaceutical industry since the early 2000s. We have seen the creation and evolution of CFR21 and GMP. We have observed the definition of Data Integrity and its ALCOA+ principles.  

        But we haven't just been passive bystanders. Operating mainly in Life Sciences, each of our projects has had to comply with these regulations, and the experience we gained led us to an important decision: to found a company that offers this expertise directly to customers!

        The result has been the creation of a department dedicated to compliance and Data Integrity and the development of digital systems dedicated to the most complex areas:

        • Digital Data Integrity in the analysis laboratory (Y-Lab)

        • Pharmaceutical Serialization / aggregation (SERYAL)

        • Production automation (Y-Mes)

        Contact us to learn more and to book a meeting where we can assess your challenges and objectives.
