The greatest obstacle to Data Integrity

How to overcome it easily and improve all business processes and Data Governance.
May 13, 2020 by Ympronta srl, Info Ympronta








Have you ever wanted to literally observe a datum along its journey through business processes? Do you think it is impossible? Since Data Integrity requires effective process management and constant monitoring of all critical data, pharmaceutical companies are looking for a tool suited to this purpose. In this article, we will talk about the biggest challenge in achieving a good level of Data Integrity, and we will present a solution that lets you overcome it by easily building a robust environment that, in addition to securing your data, consolidates all business processes.

YOUR BEAUTIFUL FLOW CHARTS


In your professional life you have probably had to draw at least one flow chart. Ah, how beautiful it was to see! How clear and self-explanatory it was! How well it represented that perfect process you had in mind. Once it "ran" on paper, that is, once it was lean, robust and error-free, it was practically done.


A few "simple" steps remained to take for that beautiful flow to come to life:

  1. Share it with all the actors who would be involved. After all, in a democratic environment, the process had to please everyone in order to be approved.

  2. Digitize it, that is, develop or configure a more or less complex application. Or, in less fortunate cases, simply draw up paper operating procedures to be applied once the flow is implemented.

In this article, we will avoid investigating the case in which a "turnkey" digital system is introduced, with the flows already implemented, forcing the company to adapt to these flows, which are sometimes too rigid and unsuitable for the specific needs of people.

We are therefore talking about those cases where the company (or a part of it) is in a phase of full development and rapid growth. That phase where changes are the order of the day and the process you designed yesterday is no longer relevant today. 
 

Yes, even if you were able to convince everyone (far from easy) and, after having integrated the feedback from each one into the diagram, you took the process to its implementation phase, you ran into the harsh reality of the facts: digitizing a process drawn on paper, designed only according to business logic and without considering technological constraints, inevitably involves compromising and "tweaking" your beautiful diagram again.

Or, alternatively, you would need an infinite budget to commission the implementation of a completely custom system, built practically from scratch.


This is almost impossible in the pharmaceutical sector, where any customization is penalized from the beginning because it imposes considerable effort and cost for validation (when GxP processes are involved). The solution, therefore, is the adoption of configurable systems. In other words, quasi-standard solutions which, without the need to write code but thanks to configuration alone, transform a rigid system into a flexible one that can be adapted to certain specific needs.

All solved, then? Not exactly. 
 

The fact is that there is no single system that covers all processes from A to Z: from R&D and drug registration to the purchase of raw materials, the manufacture of semi-finished and finished products, laboratory analysis, all the way to sales and much more.

Of course, there are modular systems, such as ERPs, which cover many of the areas mentioned. Many, but not all. In any case, the modules of an ERP can be seen as independent applications that are well integrated with each other. In this perspective, each area requires the adoption of a software module specifically designed and developed for that area, and customized for each specific company. The next challenge is to make these systems and modules work together effectively. And this is precisely the biggest obstacle to Data Integrity.


... THROUGHOUT THE ENTIRE LIFE CYCLE ...

All definitions of Data Integrity boil down to this. And each of the ALCOA+ principles provides for it. It is not enough that every system in a complex environment guarantees the integrity of the data within it. Most pharmaceutical software already does this. Data Integrity is a cross-departmental and cross-system discipline, and it requires that data remain intact even during every leap it makes from one system to another! This is precisely the meaning of "during the entire life cycle of the data", and it is the main challenge of Data Governance.

But how is it possible to guarantee the integrity of the data at the moments of transfer from one system to another? Each system is an expert in the processes of the area to which it refers and can guarantee the integrity of the data, as long as these are under its jurisdiction.

But between the two communicating systems, who must guarantee the success of the data transfer?

The answer, of course, is ... neither. The best solution is to delegate the burden of effectively transmitting data to something that does this by profession: a middleware.

HOW IS IT MADE?

What should a middleware look like? Let's see it in the following example. Consider a simple, imaginary process that can be immediately understood: laboratory analysis with subsequent batch approval.


The process is triggered at the conclusion of the batch production phase. The system, which could be a central LIMS, waits for the samples taken for analysis (step "Batch Sample Receiving" in the diagram) and, as soon as it receives them, transmits the relevant data to the two analysis laboratories, chemical and microbiological. After that, it waits for the analysis results. When results arrive from both laboratories, the system can make its preliminary decision on the quality of the batch: approved or rejected. Of course, the batch can be approved only if both results are positive. The consistency of this flow is questionable but, as we said, the goal is not to faithfully represent a real process.
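To make the decision logic concrete, here is a minimal sketch in Python (purely illustrative, not the actual Process Engine implementation): the batch stays in its decision state until both laboratories have reported, and it is approved only if both results are positive.

```python
from dataclasses import dataclass, field

@dataclass
class Batch:
    """A batch travelling through the imaginary flow described above."""
    batch_id: str
    # Results reported by the two laboratories, e.g. {"chemical": True, "micro": False}
    results: dict = field(default_factory=dict)

def decide(batch: Batch) -> str:
    """Preliminary quality decision: approve only when both labs report a positive result."""
    if len(batch.results) < 2:
        return "Batch Decision"   # still waiting for one of the two laboratories
    return "Approved" if all(batch.results.values()) else "Rejected"

# Example: the chemical result arrives first, then the microbiological one.
b = Batch("LOT-2020-001")
b.results["chemical"] = True
print(decide(b))                  # -> Batch Decision (microbiological result still missing)
b.results["micro"] = True
print(decide(b))                  # -> Approved
```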

The desired destination


The following is not an animation created for this article with video editing software. It is a screen recording of the Process Engine, used by Ympronta in its most complex and critical projects.


Each node represents a specific state of the object of this process, in our case the batch.

The counter, visible at the bottom of each node, represents the number of batches that are currently in that specific state.
 

The connections between the nodes define the sequence of states that each batch must go through. A connection that "lights up" indicates, in real time, the passage of a batch from one logical state to another.
 

As you can imagine, the whole process can be "crossed" by several batches at any time, and multiple batches can be in the same state at the same time. There is no conflict. Each batch is like a pilgrim who follows, with its own backpack, the same route as the others to reach the desired destination.
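As a rough illustration of this (a hypothetical sketch, not Ympronta's code), the counter shown on each node can be thought of as a simple tally of how many batches currently sit in each state:

```python
from collections import Counter

# Hypothetical snapshot of the current state of each batch in the flow.
current_state = {
    "LOT-001": "Chemical Analysis",
    "LOT-002": "Chemical Analysis",
    "LOT-003": "Microbiological Analysis",
    "LOT-004": "Approved",
}

# Several batches can share the same state without conflict; each node just counts its occupants.
for node, count in Counter(current_state.values()).items():
    print(f"{node}: {count}")
```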

The most interesting thing is that at each passage and at each stop, the batch literally enriches its baggage of data, arriving at the end with an album full of memories. This can also be seen in the final part of the animation, when the data relating to each batch in its final state is shown in the right pane (enlarge the page to see the detail better).
 


All the passages from one system to another, the results of all the analyses, the changes in status... whatever happened to the batch during its processing can be automatically added to the final "dossier". Of course, so as not to carry too heavy a suitcase, the system can, from time to time, package and send what has been collected so far to a specialized warehouse, carrying only the most relevant data with it until the end for immediate consultation.
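A hedged sketch of this "suitcase" idea, assuming the dossier is a simple append-only list and the "specialized warehouse" is any external store (both names are ours, not the platform's): each event enriches the batch's dossier, and older entries are periodically offloaded so that only the most relevant data keeps travelling with the batch.

```python
import json
from datetime import datetime, timezone

archive = []  # stands in for the "specialized warehouse" (a database, document store, etc.)

class Dossier:
    """Collects everything that happens to a batch along the flow."""
    def __init__(self, batch_id: str, keep_last: int = 5):
        self.batch_id = batch_id
        self.keep_last = keep_last
        self.entries = []

    def add(self, event: str, data: dict) -> None:
        self.entries.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "event": event,
            "data": data,
        })
        # Package and ship the older entries so the "suitcase" stays light.
        if len(self.entries) > self.keep_last:
            archive.extend(self.entries[:-self.keep_last])
            self.entries = self.entries[-self.keep_last:]

d = Dossier("LOT-2020-001", keep_last=2)
d.add("state_change", {"from": "Batch Sample Receiving", "to": "Chemical Analysis"})
d.add("result", {"lab": "chemical", "outcome": "positive"})
d.add("result", {"lab": "micro", "outcome": "positive"})
print(json.dumps(d.entries, indent=2))  # only the most recent entries travel with the batch
print(f"{len(archive)} older entry/entries already sent to the warehouse")
```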

Middleware does all this with an unbearable lightness, because this is precisely what it was designed for.

Along the way, all data relating to communications between systems is also recorded. And that is why Middleware is the right answer to the question: "...between the two communicating systems, who must guarantee the success of the data transfer?"



In this way, the process engine (also known as middleware or orchestrator) becomes, for all intents and purposes, the Audit Trail of the interfaces and of the communications that take place within an IT environment, thus relieving all the other digital systems of this burden.
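To give a sense of what "being the Audit Trail of the interfaces" could look like in practice, here is an illustrative sketch (again our own simplification, not the real platform): every message relayed between two systems is recorded by the middleware before being forwarded, so neither endpoint has to keep that evidence itself.

```python
import hashlib
import json
from datetime import datetime, timezone

audit_trail = []  # append-only log kept by the middleware, not by the communicating systems

def relay(source: str, target: str, payload: dict) -> dict:
    """Forward a message between two systems while recording the exchange."""
    audit_trail.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "source": source,
        "target": target,
        "payload_hash": hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()
        ).hexdigest(),
    })
    return payload  # in a real setup the payload would now be delivered to the target system

relay("LIMS", "Chemical Lab", {"batch_id": "LOT-2020-001", "samples": 3})
relay("Chemical Lab", "LIMS", {"batch_id": "LOT-2020-001", "result": "positive"})
print(json.dumps(audit_trail, indent=2))
```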

CONCLUSIONS

A system such as the one discussed above can have multiple uses. It can act as a "glue" between different systems and processes, existing or new. It can be used to create totally new processes, independent of any other software. It can extend the functionality of a system by exporting, processing, and re-importing data back into the same system. This is very useful for modifying the behavior of a particularly rigid existing system when you do not want to replace it, either because you are fond of it or because replacing it would require a considerable investment. But above all, in the pharmaceutical field, it is the ideal tool to guarantee Data Integrity, since it allows you to build, without large investments in time and money, a digital environment with effective control of all flows and data.

The Ympronta process engine is the heart of our cloud platform. Designed for this very purpose, it can handle thousands of concurrent processes. Thanks to its advanced technology and its "multi-client" environment, it is able to manage the processes of all our customers in parallel, without conflicts and without ever failing. Finally, each customer can manage and monitor all their processes and data within their own reserved space.


Thanks to the graphical designer, with its immediate visual impact and no need to write code, we are able to implement processes and test them together with customers, receiving immediate feedback from them. All this drastically reduces the overall time required to create solid, robust, Data Integrity-compliant processes. Long meetings are no longer necessary to imagine, review and approve processes on paper. The initial requirement and the general idea are sufficient to sketch the first version of a real process. Subsequently, thanks to continuous feedback from users, in line with the Agile methodology, we refine the process up to its final version and then publish it in production.

And with processes that can invoke other processes, we are able to design highly complex flows, building real multi-level hierarchies.
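Purely as an illustration of this idea (hypothetical function names, not the platform's API), a parent process can delegate parts of its flow to sub-processes and combine their outcomes:

```python
def chemical_analysis(batch_id: str) -> str:
    # A sub-process with its own internal states, reduced here to a single step.
    return "positive"

def microbiological_analysis(batch_id: str) -> str:
    return "positive"

def batch_release(batch_id: str) -> str:
    """Top-level process that invokes two sub-processes and combines their results."""
    results = [chemical_analysis(batch_id), microbiological_analysis(batch_id)]
    return "Approved" if all(r == "positive" for r in results) else "Rejected"

print(batch_release("LOT-2020-001"))  # -> Approved
```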

If our Cloud Process Engine has intrigued you, if you have a project in mind but do not know how to realize it, if you think you could do it thanks to our process engine, or if you have any questions about our platform, do not hesitate to contact us.
