Change is challenging, and in the digital world change is constant, driven by the continuous development of new technologies and concepts.
Let’s look back at the history of Computer System Validation (CSV). Back in the 90’s, the FDA began raising its expectations regarding computer system compliance and validation following a number of pivotal inspections.
Then, in 1997, it issued the well-known 21 CFR Part 11 along with the draft General Principles of Software Validation (finalised in 2002). The first version of the ISPE GAMP guide was published in 1995, and the latest, version 5, was published in 2008.
Today, we are moving from huge investments in infrastructure and local managed systems (aka on-premise) into a new era of SaaS (Software as a Service) and IaaS (Infrastructure as a Service).
New technology, same guidelines
We hear and read about Industry 4.0, Cloud Computing, Internet of Things, Machine Learning, Artificial Intelligence, and Blockchain. But are the regulations and guidelines keeping up with all this innovation? The answer is: Yes, definitely!
Computer System Validation is about three key aspects: Product Quality, Patient Safety, and Data Integrity. The standards and methodologies upon which GAMP was written haven’t changed, so the technology behind a computer system doesn’t matter.
We’re talking about a set of good practices for the conception, development, and implementation of software, namely:
- ASTM E2500
- ITIL
- ICH Q9
- COBIT

ASTM E2500 focuses on knowing the products, processes, and regulations that guide the creation of the requirements specification. ITIL contributes the project and handover activities, which GAMP 5 has merged with ICH Q9 concepts to perform the risk assessment. Finally, COBIT contributes all the system and data governance.
Times are changing
The last GAMP revision introduced the Risk-Based Approach, in alignment with the ICH Q9 Quality Risk Management guideline.
At that time, the Life Sciences industry was very reluctant to introduce new technologies. If you’re asking yourself why, well, it was because of the burden of CSV. Risk Management allowed companies to concentrate their validation effort on high-risk functions only.
Fast forward to today: the goal of having the industry invest in automation and digitalisation is yet to be fulfilled. This is still partly because of the cost of validation and its maintenance and, let’s face it, partly because of the fear of change.
More agility, less paperwork
There’s an urgency to move towards the Industry 4.0 concept. For that purpose, the FDA is about to release its guidance on Computer Software Assurance (CSA). The concept is still risk-based, but it is much simpler and less costly while maintaining the focus on the patient.
What about this transition from CSV to CSA?
As software development life cycle methodologies evolved from the well-known Waterfall model (the basis for GAMP) into Agile and DevOps, the validation of a computer system can no longer be seen as a one-time, documentation-based activity.
More importantly, the transition from CSV to CSA reduces the amount of paperwork. How? By leveraging the vendor’s experience and the robustness of its QMS (Quality Management System), in which it is crucial that testing follows best practices, including emerging methodologies and tools such as ATDD (Acceptance Test-Driven Development), session-based testing, and automated testing.
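To make the ATDD idea concrete, here is a minimal sketch in Python: the acceptance criterion ("every record change must be captured in an audit trail") is written as an executable test before the feature is built, so the test itself documents the requirement. The names `AuditTrail` and `record_change` are hypothetical, purely for illustration.

```python
# ATDD sketch: the acceptance criterion is expressed as an automated test.
# All names below (AuditTrail, record_change) are illustrative, not from any
# specific product or library.

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AuditTrail:
    entries: list = field(default_factory=list)

    def log(self, user: str, action: str) -> None:
        # Capture who did what, and when (UTC) - a typical data-integrity expectation
        self.entries.append({
            "user": user,
            "action": action,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })


def record_change(records: dict, key: str, value, user: str, trail: AuditTrail) -> None:
    """Apply a change and log it - the behaviour the acceptance test demands."""
    old = records.get(key)
    records[key] = value
    trail.log(user, f"changed {key!r} from {old!r} to {value!r}")


def test_every_change_is_audited():
    # The acceptance criterion as a test: a change without a matching
    # audit entry is a failure.
    trail = AuditTrail()
    records = {"batch_status": "in_process"}
    record_change(records, "batch_status", "released", user="qa.lead", trail=trail)
    assert records["batch_status"] == "released"
    assert len(trail.entries) == 1
    assert trail.entries[0]["user"] == "qa.lead"


test_every_change_is_audited()
```

Because the test runs automatically (e.g. in a CI pipeline or with a runner such as pytest), it produces objective, repeatable evidence of the tested behaviour, which is exactly the kind of assurance record CSA favours over manually executed, screenshot-heavy test scripts.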
Making a smooth transition from CSV to CSA
4TE is ready to support your transition from CSV to CSA: from a detailed assessment of your current QMS and computer systems inventory to identify the best transition approach, through supplier qualification, and culminating in the validation of upcoming computer systems in your current environment.
We recommend that you check out our Computer System Validation Services. If you think we may be able to help you out, don’t hesitate to book a call; we’ll do our absolute best to help.
If QRM is a topic of interest to you, we recommend you check out our article QRM over Lifecycle Management.