Paul Denny-Gouldson
Vice President, Strategic Solutions at IDBS

The trend of increased collaboration in the drug development process has raised potential regulatory concerns and compliance red flags.

The pharmaceutical industry operates under a plethora of regulations, and organizations have to comply with this intricate regulatory system for a very good reason: protecting patient safety.

Before a new drug is introduced to the market, the FDA has to be 100 percent confident that the patient population is not at risk, or that the risks (known side effects, for example) are acceptable. Regulatory compliance is a concern during the entire drug development process—from discovery to preclinical development, through to clinical research and manufacturing.

However, the implications, focus, and effort expended to ensure compliance increase dramatically the closer you get to the patient.

Organizations have to methodically document their development activity to ensure it is in line with the good (x) practices (GxP = GLP, GDP, GMP, and GCP).

All applications, such as the investigational new drug (IND) application, new drug application (NDA), and biologic license application (BLA), require extensive and robust documentation to prove organizations are examining all aspects of safety, efficacy, and manufacturing.

Organizations need to ensure all good practice regimes are managed effectively and scientific data collection and management is conducted to meet the relevant FDA regulations.

Most organizations are already using informatics and software tools to help manage and ensure regulatory compliance—but those only work inside the firewall.

When work is being conducted outside the firewall by third parties, pharma and biotech companies are focused on questions like how do I:

  • Demonstrate compliance?
  • Track data and results?
  • View and act upon deviations and exceptions?
  • Ensure data is captured properly according to my business rules?
  • Show data is tamper evident?
  • Show who did what, when, and why, etc.?

One way to answer these questions is to allow the external parties to work inside the firewall. Historically, this has been the only real go-to solution, in large part because all such systems were only ever located inside the firewall.

However, with the advent of affordable cloud computing, coupled with SaaS (Software-as-a-Service) and PaaS (Platform-as-a-Service), this paradigm is changing rapidly. Nevertheless, pharma and biotechs still need to answer all the same questions as listed above, whether their infrastructure is cloud-based or not.

Moving forward, cloud, SaaS, and PaaS are going to be dominant in pharma IT infrastructures, and this shift is creating a number of obstacles and risks that need to be addressed.

Let’s take a single example: someone’s identity credentials and security privileges inside the firewall. In this situation, credentials are typically managed by a centrally administered application, ranging from LDAP and Microsoft Active Directory to more modern single sign-on (SSO) tools.

There are teams of people managing this, and all applications use it in some manner to authenticate a user, often telling other applications what a user can do: create, read, update, delete (CRUD), etc. Extensions of this often include access to certain data types and application features, but the foundation is that it is all managed in one place for all employees.
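To make the "managed in one place" idea concrete, here is a minimal sketch of a centrally administered identity service. The class and method names (`CentralDirectory`, `can_perform`) are invented for illustration and do not correspond to any real LDAP, Active Directory, or SSO API:

```python
import hashlib

# Hypothetical sketch of a central identity service: one place that answers,
# for every internal application, "who is this user?" and "what may they do?".
CRUD = {"create", "read", "update", "delete"}

class CentralDirectory:
    def __init__(self):
        self._users = {}   # username -> password hash (stand-in for LDAP/AD)
        self._grants = {}  # (username, app) -> set of permitted CRUD verbs

    @staticmethod
    def _hash(password):
        return hashlib.sha256(password.encode()).hexdigest()

    def register(self, user, password, app, verbs):
        self._users[user] = self._hash(password)
        self._grants[(user, app)] = set(verbs) & CRUD

    def authenticate(self, user, password):
        return self._users.get(user) == self._hash(password)

    def can_perform(self, user, app, verb):
        # Applications delegate the permission check to the central service
        # rather than keeping their own user lists.
        return verb in self._grants.get((user, app), set())
```

In this pattern an ELN or LIMS never stores its own user list; it asks the directory both questions at login.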

Now let’s take a situation where some of this work is being conducted outside the firewall. How do organizations ensure user authentication and credentials, and confirm users have permission to perform tasks in an application? For example, if a collaborator is generating important data that contributes to an IND whilst working in a cloud application, what are the obvious risks and barriers?

Firstly, the cloud application must have some connection to your internal authentication service—the place where people are approved to log in to a system. This is not as easy as it sounds, but it has been done, and it is less of a barrier now that IT providers have experience working on this problem.

Secondly, the central service must have enough knowledge of the application you are logging into to grant specific privileges. Some call this process late-binding security/authentication, and it is becoming very important in the collaboration and cloud space.

Historically, all of the privileges have typically been managed inside the application. However, there is a strong trend to have more of this information centrally managed and pushed out to applications.
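The late-binding idea above can be sketched as follows. In this toy example, a central service signs the user's centrally managed roles, and the application translates those roles into its own privileges only at login time. The shared key, role names, and `ROLE_MAP` are all invented for illustration; a real deployment would use an SSO protocol such as SAML or OpenID Connect rather than a hand-rolled token:

```python
import base64
import hashlib
import hmac
import json

SHARED_KEY = b"demo-key-not-for-production"  # illustration only

def issue_assertion(user, roles):
    """Central-service side: sign the user's centrally managed roles."""
    payload = json.dumps({"user": user, "roles": sorted(roles)}).encode()
    signature = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return base64.b64encode(payload).decode(), signature

# Application side: the mapping from central roles to app-specific
# privileges lives in the application and is applied at login time.
ROLE_MAP = {
    "study-director": {"create", "read", "update"},
    "reviewer": {"read"},
}

def bind_privileges(payload_b64, signature):
    payload = base64.b64decode(payload_b64)
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        raise ValueError("assertion rejected: bad signature")
    claims = json.loads(payload)
    privileges = set()
    for role in claims["roles"]:
        privileges |= ROLE_MAP.get(role, set())
    return claims["user"], privileges
```

The design point is that the application holds no user accounts at all: it trusts the signed assertion for identity and binds privileges from roles only when the user arrives.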

There are many technologies that do this—but in order to do so organizations must define their data landscape very carefully and have strong master data management (MDM) principles in place.

Without this MDM foundation—it is almost impossible to get centrally managed and late binding security to work effectively and organizations are left having to manage each application individually.

Working outside the firewall, in a cloud infrastructure that collaborators are using to create critical data, raises compliance concerns. Sponsoring organizations must be able to authenticate users (pretty watertight nowadays in the cloud), but also authorize users to perform tasks, which is not so easy at this time and requires extensive investigation.

Thirdly, everything that the individual collaborator does in the external system must meet the extensive audit requirements of the FDA and other global regulatory bodies. This is part of the authentication foundations for all data in drug development.

The ‘who’ part we dealt with in the first step: authenticating users under a defined security protocol. The next steps are trickier, as the information required to establish the what, where, and why comes from the external system used by the collaborator.

In essence, the requirements on the externalized system are exactly the same as those on internal systems—all roads lead to audit trailing, version control, and data traceability. The external system must provide this in a digestible format that the sponsor organization can consume and use to satisfy the QA and QC departments.
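A minimal sketch of the tamper-evident audit trail such a system must expose: each record carries the hash of its predecessor, so editing any record after the fact breaks the chain. The field names mirror the who/what/where/when/why questions above; this is an illustration of the idea, not a validated 21 CFR Part 11 implementation:

```python
import hashlib
import json

class AuditTrail:
    """Append-only log; each record hashes the previous one, so any
    after-the-fact edit is detectable (tamper evidence)."""
    GENESIS = "0" * 64

    def __init__(self):
        self.records = []

    def append(self, who, what, where, when, why):
        prev = self.records[-1]["hash"] if self.records else self.GENESIS
        body = {"who": who, "what": what, "where": where,
                "when": when, "why": why, "prev": prev}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.records.append({**body, "hash": digest})

    def verify(self):
        # Walk the chain, recomputing every hash; any edited record
        # (or broken link) makes verification fail.
        prev = self.GENESIS
        for rec in self.records:
            body = {k: v for k, v in rec.items() if k != "hash"}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if body["prev"] != prev or digest != rec["hash"]:
                return False
            prev = rec["hash"]
        return True
```

The sponsor can then consume the exported records and re-run `verify` independently, which is what makes the data tamper evident rather than merely logged.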

The last part of this use case is the actual hardware that the applications run on. The three situations discussed above are achievable and have been demonstrated effectively in the clinical trials domain for quite a few years.

The real “nuts and bolts” problems come when you consider cloud in its broadest sense: public infrastructures delivered by the likes of Amazon, Microsoft, and Google. Many organizations run their infrastructures on private clouds, i.e., those not accessible from the public Internet.

Here, it is relatively easy for organizations to validate the hardware and inspect it at will—which normally forms part of the commissioning and validation of a system and is something they would do if the hardware were on-premise.

So what happens in the public cloud? It’s quite different. The major suppliers of public cloud do not allow inspections of their facilities. This is absolutely understandable as it raises a set of security variables that they just don’t want to introduce into their systems.

Why would they allow all the QA representatives from all the pharmas and biotechs into their facilities, when they are running a high percentage of the world’s largest companies’ IT infrastructures? The risk profile is just unfathomable.

This problem of validating hardware and operating systems has been the topic of many compliance and validation discussions in the industry recently. How can a pharma guarantee that hardware changes delivered by the cloud provider do not change the behavior of the applications that run on it?

Industry working groups, cloud providers, and application providers are all working together to provide a framework that allows validated, cloud-provisioned, collaborative applications to be used in drug discovery, development, and manufacturing. This is opening the door to a very different IT landscape, and an opportunity to change the way validated public cloud IT systems work in the future.

There are other obstacles around the IT validation and compliance environment, such as the validation of the applications that run in the cloud themselves and how they are used by scientists.

In conclusion, pharmaceutical organizations are looking to implement data management solutions that electronically document data and processes, allowing data to be easily exported and provided as evidence to regulatory bodies.

With the advent of much greater collaboration between the pharma and biotech industry and contract research/development/manufacturing organizations (CxOs), cloud SaaS solutions need to be supported, as many scientists are now working outside the firewall.

Cloud-based software solutions can help streamline this collaborative drug development process, as long as they can be validated and provide robust security and data management foundations.

When properly architected and designed, these SaaS applications on public cloud platforms can support pharmaceutical organizations working with external parties, enabling sponsoring organizations to monitor workflows in real time, enforce how data is acquired, track who does what, when, where, and why, and track exceptions and deviations.

Essentially, with some effort, organizations can provide answers to QA and QC groups that satisfy all of the complex regulatory and compliance rules. Using advanced data management software solutions, organizations can continually meet regulation requirements, even if they can’t physically see the big box in the corner with flashing lights.

