PDA Letter Article

Pharma Has its Head in the Cloud

by Toni Manzano, Aison

Big Data is Leading Pharma Manufacturing to Greater Maturity

Today, huge amounts of information are continuously being generated and stored in many different systems: external and internal hard drives, virtual disks, network storage systems, pen drives, e-drives, etc. In addition, social networks, multimedia platforms, and Internet of Things (IoT) technology contribute ever-greater volumes of data. In fact, as of 2015, about 2.5 quintillion (2.5×10¹⁸) bytes of data were being created each day (1). This enormous amount of electronic information, commonly referred to as “big data,” is directly related to two factors: (1) the ease with which data can be stored, and (2) the ability to connect to the internet.

Behind both factors lie cloud technologies. Cloud systems enable both cost-effective and quick deployment of storage and access to information. Tools in the cloud support what is commonly termed the three “Vs” of big data: “volume,” “velocity,” and “variety.” But when it comes to big data in regulated pharma, two more Vs appear: “veracity” and “validity.” These two Vs provide context to the knowledge pulled from massive amounts of information. Consider the following questions as applied to a manufacturing site:

  • How difficult is it to manage information when data is saved across different site activities in different formats, on different databases, and from multiple applications, producing isolated silos of information?
  • Is there uncontrolled critical information being used or accessed locally (e.g., dashboards built in spreadsheets, critical documents saved on local computers, deviation studies, or pieces of information related to the batch)?
  • Are current devices ready to share information about the processes they control, both with one another and with new equipment that will be acquired?
  • Can knowledge be extracted from the primary data of production equipment and systems, or has that data already been processed through partial filters, and is there a technology solution that can aggregate it?
  • What is the maturity level of the company in terms of efficiency and effectiveness?

All of these questions boil down to one point: a site’s deployed technologies must serve as a driver of knowledge innovation.

The majority of manufacturing sites are managed by computer applications and electronic devices with digital output. The amount of information generated by a medium-sized pharmaceutical site can easily reach a terabyte of data per year (2). If this is the case, then why is it so difficult to get valuable information from the IT systems across a company within our industry?

For one, pharmaceutical manufacturers rely on a variety of specialized software applications to ensure efficiency and quality within drug production. The diversity of the information generated can lead to serious problems, however, when it comes to providing globally required knowledge. Over the years, companies have gradually acquired a number of software applications from different suppliers to address varying objectives; integrated access to knowledge across these systems was not a consideration when they were purchased. While each of these applications generally fills its particular niche, they prove difficult to integrate when interoperable data is needed.
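To make the integration problem concrete, here is a minimal sketch in Python (the export formats, field names, and values are invented for illustration) of the kind of normalization layer a site needs before records from siloed systems can be analyzed together:

    import json
    from datetime import datetime, timezone

    # Hypothetical example: two siloed systems export the same kind of
    # batch event in different formats. Mapping both onto one common
    # schema is the first step toward site-wide, queryable knowledge.

    def from_mes_json(line):
        """Parse a (hypothetical) MES event exported as a JSON string."""
        raw = json.loads(line)
        return {
            "batch_id": raw["batchId"],
            "parameter": raw["tag"],
            "value": float(raw["value"]),
            "timestamp": raw["utc"],  # already ISO 8601 in this source
        }

    def from_scada_csv(row):
        """Parse a (hypothetical) SCADA row: batch, tag, value, epoch seconds."""
        return {
            "batch_id": row[0],
            "parameter": row[1],
            "value": float(row[2]),
            "timestamp": datetime.fromtimestamp(int(row[3]), tz=timezone.utc).isoformat(),
        }

    # Once every source maps onto the same schema, records from any
    # system can be stored, indexed, and analyzed together.
    events = [
        from_mes_json('{"batchId": "B-1042", "tag": "pH", "value": "6.9", "utc": "2017-05-01T08:00:00+00:00"}'),
        from_scada_csv(["B-1042", "temperature_C", "37.1", "1493625600"]),
    ]
    print(events)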

The amount of data produced by these multiple systems cannot be effectively managed using classical IT assets. Furthermore, statistical processing of this data demands a methodology that can cope with its variety, volume, and velocity.
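One common way to cope with volume and velocity is to compute statistics incrementally instead of re-reading an ever-growing data set. The following minimal Python sketch (the sensor stream is invented) uses Welford's online algorithm to keep a running mean and variance in constant memory:

    class RunningStats:
        """Incremental mean/variance (Welford's online algorithm):
        handles an unbounded stream one reading at a time."""

        def __init__(self):
            self.n = 0
            self.mean = 0.0
            self.m2 = 0.0  # sum of squared deviations from the current mean

        def update(self, x):
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (x - self.mean)

        @property
        def variance(self):
            return self.m2 / (self.n - 1) if self.n > 1 else 0.0

    # Feed readings as they arrive from a (hypothetical) temperature probe.
    stats = RunningStats()
    for reading in [37.0, 37.2, 36.9, 37.1, 37.3]:
        stats.update(reading)
    print(stats.n, round(stats.mean, 2), round(stats.variance, 4))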

And finally, when tons of data must be processed, a legion of resources is needed to organize the information: dedicated servers and networks, specialized technicians, data and advanced analytics specialists, and specific software. Naturally, all this comes with a high upfront investment. The costs and resources needed to design and enable a digital plant can be so high as to seem overwhelming. On top of that, it requires plant management to focus less on pharmaceutical activities and more on IT systems.

So, this begs the question, what is the solution?

Cloud technologies offer the machinery and logic to pull information from vast amounts of data within a plant without significant investment in nonpharmaceutical activities. Some of the more common cloud technologies currently being deployed across the industry are:

  • Computing power: Servers and parallel calculation processes that can scale on demand, virtually without limit.
  • Artificial intelligence (AI) resources: Many components, based on mathematical algorithms developed over recent decades, can generate predictions from large data sets (see the sketch following this list).
  • Advanced analytics: Similar to AI components, different sets of preconfigured objects allow even the most complex analyses based on advanced statistics to be implemented.
  • Infinite storage: For practical purposes, there are no limits on storage in the cloud.
  • Encryption: A few providers offer data encryption as a configurable attribute of their storage.
  • Indexing: Some cloud components can index information so that users can easily conduct fast queries and cross-cutting searches without maintaining any local infrastructure.
  • Other resources: Many functionalities for specific customization are available in the cloud, such as image recognition, natural language processing, data queues, and IoT.
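As a simple illustration of those AI resources, the following minimal Python sketch (using the open-source scikit-learn library; the process parameters, values, and outcomes are invented) trains a model on historical batches to flag an in-progress batch that is at risk:

    # Minimal sketch of the kind of prediction an AI component can serve.
    # All data below is invented for illustration.
    from sklearn.linear_model import LogisticRegression

    # Historical batches: [temperature_C, pH] and whether each passed QC.
    X = [[37.0, 6.9], [36.8, 7.0], [38.5, 6.2], [37.1, 7.1], [38.9, 6.0]]
    y = [1, 1, 0, 1, 0]  # 1 = passed, 0 = failed

    model = LogisticRegression().fit(X, y)

    # Score an in-progress batch; a low pass probability can trigger a
    # review before the batch is finished rather than after.
    print(model.predict_proba([[38.2, 6.3]])[0][1])

In the cloud, a model like this would typically be trained and served as a managed component rather than run on plant hardware.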

Used individually or combined, all these solutions offer manufacturers greater maturity in global information management. This can be considered part of the digital supply chain. An advanced degree of knowledge about all the interrelated events occurring throughout a site results in a high maturity level. Think of a digital supply chain as a self-driving car. Like the driver, the facility has to be conscious of everything going on around it to make the right decision at every moment and under any circumstance. Yet the facility doesn’t have to be “at the wheel.”

What Are the Risks?

Innovation always presents risks during the early phases. At the same time, early adopters who remain mindful of those risks gain an edge. With cloud technology, early adopters are already taking advantage of the results of their initial efforts. Studies have shown that companies gain approximately 30% in cost savings by applying predictive maintenance, a 25% reduction in the product lifecycle by verifying performance in real time, and an estimated market growth of 15% for digital services, including automation (2).

Yet from a regulatory perspective, there are some risks that deserve special consideration: the security and privacy of the data, the commitment and quality behind service level agreements, data integrity, and disaster recovery processes. Nevertheless, each of these elements can be addressed by building on the strengths of cloud technologies. Major cloud platforms, such as those from Amazon, Google, and Microsoft, even feature their own cybersecurity controls. Now, consider a manufacturer that has installed sensors to feed data into an AI system to help improve process performance. How can that manufacturer address the risks?

If the manufacturer in question uses a “software as a service” (SaaS) application that delegates security to a platform such as Google or Amazon, and this platform provides standard security options like secure authorization and a public key certificate along with additional security tools (e.g., multifactor authentication), then the risk is automatically mitigated. Throw in good practices, such as prohibiting reused passwords and avoiding passwords saved in public places, and the risk is further minimized. Furthermore, some providers offer native encryption of data for read and write operations.
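As an illustration of native encryption, the following minimal Python sketch, assuming AWS S3 accessed through the boto3 SDK (the bucket and object names are hypothetical), requests server-side encryption as part of an ordinary write:

    import boto3

    # Hypothetical bucket and object names. The ServerSideEncryption
    # argument asks S3 to encrypt the object at rest with AES-256,
    # with no local key management required.
    s3 = boto3.client("s3")
    s3.put_object(
        Bucket="example-gxp-data",
        Key="batches/B-1042/sensors.json",
        Body=b'{"batch_id": "B-1042", "pH": 6.9}',
        ServerSideEncryption="AES256",
    )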

In other cases, the technology itself decreases or even eliminates the risk. The data acquisition process for big data storage uses standard technologies and procedures that automatically replicate every single byte of data in several locations. Some providers also guarantee automatic replication across three geographically separated data centers. Furthermore, users can configure extra automatic backups. This process is far more robust than the traditional backup system that relies on transporting manually labeled rotational tapes by truck.
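As a sketch of those configurable backups, the following Python snippet (again assuming AWS S3 via boto3, with a hypothetical bucket name) enables bucket versioning, so prior copies survive overwrites and deletions on top of the provider's own multi-datacenter replication:

    import boto3

    # With versioning enabled, every overwrite or delete preserves the
    # previous version of each object instead of destroying it.
    s3 = boto3.client("s3")
    s3.put_bucket_versioning(
        Bucket="example-gxp-data",  # hypothetical bucket
        VersioningConfiguration={"Status": "Enabled"},
    )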

Overall, big data and cloud-based technologies can provide benefits to the pharmaceutical and biopharmaceutical industries. And cloud services are expanding their good practices to meet many security and quality standards and certifications. For example, Amazon Web Services recently published a GxP compliance white paper defining their certifications and procedures (3).

Many considerations can justify the mitigation of detected risks. Other sectors with strong regulatory requirements, such as the financial industry, have already adopted these technologies. (Look at NASDAQ, which currently processes six billion transactions per day using big data and cloud technologies.)

Inevitably, the evolution of technology provokes a disruption in society, and activities like computer validation will adapt to the new context. Regulations should not be seen as an obstacle; rather, they should be seen as something that the disruption itself can help address.

[Editor’s Note: The author will be moderating the session, “Connectivity, Smart Devices, and Analytics” at the 2nd PDA Europe Annual Meeting, June 14 at 9 a.m.]

References

  1. “Every Day Big Data Statistics – 2.5 Quintillion Bytes of Data Created Daily.” VCloudNews (April 5, 2015). http://www.vcloudnews.com/every-day-big-data-statistics-2-5-quintillion-bytes-of-data-created-daily/
  2. “The Future is Digital and Automated.” Siemens Customer Magazine (February 12, 2016). https://www.siemens.com/customer-magazine/en/home/specials/everythings-new/the-future-is-digital-and-automated.html
  3. Wolf, C. “Introducing GxP Compliance on AWS.” AWS Security Blog (January 19, 2016). https://aws.amazon.com/es/blogs/security/introducing-gxp-compliance-on-aws/