Reproducibility is key to regulatory compliance. The hidden benefit? It boosts customer trust.
How scientists and engineers working at the forefront of technology transfer at Thermo Fisher Scientific integrate sophisticated statistical approaches in JMP® from A to Z
Thermo Fisher Scientific
In pharmaceutical contract manufacturing, the feasibility studies required for compliant technology transfer come with significant financial costs. Noncompliance, however, can be just as – if not more – expensive given the high cost associated with defective batches. These competing challenges are a powerful incentive for scientists and engineers to work more nimbly, extracting more value and insight from development processes that can be applied to validation and continued process verification.
Engineers and pharmacists at Thermo Fisher Scientific advocated for the widespread adoption of JMP, a powerful statistical discovery platform, to consolidate and automate data workflows, thereby accelerating process transfer, validation and continued process verification during the commercialization phase. With its diverse capabilities and suite of code-free statistical process control (SPC) tools, JMP has also deepened collaboration between statisticians and domain experts.
When process engineers were tasked with transferring a pharmaceutical batch from 1 kg to 40 kg clinical production based on a small feasibility study, the team used JMP to validate the experimental space and ensure conformity. Ready-made SPC tools in the platform helped them to not only meet compliance standards for their first clinical batch, but to do so within a span of just two months. Moreover, says Project Development Manager Christopher Thorne, “Sophisticated statistical approaches and the use of graphical tools in JMP helped us to earn our client’s trust.”
As automation and AI continue to advance the digital manufacturing environment, pharmaceutical companies are increasingly turning to contract manufacturing partners to reduce costs, ensure the expected product quality and accelerate time to market. In the face of both increasing competition and increasing opportunity, contract manufacturing organizations globally are seeking to nurture relationships with their customers by delivering value not just through speed and cost reduction, but also better, more reproducible science.
Thermo Fisher Scientific, a Fortune 500 leader in scientific instrumentation and services, has built its brand on relationships. Catering to customers at the forefront of scientific discovery around the world, Thermo Fisher’s manufacturing organization drives innovation in many different disciplines, from health care to pharmaceuticals and biotechnology. Though the company has nurtured deep subject matter expertise over a comparatively long history, the organization was an early adopter of analytics and has since integrated statistical approaches to science as part of its value to customers.
“Scientific curiosity is what led me to look more deeply into statistical approaches,” explains Christopher Thorne, Project Development Manager at Thermo Fisher Scientific in France. After training as a pharmacist and working in both academia and industry, he says, Thermo Fisher’s culture of analytics created a truly exciting environment for technology transfer. “In my opinion,” he says, “you need to strike a balance between domain expertise and statistical analysis, with each complementing the other.”
JMP® helps augment domain experts’ value by making analytical methods approachable to those not trained in statistics
Today, Thorne oversees the production of development, clinical and regulatory batches for Thermo Fisher’s manufacturing organization. He provides process outputs to colleagues Véronique Brianto, Validation Deputy and Continued Process Verification (CPV) Deputy; and Damien Bonnet, Validation Deputy and CPV Leader, for the preparation of Phase 2 Process Performance Qualification as well as the CPV phase supporting marketing compliance. Furthermore, the team is responsible for implementing a CPV program to determine whether new products, including both oral solids and non-sterile liquid formulations, meet conformity and process control requirements.
In CPV, explains Bonnet, “our responsibility is to ensure that the drug manufacturing processes developed by the R&D team achieve the results expected by the health authorities, and do so in a robust and reproducible way.”
Bonnet and Brianto, with input from Thorne, work to implement CPV following validation, by interfacing with production teams, quality control, process experts, continuous improvement, business management and finance to ensure that the validated manufacturing processes producing marketed products are monitored throughout the entire marketing phase. This work, Bonnet adds, can also support operational decision making and allows the company to anticipate changes in trends, introduce improvements and strive for operational excellence, in addition to generating data for new process development or scaling up.
“As scientists, statistical analysis allows us to go farther with our interpretations, establish a foundation to make sure that a study’s conclusions are solid and compelling, and be confident in our interpretation,” Thorne explains. And as the company’s manufacturing business has accelerated, he adds, “Thermo Fisher’s leadership has prioritized increasing domain experts’ access to the statistical toolkit they need to transfer well-characterized and controlled processes.”
“Over the last two years, progress has really accelerated. Soon the entire team will have access to JMP” – statistical discovery software that Thorne and his colleagues say is now a one-stop shop for data integration, experimentation, validation and compliance workflows.
Standardizing the analytics workflow around JMP® consolidates data management and accelerates collaboration
“JMP was the only software that offered us a range of tools allowing access to data analysis for non-statisticians,” explains Brianto, who played a pivotal role in advocating for the tool’s implementation. “It provides us with a collaborative interface and enables the sharing of information in the form of both raw data and data visualizations between users and non-users of the software.”
Before the team standardized around JMP, she says, the challenges of data compilation and a general lack of understanding around visualizations often led to systematic and costly delays. Realizing from experience that these delays could be prevented by adopting a tool that would suit the needs of both statistical and domain experts, Brianto, Bonnet and Thorne outlined a compelling business case to Thermo Fisher’s leadership.
“The first step in acquiring JMP to support [our work in] continued process verification was to convince and justify to top-level management the added value regarding the use of the software,” Bonnet explains. “One example we highlighted was JMP's SPC tools – things like data distributions, control charts and stability analysis. We walked them through each step [in the JMP workflow]: identifying and solving the problem and showing how the trend improved as evidenced by a positive evolution of process capability, performance and stability metrics.”
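The SPC metrics Bonnet describes – control limits, capability and stability – can be sketched in generic terms. The following is a minimal Python illustration of an individuals control chart and capability indices; the data and specification limits are hypothetical, not the team’s JMP output:

```python
from statistics import mean

# Hypothetical assay results (% of target) for a monitored batch attribute.
samples = [99.2, 100.1, 99.8, 100.4, 99.5, 100.0, 99.7, 100.2, 99.9, 100.3]
lsl, usl = 95.0, 105.0  # illustrative specification limits

# Individuals (I) chart: estimate sigma from the average moving range.
moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
sigma = mean(moving_ranges) / 1.128   # d2 constant for subgroups of size 2
center = mean(samples)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

# Capability indices: Cp ignores centering; Cpk penalizes an off-center mean.
cp = (usl - lsl) / (6 * sigma)
cpk = min(usl - center, center - lsl) / (3 * sigma)

print(f"center={center:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}")
print(f"Cp={cp:.2f}  Cpk={cpk:.2f}")
```

A stable process keeps every point inside the control limits, and a Cpk comfortably above 1 indicates the process fits well within specification – the kind of “positive evolution” the team showed leadership.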
The company’s leadership quickly saw the benefit of consolidating data workflows into a single tool that could democratize analytical best practices and facilitate more productive collaboration between departments – a benefit Brianto says is especially critical for data collection and management, which was a key hurdle with the team’s prior software.
“We rely on close collaboration with the teams responsible for generating process data, which can be complicated given the cross-functional scope of continued process verification,” she explains. “JMP actually has the advantage of being able to offer different tools that facilitate data management in a way that is equally accessible to statistical experts and novices.” By organizing data in JMP and creating a community of practice around the JMP workflow, the team was able to not only gather data faster, but also better manage process knowledge and accelerate validation and verification timelines.
Statistically sound processes speed technology transfer and reduce scrap costs
One of the most compelling cases for JMP, Thorne explains, lies in technology transfer – a critical step in pharmaceutical manufacturing where raw materials can be extremely costly, so feasibility studies are kept small. The key challenge, he says, is to extract as much data as possible from these very small studies to reduce the failure risk of resource-intensive clinical batches.
At Thermo Fisher’s site in Bourgoin, Thorne and his team were tasked with an initial transfer batch scaled from 1 kg to 40 kg. The feasibility study for the development process, which involved roller compaction, had been kept small, but the customer needed to produce a clinical batch quickly – within the space of just two months. In addition to contending with the data limitations of a small sample size, the team would also have to streamline their analytical workflow to deliver a compliant solution within the customer’s timeframe.
“We had to earn the client’s trust in terms of the strategy we adopted to ensure that initial clinical batches were safe,” Thorne explains. To do so, they would have to optimize the feasibility batch to generate as much data as possible, creating granulation profiles manufactured with different parameters. This process would allow Thorne’s team to conduct an impact analysis looking at the effects of granulation parameters on granule quality. A series of compression simulations based on design of experiments (DOE) in JMP established the optimal design space.
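To give a sense of the effect estimates a small factorial DOE yields, here is a generic two-level, two-factor sketch in Python. The factors, levels and responses are hypothetical stand-ins, not the parameters or results of the team’s actual roller-compaction study:

```python
from itertools import product
from statistics import mean

# Two hypothetical roller-compaction factors at coded low/high levels.
factors = {"force": (-1, +1), "speed": (-1, +1)}
design = list(product(*factors.values()))  # 4 runs: (-1,-1), (-1,+1), ...

# Hypothetical granule-quality response measured at each run.
response = {(-1, -1): 78.0, (-1, +1): 74.0, (+1, -1): 88.0, (+1, +1): 90.0}

def effect(index):
    """Main effect: mean response at the high level minus mean at the low level."""
    hi = mean(r for run, r in response.items() if run[index] == +1)
    lo = mean(r for run, r in response.items() if run[index] == -1)
    return hi - lo

force_effect = effect(0)   # (88+90)/2 - (78+74)/2 = 13.0
speed_effect = effect(1)   # (74+90)/2 - (78+88)/2 = -1.0

# Interaction: effect of force at high speed minus its effect at low speed, halved.
interaction = ((response[(1, 1)] - response[(-1, 1)])
               - (response[(1, -1)] - response[(-1, -1)])) / 2

print(len(design), force_effect, speed_effect, interaction)
```

Four runs are enough to separate two main effects and their interaction – which is why a well-chosen design extracts so much information from so little material. JMP’s DOE platform builds and analyzes such designs (and far larger ones) without hand-coding.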
Thorne delivered results to the customer in the form of a completed works documentation package which, he says, generated the maximum volume of data with a minimum number of tests – in effect, keeping the costs of consumed product to a minimum. This documentation package analysis was prepared in JMP, which not only provided tools for DOE and other Quality by Design approaches, but did so in a way that could be easily and transparently conveyed to the customer.
“Sophisticated statistical approaches and the use of graphical tools helped us to earn the client’s trust,” Thorne says. “Without JMP, that would have been far more difficult,” not least of all because the tool’s dynamic visualizations help to communicate results quickly, giving both customers and regulators the ability to reproduce and verify process robustness. The combination of advanced statistical tools and an exploratory platform that encourages transparency is what makes JMP a go-to solution, Thorne says. When scientists can complete an entire workflow all in one platform, they reduce opportunities for error; “plus we save time on analysis,” he adds.
A transparent, reproducible documentation package builds client trust
Ultimately, Thorne says, conclusive feasibility tests confirmed the granulation target parameters and design space for the transfer batch and helped the team to establish a robust process. Even the first clinical batch was confirmed to be in compliance – a result that Thorne says again bolstered the customer’s confidence in the value Thermo Fisher could deliver.
“While it’s too early in the product development process to quantify the business impact of consolidating our analytics workflow in JMP,” he says, because of the level of transparency in the package submitted to regulators, “these data may also be used in the marketing authorization application” – a two-for-one solution that will save the customer further expense. “Client trust is so important, and had we not had JMP, that trust would have been more difficult to earn.”
Continued process verification monitoring, a powerful tool against quality issues and waste
At Thermo Fisher’s site in Bourgoin, Bonnet and Brianto have used JMP to monitor a product’s most critical attributes, as well as to track process improvements and the evolution of process outputs. While continuously monitoring the assay and degradation of the active pharmaceutical ingredient in an oral solid form, they detected atypical trends in results that called for process improvements.
Ultimately, the active pharmaceutical ingredient assay was found to be too low and highly variable – a result correlated with, and caused by, high degradation and instability of the ingredient. Bonnet and Brianto therefore shared the statistical analysis outcomes from JMP with experts in order to propose avenues for improvement to the client. Two actions were implemented as a result, raising the assay to its 100% target and eliminating the degradation. “Process monitoring allows us to predict process behavior and prevent out-of-compliance occurrences by anticipating process drift using JMP process control tools,” they say.
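The drift anticipation Bonnet and Brianto describe can be sketched with a Western Electric-style run rule: a slow trend is flagged while every individual point is still inside the control limits. JMP applies such tests natively; the Python sketch below, with hypothetical data and limits, is only a generic illustration of the principle:

```python
# Hypothetical center line and 3-sigma control limits for a monitored output.
center, sigma = 100.0, 0.5
ucl, lcl = center + 3 * sigma, center - 3 * sigma

# A slow downward drift: every point stays within the control limits.
series = [100.1, 99.9, 99.8, 99.7, 99.6, 99.6, 99.5, 99.5, 99.4]

def run_rule_violation(points, n=8):
    """True if n consecutive points fall on the same side of the center line."""
    for i in range(len(points) - n + 1):
        window = points[i:i + n]
        if all(p < center for p in window) or all(p > center for p in window):
            return True
    return False

within_limits = all(lcl < p < ucl for p in series)
drift_flagged = run_rule_violation(series)
print(within_limits, drift_flagged)
```

Here the run rule fires on eight consecutive below-center points even though no single result breaches a limit – exactly the early warning that lets a team intervene before an out-of-compliance occurrence.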