Customer Story

Robust processes catalyze new business opportunities

Johnson Matthey’s bioprocessing group uses statistical methods to transfer enzyme technology from the lab bench to the production line

Johnson Matthey Biocatalysts

Challenge: Develop and scale up manufacturing processes for the supply of new enzymes to resolve the catalytic needs of customers in a range of industries.
Solution: Combine design of experiments (DOE) with data visualization and analysis in JMP® to help select which strains to scale up, establish scalable processes and enable production of commercially viable enzymes.
Results: By augmenting automated experimentation with DOE in JMP, scientists in the bioprocessing group have reduced attrition in catalyst transfer and expanded screening operations. This approach has played a critical role in the scaling up of Johnson Matthey’s biocatalysis business.

Inspiring science, enhancing life. That’s the motto of Johnson Matthey, the UK-based multinational purveyor of specialty chemicals and sustainable technologies. Founded more than 200 years ago as an assayer of precious metals, the $16 billion company’s products and technologies now extend to emission-control catalysts, fuel cells, batteries, colors, coatings, pharmaceutical ingredients and medical device components. From the active ingredients in lifesaving medicines to the platinum assemblies of cochlear implants, Johnson Matthey has had a hand in many of today’s most notable scientific advances.

But in the digital age, staying on the leading edge of advanced technology takes as much information as it does effort. To remain competitive, Johnson Matthey relies on statistically driven science to help guide business decisions – in everything from discovery and process research to scalability and manufacturing. And for that, Johnson Matthey depends on JMP®.

A chemistry giant expands into the life sciences

With a well-established reputation for innovation in chemistry, Johnson Matthey – like many global companies today – recognized the potential to expand its expertise to meet a growing demand for biotechnology solutions as early as 2000. One of the vanguard undertakings of this expansion effort was the creation of a biocatalysis group and the acquisition of a Düsseldorf-based biotechnology startup, X-Zyme.

Biocatalysis is the process of performing chemical transformations using catalysts derived from a biological source. It was a new field for Johnson Matthey, which, despite its prior success in chemical catalysis, had not previously designed catalyst products from biological source matter. The then-fledgling group was tasked with not only building a catalog of proprietary enzymes but also expanding operations into a profitable business that could be used in everything from perfume to lifesaving pharmaceutical drugs. “Biotechnology has been transformational in many industries,” says biologist and Bioprocessing Team Leader Andrew Kaja. “Maneuvering our catalysis operations into this arena can help differentiate us and open new markets.

“We’re looking at the genetic resources in nature to find DNA sequences we can use in enzymes for industrial applications.” To find and develop these enzymes and pathways, the company is looking into creating metagenomic gene libraries using DNA extracted from soil samples. This bypasses the unculturability of many microbes and allows the bioprospecting of nature’s huge genetic reserves. In tandem, the company uses random mutagenesis and rational computational methods to further increase this genetic diversity, aiming to find a sequence that translates into a protein that delivers the right functional and quality characteristics.

Kaja’s group works chiefly with E. coli, a well-characterized bacterium frequently used within the biotechnology field. The bioprocessing team starts with gene sequences that have been screened by the discovery team to identify next-generation industrial enzymes. These sequences are then transformed into E. coli strains, and their growth and productivity are assessed in fermenters. Here, the cells are grown in a nutrient broth under conditions that promote the synthesis of the target enzyme. The cell line that expresses the most enzyme is then selected and brought forward for further process development and manufacturing. After the cells’ productivity within the fermenter has been optimized, the downstream process of isolating, purifying and formulating the enzyme must be undertaken to maximize yield and minimize cost. Kaja’s team works to develop processes for both these stages, tests for process robustness and seeks to identify potential bottlenecks prior to scaling up production.

“Robust screening at both the discovery and process development stages is vitally important, because we need to reduce attrition rates and develop strains that are robust enough for manufacturing,” Kaja says. Screening enzyme production processes involves many variables, which is why it’s essential that scientists like Kaja use powerful software like JMP that is capable of supporting design of experiments (DOE) in a multivariate environment.
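To give a flavor of what multivariate screening entails, the sketch below enumerates a two-level full-factorial design over five fermentation factors. All factor names and levels are invented for illustration – they are not Johnson Matthey’s actual settings – and in practice a DOE tool like JMP would generate a fractional design that covers the same factor space in far fewer runs.

```python
from itertools import product

# Hypothetical two-level settings for five fermentation factors
# (illustrative values only, not actual process parameters).
factors = {
    "media_g_per_L": (10, 30),
    "temperature_C": (28, 37),
    "pH": (6.5, 7.5),
    "feed_rate_mL_per_h": (5, 15),
    "time_h": (24, 48),
}

# A full two-level factorial enumerates every combination of settings:
# 2**5 = 32 runs. Fractional factorial designs trade some interaction
# information for a much smaller run count.
design = [dict(zip(factors, combo)) for combo in product(*factors.values())]

print(len(design))   # 32 runs
print(design[0])     # first run: all factors at their low setting
```

Even this toy example shows why screening many variables one at a time quickly becomes impractical, and why structured designs matter.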

Plug-and-play solutions for complex experimentation

In conducting these screenings, Kaja and his colleagues apply a number of what he calls plug-and-play experiments to both the fermentation and downstream processing stages. Scientists in the bioprocessing group look at upstream variables such as media, temperature, pH level, feed rate and fermentation time. “There are a number of DOEs you can do around those parameters to optimize the process,” Kaja explains. Focusing on one of the downstream stages: “We look at various conditions such as temperature and additives during the clarification steps,” he explains. “We add a flocculating agent” – a substance that causes fine particulates to clump together and float to the top or settle to the bottom of a solution – “and look at the design space to optimize the centrifugation step to maximize pelleting and reduce the amount of debris in the materials.” It’s all about identifying the optimal conditions at which to move a product to the next stage. “Once we know we’re getting the product we want at the right quality level, we can shift our attention up to robustness experiments.”
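The design-space idea behind the centrifugation example can be illustrated with a one-factor slice: fit a quadratic response curve to recovery data and locate its peak. The doses and recovery figures below are invented for illustration, and a DOE platform like JMP would fit such response surfaces across many factors simultaneously.

```python
import numpy as np

# Hypothetical screening results: flocculant dose (g/L) vs. pellet
# recovery (%) along one axis of the design space (illustrative numbers).
dose = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
recovery = np.array([62.0, 78.0, 86.0, 84.0, 75.0])

# Fit a quadratic response curve by least squares.
c2, c1, c0 = np.polyfit(dose, recovery, 2)

# The fitted curve peaks where its derivative is zero: dose* = -c1 / (2*c2).
optimal_dose = -c1 / (2 * c2)
print(round(optimal_dose, 2))  # ≈ 1.19 g/L for these made-up data
```

Finding the interior maximum of a fitted response surface, rather than simply picking the best observed run, is what “looking at the design space” buys you.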

By reducing the number of tests required to achieve the desired outcome, this plug-and-play DOE approach in JMP can save Kaja and his team a significant amount of time. “Of course, you need to get to know the software first,” Kaja concedes. “But once you no longer have to think about an experimental design and how to set it up, analyzing the results takes the spotlight and there the sparks are to be found that inspire and drive the breakthroughs.

“Not all of our data, however, comes from plug-and-play experiments; sometimes we are dealing with large volumes of data – where interrelated factors are numerous – and this introduces its own challenges. JMP, with its multivariate capabilities, comes in handy with our high-throughput equipment, which generates a lot of this data type. It’s much better if we can match our high-throughput data generation with high-throughput, statistically significant data mining techniques. JMP gives you lots of options, and that allows you to do more.”

Furthermore, Kaja says, the beauty of JMP is that it’s more than just a DOE tool; the software’s robust visualization capabilities make it a go-to in other areas of the business as well. “People who don’t understand DOE can just open up JMP and use Graph Builder for data visualization. With Graph Builder, within two minutes you’ve achieved something that would have taken two hours in Excel.” Kaja still uses Excel for certain tasks, but for deeper insights into the data, JMP provides a far more expeditious solution. “When you analyze a data set in Excel, you might use bar charts or spider diagrams, for example. You’ll spend time doing that, but still the insights don’t jump out at you,” he says. “To really visualize your data properly, you need to look at it from different perspectives. In JMP, because you can drag and drop and visualize your data in this way, you can really understand your data.”

Inspiring confidence in technology transfer

These efficiencies have translated into real business impact, Kaja says. The group’s use of rich statistical methods breeds confidence in technology transfer – moving a process from a pilot plant to the partners that produce products at ton scale – and helps in reducing attrition during the transfer itself. “We use JMP to give us confidence in our decision making. If we weren’t using DOE and JMP, we wouldn’t be able to characterize our processes and demonstrate that we’re minimizing our failure rate and increasing our robustness. The designs show us that our processes are robust and ready for transfer,” Kaja says. “And because more and more of our partners are talking in terms of DOE, they can trust our processes.”

As these processes – and the products they make – scale up, so too does the data. But Johnson Matthey is prepared to meet these challenges head-on. “In five years, we’ll have so much information on so many different genes, so many different process conditions, so many different quality attributes and process parameters,” Kaja concludes. “That’s a massive challenge, but also a delight to think about.”

The results illustrated in this article are specific to the particular situations, business models, data input and computing environments described herein. Each SAS customer’s experience is unique, based on business and technical variables, and all statements must be considered nontypical. Actual savings, results and performance characteristics will vary depending on individual customer configurations and conditions. SAS does not guarantee or represent that every customer will achieve similar results. The only warranties for SAS products and services are those that are set forth in the express warranty statements in the written agreement for such products and services. Nothing herein should be construed as constituting an additional warranty. Customers have shared their successes with SAS as part of an agreed-upon contractual exchange or project success summarization following a successful implementation of SAS software.
