The Role of Culture in Bioprocess Development

[Image credit: Rdikeman at the English Wikipedia, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=70797]

What makes a national football team successful? Is it the technique of the individual players, or the strength of the team? And most importantly, how best to compare bioprocess development to the glamor of football?

Just in time for the World Cup, the Economist published a statistical review of all national football team results in order to parse out which factors drive the goal differential between countries. Interestingly, but perhaps unsurprisingly, up to 40% of the variation was explainable by culture, government support, and stable organization of the national football league. While the remaining 60% presumably lies with the talent and development of the team, a surprising share of a football team's success can be traced not to technique, but to the cultural and social management of the environment in which the team plays.

Dare we compare biotechnologists to football stars? Sure, let's have a go: first of all, because we are, on average, at least as good-looking as football stars, if not better. Secondly, because success in bioprocess development depends on much more than having the best hardware and software systems available.

Organization of bioprocess data and data analytics is Exputec's bread and butter. However, no matter how solid our collaborative results are, if the culture and management around bioprocess development are not sound, the results will either not be accepted or, if accepted, will not be carried forward.

In creating a state-of-the-art bioprocess development department, culture and organization are at least as important as all hardware and software systems combined. Here are some key principles, drawn from our experience in process development, that will ensure a sustainable, advanced development environment:

Structure the bioprocess experimental approach from the beginning

The first thing development department heads must ward off is the tendency, in early development phases, to play around with unstructured experiments. Developers will often say: a bit more oxygen here, bump the temperature there, and see what happens. It cannot really be done any other way, right?

We believe that structuring the early process development experiments is as critical as the later, more formal process characterization studies. Planning out even the earliest runs in any given unit operation or piece of equipment generates a number of advantages:

  • The data remains organized so that it may be used in the creation of the digital twin
  • The data is more likely to be stored in a format usable for future modelling
  • The results are more likely to be comprehensible for knowledge transfer to future development projects
  • The results can be used to augment experimental designs in the characterization phase

Say, for example, that with absolutely zero prior knowledge we set up a screening DoE with a maximum number of settings. Let's further assume that half of these settings lead to complete failure to produce results. No problem! With clever use of statistical techniques and software packages, we can either rerun the experiments in a constrained D-optimal design, or we can augment further experiments with full knowledge of the edge of failure. Moreover, the remaining half of the DoE can still be evaluated for the main effects of the remaining parameters. This is a win/win.
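To make the rerun scenario concrete, here is a minimal sketch in Python (numpy only). The two factors, the failed corner of the design space, and the greedy point-exchange routine are all illustrative assumptions on our part, not a prescription; dedicated DoE software will handle this far more thoroughly.

```python
import itertools
import numpy as np

def model_matrix(points):
    """Expand coded (temperature, pH) settings into an intercept,
    main-effects and interaction model."""
    t, p = points[:, 0], points[:, 1]
    return np.column_stack([np.ones(len(points)), t, p, t * p])

# Candidate grid over the coded factor space (-1 .. +1 per factor).
levels = np.linspace(-1, 1, 5)
candidates = np.array(list(itertools.product(levels, levels)))

# Hypothetical edge of failure learned from the first screening round:
# the high-temperature / high-pH corner produced no usable results.
feasible = candidates[~((candidates[:, 0] > 0.5) & (candidates[:, 1] > 0.5))]

# Greedy D-optimal point exchange: keep adding the candidate that most
# increases det(X'X) until the run budget is spent.
n_runs = 8
design_idx = []
for _ in range(n_runs):
    best_det, best_i = -np.inf, None
    for i in range(len(feasible)):
        if i in design_idx:
            continue
        X = model_matrix(feasible[design_idx + [i]])
        # A small ridge keeps the determinant defined while the design
        # is still rank-deficient in the first few iterations.
        det = np.linalg.det(X.T @ X + 1e-9 * np.eye(X.shape[1]))
        if det > best_det:
            best_det, best_i = det, i
    design_idx.append(best_i)

print(feasible[design_idx])  # the constrained follow-up design
```

The same candidate-set idea carries over to augmentation: keep the surviving runs fixed in the design and let the exchange step only add new points.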

This change must be both cultural and organizational. From our perspective, the whole department should understand that no experiment should be run without the big picture in mind. If correctly supported by management, this change can be extremely effective and surprisingly sustainable.

Design of Experiments (DoE) unless otherwise justified

To put it bluntly: in this age of bioprocess development, DoE should be the rule, not the exception, even in the earliest phases. We simply have too little time and too few resources not to wring every last piece of information out of every experimental plan. Are OFATs always the wrong choice? No! (see the next point). But they should always be the second choice to a DoE.
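As a quick illustration of how little effort a structured screening plan actually takes, here is a sketch in Python (numpy only) of a 2^(4-1) fractional factorial with a few center points. The four factors and coded units are assumptions for the example, not taken from any specific process.

```python
import itertools
import numpy as np

# Full 2^3 factorial in coded units for factors A, B and C ...
base = np.array(list(itertools.product([-1, 1], repeat=3)))

# ... with the fourth factor aliased as D = ABC, giving a 2^(4-1)
# resolution IV design: eight runs instead of sixteen, with main
# effects clear of two-factor interactions.
design = np.column_stack([base, base.prod(axis=1)])

# A few center points to estimate pure error and check for curvature.
center = np.zeros((3, 4))
screening_plan = np.vstack([design, center])
print(screening_plan)
```

Eleven runs, and every one of them feeds a model rather than a hunch.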

All scientists must be on board with this approach, or some will slip back into the old ways and slow down the development process. Management must support their scientists in staying up to date with the most efficient design approaches and must constantly warn against inefficient experimental plans.

Elegant DoE designs are also getting easier and easier to generate. Some experts even skip the choice between classical designs altogether, instead running optimal-design algorithms for any given configuration, allowing for a complete standardization of design generation (see Jones and Goos).

Software set-ups in high-throughput systems are also increasingly automating the design and execution of DoEs to maximize information at micro scale.

Simply put, there is no reason why DoEs should be considered exotic or mysterious. They should be the standard, accepted by every developer in the department.

OFATs done correctly

Many developers will correctly point out that OFAT (one-factor-at-a-time experimentation) does not equal bad design. These developers should be encouraged, but only within the overall DoE framework of the department.

There are situations which definitely call for an OFAT approach to process parameter modelling. However, even in these cases, we must consider what the future use of the results will be. If we are simply testing a single point in space, we are not actually gaining information. We have no idea what the variation is, or how often we will be able to repeat that result. We learn almost nothing.

However, if we cleverly modulate one single factor (and even replicate it) over a specific range, we not only learn more about the behaviour of that parameter (at least in a univariate space), but we can also still fit this simple model into later, more complex models, such as the digital twin (more on that in a minute).
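Here is a minimal sketch of such a "correct OFAT" in Python; the parameter, range, and responses are entirely made up, and the quadratic model is just one reasonable choice for univariate behaviour.

```python
import numpy as np

# Hypothetical OFAT study: one parameter (say, induction temperature in
# coded units) modulated over a range, with replicates at each setting.
temperature = np.repeat(np.linspace(-1, 1, 5), 2)  # 5 settings x 2 replicates
titer = np.array([2.1, 2.3, 3.0, 2.8, 3.6, 3.5, 3.2, 3.4, 2.5, 2.7])  # invented

# A simple quadratic fit captures the univariate behaviour ...
coeffs = np.polyfit(temperature, titer, deg=2)

# ... and the replicates provide an estimate of pure error, which a single
# point-in-space experiment could never give us.
residuals = titer - np.polyval(coeffs, temperature)
print("quadratic coefficients:", coeffs)
print("residual std (pure-error proxy):", residuals.std(ddof=3))
```

The fitted curve and its error estimate can later be merged into the multivariate picture instead of being thrown away.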

We are going to need all our data to maximize process understanding; therefore, we must ensure that every development run is useful and usable in later modelling.

Structure and store the data and models for future use

The data must be available to everyone who needs it. If a company remains in a silo mentality, where each scientist keeps their experimental plans and results in an Excel spreadsheet in a local folder, that culture will without a doubt lead to underperformance of the department.

Most companies are now on the road towards central organization of data, even in development. However, controlling and filtering development data for active use, both in the current development project and in future projects, is critical.

Digital Twin: Reap the rewards of your effort

If you have come this far, it is time to be rewarded by compiling all your data into a digital twin. The digital twin is simply a synthesis of all active process models, resulting in a complete in-silico version of your process.

By the end of process characterization, and including all data from even the early stages of development, you should be able to construct this virtual version of the process with relative ease and use it to do the following (a small sketch follows the list):

  • Establish the correct normal operating ranges (NORs)
  • Warn against likely out-of-specification (OOS) results
  • Predict outliers in real time
  • Reduce useless deviations
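As a sketch of what the first two points can look like in practice, the snippet below pushes assumed operating variation through a stand-in response-surface model via Monte Carlo and reports the predicted OOS rate inside a candidate NOR. The model coefficients, set points, and specification limit are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def predicted_titer(temp, ph):
    """Stand-in for the digital twin: a fitted response-surface model.
    Coefficients are illustrative, not from real data."""
    return 3.0 + 0.4 * temp - 0.3 * ph - 0.5 * temp**2 + 0.2 * temp * ph

# Candidate NOR, expressed as set point +/- expected operating variation
# (coded units), sampled 10,000 times.
n = 10_000
temp = rng.normal(loc=0.2, scale=0.15, size=n)
ph = rng.normal(loc=-0.1, scale=0.10, size=n)

titer = predicted_titer(temp, ph)

spec_lower = 2.5  # hypothetical lower specification limit
oos_rate = np.mean(titer < spec_lower)
print(f"predicted OOS rate inside this NOR: {oos_rate:.2%}")
# If the rate is too high, tighten the NOR or shift the set points and rerun.
```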

The digital twin should become the end goal of the development team, and completing it should be considered a huge victory. This is no easy task, but the results are beautiful, and the digital twin serves as a very clear end-point that can be celebrated as such. Don't simply hand over a development report. Wow the group and management with the stunning plots and predictions of the digital twin.

Transfer your knowledge to the next process

Lastly, the teams must work together to bring the process information derived from all this work forward to the next project. Even vastly different processes can still benefit from the experience and technical achievements of the previous project.

For example, by leveraging the development database, one could notice that a single critical process parameter (CPP) nearly always behaves in the same way, despite the different processes. This could serve as the basis for a standard experimental configuration for the optimization of this parameter.
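As a sketch of the idea, assume the development database can be boiled down to one fitted effect per project for the CPP in question; the project names and numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical extract from a development database: the fitted linear
# effect (coded units) of the same CPP, e.g. dissolved oxygen, per project.
fitted_effects = {
    "project_alpha": 0.42,
    "project_beta": 0.38,
    "project_gamma": 0.45,
    "project_delta": 0.40,
}

effects = np.array(list(fitted_effects.values()))
cv = effects.std() / abs(effects.mean())  # relative spread across projects

# A small relative spread across unrelated processes suggests the parameter
# behaves consistently, justifying a standard experimental configuration.
if cv < 0.15:
    print(f"CPP effect is consistent (CV = {cv:.0%}); reuse the standard design.")
else:
    print(f"CPP effect varies across processes (CV = {cv:.0%}); design per project.")
```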

This can only work if the culture of the development team makes time to talk through the lessons learned at the end of each project, and if the data is allowed to be shared between groups. We are all in this together: as much as Ronaldo dominates the Portuguese team, he still cannot play alone.

Conclusion

The talent of the development team, and the techniques and strategies they use, are extremely important – just as World Cup teams require star players to make the biggest difference. But without the right culture and management surrounding the team, it is not possible to sustainably enable the important changes that the bioprocess industry requires.

If you would like to know more about our vision and approach to the organization of bioprocess development groups, send us an email.
