
5 Things to Consider When Creating or Expanding HTE Labs

July 15, 2021
by Sanji Bhal, Director, Marketing & Communications, ACD/Labs

Guest contributors: John F. Conway & Ralph Rivero, 20/15 Visioneers; Nikki Dare, ACD/Labs

The pharmaceutical industry has renewed its interest in high-throughput experimentation (HTE) over the last few years. Leaders see it as a way to get medicines to patients faster, but they’re also attracted by the sheer quantity of data generated in parallel experiments. Such data could feed the information appetite of artificial intelligence (AI) and machine learning (ML) algorithms, leading to new predictive insights. In addition, regulatory agencies are asking the scientific community for more rigor and repeatability, a gap that HTE could also fill.

“HTE delivers more results with a similar level of effort, and dramatically improves the rigor that can be lacking with manual execution of experiments by scientists.” So says John Conway, an industry consultant in life-science informatics for more than 20 years. “It provides highly reproducible experimentation and data.”

But how and where should HTE be set up for maximum success? If you’re interested in expanding your investment in parallel experimentation, or thinking about investing for the first time, consider these questions.

  1. How will you manage change?
  2. Should HTE be implemented in discovery or development?
  3. Will you provide HTE to all chemists, or set it up as a service?
  4. How will you handle the data?
  5. What’s your supporting IT landscape?

1. How will you manage change?

Change must be managed for the best results. Perhaps you’re introducing HTE to new groups or departments, or you’re introducing new hardware and software to existing HTE teams. In both cases, successful adoption requires considering people, process, organization, and culture.

“Adopting HTE requires a change in mindset and that is always a challenge,” says Nikki Dare, Solution Manager for HTE management software Katalyst D2D. “If you don’t have buy-in from the chemists, it’s not going to go anywhere because there are inevitable teething problems. Change must be introduced with the users, as well as business goals, in mind. When problems arise, there has to be a plan to deal with those problems.”

Ralph Rivero, a 30-year pharma veteran and medicinal chemist with experience in drug discovery, adds, “Chemists not currently doing parallel experiments are used to doing iterative experiments. [They’re] convinced that each reaction will inform the next. Effectively changing that thinking and expanding to an HTE way of working requires change management. The ROI [return on investment] from HTE is not always immediate. While a high-throughput experiment may uncover a solution to an immediate problem, the real return is in the volumes of data generated. The return might be months from now, when some of that data informs another experiment that you would have never thought of. And that can be hard for people to understand at the outset of a project.”

Tactics for success include:

  • Explaining the reason for the change, and spreading relevant information through the organization
  • Involving stakeholders early on
  • Deploying changes with small groups of users who can later train their peers
  • Choosing tools purpose-built for HTE. For example, Katalyst D2D software manages HTE experiments from set-up to analysis, making scientists’ workflows clearer

2. Should HTE be implemented in discovery or development?

HTE works in both discovery and development. Where you want to start depends on your organizational goals.

In discovery, work is dynamic, and you’re setting up experiments to optimize as broadly as possible. A single 96-well-plate experiment can explore substitution of a particular group on a molecular scaffold, saving days of work for multiple medicinal chemists running reactions one at a time.
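To make the scale concrete, here is a minimal Python sketch of how such a plate might be laid out, crossing 8 catalysts against 12 bases so that one plate covers all 96 combinations in parallel. The factor names are hypothetical placeholders, not a depiction of any particular screening design or of Katalyst D2D.

```python
from itertools import product
import string

# Hypothetical screening factors: 8 catalysts x 12 bases = 96 wells
catalysts = [f"Catalyst-{i}" for i in range(1, 9)]   # plate rows A-H
bases = [f"Base-{j}" for j in range(1, 13)]          # plate columns 1-12

plate_map = {}
for (row, catalyst), (col, base) in product(enumerate(catalysts), enumerate(bases)):
    well = f"{string.ascii_uppercase[row]}{col + 1}"  # "A1" ... "H12"
    plate_map[well] = {"catalyst": catalyst, "base": base}

print(len(plate_map))    # 96 conditions run side by side in one experiment
print(plate_map["H12"])  # {'catalyst': 'Catalyst-8', 'base': 'Base-12'}
```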

In development, you’re trying to achieve high reproducibility and optimize fewer parameters. You also need to think about transfer of technology and knowledge to manufacturing.

“I think the development chemists have accepted high-throughput experimentation quicker than the discovery chemists,” says Ralph. “Discovery still relies a lot on individual experiments. But the execution and capture of bespoke experiments across multiple labs in an organization is often ineffective, meaning information that could inform future experiments gets lost.”

“The highly regulated environment of development forces those scientists to accept that reproducibility is important. While automation and HTE will not guarantee it, you’re moving in the right direction.”

3. Will you provide HTE to all chemists, or set it up as a service?

Organizations are taking two approaches to HTE labs. Some are making high-throughput experimentation available to all their chemists (democratized HTE), while others are building expertise within a small group to provide HTE-as-a-service. A few organizations are establishing themselves as service providers for outsourced HTE. So, what is the right approach? It differs by organization.

“I’ve seen companies succeed with both these approaches,” says Nikki, “but in the discussions we are having, high-throughput experiments are being done internally. Maybe this will change in the future. Democratization works well when you have people [who] are willing to spend the time to implement the processes in a very user-friendly way. Some of the adopters of Katalyst D2D work in a ‘core facility’, while others work in a more decentralized, ‘open access’ environment.”

4. How will you handle the data?

In previous HTE waves, experimental data was disconnected from data-management and analysis capabilities.

“Even those generating HTE data today typically have to work with half-a-dozen systems and software interfaces to connect the data—to link LC/MS data or HPLC data to a specific experiment,” Nikki says. “The software that supports the workflow is really important in HTE because most solutions are very specialized, meaning you have data spread across multiple places and the chemist still needs to go and link it all together, which is a very manual and tedious process.”

Without a plan for linking analytical data to the original samples, teams quickly generate reams of disconnected information that takes days to work through. So, the expected productivity of HTE declines.

Data-management software can help. For ACD/Labs’ customers, Katalyst D2D simplifies data interpretation by connecting analytical results back to experimental set-ups. It also organizes the data in a shared database for widespread use.
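As a rough illustration of what that linkage means in practice, the sketch below models it as a keyed merge on plate and well identifiers. It uses pandas with invented plate, well, and column names; it is not a depiction of Katalyst D2D’s internals.

```python
import pandas as pd

# Experimental set-up: what was dosed into each well (hypothetical data)
setup = pd.DataFrame({
    "plate_id": ["P001", "P001", "P001"],
    "well":     ["A1", "A2", "A3"],
    "catalyst": ["Pd(OAc)2", "Pd(OAc)2", "Pd2(dba)3"],
    "base":     ["K2CO3", "Cs2CO3", "K2CO3"],
})

# Analytical results exported from LC/MS processing (hypothetical data)
results = pd.DataFrame({
    "plate_id":       ["P001", "P001", "P001"],
    "well":           ["A1", "A2", "A3"],
    "purity_percent": [92.1, 88.4, 45.0],  # product area % by LC/MS
})

# Link each analytical result back to the conditions that produced it
linked = setup.merge(results, on=["plate_id", "well"], how="left")

# Rank wells by purity to surface the best-performing conditions
print(linked.sort_values("purity_percent", ascending=False))
```

Once the conditions and the results live in one keyed table, ranking wells by purity or yield is a one-line query instead of a manual cut-and-paste exercise.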

“Accessibility of the data that’s generated is key to making HTE a real success,” says John. “Data must be properly captured and curated and ready for secondary use.”

In the long term, secondary use might include ML workflows. But organizations can only be “ML-ready” if their data is available, organized, and standardized.

5. What’s your supporting IT landscape?

To organize experiments and data, companies use various informatics tools. But how well do existing tools work for HTE? Electronic Lab Notebooks (ELNs) are the standard software for recording the scientific method and experiments, but many don’t fit the complex needs of HTE well because they were designed for single experiments. Laboratory Information Management Systems (LIMS) are the standard for managing sample data, but they don’t manage experiment method data well.

Data science applications use experimental results to create statistical models and design better experiments. But they first require the data to be structured and normalized. Design of Experiments (DoE) software such as JMP, for example, will build a model to predict the temperatures that produce the highest yield, but it can only do so if temperature is consistently recorded.
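As a toy illustration of why consistent recording matters, the sketch below fits a yield-versus-temperature model. The numbers are synthetic, and a simple NumPy quadratic fit stands in for a full DoE model such as JMP’s; the fit is only possible because every run logs temperature in the same field and units.

```python
import numpy as np

# Synthetic, consistently recorded screening data: temperature (°C) vs. yield (%)
temps = np.array([25.0, 40.0, 55.0, 70.0, 85.0, 100.0])
yields = np.array([31.0, 52.0, 68.0, 74.0, 70.0, 58.0])

# Quadratic response-surface fit: a toy stand-in for a full DoE model
a, b, c = np.polyfit(temps, yields, deg=2)

# The parabola's vertex estimates the yield-maximizing temperature
t_opt = -b / (2 * a)
y_opt = np.polyval([a, b, c], t_opt)
print(f"Predicted optimum: {t_opt:.0f} °C at ~{y_opt:.0f}% yield")
```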

“Three times in my career, I saw organizations trying to integrate HTE into the everyday chemist’s workflow,” Ralph recalls. “It failed when one component or another was overlooked or not put in place—in the infrastructure, the data handling, or in the ability of software to capture the information easily.”

“In R&D you have to capture the continuum—requests, samples, experimental methods, testing, analysis, and results,” adds John. “Informatics companies have made great strides over the last few years to try to manage both structured sample data, such as that managed in LIMS, and the unstructured data of bespoke experiments in ELNs. The ideal is a foundational platform for research with the ability to integrate the expert systems that go deep.”

No single system can handle everything. Robust integrations are the answer. When different data-management systems are smoothly connected, individual scientists are relieved of transcription-error risks and data-management burdens.

Software like Katalyst D2D provides these connections. It’s the missing piece that helps your existing IT infrastructure handle HTE data more smoothly. “We saw the gap that exists in the technology support for HTE,” says Nikki. “With Katalyst D2D, we are able to provide a more streamlined experience for our customers that have invested in HTE.”

Thank you to our guest contributors:

John F. Conway
Chief Visioneer Officer
20/15 Visioneers
Ralph Rivero
Senior Principal Visioneer
20/15 Visioneers
Nikki Dare
Katalyst D2D Solution Manager
ACD/Labs
