Derek Buhr is an Automation Engineer at Bruker Spatial Biology (BSB) with 8 years of experience developing and implementing consumables automation for the CosMx and GeoMx spatial biology platforms. His work at BSB has encompassed extensive liquid-handling programming and the development of a custom LIMS for inventory and workflow tracking of more than 100,000 unique oligonucleotides. Today, Derek will present how BSB has implemented the Dynamic Devices Lynx to dynamically normalize and create pools for R&D and commercial products, using database-produced inputs and Lynx-based outputs.

Transcript

It’s my pleasure to introduce Derek Buhr from Bruker. Derek is an Automation Engineer with eight years of experience at Bruker Spatial Biology.

Today, he’ll be talking about Dynamic Devices’ LYNX methods to dynamically normalize and prepare oligonucleotide pools from a 384-well plate with data I/O for automated LIMS updating.

It’s a pleasure to have you. Thank you!

All right. I need to choose a different headshot—that one is way too close up. There we go.

Click on that button… all right, yes. Thanks, James, for that introduction.

So yeah, essentially what the previous two presenters showed was a bit more advanced on the science side. What I’ll be talking about is more low-level—really just pooling and normalization—but how we’ve used it to significantly expand our capabilities, enabling much larger-scale options. We also use QC metrics to ensure the result is a solid pool.

Agenda

Our current agenda—what I’ll call the “menu” today—starts with a quick introduction to who I am. Then I’ll go into:

  1. What required automation: What process or workflow we were building or improving to meet our requirements.

  2. Why we chose LYNX: What features made LYNX the right choice over other potentially cheaper systems.

  3. How we did it: What modifications we made to the LYNX system, how we tested and optimized the workflow, and finally,

  4. Assessment: Did it actually work?

Background

As mentioned, I’m from Bruker Spatial Biology. If you’re familiar with the spatial biology field, you know it’s seen major changes recently. Bruker Spatial Biology is a collective of several acquisitions by Bruker, including:

  • Canopy Biosciences with their CellScape platform,

  • NanoString Technologies, where I originally worked, and

  • Our core product lines: the original nCounter Digital System, and the GeoMx and CosMx platforms—newer, advanced systems for spatial biology.

Together, under Bruker Spatial Biology, we’re pooling knowledge and efforts to push the field forward.

Why We Needed Automation

Visually, from the left side of your screen: what are we pooling from? 384-well plates in SBS format. Each well contains a molecule. When pooling, we combine selected wells, each of which may hold a unique molecule.

We’re dealing with around 40,000 potential oligos and around 120 pools. So scale is a major factor. We want to normalize at the same time to save time. If we can normalize and pool in a single step with a 96-head, we’re set.

We also wanted to maintain a consistent worklist format that integrates well with other instruments. On-screen, you can see how we adjust volumes dynamically—our standard is a 5 µL transfer of 100 µM oligos, adjusted based on actual concentrations.
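The volume adjustment described above boils down to keeping the delivered moles constant: transfer whatever volume delivers the same amount of oligo as 5 µL of a 100 µM stock would. A minimal sketch of that arithmetic (illustrative only, not BSB's actual code):

```python
# Per-well volume adjustment: deliver the same moles as a 5 uL transfer
# of a 100 uM stock, regardless of each well's measured concentration.

TARGET_VOLUME_UL = 5.0   # standard transfer volume
TARGET_CONC_UM = 100.0   # nominal stock concentration

def adjusted_volume_ul(actual_conc_um: float) -> float:
    """Volume that delivers the same moles as 5 uL at 100 uM.

    moles = volume * concentration, so
    adjusted volume = (5 uL * 100 uM) / actual concentration.
    """
    if actual_conc_um <= 0:
        raise ValueError("concentration must be positive")
    return TARGET_VOLUME_UL * TARGET_CONC_UM / actual_conc_um

# A well measured below nominal needs proportionally more volume:
print(adjusted_volume_ul(80.0))   # 6.25
print(adjusted_volume_ul(100.0))  # 5.0
```

Wells above nominal concentration get proportionally less volume by the same formula.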

This meant we needed a 96-head capable of different volumes simultaneously and able to direct them into multiple pools.

Why LYNX?

The 96VVP head on the LYNX was a game-changer. It gave us the power to process pooling and normalization at scale with single-channel control times 12. The speed and flexibility were exactly what we needed.

We also gained diagnostic output through embedded scripting capabilities, enabling automated QC assessments and toggling built-in features.

Previously, we had a workflow like this:

  • Create worklist,

  • Run the worklist,

  • Find QC or log files,

  • Run a separate script,

  • Update our data in LIMS.

Now, thanks to LYNX, we embed scripts directly in the method. Everything is validated at once—method and script—ensuring consistency and eliminating version issues between users.

We also embedded direct database querying. We use Excel connected via ODBC (Open Database Connectivity) to manipulate our input file live and add QC checks as we run.
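The "query the database live" step follows a standard pattern: a parameterized query keyed on the plate, returning per-well QC values that feed the volume adjustment. The sketch below uses Python's standard-library sqlite3 as a stand-in for the ODBC connection, and the `oligo_qc` table and its columns are hypothetical, not BSB's actual schema:

```python
import sqlite3

def fetch_concentrations(conn, plate_barcode):
    """Return {well: measured concentration (uM)} for one plate.

    Illustrative only: the real workflow queries a LIMS over ODBC;
    sqlite3 stands in for that connection here.
    """
    cur = conn.execute(
        "SELECT well, conc_um FROM oligo_qc WHERE plate = ?",
        (plate_barcode,),
    )
    return dict(cur.fetchall())

# In-memory demo data in place of the real LIMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE oligo_qc (plate TEXT, well TEXT, conc_um REAL)")
conn.executemany(
    "INSERT INTO oligo_qc VALUES (?, ?, ?)",
    [("PLT001", "A1", 92.5), ("PLT001", "A2", 104.0)],
)
print(fetch_concentrations(conn, "PLT001"))  # {'A1': 92.5, 'A2': 104.0}
```

With an actual ODBC source, only the connection line changes; the query-and-map pattern stays the same.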

How We Did It

The LYNX system solved throughput issues, but we had to standardize input workfiles. This meant converting our format into the VMDI format LYNX uses.

This was the hardest part—writing a C# script to build the proper input. With help from Dynamic Devices, who provided a starter script, we expanded functionality to map volumes accurately based on QC feedback.
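In spirit, that conversion step is a mapping from in-house worklist rows (source well, destination pool, volume) into the instrument's input file. The actual VMDI schema belongs to Dynamic Devices and is not reproduced here; the column names below are placeholders, and the sketch only shows the shape of such a converter:

```python
import csv
import io

def convert_worklist(rows):
    """Render (source_well, dest_pool, volume_ul) rows as a CSV work file.

    Placeholder format only: the real target is the LYNX VMDI input,
    whose actual fields differ from these illustrative headers.
    """
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["SourceWell", "DestinationPool", "Volume_uL"])
    for well, pool, vol in rows:
        writer.writerow([well, pool, f"{vol:.2f}"])
    return out.getvalue()

text = convert_worklist([("A1", "Pool_07", 6.25), ("B3", "Pool_12", 5.0)])
print(text)
```

The QC-driven volumes from the normalization step would be computed upstream and flow into the `rows` input here.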

We also created a UI to track volumes in real time and integrate them into our spreadsheets and QC sheets. Now, when processing plates, we log everything and avoid errors like re-pooling the same plate—a critical protection when dealing with 120 plates.

We’ve eliminated human error by validating plates against a “completed” list. If someone attempts to re-run a completed plate, it errors out automatically. This prevents disastrous duplication—like pooling 38,000 out of 40,000 oligos and realizing you doubled a plate.
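The guard described above is a fail-fast check of the plate barcode against the completed list before any transfer starts. A minimal sketch, with illustrative names (not BSB's actual code):

```python
class PlateAlreadyPooledError(RuntimeError):
    """Raised when a plate on the completed list is run again."""

def validate_plate(barcode, completed):
    """Refuse to process a plate already pooled; otherwise record it."""
    if barcode in completed:
        raise PlateAlreadyPooledError(
            f"Plate {barcode} is already on the completed list; "
            "refusing to re-pool."
        )
    completed.add(barcode)

completed = {"PLT001", "PLT002"}
validate_plate("PLT003", completed)   # new plate: allowed and recorded
try:
    validate_plate("PLT001", completed)
except PlateAlreadyPooledError as err:
    print(err)                         # duplicate: errors out automatically
```

Erroring out before any liquid moves is what makes this cheap insurance against doubling a plate deep into a 120-plate run.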

Scientific Enhancements

Now, going back to volume tracking and QC updating—this was a major scientific improvement. With LYNX’s built-in toggles and grid options, we can manipulate volumes and tracking easily in real-time.