Robyn Laskowski is the Senior Manager of LGC Lab Services in Middleton, Wisconsin, with over a decade of experience in the Applied markets. She specializes in developing and implementing high-throughput genotyping technologies globally and has successfully navigated the challenges associated with them. In her current role, Robyn is focused on scaling LGC’s global service laboratories to deliver high-quality results that meet customer turnaround time and cost expectations. She is dedicated to continuously improving the customer experience while driving operational efficiency and innovation.
Transcript
As you’ve said, my name is Robyn Laskowski, and I work for LGC (or Biosearch) and run the North America Lab Services groups. We currently have one lab in Middleton, and we have one in Gainesville for a little bit longer.
Actually, I’ll come back and center us on LGC a little bit. LGC has been serving the science community for 180 years. It’s actually a spinoff from the UK’s Laboratory of the Government Chemist and is now its own private equity company. What we look to do is partner with our customers to find novel and critical solutions for what they need to help diagnose, treat, feed, and advance their programs to help protect the growing population, all within the framework of creating science for a safer world. This is the part of the organization that I belong to, the Diagnostics and Genomics piece, and over there you can see the laboratory services part of our org.
And what we do is basically take sample to data. We have customers in a variety of breeding applications, academic institutions, museums; you can think of a lot of different scenarios. What they’re looking for is somebody to help them get the data they need to advance whatever they’re studying or trying to manufacture, and we try to provide that service at the best quality, the lowest cost, and with quick turnaround times.
So when we look at the genotyping portfolio that LGC has to offer, there’s a variety of chemistries and a variety of different pieces of automation. But where the services part really comes into play is with the technologies of Amp-Seq, an amplicon-based sequencing kit that we launched last year at PAG, and Flex-Seq, a probe-based hybridization sequencing platform. We also have Capture-Seq, a standard GBS (genotyping-by-sequencing) offering, and a whole genome sequencing platform.
So really, it’s up to our customer to let us know what kind of information they want to derive and what they need it for, and we help work out which part of our portfolio actually fits them best.
In doing so, LGC announced last year that we were going to have two centers of excellence for our lab services group: one in Middleton, Wisconsin, which is just outside of Madison, and one in Berlin, Germany. We have a variety of offerings; we’ve still got some Sanger sequencing going on there, steady and strong. But then there are our new offerings, and again, it comes back to what the customer needs.
Where we were seeing a lot of growth was in what I would qualify as our mid-density genotyping: customers looking to do whole genome prediction or screening who needed a higher density of data. It couldn’t just be an endpoint assay where you get a single-SNP answer; they wanted greater depth to advance their programs.
But as you start to work on a global scale and try to build these platforms, you hit some hiccups, because we all had our own equipment and workflows that were similar, but not exactly the same. In becoming a global institution, we had to really work to harmonize that. So a lot of the work we’ve been doing over the past year has been harmonizing our processes and methods, and increasing our accuracy (both in terms of the data and our workflows), our dependability, and our overall sample throughput. It’s taking us from running maybe a couple of thousand samples a week to tens of thousands, hopefully getting us to processing over 100,000 samples per month for a variety of customers. Depending on the customer, one might have 20,000 samples a month and another might have 100, so we have to work all of this together in our workflow to find the most efficient way to do it, keep our costs down, and maintain good quality.
And that’s where the Lynx came into play.
I had previously used Lynx in a prior role and knew of its capabilities, so I reached out to Toby and the team to see how we could incorporate it into LGC, because I could see a lot of really good wins right from the start. First off, getting the LM1800, the biggest one you can get, makes sense when you’re thinking about scaling a lab: along with your tip boxes, you can fit 48 plates. It also gives us walkaway time; the technicians can get the run set up, hit go, make sure the light stays green, and then go off to do what only they can do with their hands. Having that extra time back for them is invaluable. The VVP head, which I’ll talk a little more about on the next slide, and which I think a lot of the speakers have talked about, has also been a pivotal piece for us in decreasing the time some of our processes took, because of those independent channels.
And not to minimize the 384-well head; it’s still strong and steady, because we do a lot of rearraying, a lot of movement, and a lot of additions into 384-well plates, again coming back to that scale.
As I think others have talked about too, there’s the LIMS integration: being able to take files from the Lynx, put them into our LIMS, and vice versa, so that we have better tracking and traceability and we’re telling the instrument exactly what we need it to do. It takes a human component out of it, which is very helpful, especially once you’re scaling.
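To make the file exchange concrete, here is a minimal sketch of what passing worklists and results between a LIMS and a liquid handler can look like. The column names and CSV layout here are hypothetical illustrations, not the actual Lynx or LGC LIMS file formats.

```python
import csv
import io

# Hypothetical worklist format - real instrument/LIMS file layouts will differ.
def write_worklist(records, fh):
    """Export LIMS sample records as a worklist file the liquid handler can run."""
    writer = csv.DictWriter(
        fh, fieldnames=["sample_id", "src_well", "dst_well", "vol_ul"]
    )
    writer.writeheader()
    writer.writerows(records)

def read_results(fh):
    """Import the instrument's per-well log back into LIMS-shaped dicts."""
    return list(csv.DictReader(fh))

# Round-trip one record through an in-memory file
buf = io.StringIO()
write_worklist(
    [{"sample_id": "S1", "src_well": "A1", "dst_well": "A1", "vol_ul": 25}], buf
)
buf.seek(0)
print(read_results(buf))
```

Because both directions go through structured files rather than hand-typed entries, the "human component" the talk mentions drops out of the transfer step.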
Coming back to the globalization piece, there’s the method harmonization: knowing that when we say we’re going to normalize this set of plates, the lab in Germany is doing it the same way. If we have to troubleshoot, if something doesn’t look right, we know exactly what method they’re using so that we can help them, and conversely they can help us.
And then lastly, the barcode utilization. Within the LIMS piece, having the barcode scanner on the Lynx helps us track what we’re doing and where we’re doing it, with the methods written to stop if a barcode doesn’t match what it should be. It also provides the trackability and traceability that certain customers are looking for, almost continuing that chain of custody throughout our process. So again, if anything happens and we need to troubleshoot, we can pin things down to those moments in time.
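As a rough illustration of the kind of stop-on-mismatch check described above, a method-level barcode gate can be sketched as below. The function name, deck positions, and barcodes are made up for illustration; this is not LGC’s actual method logic.

```python
def check_barcode(position: str, scanned: str, worklist: dict) -> None:
    """Halt the method if a scanned plate barcode doesn't match the worklist.

    `worklist` maps deck positions to the barcodes the LIMS says should be there.
    """
    expected = worklist.get(position)
    if expected is None:
        raise RuntimeError(f"No plate expected at deck position {position}")
    if scanned != expected:
        raise RuntimeError(
            f"Barcode mismatch at {position}: scanned {scanned!r}, "
            f"expected {expected!r} - stopping run to preserve chain of custody"
        )

# Made-up barcodes for illustration
worklist = {"P1": "LGC-0001", "P2": "LGC-0002"}
check_barcode("P1", "LGC-0001", worklist)  # matches, so the run continues
```

Because every scan is checked against what the LIMS expects, any plate that ends up in the wrong place stops the run at that moment rather than propagating downstream.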
So, here’s just a quick picture of the VVP head. You can see the different volumes of blue dye in the tips, and then our very large deck layout, which they changed to an LGC teal for us as it got installed.
So hopefully… there we go.
So, I have the video of the VVP head playing, because this was probably one of the pieces that improved the technicians’ quality of life very quickly. Compared to the piece of automation that we had before, we reduced the time to do a quantification and a normalization using the VVP head by 75%.
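For context on what a quantification-and-normalization step actually computes, here is the generic arithmetic (not LGC’s specific method): given each well’s measured concentration, independent channels can dispense a different sample and diluent volume per well to bring every sample to the same target concentration.

```python
def normalization_volumes(conc_ng_ul, target_ng_ul, final_ul):
    """Per-well (sample, diluent) volumes to hit a target concentration.

    Wells already at or below the target can't be concentrated further,
    so they get sample only and no diluent.
    """
    volumes = []
    for conc in conc_ng_ul:
        if conc <= target_ng_ul:
            volumes.append((final_ul, 0.0))  # too dilute to normalize down
        else:
            sample = final_ul * target_ng_ul / conc  # C1*V1 = C2*V2
            volumes.append((round(sample, 2), round(final_ul - sample, 2)))
    return volumes

# Normalize three wells to 10 ng/uL in a 50 uL final volume
print(normalization_volumes([40.0, 10.0, 5.0], 10.0, 50.0))
# -> [(12.5, 37.5), (50.0, 0.0), (50.0, 0.0)]
```

Because each well needs a different diluent volume, independent channels can do the whole plate in one pass instead of well-by-well, which is where the kind of time saving described above comes from.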
It was a very, very large win. I mentioned it to some of y’all at dinner last night: when my German colleagues told me how much they liked this, that was a gold star, a big win for me personally, because they recognized how much more efficient and how much more accurate it was.
The same goes for the 384-well head, or excuse me, the rearray process where you go from four 96-well plates to a 384-well plate: again, there’s the precision, the accuracy, and the speed with which the Lynx can work with a much larger deck.
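The four-96-to-one-384 rearray typically follows a quadrant interleave, where each source plate fills every other row and column of the destination. This is the common convention, not necessarily the exact map used here, and the helper names are made up for illustration:

```python
def quadrant_map(plate_index: int, row: int, col: int) -> tuple[int, int]:
    """Map well (row, col) of 96-well plate plate_index (0-3) to a 384-well plate.

    Quadrant interleave: plate 0 -> A1 quadrant, plate 1 -> A2,
    plate 2 -> B1, plate 3 -> B2 (rows and columns 0-indexed throughout).
    """
    return 2 * row + plate_index // 2, 2 * col + plate_index % 2

def well_name(row: int, col: int) -> str:
    """Convert 0-indexed (row, col) to a name like 'B2'."""
    return f"{chr(ord('A') + row)}{col + 1}"

# A1 of each source plate lands in the top-left 2x2 block of the 384 plate
for q in range(4):
    r, c = quadrant_map(q, 0, 0)
    print(f"plate {q + 1} A1 -> {well_name(r, c)}")
# plate 1 A1 -> A1, plate 2 A1 -> A2, plate 3 A1 -> B1, plate 4 A1 -> B2
```

Writing the mapping down once like this is also what makes the method harmonizable: every site computes the same destination well for the same source well.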
You can set it and forget it. The Ronco phrase might be a little outdated, and I might be showing my age when I say it, but you just get it going and, again, you can walk away.
The durability has also been phenomenal compared to what we were using before, and there’s the quality of life for the technicians there again. We just don’t have the same sort of breakdowns or unrecoverable crashes where a technician has to come get things fixed. If anything, it’s an email to one of the engineers, they walk you through it, and we’re back up and running, typically within just a couple of hours. That’s a big improvement over before, when it was sometimes a week until somebody could get onsite to help us.
And in doing all of these pieces and getting the Lynx integrated into our labs, this was a pretty big point in increasing that sample throughput. With this instrument alone, we increased the number of samples we could process globally by about two and a half times, just by getting the Lynx into play.
So there were a lot of quick wins, in my opinion, from getting the Lynx into play, and there’s certainly a lot more that we’re developing. But the piece that I’m also excited about is the future opportunities, not just from an LGC perspective, but also as we continue to work with Dynamic Devices and find solutions together.
So the first one is: 384-well plates are pretty standard within lab service operations, but can we go smaller? Can we push more samples through? Can we miniaturize these reactions to reduce costs even more, given the precision of that 384-well head?
So, as I think I mentioned, sbeadex Lightning is an extraction chemistry that LGC released last year. sbeadex might be more commonly known, as it’s been around for a while, but with the Lightning process, for some tissues, you can go from sample to DNA within five minutes.
Within that, Dynamic has also been brought into play, running sbeadex Lightning on its manifold. Toby and Abby actually presented with LGC at PAG within the past couple of weeks about having a Lynx run this chemistry and the benefits thereof, because it can also process sequentially. It’s not putting everything in the beads and letting it sit; there’s that sequential processing. So again, it’s really maximizing the way you’re able to write the methods and utilize the full deck of the system to increase the throughput of those extractions.
And then there’s another product that LGC came out with and announced, I think at the beginning of the year: another Amp-Seq product called Amp-Seq One.
And one of the things we’ve realized is that, while this is an LGC product, when we continue to work with other organizations like Dynamic, we can get the methods written for customers and say: hey, we know it’s already validated on a Lynx. We can give you the method; we know it works; we’ve vetted it. That helps us sell some of the product, but it also helps continue to build the relationship we have with Dynamic, showing that it’s a known and trusted entity for doing these things in their own lab space.
So, I guess my thoughts and feelings are that this has been a really beneficial partnership, especially over the past years, from both sides, and I really look forward to seeing what else we can do together when we bring together the LGC chemistries and offerings and the great engineering that Dynamic has done, and I believe will continue to do, for us in the future.
So, I made it really short and sweet. Hopefully I can answer any other questions, certainly regarding LGC technology. But I really want to give a shout-out to Toby, Abby, and Cory; they were very instrumental in us getting the Lynx up and operational globally. It was a feat to make sure that we were all working in the same way, but they’ve been nothing but a phenomenal team in helping us do that lift. So, thank you very much.