With 25 years of experience in automation, Kevin Kho brings a unique multidisciplinary background in science and engineering to every project. His expertise spans both in-house and field automation roles, providing him with a comprehensive perspective on delivering tailored solutions.
As the Principal Automation Engineer at Alpine Bio for the past 2.5 years, Kevin has been instrumental in driving innovation and efficiency. Known for his empathetic approach and support-driven mindset, he is a true jack of all trades, seamlessly bridging technical excellence with practical problem-solving.
Transcript
Good afternoon, everybody. Thankfully, the construction crew seems to be on their lunch break right now, so that's one less thing to worry about.
So, real quickly: 25 years. It's been a while. I've had the blessed opportunity to work with a variety of laboratories, from field roles all the way to large, fully autonomous labs and work cells, to where I am now with Alpine Bio, a very small, humble South San Francisco lab working to make proteins using soy as its biofactory. Marketing and leadership got a bit of a hold of this presentation, so a lot of it looks very marketing-driven.
I'll be focusing more on what I say than on what's conveyed in the slides, so I'll do my best with that. As I mentioned before, we are a molecular farming company, best known for casein: we make that milk protein for cheese without deriving it from a cow. I wanted to take a step back, much as Derek and Jason alluded to, and talk about why we adopted the Lynx and the choices that led to it. The challenge was: how do we create an automation setup that works for two different departments with varying needs? Future-proof it, make it seamless for them to use, and address all their workflow needs. And, of course, remove my own personal biases as much as I could from the decision-making process, because this was one of the few times where I actually had the opportunity to select everything associated with our automation setup.
With that in mind, these were some of the requirements I came to understand and appreciate from the scientists for whom I was selecting these devices. We knew we had to get a small footprint; again, small lab. I wouldn't even call the first lab we started with a lab, it was more like an office space converted to a lab, so it had to be small.
I knew I needed to get a gripper, because we were using magnetic bead separations for the genetic screening group, so I needed to be able to take deep-well plates on and off of a magnet.
I knew I needed a 96-well head because, again, deep-well plates: I had to remove the waste supernatant from them. Yet I still needed to find a way to do normalizations; our protein screening team needed those. So I'm thinking, okay, I need a 96-well head, and I'll likely need some sort of independently volume-controlled channels to accomplish that. The volume range is pretty wide: large volumes, small volumes, different needs for different teams. And this was sort of a new thing, the integrations. There were some additional devices I spec'd out that I thought would be worthwhile to put onto this instrumentation, and I wanted the system to be able to control those devices seamlessly as well.
And then lastly, some basic nuances. I'm still a scientist at heart; I've grown up through the ranks, I'm a principal automation engineer now, but I started on the bench. It was very important to me to be able to see what was going on. I don't have that blind faith, so I can't use black tips; that already started weeding some things out. I needed to be able to see what was going right or wrong, and I needed to work with a very responsive and responsible support team. That was very important to me.
So, a little bit of a throwback. As I was doing this research, I narrowed it down to about four automation vendors out of the slew available, and Dynamic Devices came up. It came up from a conversation: Jeff Hurwitz, who used to work for Dynamic Devices, invited me over to his booth and told me about the VVP head. So as I was doing this research, I looked up the Lynx and saw that, wow, this could actually work. It would cover the normalizations I needed, it's a 96-well head, it ticked off all the boxes I was looking to achieve.
At first, the LM730i was what we were looking at, because it was the smallest footprint available with VVP capabilities. It could come with a gripper, and it was a 96-well VVP head; sorry, I got a little ahead of myself. Also, it doesn't have rubber O-rings on the outside. No offense, I just really don't need to worry about those things, that additional maintenance on the instrument. I need something robust, something that's not going to break down on me or have to be attended to every six months; you know where I'm going with that.
Again: the variable 96-well head, the volume range the VVP touts, and the integrations I have on here. I was hoping to get a shaker on there, because we were doing, again, the magnetic bead cleanup. I will profess that we were using the Omega Bio-tek Plant DNA Extraction Kit, which required a lot of cleanup, a lot of reagents to go through to get our DNA of interest. And forgive me, I didn't catch your name, but you had a very good question about the 32 samples. I'll address that with the offline 8-channel reagent dispenser, which is what I determined would be fruitful for that particular application and the request the users had.
And then lastly, the nuances: it works with clear tips, beautiful. This head is a tank; I have crashed the crap out of it and it still keeps going. It's been a really beautiful thing for us.
So what you see here is the LM730i. There were other laboratory automation specifications and requests that actually came to fruition, which meant I now couldn't fit the 730i on a bench and would need to request a table. So I got the automation table. When you look at the 730i, with its 18 positions and these relative dimensions, and then at the automation table, it's like: huh, if I'm going to get an automation table anyway, why don't I look at the LM900, which, looking at the depth, fits on this automation table? I wasn't really worried about getting a bigger instrument if I had a table that could effectively hold it, and I gained 30 positions on top of that, which was huge. Let me just go back real quickly. This is an overview of the deck setup, very basic. I actually do have some off-deck positions over here, but it's easily 25 to 30 positions that I can play with. I've got the Q1 orbital shaker here, the heater-cooler orbital shaker. I also have a gravity-fed waste drain over here, which is huge for the supernatant removal steps. And this guy creeping out here is an off-deck 8-port reagent dispenser manifold that goes over and dispenses channel by channel depending on what input I give it. That's how we handle, for example, the 32-sample requests you asked about. My team doesn't always deal with full plates, so I give them the ability to work column by column, and it's been great. Not to mention, it also frees me from having to put reservoirs here: all of my wash reagents are offline, so they don't clog up the deck, and that lets me increase the throughput of the workflows my team needs.
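To make that column-by-column idea concrete, here is a minimal sketch of the bookkeeping behind partial-plate runs; the helper name and the plate geometry defaults are my own placeholders, not anything from the Lynx software, and the 32-sample example simply mirrors the audience question.

```python
import math

def columns_for_samples(sample_count, wells_per_column=8, columns_per_plate=12):
    """Return (columns, plates) needed for a given sample count on 96-well plates."""
    columns = math.ceil(sample_count / wells_per_column)
    plates = math.ceil(columns / columns_per_plate)
    return columns, plates

# Example from the audience question: 32 samples -> (4, 1),
# i.e. four 8-well columns on a single plate, so only four columns of tips
# and four passes of the off-deck dispenser manifold are needed.
print(columns_for_samples(32))
```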
Here is my colleague dutifully working on our Lynx LM900. You can see how nicely it sits on an automation table nestled into our scientific bench islands. I just wanted to give you a quick example of what it looks like fully decked out in this beautiful setup; I've taken advantage of almost every single position. For the protein screening department, you can see protein extraction and quant, normalization and quant, and the special dilutions, how filled this deck is, and the requirements that actually made this work. As I mentioned before, the robustness of the head worked out really well, because I really had to figure out how to lay this out in a certain manner or else I was going to get crashes. It was a good learning experience, and I'm very grateful the head was so forgiving, but it's worked out wonderfully for the teams.
Going into the special dilutions, I also had trouble here because I had a video. I have it on my phone in case any of you have questions and want to see it offline, but I was just going to show you how the multi-dispense is a really slick application. Effectively, we have a 96-well sample plate that we want to run in triplicate, so it takes four plates to do that. I can aspirate all the material at once and then have it do triplicate dispenses, with trailing air gaps after that just to make sure that whatever I'm dispensing stays within the tip. That would have been the video, but Windows codecs don't play nicely with MP4s from my phone, so I can show you that offline.
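Since the video isn't here, the pattern can be sketched roughly as one aspiration, a trailing air gap, then three replicate dispenses. The StubHead96 class and its method names are stand-ins I invented for illustration, not the Lynx control software's API, and the volumes are arbitrary.

```python
class StubHead96:
    """Print-only stand-in for a 96-channel head driver; not a real instrument API."""
    def aspirate(self, plate, volume_ul):
        print(f"aspirate {volume_ul} uL per channel from {plate}")
    def air_gap(self, volume_ul):
        print(f"draw {volume_ul} uL trailing air gap")
    def dispense(self, plate, volume_ul):
        print(f"dispense {volume_ul} uL per channel into {plate}")

ALIQUOT_UL = 20.0        # illustrative volume placed into each replicate plate
TRAILING_AIR_UL = 10.0   # air drawn last so the liquid stays put in transit
REPLICATE_PLATES = ["assay_plate_1", "assay_plate_2", "assay_plate_3"]

def triplicate_multidispense(head, source_plate="sample_plate"):
    # One aspiration covers all three replicate dispenses.
    head.aspirate(source_plate, volume_ul=ALIQUOT_UL * len(REPLICATE_PLATES))
    head.air_gap(volume_ul=TRAILING_AIR_UL)
    for plate in REPLICATE_PLATES:
        # In a real method the air-gap volume would be accounted for on the
        # first dispense or blown out separately; this sketch skips that detail.
        head.dispense(plate, volume_ul=ALIQUOT_UL)

triplicate_multidispense(StubHead96())
```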
One of the biggest applications is the normalization. For the scientists I program this for, it's just painstaking to do normalizations one by one, 96 samples, 192 samples. This was just an illustration of the pain it would have caused to load each tip, change the volume every single time, and eject those tips. My heart went out to them, because it would take them one to two hours to do this. And I don't take the associated pains lightly, because over my career, among the people I've supported, I've seen people go down with injuries from pipetting stress, so it was really important to me to get the right solution for my team. But look at that, that's beautiful, right? Look at how crazy it is that a 96-well head can hold all these variable volumes in the tips and just get the normalizations done: what takes them one to two hours by hand gets done in ten minutes. Ridiculous.
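As a rough illustration of what the variable-volume head is doing per well, here is a small sketch of the C1V1 = C2V2 math behind a normalization. The target concentration, final volume, and transfer limit below are made-up placeholders, not our actual assay parameters.

```python
def normalization_volumes(concentrations_ng_ul, target_ng_ul=10.0,
                          final_ul=100.0, min_transfer_ul=1.0):
    """Return (sample_ul, diluent_ul) per well for a simple C1V1 = C2V2 dilution."""
    plan = []
    for c in concentrations_ng_ul:
        if c <= target_ng_ul:
            # Already at or below target: transfer neat, no dilution possible.
            plan.append((final_ul, 0.0))
            continue
        sample_ul = max(target_ng_ul * final_ul / c, min_transfer_ul)
        plan.append((round(sample_ul, 2), round(final_ul - sample_ul, 2)))
    return plan

# Three example wells at different concentrations, normalized to 10 ng/uL in 100 uL;
# prints the (sample, diluent) volume pair for each well.
print(normalization_volumes([85.0, 23.5, 7.2]))
```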
And I wanted to show you how the Lynx performed when we challenged manual versus the robot, how concordant that data was. Again, leadership got hold of this, so I can't show you the axes and everything, but you can see how well the lines line up, right? It's just wonderful. It did take a little fine-tuning with liquid classes and calibration curves and whatnot, but this is what we want: to prove that our robot can actually meet their needs and do so accordingly.
Moving on to the genetic screening side. Again, this is really the advantage. Now, I can't say it's necessarily making things faster for them; DNA cleanup is just a slow process. There are incubations, and there are tons of reagents you have to pull through to get the washes you need, one wash after the other; it just keeps going and going. But the beauty of this particular setup is that I can have these tips, as they are here, lined up according to the number of samples they have per plate. Those tips, and the elution tips in the fourth column on the right, line up with the number of samples. I have that off-deck reagent dispenser programmed to match each of these four plates, column to column, so that reagent gets parsed out as the loops run through the plates. All of those questions are asked of the user at the very beginning, before they start the method: how many columns do you have, how many plates do you have, what do you want as your elution volume? All of those requests I try to incorporate into this particular method, as sketched below. Oh, and a pro tip, too: if you ever want to re-rack those tips, rather than take them off one by one by hand, I found that the Eppendorf Move-It (which does have O-rings on the outside, but that's beside the point) actually fits the tips, the 1250s and the 200s, so it's so much nicer to re-rack with that. It can't pipette anything, so don't expect that, but it re-racks really nicely for you. So, pro tip.
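Coming back to those run-time prompts, here is a simplified sketch of how they can drive a column-wise cleanup loop. The function names, reagent names, and volumes are placeholders I've invented for illustration; they are not the actual method scripts or the Omega Bio-tek protocol.

```python
def prompt_run_parameters():
    plates = int(input("How many plates? "))
    columns = int(input("Columns with samples per plate (1-12)? "))
    elution_ul = float(input("Elution volume (uL)? "))
    return plates, columns, elution_ul

def dispense_reagent(reagent, plate_idx, column_idx, volume_ul):
    # Stand-in for the off-deck 8-port dispenser pushing one column's worth of reagent.
    print(f"{reagent}: plate {plate_idx + 1}, column {column_idx + 1}, {volume_ul} uL/well")

def run_cleanup(plates, columns, elution_ul):
    # Placeholder wash steps; only the loaded columns ever receive reagent.
    wash_steps = [("binding buffer", 400.0), ("wash 1", 500.0), ("wash 2", 500.0)]
    for p in range(plates):
        for reagent, vol in wash_steps:
            for c in range(columns):
                dispense_reagent(reagent, p, c, vol)
        for c in range(columns):
            dispense_reagent("elution buffer", p, c, elution_ul)

if __name__ == "__main__":
    run_cleanup(*prompt_run_parameters())
```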
But yes, data without the axes. The manual run is the second run here. Effectively, what we were trying to do was challenge the Lynx. We had several different iterations, several different things we were trying, but ultimately we wanted to get the same data metrics off the subsequent automated runs as we were getting manually. As you can see, data sets four and five ultimately led to the triplicate results hitting the marks we were looking for.
And so, again, that proved the Lynx could actually be used for this particular purpose and justified letting it continue on. As I mentioned before, while we don't necessarily increase the speed at which we do this, one operator can handle the entire throughput for their department for the entire week. So it's basically a reduction in FTE and a doubling of their throughput. We're not really saving on processing time per se, but it's just a beautiful thing that one operator can essentially do all that work for the entire department.
And this is what it's meant. I've toned this particular slide down; I had some playful things on here, but I tried to be professional about it. You can see, much as Jason described, we had this one chance to get a robot right for everybody. And you can see how it interweaves with the workflows in any given week, along with all the meetings and other obligations we might have, and how teams can just walk up to it and run their thing. There may be some manual intervention points, but at the end of the day, it's all about getting on, getting off, and going about your day. It's been really lovely to do that for the team. So, key takeaways: it's used pretty much daily, almost hourly in some cases. There are capabilities to walk away from it, particularly for the DNA cleanup, which is a long and arduous process. But the key thing that's worked out is the throughput: in 2024, as an example, it's double, with one less FTE. So it's been really, really nice.
That's pretty much it for what I had. I just wanted to send my acknowledgements and my profound appreciation to the people who made this possible: Dynamic Devices in particular, with some of their members here today. Some I may not have listed, because there are just so many to list, but they're a small but mighty company. They get it; they get that our success is their success. I just wanted to give them credit for providing me with the tools, the skeleton methods I derived the workflows from, and the scripts they shared with me to make everything my team had requested of me happen. I want to thank you truly and kindly for that, and of course for the opportunity to present and convey this to you today.
And then of course, my team at Alpine Bio. We've had many users and team members who have gone on to bigger things; I haven't listed them all here, but this is at least the team that's there now. I just wanted to thank them for their time and their effort in helping me verify that what I wrote and made for them actually works for them. And lastly, of course, our VP and our CEO for bringing me on board in 2022 and giving me this opportunity. I'll take questions now if you have them, and if there's anything else offline, I'm happy to answer that for you as well.