Ginkgo Bioworks Holdings Inc
NYSE:DNA

Earnings Call Transcript
2024-Q1
M
Megan LeDuc
executive

Good evening. I'm Megan LeDuc, Manager of Investor Relations at Ginkgo Bioworks. I'm joined by Jason Kelly, our Co-Founder and CEO; and Mark Dmytruk, our CFO. Thanks as always for joining us. We're looking forward to updating you on our progress. As a reminder, during the presentation today, we will be making forward-looking statements, which involve risks and uncertainties. Please refer to our filings with the Securities and Exchange Commission to learn more about these risks and uncertainties.

Today, in addition to updating you on the quarter, we are going to provide more detail into our drive towards adjusted EBITDA breakeven and the necessary steps we're taking to get there. As usual, we'll end with the Q&A session, and I'll take questions from analysts, investors and the public. You can submit those questions to us in advance via X at #Ginkgoresults or e-mail investors@ginkgobioworks.com. All right. Over to you, Jason.

J
Jason Kelly
executive

Thanks, everyone, for joining us. We always start with our mission of making biology easier to engineer, and that's especially critical today. Ginkgo is a founder-led company, and the other founders and I have been pouring our lives into this company for the past 15 years, and many of our senior leaders for more than a decade. The advantage of this is that we're very motivated to get the most out of Ginkgo. We've invested a ton of our lives in it, and as a consequence, we want to get the most out of the investment of your capital in Ginkgo as well. So today, we're going to be announcing major changes to how we do our work at Ginkgo. These are going to be difficult for many on the team, and I want to say that upfront: it's going to involve substantial head count reductions alongside important changes to improve our operations. The mission of what we're doing matters to everyone at the company, and you will see us collectively take difficult but decisive action when needed to ensure we deliver on it. And today is one of those days. Ginkgo is an increasingly important part of the technology ecosystem in biotech, and that's why I think it's important we get this right. I'm really proud of this customer list. It's unbelievably broad, and it showcases our core thesis that a common platform can provide biotechnology R&D services for very demanding customers across ag, food, industrial, biopharma and consumer biotech. I'm also happy with how we've been expanding this list. In particular, many of the big names in that biopharma column were added in just the last 18 months: Merck, Novo Nordisk, [indiscernible] and Pfizer. However, the next step for Ginkgo is to take what we've been learning across now hundreds of customer programs and make changes in the business that deliver those programs more efficiently. In particular, I'm going to talk later about how we can achieve greater scalability via simplification of the business. We want to simplify both our technology back end, ultimately consolidating to a single automation platform, and our front end. We've gotten a lot of feedback from all the logos on this page about what they like and don't like about our deal terms, so we're going to be simplifying those, too, and that hopefully will increase sales velocity and simplify our dealmaking. More on that in a minute, but first, Mark is going to walk you through our Q1 performance. There are a couple of things in there that are indications that we do need to change course. In particular, you'll see an increase in programs without a matching increase in revenue. This is a problem that I'll be working to fix via the changes you're going to hear about today. We're fortunate to be in a position of financial strength as we execute these changes. We have $840 million in cash and no bank debt, so we have a large margin of safety, which is really the position you want to be in when you make large changes like this. In other words, we're not doing this with our back against the wall, and that's a very deliberate choice on our part. We're also setting a target of achieving adjusted EBITDA breakeven by the end of 2026. The attitude internally at Ginkgo, and I know many at the company are listening right now, will be to collectively set our plan for reaching that, which is going to involve input from all the folks on the team, and then a commitment from all of us not to spend outside of that tight plan.
Over the past few years, we've learned a lot by trying different avenues to drive growth. We have all that data now, and we have a team that can set the right plan and determine who are the best folks to deliver on it. We're going to be doing that in the coming weeks internally. This also aligns well with what we've heard from many investors, especially those of you who've been waiting on the sidelines to invest in Ginkgo. The most common thing I hear is: I love the vision, I see a path where Ginkgo ends up being the horizontal services platform serving all of biotech, massively scaled up and getting better with scale. But Jason, can you get there with the capital you have on hand? And I think our plans today will give you confidence that we can. Okay. I'm now going to ask Mark to share more details on our Q1 financials, and I'll follow with an explanation of how we're going to execute our targeted plan. Over to you, Mark.

M
Mark Dmytruk
executive

Thanks, Jason. I'll start with the Cell Engineering business. We added 17 new cell programs and supported a total of 140 active programs across 82 customers on the Cell Engineering platform in the first quarter of 2024. This represents a 44% increase in active programs year-over-year, with solid growth across most verticals. Cell Engineering revenue was $28 million in the quarter, down 18% compared to the first quarter of 2023. Cell Engineering services revenue, which excludes downstream value share, was down 15% compared to the prior year, driven primarily by a decrease in revenue from early-stage customers, partially offset by growth in revenue from larger customers. We believe the mix shift is an overall positive and is indicative of market conditions, our refocused sales efforts on cash customers and the increased penetration of larger biopharma and government customers that we have discussed over the past few quarters. That said, the revenue in the quarter was below our expectation, and the pipeline indicates a weaker-than-expected revenue ramp for the rest of the year. Later in the presentation, Jason will discuss both how we're thinking about demand and our offering in this environment and the efforts we're taking to further focus the customer base.

Now turning to Biosecurity. Our Biosecurity business generated $10 million of revenue in the first quarter of 2024 at a gross margin of 8%. We do expect the gross margin to improve in upcoming quarters based on the revenue mix in our contracted backlog. We're continuing to build out both domestic and international infrastructure for Biosecurity, especially with our recently announced Biosecurity products, Ginkgo Canopy and Ginkgo Horizon.

Now I'll provide more commentary on the rest of the P&L. Where noted, these figures exclude stock-based compensation expense, which is shown separately. We are also breaking out M&A-related expenses to provide you with additional comparability.

Starting with OpEx. R&D expense, excluding stock-based compensation and M&A-related expenses, decreased from $109 million in the first quarter of 2023 to $94 million in the first quarter of 2024. G&A expense, excluding stock-based compensation and M&A-related expenses, decreased from $71 million in the first quarter of 2023 to $51 million in the first quarter of 2024. The significant decrease in both R&D and G&A expenses was due to the cost reduction actions we completed in 2023, including cost synergies related to the Zymergen integration and subsequent deconsolidation.

Stock-based compensation. You'll again notice a significant drop in stock-based comp this quarter, similar to what we saw in each quarter in 2023, as we complete the roll-off of the original catch-up accounting adjustment related to the modification of restricted stock units when we first went public. Additional details are provided in the appendix to this presentation.

Net loss. It is important to note that our net loss includes a number of noncash income and/or expense items, as detailed more fully in our financial statements. Because of these noncash and other nonrecurring items, we believe adjusted EBITDA is a more indicative measure of our profitability. We've also included a reconciliation of adjusted EBITDA to net loss in the appendix. Adjusted EBITDA in the quarter was negative $100 million, which was flat year-over-year as the decline in revenue was offset by a decline in operating expenses.
And finally, CapEx in the first quarter of 2024 was $7 million as we continue to build out the Biofab1 facility. Now normally, I would speak to our guidance next, but given our plans to accelerate our path to adjusted EBITDA breakeven through both customer demand-related changes and significant cost-related restructuring, Jason is going to first walk through those plans and then discuss guidance at the end. Before I hand it over to Jason, I'd like to provide some color on the cost restructuring we are planning. At a high level, we are committed to taking out $200 million of operating expenses on an annualized run rate basis by the time we have completed our site consolidation actions, which we expect by mid-2025. We expect at least half of that savings target to be achieved on a run rate basis by the fourth quarter of this year. The majority of our cost structure is in our people and facilities costs, and so workforce reductions across both G&A and R&D and site rationalization are the primary focus, though we see significant opportunities in other areas of cost as well. For clarity, our cost takeout estimate includes an assumption relating to our ability to manage the lease expenses for space we will no longer require. As I said, Jason will speak to the overall plan in more detail, including, importantly, the customer demand side of this. And so now, Jason, back over to you.

J
Jason Kelly
executive

Thanks, Mark. The big theme for today is how we're going to grow revenue while decreasing cost in order to reach adjusted EBITDA breakeven by the end of 2026. I'll start by talking about why we're not seeing revenue growth alongside program growth, which I mentioned earlier, and what we'll be doing to simplify our back-end automation technology to improve scalability there. Second, I want to talk about the front end: what customers like and don't like about our service terms and how we'll be simplifying our offerings on the front end to reflect what we're hearing from our customers. And then finally, we're taking decisive action to reduce our costs. Specifically, we plan to reduce our annualized run rate OpEx by $200 million by mid-2025 in order to achieve adjusted EBITDA breakeven by the end of 2026, and we'll dive into the high-level plan of how we'll execute on this. Okay. Let's jump in. So the charts here are a big part of what's driving our decisions today. You can see from the chart on the left that the number of active programs on our platform grew significantly over the year. This is a good thing; really excited about this. But alongside that, we saw a decline in our revenue from service fees. And this, again, ignores downstream value share. Just look at that fee number; that's gone down. This is particularly frustrating to me because we actually have a large amount of fee bookings across these many deals, but we're not converting those into revenue in the near term. And the core challenge is the rate at which we're bringing these programs to full scale on our automation at Ginkgo. I'm going to explain that, but I want to give you a little more detail so you understand the challenge and what we're trying to fix. So on the left-hand side here is the basic process by which an R&D leader at one of our customers develops a biotech product, okay? That R&D leader is engaging with the senior scientists on their team. They're specifying a particular scientific deliverable for a product, right? To give some examples from Ginkgo programs of what a leader might ask for: maybe it's an mRNA design that performs a certain way in humans, like we have for Pfizer, or microbes that capture nitrogen for Bayer, or an improved manufacturing process for Novo Nordisk. These are all scientific deliverables, all right? And in the case on the left, the customer's internal scientists will then design experiments they think will help deliver that outcome to their boss. More junior researchers will perform those experiments at the lab bench by hand, and then the data will come back to be analyzed, and you go around that loop. Now this is a manual process, and it generates small amounts of data, but it does work, right? I want to highlight that this is how all these biotech drugs are developed every year. And the strength of it is the flexibility, right? The scientist can run any new experiment tomorrow very quickly, as long as their 2 hands can pull it off, all right? And again, when that data comes back to the senior scientist, they repeat this all over again, right? Now, of course, the R&D leaders at Pfizer, Bayer and Novo in these cases are all choosing instead to pay Ginkgo scientists to give them the same deliverables rather than using their internal infrastructure. So why are they doing that? The major reason is that Ginkgo scientists do that same loop, but they do it in a different way.
They design experiments, but instead of small amounts of manually generated lab data from a team, they get large amounts of data generated either via automation or via pooled approaches that leverage high-throughput DNA sequencing and barcoding. And Ginkgo is a world expert in both of these large data generation approaches. That's really the big difference, small data generation versus large data generation, and that's really our expertise. So the short answer to why customers choose to use us instead of all that in-house infrastructure they have is that they're coming to us asking for a scientific deliverable that they think will need a lot of data to get to the answer, all right? And that's not every project, but it is an increasing number, as you see. But our approach, I want to be clear, is not strictly better than doing it manually, mainly because it takes more time to get a new protocol running at large scale. And this is the heart of why we're not seeing revenue come up with our programs in the near term. It's not a perfect correlation, but generally, the faster that a Ginkgo scientist can start to order large amounts of lab data, the faster we then see revenue coming out of all those customer projects. Now fortunately, the acquisition of Zymergen and the follow-on tech development we've been doing over the past couple of years put us in a great place to resolve this issue, and I want to talk about how we're planning to do that. Okay. So to give you a little bit of background on how lab automation works today, there are basically three levels. The first level is what you're often seeing in our customers: scientists working by hand. The overwhelming majority of lab data generated in product development in biotech today is at this first level, manual. At the second level, a scientist walks up to a robot, puts samples on it and programs it to do a specific task, okay? This is task-targeted automation. The third level, and you'll see a lot of these around Ginkgo, is work cells. You work with an automation vendor, and you have a robotic arm sitting in the middle of a set of equipment, and it moves samples through multiple steps, but it basically does the same steps over and over, okay? And this is, again, what the majority of our foundry looks like today. And you can see on that spectrum at the top, you go from very flexible, low amounts of data per dollar to very inflexible, large amounts of data per dollar, right? And that has been the historical trade-off in lab automation. Okay. We believe that the automation paradigm invented at Zymergen, and then expanded on in the last 2 years at Ginkgo since the acquisition, ultimately offers flexibility and low-cost data at the same time. The simple idea is that each piece of equipment is its own removable cart. It has a robotic arm connecting it to a magnetic track to deliver samples; you can see it in the video. And when you need new equipment, you can just add it to the track. It's like adding a little Lego block to the track, incorporating it without needing to build a whole new work cell like you would in Level 3 automation. And when you want to do a different protocol on the same equipment, that can be done with the software quickly. And we've been seeing that. We've seen instances where we've taken smaller batch protocols and moved them onto the rack system relatively fast compared to what would have been a multi-month project on a work cell.
And then over the last year, we've also been seeing, and this is early data, 80% to 90% labor time reductions and 60% cycle time reductions. We have not done this for the majority of our protocols yet, but the signals are good that we could, and I'll be talking about that as part of our plans for efficiency gains in the third section of the talk today. Since acquiring Zymergen, we've also focused on simplifying the cart designs. You can see our second-generation rack carts that were recently delivered to our facility in Boston, and if you were at Ginkgo Ferment in April, you saw these in person. Importantly, these are easy to assemble. These are in-house designs, proprietary to Ginkgo, which makes it faster for us to order more manufactured as needed. We also standardized the sizes, so we have these 3 sizes here that allow us to incorporate a wide variety of different equipment while, again, keeping manufacturing costs down, all right? So these rack systems are made to be very scalable. It's a very different paradigm than what you see with work cell-based automation. Today, we are at the closed-loop rack system scale, the one you see there, and have ordered 15 systems. But we've been planning much larger integrated systems as part of achieving long-term efficiency goals in flexible lab data generation. Toward that end, our purpose-built facility that we've been talking about to house these large rack installations, Biofab1, will be opening in mid-2025. And the best way to think of this facility is as a lab data center, okay? In compute, we have these big data centers, and what you offer there is common, scaled hardware that does lots of different types of compute. It's a very similar idea here: common, scaled hardware in the form of the racks that can generate a diverse array of lab data output quickly for customers. And hopefully, this means as we sign more programs, they can very quickly scale to generate large amounts of data. This leads to more revenue, but more importantly, to happier customers who greatly desire both speed and scale of lab data generation. In other words, our customers would be more than happy if we were more rapidly extracting revenue out of our bookings, because it means their programs are happening more quickly on our infrastructure. So that is a win-win for Ginkgo and for our customers, and that's what we're trying to do with this change to how we operate. Okay. So that's a bit on the technology. I believe it will simplify the back end, allowing one automation platform, ultimately the racks, to replace many different workflows and work cells at Ginkgo. And now I want to talk about the front end and how we engage with customers when we sell those cell programs. All right. So this is a slide I showed you earlier. And as I mentioned, customers are choosing to use Ginkgo rather than their internal infrastructure when they think they need large amounts of data. However, as part of the business model, we've also asked for a few things customers don't love. You can see these up here. First, we have Ginkgo scientists run the projects; we have scientific control over the experimental design. Number two, we have IP rights: Ginkgo can reuse the data generated and keep it in our code base. And by the way, that is valuable to Ginkgo. Don't get me wrong, us being able to reuse that is valuable; it helps us with future deals. But customers really don't like it. I'll talk about that. And then finally, downstream value share.
We get milestones or royalties on their future product sales. And now, look, I designed a lot of these service terms, right? I was responsible for our business model at Ginkgo, and I battle test it; I'm out there talking to customers every week. It varies a little by market. So, for example, in biopharma, there's a lot more tolerance for milestones and royalties, right? If you're doing a strategic deal with a customer, many deals are done like that. In industrial biotech, like in the chemical industry where margins are much lower, oh, they hate it, okay, right? So again, one size fits all there, not a great idea. And then second, in biopharma, there's much more sensitivity to IP, given how much it makes up the competitive moat around a drug. So they have a lot of sensitivity to [indiscernible], you're going to reuse some of this data that I'm paying for and potentially bring it to another customer. That creates resistance in deals. So when you see us adding all these programs, know that we're fighting through that resistance with customers to get them done. And we felt that was important. But I think in the context of where we are today, the rate of revenue I'm seeing and what I'm hearing from customers, we should change it. And so we're going to stop fighting customers on these things, update our terms to give customers IP reuse rights and, in many cases, not include downstream value share. There will be some exceptions where we are bringing a lot of product-relevant background IP; we do have that. But by and large, we're removing that downstream value share. And our hope is this will speed dealmaking, as we currently spend huge amounts of time negotiating these IP terms. It also allows us to scale the number of deals we do without needing to scale legal and financial resources, due to reduced deal complexity. Okay. So beyond the issue of IP rights and downstream value share, customers sometimes also have a problem giving up scientific control of their experiments, and so you'll see us working on simplifying that too. In other words, they might say, yes, I love all the data you can generate, Jason, but I really trust my own scientists and their expertise around this particular problem. I just wish they could use your infrastructure. And so we announced, just at Ginkgo Ferment in April, Lab Data as a Service, which is exactly this. The key idea is that a customer scientist can design the experiments and analyze the data, but they have Ginkgo's infrastructure, again, remember all those racks, available to them to quickly generate large data sets that they wouldn't be able to generate with their in-house research team. They might still use that team for other problems that favor small-batch, manual, rapid work. Again, I think that is actually a valuable piece of the puzzle internal to our customers. But if it's a large data generation need, they can just order it. And this will be the best of both worlds for many customers. Now there's a subtlety here that took me actually a while to understand, even though I'm at the coalface with customers all the time. When we sell our usual process, where a Ginkgo scientist is in control, that's really sold to the customer as a strategic deal, often over a couple of years, and it's coming out of a special budget, kind of sold up through corp dev, that funds that kind of work, these kinds of research partnerships. And we actually do a ton of those deals, right? And I think we've scaled that kind of dealmaking more than almost anyone in biotechnology today.
But there's this whole other big budget, often billions of dollars at pharma companies, that is the everyday R&D budget at the biotech company, and that's in the hands of internal scientists at various levels. And with Lab Data as a Service, we can sell smaller deals directly to those scientists. This is both a big new market for us and also a great mission fit for Ginkgo. Our goal has always been to make biology easier to engineer, but thus far, that's been limited to Ginkgo scientists. By allowing our customers' scientists to access the foundry directly, we're making it easier to engineer for them too. And that's really important to me; it's really important to the team. If you watched my talk at Ginkgo Ferment, I spent a fair bit of time on that. Finally, I want to mention, we think we can really be the picks and shovels to all the folks that are inventing amazing new AI models in biotechnology. There are many new startups getting funded with big funding rounds, and most of the large biopharmas now have a person in charge of AI strategy. What we hear from these people again and again is that data is the missing piece for building new and better models in biology, okay? We had huge English language data sets and the like to train AI models for language or videos or images; in biology, the missing piece is actually the data. And so our Lab Data as a Service is exactly the right offering there: we can generate large multimodal data sets. And we expect to do business here with customers wanting to access both our automation scale and our expertise, like I said earlier, in conducting large pooled assays. That type of assay generation is particularly important, and both of those are available right now on a fee-for-service basis. You own the IP; there are no royalties or milestones. So for any AI company that's tuning in, we'd love to do that work for you, and you can get that data much faster than anywhere else. Okay. So those are the big changes we're making, both on the back end and the front end of our platform, to drive scalability through simplification. We expect these simplifications and others to allow for substantial cost takeouts in the coming months. So I want to talk about those cost savings and how all of these pieces tie to our path to adjusted EBITDA breakeven. If you look at our Q1 numbers, our annualized OpEx comes in at approximately $500 million. This is simply too high relative to near-term revenues, right? We plan to cut this back by $100 million by Q4 2024 by significantly consolidating our footprint and reducing labor expenses across both G&A and R&D, which is enabled by the simplifications I just spoke about in the previous two sections. We're also targeting reducing our annualized run rate cash OpEx by another $100 million, totaling $200 million, by mid-2025. The big takeaway is we plan to eliminate discretionary spending that isn't very specifically focused on how we get to adjusted EBITDA breakeven by end of year 2026. So a note on this for the team that's tuning in: there are many things we are doing at Ginkgo right now that are good things to do in the long run, but aren't good investments today given the opportunity we have to get to a breakeven business built on technology that keeps making biology easier to engineer as it scales up. No other company in the world, in my opinion, has pulled that off yet.
So we need to sacrifice activities that aren't on that path, since I think we have a good shot at hitting it at this point, and it's critical. We're going through the detailed planning process now, and input across our team is essential to get this right, but I can share some of the major places we expect to see savings. First, facilities are a significant cost for us, both in terms of rent, but also in terms of facilities maintenance and tracking. We have 8 sites today, and with Biofab1 coming online in mid-2025, we expect we could reduce our footprint by up to 60%. These simplified operations require less ops, G&A, HR, finance, facilities management and other overhead support, which will allow us to significantly reduce G&A costs and overall headcount. And with our movement to a more rack-centric foundry, our technical teams will be adjusted to suit highly leveraged automated and pooled workflows. We expect that these combined initiatives will result in a 25-plus percent reduction in labor expenses, which is inclusive of a reduction in force. We are also taking on other cost-cutting measures to reduce nonstrategic overhead expenses through a thorough review of existing internal and external programs, while also pausing and reviewing professional services spend. We know that many Bioworkers will be impacted by these changes, and we're sad that we have to see many of you go, but we are thankful for your patience and input throughout this restructuring and your dedication to our mission of making biology easier to engineer. It's times like this when that mission dedication is tested the most. The last piece I will get into today is our updated guidance for 2024. You'll notice that we no longer have new programs listed on this page, and that's because we're not sure that, as currently defined, it's the right metric for program growth going forward. With the simplifications and changes to our deal structure I've described today, and particularly removing downstream value share on many deals, our prior guidance, where programs depended on things like downstream value share to be counted as a program, is no longer applicable. Ginkgo does expect to add at least 100 new customer projects, comprising both traditional cell programs as we thought of them as well as new offerings, including Lab Data as a Service. Due to the changes in our deal structure and focus on cost savings, Ginkgo now expects total revenue of $170 million to $190 million in 2024. Ginkgo revised its expectation for Cell Engineering services revenue to $120 million to $140 million in 2024. This guidance reflects a weaker-than-expected revenue ramp during the year, uncertainty relating to the timing of technical milestones and the potential near-term impact of the restructuring actions I just described. This guidance excludes the impact of any potential downstream value share as well as potential upside from new service offerings. Ginkgo continues to expect Biosecurity revenue in 2024 of at least $50 million, representing approximately the current contracted backlog, with potential upside from additional opportunities in the pipeline. Okay. In conclusion, though these are difficult changes, acting decisively now, while we're in a position of strength in terms of cash in the business, is critical. This will not be an easy period for our team, and we're grateful to them for their help and partnership as we make this transition. All right. Now I'll hand it back to Megan for Q&A.

M
Megan LeDuc
executive

Great. Thanks, Jason. As usual, I'll start with a question from the public, and I'll remind the analysts on the line that if you'd like to ask a question, please raise your hand on Zoom, and I'll call on you and open up your line. Thanks all. All right. Welcome back, everyone. As usual, we'll start with a retail question, and then we'll go down our list of analysts. So Rahul, you'll be first after our retail question. The first question comes from our IR inbox, and it's for you, Jason. Can investors get some color on how Lab Data as a Service is being received? A big part of the original investment thesis was downstream value revenue, but now that is gone. Can you explain why Lab Data as a Service is the right pivot and how it's being received?

J
Jason Kelly
executive

Yes. So I touched on some of this on the call. We announced it at Ginkgo Ferment about a month ago, and I'd say it's being received really well. We now have tens of customers in our sales pipeline, which is pretty quick for the type of stuff we sell here. I made this point on the call, and it's a subtlety: we are able to sell this to a really different pool of budget at our customers, right? We get to walk into the R&D department and basically say, we are an alternative to generating a collection of data yourself. So you can save that money on the reagents, and you can take the team that would have had to do it and instead have them do something different. If you're a small biotech, maybe you never build that lab or hire that team in the first place, right? So we have this kind of new thing we're able to take to people. The second thing is I get to say, hey, it's your IP, there are no royalties. Let me tell you, having sold Ginkgo's infrastructure for the last decade, that makes my life a lot easier. So I do think this is the right time. Downstream value share has been part of our thinking about the company, and in the long run, it could still be part of our thinking. But in this window of time, there's just an enormous amount of research budget for us to get after, and I think we are able to tap that budget a lot faster with these terms that are a lot more customer-friendly. So I'm really excited about it. I think it's going to be a big part of the business going forward. And I should just mention one last thing, the point about the AI companies. It's fascinating, right? A lot of these companies are really software first. They're AI experts, they're building incredible models, and they're all leveraging the existing public data sets to do that, right? They're leveraging the Protein Data Bank, they're leveraging GenBank to have access to genes and protein structures and so on. And eventually, that's going to get mined out, right? In fact, I'd argue it's probably pretty close to having already been mined out. And so what you're going to need is new large data sets, right? And I think with the way we've structured Lab Data as a Service, where these companies can own that data, what's the point of building your own lab? It's just going to be faster to use Ginkgo's infrastructure, and our conversations with companies are reflecting that. So I'm pretty excited about that.

M
Megan LeDuc
executive

Great. Thanks, Jason. Like I said, Rahul from Raymond James, you're up first.

R
Rahul Sarugaser
analyst

Can you hear me all right?

J
Jason Kelly
executive

Yes.

R
Rahul Sarugaser
analyst

Terrific. Jason, Mark, congrats on taking a bit of a reset quarter here. So first of all, maybe I'll ask the big global question. Jason, you started by talking about how you guys have been doing this a long time, and I think most folks on this call are believers in biomanufacturing and synthetic biology. So my question is, given the attrition that we've seen, given the thinning in revenue relative to projects, what are the threads out there that you're pulling on that make you believe that you're not too early? How is Ginkgo at the right time? And then maybe a more granular question would be, as you evolve your business model, assuming you are at the right time, how is Ginkgo not going to be categorized effectively as a big CDMO? That's it for me.

J
Jason Kelly
executive

Yes, okay. Great, let me speak to that. So the first point you made around biomanufacturing, I share your concerns on this. When we took the company public, the majority of our customer base was in the industrial biotechnology sector, and I think people often treat that as a synonym for synthetic biology. It's actually not, right? The way to think about it is that synthetic biology is a tools infrastructure: it's people that are working on new ways to make the process of designing and engineering cells faster and easier. It happened to be that a lot of the demand for that was in industrial biotech because of the complexity of the genetic engineering there, so that was sort of why there was a common equivalency. Industrial biotech has been hit very, very hard with higher interest rates. I think that's just the reality. The venture capital ecosystem completely dried up for those companies, and many of those companies have gone out of business. I love this space, and it's been tough. And so I think one of the things the last few years have shown is that even in the face of that, which we weren't expecting when we took the company public, we were able to show that, hey, we're a tools platform, and actually, we're adding all these new programs in biopharma, which was a space that we were barely in when we took the company public. And so I think that does speak to the flexibility of Ginkgo as a platform. We're not really hooked exclusively on biomanufacturing or industrial biotechnology. In fact, the story of the last 2.5 years has been a pretty impressive, in my opinion, shift of our customer base from industrial biotech start-up companies that were growing on a lot of venture capital to increasingly large biopharma and bio-ag companies that have big existing research budgets, right? So that would be my point on biomanufacturing. I think it's tough. I think we need some breakthroughs in that area. I think some of the consumer biotech stuff that's happening is pretty interesting. There are, like, 6 new GMO houseplants on the market somehow, right? So I think that kind of stuff is exciting, but I think that sector has been hit really hard. You asked about whether, when we focus on the pharma space, we'd be thought of as a large CDMO. I don't think that's the end of the world, right? So here's the basic problem with the CROs today: they don't really improve as they scale, right? If you look at [indiscernible], right, you're essentially outsourcing doing that same thing. You're choosing to outsource data generation to an external lab, but they're just going to do exactly what your lab was going to do. They're going to hire a bunch of scientists, put them at lab benches and do that work by hand. So the fundamental philosophy of how they do the work is exactly the same as how you do it in-house. So you're really just choosing there to hand off some of the things that you don't want to do. That's different from Ginkgo as a "CRO". People are coming to us because they want a large data asset, okay? They either want it with automation or they want it with pooled approaches. They can't get that from the traditional CROs, okay? And they can't get it in-house. So the real question is, do they want it, right? If they start to want it, the advantage of my approach is I have scale economics. I get cheaper on the infrastructure as I do more work, right?
And so that's where we'll be different. But in my view, if you look at the way lab work gets done, the overwhelming majority of research spending is going to kits and equipment and real estate and labs and all of this; it's a giant pile of expense. And the CROs have tapped almost none of that, and that's because they have not offered a really compelling differentiation from what the customer can do themselves. And I'm hopeful that if we can make this flexible and scalable, which is the big unlock, then you eat up more and more of that by-hand research budget. You really do. So I don't have a problem being thought of that way at all, as long as you think of me as one that could ultimately take half of the research budget someday, if we're right about this philosophy for doing the lab work.

M
Megan LeDuc
executive

Thanks, Rahul. Next up, we have Matt Sykes at Goldman Sachs. Matt, your line is now open.

M
Matthew Sykes
analyst

Great. I guess kind of a high-level question for Jason or Mark, just as you look at the proof points of this restructuring and shift in what you're doing, particularly in how you're approaching customers. We had moved our model away from new program growth a while ago because we felt the correlation wasn't there, and focused on active programs. Revenue is obviously going to be a key KPI. But as you give advice to the sell side in terms of how to measure the success of this shift, what are some of the KPIs that we should really be focusing on at this point?

J
Jason Kelly
executive

Do you want to take a swing Mark?

M
Mark Dmytruk
executive

So I think, Matt, what you heard is we're going to be focused on cash flow, first and foremost. And cash flow is a function, of course, of cash revenue and cash OpEx, and so that's where a lot of the energy of the company is going to be. But we want to get to a place, as you heard on the call today, where we are moving towards profitability, where we're adjusted EBITDA breakeven. And that is what sets Ginkgo up for success. We have to prove that we can operate the programs economically. So I think, Matt, that's pretty important. On the sort of volume side, which is what you're getting at, is there going to be a substitute for the program metric or something like that? We're going to need a little bit of time to figure that out; I'm not sure. There certainly will be KPIs; they may be things that we report on but don't guide to. We need, I think, to get a feel for the types of deals that we're going to be signing under the newer commercial terms that Jason outlined today, and for the new offerings like Lab Data as a Service.

J
Jason Kelly
executive

As a reminder, the way we count programs today involves having things like downstream value share in the deal in order for it to count. I know a number of the analysts on the call do factor that into their modeling. And so because we're changing those terms, part of what we sign wouldn't count under that definition. We do want to give you all a picture of how things are going, because when we sign deals, that does imply revenue in the future. I do know that's part of the modeling. So we'll work on that.

M
Matthew Sykes
analyst

Got it. And then just on the guidance commentary slide, Jason, you mentioned prioritizing quality over quantity in terms of programs. With the new approach, including Lab Data as a Service, are there certain types of programs or end markets that are more attractive to achieve that combination of flexibility and scale, but also speed to get a program up and running? And/or, I think you mentioned the difference between those that are attracted to downstream value and those that are not, which is clear, but just sort of the types of customers that would maybe generate that revenue quicker, in terms of either scaling on the platform or bringing new programs to the platform?

J
Jason Kelly
executive

Yes. Yes, great questions. So I think, for starters, you will continue to see us do both. I showed that slide where we're either selling to the research budget directly, that's the kind of Lab Data as a Service model where their scientist is in control of our infrastructure, or our scientist is running a program and we're doing a strategic deal with corp dev. And I say corp dev because they essentially are the ones who negotiate research partnerships -- like, I acquire an asset, and I also sign a research partnership with a small biotech, right, if I'm the large biopharma corp dev leader. We're like the half of that deal that's the research partnership, right? That's what we're selling to. We still have a pipeline of those; we'll still do those. I think frequently, those will still involve downstream value share. Those are going to be like a whole bunch all at once, and really the customer for those is large biopharma, okay, and to a lesser degree, maybe large ag companies, all right? On the research budget, on the other hand, I think we have a really good opportunity with smaller biopharma start-ups and biotechs that are still getting funded and are making choices in a limited capital environment about how much they want to spend on their laboratory infrastructure: how much do they want to spend building that stuff out, how much do they want to spend on equipment, all that upfront cost. What they are uncompromising on -- and this is why, again, these are things you realize over time; we're fundamentally selling a new thing -- is that those small biopharmas and biotechs would never give up scientific control. And so with our Lab Data as a Service, where now their scientists can run our infrastructure, we can sell to those guys for the first time. So I'm really excited about that area in particular when it comes to Lab Data as a Service. The last one I'll mention is industrial biotechnology. We've had a hard time selling into the bigger companies there, okay, because they don't really do those types of research partnerships that pharma does. Now it's a lower-margin industry, but they do have research, they have a research team. So I think Lab Data as a Service is also a way in, an entree into some of those larger chemical and other industrial biotech companies where we've had more friction selling the strategic deals. Does that make sense?

M
Matthew Sykes
analyst

Yes. Appreciate it.

J
Jason Kelly
executive

And in general, I like it. I think these things also reinforce each other. One of the things people always say is, "Oh, no, don't call yourself a CRO" or something, which is closer to what Lab Data as a Service is. And that's because when we're talking to the strategic half of the house, they're like, I wouldn't do a strategic deal with a CRO, right?

And so when we're talking to those folks, they are evaluating our scientists, they're evaluating whether we can do a multiyear deal, and we're really great at that engagement with those people. People already think of us that way, right? They think of us as strategic. So what I am excited about, though, is we can now walk into the same customer, different part of the organization, and our reputation on one side is going to help us on the other. And so I am excited to cross-sell, both directly to the R&D mid-level and senior leadership as well as the more strategic side. That's going to help.

M
Megan LeDuc
executive

Thanks, Matt. Next up, we have Mike [ Ryskin ] at Bank of America. Mike, your line is now open.

U
Unknown Analyst

Can you hear me?

J
Jason Kelly
executive

Yes.

U
Unknown Analyst

So I kind of want to go back to what I think is the crux of your argument earlier, Jason. Just looking at Slide 14, I think it captures it perfectly: the disconnect between active programs or new programs and revenues, and why that's disconnected over time. I'm just wondering, within that, we only see the total numbers, right, total programs, total revenues. Any success stories you can talk about? Any examples, any lessons you can give as you parse that out, where you see some of the proof points? Because you were talking a little bit about not being able to get up to full scale -- cases where you are able to achieve that? And what I'm getting at is, it's a long question, but what I'm getting at is you're putting in these cost cuts, you're trimming things. How do we know that's not just going to trim the number of programs and with them the revenues, and you're just cutting everything in half, right, by reducing headcount, reducing footprint? How do you know you're selecting the better approach versus just taking what you have now and cutting it in half?

J
Jason Kelly
executive

Yes. Super clear, Mike. So I'll give a little color on it. First off, we now have multiple years of experience with many programs, right? And again, you can see the ramp of our programs; people that have followed us since we listed the company know this. And so each year, we get more and more data about what's easy to onboard and get to scale, and what's hard. Secondly, we are working on the back end; we're always trying to make the back end actually do this more quickly, right? And we have a substantial amount of bookings, right? This is the thing that frustrates me: we actually have a lot of bookings, and the rate at which we're able to push them through the infrastructure into revenue is just too low, right? And so we need to work on that back-end problem. I think when it comes to success cases, it is the types of programs that we have done previously, right? If we have done that type of work, particularly if we've done that type of work end-to-end and succeeded for a customer, it is much easier for us to do it again. When it's a newer thing, that creates a lot more churn, right? And then I would say, across the board, there's still a set of experiments, like, for example, assay onboarding, right? A customer comes to us, they have their specific project, and as part of it, there's a particular assay that they trust that they really want us to onboard onto the automation in order to make that project work. That remains something that you can't hand off to automation; we have to do that pretty manually. It's pretty low throughput, and it ends up being a bottleneck. And until you complete that, I can't turn up the dial and generate all that revenue from running that assay thousands and thousands of times. Does that make sense? And so assay onboarding would be a great thing for us to be able to do with the racks, right? Being able to quickly onboard some of these things that currently require repeated, frictional by-hand work to get the automation spun up. We know what those things are because we're doing so many programs, and those are some of the first things we're going to attack with our new focus. So that really is it, and that's why we're confident that we aren't going to have the situation you're talking about. But again, what we sell will be different. I won't be going out and saying, "Hey, I want to do a project for the first time," not in this world, right, now where I see a line of sight to us getting to breakeven. We just don't need to do that anymore. We've had to do that -- I think it was part of building the process here, part of building the story and helping us learn what's easy and hard over the last few years. But it's a mistake to keep selling that type of stuff. And so you'll see us tighten up the sales there, but I'm hopeful that with the better terms, we get more deals of the type we like. Does that make sense?

U
Unknown Analyst

Okay. No, that does. I appreciate that. And then just a quick follow-up on the cost reductions and the plan there, a pretty meaningful reduction in labor at 25%. How do you ensure minimal disruption? Because you've been scaling up for a while, you are still bringing on the new foundry operations, Biofab1, so how do you juggle both expansion and shift and a meaningful head count reduction at the same time?

J
Jason Kelly
executive

Yes. I think that point there is the key challenge for us, right? It is figuring out what are the things that we need to be investing in to make sure we can handle the shift, and then also making sure we're supporting our key customers today, right? And so that is a big part of what we worked on in early planning for this and what we're going to be working on in the coming weeks with the team to tighten that up. I will also say, post listing, we pursued many different ways to potentially get to growth. We do a lot of internal research to try to get something that's going to pay off in the longer term. Some of that we did need to do 2 or 3 years ago; like, we were very early 4 years ago in even doing mammalian cells. Of course, we needed to do internal research to bring that online, or we would never be able to tap the pharma industry today like we do. But the list of things like that where the payoff makes sense in this environment is just shorter. And so a lot of that internal research is work we shouldn't be doing, and we should be focusing instead on either delivering on customer projects or making it easier to onboard things onto the automation, onto the pooled screening and things like that. You'll just see us focus a little more on the stuff that delivers revenue and allows us to sell to the types of customers we want to sell to at a faster scale. Does that make sense?

M
Megan LeDuc
executive

Thanks, Mike. [Operator Instructions] Next up, we have Steve Ma at TD Cowen.

J
Jason Kelly
executive

Steve, we're getting you from [ SynBioBeta ].

P
Poon Mah
analyst

Yes. So yes, apologies for the background noise; I maybe picked a great day. But anyway, maybe just a follow-up on Mike's question on the reduction in force and the reduction in labor costs. I appreciate that on the services business, you have good visibility on what's easy, and then you'll take it on as a services project. But about 1/3 of your programs are in the harder pharma partnerships, and those are obviously more complicated. Can you give us a sense, with the reduction in force, is it more targeted? And maybe quantify the percentage of the actual workforce reduced. I know you said 25% of labor, but can you give us a sense of what percentage of your total workforce that is?

J
Jason Kelly
executive

Yes. So we're working through those numbers now to get to exact numbers. So we don't have the exact numbers now. We are planning to take 25% out of labor inclusive of a head count reduction, but that's the process we're going through in the coming weeks. I will say that type of biopharma work is our highest priority stuff, right? So those are important long-term customers for us. We know there's a ton more business there. And so that's an area where you'll see us make sure that we can continue to serve those customers well.

P
Poon Mah
analyst

Okay. Got that. All right. And then maybe one for Mark. Can you give us your confidence level in your ability to sublease the foundry real estate as you consolidate into Biofab1? Just how confident are you in being able to do that? You put out a pretty big number, reducing up to 60% of the cost of that facility.

M
Mark Dmytruk
executive

Yes. So first of all, I'll just make the comment that we're committed to delivering the cost savings number even if we can't get the lease savings that we're expecting. It is a fairly nontrivial task, I would say, for us to do this sort of transformation of the foundry work, so we need to make that happen. The opportunities for subleasing or similar to mitigate the costs of those leases, I mean, there are lots of those, but they will depend on market conditions once we're ready to make that move. So we're not just banking on that to make the number. That's the way I would put it.

P
Poon Mah
analyst

Okay. That's helpful. And if I could sneak one last quick one in, Mark. On the downstream value share, I appreciate you guys pulling it because of the lumpiness and lack of visibility. But are you going to add that back to guidance when you have good visibility and as you approach, maybe, a milestone?

M
Mark Dmytruk
executive

Yes. So first of all, we do have a significant portfolio of potential downstream value share, or royalty rights and milestone rights. We have not forgotten about that. And the short answer is yes, Steve: once that becomes a more predictable and steady source, I think we would start talking about it like that and maybe adding it back into revenue guidance. We're just not there in the time horizon that we're talking about. And so we're, again, very focused on getting Ginkgo to that adjusted EBITDA breakeven level without relying on what might or might not happen in terms of downstream value share over the next 2 years.

M
Megan LeDuc
executive

Thanks, Steve. Next up, we have Edmond [indiscernible] at Morgan Stanley. Edmond, your line is now open. Can you hear us, Edmond?

U
Unknown Analyst

Sorry, I apologize, I'm having some lag issues here. Just a quick question from me on the implementation of your new rack automation. How long do you think it will take to implement this new strategy? And will there be a ramp-up time associated with reaching optimal efficiency here? And how much improvement to the revenue conversion do you think you can drive with this?

J
Jason Kelly
executive

Yes. So maybe I'll speak to some of the timelines. One of the things that's great is we're already starting to do this, right? So in our current facility in Boston, we have a setup of racks that is up to, I don't know, 15 or 20 of the [ carts ]. And so we're able to start basically moving workflows onto the racks. Obviously, the team that is designing and programming and doing the final manufacture of the racks is based out in Emeryville, California, but then they get shipped over here, and we're able to start to do the lab transfer and all that work well in advance of Biofab1 being open in mid-'25. So there's nothing to slow that down other than how much attention we're putting on it and what its priority is relative to other priorities in the company. So I'm actually pretty excited that some of that can move quite a lot quicker. There is still then doing a wholesale move over into Biofab1. That's part of our plan here for cost reduction, just in terms of simplification of our facilities and so on, and that is more on Biofab1 timelines, which is mid-'25. Mark, I don't know if you want to speak to some of the other stuff?

M
Mark Dmytruk
executive

So Edmond, the question was what impact on revenue might we see from the sort of rack driven foundry. Was that the question?

J
Jason Kelly
executive

How much faster we could pull it through?

M
Mark Dmytruk
executive

Yes. Yes. So I think the idea is significantly faster. If you look at how an end-to-end cell program today is both contracted for with a customer and then how we execute on it, Ginkgo takes a lot of risk on both the technical success and the timing of that work. And it does take a long time to plan it and onboard it. I mean, these are just often very complex, long-cycle projects where Ginkgo is taking a lot of that timing and technical risk. We're rewarded along the way with these micro milestone-type payments, and it flows into revenue on a very laggy sort of basis; that's just the way the revenue recognition rules work. So you take out a lot of that: these would be shorter-cycle projects to begin with, there's going to be less need to take on technical or timing risk, and the revenue recognition, I think, will be more evenly spread and more matched with the actual work that we're doing over a much tighter time frame. It'd be months instead of years. So I can't tell you if that's 50% faster than an equivalent-size project, we'll have to see, but it will be materially faster, I think, from a revenue perspective than the equivalent end-to-end cell engineering solutions-type project.

U
Unknown Analyst

Got it. Given my choppy internet, I'm going to keep it to one and ask the rest on the follow-up.

M
Megan LeDuc
executive

Thanks, Edmond. I'm not seeing any other questions in the queue. So Jason, do you have any closing thoughts for us?

J
Jason Kelly
executive

No. As I mentioned, this is tough for us internally with the head count reduction, and I appreciate the support of the team. I think Ginkgo is going to come out of this in a much stronger spot to make biology easier to engineer, and I think we have a chance to do that on a horizontal basis across the entire biotech industry. So we're excited to go forward and do that. Thanks, everyone, for your questions.

M
Megan LeDuc
executive

Thanks all. We'll talk to you all next quarter. Have a good one.