Arista Networks Q2 2023 Earnings Call Transcript

There are 19 speakers on the call.

Operator

Welcome to the Second Quarter 2023 Arista Networks Financial Results Earnings Conference Call. During the call, all participants will be in a listen-only mode. After the presentation, we will conduct a question and answer session. Instructions will be provided at that time. As a reminder, this conference is being recorded and will be available for replay from the Investor Relations section of the Arista website following this call.

Operator

Ms. Liz Stein, Arista's Director of Investor Relations, you may begin.

Speaker 1

Thank you, operator. Good afternoon, everyone, and thank you for joining us. With me on today's call are Jayshree Ullal, Arista Networks' President and Chief Executive Officer, and Ita Brennan, Arista's Chief Financial Officer. This afternoon, Arista Networks issued a press release announcing the results for its fiscal Q2 ending June 30, 2023. If you would like a copy of the release, you can access it online at our website.

Speaker 1

During the course of this conference call, Arista Networks management will make forward-looking statements, including those relating to our financial outlook for the Q3 of the 2023 fiscal year; longer-term financial outlooks for 2023 and beyond; our total addressable market and strategy for addressing these market opportunities, including AI; customer demand trends; supply chain constraints; component costs; manufacturing output; inventory management; and inflationary pressures on our business, lead times, product innovation, working capital optimization and the benefits of acquisitions, which are subject to the risks and uncertainties that we discuss in detail in our documents filed with the SEC, specifically in our most recent Form 10-Q and Form 10-K, and which could cause actual results to differ materially from those anticipated by these statements. These forward-looking statements apply as of today, and you should not rely on them as representing our views in the future. We undertake no obligation to update these statements after this call. Also, please note that certain financial measures we use on this call are expressed on a non-GAAP basis and have been adjusted to exclude certain charges. We have provided reconciliations of these non-GAAP financial measures to the GAAP financial measures in our earnings press release.

Speaker 1

With that, I will turn the call over to Jayshree.

Speaker 2

Thank you, Liz, and happy last day of July, everyone. We delivered revenues of $1,460,000,000 for the quarter, with a non-GAAP earnings per share of $1.58. Services and software support renewals contributed approximately 15.2% of revenue. Our non-GAAP gross margin of 61.3% was influenced by improving supply chain overheads and higher enterprise contribution. We do expect gross margins to consistently improve every quarter this year and stabilize in 2024. International contribution registered at 21%, with the Americas at 79%.

Speaker 2

As we surpass 75,000,000 cumulative cloud networking ports, we are experiencing 3 refresh cycles with our customers: 100 gigabit migration in the enterprises, 200 and 400 gigabit migration in the cloud, and 400 going to 800 gigabit for AI workloads. During the past couple of years, we have enjoyed a significant increase in cloud CapEx to support our cloud titan customers for their ever-growing needs, tech refresh and expanded offerings. Each customer brings a different business and mix of AI networking and classic cloud networking for their compute and storage clusters. One specific cloud titan customer has signaled a slowdown in CapEx from previously elevated levels.

Speaker 2

Therefore, we expect near-term cloud titan demand to moderate, with spend favoring their AI investments. We do project, however, that we will grow in excess of 30% annually in 2023, versus our prior Analyst Day forecast of 25%. The AI opportunity is exciting. As our largest cloud customers review their priorities, Arista is adapting to these changes, thereby doubling down on our investments in AI. Arista is a proud founding member of the Ultra Ethernet Consortium, which is on a mission to build open, multi-vendor AI networking at scale based on proven Ethernet and IP.

Speaker 2

There are a lot of software and EOS considerations for AI. AI traffic and performance demands are different: the traffic comprises a small number of synchronized, high-bandwidth flows, making them prone to collisions that slow down the job. As we connect thousands of GPUs generating billions of parameters for petascale clusters, Arista's EOS capabilities must also scale along with our AI spine and leaf platforms to achieve that consistent performance and throughput. Arista has been developing EOS features such as intelligent load balancing and advanced analyzers to report and rebalance flows to achieve predictable performance. Customers can now pick and choose programmable packet header fields for better entropy and efficient load balancing of their AI workloads. Network visibility is also important in the training phase for large datasets. Arista's new AI analyzer monitors and reports traffic counters at microsecond-level windows to detect and address microbursts.
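To illustrate the microburst point above, here is a small, purely hypothetical Python sketch. This is not Arista EOS code; the link rate, window size, threshold and function name are all invented for illustration. It shows why sampling counters over microsecond-scale windows exposes short bursts that a coarse per-second average would hide:

```python
# Hypothetical sketch: detect microbursts from per-window byte counters.
# A 1-second average of this traffic looks moderate, but microsecond-scale
# windows reveal short intervals running near line rate.

LINE_RATE_BPS = 400e9   # assumed 400 Gb/s link (illustrative)
WINDOW_US = 100         # assumed counter sampling window, in microseconds
BURST_THRESHOLD = 0.9   # flag windows above 90% of line-rate capacity

def find_microbursts(byte_counts, window_us=WINDOW_US,
                     line_rate_bps=LINE_RATE_BPS,
                     threshold=BURST_THRESHOLD):
    """Return indices of sampling windows whose utilization exceeds threshold.

    byte_counts: bytes observed in each consecutive window.
    """
    window_s = window_us / 1e6
    capacity_bytes = line_rate_bps / 8 * window_s  # max bytes per window
    return [i for i, b in enumerate(byte_counts)
            if b / capacity_bytes > threshold]

# Mostly idle traffic with two saturated 100-microsecond bursts.
samples = [1000, 2000, 4_600_000, 1500, 800, 4_800_000, 1200]
print(find_microbursts(samples))  # -> [2, 5]
```

Averaged over the whole sample period, this link looks lightly loaded; only the fine-grained windows make the two bursts (and the queueing they would cause) visible.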

Speaker 2

Our AI strategy and platforms are resonating well with our early customers. Presently, in 2023, we are in the middle of trials for back end AI networks, leading to pilots in 2024. We expect larger clusters and production deployments in 2025 and beyond. In the decade ahead, AI networking will become an extension of cloud networking to form a cohesive and seamless front end and back end network. In the non cloud enterprise category, we continue to experience good momentum in both data center and campus.

Speaker 2

Let me illustrate with a few customer wins. The first is a new international transportation win, where the customer was seeking to modernize their legacy campus. Their endpoints included large and small campus locations, internal communication devices, various IoT, CCTV, display boards and much more. The customer mandated a fully automated workflow. Arista presented a highly optimized, best-of-class cognitive campus.

Speaker 2

With Arista's single binary EOS image across all campus platforms, complete with a universal API and built-in automation features, the customer was set on a path to continued campus modernization. The next enterprise win involves both data center and campus, with advanced EVPN, L3 VPN over VXLAN routing architectures instead of the traditional Layer 2 extension. Distributed AVA sensors were strategically positioned within the network to capture and analyze traffic at critical points. This zero trust approach emphasizes threat mitigation throughout the network, as opposed to relying solely on siloed security. The integration of real-time streaming telemetry and visibility capabilities proved to be paramount in obtaining this operational acceptance.

Speaker 2

The final win was in a large public sector organization, connecting redundant data centers to hundreds of campus locations with a large routing environment. They were challenged with complex MPLS routing that was hard to operate across the WAN and campus network. An upgrade of any magnitude implied change controls costing several million dollars to touch all their sites. Arista demonstrated that the customer could use a single spine for both LAN and WAN to dramatically simplify and automate the whole environment within 30 days. This 80% reduction in total cost of ownership was made possible with Arista's modern cloud operating model.

Speaker 2

You can see a recurring theme here across all these customer wins, which is the power of our platform innovation, quality and support, with a low TCO and a single CloudVision and EOS software stack. Arista is diversifying its business to transform the enterprise to a modern network operating model. Before I hand over to Ita, I would like to share with you that Ita is planning to retire sometime next year in 2024. She has had a stellar career at Arista as our Chief Financial Officer. Ita has been our business partner and friend for the past 8 years. She has displayed the Arista way, always prioritizing our customers, employees and shareholders. Ita has demonstrated and delivered both growth and profitability with a very, very small G and A investment, often only 1.5% of revenue.

Speaker 2

These types of pristine financials are so rare in a fast-growing tech company and only possible with a shared vision between the CFO and CEO. Ita, thank you for your steady leadership and contribution. Undoubtedly, we will miss you next year when you retire. Over to you for financial metrics.

Speaker 3

Thanks, Jayshree. It's very kind. It's been an amazing experience working with you and the whole Arista team over the last 8 years. Now back to the numbers. This analysis of our Q2 results and our guidance for Q3 is based on non-GAAP results and excludes all non-cash stock-based compensation impacts, certain acquisition-related charges and other non-recurring items.

Speaker 3

A full reconciliation of our selected GAAP to non-GAAP results is provided in our earnings release. Total revenues in Q2 were $1,460,000,000, up 38.7% year over year and well above the upper end of our guidance of $1,350,000,000 to $1,400,000,000. Services and subscription software contributed approximately 15.2% of revenue in the quarter, up from 14.9% in Q1. International revenues for the quarter came in at $304,400,000, or 20.9% of total revenue, up from 17.5% last quarter. This quarter-over-quarter increase largely reflected a healthy contribution from our enterprise customers in EMEA and APAC and some reduction in domestic shipments to our cloud titan customers, which were unusually robust in the prior quarter. Overall gross margin in Q2 was 61.3%, in line with our guidance of approximately 61% and up from 60.3% last quarter.

Speaker 3

We continue to see incremental improvements in gross margin quarter over quarter, with higher enterprise shipments and better supply chain costs, somewhat offset by the need for some additional inventory reserves as customers refine their forecasted product mix. Turning to operating expenses: R and D spending came in at $188,500,000, or 12.9% of revenue, up from $164,800,000 last quarter. This primarily reflected increased headcount and higher new product introduction costs in the period. Sales and marketing expense was $79,600,000, or 5.5% of revenue, compared to $75,900,000 last quarter, with increased headcount and product demo costs. Our G and A costs came in at $19,100,000, or 1.3% of revenue, consistent with last quarter.

Speaker 3

Our operating income for the quarter was $606,500,000, or 41.6% of revenue. Other income and expense for the quarter was a favorable $31,600,000, and our effective tax rate was 21.4%. This resulted in net income for the quarter of $501,200,000, or 34.4% of revenue. Our diluted share number was 316,500,000 shares, resulting in a diluted earnings per share number for the quarter of $1.58, up 46% from the prior year. Now turning to the balance sheet.

Speaker 3

Cash, cash equivalents and investments ended the quarter at approximately $3,700,000,000. In the quarter, we repurchased $30,000,000 of our common stock at an average price of $137.20 per share. We've now repurchased $855,500,000, or 8,000,000 shares at an average price of $107 per share, under our current $1,000,000,000 Board authorization. This leaves $145,000,000 available for repurchase in future quarters. The actual timing and amount of future repurchases will be dependent on market and business conditions, stock price and other factors. Now turning to operating cash performance for the Q2.

Speaker 3

We generated approximately $434,100,000 of cash from operations in the period, reflecting strong earnings performance, partially offset by ongoing investments in working capital. DSOs came in at 49 days, down from 57 days in Q1, reflecting a strong collections quarter with good linearity of billings. Inventory turns were 1.2 times, down from 1.3 times last quarter. Inventory increased to $1,900,000,000 in the quarter, up from $1,700,000,000 in the prior period, reflecting the receipt of components from our purchase commitments and an increase in switch-related finished goods. Our purchase commitments at the end of the quarter were $2,200,000,000, down from $2,900,000,000 at the end of Q1.

Speaker 3

We expect this number to continue to decline in future quarters as component lead times improve and we work to optimize our purchase commitments. Our total deferred revenue balance was $1,085,000,000, down from $1,092,000,000 in Q1. The majority of the deferred revenue balance is services-related and directly linked to the timing and term of service contracts, which can vary on a quarter-by-quarter basis. Our product deferred revenue balance declined approximately $33,000,000 from last quarter. Accounts payable days were 57 days, up from 55 days in Q1, reflecting the timing of inventory receipts and payments. Capital expenditures for the quarter were $11,600,000. Now turning to our outlook for the Q3 and beyond.

Speaker 3

To recap, global supply chain disruptions over the last couple of years necessitated elongated planning horizons and customer demand signals. The corollary is also true: improving lead times are now driving shorter planning horizons and demand signals, delaying when customers need to place new orders. This is particularly true of our cloud titan customers, who, following a year of elevated purchases, must now grapple with changing technology roadmaps and priorities before providing visibility to future demand later in the year. On the supply side, we expect to continue to ship against previously committed deployment plans for some time, targeting supply improvements where most needed, but also being careful not to create redundant customer inventory.

Speaker 3

In spite of the return to shorter lead times and reduced visibility, we are executing well, with gradual incremental improvements to our 2023 outlook, which now calls for year-over-year growth in excess of 30%. On the gross margin front, we expect continued progress through the end of the year, reflecting supply chain and manufacturing benefits, while maintaining a reasonably healthy cloud contribution. Now turning to spending and investments. We continue to monitor the overall macro environment carefully, and we will prioritize our investments as we move through the year.

Speaker 3

This will include a focus on targeted hires in R and D and go-to-market as the team sees the opportunity to add talent. On the cash front, while we'll continue to focus on supply chain and working capital optimization, you should expect some continued growth in inventory through the end of the year. Also, as a reminder, our 2023 tax payments have been deferred to October and will represent a significant use of cash in that quarter. With all of this as a backdrop, our guidance for the Q3 is based on non-GAAP results and excludes any non-cash stock-based compensation impacts and other non-recurring items: revenues of approximately $1,450,000,000 to $1,500,000,000; gross margin of approximately 62%; and operating margin of approximately 41%. Our effective tax rate is expected to be approximately 21.5%, with diluted shares of approximately 318,000,000 shares.

Speaker 3

I will now turn the call back to Liz. Liz?

Speaker 1

Thank you, Ita. We will now move to the Q and A portion of the Arista earnings call. To allow for greater participation, I'd like to request that everyone please limit themselves to one question. Thank you for your understanding. Operator, take it away.

Operator

Thank you. We will now begin the Q and A session. We ask that you pick up your handset before asking questions in order to ensure optimal sound quality. Your first question comes from the line of Mike Ng with Goldman Sachs.

Speaker 4

Hey, this is Mike Ng from Goldman Sachs. Thanks for the question. I was just wondering if you could talk a little bit more about the outlook for in excess of 30% year over year growth this year on revenue. What's gotten better relative to last earnings call? If you could talk about it in the context of Cloud titans versus enterprise, that would be helpful.

Speaker 4

I'm just trying to reconcile the revenue upgrade versus the commentary about near-term cloud titan growth moderating. Thank you.

Speaker 2

Yes. Thanks, Mike. I think it's pretty clear that, setting aside some quarter-over-quarter variation, our enterprise momentum continues to get stronger and better, and our cloud is strong. However, it's got 2 components now: there's the classic cloud networking, and then the AI.

Speaker 2

So we're reconciling how we double down more on AI, which we are feeling stronger and stronger about. And even on the cloud, you know that the last 2 years have just been out of this world and phenomenal. So while it's moderating, it's still pretty good.

Speaker 1

Thanks, Mike.

Speaker 5

Thanks, Jayshree.

Operator

We'll take our next question from Tim Long with Barclays.

Speaker 6

Thank you. Jayshree, I was hoping you could dig more into some of the comments around AI. It sounds like there's a large pipeline there, and you talked about kind of the stages, with 2025 being the big growth area. I'm curious if you can talk a little bit about a few things related to that.

Speaker 6

One, do you see the move to AI expanding or diversifying more your cloud titan or your cloud customers? And second, can you talk about, kind of over the next year or 2, the InfiniBand versus Ethernet debate? I think you guys have been trialing some Ethernet inside clusters. Can you just give us an update on how you think the competition between those two technologies is going to play out? Thank you.

Speaker 2

Okay. Thank you, Tim. Maybe my answer will be shorter than your question. I think the gist of what I'd like to first of all say is, the majority of Arista's participation has been in the front end of the network, right? And we're getting a chance for the first time ever to play in the back end.

Speaker 2

So when we think AI, there are clearly some ramifications of bandwidth on the front end of the network, but we're not counting that. So we're truly thinking of something that's incremental, brand new, with a lot of work to do in testing, proving, pilots and trials before we get into production. Today, I would say in the back end of the network, there are basically 3 classes of networks. One is very, very small networks that are within a server, where customers use PCIe, CXL and proprietary NVIDIA-specific technologies like NVLink, in which Arista does not participate. Then there are more medium clusters, you can think generative AI and more for inference, where they may well get built on Ethernet.

Speaker 2

For the extremely large clusters with large language training models, especially with the advent of ChatGPT-3 and 4, you're talking about not just 1,000,000,000 parameters, but an aggregate of 1,000,000,000,000 parameters. And this is where Ethernet will shine. But today, the only technology that is available to customers is InfiniBand. So obviously InfiniBand, with 10, 15 years of maturity in an HPC environment, is often being bundled with the GPUs. But the right long-term technology is Ethernet, which is why I'm so proud of what the Ultra Ethernet Consortium and a number of vendors are doing to make that happen.

Speaker 2

So near term, there's going to be a lot of InfiniBand, and Arista will be watching that outside in, but longer term, Arista will be participating in an Ethernet AI network. And neither technology, I want to say, was perfectly designed for AI. InfiniBand was more focused on HPC, and Ethernet was more focused on general-purpose networking. So I think the work we are doing with the Ultra Ethernet Consortium to improve Ethernet for AI is very important.

Speaker 7

Okay. Thank you. That's very helpful.

Operator

We'll take our next question from Meta Marshall with Morgan Stanley.

Speaker 8

Great. Thanks. I mean, just revisiting kind of the cloud titan commentary: does the change that you're seeing mean that they're completing kind of one upgrade cycle and there might just be time before the next upgrade cycle? Or are there real changes to kind of current deployment plans, or kind of deployments of the current upgrades? Thanks.

Speaker 2

Sorry. Meta, were you addressing the question to Ita?


Speaker 8

I guess I was addressing to whoever wants to answer about the kind of commentary on the changes in the cloud titan order.

Speaker 3

Go for it, Anshul.

Speaker 9

Meta, when you look at the cloud customers in the last few quarters, especially since the advent of ChatGPT, there's been a rotation in demand. It's not that they're done with the upgrades, but they had to reprioritize their business and their deployments for AI. You've seen the competitive battle between the largest of the large titans in the world trying to get ahead. But we see signs of that settling, and in the future we believe they'll be back to adding and refreshing the standard compute infrastructure as well.

Speaker 2

I always like to say that you can only diet for so long; eventually you have to eat. So I think we will see a nice mix of AI and cloud networking over time.

Speaker 3

Yes. And I think the lead time improvements have kind of facilitated them waiting a little bit longer than what we've gotten used to over the last couple of years. But again, we're going to start coming within lead time here pretty soon, and then we'll see.

Speaker 9

Thank you.

Speaker 1

Thanks, Meta.

Operator

We'll take our next question from Ben Bollin with Cleveland Research.

Speaker 7

Good evening, everyone. Thanks for taking the question. Ita, congrats. I had a question for you. I was hoping you could speak to where you see lead times presently; you talked about taking a little bit more of a managed approach to inventory levels at customers.

Speaker 7

Could you talk about some strategies that you employ to manage that, and where you think inventory levels are within those accounts? Thank you.

Speaker 3

I think, look, the lead times are mixed across products. I mean, our goal certainly is to try to get back to like a 6-month lead time, maybe by the end of the year, certainly early next year, but it is currently mixed across products. On the commentary around customer inventory: we've been very diligent all the way through the supply chain process, trying to make sure we understood demand when it showed up and that it was put into reasonable deployment schedules and deployment plans. And we just want to continue to do that as we come through the other side of, I believe, that whole supply chain disruption. So it's really more understanding kind of what customers need, when they need it, and again, being able to prioritize and make sure that we understand that.

Speaker 3

So it's really a continuation of what we were doing honestly on the other side of the supply chain, when we had this kind of accelerated demand and were very focused on deployment schedules and timing. And this is just the other side of that, again, making sure we understand what's needed.

Speaker 2

Thank you.

Operator

We'll take our next question from Antoine Chkaiban with New Street Research.

Speaker 10

Hi. Thanks a lot for taking my question. This is maybe a bit of a longer-term question, but can you please provide an update on the opportunity at hyperscalers beyond your 2 largest customers? Does the accelerated deployment of AI clusters potentially open the door to business with the other 2 hyperscalers as the complexity of the network increases rapidly?

Speaker 9

So, this gets asked quite often, how we're doing there, and we continue to do well with them. As I mentioned before, not all titans are the same in terms of size. Some are small and we do very well with them, but they're just not as big as our 2 large ones. And with others who have the potential, we're still doing very well technologically, but we haven't seen the opportunity open up yet. It's not that we're losing to anybody; it's just that nothing has changed. And we continue to invest with them, and we believe the opportunity is still ahead of us.

Speaker 2

Exactly, Anshul. I think the way to look at our AI opportunity is that it's 10 years ahead of us. And we'll have early customers in the cloud with very large datasets trialing our Ethernet now, and then we will have more cloud customers, not only titans, but other high-end Tier 2 cloud providers and enterprises with large datasets, that would also trial us over time. So in 2025, I expect to have a large list of customers.

Operator

Our next question comes from Amit Daryanani with Evercore.

Speaker 11

Thanks, and congrats on a nice quarter here. I guess my question is really, there's been a fair bit of debate among investors on what calendar 2024 looks like for Arista, and the fear, I think, is it could look like calendar 2020, when you had some cloud digestion. I realize it's really early for calendar 2024; just how to think about the puts and takes into next year would be helpful. And maybe, Jayshree, you could talk about how you think Arista is different today versus the calendar 2019, calendar 2020 timeframe. That would be helpful.

Speaker 2

Yes, that's a really good question. Stay tuned for our 2024 guide when we have our annual Analyst Day sometime in November. But qualitatively speaking, we're a very different company today than 3 years ago. Clearly, we've doubled down on our cloud titans, and you know that they're getting stronger and stronger. But even in the cloud titan use cases, Anshul and the team have worked to have a number of use cases.

Speaker 2

It isn't just one. And the addition of AI to those use cases just gave us a whole lot of broad opportunity from front end to back end, right? So to me, the holistic, seamless cohesion between the front end and back end will get even more important as time goes on with the cloud titans. We also see that we're stronger in Tier 2 providers and of course the broader enterprise. Both of these were not as strong for us 3 years ago, and they also represent AI opportunities. But as you know, they also represent campus, routing and classic data center opportunities, and allow us to go target a much larger TAM. Again, 3 years ago, it was probably $30,000,000,000; 3 years later, it's well north of $50,000,000,000. So I feel we are much more diversified.

Speaker 2

And while we deeply appreciate M and M, there are a lot more customers beyond that.

Speaker 1

Thank you, Amit.

Operator

We'll take our next question from Tal Liani with Bank of America.

Speaker 5

Hi, Ita. I have to ask you a tough one before you go, so you have a good taste for the next few years. How much of the growth this time is coming from backlog drawdown? Can you give us some information about the order trends rather than revenues? And the reason why I'm asking is because your guidance for 3Q is 25% growth.

Speaker 5

When I look at 4Q, the implied growth is 11%. So there is a sharp deceleration in growth in 4Q, and I'm wondering if it's a function of the end of elevated backlog. Thanks.

Speaker 3

I mean, look, we haven't talked about backlog and orders. I think we've talked more just in terms of deployments and deployment slots. And if you think back to my commentary, we do believe that there are ongoing deployments that will go well into 2024. So, again, I don't necessarily sign up to the terminology of the backlog and the drawdown, etcetera, because given the orders and the patterns of the orders, it's very difficult to talk in that language, right? But I think in terms of deployments, you will have deployments that are already planned and scheduled into 2024. I think we're taking it quarter by quarter through the end of the year, but I'd still go back to my kind of incremental view: look at it incrementally quarter over quarter, and continue to show some improvement.

Speaker 3

And as we guide to Q3, Q4 takes some similar kind of incremental improvement, and I think that's the way to think about it for now. But again, I don't think it changes our commentary on kind of demand and lead time trends, right? I mean, as lead times shorten, you will see some period of time where customers don't need to place orders until you get back within lead time. That dynamic is certainly there. And as we get closer to the end of the year, we'll get more visibility into next year.

Speaker 2

Tal, I think I know you asked a difficult question. Sorry, sorry, it's a difficult question. Look, as I said, we'll know more as time goes on. And we think the business is strong, and whether that comes in strongly in 2024 or 2025 or somewhere in between, we shall see, right?

Speaker 2

And the reality is it will be difficult to repeat the last 2 years of exceptional cloud CapEx for cloud networking. So as they go through that deployment and as they look at AI and as we bring in the enterprise and Tier 2 cloud, we've got a nice mix of things. And I urge everyone to think of our business as Ita has always alluded to, not in 1 quarter or even 1 year, but really a 3 year CAGR. And I think our 3 year CAGR will continue to be in double digits and good numbers.

Speaker 5

Great. Thank you.

Speaker 2

Thanks, Tal.

Operator

We'll take our next question from Sebastien D'Argentle with William Blair.

Speaker 5

Great. Thanks for taking the question. Can you maybe just update us on the visibility in your customer base? Is it still around 6 months, or are we now down closer to 3 months? And maybe just longer term, do you think that generative AI could help improve that visibility from where it's historically been, just given that many of these hyperscalers have what seems like decent visibility into a pretty robust pipeline over the next few years?

Speaker 2

Yes. That's a very good question, Sebastian. Since we have so many products in the mix, I have to break your question into visibility across multiple areas. Enterprise, I would say 6 months to 12 months, generally speaking. In the cloud, given the reduced lead times on classic cloud networking, it's less than 6 months.

Speaker 2

However, on AI, it is greater, since it's an early cycle and we have to do a lot more joint development. So you can think of it as three migrations going on with different visibility patterns.

Operator

We'll take our next question from Samik Chatterjee with JPMorgan.

Speaker 12

Hi. Thanks for taking my question. Maybe if I can shift gears here a bit to enterprise, Jayshree. Obviously, you're talking about the slowdown on the cloud side here a bit going into 2024, but when you look at enterprise, how do you think about sustaining the growth rate, or a slowdown in that growth rate, into 2024? What are you seeing in terms of orders on that front to give you visibility into 2024?

Speaker 12

Thank you.

Speaker 2

Look, Samik, I think this is an area that we feel pretty good about, and it's an area of great execution from Anshul, Chris Schmidt, Ashwin and the entire team, where we have really diversified our business globally in the enterprise. We're not just in the high-end financials; we're in just about every major vertical: healthcare, transportation, public sector, education, banks, insurance. So I feel good about enterprise, barring any macro issue, which is the thing we were always worried about for 2024. If macro doesn't let us down and we don't have to worry about the economy, we will have a strong year in enterprise.

Speaker 9

Thank you.

Operator

We'll take our next question from Aaron Rakers with Wells Fargo.

Speaker 13

Yes. Thank you for taking the question. I guess I wanted to ask just on product cycle cadence. There's a lot of focus from one of your key component suppliers on the merchant silicon side around 51.2 terabit silicon, and obviously supporting the 800 gig cycle. I'm curious, how do you think about the timing of that?

Speaker 13

When do we start to see the materializing deployment of 800 gig? And maybe that's tied to AI, maybe it's not, but just curious as to when that cycle you believe really starts to kick in?

Speaker 9

Aaron, we had the same discussion when the world went to 400 gig: are we switching from 100 to 400? The reality was the customers continued to buy both 100 and 400 for different use cases. 51.2T and 800 gig are especially being pulled by AI clusters; the AI teams are awaiting access to get their hands on it, move the data as quickly as possible and reduce the job completion time. So you'll see early traction there. You'll see, as Jayshree mentioned, trials really in 2024 going into volume in 2025, and that should be the ramp we'll follow for 800 gig. But that does not mean everything they just bought in the last few years at 400 gig for DCI and so on for classic clusters is going to get upgraded to 800 gig.

Speaker 9

I think that's going to be a longer cycle. So you will see 100, 200, 400, 800 get deployed in parallel as we enter that cycle in 2024, 2025. Thank you.

Speaker 2

Thanks. Okay.

Operator

We'll take our next question from Matthew Niknam with Deutsche Bank.

Speaker 10

Hey, thanks for taking the question. I'm just wondering on the supply chain, if you can talk about how that's evolved over the last quarter. And as it relates to gross margins, I think you're guiding to incremental improvements in 3Q and 4Q. Is that purely a function of easing supply chain? Or is there also maybe greater relative contribution from enterprise relative to cloud titans envisioned in the second half of the year as well?

Speaker 10

Thanks.

Speaker 3

I mean, we're definitely seeing improvement on the supply chain side. We're seeing improvements with freight, improvements with some of the expedite costs of the things that we were dealing with and had inventoried, and now we're releasing them. So I think we're coming out from underneath that. There is some small shift in mix as well, but it's still a good strong cloud mix this quarter, this year. It's not like we're back to a heavy enterprise mix with cloud becoming a much smaller part. There's still a very healthy cloud mix in this year.

Speaker 3

So it's more where you can back out the supply chain costs that we'd incurred in the past.

Speaker 2

Yes. I want to give a shout out. Mark Berlhardt, our new Senior VP of Manufacturing, and John McCool, both in our operations team, have done a fantastic job of optimizing our supply chain. So those improvements are really playing a role in our quarter-to-quarter gross margins. Thank you.

Operator

We'll take our next question from James Fish with Piper Sandler.

Speaker 14

Hey, thanks for the question. Just wanted to follow up around some of the prior questions asked, as many of mine have been asked already. But I know you guys aren't talking about visibility and don't discuss backlog. But is it still fair to assume that we should think about you guys returning to a normal environment from a supply perspective in the early part of next year? And I believe, Ita, you've talked about, underneath, assuming that hyperscalers or your cloud titans grow double digits for this year. Is it still fair to think about that kind of level for 2023?

Speaker 3

Yes. I think that's absolutely right. And I think that we kind of forget the cloud is still an important part of 2023. We're still executing on deployments and planning that we did some time back, right, all the way through this year. So cloud is still a significant piece of the business in 2023.

Speaker 2

Yes. And James, just to confirm, we expect a more normal setting in 2024 in terms of lead times. You would be right to assume that. Thanks, James.

Operator

We'll take our next question from Simon Leopold with Raymond James.

Speaker 15

Thanks for taking the question. I wanted to see if you could maybe do a little bit of unpacking in terms of what's driving your enterprise business, in that I think the conventional wisdom is that enterprises are challenged by recessionary forces on the cycle, and then the secular challenge around public cloud adoption. So what do you see happening? How much of this success is related to market share gains? How much to general cycles, products, etcetera? If we can unpack the enterprise traction. Thank you.

Speaker 2

Sure, Simon. Well, of course, we have market share gains; that is the result of our enterprise traction, I would say. But if you ask me why are we winning in the enterprise? I would say number one, from an alternative perspective, our customers haven't had one for a very long time.

Speaker 2

They haven't had a high-quality, high-support, very friendly software experience, a common leaf-spine architecture across their data center, campus and routing, in a long, long time. So I think the architectural shift in the enterprise to move to a modern cloud operating model is the number one reason that Arista has been chosen. They are seeking our architecture for that quality. In fact, Anshul and I were just talking about this before the call. We use the word cloudify a lot, and it's quite tricky right now: our high-end enterprises are really looking for the cloud principles, but on their premises. In terms of the shift between workloads in the cloud and workloads in the enterprise, it depends on the customer.

Speaker 2

You're still seeing some of the mid-market customers want to move their e-commerce workloads to the cloud, but a lot of their mission-critical applications stay on premises. So our hybrid strategy continues to dominate the enterprise decisions for the data center. Secondly, our entry into the campus and routing, as well as zero trust security and availability, etcetera, is adding more layers to the cake. So our product depth and breadth is getting better and better. So the cloud operating model, the product depth, and now actually we've been at it for, what do you say, 3 to 5 years maybe, especially in the United States; we've got more work to do internationally.

Speaker 2

I would say we've been engaging with these customers. I remember when Ita and I had a discussion, I want to say 5 years ago, where she was right and I was wrong and she persuaded me to invest more in the enterprise. I think all these things have gone into really making us who we are in the enterprise. And clearly, we are the gold standard and we have a seat at the table there.

Speaker 3

Thank you.

Speaker 2

Thank you, Simon.

Operator

We'll take our next question from David Vogt with UBS.

Speaker 10

Great. Thanks, guys, for taking the question, and congratulations, Ita. I just want to go back to the point and maybe help bridge the 2023 to 2024 to 2025 commentary that Jayshree mentioned, sort of strong double-digit growth. I think in the past you've talked about 15% growth across cycles. And I'm just trying to think through: is there enough in trials and pilots in 2024 to kind of get you to that kind of mid-teens growth over the next couple of years? And if not, does that mean that your enterprise business has to remain incredibly robust in 2024, upwards of high-teens to low-20% growth next year? I know you're not giving guidance, but trying to kind of walk the bridge to get from where we are today to 2025, where you're going to start to see more widespread AI deployments from a revenue recognition perspective. Thanks.

Speaker 3

And now you want to go to 2025 as well. I don't think we're ready to do that. That's a really good conversation for the Analyst Day, honestly. We obviously are very focused internally. As Jayshree reiterated earlier on, the business is a lot more robust with many different drivers. As you go through that period, cloud will ebb and flow, but it's still a healthy business. It has been a healthy business through those cycles. So I think we've got a lot of the building blocks, but how we're going to assemble them, maybe we'll save for the Analyst Day.

Speaker 2

We'll share the Lego plan more then. But David, rest assured that we are aiming for at least double-digit growth next year. And we'll go from there.

Speaker 10

Great. Thanks guys and congrats again.

Operator

Thank you. We'll take our next question from Erik Suppiger with JMP Securities.

Speaker 16

Yes. Thanks for taking the question. Maybe this is for Anshul. Can you just walk us through kind of how the cloud titans work? We hear a lot about them buying volumes of GPUs right now.

Speaker 16

At what point does their purchasing of GPUs translate into their demand for switches? How does it work with trials and so on and so forth?

Speaker 9

So there is no uniform recipe. In general, when they're buying GPUs to connect, it could be a few quarters, depending on their timing of deployments, to build the network. It takes them a couple of months, sometimes a quarter or more, to fine-tune the cluster and benchmark and test everything before it is actually released to production. So you can think of that as the baseline: a couple of months to a couple of quarters minimum before you can get there, and sometimes it adds up to about a year before you really ramp into production.

Speaker 2

Great. Thanks, Erik.

Operator

We'll take our next question from George Notter with Jefferies.

Speaker 17

Hi, guys. Thanks a lot. I guess I wanted to ask about your comments about 2025 participation in AI. Could you walk us through sort of the milestones that you see between now and then in terms of increasing Arista's participation? Certainly there's new product development; there's market acceptance, I presume. And then also, I assume that you participate today with inferencing applications, and that's by and large done on Ethernet.

Speaker 17

I think what we're really talking about is training, correct? So any more color there would be great. Thanks.

Speaker 2

Yes, George. So I think you can look at 2023 as nearly a year of planning for AI, because as I said, there's tons and tons of GPUs being purchased, and then the question is how they are being connected. So depending on whether they're small, medium or large, there are different technologies, but I'm going to stay focused on the large because that's the biggest problem. You are right to say some of them may be Ethernet, or even a non-networking technology, just an I/O or a bus for smaller networks. But generally speaking, we're focusing on things that are much larger than 200 or even 1,000 GPUs.

Speaker 2

So that's the first thing. So a lot of planning is going into that. And the planning basically is how do they get the GPUs? What is their application? What is the size of the cluster?

Speaker 2

What is the time frame? What are the large language model datasets, etcetera? And what is their network foundation? In some cases where they just need to go quick and fast, as I explained before, it would not be uncommon to just bundle their GPUs with an existing technology like InfiniBand. But where they're really rolling out into 2025, they're doing more trials and pilots with us to see what the performance is, to see what the drop is, to see how many they can connect, what's the latency, what's the better entropy, what's the efficiency, etcetera.

Speaker 2

That's where we are today. Now we expect next year this will translate to some of what I call pilots, because the majority of them will happen in 2025. But in 2024, you'll start seeing, what do you say, Anshul, maybe 4,000 to 8,000 GPUs, something

Speaker 3

like that?

Speaker 9

That's right, in that range.

Speaker 2

In that range, okay. 4,000 to 8,000 GPUs at about 400 gig type clusters, but we'll actually put some production workloads on it. So I call them smaller pilots. But the real test of why you buy these expensive GPUs is in 2025, when you want to have not just 4,000 to 8,000, but 30,000, 50,000, maybe even 100,000. This is why 2025 is so critical.

Speaker 2

And testing and taking all the kinks out of the GPUs and networks is important, because a good network is so pivotal to getting the most out of your GPUs. If you have idling cycles on those GPUs, you've wasted thousands, if not millions, of dollars. And so I think these next 2 years are crucial to getting the most out of these expensive GPUs, and that's where the network really matters. Anshul?

Speaker 9

If I can add one more thing here, on what the milestones are to get to these 2025 large-cluster requirements: there is one key milestone that has nothing to do with GPUs or our switches, which is, does the customer have enough power in the data center, ready to deploy that many megawatts or gigawatts of capacity? And as you know, getting a site with that many megawatts takes a couple of years, which is why this is a slow ramp. This is not suddenly turn on the key and you have thousands of GPUs.

Speaker 2

Yes, really good point. Simple things like power and space are still vital.

Speaker 1

Thanks, George.

Operator

We'll take our next question from Karl Ackerman with BNP Paribas.

Speaker 15

Yes, thank you. Jayshree, there's been some investor concern that hyperscale customers may focus more on white box solutions for 800 gig than in the 400 gig cycle. We're aware that some of your customers continue to adopt a dual-sourcing strategy. But if you could just comment on the potential for an upgrade cycle, as well as reuse risk on the transition to 800 gig, that would be very helpful. Thank you.

Speaker 2

Sure. So as you're probably well aware, the white box question has remained with Arista as one of the most popular questions asked, right from the time of our IPO, whether it's 10 gig, 40 gig, 100 gig, 400 gig, or now you ask it at 800 gig. I think there will always be an element of white box if somebody is just looking to build something and throw in some quick traffic. But for some of these most mission-critical networks, it's less about the box and more about the software stack and how much performance, availability and power you really get out of it. So the cost of putting in the box, if you even save something, is far dwarfed by the total OpEx you need to make that box work.

Speaker 2

So we continue to believe that we will coexist with white box, and some of our cloud titan customers will continue to run both SONiC and FBOSS, in the case of Microsoft and Meta, along with our EOS. But at the end of the day, whether it's a white box or a blue box, it's the software stack that really wins.

Speaker 1

Thanks, Karl. Operator, we have time for one last question.

Operator

Thank you. We'll take our last question from Ben Reitzes with Melius Research.

Speaker 18

Hey, thanks a lot for sneaking me in there. Congratulations, Jayshree. I wanted to ask about enterprise again. I think the comments you made around cloud titans were all things that people were able to detect, but the enterprise just seems so much better in terms of the performance and the guide. So you mentioned that you gained share, but did the market pick up as well? And did you see that pickup in demand in the enterprise sustaining into 2024?

Speaker 18

Just kind of more color around enterprise and whether the market picked up in addition to you gaining share?

Speaker 2

Hey, Ben, thank you. What do you mean by the market pickup? I don't follow the question.

Speaker 18

Did demand pick up? Because the enterprise outperformance was quite a surprise, and clearly the cloud titan commentary was subdued, as everybody was able to predict after the other conference calls this week. So I mean, was it all market share, or is the market picking up?

Speaker 2

I would say to you that our enterprise demand has probably always been strong and not subdued, far from it. However, dwarfed by the excellence of our cloud performance, you didn't notice it, and now you're noticing it. Thanks, Ben.

Speaker 1

This concludes the Arista Networks Second Quarter 2023 Earnings Call. We have posted a presentation which provides additional information on our results, which you can access on the Investors section of our website. Thank you for joining us today, and thank you for your interest in Arista.

Operator

Thank you for joining, ladies and gentlemen. This concludes today's call. You may now disconnect.
