Arista Networks announced strong financial results for the third quarter of fiscal 2024, with record revenue and earnings per share. The company discussed its success in services and software support renewals, its international contribution, and its Arista 2.0 plans for 2025.
Arista is targeting double-digit annual growth and has issued a three-year compound annual growth rate forecast. The company also announced a 4-for-1 stock split and provided guidance for the fourth quarter and for fiscal 2025. Arista emphasized its Ethernet expertise and its position as a front-end networking leader.
Management discussed opportunities in the enterprise, in AI, and in its collaboration with Broadcom's Ethernet switch families. The company expects gross margin to decline due to customer mix but aims to maintain supply chain discipline. Overall, Arista Networks expressed optimism about its performance and growth prospects in AI networking.
Usage note: The Chinese translation below was produced by Google Translate and is provided for reference only; please rely on the English original.
Operator
Operator
Welcome to the third-quarter 2024 Arista Networks financial results earnings conference call.
歡迎參加 Arista Networks 2024 年第三季財務業績電話會議。
(Operator Instructions) As a reminder, this conference is being recorded and will be available for replay from the Investor Relations section at the Arista website following this call.
(操作員說明)謹此提醒,本次會議正在錄製中,並可在本次電話會議後透過 Arista 網站的投資者關係部分重播。
Ms. Liz Stine, Arista's Director of Investor Relations.
Liz Stine 女士,Arista 投資者關係總監。
You may begin.
你可以開始了。
Liz Stine - Director - Investor Relations Advocacy
Liz Stine - Director - Investor Relations Advocacy
Thank you, operator.
謝謝你,接線生。
Good afternoon, everyone, and thank you for joining us.
大家下午好,感謝您加入我們。
With me on today's call are Jayshree Ullal, Arista Networks Chairperson and Chief Executive Officer; and Chantelle Breithaupt, Arista's Chief Financial Officer.
參加今天電話會議的有 Arista Networks 董事長兼執行長 Jayshree Ullal;以及 Arista 財務長 Chantelle Breithaupt。
This afternoon, Arista Networks issued a press release announcing the results for its fiscal third quarter ending September 30, 2024.
今天下午,Arista Networks 發布新聞稿,宣布截至 2024 年 9 月 30 日的第三財季業績。
If you would like a copy of this release, you can find it online at our website.
如果您需要此副本,請在我們的網站上線上發布。
During the course of this conference call, Arista Networks management will make forward-looking statements, including those relating to our financial outlook for the fourth quarter of the 2024 fiscal year, our longer-term business model and financial outlooks for 2025 and beyond, our total addressable market and strategy for addressing these market opportunities, including AI, customer demand trends, supply chain constraints, component costs, manufacturing output, inventory management and inflationary pressures on our business, lead times, product innovation, working capital optimization and the benefits of acquisitions, which are subject to the risks and uncertainties that we discuss in detail in our documents filed with the SEC, specifically in our most recent Form 10-Q and Form 10-K, and which could cause actual results to differ materially from those anticipated by these statements.
在本次電話會議期間,Arista Networks 管理層將做出前瞻性聲明,包括與我們 2024 財年第四季的財務前景相關的聲明。 2025 年及以後的長期業務模式和財務前景、我們的整體目標市場和應對這些市場機會的策略,包括人工智慧、客戶需求趨勢、供應鏈限制、零件成本、製造產量、庫存管理和我們業務的通膨壓力、交貨時間、產品創新、營運資本優化和收購的好處,這些都受到我們在向SEC 提交的文件中詳細討論的風險和不確定性的影響,特別是在我們最新的表格10-Q和表格10- 中K 並且可能導致實際結果與這些陳述預期的結果有重大差異。
These forward-looking statements apply as of today, and you should not rely on them as representing our views in the future.
這些前瞻性陳述從今天起適用,您不應依賴它們來代表我們未來的觀點。
We undertake no obligation to update these statements after this call.
我們不承擔在本次電話會議後更新這些聲明的義務。
Also, please note that certain financial measures we use on this call are expressed on a non-GAAP basis and have been adjusted to exclude certain charges.
另請注意,我們在本次電話會議中使用的某些財務指標是在非公認會計原則的基礎上表示的,並已進行調整以排除某些費用。
We have provided reconciliations of these non-GAAP financial measures to GAAP financial measures in our earnings press release.
我們在收益新聞稿中提供了這些非公認會計原則財務指標與公認會計原則財務指標的調節表。
With that, I will turn the call all over to Jayshree.
這樣,我就把電話全部轉給 Jayshree。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Thank you, Liz, and thank you, everyone, for joining us this afternoon for our third-quarter 2024 earnings call.
謝謝 Liz,也謝謝大家今天下午參加我們的 2024 年第三季財報電話會議。
We delivered revenues of $1.81 billion for the quarter, with a record non-GAAP earnings per share of $2.40. Services and software support renewals contributed strongly at approximately 17.6% of revenue.
我們本季實現營收 18.1 億美元,非 GAAP 每股盈餘創紀錄的 2.40 美元。服務和軟體支援續約貢獻強勁,約佔營收的 17.6%。
Our non-GAAP gross margin of 64.6% was influenced by pressure from cloud titan customer pricing, offset by favorable enterprise margins and supply chain hygiene.
我們 64.6% 的非 GAAP 毛利率受到雲端巨頭客戶定價壓力的影響,但卻被有利的企業利潤率和供應鏈狀況所抵銷。
International contribution for the quarter registered at approximately 18%, with the Americas very strong at 82%.
本季的國際貢獻約為 18%,其中美洲非常強勁,達到 82%。
Clearly, Q3 2024 had a lot of bright spots, and we are encouraged by the strength and momentum of the company.
顯然,2024 年第三季有很多亮點,並受到公司實力和勢頭的鼓舞。
At our recent tenth anniversary celebration and vision event in June 2024, we covered a lot of ground on what we would have otherwise said at an Analyst Day.
在最近 6 月舉行的十週年慶典和 2024 年慶祝活動和願景活動中,我們討論了許多原本在分析師日要講的內容。
So today, I'd like to briefly expand on our Arista 2.0 plans for 2025.
今天,我想簡單闡述我們 2025 年的 Arista 2.0 計畫。
We believe that networks are emerging at the epicenter of mission-critical transactions and our Arista 2.0 strategy is resonating well with customers.
我們相信,網路正在成為關鍵任務交易的中心,我們的 Arista 2.0 策略正在與客戶產生良好的共鳴。
We are, we believe, the only pure-play network innovator for the next decade.
我們相信,我們是未來十年唯一的純粹網路創新者。
Our modern networking platforms are foundational for transformation from silos to centers of data.
我們的現代網路平台是從孤島轉型為資料中心的基礎。
This can be a data center a campus center, a WAN center or an AI center.
這可以是資料中心、園區中心、WAN 中心或 AI 中心。
At the heart of this is our state-oriented, publish-subscribe network data lake EOS software stack for multimodal data sets.
其核心是我們面向多模式資料集的面向國家的公共訂閱網路資料湖 EOS 軟體堆疊。
One simply cannot learn without having access to all this data.
如果無法存取所有這些數據,人們根本無法學習。
So it is all about the data.
所以一切都與數據有關。
We provide customers the state foundation for data for AI and machine learning without which AI and ML would just be buzzwords.
我們為客戶提供人工智慧和機器學習數據的國家基礎,沒有這些基礎,人工智慧和機器學習就只是流行語。
Arista is well positioned with the right network architecture for client to campus, data center, cloud and AI networking.
Arista 擁有適合客戶端到園區、資料中心、雲端和 AI 網路的正確網路架構。
Three principles guide us and differentiate us in bringing this data-driven networking.
三個原則指導我們並使我們在實現這種數據驅動的網路方面脫穎而出。
Number one, best-in-class, highly available, proactive products with resilience and hitless upgrades built in at multiple levels; number two, zero-touch automation and telemetry with predictive client-to-cloud one-click operations and granular visibility; and number three, prescriptive insights for deeper AI for networking, delivering AIOps and algorithms for security, availability and root cause analysis.
排名第一、同類最佳、高度可用的主動式產品,具有彈性和多級內建的無中斷升級;第二,零接觸自動化和遙測,具有預測性客戶端到雲端的一鍵式操作,具有精細的可見性,減少對第三點的依賴,為網路提供更深入的人工智慧的規範性見解,提供用於安全性、可用性和根本原因分析的AIOps和演算法。
Networking for AI is gaining a lot of traction as we move from trials in 2023 to more pilots in 2024, connecting to thousands of GPUs, and we expect more production in 2025 and 2026.
隨著我們從 2023 年的試驗轉向 2024 年的更多試點,收集數千個 GPU,人工智慧網路正在獲得很大的吸引力,我們預計 2025 年和 2026 年會有更多的生產。
In our vernacular, Arista AI centers are made up of both the back-end clusters and front-end networks.
用通俗的話來說,Arista AI 中心由後端集群和前端網路組成。
AI traffic differs greatly from cloud workloads in terms of diversity, duration, and size of flow.
人工智慧流量在多樣性、持續時間和流量大小方面與雲端工作負載有很大不同。
The fidelity of AI traffic flows, where the slowest flow matters and one slow flow could slow down the entire job completion time, is a crucial factor in networking.
最慢流量的人工智慧流量的保真度很重要,一個緩慢的流量可能會減慢整個作業的完成時間,這是網路中的關鍵因素。
Our AI centers connect seamlessly from the back end to the front end of compute, storage, WAN, and classic cloud networks.
我們的人工智慧中心從運算、儲存、WAN 和經典雲端網路的後端無縫連接到前端。
Arista is emerging as a pioneer in scale-out Ethernet accelerated networking for large-scale training and AI workloads.
投資者成為大規模培訓和人工智慧工作負載的橫向擴展乙太網路加速網路的先驅。
Our new Etherlink portfolio, with wire-speed 800-gig throughput and non-blocking performance, scales from single-tier to efficient two-tier networks for over 100,000 GPUs, potentially even 1 million AI accelerators with multiple tiers.
我們的新 Etherlink 產品組合具有線速 800 GIG 吞吐量和無阻塞性能,可從單層擴展到會話兩層網絡,支援超過 100,000 個 GPU,甚至可能支援 100 萬個多層 AI 加速器。
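As a rough sketch of why a two-tier design reaches that scale (the switch radix below is an illustrative assumption, not a figure from the call): in a non-blocking two-tier leaf-spine fabric built from radix-$k$ switches, each leaf splits its ports evenly between accelerators and spine uplinks, so the maximum accelerator count is approximately
\[
N_{\max} \approx \frac{k^{2}}{2}, \qquad \text{e.g. } k = 512 \;\Rightarrow\; N_{\max} \approx 131{,}000,
\]
which is consistent with the "over 100,000 GPUs" figure, while a single tier is bounded by the port count of one switch.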
Our accelerated AI networking portfolio consists of three families with over 20 switching products and not just one point switch.
我們的加速 AI 網路產品組合由三個系列組成,擁有 20 多種交換產品,而不僅僅是一款單點交換器。
At the recent OCP in mid-October 2024, we officially launched a very unique platform, the distributed Etherlink 7700, to build two-tier networks for up to 10,000 GPU clusters.
在最近的 2024 年 10 月中旬的 OCP 上,我們正式推出了一個非常獨特的平台,該平台分散式 Etherlink7700 為最多 10,000 個 GPU 叢集建立兩層網路。
The 7700 R4 Distributed Etherlink Switch (DES) platform was developed in close collaboration with Meta.
77R4 DES 平台是與 Meta 密切合作開發的。
While it may physically look like and be cabled like a two-tier leaf-spine network, DES provides single-stage forwarding with a highly efficient spine fabric, eliminating the need for tuning and encouraging fast failover for large AI accelerator-based clusters.
DES 的物理外觀和電纜結構類似於 2 層葉主幹網絡,它提供了具有高效主幹結構的單級轉發,無需調整並鼓勵基於人工智慧加速器的大型集群的快速故障轉移。
It complements our Arista flagship 7800 AI spine for the ultimate scale, with a differentiated, fair and fully scheduled cell-spraying architecture and a virtual output queuing fabric, saving valuable AI processor resources and improving job completion time.
它補充了我們的 Arista 旗艦 7800 AI 脊柱,以差異化的價格和完全調度的單元噴塗架構實現最終規模,並具有虛擬輸出固化織物,節省寶貴的 AI 處理器資源並縮短作業完成時間。
I would like to now invite John McCool, our Chief Platform Officer, to describe our 2024 platform and supply chain innovations after a challenging couple of years.
我現在想邀請我們的首席平台長 John McCool 描述我們在經歷了充滿挑戰的幾年後的 2024 年平台和供應鏈創新。
John, over to you.
約翰,交給你了。
John McCool - Chief Platform Officer, Senior Vice President - Engineering Operations
John McCool - Chief Platform Officer, Senior Vice President - Engineering Operations
Thank you, Jayshree.
謝謝你,傑什裡。
I'm pleased to report that the Arista 7700 R4 Distributed Etherlink Switch and the 7800 R4 Spine, along with the 7060 X6 AI Leaf that we announced in June, have entered production, providing our customers the broadest set of 800 gigabit per second Ethernet products for their AI networks.
我很高興地向您報告,我們在 6 月宣布的 Arista 7700 R4 分散式 Ecolink 交換器、7800 R4 Spine 以及 7060 X6 AI leaf 已投入生產,為我們的客戶提供最廣泛的 800 Gb/秒乙太網路產品。網路。
Together with 800 gigabit per second parallel optics, our customers are able to connect two 400 gigabit per second GPUs to each port, increasing the deployment density over current switching solutions.
結合每秒 800 吉比特的平行光學元件,我們的客戶能夠將每秒 400 吉比特的 GPU 連接到每個端口,從而提高了當前交換解決方案的部署密度。
This broad range of Ethernet platforms allows our customers to optimize density and minimize tiers to best match the requirements of their AI work.
這種廣泛的乙太網路平台使我們的客戶能夠優化密度並最小化層數,以最好地滿足其人工智慧工作的要求。
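To make the density arithmetic concrete (the 64-port count below is an illustrative assumption, not a figure from the call): with parallel optics, each 800 gigabit per second port breaks out into two 400 gigabit per second links, so
\[
\text{GPUs per switch} = 2 \times P_{800G}, \qquad \text{e.g. } 2 \times 64 = 128,
\]
versus 64 GPUs on a switch with the same number of native 400-gig ports, i.e. roughly double the deployment density.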
As our customers continue with AI deployments, they're also preparing their front-end networks.
隨著我們的客戶繼續進行人工智慧部署,他們也在準備他們的前端網路。
New AI clusters require new high-speed connections into the existing backbone.
新的人工智慧叢集需要與現有主幹網路建立新的高速連線。
These new clusters also increased bandwidth on the backbone to access training data, capture snapshots and deliver results generated by the cluster.
這些新叢集還增加了主幹網路的頻寬,以存取訓練資料、擷取快照並交付叢集產生的結果。
This trend is providing increased demand for 7800 R3 400-gigabit solution.
這一趨勢增加了對 7800 R3 400 Gb 解決方案的需求。
While the post-pandemic supply chain has returned to predictability, lead times for advanced semiconductors remain extended from pre-pandemic levels.
雖然大流行後的供應鏈已恢復可預測性,但先進半導體的交貨時間仍較大流行前的水平延長。
To assure availability of high-performance switching silicon, we've increased our purchase commitments for these key components.
為了確保高效能開關晶片的可用性,我們增加了對這些關鍵組件的採購承諾。
In addition, we will increase our on-hand inventory to respond to the rapid deployment of new AI networks and reduce overall lead times as we move into next year.
此外,我們將增加現有庫存,以應對新人工智慧網路的快速部署,並在進入明年時縮短整體交貨時間。
Our supply chain team continues to work closely with planning to best align receipt of these purchases with expected customer delivery.
我們的供應鏈團隊繼續與規劃密切合作,以最好地將這些採購的收貨與預期的客戶交付保持一致。
Next-generation data centers integrating AI will contend with significant increases in power consumption while looking to double network performance.
整合人工智慧的下一代資料中心將應對功耗的顯著增加,同時尋求雙倍的網路效能。
Our tightly coupled electrical and mechanical design flow allows us to make system-level design trade-offs across domains to optimize our solutions.
我們緊密耦合的電氣和機械設計流程使我們能夠跨領域進行系統級設計權衡,以優化我們的解決方案。
Our experience in co-design with the leading cloud companies provides insight into the variety of switch configurations required for these tightly coupled data center environments.
我們與領先的雲端公司共同設計的經驗讓我們能夠深入了解這些緊密耦合的資料中心環境所需的各種交換器配置。
Finally, our development operating software with SDK integration, device diagnostics and data analysis supports a fast time to design and production with a focus on first-time results.
最後,我們的開發操作軟體整合了 SDK、設備診斷和數據分析功能,支援快速設計和生產,並專注於首次結果。
These attributes give us confidence that we will continue to execute on our road map in this rapidly evolving AI networking segment.
這些屬性讓我們有信心在這個快速發展的人工智慧網路領域繼續執行我們的路線圖。
Back to you, Jayshree.
回到你身邊,傑什裡。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Thank you, John.
謝謝你,約翰。
And congrats on a very high performance here to you and your new executives, [Alex Rose, Mike Capas, Luke] and the entire team.
並祝賀您和您的新高管[亞歷克斯·羅斯、邁克·卡帕斯、盧克]以及整個團隊在這裡取得了非常出色的表現。
You guys have really done a phenomenal job.
你們確實做得非常好。
Critical to the rapid adoption of AI networking is the Ultra Ethernet consortium specification expected imminently with Arista's key contributions as a founding member.
AI 網路快速採用的關鍵是超乙太網路聯盟規範,Arista 作為創始成員的關鍵貢獻預計即將到來。
The UEC ecosystem for AI has evolved to over 97 members.
UEC 人工智慧生態系統已發展到擁有超過 97 個成員。
In our view, Ethernet is the only long-term viable direction for open standard space AI networking.
我們認為,乙太網路是開放標準空間人工智慧網路唯一長期可行的方向。
Arista is building holistic AI centers powered by the unparalleled superiority of EOS and the depth of automation and visibility software provided by CloudVision.
Arista 正在利用美國無與倫比的優勢以及 CloudVision 提供的深度自動化和可見性軟體來建立整體人工智慧中心。
Arista EOS delivers dynamic methods using cluster load balancing for congestion control and smart system upgrades, where the traffic for AI continues to flow in the midst of an upgrade.
Arista EUS 提供使用叢集負載平衡進行擁塞控制和智慧型系統升級的動態方法,其中 AI 流量在升級過程中繼續流動。
Arista continues to work with AI accelerators of all types, and we're agnostic to NICs, bringing advanced EOS visibility all the way down to the host.
Arista 繼續與所有類型的 AI 加速器合作,我們不知道能否將先進的 EOS 可見度一直帶到主機。
Shifting to 2025 goals.
轉向 2025 年目標。
As we discussed in our New York Stock Exchange event in June, our TAM has expanded to $70 billion in 2028.
正如我們在 6 月的紐約證券交易所活動中討論的那樣,我們的 TAM 已在 2028 年擴大到 700 億美元。
And you know we've experienced some pretty amazing growth years with 33.8% growth in '23 and 2024 appears to be heading at least to 18%, exceeding our prior predictions of 10% to 12%.
你知道,我們經歷了一些相當驚人的成長年,23 年成長了 33.8%,而 2024 年似乎至少會達到 18%,超出了我們先前預測的 10% 到 12%。
This is quite a jump in 2024, influenced by faster AI pilots.
受更快的人工智慧飛行員的影響,這在 2024 年是一個相當大的飛躍。
We are now projecting annual growth of 15% to 17% for next year, translating to approximately $8 billion in 2025 revenue with a healthy expectation of operating margin.
我們現在預計明年的年增長率為 15% 至 17%,即 2025 年收入約為 80 億美元,營運利潤率預期良好。
Within that $8 billion revenue target, we are quite confident in achieving our campus and AI back-end networking targets of $750 million each in 2025, which we set way back one or two years ago.
在 80 億美元的營收目標中,我們非常有信心實現我們在一兩年前設定的 2025 年園區和後端網路目標分別為 7.5 億美元的目標。
It's important to recognize though that the back end of AI will influence the front-end AI network and its ratios.
但重要的是要認識到人工智慧的後端將影響前端人工智慧網路及其比率。
This ratio can be anywhere from 30% to 100% and sometimes, we've seen it as high as 200% of the back-end network depending on the training requirements.
該比率可以是 30% 到 100% 之間的任何值,有時,根據訓練要求,我們看到後端網路的比率高達 200%。
Our comprehensive AI center networking number is therefore likely to be double our back-end target of $750 million, now aiming for approximately $1.5 billion in 2025.
因此,我們的綜合人工智慧中心網路數量可能是我們 7.5 億美元後端目標的兩倍,目前的目標是到 2025 年約為 15 億美元。
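As a rough reconciliation of these figures (the 100% front-end attach rate is a simplifying midpoint of the 30%-to-100%-plus range cited above):
\[
\underbrace{\$0.75\text{B}}_{\text{back end}} + \underbrace{\$0.75\text{B}}_{\text{front end at }\approx 100\%} \approx \$1.5\text{B AI center networking}, \qquad \frac{\$8\text{B}}{1.15\text{--}1.17} \approx \$6.8\text{--}\$7.0\text{B implied 2024 base}.
\]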
We will continue to aim for double-digit annual growth and a three-year CAGR forecast in the mid-teens in the foreseeable future of 2024 to 2026.
在可預見的未來(2024年至2026年),我們將繼續致力於實現兩位數的年度成長和團隊的三年複合年增長率預測。
More details forthcoming from none other than our Chief Financial Officer, so over to you, Chantelle.
更多詳細資訊將由我們的財務長提供,接下來就交給你了,Chantelle。
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Thank you, Jayshree.
謝謝你,傑什裡。
Turning now to more detail on the financials.
現在轉向財務方面的更多細節。
This analysis of our Q3 results and our guidance for Q4 fiscal year '24 is based on non-GAAP results.
對我們第三季業績的分析以及我們對 24 財年第四季的指導是基於非公認會計原則的。
It excludes all noncash stock-based compensation impacts, intangible asset amortization and other nonrecurring items.
不包括所有非現金股票補償影響、無形資產攤銷和其他非經常性項目。
A full reconciliation of our selected GAAP to non-GAAP results is provided in our earnings release.
我們選擇的 GAAP 與非 GAAP 業績和收益發布的全面調整。
Total revenues reached $1.81 billion, marking a 20% year-over-year increase.
總收入達18.1億美元,年增20%。
This strong performance exceeded our guidance range of $1.72 billion to $1.75 billion.
這一強勁表現超出了我們 17.2 億美元至 17.5 億美元的指導範圍。
Services and subscription software contributed approximately 17% of revenues in the third quarter.
服務和訂閱軟體約佔第三季營收的 17%。
International revenues for the quarter came in at $330.9 million, or 18.3% of total revenue, down from 18.7% last quarter.
本季國際營收為 3.309 億美元,佔總營收的 18.3%,低於上季的 18.7%。
This quarter-over-quarter decrease reflects an increased contribution from domestic shipments to our cloud and enterprise customers.
這一環比下降反映了國內出貨量對我們的雲端和企業客戶的貢獻增加。
Overall, gross margin in Q3 was 64.6%, above the upper range of our guidance of approximately 64%, down from 65.4% last quarter and up from 63.1% in Q3 prior year.
整體而言,第三季的毛利率為 64.6%,高於我們指引的上限(約 64%),低於上季的 65.4%,高於去年第三季的 63.1%。
This year-over-year improvement is driven by stronger enterprise margins and supply chain discipline in the current quarter.
這項同比改善是由本季更強的企業利潤和供應鏈紀律所推動的。
Operating expenses in the quarter were $279.9 million or 15.5% of revenue, down from last quarter at $319.8 million.
本季營運費用為 2.799 億美元,佔營收的 15.5%,低於上季的 3.198 億美元。研發
R&D spending came in at $177.5 million or 9.8% of revenue, down from $216.7 million last quarter.
支出為 1.775 億美元,佔營收的 9.8%,低於上季的 2.167 億美元。
An item of note is that there were additional R&D-related expenses originally expected in Q3 that are now expected to materialize in the Q4 quarter.
值得注意的是,最初預計在第三季出現的額外研發相關費用現在預計將在第四季度實現。
R&D increased by a low double-digit percentage versus Q3 in the prior year.
與去年第三季相比,研發的百分比增幅較低,為兩位數。
Sales and marketing expense was $83.4 million or 4.6% of revenue, down slightly from last quarter.
銷售和行銷費用為 8,340 萬美元,佔收入的 4.6%,比上季略有下降。
Our G&A costs came in at $19.1 million or 1.1% of revenue, similar to last quarter.
我們的一般管理費用為 1,910 萬美元,與上季相似,增幅為 1.1%。
Our operating income for the quarter was $890.1 million or 49.1% of revenue.
我們本季的營業收入為 8.901 億美元,佔營收的 49.1%。
This was favorably impacted by the shift of R&D-related expenses from Q3 now anticipated in Q4 of this year.
這受到研發相關費用從第三季轉移到今年第四季的有利影響。
Other income and expense for the quarter was a favorable $85.3 million, and our effective tax rate was 21.1%.
本季的其他收入和支出為 8,530 萬美元,有效稅率為 21.1%。
This resulted in net income for the quarter of $769.1 million or (inaudible) of revenue.
這使得該季度的淨利潤或(聽不清楚)收入達到 7.691 億美元。
Our diluted share number was 320.5 million shares, resulting in a diluted earnings per share number for the quarter of $2.40, up 31.1% from the prior year.
我們的稀釋後股票數量為 320.5 股,導致本季稀釋後每股收益為 2.40 美元,比前一年增長 31.1%。
This, too, was favorably impacted by the shift in R&D-related expenses from Q3 to Q4.
這也受到研發相關費用從第三季轉移到第四季的有利影響。
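As a check on the per-share arithmetic from the figures just given:
\[
\text{Diluted EPS} = \frac{\$769.1\text{M net income}}{320.5\text{M diluted shares}} \approx \$2.40.
\]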
Now turning to the balance sheet.
現在轉向資產負債表。
Cash, cash equivalents and investments ended the quarter at approximately $7.4 billion.
本季末現金、現金等價物和投資約為 74 億美元。
In the quarter, we repurchased $65.2 million of our common stock at an average price of $318.40 per share.
本季度,我們以每股 318.40 美元的平均價格回購了 6,520 萬美元的普通股。
Of the $1.2 billion repurchase program approved in May 2024, $1 billion remains available for repurchase in future quarters.
在2024年5月核准的12億美元回購計畫中,仍有10億美元可用於未來幾季的回購。
The actual timing and amount of future repurchases will be dependent upon market and business conditions, stock price and other factors.
未來回購的實際時間和金額將取決於市場和業務狀況、股價和其他因素。
Turning to operating (inaudible) performance for the third quarter.
轉向第三季的營運(聽不清楚)業績。
We generated approximately $1.2 billion of cash from operations in the period, reflecting strong earnings performance combined with favorable working capital results.
在此期間,我們從營運中產生了約 12 億美元的現金,反映出強勁的獲利表現和良好的營運資本結果。
DSOs came in at 57 days, down from 66 days in Q2, reflecting a strong collections quarter combined with contributions from linearity of billing.
DSO 的週期為 57 天,低於第二季的 66 天,反映出強勁的收款季度以及計費線性的貢獻。
Inventory turns were 1.3 times, up from 1.1 times last quarter.
庫存週轉率為 1.3 倍,高於上季的 1.1%。
Inventory decreased to $1.8 billion in the quarter, down from $1.9 billion in the prior period, reflecting a reduction in our raw materials inventory.
本季庫存從上一季的 19 億美元降至 18 億美元,反映我們原材料庫存的減少。
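For reference, the two working-capital metrics cited are conventionally computed as follows (the underlying receivables and cost-of-sales figures are not broken out on the call, so this is definitional only):
\[
\text{DSO} = \frac{\text{Accounts receivable}}{\text{Quarterly revenue}} \times \approx 91 \text{ days}, \qquad \text{Inventory turns} = \frac{\text{Annualized cost of revenue}}{\text{Average inventory}}.
\]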
Our purchase commitments and inventory at the end of the quarter totaled $4.1 billion, up from $4 billion at the end of Q2.
截至本季末,我們的採購承諾和庫存總額為 41 億美元,高於第二季末的 40 億美元。
We expect this number to continue to have some variability in future quarters as a reflection of demand for our new product introductions.
我們預計這個數字在未來幾季將繼續出現一些變化,以反映對我們新產品推出的需求。
Our total deferred revenue balance was $2.5 billion, up from $2.1 billion in Q2.
我們的遞延收入餘額總額為 25 億美元,高於第二季的 21 億美元。
The majority of the deferred revenue balance is services related and directly linked to the timing and term of service contracts, which can vary on a quarter-by-quarter basis.
大部分遞延收入餘額與服務相關,並與服務合約的時間和期限直接相關,而服務合約的時間和期限可能會因季度而異。
Our product deferred revenue increased approximately $320 million versus last quarter.
我們的產品遞延收入比上季增加了約 3.2 億美元。
Fiscal 2024 continues to be a year of new product introductions, new customers and expanded use cases.
2024 財年仍然是新產品推出、新客戶和擴大用例的一年。
These trends have resulted in more contracts with customer-specific acceptance clauses and have, and will continue to, increase the variability and magnitude of our product deferred revenue balances.
這些趨勢導致合約中包含特定於客戶的驗收條款,並且已經並將繼續增加我們產品遞延收入餘額幅度的可變性。
Accounts payable days were 42 days, down from 46 days in Q2, reflecting the timing of inventory receipts and payments.
應付帳款天數為 42 天,低於第二季的 46 天,反映了內部收據付款的時間。
Capital expenditures for the quarter were $7 million.
該季度的資本支出為 700 萬美元。
In October, we began our initial construction work to build expanded facilities and (inaudible) expect to incur approximately $15 million during Q4 for this project.
10 月,我們開始了初步建設工作,以擴大設施,(聽不清楚)預計第四季將為此抗議籌集約 1500 萬美元。
Now turning to the fourth quarter.
現在轉向第四季。
Our guidance for the fourth quarter, which is based on non-GAAP results and excludes any noncash stock-based compensation impacts, intangible asset amortization and other nonrecurring items is as follows: revenues of approximately $1.85 billion to $1.9 billion, gross margin of approximately 63% to 64%; operating margin of approximately 44%.
我們對第四季度的指導是基於非 GAAP 業績,不包括任何非現金股票薪酬影響、無形資產攤銷和其他非經常性項目,具體如下:收入約為 18.5 億至 19 億美元,毛利率約為 63 % 至64%;營業利益率約44%。
Our effective tax rate is expected to be approximately 21.5%, with diluted shares of approximately 321 million shares on a pre-split basis.
我們的有效稅率預計約為 21.5%,稀釋前股份約為 3.21 億股。
On the cash front, while we have experienced good increases in operating cash over the last couple of quarters, we anticipate an increase in working capital requirements in Q4.
在現金方面,雖然過去幾季我們的營運現金大幅成長,但我們預計第四季的營運資金需求將會增加。
This is primarily driven by increased inventory in order to respond to the rapid deployment of AI networks and to reduce overall lead times as we move into 2025, as mentioned in John's prepared remarks.
John 在準備好的發言中提到,這主要是由於庫存增加,以應對人工智慧網路的快速部署,並在進入 2025 年時縮短總體交付時間。
We will continue our spending investment in R&D, go-to-market activities and scaling the company.
我們將繼續在研發、上市活動和擴大公司規模方面進行支出投資。
Additionally, in Q4 as part of our ongoing commitment to creating long-term value for our shareholders and enhancing the accessibility of our stock, we are pleased to announce that Arista's Board of Directors has approved a [4-for-1] stock split.
此外,在第四季度,作為我們為股東創造長期價值和提高股票可及性的持續承諾的一部分,我們很高興地宣布 Arista 董事會已批准 [4 比 1] 股票分割。
This decision reflects our confidence in the continued growth and prospects of the company.
這項決定反映了我們對公司持續成長和前景的信心。
It's important to note that while the stock split increases the number of shares outstanding, it does not change the intrinsic value of the company nor does it impact our financial performance or strategy.
值得注意的是,雖然股票分割增加了已發行股票的數量,但它不會改變公司的內在價值,也不會影響我們的財務表現或策略。
The split is designed to make our stock more accessible and attractive to a wider range of investors, particularly retail investors, which we believe will ultimately support broader ownership and improve trading dynamics.
此次拆分旨在使我們的股票更容易獲得併吸引更廣泛的投資者,特別是散戶投資者,我們相信這最終將支持更廣泛的所有權並改善交易動態。
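The mechanics of the split are purely arithmetic, using the roughly 321 million pre-split diluted share count guided above for Q4 as an illustration:
\[
321\text{M} \times 4 \approx 1{,}284\text{M shares}, \qquad \text{EPS}_{\text{post-split}} = \frac{\text{EPS}_{\text{pre-split}}}{4},
\]
so market capitalization and each holder's percentage ownership are unchanged.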
Transitioning now to fiscal year 2025.
現在過渡到 2025 財年。
As Jayshree mentioned, we are projecting revenue growth of 15% to 17%.
正如 Jayshree 所提到的,我們預計營收將成長 15% 至 17%。
The expected revenue mix is forecasted to have an increased weighting of (inaudible), placing the gross margin outlook at 60% to 62% and operating margin at approximately 43% to 44%.
預期收入組合的權重預計將增加(聽不清楚),毛利率前景為 60% 至 62%,營業利潤率約為 43% 至 44%。
Our commitment remains to continue to invest in R&D, go-to-market and the scaling of the company as we forecast to reach approximately $8 billion in revenue in 2025.
我們仍致力於繼續投資於研發、上市和公司規模擴張,預計 2025 年營收將達到約 80 億美元。
We reiterate our double-digit growth forecast in the foreseeable future, and a three-year revenue CAGR goal of mid-teens for fiscal year '24 through '26.
我們重申在可預見的未來實現兩位數成長的預測,以及 24 財年至 26 財年的三年收入複合年增長率目標為中位數。
We are excited by the current and future opportunities to serve our customers as the pure-play networking innovation company and to deliver strong returns to our shareholders.
我們對當前和未來作為純粹的網路創新公司為客戶提供服務並為股東帶來豐厚回報的機會感到興奮。
I will now turn the call back to Liz.
我現在將把電話轉回給莉茲。
Liz?
麗茲?
Liz Stine - Director - Investor Relations Advocacy
Liz Stine - Director - Investor Relations Advocacy
Thank you, Chantelle.
謝謝你,尚特爾。
We will now move to the Q&A portion of the Arista earnings call.
我們現在將進入 Arista 財報電話會議的問答部分。
To allow for greater participation, I'd like to request that everyone please limit themselves to a single question.
為了讓更多人參與,我想請大家只回答一個問題。
Thank you for your understanding.
感謝您的體諒。
Operator, take it away.
接線員,把它拿走。
Operator
Operator
(Operator Instructions) Samik Chatterjee, JPMorgan.
(操作員指示)Samik Chatterjee,摩根大通。
Samik Chatterjee - Analyst
Samik Chatterjee - Analyst
Hey, thanks for taking my question.
嘿,謝謝你回答我的問題。
A strong set of results, but if I can ask one on the guidance, if you don't mind.
一組強有力的結果,但如果你不介意的話,我可以問一個指導。
Jayshree, you're guiding here to the $750 million AI target that you had issued previously, and you're also guiding to meet your campus revenue target.
Jayshree,您在此引導您實現先前製定的 7.5 億美元的 TI 目標,並引導您實現園區收入目標。
So if I take those two into account, it does imply that the ex AI and ex campus business is only growing single digits next year.
因此,如果我考慮到這兩個因素,這確實意味著前人工智慧和前校園業務明年只會成長個位數。
This is on the heels of coming through a double-digit year in 2024 where you comped backlog digestion in 2023.
這是基於 2024 年實現兩位數的收益率,您在 2023 年完成了積壓消化工作。
So maybe just help parse through that as to why there's a significant (inaudible) in the non-AI, non-campus business implied in the numbers?
那麼或許可以分析為什麼數字中隱含著非人工智慧非校園業務的顯著(聽不清楚)?
And what maybe is driving that -- in your expectations, what's driving that outlook?
也許是什麼推動了這一趨勢——按照您的預期,是什麼推動了這種前景?
Thank you.
謝謝。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Thank you, Samik.
謝謝你,薩米克。
As you know, our visibility only extends to roughly about six months, right?
如您所知,我們的可見度只能持續約六個月,對嗎?
So we don't want to get ahead of ourselves on how much better we can do, and that's how we started '24 as well, and we were pleasantly surprised with the faster acceleration of AI pilots.
所以我們不想超前於我們能做得更好,這也是我們開始 24 的原因,我們對人工智慧飛行員更快的加速感到驚訝。
So we definitely see that our large cloud customers are continuing to refresh on the cloud, but are pivoting very aggressively to -- so it wouldn't surprise me if we grow faster in AI and faster in campus in the new center markets, and slower in our classic markets, call it data center and cloud.
因此,我們確實看到我們的大型雲端客戶正在繼續在雲端上進行更新,但正在非常積極地轉向——因此,如果我們在人工智慧方面增長更快,在新中心市場的園區更快,在新中心市場成長較慢,我不會感到驚訝。
And this is the best we can see right now.
這是我們現在能看到的最好的情況。
It doesn't mean we couldn't do better or worse.
這並不意味著我們不能做得更好或更差。
But as far as our visibility goes, I think this represents a nice combination of all our different customer segments and all our different product sectors.
但就我們的知名度而言,我認為這代表了我們所有不同客戶群和所有不同產品領域的完美結合。
Samik Chatterjee - Analyst
Samik Chatterjee - Analyst
Thank you.
謝謝。
Operator
Operator
Antoine Chkaiban, New Street Research.
Antoine Chkaiban,新街研究。
Antoine Chkaiban - Analyst
Antoine Chkaiban - Analyst
Hi, thank you very much for taking my question.
您好,非常感謝您接受我的問題。
Can you maybe provide an update on the four major AI trials that you gave in the past?
您能否提供您過去進行的四項主要人工智慧試驗的最新情況?
How are things progressing versus your expectations as of 90 days ago.
截至 90 天前,與您的預期相比,事情的進展如何。
And when do you expect the move to production to happen?
您預計何時會轉向生產?
And what scale are we talking about?
我們談論的規模是多少?
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yes.
是的。
No, thank you.
不,謝謝。
That's a good question.
這是個好問題。
Arista now believes we're actually five out of five, not four out of five.
Arista 現在相信我們其實是五分之五,而不是五分之四。
We are progressing very well in four out of the five clusters.
我們在五個集群中的四個方面進展順利。
Three of the customers are moving from trials to pilots this year, expecting those three to become 50,000 to 100,000 GPU clusters in 2025.
今年,其中三位客戶將從試驗轉向試點,檢查這三位客戶將在 2025 年將 GPU 叢集變為 50,000 到 100,000 個。
We're also pleased with the new Ethernet trial in 2024 with our fifth customer.
我們也對 2024 年第五位客戶進行的新乙太網路試驗感到滿意。
This customer was historically very, very InfiniBand driven.
從歷史上看,該客戶非常非常受 InfiniBand 驅動。
And we are now moving in with that particular fifth customer; we are largely in a trial mode in 2024, and we hope to go to pilots and production.
我們現在正在轉向第五個客戶,我們在 2024 年基本上處於試驗模式,我們希望進入試點和生產。
There is one customer who -- so three are going well.
有一位客戶——所以三位客戶進展順利。
One is starting.
一個正在開始。
The fifth customer is moving slower than we expected.
第五個客戶的移動速度比我們預期的慢。
They may get back on their feet.
他們可能會重新站起來。
In 2025, they're awaiting new GPUs, and they've got some challenges on power cooling, et cetera.
2025 年,他們正在等待新的 GPU,並且在電源冷卻等方面面臨一些挑戰。
So three, I would give an A. The fourth one, we're really glad we won, and we're getting started and the fifth one, I'd say, steady state, not quite as great as we would expect -- have expected them to be.
所以第三個,我會給 A。 。
Antoine Chkaiban - Analyst
Antoine Chkaiban - Analyst
Thanks for the color.
謝謝你的顏色。
Operator
Operator
Tal Liani, Bank of America.
塔爾·利亞尼,美國銀行。
Tal Liani - Analyst
Tal Liani - Analyst
Hi, guys.
嗨,大家好。
NVIDIA in the last quarter, with the launch of Spectrum-X -- it shows that in data center switching their market share went from like 4% to 15%.
NVIDIA在上個季度推出了Spectrum X,這顯示他們在資料中心轉換的市佔率從4%左右上升到了15%。
Does it mean that you're seeing increased competition from NVIDIA?
這是否意味著您看到來自 NVIDIA 的競爭更加激烈?
And is it competing with you on the same spot?
它是否與你在同一地點競爭?
Or is it more competing with white boxes?
還是與白盒更具競爭性?
And the second question is about white boxes.
第二個問題是關於白盒子的。
What is the outlook for white box participation in Gen AI?
白盒參與 Gen AI 的前景如何?
Is it going to be higher or lower than in front-end data centers?
它會比前端資料中心更高還是更低?
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Okay.
好的。
Thanks, Tal. Which question do you want me to answer?
謝謝,塔爾,你想讓我回答哪個問題?
Tal Liani - Analyst
Tal Liani - Analyst
Let's go with NVIDIA.
讓我們選擇 NVIDIA 吧。
Give me the gift of
給我的禮物
--
--
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Somebody else may ask the question anyway.
無論如何,其他人可能會問這個問題。
So you'll get your answer.
所以你會得到答案。
But just to answer your question on NVIDIA.
但只是回答你關於 NVIDIA 的問題。
First of all, we view NVIDIA as a good partner.
首先,我們認為 NVIDIA 是一個很好的合作夥伴。
If we didn't have the ability to connect to their GPUs, we wouldn't have all this AI networking demand.
如果我們沒有能力連接到他們的 GPU,我們就不會擁有所有這些人工智慧網路需求。
So thank you, NVIDIA -- thank you, Jensen, for the partnership.
謝謝 NVIDIA,謝謝 Jensen 的合作。
Now as you know, NVIDIA sells the full stack and most of the time, it's with InfiniBand, and with the (inaudible) acquisition, they do have some Ethernet capability.
現在,如您所知,NVIDIA 出售完整的堆棧,並且大多數情況下都與 InfiniBand 一起銷售,並且透過(聽不清楚)收購,他們確實擁有一些乙太網路功能。
We personally do not run into the Ethernet capability very much.
我們個人不太接觸乙太網路功能。
We run into it, maybe one or two customers.
我們遇到的,可能是一兩個顧客。
And so generally speaking, Arista is looked upon as the expert there.
一般來說,阿里斯塔被視為那裡的專家。
We have a full portfolio.
我們擁有完整的產品組合。
We have full software.
我們有完整的軟體。
And whether it's the large scale-out Ethernet networking customers like the Titans or even the smaller enterprises -- we're seeing a lot of smaller GPU clusters in the enterprise -- Arista is looked upon as the expert there.
無論是像 Titans 這樣的大型橫向擴展乙太網路工作客戶,還是小型企業,我們都看到許多小型 GPU 加上企業,Arista 被視為那裡的專家。
But that's not to say we're going to win 100%.
但這並不是說我們會100%獲勝。
We welcome NVIDIA as a partner on the GPU and a fierce competitor, and we look to compete with them on the Ethernet switching.
我們歡迎 NVIDIA 作為 GP 的合作夥伴和激烈的競爭對手,我們希望在乙太網路交換領域與他們競爭。
Operator
Operator
Simon Leopold, Raymond James
西蒙·利奧波德,雷蒙德·詹姆斯
Simon Leopold - Analyst
Simon Leopold - Analyst
I'll tag team with Tal.
我將用塔爾標記團隊。
So we'll partner once again here.
所以我們將在這裡再次合作。
I do want to look at this competition or competitive landscape broadly in that what I'm trying to understand is how it may be changing with the advent of AI.
我確實想廣泛地看待這種競爭或競爭格局,因為我試圖了解隨著人工智慧的出現,它可能會發生怎樣的變化。
So not just hearing from you about white box, but also competitors like Cisco and Juniper and Nokia.
因此,不僅是您關於白盒的訊息,還有思科、瞻博網路和諾基亞等競爭對手的消息。
So really an update on the competitive landscape would be helpful.
因此,了解競爭格局的最新情況確實會有所幫助。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Thank you, Simon.
謝謝你,西蒙。
That's a nice broad question.
這是一個很好的廣泛問題。
So since you asked me specifically about AI as opposed to cloud, let me parse this problem into two halves, the back end and the front end, right?
既然你具體問了我關於人工智慧而不是雲端的問題,那麼讓我把這個問題分成兩部分,後端和前端,對嗎?
At the back end, we're natively connecting to GPUs.
在後端,我們原生連接到 GPU。
And there can be many times we just don't see it because somebody just bundles it in with the GPU, in particular, NVIDIA.
很多時候,我們只是看不到它,因為有人只是在 GPU 中混淆了它,特別是 NVIDIA。
And you may remember a year ago, I was saying we're outside looking in because most of the bundling is happening within (inaudible). I would expect on the back end, any share Arista gets, including that $750 million, is incremental.
你可能還記得一年前,我說過我們正在外部觀察,因為大部分捆綁都發生在(聽不清楚)內部,我預計在後端,Arista 獲得的任何份額,包括7.5 億美元都是增量。
It's brand new to us.
這對我們來說是全新的。
We were never there before.
我們以前從未去過那裡。
So we'll take all we can get, but we are not claiming to be a market leader there.
因此,我們將竭盡全力,但我們並不聲稱自己是那裡的市場領導者。
We're, in fact, claiming that there are many incumbents there with InfiniBand and smaller versions of Ethernet, and Arista is looking to gain more credibility and experience and become the gold standard for the back end.
事實上,我們聲稱有許多現有企業擁有 InfiniBand 和較小版本的以太網,Arista 希望獲得更多可信度和經驗,並成為後端的黃金標準。
On the front end, in many ways, we are viewed as the gold standard competitively.
在前端,在許多方面,我們被視為競爭的黃金標準。
It's a much more complex network.
這是一個更複雜的網路。
You have to build a leaf-spine architecture.
你必須建構一個葉脊架構。
John alluded to this, there's a tremendous amount of scale with L2, L3, EVPN, VXLAN, visibility, telemetry, automation routing at scale, encryption at scale.
John 提到了這一點,L2、L3、EVPN、VXLAN、可見性、遙測、大規模自動化路由、大規模加密具有巨大的規模。
And this, what I would call accelerated networking portfolio complements NVIDIA's accelerated compute portfolio.
我所說的加速網路產品組合補充了 NVIDIA 的加速運算產品組合。
And compared to all the peers you mentioned, we have the absolute best portfolio of 20 switches and three families and the capability and the competitive differentiation is bar none.
與您提到的所有同行相比,我們擁有絕對最好的 20 種交換器和三個系列的產品組合,並且能力和競爭差異化是無與倫比的。
In fact, I am specifically aware of a couple of situations where the AI applications aren't even running on some of the industry peers you talked about, and they want to swap theirs for ours.
事實上,我特別意識到一些情況,其中人工智慧應用程式甚至沒有在您談到的一些行業同行上運行,而他們想將他們的應用程式替換為我們的應用程式。
So feeling extremely bullish with the 7800 flagship product, the newly introduced 7700 that we worked closely with Meta on, and the 7060; this product line is running today mostly at 400 gig because a lot of the (inaudible) the ecosystem isn't there for 800.
因此,我們對7,800 旗艦產品感到非常樂觀,即我們與Meta 密切合作的新推出的700,即7060,該產品線目前主要以400 gig 運行,因為很多(聽不清楚)生態系統並不適用於800 。
But moving forward into 800, this is why John and the team are building the supply chain to get ready for it.
但展望 800,這就是 John 和團隊正在建立供應鏈以為此做好準備的原因。
So competitively, I would say, we're doing extremely well in the front end.
所以我想說,從競爭角度來說,我們在前端做得非常好。
-- and it's incremental on the back end.
——而且它是在後端增量的。
So -- and overall, I would classify our performance in AI, coming from where we were 12 years ago to where we are today
因此,總的來說,我會將我們在人工智慧方面的表現從 12 年前的狀態分為現在的狀態
(inaudible)
(聽不清楚)
Simon Leopold - Analyst
Simon Leopold - Analyst
Thank you.
謝謝。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Thank you, Simon.
謝謝你,西蒙。
Operator
Operator
Ben Reitzes, Melius Research.
本‧雷茨 (Ben Reitzes),Melius 研究中心。
Ben Reitzes - Analyst
Ben Reitzes - Analyst
I wanted to ask a little bit more about the $750 million in AI for next year.
我想多問一些關於明年 7.5 億美元的人工智慧投資的問題。
Has your visibility on that improved over the last few months.
在過去的幾個月裡,您對此的了解是否有所提升?
I wanted to reconcile your comment around the fifth customer now going slower than expected.
我想協調您對第五位客戶的評論,不要比預期慢。
And it sounds like you're now in 5 on 5, but wondering if that fifth customer going slower is limiting upside or limiting your visibility there?
聽起來您現在處於 5 對 5 的狀態,但想知道第五個客戶速度較慢是否會限制您的上升空間或限制您在那裡的可見度?
Or has it actually improved and it's gotten more conservative over the last
或者它實際上有所改善並且在過去變得更加保守
--
--
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Somebody has to bring a conservative end, but I think we're being realistic.
一定有人帶來保守的結局,但我認為我們是現實的。
So I think you said it right.
所以我認為你說得對。
I think on three out of the five, we have good visibility, at least for the next 6 months, maybe even 12, John, what do you think?
我認為在五分之三的情況下,我們有良好的能見度,至少在接下來的 6 個月內,甚至可能是 12 個月內,約翰,你覺得怎麼樣?
John McCool - Chief Platform Officer, Senior Vice President - Engineering Operations
John McCool - Chief Platform Officer, Senior Vice President - Engineering Operations
Yes.
是的。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
On the fourth one, we are in early trials; we've got improving to do.
在第四個方面,我們正處於早期試驗階段,我們仍有待改進。
So let's see, but we're not looking for 2025 to be the bang up year on the fourth one.
讓我們拭目以待,但我們並不希望 2025 年成為第四年的爆炸年。
It's probably 2026.
大概是2026年吧。
And on the fifth one, we're a little bit stalled, which may be why we're being careful about predicting how they'll do.
在第五個方面,我們有點停滯不前,這可能是為什麼我們在預測他們會如何做時要小心。
They may step in nicely in the second half of '25, in which case, we'll let you know.
他們可能會在 25 年下半年很好地介入,在這種情況下,我們會通知您。
But if they don't, we're still feeling good about our guide for '25.
但即使他們不這樣做,我們仍然對 25 年的指南感到滿意。
Is that right,
是這樣嗎,
(inaudible)
(聽不清楚)
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
I would totally agree.
我完全同意。
It's a good question, Ben.
這是個好問題,本。
But I think out of the five the way Jayshree categorized them, I would completely agree.
但我認為,在 Jayshree 分類的五種方式中,我完全同意。
Ben Reitzes - Analyst
Ben Reitzes - Analyst
Okay.
好的。
Thanks.
謝謝。
Operator
Operator
Karl Ackerman, BNP Paribas.
卡爾‧阿克曼,法國巴黎銀行。
Karl Ackerman - Analyst
Karl Ackerman - Analyst
Thank you.
謝謝。
Jayshree, could you discuss the programs you are engaged with at hyperscalers?
Jayshree,您能討論一下這些程式是否適用於超大規模電腦嗎?
Will they be deploying your new Etherlink switches and AI spine products on 800-gig ports?
是否會在 800 GB 連接埠上部署您的新 EtonLink 交換器和 AI 骨幹產品?
In other words, have these pilots or trials been on 400 gig, and will production be on 800 gig?
換句話說,這些試點或試驗是否已進行 400 場演出,而生產規模是否超過 800 場。
And I guess if so, what's the right way to think about the hardware mix of sales of 800 gig in '25?
我想如果是這樣,那麼思考 25 年 800 場演出的硬體組合的正確方法是什麼?
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yes.
是的。
That's a good question.
這是個好問題。
I mean, just going back again, it was always hard to tell between 100 and 400 because somebody can take their 400 and break it into breakouts of 100 -- so I would say today, if you ask John and me, a majority of the trials and pilots are on 400 because people are still waiting for the ecosystem at 800, including the NIC and the UEC and the packet spraying capabilities, et cetera.
我的意思是,回到過去,總是很難區分 100 和 400,因為有人可以將他們的 400 分解為 100 的突破 - 所以我今天會說,如果你問約翰和我,大多數人試驗和試點在400 上進行,因為人們仍在等待800 上的生態系統,包括NIC 和UEC 以及資料包彈簧功能等。
So while we're in some early trials on 800, the majority are on 400.
因此,雖然我們正在進行一些 800 的早期試驗,但大多數是 400。
Majority of 2024 is 400 gig.
2024 年的大部分時間是 400 場演出。
I expect as we go into 25, we will see a better split between 400 and 800.
我預計當我們進入 25 個國家時,我們會看到 400 到 800 之間的分配情況更好。
Karl Ackerman - Analyst
Karl Ackerman - Analyst
Thank you.
謝謝。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Thank you, Karl.
謝謝你,卡爾。
Operator
Operator
Ryan Koontz, Needham & Co.
孔茲 (Ryan Koontz),李約瑟公司
Ryan Koontz - Analyst
Ryan Koontz - Analyst
Great.
偉大的。
Thanks for the question.
謝謝你的提問。
I was hoping we could touch base on your campus opportunities a bit.
我希望我們能稍微了解一下你們的校園機會。
Where are you seeing the most traction in terms of your applications?
您認為您的應用程式最受關注的地方在哪裡?
Is this primarily from your strength in core moving big bits around the campus core?
這主要是因為您在校園核心周圍移動大量核心方面的實力嗎?
Or are you seeing WiFi?
或是你看到WiFi了嗎?
Can you maybe just update us on the campus applications and verticals you're seeing the most traction in?
您能否向我們介紹一下您認為最受關注的校園應用程式和垂直行業的最新情況?
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yes.
是的。
Yes.
是的。
Ryan, let me try and step back and say -- tell you that our enterprise opportunity has never been stronger.
瑞安,讓我試著退一步說——告訴你,我們的企業機會從未如此強大。
As a pure-play innovator, we are getting invited more and more into enterprise deals even though sometimes we don't often have the sales coverage for it.
作為純粹的創新者,我們越來越被邀請參與企業交易,儘管有時我們通常沒有相關的銷售覆蓋範圍。
And what I mean by that is, I think Arista is being sought for a network design that doesn't have five operating systems and different silos and not just the get code.
我的意思是,我認為 Arista 正在尋找一種沒有五個作業系統和不同孤島而不僅僅是獲取程式碼的網路設計。
And there's an awful lot of competitive fatigue, added to the fact that there's an awful lot of consolidation going on, and a lot of our peers in the industry are looking at other things, whether it's observability or bundling other products together.
而且存在著非常多的競爭疲勞,再加上正在進行大量的整合,而且我們行業中的許多同行正在考慮其他事情,無論是可觀察性還是其他產品。
So our enterprise opportunity now, we don't just characterize as data center.
所以我們現在的企業機會,我們不僅僅將其定性為資料中心。
There's data center, there's campus center, there's WAN center and of course, there's a little bit of AI in there, too.
有資料中心,有校園中心,有廣域網路中心,當然,裡面也有一點人工智慧。
So now let me address your (inaudible) question more specifically.
現在讓我更具體地解決您的(聽不清楚)問題。
Clearly, one of the first places everybody went on (inaudible) is the universal spine.
顯然,每個人都走的第一個地方(聽不清楚)是通用脊椎。
They go, oh, okay, I can have the same spine for my data center and campus.
他們說,哦,好吧,我的資料中心和園區可以擁有相同的主幹網路。
This is so cool.
這太酷了。
So that activity has already started, and a big part of our $750 million projection comes from the confidence that they've already put in a platform and a foundation to get ready for more spine.
因此,這項活動已經開始,我們 7.5 億美元預測的很大一部分來自於我們的信心,即他們已經建立了一個平台和基礎,為更多的脊椎做好準備。
Then if Kumar (inaudible) were here, he'd say, but Jayshree, you need to measure the edge ports, which is the Power over Ethernet, the wired and the WiFi.
然後,如果 Kumar(聽不清楚)在這裡,他會說,但是 Jayshree,您需要測量邊緣端口,即電源以太網、有線和 WiFi。
And this is super important, (inaudible) or laughing.
這是非常重要的,(聽不清楚)或笑。
Yes.
是的。
And so he would say, you got to get that right.
所以他會說,你必須做對。
And so number one, we're in the spine; two, we're making stronger progress on the wired.
所以第一,我們在脊椎;第二,我們在有線方面取得了更大的進展。
Our weakest, partly because we are data center folks and we're still learning how to sell radios, is the WiFi; we plan to fix that, and this is where the extra coverage will come in.
我們最弱的部分原因是我們是資料中心人員,我們仍在學習如何銷售無線電,我們計劃解決這個問題的 WiFi,而這正是額外覆蓋範圍發揮作用的地方。
So I would say more of our strength is coming into wired and spine.
所以我想說,我們的力量更集中在有線和脊椎上。
We are doing very well in pockets of WiFi, but we need to do better.
我們在 WiFi 領域做得很好,但我們需要做得更好。
Chantelle, you want to add something?
Chantelle,你想補充點什麼嗎?
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Just to take the second part, I think you were asking about some of the verticals in your question.
就第二部分而言,我認為您正在問問題中的一些垂直領域。
I just wanted to add some of the verticals.
我只是想添加一些垂直領域。
I think where we're seeing some strength in data center and campus, I would say financials, health care, media, retail,
我認為我們看到一些實力雄厚的資料中心和園區,我想說的是金融、醫療保健、媒體、零售、
(inaudible)
(聽不清楚)
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Fed and (inaudible), that's a good one.
美聯儲和(聽不清楚),這是一件好事。
This is historically an area we have not paid attention to; the federal market is one we're getting very serious about, including setting up its own subsidiary.
這是我們歷史上從未關注過聯邦市場的一個領域,但我們正在非常認真地關注這個問題,包括建立自己的子公司。
So Chantelle, you've been a huge part of pushing us there.
所以 Chantelle,你在推動我們實現這一目標方面發揮了重要作用。
So thank you.
所以謝謝你。
Ryan Koontz - Analyst
Ryan Koontz - Analyst
Thanks so much.
非常感謝。
Operator
Operator
Amit Daryanani, Evercore.
阿米特·達裡亞納尼(Amit Daryanani),Evercore。
Amit Daryanani - Analyst
Amit Daryanani - Analyst
Good afternoon.
午安.
Thanks for taking my question.
感謝您提出我的問題。
I guess I'm hoping you could spend some time on the sizable acceleration we're seeing both on your total deferred number, but also the product deferred number is going up pretty dramatically.
我想我希望您能花一些時間來了解我們所看到的總延期數量的大幅增長,而且產品延期數量也大幅增加。
Jayshree, historically, when product deferred goes up in such a dramatic manner, you actually end up with really good revenue (inaudible) in the out years, and you're guiding for revenue to decelerate in '25.
Jayshree,從歷史上看,當產品違約率以如此戲劇性的方式上升時,你實際上最終會在未來幾年獲得非常好的收入(聽不清楚),並且你正在指導你在25 年減速的收入。
Maybe just help me connect, like, what's the delta -- why is product deferred different this time from what has historically driven the acceleration.
也許只是幫助我聯繫什麼是三角洲,為什麼產品不同,是什麼使我們歷史上的加速。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
I'm going to let Chantelle the expert answer the question, but I will say one line.
我會讓專家 Chantelle 來回答這個問題,但我會說一句話。
Remember, in the case of those examples you're quoting, the trials were typically, I don't know, 6 to 12 months; this can be multiple years and can take a lot longer to manifest.
請記住,就您引用的這些例子而言,試驗通常是(我不知道)6 到 12 個月,也可能是多年,並且可能需要更長的時間才能顯現出來。
It may not all happen in 2025.
這一切可能不會在 2025 年發生。
Over to you, Chantelle.
交給你了,尚特爾。
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
I think, yes.
我想,是的。
So thank you, Jayshree.
謝謝你,傑什裡。
So part of it is the type of use case, the type of customers, the mix of product that goes in there. They all have bespoke time frames, and as Jayshree referred to, you're starting to see those lengthen.
因此,其中一部分是用例的類型、客戶的類型以及其中的產品組合。 Jayshree 提到,它們都有客製化的時間框架,你會開始看到這些時間框架被延長。
And the other thing, too, is that this is what we know now; as you move through every quarter, there are deferrals in and out.
另一件事也是,這就是我們現在所知道的,當你經歷每個季度時,都會有延遲的進出。
So this is what we know at this time.
這就是我們目前所知道的。
And it's a mix of the variables that we told you before.
這是我們之前告訴過你的變數的混合體。
And then as we move through '25, we'll continue to update.
然後,當我們進入 25 年時,我們將繼續更新。
Amit Daryanani - Analyst
Amit Daryanani - Analyst
Okay.
好的。
Got it.
知道了。
Operator
Operator
Meta Marshall, Morgan Stanley.
梅塔‧馬歇爾,摩根士丹利。
Meta Marshall - Analyst
Meta Marshall - Analyst
Great.
偉大的。
Thanks.
謝謝。
Jayshree, I just wanted to get a sense of, clearly, you keep -- clearly have these four main trials and have added a fifth.
Jayshree,我只是想清楚地了解一下,你顯然有這四個主要試驗,並且增加了第五個。
But just how are you thinking about adding either other Tier 2 opportunities or sovereigns or just some of these other customers that are investing heavily in AI, and how do you see those opportunities developing for Arista?
但是,您如何考慮添加其他 2 級機會或主權國家或只是大量投資於 AI 的其他一些客戶?
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
This is a good question.
這是一個好問題。
So we're not saying these are the
所以我們並不是說這些是
(inaudible).
(聽不清楚)。
But these are the five we predict can go to 100,000 GPUs and more.
但我們預測這五個可以支援 100,000 個甚至更多的 GPU。
That's the way to look at this.
這就是看待這個問題的方式。
So these are the largest AI titans, if you will.
所以如果你願意的話,還有最大的人工智慧巨頭。
And they can be in the cloud, hyperscaler Titan group, they could be in the Tier 2 as well, by the way, very rarely would they be in a classic enterprise.
他們可以在雲端、超大規模泰坦集團中,他們也可以在第二層,順便說一句,他們很少會在經典企業中。
By the way, we do have at least 10 to 15 trials going on in the classic enterprise too, but they're much smaller GPU counts, so we don't talk about it.
順便說一句,我們在經典企業中也確實進行了至少 10 到 15 次試驗,但它們的 GPU 數量要少得多,所以我們不談論它。
So we're focused on the big five to the point that they really skew our numbers, and they're very important to establish our beachhead, our innovation and our market share in AI, but there's definitely more going on.
因此,我們正在對五巨頭進行折疊,以至於它們確實扭曲了我們的數字,它們對於建立我們的灘頭陣地、我們的創新和我們在人工智慧領域的市場份額非常重要,但肯定還有更多的事情發生。
In terms specifically of your question on Tier 2 and whether there will be more -- there will be more, but these are the five we see in that category, and they're spread across both the Tier 1 titan cloud as well as the Tier 2.
就您關於第 2 層的具體問題而言,是否還會有更多,但這些是我們在該類別中看到的五個,它們分佈在第 1 層泰坦雲和第 2 層。
Meta Marshall - Analyst
Meta Marshall - Analyst
Thank you.
謝謝。
Operator
Operator
Sebastien Naji, William Blair.
塞巴斯蒂安·納吉,威廉·布萊爾。
Sebastien Naji - Analyst
Sebastien Naji - Analyst
Yes, good evening.
是的,晚上好。
Thanks for taking the question.
感謝您提出問題。
Just specifically on the Ethernet or Etherlink portfolio, could you maybe rank order or comment on what you see as the opportunity across each of the three families -- the single-tier, the leaf spine, and then the tolling switch -- as we're going into 2025 and beyond.
特別是在乙太網路或Etalin 產品組合方面,您能否對您認為三個系列中每一個系列(單層租賃主幹,然後是收費交換器)的機會進行排名或評論,因為我們將進入2025 年及以後。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
I'll take a crack at it, but John, help me out here because this is clearly a guesstimate.
我會嘗試一下,但約翰在這裡幫助了我,因為這顯然是一個猜測。
It's probably one where we should say no comment, but we'll try to give you color.
我們可能不應該對此發表評論,但我們會盡力為您提供顏色。
On (inaudible), I would say the fixed 7060 switches in terms of units are very popular because it's a single switch.
關於(聽不清楚),我想說固定 7060 開關在單位方面非常受歡迎,因為它是單一開關。
It's one our customers are familiar with.
這是我們的客戶所熟悉的一種。
It's based on an intense partnership with Broadcom.
它基於與 Broadcom 的密切合作關係。
So we've done Tomahawk 1, 2, 3, 4, and here we are on 5, right?
我們已經完成了“戰斧”1、2、3、4,現在我們完成了“戰斧 5”,對吧?
So I would say, volume-wise, that's the big one.
所以我想說,就數量而言,這是最大的。
Going into the other extreme, the 7800 in volume may be smaller, but in dollars, it is extremely strategic, and this is where we feel competitive again, working with our partners at Broadcom with the Jericho and (inaudible) family.
進入另一個極端,7800 的體積可能較小,但以美元計算,極具戰略意義,這就是我們與 Broadcom 的合作夥伴以及 Jericho 和(聽不清楚)家族合作再次感受到競爭力的地方。
This is just -- what would you say, John, a real flagship, right?
約翰,你會說這只是一個真正的旗艦,對吧?
In dollars, that's the stealer, if you will.
如果你願意的話,以美元計算,那就是偷竊者。
And then the 7700 is the best of both worlds; it gives you all the capabilities of the 7800 in a mini configuration, up to 10,000 GPUs.
7,700 是兩全其美的產品,它在最多 10,000 個 GPU 的迷你配置中為您提供 7,800 的所有功能。
It's brand new.
這是全新的。
So -- but I think it's going to -- and competitively, there's no peer for this.
所以——但我認為它將會——並且在競爭中,沒有同行可以做到這一點。
Nobody else does this, but us with a scheduled fabric in a single stage.
除了我們在一個階段中使用預定的結構之外,沒有其他人這樣做。
We did this in a very close collaboration, John, with Meta, right?
約翰,我們與 Meta 進行了非常密切的合作,對吧?
So you guys have been working together, John, for 18 months, two years, I would say.
約翰,你們已經一起工作了 18 個月,我想說,兩年了。
So I think we know less about how to qualify that, but it could be very promising and it could be a fast accelerator in the next couple of years.
所以我認為我們對如何證明這一點知之甚少,但它可能非常有前途,並且可能成為未來幾年的快速加速器。
John McCool - Chief Platform Officer, Senior Vice President - Engineering Operations
John McCool - Chief Platform Officer, Senior Vice President - Engineering Operations
Yes, on the 7700, people that are interested in the very large scale find the 7700 attractive.
是的,7700人有興趣,規模非常大,對700人來說很有吸引力。
Between the 7060 and the 7800, we do see people that are optimizing a mix of both of those products in the same deployment so they can get the minimum number of tiers but have the maximum amount of GPUs that fit their use case.
在 7060 和 7800 之間,我們確實看到人們在同一部署中優化這兩種產品的組合,以便他們可以獲得最少的層數,但擁有適合其用例的最大數量的 GPU。
So we do see a lot of tailoring now around the size of the deployments based on how many GPUs they want to deploy in their data center.
因此,我們現在確實看到,根據他們想要部署資料中心的 GPU 數量,圍繞部署規模進行了大量自訂。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yes, that's a really good point.
是的,這是一個很好的觀點。
And then suddenly, they'll go, okay, I want to go from a four-way rates to an eight way.
然後突然間,他們會說,好吧,我想從四路費率改為八路費率。
And then suddenly, they have to add more line cards in their 7800 and come running to you for more supply chain.
然後突然間,您必須在 7,800 中添加更多線路卡,並為您提供更多供應鏈。
Sebastien Naji - Analyst
Sebastien Naji - Analyst
Sure.
當然。
Great.
偉大的。
Thank you, both.
謝謝你們,兩位。
Operator
Operator
Aaron Rakers, Wells Fargo.
亞倫·雷克斯,富國銀行。
Aaron Rakers - Analyst
Aaron Rakers - Analyst
Yes, thanks for taking the question.
是的,感謝您提出問題。
I wanted to segue off the competitive landscape and just ask you about when I look at your 2025 outlook as well as the midterm model that you provided, it looks like you're making some assumptions of some margin declines.
我想脫離競爭格局,只是問一下,當我查看你們的 2025 年展望以及你們提供的中期模型時,你們似乎對利潤率下降做出了一些假設。
I'm curious of what's underlying those expectations of gross margin declines?
我很好奇毛利率下降的預期背後是什麼?
Is it mix of customers?
是混合客戶嗎?
Do you expect multiple 10% plus customers in 2025?
您預計 2025 年會有多個 10% 以上的客戶嗎?
Just any help on what's factored into that -- those margin expectations?
對其中的因素有什麼幫助嗎──那些利潤率預期?
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
I would say, absolutely, and the outlook that you referred to, it is customer mix.
我想說,絕對是,你提到的前景是客戶組合。
We're expecting John to continue the great supply chain discipline that he's been doing with his team.
我們期待約翰繼續他和他的團隊一起遵守的供應鏈紀律。
So it is a mix comment only.
所以這只是 BIC 的評論。
And as for the 10% customers, I would say the one dynamic -- maybe it's a bit cheeky to say it -- is that as the denominator gets bigger, that gets a bit tougher.
至於 10% 的客戶,我想說的是,隨著分母變大,這樣說可能有點厚顏無恥,這會變得有點困難。
So we'll see as we go in the out years.
因此,我們將在未來幾年內拭目以待。
But right now, we'll just keep to the ones that we currently talk about, and we'll see how that goes for '25 and '26.
但現在,我們將繼續討論目前討論的內容,我們將看看 25 年和 26 年的情況如何。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
It's going to get harder and harder to have 10% customers.
擁有 10 個客戶將會變得越來越難。
So I believe M&M will still be that in 2025, but I don't anticipate there's any others at the moment.
所以我相信 2025 年 M&M 仍會是這樣,但我預計目前不會有其他的。
Liz Stine - Director - Investor Relations Advocacy
Liz Stine - Director - Investor Relations Advocacy
Operator, we have time for one last question.
接線員,我們還有時間回答最後一個問題。
Operator
Operator
Atif Malik, Citigroup.
阿蒂夫‧馬利克,花旗集團。
Atif Malik - Analyst
Atif Malik - Analyst
Hi.
你好。
Thank you for taking my question.
感謝您回答我的問題。
Jayshree, at some of the recent conferences, you've talked about every dollar spent on the back end being at least 2 times on the front end.
Jayshree,最近的一些會議,您談到花在後端的每一美元至少是前端的 2 倍。
What signs are you looking for to see the lift from AI on the front end or classic cloud from the pressure on the bandwidth.
您正在尋找哪些跡象來了解前端人工智慧或經典雲端對頻寬壓力的緩解作用。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yes.
是的。
Listen, Atif, I think it all depends on their approach to AI.
聽著,我認為這完全取決於 Atif 他們的人工智慧方法。
If they just want to build a back-end cluster and prove something out, they just look for the highest job training completion and intense training models.
如果他們只是想建立一個後端集群並證明一些東西,他們只是尋找最高的工作培訓完成度和密集的培訓模型。
And it's a very narrow use case.
這是一個非常狹窄的用例。
But what we're starting to see more and more like I said, is for every dollar spent in the back end, you could spend 30% more, 100% more, and we've even seen a 200% more scenario, which is why $750 million will carry over to, we believe, next year, another $750 million on front-end traffic that will include AI, but it will include other things as well.
但我們開始越來越多地看到,就像我說的,在後端花費的每一美元,你可能會多花 30%、100%,我們甚至看到了多花 200% 的情況,即為什麼我們相信,明年7.5 億美元將轉入另外7.5 億美元的前端流量,其中包括人工智慧,但它也將包括其他內容。
It's (inaudible) be unique to AI.
它(聽不清楚)是人工智慧獨有的。
So I wouldn't be surprised if that number is anywhere between 30% and 100%, so the average is around 100%, which is 2 times the back end overall.
因此,如果該數字介於 30% 和 100% 之間,我不會感到驚訝,因此平均值為 1-0%。
So feeling pretty good about that.
所以對此感覺很好。
Don't know how to exactly count that as (inaudible), which is why I qualify it by saying increasingly, if you start having inference, training, front end, storage, (inaudible) classic cloud all come together, the AI -- the pure AI number becomes difficult to track.
不知道如何準確地算作(聽不清楚),這就是為什麼我越來越多地說,如果你開始將推理、訓練、前端、儲存、(聽不清楚)經典雲全部聚集在一起,那麼人工智慧——純人工智慧號碼變得難以追蹤。
Atif Malik - Analyst
Atif Malik - Analyst
Thanks so much.
非常感謝。
Liz Stine - Director - Investor Relations Advocacy
Liz Stine - Director - Investor Relations Advocacy
This concludes the Arista Networks third-quarter 2024 earnings call.
Arista Networks 2024 年第三季財報電話會議到此結束。
We have posted a presentation, which provides additional information on our results, which you can access on the Investors section of our website.
我們發布了一份演示文稿,其中提供了有關我們業績的更多信息,您可以在我們網站的投資者部分訪問這些信息。
Thank you for joining us today, and thank you for your interest in Arista.
感謝您今天加入我們,並感謝您對 Arista 的興趣。
Operator
Operator
Thank you for joining.
感謝您的加入。
Ladies and gentlemen, this concludes today's call.
女士們、先生們,今天的電話會議到此結束。
You may now disconnect.
您現在可以斷開連線。