Arista Networks held a conference call to discuss its successful fourth quarter of fiscal 2024, exceeding revenue expectations and delivering record revenue. They provided forward-looking statements for fiscal 2025, with a focus on growth in cloud AI, data center products, and network software.
The company emphasized its commitment to AI clusters, discussed the challenges and opportunities in the networking market, and highlighted its position as a leader in data-driven networking. They also discussed the integration of GPUs and networking, the importance of routing in their cloud and data center offerings, and their capital allocation strategy for future growth.
Usage note: The Chinese translation is provided by Google Translate for reference only; please rely on the English original for the actual content.
Operator
Operator
Welcome to the fourth-quarter 2024 Arista Networks' financial results earnings conference call.
歡迎參加 Arista Networks 2024 年第四季財務業績收益電話會議。
(Operator Instructions)
(操作員指令)
As a reminder, this conference is being recorded and will be available for replay from the Investor Relations section on the Arista website following this call.
提醒一下,本次會議正在錄製中,電話會議結束後可從 Arista 網站的投資者關係部分重播。
Mr. Rudolph Araujo, Arista's Head of Investor Advocacy, you may begin.
Arista 投資者宣傳主管 Rudolph Araujo 先生,您可以開始。
Rudolph Araujo - Director of IR Advocacy
Rudolph Araujo - Director of IR Advocacy
Thank you, Regina.
謝謝你,里賈娜。
Good afternoon, everyone, and thank you for joining us.
大家下午好,感謝大家的參與。
With me on today's call are Jayshree Ullal, Arista Networks' Chairperson and Chief Executive Officer; and Chantelle Breithaupt, Arista's Chief Financial Officer.
今天與我一起參加電話會議的還有 Arista Networks 董事長兼執行長 Jayshree Ullal;以及 Arista 財務長 Chantelle Breithaupt。
This afternoon, Arista Networks issued a press release announcing the results for its fiscal fourth quarter ending December 31, 2024.
今天下午,Arista Networks發布新聞稿,宣布了截至2024年12月31日的第四財季業績。
If you want a copy of the release, you can access it online on our website.
如果您想要取得該新聞稿的副本,您可以在我們的網站上線上查閱。
During the course of this conference call, Arista Networks' management will make forward-looking statements, including those relating to our financial outlook for the first quarter of the 2025 fiscal year, longer-term business model, and financial outlook for 2025 and beyond.
在本次電話會議期間,Arista Networks 管理層將發表前瞻性聲明,包括與 2025 財年第一季的財務展望、長期業務模式以及 2025 年及以後的財務展望有關的聲明。
These include statements about our total addressable market and strategy for addressing these market opportunities, including customer demand trends, supply chain constraints, component costs, manufacturing output, inventory management, inflationary pressures on our business, lead times, product innovation, working capital optimization, and the benefits of acquisitions. These statements are subject to risks and uncertainties that we discuss in detail in our documents filed with the SEC, specifically in our most recent Form 10-Q and Form 10-K, and which could cause actual results to differ materially from those anticipated by these statements.
這些聲明包括我們的整體潛在市場以及把握這些市場機會的策略,涵蓋客戶需求趨勢、供應鏈限制、零組件成本、製造產出、庫存管理、我們業務面臨的通膨壓力、交貨時間、產品創新、營運資本優化以及收購帶來的效益。這些聲明受到我們在提交給美國證券交易委員會的文件(特別是最新的 10-Q 表格和 10-K 表格)中詳細討論的風險和不確定性的影響,可能導致實際結果與這些聲明的預期產生重大差異。
These forward-looking statements apply as of today and you should not rely on them as representing our views in the future.
這些前瞻性陳述自今天起適用,您不應依賴它們來代表我們未來的觀點。
We undertake no obligation to update these statements after this call.
我們不承擔本次電話會議後更新這些聲明的義務。
Also, please note that certain financial measures we use on this call are expressed on a non-GAAP basis and have been adjusted to exclude certain charges.
另請注意,我們在本次電話會議中使用的某些財務指標是以非 GAAP 為基礎表示的,並且已進行調整以排除某些費用。
We have provided reconciliations of these non-GAAP financial measures to GAAP financial measures in our earnings press release.
我們在收益新聞稿中提供了這些非 GAAP 財務指標與 GAAP 財務指標的對帳表。
With that, I will turn the call over to Jayshree.
說完這些,我會把電話轉給 Jayshree。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Thank you, everyone, for joining us this afternoon for our fourth-quarter 2024 earnings call.
感謝大家今天下午參加我們的 2024 年第四季財報電話會議。
First, I'd like to warmly welcome our new IR leadership duo of Rudolph Araujo, our Director of IR Advocacy, supported by Rod Hall, whom many of you may know, as our Leader of IR Strategy.
首先,我要熱烈歡迎我們新的 IR 領導團隊,也就是我們的 IR 宣傳總監 Rudolph Araujo,以及我們的 IR 策略領導者 Rod Hall。
Special thanks to Liz Stine for her tenure, both as a Systems Engineer and IR Lead at Arista.
特別感謝 Liz Stine 在 Arista 擔任系統工程師和 IR 主管期間的貢獻。
Well I think you'll all agree that 2024 has been a memorable and defining year for Arista.
我想你們都會同意,2024 年對 Arista 來說是值得紀念且具有決定意義的一年。
We started with an initial guidance of 10% to 12% annual revenue growth, with the momentum of generative AI.
我們最初預計年收入成長率為 10% 至 12%,得益於生成式人工智慧的發展。
We have achieved well beyond that at almost 20% growth, achieving a record revenue of $7 billion, coupled with a non-GAAP operating margin of 47.5%.
我們取得了遠超預期的成績,成長了近 20%,實現了創紀錄的 70 億美元的營收,非公認會計準則下的營業利潤率達到 47.5%。
Before I dwell on that more, let me get back to Q4 2024 specifics.
在進一步討論這個問題之前,讓我先回顧一下 2024 年第四季的具體情況。
We delivered revenues of $1.93 billion for the quarter, with non-GAAP earnings per share of $0.65, adjusted for the recent four-to-one stock split.
本季度,我們的營收為 19.3 億美元,根據最近的四比一股票分割進行調整後,非 GAAP 每股收益為 0.65 美元。
Our non-GAAP gross margin of 64.2% was influenced by an efficient supply chain and manufacturing, as well as a good mix of Enterprise and Software in the quarter.
我們的非公認會計準則毛利率為 64.2%,這受到高效的供應鏈和製造以及本季度企業和軟體的良好組合的影響。
International contribution for the quarter registered at 16%, with the Americas strong at 84%.
本季國際貢獻率為 16%,其中美洲地區貢獻率強勁,高達 84%。
Now shifting to annual sector revenue for 2024.
現在轉向2024年的年度行業收入。
Our Cloud and AI Titans contributed significantly at approximately 48%, keeping in mind that Oracle is a new member of this category.
我們的雲端運算和人工智慧巨頭貢獻率高達約 48%,請記住,Oracle 是此類別的新成員。
Enterprise & Financials was strong at approximately 35%, while the providers, which now includes Apple, was at 17% approximately.
企業和金融部門表現強勁,約佔 35%,而包括蘋果在內的供應商板塊則約佔 17%。
Both Microsoft and Meta are greater than 10% concentration customers at approximately 20% and 14.6%, respectively.
微軟和 Meta 的客戶集中度均超過 10%,分別約 20% 和 14.6%。
As you know, we cherish our privileged partnerships with both of them very much.
正如您所知,我們非常珍惜與他們兩家公司之間的特殊合作夥伴關係。
It has spanned over 14 years, and we collaborate deeply for joint engineering and innovative AI and cloud products.
這段歷程已經持續了 14 年,我們在聯合工程和創新人工智慧和雲端產品方面進行了深入合作。
In terms of annual 2024 product lines, our core cloud, AI, and data center products are built off a highly differentiated, extensible OS stack and are successfully deployed across 10, 25, 100, 200, 400, and 800 gigabit ethernet speeds.
就 2024 年度產品線而言,我們的核心雲端、AI 和資料中心產品建立在高度差異化、可擴展的作業系統堆疊之上,並已成功部署於 10、25、100、200、400 和 800 千兆乙太網路速度。
It delivers power efficiency, high availability, automation, and agility as the data centers demand insatiable bandwidth capacity and network speeds of both front-end and back-end storage, compute, and AI zones.
由於資料中心對前端和後端儲存、運算和 AI 區域的頻寬容量和網路速度有無限需求,因此它可以提供高能源效率、高可用性、自動化和靈活性。
This core product line drives approximately 65% of our revenue.
這核心產品線約占我們營收的65%。
We continue to gain market share in the highest-performance switching categories of 100, 200, and 400 gig ports, attaining the number one position with greater than 40% market share in ports, according to industry analysts.
根據產業分析師的統計,我們在 100、200 和 400 千兆埠等最高效能交換類別的市佔率持續成長,以超過 40% 的埠市佔率位居第一。
We increased our 400 gig customer base to approximately 1,000 customers last year, in 2024.
去年,也就是 2024 年,我們已將 400G 客戶群增加到約 1,000 位客戶。
We expect 800 gigabit ethernet to emerge in AI back-end clusters in 2025.
我們預計 800 千兆乙太網路將於 2025 年在 AI 後端集群中興起。
We remain optimistic about achieving our 2025 AI revenue goal of $1.5 billion in AI centers, which includes $750 million in AI back-end clusters.
我們對實現 2025 年 15 億美元的 AI 中心收入目標(其中包括 7.5 億美元的 AI 後端集群)仍然持樂觀態度。
Our network adjacencies market, comprised of routing (replacing routers) and the cognitive AI-driven campus, is going well.
我們的網路鄰接市場包括路由、更換路由器和認知人工智慧驅動的校園,進展順利。
Our investments in cognitive wired, wireless, zero-touch provisioning, and networked identity, as well as in sensors for threat mitigation, are being received extremely well by our Campus customers.
我們對認知有線、無線、零接觸配置和網路身分以及威脅緩解感測器的投資受到了校園客戶的高度評價。
Our recent modern stacking and introduction of SWAG, Switched Aggregation Group, is a fitting example of our compelling innovation for open and efficient networking, conserving IP addresses without proprietary methods.
我們最近推出的現代堆疊和 SWAG(交換聚合組)就是我們在開放和高效網路方面引人注目的創新的一個恰當例子,它無需專有方法即可節省 IP 位址。
The post pandemic Campus is very different and our customers are seeking alternatives to legacy incumbents with deep, zero trust security, high availability, and observability embedded in the network across our software stack with CloudVision management.
後疫情時代的校園已截然不同,我們的客戶正在尋求能夠取代傳統系統的方案,透過 CloudVision 管理,在我們的軟體堆疊網路中嵌入深度、零信任安全性、高可用性和可觀察性。
We are committed to the $750 million goal in 2025 and much more ahead.
我們致力於在 2025 年實現 7.5 億美元的目標,並在未來實現更大的目標。
We are also successfully deployed in routing, edge, and peering use cases.
我們也成功部署於路由、邊緣和對等互連用例。
Just in 2024 alone, we introduced six EOS software releases with greater than 600 new features across our core and adjacent offerings.
光是 2024 年,我們就推出了 6 個 EOS 軟體版本,為核心與鄰近產品帶來 600 多項新功能。
The Campus and Routing adjacencies together contributed approximately 18% of revenue.
校園和路由鄰接共貢獻了約 18% 的收入。
Our third category is network software and services based on subscription models such as the Arista A-Care, CloudVision, DMF Observability, and advanced security sensors for network detection and response.
我們的第三類是基於訂閱模式的網路軟體和服務,例如 Arista A-Care、CloudVision、DMF Observability 以及用於網路偵測和回應的高階安全感測器。
We added over 350 CloudVision customers, translating to literally one new customer a day.
我們增加了超過 350 個 CloudVision 客戶,這意味著每天都會有一個新客戶。
CloudVision is pivotal to building our network-as-a-service and deploying Arista-validated designs in the Enterprise.
CloudVision 對於建立我們的網路即服務和在企業中部署 Arista 驗證的設計至關重要。
Arista's subscription-based network services and software contributed approximately 17% of total revenue.
Arista 的訂閱式網路服務和軟體貢獻了總收入的約 17%。
Note that perpetual licenses do not count here and go into the core or adjacent sections.
請注意,永久許可證不計算在這裡並且進入核心或相鄰部分。
While the 2024 headline has clearly been about generative AI, Arista continues to diversify our business globally with multiple-use cases and verticals.
雖然 2024 年的頭條新聞顯然是關於生成式人工智慧,但 Arista 繼續透過多用途案例和垂直行業在全球範圍內實現業務多元化。
We are viewed as the modern network innovator of choice for client to Campus to Cloud and AI networking, ideally positioned with our differentiated foundation.
我們被視為客戶從校園到雲端和人工智慧網路的首選現代網路創新者,憑藉我們差異化的基礎,佔據理想的地位。
We celebrated two milestones in 2024, our 10th anniversary of going public at the New York Stock Exchange and our 20th anniversary of founding.
2024年,我們慶祝了兩個里程碑:在紐約證券交易所上市10週年和公司成立20週年。
In the past decade, we exceeded 10,000 customers with a cumulative installed base of 100 million ports, as Arista drives the epicenter of mission-critical network transactions.
在過去十年中,隨著 Arista 成為關鍵任務網路交易的中心,我們的客戶數量超過 10,000 家,累計安裝連接埠數達到 1 億個。
Arista 2.0 strategy is resonating exceptionally well with our customers.
Arista 2.0 策略在我們的客戶中引起了強烈迴響。
Customers are not only looking to connect but to unify and consolidate their data across silos for optimal networking outcomes.
客戶不僅希望連接,還希望跨數據孤島統一和整合數據,以實現最佳網路成果。
Our modern networking platforms are foundational for transformation from incongruent silos to centers of data, and it places us in a very unique position as the best of breed innovator for data-driven networking.
我們現代的網路平台為從不協調的孤島向資料中心的轉變奠定了基礎,它使我們成為資料驅動網路領域最優秀的創新者,處於非常獨特的地位。
These centers of data, as we call them, can reside in the Campus as a campus center, or in data centers, WAN centers, or AI centers, regardless of their location.
我們所說的資料中心可以作為校園中心、資料中心、廣域網路中心或人工智慧中心駐留在校園內,無論它們位於何處。
Networking for AI is also gaining traction as we move into 2025, building some of the world's greatest Arista AI centers at production scale.
隨著我們進入 2025 年,人工智慧網路也越來越受到關注,並將在生產規模上建造一些世界上最偉大的 Arista 人工智慧中心。
These are constructed with both back-end clusters and front-end networks.
它們是由後端集群和前端網路建構的。
And as I've shared with you often, the fidelity of the AI traffic differs greatly from cloud workloads in terms of diversity, duration, and size of flow.
正如我經常與大家分享的那樣,人工智慧流量的保真度在多樣性、持續時間和流量大小方面與雲端工作負載有很大不同。
Just one slow flow can slow down the entire job completion time for a training workload.
僅僅一條緩慢的流就可能拖慢整個訓練工作負載的作業完成時間。
Therefore, Arista AI centers seamlessly connect the front end of compute, storage, WAN, and classic cloud networks with our back-end Arista Etherlink portfolio.
因此,Arista AI 中心透過我們的後端 Arista etherlink 產品組合無縫連接到運算儲存 WAN 和經典雲端網路的前端。
This AI-accelerated networking portfolio consists of three families and over 20 Etherlink switches, not just one point switch.
這個 AI 加速網路產品組合由三個系列和超過 20 款 Etherlink 交換器組成,而不僅僅是單一的點交換器。
Our AI for networking strategy is also doing well, and it's about curating the data for higher-level network functions.
我們的網路人工智慧策略也進展順利,它致力於為更高層級的網路功能整理資料。
We instrument our customers' networks with our publish-subscribe state foundation and our software called Network Data Lake to deliver proactive, predictive, and prescriptive platforms with superior AIOps, A-Care support, and product functions.
我們利用名為 Network Data Lake 的軟體,以發布-訂閱狀態基礎對客戶的網路進行監測,從而提供具備卓越 AIOps、A-Care 支援和產品功能的主動、預測和規範平台。
We are pleased to surpass for the first time the $1 billion revenue mark in 2024 for the software and subscription service category.
我們很高興 2024 年軟體和訂閱服務類別的營收首次突破 10 億美元大關。
In 2024, we conducted three very large customer events in London, New York, and Santa Clara, California.
2024 年,我們在倫敦、紐約和加州聖克拉拉舉辦了三場大型客戶活動。
Our differentiated strategy and superior products are resonating deeply as we touched over 1,000 strategic customers and partners in these exclusive events.
在這些獨家活動中,我們接觸了超過 1,000 名策略客戶和合作夥伴,我們的差異化策略和優質產品產生了深刻共鳴。
Simply put, we outpaced the industry in quality and support with the highest net promoter score of 87, which translates to 93% customer respondent satisfaction.
簡而言之,我們在品質和支援方面超越了業界,淨推薦值 (NPS) 高達 87,相當於 93% 的客戶受訪者滿意度。
Of course, we do that with low security vulnerabilities and steadfast network innovation.
當然,我們憑藉極少的安全漏洞以及堅定的網路創新做到這一點。
In summary, 2024 has been a pivotal turning point for Arista.
總而言之,2024 年對 Arista 來說是一個關鍵的轉捩點。
It has been a key breakaway year as we continue to aim for $10 billion in annual revenue with a double-digit CAGR, a target we set way back at our November 2022 Analyst Day.
這是關鍵的突破之年,因為我們繼續以 100 億美元的年收入和兩位數的複合年增長率為目標,這是我們在 2022 年 11 月分析師日設定的目標。
While I do appreciate the exuberant support from our analyst community on our momentum, I would encourage you to pay attention to our stated guidance.
雖然我確實感謝分析師社群對我們發展勢頭的熱情支持,但我還是鼓勵您關注我們所聲明的指導。
We live in a dynamic world of changes, most of which have resulted in positive outcome for Arista.
我們生活在一個充滿活力的變化的世界,其中大多數變化都為 Arista 帶來了積極的結果。
We reiterate the upper range of our 2025 guidance of double-digit growth at 17%, now aiming for approximately $8.2 billion in revenue in 2025.
我們重申 2025 年兩位數成長預期的上限,即 17%,目前的目標是 2025 年營收達到約 82 億美元。
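As a rough sketch of the arithmetic behind that guidance (assuming the record fiscal 2024 revenue of approximately $7.0 billion quoted earlier in the call), 17% growth lands at roughly $8.2 billion:

```python
# Back-of-the-envelope check of the 2025 revenue guidance quoted above.
# Assumes FY2024 revenue of ~$7.0 billion, as stated earlier in the call.
fy2024_revenue_b = 7.0   # record FY2024 revenue, in billions of USD
guided_growth = 0.17     # upper range of the double-digit growth guidance

fy2025_revenue_b = fy2024_revenue_b * (1 + guided_growth)
print(f"Implied FY2025 revenue: ~${fy2025_revenue_b:.1f}B")  # ~$8.2B
```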
The Arista leadership team has driven outstanding progress across multiple dimensions.
Arista 領導團隊在多個方面都取得了突出的進展。
In 2024, we are at approximately 4,465 employees rooted in engineering and customer investments.
到 2024 年,我們將擁有約 4,465 名員工,致力於工程和客戶投資。
I'm incredibly proud of how we've executed and navigated the year based on our core principles and culture.
我為我們基於核心原則和文化執行和度過這一年的方式感到無比自豪。
While customers are struggling with fatigue from legacy incumbents, Arista is redefining the future of data-driven networking intimately with our strategic customers.
當客戶正努力應對傳統企業帶來的客戶疲勞時,Arista 正在與我們的策略客戶緊密合作,重新定義數據驅動網路的未來。
With that, I'd like to turn it over to Chantelle, who has transitioned to become our core Arista Chief Financial Officer in record time, less than a year.
接下來,我想將麥克風交給 Chantelle,她在不到一年的時間內,創紀錄地轉型成為我們的核心 Arista 財務長。
Over to you Chantelle, and welcome again, and happy one year anniversary.
交給你了,Chantelle,再次歡迎你,也祝你就任一週年快樂。
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Thank you, Jayshree, and congratulations on a great 2024.
謝謝你,Jayshree,恭喜你度過美好的 2024 年。
My first full year as CFO has been more than I could have hoped for, and I am excited about the rest of the journey ahead.
擔任財務長的第一年就超出了我的預期,我對接下來的旅程充滿期待。
Now on to the numbers.
現在來看一下數字。
As a reminder, this analysis of our Q4, our full-year 2024 results and our guidance for Q1 2025 is based on non-GAAP and excludes all non-cash stock-based compensation impacts, certain acquisition related charges and other nonrecurring items.
提醒一下,我們對第四季度、2024 年全年業績以及 2025 年第一季指引的分析均基於非 GAAP,且不包括所有非現金股票薪酬影響、某些收購相關費用和其他非經常性項目。
In addition, all share related numbers are provided on a post-split basis to reflect the four-to-one stock split in December 2024.
此外,所有股票相關數字均以分割後為基礎提供,以反映 2024 年 12 月的四比一股票分割。
A full reconciliation of our selected GAAP to non-GAAP results is provided in our earnings release.
我們在收益報告中提供了所選的 GAAP 結果與非 GAAP 結果的完整對帳表。
Total revenues in Q4 were $1.93 billion, up 25.3% year-over-year and above the upper end of our guidance of $1.85 billion to $1.9 billion.
第四季總營收為 19.3 億美元,年增 25.3%,高於我們預期的 18.5 億美元至 19 億美元的上限。
For fiscal 2024, we are pleased to have delivered 19.5% revenue growth, driven by achievements in all three of our product sectors.
在 2024 財年,我們很高興實現了 19.5% 的收入成長,這得益於我們三個產品領域的成就。
Services and subscription software contributed approximately 18.2% of revenue in the fourth quarter, up from 17.6% in Q3.
服務和訂閱軟體在第四季度貢獻了約 18.2% 的收入,高於第三季的 17.6%。
International revenues for the quarter came in at $311.1 million or 16% of total revenue, down from 17.6% last quarter.
本季國際營收為 3.111 億美元,佔總營收的 16%,低於上一季的 17.6%。
This quarter-over-quarter decrease was driven by the relative increase in the mix of domestic revenue from our large global customers.
這一季度環比下降是由於來自我們全球大型客戶的國內收入相對增加所致。
The overall gross margin in Q4 was 64.2%, slightly above the guidance of 63% to 64% and down from 65.4% in the prior year.
第四季整體毛利率為 64.2%,略高於預期的 63% 至 64%,低於去年同期的 65.4%。
As a recap for the year, we delivered a gross margin result of 64.6% compared with 62.6% for the prior year.
回顧這一年,我們的毛利率為 64.6%,而前一年為 62.6%。
This increase is largely due to a combination of improved supply chain and inventory management.
這一成長主要歸功於供應鏈和庫存管理的改善。
Operating expenses for the quarter were $332.4 million or 17.2% of revenue, up from the last quarter at $279.9 million.
本季營業費用為 3.324 億美元,佔營收的 17.2%,高於上一季的 2.799 億美元。
R&D spending came in at $226.1 million or 11.7% of revenue, up from 9.8% last quarter.
研發支出為 2.261 億美元,佔營收的 11.7%,高於上一季的 9.8%。
This matches the expectations discussed in our Q3 earnings call regarding the timing of engineering costs and other costs associated with the development of our next-gen products moving from Q3 to Q4.
這與我們在第三季財報電話會議上討論的關於工程成本和與下一代產品開發相關的其他成本從第三季轉移到第四季的預期相符。
This finishes the year of R&D at 11.2% of revenue, demonstrating a continued focus on products innovation.
今年的研發支出佔收入的 11.2%,顯示公司持續注重產品創新。
Sales and marketing expense was $86.3 million or 4.5% of revenue, up from $83.4 million last quarter.
銷售和行銷費用為 8,630 萬美元,佔營收的 4.5%,高於上一季的 8,340 萬美元。
This was driven by continued investment in both headcount and channel programs.
這是由對員工人數和通路計畫的持續投資所推動的。
Our G&A costs came in at $19.9 million or 1% of revenue, up from $19.1 million last quarter reflecting continued investment in scaling the company.
我們的一般及行政開支為 1,990 萬美元,佔營收的 1%,高於上一季的 1,910 萬美元,反映了對公司規模的持續投資。
Our operating income for the quarter was $907.1 million or 47% of revenue.
本季我們的營業收入為 9.071 億美元,佔營收的 47%。
The strong Q4 finish contributed to an operating income result for fiscal year '24 of $3.3 billion or 47.5% of revenue.
第四季的強勁表現推動24財年的營業收入達到33億美元,佔營收的47.5%。
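As a quick consistency check on that margin figure (using only the Q4 revenue and operating income quoted above):

```python
# Rough check of the Q4 operating margin from the figures quoted in the call.
q4_revenue_m = 1930.0          # Q4 revenue, ~$1.93 billion, in millions of USD
q4_operating_income_m = 907.1  # Q4 non-GAAP operating income, in millions of USD

q4_operating_margin = q4_operating_income_m / q4_revenue_m
print(f"Q4 operating margin: {q4_operating_margin:.1%}")  # ~47%
```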
Congratulations to the Arista team on this impressive achievement.
恭喜 Arista 團隊取得這項令人矚目的成就。
Other income and expense for the quarter was a favorable $89.2 million, and our effective tax rate was 16.7%.
本季的其他收入和支出為 8,920 萬美元,我們的有效稅率為 16.7%。
This lower-than-normal quarterly tax rate reflected the release of tax reserves due to the expiration of statute of limitations and favorable changes in state taxes.
低於正常水平的季度稅率反映了因訴訟時效到期和州稅的有利變化而釋放的稅收儲備。
This resulted in net income for the quarter of $830.1 million or 43% of revenue.
這使得該季度淨收入達到 8.301 億美元,佔營收的 43%。
Our diluted share number was 1.283 billion shares, resulting in diluted earnings per share for the quarter of $0.65, up 25% from the prior year.
我們的稀釋股數為 12.83 億股,導致本季每股稀釋收益為 0.65 美元,比上年增長 25%。
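A quick sketch of how that per-share figure follows from the quarter's net income and the post-split diluted share count quoted above:

```python
# Rough check of Q4 diluted EPS from the figures quoted in the call.
q4_net_income_m = 830.1    # Q4 non-GAAP net income, in millions of USD
diluted_shares_m = 1283.0  # diluted shares, ~1.283 billion, in millions (post-split)

q4_diluted_eps = q4_net_income_m / diluted_shares_m
print(f"Q4 diluted EPS: ${q4_diluted_eps:.2f}")  # ~$0.65
```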
For FY24, we are pleased to have delivered diluted earnings per share of $2.27, a 31.2% increase year over year.
就 24 財年而言,我們很高興看到每股攤薄收益達到 2.27 美元,年增 31.2%。
Now turning to balance sheet.
現在轉向資產負債表。
Cash, cash equivalents and marketable securities ended the quarter at approximately $8.3 billion.
本季末,現金、現金等價物和有價證券約為 83 億美元。
In the quarter, we repurchased $123.8 million of our common stock at an average price of $94.80 per share.
本季度,我們以平均每股 94.80 美元的價格回購了價值 1.238 億美元的普通股。
Within fiscal '24, we repurchased $423.6 million of our common stock at an average price of $77.13 per share.
在 24 財年,我們以平均每股 77.13 美元的價格回購了 4.236 億美元的普通股。
Of the $1.2 billion repurchase program approved in May 2024, $921 million remains available for repurchase in future quarters.
在 2024 年 5 月批准的 12 億美元回購計畫中,仍有 9.21 億美元可在未來幾季回購。
The actual timing and amount of future repurchases will be dependent on market and business conditions, stock price, and other factors.
未來回購的實際時間和金額將取決於市場和商業狀況、股票價格和其他因素。
Now turning to operating cash performance for the fourth quarter, we generated approximately $1 billion of cash from operations for the period, reflecting strong earnings performance, combined with an increase in deferred revenue, offset by an increase in income tax payments.
現在談到第四季度的營運現金表現,我們本季從經營活動中產生了約 10 億美元的現金,反映了強勁的獲利表現,加上遞延收入的增加,但被所得稅支付的增加所抵消。
DSOs came in at 54 days, down from 57 days in Q3, reflecting the timing of shipments and the strong collections performance by the team.
DSO 為 54 天,低於第三季的 57 天,反映了出貨時間和團隊強勁的收款表現。
Inventory turns were 1.4 times, up from 1.3 last quarter.
庫存週轉率為 1.4 次,高於上一季的 1.3 次。
Inventory increased marginally to $1.83 billion, reflecting diligent inventory management across raw and finished goods.
庫存小幅增加至 18.3 億美元,反映出對原物料和製成品庫存的嚴格管理。
Our purchase commitments at the end of the quarter were $3.1 billion, up from $2.4 billion at the end of Q3.
本季末,我們的採購承諾為 31 億美元,高於第三季末的 24 億美元。
As mentioned in prior quarters, this expected activity represents purchases for chips related to new products and deployments.
正如前幾季所提到的,這項預期活動代表與新產品和部署相關的晶片的購買。
We will continue to rationalize our overall purchase commitment number.
我們將繼續合理化我們的整體採購承諾數量。
However, we expect to maintain a healthy position related to key components and to continue to have some variability in this number to meet customer demand and improve lead times in future quarters.
然而,我們預計關鍵部件相關的業務將保持健康的地位,並且這個數字將繼續保持一定的變化,以滿足客戶需求並在未來幾季縮短交貨時間。
Our total deferred revenue balance was $2.79 billion, up from $2.51 billion in the prior quarter.
我們的總遞延收入餘額為 27.9 億美元,高於上一季的 25.1 億美元。
The majority of the deferred revenue balance is services-related and directly linked to the timing and terms of service contracts, which can vary on a quarter-by-quarter basis.
大部分遞延收入餘額與服務相關且與服務合約的時間直接相關,這些服務可能會按季度變化。
Our product deferred revenue balance increased by approximately $150 million over the last quarter.
上個季度,我們的產品遞延收入餘額增加了約 1.5 億美元。
Fiscal 2024 was a year of new product introductions, new customers and expanded use cases.
2024 財年是推出新產品、拓展新客戶和擴大用例的一年。
These trends have resulted in increased customer trials and contracts with customer-specific acceptance clauses, which have and will continue to increase the variability and magnitude of our product deferred revenue balances.
這些趨勢導致客戶試用以及包含客戶特定驗收條款的合約增加,這已經並將持續加大我們產品遞延收入餘額的波動幅度。
We expect this to continue into fiscal 2025.
我們預計這種情況將持續到 2025 財年。
Accounts payable days were 51 days, up from 42 days in Q3, reflecting the timing of inventory receipts and payments.
應付帳款天數為 51 天,高於第三季的 42 天,反映了庫存收款和付款的時間。
Capital expenditures for the quarter were $12.5 million.
本季的資本支出為 1250 萬美元。
In October, we began our initial construction work to build expanded facilities in Santa Clara, and we expect to incur approximately $100 million in CapEx during fiscal '25 for this project.
十月份,我們開始了在聖克拉拉擴建設施的初步施工工作,我們預計 25 財年將為該項目投入約 1 億美元的資本支出。
Now turning to our outlook for the first quarter of 2025 and the remainder of the fiscal '25 year.
現在來談談我們對 2025 年第一季和 25 財年剩餘時間的展望。
We continue to gain confidence in our view for fiscal year '25 and now place our revenue growth outlook at approximately 17%, or $8.2 billion.
我們持續對 25 財年的展望充滿信心,目前我們的營收成長率約為 17% 或 82 億美元。
This is up from our initial FY25 guidance of 15% to 17%.
這高於我們最初 FY25 預測的 15% 至 17%。
This reflects our combined outlook for cloud, AI, Enterprise and cloud specialty providers, along with the recognition of the volatility that we've seen in the market since the beginning of the year.
這反映了我們對雲端、人工智慧、企業和雲端專業供應商的綜合展望,以及對自今年年初以來市場波動的認識。
For gross margin, we reiterate the fiscal-year range of 60% to 62%, with Q1 '25 expected to be above that range due to the anticipated mix of business in the quarter.
對於毛利率,我們重申本財年的毛利率範圍為 60% 至 62%,由於本季度預期的業務組合,預計 2025 年第一季的毛利率將高於該範圍。
Similar to others in the industry, we will continue to monitor the fluid tariff situation and be thoughtful for both the short and long-term outcomes to both our company and our customers.
與業內其他公司類似,我們將繼續監測流動關稅情況,並認真考慮公司和客戶的短期和長期結果。
In terms of spending, we expect to invest in innovation, sales, and scaling the company, resulting in the continued operating margin outlook of 43% to 44% in 2025.
在支出方面,我們預計將投資於創新、銷售和擴大公司規模,使 2025 年的營業利潤率持續保持在 43% 至 44% 的預期。
On the cash front, we will continue to work to optimize our working capital investments with some expected variability in inventory due to the timing of the component receipts on purchase commitments.
在現金方面,我們將繼續努力優化我們的營運資本投資,由於採購承諾的零件收貨時間,庫存預計會出現一些變化。
Our structural tax rate is expected to return to the usual historical rate of approximately 21.5%, up from the unusually low one-time rate of 16.7% experienced last quarter, Q4 FY24.
我們的結構性稅率預計將保持在 21.5%,回到通常的歷史稅率,高於 2024 財年第四季上個季度的異常低的一次性稅率 16.7%。
With all this as a backdrop, our guidance for the first quarter is as follows: revenues of approximately $1.93 billion to $1.97 billion, reflecting slightly stronger seasonality in Q1 than prior-year trends, an outcome of the timing of our customers' priorities.
鑑於以上所有背景,我們對第一季度的預期如下:營收約為 19.3 億美元至 19.7 億美元,第一季度的季節性因素比去年同期略強,並且是客戶優先考慮事項的時間安排的結果。
Gross margin of approximately 63% and operating margin at approximately 44%.
毛利率約63%,營業利益率約44%。
Our effective tax rate is expected to be approximately 21.5%, with approximately 1.285 billion diluted shares.
我們的有效稅率預計約為21.5%,稀釋股份約為12.85億股。
In summary, we at Arista are enthusiastic about 2025 and the general networking outlook ahead.
總而言之,Arista 對 2025 年以及未來的整體網路前景充滿熱情。
We have an impressive portfolio and are ready to solve our customers' needs across all the centers of data.
我們擁有令人印象深刻的產品組合,並隨時準備好解決所有資料中心的客戶需求。
Combined with our Arista team spirit, we are ready to realize our fair share of the $70 billion market TAM.
結合我們的 Arista 團隊精神,我們已準備好在 700 億美元的市場 TAM 中實現應有的份額。
With that, I, too would like to welcome Rudy and Rod to the Arista IR team.
同時,我也歡迎 Rudy 和 Rod 加入 Arista IR 團隊。
Back over to you Rudy for Q&A.
回到 Rudy 的問答環節。
Rudolph Araujo - Director of IR Advocacy
Rudolph Araujo - Director of IR Advocacy
Thank you, Chantelle.
謝謝你,Chantelle。
We will now move to the Q&A portion of the Arista earnings call.
我們現在進入 Arista 收益電話會議的問答部分。
To allow for greater participation, I'd like to request that everyone please limit themselves to a single question.
為了讓更多人有機會參與,請大家將提問限制在一個問題。
Thank you for your understanding.
感謝您的體諒。
Operator, take it away.
接線員,請開始吧。
Operator
Operator
We will now begin the Q&A portion of the Arista earnings call.
我們現在開始 Arista 收益電話會議的問答部分。
(Operator Instructions)
(操作員指令)
Michael Ng, Goldman Sachs.
高盛的 Michael Ng。
Michael Ng - Analyst
Michael Ng - Analyst
Hi, good afternoon and thank you for the question.
大家下午好,感謝您的提問。
I was just wondering if you could talk about the timing of how the year might look like and how you expect the switches in the AI back end to be rolled out into production, the sale of these switches, types of deployment of next-generation video chips or hyperscale custom [matrix] on the compute side and, is that a gating factor that you're watching out for?
我只是想知道您是否可以談談今年的時間安排,以及您預計人工智能後端的交換機將如何投入生產,這些交換機的銷售,下一代視頻芯片或計算端超大規模定制[矩陣]的部署類型,以及這是您正在關注的限制因素嗎?
Thank you.
謝謝。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yes.
是的。
Thank you, Michael.
謝謝你,麥可。
First, I want to say Arista is still committed to four out of our five AI clusters that I mentioned in prior calls.
首先,我想說,Arista 仍然致力於我在先前電話會議中提到的五個 AI 集群中的四個。
The fifth one is a little bit stalled.
第五個有點停滯。
It is not a Cloud Titan.
它不是雲泰坦。
They are awaiting new GPUs and some funding too, I think.
我認為他們也在等待新的 GPU 和一些資金。
So I hope they'll come back next year, but for this year we won't talk about them.
所以我希望他們明年能回來,但今年我們不會談論他們。
But on the remaining four, let me provide some color.
但就其餘四個,讓我補充說明一下。
Three out of the four customers are expected, this year, to roll out a cumulative 100,000 GPUs.
預計四家客戶中,有三家今年將累計推出 10 萬塊 GPU。
So we're going to do very well with three of them on the back end, and you can imagine they're all pretty much one major Nvidia class of GPU.
因此,我們將在後端很好地利用這三種技術,你可以想像它們幾乎都是一個主要的 Nvidia 類 GPU。
It's ah -- they will be waiting for the next generation of GPUs, but independent of that we'll be rolling out fairly large numbers.
他們會等待下一代 GPU,但除此之外,我們還將推出相當數量的 GPU。
On the fourth one, we are migrating right now from InfiniBand to proving that ethernet's a viable solution, so we're still -- they've historically been InfiniBand and so we're still in pilots and we expect to go into production next year.
第四,我們現在正在從 InfiniBand 遷移到以太網,以證明以太網是一種可行的解決方案,因此我們仍然 - 它們歷史上一直是 InfiniBand,因此我們仍處於試點階段,預計明年投入生產。
They're doing very well in four out of four, with the fifth one stalled, and three out of the four are expected to be at 100,000 GPUs this year.
這四個客戶都進展順利,第五個則處於停滯狀態,而四個之中有三個預計今年將達到 10 萬塊 GPU 的規模。
Operator
Operator
Amit Daryanani, Evercore.
阿米特·達裡亞納尼 (Amit Daryanani),Evercore。
Amit Daryanani - Analyst
Amit Daryanani - Analyst
Good afternoon.
午安.
Thanks for taking my question.
感謝您回答我的問題。
I guess, Jayshree, there's always this concern around the impact of white box vendors on your revenue growth, and it has come up throughout the last decade.
我想,Jayshree,人們總是擔心白盒供應商會影響你們的營收成長,而這個疑慮在過去十年不斷被提起。
I don't think it's been an impediment for the company.
我不認為這會成為公司的障礙。
But can you share your perspective, when it comes to AI networks, especially the back-end networks, on how you see the mix evolving between white box and OEM solutions, and maybe just help us understand the differentiators that help Arista be successful on the front end.
但是,您能否分享一下您對 AI 網路(尤其是後端網路)的看法,您如何看待混合發展的白盒與 OEM 解決方案,也許能幫助我們了解幫助 Arista 在前端取得成功的差異化因素。
Do they extend the back-end networks as well, or is there something different we should be aware about?
他們是否也擴展了後端網絡,或者我們應該注意哪些不同之處?
Thank you.
謝謝。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yeah, sure, Amit.
是的,當然,阿米特。
This feels like a deja vu question, but thank you for asking it.
這感覺像是一個似曾相識的問題,但還是感謝您提出這個問題。
I'm sure it's on a lot of minds.
我確信很多人都在考慮這個問題。
We've been going through this for at least 10 years on the cloud, and the first thing I just want to say is this TAM is so huge and so large that we will always coexist with white boxes and operating systems that are non-EOS, much like Apple coexists on their iPhone with other phones of different types.
我們在雲端至少已經經歷了 10 年,我首先想說的是,這個市場規模(TAM)如此龐大,我們將永遠與非 EOS 的白盒和作業系統共存,就像 Apple 的 iPhone 與其他不同類型的手機共存一樣。
When you look at the back end of an AI cluster, there are typically two components the AI leaf and the AI spine.
當你查看 AI 群集的後端時,通常有兩個組件:AI 葉和 AI 主幹。
The AI leaf connects to the GPUs and, therefore, is the first, if you will, point of connection, and the AI spine aggregates all of these AI leaves.
AI 葉交換器連接到 GPU,因此可以說是第一個連接點,而 AI 主幹則聚合所有這些 AI 葉交換器。
Almost in all the back-end examples we've seen, the AI spine is generally 100% Arista-branded EOS.
在我們所見過的幾乎所有後端範例中,AI 主幹通常都是 100% Arista 品牌的 EOS。
You've got to do an awful lot of routing, scale, features, capabilities that are very rich that would be difficult to do in any other environment.
您必須完成大量的路由、擴展、特性和功能,這些功能非常豐富,在任何其他環境中都很難做到。
The AI leaf can vary.
AI 葉子可能會有所不同。
So for example, let's take the example of the five customers I mentioned a lot.
舉例來說,讓我們以我多次提到的五個客戶為例。
Three out of the five are all EOS in the leaf and spine.
五個中有三個都是葉和脊柱中的 EOS。
Two out of the 5 are kind of hybrids.
五種之中有兩種是混合型。
Some of them have some form of SONiC or FBOSS, and we co-develop with them and coexist in a number of use cases where it's a real hybrid combination of EOS and an open OS.
其中一些採用某種形式的 SONiC 或 FBOSS,我們與他們共同開發,並在許多用例中共存,形成 EOS 與開放作業系統的真正混合組合。
So for the most part, I'd just like to say that white box and Arista will coexist and will provide different strokes for different folks.
因此,在大多數情況下,我只想說白盒和 Arista 將共存並為不同的人提供不同的選擇。
Now, in terms of differentiators, a lot of our deployments right now are 400 and 800 gig, and you see a tremendous amount of differentiation, not only, like I explained to you, in scale and routing features, but in cost and load balancing.
現在,就差異化而言,我們現在的許多部署都是 400 和 800 千兆,您會看到巨大的差異化,不僅像我在規模和路由功能方面向您解釋的那樣,而且還有成本和負載平衡方面。
AI visibility and analytics in real time, personal queuing, congestion control, visibility, and most importantly smart system upgrades, because you sure don't want your GPUs to come down because you don't have the right software to accelerate them, so the network provides the ideal foundation: if a GPU is in trouble, we can automatically give it a different connection and an alternate connection.
即時人工智慧視覺和分析、個人排隊、擁塞控制、可視性,以及最重要的智慧型系統升級,因為您肯定不希望您的 GPU 因為沒有合適的加速軟體而崩潰,因此網路提供了理想的基礎,如果 GPU 出現問題,我們可以自動為其提供不同的連接和備用連接。
So tremendous amount of differentiation there and even more valid in a GPU which costs typically 5 times as much as a CPU.
因此存在巨大的差異,而 GPU 的差異更大,其成本通常是 CPU 的 5 倍。
Operator
Operator
Tim Long, Barclays.
巴克萊銀行的提姆朗。
Tim Long - Analyst
Tim Long - Analyst
Thank you.
謝謝。
Wanted to touch on the Cloud Titan numbers, if you could parse those.
想請您剖析一下 Cloud Titan 的數字。
Obviously, one of them, Meta, looks like, based on the numbers you gave, if I heard it right, it was down year over year.
顯然,其中之一 Meta,根據您給出的數字,如果我沒聽錯,似乎呈現年減。
If you could touch on that and then the other, if we do the math for the other Cloud Titans, looks like it went up a lot.
如果您可以談談這一點,然後再談談其他,如果我們對其他 Cloud Titans 進行計算,看起來它已經上漲了很多。
I don't know.
我不知道。
I think Oracle was kind of in that already.
我認為 Oracle 已經這麼做了。
Was there anything else going on?
還有其他事情發生嗎?
Other than the Oracle shift with the rest of the Cloud Titans where it looked extremely strong in 2024?
除了 Oracle 與其他雲端巨頭的轉變,它在 2024 年看起來非常強大之外?
Thank you.
謝謝。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yeah, so speaking specifically to Meta, we're obviously in a number of use cases in Meta.
是的,具體到 Meta,我們顯然在 Meta 中有很多用例。
Keep in mind that our 2024 Meta numbers are influenced more by their 2023 CapEx, and that was Meta's year of efficiency, where their CapEx was down 15% to 20%.
請記住,我們 2024 年的 Meta 數字更受其 2023 年資本支出的影響,而那是 Meta 效率最高的一年,其資本支出下降了 15% 至 20%。
So you're probably seeing some correlation between their CapEx being down and our revenue numbers being slightly lower in '24.
因此,您可能會看到他們的資本支出下降與我們 24 年的收入數字略低之間存在一定的關聯。
In general, I would just say all our Cloud Titans are performing well in demand, and we shouldn't confuse that with timing of our shipments, and I fully expect Microsoft and Meta to be greater than 10% customers in a strong manner in 2025 as well.
總的來說,我想說我們所有的 Cloud Titans 都表現良好,我們不應該將其與我們的發貨時間混淆,我完全預計微軟和 Meta 在 2025 年也將以強勁的方式成為超過 10% 的客戶。
Specific to the others we added in, they're not 10% customers, but they're doing very well, and we're happy with their cloud and AI use cases.
具體到我們新增的其他客戶,他們不是 10% 的客戶,但他們做得很好,我們對他們的雲端和 AI 用例感到滿意。
Operator
Operator
Ben Reitzes, Melius.
本·賴澤斯(Ben Reitzes),Melius。
Ben Reitzes - Analyst
Ben Reitzes - Analyst
Yeah, darn Tim Long took my question, so I'm going to ask about gross margins, darn that guy.
是的,該死的蒂姆朗回答了我的問題,所以我要問毛利率,該死的傢伙。
So about gross margins, obviously they go to 61% at the midpoint after being much higher in the first quarter.
就毛利率而言,顯然在第一季明顯較高之後,全年中間值會落在 61%。
I would think that implies a significant Cloud Titan mix though, and going throughout the rest of the year.
我認為這意味著 Cloud Titan 的大量混合,並將持續至今年剩餘時間。
Do you mind just giving some color on what is pushing down the gross margin a little more, and does that mean that Cloud Titans do accelerate throughout the year because the gross margin gets pushed down in your guidance?
您介意解釋一下是什麼因素讓毛利率進一步被壓低嗎?這是否意味著,由於指引中的毛利率被壓低,Cloud Titans 在全年確實會加速成長?
Thank you.
謝謝。
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
No, absolutely.
不介意,當然可以。
I think you got it right, Ben, from that perspective.
本,從這個角度來看,我認為你是對的。
As we entered Q4 last year talking about '25 guidance and as we enter into this quarter, it is and still remains a mix.
當我們去年進入第四季度時談論'25指導時,以及當我們進入本季度時,它仍然是混合的。
Jayshree gave some thoughts on the timing in the first question, so it is mix driven; there were some questions last quarter about whether it was price driven.
Jayshree 在回答第一個問題時談到了時間安排,所以這是由產品組合驅動的;上一季有人問是否是由價格驅動。
This is just a mix-driven conversation.
這只是一次混合驅動的對話。
I would say we have absorbed a little bit of the specific tariffs on China in that number, so we are absorbing that on behalf of our customers, but otherwise it's mixed driven, and we'll continue to update as we do the quarterly guidance through the year.
我想說的是,我們已經吸收了一點針對中國的特定關稅,所以我們代表我們的客戶吸收了這些關稅,但除此之外,它是混合驅動的,我們將在全年的季度指導中繼續更新。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Sorry, just going to add that John's done a fantastic job on the planning for China ahead of time.
抱歉,我只是想補充一下,約翰提前為中國所做的規劃非常出色。
So while we're absorbing the cost, most of it is related to the mix, and some of it is related to the China tariffs.
因此,雖然我們吸收了成本,但大部分成本與組合有關,部分成本與中國關稅有關。
Would you say that?
你會這麼說嗎?
John McCool - Chief Platform Officer, Senior Vice President - Engineering Operations
John McCool - Chief Platform Officer, Senior Vice President - Engineering Operations
That's right, absolutely.
沒錯,絕對是如此。
We've been working on China mitigation for some time and happy to report we've made good progress.
我們已經為緩解中國影響而努力了一段時間,很高興地報告我們取得了良好的進展。
Operator
Operator
Meta Marshall, Morgan Stanley.
摩根士丹利的 Meta Marshall。
Meta Marshall - Analyst
Meta Marshall - Analyst
Great, thanks.
太好了,謝謝。
Maybe another topical question from investors.
這可能是投資者的另一個熱門問題。
Just over the past month there has been DeepSeek, and as you think about this 1-to-1 ratio you've talked about on back end versus front end, how do you see that changing as we've seen some of the changes to thinking around training and investments?
就在過去一個月裡,我們一直在討論 DeepSeek,正如您所談論的後端與前端的 1 比 1 比例,隨著我們看到圍繞培訓和投資的思想方面的一些變化,您如何看待這種變化。
Thanks.
謝謝。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yeah, thank you, Meta.
是的,謝謝你,Meta。
Well, DeepSeek certainly [dip fit] many stocks, but I actually see this as a positive because I think you're now going to see a new class of CPUs, GPUs, AI accelerators, and where you can have substantial efficiency gains that go beyond training.
好吧,DeepSeek 確實 [逢低買入] 了許多股票,但我實際上認為這是積極的,因為我認為你現在將看到一類新的 CPU、GPU、AI 加速器,而且你可以在超越訓練的領域獲得實質性的效率提升。
So that could be some sort of inference or a mixture of experts or reasoning and -- which lowers the token count and therefore the cost so.
所以這可能是某種推論,或是專家或推理的混合——這會降低令牌數量,從而降低成本。
What I like about all these different options is Arista can be a scale up network for all kinds of XPUs and accelerators, and I think the eye opening thing here for all of our experts who are building all these engineering models is there are many different types, and training isn't the only one.
我喜歡所有這些不同選擇的原因是 Arista 可以成為各種 XPU 和加速器的擴展網絡,而且我認為這對於我們所有構建所有這些工程模型的專家來說,最令人大開眼界的是,模型有很多不同的類型,而培訓並不是唯一的類型。
So I think this is a nice evolution of how AI will not just be back-end training limited to a five-customer-type phenomenon, but will become more and more distributed across a range of CPUs and GPUs.
所以我認為這是一個很好的演進:人工智慧將不再只是侷限於五個客戶這類現象的後端訓練,而會越來越分佈在各種 CPU 和 GPU 上。
Operator
Operator
Aaron Rakers, Wells Fargo.
富國銀行的 Aaron Rakers。
Aaron Rakers - Analyst
Aaron Rakers - Analyst
Yeah, thanks for taking the question.
是的,感謝您回答這個問題。
Jayshree, I'm curious just kind of thinking about some of the questions we've gotten recently.
Jayshree,我很好奇,只是在想我們最近收到的一些問題。
When you see announcements like Stargate and, obviously, Stargate has the involvement of one of your newer Cloud Titan customers, how do you conceptualize the opportunities set for Arista, vis-a-vis both back-end and front-end networking, in deployments like that?
當您看到像 Stargate 這樣的公告時,顯然 Stargate 有您的一位較新的 Cloud Titan 客戶參與,您如何看待 Arista 在這樣的部署中在後端和前端網絡方面所面臨的機遇?
And then do you have any thoughts on just the broader context of what you're seeing on Sovereign AI opportunities in your business?
那麼,您對您所在業務中主權 AI 機會的更廣泛背景有什麼看法嗎?
Thank you.
謝謝。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yeah, thank you, Aaron.
是的,謝謝你,亞倫。
Stargate and Sovereign AI are not quite related, so let me take the first one, Stargate first.
星際之門 (Stargate) 和 Sovereign AI 關係不大,所以我先來談談第一個,星際之門。
If you look at how we have classically approached GPUs and collective libraries, we've largely looked at it as two separate building blocks.
如果您觀察我們如何經典地處理 GPU 和集體庫,我們基本上將其視為兩個獨立的構建塊。
There's a vendor who provides the GPU and there's us who provides the scale out networking.
有一個供應商提供 GPU,而我們提供橫向擴展網路。
But when you look at Stargate and projects like this, I think you'll start to see more of a vertical rack integration where the processor, the scale up, the scale out, and all of the software to provide a single point of control and visibility starts to come more and more together.
但是當你看看星際之門和類似的項目時,我認為你會開始看到更多的垂直機架集成,其中處理器、縱向擴展、橫向擴展以及提供單點控制和可視性的所有軟體開始越來越多地結合在一起。
This is not a 2025 phenomenon, but definitely in '26 and '27 you're going to see a new class of AI accelerators for -- and a new class of training and inference, which is extremely different than the current more pluggable Lego-type of version.
這不是 2025 年的現象,但肯定會在 2026 年和 2027 年看到一種新型的 AI 加速器,以及一種新型的訓練和推理,這與當前更可插入的樂高類型版本極為不同。
So we're very optimistic about it and [Everoine] is personally involved in the design of a number of these next generation projects.
因此我們對此非常樂觀,並且 [Everoine] 親自參與了許多下一代項目的設計。
And the need for this type of, shall we say, pushing Moore's law of improvements in density and performance that we saw in the 2000s is coming back.
而對於這種我們在 21 世紀看到的推動摩爾定律在密度和性能方面的改進的需求正在回歸。
And you can boost more and more performance for the XPU, which means you have to boost the network scale from 800 gig to 1.6T. There are other things to consider, like liquid cooling and co-packaging of copper and optics, so there's a lot going on there that Arista is in the middle of, with the best-of-breed hardware that John McCool's team as well as Andy's team is working on.
而且 XPU 的效能可以不斷提升,這意味著你必須將網路規模從 800G 提升到 1.6T。還有其他需要考慮的事項,例如液體冷卻以及銅纜與光學元件的共同封裝,所以這裡有很多進展,而 Arista 正處於其中,John McCool 的團隊以及 Andy 的團隊都在研發同類最佳的硬體。
Operator
Operator
Atif Malik, Citi.
花旗銀行的 Atif Malik。
Atif Malik - Analyst
Atif Malik - Analyst
Hi, thank you for taking my question, Jayshree.
你好,謝謝你回答我的問題,Jayshree。
I appreciate you calling out to pay attention to the guidance.
感謝您呼籲大家關注指導。
Now you're reiterating $750 million in AI back-end sales this year despite the stalled fifth customer.
現在,儘管第五個客戶停滯不前,你們今年仍重申 7.5 億美元的 AI 後端銷售目標。
Can you talk about where is the upside coming from this year?
可以談談今年的上升空間在哪裡嗎?
Is it broad-based or one or two customers?
是擁有廣泛的客戶還是只有一兩個客戶?
And also if you can talk to the $70 billion TAM number for 2028, how much is AI?
另外,如果您能談談 2028 年 700 億美元的 TAM 數字,那麼 AI 是多少?
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Okay, so I'll take your second question first, Atif.
好的,我先回答你的第二個問題,阿蒂夫。
On the $70 billion TAM in 2028, I would roughly say a third is AI, a third is Data Center and Cloud, and a third is Campus and Enterprise.
對於 2028 年 700 億美元的 TAM,我粗略地說,三分之一是人工智慧,三分之一是資料中心和雲,三分之一是校園和企業。
And, obviously, absorbed into that is routing and security and observability.
顯然,路由、安全性和可觀察性也包含在其中。
I'm not calling them out separately for the purpose of this discussion.
為了本次討論,我並不會單獨點名他們。
So roughly $20 billion to $25 billion on each to get to that $70 billion.
因此,三個領域每個大約 200 億至 250 億美元,加總達到 700 億美元。
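As a rough sketch of that split (an even one-third allocation of the $70 billion 2028 TAM, consistent with the $20 billion to $25 billion per segment mentioned above):

```python
# Rough sketch of the 2028 TAM split described in the call: roughly one third each
# for AI, Data Center and Cloud, and Campus and Enterprise.
total_tam_b = 70.0  # 2028 total addressable market, in billions of USD
segments = ["AI", "Data Center & Cloud", "Campus & Enterprise"]

per_segment_b = total_tam_b / len(segments)
for name in segments:
    print(f"{name}: ~${per_segment_b:.1f}B")  # ~$23B each, i.e. in the $20B-$25B range
```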
So coming back to your original question, which was?
那麼回到你最初的問題,是什麼呢?
Help me out again.
再幫我一下吧。
Atif Malik - Analyst
Atif Malik - Analyst
The $750 million in back-end sales.
7.5 億美元的後端銷售額。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yeah, so we're, as I said, we're well on our way, and three customers deploying a cumulative 100,000 GPUs is going to help us with that number this year.
是的,正如我所說的,我們已經在順利進行中,三位客戶今年共部署了 100,000 個 GPU,這將幫助我們實現這一目標。
And as we increased our guidance to $8.2 billion, I think we're going to see momentum both in AI, Cloud, and Enterprises.
隨著我們將指導金額提高至 82 億美元,我認為我們將看到人工智慧、雲端運算和企業領域的發展勢頭。
I'm not ready to break it down and tell you which where I think we'll see -- we'll know that much better in the second half, but Chantelle and I feel confident that we can definitely do the $8.2 billion that we historically don't call out so early in the year.
我還沒有準備好詳細分析並告訴您我認為我們會看到什麼——我們將在下半年更好地了解情況,但 Chantelle 和我相信我們絕對可以實現 82 億美元的目標,從歷史上看,我們不會在今年這麼早的時候宣布這一目標。
So having visibility of that helps.
因此了解這一點是有幫助的。
Operator
Operator
Samik Chatterjee, JPMorgan.
摩根大通的 Samik Chatterjee。
Samik Chatterjee - Analyst
Samik Chatterjee - Analyst
Hi, thanks for taking the question.
你好,謝謝你回答這個問題。
Just maybe I can bring up one more topic that's come up a lot in the last few days, which is the value of the EOS software layer to the back end of the network, particularly in the discussion of competition relative to a white box player.
也許我可以再提一下最近幾天出現很多的話題,那就是 EOS 軟體層對網路後端的價值,特別是在與白盒玩家的競爭率方面的討論。
How do you sort of emphasize the value of EOS to your customers?
您如何向客戶強調 EOS 的價值?
Can you sort of outline some of the sort of -- what are the key drivers we should keep in mind and again in that competitive landscape between white box and Arista?
您能否概述一下——在白盒和 Arista 之間的競爭格局中,我們應該牢記哪些關鍵驅動因素?
Thank you.
謝謝。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yeah, sure, Samik.
是的,當然,薩米克。
First of all, when you're buying these expensive GPUs that cost $25,000, they're like diamonds, right?
首先,當您購買這些價值 25,000 美元的昂貴 GPU 時,它們就像鑽石一樣,對嗎?
You're not going to string a diamond on a piece of thread.
你不會把鑽石串在一條線上。
So the first thing I want to say is you need a mission-critical network, whether you want to call it white box, blue box, EOS, or some other software.
所以我首先想說的是,你需要一個關鍵任務網路,無論你想稱之為白盒、藍盒、EOS 還是其他軟體。
You've got to have mission critical functions, analytics, visibility, high availability, etc.
您必須擁有關鍵任務功能、分析功能、視覺性功能、高可用性功能等等。
As I mentioned, and I want to reiterate, they're also typically a leaf-spine network.
正如我所提到的,我想重申一下,它們通常也是葉脊(leaf-spine)網路。
And I have yet to see an AI spine deployment that is not EOS-based.
而且我還沒看到不是基於 EOS 的 AI 主幹部署。
I'm not saying it can happen or won't happen, but in all five major installations, the benefit of our EOS features for high availability for routing, for VXLAN, for telemetry, our customers really see that.
我並不是說它會發生或不會發生,但是在所有五個主要安裝中,我們的 EOS 功能對於路由、VXLAN、遙測的高可用性的好處是我們的客戶真正看到的。
And the 7800 is the flagship AI spine product that we have been deploying last year, this year, and into the future.
而 7800 是我們去年、今年以及未來持續部署的旗艦 AI 主幹產品。
Coming soon, of course, is also the product we jointly engineered with Meta, the distributed Etherlink switch, and that is also an example of a product that provides that kind of leaf-spine combination, with both FBOSS and EOS options in it.
當然,即將推出的還有我們與 Meta 共同設計的分散式 Etherlink 交換器,這也是提供這種葉脊組合的產品範例,其中同時提供 FBOSS 和 EOS 選項。
So in my view, it's difficult to imagine a highly resilient system without Arista EOS in AI or non-AI use cases.
因此在我看來,很難想像在 AI 或非 AI 用例中沒有 Arista EOS 的高度彈性系統。
On the leaf you can cut corners.
在葉子上你可以剪一些角。
You can go with smaller buffers.
您可以使用較小的緩衝區。
You may have a smaller installation.
您的安裝可能較小。
So I can imagine that some people will want to experiment and do experiment in smaller configurations with non-EOS.
因此我可以想像有些人會想用非 EOS 在較小的配置中進行實驗。
But again, to do that you have to have a fairly large staff to build the operations for it.
但同樣,要做到這一點,你必須擁有相當多的員工來為其營運。
So that's also a critical element.
所以這也是一個關鍵因素。
So unless you're a large Cloud Titan customer, you're less likely to take that chance because you don't have the staff.
因此,除非您是 Cloud Titan 的大客戶,否則您不太可能抓住這個機會,因為您沒有員工。
So all in all, EOS is alive and well in AI and cloud use cases, except in certain specific use cases where the customer may have their own operation staff to do so.
總而言之,EOS 在 AI 和雲端用例中表現良好,但某些特定用例除外,在這些用例中,客戶可能有自己的營運人員來執行此操作。
Operator
Operator
Ben Bollin, Cleveland Research.
克利夫蘭研究公司的本‧博林 (Ben Bollin)
Ben Bollin - Analyst
Ben Bollin - Analyst
Good afternoon, everyone.
大家下午好。
Thanks for taking the question.
感謝您回答這個問題。
Jayshree, I'm interested in your thoughts on your Enterprise strategy within G2000 and how that may be evolving as it looks like refresh opportunities are intensifying.
Jayshree,我對您對 G2000 中的企業策略的想法很感興趣,以及隨著更新機會日益增多,該策略將如何發展。
Thank you.
謝謝。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yes, no, listen, we're always looking at three major threads: our classic Cloud business, our AI, and the Enterprise, which, led by Ashwin and Chris, is a very significant area of investment for us.
是的,不,聽著,我們始終關注三大主線:我們的經典雲端業務、我們的 AI,以及由 Ashwin 和 Chris 領導的企業業務,這對我們來說是非常重要的投資領域。
From a product point of view, we have a natural trickledown effect from our high-end data center products to the cloud, and so whether it's the Enterprise, Data Center or the Campus, I've never seen our portfolio be as strong as it is today.
從產品的角度來看,我們的高階資料中心產品對雲端有著自然的涓滴效應,因此無論是企業、資料中心還是園區,我從未見過我們的產品組合像今天這樣強大。
So a lot of our challenges and our execution is really in the go-to market.
因此,我們面臨的許多挑戰和執行實際上都與進入市場有關。
And that just takes time, as we've been slowly but steadily investing there, and our customer count and the number of projects we get invited to, especially, as you pointed out, in the Global 2000, have never been stronger.
這需要時間,因為我們一直在緩慢而穩定地投資,而我們的客戶數量以及受邀參與的專案數量,尤其是如您所指出的在全球 2000 強企業中,從未如此強勁。
One area I'd like to see more strength and Chris Schmidt and the team are working on it, as you can tell from our numbers, is International.
我希望看到我們更強大的一個領域,克里斯·施密特 (Chris Schmidt) 和團隊正在為此努力,正如你從我們的數據中看到的那樣,那就是國際領域。
We're bringing in some new leadership there and hope to see some significant contributions in the next year or so.
我們正在引進一些新的領導,並希望在未來一年左右看到一些重大貢獻。
Operator
Operator
Ryan Koontz, Needham & Company.
瑞安·孔茨 (Ryan Koontz),Needham & Company。
Ryan Koontz - Senior Analyst
Ryan Koontz - Senior Analyst
Great, thanks.
太好了,謝謝。
Jayshree, can you comment -- there's been a lot of chatter lately about co-packaged optics.
Jayshree,您能否評論一下—最近有很多關於共同封裝光學元件的討論。
Could you maybe speak about its place in your roadmap and how investors should think about its effect on your TAM and your opportunities to sell.
您或許可以談談它在您的路線圖中的位置,以及投資者應該如何看待它對您的 TAM 和銷售機會的影響。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Well, first of all, Andy has reminded me that co-packaged optics is not a new idea.
首先,安迪提醒我,共封裝光學元件並不是新想法。
It's been around 10 to 20 years.
已經有 10 到 20 年了。
So let's go through the fundamental reason why co-packaged optics has had relatively weak adoption so far: it's because of field failures, and most of it is still in proof of concept today.
因此,讓我們來分析為什麼共封裝光學元件迄今為止採用率相對較低,其根本原因是現場故障,而且目前大多數仍處於概念驗證階段。
So, going back to networking, the most important attribute of a network switch is reliability and troubleshooting.
所以,回到網絡,網路交換器最重要的屬性是可靠性和故障排除。
And once you solder a co-packaged optics on a PCB, you lose some of that flexibility and you don't get the serviceability and manufacturing.
一旦將共同封裝的光學元件焊接到 PCB 上,就會失去一些靈活性,並且無法獲得可維護性和製造性。
That's been the problem.
這就是問題所在。
Now a number of alternatives are emerging, and we're a big fan of co-packaged copper, as well as pluggable optics that can complement this, like linear drive or LPO as we call it.
現在出現了許多替代方案,我們非常青睞共封裝銅,以及可以與之互補的可插拔光學器件,例如我們所說的線性驅動器或 LPO。
Now, we could also see co-packaged optics improving on some of the metrics it has right now.
現在,我們也可以看到共封裝光學元件有機會改善其目前的一些指標。
For example, it has a higher channel count than the industry standard of 8-channel pluggable optics, but we can do higher channel pluggable optics as well.
例如,它的通道數比業界標準的 8 通道可插拔光學元件高,但我們也可以製作更高通道的可插拔光學元件。
So some of these things improve.
因此,有些情況會有所改善。
We can see that both CPC and CPO will be important technologies at 224 gig or even 448 gig.
我們可以看到,CPC和CPO都將成為224G甚至448G的重要技術。
But so far our customers have preferred a Lego approach that they can mix and match pluggable switches and pluggable optics and haven't committed to soldering them on the PCB and we feel that will change only if CPO gets better and more reliable, and I think CPC can be a nice alternative to that.
但到目前為止,我們的客戶更喜歡樂高的方法,他們可以混合搭配可插拔交換器和可插拔光學器件,而不必將它們焊接在 PCB 上,我們認為只有當 CPO 變得更好、更可靠時,這種情況才會改變,我認為 CPC 可以成為一個不錯的替代品。
Operator
Operator
Simon Leopold, Raymond James.
西蒙李奧波德、雷蒙詹姆斯。
Simon Leopold - Analyst
Simon Leopold - Analyst
Thank you very much for taking the question.
非常感謝您回答這個問題。
I was hoping you could maybe double click on the cognitive adjacencies.
我希望您可以雙擊認知鄰接。
It's been a meaningful part of revenue.
這是收入中很重要的一部分。
I think you said 18%.
我認為你說的是 18%。
If you could offer a little bit more color about how the elements of that are trending and your expectations for how that part of the business is growing in your 2025 expectations.
如果您可以更詳細地介紹這些要素的發展趨勢,以及您對 2025 年該部分業務成長的預期。
Thank you.
謝謝。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yeah, no, as you can imagine, that's important, both the routing and the Campus, and we've already signed up to $750 million, so out of our $8.2 billion we clearly expect that to be over a billion this year, right.
是的,正如你所想像的,路由和校園這兩者都很重要,我們已經承諾了 7.5 億美元的目標,所以在 82 億美元之中,我們顯然預期今年這部分將超過 10 億美元,對吧。
Now we don't -- in the routing use case, which is particularly Enterprise and service-provider related, it's, as I've often said, it's difficult to measure it in isolation.
現在我們沒有——在路由用例中,特別是與企業和服務提供者相關的用例,正如我經常說的那樣,很難單獨衡量它。
We're very strict about the definition being it has to be a combination of the software running with a dedicated routing hardware.
我們對定義非常嚴格,它必須是軟體運作與專用路由硬體的結合。
So for example, if the hardware is shared across switching and routing, we don't count it there.
例如,如果硬體在交換和路由之間共享,我們就不會將其計算在內。
So I think sometimes we shortchange the numbers a little bit and more of it goes in the core, but I just want you to be aware that's a very strategic piece.
因此我認為,有時我們會對數字做一些縮減,而將更多的數據用於核心,但我只是想讓你意識到,這是一個非常戰略性的部分。
You add that to the fact that the SD-WAN market is now evolving.
您還要補充一點,SD-WAN 市場正在不斷發展。
It's not just about how do you do encryption and tunnels and migrate from MPLS, but you really need a routed backbone.
這不僅涉及如何進行加密和隧道以及如何從 MPLS 遷移,而且您確實需要一個路由主幹網路。
So the combination of SD-WAN and the edge and a routed backbone really falls into the sweet spot for Arista, both in the Enterprise and service providers.
因此,SD-WAN 與邊緣和路由主幹網路的組合對於 Arista 來說確實是最佳選擇,無論是在企業還是服務供應商中。
I don't need to tell you about our Campus initiatives.
我不需要告訴你我們的校園計畫。
They are very, very keen there.
他們對此非常非常熱衷。
We see a parting of the seas, if you will, where there's a lot of fatigue with subscription models on the Campus from one competitor, and another set of competitors who are trying to do a merger or acquisition.
可以說,我們看到一種涇渭分明的局面:一家競爭對手的校園訂閱模式讓客戶深感疲乏,另一組競爭對手則正試圖進行合併或收購。
So Arista is the only pure-play campus innovator who can provide that best of breed, and we're particularly getting traction where our Data Center customers are already familiar with us: they're using a Data Center spine, and they can extend that same universal spine to wired and wireless leaves.
因此,Arista 是唯一能提供同類最佳產品的純校園創新者,我們在資料中心客戶已經熟悉我們的地方尤其獲得進展:他們正在使用資料中心主幹,並且可以將同一個通用主幹延伸到有線和無線葉節點。
So using CloudVision as a management domain, we're seeing much more traction, be it for automation, zero trust security, or observability with our Campus products.
因此,透過將 CloudVision 作為管理網域,無論是自動化、零信任安全,還是我們校園產品的可觀察性,我們都看到了更大的吸引力。
So I think both of those are meaningful, and we expect them to exceed a billion dollars.
所以我認為這兩者都很有意義,我們預計它們將超過 10 億美元。
Chantelle, do you want to say a few words?
Chantelle,你想說幾句話嗎?
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Yeah, Jayshree, to your points, which completely capture the intent, the only other things I'd add for Campus are a couple-fold.
是的,Jayshree,你的觀點已經完全表達了意圖;關於校園,我唯一想補充的還有幾點。
First is, John, who's here with us, has spent a lot of time getting us better lead times coming into this year, so we have a great lead times conversation.
首先,和我們在一起的約翰今年花了很多時間來幫助我們縮短交貨時間,所以我們就交貨時間進行了一次很棒的對話。
We're very excited about that, and the customers seemed pretty excited as well.
我們對此感到非常興奮,顧客似乎也非常興奮。
We also have a curated preferred partner program, particularly internationally, to your earlier point, Jayshree, on growing the International revenue. And we're excited because we've seen some Campus-first wins, where not only is the DC bringing us in, but we're actually bringing in the DC through a Campus win.
我們還有一個精心挑選的首選合作夥伴計劃,特別是國際計劃,正如您之前 Jayshree 提到的那樣,旨在增加國際收入,我們對此感到很興奮,因為我們已經看到一些 Campus 的首次勝利,其中不僅 DC 將我們帶入,而且我們實際上通過 Campus 的勝利帶領 DC 取得了勝利。
So that just adds a couple more points to the enthusiasm for 2025.
因此,對 2025 年的熱情又增加了幾分。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Thank you.
謝謝。
Great reminders.
很好的提醒。
Operator
Operator
Tal Liani, Bank of America.
美國銀行的塔爾·利阿尼 (Tal Liani)。
Tal Liani - Analyst
Tal Liani - Analyst
Hi guys.
嗨,大家好。
Two questions, one on Routing and one on Enterprise.
兩個問題,一個關於路由,一個關於企業。
Enterprise grew 16% this year.
今年企業成長了16%。
What drives it?
是什麼驅動了它?
Is it just regular growth of Data Centers, or do you start to see Enterprises investing because of AI and obligations for AI?
這只是資料中心的常規成長,還是您開始看到企業因為人工智慧和對人工智慧的義務而進行投資?
The second thing is about Routing.
第二件事是關於路由。
Routing used to be a small opportunity when it was just a license.
當路由只是一個許可證時,它曾經是一個小機會。
Can you elaborate on Routing?
能詳細說明一下路由嗎?
What is your differentiation versus the others, who are bundling it with optics?
與其他將其與光學捆綁在一起的產品相比,您的差異是什麼?
What, how do you sell it?
啥,你怎麼賣它?
And then how big is the opportunity?
那麼機會有多大?
Not in terms of numbers, but is it now hardware with software, or is it just a software license like it used to be?
不是從數字方面來說,而是它現在是帶有軟體的硬體還是像以前一樣僅僅是軟體許可證?
Thanks.
謝謝。
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Well, Jayshree, I can start with Enterprise and if you want to take routing?
好吧,Jayshree,我可以從企業開始,如果你想採用路由?
Yeah, so I think for Enterprise there are a few ways I would describe our growth vectors in '24, continuing into '25 and into the following years.
是的,所以我認為對於企業來說,我可以用幾種方式來描述,有點像我們在'24年以及持續到'25年及接下來幾年的增長方向。
One is coverage.
一是覆蓋範圍。
You've seen we've invested in sales and marketing headcount in '23, '24, and going into '25. In '24 we had a double-digit increase in sales and marketing headcount, so we're just getting more coverage.
你已經看到,我們在'23、'24 以及'25 年對銷售和行銷人員進行了投資。 '24年,我們的銷售和行銷員工數量實現了兩位數的成長,因此我們的覆蓋範圍更加廣泛。
We're also using that preferred partner program I mentioned to get into the Enterprise.
我們也正在使用我提到的優先合作夥伴計畫來進入企業。
We do have international kind of campaigns that we're working on.
我們確實正在進行國際性活動。
We do have a new logo focus.
我們確實有一個新的標誌焦點。
And so I think that all those land and expand motions are the growth vectors.
所以我認為所有這些著陸和擴張運動都是成長載體。
I would say, from the AI perspective, speaking with the customers, it's great that it's moved from kind of a theory to more specific conversations. You're seeing that in the banks and some of the higher-tier Global 2000 and Fortune 500 companies, so they're moving from theory to actual use cases they're speaking to.
我想說,從人工智慧的角度來看,與客戶交談,從一種理論轉向更具體的對話是很好的,而且你可以在銀行和一些較高級別的全球 2000 強和財富 500 強公司中看到這種情況,所以他們正在從理論轉向他們所談論的實際用例。
And the way they describe it is that it takes a bit of time: they're working mostly with cloud service providers at the beginning, doing some training, and then they're deciding whether to bring that on-prem for inference, so they're making those decisions.
他們是這樣描述的,這需要一些時間,他們一開始主要與雲端服務提供者合作,進行一些培訓,然後他們決定是否將其帶到本地並進行推理,所以他們做出這些決定。
So I think those are early days, but we're having really great conversations for the AI part of that.
所以我認為現在還處於早期階段,但我們正在就人工智慧部分進行非常出色的對話。
Jayshree, did you want to cover Routing?
Jayshree,你想介紹路由嗎?
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yeah, so on Routing.
是的,關於路由。
Routing's always been a critical part of our offerings for the Cloud and Data Center, where, as you rightly described, it was more of a software enhancement.
路由一直是我們為雲端和資料中心提供的產品的重要組成部分,而您正確描述的那樣,它更像是一種軟體增強。
But as we are getting more meaningful and important to the service providers as well as to the large enterprises -- I was just with a very large bank in New York last week.
但隨著我們對服務提供者和大型企業變得越來越有意義和重要——上週我剛在紐約的一家非常大的銀行。
It was snowing there, so it was super cold, so I stayed indoors most of the time.
那裡下著雪,非常冷,所以我大部分時間都待在室內。
And the use case there is not Data Center.
那裡的用例不是資料中心。
It's not Campus.
這不是校園。
It's all a WAN-routed fabric.
這全都是 WAN 路由結構。
It's pretty amazing.
這真是太神奇了。
And then they're looking for us to not just build that core routing as a hub but also take it into the spoke.
然後他們希望我們不僅將核心路由建構成樞紐,而且還將其帶入輻條。
And you mentioned features.
您也提到了特點。
I think what happened with Arista is we were largely building features for the Cloud and Data Center, which is 20% of the features.
我認為 Arista 的情況是,我們主要為雲端和資料中心建置功能,這佔功能的 20%。
But today our Routing portfolio is much more complete.
但今天我們的路由產品組合更完整。
VXLANsec, TunnelSec, MACsec encryption, MPLS, segment routing, OSPF, BGP: we've got it all, so we're no longer apologizing for what we don't have in Routing, and we obviously have the best software stack in the industry in terms of quality and support.
Vxlansec、TunnelSec、MACsec 加密、MPLS、分段路由、OSPF、BGP,我們擁有一切,因此我們不再為路由方面所缺乏的功能而道歉,而且我們顯然擁有業內品質和支援方面最好的軟體堆疊。
So while we are selling a lot of software SKUs, we are now finding ourselves in a lot of dedicated hardware SKUs, particularly with the 7280 platform, which has been a real workhorse for us and very successful.
因此,雖然我們銷售大量軟體 SKU,但我們現在也擁有大量專用硬體 SKU,特別是 7280 平台,它一直是我們真正的主力產品,而且非常成功。
Operator
Operator
Antoine Chkaiban, New Street Research.
Antoine Chkaiban,New Street Research。
Antoine Chkaiban - Analyst
Antoine Chkaiban - Analyst
Hi, thank you for taking my question.
你好,謝謝你回答我的問題。
I'd love to get your latest perspective on what you're hearing from service providers on AI.
我很想了解您對人工智慧服務提供者的最新看法。
One of your competitors mentioned that they were seeing AI driving demand for service providers because they're building out their network in anticipation of an increase in traffic driven by AI.
您的一位競爭對手提到,他們看到人工智慧正在推動對服務提供者的需求,因為他們正在建立自己的網絡,以應對人工智慧帶來的流量成長。
So just wondering if you could comment on that as well.
我只是想知道您是否也可以對此發表評論。
Thank you.
謝謝。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Antoine, do you mean the classic service providers or generally the Neoclouds?
安托萬,你指的是傳統服務提供者還是一般的 Neoclouds?
Antoine Chkaiban - Analyst
Antoine Chkaiban - Analyst
The classic service providers.
經典服務提供者。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Oh, okay, so we haven't seen a huge uptick there yet, I think, maybe some experimental.
哦,好的,所以我們還沒有看到那裡有大幅的上漲,我想,也許是一些實驗性的。
But, to answer a question you didn't ask, and one from somebody earlier that I didn't answer about the Sovereign clouds and the Neoclouds: we have seen much more activity there. We are seeing a new class of Tier 2 specialty-cloud providers emerge that want to provide AI as a service and want to be differentiated there, and there's a whole lot of funding, grant money, real money going in there.
但是,為了回答您沒有問的一個問題,我們已經看到了更多的活動,之前有人提到我沒有回答的 Sovereign 雲、Neoclouds,我們看到出現了一類新的 Tier 2 專業雲提供商,他們希望提供 AI 即服務,並希望在那裡實現差異化,並且有大量的資金、補助金和真金白銀投入其中。
So for the classic service providers it's too early to call, but for Neoclouds and specialty providers, yeah, we're seeing lots of examples of that.
因此,現在談論服務提供者還為時過早,但是 Neoclouds 和專業提供者,是的,我們看到了很多這樣的例子。
Operator
Operator
Matt Niknam, Deutsche Bank.
德意志銀行的馬特·尼克納姆(Matt Niknam)。
Matt Niknam - Analyst
Matt Niknam - Analyst
Hey, thanks so much for taking the question.
嘿,非常感謝您回答這個問題。
Maybe for Chantelle, I mean, you're sitting now on $8 billion worth of cash and equivalents on the balance sheet.
也許對 Chantelle 來說,我的意思是,你現在的資產負債表上有價值 80 億美元的現金和等價物。
So maybe just an update on how you'd prioritize uses of cash heading into 2025.
因此,這也許只是關於如何在 2025 年優先考慮現金用途的更新。
Thank you.
謝謝。
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Yeah, no, it's great.
是啊,不,非常棒。
Thank you for the question.
感謝您的提問。
We're very pleased with the performance and our capital allocation strategy has not changed coming into FY25.
我們對業績非常滿意,進入 25 財年,我們的資本配置策略沒有改變。
Just to kind of reiterate and remind, and I appreciate the opportunity to do so.
只是想重申和提醒,我很感謝有這樣做的機會。
First of all, investing that cash where we can still get a very reasonable and respectable return continues to be a priority.
首先,從投資現金的意義上來說,我們仍然可以獲得非常合理和可觀的回報,這仍然是我們的優先考慮。
Repurchasing: you saw what we did through '24, and we're willing to do what we can through '25.
回購,您看到了我們在 24 年所做的事情,並願意在 25 年盡我們所能。
Organic investment: you saw that we're still looking to scale the company in R&D, sales, and back office. And probably the one that's the least on the scale is sizable inorganic activity.
有機投資,所以你看到了,從某種意義上說,我們仍然希望擴大公司的研發銷售、後台辦公室規模,而其中規模最小的可能是相當大的無機活動。
So I was focused on the first four, and that's how we remain in '25.
因此我集中精力於前四名,而這也是我們在 25 年保持的狀態。
Operator
Operator
David Vogt, UBS.
瑞銀的戴維沃格特(David Vogt)。
David Vogt - Analyst
David Vogt - Analyst
Great, thanks guys for taking my question.
太好了,謝謝大家回答我的問題。
So Jayshree, I have a question about the evolution of speed deployments at some of your M&M customers.
Jayshree,我對您的一些 M&M 客戶的速度部署的演變有一個疑問。
Obviously, you mentioned 400 gig and 800 gig have been a principal driver.
顯然,您提到 400 和 800 G 顯然是主要驅動因素。
How are you thinking about how that plays out in '25 and beyond?
您認為這在 25 年及以後會如何發展?
There's some wins out there.
那裡有一些勝利。
I know, at 1.6T in different parts of the network, but I'm trying to get a sense for how you think about 400 going into 800 and ultimately into 1.6T, not just in '25 but in '26 and beyond.
我知道 1.6 的網路的不同部分,但我試圖了解你是如何考慮的,從 400 到 800,最終到 1.6,不僅在 '25 年,而且在 '26 年及以後。
Thanks.
謝謝。
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
No, I think that's a very good question.
不,我認為這是非常好的問題。
The speed transitions because of AI are certainly getting faster.
因為人工智慧而導致的速度轉變肯定會越來越快。
When we went from 200 gig, for example at Meta, or 100 gig in some of our Cloud Titans, to 400 gig, that speed transition typically took three to four, maybe even five years, right?
當我們從 Meta 的 200G 或某些 Cloud Titans 的 100G 增加到 400G 時,這種速度轉換通常需要三到四年,甚至五年的時間,對嗎?
In AI we see that cycle being almost every two years.
在人工智慧中,我們發現這個週期幾乎每兩年一次。
So I'd say 2024 was the year of real 400 gig. '25 and '26, I would say, is more 800 gig, and I really see 1.6T coming into the picture only in, what do you say, John, maybe late '26, because we don't have the chips yet, and real production maybe in '27.
因此,我認為 2024 年是真正 400G 的一年。'25 和 '26 年,我認為更多是 800G,而 1.6T 大概要到,約翰,你說呢,也許 '26 年底才會進入畫面,因為我們還沒有晶片,真正的量產可能要到 '27 年。
So there's a lot of talk and hype on it, just like I remember the talk and hype on 400 gig five years ago, but I think realistically you're going to see a long runway for 400 and 800 gig.
因此,對此有很多討論和炒作,就像我記得五年前對 400G 的討論和炒作一樣,但我認為現實是,你會看到 400 和 800G 的長期發展。
Now, as you get into 1.6T, part of the reason I think it's going to be measured and thoughtful is that many of our customers are still awaiting their own AI accelerators or Nvidia GPUs which, with liquid cooling, would actually push that kind of bandwidth.
現在,當進入 1.6T 時,我認為它會是有節奏且深思熟慮的,部分原因是我們的許多客戶仍在等待自己的 AI 加速器或 Nvidia GPU,而這些搭配液冷的產品才會真正推動這種頻寬需求。
So, new GPUs will require a new bandwidth and that's going to push it out a year or two.
因此,新的 GPU 將需要新的頻寬,這將推遲一兩年。
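As a hedged aside (not figures from the call), the per-lane arithmetic commonly behind these Ethernet port-speed generations looks roughly like this; the lane configurations are typical industry examples, not Arista product commitments.

```python
# Illustrative lane math (assumed, not stated on the call) behind the
# 400G -> 800G -> 1.6T port-speed transitions discussed above.
GENERATIONS = {
    "400G": {"lanes": 8, "per_lane_gbps": 50},    # e.g., 8 x 50G PAM4 lanes
    "800G": {"lanes": 8, "per_lane_gbps": 100},   # e.g., 8 x 100G PAM4 lanes
    "1.6T": {"lanes": 8, "per_lane_gbps": 200},   # e.g., 8 x 200G-class lanes
}

for name, cfg in GENERATIONS.items():
    total = cfg["lanes"] * cfg["per_lane_gbps"]
    print(f"{name}: {cfg['lanes']} x {cfg['per_lane_gbps']}G = {total}G per port")
```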
Operator
Operator
Karl Ackerman, BNP Paribas.
法國巴黎銀行的卡爾‧阿克曼(Karl Ackerman)。
Karl Ackerman - Analyst
Karl Ackerman - Analyst
Yes, thank you.
是的,謝謝。
Could you discuss the outlook for services relative to your outlook for the March quarter and for the full year?
您能否討論一下相對於全年 3 月的展望,服務業的前景如何?
I ask because services grew 40% year over year, and your deferred revenue balance is up another $250 million or so sequentially, to nearly $2.8 billion.
我之所以問這個問題,是因為服務業務年增了 40%,你們的遞延收入餘額比上一季增加了 2.5 億美元左右,達到近 28 億美元。
So the outlook for that would be very helpful.
因此,這種前景將會非常有幫助。
Thank you.
謝謝。
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Chantelle Breithaupt - Chief Financial Officer, Senior Vice President
Yeah, we don't usually guide the piece parts of that.
是的,我們通常不指導其中的各個部分。
So all I would say is that the thing to keep in mind with services is a little bit of timing in the sense of catching up to product, especially kind of post COVID.
因此,我想說的是,提供服務時需要牢記的一點就是要把握好時機,以便趕上產品的步伐,尤其是在疫情過後。
So I would kind of take the trend that you see over the last few years, and that's probably your best guide looking forward.
因此,我會借鏡過去幾年的趨勢,這可能是您展望未來的最佳指南。
We don't guide the piece parts.
我們不會對各個組成部分提供指引。
Operator
Operator
Sebastien Naji, William Blair.
塞巴斯蒂安·納吉、威廉·布萊爾。
Sebastien Naji - Analyst
Sebastien Naji - Analyst
Yeah, thanks for taking the question. We talked about this a little bit, but there's been a lot of discussion over the last few months between the general-purpose GPU clusters from Nvidia and the custom ASIC solutions from some of your larger customers. I guess, just in your view over the longer term, does the Arista opportunity differ across these two chip types, and is there one approach that would maybe pull in more Arista versus the other?
是的,感謝您回答這個問題。我們已經稍微談過這個話題,但過去幾個月裡,關於 Nvidia 的通用 GPU 集群與您一些大型客戶的定製 ASIC 解決方案,有很多討論。我想,從您的角度來看,長期而言,Arista 在這兩種晶片類型上的機會是否不同?是否有一種方式會比另一種帶來更多 Arista 的業務?
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Jayshree Ullal - Chairman of the Board, President, Chief Executive Officer
Yeah, no, absolutely, I think, I've always said this, you guys often spoke about Nvidia as a competitor, and I don't see it that way.
是的,不,絕對如此,我認為,我一直都這麼說,你們經常把 Nvidia 稱為競爭對手,但我並不這麼認為。
I see it as: thank you, Nvidia, and thank you, Jensen, for the GPUs, because that gives us an opportunity to connect to them, and that's been a predominant market for us.
我的看法是:感謝 Nvidia,感謝 Jensen 提供的 GPU,因為這讓我們有機會與它們連接,而這一直是我們的主要市場。
As we move forward, we see that not only can we connect to them, we can also connect to AMD GPUs and in-house-built AI accelerators.
隨著我們不斷前進,我們不僅看到與它們連接,而且看到我們能夠與 AMD GPU 和內建的內部 AI 加速器連接。
So a lot of them are in active development or in early stages; Nvidia is the dominant market-share holder with probably 80%, 90%.
其中許多都處於積極開發或早期階段,Nvidia 是主要市場份額持有者,大概佔有 80% 到 90% 的份額。
But if you ask me to guess what it would look like two or three years from now, I think, it could be 50/50.
但如果你讓我猜兩三年後它會是什麼樣子,我想,可能性是 50%。
So Arista could be the scale-out network for all types of accelerators; we'll be GPU-agnostic, and I think there'll be less opportunity for specific vendors to bundle and more opportunity for customers to choose best of breed.
因此,Arista 可以成為所有類型加速器的橫向擴展網絡,我們將與 GPU 無關,而且我認為由特定供應商進行捆綁的機會會更少,而客戶可以選擇最佳產品的機會會更多。
Rudolph Araujo - Director of IR Advocacy
Rudolph Araujo - Director of IR Advocacy
This concludes Arista Networks' fourth-quarter 2024 earnings call.
這就是 Arista Networks 2024 年第四季財報電話會議的結束。
We have posted a presentation that provides additional information on our results, which you can access on the Investor Section of our website.
我們發布了一份演示文稿,提供有關我們業績的更多信息,您可以在我們網站的投資者部分訪問。
Thank you for joining us today and for your interest in Arista.
感謝您今天加入我們並關注 Arista。
Operator
Operator
Thank you for calling, ladies and gentlemen.
女士們、先生們,謝謝大家的來電。
This concludes today's call and you may now disconnect.
今天的通話到此結束,您可以掛斷電話了。