博通 (Broadcom) 報告稱,2025 財年第二季營收達到創紀錄的 150 億美元。該公司的 AI 半導體收入和基礎設施軟體收入均實現了顯著增長。他們預計 2026 年 AI 收入將繼續成長,並專注於擴大其 XPU 業務。
博通不參與 NVLink 項目,並優先透過分紅和減債來實現資本回報。他們對併購持謹慎態度,並且不確定出口管制對 AI 的影響。該公司計劃於 2025 年 9 月 4 日公佈第三季業績。
使用警語:中文譯文來源為 Google 翻譯,僅供參考,實際內容請以英文原文為主
Operator
Operator
Welcome to Broadcom Inc's Second Quarter Fiscal Year 2025 Financial Results Conference Call. At this time, for opening remarks and introductions, I would like to turn the call over to Ji Yoo, Head of Investor Relations of Broadcom Inc.
歡迎參加博通公司 2025 財年第二季財務業績電話會議。現在,進行開場白和介紹。我想把電話轉給 Ji Yoo。博通公司投資人關係主管
Ji Yoo - Head of Investor Relations
Ji Yoo - Head of Investor Relations
Thank you, operator, and good afternoon, everyone. Joining me on today's call are Hock Tan, President and CEO; Kirsten Spears, Chief Financial Officer; and Charlie Kawwas, President, Semiconductor Solutions Group. Broadcom distributed a press release and financial tables after the market closed, describing our financial performance for the second quarter of fiscal year 2025.
謝謝接線員,大家下午好。參加今天電話會議的還有總裁兼執行長 Hock Tan、財務長 Kirsten Spears 和半導體解決方案集團總裁 Charlie Kawwas。博通在收盤後發布了一份新聞稿和財務表,描述了我們2025財年第二季的財務表現。
If you did not receive a copy, you may obtain the information from the Investors section of Broadcom's website at broadcom.com. This conference call is being webcast live, and an audio replay of the call can be accessed for one year through the Investors section of Broadcom's website. During the prepared comments, Hock and Kirsten will be providing details of our second quarter fiscal year 2025 results, guidance for our third quarter of fiscal year 2025 as well as commentary regarding the business environment.
如果您尚未收到副本,您可以從博通網站 broadcom.com 的「投資者」版塊獲取相關資訊。本次電話會議將進行網路直播,您可以透過博通網站的「投資者」版塊收聽電話會議的音訊回放,重播有效期為一年。在準備好的評論中,霍克和克爾斯滕將提供我們 2025 財年第二季業績的詳細資訊、2025 財年第三季的指導以及有關商業環境的評論。
We'll take questions after the end of our prepared comments. Please refer to our press release today and our recent filings with the SEC for information on the specific risk factors that could cause our actual results to differ materially from the forward-looking statements made on this call. In addition to US GAAP reporting, Broadcom reports certain financial measures on a non-GAAP basis.
我們將在準備好的評論結束後回答問題。請參閱我們今天的新聞稿和我們最近向美國證券交易委員會提交的文件,以獲取可能導致我們的實際結果與本次電話會議中的前瞻性陳述存在重大差異的具體風險因素的資訊。除了美國 GAAP 報告外,Broadcom 還以非 GAAP 為基礎報告某些財務指標。
A reconciliation between GAAP and non-GAAP measures is included in the tables attached to today's press release. Comments made during today's call will primarily refer to our non-GAAP financial results.
今天的新聞稿所附的表格中包含了 GAAP 指標和非 GAAP 指標之間的對帳。今天電話會議上的評論主要涉及我們的非公認會計準則財務表現。
I will now turn the call over to Hock.
現在我將把電話轉給霍克。
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
Thank you, Ji, and thank you, everyone, for joining us today. In our fiscal Q2 2025, total revenue was a record $15 billion, up 20% year-on-year. This 20% year-on-year growth was all organic as Q2 last year was the first full quarter with VMware. Now revenue was driven by continued strength in AI semiconductor and the momentum we have achieved in VMware.
謝謝吉,也謝謝大家今天加入我們。在我們的 2025 財年第二季度,總營收創紀錄地達到 150 億美元,年增 20%。由於去年第二季度是 VMware 的第一個完整季度,因此年成長 20% 完全是自然成長。現在,收入是由人工智慧半導體的持續強勁以及我們在 VMware 取得的勢頭推動的。
Now reflecting excellent operating leverage, Q2 consolidated adjusted EBITDA was $10 billion, up 35% year-on-year. Now let me provide more color. Q2 semiconductor revenue was $8.4 billion, with growth accelerating to 17% year-on-year, up from 11% in Q1. And of course, driving this growth was AI semiconductor revenue of over $4.4 billion, which is up 46% year-on-year and continues the trajectory of nine consecutive quarters of strong growth.
這反映了出色的營運槓桿,第二季合併調整後 EBITDA 為 100 億美元,年增 35%。現在讓我提供更多細節:第二季半導體營收為 84 億美元,年成長從第一季的 11% 加速至 17%。當然,推動這一成長的是超過 44 億美元的人工智慧半導體收入,該收入年增 46%,並延續了連續九個季度強勁成長的軌跡。
Within this, custom AI accelerators grew double digits year-on-year, while AI networking grew over 170% year-on-year. AI networking, which is based on Ethernet, was robust and represented 40% of our AI revenue. As a standards-based open protocol, Ethernet enables one single fabric for both scale-out and scale-up and remains the preferred choice of our hyperscale customers.
其中,客製化 AI 加速器年增兩位數,AI 網路年增超過 170%。基於乙太網路的 AI 網路非常強勁,占我們 AI 收入的 40%。作為一種基於標準的開放協議,乙太網路讓橫向擴展與縱向擴展共用單一網路架構 (fabric),並且仍然是我們超大規模客戶的首選。
Our networking portfolio of Tomahawk switches, Jericho routers and NICs is what's driving our success within AI clusters in hyperscale. And the momentum continues with our breakthrough Tomahawk switch just announced this week. This represents the next generation of 102.4 terabits per second switch capacity. Tomahawk 6 enables clusters of more than 100,000 AI accelerators to be deployed in just two tiers instead of three.
我們的 Tomahawk 交換器、Jericho 路由器和 NIC 網路產品組合是推動我們在超大規模 AI 叢集中取得成功的動力。隨著我們本週剛發表的突破性 Tomahawk 交換器,這一勢頭仍在持續。這代表下一代每秒 102.4 太比特的交換容量。Tomahawk 6 使得由超過 100,000 個 AI 加速器組成的叢集只需部署兩層,而不是三層。
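As a rough sanity check on the two-tier claim above, here is a minimal Python sketch. It assumes the 102.4 Tbps of switch capacity is carved into 512 ports of 200 Gbps each and a standard non-blocking two-tier leaf/spine fabric with half of each leaf's ports facing the accelerators; neither of those assumptions is stated on the call.

```python
# Rough check of "more than 100,000 AI accelerators in just two tiers".
# Assumptions (not from the call): 200 Gbps ports and a non-blocking
# two-tier leaf/spine fabric with half of each leaf's ports facing down.

SWITCH_CAPACITY_GBPS = 102_400       # 102.4 Tbps switch capacity (from the call)
PORT_SPEED_GBPS = 200                # assumed port speed

radix = SWITCH_CAPACITY_GBPS // PORT_SPEED_GBPS   # 512 ports per switch
accelerators_per_leaf = radix // 2                # 256 ports face accelerators
max_leaves = radix                                # each spine port feeds one leaf
max_accelerators = accelerators_per_leaf * max_leaves

print(f"radix = {radix}, two-tier capacity ≈ {max_accelerators:,} accelerators")
# -> radix = 512, two-tier capacity ≈ 131,072 accelerators (> 100,000)
```

Under those assumptions the arithmetic is consistent with the "more than 100,000 accelerators in two tiers" statement; a third tier would only be needed at larger radix-limited scales.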
This flattening of the AI cluster is huge because it enables much better performance in training next-generation frontier models through lower latency, higher bandwidth and lower power. Turning to XPUs, or custom accelerators. We continue to make excellent progress on the multiyear journey of enabling our three customers and four prospects to deploy custom accelerators.
AI 叢集的扁平化意義重大,因為它可以透過更低的延遲、更高的頻寬和更低的功耗,在訓練下一代前沿模型時實現更好的效能。接下來談 XPU,也就是客製化加速器。我們在幫助三家客戶和四家潛在客戶部署客製化加速器的多年歷程中持續取得出色進展。
As we had articulated over six months ago, we eventually expect at least three customers to each deploy clusters of 1 million AI accelerators in 2027, largely for training their frontier models. And we forecast, and continue to do so, a significant percentage of these deployments to be custom XPUs. These partners are still unwavering in their plan to invest despite the uncertain economic environment.
正如我們六個多月前所闡述的,我們最終預計至少有三家客戶將在 2027 年各自部署由 100 萬個 AI 加速器組成的叢集,主要用於訓練他們的前沿模型。我們過去並持續預測,這些部署中有相當大的比例將採用客製化 XPU。儘管經濟環境存在不確定性,這些合作夥伴仍堅定不移地執行其投資計畫。
In fact, what we've seen recently is that they are doubling down on inference in order to monetize their platforms. And reflecting this, we may actually see an acceleration of XPU demand into the back half of 2026 to meet urgent demand for inference on top of the demand we have indicated from training. And accordingly, we do anticipate now our fiscal 2025 growth rate of AI semiconductor revenue to sustain into fiscal 2026.
事實上,我們最近看到的是,他們正在加倍投入推理,以便將他們的平台貨幣化。考慮到這一點,我們實際上可能會看到 XPU 需求在 2026 年下半年加速成長,以滿足除了我們所指出的訓練需求之外的推理方面的迫切需求。因此,我們現在預計 2025 財年的 AI 半導體營收成長率將持續到 2026 財年。
Turning to our Q3 outlook, as we continue our current trajectory of growth, we forecast AI semiconductor revenue to be $5.1 billion, up 60% year-on-year, which would be the tenth consecutive quarter of growth. Now turning to non-AI semiconductors in Q2: revenue of $4 billion was down 5% year-on-year. Non-AI semiconductor revenue is close to the bottom and has been relatively slow to recover, but there are bright spots.
談談我們的第三季展望。我們繼續當前的成長軌跡。我們預測人工智慧半導體營收將達到 51 億美元,年成長 60%,這將是連續第十個季度成長。現在轉向第二季的非人工智慧半導體,營收 40 億美元,年減 5%。非AI半導體收入已接近底部,復甦相對較慢,但也存在亮點。
In Q2, broadband, enterprise networking and server storage revenues were up sequentially. However, industrial was down and, as expected, wireless was also down due to seasonality. In Q3, we expect enterprise networking and broadband to continue to grow sequentially, but server storage, wireless and industrial are expected to be largely flat. And overall, we forecast non-AI semiconductor revenue to stay around $4 billion.
第二季,寬頻、企業網路和伺服器儲存收入較上季成長。然而,工業業務下滑,正如預期,無線業務也因季節性因素而下滑。第三季,我們預計企業網路和寬頻將繼續較上季成長,但伺服器儲存、無線和工業預計將大致持平。總體而言,我們預測非 AI 半導體收入將維持在 40 億美元左右。
Now let me talk about our infrastructure software segment. Q2 infrastructure software revenue of $6.6 billion was up 25% year-on-year, above our outlook of $6.5 billion. As we have said before, this growth reflects our success in converting our enterprise customers from perpetual vSphere to the full VCF software stack subscription.
現在讓我談談我們的基礎設施軟體部分。第二季基礎設施軟體營收為 66 億美元,年增 25%,高於我們預期的 65 億美元。正如我們之前所說,這種成長反映了我們成功地將企業客戶從永久授權的 vSphere 轉換為完整的 VCF 軟體堆疊訂閱。
Customers are increasingly turning to VCF to create a modernized private cloud on-prem, which will enable them to repatriate workloads from public clouds while being able to run modern container-based applications and AI applications. Of our 10,000 largest customers, over 87% have now adopted VCF. The momentum from strong VCF sales over the past 18 months since the acquisition of VMware has created annual recurring revenue, otherwise known as ARR, growth of double digits in our core infrastructure software.
越來越多的客戶開始使用 VCF 在本地建立現代化的私有雲,這將使他們能夠將工作負載從公有雲遷回,同時能夠運行現代的容器化應用程式和 AI 應用程式。在我們最大的 10,000 家客戶中,超過 87% 已經採用了 VCF。自收購 VMware 以來,過去 18 個月 VCF 銷售的強勁勢頭,使我們核心基礎設施軟體的年度經常性收入 (ARR) 實現了兩位數成長。
In Q3, we expect infrastructure software revenue to be approximately $6.7 billion, up 16% year-on-year. So in total, we're guiding Q3 consolidated revenue to be approximately $15.8 billion, up 21% year-on-year. We expect Q3 adjusted EBITDA to be at least 66% of revenue.
第三季度,我們預計基礎設施軟體營收約為 67 億美元,年增 16%。因此,總體而言,我們預計第三季綜合收入約為 158 億美元,年增 21%。我們預計第三季調整後的 EBITDA 至少為 66%。
With that, let me turn the call over to Kirsten.
說完這些,讓我把電話轉給 Kirsten。
Kirsten Spears - Chief Financial Officer, Chief Accounting Officer
Kirsten Spears - Chief Financial Officer, Chief Accounting Officer
Thank you, Hock. Let me now provide additional detail on our Q2 financial performance. Consolidated revenue was a record $15 billion for the quarter, up 20% from a year ago. Gross margin was 79.4% of revenue in the quarter better than we originally guided on product mix. Consolidated operating expenses were $2.1 billion, of which $1.5 billion was related to R&D.
謝謝你,霍克。現在讓我提供有關我們第二季財務業績的更多細節。本季綜合營收達到創紀錄的 150 億美元,比去年同期成長 20%。本季毛利率為 79.4%,高於我們最初對產品組合的預期。合併營運費用為 21 億美元,其中 15 億美元與研發有關。
Q2 operating income of $9.8 billion was up 37% from a year ago, with operating margin at 65% of revenue. Adjusted EBITDA was $10 billion or 67% of revenue, above our guidance of 66%. This figure excludes $142 million of depreciation. Now a review of the P&L for our two segments. Starting with semiconductors. Revenue for our semiconductor Solutions segment was $8.4 billion, with growth accelerating to 17% year-on-year, driven by AI.
第二季營業收入為 98 億美元,比去年同期成長 37%,營業利潤率為營收的 65%。調整後的 EBITDA 為 100 億美元,佔營收的 67%,高於我們預期的 66%。該數字不包括 1.42 億美元的折舊。現在回顧我們兩個部門的損益表。從半導體開始。我們的半導體解決方案部門的收入為 84 億美元,在人工智慧的推動下,成長率年增至 17%。
Semiconductor revenue represented 56% of total revenue in the quarter. Gross margin for our Semiconductor Solutions segment was approximately 69% up 140 basis points year-on-year driven by product mix. Operating expenses increased 12% year-on-year to $971 million on increased investment in R&D for leading edge AI semiconductors.
半導體收入佔本季總收入的56%。受產品組合推動,我們半導體解決方案部門的毛利率約為 69%,較去年同期成長 140 個基點。由於對前沿人工智慧半導體研發的投資增加,營運費用年增 12% 至 9.71 億美元。
Semiconductor operating margin of 57% was up 200 basis points year-on-year. Now moving on to infrastructure software. Revenue for infrastructure software of $6.6 billion was up 25% year-on-year and represented 44% of total revenue, gross margin for infrastructure software was 93% in the quarter compared to 88% a year ago.
半導體營業利益率為57%,較去年同期成長200個基點。現在轉向基礎設施軟體。基礎設施軟體營收為 66 億美元,年增 25%,佔總營收的 44%,本季基礎設施軟體毛利率為 93%,去年同期為 88%。
Operating expenses were $1.1 billion in the quarter, resulting in infrastructure software operating margin of approximately 76%. This compares to operating margin of 60% a year ago. This year-on-year improvement reflects our disciplined integration of VMware. Moving on to cash flow. Free cash flow in the quarter was $6.4 billion and represented 43% of revenue.
本季營運費用為 11 億美元,導致基礎設施軟體營運利潤率約為 76%。相比之下,一年前公司的營業利益率為 60%。這一同比進步反映了我們對 VMware 的嚴格整合。繼續討論現金流。本季自由現金流為 64 億美元,佔營收的 43%。
Free cash flow as a percentage of revenue continues to be impacted by increased interest expense from debt related to the VMware acquisition and increased cash taxes. We spent $144 million on capital expenditures. Days sales outstanding were 34-days in the second quarter compared to 40-days a year ago. We ended the second quarter with inventory of $2 billion, up 6% sequentially in anticipation of revenue growth in future quarters.
自由現金流佔收入的百分比持續受到與 VMware 收購相關的債務利息支出增加以及現金稅增加的影響。我們的資本支出為 1.44 億美元。第二季應收帳款週轉天數為 34 天,去年同期為 40 天。我們第二季的庫存為 20 億美元,季增 6%,預計未來幾季的營收將成長。
Our days of inventory on hand were 69 days in Q2 as we continue to remain disciplined in how we manage inventory across the ecosystem. We ended the second quarter with $9.5 billion of cash and $69.4 billion of gross principal debt. Subsequent to quarter end, we repaid $1.6 billion of debt, resulting in gross principal debt of $67.8 billion.
由於我們持續嚴格管理整個生態系統的庫存,第二季的庫存天數為 69 天。截至第二季末,我們的現金餘額為 95 億美元,總本金債務為 694 億美元。季度結束後,我們償還了 16 億美元的債務,使總本金債務降至 678 億美元。
The weighted average coupon rate and years to maturity of our $59.8 billion in fixed rate debt is 3.8% and seven years, respectively. The weighted average interest rate and years to maturity of our $8 billion in floating rate debt is 5.3% and 2.6-years, respectively. Turning to capital allocation in Q2, we paid stockholders $2.8 billion of cash dividends based on a quarterly common stock cash dividend of $0.59 per share.
我們的 598 億美元固定利率債務的加權平均票面利率和到期年限分別為 3.8% 和 7 年。我們的 80 億美元浮動利率債務的加權平均利率和到期年限分別為 5.3% 和 2.6 年。談到第二季的資本配置,我們根據每股 0.59 美元的季度普通股現金股利向股東支付了 28 億美元的現金股利。
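For illustration only, a small Python sketch of the blended borrowing cost implied by the debt figures above; it is a simple principal-weighted average that ignores fees, hedges and repayment timing.

```python
# Illustrative principal-weighted average of the stated borrowing costs.
fixed_principal, fixed_rate = 59.8e9, 0.038        # $59.8B fixed at 3.8%
floating_principal, floating_rate = 8.0e9, 0.053   # $8.0B floating at 5.3%

total = fixed_principal + floating_principal        # sums to ~$67.8B, matching the
blended = (fixed_principal * fixed_rate             # post-repayment gross principal
           + floating_principal * floating_rate) / total

print(f"gross principal ≈ ${total / 1e9:.1f}B, blended rate ≈ {blended:.2%}")
# -> gross principal ≈ $67.8B, blended rate ≈ 3.98%
```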
In Q2, we repurchased $4.2 billion or approximately 25 million shares of common stock. In Q3, we expect the non-GAAP diluted share count to be approximately 4.97 billion shares, excluding the potential impact of any share repurchases. Now moving on to guidance. Our guidance for Q3 is for consolidated revenue of $15.8 billion up 21% year-on-year.
第二季度,我們回購了價值 42 億美元的約 2,500 萬股普通股。在第三季度,我們預計非公認會計準則稀釋股數約為 49.7 億股,不包括任何股票回購的潛在影響。現在開始指導。我們對第三季的預期是綜合營收達到 158 億美元,年增 21%。
We forecast semiconductor revenue of approximately $9.1 billion, up 25% year-on-year. Within this, we expect Q3 AI semiconductor revenue of $5.1 billion, up 60% year-on-year. We expect infrastructure software revenue of approximately $6.7 billion, up 16% year-on-year. For modeling purposes, we expect Q3 consolidated gross margin to be down approximately 130 basis points sequentially, primarily reflecting a higher mix of XPUs within AI revenue.
我們預測半導體收入約為 91 億美元,年增 25%。其中,我們預期第三季AI半導體營收為51億美元,年增60%。我們預計基礎設施軟體營收約為 67 億美元,年增 16%。出於建模目的,我們預計第三季綜合毛利率將環比下降約 130 個基點,主要反映了 AI 收入中 XPU 組合的增加。
As a reminder, consolidated gross margins through the year will be impacted by the revenue mix of infrastructure software and semiconductors. We expect Q3 adjusted EBITDA to be at least 66% of revenue. We expect the non-GAAP tax rate for Q3 and fiscal year 2025 to remain at 14%. And with this, that concludes my prepared remarks.
提醒一下,全年的綜合毛利率將受到基礎設施軟體和半導體收入組合的影響。我們預計第三季調整後的 EBITDA 至少為 66%。我們預計第三季和2025財年的非公認會計準則稅率將維持在14%。我的準備好的發言到此結束。
Operator, please open up the call for questions.
接線員,請打開電話詢問。
Operator
Operator
(Operator Instructions) Ross Seymore, Deutsche Bank.
(操作員指令)Ross Seymore,德意志銀行。
Ross Seymore - Analyst
Ross Seymore - Analyst
I wanted to jump on to the AI side and specifically some of the commentary you had about next year. Can you just give a little bit more color on the inference commentary you gave? And is it more of the XPU side, the connectivity side or both that's giving you the confidence to talk about the growth rate that you have this year being matched next fiscal year?
我想談談人工智慧方面,特別是您對明年的一些評論。您能否對您給予的推論評論進行更詳細的說明?是 XPU 方面、連結性方面,還是兩者兼而有之,讓您有信心談論今年的成長率與下一財年的成長率相符?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
Thank you, Ross. Good question. I think we're indicating that what we are seeing and what we have quite a bit of visibility increasingly, is increased deployment of XPUs next year, much more than we originally thought. And hand-in-hand, we did, of course, more and more networking. So it's a combination of both.
謝謝你,羅斯。好問題。我認為,我們表明,我們所看到的以及我們越來越清楚的是,明年 XPU 的部署將會增加,遠遠超出我們最初的想像。當然,我們攜手並進,建立了越來越多的網路。所以這是兩者的結合。
Ross Seymore - Analyst
Ross Seymore - Analyst
And the inference side of things?
那麼事物的推理面呢?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
Yes, we are seeing much more inference now.
是的,我們現在看到了更多的推論。
Operator
Operator
Harlan Sur, JPMorgan.
摩根大通的 Harlan Sur。
Harlan Sur - Analyst
Harlan Sur - Analyst
Great job on the quarterly execution. Good to see the positive inflection in quarter-over-quarter and year-over-year growth rates in your AI business. As the team has mentioned, the quarters can be a bit lumpy. So if I look at the first three quarters of this fiscal year, your AI business is up 60% year-over-year. It's kind of right in line with your three-year SAM growth CAGR, right?
季度執行情況非常好。很高興看到您的 AI 業務在環比和同比成長率上都出現正向拐點。正如團隊所提到的,各季度的業績可能會有點起伏。因此,如果看本財年的前三個季度,您的 AI 業務年增 60%,這與您三年的 SAM 成長複合年增長率大致一致,對嗎?
Given your prepared remarks, and knowing that your lead times remain at 35 weeks or better, do you see the Broadcom team sustaining the year-over-year growth rate exiting this year? And I assume that potentially implies that you see your AI business sustaining the 60% year-over-year growth rate into fiscal '26, based on your prepared commentary, which again is in line with your SAM growth CAGR. Is that a fair way to think about the trajectory this year and next year?
鑑於您準備好的評論,並且知道您的交貨時間保持在 35 週或更短,您是否認為 Broadcom 團隊將在今年保持同比增長率,並且我認為這可能意味著您看到您的 AI 業務在 26 財年再次保持 60% 的同比增長率,基於您準備好的評論,這再次與您的 SAM 增長製造商一致。這是思考今年和明年的發展軌跡的公平方式嗎?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
Harlan, that's a very insightful set of analysis. And that's exactly what we're trying to do here. Over six months ago, we gave you guys a single point in time, the year 2027. As we come into the second half of 2025, we have improved visibility and updates on the way our hyperscale partners are deploying AI clusters in their data centers.
哈蘭,這是一組非常有見地的分析。這正是我們在這裡想做的。六個多月前,我們給了你們一個時間點,也就是 2027 年。隨著我們進入 2025 年下半年,我們對超大規模合作夥伴在資料中心部署 AI 叢集的方式有了更清楚的了解和更新。
So we are providing you some level of visibility into what we are seeing and how the trajectory of '26 might look. I'm not giving you any update on '27; we're still standing by the view we laid out for '27 six months ago. But what we're doing now is giving you more visibility into where we see '26 headed.
因此,我們提供某種程度的能見度,說明我們所看到的情況以及 26 年的軌跡可能是什麼樣子。我不會給你們任何關於 27 年的更新;我們仍維持六個月前對 27 年提出的看法。但我們現在所做的,是讓你們更清楚地了解我們認為 26 年的走向。
Harlan Sur - Analyst
Harlan Sur - Analyst
But the framework that you laid out for us in the second half of last year implies something like a 60% growth CAGR in your SAM opportunity. Is that the right way to think about it as it relates to the profile of growth in your business this year and next year?
但是,您為我們制定的框架是否像去年下半年一樣,意味著您的 SAM 機會的複合年增長率為 60%。就您今年和明年的業務成長而言,這是正確的思考方式嗎?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
Yes.
是的。
Operator
Operator
Ben Reitzes, Melius Research.
Ben Reitzes,Melius Research。
Ben Reitzes - Analyst
Ben Reitzes - Analyst
Hock, networking -- AI networking was really strong in the quarter. And it seemed like it must have beat expectations. I was wondering if you could just talk about the networking in particular, what caused that? And how much of that is your acceleration into next year? And when do you think you see Tomahawk kicking in as part of that acceleration?
霍克,網路—本季人工智慧網路表現非常強勁。而且看起來它一定超出了預期。我想知道您是否可以具體談談網路問題,是什麼導致了這個問題?其中有多少是您在明年加速的?您認為何時會看到戰斧作為加速的一部分發揮作用?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
Well, I think AI networking, as you probably know, goes pretty hand-in-hand with the deployment of AI accelerator clusters. It doesn't deploy on a timetable that is very different from the way the accelerators get deployed, whether they are XPUs or GPUs.
嗯,我認為 AI 網路——您可能知道——與 AI 加速器叢集的部署是密切相伴的。它的部署時間表與加速器(無論是 XPU 還是 GPU)的部署方式並沒有太大不同。
And they deploy a lot in scale-out, where Ethernet, of course, is the protocol of choice. But it's also increasingly moving into the space of what we all call scale-up within those data centers, where you have a much higher consumption or density of switches -- more than we originally thought -- than you have in the scale-out scenario.
他們在橫向擴展方面部署了很多,當然,以太網是協議的選擇,但它也越來越多地進入我們在數據中心內稱之為縱向擴展的領域,在這些領域中,交換機的消耗或密度比橫向擴展場景中的要高得多——比我們最初想像的要高得多。
In fact, the density of switches in scale-up is 5 to 10 times more than in scale-out. And that's the part that pleasantly surprised us. Which is why this past quarter, Q2, the AI networking portion continued at about 40%, the same as when we reported Q1 a quarter ago. At that time, I said I expected it to drop. It hasn't.
事實上,縱向擴展中交換器的密度是橫向擴展的 5 到 10 倍。這正是讓我們感到驚喜的部分。這也是為什麼剛過去的第二季,AI 網路部分仍維持在 40% 左右,與一個季度前我們公布第一季時相同。當時我說我預期這個比例會下降,但它並沒有。
Ben Reitzes - Analyst
Ben Reitzes - Analyst
And your thoughts on Tomahawk driving acceleration for next year and when it kicks in?
您對 Tomahawk 明年的駕駛加速以及何時開始加速有何看法?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
Tomahawk, oh, yes, there's extremely strong interest now. We're not shipping big orders -- or any orders other than basic proofs of concept -- out to customers, but there is tremendous demand for these new 102.4 terabit per second Tomahawk switches.
戰斧 (Tomahawk),哦,是的,現在人們對它的興趣非常濃厚。除了基本的概念驗證之外,我們還沒有向客戶出貨大訂單或任何訂單,但市場對這款新的每秒 102.4 太比特 Tomahawk 交換器的需求非常龐大。
Operator
Operator
Blayne Curtis, Jefferies.
布萊恩‧柯蒂斯,傑富瑞集團。
Blayne Curtis - Analyst
Blayne Curtis - Analyst
Great results. I just want to ask, maybe following up on the scale-up opportunities. So today, I guess your main customer is not really using an NVLink switch-style scale-up. I'm just kind of curious about your visibility, or the timing, in terms of when you might be shipping a switched Ethernet scale-up network to your customers?
業績非常好。我只是想跟進一下縱向擴展的機會。我想,目前你們的主要客戶並沒有真正使用 NVLink 交換器式的縱向擴展。我只是有點好奇,你們何時可能向客戶出貨交換式乙太網路縱向擴展網路,在能見度或時間點上是怎麼看的?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
You're talking scale-up?
你是說縱向擴展 (scale-up) 嗎?
Blayne Curtis - Analyst
Blayne Curtis - Analyst
Scale up?
擴大規模?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
Well, scale-up is very rapidly converting to Ethernet now. Very much so, for our fairly narrow band of hyperscale customers, scale up is very much Ethernet.
嗯,現在規模正在非常迅速地轉換為乙太網路。確實如此,對於我們相當狹窄的超大規模客戶群來說,擴大規模就非常依賴乙太網路。
Operator
Operator
Stacy Rasgon, Bernstein.
史泰西‧拉斯貢,伯恩斯坦。
Stacy Rasgon - Analyst
Stacy Rasgon - Analyst
I still wanted to follow up on that AI 2026 question. I wanted to just put some numbers on it, just to make sure I've got it right. So if you did 60% in the first three quarters of this year, if you grow 60% year-over-year in Q4, it puts you at like, I don't know, $5.8 billion, something like $19 billion or $20 billion for the year.
我仍然想跟進 AI 2026 的問題。我只是想在上面放一些數字,以確保我做對了。因此,如果今年前三個季度的銷售額達到 60%,如果第四季的銷售額年增 60%,那麼全年銷售額將達到,我不知道,58 億美元,還是 190 億美元或 200 億美元。
And then are you saying you're going to grow 60% in 2026, which would put you at $30 billion plus in AI revenues for 2026? So I just want to ask -- is that the math that you're trying to communicate to us directly?
然後你們是說你們將在 2026 年實現 60% 的成長,這會讓你們在 2026 年的人工智慧收入達到 300 億美元以上。所以我只是想——這是您想直接與我們交流的數學知識嗎?
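For reference, a minimal Python sketch reproducing the analyst's back-of-the-envelope math, using only the rough figures stated in the question (dollar amounts in billions):

```python
# The analyst's back-of-the-envelope trajectory (dollars in billions).
fy25_ai = 19.5            # "something like $19 billion or $20 billion" for fiscal 2025
growth = 0.60             # the ~60% year-over-year rate discussed on the call

fy26_ai = fy25_ai * (1 + growth)
print(f"FY25 ≈ ${fy25_ai:.1f}B -> FY26 at 60% growth ≈ ${fy26_ai:.1f}B")
# -> roughly $31B, i.e. the "$30 billion plus" the analyst is testing
```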
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
I think you're doing the math. I'm giving you the trend. But I did answer that question -- I think Harlan asked it earlier. The rate we are seeing so far in fiscal '25 will presumably continue, and we don't see any reason why it wouldn't, given the lead-time visibility we have in '25. What we're seeing today, based on the visibility we have on '26, is that we are able to ramp this AI revenue along the same trajectory.
我認為你正在做計算,而我給你的是趨勢。但我確實回答過這個問題——我想 Harlan 稍早問過。到目前為止,我們在 25 財年看到的成長速度預計會持續下去;考慮到我們在 25 年的交期能見度,我們看不出有什麼理由不會如此。根據我們對 26 年的能見度,我們今天看到的是,AI 收入能夠沿著同樣的軌跡成長。
Stacy Rasgon - Analyst
Stacy Rasgon - Analyst
So is the SAM going up -- is the SAM going up as well, because now you have inference on top of training? So is the SAM still $60 billion to $90 billion? Or is the SAM higher now as you see it?
那麼 SAM 是否也在上調——因為現在除了訓練之外還有推理?SAM 仍然是 600 億到 900 億美元嗎?還是在您看來 SAM 現在更高了?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
I'm not playing the SAM game here. I'm just giving a trajectory towards where we drew the line on '27 before. So I have no response on whether the SAM is going up or not. Let's stop talking about SAM now. Thanks.
我在這裡不玩 SAM 遊戲。我只是給了我們之前在 27 年劃定的一條軌跡。所以我無法回答它是否會上漲。現在不要再談論 SAM 了。謝謝。
Operator
Operator
Vivek Arya, BofA Global Research.
美國銀行全球研究部的 Vivek Arya 說。
Vivek Arya - Analyst
Vivek Arya - Analyst
I had a near-term and then a longer-term question on the XPU business. For the near term, if networking was the upside in Q2 and overall AI was in line, it means XPU was perhaps not as strong. I realize it's lumpy, but is there anything more to read into that -- any product transition or anything else? So just a clarification there. And then longer term, you have outlined a number of additional customers that you're working with. What milestones should we look forward to?
我對 XPU 業務有一個近期和一個長期的問題。就近期而言,如果第二季的上行來自網路,而整體 AI 符合預期,那就表示 XPU 可能沒有那麼強勁。我知道業績會有起伏,但這當中是否還有其他值得解讀的地方,例如產品轉換或其他因素?這只是想釐清一下。從長期來看,您已經提到了一些正在合作的其他客戶,我們應該期待哪些里程碑?
And what milestones are you watching to give you the confidence that you can now start adding that addressable opportunity into your '27 or '28 or other numbers? Like how do we get the confidence that these projects are going to turn into revenue in some reasonable time frame from now.
您正在關注哪些里程碑,以使您有信心現在可以開始將這個可尋址的機會添加到您的 2027 年、2028 年或其他數字中?例如,我們如何確信這些項目將在從現在起的某個合理的時間範圍內轉化為收入。
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
Okay. On the first part that you're asking, it's like you're trying to count how many angels are on the head of a pin here -- I mean, whether it's XPU or networking. Networking is hot, but that doesn't mean XPU is seeing any softness. It's very much along the trajectory we expect it to be. And so there's no lumpiness, there's no softening. It's pretty much the trajectory we expected so far, and into next quarter as well, and probably beyond.
好的。關於您問的第一部分,這就像在數針尖上有多少天使——我的意思是,不管是 XPU 還是網路。網路很熱門,但這並不意味著 XPU 出現任何疲軟。一切都非常符合我們預期的軌跡。因此沒有起伏,也沒有走弱。到目前為止,以及下個季度甚至更久之後,大致都會是我們所預期的軌跡。
So we have, in our view, fairly clear visibility on the short-term trajectory. In terms of going on to '27, no, we are not updating any numbers here. Six months ago, we drew a sense of the size of the SAM based on clusters of 1 million XPUs each for three customers. That still stands, and we have not provided any further updates here.
因此,在我們看來,我們對短期軌跡有相當清晰的能見度。至於 27 年,不,我們不會在這裡更新任何數字。六個月前,我們根據三家客戶各自 100 萬個 XPU 叢集的規模,勾勒出了 SAM 的大小。這一點目前仍然成立,我們在這裡沒有提供任何進一步的更新。
Nor are we intending to at this point. When we get better visibility and a clearer sense of where we are -- and that probably won't happen until '26 -- we'll be happy to give the audience an update. But right now, in today's prepared remarks and in answering a couple of questions, we are intending to give you guys more visibility into the growth trajectory we are seeing for '26.
目前我們也不打算這麼做。當我們獲得更好的能見度、更清楚地了解我們所處的位置時——而這可能要到 26 年才會發生——我們會很樂意向大家提供最新情況。但現在,在今天準備好的發言和回答幾個問題時,我們想讓你們更清楚地了解我們所看到的 26 年成長軌跡。
Operator
Operator
CJ Muse, Cantor Fitzgerald.
CJ Muse、康托·菲茨傑拉德。
CJ Muse - Analyst
CJ Muse - Analyst
I was hoping to follow up on Ross's question regarding the inference opportunity. Can you discuss the workloads you're seeing that are optimal for custom silicon? And then over time, what percentage of your XPU business could be inference versus training?
我希望跟進羅斯關於推理機會的問題。您能否討論一下您所看到的客製化矽片的最佳工作負載?然後隨著時間的推移,您的 XPU 業務中有多少比例可以進行推理而不是訓練?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
I think there's no differentiation between training and inference in using merchant accelerators versus custom accelerators. I think the whole premise behind going towards custom accelerators continues, which is that it's not a matter of cost alone. It is that custom accelerators get used and get developed on a roadmap with any particular hyperscaler.
我認為在使用通用加速器或客製化加速器這件事上,訓練和推理之間沒有區別。我認為走向客製化加速器背後的整個前提依然成立,那就是這不僅僅是成本問題。重點在於,客製化加速器會隨著任何特定超大規模業者的路線圖而被使用並持續開發。
That's a learning curve -- learning how they can optimize the way the algorithms in their large language models get written and tied to silicon. And that ability is a huge value-add in creating algorithms that can drive their LLMs to higher and higher performance, much more than a basic segregation approach between hardware and software. It's that you literally combine hardware and software end-to-end as they take that journey.
這是一個學習曲線,學習如何優化他們的大型語言模型的演算法的編寫方式以及與矽片的綁定方式。這種能力在創建可以推動 LLM 性能越來越高的演算法方面具有巨大的附加價值,遠遠超過硬體和軟體之間的隔離方法。就是在旅程中將端到端的硬體軟體真正地結合在一起。
And it's a journey. They don't learn that in one year. Over a few cycles they get better and better, and that underlines the fundamental value of creating your own hardware versus using third-party merchant silicon: you are able to optimize your software to the hardware and eventually achieve much higher performance than you otherwise could. And we see that happening.
這是一趟旅程,他們不可能在一年內學會。經過幾個週期,他們會做得越來越好,而這正凸顯出自行打造硬體相對於使用第三方通用晶片的根本價值:你可以針對硬體優化軟體,最終達到遠高於其他方式的效能。我們已經看到這種情況正在發生。
Operator
Operator
Karl Ackerman, BNP Paribas.
法國巴黎銀行的卡爾·阿克曼。
Karl Ackerman - Analyst
Karl Ackerman - Analyst
Hock, you spoke about the much higher content opportunity in scale-up networking. I was hoping you could discuss how important the adoption of co-packaged optics is in achieving this five times or 10 times higher content for scale-up networks. Or should we anticipate much of the scale-up opportunity will be driven by Tomahawk and Thornix? Thank you.
霍克,您談到了擴大網路規模所帶來的更高內容機會。我希望您能討論一下,對於頭皮網路實現 5 倍或 10 倍更高的內容,共封裝光學元件的需求採用有多重要。或者我們應該預期大部分擴大規模的機會將由 Tomahawk 和 Thornix 推動?謝謝。
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
I'm trying to decipher this question. So let me try to answer it in the way I think you want me to clarify. First and foremost, I think most of the scaling up that's going on, as I call it -- which means a lot of XPU or GPU-to-GPU interconnects -- is done on copper interconnects. And that's because the size of this scale-up cluster is still not that huge, and they can get away with using copper interconnects.
我正在嘗試解釋這些問題。因此,請允許我嘗試以我認為您希望我澄清的方式來回答。首先,我認為大部分的擴展都是我所說的擴展,這意味著許多 XPU 或 GPU 到 GPU 的互連都是在銅互連上完成的。而且由於這種規模的集群的尺寸仍然不是那麼大,他們可以使用銅——使用銅互連。
And they're still doing it -- mostly, that is what they are doing today. At some point soon, I believe, when you start trying to go beyond maybe 72 GPU-to-GPU interconnects, you may have to push towards a different medium, from copper to optical. And when we do that, yes, perhaps then things like co-packaging of silicon with optics might become relevant.
他們今天大多仍是這麼做的。我相信,在不久的將來,當你開始嘗試超過大約 72 個 GPU 對 GPU 的互連時,你可能必須轉向不同的介質,從銅線轉向光學。當我們這麼做時,是的,也許像矽與光學共同封裝這類技術就會變得重要。
But truly, what we really are talking about is that at some stage, as the clusters get larger, which means scale-up becomes much bigger, you need to interconnect GPUs and XPUs to each other in scale-up at many more than just 72 or 100, maybe even 128 -- and it keeps going up. You want to use optical interconnects simply because of distance. And that's when optical will start replacing copper.
但實際上,我們真正談論的是,在某個階段,隨著叢集變得越來越大,這意味著規模擴大會變得更大,並且您需要將 GPU 和 XPU 相互互連,規模擴大到遠不止 72 或 100,甚至可能達到 128,並且越來越多。您想使用光纖互連只是因為距離。那時光纖將開始取代銅線。
And when that happens, the question is what's the best way to deliver on optical. One way is co-packaged optics, but it's not the only way. You could simply continue to use, perhaps, pluggable low-cost optics, in which case you can then interconnect across the bandwidth -- the radix -- of a switch, and our switch does 512 connections.
當這種情況發生時,問題就是以光學傳輸的最佳方式是什麼。其中一種方法是共封裝光學,但這不是唯一的方法。你也可以繼續使用可插拔的低成本光學元件,這樣就可以在交換器的頻寬——也就是基數 (radix)——範圍內互連,而我們的交換器可提供 512 個連接。
So you can now connect all these XPUs and GPUs -- 512 of them -- for the scale-up phenomenon. And that is huge, but that's when you go to optical. That's going to happen, in my view, within a year or two, and we'll be right at the forefront of it. And it may be co-packaged optics, which we very much have in development, but it's not locked in to co-packaging -- it could just be, as a first step, pluggable optics.
因此,您現在可以連接所有這些 XPU、GPU 512 以實現擴大規模現象。這是巨大的,但那是當你使用光學的時候。我認為,一兩年之內這就會實現,而我們會處於最前線。它可能是共同封裝光學元件,我們正在大力開發它,但它是一種鎖定共同封裝,或者它可能只是作為第一步可插拔光學元件。
Whatever it is, I think the bigger question is when it goes to optical -- from copper connecting GPU to GPU, to optical connecting them. And the step up in that move will be huge. And it's not necessarily co-packaged optics, though that's definitely one path we are pursuing.
無論如何,我認為更大的問題是何時從銅線連接 GPU 與 GPU 轉向光學連接。這一轉變帶來的提升將會非常巨大。而且不一定是共封裝光學,不過那確實是我們正在追求的路徑之一。
Operator
Operator
Joshua Buchalter, TD Cowen.
約書亞·布查爾特(Joshua Buchalter),考恩公司 (TD Cowen) 的董事。
Joshua Buchalter - Analyst
Joshua Buchalter - Analyst
I realize it's a bit nitpicky, but I wanted to ask about gross margins in the guide. Revenue implies sort of an $800 million to $1 billion incremental increase, with gross profit up, I think, $400 million to $450 million, which is pretty well below the corporate average fall-through. I appreciate that semis are dilutive, and custom is probably dilutive within semis, but is anything else going on with margins that we should be aware of?
我知道這有點吹毛求疵,但我想問一下指引中的毛利率。營收意味著約 8 億至 10 億美元的增量,而毛利我認為只增加 4 億至 4.5 億美元,遠低於公司平均的增量貢獻率。我理解半導體會稀釋毛利率,而客製化業務在半導體內部可能也有稀釋效果,但在利潤率方面還有什麼我們應該注意的嗎?
And how should we think about the margin profile of custom longer term as that business continues to scale and diversify.
隨著客製化業務的不斷擴大和多樣化,我們應該如何看待客製化業務的長期利潤狀況?
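As a rough cross-check of the fall-through arithmetic in the question above, here is a minimal Python sketch using only figures given elsewhere on the call: Q2 revenue of $15.0 billion at a 79.4% gross margin, and the Q3 guide of roughly $15.8 billion with gross margin down about 130 basis points sequentially.

```python
# Implied sequential fall-through from the Q3 guide (dollars in billions).
q2_rev, q2_gm = 15.0, 0.794              # Q2 actuals from the call
q3_rev, q3_gm = 15.8, 0.794 - 0.013      # Q3 guide, gross margin down ~130 bps

incr_rev = q3_rev - q2_rev                 # ≈ $0.8B incremental revenue
incr_gp = q3_rev * q3_gm - q2_rev * q2_gm  # ≈ $0.43B incremental gross profit

print(f"incremental revenue ≈ ${incr_rev:.1f}B, "
      f"incremental gross profit ≈ ${incr_gp:.2f}B, "
      f"fall-through ≈ {incr_gp / incr_rev:.0%}")
# -> ≈ 54%, well below the ~79% consolidated gross margin reported for Q2
```

Under those guided figures, the incremental gross profit of roughly $0.4 billion on roughly $0.8 billion of incremental revenue is consistent with the analyst's point.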
Kirsten Spears - Chief Financial Officer, Chief Accounting Officer
Kirsten Spears - Chief Financial Officer, Chief Accounting Officer
Yes. We've historically said that the XPU margins are slightly lower than the rest of the business other than wireless. And so there's really nothing else going on other than that. It's exactly what I said: the majority of the 130-basis-point quarter-over-quarter decline is being driven by more XPUs.
是的。我們過去曾說過,XPU 的利潤率略低於無線以外的其他業務。除此之外,實際上沒有其他因素。正如我所說,環比下降 130 個基點主要是由 XPU 占比提高所驅動的。
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
There are more moving parts here than your simple analysis proposes. And I think that simple analysis is totally wrong in that regard.
這裡的變數比你的簡單分析所假設的要多。我認為那樣的簡單分析在這方面是完全錯誤的。
Operator
Operator
Timothy Arcuri, UBS.
瑞銀的提摩西·阿庫裡。
Timothy Arcuri - Analyst
Timothy Arcuri - Analyst
I also wanted to ask about scale up, Hock. So there's a lot of competing ecosystems that's UA-Link which, of course, you left. And now there's the big GPU company opening up NVLink. And they're both trying to build ecosystems and there's an argument that you're an ecosystem of one. What would you say to that debate?
我還想問一下有關擴大規模的問題,霍克。因此,存在許多競爭生態系統,當然,UA-Link 就是其中之一。現在,大型 GPU 公司正在開放 NVLink。他們都在嘗試建立生態系統,並且有一種觀點認為,你是一個生態系統。您對這場辯論有何看法?
Does opening up NVLink change the landscape, and how do you view your AI networking growth next year? Do you think it's going to be primarily driven by scale-up? Or would it still be pretty scale-out heavy?
開放 NVLink 是否會改變格局?您如何看待明年的 AI 網路和成長?您認為這主要將由規模擴大所驅動嗎?還是仍然會相當沉重?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
People don't like to create new platforms and new protocols and systems. The fact of the matter is scale-up can just be done easily, and it's currently available. It's open standards, open source -- Ethernet. You don't need to create new systems for the sake of doing something that you could easily be doing with Ethernet networking. And so, yes, I hear a lot about these interesting new protocol standards that people are trying to create.
人們並不喜歡創建新的平台、新的協議和系統。事實上,縱向擴展用現有技術就能輕鬆實現,而且目前就已可用。它是開放標準、開源的——也就是乙太網路。你不需要為了做一些用乙太網路就能輕鬆完成的事情而創建新的系統。所以,是的,我聽說了很多人們正在嘗試創建的有趣的新協議標準。
And most of them, by the way, are proprietary, much as they like to call them otherwise. The one that is really open source and open standards is Ethernet. And we believe Ethernet will prevail, as it has for the last 20 years in traditional networking. There's no reason to create a new standard for something that could be easily done in transferring bits and bytes of data.
順便說一句,其中大多數都是專有的。儘管他們喜歡這樣稱呼它。一個真正開源和開放的標準是乙太網路。我們相信乙太網路將像過去 20 年在傳統網路中一樣佔據主導地位。沒有理由為那些可以輕鬆傳輸位元和位元組的資料的事情創建新的標準。
Operator
Operator
Christopher Rolland, Susquehanna.
克里斯多福羅蘭 (Christopher Rolland),薩斯奎漢納。
Christopher Rolland - Analyst
Christopher Rolland - Analyst
Thanks for the question Yes. My question is for you, Hock. It's a kind of a bigger picture one here. And this kind of acceleration that we're seeing in AI demand. Do you think that this acceleration is because of a marked improvement in ASICs or XPUs closing the gap on the software side at your customers?
謝謝你的提問,是的。我的問題是問你的,霍克。這是一個更大的圖景。我們看到人工智慧需求正在加速成長。您是否認為這種加速是因為 ASIC 或 XPU 的顯著改善縮小了客戶在軟體方面的差距?
Do you think it's the tokenomics around inference and test-time compute driving that? For example, what do you think is actually driving the upside here? And do you think it leads to a market-share shift toward XPU from GPU faster than we were expecting?
您是否認為這些需要圍繞推理測試時間計算驅動的代幣經濟學?例如,您認為推動這項上漲的真正因素是什麼?您是否認為這會導致市場佔有率從 GPU 轉向 XPU 的轉變速度比我們預期的更快?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
Yes. Interesting question -- and it's about none of the foregoing that you outlined. Very simply, why inference has come out very, very hot lately is, remember, we're only selling to a few customers -- hyperscalers with platforms and LLMs. There are not that many. And we told you how many we have, and we haven't increased any. But what is happening is these hyperscalers and those with LLMs need to justify all the spending they're doing. Doing training makes your frontier models smarter.
是的。這是一個有趣的問題,而且與您剛才概述的那些都無關。很簡單,為什麼推理最近變得非常非常熱門?請記住,我們只向少數客戶銷售,也就是擁有平台和 LLM 的超大規模業者。數量並不多。我們告訴過你們我們有多少家,而且沒有增加。但實際情況是,這些超大規模業者和擁有 LLM 的業者需要為他們所做的所有支出提供合理性。進行訓練可以使前沿模型更聰明。
That's no question. It's almost like science research -- you make your frontier models smarter by creating very clever algorithms that consume a lot of compute for training. But you want to monetize through inference, and that's what's driving it. As I indicated in my prepared remarks, it's the drive to justify a return on investment, and a lot of that investment is training.
這是毫無疑問的。這幾乎就像科學研究——你透過創建非常聰明、需要消耗大量訓練算力的演算法,讓前沿模型變得更聰明。但你需要透過推理來變現,而這正是推動因素。正如我在準備好的發言中指出的,這是證明投資回報的動力,而其中大量投資就是訓練。
And that return on investment comes from creating use cases -- a lot of AI use cases, a lot of AI consumption out there -- through the availability of a lot of inference. And that's what we are now starting to see among this small group of customers.
而這種投資回報來自於創造使用案例——大量的 AI 使用案例、大量的 AI 消費——透過提供大量的推理能力來實現。這正是我們現在開始在這一小群客戶中看到的情況。
Operator
Operator
Vijay Rakesh, Mizuho.
瑞穗的 Vijay Rakesh。
Vijay Rakesh - Analyst
Vijay Rakesh - Analyst
Just going back to the AI revenue side. I know you said fiscal '25 is kind of tracking to that up-60% growth. As you look at fiscal '26, you have many new customers ramping -- a Meta, and probably four of the six hyperscalers that you've talked about in the past. Would you expect that growth to accelerate in fiscal '26 beyond the kind of 60% you talked about?
回到 AI 收入方面。我知道您說過 25 財年的成長大致朝著 60% 的水準推進。展望 26 財年,你們有許多新客戶正在放量——例如 Meta,而且可能已經涵蓋你們過去談到的六家超大規模業者中的四家。您是否預期 26 財年的成長會超過您所說的 60% 左右的水準?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
What I clarified in my prepared remarks -- that the rate of growth we are seeing in '25 will sustain, based on improved visibility and the fact that we're seeing inference coming in on top of the demand for training as the clusters being built up get bigger -- still stands. I don't think we'll get very far by trying to parse my words here. And we see that carrying from '25 into '26 as the best forecast we have at this point.
我在準備好的發言中所說明的仍然成立:基於能見度的提升,以及隨著叢集規模不斷擴大、我們看到推理需求疊加在訓練需求之上這一事實,我們在 25 年看到的成長速度將會延續。我認為逐字推敲我的用詞不會有太大幫助。從 25 年延續到 26 年,這是我們目前所能做出的最佳預測。
Vijay Rakesh - Analyst
Vijay Rakesh - Analyst
Got it. And on NVLink Fusion versus scale-up: do you expect that market to go the route of top-of-rack, where you've seen some move to the Ethernet side in scale-out? Do you expect scale-up to go the same route?
知道了。以及 NVLink 融合與擴大規模。您是否預期市場會走這樣的路線—在機架頂部,您已經看到一些向乙太網路端的轉變,即向外擴展?您是否希望擴大規模並走同樣的路線?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
Broadcom does not participate in NVLink. So I'm really not qualified to answer that question, I think.
博通不參與 NVLink。所以我認為我真的沒有資格回答這個問題。
Operator
Operator
Aaron Rakers, Wells Fargo.
富國銀行的 Aaron Rakers。
Aaron Rakers - Analyst
Aaron Rakers - Analyst
I think all my questions on scale-up have been asked. But I guess, Hock, given the execution that you guys have been able to do with the VMware integration, and looking at the balance sheet and the debt structure, I'm curious if you could give us your thoughts on how the company thinks about capital return versus M&A and the strategy going forward?
我認為所有關於擴大規模的問題都已經被問過了。但是我想,霍克,考慮到你們已經能夠透過 VMware 整合來執行,請查看資產負債表和債務結構。我很好奇,您能否告訴我們您對公司如何看待資本回報以及對併購和未來策略的看法?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
Okay. That's an interesting question. And I grant it's not an untimely one, I would say, because, yes, we have done a lot of the integration of VMware now. And you can see that in the level of free cash flow we're generating from operations. And as we said, with the use of capital we have always been, I guess, measured and upfront, with a return through dividends, which is half our free cash flow of the preceding year.
好的。這是一個有趣的問題。我想說,我承認這並不是太不合時宜,因為是的,我們現在已經完成了許多 VMware 的整合。您可以從我們營運產生的自由現金流水準中看到這一點。正如我們所說,資本的使用一直是——我想,我們非常——經過衡量並透過股息預先獲得回報,這是我們前一年自由現金流的一半。
And frankly, as Kirsten mentioned three months ago and six months ago on our last earnings calls, the first choice for the other part of the free cash flow is typically to bring down our debt to a level that we feel is closer to no more than a two-times ratio of debt to EBITDA. That doesn't mean that, opportunistically, we may not go out there and buy back our shares, as we did last quarter and as indicated by Kirsten when we did $4.2 billion of stock buyback.
坦白說,正如 Kirsten 在三個月前和六個月前的財報電話會議上提到的,自由現金流另一部分的首選用途,通常是將債務降低到我們認為更接近債務對 EBITDA 不超過兩倍的水準。這並不代表我們不會伺機回購股票,就像上個季度那樣——正如 Kirsten 所說,我們當時回購了 42 億美元的股票。
Now, part of that is used basically when employee RSUs vest -- we buy back part of the shares that would otherwise be sold to pay taxes on vested RSUs. But the other part of it we used opportunistically last quarter: when we see a situation where we think it's a good time to buy some shares back, we do. But having said all that, our use of cash outside of dividends would, at this stage, go towards reducing our debt.
現在,其中一部分基本上用於在員工 RSU 歸屬時使用——我們基本上回購部分股份,用於支付已歸屬 RSU 的稅款。但另一方面,我會這樣做——上個季度,當我們看到某種情況時,我們會抓住機會,認為是回購一些股票的好時機,我們就會這麼做。但話雖如此,我們現階段除股息之外的現金用途將用於減少債務。
And I know you're going to ask, what about M&A? Well, the kind of M&A we would do, in our view, would be significant -- substantial enough that we would need debt in any case. So the good use of our free cash flow is to bring down debt and, in a way, expand, if not just preserve, our borrowing capacity if we have to do another M&A deal.
我知道你會問併購怎麼樣?嗯,我們認為,我們將進行的併購將是重大的,規模足夠大,以至於無論如何我們都需要債務。並且充分利用我們的自由現金流來降低債務,從而在某種程度上擴大或保留我們的借貸能力,如果我們必須進行另一筆併購交易的話。
Operator
Operator
Srini Pajjuri, Raymond James.
Srini Pajjuri,雷蒙德‧詹姆斯。
Srini Pajjuri - Analyst
Srini Pajjuri - Analyst
A couple of clarifications. First, on your 2026 expectation. Are you assuming any meaningful contribution from the four prospects that you talked about?
幾點澄清。首先,關於您對 2026 年的預期。您是否認為您談到的四個前景會做出任何有意義的貢獻?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
No comment. We don't talk about prospects. We only talk on customers.
沒有意見。我們不談論前景。我們只談論客戶。
Srini Pajjuri - Analyst
Srini Pajjuri - Analyst
Okay. Fair enough. And then my other clarification is that, I think you talked about networking being about 40% of the mix within AI, is that the right kind of mix that you expect going forward? Or is that going to materially change as we, I guess, see XPUs ramping going forward?
好的。很公平。然後我的另一個澄清是,我認為您談到網路在人工智慧中佔 40% 左右,這是您預期的未來正確組合嗎?或者,隨著我們看到 XPU 的不斷增長,這種情況是否會發生實質的變化?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
No. I've always said, and I expect that to be the case going forward in '26 as we grow, that networking as a ratio should be closer to the range of less than 30%, not 40%.
不。我一直在說,我預計隨著我們的發展,在 26 年內這種情況也會如此,網路與 XPU 的比例應該接近 30% 以下,而不是 40%。
Operator
Operator
Joe Moore, Morgan Stanley.
摩根士丹利的喬摩爾。
Joe Moore - Analyst
Joe Moore - Analyst
Great. You've said you're not going to be impacted by export controls on AI. I know there have been a number of changes in the industry since the last time you made that call. Is that still the case? And can you give people comfort that there's no impact from that down the road?
偉大的。您曾說過,人工智慧出口管制不會對您造成影響。我知道自從您上次打電話以來,這個行業已經發生了許多變化。現在還是這樣嗎?您能否讓人們放心,這不會對日後造成影響?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
Nobody can give anybody comfort in this environment, Joe. Rules are changing quite dramatically as bilateral trade agreements continue to be negotiated in a very, very dynamic environment. So I'll be honest, I don't know -- you probably know more than I do. I know very little about whether there will be any export controls or how any export controls would take place; we're guessing. So I'd rather not answer that, because no, I don't know.
喬,在這種環境下,誰也安慰不了誰。隨著雙邊貿易協定的談判持續在一個非常動態的環境中進行,規則也正在發生巨大的變化。所以說實話,我不知道——我知道的可能很少——你可能比我知道的更多,也許在這種情況下,我對整個事情知之甚少,關於是否有出口管制,出口管制將如何進行,我們都只是猜測。所以我寧願不回答這個問題,因為我不知道是否會這樣做。
Operator
Operator
William Stein, Truist Securities.
威廉·斯坦(William Stein),Truist Securities。
William Stein - Analyst
William Stein - Analyst
I wanted to ask about VMware. Can you comment on how far along you are in the process of converting customers to the subscription model? Is that close to complete? Or are there still a number of quarters over which we should expect that conversion to continue?
我想問一下有關 VMware 的問題。您能否評論一下在將客戶轉變為訂閱模式的過程中進展到了哪一步?這接近完成了嗎?或者我們是否應該預期這種轉變還會持續數季?
Hock Tan - President, Chief Executive Officer, Director
Hock Tan - President, Chief Executive Officer, Director
That's a good question. A good way to measure it is this: most of our VMware contracts are typically about three years. That was what VMware did before we acquired them, and that's pretty much what we continue to do -- it's very traditional. So on that basis, the renewals are maybe two-thirds of the way -- at least more than halfway -- through. So we probably have at least another year, maybe a year and a half, to go.
這是個好問題。因此,一個很好的衡量方法是——我們的大多數 VMware 合約通常為三年。這是我們收購 VMware 之前所做的事情,現在我們也繼續在做這件事。這是非常傳統的。因此基於此,續約工作已經完成了 2/3,幾乎完成了一半,甚至超過了一半。所以我們可能至少還要再等一年半的時間。
Operator
Operator
Thank you. And with that, I'd like to turn the call over to Ji Yoo for closing remarks.
謝謝。最後,我想請 Ji Yoo 做最後的總結發言。
Ji Yoo - Head of Investor Relations
Ji Yoo - Head of Investor Relations
Thank you, operator. Broadcom currently plans to report its earnings for the third quarter of fiscal year 2025 after close of market on Thursday, September 4, 2025. A public webcast of Broadcom's earnings conference call will follow at 2:00 p.m. Pacific. That will conclude our earnings call today. Thank you all for joining. Operator, you may end the call.
謝謝您,接線生。博通目前計劃在 2025 年 9 月 4 日星期四收盤後公佈其 2025 財年第三季的收益。博通收益電話會議將於太平洋時間下午 2:00 進行公開網路直播。我們今天的收益電話會議就到此結束了。感謝大家的加入。接線員,您可以結束通話了。