High-speed connectivity company Credo reported record fiscal 2023 revenue of $184.2 million, up 73% year over year. Growth was driven by product revenue, which grew 87%. The company expects sequential growth in the first quarter and continued sequential quarterly revenue growth throughout fiscal 2024, with significant growth expectations within the accelerating market opportunity for high-speed connectivity solutions, particularly in generative AI applications.
Credo's AEC (active electrical cable) solutions are tailored to customers' specific requirements. A large number of customers are moving quickly to 100 gigabits per lane, and the company has shipped more than 20 different versions of AECs for qualification or production.
Operator
Good day, and thank you for standing by. Welcome to the Credo Q4 Fiscal Year 2023 Earnings Conference Call. (Operator Instructions) Please be advised that today's conference is being recorded.
I would now like to go ahead and turn the call over to Dan O'Neil. Please go ahead.
Daniel J. O'Neil - VP of Corporate Development & IR
Good afternoon, and thank you all for joining us today for our fiscal 2023 fourth quarter and year-end earnings call. Joining me today from Credo are Bill Brennan, our Chief Executive Officer; and Dan Fleming, our Chief Financial Officer.
I'd like to remind everyone that certain comments made in this call today may include forward-looking statements regarding expected future financial results, strategies and plans, future operations, the markets in which we operate and other areas of discussion. These forward-looking statements are subject to risks and uncertainties that are discussed in detail in our documents filed with the SEC.
It's not possible for the company's management to predict all risks nor can the company assess the impact of all factors on its business or the extent to which any factor or combination of factors may cause actual results to differ materially from those contained in any forward-looking statements.
Given these risks, uncertainties and assumptions, the forward-looking events discussed during this call may not occur, and actual results could differ materially and adversely from those anticipated or implied. The company undertakes no obligation to publicly update forward-looking statements for any reason after the date of this call to conform these statements to actual results or to changes in the company's expectations except as required by law.
Also during this call, we will refer to certain non-GAAP financial measures, which we consider to be important measures of the company's performance. These non-GAAP financial measures are provided in addition to and not as a substitute for or superior to financial performance prepared in accordance with U.S. GAAP. A discussion of why we use non-GAAP financial measures and reconciliations between our GAAP and non-GAAP financial measures is available in the earnings release we issued today, which can be accessed using the Investor Relations portion of our website.
With that, I'll now turn the call over to our CEO. Bill?
William J. Brennan - President, CEO & Director
Thanks, Dan, and good afternoon, everyone. Thank you for joining our Q4 fiscal '23 earnings call. I'll begin by providing an overview of our fiscal year '23 and fiscal Q4 results. I will then highlight what we see going forward in fiscal '24. Dan Fleming, our CFO, will follow my remarks with a detailed discussion of our Q4 and fiscal year '23 financial results and share our outlook for the first quarter.
Credo is a high-speed connectivity company, delivering integrated circuits, system-level solutions and IP licenses to the hyperscale data center ecosystem along with a range of other data centers and service providers. All our solutions leverage our core SerDes technology and our unique customer-focused design approach, enabling Credo to deliver optimized, secure, high-speed solutions with significantly better power efficiency and cost.
Our electrical and optical connectivity solutions delivered leading performance with port speeds ranging from 50 gig up to 1.6 terabits per second. While we primarily serve the Ethernet market today, we continue to extend into other standards-based markets as the need for higher speed with more power-efficient connectivity increases exponentially.
Credo continues to have significant growth expectations within the accelerating market opportunity for high-speed connectivity solutions. In fact, the onset of generative AI applications is already accelerating the need for higher speed and more energy-efficient connectivity solutions, and this is where Credo excels.
I'll start with comments on our fiscal 2023 results. Today, Credo is reporting results from our first full fiscal year as a public company. In fiscal '23, Credo achieved just over $184 million in revenue, up 73% over fiscal '22, and we achieved non-GAAP gross margin of 58%. Product revenue increased 87% year-over-year, primarily due to the ramp of our active electrical cable solutions.
License revenue grew 28% year-over-year from $25 million to $32 million. Throughout fiscal '23, we had several highlights across our product lines. For active electrical cables, or AECs, we continued to lead the market Credo pioneered during the last several years. Our team continued to quickly innovate with application-specific solutions, and we've been successful in expanding our engagements to include multiple data centers and service providers.
Our customer-focused innovation has led to more than 20 different versions of AECs shipped for qualification or production in the last year, and we remain sole sourced in all our wins. And while our significant power advantage was a nice-to-have a couple of years ago, it's increasingly becoming imperative as our hyperscaler customers are pushed to lower their carbon footprint.
For optical DSPs, Credo continued to build momentum by successfully passing qualification for 200-gig and 400-gig solutions at multiple hyperscalers with multiple optical module partners. In addition, Credo introduced our 800-gig optical DSPs, laser drivers and TIAs and we announced our entry into the coherent optical DSP market.
For Line Card PHYs, we continued to expand our market leadership. In particular, Credo built upon our position as the leader for MACsec PHYs with over 50% market share. We also extended our performance and power efficiency advantages for 100-gig per lane Line Card PHYs with the introduction of our Screaming Eagle family of retimers and gearboxes with up to 1.6 terabits per second of bandwidth.
For IP licensing, we continue to build on our offering of highly optimized SerDes IP. In the year, we licensed SerDes IP at several process nodes from 4-nanometer to 28-nanometer, with speeds ranging from 28-gig to 112-gig and reach ranging from XSR to LR. We believe our ability to innovate to deliver custom solutions remains unparalleled. We maintain very close working relationships with hyperscalers, and we'll continue to collaborate with them to deliver solutions that are optimized to their needs.
Despite recent macroeconomic headwinds in the data center industry, we believe the need for higher speed with better power efficiency will continue to grow. This plays perfectly to Credo's strengths, which is why we remain optimistic about our prospects in fiscal '24 and beyond.
I will now discuss the fourth quarter more specifically. In Q4, we delivered revenue of $32.1 million and non-GAAP gross margin of 58%.
I'll now provide an overview of key business trends for the quarter. First, regarding AEC, market forecasters continue to expect significant growth in this product category due to the benefits of AECs compared to both legacy direct attached copper cables and active optical cables, which come at significantly higher power and higher cost.
With our largest customer, we're encouraged by our development progress on several new AEC programs, including an acceleration in the first 100-gig per lane AI program where they intend to deploy accretive AECs. We saw the initial ramp up of a second hyperscale customer, which we expect to grow meaningfully throughout the year. We're ramping 50-gig per lane NIC to ToR AEC solutions for both their AI and compute applications. And I'm happy to report that Credo has been awarded this customer's first 100-gig per lane program.
We're also actively working to develop several other advanced AEC solutions for their next-generation deployments. We continue to make progress with additional customers as well. We remain in flight with 2 additional hyperscalers and are also engaged in meaningful opportunities with service providers.
We've seen momentum building for AEC solutions across AI, compute and switch applications and we continue to expect to benefit as speeds move quickly to 100-gig per lane.
Regarding our progress on optical solutions, in the optical category, we've leveraged our SerDes technology to deliver disruptive products, including DSPs, laser drivers and TIAs for 50-gig through 800-gig port applications.
We remain confident we can gain share over time due to our compelling combination of performance, power and cost. In addition to the hyperscalers that have previously production-qualified Credo's optical DSPs, we started the production ramp of a 400-gig optical DSP for a U.S. hyperscaler as the end customer.
At OFC in March, we received very positive feedback on our market solutions, including our Dove 800 products as well as on our announcement to enter the 100-gig ZR coherent DSP market. We're well positioned to win hyperscalers across a range of applications, including 200-gig, 400-gig and 800-gig port speeds. We're also engaged in opportunities for fiber channel, 5G, OTN and PON applications with optical partners, service providers and networking OEMs.
Within our Line Card PHY category, during the fourth quarter, we saw growing interest in our solutions, specifically for our Screaming Eagle 1.6 terabit per second PHYs. We've already been successful winning several design commitments from leading networking OEMs and ODMs for the Screaming Eagle devices. Credo was selected due to our combination of performance, signal integrity, power efficiency and cost effectiveness.
We also made significant development progress with our customer-sponsored next-generation 5-nanometer 1.6 terabit per second MACsec PHY, which we believe will extend our leadership well into the future for applications requiring encryption.
Regarding our SerDes IP licensing and SerDes chiplet businesses, our IP deals in Q4 were primarily led by our 5 and 4-nanometer 112-gig SerDes IP, which, according to customers, offers a significant power advantage versus competition based on our ability to power-optimize to the reach of an application.
Our SerDes chiplet opportunity continues to progress. Our collaboration with Tesla on their Dojo supercomputer design is an example of how connectivity chiplets can enable advanced next-generation AI systems. We're working closely with customers and standards bodies such as the UCIe consortium to ensure we retain leadership as the chiplet market grows and matures. We believe the acceleration of AI solutions across the industry will continue to fuel our licensing and chiplet businesses.
To sum up, the hyperscale landscape has shifted swiftly and dramatically in 2023. Compute is now facing a new horizon, which is generative AI. We expect this shift to accelerate the demand for energy-efficient connectivity solutions that perform at the highest speeds. From our viewpoint, this technology acceleration increases the degree of difficulty and will naturally slim the field of market participants.
We remain confident that our technology innovation and market leadership will fuel our growth as these opportunities materialize. We expect to grow sequentially in Q1 and then continue with sequential quarterly revenue growth throughout fiscal '24. We believe our growth will be led by multiple customers across our range of connectivity solutions, which will result in a more diversified revenue base as we exit fiscal '24.
I'll now turn the call over to our CFO, Dan Fleming, who will provide additional details. Thank you.
Daniel Fleming - CFO
Thank you, Bill, and good afternoon. I will first provide a financial summary of our fiscal year '23, then review our Q4 results and finally, discuss our outlook for Q1 and fiscal '24. As a reminder, the following financials will be discussed on a non-GAAP basis, unless otherwise noted.
Revenue for fiscal year '23 was a record at $184.2 million, up 73% year-over-year, driven by product revenue that grew by 87%. Gross margin for the year was 58.0%. Our operating margin improved by 13 percentage points even as we grew our product revenue mix. This illustrates the leverage that we can produce in the business. We reported earnings per share of $0.05, an $0.18 improvement over the prior year.
Moving on to the fourth quarter. In Q4, we reported revenue of $32.1 million, down 41% sequentially and down 14% year-over-year. Our IP business generated $5.7 million of revenue in Q4, down 55% sequentially and down 49% year-over-year. IP remains a strategic part of our business, but as a reminder, our IP results may vary from quarter-to-quarter, driven largely by specific deliverables to preexisting contracts.
While the mix of IP and product revenue will vary in any given quarter over time, our revenue mix in Q4 was 18% IP, above our long-term expectation for IP, which is 10% to 15% of revenue. We continue to expect IP as a percentage of revenue to come in above our long-term expectations for fiscal '24.
Our product business generated $26.4 million of revenue in Q4, down 37% sequentially and flat year-over-year. Our team delivered Q4 gross margin of 58.2%, above the high end of our guidance range and down 94 basis points sequentially due to lower IP contribution.
Our IP gross margin generally hovers near 100% and was 97.4% in Q4. Our product gross margin was 49.7% in the quarter, up 245 basis points sequentially and up 167 basis points year-over-year, due principally to product mix.
Total operating expenses in the fourth quarter were $27.2 million, within guidance and up 6% sequentially and 25% year-over-year. Our year-over-year OpEx increase was a result of a 36% increase in R&D as we continue to invest in the resources to deliver innovative solutions. Our SG&A was up 12% year-over-year as we built out public company infrastructure.
Our operating loss was $8.5 million in Q4, a decline of $10.7 million year-over-year. Our operating margin was negative 26.4% in the quarter, a decline of 32.2 percentage points year-over-year due to reduced top line leverage.
We reported a net loss of $5.7 million in Q4, $8.3 million below last year. Cash flow used by operations in the fourth quarter was $11.8 million, a decrease of $14.2 million year-over-year due largely to our net loss and changes in working capital.
CapEx was $3.9 million in the quarter driven by R&D equipment spending and free cash flow was negative $15.7 million, a decrease of $8.4 million year-over-year. We ended the quarter with cash and equivalents of $217.8 million, a decrease of $15.2 million from the third quarter. This decrease in cash was a result of our net loss and the investments required to grow the business.
We remain well capitalized to continue investing in our growth opportunities while maintaining a substantial cash buffer for uncertain macroeconomic conditions. Our accounts receivable balance increased by 14.6% sequentially to $49.5 million, while days sales outstanding increased to 140 days, up from 72 days in Q3 due to lower revenue. Our Q4 ending inventory was $46.0 million, down $4.3 million sequentially.
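As a rough cross-check, and assuming a roughly 91-day quarter, the 140-day figure is consistent with the quarter's receivables and revenue:

$$\mathrm{DSO} \approx \frac{\text{accounts receivable}}{\text{quarterly revenue}} \times 91 \approx \frac{49.5}{32.1} \times 91 \approx 140 \text{ days}$$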
Now turning to our guidance. We currently expect revenue in Q1 of fiscal '24 to be between $33 million and $35 million, up 6% sequentially at the midpoint. We expect the Q1 gross margin to be within a range of 58% to 60%. We expect the Q1 operating expenses to be between $26 million and $28 million.
We expect the Q1 basic weighted average share count to be approximately 149 million shares. We feel we have moved through the bottom in the fourth quarter. While we see some near-term upside to our prior expectations, we remain cautious about the back half of our fiscal year due to uncertain macroeconomic conditions.
In summary, as we move forward through fiscal year '24, we expect sequential revenue growth, expanding gross margins due to increasing scale and modest sequential growth in operating expenses. As a result, we look forward to driving operating leverage as we exit the year.
And with that, I'll open it up for questions. Thank you.
Operator
(Operator Instructions) The first question that we have is coming from Tore Svanberg of Stifel.
Tore Egil Svanberg - MD
For my first question and in regards to the Q1 guidance as far as what's driving the growth, given your gross margin comment, I assume that AEC will probably continue to be down with perhaps the growth coming from -- kind of for DSP and IP. Is that sort of the correct thinking or if not, please correct me?
Daniel Fleming - CFO
So you're correct in that our -- if you look at the sequential increase in gross margin from Q3 to Q4, while our product revenue was down, that's really reflective of a favorable product mix, where AEC, as we all know, which is on the lower end of our margin profile, was -- contributed less of the overall product mix.
That trend will continue in Q1. And I would characterize that really as broadly across all of our other product lines, not really singling out one specific product line that's taking up the slack from AEC, so to speak.
Tore Egil Svanberg - MD
And as my follow-up question for you, Bill, with generative AI, as you mentioned in your script, things are clearly changing. I was just hoping you could talk a little bit more granular about how it impacts each business? I'm even thinking about sort of the 800-gig PAM4 cycle. I mean, is that getting pulled in? So yes, I mean, how -- if you could just give us a little bit more color on how generative AI could impact each of your 4 business units at this point?
William J. Brennan - President, CEO & Director
Sure, absolutely. So I think -- generally, I think that AI applications will create revenue opportunities for us across our portfolio. I think the largest opportunity that we'll see is with AEC. However, optical DSPs, there will definitely be a big opportunity there. Even Line Card PHYs, chiplets, even SerDes IP licensing will get an uplift as AI deployments increase. So maybe I can start first with AECs.
Now it's important to kind of identify the differences between traditional compute server racks, which is kind of commonly referred to -- used at the front-end network, so basically a NIC to ToR connection, the ToR up to the leaf and spine network. The typical compute rack would have 10 to 20 AECs in rack, meaning in-rack connections from NIC to ToR, and I'll highlight that the leading-edge lane rate today for these connections with compute servers is 50-gig per lane.
Within an AI cluster, in addition to the front-end network, which is similar, there's a back-end network referred to as the RDMA network, and that basically allows the AI appliances to be networked together within a cluster directly. And if we start going through the map, this back-end network has 5 to 10x of the bandwidth as the front-end network.
And so the other important thing is to note within these RDMA networks, there are Leaf-Spine racks as well. And so if we look at the -- if we look at one example of a customer that we're working with in deploying, the AI appliance rack itself will have a total of 56 AECs between the front-end and back-end networks. Each Leaf-Spine rack is a [class] rack or aggregated chassis, which will have 256 AECs.
And so when we look at it from an overall opportunity for AEC, this is a huge uplift in volume and the volume coincides with the bandwidth. Now lane rates will quickly move and certain applications will go forward at 50-gig per lane, others will go straight to 100-gig per lane. And so we see probably a 5x plus revenue opportunity difference between the typical -- if you were to say apples-to-apples with the number of compute server racks versus an AI cluster.
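As a rough sanity check using only the figures above, and before counting the 256-AEC Leaf-Spine racks in the back-end network, an AI appliance rack at 56 AECs versus a typical compute rack at 10 to 20 AECs works out to roughly

$$\frac{56}{20} = 2.8\times \quad\text{to}\quad \frac{56}{10} = 5.6\times$$

per rack, which is broadly consistent with the 5x-plus framing once the back-end Leaf-Spine AECs and the faster lane rates are included.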
So it kind of extends the -- kind of extends into the optical. There is also a typically large -- there's typically a large number of AOCs in the same cluster. So you can imagine that the short in-rack connections are going to be done with AECs. These are 3 meters or less. But these appliances will connect to the back-end Leaf-Spine racks, these disaggregated racks. All of those connections will be AOCs.
Those are connections that are greater than 3 meters. And so, if we look at this, this is all upside to, say, a traditional compute deployment where there's really no AOCs connecting rack-to-rack. Okay, so, when we look at the overall opportunity, we think that the additional AEC opportunity within an AI cluster is probably twice as large as -- twice as many connections as AOCs, but the AOC opportunity for us will be significant in a sense that AOCs represent the most cost-sensitive portion of the optical market.
And so it's also a lower technology hurdle since the optical connection is well defined, and it's within the cable. So this is a really natural spot for us to be disruptive in this market. We see some of our customers planning on deploying 400-gig AOCs. Others are planning to go straight to 800-gig AOCs. So we view -- AEC is the largest opportunity -- optical DSPs for sure will get an uplift in the overall opportunity set.
But also, I think that if we look at Tesla, as an example, that's an example of where as they deploy, we're going to see a really nice opportunity for our chiplets that we did for them for that Dojo supercomputer. And it's an example of how AI applications are doing things completely differently, and we view that long-term this will be kind of a natural thing for us to benefit from. We could extend that to SerDes IP licensing.
Many of the licenses that we're doing now are targeting different AI applications. And also, don't forget Line Cards. The opportunity for the network OEMs and ODMs is also increasing. And of course, Line Card PHYs are something that go on the switch Line Cards that are developed. So generally speaking, I think that AI will drive faster lane rates. And we've been very, very consistent with our message that as the market hits the knee in the curve on AI deployments we're naturally going to see lane rates go more quickly to 100-gig per lane.
And that's where we really see our business taking off. So, we're getting a really nice revenue increase from 50-gig per lane applications, but we really see acceleration as 100-gig per lane happens. And especially when you start thinking about the power advantages that all of our solutions offer compared to others that are doing similar things -- Does that -- that might have been more than you were looking for, but...
Tore Egil Svanberg - MD
No, that's a great overview.
Operator
And the next question will be coming from Quinn Bolton of Needham & Company.
Nathaniel Quinn Bolton - Senior Analyst
Bill, maybe a follow-up to Tore's question, just sort of the impact of generative AI on the business. Given that most of your AEC revenue today comes from the standard compute racks rather than AI racks, what do you see in terms of potential cannibalization at least in the near term, as these hyperscalers prioritize building out the AI racks potentially at the expense of compute deployments again -- in the near term?
William J. Brennan - President, CEO & Director
So I feel very good about how we're positioned. It is the case that our first ramp with our largest customer was a compute rack. I think we're very well positioned with our customer as they transition to AI deployments. And so we've talked in the past about 2 different types of deployments at the server level. Of course, compute will continue, and we can all guess as to what ratio it's going to be between compute and AI.
We've got the road map very well covered for compute. So I think we're well set. And so as that resumes as our largest customer, I think we're going to be in good shape. I'm actually more excited about the acceleration of the AI program that we've been working with the same customer on, for close to 1 year. And so, I feel like we're well covered for both compute and AI, and that's really a long-term statement.
So a little bit of new information, I would say, is that with our second hyperscale customer, just to give an update generally on that and then relate that back to the same point that I was making about the earlier customer, we are right on track with the AEC ramp. The first program is a compute server rack that we've talked about. We saw small shipments in Q4, and we expect to see a continued ramp through fiscal '24.
However, during the past several months, a new AI application has taken shape. So if we would have talked 100 days ago, we wouldn't have seen this -- we wouldn't have talked about this program. And so, we quickly delivered a different configuration of the AEC that was designed for the compute [SerDes] rack. So if you recall, we did a straight cable as well as an ex-cable configuration.
So they asked us to deliver a new configuration that had specific changes that were needed for their deployment. And we delivered the new configuration within weeks, which is -- that's another example of the benefit to how we're organized. The qualification is underway, and we expect this AI appliance rack to also ramp in our fiscal '24. It's unclear as to the exact schedule from a time standpoint and a volume standpoint.
But we feel like this is going to be another significant second program for us. And so, I think that -- for both our first and our second hyperscale customer, I think we're covering the spectrum between compute and AI. So, I feel like we're really in great shape. So hopefully, that answers your question. Now if I take it a little bit further and say, okay, long-term, let's say it's 80% compute, 20% AI.
And you think maybe -- because the opportunity for us is 5x larger in AI, maybe the opportunity is similar if the ratio is like that. So compute might be equal to AI from an AEC perspective. I think that, any way that goes, we're going to benefit. If it goes 50-50, that's a big upside for us with AEC given the fact that there's larger volume, larger dollars associated with an AI cluster deployment.
And so, I think that for us, it won't affect us one way or another, maybe in the near-term quarters, yes. But the situation at our first customer really hasn't changed since the last update. So, we think that the year-over-year increase in revenue for that customer will happen in FY '25, as we've discussed before.
Nathaniel Quinn Bolton - Senior Analyst
But no further push out or delay of the compute rack at the first hyperscaler given the potential reprioritization to AI in the near term?
William J. Brennan - President, CEO & Director
Well, the new program qualifications, we've talked about 2 of them, they're still scheduled in the relatively near future. And of course, as those get qualified and ramp, we'll see benefit from that. But it's a little bit tough to track month-by-month, right? That's a little bit too specific in a timeframe standpoint. So we've seen a slight delay, but it's not something that we're necessarily concerned about.
Nathaniel Quinn Bolton - Senior Analyst
And then just a clarification on the second hyperscaler. I think the last update, you said you may not yet have a hard forecast for that hyperscaler's needs on the AEC side. Have you received sort of a hard PO or at least a more reliable forecast that you're now sort of forecasting that business from in fiscal '24?
William J. Brennan - President, CEO & Director
Yes, it's coming together, and I think we feel comfortable saying that the revenue that will be generated by this second customer will be significant. And I'm not exactly able to talk about how significant.
I think that we're -- we continue to view this through a conservative lens, because we really don't know how the second half is going to shape up. But all the indicators that we've heard over the last 90 days are quite positive. And I think -- Dan referenced the fact that in Q2, we expect significant material revenue as that starts.
Operator
And our next question will be coming from Suji Desilva of ROTH Capital.
Suji Desilva - MD & Senior Research Analyst
Just want to talk about the AEC, the products. You have multiple products, and I just want to know if -- are there certain ones that are more relevant to AI rack versus a traditional compute rack or are they all applicable across the board?
William J. Brennan - President, CEO & Director
So, I would say that I wouldn't classify all of these solutions, I wouldn't lump them together. We're very much looking at the AEC opportunity as one where we're positioned to implement really customer-specific requests. And so, part of what we're seeing is that most of the designs that we're engaged now have something very specific to a given customer.
And so, I can say that we're seeing that there's a large number of customers moving to 100-gig per lane quickly, but we're also seeing customers that are reconfiguring existing NICs and building different AI appliances with those NICs. And so, they're going to be able to ramp 50-gig per lane solutions. Now as far as configurations go, we see straight cable opportunities. We see [wide] cable opportunities.
We see opportunities where -- just recently we had a customer ask us to have 100-gig on one end of the cable and 50-gig on the other end of the cable. And so -- obviously, that's a breakout cable. But it's an interesting challenge because this is the first time we'll be mixing different generations of ICs. And so -- again, this is something we're able to do because we're so unique in a sense that we have a dedicated organization internal to Credo that's responsible for delivering these system solutions.
It's really that single party that's responsible for collaborating with the customer, designing, developing, delivering, qualifying and then supporting the designs with our customers. And so, I can't emphasize enough that -- you give engineers at these hyperscalers the opportunity to innovate in a space they'd never thought of, and it's something that we're getting really good uptake on.
And of course, our investment in the AEC space is really unmatched by any of our competition. I think we're unique in the sense that we can offer this type of flexibility. So to answer your question, it's not -- I couldn't really point to one type of cable that is going to be leaned on.
Suji Desilva - MD & Senior Research Analyst
It paints the picture of how the cables are being deployed here, [I guess]. And then also, I believe in the prepared remarks you mentioned, 20 AECs being qualified for shipments, if I heard that right. I'm curious how many -- across how many customers or how many programs that is, just to understand the breadth of that qualification effort?
William J. Brennan - President, CEO & Director
Yes, I would say that -- there's a set of hyperscalers that are really the large opportunity within the industry for the AEC opportunity. But we've also had a lot of success with data centers that might not qualify as capital H, Hyperscaler, as well as service providers. And so, we can look at the relationships with hyperscalers directly, and there are several SKUs that we've delivered.
And there's even more in the queue for these more advanced next-generation systems. But even if we look at -- I think we're -- if you look at the number of $1 million per quarter or per year customers that we've got, the list is really increasing. The product category, I think, has really been solidified over the last 6 to 9 months. And you see that also because a lot of companies are announcing that they intend to compete longer term.
Operator
And our next question will be coming from Karl Ackerman of BNP.
Karl Ackerman - Research Analyst
I have 2 questions. I guess, first off, it's great to see the sequential improvement in your fiscal Q1, but I didn't hear you confirm your fiscal '24 revenue outlook from 90 days ago. And I guess -- could you just speak to the visibility you have on your existing programs that gives you confidence in the sequential growth that you spoke about throughout fiscal '24? Could you just touch on that? That would be helpful.
Daniel Fleming - CFO
So yes, generally speaking, we -- as we've described, we see some near-term upside, but we still remain a bit cautiously optimistic about the back half of the year. But we're -- so we're very comfortable ultimately with the current estimates for the back half.
We do have certainly increasing visibility as time passes, and we hope to provide meaningful updates in -- over the next upcoming quarters. But we're working hard to expand these growth opportunities for FY '24 and beyond, and we remain very encouraged with what we're seeing, especially with the acceleration of AI programs.
Karl Ackerman - Research Analyst
I guess as a follow-up, on the DSP opportunity that you've highlighted in the prepared remarks, are you seeing your design engagements primarily in fiscal '24 on coherent offerings or are you seeing more opportunities in DCI for your 400-gig and 800-gig opportunities?
William J. Brennan - President, CEO & Director
Yes, so the opportunities that we're seeing -- the large opportunities that we're seeing are really within the data center. And I can say that it's across the board, 200-gig, 400-gig and 800-gig, all of these hyperscalers have different strategies as to how they're deploying optical. I think we continue to make progress with 200 and 400, and I think we're in a really good position from a time-to-market perspective on 800-gig.
And so, we can talk about the cycles that we're spending with every hyperscaler. We're also aligning ourselves very closely in a strategic go-to-market strategy with select optical module companies. And we think that as it relates to DCI and coherent specifically, we're in development for that first solution that we're pursuing, which is 100-gig ZR. And we feel like that development will take place throughout this year and that we'll see first revenue in the second half of calendar '24. But as far as 400-gig, that would really be a second follow-on type of DCI opportunity for us.
Now in the ZR space, we're going to be unique because we'll market and sell the DSP to optical module makers. And so, we intend to engage 3 to 4 module makers in addition to our partner, EFFECT Photonics, and that makes us somewhat unique in the sense that other competitors are going directly to market with the ZR module.
And I highlight power is really an enabler here. And the key thing is we can do a 100-gig ZR module and fit under the power ceiling for a standard QSFP connector, which is roughly 4.5 watts. So there's a large upgrade cycle from 10-gig modules that we'll enable, but also there's new deployments in addition. So that kind of gives you a little bit of flavor about the coherent, but I really see our opportunities more within the data center.
Operator
And our next question will be coming from Vivek Arya of Bank of America.
Vivek Arya - MD in Equity Research & Research Analyst
Bill, I'm curious to get your perspective on some of these technology changes. One is the role of InfiniBand that's getting more share in these AI clusters. What does that do to your AEC opportunity? Is that a competitive situation? Is that a complementary situation?
And then the other technology question is some of your customers and partners have spoken about their desire to consider co-packaged optics and linear direct drive type of architectures. What does that do, right, to the need for stand-alone pluggables?
William J. Brennan - President, CEO & Director
I appreciate the opportunity to talk about Ethernet versus InfiniBand, because there's been a lot said about that. Generally, we see coexistence. I think depending on how you look at the market forecast information, there is a point soon in the future where Ethernet exceeds InfiniBand for AI specifically. Beyond AI, I think it's game over already.
Whether you measure the TAM in ports or dollars, Ethernet is forecasted to far exceed InfiniBand in the out years, so calendar '25 and beyond. And so, if we think about it from an absolute TAM perspective, forecasters are showing, from an Ethernet dollars perspective, that Ethernet surpasses InfiniBand by 2025. And so, the forecasts show a CAGR for Ethernet of greater than 50%, while for InfiniBand, they're showing a CAGR of less than 25%.
And so you can also look at this from a port cost perspective where InfiniBand is 2 to 4x the ASP per port compared to Ethernet, depending on who you talk to. And so in a sense, it's no secret that the world will continue to do what the world does: they'll pursue cost-effective decisions. And we think from a technology standpoint, they're very similar.
So if you think from a cost perspective, if you look apples-to-apples, if you think that an InfiniBand port is 2 to 4x the cost of an Ethernet port, in a sense, you could justify that 1 to 3 of those ports of Ethernet are free in comparison to InfiniBand. So, I think that our position here is that we really believe that Ethernet is going to prevail. We're working on so many AI programs. Every single one of them is Ethernet.
Vivek Arya - MD in Equity Research & Research Analyst
And then, Bill, anything on the move by some of your customers to think about co-packaged optics and direct drive? And while I'm at it, maybe let me just ask Dan -- have a follow-up on the fiscal '24. I think Dan, you suggested you are comfortable with where I think expectations are right now. That still implies a very steep ramp into the back half. So I'm just trying to square the confidence in the full year with some of -- just kind of the macro caution that came through in your comments?
Daniel Fleming - CFO
Yes, we are confident in how we have guided. And as I mentioned, we're very comfortable with the current estimates. If we look at FY '24, as you alluded to, Vivek, there's -- we see strong sequential top line growth throughout the year in order to get -- to achieve those numbers. And it's kind of well documented, the -- what's happened at Microsoft to us for this fiscal year.
So if we exclude Microsoft, what that means is we have in excess of 100% year-on-year growth of other product revenue from other customers, which, again, based on all of the traction that we've seen recently, we're very confident we'll be able to achieve. And of course, I'll just reiterate, one of the key drivers is AI in some of those programs. So hopefully, that gives you some additional color on our confidence for FY '24.
William J. Brennan - President, CEO & Director
Yes, regarding your question about the linear direct drive, that was, I think, this year's shiny object at OFC. I do think that the idea that -- it's really -- the idea is really to -- how to address the power challenges, basically move away from the optical DSP. I think that this is not a new idea. There was a big move towards this linear direct drive in the 10-gig space when that was emerging. And I think -- the fact that there is really none in existence, I think that DSP [has] chosen that and it was really critical to close the system.
Our feeling is that, I think we'll see much of the same this year. I think [Marvell] did a great job in kind of setting expectations correctly. They did a long session right after OFC that I think addressed it quite well. I think you'll see small deployments where every link in the system is very, very controlled. But these are typically very, very small in terms of the overall TAM.
Now we're fully signed up. If the real goal is power, that's exactly where we play. So we're fully signed up to looking at unique approaches in the future to be able to offer compelling things from a power perspective. And it's not like I'm completely dismissing the concept that was really behind the idea of linear direct drive. We're actually viewing that as a potential opportunity for us in solving the problem differently. But generally speaking, I don't think you'll see in the future a world where linear direct drive is measured in any kind of significant way.
It's not to say that people aren't spending money trying to prove it out right now. That is happening. And regarding CPO, I think that was -- yes, like that was kind of a -- that was something that was talked about for many, many years prior. And I think also on that, you'll see smaller deployments, if that's ultimately something that some customers embrace. But I don't think you'll see it in a big way. That's simply not what the customer base is looking for.
Operator
And our next question will be coming from David Liu of Mizuho.
Jing Xiao Liu - Research Analyst
This is David on for Vijay, Mizuho. My first question is, assuming that in fiscal '25, data [SerDes] demand for general compute improves and you see the continued new AI ramps, can you provide any more color on the puts and takes there and the type of operating leverage you can improve?
Daniel Fleming - CFO
Well, we're not giving specific guidance yet for fiscal '25. But you're right in that the ingredients certainly exist for operating leverage. We should exit FY '24 with pretty robust operating leverage, and based on what we know now, we would expect that to carry forward into FY '25. But we haven't framed yet, of course, what that's going to ultimately look like.
Jing Xiao Liu - Research Analyst
And I guess for my second question, when you're talking with hyperscalers on these new AI applications, how important is sort of your TCO advantage when they're exploring your solution? Or are they currently kind of just primarily focused on time-to-market and maximum performance and just getting their AI deployments out there?
William J. Brennan - President, CEO & Director
So I just want to make sure you said total cost of ownership?
Jing Xiao Liu - Research Analyst
Yes.
William J. Brennan - President, CEO & Director
Yes, I think it's hands down in favor of AEC. So if we look at 100-gig lane rates, I think the conclusion throughout the market is that there are two ways to deploy short cable solutions: it's really AEC or AOC. If we look at it from a CapEx standpoint, AECs are about half the cost. From an OpEx standpoint, it's also about half the cost, about half the power, half the ASP for an apples-to-apples type solution. So I think the TCO benefit is significant.
是的,我認為這毫無疑問有利於 AEC。因此,如果我們觀察 100 千兆通道速率,我認為整個市場的結論是,有兩種部署短電纜解決方案的方法。這實際上是 AEC 或 AOC。如果我們從資本支出的角度來看,AEC 大約是成本的一半。如果我們從營運支出的角度來看,對於同類解決方案來說,成本、功耗、平均售價也只有一半左右。所以我認為 TCO 的好處是顯著的。
The other thing you've got to consider is that, especially when you're down in server racks, these are different from switch racks in the sense that a failure with your cable solution becomes a very urgent item. And so, when we think about AOCs, the reliability in terms of number of years is probably anywhere from 1/3 to 1/10 of an AEC's. For the AECs that we sell, we talked about a 10-year product life.
您必須考慮的另一件事是,特別是當您在伺服器機架中時,這些與交換器機架不同,從某種意義上說,您的電纜解決方案出現故障是一個非常緊急的問題。因此,當我們考慮 AOC 時,以年數計算的可靠性可能在 1/3 到 1/10 之間。我們銷售的 AEC 的產品壽命為 10 年。
And so, it roughly matches or exceeds the life of the rack being deployed, and the same cannot be said for any kind of optical solution. So I think across the board, hands down, the TCO is much more favorable for AEC.
因此,它在某種程度上匹配或超過了正在部署的機架的壽命,但對於任何類型的光學解決方案來說都不是這樣。所以我認為,從整體來看,TCO 對 AEC 來說更加有利。
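As a rough illustration of the TCO arithmetic described above (a hypothetical sketch that simply assumes the "about half the cost, about half the power" comparison holds for an apples-to-apples link): if an AOC costs C up front and its power draw P costs r over the deployment period, then

\[
\mathrm{TCO}_{\mathrm{AOC}} = C + r \cdot P, \qquad
\mathrm{TCO}_{\mathrm{AEC}} \approx \tfrac{1}{2} C + r \cdot \tfrac{1}{2} P = \tfrac{1}{2}\,\mathrm{TCO}_{\mathrm{AOC}},
\]

so under these assumptions both the CapEx and OpEx terms, and therefore the total cost of ownership, come out at roughly half.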
Operator
Operator
And our next question will be coming from Quinn Bolton of Needham & Company.
我們的下一個問題將來自 Needham & Company 的 Quinn Bolton。
Nathaniel Quinn Bolton - Senior Analyst
Nathaniel Quinn Bolton - Senior Analyst
Two quick follow-ups. One, Dan, was there any contra revenue in the April quarter?
快速2次跟進。第一,丹,四月這個季度有抵銷收入嗎?
Daniel Fleming - CFO
Daniel Fleming - CFO
That's an excellent question, Quinn. I'm glad you caught that. Actually, there was, and you will see that when we file our 10-K. In the past, you've been able to see that in our press release, in our GAAP to non-GAAP reconciliation. But from Q4 and going forward, we're no longer excluding that contra revenue from our non-GAAP financials. And this really came about through a comment from the SEC, not singling out Credo, but actually all Amazon suppliers with whom Amazon holds a warrant.
這是一個很好的問題,奎因。我很高興你明白了。事實上,確實有,當我們提交 10-K 時您會看到這一點。過去,您可以在我們的新聞稿、公認會計原則與非公認會計原則的調節中看到這一點。但從第四季開始,我們將不再將這筆對銷收入排除在我們的非公認會計準則財務資料之外。這實際上是透過美國證券交易委員會的評論得出的,它不是針對 Credo,而是實際上所有擁有搜查令的亞馬遜供應商——或者亞馬遜擁有搜查令。
So the positive side of this change is that you'll still be able to track what that warrant expense ultimately is when we make our quarterly filings. And looking historically, it doesn't really make a reporting difference on a non-GAAP basis; the difference was not material. And it just makes the calculation a little more straightforward going forward. Our only non-GAAP reconciling item going forward, at least for the foreseeable future, is really just share-based compensation.
因此,這項變更的積極之處在於,您仍然能夠最終追蹤認股權證費用是多少,但是當我們提交 Q-on-Q 時。從歷史上看,在非公認會計原則的基礎上,它並沒有真正產生報告差異。這不是物質上的差別。它只是讓計算變得更簡單。至少在可預見的未來,我們唯一的非公認會計原則調節項目實際上只是基於股份的薪酬。
Nathaniel Quinn Bolton - Senior Analyst
Nathaniel Quinn Bolton - Senior Analyst
So the revenue doesn't change. You just won't be making the adjustments for the contra revenue in the non-GAAP gross margin calculation going forward?
所以收入沒有改變。您只是想 - 您不會對未來的對銷收入和非 GAAP 毛利率計算進行調整嗎?
Daniel Fleming - CFO
Daniel Fleming - CFO
That's exactly correct. Yes. Revenue is still revenue. It has a portion of it, which is contra-revenue, which obviously brings down the revenue a little bit.
這是完全正確的。是的。收入還是收入。它有一部分是對沖收入,這顯然會稍微降低收入。
Nathaniel Quinn Bolton - Senior Analyst
Nathaniel Quinn Bolton - Senior Analyst
And then for Bill, would you expect in fiscal '24 a meaningful initial ramp of the 200 or 400-gig optical DSPs? Or would you continue to encourage investors to sort of think that the optical DSP ramp is really beyond a fiscal '24 event at this point?
那麼對於 Bill 來說,您是否期望在 24 財年實現 200 或 400 吉光纖 DSP 的有意義的初始成長?或者您會繼續鼓勵投資者認為光學 DSP 的提升實際上已經超出了 24 財年的事件範圍?
William J. Brennan - President, CEO & Director
William J. Brennan - President, CEO & Director
I think that when we think about significant, we think about crossing the 10%-of-revenue threshold, and we don't see that until fiscal '25. We do see signs of life in China. And as I said, we're shipping 400-gig optical DSPs to a U.S. hyperscaler now. My expectation is that throughout the year we're going to have a lot more success stories to talk about, but those ramps will most likely not take place within the next 3 quarters. So it's really a fiscal '25 target at this point.
我認為,當我們考慮重大問題時,我們會考慮跨越收入的 10% 門檻,但直到 25 財年我們才會看到這一點。我們確實在中國看到了生命的跡象。正如我所說,我們現在正在向美國超大規模企業運送 400 GB 光學 DSP。我的期望是,全年我們將有更多的成功故事可供談論,但這些成長很可能不會在接下來的三個季度內發生。因此,目前這確實是 25 財年的目標。
Nathaniel Quinn Bolton - Senior Analyst
Nathaniel Quinn Bolton - Senior Analyst
But it starts this year, it's just you're not…
但從今年開始,只是你不…
William J. Brennan - President, CEO & Director
William J. Brennan - President, CEO & Director
Yes.
是的。
Nathaniel Quinn Bolton - Senior Analyst
Nathaniel Quinn Bolton - Senior Analyst
going to be meaningful because it doesn't hit the 10% threshold?
因為沒有達到 10% 的門檻就有意義嗎?
William J. Brennan - President, CEO & Director
William J. Brennan - President, CEO & Director
Right, exactly.
對,完全正確。
Operator
Operator
And that question will be coming from Tore Svanberg of Stifel.
這個問題將由 Stifel 的 Tore Svanberg 提出。
Tore Egil Svanberg - MD
Tore Egil Svanberg - MD
Bill, maybe a follow-up to the previous question about 200 or 400-gig. I was a little bit more curious about 800-gig. Are you seeing any changes at all to the timelines there? I think the expectation was that the 800-gig market would maybe take off in the second half of next calendar year. But with all these new AI trends, I'm just wondering if you're seeing any pull-in activity there, or maybe even seeing some cannibalization versus 200-gig and 400-gig?
Bill,也許是上一個關於 200 或 400 場演出的問題的後續。我對 800 場演出更好奇一些。您看到那裡的時間線有任何變化嗎?我認為 800 場演出的市場可能會在明年下半年起飛。但面對所有這些新的人工智慧趨勢,只是想知道您是否看到了任何拉動活動,或者甚至可能看到與 200-gig 和 400-gig 相比的一些蠶食?
William J. Brennan - President, CEO & Director
William J. Brennan - President, CEO & Director
My expectation is that this is really a calendar year '24 type of market takeoff. And whether it's the second half or first half, we, of course, would like to see it in the first half, given that that would imply success in pulling in AI programs. And so there's a lot of benefit that comes with 800-gig modules and the implications they have for our AEC business.
我認為——我的期望是,這確實是 24 年的市場起飛。無論是下半年還是上半年,我們當然希望看到上半年,因為這意味著人工智慧專案的引入會成功。因此,800g 模組帶來了許多好處,並對我們的 AEC 業務產生了影響。
But I definitely see it kind of in that time frame. I don't really see it as a cannibalization of the 200 and 400-gig; it's only cannibalization if you look at these new deployments as being in lieu of the old technology. But like I said before, every hyperscaler has their own strategy related to the port sizes they plan on deploying. Everybody's got a unique architecture.
但我確實在那個時間範圍內看到了這一點。我並不認為這是對 200 場和 400 場演出的蠶食。除非你仔細觀察,否則這些新部署確實取代了舊技術。但正如我之前所說,每個超大規模企業都有自己的與其計劃部署的連接埠大小相關的策略。每個人都有獨特的建築。
And where we see optical is typically in the Leaf-Spine network, for anything above the ToR. In AI, I think the real opportunity is going to be with AOCs. And that, I think, is going to be a very large 800-gig market when those AI clusters really begin deployment, which, again, I think could be in calendar '24. So I appreciate the question, though.
我們看到光學通常是在葉脊網絡中,用於任何高於 ToR 的情況。在人工智慧領域,我認為真正的機會在於 AOC。我認為,當這些人工智慧叢集真正開始部署時,這將是一個非常大的 800G 市場,我認為這可能會在 24 日曆年開始。不過,我很欣賞這個問題。
Operator
Operator
That concludes the Q&A for today. I would like to turn the call back over to Bill Brennan for closing remarks. Please go ahead.
今天的問答就到此結束。我想將電話轉回給比爾布倫南 (Bill Brennan),讓其致閉幕詞。請繼續。
William J. Brennan - President, CEO & Director
William J. Brennan - President, CEO & Director
We really appreciate the participation today, and we look forward to following up on the call backs. So thanks very much.
我們非常感謝今天的參與,並期待對回電進行跟進。非常感謝。
Operator
Operator
This concludes today's conference call. Thank you all for joining, and everyone, enjoy the rest of your evening.
今天的電話會議到此結束。感謝大家的加入,祝福大家有個愉快的夜晚。