Credo, a high-speed connectivity company, reported record fiscal 2023 revenue of $184.2 million, up 73% year-over-year. Growth was driven by product revenue, which increased 87%. The company expects sequential growth in the first quarter and continued sequential quarterly revenue growth throughout fiscal 2024, and it continues to expect significant growth within the accelerating market opportunity for high-speed connectivity solutions, particularly generative AI applications.
Credo's AEC (active electrical cable) solutions are tailored to customers' specific requirements. A large number of customers are moving quickly to 100 gigabits per lane, and the company has shipped more than 20 different versions of AECs for qualification or production.
Operator
Good day, and thank you for standing by. Welcome to the Credo Q4 Fiscal Year 2023 Earnings Conference Call. (Operator Instructions) Please be advised that today's conference is being recorded.
I would now like to go ahead and turn the call over to Dan O'Neil. Please go ahead.
Daniel J. O'Neil - VP of Corporate Development & IR
Good afternoon, and thank you all for joining us today for our fiscal 2023 fourth quarter and year-end earnings call. Joining me today from Credo are Bill Brennan, our Chief Executive Officer; and Dan Fleming, our Chief Financial Officer.
I'd like to remind everyone that certain comments made in this call today may include forward-looking statements regarding expected future financial results, strategies and plans, future operations, the markets in which we operate and other areas of discussion. These forward-looking statements are subject to risks and uncertainties that are discussed in detail in our documents filed with the SEC.
It's not possible for the company's management to predict all risks nor can the company assess the impact of all factors on its business or the extent to which any factor or combination of factors may cause actual results to differ materially from those contained in any forward-looking statements.
Given these risks, uncertainties and assumptions, the forward-looking events discussed during this call may not occur, and actual results could differ materially and adversely from those anticipated or implied. The company undertakes no obligation to publicly update forward-looking statements for any reason after the date of this call to conform these statements to actual results or to changes in the company's expectations except as required by law.
Also during this call, we will refer to certain non-GAAP financial measures, which we consider to be important measures of the company's performance. These non-GAAP financial measures are provided in addition to and not as a substitute for or superior to financial performance prepared in accordance with U.S. GAAP. A discussion of why we use non-GAAP financial measures and reconciliations between our GAAP and non-GAAP financial measures is available in the earnings release we issued today, which can be accessed using the Investor Relations portion of our website.
With that, I'll now turn the call over to our CEO. Bill?
William J. Brennan - President, CEO & Director
Thanks, Dan, and good afternoon, everyone. Thank you for joining our Q4 fiscal '23 earnings call. I'll begin by providing an overview of our fiscal year '23 and fiscal Q4 results. I will then highlight what we see going forward in the fiscal '24. Dan Fleming, our CFO, will follow my remarks with a detailed discussion of our Q4 and fiscal year '23 financial results and share our outlook for the first quarter.
Credo is a high-speed connectivity company, delivering integrated circuits, system-level solutions and IP licenses to the hyperscale data center ecosystem along with a range of other data centers and service providers. All our solutions leverage our core SerDes technology and our unique customer-focused design approach, enabling Credo to deliver optimized, secure, high-speed solutions with significantly better power efficiency and cost.
Our electrical and optical connectivity solutions delivered leading performance with port speeds ranging from 50 gig up to 1.6 terabits per second. While we primarily serve the Ethernet market today, we continue to extend into other standards-based markets as the need for higher speed with more power-efficient connectivity increases exponentially.
Credo continues to have significant growth expectations within the accelerating market opportunity for high-speed connectivity solutions. In fact, the onset of generative AI applications is already accelerating the need for higher speed and more energy-efficient connectivity solutions, and this is where Credo excels.
I'll start with comments on our fiscal 2023 results. Today, Credo is reporting results from our first full fiscal year as a public company. In fiscal '23, Credo achieved just over $184 million in revenue, up 73% over fiscal '22, and we achieved non-GAAP gross margin of 58%. Product revenue increased 87% year-over-year, primarily due to the ramp of our active electrical cable solutions.
License revenue grew 28% year-over-year from $25 million to $32 million. Throughout fiscal '23, we had several highlights across our product lines. For active electrical cables, or AECs, we continued to lead the market Credo pioneered during the last several years. Our team continued to quickly innovate with application-specific solutions, and we've been successful in expanding our engagements to include multiple data centers and service providers.
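As a rough cross-check, the growth rates quoted above fit together arithmetically; the sketch below uses only the rounded figures from the call (total revenue, license revenue and the two growth rates) and is illustrative rather than a reconstruction of reported segment data.

```python
# Back-of-the-envelope check of the FY'23 figures quoted on the call.
# Inputs are rounded numbers from the prepared remarks, not exact reported values.
total_fy23 = 184.2                          # $M, record FY'23 revenue
license_fy22, license_fy23 = 25.0, 32.0     # $M, license revenue, up ~28%
product_growth = 0.87                       # product revenue up 87% year-over-year

product_fy23 = total_fy23 - license_fy23             # ~152.2
product_fy22 = product_fy23 / (1 + product_growth)   # ~81.4
total_fy22 = product_fy22 + license_fy22              # ~106.4

print(f"implied FY'22 revenue: ${total_fy22:.1f}M")
print(f"implied total growth: {total_fy23 / total_fy22 - 1:.0%}")  # ~73%, as quoted
```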
Our customer-focused innovation has led to more than 20 different versions of AECs shipped for qualification or production in the last year, and we remain sole-sourced in all our wins. And while our significant power advantage was a nice-to-have a couple of years ago, it's increasingly becoming imperative as our hyperscaler customers are pushed to lower their carbon footprint.
For optical DSPs, Credo continued to build momentum by successfully passing qualification for 200-gig and 400-gig solutions at multiple hyperscalers with multiple optical module partners. In addition, Credo introduced our 800-gig optical DSPs, laser drivers and TIAs and we announced our entry into the coherent optical DSP market.
For Line Card clients, we continued to expand our market leadership. In particular, Credo built upon our position as the leader for MACsec PHYs with over 50% market share. We also extended our performance in power efficiency advantages for 100-gig per lane Line Card PHYs with the introduction of our Screaming Eagle family of retimers and gearboxes with up to 1.6 terabits per second of bandwidth.
For IP licensing, we continue to build on our offering of highly optimized SerDes IP. In the year, we licensed SerDes IP at several process nodes from 4-nanometer to 28-nanometer with speeds ranging from 28-gig to 112-gig and reached performance ranging from XSR to LR. We believe our ability to innovate to deliver custom solutions remains unparalleled. We maintain very close working relationships with hyperscalers, and we'll continue to collaborate with them to deliver solutions that are optimized to their needs.
Despite recent macroeconomic headwinds in the data center industry, we believe the need for higher speed with better power efficiency will continue to grow. This plays perfectly to Credo's strengths, which is why we remain optimistic about our prospects in fiscal '24 and beyond.
I will now discuss the fourth quarter more specifically. In Q4, we delivered revenue of $32.1 million and non-GAAP gross margin of 58%.
I'll now provide an overview of key business trends for the quarter. First, regarding AEC, market forecasters continue to expect significant growth in this product category due to the benefits of AECs compared to both legacy direct attached copper cables and to active optical cables, which carry significantly higher power and cost.
With our largest customer, we're encouraged by our development progress on several new AEC programs, including an acceleration in the first 100-gig per lane AI program where they intend to deploy accretive AECs. We saw the initial ramp-up of a second hyperscale customer, which we expect to grow meaningfully throughout the year. We're ramping 50-gig per lane NIC-to-ToR AEC solutions for both their AI and compute applications, and I'm happy to report that Credo has been awarded this customer's first 100-gig per lane program.
We're also actively working to develop several other advanced AEC solutions for their next-generation deployments. We continue to make progress with additional customers as well. We remain in flight with 2 additional hyperscalers and are also engaged in meaningful opportunities with service providers.
We've seen momentum building for AEC solutions across AI, compute and switch applications and we continue to expect to benefit as speeds move quickly to 100-gig per lane.
Regarding our progress on optical solutions, in the optical category, we've leveraged our SerDes technology to deliver disruptive products, including DSPs, laser drivers and TIAs for 50-gig through 800-gig port applications.
We remain confident we can gain share over time due to our compelling combination of performance, power and cost. In addition to the hyperscalers that have previously production-qualified Credo's optical DSPs, we started the production ramp of a 400-gig optical DSP for a U.S. hyperscaler as the end customer.
At OFC in March, we received very positive feedback on our market solutions, including our Dove 800 products as well as on our announcement to enter the 100-gig ZR coherent DSP market. We're well positioned to win hyperscalers across a range of applications, including 200-gig, 400-gig and 800-gig port speeds. We're also engaged in opportunities for fiber channel, 5G, OTN and PON applications with optical partners, service providers and networking OEMs.
Within our Line Card PHY category, during the fourth quarter, we saw growing interest in our solutions, specifically for our Screaming Eagle 1.6 terabit per second PHYs. We've already been successful winning several design commitments from leading networking OEMs and ODMs for the Screaming Eagle devices. Credo was selected due to our combination of performance, signal integrity, power efficiency and cost effectiveness.
We also made significant development progress with our customer-sponsored next-generation 5-nanometer 1.6 terabit per second MACsec PHY, which we believe will extend our leadership well into the future for applications requiring encryption.
Regarding our SerDes IP licensing in SerDes chiplet businesses, our IP deals in Q4 were primarily led by our 5 and 4-nanometer 112-gig SerDes IP, which, according to customers, offers significant power advantage versus competition based on our ability to power optimize to the reach of an application.
Our SerDes chiplet opportunity continues to progress. Our collaboration with Tesla on their Dojo supercomputer design is an example of how connectivity chiplets can enable advanced next-generation AI systems. We're working closely with customers and standards bodies such as the UCIe consortium to ensure we retain leadership as the chiplet market grows and matures. We believe the acceleration of AI solutions across the industry will continue to fuel our licensing and chiplet businesses.
To sum up, the hyperscale landscape has shifted swiftly and dramatically in 2023. Compute is now facing a new horizon, which is generative AI. We expect this shift to accelerate the demand for energy-efficient connectivity solutions that perform at the highest speeds. From our viewpoint, this technology acceleration increases the degree of difficulty and will naturally slim the field of market participants.
We remain confident that our technology innovation and market leadership will fuel our growth as these opportunities materialize. We expect to grow sequentially in Q1 and then continue with sequential quarterly revenue growth throughout fiscal '24. We believe our growth will be led by multiple customers across our range of connectivity solutions, which will result in a more diversified revenue base as we exit fiscal '24.
I'll now turn the call over to our CFO, Dan Fleming, who will provide additional details. Thank you.
Daniel Fleming - CFO
Thank you, Bill, and good afternoon. I will first provide a financial summary of our fiscal year '23, then review our Q4 results and finally, discuss our outlook for Q1 and fiscal '24. As a reminder, the following financials will be discussed on a non-GAAP basis, unless otherwise noted.
Revenue for fiscal year '23 was a record at $184.2 million, up 73% year-over-year, driven by product revenue that grew by 87%. Gross margin for the year was 58.0%. Our operating margin improved by 13 percentage points even as we grew our product revenue mix. This illustrates the leverage that we can produce in the business. We reported earnings per share of $0.05, an $0.18 improvement over the prior year.
Moving on to the fourth quarter. In Q4, we reported revenue of $32.1 million, down 41% sequentially and down 14% year-over-year. Our IP business generated $5.7 million of revenue in Q4, down 55% sequentially and down 49% year-over-year. IP remains a strategic part of our business, but as a reminder, our IP results may vary from quarter-to-quarter, driven largely by specific deliverables on preexisting contracts.
While the mix of IP and product revenue will vary in any given quarter over time, our revenue mix in Q4 was 18% IP, above our long-term expectation for IP, which is 10% to 15% of revenue. We continue to expect IP as a percentage of revenue to come in above our long-term expectations for fiscal '24.
Our product business generated $26.4 million of revenue in Q4, down 37% sequentially and flat year-over-year. Our team delivered Q4 gross margin of 58.2%, above the high end of our guidance range and down 94 basis points sequentially due to lower IP contribution.
Our IP gross margin generally hovers near 100% and was 97.4% in Q4. Our product gross margin was 49.7% in the quarter, up 245 basis points sequentially and up 167 basis points year-over-year, due principally to product mix.
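For reference, the reported blended gross margin follows directly from the segment figures just quoted; a minimal sketch of that arithmetic, using the rounded numbers from the call:

```python
# Q4 blended gross margin implied by the segment figures quoted on the call.
ip_rev, product_rev = 5.7, 26.4       # $M, Q4 IP and product revenue
ip_gm, product_gm = 0.974, 0.497      # segment gross margins (~97.4% and ~49.7%)

gross_profit = ip_rev * ip_gm + product_rev * product_gm
blended_gm = gross_profit / (ip_rev + product_rev)
print(f"blended gross margin: {blended_gm:.1%}")  # ~58.2%, matching the reported figure
```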
Total operating expenses in the fourth quarter were $27.2 million, within guidance and up 6% sequentially and 25% year-over-year. Our year-over-year OpEx increase was a result of a 36% increase in R&D as we continue to invest in the resources to deliver innovative solutions. Our SG&A was up 12% year-over-year as we built out public company infrastructure.
Our operating loss was $8.5 million in Q4, a decline of $10.7 million year-over-year. Our operating margin was negative 26.4% in the quarter, a decline of 32.2 percentage points year-over-year due to reduced top line leverage.
We reported a net loss of $5.7 million in Q4, $8.3 million below last year. Cash flow used by operations in the fourth quarter was $11.8 million, a decrease of $14.2 million year-over-year due largely to our net loss and changes in working capital.
CapEx was $3.9 million in the quarter driven by R&D equipment spending and free cash flow was negative $15.7 million, a decrease of $8.4 million year-over-year. We ended the quarter with cash and equivalents of $217.8 million, a decrease of $15.2 million from the third quarter. This decrease in cash was a result of our net loss and the investments required to grow the business.
We remain well capitalized to continue investing in our growth opportunities while maintaining a substantial cash buffer for uncertain macroeconomic conditions. Our accounts receivable balance increased by 14.6% sequentially to $49.5 million, while days sales outstanding increased to 140 days, up from 72 days in Q3 due to lower revenue. Our Q4 ending inventory was $46.0 million, down $4.3 million sequentially.
Now turning to our guidance. We currently expect revenue in Q1 of fiscal '24 to be between $33 million and $35 million, up 6% sequentially at the midpoint. We expect the Q1 gross margin to be within a range of 58% to 60%. We expect the Q1 operating expenses to be between $26 million and $28 million.
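The sequential growth quoted at the midpoint can be checked against the reported Q4 figure:

```python
# Q1 FY'24 guidance midpoint versus reported Q4 FY'23 revenue.
q4_revenue = 32.1                     # $M, reported Q4 revenue
guide_low, guide_high = 33.0, 35.0    # $M, Q1 revenue guidance range
midpoint = (guide_low + guide_high) / 2
print(f"midpoint ${midpoint:.1f}M, sequential growth {midpoint / q4_revenue - 1:.0%}")  # ~6%
```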
We expect the Q1 basic weighted average share count to be approximately 149 million shares. We feel we have moved through the bottom in the fourth quarter. While we see some near-term upside to our prior expectations, we remain cautious about the back half of our fiscal year due to uncertain macroeconomic conditions.
In summary, as we move forward through fiscal year '24, we expect sequential revenue growth, expanding gross margins due to increasing scale and modest sequential growth in operating expenses. As a result, we look forward to driving operating leverage as we exit the year.
And with that, I'll open it up for questions. Thank you.
Operator
(Operator Instructions) The first question that we have is coming from Tore Svanberg of Stifel.
Tore Egil Svanberg - MD
For my first question and in regards to the Q1 guidance as far as what's driving the growth, given your gross margin comment, I assume that AEC will probably continue to be down with perhaps the growth coming from -- kind of for DSP and IP. Is that sort of the correct thinking or if not, please correct me?
Daniel Fleming - CFO
So you're correct in that our -- if you look at the sequential increase in gross margin from Q3 to Q4, while our product revenue was down, that's really reflective of a favorable product mix, where AEC, as we all know, which is on the lower end of our margin profile, was -- contributed less of the overall product mix.
That trend will continue in Q1. And I would characterize that really as broadly across all of our other product lines, not really singling out one specific product line that's taking up the slack from AEC, so to speak.
Tore Egil Svanberg - MD
And as my follow-up question for you, Bill, with generative AI, as you mentioned in your script, things are clearly changing. I was just hoping you could talk a little bit more granular about how it impacts each business? I'm even thinking about sort of the 800-gig PAM4 cycle. I mean, is that getting pulled in? So yes, I mean, how -- if you could just give us a little bit more color on how generative AI could impact each of your 4 business units at this point?
William J. Brennan - President, CEO & Director
Sure, absolutely. So I think -- generally, I think that AI applications will create revenue opportunities for us across our portfolio. I think the largest opportunity that we'll see is with AEC. However, optical DSPs, there will definitely be a big opportunity there. Even Line Card PHYs, chiplets, even SerDes IP licensing will get an uplift as AI deployments increase. So maybe I can start first with AECs.
Now it's important to kind of identify the differences between traditional compute server racks, which is kind of commonly referred to -- used at the front-end network, so basically a NIC to ToR connection, the ToR up to the leaf and spine network. The typical compute rack would have 10 to 20 AECs in rack, meaning in-rack connections from NIC to ToR, and I'd highlight that the leading-edge lane rate today for these connections with compute servers is 50-gig per lane.
Within an AI cluster, in addition to the front-end network, which is similar, there's a back-end network referred to as the RDMA network, and that basically allows the AI appliances to be networked together within a cluster directly. And if we start going through the map, this back-end network has 5 to 10x of the bandwidth as the front-end network.
And so the other important thing is to note within these RDMA networks, there are Leaf-Spine racks as well. And so if we look at the -- if we look at one example of a customer that we're working with in deploying, the AI appliance rack itself will have a total of 56 AECs between the front-end and back-end networks. Each Leaf-Spine rack is a [class] rack or aggregated chassis, which will have 256 AECs.
And so when we look at it from an overall opportunity for AEC, this is a huge uplift in volume and the volume coincides with the bandwidth. Now lane rates will quickly move and certain applications will go forward at 50-gig per lane, others will go straight to 100-gig per lane. And so we see probably a 5x plus revenue opportunity difference between the typical -- if you were to say apples-to-apples with the number of compute server racks versus an AI cluster.
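To make the rack-level comparison concrete, here is a small sketch using only the cable counts quoted above. It is illustrative only: the call does not specify the mix of AI appliance racks to Leaf-Spine racks in a cluster, nor the ASP uplift as lane rates move from 50-gig to 100-gig, so the true revenue multiple depends on those assumptions.

```python
# Illustrative AEC-count comparison from the per-rack figures quoted on the call.
compute_rack_aecs = (10, 20)   # typical compute rack: 10-20 in-rack NIC-to-ToR AECs
ai_appliance_aecs = 56         # AI appliance rack: front-end plus back-end networks
leaf_spine_aecs = 256          # per Leaf-Spine (aggregated) rack in the RDMA network

low = ai_appliance_aecs / compute_rack_aecs[1]   # ~2.8x versus a dense compute rack
high = ai_appliance_aecs / compute_rack_aecs[0]  # ~5.6x versus a sparse compute rack
print(f"AI appliance rack: {low:.1f}x to {high:.1f}x the AECs of a typical compute rack")
print(f"plus {leaf_spine_aecs} AECs in each Leaf-Spine rack of the back-end network")
```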
So it kind of extends the -- kind of extends in the optical. There is also a typically large -- there's typically a large number of AOCs in the same cluster. So you can imagine that the short in-rack connections are going to be done with AECs. These are 3 meters or less. But these appliances will connect to the back-end Leaf-Spine racks, these disaggregated racks. All of those connections will be AOCs.
Those are connections that are greater than 3 meters. And so, if we look at this, this is all upside to, say, a traditional compute deployment where there are really no AOCs connecting rack-to-rack. Okay, so, when we look at the overall opportunity, we think that the additional AEC opportunity within an AI cluster is probably twice as large as -- twice as many connections as AOCs, but the AOC opportunity for us will be significant in a sense that AOCs represent the most cost-sensitive portion of the optical market.
And so it's also a lower technology hurdle since the optical connection is well defined, and it's within the cable. So this is a really natural spot for us to be disruptive in this market. We see some of our customers planning on deploying 400-gig AOCs. Others are planning to go straight to 800-gig AOCs. So we view -- AEC is the largest opportunity -- optical DSPs for sure will get an uplift in the overall opportunity set.
But also, I think that if we look at Tesla, as an example, that's an example of where as they deploy, we're going to see a really nice opportunity for our chiplets that we did for them for that Dojo supercomputer. And it's an example of how AI applications are doing things completely differently, and we view that long-term this will be kind of a natural thing for us to benefit from. We could extend that to SerDes IP licensing.
Many of the licenses that we're doing now are targeting different AI applications. And also, don't forget Line Cards. The opportunity for the network OEMs and ODMs is also increasing. And of course, Line Card PHYs are something that go on the switch Line Cards that are developed. So generally speaking, I think that AI will drive faster lane rates. And we've been very, very consistent with our message that as the market hits the knee in the curve on AI deployments we're naturally going to see lane rates go more quickly to 100-gig per lane.
And that's where we really see our business taking off. So, we're getting a really nice revenue increase from 50-gig per lane applications, but we really see acceleration as 100-gig per lane happens. And especially when you start thinking about the power advantages that all of our solutions offer compared to others that are doing similar things -- Does that -- that might have been more than you were looking for, but...
Tore Egil Svanberg - MD
No, that's a great overview.
Operator
And the next question will be coming from Quinn Bolton of Needham & Company.
Nathaniel Quinn Bolton - Senior Analyst
Bill maybe a follow-up to Tore's question, just sort of the impact of generative AI on the business. Given that most of your AEC revenue today comes from the standard compute racks rather than AI racks, what do you see in terms of potential cannibalization at least in the near term, as these hyperscalers prioritize building out the AI racks potentially at the expense of compute deployments again -- in the near term?
William J. Brennan - President, CEO & Director
So I feel very good about how we're positioned. It is the case that our first ramp with our largest customer was a compute rack. I think we're very well positioned with our customer as they transition to AI deployments. And so we've talked in the past about 2 different types of deployments at the server level. Of course, compute will continue, and we can all guess as to what ratio it's going to be between compute and AI.
We've got the road map very well covered for compute. So I think we're well set. And so as that resumes as our largest customer, I think we're going to be in good shape. I'm actually more excited about the acceleration of the AI program that we've been working with the same customer on, for close to 1 year. And so, I feel like we're well covered for both compute and AI, and that's really a long-term statement.
So a little bit of new information, I would say, is that with our second hyperscale customer, just to give an update generally on that and then relate that back to the same point that I was making about the earlier customer, we are right on track with the AEC ramp. The first program is a compute server rack that we've talked about. We saw small shipments in Q4, and we expect to see a continued ramp through fiscal '24.
However, during the past several months, a new AI application has taken shape. So if we would have talked 100 days ago, we wouldn't have seen this -- we wouldn't have talked about this program. And so, we quickly delivered a different configuration of the AEC that was designed for the compute [SerDes] rack. So if you recall, we did a straight cable as well as an ex-cable configuration.
So they asked us to deliver a new configuration that had specific changes that were needed for their deployment. And we delivered the new configuration within weeks, which is -- that's another example of the benefit to how we're organized. The qualification is underway, and we expect this AI appliance rack to also ramp in our fiscal '24. It's unclear as to the exact schedule from a time standpoint and a volume standpoint.
But we feel like this is going to be another significant second program for us. And so, I think that -- for both our first and our second hyperscale customer, I think we're covering the spectrum between compute and AI. So, I feel like we're really in great shape. So hopefully, that answers your question. Now if I take it a little bit further and say, okay, long-term, let's say it's 80% compute, 20% AI.
And you think maybe -- because the opportunity for us is 5x larger in AI, maybe the opportunity is similar if the ratio is like that. So compute might be equal to AI from an AEC perspective. I think that, any way that goes, we're going to benefit. If it goes 50-50, that's a big upside for us with AEC given the fact that there's larger volume, larger dollars associated with an AI cluster deployment.
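The mix argument above reduces to simple weighting; a sketch of it follows, where the roughly 5x AI multiple and the 80/20 and 50/50 splits are the speaker's illustrative assumptions, not guidance.

```python
# Hypothetical compute/AI mix versus relative AEC opportunity, per the discussion above.
ai_multiple = 5.0   # assumed: an AI deployment carries ~5x the AEC opportunity
for compute_share in (0.8, 0.5):
    ai_share = 1.0 - compute_share
    compute_weight = compute_share * 1.0
    ai_weight = ai_share * ai_multiple
    print(f"{compute_share:.0%} compute / {ai_share:.0%} AI -> "
          f"relative AEC opportunity {compute_weight:.1f} : {ai_weight:.1f}")
```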
And so, I think that for us, it won't affect us one way or another, maybe in the near-term quarters, yes. But the situation at our first customer really hasn't changed since the last update. So, we think that the year-over-year increase in revenue for that customer will happen in FY '25, as we've discussed before.
Nathaniel Quinn Bolton - Senior Analyst
But no further push out or delay of the compute rack at the first hyperscaler given the potential reprioritization to AI in the near term?
William J. Brennan - President, CEO & Director
Well, the new program qualifications, we've talked about 2 of them, they're still scheduled in the relatively near future. And of course, as those get qualified and ramp, we'll see benefit from that. But it's a little bit tough to track month-by-month, right? That's a little bit too specific in a timeframe standpoint. So we've seen a slight delay, but it's not something that we're necessarily concerned about.
Nathaniel Quinn Bolton - Senior Analyst
And then just a clarification on the second hyperscaler. I think the last update, you said you may not yet have a hard forecast for that hyperscalers needs on the AEC side. Have you received sort of a hard PO or at least a more reliable forecast that you're now sort of forecasting that business from in fiscal '24?
William J. Brennan - President, CEO & Director
Yes, it's coming together, and I think we feel comfortable saying that the revenue that will be generated by this second customer will be significant. And I'm not exactly able to talk about how significant.
I think that we're -- we continue to view this through a conservative lens, because we really don't know how the second half is going to shape up. But all the indicators that we've heard over the last 90 days are quite positive. And I think -- Dan referenced the fact that in Q2, we expect significant material revenue as that starts.
Operator
And our next question will be coming from Suji Desilva of ROTH Capital.
Suji Desilva - MD & Senior Research Analyst
Just want to talk about the AEC, the products. You have multiple products, and I just want to know if -- are there certain ones that are more relevant to AI rack versus a traditional compute rack or are they all applicable across the board?
William J. Brennan - President, CEO & Director
So, I would say that I wouldn't classify all of these solutions, I wouldn't lump them together. We're very much looking at the AEC opportunity as one where we're positioned to implement really customer-specific requests. And so, part of what we're seeing is that most of the designs that we're engaged now have something very specific to a given customer.
And so, I can say that we're seeing that there's a large number of customers moving to 100-gig per lane quickly, but we're also seeing customers that are reconfiguring existing NICs and building different AI appliances with those NICs. And so, they're going to be able to ramp 50-gig per lane solutions. Now as far as configurations go, we see straight cable opportunities. We see [wide] cable opportunities.
We see opportunities where -- just recently we had a customer ask us to have 100-gig on one end of the cable and 50-gig on the other end of the cable. And so -- obviously, that's a breakout cable. But it's an interesting challenge because this is the first time we'll be mixing different generations of ICs. And so -- again, this is something we're able to do because we're so unique in a sense that we have a dedicated organization internal to Credo that's responsible for delivering these system solutions.
It's really that single-party that's responsible for collaborating with the customer, designing, developing, delivery, qualifying and then supporting the designs with our customers. And so, I can't emphasize enough that -- you give engineers these hyperscalers the opportunity to innovate in the space they've never been thought of, and it's something that we're getting really good uptake on.
And of course, our investment in the AEC space is really unmatched by any of our competition. I think we're unique in the sense that we can offer this type of flexibility. So to answer your question, it's not -- I couldn't really point to one type of cable that is going to be leaned on.
Suji Desilva - MD & Senior Research Analyst
It paints the picture of how the cables are being deployed here, [I guess]. And then also, I believe in the prepared remarks you mentioned, 20 AECs being qualified for shipments, if I heard that right. I'm curious how many -- across how many customers or how many programs that is, just to understand the breadth of that qualification effort?
William J. Brennan - President, CEO & Director
Yes, I would say that -- there's a set of hyperscalers that are really the large opportunity within the industry for the AEC opportunity. But we've also had a lot of success with data centers that might not qualify as capital H, Hyperscaler, as well as service providers. And so, we can look at the relationships with hyperscalers directly, and there are several SKUs that we've delivered.
And there's even more in the queue for these more advanced next-generation systems. But even if we look at -- I think we're -- if you look at the number of $1 million per quarter or per year customers that we've got, the list is really increasing. The product category, I think, has really been solidified over the last 6 to 9 months. And you see that also because a lot of companies are announcing that they intend to compete longer term.
Operator
And our next question will be coming from Karl Ackerman of BNP.
Karl Ackerman - Research Analyst
I have 2 questions. I guess, first off, it's great to see the sequential improvement in your fiscal Q1, but I didn't hear you confirm your fiscal '24 revenue outlook from 90 days ago. And I guess -- could you just speak to the visibility you have on your existing programs that gives you confidence in the sequential growth that you spoke about throughout fiscal '24? Could you just touch on that? That would be helpful.
Daniel Fleming - CFO
So yes, generally speaking, we -- as we've described, we see some near-term upside, but we still remain a bit cautiously optimistic about the back half of the year. But we're -- so we're very comfortable ultimately with the current estimates for the back half.
We do have certainly increasing visibility as time passes, and we hope to provide meaningful updates in -- over the next upcoming quarters. But we're working hard to expand these growth opportunities for FY '24 and beyond, and we remain very encouraged with what we're seeing, especially with the acceleration of AI programs.
Karl Ackerman - Research Analyst
I guess as a follow-up, of the DSP opportunity that you've highlighted in the prepared remarks, are you seeing our design engagements primarily in fiscal '24 on coherent offerings or are you seeing more opportunities in DCI for your 400-gig and 800-gig opportunities?
William J. Brennan - President, CEO & Director
Yes, so the opportunities that we're seeing -- the large opportunities that we're seeing are really within the data center. And I can say that it's across the board, 200-gig, 400-gig and 800-gig, all of these hyperscalers have different strategies as to how they're deploying optical. I think we continue to make progress with 200 and 400, and I think we're in a really good position from a time-to-market perspective on 800-gig.
And so, we can talk about the cycles that we're spending with every hyperscaler. We're also aligning ourselves very closely in a strategic go-to-market strategy with select optical module companies. And we think that as it relates to DCI and coherent specifically, we're in development for that first solution that we're pursuing, which is 100-gig ZR. And we feel like that development will take place throughout this year and that we'll see first revenue in the second half of calendar '24. But as far as 400-gig, that would really be a second follow-on type of DCI opportunity for us.
Now in the ZR space, we're going to be unique because we'll market and sell the DSP to optical module makers. And so, we intend to engage 3 to 4 module makers in addition to our partner, EFFECT Photonics, and that makes us somewhat unique in the sense that other competitors are going directly to market with the ZR module.
And I highlight power is really an enabler here. And the key thing is we can do 100-gig ZR module and fit under the power ceiling for a standard QSFP connector, which is roughly 4.5 watts. So there's a large upgrade cycle from 10-gig modules that we'll enable, but also there's new deployments in addition. So that kind of gives you a little bit of flavor about the coherent, but I really see our opportunities more within the data center.
Operator
And our next question will be coming from Vivek Arya of Bank of America.
Vivek Arya - MD in Equity Research & Research Analyst
Bill, I'm curious to get your perspective on some of these technology changes. One is the role of InfiniBand that's getting more share in these AI clusters. What does that do to your AEC opportunity? Is that a competitive situation? Is that a complementary situation?
And then the other technology question is some of your customers and partners have spoken about their desire to consider co-packaged optics and linear direct drive type of architectures. What does that do, right, to the need for stand-alone pluggables?
William J. Brennan - President, CEO & Director
I appreciate the opportunity to talk about Ethernet versus InfiniBand, because there's been a lot said about that. Generally, we see coexistence. I think depending on how you look at the market forecast information, there is a point soon in the future with Ethernet exceeds InfiniBand for AI specifically. Beyond AI, I think it's game over already.
Whether you measure the TAM in ports or dollars, Ethernet is forecasted to far exceed InfiniBand in the out years, so calendar '25 and beyond. And so, if we think about -- from an absolute TAM perspective, forecasters are showing Ethernet dollars perspective. They're showing that Ethernet surpasses InfiniBand by 2025. And so, the forecast show a CAGR for Ethernet of greater than 50%, while InfiniBand, they're showing a CAGR of less than 25%.
And so you can also look at this from a port cost perspective where InfiniBand is 2 to 4x the ASP per port compared to Ethernet, depending on who you talk to. And so in a sense, it's no secret that -- the world will continue to do what the world does, they'll pursue cost-effective decisions. And we think from a technology standpoint, they're very similar.
So if you think from a cost perspective, if you look apples-to-apples, if you think that an InfiniBand port is 2 to 4x the cost of an Ethernet port, in a sense, you could justify that 1 to 3 of those ports of Ethernet are free in comparison to InfiniBand. So, I think that our position here is that we really believe that Ethernet is going to prevail. We're working on so many AI programs. Every single one of them is Ethernet.
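The "free ports" framing is just the port-cost ratio restated; a minimal sketch, assuming the 2x-4x ASP range quoted above:

```python
# If an InfiniBand port carries 2x-4x the ASP of an Ethernet port, one InfiniBand
# port buys 2-4 Ethernet ports, i.e. 1-3 of them are effectively "free".
for multiple in (2, 3, 4):
    print(f"InfiniBand at {multiple}x ASP per port -> "
          f"{multiple - 1} 'free' Ethernet port(s) per InfiniBand port")
```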
Vivek Arya - MD in Equity Research & Research Analyst
And then, Bill, anything on the move by some of your customers to think about co-packaged optics and direct drive? And while I'm it, maybe let me just ask Dan -- have a follow-up on the fiscal '24. I think Dan, you suggested you are comfortable with where I think expectations are right now. That still implies a very steep ramp into the back half. So I'm just trying to square the confidence in the full year with some of -- just kind of the macro caution that came through in your comments?
Daniel Fleming - CFO
Yes, we are confident in how we have guided. And as I mentioned, we're very comfortable with the current estimates. If we look at FY '24, as you alluded to, Vivek, there's -- we see strong sequential top line growth throughout the year in order to get -- to achieve those numbers. And it's kind of well documented, the -- what's happened at Microsoft to us for this fiscal year.
So if we exclude Microsoft, what that means is we have in excess of 100% year-on-year growth of other product revenue from other customers, which, again, we're very confident based on all of the traction that we've seen recently that we'll be able to achieve that. And of course, I'll just reiterate, one of the key drivers is AI in some of those programs. So hopefully, that gives you some additional color on our confidence for FY '24.
William J. Brennan - President, CEO & Director
Yes, regarding your question about the linear direct drive, that was, I think, this year's shiny object at OFC. I do think that the idea that -- it's really -- the idea is really to -- how to address the power challenges, basically move away from the optical DSP. I think that this is not a new idea. There was a big move towards this linear direct drive in the 10-gig space when that was emerging. And I think -- the fact that there is really none in existence, I think that DSP [has] chosen that and it was really critical to close the system.
Our feeling is that, I think we'll see much of the same this year. I think [Marvell] did a great job in kind of setting expectations correctly. They did a long session right after OFC that I think addressed it quite well. I think you'll see small deployments where every link in the system is very, very controlled. But these are typically very, very small in terms of the overall TAM.
Now we're fully signed up. If the real goal is power, that's exactly where we play. So we're fully signed up to looking at unique approaches in the future to be able to offer compelling things from a power perspective. And it's not like I'm completely dismissing the concept that was really behind the idea of linear direct drive. We're actually viewing that as a potential opportunity for us in solving the problem differently. But generally speaking, I don't think you'll see in the future a world where linear direct drive is measured in any kind of significant way.
It's not to say that people aren't spending money trying to prove it out right now. That is happening. And regarding CPO, I think that was -- yes, like that was kind of a -- that was something that was talked about for many, many years prior. And I think also on that, you'll see smaller deployments, if that's ultimately something that some customers embrace. But I don't think you'll see it in a big way. That's simply not what the customer base is looking for.
Operator
And our next question will be coming from David Wu of Mizuho.
David Wu
This is David on for Vijay, Mizuho. My first question is, assuming that in fiscal '25, data [SerDes] demand for general compute improves and you see the continued new AI ramps, can you provide any more color on the puts and takes there and the type of operating leverage you can improve?
Daniel Fleming - CFO
Well, we're not giving specific guidance yet for fiscal '25. But you're right in that the ingredients certainly exist for operating leverage. We should exit FY '24 with pretty robust operating leverage and that we would expect based on what we know now to carry forward into FY '25. But we haven't framed yet, of course, what that's going to ultimately look like.
David Wu
And I guess for my second question, when you're talking with hyperscalers on these new AI applications, how important is sort of your TCO advantage when they're exploring your solution? Or are they currently kind of just primarily focused on time-to-market and maximum performance and just getting their AI deployments out there?
William J. Brennan - President, CEO & Director
So I just want to make sure you said total cost of ownership?
所以我只是想確認一下,你說的是總擁有成本?
David Wu
David Wu
Yes.
是的。
William J. Brennan - President, CEO & Director
William J. Brennan - President, CEO & Director
Yes, it's - I think it's hands down in favor of AEC. So if we look at 100-gig lane rates, I think the conclusion throughout the market is that there's 2 ways to deploy short cable solutions. It's really AEC or AOC. If we look at it from a CapEx standpoint, AECs are about half the cost. If we look at -- from an OpEx standpoint, also about half the cost, about half the power, half the ASP for an apples-to-apples type solution. So I think the TCO benefit is significant.
是的,它是 - 我認為這是有利於 AEC 的。因此,如果我們查看 100 千兆通道速率,我認為整個市場的結論是有兩種部署短電纜解決方案的方法。真的是AEC或者AOC。如果我們從資本支出的角度來看,AEC 大約是成本的一半。如果我們看——從 OpEx 的角度來看,同樣是同類解決方案的一半成本、一半功率、一半 ASP。所以我認為 TCO 的好處是顯著的。
The other thing you've got to consider is that, especially when you're down in server racks, these are different than switch racks in a sense that having a failure with your cable solution is -- it becomes a very urgent item. And so, when we think about AOCs, the reliability in terms of number of years, it's probably anywhere from 1/3 to 1/10. The AECs that we sell are -- we talked about a 10-year product life.
另一件你必須考慮的事情是,特別是當你在服務器機架中時,這些與交換機機架不同,因為你的電纜解決方案出現故障是 - 它成為一個非常緊急的項目。因此,當我們考慮 AOC 時,就年數而言的可靠性可能在 1/3 到 1/10 之間。我們銷售的 AEC 是——我們談到了 10 年的產品壽命。
And so, it kind of matches or exceeds the life of the rack that is being deployed, and the same cannot be said for any kind of optical solution. So I think across the board, it's -- hands down, the TCO is much more favorable for AEC.
因此,它有點匹配或超過正在部署的機架的壽命,但對於任何一種光學解決方案來說都不是這樣。所以我認為總體而言,它 - 毫無疑問,TCO 對 AEC 更有利。
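(Editor's note: as a rough aid to the TCO argument above, here is a minimal back-of-the-envelope sketch assuming hypothetical prices, power draws, energy rates, and service lives. None of these figures are Credo-disclosed numbers; the only relationships borrowed from the commentary are that an AEC is roughly half the cost and half the power of a comparable AOC and lasts long enough to cover a roughly 10-year rack life, while an AOC may need replacement during that life.)

```python
# Illustrative per-cable-position TCO comparison for AEC vs. AOC over one rack
# lifetime. Every number below is a hypothetical placeholder; only the rough 2x
# cost/power relationship and the longer AEC service life come from the call.

import math

def cable_tco(unit_cost_usd, power_w, service_years,
              rack_life_years=10, energy_usd_per_kwh=0.10,
              replacement_labor_usd=50):
    """Rough total cost of ownership of one cable position over a rack's life."""
    units_needed = math.ceil(rack_life_years / service_years)
    capex = units_needed * unit_cost_usd
    replacement_cost = (units_needed - 1) * replacement_labor_usd
    energy_cost = (power_w / 1000) * 24 * 365 * rack_life_years * energy_usd_per_kwh
    return capex + replacement_cost + energy_cost

aec_tco = cable_tco(unit_cost_usd=100, power_w=5, service_years=10)
aoc_tco = cable_tco(unit_cost_usd=200, power_w=10, service_years=3)
print(f"AEC ~ ${aec_tco:,.0f} vs AOC ~ ${aoc_tco:,.0f} per cable position")
```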
Operator
Operator
And our next question will be coming from Quinn Bolton of Needham & Company.
我們的下一個問題將來自 Needham & Company 的 Quinn Bolton。
Nathaniel Quinn Bolton - Senior Analyst
Nathaniel Quinn Bolton - Senior Analyst
Quick 2 follow-ups. One, Dan, was there any contra revenue in the April quarter?
兩個簡短的後續問題。第一,Dan,四月這個季度是否有任何抵減收入(contra revenue)?
Daniel Fleming - CFO
Daniel Fleming - CFO
That's an excellent question, Quinn. I'm glad you caught that. Actually, there was, and you will see that when we file our 10-K. In the past, you've been able to see that in our press release, in our GAAP to non-GAAP reconciliation. But from Q4 going forward, we're no longer excluding that contra revenue from our non-GAAP financials. And this really came about through a comment from the SEC, not singling out Credo, but actually all of Amazon's suppliers that have a warrant -- or rather, with whom Amazon holds a warrant.
這是一個很好的問題,Quinn。我很高興你注意到了這一點。實際上確實有,當我們提交 10-K 時,您會看到。過去,您可以在我們的新聞稿中、在 GAAP 與非 GAAP 的對賬中看到它。但從第四季度起,我們不再把這筆抵減收入從非 GAAP 財務數據中剔除。這其實源於美國證券交易委員會的一項意見,並非針對 Credo,而是針對所有亞馬遜對其持有認股權證的供應商。
So the positive thing about this change is that you'll still be able to track what that warrant expense ultimately is, but only when we file our 10-Q. And looking historically, it doesn't really make a reporting difference on a non-GAAP basis. The difference was not material. And it just makes the calculation a little more straightforward going forward. Our only non-GAAP reconciling item going forward, at least for the foreseeable future, is really just share-based compensation.
因此,這一變化的好處在於,您最終仍然能夠追蹤那筆認股權證費用是多少,只不過要等到我們提交 10-Q 時才能看到。從歷史數據來看,這在非 GAAP 基礎上並沒有造成實質的報告差異。差異並不重大。而且這會讓今後的計算更加簡單一些。展望未來,至少在可預見的將來,我們唯一的非 GAAP 調節項目實際上就只有股份支付薪酬。
Nathaniel Quinn Bolton - Senior Analyst
Nathaniel Quinn Bolton - Senior Analyst
So the revenue doesn't change. You just won't be making the adjustment for the contra revenue in the non-GAAP gross margin calculation going forward?
所以收入沒有變化。只是今後你們在計算非 GAAP 毛利率時,不會再對抵減收入進行調整?
Daniel Fleming - CFO
Daniel Fleming - CFO
That's exactly correct. Yes. Revenue is still revenue. It has a portion of it, which is contra-revenue, which obviously brings down the revenue a little bit.
完全正確。是的。收入還是收入。其中有一部分是抵減收入,這顯然會讓收入略微降低。
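(Editor's note: to make the reporting change concrete, here is a minimal sketch of the reconciliation mechanics described above. All dollar amounts are invented placeholders, not reported figures; the structure simply reflects that contra revenue from the customer warrant used to be added back when deriving non-GAAP revenue, whereas from Q4 onward reported revenue is used as-is and share-based compensation remains the main reconciling item.)

```python
# Hypothetical figures purely to illustrate the GAAP-to-non-GAAP reconciliation
# mechanics described above; none of these are reported amounts.

gaap_revenue = 100.0    # $M, already reduced by contra revenue
contra_revenue = 1.0    # $M, warrant-related reduction embedded in revenue
share_based_comp = 8.0  # $M, excluded from non-GAAP operating expenses

# Prior presentation: contra revenue was added back for non-GAAP revenue.
non_gaap_revenue_old = gaap_revenue + contra_revenue   # 101.0

# New presentation (Q4 onward): revenue is left as reported for non-GAAP
# purposes; share-based compensation stays as the reconciling item below it.
non_gaap_revenue_new = gaap_revenue                    # 100.0

print(non_gaap_revenue_old, non_gaap_revenue_new, share_based_comp)
```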
Nathaniel Quinn Bolton - Senior Analyst
Nathaniel Quinn Bolton - Senior Analyst
And then for Bill, would you expect in fiscal '24 a meaningful initial ramp of the 200 or 400-gig optical DSPs? Or would you continue to encourage investors to sort of think that the optical DSP ramp is really beyond a fiscal '24 event at this point?
然後想請問 Bill,您是否預期 200G 或 400G 光學 DSP 會在 24 財年出現有意義的初步放量?還是說,您會繼續引導投資者認為,光學 DSP 的放量目前看來要等到 24 財年之後?
William J. Brennan - President, CEO & Director
William J. Brennan - President, CEO & Director
I think that when we think about significant -- we think about crossing the 10% of revenue threshold, and we don't see that until fiscal '25. We do see signs of life in China. And as I said, we're shipping 400-gig optical DSPs to a U.S. hyperscaler now. My expectation is throughout the year we're going to have a lot more success stories to talk about, but those ramps will most likely not take place within the next 3 quarters. So it's really a fiscal '25 target at this point.
我認為,當我們談「顯著」時,指的是跨過佔收入 10% 的門檻,而我們預計要到 25 財年才會達到。我們確實在中國看到了一些復甦跡象。正如我所說,我們現在正在向一家美國超大規模廠商出貨 400G 光學 DSP。我的預期是,全年我們會有更多成功案例可以分享,但這些放量很可能不會在接下來的 3 個季度內發生。所以就目前而言,這確實是一個 25 財年的目標。
Nathaniel Quinn Bolton - Senior Analyst
Nathaniel Quinn Bolton - Senior Analyst
But it starts this year, it's just you're not...
但它從今年開始,只是你不是......
William J. Brennan - President, CEO & Director
William J. Brennan - President, CEO & Director
Yes.
是的。
Nathaniel Quinn Bolton - Senior Analyst
Nathaniel Quinn Bolton - Senior Analyst
going to call it meaningful because it doesn't hit the 10% threshold?
只是因為它沒有達到 10% 的門檻,你們不會稱其為有意義?
William J. Brennan - President, CEO & Director
William J. Brennan - President, CEO & Director
Right, exactly.
沒錯,沒錯。
Operator
Operator
And our next question will be coming from Tore Svanberg of Stifel.
我們的下一個問題將來自 Stifel 的 Tore Svanberg。
Tore Egil Svanberg - MD
Tore Egil Svanberg - MD
Bill, maybe a follow-up to the previous question about 200 or 400-gig. I was a little bit more curious about 800-gig. Are you seeing any changes at all to the timelines there? I think the expectation was that the 800-gig market would maybe take off in the second half of next calendar year. But with all these new AI trends, just wondering if you're seeing any pull-in activity there or maybe even seeing some cannibalization versus 200-gig and 400-gig?
Bill,這算是對上一個關於 200G 或 400G 問題的跟進。我更好奇的是 800G。您是否看到那邊的時間表有任何變化?我認為原先的預期是 800G 市場可能會在下一個日曆年的下半年起飛。但在這些新的 AI 趨勢下,想了解您是否看到任何提前拉動的跡象,或者甚至看到對 200G 和 400G 的一些蠶食?
William J. Brennan - President, CEO & Director
William J. Brennan - President, CEO & Director
I think that -- my expectation is that this is really a calendar year '24 type of market takeoff. And whether it's the second half or first half, we, of course, would like to see it in the first half, given that that would imply success in pulling in AI programs. And so there's a lot of benefit that comes with 800-gig modules and the implications that has for our AEC business.
我認為,我的預期是,這個市場真正起飛會是在 2024 日曆年。無論是下半年還是上半年,我們當然希望在上半年看到,因為那意味著 AI 項目被成功提前拉動。因此,800G 模塊會帶來很多好處,也會對我們的 AEC 業務產生相應的影響。
But I definitely see it kind of in that time frame. I don't really see it as cannibalization of the 200 and 400-gig -- not unless you look at these new deployments as being in lieu of the old technology. But like I said before, every hyperscaler has their own strategy related to the port size that they plan on deploying. Everybody's got a unique architecture.
但我確實認為它大致會在那個時間範圍內發生。我並不真的認為這是對 200G 和 400G 的蠶食,除非您把這些新部署視為取代舊技術。但正如我之前所說,每家超大規模廠商對於計劃部署的端口規格都有自己的策略。每家的架構都是獨特的。
And where we see optical is typically in the Leaf-Spine network for anything above the ToR. In AI, I think the real opportunity is going to be with AOCs. And that, I think, is going to be a very large 800-gig market when those AI clusters really begin deployment, which again, I think it -- could be in calendar '24. So I appreciate the question, though.
我們看到光學方案通常用在 ToR 以上的葉脊(Leaf-Spine)網絡中。在 AI 領域,我認為真正的機會在於 AOC。而且我認為,當這些 AI 集群真正開始部署時,那將是一個非常大的 800G 市場,我認為這可能會發生在 2024 日曆年。不過,還是很感謝這個問題。
Operator
Operator
That concludes the Q&A for today. I would like to turn the call back over to Bill Brennan for closing remarks. Please go ahead.
今天的問答到此結束。我想將電話轉回 Bill Brennan 以作結束語。請繼續。
William J. Brennan - President, CEO & Director
William J. Brennan - President, CEO & Director
We really appreciate the participation today, and we look forward to following up on the call backs. So thanks very much.
我們非常感謝今天的參與,我們期待跟進回電。非常感謝。
Operator
Operator
This concludes today's conference call. Thank you all for joining, and everyone, enjoy the rest of your evening.
今天的電話會議到此結束。感謝大家的加入,祝大家度過餘下的夜晚。