Xiao-I Corp (AIXI) 2023 Q2 Earnings Conference Call Transcript

Full transcript

Usage note: The Chinese translation was produced by Google Translate and is for reference only; please rely on the English original for the actual content.

  • Operator

  • Good day, ladies and gentlemen. Thank you for standing by and we warmly welcome you all to Xiao-I First Half 2023 Earnings Conference call. (Operator Instructions). As a reminder, we are recording today's call. If you have any objections, you may disconnect at this time. Now, I'll turn the call over to Berry Xia, IR Director of XIAO-I. Berry, please proceed.

  • Berry Xia - IR Director

  • Thank you, operator, and hello everyone. Welcome to Xiao-I First Half 2023 Earnings Conference call. Joining us today are Mr. Max Yuan, Chief Executive Officer, and Mrs. Kelly Weng, Chief Financial Officer.

  • We announced our first half 2023 unaudited and unreviewed financial results earlier today. The press release is available on the company's IR website, as well as from Newswire Services. A replay of this call will also be available in a few hours on our IR website.

  • During this call, we will discuss our business outlook and make forward-looking statements. Please note that these comments are made under the Safe Harbor Provisions of the U.S. Private Securities Litigation Reform Act of 1995.

  • Also, the forward-looking statements are based on predictions and expectations as of today. Actual events or results could differ materially due to a number of risks and uncertainties, including those mentioned in our filings with the SEC. The company does not assume any obligation to update any forward-looking statement, except as required under applicable law.

  • In addition, the financial statements for the six months ended June 30, 2023 have not been audited or reviewed by the company's independent registered public accounting firm. The financial statements for the six months ended June 30, 2023 to be disclosed in the company's Form 6-K may differ from the above-mentioned unaudited and unreviewed financial statements. Also, please note that unless otherwise stated, all figures mentioned during the conference call are in U.S. dollars. With that, let me now turn the call over to our CEO, Max Yuan. Please go ahead, Max.

  • Hui Yuan - Executive Chairman of the Board, Chief Executive Officer

  • Thanks, Berry. Good day, everyone, and thank you for joining us today. The first half of 2023 was fantastic for Xiao-I and the broader AI industry. I'd even go as far as to say it's been the best on every level. Generative AI, such as large language models, is truly changing the game and triggering a new technological revolution. It shifts AI from just seeing and hearing to truly understanding. This leap has the potential to reshape countless industries, making operations smoother and more cost-effective. It could eventually change the economy and disrupt the industrial ecology.

  • Against this backdrop, I am glad to share that our first six months of 2023 saw impressive growth of 106% in our net revenues year-over-year. Revenues also hit an all-time high and a big shoutout to our cloud platform sales for fueling this boost. We didn't stop there. Our passion for innovation made us up our game in AI research and development with new record-breaking investments. This was showcased by our launch of Hua Zang, our very own large language model. On top of this, we have taken a step toward global expansion, establishing our U.S. subsidiary.

  • Next, let's dive into our focus on innovation. Our R&D spending grew 608% to USD 29.6 million in the first half of this year. To put it into perspective, this has already surpassed our entire R&D investment for all of 2022. The spending was mainly used for the purchase of computing power-related resources for the large language model. This commitment arises from our overarching objective: to pioneer the AI evolution and fortify our position at the forefront of the Global AI industry. As a testament to our ongoing R&D focus, as of July, we have secured more than 300 authorized patents. This cements our reputation as a leading innovator and creator of advanced AI technologies.

  • Now, let's move to something I'm quite excited about. Back in June, we introduced one of our coolest innovations, Hua Zang, our advanced large language model. We have a clear positioning: we'd like to develop it as an operating system for the AI era. Why? The large language model applies AI capabilities to thousands of industries. It empowers the core scenarios of production and life and greatly reduces the threshold for using AI. It's positioned to be the very backbone, the operating system if you will, of AI application development. This is a tool that can make AI models more maintainable, scalable, and iterative. Think of it like this: Just as Windows was vital in the PC era, and iOS and Android rocked the mobile Internet era, Hua Zang aims to be that game-changer for the AI era, with the ability to build a platform ecosystem.

  • Now, I know what you might be wondering: "How does an operating system fit organically into various business landscapes?" Here is the thing. We have spent over two decades collaborating with thousands of enterprises across hundreds of diverse application scenarios. This has been instrumental in commercializing Hua Zang. This extensive experience has afforded us a granular understanding of market dynamics and customer pain points. It further bolsters our strong ability and technologies to adapt the model to various industries, empowering them to transcend their traditional bounds. This understanding, ability, and technology drive Hua Zang's key competence and advantages that significantly distinguish it from other models.

  • With that, Hua Zang's key features are its controllable, customizable, and deliverable capabilities. That means based on clients' business systems, documents and demands, we can control output and content to meet the requirements of generated content, data security, model algorithm design, and operational standards. We can also tailor for multiple platforms and languages. Therefore, whether you are in healthcare, finance, E-commerce, or education, this model's got your back. Its adaptability is unparalleled.

  • Our approach was strategic when we developed Hua Zang. By focusing intently on customer pain points, we have significantly improved the return from our model training investments. In fact, our investment in Hua Zang was channeled towards addressing genuine customer challenges and exploring realistic avenues for commercial integration. It wasn't just about creating something innovative. It had to be cost-effective, too. And let me tell you, Hua Zang checks both those boxes. It can connect to business systems in a quicker and more efficient way with low computing deployment and training costs.

  • As we transition into the next phase, it's not just about having a large language model, but how quickly and effectively we can translate this technology into real-world business applications. The true measure of success now is in our implementation. We are currently in discussions with several potential partners in the finance, intelligent customer services, and new materials sectors to jointly innovate models tailored to their industries. And here is a heads-up: We've got a press conference lined up in the fourth quarter to spill more beans on Hua Zang's journey. So stick around for that!

  • Moving on to another key strategic focus - globalization. We've made bigger strides, especially with the birth of Xiao-I Plus Inc. in the U.S. It crystallizes our commitment to a global footprint. More than expansion, it represents our commitment to delivering AI solutions to our North American customer base, marking the beginning of a new chapter in our story. Our recent presentation at the Ai4 2023 Exhibition in Las Vegas was also a resounding affirmation of our intent and gained meaningful recognition by interacting with potential local partners and showcasing our innovations.

  • Domestically, we continued to enhance our client base with a contract from a leading Chinese aviation group. This is for an intelligent contact center project that aims to improve the aviation group's service capabilities and drive its digital transformation.

  • In summary, the milestones we've achieved in the first half of 2023 are a testament to our focus on driving growth, pioneering innovation, and empowering our customers. Looking ahead into the second half of the year, we will continue to invest in R&D and strengthen our product capabilities. Collaborative efforts with our partners and customers to integrate our large language model will be a central part of our approach. Our long-term vision remains clear: to lead the AI evolution both in China and globally while delivering value to customers worldwide.

  • Now, let me turn the call over to our CFO, Kelly, to go over our financials.

  • Wei Weng - Chief Financial Officer

  • Thank you, Max, and welcome to everyone on the call. Before I go into the numbers, please note that all numbers presented are in U.S. dollars and all comparisons are on a year-over-year basis unless otherwise stated. For a more comprehensive breakdown, please refer to our earnings press release.

  • Driven by strong digital transformation needs in enterprises, we've hit a record top line for the first half. We are talking about a 106% growth from the previous year. Not just that, this robust performance also translated into a 620-basis-point expansion, bringing our gross margin to an impressive 77.3%. And here's the cherry on top: As pioneers at the frontier of cognitive AI, we've unveiled our own large language model. This innovative move has propelled our R&D investments to unprecedented heights, underscoring our unwavering dedication to harness and spearhead the transformative power of core AI technology. With that, we expect to capitalize on vast opportunities in the AI realm.

  • Breaking it down further, our net revenues for the first half surged to USD 26.5 million, up from USD 12.9 million for the same period in 2022. The growth was primarily driven by robust sales of our cloud platform products, which grew 523%. These numbers helped offset the declines in other areas, led by a continued shift towards subscription-based cloud platform products from one-time software purchases. This shift is proving to be beneficial as it provides more predictable revenue streams and consistent cash flow, all while building stronger bonds with our customers. Keep in mind, some of our revenue channels, like technology development services and software purchases, typically shine in the last quarter of the year.

  • On the profits front, we hit USD 20.5 million in gross profit, marking a 124% increase from USD 9.1 million a year earlier. Moreover, our gross margin expansion unfolded as planned, standing at 77.3%, up from 71.1% in the previous period. This improvement was mainly attributed to our product mix shift towards subscription-based cloud platform products.
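
As a quick arithmetic check of the figures above, here is a minimal sketch using only the rounded USD-million amounts quoted on the call; the small gaps versus the reported percentages come from that rounding:

```python
# Sanity check of the first-half figures quoted on the call (rounded to USD millions).
revenue_1h23, revenue_1h22 = 26.5, 12.9            # net revenues
gross_profit_1h23, gross_profit_1h22 = 20.5, 9.1   # gross profit

revenue_growth = revenue_1h23 / revenue_1h22 - 1        # ~105%, reported as 106%
gross_margin_1h23 = gross_profit_1h23 / revenue_1h23    # ~77.4%, reported as 77.3%
gross_margin_1h22 = gross_profit_1h22 / revenue_1h22    # ~70.5%, reported as 71.1% (rounding)

# Margin expansion from the reported percentages: 77.3% - 71.1% = 6.2 pp = 620 basis points.
margin_expansion_bps = (0.773 - 0.711) * 10_000

print(f"revenue growth ~{revenue_growth:.1%}, "
      f"gross margin ~{gross_margin_1h23:.1%} vs ~{gross_margin_1h22:.1%}, "
      f"expansion ~{margin_expansion_bps:.0f} bps")
```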

  • Now, moving to our operating expenses. Our total operating expenses were USD 34.1 million, up 355% from USD 7.5 million in the same period last year. On the bright side, selling, general, and administrative expenses have become more efficient with greater business scale, with the expense-to-revenue ratio dropping 13 percentage points from a year ago. Still, the majority of our operating expense increase was driven by a 708% surge in research and development expenses, which reached USD 29.6 million. This reflects our focused investment in the advanced large language model. We expect our latest release, Hua Zang, will play a pivotal role in our long-term growth as our co-creation model matures.

  • As a result, our operating loss for the first half was USD 13.6 million, compared with an operating income of USD 1.6 million a year earlier. Net loss was USD 18.8 million, compared to a net income of USD 0.6 million in the first half of 2022. However, it's important to note that if we exclude the incremental R&D expenses, we would have achieved profitability and even significant growth.

  • Moreover, our balance sheet remains healthy. Cash balance more than quadrupled to USD 4.7 million from USD 1.0 million as of December 31, 2022. This improved cash position provides us with a stable footing as we move forward.

  • Looking ahead, we remain committed to managing costs and enhancing efficiency as we focus on allocating resources strategically, particularly in the pivotal area of AI Technology R&D. We believe this approach will position us well for continued growth and success.

  • Thank you for your attention. We will now open the floor for questions. Operator, please go ahead.

  • Berry Xia - IR Director

  • Well, before we answer the first question, I would like to make a correction to our previous statement. Please kindly be aware that our R&D spending grew 708% to USD 29.6 million in the first half of this year. It was not 608%; apologies for the slip of the tongue. Operator, please go ahead. Thank you.

  • Operator

  • (Operator Instructions). Please ask your question in Chinese first, and repeat your question in English immediately after. (Operator Instructions). Our first question comes from the line of Kai Wang from Haitong International. Please ask your question, Kai.

  • Kai Wang - Analyst

  • (Foreign Language) This year, large language models, especially ChatGPT, have taken the development of chatbots to a new level. But from a business view, would you say that chatbots have really generated significant revenues in the past few years? So how do you see the future of large language models, and for commercial applications, which scenario do you think has the most potential to succeed? Thank you.

  • Hui Yuan - Executive Chairman of the Board, Chief Executive Officer

  • (Translated) Let me translate first. We have a very firm conclusion: I'm very optimistic about the prospects of large language models.

  • The large language model is a powerful example of cognitive intelligence in action. As you know, conversational AI is not a new concept, but it is a fundamental application of cognitive intelligence. Over the years, cognitive intelligence has advanced significantly, moving from rule-based computing to expert systems, then deep learning, and now the large language model. So, the large language model represents a major milestone in the world of cognitive intelligence applications.

  • The true potential of today's large language models lies in their ability to simultaneously drive improvements, upgrades, and cost reductions across all industries. In the near future, I'm very confident that we will see thousands of applications harnessing the power of large language models. This could involve enhancing existing applications like chatbots and call centers with large language model technology, or even creating entirely new industries and applications that were previously unimaginable. Traditional sectors like real estate, mining, agriculture, and more can all benefit significantly from the empowerment of AI.

  • And when it comes to the operating system, I want to emphasize that the large language model has the potential to become the next-generation operating system.

  • An operating system is a fundamental component of any computer system. It provides the necessary infrastructure for efficient human-computer interaction, universality, and an application ecosystem. As we look towards the future of human-computer interaction, it's worth considering the progress we've made over the past two decades.

  • Back when Xiao-I first introduced chatbots, we couldn't have predicted the widespread adoption of conversational interfaces in the form of Siri, Alexa, or even Cortana. The reason for this shift toward conversation is simple: language is how we, as humans, communicate and express ourselves.

  • So conversation is a natural and intuitive way to convey even the most complex thoughts and emotions. This is why conversational human-computer interaction represents an ideal and significant gateway for us to explore. And with the emergence of large language models, we can phase in a completely new iteration in the world of human-computer interaction.

  • So, the defining features of generative AI are universality and versatility. The large language model is the embodiment of this revolutionary technology. By building on the fundamental principles of AI, large language models have made a remarkable transformation, from being quantity-based to possessing the abilities of generalization and generation, creative thinking, and encompassing the core elements of human intelligence.

  • So this is the dawn of GAI, making a significant leap from perceiving to comprehending and from quantitative changes to qualitative ones. So, the large language model has the potential to simultaneously empower thousands of industries, significantly reducing costs and improving efficiency across a variety of industries, and reshaping the socio-economic landscape and industrial ecology. So, this is a testament to their robust universality and versatility.

  • So the infrastructure quality and innovative interaction model of large language models have made them the ideal foundation for AI application development. They can facilitate maintainability, scalability, and iterative improvements for AI models. In the past, we had Windows in the PC era and iOS and Android in the mobile Internet era. Today, large language models possess the capability to construct a comprehensive ecosystem. That will be the answer to your question. Thank you.

  • Operator

  • Thank you. Our next question comes from the line of Zijie Han from Cinda Securities. Please ask your question, Zijie.

  • Zijie Han - Analyst

  • (Foreign Language) Here is the English. We know that the generative AI ChatGPT is definitely shaking things up in the AI world, giving us lots to think about when it comes to the real-world applications. Microsoft's CEO, Satya Nadella said in an interview that we are not just looking at linear improvements but exponential growth. GPT-3.5 is already showing it's more capable than the previous version GPT-3.

  • So, what's your take on how ChatGPT is evolving? Could it evolve into an artificial superintelligence like the Matrix in The Matrix series, or Skynet in The Terminator? Thank you.

  • Hui Yuan - Executive Chairman of the Board, Chief Executive Officer

  • (Translated) So the bottom line is, ChatGPT is getting closer to AGI capabilities, but we are nowhere near the singularity yet. When we talk about a singularity, we are referring to AI hitting a critical point where it surpasses human intelligence and causes seismic shifts in, well, everything.

  • So right now, ChatGPT is still built on a tech framework that's been around for the last 60 years. So basically, the recent leaps we've made are due to stacking up more computing power and data. It's like brute-force artistry in the AI world. But let's get real. Even if we threw all the computing power and data in the world at ChatGPT, it's not going to turn into a god. As long as we stick with the same old tech approach, the singularity isn't happening.

  • So the second point is the underlying tech of large language models. This is all about probability: solving equations based on given parameters and filling in blanks with the info we feed it. These models follow what's known as Scaling Laws, which means as the model gets bigger, the dataset grows, and we ramp up training, performance improves.

  • So in order to get the best performance, we have to step up all three aspects together. But it's not just about having more data or larger models. It's like a game of diminishing returns: pump more resources into it and you'll actually see less bang for your buck.
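
For reference, the scaling laws mentioned here are commonly written in the published scaling-laws literature (this form is not a formula given on the call) as a sum of power laws in model size and data size, which makes the diminishing returns explicit:

$$
L(N, D) \approx E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
$$

where $N$ is the number of model parameters, $D$ is the amount of training data, $E$ is an irreducible loss floor, and the fitted exponents $\alpha, \beta$ are well below 1, so each further increase in model size, data, or training compute buys a smaller absolute reduction in loss.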

  • So every generation of tech has its limit, right? What's important now is to get these large language models into real-world applications as soon as possible. So, the big difference between today's models and older AI tech is the scope.

  • Old school AI was niche. It could only disrupt certain industries. But now, with large language models leading the way, AI has the power to drive progress across the board. Think of these models as the new operating system for the AI age. Every app you're used to, whether news, social media, gaming, or e-commerce, you name it, is going to get redefined as the operating system changes for the AI age. So, the backbone is fundamentally different.

  • As such, this brings us an unprecedented level of structural and strategic opportunity from the development of the AI industry. So, let's grasp this game-changing strategic opportunity, which we've never seen before. It's brought by AI and it's disruptive, transformative, and packed with potential.

  • Just like our Hua Zang large language model is empowering a variety of industries, driving commercial applications and broader ecosystem development. It's leading to large shifts economically, technically and industrially. So, that will be all for your answer. Thank you for your question.

  • Operator

  • Thank you. Our next question comes from the line of Jessie Jia from Soochow Securities. Please ask your question, Jessie.

  • Jessie Jia - Analyst

  • (Foreign Language) Since April 2023, a battle among hundreds of large AI models has officially begun, with multiple companies launching their own large language models and organizing various activities to demonstrate their powerful capabilities. In this context, how can Xiao-I ensure that the company's large language model products can win? What are the advantages of your products, and how is the current business progress of your large language model? That's all. Thank you.

  • Hui Yuan - Executive Chairman of the Board, Chief Executive Officer

  • (Translated) So, one question and then the others. Here's the thing that makes our product unique. When we introduced Hua Zang back in June, it was one of our coolest innovations, an advanced large language model with a clear positioning. We want to develop it as the operating system for the AI era, and why not?

  • Hua Zang applies AI capabilities to thousands of industries, empowering the core scenarios of production and life. It greatly reduces the threshold for using AI. So, it's positioned to be the very backbone, or the operating system if you will, of AI application development.

  • Think of it like this. Just like Windows was vital in the PC era and iOS and Android rocked the mobile Internet era, Hua Zang aims to be that game changer for the AI era. And we are not just talking about a tool that can make AI models more maintainable, scalable, and iterative. We are talking about building a platform ecosystem that can transform industries.

  • Now, I know what you're thinking. How does an operating system fit organically into various business landscapes? Well, let me tell you, we spent over two decades collaborating with thousands of enterprises across hundreds of diverse application scenarios. This has been instrumental in commercializing Hua Zang. This extensive experience has afforded us a granular understanding of market dynamics and customer pain points.

  • And let me tell you, this further bolsters our strong ability and technologies to adapt models to various industries, empowering them to transcend their traditional bounds. This understanding, ability, and technology drive Hua Zang's key competence and advantages that significantly distinguish it from other models.

  • Oh, and when it comes to Hua Zang, its key features are what set it apart. It's controllable, customizable, and deliverable, which means we can tailor its output to meet our clients' specific business needs and requirements. Whether you're in healthcare, finance, e-commerce, or education, this model can adapt to your industry.

  • We took a strategic approach when developing Hua Zang, focusing on real customer pain points and investing wisely in our model training. Our goal was to create something that wasn't just innovative, but also cost-effective. And let me tell you, Hua Zang delivers on both counts. It can connect to business systems quickly and efficiently, all while keeping computing, deployment, and training costs low.

  • As we move forward, it's not just about having a large language model, but how quickly and effectively we can transform this technology into real-world business applications. So that's the true measure of success. We are currently in discussions with several potential partners in the finance, intelligent customer services, and new materials sectors to jointly innovate models tailored to their industries.

  • And here is a sneak peek: we've got a press conference lined up in the fourth quarter to share more about Hua Zang's journey. So, let's stay tuned for that. That will be the answer to your question. Thank you.

  • Operator

  • Thank you. Our next question comes from the line of Eva Wang from Stone Capital. Please ask your question, Eva.

  • Eva Wang - Analyst

  • (Foreign Language) We are glad to see the company achieved strong results, with revenues growing rapidly and, as the CFO just mentioned, a shift in the business model from one-time software sales to subscriptions for cloud platform products. So, this means stronger customer stickiness and consistent cash flow. However, I've noticed that R&D costs are still high and growing rapidly.

  • So, here are my questions. Can you share the company's growth plans and when will we expect profitability? Also, what's the company's stance on cash flow pressures and when do you anticipate positive cash flow? Thank you.

  • Wei Weng - Chief Financial Officer

  • (Translated) When we shape the company's investment strategy, we always prioritize business sustainability and aim to invest as long as we have enough cash flow. Based on this principle, investing in R&D is crucial for our future. It is critical to push our tech advancement and is our main way to enhance our product quality, service, and competitiveness in the market.

  • We believe that with cloud platform subscriptions and the launch of products related to large language models, we will see significant growth in the next one to two years.

  • As these two businesses expand, we expect our gross margin to settle at a healthy level, and with our business model shifting to cloud subscriptions from one-time deliveries, we will have faster billing and more consistent cash flow.

  • However, due to our substantial investment in R&D, we might face some challenges with our operating cash flow. To tackle this, we are considering both borrowing and raising capital. So, this strategy will effectively supplement our cash flow and ensure that all of our operations run smoothly.

  • Given our business development and investment projections, the management team will make every endeavor to optimize operations, aiming for the company to become profitable and generate positive cash flow by 2025; that is a conservative expectation. In 2024, we will also work hard to decrease our losses and improve our financial health.

  • With ongoing investment and tailored and improved operations, we are confident that the company will see significant growth in the next few years. I think that will be all for your answers. Thank you for your questions.

  • Operator

  • Thank you all very much for your questions. We have now reached the end of the Q&A session. I'll now turn the call back to Mr. Yuan for closing remarks.

  • Hui Yuan - Executive Chairman of the Board, Chief Executive Officer

  • Thank you, operator, and thank you all for participating on today's call and for your support.

  • Operator

  • Thank you all again. That concludes the call. You may now disconnect.