AI Top 3 - Week of March 4th

OpenAI invests in Figure, Klarna's chatbot takes the job of 700 agents, and the 1-Bit LLM is here

  1. OpenAI invests in Figure, partners on humanoid AI development

  2. Klarna's chatbot has taken the job of 700 full-time agents

  3. 1-Bit LLM surprises everyone

1. OpenAI invests in Figure, partners on humanoid AI development

In one sentence: Figure AI raised $675M at a $2.6 billion valuation with backing from Bezos, OpenAI, and Nvidia, and as part of this round OpenAI agreed to help Figure's humanoids see, speak, and do physical tasks.

Key Takeaway

  • Figure is a robotics startup less than two years old. Figure 01, the company's prototype, can mimic human actions, showing potential to fill gaps in labor-intensive industries.

  • Earlier in January, the company announced a commercial agreement with BMW to bring general-purpose robots into automotive production lines.

  • The humanoid robot sector is expected to reach a $38 billion total addressable market by 2035.

Why you should care:

Humanoid robots paired with leading AI models are an extremely powerful combination. Take a moment to imagine the power of a GPT version 10, then imagine what Figure's humanoids could do 5 to 10 years from now. We believe that 10 years from now humanoids will take over high-risk and less desirable labor-intensive manual jobs, and that this will be a good thing for society and the economy.

2. Klarna says its AI Assistant does the work of 700 full-time agents!

In one sentence: Klarna, a $7 billion "buy now, pay later" startup used by companies like Versace, Nike, and Wayfair, said last week that its AI chatbot is doing the equivalent work of 700 full-time agents.

Key Takeaway

  • The AI assistant handles questions about refunds, returns, payments, cancellations, and more in 35 languages, usually in under two minutes. Before the chatbot, the average customer service interaction took 11 minutes.

  • The AI bot has taken on 75% of Klarna's customer service chats, or about 2.3 million conversations so far.

  • It has achieved customer satisfaction scores comparable to those of human agents, according to the company.

  • Klarna claims that the chatbot could improve the company's profits by $40 million this year.

Why you should care:

This is one of the primary use cases for generative AI, and we will see many more stories like this in the coming months. If your business has a front-end customer service function, it's time to start thinking about how to integrate AI into your customer support workflow and what steps you need to take to get started (if you haven't already).

3. 1-Bit LLM surprises everyone

In one sentence: a new paradigm for running large language models that is better in terms of memory and throughput. Let me explain with an analogy:

Key Takeaway

  • Imagine a regular language model as a high-quality paintbrush that can create paintings with a wide range of colors and shades. The colors represent the bits used in a regular language model: more bits (colors) means more memory, higher compute, higher cost, and slower performance.

  • Now, a 1-bit LLM is like a paintbrush that can only paint with two colors: black and white. Two colors represent one bit, {1, 0}, and the model can still understand and generate language in a simplified manner. Think of the black-and-white versus the full-color version of the same picture.

  • So, while the high-quality brush offers a wide range of colors and nuances, the black-and-white brush is simpler and more straightforward, i.e. better performance in terms of latency, memory, throughput, and energy consumption. In a lot of use cases, black and white is good enough.

  • The actual science is a little more complex, but you get the gist (see the short sketch after this list for how fewer bits per weight translates into less memory). I am committed to helping you understand complex tech without jargon.
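A minimal sketch of the memory point, assuming a naive sign-based 1-bit quantization (the published 1-bit LLM work actually uses ternary weights {-1, 0, 1}, so treat this purely as an illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up "full-color" weights: 1 million values stored as 16-bit floats.
weights_fp16 = rng.standard_normal(1_000_000).astype(np.float16)

# Naive 1-bit version: keep only the sign of each weight, packing 8 signs per byte.
weights_1bit = np.packbits(weights_fp16 >= 0)

print(f"FP16 weights:  {weights_fp16.nbytes / 1e6:.3f} MB")   # ~2.000 MB
print(f"1-bit weights: {weights_1bit.nbytes / 1e6:.3f} MB")   # ~0.125 MB, ~16x smaller
```

Same number of weights, roughly 16x less memory; that gap is what makes the edge and mobile deployment mentioned below realistic.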

Why you should care:

It is worth noting that not every use case needs a highly nuanced approach with all the shades and colors. This also opens up the opportunity to deploy sophisticated LLMs on edge devices, laptops, and mobile platforms, since the approach significantly reduces memory requirements.

Last but not least, AI computation at its core is matrix multiplication plus addition. When you replace one of the matrices with 1s and 0s, you only need addition (no multiplication required), which opens the door to new hardware optimized for 1-bit LLMs.
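To make that concrete, here is a toy sketch (illustrative only, not the paper's implementation) showing that a dot product against a {1, 0} weight vector needs no multiplications at all, only additions over the activations the 1s select:

```python
import numpy as np

rng = np.random.default_rng(1)
activations = rng.standard_normal(8)               # toy layer inputs
weights_bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])  # toy 1-bit weights

# Regular path: multiply every activation by its weight, then add.
multiply_then_add = float(activations @ weights_bits)

# 1-bit path: no multiplications, just add the activations the 1s select.
addition_only = float(activations[weights_bits == 1].sum())

assert np.isclose(multiply_then_add, addition_only)  # identical result
print(multiply_then_add, addition_only)
```

Hardware that only has to add (or, for ternary weights, add and subtract) can be far simpler and more energy-efficient than hardware built around wide multipliers, which is exactly the opening for new 1-bit-optimized chips.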