Meta’s AI Assistant Got an LLM Update. Here’s What You Need to Know

Tech giant Meta this week released the latest generation of its large language model, Llama 3.1 405B, an open-source model it says is on par with proprietary LLM competitors like OpenAI’s GPT-4 and GPT-4o and Anthropic’s Claude 3.5 Sonnet.

Llama is the model that powers the Meta AI assistant. As of Tuesday, Llama 3.1 405B is accessible via the assistant built into WhatsApp (which is owned by Meta) and on the Meta.ai site. Though you can also use Meta AI on Instagram and Facebook, it wasn't immediately clear whether the latest model is available on those platforms as well. A spokesperson didn't respond to a request for comment.


Meta’s first version of Llama was released in February 2023, but even CEO Mark Zuckerberg acknowledged that early versions of Llama lagged behind their peers.

“Last year, Llama 2 was only comparable to an older generation of models behind the frontier,” he wrote in a blog post published Tuesday.

Large language models are the technology behind generative AI chatbots like OpenAI’s ChatGPT, Google’s Gemini and Meta AI. They’re trained on massive data sets to learn how we use language so they can generate their own unique content that sounds at least plausibly human.
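To make that a little more concrete, here's a minimal sketch of what an LLM does at inference time: given some text, it repeatedly predicts the next token. It uses the small, open GPT-2 model through the Hugging Face transformers library purely for illustration; it's not Llama or Meta AI.

```python
# Minimal illustration of next-token generation with a small open model.
# This is a generic sketch, not Meta's assistant or the Llama models.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# The model extends the prompt by predicting one token at a time.
out = generator("Large language models work by", max_new_tokens=20)
print(out[0]["generated_text"])
```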

In addition to gaining access to Llama 3.1 405B, Meta AI's image generation feature, Imagine, is starting to enable what the company calls "Imagine me" prompts, which let you create images of yourself, based on your existing photos, doing things like surfing or appearing in a surrealist painting. Meta AI is also getting new editing tools that let you remove and edit objects within images. Starting this week, English-language users will be able to share those images on Facebook, Instagram, Messenger and WhatsApp.

Meta AI’s image generator was one feature that wowed my CNET colleague Katelyn Chedraoui in what she otherwise felt was a “convenient but unimpressive” assistant.

Llama has been downloaded more than 300 million times to date, according to Meta’s figures.

Llama vs. everybody 

The latest Llama models, which also include Llama 3.1 8B and 70B, have a context window of 128,000 tokens, a measure of how much text the model can take into account at once in a conversation. OpenAI's GPT-4o and the newly announced GPT-4o mini also have context windows of 128,000 tokens, while Google's Gemini 1.5 Pro has a window of 1 million tokens.
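If you're wondering what a "token" actually is, here's a rough illustration using tiktoken, the open-source tokenizer OpenAI publishes for its models. Llama has its own tokenizer, so its counts will differ slightly; this is just to show how text gets chopped into tokens.

```python
# Rough illustration of tokenization, using OpenAI's open-source tiktoken
# library. Llama uses a different tokenizer, so exact counts vary.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # tokenizer used by GPT-4-era models

text = "Large language models read and write text as tokens, not words."
tokens = enc.encode(text)

print(len(tokens), "tokens")       # usually a bit more than the word count
print(enc.decode(tokens[:5]))      # decode the first few tokens back into text

# At roughly 0.75 words per token, a 128,000-token context window covers
# on the order of 96,000 words of conversation and documents at once.
```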

According to a separate blog post, Llama’s improved reasoning capabilities help Meta AI understand more complex queries — especially when it comes to math and coding. The Meta models also support eight languages.

Like Llama 3, which came out in April, Llama 3.1 405B was trained on more than 15 trillion tokens, which is equivalent to about 11.25 trillion words.
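The word figure follows from a common rule of thumb that one token is roughly three-quarters of an English word. Here's the back-of-the-envelope math; the 0.75 ratio is an approximation, not a Meta number.

```python
# Rough token-to-word conversion, assuming ~0.75 words per token
# (a common rule of thumb; the real ratio varies by language and tokenizer).
TRAINING_TOKENS = 15e12      # 15 trillion tokens
WORDS_PER_TOKEN = 0.75

approx_words = TRAINING_TOKENS * WORDS_PER_TOKEN
print(f"{approx_words:,.0f} words")   # 11,250,000,000,000 -- about 11.25 trillion
```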


Meta says the 8B and 70B models are best suited for text summaries and as conversational agents and coding assistants. Meanwhile, 405B can be used to create synthetic data, or data that's generated by algorithms or computer simulations rather than collected from real-world sources. It can also be used for model distillation, the process of transferring knowledge from a large LLM to a smaller model, so the smaller model keeps much of the larger one's capability while running faster and using fewer computing resources.
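For readers who want a picture of what distillation looks like in practice, here's a minimal sketch in PyTorch of the standard technique: a small "student" model is trained to match the output distribution of a large, frozen "teacher." This is a generic illustration with toy numbers, not Meta's actual pipeline.

```python
# Minimal sketch of knowledge distillation: a small "student" learns to match
# the output distribution of a large "teacher." Toy sizes, generic technique;
# not Meta's actual training setup.
import torch
import torch.nn.functional as F

vocab_size = 128  # toy vocabulary for illustration

# Stand-ins for the two models' predictions over the next token.
teacher_logits = torch.randn(1, vocab_size)       # from the large, frozen model
student = torch.nn.Linear(16, vocab_size)         # a tiny "student" model
student_logits = student(torch.randn(1, 16))

# Distillation loss: KL divergence between the student's and teacher's
# softened probability distributions.
temperature = 2.0
teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
loss = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")

loss.backward()   # gradients flow only into the student's parameters
print(float(loss))
```

In the same spirit, text generated by the 405B model can serve as the synthetic training data that a smaller model learns from.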

More than 25 partners, including Amazon, Databricks and Nvidia, are launching services built around Llama 3.1 405B to support developers working with the model, which Zuckerberg believes also gives it a fighting chance.

A key difference between Llama and its peers is that the Meta model is open source. LLMs generally come in two varieties: proprietary models, which can be used only by developers who purchase access, and open-source models, which are freely available to anyone.

Zuckerberg said this will ultimately make Llama and Meta AI more competitive, much as the open-source Linux operating system eventually became more popular than the closed, proprietary versions of Unix developed by major tech companies. In his blog post, Zuckerberg argued that Linux won out because it let developers experiment and was more affordable, which led to more users and, ultimately, more advancements.