French AI startup Mistral shakes things up with surprise release of LLM that’s better than ChatGPT

  • French startup Mistral surprises industry with Mixtral 8x7B release via Torrent link.
  • Users applaud LLM’s prowess, surpassing OpenAI’s GPT-3.5 and others.
  • Mistral’s open-source commitment in AI culture war sets it apart.

In a surprising move that has raised eyebrows across the tech community, French AI startup Mistral has dropped its latest large language model (LLM) online without fanfare, releasing it through a bare torrent link posted on the social platform X (formerly Twitter). This unorthodox approach stands in stark contrast to the traditional practice of unveiling cutting-edge models through press tours and carefully crafted blog posts.

Mistral’s unconventional drop sparks online buzz

Mistral, having recently secured an impressive $415 million in a Series A funding round, now boasts an estimated valuation of $2 billion. The startup’s latest creation, dubbed Mixtral 8x7B, has quickly garnered attention for its performance, with users claiming it easily surpasses the capabilities of OpenAI’s GPT-3.5, one of the leading LLMs in the field.


The decision to release the model via a torrent link has been met with a mix of amusement and admiration, particularly on X, where users appreciated Mistral’s seemingly carefree, hacker-like attitude. One commentator highlighted the absence of the usual promotional elements, stating, “No blog, no sizzle, no description — just a torrent with the model files… Mistral understands their primary audience to be engineers and knows their cultural erogenous zones.”

Mistral breaks silence with details on Mixtral 8x7B

While the initial release left users speculating about Mistral’s motives, the company eventually followed up with a blog post on Monday, providing more details about Mixtral 8x7B. According to benchmarks shared in the post, Mistral’s model outperforms several of its U.S. competitors, including Meta’s Llama 2 family and the renowned GPT-3.5 from OpenAI.

As more and more AI products come to market, users will inevitably compare them, and even the model’s full name, Mixtral-8x7B-32kseqlen (Mixtral for short), drew commentary. Lulu Cheng Meservey quipped that it sounds like something Elon Musk would come up with, when it is really just a plain description of the architecture: a mixture of experts combining eight models of seven billion parameters each, with a 32k-token context window (a rough sketch of that routing idea follows her post below). Her post read:

“No blog, no sizzle, no description — just a torrent with the model files. A plain cut of steak.

Bottom line: Mistral understands their primary audience to be engineers and knows their cultural erogenous zones. Compared to Google’s rollout, Mistral’s speed, focus on substance, laconic minimalism, and mic drop without fanfare wins this round.”

Lulu Cheng Meservey, chief communications officer and EVP at Activision Blizzard, formerly of Substack and a cofounder of TrailRunner
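For readers wondering what a “mixture of experts” means in practice, the sketch below shows the basic idea: a small router picks a couple of expert networks per token and blends their outputs. This is a minimal illustration in PyTorch, not Mistral’s code; the class name, layer sizes, and routing details are assumptions chosen only to mirror the “8 experts, 7 billion parameters, 32k context” description, and the real model’s experts are vastly larger.

```python
# Minimal, illustrative sparse mixture-of-experts layer: top-2 routing over
# 8 small feed-forward experts. Hypothetical sizes; NOT Mistral's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, n_experts, bias=False)  # the router
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                       # x: (tokens, d_model)
        scores = self.gate(x)                   # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)    # normalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):             # each token only visits top_k experts
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

# Usage: route a small batch of token vectors through the layer.
layer = MoELayer()
tokens = torch.randn(4, 512)
print(layer(tokens).shape)  # torch.Size([4, 512])
```

Because only a couple of experts are active for any given token, a sparse model like this can run noticeably cheaper at inference time than a dense model with the same total parameter count, which is part of why the release drew so much attention.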

Online community echoes praise for Mistral’s new model

The online community has echoed these sentiments, praising the speed and efficiency of Mistral’s new model. Memes and compliments flooded X, emphasizing the apparent success of Mixtral 8x7B. An interesting aspect of Mistral’s release strategy is that the model is open source, in stark contrast to OpenAI’s closed-source approach, which has faced backlash within the industry.
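To make the contrast with closed models concrete, here is a hedged sketch of what an openly released checkpoint allows: anyone can pull the weights and run them locally. The Hugging Face repository ID, prompt, and hardware assumptions below are illustrative guesses rather than details from Mistral’s announcement.

```python
# Illustrative only: assumes the open weights are mirrored on Hugging Face under
# an ID like "mistralai/Mixtral-8x7B-v0.1" (an assumption, not stated in this
# article) and that enough GPU memory is available to hold the model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-v0.1"   # assumed repository name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Nothing comparable is possible with an API-only model, where the weights never leave the provider’s servers, and that difference is exactly what the open-source debate in the industry is about.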

Mistral’s commitment to open sourcing all its AI software aligns with the company’s broader vision, as stated by CEO Arthur Mensch. Mensch highlighted Mistral’s dedication to “an open, responsible and decentralized approach to technology,” positioning the company firmly within a growing culture war in the AI industry.

As Mistral disrupts the traditional norms of AI model releases, the success of Mixtral 8x7B and its open-source nature challenge established players in the field. Whether Mistral’s unconventional approach sets a trend or remains a one-off is yet to be seen, but the company’s rapid ascent and bold strategy have undeniably set tongues wagging in the tech world.

Flavie Du

Flavie Du was a senior writer at BTW Media focused on blockchain and fintech investment. She graduated from King’s College London.
