xAI open-sources base model of Grok without any training code

  • Elon Musk’s xAI has open-sourced the base model of its Grok AI chatbot, but without any training code.
  • The company noted that Grok-1 was trained on a “custom” stack, without specifying details.
  • Perplexity says it will fine-tune Grok-1 for conversational search and make it available to its Pro users.

Grok AI model without any training code

Elon Musk’s xAI has open-sourced the base model weights and network architecture of its Grok AI model, but without any training code. On GitHub, the company describes Grok-1 as a “314 billion parameter Mixture-of-Experts model”. In a blog post, xAI said the model was not fine-tuned for any specific application, such as conversation. The model is released under the Apache License 2.0, which permits commercial use.
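For readers unfamiliar with the Mixture-of-Experts design xAI cites, the sketch below shows the basic idea: a router scores each token and only the top-scoring experts process it, so only a fraction of the 314 billion parameters is active per token. The layer sizes, expert count, and top-2 routing here are illustrative assumptions and do not reflect Grok-1’s actual configuration, which is defined in xAI’s released code.

```python
# Minimal, illustrative Mixture-of-Experts (MoE) layer with top-2 routing.
# All dimensions and the routing scheme are assumptions for this sketch,
# not Grok-1's real settings.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class MoELayer:
    def __init__(self, d_model=16, n_experts=8, top_k=2, seed=0):
        rng = np.random.default_rng(seed)
        # One router matrix plus a small linear "expert" per slot (hypothetical sizes).
        self.router = rng.normal(size=(d_model, n_experts))
        self.experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
        self.top_k = top_k

    def __call__(self, x):
        # x: (tokens, d_model). Router probabilities decide which experts see each token.
        scores = softmax(x @ self.router)                   # (tokens, n_experts)
        top = np.argsort(scores, axis=-1)[:, -self.top_k:]  # indices of the top-k experts
        out = np.zeros_like(x)
        for t in range(x.shape[0]):
            for e in top[t]:
                # Each selected expert processes the token; its output is weighted
                # by the renormalised router probability.
                w = scores[t, e] / scores[t, top[t]].sum()
                out[t] += w * (x[t] @ self.experts[e])
        return out

tokens = np.random.default_rng(1).normal(size=(4, 16))
print(MoELayer()(tokens).shape)  # (4, 16): only 2 of the 8 experts run per token
```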


xAI launched Grok in chatbot form

The company released Grok in chatbot form last year, accessible to Premium+ subscribers of the X social network. Notably, the chatbot could access some X data, but the open-source model does not include those connections to the social network. Perplexity CEO Aravind Srinivas posted on X that Perplexity will fine-tune Grok for conversational search and make it available to its Pro users.

Tuna Tu

Tuna Tu is an intern reporter at BTW Media covering IT infrastructure and media. She graduated from the Communication University of Zhejiang and is based in Hangzhou. Send tips to t.tu@btw.media.
