Elon Musk’s xAI has publicly released the base code of its Grok AI model, though without any training code. On GitHub, the company describes it as a “314 billion parameter Mixture-of-Experts model.”

In a blog post, xAI stated that it had not designed the model for any specific use, such as engaging in conversations, and provided little further detail. The company said it trained Grok-1 on a “custom” training stack. The model is released under the Apache License 2.0, which permits commercial use.

Musk announced on X last week that xAI planned to make the Grok model available to the public this week. Last year, the company made Grok available as a chatbot to Premium+ subscribers of the X social network. Notably, although the open-source release lacks the social network connections, the chatbot might still be able to access some X data.

Many well-known firms have made some of their AI models publicly available, including Meta’s LLaMA, Mistral, Falcon, and AI2. In February, Google also unveiled two new open models, Gemma 2B and Gemma 7B.

Some developers of AI-driven tools have already discussed incorporating Grok into their products. Aravind Srinivas, the CEO of Perplexity, announced on X that Perplexity would enhance Grok for conversational search and make it available to Pro users.

Earlier this month, Musk became embroiled in a court dispute with OpenAI over the company’s alleged “betrayal” of its nonprofit AI mission. He has since repeatedly criticized OpenAI and Sam Altman on X.