Monday, March 18, 2024

Elon Musk Reveals Grok | NOT Aligned, MUCH Bigger, Open Source (English)

...Grok is still a very early beta product – the best we could do with 2 months of training – so expect it to improve rapidly with each passing week with your help...

After announcing xAI, we trained a prototype LLM (Grok-0) with 33 billion parameters. This early model approaches LLaMA 2 (70B) capabilities on standard LM benchmarks but uses only half of its training resources. In the last two months, we have made significant improvements in reasoning and coding capabilities leading up to Grok-1, a state-of-the-art language model that is significantly more powerful, achieving 63.2% on the HumanEval coding task and 73% on MMLU.
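For context on the 63.2% HumanEval figure (this is background, not part of the announcement): HumanEval results are conventionally reported as pass@k, using the unbiased estimator introduced with the benchmark — generate n completions per problem, count the c that pass the unit tests, and estimate the probability that at least one of k sampled completions is correct. A minimal sketch:

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: probability that at least one of k
    completions, drawn without replacement from n generated samples
    of which c are correct, passes the tests."""
    if n - c < k:
        # Fewer than k incorrect samples exist, so any draw of k
        # must contain a correct one.
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)
```

With k = 1 this reduces to c / n, the fraction of generations that pass; a model's reported score is this estimate averaged over all 164 HumanEval problems.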


https://x.ai/blog/grok