World_sim.exe
gpt-4 gpt-4o grok-1 llama-cpp claude-3-opus claude-3 gpt-5 nvidia nous-research stability-ai hugging-face langchain anthropic openai multimodality foundation-models hardware-optimization model-quantization float4 float6 retrieval-augmented-generation text-to-video prompt-engineering long-form-rag gpu-optimization philosophy-of-ai agi-predictions jensen-huang yann-lecun sam-altman
NVIDIA announced Project GR00T, a foundation model for humanoid robot learning from multimodal instructions, built on its stack of Isaac Lab, OSMO, and Jetson Thor. It also revealed the DGX GB200 (Grace-Blackwell) system with over 1 exaflop of compute, reportedly able to train a 1.8T-parameter GPT-4-scale model in 90 days on 2,000 Blackwell GPUs; Jensen Huang confirmed GPT-4 has 1.8 trillion parameters. The new GB200 supports fp4/fp6 precision (roughly 3 bits per parameter effective) and reaches 40,000 TFLOPS on fp4 with 2x sparsity.
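For a sense of what those precisions buy, here is a back-of-the-envelope sketch (not from the announcement) of raw weight storage for a 1.8T-parameter model at several bit widths, including the quoted ~3 bits per parameter:

```python
# Rough memory math for low-precision weights. The "~3 bits" row takes the
# article's figure at face value as an effective average; exact overheads
# for scaling metadata vary by quantization scheme (assumption, not source).

PARAMS = 1.8e12  # 1.8T parameters, the GPT-4 size Jensen Huang cited

def weight_memory_gb(num_params: float, bits_per_param: float) -> float:
    """Raw weight storage in gigabytes at a given precision."""
    return num_params * bits_per_param / 8 / 1e9

for label, bits in [("fp16", 16), ("fp8", 8), ("fp4", 4), ("~3 bits/param", 3)]:
    print(f"{label:>14}: {weight_memory_gb(PARAMS, bits):,.0f} GB")
```

At fp16 the weights alone need 3.6 TB; at the quoted ~3 bits per parameter that drops to roughly 675 GB, which is what makes trillion-parameter inference on a single rack plausible.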
Open-source highlights include the release of Grok-1, a 314B-parameter model, and Stability AI's Stable Video 3D (SV3D), an open model that generates novel multi-view videos from a single image. Nous Research collaborated on implementing steering vectors in llama.cpp; a conceptual sketch follows below.
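As a rough illustration of the steering-vectors idea (not the llama.cpp implementation), the sketch below adds a fixed direction to one layer's hidden states at inference time; the layer index, scale, and vector source are all hypothetical:

```python
import torch

# Activation steering in a nutshell: nudge a transformer's residual stream
# along a chosen direction during the forward pass to bias its behavior.
# In practice the vector comes from contrasting activations on paired
# prompts; here it is random purely for illustration.

def make_steering_hook(steering_vector: torch.Tensor, scale: float = 5.0):
    def hook(module, inputs, output):
        hidden = output[0] if isinstance(output, tuple) else output
        hidden = hidden + scale * steering_vector  # shift every position
        return (hidden,) + output[1:] if isinstance(output, tuple) else hidden
    return hook

# Usage sketch, assuming a HF transformers causal LM loaded as `model`:
# vec = torch.randn(model.config.hidden_size)
# handle = model.model.layers[15].register_forward_hook(make_steering_hook(vec))
# ... model.generate(...) now runs with the steered activations ...
# handle.remove()
```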
In Retrieval-Augmented Generation (RAG), a new 5.5-hour tutorial builds a full pipeline from open-source Hugging Face models, while LangChain released a video on query routing and announced an integration with NVIDIA NIM for GPU-optimized LLM inference.
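A minimal sketch of the retrieval step such a pipeline rests on, using an open-source HF embedding model; the model name, corpus, and `retrieve` helper are illustrative, not the tutorial's code:

```python
import numpy as np
from sentence_transformers import SentenceTransformer

# Embed a small corpus once, then answer queries by cosine similarity.
model = SentenceTransformer("all-MiniLM-L6-v2")  # small open-source embedder

docs = [
    "NVIDIA announced Project GR00T for humanoid robots.",
    "Grok-1 is a 314B-parameter mixture-of-experts model.",
    "LangChain supports query routing across retrievers.",
]
doc_vecs = model.encode(docs, normalize_embeddings=True)

def retrieve(query: str, k: int = 2) -> list[str]:
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q  # cosine similarity (vectors are unit-normalized)
    return [docs[i] for i in np.argsort(-scores)[:k]]

print(retrieve("How many parameters does Grok-1 have?"))
# The retrieved chunks are then stuffed into the LLM prompt as context.
```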
Prominent opinions include Yann LeCun arguing that language is distinct from other cognitive abilities, Sam Altman predicting AGI within 6 years with a GPT-4-to-GPT-5 leap comparable to the one from GPT-3 to GPT-4, and debates over the philosophical status of LLMs like Claude. There was also advice that most companies should not train models from scratch.
Grok-1 in Bio
grok-1 mixtral miqu-70b claude-3-opus claude-3 claude-3-haiku xai mistral-ai perplexity-ai groq anthropic openai mixture-of-experts model-release model-performance benchmarking finetuning compute hardware-optimization mmlu model-architecture open-source memes sam-altman arthur-mensch daniel-han arav-srinivas francis-yao
Grok-1, a 314B parameter Mixture-of-Experts (MoE) model from xAI, has been released under an Apache 2.0 license, sparking discussions on its architecture, finetuning challenges, and performance compared to models like Mixtral and Miqu 70B. Despite its size, its MMLU benchmark performance is currently unimpressive, with expectations that Grok-2 will be more competitive. The model's weights and code are publicly available, encouraging community experimentation. Sam Altman highlighted the growing importance of compute resources, while Grok's potential deployment on Groq hardware was noted as a possible game-changer. Meanwhile, Anthropic's Claude continues to attract attention for its "spiritual" interaction experience and consistent ethical framework. The release also inspired memes and humor within the AI community.
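For readers unfamiliar with MoE, here is a toy sketch of the top-2 expert routing used by architectures like Grok-1, which activates 2 of its 8 experts per token; the dimensions and random weights are illustrative only:

```python
import numpy as np

# Toy top-2 Mixture-of-Experts layer: a router scores all experts per token,
# only the best two run, and their outputs are blended by the router's
# (renormalized) softmax weights. This is why a 314B-parameter MoE uses far
# fewer FLOPs per token than a dense model of the same size.

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

router_w = rng.standard_normal((d_model, n_experts))
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_layer(x: np.ndarray) -> np.ndarray:
    logits = x @ router_w                # router score for each expert
    top = np.argsort(-logits)[:top_k]    # indices of the 2 best experts
    gates = softmax(logits[top])         # renormalize their weights
    # Only the selected experts execute: ~2/8 of a dense layer's compute.
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

token = rng.standard_normal(d_model)
print(moe_layer(token).shape)  # (16,)
```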