Topic: "philosophy-of-ai"
World_sim.exe
gpt-4 gpt-4o grok-1 llama-cpp claude-3-opus claude-3 gpt-5 nvidia nous-research stability-ai hugging-face langchain anthropic openai multimodality foundation-models hardware-optimization model-quantization float4 float6 retrieval-augmented-generation text-to-video prompt-engineering long-form-rag gpu-optimization philosophy-of-ai agi-predictions jensen-huang yann-lecun sam-altman
NVIDIA announced Project GR00T, a foundation model for humanoid robot learning from multimodal instructions, built on its stack of Isaac Lab, OSMO, and Jetson Thor. It also revealed the DGX Grace-Blackwell GB200 with over 1 exaflop of compute, which Jensen Huang said could train a GPT-4-scale 1.8-trillion-parameter model in 90 days on 2,000 Blackwell GPUs; in the same keynote he stated that GPT-4 has 1.8 trillion parameters. The new GB200 supports fp4 and fp6 precision (roughly 3 bits per parameter) and is quoted at 40,000 TFLOPs on fp4 with 2x sparsity.
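As a rough sanity check, here is a back-of-envelope sketch using only the figures quoted above (none independently verified):

```python
# Back-of-envelope arithmetic for the keynote figures reported above.
# All inputs are the quoted claims, not independently verified numbers.

params = 1.8e12          # reported GPT-4-scale parameter count
bits_per_param = 3       # reported effective precision with fp4/fp6 formats

weight_gb = params * bits_per_param / 8 / 1e9
print(f"~{weight_gb:.0f} GB to hold the weights at ~3 bits/parameter")  # ~675 GB

per_gpu_flops = 40_000e12            # 40,000 TFLOPs fp4 with 2x sparsity, as quoted
fleet_flops = 2000 * per_gpu_flops   # the 2,000-Blackwell training fleet
print(f"Quoted fleet peak: {fleet_flops:.1e} FLOP/s")                   # 8.0e+19
```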
Open-source highlights include the release of Grok-1, a 314B-parameter mixture-of-experts model, and Stability AI's SV3D (Stable Video 3D), an open model that generates orbital multi-view videos of an object from a single image. Nous Research also collaborated on implementing steering vectors in llama.cpp.
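For readers unfamiliar with the technique, here is a minimal sketch of the steering-vector (activation-addition) idea, using a small Hugging Face model purely for illustration; the llama.cpp work referenced above operates on GGUF models and its actual implementation is not shown here. The contrastive prompts, layer index, and scale below are arbitrary choices.

```python
# Steering-vector sketch: compute a direction from a contrastive prompt pair,
# then add it to the residual stream during generation via a forward hook.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").eval()
LAYER = 6    # which transformer block to steer (a free choice)
SCALE = 4.0  # steering strength (a free choice)

def mean_hidden(prompt: str) -> torch.Tensor:
    """Mean hidden state at the output of block LAYER for a prompt."""
    ids = tok(prompt, return_tensors="pt")
    with torch.no_grad():
        out = model(**ids, output_hidden_states=True)
    return out.hidden_states[LAYER + 1].mean(dim=1)  # (1, hidden_dim)

# The steering vector is the difference of mean activations for a contrastive pair.
steer = mean_hidden("I am extremely happy and excited.") \
      - mean_hidden("I am extremely sad and gloomy.")

def add_vector(module, inputs, output):
    # GPT-2 blocks return a tuple whose first element is the hidden states.
    return (output[0] + SCALE * steer,) + output[1:]

handle = model.transformer.h[LAYER].register_forward_hook(add_vector)
ids = tok("Today I feel", return_tensors="pt")
out = model.generate(**ids, max_new_tokens=20, do_sample=False)
print(tok.decode(out[0], skip_special_tokens=True))
handle.remove()
```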
In retrieval-augmented generation (RAG), a new 5.5-hour tutorial walks through building a pipeline with open-source Hugging Face models, while LangChain released a video on query routing and announced an integration with NVIDIA NIM for GPU-optimized LLM inference.
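Below is a minimal sketch of the RAG pattern such a pipeline follows, assuming freely available Hugging Face checkpoints; the model names and toy documents are illustrative and not the tutorial's actual code.

```python
# Minimal RAG sketch: embed documents, retrieve by cosine similarity,
# and condition an open-source generator on the retrieved context.
import numpy as np
from sentence_transformers import SentenceTransformer
from transformers import pipeline

docs = [
    "The GB200 pairs a Grace CPU with two Blackwell GPUs.",
    "Grok-1 is an open-weights mixture-of-experts model from xAI.",
    "Steering vectors nudge a model's hidden activations at inference time.",
]

embedder = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
doc_vecs = embedder.encode(docs, normalize_embeddings=True)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embedder.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q
    return [docs[i] for i in np.argsort(-scores)[:k]]

generator = pipeline("text2text-generation", model="google/flan-t5-base")

def rag_answer(query: str) -> str:
    context = "\n".join(retrieve(query))
    prompt = f"Answer using the context.\nContext:\n{context}\nQuestion: {query}"
    return generator(prompt, max_new_tokens=64)[0]["generated_text"]

print(rag_answer("What is a steering vector?"))
```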
Prominent opinions include Yann LeCun distinguishing language from other cognitive abilities, Sam Altman predicting AGI within roughly six years and a GPT-4-to-GPT-5 leap comparable to that from GPT-3 to GPT-4, and discussions of the philosophical status of LLMs such as Claude. There was also advice that most companies should not train models from scratch.
12/28/2023: Smol Talk updates
tinyllama-1.1b mixtral tinygpt-v nous-research tyrannosaurus latex benchmarking knowledge-graphs model-finetuning tokenization decentralized-computation philosophy-of-ai multimodality vision open-source-models gary-marcus
Nous Research AI Discord discussions covered topics such as AI placement charts, ChatGPT's trouble generating LaTeX math in a format compatible with Obsidian, and the TinyLlama 1.1B model's performance on various benchmarks. Users shared resources including the math-centric corpus MathPile, methods for building knowledge graphs, and open-source large-language-model repositories. Technical threads covered the feasibility of decentralized computation for models like Mixtral, philosophical debates on AI sentience, and strategies for model finetuning and token counting. The community also discussed the Obsidian model, vision-model training, and the release of the multimodal TinyGPT-V model by Tyrannosaurus. Notable quotes included "ChatGPT not generating Latex math format compatible with Obsidian" and being "optimistic about human-level AI within our lifetime".
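On the token-counting point, a minimal sketch of counting tokens with an open Hugging Face tokenizer follows; the model name is just an example, and counts differ between tokenizers.

```python
# Token counting with an open tokenizer -- counts vary by model and tokenizer.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("TinyLlama/TinyLlama-1.1B-Chat-v1.0")
text = "Decentralized computation for Mixtral was a hot topic this week."
n_tokens = len(tok.encode(text))
print(f"{n_tokens} tokens for {len(text)} characters")
```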