Topic: "high-resolution-upscaling"
Jamba: Mixture of Architectures dethrones Mixtral
jamba dbrx mixtral animatediff fastsd sdxs512-0.9 b-lora supir ai21-labs databricks together-ai hugging-face midjourney mixture-of-experts model-architecture context-windows model-optimization fine-tuning image-generation video-generation cpu-optimization style-content-separation high-resolution-upscaling
AI21 Labs released Jamba, a 52B-parameter (12B active) MoE model with a 256K-token context length and open weights under the Apache 2.0 license, sized to fit on a single 80GB A100 GPU. Its hybrid blocks-and-layers architecture interleaves Transformer attention layers, Mamba state-space layers, and mixture-of-experts layers, and it competes with models like Mixtral. Meanwhile, Databricks introduced DBRX, a 132B-parameter MoE model with 36B active parameters trained on 12T tokens, positioned by Databricks as a new standard for open LLMs. In image generation, AnimateDiff brings video generation to image diffusion models, and FastSD CPU v1.0.0 beta 28 enables ultra-fast image generation on CPUs with models such as SDXS-512-0.9. Other innovations include style-content separation using B-LoRA and high-resolution image upscaling with SUPIR.
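For a sense of how Jamba's "mixture of architectures" is laid out, the sketch below builds the layer schedule described in AI21's technical report: blocks of 8 layers with a 1:7 attention-to-Mamba ratio, and a mixture-of-experts MLP replacing the dense MLP on every other layer (16 experts, top-2 routing). This is a minimal illustrative sketch, not AI21's implementation, and the exact position of the attention layer within each block is an assumption here.

```python
# Minimal sketch of a Jamba-style layer schedule (illustrative only).
# Assumed configuration, per the Jamba technical report: each block has
# 8 layers, a 1:7 attention-to-Mamba ratio, and an MoE MLP on every
# other layer. The attention layer's position inside the block is an
# assumption made for illustration.
from dataclasses import dataclass

@dataclass
class LayerSpec:
    mixer: str  # token mixer: "attention" or "mamba"
    mlp: str    # feed-forward: "moe" (16 experts, top-2) or "dense"

def jamba_block(n_layers: int = 8, attn_every: int = 8, moe_every: int = 2):
    """Return the per-layer schedule for one Jamba block."""
    schedule = []
    for i in range(n_layers):
        mixer = "attention" if (i + 1) % attn_every == 0 else "mamba"
        mlp = "moe" if (i + 1) % moe_every == 0 else "dense"
        schedule.append(LayerSpec(mixer, mlp))
    return schedule

if __name__ == "__main__":
    # A 52B-scale model stacks several such blocks; print one block's layout.
    for i, spec in enumerate(jamba_block()):
        print(f"layer {i}: mixer={spec.mixer:9s} mlp={spec.mlp}")
```

Keeping most layers as Mamba (constant-memory state instead of a growing KV cache) is what makes the 256K context fit on a single A100, while the occasional attention layer preserves recall quality.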