Article: Falcon 2: An 11B parameter pretrained language model and VLM, trained on over 5000B tokens and 11 languages — Quent-01, nilabhra, rcojocaru, Mughaira, gcampesan, SanathNarayan, griffintaur, clefourrier, SaylorTwift, et al. • May 24, 2024