Let's keep the momentum going for small models. I just published dot, the first pretrained causal model trained on math/symbols rather than English. The goal is an agnostic few-shot meta-learner that learns from reality itself instead of from language.
It's already decent at some tasks, with the next version coming in a few weeks.
appvoid/dot