A new hxa07D family of hybrid models, combining improved RWKV recurrent architectures with Transformer-based attention. Designed for efficient long-context inference.
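The source does not describe hxa07D's internals, but the general idea of a hybrid recurrent/attention model can be sketched: some layers mix tokens with an RWKV-style linear recurrence (constant-size state, linear in sequence length), while others use standard causal softmax attention. The sketch below is purely illustrative; `rwkv_style_mix`, `hybrid_forward`, and the layer layout are assumptions, not the actual hxa07D design.

```python
import numpy as np

def rwkv_style_mix(x, decay=0.9):
    # Simplified RWKV-style time mixing: an exponentially decayed
    # running sum over past tokens (O(T) time, O(1) state per channel).
    T, C = x.shape
    out = np.zeros_like(x)
    state = np.zeros(C)
    for t in range(T):
        state = decay * state + x[t]
        out[t] = state
    return out

def causal_attention(x):
    # Standard causal softmax attention; Q = K = V = x for brevity.
    T, C = x.shape
    scores = x @ x.T / np.sqrt(C)
    scores[np.triu(np.ones((T, T), dtype=bool), k=1)] = -np.inf
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ x

def hybrid_forward(x, layer_types):
    # Interleave recurrent and attention layers with residual connections,
    # as a hybrid RWKV/Transformer stack might.
    for kind in layer_types:
        mix = rwkv_style_mix(x) if kind == "rwkv" else causal_attention(x)
        x = x + mix
    return x

x = np.random.default_rng(0).normal(size=(8, 4))   # 8 tokens, 4 channels
y = hybrid_forward(x, ["rwkv", "attn", "rwkv"])
print(y.shape)  # (8, 4)
```

The appeal of such hybrids is that the recurrent layers keep per-token cost and memory flat over long contexts, while the interleaved attention layers retain precise token-to-token retrieval.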
OpenMOSE
AI & ML interests
Can love be expressed as a tensor?
Recent Activity
updated a model, 17 minutes ago: OpenMOSE/RWKV-Qwen3-30B-A3B-Thinking-hxa07d-GGUF
published a model, 26 minutes ago: OpenMOSE/RWKV-Qwen3-30B-A3B-Thinking-hxa07d-GGUF
updated a dataset, about 6 hours ago: OpenMOSE/my-imatrix-dataset-gen6c
Organizations
None yet