Original model: Mistral-Small-24B-ArliAI-RPMax-v1.4 by ArliAI
Available ExLlamaV3 0.0.16 quants
| Type | Size | Download |
|---|---|---|
| H8-4.0BPW | 13.16 GB | Copy-paste the CLI line or download the batch file |
| H8-6.0BPW | 18.72 GB | Copy-paste the CLI line or download the batch file |
| H8-8.0BPW | 24.27 GB | Copy-paste the CLI line or download the batch file |
Requirements: a Python installation with the huggingface-hub package is needed to use the CLI (a sample download command is sketched below).
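The per-quant CLI lines and batch files are linked in the table above. As a rough sketch only, assuming each quant is stored on a repository branch named after its type (e.g. `H8-6.0BPW`), a download with the huggingface-hub CLI would look like this:

```
# Sketch: the --revision values are assumed to match the quant types in the
# table above; check the repository's branches before downloading.
pip install -U "huggingface_hub[cli]"

huggingface-cli download DeathGodlike/Mistral-Small-24B-ArliAI-RPMax-v1.4_EXL3 \
    --revision H8-6.0BPW \
    --local-dir Mistral-Small-24B-ArliAI-RPMax-v1.4_EXL3-H8-6.0BPW
```

If the branches follow that naming, swapping the revision for `H8-4.0BPW` or `H8-8.0BPW` fetches the other quants.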
Licensing: the license for the provided quantized models is derived from the original model's license (see the source above).
Model tree for DeathGodlike/Mistral-Small-24B-ArliAI-RPMax-v1.4_EXL3
Base model: ArliAI/Mistral-Small-24B-ArliAI-RPMax-v1.4