---
library_name: transformers.js
base_model: nomic-ai/CodeRankEmbed
license: mit
tags:
  - transformers.js
  - onnx
  - quantization
---

# osgrep-coderank-q8

This model is a quantized (Int8 / Q8) ONNX export of [nomic-ai/CodeRankEmbed](https://huggingface.co/nomic-ai/CodeRankEmbed) for use with `transformers.js`. It serves as the primary dense retrieval model in **osgrep**.

## Original Model License

This model is a derivative work of `nomic-ai/CodeRankEmbed`, licensed under the **MIT License**. Please refer to the [original model card](https://huggingface.co/nomic-ai/CodeRankEmbed) for citation and full license details.
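
## Usage

A minimal sketch of loading the quantized export with `transformers.js` is shown below. The repository id `osgrep/osgrep-coderank-q8`, the query prefix, and the pooling settings are assumptions based on the upstream CodeRankEmbed conventions; check the original model card for the exact prefix and pooling to use.

```js
// Minimal sketch, assuming the model is published under a repo id like
// "osgrep/osgrep-coderank-q8" (placeholder) and loaded with transformers.js v3.
import { pipeline } from '@huggingface/transformers';

// Load the Q8 ONNX weights as a feature-extraction pipeline.
const embed = await pipeline('feature-extraction', 'osgrep/osgrep-coderank-q8', {
  dtype: 'q8', // select the Int8 / Q8 quantized weights
});

// CodeRankEmbed expects a task prefix on queries (see the original model card);
// code documents are embedded as-is.
const query =
  'Represent this query for searching relevant code: binary search over a sorted array';
const doc = 'fn bsearch(a: &[i32], x: i32) -> Option<usize> { a.binary_search(&x).ok() }';

// Mean pooling and L2 normalization are assumed here.
const embeddings = await embed([query, doc], { pooling: 'mean', normalize: true });
console.log(embeddings.dims); // [2, hidden_size]
```

Cosine similarity between the normalized query and document vectors can then be computed with a plain dot product for retrieval.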