This repository ports the SSL (self-supervised learning) version of Meta's Omnilingual ASR W2V2 release to transformers, specifically the 7B checkpoint. More details are available on the official repo.

Usage is almost the same as indicated here.
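Since the card does not include a snippet, here is a minimal usage sketch for extracting SSL features, assuming the ported checkpoint loads through transformers' standard Wav2Vec2 classes (the class choice and the 16 kHz sampling rate are assumptions, not confirmed by this card):

```python
# Hypothetical sketch: load the ported SSL checkpoint and extract
# self-supervised representations from raw audio.
import torch
from transformers import AutoFeatureExtractor, Wav2Vec2Model

model_id = "ylacombe/omniASR_W2V_7B_SSL"

feature_extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = Wav2Vec2Model.from_pretrained(model_id)
model.eval()

# One second of dummy 16 kHz audio in place of a real waveform.
waveform = torch.zeros(16_000)
inputs = feature_extractor(
    waveform.numpy(), sampling_rate=16_000, return_tensors="pt"
)

with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state holds the SSL features: (batch, frames, hidden_size).
# With the standard Wav2Vec2 conv frontend (kernels 10,3,3,3,3,2,2 and
# strides 5,2,2,2,2,2,2), one second of 16 kHz audio maps to 49 frames.
print(outputs.last_hidden_state.shape)
```

Since this is a feature-extraction (SSL) checkpoint rather than a fine-tuned ASR head, the output is a hidden-state sequence, not transcriptions.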
