# 16c088bd50f60dc3d8b58cf6db4d8e97
This model is a fine-tuned version of facebook/mbart-large-50-many-to-one-mmt on the Helsinki-NLP/opus_books [en-ru] dataset. It achieves the following results on the evaluation set:
- Loss: 2.1585
- Data Size: 1.0
- Epoch Runtime: 111.6070
- Bleu: 8.2384
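The BLEU figures above come from the trainer's evaluation loop. As a point of reference, here is a minimal pure-Python sketch of corpus-level BLEU-4 (clipped n-gram precision plus brevity penalty, naive whitespace tokenization); real evaluations such as sacreBLEU use standardized tokenization and smoothing, so scores will differ:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Corpus-level BLEU with a single reference per hypothesis."""
    match = [0] * max_n   # clipped n-gram matches, per order
    total = [0] * max_n   # hypothesis n-gram counts, per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            hc, rc = ngrams(h, n), ngrams(r, n)
            total[n - 1] += max(len(h) - n + 1, 0)
            # clip each hypothesis n-gram count by its count in the reference
            match[n - 1] += sum(min(c, rc[g]) for g, c in hc.items())
    if min(match) == 0:
        return 0.0  # any zero precision drives the geometric mean to zero
    log_prec = sum(math.log(m / t) for m, t in zip(match, total)) / max_n
    # brevity penalty: penalize hypotheses shorter than the references
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return 100 * bp * math.exp(log_prec)
```

A perfect match scores 100; a hypothesis with no 4-gram overlap scores 0 under this unsmoothed variant.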
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50
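The batch-size arithmetic implied by the list above can be written out directly; `gradient_accumulation_steps` is an assumption here, since none is listed and the totals only add up with a value of 1:

```python
# Effective batch size under multi-GPU data parallelism: each of the
# 4 devices processes its own per-device batch of 8 per step.
per_device_train_batch_size = 8
num_devices = 4
gradient_accumulation_steps = 1  # assumed: not listed in the card

total_train_batch_size = (
    per_device_train_batch_size * num_devices * gradient_accumulation_steps
)
print(total_train_batch_size)  # 32, matching total_train_batch_size above
```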
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size | Epoch Runtime | Bleu |
|---|---|---|---|---|---|---|
| No log | 0 | 0 | 5.9554 | 0 | 9.4906 | 0.4722 |
| No log | 1 | 437 | 4.6232 | 0.0078 | 10.6367 | 1.1591 |
| No log | 2 | 874 | 3.8821 | 0.0156 | 12.2133 | 2.2390 |
| No log | 3 | 1311 | 3.4263 | 0.0312 | 15.0625 | 3.3700 |
| No log | 4 | 1748 | 3.0455 | 0.0625 | 19.3313 | 4.5330 |
| 2.8613 | 5 | 2185 | 2.7022 | 0.125 | 26.0513 | 5.3314 |
| 2.5206 | 6 | 2622 | 2.3944 | 0.25 | 37.4514 | 6.4168 |
| 2.1287 | 7 | 3059 | 2.1496 | 0.5 | 62.9429 | 7.2491 |
| 1.6994 | 8 | 3496 | 1.9230 | 1.0 | 112.8877 | 8.4187 |
| 1.3225 | 9 | 3933 | 1.9081 | 1.0 | 112.4188 | 8.8377 |
| 0.9777 | 10 | 4370 | 1.9337 | 1.0 | 110.6205 | 9.0018 |
| 0.7374 | 11 | 4807 | 1.9846 | 1.0 | 112.6945 | 8.6039 |
| 0.5269 | 12 | 5244 | 2.0883 | 1.0 | 111.7913 | 8.2357 |
| 0.3633 | 13 | 5681 | 2.1585 | 1.0 | 111.6070 | 8.2384 |
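Note that the headline metrics are from the final epoch (13), not the best checkpoint: validation loss bottoms out at epoch 9 (1.9081) and BLEU peaks at epoch 10 (9.0018), after which training loss keeps falling while validation metrics degrade, the usual signature of overfitting. Selecting the best checkpoint from the table is straightforward:

```python
# (epoch, validation_loss, bleu) for the full-data epochs in the table above
results = [
    (8, 1.9230, 8.4187),
    (9, 1.9081, 8.8377),
    (10, 1.9337, 9.0018),
    (11, 1.9846, 8.6039),
    (12, 2.0883, 8.2357),
    (13, 2.1585, 8.2384),
]

best_by_loss = min(results, key=lambda r: r[1])  # lowest validation loss
best_by_bleu = max(results, key=lambda r: r[2])  # highest BLEU

print(best_by_loss[0])  # 9
print(best_by_bleu[0])  # 10
```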
### Framework versions
- Transformers 4.57.0
- Pytorch 2.8.0+cu128
- Datasets 4.2.0
- Tokenizers 0.22.1