# 647fe3da288ee2dba74e55d6843bf4d7
This model is a fine-tuned version of facebook/mbart-large-50-many-to-one-mmt on the Helsinki-NLP/opus_books [it-ru] dataset. It achieves the following results on the evaluation set:
- Loss: 1.8024
- Data Size: 1.0
- Epoch Runtime: 117.2758
- Bleu: 10.9219
## Model description
More information needed
## Intended uses & limitations
More information needed
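No usage example is provided upstream; as a rough guide, the sketch below shows one way to run Italian-to-Russian translation with this checkpoint through the `transformers` API. The repo id is taken from this card, and the `it_IT` source-language code is an assumption based on the it-ru dataset pair.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Repo id as listed on this card; adjust if the weights live elsewhere.
model_id = "contemmcm/647fe3da288ee2dba74e55d6843bf4d7"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# mBART-50 tokenizers need an explicit source-language code;
# "it_IT" (Italian) is assumed here from the it-ru pair named above.
tokenizer.src_lang = "it_IT"

text = "Il sole tramontava dietro le colline."
inputs = tokenizer(text, return_tensors="pt")
generated = model.generate(**inputs, num_beams=4, max_new_tokens=128)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```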
## Training and evaluation data
More information needed
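The card only names the dataset; below is a minimal loading sketch with the `datasets` library, assuming the `it-ru` config name matches the pair listed above.

```python
from datasets import load_dataset

# "it-ru" is assumed to be the opus_books config matching the pair on this card.
dataset = load_dataset("Helsinki-NLP/opus_books", "it-ru")

# opus_books rows carry a "translation" dict keyed by language code.
sample = dataset["train"][0]
print(sample["translation"]["it"])
print(sample["translation"]["ru"])
```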
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50
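As a rough guide only, the listed values map onto `Seq2SeqTrainingArguments` as sketched below; the output directory and generation setting are assumptions, and the total batch size of 32 comes from launching on 4 GPUs (e.g. via `torchrun` or `accelerate`).

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mbart50-opus-books-it-ru",  # assumed name, not from the card
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="constant",
    num_train_epochs=50,
    predict_with_generate=True,  # assumed: needed to score BLEU at eval time
)
```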
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size | Epoch Runtime | Bleu |
|---|---|---|---|---|---|---|
| No log | 0 | 0 | 5.9666 | 0 | 9.8803 | 0.7704 |
| No log | 1 | 447 | 4.5187 | 0.0078 | 11.8300 | 1.7236 |
| 0.0793 | 2 | 894 | 3.4836 | 0.0156 | 12.8525 | 3.1679 |
| 0.0824 | 3 | 1341 | 2.9831 | 0.0312 | 14.8335 | 4.5718 |
| 0.1177 | 4 | 1788 | 2.6561 | 0.0625 | 18.0296 | 5.8346 |
| 0.1931 | 5 | 2235 | 2.3428 | 0.125 | 25.1397 | 6.9470 |
| 2.2172 | 6 | 2682 | 2.0652 | 0.25 | 38.0805 | 8.0879 |
| 1.7968 | 7 | 3129 | 1.8062 | 0.5 | 66.6981 | 9.4933 |
| 1.3694 | 8 | 3576 | 1.6164 | 1.0 | 118.5646 | 10.5113 |
| 1.07 | 9 | 4023 | 1.5784 | 1.0 | 117.8986 | 11.0174 |
| 0.7746 | 10 | 4470 | 1.6013 | 1.0 | 117.9308 | 10.8405 |
| 0.5413 | 11 | 4917 | 1.6573 | 1.0 | 118.1724 | 11.1652 |
| 0.3863 | 12 | 5364 | 1.7049 | 1.0 | 118.2122 | 11.2885 |
| 0.2563 | 13 | 5811 | 1.8024 | 1.0 | 117.2758 | 10.9219 |
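The card does not state how the Bleu column was produced; a common choice is sacreBLEU via the `evaluate` library, sketched below with dummy strings standing in for real model outputs and references.

```python
import evaluate

bleu = evaluate.load("sacrebleu")

# Dummy prediction/reference pair purely for illustration.
predictions = ["Солнце садилось за холмами."]
references = [["Солнце заходило за холмы."]]

print(bleu.compute(predictions=predictions, references=references)["score"])
```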
### Framework versions
- Transformers 4.57.0
- Pytorch 2.8.0+cu128
- Datasets 4.2.0
- Tokenizers 0.22.1