---
license: apache-2.0
base_model: google/t5-v1_1-large
tags:
- generated_from_trainer
model-index:
- name: SChem5Labels-google-t5-v1_1-large-inter_model-shuffle-model_annots_str
  results: []
---

# SChem5Labels-google-t5-v1_1-large-inter_model-shuffle-model_annots_str

This model is a fine-tuned version of [google/t5-v1_1-large](https://huggingface.co/google/t5-v1_1-large) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: nan

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 200

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 19.9516       | 1.0   | 25   | 23.6924         |
| 18.985        | 2.0   | 50   | 21.9715         |
| 18.5733       | 3.0   | 75   | 19.0631         |
| 16.456        | 4.0   | 100  | 14.6944         |
| 14.6058       | 5.0   | 125  | 10.3972         |
| 11.384        | 6.0   | 150  | 8.9604          |
| 10.1534       | 7.0   | 175  | 8.6228          |
| 8.6467        | 8.0   | 200  | 8.4688          |
| 8.2161        | 9.0   | 225  | 8.3154          |
| 7.9229        | 10.0  | 250  | 8.2324          |
| 7.8179        | 11.0  | 275  | 8.1809          |
| 7.7843        | 12.0  | 300  | 8.0948          |
| 7.5714        | 13.0  | 325  | 7.9681          |
| 7.2487        | 14.0  | 350  | 7.7352          |
| 7.2237        | 15.0  | 375  | 7.4691          |
| 6.9821        | 16.0  | 400  | 7.2523          |
| 6.8667        | 17.0  | 425  | 7.1151          |
| 6.8551        | 18.0  | 450  | 7.0423          |
| 6.7468        | 19.0  | 475  | 6.9926          |
| 6.6918        | 20.0  | 500  | 6.9466          |
| 6.4912        | 21.0  | 525  | 6.9125          |
| 6.5704        | 22.0  | 550  | 6.8707          |
| 6.4854        | 23.0  | 575  | 6.8123          |
| 3.9521        | 24.0  | 600  | 1.5605          |
| 1.16          | 25.0  | 625  | 1.0195          |
| 1.0643        | 26.0  | 650  | 1.0007          |
| 1.0417        | 27.0  | 675  | 1.0069          |
| 1.04          | 28.0  | 700  | 0.9974          |
| 1.0347        | 29.0  | 725  | 0.9974          |
| 1.0375        | 30.0  | 750  | 1.0006          |
| 1.0382        | 31.0  | 775  | 0.9958          |
| 1.0347        | 32.0  | 800  | 0.9999          |
| 1.0198        | 33.0  | 825  | 1.0013          |
| 1.0092        | 34.0  | 850  | 1.0044          |
| 1.0376        | 35.0  | 875  | 1.0045          |
| 1.0245        | 36.0  | 900  | 0.9974          |

### Framework versions

- Transformers 4.34.0
- Pytorch 2.1.0+cu121
- Datasets 2.6.1
- Tokenizers 0.14.1
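
The hyperparameters listed above map onto `Seq2SeqTrainingArguments` roughly as sketched below. This is an assumption-laden reconstruction, not the actual training script: `output_dir` is a placeholder, the batch sizes are treated as per-device values (single-GPU training assumed), and the dataset and preprocessing code are not part of this card.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the configuration under "Training hyperparameters" (Transformers 4.34 argument names).
# output_dir is a placeholder; batch sizes assume a single device.
training_args = Seq2SeqTrainingArguments(
    output_dir="SChem5Labels-google-t5-v1_1-large-inter_model-shuffle-model_annots_str",
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    evaluation_strategy="epoch",  # the results table logs one validation loss per epoch
    logging_strategy="epoch",
)
```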
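
For loading the checkpoint, a minimal inference sketch follows, assuming the model is published on the Hub under the name above (adjust the repo id to the actual namespace). Note that the reported evaluation loss is `nan`, so generated outputs may not be meaningful without further inspection of the checkpoint.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical repo id; replace with the actual Hub path of this checkpoint.
model_id = "SChem5Labels-google-t5-v1_1-large-inter_model-shuffle-model_annots_str"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# T5-style seq2seq generation on an arbitrary example input.
inputs = tokenizer("example input text", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```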