20250331_072726_gemma-3-27b-pt_LoRA / KETI_b1_s4_e2_training_log.log
03/31/2025 07:27:42 - INFO - Train data file: finetuning_data_25_sentences.json
03/31/2025 07:27:42 - INFO - Output Directory: output/gemma-3-27b-pt/20250331_072726_gemma-3-27b-pt_LoRA
03/31/2025 07:27:42 - INFO - Experiment name: KETI_b1_s4_e2
03/31/2025 07:27:42 - INFO - torch_dtype: torch.bfloat16
03/31/2025 07:27:42 - INFO - 🔍 Start inference on base model: google/gemma-3-27b-it
03/31/2025 07:28:41 - INFO - ✅ base_model and tokenizer released from memory
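These two lines reflect a common pattern: run a quick inference pass on the base checkpoint before fine-tuning, then free it so the GPUs are clean for training. A minimal sketch of that pattern follows; it assumes the transformers Auto classes can load this checkpoint (depending on the transformers version, Gemma 3 may require its multimodal class instead), and the prompt and generation settings are illustrative, not taken from this run.

```python
# Sketch only: pre-training inference on the base model, then explicit cleanup.
# Model ID and dtype come from the log; everything else is an assumption.
import gc
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "google/gemma-3-27b-it"
tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.bfloat16,  # matches "torch_dtype: torch.bfloat16" above
    device_map="auto",           # assumption: shard the 27B model across GPUs
)

inputs = tokenizer("Explain LoRA in one sentence.", return_tensors="pt").to(base_model.device)
with torch.no_grad():
    output = base_model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))

# Mirror the "released from memory" log line: drop references and clear caches.
del base_model, tokenizer
gc.collect()
torch.cuda.empty_cache()
```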
03/31/2025 07:28:41 - INFO - Using 6 GPU(s): NVIDIA A100-SXM4-80GB
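The GPU summary above is straightforward to reproduce with torch; a small sketch (assumed, not the run's actual code):

```python
# Sketch: report visible GPUs in the same style as the log line (assumed code).
import torch

count = torch.cuda.device_count()
names = sorted({torch.cuda.get_device_name(i) for i in range(count)})
print(f"Using {count} GPU(s): {', '.join(names)}")
```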
03/31/2025 07:28:41 - INFO - 🔒 Training samples: 37822
03/31/2025 07:28:41 - INFO - 🔍 Evaluation samples: 4203
03/31/2025 07:28:41 - INFO - 📊 Steps per epoch: 1575
03/31/2025 07:28:41 - INFO - 🪜 Total training steps: 3150
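The sample and step counts are mutually consistent if the experiment name KETI_b1_s4_e2 is read as per-device batch size 1 (b1), 4 gradient-accumulation steps (s4), and 2 epochs (e2), which is an interpretation of the name rather than something stated in the log: 1 × 4 × 6 GPUs gives an effective batch of 24, 37,822 / 24 rounds down to 1,575 steps per epoch, and 1,575 × 2 = 3,150 total steps. A quick check:

```python
# Sanity-check the logged step counts under the assumed reading of "b1_s4_e2".
train_samples = 37822     # from the log
per_device_batch = 1      # "b1" (assumed)
grad_accum_steps = 4      # "s4" (assumed)
num_gpus = 6              # from the log
epochs = 2                # "e2" (assumed)

effective_batch = per_device_batch * grad_accum_steps * num_gpus  # 24
steps_per_epoch = train_samples // effective_batch                # 1575
total_steps = steps_per_epoch * epochs                            # 3150
print(steps_per_epoch, total_steps)
```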
03/31/2025 07:28:41 - INFO - ✅ Training in FFT or LoRA mode.
03/31/2025 07:28:55 - INFO - Initializing LoRA model...
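A minimal sketch of what a LoRA initialization like this typically looks like with peft; the rank, alpha, dropout, and target modules are illustrative assumptions rather than values read from this run, and the base checkpoint here is the gemma-3-27b-pt model implied by the output directory.

```python
# Sketch: wrap the training base model with LoRA adapters via peft.
# All hyperparameters below are assumptions for illustration.
import torch
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

train_base = AutoModelForCausalLM.from_pretrained(
    "google/gemma-3-27b-pt",     # training base implied by the output directory
    torch_dtype=torch.bfloat16,
)
lora_config = LoraConfig(
    r=16,                                 # assumed LoRA rank
    lora_alpha=32,                        # assumed scaling factor
    lora_dropout=0.05,                    # assumed dropout
    target_modules=["q_proj", "v_proj"],  # assumed target projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(train_base, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```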
03/31/2025 07:29:01 - INFO - gcc -pthread -B /root/pai/envs/llm-finetuning/compiler_compat -DNDEBUG -fwrapv -O2 -Wall -fPIC -O2 -isystem /root/pai/envs/llm-finetuning/include -fPIC -O2 -isystem /root/pai/envs/llm-finetuning/include -fPIC -c /tmp/tmp1xul7knp/test.c -o /tmp/tmp1xul7knp/test.o
03/31/2025 07:29:01 - INFO - gcc -pthread -B /root/pai/envs/llm-finetuning/compiler_compat -DNDEBUG -fwrapv -O2 -Wall -fPIC -O2 -isystem /root/pai/envs/llm-finetuning/include -fPIC -O2 -isystem /root/pai/envs/llm-finetuning/include -fPIC -c /tmp/tmpp1iy2isl/test.c -o /tmp/tmpp1iy2isl/test.o
03/31/2025 07:29:02 - INFO - Start Training !
03/31/2025 07:29:29 - INFO - [Epoch 0.11] [Step 10] loss: 3.5779
03/31/2025 07:29:53 - INFO - [Epoch 0.22] [Step 20] loss: 3.0373
03/31/2025 07:30:16 - INFO - [Epoch 0.33] [Step 30] loss: 2.9353
03/31/2025 07:30:39 - INFO - [Epoch 0.44] [Step 40] loss: 2.8949
03/31/2025 07:31:02 - INFO - [Epoch 0.55] [Step 50] loss: 2.8782
03/31/2025 07:31:25 - INFO - [Epoch 0.66] [Step 60] loss: 2.8695
03/31/2025 07:31:48 - INFO - [Epoch 0.77] [Step 70] loss: 2.8571
03/31/2025 07:32:11 - INFO - [Epoch 0.88] [Step 80] loss: 2.8411
03/31/2025 07:32:34 - INFO - [Epoch 0.99] [Step 90] loss: 2.8435
03/31/2025 07:34:08 - INFO - [Epoch 1.09] [Step 100] loss: 2.7393
03/31/2025 07:34:31 - INFO - [Epoch 1.20] [Step 110] loss: 2.7271
03/31/2025 07:34:54 - INFO - [Epoch 1.31] [Step 120] loss: 2.7266
03/31/2025 07:35:17 - INFO - [Epoch 1.42] [Step 130] loss: 2.7222
03/31/2025 07:35:40 - INFO - [Epoch 1.53] [Step 140] loss: 2.7247
03/31/2025 07:36:03 - INFO - [Epoch 1.64] [Step 150] loss: 2.7279
03/31/2025 07:36:26 - INFO - [Epoch 1.75] [Step 160] loss: 2.7273
03/31/2025 07:36:49 - INFO - [Epoch 1.85] [Step 170] loss: 2.7270
03/31/2025 07:37:12 - INFO - [Epoch 1.96] [Step 180] loss: 2.7116
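The loss lines above appear every 10 optimizer steps, which is the shape of a logging callback with logging_steps=10. A hedged sketch of one way to produce that format with a transformers TrainerCallback (the run's actual logging code is not shown in this log):

```python
# Sketch: emit "[Epoch e] [Step s] loss: x" lines from a transformers Trainer.
# Assumes TrainingArguments(logging_steps=10, ...); not the run's actual code.
import logging
from transformers import TrainerCallback

logger = logging.getLogger(__name__)

class LossLogCallback(TrainerCallback):
    def on_log(self, args, state, control, logs=None, **kwargs):
        # During training, the logged dict contains "loss" on logging steps.
        if logs and "loss" in logs:
            logger.info(
                f"[Epoch {state.epoch:.2f}] [Step {state.global_step}] "
                f"loss: {logs['loss']:.4f}"
            )
```

A callback like this would be registered with Trainer(..., callbacks=[LossLogCallback()]).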
03/31/2025 07:38:27 - INFO - ✅ Training complete. Logging system usage...
03/31/2025 07:38:27 - INFO - >> System Usage - CPU: 5.8%, RAM: 2.7%, SSD: 75.50GB / 1888.43GB
03/31/2025 07:38:27 - INFO - >> GPU 0: 76.86 GB used
03/31/2025 07:38:27 - INFO - >> GPU 1: 73.42 GB used
03/31/2025 07:38:27 - INFO - >> GPU 2: 76.32 GB used
03/31/2025 07:38:27 - INFO - >> GPU 3: 75.68 GB used
03/31/2025 07:38:27 - INFO - >> GPU 4: 76.52 GB used
03/31/2025 07:38:27 - INFO - >> GPU 5: 74.11 GB used
03/31/2025 07:38:27 - INFO - >> Total GPU Memory Used: 452.92 GB
03/31/2025 07:38:28 - INFO - >> Total GPU Power Consumption: 531.23 W
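The closing usage report maps onto standard psutil and NVML queries; a sketch of how those figures can be gathered (assuming the psutil and pynvml packages, not the run's actual code):

```python
# Sketch: gather the CPU/RAM/SSD/GPU figures reported at the end of the log.
# Assumes the psutil and pynvml (nvidia-ml-py) packages; not the run's actual code.
import psutil
import pynvml

GB = 1024 ** 3

disk = psutil.disk_usage("/")
print(f">> System Usage - CPU: {psutil.cpu_percent()}%, "
      f"RAM: {psutil.virtual_memory().percent}%, "
      f"SSD: {disk.used / GB:.2f}GB / {disk.total / GB:.2f}GB")

pynvml.nvmlInit()
total_mem, total_power = 0.0, 0.0
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    used = pynvml.nvmlDeviceGetMemoryInfo(handle).used / GB
    power = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # milliwatts -> watts
    total_mem += used
    total_power += power
    print(f">> GPU {i}: {used:.2f} GB used")
print(f">> Total GPU Memory Used: {total_mem:.2f} GB")
print(f">> Total GPU Power Consumption: {total_power:.2f} W")
pynvml.nvmlShutdown()
```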