LAUNCH INFO 2022-08-23 17:36:23,308 ----------- Configuration ----------------------
LAUNCH INFO 2022-08-23 17:36:23,308 devices: None
LAUNCH INFO 2022-08-23 17:36:23,308 elastic_level: -1
LAUNCH INFO 2022-08-23 17:36:23,308 elastic_timeout: 30
LAUNCH INFO 2022-08-23 17:36:23,308 gloo_port: 6767
LAUNCH INFO 2022-08-23 17:36:23,309 host: None
LAUNCH INFO 2022-08-23 17:36:23,309 job_id: default
LAUNCH INFO 2022-08-23 17:36:23,309 legacy: False
LAUNCH INFO 2022-08-23 17:36:23,309 log_dir: output/topformer/topformer_base_ade20k_512x512_160k/test_0/log_dir
LAUNCH INFO 2022-08-23 17:36:23,309 log_level: INFO
LAUNCH INFO 2022-08-23 17:36:23,309 master: None
LAUNCH INFO 2022-08-23 17:36:23,309 max_restart: 3
LAUNCH INFO 2022-08-23 17:36:23,309 nnodes: 1
LAUNCH INFO 2022-08-23 17:36:23,309 nproc_per_node: None
LAUNCH INFO 2022-08-23 17:36:23,309 rank: -1
LAUNCH INFO 2022-08-23 17:36:23,309 run_mode: collective
LAUNCH INFO 2022-08-23 17:36:23,309 server_num: None
LAUNCH INFO 2022-08-23 17:36:23,309 servers:
LAUNCH INFO 2022-08-23 17:36:23,309 trainer_num: None
LAUNCH INFO 2022-08-23 17:36:23,309 trainers:
LAUNCH INFO 2022-08-23 17:36:23,309 training_script: train.py
LAUNCH INFO 2022-08-23 17:36:23,309 training_script_args: ['--config', 'configs/topformer/topformer_base_ade20k_512x512_160k.yml', '--save_dir', 'output/topformer/topformer_base_ade20k_512x512_160k/test_0', '--num_workers', '3', '--do_eval', '--use_vdl', '--log_iters', '50']
LAUNCH INFO 2022-08-23 17:36:23,309 with_gloo: 0
LAUNCH INFO 2022-08-23 17:36:23,309 --------------------------------------------------
LAUNCH INFO 2022-08-23 17:36:23,314 Job: default, mode collective, replicas 1[1:1], elastic False
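The configuration above is what paddle.distributed.launch echoes for the arguments it was given, so the run corresponds roughly to the command below. This is a reconstruction from training_script, training_script_args and the log_dir value, with CUDA_VISIBLE_DEVICES taken from the environment information further down; the verbatim invocation is not part of the log:

    CUDA_VISIBLE_DEVICES=2,3 python -m paddle.distributed.launch \
        --log_dir output/topformer/topformer_base_ade20k_512x512_160k/test_0/log_dir \
        train.py \
        --config configs/topformer/topformer_base_ade20k_512x512_160k.yml \
        --save_dir output/topformer/topformer_base_ade20k_512x512_160k/test_0 \
        --num_workers 3 --do_eval --use_vdl --log_iters 50

With devices and nproc_per_node left unset, launch starts one trainer per visible GPU, which matches the two replicas reported below.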
LAUNCH INFO 2022-08-23 17:36:23,315 Run Pod: xybidw, replicas 2, status ready
LAUNCH INFO 2022-08-23 17:36:23,343 Watching Pod: xybidw, replicas 2, status running
2022-08-23 17:36:25 [INFO] ------------Environment Information-------------
platform: Linux-3.10.0-1062.18.1.el7.x86_64-x86_64-with-glibc2.18
Python: 3.9.7 (default, Sep 16 2021, 13:09:58) [GCC 7.5.0]
Paddle compiled with cuda: True
NVCC: Cuda compilation tools, release 10.2, V10.2.89
cudnn: 7.6
GPUs used: 2
CUDA_VISIBLE_DEVICES: 2,3
GPU: ['GPU 0: Tesla V100-SXM2-16GB', 'GPU 1: Tesla V100-SXM2-16GB', 'GPU 2: Tesla V100-SXM2-16GB', 'GPU 3: Tesla V100-SXM2-16GB', 'GPU 4: Tesla V100-SXM2-16GB', 'GPU 5: Tesla V100-SXM2-16GB', 'GPU 6: Tesla V100-SXM2-16GB', 'GPU 7: Tesla V100-SXM2-16GB']
GCC: gcc (GCC) 9.1.0
PaddleSeg: develop
PaddlePaddle: 2.3.0
OpenCV: 4.4.0
------------------------------------------------
2022-08-23 17:36:25 [INFO] ---------------Config Information---------------
batch_size: 8
export:
  transforms:
  - keep_ratio: true
    size_divisor: 32
    target_size:
    - 2048
    - 512
    type: Resize
  - mean:
    - 0.485
    - 0.456
    - 0.406
    std:
    - 0.229
    - 0.224
    - 0.225
    type: Normalize
iters: 160000
loss:
  coef:
  - 1
  types:
  - ignore_index: 255
    type: CrossEntropyLoss
lr_scheduler:
  end_lr: 0
  learning_rate: 0.0012
  power: 1.0
  type: PolynomialDecay
  warmup_iters: 1500
  warmup_start_lr: 1.0e-06
model:
  backbone:
    lr_mult: 0.1
    pretrained: https://paddleseg.bj.bcebos.com/dygraph/backbone/topformer_base_imagenet_pretrained.zip
    type: TopTransformer_Base
  type: TopFormer
optimizer:
  type: AdamW
  weight_decay: 0.01
train_dataset:
  dataset_root: data/ADEChallengeData2016/
  mode: train
  transforms:
  - max_scale_factor: 2.0
    min_scale_factor: 0.5
    scale_step_size: 0.25
    type: ResizeStepScaling
  - crop_size:
    - 512
    - 512
    type: RandomPaddingCrop
  - type: RandomHorizontalFlip
  - brightness_range: 0.4
    contrast_range: 0.4
    saturation_range: 0.4
    type: RandomDistort
  - mean:
    - 0.485
    - 0.456
    - 0.406
    std:
    - 0.229
    - 0.224
    - 0.225
    type: Normalize
  type: ADE20K
val_dataset:
  dataset_root: data/ADEChallengeData2016/
  mode: val
  transforms:
  - keep_ratio: true
    size_divisor: 32
    target_size:
    - 2048
    - 512
    type: Resize
  - mean:
    - 0.485
    - 0.456
    - 0.406
    std:
    - 0.229
    - 0.224
    - 0.225
    type: Normalize
  type: ADE20K
------------------------------------------------
W0823 17:36:25.907640 19860 gpu_context.cc:278] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 10.2, Runtime API Version: 10.2
W0823 17:36:25.907712 19860 gpu_context.cc:306] device: 0, cuDNN Version: 7.6.
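For reading the lr values in the training log below: the lr_scheduler section (PolynomialDecay, power 1.0, warmup_iters 1500, warmup_start_lr 1.0e-06) implies a linear warmup to 0.0012 over the first 1500 iterations, then an essentially linear decay towards 0 over the remaining ones. A minimal sketch of that schedule, assuming PaddleSeg's usual warmup-then-decay composition rather than quoting its implementation:

    def lr_at(step,
              base_lr=0.0012, warmup_start_lr=1.0e-06, warmup_iters=1500,
              end_lr=0.0, power=1.0, total_iters=160000):
        """Learning rate implied by the lr_scheduler config above (sketch only)."""
        if step < warmup_iters:
            # Linear warmup from warmup_start_lr up to the base learning rate.
            return warmup_start_lr + (base_lr - warmup_start_lr) * step / warmup_iters
        # Polynomial (power=1.0, i.e. linear) decay over the iterations left after warmup.
        decay_steps = total_iters - warmup_iters
        frac = min(step - warmup_iters, decay_steps) / decay_steps
        return (base_lr - end_lr) * (1.0 - frac) ** power + end_lr

    for it in (50, 1500, 2000, 3000):
        # Agrees with the logged lr values up to rounding and the 0/1-based step
        # convention: ~0.000040, ~0.001199, ~0.001196, ~0.001189.
        print(it, round(lr_at(it - 1), 6))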
2022-08-23 17:36:32 [INFO] Loading pretrained model from https://paddleseg.bj.bcebos.com/dygraph/backbone/topformer_base_imagenet_pretrained.zip
2022-08-23 17:36:32 [WARNING] SIM.1.local_embedding.conv.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.1.local_embedding.bn.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.1.local_embedding.bn.bias is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.1.local_embedding.bn._mean is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.1.local_embedding.bn._variance is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.1.global_embedding.conv.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.1.global_embedding.bn.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.1.global_embedding.bn.bias is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.1.global_embedding.bn._mean is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.1.global_embedding.bn._variance is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.1.global_act.conv.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.1.global_act.bn.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.1.global_act.bn.bias is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.1.global_act.bn._mean is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.1.global_act.bn._variance is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.local_embedding.conv.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.local_embedding.bn.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.local_embedding.bn.bias is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.local_embedding.bn._mean is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.local_embedding.bn._variance is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.global_embedding.conv.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.global_embedding.bn.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.global_embedding.bn.bias is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.global_embedding.bn._mean is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.global_embedding.bn._variance is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.global_act.conv.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.global_act.bn.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.global_act.bn.bias is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.global_act.bn._mean is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.2.global_act.bn._variance is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.local_embedding.conv.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.local_embedding.bn.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.local_embedding.bn.bias is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.local_embedding.bn._mean is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.local_embedding.bn._variance is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.global_embedding.conv.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.global_embedding.bn.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.global_embedding.bn.bias is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.global_embedding.bn._mean is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.global_embedding.bn._variance is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.global_act.conv.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.global_act.bn.weight is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.global_act.bn.bias is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.global_act.bn._mean is not in pretrained model
2022-08-23 17:36:32 [WARNING] SIM.3.global_act.bn._variance is not in pretrained model
2022-08-23 17:36:32 [INFO] There are 278/323 variables loaded into TopTransformer.
server not ready, wait 3 sec to retry... not ready endpoints:['10.9.189.6:32912']
I0823 17:36:35.983691 19860 nccl_context.cc:83] init nccl context nranks: 2 local rank: 0 gpu id: 0 ring id: 0
I0823 17:36:36.261029 19860 nccl_context.cc:115] init nccl context nranks: 2 local rank: 0 gpu id: 0 ring id: 10
/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/tensor/creation.py:125: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe. Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  if data.dtype == np.object:
2022-08-23 17:36:36,308-INFO: [topology.py:169:__init__] HybridParallelInfo: rank_id: 0, mp_degree: 1, sharding_degree: 1, pp_degree: 1, dp_degree: 2, mp_group: [0], sharding_group: [0], pp_group: [0], dp_group: [0, 1], check/clip group: [0]
/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/math_op_patch.py:276: UserWarning: The dtype of left and right variables are not the same, left dtype is paddle.float32, but right dtype is paddle.int64, the right dtype will convert to paddle.float32
  warnings.warn(
2022-08-23 17:36:47 [INFO] [TRAIN] epoch: 1, iter: 50/160000, loss: 5.4469, lr: 0.000040, batch_cost: 0.2260, reader_cost: 0.03316, ips: 35.4021 samples/sec | ETA 10:02:24
2022-08-23 17:36:55 [INFO] [TRAIN] epoch: 1, iter: 100/160000, loss: 5.1218, lr: 0.000080, batch_cost: 0.1588, reader_cost: 0.00075, ips: 50.3936 samples/sec | ETA 07:03:04
2022-08-23 17:37:04 [INFO] [TRAIN] epoch: 1, iter: 150/160000, loss: 4.5977, lr: 0.000120, batch_cost: 0.1721, reader_cost: 0.00073, ips: 46.4888 samples/sec | ETA 07:38:27
2022-08-23 17:37:12 [INFO] [TRAIN] epoch: 1, iter: 200/160000, loss: 3.9248, lr: 0.000160, batch_cost: 0.1610, reader_cost: 0.00069, ips: 49.6961 samples/sec | ETA 07:08:44
2022-08-23 17:37:22 [INFO] [TRAIN] epoch: 1, iter: 250/160000, loss: 3.4920, lr: 0.000200, batch_cost: 0.1957, reader_cost: 0.00062, ips: 40.8689 samples/sec | ETA 08:41:10
2022-08-23 17:37:31 [INFO] [TRAIN] epoch: 1, iter: 300/160000, loss: 3.0498, lr: 0.000240, batch_cost: 0.1861, reader_cost: 0.00050, ips: 42.9951 samples/sec | ETA 08:15:15
2022-08-23 17:37:41 [INFO] [TRAIN] epoch: 1, iter: 350/160000, loss: 2.8793, lr: 0.000280, batch_cost: 0.1941, reader_cost: 0.00065, ips: 41.2069 samples/sec | ETA 08:36:34
2022-08-23 17:37:51 [INFO] [TRAIN] epoch: 1, iter: 400/160000, loss: 2.7087, lr: 0.000320, batch_cost: 0.2023, reader_cost: 0.00388, ips: 39.5401 samples/sec | ETA 08:58:11
2022-08-23 17:38:01 [INFO] [TRAIN] epoch: 1, iter: 450/160000, loss: 2.4798, lr: 0.000360, batch_cost: 0.2017, reader_cost: 0.00047, ips: 39.6547 samples/sec | ETA 08:56:27
2022-08-23 17:38:12 [INFO] [TRAIN] epoch: 1, iter: 500/160000, loss: 2.4355, lr:
0.000400, batch_cost: 0.2174, reader_cost: 0.00047, ips: 36.7947 samples/sec | ETA 09:37:58 2022-08-23 17:38:24 [INFO] [TRAIN] epoch: 1, iter: 550/160000, loss: 2.3115, lr: 0.000440, batch_cost: 0.2389, reader_cost: 0.00044, ips: 33.4934 samples/sec | ETA 10:34:45 2022-08-23 17:38:33 [INFO] [TRAIN] epoch: 1, iter: 600/160000, loss: 2.2101, lr: 0.000480, batch_cost: 0.1910, reader_cost: 0.00081, ips: 41.8937 samples/sec | ETA 08:27:18 2022-08-23 17:38:44 [INFO] [TRAIN] epoch: 1, iter: 650/160000, loss: 2.1531, lr: 0.000520, batch_cost: 0.2194, reader_cost: 0.00053, ips: 36.4565 samples/sec | ETA 09:42:47 2022-08-23 17:38:54 [INFO] [TRAIN] epoch: 1, iter: 700/160000, loss: 2.0302, lr: 0.000560, batch_cost: 0.1973, reader_cost: 0.00042, ips: 40.5541 samples/sec | ETA 08:43:44 2022-08-23 17:39:05 [INFO] [TRAIN] epoch: 1, iter: 750/160000, loss: 2.0979, lr: 0.000600, batch_cost: 0.2167, reader_cost: 0.00070, ips: 36.9240 samples/sec | ETA 09:35:03 2022-08-23 17:39:15 [INFO] [TRAIN] epoch: 1, iter: 800/160000, loss: 1.8699, lr: 0.000640, batch_cost: 0.2055, reader_cost: 0.00053, ips: 38.9352 samples/sec | ETA 09:05:10 2022-08-23 17:39:25 [INFO] [TRAIN] epoch: 1, iter: 850/160000, loss: 1.8187, lr: 0.000680, batch_cost: 0.1996, reader_cost: 0.00069, ips: 40.0890 samples/sec | ETA 08:49:19 2022-08-23 17:39:36 [INFO] [TRAIN] epoch: 1, iter: 900/160000, loss: 1.7688, lr: 0.000720, batch_cost: 0.2126, reader_cost: 0.00106, ips: 37.6370 samples/sec | ETA 09:23:37 2022-08-23 17:39:46 [INFO] [TRAIN] epoch: 1, iter: 950/160000, loss: 1.7758, lr: 0.000760, batch_cost: 0.2100, reader_cost: 0.00048, ips: 38.1014 samples/sec | ETA 09:16:35 2022-08-23 17:39:57 [INFO] [TRAIN] epoch: 1, iter: 1000/160000, loss: 1.8263, lr: 0.000800, batch_cost: 0.2084, reader_cost: 0.00036, ips: 38.3919 samples/sec | ETA 09:12:11 2022-08-23 17:39:57 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 160s - batch_cost: 0.1604 - reader cost: 6.8739e-04 2022-08-23 17:42:37 [INFO] [EVAL] #Images: 2000 mIoU: 0.0870 Acc: 0.6329 Kappa: 0.6000 Dice: 0.1248 2022-08-23 17:42:37 [INFO] [EVAL] Class IoU: [0.5254 0.6491 0.8562 0.5865 0.5444 0.6182 0.6293 0.4995 0.38 0.5636 0.3019 0.3412 0.5579 0.199 0.044 0.1701 0.2915 0.2399 0.3429 0.2427 0.5978 0.3362 0.3009 0.2298 0.1489 0.2013 0.2544 0.0007 0.1772 0.143 0.0816 0.1341 0.0142 0.1002 0.1348 0. 0.1036 0.1535 0.012 0.1032 0. 0.0003 0.0003 0.0012 0.1098 0. 0. 0.2073 0.0297 0.0211 0.001 0. 0.0008 0.0005 0. 0.0069 0.3172 0.0485 0. 0. 0.0032 0.015 0.0007 0.0024 0.0233 0.0101 0.0064 0.0456 0.0002 0.0021 0.0003 0.1031 0.0108 0. 0.0069 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0006 0. 0. 0. 0.1773 0.0678 0. 0.0009 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0115 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.002 0. 0. 0. 0. 0.0001 0. 0. 0. 0. 0.0001 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. ] 2022-08-23 17:42:37 [INFO] [EVAL] Class Precision: [0.6128 0.6924 0.8962 0.7025 0.6566 0.7589 0.717 0.6756 0.5588 0.6545 0.3772 0.5656 0.6514 0.3506 0.2452 0.3587 0.478 0.5915 0.6279 0.3244 0.7087 0.4422 0.6188 0.2885 0.1874 0.51 0.3802 0.3193 0.4882 0.2458 0.1581 0.7827 0.1883 0.2217 0.1815 0.0185 0.5122 0.3975 0.2458 0.2578 0. 0.1354 0.8547 0.7412 0.4672 0. 0.0652 0.3055 0.6694 0.7391 0.8737 0. 0.1125 0.6809 0. 0.3446 0.8779 0.4377 0. 0.0002 0.0151 0.3414 0.1685 0.319 0.2303 0.5273 0.4511 0.4087 0.0366 0.3462 0.013 0.3286 0.8037 0. 0.6312 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.9647 0. 0. 0. 0.5681 0.9628 0. 0.7391 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.6403 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 
0. 0. 0.3929 0. 0. 0. 0. 0.8421 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. ] 2022-08-23 17:42:37 [INFO] [EVAL] Class Recall: [0.7866 0.9123 0.9504 0.7803 0.7611 0.7693 0.8372 0.6571 0.5429 0.8022 0.602 0.4624 0.7954 0.3153 0.0509 0.2444 0.4277 0.2875 0.4303 0.4906 0.7925 0.5839 0.3693 0.5304 0.4201 0.2495 0.4348 0.0007 0.2176 0.2546 0.1442 0.1393 0.0152 0.1547 0.3438 0. 0.1149 0.2 0.0125 0.1468 0. 0.0003 0.0003 0.0012 0.1255 0. 0. 0.3922 0.0302 0.0213 0.001 0. 0.0008 0.0005 0. 0.0069 0.3318 0.0517 0. 0. 0.004 0.0155 0.0007 0.0024 0.0252 0.0102 0.0065 0.0488 0.0002 0.0021 0.0003 0.1306 0.0108 0. 0.007 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0006 0. 0. 0. 0.2049 0.068 0. 0.0009 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0116 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.002 0. 0. 0. 0. 0.0001 0. 0. 0. 0. 0.0001 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. ] 2022-08-23 17:42:38 [INFO] [EVAL] The model with the best validation mIoU (0.0870) was saved at iter 1000. 2022-08-23 17:42:46 [INFO] [TRAIN] epoch: 1, iter: 1050/160000, loss: 1.7783, lr: 0.000840, batch_cost: 0.1685, reader_cost: 0.00350, ips: 47.4678 samples/sec | ETA 07:26:28 2022-08-23 17:42:55 [INFO] [TRAIN] epoch: 1, iter: 1100/160000, loss: 1.7953, lr: 0.000879, batch_cost: 0.1793, reader_cost: 0.00180, ips: 44.6151 samples/sec | ETA 07:54:52 2022-08-23 17:43:03 [INFO] [TRAIN] epoch: 1, iter: 1150/160000, loss: 1.7436, lr: 0.000919, batch_cost: 0.1678, reader_cost: 0.00049, ips: 47.6886 samples/sec | ETA 07:24:07 2022-08-23 17:43:12 [INFO] [TRAIN] epoch: 1, iter: 1200/160000, loss: 1.6217, lr: 0.000959, batch_cost: 0.1726, reader_cost: 0.00066, ips: 46.3369 samples/sec | ETA 07:36:56 2022-08-23 17:43:20 [INFO] [TRAIN] epoch: 1, iter: 1250/160000, loss: 1.7155, lr: 0.000999, batch_cost: 0.1629, reader_cost: 0.00045, ips: 49.1031 samples/sec | ETA 07:11:03 2022-08-23 17:43:32 [INFO] [TRAIN] epoch: 2, iter: 1300/160000, loss: 1.6472, lr: 0.001039, batch_cost: 0.2411, reader_cost: 0.06594, ips: 33.1841 samples/sec | ETA 10:37:39 2022-08-23 17:43:40 [INFO] [TRAIN] epoch: 2, iter: 1350/160000, loss: 1.5699, lr: 0.001079, batch_cost: 0.1639, reader_cost: 0.00069, ips: 48.7980 samples/sec | ETA 07:13:29 2022-08-23 17:43:51 [INFO] [TRAIN] epoch: 2, iter: 1400/160000, loss: 1.4564, lr: 0.001119, batch_cost: 0.2070, reader_cost: 0.00039, ips: 38.6544 samples/sec | ETA 09:07:04 2022-08-23 17:44:01 [INFO] [TRAIN] epoch: 2, iter: 1450/160000, loss: 1.5840, lr: 0.001159, batch_cost: 0.2012, reader_cost: 0.00052, ips: 39.7616 samples/sec | ETA 08:51:40 2022-08-23 17:44:12 [INFO] [TRAIN] epoch: 2, iter: 1500/160000, loss: 1.4512, lr: 0.001199, batch_cost: 0.2292, reader_cost: 0.00062, ips: 34.9053 samples/sec | ETA 10:05:26 2022-08-23 17:44:23 [INFO] [TRAIN] epoch: 2, iter: 1550/160000, loss: 1.5105, lr: 0.001200, batch_cost: 0.2209, reader_cost: 0.00066, ips: 36.2186 samples/sec | ETA 09:43:18 2022-08-23 17:44:36 [INFO] [TRAIN] epoch: 2, iter: 1600/160000, loss: 1.4952, lr: 0.001199, batch_cost: 0.2632, reader_cost: 0.00110, ips: 30.3945 samples/sec | ETA 11:34:51 2022-08-23 17:44:47 [INFO] [TRAIN] epoch: 2, iter: 1650/160000, loss: 1.5533, lr: 0.001199, batch_cost: 0.2100, reader_cost: 0.00065, ips: 38.0903 samples/sec | ETA 09:14:17 2022-08-23 17:44:57 [INFO] [TRAIN] epoch: 2, iter: 1700/160000, loss: 1.4921, lr: 0.001198, batch_cost: 0.2036, reader_cost: 0.00163, ips: 39.2992 samples/sec | ETA 08:57:04 2022-08-23 17:45:08 [INFO] [TRAIN] epoch: 2, iter: 1750/160000, loss: 1.4831, lr: 0.001198, batch_cost: 0.2178, 
reader_cost: 0.00043, ips: 36.7348 samples/sec | ETA 09:34:23 2022-08-23 17:45:19 [INFO] [TRAIN] epoch: 2, iter: 1800/160000, loss: 1.5095, lr: 0.001198, batch_cost: 0.2182, reader_cost: 0.00071, ips: 36.6631 samples/sec | ETA 09:35:19 2022-08-23 17:45:30 [INFO] [TRAIN] epoch: 2, iter: 1850/160000, loss: 1.4813, lr: 0.001197, batch_cost: 0.2114, reader_cost: 0.00055, ips: 37.8366 samples/sec | ETA 09:17:18 2022-08-23 17:45:40 [INFO] [TRAIN] epoch: 2, iter: 1900/160000, loss: 1.3096, lr: 0.001197, batch_cost: 0.1998, reader_cost: 0.00087, ips: 40.0358 samples/sec | ETA 08:46:31 2022-08-23 17:45:50 [INFO] [TRAIN] epoch: 2, iter: 1950/160000, loss: 1.4331, lr: 0.001197, batch_cost: 0.2044, reader_cost: 0.00065, ips: 39.1459 samples/sec | ETA 08:58:19 2022-08-23 17:46:01 [INFO] [TRAIN] epoch: 2, iter: 2000/160000, loss: 1.4209, lr: 0.001196, batch_cost: 0.2165, reader_cost: 0.00056, ips: 36.9455 samples/sec | ETA 09:30:12 2022-08-23 17:46:01 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 191s - batch_cost: 0.1905 - reader cost: 7.1859e-04 2022-08-23 17:49:11 [INFO] [EVAL] #Images: 2000 mIoU: 0.1671 Acc: 0.6739 Kappa: 0.6481 Dice: 0.2426 2022-08-23 17:49:11 [INFO] [EVAL] Class IoU: [0.5754 0.718 0.8992 0.6288 0.6117 0.6632 0.6832 0.6142 0.4132 0.59 0.2973 0.4532 0.6093 0.1565 0.0748 0.2224 0.4307 0.2661 0.4864 0.2693 0.6517 0.3388 0.4612 0.332 0.181 0.3476 0.3419 0.0279 0.2703 0.1497 0.1421 0.3313 0.1863 0.1866 0.077 0.0011 0.2644 0.2884 0.0712 0.161 0.0001 0.0437 0.0173 0.1726 0.1676 0.0525 0.1473 0.253 0.3201 0.29 0.0701 0.2597 0.0008 0.2019 0.6046 0.1592 0.6424 0.2454 0.0432 0.1246 0.005 0.0603 0.1905 0.1071 0.1277 0.4328 0.145 0.2175 0. 0.0225 0.0047 0.2262 0.3189 0.0038 0.3154 0.2052 0.1124 0.0006 0.0245 0. 0.2885 0.088 0.0935 0.0402 0.1194 0.349 0.0013 0.0007 0. 0.2525 0.3251 0. 0.151 0.0082 0. 0. 0. 0. 0. 0.0003 0.0066 0. 0.0212 0. 0. 0. 0.0005 0.3392 0.0009 0. 0. 0.0491 0. 0.2966 0.2372 0. 0.1191 0.4934 0. 0.3781 0.0315 0. 0. 0.0175 0.0502 0.0002 0.0093 0.1129 0.0802 0. 0.147 0. 0. 0. 0.0051 0.0007 0. 0. 0.0017 0.0159 0. 0.0937 0.024 0.0001 0.0089 0. 0. 0. 0. 0. ] 2022-08-23 17:49:11 [INFO] [EVAL] Class Precision: [0.6926 0.8058 0.939 0.7144 0.7106 0.8261 0.8533 0.7871 0.5412 0.7213 0.3294 0.5585 0.6764 0.5104 0.3376 0.4334 0.533 0.6153 0.6785 0.5242 0.7272 0.4675 0.718 0.4585 0.2256 0.394 0.4035 0.5362 0.5947 0.2457 0.2157 0.4499 0.3854 0.352 0.336 0.7623 0.4962 0.5547 0.5045 0.3898 0.0421 0.2082 0.6285 0.4779 0.3037 0.3025 0.17 0.3311 0.5798 0.5399 0.3845 0.5732 0.788 0.3181 0.6862 0.1724 0.7359 0.4741 0.6917 0.3468 0.1632 0.7556 0.3333 0.5196 0.5308 0.5268 0.1764 0.5417 0. 0.3602 0.1741 0.3994 0.577 0.0842 0.6275 0.587 0.5556 0.2166 0.3403 0. 0.8103 0.5981 0.6096 0.0511 0.9171 0.5838 0.7836 0.1939 0. 0.533 0.4525 0. 0.3658 0.3308 0. 0. 0. 0. 0.0047 0.1776 0.9067 0. 0.5284 0. 0. 0. 1. 0.5454 0.0304 0. 0. 0.0544 0. 0.3963 0.8095 0. 0.4964 0.5935 0. 0.4816 0.3704 0. 0. 0.3989 0.3584 0.734 1. 0.9045 0.1568 0. 0.8447 0. 0. 0. 0.8184 0.0782 0.0385 0.0037 1. 0.8495 0. 0.239 0.463 0.0459 0.6978 0. 0. 0. 0. 0. 
] 2022-08-23 17:49:11 [INFO] [EVAL] Class Recall: [0.7728 0.8683 0.955 0.8399 0.8146 0.7708 0.7741 0.7366 0.636 0.7643 0.753 0.7061 0.86 0.1841 0.0877 0.3136 0.6918 0.3191 0.632 0.3564 0.8625 0.5517 0.5632 0.5461 0.478 0.7472 0.6913 0.0286 0.3313 0.2772 0.2939 0.5568 0.2651 0.2842 0.0909 0.0011 0.3613 0.3753 0.0766 0.2152 0.0001 0.0524 0.0175 0.2127 0.2722 0.0597 0.5247 0.5176 0.4168 0.3852 0.079 0.322 0.0008 0.356 0.8357 0.6747 0.8348 0.3371 0.044 0.1628 0.0052 0.0615 0.3078 0.1189 0.1439 0.7081 0.4489 0.2666 0. 0.0234 0.0048 0.3428 0.4162 0.004 0.388 0.2398 0.1235 0.0006 0.0257 0. 0.3093 0.0935 0.0995 0.1594 0.1207 0.4646 0.0013 0.0007 0. 0.3243 0.5361 0. 0.2045 0.0083 0. 0. 0. 0. 0. 0.0003 0.0067 0. 0.0217 0. 0. 0. 0.0005 0.473 0.001 0. 0. 0.3366 0. 0.5411 0.2512 0. 0.1354 0.7452 0. 0.6375 0.0333 0. 0. 0.018 0.0551 0.0002 0.0093 0.1142 0.1412 0. 0.1511 0. 0. 0. 0.0051 0.0007 0. 0. 0.0017 0.016 0. 0.1336 0.0246 0.0001 0.009 0. 0. 0. 0. 0. ] 2022-08-23 17:49:12 [INFO] [EVAL] The model with the best validation mIoU (0.1671) was saved at iter 2000. 2022-08-23 17:49:19 [INFO] [TRAIN] epoch: 2, iter: 2050/160000, loss: 1.4132, lr: 0.001196, batch_cost: 0.1578, reader_cost: 0.00314, ips: 50.6970 samples/sec | ETA 06:55:24 2022-08-23 17:49:28 [INFO] [TRAIN] epoch: 2, iter: 2100/160000, loss: 1.4099, lr: 0.001195, batch_cost: 0.1786, reader_cost: 0.00094, ips: 44.7825 samples/sec | ETA 07:50:07 2022-08-23 17:49:37 [INFO] [TRAIN] epoch: 2, iter: 2150/160000, loss: 1.3684, lr: 0.001195, batch_cost: 0.1650, reader_cost: 0.00092, ips: 48.4847 samples/sec | ETA 07:14:05 2022-08-23 17:49:44 [INFO] [TRAIN] epoch: 2, iter: 2200/160000, loss: 1.3244, lr: 0.001195, batch_cost: 0.1555, reader_cost: 0.00048, ips: 51.4446 samples/sec | ETA 06:48:59 2022-08-23 17:49:53 [INFO] [TRAIN] epoch: 2, iter: 2250/160000, loss: 1.4722, lr: 0.001194, batch_cost: 0.1718, reader_cost: 0.00077, ips: 46.5564 samples/sec | ETA 07:31:46 2022-08-23 17:50:04 [INFO] [TRAIN] epoch: 2, iter: 2300/160000, loss: 1.3478, lr: 0.001194, batch_cost: 0.2122, reader_cost: 0.00040, ips: 37.6955 samples/sec | ETA 09:17:48 2022-08-23 17:50:14 [INFO] [TRAIN] epoch: 2, iter: 2350/160000, loss: 1.3803, lr: 0.001194, batch_cost: 0.2106, reader_cost: 0.00035, ips: 37.9808 samples/sec | ETA 09:13:26 2022-08-23 17:50:25 [INFO] [TRAIN] epoch: 2, iter: 2400/160000, loss: 1.3744, lr: 0.001193, batch_cost: 0.2089, reader_cost: 0.00102, ips: 38.2919 samples/sec | ETA 09:08:45 2022-08-23 17:50:35 [INFO] [TRAIN] epoch: 2, iter: 2450/160000, loss: 1.3549, lr: 0.001193, batch_cost: 0.2039, reader_cost: 0.00078, ips: 39.2343 samples/sec | ETA 08:55:24 2022-08-23 17:50:46 [INFO] [TRAIN] epoch: 2, iter: 2500/160000, loss: 1.2680, lr: 0.001192, batch_cost: 0.2154, reader_cost: 0.00085, ips: 37.1425 samples/sec | ETA 09:25:23 2022-08-23 17:50:59 [INFO] [TRAIN] epoch: 3, iter: 2550/160000, loss: 1.2913, lr: 0.001192, batch_cost: 0.2768, reader_cost: 0.06473, ips: 28.8976 samples/sec | ETA 12:06:28 2022-08-23 17:51:10 [INFO] [TRAIN] epoch: 3, iter: 2600/160000, loss: 1.2753, lr: 0.001192, batch_cost: 0.2214, reader_cost: 0.00079, ips: 36.1385 samples/sec | ETA 09:40:43 2022-08-23 17:51:22 [INFO] [TRAIN] epoch: 3, iter: 2650/160000, loss: 1.2333, lr: 0.001191, batch_cost: 0.2244, reader_cost: 0.00058, ips: 35.6480 samples/sec | ETA 09:48:31 2022-08-23 17:51:31 [INFO] [TRAIN] epoch: 3, iter: 2700/160000, loss: 1.3933, lr: 0.001191, batch_cost: 0.1802, reader_cost: 0.01523, ips: 44.4048 samples/sec | ETA 07:52:19 2022-08-23 17:51:42 [INFO] [TRAIN] epoch: 3, iter: 
2750/160000, loss: 1.2749, lr: 0.001191, batch_cost: 0.2231, reader_cost: 0.00047, ips: 35.8602 samples/sec | ETA 09:44:40 2022-08-23 17:51:54 [INFO] [TRAIN] epoch: 3, iter: 2800/160000, loss: 1.2633, lr: 0.001190, batch_cost: 0.2375, reader_cost: 0.00070, ips: 33.6873 samples/sec | ETA 10:22:11 2022-08-23 17:52:05 [INFO] [TRAIN] epoch: 3, iter: 2850/160000, loss: 1.2946, lr: 0.001190, batch_cost: 0.2182, reader_cost: 0.00089, ips: 36.6710 samples/sec | ETA 09:31:23 2022-08-23 17:52:16 [INFO] [TRAIN] epoch: 3, iter: 2900/160000, loss: 1.2259, lr: 0.001189, batch_cost: 0.2172, reader_cost: 0.00061, ips: 36.8383 samples/sec | ETA 09:28:36 2022-08-23 17:52:27 [INFO] [TRAIN] epoch: 3, iter: 2950/160000, loss: 1.1872, lr: 0.001189, batch_cost: 0.2269, reader_cost: 0.00092, ips: 35.2560 samples/sec | ETA 09:53:56 2022-08-23 17:52:38 [INFO] [TRAIN] epoch: 3, iter: 3000/160000, loss: 1.3062, lr: 0.001189, batch_cost: 0.2268, reader_cost: 0.00085, ips: 35.2728 samples/sec | ETA 09:53:28 2022-08-23 17:52:38 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 157s - batch_cost: 0.1565 - reader cost: 6.4847e-04 2022-08-23 17:55:15 [INFO] [EVAL] #Images: 2000 mIoU: 0.2159 Acc: 0.6992 Kappa: 0.6751 Dice: 0.3115 2022-08-23 17:55:15 [INFO] [EVAL] Class IoU: [0.601 0.7224 0.9056 0.6564 0.6274 0.7002 0.6851 0.6653 0.4278 0.5385 0.3943 0.449 0.646 0.2767 0.0681 0.2711 0.4493 0.3316 0.4881 0.3202 0.6578 0.2878 0.4873 0.3788 0.2221 0.4033 0.341 0.1177 0.23 0.1843 0.0812 0.3982 0.2428 0.1784 0.256 0.1869 0.3149 0.3391 0.1194 0.1796 0.033 0.0284 0.1339 0.1976 0.2497 0.0296 0.2708 0.1827 0.3847 0.3479 0.2578 0.1954 0.0011 0.1309 0.6108 0.3917 0.7716 0.1984 0.0683 0.1458 0.071 0.231 0.1964 0.1005 0.2398 0.4056 0.1638 0.2893 0. 0.1093 0.0718 0.3086 0.2936 0.0867 0.2891 0.2099 0.3763 0.0056 0.0328 0.1749 0.6083 0.2473 0.1477 0.0076 0.246 0.3859 0.0164 0.0091 0.021 0.3116 0.3349 0. 0.1536 0.0122 0. 0.0008 0.0929 0.0165 0.0643 0.1294 0.0031 0.0249 0. 0.1182 0.0013 0.1 0.0092 0.4258 0.0062 0.0386 0. 0.1594 0.001 0.2758 0.6492 0. 0.3624 0.5456 0. 0.2947 0.3693 0. 0. 0.1243 0.163 0.0484 0.3167 0.3771 0. 0.1117 0.3181 0. 0. 0.0232 0.0002 0.0126 0. 0. 0.0671 0.2314 0.0257 0.018 0.1252 0.0076 0.1036 0. 0.0004 0. 0. 0.0036] 2022-08-23 17:55:15 [INFO] [EVAL] Class Precision: [0.6963 0.8279 0.9467 0.7376 0.711 0.8005 0.7918 0.7544 0.6306 0.7275 0.5943 0.5734 0.728 0.5101 0.4573 0.4783 0.6765 0.7016 0.7476 0.447 0.7109 0.462 0.7544 0.4852 0.3118 0.4417 0.4609 0.6657 0.7116 0.2318 0.2531 0.54 0.4574 0.2315 0.3276 0.7062 0.4905 0.4343 0.4657 0.4695 0.0861 0.711 0.6975 0.4533 0.3712 0.4012 0.5046 0.5605 0.5514 0.6905 0.539 0.2258 0.0817 0.6565 0.6448 0.5431 0.8752 0.5546 0.8296 0.2613 0.0978 0.3854 0.2298 0.6056 0.7294 0.4329 0.233 0.503 0. 0.5766 0.5316 0.4403 0.4252 0.292 0.4128 0.2784 0.4392 0.5848 0.1856 0.3117 0.7452 0.6882 0.657 0.0342 0.5404 0.571 0.5557 0.6176 0.7781 0.6725 0.442 0. 0.2771 0.2587 0. 0.0348 0.5506 0.1816 0.4277 0.7429 0.0549 0.2187 0. 0.5261 0.06 0.9909 0.6168 0.6376 0.8798 0.0906 0. 0.1805 0.0911 0.2837 0.7714 0. 0.5602 0.6229 0. 0.3526 0.6875 0. 0. 0.508 0.4332 0.6916 0.727 0.5418 0. 0.1568 0.6449 0. 0. 0.5112 0.6042 0.2235 0. 0. 0.5446 0.6074 0.1783 0.2226 0.7005 0.0467 0.2907 0. 1. 0. 0. 
0.7781] 2022-08-23 17:55:15 [INFO] [EVAL] Class Recall: [0.8146 0.8501 0.9543 0.8564 0.8421 0.8482 0.8357 0.8493 0.5708 0.6746 0.5394 0.6743 0.8515 0.3768 0.0741 0.3848 0.5723 0.3861 0.5844 0.5303 0.8981 0.4329 0.5793 0.6332 0.4356 0.8228 0.5673 0.1251 0.2536 0.4735 0.1068 0.6025 0.3411 0.4376 0.5395 0.2026 0.4678 0.6074 0.1383 0.2253 0.0507 0.0287 0.1422 0.2594 0.4328 0.031 0.3689 0.2132 0.56 0.4122 0.3307 0.5918 0.0011 0.1406 0.9204 0.5843 0.867 0.236 0.0692 0.2479 0.2058 0.3658 0.5746 0.1076 0.2633 0.8655 0.3553 0.4051 0. 0.1188 0.0766 0.5078 0.4869 0.1097 0.4912 0.4604 0.7245 0.0056 0.0383 0.285 0.768 0.2785 0.16 0.0096 0.3111 0.5434 0.0166 0.0091 0.0211 0.3674 0.5802 0. 0.2563 0.0126 0. 0.0008 0.1005 0.0178 0.0704 0.1355 0.0033 0.0274 0. 0.1323 0.0013 0.1001 0.0093 0.5617 0.0062 0.0631 0. 0.5765 0.0011 0.9086 0.8038 0. 0.5066 0.8147 0. 0.642 0.4438 0. 0. 0.1414 0.2073 0.0495 0.3595 0.5536 0. 0.2795 0.3857 0. 0. 0.0237 0.0002 0.0131 0. 0. 0.071 0.2721 0.0292 0.0192 0.1323 0.009 0.1387 0. 0.0004 0. 0. 0.0036] 2022-08-23 17:55:15 [INFO] [EVAL] The model with the best validation mIoU (0.2159) was saved at iter 3000. 2022-08-23 17:55:24 [INFO] [TRAIN] epoch: 3, iter: 3050/160000, loss: 1.2773, lr: 0.001188, batch_cost: 0.1736, reader_cost: 0.00313, ips: 46.0830 samples/sec | ETA 07:34:06 2022-08-23 17:55:33 [INFO] [TRAIN] epoch: 3, iter: 3100/160000, loss: 1.2906, lr: 0.001188, batch_cost: 0.1922, reader_cost: 0.00106, ips: 41.6228 samples/sec | ETA 08:22:36 2022-08-23 17:55:42 [INFO] [TRAIN] epoch: 3, iter: 3150/160000, loss: 1.1846, lr: 0.001188, batch_cost: 0.1680, reader_cost: 0.00048, ips: 47.6195 samples/sec | ETA 07:19:10 2022-08-23 17:55:51 [INFO] [TRAIN] epoch: 3, iter: 3200/160000, loss: 1.3629, lr: 0.001187, batch_cost: 0.1724, reader_cost: 0.00033, ips: 46.4078 samples/sec | ETA 07:30:29 2022-08-23 17:55:59 [INFO] [TRAIN] epoch: 3, iter: 3250/160000, loss: 1.2231, lr: 0.001187, batch_cost: 0.1716, reader_cost: 0.00042, ips: 46.6159 samples/sec | ETA 07:28:20 2022-08-23 17:56:09 [INFO] [TRAIN] epoch: 3, iter: 3300/160000, loss: 1.1827, lr: 0.001186, batch_cost: 0.1902, reader_cost: 0.00070, ips: 42.0614 samples/sec | ETA 08:16:44 2022-08-23 17:56:17 [INFO] [TRAIN] epoch: 3, iter: 3350/160000, loss: 1.2163, lr: 0.001186, batch_cost: 0.1776, reader_cost: 0.00093, ips: 45.0491 samples/sec | ETA 07:43:38 2022-08-23 17:56:29 [INFO] [TRAIN] epoch: 3, iter: 3400/160000, loss: 1.2268, lr: 0.001186, batch_cost: 0.2344, reader_cost: 0.00129, ips: 34.1296 samples/sec | ETA 10:11:47 2022-08-23 17:56:39 [INFO] [TRAIN] epoch: 3, iter: 3450/160000, loss: 1.1476, lr: 0.001185, batch_cost: 0.2049, reader_cost: 0.00047, ips: 39.0463 samples/sec | ETA 08:54:34 2022-08-23 17:56:49 [INFO] [TRAIN] epoch: 3, iter: 3500/160000, loss: 1.1361, lr: 0.001185, batch_cost: 0.1934, reader_cost: 0.00166, ips: 41.3718 samples/sec | ETA 08:24:22 2022-08-23 17:57:00 [INFO] [TRAIN] epoch: 3, iter: 3550/160000, loss: 1.2408, lr: 0.001184, batch_cost: 0.2275, reader_cost: 0.00308, ips: 35.1670 samples/sec | ETA 09:53:10 2022-08-23 17:57:12 [INFO] [TRAIN] epoch: 3, iter: 3600/160000, loss: 1.1972, lr: 0.001184, batch_cost: 0.2206, reader_cost: 0.00037, ips: 36.2687 samples/sec | ETA 09:34:58 2022-08-23 17:57:22 [INFO] [TRAIN] epoch: 3, iter: 3650/160000, loss: 1.2068, lr: 0.001184, batch_cost: 0.2164, reader_cost: 0.00076, ips: 36.9615 samples/sec | ETA 09:24:00 2022-08-23 17:57:32 [INFO] [TRAIN] epoch: 3, iter: 3700/160000, loss: 1.2931, lr: 0.001183, batch_cost: 0.2012, reader_cost: 0.00297, ips: 39.7567 
samples/sec | ETA 08:44:11 2022-08-23 17:57:43 [INFO] [TRAIN] epoch: 3, iter: 3750/160000, loss: 1.1897, lr: 0.001183, batch_cost: 0.2076, reader_cost: 0.00217, ips: 38.5402 samples/sec | ETA 09:00:33 2022-08-23 17:57:57 [INFO] [TRAIN] epoch: 4, iter: 3800/160000, loss: 1.2507, lr: 0.001183, batch_cost: 0.2929, reader_cost: 0.10544, ips: 27.3110 samples/sec | ETA 12:42:34 2022-08-23 17:58:08 [INFO] [TRAIN] epoch: 4, iter: 3850/160000, loss: 1.1318, lr: 0.001182, batch_cost: 0.2178, reader_cost: 0.00076, ips: 36.7369 samples/sec | ETA 09:26:43 2022-08-23 17:58:18 [INFO] [TRAIN] epoch: 4, iter: 3900/160000, loss: 1.0938, lr: 0.001182, batch_cost: 0.1983, reader_cost: 0.00043, ips: 40.3391 samples/sec | ETA 08:35:57 2022-08-23 17:58:28 [INFO] [TRAIN] epoch: 4, iter: 3950/160000, loss: 1.0844, lr: 0.001181, batch_cost: 0.1920, reader_cost: 0.00082, ips: 41.6703 samples/sec | ETA 08:19:18 2022-08-23 17:58:39 [INFO] [TRAIN] epoch: 4, iter: 4000/160000, loss: 1.1277, lr: 0.001181, batch_cost: 0.2251, reader_cost: 0.00036, ips: 35.5472 samples/sec | ETA 09:45:08 2022-08-23 17:58:39 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 178s - batch_cost: 0.1775 - reader cost: 8.5630e-04 2022-08-23 18:01:37 [INFO] [EVAL] #Images: 2000 mIoU: 0.2362 Acc: 0.7101 Kappa: 0.6872 Dice: 0.3378 2022-08-23 18:01:37 [INFO] [EVAL] Class IoU: [0.6208 0.732 0.912 0.6668 0.6321 0.6983 0.7046 0.6674 0.4462 0.5411 0.4036 0.447 0.6597 0.2705 0.1041 0.3081 0.4611 0.2482 0.524 0.2982 0.7159 0.4334 0.5146 0.3975 0.2555 0.3681 0.4876 0.2099 0.3286 0.1533 0.1359 0.4447 0.2501 0.2007 0.2482 0.1787 0.3333 0.4236 0.247 0.1378 0.0055 0.0457 0.2347 0.1494 0.239 0.0994 0.2065 0.3489 0.5305 0.3965 0.2909 0.2378 0.1071 0.0047 0.6983 0.279 0.7158 0.2008 0.2536 0.1081 0.0034 0.1418 0.1632 0.0914 0.2883 0.5701 0.1504 0.3513 0.0105 0.2233 0.2927 0.3802 0.2832 0.0858 0.3591 0.2349 0.3512 0.1168 0.2123 0.0124 0.5808 0.2292 0.1125 0. 0.1382 0.414 0.0118 0.0178 0.1412 0.3522 0.3766 0. 0.1281 0.0169 0. 0. 0.0441 0.0109 0.0117 0.1594 0.0004 0.0837 0.0334 0.5668 0. 0.3806 0.1296 0.4046 0.1025 0.1318 0. 0. 0.0181 0.2809 0.618 0. 0.3158 0.4028 0.0001 0.3732 0.13 0. 0.0756 0.1155 0.1641 0.0886 0.2822 0.3074 0.0145 0.1262 0.4517 0. 0. 0.0062 0.0162 0.0346 0.0456 0.0066 0.0953 0.2222 0. 0.0006 0.2381 0. 0.0243 0. 0.0383 0.0014 0.0481 0.0291] 2022-08-23 18:01:37 [INFO] [EVAL] Class Precision: [0.7178 0.8174 0.9607 0.7927 0.7266 0.8162 0.7941 0.7425 0.5759 0.633 0.6277 0.6621 0.7608 0.4341 0.3676 0.5134 0.5952 0.7356 0.7308 0.4721 0.8286 0.5429 0.7267 0.5296 0.3294 0.3889 0.6551 0.6108 0.4246 0.2109 0.2417 0.683 0.5391 0.3275 0.5597 0.7073 0.5196 0.7078 0.4359 0.5953 0.1132 0.3691 0.6104 0.5217 0.3067 0.6095 0.4866 0.5994 0.6393 0.531 0.6889 0.2745 0.3121 0.3587 0.7525 0.3285 0.7392 0.4957 0.4011 0.1565 0.5983 0.2841 0.2767 0.5746 0.4333 0.7952 0.2035 0.4248 0.433 0.4836 0.4441 0.5725 0.6756 0.2743 0.7362 0.3982 0.8713 0.2878 0.3377 0.0469 0.6579 0.7382 0.7942 0. 0.7979 0.6982 0.8495 0.5936 0.7118 0.6671 0.5429 0. 0.1544 0.2629 0. 0. 0.2374 0.4799 0.2107 0.7043 0.2327 0.3758 0.7943 0.5928 0. 0.7324 0.4725 0.8232 0.3644 0.2081 0. 0. 0.1704 0.7374 0.714 0. 0.4793 0.4259 0.003 0.4845 0.5927 0. 0.9985 0.5497 0.6153 0.5964 0.8608 0.5766 0.4283 0.2604 0.6334 0. 0. 0.5171 0.5928 0.2169 0.2438 0.3575 0.3288 0.7273 0. 1. 0.6206 0. 0.3613 0. 
0.9977 0.1291 0.2681 0.7397] 2022-08-23 18:01:37 [INFO] [EVAL] Class Recall: [0.8212 0.875 0.9473 0.8076 0.8293 0.8286 0.8621 0.8684 0.6646 0.7884 0.5306 0.5792 0.8324 0.4179 0.1269 0.4351 0.6717 0.2725 0.6493 0.4474 0.8403 0.6825 0.6381 0.6144 0.5323 0.8733 0.6559 0.2423 0.5924 0.3593 0.237 0.5603 0.318 0.3414 0.3084 0.193 0.4817 0.5134 0.3632 0.152 0.0057 0.0496 0.276 0.1731 0.5198 0.1062 0.2641 0.455 0.757 0.6101 0.3349 0.6405 0.1403 0.0047 0.9064 0.6494 0.9576 0.2524 0.4081 0.2589 0.0034 0.2207 0.2847 0.098 0.4627 0.6682 0.3656 0.67 0.0106 0.2931 0.4619 0.5309 0.3277 0.1111 0.4122 0.3642 0.3705 0.1643 0.3639 0.0165 0.8321 0.2495 0.1159 0. 0.1432 0.5043 0.0118 0.018 0.1497 0.4273 0.5514 0. 0.4294 0.0178 0. 0. 0.0513 0.011 0.0123 0.1708 0.0004 0.0972 0.0337 0.9282 0. 0.4421 0.1515 0.4431 0.1249 0.2645 0. 0. 0.0198 0.3121 0.8214 0. 0.4807 0.8814 0.0001 0.6188 0.1428 0. 0.0757 0.1276 0.1828 0.0942 0.2957 0.3971 0.0148 0.1968 0.6117 0. 0. 0.0062 0.0164 0.0395 0.0531 0.0067 0.1184 0.2423 0. 0.0006 0.2787 0. 0.0254 0. 0.0383 0.0014 0.0554 0.0294] 2022-08-23 18:01:37 [INFO] [EVAL] The model with the best validation mIoU (0.2362) was saved at iter 4000. 2022-08-23 18:01:45 [INFO] [TRAIN] epoch: 4, iter: 4050/160000, loss: 1.2027, lr: 0.001181, batch_cost: 0.1632, reader_cost: 0.00321, ips: 49.0186 samples/sec | ETA 07:04:11 2022-08-23 18:01:53 [INFO] [TRAIN] epoch: 4, iter: 4100/160000, loss: 1.1847, lr: 0.001180, batch_cost: 0.1598, reader_cost: 0.00222, ips: 50.0521 samples/sec | ETA 06:55:18 2022-08-23 18:02:03 [INFO] [TRAIN] epoch: 4, iter: 4150/160000, loss: 1.1628, lr: 0.001180, batch_cost: 0.1832, reader_cost: 0.00064, ips: 43.6629 samples/sec | ETA 07:55:55 2022-08-23 18:02:11 [INFO] [TRAIN] epoch: 4, iter: 4200/160000, loss: 1.2133, lr: 0.001180, batch_cost: 0.1682, reader_cost: 0.00052, ips: 47.5658 samples/sec | ETA 07:16:43 2022-08-23 18:02:21 [INFO] [TRAIN] epoch: 4, iter: 4250/160000, loss: 1.2472, lr: 0.001179, batch_cost: 0.1937, reader_cost: 0.00060, ips: 41.2981 samples/sec | ETA 08:22:50 2022-08-23 18:02:31 [INFO] [TRAIN] epoch: 4, iter: 4300/160000, loss: 1.1847, lr: 0.001179, batch_cost: 0.2158, reader_cost: 0.00062, ips: 37.0657 samples/sec | ETA 09:20:05 2022-08-23 18:02:42 [INFO] [TRAIN] epoch: 4, iter: 4350/160000, loss: 1.0925, lr: 0.001178, batch_cost: 0.2070, reader_cost: 0.00655, ips: 38.6383 samples/sec | ETA 08:57:07 2022-08-23 18:02:52 [INFO] [TRAIN] epoch: 4, iter: 4400/160000, loss: 1.0704, lr: 0.001178, batch_cost: 0.2031, reader_cost: 0.00057, ips: 39.3882 samples/sec | ETA 08:46:43 2022-08-23 18:03:03 [INFO] [TRAIN] epoch: 4, iter: 4450/160000, loss: 1.1121, lr: 0.001178, batch_cost: 0.2183, reader_cost: 0.00065, ips: 36.6538 samples/sec | ETA 09:25:50 2022-08-23 18:03:13 [INFO] [TRAIN] epoch: 4, iter: 4500/160000, loss: 1.2401, lr: 0.001177, batch_cost: 0.2119, reader_cost: 0.00062, ips: 37.7459 samples/sec | ETA 09:09:17 2022-08-23 18:03:25 [INFO] [TRAIN] epoch: 4, iter: 4550/160000, loss: 1.1245, lr: 0.001177, batch_cost: 0.2260, reader_cost: 0.00546, ips: 35.3998 samples/sec | ETA 09:45:30 2022-08-23 18:03:36 [INFO] [TRAIN] epoch: 4, iter: 4600/160000, loss: 1.1031, lr: 0.001177, batch_cost: 0.2338, reader_cost: 0.00122, ips: 34.2124 samples/sec | ETA 10:05:37 2022-08-23 18:03:49 [INFO] [TRAIN] epoch: 4, iter: 4650/160000, loss: 1.2344, lr: 0.001176, batch_cost: 0.2421, reader_cost: 0.00060, ips: 33.0416 samples/sec | ETA 10:26:53 2022-08-23 18:03:59 [INFO] [TRAIN] epoch: 4, iter: 4700/160000, loss: 1.0843, lr: 0.001176, batch_cost: 0.2141, reader_cost: 
0.00084, ips: 37.3683 samples/sec | ETA 09:14:07 2022-08-23 18:04:10 [INFO] [TRAIN] epoch: 4, iter: 4750/160000, loss: 1.1983, lr: 0.001175, batch_cost: 0.2200, reader_cost: 0.00092, ips: 36.3703 samples/sec | ETA 09:29:08 2022-08-23 18:04:22 [INFO] [TRAIN] epoch: 4, iter: 4800/160000, loss: 1.1091, lr: 0.001175, batch_cost: 0.2246, reader_cost: 0.00097, ips: 35.6212 samples/sec | ETA 09:40:55 2022-08-23 18:04:32 [INFO] [TRAIN] epoch: 4, iter: 4850/160000, loss: 1.0415, lr: 0.001175, batch_cost: 0.2155, reader_cost: 0.00074, ips: 37.1306 samples/sec | ETA 09:17:07 2022-08-23 18:04:44 [INFO] [TRAIN] epoch: 4, iter: 4900/160000, loss: 1.1008, lr: 0.001174, batch_cost: 0.2288, reader_cost: 0.00103, ips: 34.9686 samples/sec | ETA 09:51:23 2022-08-23 18:04:54 [INFO] [TRAIN] epoch: 4, iter: 4950/160000, loss: 1.1443, lr: 0.001174, batch_cost: 0.1966, reader_cost: 0.00055, ips: 40.6827 samples/sec | ETA 08:28:09 2022-08-23 18:05:04 [INFO] [TRAIN] epoch: 4, iter: 5000/160000, loss: 1.1429, lr: 0.001174, batch_cost: 0.2170, reader_cost: 0.00040, ips: 36.8680 samples/sec | ETA 09:20:33 2022-08-23 18:05:04 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 164s - batch_cost: 0.1639 - reader cost: 9.4517e-04 2022-08-23 18:07:49 [INFO] [EVAL] #Images: 2000 mIoU: 0.2541 Acc: 0.7162 Kappa: 0.6940 Dice: 0.3627 2022-08-23 18:07:49 [INFO] [EVAL] Class IoU: [0.6289 0.7629 0.915 0.6705 0.6484 0.7176 0.6891 0.6864 0.4403 0.5313 0.4181 0.4265 0.6557 0.2526 0.1359 0.2987 0.4624 0.3746 0.4942 0.2708 0.7026 0.3747 0.5266 0.3875 0.2821 0.4174 0.3441 0.2549 0.2946 0.1847 0.0661 0.441 0.2126 0.1982 0.3114 0.3563 0.3313 0.4378 0.1933 0.1845 0.0115 0.0842 0.2209 0.2092 0.222 0.0902 0.1336 0.2735 0.3892 0.4004 0.281 0.3815 0.0776 0.2622 0.6397 0.4066 0.8055 0.2874 0.2692 0.2178 0.0515 0.349 0.189 0.184 0.3071 0.4734 0.1755 0.341 0.0002 0.2189 0.302 0.3804 0.3413 0.0734 0.2682 0.2269 0.5181 0.044 0.2401 0.2033 0.5895 0.2254 0.1767 0.0007 0.2029 0.421 0.0494 0.0399 0.16 0.3912 0.3695 0.0475 0.0409 0.0693 0.0329 0.0097 0.0022 0. 0.0339 0.1662 0.0002 0.0812 0.0436 0.0006 0.0347 0.3383 0.0845 0.4494 0.0781 0.0639 0.0853 0.0447 0.002 0.5198 0.6966 0. 0.317 0.5306 0.0292 0.3127 0.4569 0. 0.0003 0.1583 0.2104 0.146 0.372 0.4093 0. 0.1525 0.4992 0. 0. 0.0538 0.0251 0.0586 0.0205 0.0001 0.101 0.3031 0.054 0.0654 0.3202 0. 0.1926 0. 0.1674 0.0147 0.0203 0.0349] 2022-08-23 18:07:49 [INFO] [EVAL] Class Precision: [0.7423 0.836 0.9566 0.7468 0.7426 0.8214 0.7483 0.7916 0.5411 0.7992 0.6569 0.6953 0.7319 0.3763 0.4145 0.5257 0.7009 0.643 0.8213 0.5808 0.8068 0.5411 0.7666 0.434 0.3484 0.5592 0.4193 0.5583 0.6945 0.2613 0.2329 0.6387 0.4467 0.2858 0.4887 0.6165 0.5884 0.622 0.5277 0.548 0.4369 0.2482 0.7103 0.5414 0.2664 0.4626 0.155 0.713 0.7112 0.6295 0.3087 0.4828 0.3052 0.5694 0.7127 0.5364 0.8459 0.4894 0.4309 0.3999 0.112 0.6071 0.3252 0.557 0.4654 0.5082 0.2841 0.4272 0.7746 0.7485 0.5609 0.5421 0.6203 0.2035 0.8036 0.3177 0.6656 0.2222 0.3123 0.4809 0.7182 0.4465 0.7913 0.016 0.5992 0.491 0.7296 0.3823 0.9489 0.6206 0.5417 0.2066 0.2659 0.3383 0.0936 0.0288 0.0271 0. 0.1565 0.606 1. 0.1838 0.3794 0.7822 0.2732 0.3829 0.581 0.6651 0.1825 0.4036 0.2154 0.0463 0.0493 0.6294 0.7885 0. 0.4992 0.6231 0.0554 0.3746 0.6872 0. 0.1361 0.571 0.4437 0.6129 0.8295 0.7581 0. 0.2044 0.7758 0. 0. 0.5015 0.5386 0.2776 0.2751 0.0006 0.4425 0.5546 0.0695 0.0797 0.4996 0. 0.5791 0. 
0.9531 0.263 0.8591 0.9115] 2022-08-23 18:07:49 [INFO] [EVAL] Class Recall: [0.8046 0.8972 0.9547 0.8678 0.8365 0.8504 0.897 0.8378 0.7027 0.6131 0.5348 0.5245 0.8631 0.4347 0.1682 0.4089 0.5761 0.4729 0.5537 0.3366 0.8447 0.5494 0.6272 0.7835 0.5972 0.6221 0.6575 0.3192 0.3385 0.3868 0.0844 0.5875 0.2886 0.3927 0.4618 0.4578 0.4313 0.5966 0.2338 0.2176 0.0116 0.113 0.2428 0.2542 0.5712 0.1007 0.4927 0.3074 0.4622 0.5239 0.7575 0.6451 0.0942 0.327 0.862 0.6268 0.944 0.4104 0.4177 0.3237 0.0869 0.4508 0.311 0.2155 0.4746 0.8736 0.3147 0.6284 0.0002 0.2363 0.3956 0.5606 0.4313 0.103 0.287 0.4426 0.7004 0.052 0.5094 0.2605 0.7669 0.3128 0.1853 0.0007 0.2348 0.7469 0.0504 0.0426 0.1614 0.5141 0.5375 0.0581 0.0461 0.0801 0.0484 0.0145 0.0024 0. 0.0415 0.1863 0.0002 0.1269 0.0469 0.0006 0.0382 0.7438 0.0899 0.5808 0.12 0.0706 0.1238 0.5728 0.002 0.749 0.8567 0. 0.4648 0.7813 0.0582 0.654 0.5769 0. 0.0003 0.1797 0.2858 0.1608 0.4028 0.4707 0. 0.3755 0.5833 0. 0. 0.0569 0.0256 0.0691 0.0217 0.0001 0.1157 0.4006 0.1959 0.2662 0.4714 0. 0.2239 0. 0.1688 0.0154 0.0203 0.0351] 2022-08-23 18:07:49 [INFO] [EVAL] The model with the best validation mIoU (0.2541) was saved at iter 5000. 2022-08-23 18:07:57 [INFO] [TRAIN] epoch: 4, iter: 5050/160000, loss: 1.2312, lr: 0.001173, batch_cost: 0.1721, reader_cost: 0.00246, ips: 46.4871 samples/sec | ETA 07:24:25 2022-08-23 18:08:15 [INFO] [TRAIN] epoch: 5, iter: 5100/160000, loss: 1.0610, lr: 0.001173, batch_cost: 0.3413, reader_cost: 0.09897, ips: 23.4390 samples/sec | ETA 14:41:09 2022-08-23 18:08:27 [INFO] [TRAIN] epoch: 5, iter: 5150/160000, loss: 1.1048, lr: 0.001172, batch_cost: 0.2484, reader_cost: 0.00360, ips: 32.2024 samples/sec | ETA 10:41:09 2022-08-23 18:08:39 [INFO] [TRAIN] epoch: 5, iter: 5200/160000, loss: 1.0877, lr: 0.001172, batch_cost: 0.2498, reader_cost: 0.00066, ips: 32.0221 samples/sec | ETA 10:44:33 2022-08-23 18:08:51 [INFO] [TRAIN] epoch: 5, iter: 5250/160000, loss: 1.1129, lr: 0.001172, batch_cost: 0.2328, reader_cost: 0.00068, ips: 34.3692 samples/sec | ETA 10:00:20 2022-08-23 18:09:03 [INFO] [TRAIN] epoch: 5, iter: 5300/160000, loss: 1.0697, lr: 0.001171, batch_cost: 0.2436, reader_cost: 0.00069, ips: 32.8449 samples/sec | ETA 10:28:00 2022-08-23 18:09:15 [INFO] [TRAIN] epoch: 5, iter: 5350/160000, loss: 1.1398, lr: 0.001171, batch_cost: 0.2245, reader_cost: 0.00046, ips: 35.6329 samples/sec | ETA 09:38:40 2022-08-23 18:09:26 [INFO] [TRAIN] epoch: 5, iter: 5400/160000, loss: 1.1621, lr: 0.001170, batch_cost: 0.2248, reader_cost: 0.00068, ips: 35.5819 samples/sec | ETA 09:39:19 2022-08-23 18:09:37 [INFO] [TRAIN] epoch: 5, iter: 5450/160000, loss: 0.9812, lr: 0.001170, batch_cost: 0.2197, reader_cost: 0.00188, ips: 36.4210 samples/sec | ETA 09:25:47 2022-08-23 18:09:49 [INFO] [TRAIN] epoch: 5, iter: 5500/160000, loss: 1.1532, lr: 0.001170, batch_cost: 0.2477, reader_cost: 0.00051, ips: 32.3014 samples/sec | ETA 10:37:44 2022-08-23 18:09:59 [INFO] [TRAIN] epoch: 5, iter: 5550/160000, loss: 1.0950, lr: 0.001169, batch_cost: 0.1998, reader_cost: 0.00391, ips: 40.0474 samples/sec | ETA 08:34:13 2022-08-23 18:10:11 [INFO] [TRAIN] epoch: 5, iter: 5600/160000, loss: 1.0711, lr: 0.001169, batch_cost: 0.2296, reader_cost: 0.00075, ips: 34.8377 samples/sec | ETA 09:50:55 2022-08-23 18:10:22 [INFO] [TRAIN] epoch: 5, iter: 5650/160000, loss: 1.1175, lr: 0.001169, batch_cost: 0.2239, reader_cost: 0.00058, ips: 35.7345 samples/sec | ETA 09:35:54 2022-08-23 18:10:33 [INFO] [TRAIN] epoch: 5, iter: 5700/160000, loss: 1.0805, lr: 0.001168, batch_cost: 
0.2139, reader_cost: 0.00071, ips: 37.3984 samples/sec | ETA 09:10:06 2022-08-23 18:10:43 [INFO] [TRAIN] epoch: 5, iter: 5750/160000, loss: 1.1707, lr: 0.001168, batch_cost: 0.2184, reader_cost: 0.00086, ips: 36.6362 samples/sec | ETA 09:21:22 2022-08-23 18:10:54 [INFO] [TRAIN] epoch: 5, iter: 5800/160000, loss: 1.1202, lr: 0.001167, batch_cost: 0.2165, reader_cost: 0.00045, ips: 36.9595 samples/sec | ETA 09:16:17 2022-08-23 18:11:05 [INFO] [TRAIN] epoch: 5, iter: 5850/160000, loss: 1.0742, lr: 0.001167, batch_cost: 0.2127, reader_cost: 0.00033, ips: 37.6125 samples/sec | ETA 09:06:26 2022-08-23 18:11:15 [INFO] [TRAIN] epoch: 5, iter: 5900/160000, loss: 1.0984, lr: 0.001167, batch_cost: 0.1963, reader_cost: 0.00035, ips: 40.7611 samples/sec | ETA 08:24:04 2022-08-23 18:11:25 [INFO] [TRAIN] epoch: 5, iter: 5950/160000, loss: 1.1563, lr: 0.001166, batch_cost: 0.2083, reader_cost: 0.00049, ips: 38.4142 samples/sec | ETA 08:54:41 2022-08-23 18:11:36 [INFO] [TRAIN] epoch: 5, iter: 6000/160000, loss: 1.0520, lr: 0.001166, batch_cost: 0.2123, reader_cost: 0.00102, ips: 37.6757 samples/sec | ETA 09:05:00 2022-08-23 18:11:36 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 168s - batch_cost: 0.1684 - reader cost: 0.0011 2022-08-23 18:14:24 [INFO] [EVAL] #Images: 2000 mIoU: 0.2697 Acc: 0.7252 Kappa: 0.7037 Dice: 0.3833 2022-08-23 18:14:24 [INFO] [EVAL] Class IoU: [0.6393 0.739 0.9179 0.6808 0.6215 0.7185 0.7284 0.6896 0.4475 0.5887 0.4287 0.5077 0.6599 0.2296 0.1817 0.3147 0.5082 0.3881 0.5474 0.3395 0.7219 0.3881 0.5521 0.4194 0.24 0.2179 0.4373 0.2767 0.3333 0.2037 0.1083 0.3805 0.2477 0.247 0.3439 0.3371 0.3556 0.4422 0.2426 0.2845 0.0908 0.1215 0.2766 0.2227 0.2464 0.1566 0.2443 0.388 0.3387 0.4426 0.3533 0.4236 0.0924 0.1224 0.6491 0.2993 0.7716 0.2443 0.1111 0.1792 0.0667 0.2411 0.2065 0.001 0.3539 0.5721 0.2208 0.3662 0.069 0.1751 0.3204 0.3609 0.3446 0.1574 0.3694 0.272 0.4238 0.0161 0.2831 0.226 0.6134 0.2637 0.2912 0.0257 0.1963 0.4788 0.0496 0.0292 0.0469 0.4271 0.4046 0.0186 0.1496 0.0764 0. 0.0091 0.0804 0.0063 0.0786 0.2158 0.0111 0.0746 0.0689 0.6995 0.1579 0.4223 0.0388 0.4859 0.1661 0.0986 0.0207 0.1506 0.0828 0.4852 0.7757 0. 0.3061 0.2389 0.0093 0.2099 0.4229 0. 0.1723 0.0488 0.2596 0.1257 0.4052 0.3972 0. 0.1093 0.4462 0. 0. 0.0949 0.0471 0.0738 0.1402 0.0007 0.1145 0.2979 0.3168 0.0104 0.1563 0.0001 0.0987 0. 0.2177 0.0195 0.0196 0.0194] 2022-08-23 18:14:24 [INFO] [EVAL] Class Precision: [0.7615 0.7757 0.9446 0.7596 0.7552 0.8643 0.8529 0.7608 0.5838 0.7195 0.6455 0.6422 0.7403 0.4682 0.3735 0.5628 0.6739 0.7292 0.6844 0.5096 0.8077 0.5916 0.778 0.5935 0.4875 0.7214 0.5342 0.4338 0.6734 0.2749 0.2809 0.4788 0.501 0.3759 0.5325 0.5208 0.569 0.675 0.5264 0.4663 0.2014 0.3187 0.6843 0.3818 0.3235 0.4235 0.4587 0.5824 0.6615 0.5551 0.4196 0.7003 0.354 0.7655 0.6946 0.3501 0.8125 0.5538 0.8403 0.2844 0.1167 0.401 0.256 0.3766 0.4517 0.6338 0.4003 0.5384 0.1207 0.6661 0.4795 0.4072 0.5542 0.2815 0.5321 0.4427 0.4714 0.601 0.3988 0.2921 0.7752 0.6854 0.6965 0.1427 0.5175 0.6579 0.4691 0.4695 0.9597 0.5956 0.5919 0.106 0.2293 0.233 0. 0.0369 0.2336 0.425 0.3136 0.548 0.3746 0.3155 0.4819 0.9277 0.5026 0.7153 0.8263 0.7124 0.3243 0.1569 0.3619 0.1675 0.2095 0.5937 0.8517 0. 0.6133 0.2414 0.0966 0.4087 0.673 0. 0.9005 0.7261 0.4427 0.4494 0.7462 0.6951 0. 0.5446 0.5432 0. 0. 0.3954 0.4906 0.3165 0.2218 0.1675 0.3195 0.5904 0.5485 0.0174 0.6199 0.0078 0.583 0. 
0.8671 0.3631 0.8324 0.9623] 2022-08-23 18:14:24 [INFO] [EVAL] Class Recall: [0.7994 0.9399 0.9701 0.8677 0.7783 0.8099 0.833 0.8805 0.6572 0.7641 0.5608 0.708 0.8586 0.3107 0.2613 0.4165 0.674 0.4534 0.7323 0.5043 0.8717 0.5302 0.6553 0.5885 0.321 0.2379 0.7069 0.4331 0.3975 0.44 0.1499 0.6496 0.3288 0.4186 0.4927 0.4887 0.4867 0.5618 0.3103 0.4218 0.142 0.1641 0.3171 0.3484 0.5081 0.199 0.3432 0.5375 0.4098 0.686 0.6909 0.5175 0.1111 0.1272 0.9084 0.6736 0.9387 0.3041 0.1135 0.3263 0.1346 0.3767 0.5169 0.001 0.6205 0.8546 0.3299 0.5338 0.1386 0.1919 0.4911 0.7603 0.4768 0.2632 0.5471 0.4137 0.8076 0.0163 0.4938 0.4997 0.7461 0.3 0.3335 0.0304 0.2403 0.6374 0.0526 0.0302 0.047 0.6015 0.5611 0.0221 0.3009 0.102 0. 0.0119 0.1091 0.0064 0.0949 0.2626 0.0114 0.089 0.0744 0.7398 0.1871 0.5076 0.0392 0.6045 0.2541 0.2096 0.0215 0.5984 0.1204 0.7264 0.8968 0. 0.3793 0.9574 0.0102 0.3014 0.5322 0. 0.1756 0.0498 0.3856 0.1486 0.4699 0.481 0. 0.1203 0.7141 0. 0. 0.111 0.0495 0.0878 0.2758 0.0007 0.1515 0.3756 0.4286 0.0249 0.1729 0.0001 0.1062 0. 0.2252 0.0202 0.0197 0.0194] 2022-08-23 18:14:25 [INFO] [EVAL] The model with the best validation mIoU (0.2697) was saved at iter 6000. 2022-08-23 18:14:36 [INFO] [TRAIN] epoch: 5, iter: 6050/160000, loss: 1.1325, lr: 0.001166, batch_cost: 0.2213, reader_cost: 0.00248, ips: 36.1473 samples/sec | ETA 09:27:51 2022-08-23 18:14:47 [INFO] [TRAIN] epoch: 5, iter: 6100/160000, loss: 1.0696, lr: 0.001165, batch_cost: 0.2164, reader_cost: 0.00098, ips: 36.9750 samples/sec | ETA 09:14:58 2022-08-23 18:14:58 [INFO] [TRAIN] epoch: 5, iter: 6150/160000, loss: 1.0045, lr: 0.001165, batch_cost: 0.2216, reader_cost: 0.00087, ips: 36.0979 samples/sec | ETA 09:28:16 2022-08-23 18:15:08 [INFO] [TRAIN] epoch: 5, iter: 6200/160000, loss: 1.1641, lr: 0.001164, batch_cost: 0.1959, reader_cost: 0.00059, ips: 40.8340 samples/sec | ETA 08:22:11 2022-08-23 18:15:20 [INFO] [TRAIN] epoch: 5, iter: 6250/160000, loss: 0.9701, lr: 0.001164, batch_cost: 0.2463, reader_cost: 0.00272, ips: 32.4812 samples/sec | ETA 10:31:08 2022-08-23 18:15:31 [INFO] [TRAIN] epoch: 5, iter: 6300/160000, loss: 1.0553, lr: 0.001164, batch_cost: 0.2273, reader_cost: 0.00081, ips: 35.1913 samples/sec | ETA 09:42:20 2022-08-23 18:15:48 [INFO] [TRAIN] epoch: 6, iter: 6350/160000, loss: 1.0011, lr: 0.001163, batch_cost: 0.3281, reader_cost: 0.07874, ips: 24.3804 samples/sec | ETA 14:00:17 2022-08-23 18:15:59 [INFO] [TRAIN] epoch: 6, iter: 6400/160000, loss: 1.0938, lr: 0.001163, batch_cost: 0.2262, reader_cost: 0.00072, ips: 35.3699 samples/sec | ETA 09:39:01 2022-08-23 18:16:11 [INFO] [TRAIN] epoch: 6, iter: 6450/160000, loss: 1.0638, lr: 0.001163, batch_cost: 0.2314, reader_cost: 0.00064, ips: 34.5664 samples/sec | ETA 09:52:17 2022-08-23 18:16:21 [INFO] [TRAIN] epoch: 6, iter: 6500/160000, loss: 0.9675, lr: 0.001162, batch_cost: 0.2110, reader_cost: 0.00104, ips: 37.9059 samples/sec | ETA 08:59:56 2022-08-23 18:16:32 [INFO] [TRAIN] epoch: 6, iter: 6550/160000, loss: 1.0544, lr: 0.001162, batch_cost: 0.2158, reader_cost: 0.00433, ips: 37.0704 samples/sec | ETA 09:11:55 2022-08-23 18:16:44 [INFO] [TRAIN] epoch: 6, iter: 6600/160000, loss: 0.9818, lr: 0.001161, batch_cost: 0.2356, reader_cost: 0.00447, ips: 33.9555 samples/sec | ETA 10:02:21 2022-08-23 18:16:55 [INFO] [TRAIN] epoch: 6, iter: 6650/160000, loss: 1.0186, lr: 0.001161, batch_cost: 0.2356, reader_cost: 0.00046, ips: 33.9489 samples/sec | ETA 10:02:16 2022-08-23 18:17:08 [INFO] [TRAIN] epoch: 6, iter: 6700/160000, loss: 1.0427, lr: 0.001161, 
batch_cost: 0.2562, reader_cost: 0.00042, ips: 31.2259 samples/sec | ETA 10:54:35 2022-08-23 18:17:20 [INFO] [TRAIN] epoch: 6, iter: 6750/160000, loss: 0.9724, lr: 0.001160, batch_cost: 0.2321, reader_cost: 0.00033, ips: 34.4625 samples/sec | ETA 09:52:54 2022-08-23 18:17:31 [INFO] [TRAIN] epoch: 6, iter: 6800/160000, loss: 1.1210, lr: 0.001160, batch_cost: 0.2194, reader_cost: 0.00318, ips: 36.4646 samples/sec | ETA 09:20:10 2022-08-23 18:17:42 [INFO] [TRAIN] epoch: 6, iter: 6850/160000, loss: 1.0816, lr: 0.001160, batch_cost: 0.2331, reader_cost: 0.00049, ips: 34.3230 samples/sec | ETA 09:54:56 2022-08-23 18:17:52 [INFO] [TRAIN] epoch: 6, iter: 6900/160000, loss: 1.0277, lr: 0.001159, batch_cost: 0.1844, reader_cost: 0.00032, ips: 43.3912 samples/sec | ETA 07:50:26 2022-08-23 18:18:01 [INFO] [TRAIN] epoch: 6, iter: 6950/160000, loss: 1.0258, lr: 0.001159, batch_cost: 0.1835, reader_cost: 0.00039, ips: 43.5926 samples/sec | ETA 07:48:07 2022-08-23 18:18:11 [INFO] [TRAIN] epoch: 6, iter: 7000/160000, loss: 1.0766, lr: 0.001158, batch_cost: 0.1942, reader_cost: 0.00043, ips: 41.1958 samples/sec | ETA 08:15:11 2022-08-23 18:18:11 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1586 - reader cost: 6.3801e-04 2022-08-23 18:20:50 [INFO] [EVAL] #Images: 2000 mIoU: 0.2707 Acc: 0.7232 Kappa: 0.7023 Dice: 0.3853 2022-08-23 18:20:50 [INFO] [EVAL] Class IoU: [0.6386 0.7344 0.913 0.6819 0.6454 0.7276 0.7261 0.7252 0.4595 0.6049 0.413 0.4795 0.6451 0.2495 0.1272 0.3073 0.4746 0.3841 0.5237 0.3128 0.704 0.2985 0.5563 0.4379 0.2766 0.4097 0.4422 0.2572 0.3441 0.1577 0.1749 0.4025 0.185 0.2362 0.3189 0.3626 0.3481 0.4381 0.2763 0.2727 0.0205 0.0593 0.27 0.2057 0.25 0.1719 0.2552 0.378 0.4965 0.4154 0.3785 0.2587 0.0164 0.2549 0.5864 0.3542 0.7787 0.2709 0.3018 0.2427 0.0766 0.1485 0.2059 0.1174 0.3285 0.5526 0.1523 0.3914 0.0216 0.2469 0.3127 0.401 0.3764 0.1347 0.3903 0.2484 0.472 0.0231 0.2492 0.0671 0.6064 0.2925 0.1363 0. 0.2722 0.4386 0.0439 0.0492 0.2214 0.4233 0.37 0.0331 0.1718 0.0485 0.147 0.0027 0.0585 0. 0.1343 0.1682 0.0036 0.0877 0.0842 0.5574 0.0767 0.305 0.1478 0.3403 0.0729 0.1375 0.0233 0.0088 0.0509 0.5541 0.7131 0.0057 0.2929 0.5906 0. 0.3306 0.3285 0.0001 0.1958 0.1235 0.1922 0.1206 0.3632 0.3836 0. 0.1754 0.4476 0. 0.013 0.1212 0.0122 0.0963 0.068 0.0079 0.0965 0.3221 0.0922 0. 0.3073 0.0003 0.2304 0. 0.2287 0.0203 0.0571 0.0506] 2022-08-23 18:20:50 [INFO] [EVAL] Class Precision: [0.7741 0.8042 0.9704 0.7714 0.7067 0.8207 0.832 0.8407 0.5886 0.7878 0.6191 0.6254 0.7003 0.4534 0.4501 0.5858 0.5994 0.6905 0.6773 0.5997 0.7722 0.5444 0.7916 0.5994 0.3966 0.4981 0.5195 0.7585 0.6495 0.263 0.3188 0.4739 0.4284 0.2742 0.4896 0.4405 0.5655 0.6996 0.5438 0.5468 0.1595 0.3541 0.5642 0.4905 0.3798 0.5357 0.3809 0.6171 0.5045 0.5351 0.438 0.2937 0.1267 0.4355 0.617 0.4335 0.8213 0.6191 0.3961 0.6184 0.0919 0.3046 0.4388 0.5757 0.369 0.6095 0.4076 0.5321 0.1377 0.6695 0.443 0.5397 0.5103 0.2318 0.5942 0.4973 0.5536 0.2812 0.6057 0.1931 0.7414 0.559 0.8111 0.0003 0.5505 0.5838 0.246 0.3809 0.9654 0.7064 0.5272 0.0407 0.4623 0.3292 0.3307 0.1754 0.1321 0.9 0.1631 0.7147 0.6764 0.1937 0.47 0.9246 0.8518 0.3521 0.4984 0.4308 0.3349 0.2678 0.1905 0.0088 0.2258 0.6012 0.7473 0.4657 0.555 0.7141 0. 0.4002 0.6422 0.0186 0.423 0.6364 0.6379 0.503 0.8439 0.5122 0. 0.6716 0.5389 0. 0.2546 0.4551 0.8854 0.208 0.3194 0.1191 0.347 0.5136 0.5172 0. 0.5723 0.24 0.4441 0. 
0.8719 0.2265 0.5251 0.9325] 2022-08-23 18:20:50 [INFO] [EVAL] Class Recall: [0.7849 0.8943 0.9392 0.8547 0.8816 0.8651 0.8509 0.8407 0.677 0.7226 0.5537 0.6727 0.8911 0.3568 0.1506 0.3927 0.6951 0.464 0.6978 0.3954 0.8885 0.398 0.6517 0.6191 0.4775 0.6976 0.7484 0.2801 0.4225 0.2827 0.2793 0.7276 0.2455 0.6303 0.4777 0.6724 0.4753 0.5396 0.3596 0.3524 0.023 0.0666 0.3412 0.2615 0.4224 0.202 0.4361 0.4939 0.9691 0.65 0.7357 0.6842 0.0185 0.3808 0.9221 0.6594 0.9376 0.3251 0.559 0.2854 0.3163 0.2246 0.2796 0.1285 0.7496 0.8555 0.1957 0.5968 0.0249 0.2812 0.5152 0.6094 0.5894 0.2432 0.5321 0.3317 0.762 0.0245 0.2975 0.0933 0.7691 0.3802 0.1408 0. 0.35 0.6381 0.0508 0.0535 0.2232 0.5137 0.5538 0.1511 0.2148 0.0538 0.2092 0.0027 0.0951 0. 0.4325 0.1804 0.0036 0.1382 0.093 0.5839 0.0777 0.6953 0.1737 0.6182 0.0853 0.2202 0.0259 0.8937 0.0616 0.8762 0.9396 0.0057 0.3827 0.7735 0. 0.6554 0.4021 0.0001 0.2673 0.1329 0.2157 0.1369 0.3894 0.6044 0. 0.1919 0.7255 0. 0.0135 0.1418 0.0122 0.1521 0.0795 0.0084 0.1179 0.4635 0.1009 0. 0.399 0.0003 0.3237 0. 0.2367 0.0218 0.0602 0.0507] 2022-08-23 18:20:50 [INFO] [EVAL] The model with the best validation mIoU (0.2707) was saved at iter 7000. 2022-08-23 18:21:01 [INFO] [TRAIN] epoch: 6, iter: 7050/160000, loss: 1.0142, lr: 0.001158, batch_cost: 0.2225, reader_cost: 0.00224, ips: 35.9586 samples/sec | ETA 09:27:07 2022-08-23 18:21:12 [INFO] [TRAIN] epoch: 6, iter: 7100/160000, loss: 1.0205, lr: 0.001158, batch_cost: 0.2269, reader_cost: 0.00448, ips: 35.2626 samples/sec | ETA 09:38:08 2022-08-23 18:21:24 [INFO] [TRAIN] epoch: 6, iter: 7150/160000, loss: 1.0706, lr: 0.001157, batch_cost: 0.2301, reader_cost: 0.00070, ips: 34.7654 samples/sec | ETA 09:46:12 2022-08-23 18:21:35 [INFO] [TRAIN] epoch: 6, iter: 7200/160000, loss: 0.9505, lr: 0.001157, batch_cost: 0.2235, reader_cost: 0.00068, ips: 35.8000 samples/sec | ETA 09:29:05 2022-08-23 18:21:47 [INFO] [TRAIN] epoch: 6, iter: 7250/160000, loss: 1.0494, lr: 0.001156, batch_cost: 0.2318, reader_cost: 0.00060, ips: 34.5063 samples/sec | ETA 09:50:13 2022-08-23 18:21:59 [INFO] [TRAIN] epoch: 6, iter: 7300/160000, loss: 1.1649, lr: 0.001156, batch_cost: 0.2569, reader_cost: 0.00059, ips: 31.1414 samples/sec | ETA 10:53:47 2022-08-23 18:22:11 [INFO] [TRAIN] epoch: 6, iter: 7350/160000, loss: 1.0963, lr: 0.001156, batch_cost: 0.2408, reader_cost: 0.00792, ips: 33.2294 samples/sec | ETA 10:12:30 2022-08-23 18:22:23 [INFO] [TRAIN] epoch: 6, iter: 7400/160000, loss: 1.0393, lr: 0.001155, batch_cost: 0.2307, reader_cost: 0.00288, ips: 34.6833 samples/sec | ETA 09:46:38 2022-08-23 18:22:35 [INFO] [TRAIN] epoch: 6, iter: 7450/160000, loss: 1.0068, lr: 0.001155, batch_cost: 0.2491, reader_cost: 0.00062, ips: 32.1214 samples/sec | ETA 10:33:13 2022-08-23 18:22:47 [INFO] [TRAIN] epoch: 6, iter: 7500/160000, loss: 1.0108, lr: 0.001155, batch_cost: 0.2336, reader_cost: 0.00455, ips: 34.2460 samples/sec | ETA 09:53:44 2022-08-23 18:22:59 [INFO] [TRAIN] epoch: 6, iter: 7550/160000, loss: 1.0661, lr: 0.001154, batch_cost: 0.2280, reader_cost: 0.00054, ips: 35.0821 samples/sec | ETA 09:39:24 2022-08-23 18:23:16 [INFO] [TRAIN] epoch: 7, iter: 7600/160000, loss: 1.0560, lr: 0.001154, batch_cost: 0.3440, reader_cost: 0.09445, ips: 23.2542 samples/sec | ETA 14:33:49 2022-08-23 18:23:28 [INFO] [TRAIN] epoch: 7, iter: 7650/160000, loss: 0.9462, lr: 0.001153, batch_cost: 0.2366, reader_cost: 0.00050, ips: 33.8114 samples/sec | ETA 10:00:47 2022-08-23 18:23:39 [INFO] [TRAIN] epoch: 7, iter: 7700/160000, loss: 0.9261, lr: 0.001153, 
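Each evaluation block reports a summary line (mIoU, Acc, Kappa, Dice) followed by per-class IoU, Precision and Recall over the ADE20K classes. A minimal sketch of how such numbers are conventionally derived from per-class pixel counts; this follows the standard definitions with made-up counts for illustration, not the exact PaddleSeg implementation (Kappa needs the full confusion matrix and is omitted here).

import numpy as np

# Hypothetical per-class pixel counts for a 4-class example (illustrative only).
intersect  = np.array([900.0, 400.0,  50.0, 10.0])   # correctly classified pixels per class
pred_area  = np.array([1200.0, 500.0, 300.0, 40.0])  # pixels predicted as each class
label_area = np.array([1000.0, 600.0, 120.0, 60.0])  # ground-truth pixels per class

union           = pred_area + label_area - intersect
class_iou       = intersect / union                  # "Class IoU" row
class_precision = intersect / pred_area              # "Class Precision" row
class_recall    = intersect / label_area             # "Class Recall" row
class_dice      = 2 * intersect / (pred_area + label_area)

miou  = class_iou.mean()                             # "mIoU" summary
mdice = class_dice.mean()                            # "Dice" summary
acc   = intersect.sum() / label_area.sum()           # overall pixel accuracy ("Acc")
print(f"mIoU: {miou:.4f} Acc: {acc:.4f} Dice: {mdice:.4f}")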
batch_cost: 0.2334, reader_cost: 0.00615, ips: 34.2774 samples/sec | ETA 09:52:25 2022-08-23 18:23:51 [INFO] [TRAIN] epoch: 7, iter: 7750/160000, loss: 0.9893, lr: 0.001153, batch_cost: 0.2306, reader_cost: 0.00099, ips: 34.6912 samples/sec | ETA 09:45:09 2022-08-23 18:24:01 [INFO] [TRAIN] epoch: 7, iter: 7800/160000, loss: 0.9426, lr: 0.001152, batch_cost: 0.2106, reader_cost: 0.00086, ips: 37.9823 samples/sec | ETA 08:54:17 2022-08-23 18:24:14 [INFO] [TRAIN] epoch: 7, iter: 7850/160000, loss: 0.9974, lr: 0.001152, batch_cost: 0.2449, reader_cost: 0.00433, ips: 32.6686 samples/sec | ETA 10:20:58 2022-08-23 18:24:25 [INFO] [TRAIN] epoch: 7, iter: 7900/160000, loss: 0.9944, lr: 0.001152, batch_cost: 0.2298, reader_cost: 0.00082, ips: 34.8125 samples/sec | ETA 09:42:32 2022-08-23 18:24:35 [INFO] [TRAIN] epoch: 7, iter: 7950/160000, loss: 0.9510, lr: 0.001151, batch_cost: 0.1973, reader_cost: 0.00043, ips: 40.5378 samples/sec | ETA 08:20:06 2022-08-23 18:24:45 [INFO] [TRAIN] epoch: 7, iter: 8000/160000, loss: 0.9710, lr: 0.001151, batch_cost: 0.2057, reader_cost: 0.00060, ips: 38.8936 samples/sec | ETA 08:41:04 2022-08-23 18:24:45 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 157s - batch_cost: 0.1565 - reader cost: 7.1723e-04 2022-08-23 18:27:22 [INFO] [EVAL] #Images: 2000 mIoU: 0.2847 Acc: 0.7296 Kappa: 0.7087 Dice: 0.4030 2022-08-23 18:27:22 [INFO] [EVAL] Class IoU: [0.6363 0.7476 0.922 0.698 0.6517 0.718 0.7099 0.7245 0.4696 0.538 0.4316 0.4855 0.6743 0.284 0.22 0.3452 0.4734 0.3731 0.5329 0.3589 0.7187 0.4562 0.5532 0.4225 0.2552 0.2304 0.4673 0.3076 0.3494 0.1909 0.1389 0.4403 0.2301 0.2289 0.2814 0.3732 0.3457 0.4115 0.2543 0.2891 0.0561 0.0445 0.2663 0.1997 0.227 0.234 0.1878 0.3887 0.3717 0.4142 0.3974 0.3074 0.0759 0.169 0.6527 0.4207 0.7691 0.3403 0.3412 0.218 0.0602 0.2306 0.1991 0.0789 0.4072 0.6173 0.1799 0.3517 0.0265 0.2933 0.335 0.353 0.4026 0.1893 0.3784 0.3208 0.4441 0.0322 0.19 0.231 0.612 0.313 0.2636 0.0037 0.237 0.4723 0.0421 0.0521 0.1394 0.4202 0.343 0.0206 0.0928 0.0238 0.0776 0.0001 0.131 0.046 0.1851 0.1964 0. 0.0981 0.1118 0.5259 0.1715 0.4291 0.0555 0.4109 0.0142 0.1997 0.0125 0.0494 0.0765 0.4564 0.8237 0. 0.4811 0.4857 0.0231 0.3198 0.5005 0.0011 0.2727 0.1986 0.3203 0.1853 0.4195 0.4337 0.0142 0.0708 0.6115 0. 0.138 0.1094 0.1569 0.0929 0.0731 0. 0.1494 0.274 0.0559 0.0146 0.2235 0.066 0.097 0. 0.3027 0.0181 0.0665 0.079 ] 2022-08-23 18:27:22 [INFO] [EVAL] Class Precision: [0.7471 0.8183 0.9605 0.8082 0.7201 0.8549 0.7871 0.8429 0.6165 0.8176 0.567 0.68 0.769 0.5942 0.4107 0.5354 0.6484 0.6903 0.6393 0.4441 0.8079 0.5625 0.6921 0.5705 0.4937 0.4904 0.6047 0.5669 0.6509 0.2353 0.3145 0.6181 0.4835 0.324 0.3651 0.542 0.6102 0.8274 0.4461 0.4429 0.1919 0.4325 0.5695 0.5225 0.4371 0.7262 0.3172 0.5257 0.6654 0.4985 0.5466 0.397 0.2229 0.6369 0.7145 0.5508 0.8012 0.417 0.5702 0.4385 0.336 0.3116 0.5466 0.61 0.5344 0.7518 0.4445 0.4228 0.1993 0.5389 0.4584 0.4931 0.5801 0.2241 0.7206 0.5024 0.838 0.5561 0.4587 0.3383 0.7768 0.5599 0.7287 0.0151 0.559 0.5758 0.4333 0.255 0.9888 0.7053 0.4099 0.0529 0.2843 0.2712 0.5704 0.0165 0.3702 0.4051 0.3578 0.6643 0. 0.2275 0.3743 0.7064 0.6711 0.6584 0.753 0.5455 0.0702 0.3151 0.2772 0.0662 0.3934 0.7268 0.8617 0. 0.6517 0.5526 0.0386 0.5698 0.7642 0.1265 0.6991 0.5639 0.618 0.6321 0.7439 0.6676 0.281 0.7461 0.7722 0. 0.5349 0.4591 0.4174 0.2142 0.3961 0.0007 0.4458 0.6047 0.8774 0.0302 0.5613 0.2125 0.5788 0. 
0.7801 0.3484 0.5089 0.959 ] 2022-08-23 18:27:22 [INFO] [EVAL] Class Recall: [0.811 0.8963 0.9583 0.8366 0.8728 0.8176 0.8786 0.8376 0.6634 0.6114 0.6437 0.6293 0.8455 0.3524 0.3215 0.4929 0.637 0.4482 0.7619 0.6519 0.8669 0.7071 0.7338 0.6195 0.3456 0.303 0.6728 0.4021 0.43 0.5025 0.1992 0.6049 0.305 0.4383 0.551 0.5451 0.4437 0.4501 0.3716 0.4544 0.0735 0.0472 0.3335 0.2442 0.3208 0.2566 0.3152 0.5985 0.4572 0.71 0.5929 0.5764 0.1033 0.187 0.8829 0.6405 0.9505 0.6492 0.4594 0.3024 0.0684 0.4703 0.2385 0.0831 0.6311 0.7752 0.2322 0.6765 0.0297 0.3916 0.5545 0.5539 0.5682 0.5489 0.4434 0.4702 0.4858 0.033 0.2449 0.4212 0.7425 0.4152 0.2923 0.0049 0.2915 0.7243 0.0445 0.0614 0.1396 0.5097 0.6775 0.0327 0.121 0.0254 0.0824 0.0001 0.1686 0.0493 0.2771 0.218 0. 0.147 0.1375 0.673 0.1872 0.552 0.0565 0.6247 0.0175 0.353 0.0129 0.1636 0.0867 0.5509 0.9491 0. 0.6475 0.8005 0.0542 0.4215 0.5918 0.0011 0.309 0.2346 0.3994 0.2078 0.4903 0.5532 0.0147 0.0725 0.7461 0. 0.1569 0.1256 0.2009 0.141 0.0822 0. 0.1835 0.3338 0.0564 0.0275 0.2708 0.0874 0.1044 0. 0.3309 0.0187 0.0711 0.0793] 2022-08-23 18:27:23 [INFO] [EVAL] The model with the best validation mIoU (0.2847) was saved at iter 8000. 2022-08-23 18:27:33 [INFO] [TRAIN] epoch: 7, iter: 8050/160000, loss: 1.1475, lr: 0.001150, batch_cost: 0.1964, reader_cost: 0.00290, ips: 40.7286 samples/sec | ETA 08:17:26 2022-08-23 18:27:44 [INFO] [TRAIN] epoch: 7, iter: 8100/160000, loss: 1.0474, lr: 0.001150, batch_cost: 0.2258, reader_cost: 0.00166, ips: 35.4301 samples/sec | ETA 09:31:38 2022-08-23 18:27:55 [INFO] [TRAIN] epoch: 7, iter: 8150/160000, loss: 0.9820, lr: 0.001150, batch_cost: 0.2169, reader_cost: 0.00073, ips: 36.8838 samples/sec | ETA 09:08:55 2022-08-23 18:28:05 [INFO] [TRAIN] epoch: 7, iter: 8200/160000, loss: 0.9503, lr: 0.001149, batch_cost: 0.2012, reader_cost: 0.00369, ips: 39.7629 samples/sec | ETA 08:29:01 2022-08-23 18:28:17 [INFO] [TRAIN] epoch: 7, iter: 8250/160000, loss: 1.0338, lr: 0.001149, batch_cost: 0.2478, reader_cost: 0.00093, ips: 32.2883 samples/sec | ETA 10:26:38 2022-08-23 18:28:28 [INFO] [TRAIN] epoch: 7, iter: 8300/160000, loss: 1.0307, lr: 0.001149, batch_cost: 0.2268, reader_cost: 0.00036, ips: 35.2677 samples/sec | ETA 09:33:31 2022-08-23 18:28:40 [INFO] [TRAIN] epoch: 7, iter: 8350/160000, loss: 1.0016, lr: 0.001148, batch_cost: 0.2259, reader_cost: 0.00049, ips: 35.4145 samples/sec | ETA 09:30:57 2022-08-23 18:28:51 [INFO] [TRAIN] epoch: 7, iter: 8400/160000, loss: 0.9987, lr: 0.001148, batch_cost: 0.2282, reader_cost: 0.00051, ips: 35.0532 samples/sec | ETA 09:36:38 2022-08-23 18:29:02 [INFO] [TRAIN] epoch: 7, iter: 8450/160000, loss: 1.0437, lr: 0.001147, batch_cost: 0.2149, reader_cost: 0.00120, ips: 37.2307 samples/sec | ETA 09:02:44 2022-08-23 18:29:13 [INFO] [TRAIN] epoch: 7, iter: 8500/160000, loss: 1.0266, lr: 0.001147, batch_cost: 0.2181, reader_cost: 0.01342, ips: 36.6771 samples/sec | ETA 09:10:45 2022-08-23 18:29:24 [INFO] [TRAIN] epoch: 7, iter: 8550/160000, loss: 0.9800, lr: 0.001147, batch_cost: 0.2259, reader_cost: 0.00203, ips: 35.4101 samples/sec | ETA 09:30:16 2022-08-23 18:29:34 [INFO] [TRAIN] epoch: 7, iter: 8600/160000, loss: 0.9930, lr: 0.001146, batch_cost: 0.1952, reader_cost: 0.00829, ips: 40.9828 samples/sec | ETA 08:12:33 2022-08-23 18:29:45 [INFO] [TRAIN] epoch: 7, iter: 8650/160000, loss: 1.0293, lr: 0.001146, batch_cost: 0.2210, reader_cost: 0.00043, ips: 36.1953 samples/sec | ETA 09:17:31 2022-08-23 18:29:57 [INFO] [TRAIN] epoch: 7, iter: 8700/160000, loss: 1.0699, lr: 0.001145, 
batch_cost: 0.2448, reader_cost: 0.00079, ips: 32.6749 samples/sec | ETA 10:17:23 2022-08-23 18:30:10 [INFO] [TRAIN] epoch: 7, iter: 8750/160000, loss: 1.0178, lr: 0.001145, batch_cost: 0.2572, reader_cost: 0.00042, ips: 31.0987 samples/sec | ETA 10:48:28 2022-08-23 18:30:21 [INFO] [TRAIN] epoch: 7, iter: 8800/160000, loss: 0.9486, lr: 0.001145, batch_cost: 0.2280, reader_cost: 0.00082, ips: 35.0828 samples/sec | ETA 09:34:38 2022-08-23 18:30:40 [INFO] [TRAIN] epoch: 8, iter: 8850/160000, loss: 1.0021, lr: 0.001144, batch_cost: 0.3778, reader_cost: 0.16062, ips: 21.1731 samples/sec | ETA 15:51:50 2022-08-23 18:30:49 [INFO] [TRAIN] epoch: 8, iter: 8900/160000, loss: 0.9641, lr: 0.001144, batch_cost: 0.1774, reader_cost: 0.00061, ips: 45.0879 samples/sec | ETA 07:26:49 2022-08-23 18:30:58 [INFO] [TRAIN] epoch: 8, iter: 8950/160000, loss: 0.9722, lr: 0.001144, batch_cost: 0.1690, reader_cost: 0.00056, ips: 47.3331 samples/sec | ETA 07:05:29 2022-08-23 18:31:07 [INFO] [TRAIN] epoch: 8, iter: 9000/160000, loss: 0.9657, lr: 0.001143, batch_cost: 0.1779, reader_cost: 0.00071, ips: 44.9699 samples/sec | ETA 07:27:42 2022-08-23 18:31:07 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 140s - batch_cost: 0.1404 - reader cost: 5.5693e-04 2022-08-23 18:33:27 [INFO] [EVAL] #Images: 2000 mIoU: 0.2924 Acc: 0.7388 Kappa: 0.7183 Dice: 0.4137 2022-08-23 18:33:27 [INFO] [EVAL] Class IoU: [0.6417 0.7511 0.9237 0.6978 0.659 0.7369 0.7335 0.7204 0.4744 0.6515 0.4278 0.4859 0.6717 0.2624 0.1711 0.3651 0.4909 0.4392 0.5564 0.3489 0.7036 0.4533 0.5652 0.4515 0.2819 0.3571 0.5049 0.3033 0.3582 0.1757 0.1383 0.4532 0.241 0.2611 0.3316 0.4256 0.3711 0.4639 0.2386 0.2893 0.0302 0.1371 0.2757 0.208 0.2757 0.211 0.2413 0.4004 0.3713 0.4966 0.4113 0.3005 0.0415 0.1591 0.675 0.3587 0.8173 0.1479 0.3287 0.2517 0.1114 0.2747 0.2855 0.1031 0.3804 0.6599 0.2119 0.3641 0.0176 0.3132 0.3757 0.4054 0.4091 0.1588 0.4063 0.2878 0.5711 0.1257 0.2473 0.3258 0.5862 0.3078 0.2777 0.028 0.2975 0.4816 0.0585 0.0545 0.2203 0.4518 0.4458 0.1032 0.1747 0.0531 0.1075 0.0182 0.1178 0.1666 0.238 0.1102 0. 0.0784 0.0593 0.4394 0.1732 0.3074 0.1624 0.3704 0.0116 0.2574 0.0621 0.1623 0.0422 0.5032 0.6126 0.002 0.2932 0.4772 0.0625 0.1068 0.4695 0. 0.1972 0.0492 0.2092 0.1068 0.4534 0.4454 0. 0.2424 0.5293 0. 0.1032 0.1032 0.0244 0.1298 0.0922 0.003 0.1047 0.2822 0.032 0.0397 0.2181 0.0948 0.1944 0.0001 0.3303 0.0386 0.0363 0.1021] 2022-08-23 18:33:27 [INFO] [EVAL] Class Precision: [0.7369 0.8187 0.9652 0.8229 0.7921 0.8367 0.834 0.7801 0.6054 0.7418 0.7181 0.6087 0.7549 0.4658 0.5005 0.4532 0.6064 0.6477 0.7094 0.5411 0.775 0.6274 0.7072 0.5762 0.459 0.6302 0.5977 0.6999 0.6475 0.3817 0.3511 0.568 0.5262 0.5734 0.4017 0.5888 0.5614 0.6183 0.5647 0.5418 0.2803 0.2281 0.4282 0.4734 0.4341 0.3423 0.3797 0.7641 0.7161 0.7048 0.6053 0.3878 0.2122 0.6535 0.8011 0.4433 0.8549 0.6673 0.492 0.4414 0.1887 0.4366 0.5922 0.7706 0.5387 0.7894 0.3734 0.4723 0.3345 0.6181 0.5153 0.6507 0.6299 0.281 0.7718 0.5156 0.747 0.4914 0.7413 0.467 0.7426 0.601 0.7054 0.2451 0.3471 0.6817 0.4119 0.3607 0.8347 0.7283 0.6651 0.215 0.2529 0.2909 0.4259 0.1279 0.3311 0.4143 0.3159 0.7943 0. 0.1205 0.4813 0.8957 0.7508 0.3286 0.5362 0.7236 0.0742 0.319 0.2951 0.1983 0.4221 0.7115 0.6867 0.1159 0.4996 0.5278 0.2273 0.7195 0.7155 0. 0.9661 0.7093 0.6444 0.6509 0.7531 0.643 0. 0.7797 0.7153 0. 
0.7069 0.633 0.8364 0.3077 0.4553 0.0492 0.3476 0.5809 0.7069 0.0541 0.6795 0.3534 0.5183 0.12 0.7626 0.2116 0.754 0.9095] 2022-08-23 18:33:27 [INFO] [EVAL] Class Recall: [0.8325 0.9009 0.9555 0.8212 0.7968 0.8606 0.8589 0.9039 0.6868 0.8425 0.5142 0.7066 0.8591 0.3753 0.2064 0.6524 0.7204 0.5771 0.7207 0.4955 0.8842 0.6202 0.7379 0.6762 0.4221 0.4518 0.7649 0.3487 0.445 0.2455 0.1857 0.6916 0.3078 0.324 0.6551 0.6057 0.5226 0.65 0.2924 0.383 0.0328 0.2557 0.4364 0.2707 0.4303 0.3548 0.3984 0.4568 0.4354 0.627 0.562 0.5715 0.049 0.1737 0.8109 0.6527 0.9489 0.1597 0.4976 0.3693 0.2138 0.4255 0.3553 0.1063 0.5641 0.8008 0.3288 0.6139 0.0183 0.3884 0.5811 0.5181 0.5385 0.2676 0.4617 0.3944 0.708 0.1445 0.2707 0.5188 0.7356 0.3869 0.3141 0.0306 0.6759 0.6214 0.0639 0.0603 0.2303 0.5434 0.5749 0.1657 0.361 0.061 0.1257 0.0208 0.1545 0.2179 0.4914 0.1135 0. 0.1835 0.0633 0.4631 0.1838 0.8266 0.1889 0.4315 0.0136 0.5712 0.0729 0.4721 0.0448 0.6322 0.8503 0.002 0.4151 0.8327 0.0793 0.1115 0.5772 0. 0.1986 0.0502 0.2365 0.1133 0.5326 0.5918 0. 0.2603 0.6705 0. 0.1078 0.1098 0.0245 0.1832 0.1036 0.0032 0.1303 0.3543 0.0325 0.1297 0.2432 0.1147 0.2372 0.0001 0.3682 0.0451 0.0368 0.1032] 2022-08-23 18:33:28 [INFO] [EVAL] The model with the best validation mIoU (0.2924) was saved at iter 9000. 2022-08-23 18:33:36 [INFO] [TRAIN] epoch: 8, iter: 9050/160000, loss: 0.9452, lr: 0.001143, batch_cost: 0.1622, reader_cost: 0.00378, ips: 49.3133 samples/sec | ETA 06:48:08 2022-08-23 18:33:44 [INFO] [TRAIN] epoch: 8, iter: 9100/160000, loss: 0.9631, lr: 0.001142, batch_cost: 0.1754, reader_cost: 0.00103, ips: 45.6195 samples/sec | ETA 07:21:02 2022-08-23 18:33:55 [INFO] [TRAIN] epoch: 8, iter: 9150/160000, loss: 1.0039, lr: 0.001142, batch_cost: 0.2025, reader_cost: 0.00042, ips: 39.5149 samples/sec | ETA 08:29:00 2022-08-23 18:34:05 [INFO] [TRAIN] epoch: 8, iter: 9200/160000, loss: 0.9636, lr: 0.001142, batch_cost: 0.2038, reader_cost: 0.00039, ips: 39.2521 samples/sec | ETA 08:32:14 2022-08-23 18:34:14 [INFO] [TRAIN] epoch: 8, iter: 9250/160000, loss: 0.9109, lr: 0.001141, batch_cost: 0.1898, reader_cost: 0.00863, ips: 42.1481 samples/sec | ETA 07:56:53 2022-08-23 18:34:25 [INFO] [TRAIN] epoch: 8, iter: 9300/160000, loss: 0.9318, lr: 0.001141, batch_cost: 0.2142, reader_cost: 0.00049, ips: 37.3558 samples/sec | ETA 08:57:53 2022-08-23 18:34:36 [INFO] [TRAIN] epoch: 8, iter: 9350/160000, loss: 0.9151, lr: 0.001141, batch_cost: 0.2157, reader_cost: 0.00461, ips: 37.0898 samples/sec | ETA 09:01:34 2022-08-23 18:34:46 [INFO] [TRAIN] epoch: 8, iter: 9400/160000, loss: 1.0261, lr: 0.001140, batch_cost: 0.2143, reader_cost: 0.00056, ips: 37.3370 samples/sec | ETA 08:57:48 2022-08-23 18:34:58 [INFO] [TRAIN] epoch: 8, iter: 9450/160000, loss: 0.9457, lr: 0.001140, batch_cost: 0.2216, reader_cost: 0.00033, ips: 36.1056 samples/sec | ETA 09:15:57 2022-08-23 18:35:07 [INFO] [TRAIN] epoch: 8, iter: 9500/160000, loss: 0.9580, lr: 0.001139, batch_cost: 0.1952, reader_cost: 0.00261, ips: 40.9846 samples/sec | ETA 08:09:36 2022-08-23 18:35:17 [INFO] [TRAIN] epoch: 8, iter: 9550/160000, loss: 0.9608, lr: 0.001139, batch_cost: 0.2013, reader_cost: 0.00315, ips: 39.7403 samples/sec | ETA 08:24:46 2022-08-23 18:35:29 [INFO] [TRAIN] epoch: 8, iter: 9600/160000, loss: 0.9790, lr: 0.001139, batch_cost: 0.2336, reader_cost: 0.00052, ips: 34.2466 samples/sec | ETA 09:45:33 2022-08-23 18:35:39 [INFO] [TRAIN] epoch: 8, iter: 9650/160000, loss: 0.9630, lr: 0.001138, batch_cost: 0.2008, reader_cost: 0.00082, ips: 39.8428 samples/sec | ETA 
08:23:08 2022-08-23 18:35:49 [INFO] [TRAIN] epoch: 8, iter: 9700/160000, loss: 0.9825, lr: 0.001138, batch_cost: 0.2064, reader_cost: 0.00050, ips: 38.7590 samples/sec | ETA 08:37:02 2022-08-23 18:36:01 [INFO] [TRAIN] epoch: 8, iter: 9750/160000, loss: 0.9835, lr: 0.001138, batch_cost: 0.2266, reader_cost: 0.00053, ips: 35.3053 samples/sec | ETA 09:27:25 2022-08-23 18:36:11 [INFO] [TRAIN] epoch: 8, iter: 9800/160000, loss: 0.9022, lr: 0.001137, batch_cost: 0.1970, reader_cost: 0.00245, ips: 40.6139 samples/sec | ETA 08:13:05 2022-08-23 18:36:23 [INFO] [TRAIN] epoch: 8, iter: 9850/160000, loss: 0.9197, lr: 0.001137, batch_cost: 0.2397, reader_cost: 0.00046, ips: 33.3723 samples/sec | ETA 09:59:53 2022-08-23 18:36:34 [INFO] [TRAIN] epoch: 8, iter: 9900/160000, loss: 0.9230, lr: 0.001136, batch_cost: 0.2235, reader_cost: 0.00064, ips: 35.7919 samples/sec | ETA 09:19:09 2022-08-23 18:36:45 [INFO] [TRAIN] epoch: 8, iter: 9950/160000, loss: 0.9856, lr: 0.001136, batch_cost: 0.2195, reader_cost: 0.00108, ips: 36.4461 samples/sec | ETA 09:08:56 2022-08-23 18:36:55 [INFO] [TRAIN] epoch: 8, iter: 10000/160000, loss: 0.9875, lr: 0.001136, batch_cost: 0.2115, reader_cost: 0.00064, ips: 37.8292 samples/sec | ETA 08:48:41 2022-08-23 18:36:55 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 155s - batch_cost: 0.1552 - reader cost: 6.4095e-04 2022-08-23 18:39:31 [INFO] [EVAL] #Images: 2000 mIoU: 0.2870 Acc: 0.7359 Kappa: 0.7156 Dice: 0.4062 2022-08-23 18:39:31 [INFO] [EVAL] Class IoU: [0.6389 0.7446 0.9252 0.7059 0.6542 0.7294 0.7437 0.7185 0.4732 0.6047 0.4453 0.5064 0.6797 0.2488 0.2251 0.369 0.5044 0.3529 0.5276 0.3683 0.7186 0.4272 0.5615 0.4356 0.2885 0.3759 0.4471 0.355 0.3384 0.2008 0.2402 0.4146 0.2604 0.2586 0.3016 0.3946 0.3528 0.4559 0.2504 0.242 0.1278 0.1085 0.2877 0.1796 0.2732 0.1831 0.2634 0.3774 0.5795 0.4377 0.4698 0.41 0.14 0.1628 0.6567 0.3515 0.6965 0.2538 0.3272 0.253 0.0889 0.2107 0.226 0.2474 0.3715 0.5962 0.1736 0.3646 0.0608 0.3326 0.4156 0.4304 0.4114 0.1971 0.3873 0.3017 0.4903 0.1208 0.1782 0.1272 0.5929 0.2961 0.2898 0.0217 0.1254 0.4364 0.0507 0.0511 0.2442 0.3905 0.3495 0.0002 0.1725 0.0371 0.0035 0.0026 0.0691 0.0875 0.2669 0.2945 0.0199 0.0674 0.1981 0.1129 0.1756 0.2488 0.2497 0.4424 0.1213 0.0254 0.058 0.2279 0.0808 0.5706 0.8296 0. 0.4113 0.5437 0.0768 0.127 0.2164 0.0122 0.2159 0.0471 0.1786 0.1195 0.4578 0.4319 0. 0.0269 0.5593 0. 0.1342 0.1909 0.0577 0.0781 0.0254 0.014 0.1053 0.1843 0.1024 0.0105 0.0771 0. 0.2344 0. 0.3006 0.0089 0.0784 0.0556] 2022-08-23 18:39:31 [INFO] [EVAL] Class Precision: [0.7623 0.7982 0.9565 0.8131 0.7602 0.8065 0.8475 0.7744 0.6244 0.7254 0.7004 0.6859 0.7824 0.4927 0.4477 0.5595 0.6319 0.7295 0.6539 0.567 0.7951 0.6305 0.781 0.6656 0.4806 0.5297 0.5059 0.5953 0.7086 0.265 0.3455 0.7272 0.4461 0.3407 0.5035 0.5759 0.6251 0.6876 0.3791 0.6125 0.2641 0.3754 0.6904 0.6191 0.3409 0.6218 0.4004 0.7187 0.7119 0.5308 0.6804 0.5381 0.33 0.5835 0.7471 0.4366 0.7108 0.5912 0.6817 0.416 0.2513 0.308 0.4577 0.5311 0.4766 0.6729 0.215 0.4578 0.1901 0.5642 0.5044 0.5429 0.6383 0.2913 0.568 0.5606 0.7205 0.7763 0.3197 0.6751 0.7196 0.6573 0.7171 0.0371 0.3677 0.7052 0.2566 0.2859 0.6148 0.7225 0.4286 0.0038 0.3105 0.2752 0.0114 0.1197 0.0853 0.3804 0.4514 0.5195 0.3581 0.0801 0.6441 0.9107 0.4519 0.6722 0.5264 0.6011 0.3191 0.6151 0.4726 0.5259 0.3078 0.7007 0.8847 0. 0.4827 0.5651 0.2375 0.5986 0.6359 0.3835 0.7857 0.5691 0.7091 0.7997 0.7406 0.5948 0. 0.8549 0.6448 0. 
0.469 0.5537 0.4753 0.5316 0.8407 0.0286 0.3251 0.755 0.1094 0.0155 0.5347 0. 0.4005 0. 0.8589 0.2391 0.7633 0.9046] 2022-08-23 18:39:31 [INFO] [EVAL] Class Recall: [0.7979 0.9172 0.9659 0.8426 0.8243 0.8841 0.8586 0.9087 0.6615 0.7842 0.55 0.6593 0.8382 0.3344 0.3117 0.52 0.7142 0.406 0.7321 0.5125 0.8818 0.5699 0.6664 0.5577 0.4192 0.5642 0.7937 0.468 0.3931 0.453 0.4408 0.4909 0.3848 0.5176 0.4292 0.5563 0.4474 0.575 0.4245 0.2857 0.1984 0.1324 0.3303 0.2019 0.5789 0.206 0.435 0.4428 0.7569 0.7139 0.6028 0.6327 0.1956 0.1842 0.8445 0.6433 0.972 0.3078 0.3862 0.3924 0.1209 0.4002 0.3088 0.3165 0.6275 0.8395 0.4742 0.6417 0.0821 0.4475 0.7025 0.6749 0.5365 0.3785 0.5491 0.3951 0.6055 0.1251 0.287 0.1355 0.771 0.3502 0.3272 0.0499 0.1599 0.5338 0.0594 0.0586 0.2883 0.4594 0.6544 0.0003 0.2795 0.0411 0.005 0.0026 0.2676 0.102 0.395 0.4048 0.0207 0.2983 0.2225 0.1142 0.2231 0.2831 0.322 0.6263 0.1637 0.0259 0.0621 0.2868 0.0987 0.7545 0.9302 0. 0.7357 0.9347 0.1018 0.1388 0.247 0.0125 0.2294 0.0488 0.1927 0.1232 0.5452 0.612 0. 0.0271 0.8082 0. 0.1583 0.2256 0.0616 0.0839 0.0256 0.0266 0.1347 0.196 0.6174 0.0313 0.0826 0. 0.3612 0. 0.3162 0.0091 0.0803 0.056 ] 2022-08-23 18:39:31 [INFO] [EVAL] The model with the best validation mIoU (0.2924) was saved at iter 9000. 2022-08-23 18:39:40 [INFO] [TRAIN] epoch: 8, iter: 10050/160000, loss: 0.9886, lr: 0.001135, batch_cost: 0.1813, reader_cost: 0.00358, ips: 44.1149 samples/sec | ETA 07:33:12 2022-08-23 18:39:50 [INFO] [TRAIN] epoch: 8, iter: 10100/160000, loss: 0.9335, lr: 0.001135, batch_cost: 0.2013, reader_cost: 0.00149, ips: 39.7391 samples/sec | ETA 08:22:56 2022-08-23 18:40:07 [INFO] [TRAIN] epoch: 9, iter: 10150/160000, loss: 0.9496, lr: 0.001135, batch_cost: 0.3320, reader_cost: 0.12618, ips: 24.0980 samples/sec | ETA 13:49:06 2022-08-23 18:40:19 [INFO] [TRAIN] epoch: 9, iter: 10200/160000, loss: 0.9113, lr: 0.001134, batch_cost: 0.2454, reader_cost: 0.00366, ips: 32.5990 samples/sec | ETA 10:12:41 2022-08-23 18:40:29 [INFO] [TRAIN] epoch: 9, iter: 10250/160000, loss: 0.9749, lr: 0.001134, batch_cost: 0.2075, reader_cost: 0.00050, ips: 38.5582 samples/sec | ETA 08:37:49 2022-08-23 18:40:39 [INFO] [TRAIN] epoch: 9, iter: 10300/160000, loss: 0.9815, lr: 0.001133, batch_cost: 0.1982, reader_cost: 0.00041, ips: 40.3536 samples/sec | ETA 08:14:37 2022-08-23 18:40:48 [INFO] [TRAIN] epoch: 9, iter: 10350/160000, loss: 0.9110, lr: 0.001133, batch_cost: 0.1815, reader_cost: 0.00206, ips: 44.0853 samples/sec | ETA 07:32:36 2022-08-23 18:40:59 [INFO] [TRAIN] epoch: 9, iter: 10400/160000, loss: 0.8466, lr: 0.001133, batch_cost: 0.2075, reader_cost: 0.00046, ips: 38.5601 samples/sec | ETA 08:37:17 2022-08-23 18:41:10 [INFO] [TRAIN] epoch: 9, iter: 10450/160000, loss: 0.9723, lr: 0.001132, batch_cost: 0.2224, reader_cost: 0.00052, ips: 35.9713 samples/sec | ETA 09:14:19 2022-08-23 18:41:20 [INFO] [TRAIN] epoch: 9, iter: 10500/160000, loss: 0.9602, lr: 0.001132, batch_cost: 0.2104, reader_cost: 0.00081, ips: 38.0242 samples/sec | ETA 08:44:13 2022-08-23 18:41:31 [INFO] [TRAIN] epoch: 9, iter: 10550/160000, loss: 0.9980, lr: 0.001131, batch_cost: 0.2118, reader_cost: 0.00035, ips: 37.7685 samples/sec | ETA 08:47:35 2022-08-23 18:41:42 [INFO] [TRAIN] epoch: 9, iter: 10600/160000, loss: 0.9717, lr: 0.001131, batch_cost: 0.2229, reader_cost: 0.00076, ips: 35.8886 samples/sec | ETA 09:15:03 2022-08-23 18:41:52 [INFO] [TRAIN] epoch: 9, iter: 10650/160000, loss: 0.9687, lr: 0.001131, batch_cost: 0.1896, reader_cost: 0.00033, ips: 42.1990 samples/sec | ETA 
07:51:53 2022-08-23 18:42:03 [INFO] [TRAIN] epoch: 9, iter: 10700/160000, loss: 0.8771, lr: 0.001130, batch_cost: 0.2243, reader_cost: 0.00076, ips: 35.6727 samples/sec | ETA 09:18:02 2022-08-23 18:42:13 [INFO] [TRAIN] epoch: 9, iter: 10750/160000, loss: 0.9199, lr: 0.001130, batch_cost: 0.2092, reader_cost: 0.00062, ips: 38.2429 samples/sec | ETA 08:40:21 2022-08-23 18:42:24 [INFO] [TRAIN] epoch: 9, iter: 10800/160000, loss: 0.9705, lr: 0.001130, batch_cost: 0.2149, reader_cost: 0.00048, ips: 37.2241 samples/sec | ETA 08:54:25 2022-08-23 18:42:34 [INFO] [TRAIN] epoch: 9, iter: 10850/160000, loss: 0.9257, lr: 0.001129, batch_cost: 0.1949, reader_cost: 0.00052, ips: 41.0519 samples/sec | ETA 08:04:25 2022-08-23 18:42:44 [INFO] [TRAIN] epoch: 9, iter: 10900/160000, loss: 0.9249, lr: 0.001129, batch_cost: 0.2145, reader_cost: 0.00065, ips: 37.3031 samples/sec | ETA 08:52:55 2022-08-23 18:42:54 [INFO] [TRAIN] epoch: 9, iter: 10950/160000, loss: 0.9545, lr: 0.001128, batch_cost: 0.1867, reader_cost: 0.00079, ips: 42.8394 samples/sec | ETA 07:43:54 2022-08-23 18:43:04 [INFO] [TRAIN] epoch: 9, iter: 11000/160000, loss: 0.9286, lr: 0.001128, batch_cost: 0.1959, reader_cost: 0.00068, ips: 40.8428 samples/sec | ETA 08:06:25 2022-08-23 18:43:04 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 146s - batch_cost: 0.1459 - reader cost: 9.7827e-04 2022-08-23 18:45:30 [INFO] [EVAL] #Images: 2000 mIoU: 0.3011 Acc: 0.7396 Kappa: 0.7201 Dice: 0.4257 2022-08-23 18:45:30 [INFO] [EVAL] Class IoU: [0.6466 0.7571 0.9256 0.7029 0.6654 0.7286 0.7484 0.7256 0.466 0.6618 0.4334 0.5051 0.6745 0.2819 0.1809 0.3567 0.4564 0.3897 0.5515 0.3593 0.7364 0.3686 0.5612 0.4454 0.3193 0.4116 0.4318 0.3414 0.3221 0.2729 0.1431 0.4827 0.2683 0.2839 0.3129 0.3713 0.3706 0.4893 0.2289 0.3173 0.0937 0.1018 0.298 0.1999 0.2471 0.2103 0.2514 0.4275 0.5179 0.4897 0.419 0.4916 0.1519 0.1875 0.6509 0.3812 0.8402 0.2582 0.3857 0.2329 0.1 0.2591 0.3157 0.0124 0.3519 0.5962 0.2387 0.3687 0.0401 0.3082 0.3154 0.383 0.3296 0.1919 0.4063 0.2813 0.4026 0.1109 0.2865 0.2155 0.5713 0.3508 0.3526 0.0547 0.336 0.4815 0.0861 0.0725 0.272 0.4267 0.4254 0.1238 0.2296 0.0395 0.08 0.0011 0.1219 0.0208 0.2158 0.2715 0. 0.0625 0.0811 0.673 0.2 0.342 0.1651 0.4445 0.146 0.2229 0.076 0.0918 0.1135 0.4547 0.4606 0. 0.2662 0.596 0.0839 0.2678 0.359 0. 0.295 0.0624 0.2403 0.1561 0.4072 0.3777 0. 0.0291 0.6094 0. 0.1066 0.1444 0.096 0.1427 0.0999 0.0149 0.1015 0.2831 0.2196 0.0396 0.3269 0. 0.2504 0. 0.3036 0.0215 0.075 0.0745] 2022-08-23 18:45:30 [INFO] [EVAL] Class Precision: [0.7788 0.844 0.9662 0.7966 0.7431 0.8167 0.8495 0.7916 0.5777 0.769 0.5665 0.6659 0.7525 0.4825 0.5108 0.605 0.5671 0.7169 0.6967 0.6022 0.8384 0.6527 0.7051 0.5407 0.5408 0.4619 0.4586 0.5416 0.7147 0.5776 0.2898 0.607 0.7334 0.3919 0.4734 0.5321 0.5537 0.6988 0.4835 0.4526 0.2201 0.3239 0.5395 0.5338 0.322 0.4711 0.3656 0.6975 0.72 0.6463 0.5654 0.7438 0.3532 0.3936 0.7347 0.4753 0.8865 0.5303 0.5226 0.3708 0.2996 0.4418 0.4464 0.4748 0.4184 0.6747 0.3628 0.5825 0.118 0.604 0.497 0.4665 0.7341 0.2476 0.6579 0.3962 0.4705 0.5355 0.4906 0.4733 0.6849 0.688 0.6312 0.1499 0.4746 0.5826 0.3505 0.2699 0.4123 0.5428 0.5966 0.287 0.3575 0.2808 0.2473 0.0124 0.1754 0.2637 0.6697 0.7562 0. 0.1073 0.5878 0.8099 0.4916 0.3742 0.2242 0.793 0.2372 0.3165 0.2056 0.0971 0.3345 0.4575 0.4957 0. 0.7455 0.7223 0.1352 0.4101 0.769 0. 0.9564 0.7856 0.4609 0.6886 0.8351 0.4899 0. 0.859 0.8126 0. 0.4501 0.5916 0.4238 0.2484 0.4111 0.3826 0.5152 0.6976 0.2891 0.0533 0.5282 0. 
0.6999 0. 0.8904 0.2689 0.4786 0.94 ] 2022-08-23 18:45:30 [INFO] [EVAL] Class Recall: [0.7921 0.8803 0.9566 0.8566 0.8641 0.8711 0.8629 0.897 0.7067 0.826 0.6484 0.6765 0.8667 0.4041 0.2188 0.465 0.7005 0.4606 0.7258 0.4711 0.8582 0.4586 0.7333 0.7164 0.4381 0.7907 0.8807 0.4801 0.3696 0.3409 0.2204 0.7021 0.2973 0.5076 0.48 0.5514 0.5284 0.62 0.3029 0.5149 0.1403 0.1293 0.3998 0.2421 0.5151 0.2754 0.4461 0.5248 0.6485 0.6689 0.6181 0.5918 0.2104 0.2637 0.851 0.6581 0.9415 0.3348 0.5956 0.3849 0.1305 0.3853 0.5188 0.0126 0.6886 0.8367 0.4112 0.501 0.0572 0.3863 0.4632 0.6817 0.3742 0.4605 0.5151 0.4924 0.7361 0.1227 0.4078 0.2836 0.775 0.4171 0.4441 0.0792 0.5349 0.7351 0.1024 0.0902 0.4442 0.6661 0.5971 0.1789 0.3908 0.044 0.1057 0.0012 0.2852 0.0221 0.2415 0.2976 0. 0.1302 0.086 0.7992 0.2521 0.7992 0.3854 0.5029 0.2752 0.4298 0.1075 0.6281 0.1467 0.9866 0.8665 0. 0.2928 0.7732 0.181 0.4355 0.4023 0. 0.299 0.0635 0.3342 0.1679 0.4428 0.6225 0. 0.0293 0.7091 0. 0.1225 0.1604 0.1104 0.251 0.1166 0.0152 0.1122 0.3227 0.4776 0.1342 0.4618 0. 0.2805 0. 0.3154 0.0228 0.0817 0.0749] 2022-08-23 18:45:30 [INFO] [EVAL] The model with the best validation mIoU (0.3011) was saved at iter 11000. 2022-08-23 18:45:38 [INFO] [TRAIN] epoch: 9, iter: 11050/160000, loss: 0.9145, lr: 0.001128, batch_cost: 0.1544, reader_cost: 0.00272, ips: 51.8253 samples/sec | ETA 06:23:12 2022-08-23 18:45:46 [INFO] [TRAIN] epoch: 9, iter: 11100/160000, loss: 0.9521, lr: 0.001127, batch_cost: 0.1726, reader_cost: 0.00085, ips: 46.3424 samples/sec | ETA 07:08:24 2022-08-23 18:45:55 [INFO] [TRAIN] epoch: 9, iter: 11150/160000, loss: 0.9708, lr: 0.001127, batch_cost: 0.1796, reader_cost: 0.00447, ips: 44.5317 samples/sec | ETA 07:25:40 2022-08-23 18:46:06 [INFO] [TRAIN] epoch: 9, iter: 11200/160000, loss: 0.8690, lr: 0.001127, batch_cost: 0.2054, reader_cost: 0.00082, ips: 38.9496 samples/sec | ETA 08:29:22 2022-08-23 18:46:17 [INFO] [TRAIN] epoch: 9, iter: 11250/160000, loss: 0.9129, lr: 0.001126, batch_cost: 0.2315, reader_cost: 0.00101, ips: 34.5597 samples/sec | ETA 09:33:53 2022-08-23 18:46:29 [INFO] [TRAIN] epoch: 9, iter: 11300/160000, loss: 0.9166, lr: 0.001126, batch_cost: 0.2324, reader_cost: 0.00162, ips: 34.4203 samples/sec | ETA 09:36:01 2022-08-23 18:46:40 [INFO] [TRAIN] epoch: 9, iter: 11350/160000, loss: 0.9682, lr: 0.001125, batch_cost: 0.2298, reader_cost: 0.00058, ips: 34.8191 samples/sec | ETA 09:29:13 2022-08-23 18:46:55 [INFO] [TRAIN] epoch: 10, iter: 11400/160000, loss: 0.9615, lr: 0.001125, batch_cost: 0.2940, reader_cost: 0.09816, ips: 27.2065 samples/sec | ETA 12:08:15 2022-08-23 18:47:07 [INFO] [TRAIN] epoch: 10, iter: 11450/160000, loss: 0.8906, lr: 0.001125, batch_cost: 0.2452, reader_cost: 0.00180, ips: 32.6240 samples/sec | ETA 10:07:07 2022-08-23 18:47:19 [INFO] [TRAIN] epoch: 10, iter: 11500/160000, loss: 0.8398, lr: 0.001124, batch_cost: 0.2302, reader_cost: 0.00073, ips: 34.7456 samples/sec | ETA 09:29:51 2022-08-23 18:47:30 [INFO] [TRAIN] epoch: 10, iter: 11550/160000, loss: 0.9961, lr: 0.001124, batch_cost: 0.2237, reader_cost: 0.00066, ips: 35.7645 samples/sec | ETA 09:13:26 2022-08-23 18:47:39 [INFO] [TRAIN] epoch: 10, iter: 11600/160000, loss: 0.8516, lr: 0.001124, batch_cost: 0.1872, reader_cost: 0.00494, ips: 42.7431 samples/sec | ETA 07:42:55 2022-08-23 18:47:50 [INFO] [TRAIN] epoch: 10, iter: 11650/160000, loss: 0.9043, lr: 0.001123, batch_cost: 0.2208, reader_cost: 0.00070, ips: 36.2305 samples/sec | ETA 09:05:56 2022-08-23 18:48:01 [INFO] [TRAIN] epoch: 10, iter: 11700/160000, 
loss: 0.8285, lr: 0.001123, batch_cost: 0.2068, reader_cost: 0.00088, ips: 38.6907 samples/sec | ETA 08:31:03 2022-08-23 18:48:12 [INFO] [TRAIN] epoch: 10, iter: 11750/160000, loss: 0.8628, lr: 0.001122, batch_cost: 0.2260, reader_cost: 0.00039, ips: 35.3985 samples/sec | ETA 09:18:24 2022-08-23 18:48:21 [INFO] [TRAIN] epoch: 10, iter: 11800/160000, loss: 0.8854, lr: 0.001122, batch_cost: 0.1885, reader_cost: 0.00036, ips: 42.4500 samples/sec | ETA 07:45:29 2022-08-23 18:48:32 [INFO] [TRAIN] epoch: 10, iter: 11850/160000, loss: 0.8955, lr: 0.001122, batch_cost: 0.2022, reader_cost: 0.00104, ips: 39.5559 samples/sec | ETA 08:19:22 2022-08-23 18:48:41 [INFO] [TRAIN] epoch: 10, iter: 11900/160000, loss: 0.9285, lr: 0.001121, batch_cost: 0.1991, reader_cost: 0.00347, ips: 40.1796 samples/sec | ETA 08:11:27 2022-08-23 18:48:54 [INFO] [TRAIN] epoch: 10, iter: 11950/160000, loss: 0.9897, lr: 0.001121, batch_cost: 0.2512, reader_cost: 0.00305, ips: 31.8420 samples/sec | ETA 10:19:56 2022-08-23 18:49:05 [INFO] [TRAIN] epoch: 10, iter: 12000/160000, loss: 0.8677, lr: 0.001121, batch_cost: 0.2187, reader_cost: 0.00080, ips: 36.5795 samples/sec | ETA 08:59:27 2022-08-23 18:49:05 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 143s - batch_cost: 0.1432 - reader cost: 6.2716e-04 2022-08-23 18:51:28 [INFO] [EVAL] #Images: 2000 mIoU: 0.3138 Acc: 0.7416 Kappa: 0.7222 Dice: 0.4415 2022-08-23 18:51:28 [INFO] [EVAL] Class IoU: [0.6517 0.7661 0.9228 0.7096 0.664 0.734 0.7472 0.7258 0.4772 0.594 0.4589 0.5176 0.6735 0.2484 0.2175 0.3813 0.4858 0.3648 0.5639 0.3695 0.7192 0.4951 0.5685 0.4603 0.2984 0.2827 0.5319 0.3452 0.3918 0.2533 0.2 0.4636 0.2705 0.2813 0.2859 0.3352 0.3787 0.4958 0.2399 0.2849 0.0912 0.148 0.3109 0.2308 0.2335 0.1948 0.2494 0.4116 0.5222 0.4323 0.4479 0.3478 0.1571 0.2092 0.6757 0.2876 0.8487 0.3469 0.4137 0.2607 0.0824 0.3006 0.2323 0.1407 0.3473 0.6035 0.2342 0.3729 0.0096 0.3133 0.4039 0.4214 0.4026 0.2008 0.4291 0.3153 0.5115 0.2219 0.3801 0.2046 0.5706 0.3417 0.1866 0.0329 0.3057 0.5048 0.0835 0.0632 0.2018 0.4704 0.393 0.0373 0.1575 0.0824 0.1365 0.0021 0.1195 0.1177 0.2745 0.2967 0.0134 0.0805 0.2547 0.6928 0.1142 0.5746 0.1942 0.4742 0.134 0.252 0.033 0.42 0.1292 0.5253 0.5989 0. 0.4409 0.5298 0.0148 0.1588 0.4007 0.0144 0.3304 0.1631 0.2395 0.16 0.4413 0.4245 0. 0.2166 0.4768 0.0083 0.0061 0.1712 0.0658 0.1402 0.1378 0.0207 0.1248 0.3193 0.2209 0.0542 0.2777 0.0918 0.246 0.016 0.299 0.0029 0.0689 0.1182] 2022-08-23 18:51:28 [INFO] [EVAL] Class Precision: [0.7739 0.8451 0.9576 0.8528 0.7339 0.8271 0.8556 0.7833 0.5771 0.7376 0.6946 0.7011 0.7408 0.4797 0.4312 0.5488 0.5744 0.7047 0.7309 0.5307 0.8365 0.6503 0.7599 0.6402 0.5074 0.6132 0.6428 0.5919 0.5714 0.3304 0.3656 0.6574 0.4726 0.3833 0.3883 0.62 0.55 0.7544 0.482 0.4775 0.2713 0.2845 0.6484 0.5479 0.3132 0.3509 0.3574 0.673 0.7465 0.5158 0.573 0.419 0.276 0.4795 0.7443 0.5183 0.8973 0.4867 0.7529 0.5382 0.2053 0.4239 0.3738 0.5845 0.4243 0.6754 0.3619 0.491 0.097 0.5512 0.5962 0.4859 0.5283 0.2473 0.6215 0.513 0.7905 0.6186 0.4741 0.2828 0.6818 0.5575 0.8169 0.0465 0.4357 0.6688 0.2432 0.3077 0.7088 0.673 0.5453 0.0503 0.2088 0.2986 0.3056 0.0436 0.2013 0.35 0.4385 0.8534 0.2814 0.1019 0.5539 0.8101 0.7306 0.629 0.438 0.7141 0.2566 0.3397 0.3234 0.5879 0.2902 0.8196 0.6018 0. 0.6259 0.6052 0.0885 0.3524 0.4593 0.2023 0.8231 0.6502 0.4175 0.594 0.9307 0.5812 0. 
0.5609 0.6258 0.0799 0.0806 0.6012 0.8268 0.2403 0.4832 0.0957 0.4388 0.6336 0.3015 0.0781 0.7053 0.2361 0.4172 0.0737 0.8693 0.1977 0.5349 0.8879] 2022-08-23 18:51:28 [INFO] [EVAL] Class Recall: [0.8049 0.8913 0.9622 0.8086 0.8746 0.8671 0.855 0.9083 0.7338 0.7532 0.5748 0.6642 0.8811 0.34 0.3051 0.5554 0.7591 0.4306 0.7117 0.5488 0.8368 0.6747 0.6929 0.621 0.4201 0.344 0.755 0.453 0.5549 0.5207 0.3062 0.6113 0.3875 0.5139 0.5202 0.4219 0.5488 0.5912 0.3232 0.4139 0.1208 0.2359 0.3739 0.2852 0.4788 0.3044 0.4523 0.5145 0.6348 0.7276 0.6722 0.6716 0.2672 0.2707 0.8799 0.3925 0.94 0.5471 0.4787 0.3359 0.121 0.5084 0.3803 0.1564 0.6567 0.85 0.3988 0.6079 0.0105 0.4206 0.5559 0.7604 0.6286 0.5165 0.5808 0.45 0.5917 0.2571 0.6572 0.425 0.7777 0.4688 0.1947 0.1009 0.5061 0.673 0.1129 0.0737 0.2201 0.6097 0.5846 0.1258 0.3908 0.1022 0.1978 0.0022 0.2271 0.1507 0.4232 0.3126 0.0138 0.2768 0.3203 0.8271 0.1193 0.8691 0.2587 0.5853 0.2191 0.4941 0.0355 0.5953 0.189 0.594 0.9921 0. 0.5986 0.8096 0.0175 0.2243 0.7585 0.0153 0.3556 0.1788 0.3597 0.1796 0.4563 0.6116 0. 0.2609 0.667 0.0092 0.0065 0.1932 0.0667 0.2516 0.1616 0.0257 0.1485 0.3917 0.4526 0.1506 0.3141 0.1306 0.3747 0.02 0.3131 0.0029 0.0733 0.1199] 2022-08-23 18:51:29 [INFO] [EVAL] The model with the best validation mIoU (0.3138) was saved at iter 12000. 2022-08-23 18:51:38 [INFO] [TRAIN] epoch: 10, iter: 12050/160000, loss: 0.8775, lr: 0.001120, batch_cost: 0.1797, reader_cost: 0.00452, ips: 44.5118 samples/sec | ETA 07:23:10 2022-08-23 18:51:47 [INFO] [TRAIN] epoch: 10, iter: 12100/160000, loss: 0.8839, lr: 0.001120, batch_cost: 0.1946, reader_cost: 0.00506, ips: 41.1089 samples/sec | ETA 07:59:42 2022-08-23 18:51:57 [INFO] [TRAIN] epoch: 10, iter: 12150/160000, loss: 0.9723, lr: 0.001119, batch_cost: 0.1992, reader_cost: 0.00372, ips: 40.1650 samples/sec | ETA 08:10:48 2022-08-23 18:52:08 [INFO] [TRAIN] epoch: 10, iter: 12200/160000, loss: 0.9577, lr: 0.001119, batch_cost: 0.2155, reader_cost: 0.00120, ips: 37.1237 samples/sec | ETA 08:50:50 2022-08-23 18:52:18 [INFO] [TRAIN] epoch: 10, iter: 12250/160000, loss: 0.9247, lr: 0.001119, batch_cost: 0.1947, reader_cost: 0.00083, ips: 41.0895 samples/sec | ETA 07:59:26 2022-08-23 18:52:29 [INFO] [TRAIN] epoch: 10, iter: 12300/160000, loss: 0.9705, lr: 0.001118, batch_cost: 0.2165, reader_cost: 0.00107, ips: 36.9574 samples/sec | ETA 08:52:51 2022-08-23 18:52:38 [INFO] [TRAIN] epoch: 10, iter: 12350/160000, loss: 0.8901, lr: 0.001118, batch_cost: 0.1944, reader_cost: 0.00052, ips: 41.1578 samples/sec | ETA 07:58:19 2022-08-23 18:52:49 [INFO] [TRAIN] epoch: 10, iter: 12400/160000, loss: 0.8756, lr: 0.001117, batch_cost: 0.2117, reader_cost: 0.00112, ips: 37.7902 samples/sec | ETA 08:40:46 2022-08-23 18:53:00 [INFO] [TRAIN] epoch: 10, iter: 12450/160000, loss: 0.8818, lr: 0.001117, batch_cost: 0.2150, reader_cost: 0.00079, ips: 37.2020 samples/sec | ETA 08:48:49 2022-08-23 18:53:11 [INFO] [TRAIN] epoch: 10, iter: 12500/160000, loss: 0.9135, lr: 0.001117, batch_cost: 0.2310, reader_cost: 0.00143, ips: 34.6348 samples/sec | ETA 09:27:49 2022-08-23 18:53:22 [INFO] [TRAIN] epoch: 10, iter: 12550/160000, loss: 0.8818, lr: 0.001116, batch_cost: 0.2039, reader_cost: 0.00352, ips: 39.2302 samples/sec | ETA 08:21:08 2022-08-23 18:53:32 [INFO] [TRAIN] epoch: 10, iter: 12600/160000, loss: 1.0391, lr: 0.001116, batch_cost: 0.2192, reader_cost: 0.00060, ips: 36.5032 samples/sec | ETA 08:58:24 2022-08-23 18:53:47 [INFO] [TRAIN] epoch: 11, iter: 12650/160000, loss: 0.9733, lr: 0.001116, batch_cost: 0.2842, 
reader_cost: 0.08146, ips: 28.1466 samples/sec | ETA 11:38:00 2022-08-23 18:53:56 [INFO] [TRAIN] epoch: 11, iter: 12700/160000, loss: 0.8899, lr: 0.001115, batch_cost: 0.1929, reader_cost: 0.00093, ips: 41.4822 samples/sec | ETA 07:53:27 2022-08-23 18:54:06 [INFO] [TRAIN] epoch: 11, iter: 12750/160000, loss: 0.8923, lr: 0.001115, batch_cost: 0.2012, reader_cost: 0.00054, ips: 39.7676 samples/sec | ETA 08:13:42 2022-08-23 18:54:16 [INFO] [TRAIN] epoch: 11, iter: 12800/160000, loss: 0.9167, lr: 0.001114, batch_cost: 0.1910, reader_cost: 0.00052, ips: 41.8763 samples/sec | ETA 07:48:40 2022-08-23 18:54:27 [INFO] [TRAIN] epoch: 11, iter: 12850/160000, loss: 0.8869, lr: 0.001114, batch_cost: 0.2178, reader_cost: 0.00056, ips: 36.7367 samples/sec | ETA 08:54:04 2022-08-23 18:54:36 [INFO] [TRAIN] epoch: 11, iter: 12900/160000, loss: 0.8883, lr: 0.001114, batch_cost: 0.1882, reader_cost: 0.00037, ips: 42.5172 samples/sec | ETA 07:41:18 2022-08-23 18:54:48 [INFO] [TRAIN] epoch: 11, iter: 12950/160000, loss: 0.8315, lr: 0.001113, batch_cost: 0.2301, reader_cost: 0.00371, ips: 34.7684 samples/sec | ETA 09:23:55 2022-08-23 18:54:58 [INFO] [TRAIN] epoch: 11, iter: 13000/160000, loss: 0.8835, lr: 0.001113, batch_cost: 0.2143, reader_cost: 0.00094, ips: 37.3312 samples/sec | ETA 08:45:01 2022-08-23 18:54:58 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 150s - batch_cost: 0.1498 - reader cost: 8.1995e-04 2022-08-23 18:57:29 [INFO] [EVAL] #Images: 2000 mIoU: 0.3135 Acc: 0.7393 Kappa: 0.7195 Dice: 0.4403 2022-08-23 18:57:29 [INFO] [EVAL] Class IoU: [0.6564 0.7561 0.9213 0.7012 0.6666 0.7361 0.7534 0.7459 0.4826 0.5911 0.4516 0.5302 0.6768 0.2348 0.2048 0.3568 0.4937 0.3807 0.5523 0.3689 0.728 0.2436 0.5776 0.4654 0.2349 0.4308 0.4931 0.3789 0.3509 0.2263 0.1535 0.4634 0.2627 0.2682 0.2991 0.3649 0.3628 0.4594 0.2092 0.3304 0.1284 0.0855 0.3128 0.2105 0.2825 0.1314 0.2996 0.4289 0.4667 0.4706 0.4425 0.4212 0.1616 0.2061 0.6985 0.3464 0.8128 0.102 0.4143 0.2951 0.0757 0.4149 0.2654 0.0985 0.3847 0.6659 0.1469 0.3831 0.0835 0.3101 0.4374 0.4469 0.4144 0.1915 0.425 0.3195 0.4473 0.2076 0.5005 0.1646 0.6283 0.375 0.2664 0.0127 0.3091 0.4867 0.0755 0.0606 0.2853 0.4498 0.331 0.0358 0.1141 0.092 0.1742 0.0128 0.1851 0.0055 0.2913 0.3043 0.105 0.0986 0.0772 0.6162 0.1683 0.4519 0.144 0.467 0.0604 0.2673 0.0472 0.4478 0.126 0.5205 0.5976 0.0032 0.3745 0.5732 0.0142 0.2748 0.4719 0.005 0.2986 0.1017 0.2435 0.1575 0.4165 0.4347 0. 0.2699 0.5301 0.0003 0.0725 0.1809 0.0831 0.1422 0.1092 0.0618 0.1197 0.2515 0.1719 0.0007 0.3471 0.1247 0.2567 0. 0.3437 0.0242 0.0874 0.1238] 2022-08-23 18:57:29 [INFO] [EVAL] Class Precision: [0.7592 0.8665 0.9689 0.8087 0.7406 0.8698 0.8674 0.8026 0.5748 0.7841 0.6728 0.6638 0.7497 0.46 0.5331 0.5822 0.5842 0.7074 0.7595 0.5524 0.8036 0.6594 0.7348 0.5748 0.5063 0.4882 0.527 0.5641 0.7029 0.2735 0.366 0.572 0.4452 0.3286 0.5411 0.4207 0.654 0.7394 0.477 0.4941 0.1563 0.4257 0.6204 0.553 0.3979 0.4559 0.5527 0.6209 0.4722 0.6248 0.6225 0.5345 0.3578 0.6016 0.7831 0.6837 0.8365 0.7075 0.6926 0.5373 0.0887 0.5723 0.3461 0.7981 0.4555 0.8325 0.2362 0.4892 0.6172 0.5779 0.5974 0.5354 0.5919 0.2545 0.5807 0.4839 0.4745 0.7704 0.8766 0.6769 0.7686 0.691 0.7437 0.0519 0.3763 0.5993 0.2039 0.3159 0.6884 0.6683 0.41 0.0614 0.3357 0.284 0.2522 0.052 0.4011 0.2823 0.4774 0.764 0.4305 0.2304 0.3896 0.6563 0.4963 0.4824 0.315 0.6577 0.3355 0.3753 0.3827 0.6508 0.4344 0.5322 0.605 0.4877 0.644 0.6722 0.05 0.5993 0.6649 0.075 0.6254 0.7489 0.5971 0.5492 0.8793 0.6992 0. 
0.5097 0.6633 0.0489 0.716 0.5803 0.5748 0.4151 0.3939 0.2273 0.5394 0.7027 0.5746 0.0012 0.5691 0.3171 0.7839 0. 0.6993 0.1872 0.1724 0.888 ] 2022-08-23 18:57:29 [INFO] [EVAL] Class Recall: [0.829 0.8558 0.9494 0.8407 0.8696 0.8273 0.8515 0.9136 0.7504 0.706 0.5788 0.7249 0.8743 0.3241 0.2495 0.4797 0.7613 0.4519 0.6694 0.5262 0.8856 0.2787 0.7297 0.7097 0.3046 0.7857 0.8845 0.5358 0.4121 0.5677 0.2092 0.7093 0.3904 0.5935 0.4007 0.7336 0.449 0.5481 0.2715 0.4993 0.418 0.0966 0.3869 0.2537 0.4934 0.1559 0.3955 0.5811 0.9755 0.6561 0.6047 0.6652 0.2276 0.2386 0.866 0.4125 0.9663 0.1065 0.5076 0.3957 0.3402 0.6013 0.5326 0.101 0.7124 0.7689 0.2797 0.6385 0.088 0.4009 0.6202 0.7301 0.5802 0.4361 0.6132 0.4847 0.8863 0.2213 0.5384 0.1786 0.7749 0.4506 0.2933 0.0165 0.6337 0.7214 0.1071 0.0698 0.3276 0.5791 0.6318 0.079 0.1473 0.1198 0.3603 0.0168 0.2558 0.0056 0.4276 0.3359 0.1219 0.147 0.0879 0.9097 0.203 0.8773 0.2097 0.617 0.0686 0.4815 0.051 0.5894 0.1507 0.9592 0.9798 0.0032 0.4723 0.7955 0.0194 0.3366 0.6192 0.0053 0.3637 0.1053 0.2914 0.1809 0.4418 0.5347 0. 0.3645 0.7252 0.0003 0.0747 0.2081 0.0885 0.1779 0.1313 0.0782 0.1333 0.2814 0.197 0.0016 0.4709 0.1704 0.2762 0. 0.4032 0.027 0.1505 0.1258] 2022-08-23 18:57:29 [INFO] [EVAL] The model with the best validation mIoU (0.3138) was saved at iter 12000. 2022-08-23 18:57:38 [INFO] [TRAIN] epoch: 11, iter: 13050/160000, loss: 0.9663, lr: 0.001113, batch_cost: 0.1914, reader_cost: 0.00255, ips: 41.7986 samples/sec | ETA 07:48:45 2022-08-23 18:57:50 [INFO] [TRAIN] epoch: 11, iter: 13100/160000, loss: 0.8822, lr: 0.001112, batch_cost: 0.2359, reader_cost: 0.00125, ips: 33.9185 samples/sec | ETA 09:37:27 2022-08-23 18:57:59 [INFO] [TRAIN] epoch: 11, iter: 13150/160000, loss: 0.9105, lr: 0.001112, batch_cost: 0.1882, reader_cost: 0.00481, ips: 42.5017 samples/sec | ETA 07:40:41 2022-08-23 18:58:09 [INFO] [TRAIN] epoch: 11, iter: 13200/160000, loss: 0.9541, lr: 0.001111, batch_cost: 0.1971, reader_cost: 0.00717, ips: 40.5900 samples/sec | ETA 08:02:13 2022-08-23 18:58:21 [INFO] [TRAIN] epoch: 11, iter: 13250/160000, loss: 0.9395, lr: 0.001111, batch_cost: 0.2245, reader_cost: 0.00100, ips: 35.6370 samples/sec | ETA 09:09:03 2022-08-23 18:58:32 [INFO] [TRAIN] epoch: 11, iter: 13300/160000, loss: 0.9096, lr: 0.001111, batch_cost: 0.2259, reader_cost: 0.00052, ips: 35.4187 samples/sec | ETA 09:12:15 2022-08-23 18:58:42 [INFO] [TRAIN] epoch: 11, iter: 13350/160000, loss: 0.8195, lr: 0.001110, batch_cost: 0.1938, reader_cost: 0.00136, ips: 41.2787 samples/sec | ETA 07:53:41 2022-08-23 18:58:52 [INFO] [TRAIN] epoch: 11, iter: 13400/160000, loss: 0.9471, lr: 0.001110, batch_cost: 0.2048, reader_cost: 0.00040, ips: 39.0648 samples/sec | ETA 08:20:21 2022-08-23 18:59:02 [INFO] [TRAIN] epoch: 11, iter: 13450/160000, loss: 0.8730, lr: 0.001110, batch_cost: 0.2021, reader_cost: 0.00037, ips: 39.5872 samples/sec | ETA 08:13:35 2022-08-23 18:59:13 [INFO] [TRAIN] epoch: 11, iter: 13500/160000, loss: 0.9343, lr: 0.001109, batch_cost: 0.2149, reader_cost: 0.00063, ips: 37.2225 samples/sec | ETA 08:44:46 2022-08-23 18:59:24 [INFO] [TRAIN] epoch: 11, iter: 13550/160000, loss: 0.8897, lr: 0.001109, batch_cost: 0.2194, reader_cost: 0.00098, ips: 36.4597 samples/sec | ETA 08:55:34 2022-08-23 18:59:34 [INFO] [TRAIN] epoch: 11, iter: 13600/160000, loss: 0.9356, lr: 0.001108, batch_cost: 0.2156, reader_cost: 0.00045, ips: 37.1009 samples/sec | ETA 08:46:07 2022-08-23 18:59:45 [INFO] [TRAIN] epoch: 11, iter: 13650/160000, loss: 0.8952, lr: 0.001108, batch_cost: 0.2152, 
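Note that the "best validation mIoU" message only advances when an evaluation beats the running best: the 13000-iter result above (mIoU 0.3135) does not improve on 0.3138 from iter 12000, so the best checkpoint stays at iter 12000. A minimal sketch of that keep-best bookkeeping; save_checkpoint is a hypothetical stand-in, not a PaddleSeg API call.

best_miou, best_iter = -1.0, -1

def report_eval(miou, cur_iter):
    """Track the best validation mIoU and report it after every evaluation."""
    global best_miou, best_iter
    if miou > best_miou:
        best_miou, best_iter = miou, cur_iter
        # save_checkpoint("best_model")  # hypothetical: only written on improvement
    print(f"The model with the best validation mIoU ({best_miou:.4f}) was saved at iter {best_iter}.")

report_eval(0.3138, 12000)  # improves -> best moves to iter 12000
report_eval(0.3135, 13000)  # no improvement -> message still cites iter 12000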
reader_cost: 0.00089, ips: 37.1822 samples/sec | ETA 08:44:48 2022-08-23 18:59:56 [INFO] [TRAIN] epoch: 11, iter: 13700/160000, loss: 0.8662, lr: 0.001108, batch_cost: 0.2236, reader_cost: 0.00045, ips: 35.7798 samples/sec | ETA 09:05:11 2022-08-23 19:00:09 [INFO] [TRAIN] epoch: 11, iter: 13750/160000, loss: 0.9149, lr: 0.001107, batch_cost: 0.2521, reader_cost: 0.00070, ips: 31.7356 samples/sec | ETA 10:14:27 2022-08-23 19:00:22 [INFO] [TRAIN] epoch: 11, iter: 13800/160000, loss: 0.8816, lr: 0.001107, batch_cost: 0.2521, reader_cost: 0.00158, ips: 31.7282 samples/sec | ETA 10:14:23 2022-08-23 19:00:33 [INFO] [TRAIN] epoch: 11, iter: 13850/160000, loss: 0.8894, lr: 0.001107, batch_cost: 0.2380, reader_cost: 0.00063, ips: 33.6128 samples/sec | ETA 09:39:44 2022-08-23 19:00:52 [INFO] [TRAIN] epoch: 12, iter: 13900/160000, loss: 0.8511, lr: 0.001106, batch_cost: 0.3605, reader_cost: 0.14096, ips: 22.1930 samples/sec | ETA 14:37:45 2022-08-23 19:01:03 [INFO] [TRAIN] epoch: 12, iter: 13950/160000, loss: 0.9301, lr: 0.001106, batch_cost: 0.2326, reader_cost: 0.00516, ips: 34.3883 samples/sec | ETA 09:26:16 2022-08-23 19:01:15 [INFO] [TRAIN] epoch: 12, iter: 14000/160000, loss: 0.9307, lr: 0.001105, batch_cost: 0.2315, reader_cost: 0.00038, ips: 34.5500 samples/sec | ETA 09:23:26 2022-08-23 19:01:15 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 151s - batch_cost: 0.1506 - reader cost: 8.0192e-04 2022-08-23 19:03:46 [INFO] [EVAL] #Images: 2000 mIoU: 0.3186 Acc: 0.7478 Kappa: 0.7284 Dice: 0.4464 2022-08-23 19:03:46 [INFO] [EVAL] Class IoU: [0.6555 0.7733 0.9271 0.7132 0.6498 0.7513 0.7369 0.7523 0.4849 0.6119 0.4531 0.5257 0.6773 0.2834 0.2366 0.4006 0.4964 0.294 0.5615 0.3787 0.7213 0.4888 0.5785 0.4685 0.2278 0.4428 0.5155 0.3774 0.3967 0.2413 0.1959 0.4462 0.2659 0.2758 0.3312 0.3665 0.3825 0.4781 0.2397 0.2934 0.1682 0.138 0.308 0.2526 0.3125 0.2463 0.3057 0.3863 0.6326 0.4386 0.4567 0.3821 0.1555 0.2534 0.6876 0.2744 0.8132 0.3091 0.3958 0.2092 0.0686 0.4573 0.2476 0.0908 0.3777 0.6025 0.2082 0.3148 0.0361 0.3322 0.4012 0.4473 0.3859 0.2097 0.4042 0.3267 0.5821 0.2721 0.5032 0.29 0.6055 0.331 0.3798 0.0115 0.2778 0.486 0.0861 0.0558 0.3035 0.4772 0.4225 0.0094 0.1627 0.051 0.0265 0.0073 0.169 0.0724 0.097 0.2563 0.0273 0.126 0.0462 0.5959 0.1794 0.5788 0.0804 0.4414 0.1606 0.1995 0.0885 0.2884 0.1731 0.4648 0.61 0. 0.3405 0.5209 0.0639 0.2398 0.4556 0.0656 0.1924 0.0633 0.2328 0.1618 0.4425 0.4126 0. 0.251 0.4261 0. 0.1457 0.1496 0.1288 0.1374 0.1382 0.029 0.1539 0.3422 0.3054 0.0034 0.3606 0. 0.2955 0.0007 0.3437 0.0336 0.1089 0.1088] 2022-08-23 19:03:46 [INFO] [EVAL] Class Precision: [0.7526 0.869 0.9621 0.8064 0.7252 0.8604 0.8472 0.8246 0.6379 0.7295 0.6808 0.6848 0.7965 0.4951 0.4513 0.5673 0.6243 0.7446 0.7182 0.5911 0.7827 0.6504 0.7134 0.5846 0.4779 0.6196 0.6218 0.5912 0.6651 0.3602 0.3929 0.5232 0.4894 0.4532 0.4336 0.5053 0.5668 0.808 0.385 0.5668 0.3408 0.2404 0.5957 0.442 0.4471 0.459 0.6873 0.5803 0.7137 0.5498 0.6935 0.484 0.3206 0.4726 0.7546 0.3647 0.8469 0.5379 0.6148 0.8555 0.225 0.6834 0.2851 0.83 0.4385 0.6649 0.3075 0.5235 0.34 0.5703 0.6579 0.5832 0.5148 0.2732 0.5649 0.5878 0.7429 0.7068 0.7022 0.4079 0.7433 0.6956 0.6025 0.0542 0.442 0.5783 0.2666 0.3089 0.7619 0.6845 0.5973 0.0198 0.383 0.2295 0.2783 0.0626 0.6323 0.4139 0.5068 0.8688 0.4606 0.2382 0.5137 0.8301 0.611 0.7498 0.2541 0.6397 0.3606 0.2982 0.3028 0.5516 0.4481 0.8078 0.6145 0. 0.4975 0.5821 0.1168 0.4391 0.5637 0.3855 0.8032 0.6901 0.6619 0.6121 0.8521 0.5863 0. 
0.3292 0.63 0. 0.7804 0.5043 0.5838 0.4721 0.3253 0.1151 0.4092 0.6163 0.7284 0.005 0.5184 1. 0.6492 0.2188 0.7641 0.1715 0.4215 0.8705] 2022-08-23 19:03:46 [INFO] [EVAL] Class Recall: [0.8355 0.8754 0.9622 0.8606 0.8621 0.8556 0.8498 0.8956 0.6691 0.7915 0.5753 0.6936 0.819 0.3986 0.3321 0.5768 0.7078 0.327 0.7203 0.513 0.9019 0.6631 0.7537 0.7024 0.3032 0.6082 0.7511 0.5107 0.4957 0.4224 0.281 0.7518 0.368 0.4134 0.5838 0.5715 0.5405 0.5394 0.3884 0.3782 0.2494 0.2447 0.3894 0.3709 0.5094 0.3471 0.3551 0.536 0.8477 0.6845 0.5721 0.6448 0.2318 0.3534 0.8856 0.5256 0.9533 0.4209 0.5263 0.2169 0.0898 0.5802 0.6526 0.0925 0.7316 0.8654 0.3919 0.4412 0.0388 0.4431 0.5069 0.6575 0.6064 0.4741 0.5869 0.4237 0.729 0.3067 0.6398 0.5009 0.7656 0.387 0.5068 0.0144 0.4279 0.7527 0.1128 0.0638 0.3353 0.6118 0.5909 0.0177 0.2205 0.0615 0.0285 0.0082 0.1874 0.0807 0.1072 0.2666 0.0282 0.211 0.0483 0.6786 0.2026 0.7173 0.1053 0.5874 0.2246 0.3761 0.1111 0.3767 0.22 0.5226 0.988 0. 0.519 0.8319 0.1236 0.3456 0.7037 0.0732 0.2019 0.0651 0.2642 0.1803 0.4793 0.5821 0. 0.5136 0.5683 0. 0.1519 0.1753 0.1419 0.1624 0.1936 0.0374 0.1978 0.4348 0.3446 0.0103 0.5423 0. 0.3516 0.0007 0.3846 0.0401 0.1281 0.1106] 2022-08-23 19:03:46 [INFO] [EVAL] The model with the best validation mIoU (0.3186) was saved at iter 14000. 2022-08-23 19:03:57 [INFO] [TRAIN] epoch: 12, iter: 14050/160000, loss: 0.8122, lr: 0.001105, batch_cost: 0.2176, reader_cost: 0.00363, ips: 36.7584 samples/sec | ETA 08:49:24 2022-08-23 19:04:08 [INFO] [TRAIN] epoch: 12, iter: 14100/160000, loss: 0.9141, lr: 0.001105, batch_cost: 0.2279, reader_cost: 0.00171, ips: 35.0957 samples/sec | ETA 09:14:17 2022-08-23 19:04:19 [INFO] [TRAIN] epoch: 12, iter: 14150/160000, loss: 0.9048, lr: 0.001104, batch_cost: 0.2156, reader_cost: 0.00065, ips: 37.1081 samples/sec | ETA 08:44:03 2022-08-23 19:04:30 [INFO] [TRAIN] epoch: 12, iter: 14200/160000, loss: 0.9020, lr: 0.001104, batch_cost: 0.2136, reader_cost: 0.00193, ips: 37.4489 samples/sec | ETA 08:39:06 2022-08-23 19:04:41 [INFO] [TRAIN] epoch: 12, iter: 14250/160000, loss: 0.8991, lr: 0.001103, batch_cost: 0.2215, reader_cost: 0.00035, ips: 36.1148 samples/sec | ETA 08:58:05 2022-08-23 19:04:51 [INFO] [TRAIN] epoch: 12, iter: 14300/160000, loss: 0.8638, lr: 0.001103, batch_cost: 0.1981, reader_cost: 0.00141, ips: 40.3800 samples/sec | ETA 08:01:05 2022-08-23 19:05:02 [INFO] [TRAIN] epoch: 12, iter: 14350/160000, loss: 0.9213, lr: 0.001103, batch_cost: 0.2346, reader_cost: 0.00033, ips: 34.1032 samples/sec | ETA 09:29:26 2022-08-23 19:05:14 [INFO] [TRAIN] epoch: 12, iter: 14400/160000, loss: 0.8877, lr: 0.001102, batch_cost: 0.2430, reader_cost: 0.00069, ips: 32.9199 samples/sec | ETA 09:49:42 2022-08-23 19:05:25 [INFO] [TRAIN] epoch: 12, iter: 14450/160000, loss: 0.9080, lr: 0.001102, batch_cost: 0.2151, reader_cost: 0.00038, ips: 37.1857 samples/sec | ETA 08:41:53 2022-08-23 19:05:36 [INFO] [TRAIN] epoch: 12, iter: 14500/160000, loss: 0.8517, lr: 0.001102, batch_cost: 0.2087, reader_cost: 0.00087, ips: 38.3407 samples/sec | ETA 08:25:59 2022-08-23 19:05:48 [INFO] [TRAIN] epoch: 12, iter: 14550/160000, loss: 0.9336, lr: 0.001101, batch_cost: 0.2413, reader_cost: 0.00046, ips: 33.1472 samples/sec | ETA 09:45:04 2022-08-23 19:05:59 [INFO] [TRAIN] epoch: 12, iter: 14600/160000, loss: 0.8352, lr: 0.001101, batch_cost: 0.2244, reader_cost: 0.00071, ips: 35.6505 samples/sec | ETA 09:03:47 2022-08-23 19:06:09 [INFO] [TRAIN] epoch: 12, iter: 14650/160000, loss: 0.8144, lr: 0.001100, batch_cost: 0.2069, reader_cost: 
0.00053, ips: 38.6597 samples/sec | ETA 08:21:17 2022-08-23 19:06:21 [INFO] [TRAIN] epoch: 12, iter: 14700/160000, loss: 0.8725, lr: 0.001100, batch_cost: 0.2359, reader_cost: 0.00094, ips: 33.9193 samples/sec | ETA 09:31:09 2022-08-23 19:06:31 [INFO] [TRAIN] epoch: 12, iter: 14750/160000, loss: 0.8488, lr: 0.001100, batch_cost: 0.1985, reader_cost: 0.00054, ips: 40.3064 samples/sec | ETA 08:00:29 2022-08-23 19:06:40 [INFO] [TRAIN] epoch: 12, iter: 14800/160000, loss: 0.8431, lr: 0.001099, batch_cost: 0.1885, reader_cost: 0.00144, ips: 42.4497 samples/sec | ETA 07:36:04 2022-08-23 19:06:51 [INFO] [TRAIN] epoch: 12, iter: 14850/160000, loss: 0.8660, lr: 0.001099, batch_cost: 0.2126, reader_cost: 0.00050, ips: 37.6321 samples/sec | ETA 08:34:16 2022-08-23 19:07:02 [INFO] [TRAIN] epoch: 12, iter: 14900/160000, loss: 0.8964, lr: 0.001099, batch_cost: 0.2114, reader_cost: 0.00278, ips: 37.8438 samples/sec | ETA 08:31:13 2022-08-23 19:07:10 [INFO] [TRAIN] epoch: 12, iter: 14950/160000, loss: 0.8437, lr: 0.001098, batch_cost: 0.1730, reader_cost: 0.00297, ips: 46.2536 samples/sec | ETA 06:58:07 2022-08-23 19:07:20 [INFO] [TRAIN] epoch: 12, iter: 15000/160000, loss: 0.8813, lr: 0.001098, batch_cost: 0.1969, reader_cost: 0.00060, ips: 40.6281 samples/sec | ETA 07:55:51 2022-08-23 19:07:20 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 164s - batch_cost: 0.1641 - reader cost: 6.7366e-04 2022-08-23 19:10:05 [INFO] [EVAL] #Images: 2000 mIoU: 0.3036 Acc: 0.7417 Kappa: 0.7220 Dice: 0.4262 2022-08-23 19:10:05 [INFO] [EVAL] Class IoU: [0.6542 0.7626 0.9237 0.7095 0.6473 0.7328 0.7472 0.7359 0.487 0.5925 0.4415 0.5146 0.6807 0.2136 0.2482 0.3867 0.4652 0.3144 0.5469 0.3816 0.7249 0.4878 0.5566 0.445 0.3091 0.3972 0.5683 0.2933 0.3652 0.2561 0.1335 0.4318 0.2696 0.2792 0.2524 0.3856 0.3887 0.4618 0.2656 0.3097 0.128 0.0718 0.325 0.2368 0.3086 0.0934 0.3002 0.4128 0.5795 0.5124 0.4261 0.3501 0.1373 0.1401 0.6539 0.3963 0.8424 0.1611 0.3594 0.2223 0.1058 0.2877 0.2928 0.0717 0.3683 0.6263 0.1961 0.4192 0.0433 0.2784 0.4078 0.3778 0.3854 0.1956 0.4012 0.3206 0.445 0.1826 0.3284 0.209 0.6334 0.2974 0.3362 0.021 0.3 0.5154 0.0724 0.0554 0.1626 0.4193 0.42 0.0153 0.0859 0.0403 0.1526 0.012 0.1688 0.0705 0.0478 0.3076 0. 0.1157 0.1549 0.4453 0.1662 0.6588 0.1428 0.4814 0.1356 0.0852 0.0551 0.3272 0.1397 0.5314 0.6303 0.0002 0.3992 0.5494 0.0851 0.1677 0.5089 0.0134 0.2133 0.1088 0.1781 0.1339 0.4498 0.4328 0. 0.3234 0.4142 0.0003 0.2177 0.0724 0.0192 0.1063 0.1199 0.0017 0.1337 0.3468 0.009 0.004 0.2249 0.0052 0.2236 0. 0.3137 0.0049 0.0845 0.1095] 2022-08-23 19:10:05 [INFO] [EVAL] Class Precision: [0.7576 0.8567 0.9724 0.8025 0.7062 0.8745 0.8796 0.8221 0.6182 0.7783 0.6516 0.5893 0.7822 0.5416 0.4816 0.5662 0.5365 0.7234 0.662 0.5575 0.7796 0.621 0.674 0.514 0.5055 0.5298 0.6171 0.7876 0.6937 0.3183 0.379 0.5606 0.5846 0.4101 0.3625 0.4591 0.6011 0.5636 0.4418 0.484 0.2479 0.3601 0.6915 0.6151 0.41 0.7174 0.4861 0.8386 0.6188 0.6677 0.5313 0.4142 0.3298 0.7245 0.7368 0.5706 0.9068 0.6572 0.6247 0.4585 0.2737 0.5259 0.4921 0.8384 0.445 0.6983 0.3114 0.5452 0.1151 0.6252 0.6503 0.6325 0.5824 0.2531 0.5529 0.4869 0.7825 0.7729 0.6374 0.3053 0.8267 0.7739 0.6584 0.0702 0.4503 0.6514 0.2788 0.3247 0.2098 0.5171 0.715 0.0212 0.346 0.2945 0.378 0.0626 0.3846 0.5002 0.5268 0.7788 0. 0.2183 0.653 0.8423 0.5619 0.7154 0.3044 0.6801 0.1866 0.1782 0.521 0.3991 0.5204 0.5991 0.6325 0.0578 0.6175 0.59 0.1574 0.4234 0.7598 0.3829 0.9293 0.6447 0.7543 0.6867 0.7212 0.6537 0. 
0.5746 0.4981 0.0068 0.7868 0.7124 0.6894 0.3768 0.3967 0.0384 0.5063 0.6073 0.1524 0.0068 0.6738 0.2176 0.3288 0. 0.837 0.2745 0.4978 0.9299] 2022-08-23 19:10:05 [INFO] [EVAL] Class Recall: [0.8274 0.8742 0.9486 0.8596 0.8858 0.819 0.8323 0.8753 0.6965 0.7128 0.5779 0.8023 0.8399 0.2607 0.3387 0.5495 0.7777 0.3574 0.7587 0.5475 0.9118 0.6946 0.7617 0.7681 0.4431 0.6135 0.8778 0.3185 0.4354 0.5676 0.1709 0.6527 0.3335 0.4666 0.4538 0.7067 0.5238 0.7188 0.3998 0.4625 0.2092 0.0823 0.3801 0.2779 0.555 0.097 0.4398 0.4485 0.9013 0.6879 0.6827 0.6934 0.1904 0.148 0.8532 0.5646 0.9222 0.1759 0.4584 0.3015 0.1471 0.3884 0.4196 0.0727 0.6813 0.8586 0.3463 0.6447 0.0649 0.3341 0.5223 0.4841 0.5326 0.4625 0.5938 0.4843 0.5077 0.193 0.4038 0.3984 0.7305 0.3257 0.4073 0.029 0.4734 0.7117 0.0891 0.0626 0.4193 0.6892 0.5044 0.0524 0.1026 0.0446 0.2038 0.0146 0.2312 0.0759 0.0499 0.337 0. 0.1975 0.1689 0.4859 0.1909 0.8928 0.2119 0.6223 0.3316 0.1402 0.0581 0.645 0.1604 0.8246 0.9944 0.0002 0.5303 0.8887 0.1564 0.2174 0.6064 0.0137 0.2168 0.1157 0.1891 0.1426 0.5445 0.5616 0. 0.4252 0.7111 0.0003 0.2314 0.0746 0.0194 0.1289 0.1467 0.0018 0.1537 0.447 0.0095 0.0093 0.2524 0.0053 0.4113 0. 0.3341 0.005 0.0923 0.1104] 2022-08-23 19:10:05 [INFO] [EVAL] The model with the best validation mIoU (0.3186) was saved at iter 14000. 2022-08-23 19:10:15 [INFO] [TRAIN] epoch: 12, iter: 15050/160000, loss: 0.9077, lr: 0.001097, batch_cost: 0.2056, reader_cost: 0.00374, ips: 38.9168 samples/sec | ETA 08:16:36 2022-08-23 19:10:25 [INFO] [TRAIN] epoch: 12, iter: 15100/160000, loss: 0.8749, lr: 0.001097, batch_cost: 0.2032, reader_cost: 0.00103, ips: 39.3753 samples/sec | ETA 08:10:39 2022-08-23 19:10:37 [INFO] [TRAIN] epoch: 12, iter: 15150/160000, loss: 0.9093, lr: 0.001097, batch_cost: 0.2288, reader_cost: 0.00094, ips: 34.9586 samples/sec | ETA 09:12:27 2022-08-23 19:10:53 [INFO] [TRAIN] epoch: 13, iter: 15200/160000, loss: 0.8405, lr: 0.001096, batch_cost: 0.3227, reader_cost: 0.10295, ips: 24.7884 samples/sec | ETA 12:58:51 2022-08-23 19:11:04 [INFO] [TRAIN] epoch: 13, iter: 15250/160000, loss: 0.8402, lr: 0.001096, batch_cost: 0.2253, reader_cost: 0.00716, ips: 35.5094 samples/sec | ETA 09:03:31 2022-08-23 19:11:16 [INFO] [TRAIN] epoch: 13, iter: 15300/160000, loss: 0.8424, lr: 0.001096, batch_cost: 0.2367, reader_cost: 0.00057, ips: 33.7920 samples/sec | ETA 09:30:56 2022-08-23 19:11:27 [INFO] [TRAIN] epoch: 13, iter: 15350/160000, loss: 0.8540, lr: 0.001095, batch_cost: 0.2153, reader_cost: 0.01356, ips: 37.1584 samples/sec | ETA 08:39:02 2022-08-23 19:11:39 [INFO] [TRAIN] epoch: 13, iter: 15400/160000, loss: 0.7877, lr: 0.001095, batch_cost: 0.2447, reader_cost: 0.00089, ips: 32.6928 samples/sec | ETA 09:49:43 2022-08-23 19:11:50 [INFO] [TRAIN] epoch: 13, iter: 15450/160000, loss: 0.8571, lr: 0.001094, batch_cost: 0.2241, reader_cost: 0.00045, ips: 35.6966 samples/sec | ETA 08:59:55 2022-08-23 19:12:02 [INFO] [TRAIN] epoch: 13, iter: 15500/160000, loss: 0.7875, lr: 0.001094, batch_cost: 0.2296, reader_cost: 0.00042, ips: 34.8429 samples/sec | ETA 09:12:57 2022-08-23 19:12:14 [INFO] [TRAIN] epoch: 13, iter: 15550/160000, loss: 0.8864, lr: 0.001094, batch_cost: 0.2494, reader_cost: 0.00048, ips: 32.0742 samples/sec | ETA 10:00:28 2022-08-23 19:12:25 [INFO] [TRAIN] epoch: 13, iter: 15600/160000, loss: 0.8576, lr: 0.001093, batch_cost: 0.2192, reader_cost: 0.00439, ips: 36.4918 samples/sec | ETA 08:47:36 2022-08-23 19:12:37 [INFO] [TRAIN] epoch: 13, iter: 15650/160000, loss: 0.8405, lr: 0.001093, batch_cost: 0.2305, 
reader_cost: 0.00034, ips: 34.7141 samples/sec | ETA 09:14:26 2022-08-23 19:12:48 [INFO] [TRAIN] epoch: 13, iter: 15700/160000, loss: 0.8552, lr: 0.001092, batch_cost: 0.2329, reader_cost: 0.00058, ips: 34.3493 samples/sec | ETA 09:20:07 2022-08-23 19:12:58 [INFO] [TRAIN] epoch: 13, iter: 15750/160000, loss: 0.8551, lr: 0.001092, batch_cost: 0.1996, reader_cost: 0.00097, ips: 40.0846 samples/sec | ETA 07:59:49 2022-08-23 19:13:08 [INFO] [TRAIN] epoch: 13, iter: 15800/160000, loss: 0.9114, lr: 0.001092, batch_cost: 0.1866, reader_cost: 0.01719, ips: 42.8818 samples/sec | ETA 07:28:21 2022-08-23 19:13:17 [INFO] [TRAIN] epoch: 13, iter: 15850/160000, loss: 0.8391, lr: 0.001091, batch_cost: 0.1830, reader_cost: 0.00086, ips: 43.7277 samples/sec | ETA 07:19:32 2022-08-23 19:13:25 [INFO] [TRAIN] epoch: 13, iter: 15900/160000, loss: 0.8761, lr: 0.001091, batch_cost: 0.1654, reader_cost: 0.00232, ips: 48.3568 samples/sec | ETA 06:37:19 2022-08-23 19:13:34 [INFO] [TRAIN] epoch: 13, iter: 15950/160000, loss: 0.8125, lr: 0.001091, batch_cost: 0.1783, reader_cost: 0.00134, ips: 44.8777 samples/sec | ETA 07:07:58 2022-08-23 19:13:43 [INFO] [TRAIN] epoch: 13, iter: 16000/160000, loss: 0.9458, lr: 0.001090, batch_cost: 0.1793, reader_cost: 0.00151, ips: 44.6125 samples/sec | ETA 07:10:22 2022-08-23 19:13:43 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 170s - batch_cost: 0.1696 - reader cost: 8.6243e-04 2022-08-23 19:16:33 [INFO] [EVAL] #Images: 2000 mIoU: 0.3154 Acc: 0.7463 Kappa: 0.7269 Dice: 0.4445 2022-08-23 19:16:33 [INFO] [EVAL] Class IoU: [0.6575 0.7694 0.9233 0.6925 0.6797 0.7495 0.7488 0.6859 0.4792 0.6514 0.4473 0.4944 0.6847 0.2456 0.2585 0.3885 0.5003 0.4487 0.5489 0.3757 0.734 0.4326 0.5769 0.4364 0.2732 0.3834 0.4939 0.3368 0.3281 0.2882 0.2072 0.4779 0.2909 0.3295 0.3813 0.3841 0.3846 0.4554 0.2124 0.2808 0.1628 0.0983 0.3155 0.2385 0.2679 0.2369 0.278 0.4068 0.6229 0.4771 0.4402 0.3692 0.1122 0.2501 0.7128 0.3387 0.7086 0.3453 0.3053 0.2416 0.1261 0.3056 0.1901 0.0287 0.3577 0.6009 0.2529 0.2748 0.0857 0.3241 0.3515 0.437 0.3705 0.2121 0.373 0.3302 0.5089 0.2027 0.2851 0.1152 0.5255 0.3682 0.2566 0.0163 0.3308 0.4684 0.1165 0.055 0.2712 0.4324 0.4003 0.0745 0.0761 0.0704 0.074 0.0329 0.1693 0.0176 0.0454 0.3602 0.158 0.1012 0.1316 0.2646 0.1437 0.4783 0.1558 0.465 0.1152 0.236 0.0448 0.4708 0.1301 0.5916 0.6971 0. 0.424 0.6092 0.0365 0.3796 0.4948 0.0077 0.2024 0.1279 0.2353 0.2167 0.3803 0.4238 0.0002 0.2951 0.4374 0.0068 0.2236 0.155 0.0934 0.165 0.1346 0.0135 0.1026 0.295 0.4382 0.031 0.2481 0.2728 0.221 0.0107 0.2813 0.0056 0.0703 0.1464] 2022-08-23 19:16:33 [INFO] [EVAL] Class Precision: [0.7653 0.8613 0.9677 0.7641 0.7466 0.849 0.8212 0.897 0.6474 0.7167 0.706 0.7496 0.7734 0.5417 0.5042 0.5255 0.6286 0.7095 0.6799 0.5615 0.8261 0.6258 0.7945 0.5621 0.4529 0.515 0.6083 0.7208 0.7808 0.4604 0.3678 0.6273 0.4202 0.4354 0.4619 0.4441 0.5912 0.8477 0.4064 0.5337 0.4087 0.5341 0.5992 0.4531 0.3904 0.3499 0.4539 0.5329 0.7103 0.6181 0.5323 0.5161 0.3687 0.6409 0.7431 0.4752 0.728 0.5212 0.6771 0.7588 0.1934 0.4611 0.2066 0.8114 0.4131 0.6645 0.4273 0.3607 0.2528 0.5127 0.6835 0.7063 0.5211 0.2328 0.7117 0.6216 0.7047 0.7634 0.7794 0.5337 0.622 0.6618 0.8186 0.0331 0.4398 0.6958 0.4755 0.3607 0.5648 0.564 0.5183 0.0833 0.4959 0.3064 0.1896 0.078 0.413 0.1545 0.3697 0.5224 0.5729 0.1438 0.4457 0.6939 0.9366 0.5659 0.3325 0.7154 0.2891 0.3108 0.3651 0.8672 0.3233 0.6535 0.7004 0. 
0.6085 0.695 0.0836 0.5106 0.6726 0.8683 0.8911 0.7491 0.6022 0.5206 0.8036 0.6073 0.008 0.469 0.5417 0.1357 0.6335 0.5844 0.499 0.3918 0.3945 0.2587 0.4271 0.7086 0.4841 0.0599 0.6518 0.4857 0.707 0.024 0.8931 0.3647 0.1838 0.8313] 2022-08-23 19:16:33 [INFO] [EVAL] Class Recall: [0.8236 0.8782 0.9527 0.8807 0.8836 0.8648 0.8947 0.7446 0.6484 0.8773 0.5497 0.5922 0.8565 0.3101 0.3467 0.5983 0.7102 0.5497 0.7401 0.5318 0.8681 0.5836 0.6781 0.6612 0.4079 0.5999 0.7241 0.3873 0.3613 0.4352 0.3218 0.6673 0.4859 0.5753 0.686 0.7397 0.524 0.496 0.3078 0.3721 0.2129 0.1075 0.3999 0.335 0.4605 0.4231 0.4176 0.6322 0.835 0.6766 0.7179 0.5647 0.1389 0.2908 0.9459 0.5412 0.9637 0.5058 0.3574 0.2617 0.2661 0.4755 0.7034 0.0289 0.727 0.8626 0.3825 0.5357 0.1147 0.4683 0.4198 0.534 0.5618 0.7043 0.4394 0.4133 0.6468 0.2162 0.3101 0.1281 0.772 0.4536 0.2721 0.031 0.5718 0.5891 0.1337 0.061 0.3429 0.6496 0.6373 0.4144 0.0825 0.0838 0.1082 0.0539 0.223 0.0194 0.0492 0.5371 0.1791 0.255 0.1573 0.2996 0.1451 0.7556 0.2267 0.5706 0.1607 0.4952 0.0486 0.5073 0.1788 0.8618 0.9932 0. 0.583 0.8314 0.0608 0.5968 0.6518 0.0077 0.2075 0.1336 0.2786 0.2708 0.4192 0.5838 0.0002 0.4431 0.6943 0.0071 0.2568 0.1742 0.103 0.2218 0.1696 0.0141 0.119 0.3357 0.8221 0.0604 0.2859 0.3836 0.2433 0.0189 0.2911 0.0056 0.1022 0.1509] 2022-08-23 19:16:33 [INFO] [EVAL] The model with the best validation mIoU (0.3186) was saved at iter 14000. 2022-08-23 19:16:43 [INFO] [TRAIN] epoch: 13, iter: 16050/160000, loss: 0.8469, lr: 0.001090, batch_cost: 0.2072, reader_cost: 0.00467, ips: 38.6052 samples/sec | ETA 08:17:10 2022-08-23 19:16:53 [INFO] [TRAIN] epoch: 13, iter: 16100/160000, loss: 0.7843, lr: 0.001089, batch_cost: 0.2011, reader_cost: 0.00103, ips: 39.7806 samples/sec | ETA 08:02:18 2022-08-23 19:17:06 [INFO] [TRAIN] epoch: 13, iter: 16150/160000, loss: 0.8441, lr: 0.001089, batch_cost: 0.2453, reader_cost: 0.00096, ips: 32.6160 samples/sec | ETA 09:48:03 2022-08-23 19:17:16 [INFO] [TRAIN] epoch: 13, iter: 16200/160000, loss: 0.8660, lr: 0.001089, batch_cost: 0.1984, reader_cost: 0.00071, ips: 40.3273 samples/sec | ETA 07:55:26 2022-08-23 19:17:26 [INFO] [TRAIN] epoch: 13, iter: 16250/160000, loss: 0.8835, lr: 0.001088, batch_cost: 0.2059, reader_cost: 0.00055, ips: 38.8523 samples/sec | ETA 08:13:19 2022-08-23 19:17:35 [INFO] [TRAIN] epoch: 13, iter: 16300/160000, loss: 0.8215, lr: 0.001088, batch_cost: 0.1849, reader_cost: 0.00394, ips: 43.2631 samples/sec | ETA 07:22:52 2022-08-23 19:17:46 [INFO] [TRAIN] epoch: 13, iter: 16350/160000, loss: 0.8457, lr: 0.001088, batch_cost: 0.2119, reader_cost: 0.00043, ips: 37.7510 samples/sec | ETA 08:27:21 2022-08-23 19:17:56 [INFO] [TRAIN] epoch: 13, iter: 16400/160000, loss: 0.8762, lr: 0.001087, batch_cost: 0.2059, reader_cost: 0.00082, ips: 38.8626 samples/sec | ETA 08:12:40 2022-08-23 19:18:13 [INFO] [TRAIN] epoch: 14, iter: 16450/160000, loss: 0.8302, lr: 0.001087, batch_cost: 0.3398, reader_cost: 0.12679, ips: 23.5421 samples/sec | ETA 13:33:00 2022-08-23 19:18:23 [INFO] [TRAIN] epoch: 14, iter: 16500/160000, loss: 0.8042, lr: 0.001086, batch_cost: 0.2072, reader_cost: 0.00068, ips: 38.6113 samples/sec | ETA 08:15:32 2022-08-23 19:18:34 [INFO] [TRAIN] epoch: 14, iter: 16550/160000, loss: 0.8234, lr: 0.001086, batch_cost: 0.2081, reader_cost: 0.00066, ips: 38.4474 samples/sec | ETA 08:17:28 2022-08-23 19:18:45 [INFO] [TRAIN] epoch: 14, iter: 16600/160000, loss: 0.8168, lr: 0.001086, batch_cost: 0.2328, reader_cost: 0.00042, ips: 34.3641 samples/sec | ETA 09:16:23 2022-08-23 19:18:56 
[INFO] [TRAIN] epoch: 14, iter: 16650/160000, loss: 0.7661, lr: 0.001085, batch_cost: 0.2011, reader_cost: 0.00054, ips: 39.7869 samples/sec | ETA 08:00:23 2022-08-23 19:19:07 [INFO] [TRAIN] epoch: 14, iter: 16700/160000, loss: 0.8349, lr: 0.001085, batch_cost: 0.2263, reader_cost: 0.00053, ips: 35.3457 samples/sec | ETA 09:00:33 2022-08-23 19:19:16 [INFO] [TRAIN] epoch: 14, iter: 16750/160000, loss: 0.8471, lr: 0.001085, batch_cost: 0.1854, reader_cost: 0.00051, ips: 43.1613 samples/sec | ETA 07:22:31 2022-08-23 19:19:24 [INFO] [TRAIN] epoch: 14, iter: 16800/160000, loss: 0.8180, lr: 0.001084, batch_cost: 0.1644, reader_cost: 0.00053, ips: 48.6722 samples/sec | ETA 06:32:17 2022-08-23 19:19:33 [INFO] [TRAIN] epoch: 14, iter: 16850/160000, loss: 0.8727, lr: 0.001084, batch_cost: 0.1790, reader_cost: 0.00052, ips: 44.6891 samples/sec | ETA 07:07:05 2022-08-23 19:19:43 [INFO] [TRAIN] epoch: 14, iter: 16900/160000, loss: 0.8310, lr: 0.001083, batch_cost: 0.1949, reader_cost: 0.00044, ips: 41.0386 samples/sec | ETA 07:44:55 2022-08-23 19:19:52 [INFO] [TRAIN] epoch: 14, iter: 16950/160000, loss: 0.8378, lr: 0.001083, batch_cost: 0.1796, reader_cost: 0.00052, ips: 44.5330 samples/sec | ETA 07:08:17 2022-08-23 19:20:02 [INFO] [TRAIN] epoch: 14, iter: 17000/160000, loss: 0.8817, lr: 0.001083, batch_cost: 0.1898, reader_cost: 0.00064, ips: 42.1392 samples/sec | ETA 07:32:28 2022-08-23 19:20:02 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 153s - batch_cost: 0.1524 - reader cost: 5.2238e-04 2022-08-23 19:22:34 [INFO] [EVAL] #Images: 2000 mIoU: 0.3150 Acc: 0.7470 Kappa: 0.7276 Dice: 0.4426 2022-08-23 19:22:34 [INFO] [EVAL] Class IoU: [0.6574 0.7575 0.9235 0.7163 0.6711 0.7347 0.7542 0.7499 0.4757 0.6565 0.4641 0.5362 0.6937 0.2354 0.219 0.3956 0.4606 0.4374 0.5737 0.3524 0.7339 0.4337 0.5698 0.4418 0.3154 0.4093 0.4437 0.3918 0.3648 0.2394 0.2244 0.4966 0.2235 0.287 0.2431 0.3308 0.3877 0.5191 0.192 0.2867 0.1262 0.1072 0.3489 0.235 0.3296 0.2245 0.2981 0.4375 0.5868 0.4369 0.4699 0.3111 0.1364 0.1406 0.7235 0.414 0.8464 0.2832 0.427 0.279 0.0948 0.3458 0.2618 0.1145 0.4179 0.6288 0.238 0.3576 0.0567 0.3214 0.3706 0.4354 0.3761 0.2027 0.3812 0.3576 0.4881 0.2222 0.4939 0.2325 0.5312 0.3453 0.2735 0.0155 0.2909 0.4971 0.0535 0.0929 0.1373 0.4216 0.4348 0.0175 0.1569 0.0454 0.0765 0.0337 0.2209 0.0081 0.0947 0.3501 0.2253 0.1165 0.1202 0.5006 0.1293 0.5485 0.2628 0.4995 0.1677 0.2131 0.099 0.2139 0.1513 0.3795 0.6459 0.017 0.4132 0.5947 0.0922 0.2084 0.4839 0.0131 0.2364 0.1232 0.2454 0.1858 0.4175 0.417 0.0002 0.3017 0.3603 0.0032 0.0855 0.0812 0.0647 0.1722 0.1505 0.0154 0.1126 0.3304 0.2453 0.0054 0.2392 0.0009 0.2216 0. 
0.2286 0.0119 0.0941 0.1419] 2022-08-23 19:22:34 [INFO] [EVAL] Class Precision: [0.75 0.8767 0.9593 0.8066 0.7691 0.9002 0.8849 0.8084 0.5898 0.7195 0.6942 0.6329 0.7881 0.5046 0.5094 0.5454 0.5266 0.6651 0.7717 0.5825 0.8117 0.5806 0.7457 0.6191 0.507 0.4468 0.5504 0.5943 0.7011 0.4124 0.292 0.7286 0.5094 0.3783 0.5308 0.4761 0.5555 0.7563 0.3402 0.5966 0.451 0.3728 0.6155 0.5741 0.5199 0.5527 0.4607 0.615 0.6444 0.5256 0.6901 0.3809 0.3032 0.632 0.8262 0.5642 0.8878 0.5954 0.6198 0.4449 0.1922 0.4229 0.3376 0.7256 0.5183 0.6952 0.4907 0.5279 0.591 0.522 0.6911 0.6515 0.5688 0.2908 0.4699 0.5994 0.7818 0.606 0.6631 0.5464 0.6274 0.5621 0.7847 0.1481 0.4435 0.677 0.646 0.2704 0.3354 0.7255 0.614 0.0275 0.2803 0.2598 0.4936 0.0884 0.2992 0.1403 0.5288 0.5881 0.6538 0.1732 0.5544 0.8864 0.8638 0.5841 0.4936 0.7631 0.2702 0.3562 0.305 0.2726 0.4015 0.8806 0.6478 0.5876 0.6408 0.6878 0.2036 0.6572 0.6175 0.2908 0.8119 0.5778 0.5301 0.5926 0.7967 0.5109 0.0015 0.6668 0.4303 0.063 0.877 0.6582 0.6977 0.4372 0.3525 0.1912 0.5364 0.6523 0.5307 0.0072 0.6611 0.0079 0.8027 0. 0.8671 0.2997 0.2831 0.9213] 2022-08-23 19:22:34 [INFO] [EVAL] Class Recall: [0.8419 0.8478 0.9612 0.8648 0.8404 0.7999 0.8362 0.912 0.7109 0.8823 0.5834 0.7782 0.8529 0.3061 0.2776 0.5901 0.7861 0.5609 0.6909 0.4715 0.8845 0.6317 0.7073 0.6067 0.4549 0.8301 0.696 0.5349 0.4319 0.3633 0.4919 0.6093 0.2849 0.5431 0.3096 0.5201 0.5621 0.6233 0.3059 0.3556 0.1491 0.1308 0.4461 0.2846 0.4737 0.2743 0.4579 0.6025 0.8679 0.7214 0.5956 0.6296 0.1987 0.1531 0.8534 0.6086 0.9478 0.3507 0.5785 0.428 0.1575 0.6546 0.5382 0.1197 0.6834 0.8682 0.3161 0.5258 0.059 0.4554 0.4442 0.5676 0.5261 0.4007 0.6687 0.4699 0.5651 0.2597 0.6593 0.288 0.776 0.4723 0.2957 0.017 0.458 0.6516 0.0551 0.1239 0.1887 0.5016 0.5984 0.0456 0.2628 0.0522 0.083 0.0517 0.4577 0.0085 0.1035 0.4638 0.2558 0.2626 0.1331 0.5349 0.132 0.8999 0.3598 0.5912 0.3063 0.3465 0.1278 0.4985 0.1954 0.4001 0.9955 0.0172 0.5379 0.8146 0.1441 0.2338 0.6909 0.0135 0.2501 0.1353 0.3135 0.2131 0.4673 0.6941 0.0002 0.3553 0.6889 0.0034 0.0865 0.0848 0.0666 0.2213 0.208 0.0165 0.1247 0.401 0.3133 0.0218 0.2726 0.001 0.2343 0. 0.2369 0.0122 0.1235 0.1437] 2022-08-23 19:22:34 [INFO] [EVAL] The model with the best validation mIoU (0.3186) was saved at iter 14000. 
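The per-class vectors in these [EVAL] blocks are internally consistent: the summary mIoU is the unweighted mean of the Class IoU list over the 150 ADE20K classes, and each class IoU follows from its precision and recall via 1/IoU = 1/P + 1/R − 1. Below is a minimal sketch (plain NumPy, not part of the training script; the first four class values are copied from the 19:22:34 evaluation above purely as a consistency check):

```python
import numpy as np

# First four classes of the 2022-08-23 19:22:34 [EVAL] block above,
# copied from the log purely as a consistency check.
class_iou       = np.array([0.6574, 0.7575, 0.9235, 0.7163])
class_precision = np.array([0.7500, 0.8767, 0.9593, 0.8066])
class_recall    = np.array([0.8419, 0.8478, 0.9612, 0.8648])

# With per-class pixel counts TP/FP/FN:
#   IoU = TP/(TP+FP+FN),  P = TP/(TP+FP),  R = TP/(TP+FN)
# => 1/P + 1/R - 1 = (TP+FP+FN)/TP = 1/IoU
iou_from_pr = 1.0 / (1.0 / class_precision + 1.0 / class_recall - 1.0)
print(np.round(iou_from_pr, 4))        # -> [0.6574 0.7575 0.9235 0.7163]

# The reported mIoU (0.3150 for this evaluation) is simply
# class_iou.mean() taken over all 150 classes, not just these four.
```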
2022-08-23 19:22:44 [INFO] [TRAIN] epoch: 14, iter: 17050/160000, loss: 0.8660, lr: 0.001082, batch_cost: 0.1952, reader_cost: 0.00507, ips: 40.9906 samples/sec | ETA 07:44:59 2022-08-23 19:22:54 [INFO] [TRAIN] epoch: 14, iter: 17100/160000, loss: 0.8644, lr: 0.001082, batch_cost: 0.1976, reader_cost: 0.00089, ips: 40.4895 samples/sec | ETA 07:50:34 2022-08-23 19:23:05 [INFO] [TRAIN] epoch: 14, iter: 17150/160000, loss: 0.8502, lr: 0.001082, batch_cost: 0.2087, reader_cost: 0.00044, ips: 38.3329 samples/sec | ETA 08:16:52 2022-08-23 19:23:16 [INFO] [TRAIN] epoch: 14, iter: 17200/160000, loss: 0.8171, lr: 0.001081, batch_cost: 0.2251, reader_cost: 0.00086, ips: 35.5407 samples/sec | ETA 08:55:43 2022-08-23 19:23:26 [INFO] [TRAIN] epoch: 14, iter: 17250/160000, loss: 0.8233, lr: 0.001081, batch_cost: 0.2118, reader_cost: 0.00040, ips: 37.7687 samples/sec | ETA 08:23:56 2022-08-23 19:23:36 [INFO] [TRAIN] epoch: 14, iter: 17300/160000, loss: 0.8230, lr: 0.001080, batch_cost: 0.1942, reader_cost: 0.00101, ips: 41.1929 samples/sec | ETA 07:41:53 2022-08-23 19:23:47 [INFO] [TRAIN] epoch: 14, iter: 17350/160000, loss: 0.8661, lr: 0.001080, batch_cost: 0.2159, reader_cost: 0.00092, ips: 37.0591 samples/sec | ETA 08:33:14 2022-08-23 19:23:58 [INFO] [TRAIN] epoch: 14, iter: 17400/160000, loss: 0.8414, lr: 0.001080, batch_cost: 0.2230, reader_cost: 0.00057, ips: 35.8790 samples/sec | ETA 08:49:55 2022-08-23 19:24:09 [INFO] [TRAIN] epoch: 14, iter: 17450/160000, loss: 0.8398, lr: 0.001079, batch_cost: 0.2127, reader_cost: 0.00047, ips: 37.6067 samples/sec | ETA 08:25:24 2022-08-23 19:24:19 [INFO] [TRAIN] epoch: 14, iter: 17500/160000, loss: 0.8686, lr: 0.001079, batch_cost: 0.2126, reader_cost: 0.00045, ips: 37.6342 samples/sec | ETA 08:24:51 2022-08-23 19:24:30 [INFO] [TRAIN] epoch: 14, iter: 17550/160000, loss: 0.8432, lr: 0.001078, batch_cost: 0.2097, reader_cost: 0.00078, ips: 38.1445 samples/sec | ETA 08:17:55 2022-08-23 19:24:40 [INFO] [TRAIN] epoch: 14, iter: 17600/160000, loss: 0.8124, lr: 0.001078, batch_cost: 0.1989, reader_cost: 0.00094, ips: 40.2246 samples/sec | ETA 07:52:00 2022-08-23 19:24:50 [INFO] [TRAIN] epoch: 14, iter: 17650/160000, loss: 0.8349, lr: 0.001078, batch_cost: 0.2099, reader_cost: 0.00094, ips: 38.1168 samples/sec | ETA 08:17:56 2022-08-23 19:25:04 [INFO] [TRAIN] epoch: 15, iter: 17700/160000, loss: 0.8917, lr: 0.001077, batch_cost: 0.2666, reader_cost: 0.05340, ips: 30.0053 samples/sec | ETA 10:32:19 2022-08-23 19:25:13 [INFO] [TRAIN] epoch: 15, iter: 17750/160000, loss: 0.7833, lr: 0.001077, batch_cost: 0.1856, reader_cost: 0.00831, ips: 43.1126 samples/sec | ETA 07:19:55 2022-08-23 19:25:23 [INFO] [TRAIN] epoch: 15, iter: 17800/160000, loss: 0.7859, lr: 0.001077, batch_cost: 0.2019, reader_cost: 0.00050, ips: 39.6246 samples/sec | ETA 07:58:29 2022-08-23 19:25:32 [INFO] [TRAIN] epoch: 15, iter: 17850/160000, loss: 0.8172, lr: 0.001076, batch_cost: 0.1712, reader_cost: 0.00087, ips: 46.7312 samples/sec | ETA 06:45:34 2022-08-23 19:25:40 [INFO] [TRAIN] epoch: 15, iter: 17900/160000, loss: 0.8339, lr: 0.001076, batch_cost: 0.1785, reader_cost: 0.00064, ips: 44.8100 samples/sec | ETA 07:02:49 2022-08-23 19:25:49 [INFO] [TRAIN] epoch: 15, iter: 17950/160000, loss: 0.7498, lr: 0.001075, batch_cost: 0.1799, reader_cost: 0.00047, ips: 44.4617 samples/sec | ETA 07:05:59 2022-08-23 19:25:58 [INFO] [TRAIN] epoch: 15, iter: 18000/160000, loss: 0.7948, lr: 0.001075, batch_cost: 0.1687, reader_cost: 0.00076, ips: 47.4297 samples/sec | ETA 06:39:11 2022-08-23 19:25:58 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 162s - batch_cost: 0.1620 - reader cost: 0.0013 2022-08-23 19:28:40 [INFO] [EVAL] #Images: 2000 mIoU: 0.3228 Acc: 0.7517 Kappa: 0.7324 Dice: 0.4499 2022-08-23 19:28:40 [INFO] [EVAL] Class IoU: [0.6631 0.7682 0.9287 0.7173 0.6779 0.7509 0.7557 0.7587 0.4986 0.5831 0.4539 0.5217 0.6897 0.2756 0.1874 0.3564 0.5168 0.3854 0.5704 0.4054 0.7286 0.4137 0.5894 0.4695 0.3504 0.473 0.4928 0.37 0.3529 0.2306 0.225 0.485 0.2706 0.3145 0.3836 0.4015 0.3745 0.514 0.2398 0.3068 0.1385 0.1012 0.3282 0.2379 0.2878 0.16 0.231 0.431 0.6525 0.4856 0.4683 0.3471 0.1046 0.2433 0.7738 0.3322 0.8253 0.2806 0.4073 0.3323 0.1113 0.1504 0.3084 0.087 0.3664 0.6699 0.2068 0.3406 0.0135 0.2962 0.3782 0.4223 0.4146 0.2001 0.4451 0.3424 0.615 0.2182 0.2515 0.2409 0.5833 0.3337 0.3149 0.0171 0.2563 0.4975 0.1011 0.0594 0.3214 0.4376 0.3848 0.0543 0.0686 0.0659 0.026 0.0193 0.1041 0.0804 0.165 0.3407 0.0998 0.1001 0.0738 0.6838 0.1857 0.4194 0.1835 0.4656 0.0466 0.0949 0.0971 0.2734 0.106 0.5618 0.8103 0.0036 0.261 0.5922 0.1122 0.2466 0.4165 0.0217 0.3258 0.1463 0.2325 0.1586 0.479 0.4098 0. 0.3737 0.5392 0.0133 0.2173 0.2637 0.1337 0.1613 0.1501 0.0573 0.1516 0.3664 0.2457 0.0265 0.3325 0.0021 0.2994 0. 0.2486 0.0172 0.1018 0.1753] 2022-08-23 19:28:40 [INFO] [EVAL] Class Precision: [0.7537 0.85 0.9691 0.8092 0.7791 0.8418 0.8766 0.8325 0.6204 0.7293 0.6589 0.6009 0.8214 0.4317 0.5591 0.6374 0.6989 0.7498 0.7492 0.5735 0.7804 0.6357 0.7499 0.5947 0.5069 0.5554 0.5378 0.7181 0.6275 0.5171 0.401 0.5875 0.6119 0.4081 0.4601 0.5212 0.6363 0.7897 0.352 0.5617 0.2164 0.4288 0.5665 0.6266 0.449 0.4776 0.306 0.8062 0.719 0.6575 0.6625 0.4292 0.2601 0.4328 0.8333 0.6932 0.8589 0.56 0.6727 0.6053 0.2424 0.2661 0.5517 0.6111 0.4234 0.793 0.4384 0.5101 0.1701 0.5178 0.571 0.654 0.5526 0.2812 0.6667 0.5027 0.8631 0.7537 0.5754 0.592 0.7072 0.538 0.7716 0.0713 0.4321 0.7506 0.3583 0.3513 0.517 0.625 0.5036 0.0732 0.3366 0.2805 0.2576 0.1345 0.4373 0.3109 0.7594 0.7479 0.794 0.1532 0.459 0.9662 0.7013 0.448 0.5125 0.5603 0.1647 0.1956 0.2812 0.3988 0.5371 0.7218 0.855 0.1638 0.7026 0.7036 0.1508 0.6396 0.7375 0.3957 0.7203 0.5952 0.708 0.6342 0.8108 0.5099 0. 0.8527 0.7023 0.2559 0.5814 0.4892 0.3195 0.3032 0.4573 0.0722 0.5199 0.5934 0.4721 0.0368 0.6594 0.0694 0.5778 0. 0.895 0.3106 0.3009 0.7798] 2022-08-23 19:28:40 [INFO] [EVAL] Class Recall: [0.8467 0.8887 0.9571 0.8634 0.8391 0.8742 0.8456 0.8954 0.7174 0.7441 0.5932 0.7983 0.8114 0.4326 0.2199 0.447 0.6648 0.4422 0.705 0.5804 0.9165 0.5422 0.7336 0.6904 0.5316 0.7613 0.8547 0.4329 0.4464 0.2939 0.339 0.7356 0.3267 0.5781 0.6974 0.6361 0.4765 0.5955 0.4294 0.4034 0.2777 0.117 0.4383 0.2773 0.4451 0.194 0.4853 0.4808 0.876 0.65 0.6151 0.6448 0.149 0.3571 0.9154 0.3894 0.9547 0.3599 0.508 0.4242 0.1706 0.2572 0.4115 0.0921 0.7312 0.8118 0.2814 0.5062 0.0145 0.4091 0.5283 0.5437 0.6241 0.4098 0.5725 0.5177 0.6814 0.235 0.3088 0.2888 0.769 0.4677 0.3473 0.0219 0.3864 0.596 0.1235 0.0667 0.4594 0.5934 0.62 0.174 0.0793 0.0793 0.0281 0.022 0.1202 0.0978 0.1742 0.3849 0.1024 0.2241 0.0808 0.7006 0.2017 0.868 0.2223 0.7337 0.0609 0.1557 0.1291 0.465 0.1167 0.7171 0.9394 0.0036 0.2934 0.789 0.3049 0.2865 0.4889 0.0225 0.373 0.1624 0.2572 0.1745 0.5393 0.6763 0. 0.3995 0.699 0.0139 0.2576 0.3638 0.187 0.2563 0.1826 0.2169 0.1762 0.4893 0.3388 0.0872 0.4015 0.0021 0.3832 0. 0.2561 0.0178 0.1333 0.1844] 2022-08-23 19:28:41 [INFO] [EVAL] The model with the best validation mIoU (0.3228) was saved at iter 18000. 
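The trailing line of each evaluation ("The model with the best validation mIoU (…) was saved at iter …") reflects simple best-checkpoint bookkeeping: the best mIoU seen so far is carried forward and only overwritten when a new evaluation beats it, as happens at iter 18000 above. A hypothetical sketch of that logic (not PaddleSeg's actual code; the save step is only indicated as a comment):

```python
# Best value carried in from the earlier evaluation at iter 14000.
best_miou, best_iter = 0.3186, 14000

# mIoU of the three evaluations shown above (iters 16000, 17000, 18000).
for cur_iter, miou in [(16000, 0.3154), (17000, 0.3150), (18000, 0.3228)]:
    if miou > best_miou:
        best_miou, best_iter = miou, cur_iter
        # save_checkpoint(model, ".../best_model")   # hypothetical save step

print(best_miou, best_iter)   # -> 0.3228 18000, matching the 19:28:41 line
```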
2022-08-23 19:28:52 [INFO] [TRAIN] epoch: 15, iter: 18050/160000, loss: 0.8231, lr: 0.001075, batch_cost: 0.2298, reader_cost: 0.00457, ips: 34.8063 samples/sec | ETA 09:03:46 2022-08-23 19:29:03 [INFO] [TRAIN] epoch: 15, iter: 18100/160000, loss: 0.8116, lr: 0.001074, batch_cost: 0.2104, reader_cost: 0.00171, ips: 38.0223 samples/sec | ETA 08:17:36 2022-08-23 19:29:13 [INFO] [TRAIN] epoch: 15, iter: 18150/160000, loss: 0.7705, lr: 0.001074, batch_cost: 0.2147, reader_cost: 0.00095, ips: 37.2593 samples/sec | ETA 08:27:36 2022-08-23 19:29:25 [INFO] [TRAIN] epoch: 15, iter: 18200/160000, loss: 0.8627, lr: 0.001074, batch_cost: 0.2248, reader_cost: 0.00045, ips: 35.5925 samples/sec | ETA 08:51:11 2022-08-23 19:29:35 [INFO] [TRAIN] epoch: 15, iter: 18250/160000, loss: 0.8210, lr: 0.001073, batch_cost: 0.2185, reader_cost: 0.00035, ips: 36.6172 samples/sec | ETA 08:36:09 2022-08-23 19:29:47 [INFO] [TRAIN] epoch: 15, iter: 18300/160000, loss: 0.8972, lr: 0.001073, batch_cost: 0.2263, reader_cost: 0.00038, ips: 35.3457 samples/sec | ETA 08:54:31 2022-08-23 19:29:58 [INFO] [TRAIN] epoch: 15, iter: 18350/160000, loss: 0.8596, lr: 0.001072, batch_cost: 0.2174, reader_cost: 0.00493, ips: 36.8030 samples/sec | ETA 08:33:10 2022-08-23 19:30:09 [INFO] [TRAIN] epoch: 15, iter: 18400/160000, loss: 0.8305, lr: 0.001072, batch_cost: 0.2240, reader_cost: 0.00047, ips: 35.7084 samples/sec | ETA 08:48:43 2022-08-23 19:30:20 [INFO] [TRAIN] epoch: 15, iter: 18450/160000, loss: 0.7893, lr: 0.001072, batch_cost: 0.2133, reader_cost: 0.00319, ips: 37.5029 samples/sec | ETA 08:23:15 2022-08-23 19:30:31 [INFO] [TRAIN] epoch: 15, iter: 18500/160000, loss: 0.8106, lr: 0.001071, batch_cost: 0.2330, reader_cost: 0.00065, ips: 34.3387 samples/sec | ETA 09:09:25 2022-08-23 19:30:43 [INFO] [TRAIN] epoch: 15, iter: 18550/160000, loss: 0.8222, lr: 0.001071, batch_cost: 0.2281, reader_cost: 0.00077, ips: 35.0751 samples/sec | ETA 08:57:42 2022-08-23 19:30:53 [INFO] [TRAIN] epoch: 15, iter: 18600/160000, loss: 0.7488, lr: 0.001071, batch_cost: 0.1992, reader_cost: 0.00044, ips: 40.1658 samples/sec | ETA 07:49:23 2022-08-23 19:31:04 [INFO] [TRAIN] epoch: 15, iter: 18650/160000, loss: 0.8408, lr: 0.001070, batch_cost: 0.2260, reader_cost: 0.00273, ips: 35.4011 samples/sec | ETA 08:52:22 2022-08-23 19:31:14 [INFO] [TRAIN] epoch: 15, iter: 18700/160000, loss: 0.8814, lr: 0.001070, batch_cost: 0.2113, reader_cost: 0.00051, ips: 37.8581 samples/sec | ETA 08:17:38 2022-08-23 19:31:24 [INFO] [TRAIN] epoch: 15, iter: 18750/160000, loss: 0.8493, lr: 0.001069, batch_cost: 0.1836, reader_cost: 0.00040, ips: 43.5636 samples/sec | ETA 07:12:19 2022-08-23 19:31:33 [INFO] [TRAIN] epoch: 15, iter: 18800/160000, loss: 0.8099, lr: 0.001069, batch_cost: 0.1800, reader_cost: 0.00049, ips: 44.4326 samples/sec | ETA 07:03:42 2022-08-23 19:31:42 [INFO] [TRAIN] epoch: 15, iter: 18850/160000, loss: 0.8257, lr: 0.001069, batch_cost: 0.1912, reader_cost: 0.00058, ips: 41.8415 samples/sec | ETA 07:29:47 2022-08-23 19:31:52 [INFO] [TRAIN] epoch: 15, iter: 18900/160000, loss: 0.7930, lr: 0.001068, batch_cost: 0.1868, reader_cost: 0.00052, ips: 42.8198 samples/sec | ETA 07:19:21 2022-08-23 19:32:05 [INFO] [TRAIN] epoch: 16, iter: 18950/160000, loss: 0.8465, lr: 0.001068, batch_cost: 0.2748, reader_cost: 0.07406, ips: 29.1154 samples/sec | ETA 10:45:56 2022-08-23 19:32:16 [INFO] [TRAIN] epoch: 16, iter: 19000/160000, loss: 0.7604, lr: 0.001068, batch_cost: 0.2080, reader_cost: 0.00043, ips: 38.4540 samples/sec | ETA 08:08:53 2022-08-23 19:32:16 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 171s - batch_cost: 0.1708 - reader cost: 0.0010 2022-08-23 19:35:07 [INFO] [EVAL] #Images: 2000 mIoU: 0.3247 Acc: 0.7523 Kappa: 0.7335 Dice: 0.4517 2022-08-23 19:35:07 [INFO] [EVAL] Class IoU: [0.6716 0.7567 0.9304 0.7073 0.681 0.7493 0.7563 0.7766 0.4938 0.6157 0.4488 0.5545 0.6985 0.2476 0.2578 0.4058 0.4825 0.3945 0.5637 0.3935 0.7329 0.4474 0.5838 0.4661 0.3419 0.3817 0.4687 0.3853 0.4051 0.2665 0.2428 0.4913 0.2853 0.3104 0.3095 0.3598 0.4047 0.5 0.2286 0.3104 0.1057 0.1514 0.345 0.189 0.3156 0.2286 0.2718 0.4501 0.5942 0.515 0.4669 0.4397 0.1439 0.1659 0.6797 0.4577 0.8581 0.362 0.3669 0.2599 0.1565 0.3047 0.3168 0.049 0.4096 0.6597 0.2252 0.3942 0.0555 0.3166 0.4052 0.4581 0.4078 0.2303 0.4105 0.3404 0.4841 0.2244 0.2289 0.2505 0.6343 0.3494 0.2712 0.0158 0.2954 0.5055 0.0593 0.0679 0.3248 0.4679 0.3811 0.0497 0.1189 0.0624 0.0157 0.0062 0.1602 0.046 0.0455 0.2864 0.045 0.0952 0.058 0.4511 0.1782 0.6544 0.1598 0.4788 0.0648 0.1469 0.0738 0.3492 0.132 0.6773 0.6162 0.0006 0.4262 0.5963 0.0267 0.3462 0.4544 0.0189 0.2954 0.05 0.2485 0.1848 0.5176 0.4545 0.0836 0.3345 0.4091 0. 0.0826 0.1244 0.0975 0.1683 0.0957 0.0107 0.1544 0.3807 0.3441 0. 0.2188 0.1644 0.3 0. 0.316 0.0147 0.103 0.1348] 2022-08-23 19:35:07 [INFO] [EVAL] Class Precision: [0.7856 0.8376 0.9665 0.7989 0.7651 0.8356 0.8844 0.8587 0.6412 0.7864 0.6609 0.6702 0.7793 0.4934 0.4499 0.571 0.5914 0.693 0.6992 0.5892 0.8096 0.6559 0.7934 0.6053 0.5123 0.5846 0.5121 0.7002 0.6288 0.3341 0.3722 0.6262 0.5555 0.4115 0.4109 0.4289 0.634 0.7779 0.3242 0.4709 0.3312 0.3741 0.5546 0.6667 0.5029 0.4165 0.592 0.7509 0.6429 0.6933 0.6881 0.6327 0.3735 0.5591 0.7483 0.6463 0.9087 0.5817 0.7674 0.4246 0.4516 0.4549 0.4295 0.4722 0.5788 0.7754 0.3702 0.6144 0.1792 0.5191 0.6091 0.6142 0.5544 0.2673 0.531 0.5497 0.542 0.6794 0.7773 0.292 0.8275 0.5953 0.7788 0.0481 0.3429 0.6708 0.5038 0.3168 0.5484 0.6744 0.5352 0.0624 0.4396 0.3321 0.066 0.0405 0.3535 0.3956 0.4526 0.4834 0.6648 0.1957 0.3959 0.4704 0.6527 0.7969 0.2422 0.7458 0.144 0.2961 0.3202 0.4337 0.3276 0.7136 0.7405 0.4041 0.5956 0.6864 0.2924 0.7175 0.7229 0.3124 0.6519 0.794 0.6711 0.5251 0.8189 0.6652 0.5323 0.5584 0.6995 0. 0.2393 0.5838 0.7403 0.3286 0.6019 0.053 0.477 0.6388 0.603 0. 0.6144 0.3664 0.5285 0. 0.8784 0.3079 0.3738 0.7523] 2022-08-23 19:35:07 [INFO] [EVAL] Class Recall: [0.8223 0.8867 0.9614 0.8604 0.8611 0.8789 0.8392 0.8904 0.6823 0.7394 0.583 0.7626 0.8707 0.332 0.3764 0.5837 0.7238 0.478 0.7442 0.5423 0.8856 0.5847 0.6885 0.6696 0.5069 0.5238 0.8468 0.4614 0.5324 0.5686 0.4111 0.6951 0.3697 0.5582 0.5562 0.6905 0.528 0.5833 0.4366 0.4767 0.1344 0.2027 0.4772 0.2087 0.4588 0.3363 0.3344 0.5291 0.887 0.6669 0.5922 0.5905 0.1897 0.1908 0.8811 0.6106 0.9391 0.4895 0.4129 0.4012 0.1933 0.48 0.5471 0.0518 0.5835 0.8156 0.3649 0.5238 0.0745 0.448 0.5475 0.6432 0.6066 0.6243 0.644 0.472 0.8193 0.2509 0.2449 0.6379 0.7309 0.4582 0.2939 0.023 0.6811 0.6724 0.063 0.0796 0.4434 0.6044 0.5697 0.1954 0.1401 0.0713 0.0201 0.0073 0.2266 0.0495 0.0482 0.4127 0.046 0.1565 0.0637 0.9168 0.1969 0.7854 0.3196 0.5721 0.1053 0.2256 0.0875 0.6418 0.181 0.9301 0.7858 0.0006 0.5997 0.8195 0.0285 0.4008 0.5502 0.0197 0.3507 0.0507 0.283 0.2219 0.5846 0.5893 0.0902 0.4548 0.4963 0. 0.112 0.1365 0.101 0.2565 0.1022 0.0133 0.1859 0.4851 0.4448 0. 0.2536 0.2297 0.4097 0. 0.3304 0.0152 0.1244 0.1411] 2022-08-23 19:35:07 [INFO] [EVAL] The model with the best validation mIoU (0.3247) was saved at iter 19000. 
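For the [TRAIN] lines, ips and ETA are simple arithmetic on batch_cost: ips × batch_cost gives the samples processed per logged step (≈ 8 throughout this run, presumably the global batch size), and ETA is roughly batch_cost times the iterations remaining out of 160000. A quick check against the 19:28:52 entry above (iter 18050, batch_cost 0.2298, ips 34.8063, ETA 09:03:46); the small discrepancy comes from batch_cost being rounded in the log:

```python
# Values copied from the 19:28:52 [TRAIN] line; names are illustrative only.
total_iters, cur_iter = 160_000, 18_050
batch_cost = 0.2298            # seconds per iteration (rounded in the log)
ips = 34.8063                  # samples per second

samples_per_step = ips * batch_cost               # -> ~8.0
eta = (total_iters - cur_iter) * batch_cost       # -> ~32620 s
h, rem = divmod(int(eta), 3600)
m, s = divmod(rem, 60)
print(f"{samples_per_step:.1f} samples/step, ETA {h:02d}:{m:02d}:{s:02d}")
# -> 8.0 samples/step, ETA 09:03:40 (log shows 09:03:46)
```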
2022-08-23 19:35:17 [INFO] [TRAIN] epoch: 16, iter: 19050/160000, loss: 0.7487, lr: 0.001067, batch_cost: 0.1971, reader_cost: 0.00340, ips: 40.5961 samples/sec | ETA 07:42:56 2022-08-23 19:35:27 [INFO] [TRAIN] epoch: 16, iter: 19100/160000, loss: 0.8483, lr: 0.001067, batch_cost: 0.1986, reader_cost: 0.00274, ips: 40.2804 samples/sec | ETA 07:46:23 2022-08-23 19:35:37 [INFO] [TRAIN] epoch: 16, iter: 19150/160000, loss: 0.7914, lr: 0.001066, batch_cost: 0.2015, reader_cost: 0.00043, ips: 39.7009 samples/sec | ETA 07:53:02 2022-08-23 19:35:48 [INFO] [TRAIN] epoch: 16, iter: 19200/160000, loss: 0.7712, lr: 0.001066, batch_cost: 0.2303, reader_cost: 0.00037, ips: 34.7313 samples/sec | ETA 09:00:31 2022-08-23 19:35:59 [INFO] [TRAIN] epoch: 16, iter: 19250/160000, loss: 0.8333, lr: 0.001066, batch_cost: 0.2094, reader_cost: 0.00081, ips: 38.2024 samples/sec | ETA 08:11:14 2022-08-23 19:36:11 [INFO] [TRAIN] epoch: 16, iter: 19300/160000, loss: 0.8547, lr: 0.001065, batch_cost: 0.2423, reader_cost: 0.00071, ips: 33.0159 samples/sec | ETA 09:28:12 2022-08-23 19:36:21 [INFO] [TRAIN] epoch: 16, iter: 19350/160000, loss: 0.7767, lr: 0.001065, batch_cost: 0.1898, reader_cost: 0.00876, ips: 42.1515 samples/sec | ETA 07:24:54 2022-08-23 19:36:32 [INFO] [TRAIN] epoch: 16, iter: 19400/160000, loss: 0.8366, lr: 0.001064, batch_cost: 0.2214, reader_cost: 0.00051, ips: 36.1404 samples/sec | ETA 08:38:43 2022-08-23 19:36:42 [INFO] [TRAIN] epoch: 16, iter: 19450/160000, loss: 0.8361, lr: 0.001064, batch_cost: 0.2157, reader_cost: 0.00043, ips: 37.0818 samples/sec | ETA 08:25:22 2022-08-23 19:36:54 [INFO] [TRAIN] epoch: 16, iter: 19500/160000, loss: 0.8250, lr: 0.001064, batch_cost: 0.2364, reader_cost: 0.00040, ips: 33.8392 samples/sec | ETA 09:13:35 2022-08-23 19:37:04 [INFO] [TRAIN] epoch: 16, iter: 19550/160000, loss: 0.7419, lr: 0.001063, batch_cost: 0.2038, reader_cost: 0.00054, ips: 39.2627 samples/sec | ETA 07:56:57 2022-08-23 19:37:15 [INFO] [TRAIN] epoch: 16, iter: 19600/160000, loss: 0.7952, lr: 0.001063, batch_cost: 0.2190, reader_cost: 0.00070, ips: 36.5309 samples/sec | ETA 08:32:26 2022-08-23 19:37:26 [INFO] [TRAIN] epoch: 16, iter: 19650/160000, loss: 0.8351, lr: 0.001063, batch_cost: 0.2056, reader_cost: 0.00067, ips: 38.9124 samples/sec | ETA 08:00:54 2022-08-23 19:37:35 [INFO] [TRAIN] epoch: 16, iter: 19700/160000, loss: 0.7653, lr: 0.001062, batch_cost: 0.1782, reader_cost: 0.00111, ips: 44.9034 samples/sec | ETA 06:56:35 2022-08-23 19:37:43 [INFO] [TRAIN] epoch: 16, iter: 19750/160000, loss: 0.8572, lr: 0.001062, batch_cost: 0.1759, reader_cost: 0.00051, ips: 45.4911 samples/sec | ETA 06:51:04 2022-08-23 19:37:53 [INFO] [TRAIN] epoch: 16, iter: 19800/160000, loss: 0.8023, lr: 0.001061, batch_cost: 0.1883, reader_cost: 0.00063, ips: 42.4915 samples/sec | ETA 07:19:55 2022-08-23 19:38:02 [INFO] [TRAIN] epoch: 16, iter: 19850/160000, loss: 0.8386, lr: 0.001061, batch_cost: 0.1909, reader_cost: 0.00147, ips: 41.9117 samples/sec | ETA 07:25:51 2022-08-23 19:38:13 [INFO] [TRAIN] epoch: 16, iter: 19900/160000, loss: 0.7757, lr: 0.001061, batch_cost: 0.2110, reader_cost: 0.00058, ips: 37.9125 samples/sec | ETA 08:12:42 2022-08-23 19:38:24 [INFO] [TRAIN] epoch: 16, iter: 19950/160000, loss: 0.8251, lr: 0.001060, batch_cost: 0.2131, reader_cost: 0.00107, ips: 37.5418 samples/sec | ETA 08:17:24 2022-08-23 19:38:34 [INFO] [TRAIN] epoch: 16, iter: 20000/160000, loss: 0.8174, lr: 0.001060, batch_cost: 0.2177, reader_cost: 0.00031, ips: 36.7508 samples/sec | ETA 08:27:55 2022-08-23 19:38:34 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 175s - batch_cost: 0.1748 - reader cost: 5.8976e-04 2022-08-23 19:41:30 [INFO] [EVAL] #Images: 2000 mIoU: 0.3234 Acc: 0.7513 Kappa: 0.7324 Dice: 0.4527 2022-08-23 19:41:30 [INFO] [EVAL] Class IoU: [0.666 0.7678 0.9265 0.7178 0.6772 0.7394 0.7555 0.7613 0.4983 0.6287 0.4645 0.5216 0.6856 0.2682 0.2261 0.3947 0.4329 0.4079 0.5836 0.4036 0.7281 0.4364 0.5797 0.4779 0.3042 0.4724 0.4988 0.402 0.3645 0.2468 0.1981 0.4763 0.2775 0.344 0.2334 0.3618 0.405 0.4653 0.2009 0.2986 0.2039 0.1248 0.3284 0.2298 0.3241 0.165 0.2857 0.4462 0.6332 0.5253 0.4832 0.3984 0.175 0.2819 0.6365 0.23 0.8415 0.3806 0.4163 0.3106 0.1287 0.1445 0.2707 0.0561 0.3978 0.6469 0.1898 0.3698 0.0795 0.3214 0.429 0.5137 0.4051 0.2365 0.4107 0.3344 0.5379 0.2234 0.3996 0.2754 0.6031 0.3446 0.3391 0.0173 0.351 0.5154 0.085 0.0569 0.2129 0.4703 0.385 0.1471 0.152 0.0498 0.0124 0.0041 0.1699 0.108 0.0968 0.282 0.061 0.1088 0.1476 0.6804 0.1837 0.336 0.184 0.4947 0.1632 0.2679 0.0965 0.4645 0.113 0.6095 0.6374 0.0065 0.3198 0.5225 0.1077 0.1236 0.3005 0.0413 0.2978 0.1358 0.2188 0.1851 0.2565 0.4484 0. 0.3196 0.45 0.005 0.2199 0.2165 0.0959 0.1652 0.1364 0.0104 0.1369 0.376 0.2511 0.0048 0.1812 0.0059 0.3144 0. 0.348 0.015 0.0857 0.173 ] 2022-08-23 19:41:30 [INFO] [EVAL] Class Precision: [0.7727 0.8474 0.962 0.8325 0.7604 0.8633 0.8973 0.8218 0.6161 0.7589 0.6782 0.6061 0.7512 0.5646 0.4779 0.5263 0.6166 0.6438 0.7536 0.5798 0.7875 0.6506 0.7094 0.5861 0.5565 0.6183 0.5239 0.7042 0.6643 0.3254 0.4584 0.6682 0.4264 0.4896 0.4589 0.4409 0.5608 0.7699 0.3733 0.5149 0.3116 0.3879 0.6254 0.5933 0.4605 0.6414 0.4357 0.5925 0.7833 0.7167 0.703 0.4993 0.328 0.599 0.6577 0.2472 0.8855 0.5321 0.8195 0.5636 0.3378 0.2721 0.4003 0.73 0.5123 0.7586 0.5309 0.5206 0.1527 0.524 0.6155 0.6358 0.6464 0.3364 0.5966 0.4602 0.6151 0.5871 0.4884 0.3811 0.7503 0.5652 0.7364 0.0513 0.4906 0.694 0.3059 0.3582 0.3852 0.6311 0.5457 0.3946 0.4903 0.3029 0.033 0.0284 0.4217 0.4603 0.4704 0.6178 0.5351 0.1924 0.48 0.8814 0.696 0.3469 0.5207 0.6403 0.3154 0.354 0.3249 0.7254 0.7023 0.6502 0.6417 0.2757 0.6029 0.5854 0.321 0.8109 0.7177 0.4345 0.7773 0.7114 0.6774 0.6035 0.9834 0.6634 0. 0.6029 0.6671 0.1475 0.6679 0.5199 0.7407 0.3617 0.4467 0.0438 0.3514 0.6373 0.4967 0.007 0.6753 0.668 0.7468 0. 0.7719 0.4535 0.503 0.6263] 2022-08-23 19:41:30 [INFO] [EVAL] Class Recall: [0.8283 0.891 0.9616 0.8389 0.8608 0.8374 0.827 0.9118 0.7227 0.7857 0.5959 0.7892 0.8871 0.3381 0.3002 0.6122 0.5924 0.5268 0.7213 0.5705 0.9061 0.57 0.7603 0.7213 0.4015 0.6668 0.9126 0.4837 0.4467 0.5055 0.2587 0.6238 0.4427 0.5364 0.3221 0.6686 0.5931 0.5405 0.3032 0.4155 0.3712 0.1555 0.4089 0.2728 0.5224 0.1817 0.4535 0.6437 0.7676 0.6629 0.6071 0.6633 0.2729 0.3475 0.9519 0.7683 0.9442 0.5719 0.4583 0.4089 0.1721 0.2354 0.4555 0.0573 0.6403 0.8147 0.228 0.5609 0.1423 0.4539 0.5859 0.7278 0.5205 0.4434 0.5686 0.5503 0.8107 0.2651 0.6874 0.4983 0.7545 0.4688 0.3859 0.0254 0.5523 0.667 0.1053 0.0633 0.3225 0.6485 0.5667 0.19 0.1805 0.0562 0.0195 0.0047 0.2216 0.1237 0.1087 0.3416 0.0645 0.2002 0.1757 0.749 0.1998 0.915 0.2215 0.6851 0.2528 0.5243 0.1207 0.5635 0.1187 0.9067 0.9895 0.0066 0.4052 0.8295 0.1395 0.1273 0.3408 0.0437 0.3256 0.1437 0.2443 0.2108 0.2576 0.5805 0. 0.4047 0.5803 0.0052 0.2469 0.2707 0.0992 0.2331 0.1641 0.0134 0.1832 0.4784 0.3368 0.0154 0.1985 0.0059 0.3519 0. 0.3878 0.0153 0.0937 0.1929] 2022-08-23 19:41:30 [INFO] [EVAL] The model with the best validation mIoU (0.3247) was saved at iter 19000. 
2022-08-23 19:41:40 [INFO] [TRAIN] epoch: 16, iter: 20050/160000, loss: 0.8145, lr: 0.001060, batch_cost: 0.2094, reader_cost: 0.00356, ips: 38.2026 samples/sec | ETA 08:08:26 2022-08-23 19:41:51 [INFO] [TRAIN] epoch: 16, iter: 20100/160000, loss: 0.7703, lr: 0.001059, batch_cost: 0.2124, reader_cost: 0.00182, ips: 37.6709 samples/sec | ETA 08:15:09 2022-08-23 19:42:01 [INFO] [TRAIN] epoch: 16, iter: 20150/160000, loss: 0.8205, lr: 0.001059, batch_cost: 0.1927, reader_cost: 0.00059, ips: 41.5128 samples/sec | ETA 07:29:10 2022-08-23 19:42:10 [INFO] [TRAIN] epoch: 16, iter: 20200/160000, loss: 0.7582, lr: 0.001058, batch_cost: 0.1972, reader_cost: 0.00062, ips: 40.5613 samples/sec | ETA 07:39:33 2022-08-23 19:42:25 [INFO] [TRAIN] epoch: 17, iter: 20250/160000, loss: 0.8137, lr: 0.001058, batch_cost: 0.2882, reader_cost: 0.08764, ips: 27.7557 samples/sec | ETA 11:11:20 2022-08-23 19:42:36 [INFO] [TRAIN] epoch: 17, iter: 20300/160000, loss: 0.7772, lr: 0.001058, batch_cost: 0.2159, reader_cost: 0.00070, ips: 37.0540 samples/sec | ETA 08:22:41 2022-08-23 19:42:47 [INFO] [TRAIN] epoch: 17, iter: 20350/160000, loss: 0.7015, lr: 0.001057, batch_cost: 0.2298, reader_cost: 0.00036, ips: 34.8106 samples/sec | ETA 08:54:53 2022-08-23 19:42:57 [INFO] [TRAIN] epoch: 17, iter: 20400/160000, loss: 0.7829, lr: 0.001057, batch_cost: 0.1937, reader_cost: 0.00083, ips: 41.2962 samples/sec | ETA 07:30:43 2022-08-23 19:43:07 [INFO] [TRAIN] epoch: 17, iter: 20450/160000, loss: 0.7949, lr: 0.001057, batch_cost: 0.2112, reader_cost: 0.00096, ips: 37.8843 samples/sec | ETA 08:11:08 2022-08-23 19:43:18 [INFO] [TRAIN] epoch: 17, iter: 20500/160000, loss: 0.7751, lr: 0.001056, batch_cost: 0.2106, reader_cost: 0.00035, ips: 37.9937 samples/sec | ETA 08:09:33 2022-08-23 19:43:30 [INFO] [TRAIN] epoch: 17, iter: 20550/160000, loss: 0.7874, lr: 0.001056, batch_cost: 0.2336, reader_cost: 0.00039, ips: 34.2474 samples/sec | ETA 09:02:54 2022-08-23 19:43:40 [INFO] [TRAIN] epoch: 17, iter: 20600/160000, loss: 0.7844, lr: 0.001055, batch_cost: 0.2124, reader_cost: 0.00034, ips: 37.6613 samples/sec | ETA 08:13:31 2022-08-23 19:43:50 [INFO] [TRAIN] epoch: 17, iter: 20650/160000, loss: 0.7848, lr: 0.001055, batch_cost: 0.2013, reader_cost: 0.00074, ips: 39.7334 samples/sec | ETA 07:47:36 2022-08-23 19:43:59 [INFO] [TRAIN] epoch: 17, iter: 20700/160000, loss: 0.8253, lr: 0.001055, batch_cost: 0.1819, reader_cost: 0.00310, ips: 43.9837 samples/sec | ETA 07:02:16 2022-08-23 19:44:09 [INFO] [TRAIN] epoch: 17, iter: 20750/160000, loss: 0.7723, lr: 0.001054, batch_cost: 0.1852, reader_cost: 0.00254, ips: 43.2036 samples/sec | ETA 07:09:44 2022-08-23 19:44:18 [INFO] [TRAIN] epoch: 17, iter: 20800/160000, loss: 0.7829, lr: 0.001054, batch_cost: 0.1971, reader_cost: 0.00084, ips: 40.5901 samples/sec | ETA 07:37:15 2022-08-23 19:44:28 [INFO] [TRAIN] epoch: 17, iter: 20850/160000, loss: 0.8158, lr: 0.001054, batch_cost: 0.1933, reader_cost: 0.00078, ips: 41.3939 samples/sec | ETA 07:28:12 2022-08-23 19:44:38 [INFO] [TRAIN] epoch: 17, iter: 20900/160000, loss: 0.7877, lr: 0.001053, batch_cost: 0.1976, reader_cost: 0.00046, ips: 40.4792 samples/sec | ETA 07:38:10 2022-08-23 19:44:47 [INFO] [TRAIN] epoch: 17, iter: 20950/160000, loss: 0.7740, lr: 0.001053, batch_cost: 0.1828, reader_cost: 0.00041, ips: 43.7741 samples/sec | ETA 07:03:32 2022-08-23 19:44:56 [INFO] [TRAIN] epoch: 17, iter: 21000/160000, loss: 0.8295, lr: 0.001052, batch_cost: 0.1853, reader_cost: 0.00120, ips: 43.1666 samples/sec | ETA 07:09:20 2022-08-23 19:44:56 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 168s - batch_cost: 0.1681 - reader cost: 7.7448e-04 2022-08-23 19:47:45 [INFO] [EVAL] #Images: 2000 mIoU: 0.3280 Acc: 0.7557 Kappa: 0.7369 Dice: 0.4556 2022-08-23 19:47:45 [INFO] [EVAL] Class IoU: [0.6629 0.7771 0.93 0.7111 0.6777 0.7543 0.7494 0.7731 0.4957 0.6392 0.4647 0.5374 0.6798 0.2733 0.2414 0.4103 0.4734 0.395 0.5942 0.3878 0.7393 0.4596 0.6081 0.4753 0.3174 0.411 0.5226 0.3856 0.3511 0.2433 0.1993 0.4771 0.2906 0.3419 0.4195 0.3563 0.3973 0.3162 0.2314 0.2964 0.0814 0.146 0.3315 0.2342 0.2899 0.1672 0.2895 0.3836 0.6906 0.4643 0.4828 0.3616 0.1762 0.2565 0.6332 0.3686 0.8485 0.3375 0.4765 0.3103 0.1229 0.4637 0.2546 0.0305 0.3692 0.6612 0.2161 0.3953 0.0624 0.247 0.3837 0.4859 0.4296 0.152 0.418 0.3402 0.5139 0.2085 0.2262 0.309 0.6393 0.3533 0.2745 0.0254 0.3269 0.5015 0.0981 0.068 0.2978 0.4787 0.2456 0.0522 0.0645 0.0654 0.0052 0.0042 0.1837 0.0596 0.2448 0.3011 0.0949 0.0818 0.1385 0.7744 0.1518 0.5691 0.215 0.4866 0.0921 0.258 0.0751 0.4064 0.1541 0.626 0.6078 0.0015 0.3491 0.599 0.0869 0.1993 0.4566 0.0053 0.2997 0.1379 0.1931 0.1432 0.5002 0.4613 0. 0.316 0.5078 0.0012 0.2319 0.2729 0.2015 0.1527 0.1337 0.0102 0.1591 0.3338 0.0793 0.0106 0.3651 0.0017 0.2836 0. 0.3552 0.0171 0.0628 0.1666] 2022-08-23 19:47:45 [INFO] [EVAL] Class Precision: [0.7619 0.8601 0.9694 0.8021 0.7468 0.8457 0.8861 0.8464 0.603 0.7244 0.648 0.6679 0.741 0.5522 0.5255 0.5716 0.6465 0.7293 0.769 0.5763 0.8276 0.6448 0.7584 0.5722 0.4453 0.5313 0.5762 0.7039 0.7646 0.4328 0.4412 0.6366 0.5668 0.5001 0.5199 0.4216 0.6632 0.8919 0.4121 0.5536 0.2689 0.3732 0.6332 0.5474 0.6621 0.552 0.5267 0.4803 0.8164 0.5946 0.6989 0.5063 0.4235 0.5557 0.7559 0.8657 0.8947 0.5826 0.6031 0.5862 0.1984 0.5809 0.4017 0.5414 0.4231 0.8403 0.3456 0.5045 0.1271 0.7897 0.4628 0.6544 0.6463 0.2964 0.6361 0.5066 0.6313 0.4353 0.6194 0.4703 0.7908 0.6066 0.7832 0.0599 0.5275 0.7047 0.3494 0.3782 0.5567 0.755 0.2786 0.0687 0.2347 0.298 0.0349 0.066 0.5696 0.591 0.5922 0.6705 0.7507 0.1234 0.4806 0.9718 0.4502 0.6974 0.3492 0.7666 0.1704 0.3872 0.2506 0.4709 0.4779 0.7544 0.6135 0.0998 0.7016 0.6887 0.3153 0.6261 0.583 0.121 0.7078 0.6162 0.7391 0.61 0.8122 0.695 0. 0.6093 0.6596 0.022 0.7912 0.5521 0.5211 0.2713 0.4204 0.0715 0.3584 0.7231 0.651 0.0156 0.5325 0.0435 0.8164 0. 0.84 0.3722 0.7102 0.8456] 2022-08-23 19:47:45 [INFO] [EVAL] Class Recall: [0.8362 0.8895 0.9581 0.8624 0.88 0.8746 0.8292 0.8991 0.7359 0.8446 0.6217 0.7335 0.8917 0.351 0.3087 0.5925 0.6387 0.4628 0.7234 0.5425 0.8739 0.6154 0.7543 0.7374 0.525 0.6447 0.8489 0.4602 0.3937 0.3572 0.2666 0.6557 0.3736 0.5195 0.6848 0.6968 0.4977 0.3288 0.3455 0.3894 0.1046 0.1934 0.4103 0.2904 0.3402 0.1934 0.3913 0.6559 0.8176 0.6793 0.6095 0.5586 0.2317 0.3227 0.7959 0.3909 0.9427 0.4451 0.6942 0.3972 0.244 0.6967 0.4102 0.0314 0.7433 0.7563 0.3658 0.6462 0.1091 0.2644 0.6918 0.6536 0.5616 0.2377 0.5494 0.5087 0.7343 0.2857 0.2627 0.4739 0.7694 0.4584 0.297 0.0422 0.4622 0.6349 0.12 0.0765 0.3903 0.5668 0.6744 0.1791 0.0816 0.0774 0.0061 0.0045 0.2133 0.0622 0.2945 0.3533 0.098 0.1953 0.1628 0.7922 0.1863 0.7558 0.3587 0.5712 0.167 0.4361 0.0969 0.7478 0.1853 0.7863 0.9851 0.0015 0.4099 0.8213 0.1072 0.2262 0.678 0.0055 0.342 0.1508 0.2072 0.1576 0.5657 0.5783 0. 0.3962 0.688 0.0013 0.247 0.3506 0.2473 0.259 0.1639 0.0118 0.2225 0.3827 0.0828 0.0317 0.5374 0.0018 0.303 0. 0.381 0.0176 0.0644 0.1718] 2022-08-23 19:47:45 [INFO] [EVAL] The model with the best validation mIoU (0.3280) was saved at iter 21000. 
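Across the six evaluations shown so far in this stretch, mIoU has moved 0.3154 → 0.3150 → 0.3228 → 0.3247 → 0.3234 → 0.3280. A minimal way to pull these summary metrics out of a saved copy of this log for plotting, assuming it has been written to a plain-text file (the filename train.log below is an assumption), keyed on the [EVAL] summary format shown here:

```python
import re

# Matches the summary lines in this log, e.g.
# "... [INFO] [EVAL] #Images: 2000 mIoU: 0.3280 Acc: 0.7557 Kappa: 0.7369 Dice: 0.4556"
EVAL_RE = re.compile(
    r"\[EVAL\] #Images: (\d+) mIoU: ([\d.]+) Acc: ([\d.]+) Kappa: ([\d.]+) Dice: ([\d.]+)"
)

def eval_history(path="train.log"):
    """Return (mIoU, Acc, Kappa, Dice) tuples in the order they appear in the log."""
    history = []
    with open(path) as f:
        for line in f:
            m = EVAL_RE.search(line)
            if m:
                history.append(tuple(float(x) for x in m.groups()[1:]))
    return history

# For the stretch shown here this yields
# [(0.3154, ...), (0.3150, ...), (0.3228, ...), (0.3247, ...), (0.3234, ...), (0.3280, ...)]
```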
2022-08-23 19:47:56 [INFO] [TRAIN] epoch: 17, iter: 21050/160000, loss: 0.7791, lr: 0.001052, batch_cost: 0.2191, reader_cost: 0.00424, ips: 36.5098 samples/sec | ETA 08:27:26 2022-08-23 19:48:06 [INFO] [TRAIN] epoch: 17, iter: 21100/160000, loss: 0.8322, lr: 0.001052, batch_cost: 0.2029, reader_cost: 0.00125, ips: 39.4311 samples/sec | ETA 07:49:40 2022-08-23 19:48:17 [INFO] [TRAIN] epoch: 17, iter: 21150/160000, loss: 0.8204, lr: 0.001051, batch_cost: 0.2113, reader_cost: 0.00060, ips: 37.8579 samples/sec | ETA 08:09:01 2022-08-23 19:48:27 [INFO] [TRAIN] epoch: 17, iter: 21200/160000, loss: 0.7683, lr: 0.001051, batch_cost: 0.2034, reader_cost: 0.00049, ips: 39.3381 samples/sec | ETA 07:50:27 2022-08-23 19:48:38 [INFO] [TRAIN] epoch: 17, iter: 21250/160000, loss: 0.8196, lr: 0.001050, batch_cost: 0.2096, reader_cost: 0.00124, ips: 38.1634 samples/sec | ETA 08:04:45 2022-08-23 19:48:49 [INFO] [TRAIN] epoch: 17, iter: 21300/160000, loss: 0.8980, lr: 0.001050, batch_cost: 0.2177, reader_cost: 0.00041, ips: 36.7556 samples/sec | ETA 08:23:08 2022-08-23 19:48:59 [INFO] [TRAIN] epoch: 17, iter: 21350/160000, loss: 0.7354, lr: 0.001050, batch_cost: 0.2021, reader_cost: 0.00491, ips: 39.5911 samples/sec | ETA 07:46:56 2022-08-23 19:49:09 [INFO] [TRAIN] epoch: 17, iter: 21400/160000, loss: 0.7740, lr: 0.001049, batch_cost: 0.2084, reader_cost: 0.00060, ips: 38.3901 samples/sec | ETA 08:01:22 2022-08-23 19:49:19 [INFO] [TRAIN] epoch: 17, iter: 21450/160000, loss: 0.7544, lr: 0.001049, batch_cost: 0.1961, reader_cost: 0.00044, ips: 40.7909 samples/sec | ETA 07:32:52 2022-08-23 19:49:33 [INFO] [TRAIN] epoch: 18, iter: 21500/160000, loss: 0.7660, lr: 0.001049, batch_cost: 0.2850, reader_cost: 0.07667, ips: 28.0745 samples/sec | ETA 10:57:46 2022-08-23 19:49:43 [INFO] [TRAIN] epoch: 18, iter: 21550/160000, loss: 0.7653, lr: 0.001048, batch_cost: 0.2021, reader_cost: 0.00047, ips: 39.5764 samples/sec | ETA 07:46:26 2022-08-23 19:49:54 [INFO] [TRAIN] epoch: 18, iter: 21600/160000, loss: 0.8337, lr: 0.001048, batch_cost: 0.2071, reader_cost: 0.00748, ips: 38.6258 samples/sec | ETA 07:57:44 2022-08-23 19:50:04 [INFO] [TRAIN] epoch: 18, iter: 21650/160000, loss: 0.7194, lr: 0.001047, batch_cost: 0.1992, reader_cost: 0.01042, ips: 40.1606 samples/sec | ETA 07:39:19 2022-08-23 19:50:13 [INFO] [TRAIN] epoch: 18, iter: 21700/160000, loss: 0.7474, lr: 0.001047, batch_cost: 0.1882, reader_cost: 0.00970, ips: 42.5174 samples/sec | ETA 07:13:42 2022-08-23 19:50:24 [INFO] [TRAIN] epoch: 18, iter: 21750/160000, loss: 0.7516, lr: 0.001047, batch_cost: 0.2143, reader_cost: 0.00073, ips: 37.3225 samples/sec | ETA 08:13:53 2022-08-23 19:50:32 [INFO] [TRAIN] epoch: 18, iter: 21800/160000, loss: 0.7440, lr: 0.001046, batch_cost: 0.1756, reader_cost: 0.00064, ips: 45.5577 samples/sec | ETA 06:44:28 2022-08-23 19:50:42 [INFO] [TRAIN] epoch: 18, iter: 21850/160000, loss: 0.7236, lr: 0.001046, batch_cost: 0.1873, reader_cost: 0.00093, ips: 42.7021 samples/sec | ETA 07:11:21 2022-08-23 19:50:51 [INFO] [TRAIN] epoch: 18, iter: 21900/160000, loss: 0.8789, lr: 0.001046, batch_cost: 0.1764, reader_cost: 0.00086, ips: 45.3560 samples/sec | ETA 06:45:58 2022-08-23 19:50:59 [INFO] [TRAIN] epoch: 18, iter: 21950/160000, loss: 0.7333, lr: 0.001045, batch_cost: 0.1721, reader_cost: 0.00068, ips: 46.4938 samples/sec | ETA 06:35:53 2022-08-23 19:51:07 [INFO] [TRAIN] epoch: 18, iter: 22000/160000, loss: 0.7891, lr: 0.001045, batch_cost: 0.1652, reader_cost: 0.00050, ips: 48.4323 samples/sec | ETA 06:19:54 2022-08-23 19:51:07 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 163s - batch_cost: 0.1629 - reader cost: 6.3998e-04 2022-08-23 19:53:51 [INFO] [EVAL] #Images: 2000 mIoU: 0.3213 Acc: 0.7518 Kappa: 0.7332 Dice: 0.4473 2022-08-23 19:53:51 [INFO] [EVAL] Class IoU: [0.6708 0.7774 0.9275 0.7161 0.6688 0.7383 0.7439 0.7522 0.4831 0.6339 0.4508 0.5211 0.6934 0.2809 0.2779 0.3959 0.5226 0.3855 0.5795 0.4097 0.7161 0.4077 0.591 0.4742 0.3287 0.3363 0.4677 0.3466 0.3847 0.2984 0.1731 0.3938 0.2647 0.3029 0.3163 0.4017 0.4021 0.4761 0.2245 0.2499 0.1179 0.0888 0.351 0.2483 0.3049 0.2287 0.2571 0.4257 0.6541 0.4822 0.4408 0.4025 0.1798 0.1723 0.6888 0.3756 0.8332 0.2597 0.2306 0.2354 0.1415 0.4024 0.2932 0.0225 0.3551 0.6664 0.2084 0.3846 0.0172 0.3233 0.4072 0.4643 0.3894 0.1996 0.4361 0.3312 0.5971 0.2408 0.2457 0.137 0.546 0.3503 0.3939 0.0101 0.3437 0.5008 0.0694 0.0827 0.2407 0.4744 0.4274 0.1311 0.0361 0.0505 0.1161 0.0088 0.2195 0.0341 0.0289 0.2596 0.0524 0.0896 0.088 0.7744 0.1616 0.5164 0.1699 0.4665 0.083 0.3572 0.0533 0.1116 0.1269 0.6248 0.6882 0.0042 0.3669 0.5401 0.0781 0.1056 0.4486 0.047 0.3369 0.1367 0.2536 0.1963 0.4759 0.4488 0.0027 0.2944 0.5636 0.001 0.1956 0.2439 0.1133 0.1681 0.1221 0.0152 0.1187 0.3555 0.2071 0.0012 0.2588 0.0278 0.3138 0.0023 0.3806 0.0135 0.0998 0.1406] 2022-08-23 19:53:51 [INFO] [EVAL] Class Precision: [0.7826 0.8638 0.9668 0.8519 0.7355 0.8514 0.8148 0.8013 0.5823 0.7603 0.7127 0.6815 0.7812 0.6121 0.488 0.5771 0.6427 0.713 0.7539 0.5914 0.778 0.6885 0.705 0.558 0.4554 0.5217 0.5379 0.7721 0.5109 0.4437 0.4601 0.5426 0.4897 0.4139 0.4218 0.4973 0.563 0.7645 0.3522 0.6766 0.3199 0.3498 0.6232 0.4676 0.4443 0.3996 0.3556 0.5877 0.6997 0.5876 0.6917 0.5605 0.3189 0.4173 0.7304 0.4634 0.8715 0.6099 0.6678 0.3312 0.2096 0.5122 0.3784 0.6502 0.4074 0.7999 0.3495 0.5464 0.2292 0.5416 0.5099 0.545 0.5683 0.2243 0.6063 0.6148 0.6818 0.6196 0.4754 0.2743 0.6441 0.6191 0.6717 0.0378 0.4312 0.6703 0.3718 0.4006 0.3738 0.6755 0.6229 0.1608 0.1725 0.3743 0.1717 0.0359 0.288 0.7943 0.6372 0.5925 0.5522 0.1259 0.379 0.9078 0.7193 0.5709 0.3497 0.6803 0.243 0.4321 0.3711 0.9859 0.4243 0.7236 0.6915 0.5173 0.8061 0.5998 0.1629 0.6801 0.7319 0.3427 0.7802 0.6447 0.6647 0.5079 0.8651 0.5654 0.0691 0.433 0.7011 0.0128 0.7003 0.6342 0.6899 0.3801 0.3058 0.1621 0.4107 0.6068 0.5719 0.0017 0.6542 0.2888 0.6921 0.003 0.8445 0.4748 0.5254 0.7989] 2022-08-23 19:53:51 [INFO] [EVAL] Class Recall: [0.8244 0.886 0.958 0.8179 0.8806 0.8475 0.8953 0.9247 0.7395 0.7923 0.5509 0.6888 0.8605 0.3417 0.3922 0.5577 0.7365 0.4564 0.7147 0.5714 0.9001 0.5 0.7852 0.7595 0.5417 0.4862 0.782 0.3861 0.6089 0.4767 0.2172 0.5895 0.3656 0.5304 0.5585 0.6763 0.5846 0.558 0.3824 0.2839 0.1574 0.1063 0.4455 0.3463 0.4929 0.3485 0.4813 0.607 0.9094 0.729 0.5486 0.588 0.292 0.2269 0.9237 0.6648 0.9499 0.3115 0.2605 0.4485 0.3035 0.6526 0.5657 0.0228 0.7346 0.7998 0.3404 0.565 0.0182 0.4451 0.6692 0.7582 0.5529 0.6451 0.6083 0.4179 0.8278 0.2826 0.337 0.2148 0.7818 0.4466 0.4878 0.0135 0.6289 0.6645 0.0786 0.0944 0.4033 0.6144 0.5766 0.4156 0.0437 0.0551 0.2639 0.0116 0.4796 0.0344 0.0294 0.316 0.0547 0.2367 0.1028 0.8406 0.1725 0.8439 0.2485 0.5976 0.1118 0.6732 0.0586 0.1117 0.1533 0.8205 0.993 0.0042 0.4024 0.8444 0.1305 0.1111 0.5368 0.0516 0.3723 0.1478 0.2908 0.2424 0.5141 0.685 0.0028 0.4791 0.7419 0.0011 0.2135 0.2838 0.1194 0.2316 0.1688 0.0165 0.1431 0.4619 0.2451 0.0039 0.2998 0.0298 0.3647 0.0087 0.4093 0.0137 0.1097 0.1457] 2022-08-23 19:53:51 [INFO] [EVAL] The model with the best validation 
mIoU (0.3280) was saved at iter 21000. 2022-08-23 19:54:01 [INFO] [TRAIN] epoch: 18, iter: 22050/160000, loss: 0.7150, lr: 0.001044, batch_cost: 0.1948, reader_cost: 0.00405, ips: 41.0615 samples/sec | ETA 07:27:56 2022-08-23 19:54:13 [INFO] [TRAIN] epoch: 18, iter: 22100/160000, loss: 0.7526, lr: 0.001044, batch_cost: 0.2355, reader_cost: 0.00103, ips: 33.9729 samples/sec | ETA 09:01:12 2022-08-23 19:54:23 [INFO] [TRAIN] epoch: 18, iter: 22150/160000, loss: 0.7548, lr: 0.001044, batch_cost: 0.2122, reader_cost: 0.00130, ips: 37.6990 samples/sec | ETA 08:07:32 2022-08-23 19:54:34 [INFO] [TRAIN] epoch: 18, iter: 22200/160000, loss: 0.7540, lr: 0.001043, batch_cost: 0.2068, reader_cost: 0.00046, ips: 38.6867 samples/sec | ETA 07:54:55 2022-08-23 19:54:45 [INFO] [TRAIN] epoch: 18, iter: 22250/160000, loss: 0.7396, lr: 0.001043, batch_cost: 0.2318, reader_cost: 0.00036, ips: 34.5139 samples/sec | ETA 08:52:09 2022-08-23 19:54:57 [INFO] [TRAIN] epoch: 18, iter: 22300/160000, loss: 0.7903, lr: 0.001043, batch_cost: 0.2319, reader_cost: 0.00066, ips: 34.4990 samples/sec | ETA 08:52:11 2022-08-23 19:55:07 [INFO] [TRAIN] epoch: 18, iter: 22350/160000, loss: 0.7313, lr: 0.001042, batch_cost: 0.2150, reader_cost: 0.00479, ips: 37.2094 samples/sec | ETA 08:13:14 2022-08-23 19:55:18 [INFO] [TRAIN] epoch: 18, iter: 22400/160000, loss: 0.7537, lr: 0.001042, batch_cost: 0.2138, reader_cost: 0.00049, ips: 37.4178 samples/sec | ETA 08:10:19 2022-08-23 19:55:29 [INFO] [TRAIN] epoch: 18, iter: 22450/160000, loss: 0.7901, lr: 0.001041, batch_cost: 0.2128, reader_cost: 0.00194, ips: 37.5876 samples/sec | ETA 08:07:55 2022-08-23 19:55:40 [INFO] [TRAIN] epoch: 18, iter: 22500/160000, loss: 0.7970, lr: 0.001041, batch_cost: 0.2252, reader_cost: 0.00078, ips: 35.5187 samples/sec | ETA 08:36:09 2022-08-23 19:55:53 [INFO] [TRAIN] epoch: 18, iter: 22550/160000, loss: 0.7854, lr: 0.001041, batch_cost: 0.2497, reader_cost: 0.00059, ips: 32.0440 samples/sec | ETA 09:31:55 2022-08-23 19:56:03 [INFO] [TRAIN] epoch: 18, iter: 22600/160000, loss: 0.8110, lr: 0.001040, batch_cost: 0.2153, reader_cost: 0.00047, ips: 37.1571 samples/sec | ETA 08:13:02 2022-08-23 19:56:13 [INFO] [TRAIN] epoch: 18, iter: 22650/160000, loss: 0.8097, lr: 0.001040, batch_cost: 0.2038, reader_cost: 0.00061, ips: 39.2612 samples/sec | ETA 07:46:26 2022-08-23 19:56:23 [INFO] [TRAIN] epoch: 18, iter: 22700/160000, loss: 0.8227, lr: 0.001040, batch_cost: 0.1961, reader_cost: 0.00047, ips: 40.7976 samples/sec | ETA 07:28:43 2022-08-23 19:56:37 [INFO] [TRAIN] epoch: 19, iter: 22750/160000, loss: 0.7814, lr: 0.001039, batch_cost: 0.2815, reader_cost: 0.05400, ips: 28.4157 samples/sec | ETA 10:44:00 2022-08-23 19:56:48 [INFO] [TRAIN] epoch: 19, iter: 22800/160000, loss: 0.7941, lr: 0.001039, batch_cost: 0.2068, reader_cost: 0.00079, ips: 38.6921 samples/sec | ETA 07:52:47 2022-08-23 19:56:56 [INFO] [TRAIN] epoch: 19, iter: 22850/160000, loss: 0.7396, lr: 0.001038, batch_cost: 0.1752, reader_cost: 0.00031, ips: 45.6581 samples/sec | ETA 06:40:30 2022-08-23 19:57:06 [INFO] [TRAIN] epoch: 19, iter: 22900/160000, loss: 0.7270, lr: 0.001038, batch_cost: 0.1877, reader_cost: 0.00061, ips: 42.6132 samples/sec | ETA 07:08:58 2022-08-23 19:57:14 [INFO] [TRAIN] epoch: 19, iter: 22950/160000, loss: 0.7843, lr: 0.001038, batch_cost: 0.1686, reader_cost: 0.00060, ips: 47.4390 samples/sec | ETA 06:25:11 2022-08-23 19:57:22 [INFO] [TRAIN] epoch: 19, iter: 23000/160000, loss: 0.8008, lr: 0.001037, batch_cost: 0.1603, reader_cost: 0.00070, ips: 49.9159 samples/sec | ETA 
06:05:56 2022-08-23 19:57:22 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 172s - batch_cost: 0.1718 - reader cost: 7.0822e-04 2022-08-23 20:00:14 [INFO] [EVAL] #Images: 2000 mIoU: 0.3335 Acc: 0.7552 Kappa: 0.7363 Dice: 0.4622 2022-08-23 20:00:14 [INFO] [EVAL] Class IoU: [0.6655 0.7784 0.9258 0.7209 0.6722 0.7367 0.7609 0.7571 0.4913 0.6411 0.4688 0.5593 0.6843 0.2866 0.2849 0.3914 0.5063 0.419 0.581 0.4073 0.7173 0.434 0.5978 0.4784 0.3001 0.4764 0.4665 0.3508 0.3599 0.2672 0.225 0.5 0.235 0.3203 0.3784 0.3943 0.405 0.4653 0.2242 0.3325 0.1626 0.1277 0.3397 0.2456 0.3017 0.2648 0.1815 0.4594 0.6695 0.5275 0.4778 0.3694 0.1829 0.1619 0.621 0.4211 0.8624 0.1742 0.432 0.2639 0.1111 0.4481 0.2202 0.1303 0.3203 0.6584 0.2242 0.3505 0.033 0.3168 0.4519 0.4671 0.3644 0.2319 0.449 0.3505 0.549 0.2284 0.3866 0.0893 0.5884 0.3256 0.3954 0.0275 0.2476 0.5352 0.0964 0.0967 0.2676 0.4971 0.3219 0.0283 0.1525 0.0445 0.0031 0.0231 0.1991 0.1119 0.29 0.3108 0.1163 0.0579 0.1167 0.8533 0.1936 0.5697 0.1675 0.5372 0.1447 0.3096 0.0743 0.3962 0.1678 0.5538 0.6248 0.0006 0.3809 0.6331 0.0834 0.2557 0.5062 0.0409 0.3214 0.0449 0.2682 0.1941 0.4832 0.4426 0. 0.3702 0.5436 0.0106 0.3 0.24 0.1377 0.1678 0.1172 0.0148 0.1342 0.3013 0.142 0. 0.2275 0. 0.2937 0.0004 0.3184 0.0109 0.1155 0.1812] 2022-08-23 20:00:14 [INFO] [EVAL] Class Precision: [0.7538 0.8697 0.9711 0.8259 0.757 0.8809 0.8746 0.816 0.6723 0.7723 0.653 0.6961 0.798 0.4695 0.4987 0.5756 0.6812 0.6317 0.7466 0.6089 0.7725 0.6097 0.78 0.5975 0.5206 0.5314 0.4894 0.7859 0.6703 0.4876 0.4943 0.6165 0.5477 0.4763 0.5001 0.5365 0.6576 0.8607 0.3966 0.5405 0.2823 0.3719 0.667 0.4914 0.422 0.631 0.4407 0.7161 0.8262 0.685 0.6101 0.4977 0.398 0.576 0.6594 0.5421 0.8923 0.7639 0.7136 0.4291 0.4042 0.6547 0.2656 0.6104 0.3474 0.7447 0.338 0.4927 0.5756 0.5585 0.5727 0.665 0.5667 0.2628 0.6818 0.5084 0.7397 0.6029 0.5536 0.2916 0.7077 0.6172 0.6996 0.1156 0.4066 0.6999 0.3373 0.39 0.5966 0.7113 0.4188 0.0331 0.2591 0.3345 0.005 0.0827 0.4336 0.3179 0.5087 0.6944 0.5669 0.0692 0.3939 0.9663 0.9223 0.6854 0.2424 0.6919 0.2515 0.3435 0.3893 0.7171 0.6587 0.686 0.6275 0.1648 0.6567 0.7236 0.1101 0.5803 0.6931 0.4127 0.7981 0.8576 0.5567 0.5438 0.8357 0.5601 0. 0.8023 0.6497 0.1239 0.7286 0.7634 0.6299 0.3788 0.3512 0.1283 0.3141 0.7098 0.5421 0. 0.7343 0. 0.8708 0.0008 0.8967 0.4191 0.3659 0.7553] 2022-08-23 20:00:14 [INFO] [EVAL] Class Recall: [0.8503 0.8811 0.952 0.8501 0.8572 0.8182 0.8541 0.913 0.646 0.7904 0.6244 0.74 0.8276 0.4239 0.3993 0.5502 0.6635 0.5545 0.7236 0.5517 0.9095 0.6009 0.719 0.706 0.4148 0.8215 0.9091 0.3879 0.4373 0.3715 0.2923 0.7258 0.2916 0.4944 0.6087 0.598 0.5132 0.5031 0.3404 0.4634 0.2772 0.1629 0.4091 0.3293 0.5143 0.3134 0.2359 0.5618 0.7792 0.6964 0.6879 0.5889 0.2528 0.1838 0.9143 0.6535 0.9625 0.1841 0.5227 0.4068 0.1328 0.5867 0.5632 0.1422 0.8038 0.8503 0.3996 0.5484 0.0338 0.4227 0.6819 0.6108 0.5052 0.6635 0.568 0.5302 0.6804 0.2689 0.5617 0.114 0.7773 0.408 0.4763 0.0348 0.3878 0.6946 0.1189 0.1139 0.3267 0.6227 0.5818 0.1629 0.2703 0.0488 0.0081 0.0311 0.269 0.1472 0.4029 0.3601 0.1276 0.2618 0.1423 0.8794 0.1968 0.7714 0.3515 0.7061 0.254 0.7584 0.0841 0.4696 0.1838 0.7418 0.9932 0.0006 0.4756 0.835 0.2565 0.3138 0.6525 0.0434 0.3498 0.0452 0.3411 0.2319 0.5339 0.6784 0. 0.4074 0.769 0.0114 0.3377 0.2593 0.1498 0.2316 0.1495 0.0164 0.1899 0.3437 0.1613 0. 0.2479 0. 
0.3071 0.0007 0.3305 0.011 0.1443 0.1924] 2022-08-23 20:00:15 [INFO] [EVAL] The model with the best validation mIoU (0.3335) was saved at iter 23000. 2022-08-23 20:00:24 [INFO] [TRAIN] epoch: 19, iter: 23050/160000, loss: 0.7512, lr: 0.001037, batch_cost: 0.1930, reader_cost: 0.00534, ips: 41.4519 samples/sec | ETA 07:20:30 2022-08-23 20:00:36 [INFO] [TRAIN] epoch: 19, iter: 23100/160000, loss: 0.7923, lr: 0.001036, batch_cost: 0.2306, reader_cost: 0.00130, ips: 34.6898 samples/sec | ETA 08:46:11 2022-08-23 20:00:45 [INFO] [TRAIN] epoch: 19, iter: 23150/160000, loss: 0.7998, lr: 0.001036, batch_cost: 0.1831, reader_cost: 0.00296, ips: 43.7028 samples/sec | ETA 06:57:30 2022-08-23 20:00:57 [INFO] [TRAIN] epoch: 19, iter: 23200/160000, loss: 0.8067, lr: 0.001036, batch_cost: 0.2416, reader_cost: 0.00053, ips: 33.1074 samples/sec | ETA 09:10:56 2022-08-23 20:01:09 [INFO] [TRAIN] epoch: 19, iter: 23250/160000, loss: 0.7295, lr: 0.001035, batch_cost: 0.2320, reader_cost: 0.00059, ips: 34.4815 samples/sec | ETA 08:48:47 2022-08-23 20:01:18 [INFO] [TRAIN] epoch: 19, iter: 23300/160000, loss: 0.7938, lr: 0.001035, batch_cost: 0.1928, reader_cost: 0.00381, ips: 41.4847 samples/sec | ETA 07:19:21 2022-08-23 20:01:29 [INFO] [TRAIN] epoch: 19, iter: 23350/160000, loss: 0.7384, lr: 0.001035, batch_cost: 0.2072, reader_cost: 0.00066, ips: 38.6020 samples/sec | ETA 07:51:59 2022-08-23 20:01:40 [INFO] [TRAIN] epoch: 19, iter: 23400/160000, loss: 0.7243, lr: 0.001034, batch_cost: 0.2299, reader_cost: 0.00053, ips: 34.7963 samples/sec | ETA 08:43:25 2022-08-23 20:01:51 [INFO] [TRAIN] epoch: 19, iter: 23450/160000, loss: 0.7835, lr: 0.001034, batch_cost: 0.2184, reader_cost: 0.00063, ips: 36.6218 samples/sec | ETA 08:17:09 2022-08-23 20:02:02 [INFO] [TRAIN] epoch: 19, iter: 23500/160000, loss: 0.7421, lr: 0.001033, batch_cost: 0.2067, reader_cost: 0.00034, ips: 38.7115 samples/sec | ETA 07:50:08 2022-08-23 20:02:12 [INFO] [TRAIN] epoch: 19, iter: 23550/160000, loss: 0.8217, lr: 0.001033, batch_cost: 0.2062, reader_cost: 0.00035, ips: 38.8025 samples/sec | ETA 07:48:52 2022-08-23 20:02:22 [INFO] [TRAIN] epoch: 19, iter: 23600/160000, loss: 0.8026, lr: 0.001033, batch_cost: 0.1955, reader_cost: 0.00095, ips: 40.9295 samples/sec | ETA 07:24:20 2022-08-23 20:02:32 [INFO] [TRAIN] epoch: 19, iter: 23650/160000, loss: 0.7795, lr: 0.001032, batch_cost: 0.2121, reader_cost: 0.00578, ips: 37.7258 samples/sec | ETA 08:01:53 2022-08-23 20:02:42 [INFO] [TRAIN] epoch: 19, iter: 23700/160000, loss: 0.7690, lr: 0.001032, batch_cost: 0.1935, reader_cost: 0.00212, ips: 41.3476 samples/sec | ETA 07:19:31 2022-08-23 20:02:51 [INFO] [TRAIN] epoch: 19, iter: 23750/160000, loss: 0.7213, lr: 0.001032, batch_cost: 0.1869, reader_cost: 0.00077, ips: 42.8055 samples/sec | ETA 07:04:24 2022-08-23 20:03:00 [INFO] [TRAIN] epoch: 19, iter: 23800/160000, loss: 0.7094, lr: 0.001031, batch_cost: 0.1850, reader_cost: 0.00085, ips: 43.2346 samples/sec | ETA 07:00:02 2022-08-23 20:03:10 [INFO] [TRAIN] epoch: 19, iter: 23850/160000, loss: 0.7715, lr: 0.001031, batch_cost: 0.1876, reader_cost: 0.00034, ips: 42.6500 samples/sec | ETA 07:05:38 2022-08-23 20:03:19 [INFO] [TRAIN] epoch: 19, iter: 23900/160000, loss: 0.7298, lr: 0.001030, batch_cost: 0.1788, reader_cost: 0.00102, ips: 44.7383 samples/sec | ETA 06:45:37 2022-08-23 20:03:28 [INFO] [TRAIN] epoch: 19, iter: 23950/160000, loss: 0.7715, lr: 0.001030, batch_cost: 0.1848, reader_cost: 0.00076, ips: 43.2971 samples/sec | ETA 06:58:57 2022-08-23 20:03:41 [INFO] [TRAIN] epoch: 20, iter: 
24000/160000, loss: 0.7709, lr: 0.001030, batch_cost: 0.2550, reader_cost: 0.09260, ips: 31.3753 samples/sec | ETA 09:37:56 2022-08-23 20:03:41 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 165s - batch_cost: 0.1653 - reader cost: 6.9699e-04 2022-08-23 20:06:26 [INFO] [EVAL] #Images: 2000 mIoU: 0.3339 Acc: 0.7544 Kappa: 0.7362 Dice: 0.4646 2022-08-23 20:06:26 [INFO] [EVAL] Class IoU: [0.666 0.7745 0.9297 0.7196 0.6831 0.7526 0.7624 0.7471 0.4895 0.6587 0.4543 0.5378 0.6876 0.2597 0.2613 0.3927 0.5202 0.4149 0.59 0.4106 0.7241 0.3796 0.6226 0.4784 0.3415 0.3934 0.4234 0.3367 0.4172 0.3034 0.2268 0.4583 0.2653 0.3061 0.351 0.3908 0.4223 0.4924 0.1867 0.3312 0.1901 0.1319 0.3307 0.2392 0.2773 0.2046 0.221 0.3893 0.6959 0.5086 0.4865 0.4428 0.1929 0.0515 0.6456 0.4309 0.85 0.2998 0.4135 0.2552 0.0947 0.3067 0.2709 0.2143 0.3953 0.6348 0.1845 0.4043 0.0379 0.3208 0.3748 0.5268 0.4125 0.2233 0.4374 0.3512 0.5864 0.1999 0.2636 0.2339 0.5918 0.3373 0.2645 0.0202 0.2974 0.4972 0.0859 0.078 0.2948 0.4688 0.3747 0. 0.1947 0.054 0.0035 0.0245 0.1462 0.1092 0.2538 0.3524 0.2082 0.0768 0.1407 0.6424 0.1548 0.3739 0.2307 0.4881 0.1508 0.3349 0.054 0.4193 0.1064 0.6007 0.7369 0. 0.4263 0.6111 0.0622 0.3436 0.4856 0.0012 0.3269 0.2115 0.2467 0.1583 0.3938 0.4317 0.0451 0.318 0.5319 0.0036 0.2518 0.2734 0.1818 0.1583 0.1349 0.0131 0.1297 0.2732 0.1629 0. 0.3408 0.1093 0.2901 0.0055 0.3922 0.0166 0.1025 0.1876] 2022-08-23 20:06:26 [INFO] [EVAL] Class Precision: [0.7935 0.856 0.9706 0.8257 0.7624 0.8261 0.8868 0.7933 0.6146 0.7607 0.6486 0.6766 0.7813 0.4424 0.5115 0.5795 0.6813 0.6622 0.7295 0.6008 0.8114 0.6654 0.7659 0.6011 0.5172 0.5228 0.4709 0.7422 0.6274 0.4164 0.4067 0.5341 0.4828 0.3827 0.5131 0.4652 0.6184 0.7465 0.3513 0.6149 0.281 0.3898 0.6211 0.5039 0.4303 0.441 0.5176 0.4851 0.8467 0.6473 0.6809 0.7085 0.4459 0.5054 0.6925 0.5678 0.8866 0.5872 0.4808 0.3658 0.1871 0.5 0.4014 0.5185 0.4718 0.717 0.2815 0.5519 0.7287 0.5191 0.6662 0.6595 0.6357 0.3649 0.6384 0.5124 0.7225 0.5888 0.6243 0.5592 0.7523 0.7184 0.8105 0.0303 0.3559 0.7759 0.2924 0.386 0.765 0.5988 0.5137 0. 0.3601 0.2888 0.032 0.0829 0.3309 0.3404 0.4569 0.568 0.6898 0.1052 0.4957 0.8478 0.8874 0.3962 0.4881 0.5977 0.2565 0.4506 0.3121 0.6606 0.6458 0.62 0.7501 0.004 0.6339 0.7137 0.4156 0.7206 0.6407 0.022 0.6469 0.4856 0.7037 0.6188 0.7777 0.6322 0.0827 0.4603 0.6313 0.1668 0.7931 0.5976 0.5592 0.2901 0.3184 0.1308 0.3003 0.7505 0.6057 0. 0.5855 0.2941 0.885 0.0259 0.8028 0.2075 0.4531 0.8588] 2022-08-23 20:06:26 [INFO] [EVAL] Class Recall: [0.8056 0.8905 0.9567 0.8485 0.8679 0.8943 0.8446 0.9276 0.7062 0.8309 0.6027 0.7239 0.8516 0.3861 0.3482 0.5491 0.6875 0.5263 0.7552 0.5648 0.8706 0.4692 0.7689 0.7009 0.5012 0.6139 0.8076 0.3813 0.5546 0.528 0.339 0.7636 0.3706 0.6044 0.5264 0.7094 0.5711 0.5912 0.2849 0.4178 0.3702 0.1662 0.4144 0.3129 0.4381 0.2763 0.2783 0.6635 0.7962 0.7035 0.6302 0.5415 0.2538 0.0542 0.905 0.6411 0.9537 0.3799 0.7469 0.4578 0.1608 0.4425 0.4545 0.2676 0.7092 0.847 0.3486 0.602 0.0385 0.4564 0.4614 0.7235 0.5403 0.3653 0.5814 0.5274 0.7569 0.2324 0.3133 0.2867 0.735 0.3887 0.282 0.0573 0.6442 0.5806 0.1084 0.089 0.3242 0.6836 0.5807 0. 0.2977 0.0623 0.004 0.0337 0.2075 0.1385 0.3634 0.4815 0.2297 0.2214 0.1642 0.7262 0.1579 0.8696 0.3043 0.7269 0.268 0.566 0.0612 0.5344 0.113 0.9505 0.9766 0. 0.5656 0.8095 0.0682 0.3964 0.6673 0.0013 0.3979 0.2726 0.2754 0.1754 0.4437 0.5765 0.0901 0.5071 0.7716 0.0037 0.2695 0.335 0.2122 0.2583 0.1897 0.0143 0.1859 0.3005 0.1823 0. 
0.4491 0.1482 0.3015 0.007 0.434 0.0177 0.1169 0.1935] 2022-08-23 20:06:27 [INFO] [EVAL] The model with the best validation mIoU (0.3339) was saved at iter 24000. 2022-08-23 20:06:37 [INFO] [TRAIN] epoch: 20, iter: 24050/160000, loss: 0.7153, lr: 0.001029, batch_cost: 0.2123, reader_cost: 0.00341, ips: 37.6783 samples/sec | ETA 08:01:05 2022-08-23 20:06:47 [INFO] [TRAIN] epoch: 20, iter: 24100/160000, loss: 0.7437, lr: 0.001029, batch_cost: 0.2017, reader_cost: 0.00073, ips: 39.6666 samples/sec | ETA 07:36:48 2022-08-23 20:06:57 [INFO] [TRAIN] epoch: 20, iter: 24150/160000, loss: 0.7467, lr: 0.001029, batch_cost: 0.1936, reader_cost: 0.00049, ips: 41.3223 samples/sec | ETA 07:18:20 2022-08-23 20:07:07 [INFO] [TRAIN] epoch: 20, iter: 24200/160000, loss: 0.7859, lr: 0.001028, batch_cost: 0.2084, reader_cost: 0.01137, ips: 38.3807 samples/sec | ETA 07:51:45 2022-08-23 20:07:18 [INFO] [TRAIN] epoch: 20, iter: 24250/160000, loss: 0.7689, lr: 0.001028, batch_cost: 0.2119, reader_cost: 0.00057, ips: 37.7529 samples/sec | ETA 07:59:25 2022-08-23 20:07:29 [INFO] [TRAIN] epoch: 20, iter: 24300/160000, loss: 0.8034, lr: 0.001027, batch_cost: 0.2197, reader_cost: 0.00094, ips: 36.4091 samples/sec | ETA 08:16:56 2022-08-23 20:07:40 [INFO] [TRAIN] epoch: 20, iter: 24350/160000, loss: 0.7479, lr: 0.001027, batch_cost: 0.2268, reader_cost: 0.00044, ips: 35.2740 samples/sec | ETA 08:32:44 2022-08-23 20:07:52 [INFO] [TRAIN] epoch: 20, iter: 24400/160000, loss: 0.7631, lr: 0.001027, batch_cost: 0.2264, reader_cost: 0.00095, ips: 35.3318 samples/sec | ETA 08:31:43 2022-08-23 20:08:02 [INFO] [TRAIN] epoch: 20, iter: 24450/160000, loss: 0.6981, lr: 0.001026, batch_cost: 0.2102, reader_cost: 0.00055, ips: 38.0603 samples/sec | ETA 07:54:51 2022-08-23 20:08:13 [INFO] [TRAIN] epoch: 20, iter: 24500/160000, loss: 0.7714, lr: 0.001026, batch_cost: 0.2230, reader_cost: 0.00290, ips: 35.8713 samples/sec | ETA 08:23:39 2022-08-23 20:08:24 [INFO] [TRAIN] epoch: 20, iter: 24550/160000, loss: 0.6920, lr: 0.001025, batch_cost: 0.2147, reader_cost: 0.00185, ips: 37.2569 samples/sec | ETA 08:04:44 2022-08-23 20:08:33 [INFO] [TRAIN] epoch: 20, iter: 24600/160000, loss: 0.7404, lr: 0.001025, batch_cost: 0.1792, reader_cost: 0.00070, ips: 44.6306 samples/sec | ETA 06:44:30 2022-08-23 20:08:42 [INFO] [TRAIN] epoch: 20, iter: 24650/160000, loss: 0.7342, lr: 0.001025, batch_cost: 0.1710, reader_cost: 0.00061, ips: 46.7868 samples/sec | ETA 06:25:43 2022-08-23 20:08:50 [INFO] [TRAIN] epoch: 20, iter: 24700/160000, loss: 0.7499, lr: 0.001024, batch_cost: 0.1685, reader_cost: 0.00046, ips: 47.4807 samples/sec | ETA 06:19:56 2022-08-23 20:08:58 [INFO] [TRAIN] epoch: 20, iter: 24750/160000, loss: 0.7846, lr: 0.001024, batch_cost: 0.1672, reader_cost: 0.00089, ips: 47.8531 samples/sec | ETA 06:16:50 2022-08-23 20:09:06 [INFO] [TRAIN] epoch: 20, iter: 24800/160000, loss: 0.7762, lr: 0.001024, batch_cost: 0.1542, reader_cost: 0.00111, ips: 51.8879 samples/sec | ETA 05:47:24 2022-08-23 20:09:15 [INFO] [TRAIN] epoch: 20, iter: 24850/160000, loss: 0.7036, lr: 0.001023, batch_cost: 0.1862, reader_cost: 0.00064, ips: 42.9607 samples/sec | ETA 06:59:27 2022-08-23 20:09:24 [INFO] [TRAIN] epoch: 20, iter: 24900/160000, loss: 0.7520, lr: 0.001023, batch_cost: 0.1697, reader_cost: 0.00069, ips: 47.1383 samples/sec | ETA 06:22:08 2022-08-23 20:09:33 [INFO] [TRAIN] epoch: 20, iter: 24950/160000, loss: 0.8364, lr: 0.001022, batch_cost: 0.1739, reader_cost: 0.00074, ips: 46.0122 samples/sec | ETA 06:31:20 2022-08-23 20:09:42 [INFO] [TRAIN] epoch: 20, 
iter: 25000/160000, loss: 0.7599, lr: 0.001022, batch_cost: 0.1905, reader_cost: 0.00072, ips: 41.9859 samples/sec | ETA 07:08:42 2022-08-23 20:09:42 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 156s - batch_cost: 0.1562 - reader cost: 6.1964e-04 2022-08-23 20:12:19 [INFO] [EVAL] #Images: 2000 mIoU: 0.3268 Acc: 0.7532 Kappa: 0.7340 Dice: 0.4564 2022-08-23 20:12:19 [INFO] [EVAL] Class IoU: [0.6578 0.7679 0.9288 0.7231 0.6721 0.7486 0.7501 0.7561 0.5063 0.6281 0.4383 0.5111 0.6816 0.2984 0.2318 0.3935 0.537 0.3827 0.5963 0.4049 0.7079 0.4205 0.6026 0.4826 0.3383 0.3779 0.408 0.3699 0.3836 0.1932 0.207 0.5168 0.2331 0.3076 0.283 0.3164 0.4115 0.5043 0.2094 0.3506 0.0971 0.1312 0.353 0.2582 0.2946 0.1383 0.3097 0.4138 0.6222 0.4681 0.5052 0.4376 0.1872 0.1885 0.6786 0.4018 0.8842 0.3315 0.4171 0.281 0.1235 0.4058 0.2906 0.2344 0.4051 0.6475 0.2377 0.3935 0.082 0.3591 0.436 0.4865 0.4426 0.1945 0.3782 0.2941 0.4914 0.2199 0.4625 0.3157 0.5628 0.3307 0.2891 0.0248 0.0406 0.5231 0.0896 0.0963 0.2117 0.4934 0.3184 0.0246 0.1362 0.0572 0.1356 0.0337 0.147 0.1081 0.2682 0.2324 0.1116 0.0613 0.199 0.5289 0.0538 0.45 0.1449 0.5019 0.1421 0.3603 0.0564 0.2918 0.1664 0.6062 0.7184 0.0034 0.353 0.6846 0.1486 0.1373 0.4678 0.0178 0.2229 0.1743 0.2334 0.1874 0.4354 0.4064 0. 0.2722 0.4218 0.0061 0.2706 0.2208 0.1211 0.1579 0.1177 0.0068 0.1346 0.3402 0.0935 0. 0.2776 0.3091 0.1412 0.0006 0.3581 0.0249 0.1058 0.1172] 2022-08-23 20:12:19 [INFO] [EVAL] Class Precision: [0.7508 0.8454 0.9705 0.8385 0.7426 0.8631 0.8367 0.8037 0.6562 0.726 0.7188 0.7326 0.7519 0.5378 0.5077 0.5778 0.6339 0.7159 0.7634 0.6152 0.7591 0.6083 0.8092 0.6369 0.4502 0.5829 0.5169 0.6317 0.6303 0.4766 0.4992 0.6844 0.5251 0.4274 0.4 0.4409 0.6531 0.7269 0.4215 0.5558 0.2665 0.3287 0.6972 0.4466 0.389 0.4209 0.5696 0.5142 0.672 0.5564 0.5984 0.7076 0.4609 0.719 0.7414 0.6008 0.9522 0.6208 0.7042 0.6418 0.2548 0.4924 0.4047 0.4867 0.4759 0.8293 0.349 0.6077 0.3057 0.6118 0.632 0.739 0.5935 0.2262 0.6112 0.4645 0.7573 0.5425 0.5063 0.5912 0.678 0.6 0.8016 0.0718 0.105 0.7203 0.4278 0.3438 0.2261 0.6438 0.3879 0.0308 0.5269 0.2838 0.233 0.2275 0.4665 0.42 0.662 0.6152 0.6503 0.1166 0.4983 0.8843 0.6086 0.5299 0.3284 0.6179 0.2821 0.4662 0.5259 0.6849 0.5019 0.6513 0.7456 0.1091 0.608 0.7562 0.3425 0.6886 0.5797 0.3331 0.3321 0.6175 0.7625 0.4923 0.9054 0.5871 0. 0.4761 0.7456 0.4684 0.7916 0.5597 0.6695 0.2904 0.356 0.1342 0.3291 0.6939 0.3975 0. 0.5558 0.4394 0.865 0.0139 0.8366 0.2664 0.3521 0.8521] 2022-08-23 20:12:19 [INFO] [EVAL] Class Recall: [0.8414 0.8933 0.9557 0.8401 0.8762 0.8494 0.8788 0.9274 0.6892 0.8233 0.529 0.6283 0.8794 0.4013 0.2989 0.5524 0.7784 0.4512 0.7315 0.5422 0.913 0.5766 0.7024 0.6658 0.5764 0.5179 0.6596 0.4716 0.4949 0.2452 0.2612 0.6786 0.2954 0.5233 0.4918 0.5284 0.5266 0.6222 0.2938 0.4871 0.1325 0.1793 0.4169 0.3798 0.5484 0.1708 0.4043 0.6794 0.8936 0.7468 0.7645 0.5342 0.2397 0.2035 0.889 0.5481 0.9253 0.4157 0.5057 0.3333 0.1934 0.6975 0.5076 0.3113 0.7313 0.747 0.4271 0.5274 0.1007 0.4651 0.5844 0.5874 0.6351 0.5807 0.4981 0.445 0.5833 0.27 0.8425 0.4039 0.7681 0.4242 0.3114 0.0366 0.0621 0.6564 0.1018 0.118 0.7698 0.6787 0.6396 0.109 0.1551 0.0669 0.2451 0.038 0.1767 0.1271 0.3107 0.272 0.1188 0.1144 0.2489 0.5681 0.0557 0.7491 0.2059 0.7277 0.2225 0.6134 0.0595 0.337 0.1993 0.8975 0.9518 0.0035 0.457 0.8785 0.208 0.1464 0.7079 0.0185 0.404 0.1954 0.2517 0.2323 0.4562 0.5691 0. 0.3886 0.4927 0.0062 0.2914 0.2672 0.1288 0.2571 0.1495 0.0071 0.1856 0.4003 0.109 0. 
0.3568 0.5104 0.1444 0.0006 0.385 0.0268 0.1313 0.1196] 2022-08-23 20:12:19 [INFO] [EVAL] The model with the best validation mIoU (0.3339) was saved at iter 24000. 2022-08-23 20:12:29 [INFO] [TRAIN] epoch: 20, iter: 25050/160000, loss: 0.6604, lr: 0.001022, batch_cost: 0.2011, reader_cost: 0.00285, ips: 39.7714 samples/sec | ETA 07:32:25 2022-08-23 20:12:40 [INFO] [TRAIN] epoch: 20, iter: 25100/160000, loss: 0.7566, lr: 0.001021, batch_cost: 0.2272, reader_cost: 0.00130, ips: 35.2184 samples/sec | ETA 08:30:43 2022-08-23 20:12:51 [INFO] [TRAIN] epoch: 20, iter: 25150/160000, loss: 0.7395, lr: 0.001021, batch_cost: 0.2041, reader_cost: 0.00189, ips: 39.2040 samples/sec | ETA 07:38:37 2022-08-23 20:13:01 [INFO] [TRAIN] epoch: 20, iter: 25200/160000, loss: 0.7246, lr: 0.001021, batch_cost: 0.1971, reader_cost: 0.00719, ips: 40.5900 samples/sec | ETA 07:22:48 2022-08-23 20:13:10 [INFO] [TRAIN] epoch: 20, iter: 25250/160000, loss: 0.7925, lr: 0.001020, batch_cost: 0.1799, reader_cost: 0.00363, ips: 44.4587 samples/sec | ETA 06:44:07 2022-08-23 20:13:26 [INFO] [TRAIN] epoch: 21, iter: 25300/160000, loss: 0.7740, lr: 0.001020, batch_cost: 0.3200, reader_cost: 0.10930, ips: 25.0031 samples/sec | ETA 11:58:18 2022-08-23 20:13:37 [INFO] [TRAIN] epoch: 21, iter: 25350/160000, loss: 0.7230, lr: 0.001019, batch_cost: 0.2305, reader_cost: 0.00049, ips: 34.7065 samples/sec | ETA 08:37:17 2022-08-23 20:13:49 [INFO] [TRAIN] epoch: 21, iter: 25400/160000, loss: 0.7231, lr: 0.001019, batch_cost: 0.2344, reader_cost: 0.00086, ips: 34.1365 samples/sec | ETA 08:45:43 2022-08-23 20:14:01 [INFO] [TRAIN] epoch: 21, iter: 25450/160000, loss: 0.7142, lr: 0.001019, batch_cost: 0.2420, reader_cost: 0.00056, ips: 33.0595 samples/sec | ETA 09:02:39 2022-08-23 20:14:11 [INFO] [TRAIN] epoch: 21, iter: 25500/160000, loss: 0.7169, lr: 0.001018, batch_cost: 0.2122, reader_cost: 0.00047, ips: 37.6965 samples/sec | ETA 07:55:43 2022-08-23 20:14:22 [INFO] [TRAIN] epoch: 21, iter: 25550/160000, loss: 0.7408, lr: 0.001018, batch_cost: 0.2095, reader_cost: 0.00040, ips: 38.1919 samples/sec | ETA 07:49:23 2022-08-23 20:14:32 [INFO] [TRAIN] epoch: 21, iter: 25600/160000, loss: 0.7367, lr: 0.001018, batch_cost: 0.1959, reader_cost: 0.00035, ips: 40.8456 samples/sec | ETA 07:18:43 2022-08-23 20:14:43 [INFO] [TRAIN] epoch: 21, iter: 25650/160000, loss: 0.7497, lr: 0.001017, batch_cost: 0.2205, reader_cost: 0.00056, ips: 36.2802 samples/sec | ETA 08:13:44 2022-08-23 20:14:52 [INFO] [TRAIN] epoch: 21, iter: 25700/160000, loss: 0.7170, lr: 0.001017, batch_cost: 0.1867, reader_cost: 0.00085, ips: 42.8413 samples/sec | ETA 06:57:58 2022-08-23 20:15:01 [INFO] [TRAIN] epoch: 21, iter: 25750/160000, loss: 0.7122, lr: 0.001016, batch_cost: 0.1787, reader_cost: 0.00077, ips: 44.7750 samples/sec | ETA 06:39:46 2022-08-23 20:15:11 [INFO] [TRAIN] epoch: 21, iter: 25800/160000, loss: 0.7400, lr: 0.001016, batch_cost: 0.1954, reader_cost: 0.00069, ips: 40.9490 samples/sec | ETA 07:16:57 2022-08-23 20:15:19 [INFO] [TRAIN] epoch: 21, iter: 25850/160000, loss: 0.7358, lr: 0.001016, batch_cost: 0.1641, reader_cost: 0.00040, ips: 48.7579 samples/sec | ETA 06:06:50 2022-08-23 20:15:28 [INFO] [TRAIN] epoch: 21, iter: 25900/160000, loss: 0.7667, lr: 0.001015, batch_cost: 0.1782, reader_cost: 0.00103, ips: 44.9025 samples/sec | ETA 06:38:11 2022-08-23 20:15:37 [INFO] [TRAIN] epoch: 21, iter: 25950/160000, loss: 0.7556, lr: 0.001015, batch_cost: 0.1735, reader_cost: 0.00076, ips: 46.1129 samples/sec | ETA 06:27:35 2022-08-23 20:15:45 [INFO] [TRAIN] epoch: 21, 
iter: 26000/160000, loss: 0.7113, lr: 0.001015, batch_cost: 0.1775, reader_cost: 0.00072, ips: 45.0765 samples/sec | ETA 06:36:21 2022-08-23 20:15:45 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 183s - batch_cost: 0.1828 - reader cost: 7.8301e-04 2022-08-23 20:18:49 [INFO] [EVAL] #Images: 2000 mIoU: 0.3326 Acc: 0.7534 Kappa: 0.7343 Dice: 0.4639 2022-08-23 20:18:49 [INFO] [EVAL] Class IoU: [0.6734 0.7687 0.9272 0.7136 0.6783 0.7428 0.7475 0.7698 0.5067 0.56 0.4545 0.5027 0.6942 0.2557 0.2612 0.4093 0.514 0.4085 0.5893 0.4181 0.7383 0.4481 0.6006 0.4594 0.3127 0.3452 0.5223 0.366 0.413 0.2403 0.1682 0.478 0.2698 0.3157 0.4069 0.351 0.4107 0.4967 0.263 0.2867 0.1642 0.1273 0.3435 0.245 0.2872 0.1382 0.2006 0.456 0.6299 0.5029 0.4125 0.3521 0.1406 0.2469 0.65 0.4104 0.8545 0.3467 0.4083 0.3242 0.1406 0.1994 0.2374 0.243 0.4124 0.6861 0.2158 0.3894 0.1014 0.3396 0.3703 0.4984 0.4059 0.2589 0.4235 0.3071 0.6128 0.2564 0.288 0.2313 0.5783 0.3391 0.2952 0.0967 0.3148 0.4969 0.052 0.0945 0.2353 0.476 0.3661 0.0362 0.1995 0.0745 0.0056 0.0306 0.1785 0.1103 0.2074 0.262 0.0231 0.0599 0.1362 0.5225 0.1919 0.3767 0.1887 0.55 0.127 0.3946 0.1423 0.1901 0.11 0.6149 0.7385 0.0059 0.3831 0.6556 0.1102 0.2808 0.4771 0.0311 0.341 0.1134 0.2519 0.1509 0.4996 0.4513 0.008 0.3067 0.5 0.0149 0.3382 0.2921 0.1555 0.159 0.1376 0.0291 0.1539 0.3028 0.2857 0.0001 0.2397 0.0314 0.3287 0. 0.3926 0.014 0.0903 0.19 ] 2022-08-23 20:18:49 [INFO] [EVAL] Class Precision: [0.7723 0.8403 0.9698 0.7965 0.7686 0.8686 0.8336 0.8483 0.6842 0.7564 0.6433 0.7045 0.7789 0.4752 0.4756 0.5681 0.6395 0.6843 0.7444 0.6507 0.8123 0.7195 0.7463 0.5322 0.4724 0.5589 0.5626 0.7522 0.6904 0.3207 0.4731 0.587 0.529 0.4209 0.5413 0.5479 0.5576 0.8109 0.4148 0.5964 0.2877 0.3636 0.6608 0.4793 0.4546 0.4957 0.4771 0.6271 0.7857 0.6386 0.7399 0.4532 0.2192 0.5433 0.7034 0.63 0.9045 0.618 0.7378 0.5567 0.2279 0.4336 0.288 0.4716 0.5038 0.8673 0.3265 0.5319 0.2852 0.5975 0.7306 0.6547 0.5998 0.3114 0.6079 0.5332 0.7345 0.5175 0.5383 0.4628 0.683 0.6385 0.7728 0.2299 0.3667 0.7658 0.7257 0.3162 0.4362 0.6107 0.4718 0.0465 0.3565 0.3222 0.023 0.0721 0.2366 0.2549 0.6526 0.8234 0.4232 0.0882 0.5553 0.8499 0.7951 0.4636 0.2652 0.7192 0.2099 0.4592 0.3264 0.5018 0.6977 0.6841 0.7651 0.137 0.5313 0.7788 0.1869 0.5492 0.7482 0.2113 0.8477 0.6525 0.6965 0.5706 0.7882 0.5899 0.0468 0.6822 0.7661 0.209 0.6416 0.4699 0.7543 0.3453 0.3038 0.0831 0.462 0.6802 0.7609 0.0002 0.6828 0.3382 0.6842 0. 
0.8044 0.5348 0.3901 0.8126] 2022-08-23 20:18:49 [INFO] [EVAL] Class Recall: [0.8401 0.9002 0.9548 0.8726 0.8525 0.8369 0.8785 0.8926 0.6614 0.6831 0.6077 0.637 0.8645 0.3562 0.3669 0.5942 0.7238 0.5034 0.7388 0.539 0.8901 0.543 0.7546 0.7705 0.4805 0.4745 0.8796 0.4162 0.5069 0.4894 0.207 0.7201 0.355 0.5582 0.6209 0.4942 0.6092 0.5617 0.4183 0.3557 0.2765 0.1638 0.417 0.3339 0.4382 0.1607 0.2571 0.6257 0.7606 0.703 0.4825 0.612 0.2818 0.3115 0.8955 0.5408 0.9393 0.4412 0.4776 0.4371 0.2687 0.2697 0.5745 0.334 0.6945 0.7666 0.3889 0.5924 0.1359 0.4403 0.4289 0.6761 0.5567 0.6058 0.5826 0.4199 0.7871 0.3369 0.3825 0.3163 0.7904 0.4196 0.3233 0.1429 0.6899 0.5859 0.053 0.1188 0.3381 0.6833 0.6203 0.1398 0.3118 0.0883 0.0073 0.0505 0.4208 0.1628 0.2331 0.2776 0.0238 0.1577 0.1529 0.5757 0.2019 0.6678 0.3956 0.7004 0.2433 0.737 0.2014 0.2344 0.1155 0.8587 0.9551 0.0061 0.5787 0.8057 0.2117 0.3649 0.5684 0.0352 0.3633 0.1207 0.2829 0.1702 0.577 0.6575 0.0096 0.3579 0.5901 0.0158 0.417 0.4357 0.1638 0.2276 0.201 0.0428 0.1876 0.3531 0.3138 0.0004 0.2697 0.0335 0.3875 0. 0.4341 0.0142 0.1051 0.1988] 2022-08-23 20:18:49 [INFO] [EVAL] The model with the best validation mIoU (0.3339) was saved at iter 24000. 2022-08-23 20:18:59 [INFO] [TRAIN] epoch: 21, iter: 26050/160000, loss: 0.7651, lr: 0.001014, batch_cost: 0.1944, reader_cost: 0.01199, ips: 41.1504 samples/sec | ETA 07:14:01 2022-08-23 20:19:10 [INFO] [TRAIN] epoch: 21, iter: 26100/160000, loss: 0.7866, lr: 0.001014, batch_cost: 0.2238, reader_cost: 0.00111, ips: 35.7533 samples/sec | ETA 08:19:20 2022-08-23 20:19:21 [INFO] [TRAIN] epoch: 21, iter: 26150/160000, loss: 0.7170, lr: 0.001013, batch_cost: 0.2307, reader_cost: 0.00076, ips: 34.6734 samples/sec | ETA 08:34:42 2022-08-23 20:19:32 [INFO] [TRAIN] epoch: 21, iter: 26200/160000, loss: 0.7628, lr: 0.001013, batch_cost: 0.2230, reader_cost: 0.00094, ips: 35.8701 samples/sec | ETA 08:17:21 2022-08-23 20:19:43 [INFO] [TRAIN] epoch: 21, iter: 26250/160000, loss: 0.7488, lr: 0.001013, batch_cost: 0.2052, reader_cost: 0.00039, ips: 38.9794 samples/sec | ETA 07:37:30 2022-08-23 20:19:54 [INFO] [TRAIN] epoch: 21, iter: 26300/160000, loss: 0.7361, lr: 0.001012, batch_cost: 0.2269, reader_cost: 0.00062, ips: 35.2577 samples/sec | ETA 08:25:36 2022-08-23 20:20:04 [INFO] [TRAIN] epoch: 21, iter: 26350/160000, loss: 0.7648, lr: 0.001012, batch_cost: 0.2002, reader_cost: 0.00054, ips: 39.9531 samples/sec | ETA 07:26:01 2022-08-23 20:20:15 [INFO] [TRAIN] epoch: 21, iter: 26400/160000, loss: 0.7531, lr: 0.001011, batch_cost: 0.2093, reader_cost: 0.00035, ips: 38.2267 samples/sec | ETA 07:45:59 2022-08-23 20:20:25 [INFO] [TRAIN] epoch: 21, iter: 26450/160000, loss: 0.8131, lr: 0.001011, batch_cost: 0.2042, reader_cost: 0.00054, ips: 39.1678 samples/sec | ETA 07:34:37 2022-08-23 20:20:36 [INFO] [TRAIN] epoch: 21, iter: 26500/160000, loss: 0.7535, lr: 0.001011, batch_cost: 0.2218, reader_cost: 0.00039, ips: 36.0672 samples/sec | ETA 08:13:31 2022-08-23 20:20:50 [INFO] [TRAIN] epoch: 22, iter: 26550/160000, loss: 0.7064, lr: 0.001010, batch_cost: 0.2835, reader_cost: 0.06780, ips: 28.2149 samples/sec | ETA 10:30:38 2022-08-23 20:21:01 [INFO] [TRAIN] epoch: 22, iter: 26600/160000, loss: 0.7148, lr: 0.001010, batch_cost: 0.2119, reader_cost: 0.01433, ips: 37.7596 samples/sec | ETA 07:51:03 2022-08-23 20:21:10 [INFO] [TRAIN] epoch: 22, iter: 26650/160000, loss: 0.7355, lr: 0.001010, batch_cost: 0.1966, reader_cost: 0.00282, ips: 40.6967 samples/sec | ETA 07:16:53 2022-08-23 20:21:19 [INFO] [TRAIN] epoch: 22, 
iter: 26700/160000, loss: 0.6923, lr: 0.001009, batch_cost: 0.1699, reader_cost: 0.00042, ips: 47.0915 samples/sec | ETA 06:17:25 2022-08-23 20:21:28 [INFO] [TRAIN] epoch: 22, iter: 26750/160000, loss: 0.7320, lr: 0.001009, batch_cost: 0.1780, reader_cost: 0.00068, ips: 44.9560 samples/sec | ETA 06:35:12 2022-08-23 20:21:36 [INFO] [TRAIN] epoch: 22, iter: 26800/160000, loss: 0.6549, lr: 0.001008, batch_cost: 0.1665, reader_cost: 0.00061, ips: 48.0517 samples/sec | ETA 06:09:36 2022-08-23 20:21:45 [INFO] [TRAIN] epoch: 22, iter: 26850/160000, loss: 0.7378, lr: 0.001008, batch_cost: 0.1669, reader_cost: 0.00041, ips: 47.9417 samples/sec | ETA 06:10:18 2022-08-23 20:21:55 [INFO] [TRAIN] epoch: 22, iter: 26900/160000, loss: 0.7497, lr: 0.001008, batch_cost: 0.2177, reader_cost: 0.00074, ips: 36.7515 samples/sec | ETA 08:02:52 2022-08-23 20:22:06 [INFO] [TRAIN] epoch: 22, iter: 26950/160000, loss: 0.7075, lr: 0.001007, batch_cost: 0.2077, reader_cost: 0.00057, ips: 38.5253 samples/sec | ETA 07:40:28 2022-08-23 20:22:16 [INFO] [TRAIN] epoch: 22, iter: 27000/160000, loss: 0.7608, lr: 0.001007, batch_cost: 0.2061, reader_cost: 0.00047, ips: 38.8203 samples/sec | ETA 07:36:48 2022-08-23 20:22:16 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 197s - batch_cost: 0.1969 - reader cost: 0.0011 2022-08-23 20:25:33 [INFO] [EVAL] #Images: 2000 mIoU: 0.3315 Acc: 0.7554 Kappa: 0.7365 Dice: 0.4619 2022-08-23 20:25:33 [INFO] [EVAL] Class IoU: [0.6683 0.7636 0.9304 0.7198 0.687 0.7283 0.7623 0.7844 0.5027 0.648 0.4637 0.5335 0.6805 0.241 0.2471 0.4144 0.4237 0.4076 0.5816 0.4156 0.7483 0.4593 0.6028 0.4754 0.3232 0.3621 0.4605 0.4041 0.4134 0.2881 0.2654 0.4855 0.275 0.3114 0.3169 0.3956 0.4084 0.448 0.2762 0.3463 0.1228 0.0971 0.3312 0.2245 0.285 0.2016 0.2824 0.4049 0.7207 0.4112 0.4765 0.4025 0.1626 0.2503 0.6075 0.4581 0.8582 0.3185 0.3549 0.3225 0.1265 0.4248 0.2707 0.081 0.4368 0.6199 0.1925 0.3756 0.0537 0.291 0.4315 0.5038 0.375 0.2809 0.4157 0.3688 0.4546 0.2249 0.3625 0.1282 0.6162 0.3382 0.3092 0.1428 0.1032 0.5167 0.0613 0.085 0.3215 0.4841 0.4236 0.0516 0.1603 0.0655 0. 0.0284 0.1481 0.1688 0.2673 0.2713 0.0353 0.0921 0.2531 0.6082 0.1831 0.4328 0.1608 0.5244 0.0961 0.383 0.0997 0.2818 0.1398 0.6473 0.6024 0.0008 0.3963 0.5914 0.1206 0.3524 0.4803 0.0099 0.3068 0.0569 0.2681 0.1743 0.4143 0.3553 0.0011 0.3432 0.489 0. 0.2753 0.303 0.192 0.158 0.1155 0.0336 0.1711 0.3475 0.0582 0.0277 0.3428 0.0099 0.28 0.0034 0.3627 0.0037 0.0743 0.128 ] 2022-08-23 20:25:33 [INFO] [EVAL] Class Precision: [0.7688 0.8229 0.9673 0.8476 0.7709 0.8588 0.8668 0.8501 0.6619 0.7342 0.6788 0.6511 0.7496 0.4597 0.5948 0.5456 0.5777 0.6975 0.7208 0.601 0.8317 0.6378 0.7649 0.6284 0.5258 0.533 0.5599 0.5605 0.6983 0.5158 0.4057 0.6185 0.5897 0.4118 0.4654 0.5014 0.6681 0.822 0.4589 0.5056 0.3063 0.3518 0.6091 0.5978 0.4154 0.6194 0.46 0.5877 0.7947 0.461 0.6285 0.5598 0.4324 0.4347 0.7292 0.7383 0.9006 0.7124 0.7185 0.649 0.3728 0.6247 0.3436 0.4856 0.574 0.6864 0.2996 0.5449 0.1078 0.5492 0.7465 0.6813 0.4699 0.5081 0.599 0.5493 0.4946 0.5983 0.7515 0.181 0.7592 0.6743 0.7419 0.4391 0.2427 0.65 0.2796 0.3342 0.6188 0.7287 0.6451 0.0652 0.3718 0.2816 0. 0.0946 0.2952 0.3716 0.5216 0.8306 0.8712 0.185 0.6109 0.8741 0.8138 0.4673 0.3033 0.6518 0.334 0.3948 0.3753 0.6521 0.5081 0.7656 0.7003 0.0804 0.6302 0.6619 0.2695 0.4335 0.6307 0.2384 0.7106 0.6632 0.6548 0.5794 0.8795 0.4642 0.09 0.6157 0.6937 0. 
0.5615 0.6397 0.5318 0.2588 0.5228 0.1867 0.4192 0.6548 0.3155 0.0375 0.4561 0.299 0.5728 0.0041 0.798 0.1268 0.5404 0.9597] 2022-08-23 20:25:33 [INFO] [EVAL] Class Recall: [0.8363 0.9137 0.9607 0.8268 0.8632 0.8273 0.8634 0.9103 0.6764 0.8467 0.594 0.747 0.8806 0.3363 0.2971 0.6327 0.6138 0.4951 0.7507 0.574 0.8819 0.6213 0.7399 0.6612 0.4561 0.5304 0.7217 0.5915 0.5033 0.3949 0.4342 0.693 0.34 0.5607 0.4982 0.6521 0.5123 0.4961 0.4095 0.5236 0.1701 0.1183 0.4206 0.2644 0.476 0.2301 0.4226 0.5656 0.8856 0.792 0.6633 0.5888 0.2067 0.371 0.7846 0.547 0.948 0.3655 0.4123 0.3907 0.1607 0.5704 0.5608 0.0886 0.6463 0.8648 0.3499 0.5473 0.0965 0.3823 0.5056 0.6591 0.6501 0.3859 0.576 0.5287 0.849 0.2649 0.4119 0.3052 0.7658 0.4042 0.3465 0.1746 0.1523 0.716 0.0728 0.1024 0.4009 0.5906 0.5523 0.1979 0.2199 0.0786 0. 0.039 0.229 0.2362 0.3541 0.2872 0.0355 0.155 0.3018 0.6666 0.1911 0.8544 0.2551 0.7284 0.1188 0.9281 0.1195 0.3317 0.1616 0.8072 0.8116 0.0008 0.5164 0.8474 0.1792 0.6534 0.6682 0.0102 0.3506 0.0586 0.3123 0.1995 0.4392 0.6024 0.0011 0.4367 0.6236 0. 0.3506 0.3654 0.231 0.2886 0.1291 0.0394 0.2242 0.4254 0.0666 0.096 0.5797 0.0102 0.3539 0.0205 0.3994 0.0038 0.0794 0.1287] 2022-08-23 20:25:34 [INFO] [EVAL] The model with the best validation mIoU (0.3339) was saved at iter 24000. 2022-08-23 20:25:45 [INFO] [TRAIN] epoch: 22, iter: 27050/160000, loss: 0.7219, lr: 0.001007, batch_cost: 0.2207, reader_cost: 0.00447, ips: 36.2540 samples/sec | ETA 08:08:57 2022-08-23 20:25:55 [INFO] [TRAIN] epoch: 22, iter: 27100/160000, loss: 0.7321, lr: 0.001006, batch_cost: 0.2047, reader_cost: 0.00089, ips: 39.0876 samples/sec | ETA 07:33:20 2022-08-23 20:26:06 [INFO] [TRAIN] epoch: 22, iter: 27150/160000, loss: 0.7303, lr: 0.001006, batch_cost: 0.2138, reader_cost: 0.00819, ips: 37.4234 samples/sec | ETA 07:53:19 2022-08-23 20:26:18 [INFO] [TRAIN] epoch: 22, iter: 27200/160000, loss: 0.7846, lr: 0.001005, batch_cost: 0.2490, reader_cost: 0.00097, ips: 32.1342 samples/sec | ETA 09:11:01 2022-08-23 20:26:28 [INFO] [TRAIN] epoch: 22, iter: 27250/160000, loss: 0.7223, lr: 0.001005, batch_cost: 0.2028, reader_cost: 0.00076, ips: 39.4570 samples/sec | ETA 07:28:35 2022-08-23 20:26:39 [INFO] [TRAIN] epoch: 22, iter: 27300/160000, loss: 0.7416, lr: 0.001005, batch_cost: 0.2257, reader_cost: 0.00038, ips: 35.4451 samples/sec | ETA 08:19:10 2022-08-23 20:26:49 [INFO] [TRAIN] epoch: 22, iter: 27350/160000, loss: 0.7500, lr: 0.001004, batch_cost: 0.1860, reader_cost: 0.01107, ips: 43.0132 samples/sec | ETA 06:51:11 2022-08-23 20:27:00 [INFO] [TRAIN] epoch: 22, iter: 27400/160000, loss: 0.7481, lr: 0.001004, batch_cost: 0.2182, reader_cost: 0.00065, ips: 36.6633 samples/sec | ETA 08:02:13 2022-08-23 20:27:12 [INFO] [TRAIN] epoch: 22, iter: 27450/160000, loss: 0.7812, lr: 0.001004, batch_cost: 0.2408, reader_cost: 0.00069, ips: 33.2188 samples/sec | ETA 08:52:01 2022-08-23 20:27:22 [INFO] [TRAIN] epoch: 22, iter: 27500/160000, loss: 0.7423, lr: 0.001003, batch_cost: 0.1976, reader_cost: 0.00381, ips: 40.4901 samples/sec | ETA 07:16:19 2022-08-23 20:27:30 [INFO] [TRAIN] epoch: 22, iter: 27550/160000, loss: 0.6786, lr: 0.001003, batch_cost: 0.1702, reader_cost: 0.00043, ips: 47.0086 samples/sec | ETA 06:15:40 2022-08-23 20:27:38 [INFO] [TRAIN] epoch: 22, iter: 27600/160000, loss: 0.7319, lr: 0.001002, batch_cost: 0.1643, reader_cost: 0.00057, ips: 48.6792 samples/sec | ETA 06:02:38 2022-08-23 20:27:50 [INFO] [TRAIN] epoch: 22, iter: 27650/160000, loss: 0.7305, lr: 0.001002, batch_cost: 0.2275, reader_cost: 0.00044, 
ips: 35.1586 samples/sec | ETA 08:21:54 2022-08-23 20:27:59 [INFO] [TRAIN] epoch: 22, iter: 27700/160000, loss: 0.7741, lr: 0.001002, batch_cost: 0.1880, reader_cost: 0.00070, ips: 42.5447 samples/sec | ETA 06:54:37 2022-08-23 20:28:07 [INFO] [TRAIN] epoch: 22, iter: 27750/160000, loss: 0.7833, lr: 0.001001, batch_cost: 0.1564, reader_cost: 0.00037, ips: 51.1379 samples/sec | ETA 05:44:49 2022-08-23 20:28:20 [INFO] [TRAIN] epoch: 23, iter: 27800/160000, loss: 0.7006, lr: 0.001001, batch_cost: 0.2552, reader_cost: 0.08621, ips: 31.3453 samples/sec | ETA 09:22:20 2022-08-23 20:28:29 [INFO] [TRAIN] epoch: 23, iter: 27850/160000, loss: 0.7304, lr: 0.001001, batch_cost: 0.1901, reader_cost: 0.00044, ips: 42.0770 samples/sec | ETA 06:58:45 2022-08-23 20:28:38 [INFO] [TRAIN] epoch: 23, iter: 27900/160000, loss: 0.7231, lr: 0.001000, batch_cost: 0.1834, reader_cost: 0.00085, ips: 43.6301 samples/sec | ETA 06:43:41 2022-08-23 20:28:48 [INFO] [TRAIN] epoch: 23, iter: 27950/160000, loss: 0.6951, lr: 0.001000, batch_cost: 0.1955, reader_cost: 0.00073, ips: 40.9309 samples/sec | ETA 07:10:09 2022-08-23 20:28:58 [INFO] [TRAIN] epoch: 23, iter: 28000/160000, loss: 0.6806, lr: 0.000999, batch_cost: 0.2014, reader_cost: 0.00061, ips: 39.7189 samples/sec | ETA 07:23:06 2022-08-23 20:28:58 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 170s - batch_cost: 0.1699 - reader cost: 0.0010 2022-08-23 20:31:49 [INFO] [EVAL] #Images: 2000 mIoU: 0.3265 Acc: 0.7497 Kappa: 0.7314 Dice: 0.4548 2022-08-23 20:31:49 [INFO] [EVAL] Class IoU: [0.6685 0.767 0.9294 0.7174 0.6676 0.7622 0.7649 0.7744 0.4903 0.6202 0.4588 0.5335 0.684 0.248 0.2975 0.387 0.4764 0.3913 0.5772 0.39 0.7344 0.4633 0.5987 0.4794 0.3133 0.3356 0.4316 0.3722 0.3926 0.2686 0.1791 0.5088 0.264 0.2816 0.2351 0.3737 0.4266 0.5022 0.2161 0.3582 0.1666 0.1153 0.34 0.2451 0.3319 0.1589 0.3071 0.4607 0.6651 0.4243 0.4728 0.3853 0.1428 0.1539 0.6619 0.4325 0.7954 0.3546 0.4011 0.2581 0.1133 0.2262 0.2656 0.0969 0.4278 0.6248 0.1959 0.4001 0.0655 0.3358 0.4063 0.506 0.4121 0.225 0.4267 0.3573 0.5562 0.2176 0.3641 0.2584 0.6585 0.331 0.2798 0.0142 0.2264 0.5353 0.0694 0.0603 0.2791 0.4969 0.3954 0.0287 0.1702 0.0835 0.0983 0.0244 0.1664 0.1528 0.2879 0.2858 0.0476 0.0586 0.029 0.0663 0.1614 0.4304 0.1833 0.542 0.0875 0.3052 0.0936 0.3124 0.1302 0.6664 0.6993 0.0094 0.4303 0.6439 0.1144 0.3681 0.4952 0.0172 0.1872 0.0478 0.2485 0.1898 0.5106 0.4547 0.0026 0.3675 0.4703 0.0004 0.1604 0.217 0.1476 0.1574 0.1505 0.0064 0.1266 0.3782 0.2317 0.0091 0.2854 0.0601 0.2834 0.0033 0.3684 0.0267 0.114 0.1328] 2022-08-23 20:31:49 [INFO] [EVAL] Class Precision: [0.813 0.8393 0.9564 0.812 0.7732 0.8594 0.8939 0.8252 0.5976 0.7933 0.6517 0.6461 0.7785 0.4745 0.5389 0.5207 0.5751 0.6873 0.7798 0.6083 0.7991 0.7122 0.7365 0.5971 0.4959 0.443 0.5746 0.5462 0.688 0.3675 0.4376 0.6265 0.5229 0.3332 0.3497 0.4783 0.5853 0.6698 0.349 0.5541 0.2448 0.4386 0.5282 0.5061 0.4668 0.3918 0.5491 0.6967 0.8615 0.4941 0.7171 0.4878 0.3166 0.6244 0.7322 0.6131 0.815 0.6591 0.4616 0.3797 0.1486 0.4141 0.5016 0.5286 0.5009 0.6942 0.2652 0.516 0.1113 0.5741 0.5262 0.6669 0.5497 0.2428 0.7171 0.4908 0.6671 0.5205 0.6034 0.3609 0.8244 0.5436 0.7915 0.0414 0.3484 0.7267 0.2353 0.4106 0.3938 0.7047 0.5914 0.0321 0.2493 0.2831 0.1203 0.0603 0.2934 0.4175 0.4144 0.6708 0.5093 0.0896 0.2541 0.2605 0.5201 0.4683 0.3889 0.6765 0.2132 0.3904 0.3945 0.4685 0.5791 0.8264 0.72 0.0802 0.6921 0.6727 0.2993 0.6314 0.6357 0.2377 0.7836 0.6572 0.6236 0.565 0.8289 0.6727 0.0179 0.6315 
0.6185 0.0154 0.7979 0.8131 0.6492 0.2924 0.3795 0.076 0.5184 0.5638 0.5792 0.013 0.658 0.2591 0.8873 0.0118 0.6705 0.2076 0.3508 0.9349] 2022-08-23 20:31:49 [INFO] [EVAL] Class Recall: [0.79 0.899 0.9705 0.8602 0.8301 0.8708 0.8412 0.9264 0.732 0.7397 0.6078 0.7539 0.8493 0.3419 0.3992 0.6011 0.7351 0.4761 0.6897 0.5208 0.9008 0.57 0.7619 0.7085 0.4597 0.5804 0.6342 0.5388 0.4776 0.4994 0.2327 0.7305 0.3478 0.6451 0.4176 0.6308 0.6114 0.6675 0.3621 0.5032 0.3426 0.1353 0.4882 0.3222 0.5345 0.2108 0.4106 0.5763 0.7447 0.7502 0.5812 0.6472 0.2064 0.1696 0.8734 0.5949 0.9707 0.4343 0.7539 0.4463 0.323 0.3327 0.3608 0.106 0.7456 0.862 0.4284 0.6406 0.1374 0.4471 0.6407 0.6771 0.6221 0.7535 0.5131 0.5677 0.7698 0.2721 0.4787 0.4763 0.766 0.4585 0.3021 0.0211 0.3926 0.6702 0.0896 0.066 0.4894 0.6275 0.544 0.2124 0.3491 0.1059 0.3491 0.0395 0.2778 0.1943 0.4854 0.3324 0.0499 0.1451 0.0317 0.0817 0.1897 0.8416 0.2574 0.7317 0.1293 0.583 0.1094 0.4839 0.1438 0.7749 0.9605 0.0106 0.5322 0.9377 0.1563 0.4689 0.6914 0.0182 0.1974 0.049 0.2923 0.2222 0.5707 0.5839 0.003 0.4678 0.6625 0.0004 0.1672 0.2284 0.1604 0.2543 0.1997 0.0069 0.1435 0.5346 0.2787 0.0294 0.3351 0.0726 0.294 0.0045 0.4499 0.0298 0.1445 0.1341] 2022-08-23 20:31:49 [INFO] [EVAL] The model with the best validation mIoU (0.3339) was saved at iter 24000. 2022-08-23 20:32:01 [INFO] [TRAIN] epoch: 23, iter: 28050/160000, loss: 0.6523, lr: 0.000999, batch_cost: 0.2427, reader_cost: 0.00529, ips: 32.9682 samples/sec | ETA 08:53:38 2022-08-23 20:32:11 [INFO] [TRAIN] epoch: 23, iter: 28100/160000, loss: 0.7359, lr: 0.000999, batch_cost: 0.1977, reader_cost: 0.00119, ips: 40.4697 samples/sec | ETA 07:14:33 2022-08-23 20:32:22 [INFO] [TRAIN] epoch: 23, iter: 28150/160000, loss: 0.6874, lr: 0.000998, batch_cost: 0.2190, reader_cost: 0.00074, ips: 36.5343 samples/sec | ETA 08:01:11 2022-08-23 20:32:33 [INFO] [TRAIN] epoch: 23, iter: 28200/160000, loss: 0.7042, lr: 0.000998, batch_cost: 0.2173, reader_cost: 0.00041, ips: 36.8226 samples/sec | ETA 07:57:14 2022-08-23 20:32:43 [INFO] [TRAIN] epoch: 23, iter: 28250/160000, loss: 0.7342, lr: 0.000997, batch_cost: 0.2139, reader_cost: 0.00130, ips: 37.3997 samples/sec | ETA 07:49:42 2022-08-23 20:32:54 [INFO] [TRAIN] epoch: 23, iter: 28300/160000, loss: 0.7117, lr: 0.000997, batch_cost: 0.2207, reader_cost: 0.00093, ips: 36.2558 samples/sec | ETA 08:04:20 2022-08-23 20:33:06 [INFO] [TRAIN] epoch: 23, iter: 28350/160000, loss: 0.7321, lr: 0.000997, batch_cost: 0.2226, reader_cost: 0.00040, ips: 35.9364 samples/sec | ETA 08:08:27 2022-08-23 20:33:15 [INFO] [TRAIN] epoch: 23, iter: 28400/160000, loss: 0.7242, lr: 0.000996, batch_cost: 0.1929, reader_cost: 0.00063, ips: 41.4719 samples/sec | ETA 07:03:05 2022-08-23 20:33:24 [INFO] [TRAIN] epoch: 23, iter: 28450/160000, loss: 0.7718, lr: 0.000996, batch_cost: 0.1721, reader_cost: 0.00082, ips: 46.4962 samples/sec | ETA 06:17:14 2022-08-23 20:33:33 [INFO] [TRAIN] epoch: 23, iter: 28500/160000, loss: 0.6876, lr: 0.000996, batch_cost: 0.1887, reader_cost: 0.00074, ips: 42.3970 samples/sec | ETA 06:53:33 2022-08-23 20:33:42 [INFO] [TRAIN] epoch: 23, iter: 28550/160000, loss: 0.6805, lr: 0.000995, batch_cost: 0.1735, reader_cost: 0.00052, ips: 46.1081 samples/sec | ETA 06:20:07 2022-08-23 20:33:51 [INFO] [TRAIN] epoch: 23, iter: 28600/160000, loss: 0.7613, lr: 0.000995, batch_cost: 0.1878, reader_cost: 0.00047, ips: 42.5908 samples/sec | ETA 06:51:21 2022-08-23 20:34:00 [INFO] [TRAIN] epoch: 23, iter: 28650/160000, loss: 0.6917, lr: 0.000994, batch_cost: 
0.1774, reader_cost: 0.00040, ips: 45.0999 samples/sec | ETA 06:28:19 2022-08-23 20:34:08 [INFO] [TRAIN] epoch: 23, iter: 28700/160000, loss: 0.7344, lr: 0.000994, batch_cost: 0.1628, reader_cost: 0.00047, ips: 49.1419 samples/sec | ETA 05:56:14 2022-08-23 20:34:18 [INFO] [TRAIN] epoch: 23, iter: 28750/160000, loss: 0.7060, lr: 0.000994, batch_cost: 0.1886, reader_cost: 0.00051, ips: 42.4270 samples/sec | ETA 06:52:28 2022-08-23 20:34:27 [INFO] [TRAIN] epoch: 23, iter: 28800/160000, loss: 0.7010, lr: 0.000993, batch_cost: 0.1828, reader_cost: 0.00065, ips: 43.7652 samples/sec | ETA 06:39:42 2022-08-23 20:34:35 [INFO] [TRAIN] epoch: 23, iter: 28850/160000, loss: 0.7929, lr: 0.000993, batch_cost: 0.1607, reader_cost: 0.00041, ips: 49.7927 samples/sec | ETA 05:51:11 2022-08-23 20:34:43 [INFO] [TRAIN] epoch: 23, iter: 28900/160000, loss: 0.7300, lr: 0.000993, batch_cost: 0.1661, reader_cost: 0.00600, ips: 48.1564 samples/sec | ETA 06:02:59 2022-08-23 20:34:52 [INFO] [TRAIN] epoch: 23, iter: 28950/160000, loss: 0.6872, lr: 0.000992, batch_cost: 0.1772, reader_cost: 0.01259, ips: 45.1400 samples/sec | ETA 06:27:05 2022-08-23 20:35:02 [INFO] [TRAIN] epoch: 23, iter: 29000/160000, loss: 0.7369, lr: 0.000992, batch_cost: 0.1940, reader_cost: 0.00054, ips: 41.2357 samples/sec | ETA 07:03:34 2022-08-23 20:35:02 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 189s - batch_cost: 0.1885 - reader cost: 6.8476e-04 2022-08-23 20:38:11 [INFO] [EVAL] #Images: 2000 mIoU: 0.3401 Acc: 0.7579 Kappa: 0.7393 Dice: 0.4723 2022-08-23 20:38:11 [INFO] [EVAL] Class IoU: [0.6714 0.7677 0.9288 0.7232 0.6783 0.7473 0.7743 0.7771 0.5037 0.6391 0.4656 0.5607 0.6921 0.2607 0.2446 0.3905 0.5128 0.4335 0.5923 0.3975 0.7428 0.4684 0.5967 0.4546 0.3443 0.4261 0.5247 0.405 0.3846 0.2445 0.2483 0.4931 0.2572 0.3247 0.3397 0.3697 0.43 0.5426 0.2228 0.3519 0.1736 0.0788 0.3352 0.2678 0.2776 0.2287 0.2426 0.4937 0.6404 0.4358 0.4653 0.3268 0.1909 0.2712 0.6477 0.382 0.7605 0.3542 0.4352 0.3537 0.1355 0.4982 0.3007 0.0279 0.4658 0.696 0.2146 0.415 0.0586 0.3333 0.4208 0.5223 0.3715 0.2194 0.4495 0.36 0.4946 0.2523 0.5535 0.225 0.6149 0.3017 0.246 0.018 0.1862 0.4862 0.0796 0.0712 0.2413 0.4946 0.3729 0.0376 0.1903 0.0516 0.0529 0.0244 0.1663 0.1278 0.2504 0.3054 0.0494 0.0534 0.1509 0.4218 0.1913 0.3954 0.2162 0.5355 0.1484 0.4044 0.1443 0.4194 0.1701 0.5244 0.6404 0. 0.3606 0.6949 0.1942 0.3853 0.4837 0.0188 0.2929 0.0496 0.2601 0.1721 0.4981 0.4301 0.0006 0.3378 0.5444 0.026 0.299 0.3147 0.1603 0.1652 0.1636 0.0127 0.1676 0.321 0.1638 0.0274 0.3248 0.0709 0.3128 0. 0.3625 0.0544 0.0776 0.1793] 2022-08-23 20:38:11 [INFO] [EVAL] Class Precision: [0.7586 0.8722 0.9627 0.8123 0.754 0.8832 0.8994 0.8658 0.6271 0.7671 0.7159 0.7075 0.7879 0.4965 0.5822 0.5462 0.7031 0.6909 0.7784 0.6037 0.8138 0.6177 0.696 0.6327 0.5347 0.5141 0.581 0.7164 0.7012 0.37 0.358 0.6169 0.492 0.4228 0.494 0.5199 0.6193 0.7451 0.3406 0.5866 0.2907 0.4426 0.626 0.5374 0.3618 0.4443 0.3531 0.7773 0.6926 0.5052 0.6869 0.4051 0.3885 0.6368 0.6882 0.6609 0.7736 0.6814 0.5507 0.6054 0.4178 0.6624 0.5369 0.6428 0.6099 0.8561 0.3777 0.5603 0.1258 0.5786 0.5499 0.6561 0.5879 0.4097 0.6403 0.5318 0.6195 0.5202 0.6747 0.2474 0.7095 0.5955 0.8465 0.0889 0.313 0.7984 0.3488 0.3466 0.5535 0.7718 0.5044 0.0403 0.3787 0.3017 0.2397 0.0696 0.4322 0.3937 0.446 0.5214 0.7116 0.0708 0.5541 0.4745 0.9061 0.4123 0.4017 0.7207 0.2701 0.4458 0.429 0.6481 0.5172 0.6688 0.6427 0. 
0.7517 0.7772 0.3394 0.4945 0.5662 0.3384 0.6535 0.714 0.6614 0.633 0.7135 0.558 0.0064 0.5722 0.6824 0.3662 0.7813 0.7239 0.6038 0.3555 0.3561 0.0286 0.4043 0.7707 0.5332 0.0418 0.6477 0.2848 0.6127 0. 0.7746 0.2609 0.5773 0.8572] 2022-08-23 20:38:11 [INFO] [EVAL] Class Recall: [0.8538 0.8651 0.9635 0.8682 0.871 0.8292 0.8478 0.8836 0.719 0.7929 0.5711 0.7298 0.8505 0.3545 0.2967 0.5782 0.6545 0.5378 0.7124 0.5378 0.895 0.6597 0.807 0.6177 0.4915 0.7135 0.8442 0.4823 0.46 0.4187 0.4476 0.7107 0.3502 0.5831 0.521 0.5615 0.5846 0.6663 0.3919 0.468 0.3012 0.0874 0.4192 0.348 0.5442 0.3203 0.4367 0.575 0.8948 0.7602 0.5906 0.6281 0.273 0.3209 0.9166 0.4751 0.9784 0.4245 0.6749 0.4596 0.167 0.6677 0.406 0.0283 0.6635 0.7883 0.332 0.6153 0.0988 0.4402 0.6418 0.7191 0.5024 0.3208 0.6014 0.5271 0.7104 0.3289 0.755 0.7129 0.8217 0.3794 0.2575 0.022 0.315 0.5543 0.0935 0.0823 0.2996 0.5793 0.5885 0.357 0.2766 0.0586 0.0636 0.0363 0.2128 0.1591 0.3634 0.4244 0.0504 0.1781 0.1717 0.7918 0.1952 0.9062 0.3189 0.6758 0.2477 0.8134 0.1786 0.5431 0.2022 0.7083 0.9944 0. 0.4093 0.8678 0.3123 0.6358 0.7686 0.0195 0.3468 0.0506 0.3 0.1912 0.6227 0.6523 0.0007 0.4518 0.7292 0.0272 0.3263 0.3576 0.1791 0.2358 0.2323 0.0222 0.2225 0.3548 0.1912 0.074 0.3945 0.0863 0.3898 0. 0.4052 0.0642 0.0823 0.1848] 2022-08-23 20:38:11 [INFO] [EVAL] The model with the best validation mIoU (0.3401) was saved at iter 29000. 2022-08-23 20:38:23 [INFO] [TRAIN] epoch: 24, iter: 29050/160000, loss: 0.7208, lr: 0.000991, batch_cost: 0.2514, reader_cost: 0.07701, ips: 31.8179 samples/sec | ETA 09:08:44 2022-08-23 20:38:33 [INFO] [TRAIN] epoch: 24, iter: 29100/160000, loss: 0.7353, lr: 0.000991, batch_cost: 0.1979, reader_cost: 0.00518, ips: 40.4149 samples/sec | ETA 07:11:51 2022-08-23 20:38:42 [INFO] [TRAIN] epoch: 24, iter: 29150/160000, loss: 0.6797, lr: 0.000991, batch_cost: 0.1797, reader_cost: 0.00168, ips: 44.5071 samples/sec | ETA 06:31:59 2022-08-23 20:38:52 [INFO] [TRAIN] epoch: 24, iter: 29200/160000, loss: 0.7366, lr: 0.000990, batch_cost: 0.1999, reader_cost: 0.00270, ips: 40.0147 samples/sec | ETA 07:15:50 2022-08-23 20:39:04 [INFO] [TRAIN] epoch: 24, iter: 29250/160000, loss: 0.6981, lr: 0.000990, batch_cost: 0.2292, reader_cost: 0.00817, ips: 34.9101 samples/sec | ETA 08:19:22 2022-08-23 20:39:15 [INFO] [TRAIN] epoch: 24, iter: 29300/160000, loss: 0.6932, lr: 0.000990, batch_cost: 0.2168, reader_cost: 0.00060, ips: 36.8979 samples/sec | ETA 07:52:17 2022-08-23 20:39:26 [INFO] [TRAIN] epoch: 24, iter: 29350/160000, loss: 0.6834, lr: 0.000989, batch_cost: 0.2207, reader_cost: 0.00108, ips: 36.2407 samples/sec | ETA 08:00:40 2022-08-23 20:39:37 [INFO] [TRAIN] epoch: 24, iter: 29400/160000, loss: 0.7842, lr: 0.000989, batch_cost: 0.2250, reader_cost: 0.00155, ips: 35.5571 samples/sec | ETA 08:09:43 2022-08-23 20:39:48 [INFO] [TRAIN] epoch: 24, iter: 29450/160000, loss: 0.7409, lr: 0.000988, batch_cost: 0.2146, reader_cost: 0.00093, ips: 37.2750 samples/sec | ETA 07:46:58 2022-08-23 20:39:57 [INFO] [TRAIN] epoch: 24, iter: 29500/160000, loss: 0.7081, lr: 0.000988, batch_cost: 0.1834, reader_cost: 0.00054, ips: 43.6223 samples/sec | ETA 06:38:52 2022-08-23 20:40:06 [INFO] [TRAIN] epoch: 24, iter: 29550/160000, loss: 0.6997, lr: 0.000988, batch_cost: 0.1737, reader_cost: 0.00036, ips: 46.0450 samples/sec | ETA 06:17:44 2022-08-23 20:40:15 [INFO] [TRAIN] epoch: 24, iter: 29600/160000, loss: 0.7032, lr: 0.000987, batch_cost: 0.1995, reader_cost: 0.00059, ips: 40.0920 samples/sec | ETA 07:13:40 2022-08-23 20:40:25 [INFO] 
[TRAIN] epoch: 24, iter: 29650/160000, loss: 0.6976, lr: 0.000987, batch_cost: 0.1842, reader_cost: 0.00074, ips: 43.4222 samples/sec | ETA 06:40:15 2022-08-23 20:40:34 [INFO] [TRAIN] epoch: 24, iter: 29700/160000, loss: 0.7489, lr: 0.000987, batch_cost: 0.1823, reader_cost: 0.00073, ips: 43.8810 samples/sec | ETA 06:35:55 2022-08-23 20:40:42 [INFO] [TRAIN] epoch: 24, iter: 29750/160000, loss: 0.7657, lr: 0.000986, batch_cost: 0.1658, reader_cost: 0.00110, ips: 48.2454 samples/sec | ETA 05:59:57 2022-08-23 20:40:51 [INFO] [TRAIN] epoch: 24, iter: 29800/160000, loss: 0.6607, lr: 0.000986, batch_cost: 0.1775, reader_cost: 0.00033, ips: 45.0776 samples/sec | ETA 06:25:06 2022-08-23 20:41:00 [INFO] [TRAIN] epoch: 24, iter: 29850/160000, loss: 0.7415, lr: 0.000985, batch_cost: 0.1736, reader_cost: 0.00049, ips: 46.0954 samples/sec | ETA 06:16:27 2022-08-23 20:41:09 [INFO] [TRAIN] epoch: 24, iter: 29900/160000, loss: 0.7155, lr: 0.000985, batch_cost: 0.1900, reader_cost: 0.00060, ips: 42.1026 samples/sec | ETA 06:52:00 2022-08-23 20:41:20 [INFO] [TRAIN] epoch: 24, iter: 29950/160000, loss: 0.7296, lr: 0.000985, batch_cost: 0.2195, reader_cost: 0.00057, ips: 36.4384 samples/sec | ETA 07:55:52 2022-08-23 20:41:30 [INFO] [TRAIN] epoch: 24, iter: 30000/160000, loss: 0.6699, lr: 0.000984, batch_cost: 0.1902, reader_cost: 0.00052, ips: 42.0713 samples/sec | ETA 06:51:59 2022-08-23 20:41:30 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 161s - batch_cost: 0.1611 - reader cost: 9.4104e-04 2022-08-23 20:44:11 [INFO] [EVAL] #Images: 2000 mIoU: 0.3503 Acc: 0.7608 Kappa: 0.7427 Dice: 0.4830 2022-08-23 20:44:11 [INFO] [EVAL] Class IoU: [0.6746 0.7732 0.9268 0.728 0.6865 0.7551 0.754 0.7846 0.5026 0.6435 0.4665 0.5319 0.6973 0.3077 0.2776 0.4118 0.4682 0.4264 0.5815 0.4193 0.7416 0.4512 0.6021 0.4764 0.2653 0.4276 0.5265 0.3686 0.4147 0.2401 0.2047 0.4497 0.3055 0.3157 0.3226 0.3744 0.4274 0.5011 0.2674 0.364 0.1332 0.1459 0.3537 0.2307 0.2574 0.1827 0.2705 0.4723 0.6935 0.5175 0.5184 0.4103 0.1405 0.2581 0.6978 0.4457 0.8687 0.3633 0.165 0.2628 0.1055 0.475 0.2322 0.1877 0.422 0.6909 0.2278 0.3635 0.0733 0.3332 0.4563 0.4906 0.4046 0.2704 0.4527 0.3803 0.5862 0.2163 0.2886 0.3362 0.6092 0.3319 0.2902 0.0953 0.2123 0.5112 0.1058 0.081 0.2888 0.4843 0.4295 0.0218 0.1711 0.082 0.0667 0.0017 0.2273 0.1993 0.2825 0.3096 0.0908 0.0541 0.203 0.7768 0.2008 0.6966 0.1969 0.547 0.1331 0.3978 0.0969 0.4456 0.1584 0.6273 0.7408 0.0067 0.4192 0.605 0.0778 0.3554 0.5252 0.0241 0.3287 0.1515 0.255 0.2252 0.4452 0.4858 0.2572 0.316 0.517 0.021 0.295 0.3441 0.1384 0.1546 0.1509 0.0271 0.1506 0.3718 0.1026 0.0032 0.3391 0.0091 0.2656 0.0063 0.3892 0.026 0.1502 0.1868] 2022-08-23 20:44:11 [INFO] [EVAL] Class Precision: [0.7776 0.8763 0.9674 0.8252 0.7617 0.841 0.8483 0.8542 0.6268 0.7131 0.6557 0.6912 0.7987 0.55 0.533 0.5737 0.6612 0.6621 0.783 0.6042 0.8097 0.6964 0.7224 0.5606 0.4438 0.5158 0.6039 0.7925 0.7041 0.4243 0.4571 0.539 0.5739 0.4478 0.4701 0.495 0.6449 0.7601 0.3926 0.5554 0.3554 0.3507 0.6718 0.6077 0.3917 0.3842 0.4247 0.7038 0.7713 0.6572 0.633 0.5416 0.3828 0.6201 0.762 0.599 0.9247 0.6749 0.5847 0.6186 0.2094 0.5824 0.2726 0.5375 0.5022 0.8234 0.3375 0.5248 0.4197 0.5786 0.5932 0.5561 0.6348 0.4076 0.6399 0.5243 0.7399 0.4036 0.5775 0.5271 0.7292 0.6588 0.7927 0.2662 0.3471 0.6537 0.3202 0.3922 0.5939 0.7173 0.5963 0.0261 0.4391 0.3084 0.0986 0.02 0.4302 0.453 0.4185 0.6422 0.7129 0.1046 0.5781 0.8081 0.824 0.8329 0.3941 0.7007 0.3163 0.4958 0.3782 0.7535 0.6755 0.8035 0.754 
0.3851 0.6978 0.6152 0.55 0.5925 0.6877 0.4422 0.7859 0.5376 0.693 0.58 0.8636 0.676 0.3721 0.8823 0.5598 0.4025 0.8799 0.6984 0.6368 0.2459 0.2595 0.1212 0.3737 0.7089 0.4046 0.0044 0.696 0.1375 0.4402 0.0068 0.8457 0.254 0.52 0.8497] 2022-08-23 20:44:11 [INFO] [EVAL] Class Recall: [0.836 0.8679 0.9567 0.8607 0.8743 0.8808 0.8716 0.9059 0.7173 0.8683 0.6178 0.6977 0.8459 0.4112 0.3668 0.5934 0.616 0.545 0.6932 0.5781 0.8982 0.5616 0.7833 0.7604 0.3976 0.7144 0.8043 0.408 0.5021 0.3561 0.2705 0.7308 0.3951 0.5169 0.5069 0.606 0.5589 0.5953 0.456 0.5136 0.1757 0.1999 0.4276 0.271 0.4288 0.2583 0.4269 0.5894 0.8731 0.7089 0.7412 0.6285 0.1816 0.3066 0.8922 0.6352 0.9348 0.4404 0.1869 0.3137 0.1754 0.7205 0.6105 0.2238 0.7255 0.8111 0.4122 0.5417 0.0816 0.44 0.6641 0.8064 0.5275 0.4454 0.6074 0.5807 0.7384 0.3178 0.3658 0.4814 0.7873 0.4008 0.314 0.1293 0.3535 0.7011 0.1364 0.0926 0.3598 0.5985 0.6055 0.1174 0.2189 0.1005 0.1711 0.0019 0.3252 0.2624 0.4652 0.3741 0.0943 0.1008 0.2383 0.9524 0.2098 0.8098 0.2823 0.7137 0.1868 0.668 0.1152 0.5217 0.1714 0.7411 0.9768 0.0067 0.5122 0.9734 0.083 0.4703 0.6897 0.0248 0.361 0.1742 0.2875 0.269 0.4789 0.6332 0.4543 0.3299 0.871 0.0217 0.3073 0.4042 0.1503 0.2941 0.2651 0.0337 0.2015 0.4388 0.1208 0.0113 0.398 0.0096 0.4011 0.0844 0.419 0.0281 0.1744 0.1932] 2022-08-23 20:44:11 [INFO] [EVAL] The model with the best validation mIoU (0.3503) was saved at iter 30000. 2022-08-23 20:44:23 [INFO] [TRAIN] epoch: 24, iter: 30050/160000, loss: 0.7394, lr: 0.000984, batch_cost: 0.2328, reader_cost: 0.00515, ips: 34.3667 samples/sec | ETA 08:24:10 2022-08-23 20:44:34 [INFO] [TRAIN] epoch: 24, iter: 30100/160000, loss: 0.7613, lr: 0.000983, batch_cost: 0.2185, reader_cost: 0.00102, ips: 36.6173 samples/sec | ETA 07:53:00 2022-08-23 20:44:45 [INFO] [TRAIN] epoch: 24, iter: 30150/160000, loss: 0.6515, lr: 0.000983, batch_cost: 0.2116, reader_cost: 0.00054, ips: 37.8146 samples/sec | ETA 07:37:50 2022-08-23 20:44:55 [INFO] [TRAIN] epoch: 24, iter: 30200/160000, loss: 0.7427, lr: 0.000983, batch_cost: 0.2190, reader_cost: 0.00047, ips: 36.5303 samples/sec | ETA 07:53:45 2022-08-23 20:45:05 [INFO] [TRAIN] epoch: 24, iter: 30250/160000, loss: 0.7670, lr: 0.000982, batch_cost: 0.1973, reader_cost: 0.00080, ips: 40.5469 samples/sec | ETA 07:06:39 2022-08-23 20:45:16 [INFO] [TRAIN] epoch: 24, iter: 30300/160000, loss: 0.7062, lr: 0.000982, batch_cost: 0.2119, reader_cost: 0.00060, ips: 37.7476 samples/sec | ETA 07:38:07 2022-08-23 20:45:29 [INFO] [TRAIN] epoch: 25, iter: 30350/160000, loss: 0.6643, lr: 0.000982, batch_cost: 0.2571, reader_cost: 0.06542, ips: 31.1149 samples/sec | ETA 09:15:34 2022-08-23 20:45:40 [INFO] [TRAIN] epoch: 25, iter: 30400/160000, loss: 0.6865, lr: 0.000981, batch_cost: 0.2216, reader_cost: 0.00134, ips: 36.1019 samples/sec | ETA 07:58:38 2022-08-23 20:45:50 [INFO] [TRAIN] epoch: 25, iter: 30450/160000, loss: 0.6653, lr: 0.000981, batch_cost: 0.2043, reader_cost: 0.00104, ips: 39.1539 samples/sec | ETA 07:21:09 2022-08-23 20:46:00 [INFO] [TRAIN] epoch: 25, iter: 30500/160000, loss: 0.7091, lr: 0.000980, batch_cost: 0.1931, reader_cost: 0.00055, ips: 41.4289 samples/sec | ETA 06:56:46 2022-08-23 20:46:09 [INFO] [TRAIN] epoch: 25, iter: 30550/160000, loss: 0.7061, lr: 0.000980, batch_cost: 0.1797, reader_cost: 0.00063, ips: 44.5074 samples/sec | ETA 06:27:48 2022-08-23 20:46:17 [INFO] [TRAIN] epoch: 25, iter: 30600/160000, loss: 0.7191, lr: 0.000980, batch_cost: 0.1718, reader_cost: 0.00075, ips: 46.5665 samples/sec | ETA 06:10:30 2022-08-23 
20:46:27 [INFO] [TRAIN] epoch: 25, iter: 30650/160000, loss: 0.7318, lr: 0.000979, batch_cost: 0.1838, reader_cost: 0.00084, ips: 43.5372 samples/sec | ETA 06:36:08 2022-08-23 20:46:36 [INFO] [TRAIN] epoch: 25, iter: 30700/160000, loss: 0.7038, lr: 0.000979, batch_cost: 0.1904, reader_cost: 0.00047, ips: 42.0137 samples/sec | ETA 06:50:20 2022-08-23 20:46:44 [INFO] [TRAIN] epoch: 25, iter: 30750/160000, loss: 0.6890, lr: 0.000979, batch_cost: 0.1532, reader_cost: 0.00041, ips: 52.2230 samples/sec | ETA 05:29:59 2022-08-23 20:46:52 [INFO] [TRAIN] epoch: 25, iter: 30800/160000, loss: 0.6830, lr: 0.000978, batch_cost: 0.1569, reader_cost: 0.00076, ips: 50.9746 samples/sec | ETA 05:37:56 2022-08-23 20:47:00 [INFO] [TRAIN] epoch: 25, iter: 30850/160000, loss: 0.7563, lr: 0.000978, batch_cost: 0.1703, reader_cost: 0.00057, ips: 46.9819 samples/sec | ETA 06:06:31 2022-08-23 20:47:10 [INFO] [TRAIN] epoch: 25, iter: 30900/160000, loss: 0.6973, lr: 0.000977, batch_cost: 0.1956, reader_cost: 0.00064, ips: 40.8940 samples/sec | ETA 07:00:55 2022-08-23 20:47:19 [INFO] [TRAIN] epoch: 25, iter: 30950/160000, loss: 0.6937, lr: 0.000977, batch_cost: 0.1753, reader_cost: 0.00094, ips: 45.6404 samples/sec | ETA 06:17:00 2022-08-23 20:47:28 [INFO] [TRAIN] epoch: 25, iter: 31000/160000, loss: 0.6934, lr: 0.000977, batch_cost: 0.1869, reader_cost: 0.00076, ips: 42.8044 samples/sec | ETA 06:41:49 2022-08-23 20:47:28 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 174s - batch_cost: 0.1736 - reader cost: 5.2466e-04 2022-08-23 20:50:22 [INFO] [EVAL] #Images: 2000 mIoU: 0.3398 Acc: 0.7586 Kappa: 0.7397 Dice: 0.4725 2022-08-23 20:50:22 [INFO] [EVAL] Class IoU: [0.6729 0.7775 0.9278 0.7292 0.665 0.7506 0.7553 0.7623 0.514 0.654 0.4788 0.5333 0.6963 0.2735 0.2732 0.3996 0.4644 0.3987 0.5873 0.404 0.7436 0.4448 0.5838 0.4748 0.2789 0.3395 0.5097 0.3858 0.4214 0.1647 0.2647 0.5046 0.3074 0.3047 0.2997 0.3677 0.3998 0.5082 0.2436 0.2895 0.179 0.0717 0.3487 0.2429 0.2837 0.2198 0.2778 0.4591 0.4403 0.5398 0.5122 0.3352 0.1263 0.2799 0.6556 0.4338 0.7923 0.3568 0.3822 0.2903 0.1198 0.4111 0.2627 0.0765 0.4138 0.6693 0.2033 0.3636 0.0321 0.3657 0.446 0.5136 0.3949 0.2936 0.4316 0.3824 0.4951 0.266 0.3082 0.405 0.6499 0.3931 0.3149 0.1411 0.2091 0.525 0.0971 0.1019 0.269 0.4801 0.4406 0.0786 0.1522 0.0409 0.1339 0.0205 0.1373 0.0829 0.2697 0.4065 0.0479 0.0592 0.1387 0.4372 0.181 0.3153 0.2725 0.5617 0.1618 0.4761 0.1124 0.4959 0.167 0.6284 0.5036 0.0217 0.4143 0.6275 0.1061 0.3404 0.5078 0.0254 0.3205 0.0729 0.2544 0.1825 0.5201 0.4683 0.0033 0.3081 0.5788 0.0396 0.329 0.2819 0.185 0.1607 0.1518 0.0128 0.1738 0.3986 0.1399 0.0311 0.4077 0.0057 0.2608 0. 
0.2509 0.0063 0.0866 0.1499] 2022-08-23 20:50:22 [INFO] [EVAL] Class Precision: [0.7536 0.8453 0.9674 0.8499 0.7641 0.873 0.8537 0.8101 0.6626 0.767 0.7383 0.6601 0.7836 0.4931 0.56 0.5661 0.6056 0.6863 0.7199 0.6092 0.8113 0.6724 0.6872 0.6457 0.4461 0.6018 0.5593 0.6762 0.6215 0.4707 0.3803 0.635 0.521 0.378 0.4705 0.6066 0.6888 0.8031 0.4263 0.6486 0.3237 0.3214 0.5951 0.5531 0.386 0.483 0.3969 0.6284 0.8159 0.7094 0.6772 0.3988 0.4119 0.7679 0.718 0.7514 0.807 0.633 0.7597 0.5603 0.2401 0.5761 0.3416 0.5233 0.4799 0.8299 0.345 0.521 0.0705 0.6275 0.7292 0.7289 0.559 0.4815 0.7548 0.5447 0.7192 0.5861 0.7494 0.4829 0.8065 0.6608 0.788 0.4309 0.309 0.6425 0.6149 0.323 0.5141 0.6889 0.6132 0.1165 0.2494 0.2595 0.193 0.1146 0.7724 0.3781 0.4192 0.6571 0.5744 0.0857 0.5299 0.7777 0.6877 0.3285 0.4533 0.7081 0.2772 0.5183 0.324 0.7353 0.5926 0.8478 0.5042 0.3741 0.6702 0.7046 0.18 0.5384 0.6318 0.098 0.7254 0.7166 0.6835 0.6297 0.884 0.6706 0.0392 0.7854 0.8136 0.3822 0.6294 0.8225 0.4792 0.256 0.3855 0.1215 0.3678 0.6535 0.3975 0.0418 0.6086 0.1386 0.9216 0. 0.9095 0.2582 0.7793 0.9338] 2022-08-23 20:50:22 [INFO] [EVAL] Class Recall: [0.8626 0.9065 0.9577 0.837 0.8368 0.8426 0.8676 0.9281 0.6962 0.8161 0.5767 0.7351 0.8621 0.3804 0.3478 0.5761 0.6657 0.4876 0.7612 0.5453 0.8991 0.5679 0.7951 0.6422 0.4268 0.4379 0.8518 0.4732 0.5669 0.2022 0.4655 0.7109 0.4285 0.6111 0.4522 0.4828 0.488 0.5805 0.3624 0.3434 0.286 0.0844 0.4572 0.3023 0.517 0.2874 0.4807 0.6303 0.4889 0.6931 0.6777 0.6775 0.1541 0.3057 0.8829 0.5064 0.9776 0.4498 0.4347 0.376 0.1929 0.5894 0.5321 0.0822 0.7502 0.7757 0.3311 0.5462 0.0556 0.467 0.5345 0.6349 0.5735 0.4293 0.5019 0.5621 0.6138 0.3276 0.3436 0.7152 0.77 0.4924 0.3441 0.1735 0.3929 0.7416 0.1034 0.1296 0.3607 0.613 0.6102 0.1946 0.2809 0.0463 0.3044 0.0244 0.1431 0.096 0.4307 0.516 0.0496 0.1609 0.1581 0.4997 0.1972 0.8869 0.4059 0.731 0.2798 0.8541 0.1468 0.6037 0.1887 0.7084 0.9976 0.0226 0.5204 0.8515 0.2054 0.4806 0.7213 0.0332 0.3647 0.075 0.2884 0.2044 0.5582 0.6082 0.0036 0.3364 0.6673 0.0424 0.4081 0.3001 0.2316 0.3014 0.2003 0.0141 0.2479 0.5055 0.1775 0.1082 0.5527 0.006 0.2667 0. 0.2573 0.0064 0.0888 0.1515] 2022-08-23 20:50:22 [INFO] [EVAL] The model with the best validation mIoU (0.3503) was saved at iter 30000. 
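The [EVAL] summary that recurs above after every 1000 training iterations has a fixed shape (#Images, mIoU, Acc, Kappa, Dice), and each evaluation pass starts immediately after the [TRAIN] record for that iteration, so the best-checkpoint bookkeeping ("The model with the best validation mIoU ... was saved at iter ...") can be reproduced offline from a saved copy of this console output. The sketch below is one way to do that; it is not part of PaddleSeg. It assumes the output has been captured to a plain-text file with one log record per line, as the trainer prints it (the train_log.txt path is a placeholder, not a file from this run), and the regular expressions simply match the literal [TRAIN] and [EVAL] strings shown above.

    import re

    # Placeholder path: point this at wherever the console output above was saved.
    LOG_PATH = "train_log.txt"

    # Patterns matching the literal [TRAIN] / [EVAL] record formats printed by this run.
    train_re = re.compile(r"\[TRAIN\] epoch: \d+, iter: (\d+)/\d+")
    eval_re = re.compile(
        r"\[EVAL\] #Images: \d+ mIoU: ([\d.]+) Acc: ([\d.]+) Kappa: ([\d.]+) Dice: ([\d.]+)"
    )

    history = []      # one (iter, mIoU, Acc, Kappa, Dice) tuple per evaluation pass
    last_iter = None  # each evaluation starts right after the [TRAIN] record for that iteration

    with open(LOG_PATH) as f:
        for line in f:
            m = train_re.search(line)
            if m:
                last_iter = int(m.group(1))
                continue
            m = eval_re.search(line)
            if m and last_iter is not None:
                miou, acc, kappa, dice = (float(v) for v in m.groups())
                history.append((last_iter, miou, acc, kappa, dice))

    best_iter, best_miou, *_ = max(history, key=lambda r: r[1])
    print(f"{len(history)} evaluations; best mIoU {best_miou:.4f} at iter {best_iter}")

On the log above this would report the best mIoU 0.3503 at iter 30000, matching the trainer's own "saved at iter 30000" lines. As a sanity check on the [TRAIN] records themselves, ips and ETA follow from batch_cost: at iter 23050, 41.4519 samples/sec × 0.1930 s/step ≈ 8 samples per step, and (160000 − 23050) × 0.1930 s ≈ 26,430 s ≈ 07:20:30, which agrees with the printed ETA.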
2022-08-23 20:50:33 [INFO] [TRAIN] epoch: 25, iter: 31050/160000, loss: 0.6738, lr: 0.000976, batch_cost: 0.2261, reader_cost: 0.00394, ips: 35.3789 samples/sec | ETA 08:05:58 2022-08-23 20:50:44 [INFO] [TRAIN] epoch: 25, iter: 31100/160000, loss: 0.7459, lr: 0.000976, batch_cost: 0.2206, reader_cost: 0.00094, ips: 36.2615 samples/sec | ETA 07:53:57 2022-08-23 20:50:56 [INFO] [TRAIN] epoch: 25, iter: 31150/160000, loss: 0.7102, lr: 0.000976, batch_cost: 0.2285, reader_cost: 0.00057, ips: 35.0150 samples/sec | ETA 08:10:38 2022-08-23 20:51:06 [INFO] [TRAIN] epoch: 25, iter: 31200/160000, loss: 0.7375, lr: 0.000975, batch_cost: 0.2066, reader_cost: 0.00077, ips: 38.7289 samples/sec | ETA 07:23:25 2022-08-23 20:51:17 [INFO] [TRAIN] epoch: 25, iter: 31250/160000, loss: 0.6863, lr: 0.000975, batch_cost: 0.2113, reader_cost: 0.00064, ips: 37.8579 samples/sec | ETA 07:33:26 2022-08-23 20:51:27 [INFO] [TRAIN] epoch: 25, iter: 31300/160000, loss: 0.7200, lr: 0.000974, batch_cost: 0.2140, reader_cost: 0.00049, ips: 37.3762 samples/sec | ETA 07:39:06 2022-08-23 20:51:37 [INFO] [TRAIN] epoch: 25, iter: 31350/160000, loss: 0.7358, lr: 0.000974, batch_cost: 0.1964, reader_cost: 0.00046, ips: 40.7392 samples/sec | ETA 07:01:03 2022-08-23 20:51:48 [INFO] [TRAIN] epoch: 25, iter: 31400/160000, loss: 0.7274, lr: 0.000974, batch_cost: 0.2058, reader_cost: 0.00042, ips: 38.8816 samples/sec | ETA 07:20:59 2022-08-23 20:51:58 [INFO] [TRAIN] epoch: 25, iter: 31450/160000, loss: 0.7144, lr: 0.000973, batch_cost: 0.2073, reader_cost: 0.00104, ips: 38.5849 samples/sec | ETA 07:24:12 2022-08-23 20:52:09 [INFO] [TRAIN] epoch: 25, iter: 31500/160000, loss: 0.6888, lr: 0.000973, batch_cost: 0.2258, reader_cost: 0.00038, ips: 35.4238 samples/sec | ETA 08:03:40 2022-08-23 20:52:19 [INFO] [TRAIN] epoch: 25, iter: 31550/160000, loss: 0.7448, lr: 0.000972, batch_cost: 0.2017, reader_cost: 0.00046, ips: 39.6611 samples/sec | ETA 07:11:49 2022-08-23 20:52:37 [INFO] [TRAIN] epoch: 26, iter: 31600/160000, loss: 0.6945, lr: 0.000972, batch_cost: 0.3505, reader_cost: 0.13164, ips: 22.8247 samples/sec | ETA 12:30:03 2022-08-23 20:52:48 [INFO] [TRAIN] epoch: 26, iter: 31650/160000, loss: 0.6778, lr: 0.000972, batch_cost: 0.2199, reader_cost: 0.00078, ips: 36.3802 samples/sec | ETA 07:50:24 2022-08-23 20:52:58 [INFO] [TRAIN] epoch: 26, iter: 31700/160000, loss: 0.6907, lr: 0.000971, batch_cost: 0.2019, reader_cost: 0.00075, ips: 39.6176 samples/sec | ETA 07:11:47 2022-08-23 20:53:07 [INFO] [TRAIN] epoch: 26, iter: 31750/160000, loss: 0.6858, lr: 0.000971, batch_cost: 0.1885, reader_cost: 0.00079, ips: 42.4401 samples/sec | ETA 06:42:55 2022-08-23 20:53:17 [INFO] [TRAIN] epoch: 26, iter: 31800/160000, loss: 0.6589, lr: 0.000971, batch_cost: 0.2016, reader_cost: 0.00071, ips: 39.6735 samples/sec | ETA 07:10:51 2022-08-23 20:53:26 [INFO] [TRAIN] epoch: 26, iter: 31850/160000, loss: 0.6897, lr: 0.000970, batch_cost: 0.1661, reader_cost: 0.00062, ips: 48.1516 samples/sec | ETA 05:54:51 2022-08-23 20:53:36 [INFO] [TRAIN] epoch: 26, iter: 31900/160000, loss: 0.7458, lr: 0.000970, batch_cost: 0.1958, reader_cost: 0.00160, ips: 40.8661 samples/sec | ETA 06:57:57 2022-08-23 20:53:44 [INFO] [TRAIN] epoch: 26, iter: 31950/160000, loss: 0.6516, lr: 0.000969, batch_cost: 0.1777, reader_cost: 0.00069, ips: 45.0218 samples/sec | ETA 06:19:13 2022-08-23 20:53:54 [INFO] [TRAIN] epoch: 26, iter: 32000/160000, loss: 0.7019, lr: 0.000969, batch_cost: 0.2008, reader_cost: 0.00096, ips: 39.8457 samples/sec | ETA 07:08:19 2022-08-23 20:53:54 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 163s - batch_cost: 0.1625 - reader cost: 8.9834e-04 2022-08-23 20:56:37 [INFO] [EVAL] #Images: 2000 mIoU: 0.3402 Acc: 0.7585 Kappa: 0.7402 Dice: 0.4724 2022-08-23 20:56:37 [INFO] [EVAL] Class IoU: [0.6767 0.7817 0.9271 0.7271 0.669 0.7591 0.7505 0.7826 0.5106 0.6392 0.4713 0.5293 0.6956 0.1876 0.3137 0.4086 0.4414 0.3741 0.5816 0.418 0.7363 0.38 0.5749 0.4888 0.2967 0.3663 0.4955 0.3695 0.4339 0.2065 0.2255 0.4848 0.3062 0.3382 0.3157 0.3723 0.4374 0.5547 0.277 0.3222 0.1657 0.1108 0.3544 0.2627 0.2589 0.2072 0.2753 0.4784 0.6231 0.5367 0.4919 0.4162 0.1747 0.1904 0.6595 0.2918 0.8789 0.3148 0.4485 0.2677 0.1126 0.4283 0.271 0.1971 0.4371 0.7121 0.2443 0.3832 0.1224 0.3371 0.4566 0.5003 0.3407 0.2648 0.4506 0.3884 0.5201 0.2508 0.5027 0.188 0.6186 0.363 0.3635 0.0592 0.0785 0.5043 0.0868 0.075 0.3006 0.4976 0.4722 0.0024 0.1842 0.0392 0.1978 0.0066 0.1714 0.1439 0.2676 0.3098 0.1018 0.113 0.0921 0.1638 0.1892 0.5959 0.157 0.5633 0.1138 0.3958 0.1019 0.29 0.1669 0.547 0.7199 0.0153 0.3893 0.5801 0.209 0.3654 0.4601 0.0206 0.2964 0.1392 0.244 0.1872 0.5441 0.4187 0. 0.3646 0.5834 0.0375 0.27 0.2385 0.1617 0.1651 0.1492 0.0314 0.1442 0.3917 0.167 0.0001 0.2888 0.0632 0.306 0.0032 0.3071 0.0295 0.0963 0.1685] 2022-08-23 20:56:37 [INFO] [EVAL] Class Precision: [0.7918 0.8544 0.9572 0.8163 0.7419 0.8336 0.8297 0.8512 0.6722 0.7149 0.7041 0.7306 0.7723 0.5372 0.4833 0.5404 0.5981 0.6857 0.718 0.5973 0.803 0.6353 0.7034 0.5872 0.5303 0.5453 0.5949 0.6796 0.6653 0.3355 0.4479 0.6049 0.5564 0.4379 0.475 0.5586 0.6112 0.7452 0.449 0.5648 0.3806 0.349 0.5905 0.459 0.4833 0.4364 0.4205 0.8033 0.6835 0.7196 0.6837 0.516 0.2952 0.4432 0.743 0.3601 0.9191 0.6786 0.6461 0.418 0.1381 0.6636 0.3482 0.5432 0.5888 0.86 0.3888 0.5728 0.3516 0.5812 0.5669 0.7106 0.5759 0.432 0.7079 0.5207 0.7717 0.6802 0.7416 0.254 0.764 0.6908 0.7509 0.2152 0.1807 0.7265 0.2601 0.3546 0.5278 0.699 0.8286 0.0037 0.3226 0.2727 0.4787 0.0435 0.4403 0.4588 0.4106 0.6065 0.4865 0.2018 0.4907 0.6173 0.8916 0.6428 0.3378 0.7262 0.2346 0.5035 0.3891 0.3442 0.6331 0.5586 0.7981 0.2171 0.7802 0.6652 0.3396 0.603 0.6747 0.4784 0.7447 0.5991 0.7185 0.6412 0.8267 0.6024 0. 0.5479 0.6585 0.3873 0.6697 0.7551 0.7071 0.3477 0.365 0.0737 0.3134 0.6951 0.5927 0.0002 0.6351 0.3071 0.599 0.0567 0.8642 0.2137 0.2887 0.8574] 2022-08-23 20:56:37 [INFO] [EVAL] Class Recall: [0.8232 0.9019 0.9672 0.8694 0.872 0.8946 0.8872 0.9066 0.6799 0.8579 0.5876 0.6576 0.8751 0.2237 0.472 0.6262 0.6275 0.4515 0.7539 0.5819 0.8985 0.4861 0.7589 0.7447 0.4024 0.5274 0.748 0.4475 0.555 0.3494 0.3123 0.7094 0.405 0.5977 0.4849 0.5275 0.6061 0.6846 0.4196 0.4285 0.2269 0.1397 0.47 0.3805 0.3579 0.2829 0.4437 0.5419 0.8758 0.6786 0.6368 0.6827 0.2996 0.2503 0.8544 0.606 0.9526 0.37 0.5946 0.4268 0.3784 0.5471 0.5501 0.2362 0.6292 0.8055 0.3966 0.5366 0.1581 0.4453 0.7012 0.6283 0.4548 0.4062 0.5536 0.6046 0.6147 0.2844 0.6094 0.4199 0.7648 0.4335 0.4133 0.0755 0.1218 0.6225 0.1153 0.0868 0.4112 0.6333 0.5234 0.007 0.3004 0.0438 0.2521 0.0078 0.2191 0.1734 0.4344 0.3878 0.1141 0.2042 0.1018 0.1823 0.1936 0.8911 0.2268 0.7151 0.1811 0.6493 0.1213 0.6483 0.1848 0.9632 0.8802 0.0162 0.4373 0.8194 0.352 0.4811 0.5913 0.0211 0.3299 0.1535 0.2698 0.209 0.6142 0.5786 0. 
0.5215 0.8365 0.0399 0.3115 0.2585 0.1733 0.2391 0.2015 0.0519 0.2109 0.4729 0.1886 0.0003 0.3463 0.0736 0.3848 0.0034 0.3227 0.0331 0.1262 0.1734] 2022-08-23 20:56:38 [INFO] [EVAL] The model with the best validation mIoU (0.3503) was saved at iter 30000. 2022-08-23 20:56:48 [INFO] [TRAIN] epoch: 26, iter: 32050/160000, loss: 0.7072, lr: 0.000969, batch_cost: 0.2133, reader_cost: 0.00629, ips: 37.4977 samples/sec | ETA 07:34:57 2022-08-23 20:56:59 [INFO] [TRAIN] epoch: 26, iter: 32100/160000, loss: 0.7002, lr: 0.000968, batch_cost: 0.2162, reader_cost: 0.00131, ips: 37.0018 samples/sec | ETA 07:40:52 2022-08-23 20:57:11 [INFO] [TRAIN] epoch: 26, iter: 32150/160000, loss: 0.6994, lr: 0.000968, batch_cost: 0.2300, reader_cost: 0.00067, ips: 34.7866 samples/sec | ETA 08:10:02 2022-08-23 20:57:22 [INFO] [TRAIN] epoch: 26, iter: 32200/160000, loss: 0.6738, lr: 0.000968, batch_cost: 0.2228, reader_cost: 0.00043, ips: 35.8994 samples/sec | ETA 07:54:39 2022-08-23 20:57:32 [INFO] [TRAIN] epoch: 26, iter: 32250/160000, loss: 0.6950, lr: 0.000967, batch_cost: 0.2065, reader_cost: 0.00038, ips: 38.7437 samples/sec | ETA 07:19:38 2022-08-23 20:57:43 [INFO] [TRAIN] epoch: 26, iter: 32300/160000, loss: 0.7248, lr: 0.000967, batch_cost: 0.2113, reader_cost: 0.00067, ips: 37.8564 samples/sec | ETA 07:29:46 2022-08-23 20:57:53 [INFO] [TRAIN] epoch: 26, iter: 32350/160000, loss: 0.6988, lr: 0.000966, batch_cost: 0.2030, reader_cost: 0.00069, ips: 39.4055 samples/sec | ETA 07:11:55 2022-08-23 20:58:02 [INFO] [TRAIN] epoch: 26, iter: 32400/160000, loss: 0.6392, lr: 0.000966, batch_cost: 0.1824, reader_cost: 0.00047, ips: 43.8521 samples/sec | ETA 06:27:58 2022-08-23 20:58:12 [INFO] [TRAIN] epoch: 26, iter: 32450/160000, loss: 0.6640, lr: 0.000966, batch_cost: 0.2072, reader_cost: 0.00056, ips: 38.6140 samples/sec | ETA 07:20:25 2022-08-23 20:58:23 [INFO] [TRAIN] epoch: 26, iter: 32500/160000, loss: 0.6477, lr: 0.000965, batch_cost: 0.2112, reader_cost: 0.00065, ips: 37.8808 samples/sec | ETA 07:28:46 2022-08-23 20:58:34 [INFO] [TRAIN] epoch: 26, iter: 32550/160000, loss: 0.7121, lr: 0.000965, batch_cost: 0.2210, reader_cost: 0.00035, ips: 36.1999 samples/sec | ETA 07:49:25 2022-08-23 20:58:46 [INFO] [TRAIN] epoch: 26, iter: 32600/160000, loss: 0.7268, lr: 0.000965, batch_cost: 0.2371, reader_cost: 0.00062, ips: 33.7475 samples/sec | ETA 08:23:20 2022-08-23 20:58:55 [INFO] [TRAIN] epoch: 26, iter: 32650/160000, loss: 0.7128, lr: 0.000964, batch_cost: 0.1961, reader_cost: 0.00771, ips: 40.7862 samples/sec | ETA 06:56:19 2022-08-23 20:59:05 [INFO] [TRAIN] epoch: 26, iter: 32700/160000, loss: 0.7391, lr: 0.000964, batch_cost: 0.1942, reader_cost: 0.00726, ips: 41.1867 samples/sec | ETA 06:52:06 2022-08-23 20:59:14 [INFO] [TRAIN] epoch: 26, iter: 32750/160000, loss: 0.6895, lr: 0.000963, batch_cost: 0.1700, reader_cost: 0.00196, ips: 47.0495 samples/sec | ETA 06:00:36 2022-08-23 20:59:22 [INFO] [TRAIN] epoch: 26, iter: 32800/160000, loss: 0.6546, lr: 0.000963, batch_cost: 0.1757, reader_cost: 0.00032, ips: 45.5247 samples/sec | ETA 06:12:32 2022-08-23 20:59:36 [INFO] [TRAIN] epoch: 27, iter: 32850/160000, loss: 0.6324, lr: 0.000963, batch_cost: 0.2624, reader_cost: 0.10185, ips: 30.4861 samples/sec | ETA 09:16:05 2022-08-23 20:59:46 [INFO] [TRAIN] epoch: 27, iter: 32900/160000, loss: 0.6859, lr: 0.000962, batch_cost: 0.2132, reader_cost: 0.00092, ips: 37.5279 samples/sec | ETA 07:31:34 2022-08-23 20:59:56 [INFO] [TRAIN] epoch: 27, iter: 32950/160000, loss: 0.6622, lr: 0.000962, batch_cost: 0.1957, reader_cost: 
0.00071, ips: 40.8706 samples/sec | ETA 06:54:28 2022-08-23 21:00:04 [INFO] [TRAIN] epoch: 27, iter: 33000/160000, loss: 0.6336, lr: 0.000962, batch_cost: 0.1635, reader_cost: 0.00033, ips: 48.9377 samples/sec | ETA 05:46:01 2022-08-23 21:00:04 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 172s - batch_cost: 0.1719 - reader cost: 7.0758e-04 2022-08-23 21:02:56 [INFO] [EVAL] #Images: 2000 mIoU: 0.3332 Acc: 0.7538 Kappa: 0.7351 Dice: 0.4633 2022-08-23 21:02:56 [INFO] [EVAL] Class IoU: [0.6754 0.7635 0.9275 0.7186 0.6558 0.7527 0.7617 0.7733 0.5056 0.6205 0.4788 0.5345 0.6984 0.2642 0.2937 0.4107 0.4484 0.3394 0.588 0.4247 0.7551 0.4391 0.5925 0.4883 0.3387 0.3286 0.4379 0.4035 0.3655 0.1977 0.2377 0.4893 0.2353 0.2992 0.4019 0.3198 0.4335 0.5226 0.1913 0.2977 0.176 0.0982 0.3291 0.257 0.2879 0.2404 0.2315 0.4864 0.6317 0.5295 0.5201 0.3353 0.1872 0.2403 0.6619 0.3672 0.8482 0.3762 0.3412 0.3044 0.122 0.2518 0.2632 0.1487 0.416 0.6346 0.2114 0.4261 0.0399 0.3403 0.4554 0.5061 0.3822 0.2605 0.4153 0.3434 0.2966 0.2692 0.3128 0.268 0.6337 0.3402 0.316 0.0188 0.0393 0.5207 0.0967 0.0923 0.3245 0.4952 0.4186 0.002 0.1551 0.0599 0.0063 0.006 0.1958 0.1207 0.2687 0.3259 0.1357 0.0555 0.1273 0.3136 0.1898 0.6211 0.1481 0.5357 0.0735 0.2153 0.0451 0.4357 0.1521 0.5167 0.7348 0.0108 0.4341 0.6605 0.0878 0.3089 0.4512 0.0036 0.3309 0.1201 0.2396 0.1652 0.525 0.3939 0. 0.2468 0.5642 0.0039 0.3241 0.3626 0.2218 0.1609 0.1512 0.0205 0.1556 0.3733 0.2062 0. 0.1577 0.4071 0.3198 0.0024 0.3267 0.0087 0.0947 0.1786] 2022-08-23 21:02:56 [INFO] [EVAL] Class Precision: [0.783 0.8536 0.9672 0.8011 0.7184 0.8564 0.8506 0.8896 0.6274 0.7177 0.7226 0.7565 0.7896 0.4213 0.4617 0.5582 0.7203 0.7406 0.7219 0.5843 0.8514 0.6013 0.7448 0.5997 0.4719 0.5195 0.5643 0.6342 0.7954 0.5642 0.4446 0.6784 0.6153 0.3709 0.6032 0.4405 0.6405 0.7646 0.2555 0.6175 0.3558 0.3746 0.5201 0.5151 0.4227 0.4911 0.355 0.7247 0.6872 0.6693 0.6346 0.3777 0.3867 0.5671 0.7086 0.699 0.8881 0.6447 0.6985 0.5301 0.1783 0.505 0.4316 0.5884 0.5047 0.7194 0.3495 0.5652 0.1538 0.5675 0.5529 0.551 0.5769 0.3859 0.5371 0.6012 0.6776 0.4784 0.6202 0.3002 0.708 0.5785 0.7942 0.0978 0.1116 0.6883 0.551 0.3391 0.4771 0.7605 0.6144 0.0049 0.363 0.2739 0.0126 0.106 0.3232 0.4641 0.3816 0.7745 0.4801 0.0679 0.5877 0.3764 0.846 0.7047 0.3733 0.8348 0.2026 0.4084 0.3389 0.7352 0.4808 0.8701 0.749 0.1248 0.6692 0.7385 0.3909 0.507 0.6885 0.0256 0.6501 0.6543 0.6809 0.5965 0.8746 0.7878 0. 0.9733 0.6542 0.4475 0.8441 0.7964 0.6273 0.2388 0.3752 0.1053 0.337 0.6611 0.3111 0. 
0.699 0.6549 0.6107 0.1022 0.8389 0.2424 0.6115 0.8728] 2022-08-23 21:02:56 [INFO] [EVAL] Class Recall: [0.8309 0.8786 0.9576 0.8746 0.8827 0.8614 0.8794 0.8555 0.7225 0.8208 0.5867 0.6456 0.858 0.4147 0.4467 0.6085 0.543 0.3853 0.7603 0.6085 0.8697 0.6195 0.7435 0.7246 0.5456 0.4721 0.6617 0.526 0.4034 0.2334 0.3381 0.6371 0.2759 0.6075 0.5463 0.5386 0.5729 0.6228 0.4325 0.3651 0.2582 0.1174 0.4727 0.339 0.4744 0.3201 0.3996 0.5966 0.8865 0.717 0.7423 0.7495 0.2663 0.2943 0.9093 0.4362 0.9496 0.4747 0.4001 0.417 0.2789 0.3344 0.4027 0.1659 0.703 0.8433 0.3485 0.634 0.0511 0.4594 0.7209 0.8614 0.531 0.445 0.6468 0.4447 0.3453 0.3811 0.3869 0.7144 0.8579 0.4523 0.3442 0.0228 0.0572 0.6813 0.105 0.1126 0.5037 0.5867 0.5678 0.0033 0.2132 0.0713 0.0125 0.0063 0.3319 0.1402 0.4759 0.3601 0.159 0.2322 0.1397 0.6527 0.1966 0.8396 0.1971 0.5992 0.1033 0.3129 0.0494 0.5168 0.182 0.5599 0.9748 0.0117 0.5528 0.8621 0.1017 0.4415 0.5669 0.0042 0.4027 0.1282 0.2699 0.186 0.5678 0.4406 0. 0.2485 0.8039 0.0039 0.3447 0.3997 0.2554 0.3302 0.2021 0.0249 0.2243 0.4616 0.3794 0. 0.1691 0.5183 0.4016 0.0025 0.3485 0.0089 0.1008 0.1834] 2022-08-23 21:02:57 [INFO] [EVAL] The model with the best validation mIoU (0.3503) was saved at iter 30000. 2022-08-23 21:03:08 [INFO] [TRAIN] epoch: 27, iter: 33050/160000, loss: 0.6796, lr: 0.000961, batch_cost: 0.2263, reader_cost: 0.00450, ips: 35.3574 samples/sec | ETA 07:58:43 2022-08-23 21:03:19 [INFO] [TRAIN] epoch: 27, iter: 33100/160000, loss: 0.6749, lr: 0.000961, batch_cost: 0.2207, reader_cost: 0.00208, ips: 36.2447 samples/sec | ETA 07:46:49 2022-08-23 21:03:30 [INFO] [TRAIN] epoch: 27, iter: 33150/160000, loss: 0.6756, lr: 0.000960, batch_cost: 0.2150, reader_cost: 0.00095, ips: 37.2025 samples/sec | ETA 07:34:37 2022-08-23 21:03:40 [INFO] [TRAIN] epoch: 27, iter: 33200/160000, loss: 0.7245, lr: 0.000960, batch_cost: 0.2030, reader_cost: 0.00549, ips: 39.3998 samples/sec | ETA 07:09:06 2022-08-23 21:03:50 [INFO] [TRAIN] epoch: 27, iter: 33250/160000, loss: 0.6714, lr: 0.000960, batch_cost: 0.1933, reader_cost: 0.00409, ips: 41.3759 samples/sec | ETA 06:48:26 2022-08-23 21:04:01 [INFO] [TRAIN] epoch: 27, iter: 33300/160000, loss: 0.6636, lr: 0.000959, batch_cost: 0.2367, reader_cost: 0.00106, ips: 33.7918 samples/sec | ETA 08:19:55 2022-08-23 21:04:11 [INFO] [TRAIN] epoch: 27, iter: 33350/160000, loss: 0.6605, lr: 0.000959, batch_cost: 0.1850, reader_cost: 0.00036, ips: 43.2461 samples/sec | ETA 06:30:28 2022-08-23 21:04:21 [INFO] [TRAIN] epoch: 27, iter: 33400/160000, loss: 0.6910, lr: 0.000958, batch_cost: 0.2126, reader_cost: 0.00046, ips: 37.6219 samples/sec | ETA 07:28:40 2022-08-23 21:04:30 [INFO] [TRAIN] epoch: 27, iter: 33450/160000, loss: 0.6472, lr: 0.000958, batch_cost: 0.1821, reader_cost: 0.00138, ips: 43.9328 samples/sec | ETA 06:24:04 2022-08-23 21:04:42 [INFO] [TRAIN] epoch: 27, iter: 33500/160000, loss: 0.6613, lr: 0.000958, batch_cost: 0.2289, reader_cost: 0.00058, ips: 34.9549 samples/sec | ETA 08:02:31 2022-08-23 21:04:52 [INFO] [TRAIN] epoch: 27, iter: 33550/160000, loss: 0.6789, lr: 0.000957, batch_cost: 0.1946, reader_cost: 0.00065, ips: 41.1113 samples/sec | ETA 06:50:06 2022-08-23 21:05:03 [INFO] [TRAIN] epoch: 27, iter: 33600/160000, loss: 0.6686, lr: 0.000957, batch_cost: 0.2325, reader_cost: 0.00060, ips: 34.4099 samples/sec | ETA 08:09:46 2022-08-23 21:05:13 [INFO] [TRAIN] epoch: 27, iter: 33650/160000, loss: 0.6941, lr: 0.000957, batch_cost: 0.1964, reader_cost: 0.01508, ips: 40.7403 samples/sec | ETA 06:53:30 2022-08-23 21:05:24 
[INFO] [TRAIN] epoch: 27, iter: 33700/160000, loss: 0.7267, lr: 0.000956, batch_cost: 0.2114, reader_cost: 0.00064, ips: 37.8471 samples/sec | ETA 07:24:56 2022-08-23 21:05:32 [INFO] [TRAIN] epoch: 27, iter: 33750/160000, loss: 0.7149, lr: 0.000956, batch_cost: 0.1777, reader_cost: 0.00129, ips: 45.0089 samples/sec | ETA 06:14:00 2022-08-23 21:05:42 [INFO] [TRAIN] epoch: 27, iter: 33800/160000, loss: 0.6623, lr: 0.000955, batch_cost: 0.1957, reader_cost: 0.00046, ips: 40.8739 samples/sec | ETA 06:51:40 2022-08-23 21:05:51 [INFO] [TRAIN] epoch: 27, iter: 33850/160000, loss: 0.6463, lr: 0.000955, batch_cost: 0.1789, reader_cost: 0.00049, ips: 44.7202 samples/sec | ETA 06:16:07 2022-08-23 21:06:00 [INFO] [TRAIN] epoch: 27, iter: 33900/160000, loss: 0.7045, lr: 0.000955, batch_cost: 0.1810, reader_cost: 0.00065, ips: 44.1951 samples/sec | ETA 06:20:26 2022-08-23 21:06:10 [INFO] [TRAIN] epoch: 27, iter: 33950/160000, loss: 0.6715, lr: 0.000954, batch_cost: 0.1934, reader_cost: 0.00101, ips: 41.3553 samples/sec | ETA 06:46:23 2022-08-23 21:06:18 [INFO] [TRAIN] epoch: 27, iter: 34000/160000, loss: 0.6743, lr: 0.000954, batch_cost: 0.1706, reader_cost: 0.00063, ips: 46.9063 samples/sec | ETA 05:58:09 2022-08-23 21:06:18 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 197s - batch_cost: 0.1974 - reader cost: 6.3416e-04 2022-08-23 21:09:36 [INFO] [EVAL] #Images: 2000 mIoU: 0.3532 Acc: 0.7619 Kappa: 0.7440 Dice: 0.4867 2022-08-23 21:09:36 [INFO] [EVAL] Class IoU: [0.6761 0.7683 0.9285 0.7317 0.6743 0.7504 0.7729 0.7744 0.5136 0.6747 0.4768 0.5514 0.6933 0.2709 0.2776 0.4047 0.4909 0.4087 0.5993 0.4183 0.7384 0.501 0.6069 0.4775 0.3285 0.4199 0.4988 0.398 0.4189 0.2827 0.2799 0.515 0.2654 0.2878 0.3886 0.3545 0.4305 0.5266 0.2567 0.3207 0.1288 0.0961 0.3334 0.2545 0.2694 0.2318 0.3112 0.4813 0.6923 0.4717 0.5339 0.4792 0.1744 0.2704 0.6531 0.4682 0.8364 0.2667 0.423 0.351 0.0742 0.4708 0.2738 0.1081 0.4331 0.7154 0.2211 0.3986 0.052 0.2989 0.3882 0.4405 0.4065 0.2382 0.4233 0.392 0.5647 0.287 0.4029 0.2473 0.6084 0.3584 0.3045 0.1309 0.1939 0.5099 0.0841 0.0653 0.2601 0.5013 0.4413 0.0683 0.1639 0.084 0. 0.009 0.2434 0.1476 0.2501 0.3276 0.211 0.0743 0.1529 0.7407 0.1102 0.4804 0.1767 0.631 0.0715 0.3941 0.1159 0.4789 0.1569 0.6675 0.6304 0.0024 0.3703 0.6627 0.1899 0.3661 0.4965 0.0331 0.3155 0.078 0.2442 0.1566 0.4948 0.4441 0.2613 0.3689 0.5496 0.0025 0.3499 0.3597 0.178 0.1674 0.1407 0.0243 0.1557 0.3783 0.15 0.0001 0.2581 0.241 0.3247 0. 0.3502 0.0385 0.0982 0.1674] 2022-08-23 21:09:36 [INFO] [EVAL] Class Precision: [0.7838 0.8643 0.958 0.8395 0.7557 0.8653 0.8774 0.8261 0.6407 0.7558 0.712 0.7068 0.7594 0.4783 0.5133 0.6192 0.6155 0.6562 0.7507 0.6382 0.8066 0.6586 0.772 0.6497 0.4716 0.4832 0.6925 0.6035 0.6969 0.4422 0.4052 0.6519 0.4871 0.348 0.5933 0.4633 0.683 0.6992 0.4731 0.5989 0.2659 0.4236 0.5825 0.4521 0.3902 0.606 0.58 0.6762 0.7386 0.5499 0.6853 0.6894 0.3245 0.5885 0.7144 0.623 0.8603 0.659 0.5123 0.6621 0.1263 0.5779 0.4302 0.6401 0.5605 0.8526 0.3787 0.5094 0.6174 0.5066 0.6742 0.7861 0.5771 0.3163 0.7099 0.6037 0.698 0.7144 0.7087 0.3731 0.7631 0.6571 0.7821 0.3157 0.3559 0.7241 0.3313 0.5135 0.33 0.7396 0.7014 0.0896 0.4102 0.2328 0. 
0.0818 0.4053 0.3245 0.3469 0.6832 0.4335 0.1196 0.4638 0.7902 0.4502 0.5024 0.2238 0.8447 0.1794 0.493 0.3512 0.7313 0.6082 0.7047 0.6376 0.0724 0.7647 0.681 0.3925 0.5356 0.6692 0.1068 0.5882 0.6938 0.7554 0.6546 0.863 0.6478 0.5218 0.5323 0.6469 0.2468 0.6709 0.7371 0.7632 0.409 0.3191 0.1575 0.2945 0.5306 0.6999 0.0002 0.6476 0.4697 0.6678 0. 0.7964 0.2374 0.3298 0.7592] 2022-08-23 21:09:36 [INFO] [EVAL] Class Recall: [0.8311 0.8737 0.9678 0.8507 0.8622 0.8497 0.8665 0.9251 0.7213 0.8627 0.5907 0.7149 0.8885 0.3844 0.3768 0.5388 0.7081 0.52 0.7483 0.5483 0.8973 0.6769 0.7394 0.643 0.5199 0.7622 0.6408 0.5389 0.5122 0.4393 0.475 0.7103 0.3684 0.6245 0.5297 0.6015 0.538 0.6808 0.3594 0.4084 0.1999 0.1106 0.4381 0.3679 0.4652 0.2729 0.4018 0.6255 0.9169 0.7683 0.7074 0.6111 0.2739 0.3334 0.8839 0.6533 0.9678 0.3094 0.7082 0.4276 0.1526 0.7175 0.4295 0.1151 0.6558 0.8164 0.3471 0.647 0.0538 0.4217 0.4778 0.5005 0.5789 0.491 0.5119 0.5277 0.7474 0.3242 0.4828 0.4231 0.75 0.4408 0.3327 0.1827 0.2986 0.6329 0.1013 0.0696 0.5512 0.6087 0.5434 0.2236 0.2144 0.1161 0. 0.0101 0.3787 0.213 0.4726 0.3863 0.2913 0.164 0.1857 0.922 0.1274 0.9164 0.4567 0.7139 0.1063 0.6626 0.1475 0.5812 0.1745 0.9267 0.9823 0.0025 0.418 0.961 0.2689 0.5363 0.658 0.0459 0.405 0.0808 0.2652 0.1707 0.537 0.5855 0.3436 0.5457 0.785 0.0025 0.4224 0.4127 0.1884 0.2209 0.2011 0.0279 0.2484 0.5686 0.1603 0.0003 0.3003 0.3312 0.3872 0. 0.3846 0.044 0.1226 0.1768] 2022-08-23 21:09:37 [INFO] [EVAL] The model with the best validation mIoU (0.3532) was saved at iter 34000. 2022-08-23 21:09:47 [INFO] [TRAIN] epoch: 27, iter: 34050/160000, loss: 0.6882, lr: 0.000954, batch_cost: 0.2091, reader_cost: 0.00316, ips: 38.2611 samples/sec | ETA 07:18:54 2022-08-23 21:09:57 [INFO] [TRAIN] epoch: 27, iter: 34100/160000, loss: 0.7059, lr: 0.000953, batch_cost: 0.1923, reader_cost: 0.00140, ips: 41.6102 samples/sec | ETA 06:43:25 2022-08-23 21:10:13 [INFO] [TRAIN] epoch: 28, iter: 34150/160000, loss: 0.6709, lr: 0.000953, batch_cost: 0.3230, reader_cost: 0.07883, ips: 24.7666 samples/sec | ETA 11:17:31 2022-08-23 21:10:24 [INFO] [TRAIN] epoch: 28, iter: 34200/160000, loss: 0.7167, lr: 0.000952, batch_cost: 0.2342, reader_cost: 0.00067, ips: 34.1542 samples/sec | ETA 08:11:06 2022-08-23 21:10:35 [INFO] [TRAIN] epoch: 28, iter: 34250/160000, loss: 0.6453, lr: 0.000952, batch_cost: 0.2172, reader_cost: 0.00064, ips: 36.8252 samples/sec | ETA 07:35:18 2022-08-23 21:10:47 [INFO] [TRAIN] epoch: 28, iter: 34300/160000, loss: 0.6788, lr: 0.000952, batch_cost: 0.2268, reader_cost: 0.00044, ips: 35.2773 samples/sec | ETA 07:55:05 2022-08-23 21:10:58 [INFO] [TRAIN] epoch: 28, iter: 34350/160000, loss: 0.6526, lr: 0.000951, batch_cost: 0.2312, reader_cost: 0.00040, ips: 34.6058 samples/sec | ETA 08:04:07 2022-08-23 21:11:09 [INFO] [TRAIN] epoch: 28, iter: 34400/160000, loss: 0.6615, lr: 0.000951, batch_cost: 0.2189, reader_cost: 0.00065, ips: 36.5492 samples/sec | ETA 07:38:11 2022-08-23 21:11:20 [INFO] [TRAIN] epoch: 28, iter: 34450/160000, loss: 0.7138, lr: 0.000951, batch_cost: 0.2186, reader_cost: 0.00061, ips: 36.6000 samples/sec | ETA 07:37:22 2022-08-23 21:11:31 [INFO] [TRAIN] epoch: 28, iter: 34500/160000, loss: 0.6583, lr: 0.000950, batch_cost: 0.2133, reader_cost: 0.00069, ips: 37.5125 samples/sec | ETA 07:26:04 2022-08-23 21:11:39 [INFO] [TRAIN] epoch: 28, iter: 34550/160000, loss: 0.6778, lr: 0.000950, batch_cost: 0.1735, reader_cost: 0.00070, ips: 46.1088 samples/sec | ETA 06:02:45 2022-08-23 21:11:48 [INFO] [TRAIN] epoch: 28, iter: 
34600/160000, loss: 0.6681, lr: 0.000949, batch_cost: 0.1630, reader_cost: 0.00034, ips: 49.0732 samples/sec | ETA 05:40:42 2022-08-23 21:11:56 [INFO] [TRAIN] epoch: 28, iter: 34650/160000, loss: 0.7130, lr: 0.000949, batch_cost: 0.1765, reader_cost: 0.01153, ips: 45.3336 samples/sec | ETA 06:08:40 2022-08-23 21:12:05 [INFO] [TRAIN] epoch: 28, iter: 34700/160000, loss: 0.6710, lr: 0.000949, batch_cost: 0.1741, reader_cost: 0.00078, ips: 45.9479 samples/sec | ETA 06:03:35 2022-08-23 21:12:16 [INFO] [TRAIN] epoch: 28, iter: 34750/160000, loss: 0.6956, lr: 0.000948, batch_cost: 0.2078, reader_cost: 0.00035, ips: 38.5044 samples/sec | ETA 07:13:43 2022-08-23 21:12:27 [INFO] [TRAIN] epoch: 28, iter: 34800/160000, loss: 0.6681, lr: 0.000948, batch_cost: 0.2306, reader_cost: 0.00058, ips: 34.6980 samples/sec | ETA 08:01:06 2022-08-23 21:12:36 [INFO] [TRAIN] epoch: 28, iter: 34850/160000, loss: 0.6299, lr: 0.000948, batch_cost: 0.1775, reader_cost: 0.00120, ips: 45.0578 samples/sec | ETA 06:10:20 2022-08-23 21:12:44 [INFO] [TRAIN] epoch: 28, iter: 34900/160000, loss: 0.6815, lr: 0.000947, batch_cost: 0.1698, reader_cost: 0.00054, ips: 47.1076 samples/sec | ETA 05:54:04 2022-08-23 21:12:53 [INFO] [TRAIN] epoch: 28, iter: 34950/160000, loss: 0.6818, lr: 0.000947, batch_cost: 0.1635, reader_cost: 0.00095, ips: 48.9407 samples/sec | ETA 05:40:41 2022-08-23 21:13:01 [INFO] [TRAIN] epoch: 28, iter: 35000/160000, loss: 0.6937, lr: 0.000946, batch_cost: 0.1590, reader_cost: 0.00212, ips: 50.3189 samples/sec | ETA 05:31:13 2022-08-23 21:13:01 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 149s - batch_cost: 0.1493 - reader cost: 7.6694e-04 2022-08-23 21:15:30 [INFO] [EVAL] #Images: 2000 mIoU: 0.3435 Acc: 0.7583 Kappa: 0.7398 Dice: 0.4768 2022-08-23 21:15:30 [INFO] [EVAL] Class IoU: [0.6725 0.7683 0.9267 0.725 0.6706 0.7523 0.7688 0.7767 0.5127 0.638 0.4711 0.5224 0.6998 0.2311 0.3144 0.4283 0.451 0.4054 0.5865 0.4116 0.7501 0.4374 0.5909 0.4773 0.3222 0.4378 0.4567 0.4044 0.4206 0.2858 0.2601 0.4746 0.2922 0.3394 0.391 0.3258 0.4273 0.5362 0.2641 0.337 0.0913 0.1234 0.3339 0.2519 0.2777 0.2222 0.2668 0.4667 0.6522 0.5067 0.5371 0.4081 0.15 0.2429 0.634 0.4424 0.8749 0.4166 0.502 0.3166 0.0779 0.2102 0.3072 0.1259 0.4079 0.6484 0.2336 0.3618 0.068 0.3156 0.4099 0.5273 0.3885 0.2484 0.4242 0.3775 0.5133 0.2589 0.2149 0.2694 0.5772 0.3603 0.34 0.2137 0.2239 0.4702 0.0834 0.1172 0.2547 0.4916 0.3491 0.0479 0.1438 0.0878 0.0263 0.0152 0.2989 0.1418 0.2774 0.3207 0.0181 0.0995 0.2026 0.6056 0.1232 0.4142 0.1737 0.5566 0.0459 0.282 0.1135 0.0991 0.1486 0.6786 0.6672 0.0211 0.3551 0.6743 0.1011 0.3458 0.4846 0.0397 0.3373 0.1715 0.2642 0.1767 0.5183 0.4542 0.3696 0.3431 0.5081 0.0036 0.3022 0.3449 0.2118 0.1522 0.1478 0.0196 0.1779 0.334 0.2009 0.027 0.3409 0.0008 0.294 0. 
0.2958 0.0557 0.0876 0.2163] 2022-08-23 21:15:30 [INFO] [EVAL] Class Precision: [0.7664 0.8784 0.9666 0.8218 0.7428 0.8546 0.8467 0.8405 0.6771 0.7568 0.6999 0.7589 0.7884 0.4827 0.485 0.6387 0.6228 0.732 0.7585 0.6226 0.834 0.635 0.6971 0.6312 0.5529 0.4932 0.5317 0.6753 0.7959 0.4432 0.4901 0.5563 0.5131 0.5378 0.5012 0.4443 0.6503 0.6922 0.4284 0.5419 0.3538 0.2904 0.5759 0.539 0.4307 0.4699 0.388 0.6635 0.6882 0.6532 0.7034 0.4939 0.2675 0.4693 0.6568 0.6606 0.9225 0.5326 0.6483 0.59 0.2154 0.3921 0.4357 0.597 0.4747 0.909 0.3657 0.6341 0.143 0.5313 0.4925 0.6259 0.6805 0.2832 0.7674 0.565 0.7426 0.6021 0.6114 0.3754 0.6785 0.637 0.7719 0.4106 0.3049 0.7966 0.3052 0.3843 0.5362 0.6351 0.4516 0.0495 0.322 0.2708 0.0973 0.0403 0.4071 0.2542 0.3947 0.7504 0.5863 0.1379 0.5547 0.7123 0.6949 0.4328 0.3853 0.8501 0.1776 0.4622 0.45 0.1048 0.655 0.7733 0.6866 0.281 0.6327 0.7013 0.5892 0.5335 0.698 0.4195 0.6268 0.5678 0.68 0.6221 0.7762 0.5612 0.7729 0.5726 0.5919 0.5599 0.5205 0.6145 0.554 0.2428 0.3266 0.0818 0.5653 0.6929 0.5049 0.033 0.5595 0.0115 0.5986 0. 0.8625 0.2209 0.6286 0.7289] 2022-08-23 21:15:30 [INFO] [EVAL] Class Recall: [0.8458 0.8598 0.9574 0.8603 0.8734 0.8628 0.8931 0.9109 0.6787 0.8026 0.5904 0.6264 0.8617 0.3071 0.4719 0.5653 0.6205 0.476 0.7212 0.5485 0.8818 0.5843 0.7951 0.6619 0.4357 0.7958 0.764 0.502 0.4714 0.446 0.3566 0.7637 0.4043 0.4791 0.6402 0.5498 0.5548 0.704 0.4077 0.4714 0.1096 0.1766 0.4428 0.321 0.4386 0.2965 0.4607 0.6115 0.9257 0.6932 0.6943 0.7013 0.2544 0.3348 0.9482 0.5725 0.9443 0.6566 0.6899 0.4058 0.1088 0.3117 0.5103 0.1376 0.7434 0.6934 0.3929 0.4573 0.1149 0.4373 0.7097 0.7699 0.4753 0.6692 0.4868 0.5321 0.6244 0.3124 0.2488 0.4884 0.7945 0.4534 0.378 0.3083 0.4571 0.5344 0.103 0.1444 0.3266 0.6851 0.6061 0.5929 0.2063 0.1149 0.0349 0.0239 0.5292 0.2427 0.4827 0.359 0.0184 0.2634 0.2419 0.8018 0.1303 0.9059 0.2402 0.6172 0.0584 0.4198 0.1317 0.6473 0.1612 0.8472 0.9594 0.0224 0.4473 0.9459 0.1087 0.4958 0.6132 0.042 0.4221 0.1972 0.3017 0.1979 0.6094 0.7044 0.4146 0.4612 0.7822 0.0036 0.4187 0.4401 0.2553 0.2895 0.2126 0.0251 0.2061 0.392 0.2501 0.1282 0.4659 0.0009 0.3661 0. 0.3105 0.0694 0.0924 0.2353] 2022-08-23 21:15:31 [INFO] [EVAL] The model with the best validation mIoU (0.3532) was saved at iter 34000. 
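The TRAIN lines carry enough information to sanity-check each other: ips is samples per second, so the effective global batch size is roughly ips × batch_cost (about 8 throughout this run), and the printed ETA is approximately the remaining iterations times the current batch_cost. A quick check against the iter-35000 entry above; this is a sketch, not PaddleSeg code, and since the printed batch_cost is rounded the ETA agrees only to within a few seconds.

```python
from datetime import timedelta

# Values copied from the "iter: 35000/160000" TRAIN line above.
iter_now, total_iters = 35000, 160000
batch_cost = 0.1590     # seconds per iteration (rounded in the log)
ips = 50.3189           # samples per second

global_batch_size = ips * batch_cost                                  # ~= 8.0 samples/iter
eta = timedelta(seconds=round((total_iters - iter_now) * batch_cost))

print(round(global_batch_size, 2))   # 8.0
print(eta)                           # 5:31:15 vs. the printed 05:31:13 (rounding of batch_cost)
```

The occasional slow iterations with a large reader_cost (e.g. iters 31600, 34150, 35400, 36650) line up with epoch boundaries, most likely the dataloader workers restarting, and those single entries are also what briefly inflate the ETA.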
2022-08-23 21:15:41 [INFO] [TRAIN] epoch: 28, iter: 35050/160000, loss: 0.6487, lr: 0.000946, batch_cost: 0.2078, reader_cost: 0.00196, ips: 38.4968 samples/sec | ETA 07:12:45 2022-08-23 21:15:51 [INFO] [TRAIN] epoch: 28, iter: 35100/160000, loss: 0.6303, lr: 0.000946, batch_cost: 0.1963, reader_cost: 0.00633, ips: 40.7585 samples/sec | ETA 06:48:35 2022-08-23 21:16:00 [INFO] [TRAIN] epoch: 28, iter: 35150/160000, loss: 0.7040, lr: 0.000945, batch_cost: 0.1859, reader_cost: 0.00034, ips: 43.0455 samples/sec | ETA 06:26:43 2022-08-23 21:16:10 [INFO] [TRAIN] epoch: 28, iter: 35200/160000, loss: 0.7462, lr: 0.000945, batch_cost: 0.1964, reader_cost: 0.00055, ips: 40.7385 samples/sec | ETA 06:48:27 2022-08-23 21:16:21 [INFO] [TRAIN] epoch: 28, iter: 35250/160000, loss: 0.6840, lr: 0.000944, batch_cost: 0.2216, reader_cost: 0.00417, ips: 36.1035 samples/sec | ETA 07:40:42 2022-08-23 21:16:31 [INFO] [TRAIN] epoch: 28, iter: 35300/160000, loss: 0.6414, lr: 0.000944, batch_cost: 0.2008, reader_cost: 0.00079, ips: 39.8330 samples/sec | ETA 06:57:24 2022-08-23 21:16:42 [INFO] [TRAIN] epoch: 28, iter: 35350/160000, loss: 0.7300, lr: 0.000944, batch_cost: 0.2245, reader_cost: 0.00569, ips: 35.6369 samples/sec | ETA 07:46:22 2022-08-23 21:16:56 [INFO] [TRAIN] epoch: 29, iter: 35400/160000, loss: 0.7298, lr: 0.000943, batch_cost: 0.2764, reader_cost: 0.06691, ips: 28.9462 samples/sec | ETA 09:33:56 2022-08-23 21:17:07 [INFO] [TRAIN] epoch: 29, iter: 35450/160000, loss: 0.6194, lr: 0.000943, batch_cost: 0.2096, reader_cost: 0.00103, ips: 38.1743 samples/sec | ETA 07:15:01 2022-08-23 21:17:17 [INFO] [TRAIN] epoch: 29, iter: 35500/160000, loss: 0.6462, lr: 0.000943, batch_cost: 0.2127, reader_cost: 0.00052, ips: 37.6185 samples/sec | ETA 07:21:16 2022-08-23 21:17:28 [INFO] [TRAIN] epoch: 29, iter: 35550/160000, loss: 0.6371, lr: 0.000942, batch_cost: 0.2189, reader_cost: 0.00066, ips: 36.5428 samples/sec | ETA 07:34:04 2022-08-23 21:17:39 [INFO] [TRAIN] epoch: 29, iter: 35600/160000, loss: 0.6387, lr: 0.000942, batch_cost: 0.2173, reader_cost: 0.00080, ips: 36.8199 samples/sec | ETA 07:30:28 2022-08-23 21:17:49 [INFO] [TRAIN] epoch: 29, iter: 35650/160000, loss: 0.6434, lr: 0.000941, batch_cost: 0.1947, reader_cost: 0.00092, ips: 41.0918 samples/sec | ETA 06:43:29 2022-08-23 21:17:58 [INFO] [TRAIN] epoch: 29, iter: 35700/160000, loss: 0.6453, lr: 0.000941, batch_cost: 0.1840, reader_cost: 0.00056, ips: 43.4672 samples/sec | ETA 06:21:17 2022-08-23 21:18:07 [INFO] [TRAIN] epoch: 29, iter: 35750/160000, loss: 0.6836, lr: 0.000941, batch_cost: 0.1830, reader_cost: 0.00047, ips: 43.7042 samples/sec | ETA 06:19:03 2022-08-23 21:18:17 [INFO] [TRAIN] epoch: 29, iter: 35800/160000, loss: 0.6789, lr: 0.000940, batch_cost: 0.1931, reader_cost: 0.00068, ips: 41.4396 samples/sec | ETA 06:39:37 2022-08-23 21:18:26 [INFO] [TRAIN] epoch: 29, iter: 35850/160000, loss: 0.6747, lr: 0.000940, batch_cost: 0.1840, reader_cost: 0.00079, ips: 43.4744 samples/sec | ETA 06:20:45 2022-08-23 21:18:34 [INFO] [TRAIN] epoch: 29, iter: 35900/160000, loss: 0.6163, lr: 0.000940, batch_cost: 0.1553, reader_cost: 0.00127, ips: 51.5061 samples/sec | ETA 05:21:15 2022-08-23 21:18:41 [INFO] [TRAIN] epoch: 29, iter: 35950/160000, loss: 0.6749, lr: 0.000939, batch_cost: 0.1531, reader_cost: 0.00048, ips: 52.2528 samples/sec | ETA 05:16:32 2022-08-23 21:18:51 [INFO] [TRAIN] epoch: 29, iter: 36000/160000, loss: 0.6823, lr: 0.000939, batch_cost: 0.1832, reader_cost: 0.00044, ips: 43.6735 samples/sec | ETA 06:18:34 2022-08-23 21:18:51 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 176s - batch_cost: 0.1759 - reader cost: 7.5735e-04 2022-08-23 21:21:47 [INFO] [EVAL] #Images: 2000 mIoU: 0.3439 Acc: 0.7600 Kappa: 0.7415 Dice: 0.4758 2022-08-23 21:21:47 [INFO] [EVAL] Class IoU: [0.6672 0.7814 0.9282 0.7328 0.6726 0.7531 0.7652 0.784 0.5017 0.6345 0.4771 0.5404 0.7007 0.3048 0.2977 0.4212 0.4496 0.4052 0.5869 0.4027 0.7372 0.4278 0.6092 0.4958 0.3025 0.4554 0.4368 0.4136 0.4429 0.3002 0.2586 0.4156 0.2807 0.3185 0.3393 0.3931 0.4335 0.5434 0.2249 0.3315 0.0887 0.1199 0.3521 0.2377 0.3174 0.2572 0.2929 0.4706 0.6688 0.5499 0.516 0.3972 0.152 0.1858 0.6582 0.3558 0.8538 0.3991 0.4407 0.2898 0.0786 0.5135 0.3094 0.1416 0.3703 0.6781 0.225 0.391 0.1074 0.3409 0.4043 0.5396 0.3989 0.2543 0.4602 0.3777 0.5933 0.2556 0.2308 0.0444 0.6414 0.3404 0.3043 0.0316 0.1927 0.541 0.1011 0.0744 0.2325 0.4813 0.4367 0.0215 0.1864 0.0733 0.1082 0.0123 0.2079 0.1015 0.2676 0.3489 0.0705 0.0587 0.0767 0.5916 0.1898 0.5685 0.2175 0.5564 0.0826 0.3784 0.1138 0.3451 0.1721 0.601 0.626 0.0062 0.3922 0.6174 0.1783 0.3695 0.4238 0.0467 0.3905 0.156 0.2623 0.1584 0.4859 0.4745 0. 0.3854 0.447 0.0105 0.2337 0.3111 0.2079 0.1579 0.1505 0.0179 0.1824 0.329 0.0755 0.0001 0.2435 0.2344 0.2929 0.0105 0.3877 0.0459 0.072 0.1908] 2022-08-23 21:21:47 [INFO] [EVAL] Class Precision: [0.7594 0.8595 0.9586 0.8682 0.7538 0.8583 0.856 0.8556 0.6409 0.8294 0.7328 0.741 0.7912 0.437 0.5456 0.5832 0.6869 0.7353 0.7235 0.6451 0.7994 0.6118 0.7231 0.6218 0.5408 0.6 0.5815 0.7017 0.6583 0.4773 0.3819 0.4553 0.5083 0.5206 0.4824 0.493 0.5863 0.7283 0.3134 0.6423 0.2631 0.3582 0.6539 0.5519 0.4388 0.4188 0.5698 0.8016 0.841 0.6859 0.633 0.4598 0.3647 0.5703 0.7082 0.5051 0.8889 0.6062 0.5208 0.4435 0.1472 0.7966 0.4245 0.5687 0.4144 0.7833 0.3482 0.5618 0.322 0.5452 0.5113 0.7077 0.5901 0.3666 0.648 0.5272 0.7659 0.6492 0.7329 0.7755 0.7885 0.74 0.8185 0.0856 0.3857 0.716 0.4248 0.4733 0.5177 0.6306 0.6659 0.0323 0.3978 0.2809 0.1645 0.0418 0.4341 0.695 0.3876 0.7987 0.5063 0.0661 0.4998 0.8621 0.8286 0.6364 0.5648 0.6982 0.1957 0.4533 0.4771 0.4617 0.5045 0.6897 0.6303 0.1461 0.6009 0.648 0.3874 0.5796 0.7115 0.1278 0.6837 0.6361 0.6745 0.6533 0.8513 0.677 0. 0.5683 0.4752 0.1927 0.8179 0.719 0.7438 0.3073 0.322 0.0672 0.3708 0.7067 0.3522 0.0001 0.6065 0.4469 0.6946 0.0247 0.8456 0.227 0.6033 0.7379] 2022-08-23 21:21:47 [INFO] [EVAL] Class Recall: [0.8461 0.8958 0.9669 0.8245 0.862 0.86 0.8783 0.9036 0.6979 0.7298 0.5777 0.6663 0.8598 0.5018 0.3959 0.6025 0.5655 0.4745 0.7565 0.5172 0.9045 0.5871 0.7946 0.7099 0.407 0.654 0.637 0.5018 0.5751 0.4471 0.4447 0.8267 0.3853 0.4508 0.5335 0.66 0.6245 0.6816 0.4431 0.4065 0.118 0.1527 0.4328 0.2945 0.5341 0.4 0.376 0.5327 0.7656 0.735 0.7364 0.7449 0.2068 0.216 0.9032 0.5461 0.9558 0.5389 0.7412 0.4554 0.1443 0.5909 0.5328 0.1586 0.7771 0.8347 0.3886 0.5626 0.1388 0.4763 0.6589 0.6943 0.5518 0.4537 0.6136 0.5712 0.7247 0.2966 0.252 0.045 0.7747 0.3867 0.3263 0.0477 0.2781 0.6888 0.1172 0.0811 0.2968 0.6704 0.5592 0.0605 0.2597 0.0903 0.24 0.0171 0.2853 0.1062 0.4636 0.3826 0.0757 0.3423 0.083 0.6534 0.1976 0.8419 0.2613 0.7326 0.1251 0.6962 0.13 0.5774 0.2071 0.8238 0.9891 0.0064 0.5303 0.9291 0.2484 0.5048 0.5117 0.0685 0.4766 0.1713 0.3003 0.1729 0.5309 0.6134 0. 
0.5451 0.8828 0.011 0.2465 0.3542 0.224 0.2452 0.2203 0.0239 0.2642 0.381 0.0877 0.0003 0.2891 0.3301 0.3363 0.018 0.4172 0.0544 0.0756 0.2047] 2022-08-23 21:21:47 [INFO] [EVAL] The model with the best validation mIoU (0.3532) was saved at iter 34000. 2022-08-23 21:21:56 [INFO] [TRAIN] epoch: 29, iter: 36050/160000, loss: 0.7037, lr: 0.000938, batch_cost: 0.1795, reader_cost: 0.01188, ips: 44.5585 samples/sec | ETA 06:10:53 2022-08-23 21:22:06 [INFO] [TRAIN] epoch: 29, iter: 36100/160000, loss: 0.6831, lr: 0.000938, batch_cost: 0.1988, reader_cost: 0.01286, ips: 40.2346 samples/sec | ETA 06:50:35 2022-08-23 21:22:17 [INFO] [TRAIN] epoch: 29, iter: 36150/160000, loss: 0.6638, lr: 0.000938, batch_cost: 0.2168, reader_cost: 0.00033, ips: 36.9034 samples/sec | ETA 07:27:28 2022-08-23 21:22:28 [INFO] [TRAIN] epoch: 29, iter: 36200/160000, loss: 0.6783, lr: 0.000937, batch_cost: 0.2212, reader_cost: 0.00063, ips: 36.1597 samples/sec | ETA 07:36:29 2022-08-23 21:22:39 [INFO] [TRAIN] epoch: 29, iter: 36250/160000, loss: 0.6889, lr: 0.000937, batch_cost: 0.2207, reader_cost: 0.00048, ips: 36.2432 samples/sec | ETA 07:35:15 2022-08-23 21:22:50 [INFO] [TRAIN] epoch: 29, iter: 36300/160000, loss: 0.6460, lr: 0.000937, batch_cost: 0.2241, reader_cost: 0.00085, ips: 35.6950 samples/sec | ETA 07:42:03 2022-08-23 21:23:01 [INFO] [TRAIN] epoch: 29, iter: 36350/160000, loss: 0.6201, lr: 0.000936, batch_cost: 0.2262, reader_cost: 0.00053, ips: 35.3674 samples/sec | ETA 07:46:09 2022-08-23 21:23:12 [INFO] [TRAIN] epoch: 29, iter: 36400/160000, loss: 0.6764, lr: 0.000936, batch_cost: 0.2025, reader_cost: 0.00046, ips: 39.4971 samples/sec | ETA 06:57:14 2022-08-23 21:23:23 [INFO] [TRAIN] epoch: 29, iter: 36450/160000, loss: 0.6475, lr: 0.000935, batch_cost: 0.2201, reader_cost: 0.00177, ips: 36.3393 samples/sec | ETA 07:33:19 2022-08-23 21:23:33 [INFO] [TRAIN] epoch: 29, iter: 36500/160000, loss: 0.6946, lr: 0.000935, batch_cost: 0.2065, reader_cost: 0.00053, ips: 38.7359 samples/sec | ETA 07:05:06 2022-08-23 21:23:45 [INFO] [TRAIN] epoch: 29, iter: 36550/160000, loss: 0.7213, lr: 0.000935, batch_cost: 0.2360, reader_cost: 0.00047, ips: 33.8985 samples/sec | ETA 08:05:33 2022-08-23 21:23:56 [INFO] [TRAIN] epoch: 29, iter: 36600/160000, loss: 0.6907, lr: 0.000934, batch_cost: 0.2223, reader_cost: 0.00065, ips: 35.9811 samples/sec | ETA 07:37:16 2022-08-23 21:24:08 [INFO] [TRAIN] epoch: 30, iter: 36650/160000, loss: 0.6582, lr: 0.000934, batch_cost: 0.2515, reader_cost: 0.05880, ips: 31.8112 samples/sec | ETA 08:37:00 2022-08-23 21:24:19 [INFO] [TRAIN] epoch: 30, iter: 36700/160000, loss: 0.6285, lr: 0.000934, batch_cost: 0.2111, reader_cost: 0.00195, ips: 37.9026 samples/sec | ETA 07:13:44 2022-08-23 21:24:28 [INFO] [TRAIN] epoch: 30, iter: 36750/160000, loss: 0.6638, lr: 0.000933, batch_cost: 0.1842, reader_cost: 0.00048, ips: 43.4320 samples/sec | ETA 06:18:22 2022-08-23 21:24:39 [INFO] [TRAIN] epoch: 30, iter: 36800/160000, loss: 0.6412, lr: 0.000933, batch_cost: 0.2122, reader_cost: 0.00057, ips: 37.6914 samples/sec | ETA 07:15:49 2022-08-23 21:24:47 [INFO] [TRAIN] epoch: 30, iter: 36850/160000, loss: 0.6375, lr: 0.000932, batch_cost: 0.1651, reader_cost: 0.00033, ips: 48.4419 samples/sec | ETA 05:38:57 2022-08-23 21:24:55 [INFO] [TRAIN] epoch: 30, iter: 36900/160000, loss: 0.6232, lr: 0.000932, batch_cost: 0.1614, reader_cost: 0.00042, ips: 49.5579 samples/sec | ETA 05:31:11 2022-08-23 21:25:04 [INFO] [TRAIN] epoch: 30, iter: 36950/160000, loss: 0.6403, lr: 0.000932, batch_cost: 0.1728, reader_cost: 
0.00049, ips: 46.3092 samples/sec | ETA 05:54:17 2022-08-23 21:25:13 [INFO] [TRAIN] epoch: 30, iter: 37000/160000, loss: 0.6806, lr: 0.000931, batch_cost: 0.1893, reader_cost: 0.00056, ips: 42.2625 samples/sec | ETA 06:28:03 2022-08-23 21:25:13 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 205s - batch_cost: 0.2044 - reader cost: 8.6796e-04 2022-08-23 21:28:38 [INFO] [EVAL] #Images: 2000 mIoU: 0.3412 Acc: 0.7594 Kappa: 0.7415 Dice: 0.4742 2022-08-23 21:28:38 [INFO] [EVAL] Class IoU: [0.6792 0.7804 0.9287 0.7247 0.6741 0.7411 0.7641 0.7761 0.5017 0.6487 0.4939 0.5395 0.6943 0.2553 0.2794 0.4081 0.4943 0.391 0.5825 0.4082 0.753 0.4615 0.6101 0.4768 0.3317 0.4509 0.419 0.4094 0.363 0.2226 0.2197 0.4992 0.2983 0.3192 0.3232 0.3725 0.4293 0.512 0.2737 0.306 0.0702 0.1192 0.3449 0.252 0.2802 0.2421 0.2711 0.4791 0.5986 0.5242 0.5473 0.335 0.1697 0.2675 0.671 0.4976 0.8447 0.3701 0.4814 0.3717 0.087 0.4536 0.295 0.1337 0.3761 0.5824 0.1906 0.4041 0.089 0.3255 0.452 0.435 0.4373 0.2591 0.4755 0.4011 0.4534 0.2489 0.4592 0.322 0.5746 0.3133 0.2729 0.0526 0.3504 0.5364 0.1046 0.1115 0.2986 0.4819 0.375 0.0404 0.1744 0.0949 0.0934 0.0224 0.2555 0.1035 0.2752 0.3007 0.1582 0.0794 0.143 0.098 0.0981 0.3775 0.2573 0.5384 0.1223 0.4319 0.1624 0.1732 0.1551 0.4498 0.7365 0.0044 0.4333 0.6866 0.1861 0.223 0.4874 0.0283 0.2936 0.0615 0.2679 0.2025 0.4778 0.4776 0.02 0.3491 0.5703 0.0011 0.2613 0.2026 0.1773 0.1692 0.14 0.0205 0.1643 0.3635 0.0798 0.0004 0.2737 0.3211 0.3037 0.0058 0.3948 0.0106 0.11 0.1613] 2022-08-23 21:28:38 [INFO] [EVAL] Class Precision: [0.7942 0.8816 0.961 0.8244 0.7519 0.8063 0.8569 0.8318 0.6324 0.7692 0.733 0.7208 0.7681 0.4599 0.5085 0.532 0.6885 0.6394 0.7514 0.6412 0.8395 0.5244 0.7953 0.549 0.4859 0.5474 0.8505 0.7465 0.8235 0.3293 0.463 0.6717 0.5735 0.4502 0.4684 0.5169 0.6404 0.687 0.4114 0.639 0.1765 0.4448 0.6507 0.4949 0.383 0.4652 0.5122 0.7732 0.6375 0.6656 0.7408 0.3751 0.4001 0.7717 0.7205 0.7938 0.865 0.5701 0.5976 0.6197 0.2112 0.5864 0.3855 0.519 0.4209 0.6352 0.286 0.5569 0.1669 0.5123 0.6194 0.7259 0.6471 0.5034 0.7193 0.6089 0.8341 0.5689 0.6358 0.4914 0.7159 0.7252 0.7927 0.1299 0.4601 0.6507 0.5018 0.3889 0.6144 0.6123 0.4957 0.0747 0.3432 0.2799 0.1128 0.0735 0.4094 0.5808 0.4242 0.6989 0.582 0.0921 0.4838 0.5014 0.7005 0.3936 0.5046 0.7187 0.2441 0.455 0.376 0.1912 0.6361 0.7816 0.7514 0.1189 0.646 0.7029 0.3123 0.4699 0.6884 0.1502 0.5458 0.7824 0.6976 0.6029 0.7601 0.6494 0.0395 0.4345 0.6651 1. 
0.6173 0.8973 0.6749 0.3091 0.4163 0.1087 0.3946 0.6515 0.1183 0.0009 0.5426 0.6086 0.515 0.0151 0.761 0.1771 0.3441 0.8752] 2022-08-23 21:28:38 [INFO] [EVAL] Class Recall: [0.8242 0.8718 0.9651 0.857 0.8669 0.9017 0.8758 0.9205 0.7082 0.8056 0.6022 0.6821 0.8785 0.3646 0.3827 0.6368 0.6368 0.5016 0.7215 0.529 0.8797 0.7937 0.7238 0.7837 0.5111 0.7188 0.4523 0.4755 0.3936 0.4071 0.2947 0.6603 0.3834 0.5231 0.5104 0.5714 0.5657 0.6678 0.4499 0.3699 0.1043 0.1401 0.4233 0.3393 0.5108 0.3355 0.3655 0.5574 0.9074 0.7117 0.6769 0.7579 0.2275 0.2905 0.9072 0.5715 0.973 0.5134 0.7123 0.4816 0.1289 0.6669 0.5571 0.1526 0.7792 0.8751 0.3637 0.5957 0.16 0.4718 0.6258 0.5205 0.5742 0.3481 0.5838 0.5402 0.4983 0.3068 0.6231 0.483 0.7444 0.3555 0.2938 0.0813 0.595 0.7532 0.1167 0.1352 0.3675 0.6935 0.6063 0.0807 0.2617 0.1255 0.3527 0.0312 0.4047 0.1118 0.4395 0.3454 0.1785 0.3654 0.1688 0.1085 0.1023 0.9022 0.3444 0.6822 0.1968 0.895 0.2222 0.6489 0.1702 0.5145 0.9738 0.0046 0.5682 0.9674 0.3152 0.298 0.6253 0.0337 0.3885 0.0625 0.3031 0.2336 0.5626 0.6434 0.039 0.6396 0.7999 0.0011 0.3119 0.2074 0.1939 0.2722 0.1742 0.0246 0.2196 0.4513 0.1967 0.0009 0.3558 0.4047 0.4253 0.0093 0.4507 0.0111 0.1391 0.1651] 2022-08-23 21:28:38 [INFO] [EVAL] The model with the best validation mIoU (0.3532) was saved at iter 34000. 2022-08-23 21:28:48 [INFO] [TRAIN] epoch: 30, iter: 37050/160000, loss: 0.6864, lr: 0.000931, batch_cost: 0.1971, reader_cost: 0.00374, ips: 40.5983 samples/sec | ETA 06:43:47 2022-08-23 21:28:59 [INFO] [TRAIN] epoch: 30, iter: 37100/160000, loss: 0.6184, lr: 0.000930, batch_cost: 0.2067, reader_cost: 0.00936, ips: 38.7060 samples/sec | ETA 07:03:21 2022-08-23 21:29:10 [INFO] [TRAIN] epoch: 30, iter: 37150/160000, loss: 0.6159, lr: 0.000930, batch_cost: 0.2303, reader_cost: 0.00325, ips: 34.7324 samples/sec | ETA 07:51:36 2022-08-23 21:29:21 [INFO] [TRAIN] epoch: 30, iter: 37200/160000, loss: 0.7450, lr: 0.000930, batch_cost: 0.2103, reader_cost: 0.00033, ips: 38.0417 samples/sec | ETA 07:10:24 2022-08-23 21:29:30 [INFO] [TRAIN] epoch: 30, iter: 37250/160000, loss: 0.6158, lr: 0.000929, batch_cost: 0.1980, reader_cost: 0.00052, ips: 40.4127 samples/sec | ETA 06:44:59 2022-08-23 21:29:41 [INFO] [TRAIN] epoch: 30, iter: 37300/160000, loss: 0.6600, lr: 0.000929, batch_cost: 0.2064, reader_cost: 0.00048, ips: 38.7662 samples/sec | ETA 07:02:01 2022-08-23 21:29:50 [INFO] [TRAIN] epoch: 30, iter: 37350/160000, loss: 0.6615, lr: 0.000929, batch_cost: 0.1884, reader_cost: 0.00556, ips: 42.4655 samples/sec | ETA 06:25:05 2022-08-23 21:30:00 [INFO] [TRAIN] epoch: 30, iter: 37400/160000, loss: 0.6156, lr: 0.000928, batch_cost: 0.1862, reader_cost: 0.00063, ips: 42.9561 samples/sec | ETA 06:20:32 2022-08-23 21:30:11 [INFO] [TRAIN] epoch: 30, iter: 37450/160000, loss: 0.6658, lr: 0.000928, batch_cost: 0.2256, reader_cost: 0.00081, ips: 35.4675 samples/sec | ETA 07:40:42 2022-08-23 21:30:21 [INFO] [TRAIN] epoch: 30, iter: 37500/160000, loss: 0.6971, lr: 0.000927, batch_cost: 0.2075, reader_cost: 0.00087, ips: 38.5505 samples/sec | ETA 07:03:41 2022-08-23 21:30:32 [INFO] [TRAIN] epoch: 30, iter: 37550/160000, loss: 0.6406, lr: 0.000927, batch_cost: 0.2261, reader_cost: 0.00061, ips: 35.3805 samples/sec | ETA 07:41:27 2022-08-23 21:30:41 [INFO] [TRAIN] epoch: 30, iter: 37600/160000, loss: 0.6509, lr: 0.000927, batch_cost: 0.1723, reader_cost: 0.00043, ips: 46.4252 samples/sec | ETA 05:51:31 2022-08-23 21:30:50 [INFO] [TRAIN] epoch: 30, iter: 37650/160000, loss: 0.6728, lr: 0.000926, batch_cost: 0.1734, 
reader_cost: 0.00105, ips: 46.1295 samples/sec | ETA 05:53:38 2022-08-23 21:30:58 [INFO] [TRAIN] epoch: 30, iter: 37700/160000, loss: 0.6475, lr: 0.000926, batch_cost: 0.1573, reader_cost: 0.00490, ips: 50.8624 samples/sec | ETA 05:20:36 2022-08-23 21:31:06 [INFO] [TRAIN] epoch: 30, iter: 37750/160000, loss: 0.6994, lr: 0.000926, batch_cost: 0.1666, reader_cost: 0.00054, ips: 48.0244 samples/sec | ETA 05:39:24 2022-08-23 21:31:15 [INFO] [TRAIN] epoch: 30, iter: 37800/160000, loss: 0.5927, lr: 0.000925, batch_cost: 0.1796, reader_cost: 0.00059, ips: 44.5318 samples/sec | ETA 06:05:52 2022-08-23 21:31:24 [INFO] [TRAIN] epoch: 30, iter: 37850/160000, loss: 0.6781, lr: 0.000925, batch_cost: 0.1837, reader_cost: 0.00038, ips: 43.5412 samples/sec | ETA 06:14:03 2022-08-23 21:31:37 [INFO] [TRAIN] epoch: 31, iter: 37900/160000, loss: 0.6585, lr: 0.000924, batch_cost: 0.2517, reader_cost: 0.07283, ips: 31.7899 samples/sec | ETA 08:32:06 2022-08-23 21:31:45 [INFO] [TRAIN] epoch: 31, iter: 37950/160000, loss: 0.6412, lr: 0.000924, batch_cost: 0.1744, reader_cost: 0.00081, ips: 45.8677 samples/sec | ETA 05:54:47 2022-08-23 21:31:56 [INFO] [TRAIN] epoch: 31, iter: 38000/160000, loss: 0.6288, lr: 0.000924, batch_cost: 0.2153, reader_cost: 0.00091, ips: 37.1627 samples/sec | ETA 07:17:42 2022-08-23 21:31:56 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 163s - batch_cost: 0.1630 - reader cost: 5.1044e-04 2022-08-23 21:34:40 [INFO] [EVAL] #Images: 2000 mIoU: 0.3443 Acc: 0.7609 Kappa: 0.7425 Dice: 0.4785 2022-08-23 21:34:40 [INFO] [EVAL] Class IoU: [0.6751 0.7712 0.9296 0.728 0.6865 0.7633 0.7644 0.7759 0.4899 0.6307 0.4742 0.5647 0.6971 0.3102 0.319 0.4129 0.5061 0.4179 0.5679 0.3887 0.7411 0.4327 0.6026 0.4975 0.315 0.3162 0.4972 0.3556 0.4364 0.2908 0.252 0.4822 0.2677 0.3269 0.321 0.3668 0.4284 0.5255 0.2708 0.3167 0.0828 0.1524 0.3407 0.2609 0.2384 0.2616 0.3377 0.4679 0.6864 0.5231 0.5348 0.4281 0.159 0.191 0.655 0.3669 0.8702 0.4125 0.3288 0.2225 0.1166 0.237 0.289 0.1919 0.4247 0.6697 0.2318 0.3946 0.0099 0.3306 0.4146 0.5132 0.3996 0.2421 0.4403 0.3565 0.4096 0.2584 0.4646 0.3302 0.6223 0.348 0.2861 0.1324 0.3681 0.5382 0.0904 0.0932 0.2341 0.5156 0.4042 0.0429 0.1834 0.0451 0.0494 0.0169 0.1819 0.125 0.2793 0.3073 0.2885 0.0734 0.1648 0.3992 0.178 0.5748 0.2571 0.4918 0.1038 0.3512 0.1446 0.3617 0.1284 0.6613 0.5758 0.0071 0.3672 0.6542 0.0762 0.0919 0.3726 0.0233 0.2243 0.1716 0.2539 0.1904 0.4869 0.4501 0.1016 0.3969 0.5543 0.0296 0.3051 0.2488 0.1262 0.1651 0.1407 0.0251 0.1871 0.379 0.1809 0.0012 0.2474 0.3904 0.2928 0.015 0.4089 0.045 0.1193 0.136 ] 2022-08-23 21:34:40 [INFO] [EVAL] Class Precision: [0.7678 0.861 0.968 0.83 0.7795 0.8623 0.8906 0.8531 0.7198 0.7278 0.6727 0.6979 0.7666 0.5259 0.4908 0.5321 0.6209 0.7098 0.6623 0.6276 0.8037 0.6116 0.7817 0.63 0.4536 0.558 0.5684 0.7898 0.7872 0.4776 0.4122 0.6022 0.5154 0.4478 0.448 0.516 0.6136 0.7517 0.4064 0.6561 0.2738 0.283 0.6137 0.5093 0.417 0.3979 0.6595 0.7129 0.7849 0.735 0.7159 0.5679 0.3129 0.7772 0.7097 0.7164 0.921 0.5382 0.6852 0.4933 0.1985 0.6323 0.3557 0.5846 0.5431 0.8145 0.3618 0.6033 0.0524 0.5696 0.6821 0.6474 0.4972 0.2866 0.6957 0.4994 0.7868 0.4382 0.7472 0.5156 0.7481 0.6411 0.83 0.2432 0.4662 0.7529 0.3342 0.3689 0.5449 0.7604 0.5554 0.0474 0.3016 0.2498 0.1716 0.0859 0.4458 0.4426 0.3931 0.6188 0.5493 0.1058 0.5141 0.7087 0.649 0.7199 0.6145 0.5923 0.2162 0.442 0.3106 0.4838 0.7148 0.7398 0.6011 0.2212 0.7391 0.6619 0.5196 0.6999 0.6715 0.1585 0.5633 0.49 0.6851 0.5747 0.7249 
0.6254 0.2323 0.5843 0.6628 0.2886 0.6849 0.8015 0.806 0.3533 0.2838 0.0962 0.4154 0.5892 0.2893 0.0027 0.6685 0.4939 0.7553 0.0288 0.8938 0.1919 0.4723 0.8786] 2022-08-23 21:34:40 [INFO] [EVAL] Class Recall: [0.8483 0.8808 0.9591 0.8556 0.8519 0.8692 0.8436 0.8956 0.6054 0.8254 0.6164 0.7474 0.8849 0.4306 0.4767 0.6483 0.7323 0.504 0.7992 0.5052 0.9049 0.5967 0.7245 0.7028 0.5076 0.4218 0.7989 0.3927 0.4948 0.4265 0.3934 0.7076 0.3577 0.5478 0.531 0.5593 0.5867 0.6359 0.4481 0.3798 0.1061 0.2482 0.4338 0.3484 0.3576 0.4329 0.409 0.5765 0.8455 0.6446 0.6789 0.6349 0.2442 0.2021 0.8946 0.4293 0.9403 0.6384 0.3874 0.2884 0.2204 0.2749 0.6066 0.2221 0.6608 0.7902 0.3922 0.5329 0.012 0.4407 0.5139 0.7124 0.6704 0.6093 0.5453 0.5547 0.4608 0.3865 0.5513 0.4786 0.7873 0.4321 0.3039 0.2251 0.6363 0.6537 0.1103 0.1109 0.291 0.6156 0.5974 0.3114 0.3186 0.0522 0.0649 0.0206 0.235 0.1484 0.491 0.379 0.378 0.1932 0.1951 0.4776 0.197 0.7403 0.3066 0.7435 0.1664 0.631 0.2129 0.5889 0.1353 0.8617 0.9318 0.0073 0.4218 0.9826 0.082 0.0957 0.4557 0.0266 0.2715 0.209 0.2875 0.2217 0.5973 0.6162 0.1531 0.553 0.772 0.0319 0.3549 0.2651 0.1302 0.2365 0.2182 0.0328 0.2539 0.5152 0.3257 0.002 0.2821 0.6506 0.3235 0.0302 0.4298 0.0556 0.1377 0.1386] 2022-08-23 21:34:40 [INFO] [EVAL] The model with the best validation mIoU (0.3532) was saved at iter 34000. 2022-08-23 21:34:51 [INFO] [TRAIN] epoch: 31, iter: 38050/160000, loss: 0.6498, lr: 0.000923, batch_cost: 0.2140, reader_cost: 0.00472, ips: 37.3833 samples/sec | ETA 07:14:57 2022-08-23 21:35:01 [INFO] [TRAIN] epoch: 31, iter: 38100/160000, loss: 0.6401, lr: 0.000923, batch_cost: 0.2168, reader_cost: 0.00121, ips: 36.8941 samples/sec | ETA 07:20:32 2022-08-23 21:35:11 [INFO] [TRAIN] epoch: 31, iter: 38150/160000, loss: 0.6569, lr: 0.000923, batch_cost: 0.1934, reader_cost: 0.00043, ips: 41.3752 samples/sec | ETA 06:32:39 2022-08-23 21:35:21 [INFO] [TRAIN] epoch: 31, iter: 38200/160000, loss: 0.6387, lr: 0.000922, batch_cost: 0.2035, reader_cost: 0.00037, ips: 39.3198 samples/sec | ETA 06:53:01 2022-08-23 21:35:32 [INFO] [TRAIN] epoch: 31, iter: 38250/160000, loss: 0.6637, lr: 0.000922, batch_cost: 0.2103, reader_cost: 0.00112, ips: 38.0417 samples/sec | ETA 07:06:43 2022-08-23 21:35:42 [INFO] [TRAIN] epoch: 31, iter: 38300/160000, loss: 0.6000, lr: 0.000921, batch_cost: 0.2076, reader_cost: 0.00052, ips: 38.5272 samples/sec | ETA 07:01:10 2022-08-23 21:35:52 [INFO] [TRAIN] epoch: 31, iter: 38350/160000, loss: 0.6543, lr: 0.000921, batch_cost: 0.2008, reader_cost: 0.00050, ips: 39.8429 samples/sec | ETA 06:47:05 2022-08-23 21:36:04 [INFO] [TRAIN] epoch: 31, iter: 38400/160000, loss: 0.6815, lr: 0.000921, batch_cost: 0.2334, reader_cost: 0.00083, ips: 34.2730 samples/sec | ETA 07:53:03 2022-08-23 21:36:14 [INFO] [TRAIN] epoch: 31, iter: 38450/160000, loss: 0.6614, lr: 0.000920, batch_cost: 0.2079, reader_cost: 0.00073, ips: 38.4754 samples/sec | ETA 07:01:13 2022-08-23 21:36:25 [INFO] [TRAIN] epoch: 31, iter: 38500/160000, loss: 0.6677, lr: 0.000920, batch_cost: 0.2143, reader_cost: 0.00976, ips: 37.3326 samples/sec | ETA 07:13:56 2022-08-23 21:36:37 [INFO] [TRAIN] epoch: 31, iter: 38550/160000, loss: 0.6258, lr: 0.000920, batch_cost: 0.2346, reader_cost: 0.00047, ips: 34.0934 samples/sec | ETA 07:54:58 2022-08-23 21:36:47 [INFO] [TRAIN] epoch: 31, iter: 38600/160000, loss: 0.6802, lr: 0.000919, batch_cost: 0.2046, reader_cost: 0.00069, ips: 39.0948 samples/sec | ETA 06:54:02 2022-08-23 21:36:55 [INFO] [TRAIN] epoch: 31, iter: 38650/160000, loss: 0.6918, lr: 
0.000919, batch_cost: 0.1630, reader_cost: 0.00055, ips: 49.0804 samples/sec | ETA 05:29:39 2022-08-23 21:37:03 [INFO] [TRAIN] epoch: 31, iter: 38700/160000, loss: 0.6363, lr: 0.000918, batch_cost: 0.1582, reader_cost: 0.00054, ips: 50.5603 samples/sec | ETA 05:19:52 2022-08-23 21:37:12 [INFO] [TRAIN] epoch: 31, iter: 38750/160000, loss: 0.6221, lr: 0.000918, batch_cost: 0.1815, reader_cost: 0.00062, ips: 44.0870 samples/sec | ETA 06:06:41 2022-08-23 21:37:21 [INFO] [TRAIN] epoch: 31, iter: 38800/160000, loss: 0.6827, lr: 0.000918, batch_cost: 0.1687, reader_cost: 0.00052, ips: 47.4098 samples/sec | ETA 05:40:51 2022-08-23 21:37:29 [INFO] [TRAIN] epoch: 31, iter: 38850/160000, loss: 0.6734, lr: 0.000917, batch_cost: 0.1633, reader_cost: 0.00032, ips: 48.9946 samples/sec | ETA 05:29:41 2022-08-23 21:37:37 [INFO] [TRAIN] epoch: 31, iter: 38900/160000, loss: 0.6556, lr: 0.000917, batch_cost: 0.1695, reader_cost: 0.00033, ips: 47.1921 samples/sec | ETA 05:42:08 2022-08-23 21:37:45 [INFO] [TRAIN] epoch: 31, iter: 38950/160000, loss: 0.6075, lr: 0.000916, batch_cost: 0.1660, reader_cost: 0.00294, ips: 48.1859 samples/sec | ETA 05:34:57 2022-08-23 21:37:56 [INFO] [TRAIN] epoch: 31, iter: 39000/160000, loss: 0.6593, lr: 0.000916, batch_cost: 0.2068, reader_cost: 0.00361, ips: 38.6929 samples/sec | ETA 06:56:57 2022-08-23 21:37:56 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 192s - batch_cost: 0.1916 - reader cost: 9.0656e-04 2022-08-23 21:41:08 [INFO] [EVAL] #Images: 2000 mIoU: 0.3388 Acc: 0.7617 Kappa: 0.7437 Dice: 0.4705 2022-08-23 21:41:08 [INFO] [EVAL] Class IoU: [0.6738 0.7719 0.9292 0.7243 0.688 0.7532 0.7729 0.7684 0.5283 0.6596 0.4695 0.5433 0.6861 0.2722 0.3016 0.4203 0.492 0.4416 0.5872 0.4138 0.7288 0.4557 0.6306 0.4704 0.3202 0.3901 0.4518 0.417 0.3969 0.1964 0.2551 0.4947 0.2657 0.3184 0.3878 0.398 0.4416 0.5186 0.2764 0.3703 0.1059 0.1415 0.3409 0.2696 0.2674 0.231 0.2952 0.4802 0.6647 0.5011 0.5447 0.453 0.1529 0.2408 0.6585 0.4451 0.8453 0.1528 0.4587 0.2713 0.0955 0.2231 0.3149 0.1469 0.4324 0.6809 0.2008 0.4066 0.0903 0.3601 0.457 0.5274 0.3777 0.2942 0.4547 0.3405 0.5124 0.245 0.4685 0.3539 0.5696 0.3465 0.2779 0.0898 0.3424 0.521 0.0978 0.0572 0.2713 0.5111 0.3815 0. 0.0995 0.0762 0.0453 0.0092 0.2035 0.0856 0.2822 0.342 0.1165 0.0772 0.1164 0.416 0.1657 0.4984 0.2048 0.561 0.0611 0.3369 0.0921 0.2286 0.1592 0.6659 0.2935 0.0088 0.3382 0.5314 0.1226 0.3846 0.4615 0.0222 0.3001 0.0828 0.2548 0.1576 0.4683 0.4602 0.0412 0.3328 0.5156 0.0109 0.2707 0.3324 0.1653 0.1478 0.1637 0.0219 0.1662 0.3624 0.044 0.008 0.2935 0.1655 0.283 0. 0.3891 0.0187 0.0859 0.2202] 2022-08-23 21:41:08 [INFO] [EVAL] Class Precision: [0.7809 0.8648 0.9659 0.8203 0.7765 0.864 0.8705 0.8291 0.6637 0.735 0.7428 0.6927 0.749 0.5465 0.5384 0.5635 0.6854 0.6007 0.7455 0.6246 0.8037 0.5825 0.7944 0.6065 0.4965 0.4734 0.626 0.7268 0.7095 0.3708 0.4209 0.5969 0.5468 0.4004 0.4894 0.5776 0.642 0.7343 0.3901 0.4479 0.2517 0.3772 0.5388 0.4678 0.3422 0.427 0.6403 0.6982 0.7905 0.5945 0.6969 0.5927 0.3555 0.6975 0.7152 0.5808 0.876 0.7908 0.5174 0.5444 0.2497 0.6212 0.4549 0.6041 0.5237 0.8384 0.4134 0.5449 0.3655 0.5737 0.6061 0.6171 0.4657 0.3622 0.6753 0.4997 0.6502 0.5948 0.6076 0.5958 0.6744 0.6015 0.8069 0.216 0.4204 0.7029 0.3681 0.4592 0.5598 0.6806 0.5043 0. 
0.5214 0.3605 0.1872 0.1313 0.4174 0.5771 0.4214 0.7248 0.5534 0.1075 0.5704 0.4428 0.8933 0.5257 0.3381 0.7966 0.1217 0.4776 0.3899 0.2611 0.6855 0.7341 0.5734 0.483 0.7013 0.5337 0.2598 0.7649 0.6524 0.2128 0.5881 0.5918 0.7915 0.6236 0.7685 0.6331 0.337 0.555 0.6616 0.2286 0.6185 0.6006 0.7487 0.3861 0.3102 0.0845 0.485 0.7152 0.439 0.0119 0.6537 0.3781 0.6185 0. 0.8636 0.2216 0.3705 0.7475] 2022-08-23 21:41:08 [INFO] [EVAL] Class Recall: [0.8309 0.8777 0.9607 0.8609 0.8579 0.8545 0.8733 0.913 0.7214 0.8654 0.5607 0.7159 0.8909 0.3516 0.4068 0.6233 0.6356 0.625 0.7344 0.5507 0.8866 0.6767 0.7536 0.6771 0.4741 0.6892 0.6189 0.4945 0.4739 0.2946 0.3931 0.7429 0.3407 0.6085 0.6514 0.5614 0.586 0.6383 0.4867 0.6814 0.1546 0.1846 0.4813 0.3889 0.5504 0.3348 0.3539 0.606 0.8068 0.7615 0.7138 0.6577 0.2115 0.2689 0.8926 0.6558 0.9602 0.1592 0.8018 0.351 0.134 0.2582 0.5058 0.1626 0.7127 0.7838 0.2808 0.6157 0.1071 0.4916 0.6501 0.7839 0.6664 0.6105 0.5819 0.5167 0.7074 0.294 0.6718 0.4658 0.7857 0.4498 0.2977 0.1332 0.6485 0.6681 0.1176 0.0613 0.3449 0.6724 0.6103 0. 0.1095 0.0882 0.0564 0.0098 0.2843 0.0913 0.4608 0.3931 0.1286 0.215 0.1276 0.8732 0.169 0.9056 0.3418 0.6547 0.1093 0.5335 0.1076 0.6477 0.1717 0.8775 0.3755 0.0088 0.3951 0.9917 0.1883 0.4361 0.612 0.0242 0.3799 0.0878 0.2731 0.1742 0.5452 0.6275 0.0449 0.4539 0.7002 0.0113 0.325 0.4267 0.175 0.1932 0.2574 0.0287 0.2018 0.4235 0.0466 0.0241 0.3475 0.2273 0.3428 0. 0.4146 0.02 0.1006 0.2379] 2022-08-23 21:41:08 [INFO] [EVAL] The model with the best validation mIoU (0.3532) was saved at iter 34000. 2022-08-23 21:41:18 [INFO] [TRAIN] epoch: 31, iter: 39050/160000, loss: 0.6896, lr: 0.000916, batch_cost: 0.2077, reader_cost: 0.00386, ips: 38.5220 samples/sec | ETA 06:58:38 2022-08-23 21:41:29 [INFO] [TRAIN] epoch: 31, iter: 39100/160000, loss: 0.6695, lr: 0.000915, batch_cost: 0.2165, reader_cost: 0.00063, ips: 36.9499 samples/sec | ETA 07:16:15 2022-08-23 21:41:40 [INFO] [TRAIN] epoch: 31, iter: 39150/160000, loss: 0.6976, lr: 0.000915, batch_cost: 0.2157, reader_cost: 0.00046, ips: 37.0938 samples/sec | ETA 07:14:23 2022-08-23 21:41:56 [INFO] [TRAIN] epoch: 32, iter: 39200/160000, loss: 0.6503, lr: 0.000915, batch_cost: 0.3149, reader_cost: 0.11621, ips: 25.4038 samples/sec | ETA 10:34:01 2022-08-23 21:42:06 [INFO] [TRAIN] epoch: 32, iter: 39250/160000, loss: 0.6303, lr: 0.000914, batch_cost: 0.2098, reader_cost: 0.00334, ips: 38.1347 samples/sec | ETA 07:02:11 2022-08-23 21:42:17 [INFO] [TRAIN] epoch: 32, iter: 39300/160000, loss: 0.6984, lr: 0.000914, batch_cost: 0.2087, reader_cost: 0.00062, ips: 38.3303 samples/sec | ETA 06:59:51 2022-08-23 21:42:27 [INFO] [TRAIN] epoch: 32, iter: 39350/160000, loss: 0.6525, lr: 0.000913, batch_cost: 0.2101, reader_cost: 0.00107, ips: 38.0754 samples/sec | ETA 07:02:29 2022-08-23 21:42:37 [INFO] [TRAIN] epoch: 32, iter: 39400/160000, loss: 0.6546, lr: 0.000913, batch_cost: 0.2023, reader_cost: 0.00058, ips: 39.5514 samples/sec | ETA 06:46:33 2022-08-23 21:42:47 [INFO] [TRAIN] epoch: 32, iter: 39450/160000, loss: 0.6209, lr: 0.000913, batch_cost: 0.1874, reader_cost: 0.00872, ips: 42.6983 samples/sec | ETA 06:16:26 2022-08-23 21:42:56 [INFO] [TRAIN] epoch: 32, iter: 39500/160000, loss: 0.6512, lr: 0.000912, batch_cost: 0.1902, reader_cost: 0.00240, ips: 42.0696 samples/sec | ETA 06:21:54 2022-08-23 21:43:06 [INFO] [TRAIN] epoch: 32, iter: 39550/160000, loss: 0.6098, lr: 0.000912, batch_cost: 0.1914, reader_cost: 0.00081, ips: 41.8046 samples/sec | ETA 06:24:10 2022-08-23 21:43:16 [INFO] [TRAIN] 
epoch: 32, iter: 39600/160000, loss: 0.6561, lr: 0.000912, batch_cost: 0.1974, reader_cost: 0.00073, ips: 40.5272 samples/sec | ETA 06:36:06 2022-08-23 21:43:24 [INFO] [TRAIN] epoch: 32, iter: 39650/160000, loss: 0.6532, lr: 0.000911, batch_cost: 0.1608, reader_cost: 0.00694, ips: 49.7499 samples/sec | ETA 05:22:32 2022-08-23 21:43:34 [INFO] [TRAIN] epoch: 32, iter: 39700/160000, loss: 0.6547, lr: 0.000911, batch_cost: 0.2129, reader_cost: 0.00053, ips: 37.5820 samples/sec | ETA 07:06:47 2022-08-23 21:43:45 [INFO] [TRAIN] epoch: 32, iter: 39750/160000, loss: 0.5923, lr: 0.000910, batch_cost: 0.2028, reader_cost: 0.00054, ips: 39.4556 samples/sec | ETA 06:46:21 2022-08-23 21:43:54 [INFO] [TRAIN] epoch: 32, iter: 39800/160000, loss: 0.6464, lr: 0.000910, batch_cost: 0.1904, reader_cost: 0.00036, ips: 42.0258 samples/sec | ETA 06:21:21 2022-08-23 21:44:03 [INFO] [TRAIN] epoch: 32, iter: 39850/160000, loss: 0.6398, lr: 0.000910, batch_cost: 0.1764, reader_cost: 0.00054, ips: 45.3633 samples/sec | ETA 05:53:08 2022-08-23 21:44:13 [INFO] [TRAIN] epoch: 32, iter: 39900/160000, loss: 0.7126, lr: 0.000909, batch_cost: 0.2058, reader_cost: 0.00068, ips: 38.8722 samples/sec | ETA 06:51:56 2022-08-23 21:44:23 [INFO] [TRAIN] epoch: 32, iter: 39950/160000, loss: 0.6294, lr: 0.000909, batch_cost: 0.2060, reader_cost: 0.00045, ips: 38.8305 samples/sec | ETA 06:52:13 2022-08-23 21:44:33 [INFO] [TRAIN] epoch: 32, iter: 40000/160000, loss: 0.6795, lr: 0.000909, batch_cost: 0.1880, reader_cost: 0.00031, ips: 42.5461 samples/sec | ETA 06:16:03 2022-08-23 21:44:33 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 200s - batch_cost: 0.2003 - reader cost: 5.9645e-04 2022-08-23 21:47:53 [INFO] [EVAL] #Images: 2000 mIoU: 0.3511 Acc: 0.7638 Kappa: 0.7457 Dice: 0.4847 2022-08-23 21:47:53 [INFO] [EVAL] Class IoU: [0.681 0.7792 0.9309 0.725 0.6778 0.7581 0.7772 0.78 0.5213 0.6563 0.469 0.5574 0.7004 0.2752 0.2874 0.4285 0.5321 0.3916 0.6002 0.4154 0.7437 0.3758 0.6173 0.488 0.303 0.3957 0.4528 0.3992 0.4072 0.2849 0.209 0.4866 0.2803 0.3265 0.3919 0.395 0.444 0.5516 0.2389 0.381 0.1243 0.1419 0.347 0.2704 0.2416 0.1866 0.2852 0.4929 0.6432 0.5448 0.494 0.3806 0.1739 0.1441 0.6693 0.5005 0.8832 0.4116 0.3973 0.2966 0.1667 0.4021 0.3 0.1925 0.4323 0.7005 0.203 0.3846 0.0969 0.3434 0.4774 0.4758 0.3107 0.2452 0.4386 0.3464 0.5266 0.2019 0.2364 0.2104 0.6069 0.3491 0.3199 0.016 0.3962 0.5318 0.1049 0.1066 0.2454 0.4847 0.3693 0.0282 0.1764 0.0835 0.0698 0.0152 0.2113 0.1455 0.2897 0.3175 0.0558 0.0762 0.1831 0.563 0.1911 0.6092 0.2577 0.5623 0.0981 0.2628 0.1419 0.4502 0.1487 0.6735 0.603 0.0031 0.399 0.7424 0.1338 0.414 0.5012 0.035 0.4031 0.1637 0.2745 0.1869 0.5012 0.4825 0.3023 0.3167 0.5605 0.0084 0.2679 0.2399 0.1451 0.1726 0.1178 0.0193 0.1773 0.3209 0.0668 0.0412 0.2997 0.1892 0.3059 0. 
0.3956 0.0322 0.091 0.1679] 2022-08-23 21:47:53 [INFO] [EVAL] Class Precision: [0.776 0.8703 0.9665 0.8119 0.7496 0.8479 0.8899 0.8418 0.7088 0.738 0.6745 0.7284 0.7769 0.5111 0.5482 0.5923 0.7162 0.7236 0.7516 0.625 0.8079 0.7011 0.7689 0.5818 0.4845 0.5733 0.5413 0.7771 0.6995 0.471 0.489 0.6027 0.5564 0.4131 0.4635 0.5141 0.6579 0.8456 0.337 0.5909 0.308 0.2683 0.5579 0.5084 0.3166 0.4962 0.4524 0.7135 0.7378 0.712 0.7771 0.5614 0.3428 0.6918 0.7241 0.7932 0.9368 0.6632 0.7119 0.4175 0.1953 0.562 0.4282 0.7896 0.5253 0.8374 0.2613 0.498 0.3237 0.5473 0.6976 0.7029 0.5815 0.4054 0.7048 0.5058 0.7147 0.5023 0.7514 0.2732 0.7425 0.6607 0.7705 0.0521 0.4727 0.6635 0.4317 0.3899 0.3828 0.6385 0.4819 0.0342 0.3969 0.2728 0.1018 0.0777 0.7509 0.4717 0.414 0.4209 0.7327 0.107 0.566 0.6352 0.8786 0.7901 0.5085 0.7989 0.1859 0.4854 0.2695 0.6618 0.5889 0.7612 0.6067 0.0545 0.658 0.8127 0.279 0.5474 0.6326 0.1919 0.7164 0.5166 0.6569 0.5947 0.8167 0.6855 0.4003 0.4535 0.7539 0.2626 0.5533 0.7638 0.74 0.3836 0.3916 0.1403 0.4397 0.7039 0.4195 0.0597 0.6714 0.3272 0.6122 0. 0.9095 0.2164 0.597 0.8702] 2022-08-23 21:47:53 [INFO] [EVAL] Class Recall: [0.8476 0.8816 0.962 0.8713 0.8761 0.8774 0.8599 0.9139 0.6633 0.8557 0.6063 0.7037 0.8767 0.3735 0.3765 0.6078 0.6742 0.4605 0.7488 0.5534 0.9035 0.4475 0.758 0.7516 0.4472 0.5609 0.7348 0.4509 0.4935 0.4189 0.2673 0.7165 0.361 0.6091 0.7171 0.6304 0.5772 0.6134 0.4506 0.5175 0.1725 0.2314 0.4786 0.3662 0.5051 0.2302 0.4355 0.6145 0.8339 0.6988 0.5755 0.5416 0.2608 0.154 0.8985 0.5757 0.9392 0.5204 0.4735 0.506 0.5327 0.5856 0.5004 0.2029 0.7095 0.8107 0.4763 0.6282 0.1214 0.4797 0.6019 0.5955 0.4002 0.383 0.5374 0.5235 0.6668 0.2524 0.2565 0.4778 0.7687 0.4253 0.3536 0.0226 0.7099 0.7282 0.1217 0.1279 0.406 0.668 0.6124 0.1389 0.241 0.1073 0.1815 0.0185 0.2272 0.1738 0.4909 0.5639 0.057 0.209 0.213 0.8321 0.1963 0.7269 0.3431 0.655 0.1721 0.3644 0.2306 0.5848 0.1659 0.854 0.9902 0.0033 0.5034 0.8956 0.2046 0.6294 0.707 0.041 0.4796 0.1933 0.3205 0.2142 0.5647 0.6196 0.5525 0.5122 0.6861 0.0086 0.3418 0.2591 0.1528 0.2389 0.1442 0.0219 0.2291 0.3709 0.0736 0.1172 0.3513 0.3096 0.3794 0. 0.4118 0.0365 0.097 0.1722] 2022-08-23 21:47:54 [INFO] [EVAL] The model with the best validation mIoU (0.3532) was saved at iter 34000. 
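The [EVAL] blocks above print three per-class vectors (Class IoU, Class Precision, Class Recall) plus scalar summaries (mIoU, Acc, Kappa, Dice), and the [TRAIN] lines print batch_cost, reader_cost, ips, and an ETA. Below is a minimal sketch of how these quantities relate, using a couple of values copied from the iter-40000 evaluation and the iter-39050 training line; the per-class IoU/precision/recall identity is exact, while the Dice relation, treating mIoU as the arithmetic mean of the per-class IoU vector, and the batch size of 8 are assumptions inferred from the printed numbers rather than stated anywhere in the log.

```python
# Sketch (not part of the log): cross-checking quantities printed in the
# [EVAL] and [TRAIN] lines. Aggregation rules marked "assumed" are inferences.
import datetime

# Per-class identity: IoU = P*R / (P + R - P*R), since
# P = TP/(TP+FP), R = TP/(TP+FN), IoU = TP/(TP+FP+FN).
def iou_from_pr(p: float, r: float) -> float:
    return p * r / (p + r - p * r)

# Classes 0 and 1 from the iter-40000 [EVAL] block:
print(iou_from_pr(0.776, 0.8476))   # ~0.681   (logged Class IoU: 0.681)
print(iou_from_pr(0.8703, 0.8816))  # ~0.7792  (logged Class IoU: 0.7792)

# Assumed per-class Dice relation: Dice = 2*IoU / (1 + IoU); the reported Dice
# would then be the mean of this over all classes.
def dice_from_iou(iou: float) -> float:
    return 2 * iou / (1 + iou)

# Throughput / ETA arithmetic from the [TRAIN] line at iter 39050/160000
# (batch_cost 0.2077 s, ips 38.5220 samples/sec, ETA 06:58:38):
batch_cost, ips = 0.2077, 38.5220
batch_size = round(ips * batch_cost)   # -> 8 (inferred; not printed in the log)
eta = datetime.timedelta(seconds=(160000 - 39050) * batch_cost)
print(batch_size, eta)                 # ~6:58:41, matching the logged ETA up to rounding
```

Under the same assumption, the recurring line "The model with the best validation mIoU (0.3532) was saved at iter 34000" tracks the running maximum of that per-class-mean IoU across the periodic evaluations, which is why it keeps pointing at iter 34000 until a later evaluation beats it.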
2022-08-23 21:48:04 [INFO] [TRAIN] epoch: 32, iter: 40050/160000, loss: 0.6286, lr: 0.000908, batch_cost: 0.2126, reader_cost: 0.00482, ips: 37.6246 samples/sec | ETA 07:05:04 2022-08-23 21:48:15 [INFO] [TRAIN] epoch: 32, iter: 40100/160000, loss: 0.6298, lr: 0.000908, batch_cost: 0.2042, reader_cost: 0.00145, ips: 39.1816 samples/sec | ETA 06:48:00 2022-08-23 21:48:25 [INFO] [TRAIN] epoch: 32, iter: 40150/160000, loss: 0.6729, lr: 0.000907, batch_cost: 0.2075, reader_cost: 0.00058, ips: 38.5542 samples/sec | ETA 06:54:28 2022-08-23 21:48:35 [INFO] [TRAIN] epoch: 32, iter: 40200/160000, loss: 0.6299, lr: 0.000907, batch_cost: 0.2087, reader_cost: 0.00046, ips: 38.3317 samples/sec | ETA 06:56:42 2022-08-23 21:48:45 [INFO] [TRAIN] epoch: 32, iter: 40250/160000, loss: 0.6213, lr: 0.000907, batch_cost: 0.2017, reader_cost: 0.00090, ips: 39.6575 samples/sec | ETA 06:42:36 2022-08-23 21:48:56 [INFO] [TRAIN] epoch: 32, iter: 40300/160000, loss: 0.6614, lr: 0.000906, batch_cost: 0.2024, reader_cost: 0.00437, ips: 39.5323 samples/sec | ETA 06:43:43 2022-08-23 21:49:05 [INFO] [TRAIN] epoch: 32, iter: 40350/160000, loss: 0.6472, lr: 0.000906, batch_cost: 0.1982, reader_cost: 0.00597, ips: 40.3715 samples/sec | ETA 06:35:09 2022-08-23 21:49:14 [INFO] [TRAIN] epoch: 32, iter: 40400/160000, loss: 0.6775, lr: 0.000905, batch_cost: 0.1634, reader_cost: 0.00063, ips: 48.9715 samples/sec | ETA 05:25:37 2022-08-23 21:49:26 [INFO] [TRAIN] epoch: 33, iter: 40450/160000, loss: 0.5832, lr: 0.000905, batch_cost: 0.2434, reader_cost: 0.06917, ips: 32.8642 samples/sec | ETA 08:05:01 2022-08-23 21:49:35 [INFO] [TRAIN] epoch: 33, iter: 40500/160000, loss: 0.6116, lr: 0.000905, batch_cost: 0.1800, reader_cost: 0.00083, ips: 44.4522 samples/sec | ETA 05:58:26 2022-08-23 21:49:44 [INFO] [TRAIN] epoch: 33, iter: 40550/160000, loss: 0.6519, lr: 0.000904, batch_cost: 0.1929, reader_cost: 0.00065, ips: 41.4804 samples/sec | ETA 06:23:57 2022-08-23 21:49:53 [INFO] [TRAIN] epoch: 33, iter: 40600/160000, loss: 0.6557, lr: 0.000904, batch_cost: 0.1697, reader_cost: 0.00057, ips: 47.1458 samples/sec | ETA 05:37:40 2022-08-23 21:50:02 [INFO] [TRAIN] epoch: 33, iter: 40650/160000, loss: 0.6520, lr: 0.000904, batch_cost: 0.1885, reader_cost: 0.00066, ips: 42.4464 samples/sec | ETA 06:14:54 2022-08-23 21:50:12 [INFO] [TRAIN] epoch: 33, iter: 40700/160000, loss: 0.6695, lr: 0.000903, batch_cost: 0.1898, reader_cost: 0.00076, ips: 42.1489 samples/sec | ETA 06:17:23 2022-08-23 21:50:23 [INFO] [TRAIN] epoch: 33, iter: 40750/160000, loss: 0.5880, lr: 0.000903, batch_cost: 0.2163, reader_cost: 0.00065, ips: 36.9807 samples/sec | ETA 07:09:57 2022-08-23 21:50:33 [INFO] [TRAIN] epoch: 33, iter: 40800/160000, loss: 0.6512, lr: 0.000902, batch_cost: 0.2034, reader_cost: 0.00106, ips: 39.3367 samples/sec | ETA 06:44:02 2022-08-23 21:50:42 [INFO] [TRAIN] epoch: 33, iter: 40850/160000, loss: 0.6830, lr: 0.000902, batch_cost: 0.1793, reader_cost: 0.00095, ips: 44.6202 samples/sec | ETA 05:56:02 2022-08-23 21:50:50 [INFO] [TRAIN] epoch: 33, iter: 40900/160000, loss: 0.6507, lr: 0.000902, batch_cost: 0.1551, reader_cost: 0.00122, ips: 51.5753 samples/sec | ETA 05:07:53 2022-08-23 21:50:59 [INFO] [TRAIN] epoch: 33, iter: 40950/160000, loss: 0.6142, lr: 0.000901, batch_cost: 0.1951, reader_cost: 0.00061, ips: 40.9972 samples/sec | ETA 06:27:10 2022-08-23 21:51:08 [INFO] [TRAIN] epoch: 33, iter: 41000/160000, loss: 0.6822, lr: 0.000901, batch_cost: 0.1689, reader_cost: 0.00045, ips: 47.3647 samples/sec | ETA 05:34:59 2022-08-23 21:51:08 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 197s - batch_cost: 0.1965 - reader cost: 5.7016e-04 2022-08-23 21:54:25 [INFO] [EVAL] #Images: 2000 mIoU: 0.3444 Acc: 0.7633 Kappa: 0.7451 Dice: 0.4765 2022-08-23 21:54:25 [INFO] [EVAL] Class IoU: [0.6818 0.7743 0.9272 0.7228 0.6841 0.7617 0.77 0.7926 0.5239 0.6686 0.4951 0.5423 0.7064 0.2887 0.2912 0.4286 0.4661 0.4005 0.5842 0.4108 0.7351 0.3764 0.6019 0.5 0.2974 0.3702 0.4467 0.4171 0.3722 0.351 0.2392 0.517 0.2554 0.3238 0.2976 0.3659 0.4376 0.5089 0.2902 0.3759 0.1125 0.1405 0.3545 0.268 0.2716 0.2016 0.3335 0.4637 0.667 0.5544 0.5013 0.3798 0.1802 0.2816 0.6179 0.3424 0.8812 0.3192 0.2957 0.3149 0.1258 0.432 0.2618 0.1348 0.4524 0.696 0.1646 0.3994 0.0101 0.3663 0.4649 0.4962 0.394 0.2717 0.4603 0.3715 0.5704 0.2311 0.4993 0.2376 0.6153 0.3794 0.3746 0.0193 0.3623 0.5201 0.0796 0.0732 0.2366 0.4816 0.3849 0.029 0.1538 0.1104 0.0152 0.0172 0.1692 0.092 0.2644 0.4029 0.0998 0.0679 0.1239 0.5644 0.1464 0.4334 0.1927 0.5701 0.0718 0.2393 0.1316 0.2529 0.1321 0.676 0.6881 0.0412 0.2358 0.6938 0.0756 0.4181 0.4947 0.0546 0.2833 0.1158 0.2531 0.2166 0.4028 0.4258 0.2797 0.3926 0.4797 0.0239 0.2675 0.2912 0.1811 0.1395 0.1134 0.0216 0.1621 0.3764 0.2794 0.0191 0.1816 0.0597 0.3177 0. 0.3748 0.0262 0.0928 0.178 ] 2022-08-23 21:54:25 [INFO] [EVAL] Class Precision: [0.7792 0.8582 0.964 0.8041 0.7875 0.8724 0.865 0.8624 0.6421 0.7677 0.6699 0.6979 0.7892 0.5061 0.5346 0.5407 0.5669 0.6972 0.8169 0.6466 0.7967 0.6718 0.7221 0.6236 0.4669 0.5211 0.4894 0.609 0.7599 0.5092 0.4703 0.6762 0.5844 0.5549 0.4986 0.4727 0.645 0.7874 0.4928 0.5364 0.2976 0.3517 0.603 0.478 0.4209 0.5365 0.6195 0.6107 0.7417 0.7267 0.7149 0.4849 0.5651 0.7693 0.7106 0.6507 0.9367 0.7292 0.8285 0.6263 0.1692 0.5063 0.3743 0.7701 0.5877 0.8458 0.2385 0.5313 0.0944 0.6682 0.708 0.6839 0.5443 0.5468 0.6674 0.5105 0.687 0.487 0.7488 0.3211 0.7258 0.6971 0.7644 0.0665 0.4269 0.7221 0.8064 0.5679 0.3452 0.7091 0.5519 0.0398 0.3585 0.2709 0.026 0.1224 0.3478 0.3881 0.419 0.6122 0.6401 0.0881 0.53 0.5948 0.9538 0.4664 0.3139 0.7534 0.1702 0.4075 0.3661 0.2948 0.705 0.8395 0.6973 0.2008 0.7585 0.724 0.2903 0.5566 0.6879 0.5117 0.6729 0.5972 0.7363 0.6122 0.7763 0.5423 0.4205 0.6131 0.7242 0.2339 0.5062 0.7604 0.7527 0.4275 0.3582 0.1781 0.3792 0.6356 0.5638 0.0255 0.551 0.3127 0.6721 0. 0.9119 0.2556 0.4195 0.8387] 2022-08-23 21:54:25 [INFO] [EVAL] Class Recall: [0.845 0.8879 0.9604 0.8773 0.839 0.8572 0.8752 0.9074 0.74 0.8382 0.6549 0.7087 0.8707 0.4019 0.3901 0.6738 0.7239 0.4849 0.6722 0.5297 0.9048 0.4612 0.7835 0.7161 0.4502 0.5611 0.8366 0.5696 0.4218 0.5305 0.3274 0.6871 0.3121 0.4374 0.4247 0.6184 0.5765 0.59 0.4138 0.5569 0.1532 0.1895 0.4624 0.3789 0.4335 0.244 0.4195 0.6582 0.8688 0.7005 0.6266 0.6367 0.2092 0.3076 0.8257 0.4194 0.937 0.3622 0.315 0.3878 0.3288 0.7464 0.4657 0.1405 0.6628 0.7972 0.3471 0.6167 0.0112 0.4478 0.5752 0.6438 0.588 0.3506 0.5974 0.5771 0.7707 0.3054 0.5998 0.4772 0.8015 0.4543 0.4235 0.0265 0.7052 0.6503 0.0812 0.0775 0.4293 0.6001 0.5599 0.0959 0.2122 0.1571 0.0353 0.0196 0.2478 0.1077 0.4176 0.5409 0.1057 0.2291 0.1392 0.917 0.1475 0.8597 0.333 0.7009 0.1106 0.367 0.1704 0.6403 0.1399 0.7763 0.9813 0.0493 0.255 0.9432 0.0927 0.6268 0.638 0.0576 0.3285 0.1256 0.2784 0.2511 0.4557 0.6647 0.4551 0.5219 0.587 0.026 0.362 0.3206 0.1925 0.1715 0.1423 0.0239 0.2206 0.4799 0.3564 0.0709 0.2132 0.0687 0.376 0. 
0.3889 0.0283 0.1064 0.1844] 2022-08-23 21:54:25 [INFO] [EVAL] The model with the best validation mIoU (0.3532) was saved at iter 34000. 2022-08-23 21:54:35 [INFO] [TRAIN] epoch: 33, iter: 41050/160000, loss: 0.6645, lr: 0.000901, batch_cost: 0.2057, reader_cost: 0.00389, ips: 38.8876 samples/sec | ETA 06:47:50 2022-08-23 21:54:46 [INFO] [TRAIN] epoch: 33, iter: 41100/160000, loss: 0.5996, lr: 0.000900, batch_cost: 0.2213, reader_cost: 0.00082, ips: 36.1429 samples/sec | ETA 07:18:37 2022-08-23 21:54:57 [INFO] [TRAIN] epoch: 33, iter: 41150/160000, loss: 0.6032, lr: 0.000900, batch_cost: 0.2200, reader_cost: 0.00065, ips: 36.3692 samples/sec | ETA 07:15:43 2022-08-23 21:55:07 [INFO] [TRAIN] epoch: 33, iter: 41200/160000, loss: 0.6605, lr: 0.000899, batch_cost: 0.1975, reader_cost: 0.00720, ips: 40.5161 samples/sec | ETA 06:30:57 2022-08-23 21:55:17 [INFO] [TRAIN] epoch: 33, iter: 41250/160000, loss: 0.6422, lr: 0.000899, batch_cost: 0.2059, reader_cost: 0.00079, ips: 38.8521 samples/sec | ETA 06:47:31 2022-08-23 21:55:27 [INFO] [TRAIN] epoch: 33, iter: 41300/160000, loss: 0.6212, lr: 0.000899, batch_cost: 0.1971, reader_cost: 0.00088, ips: 40.5965 samples/sec | ETA 06:29:51 2022-08-23 21:55:36 [INFO] [TRAIN] epoch: 33, iter: 41350/160000, loss: 0.6487, lr: 0.000898, batch_cost: 0.1730, reader_cost: 0.00142, ips: 46.2326 samples/sec | ETA 05:42:10 2022-08-23 21:55:46 [INFO] [TRAIN] epoch: 33, iter: 41400/160000, loss: 0.6407, lr: 0.000898, batch_cost: 0.1932, reader_cost: 0.00049, ips: 41.4130 samples/sec | ETA 06:21:50 2022-08-23 21:55:55 [INFO] [TRAIN] epoch: 33, iter: 41450/160000, loss: 0.6153, lr: 0.000898, batch_cost: 0.1830, reader_cost: 0.00089, ips: 43.7039 samples/sec | ETA 06:01:40 2022-08-23 21:56:04 [INFO] [TRAIN] epoch: 33, iter: 41500/160000, loss: 0.6460, lr: 0.000897, batch_cost: 0.1928, reader_cost: 0.00054, ips: 41.4917 samples/sec | ETA 06:20:47 2022-08-23 21:56:14 [INFO] [TRAIN] epoch: 33, iter: 41550/160000, loss: 0.6360, lr: 0.000897, batch_cost: 0.1831, reader_cost: 0.00044, ips: 43.6873 samples/sec | ETA 06:01:30 2022-08-23 21:56:22 [INFO] [TRAIN] epoch: 33, iter: 41600/160000, loss: 0.6917, lr: 0.000896, batch_cost: 0.1626, reader_cost: 0.00061, ips: 49.1857 samples/sec | ETA 05:20:57 2022-08-23 21:56:32 [INFO] [TRAIN] epoch: 33, iter: 41650/160000, loss: 0.6499, lr: 0.000896, batch_cost: 0.2091, reader_cost: 0.00051, ips: 38.2612 samples/sec | ETA 06:52:25 2022-08-23 21:56:48 [INFO] [TRAIN] epoch: 34, iter: 41700/160000, loss: 0.6425, lr: 0.000896, batch_cost: 0.3254, reader_cost: 0.14754, ips: 24.5847 samples/sec | ETA 10:41:35 2022-08-23 21:56:58 [INFO] [TRAIN] epoch: 34, iter: 41750/160000, loss: 0.6426, lr: 0.000895, batch_cost: 0.1864, reader_cost: 0.00066, ips: 42.9294 samples/sec | ETA 06:07:16 2022-08-23 21:57:08 [INFO] [TRAIN] epoch: 34, iter: 41800/160000, loss: 0.6363, lr: 0.000895, batch_cost: 0.1995, reader_cost: 0.00082, ips: 40.0974 samples/sec | ETA 06:33:02 2022-08-23 21:57:17 [INFO] [TRAIN] epoch: 34, iter: 41850/160000, loss: 0.6182, lr: 0.000895, batch_cost: 0.1943, reader_cost: 0.00053, ips: 41.1769 samples/sec | ETA 06:22:34 2022-08-23 21:57:27 [INFO] [TRAIN] epoch: 34, iter: 41900/160000, loss: 0.5739, lr: 0.000894, batch_cost: 0.1957, reader_cost: 0.00052, ips: 40.8708 samples/sec | ETA 06:25:16 2022-08-23 21:57:36 [INFO] [TRAIN] epoch: 34, iter: 41950/160000, loss: 0.5950, lr: 0.000894, batch_cost: 0.1723, reader_cost: 0.00057, ips: 46.4381 samples/sec | ETA 05:38:56 2022-08-23 21:57:45 [INFO] [TRAIN] epoch: 34, iter: 42000/160000, loss: 
0.6313, lr: 0.000893, batch_cost: 0.1880, reader_cost: 0.00097, ips: 42.5551 samples/sec | ETA 06:09:42 2022-08-23 21:57:45 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 175s - batch_cost: 0.1753 - reader cost: 8.7210e-04 2022-08-23 22:00:41 [INFO] [EVAL] #Images: 2000 mIoU: 0.3563 Acc: 0.7643 Kappa: 0.7465 Dice: 0.4913 2022-08-23 22:00:41 [INFO] [EVAL] Class IoU: [0.6772 0.775 0.93 0.7289 0.6717 0.7578 0.774 0.7942 0.5204 0.6847 0.4795 0.5552 0.7071 0.3019 0.2148 0.4269 0.4984 0.4042 0.5824 0.3965 0.7419 0.407 0.6165 0.4752 0.3189 0.3818 0.4598 0.4152 0.4498 0.3474 0.229 0.4833 0.2963 0.3328 0.3637 0.3785 0.4392 0.5305 0.2578 0.3545 0.1205 0.0997 0.3352 0.2566 0.2502 0.2236 0.3081 0.4859 0.6797 0.4887 0.5426 0.4117 0.2184 0.2281 0.6857 0.4312 0.8694 0.3997 0.3974 0.3365 0.093 0.4059 0.2877 0.2403 0.4324 0.6952 0.2635 0.39 0.0829 0.3694 0.4797 0.5174 0.3823 0.2526 0.4387 0.338 0.567 0.2814 0.3943 0.2376 0.6326 0.3319 0.2558 0.1259 0.223 0.5405 0.1215 0.1046 0.258 0.5202 0.3638 0.0192 0.1942 0.0488 0.0079 0.0266 0.2118 0.178 0.2804 0.3582 0.0697 0.0764 0.2317 0.3989 0.149 0.4406 0.1945 0.5515 0.0375 0.255 0.1566 0.3468 0.1897 0.6317 0.8117 0.0182 0.4447 0.7942 0.1232 0.4639 0.4305 0.0376 0.3557 0.1287 0.2776 0.2026 0.4318 0.4884 0.4148 0.3258 0.5816 0.0775 0.2925 0.3776 0.1931 0.1643 0.1397 0.0367 0.1789 0.2954 0.1584 0.0073 0.2435 0.3867 0.3206 0.009 0.3641 0.0228 0.0973 0.1307] 2022-08-23 22:00:41 [INFO] [EVAL] Class Precision: [0.7878 0.8425 0.9638 0.8373 0.7532 0.8302 0.8884 0.8593 0.6412 0.8012 0.7167 0.6828 0.7902 0.5432 0.5783 0.6045 0.6488 0.6913 0.7224 0.5882 0.8158 0.7244 0.7726 0.5466 0.4839 0.6248 0.5553 0.7924 0.6495 0.5437 0.3867 0.6839 0.5152 0.4601 0.4991 0.4691 0.6128 0.7831 0.3924 0.6191 0.2855 0.4185 0.5094 0.4055 0.3867 0.4467 0.5575 0.6909 0.7502 0.5869 0.6826 0.5328 0.3854 0.5729 0.7294 0.6445 0.9232 0.622 0.7194 0.5535 0.1169 0.5361 0.3977 0.6117 0.5158 0.8737 0.4018 0.5239 0.2565 0.6922 0.6575 0.6017 0.5673 0.285 0.7537 0.5015 0.7824 0.5173 0.7045 0.3279 0.7653 0.6652 0.8207 0.2927 0.315 0.7171 0.3527 0.403 0.4277 0.7307 0.4766 0.0601 0.3461 0.2426 0.0168 0.0792 0.6296 0.4801 0.4307 0.7026 0.554 0.1247 0.64 0.5653 0.844 0.4669 0.3461 0.8594 0.1288 0.408 0.3149 0.4414 0.5251 0.7269 0.8322 0.2618 0.6367 0.8309 0.225 0.6775 0.6758 0.3889 0.6792 0.6341 0.6744 0.5803 0.7021 0.6394 0.4639 0.4896 0.7666 0.4407 0.5733 0.6223 0.7797 0.2812 0.3399 0.1088 0.4089 0.7629 0.5998 0.0263 0.537 0.493 0.6424 0.0114 0.8768 0.2777 0.6339 0.9097] 2022-08-23 22:00:41 [INFO] [EVAL] Class Recall: [0.8284 0.9063 0.9636 0.8492 0.8612 0.8968 0.8574 0.913 0.7343 0.8248 0.5917 0.7483 0.8705 0.4046 0.2547 0.5923 0.6825 0.4933 0.7503 0.5489 0.8911 0.4816 0.7531 0.7844 0.4833 0.4954 0.7279 0.4659 0.594 0.4903 0.3596 0.6223 0.4109 0.5461 0.5727 0.6622 0.6079 0.6218 0.429 0.4534 0.1725 0.1157 0.495 0.4114 0.4148 0.3091 0.4079 0.6209 0.8786 0.745 0.7256 0.6442 0.3352 0.2748 0.9198 0.5658 0.9372 0.528 0.4703 0.4618 0.3129 0.6257 0.5099 0.2835 0.7279 0.7728 0.4337 0.604 0.109 0.442 0.6396 0.787 0.5396 0.6894 0.5122 0.509 0.6732 0.3817 0.4724 0.4632 0.7849 0.3985 0.2709 0.1811 0.4328 0.687 0.1563 0.1237 0.394 0.6436 0.6057 0.0274 0.3066 0.0575 0.0145 0.0384 0.2419 0.2204 0.4454 0.4223 0.0738 0.1646 0.2664 0.5754 0.1532 0.8867 0.3075 0.6062 0.0502 0.4047 0.2375 0.6179 0.229 0.8284 0.9706 0.0192 0.5959 0.9473 0.2139 0.5953 0.5426 0.04 0.4276 0.1391 0.3206 0.2373 0.5287 0.6741 0.7969 0.4935 0.7068 0.086 0.374 0.4898 0.2042 0.2833 0.1917 0.0525 0.2412 0.3253 0.1771 0.01 
0.3082 0.6421 0.3902 0.0399 0.3837 0.0242 0.1031 0.1324] 2022-08-23 22:00:41 [INFO] [EVAL] The model with the best validation mIoU (0.3563) was saved at iter 42000. 2022-08-23 22:00:52 [INFO] [TRAIN] epoch: 34, iter: 42050/160000, loss: 0.6362, lr: 0.000893, batch_cost: 0.2190, reader_cost: 0.00381, ips: 36.5377 samples/sec | ETA 07:10:25 2022-08-23 22:01:03 [INFO] [TRAIN] epoch: 34, iter: 42100/160000, loss: 0.6236, lr: 0.000893, batch_cost: 0.2052, reader_cost: 0.00095, ips: 38.9844 samples/sec | ETA 06:43:14 2022-08-23 22:01:13 [INFO] [TRAIN] epoch: 34, iter: 42150/160000, loss: 0.6330, lr: 0.000892, batch_cost: 0.2068, reader_cost: 0.00084, ips: 38.6820 samples/sec | ETA 06:46:13 2022-08-23 22:01:24 [INFO] [TRAIN] epoch: 34, iter: 42200/160000, loss: 0.6106, lr: 0.000892, batch_cost: 0.2197, reader_cost: 0.00046, ips: 36.4178 samples/sec | ETA 07:11:17 2022-08-23 22:01:36 [INFO] [TRAIN] epoch: 34, iter: 42250/160000, loss: 0.6076, lr: 0.000891, batch_cost: 0.2515, reader_cost: 0.00211, ips: 31.8111 samples/sec | ETA 08:13:32 2022-08-23 22:01:46 [INFO] [TRAIN] epoch: 34, iter: 42300/160000, loss: 0.6678, lr: 0.000891, batch_cost: 0.1943, reader_cost: 0.00060, ips: 41.1673 samples/sec | ETA 06:21:12 2022-08-23 22:01:56 [INFO] [TRAIN] epoch: 34, iter: 42350/160000, loss: 0.6261, lr: 0.000891, batch_cost: 0.1933, reader_cost: 0.00059, ips: 41.3947 samples/sec | ETA 06:18:57 2022-08-23 22:02:05 [INFO] [TRAIN] epoch: 34, iter: 42400/160000, loss: 0.6356, lr: 0.000890, batch_cost: 0.1741, reader_cost: 0.00111, ips: 45.9534 samples/sec | ETA 05:41:12 2022-08-23 22:02:13 [INFO] [TRAIN] epoch: 34, iter: 42450/160000, loss: 0.6265, lr: 0.000890, batch_cost: 0.1736, reader_cost: 0.00082, ips: 46.0883 samples/sec | ETA 05:40:04 2022-08-23 22:02:23 [INFO] [TRAIN] epoch: 34, iter: 42500/160000, loss: 0.6288, lr: 0.000890, batch_cost: 0.1957, reader_cost: 0.00086, ips: 40.8834 samples/sec | ETA 06:23:12 2022-08-23 22:02:32 [INFO] [TRAIN] epoch: 34, iter: 42550/160000, loss: 0.6907, lr: 0.000889, batch_cost: 0.1829, reader_cost: 0.00084, ips: 43.7441 samples/sec | ETA 05:57:59 2022-08-23 22:02:43 [INFO] [TRAIN] epoch: 34, iter: 42600/160000, loss: 0.6610, lr: 0.000889, batch_cost: 0.2082, reader_cost: 0.00052, ips: 38.4316 samples/sec | ETA 06:47:18 2022-08-23 22:02:53 [INFO] [TRAIN] epoch: 34, iter: 42650/160000, loss: 0.7184, lr: 0.000888, batch_cost: 0.2167, reader_cost: 0.00058, ips: 36.9114 samples/sec | ETA 07:03:53 2022-08-23 22:03:03 [INFO] [TRAIN] epoch: 34, iter: 42700/160000, loss: 0.7012, lr: 0.000888, batch_cost: 0.1856, reader_cost: 0.00077, ips: 43.0938 samples/sec | ETA 06:02:55 2022-08-23 22:03:11 [INFO] [TRAIN] epoch: 34, iter: 42750/160000, loss: 0.5955, lr: 0.000888, batch_cost: 0.1737, reader_cost: 0.00055, ips: 46.0561 samples/sec | ETA 05:39:26 2022-08-23 22:03:20 [INFO] [TRAIN] epoch: 34, iter: 42800/160000, loss: 0.6302, lr: 0.000887, batch_cost: 0.1718, reader_cost: 0.00077, ips: 46.5589 samples/sec | ETA 05:35:37 2022-08-23 22:03:29 [INFO] [TRAIN] epoch: 34, iter: 42850/160000, loss: 0.6164, lr: 0.000887, batch_cost: 0.1879, reader_cost: 0.00058, ips: 42.5753 samples/sec | ETA 06:06:52 2022-08-23 22:03:38 [INFO] [TRAIN] epoch: 34, iter: 42900/160000, loss: 0.6465, lr: 0.000887, batch_cost: 0.1688, reader_cost: 0.00044, ips: 47.3843 samples/sec | ETA 05:29:30 2022-08-23 22:03:52 [INFO] [TRAIN] epoch: 35, iter: 42950/160000, loss: 0.6549, lr: 0.000886, batch_cost: 0.2814, reader_cost: 0.10996, ips: 28.4292 samples/sec | ETA 09:08:57 2022-08-23 22:04:02 [INFO] [TRAIN] epoch: 35, 
iter: 43000/160000, loss: 0.6004, lr: 0.000886, batch_cost: 0.2008, reader_cost: 0.00046, ips: 39.8330 samples/sec | ETA 06:31:38 2022-08-23 22:04:02 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 193s - batch_cost: 0.1933 - reader cost: 7.3927e-04 2022-08-23 22:07:15 [INFO] [EVAL] #Images: 2000 mIoU: 0.3477 Acc: 0.7652 Kappa: 0.7472 Dice: 0.4801 2022-08-23 22:07:15 [INFO] [EVAL] Class IoU: [0.6804 0.778 0.9294 0.7314 0.6728 0.7517 0.7446 0.7839 0.5137 0.6548 0.487 0.5398 0.6983 0.299 0.2899 0.4356 0.5262 0.4166 0.5863 0.4114 0.7421 0.4277 0.6121 0.4949 0.3394 0.4902 0.4001 0.4052 0.4435 0.2756 0.2596 0.4982 0.2723 0.3411 0.3605 0.3691 0.4416 0.511 0.2716 0.3526 0.0966 0.1506 0.3326 0.2635 0.2741 0.194 0.3668 0.4964 0.7094 0.4914 0.5471 0.3738 0.147 0.1798 0.5823 0.3644 0.8688 0.3913 0.5006 0.2777 0.0888 0.2321 0.2912 0.1643 0.4469 0.6862 0.2515 0.3886 0.1179 0.3539 0.4473 0.5234 0.3927 0.2259 0.4476 0.3385 0.5898 0.2589 0.5568 0.349 0.6896 0.3593 0.3165 0.0616 0.1785 0.5413 0.1299 0.0927 0.2385 0.4747 0.44 0.0261 0.2116 0.0629 0.0246 0.03 0.191 0.1431 0.2849 0.339 0.07 0.0813 0.0975 0.5221 0.1694 0.5726 0.1612 0.5528 0.0519 0.3274 0.1403 0.3888 0.1344 0.5419 0.6594 0.0054 0.3438 0.6589 0.0779 0.0692 0.4963 0.0504 0.3381 0.149 0.2973 0.2234 0.3864 0.4884 0.019 0.3147 0.5734 0.0044 0.2341 0.3453 0.251 0.1571 0.1296 0.0379 0.1506 0.3555 0.109 0.0695 0.1146 0.2648 0.2893 0.0067 0.4189 0.0529 0.1573 0.2041] 2022-08-23 22:07:15 [INFO] [EVAL] Class Precision: [0.7753 0.8703 0.9657 0.827 0.7625 0.8598 0.8276 0.8507 0.6641 0.7171 0.7059 0.6979 0.7803 0.5475 0.5775 0.5849 0.6856 0.6567 0.6998 0.5876 0.8043 0.5934 0.7595 0.6103 0.4986 0.5646 0.5748 0.7438 0.6873 0.6996 0.4891 0.5894 0.4913 0.4801 0.4992 0.5294 0.6178 0.7585 0.4269 0.606 0.253 0.37 0.5536 0.5688 0.3416 0.6126 0.7498 0.708 0.7891 0.6132 0.6986 0.4516 0.3558 0.5584 0.7121 0.6586 0.9474 0.6437 0.7016 0.4433 0.1564 0.7014 0.3815 0.6346 0.5739 0.8262 0.4119 0.6046 0.2712 0.571 0.731 0.6093 0.6238 0.3143 0.5843 0.4789 0.7389 0.686 0.7411 0.5106 0.8768 0.7064 0.8154 0.1847 0.2798 0.718 0.5926 0.4416 0.5181 0.6155 0.6703 0.0553 0.3399 0.2727 0.041 0.071 0.5239 0.5046 0.45 0.6654 0.757 0.1261 0.4959 0.9244 0.8628 0.6283 0.3272 0.8279 0.1544 0.5294 0.3357 0.548 0.7275 0.7553 0.6668 0.2166 0.713 0.6715 0.1863 0.7392 0.673 0.5286 0.6676 0.5481 0.564 0.6353 0.6667 0.6726 0.5906 0.4939 0.7109 0.1635 0.6667 0.7145 0.7246 0.3935 0.4146 0.0611 0.3758 0.6736 0.7214 0.1073 0.7155 0.4689 0.6442 0.0338 0.8584 0.2444 0.4912 0.742 ] 2022-08-23 22:07:15 [INFO] [EVAL] Class Recall: [0.8474 0.8801 0.9612 0.8634 0.8512 0.8568 0.8813 0.909 0.6939 0.8829 0.6109 0.7045 0.8692 0.3972 0.3679 0.6306 0.6935 0.5326 0.7832 0.5785 0.9056 0.605 0.7592 0.7236 0.5153 0.7881 0.5683 0.4709 0.5556 0.3126 0.3561 0.7629 0.3792 0.5408 0.5647 0.5494 0.6076 0.6104 0.4274 0.4575 0.1352 0.2025 0.4544 0.3293 0.5808 0.2211 0.418 0.6241 0.8753 0.7122 0.7162 0.6845 0.2004 0.2097 0.7617 0.4494 0.9128 0.4995 0.636 0.4264 0.1704 0.2576 0.5515 0.1815 0.6688 0.802 0.3923 0.5211 0.1726 0.4821 0.5354 0.7877 0.5147 0.4455 0.6567 0.5359 0.7451 0.2937 0.6912 0.5244 0.7636 0.4224 0.3409 0.0846 0.33 0.6875 0.1426 0.1049 0.3064 0.6748 0.5616 0.0471 0.3591 0.0756 0.0577 0.0493 0.2312 0.1665 0.4371 0.4087 0.0717 0.1861 0.1082 0.5454 0.1741 0.8659 0.2411 0.6246 0.0725 0.4618 0.1942 0.5723 0.1416 0.6573 0.9834 0.0055 0.399 0.9722 0.1181 0.071 0.654 0.0528 0.4065 0.1699 0.386 0.2562 0.479 0.6407 0.0192 0.4645 0.7477 0.0045 0.2651 0.4005 0.2775 0.2073 0.1587 0.0906 0.2007 
0.4295 0.1137 0.1648 0.1201 0.3782 0.3444 0.0083 0.45 0.0633 0.1879 0.2197] 2022-08-23 22:07:16 [INFO] [EVAL] The model with the best validation mIoU (0.3563) was saved at iter 42000. 2022-08-23 22:07:27 [INFO] [TRAIN] epoch: 35, iter: 43050/160000, loss: 0.6144, lr: 0.000885, batch_cost: 0.2181, reader_cost: 0.00456, ips: 36.6888 samples/sec | ETA 07:05:00 2022-08-23 22:07:37 [INFO] [TRAIN] epoch: 35, iter: 43100/160000, loss: 0.6136, lr: 0.000885, batch_cost: 0.2057, reader_cost: 0.00072, ips: 38.8864 samples/sec | ETA 06:40:49 2022-08-23 22:07:47 [INFO] [TRAIN] epoch: 35, iter: 43150/160000, loss: 0.6117, lr: 0.000885, batch_cost: 0.2078, reader_cost: 0.00056, ips: 38.4981 samples/sec | ETA 06:44:41 2022-08-23 22:07:58 [INFO] [TRAIN] epoch: 35, iter: 43200/160000, loss: 0.6423, lr: 0.000884, batch_cost: 0.2120, reader_cost: 0.00036, ips: 37.7387 samples/sec | ETA 06:52:39 2022-08-23 22:08:09 [INFO] [TRAIN] epoch: 35, iter: 43250/160000, loss: 0.6192, lr: 0.000884, batch_cost: 0.2184, reader_cost: 0.00071, ips: 36.6380 samples/sec | ETA 07:04:52 2022-08-23 22:08:19 [INFO] [TRAIN] epoch: 35, iter: 43300/160000, loss: 0.6436, lr: 0.000884, batch_cost: 0.2048, reader_cost: 0.00096, ips: 39.0646 samples/sec | ETA 06:38:18 2022-08-23 22:08:30 [INFO] [TRAIN] epoch: 35, iter: 43350/160000, loss: 0.6145, lr: 0.000883, batch_cost: 0.2119, reader_cost: 0.00056, ips: 37.7527 samples/sec | ETA 06:51:58 2022-08-23 22:08:40 [INFO] [TRAIN] epoch: 35, iter: 43400/160000, loss: 0.6261, lr: 0.000883, batch_cost: 0.2056, reader_cost: 0.00207, ips: 38.9156 samples/sec | ETA 06:39:29 2022-08-23 22:08:48 [INFO] [TRAIN] epoch: 35, iter: 43450/160000, loss: 0.6119, lr: 0.000882, batch_cost: 0.1715, reader_cost: 0.00079, ips: 46.6493 samples/sec | ETA 05:33:07 2022-08-23 22:08:58 [INFO] [TRAIN] epoch: 35, iter: 43500/160000, loss: 0.6215, lr: 0.000882, batch_cost: 0.1851, reader_cost: 0.00042, ips: 43.2165 samples/sec | ETA 05:59:25 2022-08-23 22:09:07 [INFO] [TRAIN] epoch: 35, iter: 43550/160000, loss: 0.6316, lr: 0.000882, batch_cost: 0.1953, reader_cost: 0.00050, ips: 40.9608 samples/sec | ETA 06:19:03 2022-08-23 22:09:18 [INFO] [TRAIN] epoch: 35, iter: 43600/160000, loss: 0.5805, lr: 0.000881, batch_cost: 0.2162, reader_cost: 0.00032, ips: 36.9973 samples/sec | ETA 06:59:29 2022-08-23 22:09:27 [INFO] [TRAIN] epoch: 35, iter: 43650/160000, loss: 0.6399, lr: 0.000881, batch_cost: 0.1694, reader_cost: 0.00044, ips: 47.2252 samples/sec | ETA 05:28:29 2022-08-23 22:09:35 [INFO] [TRAIN] epoch: 35, iter: 43700/160000, loss: 0.5719, lr: 0.000881, batch_cost: 0.1673, reader_cost: 0.00122, ips: 47.8285 samples/sec | ETA 05:24:12 2022-08-23 22:09:44 [INFO] [TRAIN] epoch: 35, iter: 43750/160000, loss: 0.6427, lr: 0.000880, batch_cost: 0.1779, reader_cost: 0.00038, ips: 44.9727 samples/sec | ETA 05:44:39 2022-08-23 22:09:53 [INFO] [TRAIN] epoch: 35, iter: 43800/160000, loss: 0.6941, lr: 0.000880, batch_cost: 0.1714, reader_cost: 0.00129, ips: 46.6719 samples/sec | ETA 05:31:57 2022-08-23 22:10:01 [INFO] [TRAIN] epoch: 35, iter: 43850/160000, loss: 0.7050, lr: 0.000879, batch_cost: 0.1748, reader_cost: 0.00081, ips: 45.7774 samples/sec | ETA 05:38:18 2022-08-23 22:10:11 [INFO] [TRAIN] epoch: 35, iter: 43900/160000, loss: 0.6706, lr: 0.000879, batch_cost: 0.1892, reader_cost: 0.00054, ips: 42.2775 samples/sec | ETA 06:06:09 2022-08-23 22:10:21 [INFO] [TRAIN] epoch: 35, iter: 43950/160000, loss: 0.6000, lr: 0.000879, batch_cost: 0.2090, reader_cost: 0.00053, ips: 38.2720 samples/sec | ETA 06:44:17 2022-08-23 22:10:31 [INFO] 
[TRAIN] epoch: 35, iter: 44000/160000, loss: 0.6198, lr: 0.000878, batch_cost: 0.1846, reader_cost: 0.00056, ips: 43.3461 samples/sec | ETA 05:56:49 2022-08-23 22:10:31 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 167s - batch_cost: 0.1674 - reader cost: 6.2145e-04 2022-08-23 22:13:18 [INFO] [EVAL] #Images: 2000 mIoU: 0.3497 Acc: 0.7648 Kappa: 0.7469 Dice: 0.4845 2022-08-23 22:13:18 [INFO] [EVAL] Class IoU: [0.6811 0.7713 0.931 0.7353 0.6683 0.7533 0.7747 0.7784 0.5237 0.6456 0.4814 0.5613 0.6998 0.3284 0.2636 0.4353 0.5404 0.391 0.6019 0.4074 0.754 0.4239 0.6073 0.4785 0.2644 0.3471 0.4318 0.4026 0.4215 0.3204 0.2448 0.5246 0.3094 0.3237 0.4196 0.4033 0.4389 0.5214 0.2745 0.3472 0.2124 0.1636 0.3582 0.2462 0.3004 0.2233 0.261 0.4908 0.6139 0.4813 0.5563 0.3689 0.1898 0.2464 0.665 0.3346 0.8681 0.4168 0.4829 0.3099 0.1255 0.2243 0.2699 0.1794 0.4342 0.6788 0.182 0.3992 0.115 0.3218 0.4794 0.5139 0.4022 0.2126 0.4521 0.3649 0.4799 0.2507 0.3403 0.2851 0.5884 0.3623 0.3314 0.0036 0.2077 0.5149 0.1011 0.0748 0.2321 0.4989 0.4026 0.0112 0.2033 0.1276 0.0018 0.0316 0.2133 0.1631 0.2703 0.3786 0.1306 0.0716 0.1667 0.5064 0.1242 0.5709 0.1709 0.5481 0.0481 0.4023 0.0905 0.1126 0.1956 0.6238 0.7465 0.0436 0.3303 0.6516 0.1141 0.3337 0.2558 0.0794 0.3778 0.1553 0.2903 0.202 0.4379 0.4679 0.0667 0.3202 0.5628 0.0493 0.2575 0.3914 0.2334 0.1626 0.1353 0.0196 0.1598 0.3819 0.2475 0.0184 0.2708 0.3634 0.3044 0.0195 0.4265 0.0126 0.1732 0.1794] 2022-08-23 22:13:18 [INFO] [EVAL] Class Precision: [0.782 0.8493 0.9662 0.8386 0.7401 0.8502 0.8807 0.8285 0.643 0.8267 0.6988 0.6911 0.7762 0.4535 0.555 0.618 0.7239 0.7099 0.7372 0.6071 0.8464 0.6307 0.7872 0.6456 0.4742 0.5919 0.5548 0.6059 0.7906 0.5097 0.4955 0.6215 0.4984 0.388 0.5525 0.5225 0.6536 0.7646 0.4383 0.5833 0.3962 0.3813 0.6521 0.6068 0.4817 0.4932 0.8853 0.6642 0.7074 0.5667 0.7157 0.4404 0.5139 0.5346 0.7046 0.5773 0.9078 0.5798 0.7061 0.5429 0.1745 0.4263 0.3497 0.7192 0.5159 0.7876 0.3654 0.5746 0.2229 0.5652 0.6382 0.6498 0.6096 0.2668 0.6851 0.5129 0.8191 0.6069 0.7941 0.3563 0.6457 0.611 0.8306 0.0832 0.3104 0.7444 0.4412 0.4588 0.3702 0.6859 0.6207 0.0322 0.4457 0.3322 0.0071 0.1235 0.5194 0.4038 0.4123 0.5903 0.7405 0.1186 0.5004 0.5565 0.8093 0.6308 0.332 0.7266 0.219 0.5056 0.4459 0.1251 0.5498 0.7188 0.7625 0.2118 0.7015 0.6991 0.249 0.6759 0.8083 0.4967 0.9357 0.5717 0.6998 0.6213 0.6755 0.6657 0.6382 0.5749 0.7505 0.8082 0.656 0.6865 0.5895 0.3145 0.3811 0.1452 0.4905 0.571 0.5109 0.0583 0.5837 0.4466 0.7819 0.0599 0.8411 0.3273 0.418 0.8353] 2022-08-23 22:13:18 [INFO] [EVAL] Class Recall: [0.8406 0.8936 0.9624 0.8566 0.8733 0.8686 0.8655 0.9279 0.7385 0.7467 0.6074 0.7493 0.8768 0.5435 0.3343 0.5956 0.6807 0.4654 0.7663 0.5533 0.8736 0.5638 0.7266 0.649 0.3741 0.4562 0.6607 0.5454 0.4745 0.4632 0.3261 0.7709 0.4492 0.6617 0.6358 0.6387 0.572 0.621 0.4235 0.4618 0.3141 0.2228 0.4429 0.2929 0.4438 0.2897 0.2701 0.6528 0.8228 0.7616 0.7141 0.6943 0.2314 0.3137 0.9221 0.4431 0.952 0.5972 0.6045 0.4194 0.3089 0.3212 0.5421 0.1929 0.7328 0.831 0.2662 0.5668 0.1921 0.4277 0.6583 0.7107 0.5417 0.5114 0.5708 0.5583 0.5368 0.2994 0.3732 0.5879 0.869 0.4709 0.3554 0.0038 0.3856 0.6254 0.116 0.082 0.3836 0.6467 0.5341 0.0168 0.2721 0.1715 0.0024 0.0407 0.2657 0.2149 0.4398 0.5135 0.1369 0.153 0.2 0.8491 0.1279 0.8574 0.2605 0.6904 0.0581 0.6633 0.102 0.5296 0.2329 0.8252 0.9728 0.052 0.3843 0.9056 0.1739 0.3973 0.2723 0.0864 0.3879 0.1757 0.3317 0.2304 0.5545 0.6116 0.0693 0.4196 0.6924 0.0498 0.2977 0.4767 
0.2786 0.2518 0.1735 0.0221 0.1915 0.5357 0.3244 0.0263 0.3356 0.6609 0.3326 0.0281 0.4639 0.013 0.2283 0.186 ] 2022-08-23 22:13:18 [INFO] [EVAL] The model with the best validation mIoU (0.3563) was saved at iter 42000. 2022-08-23 22:13:29 [INFO] [TRAIN] epoch: 35, iter: 44050/160000, loss: 0.6471, lr: 0.000878, batch_cost: 0.2192, reader_cost: 0.00381, ips: 36.4896 samples/sec | ETA 07:03:40 2022-08-23 22:13:39 [INFO] [TRAIN] epoch: 35, iter: 44100/160000, loss: 0.5988, lr: 0.000877, batch_cost: 0.1884, reader_cost: 0.00103, ips: 42.4661 samples/sec | ETA 06:03:53 2022-08-23 22:13:49 [INFO] [TRAIN] epoch: 35, iter: 44150/160000, loss: 0.6485, lr: 0.000877, batch_cost: 0.1999, reader_cost: 0.00786, ips: 40.0112 samples/sec | ETA 06:26:03 2022-08-23 22:13:58 [INFO] [TRAIN] epoch: 35, iter: 44200/160000, loss: 0.6980, lr: 0.000877, batch_cost: 0.1883, reader_cost: 0.00107, ips: 42.4787 samples/sec | ETA 06:03:28 2022-08-23 22:14:12 [INFO] [TRAIN] epoch: 36, iter: 44250/160000, loss: 0.6429, lr: 0.000876, batch_cost: 0.2797, reader_cost: 0.08444, ips: 28.6042 samples/sec | ETA 08:59:32 2022-08-23 22:14:22 [INFO] [TRAIN] epoch: 36, iter: 44300/160000, loss: 0.6524, lr: 0.000876, batch_cost: 0.2045, reader_cost: 0.00052, ips: 39.1255 samples/sec | ETA 06:34:17 2022-08-23 22:14:33 [INFO] [TRAIN] epoch: 36, iter: 44350/160000, loss: 0.6670, lr: 0.000876, batch_cost: 0.2131, reader_cost: 0.00047, ips: 37.5357 samples/sec | ETA 06:50:48 2022-08-23 22:14:42 [INFO] [TRAIN] epoch: 36, iter: 44400/160000, loss: 0.6147, lr: 0.000875, batch_cost: 0.1792, reader_cost: 0.00128, ips: 44.6468 samples/sec | ETA 05:45:13 2022-08-23 22:14:52 [INFO] [TRAIN] epoch: 36, iter: 44450/160000, loss: 0.6167, lr: 0.000875, batch_cost: 0.2010, reader_cost: 0.00048, ips: 39.8085 samples/sec | ETA 06:27:01 2022-08-23 22:15:02 [INFO] [TRAIN] epoch: 36, iter: 44500/160000, loss: 0.5867, lr: 0.000874, batch_cost: 0.1892, reader_cost: 0.00068, ips: 42.2844 samples/sec | ETA 06:04:12 2022-08-23 22:15:10 [INFO] [TRAIN] epoch: 36, iter: 44550/160000, loss: 0.5787, lr: 0.000874, batch_cost: 0.1767, reader_cost: 0.00073, ips: 45.2653 samples/sec | ETA 05:40:04 2022-08-23 22:15:19 [INFO] [TRAIN] epoch: 36, iter: 44600/160000, loss: 0.6296, lr: 0.000874, batch_cost: 0.1641, reader_cost: 0.00090, ips: 48.7530 samples/sec | ETA 05:15:36 2022-08-23 22:15:28 [INFO] [TRAIN] epoch: 36, iter: 44650/160000, loss: 0.6062, lr: 0.000873, batch_cost: 0.1868, reader_cost: 0.00040, ips: 42.8287 samples/sec | ETA 05:59:06 2022-08-23 22:15:38 [INFO] [TRAIN] epoch: 36, iter: 44700/160000, loss: 0.6255, lr: 0.000873, batch_cost: 0.1905, reader_cost: 0.00072, ips: 42.0044 samples/sec | ETA 06:05:59 2022-08-23 22:15:46 [INFO] [TRAIN] epoch: 36, iter: 44750/160000, loss: 0.6757, lr: 0.000873, batch_cost: 0.1688, reader_cost: 0.00056, ips: 47.3886 samples/sec | ETA 05:24:16 2022-08-23 22:15:54 [INFO] [TRAIN] epoch: 36, iter: 44800/160000, loss: 0.6253, lr: 0.000872, batch_cost: 0.1612, reader_cost: 0.00064, ips: 49.6238 samples/sec | ETA 05:09:31 2022-08-23 22:16:02 [INFO] [TRAIN] epoch: 36, iter: 44850/160000, loss: 0.6196, lr: 0.000872, batch_cost: 0.1658, reader_cost: 0.00080, ips: 48.2394 samples/sec | ETA 05:18:16 2022-08-23 22:16:11 [INFO] [TRAIN] epoch: 36, iter: 44900/160000, loss: 0.5934, lr: 0.000871, batch_cost: 0.1657, reader_cost: 0.00070, ips: 48.2690 samples/sec | ETA 05:17:56 2022-08-23 22:16:19 [INFO] [TRAIN] epoch: 36, iter: 44950/160000, loss: 0.6245, lr: 0.000871, batch_cost: 0.1620, reader_cost: 0.00041, ips: 49.3962 samples/sec | ETA 
05:10:33 2022-08-23 22:16:27 [INFO] [TRAIN] epoch: 36, iter: 45000/160000, loss: 0.6184, lr: 0.000871, batch_cost: 0.1657, reader_cost: 0.00057, ips: 48.2672 samples/sec | ETA 05:17:40 2022-08-23 22:16:27 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 168s - batch_cost: 0.1682 - reader cost: 6.9317e-04 2022-08-23 22:19:15 [INFO] [EVAL] #Images: 2000 mIoU: 0.3496 Acc: 0.7615 Kappa: 0.7432 Dice: 0.4851 2022-08-23 22:19:15 [INFO] [EVAL] Class IoU: [0.6755 0.7739 0.9275 0.7276 0.6694 0.7488 0.7402 0.7787 0.5005 0.6654 0.4779 0.5326 0.7096 0.3166 0.2768 0.4183 0.4978 0.4269 0.5985 0.4173 0.7564 0.5139 0.593 0.5024 0.302 0.2531 0.4606 0.424 0.4573 0.3412 0.2708 0.4835 0.2854 0.2902 0.2858 0.3999 0.4401 0.5612 0.2782 0.3212 0.1485 0.1421 0.3585 0.2354 0.2807 0.2061 0.3459 0.5047 0.6457 0.5396 0.5686 0.4017 0.1832 0.2242 0.676 0.3749 0.8639 0.3763 0.4426 0.2671 0.0938 0.2391 0.2447 0.1426 0.4352 0.7142 0.2476 0.4014 0.1038 0.3415 0.4578 0.4109 0.3715 0.2792 0.4443 0.384 0.3958 0.2349 0.4052 0.2594 0.655 0.3697 0.3499 0.0192 0.2459 0.5158 0.1068 0.0879 0.2712 0.4834 0.443 0.1206 0.2105 0.0822 0.029 0.036 0.2042 0.1434 0.2812 0.3687 0.0963 0.1004 0.236 0.4112 0.1041 0.4564 0.222 0.5508 0.0274 0.4406 0.0936 0.2128 0.1289 0.5308 0.5398 0.0039 0.408 0.6793 0.1255 0.4398 0.4643 0.0317 0.3788 0.1022 0.2676 0.2121 0.47 0.4897 0.5198 0.3495 0.3839 0.0992 0.0967 0.3088 0.1637 0.1799 0.1688 0.029 0.1616 0.3678 0.2013 0.0002 0.1756 0.3722 0.3395 0.015 0.3993 0.0151 0.1536 0.2023] 2022-08-23 22:19:15 [INFO] [EVAL] Class Precision: [0.7678 0.8544 0.9562 0.825 0.774 0.8923 0.9124 0.8317 0.6799 0.7876 0.7306 0.6044 0.7944 0.5135 0.5738 0.5246 0.6432 0.6802 0.7248 0.617 0.8399 0.6065 0.7735 0.6194 0.4677 0.5441 0.7625 0.6022 0.7854 0.5868 0.4544 0.6061 0.5335 0.3465 0.5165 0.5046 0.6803 0.7697 0.4352 0.5785 0.3121 0.3382 0.6124 0.5619 0.3605 0.3488 0.6616 0.8124 0.7123 0.6598 0.7205 0.4977 0.3425 0.5816 0.717 0.7117 0.9057 0.695 0.589 0.5503 0.181 0.4585 0.3265 0.7835 0.5337 0.8687 0.3802 0.5452 0.2166 0.4937 0.7025 0.7371 0.6617 0.3363 0.673 0.5965 0.6528 0.484 0.7446 0.3513 0.8023 0.7432 0.8032 0.0345 0.3193 0.6097 0.3909 0.3631 0.5336 0.6542 0.7143 0.1386 0.3426 0.3288 0.0613 0.1333 0.7505 0.5342 0.423 0.6469 0.6851 0.1916 0.5006 0.5845 0.7655 0.5288 0.4372 0.7685 0.1267 0.5584 0.3529 0.6071 0.6379 0.7761 0.5426 0.0889 0.6474 0.7805 0.1476 0.6471 0.6956 0.2778 0.6465 0.6183 0.7283 0.6295 0.7129 0.6677 0.7594 0.6275 0.404 0.3667 0.7587 0.7615 0.7697 0.3725 0.2849 0.1075 0.4488 0.7163 0.3459 0.0005 0.6063 0.5073 0.6984 0.0214 0.8778 0.4701 0.4876 0.8334] 2022-08-23 22:19:16 [INFO] [EVAL] Class Recall: [0.8489 0.8914 0.9686 0.8605 0.8321 0.8233 0.7969 0.9243 0.6548 0.8109 0.5802 0.8176 0.8693 0.4523 0.3485 0.6737 0.6878 0.5341 0.7745 0.5632 0.8838 0.771 0.7175 0.7267 0.4602 0.3212 0.5377 0.589 0.5226 0.4491 0.4012 0.705 0.3803 0.6409 0.3901 0.6585 0.5549 0.6745 0.4355 0.4193 0.2208 0.1968 0.4637 0.2883 0.5592 0.335 0.4203 0.5713 0.8736 0.7476 0.7296 0.6757 0.2825 0.2673 0.922 0.442 0.9493 0.4507 0.6404 0.3417 0.1631 0.3333 0.4941 0.1484 0.7023 0.8006 0.4152 0.6034 0.1663 0.5254 0.5679 0.4814 0.4587 0.6218 0.5666 0.5188 0.5013 0.3133 0.4706 0.4978 0.7811 0.4239 0.3827 0.0416 0.5168 0.7702 0.1281 0.1039 0.3555 0.6493 0.5384 0.4821 0.3531 0.0988 0.0522 0.047 0.2191 0.1639 0.4561 0.4616 0.1007 0.1741 0.3086 0.581 0.1075 0.7692 0.3107 0.6603 0.0337 0.6761 0.113 0.2468 0.1391 0.6268 0.9906 0.004 0.5246 0.8397 0.4557 0.5787 0.5826 0.0345 0.4778 0.1091 0.2973 0.2424 0.5797 0.6474 0.6224 
0.441 0.8855 0.1197 0.0997 0.3419 0.1721 0.2582 0.2928 0.0382 0.2016 0.4305 0.3249 0.0004 0.1982 0.5829 0.3978 0.0482 0.4228 0.0153 0.1831 0.2108] 2022-08-23 22:19:16 [INFO] [EVAL] The model with the best validation mIoU (0.3563) was saved at iter 42000. 2022-08-23 22:19:25 [INFO] [TRAIN] epoch: 36, iter: 45050/160000, loss: 0.6084, lr: 0.000870, batch_cost: 0.1872, reader_cost: 0.00347, ips: 42.7247 samples/sec | ETA 05:58:43 2022-08-23 22:19:35 [INFO] [TRAIN] epoch: 36, iter: 45100/160000, loss: 0.6381, lr: 0.000870, batch_cost: 0.1926, reader_cost: 0.00069, ips: 41.5416 samples/sec | ETA 06:08:47 2022-08-23 22:19:45 [INFO] [TRAIN] epoch: 36, iter: 45150/160000, loss: 0.6274, lr: 0.000870, batch_cost: 0.2050, reader_cost: 0.00053, ips: 39.0283 samples/sec | ETA 06:32:21 2022-08-23 22:19:54 [INFO] [TRAIN] epoch: 36, iter: 45200/160000, loss: 0.6257, lr: 0.000869, batch_cost: 0.1878, reader_cost: 0.00158, ips: 42.5896 samples/sec | ETA 05:59:23 2022-08-23 22:20:05 [INFO] [TRAIN] epoch: 36, iter: 45250/160000, loss: 0.6063, lr: 0.000869, batch_cost: 0.2150, reader_cost: 0.00051, ips: 37.2149 samples/sec | ETA 06:51:07 2022-08-23 22:20:16 [INFO] [TRAIN] epoch: 36, iter: 45300/160000, loss: 0.5990, lr: 0.000868, batch_cost: 0.2182, reader_cost: 0.00044, ips: 36.6634 samples/sec | ETA 06:57:07 2022-08-23 22:20:25 [INFO] [TRAIN] epoch: 36, iter: 45350/160000, loss: 0.6458, lr: 0.000868, batch_cost: 0.1754, reader_cost: 0.00043, ips: 45.6023 samples/sec | ETA 05:35:13 2022-08-23 22:20:35 [INFO] [TRAIN] epoch: 36, iter: 45400/160000, loss: 0.6590, lr: 0.000868, batch_cost: 0.2120, reader_cost: 0.00059, ips: 37.7316 samples/sec | ETA 06:44:57 2022-08-23 22:20:45 [INFO] [TRAIN] epoch: 36, iter: 45450/160000, loss: 0.6259, lr: 0.000867, batch_cost: 0.1828, reader_cost: 0.00097, ips: 43.7634 samples/sec | ETA 05:48:59 2022-08-23 22:20:57 [INFO] [TRAIN] epoch: 37, iter: 45500/160000, loss: 0.6002, lr: 0.000867, batch_cost: 0.2471, reader_cost: 0.06408, ips: 32.3742 samples/sec | ETA 07:51:34 2022-08-23 22:21:05 [INFO] [TRAIN] epoch: 37, iter: 45550/160000, loss: 0.5965, lr: 0.000867, batch_cost: 0.1716, reader_cost: 0.00118, ips: 46.6180 samples/sec | ETA 05:27:20 2022-08-23 22:21:16 [INFO] [TRAIN] epoch: 37, iter: 45600/160000, loss: 0.5813, lr: 0.000866, batch_cost: 0.2037, reader_cost: 0.00075, ips: 39.2779 samples/sec | ETA 06:28:20 2022-08-23 22:21:25 [INFO] [TRAIN] epoch: 37, iter: 45650/160000, loss: 0.6062, lr: 0.000866, batch_cost: 0.1766, reader_cost: 0.00066, ips: 45.3043 samples/sec | ETA 05:36:32 2022-08-23 22:21:34 [INFO] [TRAIN] epoch: 37, iter: 45700/160000, loss: 0.6103, lr: 0.000865, batch_cost: 0.1835, reader_cost: 0.00051, ips: 43.6052 samples/sec | ETA 05:49:29 2022-08-23 22:21:43 [INFO] [TRAIN] epoch: 37, iter: 45750/160000, loss: 0.5983, lr: 0.000865, batch_cost: 0.1823, reader_cost: 0.00073, ips: 43.8817 samples/sec | ETA 05:47:08 2022-08-23 22:21:54 [INFO] [TRAIN] epoch: 37, iter: 45800/160000, loss: 0.5559, lr: 0.000865, batch_cost: 0.2160, reader_cost: 0.00067, ips: 37.0330 samples/sec | ETA 06:51:09 2022-08-23 22:22:02 [INFO] [TRAIN] epoch: 37, iter: 45850/160000, loss: 0.6456, lr: 0.000864, batch_cost: 0.1646, reader_cost: 0.00073, ips: 48.5970 samples/sec | ETA 05:13:11 2022-08-23 22:22:10 [INFO] [TRAIN] epoch: 37, iter: 45900/160000, loss: 0.6109, lr: 0.000864, batch_cost: 0.1673, reader_cost: 0.00087, ips: 47.8169 samples/sec | ETA 05:18:09 2022-08-23 22:22:20 [INFO] [TRAIN] epoch: 37, iter: 45950/160000, loss: 0.6443, lr: 0.000863, batch_cost: 0.2031, reader_cost: 
0.00063, ips: 39.3864 samples/sec | ETA 06:26:05 2022-08-23 22:22:29 [INFO] [TRAIN] epoch: 37, iter: 46000/160000, loss: 0.5915, lr: 0.000863, batch_cost: 0.1761, reader_cost: 0.00041, ips: 45.4353 samples/sec | ETA 05:34:32 2022-08-23 22:22:29 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 213s - batch_cost: 0.2130 - reader cost: 6.5433e-04 2022-08-23 22:26:02 [INFO] [EVAL] #Images: 2000 mIoU: 0.3514 Acc: 0.7632 Kappa: 0.7454 Dice: 0.4853 2022-08-23 22:26:02 [INFO] [EVAL] Class IoU: [0.6797 0.7761 0.9281 0.7411 0.6721 0.7595 0.7737 0.7889 0.5132 0.6694 0.49 0.5335 0.7028 0.2546 0.3133 0.4221 0.4891 0.4098 0.5775 0.4282 0.7505 0.43 0.5956 0.4908 0.3134 0.3393 0.4225 0.4135 0.4591 0.3171 0.253 0.5081 0.3013 0.3112 0.2593 0.3886 0.4413 0.5311 0.2991 0.3885 0.1581 0.1619 0.3648 0.2685 0.3081 0.2284 0.2552 0.4871 0.4806 0.5489 0.5344 0.5092 0.175 0.0826 0.6528 0.4299 0.8667 0.3997 0.5138 0.2576 0.0664 0.4387 0.2997 0.0928 0.465 0.6865 0.215 0.4065 0.113 0.3686 0.4977 0.5318 0.3579 0.2616 0.449 0.3389 0.4538 0.2323 0.2957 0.4364 0.6041 0.3412 0.3757 0.0141 0.077 0.5185 0.0898 0.1063 0.1917 0.4976 0.3726 0.0944 0.1965 0.055 0.0033 0.0132 0.1711 0.1243 0.1907 0.3796 0.0826 0.0687 0.2591 0.3825 0.1594 0.5828 0.1962 0.5285 0.0754 0.4321 0.1401 0.3871 0.1984 0.6783 0.6959 0.0173 0.4417 0.6797 0.1197 0.2913 0.3374 0.0527 0.3169 0.1376 0.2624 0.2047 0.4196 0.5048 0.3825 0.306 0.5284 0.0808 0.2207 0.3106 0.2155 0.1462 0.1483 0.0414 0.1795 0.398 0.2001 0.0002 0.1979 0.4316 0.3234 0.0056 0.4429 0.0023 0.1058 0.1421] 2022-08-23 22:26:02 [INFO] [EVAL] Class Precision: [0.7886 0.8607 0.9619 0.8543 0.7497 0.8646 0.8941 0.8448 0.6424 0.7671 0.6946 0.6138 0.792 0.4979 0.5401 0.5665 0.6184 0.6771 0.6971 0.5799 0.8236 0.6732 0.7415 0.6294 0.4935 0.5728 0.5207 0.669 0.7475 0.4322 0.493 0.7294 0.5306 0.3735 0.4546 0.4868 0.6823 0.8296 0.4672 0.5563 0.3341 0.3658 0.5927 0.5834 0.4079 0.4523 0.415 0.6439 0.654 0.7249 0.7012 0.6875 0.3957 0.4647 0.7001 0.5923 0.9297 0.645 0.7212 0.3856 0.107 0.6226 0.3979 0.6086 0.5642 0.8176 0.275 0.574 0.3551 0.6397 0.6528 0.7138 0.6021 0.3888 0.6875 0.6027 0.5656 0.5261 0.5748 0.7248 0.7132 0.5547 0.7536 0.0542 0.1418 0.644 0.4893 0.3423 0.3381 0.6353 0.5196 0.117 0.3625 0.2552 0.0071 0.1292 0.5003 0.4327 0.2379 0.7187 0.4714 0.1152 0.5442 0.4595 0.8001 0.6299 0.4319 0.7153 0.1397 0.5135 0.3831 0.5019 0.5302 0.7449 0.7012 0.3986 0.6636 0.776 0.2856 0.7156 0.797 0.255 0.6953 0.5861 0.7393 0.6661 0.701 0.6664 0.4918 0.726 0.605 0.5464 0.3864 0.7627 0.7361 0.2541 0.306 0.0956 0.4487 0.6372 0.606 0.0027 0.5871 0.5457 0.6794 0.0429 0.8039 0.164 0.5877 0.7621] 2022-08-23 22:26:02 [INFO] [EVAL] Class Recall: [0.8311 0.8876 0.9636 0.8484 0.8666 0.862 0.8518 0.9226 0.7184 0.8402 0.6246 0.8032 0.8619 0.3426 0.4273 0.6235 0.7006 0.5093 0.7709 0.6207 0.8942 0.5434 0.7517 0.6903 0.4619 0.4542 0.6915 0.5198 0.5433 0.5436 0.342 0.6262 0.4109 0.6508 0.3763 0.6582 0.5554 0.5962 0.4538 0.5629 0.2308 0.2251 0.4868 0.3321 0.5574 0.3157 0.3985 0.6666 0.6444 0.6934 0.6919 0.6625 0.2389 0.0913 0.9062 0.6106 0.9275 0.5124 0.6412 0.4369 0.1489 0.5976 0.5484 0.0987 0.7257 0.8107 0.4963 0.5822 0.1422 0.4652 0.6769 0.6759 0.4688 0.4441 0.5641 0.4364 0.6965 0.2938 0.3785 0.5231 0.7979 0.47 0.4283 0.0187 0.1442 0.7268 0.0991 0.1336 0.3068 0.6966 0.5684 0.3284 0.3002 0.0656 0.006 0.0145 0.2064 0.1485 0.4898 0.4458 0.091 0.1454 0.3309 0.6954 0.166 0.8863 0.2645 0.6693 0.1408 0.7315 0.181 0.6286 0.2408 0.8835 0.9892 0.0178 0.5692 0.8456 0.1709 0.3294 0.3691 0.0623 0.368 0.1525 
0.2892 0.2281 0.511 0.6756 0.6325 0.346 0.8068 0.0866 0.3398 0.3439 0.2335 0.2561 0.2236 0.0681 0.2303 0.5145 0.23 0.0002 0.2298 0.6737 0.3817 0.0063 0.4966 0.0023 0.1143 0.1487] 2022-08-23 22:26:03 [INFO] [EVAL] The model with the best validation mIoU (0.3563) was saved at iter 42000. 2022-08-23 22:26:13 [INFO] [TRAIN] epoch: 37, iter: 46050/160000, loss: 0.6543, lr: 0.000863, batch_cost: 0.1986, reader_cost: 0.00391, ips: 40.2920 samples/sec | ETA 06:17:04 2022-08-23 22:26:23 [INFO] [TRAIN] epoch: 37, iter: 46100/160000, loss: 0.5586, lr: 0.000862, batch_cost: 0.2124, reader_cost: 0.00180, ips: 37.6724 samples/sec | ETA 06:43:07 2022-08-23 22:26:33 [INFO] [TRAIN] epoch: 37, iter: 46150/160000, loss: 0.6286, lr: 0.000862, batch_cost: 0.2022, reader_cost: 0.00038, ips: 39.5560 samples/sec | ETA 06:23:45 2022-08-23 22:26:44 [INFO] [TRAIN] epoch: 37, iter: 46200/160000, loss: 0.5893, lr: 0.000862, batch_cost: 0.2102, reader_cost: 0.00076, ips: 38.0673 samples/sec | ETA 06:38:35 2022-08-23 22:26:53 [INFO] [TRAIN] epoch: 37, iter: 46250/160000, loss: 0.5896, lr: 0.000861, batch_cost: 0.1872, reader_cost: 0.00054, ips: 42.7247 samples/sec | ETA 05:54:59 2022-08-23 22:27:03 [INFO] [TRAIN] epoch: 37, iter: 46300/160000, loss: 0.6707, lr: 0.000861, batch_cost: 0.1944, reader_cost: 0.00056, ips: 41.1605 samples/sec | ETA 06:08:18 2022-08-23 22:27:11 [INFO] [TRAIN] epoch: 37, iter: 46350/160000, loss: 0.5996, lr: 0.000860, batch_cost: 0.1724, reader_cost: 0.00066, ips: 46.4089 samples/sec | ETA 05:26:31 2022-08-23 22:27:21 [INFO] [TRAIN] epoch: 37, iter: 46400/160000, loss: 0.6320, lr: 0.000860, batch_cost: 0.1827, reader_cost: 0.00068, ips: 43.7785 samples/sec | ETA 05:45:59 2022-08-23 22:27:29 [INFO] [TRAIN] epoch: 37, iter: 46450/160000, loss: 0.6461, lr: 0.000860, batch_cost: 0.1725, reader_cost: 0.00071, ips: 46.3773 samples/sec | ETA 05:26:27 2022-08-23 22:27:39 [INFO] [TRAIN] epoch: 37, iter: 46500/160000, loss: 0.5784, lr: 0.000859, batch_cost: 0.1983, reader_cost: 0.00059, ips: 40.3515 samples/sec | ETA 06:15:02 2022-08-23 22:27:50 [INFO] [TRAIN] epoch: 37, iter: 46550/160000, loss: 0.5775, lr: 0.000859, batch_cost: 0.2133, reader_cost: 0.00048, ips: 37.5097 samples/sec | ETA 06:43:16 2022-08-23 22:28:00 [INFO] [TRAIN] epoch: 37, iter: 46600/160000, loss: 0.6609, lr: 0.000859, batch_cost: 0.2044, reader_cost: 0.00046, ips: 39.1322 samples/sec | ETA 06:26:22 2022-08-23 22:28:10 [INFO] [TRAIN] epoch: 37, iter: 46650/160000, loss: 0.5995, lr: 0.000858, batch_cost: 0.2051, reader_cost: 0.00070, ips: 39.0067 samples/sec | ETA 06:27:27 2022-08-23 22:28:20 [INFO] [TRAIN] epoch: 37, iter: 46700/160000, loss: 0.6278, lr: 0.000858, batch_cost: 0.1952, reader_cost: 0.00043, ips: 40.9899 samples/sec | ETA 06:08:32 2022-08-23 22:28:31 [INFO] [TRAIN] epoch: 38, iter: 46750/160000, loss: 0.6352, lr: 0.000857, batch_cost: 0.2217, reader_cost: 0.03903, ips: 36.0887 samples/sec | ETA 06:58:24 2022-08-23 22:28:42 [INFO] [TRAIN] epoch: 38, iter: 46800/160000, loss: 0.5978, lr: 0.000857, batch_cost: 0.2109, reader_cost: 0.00065, ips: 37.9252 samples/sec | ETA 06:37:58 2022-08-23 22:28:51 [INFO] [TRAIN] epoch: 38, iter: 46850/160000, loss: 0.6213, lr: 0.000857, batch_cost: 0.1890, reader_cost: 0.00040, ips: 42.3371 samples/sec | ETA 05:56:20 2022-08-23 22:28:59 [INFO] [TRAIN] epoch: 38, iter: 46900/160000, loss: 0.7059, lr: 0.000856, batch_cost: 0.1598, reader_cost: 0.00060, ips: 50.0596 samples/sec | ETA 05:01:14 2022-08-23 22:29:07 [INFO] [TRAIN] epoch: 38, iter: 46950/160000, loss: 0.6017, lr: 0.000856, 
batch_cost: 0.1605, reader_cost: 0.00058, ips: 49.8487 samples/sec | ETA 05:02:22 2022-08-23 22:29:16 [INFO] [TRAIN] epoch: 38, iter: 47000/160000, loss: 0.6055, lr: 0.000856, batch_cost: 0.1783, reader_cost: 0.00045, ips: 44.8637 samples/sec | ETA 05:35:49 2022-08-23 22:29:16 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 183s - batch_cost: 0.1826 - reader cost: 9.6831e-04 2022-08-23 22:32:19 [INFO] [EVAL] #Images: 2000 mIoU: 0.3473 Acc: 0.7618 Kappa: 0.7437 Dice: 0.4826 2022-08-23 22:32:19 [INFO] [EVAL] Class IoU: [0.6768 0.7753 0.9267 0.7199 0.6858 0.7507 0.7692 0.7748 0.5176 0.6395 0.491 0.5617 0.6998 0.3105 0.3016 0.4254 0.4978 0.4164 0.5774 0.4085 0.7377 0.439 0.6064 0.4885 0.336 0.3042 0.4262 0.4171 0.3468 0.2846 0.282 0.4731 0.2658 0.3191 0.3055 0.3836 0.4316 0.4782 0.2773 0.3607 0.1403 0.1288 0.3452 0.273 0.2913 0.1885 0.3114 0.4594 0.6237 0.485 0.5498 0.4704 0.2027 0.2899 0.6467 0.4555 0.8586 0.3612 0.3663 0.2929 0.1209 0.3526 0.2748 0.0943 0.4521 0.657 0.2357 0.3973 0.0923 0.3511 0.4759 0.5629 0.3768 0.2569 0.4251 0.3503 0.5414 0.2369 0.3905 0.4094 0.6407 0.354 0.3255 0.0181 0.1899 0.4765 0.1055 0.0927 0.1852 0.4575 0.3741 0.0488 0.1979 0.0636 0.0022 0.0228 0.1912 0.1008 0.2242 0.2924 0.1401 0.0583 0.1774 0.4685 0.1746 0.4161 0.2198 0.551 0.0561 0.3274 0.1072 0.3254 0.1963 0.6276 0.5019 0.0054 0.3401 0.7228 0.107 0.2275 0.1998 0.0369 0.3443 0.101 0.2606 0.2051 0.4749 0.4296 0.4363 0.2874 0.5352 0.0588 0.2523 0.2307 0.2739 0.1386 0.1563 0.0197 0.1709 0.3668 0.481 0.057 0.1918 0.4785 0.3228 0.0106 0.4147 0.0397 0.1054 0.2092] 2022-08-23 22:32:19 [INFO] [EVAL] Class Precision: [0.7856 0.8463 0.9685 0.8047 0.784 0.8815 0.879 0.8323 0.6932 0.7359 0.6822 0.6854 0.7747 0.502 0.541 0.5523 0.6159 0.6664 0.6976 0.6183 0.8206 0.6275 0.7436 0.6343 0.4958 0.6387 0.5651 0.5722 0.8048 0.5013 0.4816 0.6829 0.512 0.3899 0.4678 0.4767 0.671 0.8046 0.389 0.5105 0.2475 0.3864 0.5833 0.4957 0.5014 0.4879 0.6235 0.6078 0.8041 0.5817 0.7959 0.6289 0.4465 0.6932 0.6896 0.5651 0.8998 0.647 0.7288 0.558 0.1852 0.623 0.4208 0.7376 0.582 0.8248 0.3778 0.4947 0.5098 0.5484 0.6467 0.7271 0.6187 0.3391 0.7266 0.4948 0.6345 0.4201 0.5787 0.5371 0.7717 0.6497 0.7857 0.0362 0.2796 0.7027 0.3943 0.4302 0.3678 0.6838 0.5539 0.0534 0.424 0.3516 0.0113 0.1207 0.542 0.4113 0.2978 0.7414 0.5524 0.075 0.5784 0.6286 0.7466 0.4842 0.3297 0.8188 0.1345 0.5191 0.2815 0.5946 0.5275 0.719 0.5054 0.1645 0.7515 0.7812 0.3326 0.5942 0.6389 0.1691 0.5511 0.6117 0.6387 0.6107 0.7423 0.7725 0.9264 0.4754 0.6224 0.464 0.4305 0.7751 0.4946 0.4194 0.3069 0.0816 0.5167 0.6458 0.6031 0.2374 0.7707 0.6626 0.7799 0.0134 0.8888 0.1909 0.4011 0.6958] 2022-08-23 22:32:19 [INFO] [EVAL] Class Recall: [0.8301 0.9024 0.9555 0.8724 0.8456 0.8349 0.8603 0.9182 0.6713 0.83 0.6367 0.7569 0.8786 0.4486 0.4054 0.6493 0.7218 0.5261 0.7702 0.5462 0.8796 0.5937 0.7668 0.68 0.5103 0.3674 0.6344 0.6062 0.3786 0.397 0.405 0.6063 0.3559 0.6374 0.4683 0.6627 0.5474 0.5411 0.4913 0.5513 0.2447 0.162 0.4581 0.3779 0.4101 0.235 0.3835 0.6529 0.7355 0.7447 0.64 0.6511 0.2708 0.3325 0.9121 0.7013 0.9494 0.4498 0.4241 0.3814 0.2582 0.4482 0.442 0.0976 0.6694 0.7635 0.3852 0.6686 0.1013 0.494 0.6431 0.7137 0.4907 0.5146 0.506 0.5454 0.7867 0.3521 0.5456 0.6326 0.7905 0.4374 0.3572 0.0349 0.3716 0.5968 0.126 0.1057 0.2716 0.5803 0.5354 0.3606 0.2707 0.0721 0.0028 0.0273 0.228 0.1179 0.4756 0.3256 0.158 0.2079 0.2037 0.6478 0.1856 0.7471 0.3973 0.6275 0.0878 0.4699 0.1476 0.4181 0.2381 0.8316 0.9865 0.0055 0.3832 0.9062 0.1362 0.2694 
0.2252 0.045 0.4784 0.1079 0.3056 0.2359 0.5687 0.4918 0.452 0.421 0.7924 0.063 0.3787 0.2472 0.3804 0.1716 0.2416 0.0253 0.2035 0.4592 0.7037 0.0697 0.2035 0.6326 0.3551 0.0484 0.4374 0.0477 0.125 0.2303] 2022-08-23 22:32:19 [INFO] [EVAL] The model with the best validation mIoU (0.3563) was saved at iter 42000. 2022-08-23 22:32:31 [INFO] [TRAIN] epoch: 38, iter: 47050/160000, loss: 0.6090, lr: 0.000855, batch_cost: 0.2274, reader_cost: 0.00339, ips: 35.1801 samples/sec | ETA 07:08:04 2022-08-23 22:32:41 [INFO] [TRAIN] epoch: 38, iter: 47100/160000, loss: 0.6371, lr: 0.000855, batch_cost: 0.2130, reader_cost: 0.00120, ips: 37.5517 samples/sec | ETA 06:40:52 2022-08-23 22:32:51 [INFO] [TRAIN] epoch: 38, iter: 47150/160000, loss: 0.5995, lr: 0.000854, batch_cost: 0.1947, reader_cost: 0.00247, ips: 41.0863 samples/sec | ETA 06:06:13 2022-08-23 22:33:02 [INFO] [TRAIN] epoch: 38, iter: 47200/160000, loss: 0.5904, lr: 0.000854, batch_cost: 0.2266, reader_cost: 0.00162, ips: 35.2986 samples/sec | ETA 07:06:04 2022-08-23 22:33:10 [INFO] [TRAIN] epoch: 38, iter: 47250/160000, loss: 0.6134, lr: 0.000854, batch_cost: 0.1612, reader_cost: 0.00048, ips: 49.6261 samples/sec | ETA 05:02:55 2022-08-23 22:33:19 [INFO] [TRAIN] epoch: 38, iter: 47300/160000, loss: 0.5895, lr: 0.000853, batch_cost: 0.1794, reader_cost: 0.00088, ips: 44.5829 samples/sec | ETA 05:37:02 2022-08-23 22:33:28 [INFO] [TRAIN] epoch: 38, iter: 47350/160000, loss: 0.5869, lr: 0.000853, batch_cost: 0.1726, reader_cost: 0.00146, ips: 46.3514 samples/sec | ETA 05:24:02 2022-08-23 22:33:37 [INFO] [TRAIN] epoch: 38, iter: 47400/160000, loss: 0.6112, lr: 0.000852, batch_cost: 0.1885, reader_cost: 0.00055, ips: 42.4378 samples/sec | ETA 05:53:46 2022-08-23 22:33:46 [INFO] [TRAIN] epoch: 38, iter: 47450/160000, loss: 0.6184, lr: 0.000852, batch_cost: 0.1781, reader_cost: 0.00143, ips: 44.9141 samples/sec | ETA 05:34:07 2022-08-23 22:33:56 [INFO] [TRAIN] epoch: 38, iter: 47500/160000, loss: 0.5648, lr: 0.000852, batch_cost: 0.1868, reader_cost: 0.00056, ips: 42.8189 samples/sec | ETA 05:50:18 2022-08-23 22:34:05 [INFO] [TRAIN] epoch: 38, iter: 47550/160000, loss: 0.5929, lr: 0.000851, batch_cost: 0.1829, reader_cost: 0.00077, ips: 43.7490 samples/sec | ETA 05:42:42 2022-08-23 22:34:13 [INFO] [TRAIN] epoch: 38, iter: 47600/160000, loss: 0.5914, lr: 0.000851, batch_cost: 0.1697, reader_cost: 0.00356, ips: 47.1377 samples/sec | ETA 05:17:56 2022-08-23 22:34:22 [INFO] [TRAIN] epoch: 38, iter: 47650/160000, loss: 0.6056, lr: 0.000851, batch_cost: 0.1798, reader_cost: 0.00054, ips: 44.5010 samples/sec | ETA 05:36:37 2022-08-23 22:34:32 [INFO] [TRAIN] epoch: 38, iter: 47700/160000, loss: 0.6437, lr: 0.000850, batch_cost: 0.1863, reader_cost: 0.00093, ips: 42.9333 samples/sec | ETA 05:48:45 2022-08-23 22:34:40 [INFO] [TRAIN] epoch: 38, iter: 47750/160000, loss: 0.6092, lr: 0.000850, batch_cost: 0.1627, reader_cost: 0.00656, ips: 49.1838 samples/sec | ETA 05:04:18 2022-08-23 22:34:48 [INFO] [TRAIN] epoch: 38, iter: 47800/160000, loss: 0.6065, lr: 0.000849, batch_cost: 0.1667, reader_cost: 0.00104, ips: 47.9920 samples/sec | ETA 05:11:43 2022-08-23 22:34:56 [INFO] [TRAIN] epoch: 38, iter: 47850/160000, loss: 0.5731, lr: 0.000849, batch_cost: 0.1686, reader_cost: 0.00071, ips: 47.4502 samples/sec | ETA 05:15:08 2022-08-23 22:35:05 [INFO] [TRAIN] epoch: 38, iter: 47900/160000, loss: 0.5664, lr: 0.000849, batch_cost: 0.1693, reader_cost: 0.00079, ips: 47.2526 samples/sec | ETA 05:16:18 2022-08-23 22:35:14 [INFO] [TRAIN] epoch: 38, iter: 47950/160000, loss: 
0.6023, lr: 0.000848, batch_cost: 0.1827, reader_cost: 0.00032, ips: 43.7842 samples/sec | ETA 05:41:13 2022-08-23 22:35:27 [INFO] [TRAIN] epoch: 39, iter: 48000/160000, loss: 0.5950, lr: 0.000848, batch_cost: 0.2636, reader_cost: 0.07551, ips: 30.3526 samples/sec | ETA 08:11:59 2022-08-23 22:35:27 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 189s - batch_cost: 0.1890 - reader cost: 7.5742e-04 2022-08-23 22:38:37 [INFO] [EVAL] #Images: 2000 mIoU: 0.3533 Acc: 0.7647 Kappa: 0.7470 Dice: 0.4885 2022-08-23 22:38:37 [INFO] [EVAL] Class IoU: [0.6799 0.7786 0.9294 0.7298 0.6863 0.7606 0.7805 0.7928 0.5136 0.633 0.4696 0.5682 0.7082 0.303 0.2861 0.4258 0.4969 0.439 0.607 0.4146 0.7602 0.4533 0.6242 0.4829 0.3192 0.3736 0.5426 0.4246 0.4405 0.2614 0.2535 0.4982 0.2755 0.3463 0.2682 0.3574 0.4533 0.4689 0.2721 0.3539 0.1445 0.1727 0.3472 0.2637 0.2836 0.2108 0.2539 0.4667 0.6603 0.4123 0.5716 0.4143 0.212 0.2778 0.6623 0.473 0.8693 0.4265 0.3876 0.3402 0.1455 0.3287 0.3003 0.1953 0.4376 0.5883 0.2841 0.41 0.1078 0.3566 0.4996 0.5415 0.4163 0.2512 0.463 0.368 0.6043 0.2163 0.2241 0.4101 0.6637 0.3718 0.2579 0.0521 0.2486 0.5014 0.1044 0.0771 0.1882 0.4871 0.3774 0.1493 0.2009 0.1024 0.0015 0.0304 0.1961 0.188 0.2143 0.3655 0.2451 0.042 0.2276 0.1276 0.1441 0.5727 0.1806 0.5223 0.0968 0.4034 0.1096 0.3759 0.1629 0.5957 0.521 0.0329 0.417 0.7032 0.1099 0.3678 0.4364 0.0229 0.3878 0.146 0.2569 0.1713 0.4717 0.4932 0.5739 0.3665 0.5635 0.021 0.221 0.2408 0.2689 0.1769 0.1562 0.0366 0.1782 0.3799 0.0771 0.0297 0.246 0.0564 0.3208 0. 0.4113 0.0267 0.0986 0.1955] 2022-08-23 22:38:37 [INFO] [EVAL] Class Precision: [0.7888 0.8657 0.9678 0.8388 0.769 0.8571 0.8972 0.8555 0.6572 0.7649 0.6478 0.712 0.7969 0.4516 0.5537 0.5471 0.6095 0.7012 0.7607 0.5608 0.8346 0.7143 0.8029 0.6271 0.5466 0.5348 0.5868 0.7434 0.6991 0.4265 0.4555 0.6579 0.5002 0.4568 0.5122 0.4449 0.6526 0.7994 0.4151 0.5486 0.3616 0.3649 0.5421 0.5786 0.3868 0.4221 0.6415 0.6743 0.729 0.4589 0.716 0.5171 0.3579 0.6522 0.712 0.7414 0.931 0.5839 0.6581 0.6479 0.2099 0.5899 0.3848 0.5539 0.5439 0.6416 0.4228 0.5625 0.5794 0.6033 0.6483 0.656 0.5802 0.3099 0.6769 0.5245 0.7027 0.3745 0.4556 0.5128 0.8059 0.6428 0.8564 0.1512 0.3081 0.696 0.5292 0.4603 0.8107 0.6703 0.557 0.1882 0.3623 0.3622 0.0033 0.0907 0.8045 0.4467 0.2703 0.5969 0.6728 0.0769 0.5151 0.9545 0.6906 0.6804 0.3451 0.6436 0.188 0.5434 0.4551 0.5228 0.5589 0.8135 0.526 0.3253 0.7587 0.7226 0.4427 0.6294 0.7493 0.2532 0.6949 0.6337 0.7044 0.6762 0.7553 0.624 0.9817 0.6769 0.7589 0.2201 0.5694 0.7526 0.7643 0.3409 0.281 0.073 0.3696 0.6426 0.4729 0.0415 0.5139 0.3144 0.7288 0. 
0.8191 0.2642 0.6339 0.8562] 2022-08-23 22:38:37 [INFO] [EVAL] Class Recall: [0.8313 0.8855 0.9591 0.8488 0.8646 0.8711 0.8572 0.9153 0.7016 0.786 0.6307 0.7378 0.8641 0.4793 0.3718 0.6577 0.729 0.5399 0.7502 0.6139 0.8951 0.5537 0.7372 0.6773 0.4341 0.5535 0.8781 0.4975 0.5436 0.4032 0.3637 0.6725 0.3802 0.5887 0.3603 0.645 0.5975 0.5314 0.4413 0.4993 0.194 0.2469 0.4913 0.3263 0.5152 0.2963 0.2958 0.6025 0.8752 0.8023 0.7391 0.6758 0.342 0.3261 0.9045 0.5664 0.9292 0.6128 0.4853 0.4173 0.3217 0.4261 0.5777 0.2317 0.6911 0.8763 0.464 0.6019 0.117 0.4659 0.6853 0.7562 0.5957 0.5701 0.5944 0.5521 0.8118 0.3386 0.306 0.672 0.79 0.4686 0.2696 0.0736 0.5627 0.6419 0.1151 0.0847 0.1968 0.6406 0.5393 0.4192 0.3108 0.1249 0.0029 0.0438 0.2059 0.245 0.5083 0.4852 0.2783 0.0847 0.2897 0.1284 0.154 0.7834 0.2747 0.7348 0.1663 0.6103 0.1262 0.5722 0.1869 0.6899 0.9822 0.0353 0.4807 0.9632 0.1275 0.4695 0.511 0.0246 0.4674 0.1594 0.288 0.1866 0.5567 0.7017 0.5801 0.4441 0.6864 0.0227 0.2653 0.2616 0.2932 0.2689 0.2601 0.0685 0.256 0.4817 0.0844 0.0945 0.3206 0.0643 0.3642 0. 0.4524 0.0288 0.1046 0.2021] 2022-08-23 22:38:37 [INFO] [EVAL] The model with the best validation mIoU (0.3563) was saved at iter 42000. 2022-08-23 22:38:47 [INFO] [TRAIN] epoch: 39, iter: 48050/160000, loss: 0.5827, lr: 0.000848, batch_cost: 0.2131, reader_cost: 0.00378, ips: 37.5378 samples/sec | ETA 06:37:38 2022-08-23 22:38:58 [INFO] [TRAIN] epoch: 39, iter: 48100/160000, loss: 0.6572, lr: 0.000847, batch_cost: 0.2082, reader_cost: 0.00183, ips: 38.4255 samples/sec | ETA 06:28:17 2022-08-23 22:39:08 [INFO] [TRAIN] epoch: 39, iter: 48150/160000, loss: 0.5764, lr: 0.000847, batch_cost: 0.2010, reader_cost: 0.00475, ips: 39.8017 samples/sec | ETA 06:14:41 2022-08-23 22:39:17 [INFO] [TRAIN] epoch: 39, iter: 48200/160000, loss: 0.5587, lr: 0.000846, batch_cost: 0.1792, reader_cost: 0.00118, ips: 44.6328 samples/sec | ETA 05:33:59 2022-08-23 22:39:26 [INFO] [TRAIN] epoch: 39, iter: 48250/160000, loss: 0.5899, lr: 0.000846, batch_cost: 0.1840, reader_cost: 0.00060, ips: 43.4867 samples/sec | ETA 05:42:38 2022-08-23 22:39:35 [INFO] [TRAIN] epoch: 39, iter: 48300/160000, loss: 0.6311, lr: 0.000846, batch_cost: 0.1842, reader_cost: 0.00120, ips: 43.4355 samples/sec | ETA 05:42:53 2022-08-23 22:39:45 [INFO] [TRAIN] epoch: 39, iter: 48350/160000, loss: 0.6626, lr: 0.000845, batch_cost: 0.1905, reader_cost: 0.00065, ips: 41.9925 samples/sec | ETA 05:54:30 2022-08-23 22:39:53 [INFO] [TRAIN] epoch: 39, iter: 48400/160000, loss: 0.6186, lr: 0.000845, batch_cost: 0.1627, reader_cost: 0.00053, ips: 49.1741 samples/sec | ETA 05:02:35 2022-08-23 22:40:02 [INFO] [TRAIN] epoch: 39, iter: 48450/160000, loss: 0.6388, lr: 0.000845, batch_cost: 0.1827, reader_cost: 0.00031, ips: 43.7942 samples/sec | ETA 05:39:37 2022-08-23 22:40:11 [INFO] [TRAIN] epoch: 39, iter: 48500/160000, loss: 0.6081, lr: 0.000844, batch_cost: 0.1699, reader_cost: 0.00055, ips: 47.0886 samples/sec | ETA 05:15:43 2022-08-23 22:40:21 [INFO] [TRAIN] epoch: 39, iter: 48550/160000, loss: 0.5842, lr: 0.000844, batch_cost: 0.2075, reader_cost: 0.00055, ips: 38.5524 samples/sec | ETA 06:25:26 2022-08-23 22:40:29 [INFO] [TRAIN] epoch: 39, iter: 48600/160000, loss: 0.6382, lr: 0.000843, batch_cost: 0.1672, reader_cost: 0.00055, ips: 47.8555 samples/sec | ETA 05:10:22 2022-08-23 22:40:38 [INFO] [TRAIN] epoch: 39, iter: 48650/160000, loss: 0.5836, lr: 0.000843, batch_cost: 0.1708, reader_cost: 0.00078, ips: 46.8431 samples/sec | ETA 05:16:56 2022-08-23 22:40:46 [INFO] [TRAIN] epoch: 39, 
iter: 48700/160000, loss: 0.6261, lr: 0.000843, batch_cost: 0.1628, reader_cost: 0.00043, ips: 49.1346 samples/sec | ETA 05:02:01 2022-08-23 22:40:54 [INFO] [TRAIN] epoch: 39, iter: 48750/160000, loss: 0.5875, lr: 0.000842, batch_cost: 0.1628, reader_cost: 0.00069, ips: 49.1291 samples/sec | ETA 05:01:55 2022-08-23 22:41:03 [INFO] [TRAIN] epoch: 39, iter: 48800/160000, loss: 0.5716, lr: 0.000842, batch_cost: 0.1846, reader_cost: 0.00044, ips: 43.3394 samples/sec | ETA 05:42:06 2022-08-23 22:41:12 [INFO] [TRAIN] epoch: 39, iter: 48850/160000, loss: 0.5913, lr: 0.000842, batch_cost: 0.1638, reader_cost: 0.00458, ips: 48.8490 samples/sec | ETA 05:03:23 2022-08-23 22:41:20 [INFO] [TRAIN] epoch: 39, iter: 48900/160000, loss: 0.5813, lr: 0.000841, batch_cost: 0.1738, reader_cost: 0.00054, ips: 46.0233 samples/sec | ETA 05:21:51 2022-08-23 22:41:31 [INFO] [TRAIN] epoch: 39, iter: 48950/160000, loss: 0.5903, lr: 0.000841, batch_cost: 0.2101, reader_cost: 0.00106, ips: 38.0765 samples/sec | ETA 06:28:51 2022-08-23 22:41:41 [INFO] [TRAIN] epoch: 39, iter: 49000/160000, loss: 0.6290, lr: 0.000840, batch_cost: 0.2132, reader_cost: 0.00055, ips: 37.5165 samples/sec | ETA 06:34:29 2022-08-23 22:41:41 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 180s - batch_cost: 0.1802 - reader cost: 7.5101e-04 2022-08-23 22:44:42 [INFO] [EVAL] #Images: 2000 mIoU: 0.3506 Acc: 0.7666 Kappa: 0.7488 Dice: 0.4864 2022-08-23 22:44:42 [INFO] [EVAL] Class IoU: [0.6802 0.7809 0.9308 0.7272 0.6796 0.7625 0.7734 0.7767 0.514 0.6334 0.4937 0.5651 0.6927 0.2984 0.2974 0.4303 0.4999 0.4278 0.5979 0.4238 0.7444 0.4668 0.6208 0.4966 0.2859 0.4892 0.4573 0.4569 0.4389 0.2477 0.2845 0.5151 0.2771 0.3678 0.2872 0.3887 0.4547 0.5294 0.2768 0.3794 0.1132 0.1237 0.3657 0.2618 0.2893 0.1562 0.307 0.4955 0.4016 0.5317 0.5491 0.3533 0.2075 0.2735 0.6846 0.4085 0.8676 0.2548 0.4056 0.3356 0.1324 0.2379 0.2801 0.1861 0.4813 0.6868 0.2159 0.4056 0.0693 0.3179 0.4774 0.4935 0.4018 0.2336 0.4734 0.3575 0.4998 0.207 0.2191 0.4047 0.7065 0.3774 0.3517 0.1568 0.1965 0.5161 0.0696 0.0847 0.3552 0.491 0.4 0.1226 0.1714 0.0943 0.016 0.0347 0.1795 0.0959 0.248 0.3643 0.227 0.057 0.1263 0.1491 0.1861 0.5285 0.2949 0.5488 0.041 0.3658 0.081 0.3686 0.1764 0.5148 0.5259 0.0148 0.2538 0.6535 0.1788 0.3981 0.4441 0.0152 0.3793 0.0947 0.2736 0.2525 0.4506 0.4342 0.5057 0.3555 0.4732 0.0356 0.2812 0.2681 0.2267 0.1831 0.1529 0.0469 0.1921 0.3638 0.1517 0.0009 0.2186 0.4259 0.2789 0. 
0.4212 0.0179 0.119 0.1913] 2022-08-23 22:44:42 [INFO] [EVAL] Class Precision: [0.7835 0.8558 0.9655 0.8204 0.7713 0.8639 0.8786 0.8195 0.664 0.7218 0.6924 0.7062 0.7558 0.5358 0.5318 0.6035 0.6167 0.7131 0.7949 0.5869 0.82 0.6564 0.7498 0.6632 0.4956 0.6045 0.6074 0.7141 0.7265 0.3792 0.4667 0.6743 0.44 0.4941 0.4934 0.4611 0.6644 0.8271 0.4167 0.566 0.2952 0.4288 0.631 0.5859 0.3933 0.3759 0.6796 0.6872 0.6343 0.6836 0.7261 0.4441 0.3747 0.5561 0.7417 0.4742 0.9417 0.7176 0.7955 0.5752 0.2268 0.4853 0.3819 0.5198 0.6372 0.8009 0.4598 0.5581 0.2045 0.6039 0.654 0.762 0.6122 0.39 0.6693 0.5017 0.8322 0.2923 0.6771 0.6381 0.924 0.6715 0.776 0.317 0.296 0.7019 0.5366 0.5245 0.7924 0.689 0.5504 0.1744 0.3529 0.3102 0.0457 0.144 0.5672 0.4513 0.3331 0.7043 0.7525 0.1943 0.5698 0.5511 0.6512 0.5727 0.542 0.8262 0.1439 0.5167 0.3517 0.5059 0.5076 0.8222 0.5417 0.2442 0.893 0.7352 0.2401 0.5948 0.6686 0.2884 0.7354 0.5745 0.638 0.6888 0.8714 0.5107 0.9312 0.5108 0.5069 0.3879 0.7056 0.8486 0.8079 0.3682 0.2992 0.1018 0.5236 0.7164 0.3672 0.0025 0.6646 0.6385 0.7072 0. 0.8289 0.3543 0.4544 0.8931] 2022-08-23 22:44:42 [INFO] [EVAL] Class Recall: [0.8376 0.8993 0.9628 0.8649 0.851 0.8665 0.8659 0.9371 0.6947 0.8379 0.6325 0.7388 0.8925 0.4024 0.4029 0.5999 0.7251 0.5167 0.707 0.604 0.8898 0.6177 0.7831 0.664 0.4033 0.7194 0.6492 0.5591 0.5258 0.4168 0.4215 0.6857 0.428 0.59 0.4073 0.7123 0.5902 0.5953 0.4519 0.5351 0.1552 0.148 0.4652 0.3212 0.5225 0.2108 0.359 0.6399 0.5226 0.7053 0.6926 0.6333 0.3175 0.3499 0.8989 0.7465 0.9168 0.2832 0.4528 0.4462 0.2414 0.3181 0.5125 0.2248 0.6629 0.8281 0.2893 0.5976 0.0949 0.4016 0.6387 0.5834 0.539 0.3681 0.6179 0.5543 0.5558 0.415 0.2446 0.5252 0.7501 0.4629 0.3914 0.2367 0.3691 0.6609 0.074 0.0918 0.3917 0.6308 0.5941 0.292 0.2499 0.1193 0.024 0.0437 0.208 0.1085 0.4925 0.4301 0.2454 0.0747 0.1396 0.1698 0.2067 0.8727 0.3928 0.6204 0.0543 0.5562 0.0952 0.5759 0.2128 0.5793 0.9474 0.0155 0.2617 0.8548 0.4118 0.5462 0.5695 0.0158 0.4393 0.1018 0.3238 0.285 0.4828 0.7433 0.5253 0.5391 0.8766 0.0378 0.3186 0.2816 0.2397 0.2671 0.2382 0.08 0.2328 0.425 0.2054 0.0014 0.2457 0.5612 0.3153 0. 0.4613 0.0185 0.1388 0.1958] 2022-08-23 22:44:42 [INFO] [EVAL] The model with the best validation mIoU (0.3563) was saved at iter 42000. 
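The [EVAL] blocks above report a scalar mIoU / Acc / Kappa / Dice plus per-class IoU, Precision and Recall vectors over the ADE20K classes. For reference, the sketch below shows the standard confusion-matrix arithmetic those numbers correspond to; it is an illustration only and may differ in details (ignore_index handling, averaging) from PaddleSeg's own evaluator.

import numpy as np

def confusion_matrix(pred, label, num_classes, ignore_index=255):
    # Histogram of (label, prediction) pairs over all evaluated pixels.
    mask = label != ignore_index
    hist = np.bincount(
        label[mask].astype(np.int64) * num_classes + pred[mask].astype(np.int64),
        minlength=num_classes ** 2,
    ).reshape(num_classes, num_classes)
    return hist

def metrics_from_hist(hist):
    hist = hist.astype(np.float64)
    tp = np.diag(hist)
    fp = hist.sum(axis=0) - tp           # predicted as the class, labelled as something else
    fn = hist.sum(axis=1) - tp           # labelled as the class, predicted as something else
    iou = tp / np.maximum(tp + fp + fn, 1)            # "Class IoU"
    precision = tp / np.maximum(tp + fp, 1)           # "Class Precision"
    recall = tp / np.maximum(tp + fn, 1)              # "Class Recall"
    dice = 2 * tp / np.maximum(2 * tp + fp + fn, 1)   # per-class Dice; the log reports a mean
    acc = tp.sum() / hist.sum()                       # "Acc" (overall pixel accuracy)
    pe = (hist.sum(axis=0) * hist.sum(axis=1)).sum() / hist.sum() ** 2
    kappa = (acc - pe) / (1 - pe)                     # "Kappa" (Cohen's kappa over pixels)
    return {"mIoU": iou.mean(), "Acc": acc, "Kappa": kappa, "Dice": dice.mean(),
            "Class IoU": iou, "Class Precision": precision, "Class Recall": recall}

With these definitions the printed mIoU is simply the mean of the Class IoU vector, and a per-class entry of 0. generally means the class produced no true positives over the 2000 validation images.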
2022-08-23 22:44:52 [INFO] [TRAIN] epoch: 39, iter: 49050/160000, loss: 0.5863, lr: 0.000840, batch_cost: 0.1979, reader_cost: 0.00406, ips: 40.4244 samples/sec | ETA 06:05:57 2022-08-23 22:45:02 [INFO] [TRAIN] epoch: 39, iter: 49100/160000, loss: 0.6042, lr: 0.000840, batch_cost: 0.1910, reader_cost: 0.00521, ips: 41.8782 samples/sec | ETA 05:53:05 2022-08-23 22:45:11 [INFO] [TRAIN] epoch: 39, iter: 49150/160000, loss: 0.6292, lr: 0.000839, batch_cost: 0.1926, reader_cost: 0.00087, ips: 41.5319 samples/sec | ETA 05:55:52 2022-08-23 22:45:22 [INFO] [TRAIN] epoch: 39, iter: 49200/160000, loss: 0.5774, lr: 0.000839, batch_cost: 0.2094, reader_cost: 0.00053, ips: 38.1973 samples/sec | ETA 06:26:45 2022-08-23 22:45:32 [INFO] [TRAIN] epoch: 39, iter: 49250/160000, loss: 0.6147, lr: 0.000838, batch_cost: 0.2110, reader_cost: 0.00082, ips: 37.9095 samples/sec | ETA 06:29:31 2022-08-23 22:45:43 [INFO] [TRAIN] epoch: 40, iter: 49300/160000, loss: 0.6518, lr: 0.000838, batch_cost: 0.2119, reader_cost: 0.05083, ips: 37.7593 samples/sec | ETA 06:30:53 2022-08-23 22:45:52 [INFO] [TRAIN] epoch: 40, iter: 49350/160000, loss: 0.5917, lr: 0.000838, batch_cost: 0.1821, reader_cost: 0.00066, ips: 43.9422 samples/sec | ETA 05:35:44 2022-08-23 22:46:00 [INFO] [TRAIN] epoch: 40, iter: 49400/160000, loss: 0.6181, lr: 0.000837, batch_cost: 0.1639, reader_cost: 0.00036, ips: 48.8022 samples/sec | ETA 05:02:10 2022-08-23 22:46:08 [INFO] [TRAIN] epoch: 40, iter: 49450/160000, loss: 0.5981, lr: 0.000837, batch_cost: 0.1631, reader_cost: 0.00567, ips: 49.0477 samples/sec | ETA 05:00:31 2022-08-23 22:46:17 [INFO] [TRAIN] epoch: 40, iter: 49500/160000, loss: 0.6373, lr: 0.000837, batch_cost: 0.1768, reader_cost: 0.00051, ips: 45.2611 samples/sec | ETA 05:25:31 2022-08-23 22:46:26 [INFO] [TRAIN] epoch: 40, iter: 49550/160000, loss: 0.6086, lr: 0.000836, batch_cost: 0.1710, reader_cost: 0.00071, ips: 46.7782 samples/sec | ETA 05:14:49 2022-08-23 22:46:33 [INFO] [TRAIN] epoch: 40, iter: 49600/160000, loss: 0.6051, lr: 0.000836, batch_cost: 0.1529, reader_cost: 0.00041, ips: 52.3084 samples/sec | ETA 04:41:24 2022-08-23 22:46:43 [INFO] [TRAIN] epoch: 40, iter: 49650/160000, loss: 0.6140, lr: 0.000835, batch_cost: 0.1861, reader_cost: 0.00064, ips: 42.9907 samples/sec | ETA 05:42:14 2022-08-23 22:46:51 [INFO] [TRAIN] epoch: 40, iter: 49700/160000, loss: 0.5880, lr: 0.000835, batch_cost: 0.1664, reader_cost: 0.00087, ips: 48.0809 samples/sec | ETA 05:05:52 2022-08-23 22:46:59 [INFO] [TRAIN] epoch: 40, iter: 49750/160000, loss: 0.5756, lr: 0.000835, batch_cost: 0.1645, reader_cost: 0.00064, ips: 48.6283 samples/sec | ETA 05:02:17 2022-08-23 22:47:08 [INFO] [TRAIN] epoch: 40, iter: 49800/160000, loss: 0.6244, lr: 0.000834, batch_cost: 0.1844, reader_cost: 0.00067, ips: 43.3785 samples/sec | ETA 05:38:43 2022-08-23 22:47:16 [INFO] [TRAIN] epoch: 40, iter: 49850/160000, loss: 0.5812, lr: 0.000834, batch_cost: 0.1608, reader_cost: 0.00059, ips: 49.7494 samples/sec | ETA 04:55:12 2022-08-23 22:47:24 [INFO] [TRAIN] epoch: 40, iter: 49900/160000, loss: 0.5896, lr: 0.000834, batch_cost: 0.1519, reader_cost: 0.00047, ips: 52.6595 samples/sec | ETA 04:38:46 2022-08-23 22:47:33 [INFO] [TRAIN] epoch: 40, iter: 49950/160000, loss: 0.5823, lr: 0.000833, batch_cost: 0.1795, reader_cost: 0.00042, ips: 44.5560 samples/sec | ETA 05:29:19 2022-08-23 22:47:42 [INFO] [TRAIN] epoch: 40, iter: 50000/160000, loss: 0.5728, lr: 0.000833, batch_cost: 0.1721, reader_cost: 0.00066, ips: 46.4926 samples/sec | ETA 05:15:27 2022-08-23 22:47:42 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 178s - batch_cost: 0.1783 - reader cost: 6.5670e-04 2022-08-23 22:50:40 [INFO] [EVAL] #Images: 2000 mIoU: 0.3532 Acc: 0.7629 Kappa: 0.7450 Dice: 0.4886 2022-08-23 22:50:40 [INFO] [EVAL] Class IoU: [0.6767 0.7773 0.926 0.7364 0.671 0.751 0.7705 0.7935 0.5127 0.636 0.4825 0.5679 0.6881 0.3024 0.3177 0.4281 0.496 0.4108 0.5961 0.404 0.7528 0.4729 0.5837 0.5172 0.311 0.3356 0.4722 0.4319 0.441 0.3018 0.2902 0.5369 0.2423 0.3251 0.3628 0.3803 0.4546 0.5167 0.2692 0.3337 0.1242 0.1258 0.2881 0.2678 0.2689 0.2174 0.3397 0.4784 0.5112 0.5235 0.5477 0.4077 0.1908 0.1036 0.6888 0.3444 0.8668 0.4376 0.4401 0.2632 0.1288 0.285 0.2498 0.0853 0.4174 0.6637 0.1838 0.3309 0.0904 0.3198 0.4847 0.5465 0.3488 0.2346 0.4506 0.3374 0.5253 0.2849 0.2052 0.2079 0.6864 0.3568 0.3653 0.0506 0.2199 0.5301 0.0907 0.1171 0.2762 0.5013 0.3893 0.0674 0.1735 0.0689 0.0059 0.0453 0.1892 0.1364 0.2865 0.3437 0.1842 0.1112 0.2149 0.2771 0.1862 0.607 0.1993 0.561 0.04 0.3106 0.1318 0.2535 0.1767 0.6173 0.5825 0.0047 0.354 0.7255 0.1439 0.3738 0.2576 0.0394 0.394 0.1185 0.2723 0.229 0.5143 0.4605 0.7063 0.3985 0.5715 0.1507 0.2815 0.313 0.268 0.1755 0.1384 0.0217 0.1948 0.3769 0.2501 0.0277 0.28 0.5065 0.3422 0.0051 0.3062 0.0231 0.1185 0.1916] 2022-08-23 22:50:40 [INFO] [EVAL] Class Precision: [0.7887 0.8465 0.9542 0.8503 0.755 0.8654 0.8807 0.8584 0.6424 0.7577 0.6959 0.7187 0.7601 0.5612 0.5011 0.6068 0.6046 0.6822 0.7212 0.617 0.8226 0.7087 0.7139 0.6693 0.4479 0.5756 0.5801 0.6759 0.7436 0.455 0.4642 0.6631 0.5465 0.4925 0.4973 0.5034 0.6535 0.8429 0.3829 0.6062 0.1801 0.3444 0.372 0.5177 0.3487 0.6235 0.7243 0.7204 0.6792 0.6319 0.7179 0.5089 0.4226 0.5064 0.7448 0.6908 0.9141 0.5856 0.7936 0.4105 0.1859 0.566 0.3466 0.5485 0.4714 0.7575 0.376 0.3959 0.3373 0.6685 0.6515 0.6781 0.6367 0.2719 0.6078 0.5597 0.6739 0.5839 0.6289 0.2272 0.8549 0.5128 0.7842 0.1297 0.3166 0.6814 0.2796 0.3782 0.7469 0.7771 0.5299 0.0748 0.2213 0.3233 0.0165 0.1375 0.8271 0.4707 0.4346 0.7367 0.7394 0.2286 0.6291 0.6086 0.7301 0.793 0.348 0.7902 0.1189 0.5323 0.2499 0.8232 0.4985 0.8316 0.6034 0.0594 0.7606 0.8572 0.4507 0.5384 0.6753 0.2482 0.7795 0.6858 0.6973 0.6112 0.8236 0.559 0.9675 0.6676 0.6835 0.4712 0.6404 0.8498 0.6572 0.2802 0.3234 0.1222 0.5364 0.6043 0.3926 0.1833 0.6667 0.7534 0.6146 0.0091 0.9434 0.2352 0.4526 0.8996] 2022-08-23 22:50:40 [INFO] [EVAL] Class Recall: [0.8266 0.9049 0.9691 0.8461 0.8578 0.8503 0.8603 0.913 0.7175 0.7984 0.6115 0.7303 0.879 0.396 0.4647 0.5925 0.7342 0.508 0.7745 0.5393 0.8987 0.5869 0.762 0.6946 0.5045 0.4459 0.7174 0.5448 0.5201 0.4727 0.4364 0.7384 0.3033 0.4889 0.573 0.6086 0.5991 0.5718 0.4756 0.4261 0.2861 0.1654 0.5608 0.3568 0.5403 0.2502 0.3901 0.5874 0.674 0.7532 0.6979 0.6721 0.258 0.1153 0.9016 0.4072 0.9437 0.6339 0.497 0.4233 0.2953 0.3647 0.4722 0.0917 0.7846 0.8428 0.2644 0.6684 0.1099 0.3801 0.6544 0.7378 0.4355 0.6308 0.6353 0.4593 0.7043 0.3575 0.2335 0.7097 0.777 0.5397 0.4061 0.0766 0.4185 0.7048 0.1184 0.1451 0.3047 0.5856 0.5946 0.4077 0.4452 0.0806 0.0092 0.0633 0.197 0.1612 0.4568 0.3919 0.197 0.1779 0.246 0.3373 0.2 0.7213 0.3182 0.6592 0.0569 0.4271 0.2182 0.2681 0.2149 0.7054 0.9437 0.005 0.3984 0.8253 0.1745 0.5501 0.294 0.0448 0.4434 0.1253 0.3088 0.268 0.5779 0.7233 0.7235 0.4971 0.7771 0.1814 0.3343 0.3313 0.3115 0.3197 0.1947 0.0256 0.2342 0.5004 0.408 0.0316 0.3255 0.6072 0.4357 0.0113 0.3119 0.025 0.1384 0.1958] 2022-08-23 22:50:41 [INFO] [EVAL] The model with the best validation mIoU (0.3563) 
was saved at iter 42000. 2022-08-23 22:50:50 [INFO] [TRAIN] epoch: 40, iter: 50050/160000, loss: 0.6012, lr: 0.000832, batch_cost: 0.1797, reader_cost: 0.00483, ips: 44.5190 samples/sec | ETA 05:29:17 2022-08-23 22:50:59 [INFO] [TRAIN] epoch: 40, iter: 50100/160000, loss: 0.5979, lr: 0.000832, batch_cost: 0.1799, reader_cost: 0.00314, ips: 44.4734 samples/sec | ETA 05:29:29 2022-08-23 22:51:09 [INFO] [TRAIN] epoch: 40, iter: 50150/160000, loss: 0.6175, lr: 0.000832, batch_cost: 0.2189, reader_cost: 0.00041, ips: 36.5541 samples/sec | ETA 06:40:41 2022-08-23 22:51:21 [INFO] [TRAIN] epoch: 40, iter: 50200/160000, loss: 0.5838, lr: 0.000831, batch_cost: 0.2336, reader_cost: 0.00036, ips: 34.2480 samples/sec | ETA 07:07:28 2022-08-23 22:51:31 [INFO] [TRAIN] epoch: 40, iter: 50250/160000, loss: 0.5503, lr: 0.000831, batch_cost: 0.1966, reader_cost: 0.00050, ips: 40.6946 samples/sec | ETA 05:59:35 2022-08-23 22:51:41 [INFO] [TRAIN] epoch: 40, iter: 50300/160000, loss: 0.6309, lr: 0.000831, batch_cost: 0.1977, reader_cost: 0.00072, ips: 40.4742 samples/sec | ETA 06:01:22 2022-08-23 22:51:50 [INFO] [TRAIN] epoch: 40, iter: 50350/160000, loss: 0.6209, lr: 0.000830, batch_cost: 0.1856, reader_cost: 0.00052, ips: 43.0945 samples/sec | ETA 05:39:15 2022-08-23 22:51:59 [INFO] [TRAIN] epoch: 40, iter: 50400/160000, loss: 0.6468, lr: 0.000830, batch_cost: 0.1749, reader_cost: 0.00064, ips: 45.7389 samples/sec | ETA 05:19:29 2022-08-23 22:52:08 [INFO] [TRAIN] epoch: 40, iter: 50450/160000, loss: 0.5998, lr: 0.000829, batch_cost: 0.1749, reader_cost: 0.00059, ips: 45.7276 samples/sec | ETA 05:19:25 2022-08-23 22:52:16 [INFO] [TRAIN] epoch: 40, iter: 50500/160000, loss: 0.5712, lr: 0.000829, batch_cost: 0.1585, reader_cost: 0.00051, ips: 50.4718 samples/sec | ETA 04:49:16 2022-08-23 22:52:28 [INFO] [TRAIN] epoch: 41, iter: 50550/160000, loss: 0.5764, lr: 0.000829, batch_cost: 0.2509, reader_cost: 0.07702, ips: 31.8853 samples/sec | ETA 07:37:40 2022-08-23 22:52:36 [INFO] [TRAIN] epoch: 41, iter: 50600/160000, loss: 0.5718, lr: 0.000828, batch_cost: 0.1581, reader_cost: 0.00042, ips: 50.6094 samples/sec | ETA 04:48:13 2022-08-23 22:52:46 [INFO] [TRAIN] epoch: 41, iter: 50650/160000, loss: 0.5665, lr: 0.000828, batch_cost: 0.1949, reader_cost: 0.00071, ips: 41.0455 samples/sec | ETA 05:55:12 2022-08-23 22:52:54 [INFO] [TRAIN] epoch: 41, iter: 50700/160000, loss: 0.6061, lr: 0.000828, batch_cost: 0.1690, reader_cost: 0.00052, ips: 47.3304 samples/sec | ETA 05:07:54 2022-08-23 22:53:03 [INFO] [TRAIN] epoch: 41, iter: 50750/160000, loss: 0.5825, lr: 0.000827, batch_cost: 0.1721, reader_cost: 0.00052, ips: 46.4889 samples/sec | ETA 05:13:20 2022-08-23 22:53:11 [INFO] [TRAIN] epoch: 41, iter: 50800/160000, loss: 0.5745, lr: 0.000827, batch_cost: 0.1734, reader_cost: 0.00085, ips: 46.1487 samples/sec | ETA 05:15:30 2022-08-23 22:53:20 [INFO] [TRAIN] epoch: 41, iter: 50850/160000, loss: 0.5853, lr: 0.000826, batch_cost: 0.1794, reader_cost: 0.00094, ips: 44.6036 samples/sec | ETA 05:26:16 2022-08-23 22:53:30 [INFO] [TRAIN] epoch: 41, iter: 50900/160000, loss: 0.6305, lr: 0.000826, batch_cost: 0.1837, reader_cost: 0.00067, ips: 43.5471 samples/sec | ETA 05:34:02 2022-08-23 22:53:40 [INFO] [TRAIN] epoch: 41, iter: 50950/160000, loss: 0.5558, lr: 0.000826, batch_cost: 0.2042, reader_cost: 0.00074, ips: 39.1687 samples/sec | ETA 06:11:12 2022-08-23 22:53:51 [INFO] [TRAIN] epoch: 41, iter: 51000/160000, loss: 0.5972, lr: 0.000825, batch_cost: 0.2162, reader_cost: 0.00042, ips: 37.0042 samples/sec | ETA 06:32:44 
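The [TRAIN] lines track four timing fields: batch_cost (seconds per iteration), reader_cost (the part of that spent waiting on the data loader), ips (samples per second), and an ETA for the remaining iterations. They are tied together by simple arithmetic; a rough sketch, assuming the costs are smoothed over the logging window:

import datetime

def train_line_arithmetic(batch_cost_s, ips, cur_iter, total_iters=160000):
    # batch_cost * ips recovers the per-process batch size; the ETA is the
    # remaining iteration count at the current (smoothed) batch_cost.
    samples_per_iter = batch_cost_s * ips
    eta = datetime.timedelta(seconds=int((total_iters - cur_iter) * batch_cost_s))
    return samples_per_iter, eta

# Plugging in the iter 51000 line above:
print(train_line_arithmetic(0.2162, 37.0042, 51000))
# -> (~8.0 samples/iter, 6:32:45), consistent with
# "ips: 37.0042 samples/sec | ETA 06:32:44" up to smoothing.

A reader_cost that jumps relative to batch_cost (e.g. the 0.07702 spike on the first iteration of epoch 41 a few lines above) indicates an input-bound step, most likely because the data reader restarts at the epoch boundary.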
2022-08-23 22:53:51 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 175s - batch_cost: 0.1748 - reader cost: 6.6413e-04 2022-08-23 22:56:46 [INFO] [EVAL] #Images: 2000 mIoU: 0.3529 Acc: 0.7632 Kappa: 0.7452 Dice: 0.4897 2022-08-23 22:56:46 [INFO] [EVAL] Class IoU: [0.6805 0.7697 0.93 0.7276 0.6862 0.757 0.7705 0.7846 0.5124 0.6227 0.4797 0.568 0.6967 0.2785 0.2906 0.415 0.4959 0.4285 0.5781 0.3984 0.7628 0.5003 0.594 0.4963 0.2973 0.3804 0.5055 0.4335 0.4553 0.259 0.2814 0.5016 0.2584 0.3272 0.328 0.3693 0.4493 0.5798 0.2665 0.2984 0.1583 0.1478 0.3561 0.2635 0.2939 0.2382 0.3107 0.5017 0.5034 0.4644 0.5278 0.423 0.2176 0.2091 0.6559 0.4105 0.8704 0.4268 0.474 0.3053 0.1701 0.2409 0.2531 0.2191 0.4084 0.6899 0.2141 0.4054 0.1087 0.3623 0.4998 0.5395 0.3756 0.2287 0.4623 0.3713 0.5355 0.2754 0.2018 0.2069 0.6821 0.3586 0.3492 0.0229 0.1624 0.5185 0.0905 0.1087 0.2116 0.4832 0.4146 0.0637 0.1514 0.1409 0.0523 0.0186 0.1579 0.173 0.2869 0.3539 0.2929 0.0855 0.2462 0.3124 0.1646 0.5907 0.2356 0.5351 0.0588 0.4104 0.1353 0.2624 0.1836 0.4319 0.4428 0.0118 0.4037 0.6335 0.1807 0.2747 0.4357 0.0718 0.3266 0.078 0.2615 0.1967 0.5182 0.4906 0.6233 0.3719 0.5826 0.0788 0.2363 0.337 0.2454 0.1775 0.1359 0.0243 0.1886 0.3469 0.1409 0.0296 0.2631 0.396 0.2977 0. 0.4063 0.0219 0.102 0.1126] 2022-08-23 22:56:46 [INFO] [EVAL] Class Precision: [0.7829 0.8395 0.9692 0.8379 0.7764 0.8801 0.885 0.8792 0.626 0.7175 0.7229 0.7042 0.7962 0.5845 0.5276 0.5846 0.6242 0.7111 0.7293 0.6372 0.8433 0.7023 0.7437 0.6048 0.4148 0.5719 0.6289 0.6863 0.6161 0.4242 0.4531 0.6344 0.4555 0.4262 0.4398 0.4923 0.6884 0.8008 0.3942 0.6554 0.2968 0.3172 0.5954 0.4801 0.4286 0.4433 0.4331 0.7759 0.6493 0.5418 0.6854 0.5264 0.3603 0.4644 0.7055 0.6187 0.9172 0.594 0.6126 0.5248 0.2602 0.5473 0.2999 0.5785 0.4677 0.8457 0.3312 0.542 0.2434 0.5832 0.6363 0.6882 0.5641 0.2959 0.7099 0.5193 0.6895 0.636 0.5116 0.5356 0.8329 0.6004 0.7454 0.0816 0.2579 0.6613 0.7038 0.4344 0.4461 0.6858 0.5743 0.0722 0.3486 0.3143 0.1717 0.0865 0.7096 0.3199 0.439 0.6018 0.5285 0.1354 0.6182 0.9181 0.6298 0.7189 0.4343 0.7396 0.1648 0.5486 0.3721 0.415 0.525 0.8427 0.4447 0.236 0.6674 0.7218 0.2873 0.5923 0.6851 0.2168 0.5281 0.7551 0.7802 0.6077 0.8303 0.6863 0.9166 0.5812 0.6597 0.4446 0.5586 0.7745 0.7135 0.4193 0.3729 0.1105 0.44 0.7199 0.2202 0.0594 0.6455 0.6097 0.6188 0. 0.8905 0.3671 0.4529 0.8558] 2022-08-23 22:56:46 [INFO] [EVAL] Class Recall: [0.8387 0.9025 0.9583 0.8468 0.8553 0.8441 0.8563 0.8795 0.7385 0.8249 0.5878 0.746 0.8479 0.3473 0.3928 0.5886 0.7071 0.5188 0.7361 0.5153 0.8889 0.6349 0.7469 0.7345 0.5122 0.5319 0.7204 0.5406 0.6356 0.3993 0.4262 0.7055 0.3739 0.5846 0.5634 0.5966 0.5641 0.6774 0.4514 0.3539 0.2534 0.2167 0.4698 0.3687 0.4832 0.3398 0.5237 0.5867 0.6915 0.7647 0.6965 0.6829 0.3546 0.2756 0.9032 0.5496 0.9446 0.6025 0.677 0.4219 0.3295 0.3009 0.6189 0.2608 0.7631 0.7893 0.377 0.6167 0.1641 0.4889 0.6996 0.714 0.5292 0.5018 0.5699 0.5658 0.7056 0.3269 0.25 0.2521 0.7902 0.4711 0.3965 0.0308 0.3047 0.706 0.0941 0.1266 0.287 0.6205 0.5985 0.3495 0.2111 0.2034 0.07 0.0231 0.1688 0.2736 0.4531 0.462 0.3964 0.1881 0.2903 0.3213 0.1822 0.7681 0.3398 0.6593 0.0837 0.6196 0.1754 0.4163 0.2202 0.4698 0.9905 0.0123 0.5053 0.8381 0.3276 0.3387 0.5448 0.097 0.4613 0.08 0.2822 0.2253 0.5796 0.6325 0.6608 0.5081 0.8329 0.0874 0.2906 0.3737 0.2722 0.2353 0.1762 0.0302 0.2481 0.401 0.2813 0.0558 0.3076 0.5305 0.3646 0. 
0.4277 0.0228 0.1164 0.1148] 2022-08-23 22:56:46 [INFO] [EVAL] The model with the best validation mIoU (0.3563) was saved at iter 42000. 2022-08-23 22:56:57 [INFO] [TRAIN] epoch: 41, iter: 51050/160000, loss: 0.5838, lr: 0.000825, batch_cost: 0.2098, reader_cost: 0.00366, ips: 38.1238 samples/sec | ETA 06:21:02 2022-08-23 22:57:09 [INFO] [TRAIN] epoch: 41, iter: 51100/160000, loss: 0.5681, lr: 0.000824, batch_cost: 0.2403, reader_cost: 0.00161, ips: 33.2863 samples/sec | ETA 07:16:12 2022-08-23 22:57:18 [INFO] [TRAIN] epoch: 41, iter: 51150/160000, loss: 0.5802, lr: 0.000824, batch_cost: 0.1898, reader_cost: 0.00660, ips: 42.1530 samples/sec | ETA 05:44:18 2022-08-23 22:57:29 [INFO] [TRAIN] epoch: 41, iter: 51200/160000, loss: 0.5796, lr: 0.000824, batch_cost: 0.2197, reader_cost: 0.00039, ips: 36.4061 samples/sec | ETA 06:38:28 2022-08-23 22:57:41 [INFO] [TRAIN] epoch: 41, iter: 51250/160000, loss: 0.5842, lr: 0.000823, batch_cost: 0.2309, reader_cost: 0.00050, ips: 34.6431 samples/sec | ETA 06:58:33 2022-08-23 22:57:51 [INFO] [TRAIN] epoch: 41, iter: 51300/160000, loss: 0.6372, lr: 0.000823, batch_cost: 0.2105, reader_cost: 0.00073, ips: 38.0092 samples/sec | ETA 06:21:18 2022-08-23 22:57:59 [INFO] [TRAIN] epoch: 41, iter: 51350/160000, loss: 0.5979, lr: 0.000823, batch_cost: 0.1646, reader_cost: 0.00083, ips: 48.5965 samples/sec | ETA 04:58:06 2022-08-23 22:58:08 [INFO] [TRAIN] epoch: 41, iter: 51400/160000, loss: 0.6106, lr: 0.000822, batch_cost: 0.1830, reader_cost: 0.00101, ips: 43.7171 samples/sec | ETA 05:31:13 2022-08-23 22:58:19 [INFO] [TRAIN] epoch: 41, iter: 51450/160000, loss: 0.6341, lr: 0.000822, batch_cost: 0.2087, reader_cost: 0.00074, ips: 38.3388 samples/sec | ETA 06:17:30 2022-08-23 22:58:28 [INFO] [TRAIN] epoch: 41, iter: 51500/160000, loss: 0.6247, lr: 0.000821, batch_cost: 0.1851, reader_cost: 0.00082, ips: 43.2258 samples/sec | ETA 05:34:40 2022-08-23 22:58:38 [INFO] [TRAIN] epoch: 41, iter: 51550/160000, loss: 0.5957, lr: 0.000821, batch_cost: 0.1927, reader_cost: 0.00074, ips: 41.5145 samples/sec | ETA 05:48:18 2022-08-23 22:58:48 [INFO] [TRAIN] epoch: 41, iter: 51600/160000, loss: 0.5984, lr: 0.000821, batch_cost: 0.2040, reader_cost: 0.00097, ips: 39.2158 samples/sec | ETA 06:08:33 2022-08-23 22:58:57 [INFO] [TRAIN] epoch: 41, iter: 51650/160000, loss: 0.5861, lr: 0.000820, batch_cost: 0.1714, reader_cost: 0.00040, ips: 46.6717 samples/sec | ETA 05:09:32 2022-08-23 22:59:06 [INFO] [TRAIN] epoch: 41, iter: 51700/160000, loss: 0.5686, lr: 0.000820, batch_cost: 0.1830, reader_cost: 0.00118, ips: 43.7123 samples/sec | ETA 05:30:20 2022-08-23 22:59:14 [INFO] [TRAIN] epoch: 41, iter: 51750/160000, loss: 0.5818, lr: 0.000820, batch_cost: 0.1707, reader_cost: 0.00046, ips: 46.8545 samples/sec | ETA 05:08:02 2022-08-23 22:59:26 [INFO] [TRAIN] epoch: 42, iter: 51800/160000, loss: 0.5845, lr: 0.000819, batch_cost: 0.2265, reader_cost: 0.05014, ips: 35.3272 samples/sec | ETA 06:48:22 2022-08-23 22:59:34 [INFO] [TRAIN] epoch: 42, iter: 51850/160000, loss: 0.5551, lr: 0.000819, batch_cost: 0.1769, reader_cost: 0.00699, ips: 45.2173 samples/sec | ETA 05:18:54 2022-08-23 22:59:43 [INFO] [TRAIN] epoch: 42, iter: 51900/160000, loss: 0.5881, lr: 0.000818, batch_cost: 0.1688, reader_cost: 0.00166, ips: 47.4052 samples/sec | ETA 05:04:02 2022-08-23 22:59:53 [INFO] [TRAIN] epoch: 42, iter: 51950/160000, loss: 0.5818, lr: 0.000818, batch_cost: 0.1957, reader_cost: 0.00040, ips: 40.8840 samples/sec | ETA 05:52:22 2022-08-23 23:00:01 [INFO] [TRAIN] epoch: 42, iter: 52000/160000, loss: 
0.5448, lr: 0.000818, batch_cost: 0.1652, reader_cost: 0.00175, ips: 48.4125 samples/sec | ETA 04:57:26 2022-08-23 23:00:01 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 184s - batch_cost: 0.1838 - reader cost: 0.0010 2022-08-23 23:03:05 [INFO] [EVAL] #Images: 2000 mIoU: 0.3597 Acc: 0.7665 Kappa: 0.7487 Dice: 0.4974 2022-08-23 23:03:05 [INFO] [EVAL] Class IoU: [0.6818 0.7752 0.9292 0.7313 0.6659 0.7505 0.7798 0.7847 0.5103 0.6404 0.4827 0.56 0.6986 0.3156 0.2765 0.4224 0.5206 0.4253 0.5948 0.4208 0.7563 0.4943 0.6234 0.4867 0.3151 0.4146 0.4741 0.4273 0.4464 0.2504 0.1881 0.5151 0.2841 0.3041 0.3882 0.3957 0.4447 0.5767 0.2714 0.3728 0.1915 0.1124 0.3573 0.272 0.2963 0.2066 0.3643 0.514 0.493 0.5351 0.5801 0.4032 0.1983 0.2554 0.653 0.4305 0.8702 0.3896 0.435 0.2911 0.104 0.2955 0.2723 0.1598 0.4402 0.6961 0.239 0.409 0.0648 0.3251 0.5028 0.5481 0.3213 0.2586 0.4671 0.3638 0.5334 0.2911 0.2404 0.2846 0.5949 0.386 0.3877 0.1085 0.2508 0.501 0.1382 0.0939 0.2532 0.4746 0.3911 0.1158 0.1625 0.0847 0.0349 0.0397 0.1714 0.1756 0.2715 0.347 0.2833 0.1431 0.2543 0.6 0.1681 0.4818 0.1943 0.5097 0.089 0.3961 0.1395 0.288 0.1805 0.6013 0.5182 0.0257 0.4196 0.6398 0.1888 0.4187 0.4552 0.0391 0.3362 0.131 0.2577 0.2381 0.4878 0.4976 0.6582 0.3588 0.5429 0.0477 0.2484 0.3067 0.2195 0.1736 0.134 0.014 0.1805 0.3358 0.1087 0.0014 0.2524 0.3351 0.2953 0.0063 0.4308 0.0294 0.0982 0.155 ] 2022-08-23 23:03:05 [INFO] [EVAL] Class Precision: [0.7761 0.8712 0.9665 0.83 0.7234 0.8763 0.909 0.8329 0.6095 0.7394 0.7344 0.6617 0.789 0.5464 0.6229 0.575 0.6968 0.6874 0.7595 0.6434 0.8177 0.7309 0.8007 0.5841 0.5248 0.5692 0.6101 0.7123 0.6499 0.4422 0.4831 0.6397 0.4811 0.3936 0.5125 0.5242 0.6153 0.8004 0.4428 0.5509 0.2806 0.3996 0.5629 0.4924 0.3686 0.4499 0.678 0.7763 0.6773 0.6633 0.7826 0.5006 0.3146 0.5959 0.6965 0.6723 0.918 0.6139 0.7896 0.6144 0.1566 0.6347 0.4187 0.5349 0.5488 0.8864 0.4559 0.5739 0.2181 0.5852 0.6523 0.6403 0.5798 0.3108 0.6323 0.6155 0.6436 0.6261 0.6285 0.4883 0.7109 0.6586 0.7961 0.2287 0.3777 0.6952 0.4606 0.3872 0.5704 0.6532 0.5265 0.1491 0.358 0.3043 0.0576 0.1176 0.8475 0.4431 0.4206 0.7094 0.7131 0.2592 0.5876 0.6504 0.8246 0.5312 0.3738 0.7281 0.1966 0.5123 0.4359 0.3425 0.4531 0.846 0.5218 0.2553 0.6103 0.7455 0.3882 0.5858 0.7435 0.3784 0.6231 0.6849 0.7969 0.5947 0.8094 0.6238 0.8291 0.5776 0.6404 0.4306 0.5103 0.8098 0.7702 0.3879 0.3354 0.0769 0.4922 0.6972 0.264 0.0039 0.6122 0.4909 0.5436 0.0076 0.8403 0.2466 0.3298 0.8307] 2022-08-23 23:03:05 [INFO] [EVAL] Class Recall: [0.8488 0.8755 0.9601 0.8601 0.8934 0.8394 0.8459 0.9313 0.758 0.827 0.5848 0.7847 0.8591 0.4277 0.3321 0.6142 0.673 0.5273 0.7328 0.5488 0.9097 0.6043 0.7379 0.7447 0.4409 0.6042 0.6802 0.5164 0.5877 0.366 0.2354 0.7257 0.4097 0.5722 0.6153 0.6175 0.6158 0.6735 0.412 0.5356 0.3762 0.1352 0.4945 0.3779 0.6015 0.2764 0.4405 0.6034 0.6443 0.7345 0.6915 0.6744 0.3491 0.3088 0.9126 0.5448 0.9436 0.5161 0.492 0.3561 0.2363 0.3561 0.4378 0.1856 0.6898 0.7643 0.3344 0.5875 0.0844 0.4224 0.6869 0.7918 0.4189 0.6058 0.6412 0.4707 0.7571 0.3523 0.2802 0.4056 0.7846 0.4826 0.4304 0.1711 0.4275 0.6421 0.1648 0.1103 0.3129 0.6344 0.6033 0.3418 0.2294 0.105 0.0815 0.0566 0.1769 0.2253 0.4336 0.4046 0.3198 0.242 0.3096 0.8857 0.1743 0.8381 0.2881 0.6296 0.1398 0.6358 0.1703 0.6443 0.2308 0.6752 0.9871 0.0278 0.5731 0.8186 0.2689 0.5948 0.5401 0.0418 0.422 0.1395 0.2759 0.2842 0.5511 0.7109 0.7615 0.4864 0.781 0.0509 0.3262 0.3305 0.2349 0.2392 0.1825 0.0168 0.2218 0.3932 0.1559 0.0023 
0.3005 0.5136 0.3927 0.0348 0.4692 0.0323 0.1227 0.16 ] 2022-08-23 23:03:05 [INFO] [EVAL] The model with the best validation mIoU (0.3597) was saved at iter 52000. 2022-08-23 23:03:15 [INFO] [TRAIN] epoch: 42, iter: 52050/160000, loss: 0.6090, lr: 0.000817, batch_cost: 0.2024, reader_cost: 0.00402, ips: 39.5253 samples/sec | ETA 06:04:09 2022-08-23 23:03:25 [INFO] [TRAIN] epoch: 42, iter: 52100/160000, loss: 0.5924, lr: 0.000817, batch_cost: 0.1926, reader_cost: 0.00121, ips: 41.5332 samples/sec | ETA 05:46:23 2022-08-23 23:03:36 [INFO] [TRAIN] epoch: 42, iter: 52150/160000, loss: 0.6454, lr: 0.000817, batch_cost: 0.2201, reader_cost: 0.00078, ips: 36.3394 samples/sec | ETA 06:35:42 2022-08-23 23:03:48 [INFO] [TRAIN] epoch: 42, iter: 52200/160000, loss: 0.5617, lr: 0.000816, batch_cost: 0.2274, reader_cost: 0.00053, ips: 35.1761 samples/sec | ETA 06:48:36 2022-08-23 23:03:57 [INFO] [TRAIN] epoch: 42, iter: 52250/160000, loss: 0.5929, lr: 0.000816, batch_cost: 0.1910, reader_cost: 0.00055, ips: 41.8786 samples/sec | ETA 05:43:03 2022-08-23 23:04:08 [INFO] [TRAIN] epoch: 42, iter: 52300/160000, loss: 0.5510, lr: 0.000815, batch_cost: 0.2223, reader_cost: 0.00049, ips: 35.9854 samples/sec | ETA 06:39:03 2022-08-23 23:04:17 [INFO] [TRAIN] epoch: 42, iter: 52350/160000, loss: 0.5556, lr: 0.000815, batch_cost: 0.1671, reader_cost: 0.00095, ips: 47.8832 samples/sec | ETA 04:59:45 2022-08-23 23:04:24 [INFO] [TRAIN] epoch: 42, iter: 52400/160000, loss: 0.6339, lr: 0.000815, batch_cost: 0.1587, reader_cost: 0.00059, ips: 50.4250 samples/sec | ETA 04:44:30 2022-08-23 23:04:34 [INFO] [TRAIN] epoch: 42, iter: 52450/160000, loss: 0.5521, lr: 0.000814, batch_cost: 0.1881, reader_cost: 0.00053, ips: 42.5313 samples/sec | ETA 05:37:09 2022-08-23 23:04:42 [INFO] [TRAIN] epoch: 42, iter: 52500/160000, loss: 0.6296, lr: 0.000814, batch_cost: 0.1687, reader_cost: 0.00108, ips: 47.4282 samples/sec | ETA 05:02:12 2022-08-23 23:04:51 [INFO] [TRAIN] epoch: 42, iter: 52550/160000, loss: 0.5876, lr: 0.000814, batch_cost: 0.1646, reader_cost: 0.00189, ips: 48.5961 samples/sec | ETA 04:54:48 2022-08-23 23:05:00 [INFO] [TRAIN] epoch: 42, iter: 52600/160000, loss: 0.5855, lr: 0.000813, batch_cost: 0.1939, reader_cost: 0.00086, ips: 41.2538 samples/sec | ETA 05:47:07 2022-08-23 23:05:09 [INFO] [TRAIN] epoch: 42, iter: 52650/160000, loss: 0.6448, lr: 0.000813, batch_cost: 0.1803, reader_cost: 0.00078, ips: 44.3686 samples/sec | ETA 05:22:36 2022-08-23 23:05:18 [INFO] [TRAIN] epoch: 42, iter: 52700/160000, loss: 0.5376, lr: 0.000812, batch_cost: 0.1818, reader_cost: 0.00062, ips: 44.0097 samples/sec | ETA 05:25:04 2022-08-23 23:05:27 [INFO] [TRAIN] epoch: 42, iter: 52750/160000, loss: 0.5793, lr: 0.000812, batch_cost: 0.1753, reader_cost: 0.00055, ips: 45.6487 samples/sec | ETA 05:13:15 2022-08-23 23:05:36 [INFO] [TRAIN] epoch: 42, iter: 52800/160000, loss: 0.5982, lr: 0.000812, batch_cost: 0.1789, reader_cost: 0.00106, ips: 44.7174 samples/sec | ETA 05:19:38 2022-08-23 23:05:44 [INFO] [TRAIN] epoch: 42, iter: 52850/160000, loss: 0.6311, lr: 0.000811, batch_cost: 0.1666, reader_cost: 0.00057, ips: 48.0326 samples/sec | ETA 04:57:26 2022-08-23 23:05:56 [INFO] [TRAIN] epoch: 42, iter: 52900/160000, loss: 0.5929, lr: 0.000811, batch_cost: 0.2223, reader_cost: 0.00036, ips: 35.9814 samples/sec | ETA 06:36:52 2022-08-23 23:06:05 [INFO] [TRAIN] epoch: 42, iter: 52950/160000, loss: 0.5938, lr: 0.000810, batch_cost: 0.1869, reader_cost: 0.00043, ips: 42.7939 samples/sec | ETA 05:33:32 2022-08-23 23:06:13 [INFO] [TRAIN] epoch: 42, 
iter: 53000/160000, loss: 0.5652, lr: 0.000810, batch_cost: 0.1668, reader_cost: 0.00051, ips: 47.9531 samples/sec | ETA 04:57:30 2022-08-23 23:06:13 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 193s - batch_cost: 0.1925 - reader cost: 0.0013 2022-08-23 23:09:26 [INFO] [EVAL] #Images: 2000 mIoU: 0.3468 Acc: 0.7612 Kappa: 0.7427 Dice: 0.4800 2022-08-23 23:09:26 [INFO] [EVAL] Class IoU: [0.6792 0.7655 0.9309 0.7268 0.6538 0.7486 0.7836 0.7954 0.5042 0.6266 0.4764 0.5434 0.7009 0.289 0.2928 0.4241 0.4746 0.3779 0.5985 0.4121 0.7465 0.4487 0.6199 0.5036 0.3224 0.2165 0.5618 0.4393 0.462 0.2301 0.2801 0.4732 0.2702 0.3107 0.3247 0.3685 0.4568 0.4945 0.2483 0.374 0.1458 0.1255 0.3499 0.2743 0.298 0.2044 0.4143 0.4787 0.5926 0.5258 0.5582 0.4106 0.1733 0.2299 0.6676 0.3323 0.8759 0.3723 0.5022 0.3409 0.1267 0.1917 0.2707 0.1273 0.4546 0.6907 0.2213 0.3891 0.052 0.3365 0.4731 0.5535 0.4061 0.2151 0.4555 0.3888 0.5805 0.2619 0.3024 0.2922 0.6388 0.3623 0.3657 0.0908 0.3072 0.5388 0.1136 0.0954 0.2621 0.491 0.4047 0. 0.0546 0.0781 0.013 0.0232 0.1954 0.1765 0.2861 0.2164 0.117 0.1495 0.1115 0.0783 0.1754 0.6027 0.2244 0.5538 0.0706 0.3328 0.2124 0.1572 0.1399 0.541 0.5944 0.0206 0.4257 0.6563 0.1441 0.3955 0.4634 0.0477 0.3222 0.1487 0.2746 0.1401 0.5406 0.4712 0.4849 0.3459 0.4671 0.0656 0.2865 0.3328 0.2754 0.1593 0.1583 0.0176 0.192 0.3617 0.2021 0.0507 0.1911 0.0351 0.3139 0. 0.3816 0.0219 0.0819 0.1474] 2022-08-23 23:09:26 [INFO] [EVAL] Class Precision: [0.7705 0.8392 0.9649 0.8412 0.7455 0.877 0.8661 0.8648 0.6321 0.7663 0.7065 0.7019 0.804 0.4867 0.5506 0.5788 0.6003 0.6881 0.741 0.6168 0.8062 0.7158 0.7784 0.6815 0.4786 0.478 0.62 0.7126 0.6806 0.3952 0.4583 0.5732 0.5731 0.487 0.4633 0.4486 0.6432 0.8083 0.3412 0.5806 0.2855 0.4191 0.561 0.5679 0.4189 0.5163 0.8195 0.6501 0.6078 0.6263 0.6951 0.5088 0.4896 0.659 0.7266 0.7484 0.921 0.6253 0.5681 0.484 0.2031 0.6086 0.337 0.5938 0.5676 0.8115 0.3496 0.5598 0.1947 0.5291 0.6243 0.6665 0.5815 0.307 0.6371 0.5494 0.7311 0.519 0.5985 0.447 0.7887 0.6516 0.7922 0.1939 0.3829 0.6824 0.585 0.4106 0.5976 0.7022 0.5832 0. 0.2905 0.3113 0.0291 0.1096 0.6431 0.3845 0.4229 0.7997 0.6872 0.2487 0.5911 0.4983 0.505 0.7515 0.4221 0.8281 0.1475 0.4262 0.2963 0.1718 0.5208 0.7943 0.6031 0.1891 0.6497 0.7089 0.2473 0.53 0.7481 0.2805 0.6257 0.6611 0.5532 0.6437 0.8364 0.6921 0.9981 0.6771 0.7639 0.4168 0.5213 0.8258 0.7562 0.4051 0.2988 0.1541 0.3823 0.6568 0.4785 0.0617 0.6642 0.1268 0.6184 0. 0.8998 0.2921 0.3065 0.8151] 2022-08-23 23:09:26 [INFO] [EVAL] Class Recall: [0.8514 0.8971 0.9635 0.8424 0.8416 0.8364 0.8917 0.9083 0.7136 0.7746 0.5939 0.7064 0.8453 0.4157 0.3848 0.6134 0.6938 0.456 0.7568 0.5539 0.9097 0.546 0.7527 0.6586 0.497 0.2836 0.8568 0.5339 0.59 0.3552 0.4188 0.7305 0.3383 0.4618 0.5205 0.6737 0.6117 0.5602 0.4768 0.5124 0.2296 0.1519 0.4819 0.3466 0.508 0.2528 0.4558 0.6448 0.9593 0.7661 0.7392 0.6803 0.2114 0.2609 0.8916 0.3741 0.9471 0.4793 0.8123 0.5355 0.2521 0.2186 0.5789 0.1394 0.6955 0.8227 0.3762 0.5607 0.0662 0.4804 0.6613 0.7656 0.5739 0.4181 0.6151 0.5708 0.738 0.3459 0.3794 0.4578 0.7707 0.4494 0.4045 0.1459 0.6085 0.7191 0.1236 0.1106 0.3182 0.6201 0.5694 0. 
0.063 0.0944 0.023 0.0286 0.2192 0.2461 0.4692 0.2288 0.1236 0.2724 0.1208 0.085 0.2118 0.7528 0.3239 0.6258 0.1193 0.6031 0.4285 0.6494 0.1605 0.6292 0.9764 0.0226 0.5524 0.8985 0.2566 0.6091 0.5492 0.0543 0.3992 0.1609 0.3528 0.1518 0.6045 0.5962 0.4854 0.4142 0.5459 0.0722 0.3888 0.3579 0.3023 0.2079 0.2518 0.0195 0.2783 0.4459 0.2591 0.2223 0.2115 0.0463 0.3893 0. 0.3985 0.0231 0.1006 0.1525] 2022-08-23 23:09:26 [INFO] [EVAL] The model with the best validation mIoU (0.3597) was saved at iter 52000. 2022-08-23 23:09:42 [INFO] [TRAIN] epoch: 43, iter: 53050/160000, loss: 0.6429, lr: 0.000810, batch_cost: 0.3086, reader_cost: 0.09604, ips: 25.9206 samples/sec | ETA 09:10:08 2022-08-23 23:09:50 [INFO] [TRAIN] epoch: 43, iter: 53100/160000, loss: 0.5323, lr: 0.000809, batch_cost: 0.1710, reader_cost: 0.00788, ips: 46.7823 samples/sec | ETA 05:04:40 2022-08-23 23:10:01 [INFO] [TRAIN] epoch: 43, iter: 53150/160000, loss: 0.6217, lr: 0.000809, batch_cost: 0.2139, reader_cost: 0.00876, ips: 37.4063 samples/sec | ETA 06:20:51 2022-08-23 23:10:11 [INFO] [TRAIN] epoch: 43, iter: 53200/160000, loss: 0.5544, lr: 0.000809, batch_cost: 0.1942, reader_cost: 0.00060, ips: 41.2020 samples/sec | ETA 05:45:36 2022-08-23 23:10:20 [INFO] [TRAIN] epoch: 43, iter: 53250/160000, loss: 0.5988, lr: 0.000808, batch_cost: 0.1923, reader_cost: 0.00069, ips: 41.5951 samples/sec | ETA 05:42:11 2022-08-23 23:10:31 [INFO] [TRAIN] epoch: 43, iter: 53300/160000, loss: 0.6033, lr: 0.000808, batch_cost: 0.2065, reader_cost: 0.00050, ips: 38.7449 samples/sec | ETA 06:07:11 2022-08-23 23:10:38 [INFO] [TRAIN] epoch: 43, iter: 53350/160000, loss: 0.5576, lr: 0.000807, batch_cost: 0.1541, reader_cost: 0.00049, ips: 51.9044 samples/sec | ETA 04:33:57 2022-08-23 23:10:47 [INFO] [TRAIN] epoch: 43, iter: 53400/160000, loss: 0.5606, lr: 0.000807, batch_cost: 0.1686, reader_cost: 0.00071, ips: 47.4488 samples/sec | ETA 04:59:33 2022-08-23 23:10:56 [INFO] [TRAIN] epoch: 43, iter: 53450/160000, loss: 0.6164, lr: 0.000807, batch_cost: 0.1840, reader_cost: 0.00115, ips: 43.4817 samples/sec | ETA 05:26:43 2022-08-23 23:11:05 [INFO] [TRAIN] epoch: 43, iter: 53500/160000, loss: 0.6009, lr: 0.000806, batch_cost: 0.1828, reader_cost: 0.00152, ips: 43.7669 samples/sec | ETA 05:24:26 2022-08-23 23:11:14 [INFO] [TRAIN] epoch: 43, iter: 53550/160000, loss: 0.5692, lr: 0.000806, batch_cost: 0.1817, reader_cost: 0.00033, ips: 44.0241 samples/sec | ETA 05:22:23 2022-08-23 23:11:23 [INFO] [TRAIN] epoch: 43, iter: 53600/160000, loss: 0.5120, lr: 0.000806, batch_cost: 0.1678, reader_cost: 0.00072, ips: 47.6706 samples/sec | ETA 04:57:35 2022-08-23 23:11:32 [INFO] [TRAIN] epoch: 43, iter: 53650/160000, loss: 0.5760, lr: 0.000805, batch_cost: 0.1909, reader_cost: 0.00063, ips: 41.8973 samples/sec | ETA 05:38:26 2022-08-23 23:11:40 [INFO] [TRAIN] epoch: 43, iter: 53700/160000, loss: 0.5774, lr: 0.000805, batch_cost: 0.1588, reader_cost: 0.00173, ips: 50.3796 samples/sec | ETA 04:41:19 2022-08-23 23:11:48 [INFO] [TRAIN] epoch: 43, iter: 53750/160000, loss: 0.5927, lr: 0.000804, batch_cost: 0.1692, reader_cost: 0.00041, ips: 47.2914 samples/sec | ETA 04:59:33 2022-08-23 23:11:58 [INFO] [TRAIN] epoch: 43, iter: 53800/160000, loss: 0.5750, lr: 0.000804, batch_cost: 0.1835, reader_cost: 0.00041, ips: 43.5968 samples/sec | ETA 05:24:47 2022-08-23 23:12:06 [INFO] [TRAIN] epoch: 43, iter: 53850/160000, loss: 0.5986, lr: 0.000804, batch_cost: 0.1676, reader_cost: 0.00066, ips: 47.7446 samples/sec | ETA 04:56:26 2022-08-23 23:12:15 [INFO] [TRAIN] epoch: 43, iter: 
53900/160000, loss: 0.5762, lr: 0.000803, batch_cost: 0.1830, reader_cost: 0.00035, ips: 43.7052 samples/sec | ETA 05:23:41 2022-08-23 23:12:25 [INFO] [TRAIN] epoch: 43, iter: 53950/160000, loss: 0.5553, lr: 0.000803, batch_cost: 0.1935, reader_cost: 0.00032, ips: 41.3515 samples/sec | ETA 05:41:56 2022-08-23 23:12:34 [INFO] [TRAIN] epoch: 43, iter: 54000/160000, loss: 0.5378, lr: 0.000803, batch_cost: 0.1841, reader_cost: 0.00072, ips: 43.4526 samples/sec | ETA 05:25:15 2022-08-23 23:12:34 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 192s - batch_cost: 0.1917 - reader cost: 0.0011 2022-08-23 23:15:46 [INFO] [EVAL] #Images: 2000 mIoU: 0.3626 Acc: 0.7673 Kappa: 0.7498 Dice: 0.4995 2022-08-23 23:15:46 [INFO] [EVAL] Class IoU: [0.6845 0.7775 0.9324 0.7353 0.6827 0.7608 0.7833 0.7871 0.5176 0.6231 0.467 0.5691 0.7011 0.2977 0.3002 0.4302 0.539 0.4188 0.5798 0.4316 0.7522 0.4859 0.6066 0.4792 0.3315 0.364 0.5744 0.4485 0.4486 0.2532 0.221 0.502 0.3127 0.3245 0.3912 0.3879 0.4595 0.5276 0.3016 0.291 0.1492 0.1355 0.3663 0.2782 0.2865 0.217 0.3871 0.5057 0.613 0.5409 0.5363 0.4247 0.1926 0.248 0.6513 0.5328 0.8418 0.4307 0.4738 0.3542 0.0867 0.2371 0.2621 0.2304 0.4661 0.6955 0.2424 0.3939 0.0725 0.3472 0.5102 0.517 0.397 0.2351 0.4631 0.4124 0.5216 0.2402 0.1997 0.3325 0.5961 0.367 0.3154 0.0826 0.2116 0.5252 0.1396 0.1099 0.3027 0.5078 0.4169 0.0607 0.1804 0.0562 0.039 0.0243 0.1876 0.2307 0.2901 0.3448 0.2765 0.0996 0.2192 0.3584 0.1898 0.5718 0.2125 0.5399 0.0657 0.2739 0.2124 0.4085 0.1841 0.5654 0.6603 0.0245 0.402 0.6598 0.1246 0.408 0.414 0.0613 0.3398 0.1396 0.2785 0.2691 0.5717 0.491 0.5952 0.3888 0.5779 0.0878 0.1222 0.3255 0.2442 0.1834 0.1624 0.0442 0.1784 0.3607 0.3554 0.0039 0.268 0.0164 0.3075 0. 0.3808 0.0362 0.1032 0.1313] 2022-08-23 23:15:46 [INFO] [EVAL] Class Precision: [0.7879 0.8651 0.9651 0.8514 0.76 0.8614 0.8948 0.8546 0.6785 0.7609 0.6827 0.7066 0.7669 0.4748 0.5405 0.6111 0.7157 0.7044 0.6946 0.5887 0.8202 0.7044 0.7406 0.5534 0.5383 0.5184 0.6595 0.7301 0.6636 0.3707 0.453 0.6777 0.5256 0.4316 0.4614 0.5488 0.6369 0.7447 0.4764 0.6558 0.303 0.319 0.5849 0.5409 0.3514 0.5001 0.7297 0.711 0.65 0.6841 0.7503 0.5484 0.421 0.5535 0.7132 0.6812 0.8788 0.6039 0.7704 0.5768 0.1268 0.4867 0.3888 0.5087 0.5795 0.8302 0.3637 0.4829 0.1851 0.5171 0.725 0.695 0.5301 0.3163 0.6946 0.6249 0.7911 0.4783 0.5945 0.4351 0.7012 0.7616 0.8205 0.1859 0.2879 0.6845 0.5269 0.378 0.6298 0.7148 0.5771 0.0722 0.3249 0.311 0.0706 0.0991 0.7647 0.407 0.4144 0.6998 0.778 0.128 0.4956 0.9169 0.842 0.674 0.3637 0.7221 0.2512 0.3834 0.435 0.5346 0.4891 0.7717 0.6862 0.266 0.8136 0.7699 0.3175 0.5407 0.7208 0.3191 0.7353 0.5807 0.6285 0.5963 0.8235 0.71 0.8416 0.6012 0.6295 0.4844 0.4276 0.8722 0.7777 0.3424 0.385 0.107 0.4113 0.6432 0.4253 0.0057 0.5723 0.4524 0.557 0. 
0.866 0.174 0.4631 0.7683] 2022-08-23 23:15:46 [INFO] [EVAL] Class Recall: [0.8391 0.8848 0.9649 0.8436 0.8702 0.867 0.8628 0.9087 0.6857 0.7749 0.5965 0.7451 0.8909 0.4439 0.403 0.5924 0.6859 0.5081 0.7783 0.618 0.9006 0.6103 0.7703 0.7814 0.4632 0.55 0.8165 0.5377 0.5807 0.4441 0.3014 0.6595 0.4357 0.5666 0.7199 0.5696 0.6225 0.6441 0.4512 0.3435 0.2271 0.1907 0.495 0.3642 0.608 0.2771 0.4519 0.6366 0.915 0.721 0.6528 0.6531 0.2621 0.3101 0.8824 0.7098 0.9524 0.6003 0.5517 0.4785 0.2151 0.3161 0.4459 0.2963 0.7044 0.8109 0.421 0.6814 0.1064 0.5138 0.6326 0.6686 0.6126 0.4781 0.5815 0.5481 0.6049 0.3254 0.2312 0.585 0.7992 0.4146 0.3388 0.1294 0.4438 0.6929 0.1596 0.1341 0.3682 0.6368 0.6003 0.2751 0.2885 0.0642 0.0801 0.0312 0.1991 0.3475 0.4918 0.4047 0.3002 0.3096 0.2821 0.3704 0.1968 0.7904 0.3383 0.6816 0.0817 0.4893 0.2932 0.6339 0.2279 0.679 0.9459 0.0263 0.4428 0.8219 0.1702 0.6244 0.493 0.0705 0.3872 0.1552 0.3334 0.3291 0.6515 0.6142 0.6703 0.5239 0.8758 0.0968 0.1461 0.3418 0.2625 0.2832 0.2192 0.0699 0.2395 0.4508 0.6837 0.0118 0.3351 0.0167 0.407 0. 0.4046 0.0437 0.1172 0.1367] 2022-08-23 23:15:46 [INFO] [EVAL] The model with the best validation mIoU (0.3626) was saved at iter 54000. 2022-08-23 23:15:57 [INFO] [TRAIN] epoch: 43, iter: 54050/160000, loss: 0.5727, lr: 0.000802, batch_cost: 0.2138, reader_cost: 0.00257, ips: 37.4183 samples/sec | ETA 06:17:32 2022-08-23 23:16:09 [INFO] [TRAIN] epoch: 43, iter: 54100/160000, loss: 0.5955, lr: 0.000802, batch_cost: 0.2294, reader_cost: 0.00199, ips: 34.8808 samples/sec | ETA 06:44:48 2022-08-23 23:16:17 [INFO] [TRAIN] epoch: 43, iter: 54150/160000, loss: 0.5710, lr: 0.000801, batch_cost: 0.1698, reader_cost: 0.00051, ips: 47.1118 samples/sec | ETA 04:59:34 2022-08-23 23:16:27 [INFO] [TRAIN] epoch: 43, iter: 54200/160000, loss: 0.6233, lr: 0.000801, batch_cost: 0.1943, reader_cost: 0.00042, ips: 41.1640 samples/sec | ETA 05:42:41 2022-08-23 23:16:36 [INFO] [TRAIN] epoch: 43, iter: 54250/160000, loss: 0.5582, lr: 0.000801, batch_cost: 0.1828, reader_cost: 0.00035, ips: 43.7668 samples/sec | ETA 05:22:09 2022-08-23 23:16:46 [INFO] [TRAIN] epoch: 43, iter: 54300/160000, loss: 0.5793, lr: 0.000800, batch_cost: 0.2002, reader_cost: 0.00101, ips: 39.9697 samples/sec | ETA 05:52:36 2022-08-23 23:17:01 [INFO] [TRAIN] epoch: 44, iter: 54350/160000, loss: 0.5580, lr: 0.000800, batch_cost: 0.3007, reader_cost: 0.11165, ips: 26.6026 samples/sec | ETA 08:49:31 2022-08-23 23:17:10 [INFO] [TRAIN] epoch: 44, iter: 54400/160000, loss: 0.5733, lr: 0.000800, batch_cost: 0.1892, reader_cost: 0.00053, ips: 42.2783 samples/sec | ETA 05:33:01 2022-08-23 23:17:19 [INFO] [TRAIN] epoch: 44, iter: 54450/160000, loss: 0.5470, lr: 0.000799, batch_cost: 0.1742, reader_cost: 0.00081, ips: 45.9188 samples/sec | ETA 05:06:28 2022-08-23 23:17:30 [INFO] [TRAIN] epoch: 44, iter: 54500/160000, loss: 0.5734, lr: 0.000799, batch_cost: 0.2081, reader_cost: 0.00053, ips: 38.4434 samples/sec | ETA 06:05:54 2022-08-23 23:17:39 [INFO] [TRAIN] epoch: 44, iter: 54550/160000, loss: 0.5962, lr: 0.000798, batch_cost: 0.1874, reader_cost: 0.00082, ips: 42.6794 samples/sec | ETA 05:29:25 2022-08-23 23:17:48 [INFO] [TRAIN] epoch: 44, iter: 54600/160000, loss: 0.6122, lr: 0.000798, batch_cost: 0.1828, reader_cost: 0.00078, ips: 43.7601 samples/sec | ETA 05:21:08 2022-08-23 23:17:56 [INFO] [TRAIN] epoch: 44, iter: 54650/160000, loss: 0.6062, lr: 0.000798, batch_cost: 0.1666, reader_cost: 0.00091, ips: 48.0288 samples/sec | ETA 04:52:27 2022-08-23 23:18:05 [INFO] [TRAIN] epoch: 44, 
iter: 54700/160000, loss: 0.5319, lr: 0.000797, batch_cost: 0.1743, reader_cost: 0.00048, ips: 45.9057 samples/sec | ETA 05:05:50 2022-08-23 23:18:14 [INFO] [TRAIN] epoch: 44, iter: 54750/160000, loss: 0.5403, lr: 0.000797, batch_cost: 0.1692, reader_cost: 0.00080, ips: 47.2904 samples/sec | ETA 04:56:44 2022-08-23 23:18:23 [INFO] [TRAIN] epoch: 44, iter: 54800/160000, loss: 0.5664, lr: 0.000796, batch_cost: 0.1985, reader_cost: 0.00041, ips: 40.3085 samples/sec | ETA 05:47:58 2022-08-23 23:18:33 [INFO] [TRAIN] epoch: 44, iter: 54850/160000, loss: 0.5877, lr: 0.000796, batch_cost: 0.1849, reader_cost: 0.00045, ips: 43.2562 samples/sec | ETA 05:24:06 2022-08-23 23:18:42 [INFO] [TRAIN] epoch: 44, iter: 54900/160000, loss: 0.5753, lr: 0.000796, batch_cost: 0.1790, reader_cost: 0.00081, ips: 44.6992 samples/sec | ETA 05:13:30 2022-08-23 23:18:51 [INFO] [TRAIN] epoch: 44, iter: 54950/160000, loss: 0.5621, lr: 0.000795, batch_cost: 0.1804, reader_cost: 0.00061, ips: 44.3529 samples/sec | ETA 05:15:48 2022-08-23 23:19:01 [INFO] [TRAIN] epoch: 44, iter: 55000/160000, loss: 0.5954, lr: 0.000795, batch_cost: 0.2022, reader_cost: 0.00052, ips: 39.5740 samples/sec | ETA 05:53:46 2022-08-23 23:19:01 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 187s - batch_cost: 0.1867 - reader cost: 7.3248e-04 2022-08-23 23:22:08 [INFO] [EVAL] #Images: 2000 mIoU: 0.3600 Acc: 0.7685 Kappa: 0.7510 Dice: 0.4959 2022-08-23 23:22:08 [INFO] [EVAL] Class IoU: [0.6888 0.7823 0.9304 0.7321 0.6644 0.7651 0.7791 0.7903 0.524 0.6466 0.4585 0.5644 0.6962 0.322 0.3117 0.4355 0.5065 0.3881 0.5959 0.4379 0.7473 0.4736 0.6166 0.5117 0.3139 0.448 0.5002 0.4276 0.4278 0.1925 0.2759 0.4963 0.315 0.317 0.379 0.3883 0.459 0.5315 0.2995 0.3607 0.1917 0.132 0.3567 0.2741 0.2647 0.1977 0.3303 0.4993 0.6014 0.5036 0.5753 0.4157 0.1879 0.2471 0.6505 0.4471 0.8049 0.4129 0.4262 0.3938 0.1161 0.3333 0.2387 0.0739 0.4826 0.7236 0.2602 0.3959 0.0564 0.337 0.4779 0.5631 0.3579 0.2426 0.4429 0.3675 0.5462 0.2477 0.3436 0.401 0.6478 0.3745 0.4076 0.0205 0.2065 0.5187 0.0879 0.0933 0.3632 0.5014 0.468 0.1547 0.1464 0.048 0.0442 0.0201 0.2079 0.1157 0.2881 0.3651 0.2524 0.1371 0.2185 0.0637 0.1647 0.3848 0.1041 0.4612 0.1038 0.265 0.1592 0.3308 0.1513 0.4847 0.6644 0.0576 0.4252 0.7502 0.1917 0.4241 0.4469 0.0491 0.3165 0.1008 0.2508 0.2446 0.5371 0.468 0.6197 0.4283 0.534 0.0835 0.2145 0.3537 0.2535 0.2018 0.1555 0.0181 0.1938 0.3664 0.0758 0. 0.2183 0.5018 0.3343 0.0017 0.4257 0.0194 0.1085 0.1867] 2022-08-23 23:22:08 [INFO] [EVAL] Class Precision: [0.7918 0.8566 0.9668 0.8304 0.7341 0.8854 0.8764 0.8514 0.6598 0.8047 0.7311 0.7066 0.7601 0.4887 0.5377 0.5781 0.6427 0.6751 0.737 0.586 0.8139 0.6522 0.7746 0.6554 0.5444 0.56 0.639 0.7653 0.6966 0.3484 0.4875 0.6503 0.5249 0.4286 0.5371 0.5361 0.6382 0.8317 0.4957 0.5958 0.3637 0.3974 0.5839 0.4661 0.3377 0.3576 0.8209 0.7585 0.6541 0.5949 0.7307 0.5325 0.3843 0.5353 0.7385 0.7368 0.8422 0.6064 0.8103 0.5711 0.1859 0.448 0.3164 0.6063 0.6646 0.8688 0.4109 0.5081 0.1589 0.5148 0.6595 0.7085 0.6757 0.3441 0.6153 0.6932 0.6813 0.4109 0.6821 0.7085 0.8037 0.7574 0.7648 0.1451 0.3314 0.6987 0.2758 0.3768 0.6681 0.6441 0.8291 0.4854 0.2668 0.2935 0.1959 0.0719 0.5185 0.443 0.443 0.5994 0.7801 0.2207 0.5009 0.9713 0.6308 0.4478 0.2323 0.5496 0.1728 0.3913 0.6013 0.4052 0.4824 0.8005 0.6684 0.338 0.7487 0.7963 0.3449 0.5988 0.7193 0.3384 0.6311 0.6277 0.7502 0.6247 0.8125 0.6613 0.9482 0.7686 0.5653 0.5296 0.6166 0.7067 0.6688 0.3883 0.4035 0.207 0.363 0.5407 0.2212 0. 
0.6237 0.6873 0.5576 0.0071 0.8295 0.4168 0.5417 0.8749] 2022-08-23 23:22:08 [INFO] [EVAL] Class Recall: [0.8412 0.9003 0.9612 0.8608 0.8749 0.8492 0.8752 0.9168 0.7181 0.767 0.5515 0.7372 0.8922 0.4857 0.4258 0.6383 0.7049 0.4773 0.7569 0.6342 0.9014 0.6336 0.7514 0.7001 0.4258 0.6914 0.6972 0.4921 0.5258 0.3007 0.3887 0.677 0.4406 0.5489 0.563 0.5847 0.6205 0.5956 0.4308 0.4776 0.2884 0.1651 0.4782 0.3995 0.5504 0.3066 0.356 0.5936 0.8817 0.7663 0.7302 0.6547 0.2688 0.3146 0.8452 0.5321 0.9479 0.564 0.4734 0.5591 0.2361 0.5656 0.4929 0.0777 0.6379 0.8124 0.4149 0.6418 0.0805 0.4939 0.6344 0.7329 0.4321 0.4513 0.6126 0.4388 0.7335 0.3841 0.4092 0.4803 0.7696 0.4256 0.4661 0.0233 0.354 0.6682 0.1143 0.1103 0.4432 0.6936 0.518 0.1851 0.2451 0.0543 0.0541 0.0271 0.2576 0.1354 0.4517 0.4831 0.2717 0.2657 0.2793 0.0638 0.1823 0.7322 0.1586 0.7414 0.2062 0.451 0.178 0.6431 0.1806 0.5513 0.991 0.065 0.496 0.9283 0.3015 0.5925 0.5413 0.0543 0.3884 0.1073 0.2736 0.2868 0.6131 0.6155 0.6414 0.4917 0.906 0.0902 0.2475 0.4146 0.2899 0.2959 0.2019 0.0194 0.2938 0.532 0.1033 0. 0.2514 0.6503 0.455 0.0022 0.4666 0.0199 0.1194 0.1918] 2022-08-23 23:22:08 [INFO] [EVAL] The model with the best validation mIoU (0.3626) was saved at iter 54000. 2022-08-23 23:22:18 [INFO] [TRAIN] epoch: 44, iter: 55050/160000, loss: 0.6091, lr: 0.000795, batch_cost: 0.2034, reader_cost: 0.00727, ips: 39.3303 samples/sec | ETA 05:55:47 2022-08-23 23:22:30 [INFO] [TRAIN] epoch: 44, iter: 55100/160000, loss: 0.5815, lr: 0.000794, batch_cost: 0.2256, reader_cost: 0.00102, ips: 35.4595 samples/sec | ETA 06:34:26 2022-08-23 23:22:41 [INFO] [TRAIN] epoch: 44, iter: 55150/160000, loss: 0.5883, lr: 0.000794, batch_cost: 0.2216, reader_cost: 0.00076, ips: 36.1062 samples/sec | ETA 06:27:11 2022-08-23 23:22:51 [INFO] [TRAIN] epoch: 44, iter: 55200/160000, loss: 0.5720, lr: 0.000793, batch_cost: 0.1966, reader_cost: 0.00860, ips: 40.7019 samples/sec | ETA 05:43:18 2022-08-23 23:23:01 [INFO] [TRAIN] epoch: 44, iter: 55250/160000, loss: 0.5780, lr: 0.000793, batch_cost: 0.2163, reader_cost: 0.00045, ips: 36.9919 samples/sec | ETA 06:17:33 2022-08-23 23:23:11 [INFO] [TRAIN] epoch: 44, iter: 55300/160000, loss: 0.5850, lr: 0.000793, batch_cost: 0.1931, reader_cost: 0.00056, ips: 41.4255 samples/sec | ETA 05:36:59 2022-08-23 23:23:20 [INFO] [TRAIN] epoch: 44, iter: 55350/160000, loss: 0.5765, lr: 0.000792, batch_cost: 0.1735, reader_cost: 0.00049, ips: 46.1084 samples/sec | ETA 05:02:37 2022-08-23 23:23:28 [INFO] [TRAIN] epoch: 44, iter: 55400/160000, loss: 0.5958, lr: 0.000792, batch_cost: 0.1687, reader_cost: 0.00064, ips: 47.4309 samples/sec | ETA 04:54:02 2022-08-23 23:23:36 [INFO] [TRAIN] epoch: 44, iter: 55450/160000, loss: 0.6364, lr: 0.000792, batch_cost: 0.1515, reader_cost: 0.00043, ips: 52.8090 samples/sec | ETA 04:23:58 2022-08-23 23:23:44 [INFO] [TRAIN] epoch: 44, iter: 55500/160000, loss: 0.5882, lr: 0.000791, batch_cost: 0.1586, reader_cost: 0.00032, ips: 50.4280 samples/sec | ETA 04:36:18 2022-08-23 23:23:52 [INFO] [TRAIN] epoch: 44, iter: 55550/160000, loss: 0.5460, lr: 0.000791, batch_cost: 0.1711, reader_cost: 0.00260, ips: 46.7588 samples/sec | ETA 04:57:50 2022-08-23 23:24:07 [INFO] [TRAIN] epoch: 45, iter: 55600/160000, loss: 0.6066, lr: 0.000790, batch_cost: 0.2936, reader_cost: 0.09705, ips: 27.2519 samples/sec | ETA 08:30:47 2022-08-23 23:24:16 [INFO] [TRAIN] epoch: 45, iter: 55650/160000, loss: 0.5745, lr: 0.000790, batch_cost: 0.1869, reader_cost: 0.00062, ips: 42.8022 samples/sec | ETA 05:25:03 2022-08-23 23:24:27 
[INFO] [TRAIN] epoch: 45, iter: 55700/160000, loss: 0.5277, lr: 0.000790, batch_cost: 0.2059, reader_cost: 0.00056, ips: 38.8502 samples/sec | ETA 05:57:57 2022-08-23 23:24:37 [INFO] [TRAIN] epoch: 45, iter: 55750/160000, loss: 0.5594, lr: 0.000789, batch_cost: 0.2079, reader_cost: 0.00065, ips: 38.4811 samples/sec | ETA 06:01:12 2022-08-23 23:24:46 [INFO] [TRAIN] epoch: 45, iter: 55800/160000, loss: 0.5556, lr: 0.000789, batch_cost: 0.1766, reader_cost: 0.00043, ips: 45.2900 samples/sec | ETA 05:06:45 2022-08-23 23:24:55 [INFO] [TRAIN] epoch: 45, iter: 55850/160000, loss: 0.5604, lr: 0.000789, batch_cost: 0.1814, reader_cost: 0.00086, ips: 44.1125 samples/sec | ETA 05:14:48 2022-08-23 23:25:04 [INFO] [TRAIN] epoch: 45, iter: 55900/160000, loss: 0.5812, lr: 0.000788, batch_cost: 0.1757, reader_cost: 0.00063, ips: 45.5443 samples/sec | ETA 05:04:45 2022-08-23 23:25:12 [INFO] [TRAIN] epoch: 45, iter: 55950/160000, loss: 0.5284, lr: 0.000788, batch_cost: 0.1625, reader_cost: 0.00066, ips: 49.2440 samples/sec | ETA 04:41:43 2022-08-23 23:25:20 [INFO] [TRAIN] epoch: 45, iter: 56000/160000, loss: 0.5944, lr: 0.000787, batch_cost: 0.1614, reader_cost: 0.00148, ips: 49.5576 samples/sec | ETA 04:39:48 2022-08-23 23:25:20 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 193s - batch_cost: 0.1930 - reader cost: 6.4287e-04 2022-08-23 23:28:33 [INFO] [EVAL] #Images: 2000 mIoU: 0.3606 Acc: 0.7676 Kappa: 0.7499 Dice: 0.4962 2022-08-23 23:28:33 [INFO] [EVAL] Class IoU: [0.683 0.78 0.9286 0.7344 0.6679 0.746 0.7853 0.7846 0.5186 0.6168 0.4655 0.5662 0.6959 0.3097 0.3081 0.4351 0.52 0.4158 0.5995 0.4175 0.7629 0.5211 0.6001 0.5003 0.3081 0.4793 0.525 0.4519 0.4586 0.2182 0.2942 0.5159 0.2818 0.3505 0.3092 0.3472 0.4557 0.5325 0.296 0.3757 0.191 0.157 0.3649 0.2619 0.2405 0.2067 0.3491 0.5247 0.6634 0.5192 0.5522 0.3534 0.2269 0.2399 0.6858 0.4261 0.8716 0.3577 0.4822 0.2575 0.0942 0.1937 0.3049 0.1176 0.4523 0.7164 0.1926 0.3838 0.0938 0.329 0.5069 0.5809 0.4204 0.2145 0.4646 0.3746 0.4535 0.2479 0.3786 0.2941 0.653 0.386 0.3197 0.0211 0.2812 0.5162 0.0809 0.081 0.298 0.507 0.4395 0.097 0.1006 0.0793 0.0014 0.032 0.1895 0.1181 0.2905 0.2822 0.2565 0.1371 0.2853 0.3538 0.1384 0.621 0.1756 0.5629 0.0763 0.3548 0.2076 0.2134 0.098 0.504 0.7079 0.0135 0.4493 0.6855 0.2606 0.347 0.4588 0.0343 0.2568 0.1288 0.2803 0.1966 0.5326 0.4652 0.6016 0.3693 0.5303 0.0533 0.2466 0.3226 0.234 0.1853 0.1458 0.0216 0.1899 0.3559 0.11 0.0772 0.26 0.438 0.3801 0. 
0.3588 0.0284 0.1046 0.1888] 2022-08-23 23:28:33 [INFO] [EVAL] Class Precision: [0.7819 0.8455 0.9589 0.8397 0.7662 0.8924 0.8929 0.8488 0.6636 0.7486 0.7229 0.6909 0.7628 0.5267 0.5616 0.5727 0.662 0.6559 0.7464 0.6235 0.842 0.6321 0.7346 0.6353 0.5717 0.642 0.7398 0.6652 0.6476 0.3262 0.4484 0.6192 0.5466 0.4591 0.4887 0.4358 0.635 0.7512 0.4412 0.5962 0.2684 0.3357 0.5648 0.5274 0.3523 0.4033 0.6996 0.7821 0.7298 0.6133 0.7631 0.4699 0.4213 0.7403 0.7165 0.583 0.9282 0.6655 0.6465 0.5762 0.1696 0.4758 0.4727 0.6059 0.5555 0.821 0.3392 0.5146 0.1819 0.5772 0.6876 0.7214 0.5833 0.3148 0.6964 0.5491 0.7155 0.672 0.629 0.6771 0.8053 0.7101 0.8155 0.1336 0.3483 0.75 0.5586 0.4289 0.8953 0.6869 0.7143 0.1182 0.3232 0.2987 0.0044 0.198 0.5476 0.4454 0.4352 0.6847 0.7794 0.2639 0.5124 0.9575 0.5684 0.8213 0.3734 0.8528 0.1841 0.5227 0.4265 0.2477 0.5841 0.6303 0.7163 0.1464 0.6714 0.7334 0.4106 0.5487 0.6821 0.2569 0.8544 0.5732 0.7405 0.6414 0.833 0.637 0.8881 0.6057 0.6799 0.4244 0.5024 0.7769 0.6236 0.3722 0.3872 0.095 0.3498 0.6744 0.2107 0.139 0.5625 0.6415 0.6897 0. 0.8017 0.1843 0.2023 0.7996] 2022-08-23 23:28:33 [INFO] [EVAL] Class Recall: [0.8438 0.9097 0.9671 0.8542 0.8388 0.8198 0.867 0.9121 0.7037 0.778 0.5665 0.7583 0.8881 0.4291 0.4057 0.6443 0.7079 0.5317 0.7529 0.5583 0.8904 0.7481 0.7662 0.7018 0.4007 0.6543 0.644 0.585 0.6111 0.3973 0.461 0.7556 0.3678 0.597 0.4571 0.6308 0.6173 0.6465 0.4736 0.5039 0.3986 0.2277 0.5077 0.3422 0.4312 0.2978 0.4106 0.6146 0.8794 0.772 0.6665 0.5878 0.3297 0.2619 0.9412 0.6129 0.9346 0.4361 0.6548 0.3176 0.175 0.2462 0.4621 0.1273 0.7089 0.849 0.3082 0.6015 0.1621 0.4334 0.6587 0.7489 0.6007 0.4023 0.5826 0.5409 0.5533 0.282 0.4875 0.3421 0.7755 0.4582 0.3446 0.0244 0.5934 0.6234 0.0864 0.0908 0.3088 0.6593 0.5332 0.3504 0.1275 0.0974 0.002 0.0368 0.2246 0.1385 0.4662 0.3244 0.2765 0.2218 0.3916 0.3594 0.1546 0.7181 0.249 0.6235 0.1153 0.5249 0.288 0.606 0.1053 0.7156 0.9836 0.0147 0.5759 0.9131 0.4162 0.4856 0.5836 0.0381 0.2685 0.1424 0.3108 0.2209 0.5963 0.633 0.651 0.4862 0.7067 0.0575 0.3263 0.3555 0.2725 0.2695 0.1896 0.0271 0.2935 0.4297 0.187 0.148 0.3259 0.58 0.4585 0. 0.3938 0.0325 0.178 0.1982] 2022-08-23 23:28:33 [INFO] [EVAL] The model with the best validation mIoU (0.3626) was saved at iter 54000. 
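Because the run prints everything as one continuous stream, it is easier to extract the loss and validation-mIoU curves programmatically than to read them inline. A minimal parsing sketch, assuming the console output was saved to a file (the name train.log below is hypothetical); the regexes rely only on the "[TRAIN] ... iter: N/160000, loss: L" and "[EVAL] #Images: ... mIoU: M" patterns visible above.

import re

TRAIN_RE = re.compile(r"\[TRAIN\] epoch: \d+, iter: (\d+)/\d+, loss: ([\d.]+)")
EVAL_RE = re.compile(r"\[EVAL\] #Images: \d+ mIoU: ([\d.]+)")

def parse_log(path="train.log"):
    text = open(path).read()
    train = [(m.start(), int(m.group(1)), float(m.group(2)))
             for m in TRAIN_RE.finditer(text)]
    evals = []
    for m in EVAL_RE.finditer(text):
        # pair each evaluation with the training iteration logged just before it
        last_iter = max((it for pos, it, _ in train if pos < m.start()), default=0)
        evals.append((last_iter, float(m.group(1))))
    loss_curve = [(it, loss) for _, it, loss in train]
    return loss_curve, evals

# loss_curve, miou_curve = parse_log()
# miou_curve would contain, e.g., (56000, 0.3606) for the evaluation just above.

The "best validation mIoU ... was saved at iter" lines can be matched the same way to confirm which checkpoint in the save_dir corresponds to the reported best model (iter 54000, mIoU 0.3626, at this point in the run).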
2022-08-23 23:28:44 [INFO] [TRAIN] epoch: 45, iter: 56050/160000, loss: 0.5654, lr: 0.000787, batch_cost: 0.2137, reader_cost: 0.00433, ips: 37.4378 samples/sec | ETA 06:10:12 2022-08-23 23:28:54 [INFO] [TRAIN] epoch: 45, iter: 56100/160000, loss: 0.5749, lr: 0.000787, batch_cost: 0.2041, reader_cost: 0.00550, ips: 39.1923 samples/sec | ETA 05:53:28 2022-08-23 23:29:04 [INFO] [TRAIN] epoch: 45, iter: 56150/160000, loss: 0.5756, lr: 0.000786, batch_cost: 0.2032, reader_cost: 0.00093, ips: 39.3796 samples/sec | ETA 05:51:37 2022-08-23 23:29:12 [INFO] [TRAIN] epoch: 45, iter: 56200/160000, loss: 0.5757, lr: 0.000786, batch_cost: 0.1509, reader_cost: 0.00050, ips: 53.0151 samples/sec | ETA 04:21:03 2022-08-23 23:29:22 [INFO] [TRAIN] epoch: 45, iter: 56250/160000, loss: 0.5538, lr: 0.000785, batch_cost: 0.1970, reader_cost: 0.00062, ips: 40.6179 samples/sec | ETA 05:40:34 2022-08-23 23:29:31 [INFO] [TRAIN] epoch: 45, iter: 56300/160000, loss: 0.5845, lr: 0.000785, batch_cost: 0.1776, reader_cost: 0.00080, ips: 45.0365 samples/sec | ETA 05:07:00 2022-08-23 23:29:39 [INFO] [TRAIN] epoch: 45, iter: 56350/160000, loss: 0.5923, lr: 0.000785, batch_cost: 0.1726, reader_cost: 0.00091, ips: 46.3632 samples/sec | ETA 04:58:04 2022-08-23 23:29:48 [INFO] [TRAIN] epoch: 45, iter: 56400/160000, loss: 0.6083, lr: 0.000784, batch_cost: 0.1780, reader_cost: 0.00054, ips: 44.9508 samples/sec | ETA 05:07:17 2022-08-23 23:29:57 [INFO] [TRAIN] epoch: 45, iter: 56450/160000, loss: 0.5860, lr: 0.000784, batch_cost: 0.1841, reader_cost: 0.00052, ips: 43.4500 samples/sec | ETA 05:17:45 2022-08-23 23:30:07 [INFO] [TRAIN] epoch: 45, iter: 56500/160000, loss: 0.5411, lr: 0.000784, batch_cost: 0.1828, reader_cost: 0.00044, ips: 43.7666 samples/sec | ETA 05:15:18 2022-08-23 23:30:16 [INFO] [TRAIN] epoch: 45, iter: 56550/160000, loss: 0.5397, lr: 0.000783, batch_cost: 0.1843, reader_cost: 0.00049, ips: 43.4111 samples/sec | ETA 05:17:44 2022-08-23 23:30:25 [INFO] [TRAIN] epoch: 45, iter: 56600/160000, loss: 0.5460, lr: 0.000783, batch_cost: 0.1791, reader_cost: 0.00065, ips: 44.6722 samples/sec | ETA 05:08:37 2022-08-23 23:30:34 [INFO] [TRAIN] epoch: 45, iter: 56650/160000, loss: 0.5595, lr: 0.000782, batch_cost: 0.1924, reader_cost: 0.00079, ips: 41.5822 samples/sec | ETA 05:31:23 2022-08-23 23:30:43 [INFO] [TRAIN] epoch: 45, iter: 56700/160000, loss: 0.6085, lr: 0.000782, batch_cost: 0.1739, reader_cost: 0.00069, ips: 46.0102 samples/sec | ETA 04:59:21 2022-08-23 23:30:51 [INFO] [TRAIN] epoch: 45, iter: 56750/160000, loss: 0.5557, lr: 0.000782, batch_cost: 0.1573, reader_cost: 0.00097, ips: 50.8458 samples/sec | ETA 04:30:45 2022-08-23 23:30:59 [INFO] [TRAIN] epoch: 45, iter: 56800/160000, loss: 0.6352, lr: 0.000781, batch_cost: 0.1590, reader_cost: 0.00044, ips: 50.3096 samples/sec | ETA 04:33:30 2022-08-23 23:31:10 [INFO] [TRAIN] epoch: 46, iter: 56850/160000, loss: 0.5791, lr: 0.000781, batch_cost: 0.2166, reader_cost: 0.05822, ips: 36.9339 samples/sec | ETA 06:12:22 2022-08-23 23:31:18 [INFO] [TRAIN] epoch: 46, iter: 56900/160000, loss: 0.5440, lr: 0.000781, batch_cost: 0.1680, reader_cost: 0.00073, ips: 47.6325 samples/sec | ETA 04:48:35 2022-08-23 23:31:27 [INFO] [TRAIN] epoch: 46, iter: 56950/160000, loss: 0.5236, lr: 0.000780, batch_cost: 0.1685, reader_cost: 0.00055, ips: 47.4679 samples/sec | ETA 04:49:27 2022-08-23 23:31:36 [INFO] [TRAIN] epoch: 46, iter: 57000/160000, loss: 0.5470, lr: 0.000780, batch_cost: 0.1985, reader_cost: 0.00073, ips: 40.2924 samples/sec | ETA 05:40:50 2022-08-23 23:31:37 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 189s - batch_cost: 0.1894 - reader cost: 8.4689e-04 2022-08-23 23:34:46 [INFO] [EVAL] #Images: 2000 mIoU: 0.3583 Acc: 0.7680 Kappa: 0.7503 Dice: 0.4945 2022-08-23 23:34:46 [INFO] [EVAL] Class IoU: [0.686 0.7773 0.929 0.7326 0.6825 0.753 0.7674 0.7867 0.5156 0.6336 0.48 0.5543 0.698 0.3441 0.2989 0.4324 0.5166 0.4288 0.601 0.4149 0.7431 0.5012 0.5856 0.514 0.3214 0.3647 0.5761 0.4154 0.4446 0.2707 0.2452 0.4908 0.2986 0.3128 0.3588 0.3938 0.4727 0.5459 0.2697 0.3588 0.1854 0.1383 0.3591 0.2778 0.277 0.2299 0.3722 0.497 0.6398 0.5492 0.5168 0.3405 0.2098 0.2509 0.6559 0.4257 0.8781 0.3777 0.3003 0.3105 0.1074 0.224 0.2511 0.0712 0.4463 0.6633 0.2024 0.4087 0.0303 0.321 0.4598 0.5711 0.3929 0.2172 0.4455 0.3996 0.5001 0.258 0.3244 0.0827 0.6523 0.3878 0.402 0.1241 0.1415 0.5283 0.0893 0.0905 0.327 0.5018 0.4606 0.0441 0.1718 0.0846 0.0192 0.0218 0.2056 0.1934 0.293 0.324 0.0849 0.1053 0.1485 0.5474 0.1822 0.5515 0.2533 0.5912 0.0459 0.3417 0.1399 0.339 0.1756 0.4358 0.6864 0.0265 0.406 0.6666 0.1668 0.3445 0.477 0.0648 0.3076 0.1501 0.2842 0.1898 0.4714 0.4519 0.5306 0.4065 0.5495 0.0866 0.2855 0.3516 0.2625 0.2082 0.1663 0.0108 0.1931 0.3542 0.2329 0.0894 0.2373 0.4304 0.2974 0. 0.3779 0.0276 0.1129 0.1551] 2022-08-23 23:34:46 [INFO] [EVAL] Class Precision: [0.7823 0.8564 0.9579 0.8405 0.7829 0.8723 0.8646 0.841 0.6333 0.7425 0.6744 0.707 0.7782 0.5489 0.5778 0.6041 0.6915 0.6824 0.7363 0.6322 0.8019 0.6983 0.7068 0.6194 0.5074 0.5669 0.6499 0.7672 0.7225 0.4731 0.4867 0.6099 0.4641 0.3959 0.4191 0.5315 0.6556 0.7549 0.4201 0.5782 0.3027 0.3681 0.617 0.5088 0.3571 0.4362 0.7835 0.7259 0.7351 0.6709 0.7467 0.4008 0.4396 0.6523 0.7071 0.7853 0.9436 0.6515 0.7407 0.622 0.1852 0.458 0.3635 0.5168 0.5314 0.7415 0.3844 0.5482 0.0858 0.5642 0.7568 0.7684 0.5548 0.2936 0.6006 0.5417 0.8015 0.5831 0.6464 0.5818 0.7789 0.6716 0.7097 0.3415 0.2145 0.7547 0.4004 0.3919 0.7285 0.8176 0.7903 0.0504 0.2295 0.3098 0.0711 0.1079 0.527 0.4845 0.4356 0.571 0.7739 0.1461 0.6007 0.8314 0.6587 0.6579 0.4599 0.846 0.141 0.4824 0.4075 0.4177 0.572 0.5993 0.6947 0.3426 0.7681 0.787 0.3872 0.5226 0.6285 0.3298 0.6319 0.6206 0.6953 0.6122 0.8922 0.5834 0.8462 0.7066 0.7304 0.3646 0.5818 0.7447 0.7708 0.3429 0.2831 0.0563 0.4293 0.7211 0.4196 0.2371 0.6335 0.6686 0.5918 0. 0.8043 0.2423 0.4657 0.7228] 2022-08-23 23:34:46 [INFO] [EVAL] Class Recall: [0.8478 0.8938 0.9685 0.8509 0.8419 0.8463 0.8722 0.9242 0.7351 0.8121 0.6249 0.7196 0.8713 0.4797 0.3824 0.6034 0.6714 0.5357 0.7659 0.5468 0.9102 0.6397 0.7735 0.7512 0.4671 0.5055 0.8354 0.4753 0.5362 0.3875 0.3307 0.7154 0.4557 0.5983 0.7139 0.6032 0.6289 0.6634 0.4296 0.486 0.3238 0.1813 0.4622 0.3797 0.5527 0.3271 0.4149 0.6118 0.8315 0.7517 0.6267 0.6937 0.2865 0.2896 0.9006 0.4818 0.9268 0.4733 0.3356 0.3828 0.2036 0.3049 0.4481 0.0763 0.7359 0.8628 0.2995 0.6163 0.0447 0.4269 0.5395 0.6899 0.5739 0.4551 0.633 0.6037 0.5708 0.3163 0.3945 0.0879 0.8006 0.4786 0.4811 0.1632 0.2936 0.6379 0.1031 0.1053 0.3724 0.5651 0.5248 0.2628 0.4058 0.1042 0.0256 0.0266 0.2521 0.2435 0.4723 0.4283 0.0871 0.2738 0.1648 0.6158 0.2012 0.7733 0.3606 0.6625 0.0638 0.5394 0.1756 0.6427 0.2022 0.6149 0.983 0.0279 0.4628 0.8134 0.2265 0.5027 0.6644 0.0747 0.3747 0.1652 0.3247 0.2158 0.4999 0.6672 0.5872 0.4891 0.6893 0.1021 0.3591 0.3998 0.2847 0.3466 0.2873 0.0131 0.2598 0.4104 0.3437 0.1256 0.2751 0.5471 0.3742 0. 
0.4161 0.0302 0.1296 0.1649] 2022-08-23 23:34:46 [INFO] [EVAL] The model with the best validation mIoU (0.3626) was saved at iter 54000. 2022-08-23 23:34:54 [INFO] [TRAIN] epoch: 46, iter: 57050/160000, loss: 0.6046, lr: 0.000779, batch_cost: 0.1603, reader_cost: 0.00452, ips: 49.9132 samples/sec | ETA 04:35:00 2022-08-23 23:35:03 [INFO] [TRAIN] epoch: 46, iter: 57100/160000, loss: 0.5348, lr: 0.000779, batch_cost: 0.1682, reader_cost: 0.00138, ips: 47.5653 samples/sec | ETA 04:48:26 2022-08-23 23:35:11 [INFO] [TRAIN] epoch: 46, iter: 57150/160000, loss: 0.5420, lr: 0.000779, batch_cost: 0.1600, reader_cost: 0.00042, ips: 49.9856 samples/sec | ETA 04:34:20 2022-08-23 23:35:20 [INFO] [TRAIN] epoch: 46, iter: 57200/160000, loss: 0.5411, lr: 0.000778, batch_cost: 0.1888, reader_cost: 0.00060, ips: 42.3633 samples/sec | ETA 05:23:33 2022-08-23 23:35:29 [INFO] [TRAIN] epoch: 46, iter: 57250/160000, loss: 0.5311, lr: 0.000778, batch_cost: 0.1801, reader_cost: 0.00064, ips: 44.4280 samples/sec | ETA 05:08:21 2022-08-23 23:35:38 [INFO] [TRAIN] epoch: 46, iter: 57300/160000, loss: 0.5679, lr: 0.000778, batch_cost: 0.1701, reader_cost: 0.00040, ips: 47.0194 samples/sec | ETA 04:51:13 2022-08-23 23:35:48 [INFO] [TRAIN] epoch: 46, iter: 57350/160000, loss: 0.5668, lr: 0.000777, batch_cost: 0.2037, reader_cost: 0.00055, ips: 39.2772 samples/sec | ETA 05:48:27 2022-08-23 23:35:57 [INFO] [TRAIN] epoch: 46, iter: 57400/160000, loss: 0.5913, lr: 0.000777, batch_cost: 0.1903, reader_cost: 0.00057, ips: 42.0302 samples/sec | ETA 05:25:28 2022-08-23 23:36:07 [INFO] [TRAIN] epoch: 46, iter: 57450/160000, loss: 0.5354, lr: 0.000776, batch_cost: 0.1981, reader_cost: 0.00062, ips: 40.3884 samples/sec | ETA 05:38:32 2022-08-23 23:36:16 [INFO] [TRAIN] epoch: 46, iter: 57500/160000, loss: 0.5671, lr: 0.000776, batch_cost: 0.1828, reader_cost: 0.00055, ips: 43.7549 samples/sec | ETA 05:12:20 2022-08-23 23:36:27 [INFO] [TRAIN] epoch: 46, iter: 57550/160000, loss: 0.5660, lr: 0.000776, batch_cost: 0.2065, reader_cost: 0.00052, ips: 38.7429 samples/sec | ETA 05:52:34 2022-08-23 23:36:36 [INFO] [TRAIN] epoch: 46, iter: 57600/160000, loss: 0.6106, lr: 0.000775, batch_cost: 0.1761, reader_cost: 0.00055, ips: 45.4178 samples/sec | ETA 05:00:36 2022-08-23 23:36:44 [INFO] [TRAIN] epoch: 46, iter: 57650/160000, loss: 0.5958, lr: 0.000775, batch_cost: 0.1623, reader_cost: 0.00056, ips: 49.2822 samples/sec | ETA 04:36:54 2022-08-23 23:36:53 [INFO] [TRAIN] epoch: 46, iter: 57700/160000, loss: 0.5925, lr: 0.000775, batch_cost: 0.1761, reader_cost: 0.00065, ips: 45.4283 samples/sec | ETA 05:00:15 2022-08-23 23:37:01 [INFO] [TRAIN] epoch: 46, iter: 57750/160000, loss: 0.5680, lr: 0.000774, batch_cost: 0.1624, reader_cost: 0.00079, ips: 49.2628 samples/sec | ETA 04:36:44 2022-08-23 23:37:08 [INFO] [TRAIN] epoch: 46, iter: 57800/160000, loss: 0.5586, lr: 0.000774, batch_cost: 0.1529, reader_cost: 0.00067, ips: 52.3124 samples/sec | ETA 04:20:29 2022-08-23 23:37:18 [INFO] [TRAIN] epoch: 46, iter: 57850/160000, loss: 0.5661, lr: 0.000773, batch_cost: 0.2025, reader_cost: 0.00064, ips: 39.5000 samples/sec | ETA 05:44:48 2022-08-23 23:37:28 [INFO] [TRAIN] epoch: 46, iter: 57900/160000, loss: 0.5340, lr: 0.000773, batch_cost: 0.1829, reader_cost: 0.00093, ips: 43.7329 samples/sec | ETA 05:11:17 2022-08-23 23:37:36 [INFO] [TRAIN] epoch: 46, iter: 57950/160000, loss: 0.5664, lr: 0.000773, batch_cost: 0.1604, reader_cost: 0.00070, ips: 49.8607 samples/sec | ETA 04:32:53 2022-08-23 23:37:46 [INFO] [TRAIN] epoch: 46, iter: 58000/160000, loss: 
0.6004, lr: 0.000772, batch_cost: 0.2027, reader_cost: 0.00089, ips: 39.4578 samples/sec | ETA 05:44:40 2022-08-23 23:37:46 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 190s - batch_cost: 0.1903 - reader cost: 8.2164e-04 2022-08-23 23:40:56 [INFO] [EVAL] #Images: 2000 mIoU: 0.3596 Acc: 0.7665 Kappa: 0.7489 Dice: 0.4970 2022-08-23 23:40:56 [INFO] [EVAL] Class IoU: [0.6828 0.7937 0.9264 0.7318 0.6739 0.7446 0.7498 0.7912 0.5095 0.6078 0.5 0.5024 0.6995 0.2559 0.3124 0.4439 0.5549 0.4198 0.5997 0.4124 0.7409 0.4816 0.6102 0.4956 0.3139 0.4926 0.5183 0.4379 0.4033 0.2299 0.2858 0.5012 0.2927 0.3483 0.3768 0.3968 0.4448 0.489 0.2859 0.3716 0.1737 0.1456 0.3389 0.261 0.301 0.2158 0.3911 0.4885 0.6586 0.5569 0.5786 0.3741 0.1433 0.2648 0.6724 0.4007 0.8853 0.3995 0.4363 0.3353 0.1163 0.2219 0.2831 0.1223 0.4445 0.6867 0.1915 0.3872 0.0356 0.3618 0.4842 0.5619 0.4033 0.2209 0.4242 0.3898 0.5037 0.2592 0.5185 0.1569 0.6679 0.3963 0.276 0.0327 0.2389 0.5256 0.129 0.0962 0.2923 0.5088 0.345 0.0614 0.2242 0.0891 0.0449 0.0314 0.2022 0.1092 0.2878 0.2825 0.2533 0.1249 0.2836 0.675 0.1705 0.3091 0.2287 0.5362 0.0879 0.2495 0.1571 0.2344 0.1437 0.4359 0.7292 0.0709 0.4426 0.6464 0.1156 0.3982 0.4445 0.0824 0.2844 0.1472 0.2936 0.1836 0.4815 0.4788 0.3701 0.3586 0.5861 0.0907 0.2125 0.3718 0.2884 0.1912 0.1413 0.035 0.1824 0.3712 0.2528 0.1118 0.2276 0.4049 0.2866 0.0007 0.3615 0.0225 0.1079 0.2092] 2022-08-23 23:40:56 [INFO] [EVAL] Class Precision: [0.7933 0.8731 0.9552 0.8249 0.7531 0.8344 0.8222 0.869 0.6952 0.713 0.6891 0.6884 0.7763 0.5554 0.5457 0.5819 0.7175 0.6846 0.7391 0.6083 0.8048 0.6501 0.7935 0.6596 0.5292 0.5945 0.6295 0.7575 0.7906 0.3598 0.4315 0.6564 0.5119 0.4266 0.4296 0.5312 0.698 0.8353 0.4442 0.5443 0.3139 0.4603 0.4888 0.3835 0.5703 0.4944 0.8364 0.6576 0.7742 0.7272 0.7329 0.4554 0.2884 0.6747 0.7321 0.6885 0.9434 0.5891 0.5866 0.6524 0.1742 0.4111 0.3712 0.6819 0.5192 0.8173 0.3386 0.5274 0.1057 0.5657 0.7181 0.6604 0.6003 0.3155 0.768 0.5369 0.7935 0.7709 0.724 0.56 0.825 0.7561 0.835 0.1601 0.3156 0.6576 0.4543 0.4026 0.77 0.6961 0.4491 0.0817 0.3949 0.3133 0.1127 0.0986 0.5884 0.4591 0.4253 0.6878 0.7222 0.181 0.6099 0.8925 0.6127 0.3241 0.3198 0.769 0.1304 0.4433 0.4455 0.27 0.5751 0.8604 0.7386 0.3143 0.7944 0.7408 0.6905 0.6452 0.7301 0.3502 0.545 0.6187 0.7327 0.551 0.7972 0.6273 0.842 0.5773 0.746 0.4657 0.3336 0.7347 0.5774 0.3976 0.3242 0.0575 0.3716 0.6014 0.5206 0.374 0.5356 0.505 0.586 0.0015 0.9073 0.3095 0.3581 0.6258] 2022-08-23 23:40:56 [INFO] [EVAL] Class Recall: [0.8306 0.8972 0.9685 0.8664 0.8651 0.8738 0.8949 0.8984 0.6561 0.8047 0.6457 0.6503 0.8761 0.3218 0.4221 0.6517 0.71 0.5205 0.7608 0.5615 0.9033 0.6501 0.7253 0.6659 0.4355 0.7418 0.7458 0.5093 0.4516 0.389 0.4583 0.6795 0.406 0.655 0.7537 0.6107 0.5509 0.5412 0.4451 0.5394 0.2801 0.1756 0.525 0.4499 0.3893 0.2769 0.4235 0.655 0.8152 0.704 0.7332 0.6768 0.2216 0.3035 0.8918 0.4894 0.935 0.5538 0.63 0.4083 0.259 0.3252 0.5441 0.1296 0.7556 0.8113 0.306 0.5929 0.051 0.5009 0.5979 0.7903 0.5513 0.4243 0.4865 0.5872 0.5797 0.2809 0.6463 0.1789 0.778 0.4544 0.2919 0.0394 0.4958 0.7237 0.1526 0.1123 0.3203 0.654 0.5982 0.198 0.3415 0.1107 0.0695 0.044 0.2355 0.1254 0.4709 0.3241 0.2806 0.2873 0.3465 0.7348 0.1911 0.8693 0.4451 0.6391 0.2122 0.3632 0.1953 0.6401 0.1607 0.4691 0.9829 0.0838 0.4998 0.8353 0.1219 0.5099 0.5319 0.0973 0.3729 0.1618 0.3288 0.2159 0.5487 0.6691 0.3977 0.4863 0.7322 0.1012 0.3693 0.4295 0.3656 0.2691 0.2003 0.0821 0.2638 0.4924 0.3295 0.1376 0.2835 
0.6714 0.3593 0.0012 0.3753 0.0237 0.1338 0.2392] 2022-08-23 23:40:57 [INFO] [EVAL] The model with the best validation mIoU (0.3626) was saved at iter 54000. 2022-08-23 23:41:06 [INFO] [TRAIN] epoch: 46, iter: 58050/160000, loss: 0.5632, lr: 0.000772, batch_cost: 0.1988, reader_cost: 0.00423, ips: 40.2432 samples/sec | ETA 05:37:46 2022-08-23 23:41:23 [INFO] [TRAIN] epoch: 47, iter: 58100/160000, loss: 0.5990, lr: 0.000771, batch_cost: 0.3210, reader_cost: 0.10173, ips: 24.9249 samples/sec | ETA 09:05:06 2022-08-23 23:41:32 [INFO] [TRAIN] epoch: 47, iter: 58150/160000, loss: 0.5501, lr: 0.000771, batch_cost: 0.1853, reader_cost: 0.00121, ips: 43.1621 samples/sec | ETA 05:14:37 2022-08-23 23:41:40 [INFO] [TRAIN] epoch: 47, iter: 58200/160000, loss: 0.5772, lr: 0.000771, batch_cost: 0.1591, reader_cost: 0.00078, ips: 50.2926 samples/sec | ETA 04:29:53 2022-08-23 23:41:49 [INFO] [TRAIN] epoch: 47, iter: 58250/160000, loss: 0.5039, lr: 0.000770, batch_cost: 0.1783, reader_cost: 0.00054, ips: 44.8584 samples/sec | ETA 05:02:25 2022-08-23 23:41:57 [INFO] [TRAIN] epoch: 47, iter: 58300/160000, loss: 0.5421, lr: 0.000770, batch_cost: 0.1667, reader_cost: 0.00222, ips: 47.9911 samples/sec | ETA 04:42:33 2022-08-23 23:42:06 [INFO] [TRAIN] epoch: 47, iter: 58350/160000, loss: 0.5339, lr: 0.000770, batch_cost: 0.1825, reader_cost: 0.00032, ips: 43.8475 samples/sec | ETA 05:09:06 2022-08-23 23:42:14 [INFO] [TRAIN] epoch: 47, iter: 58400/160000, loss: 0.5755, lr: 0.000769, batch_cost: 0.1532, reader_cost: 0.00092, ips: 52.2356 samples/sec | ETA 04:19:20 2022-08-23 23:42:22 [INFO] [TRAIN] epoch: 47, iter: 58450/160000, loss: 0.5374, lr: 0.000769, batch_cost: 0.1694, reader_cost: 0.00062, ips: 47.2254 samples/sec | ETA 04:46:42 2022-08-23 23:42:31 [INFO] [TRAIN] epoch: 47, iter: 58500/160000, loss: 0.5364, lr: 0.000768, batch_cost: 0.1782, reader_cost: 0.00047, ips: 44.8987 samples/sec | ETA 05:01:25 2022-08-23 23:42:40 [INFO] [TRAIN] epoch: 47, iter: 58550/160000, loss: 0.5799, lr: 0.000768, batch_cost: 0.1843, reader_cost: 0.00271, ips: 43.4162 samples/sec | ETA 05:11:33 2022-08-23 23:42:49 [INFO] [TRAIN] epoch: 47, iter: 58600/160000, loss: 0.5122, lr: 0.000768, batch_cost: 0.1689, reader_cost: 0.00076, ips: 47.3640 samples/sec | ETA 04:45:26 2022-08-23 23:42:59 [INFO] [TRAIN] epoch: 47, iter: 58650/160000, loss: 0.5663, lr: 0.000767, batch_cost: 0.2072, reader_cost: 0.00127, ips: 38.6190 samples/sec | ETA 05:49:54 2022-08-23 23:43:09 [INFO] [TRAIN] epoch: 47, iter: 58700/160000, loss: 0.5839, lr: 0.000767, batch_cost: 0.1920, reader_cost: 0.00036, ips: 41.6714 samples/sec | ETA 05:24:07 2022-08-23 23:43:18 [INFO] [TRAIN] epoch: 47, iter: 58750/160000, loss: 0.5215, lr: 0.000767, batch_cost: 0.1778, reader_cost: 0.00048, ips: 44.9877 samples/sec | ETA 05:00:04 2022-08-23 23:43:26 [INFO] [TRAIN] epoch: 47, iter: 58800/160000, loss: 0.5487, lr: 0.000766, batch_cost: 0.1641, reader_cost: 0.00044, ips: 48.7502 samples/sec | ETA 04:36:47 2022-08-23 23:43:35 [INFO] [TRAIN] epoch: 47, iter: 58850/160000, loss: 0.5386, lr: 0.000766, batch_cost: 0.1767, reader_cost: 0.00042, ips: 45.2689 samples/sec | ETA 04:57:55 2022-08-23 23:43:42 [INFO] [TRAIN] epoch: 47, iter: 58900/160000, loss: 0.5934, lr: 0.000765, batch_cost: 0.1539, reader_cost: 0.00055, ips: 51.9911 samples/sec | ETA 04:19:16 2022-08-23 23:43:50 [INFO] [TRAIN] epoch: 47, iter: 58950/160000, loss: 0.5458, lr: 0.000765, batch_cost: 0.1561, reader_cost: 0.00067, ips: 51.2422 samples/sec | ETA 04:22:56 2022-08-23 23:43:59 [INFO] [TRAIN] epoch: 47, iter: 
59000/160000, loss: 0.5845, lr: 0.000765, batch_cost: 0.1766, reader_cost: 0.00469, ips: 45.3038 samples/sec | ETA 04:57:15 2022-08-23 23:43:59 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 186s - batch_cost: 0.1862 - reader cost: 8.3807e-04 2022-08-23 23:47:06 [INFO] [EVAL] #Images: 2000 mIoU: 0.3661 Acc: 0.7678 Kappa: 0.7501 Dice: 0.5028 2022-08-23 23:47:06 [INFO] [EVAL] Class IoU: [0.6851 0.78 0.9311 0.7329 0.6793 0.7546 0.7794 0.7928 0.5227 0.6001 0.4932 0.5662 0.7074 0.2874 0.3338 0.4307 0.5327 0.4296 0.6012 0.419 0.7725 0.4391 0.6024 0.4871 0.3123 0.3339 0.5126 0.4354 0.4068 0.2345 0.2552 0.5072 0.308 0.3659 0.3827 0.4169 0.4613 0.5502 0.2996 0.3842 0.1537 0.1645 0.3595 0.2508 0.2681 0.2504 0.3434 0.5046 0.5167 0.5023 0.5004 0.3561 0.2296 0.2412 0.6847 0.3444 0.879 0.4121 0.4737 0.3094 0.0914 0.1949 0.2869 0.1085 0.4939 0.7112 0.1948 0.3901 0.0626 0.3588 0.4928 0.5331 0.3926 0.1971 0.4845 0.3751 0.5797 0.2611 0.1241 0.1562 0.6654 0.3994 0.4054 0.073 0.113 0.5169 0.0972 0.1168 0.3013 0.5096 0.4761 0.2349 0.1909 0.111 0.0011 0.0232 0.1742 0.2166 0.2728 0.2734 0.1415 0.1305 0.3124 0.5449 0.1675 0.6934 0.2865 0.5891 0.1237 0.4769 0.1809 0.4143 0.1675 0.6232 0.669 0.0486 0.4183 0.6434 0.2207 0.3312 0.4922 0.0452 0.3154 0.1063 0.2713 0.1827 0.4948 0.5011 0.6624 0.3627 0.6639 0.0868 0.2705 0.3456 0.3002 0.1774 0.1545 0.0317 0.1887 0.364 0.1333 0.0698 0.267 0.38 0.2206 0. 0.3944 0.0164 0.1036 0.2033] 2022-08-23 23:47:06 [INFO] [EVAL] Class Precision: [0.7891 0.8432 0.9588 0.8203 0.781 0.887 0.9023 0.8619 0.6686 0.7415 0.67 0.6968 0.8074 0.5436 0.5432 0.5807 0.676 0.6593 0.7319 0.6135 0.8547 0.7376 0.7581 0.6277 0.4238 0.6144 0.543 0.6976 0.7122 0.3285 0.4241 0.6188 0.4424 0.4806 0.475 0.5226 0.6515 0.7894 0.4812 0.5716 0.3845 0.3985 0.5719 0.5246 0.3681 0.435 0.6796 0.7441 0.6593 0.5898 0.7199 0.4532 0.4263 0.5591 0.7347 0.7296 0.9497 0.6098 0.5884 0.5512 0.1584 0.559 0.3819 0.6159 0.651 0.8347 0.3062 0.5936 0.1255 0.5657 0.6353 0.8127 0.5573 0.2631 0.725 0.559 0.7696 0.7428 0.7278 0.3767 0.8301 0.7186 0.7467 0.1977 0.2516 0.7446 0.52 0.4147 0.6236 0.7013 0.6803 0.676 0.2689 0.3083 0.0034 0.1379 0.7244 0.4155 0.424 0.8765 0.5354 0.3115 0.556 0.823 0.5309 0.8747 0.6019 0.8273 0.231 0.5806 0.3924 0.5755 0.5495 0.6892 0.6769 0.2216 0.6393 0.7591 0.2636 0.4948 0.6195 0.2198 0.6315 0.6998 0.7852 0.611 0.7963 0.6623 0.8575 0.559 0.8043 0.5131 0.4966 0.8427 0.6611 0.3074 0.4633 0.0784 0.6009 0.6972 0.46 0.3107 0.6892 0.7356 0.6198 0. 
0.8339 0.2792 0.2658 0.7762] 2022-08-23 23:47:06 [INFO] [EVAL] Class Recall: [0.8387 0.9123 0.9699 0.8731 0.8391 0.8349 0.8512 0.9081 0.7054 0.7588 0.6514 0.7514 0.851 0.3788 0.4641 0.6252 0.7153 0.5521 0.771 0.5692 0.8892 0.5204 0.7458 0.6851 0.5429 0.4224 0.9014 0.5367 0.4869 0.4504 0.3904 0.7376 0.5035 0.6053 0.6632 0.6731 0.6125 0.6449 0.4425 0.5395 0.2039 0.2189 0.4918 0.3246 0.4967 0.3711 0.4097 0.6106 0.7049 0.7718 0.6214 0.6242 0.3322 0.2978 0.9097 0.3948 0.922 0.5596 0.7084 0.4135 0.1775 0.2303 0.5357 0.1164 0.6718 0.8277 0.3486 0.5322 0.111 0.4952 0.6873 0.6078 0.5707 0.4399 0.5935 0.5327 0.7014 0.2871 0.1302 0.2106 0.7703 0.4735 0.47 0.1037 0.1702 0.6283 0.1068 0.1398 0.3682 0.651 0.6133 0.2647 0.397 0.1478 0.0017 0.0271 0.1866 0.3115 0.4335 0.2844 0.1612 0.1834 0.4162 0.6173 0.1966 0.7698 0.3535 0.6716 0.2104 0.7277 0.2512 0.5967 0.1941 0.8668 0.9829 0.0586 0.5475 0.8084 0.5759 0.5005 0.7056 0.0538 0.3865 0.1114 0.2931 0.2068 0.5665 0.673 0.7443 0.508 0.7918 0.0946 0.3726 0.3695 0.3547 0.2955 0.1881 0.0506 0.2157 0.4324 0.158 0.0826 0.3036 0.4401 0.2551 0. 0.4281 0.0171 0.1451 0.216 ] 2022-08-23 23:47:06 [INFO] [EVAL] The model with the best validation mIoU (0.3661) was saved at iter 59000. 2022-08-23 23:47:14 [INFO] [TRAIN] epoch: 47, iter: 59050/160000, loss: 0.5507, lr: 0.000764, batch_cost: 0.1593, reader_cost: 0.00336, ips: 50.2152 samples/sec | ETA 04:28:02 2022-08-23 23:47:24 [INFO] [TRAIN] epoch: 47, iter: 59100/160000, loss: 0.5523, lr: 0.000764, batch_cost: 0.1997, reader_cost: 0.00079, ips: 40.0627 samples/sec | ETA 05:35:48 2022-08-23 23:47:33 [INFO] [TRAIN] epoch: 47, iter: 59150/160000, loss: 0.5708, lr: 0.000764, batch_cost: 0.1873, reader_cost: 0.00071, ips: 42.7014 samples/sec | ETA 05:14:53 2022-08-23 23:47:43 [INFO] [TRAIN] epoch: 47, iter: 59200/160000, loss: 0.5525, lr: 0.000763, batch_cost: 0.1912, reader_cost: 0.00065, ips: 41.8447 samples/sec | ETA 05:21:11 2022-08-23 23:47:52 [INFO] [TRAIN] epoch: 47, iter: 59250/160000, loss: 0.5856, lr: 0.000763, batch_cost: 0.1818, reader_cost: 0.00070, ips: 43.9949 samples/sec | ETA 05:05:20 2022-08-23 23:48:01 [INFO] [TRAIN] epoch: 47, iter: 59300/160000, loss: 0.5671, lr: 0.000762, batch_cost: 0.1800, reader_cost: 0.00157, ips: 44.4432 samples/sec | ETA 05:02:06 2022-08-23 23:48:11 [INFO] [TRAIN] epoch: 47, iter: 59350/160000, loss: 0.5557, lr: 0.000762, batch_cost: 0.1965, reader_cost: 0.00033, ips: 40.7218 samples/sec | ETA 05:29:33 2022-08-23 23:48:25 [INFO] [TRAIN] epoch: 48, iter: 59400/160000, loss: 0.5446, lr: 0.000762, batch_cost: 0.2875, reader_cost: 0.08886, ips: 27.8273 samples/sec | ETA 08:02:01 2022-08-23 23:48:35 [INFO] [TRAIN] epoch: 48, iter: 59450/160000, loss: 0.5675, lr: 0.000761, batch_cost: 0.1891, reader_cost: 0.00072, ips: 42.3130 samples/sec | ETA 05:16:50 2022-08-23 23:48:44 [INFO] [TRAIN] epoch: 48, iter: 59500/160000, loss: 0.5731, lr: 0.000761, batch_cost: 0.1847, reader_cost: 0.00069, ips: 43.3025 samples/sec | ETA 05:09:27 2022-08-23 23:48:52 [INFO] [TRAIN] epoch: 48, iter: 59550/160000, loss: 0.5356, lr: 0.000761, batch_cost: 0.1713, reader_cost: 0.00047, ips: 46.7102 samples/sec | ETA 04:46:43 2022-08-23 23:49:02 [INFO] [TRAIN] epoch: 48, iter: 59600/160000, loss: 0.6108, lr: 0.000760, batch_cost: 0.1877, reader_cost: 0.00059, ips: 42.6186 samples/sec | ETA 05:14:06 2022-08-23 23:49:11 [INFO] [TRAIN] epoch: 48, iter: 59650/160000, loss: 0.5672, lr: 0.000760, batch_cost: 0.1767, reader_cost: 0.00055, ips: 45.2779 samples/sec | ETA 04:55:30 2022-08-23 23:49:20 [INFO] [TRAIN] epoch: 
48, iter: 59700/160000, loss: 0.5421, lr: 0.000759, batch_cost: 0.1864, reader_cost: 0.00053, ips: 42.9177 samples/sec | ETA 05:11:36 2022-08-23 23:49:29 [INFO] [TRAIN] epoch: 48, iter: 59750/160000, loss: 0.5208, lr: 0.000759, batch_cost: 0.1784, reader_cost: 0.00056, ips: 44.8381 samples/sec | ETA 04:58:06 2022-08-23 23:49:38 [INFO] [TRAIN] epoch: 48, iter: 59800/160000, loss: 0.5337, lr: 0.000759, batch_cost: 0.1784, reader_cost: 0.00047, ips: 44.8461 samples/sec | ETA 04:57:54 2022-08-23 23:49:46 [INFO] [TRAIN] epoch: 48, iter: 59850/160000, loss: 0.5557, lr: 0.000758, batch_cost: 0.1743, reader_cost: 0.00063, ips: 45.8857 samples/sec | ETA 04:51:00 2022-08-23 23:49:56 [INFO] [TRAIN] epoch: 48, iter: 59900/160000, loss: 0.5648, lr: 0.000758, batch_cost: 0.1864, reader_cost: 0.00050, ips: 42.9088 samples/sec | ETA 05:11:02 2022-08-23 23:50:05 [INFO] [TRAIN] epoch: 48, iter: 59950/160000, loss: 0.5545, lr: 0.000757, batch_cost: 0.1751, reader_cost: 0.00140, ips: 45.6999 samples/sec | ETA 04:51:54 2022-08-23 23:50:13 [INFO] [TRAIN] epoch: 48, iter: 60000/160000, loss: 0.5150, lr: 0.000757, batch_cost: 0.1625, reader_cost: 0.01343, ips: 49.2220 samples/sec | ETA 04:30:52 2022-08-23 23:50:13 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 194s - batch_cost: 0.1944 - reader cost: 7.4477e-04 2022-08-23 23:53:27 [INFO] [EVAL] #Images: 2000 mIoU: 0.3615 Acc: 0.7679 Kappa: 0.7501 Dice: 0.4975 2022-08-23 23:53:27 [INFO] [EVAL] Class IoU: [0.6841 0.7848 0.9315 0.7389 0.6948 0.7488 0.774 0.788 0.5307 0.6159 0.4808 0.5534 0.7073 0.2769 0.2908 0.4326 0.488 0.4623 0.6111 0.4166 0.7552 0.4735 0.6277 0.4964 0.3126 0.4509 0.5581 0.4202 0.4796 0.1993 0.2662 0.5221 0.3124 0.317 0.371 0.4074 0.4522 0.5276 0.2654 0.3496 0.0996 0.1988 0.3726 0.2612 0.2817 0.2166 0.407 0.4925 0.6506 0.5475 0.5155 0.3695 0.2213 0.147 0.6695 0.3631 0.8764 0.4111 0.4901 0.2561 0.0979 0.2257 0.2876 0.2543 0.4049 0.7144 0.2274 0.4049 0.0455 0.343 0.4793 0.5401 0.4109 0.2191 0.4602 0.3792 0.4768 0.2681 0.3555 0.1083 0.5504 0.4 0.3809 0.1284 0.1452 0.5033 0.1079 0.1055 0.2481 0.4806 0.4478 0.0116 0.1644 0.0901 0.0255 0.0308 0.1914 0.1611 0.28 0.339 0.1682 0.1457 0.2597 0.5855 0.1918 0.5665 0.1536 0.4911 0.0631 0.2689 0.1253 0.4188 0.1702 0.5718 0.6779 0.0625 0.4554 0.678 0.1354 0.3506 0.4631 0.0677 0.3652 0.1787 0.2776 0.2123 0.5004 0.488 0.6251 0.3499 0.5845 0.0719 0.2899 0.3373 0.2094 0.1715 0.1297 0.0284 0.1758 0.3614 0.176 0. 0.2865 0.4542 0.2835 0. 0.3644 0.0294 0.1134 0.1774] 2022-08-23 23:53:27 [INFO] [EVAL] Class Precision: [0.7666 0.8762 0.9629 0.8524 0.8054 0.8941 0.892 0.8496 0.7085 0.7221 0.671 0.6826 0.7825 0.4757 0.5993 0.6133 0.6735 0.625 0.7702 0.6404 0.814 0.7523 0.7867 0.5935 0.4794 0.5359 0.6263 0.7132 0.6524 0.324 0.4538 0.6483 0.4991 0.4453 0.5057 0.5059 0.6341 0.7534 0.4405 0.617 0.3637 0.3235 0.6802 0.5485 0.4172 0.3843 0.6474 0.7231 0.7389 0.687 0.6636 0.4983 0.4274 0.5364 0.7253 0.8054 0.9394 0.6425 0.6344 0.3771 0.1436 0.4651 0.4098 0.4925 0.4549 0.8673 0.4235 0.5556 0.1002 0.5079 0.6929 0.7009 0.5916 0.2921 0.7467 0.5315 0.6462 0.5348 0.7065 0.4674 0.6456 0.6628 0.7646 0.3265 0.2266 0.7392 0.6392 0.3965 0.5752 0.6716 0.7076 0.0288 0.2543 0.3622 0.0576 0.088 0.7994 0.3828 0.4364 0.5025 0.7495 0.2606 0.5513 0.6336 0.8919 0.7557 0.3034 0.752 0.1493 0.4308 0.3727 0.5529 0.5796 0.6153 0.6904 0.5072 0.7431 0.7789 0.1752 0.6248 0.7542 0.2714 0.8259 0.5407 0.703 0.6264 0.7575 0.6142 0.7571 0.4825 0.6259 0.4558 0.6342 0.7625 0.8423 0.4666 0.3601 0.0665 0.4467 0.5921 0.2106 0. 
0.5755 0.6098 0.5256 0. 0.8171 0.2126 0.3621 0.8071] 2022-08-23 23:53:27 [INFO] [EVAL] Class Recall: [0.864 0.8827 0.9661 0.8473 0.835 0.8217 0.854 0.9158 0.6789 0.8072 0.6291 0.7452 0.8804 0.3985 0.361 0.5948 0.6393 0.6398 0.7473 0.5437 0.9128 0.561 0.7564 0.7521 0.4732 0.7398 0.8367 0.5056 0.6442 0.3411 0.3917 0.7283 0.4551 0.5239 0.5821 0.6764 0.6119 0.6378 0.4003 0.4464 0.1206 0.3403 0.4518 0.3328 0.4644 0.3317 0.523 0.607 0.8449 0.7294 0.698 0.5884 0.3146 0.1684 0.897 0.3981 0.929 0.533 0.6829 0.4438 0.2353 0.3048 0.491 0.3447 0.7867 0.8021 0.3292 0.5987 0.0768 0.5137 0.6086 0.7019 0.5737 0.4669 0.5454 0.5695 0.6452 0.3496 0.4171 0.1236 0.7887 0.5023 0.4315 0.1746 0.2879 0.612 0.115 0.1257 0.3037 0.6282 0.5494 0.019 0.3173 0.1071 0.0437 0.0452 0.2011 0.2176 0.4386 0.5104 0.1783 0.2484 0.3294 0.8852 0.1964 0.6936 0.2373 0.586 0.0984 0.4171 0.1587 0.6332 0.1942 0.8899 0.974 0.0665 0.5405 0.8396 0.3734 0.4441 0.5454 0.0827 0.3957 0.2107 0.3145 0.2431 0.5959 0.7037 0.782 0.5601 0.8984 0.0787 0.348 0.3769 0.218 0.2133 0.1686 0.0473 0.2247 0.4812 0.5166 0. 0.3633 0.6402 0.381 0. 0.3968 0.0331 0.1417 0.1852] 2022-08-23 23:53:27 [INFO] [EVAL] The model with the best validation mIoU (0.3661) was saved at iter 59000. 2022-08-23 23:53:36 [INFO] [TRAIN] epoch: 48, iter: 60050/160000, loss: 0.5658, lr: 0.000757, batch_cost: 0.1646, reader_cost: 0.00365, ips: 48.6096 samples/sec | ETA 04:34:09 2022-08-23 23:53:44 [INFO] [TRAIN] epoch: 48, iter: 60100/160000, loss: 0.5193, lr: 0.000756, batch_cost: 0.1737, reader_cost: 0.00133, ips: 46.0584 samples/sec | ETA 04:49:11 2022-08-23 23:53:54 [INFO] [TRAIN] epoch: 48, iter: 60150/160000, loss: 0.5646, lr: 0.000756, batch_cost: 0.1932, reader_cost: 0.00040, ips: 41.4177 samples/sec | ETA 05:21:26 2022-08-23 23:54:03 [INFO] [TRAIN] epoch: 48, iter: 60200/160000, loss: 0.5727, lr: 0.000756, batch_cost: 0.1852, reader_cost: 0.00113, ips: 43.2055 samples/sec | ETA 05:07:59 2022-08-23 23:54:13 [INFO] [TRAIN] epoch: 48, iter: 60250/160000, loss: 0.5366, lr: 0.000755, batch_cost: 0.1923, reader_cost: 0.00092, ips: 41.6103 samples/sec | ETA 05:19:37 2022-08-23 23:54:23 [INFO] [TRAIN] epoch: 48, iter: 60300/160000, loss: 0.5177, lr: 0.000755, batch_cost: 0.1950, reader_cost: 0.00104, ips: 41.0201 samples/sec | ETA 05:24:04 2022-08-23 23:54:32 [INFO] [TRAIN] epoch: 48, iter: 60350/160000, loss: 0.5678, lr: 0.000754, batch_cost: 0.1859, reader_cost: 0.00054, ips: 43.0346 samples/sec | ETA 05:08:44 2022-08-23 23:54:43 [INFO] [TRAIN] epoch: 48, iter: 60400/160000, loss: 0.6079, lr: 0.000754, batch_cost: 0.2119, reader_cost: 0.00079, ips: 37.7450 samples/sec | ETA 05:51:50 2022-08-23 23:54:52 [INFO] [TRAIN] epoch: 48, iter: 60450/160000, loss: 0.6238, lr: 0.000754, batch_cost: 0.1894, reader_cost: 0.00106, ips: 42.2320 samples/sec | ETA 05:14:17 2022-08-23 23:55:01 [INFO] [TRAIN] epoch: 48, iter: 60500/160000, loss: 0.5634, lr: 0.000753, batch_cost: 0.1731, reader_cost: 0.00102, ips: 46.2085 samples/sec | ETA 04:47:06 2022-08-23 23:55:09 [INFO] [TRAIN] epoch: 48, iter: 60550/160000, loss: 0.6003, lr: 0.000753, batch_cost: 0.1704, reader_cost: 0.00042, ips: 46.9511 samples/sec | ETA 04:42:25 2022-08-23 23:55:19 [INFO] [TRAIN] epoch: 48, iter: 60600/160000, loss: 0.5419, lr: 0.000753, batch_cost: 0.1896, reader_cost: 0.00040, ips: 42.1913 samples/sec | ETA 05:14:07 2022-08-23 23:55:33 [INFO] [TRAIN] epoch: 49, iter: 60650/160000, loss: 0.5296, lr: 0.000752, batch_cost: 0.2879, reader_cost: 0.08487, ips: 27.7837 samples/sec | ETA 07:56:46 2022-08-23 23:55:41 [INFO] [TRAIN] 
epoch: 49, iter: 60700/160000, loss: 0.5468, lr: 0.000752, batch_cost: 0.1606, reader_cost: 0.00045, ips: 49.8108 samples/sec | ETA 04:25:48 2022-08-23 23:55:49 [INFO] [TRAIN] epoch: 49, iter: 60750/160000, loss: 0.5651, lr: 0.000751, batch_cost: 0.1640, reader_cost: 0.00064, ips: 48.7873 samples/sec | ETA 04:31:14 2022-08-23 23:55:59 [INFO] [TRAIN] epoch: 49, iter: 60800/160000, loss: 0.5594, lr: 0.000751, batch_cost: 0.1850, reader_cost: 0.00102, ips: 43.2449 samples/sec | ETA 05:05:51 2022-08-23 23:56:09 [INFO] [TRAIN] epoch: 49, iter: 60850/160000, loss: 0.5634, lr: 0.000751, batch_cost: 0.2121, reader_cost: 0.00045, ips: 37.7221 samples/sec | ETA 05:50:27 2022-08-23 23:56:18 [INFO] [TRAIN] epoch: 49, iter: 60900/160000, loss: 0.5858, lr: 0.000750, batch_cost: 0.1660, reader_cost: 0.00200, ips: 48.1833 samples/sec | ETA 04:34:13 2022-08-23 23:56:26 [INFO] [TRAIN] epoch: 49, iter: 60950/160000, loss: 0.5565, lr: 0.000750, batch_cost: 0.1741, reader_cost: 0.00046, ips: 45.9385 samples/sec | ETA 04:47:29 2022-08-23 23:56:36 [INFO] [TRAIN] epoch: 49, iter: 61000/160000, loss: 0.5181, lr: 0.000750, batch_cost: 0.1930, reader_cost: 0.00060, ips: 41.4472 samples/sec | ETA 05:18:28 2022-08-23 23:56:36 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 185s - batch_cost: 0.1851 - reader cost: 7.7958e-04 2022-08-23 23:59:41 [INFO] [EVAL] #Images: 2000 mIoU: 0.3614 Acc: 0.7695 Kappa: 0.7518 Dice: 0.4980 2022-08-23 23:59:41 [INFO] [EVAL] Class IoU: [0.6879 0.7859 0.9311 0.739 0.6747 0.7584 0.7853 0.7975 0.5283 0.6348 0.4827 0.565 0.7086 0.2908 0.3206 0.437 0.4962 0.45 0.6092 0.3994 0.7603 0.4743 0.6125 0.5081 0.327 0.3659 0.5432 0.4307 0.456 0.2399 0.2687 0.4838 0.265 0.3461 0.403 0.3564 0.4668 0.5635 0.2719 0.3384 0.1473 0.1573 0.3566 0.2682 0.2382 0.2023 0.3235 0.5091 0.6357 0.526 0.5036 0.3229 0.2376 0.2268 0.6556 0.3319 0.8686 0.4157 0.4561 0.2913 0.1264 0.2481 0.2718 0.1356 0.4424 0.7224 0.2278 0.3978 0.068 0.3477 0.5007 0.5639 0.3945 0.2412 0.4652 0.3575 0.5932 0.2588 0.2295 0.082 0.6287 0.3613 0.3407 0.1217 0.1522 0.5301 0.1142 0.1221 0.3342 0.5311 0.4193 0.1364 0.2188 0.066 0.0027 0.0439 0.2057 0.2106 0.2862 0.3286 0.1692 0.1047 0.2661 0.5573 0.1883 0.4657 0.1619 0.5661 0.0602 0.3634 0.1126 0.3947 0.1649 0.4932 0.7093 0.0504 0.4808 0.7042 0.1421 0.3356 0.4588 0.0663 0.2411 0.1281 0.2503 0.2193 0.5216 0.4783 0.5642 0.3876 0.5608 0.0649 0.2605 0.3845 0.3029 0.1817 0.1541 0.0244 0.1765 0.3339 0.4527 0.1245 0.2991 0.1594 0.1301 0.0025 0.4036 0.0208 0.0977 0.2071] 2022-08-23 23:59:41 [INFO] [EVAL] Class Precision: [0.7814 0.8608 0.9582 0.8379 0.7711 0.8676 0.8746 0.8619 0.6981 0.7147 0.6853 0.7483 0.7897 0.4723 0.5305 0.6195 0.7323 0.6845 0.7708 0.6329 0.8283 0.7044 0.7511 0.6162 0.4308 0.5844 0.6263 0.7208 0.7088 0.4329 0.4011 0.646 0.5514 0.4513 0.5201 0.577 0.6688 0.7993 0.375 0.6163 0.3031 0.3859 0.5493 0.4822 0.3027 0.4558 0.5524 0.8027 0.7326 0.7563 0.6779 0.3867 0.4354 0.6499 0.6896 0.6997 0.9119 0.6484 0.68 0.5975 0.1803 0.4414 0.3774 0.6239 0.5238 0.8318 0.4763 0.5475 0.1991 0.5255 0.6837 0.6865 0.587 0.3114 0.7284 0.469 0.7298 0.6771 0.6889 0.5578 0.7539 0.7462 0.7835 0.3135 0.248 0.7043 0.5741 0.3563 0.7187 0.7313 0.6292 0.2018 0.3875 0.3022 0.0049 0.1168 0.6418 0.4256 0.43 0.7157 0.7173 0.1436 0.5115 0.784 0.8149 0.5204 0.3085 0.8159 0.2081 0.5698 0.5628 0.5174 0.5604 0.6024 0.7275 0.4564 0.7923 0.8007 0.183 0.527 0.7401 0.2158 0.6468 0.6627 0.762 0.5877 0.8766 0.6266 0.9164 0.7477 0.6844 0.2751 0.4996 0.7516 0.7224 0.4068 0.3714 0.0927 0.3741 0.7504 
0.6113 0.4554 0.5889 0.4335 0.6932 0.0045 0.8519 0.337 0.2695 0.8741] 2022-08-23 23:59:41 [INFO] [EVAL] Class Recall: [0.8518 0.9004 0.9706 0.8624 0.8437 0.8576 0.885 0.9144 0.6848 0.8504 0.6201 0.6976 0.8734 0.4309 0.4477 0.5974 0.6062 0.5678 0.7439 0.5198 0.9027 0.5922 0.7684 0.7434 0.5756 0.4947 0.8038 0.517 0.5611 0.3499 0.4489 0.6584 0.3379 0.5976 0.6415 0.4824 0.6072 0.6563 0.4973 0.4288 0.2227 0.2098 0.5041 0.3768 0.5277 0.2668 0.4384 0.5819 0.8277 0.6333 0.6621 0.6618 0.3435 0.2583 0.9302 0.3871 0.9481 0.5367 0.5808 0.3624 0.2971 0.3617 0.4927 0.1477 0.7401 0.846 0.3038 0.5927 0.0936 0.5068 0.6516 0.7594 0.5461 0.5167 0.5627 0.6004 0.7601 0.2953 0.256 0.0877 0.791 0.4119 0.3761 0.1658 0.2826 0.6819 0.1248 0.1567 0.3845 0.6598 0.5568 0.2962 0.3346 0.0778 0.0058 0.0658 0.2324 0.2942 0.4612 0.378 0.1813 0.2787 0.3568 0.6584 0.1967 0.8161 0.2541 0.649 0.0781 0.5008 0.1234 0.6245 0.1893 0.7312 0.966 0.0537 0.5501 0.8538 0.3884 0.4802 0.5469 0.0874 0.2777 0.137 0.2716 0.2591 0.5629 0.6689 0.5947 0.446 0.7564 0.0783 0.3524 0.4405 0.3428 0.2473 0.2085 0.0321 0.2505 0.3756 0.6356 0.1463 0.378 0.2014 0.1381 0.0054 0.4341 0.0217 0.133 0.2134] 2022-08-23 23:59:41 [INFO] [EVAL] The model with the best validation mIoU (0.3661) was saved at iter 59000. 2022-08-23 23:59:50 [INFO] [TRAIN] epoch: 49, iter: 61050/160000, loss: 0.5682, lr: 0.000749, batch_cost: 0.1722, reader_cost: 0.00629, ips: 46.4604 samples/sec | ETA 04:43:58 2022-08-23 23:59:58 [INFO] [TRAIN] epoch: 49, iter: 61100/160000, loss: 0.5593, lr: 0.000749, batch_cost: 0.1604, reader_cost: 0.00100, ips: 49.8868 samples/sec | ETA 04:24:19 2022-08-24 00:00:07 [INFO] [TRAIN] epoch: 49, iter: 61150/160000, loss: 0.5522, lr: 0.000748, batch_cost: 0.1706, reader_cost: 0.00129, ips: 46.9019 samples/sec | ETA 04:41:00 2022-08-24 00:00:15 [INFO] [TRAIN] epoch: 49, iter: 61200/160000, loss: 0.5383, lr: 0.000748, batch_cost: 0.1639, reader_cost: 0.00084, ips: 48.8011 samples/sec | ETA 04:29:56 2022-08-24 00:00:24 [INFO] [TRAIN] epoch: 49, iter: 61250/160000, loss: 0.5921, lr: 0.000748, batch_cost: 0.1813, reader_cost: 0.00042, ips: 44.1227 samples/sec | ETA 04:58:24 2022-08-24 00:00:32 [INFO] [TRAIN] epoch: 49, iter: 61300/160000, loss: 0.5343, lr: 0.000747, batch_cost: 0.1687, reader_cost: 0.00113, ips: 47.4254 samples/sec | ETA 04:37:29 2022-08-24 00:00:40 [INFO] [TRAIN] epoch: 49, iter: 61350/160000, loss: 0.5472, lr: 0.000747, batch_cost: 0.1532, reader_cost: 0.00142, ips: 52.2083 samples/sec | ETA 04:11:56 2022-08-24 00:00:49 [INFO] [TRAIN] epoch: 49, iter: 61400/160000, loss: 0.5297, lr: 0.000747, batch_cost: 0.1849, reader_cost: 0.00045, ips: 43.2718 samples/sec | ETA 05:03:48 2022-08-24 00:00:59 [INFO] [TRAIN] epoch: 49, iter: 61450/160000, loss: 0.5419, lr: 0.000746, batch_cost: 0.1905, reader_cost: 0.00041, ips: 41.9884 samples/sec | ETA 05:12:56 2022-08-24 00:01:08 [INFO] [TRAIN] epoch: 49, iter: 61500/160000, loss: 0.5481, lr: 0.000746, batch_cost: 0.1926, reader_cost: 0.00071, ips: 41.5280 samples/sec | ETA 05:16:15 2022-08-24 00:01:17 [INFO] [TRAIN] epoch: 49, iter: 61550/160000, loss: 0.5307, lr: 0.000745, batch_cost: 0.1813, reader_cost: 0.00044, ips: 44.1281 samples/sec | ETA 04:57:28 2022-08-24 00:01:26 [INFO] [TRAIN] epoch: 49, iter: 61600/160000, loss: 0.5588, lr: 0.000745, batch_cost: 0.1776, reader_cost: 0.00045, ips: 45.0404 samples/sec | ETA 04:51:17 2022-08-24 00:01:35 [INFO] [TRAIN] epoch: 49, iter: 61650/160000, loss: 0.5579, lr: 0.000745, batch_cost: 0.1783, reader_cost: 0.00072, ips: 44.8771 samples/sec | ETA 04:52:12 
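For reference: on the [TRAIN] lines, batch_cost is seconds per training step, reader_cost is the data-loading portion of that, ips is samples per second, and ETA extrapolates the current step cost over the remaining iterations (160000 total in this run). A minimal sanity-check sketch, assuming ips = samples_per_step / batch_cost and ETA = batch_cost * (total_iters - iter); the global samples per step is inferred here from ips * batch_cost (about 8 on the lines above), and check_train_line is an illustrative helper, not a PaddleSeg API:

import datetime

def check_train_line(iter_no, total_iters, batch_cost, ips):
    # infer the global batch size from the printed throughput
    samples_per_step = round(ips * batch_cost)
    # remaining wall-clock time at the current per-step cost
    eta = datetime.timedelta(seconds=int(batch_cost * (total_iters - iter_no)))
    return samples_per_step, eta

# e.g. the iter 61650 line just above: batch_cost 0.1783 s, ips 44.8771
spb, eta = check_train_line(61650, 160000, 0.1783, 44.8771)
print(spb, eta)  # 8 4:52:15 -- close to the printed ETA 04:52:12 (batch_cost is shown rounded)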
2022-08-24 00:01:44 [INFO] [TRAIN] epoch: 49, iter: 61700/160000, loss: 0.5415, lr: 0.000744, batch_cost: 0.1763, reader_cost: 0.00047, ips: 45.3718 samples/sec | ETA 04:48:52 2022-08-24 00:01:52 [INFO] [TRAIN] epoch: 49, iter: 61750/160000, loss: 0.5405, lr: 0.000744, batch_cost: 0.1683, reader_cost: 0.00080, ips: 47.5351 samples/sec | ETA 04:35:35 2022-08-24 00:02:00 [INFO] [TRAIN] epoch: 49, iter: 61800/160000, loss: 0.5500, lr: 0.000743, batch_cost: 0.1580, reader_cost: 0.00043, ips: 50.6264 samples/sec | ETA 04:18:37 2022-08-24 00:02:10 [INFO] [TRAIN] epoch: 49, iter: 61850/160000, loss: 0.5416, lr: 0.000743, batch_cost: 0.1831, reader_cost: 0.00044, ips: 43.6991 samples/sec | ETA 04:59:28 2022-08-24 00:02:23 [INFO] [TRAIN] epoch: 50, iter: 61900/160000, loss: 0.5853, lr: 0.000743, batch_cost: 0.2751, reader_cost: 0.07286, ips: 29.0834 samples/sec | ETA 07:29:44 2022-08-24 00:02:33 [INFO] [TRAIN] epoch: 50, iter: 61950/160000, loss: 0.5313, lr: 0.000742, batch_cost: 0.1880, reader_cost: 0.00085, ips: 42.5554 samples/sec | ETA 05:07:12 2022-08-24 00:02:42 [INFO] [TRAIN] epoch: 50, iter: 62000/160000, loss: 0.5575, lr: 0.000742, batch_cost: 0.1825, reader_cost: 0.00044, ips: 43.8293 samples/sec | ETA 04:58:07 2022-08-24 00:02:42 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 193s - batch_cost: 0.1927 - reader cost: 8.1329e-04 2022-08-24 00:05:55 [INFO] [EVAL] #Images: 2000 mIoU: 0.3600 Acc: 0.7674 Kappa: 0.7496 Dice: 0.4976 2022-08-24 00:05:55 [INFO] [EVAL] Class IoU: [0.6839 0.7776 0.9293 0.7297 0.6875 0.7571 0.7712 0.7953 0.5064 0.6214 0.4828 0.5569 0.7074 0.3182 0.3032 0.4446 0.5286 0.4145 0.5986 0.4114 0.7513 0.4751 0.5979 0.4607 0.3407 0.3984 0.5724 0.4227 0.4516 0.2214 0.3059 0.5037 0.2895 0.3625 0.3287 0.4098 0.4447 0.5213 0.2603 0.3415 0.1714 0.1741 0.3497 0.2769 0.2543 0.2068 0.384 0.4835 0.6407 0.5404 0.5277 0.406 0.191 0.1463 0.6616 0.4343 0.8821 0.4115 0.3997 0.2683 0.103 0.2163 0.2579 0.1205 0.4566 0.6807 0.2013 0.3933 0.0892 0.3597 0.4878 0.5632 0.3348 0.2574 0.4584 0.3721 0.505 0.2794 0.3791 0.0978 0.6683 0.3797 0.3703 0.035 0.2609 0.509 0.1242 0.123 0.3271 0.5173 0.4385 0.1026 0.2267 0.0727 0.0045 0.0396 0.2205 0.1666 0.2811 0.3237 0.1716 0.1247 0.2244 0.5158 0.1845 0.5459 0.1963 0.5662 0.0761 0.2539 0.1696 0.2616 0.1664 0.5089 0.5252 0.0147 0.3069 0.6075 0.1698 0.3638 0.4509 0.0653 0.2156 0.1076 0.263 0.2526 0.5147 0.4466 0.5529 0.4072 0.5352 0.0357 0.3265 0.3983 0.2965 0.1798 0.1498 0.0345 0.1861 0.368 0.4004 0.0384 0.2633 0.3071 0.3357 0.0172 0.4418 0.0367 0.1018 0.2152] 2022-08-24 00:05:55 [INFO] [EVAL] Class Precision: [0.778 0.8607 0.9663 0.8294 0.7689 0.8707 0.8782 0.8654 0.6326 0.7337 0.7244 0.706 0.7925 0.5076 0.5576 0.6331 0.6531 0.6996 0.7231 0.6385 0.8215 0.6674 0.7158 0.6451 0.4948 0.5723 0.6356 0.7696 0.6669 0.3972 0.4271 0.64 0.536 0.5046 0.5986 0.5071 0.6853 0.7319 0.3732 0.6104 0.3333 0.4074 0.5705 0.4993 0.3733 0.3954 0.7751 0.67 0.7922 0.6673 0.6814 0.5202 0.3489 0.4898 0.7372 0.6444 0.9298 0.6146 0.7078 0.4668 0.1737 0.3674 0.4463 0.7625 0.552 0.7613 0.3682 0.5293 0.1968 0.5756 0.6421 0.7392 0.6081 0.3137 0.6416 0.5226 0.8502 0.6258 0.5955 0.3305 0.8193 0.6767 0.7731 0.1182 0.3368 0.6792 0.3597 0.3919 0.5462 0.6777 0.6248 0.1243 0.5793 0.3173 0.015 0.0817 0.7732 0.5083 0.4229 0.6608 0.7245 0.1967 0.5403 0.6198 0.755 0.5843 0.3066 0.8202 0.1515 0.5683 0.5094 0.2955 0.4883 0.6429 0.527 0.1752 0.795 0.6792 0.2728 0.4537 0.6544 0.3745 0.5593 0.7447 0.7481 0.5566 0.8579 0.6497 0.9301 0.7183 0.6373 0.2567 0.6942 0.7178 
0.6044 0.3596 0.3124 0.0682 0.4108 0.6523 0.5349 0.0777 0.5539 0.621 0.4884 0.025 0.8404 0.2091 0.3177 0.7287] 2022-08-24 00:05:55 [INFO] [EVAL] Class Recall: [0.8497 0.8896 0.9604 0.8587 0.8665 0.853 0.8636 0.9075 0.7175 0.8024 0.5914 0.725 0.8683 0.4604 0.3992 0.5988 0.735 0.5042 0.7767 0.5364 0.8979 0.6224 0.784 0.6171 0.5223 0.5673 0.8519 0.4839 0.5831 0.3334 0.5188 0.7028 0.3864 0.5629 0.4216 0.681 0.5588 0.6443 0.4624 0.4368 0.2607 0.2331 0.4746 0.3834 0.4439 0.3025 0.4322 0.6346 0.7702 0.7397 0.7006 0.6491 0.2967 0.1726 0.8658 0.5711 0.945 0.5545 0.4787 0.3869 0.2018 0.3447 0.3792 0.1252 0.7254 0.8654 0.3076 0.6048 0.1402 0.4896 0.6699 0.7029 0.4269 0.5892 0.6161 0.5638 0.5544 0.3354 0.5106 0.122 0.7837 0.4638 0.4155 0.0474 0.5364 0.6701 0.1595 0.1521 0.4492 0.6861 0.5953 0.3706 0.2713 0.0862 0.0063 0.0715 0.2358 0.1986 0.456 0.3882 0.1835 0.254 0.2773 0.7546 0.1962 0.8924 0.3531 0.6464 0.1326 0.3146 0.2027 0.6953 0.2015 0.7094 0.9937 0.0158 0.3333 0.8519 0.3102 0.6475 0.5919 0.0733 0.2597 0.1117 0.2885 0.3162 0.5627 0.5882 0.5768 0.4846 0.7698 0.0399 0.3814 0.4722 0.368 0.2645 0.2234 0.0654 0.2539 0.4578 0.6143 0.0704 0.3342 0.3779 0.5179 0.052 0.4823 0.0426 0.1302 0.2339] 2022-08-24 00:05:55 [INFO] [EVAL] The model with the best validation mIoU (0.3661) was saved at iter 59000. 2022-08-24 00:06:04 [INFO] [TRAIN] epoch: 50, iter: 62050/160000, loss: 0.5178, lr: 0.000742, batch_cost: 0.1885, reader_cost: 0.00501, ips: 42.4431 samples/sec | ETA 05:07:42 2022-08-24 00:06:14 [INFO] [TRAIN] epoch: 50, iter: 62100/160000, loss: 0.5200, lr: 0.000741, batch_cost: 0.1833, reader_cost: 0.00121, ips: 43.6438 samples/sec | ETA 04:59:05 2022-08-24 00:06:23 [INFO] [TRAIN] epoch: 50, iter: 62150/160000, loss: 0.5437, lr: 0.000741, batch_cost: 0.1904, reader_cost: 0.00081, ips: 42.0197 samples/sec | ETA 05:10:29 2022-08-24 00:06:34 [INFO] [TRAIN] epoch: 50, iter: 62200/160000, loss: 0.5569, lr: 0.000740, batch_cost: 0.2257, reader_cost: 0.00041, ips: 35.4403 samples/sec | ETA 06:07:56 2022-08-24 00:06:43 [INFO] [TRAIN] epoch: 50, iter: 62250/160000, loss: 0.5697, lr: 0.000740, batch_cost: 0.1656, reader_cost: 0.00106, ips: 48.2991 samples/sec | ETA 04:29:50 2022-08-24 00:06:50 [INFO] [TRAIN] epoch: 50, iter: 62300/160000, loss: 0.5262, lr: 0.000740, batch_cost: 0.1517, reader_cost: 0.00052, ips: 52.7463 samples/sec | ETA 04:06:58 2022-08-24 00:06:59 [INFO] [TRAIN] epoch: 50, iter: 62350/160000, loss: 0.5755, lr: 0.000739, batch_cost: 0.1841, reader_cost: 0.00071, ips: 43.4625 samples/sec | ETA 04:59:34 2022-08-24 00:07:08 [INFO] [TRAIN] epoch: 50, iter: 62400/160000, loss: 0.5305, lr: 0.000739, batch_cost: 0.1760, reader_cost: 0.00074, ips: 45.4635 samples/sec | ETA 04:46:14 2022-08-24 00:07:18 [INFO] [TRAIN] epoch: 50, iter: 62450/160000, loss: 0.5825, lr: 0.000739, batch_cost: 0.1887, reader_cost: 0.00055, ips: 42.3969 samples/sec | ETA 05:06:47 2022-08-24 00:07:26 [INFO] [TRAIN] epoch: 50, iter: 62500/160000, loss: 0.5545, lr: 0.000738, batch_cost: 0.1563, reader_cost: 0.00064, ips: 51.1794 samples/sec | ETA 04:14:00 2022-08-24 00:07:34 [INFO] [TRAIN] epoch: 50, iter: 62550/160000, loss: 0.5542, lr: 0.000738, batch_cost: 0.1782, reader_cost: 0.00048, ips: 44.8942 samples/sec | ETA 04:49:25 2022-08-24 00:07:44 [INFO] [TRAIN] epoch: 50, iter: 62600/160000, loss: 0.5650, lr: 0.000737, batch_cost: 0.1828, reader_cost: 0.00175, ips: 43.7605 samples/sec | ETA 04:56:46 2022-08-24 00:07:52 [INFO] [TRAIN] epoch: 50, iter: 62650/160000, loss: 0.5307, lr: 0.000737, batch_cost: 0.1774, reader_cost: 
0.00057, ips: 45.0852 samples/sec | ETA 04:47:53 2022-08-24 00:08:02 [INFO] [TRAIN] epoch: 50, iter: 62700/160000, loss: 0.5563, lr: 0.000737, batch_cost: 0.1855, reader_cost: 0.00073, ips: 43.1286 samples/sec | ETA 05:00:48 2022-08-24 00:08:10 [INFO] [TRAIN] epoch: 50, iter: 62750/160000, loss: 0.5530, lr: 0.000736, batch_cost: 0.1670, reader_cost: 0.00073, ips: 47.9074 samples/sec | ETA 04:30:39 2022-08-24 00:08:18 [INFO] [TRAIN] epoch: 50, iter: 62800/160000, loss: 0.5386, lr: 0.000736, batch_cost: 0.1667, reader_cost: 0.00088, ips: 48.0039 samples/sec | ETA 04:29:58 2022-08-24 00:08:27 [INFO] [TRAIN] epoch: 50, iter: 62850/160000, loss: 0.5494, lr: 0.000736, batch_cost: 0.1680, reader_cost: 0.00052, ips: 47.6067 samples/sec | ETA 04:32:05 2022-08-24 00:08:36 [INFO] [TRAIN] epoch: 50, iter: 62900/160000, loss: 0.5711, lr: 0.000735, batch_cost: 0.1851, reader_cost: 0.00070, ips: 43.2101 samples/sec | ETA 04:59:37 2022-08-24 00:08:45 [INFO] [TRAIN] epoch: 50, iter: 62950/160000, loss: 0.5747, lr: 0.000735, batch_cost: 0.1855, reader_cost: 0.00036, ips: 43.1176 samples/sec | ETA 05:00:06 2022-08-24 00:08:55 [INFO] [TRAIN] epoch: 50, iter: 63000/160000, loss: 0.5656, lr: 0.000734, batch_cost: 0.2004, reader_cost: 0.00128, ips: 39.9210 samples/sec | ETA 05:23:58 2022-08-24 00:08:55 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 182s - batch_cost: 0.1821 - reader cost: 0.0010 2022-08-24 00:11:58 [INFO] [EVAL] #Images: 2000 mIoU: 0.3579 Acc: 0.7646 Kappa: 0.7469 Dice: 0.4960 2022-08-24 00:11:58 [INFO] [EVAL] Class IoU: [0.6774 0.773 0.9331 0.7192 0.6895 0.7504 0.7708 0.7739 0.5227 0.6611 0.495 0.5393 0.6991 0.2519 0.3078 0.441 0.5053 0.4578 0.6128 0.4003 0.7353 0.4486 0.6167 0.4454 0.3462 0.4042 0.4703 0.3793 0.4048 0.2279 0.2657 0.4664 0.3139 0.3572 0.38 0.3711 0.4624 0.5333 0.2715 0.3531 0.168 0.135 0.3347 0.2588 0.2416 0.2384 0.2413 0.5104 0.4808 0.5311 0.5621 0.4706 0.1656 0.2407 0.6503 0.4301 0.8797 0.354 0.4074 0.2861 0.0969 0.2054 0.2981 0.1599 0.4503 0.6994 0.2577 0.4036 0.1154 0.3511 0.4708 0.5742 0.3662 0.2616 0.4775 0.3646 0.56 0.245 0.2673 0.4214 0.6712 0.345 0.3591 0.1239 0.1268 0.5342 0.1001 0.0932 0.3363 0.5394 0.4384 0.1937 0.1296 0.0527 0.0019 0.0594 0.1998 0.1501 0.2819 0.3393 0.2459 0.1364 0.1576 0.4125 0.1699 0.3191 0.2135 0.558 0.0436 0.1714 0.2047 0.4178 0.148 0.5887 0.5067 0.0279 0.3982 0.7105 0.1342 0.4012 0.3864 0.0506 0.3495 0.1074 0.2869 0.2526 0.4614 0.4895 0.5617 0.3523 0.4952 0.0857 0.2656 0.3344 0.2251 0.1498 0.1214 0.0101 0.1965 0.3109 0.3065 0.1603 0.2272 0.5186 0.3076 0.0093 0.3996 0.0328 0.1573 0.1605] 2022-08-24 00:11:58 [INFO] [EVAL] Class Precision: [0.7981 0.8397 0.9648 0.8058 0.8057 0.8209 0.8806 0.9016 0.661 0.7251 0.6983 0.6328 0.7618 0.5337 0.5466 0.6427 0.6734 0.6764 0.7757 0.6401 0.7952 0.6035 0.7791 0.5066 0.4808 0.5433 0.605 0.7456 0.6801 0.4886 0.4454 0.5735 0.5436 0.458 0.4855 0.5122 0.6724 0.711 0.4722 0.5537 0.2806 0.3634 0.5294 0.5601 0.339 0.4983 0.4579 0.6907 0.7673 0.6481 0.6658 0.6489 0.3738 0.648 0.7002 0.5902 0.9277 0.7357 0.6605 0.5329 0.1672 0.416 0.4501 0.6949 0.5378 0.8363 0.504 0.5672 0.1985 0.5497 0.666 0.6836 0.5647 0.3325 0.7159 0.5341 0.6807 0.5494 0.6609 0.5503 0.8318 0.6993 0.8003 0.4128 0.2111 0.7671 0.5615 0.4728 0.6507 0.7533 0.6545 0.355 0.339 0.2967 0.0054 0.135 0.4284 0.4057 0.4113 0.6802 0.6422 0.2622 0.4128 0.4403 0.5576 0.3634 0.316 0.8308 0.0936 0.3633 0.3736 0.5425 0.5317 0.7439 0.5074 0.2349 0.7232 0.8364 0.3644 0.5382 0.7616 0.3416 0.5535 0.5545 0.6975 0.5883 0.7624 0.6584 0.8214 
0.4943 0.6807 0.5718 0.4173 0.8037 0.8043 0.4687 0.3634 0.1155 0.4709 0.7687 0.5195 0.3646 0.5808 0.6703 0.6142 0.011 0.8361 0.2978 0.3327 0.841 ] 2022-08-24 00:11:58 [INFO] [EVAL] Class Recall: [0.8174 0.9068 0.966 0.8699 0.827 0.8973 0.8608 0.8453 0.7143 0.8822 0.6297 0.785 0.8948 0.323 0.4133 0.5842 0.6694 0.5862 0.7447 0.5166 0.9071 0.6361 0.7473 0.7865 0.5529 0.6122 0.6787 0.4357 0.4999 0.2993 0.3972 0.714 0.4263 0.6188 0.6361 0.5739 0.5969 0.6808 0.3897 0.4935 0.2951 0.1768 0.4764 0.3248 0.4568 0.3138 0.3379 0.6617 0.5629 0.7462 0.7831 0.6313 0.2291 0.2769 0.9013 0.6134 0.9445 0.4056 0.5153 0.3818 0.1871 0.2886 0.4689 0.1719 0.7344 0.8103 0.3452 0.5833 0.2163 0.4928 0.6163 0.7821 0.5103 0.5509 0.5892 0.5346 0.7595 0.3066 0.3098 0.6428 0.7766 0.4052 0.3945 0.1503 0.2412 0.6376 0.1086 0.104 0.4103 0.6551 0.5704 0.2989 0.1734 0.0602 0.003 0.0959 0.2724 0.1924 0.4726 0.4037 0.285 0.2213 0.2031 0.8669 0.1963 0.7238 0.3969 0.6295 0.0753 0.2449 0.3117 0.6451 0.1702 0.7384 0.997 0.0307 0.4699 0.8252 0.1753 0.6118 0.4396 0.0561 0.4868 0.1176 0.3276 0.3068 0.5389 0.6562 0.6398 0.5509 0.6449 0.0915 0.4222 0.3641 0.2381 0.1804 0.1542 0.0109 0.2521 0.343 0.4278 0.2225 0.2718 0.6962 0.3812 0.0565 0.4336 0.0355 0.2297 0.1656] 2022-08-24 00:11:58 [INFO] [EVAL] The model with the best validation mIoU (0.3661) was saved at iter 59000. 2022-08-24 00:12:07 [INFO] [TRAIN] epoch: 50, iter: 63050/160000, loss: 0.5333, lr: 0.000734, batch_cost: 0.1717, reader_cost: 0.00283, ips: 46.5880 samples/sec | ETA 04:37:28 2022-08-24 00:12:15 [INFO] [TRAIN] epoch: 50, iter: 63100/160000, loss: 0.5800, lr: 0.000734, batch_cost: 0.1589, reader_cost: 0.00156, ips: 50.3585 samples/sec | ETA 04:16:33 2022-08-24 00:12:23 [INFO] [TRAIN] epoch: 50, iter: 63150/160000, loss: 0.5674, lr: 0.000733, batch_cost: 0.1764, reader_cost: 0.02137, ips: 45.3640 samples/sec | ETA 04:44:39 2022-08-24 00:12:36 [INFO] [TRAIN] epoch: 51, iter: 63200/160000, loss: 0.5403, lr: 0.000733, batch_cost: 0.2525, reader_cost: 0.05956, ips: 31.6791 samples/sec | ETA 06:47:25 2022-08-24 00:12:44 [INFO] [TRAIN] epoch: 51, iter: 63250/160000, loss: 0.5385, lr: 0.000732, batch_cost: 0.1709, reader_cost: 0.00073, ips: 46.8213 samples/sec | ETA 04:35:30 2022-08-24 00:12:54 [INFO] [TRAIN] epoch: 51, iter: 63300/160000, loss: 0.5796, lr: 0.000732, batch_cost: 0.1898, reader_cost: 0.00097, ips: 42.1518 samples/sec | ETA 05:05:52 2022-08-24 00:13:04 [INFO] [TRAIN] epoch: 51, iter: 63350/160000, loss: 0.5484, lr: 0.000732, batch_cost: 0.2097, reader_cost: 0.00115, ips: 38.1448 samples/sec | ETA 05:37:50 2022-08-24 00:13:14 [INFO] [TRAIN] epoch: 51, iter: 63400/160000, loss: 0.5444, lr: 0.000731, batch_cost: 0.1839, reader_cost: 0.00034, ips: 43.4956 samples/sec | ETA 04:56:07 2022-08-24 00:13:23 [INFO] [TRAIN] epoch: 51, iter: 63450/160000, loss: 0.5146, lr: 0.000731, batch_cost: 0.1921, reader_cost: 0.00059, ips: 41.6415 samples/sec | ETA 05:09:08 2022-08-24 00:13:31 [INFO] [TRAIN] epoch: 51, iter: 63500/160000, loss: 0.5137, lr: 0.000731, batch_cost: 0.1603, reader_cost: 0.00045, ips: 49.9214 samples/sec | ETA 04:17:44 2022-08-24 00:13:40 [INFO] [TRAIN] epoch: 51, iter: 63550/160000, loss: 0.5697, lr: 0.000730, batch_cost: 0.1730, reader_cost: 0.00634, ips: 46.2546 samples/sec | ETA 04:38:01 2022-08-24 00:13:49 [INFO] [TRAIN] epoch: 51, iter: 63600/160000, loss: 0.5326, lr: 0.000730, batch_cost: 0.1740, reader_cost: 0.00060, ips: 45.9874 samples/sec | ETA 04:39:29 2022-08-24 00:13:58 [INFO] [TRAIN] epoch: 51, iter: 63650/160000, loss: 0.5214, lr: 0.000729, 
batch_cost: 0.1829, reader_cost: 0.00077, ips: 43.7382 samples/sec | ETA 04:53:43 2022-08-24 00:14:05 [INFO] [TRAIN] epoch: 51, iter: 63700/160000, loss: 0.5495, lr: 0.000729, batch_cost: 0.1540, reader_cost: 0.00081, ips: 51.9634 samples/sec | ETA 04:07:05 2022-08-24 00:14:14 [INFO] [TRAIN] epoch: 51, iter: 63750/160000, loss: 0.5422, lr: 0.000729, batch_cost: 0.1607, reader_cost: 0.00062, ips: 49.7907 samples/sec | ETA 04:17:44 2022-08-24 00:14:23 [INFO] [TRAIN] epoch: 51, iter: 63800/160000, loss: 0.5241, lr: 0.000728, batch_cost: 0.1993, reader_cost: 0.00052, ips: 40.1317 samples/sec | ETA 05:19:36 2022-08-24 00:14:33 [INFO] [TRAIN] epoch: 51, iter: 63850/160000, loss: 0.5555, lr: 0.000728, batch_cost: 0.1979, reader_cost: 0.00037, ips: 40.4320 samples/sec | ETA 05:17:04 2022-08-24 00:14:42 [INFO] [TRAIN] epoch: 51, iter: 63900/160000, loss: 0.5353, lr: 0.000728, batch_cost: 0.1749, reader_cost: 0.00071, ips: 45.7403 samples/sec | ETA 04:40:07 2022-08-24 00:14:52 [INFO] [TRAIN] epoch: 51, iter: 63950/160000, loss: 0.5002, lr: 0.000727, batch_cost: 0.1931, reader_cost: 0.00047, ips: 41.4213 samples/sec | ETA 05:09:10 2022-08-24 00:15:01 [INFO] [TRAIN] epoch: 51, iter: 64000/160000, loss: 0.5558, lr: 0.000727, batch_cost: 0.1867, reader_cost: 0.00035, ips: 42.8533 samples/sec | ETA 04:58:41 2022-08-24 00:15:01 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 193s - batch_cost: 0.1933 - reader cost: 5.7382e-04 2022-08-24 00:18:15 [INFO] [EVAL] #Images: 2000 mIoU: 0.3636 Acc: 0.7710 Kappa: 0.7534 Dice: 0.5016 2022-08-24 00:18:15 [INFO] [EVAL] Class IoU: [0.6871 0.7794 0.9324 0.7365 0.673 0.7573 0.789 0.8004 0.5258 0.6622 0.5 0.5513 0.7114 0.2692 0.2752 0.4329 0.5051 0.4257 0.6245 0.4303 0.7601 0.4591 0.5996 0.5073 0.335 0.4332 0.5378 0.392 0.3719 0.314 0.288 0.4913 0.3063 0.3509 0.362 0.4112 0.4603 0.5688 0.2895 0.3295 0.1564 0.1669 0.3586 0.2565 0.26 0.2435 0.3018 0.5091 0.597 0.547 0.52 0.3787 0.2246 0.1811 0.6591 0.3836 0.8768 0.3967 0.473 0.2598 0.0754 0.245 0.2584 0.1923 0.4771 0.7123 0.2361 0.3903 0.1051 0.335 0.4797 0.5602 0.3837 0.2311 0.4827 0.3627 0.5531 0.265 0.3138 0.4113 0.6667 0.4001 0.3231 0.1175 0.0827 0.5363 0.1063 0.1074 0.3338 0.5241 0.3854 0.1334 0.211 0.0868 0.0067 0.0412 0.1906 0.1203 0.2848 0.292 0.2679 0.145 0.2808 0.5027 0.1073 0.3761 0.1922 0.5641 0.0523 0.4406 0.1399 0.2735 0.1607 0.534 0.6481 0.0598 0.4255 0.7607 0.1421 0.433 0.4787 0.0797 0.3087 0.1363 0.2936 0.2463 0.5057 0.4341 0.5023 0.3744 0.4598 0.0907 0.2599 0.305 0.3081 0.1869 0.1338 0.0232 0.197 0.3455 0.22 0.0844 0.285 0.4099 0.2212 0.0086 0.4147 0.0293 0.0985 0.1804] 2022-08-24 00:18:15 [INFO] [EVAL] Class Precision: [0.7795 0.8557 0.9687 0.8308 0.7325 0.8686 0.8906 0.86 0.6588 0.7534 0.6988 0.7039 0.7875 0.55 0.5987 0.5766 0.6829 0.7371 0.7837 0.5958 0.838 0.6968 0.7386 0.6366 0.4784 0.6001 0.6311 0.8282 0.7353 0.4706 0.4738 0.6751 0.5399 0.4551 0.4439 0.5549 0.6832 0.764 0.4529 0.6347 0.3179 0.4047 0.6767 0.4771 0.3836 0.4912 0.4802 0.8427 0.6553 0.68 0.7903 0.4474 0.4143 0.5198 0.6853 0.7079 0.9323 0.6481 0.7884 0.4362 0.1193 0.4861 0.3789 0.5892 0.6048 0.8188 0.4283 0.4844 0.2866 0.5197 0.6832 0.6663 0.5748 0.3598 0.6711 0.6188 0.776 0.6244 0.6346 0.5728 0.8275 0.6822 0.8145 0.2744 0.145 0.6517 0.6513 0.4722 0.663 0.7574 0.5356 0.1612 0.5396 0.3124 0.0623 0.1208 0.7596 0.4791 0.4352 0.7176 0.6194 0.2491 0.5215 0.641 0.6408 0.4046 0.4046 0.7813 0.1721 0.5261 0.5201 0.3205 0.5292 0.7751 0.6544 0.399 0.8002 0.8092 0.429 0.6405 0.7042 0.3551 0.5478 0.5018 0.7039 0.648 
0.7998 0.5234 0.5825 0.8539 0.7093 0.4327 0.5121 0.8709 0.6173 0.429 0.3446 0.0654 0.4562 0.7651 0.3488 0.1817 0.5408 0.6223 0.5822 0.0532 0.8055 0.172 0.2614 0.7598] 2022-08-24 00:18:15 [INFO] [EVAL] Class Recall: [0.853 0.8973 0.9614 0.8664 0.8923 0.8552 0.8736 0.9203 0.7226 0.8455 0.6373 0.7176 0.8805 0.3453 0.3374 0.6345 0.6599 0.5019 0.7545 0.6076 0.891 0.5737 0.7611 0.7141 0.5279 0.6089 0.7843 0.4267 0.4295 0.4855 0.4234 0.6435 0.4144 0.6052 0.6622 0.6136 0.5852 0.69 0.4453 0.4067 0.2355 0.2213 0.4327 0.3568 0.4467 0.3255 0.4483 0.5626 0.8702 0.7366 0.6033 0.7117 0.329 0.2175 0.9451 0.4558 0.9365 0.5056 0.5418 0.3911 0.1704 0.3306 0.4482 0.2221 0.6933 0.8455 0.3448 0.6677 0.1423 0.4851 0.6169 0.7787 0.5358 0.3925 0.6322 0.4671 0.6581 0.3153 0.383 0.5932 0.7743 0.4917 0.3488 0.1704 0.1615 0.7517 0.1128 0.1221 0.4021 0.6298 0.5789 0.4361 0.2574 0.1073 0.0075 0.0589 0.2028 0.1384 0.4516 0.3299 0.3207 0.2576 0.3783 0.6997 0.1142 0.8421 0.2679 0.6699 0.0699 0.7307 0.1607 0.651 0.1875 0.632 0.9854 0.0657 0.4761 0.927 0.1753 0.572 0.5993 0.0932 0.4142 0.1576 0.3349 0.2844 0.579 0.7179 0.785 0.4 0.5666 0.1029 0.3454 0.3195 0.3808 0.2488 0.1795 0.0347 0.2574 0.3865 0.3735 0.1362 0.376 0.5456 0.2629 0.0102 0.4609 0.0342 0.1364 0.1913] 2022-08-24 00:18:15 [INFO] [EVAL] The model with the best validation mIoU (0.3661) was saved at iter 59000. 2022-08-24 00:18:24 [INFO] [TRAIN] epoch: 51, iter: 64050/160000, loss: 0.5221, lr: 0.000726, batch_cost: 0.1872, reader_cost: 0.00509, ips: 42.7406 samples/sec | ETA 04:59:19 2022-08-24 00:18:33 [INFO] [TRAIN] epoch: 51, iter: 64100/160000, loss: 0.5465, lr: 0.000726, batch_cost: 0.1749, reader_cost: 0.00109, ips: 45.7520 samples/sec | ETA 04:39:28 2022-08-24 00:18:42 [INFO] [TRAIN] epoch: 51, iter: 64150/160000, loss: 0.5628, lr: 0.000726, batch_cost: 0.1691, reader_cost: 0.00196, ips: 47.3086 samples/sec | ETA 04:30:08 2022-08-24 00:18:51 [INFO] [TRAIN] epoch: 51, iter: 64200/160000, loss: 0.5482, lr: 0.000725, batch_cost: 0.1803, reader_cost: 0.00052, ips: 44.3817 samples/sec | ETA 04:47:48 2022-08-24 00:19:00 [INFO] [TRAIN] epoch: 51, iter: 64250/160000, loss: 0.5435, lr: 0.000725, batch_cost: 0.1937, reader_cost: 0.00041, ips: 41.3100 samples/sec | ETA 05:09:02 2022-08-24 00:19:08 [INFO] [TRAIN] epoch: 51, iter: 64300/160000, loss: 0.5261, lr: 0.000725, batch_cost: 0.1606, reader_cost: 0.00062, ips: 49.8283 samples/sec | ETA 04:16:04 2022-08-24 00:19:17 [INFO] [TRAIN] epoch: 51, iter: 64350/160000, loss: 0.5750, lr: 0.000724, batch_cost: 0.1706, reader_cost: 0.00057, ips: 46.8805 samples/sec | ETA 04:32:02 2022-08-24 00:19:27 [INFO] [TRAIN] epoch: 51, iter: 64400/160000, loss: 0.5595, lr: 0.000724, batch_cost: 0.2022, reader_cost: 0.00033, ips: 39.5612 samples/sec | ETA 05:22:12 2022-08-24 00:19:40 [INFO] [TRAIN] epoch: 52, iter: 64450/160000, loss: 0.5630, lr: 0.000723, batch_cost: 0.2701, reader_cost: 0.10628, ips: 29.6214 samples/sec | ETA 07:10:05 2022-08-24 00:19:49 [INFO] [TRAIN] epoch: 52, iter: 64500/160000, loss: 0.5263, lr: 0.000723, batch_cost: 0.1778, reader_cost: 0.00105, ips: 45.0005 samples/sec | ETA 04:42:57 2022-08-24 00:19:58 [INFO] [TRAIN] epoch: 52, iter: 64550/160000, loss: 0.5314, lr: 0.000723, batch_cost: 0.1676, reader_cost: 0.00037, ips: 47.7453 samples/sec | ETA 04:26:33 2022-08-24 00:20:07 [INFO] [TRAIN] epoch: 52, iter: 64600/160000, loss: 0.5428, lr: 0.000722, batch_cost: 0.1887, reader_cost: 0.00053, ips: 42.3867 samples/sec | ETA 05:00:05 2022-08-24 00:20:16 [INFO] [TRAIN] epoch: 52, iter: 64650/160000, loss: 0.5628, lr: 
0.000722, batch_cost: 0.1730, reader_cost: 0.00042, ips: 46.2326 samples/sec | ETA 04:34:59 2022-08-24 00:20:25 [INFO] [TRAIN] epoch: 52, iter: 64700/160000, loss: 0.5596, lr: 0.000722, batch_cost: 0.1919, reader_cost: 0.00031, ips: 41.6810 samples/sec | ETA 05:04:51 2022-08-24 00:20:33 [INFO] [TRAIN] epoch: 52, iter: 64750/160000, loss: 0.5116, lr: 0.000721, batch_cost: 0.1544, reader_cost: 0.00071, ips: 51.8221 samples/sec | ETA 04:05:04 2022-08-24 00:20:41 [INFO] [TRAIN] epoch: 52, iter: 64800/160000, loss: 0.5262, lr: 0.000721, batch_cost: 0.1680, reader_cost: 0.00303, ips: 47.6075 samples/sec | ETA 04:26:37 2022-08-24 00:20:50 [INFO] [TRAIN] epoch: 52, iter: 64850/160000, loss: 0.5533, lr: 0.000720, batch_cost: 0.1617, reader_cost: 0.00037, ips: 49.4869 samples/sec | ETA 04:16:21 2022-08-24 00:20:59 [INFO] [TRAIN] epoch: 52, iter: 64900/160000, loss: 0.5572, lr: 0.000720, batch_cost: 0.1895, reader_cost: 0.00060, ips: 42.2102 samples/sec | ETA 05:00:24 2022-08-24 00:21:07 [INFO] [TRAIN] epoch: 52, iter: 64950/160000, loss: 0.5597, lr: 0.000720, batch_cost: 0.1664, reader_cost: 0.00039, ips: 48.0634 samples/sec | ETA 04:23:40 2022-08-24 00:21:16 [INFO] [TRAIN] epoch: 52, iter: 65000/160000, loss: 0.5299, lr: 0.000719, batch_cost: 0.1677, reader_cost: 0.00079, ips: 47.6986 samples/sec | ETA 04:25:33 2022-08-24 00:21:16 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 189s - batch_cost: 0.1893 - reader cost: 5.4076e-04 2022-08-24 00:24:25 [INFO] [EVAL] #Images: 2000 mIoU: 0.3576 Acc: 0.7676 Kappa: 0.7499 Dice: 0.4951 2022-08-24 00:24:25 [INFO] [EVAL] Class IoU: [0.6879 0.778 0.9325 0.7323 0.6881 0.7591 0.7801 0.8066 0.5174 0.644 0.4925 0.5799 0.7032 0.2676 0.2968 0.4346 0.4866 0.4493 0.6126 0.4212 0.7624 0.4721 0.6101 0.4927 0.3216 0.3839 0.4763 0.4397 0.3999 0.3128 0.2823 0.4648 0.3097 0.3366 0.3126 0.3813 0.4514 0.5486 0.2545 0.3615 0.0918 0.1704 0.369 0.2494 0.2672 0.2582 0.2963 0.4985 0.6317 0.5307 0.5678 0.3684 0.1789 0.1822 0.6762 0.377 0.8536 0.3812 0.4195 0.2704 0.1148 0.2358 0.2694 0.0953 0.4086 0.684 0.2178 0.3805 0.1091 0.3108 0.4351 0.5712 0.3993 0.2166 0.4567 0.3845 0.5607 0.2519 0.2698 0.1455 0.5159 0.3858 0.3276 0.0469 0.1556 0.5304 0.1388 0.0952 0.3993 0.4964 0.4236 0.1259 0.2273 0.0734 0.0143 0.0348 0.1993 0.128 0.2818 0.3355 0.2147 0.1636 0.2874 0.4298 0.1843 0.4719 0.241 0.4259 0.0796 0.3805 0.1743 0.346 0.165 0.3529 0.5859 0.0449 0.421 0.6715 0.1941 0.391 0.4549 0.0071 0.3172 0.1351 0.2768 0.218 0.5173 0.487 0.6425 0.3533 0.5782 0.0841 0.3104 0.3878 0.2755 0.1732 0.1286 0.0302 0.1819 0.3663 0.0368 0.1022 0.2658 0.4797 0.2444 0.0096 0.4342 0.0276 0.1415 0.2107] 2022-08-24 00:24:25 [INFO] [EVAL] Class Precision: [0.7895 0.8557 0.9673 0.8168 0.7744 0.8686 0.9155 0.8749 0.6383 0.7711 0.6739 0.7016 0.7713 0.4825 0.5408 0.5955 0.6347 0.7118 0.7343 0.6089 0.8502 0.6867 0.7414 0.6193 0.499 0.5465 0.5883 0.7055 0.7258 0.4655 0.4732 0.6763 0.5367 0.4093 0.4139 0.5367 0.7162 0.7704 0.349 0.5997 0.2663 0.3511 0.6502 0.5092 0.3933 0.5135 0.4981 0.6854 0.7455 0.6464 0.7025 0.4435 0.3109 0.4705 0.7238 0.7217 0.9043 0.6657 0.7003 0.5033 0.1958 0.444 0.402 0.6798 0.4547 0.8283 0.4023 0.4762 0.2545 0.492 0.714 0.7366 0.5529 0.3678 0.6924 0.5385 0.788 0.7223 0.5562 0.3733 0.5952 0.6507 0.7925 0.1392 0.2166 0.6737 0.4876 0.5223 0.7243 0.7813 0.5969 0.2895 0.4912 0.2497 0.0207 0.0785 0.7018 0.3383 0.3963 0.6754 0.5796 0.3265 0.5911 0.7225 0.7097 0.5504 0.3873 0.8069 0.2109 0.5774 0.5143 0.4292 0.5471 0.8485 0.5898 0.383 0.7312 0.7397 0.4448 0.5168 0.7151 
0.4703 0.6084 0.686 0.7595 0.6126 0.8122 0.6659 0.7722 0.628 0.6861 0.5024 0.5656 0.7998 0.7981 0.3595 0.3728 0.0571 0.4137 0.6093 0.1818 0.4632 0.634 0.6057 0.6469 0.0111 0.7639 0.32 0.2873 0.6955] 2022-08-24 00:24:25 [INFO] [EVAL] Class Recall: [0.8424 0.8955 0.9628 0.8762 0.8606 0.8576 0.8406 0.9117 0.732 0.7962 0.6466 0.7698 0.8884 0.3754 0.3968 0.6166 0.6758 0.5492 0.7871 0.5773 0.8807 0.6016 0.775 0.7068 0.475 0.5635 0.7144 0.5385 0.471 0.4881 0.4118 0.5978 0.4228 0.6546 0.5607 0.5684 0.5497 0.6559 0.4846 0.4764 0.1229 0.2487 0.4604 0.3283 0.4545 0.3418 0.4223 0.6464 0.8053 0.7479 0.7475 0.685 0.2964 0.2292 0.9113 0.4411 0.9383 0.4714 0.5112 0.3689 0.2174 0.3347 0.4497 0.0998 0.8013 0.7969 0.3219 0.6545 0.1603 0.4576 0.5269 0.7177 0.5896 0.345 0.573 0.5735 0.6603 0.2789 0.3439 0.1925 0.7947 0.4866 0.3583 0.0661 0.3556 0.7138 0.1625 0.1043 0.4708 0.5765 0.5933 0.1821 0.2972 0.0942 0.0437 0.0587 0.2178 0.1707 0.4938 0.4 0.2543 0.247 0.3588 0.5147 0.1994 0.7679 0.3895 0.4743 0.1134 0.5274 0.2087 0.6409 0.1911 0.3766 0.9889 0.0484 0.4981 0.8793 0.2561 0.6164 0.5555 0.0071 0.3985 0.144 0.3034 0.2529 0.5876 0.6444 0.7927 0.4468 0.7861 0.0917 0.4077 0.4295 0.2961 0.2506 0.1641 0.0603 0.2451 0.4787 0.0441 0.1159 0.314 0.6976 0.282 0.0678 0.5015 0.0294 0.218 0.2321] 2022-08-24 00:24:26 [INFO] [EVAL] The model with the best validation mIoU (0.3661) was saved at iter 59000. 2022-08-24 00:24:35 [INFO] [TRAIN] epoch: 52, iter: 65050/160000, loss: 0.5136, lr: 0.000719, batch_cost: 0.1822, reader_cost: 0.00370, ips: 43.9108 samples/sec | ETA 04:48:18 2022-08-24 00:24:44 [INFO] [TRAIN] epoch: 52, iter: 65100/160000, loss: 0.5407, lr: 0.000718, batch_cost: 0.1782, reader_cost: 0.00120, ips: 44.8808 samples/sec | ETA 04:41:55 2022-08-24 00:24:53 [INFO] [TRAIN] epoch: 52, iter: 65150/160000, loss: 0.5154, lr: 0.000718, batch_cost: 0.1911, reader_cost: 0.00065, ips: 41.8719 samples/sec | ETA 05:02:01 2022-08-24 00:25:03 [INFO] [TRAIN] epoch: 52, iter: 65200/160000, loss: 0.5648, lr: 0.000718, batch_cost: 0.2037, reader_cost: 0.00093, ips: 39.2692 samples/sec | ETA 05:21:52 2022-08-24 00:25:13 [INFO] [TRAIN] epoch: 52, iter: 65250/160000, loss: 0.5802, lr: 0.000717, batch_cost: 0.1823, reader_cost: 0.00040, ips: 43.8807 samples/sec | ETA 04:47:54 2022-08-24 00:25:22 [INFO] [TRAIN] epoch: 52, iter: 65300/160000, loss: 0.5580, lr: 0.000717, batch_cost: 0.1861, reader_cost: 0.00053, ips: 42.9857 samples/sec | ETA 04:53:44 2022-08-24 00:25:30 [INFO] [TRAIN] epoch: 52, iter: 65350/160000, loss: 0.5480, lr: 0.000717, batch_cost: 0.1714, reader_cost: 0.00053, ips: 46.6794 samples/sec | ETA 04:30:21 2022-08-24 00:25:40 [INFO] [TRAIN] epoch: 52, iter: 65400/160000, loss: 0.5646, lr: 0.000716, batch_cost: 0.1911, reader_cost: 0.00042, ips: 41.8707 samples/sec | ETA 05:01:14 2022-08-24 00:25:49 [INFO] [TRAIN] epoch: 52, iter: 65450/160000, loss: 0.5190, lr: 0.000716, batch_cost: 0.1772, reader_cost: 0.00072, ips: 45.1452 samples/sec | ETA 04:39:14 2022-08-24 00:25:59 [INFO] [TRAIN] epoch: 52, iter: 65500/160000, loss: 0.5207, lr: 0.000715, batch_cost: 0.1995, reader_cost: 0.00058, ips: 40.0914 samples/sec | ETA 05:14:16 2022-08-24 00:26:08 [INFO] [TRAIN] epoch: 52, iter: 65550/160000, loss: 0.5516, lr: 0.000715, batch_cost: 0.1892, reader_cost: 0.00040, ips: 42.2787 samples/sec | ETA 04:57:51 2022-08-24 00:26:18 [INFO] [TRAIN] epoch: 52, iter: 65600/160000, loss: 0.5820, lr: 0.000715, batch_cost: 0.1861, reader_cost: 0.00056, ips: 42.9901 samples/sec | ETA 04:52:46 2022-08-24 00:26:27 [INFO] [TRAIN] epoch: 52, 
iter: 65650/160000, loss: 0.5696, lr: 0.000714, batch_cost: 0.1783, reader_cost: 0.00070, ips: 44.8730 samples/sec | ETA 04:40:20 2022-08-24 00:26:41 [INFO] [TRAIN] epoch: 53, iter: 65700/160000, loss: 0.5201, lr: 0.000714, batch_cost: 0.2899, reader_cost: 0.10008, ips: 27.5911 samples/sec | ETA 07:35:42 2022-08-24 00:26:51 [INFO] [TRAIN] epoch: 53, iter: 65750/160000, loss: 0.5400, lr: 0.000714, batch_cost: 0.1985, reader_cost: 0.00061, ips: 40.3073 samples/sec | ETA 05:11:46 2022-08-24 00:27:00 [INFO] [TRAIN] epoch: 53, iter: 65800/160000, loss: 0.5546, lr: 0.000713, batch_cost: 0.1863, reader_cost: 0.00034, ips: 42.9378 samples/sec | ETA 04:52:30 2022-08-24 00:27:10 [INFO] [TRAIN] epoch: 53, iter: 65850/160000, loss: 0.5324, lr: 0.000713, batch_cost: 0.1993, reader_cost: 0.00065, ips: 40.1435 samples/sec | ETA 05:12:42 2022-08-24 00:27:19 [INFO] [TRAIN] epoch: 53, iter: 65900/160000, loss: 0.5554, lr: 0.000712, batch_cost: 0.1813, reader_cost: 0.00144, ips: 44.1344 samples/sec | ETA 04:44:16 2022-08-24 00:27:29 [INFO] [TRAIN] epoch: 53, iter: 65950/160000, loss: 0.5376, lr: 0.000712, batch_cost: 0.1869, reader_cost: 0.00133, ips: 42.7954 samples/sec | ETA 04:53:01 2022-08-24 00:27:39 [INFO] [TRAIN] epoch: 53, iter: 66000/160000, loss: 0.5301, lr: 0.000712, batch_cost: 0.2098, reader_cost: 0.00068, ips: 38.1378 samples/sec | ETA 05:28:37 2022-08-24 00:27:39 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 186s - batch_cost: 0.1857 - reader cost: 0.0011 2022-08-24 00:30:45 [INFO] [EVAL] #Images: 2000 mIoU: 0.3643 Acc: 0.7701 Kappa: 0.7525 Dice: 0.5006 2022-08-24 00:30:45 [INFO] [EVAL] Class IoU: [0.6869 0.788 0.9289 0.7364 0.6767 0.7657 0.7688 0.7948 0.526 0.6234 0.4811 0.5665 0.7019 0.3077 0.2939 0.4389 0.5369 0.4406 0.6265 0.4223 0.7631 0.4279 0.5979 0.5086 0.3172 0.3865 0.4607 0.4577 0.4184 0.2523 0.2405 0.4988 0.2856 0.3806 0.3963 0.4027 0.4649 0.5161 0.2862 0.3705 0.1055 0.1344 0.3718 0.269 0.2615 0.2082 0.2422 0.5019 0.6604 0.4993 0.5598 0.3904 0.187 0.2701 0.6785 0.3438 0.8688 0.4033 0.4998 0.2835 0.1147 0.2287 0.2863 0.1016 0.4538 0.6803 0.2842 0.4049 0.077 0.3656 0.5055 0.5418 0.3549 0.2453 0.4752 0.3739 0.5848 0.2851 0.3324 0.2331 0.6886 0.3884 0.3092 0.0445 0.1508 0.5366 0.1194 0.108 0.2577 0.5259 0.4223 0.1518 0.1858 0.0363 0.0127 0.0311 0.2078 0.1583 0.2845 0.2967 0.057 0.1571 0.2991 0.5669 0.1806 0.3907 0.1331 0.6038 0.0888 0.487 0.1839 0.4222 0.1537 0.4905 0.7391 0.0763 0.353 0.6565 0.1716 0.4301 0.4503 0.0766 0.2545 0.1482 0.2842 0.2462 0.5273 0.4591 0.6574 0.3553 0.5999 0.0116 0.3027 0.4048 0.2736 0.1828 0.177 0.0309 0.1896 0.3467 0.0499 0.1082 0.2555 0.417 0.2826 0. 
0.4199 0.0366 0.1163 0.2109] 2022-08-24 00:30:45 [INFO] [EVAL] Class Precision: [0.7822 0.8625 0.9579 0.8373 0.7686 0.8672 0.8908 0.8403 0.6545 0.7098 0.7137 0.677 0.7677 0.5212 0.5448 0.6056 0.7519 0.7071 0.78 0.6396 0.8434 0.6258 0.7314 0.6275 0.46 0.5667 0.6176 0.7178 0.7014 0.4488 0.5547 0.6143 0.5665 0.5286 0.4615 0.5174 0.6232 0.7508 0.4311 0.5846 0.2768 0.3616 0.7429 0.4925 0.3367 0.44 0.6505 0.7646 0.768 0.5716 0.7536 0.5253 0.3516 0.5391 0.7335 0.613 0.9266 0.6795 0.6628 0.5557 0.1675 0.6987 0.3891 0.7727 0.5556 0.8178 0.5435 0.5674 0.114 0.5912 0.6027 0.6596 0.597 0.337 0.765 0.4927 0.7699 0.7195 0.6629 0.4827 0.8127 0.7224 0.844 0.1235 0.236 0.7204 0.549 0.4314 0.7068 0.6884 0.5984 0.2295 0.3177 0.254 0.0227 0.1438 0.7186 0.4295 0.44 0.7118 0.5385 0.3072 0.6219 0.8316 0.6911 0.4355 0.3379 0.8213 0.1538 0.5638 0.3712 0.6147 0.5283 0.8152 0.7503 0.4347 0.7707 0.7269 0.3158 0.5746 0.6944 0.3468 0.7259 0.6271 0.7603 0.5701 0.8683 0.5696 0.8653 0.7088 0.722 0.213 0.5428 0.719 0.7903 0.3487 0.2861 0.0995 0.4205 0.7234 0.211 0.2107 0.6498 0.6287 0.5911 0. 0.8139 0.3316 0.2836 0.8306] 2022-08-24 00:30:45 [INFO] [EVAL] Class Recall: [0.8493 0.9012 0.9684 0.8594 0.8499 0.8675 0.8487 0.9363 0.7282 0.8366 0.5961 0.7764 0.8912 0.4289 0.3895 0.6146 0.6524 0.5389 0.761 0.5543 0.8891 0.575 0.7661 0.7285 0.5054 0.5486 0.6446 0.5582 0.509 0.3655 0.298 0.7263 0.3656 0.5763 0.7373 0.6451 0.6467 0.6228 0.4599 0.5029 0.1456 0.1763 0.4267 0.3721 0.5396 0.2832 0.2785 0.5937 0.8249 0.7977 0.6852 0.6033 0.2854 0.3511 0.9005 0.4391 0.933 0.4981 0.6702 0.3666 0.2668 0.2537 0.5201 0.1048 0.7124 0.8019 0.3734 0.5857 0.1916 0.4893 0.7581 0.7522 0.4667 0.4741 0.5564 0.608 0.7087 0.3208 0.4 0.3107 0.8184 0.4566 0.328 0.065 0.2946 0.6778 0.1324 0.1259 0.2885 0.6902 0.5893 0.3096 0.3092 0.0406 0.028 0.0381 0.2263 0.2005 0.446 0.3372 0.0599 0.2434 0.3657 0.6405 0.1965 0.7917 0.18 0.6952 0.1736 0.7815 0.2672 0.5742 0.1782 0.5518 0.9802 0.0847 0.3945 0.8714 0.273 0.6311 0.5617 0.0895 0.2816 0.1625 0.3121 0.3024 0.5731 0.7029 0.7324 0.416 0.7802 0.0121 0.4063 0.4809 0.295 0.2776 0.3171 0.0428 0.2566 0.3997 0.0614 0.182 0.2963 0.5533 0.3513 0. 0.4645 0.0395 0.1647 0.2204] 2022-08-24 00:30:45 [INFO] [EVAL] The model with the best validation mIoU (0.3661) was saved at iter 59000. 
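For readers scanning these [EVAL] dumps: each block prints headline mIoU/Acc/Kappa/Dice figures plus three per-class vectors (IoU, Precision, Recall) over the 150 ADE20K classes, and the per-class numbers are mutually consistent under the usual definitions. IoU = TP/(TP+FP+FN) can be recovered from precision and recall as P·R/(P+R−P·R), per-class Dice is 2·IoU/(1+IoU), and the headline mIoU is the unweighted mean of the Class IoU vector. The sketch below is plain Python with no PaddleSeg dependency; how PaddleSeg rounds and aggregates the headline values is an assumption, but the per-class identities check out against the iter-66000 eval just above (class 0: IoU 0.6869, Precision 0.7822, Recall 0.8493).

```python
# Per-class segmentation metric relationships, as printed in the [EVAL] blocks above.
# Plain-Python sketch; class-0 values are taken from the eval at iter 66000.
# (The exact aggregation/rounding used for the headline numbers is assumed, not verified.)

def iou_from_pr(precision: float, recall: float) -> float:
    """IoU = TP/(TP+FP+FN), expressed through precision and recall."""
    return precision * recall / (precision + recall - precision * recall)

def dice_from_iou(iou: float) -> float:
    """Per-class Dice = 2*TP/(2*TP+FP+FN) = 2*IoU/(1+IoU)."""
    return 2 * iou / (1 + iou)

def mean_iou(class_ious):
    """Headline mIoU as the unweighted mean over the per-class IoU vector."""
    return sum(class_ious) / len(class_ious)

# Class 0 (first ADE20K class) from the iter-66000 eval above:
p0, r0 = 0.7822, 0.8493
print(round(iou_from_pr(p0, r0), 4))    # ~0.6869, matching the printed Class IoU[0]
print(round(dice_from_iou(0.6869), 4))  # ~0.8144, the implied per-class Dice
```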
2022-08-24 00:30:53 [INFO] [TRAIN] epoch: 53, iter: 66050/160000, loss: 0.5133, lr: 0.000711, batch_cost: 0.1630, reader_cost: 0.00376, ips: 49.0825 samples/sec | ETA 04:15:12 2022-08-24 00:31:01 [INFO] [TRAIN] epoch: 53, iter: 66100/160000, loss: 0.5168, lr: 0.000711, batch_cost: 0.1558, reader_cost: 0.00201, ips: 51.3583 samples/sec | ETA 04:03:46 2022-08-24 00:31:09 [INFO] [TRAIN] epoch: 53, iter: 66150/160000, loss: 0.5373, lr: 0.000711, batch_cost: 0.1604, reader_cost: 0.00391, ips: 49.8776 samples/sec | ETA 04:10:52 2022-08-24 00:31:18 [INFO] [TRAIN] epoch: 53, iter: 66200/160000, loss: 0.5952, lr: 0.000710, batch_cost: 0.1702, reader_cost: 0.00057, ips: 47.0106 samples/sec | ETA 04:26:02 2022-08-24 00:31:26 [INFO] [TRAIN] epoch: 53, iter: 66250/160000, loss: 0.5581, lr: 0.000710, batch_cost: 0.1590, reader_cost: 0.00061, ips: 50.3144 samples/sec | ETA 04:08:26 2022-08-24 00:31:34 [INFO] [TRAIN] epoch: 53, iter: 66300/160000, loss: 0.5626, lr: 0.000709, batch_cost: 0.1669, reader_cost: 0.00081, ips: 47.9252 samples/sec | ETA 04:20:41 2022-08-24 00:31:42 [INFO] [TRAIN] epoch: 53, iter: 66350/160000, loss: 0.5653, lr: 0.000709, batch_cost: 0.1573, reader_cost: 0.00051, ips: 50.8644 samples/sec | ETA 04:05:29 2022-08-24 00:31:51 [INFO] [TRAIN] epoch: 53, iter: 66400/160000, loss: 0.5201, lr: 0.000709, batch_cost: 0.1745, reader_cost: 0.00069, ips: 45.8509 samples/sec | ETA 04:32:11 2022-08-24 00:31:59 [INFO] [TRAIN] epoch: 53, iter: 66450/160000, loss: 0.5854, lr: 0.000708, batch_cost: 0.1756, reader_cost: 0.00062, ips: 45.5653 samples/sec | ETA 04:33:44 2022-08-24 00:32:09 [INFO] [TRAIN] epoch: 53, iter: 66500/160000, loss: 0.5643, lr: 0.000708, batch_cost: 0.1915, reader_cost: 0.00040, ips: 41.7855 samples/sec | ETA 04:58:20 2022-08-24 00:32:18 [INFO] [TRAIN] epoch: 53, iter: 66550/160000, loss: 0.5466, lr: 0.000708, batch_cost: 0.1772, reader_cost: 0.00063, ips: 45.1560 samples/sec | ETA 04:35:55 2022-08-24 00:32:27 [INFO] [TRAIN] epoch: 53, iter: 66600/160000, loss: 0.5680, lr: 0.000707, batch_cost: 0.1778, reader_cost: 0.00032, ips: 44.9926 samples/sec | ETA 04:36:47 2022-08-24 00:32:35 [INFO] [TRAIN] epoch: 53, iter: 66650/160000, loss: 0.5450, lr: 0.000707, batch_cost: 0.1559, reader_cost: 0.00110, ips: 51.3256 samples/sec | ETA 04:02:30 2022-08-24 00:32:42 [INFO] [TRAIN] epoch: 53, iter: 66700/160000, loss: 0.5100, lr: 0.000706, batch_cost: 0.1564, reader_cost: 0.00051, ips: 51.1396 samples/sec | ETA 04:03:15 2022-08-24 00:32:52 [INFO] [TRAIN] epoch: 53, iter: 66750/160000, loss: 0.5393, lr: 0.000706, batch_cost: 0.1983, reader_cost: 0.00073, ips: 40.3417 samples/sec | ETA 05:08:12 2022-08-24 00:33:01 [INFO] [TRAIN] epoch: 53, iter: 66800/160000, loss: 0.5054, lr: 0.000706, batch_cost: 0.1798, reader_cost: 0.00084, ips: 44.5028 samples/sec | ETA 04:39:14 2022-08-24 00:33:11 [INFO] [TRAIN] epoch: 53, iter: 66850/160000, loss: 0.5050, lr: 0.000705, batch_cost: 0.1877, reader_cost: 0.00058, ips: 42.6180 samples/sec | ETA 04:51:25 2022-08-24 00:33:20 [INFO] [TRAIN] epoch: 53, iter: 66900/160000, loss: 0.5557, lr: 0.000705, batch_cost: 0.1771, reader_cost: 0.00109, ips: 45.1769 samples/sec | ETA 04:34:46 2022-08-24 00:33:33 [INFO] [TRAIN] epoch: 54, iter: 66950/160000, loss: 0.5834, lr: 0.000704, batch_cost: 0.2733, reader_cost: 0.09619, ips: 29.2737 samples/sec | ETA 07:03:48 2022-08-24 00:33:42 [INFO] [TRAIN] epoch: 54, iter: 67000/160000, loss: 0.5448, lr: 0.000704, batch_cost: 0.1844, reader_cost: 0.00060, ips: 43.3944 samples/sec | ETA 04:45:45 2022-08-24 00:33:42 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 186s - batch_cost: 0.1855 - reader cost: 6.1166e-04 2022-08-24 00:36:48 [INFO] [EVAL] #Images: 2000 mIoU: 0.3626 Acc: 0.7711 Kappa: 0.7535 Dice: 0.5001 2022-08-24 00:36:48 [INFO] [EVAL] Class IoU: [0.6873 0.7771 0.9318 0.7448 0.6818 0.7602 0.7735 0.7989 0.5277 0.6435 0.484 0.5709 0.7059 0.2988 0.3294 0.4371 0.4968 0.47 0.6169 0.4395 0.7699 0.4566 0.5847 0.4884 0.3038 0.3617 0.5748 0.4158 0.4537 0.2927 0.2764 0.5036 0.3172 0.3461 0.3311 0.386 0.47 0.5444 0.279 0.3672 0.1047 0.1367 0.3685 0.2497 0.2673 0.2706 0.3377 0.5154 0.6298 0.5404 0.5301 0.4807 0.2144 0.2532 0.6879 0.4142 0.8617 0.3906 0.4271 0.281 0.0885 0.2406 0.3058 0.1297 0.4513 0.6766 0.2667 0.3993 0.0944 0.3519 0.4929 0.5624 0.3619 0.246 0.4761 0.3633 0.5725 0.2612 0.1856 0.1518 0.5872 0.3791 0.4131 0.1364 0.1726 0.5302 0.0917 0.1108 0.3067 0.5078 0.3917 0.0783 0.2 0.0927 0.0269 0.0495 0.194 0.1582 0.2434 0.2923 0.1032 0.1532 0.3024 0.5636 0.1299 0.3442 0.2178 0.5579 0.0785 0.2926 0.1415 0.3673 0.1303 0.469 0.7596 0.046 0.4416 0.6962 0.1741 0.4178 0.3346 0.0776 0.2635 0.1433 0.2709 0.2474 0.5345 0.4657 0.5287 0.2988 0.5793 0.1038 0.1838 0.35 0.2593 0.18 0.171 0.0345 0.2072 0.3682 0.2786 0.079 0.2109 0.4817 0.2477 0. 0.4135 0.0286 0.1503 0.2021] 2022-08-24 00:36:48 [INFO] [EVAL] Class Precision: [0.7857 0.8406 0.9596 0.844 0.792 0.8683 0.8889 0.8627 0.7162 0.7485 0.6669 0.7136 0.7706 0.5033 0.5168 0.6362 0.6509 0.6756 0.7599 0.5925 0.8531 0.6616 0.699 0.6004 0.4883 0.5453 0.6256 0.7843 0.7516 0.4861 0.5353 0.6355 0.5682 0.4886 0.4891 0.5222 0.6565 0.8269 0.4178 0.547 0.3 0.4207 0.5906 0.4365 0.3576 0.5075 0.7045 0.707 0.7368 0.6428 0.776 0.6532 0.4514 0.6237 0.7396 0.5492 0.9031 0.6458 0.7673 0.5854 0.1468 0.4669 0.402 0.6437 0.5483 0.7882 0.5253 0.5233 0.3079 0.5515 0.7425 0.7338 0.5583 0.3408 0.728 0.628 0.7715 0.5763 0.656 0.2553 0.7046 0.6444 0.7657 0.2932 0.2618 0.7028 0.7219 0.4199 0.6777 0.8287 0.5433 0.09 0.6636 0.2647 0.0644 0.1715 0.6304 0.4142 0.3255 0.634 0.6091 0.2616 0.5791 0.8337 0.5704 0.3589 0.5173 0.7962 0.1588 0.4252 0.3852 0.4707 0.4819 0.7149 0.768 0.414 0.7102 0.746 0.4395 0.5512 0.6632 0.2874 0.5966 0.6239 0.7522 0.64 0.7807 0.6055 0.7116 0.3837 0.6818 0.5724 0.5019 0.8405 0.8033 0.3559 0.3578 0.0964 0.4795 0.6049 0.3546 0.254 0.7029 0.6386 0.6575 0. 0.8816 0.272 0.4237 0.8326] 2022-08-24 00:36:48 [INFO] [EVAL] Class Recall: [0.8458 0.9113 0.9698 0.8638 0.8305 0.8593 0.8562 0.9152 0.6673 0.821 0.6384 0.7406 0.8937 0.4238 0.476 0.5827 0.6773 0.607 0.7662 0.63 0.8876 0.5958 0.7816 0.7235 0.4457 0.5178 0.8762 0.4695 0.5337 0.4238 0.3637 0.7082 0.418 0.5428 0.5063 0.5968 0.6233 0.6144 0.4566 0.5277 0.1386 0.1685 0.4949 0.3684 0.5143 0.3669 0.3934 0.6554 0.8126 0.7723 0.6259 0.6455 0.2899 0.2989 0.9079 0.6275 0.9496 0.497 0.4907 0.3507 0.182 0.3318 0.561 0.1397 0.7183 0.8269 0.3513 0.6275 0.1198 0.4931 0.5945 0.7066 0.5072 0.4694 0.5791 0.463 0.6894 0.3233 0.2056 0.2725 0.7789 0.4794 0.4728 0.2033 0.3362 0.6834 0.0951 0.1308 0.359 0.5674 0.584 0.3759 0.2226 0.1249 0.0443 0.065 0.219 0.2038 0.4909 0.3516 0.1105 0.27 0.3876 0.635 0.1439 0.8934 0.2734 0.6509 0.1344 0.4841 0.1828 0.6258 0.1516 0.5768 0.9858 0.0492 0.5386 0.9124 0.2238 0.6331 0.4031 0.0961 0.3207 0.1568 0.2975 0.2874 0.6289 0.6685 0.673 0.5746 0.794 0.1126 0.2248 0.3749 0.2769 0.2669 0.2467 0.051 0.2674 0.4848 0.5655 0.1029 0.2316 0.6623 0.2844 0. 0.4378 0.0309 0.189 0.2106] 2022-08-24 00:36:48 [INFO] [EVAL] The model with the best validation mIoU (0.3661) was saved at iter 59000. 
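The [TRAIN] lines between eval blocks are internally consistent as well: ips is simply the batch size divided by batch_cost (the numbers imply 8 samples per step, e.g. 8 / 0.1915 ≈ 41.8 against the logged 41.7855 at iter 66500), and ETA is approximately the remaining iterations times batch_cost ((160000 − 66500) × 0.1915 s ≈ 04:58:25, within a few seconds of the logged 04:58:20; the logger presumably uses an unrounded running average). The reader_cost spikes on the first step of each epoch (0.10628 at iter 64450, 0.09619 at iter 66950) are the data loader restarting, which is also why batch_cost jumps there. A minimal sketch of that arithmetic, with the batch size of 8 inferred from the log rather than read from the yml config:

```python
# Sanity-checking the throughput/ETA bookkeeping in the [TRAIN] lines above.
# Assumptions: 8 samples per step (inferred from ips * batch_cost ~= 8), and
# ETA ~= remaining_iters * batch_cost; the printed batch_cost is rounded and
# averaged over the 50-iter logging window, so second-level differences are expected.
import datetime

TOTAL_ITERS = 160_000
BATCH_SIZE = 8  # inferred from the log, not read from the config

def ips(batch_cost: float) -> float:
    return BATCH_SIZE / batch_cost

def eta(cur_iter: int, batch_cost: float) -> str:
    remaining_seconds = (TOTAL_ITERS - cur_iter) * batch_cost
    return str(datetime.timedelta(seconds=int(remaining_seconds)))

print(round(ips(0.1915), 2))  # ~41.78 samples/sec (logged: 41.7855 at iter 66500)
print(eta(66500, 0.1915))     # ~4:58:25 (logged ETA 04:58:20)
```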
2022-08-24 00:36:57 [INFO] [TRAIN] epoch: 54, iter: 67050/160000, loss: 0.5295, lr: 0.000704, batch_cost: 0.1686, reader_cost: 0.00267, ips: 47.4602 samples/sec | ETA 04:21:07 2022-08-24 00:37:06 [INFO] [TRAIN] epoch: 54, iter: 67100/160000, loss: 0.5738, lr: 0.000703, batch_cost: 0.1797, reader_cost: 0.00114, ips: 44.5164 samples/sec | ETA 04:38:14 2022-08-24 00:37:15 [INFO] [TRAIN] epoch: 54, iter: 67150/160000, loss: 0.5082, lr: 0.000703, batch_cost: 0.1777, reader_cost: 0.00049, ips: 45.0320 samples/sec | ETA 04:34:54 2022-08-24 00:37:24 [INFO] [TRAIN] epoch: 54, iter: 67200/160000, loss: 0.5561, lr: 0.000703, batch_cost: 0.1877, reader_cost: 0.00099, ips: 42.6292 samples/sec | ETA 04:50:15 2022-08-24 00:37:34 [INFO] [TRAIN] epoch: 54, iter: 67250/160000, loss: 0.5358, lr: 0.000702, batch_cost: 0.1897, reader_cost: 0.00092, ips: 42.1774 samples/sec | ETA 04:53:12 2022-08-24 00:37:43 [INFO] [TRAIN] epoch: 54, iter: 67300/160000, loss: 0.5147, lr: 0.000702, batch_cost: 0.1911, reader_cost: 0.00034, ips: 41.8736 samples/sec | ETA 04:55:10 2022-08-24 00:37:52 [INFO] [TRAIN] epoch: 54, iter: 67350/160000, loss: 0.4900, lr: 0.000701, batch_cost: 0.1826, reader_cost: 0.00050, ips: 43.8041 samples/sec | ETA 04:42:00 2022-08-24 00:38:01 [INFO] [TRAIN] epoch: 54, iter: 67400/160000, loss: 0.5284, lr: 0.000701, batch_cost: 0.1703, reader_cost: 0.00065, ips: 46.9742 samples/sec | ETA 04:22:50 2022-08-24 00:38:09 [INFO] [TRAIN] epoch: 54, iter: 67450/160000, loss: 0.5301, lr: 0.000701, batch_cost: 0.1738, reader_cost: 0.00059, ips: 46.0284 samples/sec | ETA 04:28:05 2022-08-24 00:38:18 [INFO] [TRAIN] epoch: 54, iter: 67500/160000, loss: 0.5397, lr: 0.000700, batch_cost: 0.1638, reader_cost: 0.00031, ips: 48.8367 samples/sec | ETA 04:12:32 2022-08-24 00:38:27 [INFO] [TRAIN] epoch: 54, iter: 67550/160000, loss: 0.4949, lr: 0.000700, batch_cost: 0.1793, reader_cost: 0.00045, ips: 44.6189 samples/sec | ETA 04:36:15 2022-08-24 00:38:35 [INFO] [TRAIN] epoch: 54, iter: 67600/160000, loss: 0.5525, lr: 0.000700, batch_cost: 0.1671, reader_cost: 0.00077, ips: 47.8711 samples/sec | ETA 04:17:21 2022-08-24 00:38:43 [INFO] [TRAIN] epoch: 54, iter: 67650/160000, loss: 0.5472, lr: 0.000699, batch_cost: 0.1622, reader_cost: 0.00052, ips: 49.3289 samples/sec | ETA 04:09:37 2022-08-24 00:38:52 [INFO] [TRAIN] epoch: 54, iter: 67700/160000, loss: 0.5395, lr: 0.000699, batch_cost: 0.1711, reader_cost: 0.00066, ips: 46.7622 samples/sec | ETA 04:23:10 2022-08-24 00:39:00 [INFO] [TRAIN] epoch: 54, iter: 67750/160000, loss: 0.5200, lr: 0.000698, batch_cost: 0.1670, reader_cost: 0.00092, ips: 47.9043 samples/sec | ETA 04:16:45 2022-08-24 00:39:09 [INFO] [TRAIN] epoch: 54, iter: 67800/160000, loss: 0.5744, lr: 0.000698, batch_cost: 0.1805, reader_cost: 0.00083, ips: 44.3115 samples/sec | ETA 04:37:25 2022-08-24 00:39:18 [INFO] [TRAIN] epoch: 54, iter: 67850/160000, loss: 0.5500, lr: 0.000698, batch_cost: 0.1769, reader_cost: 0.00067, ips: 45.2327 samples/sec | ETA 04:31:37 2022-08-24 00:39:26 [INFO] [TRAIN] epoch: 54, iter: 67900/160000, loss: 0.5331, lr: 0.000697, batch_cost: 0.1587, reader_cost: 0.00053, ips: 50.4014 samples/sec | ETA 04:03:38 2022-08-24 00:39:35 [INFO] [TRAIN] epoch: 54, iter: 67950/160000, loss: 0.5238, lr: 0.000697, batch_cost: 0.1872, reader_cost: 0.00044, ips: 42.7435 samples/sec | ETA 04:47:08 2022-08-24 00:39:44 [INFO] [TRAIN] epoch: 54, iter: 68000/160000, loss: 0.5436, lr: 0.000697, batch_cost: 0.1825, reader_cost: 0.00116, ips: 43.8275 samples/sec | ETA 04:39:53 2022-08-24 00:39:44 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 186s - batch_cost: 0.1864 - reader cost: 7.7045e-04 2022-08-24 00:42:51 [INFO] [EVAL] #Images: 2000 mIoU: 0.3631 Acc: 0.7710 Kappa: 0.7536 Dice: 0.4990 2022-08-24 00:42:51 [INFO] [EVAL] Class IoU: [0.6906 0.7838 0.9308 0.739 0.6864 0.7626 0.7777 0.7985 0.5253 0.6095 0.4929 0.5725 0.7134 0.2891 0.3183 0.4395 0.5174 0.4589 0.6319 0.4205 0.7633 0.4373 0.6207 0.5172 0.3161 0.4241 0.5567 0.4458 0.4165 0.2499 0.3004 0.4775 0.2829 0.3425 0.3444 0.4174 0.4757 0.5621 0.2608 0.3518 0.1403 0.1491 0.3598 0.2684 0.2893 0.1788 0.2548 0.5073 0.6158 0.5571 0.5569 0.3267 0.1915 0.2135 0.6813 0.3736 0.7968 0.3928 0.4765 0.289 0.0546 0.2721 0.2415 0.2159 0.4728 0.6938 0.2472 0.3675 0.0526 0.3519 0.5159 0.5244 0.3638 0.2329 0.4854 0.3827 0.5379 0.2553 0.1659 0.1865 0.6448 0.4175 0.4166 0.0413 0.0367 0.5004 0.1281 0.1229 0.3317 0.5232 0.3957 0.0358 0.2242 0.0524 0.0442 0.03 0.1839 0.177 0.2654 0.2872 0.1481 0.1761 0.2971 0.6995 0.0561 0.5904 0.2121 0.5387 0.0767 0.4918 0.1772 0.2809 0.1392 0.4789 0.6615 0.0511 0.3898 0.6738 0.1765 0.431 0.4011 0.0388 0.2999 0.1565 0.2931 0.2479 0.5406 0.4738 0.5191 0.3439 0.5427 0.022 0.2806 0.3336 0.3021 0.1469 0.1535 0.0402 0.192 0.3644 0.2495 0.0987 0.2379 0.4821 0.3225 0. 0.4246 0.0164 0.1478 0.2325] 2022-08-24 00:42:51 [INFO] [EVAL] Class Precision: [0.7872 0.873 0.9589 0.8255 0.7816 0.8854 0.8944 0.8499 0.6919 0.7058 0.6932 0.6967 0.7995 0.5215 0.5736 0.5584 0.6226 0.7133 0.8134 0.6476 0.8397 0.6736 0.803 0.645 0.4545 0.5161 0.5994 0.734 0.7764 0.4526 0.5228 0.5965 0.5263 0.4295 0.5193 0.5572 0.6787 0.7639 0.3795 0.6081 0.2895 0.4106 0.6285 0.5183 0.4265 0.3831 0.5515 0.6596 0.661 0.6919 0.6642 0.3858 0.3333 0.5883 0.7295 0.6623 0.8185 0.6652 0.5919 0.5095 0.0912 0.6276 0.3468 0.5982 0.5678 0.8029 0.536 0.4318 0.152 0.5332 0.676 0.7267 0.567 0.3758 0.6842 0.5986 0.675 0.4912 0.701 0.4297 0.7864 0.692 0.7651 0.119 0.0822 0.7499 0.4703 0.3583 0.7024 0.731 0.5172 0.0431 0.3888 0.2961 0.2298 0.1123 0.7743 0.4986 0.3685 0.6052 0.5798 0.3481 0.486 0.7396 0.4944 0.6693 0.4157 0.7011 0.1666 0.5015 0.408 0.3346 0.514 0.8685 0.6716 0.1703 0.7807 0.7176 0.2556 0.5806 0.7103 0.2001 0.5203 0.5095 0.6134 0.6199 0.8088 0.6388 0.6014 0.6503 0.675 0.3031 0.5673 0.8514 0.6988 0.3593 0.4016 0.0954 0.4115 0.6864 0.4287 0.2659 0.5988 0.6079 0.6388 0. 0.7447 0.3629 0.5966 0.8163] 2022-08-24 00:42:51 [INFO] [EVAL] Class Recall: [0.8491 0.8846 0.9694 0.8759 0.8493 0.8462 0.8563 0.9296 0.6857 0.8171 0.6304 0.7626 0.8688 0.3935 0.417 0.6736 0.7537 0.5627 0.739 0.5452 0.8934 0.5548 0.7322 0.7231 0.5092 0.7039 0.8867 0.5317 0.4733 0.3581 0.4139 0.7054 0.3794 0.6285 0.5056 0.6245 0.614 0.6803 0.4546 0.455 0.214 0.1898 0.4571 0.3576 0.4734 0.2511 0.3214 0.6872 0.8999 0.7409 0.7752 0.6806 0.3104 0.251 0.9116 0.4615 0.9678 0.4895 0.7097 0.4003 0.1196 0.3245 0.4428 0.2525 0.7387 0.8363 0.3146 0.7119 0.0745 0.5085 0.6853 0.6533 0.5038 0.3798 0.6256 0.5149 0.7259 0.3471 0.1786 0.2479 0.7817 0.5129 0.4777 0.0595 0.0622 0.6005 0.1498 0.1576 0.386 0.6479 0.6275 0.1748 0.3463 0.0599 0.0518 0.0393 0.1943 0.2154 0.4868 0.3534 0.1659 0.2629 0.4333 0.9282 0.0596 0.8334 0.3023 0.6993 0.1245 0.9618 0.2385 0.6361 0.1603 0.5163 0.9777 0.068 0.4377 0.9169 0.3631 0.626 0.4796 0.0459 0.4146 0.1843 0.3595 0.2924 0.6198 0.6471 0.7913 0.4219 0.7346 0.0231 0.357 0.3542 0.3474 0.199 0.199 0.0649 0.2646 0.4372 0.3737 0.1358 0.283 0.6996 0.3944 0. 
0.497 0.0168 0.1643 0.2453] 2022-08-24 00:42:51 [INFO] [EVAL] The model with the best validation mIoU (0.3661) was saved at iter 59000. 2022-08-24 00:43:01 [INFO] [TRAIN] epoch: 54, iter: 68050/160000, loss: 0.5451, lr: 0.000696, batch_cost: 0.1847, reader_cost: 0.00324, ips: 43.3212 samples/sec | ETA 04:43:00 2022-08-24 00:43:08 [INFO] [TRAIN] epoch: 54, iter: 68100/160000, loss: 0.5702, lr: 0.000696, batch_cost: 0.1568, reader_cost: 0.00316, ips: 51.0270 samples/sec | ETA 04:00:08 2022-08-24 00:43:16 [INFO] [TRAIN] epoch: 54, iter: 68150/160000, loss: 0.5282, lr: 0.000695, batch_cost: 0.1523, reader_cost: 0.00072, ips: 52.5203 samples/sec | ETA 03:53:10 2022-08-24 00:43:25 [INFO] [TRAIN] epoch: 54, iter: 68200/160000, loss: 0.5497, lr: 0.000695, batch_cost: 0.1814, reader_cost: 0.00063, ips: 44.1122 samples/sec | ETA 04:37:28 2022-08-24 00:43:38 [INFO] [TRAIN] epoch: 55, iter: 68250/160000, loss: 0.5747, lr: 0.000695, batch_cost: 0.2519, reader_cost: 0.09234, ips: 31.7568 samples/sec | ETA 06:25:13 2022-08-24 00:43:47 [INFO] [TRAIN] epoch: 55, iter: 68300/160000, loss: 0.5550, lr: 0.000694, batch_cost: 0.1922, reader_cost: 0.00060, ips: 41.6306 samples/sec | ETA 04:53:41 2022-08-24 00:43:56 [INFO] [TRAIN] epoch: 55, iter: 68350/160000, loss: 0.4991, lr: 0.000694, batch_cost: 0.1763, reader_cost: 0.00061, ips: 45.3675 samples/sec | ETA 04:29:21 2022-08-24 00:44:06 [INFO] [TRAIN] epoch: 55, iter: 68400/160000, loss: 0.5170, lr: 0.000694, batch_cost: 0.2003, reader_cost: 0.00038, ips: 39.9308 samples/sec | ETA 05:05:51 2022-08-24 00:44:14 [INFO] [TRAIN] epoch: 55, iter: 68450/160000, loss: 0.5310, lr: 0.000693, batch_cost: 0.1589, reader_cost: 0.00040, ips: 50.3437 samples/sec | ETA 04:02:27 2022-08-24 00:44:22 [INFO] [TRAIN] epoch: 55, iter: 68500/160000, loss: 0.5373, lr: 0.000693, batch_cost: 0.1564, reader_cost: 0.00050, ips: 51.1386 samples/sec | ETA 03:58:34 2022-08-24 00:44:31 [INFO] [TRAIN] epoch: 55, iter: 68550/160000, loss: 0.5675, lr: 0.000692, batch_cost: 0.1836, reader_cost: 0.00047, ips: 43.5702 samples/sec | ETA 04:39:51 2022-08-24 00:44:39 [INFO] [TRAIN] epoch: 55, iter: 68600/160000, loss: 0.5322, lr: 0.000692, batch_cost: 0.1564, reader_cost: 0.00126, ips: 51.1416 samples/sec | ETA 03:58:17 2022-08-24 00:44:46 [INFO] [TRAIN] epoch: 55, iter: 68650/160000, loss: 0.5237, lr: 0.000692, batch_cost: 0.1513, reader_cost: 0.00057, ips: 52.8899 samples/sec | ETA 03:50:17 2022-08-24 00:44:55 [INFO] [TRAIN] epoch: 55, iter: 68700/160000, loss: 0.5337, lr: 0.000691, batch_cost: 0.1614, reader_cost: 0.00053, ips: 49.5675 samples/sec | ETA 04:05:35 2022-08-24 00:45:04 [INFO] [TRAIN] epoch: 55, iter: 68750/160000, loss: 0.4855, lr: 0.000691, batch_cost: 0.1875, reader_cost: 0.00068, ips: 42.6592 samples/sec | ETA 04:45:12 2022-08-24 00:45:12 [INFO] [TRAIN] epoch: 55, iter: 68800/160000, loss: 0.5401, lr: 0.000690, batch_cost: 0.1615, reader_cost: 0.00093, ips: 49.5396 samples/sec | ETA 04:05:27 2022-08-24 00:45:20 [INFO] [TRAIN] epoch: 55, iter: 68850/160000, loss: 0.5278, lr: 0.000690, batch_cost: 0.1689, reader_cost: 0.00035, ips: 47.3599 samples/sec | ETA 04:16:36 2022-08-24 00:45:29 [INFO] [TRAIN] epoch: 55, iter: 68900/160000, loss: 0.4996, lr: 0.000690, batch_cost: 0.1633, reader_cost: 0.00081, ips: 48.9900 samples/sec | ETA 04:07:56 2022-08-24 00:45:38 [INFO] [TRAIN] epoch: 55, iter: 68950/160000, loss: 0.5685, lr: 0.000689, batch_cost: 0.1850, reader_cost: 0.00085, ips: 43.2363 samples/sec | ETA 04:40:46 2022-08-24 00:45:46 [INFO] [TRAIN] epoch: 55, iter: 69000/160000, loss: 
0.5126, lr: 0.000689, batch_cost: 0.1552, reader_cost: 0.00070, ips: 51.5528 samples/sec | ETA 03:55:21 2022-08-24 00:45:46 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 198s - batch_cost: 0.1982 - reader cost: 6.4898e-04 2022-08-24 00:49:04 [INFO] [EVAL] #Images: 2000 mIoU: 0.3653 Acc: 0.7683 Kappa: 0.7506 Dice: 0.5034 2022-08-24 00:49:04 [INFO] [EVAL] Class IoU: [0.6825 0.7773 0.9324 0.7406 0.6732 0.7639 0.7678 0.8081 0.5086 0.654 0.4832 0.5732 0.7132 0.277 0.2761 0.4425 0.512 0.4085 0.6256 0.4406 0.7508 0.4549 0.6108 0.4927 0.3289 0.3333 0.5524 0.4547 0.4482 0.261 0.2816 0.5119 0.28 0.3672 0.3164 0.4038 0.4545 0.4727 0.2671 0.3937 0.1097 0.1326 0.3531 0.2695 0.262 0.2431 0.3676 0.4625 0.6549 0.5538 0.575 0.4133 0.1989 0.2592 0.6597 0.3178 0.8818 0.3927 0.456 0.3077 0.1166 0.2255 0.2766 0.0915 0.4656 0.663 0.2782 0.3922 0.1497 0.3475 0.4969 0.5374 0.316 0.2481 0.4931 0.3839 0.5914 0.279 0.1713 0.2697 0.6339 0.3946 0.4037 0.0646 0.2542 0.5252 0.1128 0.1166 0.2674 0.5277 0.4036 0.0805 0.2064 0.1047 0.1308 0.0387 0.1922 0.1918 0.2894 0.2954 0.2691 0.1365 0.2435 0.3247 0.1553 0.6231 0.2022 0.5451 0.0963 0.4478 0.135 0.2677 0.1429 0.606 0.6927 0.0635 0.4673 0.6682 0.1751 0.343 0.3881 0.0562 0.2926 0.1556 0.2876 0.2586 0.5421 0.4813 0.5989 0.3259 0.5941 0.0452 0.2911 0.3386 0.3104 0.1922 0.1587 0.0208 0.2249 0.3642 0.1575 0.007 0.2603 0.371 0.2393 0. 0.4342 0.0328 0.1694 0.1917] 2022-08-24 00:49:04 [INFO] [EVAL] Class Precision: [0.7753 0.8744 0.9647 0.838 0.7431 0.8509 0.8979 0.8833 0.6356 0.78 0.688 0.6909 0.7855 0.4861 0.5875 0.5805 0.6364 0.6896 0.773 0.6203 0.8142 0.6684 0.7506 0.6199 0.4508 0.4971 0.661 0.6935 0.7612 0.533 0.4725 0.6883 0.5032 0.4971 0.4769 0.4914 0.6896 0.8416 0.401 0.6037 0.2593 0.3839 0.6347 0.4303 0.4306 0.462 0.5468 0.5982 0.7607 0.6948 0.7196 0.5455 0.339 0.6935 0.7046 0.5642 0.9384 0.6879 0.7018 0.6866 0.1508 0.4038 0.496 0.6547 0.5659 0.8593 0.4161 0.5793 0.4092 0.5377 0.6118 0.7071 0.6276 0.3375 0.6838 0.5353 0.8004 0.6059 0.6212 0.4469 0.7687 0.7393 0.7915 0.1611 0.3296 0.7096 0.6981 0.363 0.6681 0.6655 0.5957 0.0961 0.4524 0.3082 0.3464 0.0993 0.6181 0.5175 0.4475 0.654 0.6927 0.25 0.5903 0.6269 0.7978 0.7623 0.5498 0.8364 0.1565 0.5318 0.3993 0.3273 0.5194 0.8177 0.6989 0.2976 0.7363 0.7506 0.3819 0.4227 0.7125 0.2672 0.5373 0.6536 0.7047 0.6007 0.8292 0.617 0.8187 0.4238 0.6672 0.5413 0.5993 0.8243 0.5501 0.3267 0.299 0.088 0.4293 0.5291 0.319 0.0159 0.6745 0.6555 0.4865 0. 
0.8624 0.2835 0.5977 0.874 ] 2022-08-24 00:49:04 [INFO] [EVAL] Class Recall: [0.8508 0.875 0.9654 0.8645 0.8774 0.8819 0.8413 0.9047 0.718 0.802 0.6189 0.7709 0.8857 0.3917 0.3425 0.6505 0.7238 0.5005 0.7663 0.6033 0.906 0.5875 0.7663 0.706 0.5487 0.5028 0.7708 0.5691 0.5216 0.3384 0.4106 0.6663 0.387 0.5842 0.4845 0.6937 0.5714 0.5189 0.4443 0.531 0.1597 0.1684 0.4431 0.4189 0.4009 0.3392 0.5287 0.6709 0.8248 0.7319 0.741 0.6304 0.325 0.2928 0.9119 0.4212 0.936 0.4778 0.5655 0.3579 0.3397 0.3381 0.3848 0.0961 0.7244 0.7437 0.4564 0.5484 0.1909 0.4955 0.7257 0.6912 0.3889 0.4839 0.6388 0.5758 0.6938 0.3409 0.1912 0.4049 0.7834 0.4584 0.4518 0.0973 0.5263 0.6691 0.1186 0.1466 0.3083 0.7181 0.5559 0.3322 0.2752 0.1369 0.1736 0.0595 0.2181 0.2336 0.4502 0.3501 0.3056 0.2311 0.293 0.4025 0.1617 0.7734 0.2424 0.6101 0.2001 0.7392 0.1694 0.5952 0.1647 0.7007 0.9873 0.0747 0.5612 0.8588 0.2444 0.6453 0.4601 0.0664 0.3911 0.1695 0.3271 0.3123 0.6103 0.6864 0.6905 0.5853 0.8443 0.047 0.3614 0.365 0.4161 0.3182 0.2527 0.0265 0.3208 0.5388 0.2372 0.0123 0.2977 0.4609 0.3201 0. 0.4665 0.0357 0.1912 0.1972] 2022-08-24 00:49:04 [INFO] [EVAL] The model with the best validation mIoU (0.3661) was saved at iter 59000. 2022-08-24 00:49:13 [INFO] [TRAIN] epoch: 55, iter: 69050/160000, loss: 0.5332, lr: 0.000689, batch_cost: 0.1806, reader_cost: 0.00372, ips: 44.2905 samples/sec | ETA 04:33:47 2022-08-24 00:49:23 [INFO] [TRAIN] epoch: 55, iter: 69100/160000, loss: 0.5256, lr: 0.000688, batch_cost: 0.1931, reader_cost: 0.00102, ips: 41.4355 samples/sec | ETA 04:52:30 2022-08-24 00:49:31 [INFO] [TRAIN] epoch: 55, iter: 69150/160000, loss: 0.5090, lr: 0.000688, batch_cost: 0.1575, reader_cost: 0.00053, ips: 50.7919 samples/sec | ETA 03:58:29 2022-08-24 00:49:39 [INFO] [TRAIN] epoch: 55, iter: 69200/160000, loss: 0.5396, lr: 0.000687, batch_cost: 0.1650, reader_cost: 0.00051, ips: 48.4881 samples/sec | ETA 04:09:41 2022-08-24 00:49:47 [INFO] [TRAIN] epoch: 55, iter: 69250/160000, loss: 0.5450, lr: 0.000687, batch_cost: 0.1628, reader_cost: 0.00091, ips: 49.1320 samples/sec | ETA 04:06:16 2022-08-24 00:49:56 [INFO] [TRAIN] epoch: 55, iter: 69300/160000, loss: 0.4976, lr: 0.000687, batch_cost: 0.1753, reader_cost: 0.00076, ips: 45.6444 samples/sec | ETA 04:24:56 2022-08-24 00:50:05 [INFO] [TRAIN] epoch: 55, iter: 69350/160000, loss: 0.5449, lr: 0.000686, batch_cost: 0.1802, reader_cost: 0.00039, ips: 44.3867 samples/sec | ETA 04:32:18 2022-08-24 00:50:15 [INFO] [TRAIN] epoch: 55, iter: 69400/160000, loss: 0.5302, lr: 0.000686, batch_cost: 0.1961, reader_cost: 0.00041, ips: 40.7932 samples/sec | ETA 04:56:07 2022-08-24 00:50:24 [INFO] [TRAIN] epoch: 55, iter: 69450/160000, loss: 0.6068, lr: 0.000686, batch_cost: 0.1841, reader_cost: 0.00035, ips: 43.4627 samples/sec | ETA 04:37:47 2022-08-24 00:50:38 [INFO] [TRAIN] epoch: 56, iter: 69500/160000, loss: 0.5747, lr: 0.000685, batch_cost: 0.2707, reader_cost: 0.06881, ips: 29.5550 samples/sec | ETA 06:48:16 2022-08-24 00:50:48 [INFO] [TRAIN] epoch: 56, iter: 69550/160000, loss: 0.5295, lr: 0.000685, batch_cost: 0.2125, reader_cost: 0.00087, ips: 37.6492 samples/sec | ETA 05:20:19 2022-08-24 00:50:58 [INFO] [TRAIN] epoch: 56, iter: 69600/160000, loss: 0.5123, lr: 0.000684, batch_cost: 0.1906, reader_cost: 0.00030, ips: 41.9648 samples/sec | ETA 04:47:13 2022-08-24 00:51:07 [INFO] [TRAIN] epoch: 56, iter: 69650/160000, loss: 0.5337, lr: 0.000684, batch_cost: 0.1818, reader_cost: 0.00076, ips: 43.9939 samples/sec | ETA 04:33:49 2022-08-24 00:51:16 [INFO] [TRAIN] epoch: 56, 
iter: 69700/160000, loss: 0.5115, lr: 0.000684, batch_cost: 0.1845, reader_cost: 0.00086, ips: 43.3707 samples/sec | ETA 04:37:36 2022-08-24 00:51:25 [INFO] [TRAIN] epoch: 56, iter: 69750/160000, loss: 0.5407, lr: 0.000683, batch_cost: 0.1779, reader_cost: 0.00046, ips: 44.9810 samples/sec | ETA 04:27:31 2022-08-24 00:51:34 [INFO] [TRAIN] epoch: 56, iter: 69800/160000, loss: 0.5223, lr: 0.000683, batch_cost: 0.1830, reader_cost: 0.00055, ips: 43.7142 samples/sec | ETA 04:35:07 2022-08-24 00:51:43 [INFO] [TRAIN] epoch: 56, iter: 69850/160000, loss: 0.5422, lr: 0.000683, batch_cost: 0.1771, reader_cost: 0.00043, ips: 45.1701 samples/sec | ETA 04:26:06 2022-08-24 00:51:52 [INFO] [TRAIN] epoch: 56, iter: 69900/160000, loss: 0.5390, lr: 0.000682, batch_cost: 0.1823, reader_cost: 0.00092, ips: 43.8840 samples/sec | ETA 04:33:45 2022-08-24 00:52:01 [INFO] [TRAIN] epoch: 56, iter: 69950/160000, loss: 0.5020, lr: 0.000682, batch_cost: 0.1802, reader_cost: 0.00033, ips: 44.3881 samples/sec | ETA 04:30:29 2022-08-24 00:52:09 [INFO] [TRAIN] epoch: 56, iter: 70000/160000, loss: 0.5160, lr: 0.000681, batch_cost: 0.1638, reader_cost: 0.00052, ips: 48.8361 samples/sec | ETA 04:05:43 2022-08-24 00:52:09 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 182s - batch_cost: 0.1823 - reader cost: 9.5130e-04 2022-08-24 00:55:12 [INFO] [EVAL] #Images: 2000 mIoU: 0.3678 Acc: 0.7721 Kappa: 0.7545 Dice: 0.5042 2022-08-24 00:55:12 [INFO] [EVAL] Class IoU: [0.6829 0.7893 0.9295 0.7433 0.6918 0.7492 0.7764 0.806 0.5165 0.6776 0.489 0.566 0.7048 0.2861 0.2877 0.4351 0.4643 0.4621 0.604 0.4201 0.7682 0.4867 0.6129 0.5044 0.3172 0.4579 0.5559 0.4402 0.4297 0.2707 0.2855 0.4702 0.2842 0.349 0.3363 0.3993 0.4687 0.5711 0.2771 0.3475 0.1198 0.156 0.3542 0.2673 0.2942 0.2378 0.3502 0.5266 0.67 0.5014 0.5691 0.4027 0.1633 0.1001 0.6755 0.4618 0.8733 0.4429 0.3969 0.2569 0.0626 0.2483 0.2565 0.1336 0.4452 0.7205 0.264 0.4063 0.1206 0.3338 0.5 0.5572 0.3883 0.2072 0.4825 0.349 0.5312 0.2666 0.3544 0.2199 0.6531 0.3805 0.355 0.0468 0.1214 0.5219 0.1307 0.0916 0.3154 0.4981 0.4385 0.0596 0.1078 0.0898 0.235 0.0232 0.1712 0.2162 0.2796 0.2816 0.2522 0.1142 0.3044 0.6218 0.1539 0.5725 0.1728 0.5593 0.0845 0.5424 0.1642 0.4388 0.1403 0.6487 0.7023 0.0513 0.3686 0.7258 0.1712 0.3821 0.379 0.0837 0.2227 0.2135 0.2751 0.263 0.5039 0.4932 0.5841 0.3635 0.5668 0.0734 0.2228 0.3256 0.2471 0.1641 0.1336 0.0322 0.1976 0.3709 0.2027 0.0006 0.2214 0.4309 0.2306 0.0006 0.3978 0.0351 0.159 0.2053] 2022-08-24 00:55:12 [INFO] [EVAL] Class Precision: [0.7748 0.8597 0.9649 0.8326 0.7826 0.8756 0.8746 0.8714 0.6414 0.751 0.6904 0.7136 0.7763 0.5489 0.585 0.611 0.6208 0.6535 0.758 0.6462 0.8486 0.7132 0.7286 0.6156 0.4905 0.6141 0.6449 0.7654 0.7071 0.6592 0.4447 0.6341 0.5506 0.5009 0.4515 0.5225 0.6867 0.785 0.4579 0.621 0.3485 0.3735 0.7136 0.5248 0.396 0.4108 0.6776 0.7687 0.7888 0.5914 0.7659 0.531 0.341 0.6672 0.7177 0.6474 0.9538 0.6213 0.7254 0.4493 0.1037 0.4224 0.3424 0.8141 0.5456 0.8462 0.4342 0.5611 0.2745 0.583 0.7441 0.6786 0.6155 0.3209 0.6679 0.4793 0.6938 0.5682 0.6993 0.2801 0.8012 0.5741 0.8264 0.151 0.2311 0.7053 0.5324 0.3961 0.5402 0.6144 0.6735 0.0659 0.3666 0.3215 0.4013 0.0776 0.5168 0.4661 0.4191 0.6601 0.8048 0.1887 0.5024 0.733 0.8154 0.6309 0.2813 0.8261 0.1576 0.5717 0.4631 0.6505 0.6097 0.719 0.7117 0.5034 0.797 0.7561 0.3776 0.5035 0.707 0.3477 0.5949 0.5122 0.7413 0.6128 0.7634 0.6496 0.7115 0.5249 0.6355 0.3315 0.489 0.8147 0.7846 0.5079 0.3599 0.06 0.4555 0.6579 0.5784 0.0064 0.6829 
0.54 0.6025 0.0016 0.8928 0.2282 0.4263 0.8944] 2022-08-24 00:55:12 [INFO] [EVAL] Class Recall: [0.852 0.906 0.9621 0.8739 0.8564 0.8384 0.8736 0.9148 0.7262 0.874 0.6264 0.7324 0.8844 0.3741 0.3615 0.6018 0.6482 0.6121 0.7482 0.5456 0.8902 0.6052 0.7941 0.7363 0.4732 0.6429 0.8011 0.5088 0.5227 0.3147 0.4435 0.6453 0.3701 0.5351 0.5686 0.6288 0.5962 0.677 0.4124 0.4411 0.1545 0.2113 0.4129 0.3527 0.5336 0.3608 0.4202 0.6257 0.8165 0.7672 0.689 0.625 0.2386 0.1053 0.9199 0.6169 0.9119 0.6067 0.4671 0.375 0.1365 0.3759 0.5054 0.1378 0.7076 0.8291 0.4025 0.5956 0.1769 0.4384 0.6039 0.757 0.5126 0.369 0.6347 0.562 0.6938 0.3343 0.4181 0.5057 0.7794 0.5301 0.3836 0.0636 0.2038 0.6675 0.1477 0.1065 0.4312 0.7248 0.5569 0.3834 0.1325 0.1109 0.3619 0.0321 0.2038 0.2873 0.4565 0.3293 0.2686 0.2242 0.4358 0.8039 0.1594 0.8609 0.3094 0.634 0.1542 0.9135 0.2029 0.5741 0.1542 0.869 0.9814 0.054 0.4068 0.9477 0.2386 0.6132 0.4496 0.0992 0.2625 0.2679 0.3043 0.3153 0.5972 0.672 0.7654 0.5417 0.8398 0.0861 0.2905 0.3516 0.2651 0.1952 0.1752 0.065 0.2587 0.4596 0.2378 0.0007 0.2468 0.6809 0.272 0.001 0.4177 0.0399 0.2022 0.2104] 2022-08-24 00:55:12 [INFO] [EVAL] The model with the best validation mIoU (0.3678) was saved at iter 70000. 2022-08-24 00:55:22 [INFO] [TRAIN] epoch: 56, iter: 70050/160000, loss: 0.4836, lr: 0.000681, batch_cost: 0.1912, reader_cost: 0.00356, ips: 41.8493 samples/sec | ETA 04:46:35 2022-08-24 00:55:31 [INFO] [TRAIN] epoch: 56, iter: 70100/160000, loss: 0.5385, lr: 0.000681, batch_cost: 0.1773, reader_cost: 0.00195, ips: 45.1091 samples/sec | ETA 04:25:43 2022-08-24 00:55:39 [INFO] [TRAIN] epoch: 56, iter: 70150/160000, loss: 0.5231, lr: 0.000680, batch_cost: 0.1671, reader_cost: 0.00042, ips: 47.8638 samples/sec | ETA 04:10:17 2022-08-24 00:55:48 [INFO] [TRAIN] epoch: 56, iter: 70200/160000, loss: 0.5100, lr: 0.000680, batch_cost: 0.1778, reader_cost: 0.00085, ips: 44.9828 samples/sec | ETA 04:26:10 2022-08-24 00:55:56 [INFO] [TRAIN] epoch: 56, iter: 70250/160000, loss: 0.5114, lr: 0.000680, batch_cost: 0.1685, reader_cost: 0.00049, ips: 47.4794 samples/sec | ETA 04:12:02 2022-08-24 00:56:05 [INFO] [TRAIN] epoch: 56, iter: 70300/160000, loss: 0.5422, lr: 0.000679, batch_cost: 0.1817, reader_cost: 0.00186, ips: 44.0359 samples/sec | ETA 04:31:35 2022-08-24 00:56:14 [INFO] [TRAIN] epoch: 56, iter: 70350/160000, loss: 0.5237, lr: 0.000679, batch_cost: 0.1668, reader_cost: 0.00052, ips: 47.9752 samples/sec | ETA 04:09:09 2022-08-24 00:56:22 [INFO] [TRAIN] epoch: 56, iter: 70400/160000, loss: 0.5491, lr: 0.000678, batch_cost: 0.1682, reader_cost: 0.00062, ips: 47.5532 samples/sec | ETA 04:11:13 2022-08-24 00:56:30 [INFO] [TRAIN] epoch: 56, iter: 70450/160000, loss: 0.5170, lr: 0.000678, batch_cost: 0.1630, reader_cost: 0.00058, ips: 49.0659 samples/sec | ETA 04:03:20 2022-08-24 00:56:38 [INFO] [TRAIN] epoch: 56, iter: 70500/160000, loss: 0.5625, lr: 0.000678, batch_cost: 0.1550, reader_cost: 0.00069, ips: 51.6006 samples/sec | ETA 03:51:15 2022-08-24 00:56:46 [INFO] [TRAIN] epoch: 56, iter: 70550/160000, loss: 0.5068, lr: 0.000677, batch_cost: 0.1667, reader_cost: 0.00107, ips: 47.9762 samples/sec | ETA 04:08:35 2022-08-24 00:56:55 [INFO] [TRAIN] epoch: 56, iter: 70600/160000, loss: 0.5278, lr: 0.000677, batch_cost: 0.1693, reader_cost: 0.00042, ips: 47.2601 samples/sec | ETA 04:12:13 2022-08-24 00:57:04 [INFO] [TRAIN] epoch: 56, iter: 70650/160000, loss: 0.5389, lr: 0.000676, batch_cost: 0.1892, reader_cost: 0.00057, ips: 42.2751 samples/sec | ETA 04:41:48 2022-08-24 00:57:15 [INFO] 
[TRAIN] epoch: 56, iter: 70700/160000, loss: 0.5225, lr: 0.000676, batch_cost: 0.2152, reader_cost: 0.00045, ips: 37.1724 samples/sec | ETA 05:20:18 2022-08-24 00:57:30 [INFO] [TRAIN] epoch: 57, iter: 70750/160000, loss: 0.5330, lr: 0.000676, batch_cost: 0.2991, reader_cost: 0.08292, ips: 26.7466 samples/sec | ETA 07:24:54 2022-08-24 00:57:38 [INFO] [TRAIN] epoch: 57, iter: 70800/160000, loss: 0.5191, lr: 0.000675, batch_cost: 0.1676, reader_cost: 0.00223, ips: 47.7467 samples/sec | ETA 04:09:05 2022-08-24 00:57:47 [INFO] [TRAIN] epoch: 57, iter: 70850/160000, loss: 0.4746, lr: 0.000675, batch_cost: 0.1714, reader_cost: 0.00064, ips: 46.6818 samples/sec | ETA 04:14:37 2022-08-24 00:57:56 [INFO] [TRAIN] epoch: 57, iter: 70900/160000, loss: 0.5142, lr: 0.000675, batch_cost: 0.1837, reader_cost: 0.00035, ips: 43.5558 samples/sec | ETA 04:32:45 2022-08-24 00:58:04 [INFO] [TRAIN] epoch: 57, iter: 70950/160000, loss: 0.5100, lr: 0.000674, batch_cost: 0.1656, reader_cost: 0.00051, ips: 48.2979 samples/sec | ETA 04:05:50 2022-08-24 00:58:13 [INFO] [TRAIN] epoch: 57, iter: 71000/160000, loss: 0.4865, lr: 0.000674, batch_cost: 0.1638, reader_cost: 0.00092, ips: 48.8374 samples/sec | ETA 04:02:58 2022-08-24 00:58:13 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 192s - batch_cost: 0.1916 - reader cost: 6.3685e-04 2022-08-24 01:01:24 [INFO] [EVAL] #Images: 2000 mIoU: 0.3623 Acc: 0.7691 Kappa: 0.7517 Dice: 0.4989 2022-08-24 01:01:24 [INFO] [EVAL] Class IoU: [0.6837 0.7856 0.9303 0.7494 0.6853 0.7611 0.7813 0.8045 0.5086 0.6561 0.4633 0.5769 0.702 0.288 0.2835 0.4439 0.5013 0.446 0.622 0.4369 0.7424 0.4224 0.6024 0.5008 0.3243 0.4011 0.4995 0.4406 0.4625 0.2504 0.2536 0.5051 0.2824 0.3628 0.3879 0.3746 0.4757 0.5409 0.2656 0.3691 0.0507 0.1512 0.3773 0.2693 0.2581 0.2559 0.3646 0.5009 0.6923 0.5353 0.5647 0.3598 0.1776 0.1681 0.6657 0.3046 0.8456 0.4194 0.4942 0.2591 0.0903 0.278 0.2957 0.1785 0.4816 0.6989 0.2595 0.379 0.1048 0.3606 0.4829 0.5304 0.3184 0.2354 0.4811 0.3906 0.5155 0.2569 0.2247 0.1744 0.6063 0.3875 0.4248 0.0408 0.2116 0.5229 0.1177 0.1254 0.3579 0.5405 0.3883 0.0741 0.0867 0.0625 0.0492 0.0243 0.1803 0.1943 0.2404 0.2771 0.2013 0.126 0.1098 0.6818 0.19 0.5611 0.1919 0.5669 0.0761 0.4763 0.1667 0.2161 0.1724 0.5492 0.7659 0.0567 0.3588 0.6473 0.1981 0.3265 0.3364 0.0763 0.3206 0.1846 0.2644 0.2487 0.5264 0.4888 0.5154 0.3168 0.5418 0.066 0.2326 0.3469 0.2926 0.1949 0.1425 0.0236 0.1934 0.3751 0.2346 0.1214 0.2425 0.433 0.2479 0. 
0.429 0.0464 0.1268 0.2011] 2022-08-24 01:01:24 [INFO] [EVAL] Class Precision: [0.7898 0.8705 0.9651 0.8453 0.7636 0.8634 0.876 0.8681 0.652 0.7639 0.6639 0.7295 0.7754 0.5434 0.5723 0.6059 0.7443 0.6333 0.7784 0.6192 0.7993 0.6555 0.7328 0.6021 0.4415 0.519 0.6002 0.662 0.7702 0.3867 0.5039 0.6637 0.518 0.5341 0.4711 0.4447 0.6895 0.7693 0.3845 0.5773 0.2544 0.3435 0.6834 0.4817 0.3433 0.4405 0.6733 0.6485 0.763 0.6449 0.7267 0.4217 0.4089 0.6232 0.696 0.5457 0.8747 0.6844 0.6295 0.4348 0.1236 0.595 0.3703 0.7333 0.5996 0.8581 0.4651 0.5821 0.1697 0.5827 0.653 0.7839 0.6618 0.3112 0.7088 0.5493 0.6723 0.4501 0.6713 0.2714 0.7291 0.7369 0.7944 0.1497 0.3142 0.718 0.5797 0.3568 0.5879 0.7043 0.5262 0.0888 0.3777 0.3336 0.1419 0.1086 0.4827 0.4021 0.3398 0.5554 0.5255 0.211 0.5238 0.8133 0.7903 0.6255 0.3145 0.787 0.1752 0.5551 0.459 0.246 0.5666 0.7883 0.7865 0.4973 0.5822 0.7715 0.2544 0.4107 0.6877 0.5074 0.5157 0.6223 0.7559 0.6195 0.7881 0.6537 0.9157 0.4559 0.6267 0.3953 0.6081 0.7951 0.6754 0.3296 0.326 0.089 0.4041 0.6761 0.5944 0.3628 0.5762 0.5062 0.6997 0. 0.8855 0.2137 0.4594 0.8576] 2022-08-24 01:01:24 [INFO] [EVAL] Class Recall: [0.8357 0.8896 0.9627 0.8685 0.87 0.8653 0.8785 0.9166 0.698 0.823 0.6053 0.7338 0.8812 0.3799 0.3597 0.624 0.6056 0.6013 0.7558 0.5974 0.9124 0.543 0.772 0.7485 0.55 0.6384 0.7486 0.5684 0.5365 0.4153 0.338 0.6789 0.383 0.5308 0.6872 0.7036 0.6053 0.6456 0.4621 0.5058 0.0595 0.2126 0.4572 0.3792 0.5099 0.3792 0.443 0.6876 0.8819 0.7591 0.7169 0.7102 0.2389 0.1872 0.9387 0.4081 0.9621 0.52 0.6968 0.3907 0.2508 0.3429 0.5948 0.1909 0.7099 0.7902 0.3698 0.5206 0.215 0.4862 0.6496 0.6212 0.3803 0.4913 0.5996 0.5747 0.6885 0.3743 0.2525 0.3278 0.7827 0.4497 0.4773 0.0531 0.3933 0.658 0.1287 0.162 0.4779 0.6992 0.5971 0.3095 0.1012 0.0714 0.0701 0.0304 0.2235 0.2731 0.4509 0.356 0.2461 0.2382 0.122 0.8082 0.2001 0.845 0.3298 0.6697 0.1185 0.7705 0.2074 0.6405 0.1986 0.6442 0.9669 0.0602 0.4832 0.8009 0.4724 0.6145 0.3971 0.0824 0.4587 0.2079 0.289 0.2936 0.6131 0.6596 0.541 0.5094 0.7999 0.0734 0.2736 0.381 0.3404 0.3229 0.2021 0.0312 0.2705 0.4572 0.2793 0.1542 0.2951 0.7494 0.2775 0. 0.4542 0.0559 0.1491 0.208 ] 2022-08-24 01:01:25 [INFO] [EVAL] The model with the best validation mIoU (0.3678) was saved at iter 70000. 
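Because this paste runs all the entries together on long lines, pulling a loss or mIoU curve out of it (for instance to see the best-checkpoint progression 0.3661 at iter 59000 → 0.3678 at 70000) is easier with a regex scan than with line-by-line parsing. The helper below is a hypothetical convenience, not part of PaddleSeg; the patterns simply mirror the [TRAIN] and [EVAL] entries shown above.

```python
# Hypothetical helper for extracting (iter, loss) pairs and mIoU values from a
# pasted PaddleSeg log like the one above, where entries are not newline-separated.
import re

TRAIN_RE = re.compile(r"\[TRAIN\] epoch: \d+, iter: (\d+)/\d+, loss: ([\d.]+)")
EVAL_RE = re.compile(r"\[EVAL\] #Images: \d+ mIoU: ([\d.]+)")

def parse_log(text: str):
    train = [(int(it), float(loss)) for it, loss in TRAIN_RE.findall(text)]
    mious = [float(m) for m in EVAL_RE.findall(text)]
    return train, mious

# Example on a fragment of the log above:
fragment = ("2022-08-24 00:55:22 [INFO] [TRAIN] epoch: 56, iter: 70050/160000, "
            "loss: 0.4836, lr: 0.000681 ... 2022-08-24 01:01:24 [INFO] [EVAL] "
            "#Images: 2000 mIoU: 0.3623 Acc: 0.7691 Kappa: 0.7517 Dice: 0.4989")
print(parse_log(fragment))  # ([(70050, 0.4836)], [0.3623])
```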
2022-08-24 01:01:33 [INFO] [TRAIN] epoch: 57, iter: 71050/160000, loss: 0.5077, lr: 0.000673, batch_cost: 0.1733, reader_cost: 0.00391, ips: 46.1542 samples/sec | ETA 04:16:57 2022-08-24 01:01:42 [INFO] [TRAIN] epoch: 57, iter: 71100/160000, loss: 0.5142, lr: 0.000673, batch_cost: 0.1772, reader_cost: 0.00093, ips: 45.1462 samples/sec | ETA 04:22:33 2022-08-24 01:01:52 [INFO] [TRAIN] epoch: 57, iter: 71150/160000, loss: 0.5213, lr: 0.000673, batch_cost: 0.1890, reader_cost: 0.00059, ips: 42.3375 samples/sec | ETA 04:39:48 2022-08-24 01:02:00 [INFO] [TRAIN] epoch: 57, iter: 71200/160000, loss: 0.5290, lr: 0.000672, batch_cost: 0.1664, reader_cost: 0.00044, ips: 48.0835 samples/sec | ETA 04:06:14 2022-08-24 01:02:09 [INFO] [TRAIN] epoch: 57, iter: 71250/160000, loss: 0.5199, lr: 0.000672, batch_cost: 0.1893, reader_cost: 0.00038, ips: 42.2525 samples/sec | ETA 04:40:03 2022-08-24 01:02:19 [INFO] [TRAIN] epoch: 57, iter: 71300/160000, loss: 0.5317, lr: 0.000672, batch_cost: 0.2010, reader_cost: 0.00045, ips: 39.8101 samples/sec | ETA 04:57:04 2022-08-24 01:02:27 [INFO] [TRAIN] epoch: 57, iter: 71350/160000, loss: 0.5159, lr: 0.000671, batch_cost: 0.1578, reader_cost: 0.00075, ips: 50.6934 samples/sec | ETA 03:53:09 2022-08-24 01:02:35 [INFO] [TRAIN] epoch: 57, iter: 71400/160000, loss: 0.5414, lr: 0.000671, batch_cost: 0.1632, reader_cost: 0.00045, ips: 49.0048 samples/sec | ETA 04:01:03 2022-08-24 01:02:45 [INFO] [TRAIN] epoch: 57, iter: 71450/160000, loss: 0.5226, lr: 0.000670, batch_cost: 0.1914, reader_cost: 0.00044, ips: 41.7897 samples/sec | ETA 04:42:31 2022-08-24 01:02:55 [INFO] [TRAIN] epoch: 57, iter: 71500/160000, loss: 0.5449, lr: 0.000670, batch_cost: 0.2046, reader_cost: 0.00073, ips: 39.0987 samples/sec | ETA 05:01:48 2022-08-24 01:03:04 [INFO] [TRAIN] epoch: 57, iter: 71550/160000, loss: 0.4925, lr: 0.000670, batch_cost: 0.1753, reader_cost: 0.00041, ips: 45.6258 samples/sec | ETA 04:18:28 2022-08-24 01:03:13 [INFO] [TRAIN] epoch: 57, iter: 71600/160000, loss: 0.5434, lr: 0.000669, batch_cost: 0.1820, reader_cost: 0.00168, ips: 43.9441 samples/sec | ETA 04:28:13 2022-08-24 01:03:22 [INFO] [TRAIN] epoch: 57, iter: 71650/160000, loss: 0.5530, lr: 0.000669, batch_cost: 0.1834, reader_cost: 0.00091, ips: 43.6206 samples/sec | ETA 04:30:03 2022-08-24 01:03:32 [INFO] [TRAIN] epoch: 57, iter: 71700/160000, loss: 0.5841, lr: 0.000669, batch_cost: 0.1914, reader_cost: 0.00047, ips: 41.7946 samples/sec | ETA 04:41:41 2022-08-24 01:03:41 [INFO] [TRAIN] epoch: 57, iter: 71750/160000, loss: 0.5392, lr: 0.000668, batch_cost: 0.1740, reader_cost: 0.00207, ips: 45.9669 samples/sec | ETA 04:15:58 2022-08-24 01:03:51 [INFO] [TRAIN] epoch: 57, iter: 71800/160000, loss: 0.5274, lr: 0.000668, batch_cost: 0.2016, reader_cost: 0.00700, ips: 39.6908 samples/sec | ETA 04:56:17 2022-08-24 01:04:00 [INFO] [TRAIN] epoch: 57, iter: 71850/160000, loss: 0.4978, lr: 0.000667, batch_cost: 0.1862, reader_cost: 0.00042, ips: 42.9557 samples/sec | ETA 04:33:36 2022-08-24 01:04:08 [INFO] [TRAIN] epoch: 57, iter: 71900/160000, loss: 0.5367, lr: 0.000667, batch_cost: 0.1674, reader_cost: 0.00042, ips: 47.7915 samples/sec | ETA 04:05:47 2022-08-24 01:04:17 [INFO] [TRAIN] epoch: 57, iter: 71950/160000, loss: 0.5333, lr: 0.000667, batch_cost: 0.1819, reader_cost: 0.01750, ips: 43.9775 samples/sec | ETA 04:26:57 2022-08-24 01:04:36 [INFO] [TRAIN] epoch: 58, iter: 72000/160000, loss: 0.5136, lr: 0.000666, batch_cost: 0.3776, reader_cost: 0.15210, ips: 21.1878 samples/sec | ETA 09:13:46 2022-08-24 01:04:36 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 178s - batch_cost: 0.1779 - reader cost: 7.2282e-04 2022-08-24 01:07:35 [INFO] [EVAL] #Images: 2000 mIoU: 0.3613 Acc: 0.7676 Kappa: 0.7499 Dice: 0.4979 2022-08-24 01:07:35 [INFO] [EVAL] Class IoU: [0.683 0.7806 0.9292 0.7392 0.6842 0.741 0.7759 0.8052 0.5222 0.6337 0.4763 0.5754 0.7088 0.2995 0.3411 0.4375 0.4908 0.4416 0.607 0.4316 0.7655 0.4396 0.6076 0.5073 0.299 0.3719 0.4631 0.425 0.4362 0.2574 0.2606 0.5034 0.2847 0.3481 0.3176 0.3902 0.477 0.5158 0.2703 0.3293 0.0958 0.1082 0.3706 0.2464 0.2654 0.2556 0.3423 0.5044 0.7034 0.5186 0.5811 0.3698 0.2353 0.2427 0.6675 0.3189 0.861 0.4505 0.4385 0.3198 0.0997 0.2354 0.2523 0.167 0.4367 0.7192 0.2609 0.3867 0.0573 0.3712 0.4927 0.5577 0.3773 0.2226 0.4967 0.3677 0.461 0.284 0.2018 0.1458 0.6576 0.4229 0.4038 0.1007 0.0682 0.5429 0.0779 0.1213 0.3375 0.5192 0.45 0.1052 0.1893 0.0619 0.1081 0.0196 0.2086 0.2332 0.283 0.2942 0.0968 0.1303 0.2891 0.4046 0.1777 0.5734 0.2317 0.609 0.0609 0.5417 0.1838 0.4074 0.165 0.6902 0.5971 0.0696 0.4058 0.6725 0.1866 0.4054 0.3669 0.08 0.2645 0.1301 0.2722 0.2426 0.5188 0.5088 0.5134 0.3428 0.4891 0.0611 0.2104 0.3995 0.2426 0.1919 0.158 0.0244 0.1826 0.3779 0.1222 0.0528 0.2162 0.1954 0.2429 0.0017 0.4031 0.0372 0.1376 0.2345] 2022-08-24 01:07:35 [INFO] [EVAL] Class Precision: [0.7819 0.8481 0.9669 0.8398 0.7669 0.9012 0.8712 0.8726 0.6651 0.7527 0.6851 0.7086 0.8054 0.5443 0.4927 0.6015 0.6391 0.7106 0.7652 0.6237 0.8401 0.6255 0.7527 0.617 0.4326 0.5354 0.6084 0.6285 0.6351 0.4206 0.4777 0.5952 0.5596 0.5044 0.4637 0.4823 0.6684 0.8267 0.4289 0.6411 0.2806 0.3872 0.6278 0.559 0.3656 0.5454 0.5935 0.696 0.7976 0.613 0.7464 0.4872 0.404 0.7196 0.7166 0.5774 0.9063 0.6337 0.7245 0.6192 0.1402 0.4809 0.3227 0.6636 0.5214 0.8628 0.5761 0.5299 0.1505 0.6045 0.6937 0.7236 0.5762 0.272 0.6931 0.5718 0.7078 0.6202 0.6509 0.7187 0.8005 0.665 0.8033 0.3 0.1703 0.7195 0.5576 0.3748 0.8088 0.7471 0.6524 0.1342 0.3781 0.2488 0.258 0.0506 0.644 0.442 0.4203 0.6894 0.5003 0.1992 0.5725 0.4554 0.6742 0.6558 0.306 0.8282 0.1554 0.5449 0.4447 0.5527 0.5366 0.7765 0.6032 0.3952 0.6605 0.7638 0.3515 0.5948 0.6765 0.4707 0.5873 0.6717 0.7308 0.603 0.7687 0.6705 0.8875 0.5169 0.656 0.4132 0.4775 0.7688 0.7388 0.454 0.3169 0.0673 0.5083 0.633 0.2138 0.0704 0.6353 0.5608 0.7157 0.0022 0.8946 0.2595 0.5393 0.795 ] 2022-08-24 01:07:35 [INFO] [EVAL] Class Recall: [0.8438 0.9075 0.9598 0.8606 0.8638 0.8065 0.8764 0.9124 0.7085 0.8003 0.6098 0.7538 0.8553 0.3997 0.5257 0.6161 0.6789 0.5384 0.7459 0.5836 0.8961 0.5966 0.7592 0.7404 0.492 0.5491 0.6599 0.5676 0.5822 0.3989 0.3645 0.7654 0.3669 0.5289 0.5019 0.6712 0.6248 0.5783 0.4224 0.4037 0.127 0.1306 0.475 0.3058 0.492 0.3249 0.4471 0.6468 0.8562 0.7711 0.724 0.6054 0.3605 0.2681 0.9069 0.4161 0.9451 0.6091 0.5263 0.398 0.2569 0.3156 0.5364 0.1825 0.7289 0.8121 0.3229 0.5886 0.0847 0.4902 0.6298 0.7087 0.5222 0.5503 0.6368 0.5074 0.5694 0.3437 0.2263 0.1546 0.7865 0.5373 0.4481 0.1316 0.102 0.6886 0.0831 0.1521 0.3667 0.6299 0.592 0.3277 0.2749 0.0761 0.1569 0.031 0.2359 0.3305 0.4641 0.3391 0.1071 0.2736 0.3687 0.7838 0.1944 0.8202 0.4886 0.6971 0.0911 0.9891 0.2386 0.6077 0.1924 0.8613 0.9834 0.0779 0.5127 0.8491 0.2846 0.5601 0.445 0.0879 0.3248 0.1389 0.3025 0.2887 0.6148 0.6785 0.5492 0.5044 0.6579 0.067 0.2733 0.4541 0.2654 0.2495 0.2396 0.0369 0.2218 0.4838 0.2221 0.1746 0.2469 0.2307 0.2689 0.0085 0.4232 0.0416 0.1559 0.2496] 2022-08-24 01:07:35 [INFO] [EVAL] The model with the best validation mIoU (0.3678) 
was saved at iter 70000. 2022-08-24 01:07:44 [INFO] [TRAIN] epoch: 58, iter: 72050/160000, loss: 0.5089, lr: 0.000666, batch_cost: 0.1845, reader_cost: 0.00680, ips: 43.3607 samples/sec | ETA 04:30:26 2022-08-24 01:07:52 [INFO] [TRAIN] epoch: 58, iter: 72100/160000, loss: 0.5184, lr: 0.000665, batch_cost: 0.1642, reader_cost: 0.00084, ips: 48.7243 samples/sec | ETA 04:00:32 2022-08-24 01:08:00 [INFO] [TRAIN] epoch: 58, iter: 72150/160000, loss: 0.5055, lr: 0.000665, batch_cost: 0.1638, reader_cost: 0.00068, ips: 48.8452 samples/sec | ETA 03:59:48 2022-08-24 01:08:09 [INFO] [TRAIN] epoch: 58, iter: 72200/160000, loss: 0.5053, lr: 0.000665, batch_cost: 0.1636, reader_cost: 0.00422, ips: 48.8926 samples/sec | ETA 03:59:26 2022-08-24 01:08:18 [INFO] [TRAIN] epoch: 58, iter: 72250/160000, loss: 0.5394, lr: 0.000664, batch_cost: 0.1837, reader_cost: 0.00097, ips: 43.5378 samples/sec | ETA 04:28:43 2022-08-24 01:08:28 [INFO] [TRAIN] epoch: 58, iter: 72300/160000, loss: 0.5427, lr: 0.000664, batch_cost: 0.2121, reader_cost: 0.00032, ips: 37.7163 samples/sec | ETA 05:10:02 2022-08-24 01:08:38 [INFO] [TRAIN] epoch: 58, iter: 72350/160000, loss: 0.5242, lr: 0.000664, batch_cost: 0.1918, reader_cost: 0.00062, ips: 41.7100 samples/sec | ETA 04:40:11 2022-08-24 01:08:47 [INFO] [TRAIN] epoch: 58, iter: 72400/160000, loss: 0.5264, lr: 0.000663, batch_cost: 0.1851, reader_cost: 0.00084, ips: 43.2098 samples/sec | ETA 04:30:18 2022-08-24 01:08:57 [INFO] [TRAIN] epoch: 58, iter: 72450/160000, loss: 0.5145, lr: 0.000663, batch_cost: 0.1884, reader_cost: 0.00096, ips: 42.4701 samples/sec | ETA 04:34:51 2022-08-24 01:09:05 [INFO] [TRAIN] epoch: 58, iter: 72500/160000, loss: 0.5435, lr: 0.000662, batch_cost: 0.1720, reader_cost: 0.00056, ips: 46.5007 samples/sec | ETA 04:10:53 2022-08-24 01:09:15 [INFO] [TRAIN] epoch: 58, iter: 72550/160000, loss: 0.5389, lr: 0.000662, batch_cost: 0.1864, reader_cost: 0.00073, ips: 42.9283 samples/sec | ETA 04:31:36 2022-08-24 01:09:24 [INFO] [TRAIN] epoch: 58, iter: 72600/160000, loss: 0.5276, lr: 0.000662, batch_cost: 0.1810, reader_cost: 0.00069, ips: 44.2044 samples/sec | ETA 04:23:37 2022-08-24 01:09:32 [INFO] [TRAIN] epoch: 58, iter: 72650/160000, loss: 0.5436, lr: 0.000661, batch_cost: 0.1638, reader_cost: 0.00041, ips: 48.8453 samples/sec | ETA 03:58:26 2022-08-24 01:09:40 [INFO] [TRAIN] epoch: 58, iter: 72700/160000, loss: 0.5051, lr: 0.000661, batch_cost: 0.1676, reader_cost: 0.00083, ips: 47.7444 samples/sec | ETA 04:03:47 2022-08-24 01:09:48 [INFO] [TRAIN] epoch: 58, iter: 72750/160000, loss: 0.5180, lr: 0.000661, batch_cost: 0.1633, reader_cost: 0.00080, ips: 49.0042 samples/sec | ETA 03:57:23 2022-08-24 01:09:57 [INFO] [TRAIN] epoch: 58, iter: 72800/160000, loss: 0.5390, lr: 0.000660, batch_cost: 0.1701, reader_cost: 0.00088, ips: 47.0176 samples/sec | ETA 04:07:17 2022-08-24 01:10:06 [INFO] [TRAIN] epoch: 58, iter: 72850/160000, loss: 0.5113, lr: 0.000660, batch_cost: 0.1729, reader_cost: 0.00031, ips: 46.2724 samples/sec | ETA 04:11:07 2022-08-24 01:10:14 [INFO] [TRAIN] epoch: 58, iter: 72900/160000, loss: 0.5432, lr: 0.000659, batch_cost: 0.1719, reader_cost: 0.00098, ips: 46.5505 samples/sec | ETA 04:09:28 2022-08-24 01:10:25 [INFO] [TRAIN] epoch: 58, iter: 72950/160000, loss: 0.5325, lr: 0.000659, batch_cost: 0.2250, reader_cost: 0.00331, ips: 35.5576 samples/sec | ETA 05:26:25 2022-08-24 01:10:35 [INFO] [TRAIN] epoch: 58, iter: 73000/160000, loss: 0.5372, lr: 0.000659, batch_cost: 0.2016, reader_cost: 0.00083, ips: 39.6815 samples/sec | ETA 04:52:19 
2022-08-24 01:10:35 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 197s - batch_cost: 0.1968 - reader cost: 7.4745e-04 2022-08-24 01:13:52 [INFO] [EVAL] #Images: 2000 mIoU: 0.3702 Acc: 0.7715 Kappa: 0.7545 Dice: 0.5081 2022-08-24 01:13:52 [INFO] [EVAL] Class IoU: [0.686 0.7865 0.9309 0.743 0.6988 0.7656 0.7597 0.8021 0.5211 0.6613 0.4893 0.5665 0.7113 0.3243 0.3277 0.4357 0.5211 0.4701 0.593 0.4211 0.7523 0.4429 0.5993 0.5054 0.3162 0.4543 0.4852 0.424 0.4568 0.2421 0.2929 0.5097 0.2672 0.357 0.3731 0.3671 0.474 0.5773 0.2623 0.3888 0.1106 0.1328 0.3551 0.2661 0.2659 0.2299 0.3798 0.5068 0.6837 0.5596 0.5872 0.3981 0.2158 0.2268 0.677 0.4597 0.863 0.3904 0.4362 0.2851 0.1031 0.274 0.2778 0.1354 0.4451 0.6978 0.2375 0.373 0.105 0.3552 0.4615 0.5686 0.3712 0.2283 0.4719 0.3867 0.5209 0.2934 0.3699 0.1964 0.5916 0.4249 0.3856 0.1614 0.0786 0.5297 0.1173 0.1124 0.3575 0.5119 0.4288 0.0485 0.1156 0.1125 0.0428 0.0197 0.2198 0.2195 0.2866 0.2722 0.1629 0.1509 0.2111 0.5399 0.1836 0.5174 0.25 0.5969 0.0653 0.4323 0.1793 0.5045 0.1835 0.6217 0.6069 0.0752 0.3746 0.6969 0.1831 0.4244 0.4644 0.0555 0.1968 0.1776 0.2818 0.1836 0.5234 0.4731 0.5653 0.3703 0.5511 0.0776 0.2954 0.4175 0.2604 0.1798 0.1537 0.0262 0.1748 0.3816 0.3493 0. 0.2704 0.3431 0.2787 0.0058 0.446 0.0383 0.1013 0.229 ] 2022-08-24 01:13:52 [INFO] [EVAL] Class Precision: [0.7977 0.8743 0.9684 0.8441 0.7839 0.8657 0.8835 0.8682 0.6776 0.7487 0.7034 0.6876 0.7881 0.5477 0.5217 0.5816 0.614 0.6603 0.7057 0.6496 0.8106 0.6775 0.7318 0.6665 0.4542 0.5595 0.6083 0.6988 0.6845 0.6278 0.4422 0.6217 0.5151 0.4649 0.529 0.4538 0.6751 0.7758 0.3853 0.5569 0.375 0.3709 0.6142 0.4422 0.3831 0.5248 0.6494 0.7091 0.7944 0.7071 0.722 0.4896 0.3451 0.7121 0.7234 0.686 0.8892 0.7234 0.6569 0.4386 0.1464 0.4739 0.3377 0.6523 0.5289 0.7727 0.5103 0.5322 0.3445 0.5532 0.696 0.6907 0.5938 0.3506 0.6632 0.5632 0.6739 0.5752 0.6806 0.4163 0.6907 0.6058 0.8228 0.3195 0.1545 0.7055 0.4767 0.429 0.8334 0.6663 0.6267 0.0526 0.2386 0.2894 0.1379 0.0724 0.4755 0.4871 0.4122 0.5984 0.5851 0.2494 0.5208 0.5738 0.7093 0.5756 0.3549 0.834 0.1764 0.5745 0.4314 0.7312 0.4766 0.7486 0.6135 0.2748 0.5882 0.7259 0.402 0.5836 0.6936 0.2788 0.608 0.6101 0.732 0.5817 0.8369 0.5952 0.8652 0.6702 0.6022 0.3628 0.5994 0.8207 0.769 0.408 0.3368 0.0724 0.364 0.6187 0.4674 0. 0.6234 0.5805 0.4934 0.0066 0.7739 0.2121 0.4506 0.7763] 2022-08-24 01:13:52 [INFO] [EVAL] Class Recall: [0.8305 0.8868 0.9601 0.8612 0.8656 0.8688 0.8443 0.9133 0.6928 0.85 0.6165 0.7628 0.8795 0.4429 0.4684 0.6347 0.7749 0.6202 0.7877 0.5449 0.9128 0.5611 0.7681 0.6764 0.51 0.7072 0.7057 0.5188 0.5787 0.2826 0.4646 0.7388 0.3569 0.6059 0.5587 0.6578 0.6141 0.6929 0.451 0.5629 0.1356 0.1715 0.457 0.4006 0.4651 0.2903 0.4777 0.6398 0.8307 0.7284 0.7587 0.6804 0.3655 0.2497 0.9134 0.5822 0.967 0.4588 0.5649 0.4489 0.2585 0.3938 0.6101 0.1459 0.7374 0.878 0.3077 0.555 0.1312 0.4981 0.5779 0.7628 0.4976 0.3954 0.6205 0.5524 0.6964 0.3745 0.4476 0.271 0.8049 0.5873 0.4205 0.2459 0.138 0.68 0.1347 0.1321 0.385 0.6884 0.576 0.3784 0.1831 0.1555 0.0584 0.0264 0.2902 0.2855 0.4848 0.333 0.1842 0.2763 0.2619 0.9013 0.1986 0.8367 0.4582 0.6773 0.094 0.6358 0.2348 0.6194 0.2298 0.7858 0.9824 0.0939 0.5077 0.9458 0.2517 0.6087 0.5843 0.0648 0.2254 0.2004 0.3143 0.2116 0.5829 0.6976 0.6199 0.4528 0.8666 0.0898 0.3681 0.4594 0.2825 0.2433 0.2204 0.0394 0.2517 0.4989 0.5802 0. 
0.3232 0.4562 0.3904 0.05 0.5128 0.0447 0.1156 0.2452] 2022-08-24 01:13:53 [INFO] [EVAL] The model with the best validation mIoU (0.3702) was saved at iter 73000. 2022-08-24 01:14:03 [INFO] [TRAIN] epoch: 58, iter: 73050/160000, loss: 0.5204, lr: 0.000658, batch_cost: 0.2033, reader_cost: 0.00406, ips: 39.3527 samples/sec | ETA 04:54:36 2022-08-24 01:14:12 [INFO] [TRAIN] epoch: 58, iter: 73100/160000, loss: 0.5323, lr: 0.000658, batch_cost: 0.1772, reader_cost: 0.00139, ips: 45.1575 samples/sec | ETA 04:16:35 2022-08-24 01:14:20 [INFO] [TRAIN] epoch: 58, iter: 73150/160000, loss: 0.5252, lr: 0.000658, batch_cost: 0.1649, reader_cost: 0.00205, ips: 48.5074 samples/sec | ETA 03:58:43 2022-08-24 01:14:29 [INFO] [TRAIN] epoch: 58, iter: 73200/160000, loss: 0.5377, lr: 0.000657, batch_cost: 0.1709, reader_cost: 0.00056, ips: 46.8187 samples/sec | ETA 04:07:11 2022-08-24 01:14:37 [INFO] [TRAIN] epoch: 58, iter: 73250/160000, loss: 0.5349, lr: 0.000657, batch_cost: 0.1723, reader_cost: 0.00096, ips: 46.4301 samples/sec | ETA 04:09:07 2022-08-24 01:14:49 [INFO] [TRAIN] epoch: 59, iter: 73300/160000, loss: 0.4734, lr: 0.000656, batch_cost: 0.2340, reader_cost: 0.06898, ips: 34.1949 samples/sec | ETA 05:38:03 2022-08-24 01:14:58 [INFO] [TRAIN] epoch: 59, iter: 73350/160000, loss: 0.5116, lr: 0.000656, batch_cost: 0.1803, reader_cost: 0.00065, ips: 44.3809 samples/sec | ETA 04:20:19 2022-08-24 01:15:07 [INFO] [TRAIN] epoch: 59, iter: 73400/160000, loss: 0.4934, lr: 0.000656, batch_cost: 0.1833, reader_cost: 0.00053, ips: 43.6553 samples/sec | ETA 04:24:29 2022-08-24 01:15:16 [INFO] [TRAIN] epoch: 59, iter: 73450/160000, loss: 0.4668, lr: 0.000655, batch_cost: 0.1798, reader_cost: 0.00038, ips: 44.5015 samples/sec | ETA 04:19:19 2022-08-24 01:15:26 [INFO] [TRAIN] epoch: 59, iter: 73500/160000, loss: 0.5177, lr: 0.000655, batch_cost: 0.1932, reader_cost: 0.00059, ips: 41.4018 samples/sec | ETA 04:38:34 2022-08-24 01:15:35 [INFO] [TRAIN] epoch: 59, iter: 73550/160000, loss: 0.5055, lr: 0.000655, batch_cost: 0.1880, reader_cost: 0.00031, ips: 42.5453 samples/sec | ETA 04:30:55 2022-08-24 01:15:44 [INFO] [TRAIN] epoch: 59, iter: 73600/160000, loss: 0.4926, lr: 0.000654, batch_cost: 0.1810, reader_cost: 0.00050, ips: 44.1928 samples/sec | ETA 04:20:40 2022-08-24 01:15:54 [INFO] [TRAIN] epoch: 59, iter: 73650/160000, loss: 0.5134, lr: 0.000654, batch_cost: 0.1935, reader_cost: 0.00093, ips: 41.3469 samples/sec | ETA 04:38:27 2022-08-24 01:16:03 [INFO] [TRAIN] epoch: 59, iter: 73700/160000, loss: 0.5279, lr: 0.000653, batch_cost: 0.1766, reader_cost: 0.00048, ips: 45.3106 samples/sec | ETA 04:13:57 2022-08-24 01:16:11 [INFO] [TRAIN] epoch: 59, iter: 73750/160000, loss: 0.5119, lr: 0.000653, batch_cost: 0.1639, reader_cost: 0.00063, ips: 48.7983 samples/sec | ETA 03:55:39 2022-08-24 01:16:19 [INFO] [TRAIN] epoch: 59, iter: 73800/160000, loss: 0.5321, lr: 0.000653, batch_cost: 0.1591, reader_cost: 0.00062, ips: 50.2708 samples/sec | ETA 03:48:37 2022-08-24 01:16:27 [INFO] [TRAIN] epoch: 59, iter: 73850/160000, loss: 0.5344, lr: 0.000652, batch_cost: 0.1572, reader_cost: 0.00041, ips: 50.8920 samples/sec | ETA 03:45:42 2022-08-24 01:16:36 [INFO] [TRAIN] epoch: 59, iter: 73900/160000, loss: 0.5052, lr: 0.000652, batch_cost: 0.1780, reader_cost: 0.00069, ips: 44.9397 samples/sec | ETA 04:15:27 2022-08-24 01:16:45 [INFO] [TRAIN] epoch: 59, iter: 73950/160000, loss: 0.5315, lr: 0.000651, batch_cost: 0.1966, reader_cost: 0.00065, ips: 40.6911 samples/sec | ETA 04:41:57 2022-08-24 01:16:56 [INFO] [TRAIN] epoch: 59, 
iter: 74000/160000, loss: 0.5460, lr: 0.000651, batch_cost: 0.2077, reader_cost: 0.00049, ips: 38.5156 samples/sec | ETA 04:57:42 2022-08-24 01:16:56 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 192s - batch_cost: 0.1919 - reader cost: 6.7220e-04 2022-08-24 01:20:08 [INFO] [EVAL] #Images: 2000 mIoU: 0.3675 Acc: 0.7702 Kappa: 0.7526 Dice: 0.5050 2022-08-24 01:20:08 [INFO] [EVAL] Class IoU: [0.6874 0.7838 0.9307 0.743 0.6807 0.7653 0.7683 0.789 0.5323 0.6357 0.4932 0.574 0.7144 0.338 0.2525 0.41 0.4904 0.4277 0.5956 0.4307 0.7632 0.4411 0.6064 0.5126 0.3061 0.4003 0.4904 0.434 0.4599 0.241 0.2183 0.5157 0.3074 0.3483 0.367 0.416 0.4659 0.5513 0.2942 0.3795 0.1228 0.1452 0.3622 0.2651 0.274 0.2771 0.3532 0.5477 0.6371 0.5349 0.5852 0.358 0.2224 0.2142 0.6602 0.4611 0.879 0.4279 0.4891 0.2936 0.0912 0.2371 0.2625 0.2039 0.4102 0.7149 0.2499 0.3811 0.0953 0.3381 0.5098 0.5612 0.3365 0.2279 0.4801 0.3868 0.471 0.2528 0.2426 0.2985 0.605 0.3849 0.3896 0.1037 0.0742 0.5387 0.1286 0.1022 0.2688 0.5274 0.3909 0.0381 0.2347 0.0787 0.0201 0.028 0.1691 0.2297 0.2916 0.288 0.0989 0.1443 0.2469 0.7743 0.1854 0.3147 0.2038 0.5504 0.0577 0.5296 0.1884 0.4109 0.1575 0.6586 0.581 0.0415 0.4319 0.6517 0.217 0.3758 0.3656 0.0773 0.3726 0.1584 0.2854 0.1838 0.4839 0.4933 0.6713 0.3341 0.4413 0.0856 0.2894 0.3075 0.2929 0.1914 0.128 0.0302 0.1944 0.4005 0.484 0.0929 0.2593 0.3273 0.2646 0. 0.4237 0.0297 0.1141 0.2044] 2022-08-24 01:20:08 [INFO] [EVAL] Class Precision: [0.7889 0.8517 0.9601 0.836 0.7703 0.8648 0.8992 0.8767 0.6625 0.784 0.6787 0.6893 0.7987 0.5206 0.6148 0.5502 0.6965 0.709 0.7274 0.6334 0.8386 0.6877 0.7332 0.5945 0.5338 0.6424 0.577 0.6567 0.6742 0.3682 0.4806 0.5973 0.5309 0.4472 0.4589 0.601 0.6131 0.7972 0.4731 0.5945 0.3573 0.3609 0.6841 0.4878 0.3985 0.5021 0.7371 0.7606 0.7051 0.6584 0.7774 0.4376 0.4049 0.7139 0.7039 0.6056 0.9361 0.6515 0.6622 0.4701 0.1425 0.5044 0.3402 0.6148 0.463 0.8504 0.4279 0.5293 0.1512 0.594 0.6625 0.7807 0.5289 0.3204 0.6805 0.5426 0.6838 0.6146 0.6319 0.5272 0.7516 0.6869 0.8246 0.1871 0.1378 0.7037 0.4206 0.4149 0.5406 0.7591 0.5238 0.0429 0.4078 0.2627 0.0639 0.0839 0.5296 0.4587 0.4334 0.6521 0.5043 0.2289 0.6224 0.8258 0.7584 0.3237 0.5472 0.725 0.1899 0.6534 0.4669 0.6576 0.5855 0.8515 0.5897 0.3695 0.7078 0.8177 0.3494 0.5558 0.6705 0.4099 0.6714 0.6671 0.6807 0.595 0.7578 0.6493 0.8873 0.4924 0.6145 0.5099 0.4495 0.8619 0.7062 0.3963 0.305 0.1055 0.4005 0.6178 0.7678 0.1662 0.4831 0.5017 0.6491 0. 
0.8513 0.1834 0.2968 0.8778] 2022-08-24 01:20:08 [INFO] [EVAL] Class Recall: [0.8423 0.9077 0.9682 0.8697 0.854 0.8694 0.8407 0.8875 0.7304 0.7706 0.6435 0.7743 0.8713 0.4908 0.3 0.6169 0.6238 0.5188 0.7668 0.5737 0.8947 0.5516 0.7781 0.7882 0.4179 0.515 0.7656 0.5615 0.5914 0.4109 0.2857 0.7906 0.422 0.6117 0.6469 0.5747 0.6599 0.6413 0.4377 0.5121 0.1577 0.1954 0.4349 0.3675 0.4672 0.3821 0.4041 0.6618 0.8686 0.7403 0.703 0.6632 0.3304 0.2343 0.914 0.6588 0.9351 0.5549 0.6517 0.4388 0.2022 0.3092 0.5348 0.2337 0.7824 0.8178 0.3752 0.5764 0.2052 0.4397 0.6887 0.6663 0.4805 0.441 0.6198 0.5739 0.6022 0.3004 0.2826 0.4077 0.7563 0.4669 0.4248 0.1887 0.1385 0.6967 0.1563 0.1193 0.3483 0.6335 0.6063 0.2569 0.3561 0.101 0.0286 0.0404 0.199 0.3151 0.4713 0.3402 0.1096 0.2808 0.2904 0.9254 0.1971 0.9184 0.2451 0.6956 0.0766 0.7365 0.2401 0.5227 0.1773 0.7441 0.9752 0.0447 0.5257 0.7625 0.3642 0.5371 0.4457 0.087 0.4557 0.172 0.3295 0.2101 0.5724 0.6725 0.7339 0.5096 0.6102 0.0933 0.4483 0.3234 0.3335 0.2702 0.1807 0.0406 0.2742 0.5324 0.5671 0.1738 0.3588 0.485 0.3087 0. 0.4576 0.0343 0.1564 0.2104] 2022-08-24 01:20:08 [INFO] [EVAL] The model with the best validation mIoU (0.3702) was saved at iter 73000. 2022-08-24 01:20:18 [INFO] [TRAIN] epoch: 59, iter: 74050/160000, loss: 0.5284, lr: 0.000651, batch_cost: 0.2001, reader_cost: 0.00371, ips: 39.9720 samples/sec | ETA 04:46:42 2022-08-24 01:20:27 [INFO] [TRAIN] epoch: 59, iter: 74100/160000, loss: 0.5330, lr: 0.000650, batch_cost: 0.1829, reader_cost: 0.00159, ips: 43.7315 samples/sec | ETA 04:21:54 2022-08-24 01:20:36 [INFO] [TRAIN] epoch: 59, iter: 74150/160000, loss: 0.5388, lr: 0.000650, batch_cost: 0.1724, reader_cost: 0.00065, ips: 46.4003 samples/sec | ETA 04:06:41 2022-08-24 01:20:45 [INFO] [TRAIN] epoch: 59, iter: 74200/160000, loss: 0.4994, lr: 0.000650, batch_cost: 0.1815, reader_cost: 0.00050, ips: 44.0759 samples/sec | ETA 04:19:33 2022-08-24 01:20:53 [INFO] [TRAIN] epoch: 59, iter: 74250/160000, loss: 0.5735, lr: 0.000649, batch_cost: 0.1520, reader_cost: 0.00088, ips: 52.6370 samples/sec | ETA 03:37:12 2022-08-24 01:21:01 [INFO] [TRAIN] epoch: 59, iter: 74300/160000, loss: 0.5092, lr: 0.000649, batch_cost: 0.1674, reader_cost: 0.00217, ips: 47.7968 samples/sec | ETA 03:59:04 2022-08-24 01:21:09 [INFO] [TRAIN] epoch: 59, iter: 74350/160000, loss: 0.4928, lr: 0.000648, batch_cost: 0.1651, reader_cost: 0.00036, ips: 48.4601 samples/sec | ETA 03:55:39 2022-08-24 01:21:19 [INFO] [TRAIN] epoch: 59, iter: 74400/160000, loss: 0.5112, lr: 0.000648, batch_cost: 0.1874, reader_cost: 0.00062, ips: 42.6790 samples/sec | ETA 04:27:25 2022-08-24 01:21:28 [INFO] [TRAIN] epoch: 59, iter: 74450/160000, loss: 0.5461, lr: 0.000648, batch_cost: 0.1918, reader_cost: 0.00067, ips: 41.7051 samples/sec | ETA 04:33:30 2022-08-24 01:21:37 [INFO] [TRAIN] epoch: 59, iter: 74500/160000, loss: 0.5203, lr: 0.000647, batch_cost: 0.1671, reader_cost: 0.00057, ips: 47.8758 samples/sec | ETA 03:58:06 2022-08-24 01:21:50 [INFO] [TRAIN] epoch: 60, iter: 74550/160000, loss: 0.5579, lr: 0.000647, batch_cost: 0.2757, reader_cost: 0.10158, ips: 29.0132 samples/sec | ETA 06:32:41 2022-08-24 01:21:59 [INFO] [TRAIN] epoch: 60, iter: 74600/160000, loss: 0.5290, lr: 0.000647, batch_cost: 0.1811, reader_cost: 0.00057, ips: 44.1705 samples/sec | ETA 04:17:47 2022-08-24 01:22:08 [INFO] [TRAIN] epoch: 60, iter: 74650/160000, loss: 0.5181, lr: 0.000646, batch_cost: 0.1760, reader_cost: 0.00052, ips: 45.4576 samples/sec | ETA 04:10:20 2022-08-24 01:22:18 [INFO] [TRAIN] epoch: 60, 
iter: 74700/160000, loss: 0.5124, lr: 0.000646, batch_cost: 0.1984, reader_cost: 0.00068, ips: 40.3131 samples/sec | ETA 04:42:07 2022-08-24 01:22:27 [INFO] [TRAIN] epoch: 60, iter: 74750/160000, loss: 0.4886, lr: 0.000645, batch_cost: 0.1678, reader_cost: 0.00061, ips: 47.6759 samples/sec | ETA 03:58:24 2022-08-24 01:22:35 [INFO] [TRAIN] epoch: 60, iter: 74800/160000, loss: 0.5100, lr: 0.000645, batch_cost: 0.1726, reader_cost: 0.00053, ips: 46.3452 samples/sec | ETA 04:05:07 2022-08-24 01:22:45 [INFO] [TRAIN] epoch: 60, iter: 74850/160000, loss: 0.4838, lr: 0.000645, batch_cost: 0.1923, reader_cost: 0.00050, ips: 41.6023 samples/sec | ETA 04:32:54 2022-08-24 01:22:55 [INFO] [TRAIN] epoch: 60, iter: 74900/160000, loss: 0.5363, lr: 0.000644, batch_cost: 0.2091, reader_cost: 0.00101, ips: 38.2510 samples/sec | ETA 04:56:38 2022-08-24 01:23:05 [INFO] [TRAIN] epoch: 60, iter: 74950/160000, loss: 0.5072, lr: 0.000644, batch_cost: 0.1866, reader_cost: 0.00037, ips: 42.8637 samples/sec | ETA 04:24:33 2022-08-24 01:23:14 [INFO] [TRAIN] epoch: 60, iter: 75000/160000, loss: 0.5068, lr: 0.000644, batch_cost: 0.1957, reader_cost: 0.00047, ips: 40.8837 samples/sec | ETA 04:37:12 2022-08-24 01:23:14 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 189s - batch_cost: 0.1888 - reader cost: 9.0122e-04 2022-08-24 01:26:23 [INFO] [EVAL] #Images: 2000 mIoU: 0.3659 Acc: 0.7694 Kappa: 0.7521 Dice: 0.5043 2022-08-24 01:26:23 [INFO] [EVAL] Class IoU: [0.688 0.7863 0.9285 0.7393 0.6881 0.7611 0.7731 0.7933 0.5232 0.646 0.479 0.5827 0.7003 0.3222 0.3129 0.4095 0.4902 0.4258 0.5886 0.4348 0.7596 0.4526 0.6215 0.516 0.3249 0.3541 0.4579 0.4404 0.441 0.2749 0.3028 0.5291 0.2937 0.3134 0.3389 0.412 0.4681 0.5085 0.2725 0.395 0.1447 0.1558 0.3383 0.2605 0.2658 0.2195 0.3483 0.4671 0.6785 0.4927 0.5748 0.3709 0.193 0.272 0.657 0.36 0.8866 0.4227 0.4776 0.3344 0.1272 0.2359 0.293 0.2266 0.4364 0.654 0.2143 0.3992 0.1075 0.3791 0.5082 0.5439 0.3703 0.2712 0.4695 0.3849 0.4118 0.2897 0.2952 0.2117 0.6206 0.4322 0.3886 0.049 0.2837 0.5259 0.1003 0.1098 0.5003 0.5124 0.4067 0.026 0.2221 0.0655 0.0477 0.0274 0.0498 0.2098 0.3021 0.288 0.2476 0.1404 0.2701 0.7744 0.1746 0.3699 0.1868 0.5234 0.087 0.4784 0.1789 0.3651 0.1684 0.6396 0.4709 0.0538 0.4115 0.7169 0.2205 0.3865 0.3213 0.0722 0.3689 0.2045 0.2795 0.213 0.4865 0.4747 0.5704 0.3793 0.4538 0.0947 0.1472 0.4349 0.3147 0.1923 0.1689 0.0362 0.2007 0.3489 0.1799 0.0152 0.199 0.2836 0.3041 0. 
0.4432 0.0273 0.1429 0.1958] 2022-08-24 01:26:23 [INFO] [EVAL] Class Precision: [0.8015 0.8596 0.967 0.843 0.7664 0.8537 0.8908 0.8832 0.6798 0.7261 0.7021 0.7038 0.7666 0.5195 0.5443 0.564 0.7243 0.6943 0.7054 0.6431 0.8384 0.6696 0.7742 0.6431 0.4956 0.6047 0.5374 0.7135 0.6644 0.5232 0.4836 0.6652 0.538 0.39 0.4221 0.5674 0.6741 0.7957 0.4045 0.6329 0.3541 0.3572 0.5498 0.5051 0.3259 0.4979 0.5261 0.5879 0.7612 0.5764 0.6943 0.4649 0.4277 0.6049 0.6985 0.6397 0.9207 0.6176 0.592 0.6775 0.1961 0.6022 0.3931 0.5533 0.5174 0.7261 0.354 0.5592 0.2103 0.6249 0.6714 0.6961 0.5309 0.3443 0.6802 0.5578 0.8149 0.519 0.7055 0.4115 0.7541 0.6621 0.804 0.1667 0.3653 0.6367 0.4952 0.4059 0.6753 0.6495 0.5661 0.0294 0.5154 0.2906 0.1023 0.0845 0.401 0.3691 0.4695 0.7537 0.6816 0.2328 0.5808 0.8361 0.5829 0.3918 0.3144 0.6539 0.1694 0.6041 0.3733 0.4605 0.5358 0.7695 0.4766 0.2662 0.5478 0.7868 0.3137 0.5055 0.6676 0.2868 0.5951 0.577 0.7098 0.5949 0.787 0.585 0.8547 0.6233 0.6287 0.4418 0.379 0.8202 0.7218 0.3053 0.2708 0.124 0.4203 0.6366 0.2312 0.0251 0.5142 0.502 0.5966 0. 0.8564 0.1919 0.2582 0.8509] 2022-08-24 01:26:23 [INFO] [EVAL] Class Recall: [0.8293 0.9022 0.9589 0.8574 0.8707 0.8753 0.854 0.8863 0.6943 0.854 0.6012 0.772 0.8901 0.4591 0.4239 0.5993 0.6026 0.5241 0.7804 0.573 0.89 0.5827 0.7591 0.723 0.4854 0.4608 0.7558 0.535 0.5675 0.3668 0.4474 0.7212 0.3928 0.6145 0.6323 0.6007 0.605 0.5848 0.4551 0.5124 0.1966 0.2165 0.4678 0.3497 0.5905 0.2819 0.5076 0.6944 0.862 0.7723 0.7695 0.6471 0.2602 0.3307 0.9171 0.4515 0.9599 0.5726 0.712 0.3977 0.2657 0.2794 0.535 0.2773 0.736 0.8681 0.3519 0.5826 0.1802 0.4908 0.6764 0.7132 0.5505 0.5609 0.6025 0.5538 0.4543 0.3959 0.3367 0.3036 0.7781 0.5545 0.4293 0.0648 0.5596 0.7514 0.1117 0.1309 0.6588 0.7083 0.5909 0.1806 0.2807 0.0779 0.082 0.039 0.0538 0.3272 0.4586 0.3179 0.28 0.2611 0.3355 0.913 0.1995 0.8686 0.3152 0.7239 0.1516 0.697 0.2557 0.638 0.1971 0.7912 0.9753 0.0631 0.6231 0.8897 0.426 0.6214 0.3825 0.0879 0.4925 0.2405 0.3156 0.2492 0.5603 0.7158 0.6317 0.4921 0.6198 0.1075 0.194 0.4808 0.3582 0.3419 0.3099 0.0487 0.2776 0.4356 0.4479 0.0373 0.2451 0.3946 0.3828 0. 0.4788 0.0308 0.2425 0.2028] 2022-08-24 01:26:24 [INFO] [EVAL] The model with the best validation mIoU (0.3702) was saved at iter 73000. 
2022-08-24 01:26:32 [INFO] [TRAIN] epoch: 60, iter: 75050/160000, loss: 0.4875, lr: 0.000643, batch_cost: 0.1700, reader_cost: 0.00425, ips: 47.0460 samples/sec | ETA 04:00:45 2022-08-24 01:26:41 [INFO] [TRAIN] epoch: 60, iter: 75100/160000, loss: 0.4729, lr: 0.000643, batch_cost: 0.1807, reader_cost: 0.00145, ips: 44.2801 samples/sec | ETA 04:15:38 2022-08-24 01:26:50 [INFO] [TRAIN] epoch: 60, iter: 75150/160000, loss: 0.4864, lr: 0.000642, batch_cost: 0.1839, reader_cost: 0.00172, ips: 43.5017 samples/sec | ETA 04:20:04 2022-08-24 01:26:59 [INFO] [TRAIN] epoch: 60, iter: 75200/160000, loss: 0.5199, lr: 0.000642, batch_cost: 0.1796, reader_cost: 0.00055, ips: 44.5334 samples/sec | ETA 04:13:53 2022-08-24 01:27:07 [INFO] [TRAIN] epoch: 60, iter: 75250/160000, loss: 0.4889, lr: 0.000642, batch_cost: 0.1538, reader_cost: 0.00205, ips: 52.0155 samples/sec | ETA 03:37:14 2022-08-24 01:27:15 [INFO] [TRAIN] epoch: 60, iter: 75300/160000, loss: 0.4957, lr: 0.000641, batch_cost: 0.1622, reader_cost: 0.00038, ips: 49.3083 samples/sec | ETA 03:49:02 2022-08-24 01:27:24 [INFO] [TRAIN] epoch: 60, iter: 75350/160000, loss: 0.5110, lr: 0.000641, batch_cost: 0.1693, reader_cost: 0.00145, ips: 47.2607 samples/sec | ETA 03:58:49 2022-08-24 01:27:33 [INFO] [TRAIN] epoch: 60, iter: 75400/160000, loss: 0.5214, lr: 0.000641, batch_cost: 0.1916, reader_cost: 0.00211, ips: 41.7573 samples/sec | ETA 04:30:07 2022-08-24 01:27:43 [INFO] [TRAIN] epoch: 60, iter: 75450/160000, loss: 0.5128, lr: 0.000640, batch_cost: 0.1885, reader_cost: 0.00092, ips: 42.4404 samples/sec | ETA 04:25:37 2022-08-24 01:27:51 [INFO] [TRAIN] epoch: 60, iter: 75500/160000, loss: 0.4866, lr: 0.000640, batch_cost: 0.1681, reader_cost: 0.00065, ips: 47.6023 samples/sec | ETA 03:56:40 2022-08-24 01:28:00 [INFO] [TRAIN] epoch: 60, iter: 75550/160000, loss: 0.4731, lr: 0.000639, batch_cost: 0.1870, reader_cost: 0.00052, ips: 42.7910 samples/sec | ETA 04:23:08 2022-08-24 01:28:09 [INFO] [TRAIN] epoch: 60, iter: 75600/160000, loss: 0.5210, lr: 0.000639, batch_cost: 0.1643, reader_cost: 0.00054, ips: 48.6768 samples/sec | ETA 03:51:11 2022-08-24 01:28:17 [INFO] [TRAIN] epoch: 60, iter: 75650/160000, loss: 0.5075, lr: 0.000639, batch_cost: 0.1608, reader_cost: 0.00066, ips: 49.7596 samples/sec | ETA 03:46:01 2022-08-24 01:28:24 [INFO] [TRAIN] epoch: 60, iter: 75700/160000, loss: 0.5234, lr: 0.000638, batch_cost: 0.1521, reader_cost: 0.00075, ips: 52.6108 samples/sec | ETA 03:33:38 2022-08-24 01:28:33 [INFO] [TRAIN] epoch: 60, iter: 75750/160000, loss: 0.5166, lr: 0.000638, batch_cost: 0.1793, reader_cost: 0.00040, ips: 44.6213 samples/sec | ETA 04:11:44 2022-08-24 01:28:44 [INFO] [TRAIN] epoch: 61, iter: 75800/160000, loss: 0.5218, lr: 0.000637, batch_cost: 0.2156, reader_cost: 0.05379, ips: 37.1071 samples/sec | ETA 05:02:32 2022-08-24 01:28:53 [INFO] [TRAIN] epoch: 61, iter: 75850/160000, loss: 0.5256, lr: 0.000637, batch_cost: 0.1790, reader_cost: 0.00048, ips: 44.6956 samples/sec | ETA 04:11:01 2022-08-24 01:29:02 [INFO] [TRAIN] epoch: 61, iter: 75900/160000, loss: 0.4774, lr: 0.000637, batch_cost: 0.1780, reader_cost: 0.00053, ips: 44.9546 samples/sec | ETA 04:09:26 2022-08-24 01:29:11 [INFO] [TRAIN] epoch: 61, iter: 75950/160000, loss: 0.4803, lr: 0.000636, batch_cost: 0.1877, reader_cost: 0.00048, ips: 42.6145 samples/sec | ETA 04:22:58 2022-08-24 01:29:22 [INFO] [TRAIN] epoch: 61, iter: 76000/160000, loss: 0.4832, lr: 0.000636, batch_cost: 0.2128, reader_cost: 0.00359, ips: 37.5940 samples/sec | ETA 04:57:55 2022-08-24 01:29:22 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 188s - batch_cost: 0.1877 - reader cost: 9.1447e-04 2022-08-24 01:32:30 [INFO] [EVAL] #Images: 2000 mIoU: 0.3629 Acc: 0.7709 Kappa: 0.7534 Dice: 0.5011 2022-08-24 01:32:30 [INFO] [EVAL] Class IoU: [0.6867 0.7822 0.9324 0.7417 0.6873 0.7605 0.7758 0.7794 0.5367 0.6423 0.4886 0.5841 0.7127 0.3104 0.3103 0.4277 0.5135 0.4501 0.6143 0.4313 0.7633 0.4412 0.623 0.5053 0.3381 0.4626 0.4577 0.4499 0.4404 0.2982 0.3034 0.4808 0.2922 0.3279 0.3495 0.394 0.4761 0.4055 0.2552 0.3519 0.1131 0.1426 0.363 0.2558 0.2525 0.2252 0.3557 0.4302 0.5119 0.5525 0.5738 0.3968 0.1831 0.27 0.6743 0.3641 0.8601 0.3513 0.4075 0.3069 0.0755 0.2136 0.2922 0.2269 0.4432 0.6706 0.248 0.4001 0.1196 0.3607 0.4875 0.5516 0.3946 0.3033 0.4748 0.3685 0.4882 0.2723 0.2224 0.2825 0.6813 0.4042 0.2855 0.0475 0.2922 0.5368 0.1071 0.1227 0.3031 0.5185 0.4375 0.035 0.1288 0.0696 0.0613 0.034 0.2044 0.1689 0.2919 0.2835 0.188 0.1796 0.2831 0.6886 0.1806 0.379 0.2987 0.5584 0.0411 0.3148 0.1625 0.4089 0.1409 0.6636 0.6306 0.0507 0.4472 0.6671 0.1783 0.3735 0.472 0.0883 0.3259 0.2034 0.2842 0.207 0.5241 0.4631 0.5362 0.3416 0.4283 0.0563 0.3165 0.3921 0.2271 0.208 0.1558 0.0513 0.1751 0.3712 0.1831 0.0505 0.2844 0.0311 0.2788 0.0148 0.4267 0.0357 0.1716 0.2119] 2022-08-24 01:32:30 [INFO] [EVAL] Class Precision: [0.784 0.8553 0.9695 0.8313 0.7627 0.8626 0.8816 0.8445 0.6994 0.7958 0.698 0.7208 0.7844 0.4977 0.5481 0.6337 0.6848 0.6945 0.7764 0.5961 0.851 0.6245 0.7515 0.6673 0.5072 0.603 0.5969 0.7444 0.7348 0.448 0.4643 0.6243 0.5664 0.4505 0.4619 0.5217 0.6652 0.8681 0.35 0.611 0.2649 0.3546 0.6105 0.4892 0.3726 0.4511 0.7067 0.5252 0.7677 0.6699 0.7711 0.5491 0.3411 0.5244 0.7167 0.6392 0.8933 0.7153 0.8088 0.7749 0.1283 0.3675 0.4304 0.6124 0.5265 0.789 0.3997 0.5231 0.2843 0.5679 0.7277 0.7255 0.5715 0.3897 0.7076 0.4898 0.8011 0.524 0.6698 0.4741 0.8306 0.6462 0.8354 0.1243 0.3871 0.6604 0.6791 0.4318 0.7257 0.7001 0.691 0.047 0.3981 0.2757 0.1724 0.0927 0.5938 0.396 0.4578 0.6973 0.644 0.3006 0.6369 0.8229 0.6239 0.4037 0.6037 0.7225 0.0997 0.4908 0.4656 0.5373 0.5601 0.7734 0.6373 0.5454 0.6326 0.7502 0.4194 0.4682 0.6709 0.3596 0.7326 0.6412 0.6914 0.6455 0.7658 0.6096 0.8684 0.5014 0.5669 0.3477 0.5536 0.8115 0.8207 0.3826 0.3928 0.1295 0.5006 0.6585 0.5482 0.0623 0.5097 0.4283 0.7312 0.0213 0.8853 0.2593 0.3876 0.7999] 2022-08-24 01:32:30 [INFO] [EVAL] Class Recall: [0.847 0.9015 0.9606 0.8732 0.8742 0.8653 0.866 0.91 0.6976 0.7691 0.6197 0.7548 0.8863 0.452 0.4169 0.5681 0.6724 0.5612 0.7464 0.6094 0.881 0.6005 0.7847 0.6754 0.5036 0.6652 0.6626 0.5321 0.5236 0.4713 0.4667 0.6765 0.3764 0.5464 0.5896 0.6168 0.6262 0.4321 0.4853 0.4535 0.1649 0.1927 0.4724 0.3489 0.4392 0.3103 0.4174 0.7039 0.6058 0.7592 0.6916 0.5886 0.2833 0.3577 0.9193 0.4582 0.9585 0.4084 0.451 0.3369 0.1549 0.3378 0.4765 0.2649 0.7371 0.8171 0.3951 0.6298 0.1711 0.4972 0.5962 0.697 0.5604 0.5777 0.5906 0.598 0.5556 0.3618 0.2498 0.4113 0.7912 0.5192 0.3026 0.0713 0.5439 0.7416 0.1128 0.1464 0.3423 0.6665 0.5439 0.12 0.16 0.0852 0.0868 0.051 0.2376 0.2275 0.4461 0.3233 0.2098 0.3086 0.3375 0.8084 0.2027 0.8613 0.3716 0.7109 0.0654 0.4675 0.1997 0.6312 0.1584 0.8238 0.9837 0.0529 0.6041 0.8576 0.2368 0.6487 0.6141 0.1048 0.3699 0.2295 0.3254 0.2336 0.6241 0.6583 0.5836 0.5173 0.6366 0.063 0.425 0.4313 0.2389 0.3131 0.2052 0.0782 0.2122 0.4596 0.2157 0.2096 0.3914 0.0324 0.3107 0.0463 0.4517 0.0398 0.2355 0.2238] 2022-08-24 01:32:30 [INFO] [EVAL] The model with the best validation mIoU 
(0.3702) was saved at iter 73000. 2022-08-24 01:32:39 [INFO] [TRAIN] epoch: 61, iter: 76050/160000, loss: 0.4989, lr: 0.000636, batch_cost: 0.1681, reader_cost: 0.00298, ips: 47.5992 samples/sec | ETA 03:55:09 2022-08-24 01:32:48 [INFO] [TRAIN] epoch: 61, iter: 76100/160000, loss: 0.4429, lr: 0.000635, batch_cost: 0.1837, reader_cost: 0.00078, ips: 43.5442 samples/sec | ETA 04:16:54 2022-08-24 01:32:56 [INFO] [TRAIN] epoch: 61, iter: 76150/160000, loss: 0.5134, lr: 0.000635, batch_cost: 0.1616, reader_cost: 0.00066, ips: 49.4923 samples/sec | ETA 03:45:53 2022-08-24 01:33:05 [INFO] [TRAIN] epoch: 61, iter: 76200/160000, loss: 0.4900, lr: 0.000634, batch_cost: 0.1846, reader_cost: 0.00067, ips: 43.3328 samples/sec | ETA 04:17:50 2022-08-24 01:33:16 [INFO] [TRAIN] epoch: 61, iter: 76250/160000, loss: 0.5009, lr: 0.000634, batch_cost: 0.2101, reader_cost: 0.00062, ips: 38.0766 samples/sec | ETA 04:53:16 2022-08-24 01:33:25 [INFO] [TRAIN] epoch: 61, iter: 76300/160000, loss: 0.4802, lr: 0.000634, batch_cost: 0.1916, reader_cost: 0.00034, ips: 41.7570 samples/sec | ETA 04:27:15 2022-08-24 01:33:34 [INFO] [TRAIN] epoch: 61, iter: 76350/160000, loss: 0.5286, lr: 0.000633, batch_cost: 0.1851, reader_cost: 0.00047, ips: 43.2179 samples/sec | ETA 04:18:04 2022-08-24 01:33:43 [INFO] [TRAIN] epoch: 61, iter: 76400/160000, loss: 0.4886, lr: 0.000633, batch_cost: 0.1713, reader_cost: 0.00868, ips: 46.7003 samples/sec | ETA 03:58:41 2022-08-24 01:33:53 [INFO] [TRAIN] epoch: 61, iter: 76450/160000, loss: 0.5255, lr: 0.000633, batch_cost: 0.1986, reader_cost: 0.00091, ips: 40.2752 samples/sec | ETA 04:36:35 2022-08-24 01:34:03 [INFO] [TRAIN] epoch: 61, iter: 76500/160000, loss: 0.4667, lr: 0.000632, batch_cost: 0.1978, reader_cost: 0.00066, ips: 40.4548 samples/sec | ETA 04:35:12 2022-08-24 01:34:13 [INFO] [TRAIN] epoch: 61, iter: 76550/160000, loss: 0.4825, lr: 0.000632, batch_cost: 0.1958, reader_cost: 0.00068, ips: 40.8594 samples/sec | ETA 04:32:18 2022-08-24 01:34:23 [INFO] [TRAIN] epoch: 61, iter: 76600/160000, loss: 0.5111, lr: 0.000631, batch_cost: 0.1971, reader_cost: 0.00082, ips: 40.5890 samples/sec | ETA 04:33:57 2022-08-24 01:34:32 [INFO] [TRAIN] epoch: 61, iter: 76650/160000, loss: 0.5317, lr: 0.000631, batch_cost: 0.1809, reader_cost: 0.00686, ips: 44.2176 samples/sec | ETA 04:11:19 2022-08-24 01:34:39 [INFO] [TRAIN] epoch: 61, iter: 76700/160000, loss: 0.4699, lr: 0.000631, batch_cost: 0.1547, reader_cost: 0.00054, ips: 51.7121 samples/sec | ETA 03:34:46 2022-08-24 01:34:47 [INFO] [TRAIN] epoch: 61, iter: 76750/160000, loss: 0.5503, lr: 0.000630, batch_cost: 0.1567, reader_cost: 0.00068, ips: 51.0445 samples/sec | ETA 03:37:27 2022-08-24 01:34:57 [INFO] [TRAIN] epoch: 61, iter: 76800/160000, loss: 0.5232, lr: 0.000630, batch_cost: 0.2015, reader_cost: 0.00088, ips: 39.7018 samples/sec | ETA 04:39:24 2022-08-24 01:35:06 [INFO] [TRAIN] epoch: 61, iter: 76850/160000, loss: 0.5173, lr: 0.000630, batch_cost: 0.1816, reader_cost: 0.00843, ips: 44.0420 samples/sec | ETA 04:11:43 2022-08-24 01:35:19 [INFO] [TRAIN] epoch: 61, iter: 76900/160000, loss: 0.4720, lr: 0.000629, batch_cost: 0.2479, reader_cost: 0.00173, ips: 32.2693 samples/sec | ETA 05:43:21 2022-08-24 01:35:28 [INFO] [TRAIN] epoch: 61, iter: 76950/160000, loss: 0.5285, lr: 0.000629, batch_cost: 0.1944, reader_cost: 0.00044, ips: 41.1538 samples/sec | ETA 04:29:04 2022-08-24 01:35:38 [INFO] [TRAIN] epoch: 61, iter: 77000/160000, loss: 0.5044, lr: 0.000628, batch_cost: 0.1954, reader_cost: 0.00105, ips: 40.9394 samples/sec | ETA 04:30:19 
2022-08-24 01:35:38 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 181s - batch_cost: 0.1806 - reader cost: 7.1759e-04 2022-08-24 01:38:39 [INFO] [EVAL] #Images: 2000 mIoU: 0.3622 Acc: 0.7680 Kappa: 0.7503 Dice: 0.4993 2022-08-24 01:38:39 [INFO] [EVAL] Class IoU: [0.6872 0.7867 0.931 0.7372 0.6746 0.7665 0.7692 0.79 0.5175 0.6108 0.4983 0.5668 0.703 0.2935 0.2905 0.4317 0.5041 0.3888 0.6136 0.4386 0.7653 0.3474 0.6109 0.5034 0.3427 0.3718 0.4795 0.4563 0.3972 0.3124 0.2896 0.5038 0.2819 0.3311 0.3608 0.4194 0.472 0.5214 0.2486 0.3774 0.1255 0.1183 0.3606 0.2587 0.2627 0.1988 0.3663 0.4958 0.6504 0.5387 0.5597 0.377 0.1038 0.2379 0.6839 0.3618 0.8175 0.3675 0.404 0.3743 0.0846 0.2395 0.2905 0.1982 0.4745 0.7043 0.2032 0.3821 0.0871 0.3216 0.4871 0.5767 0.3624 0.2409 0.4625 0.3961 0.5112 0.2481 0.3752 0.1592 0.671 0.3901 0.353 0.0282 0.3271 0.5391 0.1256 0.1274 0.3174 0.4984 0.4147 0.0057 0.212 0.0776 0.0587 0.0344 0.1655 0.1671 0.3117 0.2722 0.1315 0.1914 0.2916 0.7039 0.1906 0.4003 0.2196 0.571 0.0616 0.2088 0.1259 0.32 0.1517 0.6687 0.5127 0.0434 0.4561 0.6682 0.2192 0.368 0.4075 0.0518 0.3693 0.1211 0.2722 0.2478 0.4968 0.4668 0.4977 0.3641 0.5496 0.1109 0.0591 0.3923 0.3257 0.1939 0.1527 0.0407 0.1967 0.3541 0.2942 0.0103 0.2858 0.4591 0.2906 0.0099 0.4302 0.0323 0.1707 0.216 ] 2022-08-24 01:38:39 [INFO] [EVAL] Class Precision: [0.7894 0.8629 0.9629 0.8345 0.7469 0.8606 0.882 0.8478 0.6483 0.7269 0.7017 0.7014 0.7687 0.4823 0.5197 0.5764 0.688 0.6908 0.7594 0.6191 0.8483 0.6806 0.7702 0.642 0.4903 0.5727 0.5099 0.7373 0.7358 0.497 0.4655 0.6466 0.638 0.5366 0.469 0.5726 0.6889 0.7888 0.3494 0.5486 0.3085 0.3347 0.6626 0.5224 0.3676 0.4095 0.7316 0.7159 0.7725 0.6696 0.7053 0.4794 0.3482 0.6447 0.7346 0.6969 0.8453 0.6602 0.7232 0.5565 0.1287 0.5944 0.394 0.7016 0.595 0.7953 0.3196 0.5305 0.1277 0.5766 0.6068 0.7405 0.556 0.2933 0.6645 0.5464 0.8007 0.5092 0.7392 0.5971 0.8082 0.6724 0.8154 0.1059 0.4113 0.7184 0.6333 0.3525 0.5784 0.6395 0.5599 0.0088 0.4122 0.3148 0.1541 0.1076 0.7118 0.4319 0.4459 0.742 0.7403 0.3856 0.5876 0.789 0.7427 0.449 0.4778 0.7818 0.1758 0.3968 0.515 0.4438 0.5055 0.8132 0.5179 0.3404 0.5781 0.7597 0.3789 0.4673 0.7151 0.482 0.698 0.7538 0.7412 0.5764 0.7578 0.6355 0.6917 0.6047 0.678 0.5144 0.3663 0.7558 0.7019 0.3971 0.3184 0.0686 0.5046 0.6791 0.4107 0.0312 0.4863 0.6475 0.6911 0.0116 0.8238 0.2061 0.5058 0.7912] 2022-08-24 01:38:39 [INFO] [EVAL] Class Recall: [0.8414 0.899 0.9656 0.8634 0.8746 0.8752 0.8575 0.9206 0.7195 0.7927 0.6322 0.7471 0.8917 0.4286 0.3971 0.6322 0.6535 0.4707 0.7618 0.6007 0.8866 0.4151 0.747 0.7 0.5323 0.5144 0.8895 0.5448 0.4633 0.4568 0.4339 0.6952 0.3357 0.4637 0.6099 0.6105 0.5998 0.606 0.4629 0.5473 0.1746 0.1548 0.4417 0.3388 0.4793 0.2788 0.4232 0.6173 0.8045 0.7337 0.7306 0.6384 0.1288 0.2738 0.9083 0.4294 0.9614 0.4533 0.478 0.5335 0.1983 0.2863 0.5253 0.2164 0.7009 0.8603 0.3579 0.5773 0.215 0.421 0.7118 0.7229 0.5099 0.5742 0.6035 0.5902 0.5857 0.326 0.4325 0.1784 0.7981 0.4817 0.3837 0.037 0.6149 0.6836 0.1354 0.1662 0.4129 0.693 0.6154 0.0162 0.3038 0.0933 0.0867 0.048 0.1774 0.2141 0.5087 0.3006 0.1378 0.2755 0.3667 0.8672 0.2041 0.7866 0.2889 0.6793 0.0865 0.3059 0.1428 0.5342 0.1781 0.7901 0.9807 0.0474 0.6837 0.8472 0.3421 0.634 0.4865 0.0548 0.4396 0.126 0.3008 0.303 0.5905 0.6374 0.6396 0.4778 0.7436 0.1239 0.0659 0.4492 0.378 0.2748 0.2269 0.0909 0.2438 0.4253 0.5092 0.0153 0.4093 0.6121 0.334 0.062 0.4738 0.0369 0.2048 0.229 ] 2022-08-24 01:38:39 [INFO] [EVAL] The model with 
the best validation mIoU (0.3702) was saved at iter 73000. 2022-08-24 01:38:53 [INFO] [TRAIN] epoch: 62, iter: 77050/160000, loss: 0.5123, lr: 0.000628, batch_cost: 0.2836, reader_cost: 0.07183, ips: 28.2114 samples/sec | ETA 06:32:02 2022-08-24 01:39:02 [INFO] [TRAIN] epoch: 62, iter: 77100/160000, loss: 0.4946, lr: 0.000628, batch_cost: 0.1678, reader_cost: 0.00081, ips: 47.6894 samples/sec | ETA 03:51:46 2022-08-24 01:39:10 [INFO] [TRAIN] epoch: 62, iter: 77150/160000, loss: 0.4887, lr: 0.000627, batch_cost: 0.1670, reader_cost: 0.00047, ips: 47.9030 samples/sec | ETA 03:50:36 2022-08-24 01:39:19 [INFO] [TRAIN] epoch: 62, iter: 77200/160000, loss: 0.4811, lr: 0.000627, batch_cost: 0.1678, reader_cost: 0.00062, ips: 47.6877 samples/sec | ETA 03:51:30 2022-08-24 01:39:29 [INFO] [TRAIN] epoch: 62, iter: 77250/160000, loss: 0.5261, lr: 0.000627, batch_cost: 0.2005, reader_cost: 0.00038, ips: 39.8952 samples/sec | ETA 04:36:33 2022-08-24 01:39:39 [INFO] [TRAIN] epoch: 62, iter: 77300/160000, loss: 0.5134, lr: 0.000626, batch_cost: 0.2149, reader_cost: 0.00071, ips: 37.2286 samples/sec | ETA 04:56:11 2022-08-24 01:39:49 [INFO] [TRAIN] epoch: 62, iter: 77350/160000, loss: 0.4951, lr: 0.000626, batch_cost: 0.1886, reader_cost: 0.00070, ips: 42.4128 samples/sec | ETA 04:19:49 2022-08-24 01:39:58 [INFO] [TRAIN] epoch: 62, iter: 77400/160000, loss: 0.4839, lr: 0.000625, batch_cost: 0.1770, reader_cost: 0.00070, ips: 45.1913 samples/sec | ETA 04:03:42 2022-08-24 01:40:06 [INFO] [TRAIN] epoch: 62, iter: 77450/160000, loss: 0.5194, lr: 0.000625, batch_cost: 0.1720, reader_cost: 0.00083, ips: 46.5075 samples/sec | ETA 03:56:39 2022-08-24 01:40:16 [INFO] [TRAIN] epoch: 62, iter: 77500/160000, loss: 0.5163, lr: 0.000625, batch_cost: 0.1891, reader_cost: 0.00069, ips: 42.3024 samples/sec | ETA 04:20:01 2022-08-24 01:40:24 [INFO] [TRAIN] epoch: 62, iter: 77550/160000, loss: 0.4839, lr: 0.000624, batch_cost: 0.1739, reader_cost: 0.00067, ips: 45.9997 samples/sec | ETA 03:58:59 2022-08-24 01:40:32 [INFO] [TRAIN] epoch: 62, iter: 77600/160000, loss: 0.5274, lr: 0.000624, batch_cost: 0.1585, reader_cost: 0.00055, ips: 50.4825 samples/sec | ETA 03:37:37 2022-08-24 01:40:40 [INFO] [TRAIN] epoch: 62, iter: 77650/160000, loss: 0.5255, lr: 0.000623, batch_cost: 0.1562, reader_cost: 0.00062, ips: 51.2027 samples/sec | ETA 03:34:26 2022-08-24 01:40:49 [INFO] [TRAIN] epoch: 62, iter: 77700/160000, loss: 0.4646, lr: 0.000623, batch_cost: 0.1735, reader_cost: 0.00042, ips: 46.1005 samples/sec | ETA 03:58:01 2022-08-24 01:40:57 [INFO] [TRAIN] epoch: 62, iter: 77750/160000, loss: 0.5335, lr: 0.000623, batch_cost: 0.1701, reader_cost: 0.00042, ips: 47.0307 samples/sec | ETA 03:53:10 2022-08-24 01:41:05 [INFO] [TRAIN] epoch: 62, iter: 77800/160000, loss: 0.5272, lr: 0.000622, batch_cost: 0.1635, reader_cost: 0.00044, ips: 48.9170 samples/sec | ETA 03:44:03 2022-08-24 01:41:15 [INFO] [TRAIN] epoch: 62, iter: 77850/160000, loss: 0.4888, lr: 0.000622, batch_cost: 0.1883, reader_cost: 0.00057, ips: 42.4762 samples/sec | ETA 04:17:52 2022-08-24 01:41:23 [INFO] [TRAIN] epoch: 62, iter: 77900/160000, loss: 0.4938, lr: 0.000622, batch_cost: 0.1594, reader_cost: 0.00056, ips: 50.1868 samples/sec | ETA 03:38:07 2022-08-24 01:41:33 [INFO] [TRAIN] epoch: 62, iter: 77950/160000, loss: 0.4951, lr: 0.000621, batch_cost: 0.1983, reader_cost: 0.00078, ips: 40.3361 samples/sec | ETA 04:31:13 2022-08-24 01:41:43 [INFO] [TRAIN] epoch: 62, iter: 78000/160000, loss: 0.5388, lr: 0.000621, batch_cost: 0.1995, reader_cost: 0.00048, ips: 40.1035 
samples/sec | ETA 04:32:37 2022-08-24 01:41:43 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 200s - batch_cost: 0.1995 - reader cost: 0.0016 2022-08-24 01:45:03 [INFO] [EVAL] #Images: 2000 mIoU: 0.3682 Acc: 0.7715 Kappa: 0.7540 Dice: 0.5059 2022-08-24 01:45:03 [INFO] [EVAL] Class IoU: [0.6889 0.7852 0.9314 0.7351 0.6837 0.7669 0.7637 0.7994 0.5167 0.644 0.4895 0.5665 0.7163 0.3083 0.2848 0.4322 0.5229 0.4202 0.6155 0.4243 0.7671 0.4572 0.6161 0.4895 0.3498 0.4005 0.4407 0.4566 0.4413 0.278 0.294 0.5088 0.2685 0.3453 0.3611 0.4011 0.4732 0.5506 0.2661 0.3961 0.1299 0.1882 0.3674 0.2711 0.2852 0.1831 0.3438 0.5132 0.6196 0.5594 0.5494 0.329 0.1927 0.2733 0.6772 0.4466 0.8392 0.4119 0.345 0.3149 0.1039 0.2728 0.3228 0.1446 0.4681 0.7305 0.229 0.3944 0.095 0.3707 0.5118 0.5679 0.3862 0.2582 0.4813 0.3705 0.4688 0.2319 0.364 0.0831 0.659 0.4137 0.3153 0.0446 0.3053 0.5393 0.1196 0.1234 0.2742 0.5473 0.4102 0.0497 0.2018 0.1278 0.0742 0.0204 0.1887 0.2278 0.2867 0.2724 0.1706 0.1896 0.287 0.7134 0.1909 0.5054 0.2338 0.5696 0.0785 0.444 0.156 0.514 0.1054 0.6022 0.6757 0.0348 0.3806 0.6407 0.1207 0.3635 0.277 0.0461 0.3091 0.1539 0.3063 0.1965 0.4943 0.4659 0.5298 0.3379 0.482 0.0493 0.2516 0.4297 0.2824 0.193 0.1571 0.0168 0.2151 0.3734 0.4813 0.0002 0.2209 0.3161 0.2841 0.0144 0.4174 0.0413 0.159 0.1929] 2022-08-24 01:45:03 [INFO] [EVAL] Class Precision: [0.7833 0.8626 0.9698 0.8216 0.7743 0.8455 0.885 0.8598 0.659 0.7319 0.6896 0.7274 0.8036 0.5654 0.5867 0.5845 0.6703 0.66 0.7818 0.6452 0.8363 0.6618 0.7773 0.6338 0.5097 0.5335 0.5376 0.6913 0.6675 0.5633 0.4386 0.6326 0.6021 0.4727 0.4671 0.5188 0.6762 0.8368 0.3604 0.6014 0.3487 0.3271 0.6273 0.5279 0.4029 0.4503 0.7592 0.7258 0.6926 0.7501 0.7384 0.4106 0.4141 0.6475 0.7184 0.6565 0.872 0.6616 0.7822 0.6691 0.163 0.5125 0.4787 0.7385 0.578 0.86 0.3341 0.539 0.1571 0.6317 0.6173 0.7288 0.5379 0.4047 0.6896 0.546 0.747 0.4319 0.6797 0.3634 0.7588 0.7267 0.8431 0.1684 0.3715 0.7129 0.6109 0.4203 0.646 0.7713 0.5631 0.0544 0.4131 0.3626 0.1859 0.0874 0.695 0.5374 0.4482 0.6821 0.7305 0.3912 0.58 0.7853 0.8372 0.5826 0.4704 0.7252 0.1884 0.5262 0.4331 0.773 0.5287 0.792 0.6839 0.3569 0.8108 0.7567 0.3409 0.5564 0.7121 0.5564 0.7632 0.6946 0.6704 0.6319 0.8056 0.6617 0.7571 0.5015 0.6576 0.4528 0.6203 0.7522 0.7 0.4162 0.3683 0.1167 0.4121 0.6226 0.5478 0.0004 0.4821 0.5128 0.6474 0.0196 0.8469 0.2217 0.6111 0.8449] 2022-08-24 01:45:03 [INFO] [EVAL] Class Recall: [0.8511 0.8973 0.9592 0.8747 0.8539 0.8919 0.8478 0.9191 0.7053 0.8428 0.6278 0.7191 0.8684 0.4041 0.3563 0.6238 0.704 0.5362 0.7432 0.5535 0.9027 0.5965 0.7481 0.6826 0.5272 0.6165 0.7096 0.5736 0.5656 0.3543 0.4714 0.7222 0.3264 0.5617 0.6139 0.6388 0.6118 0.6169 0.5043 0.5372 0.1715 0.3072 0.47 0.3578 0.4941 0.2359 0.3859 0.6366 0.8546 0.6875 0.6822 0.6235 0.265 0.3211 0.9219 0.5828 0.9571 0.5219 0.3817 0.373 0.2228 0.3684 0.4978 0.1524 0.7112 0.8291 0.4214 0.5951 0.1938 0.4729 0.7496 0.7201 0.578 0.4164 0.6144 0.5354 0.5572 0.3337 0.4393 0.0972 0.8336 0.4899 0.3349 0.0572 0.6316 0.6889 0.1295 0.1487 0.3227 0.6533 0.6016 0.3692 0.2829 0.1648 0.11 0.0259 0.2057 0.2834 0.4432 0.312 0.182 0.2689 0.3623 0.8862 0.1983 0.7924 0.3172 0.7264 0.1186 0.7399 0.1959 0.6053 0.1163 0.7154 0.9826 0.0371 0.4177 0.807 0.1575 0.5119 0.3119 0.0478 0.3419 0.1651 0.3606 0.222 0.5612 0.6116 0.6383 0.5088 0.6434 0.0524 0.2973 0.5006 0.3213 0.2647 0.2151 0.0192 0.3104 0.4826 0.7986 0.0004 0.2896 0.4517 0.3361 0.0518 0.4515 0.0482 0.1769 0.2 ] 2022-08-24 01:45:03 [INFO] 
[EVAL] The model with the best validation mIoU (0.3702) was saved at iter 73000. 2022-08-24 01:45:11 [INFO] [TRAIN] epoch: 62, iter: 78050/160000, loss: 0.4669, lr: 0.000620, batch_cost: 0.1641, reader_cost: 0.00383, ips: 48.7520 samples/sec | ETA 03:44:07 2022-08-24 01:45:19 [INFO] [TRAIN] epoch: 62, iter: 78100/160000, loss: 0.4943, lr: 0.000620, batch_cost: 0.1628, reader_cost: 0.00091, ips: 49.1398 samples/sec | ETA 03:42:13 2022-08-24 01:45:27 [INFO] [TRAIN] epoch: 62, iter: 78150/160000, loss: 0.4828, lr: 0.000620, batch_cost: 0.1635, reader_cost: 0.00047, ips: 48.9354 samples/sec | ETA 03:43:00 2022-08-24 01:45:36 [INFO] [TRAIN] epoch: 62, iter: 78200/160000, loss: 0.5047, lr: 0.000619, batch_cost: 0.1689, reader_cost: 0.00066, ips: 47.3733 samples/sec | ETA 03:50:13 2022-08-24 01:45:46 [INFO] [TRAIN] epoch: 62, iter: 78250/160000, loss: 0.5169, lr: 0.000619, batch_cost: 0.1983, reader_cost: 0.00075, ips: 40.3389 samples/sec | ETA 04:30:12 2022-08-24 01:45:54 [INFO] [TRAIN] epoch: 62, iter: 78300/160000, loss: 0.5005, lr: 0.000619, batch_cost: 0.1721, reader_cost: 0.00050, ips: 46.4967 samples/sec | ETA 03:54:16 2022-08-24 01:46:06 [INFO] [TRAIN] epoch: 63, iter: 78350/160000, loss: 0.4940, lr: 0.000618, batch_cost: 0.2405, reader_cost: 0.06694, ips: 33.2709 samples/sec | ETA 05:27:12 2022-08-24 01:46:15 [INFO] [TRAIN] epoch: 63, iter: 78400/160000, loss: 0.5154, lr: 0.000618, batch_cost: 0.1747, reader_cost: 0.00064, ips: 45.8018 samples/sec | ETA 03:57:32 2022-08-24 01:46:24 [INFO] [TRAIN] epoch: 63, iter: 78450/160000, loss: 0.4460, lr: 0.000617, batch_cost: 0.1875, reader_cost: 0.00043, ips: 42.6577 samples/sec | ETA 04:14:53 2022-08-24 01:46:35 [INFO] [TRAIN] epoch: 63, iter: 78500/160000, loss: 0.5048, lr: 0.000617, batch_cost: 0.2027, reader_cost: 0.00191, ips: 39.4749 samples/sec | ETA 04:35:16 2022-08-24 01:46:43 [INFO] [TRAIN] epoch: 63, iter: 78550/160000, loss: 0.4471, lr: 0.000617, batch_cost: 0.1735, reader_cost: 0.00056, ips: 46.1024 samples/sec | ETA 03:55:33 2022-08-24 01:46:51 [INFO] [TRAIN] epoch: 63, iter: 78600/160000, loss: 0.4847, lr: 0.000616, batch_cost: 0.1620, reader_cost: 0.00074, ips: 49.3867 samples/sec | ETA 03:39:45 2022-08-24 01:47:00 [INFO] [TRAIN] epoch: 63, iter: 78650/160000, loss: 0.4600, lr: 0.000616, batch_cost: 0.1710, reader_cost: 0.00087, ips: 46.7797 samples/sec | ETA 03:51:52 2022-08-24 01:47:09 [INFO] [TRAIN] epoch: 63, iter: 78700/160000, loss: 0.5068, lr: 0.000616, batch_cost: 0.1908, reader_cost: 0.00712, ips: 41.9318 samples/sec | ETA 04:18:30 2022-08-24 01:47:19 [INFO] [TRAIN] epoch: 63, iter: 78750/160000, loss: 0.5012, lr: 0.000615, batch_cost: 0.1853, reader_cost: 0.00035, ips: 43.1744 samples/sec | ETA 04:10:55 2022-08-24 01:47:28 [INFO] [TRAIN] epoch: 63, iter: 78800/160000, loss: 0.4964, lr: 0.000615, batch_cost: 0.1883, reader_cost: 0.00078, ips: 42.4854 samples/sec | ETA 04:14:49 2022-08-24 01:47:39 [INFO] [TRAIN] epoch: 63, iter: 78850/160000, loss: 0.5148, lr: 0.000614, batch_cost: 0.2100, reader_cost: 0.00075, ips: 38.0901 samples/sec | ETA 04:44:03 2022-08-24 01:47:49 [INFO] [TRAIN] epoch: 63, iter: 78900/160000, loss: 0.4724, lr: 0.000614, batch_cost: 0.2117, reader_cost: 0.00562, ips: 37.7813 samples/sec | ETA 04:46:12 2022-08-24 01:47:59 [INFO] [TRAIN] epoch: 63, iter: 78950/160000, loss: 0.4867, lr: 0.000614, batch_cost: 0.1923, reader_cost: 0.00048, ips: 41.6099 samples/sec | ETA 04:19:42 2022-08-24 01:48:09 [INFO] [TRAIN] epoch: 63, iter: 79000/160000, loss: 0.5226, lr: 0.000613, batch_cost: 0.2114, reader_cost: 
0.00040, ips: 37.8443 samples/sec | ETA 04:45:22 2022-08-24 01:48:09 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 200s - batch_cost: 0.1999 - reader cost: 0.0012 2022-08-24 01:51:30 [INFO] [EVAL] #Images: 2000 mIoU: 0.3647 Acc: 0.7675 Kappa: 0.7502 Dice: 0.5019 2022-08-24 01:51:30 [INFO] [EVAL] Class IoU: [0.688 0.7789 0.9306 0.7422 0.6966 0.7636 0.7673 0.7943 0.5233 0.6172 0.4828 0.5673 0.7199 0.3312 0.2953 0.4228 0.463 0.4397 0.6095 0.4324 0.7565 0.4467 0.6093 0.4988 0.3197 0.4073 0.4431 0.4534 0.458 0.2393 0.2774 0.5149 0.2904 0.3308 0.3572 0.411 0.471 0.5809 0.2677 0.3157 0.1755 0.1552 0.356 0.2562 0.2808 0.1796 0.3009 0.5175 0.6175 0.4941 0.5627 0.3877 0.1245 0.1854 0.6528 0.372 0.8337 0.3729 0.4875 0.2697 0.0521 0.2838 0.2653 0.1379 0.4474 0.7077 0.2567 0.3754 0.0717 0.3733 0.5098 0.5367 0.3794 0.2892 0.4609 0.3874 0.4681 0.2433 0.2275 0.3555 0.6496 0.4069 0.4081 0.0365 0.2554 0.5372 0.138 0.1351 0.3635 0.5179 0.4058 0.0035 0.1732 0.0898 0.0599 0.0395 0.1992 0.1967 0.2783 0.2926 0.1704 0.1592 0.2753 0.7237 0.1896 0.403 0.1645 0.5654 0.0772 0.2719 0.1234 0.4396 0.1741 0.6158 0.6582 0.0381 0.5325 0.5386 0.1526 0.3642 0.4676 0.0601 0.3609 0.2206 0.2786 0.2315 0.5256 0.4539 0.5895 0.2897 0.4714 0.0053 0.2999 0.4037 0.306 0.2034 0.1484 0.0166 0.2045 0.3881 0.4639 0.0084 0.2237 0.257 0.2843 0.0104 0.4233 0.032 0.121 0.2004] 2022-08-24 01:51:30 [INFO] [EVAL] Class Precision: [0.8003 0.8599 0.9658 0.8349 0.7858 0.8714 0.8724 0.855 0.6415 0.7888 0.7249 0.715 0.8123 0.4963 0.5363 0.5705 0.693 0.6927 0.7807 0.6207 0.8171 0.6117 0.7378 0.5973 0.4991 0.5388 0.5788 0.6872 0.682 0.3981 0.436 0.7032 0.4884 0.4279 0.4519 0.5243 0.6308 0.7825 0.3906 0.6511 0.3466 0.3697 0.5834 0.4687 0.3335 0.4163 0.5094 0.7786 0.7023 0.5749 0.7472 0.4971 0.2653 0.5214 0.6972 0.7339 0.8745 0.6772 0.713 0.4359 0.0923 0.4051 0.3443 0.5965 0.5346 0.7995 0.438 0.4895 0.1264 0.6161 0.6447 0.6769 0.4808 0.3561 0.6357 0.539 0.6167 0.4726 0.656 0.4798 0.7869 0.6612 0.7528 0.0877 0.3539 0.681 0.5357 0.4002 0.7208 0.8025 0.5739 0.0044 0.3822 0.2797 0.369 0.1235 0.642 0.4718 0.4221 0.641 0.5464 0.2578 0.6213 0.794 0.7687 0.4551 0.3565 0.7724 0.1611 0.414 0.385 0.6267 0.4273 0.8129 0.6783 0.2846 0.7162 0.5994 0.2709 0.4823 0.7112 0.2585 0.5839 0.633 0.6789 0.5719 0.7415 0.587 0.7658 0.4701 0.6242 0.0614 0.4825 0.8252 0.6734 0.439 0.3086 0.0693 0.4455 0.5789 0.5266 0.0135 0.5976 0.6655 0.4711 0.0175 0.8029 0.2092 0.377 0.7597] 2022-08-24 01:51:30 [INFO] [EVAL] Class Recall: [0.8305 0.8922 0.9624 0.8699 0.8598 0.8606 0.8643 0.918 0.7395 0.7393 0.5911 0.7329 0.8635 0.499 0.3964 0.6202 0.5824 0.5463 0.7355 0.5878 0.9107 0.6235 0.7778 0.7516 0.4706 0.6253 0.6539 0.5712 0.5824 0.3751 0.4327 0.6578 0.4174 0.5931 0.6301 0.6555 0.6502 0.6928 0.4597 0.38 0.2622 0.2111 0.4774 0.3611 0.6399 0.2401 0.4236 0.6068 0.8365 0.7786 0.6951 0.6379 0.1901 0.2234 0.9111 0.43 0.9471 0.4535 0.6066 0.4142 0.1068 0.4866 0.5363 0.1521 0.733 0.8605 0.3829 0.617 0.1422 0.4865 0.7089 0.7216 0.6426 0.6062 0.6263 0.5794 0.6602 0.3341 0.2583 0.5785 0.7882 0.5141 0.4712 0.0589 0.4786 0.7178 0.1567 0.1694 0.423 0.5936 0.5807 0.016 0.2406 0.1169 0.0668 0.0548 0.2241 0.2522 0.4496 0.35 0.1985 0.2938 0.3309 0.8911 0.201 0.7788 0.234 0.6784 0.1291 0.4419 0.1537 0.5955 0.227 0.7175 0.9569 0.0422 0.6749 0.8416 0.259 0.5981 0.5772 0.0727 0.4859 0.253 0.3209 0.2801 0.6435 0.6668 0.7192 0.4302 0.6581 0.0058 0.442 0.4415 0.3593 0.2748 0.2223 0.0213 0.2744 0.5406 0.7957 0.0216 0.2634 0.2952 0.4177 0.0248 0.4724 0.0364 0.1513 0.2139] 
2022-08-24 01:51:30 [INFO] [EVAL] The model with the best validation mIoU (0.3702) was saved at iter 73000. 2022-08-24 01:51:39 [INFO] [TRAIN] epoch: 63, iter: 79050/160000, loss: 0.5180, lr: 0.000613, batch_cost: 0.1753, reader_cost: 0.00430, ips: 45.6382 samples/sec | ETA 03:56:29 2022-08-24 01:51:48 [INFO] [TRAIN] epoch: 63, iter: 79100/160000, loss: 0.5039, lr: 0.000612, batch_cost: 0.1946, reader_cost: 0.00163, ips: 41.1024 samples/sec | ETA 04:22:26 2022-08-24 01:51:57 [INFO] [TRAIN] epoch: 63, iter: 79150/160000, loss: 0.4999, lr: 0.000612, batch_cost: 0.1835, reader_cost: 0.00104, ips: 43.5875 samples/sec | ETA 04:07:19 2022-08-24 01:52:05 [INFO] [TRAIN] epoch: 63, iter: 79200/160000, loss: 0.4984, lr: 0.000612, batch_cost: 0.1544, reader_cost: 0.00039, ips: 51.8022 samples/sec | ETA 03:27:58 2022-08-24 01:52:14 [INFO] [TRAIN] epoch: 63, iter: 79250/160000, loss: 0.5133, lr: 0.000611, batch_cost: 0.1763, reader_cost: 0.00509, ips: 45.3766 samples/sec | ETA 03:57:16 2022-08-24 01:52:23 [INFO] [TRAIN] epoch: 63, iter: 79300/160000, loss: 0.5025, lr: 0.000611, batch_cost: 0.1770, reader_cost: 0.00047, ips: 45.2061 samples/sec | ETA 03:58:01 2022-08-24 01:52:31 [INFO] [TRAIN] epoch: 63, iter: 79350/160000, loss: 0.4728, lr: 0.000611, batch_cost: 0.1685, reader_cost: 0.00085, ips: 47.4750 samples/sec | ETA 03:46:30 2022-08-24 01:52:40 [INFO] [TRAIN] epoch: 63, iter: 79400/160000, loss: 0.4904, lr: 0.000610, batch_cost: 0.1685, reader_cost: 0.00041, ips: 47.4679 samples/sec | ETA 03:46:23 2022-08-24 01:52:49 [INFO] [TRAIN] epoch: 63, iter: 79450/160000, loss: 0.5502, lr: 0.000610, batch_cost: 0.1809, reader_cost: 0.00045, ips: 44.2325 samples/sec | ETA 04:02:48 2022-08-24 01:52:58 [INFO] [TRAIN] epoch: 63, iter: 79500/160000, loss: 0.4640, lr: 0.000609, batch_cost: 0.1793, reader_cost: 0.00062, ips: 44.6111 samples/sec | ETA 04:00:35 2022-08-24 01:53:07 [INFO] [TRAIN] epoch: 63, iter: 79550/160000, loss: 0.5140, lr: 0.000609, batch_cost: 0.1830, reader_cost: 0.00032, ips: 43.7224 samples/sec | ETA 04:05:20 2022-08-24 01:53:19 [INFO] [TRAIN] epoch: 64, iter: 79600/160000, loss: 0.5281, lr: 0.000609, batch_cost: 0.2523, reader_cost: 0.09345, ips: 31.7037 samples/sec | ETA 05:38:07 2022-08-24 01:53:29 [INFO] [TRAIN] epoch: 64, iter: 79650/160000, loss: 0.4812, lr: 0.000608, batch_cost: 0.1876, reader_cost: 0.01388, ips: 42.6426 samples/sec | ETA 04:11:14 2022-08-24 01:53:39 [INFO] [TRAIN] epoch: 64, iter: 79700/160000, loss: 0.4668, lr: 0.000608, batch_cost: 0.2092, reader_cost: 0.00177, ips: 38.2328 samples/sec | ETA 04:40:02 2022-08-24 01:53:52 [INFO] [TRAIN] epoch: 64, iter: 79750/160000, loss: 0.5180, lr: 0.000608, batch_cost: 0.2532, reader_cost: 0.00087, ips: 31.5894 samples/sec | ETA 05:38:43 2022-08-24 01:54:03 [INFO] [TRAIN] epoch: 64, iter: 79800/160000, loss: 0.5064, lr: 0.000607, batch_cost: 0.2177, reader_cost: 0.00068, ips: 36.7558 samples/sec | ETA 04:50:55 2022-08-24 01:54:13 [INFO] [TRAIN] epoch: 64, iter: 79850/160000, loss: 0.4907, lr: 0.000607, batch_cost: 0.2014, reader_cost: 0.00191, ips: 39.7188 samples/sec | ETA 04:29:03 2022-08-24 01:54:24 [INFO] [TRAIN] epoch: 64, iter: 79900/160000, loss: 0.5109, lr: 0.000606, batch_cost: 0.2181, reader_cost: 0.00059, ips: 36.6768 samples/sec | ETA 04:51:11 2022-08-24 01:54:34 [INFO] [TRAIN] epoch: 64, iter: 79950/160000, loss: 0.4764, lr: 0.000606, batch_cost: 0.2121, reader_cost: 0.00052, ips: 37.7203 samples/sec | ETA 04:42:57 2022-08-24 01:54:44 [INFO] [TRAIN] epoch: 64, iter: 80000/160000, loss: 0.4989, lr: 0.000606, 
batch_cost: 0.1979, reader_cost: 0.00095, ips: 40.4227 samples/sec | ETA 04:23:52 2022-08-24 01:54:44 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 170s - batch_cost: 0.1700 - reader cost: 0.0016 2022-08-24 01:57:35 [INFO] [EVAL] #Images: 2000 mIoU: 0.3619 Acc: 0.7687 Kappa: 0.7515 Dice: 0.4978 2022-08-24 01:57:35 [INFO] [EVAL] Class IoU: [0.6882 0.7855 0.9323 0.7394 0.6981 0.7756 0.7628 0.7974 0.5259 0.6196 0.4869 0.5606 0.7125 0.3157 0.2952 0.4395 0.474 0.4633 0.6001 0.44 0.7608 0.3581 0.6071 0.5092 0.3401 0.4304 0.4584 0.4504 0.4657 0.2876 0.2831 0.5113 0.2795 0.3459 0.3874 0.3916 0.4756 0.5354 0.2563 0.3496 0.1761 0.1505 0.3522 0.2579 0.2732 0.2001 0.3274 0.5051 0.6322 0.5278 0.5694 0.3553 0.1821 0.1258 0.6529 0.3697 0.8717 0.3521 0.438 0.2542 0.0924 0.2368 0.3441 0.1415 0.459 0.7047 0.2802 0.3992 0.0753 0.3646 0.5106 0.5721 0.3442 0.236 0.4815 0.3692 0.4738 0.2564 0.2246 0.3956 0.6501 0.4153 0.4007 0.0371 0.275 0.531 0.0983 0.1195 0.5366 0.5415 0.4488 0.0033 0.1138 0.1023 0.0447 0.0385 0.191 0.168 0.2751 0.2814 0.0542 0.1498 0.3153 0.7807 0.1953 0.3857 0.1997 0.5737 0.0766 0.2691 0.1703 0.4704 0.1536 0.281 0.6907 0.0567 0.4739 0.6405 0.178 0.3501 0.471 0.0755 0.3573 0.156 0.2771 0.2042 0.4943 0.4504 0.5579 0.3322 0.5102 0.0524 0.1465 0.381 0.2962 0.2066 0.1483 0.0294 0.1811 0.3845 0.0929 0.0584 0.2068 0.2458 0.2914 0.0091 0.4192 0.0291 0.1458 0.1993] 2022-08-24 01:57:35 [INFO] [EVAL] Class Precision: [0.806 0.862 0.9672 0.8412 0.8019 0.8724 0.8721 0.8478 0.6264 0.7625 0.6579 0.7011 0.7802 0.5338 0.5232 0.6208 0.6832 0.663 0.7507 0.6339 0.8429 0.5951 0.7454 0.6183 0.4848 0.5069 0.5541 0.7235 0.6552 0.3817 0.4922 0.6372 0.5184 0.4626 0.4901 0.4675 0.6563 0.7706 0.4158 0.6425 0.3335 0.4262 0.5341 0.464 0.4284 0.3527 0.7757 0.7501 0.7393 0.6368 0.7465 0.4531 0.3396 0.4568 0.7021 0.6153 0.9143 0.7282 0.6683 0.3918 0.1175 0.5269 0.4897 0.6877 0.5851 0.794 0.5582 0.5863 0.1236 0.5405 0.6883 0.7438 0.5894 0.2946 0.7086 0.5234 0.7421 0.4963 0.6002 0.6466 0.7824 0.6256 0.8033 0.1376 0.3552 0.7459 0.5158 0.4628 0.7793 0.7306 0.6429 0.004 0.3057 0.3286 0.1252 0.1424 0.5881 0.4092 0.4215 0.7431 0.6914 0.242 0.5635 0.8454 0.8688 0.427 0.3582 0.7633 0.16 0.4147 0.4489 0.6908 0.4802 0.8673 0.7027 0.2832 0.655 0.7523 0.404 0.4521 0.7209 0.5925 0.6034 0.6905 0.7258 0.554 0.6947 0.5575 0.963 0.6487 0.6115 0.2668 0.5808 0.8392 0.7406 0.4088 0.3601 0.0546 0.3407 0.5386 0.3614 0.101 0.6287 0.514 0.5411 0.0119 0.7941 0.2145 0.4182 0.848 ] 2022-08-24 01:57:35 [INFO] [EVAL] Class Recall: [0.8248 0.8985 0.9628 0.8593 0.8436 0.8748 0.8589 0.9306 0.7661 0.7677 0.6519 0.7366 0.8915 0.4358 0.4037 0.6008 0.6074 0.606 0.7494 0.59 0.8865 0.4735 0.766 0.7426 0.5327 0.7406 0.7262 0.5441 0.6169 0.5384 0.3999 0.7213 0.3775 0.5784 0.6489 0.7069 0.6334 0.6369 0.4006 0.434 0.2716 0.1887 0.5084 0.3674 0.4298 0.3164 0.3616 0.6073 0.8136 0.7551 0.7059 0.6222 0.2819 0.1479 0.9031 0.4809 0.9493 0.4054 0.5597 0.4198 0.3023 0.3008 0.5365 0.1512 0.6804 0.8624 0.36 0.5557 0.1614 0.5284 0.6643 0.7125 0.4528 0.5428 0.6003 0.5562 0.5672 0.3465 0.2642 0.5047 0.7936 0.5527 0.4443 0.0484 0.549 0.6482 0.1083 0.1387 0.6327 0.6766 0.5978 0.0184 0.1535 0.1293 0.0649 0.0501 0.2205 0.2218 0.442 0.3118 0.0555 0.2822 0.4172 0.9108 0.2012 0.7997 0.3109 0.6979 0.1282 0.434 0.2153 0.5959 0.1842 0.2936 0.976 0.0661 0.6315 0.8118 0.2414 0.6079 0.576 0.0796 0.467 0.1678 0.3095 0.2444 0.6315 0.701 0.5701 0.405 0.7549 0.0613 0.1639 0.411 0.3305 0.2946 0.2013 0.0599 0.2788 0.5733 0.1111 0.1217 0.2355 0.3202 0.387 0.0378 
0.4704 0.0326 0.1829 0.2067] 2022-08-24 01:57:35 [INFO] [EVAL] The model with the best validation mIoU (0.3702) was saved at iter 73000. 2022-08-24 01:57:43 [INFO] [TRAIN] epoch: 64, iter: 80050/160000, loss: 0.5331, lr: 0.000605, batch_cost: 0.1719, reader_cost: 0.00683, ips: 46.5253 samples/sec | ETA 03:49:07 2022-08-24 01:57:52 [INFO] [TRAIN] epoch: 64, iter: 80100/160000, loss: 0.4777, lr: 0.000605, batch_cost: 0.1642, reader_cost: 0.00114, ips: 48.7154 samples/sec | ETA 03:38:41 2022-08-24 01:57:59 [INFO] [TRAIN] epoch: 64, iter: 80150/160000, loss: 0.5111, lr: 0.000605, batch_cost: 0.1576, reader_cost: 0.00070, ips: 50.7459 samples/sec | ETA 03:29:48 2022-08-24 01:58:09 [INFO] [TRAIN] epoch: 64, iter: 80200/160000, loss: 0.4754, lr: 0.000604, batch_cost: 0.1846, reader_cost: 0.00087, ips: 43.3381 samples/sec | ETA 04:05:30 2022-08-24 01:58:18 [INFO] [TRAIN] epoch: 64, iter: 80250/160000, loss: 0.4809, lr: 0.000604, batch_cost: 0.1938, reader_cost: 0.00068, ips: 41.2780 samples/sec | ETA 04:17:36 2022-08-24 01:58:27 [INFO] [TRAIN] epoch: 64, iter: 80300/160000, loss: 0.4780, lr: 0.000603, batch_cost: 0.1793, reader_cost: 0.00104, ips: 44.6248 samples/sec | ETA 03:58:08 2022-08-24 01:58:36 [INFO] [TRAIN] epoch: 64, iter: 80350/160000, loss: 0.4990, lr: 0.000603, batch_cost: 0.1634, reader_cost: 0.00055, ips: 48.9513 samples/sec | ETA 03:36:57 2022-08-24 01:58:43 [INFO] [TRAIN] epoch: 64, iter: 80400/160000, loss: 0.4981, lr: 0.000603, batch_cost: 0.1583, reader_cost: 0.00055, ips: 50.5211 samples/sec | ETA 03:30:04 2022-08-24 01:58:52 [INFO] [TRAIN] epoch: 64, iter: 80450/160000, loss: 0.5017, lr: 0.000602, batch_cost: 0.1615, reader_cost: 0.00067, ips: 49.5323 samples/sec | ETA 03:34:08 2022-08-24 01:59:01 [INFO] [TRAIN] epoch: 64, iter: 80500/160000, loss: 0.4567, lr: 0.000602, batch_cost: 0.1852, reader_cost: 0.00059, ips: 43.1996 samples/sec | ETA 04:05:22 2022-08-24 01:59:12 [INFO] [TRAIN] epoch: 64, iter: 80550/160000, loss: 0.5081, lr: 0.000602, batch_cost: 0.2178, reader_cost: 0.00082, ips: 36.7341 samples/sec | ETA 04:48:22 2022-08-24 01:59:21 [INFO] [TRAIN] epoch: 64, iter: 80600/160000, loss: 0.5240, lr: 0.000601, batch_cost: 0.1852, reader_cost: 0.00048, ips: 43.1938 samples/sec | ETA 04:05:05 2022-08-24 01:59:32 [INFO] [TRAIN] epoch: 64, iter: 80650/160000, loss: 0.5168, lr: 0.000601, batch_cost: 0.2120, reader_cost: 0.00045, ips: 37.7287 samples/sec | ETA 04:40:25 2022-08-24 01:59:42 [INFO] [TRAIN] epoch: 64, iter: 80700/160000, loss: 0.5513, lr: 0.000600, batch_cost: 0.2044, reader_cost: 0.00032, ips: 39.1457 samples/sec | ETA 04:30:06 2022-08-24 01:59:52 [INFO] [TRAIN] epoch: 64, iter: 80750/160000, loss: 0.4786, lr: 0.000600, batch_cost: 0.2044, reader_cost: 0.00089, ips: 39.1459 samples/sec | ETA 04:29:55 2022-08-24 02:00:02 [INFO] [TRAIN] epoch: 64, iter: 80800/160000, loss: 0.5154, lr: 0.000600, batch_cost: 0.2046, reader_cost: 0.00037, ips: 39.0927 samples/sec | ETA 04:30:07 2022-08-24 02:00:16 [INFO] [TRAIN] epoch: 65, iter: 80850/160000, loss: 0.5020, lr: 0.000599, batch_cost: 0.2736, reader_cost: 0.06711, ips: 29.2428 samples/sec | ETA 06:00:53 2022-08-24 02:00:27 [INFO] [TRAIN] epoch: 65, iter: 80900/160000, loss: 0.4648, lr: 0.000599, batch_cost: 0.2200, reader_cost: 0.00065, ips: 36.3683 samples/sec | ETA 04:49:59 2022-08-24 02:00:37 [INFO] [TRAIN] epoch: 65, iter: 80950/160000, loss: 0.5233, lr: 0.000598, batch_cost: 0.2028, reader_cost: 0.00050, ips: 39.4390 samples/sec | ETA 04:27:14 2022-08-24 02:00:47 [INFO] [TRAIN] epoch: 65, iter: 81000/160000, loss: 
0.4917, lr: 0.000598, batch_cost: 0.2055, reader_cost: 0.00050, ips: 38.9273 samples/sec | ETA 04:30:35 2022-08-24 02:00:47 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 157s - batch_cost: 0.1572 - reader cost: 7.9492e-04 2022-08-24 02:03:25 [INFO] [EVAL] #Images: 2000 mIoU: 0.3664 Acc: 0.7696 Kappa: 0.7523 Dice: 0.5032 2022-08-24 02:03:25 [INFO] [EVAL] Class IoU: [0.6879 0.7867 0.9305 0.7412 0.6686 0.7627 0.7683 0.8049 0.5198 0.6462 0.4965 0.5622 0.7136 0.2824 0.3234 0.4259 0.473 0.4364 0.5868 0.4421 0.7598 0.4494 0.5999 0.5048 0.2978 0.4582 0.5029 0.438 0.4653 0.2387 0.2784 0.5131 0.2876 0.33 0.3464 0.4288 0.4738 0.5391 0.2507 0.3804 0.0868 0.1694 0.3741 0.2669 0.262 0.2194 0.3976 0.4814 0.6339 0.5407 0.5518 0.3084 0.1874 0.2729 0.645 0.3536 0.8336 0.4186 0.4746 0.3205 0.0904 0.2259 0.2922 0.1768 0.4519 0.7216 0.2495 0.4051 0.0347 0.3469 0.4987 0.5676 0.3763 0.2831 0.4531 0.3689 0.5013 0.2607 0.1925 0.4488 0.6585 0.3985 0.4176 0.0551 0.3228 0.545 0.1139 0.1137 0.3614 0.5294 0.3898 0.0438 0.1586 0.1205 0.0583 0.0195 0.1999 0.2041 0.2904 0.2941 0.0841 0.126 0.296 0.7724 0.1974 0.389 0.132 0.5692 0.0882 0.5143 0.1851 0.4999 0.1666 0.4881 0.6029 0.0481 0.4068 0.6939 0.1805 0.3813 0.4544 0.0677 0.2831 0.1899 0.2952 0.2036 0.4627 0.477 0.4614 0.368 0.5564 0.0388 0.3355 0.3924 0.3152 0.1899 0.1606 0.0233 0.1812 0.3855 0.2152 0.0033 0.2324 0.0522 0.3296 0.0087 0.4211 0.0304 0.1481 0.2077] 2022-08-24 02:03:25 [INFO] [EVAL] Class Precision: [0.7921 0.871 0.9619 0.8424 0.7463 0.8893 0.8644 0.8685 0.6658 0.7574 0.7144 0.7395 0.7849 0.5434 0.5211 0.5609 0.6146 0.6951 0.7092 0.651 0.8333 0.69 0.7249 0.6221 0.4505 0.5625 0.5809 0.6582 0.6826 0.5089 0.4811 0.6637 0.5375 0.4478 0.4516 0.5434 0.6603 0.7989 0.4722 0.576 0.3222 0.334 0.5822 0.495 0.3765 0.405 0.6804 0.6303 0.7305 0.6507 0.7329 0.3897 0.2839 0.5969 0.6714 0.6975 0.8655 0.6683 0.6403 0.625 0.1369 0.5824 0.367 0.6025 0.557 0.8528 0.3645 0.5966 0.109 0.6065 0.6577 0.7381 0.5464 0.353 0.6215 0.5419 0.6229 0.5516 0.6406 0.6636 0.8088 0.6649 0.7803 0.137 0.3765 0.7255 0.5186 0.4144 0.7109 0.7144 0.5658 0.0488 0.3862 0.3927 0.1582 0.086 0.4652 0.4879 0.4082 0.6897 0.7414 0.1997 0.5954 0.8064 0.7631 0.4203 0.284 0.8132 0.1572 0.5544 0.3523 0.73 0.4689 0.8036 0.616 0.3841 0.6841 0.7627 0.4008 0.5295 0.6911 0.4338 0.7837 0.5329 0.6897 0.6313 0.765 0.6024 0.6477 0.6455 0.6087 0.359 0.4901 0.7646 0.7263 0.3088 0.29 0.0621 0.4416 0.5802 0.4897 0.0104 0.5981 0.4565 0.5801 0.0111 0.7858 0.1908 0.3992 0.8172] 2022-08-24 02:03:25 [INFO] [EVAL] Class Recall: [0.8394 0.8905 0.9662 0.8605 0.8653 0.8428 0.8736 0.9166 0.7032 0.8149 0.6193 0.701 0.8869 0.3703 0.4601 0.639 0.6724 0.5397 0.7726 0.5794 0.896 0.5631 0.7766 0.728 0.4677 0.7119 0.7893 0.567 0.5937 0.3102 0.3979 0.6934 0.3822 0.5563 0.5978 0.6703 0.6265 0.6237 0.3483 0.5284 0.1062 0.2557 0.5114 0.3668 0.4626 0.3237 0.4889 0.6709 0.8273 0.7619 0.6906 0.5967 0.3555 0.3345 0.9426 0.4177 0.9576 0.5284 0.6472 0.3967 0.2103 0.2695 0.589 0.2001 0.7054 0.8243 0.4417 0.5578 0.0484 0.4477 0.6736 0.7108 0.5472 0.5883 0.6257 0.5361 0.7198 0.3308 0.2158 0.581 0.7798 0.4986 0.4733 0.0845 0.6934 0.6866 0.1274 0.1355 0.4237 0.6716 0.556 0.2984 0.212 0.1481 0.0845 0.0246 0.2595 0.2598 0.5014 0.339 0.0866 0.2545 0.3705 0.9483 0.2103 0.8391 0.1977 0.6548 0.1674 0.8767 0.2806 0.6133 0.2054 0.5543 0.9659 0.0521 0.5009 0.8849 0.2472 0.5767 0.5702 0.0743 0.3071 0.2278 0.3404 0.2311 0.5393 0.6961 0.616 0.4613 0.8662 0.0416 0.5155 0.4463 0.3576 0.3302 0.2648 0.036 0.2351 0.5347 0.2773 0.0048 
0.2755 0.0557 0.4329 0.0384 0.4757 0.0349 0.1905 0.2179] 2022-08-24 02:03:25 [INFO] [EVAL] The model with the best validation mIoU (0.3702) was saved at iter 73000. 2022-08-24 02:03:34 [INFO] [TRAIN] epoch: 65, iter: 81050/160000, loss: 0.4929, lr: 0.000598, batch_cost: 0.1822, reader_cost: 0.00490, ips: 43.8982 samples/sec | ETA 03:59:47 2022-08-24 02:03:43 [INFO] [TRAIN] epoch: 65, iter: 81100/160000, loss: 0.5500, lr: 0.000597, batch_cost: 0.1714, reader_cost: 0.00153, ips: 46.6792 samples/sec | ETA 03:45:22 2022-08-24 02:03:51 [INFO] [TRAIN] epoch: 65, iter: 81150/160000, loss: 0.4802, lr: 0.000597, batch_cost: 0.1694, reader_cost: 0.00066, ips: 47.2189 samples/sec | ETA 03:42:39 2022-08-24 02:03:59 [INFO] [TRAIN] epoch: 65, iter: 81200/160000, loss: 0.5186, lr: 0.000597, batch_cost: 0.1601, reader_cost: 0.00126, ips: 49.9783 samples/sec | ETA 03:30:13 2022-08-24 02:04:07 [INFO] [TRAIN] epoch: 65, iter: 81250/160000, loss: 0.5078, lr: 0.000596, batch_cost: 0.1662, reader_cost: 0.00109, ips: 48.1391 samples/sec | ETA 03:38:07 2022-08-24 02:04:16 [INFO] [TRAIN] epoch: 65, iter: 81300/160000, loss: 0.5012, lr: 0.000596, batch_cost: 0.1755, reader_cost: 0.00059, ips: 45.5934 samples/sec | ETA 03:50:09 2022-08-24 02:04:24 [INFO] [TRAIN] epoch: 65, iter: 81350/160000, loss: 0.5196, lr: 0.000595, batch_cost: 0.1622, reader_cost: 0.00076, ips: 49.3145 samples/sec | ETA 03:32:38 2022-08-24 02:04:35 [INFO] [TRAIN] epoch: 65, iter: 81400/160000, loss: 0.4912, lr: 0.000595, batch_cost: 0.2067, reader_cost: 0.00047, ips: 38.7085 samples/sec | ETA 04:30:44 2022-08-24 02:04:43 [INFO] [TRAIN] epoch: 65, iter: 81450/160000, loss: 0.4939, lr: 0.000595, batch_cost: 0.1703, reader_cost: 0.00040, ips: 46.9809 samples/sec | ETA 03:42:55 2022-08-24 02:04:51 [INFO] [TRAIN] epoch: 65, iter: 81500/160000, loss: 0.5378, lr: 0.000594, batch_cost: 0.1564, reader_cost: 0.00053, ips: 51.1391 samples/sec | ETA 03:24:40 2022-08-24 02:04:59 [INFO] [TRAIN] epoch: 65, iter: 81550/160000, loss: 0.5154, lr: 0.000594, batch_cost: 0.1544, reader_cost: 0.00083, ips: 51.8060 samples/sec | ETA 03:21:54 2022-08-24 02:05:09 [INFO] [TRAIN] epoch: 65, iter: 81600/160000, loss: 0.5129, lr: 0.000594, batch_cost: 0.1953, reader_cost: 0.00033, ips: 40.9637 samples/sec | ETA 04:15:11 2022-08-24 02:05:18 [INFO] [TRAIN] epoch: 65, iter: 81650/160000, loss: 0.4882, lr: 0.000593, batch_cost: 0.1966, reader_cost: 0.00062, ips: 40.6935 samples/sec | ETA 04:16:42 2022-08-24 02:05:30 [INFO] [TRAIN] epoch: 65, iter: 81700/160000, loss: 0.4480, lr: 0.000593, batch_cost: 0.2247, reader_cost: 0.00082, ips: 35.6046 samples/sec | ETA 04:53:13 2022-08-24 02:05:40 [INFO] [TRAIN] epoch: 65, iter: 81750/160000, loss: 0.4842, lr: 0.000592, batch_cost: 0.2098, reader_cost: 0.00065, ips: 38.1340 samples/sec | ETA 04:33:35 2022-08-24 02:05:51 [INFO] [TRAIN] epoch: 65, iter: 81800/160000, loss: 0.4876, lr: 0.000592, batch_cost: 0.2181, reader_cost: 0.00034, ips: 36.6870 samples/sec | ETA 04:44:12 2022-08-24 02:06:00 [INFO] [TRAIN] epoch: 65, iter: 81850/160000, loss: 0.4902, lr: 0.000592, batch_cost: 0.1843, reader_cost: 0.00055, ips: 43.3970 samples/sec | ETA 04:00:06 2022-08-24 02:06:10 [INFO] [TRAIN] epoch: 65, iter: 81900/160000, loss: 0.4767, lr: 0.000591, batch_cost: 0.1986, reader_cost: 0.00076, ips: 40.2887 samples/sec | ETA 04:18:28 2022-08-24 02:06:21 [INFO] [TRAIN] epoch: 65, iter: 81950/160000, loss: 0.4954, lr: 0.000591, batch_cost: 0.2096, reader_cost: 0.00074, ips: 38.1724 samples/sec | ETA 04:32:37 2022-08-24 02:06:32 [INFO] [TRAIN] epoch: 65, 
iter: 82000/160000, loss: 0.4790, lr: 0.000591, batch_cost: 0.2231, reader_cost: 0.00065, ips: 35.8592 samples/sec | ETA 04:50:01 2022-08-24 02:06:32 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 160s - batch_cost: 0.1599 - reader cost: 0.0013 2022-08-24 02:09:12 [INFO] [EVAL] #Images: 2000 mIoU: 0.3666 Acc: 0.7714 Kappa: 0.7541 Dice: 0.5037 2022-08-24 02:09:12 [INFO] [EVAL] Class IoU: [0.6909 0.7787 0.9306 0.7382 0.6852 0.7674 0.7662 0.8102 0.5304 0.6474 0.4954 0.5808 0.6969 0.3338 0.3183 0.4397 0.4583 0.4414 0.5917 0.4316 0.7699 0.4422 0.5997 0.4992 0.2801 0.4403 0.4634 0.4632 0.4198 0.2988 0.298 0.5177 0.2981 0.3562 0.3627 0.4245 0.4743 0.5488 0.2554 0.3865 0.1227 0.1697 0.3683 0.2511 0.26 0.2165 0.3961 0.512 0.6052 0.5133 0.5493 0.3113 0.175 0.241 0.6773 0.4153 0.8797 0.4118 0.4442 0.3212 0.1295 0.2418 0.3019 0.1638 0.4391 0.7135 0.2196 0.4024 0.0803 0.3337 0.5356 0.5712 0.3706 0.227 0.4745 0.3588 0.5733 0.2429 0.0947 0.3223 0.6503 0.4075 0.3968 0.0433 0.1409 0.521 0.1106 0.112 0.4188 0.5126 0.4508 0.0268 0.2371 0.1187 0.0701 0.0398 0.2076 0.2011 0.2938 0.3062 0.1923 0.1603 0.283 0.47 0.1952 0.4112 0.1086 0.5671 0.071 0.3507 0.1752 0.4604 0.1723 0.5881 0.7784 0.0501 0.3973 0.6896 0.1712 0.3903 0.4308 0.0831 0.3431 0.1715 0.2786 0.246 0.519 0.4632 0.5166 0.3343 0.607 0.0622 0.2888 0.458 0.2719 0.192 0.1324 0.0267 0.1988 0.3724 0.1416 0.0219 0.2182 0.0568 0.3439 0.013 0.4458 0.049 0.17 0.2135] 2022-08-24 02:09:12 [INFO] [EVAL] Class Precision: [0.7926 0.8554 0.9646 0.8193 0.7699 0.8641 0.8982 0.8927 0.6939 0.8075 0.696 0.687 0.7509 0.552 0.6151 0.6014 0.6705 0.6368 0.7093 0.6266 0.8435 0.6425 0.727 0.6392 0.4986 0.5767 0.5697 0.6838 0.8198 0.4497 0.448 0.6613 0.5089 0.5269 0.4474 0.5462 0.64 0.8177 0.414 0.5515 0.3145 0.3686 0.6348 0.5679 0.3685 0.4362 0.6347 0.6886 0.6939 0.6092 0.7131 0.3798 0.3262 0.6179 0.7222 0.5601 0.9484 0.6629 0.6759 0.5286 0.1854 0.6571 0.4137 0.6568 0.5113 0.8224 0.4223 0.5458 0.1772 0.6257 0.6931 0.7374 0.5486 0.3368 0.6737 0.474 0.8159 0.479 0.6674 0.7107 0.7784 0.6824 0.812 0.1084 0.2395 0.724 0.3968 0.356 0.7271 0.666 0.6442 0.038 0.4137 0.3332 0.4542 0.0887 0.638 0.461 0.418 0.7099 0.683 0.3078 0.6273 0.8117 0.7096 0.4387 0.3038 0.8129 0.1218 0.4822 0.4053 0.6255 0.4864 0.7717 0.7898 0.4589 0.574 0.7859 0.3132 0.5603 0.6225 0.3711 0.6234 0.5806 0.7286 0.5436 0.804 0.589 0.8035 0.5056 0.732 0.3633 0.5309 0.7742 0.797 0.3323 0.3241 0.0679 0.4145 0.6776 0.7892 0.0317 0.5931 0.3348 0.634 0.027 0.808 0.2019 0.4453 0.7431] 2022-08-24 02:09:12 [INFO] [EVAL] Class Recall: [0.8434 0.8968 0.9636 0.8818 0.8616 0.8727 0.8391 0.8977 0.6925 0.7655 0.6322 0.7897 0.9066 0.4578 0.3975 0.6207 0.5915 0.59 0.7812 0.581 0.8981 0.5866 0.774 0.695 0.3899 0.6505 0.713 0.5895 0.4624 0.4709 0.471 0.7044 0.4185 0.5237 0.6569 0.6557 0.647 0.6253 0.4001 0.5637 0.1675 0.2392 0.4673 0.3105 0.4689 0.3006 0.5131 0.6663 0.8256 0.7653 0.7052 0.6333 0.274 0.2833 0.916 0.6163 0.9239 0.5208 0.5644 0.45 0.3002 0.2767 0.5277 0.1792 0.7566 0.8435 0.314 0.6051 0.1281 0.4169 0.7021 0.7169 0.5332 0.4104 0.6161 0.5961 0.6585 0.3301 0.0994 0.3709 0.798 0.5028 0.4369 0.0673 0.2549 0.6502 0.1329 0.1405 0.4969 0.69 0.6002 0.083 0.3571 0.1558 0.0765 0.0672 0.2353 0.2629 0.4972 0.35 0.2111 0.2506 0.3402 0.5276 0.2121 0.8676 0.1446 0.6522 0.1456 0.5625 0.2359 0.6357 0.2106 0.7119 0.9818 0.0533 0.5635 0.849 0.2742 0.5626 0.5831 0.0968 0.4329 0.1958 0.3109 0.3101 0.5942 0.6844 0.5912 0.4967 0.7804 0.0699 0.3877 0.5286 0.2922 0.3127 0.183 0.0422 0.2764 0.4527 0.1472 0.0663 
0.2567 0.064 0.4291 0.0244 0.4987 0.0607 0.2156 0.2306] 2022-08-24 02:09:12 [INFO] [EVAL] The model with the best validation mIoU (0.3702) was saved at iter 73000. 2022-08-24 02:09:20 [INFO] [TRAIN] epoch: 65, iter: 82050/160000, loss: 0.4820, lr: 0.000590, batch_cost: 0.1584, reader_cost: 0.00497, ips: 50.5134 samples/sec | ETA 03:25:45 2022-08-24 02:09:31 [INFO] [TRAIN] epoch: 66, iter: 82100/160000, loss: 0.4932, lr: 0.000590, batch_cost: 0.2153, reader_cost: 0.06229, ips: 37.1620 samples/sec | ETA 04:39:29 2022-08-24 02:09:40 [INFO] [TRAIN] epoch: 66, iter: 82150/160000, loss: 0.4904, lr: 0.000589, batch_cost: 0.1869, reader_cost: 0.00043, ips: 42.8116 samples/sec | ETA 04:02:27 2022-08-24 02:09:49 [INFO] [TRAIN] epoch: 66, iter: 82200/160000, loss: 0.4931, lr: 0.000589, batch_cost: 0.1685, reader_cost: 0.00060, ips: 47.4776 samples/sec | ETA 03:38:29 2022-08-24 02:09:58 [INFO] [TRAIN] epoch: 66, iter: 82250/160000, loss: 0.4712, lr: 0.000589, batch_cost: 0.1878, reader_cost: 0.00046, ips: 42.6036 samples/sec | ETA 04:03:19 2022-08-24 02:10:07 [INFO] [TRAIN] epoch: 66, iter: 82300/160000, loss: 0.4618, lr: 0.000588, batch_cost: 0.1792, reader_cost: 0.00096, ips: 44.6431 samples/sec | ETA 03:52:03 2022-08-24 02:10:17 [INFO] [TRAIN] epoch: 66, iter: 82350/160000, loss: 0.5092, lr: 0.000588, batch_cost: 0.1899, reader_cost: 0.00049, ips: 42.1339 samples/sec | ETA 04:05:43 2022-08-24 02:10:25 [INFO] [TRAIN] epoch: 66, iter: 82400/160000, loss: 0.5229, lr: 0.000588, batch_cost: 0.1686, reader_cost: 0.00082, ips: 47.4531 samples/sec | ETA 03:38:02 2022-08-24 02:10:33 [INFO] [TRAIN] epoch: 66, iter: 82450/160000, loss: 0.5203, lr: 0.000587, batch_cost: 0.1599, reader_cost: 0.00052, ips: 50.0395 samples/sec | ETA 03:26:38 2022-08-24 02:10:41 [INFO] [TRAIN] epoch: 66, iter: 82500/160000, loss: 0.4976, lr: 0.000587, batch_cost: 0.1552, reader_cost: 0.00076, ips: 51.5601 samples/sec | ETA 03:20:24 2022-08-24 02:10:50 [INFO] [TRAIN] epoch: 66, iter: 82550/160000, loss: 0.4880, lr: 0.000586, batch_cost: 0.1816, reader_cost: 0.00064, ips: 44.0525 samples/sec | ETA 03:54:25 2022-08-24 02:10:59 [INFO] [TRAIN] epoch: 66, iter: 82600/160000, loss: 0.4786, lr: 0.000586, batch_cost: 0.1768, reader_cost: 0.00072, ips: 45.2609 samples/sec | ETA 03:48:00 2022-08-24 02:11:07 [INFO] [TRAIN] epoch: 66, iter: 82650/160000, loss: 0.4954, lr: 0.000586, batch_cost: 0.1754, reader_cost: 0.00052, ips: 45.6088 samples/sec | ETA 03:46:07 2022-08-24 02:11:17 [INFO] [TRAIN] epoch: 66, iter: 82700/160000, loss: 0.5603, lr: 0.000585, batch_cost: 0.1919, reader_cost: 0.00044, ips: 41.6784 samples/sec | ETA 04:07:17 2022-08-24 02:11:27 [INFO] [TRAIN] epoch: 66, iter: 82750/160000, loss: 0.4811, lr: 0.000585, batch_cost: 0.2022, reader_cost: 0.00056, ips: 39.5680 samples/sec | ETA 04:20:18 2022-08-24 02:11:36 [INFO] [TRAIN] epoch: 66, iter: 82800/160000, loss: 0.4891, lr: 0.000584, batch_cost: 0.1736, reader_cost: 0.00050, ips: 46.0704 samples/sec | ETA 03:43:25 2022-08-24 02:11:45 [INFO] [TRAIN] epoch: 66, iter: 82850/160000, loss: 0.4869, lr: 0.000584, batch_cost: 0.1881, reader_cost: 0.00094, ips: 42.5235 samples/sec | ETA 04:01:54 2022-08-24 02:11:55 [INFO] [TRAIN] epoch: 66, iter: 82900/160000, loss: 0.4806, lr: 0.000584, batch_cost: 0.1927, reader_cost: 0.00058, ips: 41.5075 samples/sec | ETA 04:07:39 2022-08-24 02:12:05 [INFO] [TRAIN] epoch: 66, iter: 82950/160000, loss: 0.4861, lr: 0.000583, batch_cost: 0.2032, reader_cost: 0.00044, ips: 39.3752 samples/sec | ETA 04:20:54 2022-08-24 02:12:15 [INFO] [TRAIN] epoch: 66, 
iter: 83000/160000, loss: 0.4919, lr: 0.000583, batch_cost: 0.1989, reader_cost: 0.00079, ips: 40.2300 samples/sec | ETA 04:15:11 2022-08-24 02:12:15 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 178s - batch_cost: 0.1776 - reader cost: 7.0692e-04 2022-08-24 02:15:13 [INFO] [EVAL] #Images: 2000 mIoU: 0.3676 Acc: 0.7673 Kappa: 0.7500 Dice: 0.5060 2022-08-24 02:15:13 [INFO] [EVAL] Class IoU: [0.6858 0.7813 0.9285 0.7425 0.6731 0.7595 0.7724 0.8043 0.5132 0.6105 0.4974 0.5634 0.7026 0.3175 0.3175 0.4096 0.48 0.4106 0.5716 0.4265 0.7627 0.4368 0.6268 0.4926 0.3167 0.4137 0.4393 0.4321 0.4163 0.2284 0.2748 0.4955 0.2521 0.307 0.3503 0.4258 0.4741 0.5648 0.2643 0.3565 0.1307 0.1636 0.359 0.2452 0.2536 0.2255 0.3738 0.5111 0.6238 0.5004 0.5186 0.3128 0.1713 0.263 0.6382 0.4105 0.8715 0.4237 0.5322 0.3192 0.0709 0.2802 0.2738 0.2018 0.3532 0.7225 0.2443 0.3916 0.0782 0.3588 0.519 0.5543 0.3872 0.2925 0.4735 0.3836 0.5072 0.2536 0.1853 0.3672 0.615 0.4332 0.3693 0.036 0.337 0.5173 0.1118 0.1014 0.4203 0.5119 0.3704 0.0091 0.2148 0.1304 0.0546 0.0265 0.1902 0.2453 0.2914 0.3109 0.3105 0.1684 0.3043 0.6392 0.1941 0.4143 0.2085 0.5744 0.0733 0.4072 0.163 0.526 0.1508 0.6381 0.7674 0.0601 0.428 0.696 0.1723 0.4027 0.4404 0.0878 0.3036 0.1531 0.286 0.1887 0.4892 0.4389 0.5652 0.3237 0.5258 0.081 0.2365 0.4565 0.3059 0.1803 0.1197 0.019 0.1818 0.3578 0.1553 0.0395 0.2874 0.1798 0.2451 0.0152 0.421 0.0311 0.1796 0.2084] 2022-08-24 02:15:13 [INFO] [EVAL] Class Precision: [0.8107 0.851 0.9643 0.8502 0.7391 0.8385 0.8869 0.8616 0.6549 0.7193 0.7289 0.6861 0.7712 0.5287 0.5602 0.6015 0.6821 0.6761 0.6521 0.6072 0.8462 0.6607 0.7873 0.5941 0.5015 0.577 0.5625 0.6012 0.6902 0.3823 0.4495 0.6781 0.508 0.3905 0.4789 0.5738 0.6581 0.8176 0.3996 0.5963 0.3432 0.3865 0.6281 0.4863 0.3516 0.4994 0.7591 0.7413 0.7503 0.59 0.7886 0.3543 0.2917 0.6248 0.6742 0.6773 0.9124 0.6424 0.7029 0.5942 0.1204 0.5423 0.3805 0.7218 0.3804 0.8546 0.3528 0.5104 0.1653 0.5552 0.692 0.681 0.5689 0.3467 0.6856 0.5512 0.8264 0.5097 0.653 0.5661 0.7338 0.6669 0.814 0.1023 0.4039 0.7247 0.4106 0.3858 0.7175 0.6964 0.489 0.0245 0.4353 0.3484 0.2828 0.0964 0.5872 0.4325 0.5002 0.5989 0.5479 0.2892 0.5814 0.7906 0.7094 0.4443 0.4583 0.8548 0.1593 0.5024 0.491 0.6126 0.488 0.7318 0.788 0.3533 0.6138 0.7762 0.2889 0.5772 0.7003 0.3337 0.7019 0.6455 0.745 0.6448 0.7403 0.5392 0.7023 0.5931 0.6258 0.3077 0.5353 0.747 0.7469 0.2657 0.3374 0.0471 0.4111 0.6885 0.4277 0.0799 0.5701 0.6682 0.6418 0.0413 0.8098 0.2304 0.4579 0.7866] 2022-08-24 02:15:13 [INFO] [EVAL] Class Recall: [0.8165 0.9051 0.9616 0.8542 0.883 0.8896 0.8568 0.9236 0.7033 0.8015 0.6102 0.7591 0.8876 0.4427 0.4229 0.5621 0.6184 0.5111 0.8225 0.5889 0.8855 0.5632 0.7546 0.7426 0.4622 0.5939 0.6672 0.6058 0.5119 0.3619 0.4142 0.6478 0.3336 0.5894 0.5662 0.6226 0.629 0.6462 0.4385 0.4699 0.1743 0.221 0.4559 0.331 0.4764 0.2914 0.4242 0.6221 0.7872 0.7671 0.6024 0.7279 0.2933 0.3123 0.9228 0.5103 0.9511 0.5545 0.6867 0.4083 0.1472 0.3669 0.4939 0.2189 0.8319 0.8237 0.4427 0.6271 0.1292 0.5035 0.6748 0.7486 0.548 0.652 0.6048 0.5578 0.5676 0.3354 0.2056 0.5111 0.7915 0.5529 0.4033 0.0527 0.6705 0.6438 0.1332 0.1209 0.5036 0.6589 0.6043 0.0143 0.2977 0.1725 0.0634 0.0353 0.2196 0.3617 0.4112 0.3927 0.4174 0.2874 0.3897 0.7694 0.2109 0.8597 0.2766 0.6365 0.1194 0.6824 0.1962 0.7883 0.1791 0.8329 0.967 0.0676 0.5857 0.8707 0.2993 0.5712 0.5426 0.1065 0.3486 0.1672 0.3171 0.2105 0.5905 0.7024 0.7432 0.4161 0.7669 0.099 0.2976 0.54 0.3412 0.3593 0.1564 0.031 
0.2458 0.4269 0.1961 0.0724 0.3669 0.1974 0.2839 0.0235 0.4673 0.0347 0.2282 0.2209] 2022-08-24 02:15:13 [INFO] [EVAL] The model with the best validation mIoU (0.3702) was saved at iter 73000. 2022-08-24 02:15:22 [INFO] [TRAIN] epoch: 66, iter: 83050/160000, loss: 0.5067, lr: 0.000583, batch_cost: 0.1863, reader_cost: 0.00388, ips: 42.9357 samples/sec | ETA 03:58:57 2022-08-24 02:15:31 [INFO] [TRAIN] epoch: 66, iter: 83100/160000, loss: 0.5004, lr: 0.000582, batch_cost: 0.1713, reader_cost: 0.00061, ips: 46.7112 samples/sec | ETA 03:39:30 2022-08-24 02:15:40 [INFO] [TRAIN] epoch: 66, iter: 83150/160000, loss: 0.5024, lr: 0.000582, batch_cost: 0.1765, reader_cost: 0.00044, ips: 45.3249 samples/sec | ETA 03:46:04 2022-08-24 02:15:49 [INFO] [TRAIN] epoch: 66, iter: 83200/160000, loss: 0.4776, lr: 0.000581, batch_cost: 0.1845, reader_cost: 0.00059, ips: 43.3632 samples/sec | ETA 03:56:08 2022-08-24 02:15:59 [INFO] [TRAIN] epoch: 66, iter: 83250/160000, loss: 0.5193, lr: 0.000581, batch_cost: 0.2018, reader_cost: 0.00072, ips: 39.6524 samples/sec | ETA 04:18:04 2022-08-24 02:16:08 [INFO] [TRAIN] epoch: 66, iter: 83300/160000, loss: 0.4786, lr: 0.000581, batch_cost: 0.1720, reader_cost: 0.00071, ips: 46.5076 samples/sec | ETA 03:39:53 2022-08-24 02:16:16 [INFO] [TRAIN] epoch: 66, iter: 83350/160000, loss: 0.5263, lr: 0.000580, batch_cost: 0.1618, reader_cost: 0.00055, ips: 49.4484 samples/sec | ETA 03:26:40 2022-08-24 02:16:29 [INFO] [TRAIN] epoch: 67, iter: 83400/160000, loss: 0.4866, lr: 0.000580, batch_cost: 0.2550, reader_cost: 0.07441, ips: 31.3756 samples/sec | ETA 05:25:31 2022-08-24 02:16:39 [INFO] [TRAIN] epoch: 67, iter: 83450/160000, loss: 0.4687, lr: 0.000580, batch_cost: 0.2024, reader_cost: 0.00041, ips: 39.5195 samples/sec | ETA 04:18:16 2022-08-24 02:16:48 [INFO] [TRAIN] epoch: 67, iter: 83500/160000, loss: 0.4973, lr: 0.000579, batch_cost: 0.1963, reader_cost: 0.00075, ips: 40.7448 samples/sec | ETA 04:10:20 2022-08-24 02:16:59 [INFO] [TRAIN] epoch: 67, iter: 83550/160000, loss: 0.5077, lr: 0.000579, batch_cost: 0.2092, reader_cost: 0.00073, ips: 38.2362 samples/sec | ETA 04:26:35 2022-08-24 02:17:09 [INFO] [TRAIN] epoch: 67, iter: 83600/160000, loss: 0.4858, lr: 0.000578, batch_cost: 0.1972, reader_cost: 0.00059, ips: 40.5638 samples/sec | ETA 04:11:07 2022-08-24 02:17:19 [INFO] [TRAIN] epoch: 67, iter: 83650/160000, loss: 0.5066, lr: 0.000578, batch_cost: 0.1985, reader_cost: 0.00032, ips: 40.3062 samples/sec | ETA 04:12:34 2022-08-24 02:17:29 [INFO] [TRAIN] epoch: 67, iter: 83700/160000, loss: 0.4620, lr: 0.000578, batch_cost: 0.1963, reader_cost: 0.00039, ips: 40.7441 samples/sec | ETA 04:09:41 2022-08-24 02:17:38 [INFO] [TRAIN] epoch: 67, iter: 83750/160000, loss: 0.4450, lr: 0.000577, batch_cost: 0.1882, reader_cost: 0.00073, ips: 42.5027 samples/sec | ETA 03:59:12 2022-08-24 02:17:47 [INFO] [TRAIN] epoch: 67, iter: 83800/160000, loss: 0.4990, lr: 0.000577, batch_cost: 0.1848, reader_cost: 0.00114, ips: 43.2847 samples/sec | ETA 03:54:43 2022-08-24 02:17:56 [INFO] [TRAIN] epoch: 67, iter: 83850/160000, loss: 0.4653, lr: 0.000577, batch_cost: 0.1735, reader_cost: 0.00107, ips: 46.1206 samples/sec | ETA 03:40:08 2022-08-24 02:18:06 [INFO] [TRAIN] epoch: 67, iter: 83900/160000, loss: 0.4948, lr: 0.000576, batch_cost: 0.2111, reader_cost: 0.00106, ips: 37.9047 samples/sec | ETA 04:27:41 2022-08-24 02:18:17 [INFO] [TRAIN] epoch: 67, iter: 83950/160000, loss: 0.4754, lr: 0.000576, batch_cost: 0.2103, reader_cost: 0.00088, ips: 38.0463 samples/sec | ETA 04:26:31 2022-08-24 
02:18:26 [INFO] [TRAIN] epoch: 67, iter: 84000/160000, loss: 0.5157, lr: 0.000575, batch_cost: 0.1860, reader_cost: 0.00051, ips: 43.0198 samples/sec | ETA 03:55:33 2022-08-24 02:18:26 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 170s - batch_cost: 0.1695 - reader cost: 7.3988e-04 2022-08-24 02:21:16 [INFO] [EVAL] #Images: 2000 mIoU: 0.3670 Acc: 0.7714 Kappa: 0.7541 Dice: 0.5053 2022-08-24 02:21:16 [INFO] [EVAL] Class IoU: [0.6862 0.7918 0.9299 0.7435 0.6869 0.7656 0.7788 0.8101 0.5323 0.6417 0.4924 0.5582 0.7004 0.2923 0.3211 0.4177 0.4748 0.436 0.6127 0.4301 0.7591 0.4791 0.614 0.5067 0.303 0.4933 0.4427 0.4445 0.4083 0.3219 0.2569 0.5213 0.2556 0.3228 0.3457 0.3866 0.4712 0.5259 0.2669 0.3812 0.1651 0.1219 0.3609 0.268 0.2552 0.2229 0.3726 0.5089 0.6146 0.5372 0.5536 0.3182 0.1579 0.2491 0.6499 0.4022 0.85 0.4147 0.4936 0.3301 0.1076 0.2438 0.2869 0.1444 0.4053 0.6791 0.2493 0.3668 0.0762 0.3622 0.4783 0.5678 0.3898 0.2449 0.4802 0.3736 0.5497 0.2738 0.2334 0.4027 0.6518 0.4209 0.36 0.0556 0.2229 0.5112 0.1328 0.1094 0.3297 0.488 0.3835 0.0001 0.1681 0.0742 0.0464 0.0285 0.2051 0.238 0.286 0.2995 0.1783 0.1492 0.3152 0.6357 0.1832 0.3591 0.1962 0.5758 0.0822 0.3513 0.1955 0.4472 0.1481 0.5801 0.8191 0.0512 0.4356 0.6467 0.1444 0.3618 0.425 0.072 0.237 0.1889 0.2873 0.2211 0.485 0.4615 0.4954 0.3169 0.5571 0.0436 0.2212 0.4579 0.2863 0.1845 0.1346 0.0282 0.1776 0.3774 0.3651 0.0907 0.3333 0.2558 0.2961 0.0124 0.4208 0.0402 0.1337 0.2091] 2022-08-24 02:21:16 [INFO] [EVAL] Class Precision: [0.7868 0.87 0.9626 0.8501 0.7651 0.8737 0.8788 0.8705 0.674 0.7301 0.71 0.726 0.7726 0.5243 0.5742 0.5552 0.6583 0.6869 0.7591 0.6111 0.8268 0.637 0.7596 0.5963 0.447 0.6176 0.567 0.6938 0.7688 0.5001 0.4749 0.6936 0.5445 0.43 0.5024 0.4999 0.6403 0.8276 0.3705 0.6115 0.392 0.3579 0.6374 0.5067 0.338 0.4843 0.6643 0.7507 0.716 0.6415 0.7447 0.3803 0.3734 0.6449 0.6855 0.5601 0.8904 0.6892 0.7425 0.6692 0.2064 0.4914 0.4049 0.6275 0.4515 0.7594 0.4278 0.4626 0.1753 0.6056 0.7162 0.6943 0.5466 0.3352 0.6751 0.5191 0.7169 0.638 0.6263 0.5817 0.7873 0.669 0.8323 0.1253 0.3134 0.7015 0.3697 0.3731 0.8553 0.6133 0.4814 0.0006 0.3539 0.3161 0.17 0.0731 0.7144 0.5278 0.4414 0.6606 0.7007 0.2441 0.6235 0.824 0.6917 0.3827 0.3424 0.7958 0.2073 0.4663 0.3552 0.5935 0.5187 0.802 0.8299 0.3004 0.6984 0.7599 0.3389 0.5136 0.6541 0.3408 0.699 0.5572 0.7054 0.6211 0.6861 0.58 0.7753 0.4625 0.6879 0.2797 0.446 0.7134 0.7963 0.3422 0.2959 0.0634 0.4218 0.5514 0.8733 0.1624 0.5194 0.6233 0.6684 0.0264 0.7358 0.2312 0.3001 0.7784] 2022-08-24 02:21:16 [INFO] [EVAL] Class Recall: [0.8428 0.8981 0.9647 0.8556 0.8704 0.8609 0.8725 0.9211 0.7168 0.8412 0.6164 0.7071 0.8823 0.3978 0.4215 0.6278 0.63 0.5442 0.7607 0.5922 0.9027 0.6592 0.7621 0.7713 0.4847 0.7102 0.6689 0.5529 0.4654 0.4745 0.3589 0.6773 0.3251 0.5642 0.5257 0.6304 0.6409 0.5907 0.4884 0.503 0.222 0.1561 0.4541 0.3627 0.5102 0.2923 0.459 0.6125 0.8128 0.7677 0.6833 0.6608 0.2148 0.2887 0.9259 0.5879 0.9493 0.5101 0.5955 0.3944 0.1836 0.3262 0.4961 0.158 0.7984 0.8653 0.3741 0.6391 0.1187 0.474 0.5901 0.7571 0.5761 0.4761 0.6246 0.5713 0.7021 0.3241 0.2712 0.5669 0.7912 0.5316 0.3881 0.0908 0.4355 0.6532 0.1718 0.134 0.3492 0.7049 0.6534 0.0002 0.2426 0.0883 0.06 0.0447 0.2234 0.3025 0.4482 0.3539 0.193 0.2772 0.3893 0.7355 0.1995 0.8534 0.3149 0.6757 0.12 0.5875 0.303 0.6448 0.1717 0.6771 0.9844 0.0582 0.5366 0.8129 0.201 0.5505 0.5483 0.0837 0.2639 0.2223 0.3265 0.2555 0.6233 0.693 0.5785 0.5016 0.7455 0.0491 0.305 0.5611 0.309 
0.286 0.198 0.0484 0.2347 0.5446 0.3855 0.1704 0.4818 0.3026 0.3471 0.0228 0.4956 0.0464 0.1942 0.2223] 2022-08-24 02:21:16 [INFO] [EVAL] The model with the best validation mIoU (0.3702) was saved at iter 73000. 2022-08-24 02:21:25 [INFO] [TRAIN] epoch: 67, iter: 84050/160000, loss: 0.5171, lr: 0.000575, batch_cost: 0.1698, reader_cost: 0.00362, ips: 47.1194 samples/sec | ETA 03:34:54 2022-08-24 02:21:32 [INFO] [TRAIN] epoch: 67, iter: 84100/160000, loss: 0.4984, lr: 0.000575, batch_cost: 0.1517, reader_cost: 0.00088, ips: 52.7415 samples/sec | ETA 03:11:52 2022-08-24 02:21:40 [INFO] [TRAIN] epoch: 67, iter: 84150/160000, loss: 0.5088, lr: 0.000574, batch_cost: 0.1539, reader_cost: 0.00034, ips: 51.9700 samples/sec | ETA 03:14:35 2022-08-24 02:21:49 [INFO] [TRAIN] epoch: 67, iter: 84200/160000, loss: 0.5002, lr: 0.000574, batch_cost: 0.1873, reader_cost: 0.00091, ips: 42.7075 samples/sec | ETA 03:56:38 2022-08-24 02:21:59 [INFO] [TRAIN] epoch: 67, iter: 84250/160000, loss: 0.4702, lr: 0.000574, batch_cost: 0.1833, reader_cost: 0.00040, ips: 43.6350 samples/sec | ETA 03:51:27 2022-08-24 02:22:07 [INFO] [TRAIN] epoch: 67, iter: 84300/160000, loss: 0.4889, lr: 0.000573, batch_cost: 0.1764, reader_cost: 0.00063, ips: 45.3522 samples/sec | ETA 03:42:33 2022-08-24 02:22:17 [INFO] [TRAIN] epoch: 67, iter: 84350/160000, loss: 0.5066, lr: 0.000573, batch_cost: 0.1922, reader_cost: 0.00050, ips: 41.6215 samples/sec | ETA 04:02:20 2022-08-24 02:22:26 [INFO] [TRAIN] epoch: 67, iter: 84400/160000, loss: 0.4978, lr: 0.000572, batch_cost: 0.1703, reader_cost: 0.00065, ips: 46.9856 samples/sec | ETA 03:34:32 2022-08-24 02:22:34 [INFO] [TRAIN] epoch: 67, iter: 84450/160000, loss: 0.4763, lr: 0.000572, batch_cost: 0.1627, reader_cost: 0.00045, ips: 49.1598 samples/sec | ETA 03:24:54 2022-08-24 02:22:42 [INFO] [TRAIN] epoch: 67, iter: 84500/160000, loss: 0.5009, lr: 0.000572, batch_cost: 0.1634, reader_cost: 0.00038, ips: 48.9543 samples/sec | ETA 03:25:38 2022-08-24 02:22:52 [INFO] [TRAIN] epoch: 67, iter: 84550/160000, loss: 0.4937, lr: 0.000571, batch_cost: 0.1966, reader_cost: 0.00049, ips: 40.6997 samples/sec | ETA 04:07:10 2022-08-24 02:23:01 [INFO] [TRAIN] epoch: 67, iter: 84600/160000, loss: 0.4947, lr: 0.000571, batch_cost: 0.1837, reader_cost: 0.00683, ips: 43.5575 samples/sec | ETA 03:50:48 2022-08-24 02:23:15 [INFO] [TRAIN] epoch: 68, iter: 84650/160000, loss: 0.4487, lr: 0.000570, batch_cost: 0.2735, reader_cost: 0.05965, ips: 29.2483 samples/sec | ETA 05:43:29 2022-08-24 02:23:25 [INFO] [TRAIN] epoch: 68, iter: 84700/160000, loss: 0.4902, lr: 0.000570, batch_cost: 0.2043, reader_cost: 0.00089, ips: 39.1628 samples/sec | ETA 04:16:21 2022-08-24 02:23:34 [INFO] [TRAIN] epoch: 68, iter: 84750/160000, loss: 0.4502, lr: 0.000570, batch_cost: 0.1843, reader_cost: 0.00036, ips: 43.4148 samples/sec | ETA 03:51:06 2022-08-24 02:23:44 [INFO] [TRAIN] epoch: 68, iter: 84800/160000, loss: 0.4589, lr: 0.000569, batch_cost: 0.1953, reader_cost: 0.00303, ips: 40.9728 samples/sec | ETA 04:04:42 2022-08-24 02:23:54 [INFO] [TRAIN] epoch: 68, iter: 84850/160000, loss: 0.4673, lr: 0.000569, batch_cost: 0.2053, reader_cost: 0.00055, ips: 38.9602 samples/sec | ETA 04:17:11 2022-08-24 02:24:03 [INFO] [TRAIN] epoch: 68, iter: 84900/160000, loss: 0.4691, lr: 0.000569, batch_cost: 0.1833, reader_cost: 0.00045, ips: 43.6560 samples/sec | ETA 03:49:22 2022-08-24 02:24:14 [INFO] [TRAIN] epoch: 68, iter: 84950/160000, loss: 0.4695, lr: 0.000568, batch_cost: 0.2144, reader_cost: 0.00047, ips: 37.3075 samples/sec | ETA 
04:28:13 2022-08-24 02:24:24 [INFO] [TRAIN] epoch: 68, iter: 85000/160000, loss: 0.4722, lr: 0.000568, batch_cost: 0.2026, reader_cost: 0.00116, ips: 39.4948 samples/sec | ETA 04:13:11 2022-08-24 02:24:24 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 172s - batch_cost: 0.1724 - reader cost: 0.0011 2022-08-24 02:27:17 [INFO] [EVAL] #Images: 2000 mIoU: 0.3707 Acc: 0.7719 Kappa: 0.7545 Dice: 0.5078 2022-08-24 02:27:17 [INFO] [EVAL] Class IoU: [0.6889 0.7901 0.932 0.7463 0.683 0.7638 0.7699 0.7888 0.53 0.6433 0.4896 0.5744 0.7035 0.3015 0.3184 0.4276 0.5083 0.4584 0.6087 0.4349 0.7658 0.428 0.6252 0.504 0.3332 0.4201 0.4745 0.4587 0.4935 0.2509 0.2228 0.4923 0.2874 0.3402 0.3322 0.4263 0.478 0.5565 0.2743 0.3594 0.122 0.1364 0.3555 0.2742 0.2688 0.2458 0.4037 0.5006 0.6321 0.5358 0.5288 0.3395 0.1781 0.2617 0.668 0.3363 0.8462 0.4097 0.3038 0.3209 0.0909 0.333 0.323 0.219 0.4656 0.7079 0.2338 0.3898 0.0976 0.3623 0.5075 0.5701 0.389 0.2438 0.4694 0.3932 0.5177 0.2775 0.2609 0.213 0.7078 0.4072 0.4026 0.0785 0.1336 0.5142 0.1193 0.1078 0.3786 0.5278 0.402 0.1685 0.2122 0.0932 0.046 0.024 0.1935 0.1969 0.2541 0.3024 0.1449 0.154 0.2751 0.7425 0.1865 0.3589 0.1159 0.5755 0.0813 0.257 0.1733 0.5215 0.1685 0.6626 0.8984 0.0664 0.4727 0.6277 0.1686 0.4086 0.4461 0.0766 0.2045 0.2114 0.2691 0.2439 0.4968 0.4685 0.5369 0.3633 0.6382 0.0234 0.2194 0.4187 0.3416 0.1881 0.1453 0.0249 0.1938 0.3961 0.4238 0.0401 0.2984 0.1556 0.2831 0.004 0.3828 0.0411 0.1115 0.2113] 2022-08-24 02:27:17 [INFO] [EVAL] Class Precision: [0.787 0.8639 0.965 0.853 0.7552 0.8768 0.8855 0.8346 0.7196 0.7899 0.7275 0.6985 0.7661 0.4919 0.5272 0.567 0.6579 0.6971 0.7661 0.6315 0.8437 0.6793 0.7606 0.5935 0.4364 0.6717 0.5642 0.7245 0.7024 0.3981 0.4834 0.6724 0.5271 0.4478 0.4458 0.5434 0.6661 0.8055 0.3842 0.6123 0.3927 0.3556 0.5471 0.4804 0.369 0.4795 0.7096 0.6668 0.6935 0.7057 0.7728 0.4321 0.3841 0.5527 0.7178 0.7826 0.8811 0.6274 0.8394 0.6123 0.1311 0.5032 0.4746 0.6826 0.5703 0.803 0.443 0.5333 0.2186 0.5983 0.6304 0.7399 0.623 0.3283 0.6328 0.569 0.7967 0.6121 0.585 0.5967 0.8619 0.7117 0.7908 0.18 0.234 0.6926 0.4875 0.4054 0.8316 0.7256 0.5236 0.4097 0.4335 0.3203 0.2499 0.1004 0.6872 0.4555 0.3548 0.658 0.6378 0.3009 0.6594 0.7853 0.6622 0.3886 0.3358 0.7852 0.1713 0.3957 0.3743 0.7271 0.4389 0.7808 0.913 0.3209 0.6732 0.7269 0.3402 0.5367 0.6888 0.368 0.6073 0.5189 0.7704 0.6249 0.7358 0.6371 0.7643 0.5654 0.769 0.222 0.4844 0.7057 0.6875 0.3242 0.342 0.0655 0.3704 0.6144 0.6388 0.0646 0.5386 0.3881 0.5939 0.0109 0.8597 0.2127 0.2801 0.7651] 2022-08-24 02:27:17 [INFO] [EVAL] Class Recall: [0.8468 0.9025 0.9646 0.8564 0.8771 0.8556 0.855 0.935 0.6679 0.7761 0.5996 0.7637 0.8959 0.4378 0.4457 0.6349 0.6909 0.5724 0.7477 0.5828 0.8924 0.5364 0.7783 0.7699 0.5847 0.5286 0.7491 0.5557 0.624 0.4042 0.2925 0.6477 0.3873 0.586 0.5658 0.6643 0.6287 0.6429 0.4895 0.4654 0.1504 0.1812 0.5038 0.3899 0.4973 0.3352 0.4835 0.6676 0.8772 0.6901 0.6261 0.6129 0.2494 0.332 0.9059 0.3709 0.9552 0.5414 0.3226 0.4026 0.2284 0.496 0.5027 0.2439 0.7172 0.8567 0.3311 0.5916 0.15 0.4787 0.7224 0.713 0.5087 0.4865 0.645 0.5599 0.5966 0.3367 0.3201 0.2488 0.7983 0.4876 0.4506 0.1222 0.2376 0.6662 0.1364 0.1281 0.41 0.6594 0.6339 0.2226 0.2936 0.1161 0.0534 0.0306 0.2122 0.2575 0.4724 0.3588 0.1579 0.2397 0.3207 0.9317 0.2061 0.8245 0.1504 0.683 0.1339 0.423 0.2439 0.6484 0.2147 0.8139 0.9826 0.0773 0.6135 0.8213 0.2504 0.6312 0.5587 0.0882 0.2357 0.263 0.2925 0.2858 0.6047 0.6391 0.6434 0.504 0.7895 0.0254 
0.2863 0.5073 0.4045 0.3095 0.2017 0.0386 0.2889 0.5271 0.5574 0.0956 0.4009 0.2062 0.351 0.0063 0.4084 0.0485 0.1564 0.226 ] 2022-08-24 02:27:17 [INFO] [EVAL] The model with the best validation mIoU (0.3707) was saved at iter 85000. 2022-08-24 02:27:25 [INFO] [TRAIN] epoch: 68, iter: 85050/160000, loss: 0.4793, lr: 0.000567, batch_cost: 0.1642, reader_cost: 0.00415, ips: 48.7247 samples/sec | ETA 03:25:05 2022-08-24 02:27:33 [INFO] [TRAIN] epoch: 68, iter: 85100/160000, loss: 0.4648, lr: 0.000567, batch_cost: 0.1577, reader_cost: 0.00094, ips: 50.7407 samples/sec | ETA 03:16:49 2022-08-24 02:27:42 [INFO] [TRAIN] epoch: 68, iter: 85150/160000, loss: 0.4421, lr: 0.000567, batch_cost: 0.1724, reader_cost: 0.00049, ips: 46.3927 samples/sec | ETA 03:35:07 2022-08-24 02:27:51 [INFO] [TRAIN] epoch: 68, iter: 85200/160000, loss: 0.4515, lr: 0.000566, batch_cost: 0.1887, reader_cost: 0.00052, ips: 42.3863 samples/sec | ETA 03:55:17 2022-08-24 02:28:00 [INFO] [TRAIN] epoch: 68, iter: 85250/160000, loss: 0.4486, lr: 0.000566, batch_cost: 0.1755, reader_cost: 0.00061, ips: 45.5873 samples/sec | ETA 03:38:37 2022-08-24 02:28:09 [INFO] [TRAIN] epoch: 68, iter: 85300/160000, loss: 0.4726, lr: 0.000566, batch_cost: 0.1815, reader_cost: 0.00051, ips: 44.0835 samples/sec | ETA 03:45:56 2022-08-24 02:28:18 [INFO] [TRAIN] epoch: 68, iter: 85350/160000, loss: 0.5256, lr: 0.000565, batch_cost: 0.1673, reader_cost: 0.00051, ips: 47.8090 samples/sec | ETA 03:28:11 2022-08-24 02:28:26 [INFO] [TRAIN] epoch: 68, iter: 85400/160000, loss: 0.4901, lr: 0.000565, batch_cost: 0.1741, reader_cost: 0.00078, ips: 45.9574 samples/sec | ETA 03:36:25 2022-08-24 02:28:36 [INFO] [TRAIN] epoch: 68, iter: 85450/160000, loss: 0.4696, lr: 0.000564, batch_cost: 0.1958, reader_cost: 0.00066, ips: 40.8580 samples/sec | ETA 04:03:16 2022-08-24 02:28:45 [INFO] [TRAIN] epoch: 68, iter: 85500/160000, loss: 0.4989, lr: 0.000564, batch_cost: 0.1870, reader_cost: 0.00698, ips: 42.7799 samples/sec | ETA 03:52:11 2022-08-24 02:28:55 [INFO] [TRAIN] epoch: 68, iter: 85550/160000, loss: 0.4795, lr: 0.000564, batch_cost: 0.1912, reader_cost: 0.00064, ips: 41.8324 samples/sec | ETA 03:57:17 2022-08-24 02:29:05 [INFO] [TRAIN] epoch: 68, iter: 85600/160000, loss: 0.4851, lr: 0.000563, batch_cost: 0.2063, reader_cost: 0.00130, ips: 38.7741 samples/sec | ETA 04:15:50 2022-08-24 02:29:14 [INFO] [TRAIN] epoch: 68, iter: 85650/160000, loss: 0.4791, lr: 0.000563, batch_cost: 0.1833, reader_cost: 0.00044, ips: 43.6475 samples/sec | ETA 03:47:07 2022-08-24 02:29:25 [INFO] [TRAIN] epoch: 68, iter: 85700/160000, loss: 0.4887, lr: 0.000563, batch_cost: 0.2114, reader_cost: 0.00079, ips: 37.8402 samples/sec | ETA 04:21:48 2022-08-24 02:29:35 [INFO] [TRAIN] epoch: 68, iter: 85750/160000, loss: 0.4258, lr: 0.000562, batch_cost: 0.1950, reader_cost: 0.00044, ips: 41.0165 samples/sec | ETA 04:01:21 2022-08-24 02:29:46 [INFO] [TRAIN] epoch: 68, iter: 85800/160000, loss: 0.5027, lr: 0.000562, batch_cost: 0.2160, reader_cost: 0.00078, ips: 37.0336 samples/sec | ETA 04:27:08 2022-08-24 02:29:54 [INFO] [TRAIN] epoch: 68, iter: 85850/160000, loss: 0.5102, lr: 0.000561, batch_cost: 0.1753, reader_cost: 0.00069, ips: 45.6471 samples/sec | ETA 03:36:35 2022-08-24 02:30:09 [INFO] [TRAIN] epoch: 69, iter: 85900/160000, loss: 0.4787, lr: 0.000561, batch_cost: 0.2943, reader_cost: 0.09118, ips: 27.1799 samples/sec | ETA 06:03:30 2022-08-24 02:30:19 [INFO] [TRAIN] epoch: 69, iter: 85950/160000, loss: 0.4936, lr: 0.000561, batch_cost: 0.1947, reader_cost: 0.00084, ips: 41.0803 
samples/sec | ETA 04:00:20 2022-08-24 02:30:29 [INFO] [TRAIN] epoch: 69, iter: 86000/160000, loss: 0.4935, lr: 0.000560, batch_cost: 0.2083, reader_cost: 0.00044, ips: 38.4002 samples/sec | ETA 04:16:56 2022-08-24 02:30:29 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 172s - batch_cost: 0.1723 - reader cost: 9.0167e-04 2022-08-24 02:33:22 [INFO] [EVAL] #Images: 2000 mIoU: 0.3671 Acc: 0.7725 Kappa: 0.7551 Dice: 0.5050 2022-08-24 02:33:22 [INFO] [EVAL] Class IoU: [0.692 0.7914 0.9303 0.7424 0.6775 0.7694 0.7722 0.793 0.5338 0.637 0.489 0.5467 0.711 0.3346 0.2996 0.43 0.5214 0.4011 0.5954 0.4289 0.7586 0.4464 0.6201 0.4982 0.2917 0.4219 0.4693 0.4367 0.45 0.3086 0.2803 0.5078 0.2791 0.3345 0.3316 0.4041 0.4718 0.5281 0.3068 0.3544 0.1444 0.1571 0.3681 0.2546 0.2643 0.2376 0.3753 0.5191 0.6295 0.5069 0.5367 0.3586 0.1737 0.284 0.6762 0.4371 0.8723 0.3762 0.4336 0.3409 0.0612 0.5018 0.2855 0.2002 0.437 0.7084 0.213 0.3878 0.1229 0.366 0.4773 0.5442 0.3607 0.2717 0.4758 0.3796 0.5033 0.2908 0.238 0.3375 0.6551 0.4169 0.367 0.0629 0.1162 0.5133 0.1129 0.1064 0.2901 0.5319 0.4122 0.0001 0.1247 0.0804 0.0655 0.021 0.1 0.1529 0.238 0.2627 0.2106 0.1174 0.2874 0.7252 0.1854 0.3903 0.2208 0.579 0.078 0.3138 0.1582 0.5073 0.1705 0.5857 0.6324 0.0549 0.3863 0.6175 0.208 0.3927 0.4857 0.0781 0.2239 0.2089 0.2895 0.2292 0.472 0.4293 0.5687 0.3639 0.5886 0.0367 0.3232 0.3918 0.2914 0.1797 0.1552 0.0343 0.1954 0.3518 0.2801 0.0636 0.2618 0.3295 0.2976 0.0075 0.4273 0.0355 0.1163 0.1903] 2022-08-24 02:33:22 [INFO] [EVAL] Class Precision: [0.7939 0.8563 0.9648 0.8425 0.7474 0.8808 0.8599 0.8453 0.6817 0.739 0.6762 0.6948 0.7784 0.5321 0.5566 0.5909 0.7109 0.6889 0.7516 0.6422 0.8266 0.6534 0.7678 0.6194 0.4943 0.6516 0.5963 0.7448 0.6956 0.5749 0.4594 0.6381 0.5179 0.431 0.4587 0.5016 0.6738 0.7839 0.5048 0.619 0.3018 0.3609 0.7057 0.5299 0.4024 0.5217 0.8523 0.7658 0.7613 0.6226 0.7295 0.4671 0.4078 0.6638 0.7266 0.5765 0.9148 0.7308 0.6999 0.7645 0.0948 0.6496 0.3885 0.677 0.5046 0.811 0.3233 0.5461 0.2761 0.6185 0.7494 0.8022 0.6258 0.3046 0.6809 0.5811 0.7663 0.5535 0.6064 0.4438 0.7833 0.708 0.8149 0.1043 0.2057 0.7281 0.3918 0.4414 0.6444 0.6954 0.5784 0.0003 0.3642 0.2822 0.1676 0.0922 0.697 0.4865 0.3222 0.8838 0.6111 0.1635 0.5537 0.8151 0.6157 0.4201 0.4365 0.8575 0.1364 0.4628 0.4377 0.7731 0.6004 0.665 0.6404 0.3246 0.643 0.7121 0.2922 0.5871 0.7103 0.2596 0.5204 0.4971 0.7003 0.6134 0.8121 0.5258 0.7637 0.7031 0.7226 0.1928 0.6092 0.8488 0.7757 0.4063 0.3652 0.0605 0.4309 0.6692 0.6815 0.117 0.527 0.6473 0.611 0.017 0.7891 0.2602 0.3342 0.7903] 2022-08-24 02:33:22 [INFO] [EVAL] Class Recall: [0.8435 0.9126 0.963 0.8621 0.8787 0.8588 0.8834 0.9276 0.7111 0.8219 0.6386 0.7194 0.8914 0.474 0.3935 0.6123 0.6617 0.4899 0.7412 0.5635 0.9022 0.585 0.7632 0.7179 0.4157 0.5448 0.6877 0.5135 0.5604 0.3998 0.4182 0.7132 0.377 0.5989 0.5447 0.6752 0.6114 0.618 0.4388 0.4533 0.2168 0.2176 0.4349 0.3288 0.4352 0.3038 0.4014 0.617 0.7843 0.7316 0.6701 0.6069 0.2323 0.3317 0.9068 0.6439 0.9495 0.4367 0.5326 0.3808 0.1473 0.6881 0.5184 0.2213 0.7655 0.8485 0.3845 0.5721 0.1813 0.4727 0.5679 0.6285 0.4598 0.7158 0.6124 0.5225 0.5946 0.38 0.2814 0.5848 0.8001 0.5034 0.4004 0.137 0.2109 0.6351 0.1369 0.1229 0.3453 0.6935 0.5891 0.0002 0.1594 0.101 0.0971 0.0265 0.1045 0.1824 0.4764 0.2721 0.2432 0.2943 0.374 0.868 0.2097 0.8464 0.3088 0.6406 0.1542 0.4936 0.1986 0.596 0.1923 0.831 0.9806 0.062 0.4918 0.8229 0.4189 0.5426 0.6058 0.1006 0.2822 0.2649 0.3304 0.2679 0.5298 0.7005 0.6902 
0.43 0.7605 0.0434 0.4078 0.4212 0.3182 0.2437 0.2126 0.0735 0.2633 0.4258 0.3223 0.1221 0.3422 0.4016 0.3671 0.0132 0.4825 0.0395 0.1515 0.2004] 2022-08-24 02:33:22 [INFO] [EVAL] The model with the best validation mIoU (0.3707) was saved at iter 85000. 2022-08-24 02:33:31 [INFO] [TRAIN] epoch: 69, iter: 86050/160000, loss: 0.4618, lr: 0.000560, batch_cost: 0.1829, reader_cost: 0.00383, ips: 43.7360 samples/sec | ETA 03:45:26 2022-08-24 02:33:40 [INFO] [TRAIN] epoch: 69, iter: 86100/160000, loss: 0.4999, lr: 0.000560, batch_cost: 0.1879, reader_cost: 0.00123, ips: 42.5705 samples/sec | ETA 03:51:27 2022-08-24 02:33:50 [INFO] [TRAIN] epoch: 69, iter: 86150/160000, loss: 0.4677, lr: 0.000559, batch_cost: 0.1894, reader_cost: 0.00059, ips: 42.2347 samples/sec | ETA 03:53:08 2022-08-24 02:33:58 [INFO] [TRAIN] epoch: 69, iter: 86200/160000, loss: 0.4707, lr: 0.000559, batch_cost: 0.1676, reader_cost: 0.00102, ips: 47.7365 samples/sec | ETA 03:26:07 2022-08-24 02:34:07 [INFO] [TRAIN] epoch: 69, iter: 86250/160000, loss: 0.5027, lr: 0.000558, batch_cost: 0.1692, reader_cost: 0.00043, ips: 47.2704 samples/sec | ETA 03:28:01 2022-08-24 02:34:16 [INFO] [TRAIN] epoch: 69, iter: 86300/160000, loss: 0.4752, lr: 0.000558, batch_cost: 0.1792, reader_cost: 0.00071, ips: 44.6313 samples/sec | ETA 03:40:10 2022-08-24 02:34:26 [INFO] [TRAIN] epoch: 69, iter: 86350/160000, loss: 0.4532, lr: 0.000558, batch_cost: 0.2109, reader_cost: 0.00044, ips: 37.9267 samples/sec | ETA 04:18:55 2022-08-24 02:34:37 [INFO] [TRAIN] epoch: 69, iter: 86400/160000, loss: 0.4810, lr: 0.000557, batch_cost: 0.2140, reader_cost: 0.00075, ips: 37.3827 samples/sec | ETA 04:22:30 2022-08-24 02:34:48 [INFO] [TRAIN] epoch: 69, iter: 86450/160000, loss: 0.4973, lr: 0.000557, batch_cost: 0.2176, reader_cost: 0.00058, ips: 36.7620 samples/sec | ETA 04:26:45 2022-08-24 02:34:58 [INFO] [TRAIN] epoch: 69, iter: 86500/160000, loss: 0.4947, lr: 0.000556, batch_cost: 0.2059, reader_cost: 0.00057, ips: 38.8524 samples/sec | ETA 04:12:14 2022-08-24 02:35:07 [INFO] [TRAIN] epoch: 69, iter: 86550/160000, loss: 0.5211, lr: 0.000556, batch_cost: 0.1790, reader_cost: 0.00058, ips: 44.7015 samples/sec | ETA 03:39:04 2022-08-24 02:35:16 [INFO] [TRAIN] epoch: 69, iter: 86600/160000, loss: 0.5124, lr: 0.000556, batch_cost: 0.1852, reader_cost: 0.00048, ips: 43.1907 samples/sec | ETA 03:46:35 2022-08-24 02:35:27 [INFO] [TRAIN] epoch: 69, iter: 86650/160000, loss: 0.4602, lr: 0.000555, batch_cost: 0.2076, reader_cost: 0.00065, ips: 38.5357 samples/sec | ETA 04:13:47 2022-08-24 02:35:37 [INFO] [TRAIN] epoch: 69, iter: 86700/160000, loss: 0.4712, lr: 0.000555, batch_cost: 0.1982, reader_cost: 0.00104, ips: 40.3734 samples/sec | ETA 04:02:04 2022-08-24 02:35:47 [INFO] [TRAIN] epoch: 69, iter: 86750/160000, loss: 0.5395, lr: 0.000555, batch_cost: 0.2081, reader_cost: 0.00059, ips: 38.4479 samples/sec | ETA 04:14:01 2022-08-24 02:35:57 [INFO] [TRAIN] epoch: 69, iter: 86800/160000, loss: 0.5066, lr: 0.000554, batch_cost: 0.1906, reader_cost: 0.00040, ips: 41.9621 samples/sec | ETA 03:52:35 2022-08-24 02:36:07 [INFO] [TRAIN] epoch: 69, iter: 86850/160000, loss: 0.5245, lr: 0.000554, batch_cost: 0.2155, reader_cost: 0.00066, ips: 37.1314 samples/sec | ETA 04:22:40 2022-08-24 02:36:17 [INFO] [TRAIN] epoch: 69, iter: 86900/160000, loss: 0.5055, lr: 0.000553, batch_cost: 0.1978, reader_cost: 0.00077, ips: 40.4460 samples/sec | ETA 04:00:58 2022-08-24 02:36:27 [INFO] [TRAIN] epoch: 69, iter: 86950/160000, loss: 0.4927, lr: 0.000553, batch_cost: 0.1856, reader_cost: 
0.00057, ips: 43.0989 samples/sec | ETA 03:45:59 2022-08-24 02:36:35 [INFO] [TRAIN] epoch: 69, iter: 87000/160000, loss: 0.4849, lr: 0.000553, batch_cost: 0.1787, reader_cost: 0.00786, ips: 44.7774 samples/sec | ETA 03:37:22 2022-08-24 02:36:35 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 145s - batch_cost: 0.1454 - reader cost: 6.3062e-04 2022-08-24 02:39:01 [INFO] [EVAL] #Images: 2000 mIoU: 0.3670 Acc: 0.7730 Kappa: 0.7560 Dice: 0.5054 2022-08-24 02:39:01 [INFO] [EVAL] Class IoU: [0.6872 0.7847 0.9278 0.7445 0.688 0.7701 0.7712 0.7987 0.529 0.6658 0.4922 0.5689 0.7092 0.3585 0.3105 0.4336 0.4941 0.4493 0.6022 0.4409 0.7709 0.4639 0.6102 0.4993 0.3085 0.471 0.4758 0.4361 0.48 0.2966 0.2555 0.545 0.2973 0.3497 0.332 0.4227 0.4716 0.5476 0.2977 0.3745 0.1615 0.1213 0.3608 0.2652 0.273 0.2196 0.3642 0.5294 0.5832 0.5178 0.5709 0.3244 0.1627 0.2619 0.6794 0.4342 0.8707 0.4001 0.3361 0.3077 0.0903 0.2551 0.2751 0.1567 0.4483 0.7166 0.2713 0.3644 0.105 0.3703 0.4763 0.6006 0.3965 0.2708 0.4736 0.3635 0.3565 0.2678 0.2408 0.3703 0.6326 0.4033 0.4391 0.0241 0.0457 0.51 0.107 0.1336 0.3656 0.5233 0.45 0.0654 0.2126 0.0711 0.0574 0.0163 0.1789 0.2301 0.2385 0.2824 0.1704 0.1379 0.3343 0.5871 0.1764 0.3723 0.175 0.5886 0.0842 0.3568 0.17 0.5027 0.1645 0.5018 0.6387 0.0443 0.4282 0.6093 0.1958 0.3366 0.4368 0.0579 0.3485 0.2482 0.2969 0.193 0.5283 0.4575 0.5146 0.3373 0.5286 0.0204 0.2791 0.4144 0.2837 0.1898 0.1442 0.0294 0.2139 0.3543 0.2427 0.0984 0.214 0.4398 0.2728 0.0123 0.4079 0.0367 0.1224 0.2171] 2022-08-24 02:39:01 [INFO] [EVAL] Class Precision: [0.8004 0.8575 0.9628 0.8416 0.7773 0.854 0.8811 0.8499 0.6642 0.7929 0.7084 0.6987 0.7893 0.5473 0.5076 0.5675 0.6384 0.7135 0.7488 0.6021 0.8481 0.6333 0.7311 0.6146 0.4382 0.6016 0.6243 0.8325 0.6917 0.4852 0.4906 0.7233 0.6129 0.4864 0.4674 0.5799 0.6669 0.7564 0.4444 0.6113 0.3219 0.4144 0.5622 0.4908 0.3872 0.5453 0.7605 0.7361 0.7701 0.6418 0.7368 0.3873 0.3601 0.7148 0.7261 0.6905 0.9202 0.6706 0.769 0.6142 0.1471 0.4906 0.3307 0.7331 0.5464 0.8081 0.5178 0.5108 0.1812 0.6193 0.7497 0.8068 0.5518 0.3503 0.6703 0.5843 0.7276 0.5547 0.4938 0.585 0.7371 0.7475 0.7532 0.0813 0.094 0.6872 0.5189 0.4077 0.7295 0.6821 0.6681 0.0801 0.4679 0.2862 0.1287 0.0791 0.3601 0.372 0.3398 0.6466 0.6533 0.2397 0.6339 0.6318 0.5192 0.3933 0.3764 0.7777 0.1661 0.4941 0.4324 0.6981 0.5557 0.6659 0.645 0.4516 0.6873 0.6757 0.5122 0.4119 0.6712 0.2768 0.6283 0.5478 0.7038 0.6292 0.7997 0.6242 0.713 0.5226 0.6854 0.1338 0.4236 0.7802 0.7693 0.4187 0.3822 0.1024 0.563 0.669 0.4935 0.2354 0.6228 0.6031 0.5774 0.0174 0.7452 0.2735 0.2539 0.7539] 2022-08-24 02:39:01 [INFO] [EVAL] Class Recall: [0.8293 0.9024 0.9623 0.8658 0.8569 0.8869 0.8607 0.9299 0.7222 0.806 0.6173 0.7539 0.8749 0.5097 0.4443 0.6476 0.6862 0.5482 0.7546 0.6222 0.8943 0.6343 0.7867 0.7269 0.5103 0.6846 0.6667 0.478 0.6106 0.4327 0.3477 0.6885 0.366 0.5544 0.5339 0.6092 0.617 0.6648 0.474 0.4916 0.2447 0.1464 0.5018 0.3658 0.4808 0.2689 0.4114 0.6534 0.7061 0.7282 0.7171 0.6662 0.2288 0.2925 0.9136 0.5392 0.9418 0.498 0.3739 0.3813 0.1897 0.3471 0.6207 0.1662 0.714 0.8636 0.3631 0.5599 0.1998 0.4795 0.5664 0.7015 0.585 0.5441 0.6175 0.4904 0.4114 0.3411 0.3198 0.5023 0.8169 0.4669 0.5129 0.0331 0.0818 0.6642 0.1188 0.1657 0.4229 0.6921 0.5795 0.263 0.2804 0.0865 0.0938 0.0202 0.2622 0.3763 0.4446 0.334 0.1874 0.2452 0.4143 0.8923 0.2108 0.8745 0.2465 0.7077 0.1458 0.5623 0.2189 0.6424 0.1895 0.6707 0.9851 0.0468 0.5317 0.861 0.2406 0.6481 0.5557 0.0682 0.439 0.3121 
0.3394 0.2178 0.6088 0.6315 0.6491 0.4875 0.698 0.0236 0.45 0.4691 0.3101 0.2578 0.1881 0.0395 0.2564 0.4296 0.3232 0.1446 0.2459 0.6188 0.3409 0.0407 0.474 0.0406 0.1911 0.2336] 2022-08-24 02:39:01 [INFO] [EVAL] The model with the best validation mIoU (0.3707) was saved at iter 85000. 2022-08-24 02:39:10 [INFO] [TRAIN] epoch: 69, iter: 87050/160000, loss: 0.4768, lr: 0.000552, batch_cost: 0.1739, reader_cost: 0.00456, ips: 46.0162 samples/sec | ETA 03:31:22 2022-08-24 02:39:19 [INFO] [TRAIN] epoch: 69, iter: 87100/160000, loss: 0.4727, lr: 0.000552, batch_cost: 0.1754, reader_cost: 0.00122, ips: 45.5997 samples/sec | ETA 03:33:09 2022-08-24 02:39:30 [INFO] [TRAIN] epoch: 70, iter: 87150/160000, loss: 0.4773, lr: 0.000552, batch_cost: 0.2306, reader_cost: 0.07134, ips: 34.6923 samples/sec | ETA 04:39:59 2022-08-24 02:39:39 [INFO] [TRAIN] epoch: 70, iter: 87200/160000, loss: 0.4870, lr: 0.000551, batch_cost: 0.1706, reader_cost: 0.00066, ips: 46.9011 samples/sec | ETA 03:26:57 2022-08-24 02:39:47 [INFO] [TRAIN] epoch: 70, iter: 87250/160000, loss: 0.4694, lr: 0.000551, batch_cost: 0.1636, reader_cost: 0.00106, ips: 48.8935 samples/sec | ETA 03:18:23 2022-08-24 02:39:55 [INFO] [TRAIN] epoch: 70, iter: 87300/160000, loss: 0.4853, lr: 0.000550, batch_cost: 0.1602, reader_cost: 0.00144, ips: 49.9351 samples/sec | ETA 03:14:07 2022-08-24 02:40:04 [INFO] [TRAIN] epoch: 70, iter: 87350/160000, loss: 0.5000, lr: 0.000550, batch_cost: 0.1801, reader_cost: 0.00147, ips: 44.4119 samples/sec | ETA 03:38:06 2022-08-24 02:40:14 [INFO] [TRAIN] epoch: 70, iter: 87400/160000, loss: 0.4967, lr: 0.000550, batch_cost: 0.2005, reader_cost: 0.00038, ips: 39.9043 samples/sec | ETA 04:02:34 2022-08-24 02:40:23 [INFO] [TRAIN] epoch: 70, iter: 87450/160000, loss: 0.4801, lr: 0.000549, batch_cost: 0.1718, reader_cost: 0.00080, ips: 46.5687 samples/sec | ETA 03:27:43 2022-08-24 02:40:32 [INFO] [TRAIN] epoch: 70, iter: 87500/160000, loss: 0.4855, lr: 0.000549, batch_cost: 0.1962, reader_cost: 0.00045, ips: 40.7839 samples/sec | ETA 03:57:01 2022-08-24 02:40:42 [INFO] [TRAIN] epoch: 70, iter: 87550/160000, loss: 0.4797, lr: 0.000549, batch_cost: 0.2014, reader_cost: 0.00110, ips: 39.7189 samples/sec | ETA 04:03:12 2022-08-24 02:40:53 [INFO] [TRAIN] epoch: 70, iter: 87600/160000, loss: 0.4914, lr: 0.000548, batch_cost: 0.2149, reader_cost: 0.00043, ips: 37.2251 samples/sec | ETA 04:19:19 2022-08-24 02:41:04 [INFO] [TRAIN] epoch: 70, iter: 87650/160000, loss: 0.4337, lr: 0.000548, batch_cost: 0.2105, reader_cost: 0.00055, ips: 38.0038 samples/sec | ETA 04:13:50 2022-08-24 02:41:14 [INFO] [TRAIN] epoch: 70, iter: 87700/160000, loss: 0.4990, lr: 0.000547, batch_cost: 0.2090, reader_cost: 0.00091, ips: 38.2769 samples/sec | ETA 04:11:50 2022-08-24 02:41:24 [INFO] [TRAIN] epoch: 70, iter: 87750/160000, loss: 0.4682, lr: 0.000547, batch_cost: 0.1998, reader_cost: 0.00035, ips: 40.0412 samples/sec | ETA 04:00:35 2022-08-24 02:41:36 [INFO] [TRAIN] epoch: 70, iter: 87800/160000, loss: 0.4852, lr: 0.000547, batch_cost: 0.2318, reader_cost: 0.00056, ips: 34.5157 samples/sec | ETA 04:38:54 2022-08-24 02:41:45 [INFO] [TRAIN] epoch: 70, iter: 87850/160000, loss: 0.4649, lr: 0.000546, batch_cost: 0.1862, reader_cost: 0.00054, ips: 42.9581 samples/sec | ETA 03:43:56 2022-08-24 02:41:54 [INFO] [TRAIN] epoch: 70, iter: 87900/160000, loss: 0.4829, lr: 0.000546, batch_cost: 0.1819, reader_cost: 0.00052, ips: 43.9746 samples/sec | ETA 03:38:36 2022-08-24 02:42:04 [INFO] [TRAIN] epoch: 70, iter: 87950/160000, loss: 0.4845, lr: 0.000545, 
batch_cost: 0.2050, reader_cost: 0.00057, ips: 39.0328 samples/sec | ETA 04:06:07 2022-08-24 02:42:14 [INFO] [TRAIN] epoch: 70, iter: 88000/160000, loss: 0.5118, lr: 0.000545, batch_cost: 0.1923, reader_cost: 0.00067, ips: 41.5947 samples/sec | ETA 03:50:47 2022-08-24 02:42:14 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 181s - batch_cost: 0.1808 - reader cost: 6.9810e-04 2022-08-24 02:45:15 [INFO] [EVAL] #Images: 2000 mIoU: 0.3654 Acc: 0.7690 Kappa: 0.7516 Dice: 0.5044 2022-08-24 02:45:15 [INFO] [EVAL] Class IoU: [0.6895 0.7852 0.9295 0.7424 0.6941 0.7759 0.7456 0.8039 0.5305 0.6355 0.4937 0.5725 0.713 0.3134 0.2933 0.4369 0.5138 0.4419 0.5993 0.4195 0.7638 0.4318 0.612 0.5114 0.323 0.3995 0.4542 0.4472 0.4836 0.2608 0.3075 0.5269 0.2801 0.3387 0.3375 0.3979 0.4644 0.568 0.2879 0.3463 0.1409 0.151 0.3578 0.2627 0.2604 0.1804 0.3431 0.5358 0.6199 0.5046 0.555 0.3238 0.1814 0.2568 0.6743 0.3842 0.8803 0.4179 0.2469 0.3216 0.0803 0.2173 0.2934 0.1367 0.4925 0.7384 0.2604 0.3707 0.0553 0.3287 0.4965 0.5792 0.3736 0.2936 0.4681 0.3672 0.512 0.2794 0.3552 0.3125 0.6652 0.3973 0.3484 0.1056 0.1072 0.5133 0.1236 0.099 0.2879 0.4934 0.4004 0.0681 0.1854 0.1082 0.078 0.0551 0.1177 0.1982 0.2306 0.2739 0.2876 0.1542 0.2911 0.6479 0.1906 0.3713 0.1805 0.5751 0.0832 0.3936 0.1986 0.3294 0.1695 0.5827 0.5997 0.052 0.397 0.6562 0.2285 0.3865 0.4219 0.0632 0.3375 0.1706 0.294 0.2352 0.5166 0.4307 0.5453 0.3439 0.4988 0.0593 0.2758 0.3798 0.3195 0.1816 0.1566 0.0205 0.1998 0.3697 0.3169 0.0471 0.2444 0.344 0.3149 0.0177 0.4249 0.0343 0.1364 0.2015] 2022-08-24 02:45:15 [INFO] [EVAL] Class Precision: [0.7962 0.8548 0.9612 0.8426 0.7896 0.867 0.9007 0.8619 0.6748 0.8348 0.7088 0.691 0.7975 0.4859 0.4848 0.5891 0.6835 0.6633 0.7511 0.6617 0.8237 0.6008 0.7581 0.6476 0.4301 0.5914 0.5508 0.6626 0.674 0.4882 0.4513 0.6673 0.6093 0.4371 0.402 0.5315 0.7085 0.8487 0.4059 0.6488 0.2591 0.3857 0.6077 0.5037 0.3922 0.5066 0.4904 0.7192 0.7248 0.6085 0.761 0.3902 0.3984 0.6386 0.7327 0.6147 0.9307 0.6576 0.7102 0.5897 0.1195 0.4335 0.4716 0.7692 0.6618 0.8757 0.4057 0.4921 0.1171 0.4908 0.7167 0.7284 0.4743 0.3746 0.7294 0.5702 0.6844 0.4671 0.6423 0.5163 0.7837 0.7906 0.8459 0.146 0.1859 0.6684 0.5192 0.5195 0.6708 0.6052 0.5274 0.0929 0.4209 0.354 0.0908 0.1299 0.4525 0.4507 0.3506 0.6968 0.5764 0.2863 0.6144 0.828 0.7919 0.3937 0.456 0.7618 0.1552 0.4797 0.4215 0.3985 0.5136 0.7744 0.6045 0.2377 0.7705 0.7627 0.3787 0.5202 0.6503 0.3399 0.5529 0.7611 0.7758 0.6639 0.8142 0.502 0.7909 0.5314 0.6906 0.3735 0.6179 0.8404 0.6516 0.3673 0.3357 0.0743 0.447 0.6225 0.5127 0.0801 0.59 0.5199 0.5704 0.025 0.8587 0.2423 0.3929 0.8741] 2022-08-24 02:45:15 [INFO] [EVAL] Class Recall: [0.8373 0.906 0.9657 0.8619 0.8517 0.8807 0.8123 0.9227 0.7126 0.7269 0.6193 0.7696 0.8706 0.4689 0.4262 0.6284 0.6741 0.5697 0.7477 0.5341 0.913 0.6054 0.7606 0.7086 0.5646 0.5518 0.7215 0.579 0.6313 0.3589 0.4911 0.7146 0.3414 0.6008 0.6779 0.6128 0.5741 0.632 0.4976 0.4262 0.2359 0.1989 0.4652 0.3545 0.4367 0.2188 0.5333 0.6775 0.8107 0.7471 0.6722 0.6555 0.2499 0.3005 0.8942 0.5061 0.942 0.5341 0.2745 0.4143 0.1969 0.3034 0.437 0.1426 0.6582 0.8248 0.4211 0.6003 0.0948 0.4988 0.6177 0.7388 0.6378 0.576 0.5665 0.5078 0.6702 0.4103 0.4427 0.4419 0.8148 0.444 0.372 0.2762 0.2021 0.6887 0.1395 0.109 0.3352 0.7276 0.6244 0.2033 0.2488 0.1349 0.3565 0.0874 0.1372 0.2613 0.4025 0.3109 0.3646 0.2506 0.3561 0.7487 0.2007 0.8669 0.23 0.7012 0.152 0.6868 0.273 0.6554 0.2019 0.7019 0.9871 0.0624 0.4502 0.8246 0.3655 
0.6007 0.5457 0.0721 0.4643 0.1802 0.3213 0.267 0.5856 0.752 0.6371 0.4937 0.6423 0.0659 0.3325 0.4093 0.3853 0.2642 0.2269 0.0275 0.2654 0.4765 0.4535 0.1027 0.2944 0.5042 0.4128 0.0571 0.4568 0.0385 0.1728 0.2076] 2022-08-24 02:45:15 [INFO] [EVAL] The model with the best validation mIoU (0.3707) was saved at iter 85000. 2022-08-24 02:45:24 [INFO] [TRAIN] epoch: 70, iter: 88050/160000, loss: 0.4893, lr: 0.000545, batch_cost: 0.1689, reader_cost: 0.00338, ips: 47.3559 samples/sec | ETA 03:22:34 2022-08-24 02:45:32 [INFO] [TRAIN] epoch: 70, iter: 88100/160000, loss: 0.4900, lr: 0.000544, batch_cost: 0.1569, reader_cost: 0.00181, ips: 50.9934 samples/sec | ETA 03:07:59 2022-08-24 02:45:40 [INFO] [TRAIN] epoch: 70, iter: 88150/160000, loss: 0.4850, lr: 0.000544, batch_cost: 0.1553, reader_cost: 0.00060, ips: 51.4976 samples/sec | ETA 03:06:01 2022-08-24 02:45:48 [INFO] [TRAIN] epoch: 70, iter: 88200/160000, loss: 0.5082, lr: 0.000544, batch_cost: 0.1743, reader_cost: 0.00064, ips: 45.9077 samples/sec | ETA 03:28:32 2022-08-24 02:45:57 [INFO] [TRAIN] epoch: 70, iter: 88250/160000, loss: 0.5221, lr: 0.000543, batch_cost: 0.1832, reader_cost: 0.00063, ips: 43.6799 samples/sec | ETA 03:39:01 2022-08-24 02:46:07 [INFO] [TRAIN] epoch: 70, iter: 88300/160000, loss: 0.4842, lr: 0.000543, batch_cost: 0.1984, reader_cost: 0.00051, ips: 40.3245 samples/sec | ETA 03:57:04 2022-08-24 02:46:18 [INFO] [TRAIN] epoch: 70, iter: 88350/160000, loss: 0.4636, lr: 0.000542, batch_cost: 0.2070, reader_cost: 0.00089, ips: 38.6472 samples/sec | ETA 04:07:11 2022-08-24 02:46:28 [INFO] [TRAIN] epoch: 70, iter: 88400/160000, loss: 0.4925, lr: 0.000542, batch_cost: 0.2086, reader_cost: 0.00042, ips: 38.3473 samples/sec | ETA 04:08:57 2022-08-24 02:46:45 [INFO] [TRAIN] epoch: 71, iter: 88450/160000, loss: 0.4842, lr: 0.000542, batch_cost: 0.3317, reader_cost: 0.14328, ips: 24.1148 samples/sec | ETA 06:35:36 2022-08-24 02:46:56 [INFO] [TRAIN] epoch: 71, iter: 88500/160000, loss: 0.4858, lr: 0.000541, batch_cost: 0.2208, reader_cost: 0.00062, ips: 36.2359 samples/sec | ETA 04:23:05 2022-08-24 02:47:06 [INFO] [TRAIN] epoch: 71, iter: 88550/160000, loss: 0.4931, lr: 0.000541, batch_cost: 0.2005, reader_cost: 0.00043, ips: 39.8961 samples/sec | ETA 03:58:47 2022-08-24 02:47:17 [INFO] [TRAIN] epoch: 71, iter: 88600/160000, loss: 0.4748, lr: 0.000541, batch_cost: 0.2257, reader_cost: 0.00035, ips: 35.4384 samples/sec | ETA 04:28:38 2022-08-24 02:47:27 [INFO] [TRAIN] epoch: 71, iter: 88650/160000, loss: 0.4569, lr: 0.000540, batch_cost: 0.2042, reader_cost: 0.00087, ips: 39.1847 samples/sec | ETA 04:02:46 2022-08-24 02:47:37 [INFO] [TRAIN] epoch: 71, iter: 88700/160000, loss: 0.4885, lr: 0.000540, batch_cost: 0.1928, reader_cost: 0.00074, ips: 41.4906 samples/sec | ETA 03:49:07 2022-08-24 02:47:47 [INFO] [TRAIN] epoch: 71, iter: 88750/160000, loss: 0.4732, lr: 0.000539, batch_cost: 0.2008, reader_cost: 0.00484, ips: 39.8313 samples/sec | ETA 03:58:30 2022-08-24 02:47:57 [INFO] [TRAIN] epoch: 71, iter: 88800/160000, loss: 0.4610, lr: 0.000539, batch_cost: 0.2054, reader_cost: 0.00034, ips: 38.9536 samples/sec | ETA 04:03:42 2022-08-24 02:48:07 [INFO] [TRAIN] epoch: 71, iter: 88850/160000, loss: 0.4937, lr: 0.000539, batch_cost: 0.1908, reader_cost: 0.00476, ips: 41.9321 samples/sec | ETA 03:46:14 2022-08-24 02:48:18 [INFO] [TRAIN] epoch: 71, iter: 88900/160000, loss: 0.4933, lr: 0.000538, batch_cost: 0.2150, reader_cost: 0.00075, ips: 37.2030 samples/sec | ETA 04:14:49 2022-08-24 02:48:27 [INFO] [TRAIN] epoch: 71, iter: 
88950/160000, loss: 0.4851, lr: 0.000538, batch_cost: 0.1963, reader_cost: 0.00043, ips: 40.7605 samples/sec | ETA 03:52:24 2022-08-24 02:48:37 [INFO] [TRAIN] epoch: 71, iter: 89000/160000, loss: 0.4807, lr: 0.000538, batch_cost: 0.1938, reader_cost: 0.00980, ips: 41.2697 samples/sec | ETA 03:49:23 2022-08-24 02:48:37 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 162s - batch_cost: 0.1617 - reader cost: 5.4860e-04 2022-08-24 02:51:19 [INFO] [EVAL] #Images: 2000 mIoU: 0.3701 Acc: 0.7738 Kappa: 0.7564 Dice: 0.5072 2022-08-24 02:51:19 [INFO] [EVAL] Class IoU: [0.6893 0.7887 0.9313 0.7398 0.6951 0.7669 0.7793 0.8058 0.5316 0.637 0.4919 0.569 0.7071 0.2827 0.3048 0.4345 0.4985 0.4522 0.6029 0.4321 0.7882 0.452 0.6107 0.5052 0.3252 0.4775 0.4504 0.4222 0.4749 0.2382 0.228 0.5398 0.2933 0.3522 0.3474 0.3876 0.4577 0.5624 0.3043 0.3794 0.1384 0.1539 0.3461 0.273 0.2935 0.2283 0.2941 0.5199 0.6134 0.5201 0.5912 0.3421 0.1949 0.2822 0.6755 0.3785 0.8756 0.4293 0.4818 0.3198 0.0937 0.2746 0.3078 0.1518 0.458 0.7334 0.2544 0.3795 0.1144 0.3391 0.5104 0.5621 0.3915 0.2587 0.4809 0.3849 0.5232 0.261 0.2422 0.3253 0.6949 0.4213 0.3631 0.0371 0.0901 0.5136 0.1152 0.1206 0.304 0.5131 0.4574 0.1272 0.1358 0.0818 0.048 0.0385 0.1048 0.2001 0.238 0.3183 0.2903 0.1391 0.307 0.8073 0.19 0.3833 0.1718 0.569 0.0534 0.4488 0.184 0.4144 0.1871 0.6396 0.624 0.0608 0.463 0.6281 0.2625 0.3967 0.4091 0.0737 0.2525 0.1765 0.2624 0.2339 0.5179 0.4952 0.4689 0.335 0.5538 0.0329 0.2703 0.4288 0.332 0.2013 0.1517 0.0197 0.1756 0.3876 0.1329 0.0086 0.2421 0.3877 0.3293 0. 0.4223 0.0384 0.0957 0.1984] 2022-08-24 02:51:19 [INFO] [EVAL] Class Precision: [0.7794 0.8646 0.9646 0.8316 0.7815 0.874 0.8795 0.8639 0.6691 0.7125 0.7125 0.704 0.7756 0.5798 0.5662 0.6301 0.6819 0.69 0.7653 0.6575 0.8703 0.6243 0.7244 0.6016 0.4524 0.6111 0.5986 0.7095 0.7345 0.3593 0.4763 0.7264 0.5825 0.4804 0.4679 0.5365 0.7243 0.7795 0.4522 0.5901 0.2908 0.4101 0.6002 0.5347 0.375 0.4538 0.609 0.6908 0.6999 0.6718 0.7755 0.4506 0.3997 0.7184 0.7266 0.6574 0.9146 0.6622 0.7386 0.6854 0.1724 0.5694 0.4871 0.5966 0.538 0.8837 0.4953 0.5087 0.2042 0.715 0.715 0.7801 0.5856 0.2991 0.7272 0.5793 0.7826 0.5593 0.6423 0.519 0.8474 0.6333 0.8313 0.1303 0.1564 0.6786 0.4427 0.4302 0.7192 0.7477 0.6964 0.1626 0.345 0.321 0.1829 0.1281 0.3537 0.4064 0.3178 0.6598 0.7617 0.2156 0.5816 0.8539 0.6506 0.4156 0.3895 0.8158 0.1446 0.5257 0.5274 0.5765 0.6303 0.6968 0.6349 0.2887 0.6232 0.7267 0.3585 0.5276 0.6911 0.3788 0.5939 0.7242 0.8448 0.6758 0.7469 0.6755 0.6506 0.4779 0.6361 0.2965 0.662 0.7655 0.6 0.378 0.3276 0.0398 0.4887 0.608 0.576 0.0335 0.5531 0.5333 0.5875 0. 
0.8512 0.1721 0.3752 0.8167] 2022-08-24 02:51:19 [INFO] [EVAL] Class Recall: [0.8564 0.8998 0.9642 0.8701 0.8628 0.8622 0.8724 0.9231 0.7212 0.8574 0.6137 0.7478 0.889 0.3556 0.3977 0.5833 0.6496 0.5675 0.7396 0.5577 0.8932 0.6209 0.7956 0.7592 0.5362 0.6859 0.6453 0.5104 0.5733 0.4141 0.3042 0.6776 0.3714 0.5688 0.5744 0.5827 0.5543 0.6687 0.4818 0.5152 0.2089 0.1976 0.4498 0.358 0.5743 0.3148 0.3626 0.6776 0.8323 0.6972 0.7132 0.5869 0.2755 0.3173 0.9057 0.4714 0.9536 0.5498 0.5808 0.3749 0.1702 0.3466 0.4554 0.1691 0.755 0.8118 0.3434 0.5992 0.2063 0.3921 0.6408 0.6679 0.5416 0.6571 0.5868 0.5342 0.6121 0.3287 0.2799 0.4656 0.7942 0.5572 0.392 0.0493 0.1753 0.6786 0.1347 0.1436 0.345 0.6205 0.5713 0.3691 0.1829 0.099 0.0611 0.0521 0.1296 0.2828 0.4867 0.3808 0.3193 0.2818 0.394 0.9367 0.2115 0.8314 0.2352 0.6529 0.078 0.7542 0.2203 0.5957 0.2102 0.8862 0.9733 0.0715 0.6431 0.8224 0.4952 0.6151 0.5007 0.0839 0.3052 0.1893 0.2757 0.2635 0.6281 0.6498 0.6267 0.5284 0.8106 0.0357 0.3136 0.4936 0.4265 0.301 0.2203 0.0376 0.2151 0.5167 0.1473 0.0114 0.3009 0.5867 0.4283 0. 0.456 0.047 0.1139 0.2076] 2022-08-24 02:51:19 [INFO] [EVAL] The model with the best validation mIoU (0.3707) was saved at iter 85000. 2022-08-24 02:51:28 [INFO] [TRAIN] epoch: 71, iter: 89050/160000, loss: 0.4715, lr: 0.000537, batch_cost: 0.1811, reader_cost: 0.00381, ips: 44.1673 samples/sec | ETA 03:34:11 2022-08-24 02:51:38 [INFO] [TRAIN] epoch: 71, iter: 89100/160000, loss: 0.4733, lr: 0.000537, batch_cost: 0.1874, reader_cost: 0.00095, ips: 42.6800 samples/sec | ETA 03:41:29 2022-08-24 02:51:47 [INFO] [TRAIN] epoch: 71, iter: 89150/160000, loss: 0.4673, lr: 0.000536, batch_cost: 0.1766, reader_cost: 0.00107, ips: 45.3008 samples/sec | ETA 03:28:31 2022-08-24 02:51:56 [INFO] [TRAIN] epoch: 71, iter: 89200/160000, loss: 0.5076, lr: 0.000536, batch_cost: 0.1928, reader_cost: 0.00067, ips: 41.4982 samples/sec | ETA 03:47:28 2022-08-24 02:52:05 [INFO] [TRAIN] epoch: 71, iter: 89250/160000, loss: 0.4568, lr: 0.000536, batch_cost: 0.1818, reader_cost: 0.00052, ips: 44.0007 samples/sec | ETA 03:34:23 2022-08-24 02:52:15 [INFO] [TRAIN] epoch: 71, iter: 89300/160000, loss: 0.4830, lr: 0.000535, batch_cost: 0.1905, reader_cost: 0.00830, ips: 41.9997 samples/sec | ETA 03:44:26 2022-08-24 02:52:24 [INFO] [TRAIN] epoch: 71, iter: 89350/160000, loss: 0.4408, lr: 0.000535, batch_cost: 0.1885, reader_cost: 0.00280, ips: 42.4383 samples/sec | ETA 03:41:58 2022-08-24 02:52:35 [INFO] [TRAIN] epoch: 71, iter: 89400/160000, loss: 0.4845, lr: 0.000535, batch_cost: 0.2077, reader_cost: 0.00059, ips: 38.5232 samples/sec | ETA 04:04:21 2022-08-24 02:52:45 [INFO] [TRAIN] epoch: 71, iter: 89450/160000, loss: 0.4884, lr: 0.000534, batch_cost: 0.1975, reader_cost: 0.00035, ips: 40.5147 samples/sec | ETA 03:52:10 2022-08-24 02:52:54 [INFO] [TRAIN] epoch: 71, iter: 89500/160000, loss: 0.4659, lr: 0.000534, batch_cost: 0.1944, reader_cost: 0.00070, ips: 41.1569 samples/sec | ETA 03:48:23 2022-08-24 02:53:03 [INFO] [TRAIN] epoch: 71, iter: 89550/160000, loss: 0.4955, lr: 0.000533, batch_cost: 0.1818, reader_cost: 0.00081, ips: 44.0105 samples/sec | ETA 03:33:26 2022-08-24 02:53:14 [INFO] [TRAIN] epoch: 71, iter: 89600/160000, loss: 0.4648, lr: 0.000533, batch_cost: 0.2208, reader_cost: 0.00084, ips: 36.2362 samples/sec | ETA 04:19:02 2022-08-24 02:53:25 [INFO] [TRAIN] epoch: 71, iter: 89650/160000, loss: 0.4580, lr: 0.000533, batch_cost: 0.2057, reader_cost: 0.00105, ips: 38.8885 samples/sec | ETA 04:01:12 2022-08-24 02:53:40 [INFO] [TRAIN] epoch: 
72, iter: 89700/160000, loss: 0.4910, lr: 0.000532, batch_cost: 0.3163, reader_cost: 0.10753, ips: 25.2937 samples/sec | ETA 06:10:34 2022-08-24 02:53:50 [INFO] [TRAIN] epoch: 72, iter: 89750/160000, loss: 0.4597, lr: 0.000532, batch_cost: 0.1835, reader_cost: 0.00052, ips: 43.5913 samples/sec | ETA 03:34:52 2022-08-24 02:54:00 [INFO] [TRAIN] epoch: 72, iter: 89800/160000, loss: 0.4775, lr: 0.000531, batch_cost: 0.2033, reader_cost: 0.00065, ips: 39.3451 samples/sec | ETA 03:57:53 2022-08-24 02:54:10 [INFO] [TRAIN] epoch: 72, iter: 89850/160000, loss: 0.4616, lr: 0.000531, batch_cost: 0.2059, reader_cost: 0.00051, ips: 38.8458 samples/sec | ETA 04:00:46 2022-08-24 02:54:19 [INFO] [TRAIN] epoch: 72, iter: 89900/160000, loss: 0.4556, lr: 0.000531, batch_cost: 0.1821, reader_cost: 0.00048, ips: 43.9322 samples/sec | ETA 03:32:45 2022-08-24 02:54:29 [INFO] [TRAIN] epoch: 72, iter: 89950/160000, loss: 0.4594, lr: 0.000530, batch_cost: 0.1878, reader_cost: 0.00788, ips: 42.6020 samples/sec | ETA 03:39:14 2022-08-24 02:54:38 [INFO] [TRAIN] epoch: 72, iter: 90000/160000, loss: 0.4682, lr: 0.000530, batch_cost: 0.1885, reader_cost: 0.00070, ips: 42.4303 samples/sec | ETA 03:39:58 2022-08-24 02:54:38 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 162s - batch_cost: 0.1617 - reader cost: 7.9661e-04 2022-08-24 02:57:20 [INFO] [EVAL] #Images: 2000 mIoU: 0.3610 Acc: 0.7710 Kappa: 0.7534 Dice: 0.4978 2022-08-24 02:57:20 [INFO] [EVAL] Class IoU: [0.6843 0.7764 0.9302 0.7344 0.6894 0.7647 0.7697 0.7953 0.5394 0.6253 0.4915 0.5682 0.7001 0.3365 0.3098 0.4314 0.5274 0.4532 0.6227 0.4454 0.7647 0.4703 0.6022 0.5024 0.3249 0.3573 0.4608 0.4378 0.4283 0.2757 0.2413 0.5331 0.2948 0.3583 0.3721 0.4237 0.4738 0.5554 0.2764 0.3118 0.095 0.1324 0.3529 0.2684 0.2537 0.2336 0.3428 0.533 0.6033 0.5222 0.5774 0.3369 0.2085 0.2112 0.6819 0.3284 0.8871 0.4548 0.4569 0.3168 0.1144 0.2543 0.2798 0.1585 0.4707 0.73 0.2941 0.3572 0.1212 0.3569 0.4996 0.5657 0.374 0.2538 0.4783 0.3794 0.4928 0.2614 0.2741 0.2811 0.6909 0.3744 0.4454 0.0294 0.0888 0.5348 0.1159 0.1295 0.2757 0.5268 0.4223 0.0855 0.1717 0.0508 0.0282 0.0651 0.1263 0.1081 0.2514 0.2676 0.083 0.1383 0.3265 0.4724 0.1897 0.3041 0.2224 0.5826 0.0749 0.4322 0.1723 0.4128 0.163 0.5045 0.6155 0.0505 0.3355 0.6867 0.1674 0.4142 0.4142 0.0768 0.2924 0.2069 0.2763 0.2235 0.5305 0.425 0.5016 0.359 0.5259 0.0829 0.2505 0.3743 0.3053 0.1965 0.1529 0.0301 0.1845 0.3941 0.0644 0.0335 0.2046 0.3789 0.2632 0.014 0.4301 0.0498 0.1206 0.1906] 2022-08-24 02:57:20 [INFO] [EVAL] Class Precision: [0.7756 0.8719 0.9563 0.823 0.7616 0.8744 0.8802 0.8632 0.6869 0.7526 0.7169 0.7005 0.764 0.4991 0.5683 0.6066 0.7089 0.715 0.7702 0.6383 0.8376 0.6526 0.7001 0.5977 0.4598 0.6899 0.5459 0.8108 0.7256 0.5786 0.4915 0.677 0.4808 0.4735 0.5299 0.5776 0.6909 0.7143 0.3971 0.6679 0.2717 0.3428 0.5586 0.5433 0.3553 0.4499 0.7828 0.7317 0.6803 0.667 0.8084 0.4288 0.3694 0.602 0.73 0.6969 0.946 0.6077 0.6868 0.6218 0.2587 0.4079 0.3544 0.7411 0.5837 0.8731 0.4961 0.5747 0.4351 0.691 0.7185 0.7138 0.5397 0.305 0.7353 0.5384 0.6168 0.5777 0.6816 0.474 0.8426 0.6434 0.7543 0.1249 0.1558 0.6727 0.3637 0.3481 0.6837 0.7081 0.5783 0.1049 0.281 0.2932 0.1253 0.1588 0.5255 0.4177 0.3632 0.6541 0.5594 0.2078 0.6064 0.4939 0.7222 0.3605 0.3731 0.8912 0.1481 0.5396 0.4592 0.5653 0.5718 0.7126 0.6227 0.4464 0.6062 0.7604 0.4639 0.5578 0.7514 0.2594 0.555 0.6607 0.7999 0.6523 0.8237 0.5078 0.7095 0.6007 0.7087 0.4488 0.686 0.8113 0.7133 0.3658 0.3376 0.0725 0.4497 0.5933 0.2071 
0.1124 0.5624 0.6248 0.5681 0.0164 0.8244 0.1998 0.4045 0.7732] 2022-08-24 02:57:20 [INFO] [EVAL] Class Recall: [0.8532 0.8764 0.9715 0.8722 0.8792 0.8591 0.8597 0.91 0.7153 0.787 0.6099 0.7506 0.8932 0.5081 0.4052 0.5989 0.6732 0.5531 0.7648 0.5957 0.8978 0.6274 0.8116 0.759 0.5255 0.4257 0.7471 0.4876 0.5111 0.345 0.3215 0.7149 0.4325 0.5955 0.5553 0.6139 0.6013 0.714 0.4763 0.369 0.1274 0.1775 0.4893 0.3467 0.47 0.3269 0.3789 0.6625 0.8419 0.7063 0.6689 0.6111 0.3237 0.2455 0.912 0.3832 0.9344 0.6439 0.5771 0.3924 0.1702 0.403 0.5706 0.1678 0.7085 0.8166 0.4193 0.4855 0.1438 0.4246 0.6212 0.7317 0.5492 0.6018 0.5778 0.5622 0.7101 0.3232 0.3144 0.4085 0.7934 0.4725 0.5209 0.0371 0.1712 0.7228 0.1453 0.1709 0.316 0.673 0.6102 0.3163 0.3061 0.0579 0.0351 0.0993 0.1425 0.1273 0.4494 0.3118 0.0888 0.2927 0.4143 0.9156 0.2047 0.6606 0.3552 0.6272 0.1315 0.6847 0.2162 0.6048 0.1857 0.6333 0.9816 0.0538 0.429 0.8763 0.2076 0.6167 0.48 0.0984 0.382 0.2315 0.2969 0.2538 0.5985 0.7227 0.6313 0.4715 0.671 0.0923 0.2829 0.41 0.348 0.298 0.2184 0.0489 0.2383 0.5399 0.0854 0.0456 0.2434 0.4905 0.3291 0.0902 0.4735 0.0623 0.1466 0.2019] 2022-08-24 02:57:20 [INFO] [EVAL] The model with the best validation mIoU (0.3707) was saved at iter 85000. 2022-08-24 02:57:29 [INFO] [TRAIN] epoch: 72, iter: 90050/160000, loss: 0.4951, lr: 0.000530, batch_cost: 0.1793, reader_cost: 0.00486, ips: 44.6240 samples/sec | ETA 03:29:00 2022-08-24 02:57:39 [INFO] [TRAIN] epoch: 72, iter: 90100/160000, loss: 0.4738, lr: 0.000529, batch_cost: 0.1958, reader_cost: 0.00120, ips: 40.8662 samples/sec | ETA 03:48:03 2022-08-24 02:57:49 [INFO] [TRAIN] epoch: 72, iter: 90150/160000, loss: 0.4999, lr: 0.000529, batch_cost: 0.2069, reader_cost: 0.00078, ips: 38.6687 samples/sec | ETA 04:00:50 2022-08-24 02:58:00 [INFO] [TRAIN] epoch: 72, iter: 90200/160000, loss: 0.4673, lr: 0.000528, batch_cost: 0.2149, reader_cost: 0.00103, ips: 37.2305 samples/sec | ETA 04:09:58 2022-08-24 02:58:10 [INFO] [TRAIN] epoch: 72, iter: 90250/160000, loss: 0.4621, lr: 0.000528, batch_cost: 0.2007, reader_cost: 0.00637, ips: 39.8589 samples/sec | ETA 03:53:19 2022-08-24 02:58:20 [INFO] [TRAIN] epoch: 72, iter: 90300/160000, loss: 0.4602, lr: 0.000528, batch_cost: 0.2078, reader_cost: 0.00040, ips: 38.4951 samples/sec | ETA 04:01:24 2022-08-24 02:58:29 [INFO] [TRAIN] epoch: 72, iter: 90350/160000, loss: 0.5105, lr: 0.000527, batch_cost: 0.1746, reader_cost: 0.00085, ips: 45.8121 samples/sec | ETA 03:22:42 2022-08-24 02:58:40 [INFO] [TRAIN] epoch: 72, iter: 90400/160000, loss: 0.4555, lr: 0.000527, batch_cost: 0.2106, reader_cost: 0.00195, ips: 37.9872 samples/sec | ETA 04:04:17 2022-08-24 02:58:51 [INFO] [TRAIN] epoch: 72, iter: 90450/160000, loss: 0.4879, lr: 0.000527, batch_cost: 0.2159, reader_cost: 0.00066, ips: 37.0536 samples/sec | ETA 04:10:16 2022-08-24 02:59:01 [INFO] [TRAIN] epoch: 72, iter: 90500/160000, loss: 0.5003, lr: 0.000526, batch_cost: 0.2005, reader_cost: 0.00058, ips: 39.9064 samples/sec | ETA 03:52:12 2022-08-24 02:59:10 [INFO] [TRAIN] epoch: 72, iter: 90550/160000, loss: 0.4901, lr: 0.000526, batch_cost: 0.1896, reader_cost: 0.00057, ips: 42.1860 samples/sec | ETA 03:39:30 2022-08-24 02:59:20 [INFO] [TRAIN] epoch: 72, iter: 90600/160000, loss: 0.4833, lr: 0.000525, batch_cost: 0.2056, reader_cost: 0.00107, ips: 38.9081 samples/sec | ETA 03:57:49 2022-08-24 02:59:29 [INFO] [TRAIN] epoch: 72, iter: 90650/160000, loss: 0.4780, lr: 0.000525, batch_cost: 0.1706, reader_cost: 0.00040, ips: 46.8989 samples/sec | ETA 03:17:09 2022-08-24 
02:59:39 [INFO] [TRAIN] epoch: 72, iter: 90700/160000, loss: 0.4754, lr: 0.000525, batch_cost: 0.2107, reader_cost: 0.00045, ips: 37.9753 samples/sec | ETA 04:03:18 2022-08-24 02:59:49 [INFO] [TRAIN] epoch: 72, iter: 90750/160000, loss: 0.4697, lr: 0.000524, batch_cost: 0.1936, reader_cost: 0.00081, ips: 41.3136 samples/sec | ETA 03:43:29 2022-08-24 02:59:58 [INFO] [TRAIN] epoch: 72, iter: 90800/160000, loss: 0.4931, lr: 0.000524, batch_cost: 0.1804, reader_cost: 0.00146, ips: 44.3572 samples/sec | ETA 03:28:00 2022-08-24 03:00:09 [INFO] [TRAIN] epoch: 72, iter: 90850/160000, loss: 0.4571, lr: 0.000524, batch_cost: 0.2159, reader_cost: 0.00067, ips: 37.0540 samples/sec | ETA 04:08:49 2022-08-24 03:00:19 [INFO] [TRAIN] epoch: 72, iter: 90900/160000, loss: 0.4470, lr: 0.000523, batch_cost: 0.2084, reader_cost: 0.00057, ips: 38.3918 samples/sec | ETA 03:59:58 2022-08-24 03:00:34 [INFO] [TRAIN] epoch: 73, iter: 90950/160000, loss: 0.4478, lr: 0.000523, batch_cost: 0.2900, reader_cost: 0.07690, ips: 27.5905 samples/sec | ETA 05:33:41 2022-08-24 03:00:45 [INFO] [TRAIN] epoch: 73, iter: 91000/160000, loss: 0.4682, lr: 0.000522, batch_cost: 0.2298, reader_cost: 0.00040, ips: 34.8099 samples/sec | ETA 04:24:17 2022-08-24 03:00:45 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 157s - batch_cost: 0.1570 - reader cost: 9.4792e-04 2022-08-24 03:03:23 [INFO] [EVAL] #Images: 2000 mIoU: 0.3644 Acc: 0.7719 Kappa: 0.7546 Dice: 0.5010 2022-08-24 03:03:23 [INFO] [EVAL] Class IoU: [0.6842 0.7915 0.9331 0.7358 0.6853 0.772 0.7762 0.7918 0.5194 0.6394 0.5048 0.5678 0.7077 0.3176 0.3006 0.4312 0.4846 0.4306 0.6036 0.4387 0.7555 0.4746 0.6102 0.4895 0.314 0.4916 0.4721 0.4498 0.4576 0.248 0.2495 0.5229 0.2751 0.3204 0.3314 0.3926 0.4719 0.5322 0.2892 0.3422 0.0938 0.1443 0.3544 0.2646 0.274 0.1856 0.4436 0.5107 0.6075 0.461 0.5844 0.3445 0.1738 0.2568 0.6838 0.34 0.8851 0.4039 0.3783 0.334 0.0817 0.2809 0.2963 0.2085 0.4625 0.7176 0.2815 0.3713 0.1114 0.3537 0.4604 0.5654 0.3812 0.2797 0.4747 0.3895 0.5253 0.2329 0.2563 0.446 0.6534 0.4041 0.38 0.062 0.1512 0.5333 0.1284 0.1329 0.2975 0.5509 0.4499 0.099 0.2431 0.087 0.0364 0.0245 0.0685 0.1303 0.2353 0.2803 0.1726 0.1266 0.295 0.8139 0.1897 0.4204 0.1517 0.5948 0.0593 0.2743 0.199 0.3221 0.1602 0.5408 0.6625 0.0802 0.3818 0.6564 0.1797 0.3834 0.4288 0.0636 0.3528 0.2288 0.2888 0.193 0.5065 0.4765 0.5431 0.3654 0.4978 0.0678 0.2656 0.3828 0.3153 0.1934 0.1552 0.0243 0.215 0.3811 0.0366 0.0241 0.2369 0.2575 0.3149 0.0254 0.4208 0.031 0.0915 0.1523] 2022-08-24 03:03:23 [INFO] [EVAL] Class Precision: [0.792 0.8558 0.9693 0.841 0.7629 0.8725 0.8881 0.8421 0.6391 0.714 0.695 0.7036 0.7702 0.542 0.5134 0.5956 0.6702 0.6859 0.7646 0.6462 0.8176 0.6723 0.7453 0.5786 0.4504 0.7361 0.5717 0.7194 0.6795 0.4853 0.4702 0.7046 0.5243 0.424 0.4902 0.4988 0.6678 0.7524 0.4615 0.6475 0.2781 0.3377 0.5856 0.4875 0.4799 0.4774 0.7316 0.7512 0.7552 0.5373 0.7633 0.4486 0.3907 0.7073 0.7118 0.6813 0.9222 0.6596 0.6799 0.6551 0.1452 0.6074 0.3845 0.6208 0.5632 0.8566 0.5288 0.5144 0.4438 0.5672 0.7547 0.7312 0.5225 0.3616 0.6777 0.5855 0.7877 0.4134 0.6261 0.6468 0.7777 0.7067 0.823 0.1848 0.2415 0.7226 0.5172 0.3894 0.6996 0.744 0.6192 0.1269 0.4153 0.3199 0.1285 0.0748 0.5946 0.4064 0.3118 0.6148 0.5927 0.2116 0.5748 0.8715 0.7484 0.4811 0.3674 0.8721 0.1698 0.4316 0.4191 0.4126 0.515 0.6667 0.6709 0.3191 0.578 0.7524 0.3353 0.5254 0.7651 0.2125 0.6185 0.6351 0.7806 0.5741 0.8158 0.6072 0.84 0.6399 0.5786 0.4206 0.4958 0.824 0.6331 0.4861 0.3169 
0.0594 0.4465 0.5684 0.1646 0.0388 0.5734 0.4837 0.6089 0.0407 0.8357 0.2132 0.1953 0.862 ] 2022-08-24 03:03:23 [INFO] [EVAL] Class Recall: [0.8341 0.9133 0.9615 0.8547 0.8707 0.8701 0.8603 0.9299 0.7351 0.8595 0.6485 0.7463 0.8972 0.4341 0.4203 0.6097 0.6364 0.5363 0.7413 0.5775 0.9086 0.6174 0.7711 0.7607 0.509 0.5967 0.7304 0.5454 0.5836 0.3366 0.3471 0.6697 0.3666 0.5673 0.5056 0.6483 0.6166 0.6452 0.4364 0.4206 0.1241 0.2012 0.4731 0.3667 0.3896 0.2329 0.5298 0.6147 0.7564 0.7645 0.7138 0.5975 0.2384 0.2874 0.9456 0.4042 0.9566 0.5103 0.4603 0.4053 0.1573 0.3432 0.5636 0.2389 0.7212 0.8156 0.3758 0.5717 0.1294 0.4844 0.5414 0.7138 0.5851 0.5526 0.6132 0.5378 0.612 0.3479 0.3026 0.5896 0.8035 0.4855 0.4138 0.0853 0.288 0.6705 0.1459 0.1679 0.341 0.6797 0.622 0.3105 0.3695 0.1068 0.0484 0.0351 0.0718 0.161 0.4897 0.34 0.1959 0.2398 0.3774 0.9249 0.2027 0.7691 0.2053 0.6516 0.0835 0.4294 0.2749 0.5949 0.1886 0.7413 0.9814 0.0968 0.5294 0.8373 0.2792 0.5866 0.4938 0.0832 0.4508 0.2635 0.3143 0.2253 0.5719 0.6888 0.6057 0.4599 0.7811 0.0748 0.3639 0.4168 0.3858 0.2431 0.2333 0.0394 0.2932 0.5363 0.045 0.0601 0.2876 0.355 0.3947 0.0629 0.4588 0.035 0.1468 0.1561] 2022-08-24 03:03:23 [INFO] [EVAL] The model with the best validation mIoU (0.3707) was saved at iter 85000. 2022-08-24 03:03:34 [INFO] [TRAIN] epoch: 73, iter: 91050/160000, loss: 0.4588, lr: 0.000522, batch_cost: 0.2264, reader_cost: 0.00373, ips: 35.3431 samples/sec | ETA 04:20:06 2022-08-24 03:03:45 [INFO] [TRAIN] epoch: 73, iter: 91100/160000, loss: 0.4258, lr: 0.000522, batch_cost: 0.2132, reader_cost: 0.00218, ips: 37.5218 samples/sec | ETA 04:04:50 2022-08-24 03:03:55 [INFO] [TRAIN] epoch: 73, iter: 91150/160000, loss: 0.4775, lr: 0.000521, batch_cost: 0.2000, reader_cost: 0.00251, ips: 39.9948 samples/sec | ETA 03:49:31 2022-08-24 03:04:04 [INFO] [TRAIN] epoch: 73, iter: 91200/160000, loss: 0.4531, lr: 0.000521, batch_cost: 0.1938, reader_cost: 0.00307, ips: 41.2722 samples/sec | ETA 03:42:15 2022-08-24 03:04:15 [INFO] [TRAIN] epoch: 73, iter: 91250/160000, loss: 0.4884, lr: 0.000521, batch_cost: 0.2066, reader_cost: 0.00069, ips: 38.7147 samples/sec | ETA 03:56:46 2022-08-24 03:04:25 [INFO] [TRAIN] epoch: 73, iter: 91300/160000, loss: 0.4752, lr: 0.000520, batch_cost: 0.2045, reader_cost: 0.00099, ips: 39.1146 samples/sec | ETA 03:54:11 2022-08-24 03:04:35 [INFO] [TRAIN] epoch: 73, iter: 91350/160000, loss: 0.4349, lr: 0.000520, batch_cost: 0.2003, reader_cost: 0.00052, ips: 39.9445 samples/sec | ETA 03:49:09 2022-08-24 03:04:45 [INFO] [TRAIN] epoch: 73, iter: 91400/160000, loss: 0.4839, lr: 0.000519, batch_cost: 0.2010, reader_cost: 0.00037, ips: 39.7942 samples/sec | ETA 03:49:50 2022-08-24 03:04:56 [INFO] [TRAIN] epoch: 73, iter: 91450/160000, loss: 0.4851, lr: 0.000519, batch_cost: 0.2103, reader_cost: 0.00106, ips: 38.0329 samples/sec | ETA 04:00:19 2022-08-24 03:05:05 [INFO] [TRAIN] epoch: 73, iter: 91500/160000, loss: 0.4803, lr: 0.000519, batch_cost: 0.1950, reader_cost: 0.00033, ips: 41.0334 samples/sec | ETA 03:42:34 2022-08-24 03:05:14 [INFO] [TRAIN] epoch: 73, iter: 91550/160000, loss: 0.4618, lr: 0.000518, batch_cost: 0.1795, reader_cost: 0.00404, ips: 44.5673 samples/sec | ETA 03:24:47 2022-08-24 03:05:23 [INFO] [TRAIN] epoch: 73, iter: 91600/160000, loss: 0.4671, lr: 0.000518, batch_cost: 0.1814, reader_cost: 0.00034, ips: 44.1049 samples/sec | ETA 03:26:46 2022-08-24 03:05:33 [INFO] [TRAIN] epoch: 73, iter: 91650/160000, loss: 0.4764, lr: 0.000517, batch_cost: 0.1970, reader_cost: 0.00061, ips: 40.6023 
samples/sec | ETA 03:44:27 2022-08-24 03:05:43 [INFO] [TRAIN] epoch: 73, iter: 91700/160000, loss: 0.4449, lr: 0.000517, batch_cost: 0.1969, reader_cost: 0.00735, ips: 40.6298 samples/sec | ETA 03:44:08 2022-08-24 03:05:54 [INFO] [TRAIN] epoch: 73, iter: 91750/160000, loss: 0.4629, lr: 0.000517, batch_cost: 0.2248, reader_cost: 0.00060, ips: 35.5895 samples/sec | ETA 04:15:41 2022-08-24 03:06:04 [INFO] [TRAIN] epoch: 73, iter: 91800/160000, loss: 0.4893, lr: 0.000516, batch_cost: 0.1964, reader_cost: 0.00059, ips: 40.7430 samples/sec | ETA 03:43:11 2022-08-24 03:06:14 [INFO] [TRAIN] epoch: 73, iter: 91850/160000, loss: 0.4627, lr: 0.000516, batch_cost: 0.1901, reader_cost: 0.00059, ips: 42.0888 samples/sec | ETA 03:35:53 2022-08-24 03:06:23 [INFO] [TRAIN] epoch: 73, iter: 91900/160000, loss: 0.4822, lr: 0.000516, batch_cost: 0.1867, reader_cost: 0.00039, ips: 42.8508 samples/sec | ETA 03:31:53 2022-08-24 03:06:33 [INFO] [TRAIN] epoch: 73, iter: 91950/160000, loss: 0.4551, lr: 0.000515, batch_cost: 0.2069, reader_cost: 0.00077, ips: 38.6580 samples/sec | ETA 03:54:42 2022-08-24 03:06:42 [INFO] [TRAIN] epoch: 73, iter: 92000/160000, loss: 0.4731, lr: 0.000515, batch_cost: 0.1800, reader_cost: 0.01405, ips: 44.4407 samples/sec | ETA 03:24:01 2022-08-24 03:06:42 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 147s - batch_cost: 0.1473 - reader cost: 6.1731e-04 2022-08-24 03:09:10 [INFO] [EVAL] #Images: 2000 mIoU: 0.3675 Acc: 0.7728 Kappa: 0.7553 Dice: 0.5047 2022-08-24 03:09:10 [INFO] [EVAL] Class IoU: [0.6889 0.7899 0.9318 0.735 0.6985 0.7632 0.7645 0.8054 0.5269 0.6432 0.4955 0.5607 0.7175 0.3087 0.2797 0.4296 0.5096 0.4574 0.6088 0.4324 0.7643 0.4682 0.6296 0.5052 0.3142 0.4285 0.5077 0.4393 0.4064 0.2307 0.2845 0.5326 0.3094 0.3275 0.3346 0.392 0.4848 0.571 0.2794 0.3365 0.0747 0.1421 0.3653 0.2731 0.2411 0.2228 0.3966 0.5029 0.5202 0.5245 0.5679 0.3752 0.1923 0.2399 0.7124 0.3415 0.8775 0.4354 0.3539 0.2949 0.0881 0.3232 0.2908 0.1355 0.4734 0.7198 0.3042 0.3709 0.1171 0.347 0.5207 0.5858 0.3832 0.2959 0.4717 0.3971 0.5113 0.2539 0.3643 0.4945 0.678 0.3754 0.3877 0.0378 0.0964 0.5264 0.1227 0.1326 0.2666 0.5365 0.4387 0.1009 0.2528 0.0813 0.0271 0.0405 0.1718 0.1276 0.2161 0.2697 0.1986 0.1135 0.2155 0.6985 0.1873 0.3917 0.1663 0.5754 0.0753 0.4441 0.1469 0.4787 0.177 0.5558 0.7042 0.0863 0.3691 0.6185 0.1573 0.3932 0.4281 0.0613 0.3217 0.1817 0.2863 0.2232 0.5363 0.4492 0.5355 0.3167 0.5009 0.0517 0.2644 0.3834 0.288 0.2012 0.167 0.0335 0.2008 0.3531 0.124 0.098 0.2421 0.4558 0.2377 0.0024 0.44 0.0349 0.0723 0.2019] 2022-08-24 03:09:10 [INFO] [EVAL] Class Precision: [0.7837 0.8546 0.9632 0.8254 0.8015 0.8661 0.8884 0.8721 0.6657 0.7374 0.6792 0.6728 0.7928 0.5537 0.5385 0.5634 0.686 0.6686 0.755 0.6292 0.8352 0.6905 0.7625 0.6307 0.4256 0.7047 0.5952 0.8353 0.7231 0.4588 0.5061 0.6893 0.5381 0.4609 0.4065 0.6272 0.6742 0.7264 0.4024 0.632 0.308 0.3968 0.6437 0.4789 0.34 0.4124 0.6452 0.7399 0.7077 0.6555 0.7261 0.507 0.3273 0.6217 0.749 0.7495 0.9097 0.6294 0.6834 0.5255 0.1345 0.6582 0.4446 0.6566 0.5895 0.8235 0.5615 0.5029 0.2467 0.5563 0.687 0.7635 0.5861 0.3369 0.6866 0.6025 0.7332 0.5264 0.679 0.7836 0.791 0.6626 0.8087 0.1244 0.1912 0.7356 0.4696 0.4191 0.6289 0.7562 0.6136 0.1342 0.4343 0.3617 0.1263 0.1403 0.7514 0.3242 0.2938 0.6699 0.6334 0.1921 0.6247 0.7533 0.7501 0.4131 0.5356 0.8056 0.1795 0.5329 0.409 0.7414 0.4741 0.6481 0.7122 0.4065 0.6813 0.7367 0.4431 0.5046 0.7478 0.3589 0.5904 0.77 0.7324 0.6039 0.7767 0.5939 0.8172 0.4148 0.6943 
0.4419 0.5935 0.8361 0.7463 0.4481 0.3555 0.0677 0.5217 0.7207 0.4326 0.1792 0.5498 0.6168 0.5703 0.0069 0.8329 0.2218 0.2041 0.8495] 2022-08-24 03:09:10 [INFO] [EVAL] Class Recall: [0.8507 0.9126 0.9662 0.8703 0.8447 0.8652 0.8457 0.9133 0.7164 0.8342 0.6469 0.7709 0.8831 0.411 0.368 0.6441 0.6645 0.5915 0.7587 0.5802 0.9 0.5926 0.7832 0.7173 0.5457 0.5223 0.7755 0.4809 0.4813 0.317 0.394 0.7009 0.4214 0.5307 0.6545 0.511 0.6332 0.7274 0.4775 0.4184 0.0897 0.1813 0.4579 0.3885 0.453 0.3264 0.5071 0.6108 0.6626 0.724 0.7227 0.5909 0.3181 0.2809 0.9357 0.3855 0.9612 0.5855 0.4233 0.4019 0.2035 0.3884 0.4567 0.1459 0.7062 0.8511 0.399 0.5856 0.1823 0.4798 0.6826 0.7157 0.5254 0.7086 0.6012 0.538 0.6281 0.3291 0.4401 0.5727 0.826 0.4642 0.4268 0.0515 0.1628 0.6493 0.1424 0.1624 0.3164 0.6486 0.6062 0.2894 0.377 0.0949 0.0333 0.0539 0.1822 0.1739 0.4497 0.3111 0.2244 0.2172 0.2476 0.9056 0.1997 0.8835 0.1943 0.6682 0.1149 0.7273 0.1864 0.5747 0.2203 0.796 0.9842 0.0987 0.4461 0.7941 0.1961 0.6404 0.5003 0.0688 0.4142 0.1921 0.3198 0.2615 0.6341 0.6484 0.6084 0.5724 0.6425 0.0554 0.3229 0.4145 0.3192 0.2675 0.2395 0.062 0.2461 0.409 0.1481 0.1779 0.302 0.6359 0.2895 0.0037 0.4826 0.0398 0.1007 0.2094] 2022-08-24 03:09:10 [INFO] [EVAL] The model with the best validation mIoU (0.3707) was saved at iter 85000. 2022-08-24 03:09:19 [INFO] [TRAIN] epoch: 73, iter: 92050/160000, loss: 0.4611, lr: 0.000514, batch_cost: 0.1710, reader_cost: 0.00350, ips: 46.7818 samples/sec | ETA 03:13:39 2022-08-24 03:09:28 [INFO] [TRAIN] epoch: 73, iter: 92100/160000, loss: 0.5001, lr: 0.000514, batch_cost: 0.1859, reader_cost: 0.00124, ips: 43.0329 samples/sec | ETA 03:30:22 2022-08-24 03:09:37 [INFO] [TRAIN] epoch: 73, iter: 92150/160000, loss: 0.4927, lr: 0.000514, batch_cost: 0.1794, reader_cost: 0.00060, ips: 44.6034 samples/sec | ETA 03:22:49 2022-08-24 03:09:48 [INFO] [TRAIN] epoch: 74, iter: 92200/160000, loss: 0.4649, lr: 0.000513, batch_cost: 0.2209, reader_cost: 0.04337, ips: 36.2196 samples/sec | ETA 04:09:35 2022-08-24 03:10:01 [INFO] [TRAIN] epoch: 74, iter: 92250/160000, loss: 0.5016, lr: 0.000513, batch_cost: 0.2604, reader_cost: 0.04869, ips: 30.7220 samples/sec | ETA 04:54:02 2022-08-24 03:10:10 [INFO] [TRAIN] epoch: 74, iter: 92300/160000, loss: 0.4447, lr: 0.000513, batch_cost: 0.1896, reader_cost: 0.00055, ips: 42.1921 samples/sec | ETA 03:33:56 2022-08-24 03:10:21 [INFO] [TRAIN] epoch: 74, iter: 92350/160000, loss: 0.4685, lr: 0.000512, batch_cost: 0.2022, reader_cost: 0.00072, ips: 39.5577 samples/sec | ETA 03:48:01 2022-08-24 03:10:31 [INFO] [TRAIN] epoch: 74, iter: 92400/160000, loss: 0.4752, lr: 0.000512, batch_cost: 0.2015, reader_cost: 0.00045, ips: 39.7038 samples/sec | ETA 03:47:00 2022-08-24 03:10:41 [INFO] [TRAIN] epoch: 74, iter: 92450/160000, loss: 0.4617, lr: 0.000511, batch_cost: 0.2094, reader_cost: 0.00050, ips: 38.2112 samples/sec | ETA 03:55:42 2022-08-24 03:10:51 [INFO] [TRAIN] epoch: 74, iter: 92500/160000, loss: 0.4923, lr: 0.000511, batch_cost: 0.1910, reader_cost: 0.00038, ips: 41.8828 samples/sec | ETA 03:34:53 2022-08-24 03:11:03 [INFO] [TRAIN] epoch: 74, iter: 92550/160000, loss: 0.4586, lr: 0.000511, batch_cost: 0.2471, reader_cost: 0.00082, ips: 32.3768 samples/sec | ETA 04:37:46 2022-08-24 03:11:13 [INFO] [TRAIN] epoch: 74, iter: 92600/160000, loss: 0.5048, lr: 0.000510, batch_cost: 0.2081, reader_cost: 0.00039, ips: 38.4473 samples/sec | ETA 03:53:44 2022-08-24 03:11:24 [INFO] [TRAIN] epoch: 74, iter: 92650/160000, loss: 0.4623, lr: 0.000510, batch_cost: 0.2185, 
reader_cost: 0.00044, ips: 36.6212 samples/sec | ETA 04:05:12 2022-08-24 03:11:34 [INFO] [TRAIN] epoch: 74, iter: 92700/160000, loss: 0.4767, lr: 0.000510, batch_cost: 0.1914, reader_cost: 0.00051, ips: 41.8081 samples/sec | ETA 03:34:37 2022-08-24 03:11:44 [INFO] [TRAIN] epoch: 74, iter: 92750/160000, loss: 0.4725, lr: 0.000509, batch_cost: 0.1937, reader_cost: 0.00062, ips: 41.3034 samples/sec | ETA 03:37:05 2022-08-24 03:11:52 [INFO] [TRAIN] epoch: 74, iter: 92800/160000, loss: 0.4651, lr: 0.000509, batch_cost: 0.1761, reader_cost: 0.00107, ips: 45.4352 samples/sec | ETA 03:17:12 2022-08-24 03:12:02 [INFO] [TRAIN] epoch: 74, iter: 92850/160000, loss: 0.4990, lr: 0.000508, batch_cost: 0.1864, reader_cost: 0.00070, ips: 42.9089 samples/sec | ETA 03:28:39 2022-08-24 03:12:12 [INFO] [TRAIN] epoch: 74, iter: 92900/160000, loss: 0.4736, lr: 0.000508, batch_cost: 0.2034, reader_cost: 0.00067, ips: 39.3388 samples/sec | ETA 03:47:25 2022-08-24 03:12:21 [INFO] [TRAIN] epoch: 74, iter: 92950/160000, loss: 0.4946, lr: 0.000508, batch_cost: 0.1885, reader_cost: 0.00076, ips: 42.4445 samples/sec | ETA 03:30:37 2022-08-24 03:12:32 [INFO] [TRAIN] epoch: 74, iter: 93000/160000, loss: 0.4964, lr: 0.000507, batch_cost: 0.2098, reader_cost: 0.00109, ips: 38.1235 samples/sec | ETA 03:54:19 2022-08-24 03:12:32 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 180s - batch_cost: 0.1802 - reader cost: 0.0010 2022-08-24 03:15:32 [INFO] [EVAL] #Images: 2000 mIoU: 0.3654 Acc: 0.7704 Kappa: 0.7527 Dice: 0.5029 2022-08-24 03:15:32 [INFO] [EVAL] Class IoU: [0.689 0.7828 0.9303 0.7309 0.6921 0.7638 0.7663 0.8089 0.5242 0.6403 0.4852 0.5729 0.7123 0.3279 0.3011 0.4387 0.4727 0.4232 0.6043 0.4468 0.7664 0.4071 0.6123 0.5062 0.3049 0.3999 0.4461 0.463 0.4435 0.2558 0.2691 0.5474 0.3036 0.3455 0.3734 0.3922 0.4664 0.5408 0.2796 0.3613 0.1158 0.1744 0.3439 0.274 0.2614 0.2489 0.2295 0.5213 0.5583 0.5455 0.5737 0.3411 0.1604 0.2449 0.7041 0.3236 0.8665 0.4468 0.3441 0.2809 0.074 0.275 0.3177 0.1633 0.4823 0.7233 0.2744 0.388 0.0881 0.395 0.5053 0.5714 0.3946 0.2611 0.4671 0.3853 0.5234 0.2935 0.2989 0.3313 0.6732 0.3996 0.4148 0.0338 0.0955 0.515 0.1242 0.1161 0.2889 0.5313 0.4531 0.121 0.157 0.1274 0.0082 0.0458 0.067 0.1955 0.3102 0.3181 0.1923 0.1171 0.3014 0.5671 0.189 0.4529 0.1639 0.5631 0.0874 0.3816 0.1923 0.5073 0.1662 0.5106 0.6403 0.0518 0.4035 0.702 0.1884 0.4098 0.4245 0.0596 0.3138 0.1786 0.2786 0.2107 0.529 0.4618 0.5619 0.3899 0.5027 0.1179 0.2574 0.379 0.2715 0.1999 0.1696 0.031 0.2002 0.4016 0.0302 0.0598 0.2253 0.4122 0.232 0.0186 0.4218 0.0367 0.1028 0.1834] 2022-08-24 03:15:32 [INFO] [EVAL] Class Precision: [0.777 0.8705 0.9604 0.8162 0.7771 0.8823 0.8866 0.8913 0.6815 0.743 0.6814 0.6852 0.7831 0.5026 0.5364 0.5777 0.7088 0.7055 0.7304 0.6279 0.8368 0.6353 0.7296 0.6188 0.4781 0.5113 0.5242 0.8222 0.7094 0.465 0.4776 0.734 0.5903 0.4844 0.4581 0.515 0.6806 0.797 0.4359 0.6229 0.2803 0.4074 0.6717 0.482 0.369 0.4926 0.7456 0.7359 0.7665 0.7205 0.7157 0.4484 0.3787 0.6957 0.7393 0.6012 0.9059 0.6172 0.6983 0.5967 0.1134 0.6467 0.4087 0.6838 0.6137 0.8546 0.4882 0.5273 0.139 0.7181 0.7342 0.7132 0.5669 0.3277 0.716 0.6052 0.7218 0.6821 0.7027 0.5957 0.7954 0.6721 0.7955 0.1053 0.199 0.7604 0.6216 0.4729 0.7063 0.7168 0.6668 0.1665 0.3932 0.3406 0.1001 0.1559 0.5631 0.3271 0.4718 0.7465 0.6654 0.2066 0.5726 0.8244 0.7971 0.5069 0.3791 0.8097 0.2049 0.4788 0.4567 0.8151 0.4966 0.8184 0.6493 0.2964 0.6953 0.8111 0.3059 0.6287 0.7125 0.4245 0.6712 0.754 0.8125 0.6107 0.8648 
0.5956 0.8481 0.7048 0.594 0.351 0.6287 0.8256 0.6084 0.4368 0.3493 0.0535 0.4231 0.5746 0.2679 0.157 0.6201 0.6006 0.6805 0.0291 0.8892 0.3024 0.2224 0.7681] 2022-08-24 03:15:32 [INFO] [EVAL] Class Recall: [0.8588 0.886 0.9674 0.8749 0.8635 0.8504 0.8496 0.8974 0.6943 0.8224 0.6275 0.7776 0.8874 0.4853 0.407 0.6457 0.5866 0.514 0.7777 0.6078 0.9011 0.5313 0.792 0.7356 0.457 0.6473 0.7497 0.5145 0.542 0.3624 0.3814 0.6829 0.3847 0.5465 0.6687 0.6219 0.5972 0.6272 0.4381 0.4625 0.1649 0.2337 0.4134 0.3884 0.4726 0.3347 0.249 0.6413 0.6727 0.692 0.743 0.5877 0.2178 0.2743 0.9366 0.4121 0.9522 0.6181 0.4041 0.3467 0.1755 0.3237 0.588 0.1766 0.6925 0.8249 0.3851 0.5951 0.194 0.4675 0.6184 0.7419 0.5649 0.5622 0.5733 0.5146 0.6556 0.34 0.3421 0.4274 0.8143 0.4964 0.4644 0.0473 0.1551 0.6148 0.1343 0.1333 0.3283 0.6725 0.5858 0.3065 0.2071 0.169 0.0089 0.0609 0.0707 0.327 0.4752 0.3566 0.2129 0.2128 0.3889 0.6451 0.1985 0.8096 0.2241 0.6491 0.1323 0.6526 0.2493 0.5733 0.1999 0.5758 0.9786 0.059 0.4902 0.8392 0.3291 0.5407 0.5122 0.0648 0.3708 0.1896 0.2977 0.2434 0.5767 0.6728 0.6248 0.466 0.7657 0.1508 0.3036 0.412 0.3289 0.2693 0.248 0.0686 0.2754 0.5715 0.0329 0.0881 0.2614 0.5679 0.2604 0.0492 0.4452 0.0401 0.1605 0.1942] 2022-08-24 03:15:33 [INFO] [EVAL] The model with the best validation mIoU (0.3707) was saved at iter 85000. 2022-08-24 03:15:43 [INFO] [TRAIN] epoch: 74, iter: 93050/160000, loss: 0.4696, lr: 0.000507, batch_cost: 0.1978, reader_cost: 0.00373, ips: 40.4432 samples/sec | ETA 03:40:43 2022-08-24 03:15:52 [INFO] [TRAIN] epoch: 74, iter: 93100/160000, loss: 0.4699, lr: 0.000507, batch_cost: 0.1981, reader_cost: 0.00124, ips: 40.3827 samples/sec | ETA 03:40:53 2022-08-24 03:16:03 [INFO] [TRAIN] epoch: 74, iter: 93150/160000, loss: 0.4711, lr: 0.000506, batch_cost: 0.2200, reader_cost: 0.00069, ips: 36.3584 samples/sec | ETA 04:05:09 2022-08-24 03:16:14 [INFO] [TRAIN] epoch: 74, iter: 93200/160000, loss: 0.4849, lr: 0.000506, batch_cost: 0.2122, reader_cost: 0.00076, ips: 37.7011 samples/sec | ETA 03:56:14 2022-08-24 03:16:25 [INFO] [TRAIN] epoch: 74, iter: 93250/160000, loss: 0.4787, lr: 0.000505, batch_cost: 0.2105, reader_cost: 0.00048, ips: 38.0033 samples/sec | ETA 03:54:11 2022-08-24 03:16:35 [INFO] [TRAIN] epoch: 74, iter: 93300/160000, loss: 0.4591, lr: 0.000505, batch_cost: 0.1999, reader_cost: 0.00045, ips: 40.0183 samples/sec | ETA 03:42:13 2022-08-24 03:16:45 [INFO] [TRAIN] epoch: 74, iter: 93350/160000, loss: 0.4564, lr: 0.000505, batch_cost: 0.2015, reader_cost: 0.00069, ips: 39.7111 samples/sec | ETA 03:43:46 2022-08-24 03:16:54 [INFO] [TRAIN] epoch: 74, iter: 93400/160000, loss: 0.4844, lr: 0.000504, batch_cost: 0.1935, reader_cost: 0.00035, ips: 41.3537 samples/sec | ETA 03:34:43 2022-08-24 03:17:05 [INFO] [TRAIN] epoch: 74, iter: 93450/160000, loss: 0.4721, lr: 0.000504, batch_cost: 0.2134, reader_cost: 0.00107, ips: 37.4866 samples/sec | ETA 03:56:42 2022-08-24 03:17:22 [INFO] [TRAIN] epoch: 75, iter: 93500/160000, loss: 0.4495, lr: 0.000503, batch_cost: 0.3397, reader_cost: 0.15773, ips: 23.5495 samples/sec | ETA 06:16:30 2022-08-24 03:17:32 [INFO] [TRAIN] epoch: 75, iter: 93550/160000, loss: 0.4820, lr: 0.000503, batch_cost: 0.1910, reader_cost: 0.00033, ips: 41.8897 samples/sec | ETA 03:31:30 2022-08-24 03:17:43 [INFO] [TRAIN] epoch: 75, iter: 93600/160000, loss: 0.4873, lr: 0.000503, batch_cost: 0.2330, reader_cost: 0.00075, ips: 34.3295 samples/sec | ETA 04:17:53 2022-08-24 03:17:52 [INFO] [TRAIN] epoch: 75, iter: 93650/160000, loss: 0.4892, lr: 0.000502, 
batch_cost: 0.1835, reader_cost: 0.00085, ips: 43.5875 samples/sec | ETA 03:22:57 2022-08-24 03:18:02 [INFO] [TRAIN] epoch: 75, iter: 93700/160000, loss: 0.4762, lr: 0.000502, batch_cost: 0.1949, reader_cost: 0.00159, ips: 41.0481 samples/sec | ETA 03:35:21 2022-08-24 03:18:12 [INFO] [TRAIN] epoch: 75, iter: 93750/160000, loss: 0.4480, lr: 0.000502, batch_cost: 0.1893, reader_cost: 0.00371, ips: 42.2712 samples/sec | ETA 03:28:58 2022-08-24 03:18:21 [INFO] [TRAIN] epoch: 75, iter: 93800/160000, loss: 0.4513, lr: 0.000501, batch_cost: 0.1887, reader_cost: 0.00321, ips: 42.4059 samples/sec | ETA 03:28:08 2022-08-24 03:18:31 [INFO] [TRAIN] epoch: 75, iter: 93850/160000, loss: 0.4639, lr: 0.000501, batch_cost: 0.1998, reader_cost: 0.00074, ips: 40.0369 samples/sec | ETA 03:40:17 2022-08-24 03:18:40 [INFO] [TRAIN] epoch: 75, iter: 93900/160000, loss: 0.4210, lr: 0.000500, batch_cost: 0.1711, reader_cost: 0.00041, ips: 46.7637 samples/sec | ETA 03:08:27 2022-08-24 03:18:48 [INFO] [TRAIN] epoch: 75, iter: 93950/160000, loss: 0.4833, lr: 0.000500, batch_cost: 0.1630, reader_cost: 0.00080, ips: 49.0811 samples/sec | ETA 02:59:25 2022-08-24 03:18:55 [INFO] [TRAIN] epoch: 75, iter: 94000/160000, loss: 0.4402, lr: 0.000500, batch_cost: 0.1502, reader_cost: 0.00052, ips: 53.2767 samples/sec | ETA 02:45:10 2022-08-24 03:18:55 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 161s - batch_cost: 0.1605 - reader cost: 5.9402e-04 2022-08-24 03:21:36 [INFO] [EVAL] #Images: 2000 mIoU: 0.3687 Acc: 0.7738 Kappa: 0.7565 Dice: 0.5066 2022-08-24 03:21:36 [INFO] [EVAL] Class IoU: [0.6896 0.7917 0.9312 0.7404 0.6916 0.7717 0.7758 0.8032 0.5317 0.6592 0.4867 0.5801 0.7106 0.3243 0.2749 0.4366 0.4744 0.4193 0.6195 0.4373 0.7714 0.473 0.6246 0.5132 0.3362 0.4564 0.4669 0.4674 0.4205 0.2408 0.2824 0.5278 0.3152 0.3407 0.37 0.389 0.4777 0.5899 0.2759 0.398 0.1813 0.1388 0.3576 0.2723 0.2438 0.2388 0.4006 0.5215 0.6005 0.5339 0.5579 0.3379 0.1847 0.2374 0.6944 0.3141 0.8681 0.3994 0.491 0.2957 0.0747 0.2604 0.3334 0.1547 0.479 0.7347 0.2236 0.3717 0.1024 0.3674 0.5082 0.4911 0.371 0.209 0.4733 0.401 0.54 0.2841 0.2308 0.1216 0.6628 0.4304 0.4212 0.0373 0.2621 0.5197 0.1111 0.1339 0.3186 0.5091 0.4003 0.0677 0.151 0.0909 0.0504 0.0655 0.0795 0.1492 0.2384 0.3044 0.2075 0.1359 0.291 0.685 0.1862 0.4768 0.2434 0.5705 0.1255 0.3466 0.2421 0.5405 0.1729 0.4272 0.5527 0.0576 0.3904 0.6684 0.1616 0.3954 0.437 0.0635 0.3355 0.1651 0.3047 0.2477 0.5499 0.473 0.5523 0.3682 0.5636 0.0859 0.2805 0.3832 0.2819 0.2001 0.1597 0.0273 0.1901 0.4055 0.08 0.0608 0.2798 0.3959 0.2659 0.0049 0.4342 0.0384 0.1084 0.1922] 2022-08-24 03:21:36 [INFO] [EVAL] Class Precision: [0.7842 0.8751 0.9689 0.8291 0.7746 0.8684 0.8926 0.8692 0.6545 0.7431 0.6844 0.7022 0.7827 0.5223 0.5469 0.5981 0.6678 0.6833 0.7932 0.6294 0.8482 0.686 0.7855 0.624 0.4725 0.5721 0.5437 0.6915 0.744 0.4991 0.4678 0.6664 0.5804 0.4536 0.5141 0.4798 0.6838 0.7859 0.4307 0.585 0.3776 0.3751 0.642 0.4711 0.3858 0.4465 0.7788 0.7691 0.6999 0.6848 0.766 0.4218 0.3004 0.6988 0.7282 0.6594 0.896 0.713 0.6968 0.5822 0.1357 0.6582 0.4813 0.7583 0.5848 0.8484 0.3182 0.513 0.2578 0.6131 0.6997 0.5967 0.5597 0.3142 0.6785 0.5942 0.7281 0.6196 0.7131 0.411 0.7996 0.6466 0.7734 0.0958 0.3539 0.7497 0.6229 0.3877 0.6811 0.685 0.5232 0.0773 0.4345 0.34 0.1851 0.232 0.6327 0.387 0.3207 0.6592 0.6871 0.2041 0.5972 0.8306 0.8968 0.5224 0.5579 0.7844 0.2417 0.466 0.406 0.7968 0.6045 0.7689 0.5568 0.4385 0.6239 0.7927 0.3536 0.5045 0.7703 0.3879 0.6749 0.7251 0.7503 
0.6133 0.8917 0.6224 0.8301 0.5544 0.6922 0.3155 0.6463 0.851 0.7059 0.4333 0.3465 0.0532 0.433 0.6405 0.2535 0.1711 0.5503 0.7239 0.5736 0.0075 0.8516 0.26 0.2234 0.7995] 2022-08-24 03:21:36 [INFO] [EVAL] Class Recall: [0.851 0.8926 0.96 0.8738 0.8659 0.8739 0.8557 0.9136 0.7392 0.8537 0.6276 0.7694 0.8853 0.461 0.3561 0.6178 0.621 0.5205 0.7388 0.5889 0.895 0.6038 0.7529 0.7429 0.5381 0.693 0.7678 0.5906 0.4916 0.3176 0.4161 0.7173 0.4082 0.5779 0.569 0.6727 0.6132 0.7029 0.4342 0.5546 0.2586 0.1806 0.4466 0.3921 0.3986 0.3392 0.452 0.6183 0.8088 0.7078 0.6725 0.6294 0.324 0.2645 0.9373 0.375 0.9655 0.476 0.6243 0.3754 0.1424 0.3011 0.5204 0.1628 0.7259 0.8458 0.4292 0.5742 0.1452 0.4783 0.65 0.7351 0.524 0.3844 0.6101 0.5521 0.6764 0.3441 0.2544 0.1472 0.7949 0.5627 0.4805 0.0576 0.5024 0.6288 0.1191 0.1698 0.3744 0.6646 0.6302 0.3515 0.188 0.1103 0.0648 0.0837 0.0833 0.1953 0.4816 0.3612 0.2292 0.2893 0.3621 0.7962 0.1903 0.8454 0.3016 0.6766 0.2071 0.575 0.375 0.6269 0.1949 0.4901 0.9869 0.0622 0.5106 0.81 0.2293 0.6465 0.5025 0.0706 0.4001 0.1761 0.3391 0.2935 0.5893 0.6634 0.6227 0.5229 0.7522 0.1055 0.3314 0.4108 0.3194 0.2711 0.2285 0.0532 0.253 0.525 0.1047 0.0863 0.3628 0.4663 0.3314 0.014 0.4697 0.0432 0.174 0.2019] 2022-08-24 03:21:36 [INFO] [EVAL] The model with the best validation mIoU (0.3707) was saved at iter 85000. 2022-08-24 03:21:46 [INFO] [TRAIN] epoch: 75, iter: 94050/160000, loss: 0.4549, lr: 0.000499, batch_cost: 0.2024, reader_cost: 0.00284, ips: 39.5191 samples/sec | ETA 03:42:30 2022-08-24 03:21:57 [INFO] [TRAIN] epoch: 75, iter: 94100/160000, loss: 0.4833, lr: 0.000499, batch_cost: 0.2163, reader_cost: 0.00157, ips: 36.9813 samples/sec | ETA 03:57:35 2022-08-24 03:22:07 [INFO] [TRAIN] epoch: 75, iter: 94150/160000, loss: 0.4880, lr: 0.000499, batch_cost: 0.1898, reader_cost: 0.00033, ips: 42.1583 samples/sec | ETA 03:28:15 2022-08-24 03:22:16 [INFO] [TRAIN] epoch: 75, iter: 94200/160000, loss: 0.4582, lr: 0.000498, batch_cost: 0.1824, reader_cost: 0.00467, ips: 43.8549 samples/sec | ETA 03:20:03 2022-08-24 03:22:27 [INFO] [TRAIN] epoch: 75, iter: 94250/160000, loss: 0.4460, lr: 0.000498, batch_cost: 0.2215, reader_cost: 0.00033, ips: 36.1148 samples/sec | ETA 04:02:44 2022-08-24 03:22:37 [INFO] [TRAIN] epoch: 75, iter: 94300/160000, loss: 0.4409, lr: 0.000497, batch_cost: 0.1932, reader_cost: 0.00057, ips: 41.4039 samples/sec | ETA 03:31:34 2022-08-24 03:22:46 [INFO] [TRAIN] epoch: 75, iter: 94350/160000, loss: 0.4855, lr: 0.000497, batch_cost: 0.1820, reader_cost: 0.00621, ips: 43.9611 samples/sec | ETA 03:19:06 2022-08-24 03:22:55 [INFO] [TRAIN] epoch: 75, iter: 94400/160000, loss: 0.4853, lr: 0.000497, batch_cost: 0.1899, reader_cost: 0.00045, ips: 42.1191 samples/sec | ETA 03:27:39 2022-08-24 03:23:06 [INFO] [TRAIN] epoch: 75, iter: 94450/160000, loss: 0.4650, lr: 0.000496, batch_cost: 0.2188, reader_cost: 0.00140, ips: 36.5632 samples/sec | ETA 03:59:02 2022-08-24 03:23:17 [INFO] [TRAIN] epoch: 75, iter: 94500/160000, loss: 0.4670, lr: 0.000496, batch_cost: 0.2094, reader_cost: 0.00042, ips: 38.2019 samples/sec | ETA 03:48:36 2022-08-24 03:23:27 [INFO] [TRAIN] epoch: 75, iter: 94550/160000, loss: 0.4671, lr: 0.000496, batch_cost: 0.2154, reader_cost: 0.00130, ips: 37.1425 samples/sec | ETA 03:54:57 2022-08-24 03:23:37 [INFO] [TRAIN] epoch: 75, iter: 94600/160000, loss: 0.4778, lr: 0.000495, batch_cost: 0.1979, reader_cost: 0.00063, ips: 40.4313 samples/sec | ETA 03:35:40 2022-08-24 03:23:46 [INFO] [TRAIN] epoch: 75, iter: 94650/160000, loss: 0.4895, lr: 
0.000495, batch_cost: 0.1772, reader_cost: 0.00235, ips: 45.1590 samples/sec | ETA 03:12:56 2022-08-24 03:23:57 [INFO] [TRAIN] epoch: 75, iter: 94700/160000, loss: 0.4250, lr: 0.000494, batch_cost: 0.2138, reader_cost: 0.00051, ips: 37.4151 samples/sec | ETA 03:52:42 2022-08-24 03:24:10 [INFO] [TRAIN] epoch: 76, iter: 94750/160000, loss: 0.4553, lr: 0.000494, batch_cost: 0.2587, reader_cost: 0.06368, ips: 30.9191 samples/sec | ETA 04:41:22 2022-08-24 03:24:17 [INFO] [TRAIN] epoch: 76, iter: 94800/160000, loss: 0.4800, lr: 0.000494, batch_cost: 0.1551, reader_cost: 0.00126, ips: 51.5959 samples/sec | ETA 02:48:29 2022-08-24 03:24:26 [INFO] [TRAIN] epoch: 76, iter: 94850/160000, loss: 0.4615, lr: 0.000493, batch_cost: 0.1722, reader_cost: 0.00068, ips: 46.4563 samples/sec | ETA 03:06:59 2022-08-24 03:24:35 [INFO] [TRAIN] epoch: 76, iter: 94900/160000, loss: 0.4522, lr: 0.000493, batch_cost: 0.1727, reader_cost: 0.00063, ips: 46.3229 samples/sec | ETA 03:07:22 2022-08-24 03:24:44 [INFO] [TRAIN] epoch: 76, iter: 94950/160000, loss: 0.4500, lr: 0.000492, batch_cost: 0.1856, reader_cost: 0.00066, ips: 43.1099 samples/sec | ETA 03:21:11 2022-08-24 03:24:53 [INFO] [TRAIN] epoch: 76, iter: 95000/160000, loss: 0.4638, lr: 0.000492, batch_cost: 0.1805, reader_cost: 0.00049, ips: 44.3244 samples/sec | ETA 03:15:31 2022-08-24 03:24:53 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 154s - batch_cost: 0.1539 - reader cost: 0.0013 2022-08-24 03:27:27 [INFO] [EVAL] #Images: 2000 mIoU: 0.3627 Acc: 0.7719 Kappa: 0.7544 Dice: 0.4997 2022-08-24 03:27:27 [INFO] [EVAL] Class IoU: [0.6908 0.7888 0.93 0.741 0.676 0.77 0.78 0.8021 0.5302 0.6262 0.4837 0.5625 0.7038 0.309 0.3221 0.439 0.4655 0.4398 0.6141 0.4367 0.7673 0.4084 0.625 0.5248 0.3192 0.4265 0.518 0.4353 0.4511 0.2449 0.2823 0.5223 0.2893 0.3444 0.3457 0.3944 0.4827 0.5527 0.2499 0.3199 0.1056 0.1269 0.3425 0.2643 0.2608 0.229 0.4092 0.5014 0.6163 0.533 0.5693 0.3245 0.1865 0.1974 0.706 0.3455 0.8877 0.3933 0.3454 0.2907 0.1124 0.2547 0.307 0.1502 0.4707 0.7117 0.2716 0.3985 0.1085 0.3352 0.5098 0.5325 0.371 0.227 0.474 0.3844 0.5284 0.271 0.2762 0.2028 0.6953 0.4154 0.4002 0.0406 0.1048 0.5245 0.1138 0.1127 0.3429 0.5216 0.3909 0.0953 0.1785 0.1109 0.0539 0.0226 0.1307 0.143 0.2809 0.313 0.1539 0.1453 0.2704 0.5519 0.2 0.3843 0.1459 0.5775 0.117 0.4095 0.1789 0.3726 0.1738 0.5386 0.6169 0.0836 0.4106 0.6984 0.1847 0.386 0.4669 0.0804 0.2626 0.1875 0.2871 0.2099 0.5336 0.4511 0.4406 0.3744 0.5894 0.0542 0.2998 0.3425 0.2778 0.2039 0.1674 0.038 0.1795 0.393 0.0355 0.0279 0.259 0.4808 0.2834 0.0136 0.4084 0.0262 0.0875 0.2011] 2022-08-24 03:27:27 [INFO] [EVAL] Class Precision: [0.7846 0.8608 0.9622 0.8301 0.7738 0.8639 0.8855 0.8568 0.6639 0.7345 0.6957 0.7121 0.7659 0.5235 0.5289 0.5852 0.6763 0.7094 0.7751 0.6288 0.8314 0.7087 0.7871 0.6406 0.4856 0.6139 0.5584 0.686 0.7413 0.4552 0.4817 0.6587 0.5422 0.4499 0.4722 0.5093 0.6859 0.7814 0.3646 0.676 0.3818 0.336 0.537 0.4624 0.3457 0.6763 0.7509 0.7223 0.6872 0.6668 0.7614 0.3964 0.3823 0.6409 0.7585 0.7423 0.9284 0.6587 0.704 0.4839 0.1623 0.535 0.4284 0.7139 0.5967 0.8076 0.4163 0.5289 0.184 0.6039 0.7086 0.6233 0.6269 0.34 0.659 0.5803 0.7403 0.5673 0.6819 0.6293 0.8599 0.6686 0.8164 0.1402 0.2102 0.6967 0.5684 0.4419 0.7893 0.6964 0.5154 0.1377 0.3729 0.3472 0.1476 0.0952 0.6862 0.369 0.4511 0.7179 0.6695 0.2644 0.6226 0.7823 0.7897 0.4944 0.492 0.8148 0.1996 0.514 0.4408 0.4579 0.629 0.7878 0.6228 0.3841 0.7587 0.763 0.3857 0.5504 0.721 0.4544 0.5687 0.7226 0.769 
0.6476 0.7859 0.5664 0.5677 0.5688 0.67 0.3355 0.6468 0.889 0.7686 0.4137 0.3848 0.0671 0.4776 0.6405 0.2126 0.1668 0.5929 0.6819 0.5902 0.0176 0.8434 0.2097 0.2011 0.8557] 2022-08-24 03:27:27 [INFO] [EVAL] Class Recall: [0.8525 0.9041 0.9652 0.8735 0.8424 0.8763 0.8675 0.9263 0.7247 0.8093 0.6136 0.728 0.8966 0.43 0.4516 0.6372 0.599 0.5364 0.7472 0.5884 0.9086 0.4908 0.7521 0.7439 0.4823 0.5828 0.8774 0.5436 0.5354 0.3465 0.4054 0.716 0.3828 0.5949 0.5633 0.6361 0.6196 0.6538 0.4428 0.3778 0.1273 0.1695 0.4861 0.3816 0.5151 0.2572 0.4734 0.6211 0.8567 0.7266 0.6929 0.6414 0.2668 0.2219 0.9106 0.3926 0.9529 0.4939 0.4041 0.4213 0.2677 0.3272 0.5199 0.1598 0.6903 0.8571 0.4387 0.6178 0.2092 0.4297 0.6451 0.7852 0.4762 0.4058 0.6282 0.5324 0.6487 0.3416 0.3171 0.2303 0.7841 0.523 0.4398 0.0541 0.173 0.6796 0.1245 0.1315 0.3774 0.6751 0.6181 0.2361 0.255 0.1402 0.0783 0.0288 0.139 0.1894 0.4267 0.3569 0.1666 0.2439 0.3233 0.6521 0.2113 0.6332 0.1718 0.6647 0.2202 0.6682 0.2315 0.6669 0.1936 0.63 0.9851 0.0965 0.4723 0.8918 0.2617 0.5639 0.5699 0.089 0.3278 0.202 0.3142 0.237 0.6244 0.689 0.6631 0.5228 0.8305 0.0607 0.3585 0.3578 0.3031 0.2869 0.2286 0.0805 0.2234 0.5042 0.0408 0.0324 0.3151 0.6198 0.3528 0.0562 0.4419 0.0291 0.1342 0.2082] 2022-08-24 03:27:27 [INFO] [EVAL] The model with the best validation mIoU (0.3707) was saved at iter 85000. 2022-08-24 03:27:37 [INFO] [TRAIN] epoch: 76, iter: 95050/160000, loss: 0.4708, lr: 0.000492, batch_cost: 0.1903, reader_cost: 0.00428, ips: 42.0298 samples/sec | ETA 03:26:02 2022-08-24 03:27:47 [INFO] [TRAIN] epoch: 76, iter: 95100/160000, loss: 0.4578, lr: 0.000491, batch_cost: 0.1965, reader_cost: 0.01252, ips: 40.7150 samples/sec | ETA 03:32:32 2022-08-24 03:27:56 [INFO] [TRAIN] epoch: 76, iter: 95150/160000, loss: 0.4708, lr: 0.000491, batch_cost: 0.1777, reader_cost: 0.00050, ips: 45.0199 samples/sec | ETA 03:12:03 2022-08-24 03:28:06 [INFO] [TRAIN] epoch: 76, iter: 95200/160000, loss: 0.4994, lr: 0.000491, batch_cost: 0.2140, reader_cost: 0.00049, ips: 37.3821 samples/sec | ETA 03:51:07 2022-08-24 03:28:17 [INFO] [TRAIN] epoch: 76, iter: 95250/160000, loss: 0.4843, lr: 0.000490, batch_cost: 0.2049, reader_cost: 0.00085, ips: 39.0467 samples/sec | ETA 03:41:06 2022-08-24 03:28:28 [INFO] [TRAIN] epoch: 76, iter: 95300/160000, loss: 0.4614, lr: 0.000490, batch_cost: 0.2245, reader_cost: 0.00053, ips: 35.6270 samples/sec | ETA 04:02:08 2022-08-24 03:28:39 [INFO] [TRAIN] epoch: 76, iter: 95350/160000, loss: 0.4650, lr: 0.000489, batch_cost: 0.2232, reader_cost: 0.00043, ips: 35.8377 samples/sec | ETA 04:00:31 2022-08-24 03:28:49 [INFO] [TRAIN] epoch: 76, iter: 95400/160000, loss: 0.4620, lr: 0.000489, batch_cost: 0.1917, reader_cost: 0.00051, ips: 41.7406 samples/sec | ETA 03:26:21 2022-08-24 03:28:58 [INFO] [TRAIN] epoch: 76, iter: 95450/160000, loss: 0.4665, lr: 0.000489, batch_cost: 0.1828, reader_cost: 0.00259, ips: 43.7547 samples/sec | ETA 03:16:42 2022-08-24 03:29:08 [INFO] [TRAIN] epoch: 76, iter: 95500/160000, loss: 0.4776, lr: 0.000488, batch_cost: 0.2065, reader_cost: 0.00056, ips: 38.7448 samples/sec | ETA 03:41:57 2022-08-24 03:29:17 [INFO] [TRAIN] epoch: 76, iter: 95550/160000, loss: 0.4552, lr: 0.000488, batch_cost: 0.1749, reader_cost: 0.00049, ips: 45.7490 samples/sec | ETA 03:07:50 2022-08-24 03:29:26 [INFO] [TRAIN] epoch: 76, iter: 95600/160000, loss: 0.4627, lr: 0.000488, batch_cost: 0.1799, reader_cost: 0.00044, ips: 44.4674 samples/sec | ETA 03:13:06 2022-08-24 03:29:35 [INFO] [TRAIN] epoch: 76, iter: 95650/160000, loss: 
0.4672, lr: 0.000487, batch_cost: 0.1779, reader_cost: 0.00496, ips: 44.9670 samples/sec | ETA 03:10:48 2022-08-24 03:29:44 [INFO] [TRAIN] epoch: 76, iter: 95700/160000, loss: 0.4880, lr: 0.000487, batch_cost: 0.1798, reader_cost: 0.00522, ips: 44.4876 samples/sec | ETA 03:12:42 2022-08-24 03:29:53 [INFO] [TRAIN] epoch: 76, iter: 95750/160000, loss: 0.4527, lr: 0.000486, batch_cost: 0.1958, reader_cost: 0.00090, ips: 40.8656 samples/sec | ETA 03:29:37 2022-08-24 03:30:04 [INFO] [TRAIN] epoch: 76, iter: 95800/160000, loss: 0.4563, lr: 0.000486, batch_cost: 0.2064, reader_cost: 0.00172, ips: 38.7503 samples/sec | ETA 03:40:54 2022-08-24 03:30:13 [INFO] [TRAIN] epoch: 76, iter: 95850/160000, loss: 0.4797, lr: 0.000486, batch_cost: 0.1824, reader_cost: 0.00064, ips: 43.8538 samples/sec | ETA 03:15:02 2022-08-24 03:30:22 [INFO] [TRAIN] epoch: 76, iter: 95900/160000, loss: 0.4503, lr: 0.000485, batch_cost: 0.1797, reader_cost: 0.00079, ips: 44.5122 samples/sec | ETA 03:12:00 2022-08-24 03:30:31 [INFO] [TRAIN] epoch: 76, iter: 95950/160000, loss: 0.4731, lr: 0.000485, batch_cost: 0.1802, reader_cost: 0.00081, ips: 44.3914 samples/sec | ETA 03:12:22 2022-08-24 03:30:44 [INFO] [TRAIN] epoch: 77, iter: 96000/160000, loss: 0.4481, lr: 0.000485, batch_cost: 0.2656, reader_cost: 0.08901, ips: 30.1159 samples/sec | ETA 04:43:20 2022-08-24 03:30:44 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 194s - batch_cost: 0.1939 - reader cost: 5.2803e-04 2022-08-24 03:33:58 [INFO] [EVAL] #Images: 2000 mIoU: 0.3712 Acc: 0.7734 Kappa: 0.7562 Dice: 0.5087 2022-08-24 03:33:58 [INFO] [EVAL] Class IoU: [0.6925 0.7874 0.932 0.733 0.6886 0.7719 0.7747 0.8019 0.5215 0.6339 0.4922 0.5791 0.71 0.3149 0.3258 0.4491 0.4815 0.4291 0.6047 0.4405 0.7649 0.4536 0.6388 0.5259 0.3427 0.4503 0.4838 0.4458 0.4214 0.2447 0.279 0.5206 0.2462 0.3534 0.3523 0.4185 0.4675 0.5853 0.2787 0.3837 0.1334 0.1569 0.3633 0.2728 0.2534 0.229 0.408 0.5392 0.615 0.5039 0.5772 0.3166 0.1769 0.2615 0.6632 0.349 0.8832 0.3978 0.5004 0.3025 0.0874 0.2397 0.3418 0.2034 0.4739 0.7248 0.2891 0.3719 0.1075 0.3738 0.5019 0.5262 0.3585 0.2493 0.4737 0.3904 0.468 0.301 0.3181 0.4004 0.6817 0.38 0.3865 0.0376 0.2098 0.5236 0.1354 0.1231 0.2949 0.5026 0.4066 0.0387 0.2095 0.1011 0.0103 0.0228 0.1827 0.1939 0.2704 0.2927 0.2766 0.1292 0.3002 0.7753 0.1902 0.4234 0.2049 0.5678 0.0746 0.2068 0.1949 0.2867 0.1709 0.5656 0.7286 0.0703 0.4335 0.6913 0.1952 0.406 0.4338 0.0841 0.3718 0.2242 0.2863 0.2408 0.547 0.4653 0.5395 0.3761 0.5246 0.0268 0.2555 0.4371 0.2757 0.1988 0.1608 0.0303 0.2111 0.3874 0.0247 0.0363 0.2519 0.4496 0.2123 0.0075 0.4541 0.0361 0.1083 0.2078] 2022-08-24 03:33:58 [INFO] [EVAL] Class Precision: [0.7894 0.8748 0.965 0.8119 0.7639 0.8711 0.8859 0.8602 0.6843 0.756 0.7152 0.7153 0.7814 0.5371 0.542 0.6227 0.6818 0.6975 0.7508 0.6403 0.8286 0.6714 0.8023 0.6331 0.4946 0.506 0.5635 0.6681 0.806 0.4029 0.5222 0.6465 0.5288 0.4699 0.4432 0.5781 0.7075 0.8118 0.4509 0.5909 0.3759 0.3823 0.5819 0.5114 0.3544 0.4648 0.7692 0.764 0.7921 0.6028 0.7856 0.423 0.3238 0.633 0.7333 0.7303 0.921 0.709 0.7149 0.6334 0.1402 0.4331 0.4548 0.6432 0.6267 0.8311 0.4554 0.5397 0.1931 0.5989 0.7561 0.7274 0.5758 0.3273 0.6816 0.5269 0.7524 0.5618 0.7302 0.6341 0.7942 0.6517 0.7921 0.1142 0.3127 0.7017 0.5463 0.3954 0.6051 0.6676 0.5264 0.0426 0.446 0.3165 0.056 0.0644 0.5216 0.3862 0.4023 0.6654 0.7304 0.1895 0.6275 0.8403 0.7913 0.4541 0.4136 0.8165 0.2362 0.3735 0.414 0.3226 0.4392 0.7246 0.7394 0.2843 0.7078 0.7516 0.2541 0.5927 0.746 
0.3693 0.6501 0.5822 0.7602 0.6019 0.8803 0.5993 0.779 0.5849 0.6119 0.3167 0.5983 0.8077 0.6288 0.425 0.4062 0.0539 0.4666 0.6723 0.18 0.1439 0.5835 0.6598 0.594 0.0121 0.792 0.2215 0.2953 0.8352] 2022-08-24 03:33:58 [INFO] [EVAL] Class Recall: [0.8494 0.8874 0.9646 0.8829 0.8748 0.8715 0.8606 0.922 0.6866 0.797 0.6122 0.7526 0.886 0.4322 0.4495 0.617 0.6212 0.5271 0.7566 0.5853 0.9087 0.5831 0.7581 0.7564 0.5274 0.8036 0.7739 0.5726 0.4689 0.3838 0.3746 0.7278 0.3154 0.5878 0.6319 0.6026 0.5794 0.6772 0.4219 0.5225 0.1714 0.2101 0.4916 0.3689 0.4706 0.311 0.4649 0.647 0.7334 0.7543 0.6851 0.5574 0.2806 0.3082 0.8741 0.4007 0.9556 0.4754 0.6251 0.3667 0.1883 0.3492 0.5789 0.2293 0.6603 0.85 0.4418 0.5447 0.1952 0.4986 0.5989 0.6554 0.4872 0.5112 0.6083 0.6012 0.5533 0.3934 0.3604 0.5208 0.8281 0.4769 0.4302 0.0531 0.3894 0.6734 0.1526 0.1517 0.3652 0.6703 0.6413 0.2938 0.2831 0.1294 0.0125 0.034 0.2195 0.2802 0.4519 0.3432 0.3081 0.2888 0.3652 0.9093 0.2002 0.8622 0.2887 0.6508 0.0983 0.3166 0.2691 0.7204 0.2185 0.7204 0.9805 0.0855 0.528 0.896 0.457 0.5631 0.509 0.0982 0.4648 0.2673 0.3147 0.2864 0.5909 0.6755 0.637 0.5129 0.7862 0.0285 0.3083 0.4879 0.3292 0.2719 0.2102 0.0649 0.2783 0.4776 0.0279 0.0463 0.3072 0.5853 0.2483 0.0193 0.5155 0.0413 0.1461 0.2167] 2022-08-24 03:33:59 [INFO] [EVAL] The model with the best validation mIoU (0.3712) was saved at iter 96000. 2022-08-24 03:34:08 [INFO] [TRAIN] epoch: 77, iter: 96050/160000, loss: 0.4607, lr: 0.000484, batch_cost: 0.1824, reader_cost: 0.00615, ips: 43.8507 samples/sec | ETA 03:14:26 2022-08-24 03:34:18 [INFO] [TRAIN] epoch: 77, iter: 96100/160000, loss: 0.4608, lr: 0.000484, batch_cost: 0.2113, reader_cost: 0.00077, ips: 37.8534 samples/sec | ETA 03:45:04 2022-08-24 03:34:29 [INFO] [TRAIN] epoch: 77, iter: 96150/160000, loss: 0.4370, lr: 0.000483, batch_cost: 0.2200, reader_cost: 0.00062, ips: 36.3629 samples/sec | ETA 03:54:07 2022-08-24 03:34:39 [INFO] [TRAIN] epoch: 77, iter: 96200/160000, loss: 0.4573, lr: 0.000483, batch_cost: 0.1920, reader_cost: 0.00142, ips: 41.6668 samples/sec | ETA 03:24:09 2022-08-24 03:34:48 [INFO] [TRAIN] epoch: 77, iter: 96250/160000, loss: 0.4850, lr: 0.000483, batch_cost: 0.1859, reader_cost: 0.00587, ips: 43.0381 samples/sec | ETA 03:17:29 2022-08-24 03:34:57 [INFO] [TRAIN] epoch: 77, iter: 96300/160000, loss: 0.4510, lr: 0.000482, batch_cost: 0.1849, reader_cost: 0.00035, ips: 43.2735 samples/sec | ETA 03:16:16 2022-08-24 03:35:08 [INFO] [TRAIN] epoch: 77, iter: 96350/160000, loss: 0.4690, lr: 0.000482, batch_cost: 0.2104, reader_cost: 0.00046, ips: 38.0286 samples/sec | ETA 03:43:09 2022-08-24 03:35:18 [INFO] [TRAIN] epoch: 77, iter: 96400/160000, loss: 0.4697, lr: 0.000482, batch_cost: 0.2068, reader_cost: 0.00082, ips: 38.6767 samples/sec | ETA 03:39:15 2022-08-24 03:35:28 [INFO] [TRAIN] epoch: 77, iter: 96450/160000, loss: 0.4671, lr: 0.000481, batch_cost: 0.2026, reader_cost: 0.00037, ips: 39.4878 samples/sec | ETA 03:34:34 2022-08-24 03:35:38 [INFO] [TRAIN] epoch: 77, iter: 96500/160000, loss: 0.4480, lr: 0.000481, batch_cost: 0.1959, reader_cost: 0.00067, ips: 40.8434 samples/sec | ETA 03:27:17 2022-08-24 03:35:49 [INFO] [TRAIN] epoch: 77, iter: 96550/160000, loss: 0.4580, lr: 0.000480, batch_cost: 0.2115, reader_cost: 0.00051, ips: 37.8173 samples/sec | ETA 03:43:42 2022-08-24 03:35:58 [INFO] [TRAIN] epoch: 77, iter: 96600/160000, loss: 0.4728, lr: 0.000480, batch_cost: 0.1809, reader_cost: 0.00053, ips: 44.2271 samples/sec | ETA 03:11:08 2022-08-24 03:36:07 [INFO] [TRAIN] epoch: 77, 
iter: 96650/160000, loss: 0.4622, lr: 0.000480, batch_cost: 0.1779, reader_cost: 0.00078, ips: 44.9571 samples/sec | ETA 03:07:52 2022-08-24 03:36:17 [INFO] [TRAIN] epoch: 77, iter: 96700/160000, loss: 0.4811, lr: 0.000479, batch_cost: 0.2098, reader_cost: 0.00028, ips: 38.1303 samples/sec | ETA 03:41:20 2022-08-24 03:36:27 [INFO] [TRAIN] epoch: 77, iter: 96750/160000, loss: 0.4578, lr: 0.000479, batch_cost: 0.1905, reader_cost: 0.00035, ips: 41.9868 samples/sec | ETA 03:20:51 2022-08-24 03:36:36 [INFO] [TRAIN] epoch: 77, iter: 96800/160000, loss: 0.4778, lr: 0.000478, batch_cost: 0.1877, reader_cost: 0.00046, ips: 42.6246 samples/sec | ETA 03:17:41 2022-08-24 03:36:44 [INFO] [TRAIN] epoch: 77, iter: 96850/160000, loss: 0.4865, lr: 0.000478, batch_cost: 0.1562, reader_cost: 0.00047, ips: 51.2145 samples/sec | ETA 02:44:24 2022-08-24 03:36:53 [INFO] [TRAIN] epoch: 77, iter: 96900/160000, loss: 0.4353, lr: 0.000478, batch_cost: 0.1768, reader_cost: 0.00059, ips: 45.2454 samples/sec | ETA 03:05:56 2022-08-24 03:37:01 [INFO] [TRAIN] epoch: 77, iter: 96950/160000, loss: 0.4811, lr: 0.000477, batch_cost: 0.1699, reader_cost: 0.00066, ips: 47.0785 samples/sec | ETA 02:58:34 2022-08-24 03:37:09 [INFO] [TRAIN] epoch: 77, iter: 97000/160000, loss: 0.4636, lr: 0.000477, batch_cost: 0.1613, reader_cost: 0.00076, ips: 49.6041 samples/sec | ETA 02:49:20 2022-08-24 03:37:09 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 175s - batch_cost: 0.1751 - reader cost: 7.1952e-04 2022-08-24 03:40:05 [INFO] [EVAL] #Images: 2000 mIoU: 0.3695 Acc: 0.7725 Kappa: 0.7553 Dice: 0.5075 2022-08-24 03:40:05 [INFO] [EVAL] Class IoU: [0.69 0.7873 0.9304 0.7361 0.6831 0.7671 0.7701 0.8036 0.5338 0.6388 0.4881 0.5717 0.719 0.3046 0.3357 0.4466 0.4766 0.4122 0.6097 0.4287 0.7633 0.4621 0.6373 0.521 0.337 0.425 0.5235 0.4644 0.4835 0.2355 0.3038 0.5186 0.2918 0.3492 0.3471 0.4193 0.4822 0.5325 0.2764 0.3489 0.0999 0.1497 0.3427 0.2679 0.2493 0.2144 0.4018 0.508 0.6255 0.4803 0.5702 0.2913 0.1652 0.2135 0.6804 0.3653 0.8898 0.4442 0.3175 0.2851 0.0933 0.2564 0.3357 0.2422 0.4711 0.7018 0.2726 0.3783 0.1093 0.3263 0.5129 0.5611 0.3928 0.2498 0.4832 0.3788 0.5527 0.2504 0.3366 0.4138 0.6948 0.4249 0.4261 0.0474 0.1503 0.5325 0.1259 0.1123 0.2808 0.518 0.4417 0.1054 0.2178 0.1128 0.0137 0.0552 0.1287 0.1985 0.2791 0.2798 0.2226 0.1298 0.2765 0.7066 0.19 0.3996 0.1812 0.561 0.0714 0.2996 0.2065 0.5014 0.1859 0.5126 0.6524 0.0589 0.3642 0.625 0.1768 0.3542 0.4623 0.0713 0.3383 0.196 0.292 0.2048 0.5199 0.4445 0.5433 0.3636 0.5803 0.0455 0.3897 0.3699 0.2373 0.207 0.1629 0.0362 0.207 0.3644 0.092 0.0048 0.2706 0.4307 0.2405 0.0167 0.4322 0.0312 0.115 0.198 ] 2022-08-24 03:40:05 [INFO] [EVAL] Class Precision: [0.7978 0.8499 0.9623 0.8168 0.8023 0.8696 0.8955 0.8602 0.7107 0.7041 0.7016 0.6792 0.8012 0.5241 0.4691 0.595 0.6936 0.6948 0.799 0.6448 0.8338 0.6791 0.772 0.652 0.4603 0.677 0.5647 0.7461 0.7013 0.4696 0.4787 0.628 0.5537 0.4695 0.4585 0.6088 0.6959 0.7931 0.3982 0.6077 0.4416 0.3216 0.5739 0.4761 0.3387 0.447 0.6868 0.7208 0.6892 0.5413 0.8025 0.3531 0.3106 0.5424 0.7141 0.7484 0.9467 0.6637 0.69 0.5184 0.1486 0.5706 0.5032 0.6323 0.6112 0.8007 0.3973 0.4806 0.3381 0.5515 0.6702 0.7733 0.5211 0.34 0.6904 0.6016 0.7331 0.6326 0.7649 0.6044 0.8549 0.7278 0.7941 0.1401 0.2463 0.668 0.5835 0.4422 0.6392 0.6803 0.6296 0.1231 0.494 0.3122 0.0424 0.1436 0.5588 0.3251 0.4056 0.5873 0.6984 0.2199 0.6062 0.8349 0.8246 0.4466 0.4953 0.8264 0.1922 0.4882 0.3852 0.7165 0.5628 0.714 0.6597 0.2907 0.7916 
0.706 0.315 0.5426 0.741 0.3356 0.6942 0.6108 0.7793 0.6199 0.7756 0.5646 0.788 0.5102 0.6711 0.2 0.6471 0.8291 0.7716 0.4338 0.3667 0.0857 0.4495 0.6703 0.3816 0.0355 0.5569 0.7086 0.5849 0.0235 0.7918 0.2052 0.2131 0.8713] 2022-08-24 03:40:05 [INFO] [EVAL] Class Recall: [0.8363 0.9145 0.9656 0.8818 0.8214 0.8668 0.8462 0.9243 0.682 0.8731 0.6159 0.7832 0.8751 0.421 0.5414 0.6417 0.6037 0.5033 0.7201 0.5613 0.9002 0.5912 0.7852 0.7217 0.5572 0.533 0.8777 0.5516 0.6089 0.3207 0.4541 0.7486 0.3816 0.5767 0.5882 0.5739 0.611 0.6184 0.4746 0.4504 0.1143 0.2188 0.4596 0.3799 0.4858 0.2919 0.492 0.6324 0.8713 0.8098 0.6633 0.6247 0.2607 0.2604 0.9352 0.4164 0.9367 0.5732 0.3703 0.3877 0.2006 0.3177 0.5021 0.2819 0.6727 0.8504 0.4649 0.6397 0.139 0.4441 0.6861 0.6716 0.6146 0.485 0.6168 0.5056 0.6919 0.293 0.3755 0.5675 0.7877 0.5051 0.479 0.0669 0.2782 0.7242 0.1384 0.1309 0.3337 0.6846 0.5967 0.4226 0.2804 0.15 0.0198 0.0823 0.1432 0.3375 0.4723 0.3483 0.2463 0.2405 0.337 0.8213 0.198 0.7919 0.2222 0.636 0.1021 0.4367 0.3079 0.6254 0.2172 0.645 0.9833 0.0688 0.4029 0.845 0.2873 0.5049 0.5515 0.083 0.3975 0.2239 0.3183 0.2342 0.612 0.6764 0.6362 0.5586 0.8109 0.0557 0.4948 0.4004 0.2552 0.2836 0.2267 0.0589 0.2773 0.4439 0.1081 0.0055 0.3448 0.5233 0.29 0.0541 0.4876 0.0355 0.1998 0.204 ] 2022-08-24 03:40:05 [INFO] [EVAL] The model with the best validation mIoU (0.3712) was saved at iter 96000. 2022-08-24 03:40:14 [INFO] [TRAIN] epoch: 77, iter: 97050/160000, loss: 0.4621, lr: 0.000477, batch_cost: 0.1776, reader_cost: 0.00388, ips: 45.0470 samples/sec | ETA 03:06:19 2022-08-24 03:40:25 [INFO] [TRAIN] epoch: 77, iter: 97100/160000, loss: 0.4271, lr: 0.000476, batch_cost: 0.2153, reader_cost: 0.00062, ips: 37.1520 samples/sec | ETA 03:45:44 2022-08-24 03:40:35 [INFO] [TRAIN] epoch: 77, iter: 97150/160000, loss: 0.4732, lr: 0.000476, batch_cost: 0.2091, reader_cost: 0.00183, ips: 38.2621 samples/sec | ETA 03:39:00 2022-08-24 03:40:46 [INFO] [TRAIN] epoch: 77, iter: 97200/160000, loss: 0.4673, lr: 0.000475, batch_cost: 0.2172, reader_cost: 0.00076, ips: 36.8298 samples/sec | ETA 03:47:21 2022-08-24 03:40:57 [INFO] [TRAIN] epoch: 77, iter: 97250/160000, loss: 0.5255, lr: 0.000475, batch_cost: 0.2138, reader_cost: 0.00074, ips: 37.4097 samples/sec | ETA 03:43:38 2022-08-24 03:41:15 [INFO] [TRAIN] epoch: 78, iter: 97300/160000, loss: 0.4233, lr: 0.000475, batch_cost: 0.3620, reader_cost: 0.14419, ips: 22.1008 samples/sec | ETA 06:18:16 2022-08-24 03:41:24 [INFO] [TRAIN] epoch: 78, iter: 97350/160000, loss: 0.4493, lr: 0.000474, batch_cost: 0.1920, reader_cost: 0.00060, ips: 41.6701 samples/sec | ETA 03:20:27 2022-08-24 03:41:34 [INFO] [TRAIN] epoch: 78, iter: 97400/160000, loss: 0.4141, lr: 0.000474, batch_cost: 0.1985, reader_cost: 0.00056, ips: 40.3004 samples/sec | ETA 03:27:06 2022-08-24 03:41:43 [INFO] [TRAIN] epoch: 78, iter: 97450/160000, loss: 0.4454, lr: 0.000474, batch_cost: 0.1765, reader_cost: 0.00059, ips: 45.3341 samples/sec | ETA 03:03:58 2022-08-24 03:41:52 [INFO] [TRAIN] epoch: 78, iter: 97500/160000, loss: 0.4407, lr: 0.000473, batch_cost: 0.1755, reader_cost: 0.00054, ips: 45.5927 samples/sec | ETA 03:02:46 2022-08-24 03:42:01 [INFO] [TRAIN] epoch: 78, iter: 97550/160000, loss: 0.4743, lr: 0.000473, batch_cost: 0.1770, reader_cost: 0.00092, ips: 45.2011 samples/sec | ETA 03:04:12 2022-08-24 03:42:09 [INFO] [TRAIN] epoch: 78, iter: 97600/160000, loss: 0.4368, lr: 0.000472, batch_cost: 0.1654, reader_cost: 0.00784, ips: 48.3585 samples/sec | ETA 02:52:02 2022-08-24 03:42:17 [INFO] 
[TRAIN] epoch: 78, iter: 97650/160000, loss: 0.4647, lr: 0.000472, batch_cost: 0.1594, reader_cost: 0.00052, ips: 50.1735 samples/sec | ETA 02:45:41 2022-08-24 03:42:26 [INFO] [TRAIN] epoch: 78, iter: 97700/160000, loss: 0.4582, lr: 0.000472, batch_cost: 0.1790, reader_cost: 0.00042, ips: 44.7016 samples/sec | ETA 03:05:49 2022-08-24 03:42:36 [INFO] [TRAIN] epoch: 78, iter: 97750/160000, loss: 0.4409, lr: 0.000471, batch_cost: 0.1977, reader_cost: 0.00071, ips: 40.4648 samples/sec | ETA 03:25:06 2022-08-24 03:42:44 [INFO] [TRAIN] epoch: 78, iter: 97800/160000, loss: 0.4597, lr: 0.000471, batch_cost: 0.1620, reader_cost: 0.00059, ips: 49.3972 samples/sec | ETA 02:47:53 2022-08-24 03:42:52 [INFO] [TRAIN] epoch: 78, iter: 97850/160000, loss: 0.4354, lr: 0.000471, batch_cost: 0.1593, reader_cost: 0.00080, ips: 50.2200 samples/sec | ETA 02:45:00 2022-08-24 03:43:00 [INFO] [TRAIN] epoch: 78, iter: 97900/160000, loss: 0.4499, lr: 0.000470, batch_cost: 0.1537, reader_cost: 0.00048, ips: 52.0417 samples/sec | ETA 02:39:06 2022-08-24 03:43:08 [INFO] [TRAIN] epoch: 78, iter: 97950/160000, loss: 0.4517, lr: 0.000470, batch_cost: 0.1615, reader_cost: 0.00062, ips: 49.5433 samples/sec | ETA 02:46:59 2022-08-24 03:43:17 [INFO] [TRAIN] epoch: 78, iter: 98000/160000, loss: 0.4479, lr: 0.000469, batch_cost: 0.1776, reader_cost: 0.00045, ips: 45.0325 samples/sec | ETA 03:03:34 2022-08-24 03:43:17 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 191s - batch_cost: 0.1905 - reader cost: 0.0011 2022-08-24 03:46:27 [INFO] [EVAL] #Images: 2000 mIoU: 0.3712 Acc: 0.7741 Kappa: 0.7567 Dice: 0.5100 2022-08-24 03:46:27 [INFO] [EVAL] Class IoU: [0.6875 0.7854 0.9326 0.7357 0.6816 0.7674 0.7772 0.8 0.5256 0.6619 0.4853 0.5695 0.7112 0.331 0.3014 0.43 0.4774 0.4315 0.603 0.4398 0.7679 0.4777 0.6266 0.5385 0.34 0.4501 0.4761 0.4321 0.4499 0.306 0.2914 0.5233 0.2399 0.3398 0.3551 0.4111 0.472 0.5686 0.2713 0.3827 0.1597 0.1307 0.3598 0.2816 0.256 0.2259 0.4336 0.5112 0.5038 0.5007 0.5757 0.3583 0.162 0.2614 0.6932 0.3352 0.8821 0.4115 0.4276 0.3435 0.0894 0.2338 0.3254 0.1811 0.4591 0.738 0.2998 0.3952 0.1271 0.3785 0.5142 0.5413 0.3526 0.2584 0.4768 0.3762 0.4864 0.2681 0.2447 0.5073 0.6873 0.4036 0.4388 0.0352 0.1957 0.5295 0.0945 0.1299 0.2833 0.5002 0.4253 0.0275 0.1865 0.1217 0.0397 0.0557 0.1875 0.1931 0.2676 0.3316 0.1171 0.1497 0.2769 0.5179 0.1884 0.4278 0.1277 0.5434 0.1082 0.3785 0.2076 0.3505 0.1897 0.6024 0.6931 0.0843 0.4168 0.637 0.1636 0.3325 0.403 0.0837 0.3475 0.1473 0.3104 0.2029 0.5503 0.4575 0.5639 0.3751 0.5863 0.0662 0.2863 0.3263 0.2893 0.2028 0.1447 0.0422 0.2172 0.4044 0.2112 0.0022 0.332 0.4872 0.2639 0.0105 0.4214 0.0417 0.1139 0.2163] 2022-08-24 03:46:27 [INFO] [EVAL] Class Precision: [0.7838 0.8409 0.9674 0.8352 0.7563 0.8782 0.8766 0.8616 0.6757 0.7779 0.6941 0.7338 0.7874 0.5278 0.5885 0.6291 0.6839 0.6794 0.7347 0.6235 0.839 0.6663 0.7706 0.6616 0.4604 0.7126 0.5844 0.6757 0.7811 0.4843 0.5307 0.6793 0.4899 0.4329 0.486 0.5407 0.7143 0.823 0.4224 0.606 0.3527 0.4167 0.5924 0.4723 0.3436 0.4676 0.7995 0.6875 0.6583 0.616 0.8121 0.4764 0.3736 0.5404 0.749 0.7207 0.925 0.6396 0.7692 0.7328 0.1474 0.4909 0.5187 0.6423 0.5397 0.8703 0.4942 0.552 0.3291 0.6848 0.7201 0.7059 0.5855 0.3553 0.7113 0.5693 0.7077 0.6069 0.7219 0.715 0.8468 0.6528 0.7556 0.0934 0.3232 0.6625 0.6036 0.3902 0.8442 0.6404 0.5909 0.0605 0.2744 0.3022 0.1438 0.1597 0.5626 0.427 0.3555 0.5407 0.8141 0.2886 0.555 0.808 0.7762 0.4755 0.4252 0.765 0.2091 0.4201 0.4071 0.429 0.5508 0.6972 0.6989 
0.3086 0.6756 0.7604 0.2436 0.5402 0.7426 0.3484 0.5905 0.7727 0.7273 0.596 0.8584 0.5655 0.8296 0.6092 0.6685 0.4305 0.672 0.8822 0.6771 0.4161 0.3428 0.1402 0.4854 0.5992 0.477 0.0206 0.5567 0.5809 0.603 0.0298 0.838 0.2413 0.2581 0.7663] 2022-08-24 03:46:27 [INFO] [EVAL] Class Recall: [0.8484 0.9225 0.9628 0.8606 0.8734 0.8588 0.8727 0.9179 0.703 0.8162 0.6174 0.7179 0.8803 0.4704 0.3819 0.5761 0.6126 0.5418 0.7709 0.5989 0.9006 0.6279 0.7702 0.7433 0.5652 0.55 0.7199 0.5451 0.5149 0.4538 0.3926 0.695 0.3197 0.6123 0.5686 0.6316 0.5818 0.6478 0.4313 0.5094 0.2259 0.1599 0.4781 0.4108 0.5009 0.3041 0.4865 0.6659 0.6821 0.7279 0.6641 0.5911 0.2225 0.3362 0.9029 0.3853 0.95 0.5357 0.4906 0.3927 0.1853 0.3086 0.4662 0.2014 0.7545 0.8292 0.4326 0.5819 0.1716 0.4584 0.6426 0.6989 0.4699 0.4868 0.5913 0.5258 0.6087 0.3244 0.2702 0.6359 0.7849 0.5139 0.5114 0.0534 0.3317 0.725 0.1008 0.163 0.2989 0.6955 0.6027 0.048 0.3681 0.1692 0.0519 0.0788 0.2195 0.2606 0.5198 0.4617 0.1203 0.2374 0.3559 0.5906 0.1992 0.8101 0.1543 0.6523 0.1831 0.7926 0.2975 0.6568 0.2244 0.8158 0.9883 0.1039 0.521 0.7969 0.3327 0.4637 0.4684 0.0993 0.4578 0.154 0.3513 0.2353 0.6053 0.7054 0.6378 0.494 0.8268 0.0725 0.3328 0.3412 0.3356 0.2834 0.2002 0.0569 0.2822 0.5543 0.2748 0.0024 0.4513 0.7511 0.3195 0.0159 0.4587 0.0479 0.1694 0.2316] 2022-08-24 03:46:27 [INFO] [EVAL] The model with the best validation mIoU (0.3712) was saved at iter 96000. 2022-08-24 03:46:38 [INFO] [TRAIN] epoch: 78, iter: 98050/160000, loss: 0.4490, lr: 0.000469, batch_cost: 0.2054, reader_cost: 0.00453, ips: 38.9544 samples/sec | ETA 03:32:02 2022-08-24 03:46:47 [INFO] [TRAIN] epoch: 78, iter: 98100/160000, loss: 0.4736, lr: 0.000469, batch_cost: 0.1878, reader_cost: 0.00064, ips: 42.5958 samples/sec | ETA 03:13:45 2022-08-24 03:46:58 [INFO] [TRAIN] epoch: 78, iter: 98150/160000, loss: 0.4396, lr: 0.000468, batch_cost: 0.2216, reader_cost: 0.00161, ips: 36.1028 samples/sec | ETA 03:48:25 2022-08-24 03:47:08 [INFO] [TRAIN] epoch: 78, iter: 98200/160000, loss: 0.4905, lr: 0.000468, batch_cost: 0.1852, reader_cost: 0.00033, ips: 43.1967 samples/sec | ETA 03:10:45 2022-08-24 03:47:16 [INFO] [TRAIN] epoch: 78, iter: 98250/160000, loss: 0.4550, lr: 0.000468, batch_cost: 0.1761, reader_cost: 0.00059, ips: 45.4338 samples/sec | ETA 03:01:12 2022-08-24 03:47:27 [INFO] [TRAIN] epoch: 78, iter: 98300/160000, loss: 0.4729, lr: 0.000467, batch_cost: 0.2152, reader_cost: 0.00078, ips: 37.1725 samples/sec | ETA 03:41:18 2022-08-24 03:47:37 [INFO] [TRAIN] epoch: 78, iter: 98350/160000, loss: 0.4776, lr: 0.000467, batch_cost: 0.1973, reader_cost: 0.00058, ips: 40.5390 samples/sec | ETA 03:22:46 2022-08-24 03:47:46 [INFO] [TRAIN] epoch: 78, iter: 98400/160000, loss: 0.4453, lr: 0.000466, batch_cost: 0.1841, reader_cost: 0.00054, ips: 43.4434 samples/sec | ETA 03:09:03 2022-08-24 03:47:54 [INFO] [TRAIN] epoch: 78, iter: 98450/160000, loss: 0.4560, lr: 0.000466, batch_cost: 0.1657, reader_cost: 0.00041, ips: 48.2800 samples/sec | ETA 02:49:58 2022-08-24 03:48:04 [INFO] [TRAIN] epoch: 78, iter: 98500/160000, loss: 0.4463, lr: 0.000466, batch_cost: 0.1912, reader_cost: 0.00317, ips: 41.8491 samples/sec | ETA 03:15:56 2022-08-24 03:48:20 [INFO] [TRAIN] epoch: 79, iter: 98550/160000, loss: 0.4209, lr: 0.000465, batch_cost: 0.3220, reader_cost: 0.14725, ips: 24.8427 samples/sec | ETA 05:29:48 2022-08-24 03:48:29 [INFO] [TRAIN] epoch: 79, iter: 98600/160000, loss: 0.4470, lr: 0.000465, batch_cost: 0.1779, reader_cost: 0.00072, ips: 44.9667 samples/sec | ETA 03:02:03 
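Note on the [TRAIN] timing fields: `ips` (samples/sec) multiplied by `batch_cost` (sec/iter) recovers the global batch size, and the printed ETA is consistent with remaining iterations times the current `batch_cost`. A minimal sketch of that arithmetic, using the iter-98600 entry directly above (batch_cost 0.1779, ips 44.9667, ETA 03:02:03); this is a reading aid written for this log, not PaddleSeg code:

```python
import datetime

# Values copied from the iter-98600 [TRAIN] entry above.
iters_done, total_iters = 98600, 160000
batch_cost = 0.1779   # seconds per training iteration
ips = 44.9667         # samples processed per second

# Implied global batch size: samples/sec * sec/iter = samples per iteration (~8 here).
print(round(ips * batch_cost, 2))   # 8.0

# The printed ETA matches remaining iterations * current batch_cost.
eta = datetime.timedelta(seconds=int((total_iters - iters_done) * batch_cost))
print(eta)                          # 3:02:03, i.e. the "ETA 03:02:03" above
```

The same check holds for the other entries (e.g. iter 99000: 61000 × 0.1917 s ≈ 3:14:53), so ETA swings in this log track the per-iteration cost, not any change in remaining work.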
2022-08-24 03:48:39 [INFO] [TRAIN] epoch: 79, iter: 98650/160000, loss: 0.4334, lr: 0.000464, batch_cost: 0.1955, reader_cost: 0.00032, ips: 40.9269 samples/sec | ETA 03:19:52 2022-08-24 03:48:48 [INFO] [TRAIN] epoch: 79, iter: 98700/160000, loss: 0.4449, lr: 0.000464, batch_cost: 0.1782, reader_cost: 0.00089, ips: 44.8951 samples/sec | ETA 03:02:03 2022-08-24 03:48:57 [INFO] [TRAIN] epoch: 79, iter: 98750/160000, loss: 0.4735, lr: 0.000464, batch_cost: 0.1828, reader_cost: 0.00079, ips: 43.7677 samples/sec | ETA 03:06:35 2022-08-24 03:49:06 [INFO] [TRAIN] epoch: 79, iter: 98800/160000, loss: 0.4738, lr: 0.000463, batch_cost: 0.1802, reader_cost: 0.00067, ips: 44.3901 samples/sec | ETA 03:03:49 2022-08-24 03:49:14 [INFO] [TRAIN] epoch: 79, iter: 98850/160000, loss: 0.4471, lr: 0.000463, batch_cost: 0.1666, reader_cost: 0.00108, ips: 48.0192 samples/sec | ETA 02:49:47 2022-08-24 03:49:23 [INFO] [TRAIN] epoch: 79, iter: 98900/160000, loss: 0.4504, lr: 0.000463, batch_cost: 0.1684, reader_cost: 0.00079, ips: 47.4990 samples/sec | ETA 02:51:30 2022-08-24 03:49:31 [INFO] [TRAIN] epoch: 79, iter: 98950/160000, loss: 0.4845, lr: 0.000462, batch_cost: 0.1611, reader_cost: 0.00086, ips: 49.6585 samples/sec | ETA 02:43:55 2022-08-24 03:49:40 [INFO] [TRAIN] epoch: 79, iter: 99000/160000, loss: 0.4724, lr: 0.000462, batch_cost: 0.1917, reader_cost: 0.00056, ips: 41.7323 samples/sec | ETA 03:14:53 2022-08-24 03:49:40 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 163s - batch_cost: 0.1624 - reader cost: 5.2478e-04 2022-08-24 03:52:23 [INFO] [EVAL] #Images: 2000 mIoU: 0.3715 Acc: 0.7755 Kappa: 0.7581 Dice: 0.5098 2022-08-24 03:52:23 [INFO] [EVAL] Class IoU: [0.6897 0.7902 0.9324 0.7429 0.6895 0.7649 0.7806 0.8086 0.537 0.646 0.5039 0.5712 0.7165 0.329 0.3304 0.4449 0.4709 0.4435 0.6043 0.4368 0.7659 0.4235 0.6234 0.5218 0.3246 0.4879 0.4394 0.4633 0.4511 0.2719 0.2996 0.5406 0.2606 0.3638 0.3443 0.4064 0.4861 0.5316 0.28 0.3493 0.2057 0.1109 0.3579 0.276 0.2405 0.2367 0.2919 0.4985 0.4855 0.5217 0.5924 0.352 0.1513 0.2364 0.6897 0.3678 0.8875 0.4406 0.4275 0.2995 0.0856 0.232 0.2871 0.1939 0.5011 0.7177 0.2972 0.3874 0.1058 0.3769 0.5124 0.5813 0.3313 0.2312 0.4903 0.3661 0.543 0.2817 0.2359 0.4626 0.7039 0.4203 0.4313 0.0235 0.1046 0.5263 0.1296 0.1203 0.2844 0.5344 0.4596 0.1032 0.2205 0.0989 0.0365 0.0372 0.1738 0.19 0.2954 0.2893 0.1153 0.1223 0.2917 0.6299 0.1927 0.4306 0.1166 0.5881 0.0859 0.3554 0.1996 0.4751 0.196 0.5408 0.6798 0.0536 0.3923 0.666 0.1492 0.368 0.4148 0.0786 0.3436 0.171 0.3129 0.2213 0.5285 0.4693 0.5062 0.3887 0.5807 0.0624 0.2484 0.3829 0.3453 0.1828 0.1363 0.0457 0.212 0.3709 0.1891 0.0633 0.3867 0.4194 0.2617 0.0171 0.403 0.0388 0.137 0.2186] 2022-08-24 03:52:23 [INFO] [EVAL] Class Precision: [0.7745 0.867 0.9643 0.8247 0.7669 0.8921 0.8711 0.8733 0.6924 0.7606 0.7218 0.7322 0.7923 0.5433 0.566 0.6183 0.6459 0.668 0.7468 0.6263 0.8444 0.6409 0.7594 0.6235 0.4675 0.6311 0.5194 0.7378 0.7625 0.4844 0.4692 0.7217 0.5714 0.4744 0.45 0.5867 0.6826 0.8431 0.4531 0.6329 0.3995 0.391 0.6777 0.526 0.3655 0.4685 0.7846 0.6496 0.6931 0.7053 0.7535 0.4719 0.3139 0.6894 0.7424 0.7074 0.9465 0.6519 0.7571 0.5855 0.1344 0.4517 0.3867 0.7209 0.6453 0.8191 0.5091 0.5041 0.3169 0.6719 0.6662 0.7285 0.546 0.3565 0.7554 0.5504 0.7263 0.5219 0.7296 0.6533 0.8586 0.673 0.7907 0.0901 0.1759 0.7046 0.4226 0.4232 0.5946 0.6951 0.6818 0.1512 0.3533 0.3243 0.1568 0.1121 0.505 0.4743 0.4278 0.5685 0.5661 0.1839 0.6619 0.8284 0.8158 0.459 0.2915 0.7579 0.1948 0.5142 
0.487 0.682 0.6597 0.8416 0.6908 0.3403 0.6584 0.753 0.3262 0.5505 0.7019 0.5184 0.7066 0.7269 0.7444 0.5723 0.7849 0.5947 0.7111 0.772 0.7338 0.3798 0.6473 0.8673 0.6468 0.4499 0.4077 0.088 0.4634 0.6717 0.5134 0.1972 0.6154 0.7165 0.667 0.0243 0.9121 0.1622 0.4069 0.6964] 2022-08-24 03:52:23 [INFO] [EVAL] Class Recall: [0.8629 0.8992 0.9657 0.8822 0.8723 0.8429 0.8826 0.9161 0.7051 0.8109 0.6253 0.7221 0.8822 0.4548 0.4424 0.6133 0.6349 0.5689 0.76 0.5908 0.8917 0.5553 0.7768 0.7619 0.5152 0.6826 0.7405 0.5546 0.5248 0.3826 0.4531 0.6829 0.3239 0.6094 0.5945 0.5695 0.6281 0.59 0.4229 0.4381 0.2978 0.134 0.4314 0.3673 0.413 0.3235 0.3174 0.6818 0.6185 0.6671 0.7347 0.5809 0.2261 0.2646 0.9067 0.4338 0.9343 0.5762 0.4954 0.38 0.191 0.323 0.527 0.2096 0.6915 0.8529 0.4166 0.626 0.1371 0.4619 0.6894 0.7421 0.4574 0.3968 0.5828 0.5223 0.6827 0.3797 0.2585 0.6131 0.7962 0.5282 0.4868 0.0309 0.2052 0.6754 0.1574 0.1439 0.3528 0.698 0.5851 0.2453 0.3697 0.1245 0.0455 0.0527 0.2094 0.2406 0.4884 0.3707 0.1265 0.2675 0.3428 0.7244 0.2014 0.8743 0.1628 0.7241 0.1333 0.535 0.2527 0.6103 0.2181 0.6021 0.977 0.0599 0.4925 0.8521 0.2156 0.526 0.5035 0.0847 0.4008 0.1827 0.3506 0.2652 0.6179 0.69 0.6373 0.4391 0.7357 0.0695 0.2873 0.4067 0.4256 0.2354 0.17 0.0868 0.281 0.4531 0.2304 0.0852 0.5099 0.5028 0.3011 0.0544 0.4193 0.0486 0.1712 0.2417] 2022-08-24 03:52:23 [INFO] [EVAL] The model with the best validation mIoU (0.3715) was saved at iter 99000. 2022-08-24 03:52:33 [INFO] [TRAIN] epoch: 79, iter: 99050/160000, loss: 0.4415, lr: 0.000461, batch_cost: 0.1933, reader_cost: 0.00365, ips: 41.3907 samples/sec | ETA 03:16:20 2022-08-24 03:52:43 [INFO] [TRAIN] epoch: 79, iter: 99100/160000, loss: 0.4458, lr: 0.000461, batch_cost: 0.2059, reader_cost: 0.00182, ips: 38.8486 samples/sec | ETA 03:29:00 2022-08-24 03:52:53 [INFO] [TRAIN] epoch: 79, iter: 99150/160000, loss: 0.4262, lr: 0.000461, batch_cost: 0.1965, reader_cost: 0.00060, ips: 40.7025 samples/sec | ETA 03:19:19 2022-08-24 03:53:02 [INFO] [TRAIN] epoch: 79, iter: 99200/160000, loss: 0.4147, lr: 0.000460, batch_cost: 0.1863, reader_cost: 0.00042, ips: 42.9438 samples/sec | ETA 03:08:46 2022-08-24 03:53:13 [INFO] [TRAIN] epoch: 79, iter: 99250/160000, loss: 0.4453, lr: 0.000460, batch_cost: 0.2108, reader_cost: 0.00038, ips: 37.9532 samples/sec | ETA 03:33:25 2022-08-24 03:53:24 [INFO] [TRAIN] epoch: 79, iter: 99300/160000, loss: 0.4571, lr: 0.000460, batch_cost: 0.2168, reader_cost: 0.00150, ips: 36.8924 samples/sec | ETA 03:39:22 2022-08-24 03:53:34 [INFO] [TRAIN] epoch: 79, iter: 99350/160000, loss: 0.4269, lr: 0.000459, batch_cost: 0.2115, reader_cost: 0.00035, ips: 37.8208 samples/sec | ETA 03:33:48 2022-08-24 03:53:45 [INFO] [TRAIN] epoch: 79, iter: 99400/160000, loss: 0.5060, lr: 0.000459, batch_cost: 0.2052, reader_cost: 0.00039, ips: 38.9954 samples/sec | ETA 03:27:12 2022-08-24 03:53:53 [INFO] [TRAIN] epoch: 79, iter: 99450/160000, loss: 0.4541, lr: 0.000458, batch_cost: 0.1571, reader_cost: 0.00054, ips: 50.9230 samples/sec | ETA 02:38:32 2022-08-24 03:54:01 [INFO] [TRAIN] epoch: 79, iter: 99500/160000, loss: 0.4490, lr: 0.000458, batch_cost: 0.1622, reader_cost: 0.00059, ips: 49.3342 samples/sec | ETA 02:43:30 2022-08-24 03:54:09 [INFO] [TRAIN] epoch: 79, iter: 99550/160000, loss: 0.4806, lr: 0.000458, batch_cost: 0.1756, reader_cost: 0.00063, ips: 45.5544 samples/sec | ETA 02:56:55 2022-08-24 03:54:18 [INFO] [TRAIN] epoch: 79, iter: 99600/160000, loss: 0.4906, lr: 0.000457, batch_cost: 0.1780, reader_cost: 0.00071, ips: 44.9336 
samples/sec | ETA 02:59:13 2022-08-24 03:54:26 [INFO] [TRAIN] epoch: 79, iter: 99650/160000, loss: 0.4908, lr: 0.000457, batch_cost: 0.1526, reader_cost: 0.00100, ips: 52.4288 samples/sec | ETA 02:33:28 2022-08-24 03:54:34 [INFO] [TRAIN] epoch: 79, iter: 99700/160000, loss: 0.4957, lr: 0.000457, batch_cost: 0.1553, reader_cost: 0.00084, ips: 51.5036 samples/sec | ETA 02:36:06 2022-08-24 03:54:43 [INFO] [TRAIN] epoch: 79, iter: 99750/160000, loss: 0.5006, lr: 0.000456, batch_cost: 0.1769, reader_cost: 0.00102, ips: 45.2241 samples/sec | ETA 02:57:38 2022-08-24 03:54:54 [INFO] [TRAIN] epoch: 80, iter: 99800/160000, loss: 0.4700, lr: 0.000456, batch_cost: 0.2266, reader_cost: 0.06486, ips: 35.2975 samples/sec | ETA 03:47:24 2022-08-24 03:55:03 [INFO] [TRAIN] epoch: 80, iter: 99850/160000, loss: 0.4327, lr: 0.000455, batch_cost: 0.1788, reader_cost: 0.00088, ips: 44.7375 samples/sec | ETA 02:59:16 2022-08-24 03:55:11 [INFO] [TRAIN] epoch: 80, iter: 99900/160000, loss: 0.4552, lr: 0.000455, batch_cost: 0.1635, reader_cost: 0.00049, ips: 48.9308 samples/sec | ETA 02:43:46 2022-08-24 03:55:19 [INFO] [TRAIN] epoch: 80, iter: 99950/160000, loss: 0.4378, lr: 0.000455, batch_cost: 0.1599, reader_cost: 0.00049, ips: 50.0379 samples/sec | ETA 02:40:00 2022-08-24 03:55:27 [INFO] [TRAIN] epoch: 80, iter: 100000/160000, loss: 0.4768, lr: 0.000454, batch_cost: 0.1690, reader_cost: 0.00049, ips: 47.3261 samples/sec | ETA 02:49:02 2022-08-24 03:55:27 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1594 - reader cost: 6.4721e-04 2022-08-24 03:58:07 [INFO] [EVAL] #Images: 2000 mIoU: 0.3703 Acc: 0.7739 Kappa: 0.7566 Dice: 0.5082 2022-08-24 03:58:07 [INFO] [EVAL] Class IoU: [0.6931 0.7916 0.9313 0.7359 0.6798 0.772 0.78 0.7989 0.5269 0.6622 0.4938 0.557 0.7219 0.321 0.3276 0.4427 0.4714 0.4154 0.6061 0.4069 0.7742 0.4454 0.6373 0.5121 0.3129 0.3532 0.4832 0.4557 0.39 0.3236 0.2449 0.5485 0.2708 0.3416 0.338 0.4315 0.4857 0.5784 0.2892 0.3607 0.2073 0.1457 0.3617 0.2778 0.2628 0.2327 0.3664 0.5533 0.6035 0.5252 0.5821 0.3584 0.1454 0.2044 0.6838 0.4206 0.8842 0.3991 0.3892 0.2967 0.088 0.2806 0.3091 0.1565 0.4853 0.7317 0.233 0.3838 0.1106 0.3635 0.4591 0.5743 0.3458 0.2249 0.4713 0.3741 0.509 0.258 0.2291 0.1926 0.7062 0.4096 0.3631 0.0307 0.2204 0.5339 0.0814 0.1233 0.4686 0.5352 0.4431 0.0745 0.2244 0.1121 0.0319 0.054 0.1804 0.1646 0.2913 0.2845 0.1951 0.1137 0.2788 0.708 0.1975 0.4012 0.0981 0.5274 0.0865 0.3633 0.2012 0.385 0.1781 0.5779 0.6774 0.0641 0.4132 0.6793 0.1932 0.3521 0.4439 0.0836 0.363 0.1837 0.304 0.2026 0.5384 0.4231 0.5473 0.4109 0.6036 0.0591 0.3177 0.4725 0.2983 0.2109 0.1538 0.0452 0.1941 0.4028 0.2158 0.0356 0.2938 0.2498 0.3002 0.0097 0.4196 0.0356 0.1273 0.1839] 2022-08-24 03:58:07 [INFO] [EVAL] Class Precision: [0.7968 0.8477 0.9604 0.815 0.7695 0.8748 0.8797 0.85 0.66 0.7788 0.709 0.6945 0.8074 0.5553 0.5415 0.5823 0.6923 0.6804 0.7581 0.6591 0.8473 0.6572 0.7714 0.5881 0.4863 0.6495 0.5636 0.6777 0.7457 0.5019 0.4201 0.6923 0.5758 0.4401 0.4311 0.5542 0.6868 0.7782 0.441 0.6258 0.3969 0.3405 0.5948 0.526 0.3838 0.4994 0.7137 0.7878 0.7291 0.6238 0.772 0.4897 0.3657 0.459 0.7335 0.6994 0.9403 0.7199 0.6968 0.5117 0.1369 0.4984 0.4131 0.7629 0.6081 0.8382 0.3832 0.5156 0.2478 0.6405 0.7069 0.7652 0.5189 0.3486 0.6707 0.5563 0.7112 0.6555 0.6464 0.7288 0.8646 0.6996 0.829 0.0904 0.3207 0.7176 0.5127 0.4279 0.7922 0.755 0.602 0.0883 0.4545 0.3389 0.0988 0.1493 0.4175 0.4197 0.4392 0.7548 0.6175 0.2002 0.5369 0.8354 0.8147 0.4285 
0.3082 0.7621 0.1654 0.4923 0.4415 0.4722 0.4582 0.7158 0.6827 0.2953 0.6784 0.7608 0.3476 0.4671 0.6988 0.4436 0.6543 0.7484 0.7688 0.5715 0.8106 0.5089 0.7964 0.6815 0.7776 0.3484 0.5535 0.7528 0.6249 0.3971 0.3533 0.1122 0.395 0.6339 0.3793 0.0589 0.6223 0.6891 0.6635 0.0147 0.8653 0.1733 0.2318 0.7566] 2022-08-24 03:58:07 [INFO] [EVAL] Class Recall: [0.8419 0.9229 0.9686 0.8835 0.8537 0.8679 0.8731 0.93 0.7232 0.8155 0.6193 0.7377 0.8721 0.4321 0.4534 0.6487 0.5964 0.5161 0.7514 0.5153 0.8998 0.5802 0.7857 0.7984 0.4673 0.4364 0.7721 0.5818 0.4498 0.4766 0.3699 0.7252 0.3383 0.6042 0.6101 0.6608 0.6238 0.6925 0.4566 0.4599 0.3027 0.2031 0.48 0.3706 0.4545 0.3035 0.4296 0.6502 0.778 0.7687 0.703 0.5719 0.1944 0.2693 0.9098 0.5133 0.9368 0.4724 0.4686 0.4139 0.1978 0.3909 0.551 0.1645 0.7063 0.8521 0.3727 0.6002 0.1666 0.4567 0.5669 0.6971 0.5091 0.388 0.6133 0.5332 0.6417 0.2984 0.2619 0.2075 0.7941 0.497 0.3924 0.0445 0.4134 0.6758 0.0882 0.1476 0.5343 0.6476 0.6268 0.3234 0.3072 0.1434 0.0451 0.0781 0.241 0.2131 0.4638 0.3135 0.2219 0.2084 0.3671 0.8228 0.2067 0.8633 0.1258 0.6314 0.1535 0.581 0.2698 0.6757 0.2256 0.7501 0.9888 0.0757 0.5139 0.8638 0.3031 0.5884 0.5489 0.0934 0.4491 0.1958 0.3346 0.2389 0.6159 0.7149 0.6364 0.5085 0.7296 0.0664 0.4272 0.5593 0.3633 0.3102 0.2141 0.0703 0.2762 0.5249 0.3337 0.0825 0.3575 0.2815 0.3541 0.0278 0.449 0.0429 0.2203 0.1955] 2022-08-24 03:58:07 [INFO] [EVAL] The model with the best validation mIoU (0.3715) was saved at iter 99000. 2022-08-24 03:58:18 [INFO] [TRAIN] epoch: 80, iter: 100050/160000, loss: 0.5172, lr: 0.000454, batch_cost: 0.2057, reader_cost: 0.00480, ips: 38.8923 samples/sec | ETA 03:25:31 2022-08-24 03:58:28 [INFO] [TRAIN] epoch: 80, iter: 100100/160000, loss: 0.4582, lr: 0.000454, batch_cost: 0.2036, reader_cost: 0.00199, ips: 39.2862 samples/sec | ETA 03:23:17 2022-08-24 03:58:38 [INFO] [TRAIN] epoch: 80, iter: 100150/160000, loss: 0.4479, lr: 0.000453, batch_cost: 0.1964, reader_cost: 0.00036, ips: 40.7382 samples/sec | ETA 03:15:53 2022-08-24 03:58:49 [INFO] [TRAIN] epoch: 80, iter: 100200/160000, loss: 0.4374, lr: 0.000453, batch_cost: 0.2174, reader_cost: 0.00054, ips: 36.7988 samples/sec | ETA 03:36:40 2022-08-24 03:58:59 [INFO] [TRAIN] epoch: 80, iter: 100250/160000, loss: 0.4344, lr: 0.000452, batch_cost: 0.2126, reader_cost: 0.00037, ips: 37.6210 samples/sec | ETA 03:31:45 2022-08-24 03:59:09 [INFO] [TRAIN] epoch: 80, iter: 100300/160000, loss: 0.4619, lr: 0.000452, batch_cost: 0.1896, reader_cost: 0.00067, ips: 42.1848 samples/sec | ETA 03:08:41 2022-08-24 03:59:18 [INFO] [TRAIN] epoch: 80, iter: 100350/160000, loss: 0.4573, lr: 0.000452, batch_cost: 0.1867, reader_cost: 0.00072, ips: 42.8428 samples/sec | ETA 03:05:38 2022-08-24 03:59:28 [INFO] [TRAIN] epoch: 80, iter: 100400/160000, loss: 0.4651, lr: 0.000451, batch_cost: 0.2017, reader_cost: 0.00035, ips: 39.6602 samples/sec | ETA 03:20:22 2022-08-24 03:59:36 [INFO] [TRAIN] epoch: 80, iter: 100450/160000, loss: 0.4632, lr: 0.000451, batch_cost: 0.1652, reader_cost: 0.00035, ips: 48.4300 samples/sec | ETA 02:43:56 2022-08-24 03:59:46 [INFO] [TRAIN] epoch: 80, iter: 100500/160000, loss: 0.4779, lr: 0.000450, batch_cost: 0.1898, reader_cost: 0.00068, ips: 42.1597 samples/sec | ETA 03:08:10 2022-08-24 03:59:56 [INFO] [TRAIN] epoch: 80, iter: 100550/160000, loss: 0.4598, lr: 0.000450, batch_cost: 0.1966, reader_cost: 0.00072, ips: 40.6894 samples/sec | ETA 03:14:48 2022-08-24 04:00:04 [INFO] [TRAIN] epoch: 80, iter: 100600/160000, loss: 0.4583, lr: 0.000450, 
batch_cost: 0.1757, reader_cost: 0.00053, ips: 45.5336 samples/sec | ETA 02:53:56 2022-08-24 04:00:15 [INFO] [TRAIN] epoch: 80, iter: 100650/160000, loss: 0.4653, lr: 0.000449, batch_cost: 0.2070, reader_cost: 0.00048, ips: 38.6428 samples/sec | ETA 03:24:46 2022-08-24 04:00:25 [INFO] [TRAIN] epoch: 80, iter: 100700/160000, loss: 0.4226, lr: 0.000449, batch_cost: 0.2007, reader_cost: 0.00063, ips: 39.8532 samples/sec | ETA 03:18:23 2022-08-24 04:00:33 [INFO] [TRAIN] epoch: 80, iter: 100750/160000, loss: 0.4633, lr: 0.000449, batch_cost: 0.1627, reader_cost: 0.00051, ips: 49.1635 samples/sec | ETA 02:40:41 2022-08-24 04:00:42 [INFO] [TRAIN] epoch: 80, iter: 100800/160000, loss: 0.4708, lr: 0.000448, batch_cost: 0.1871, reader_cost: 0.00033, ips: 42.7630 samples/sec | ETA 03:04:34 2022-08-24 04:00:52 [INFO] [TRAIN] epoch: 80, iter: 100850/160000, loss: 0.4350, lr: 0.000448, batch_cost: 0.1967, reader_cost: 0.00120, ips: 40.6781 samples/sec | ETA 03:13:52 2022-08-24 04:01:01 [INFO] [TRAIN] epoch: 80, iter: 100900/160000, loss: 0.4611, lr: 0.000447, batch_cost: 0.1689, reader_cost: 0.00042, ips: 47.3682 samples/sec | ETA 02:46:21 2022-08-24 04:01:10 [INFO] [TRAIN] epoch: 80, iter: 100950/160000, loss: 0.4525, lr: 0.000447, batch_cost: 0.1806, reader_cost: 0.00076, ips: 44.2926 samples/sec | ETA 02:57:45 2022-08-24 04:01:18 [INFO] [TRAIN] epoch: 80, iter: 101000/160000, loss: 0.4508, lr: 0.000447, batch_cost: 0.1702, reader_cost: 0.00049, ips: 47.0062 samples/sec | ETA 02:47:21 2022-08-24 04:01:18 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 178s - batch_cost: 0.1784 - reader cost: 5.9997e-04 2022-08-24 04:04:17 [INFO] [EVAL] #Images: 2000 mIoU: 0.3712 Acc: 0.7743 Kappa: 0.7570 Dice: 0.5087 2022-08-24 04:04:17 [INFO] [EVAL] Class IoU: [0.6943 0.7895 0.9324 0.7388 0.6924 0.7673 0.7827 0.8083 0.5398 0.6337 0.4999 0.5729 0.7209 0.2957 0.3058 0.4489 0.5019 0.4376 0.6252 0.4327 0.7652 0.4313 0.6251 0.512 0.3446 0.4494 0.4983 0.4469 0.4464 0.2846 0.3223 0.539 0.286 0.3531 0.3272 0.3944 0.4849 0.5532 0.2862 0.3938 0.1702 0.1159 0.3536 0.2759 0.2611 0.2246 0.2975 0.5308 0.551 0.522 0.5855 0.3148 0.1768 0.2604 0.6758 0.388 0.8857 0.3998 0.3992 0.3058 0.0695 0.2372 0.3311 0.1594 0.4794 0.6889 0.2914 0.3807 0.101 0.3548 0.5015 0.5427 0.3416 0.2335 0.4977 0.3746 0.528 0.2668 0.2844 0.4221 0.6841 0.4199 0.3814 0.029 0.088 0.5263 0.1166 0.1225 0.3197 0.553 0.4506 0.0445 0.2346 0.136 0.0515 0.0572 0.1706 0.1987 0.2573 0.2807 0.1788 0.1107 0.2773 0.7078 0.1905 0.4015 0.0809 0.5438 0.0939 0.3235 0.2046 0.3498 0.1981 0.688 0.7325 0.0757 0.4346 0.7154 0.2111 0.3189 0.4281 0.0641 0.3552 0.2346 0.2857 0.2063 0.5387 0.4507 0.5397 0.3861 0.6352 0.0648 0.3234 0.3743 0.3233 0.2005 0.1575 0.0334 0.1911 0.3647 0.0871 0.0695 0.2537 0.3532 0.3068 0.0154 0.4176 0.0368 0.1218 0.158 ] 2022-08-24 04:04:17 [INFO] [EVAL] Class Precision: [0.7886 0.866 0.9652 0.8246 0.7723 0.8823 0.881 0.8855 0.6684 0.75 0.6694 0.73 0.8021 0.4851 0.5838 0.5972 0.6485 0.7104 0.8041 0.6452 0.8369 0.6832 0.7777 0.6748 0.4762 0.5975 0.5619 0.6909 0.7726 0.5188 0.443 0.6995 0.4584 0.4549 0.4346 0.5876 0.6777 0.7794 0.4737 0.5686 0.4285 0.335 0.5362 0.5503 0.3802 0.5173 0.4627 0.706 0.7235 0.641 0.8075 0.4051 0.3297 0.6769 0.7191 0.7316 0.9239 0.7137 0.76 0.6254 0.1065 0.4508 0.5067 0.6777 0.5775 0.7848 0.5191 0.5428 0.1727 0.5305 0.7138 0.7419 0.525 0.3816 0.7427 0.6362 0.714 0.595 0.6959 0.5736 0.8105 0.6635 0.8272 0.1103 0.1806 0.7806 0.3972 0.498 0.6022 0.7228 0.6444 0.0525 0.481 0.3474 0.1382 0.1329 0.5168 0.47 
0.3848 0.5904 0.5977 0.199 0.5582 0.8678 0.7562 0.4219 0.2335 0.8104 0.2385 0.4528 0.4517 0.621 0.5435 0.7913 0.7446 0.2855 0.7104 0.7703 0.2804 0.4309 0.7685 0.3884 0.6556 0.5786 0.8079 0.5998 0.8902 0.5775 0.7809 0.8588 0.7736 0.4085 0.5712 0.8072 0.6234 0.4141 0.3666 0.056 0.4056 0.6245 0.2241 0.1343 0.5686 0.7057 0.5589 0.0214 0.8519 0.1766 0.263 0.7723] 2022-08-24 04:04:17 [INFO] [EVAL] Class Recall: [0.8531 0.8993 0.9648 0.8765 0.87 0.8548 0.8752 0.9027 0.7373 0.8033 0.6637 0.727 0.8768 0.4309 0.391 0.644 0.6895 0.5326 0.7376 0.5679 0.8993 0.5392 0.7612 0.6798 0.5551 0.6445 0.8151 0.5586 0.514 0.3867 0.5419 0.7014 0.432 0.612 0.5697 0.5454 0.6302 0.6559 0.4197 0.5615 0.2202 0.1506 0.5094 0.3563 0.4545 0.2841 0.4546 0.6815 0.698 0.7377 0.6805 0.5854 0.2759 0.2974 0.9182 0.4524 0.9555 0.4762 0.4567 0.3744 0.1668 0.3336 0.4887 0.1724 0.7383 0.8494 0.3992 0.5605 0.1956 0.5172 0.6278 0.6689 0.4944 0.3757 0.6014 0.4768 0.6696 0.326 0.3247 0.615 0.8143 0.5336 0.4144 0.0379 0.1466 0.6177 0.1417 0.1397 0.4053 0.7019 0.5998 0.2264 0.314 0.1826 0.076 0.0914 0.203 0.2561 0.437 0.3486 0.2032 0.1996 0.3553 0.7933 0.2029 0.8926 0.1102 0.6231 0.134 0.5312 0.2722 0.4448 0.2376 0.8406 0.9782 0.0934 0.5282 0.9094 0.4607 0.5509 0.4915 0.0714 0.4367 0.2829 0.3065 0.2393 0.577 0.6724 0.636 0.4123 0.7803 0.0716 0.427 0.411 0.4018 0.28 0.2163 0.0764 0.2655 0.4671 0.1248 0.1259 0.3142 0.4142 0.4049 0.0521 0.4503 0.0444 0.185 0.1657] 2022-08-24 04:04:17 [INFO] [EVAL] The model with the best validation mIoU (0.3715) was saved at iter 99000. 2022-08-24 04:04:30 [INFO] [TRAIN] epoch: 81, iter: 101050/160000, loss: 0.4805, lr: 0.000446, batch_cost: 0.2655, reader_cost: 0.06904, ips: 30.1346 samples/sec | ETA 04:20:49 2022-08-24 04:04:41 [INFO] [TRAIN] epoch: 81, iter: 101100/160000, loss: 0.4699, lr: 0.000446, batch_cost: 0.2143, reader_cost: 0.00069, ips: 37.3393 samples/sec | ETA 03:30:19 2022-08-24 04:04:51 [INFO] [TRAIN] epoch: 81, iter: 101150/160000, loss: 0.4560, lr: 0.000446, batch_cost: 0.1920, reader_cost: 0.00074, ips: 41.6601 samples/sec | ETA 03:08:20 2022-08-24 04:05:01 [INFO] [TRAIN] epoch: 81, iter: 101200/160000, loss: 0.3978, lr: 0.000445, batch_cost: 0.2099, reader_cost: 0.00054, ips: 38.1218 samples/sec | ETA 03:25:39 2022-08-24 04:05:12 [INFO] [TRAIN] epoch: 81, iter: 101250/160000, loss: 0.4628, lr: 0.000445, batch_cost: 0.2114, reader_cost: 0.00095, ips: 37.8365 samples/sec | ETA 03:27:01 2022-08-24 04:05:22 [INFO] [TRAIN] epoch: 81, iter: 101300/160000, loss: 0.4789, lr: 0.000444, batch_cost: 0.2126, reader_cost: 0.00099, ips: 37.6285 samples/sec | ETA 03:27:59 2022-08-24 04:05:33 [INFO] [TRAIN] epoch: 81, iter: 101350/160000, loss: 0.4669, lr: 0.000444, batch_cost: 0.2149, reader_cost: 0.00066, ips: 37.2278 samples/sec | ETA 03:30:03 2022-08-24 04:05:43 [INFO] [TRAIN] epoch: 81, iter: 101400/160000, loss: 0.4423, lr: 0.000444, batch_cost: 0.1995, reader_cost: 0.00032, ips: 40.0962 samples/sec | ETA 03:14:51 2022-08-24 04:05:51 [INFO] [TRAIN] epoch: 81, iter: 101450/160000, loss: 0.4576, lr: 0.000443, batch_cost: 0.1589, reader_cost: 0.00059, ips: 50.3529 samples/sec | ETA 02:35:02 2022-08-24 04:06:01 [INFO] [TRAIN] epoch: 81, iter: 101500/160000, loss: 0.4556, lr: 0.000443, batch_cost: 0.1917, reader_cost: 0.00042, ips: 41.7343 samples/sec | ETA 03:06:53 2022-08-24 04:06:10 [INFO] [TRAIN] epoch: 81, iter: 101550/160000, loss: 0.4385, lr: 0.000443, batch_cost: 0.1892, reader_cost: 0.00056, ips: 42.2896 samples/sec | ETA 03:04:17 2022-08-24 04:06:18 [INFO] [TRAIN] epoch: 81, iter: 
101600/160000, loss: 0.4432, lr: 0.000442, batch_cost: 0.1645, reader_cost: 0.00059, ips: 48.6357 samples/sec | ETA 02:40:06 2022-08-24 04:06:27 [INFO] [TRAIN] epoch: 81, iter: 101650/160000, loss: 0.4671, lr: 0.000442, batch_cost: 0.1659, reader_cost: 0.00069, ips: 48.2258 samples/sec | ETA 02:41:19 2022-08-24 04:06:36 [INFO] [TRAIN] epoch: 81, iter: 101700/160000, loss: 0.4683, lr: 0.000441, batch_cost: 0.1789, reader_cost: 0.00110, ips: 44.7239 samples/sec | ETA 02:53:48 2022-08-24 04:06:44 [INFO] [TRAIN] epoch: 81, iter: 101750/160000, loss: 0.4361, lr: 0.000441, batch_cost: 0.1763, reader_cost: 0.00082, ips: 45.3827 samples/sec | ETA 02:51:08 2022-08-24 04:06:54 [INFO] [TRAIN] epoch: 81, iter: 101800/160000, loss: 0.4379, lr: 0.000441, batch_cost: 0.1977, reader_cost: 0.00069, ips: 40.4596 samples/sec | ETA 03:11:47 2022-08-24 04:07:04 [INFO] [TRAIN] epoch: 81, iter: 101850/160000, loss: 0.4551, lr: 0.000440, batch_cost: 0.2012, reader_cost: 0.00042, ips: 39.7545 samples/sec | ETA 03:15:01 2022-08-24 04:07:13 [INFO] [TRAIN] epoch: 81, iter: 101900/160000, loss: 0.4564, lr: 0.000440, batch_cost: 0.1726, reader_cost: 0.00091, ips: 46.3515 samples/sec | ETA 02:47:07 2022-08-24 04:07:22 [INFO] [TRAIN] epoch: 81, iter: 101950/160000, loss: 0.4525, lr: 0.000440, batch_cost: 0.1763, reader_cost: 0.00079, ips: 45.3901 samples/sec | ETA 02:50:31 2022-08-24 04:07:31 [INFO] [TRAIN] epoch: 81, iter: 102000/160000, loss: 0.4206, lr: 0.000439, batch_cost: 0.1809, reader_cost: 0.00054, ips: 44.2136 samples/sec | ETA 02:54:54 2022-08-24 04:07:31 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 173s - batch_cost: 0.1725 - reader cost: 0.0013 2022-08-24 04:10:24 [INFO] [EVAL] #Images: 2000 mIoU: 0.3738 Acc: 0.7750 Kappa: 0.7580 Dice: 0.5111 2022-08-24 04:10:24 [INFO] [EVAL] Class IoU: [0.696 0.7942 0.9305 0.7394 0.6764 0.7686 0.7836 0.7918 0.529 0.6405 0.4907 0.5755 0.7145 0.3279 0.3006 0.4406 0.5186 0.4163 0.6148 0.4402 0.7597 0.403 0.6113 0.5294 0.3311 0.5138 0.448 0.4579 0.4687 0.2849 0.2558 0.5294 0.2765 0.3452 0.3664 0.3952 0.4843 0.5585 0.2841 0.3482 0.1727 0.1207 0.355 0.276 0.2423 0.2463 0.3795 0.5329 0.6301 0.5494 0.5773 0.306 0.1467 0.2736 0.6248 0.327 0.8886 0.4049 0.4189 0.2562 0.1131 0.2796 0.3373 0.1883 0.4585 0.7297 0.2712 0.3906 0.1124 0.371 0.5092 0.61 0.3388 0.2179 0.4991 0.3961 0.5642 0.2816 0.3698 0.451 0.6902 0.4188 0.3403 0.0463 0.1301 0.5236 0.1193 0.1248 0.2916 0.5069 0.421 0.1518 0.2101 0.1152 0.0444 0.0327 0.1164 0.1457 0.2676 0.3265 0.1935 0.1284 0.2724 0.7558 0.1929 0.4081 0.1072 0.5828 0.0801 0.4152 0.2166 0.4238 0.1874 0.7091 0.7148 0.0715 0.3984 0.6365 0.1723 0.308 0.4635 0.0843 0.3826 0.1727 0.2815 0.2256 0.5438 0.4519 0.4879 0.3934 0.6412 0.0427 0.251 0.4426 0.3002 0.2086 0.1706 0.0272 0.1946 0.3896 0.0976 0.0183 0.2718 0.4044 0.2634 0.0092 0.446 0.0334 0.1294 0.177 ] 2022-08-24 04:10:24 [INFO] [EVAL] Class Precision: [0.7973 0.8729 0.9649 0.8267 0.7468 0.8687 0.8894 0.839 0.6673 0.7406 0.6712 0.7165 0.7824 0.5303 0.5827 0.6116 0.7167 0.7253 0.7391 0.6356 0.8194 0.6747 0.7221 0.6372 0.4691 0.6644 0.5524 0.7608 0.7754 0.5188 0.5135 0.6708 0.5551 0.4502 0.4685 0.5302 0.6283 0.8218 0.426 0.6408 0.3433 0.3033 0.5489 0.4814 0.3168 0.5432 0.6216 0.6911 0.7385 0.6735 0.7762 0.3823 0.2817 0.5585 0.701 0.6973 0.9405 0.6764 0.6631 0.6769 0.1496 0.565 0.5322 0.649 0.5545 0.8513 0.4688 0.5114 0.2757 0.6237 0.7632 0.7897 0.4799 0.3343 0.7553 0.5348 0.7642 0.6652 0.721 0.6145 0.8526 0.7228 0.8712 0.1715 0.2351 0.7583 0.3582 0.4004 0.7344 0.6285 0.5603 
0.2477 0.4531 0.3615 0.1317 0.0945 0.5817 0.352 0.393 0.8071 0.4538 0.2223 0.5318 0.8431 0.8092 0.4363 0.2556 0.8179 0.1565 0.5129 0.4598 0.611 0.5924 0.8241 0.7247 0.3102 0.7615 0.7337 0.3479 0.5216 0.741 0.3735 0.6805 0.7612 0.7928 0.6434 0.7831 0.5506 0.6765 0.6022 0.7425 0.2183 0.6619 0.7605 0.6991 0.4241 0.3746 0.0469 0.4708 0.6051 0.1948 0.1679 0.6123 0.7513 0.6074 0.0227 0.8369 0.2236 0.2642 0.789 ] 2022-08-24 04:10:24 [INFO] [EVAL] Class Recall: [0.8456 0.8981 0.9631 0.875 0.8777 0.8696 0.8682 0.9336 0.7185 0.8258 0.646 0.7451 0.8918 0.4621 0.3831 0.6118 0.6523 0.4943 0.7852 0.5887 0.9125 0.5002 0.7993 0.7578 0.5294 0.6938 0.7032 0.5349 0.5423 0.3872 0.3376 0.7152 0.3552 0.5969 0.6269 0.6083 0.6787 0.6355 0.4603 0.4327 0.2579 0.1671 0.5012 0.3928 0.5076 0.3107 0.4935 0.6996 0.8111 0.7488 0.6925 0.6054 0.2344 0.3491 0.8519 0.3811 0.9415 0.5021 0.5322 0.2919 0.3166 0.3564 0.4795 0.2096 0.726 0.8363 0.3916 0.6231 0.1595 0.4781 0.6047 0.7284 0.5353 0.3848 0.5953 0.6044 0.6831 0.3281 0.4316 0.629 0.7837 0.4989 0.3584 0.0597 0.2257 0.6284 0.1517 0.1535 0.326 0.7238 0.6287 0.2817 0.2814 0.1447 0.0628 0.0475 0.127 0.1991 0.4561 0.3541 0.2523 0.2332 0.3584 0.8795 0.2021 0.8631 0.1558 0.6698 0.1409 0.6856 0.2906 0.5804 0.2151 0.8355 0.9812 0.085 0.4552 0.8277 0.2545 0.4293 0.553 0.0982 0.4664 0.1826 0.3038 0.2579 0.6403 0.716 0.6363 0.5316 0.8245 0.0504 0.288 0.5143 0.3447 0.291 0.2386 0.0609 0.249 0.5225 0.1636 0.0201 0.3284 0.4669 0.3175 0.0151 0.4884 0.0378 0.2024 0.1857] 2022-08-24 04:10:24 [INFO] [EVAL] The model with the best validation mIoU (0.3738) was saved at iter 102000. 2022-08-24 04:10:34 [INFO] [TRAIN] epoch: 81, iter: 102050/160000, loss: 0.4394, lr: 0.000439, batch_cost: 0.2072, reader_cost: 0.00379, ips: 38.6062 samples/sec | ETA 03:20:08 2022-08-24 04:10:44 [INFO] [TRAIN] epoch: 81, iter: 102100/160000, loss: 0.4511, lr: 0.000438, batch_cost: 0.2020, reader_cost: 0.00073, ips: 39.6068 samples/sec | ETA 03:14:54 2022-08-24 04:10:54 [INFO] [TRAIN] epoch: 81, iter: 102150/160000, loss: 0.4287, lr: 0.000438, batch_cost: 0.1976, reader_cost: 0.00087, ips: 40.4897 samples/sec | ETA 03:10:30 2022-08-24 04:11:03 [INFO] [TRAIN] epoch: 81, iter: 102200/160000, loss: 0.4583, lr: 0.000438, batch_cost: 0.1800, reader_cost: 0.00475, ips: 44.4430 samples/sec | ETA 02:53:24 2022-08-24 04:11:14 [INFO] [TRAIN] epoch: 81, iter: 102250/160000, loss: 0.4338, lr: 0.000437, batch_cost: 0.2049, reader_cost: 0.00047, ips: 39.0492 samples/sec | ETA 03:17:11 2022-08-24 04:11:22 [INFO] [TRAIN] epoch: 81, iter: 102300/160000, loss: 0.4139, lr: 0.000437, batch_cost: 0.1741, reader_cost: 0.00881, ips: 45.9510 samples/sec | ETA 02:47:25 2022-08-24 04:11:38 [INFO] [TRAIN] epoch: 82, iter: 102350/160000, loss: 0.4767, lr: 0.000436, batch_cost: 0.3157, reader_cost: 0.13125, ips: 25.3398 samples/sec | ETA 05:03:20 2022-08-24 04:11:46 [INFO] [TRAIN] epoch: 82, iter: 102400/160000, loss: 0.4333, lr: 0.000436, batch_cost: 0.1678, reader_cost: 0.00052, ips: 47.6870 samples/sec | ETA 02:41:03 2022-08-24 04:11:56 [INFO] [TRAIN] epoch: 82, iter: 102450/160000, loss: 0.4506, lr: 0.000436, batch_cost: 0.1873, reader_cost: 0.00068, ips: 42.7160 samples/sec | ETA 02:59:38 2022-08-24 04:12:05 [INFO] [TRAIN] epoch: 82, iter: 102500/160000, loss: 0.4260, lr: 0.000435, batch_cost: 0.1905, reader_cost: 0.00049, ips: 41.9861 samples/sec | ETA 03:02:35 2022-08-24 04:12:15 [INFO] [TRAIN] epoch: 82, iter: 102550/160000, loss: 0.4253, lr: 0.000435, batch_cost: 0.1969, reader_cost: 0.00037, ips: 40.6369 samples/sec | ETA 03:08:29 
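The three per-class arrays in each [EVAL] block are mutually consistent: with precision P = TP/(TP+FP) and recall R = TP/(TP+FN), the class IoU is TP/(TP+FP+FN) = 1/(1/P + 1/R − 1), and the headline mIoU is the mean over the per-class IoU array. A quick cross-check against the first four classes of the iter-102000 evaluation above (the one reported as the new best, mIoU 0.3738); a verification sketch only, not the evaluator's own code:

```python
import numpy as np

def iou_from_pr(precision, recall):
    # IoU = TP / (TP + FP + FN) = 1 / (1/P + 1/R - 1)
    return 1.0 / (1.0 / precision + 1.0 / recall - 1.0)

# First four classes of the iter-102000 [EVAL] block above.
p = np.array([0.7973, 0.8729, 0.9649, 0.8267])   # Class Precision
r = np.array([0.8456, 0.8981, 0.9631, 0.8750])   # Class Recall

print(np.round(iou_from_pr(p, r), 4))
# [0.696  0.7942 0.9305 0.7394]  -> matches the start of the Class IoU array

# The same identity gives per-class Dice as 2*IoU / (1 + IoU).
```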
2022-08-24 04:12:24 [INFO] [TRAIN] epoch: 82, iter: 102600/160000, loss: 0.4694, lr: 0.000435, batch_cost: 0.1792, reader_cost: 0.00124, ips: 44.6487 samples/sec | ETA 02:51:24 2022-08-24 04:12:33 [INFO] [TRAIN] epoch: 82, iter: 102650/160000, loss: 0.4280, lr: 0.000434, batch_cost: 0.1737, reader_cost: 0.00066, ips: 46.0641 samples/sec | ETA 02:46:00 2022-08-24 04:12:41 [INFO] [TRAIN] epoch: 82, iter: 102700/160000, loss: 0.4540, lr: 0.000434, batch_cost: 0.1649, reader_cost: 0.00047, ips: 48.5288 samples/sec | ETA 02:37:25 2022-08-24 04:12:51 [INFO] [TRAIN] epoch: 82, iter: 102750/160000, loss: 0.4290, lr: 0.000433, batch_cost: 0.1925, reader_cost: 0.00114, ips: 41.5542 samples/sec | ETA 03:03:41 2022-08-24 04:12:59 [INFO] [TRAIN] epoch: 82, iter: 102800/160000, loss: 0.4522, lr: 0.000433, batch_cost: 0.1694, reader_cost: 0.00038, ips: 47.2178 samples/sec | ETA 02:41:31 2022-08-24 04:13:07 [INFO] [TRAIN] epoch: 82, iter: 102850/160000, loss: 0.4362, lr: 0.000433, batch_cost: 0.1621, reader_cost: 0.00060, ips: 49.3425 samples/sec | ETA 02:34:25 2022-08-24 04:13:15 [INFO] [TRAIN] epoch: 82, iter: 102900/160000, loss: 0.4492, lr: 0.000432, batch_cost: 0.1529, reader_cost: 0.00040, ips: 52.3063 samples/sec | ETA 02:25:33 2022-08-24 04:13:23 [INFO] [TRAIN] epoch: 82, iter: 102950/160000, loss: 0.4267, lr: 0.000432, batch_cost: 0.1583, reader_cost: 0.00069, ips: 50.5461 samples/sec | ETA 02:30:29 2022-08-24 04:13:32 [INFO] [TRAIN] epoch: 82, iter: 103000/160000, loss: 0.4601, lr: 0.000432, batch_cost: 0.1728, reader_cost: 0.00061, ips: 46.3015 samples/sec | ETA 02:44:08 2022-08-24 04:13:32 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 187s - batch_cost: 0.1871 - reader cost: 5.8215e-04 2022-08-24 04:16:39 [INFO] [EVAL] #Images: 2000 mIoU: 0.3710 Acc: 0.7748 Kappa: 0.7574 Dice: 0.5085 2022-08-24 04:16:39 [INFO] [EVAL] Class IoU: [0.6948 0.7951 0.9299 0.7409 0.6895 0.7595 0.7748 0.799 0.5268 0.6196 0.4958 0.5733 0.716 0.3194 0.2805 0.4378 0.5248 0.4237 0.6154 0.4424 0.76 0.431 0.6199 0.5234 0.3351 0.4488 0.4568 0.446 0.4522 0.2864 0.2727 0.5316 0.2975 0.329 0.3344 0.3878 0.4843 0.5705 0.2783 0.3738 0.1483 0.1318 0.344 0.2862 0.2606 0.2032 0.3711 0.5338 0.6235 0.5457 0.5831 0.3275 0.1601 0.2201 0.7073 0.3425 0.8864 0.3557 0.3332 0.2847 0.08 0.2433 0.3233 0.2045 0.4861 0.7144 0.2909 0.3785 0.1079 0.367 0.4927 0.6002 0.3354 0.2439 0.487 0.4074 0.4469 0.2934 0.2342 0.3291 0.6823 0.3788 0.3727 0.0289 0.1125 0.5281 0.12 0.1388 0.4709 0.5366 0.4538 0.1213 0.189 0.1014 0.0317 0.02 0.181 0.1613 0.2452 0.3216 0.2114 0.1422 0.2074 0.7312 0.1917 0.412 0.1045 0.5613 0.0743 0.3431 0.2099 0.3655 0.1755 0.5826 0.6857 0.0738 0.4923 0.6423 0.1543 0.351 0.4062 0.0817 0.3724 0.2061 0.2998 0.2356 0.5306 0.4726 0.5464 0.3884 0.6169 0.0582 0.3221 0.417 0.3029 0.1972 0.166 0.0238 0.1851 0.3757 0.1467 0.0522 0.3053 0.4959 0.2274 0.0122 0.4559 0.0309 0.12 0.1968] 2022-08-24 04:16:39 [INFO] [EVAL] Class Precision: [0.7837 0.8622 0.9583 0.825 0.7799 0.8884 0.891 0.8543 0.6701 0.7299 0.672 0.6816 0.7817 0.5333 0.5747 0.5873 0.6795 0.7507 0.7754 0.6414 0.8211 0.6415 0.7368 0.6296 0.4791 0.6156 0.5632 0.8134 0.7996 0.4618 0.4688 0.6913 0.5321 0.4465 0.4662 0.526 0.6838 0.8235 0.4576 0.6065 0.3371 0.3445 0.6885 0.4927 0.3757 0.4777 0.7032 0.7801 0.7281 0.7165 0.7376 0.4213 0.2899 0.6796 0.7588 0.7013 0.9239 0.7286 0.7393 0.553 0.1197 0.3881 0.5006 0.6213 0.6472 0.8092 0.5615 0.4976 0.2773 0.6958 0.7754 0.7415 0.533 0.3313 0.7218 0.5766 0.6356 0.5281 0.6447 0.7084 0.8212 0.7087 0.8357 0.1383 
0.2117 0.7183 0.4935 0.3938 0.688 0.7336 0.6498 0.1777 0.46 0.3757 0.1109 0.102 0.5067 0.4212 0.3479 0.7242 0.7179 0.2782 0.5214 0.8408 0.7187 0.4482 0.3098 0.7426 0.1959 0.4665 0.4697 0.4468 0.659 0.7035 0.6943 0.297 0.6786 0.7505 0.3447 0.4644 0.7688 0.2894 0.6795 0.6611 0.7692 0.671 0.7713 0.6092 0.8409 0.6513 0.6648 0.301 0.658 0.8309 0.658 0.4121 0.4268 0.0535 0.5175 0.6507 0.2595 0.2214 0.5977 0.6822 0.6381 0.022 0.8232 0.2798 0.2775 0.8478] 2022-08-24 04:16:39 [INFO] [EVAL] Class Recall: [0.8598 0.9108 0.9691 0.8792 0.8561 0.8396 0.8559 0.9251 0.7113 0.804 0.6541 0.7831 0.8949 0.4433 0.354 0.6323 0.6974 0.4931 0.7489 0.5878 0.9108 0.5678 0.7962 0.7562 0.5272 0.6235 0.7074 0.4968 0.51 0.4298 0.3946 0.697 0.4029 0.5556 0.5418 0.5961 0.6241 0.6501 0.4153 0.4934 0.2093 0.1759 0.4074 0.4057 0.4596 0.2612 0.44 0.6284 0.8127 0.696 0.7358 0.5953 0.2635 0.2455 0.9124 0.401 0.9563 0.41 0.3776 0.3698 0.1945 0.3948 0.4772 0.2336 0.6613 0.8592 0.3765 0.6126 0.1502 0.4372 0.5748 0.7591 0.475 0.4805 0.5995 0.5813 0.6008 0.3977 0.2689 0.3806 0.8013 0.4487 0.4022 0.0353 0.1936 0.666 0.1368 0.1766 0.5988 0.6665 0.6007 0.2765 0.2428 0.122 0.0424 0.0242 0.2197 0.2072 0.4536 0.3665 0.2305 0.2252 0.2561 0.8488 0.2073 0.8362 0.1363 0.6968 0.1068 0.5648 0.2751 0.6674 0.193 0.7723 0.9823 0.0894 0.6419 0.8168 0.2183 0.5897 0.4628 0.1022 0.4517 0.2304 0.3294 0.2663 0.6296 0.6782 0.6094 0.4904 0.8954 0.0673 0.3869 0.4557 0.3594 0.2744 0.2137 0.0412 0.2237 0.4706 0.2523 0.064 0.3843 0.6448 0.2611 0.0266 0.5054 0.0336 0.1745 0.204 ] 2022-08-24 04:16:39 [INFO] [EVAL] The model with the best validation mIoU (0.3738) was saved at iter 102000. 2022-08-24 04:16:50 [INFO] [TRAIN] epoch: 82, iter: 103050/160000, loss: 0.4808, lr: 0.000431, batch_cost: 0.2161, reader_cost: 0.00370, ips: 37.0264 samples/sec | ETA 03:25:04 2022-08-24 04:17:00 [INFO] [TRAIN] epoch: 82, iter: 103100/160000, loss: 0.4420, lr: 0.000431, batch_cost: 0.1938, reader_cost: 0.00092, ips: 41.2897 samples/sec | ETA 03:03:44 2022-08-24 04:17:08 [INFO] [TRAIN] epoch: 82, iter: 103150/160000, loss: 0.4482, lr: 0.000430, batch_cost: 0.1671, reader_cost: 0.00090, ips: 47.8680 samples/sec | ETA 02:38:21 2022-08-24 04:17:16 [INFO] [TRAIN] epoch: 82, iter: 103200/160000, loss: 0.4501, lr: 0.000430, batch_cost: 0.1585, reader_cost: 0.00052, ips: 50.4662 samples/sec | ETA 02:30:04 2022-08-24 04:17:25 [INFO] [TRAIN] epoch: 82, iter: 103250/160000, loss: 0.4588, lr: 0.000430, batch_cost: 0.1765, reader_cost: 0.00101, ips: 45.3137 samples/sec | ETA 02:46:59 2022-08-24 04:17:33 [INFO] [TRAIN] epoch: 82, iter: 103300/160000, loss: 0.3998, lr: 0.000429, batch_cost: 0.1670, reader_cost: 0.00065, ips: 47.9011 samples/sec | ETA 02:37:49 2022-08-24 04:17:42 [INFO] [TRAIN] epoch: 82, iter: 103350/160000, loss: 0.4182, lr: 0.000429, batch_cost: 0.1722, reader_cost: 0.00074, ips: 46.4513 samples/sec | ETA 02:42:36 2022-08-24 04:17:50 [INFO] [TRAIN] epoch: 82, iter: 103400/160000, loss: 0.4662, lr: 0.000429, batch_cost: 0.1579, reader_cost: 0.00062, ips: 50.6758 samples/sec | ETA 02:28:55 2022-08-24 04:17:58 [INFO] [TRAIN] epoch: 82, iter: 103450/160000, loss: 0.4598, lr: 0.000428, batch_cost: 0.1660, reader_cost: 0.00048, ips: 48.1871 samples/sec | ETA 02:36:28 2022-08-24 04:18:07 [INFO] [TRAIN] epoch: 82, iter: 103500/160000, loss: 0.4546, lr: 0.000428, batch_cost: 0.1860, reader_cost: 0.00052, ips: 43.0051 samples/sec | ETA 02:55:10 2022-08-24 04:18:17 [INFO] [TRAIN] epoch: 82, iter: 103550/160000, loss: 0.4539, lr: 0.000427, batch_cost: 0.1957, reader_cost: 0.00141, 
ips: 40.8716 samples/sec | ETA 03:04:09 2022-08-24 04:18:30 [INFO] [TRAIN] epoch: 83, iter: 103600/160000, loss: 0.4352, lr: 0.000427, batch_cost: 0.2626, reader_cost: 0.09693, ips: 30.4682 samples/sec | ETA 04:06:48 2022-08-24 04:18:38 [INFO] [TRAIN] epoch: 83, iter: 103650/160000, loss: 0.4660, lr: 0.000427, batch_cost: 0.1633, reader_cost: 0.00485, ips: 48.9844 samples/sec | ETA 02:33:22 2022-08-24 04:18:47 [INFO] [TRAIN] epoch: 83, iter: 103700/160000, loss: 0.4347, lr: 0.000426, batch_cost: 0.1649, reader_cost: 0.00066, ips: 48.5112 samples/sec | ETA 02:34:44 2022-08-24 04:18:55 [INFO] [TRAIN] epoch: 83, iter: 103750/160000, loss: 0.4376, lr: 0.000426, batch_cost: 0.1688, reader_cost: 0.00069, ips: 47.4026 samples/sec | ETA 02:38:13 2022-08-24 04:19:03 [INFO] [TRAIN] epoch: 83, iter: 103800/160000, loss: 0.4450, lr: 0.000425, batch_cost: 0.1611, reader_cost: 0.00048, ips: 49.6519 samples/sec | ETA 02:30:55 2022-08-24 04:19:11 [INFO] [TRAIN] epoch: 83, iter: 103850/160000, loss: 0.4349, lr: 0.000425, batch_cost: 0.1597, reader_cost: 0.00042, ips: 50.0954 samples/sec | ETA 02:29:26 2022-08-24 04:19:20 [INFO] [TRAIN] epoch: 83, iter: 103900/160000, loss: 0.4679, lr: 0.000425, batch_cost: 0.1742, reader_cost: 0.00123, ips: 45.9162 samples/sec | ETA 02:42:54 2022-08-24 04:19:29 [INFO] [TRAIN] epoch: 83, iter: 103950/160000, loss: 0.4357, lr: 0.000424, batch_cost: 0.1879, reader_cost: 0.00075, ips: 42.5656 samples/sec | ETA 02:55:34 2022-08-24 04:19:39 [INFO] [TRAIN] epoch: 83, iter: 104000/160000, loss: 0.4389, lr: 0.000424, batch_cost: 0.1966, reader_cost: 0.00057, ips: 40.6875 samples/sec | ETA 03:03:30 2022-08-24 04:19:39 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 170s - batch_cost: 0.1698 - reader cost: 6.4348e-04 2022-08-24 04:22:29 [INFO] [EVAL] #Images: 2000 mIoU: 0.3708 Acc: 0.7728 Kappa: 0.7557 Dice: 0.5086 2022-08-24 04:22:29 [INFO] [EVAL] Class IoU: [0.6918 0.7955 0.9315 0.7383 0.69 0.7706 0.7638 0.8021 0.5262 0.6171 0.4938 0.5574 0.717 0.2958 0.3084 0.4417 0.5238 0.4431 0.619 0.4388 0.7681 0.4501 0.6169 0.5181 0.3342 0.4878 0.4613 0.4264 0.4329 0.3099 0.2628 0.5215 0.2956 0.3315 0.3413 0.4032 0.4874 0.5797 0.27 0.3817 0.1275 0.1201 0.3527 0.2855 0.2663 0.1913 0.3701 0.541 0.5432 0.5374 0.5948 0.3207 0.1591 0.2013 0.7107 0.3949 0.8863 0.3869 0.3266 0.2653 0.0922 0.2752 0.3139 0.1958 0.4583 0.7185 0.2377 0.3812 0.1018 0.3642 0.484 0.5334 0.3671 0.2383 0.4673 0.3914 0.4963 0.2867 0.2551 0.3341 0.6892 0.4412 0.4145 0.0332 0.1764 0.5225 0.1511 0.1265 0.3264 0.4984 0.4965 0.0541 0.2099 0.1037 0.0183 0.0735 0.1488 0.1346 0.213 0.3197 0.1862 0.1466 0.2162 0.6591 0.1938 0.5576 0.1152 0.564 0.1104 0.3846 0.2097 0.3949 0.1983 0.6524 0.6531 0.0715 0.444 0.6379 0.1623 0.3333 0.4458 0.0753 0.3412 0.2148 0.2962 0.2372 0.5303 0.4689 0.5442 0.3549 0.5917 0.0525 0.235 0.4276 0.3387 0.2005 0.1717 0.0315 0.1844 0.3976 0.198 0.016 0.2838 0.4314 0.3256 0.0095 0.4261 0.042 0.1028 0.1743] 2022-08-24 04:22:29 [INFO] [EVAL] Class Precision: [0.7961 0.8734 0.9602 0.8275 0.7774 0.8673 0.8624 0.8617 0.6399 0.7369 0.6946 0.7234 0.7822 0.5167 0.5237 0.6142 0.6972 0.696 0.7816 0.6162 0.8376 0.6918 0.7532 0.6189 0.4818 0.6257 0.5619 0.7886 0.778 0.4732 0.4505 0.6911 0.536 0.4112 0.4396 0.5224 0.6651 0.8147 0.4097 0.5955 0.3096 0.3506 0.592 0.5032 0.3718 0.4533 0.647 0.7106 0.7065 0.6506 0.8032 0.4022 0.3046 0.6265 0.7744 0.6939 0.9314 0.7049 0.7697 0.5094 0.1428 0.4717 0.4826 0.5882 0.551 0.825 0.4017 0.5044 0.1776 0.5962 0.7317 0.7029 0.5236 0.3477 0.6849 0.5785 0.688 
0.5385 0.6228 0.5601 0.8123 0.6923 0.8205 0.1376 0.2586 0.679 0.5345 0.4509 0.8167 0.6468 0.7152 0.059 0.4097 0.3543 0.0604 0.1871 0.5559 0.4499 0.28 0.7364 0.6264 0.2217 0.5693 0.9195 0.8383 0.6261 0.3764 0.7608 0.2156 0.5077 0.4726 0.4892 0.5465 0.7901 0.6589 0.3852 0.7192 0.7574 0.235 0.4933 0.7207 0.4555 0.6784 0.6151 0.7419 0.6134 0.8049 0.6082 0.7925 0.6049 0.6722 0.2936 0.6375 0.7922 0.6753 0.4087 0.4025 0.0572 0.3872 0.6144 0.4871 0.0501 0.5989 0.8034 0.6712 0.0172 0.8505 0.1895 0.2295 0.8612] 2022-08-24 04:22:29 [INFO] [EVAL] Class Recall: [0.8409 0.8991 0.9689 0.8726 0.8598 0.8736 0.8698 0.9206 0.7476 0.7915 0.6307 0.7084 0.8958 0.4089 0.4286 0.6112 0.6781 0.5495 0.7485 0.6038 0.9025 0.563 0.7731 0.7607 0.5218 0.6887 0.7205 0.4814 0.4939 0.4731 0.3869 0.6801 0.3973 0.631 0.6042 0.6387 0.646 0.6678 0.4418 0.5154 0.1782 0.1545 0.466 0.3976 0.4841 0.2486 0.4637 0.6939 0.7015 0.7555 0.6963 0.6129 0.2497 0.2287 0.8963 0.4781 0.9482 0.4616 0.362 0.3564 0.2062 0.3979 0.473 0.2269 0.7314 0.8477 0.368 0.6094 0.1927 0.4834 0.5884 0.6887 0.5511 0.4311 0.5953 0.5476 0.6405 0.3801 0.3018 0.4529 0.8197 0.5488 0.4558 0.0419 0.3569 0.694 0.174 0.1496 0.3522 0.6847 0.6189 0.3952 0.301 0.1279 0.0255 0.108 0.1689 0.1611 0.4713 0.361 0.2095 0.3021 0.2585 0.6994 0.2013 0.836 0.1424 0.6856 0.1845 0.6132 0.2739 0.6722 0.2374 0.7891 0.9869 0.0808 0.5372 0.8016 0.3442 0.5067 0.5389 0.0827 0.4071 0.2482 0.3302 0.2789 0.6085 0.6719 0.6347 0.462 0.8317 0.0601 0.2712 0.4817 0.4047 0.2824 0.2305 0.0656 0.2603 0.5297 0.2502 0.0231 0.3503 0.4823 0.3875 0.0209 0.4606 0.0512 0.157 0.1793] 2022-08-24 04:22:29 [INFO] [EVAL] The model with the best validation mIoU (0.3738) was saved at iter 102000. 2022-08-24 04:22:40 [INFO] [TRAIN] epoch: 83, iter: 104050/160000, loss: 0.4480, lr: 0.000424, batch_cost: 0.2065, reader_cost: 0.00416, ips: 38.7425 samples/sec | ETA 03:12:33 2022-08-24 04:22:50 [INFO] [TRAIN] epoch: 83, iter: 104100/160000, loss: 0.4778, lr: 0.000423, batch_cost: 0.2121, reader_cost: 0.00099, ips: 37.7245 samples/sec | ETA 03:17:34 2022-08-24 04:23:00 [INFO] [TRAIN] epoch: 83, iter: 104150/160000, loss: 0.4491, lr: 0.000423, batch_cost: 0.1850, reader_cost: 0.00061, ips: 43.2484 samples/sec | ETA 02:52:11 2022-08-24 04:23:08 [INFO] [TRAIN] epoch: 83, iter: 104200/160000, loss: 0.4530, lr: 0.000422, batch_cost: 0.1598, reader_cost: 0.00052, ips: 50.0561 samples/sec | ETA 02:28:37 2022-08-24 04:23:15 [INFO] [TRAIN] epoch: 83, iter: 104250/160000, loss: 0.4411, lr: 0.000422, batch_cost: 0.1589, reader_cost: 0.00060, ips: 50.3343 samples/sec | ETA 02:27:40 2022-08-24 04:23:24 [INFO] [TRAIN] epoch: 83, iter: 104300/160000, loss: 0.4663, lr: 0.000422, batch_cost: 0.1650, reader_cost: 0.00121, ips: 48.4878 samples/sec | ETA 02:33:09 2022-08-24 04:23:33 [INFO] [TRAIN] epoch: 83, iter: 104350/160000, loss: 0.4412, lr: 0.000421, batch_cost: 0.1877, reader_cost: 0.00049, ips: 42.6217 samples/sec | ETA 02:54:05 2022-08-24 04:23:42 [INFO] [TRAIN] epoch: 83, iter: 104400/160000, loss: 0.4081, lr: 0.000421, batch_cost: 0.1810, reader_cost: 0.00099, ips: 44.2107 samples/sec | ETA 02:47:40 2022-08-24 04:23:51 [INFO] [TRAIN] epoch: 83, iter: 104450/160000, loss: 0.4290, lr: 0.000421, batch_cost: 0.1836, reader_cost: 0.00040, ips: 43.5739 samples/sec | ETA 02:49:58 2022-08-24 04:24:00 [INFO] [TRAIN] epoch: 83, iter: 104500/160000, loss: 0.4639, lr: 0.000420, batch_cost: 0.1820, reader_cost: 0.00058, ips: 43.9660 samples/sec | ETA 02:48:18 2022-08-24 04:24:10 [INFO] [TRAIN] epoch: 83, iter: 104550/160000, loss: 0.4482, 
lr: 0.000420, batch_cost: 0.1827, reader_cost: 0.00108, ips: 43.7776 samples/sec | ETA 02:48:53 2022-08-24 04:24:18 [INFO] [TRAIN] epoch: 83, iter: 104600/160000, loss: 0.4318, lr: 0.000419, batch_cost: 0.1786, reader_cost: 0.00067, ips: 44.8033 samples/sec | ETA 02:44:52 2022-08-24 04:24:28 [INFO] [TRAIN] epoch: 83, iter: 104650/160000, loss: 0.4512, lr: 0.000419, batch_cost: 0.1858, reader_cost: 0.00042, ips: 43.0566 samples/sec | ETA 02:51:24 2022-08-24 04:24:37 [INFO] [TRAIN] epoch: 83, iter: 104700/160000, loss: 0.4471, lr: 0.000419, batch_cost: 0.1771, reader_cost: 0.00078, ips: 45.1659 samples/sec | ETA 02:43:14 2022-08-24 04:24:45 [INFO] [TRAIN] epoch: 83, iter: 104750/160000, loss: 0.4491, lr: 0.000418, batch_cost: 0.1657, reader_cost: 0.00099, ips: 48.2750 samples/sec | ETA 02:32:35 2022-08-24 04:24:53 [INFO] [TRAIN] epoch: 83, iter: 104800/160000, loss: 0.4333, lr: 0.000418, batch_cost: 0.1651, reader_cost: 0.00073, ips: 48.4428 samples/sec | ETA 02:31:55 2022-08-24 04:25:07 [INFO] [TRAIN] epoch: 84, iter: 104850/160000, loss: 0.4331, lr: 0.000418, batch_cost: 0.2736, reader_cost: 0.07986, ips: 29.2415 samples/sec | ETA 04:11:28 2022-08-24 04:25:15 [INFO] [TRAIN] epoch: 84, iter: 104900/160000, loss: 0.4183, lr: 0.000417, batch_cost: 0.1534, reader_cost: 0.00045, ips: 52.1638 samples/sec | ETA 02:20:50 2022-08-24 04:25:23 [INFO] [TRAIN] epoch: 84, iter: 104950/160000, loss: 0.4628, lr: 0.000417, batch_cost: 0.1660, reader_cost: 0.00073, ips: 48.1904 samples/sec | ETA 02:32:18 2022-08-24 04:25:32 [INFO] [TRAIN] epoch: 84, iter: 105000/160000, loss: 0.4480, lr: 0.000416, batch_cost: 0.1827, reader_cost: 0.00034, ips: 43.7805 samples/sec | ETA 02:47:30 2022-08-24 04:25:32 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 189s - batch_cost: 0.1888 - reader cost: 0.0012 2022-08-24 04:28:41 [INFO] [EVAL] #Images: 2000 mIoU: 0.3722 Acc: 0.7726 Kappa: 0.7554 Dice: 0.5112 2022-08-24 04:28:41 [INFO] [EVAL] Class IoU: [0.6902 0.7928 0.9305 0.7384 0.6941 0.7726 0.77 0.8032 0.526 0.6415 0.4947 0.5452 0.7163 0.316 0.2914 0.4367 0.4749 0.4386 0.606 0.4298 0.7659 0.4684 0.6168 0.5201 0.325 0.4185 0.4536 0.4445 0.4954 0.252 0.2776 0.5337 0.3059 0.3493 0.3435 0.4105 0.4848 0.5637 0.3005 0.3696 0.1562 0.1132 0.3598 0.2599 0.2431 0.1795 0.3613 0.5107 0.6384 0.5086 0.594 0.3496 0.1482 0.2065 0.6781 0.4183 0.8867 0.4293 0.3887 0.2651 0.0935 0.2606 0.3095 0.2247 0.469 0.721 0.2797 0.3638 0.1001 0.374 0.5082 0.4952 0.3741 0.2536 0.489 0.3987 0.424 0.2517 0.3707 0.4029 0.7085 0.4362 0.4242 0.032 0.2409 0.5303 0.1386 0.1342 0.33 0.5335 0.4626 0.1066 0.219 0.1444 0.0329 0.0526 0.1781 0.1718 0.2265 0.3095 0.2197 0.1239 0.2449 0.6278 0.1903 0.4987 0.1918 0.5516 0.0711 0.401 0.2158 0.42 0.1965 0.627 0.6829 0.0675 0.3789 0.6329 0.1918 0.3043 0.4118 0.0848 0.2942 0.1872 0.2849 0.2553 0.5362 0.4842 0.5137 0.3407 0.5678 0.0787 0.3025 0.4495 0.3354 0.2073 0.1566 0.034 0.2024 0.3942 0.2615 0.0007 0.3157 0.1787 0.2742 0.01 0.4296 0.0387 0.1044 0.171 ] 2022-08-24 04:28:41 [INFO] [EVAL] Class Precision: [0.7932 0.862 0.9633 0.8427 0.7799 0.8701 0.8598 0.8711 0.6588 0.7711 0.6693 0.7261 0.7903 0.5283 0.5723 0.572 0.6883 0.7313 0.7494 0.637 0.8447 0.6657 0.7608 0.6156 0.4703 0.6562 0.5596 0.7141 0.6956 0.4037 0.4921 0.6728 0.5299 0.467 0.4057 0.5517 0.6861 0.782 0.4776 0.6129 0.3606 0.3963 0.6008 0.5115 0.364 0.3443 0.7876 0.6692 0.7043 0.6221 0.7758 0.4647 0.2747 0.5806 0.711 0.7099 0.9279 0.6253 0.7436 0.5223 0.1474 0.5172 0.4056 0.5507 0.5593 0.8394 0.4491 0.5548 0.1716 0.6543 0.697 
0.6633 0.5629 0.3106 0.7194 0.5561 0.7169 0.3982 0.7252 0.6506 0.8383 0.6222 0.8042 0.2185 0.3243 0.7055 0.4435 0.423 0.7825 0.6717 0.6368 0.138 0.3903 0.3548 0.0802 0.1566 0.6083 0.4381 0.3181 0.7528 0.661 0.1655 0.5206 0.6885 0.7344 0.5417 0.4808 0.7513 0.1996 0.5024 0.4629 0.5426 0.4996 0.7567 0.6888 0.368 0.7102 0.7373 0.3297 0.4762 0.7145 0.3208 0.6093 0.6335 0.7444 0.6455 0.8203 0.6664 0.852 0.4935 0.6224 0.4143 0.6818 0.6954 0.6738 0.4004 0.3967 0.0607 0.4663 0.6456 0.507 0.0011 0.5916 0.5959 0.6121 0.013 0.8048 0.1638 0.2329 0.7725] 2022-08-24 04:28:41 [INFO] [EVAL] Class Recall: [0.8417 0.908 0.9647 0.8564 0.8632 0.8733 0.8805 0.9115 0.7229 0.7923 0.6547 0.6864 0.8844 0.4402 0.3725 0.6487 0.6051 0.5228 0.76 0.5692 0.8914 0.6124 0.7652 0.7703 0.5127 0.5361 0.7053 0.5407 0.6325 0.4015 0.3891 0.7208 0.4199 0.5808 0.6915 0.6159 0.623 0.6687 0.4477 0.4822 0.216 0.1368 0.4729 0.3457 0.4225 0.2727 0.4003 0.6832 0.872 0.736 0.717 0.5853 0.2434 0.2427 0.9361 0.5045 0.9523 0.578 0.4488 0.35 0.2034 0.3443 0.5665 0.2751 0.7438 0.8364 0.4259 0.5138 0.1938 0.4661 0.6523 0.6615 0.5272 0.5801 0.6042 0.5847 0.5093 0.4063 0.4313 0.5142 0.8207 0.5934 0.473 0.0361 0.4837 0.6812 0.1678 0.1643 0.3634 0.7216 0.6284 0.3197 0.3329 0.1959 0.0527 0.0735 0.2011 0.2203 0.4403 0.3446 0.2476 0.3301 0.3163 0.8768 0.2044 0.8625 0.2419 0.6748 0.0995 0.6652 0.2879 0.6502 0.2446 0.7852 0.9876 0.0764 0.4482 0.8171 0.3143 0.4575 0.4929 0.1033 0.3626 0.21 0.3158 0.2969 0.6076 0.6391 0.5641 0.5238 0.866 0.0885 0.3523 0.5597 0.4004 0.3005 0.2055 0.072 0.2635 0.503 0.3506 0.0019 0.4037 0.2034 0.3318 0.0413 0.4796 0.0483 0.159 0.18 ] 2022-08-24 04:28:41 [INFO] [EVAL] The model with the best validation mIoU (0.3738) was saved at iter 102000. 2022-08-24 04:28:51 [INFO] [TRAIN] epoch: 84, iter: 105050/160000, loss: 0.4290, lr: 0.000416, batch_cost: 0.2005, reader_cost: 0.00363, ips: 39.9031 samples/sec | ETA 03:03:36 2022-08-24 04:29:00 [INFO] [TRAIN] epoch: 84, iter: 105100/160000, loss: 0.4598, lr: 0.000416, batch_cost: 0.1801, reader_cost: 0.00081, ips: 44.4174 samples/sec | ETA 02:44:48 2022-08-24 04:29:10 [INFO] [TRAIN] epoch: 84, iter: 105150/160000, loss: 0.4453, lr: 0.000415, batch_cost: 0.1940, reader_cost: 0.00061, ips: 41.2462 samples/sec | ETA 02:57:18 2022-08-24 04:29:19 [INFO] [TRAIN] epoch: 84, iter: 105200/160000, loss: 0.4387, lr: 0.000415, batch_cost: 0.1734, reader_cost: 0.00067, ips: 46.1325 samples/sec | ETA 02:38:23 2022-08-24 04:29:29 [INFO] [TRAIN] epoch: 84, iter: 105250/160000, loss: 0.4571, lr: 0.000415, batch_cost: 0.1948, reader_cost: 0.00112, ips: 41.0589 samples/sec | ETA 02:57:47 2022-08-24 04:29:37 [INFO] [TRAIN] epoch: 84, iter: 105300/160000, loss: 0.4893, lr: 0.000414, batch_cost: 0.1636, reader_cost: 0.00148, ips: 48.8908 samples/sec | ETA 02:29:10 2022-08-24 04:29:46 [INFO] [TRAIN] epoch: 84, iter: 105350/160000, loss: 0.4262, lr: 0.000414, batch_cost: 0.1800, reader_cost: 0.00069, ips: 44.4546 samples/sec | ETA 02:43:54 2022-08-24 04:29:55 [INFO] [TRAIN] epoch: 84, iter: 105400/160000, loss: 0.4556, lr: 0.000413, batch_cost: 0.1844, reader_cost: 0.00045, ips: 43.3792 samples/sec | ETA 02:47:49 2022-08-24 04:30:03 [INFO] [TRAIN] epoch: 84, iter: 105450/160000, loss: 0.4495, lr: 0.000413, batch_cost: 0.1650, reader_cost: 0.00052, ips: 48.4828 samples/sec | ETA 02:30:01 2022-08-24 04:30:13 [INFO] [TRAIN] epoch: 84, iter: 105500/160000, loss: 0.4499, lr: 0.000413, batch_cost: 0.2009, reader_cost: 0.00055, ips: 39.8122 samples/sec | ETA 03:02:31 2022-08-24 04:30:22 [INFO] [TRAIN] epoch: 84, 
iter: 105550/160000, loss: 0.4777, lr: 0.000412, batch_cost: 0.1681, reader_cost: 0.00072, ips: 47.5881 samples/sec | ETA 02:32:33 2022-08-24 04:30:30 [INFO] [TRAIN] epoch: 84, iter: 105600/160000, loss: 0.4529, lr: 0.000412, batch_cost: 0.1693, reader_cost: 0.00034, ips: 47.2604 samples/sec | ETA 02:33:28 2022-08-24 04:30:40 [INFO] [TRAIN] epoch: 84, iter: 105650/160000, loss: 0.4358, lr: 0.000411, batch_cost: 0.1926, reader_cost: 0.00053, ips: 41.5271 samples/sec | ETA 02:54:30 2022-08-24 04:30:50 [INFO] [TRAIN] epoch: 84, iter: 105700/160000, loss: 0.4244, lr: 0.000411, batch_cost: 0.1981, reader_cost: 0.00085, ips: 40.3919 samples/sec | ETA 02:59:14 2022-08-24 04:30:58 [INFO] [TRAIN] epoch: 84, iter: 105750/160000, loss: 0.4334, lr: 0.000411, batch_cost: 0.1710, reader_cost: 0.00040, ips: 46.7748 samples/sec | ETA 02:34:38 2022-08-24 04:31:07 [INFO] [TRAIN] epoch: 84, iter: 105800/160000, loss: 0.4480, lr: 0.000410, batch_cost: 0.1755, reader_cost: 0.00067, ips: 45.5771 samples/sec | ETA 02:38:33 2022-08-24 04:31:16 [INFO] [TRAIN] epoch: 84, iter: 105850/160000, loss: 0.4638, lr: 0.000410, batch_cost: 0.1761, reader_cost: 0.00121, ips: 45.4381 samples/sec | ETA 02:38:53 2022-08-24 04:31:24 [INFO] [TRAIN] epoch: 84, iter: 105900/160000, loss: 0.4476, lr: 0.000410, batch_cost: 0.1702, reader_cost: 0.00084, ips: 46.9934 samples/sec | ETA 02:33:29 2022-08-24 04:31:34 [INFO] [TRAIN] epoch: 84, iter: 105950/160000, loss: 0.3958, lr: 0.000409, batch_cost: 0.1843, reader_cost: 0.00034, ips: 43.4154 samples/sec | ETA 02:45:59 2022-08-24 04:31:42 [INFO] [TRAIN] epoch: 84, iter: 106000/160000, loss: 0.4247, lr: 0.000409, batch_cost: 0.1646, reader_cost: 0.00076, ips: 48.5904 samples/sec | ETA 02:28:10 2022-08-24 04:31:42 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 
1000/1000 - 177s - batch_cost: 0.1768 - reader cost: 8.5945e-04 2022-08-24 04:34:39 [INFO] [EVAL] #Images: 2000 mIoU: 0.3721 Acc: 0.7743 Kappa: 0.7572 Dice: 0.5108 2022-08-24 04:34:39 [INFO] [EVAL] Class IoU: [0.6939 0.7944 0.9308 0.7418 0.6941 0.7709 0.7658 0.8047 0.5338 0.6442 0.4949 0.5699 0.7169 0.3363 0.3058 0.4432 0.4725 0.4477 0.6304 0.4438 0.7662 0.4302 0.6338 0.5129 0.3363 0.4316 0.4497 0.4602 0.4465 0.2619 0.2801 0.5331 0.2943 0.3413 0.3347 0.4181 0.4856 0.5546 0.298 0.379 0.1495 0.1339 0.3653 0.2684 0.2569 0.2262 0.4208 0.5207 0.4641 0.5232 0.5962 0.3407 0.1485 0.1727 0.6874 0.3985 0.8817 0.4329 0.4574 0.2699 0.1249 0.2818 0.308 0.1874 0.4749 0.6962 0.2722 0.3806 0.121 0.3602 0.5149 0.5738 0.3791 0.2659 0.4895 0.3937 0.5017 0.2604 0.4125 0.3747 0.7149 0.4266 0.391 0.0405 0.1173 0.5317 0.1366 0.1283 0.3526 0.5267 0.4318 0.119 0.2334 0.1059 0.0203 0.0413 0.2077 0.2015 0.2192 0.3373 0.192 0.1405 0.2776 0.7008 0.1887 0.3918 0.1831 0.5588 0.0881 0.3284 0.208 0.4192 0.1966 0.6271 0.6382 0.0669 0.3836 0.6453 0.1495 0.3842 0.4315 0.0776 0.2941 0.2155 0.2864 0.2495 0.532 0.4553 0.5575 0.3711 0.5672 0.0738 0.286 0.4085 0.3288 0.2083 0.1679 0.0276 0.2162 0.3858 0.1136 0.0011 0.2459 0.2978 0.2487 0.0012 0.4237 0.032 0.1115 0.1763] 2022-08-24 04:34:39 [INFO] [EVAL] Class Precision: [0.7965 0.8601 0.9663 0.831 0.781 0.8841 0.8886 0.8712 0.6412 0.7669 0.7046 0.6901 0.7856 0.5115 0.537 0.6017 0.6887 0.6611 0.7983 0.6193 0.8285 0.6896 0.7833 0.6299 0.453 0.6822 0.5411 0.7459 0.7194 0.4426 0.4903 0.691 0.5972 0.4338 0.4412 0.574 0.6738 0.8071 0.4535 0.6175 0.3734 0.3276 0.6071 0.5009 0.3511 0.3633 0.7857 0.6821 0.6937 0.6543 0.7854 0.4332 0.2958 0.5309 0.7276 0.674 0.9294 0.633 0.6751 0.4568 0.1717 0.5662 0.4637 0.6817 0.5687 0.7977 0.4755 0.5144 0.2422 0.561 0.7175 0.736 0.5745 0.3358 0.6879 0.5994 0.7821 0.6366 0.7688 0.7959 0.8674 0.6779 0.8366 0.1618 0.1999 0.7025 0.474 0.4176 0.7773 0.706 0.578 0.1665 0.4753 0.3752 0.0456 0.1441 0.6195 0.4418 0.2943 0.7506 0.578 0.3 0.5403 0.867 0.7646 0.447 0.4736 0.7413 0.1806 0.4466 0.5384 0.5474 0.5968 0.8087 0.6428 0.3408 0.7526 0.7635 0.3139 0.5295 0.6789 0.4612 0.6905 0.5599 0.7763 0.6652 0.769 0.5587 0.821 0.6089 0.6227 0.3135 0.5751 0.7614 0.7398 0.3556 0.3694 0.0737 0.4626 0.6316 0.5382 0.0021 0.6572 0.5962 0.5832 0.002 0.8224 0.2154 0.2476 0.9113] 2022-08-24 04:34:39 [INFO] [EVAL] Class Recall: [0.8434 0.9124 0.962 0.8736 0.8618 0.8576 0.847 0.9133 0.7612 0.8011 0.6244 0.7658 0.8914 0.4954 0.4154 0.6272 0.6008 0.5811 0.7498 0.6103 0.9106 0.5334 0.7686 0.7342 0.5662 0.5401 0.7268 0.5457 0.5407 0.3907 0.3952 0.7 0.3672 0.6156 0.581 0.6062 0.6349 0.6393 0.465 0.4953 0.1996 0.1846 0.4785 0.3663 0.489 0.3749 0.4753 0.6876 0.5837 0.7232 0.7123 0.6147 0.2297 0.2038 0.9256 0.4936 0.945 0.578 0.5865 0.3974 0.3145 0.3594 0.4784 0.2054 0.7422 0.8456 0.3891 0.5941 0.1948 0.5016 0.6457 0.7225 0.5271 0.5608 0.6292 0.5342 0.5833 0.3059 0.471 0.4145 0.8026 0.535 0.4233 0.0512 0.2212 0.6863 0.1611 0.1563 0.3922 0.6747 0.6306 0.2945 0.3144 0.1286 0.0353 0.0548 0.238 0.2702 0.4622 0.3798 0.2233 0.209 0.3635 0.7852 0.2003 0.7606 0.2299 0.6942 0.1468 0.5537 0.2531 0.6415 0.2267 0.7364 0.9891 0.0768 0.4389 0.8065 0.2221 0.5833 0.5421 0.0853 0.3388 0.2595 0.3121 0.2854 0.6332 0.7109 0.6346 0.4873 0.8643 0.0881 0.3626 0.4685 0.3718 0.3347 0.2354 0.0422 0.2887 0.4978 0.1258 0.0026 0.2821 0.373 0.3025 0.0031 0.4664 0.0363 0.1685 0.1793] 2022-08-24 04:34:39 [INFO] [EVAL] The model with the best validation mIoU (0.3738) was saved at iter 102000. 
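The [EVAL] blocks in this log report per-class IoU, Precision, and Recall over the 2000 validation images plus the aggregate mIoU/Acc/Kappa/Dice, while the [TRAIN] lines report per-step timing (batch_cost, reader_cost), throughput (ips), and a remaining-time estimate (ETA). These numbers are mutually consistent and can be cross-checked offline. The Python sketch below is a minimal illustration, not part of PaddleSeg: the helper names are hypothetical, and it only assumes the bracketed arrays are whitespace-separated floats exactly as printed above.

import numpy as np

def parse_floats(block: str) -> np.ndarray:
    # Turn a bracketed array copied from the log into a float vector.
    return np.array([float(x) for x in block.replace("[", " ").replace("]", " ").split()])

def iou_from_pr(precision: np.ndarray, recall: np.ndarray) -> np.ndarray:
    # Confusion-matrix identity per class: 1/IoU = 1/P + 1/R - 1,
    # since IoU = TP/(TP+FP+FN), P = TP/(TP+FP), R = TP/(TP+FN).
    return 1.0 / (1.0 / precision + 1.0 / recall - 1.0)

# First three classes of the 04:34:39 evaluation, copied from the log above.
prec = parse_floats("[0.7965 0.8601 0.9663]")
rec = parse_floats("[0.8434 0.9124 0.962]")
print(iou_from_pr(prec, rec))  # ~[0.6939 0.7944 0.9308], matching the logged Class IoU
# Averaging the full Class IoU vector should reproduce the logged mIoU (0.3721 here).

# Timing bookkeeping from the [TRAIN] line at iter 105050 above:
batch_cost, ips, it, total_iters = 0.2005, 39.9031, 105050, 160000
print(ips * batch_cost)                          # ~8.0, i.e. samples processed per step
print((total_iters - it) * batch_cost / 3600.0)  # ~3.06 h, matching the logged ETA 03:03:36

If the reported Dice is the mean of per-class Dice scores, the per-class relation Dice = 2*IoU/(1 + IoU) also explains why it sits above mIoU at every checkpoint (0.5108 vs 0.3721 here), since Dice is never smaller than IoU.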
2022-08-24 04:34:48 [INFO] [TRAIN] epoch: 84, iter: 106050/160000, loss: 0.4671, lr: 0.000408, batch_cost: 0.1783, reader_cost: 0.00526, ips: 44.8674 samples/sec | ETA 02:40:19 2022-08-24 04:35:03 [INFO] [TRAIN] epoch: 85, iter: 106100/160000, loss: 0.4535, lr: 0.000408, batch_cost: 0.2927, reader_cost: 0.08472, ips: 27.3305 samples/sec | ETA 04:22:57 2022-08-24 04:35:11 [INFO] [TRAIN] epoch: 85, iter: 106150/160000, loss: 0.4746, lr: 0.000408, batch_cost: 0.1720, reader_cost: 0.00105, ips: 46.5158 samples/sec | ETA 02:34:21 2022-08-24 04:35:21 [INFO] [TRAIN] epoch: 85, iter: 106200/160000, loss: 0.4404, lr: 0.000407, batch_cost: 0.1865, reader_cost: 0.00089, ips: 42.8855 samples/sec | ETA 02:47:16 2022-08-24 04:35:29 [INFO] [TRAIN] epoch: 85, iter: 106250/160000, loss: 0.4503, lr: 0.000407, batch_cost: 0.1765, reader_cost: 0.00064, ips: 45.3254 samples/sec | ETA 02:38:06 2022-08-24 04:35:39 [INFO] [TRAIN] epoch: 85, iter: 106300/160000, loss: 0.4582, lr: 0.000407, batch_cost: 0.1819, reader_cost: 0.00056, ips: 43.9897 samples/sec | ETA 02:42:45 2022-08-24 04:35:48 [INFO] [TRAIN] epoch: 85, iter: 106350/160000, loss: 0.4523, lr: 0.000406, batch_cost: 0.1934, reader_cost: 0.00082, ips: 41.3660 samples/sec | ETA 02:52:55 2022-08-24 04:35:56 [INFO] [TRAIN] epoch: 85, iter: 106400/160000, loss: 0.4352, lr: 0.000406, batch_cost: 0.1585, reader_cost: 0.00110, ips: 50.4716 samples/sec | ETA 02:21:35 2022-08-24 04:36:06 [INFO] [TRAIN] epoch: 85, iter: 106450/160000, loss: 0.4358, lr: 0.000405, batch_cost: 0.1907, reader_cost: 0.00041, ips: 41.9507 samples/sec | ETA 02:50:11 2022-08-24 04:36:15 [INFO] [TRAIN] epoch: 85, iter: 106500/160000, loss: 0.4285, lr: 0.000405, batch_cost: 0.1801, reader_cost: 0.00085, ips: 44.4148 samples/sec | ETA 02:40:36 2022-08-24 04:36:24 [INFO] [TRAIN] epoch: 85, iter: 106550/160000, loss: 0.4626, lr: 0.000405, batch_cost: 0.1766, reader_cost: 0.00055, ips: 45.3063 samples/sec | ETA 02:37:17 2022-08-24 04:36:32 [INFO] [TRAIN] epoch: 85, iter: 106600/160000, loss: 0.4480, lr: 0.000404, batch_cost: 0.1759, reader_cost: 0.00065, ips: 45.4716 samples/sec | ETA 02:36:34 2022-08-24 04:36:41 [INFO] [TRAIN] epoch: 85, iter: 106650/160000, loss: 0.4396, lr: 0.000404, batch_cost: 0.1679, reader_cost: 0.00062, ips: 47.6462 samples/sec | ETA 02:29:17 2022-08-24 04:36:49 [INFO] [TRAIN] epoch: 85, iter: 106700/160000, loss: 0.4409, lr: 0.000404, batch_cost: 0.1716, reader_cost: 0.00079, ips: 46.6258 samples/sec | ETA 02:32:25 2022-08-24 04:36:57 [INFO] [TRAIN] epoch: 85, iter: 106750/160000, loss: 0.4646, lr: 0.000403, batch_cost: 0.1612, reader_cost: 0.00038, ips: 49.6160 samples/sec | ETA 02:23:05 2022-08-24 04:37:06 [INFO] [TRAIN] epoch: 85, iter: 106800/160000, loss: 0.4495, lr: 0.000403, batch_cost: 0.1676, reader_cost: 0.00054, ips: 47.7218 samples/sec | ETA 02:28:38 2022-08-24 04:37:14 [INFO] [TRAIN] epoch: 85, iter: 106850/160000, loss: 0.3909, lr: 0.000402, batch_cost: 0.1597, reader_cost: 0.00082, ips: 50.0926 samples/sec | ETA 02:21:28 2022-08-24 04:37:23 [INFO] [TRAIN] epoch: 85, iter: 106900/160000, loss: 0.4578, lr: 0.000402, batch_cost: 0.1810, reader_cost: 0.00072, ips: 44.2078 samples/sec | ETA 02:40:09 2022-08-24 04:37:31 [INFO] [TRAIN] epoch: 85, iter: 106950/160000, loss: 0.4400, lr: 0.000402, batch_cost: 0.1692, reader_cost: 0.00087, ips: 47.2846 samples/sec | ETA 02:29:35 2022-08-24 04:37:40 [INFO] [TRAIN] epoch: 85, iter: 107000/160000, loss: 0.4510, lr: 0.000401, batch_cost: 0.1729, reader_cost: 0.00054, ips: 46.2648 samples/sec | ETA 02:32:44 2022-08-24 
04:37:40 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 178s - batch_cost: 0.1783 - reader cost: 0.0012 2022-08-24 04:40:38 [INFO] [EVAL] #Images: 2000 mIoU: 0.3687 Acc: 0.7732 Kappa: 0.7561 Dice: 0.5067 2022-08-24 04:40:38 [INFO] [EVAL] Class IoU: [0.6904 0.7918 0.9299 0.7397 0.6933 0.7728 0.7705 0.796 0.5369 0.6299 0.4974 0.5728 0.7156 0.3302 0.3079 0.4445 0.5025 0.4568 0.6289 0.4249 0.7707 0.4388 0.6332 0.5004 0.326 0.4774 0.4555 0.4718 0.4411 0.2666 0.2902 0.5257 0.3199 0.3577 0.3339 0.4137 0.4888 0.5638 0.2729 0.3837 0.1359 0.1448 0.353 0.269 0.2668 0.1985 0.4174 0.5332 0.5245 0.5015 0.5664 0.2801 0.1905 0.194 0.6951 0.3677 0.8757 0.4246 0.2683 0.2696 0.0967 0.2801 0.2971 0.1527 0.4715 0.7073 0.3023 0.3736 0.128 0.3943 0.5203 0.5633 0.3499 0.2356 0.4993 0.3916 0.5093 0.2923 0.2287 0.3705 0.7106 0.4355 0.4124 0.0329 0.1395 0.5386 0.1307 0.1464 0.2634 0.5214 0.4432 0.0758 0.1792 0.1176 0.0442 0.0785 0.0946 0.1815 0.1964 0.3099 0.1601 0.1657 0.2787 0.7393 0.1898 0.3947 0.173 0.5622 0.0765 0.4585 0.2293 0.3784 0.1926 0.628 0.5067 0.0694 0.3593 0.6288 0.1882 0.3582 0.4081 0.079 0.3584 0.2107 0.2877 0.1865 0.5159 0.4343 0.4676 0.3659 0.6585 0.0475 0.3528 0.4375 0.3301 0.2009 0.1756 0.0331 0.2202 0.3939 0.107 0.0098 0.274 0.3481 0.2542 0.0147 0.4036 0.0395 0.0983 0.1966] 2022-08-24 04:40:38 [INFO] [EVAL] Class Precision: [0.8024 0.8581 0.9593 0.8281 0.7924 0.8623 0.8936 0.8484 0.674 0.7272 0.6965 0.7091 0.7876 0.5033 0.5208 0.6449 0.7192 0.7093 0.8015 0.6666 0.848 0.6764 0.7711 0.6239 0.4598 0.6823 0.5554 0.7304 0.7228 0.4705 0.4498 0.6606 0.5477 0.4974 0.417 0.5192 0.6693 0.8211 0.3889 0.5748 0.3924 0.2867 0.532 0.4746 0.3791 0.4207 0.7356 0.7614 0.6813 0.5914 0.7063 0.3369 0.3607 0.6297 0.7504 0.6974 0.9184 0.6392 0.7599 0.5018 0.1411 0.5954 0.4758 0.7447 0.5853 0.799 0.5058 0.482 0.2828 0.6893 0.6823 0.793 0.5463 0.2888 0.7381 0.5436 0.7374 0.5116 0.6845 0.534 0.8605 0.6468 0.8029 0.0772 0.232 0.732 0.6033 0.389 0.7103 0.6443 0.6019 0.09 0.5071 0.3707 0.1073 0.2099 0.464 0.4272 0.2526 0.6959 0.6681 0.3276 0.6016 0.8324 0.7662 0.4222 0.3892 0.7488 0.1817 0.5353 0.3675 0.4844 0.4724 0.8517 0.5111 0.3117 0.6294 0.7261 0.2655 0.4807 0.7809 0.3996 0.6381 0.6548 0.7397 0.6403 0.7543 0.5441 0.6375 0.4933 0.7926 0.1707 0.6532 0.7163 0.7192 0.348 0.3828 0.0616 0.3897 0.6442 0.3019 0.0249 0.5705 0.737 0.657 0.0174 0.8511 0.2451 0.2305 0.7557] 2022-08-24 04:40:38 [INFO] [EVAL] Class Recall: [0.8317 0.911 0.9681 0.8739 0.8472 0.8816 0.8483 0.9281 0.7253 0.8249 0.635 0.7489 0.8867 0.49 0.4297 0.5885 0.6251 0.5619 0.7449 0.5396 0.8942 0.5553 0.7797 0.7166 0.5283 0.6138 0.7169 0.5713 0.5309 0.3809 0.4499 0.7202 0.4348 0.5601 0.6262 0.6706 0.6444 0.6428 0.4778 0.5357 0.1721 0.2262 0.512 0.383 0.4738 0.2732 0.4911 0.6401 0.6951 0.7674 0.741 0.6243 0.2876 0.219 0.9041 0.4376 0.9496 0.5584 0.2931 0.3681 0.2349 0.3459 0.4416 0.1611 0.708 0.8604 0.4291 0.6243 0.1895 0.4795 0.6866 0.6604 0.4932 0.5611 0.6069 0.5836 0.6221 0.4054 0.2557 0.5476 0.8031 0.5713 0.4588 0.0542 0.259 0.6708 0.143 0.1902 0.2951 0.7321 0.6269 0.3244 0.217 0.1469 0.0699 0.1114 0.1062 0.24 0.4689 0.3585 0.174 0.2511 0.3418 0.8687 0.2014 0.8586 0.2375 0.6929 0.1166 0.7616 0.3787 0.6335 0.2453 0.7052 0.9833 0.082 0.4556 0.8242 0.3924 0.5844 0.4609 0.0897 0.4498 0.237 0.32 0.2083 0.6201 0.6828 0.6369 0.5861 0.7955 0.0618 0.4342 0.5291 0.3789 0.3221 0.2451 0.0668 0.3362 0.5035 0.1422 0.0161 0.3452 0.3975 0.2931 0.0891 0.4342 0.045 0.1463 0.21 ] 2022-08-24 04:40:39 [INFO] [EVAL] The model with the best 
validation mIoU (0.3738) was saved at iter 102000. 2022-08-24 04:40:49 [INFO] [TRAIN] epoch: 85, iter: 107050/160000, loss: 0.4378, lr: 0.000401, batch_cost: 0.1981, reader_cost: 0.00435, ips: 40.3855 samples/sec | ETA 02:54:48 2022-08-24 04:40:57 [INFO] [TRAIN] epoch: 85, iter: 107100/160000, loss: 0.4574, lr: 0.000401, batch_cost: 0.1624, reader_cost: 0.00075, ips: 49.2467 samples/sec | ETA 02:23:13 2022-08-24 04:41:05 [INFO] [TRAIN] epoch: 85, iter: 107150/160000, loss: 0.4598, lr: 0.000400, batch_cost: 0.1603, reader_cost: 0.00098, ips: 49.8987 samples/sec | ETA 02:21:13 2022-08-24 04:41:12 [INFO] [TRAIN] epoch: 85, iter: 107200/160000, loss: 0.4155, lr: 0.000400, batch_cost: 0.1538, reader_cost: 0.00060, ips: 52.0035 samples/sec | ETA 02:15:22 2022-08-24 04:41:22 [INFO] [TRAIN] epoch: 85, iter: 107250/160000, loss: 0.4685, lr: 0.000399, batch_cost: 0.1890, reader_cost: 0.00061, ips: 42.3366 samples/sec | ETA 02:46:07 2022-08-24 04:41:31 [INFO] [TRAIN] epoch: 85, iter: 107300/160000, loss: 0.4688, lr: 0.000399, batch_cost: 0.1816, reader_cost: 0.00055, ips: 44.0453 samples/sec | ETA 02:39:31 2022-08-24 04:41:40 [INFO] [TRAIN] epoch: 85, iter: 107350/160000, loss: 0.4464, lr: 0.000399, batch_cost: 0.1766, reader_cost: 0.00701, ips: 45.3080 samples/sec | ETA 02:34:56 2022-08-24 04:41:51 [INFO] [TRAIN] epoch: 86, iter: 107400/160000, loss: 0.4767, lr: 0.000398, batch_cost: 0.2294, reader_cost: 0.06906, ips: 34.8728 samples/sec | ETA 03:21:06 2022-08-24 04:41:59 [INFO] [TRAIN] epoch: 86, iter: 107450/160000, loss: 0.4482, lr: 0.000398, batch_cost: 0.1539, reader_cost: 0.00110, ips: 51.9802 samples/sec | ETA 02:14:47 2022-08-24 04:42:07 [INFO] [TRAIN] epoch: 86, iter: 107500/160000, loss: 0.4197, lr: 0.000397, batch_cost: 0.1687, reader_cost: 0.00035, ips: 47.4345 samples/sec | ETA 02:27:34 2022-08-24 04:42:16 [INFO] [TRAIN] epoch: 86, iter: 107550/160000, loss: 0.4229, lr: 0.000397, batch_cost: 0.1754, reader_cost: 0.00052, ips: 45.6120 samples/sec | ETA 02:33:19 2022-08-24 04:42:24 [INFO] [TRAIN] epoch: 86, iter: 107600/160000, loss: 0.4461, lr: 0.000397, batch_cost: 0.1613, reader_cost: 0.00050, ips: 49.6002 samples/sec | ETA 02:20:51 2022-08-24 04:42:32 [INFO] [TRAIN] epoch: 86, iter: 107650/160000, loss: 0.4268, lr: 0.000396, batch_cost: 0.1607, reader_cost: 0.00093, ips: 49.7728 samples/sec | ETA 02:20:14 2022-08-24 04:42:42 [INFO] [TRAIN] epoch: 86, iter: 107700/160000, loss: 0.4384, lr: 0.000396, batch_cost: 0.2007, reader_cost: 0.00078, ips: 39.8655 samples/sec | ETA 02:54:55 2022-08-24 04:42:52 [INFO] [TRAIN] epoch: 86, iter: 107750/160000, loss: 0.4503, lr: 0.000396, batch_cost: 0.2014, reader_cost: 0.00049, ips: 39.7154 samples/sec | ETA 02:55:24 2022-08-24 04:43:01 [INFO] [TRAIN] epoch: 86, iter: 107800/160000, loss: 0.4536, lr: 0.000395, batch_cost: 0.1633, reader_cost: 0.00059, ips: 48.9969 samples/sec | ETA 02:22:02 2022-08-24 04:43:09 [INFO] [TRAIN] epoch: 86, iter: 107850/160000, loss: 0.4363, lr: 0.000395, batch_cost: 0.1729, reader_cost: 0.00075, ips: 46.2582 samples/sec | ETA 02:30:18 2022-08-24 04:43:17 [INFO] [TRAIN] epoch: 86, iter: 107900/160000, loss: 0.4067, lr: 0.000394, batch_cost: 0.1635, reader_cost: 0.00046, ips: 48.9294 samples/sec | ETA 02:21:58 2022-08-24 04:43:26 [INFO] [TRAIN] epoch: 86, iter: 107950/160000, loss: 0.4179, lr: 0.000394, batch_cost: 0.1733, reader_cost: 0.00075, ips: 46.1679 samples/sec | ETA 02:30:19 2022-08-24 04:43:35 [INFO] [TRAIN] epoch: 86, iter: 108000/160000, loss: 0.4598, lr: 0.000394, batch_cost: 0.1701, reader_cost: 0.00110, 
ips: 47.0334 samples/sec | ETA 02:27:24 2022-08-24 04:43:35 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 188s - batch_cost: 0.1876 - reader cost: 7.4820e-04 2022-08-24 04:46:42 [INFO] [EVAL] #Images: 2000 mIoU: 0.3719 Acc: 0.7722 Kappa: 0.7547 Dice: 0.5087 2022-08-24 04:46:42 [INFO] [EVAL] Class IoU: [0.6841 0.7833 0.93 0.7396 0.6892 0.7695 0.7733 0.8037 0.5342 0.6368 0.4823 0.5594 0.7183 0.3084 0.3125 0.4378 0.4635 0.4419 0.6077 0.4387 0.7722 0.4722 0.6302 0.5175 0.3327 0.4404 0.4739 0.4614 0.4462 0.2742 0.314 0.5431 0.3007 0.3538 0.3627 0.4049 0.4866 0.5742 0.2638 0.3644 0.1159 0.1189 0.3554 0.2629 0.2394 0.2248 0.3833 0.5506 0.6158 0.5338 0.6121 0.3339 0.1449 0.1455 0.7252 0.3678 0.885 0.4219 0.4759 0.2726 0.0925 0.2928 0.3118 0.2099 0.501 0.72 0.2967 0.3718 0.1213 0.3809 0.5202 0.5831 0.4068 0.246 0.4934 0.3971 0.5021 0.3263 0.3394 0.3339 0.7109 0.3907 0.4493 0.0416 0.1165 0.5264 0.1403 0.1267 0.2746 0.5187 0.4339 0.1508 0.13 0.0806 0.0306 0.0628 0.1466 0.1843 0.2011 0.3795 0.1761 0.1282 0.2923 0.7049 0.1927 0.3934 0.2285 0.5756 0.0898 0.3498 0.1949 0.3603 0.1934 0.6207 0.7214 0.0607 0.3632 0.6226 0.1444 0.3463 0.4748 0.0924 0.3565 0.2045 0.2917 0.2158 0.5433 0.4441 0.5489 0.3548 0.5619 0.0289 0.283 0.4103 0.3201 0.2063 0.1806 0.0277 0.2034 0.4039 0.0204 0.0019 0.3068 0.2954 0.2384 0. 0.3985 0.0406 0.1122 0.2208] 2022-08-24 04:46:42 [INFO] [EVAL] Class Precision: [0.775 0.8608 0.9644 0.8301 0.7692 0.8833 0.8734 0.8744 0.6602 0.746 0.7339 0.6964 0.7966 0.5212 0.5333 0.5992 0.7067 0.6839 0.8396 0.6293 0.8495 0.7452 0.7742 0.6625 0.4475 0.6132 0.5859 0.7789 0.7021 0.5027 0.4699 0.728 0.5318 0.4734 0.452 0.5453 0.6615 0.8474 0.366 0.6445 0.3815 0.2877 0.5524 0.4606 0.2974 0.4478 0.8721 0.7602 0.7012 0.6549 0.7593 0.4089 0.3709 0.5572 0.7925 0.6911 0.9289 0.6644 0.701 0.4192 0.1317 0.5659 0.4364 0.5544 0.6907 0.8482 0.4997 0.5574 0.2804 0.6023 0.7279 0.7412 0.5506 0.2849 0.7868 0.6196 0.7846 0.6465 0.7711 0.7239 0.8463 0.6613 0.7751 0.1125 0.2074 0.7254 0.5956 0.3917 0.5621 0.6809 0.5829 0.2429 0.4902 0.3686 0.0851 0.1873 0.5339 0.3604 0.2707 0.6295 0.6759 0.248 0.5155 0.8673 0.8187 0.4197 0.4144 0.7827 0.2002 0.4783 0.3735 0.4364 0.56 0.8676 0.7313 0.3063 0.6264 0.7311 0.2753 0.5501 0.7298 0.4299 0.7313 0.6542 0.7482 0.6529 0.8486 0.5654 0.8024 0.5945 0.6041 0.2864 0.5262 0.7795 0.702 0.4208 0.358 0.0656 0.4093 0.6939 0.2414 0.0038 0.5727 0.6648 0.6425 0. 0.8431 0.2309 0.2497 0.7743] 2022-08-24 04:46:42 [INFO] [EVAL] Class Recall: [0.8536 0.897 0.963 0.8716 0.8688 0.8566 0.8709 0.9087 0.7368 0.8132 0.5845 0.7398 0.8796 0.4304 0.4302 0.6191 0.5738 0.5554 0.6876 0.5917 0.8946 0.5631 0.772 0.7028 0.5647 0.6099 0.7126 0.5309 0.5503 0.3763 0.4863 0.6814 0.409 0.5836 0.6475 0.6112 0.6479 0.6404 0.4858 0.4561 0.1426 0.1685 0.4991 0.3799 0.5507 0.3111 0.4061 0.6663 0.8349 0.7426 0.7595 0.6455 0.1921 0.1645 0.8951 0.4401 0.9493 0.5361 0.5971 0.4382 0.2372 0.3777 0.5221 0.2525 0.646 0.8264 0.4222 0.5275 0.1761 0.5089 0.6458 0.7322 0.6091 0.6431 0.5695 0.5251 0.5824 0.3971 0.3774 0.3826 0.8163 0.4884 0.5167 0.062 0.2099 0.6574 0.1551 0.1578 0.3493 0.6853 0.6292 0.2847 0.1503 0.0935 0.0456 0.0862 0.1682 0.2739 0.4386 0.4888 0.1923 0.2096 0.4029 0.7901 0.2013 0.8627 0.3375 0.6852 0.14 0.5655 0.2895 0.6739 0.228 0.6856 0.9815 0.0704 0.4636 0.8076 0.233 0.4832 0.5761 0.1053 0.4102 0.2293 0.3235 0.2438 0.6016 0.6742 0.6347 0.4681 0.8896 0.0311 0.3797 0.4642 0.3704 0.2882 0.2672 0.0459 0.288 0.4914 0.0217 0.0039 0.3979 0.3472 0.2749 0. 
0.4305 0.0469 0.1693 0.2359] 2022-08-24 04:46:43 [INFO] [EVAL] The model with the best validation mIoU (0.3738) was saved at iter 102000. 2022-08-24 04:46:52 [INFO] [TRAIN] epoch: 86, iter: 108050/160000, loss: 0.4245, lr: 0.000393, batch_cost: 0.1820, reader_cost: 0.00383, ips: 43.9477 samples/sec | ETA 02:37:36 2022-08-24 04:47:02 [INFO] [TRAIN] epoch: 86, iter: 108100/160000, loss: 0.4706, lr: 0.000393, batch_cost: 0.1951, reader_cost: 0.00081, ips: 41.0042 samples/sec | ETA 02:48:45 2022-08-24 04:47:11 [INFO] [TRAIN] epoch: 86, iter: 108150/160000, loss: 0.4657, lr: 0.000393, batch_cost: 0.1813, reader_cost: 0.00089, ips: 44.1287 samples/sec | ETA 02:36:39 2022-08-24 04:47:20 [INFO] [TRAIN] epoch: 86, iter: 108200/160000, loss: 0.4361, lr: 0.000392, batch_cost: 0.1908, reader_cost: 0.00070, ips: 41.9389 samples/sec | ETA 02:44:41 2022-08-24 04:47:29 [INFO] [TRAIN] epoch: 86, iter: 108250/160000, loss: 0.4284, lr: 0.000392, batch_cost: 0.1763, reader_cost: 0.00075, ips: 45.3886 samples/sec | ETA 02:32:01 2022-08-24 04:47:38 [INFO] [TRAIN] epoch: 86, iter: 108300/160000, loss: 0.4730, lr: 0.000391, batch_cost: 0.1710, reader_cost: 0.00049, ips: 46.7926 samples/sec | ETA 02:27:18 2022-08-24 04:47:46 [INFO] [TRAIN] epoch: 86, iter: 108350/160000, loss: 0.4541, lr: 0.000391, batch_cost: 0.1754, reader_cost: 0.00047, ips: 45.6031 samples/sec | ETA 02:31:00 2022-08-24 04:47:56 [INFO] [TRAIN] epoch: 86, iter: 108400/160000, loss: 0.4807, lr: 0.000391, batch_cost: 0.1901, reader_cost: 0.00059, ips: 42.0722 samples/sec | ETA 02:43:31 2022-08-24 04:48:05 [INFO] [TRAIN] epoch: 86, iter: 108450/160000, loss: 0.4160, lr: 0.000390, batch_cost: 0.1858, reader_cost: 0.00074, ips: 43.0608 samples/sec | ETA 02:39:37 2022-08-24 04:48:14 [INFO] [TRAIN] epoch: 86, iter: 108500/160000, loss: 0.4449, lr: 0.000390, batch_cost: 0.1761, reader_cost: 0.00208, ips: 45.4253 samples/sec | ETA 02:31:09 2022-08-24 04:48:25 [INFO] [TRAIN] epoch: 86, iter: 108550/160000, loss: 0.4430, lr: 0.000390, batch_cost: 0.2298, reader_cost: 0.00036, ips: 34.8154 samples/sec | ETA 03:17:02 2022-08-24 04:48:34 [INFO] [TRAIN] epoch: 86, iter: 108600/160000, loss: 0.4424, lr: 0.000389, batch_cost: 0.1679, reader_cost: 0.00061, ips: 47.6593 samples/sec | ETA 02:23:47 2022-08-24 04:48:45 [INFO] [TRAIN] epoch: 87, iter: 108650/160000, loss: 0.4055, lr: 0.000389, batch_cost: 0.2265, reader_cost: 0.05002, ips: 35.3124 samples/sec | ETA 03:13:53 2022-08-24 04:48:54 [INFO] [TRAIN] epoch: 87, iter: 108700/160000, loss: 0.4405, lr: 0.000388, batch_cost: 0.1787, reader_cost: 0.00052, ips: 44.7761 samples/sec | ETA 02:32:45 2022-08-24 04:49:04 [INFO] [TRAIN] epoch: 87, iter: 108750/160000, loss: 0.4370, lr: 0.000388, batch_cost: 0.1955, reader_cost: 0.00074, ips: 40.9288 samples/sec | ETA 02:46:57 2022-08-24 04:49:13 [INFO] [TRAIN] epoch: 87, iter: 108800/160000, loss: 0.4458, lr: 0.000388, batch_cost: 0.1812, reader_cost: 0.00043, ips: 44.1572 samples/sec | ETA 02:34:35 2022-08-24 04:49:21 [INFO] [TRAIN] epoch: 87, iter: 108850/160000, loss: 0.4049, lr: 0.000387, batch_cost: 0.1529, reader_cost: 0.00076, ips: 52.3161 samples/sec | ETA 02:10:21 2022-08-24 04:49:29 [INFO] [TRAIN] epoch: 87, iter: 108900/160000, loss: 0.4235, lr: 0.000387, batch_cost: 0.1738, reader_cost: 0.00803, ips: 46.0325 samples/sec | ETA 02:28:00 2022-08-24 04:49:37 [INFO] [TRAIN] epoch: 87, iter: 108950/160000, loss: 0.4415, lr: 0.000387, batch_cost: 0.1557, reader_cost: 0.00096, ips: 51.3729 samples/sec | ETA 02:12:29 2022-08-24 04:49:46 [INFO] [TRAIN] epoch: 87, iter: 
109000/160000, loss: 0.4224, lr: 0.000386, batch_cost: 0.1874, reader_cost: 0.00080, ips: 42.6932 samples/sec | ETA 02:39:16 2022-08-24 04:49:46 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 171s - batch_cost: 0.1714 - reader cost: 9.1767e-04 2022-08-24 04:52:38 [INFO] [EVAL] #Images: 2000 mIoU: 0.3750 Acc: 0.7749 Kappa: 0.7577 Dice: 0.5130 2022-08-24 04:52:38 [INFO] [EVAL] Class IoU: [0.693 0.7907 0.9312 0.7427 0.6852 0.7772 0.772 0.8023 0.5355 0.605 0.4929 0.5718 0.7164 0.3362 0.3128 0.4455 0.481 0.437 0.6197 0.4378 0.7669 0.4898 0.6308 0.5147 0.3356 0.4171 0.4682 0.4524 0.4471 0.2566 0.2626 0.5383 0.3202 0.355 0.3769 0.4292 0.4871 0.5674 0.2784 0.3648 0.1331 0.1385 0.3576 0.272 0.2409 0.2325 0.3531 0.5263 0.6121 0.5509 0.6161 0.3764 0.1684 0.1811 0.6948 0.369 0.8809 0.4348 0.4374 0.2689 0.0835 0.3372 0.3089 0.1844 0.4893 0.7258 0.2838 0.3818 0.126 0.3716 0.5197 0.5535 0.3568 0.245 0.4947 0.3756 0.4989 0.315 0.4227 0.3638 0.7044 0.4297 0.3903 0.081 0.1109 0.5266 0.1221 0.1341 0.3286 0.5323 0.4683 0.056 0.1081 0.1265 0.0102 0.0577 0.1542 0.2033 0.2056 0.321 0.1962 0.1547 0.2486 0.7104 0.1961 0.5015 0.2166 0.5618 0.0826 0.2541 0.2109 0.4112 0.1895 0.6566 0.6255 0.0719 0.4297 0.6269 0.1932 0.3805 0.5262 0.0895 0.2616 0.1981 0.2887 0.2605 0.5541 0.4717 0.5568 0.35 0.6007 0.0337 0.2747 0.4057 0.2996 0.1959 0.1726 0.0268 0.2125 0.4209 0.2344 0.0168 0.3069 0.2577 0.2331 0. 0.4114 0.0361 0.1154 0.2161] 2022-08-24 04:52:38 [INFO] [EVAL] Class Precision: [0.7909 0.8636 0.9637 0.8304 0.7602 0.8718 0.8899 0.861 0.6674 0.7635 0.7119 0.707 0.7861 0.488 0.5732 0.6106 0.7331 0.6862 0.7756 0.6395 0.8403 0.7105 0.7857 0.6016 0.4723 0.6458 0.5574 0.6922 0.7331 0.4658 0.4935 0.7052 0.5483 0.462 0.4708 0.5333 0.7104 0.8246 0.3923 0.6354 0.3687 0.2841 0.5876 0.4609 0.3341 0.5231 0.7536 0.7012 0.7142 0.6968 0.8104 0.4776 0.3648 0.6539 0.7221 0.7836 0.9396 0.6687 0.6015 0.5048 0.1297 0.5877 0.4705 0.6365 0.6417 0.8644 0.4857 0.5211 0.2792 0.635 0.6769 0.8257 0.5179 0.3502 0.7563 0.515 0.7416 0.6225 0.8052 0.707 0.8385 0.6401 0.84 0.1819 0.1919 0.7257 0.6023 0.4185 0.7093 0.678 0.6902 0.0665 0.2278 0.3431 0.0304 0.2148 0.5187 0.442 0.281 0.6444 0.6713 0.281 0.532 0.8665 0.8606 0.5503 0.4668 0.7583 0.1767 0.4004 0.4492 0.5371 0.6276 0.8516 0.6327 0.2719 0.7675 0.7123 0.3301 0.5379 0.7095 0.4942 0.6227 0.6951 0.7489 0.6062 0.843 0.6047 0.8257 0.4617 0.7304 0.211 0.7189 0.8302 0.7084 0.4249 0.3995 0.0908 0.4765 0.6459 0.4556 0.027 0.6035 0.5801 0.6939 0. 
0.8586 0.2688 0.2584 0.7657] 2022-08-24 04:52:38 [INFO] [EVAL] Class Recall: [0.8484 0.9036 0.965 0.8756 0.8741 0.8775 0.8536 0.9216 0.7304 0.7445 0.6157 0.7495 0.8899 0.5195 0.4078 0.6223 0.5832 0.5461 0.7551 0.5811 0.8977 0.6119 0.7618 0.7809 0.5368 0.5407 0.7453 0.5663 0.5341 0.3636 0.3595 0.6945 0.435 0.6053 0.654 0.6873 0.6077 0.6452 0.4897 0.4613 0.1724 0.2128 0.4775 0.3988 0.4632 0.295 0.3991 0.6785 0.8107 0.7246 0.72 0.6398 0.2383 0.2003 0.9484 0.4109 0.9338 0.5542 0.6158 0.3652 0.1897 0.4417 0.4736 0.2061 0.6733 0.819 0.4057 0.5883 0.1868 0.4726 0.6912 0.6267 0.5341 0.4493 0.5886 0.5811 0.6039 0.3893 0.4709 0.4284 0.815 0.5666 0.4216 0.1273 0.208 0.6574 0.1329 0.1648 0.3797 0.7124 0.593 0.263 0.1706 0.167 0.0151 0.0732 0.18 0.2736 0.434 0.39 0.2171 0.256 0.3182 0.7977 0.2026 0.8499 0.2878 0.6844 0.1343 0.4101 0.2845 0.6369 0.2135 0.7414 0.982 0.0891 0.4939 0.8395 0.3179 0.5652 0.6706 0.0985 0.3109 0.217 0.3197 0.3136 0.6178 0.6818 0.6309 0.5914 0.7718 0.0385 0.3077 0.4424 0.3417 0.2665 0.2331 0.0366 0.2773 0.5471 0.3256 0.0422 0.3844 0.3168 0.2598 0. 0.4413 0.04 0.1725 0.2314] 2022-08-24 04:52:38 [INFO] [EVAL] The model with the best validation mIoU (0.3750) was saved at iter 109000. 2022-08-24 04:52:47 [INFO] [TRAIN] epoch: 87, iter: 109050/160000, loss: 0.4393, lr: 0.000386, batch_cost: 0.1652, reader_cost: 0.00353, ips: 48.4230 samples/sec | ETA 02:20:17 2022-08-24 04:52:54 [INFO] [TRAIN] epoch: 87, iter: 109100/160000, loss: 0.4569, lr: 0.000385, batch_cost: 0.1569, reader_cost: 0.00148, ips: 50.9788 samples/sec | ETA 02:13:07 2022-08-24 04:53:03 [INFO] [TRAIN] epoch: 87, iter: 109150/160000, loss: 0.4503, lr: 0.000385, batch_cost: 0.1763, reader_cost: 0.00104, ips: 45.3760 samples/sec | ETA 02:29:25 2022-08-24 04:53:13 [INFO] [TRAIN] epoch: 87, iter: 109200/160000, loss: 0.4167, lr: 0.000385, batch_cost: 0.1861, reader_cost: 0.00068, ips: 42.9985 samples/sec | ETA 02:37:31 2022-08-24 04:53:21 [INFO] [TRAIN] epoch: 87, iter: 109250/160000, loss: 0.4410, lr: 0.000384, batch_cost: 0.1739, reader_cost: 0.00056, ips: 45.9970 samples/sec | ETA 02:27:06 2022-08-24 04:53:30 [INFO] [TRAIN] epoch: 87, iter: 109300/160000, loss: 0.4132, lr: 0.000384, batch_cost: 0.1742, reader_cost: 0.00052, ips: 45.9185 samples/sec | ETA 02:27:13 2022-08-24 04:53:40 [INFO] [TRAIN] epoch: 87, iter: 109350/160000, loss: 0.4269, lr: 0.000383, batch_cost: 0.2098, reader_cost: 0.00050, ips: 38.1357 samples/sec | ETA 02:57:05 2022-08-24 04:53:49 [INFO] [TRAIN] epoch: 87, iter: 109400/160000, loss: 0.4380, lr: 0.000383, batch_cost: 0.1719, reader_cost: 0.00073, ips: 46.5281 samples/sec | ETA 02:25:00 2022-08-24 04:53:59 [INFO] [TRAIN] epoch: 87, iter: 109450/160000, loss: 0.4379, lr: 0.000383, batch_cost: 0.2056, reader_cost: 0.00053, ips: 38.9186 samples/sec | ETA 02:53:10 2022-08-24 04:54:09 [INFO] [TRAIN] epoch: 87, iter: 109500/160000, loss: 0.4498, lr: 0.000382, batch_cost: 0.1836, reader_cost: 0.00063, ips: 43.5785 samples/sec | ETA 02:34:30 2022-08-24 04:54:17 [INFO] [TRAIN] epoch: 87, iter: 109550/160000, loss: 0.4622, lr: 0.000382, batch_cost: 0.1783, reader_cost: 0.00049, ips: 44.8707 samples/sec | ETA 02:29:54 2022-08-24 04:54:26 [INFO] [TRAIN] epoch: 87, iter: 109600/160000, loss: 0.4359, lr: 0.000382, batch_cost: 0.1644, reader_cost: 0.00080, ips: 48.6752 samples/sec | ETA 02:18:03 2022-08-24 04:54:34 [INFO] [TRAIN] epoch: 87, iter: 109650/160000, loss: 0.4167, lr: 0.000381, batch_cost: 0.1700, reader_cost: 0.00148, ips: 47.0690 samples/sec | ETA 02:22:37 2022-08-24 04:54:43 [INFO] [TRAIN] epoch: 
87, iter: 109700/160000, loss: 0.4539, lr: 0.000381, batch_cost: 0.1846, reader_cost: 0.00061, ips: 43.3298 samples/sec | ETA 02:34:46 2022-08-24 04:54:53 [INFO] [TRAIN] epoch: 87, iter: 109750/160000, loss: 0.4685, lr: 0.000380, batch_cost: 0.1960, reader_cost: 0.00071, ips: 40.8238 samples/sec | ETA 02:44:07 2022-08-24 04:55:02 [INFO] [TRAIN] epoch: 87, iter: 109800/160000, loss: 0.4713, lr: 0.000380, batch_cost: 0.1833, reader_cost: 0.00042, ips: 43.6479 samples/sec | ETA 02:33:20 2022-08-24 04:55:12 [INFO] [TRAIN] epoch: 87, iter: 109850/160000, loss: 0.4408, lr: 0.000380, batch_cost: 0.1846, reader_cost: 0.00055, ips: 43.3264 samples/sec | ETA 02:34:19 2022-08-24 04:55:24 [INFO] [TRAIN] epoch: 88, iter: 109900/160000, loss: 0.4568, lr: 0.000379, batch_cost: 0.2456, reader_cost: 0.07570, ips: 32.5728 samples/sec | ETA 03:25:04 2022-08-24 04:55:33 [INFO] [TRAIN] epoch: 88, iter: 109950/160000, loss: 0.4329, lr: 0.000379, batch_cost: 0.1801, reader_cost: 0.00037, ips: 44.4276 samples/sec | ETA 02:30:12 2022-08-24 04:55:42 [INFO] [TRAIN] epoch: 88, iter: 110000/160000, loss: 0.4173, lr: 0.000379, batch_cost: 0.1851, reader_cost: 0.00036, ips: 43.2116 samples/sec | ETA 02:34:16 2022-08-24 04:55:42 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 180s - batch_cost: 0.1796 - reader cost: 0.0011 2022-08-24 04:58:42 [INFO] [EVAL] #Images: 2000 mIoU: 0.3728 Acc: 0.7750 Kappa: 0.7577 Dice: 0.5107 2022-08-24 04:58:42 [INFO] [EVAL] Class IoU: [0.6917 0.7903 0.9324 0.736 0.6891 0.7735 0.7808 0.8017 0.5351 0.6317 0.5012 0.5759 0.7171 0.3037 0.3068 0.4476 0.4656 0.4222 0.6161 0.438 0.7702 0.4613 0.6445 0.5262 0.3245 0.4385 0.469 0.4547 0.4477 0.2617 0.2936 0.5142 0.2777 0.3594 0.3646 0.4153 0.4933 0.5629 0.292 0.3798 0.1545 0.1579 0.3582 0.2693 0.2641 0.212 0.4329 0.5442 0.5255 0.5379 0.5945 0.3889 0.178 0.2576 0.6889 0.3391 0.8888 0.4113 0.3211 0.2926 0.0947 0.2683 0.2872 0.1259 0.5023 0.7124 0.2779 0.3778 0.1219 0.3864 0.4987 0.5716 0.3525 0.2227 0.4908 0.3998 0.5095 0.2891 0.438 0.3872 0.7251 0.4294 0.4391 0.0304 0.1095 0.5202 0.1034 0.1182 0.3449 0.5449 0.4471 0.0384 0.1247 0.1217 0.0349 0.0458 0.1114 0.2173 0.2015 0.3018 0.2312 0.1141 0.2579 0.6908 0.1952 0.4278 0.1695 0.5683 0.0628 0.3633 0.2203 0.3545 0.1984 0.5832 0.7289 0.0691 0.4226 0.6061 0.2048 0.3894 0.4396 0.0927 0.3179 0.2022 0.2982 0.2466 0.5376 0.4445 0.5431 0.4015 0.5672 0.0427 0.2845 0.4527 0.2913 0.2167 0.1636 0.0299 0.2207 0.4098 0.2723 0.0013 0.2642 0.3023 0.1761 0.0093 0.3844 0.046 0.1457 0.2156] 2022-08-24 04:58:42 [INFO] [EVAL] Class Precision: [0.7887 0.8527 0.9652 0.8157 0.7601 0.885 0.8851 0.8584 0.658 0.7325 0.6833 0.7157 0.7945 0.4994 0.5409 0.6242 0.7052 0.7137 0.7969 0.6434 0.8385 0.676 0.7944 0.6573 0.4823 0.6689 0.5759 0.7973 0.7216 0.4314 0.4981 0.7183 0.5718 0.5139 0.4612 0.547 0.6857 0.8357 0.4452 0.6126 0.3897 0.3257 0.5951 0.4956 0.3698 0.4567 0.7953 0.7621 0.7179 0.6936 0.8155 0.4546 0.338 0.5986 0.7335 0.706 0.9461 0.6697 0.6557 0.6037 0.1432 0.6351 0.3862 0.7192 0.6606 0.805 0.5018 0.5125 0.2995 0.6926 0.7722 0.781 0.6199 0.3747 0.6945 0.6036 0.7539 0.6198 0.7688 0.5466 0.8535 0.6883 0.7759 0.1153 0.1808 0.7248 0.6738 0.4543 0.8101 0.7239 0.5816 0.0703 0.324 0.3654 0.117 0.1744 0.3994 0.3871 0.2674 0.6887 0.7356 0.187 0.5408 0.8782 0.7404 0.4625 0.3498 0.7479 0.1877 0.47 0.4786 0.4245 0.6123 0.7736 0.7378 0.2316 0.7596 0.6872 0.3022 0.5332 0.748 0.498 0.5246 0.636 0.7039 0.6689 0.7716 0.5612 0.7894 0.6408 0.6419 0.3696 0.6256 0.8246 0.728 0.3884 0.4426 0.0553 0.4772 0.6591 
0.5795 0.0027 0.6122 0.723 0.6995 0.0134 0.8335 0.2443 0.4699 0.7711] 2022-08-24 04:58:42 [INFO] [EVAL] Class Recall: [0.849 0.9151 0.9648 0.8829 0.8806 0.8599 0.8689 0.9239 0.7414 0.8212 0.6528 0.7466 0.8805 0.4367 0.4148 0.6128 0.5781 0.5083 0.7309 0.5785 0.9044 0.5922 0.7735 0.7252 0.498 0.5601 0.7164 0.5141 0.5412 0.3995 0.4169 0.644 0.3506 0.5445 0.635 0.633 0.6375 0.6329 0.4589 0.4998 0.2039 0.2347 0.4737 0.371 0.4802 0.2834 0.4872 0.6556 0.6623 0.7055 0.687 0.7292 0.2732 0.3115 0.919 0.3949 0.9362 0.5159 0.3863 0.3621 0.2184 0.3172 0.5284 0.1324 0.6769 0.861 0.3838 0.5897 0.1706 0.4664 0.5847 0.6807 0.4496 0.3545 0.6259 0.5421 0.6112 0.3514 0.5045 0.5705 0.8282 0.5331 0.5029 0.0396 0.2172 0.6481 0.1088 0.1378 0.3753 0.6878 0.659 0.0782 0.1685 0.1542 0.0473 0.0584 0.1338 0.3313 0.45 0.3495 0.2522 0.2262 0.3301 0.764 0.2095 0.8509 0.2474 0.703 0.0863 0.6154 0.2898 0.6827 0.2269 0.7032 0.9837 0.0896 0.4879 0.837 0.3887 0.5908 0.516 0.1023 0.4465 0.2286 0.3409 0.281 0.6394 0.6812 0.6351 0.5181 0.8297 0.046 0.3429 0.5009 0.3269 0.3289 0.2061 0.061 0.2911 0.5201 0.3394 0.0024 0.3173 0.3419 0.1905 0.0291 0.4163 0.0537 0.1744 0.2304] 2022-08-24 04:58:42 [INFO] [EVAL] The model with the best validation mIoU (0.3750) was saved at iter 109000. 2022-08-24 04:58:52 [INFO] [TRAIN] epoch: 88, iter: 110050/160000, loss: 0.4390, lr: 0.000378, batch_cost: 0.1893, reader_cost: 0.00525, ips: 42.2589 samples/sec | ETA 02:37:35 2022-08-24 04:59:01 [INFO] [TRAIN] epoch: 88, iter: 110100/160000, loss: 0.4521, lr: 0.000378, batch_cost: 0.1938, reader_cost: 0.00156, ips: 41.2830 samples/sec | ETA 02:41:09 2022-08-24 04:59:12 [INFO] [TRAIN] epoch: 88, iter: 110150/160000, loss: 0.4593, lr: 0.000377, batch_cost: 0.2025, reader_cost: 0.00046, ips: 39.5042 samples/sec | ETA 02:48:15 2022-08-24 04:59:20 [INFO] [TRAIN] epoch: 88, iter: 110200/160000, loss: 0.4275, lr: 0.000377, batch_cost: 0.1729, reader_cost: 0.00052, ips: 46.2570 samples/sec | ETA 02:23:32 2022-08-24 04:59:29 [INFO] [TRAIN] epoch: 88, iter: 110250/160000, loss: 0.4350, lr: 0.000377, batch_cost: 0.1763, reader_cost: 0.00078, ips: 45.3832 samples/sec | ETA 02:26:09 2022-08-24 04:59:37 [INFO] [TRAIN] epoch: 88, iter: 110300/160000, loss: 0.4517, lr: 0.000376, batch_cost: 0.1674, reader_cost: 0.00063, ips: 47.7813 samples/sec | ETA 02:18:41 2022-08-24 04:59:45 [INFO] [TRAIN] epoch: 88, iter: 110350/160000, loss: 0.4213, lr: 0.000376, batch_cost: 0.1546, reader_cost: 0.00278, ips: 51.7600 samples/sec | ETA 02:07:53 2022-08-24 04:59:56 [INFO] [TRAIN] epoch: 88, iter: 110400/160000, loss: 0.4390, lr: 0.000376, batch_cost: 0.2118, reader_cost: 0.00059, ips: 37.7770 samples/sec | ETA 02:55:03 2022-08-24 05:00:05 [INFO] [TRAIN] epoch: 88, iter: 110450/160000, loss: 0.4208, lr: 0.000375, batch_cost: 0.1775, reader_cost: 0.00070, ips: 45.0823 samples/sec | ETA 02:26:32 2022-08-24 05:00:14 [INFO] [TRAIN] epoch: 88, iter: 110500/160000, loss: 0.4221, lr: 0.000375, batch_cost: 0.1802, reader_cost: 0.00039, ips: 44.3936 samples/sec | ETA 02:28:40 2022-08-24 05:00:23 [INFO] [TRAIN] epoch: 88, iter: 110550/160000, loss: 0.4089, lr: 0.000374, batch_cost: 0.1936, reader_cost: 0.00092, ips: 41.3275 samples/sec | ETA 02:39:32 2022-08-24 05:00:31 [INFO] [TRAIN] epoch: 88, iter: 110600/160000, loss: 0.4207, lr: 0.000374, batch_cost: 0.1639, reader_cost: 0.00071, ips: 48.8134 samples/sec | ETA 02:14:56 2022-08-24 05:00:39 [INFO] [TRAIN] epoch: 88, iter: 110650/160000, loss: 0.4258, lr: 0.000374, batch_cost: 0.1540, reader_cost: 0.00049, ips: 51.9534 samples/sec | ETA 
02:06:39 2022-08-24 05:00:47 [INFO] [TRAIN] epoch: 88, iter: 110700/160000, loss: 0.4255, lr: 0.000373, batch_cost: 0.1551, reader_cost: 0.00030, ips: 51.5929 samples/sec | ETA 02:07:24 2022-08-24 05:00:55 [INFO] [TRAIN] epoch: 88, iter: 110750/160000, loss: 0.4266, lr: 0.000373, batch_cost: 0.1693, reader_cost: 0.00062, ips: 47.2441 samples/sec | ETA 02:18:59 2022-08-24 05:01:04 [INFO] [TRAIN] epoch: 88, iter: 110800/160000, loss: 0.4355, lr: 0.000372, batch_cost: 0.1763, reader_cost: 0.00062, ips: 45.3782 samples/sec | ETA 02:24:33 2022-08-24 05:01:13 [INFO] [TRAIN] epoch: 88, iter: 110850/160000, loss: 0.4583, lr: 0.000372, batch_cost: 0.1824, reader_cost: 0.00056, ips: 43.8502 samples/sec | ETA 02:29:26 2022-08-24 05:01:22 [INFO] [TRAIN] epoch: 88, iter: 110900/160000, loss: 0.4035, lr: 0.000372, batch_cost: 0.1828, reader_cost: 0.00061, ips: 43.7548 samples/sec | ETA 02:29:37 2022-08-24 05:01:31 [INFO] [TRAIN] epoch: 88, iter: 110950/160000, loss: 0.4582, lr: 0.000371, batch_cost: 0.1706, reader_cost: 0.00071, ips: 46.9030 samples/sec | ETA 02:19:26 2022-08-24 05:01:39 [INFO] [TRAIN] epoch: 88, iter: 111000/160000, loss: 0.4073, lr: 0.000371, batch_cost: 0.1693, reader_cost: 0.00044, ips: 47.2565 samples/sec | ETA 02:18:15 2022-08-24 05:01:39 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 176s - batch_cost: 0.1756 - reader cost: 8.3659e-04 2022-08-24 05:04:35 [INFO] [EVAL] #Images: 2000 mIoU: 0.3754 Acc: 0.7762 Kappa: 0.7590 Dice: 0.5129 2022-08-24 05:04:35 [INFO] [EVAL] Class IoU: [0.693 0.7945 0.9318 0.7425 0.6973 0.7758 0.7736 0.8026 0.5295 0.6284 0.4955 0.5603 0.7148 0.3112 0.3004 0.4479 0.5092 0.4453 0.6148 0.4454 0.77 0.4341 0.6358 0.5211 0.3295 0.4332 0.4339 0.479 0.4639 0.2589 0.2936 0.5451 0.3063 0.3529 0.3445 0.4134 0.4899 0.5703 0.3054 0.364 0.1486 0.1352 0.3512 0.2725 0.2407 0.2569 0.4004 0.5506 0.6327 0.5039 0.5981 0.3651 0.1482 0.2044 0.69 0.3857 0.8852 0.4486 0.4243 0.2858 0.0979 0.2862 0.3288 0.117 0.4728 0.7206 0.2966 0.3866 0.13 0.3605 0.5228 0.5834 0.3824 0.2386 0.4892 0.3953 0.5187 0.2971 0.2274 0.4125 0.701 0.4283 0.4058 0.0373 0.1111 0.5334 0.0992 0.1174 0.3777 0.5364 0.4456 0.0171 0.1954 0.1078 0.0476 0.0538 0.1436 0.1897 0.2246 0.3115 0.3016 0.1177 0.2646 0.7775 0.1909 0.6056 0.1697 0.5604 0.1088 0.3713 0.2188 0.4637 0.1914 0.6721 0.6166 0.0781 0.4278 0.6567 0.1786 0.4199 0.4164 0.0831 0.2869 0.1934 0.3053 0.2503 0.5509 0.4604 0.3606 0.4095 0.5495 0.0495 0.2816 0.4214 0.2817 0.2066 0.1789 0.0202 0.2224 0.4108 0.1329 0.0071 0.2908 0.2677 0.2588 0.0107 0.4019 0.0401 0.1274 0.2093] 2022-08-24 05:04:35 [INFO] [EVAL] Class Precision: [0.7868 0.8566 0.9634 0.8344 0.7973 0.877 0.8702 0.854 0.6652 0.7529 0.7209 0.7069 0.7899 0.5339 0.5747 0.6348 0.6683 0.68 0.7411 0.6301 0.8362 0.6752 0.7826 0.6437 0.457 0.672 0.5292 0.7065 0.707 0.4573 0.4909 0.675 0.536 0.4703 0.4365 0.5423 0.6546 0.8089 0.4799 0.6376 0.3569 0.346 0.5831 0.5351 0.322 0.4821 0.762 0.7522 0.7387 0.5998 0.8124 0.4488 0.3244 0.5921 0.7651 0.6456 0.9425 0.661 0.6972 0.5 0.1609 0.6245 0.4544 0.661 0.571 0.9012 0.4286 0.5502 0.2939 0.6051 0.67 0.8343 0.5509 0.3488 0.7507 0.6086 0.8266 0.6687 0.7295 0.7105 0.8383 0.7179 0.8278 0.1605 0.2098 0.7221 0.6652 0.4569 0.8131 0.7147 0.6121 0.0195 0.3793 0.3855 0.1298 0.1717 0.5103 0.4011 0.3234 0.7405 0.7775 0.2008 0.5609 0.8829 0.6789 0.7021 0.4055 0.7649 0.2184 0.4915 0.4256 0.6452 0.5965 0.8261 0.6216 0.3623 0.6944 0.7809 0.3802 0.5932 0.7483 0.4496 0.642 0.5985 0.6985 0.6666 0.7991 0.5777 0.4482 0.6006 0.602 0.3434 0.6678 0.823 
0.7245 0.4138 0.3439 0.0737 0.5399 0.6675 0.5531 0.0122 0.6552 0.5797 0.6227 0.0143 0.8531 0.2906 0.2722 0.7766] 2022-08-24 05:04:35 [INFO] [EVAL] Class Recall: [0.8533 0.9164 0.9661 0.8707 0.8475 0.8705 0.8745 0.9302 0.7219 0.7916 0.6131 0.7299 0.8827 0.4273 0.3863 0.6034 0.6815 0.5633 0.783 0.6031 0.9067 0.5488 0.7722 0.7323 0.5414 0.5494 0.7068 0.598 0.5744 0.3737 0.4222 0.7391 0.4168 0.5856 0.6203 0.635 0.6607 0.659 0.4564 0.4589 0.2029 0.1815 0.4689 0.3571 0.4881 0.3547 0.4576 0.6726 0.8152 0.7592 0.694 0.6618 0.2143 0.2378 0.8755 0.4893 0.9357 0.5826 0.5202 0.4001 0.2001 0.3457 0.5431 0.1245 0.7332 0.7824 0.4905 0.5651 0.189 0.4714 0.7041 0.6598 0.5556 0.4301 0.5841 0.53 0.5821 0.3483 0.2483 0.4958 0.8107 0.515 0.4432 0.0464 0.191 0.6711 0.1044 0.1365 0.4136 0.6825 0.6209 0.1201 0.2873 0.1302 0.0698 0.0726 0.1666 0.2646 0.4236 0.3497 0.3301 0.2213 0.3338 0.8669 0.2098 0.8151 0.2259 0.677 0.178 0.6028 0.3105 0.6225 0.2199 0.7828 0.9872 0.0905 0.5269 0.805 0.252 0.5897 0.4842 0.0926 0.3416 0.2222 0.3517 0.2862 0.6394 0.6939 0.6485 0.5628 0.863 0.0547 0.3274 0.4634 0.3155 0.2921 0.2715 0.0271 0.2744 0.5165 0.149 0.0168 0.3433 0.3321 0.3069 0.0405 0.4318 0.0445 0.1932 0.2227] 2022-08-24 05:04:36 [INFO] [EVAL] The model with the best validation mIoU (0.3754) was saved at iter 111000. 2022-08-24 05:04:45 [INFO] [TRAIN] epoch: 88, iter: 111050/160000, loss: 0.4611, lr: 0.000371, batch_cost: 0.1847, reader_cost: 0.00578, ips: 43.3071 samples/sec | ETA 02:30:42 2022-08-24 05:04:54 [INFO] [TRAIN] epoch: 88, iter: 111100/160000, loss: 0.3774, lr: 0.000370, batch_cost: 0.1889, reader_cost: 0.00120, ips: 42.3609 samples/sec | ETA 02:33:54 2022-08-24 05:05:06 [INFO] [TRAIN] epoch: 89, iter: 111150/160000, loss: 0.4299, lr: 0.000370, batch_cost: 0.2278, reader_cost: 0.07017, ips: 35.1138 samples/sec | ETA 03:05:29 2022-08-24 05:05:14 [INFO] [TRAIN] epoch: 89, iter: 111200/160000, loss: 0.4442, lr: 0.000369, batch_cost: 0.1757, reader_cost: 0.00107, ips: 45.5292 samples/sec | ETA 02:22:54 2022-08-24 05:05:25 [INFO] [TRAIN] epoch: 89, iter: 111250/160000, loss: 0.4551, lr: 0.000369, batch_cost: 0.2090, reader_cost: 0.00041, ips: 38.2797 samples/sec | ETA 02:49:48 2022-08-24 05:05:35 [INFO] [TRAIN] epoch: 89, iter: 111300/160000, loss: 0.4235, lr: 0.000369, batch_cost: 0.1940, reader_cost: 0.00074, ips: 41.2305 samples/sec | ETA 02:37:29 2022-08-24 05:05:43 [INFO] [TRAIN] epoch: 89, iter: 111350/160000, loss: 0.4307, lr: 0.000368, batch_cost: 0.1766, reader_cost: 0.00048, ips: 45.3038 samples/sec | ETA 02:23:10 2022-08-24 05:05:53 [INFO] [TRAIN] epoch: 89, iter: 111400/160000, loss: 0.4296, lr: 0.000368, batch_cost: 0.1837, reader_cost: 0.00040, ips: 43.5601 samples/sec | ETA 02:28:45 2022-08-24 05:06:02 [INFO] [TRAIN] epoch: 89, iter: 111450/160000, loss: 0.4168, lr: 0.000368, batch_cost: 0.1909, reader_cost: 0.00054, ips: 41.8994 samples/sec | ETA 02:34:29 2022-08-24 05:06:11 [INFO] [TRAIN] epoch: 89, iter: 111500/160000, loss: 0.4150, lr: 0.000367, batch_cost: 0.1717, reader_cost: 0.00046, ips: 46.5820 samples/sec | ETA 02:18:49 2022-08-24 05:06:19 [INFO] [TRAIN] epoch: 89, iter: 111550/160000, loss: 0.4154, lr: 0.000367, batch_cost: 0.1723, reader_cost: 0.00118, ips: 46.4266 samples/sec | ETA 02:19:08 2022-08-24 05:06:28 [INFO] [TRAIN] epoch: 89, iter: 111600/160000, loss: 0.4504, lr: 0.000366, batch_cost: 0.1636, reader_cost: 0.00047, ips: 48.9144 samples/sec | ETA 02:11:55 2022-08-24 05:06:36 [INFO] [TRAIN] epoch: 89, iter: 111650/160000, loss: 0.4236, lr: 0.000366, batch_cost: 0.1605, 
reader_cost: 0.00046, ips: 49.8364 samples/sec | ETA 02:09:21 2022-08-24 05:06:43 [INFO] [TRAIN] epoch: 89, iter: 111700/160000, loss: 0.4049, lr: 0.000366, batch_cost: 0.1531, reader_cost: 0.00030, ips: 52.2683 samples/sec | ETA 02:03:12 2022-08-24 05:06:52 [INFO] [TRAIN] epoch: 89, iter: 111750/160000, loss: 0.4239, lr: 0.000365, batch_cost: 0.1680, reader_cost: 0.00085, ips: 47.6086 samples/sec | ETA 02:15:07 2022-08-24 05:07:00 [INFO] [TRAIN] epoch: 89, iter: 111800/160000, loss: 0.4400, lr: 0.000365, batch_cost: 0.1693, reader_cost: 0.00081, ips: 47.2537 samples/sec | ETA 02:16:00 2022-08-24 05:07:10 [INFO] [TRAIN] epoch: 89, iter: 111850/160000, loss: 0.4605, lr: 0.000365, batch_cost: 0.1917, reader_cost: 0.00064, ips: 41.7282 samples/sec | ETA 02:33:51 2022-08-24 05:07:20 [INFO] [TRAIN] epoch: 89, iter: 111900/160000, loss: 0.4356, lr: 0.000364, batch_cost: 0.2011, reader_cost: 0.00070, ips: 39.7724 samples/sec | ETA 02:41:15 2022-08-24 05:07:30 [INFO] [TRAIN] epoch: 89, iter: 111950/160000, loss: 0.4195, lr: 0.000364, batch_cost: 0.1959, reader_cost: 0.00050, ips: 40.8304 samples/sec | ETA 02:36:54 2022-08-24 05:07:40 [INFO] [TRAIN] epoch: 89, iter: 112000/160000, loss: 0.4376, lr: 0.000363, batch_cost: 0.2061, reader_cost: 0.00053, ips: 38.8153 samples/sec | ETA 02:44:52 2022-08-24 05:07:40 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 164s - batch_cost: 0.1639 - reader cost: 6.7930e-04 2022-08-24 05:10:24 [INFO] [EVAL] #Images: 2000 mIoU: 0.3776 Acc: 0.7753 Kappa: 0.7581 Dice: 0.5164 2022-08-24 05:10:24 [INFO] [EVAL] Class IoU: [0.6928 0.7921 0.9298 0.7391 0.6849 0.7775 0.778 0.8091 0.5354 0.6254 0.4945 0.5687 0.717 0.3117 0.315 0.4557 0.5338 0.4216 0.6269 0.4284 0.7655 0.4288 0.6338 0.5273 0.3208 0.4208 0.4419 0.4708 0.4288 0.256 0.295 0.5357 0.3015 0.36 0.3426 0.3666 0.4855 0.5628 0.3008 0.3899 0.1642 0.1365 0.3569 0.2761 0.2323 0.2439 0.4299 0.5482 0.642 0.4909 0.6137 0.3627 0.1666 0.2044 0.6802 0.3901 0.8808 0.4103 0.439 0.2812 0.1134 0.2791 0.3033 0.1946 0.4372 0.7092 0.252 0.3956 0.0967 0.341 0.5185 0.6062 0.3953 0.278 0.4747 0.3953 0.5895 0.2936 0.2276 0.3987 0.7084 0.4323 0.4044 0.0323 0.3035 0.5287 0.1155 0.131 0.3721 0.5199 0.4558 0.0765 0.1843 0.1093 0.0391 0.055 0.1863 0.2165 0.2279 0.3046 0.1962 0.1256 0.2214 0.7928 0.1785 0.4876 0.1478 0.5511 0.101 0.3681 0.2237 0.4205 0.1832 0.6051 0.6565 0.0649 0.4276 0.636 0.1335 0.4131 0.4702 0.0819 0.3073 0.1955 0.2896 0.2259 0.5572 0.4572 0.4867 0.3623 0.5563 0.0498 0.2939 0.432 0.2788 0.2121 0.1762 0.0359 0.2196 0.4034 0.2524 0.0143 0.2879 0.4805 0.2361 0.0088 0.4076 0.0392 0.1682 0.1997] 2022-08-24 05:10:24 [INFO] [EVAL] Class Precision: [0.7908 0.8663 0.958 0.822 0.7683 0.8691 0.8761 0.8727 0.6879 0.7498 0.6913 0.7202 0.7865 0.5337 0.5457 0.6575 0.7272 0.7017 0.7984 0.66 0.833 0.6761 0.7643 0.6293 0.5186 0.6166 0.5506 0.7559 0.7369 0.4004 0.4479 0.6853 0.5377 0.4907 0.4411 0.5198 0.6565 0.8194 0.4572 0.559 0.4148 0.3489 0.5659 0.516 0.3185 0.4368 0.7532 0.7984 0.7276 0.5817 0.7949 0.4516 0.3286 0.5506 0.7196 0.5687 0.9322 0.7246 0.7411 0.5246 0.1804 0.56 0.4507 0.6104 0.5026 0.7966 0.42 0.5175 0.1608 0.6399 0.7425 0.724 0.5596 0.3648 0.6559 0.5987 0.7681 0.5851 0.6955 0.6613 0.8631 0.7255 0.8091 0.106 0.367 0.7296 0.5607 0.4301 0.7947 0.6652 0.6711 0.0925 0.3611 0.3633 0.0877 0.1543 0.613 0.4307 0.3198 0.6031 0.7041 0.2065 0.5665 0.8894 0.7275 0.5227 0.4242 0.8026 0.2099 0.4866 0.4433 0.5103 0.5712 0.6237 0.6637 0.3373 0.7894 0.7345 0.2806 0.578 0.6757 0.3659 0.6601 0.5794 0.706 0.7137 
0.8435 0.5886 0.6615 0.4832 0.7057 0.3044 0.4927 0.8158 0.6003 0.4074 0.3059 0.071 0.5012 0.6621 0.6193 0.0328 0.6443 0.6737 0.6905 0.0117 0.8609 0.243 0.5383 0.7939] 2022-08-24 05:10:24 [INFO] [EVAL] Class Recall: [0.8482 0.9024 0.9693 0.88 0.8633 0.8805 0.8742 0.9175 0.7072 0.7904 0.6346 0.73 0.8902 0.4284 0.427 0.5976 0.6675 0.5136 0.7448 0.5497 0.9043 0.5396 0.7877 0.7649 0.4567 0.57 0.6912 0.5552 0.5063 0.415 0.4634 0.7105 0.4069 0.5749 0.6055 0.5544 0.6508 0.6425 0.4678 0.563 0.2137 0.1831 0.4915 0.3725 0.4618 0.3559 0.5003 0.6362 0.845 0.7587 0.7292 0.6482 0.2526 0.2453 0.9254 0.5541 0.9411 0.486 0.5185 0.3773 0.234 0.3575 0.4811 0.2222 0.7707 0.8661 0.3865 0.6267 0.1953 0.422 0.6322 0.7884 0.5737 0.5389 0.6322 0.5378 0.7171 0.3708 0.2528 0.5011 0.7981 0.5168 0.4471 0.0445 0.6368 0.6575 0.127 0.1584 0.4116 0.7042 0.5868 0.3059 0.2735 0.1352 0.0659 0.0787 0.2111 0.3033 0.4423 0.381 0.2139 0.2426 0.2666 0.8796 0.1913 0.8789 0.1849 0.6374 0.1629 0.6018 0.3112 0.705 0.2124 0.9531 0.9837 0.0744 0.4827 0.8259 0.2031 0.5916 0.6072 0.0955 0.365 0.2279 0.3293 0.2484 0.6214 0.672 0.648 0.5916 0.7245 0.0562 0.4214 0.4787 0.3424 0.3067 0.2937 0.0677 0.281 0.508 0.2987 0.0248 0.3424 0.6263 0.264 0.0341 0.4363 0.0447 0.1965 0.2106] 2022-08-24 05:10:24 [INFO] [EVAL] The model with the best validation mIoU (0.3776) was saved at iter 112000. 2022-08-24 05:10:33 [INFO] [TRAIN] epoch: 89, iter: 112050/160000, loss: 0.4286, lr: 0.000363, batch_cost: 0.1690, reader_cost: 0.00429, ips: 47.3434 samples/sec | ETA 02:15:02 2022-08-24 05:10:41 [INFO] [TRAIN] epoch: 89, iter: 112100/160000, loss: 0.4586, lr: 0.000363, batch_cost: 0.1627, reader_cost: 0.00077, ips: 49.1567 samples/sec | ETA 02:09:55 2022-08-24 05:10:51 [INFO] [TRAIN] epoch: 89, iter: 112150/160000, loss: 0.4323, lr: 0.000362, batch_cost: 0.1948, reader_cost: 0.00094, ips: 41.0618 samples/sec | ETA 02:35:22 2022-08-24 05:11:00 [INFO] [TRAIN] epoch: 89, iter: 112200/160000, loss: 0.4072, lr: 0.000362, batch_cost: 0.1965, reader_cost: 0.00040, ips: 40.7025 samples/sec | ETA 02:36:35 2022-08-24 05:11:09 [INFO] [TRAIN] epoch: 89, iter: 112250/160000, loss: 0.4174, lr: 0.000362, batch_cost: 0.1816, reader_cost: 0.00050, ips: 44.0578 samples/sec | ETA 02:24:30 2022-08-24 05:11:19 [INFO] [TRAIN] epoch: 89, iter: 112300/160000, loss: 0.4396, lr: 0.000361, batch_cost: 0.1960, reader_cost: 0.00053, ips: 40.8112 samples/sec | ETA 02:35:50 2022-08-24 05:11:28 [INFO] [TRAIN] epoch: 89, iter: 112350/160000, loss: 0.4246, lr: 0.000361, batch_cost: 0.1689, reader_cost: 0.00051, ips: 47.3579 samples/sec | ETA 02:14:09 2022-08-24 05:11:37 [INFO] [TRAIN] epoch: 89, iter: 112400/160000, loss: 0.4331, lr: 0.000360, batch_cost: 0.1786, reader_cost: 0.00078, ips: 44.7887 samples/sec | ETA 02:21:42 2022-08-24 05:11:52 [INFO] [TRAIN] epoch: 90, iter: 112450/160000, loss: 0.4556, lr: 0.000360, batch_cost: 0.3099, reader_cost: 0.14799, ips: 25.8137 samples/sec | ETA 04:05:36 2022-08-24 05:12:02 [INFO] [TRAIN] epoch: 90, iter: 112500/160000, loss: 0.4431, lr: 0.000360, batch_cost: 0.1887, reader_cost: 0.00042, ips: 42.3920 samples/sec | ETA 02:29:23 2022-08-24 05:12:12 [INFO] [TRAIN] epoch: 90, iter: 112550/160000, loss: 0.4505, lr: 0.000359, batch_cost: 0.2107, reader_cost: 0.00042, ips: 37.9638 samples/sec | ETA 02:46:39 2022-08-24 05:12:21 [INFO] [TRAIN] epoch: 90, iter: 112600/160000, loss: 0.4477, lr: 0.000359, batch_cost: 0.1689, reader_cost: 0.00091, ips: 47.3758 samples/sec | ETA 02:13:24 2022-08-24 05:12:29 [INFO] [TRAIN] epoch: 90, iter: 112650/160000, loss: 
0.4416, lr: 0.000358, batch_cost: 0.1763, reader_cost: 0.00071, ips: 45.3821 samples/sec | ETA 02:19:06 2022-08-24 05:12:38 [INFO] [TRAIN] epoch: 90, iter: 112700/160000, loss: 0.4196, lr: 0.000358, batch_cost: 0.1818, reader_cost: 0.00060, ips: 44.0055 samples/sec | ETA 02:23:18 2022-08-24 05:12:46 [INFO] [TRAIN] epoch: 90, iter: 112750/160000, loss: 0.4345, lr: 0.000358, batch_cost: 0.1514, reader_cost: 0.00055, ips: 52.8361 samples/sec | ETA 01:59:14 2022-08-24 05:12:55 [INFO] [TRAIN] epoch: 90, iter: 112800/160000, loss: 0.4345, lr: 0.000357, batch_cost: 0.1694, reader_cost: 0.00043, ips: 47.2363 samples/sec | ETA 02:13:13 2022-08-24 05:13:03 [INFO] [TRAIN] epoch: 90, iter: 112850/160000, loss: 0.4415, lr: 0.000357, batch_cost: 0.1748, reader_cost: 0.00051, ips: 45.7730 samples/sec | ETA 02:17:20 2022-08-24 05:13:11 [INFO] [TRAIN] epoch: 90, iter: 112900/160000, loss: 0.4629, lr: 0.000357, batch_cost: 0.1545, reader_cost: 0.00307, ips: 51.7940 samples/sec | ETA 02:01:14 2022-08-24 05:13:19 [INFO] [TRAIN] epoch: 90, iter: 112950/160000, loss: 0.4090, lr: 0.000356, batch_cost: 0.1592, reader_cost: 0.00114, ips: 50.2463 samples/sec | ETA 02:04:51 2022-08-24 05:13:28 [INFO] [TRAIN] epoch: 90, iter: 113000/160000, loss: 0.4493, lr: 0.000356, batch_cost: 0.1879, reader_cost: 0.00034, ips: 42.5838 samples/sec | ETA 02:27:09 2022-08-24 05:13:28 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 173s - batch_cost: 0.1725 - reader cost: 7.1271e-04 2022-08-24 05:16:21 [INFO] [EVAL] #Images: 2000 mIoU: 0.3749 Acc: 0.7748 Kappa: 0.7577 Dice: 0.5131 2022-08-24 05:16:21 [INFO] [EVAL] Class IoU: [0.6945 0.7928 0.9304 0.7394 0.6863 0.7691 0.7744 0.8118 0.5338 0.6266 0.5008 0.5565 0.7123 0.3092 0.3297 0.4416 0.5374 0.4353 0.6225 0.4393 0.7752 0.4189 0.6212 0.5333 0.3305 0.3603 0.446 0.4688 0.4121 0.2314 0.2919 0.538 0.3202 0.3499 0.3404 0.3892 0.4874 0.5776 0.2684 0.3881 0.1808 0.1282 0.3545 0.2739 0.26 0.2315 0.3538 0.5402 0.6066 0.5265 0.6017 0.3593 0.1435 0.2131 0.6879 0.3442 0.8786 0.4145 0.3936 0.2843 0.0893 0.2313 0.2989 0.182 0.4714 0.7028 0.2844 0.3836 0.1294 0.37 0.5185 0.5998 0.3949 0.285 0.4851 0.4071 0.5892 0.2859 0.3674 0.3722 0.7443 0.4314 0.3712 0.0349 0.1621 0.5342 0.1167 0.1387 0.3317 0.5571 0.4299 0.0633 0.2209 0.1195 0.0239 0.0596 0.2039 0.2324 0.2253 0.3166 0.2931 0.109 0.248 0.7712 0.1705 0.426 0.1221 0.5478 0.1026 0.401 0.2021 0.271 0.1973 0.696 0.7906 0.0598 0.3569 0.6443 0.1629 0.3957 0.4696 0.0798 0.2551 0.1703 0.276 0.274 0.5179 0.4638 0.5153 0.3885 0.5115 0.0607 0.2911 0.4419 0.2979 0.209 0.1666 0.0262 0.2298 0.39 0.2714 0.0534 0.2788 0.2238 0.2677 0.0142 0.4146 0.0304 0.1466 0.1985] 2022-08-24 05:16:21 [INFO] [EVAL] Class Precision: [0.7991 0.8495 0.9657 0.826 0.7681 0.8591 0.8779 0.8805 0.6852 0.7285 0.7285 0.6935 0.7805 0.5575 0.5273 0.6127 0.6918 0.7097 0.7622 0.6286 0.8468 0.6625 0.7534 0.6806 0.4881 0.6507 0.5496 0.7806 0.7073 0.4043 0.4795 0.7032 0.5317 0.4555 0.454 0.539 0.68 0.8304 0.4608 0.5791 0.3724 0.328 0.5446 0.4927 0.3701 0.4502 0.812 0.7431 0.7146 0.6283 0.7695 0.4846 0.3201 0.744 0.732 0.6515 0.915 0.6941 0.6945 0.5216 0.1223 0.4169 0.4585 0.6244 0.5529 0.7674 0.5078 0.5103 0.2946 0.6392 0.6977 0.732 0.5346 0.3673 0.7035 0.6158 0.7428 0.6724 0.771 0.6238 0.9007 0.7216 0.8319 0.1146 0.2602 0.7089 0.5448 0.4157 0.8582 0.7396 0.5786 0.0832 0.4173 0.3359 0.0596 0.2039 0.5263 0.4364 0.308 0.8136 0.6594 0.1823 0.4922 0.8728 0.8066 0.4451 0.2474 0.7531 0.1981 0.4682 0.4365 0.3065 0.5363 0.8284 0.7982 0.2792 0.7641 0.767 0.2567 0.6005 
0.6694 0.3243 0.4975 0.7114 0.7738 0.6346 0.7463 0.6106 0.7397 0.6377 0.626 0.3813 0.5718 0.7994 0.7631 0.418 0.366 0.069 0.5145 0.6829 0.6035 0.075 0.6418 0.5598 0.5981 0.0194 0.8558 0.2306 0.4794 0.7434] 2022-08-24 05:16:21 [INFO] [EVAL] Class Recall: [0.8415 0.9224 0.9622 0.8759 0.8657 0.8801 0.8679 0.9122 0.7073 0.8175 0.6158 0.7379 0.8907 0.4098 0.468 0.6127 0.7065 0.5297 0.7726 0.5933 0.9016 0.5326 0.7797 0.7112 0.5059 0.4467 0.7029 0.5399 0.4968 0.3511 0.4273 0.6961 0.4459 0.6016 0.5763 0.5835 0.6325 0.6549 0.3913 0.5406 0.26 0.1738 0.5039 0.3816 0.4665 0.3228 0.3854 0.6643 0.8005 0.7646 0.734 0.5815 0.2065 0.2299 0.9194 0.4219 0.9568 0.5071 0.4761 0.3846 0.2487 0.342 0.4619 0.2044 0.7617 0.8931 0.3925 0.607 0.1874 0.4677 0.6688 0.7685 0.6019 0.5598 0.6097 0.5457 0.7403 0.3321 0.4124 0.4798 0.8108 0.5176 0.4013 0.0478 0.3008 0.6844 0.1293 0.1724 0.3509 0.6931 0.6258 0.2091 0.3194 0.1565 0.0385 0.0777 0.2497 0.332 0.4564 0.3414 0.3454 0.2135 0.3333 0.8689 0.1778 0.9084 0.1944 0.6677 0.1754 0.7367 0.2734 0.7003 0.2379 0.8133 0.9882 0.0708 0.401 0.801 0.3083 0.5372 0.6114 0.0957 0.3437 0.183 0.3002 0.3253 0.6285 0.6587 0.6294 0.4986 0.7366 0.0673 0.3722 0.497 0.3282 0.2949 0.2341 0.0404 0.2935 0.4763 0.3304 0.1565 0.3302 0.2716 0.3265 0.0506 0.4457 0.0338 0.1743 0.2131] 2022-08-24 05:16:21 [INFO] [EVAL] The model with the best validation mIoU (0.3776) was saved at iter 112000. 2022-08-24 05:16:30 [INFO] [TRAIN] epoch: 90, iter: 113050/160000, loss: 0.4223, lr: 0.000355, batch_cost: 0.1816, reader_cost: 0.00486, ips: 44.0509 samples/sec | ETA 02:22:06 2022-08-24 05:16:39 [INFO] [TRAIN] epoch: 90, iter: 113100/160000, loss: 0.4149, lr: 0.000355, batch_cost: 0.1632, reader_cost: 0.00120, ips: 49.0224 samples/sec | ETA 02:07:33 2022-08-24 05:16:47 [INFO] [TRAIN] epoch: 90, iter: 113150/160000, loss: 0.4335, lr: 0.000355, batch_cost: 0.1775, reader_cost: 0.00086, ips: 45.0635 samples/sec | ETA 02:18:37 2022-08-24 05:16:56 [INFO] [TRAIN] epoch: 90, iter: 113200/160000, loss: 0.4311, lr: 0.000354, batch_cost: 0.1748, reader_cost: 0.00064, ips: 45.7633 samples/sec | ETA 02:16:21 2022-08-24 05:17:05 [INFO] [TRAIN] epoch: 90, iter: 113250/160000, loss: 0.4361, lr: 0.000354, batch_cost: 0.1698, reader_cost: 0.00082, ips: 47.1016 samples/sec | ETA 02:12:20 2022-08-24 05:17:14 [INFO] [TRAIN] epoch: 90, iter: 113300/160000, loss: 0.4337, lr: 0.000354, batch_cost: 0.1914, reader_cost: 0.00061, ips: 41.7884 samples/sec | ETA 02:29:00 2022-08-24 05:17:24 [INFO] [TRAIN] epoch: 90, iter: 113350/160000, loss: 0.4561, lr: 0.000353, batch_cost: 0.1881, reader_cost: 0.00094, ips: 42.5210 samples/sec | ETA 02:26:16 2022-08-24 05:17:32 [INFO] [TRAIN] epoch: 90, iter: 113400/160000, loss: 0.4360, lr: 0.000353, batch_cost: 0.1584, reader_cost: 0.00124, ips: 50.5101 samples/sec | ETA 02:03:00 2022-08-24 05:17:39 [INFO] [TRAIN] epoch: 90, iter: 113450/160000, loss: 0.4338, lr: 0.000352, batch_cost: 0.1538, reader_cost: 0.00053, ips: 52.0276 samples/sec | ETA 01:59:17 2022-08-24 05:17:47 [INFO] [TRAIN] epoch: 90, iter: 113500/160000, loss: 0.4140, lr: 0.000352, batch_cost: 0.1597, reader_cost: 0.00082, ips: 50.1081 samples/sec | ETA 02:03:43 2022-08-24 05:17:57 [INFO] [TRAIN] epoch: 90, iter: 113550/160000, loss: 0.4154, lr: 0.000352, batch_cost: 0.1911, reader_cost: 0.00082, ips: 41.8627 samples/sec | ETA 02:27:56 2022-08-24 05:18:05 [INFO] [TRAIN] epoch: 90, iter: 113600/160000, loss: 0.4564, lr: 0.000351, batch_cost: 0.1714, reader_cost: 0.00047, ips: 46.6619 samples/sec | ETA 02:12:35 2022-08-24 05:18:16 [INFO] 
[TRAIN] epoch: 90, iter: 113650/160000, loss: 0.4191, lr: 0.000351, batch_cost: 0.2052, reader_cost: 0.00066, ips: 38.9939 samples/sec | ETA 02:38:29 2022-08-24 05:18:29 [INFO] [TRAIN] epoch: 91, iter: 113700/160000, loss: 0.4634, lr: 0.000351, batch_cost: 0.2631, reader_cost: 0.09399, ips: 30.4054 samples/sec | ETA 03:23:02 2022-08-24 05:18:38 [INFO] [TRAIN] epoch: 91, iter: 113750/160000, loss: 0.4642, lr: 0.000350, batch_cost: 0.1792, reader_cost: 0.00185, ips: 44.6329 samples/sec | ETA 02:18:09 2022-08-24 05:18:47 [INFO] [TRAIN] epoch: 91, iter: 113800/160000, loss: 0.4620, lr: 0.000350, batch_cost: 0.1818, reader_cost: 0.00044, ips: 44.0140 samples/sec | ETA 02:19:57 2022-08-24 05:18:56 [INFO] [TRAIN] epoch: 91, iter: 113850/160000, loss: 0.4406, lr: 0.000349, batch_cost: 0.1877, reader_cost: 0.00269, ips: 42.6324 samples/sec | ETA 02:24:20 2022-08-24 05:19:06 [INFO] [TRAIN] epoch: 91, iter: 113900/160000, loss: 0.4350, lr: 0.000349, batch_cost: 0.1928, reader_cost: 0.00058, ips: 41.4907 samples/sec | ETA 02:28:08 2022-08-24 05:19:14 [INFO] [TRAIN] epoch: 91, iter: 113950/160000, loss: 0.4248, lr: 0.000349, batch_cost: 0.1594, reader_cost: 0.00065, ips: 50.1933 samples/sec | ETA 02:02:19 2022-08-24 05:19:23 [INFO] [TRAIN] epoch: 91, iter: 114000/160000, loss: 0.4481, lr: 0.000348, batch_cost: 0.1818, reader_cost: 0.00039, ips: 43.9968 samples/sec | ETA 02:19:24 2022-08-24 05:19:23 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 180s - batch_cost: 0.1801 - reader cost: 0.0011 2022-08-24 05:22:23 [INFO] [EVAL] #Images: 2000 mIoU: 0.3756 Acc: 0.7749 Kappa: 0.7578 Dice: 0.5132 2022-08-24 05:22:23 [INFO] [EVAL] Class IoU: [0.6952 0.7889 0.9326 0.7387 0.6959 0.7706 0.7722 0.8101 0.5376 0.6349 0.5063 0.568 0.7213 0.3232 0.3065 0.4452 0.519 0.4419 0.6209 0.4437 0.7682 0.442 0.6438 0.524 0.3457 0.3507 0.4376 0.4644 0.4326 0.252 0.2936 0.5506 0.3223 0.3465 0.3534 0.3862 0.4849 0.5683 0.2621 0.37 0.1668 0.1374 0.3622 0.2735 0.2695 0.1852 0.3562 0.5348 0.618 0.5139 0.5564 0.3744 0.1543 0.1535 0.6696 0.3317 0.8842 0.4434 0.4505 0.2709 0.1096 0.226 0.3065 0.1678 0.5113 0.7086 0.2878 0.3965 0.1398 0.3688 0.4998 0.5745 0.4061 0.2209 0.4792 0.4071 0.556 0.2348 0.2832 0.4393 0.6928 0.4265 0.3848 0.0483 0.2784 0.5239 0.1312 0.1355 0.3703 0.5399 0.4711 0.0559 0.2227 0.0918 0.0424 0.0324 0.1467 0.1942 0.2216 0.3223 0.2988 0.1039 0.2666 0.66 0.1718 0.744 0.1597 0.5431 0.0896 0.3279 0.2012 0.4684 0.1918 0.6987 0.6773 0.0577 0.4154 0.6396 0.135 0.4073 0.3917 0.0749 0.2588 0.1798 0.3162 0.257 0.5253 0.4685 0.5443 0.4081 0.5984 0.0566 0.272 0.4044 0.3389 0.2084 0.1653 0.0317 0.2231 0.4051 0.2684 0.0146 0.2195 0.2364 0.1939 0.0091 0.3883 0.0256 0.167 0.2043] 2022-08-24 05:22:23 [INFO] [EVAL] Class Precision: [0.7974 0.8663 0.9616 0.8281 0.7794 0.8685 0.8795 0.8702 0.6817 0.7523 0.6966 0.6898 0.7875 0.5489 0.5732 0.5942 0.6941 0.7042 0.7712 0.6041 0.8335 0.6403 0.7925 0.6588 0.493 0.6011 0.5464 0.7852 0.6963 0.3939 0.4443 0.7239 0.5113 0.452 0.4566 0.511 0.6671 0.8246 0.3751 0.6176 0.388 0.3041 0.5934 0.5218 0.3752 0.4216 0.7513 0.7458 0.7322 0.6394 0.7231 0.4673 0.2751 0.6012 0.7123 0.719 0.9226 0.6315 0.7456 0.4356 0.1721 0.4648 0.4067 0.613 0.637 0.8049 0.5183 0.5364 0.3473 0.6568 0.7162 0.7013 0.5566 0.3732 0.6938 0.6051 0.7329 0.4268 0.6821 0.6611 0.8101 0.6556 0.8385 0.1488 0.3656 0.6843 0.471 0.4401 0.846 0.692 0.656 0.0649 0.4788 0.3328 0.1222 0.1188 0.6417 0.4438 0.3 0.6323 0.6105 0.1558 0.5311 0.8241 0.7922 0.8168 0.3227 0.7554 0.1851 0.4446 0.4131 0.5708 0.5825 0.762 
0.6823 0.2976 0.737 0.7502 0.2204 0.6005 0.7426 0.2498 0.6355 0.6805 0.7529 0.6877 0.7866 0.6099 0.8464 0.6622 0.6841 0.3506 0.5811 0.8207 0.7821 0.3711 0.3638 0.0765 0.4043 0.6696 0.5914 0.024 0.6274 0.7105 0.6735 0.0128 0.8681 0.1795 0.5142 0.7171] 2022-08-24 05:22:23 [INFO] [EVAL] Class Recall: [0.8443 0.8982 0.9687 0.8726 0.8666 0.8723 0.8636 0.9214 0.7178 0.8028 0.6495 0.7628 0.8955 0.4402 0.3972 0.6398 0.6729 0.5426 0.761 0.6257 0.9074 0.588 0.7743 0.7191 0.5363 0.457 0.6872 0.532 0.5332 0.4115 0.4639 0.697 0.4657 0.5974 0.61 0.6125 0.6398 0.6464 0.4653 0.48 0.2263 0.2003 0.4817 0.365 0.4889 0.2483 0.4038 0.654 0.7984 0.7236 0.7071 0.6532 0.2599 0.1709 0.9177 0.3811 0.955 0.5982 0.5323 0.4174 0.2319 0.3055 0.5543 0.1877 0.7215 0.8554 0.3929 0.6033 0.1896 0.4569 0.6233 0.7607 0.6003 0.3512 0.6077 0.5544 0.6973 0.3429 0.3263 0.567 0.8272 0.5496 0.4156 0.0667 0.5384 0.6909 0.1538 0.1638 0.3971 0.7106 0.6256 0.2879 0.2939 0.1126 0.061 0.0426 0.1597 0.2566 0.4589 0.3967 0.3692 0.2377 0.3486 0.7681 0.1799 0.893 0.2403 0.6591 0.1479 0.5552 0.2817 0.7232 0.2224 0.8938 0.9892 0.0668 0.4878 0.8128 0.2585 0.5587 0.4532 0.0966 0.3039 0.1963 0.3528 0.291 0.6126 0.6688 0.6039 0.5154 0.8268 0.0632 0.3384 0.4436 0.3742 0.3222 0.2325 0.0512 0.3323 0.5063 0.3295 0.0358 0.2524 0.2617 0.214 0.0305 0.4127 0.029 0.1983 0.2222] 2022-08-24 05:22:23 [INFO] [EVAL] The model with the best validation mIoU (0.3776) was saved at iter 112000. 2022-08-24 05:22:32 [INFO] [TRAIN] epoch: 91, iter: 114050/160000, loss: 0.4291, lr: 0.000348, batch_cost: 0.1645, reader_cost: 0.00365, ips: 48.6266 samples/sec | ETA 02:05:59 2022-08-24 05:22:41 [INFO] [TRAIN] epoch: 91, iter: 114100/160000, loss: 0.4107, lr: 0.000348, batch_cost: 0.1877, reader_cost: 0.00109, ips: 42.6277 samples/sec | ETA 02:23:34 2022-08-24 05:22:51 [INFO] [TRAIN] epoch: 91, iter: 114150/160000, loss: 0.4840, lr: 0.000347, batch_cost: 0.2060, reader_cost: 0.00073, ips: 38.8271 samples/sec | ETA 02:37:27 2022-08-24 05:23:01 [INFO] [TRAIN] epoch: 91, iter: 114200/160000, loss: 0.4352, lr: 0.000347, batch_cost: 0.2026, reader_cost: 0.00041, ips: 39.4815 samples/sec | ETA 02:34:40 2022-08-24 05:23:11 [INFO] [TRAIN] epoch: 91, iter: 114250/160000, loss: 0.4152, lr: 0.000346, batch_cost: 0.1897, reader_cost: 0.00074, ips: 42.1753 samples/sec | ETA 02:24:38 2022-08-24 05:23:20 [INFO] [TRAIN] epoch: 91, iter: 114300/160000, loss: 0.4001, lr: 0.000346, batch_cost: 0.1745, reader_cost: 0.00029, ips: 45.8413 samples/sec | ETA 02:12:55 2022-08-24 05:23:28 [INFO] [TRAIN] epoch: 91, iter: 114350/160000, loss: 0.4468, lr: 0.000346, batch_cost: 0.1743, reader_cost: 0.00098, ips: 45.8942 samples/sec | ETA 02:12:37 2022-08-24 05:23:39 [INFO] [TRAIN] epoch: 91, iter: 114400/160000, loss: 0.4481, lr: 0.000345, batch_cost: 0.2064, reader_cost: 0.00086, ips: 38.7560 samples/sec | ETA 02:36:52 2022-08-24 05:23:48 [INFO] [TRAIN] epoch: 91, iter: 114450/160000, loss: 0.4341, lr: 0.000345, batch_cost: 0.1799, reader_cost: 0.00090, ips: 44.4732 samples/sec | ETA 02:16:33 2022-08-24 05:23:56 [INFO] [TRAIN] epoch: 91, iter: 114500/160000, loss: 0.4255, lr: 0.000344, batch_cost: 0.1738, reader_cost: 0.00078, ips: 46.0334 samples/sec | ETA 02:11:47 2022-08-24 05:24:05 [INFO] [TRAIN] epoch: 91, iter: 114550/160000, loss: 0.4704, lr: 0.000344, batch_cost: 0.1690, reader_cost: 0.00049, ips: 47.3252 samples/sec | ETA 02:08:03 2022-08-24 05:24:13 [INFO] [TRAIN] epoch: 91, iter: 114600/160000, loss: 0.4228, lr: 0.000344, batch_cost: 0.1527, reader_cost: 0.00050, ips: 52.3987 samples/sec | 
ETA 01:55:31 2022-08-24 05:24:20 [INFO] [TRAIN] epoch: 91, iter: 114650/160000, loss: 0.3926, lr: 0.000343, batch_cost: 0.1557, reader_cost: 0.00065, ips: 51.3689 samples/sec | ETA 01:57:42 2022-08-24 05:24:28 [INFO] [TRAIN] epoch: 91, iter: 114700/160000, loss: 0.4463, lr: 0.000343, batch_cost: 0.1573, reader_cost: 0.00106, ips: 50.8660 samples/sec | ETA 01:58:44 2022-08-24 05:24:37 [INFO] [TRAIN] epoch: 91, iter: 114750/160000, loss: 0.4324, lr: 0.000343, batch_cost: 0.1675, reader_cost: 0.00050, ips: 47.7510 samples/sec | ETA 02:06:20 2022-08-24 05:24:45 [INFO] [TRAIN] epoch: 91, iter: 114800/160000, loss: 0.4220, lr: 0.000342, batch_cost: 0.1678, reader_cost: 0.00074, ips: 47.6830 samples/sec | ETA 02:06:23 2022-08-24 05:24:54 [INFO] [TRAIN] epoch: 91, iter: 114850/160000, loss: 0.4309, lr: 0.000342, batch_cost: 0.1730, reader_cost: 0.00081, ips: 46.2374 samples/sec | ETA 02:10:11 2022-08-24 05:25:03 [INFO] [TRAIN] epoch: 91, iter: 114900/160000, loss: 0.4010, lr: 0.000341, batch_cost: 0.1961, reader_cost: 0.00093, ips: 40.7972 samples/sec | ETA 02:27:23 2022-08-24 05:25:19 [INFO] [TRAIN] epoch: 92, iter: 114950/160000, loss: 0.4348, lr: 0.000341, batch_cost: 0.3032, reader_cost: 0.09057, ips: 26.3837 samples/sec | ETA 03:47:39 2022-08-24 05:25:28 [INFO] [TRAIN] epoch: 92, iter: 115000/160000, loss: 0.4658, lr: 0.000341, batch_cost: 0.1966, reader_cost: 0.00727, ips: 40.6962 samples/sec | ETA 02:27:26 2022-08-24 05:25:28 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 188s - batch_cost: 0.1884 - reader cost: 0.0011 2022-08-24 05:28:37 [INFO] [EVAL] #Images: 2000 mIoU: 0.3731 Acc: 0.7742 Kappa: 0.7567 Dice: 0.5112 2022-08-24 05:28:37 [INFO] [EVAL] Class IoU: [0.6895 0.7903 0.9311 0.7387 0.6927 0.7723 0.773 0.802 0.5425 0.6404 0.4952 0.5595 0.7161 0.3188 0.3046 0.4437 0.4963 0.4262 0.6175 0.4285 0.7761 0.4364 0.6302 0.5296 0.3395 0.4274 0.4455 0.4575 0.3801 0.2499 0.2996 0.5278 0.3256 0.344 0.3712 0.3994 0.4803 0.5676 0.2918 0.4046 0.1662 0.1183 0.3597 0.2695 0.2602 0.2085 0.3884 0.5443 0.5828 0.4873 0.585 0.387 0.1845 0.1923 0.6808 0.3465 0.8923 0.3768 0.4058 0.2716 0.0969 0.2364 0.3229 0.1479 0.5024 0.7062 0.2462 0.3891 0.126 0.3662 0.5325 0.5875 0.3837 0.2687 0.479 0.3972 0.4429 0.2464 0.2202 0.3818 0.6542 0.4271 0.4472 0.0307 0.1648 0.5267 0.1309 0.1224 0.434 0.5019 0.4063 0.0453 0.2021 0.1377 0.0618 0.0554 0.1699 0.1616 0.2371 0.2901 0.2572 0.1193 0.2976 0.6372 0.1909 0.6544 0.1875 0.5493 0.0756 0.4436 0.2106 0.4843 0.1881 0.5748 0.6736 0.069 0.4256 0.6524 0.1805 0.3415 0.4362 0.0844 0.3172 0.2043 0.2803 0.2667 0.5722 0.4784 0.5428 0.395 0.6219 0.0671 0.2802 0.3481 0.3204 0.2056 0.1734 0.0228 0.2215 0.3926 0.1951 0.0101 0.2809 0.0873 0.2827 0.0006 0.4207 0.0316 0.1553 0.1989] 2022-08-24 05:28:37 [INFO] [EVAL] Class Precision: [0.7801 0.8615 0.9649 0.8233 0.7692 0.8699 0.876 0.8516 0.664 0.772 0.7215 0.7338 0.7976 0.5297 0.5633 0.6236 0.694 0.7198 0.7803 0.6284 0.8471 0.6443 0.7885 0.6638 0.5094 0.6704 0.5347 0.7438 0.7498 0.3769 0.4443 0.6711 0.5422 0.4701 0.4613 0.5478 0.7121 0.8246 0.4479 0.5775 0.33 0.357 0.5847 0.5224 0.3472 0.4362 0.7434 0.7762 0.7219 0.6074 0.8245 0.4933 0.3223 0.4875 0.7187 0.6144 0.9361 0.6904 0.7546 0.5046 0.1617 0.4795 0.4453 0.6421 0.6621 0.7863 0.429 0.5312 0.2736 0.664 0.6624 0.7702 0.5711 0.3641 0.6661 0.579 0.7541 0.6079 0.7072 0.5728 0.7597 0.6751 0.7914 0.1237 0.257 0.7059 0.4674 0.4978 0.8389 0.649 0.5258 0.0556 0.3939 0.3232 0.1815 0.1758 0.5753 0.481 0.3244 0.7516 0.678 0.1981 0.5617 0.7292 0.8229 0.7151 0.3952 
0.735 0.2039 0.5033 0.4571 0.6398 0.6558 0.7942 0.681 0.2061 0.7075 0.7712 0.3261 0.5025 0.707 0.2956 0.723 0.6233 0.7784 0.638 0.8616 0.6291 0.801 0.6446 0.7287 0.4364 0.6171 0.8823 0.6542 0.3937 0.3444 0.0831 0.4314 0.6892 0.5255 0.0153 0.6192 0.5851 0.5664 0.0013 0.7896 0.2584 0.4563 0.662 ] 2022-08-24 05:28:37 [INFO] [EVAL] Class Recall: [0.8558 0.9053 0.9637 0.8778 0.8746 0.8731 0.868 0.9323 0.7478 0.7898 0.6123 0.702 0.8751 0.4446 0.3988 0.6059 0.6353 0.511 0.7475 0.5739 0.9025 0.5749 0.7584 0.7238 0.5044 0.5411 0.7276 0.5431 0.4353 0.4258 0.4791 0.7119 0.4491 0.562 0.6554 0.5958 0.5961 0.6455 0.4556 0.5747 0.2508 0.1504 0.4831 0.3576 0.5093 0.2854 0.4485 0.6457 0.7516 0.7114 0.6683 0.6423 0.3014 0.241 0.9281 0.4428 0.9502 0.4534 0.4675 0.3704 0.1949 0.3181 0.5403 0.1612 0.6756 0.8739 0.3662 0.5927 0.1893 0.4494 0.7308 0.7123 0.5391 0.5063 0.6304 0.5586 0.5176 0.293 0.2422 0.5338 0.825 0.5377 0.5069 0.0392 0.3147 0.6748 0.1539 0.1397 0.4734 0.6889 0.6412 0.1969 0.2933 0.1934 0.0857 0.0748 0.1942 0.1957 0.4682 0.3208 0.293 0.2308 0.3876 0.8347 0.1991 0.8852 0.2629 0.6849 0.1072 0.7891 0.2809 0.6658 0.2087 0.6754 0.9842 0.094 0.5166 0.8089 0.2878 0.5159 0.5325 0.1056 0.3611 0.233 0.3047 0.3142 0.6301 0.6663 0.6274 0.505 0.8093 0.0734 0.3392 0.3651 0.3858 0.301 0.2589 0.0304 0.3128 0.4771 0.2369 0.0293 0.3396 0.0931 0.3608 0.001 0.4738 0.0348 0.1905 0.2214] 2022-08-24 05:28:37 [INFO] [EVAL] The model with the best validation mIoU (0.3776) was saved at iter 112000. 2022-08-24 05:28:46 [INFO] [TRAIN] epoch: 92, iter: 115050/160000, loss: 0.4232, lr: 0.000340, batch_cost: 0.1676, reader_cost: 0.00459, ips: 47.7255 samples/sec | ETA 02:05:34 2022-08-24 05:28:54 [INFO] [TRAIN] epoch: 92, iter: 115100/160000, loss: 0.4322, lr: 0.000340, batch_cost: 0.1676, reader_cost: 0.00079, ips: 47.7394 samples/sec | ETA 02:05:24 2022-08-24 05:29:02 [INFO] [TRAIN] epoch: 92, iter: 115150/160000, loss: 0.4356, lr: 0.000340, batch_cost: 0.1577, reader_cost: 0.00074, ips: 50.7202 samples/sec | ETA 01:57:54 2022-08-24 05:29:10 [INFO] [TRAIN] epoch: 92, iter: 115200/160000, loss: 0.4262, lr: 0.000339, batch_cost: 0.1530, reader_cost: 0.00052, ips: 52.2933 samples/sec | ETA 01:54:13 2022-08-24 05:29:18 [INFO] [TRAIN] epoch: 92, iter: 115250/160000, loss: 0.4345, lr: 0.000339, batch_cost: 0.1623, reader_cost: 0.00131, ips: 49.2903 samples/sec | ETA 02:01:03 2022-08-24 05:29:26 [INFO] [TRAIN] epoch: 92, iter: 115300/160000, loss: 0.4230, lr: 0.000338, batch_cost: 0.1744, reader_cost: 0.00072, ips: 45.8758 samples/sec | ETA 02:09:54 2022-08-24 05:29:35 [INFO] [TRAIN] epoch: 92, iter: 115350/160000, loss: 0.4152, lr: 0.000338, batch_cost: 0.1807, reader_cost: 0.00051, ips: 44.2826 samples/sec | ETA 02:14:26 2022-08-24 05:29:43 [INFO] [TRAIN] epoch: 92, iter: 115400/160000, loss: 0.4589, lr: 0.000338, batch_cost: 0.1551, reader_cost: 0.00175, ips: 51.5840 samples/sec | ETA 01:55:16 2022-08-24 05:29:53 [INFO] [TRAIN] epoch: 92, iter: 115450/160000, loss: 0.4520, lr: 0.000337, batch_cost: 0.1922, reader_cost: 0.00051, ips: 41.6322 samples/sec | ETA 02:22:40 2022-08-24 05:30:01 [INFO] [TRAIN] epoch: 92, iter: 115500/160000, loss: 0.4170, lr: 0.000337, batch_cost: 0.1632, reader_cost: 0.00044, ips: 49.0048 samples/sec | ETA 02:01:04 2022-08-24 05:30:10 [INFO] [TRAIN] epoch: 92, iter: 115550/160000, loss: 0.4186, lr: 0.000337, batch_cost: 0.1748, reader_cost: 0.00114, ips: 45.7559 samples/sec | ETA 02:09:31 2022-08-24 05:30:18 [INFO] [TRAIN] epoch: 92, iter: 115600/160000, loss: 0.4201, lr: 0.000336, batch_cost: 0.1642, 
reader_cost: 0.00081, ips: 48.7316 samples/sec | ETA 02:01:28 2022-08-24 05:30:26 [INFO] [TRAIN] epoch: 92, iter: 115650/160000, loss: 0.4410, lr: 0.000336, batch_cost: 0.1624, reader_cost: 0.00108, ips: 49.2490 samples/sec | ETA 02:00:04 2022-08-24 05:30:34 [INFO] [TRAIN] epoch: 92, iter: 115700/160000, loss: 0.4178, lr: 0.000335, batch_cost: 0.1605, reader_cost: 0.00051, ips: 49.8348 samples/sec | ETA 01:58:31 2022-08-24 05:30:42 [INFO] [TRAIN] epoch: 92, iter: 115750/160000, loss: 0.4229, lr: 0.000335, batch_cost: 0.1520, reader_cost: 0.00053, ips: 52.6487 samples/sec | ETA 01:52:03 2022-08-24 05:30:50 [INFO] [TRAIN] epoch: 92, iter: 115800/160000, loss: 0.4319, lr: 0.000335, batch_cost: 0.1691, reader_cost: 0.00090, ips: 47.3184 samples/sec | ETA 02:04:32 2022-08-24 05:30:59 [INFO] [TRAIN] epoch: 92, iter: 115850/160000, loss: 0.4209, lr: 0.000334, batch_cost: 0.1763, reader_cost: 0.00058, ips: 45.3847 samples/sec | ETA 02:09:42 2022-08-24 05:31:08 [INFO] [TRAIN] epoch: 92, iter: 115900/160000, loss: 0.4382, lr: 0.000334, batch_cost: 0.1811, reader_cost: 0.00119, ips: 44.1648 samples/sec | ETA 02:13:08 2022-08-24 05:31:17 [INFO] [TRAIN] epoch: 92, iter: 115950/160000, loss: 0.4134, lr: 0.000334, batch_cost: 0.1757, reader_cost: 0.00052, ips: 45.5289 samples/sec | ETA 02:09:00 2022-08-24 05:31:27 [INFO] [TRAIN] epoch: 92, iter: 116000/160000, loss: 0.4227, lr: 0.000333, batch_cost: 0.2085, reader_cost: 0.00061, ips: 38.3705 samples/sec | ETA 02:32:53 2022-08-24 05:31:27 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 187s - batch_cost: 0.1871 - reader cost: 7.4314e-04 2022-08-24 05:34:35 [INFO] [EVAL] #Images: 2000 mIoU: 0.3750 Acc: 0.7755 Kappa: 0.7583 Dice: 0.5146 2022-08-24 05:34:35 [INFO] [EVAL] Class IoU: [0.6906 0.7932 0.9309 0.7419 0.6908 0.7704 0.7748 0.8109 0.5417 0.6449 0.49 0.5704 0.719 0.3112 0.341 0.4436 0.4897 0.4527 0.6193 0.4344 0.7778 0.4473 0.6444 0.5345 0.3439 0.3731 0.4497 0.4761 0.4914 0.2679 0.3017 0.5361 0.295 0.3539 0.311 0.412 0.4792 0.5513 0.2904 0.3579 0.1562 0.1445 0.3562 0.2638 0.2466 0.2373 0.3931 0.5231 0.5871 0.5323 0.5763 0.366 0.1583 0.2191 0.6803 0.3145 0.8857 0.4183 0.399 0.2967 0.0991 0.2557 0.3401 0.1959 0.4861 0.6948 0.2618 0.3888 0.1341 0.3728 0.5104 0.5647 0.3891 0.2677 0.4906 0.3928 0.5419 0.2482 0.3286 0.3416 0.6787 0.4246 0.4369 0.0479 0.2475 0.5116 0.1196 0.1271 0.3593 0.5465 0.4345 0.0732 0.1967 0.0921 0.0614 0.0473 0.1855 0.1847 0.2528 0.2806 0.2543 0.1319 0.2904 0.5587 0.1862 0.5435 0.1591 0.5552 0.0735 0.3025 0.2138 0.4715 0.1992 0.5536 0.7234 0.065 0.4166 0.6888 0.2015 0.3891 0.4486 0.0857 0.2867 0.1986 0.2841 0.2483 0.5479 0.4547 0.4796 0.3936 0.5962 0.1109 0.2978 0.3906 0.3263 0.2131 0.1716 0.0344 0.2142 0.4204 0.1354 0.0244 0.3007 0.2937 0.2265 0.0129 0.3965 0.0314 0.1294 0.1972] 2022-08-24 05:34:35 [INFO] [EVAL] Class Precision: [0.7866 0.8536 0.9627 0.83 0.7847 0.8763 0.891 0.8667 0.708 0.7637 0.7008 0.6984 0.8026 0.4966 0.5344 0.6047 0.6486 0.7 0.791 0.6402 0.8534 0.6453 0.7879 0.652 0.4939 0.6876 0.5556 0.755 0.7407 0.507 0.5015 0.6603 0.5784 0.498 0.4543 0.5717 0.6735 0.8313 0.431 0.6302 0.3308 0.2896 0.5985 0.5099 0.3229 0.4771 0.8342 0.7269 0.7223 0.6554 0.7765 0.4537 0.307 0.5843 0.7193 0.6648 0.9245 0.6702 0.752 0.5264 0.1651 0.5037 0.483 0.597 0.5776 0.7759 0.436 0.5392 0.2795 0.6648 0.7013 0.7013 0.5519 0.3553 0.7148 0.5612 0.7339 0.6036 0.683 0.5125 0.7975 0.6411 0.8032 0.1339 0.3307 0.7468 0.5502 0.43 0.8388 0.744 0.626 0.0903 0.4102 0.3323 0.185 0.1808 0.626 0.4159 0.3658 0.6216 0.6626 
0.1909 0.4754 0.8223 0.726 0.6007 0.3662 0.73 0.1956 0.4376 0.4878 0.6231 0.5673 0.8146 0.7327 0.1832 0.6213 0.8233 0.3121 0.5787 0.6824 0.3824 0.6862 0.6288 0.7453 0.6242 0.8046 0.5963 0.6633 0.6979 0.6691 0.4952 0.5704 0.8188 0.621 0.4256 0.3625 0.0844 0.5108 0.6861 0.4192 0.0433 0.6095 0.5405 0.5957 0.0147 0.8768 0.2133 0.4161 0.7421] 2022-08-24 05:34:35 [INFO] [EVAL] Class Recall: [0.8499 0.9181 0.9657 0.8747 0.8523 0.8644 0.8558 0.9265 0.6975 0.8056 0.6195 0.7568 0.8734 0.4546 0.4852 0.6248 0.6665 0.5616 0.7405 0.5748 0.8978 0.5932 0.7796 0.7479 0.5311 0.4493 0.7024 0.5631 0.5935 0.3623 0.4309 0.7402 0.3758 0.5501 0.4966 0.5959 0.6242 0.6207 0.471 0.4531 0.2284 0.2239 0.4681 0.3534 0.5109 0.3206 0.4264 0.6511 0.7583 0.7392 0.6909 0.6545 0.2463 0.2595 0.9262 0.3737 0.9547 0.5267 0.4594 0.4048 0.1987 0.3418 0.5349 0.2258 0.7544 0.8691 0.3959 0.5823 0.2048 0.4592 0.6521 0.7434 0.5687 0.5206 0.61 0.5669 0.6743 0.2965 0.3878 0.5061 0.8201 0.557 0.4893 0.0695 0.4958 0.619 0.1325 0.1529 0.386 0.6731 0.5868 0.2798 0.2742 0.113 0.0841 0.0603 0.2086 0.2494 0.4502 0.3383 0.2922 0.2989 0.4274 0.6355 0.2003 0.8509 0.2196 0.6987 0.1054 0.4949 0.2757 0.6596 0.2349 0.6335 0.9827 0.0915 0.5584 0.8083 0.3624 0.5428 0.567 0.0995 0.3299 0.2249 0.3147 0.2919 0.632 0.6568 0.6339 0.4744 0.8454 0.125 0.3838 0.4276 0.4074 0.2992 0.2457 0.0549 0.2695 0.5205 0.1667 0.0529 0.3724 0.3915 0.2676 0.0965 0.4199 0.0355 0.158 0.2117] 2022-08-24 05:34:35 [INFO] [EVAL] The model with the best validation mIoU (0.3776) was saved at iter 112000. 2022-08-24 05:34:44 [INFO] [TRAIN] epoch: 92, iter: 116050/160000, loss: 0.4256, lr: 0.000333, batch_cost: 0.1799, reader_cost: 0.00273, ips: 44.4695 samples/sec | ETA 02:11:46 2022-08-24 05:34:54 [INFO] [TRAIN] epoch: 92, iter: 116100/160000, loss: 0.4345, lr: 0.000332, batch_cost: 0.2025, reader_cost: 0.00169, ips: 39.5088 samples/sec | ETA 02:28:09 2022-08-24 05:35:02 [INFO] [TRAIN] epoch: 92, iter: 116150/160000, loss: 0.4191, lr: 0.000332, batch_cost: 0.1664, reader_cost: 0.00070, ips: 48.0669 samples/sec | ETA 02:01:38 2022-08-24 05:35:14 [INFO] [TRAIN] epoch: 93, iter: 116200/160000, loss: 0.4365, lr: 0.000332, batch_cost: 0.2354, reader_cost: 0.07326, ips: 33.9830 samples/sec | ETA 02:51:51 2022-08-24 05:35:23 [INFO] [TRAIN] epoch: 93, iter: 116250/160000, loss: 0.4220, lr: 0.000331, batch_cost: 0.1877, reader_cost: 0.00082, ips: 42.6179 samples/sec | ETA 02:16:52 2022-08-24 05:35:33 [INFO] [TRAIN] epoch: 93, iter: 116300/160000, loss: 0.4761, lr: 0.000331, batch_cost: 0.1820, reader_cost: 0.00056, ips: 43.9579 samples/sec | ETA 02:12:33 2022-08-24 05:35:41 [INFO] [TRAIN] epoch: 93, iter: 116350/160000, loss: 0.3985, lr: 0.000330, batch_cost: 0.1727, reader_cost: 0.00080, ips: 46.3310 samples/sec | ETA 02:05:37 2022-08-24 05:35:50 [INFO] [TRAIN] epoch: 93, iter: 116400/160000, loss: 0.4407, lr: 0.000330, batch_cost: 0.1862, reader_cost: 0.00061, ips: 42.9585 samples/sec | ETA 02:15:19 2022-08-24 05:35:58 [INFO] [TRAIN] epoch: 93, iter: 116450/160000, loss: 0.4578, lr: 0.000330, batch_cost: 0.1567, reader_cost: 0.00072, ips: 51.0523 samples/sec | ETA 01:53:44 2022-08-24 05:36:06 [INFO] [TRAIN] epoch: 93, iter: 116500/160000, loss: 0.4099, lr: 0.000329, batch_cost: 0.1596, reader_cost: 0.00031, ips: 50.1117 samples/sec | ETA 01:55:44 2022-08-24 05:36:15 [INFO] [TRAIN] epoch: 93, iter: 116550/160000, loss: 0.4271, lr: 0.000329, batch_cost: 0.1668, reader_cost: 0.00085, ips: 47.9588 samples/sec | ETA 02:00:47 2022-08-24 05:36:23 [INFO] [TRAIN] epoch: 93, iter: 116600/160000, 
loss: 0.4394, lr: 0.000329, batch_cost: 0.1712, reader_cost: 0.00050, ips: 46.7159 samples/sec | ETA 02:03:52 2022-08-24 05:36:32 [INFO] [TRAIN] epoch: 93, iter: 116650/160000, loss: 0.4404, lr: 0.000328, batch_cost: 0.1782, reader_cost: 0.00097, ips: 44.8862 samples/sec | ETA 02:08:46 2022-08-24 05:36:41 [INFO] [TRAIN] epoch: 93, iter: 116700/160000, loss: 0.4328, lr: 0.000328, batch_cost: 0.1686, reader_cost: 0.00056, ips: 47.4414 samples/sec | ETA 02:01:41 2022-08-24 05:36:49 [INFO] [TRAIN] epoch: 93, iter: 116750/160000, loss: 0.4145, lr: 0.000327, batch_cost: 0.1645, reader_cost: 0.00044, ips: 48.6369 samples/sec | ETA 01:58:33 2022-08-24 05:36:56 [INFO] [TRAIN] epoch: 93, iter: 116800/160000, loss: 0.4489, lr: 0.000327, batch_cost: 0.1514, reader_cost: 0.00032, ips: 52.8240 samples/sec | ETA 01:49:02 2022-08-24 05:37:05 [INFO] [TRAIN] epoch: 93, iter: 116850/160000, loss: 0.4439, lr: 0.000327, batch_cost: 0.1686, reader_cost: 0.00275, ips: 47.4450 samples/sec | ETA 02:01:15 2022-08-24 05:37:15 [INFO] [TRAIN] epoch: 93, iter: 116900/160000, loss: 0.4095, lr: 0.000326, batch_cost: 0.1987, reader_cost: 0.00060, ips: 40.2682 samples/sec | ETA 02:22:42 2022-08-24 05:37:24 [INFO] [TRAIN] epoch: 93, iter: 116950/160000, loss: 0.4194, lr: 0.000326, batch_cost: 0.1792, reader_cost: 0.00544, ips: 44.6444 samples/sec | ETA 02:08:34 2022-08-24 05:37:34 [INFO] [TRAIN] epoch: 93, iter: 117000/160000, loss: 0.4225, lr: 0.000326, batch_cost: 0.2093, reader_cost: 0.00054, ips: 38.2209 samples/sec | ETA 02:30:00 2022-08-24 05:37:34 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 178s - batch_cost: 0.1780 - reader cost: 9.5649e-04 2022-08-24 05:40:32 [INFO] [EVAL] #Images: 2000 mIoU: 0.3750 Acc: 0.7751 Kappa: 0.7580 Dice: 0.5135 2022-08-24 05:40:32 [INFO] [EVAL] Class IoU: [0.6924 0.788 0.932 0.7426 0.6937 0.7699 0.7835 0.8124 0.5375 0.6339 0.4975 0.5787 0.7169 0.3275 0.3144 0.4418 0.5214 0.4293 0.6212 0.4333 0.7712 0.4377 0.6328 0.5312 0.3501 0.4118 0.4656 0.4653 0.4583 0.2702 0.32 0.5258 0.3261 0.3491 0.3568 0.3944 0.4797 0.5625 0.292 0.3801 0.1463 0.1192 0.3575 0.2808 0.25 0.1765 0.413 0.5419 0.6265 0.5367 0.5908 0.3619 0.1656 0.1422 0.6929 0.3253 0.8893 0.4461 0.3607 0.2618 0.0865 0.4005 0.331 0.1969 0.4825 0.7105 0.2489 0.3776 0.1233 0.3759 0.5245 0.5944 0.3925 0.225 0.4925 0.3898 0.5742 0.2389 0.2654 0.3272 0.7156 0.4233 0.3948 0.0492 0.1926 0.5287 0.1385 0.1472 0.3706 0.5427 0.4519 0.075 0.1903 0.1056 0.0438 0.0591 0.2071 0.2116 0.2566 0.2953 0.2131 0.1345 0.2839 0.57 0.1536 0.5189 0.1475 0.5784 0.0811 0.3241 0.1987 0.3904 0.1861 0.6007 0.739 0.0691 0.3911 0.6498 0.1742 0.3907 0.4679 0.0842 0.3055 0.2087 0.2953 0.2816 0.5435 0.4568 0.4496 0.4196 0.6 0.0971 0.2803 0.4386 0.3246 0.206 0.1655 0.039 0.2362 0.4227 0.1787 0.0085 0.259 0.1787 0.2577 0.0126 0.3906 0.0321 0.143 0.1874] 2022-08-24 05:40:32 [INFO] [EVAL] Class Precision: [0.7871 0.8757 0.9676 0.8248 0.7812 0.8795 0.8934 0.8689 0.6818 0.7848 0.7171 0.7163 0.7829 0.4862 0.5444 0.6175 0.6951 0.7067 0.7717 0.6304 0.8406 0.6356 0.7529 0.6683 0.4986 0.4764 0.5994 0.7435 0.7865 0.4571 0.4584 0.6803 0.5353 0.4579 0.4675 0.4973 0.6469 0.8143 0.4497 0.5641 0.3518 0.3461 0.5932 0.4705 0.3761 0.5006 0.7526 0.7444 0.7277 0.6372 0.784 0.4638 0.323 0.6766 0.7369 0.6438 0.9348 0.6502 0.8137 0.4756 0.1327 0.5765 0.4461 0.6068 0.5899 0.8035 0.4078 0.5379 0.2288 0.7576 0.7381 0.7699 0.5441 0.4016 0.7193 0.6158 0.7745 0.5518 0.6611 0.4995 0.858 0.6388 0.838 0.1324 0.3093 0.6885 0.5358 0.4149 0.7315 0.6919 0.648 0.1258 0.4072 
0.3659 0.126 0.1683 0.5831 0.4189 0.3508 0.6858 0.6292 0.2374 0.5947 0.9791 0.916 0.5692 0.3158 0.7703 0.1754 0.4586 0.4551 0.4691 0.6389 0.8601 0.7463 0.2657 0.6648 0.7556 0.3013 0.5383 0.7236 0.3446 0.6195 0.5997 0.7601 0.6691 0.917 0.5743 0.5671 0.7065 0.68 0.4047 0.6054 0.8076 0.6669 0.4231 0.3344 0.0764 0.4964 0.6826 0.3349 0.013 0.6182 0.5417 0.5799 0.0179 0.87 0.3083 0.6323 0.7807] 2022-08-24 05:40:32 [INFO] [EVAL] Class Recall: [0.852 0.8873 0.962 0.8817 0.861 0.8607 0.8643 0.9259 0.7175 0.7673 0.619 0.7509 0.8947 0.5007 0.4267 0.6083 0.6759 0.5224 0.7611 0.5809 0.9033 0.5842 0.7987 0.7214 0.5402 0.7524 0.6759 0.5542 0.5234 0.3978 0.5144 0.6984 0.4548 0.5951 0.6012 0.6557 0.6499 0.6452 0.4542 0.5383 0.2003 0.1539 0.4736 0.4105 0.4272 0.2141 0.4778 0.6658 0.8184 0.7729 0.7056 0.6222 0.2536 0.1526 0.9207 0.3967 0.9481 0.587 0.3932 0.3679 0.1992 0.5674 0.5619 0.2257 0.7259 0.8599 0.3897 0.5589 0.2109 0.4273 0.6444 0.7227 0.5847 0.3386 0.6097 0.5151 0.6894 0.2964 0.3072 0.4868 0.8117 0.5565 0.4275 0.0726 0.338 0.6948 0.1574 0.1857 0.4289 0.7156 0.599 0.1564 0.2631 0.1292 0.0629 0.0835 0.2431 0.2994 0.4889 0.3415 0.2437 0.2368 0.352 0.577 0.1558 0.8545 0.2169 0.699 0.131 0.5251 0.2608 0.6994 0.208 0.6658 0.9869 0.0854 0.4871 0.8227 0.2923 0.5877 0.5697 0.1003 0.3761 0.2424 0.3256 0.3271 0.5716 0.6907 0.6845 0.5081 0.8362 0.1133 0.3429 0.4898 0.3875 0.2865 0.2467 0.0736 0.3106 0.5261 0.277 0.0241 0.3084 0.2105 0.3169 0.0404 0.4149 0.0346 0.156 0.1978] 2022-08-24 05:40:33 [INFO] [EVAL] The model with the best validation mIoU (0.3776) was saved at iter 112000. 2022-08-24 05:40:41 [INFO] [TRAIN] epoch: 93, iter: 117050/160000, loss: 0.4574, lr: 0.000325, batch_cost: 0.1645, reader_cost: 0.00365, ips: 48.6243 samples/sec | ETA 01:57:46 2022-08-24 05:40:49 [INFO] [TRAIN] epoch: 93, iter: 117100/160000, loss: 0.4208, lr: 0.000325, batch_cost: 0.1603, reader_cost: 0.00229, ips: 49.9162 samples/sec | ETA 01:54:35 2022-08-24 05:40:57 [INFO] [TRAIN] epoch: 93, iter: 117150/160000, loss: 0.4309, lr: 0.000324, batch_cost: 0.1599, reader_cost: 0.00088, ips: 50.0328 samples/sec | ETA 01:54:11 2022-08-24 05:41:05 [INFO] [TRAIN] epoch: 93, iter: 117200/160000, loss: 0.4466, lr: 0.000324, batch_cost: 0.1686, reader_cost: 0.00137, ips: 47.4587 samples/sec | ETA 02:00:14 2022-08-24 05:41:14 [INFO] [TRAIN] epoch: 93, iter: 117250/160000, loss: 0.4636, lr: 0.000324, batch_cost: 0.1720, reader_cost: 0.00691, ips: 46.5032 samples/sec | ETA 02:02:34 2022-08-24 05:41:23 [INFO] [TRAIN] epoch: 93, iter: 117300/160000, loss: 0.4236, lr: 0.000323, batch_cost: 0.1795, reader_cost: 0.00067, ips: 44.5712 samples/sec | ETA 02:07:44 2022-08-24 05:41:32 [INFO] [TRAIN] epoch: 93, iter: 117350/160000, loss: 0.4084, lr: 0.000323, batch_cost: 0.1739, reader_cost: 0.00084, ips: 45.9929 samples/sec | ETA 02:03:38 2022-08-24 05:41:40 [INFO] [TRAIN] epoch: 93, iter: 117400/160000, loss: 0.4547, lr: 0.000323, batch_cost: 0.1653, reader_cost: 0.00073, ips: 48.3883 samples/sec | ETA 01:57:23 2022-08-24 05:41:48 [INFO] [TRAIN] epoch: 93, iter: 117450/160000, loss: 0.4130, lr: 0.000322, batch_cost: 0.1636, reader_cost: 0.00093, ips: 48.8863 samples/sec | ETA 01:56:03 2022-08-24 05:42:01 [INFO] [TRAIN] epoch: 94, iter: 117500/160000, loss: 0.4214, lr: 0.000322, batch_cost: 0.2521, reader_cost: 0.09298, ips: 31.7357 samples/sec | ETA 02:58:33 2022-08-24 05:42:09 [INFO] [TRAIN] epoch: 94, iter: 117550/160000, loss: 0.4331, lr: 0.000321, batch_cost: 0.1639, reader_cost: 0.00041, ips: 48.8215 samples/sec | ETA 01:55:55 2022-08-24 05:42:17 
[INFO] [TRAIN] epoch: 94, iter: 117600/160000, loss: 0.4283, lr: 0.000321, batch_cost: 0.1587, reader_cost: 0.00051, ips: 50.3978 samples/sec | ETA 01:52:10 2022-08-24 05:42:24 [INFO] [TRAIN] epoch: 94, iter: 117650/160000, loss: 0.4005, lr: 0.000321, batch_cost: 0.1513, reader_cost: 0.00062, ips: 52.8655 samples/sec | ETA 01:46:48 2022-08-24 05:42:32 [INFO] [TRAIN] epoch: 94, iter: 117700/160000, loss: 0.4411, lr: 0.000320, batch_cost: 0.1594, reader_cost: 0.00077, ips: 50.1793 samples/sec | ETA 01:52:23 2022-08-24 05:42:41 [INFO] [TRAIN] epoch: 94, iter: 117750/160000, loss: 0.3894, lr: 0.000320, batch_cost: 0.1773, reader_cost: 0.00074, ips: 45.1251 samples/sec | ETA 02:04:50 2022-08-24 05:42:50 [INFO] [TRAIN] epoch: 94, iter: 117800/160000, loss: 0.4397, lr: 0.000320, batch_cost: 0.1717, reader_cost: 0.00032, ips: 46.5917 samples/sec | ETA 02:00:45 2022-08-24 05:42:58 [INFO] [TRAIN] epoch: 94, iter: 117850/160000, loss: 0.4122, lr: 0.000319, batch_cost: 0.1677, reader_cost: 0.00062, ips: 47.7109 samples/sec | ETA 01:57:47 2022-08-24 05:43:06 [INFO] [TRAIN] epoch: 94, iter: 117900/160000, loss: 0.3972, lr: 0.000319, batch_cost: 0.1625, reader_cost: 0.00055, ips: 49.2218 samples/sec | ETA 01:54:02 2022-08-24 05:43:16 [INFO] [TRAIN] epoch: 94, iter: 117950/160000, loss: 0.4011, lr: 0.000318, batch_cost: 0.2008, reader_cost: 0.00032, ips: 39.8341 samples/sec | ETA 02:20:45 2022-08-24 05:43:26 [INFO] [TRAIN] epoch: 94, iter: 118000/160000, loss: 0.3848, lr: 0.000318, batch_cost: 0.1904, reader_cost: 0.00053, ips: 42.0141 samples/sec | ETA 02:13:17 2022-08-24 05:43:26 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 184s - batch_cost: 0.1839 - reader cost: 8.1918e-04 2022-08-24 05:46:30 [INFO] [EVAL] #Images: 2000 mIoU: 0.3759 Acc: 0.7747 Kappa: 0.7577 Dice: 0.5135 2022-08-24 05:46:30 [INFO] [EVAL] Class IoU: [0.6907 0.7938 0.9328 0.7423 0.6958 0.7711 0.7777 0.8092 0.527 0.6191 0.5045 0.57 0.715 0.2964 0.3201 0.4429 0.5232 0.4505 0.617 0.4303 0.7752 0.4328 0.6282 0.5252 0.3327 0.4452 0.4482 0.4583 0.4235 0.2451 0.3137 0.5372 0.3283 0.3517 0.3571 0.415 0.483 0.5579 0.2746 0.3603 0.1684 0.1346 0.3556 0.2724 0.2695 0.2492 0.3706 0.5321 0.629 0.5412 0.5928 0.3875 0.1749 0.1809 0.6856 0.3302 0.8917 0.4349 0.3653 0.2779 0.0762 0.2466 0.3207 0.1853 0.4968 0.6931 0.2839 0.3631 0.1254 0.3705 0.5233 0.5572 0.3886 0.2494 0.4848 0.3914 0.5499 0.2469 0.3184 0.2761 0.7139 0.4166 0.4322 0.0348 0.1551 0.5328 0.1304 0.1358 0.3937 0.5407 0.4591 0.019 0.1991 0.1187 0.0528 0.0624 0.2079 0.1623 0.2702 0.3036 0.2286 0.1098 0.3065 0.7456 0.1812 0.5243 0.1665 0.5588 0.0866 0.2646 0.181 0.5207 0.2076 0.717 0.7529 0.0601 0.4215 0.6427 0.1889 0.3937 0.4809 0.0818 0.2653 0.1917 0.3059 0.2841 0.5593 0.4631 0.5606 0.3613 0.5125 0.0574 0.3195 0.4375 0.2898 0.2098 0.1837 0.0618 0.2101 0.397 0.106 0.0267 0.2893 0.1231 0.273 0.0115 0.4095 0.0268 0.1652 0.207 ] 2022-08-24 05:46:30 [INFO] [EVAL] Class Precision: [0.7934 0.8665 0.9649 0.8258 0.7809 0.8684 0.875 0.8655 0.6924 0.7382 0.7057 0.7097 0.7835 0.5492 0.5252 0.6066 0.6876 0.677 0.7703 0.6501 0.8481 0.6547 0.7699 0.6381 0.4999 0.659 0.5968 0.7045 0.7964 0.3792 0.4758 0.6737 0.5316 0.4875 0.4334 0.5494 0.6634 0.8122 0.3993 0.6298 0.3223 0.3066 0.6104 0.4927 0.4011 0.4866 0.797 0.7593 0.7296 0.6434 0.7504 0.5125 0.3043 0.5569 0.733 0.6934 0.962 0.6628 0.5305 0.4573 0.11 0.3908 0.3917 0.5775 0.6033 0.7673 0.4937 0.5804 0.2365 0.6726 0.69 0.6826 0.5754 0.3421 0.7127 0.558 0.7799 0.6009 0.8206 0.4981 0.8662 0.645 0.7913 0.1299 0.251 0.6956 
0.4288 0.4771 0.8022 0.7324 0.6578 0.0317 0.4396 0.3502 0.1425 0.162 0.512 0.4107 0.4056 0.7627 0.7354 0.1891 0.592 0.9142 0.8637 0.5766 0.3107 0.7401 0.2231 0.4074 0.5041 0.7073 0.6776 0.8343 0.7626 0.2512 0.7064 0.7507 0.505 0.5447 0.6645 0.4552 0.6025 0.6271 0.7436 0.654 0.8297 0.5968 0.8464 0.4978 0.5614 0.4683 0.5469 0.8344 0.6972 0.4488 0.3406 0.1234 0.4566 0.6658 0.506 0.039 0.5854 0.462 0.667 0.0132 0.8272 0.3196 0.4426 0.8223] 2022-08-24 05:46:30 [INFO] [EVAL] Class Recall: [0.8422 0.9044 0.9656 0.88 0.8647 0.8731 0.8749 0.9256 0.6881 0.7933 0.639 0.7434 0.8909 0.3918 0.4506 0.6213 0.6863 0.5739 0.7561 0.5599 0.9002 0.5609 0.7735 0.748 0.4987 0.5784 0.6428 0.5674 0.4749 0.4092 0.4793 0.7261 0.4619 0.5581 0.6696 0.6291 0.6399 0.6405 0.4678 0.457 0.2607 0.1934 0.46 0.3785 0.4508 0.3381 0.4093 0.6401 0.8201 0.7732 0.7384 0.6136 0.2915 0.2113 0.9138 0.3866 0.9243 0.5584 0.5398 0.4148 0.1987 0.4005 0.639 0.2144 0.7378 0.8776 0.4006 0.4924 0.2106 0.452 0.6842 0.752 0.5449 0.4795 0.6025 0.5672 0.651 0.2954 0.3422 0.3825 0.8023 0.5406 0.4878 0.0453 0.2889 0.6948 0.1578 0.1595 0.4361 0.6737 0.6032 0.0451 0.2668 0.1523 0.0774 0.0921 0.2593 0.2117 0.4474 0.3352 0.2491 0.2075 0.3886 0.8017 0.1865 0.8524 0.2641 0.6952 0.124 0.43 0.2202 0.6637 0.2303 0.836 0.9834 0.0732 0.511 0.8171 0.2318 0.5868 0.6351 0.0907 0.3216 0.2164 0.342 0.3343 0.6318 0.674 0.6241 0.5685 0.8546 0.0614 0.4346 0.4791 0.3315 0.2826 0.285 0.1102 0.2801 0.4958 0.1182 0.0781 0.3638 0.1437 0.3161 0.0824 0.4478 0.0285 0.2085 0.2167] 2022-08-24 05:46:30 [INFO] [EVAL] The model with the best validation mIoU (0.3776) was saved at iter 112000. 2022-08-24 05:46:39 [INFO] [TRAIN] epoch: 94, iter: 118050/160000, loss: 0.3897, lr: 0.000318, batch_cost: 0.1711, reader_cost: 0.00413, ips: 46.7672 samples/sec | ETA 01:59:35 2022-08-24 05:46:48 [INFO] [TRAIN] epoch: 94, iter: 118100/160000, loss: 0.4059, lr: 0.000317, batch_cost: 0.1730, reader_cost: 0.00197, ips: 46.2426 samples/sec | ETA 02:00:48 2022-08-24 05:46:56 [INFO] [TRAIN] epoch: 94, iter: 118150/160000, loss: 0.4345, lr: 0.000317, batch_cost: 0.1585, reader_cost: 0.00053, ips: 50.4865 samples/sec | ETA 01:50:31 2022-08-24 05:47:03 [INFO] [TRAIN] epoch: 94, iter: 118200/160000, loss: 0.4246, lr: 0.000316, batch_cost: 0.1584, reader_cost: 0.00071, ips: 50.4965 samples/sec | ETA 01:50:22 2022-08-24 05:47:12 [INFO] [TRAIN] epoch: 94, iter: 118250/160000, loss: 0.4066, lr: 0.000316, batch_cost: 0.1719, reader_cost: 0.00063, ips: 46.5504 samples/sec | ETA 01:59:35 2022-08-24 05:47:21 [INFO] [TRAIN] epoch: 94, iter: 118300/160000, loss: 0.4340, lr: 0.000316, batch_cost: 0.1720, reader_cost: 0.00101, ips: 46.5039 samples/sec | ETA 01:59:33 2022-08-24 05:47:29 [INFO] [TRAIN] epoch: 94, iter: 118350/160000, loss: 0.4334, lr: 0.000315, batch_cost: 0.1684, reader_cost: 0.00110, ips: 47.4939 samples/sec | ETA 01:56:55 2022-08-24 05:47:38 [INFO] [TRAIN] epoch: 94, iter: 118400/160000, loss: 0.4323, lr: 0.000315, batch_cost: 0.1821, reader_cost: 0.00069, ips: 43.9219 samples/sec | ETA 02:06:17 2022-08-24 05:47:48 [INFO] [TRAIN] epoch: 94, iter: 118450/160000, loss: 0.4217, lr: 0.000315, batch_cost: 0.1970, reader_cost: 0.00096, ips: 40.6050 samples/sec | ETA 02:16:26 2022-08-24 05:47:56 [INFO] [TRAIN] epoch: 94, iter: 118500/160000, loss: 0.4438, lr: 0.000314, batch_cost: 0.1669, reader_cost: 0.00072, ips: 47.9280 samples/sec | ETA 01:55:27 2022-08-24 05:48:05 [INFO] [TRAIN] epoch: 94, iter: 118550/160000, loss: 0.4257, lr: 0.000314, batch_cost: 0.1709, reader_cost: 0.00090, ips: 46.7982 
samples/sec | ETA 01:58:05 2022-08-24 05:48:14 [INFO] [TRAIN] epoch: 94, iter: 118600/160000, loss: 0.4704, lr: 0.000313, batch_cost: 0.1854, reader_cost: 0.00062, ips: 43.1576 samples/sec | ETA 02:07:54 2022-08-24 05:48:23 [INFO] [TRAIN] epoch: 94, iter: 118650/160000, loss: 0.4130, lr: 0.000313, batch_cost: 0.1717, reader_cost: 0.00051, ips: 46.5866 samples/sec | ETA 01:58:20 2022-08-24 05:48:32 [INFO] [TRAIN] epoch: 94, iter: 118700/160000, loss: 0.4259, lr: 0.000313, batch_cost: 0.1760, reader_cost: 0.00071, ips: 45.4455 samples/sec | ETA 02:01:10 2022-08-24 05:48:44 [INFO] [TRAIN] epoch: 95, iter: 118750/160000, loss: 0.4175, lr: 0.000312, batch_cost: 0.2494, reader_cost: 0.08090, ips: 32.0794 samples/sec | ETA 02:51:26 2022-08-24 05:48:53 [INFO] [TRAIN] epoch: 95, iter: 118800/160000, loss: 0.4299, lr: 0.000312, batch_cost: 0.1870, reader_cost: 0.00081, ips: 42.7726 samples/sec | ETA 02:08:25 2022-08-24 05:49:04 [INFO] [TRAIN] epoch: 95, iter: 118850/160000, loss: 0.4606, lr: 0.000312, batch_cost: 0.2094, reader_cost: 0.00034, ips: 38.2011 samples/sec | ETA 02:23:37 2022-08-24 05:49:15 [INFO] [TRAIN] epoch: 95, iter: 118900/160000, loss: 0.4312, lr: 0.000311, batch_cost: 0.2170, reader_cost: 0.00070, ips: 36.8747 samples/sec | ETA 02:28:36 2022-08-24 05:49:26 [INFO] [TRAIN] epoch: 95, iter: 118950/160000, loss: 0.4012, lr: 0.000311, batch_cost: 0.2160, reader_cost: 0.00069, ips: 37.0371 samples/sec | ETA 02:27:46 2022-08-24 05:49:36 [INFO] [TRAIN] epoch: 95, iter: 119000/160000, loss: 0.4094, lr: 0.000310, batch_cost: 0.2070, reader_cost: 0.00134, ips: 38.6520 samples/sec | ETA 02:21:25 2022-08-24 05:49:36 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1594 - reader cost: 8.2662e-04 2022-08-24 05:52:15 [INFO] [EVAL] #Images: 2000 mIoU: 0.3732 Acc: 0.7735 Kappa: 0.7563 Dice: 0.5108 2022-08-24 05:52:15 [INFO] [EVAL] Class IoU: [0.696 0.7927 0.9318 0.7373 0.6826 0.7766 0.778 0.8029 0.5312 0.6241 0.4918 0.5754 0.7143 0.2987 0.2933 0.4506 0.4969 0.4009 0.6165 0.4351 0.7664 0.4228 0.6215 0.5258 0.3285 0.4445 0.4276 0.4578 0.4156 0.2175 0.296 0.5431 0.304 0.3621 0.3712 0.4268 0.4798 0.588 0.2927 0.3841 0.1634 0.1555 0.3562 0.2695 0.2469 0.2341 0.3429 0.5484 0.6216 0.56 0.5928 0.387 0.1508 0.2519 0.6781 0.3161 0.8954 0.3625 0.3676 0.2827 0.0892 0.2854 0.2994 0.1373 0.4776 0.7241 0.2683 0.3872 0.1248 0.3629 0.525 0.5903 0.4179 0.2573 0.4766 0.4004 0.5819 0.2773 0.287 0.3374 0.7128 0.4136 0.4443 0.0417 0.1162 0.5251 0.1228 0.133 0.2772 0.5341 0.4575 0.0711 0.2158 0.1062 0.0573 0.0487 0.2164 0.174 0.2432 0.2958 0.2477 0.1266 0.2369 0.6744 0.1756 0.4904 0.1948 0.5553 0.0771 0.2606 0.1946 0.5333 0.1997 0.6883 0.7424 0.0548 0.3781 0.661 0.2142 0.4079 0.4254 0.0793 0.3282 0.1694 0.3032 0.2672 0.5413 0.4605 0.5412 0.3711 0.5691 0.0681 0.233 0.4194 0.2995 0.2188 0.1683 0.0303 0.2289 0.3882 0.1374 0.0096 0.2617 0.2104 0.2541 0.0107 0.402 0.032 0.1359 0.1977] 2022-08-24 05:52:15 [INFO] [EVAL] Class Precision: [0.7923 0.8703 0.9636 0.8187 0.7588 0.867 0.9008 0.8529 0.6659 0.7475 0.7106 0.6658 0.7778 0.5189 0.5685 0.6382 0.7057 0.7175 0.7914 0.6441 0.8342 0.6475 0.7415 0.6462 0.4941 0.5866 0.5452 0.8134 0.8138 0.3236 0.462 0.6774 0.5195 0.4744 0.4629 0.5642 0.6615 0.7958 0.4211 0.6155 0.3472 0.3283 0.5958 0.4918 0.3138 0.4149 0.7583 0.7553 0.6894 0.7106 0.7797 0.4774 0.3091 0.6327 0.7275 0.6721 0.9379 0.7241 0.6807 0.5578 0.131 0.4889 0.4252 0.6354 0.5715 0.847 0.4244 0.5022 0.2534 0.6053 0.7521 0.7813 0.5624 0.3648 0.7106 0.6554 0.8042 0.5494 0.7311 
0.5834 0.8543 0.6663 0.7746 0.1329 0.2158 0.7136 0.484 0.443 0.7196 0.7202 0.7068 0.0852 0.4163 0.3446 0.1777 0.1653 0.5969 0.384 0.346 0.7194 0.6458 0.1972 0.5287 0.9662 0.8448 0.5253 0.411 0.776 0.1899 0.3957 0.4808 0.8001 0.5869 0.8724 0.7523 0.2823 0.6775 0.7762 0.4014 0.5696 0.6682 0.3115 0.6955 0.7406 0.7421 0.6756 0.8481 0.6074 0.8091 0.519 0.6916 0.4933 0.5106 0.7984 0.6142 0.385 0.397 0.1049 0.4256 0.7071 0.5788 0.0147 0.5693 0.5386 0.5588 0.0141 0.8525 0.2515 0.4144 0.7163] 2022-08-24 05:52:15 [INFO] [EVAL] Class Recall: [0.8513 0.8989 0.9658 0.8811 0.8717 0.8816 0.8509 0.9319 0.7242 0.7908 0.615 0.8091 0.8975 0.413 0.3772 0.6053 0.6267 0.476 0.7361 0.5727 0.9042 0.5492 0.7933 0.7383 0.495 0.6472 0.6646 0.5115 0.4592 0.3986 0.4517 0.7326 0.423 0.6047 0.6518 0.6368 0.636 0.6924 0.49 0.5053 0.2359 0.2281 0.4697 0.3734 0.5368 0.3495 0.385 0.6669 0.8635 0.7255 0.7121 0.6713 0.2274 0.295 0.909 0.3737 0.9518 0.4207 0.4442 0.3644 0.2188 0.4068 0.5029 0.149 0.7441 0.833 0.4219 0.6284 0.1973 0.4754 0.6349 0.7071 0.6192 0.466 0.5914 0.5072 0.6779 0.359 0.3209 0.4445 0.8115 0.5217 0.5102 0.0573 0.2011 0.6654 0.1413 0.1597 0.3108 0.6739 0.5647 0.3002 0.3094 0.1331 0.078 0.0647 0.2534 0.2413 0.4502 0.3344 0.2867 0.2614 0.3004 0.6908 0.1815 0.8807 0.2703 0.6614 0.1149 0.4328 0.2464 0.6153 0.2324 0.7654 0.9825 0.0637 0.461 0.8167 0.3147 0.5896 0.5394 0.0961 0.3833 0.1801 0.3389 0.3066 0.5994 0.6556 0.6204 0.5655 0.7625 0.0732 0.3 0.4691 0.3688 0.3364 0.2261 0.0408 0.3312 0.4626 0.1527 0.0266 0.3264 0.2567 0.3179 0.0416 0.4321 0.0354 0.1682 0.2145] 2022-08-24 05:52:16 [INFO] [EVAL] The model with the best validation mIoU (0.3776) was saved at iter 112000. 2022-08-24 05:52:24 [INFO] [TRAIN] epoch: 95, iter: 119050/160000, loss: 0.4425, lr: 0.000310, batch_cost: 0.1755, reader_cost: 0.00407, ips: 45.5753 samples/sec | ETA 01:59:48 2022-08-24 05:52:33 [INFO] [TRAIN] epoch: 95, iter: 119100/160000, loss: 0.4127, lr: 0.000310, batch_cost: 0.1658, reader_cost: 0.00162, ips: 48.2439 samples/sec | ETA 01:53:02 2022-08-24 05:52:41 [INFO] [TRAIN] epoch: 95, iter: 119150/160000, loss: 0.4237, lr: 0.000309, batch_cost: 0.1736, reader_cost: 0.00333, ips: 46.0774 samples/sec | ETA 01:58:12 2022-08-24 05:52:49 [INFO] [TRAIN] epoch: 95, iter: 119200/160000, loss: 0.4104, lr: 0.000309, batch_cost: 0.1593, reader_cost: 0.00052, ips: 50.2321 samples/sec | ETA 01:48:17 2022-08-24 05:52:58 [INFO] [TRAIN] epoch: 95, iter: 119250/160000, loss: 0.4212, lr: 0.000309, batch_cost: 0.1635, reader_cost: 0.00048, ips: 48.9178 samples/sec | ETA 01:51:04 2022-08-24 05:53:06 [INFO] [TRAIN] epoch: 95, iter: 119300/160000, loss: 0.4352, lr: 0.000308, batch_cost: 0.1747, reader_cost: 0.00068, ips: 45.7821 samples/sec | ETA 01:58:31 2022-08-24 05:53:16 [INFO] [TRAIN] epoch: 95, iter: 119350/160000, loss: 0.4182, lr: 0.000308, batch_cost: 0.2038, reader_cost: 0.00046, ips: 39.2480 samples/sec | ETA 02:18:05 2022-08-24 05:53:24 [INFO] [TRAIN] epoch: 95, iter: 119400/160000, loss: 0.4598, lr: 0.000307, batch_cost: 0.1563, reader_cost: 0.00031, ips: 51.1776 samples/sec | ETA 01:45:46 2022-08-24 05:53:33 [INFO] [TRAIN] epoch: 95, iter: 119450/160000, loss: 0.4259, lr: 0.000307, batch_cost: 0.1649, reader_cost: 0.00117, ips: 48.5184 samples/sec | ETA 01:51:26 2022-08-24 05:53:40 [INFO] [TRAIN] epoch: 95, iter: 119500/160000, loss: 0.4379, lr: 0.000307, batch_cost: 0.1528, reader_cost: 0.00041, ips: 52.3517 samples/sec | ETA 01:43:08 2022-08-24 05:53:51 [INFO] [TRAIN] epoch: 95, iter: 119550/160000, loss: 0.4185, lr: 0.000306, batch_cost: 
0.2066, reader_cost: 0.00038, ips: 38.7192 samples/sec | ETA 02:19:17 2022-08-24 05:53:58 [INFO] [TRAIN] epoch: 95, iter: 119600/160000, loss: 0.4083, lr: 0.000306, batch_cost: 0.1555, reader_cost: 0.00061, ips: 51.4431 samples/sec | ETA 01:44:42 2022-08-24 05:54:06 [INFO] [TRAIN] epoch: 95, iter: 119650/160000, loss: 0.4207, lr: 0.000305, batch_cost: 0.1621, reader_cost: 0.00052, ips: 49.3510 samples/sec | ETA 01:49:00 2022-08-24 05:54:16 [INFO] [TRAIN] epoch: 95, iter: 119700/160000, loss: 0.4331, lr: 0.000305, batch_cost: 0.1856, reader_cost: 0.00043, ips: 43.1051 samples/sec | ETA 02:04:39 2022-08-24 05:54:24 [INFO] [TRAIN] epoch: 95, iter: 119750/160000, loss: 0.4082, lr: 0.000305, batch_cost: 0.1673, reader_cost: 0.00051, ips: 47.8273 samples/sec | ETA 01:52:12 2022-08-24 05:54:32 [INFO] [TRAIN] epoch: 95, iter: 119800/160000, loss: 0.4235, lr: 0.000304, batch_cost: 0.1622, reader_cost: 0.00093, ips: 49.3354 samples/sec | ETA 01:48:38 2022-08-24 05:54:40 [INFO] [TRAIN] epoch: 95, iter: 119850/160000, loss: 0.4222, lr: 0.000304, batch_cost: 0.1587, reader_cost: 0.00056, ips: 50.4158 samples/sec | ETA 01:46:11 2022-08-24 05:54:48 [INFO] [TRAIN] epoch: 95, iter: 119900/160000, loss: 0.4133, lr: 0.000304, batch_cost: 0.1648, reader_cost: 0.00056, ips: 48.5299 samples/sec | ETA 01:50:10 2022-08-24 05:54:57 [INFO] [TRAIN] epoch: 95, iter: 119950/160000, loss: 0.4254, lr: 0.000303, batch_cost: 0.1640, reader_cost: 0.00091, ips: 48.7674 samples/sec | ETA 01:49:29 2022-08-24 05:55:08 [INFO] [TRAIN] epoch: 96, iter: 120000/160000, loss: 0.4194, lr: 0.000303, batch_cost: 0.2360, reader_cost: 0.05570, ips: 33.8949 samples/sec | ETA 02:37:20 2022-08-24 05:55:08 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 185s - batch_cost: 0.1846 - reader cost: 8.7016e-04 2022-08-24 05:58:13 [INFO] [EVAL] #Images: 2000 mIoU: 0.3746 Acc: 0.7754 Kappa: 0.7581 Dice: 0.5103 2022-08-24 05:58:13 [INFO] [EVAL] Class IoU: [0.6941 0.7928 0.9309 0.7395 0.6926 0.7685 0.7792 0.8092 0.539 0.6335 0.4977 0.5734 0.7145 0.3028 0.3109 0.4468 0.4885 0.4413 0.6204 0.4416 0.7756 0.456 0.6312 0.5218 0.3531 0.384 0.4767 0.4554 0.443 0.2243 0.2763 0.5309 0.312 0.3594 0.3448 0.4145 0.4795 0.599 0.2822 0.392 0.1272 0.1316 0.3626 0.257 0.2542 0.2529 0.3794 0.55 0.6139 0.5492 0.6116 0.4121 0.1671 0.1296 0.6824 0.3311 0.8797 0.421 0.4199 0.244 0.0957 0.2229 0.3042 0.2209 0.483 0.725 0.282 0.3928 0.1235 0.3723 0.5276 0.5825 0.3873 0.2655 0.48 0.4107 0.618 0.3054 0.2638 0.326 0.721 0.4335 0.4357 0.0314 0.0937 0.5323 0.1125 0.1327 0.3403 0.5217 0.4776 0.0431 0.2188 0.1235 0.0577 0.0583 0.2132 0.178 0.2592 0.305 0.1909 0.1282 0.297 0.7753 0.1878 0.693 0.1164 0.5535 0.0818 0.2127 0.2015 0.5285 0.1958 0.6848 0.7331 0.0611 0.3143 0.6496 0.1444 0.4097 0.4455 0.0686 0.2533 0.1852 0.2732 0.242 0.5465 0.4444 0.5105 0.3946 0.614 0.0792 0.2726 0.4189 0.3063 0.2076 0.1673 0.0474 0.2411 0.4175 0.0829 0.0087 0.2386 0.1581 0.2186 0.0123 0.4135 0.037 0.1434 0.1959] 2022-08-24 05:58:13 [INFO] [EVAL] Class Precision: [0.7864 0.8583 0.9632 0.822 0.782 0.878 0.8867 0.8642 0.6882 0.7365 0.7282 0.718 0.7807 0.5047 0.5531 0.6052 0.6836 0.6812 0.7746 0.6192 0.8526 0.6676 0.7729 0.6215 0.49 0.6514 0.5665 0.7936 0.8019 0.3943 0.4718 0.6823 0.5834 0.52 0.4403 0.5604 0.693 0.8297 0.3784 0.6032 0.358 0.3113 0.6282 0.4948 0.3463 0.5611 0.815 0.7421 0.7447 0.6904 0.8104 0.5078 0.2934 0.5048 0.7267 0.7298 0.9207 0.7183 0.7454 0.4128 0.1657 0.396 0.4333 0.6191 0.5776 0.8437 0.4673 0.5451 0.2404 0.6347 0.7277 0.7857 0.6022 0.3514 0.6826 
0.6167 0.835 0.6904 0.6935 0.5054 0.8534 0.7028 0.7903 0.1884 0.1699 0.6698 0.523 0.4122 0.8453 0.684 0.6811 0.0527 0.459 0.3418 0.1941 0.1334 0.5499 0.3895 0.4006 0.7546 0.7298 0.1989 0.5269 0.9598 0.8498 0.8575 0.2912 0.8077 0.2078 0.3605 0.4429 0.7711 0.54 0.8763 0.7412 0.2777 0.5594 0.7628 0.1913 0.5503 0.7019 0.2627 0.6297 0.6365 0.7831 0.6801 0.8228 0.5461 0.7716 0.5879 0.7239 0.4357 0.5775 0.8053 0.6453 0.4133 0.3911 0.0983 0.4907 0.6217 0.4186 0.0134 0.6319 0.5239 0.612 0.0273 0.8232 0.2475 0.4892 0.7412] 2022-08-24 05:58:13 [INFO] [EVAL] Class Recall: [0.8554 0.9123 0.9652 0.8805 0.8584 0.8604 0.8654 0.9271 0.7131 0.8192 0.6112 0.7401 0.8938 0.4308 0.4152 0.6306 0.6311 0.5562 0.7572 0.6063 0.8957 0.5899 0.7749 0.7649 0.5584 0.4834 0.7503 0.5167 0.4974 0.3423 0.4 0.7051 0.4014 0.5379 0.6138 0.6142 0.6088 0.683 0.5259 0.5283 0.1647 0.1856 0.4618 0.3483 0.4887 0.3153 0.4151 0.68 0.7776 0.7287 0.7138 0.6862 0.2796 0.1484 0.918 0.3773 0.9518 0.5043 0.4902 0.3738 0.1846 0.3377 0.5051 0.2556 0.7468 0.8375 0.4155 0.5842 0.2026 0.4738 0.6574 0.6925 0.5205 0.5207 0.6179 0.5515 0.7041 0.3538 0.2986 0.4788 0.8229 0.5309 0.4926 0.0363 0.1727 0.7217 0.1254 0.1636 0.3629 0.6874 0.6151 0.1909 0.2948 0.1621 0.0759 0.0939 0.2582 0.2469 0.4233 0.3386 0.2054 0.2653 0.405 0.8014 0.1942 0.7832 0.1624 0.6375 0.1189 0.3416 0.2699 0.6269 0.235 0.758 0.9853 0.0726 0.4176 0.814 0.3708 0.6158 0.5494 0.0849 0.2976 0.2071 0.2955 0.2731 0.6194 0.7046 0.6014 0.5454 0.8017 0.0882 0.3405 0.4662 0.3683 0.2944 0.2263 0.0839 0.3216 0.5596 0.0937 0.0241 0.2771 0.1846 0.2538 0.0221 0.4537 0.0417 0.1686 0.2103] 2022-08-24 05:58:13 [INFO] [EVAL] The model with the best validation mIoU (0.3776) was saved at iter 112000. 2022-08-24 05:58:22 [INFO] [TRAIN] epoch: 96, iter: 120050/160000, loss: 0.4104, lr: 0.000302, batch_cost: 0.1686, reader_cost: 0.00374, ips: 47.4531 samples/sec | ETA 01:52:15 2022-08-24 05:58:31 [INFO] [TRAIN] epoch: 96, iter: 120100/160000, loss: 0.4069, lr: 0.000302, batch_cost: 0.1786, reader_cost: 0.00118, ips: 44.7831 samples/sec | ETA 01:58:47 2022-08-24 05:58:40 [INFO] [TRAIN] epoch: 96, iter: 120150/160000, loss: 0.4154, lr: 0.000302, batch_cost: 0.1804, reader_cost: 0.00058, ips: 44.3542 samples/sec | ETA 01:59:47 2022-08-24 05:58:49 [INFO] [TRAIN] epoch: 96, iter: 120200/160000, loss: 0.3945, lr: 0.000301, batch_cost: 0.1769, reader_cost: 0.00060, ips: 45.2181 samples/sec | ETA 01:57:21 2022-08-24 05:58:58 [INFO] [TRAIN] epoch: 96, iter: 120250/160000, loss: 0.4084, lr: 0.000301, batch_cost: 0.1846, reader_cost: 0.00079, ips: 43.3392 samples/sec | ETA 02:02:17 2022-08-24 05:59:06 [INFO] [TRAIN] epoch: 96, iter: 120300/160000, loss: 0.4276, lr: 0.000301, batch_cost: 0.1654, reader_cost: 0.00092, ips: 48.3600 samples/sec | ETA 01:49:27 2022-08-24 05:59:14 [INFO] [TRAIN] epoch: 96, iter: 120350/160000, loss: 0.4654, lr: 0.000300, batch_cost: 0.1596, reader_cost: 0.00074, ips: 50.1302 samples/sec | ETA 01:45:27 2022-08-24 05:59:23 [INFO] [TRAIN] epoch: 96, iter: 120400/160000, loss: 0.4284, lr: 0.000300, batch_cost: 0.1728, reader_cost: 0.00137, ips: 46.3023 samples/sec | ETA 01:54:01 2022-08-24 05:59:31 [INFO] [TRAIN] epoch: 96, iter: 120450/160000, loss: 0.4407, lr: 0.000299, batch_cost: 0.1617, reader_cost: 0.00105, ips: 49.4851 samples/sec | ETA 01:46:33 2022-08-24 05:59:39 [INFO] [TRAIN] epoch: 96, iter: 120500/160000, loss: 0.4089, lr: 0.000299, batch_cost: 0.1642, reader_cost: 0.00038, ips: 48.7204 samples/sec | ETA 01:48:05 2022-08-24 05:59:47 [INFO] [TRAIN] epoch: 96, iter: 120550/160000, 
loss: 0.4243, lr: 0.000299, batch_cost: 0.1509, reader_cost: 0.00060, ips: 53.0198 samples/sec | ETA 01:39:12 2022-08-24 05:59:54 [INFO] [TRAIN] epoch: 96, iter: 120600/160000, loss: 0.4304, lr: 0.000298, batch_cost: 0.1501, reader_cost: 0.00061, ips: 53.2883 samples/sec | ETA 01:38:34 2022-08-24 06:00:02 [INFO] [TRAIN] epoch: 96, iter: 120650/160000, loss: 0.4132, lr: 0.000298, batch_cost: 0.1514, reader_cost: 0.00074, ips: 52.8421 samples/sec | ETA 01:39:17 2022-08-24 06:00:10 [INFO] [TRAIN] epoch: 96, iter: 120700/160000, loss: 0.4134, lr: 0.000298, batch_cost: 0.1712, reader_cost: 0.00116, ips: 46.7200 samples/sec | ETA 01:52:09 2022-08-24 06:00:19 [INFO] [TRAIN] epoch: 96, iter: 120750/160000, loss: 0.4026, lr: 0.000297, batch_cost: 0.1816, reader_cost: 0.00058, ips: 44.0560 samples/sec | ETA 01:58:47 2022-08-24 06:00:28 [INFO] [TRAIN] epoch: 96, iter: 120800/160000, loss: 0.4169, lr: 0.000297, batch_cost: 0.1735, reader_cost: 0.00080, ips: 46.1137 samples/sec | ETA 01:53:20 2022-08-24 06:00:36 [INFO] [TRAIN] epoch: 96, iter: 120850/160000, loss: 0.4329, lr: 0.000296, batch_cost: 0.1663, reader_cost: 0.00039, ips: 48.1094 samples/sec | ETA 01:48:30 2022-08-24 06:00:45 [INFO] [TRAIN] epoch: 96, iter: 120900/160000, loss: 0.3888, lr: 0.000296, batch_cost: 0.1656, reader_cost: 0.00841, ips: 48.2973 samples/sec | ETA 01:47:56 2022-08-24 06:00:54 [INFO] [TRAIN] epoch: 96, iter: 120950/160000, loss: 0.4380, lr: 0.000296, batch_cost: 0.1860, reader_cost: 0.00097, ips: 43.0128 samples/sec | ETA 02:01:02 2022-08-24 06:01:03 [INFO] [TRAIN] epoch: 96, iter: 121000/160000, loss: 0.3888, lr: 0.000295, batch_cost: 0.1731, reader_cost: 0.00042, ips: 46.2146 samples/sec | ETA 01:52:31 2022-08-24 06:01:03 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 
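The ips and ETA fields in the [TRAIN] lines above follow directly from batch_cost: the numbers are consistent with a global batch size of 8 (inferred from ips ≈ 8 / batch_cost; the batch size itself is not stated in this excerpt), and ETA is roughly the remaining iterations times the current batch_cost, given the 160000-iteration schedule shown in every line. A minimal sketch of that arithmetic, checked against one logged sample (hypothetical helper, not part of the training script):

# Minimal sketch: reproduce the ips and ETA fields of a [TRAIN] log line.
# Assumptions: global batch size of 8 (inferred from ips ~= 8 / batch_cost,
# not stated in this excerpt) and a schedule of 160000 total iterations.
TOTAL_ITERS = 160000
BATCH_SIZE = 8  # inferred, hypothetical

def throughput_and_eta(cur_iter, batch_cost):
    ips = BATCH_SIZE / batch_cost                    # samples/sec
    eta_sec = (TOTAL_ITERS - cur_iter) * batch_cost  # seconds left at current speed
    h, rem = divmod(int(eta_sec), 3600)
    m, s = divmod(rem, 60)
    return round(ips, 4), f"{h:02d}:{m:02d}:{s:02d}"

# Logged above: iter 121000, batch_cost 0.1731 -> ips 46.2146, ETA 01:52:31
print(throughput_and_eta(121000, 0.1731))  # (46.2161, '01:52:30'), matching to within rounding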
1000/1000 - 202s - batch_cost: 0.2018 - reader cost: 7.6328e-04 2022-08-24 06:04:25 [INFO] [EVAL] #Images: 2000 mIoU: 0.3729 Acc: 0.7750 Kappa: 0.7577 Dice: 0.5099 2022-08-24 06:04:25 [INFO] [EVAL] Class IoU: [0.695 0.7922 0.9312 0.7416 0.6876 0.774 0.7785 0.8015 0.5331 0.6354 0.5024 0.5683 0.7044 0.2845 0.3132 0.4523 0.5376 0.4319 0.6127 0.4346 0.7643 0.4001 0.6307 0.5306 0.3463 0.4091 0.456 0.445 0.4759 0.2317 0.2914 0.5287 0.3236 0.3685 0.3495 0.4189 0.488 0.5911 0.3104 0.365 0.1693 0.1565 0.3599 0.2626 0.2546 0.2121 0.3969 0.5378 0.6079 0.5366 0.6057 0.3699 0.1648 0.1549 0.6835 0.3069 0.8783 0.3876 0.3781 0.2597 0.1049 0.2447 0.3558 0.1209 0.4956 0.7276 0.2941 0.3806 0.1302 0.382 0.5131 0.5513 0.396 0.2358 0.4938 0.39 0.599 0.2715 0.2237 0.2726 0.7133 0.4204 0.4162 0.0502 0.1978 0.5159 0.1194 0.1368 0.3468 0.5347 0.4747 0.0589 0.2073 0.0915 0.031 0.0465 0.2232 0.1557 0.2451 0.3036 0.2182 0.1329 0.2749 0.7183 0.1885 0.5624 0.1297 0.545 0.0822 0.2197 0.217 0.4112 0.1886 0.7163 0.6789 0.0681 0.387 0.6398 0.1664 0.3568 0.4285 0.0727 0.3127 0.2157 0.274 0.2611 0.5164 0.4804 0.5789 0.4132 0.591 0.0651 0.2999 0.3624 0.3153 0.2031 0.1774 0.0474 0.2428 0.3949 0.038 0.0088 0.2593 0.327 0.2049 0.0184 0.4255 0.0315 0.165 0.2034] 2022-08-24 06:04:25 [INFO] [EVAL] Class Precision: [0.7909 0.8644 0.9676 0.8283 0.7586 0.8728 0.8768 0.8511 0.6752 0.7495 0.707 0.6963 0.77 0.5325 0.5391 0.6195 0.702 0.7032 0.7557 0.6521 0.8333 0.6377 0.7757 0.6416 0.5041 0.6476 0.5574 0.7889 0.7863 0.3608 0.4884 0.6751 0.5991 0.5291 0.4491 0.5788 0.6706 0.8031 0.4554 0.6204 0.3643 0.3152 0.5977 0.5049 0.3519 0.4538 0.7728 0.7525 0.7001 0.6694 0.7805 0.4632 0.3076 0.509 0.7254 0.6344 0.9235 0.7065 0.744 0.4373 0.137 0.4767 0.5124 0.6344 0.609 0.8399 0.5055 0.5591 0.2659 0.6656 0.7714 0.6589 0.5587 0.3639 0.7289 0.553 0.82 0.5862 0.7456 0.6154 0.8401 0.7028 0.8228 0.1467 0.2905 0.7315 0.4223 0.3957 0.8136 0.6927 0.686 0.0694 0.3708 0.347 0.0879 0.109 0.5465 0.3349 0.3321 0.6416 0.504 0.2318 0.5387 0.9477 0.7291 0.6087 0.2772 0.7632 0.1763 0.3568 0.479 0.5383 0.6199 0.8098 0.6856 0.2447 0.644 0.7445 0.2844 0.521 0.7343 0.3318 0.7196 0.523 0.7693 0.6495 0.793 0.6267 0.8437 0.7071 0.694 0.3571 0.5501 0.8672 0.7111 0.4238 0.3288 0.1388 0.5262 0.6561 0.1617 0.0159 0.6196 0.7173 0.604 0.031 0.7996 0.2055 0.3937 0.7072] 2022-08-24 06:04:25 [INFO] [EVAL] Class Recall: [0.8514 0.9046 0.9612 0.8763 0.8801 0.8724 0.8742 0.9322 0.7169 0.8066 0.6345 0.7555 0.892 0.3792 0.4278 0.6263 0.6966 0.5282 0.764 0.5659 0.9022 0.5177 0.7714 0.7542 0.5252 0.5263 0.7147 0.5052 0.5466 0.393 0.4194 0.7092 0.4131 0.5485 0.612 0.6027 0.6419 0.6913 0.4937 0.4699 0.2402 0.2372 0.475 0.3536 0.4793 0.2848 0.4494 0.6533 0.8218 0.7301 0.73 0.6473 0.2619 0.1821 0.9221 0.3728 0.9473 0.462 0.4346 0.39 0.3098 0.3346 0.538 0.1299 0.7269 0.8447 0.4129 0.5438 0.2032 0.4727 0.6051 0.7715 0.5763 0.4013 0.605 0.5695 0.6896 0.3359 0.2421 0.3286 0.8254 0.5112 0.4572 0.0709 0.3828 0.6364 0.1428 0.1729 0.3767 0.701 0.6064 0.2811 0.3199 0.1106 0.0456 0.0749 0.2739 0.2254 0.4832 0.3657 0.2779 0.2376 0.3596 0.7479 0.2027 0.881 0.1959 0.6559 0.1334 0.3639 0.2841 0.6352 0.2132 0.8611 0.9858 0.0862 0.4924 0.8198 0.2862 0.5309 0.5071 0.0852 0.3561 0.2685 0.2986 0.304 0.5968 0.6728 0.6485 0.4985 0.7993 0.0737 0.3974 0.3837 0.3616 0.2806 0.2781 0.0671 0.3107 0.498 0.0473 0.0192 0.3084 0.3754 0.2367 0.0432 0.4763 0.0359 0.2211 0.2221] 2022-08-24 06:04:25 [INFO] [EVAL] The model with the best validation mIoU (0.3776) was saved at iter 112000. 
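In each [EVAL] block, the aggregate metrics summarize the three per-class arrays: the reported mIoU is (assumed to be) the unweighted mean of the Class IoU vector, and each class's IoU is consistent with its precision and recall under the standard confusion-matrix definitions, IoU = TP/(TP+FP+FN), precision = TP/(TP+FP), recall = TP/(TP+FN). A minimal check against the first two classes of the 06:04:25 block above (formulas assumed, not shown in this log):

# Minimal sketch: how the per-class numbers in an [EVAL] block relate.
# Assumption: standard confusion-matrix definitions of IoU, precision, recall;
# values copied from the 06:04:25 [EVAL] block above.
def iou_from_precision_recall(p, r):
    # 1/IoU = (TP+FP+FN)/TP = 1/precision + 1/recall - 1
    return 1.0 / (1.0 / p + 1.0 / r - 1.0)

print(round(iou_from_precision_recall(0.7909, 0.8514), 4))  # ~0.695,  logged Class IoU[0] = 0.695
print(round(iou_from_precision_recall(0.8644, 0.9046), 4))  # ~0.7922, logged Class IoU[1] = 0.7922

# The reported mIoU (0.3729 in this block) is assumed to be the unweighted mean
# of the Class IoU array: miou = sum(class_iou) / len(class_iou).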
2022-08-24 06:04:33 [INFO] [TRAIN] epoch: 96, iter: 121050/160000, loss: 0.4112, lr: 0.000295, batch_cost: 0.1642, reader_cost: 0.01134, ips: 48.7324 samples/sec | ETA 01:46:34 2022-08-24 06:04:42 [INFO] [TRAIN] epoch: 96, iter: 121100/160000, loss: 0.4011, lr: 0.000295, batch_cost: 0.1842, reader_cost: 0.00088, ips: 43.4365 samples/sec | ETA 01:59:24 2022-08-24 06:04:50 [INFO] [TRAIN] epoch: 96, iter: 121150/160000, loss: 0.3987, lr: 0.000294, batch_cost: 0.1639, reader_cost: 0.00080, ips: 48.8118 samples/sec | ETA 01:46:07 2022-08-24 06:04:58 [INFO] [TRAIN] epoch: 96, iter: 121200/160000, loss: 0.4122, lr: 0.000294, batch_cost: 0.1523, reader_cost: 0.00103, ips: 52.5242 samples/sec | ETA 01:38:29 2022-08-24 06:05:09 [INFO] [TRAIN] epoch: 97, iter: 121250/160000, loss: 0.4335, lr: 0.000293, batch_cost: 0.2128, reader_cost: 0.03455, ips: 37.6018 samples/sec | ETA 02:17:24 2022-08-24 06:05:18 [INFO] [TRAIN] epoch: 97, iter: 121300/160000, loss: 0.4129, lr: 0.000293, batch_cost: 0.1925, reader_cost: 0.01156, ips: 41.5641 samples/sec | ETA 02:04:08 2022-08-24 06:05:29 [INFO] [TRAIN] epoch: 97, iter: 121350/160000, loss: 0.4338, lr: 0.000293, batch_cost: 0.2074, reader_cost: 0.00064, ips: 38.5690 samples/sec | ETA 02:13:36 2022-08-24 06:05:38 [INFO] [TRAIN] epoch: 97, iter: 121400/160000, loss: 0.4182, lr: 0.000292, batch_cost: 0.1923, reader_cost: 0.00075, ips: 41.5915 samples/sec | ETA 02:03:44 2022-08-24 06:05:47 [INFO] [TRAIN] epoch: 97, iter: 121450/160000, loss: 0.3990, lr: 0.000292, batch_cost: 0.1717, reader_cost: 0.00097, ips: 46.5950 samples/sec | ETA 01:50:18 2022-08-24 06:05:56 [INFO] [TRAIN] epoch: 97, iter: 121500/160000, loss: 0.4544, lr: 0.000291, batch_cost: 0.1837, reader_cost: 0.00059, ips: 43.5575 samples/sec | ETA 01:57:51 2022-08-24 06:06:05 [INFO] [TRAIN] epoch: 97, iter: 121550/160000, loss: 0.4170, lr: 0.000291, batch_cost: 0.1680, reader_cost: 0.00073, ips: 47.6073 samples/sec | ETA 01:47:41 2022-08-24 06:06:13 [INFO] [TRAIN] epoch: 97, iter: 121600/160000, loss: 0.4448, lr: 0.000291, batch_cost: 0.1684, reader_cost: 0.00076, ips: 47.4970 samples/sec | ETA 01:47:47 2022-08-24 06:06:22 [INFO] [TRAIN] epoch: 97, iter: 121650/160000, loss: 0.4037, lr: 0.000290, batch_cost: 0.1717, reader_cost: 0.00077, ips: 46.5901 samples/sec | ETA 01:49:45 2022-08-24 06:06:29 [INFO] [TRAIN] epoch: 97, iter: 121700/160000, loss: 0.4503, lr: 0.000290, batch_cost: 0.1559, reader_cost: 0.00080, ips: 51.3027 samples/sec | ETA 01:39:32 2022-08-24 06:06:38 [INFO] [TRAIN] epoch: 97, iter: 121750/160000, loss: 0.4311, lr: 0.000290, batch_cost: 0.1815, reader_cost: 0.00055, ips: 44.0726 samples/sec | ETA 01:55:43 2022-08-24 06:06:49 [INFO] [TRAIN] epoch: 97, iter: 121800/160000, loss: 0.3976, lr: 0.000289, batch_cost: 0.2199, reader_cost: 0.00051, ips: 36.3806 samples/sec | ETA 02:20:00 2022-08-24 06:06:59 [INFO] [TRAIN] epoch: 97, iter: 121850/160000, loss: 0.4160, lr: 0.000289, batch_cost: 0.1900, reader_cost: 0.00061, ips: 42.1104 samples/sec | ETA 02:00:47 2022-08-24 06:07:09 [INFO] [TRAIN] epoch: 97, iter: 121900/160000, loss: 0.4572, lr: 0.000288, batch_cost: 0.2009, reader_cost: 0.00216, ips: 39.8200 samples/sec | ETA 02:07:34 2022-08-24 06:07:20 [INFO] [TRAIN] epoch: 97, iter: 121950/160000, loss: 0.4472, lr: 0.000288, batch_cost: 0.2115, reader_cost: 0.00038, ips: 37.8298 samples/sec | ETA 02:14:06 2022-08-24 06:07:30 [INFO] [TRAIN] epoch: 97, iter: 122000/160000, loss: 0.4441, lr: 0.000288, batch_cost: 0.2108, reader_cost: 0.00037, ips: 37.9460 samples/sec | ETA 02:13:31 2022-08-24 
06:07:30 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 174s - batch_cost: 0.1738 - reader cost: 8.1745e-04 2022-08-24 06:10:24 [INFO] [EVAL] #Images: 2000 mIoU: 0.3730 Acc: 0.7747 Kappa: 0.7575 Dice: 0.5097 2022-08-24 06:10:24 [INFO] [EVAL] Class IoU: [0.6913 0.7925 0.9333 0.7461 0.6865 0.7782 0.782 0.8091 0.529 0.6388 0.4866 0.5728 0.7131 0.2949 0.325 0.4479 0.546 0.4175 0.605 0.4357 0.7714 0.411 0.6289 0.5184 0.3225 0.3696 0.4547 0.4685 0.4736 0.2256 0.2748 0.5086 0.3239 0.367 0.3875 0.4104 0.4839 0.5917 0.3063 0.3594 0.1706 0.1583 0.3556 0.2642 0.2327 0.2343 0.3518 0.5432 0.5957 0.5225 0.5884 0.4156 0.1582 0.1685 0.6812 0.3082 0.8876 0.424 0.3996 0.2696 0.0855 0.2458 0.2852 0.1892 0.4903 0.7355 0.2485 0.3521 0.1493 0.3725 0.5404 0.5465 0.397 0.2358 0.4832 0.3902 0.4996 0.2964 0.2537 0.2944 0.7349 0.4329 0.4501 0.0711 0.1541 0.512 0.1225 0.1427 0.3869 0.5384 0.4453 0.1011 0.1913 0.0777 0.0455 0.0581 0.2006 0.1381 0.2635 0.3139 0.2157 0.1158 0.2968 0.7078 0.1679 0.6242 0.1017 0.5514 0.102 0.2719 0.1957 0.4219 0.198 0.7194 0.7956 0.0863 0.4264 0.6453 0.1949 0.4146 0.3872 0.0694 0.3436 0.1655 0.275 0.2514 0.5454 0.4682 0.5619 0.3772 0.6272 0.0883 0.2662 0.3785 0.2769 0.2093 0.1765 0.0479 0.2308 0.404 0.0499 0.0093 0.2504 0.0909 0.239 0.0168 0.414 0.0339 0.1568 0.2026] 2022-08-24 06:10:24 [INFO] [EVAL] Class Precision: [0.7851 0.8701 0.9627 0.8363 0.7627 0.8672 0.8882 0.869 0.6983 0.7487 0.7102 0.6913 0.7765 0.522 0.5341 0.6632 0.6984 0.6923 0.7529 0.6245 0.8461 0.6727 0.7672 0.609 0.478 0.6247 0.5417 0.7386 0.7729 0.3734 0.4656 0.6914 0.5858 0.5196 0.4702 0.5227 0.6854 0.797 0.4414 0.6295 0.3244 0.3632 0.5972 0.533 0.3138 0.4488 0.7228 0.8026 0.6758 0.6427 0.7042 0.5129 0.3939 0.5798 0.7266 0.6272 0.9357 0.6983 0.6885 0.4545 0.1225 0.3885 0.3271 0.6186 0.5985 0.8498 0.5007 0.5347 0.3815 0.7275 0.7293 0.6861 0.5568 0.3644 0.7388 0.5961 0.7133 0.6755 0.7273 0.478 0.8911 0.6888 0.7715 0.192 0.2361 0.7281 0.495 0.4132 0.8371 0.7322 0.5849 0.1353 0.4003 0.3147 0.1327 0.1946 0.5035 0.3803 0.4018 0.6959 0.4939 0.1811 0.5391 0.7719 0.7501 0.7272 0.2265 0.6999 0.2351 0.4161 0.47 0.6726 0.6136 0.8559 0.8093 0.3081 0.6611 0.7861 0.4175 0.5708 0.7698 0.3113 0.702 0.6888 0.7663 0.6687 0.8168 0.6136 0.8032 0.5731 0.7268 0.5558 0.5421 0.8791 0.706 0.3778 0.3343 0.1332 0.4834 0.6837 0.1738 0.0131 0.5973 0.42 0.7084 0.0218 0.8321 0.2211 0.459 0.7369] 2022-08-24 06:10:24 [INFO] [EVAL] Class Recall: [0.8526 0.8988 0.9683 0.8737 0.873 0.8836 0.8674 0.9215 0.6856 0.8131 0.6071 0.7696 0.8974 0.404 0.4536 0.5798 0.7145 0.5125 0.7549 0.5903 0.8972 0.5138 0.7772 0.777 0.4978 0.4751 0.7391 0.5617 0.5501 0.363 0.4013 0.658 0.4201 0.5555 0.6876 0.6563 0.6221 0.6967 0.5002 0.4558 0.2647 0.2192 0.4677 0.3437 0.4738 0.3289 0.4067 0.6269 0.8342 0.7363 0.7815 0.6867 0.209 0.1919 0.9161 0.3773 0.9452 0.5191 0.4878 0.3987 0.2204 0.401 0.69 0.2142 0.7305 0.8455 0.3303 0.5077 0.197 0.4328 0.6761 0.7288 0.5804 0.4006 0.5827 0.5304 0.625 0.3455 0.2804 0.4339 0.8074 0.5381 0.5194 0.1015 0.3073 0.633 0.14 0.1789 0.4184 0.6705 0.6511 0.2856 0.2681 0.0935 0.0648 0.0765 0.2501 0.1782 0.4336 0.3637 0.2769 0.2432 0.3977 0.8949 0.1778 0.8151 0.1559 0.722 0.1526 0.4397 0.2511 0.5309 0.2262 0.8185 0.9792 0.1071 0.5457 0.7827 0.2676 0.6024 0.4379 0.0819 0.4023 0.1789 0.3002 0.2871 0.6215 0.664 0.6516 0.5245 0.8206 0.095 0.3434 0.3993 0.313 0.3194 0.2721 0.0697 0.3063 0.4968 0.0654 0.0308 0.3013 0.1039 0.265 0.0686 0.4518 0.0386 0.1924 0.2184] 2022-08-24 06:10:24 [INFO] [EVAL] The model with the best 
validation mIoU (0.3776) was saved at iter 112000. 2022-08-24 06:10:32 [INFO] [TRAIN] epoch: 97, iter: 122050/160000, loss: 0.4182, lr: 0.000287, batch_cost: 0.1562, reader_cost: 0.00357, ips: 51.2138 samples/sec | ETA 01:38:48 2022-08-24 06:10:40 [INFO] [TRAIN] epoch: 97, iter: 122100/160000, loss: 0.4368, lr: 0.000287, batch_cost: 0.1581, reader_cost: 0.00188, ips: 50.6068 samples/sec | ETA 01:39:51 2022-08-24 06:10:49 [INFO] [TRAIN] epoch: 97, iter: 122150/160000, loss: 0.4292, lr: 0.000287, batch_cost: 0.1810, reader_cost: 0.00057, ips: 44.2094 samples/sec | ETA 01:54:09 2022-08-24 06:10:57 [INFO] [TRAIN] epoch: 97, iter: 122200/160000, loss: 0.4156, lr: 0.000286, batch_cost: 0.1614, reader_cost: 0.00127, ips: 49.5669 samples/sec | ETA 01:41:40 2022-08-24 06:11:05 [INFO] [TRAIN] epoch: 97, iter: 122250/160000, loss: 0.4104, lr: 0.000286, batch_cost: 0.1663, reader_cost: 0.00076, ips: 48.0947 samples/sec | ETA 01:44:39 2022-08-24 06:11:16 [INFO] [TRAIN] epoch: 97, iter: 122300/160000, loss: 0.4321, lr: 0.000285, batch_cost: 0.2052, reader_cost: 0.00084, ips: 38.9799 samples/sec | ETA 02:08:57 2022-08-24 06:11:24 [INFO] [TRAIN] epoch: 97, iter: 122350/160000, loss: 0.3957, lr: 0.000285, batch_cost: 0.1654, reader_cost: 0.00077, ips: 48.3624 samples/sec | ETA 01:43:47 2022-08-24 06:11:32 [INFO] [TRAIN] epoch: 97, iter: 122400/160000, loss: 0.4287, lr: 0.000285, batch_cost: 0.1679, reader_cost: 0.00050, ips: 47.6460 samples/sec | ETA 01:45:13 2022-08-24 06:11:40 [INFO] [TRAIN] epoch: 97, iter: 122450/160000, loss: 0.4492, lr: 0.000284, batch_cost: 0.1535, reader_cost: 0.00091, ips: 52.1092 samples/sec | ETA 01:36:04 2022-08-24 06:11:49 [INFO] [TRAIN] epoch: 97, iter: 122500/160000, loss: 0.4147, lr: 0.000284, batch_cost: 0.1707, reader_cost: 0.00059, ips: 46.8539 samples/sec | ETA 01:46:42 2022-08-24 06:12:04 [INFO] [TRAIN] epoch: 98, iter: 122550/160000, loss: 0.4016, lr: 0.000284, batch_cost: 0.3179, reader_cost: 0.16611, ips: 25.1619 samples/sec | ETA 03:18:26 2022-08-24 06:12:13 [INFO] [TRAIN] epoch: 98, iter: 122600/160000, loss: 0.4318, lr: 0.000283, batch_cost: 0.1688, reader_cost: 0.00125, ips: 47.3945 samples/sec | ETA 01:45:12 2022-08-24 06:12:21 [INFO] [TRAIN] epoch: 98, iter: 122650/160000, loss: 0.4254, lr: 0.000283, batch_cost: 0.1701, reader_cost: 0.00088, ips: 47.0279 samples/sec | ETA 01:45:53 2022-08-24 06:12:30 [INFO] [TRAIN] epoch: 98, iter: 122700/160000, loss: 0.4193, lr: 0.000282, batch_cost: 0.1803, reader_cost: 0.00126, ips: 44.3680 samples/sec | ETA 01:52:05 2022-08-24 06:12:40 [INFO] [TRAIN] epoch: 98, iter: 122750/160000, loss: 0.4075, lr: 0.000282, batch_cost: 0.1819, reader_cost: 0.00043, ips: 43.9835 samples/sec | ETA 01:52:55 2022-08-24 06:12:49 [INFO] [TRAIN] epoch: 98, iter: 122800/160000, loss: 0.4198, lr: 0.000282, batch_cost: 0.1846, reader_cost: 0.00213, ips: 43.3394 samples/sec | ETA 01:54:26 2022-08-24 06:12:59 [INFO] [TRAIN] epoch: 98, iter: 122850/160000, loss: 0.4074, lr: 0.000281, batch_cost: 0.1950, reader_cost: 0.00438, ips: 41.0199 samples/sec | ETA 02:00:45 2022-08-24 06:13:09 [INFO] [TRAIN] epoch: 98, iter: 122900/160000, loss: 0.4229, lr: 0.000281, batch_cost: 0.2090, reader_cost: 0.01705, ips: 38.2684 samples/sec | ETA 02:09:15 2022-08-24 06:13:19 [INFO] [TRAIN] epoch: 98, iter: 122950/160000, loss: 0.4218, lr: 0.000281, batch_cost: 0.1987, reader_cost: 0.00046, ips: 40.2602 samples/sec | ETA 02:02:42 2022-08-24 06:13:29 [INFO] [TRAIN] epoch: 98, iter: 123000/160000, loss: 0.4244, lr: 0.000280, batch_cost: 0.1920, reader_cost: 0.00059, 
ips: 41.6725 samples/sec | ETA 01:58:22 2022-08-24 06:13:29 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 169s - batch_cost: 0.1689 - reader cost: 9.6075e-04 2022-08-24 06:16:18 [INFO] [EVAL] #Images: 2000 mIoU: 0.3743 Acc: 0.7744 Kappa: 0.7574 Dice: 0.5119 2022-08-24 06:16:18 [INFO] [EVAL] Class IoU: [0.6938 0.7897 0.9314 0.7431 0.6894 0.7718 0.7742 0.8099 0.5291 0.6494 0.4955 0.5633 0.7115 0.303 0.3223 0.4407 0.5287 0.429 0.6106 0.4346 0.765 0.4294 0.6197 0.5255 0.3433 0.4246 0.45 0.4433 0.4726 0.241 0.2873 0.5377 0.3257 0.3499 0.3773 0.4037 0.4918 0.5843 0.3174 0.3958 0.1232 0.1375 0.345 0.2543 0.2301 0.2283 0.31 0.5452 0.601 0.4986 0.6106 0.3791 0.1609 0.1556 0.6937 0.3277 0.8713 0.4314 0.4088 0.2703 0.1087 0.2017 0.308 0.1538 0.5048 0.7461 0.2853 0.3633 0.1423 0.3774 0.5029 0.5139 0.3732 0.2429 0.484 0.4144 0.4956 0.2788 0.2462 0.2987 0.7203 0.4149 0.4463 0.0656 0.1186 0.5257 0.1221 0.122 0.3638 0.523 0.447 0.1104 0.2061 0.1127 0.0195 0.0675 0.2146 0.142 0.2588 0.302 0.22 0.1147 0.2982 0.7704 0.1951 0.6494 0.1356 0.5402 0.0996 0.2666 0.2077 0.4105 0.1913 0.6717 0.7686 0.0694 0.4236 0.6527 0.1458 0.4036 0.4119 0.0678 0.3157 0.1706 0.2928 0.2613 0.5263 0.4755 0.5551 0.3923 0.6088 0.0947 0.2732 0.4248 0.2655 0.2037 0.1838 0.0494 0.2288 0.4321 0.1792 0.0139 0.2439 0.2606 0.2191 0.0136 0.3934 0.0309 0.1867 0.2056] 2022-08-24 06:16:18 [INFO] [EVAL] Class Precision: [0.7921 0.8752 0.9641 0.8413 0.7696 0.8668 0.8788 0.8644 0.6943 0.7316 0.7019 0.7044 0.774 0.5343 0.552 0.5924 0.6986 0.6662 0.7413 0.6363 0.8244 0.6885 0.7513 0.6372 0.4697 0.5692 0.5371 0.8093 0.7195 0.4243 0.4677 0.6652 0.5531 0.4634 0.4855 0.521 0.6782 0.7937 0.4762 0.5982 0.3433 0.3198 0.5598 0.4964 0.3448 0.5 0.6195 0.7461 0.7096 0.6005 0.776 0.4636 0.3049 0.589 0.7405 0.6724 0.9095 0.6809 0.7067 0.447 0.1568 0.3707 0.388 0.5692 0.6436 0.8646 0.4286 0.5685 0.2962 0.6597 0.7575 0.5902 0.5674 0.318 0.7255 0.5798 0.6977 0.5902 0.7105 0.561 0.8703 0.6987 0.7921 0.236 0.2116 0.7406 0.4357 0.4328 0.7994 0.6558 0.5922 0.1657 0.4051 0.3187 0.0497 0.1883 0.6219 0.3742 0.3474 0.7123 0.8143 0.19 0.4963 0.8839 0.8721 0.7446 0.2588 0.7831 0.1991 0.4004 0.4697 0.5997 0.4466 0.8433 0.7778 0.3245 0.692 0.7612 0.3593 0.5498 0.7246 0.2799 0.726 0.7248 0.7396 0.6544 0.7871 0.6297 0.8163 0.6613 0.7398 0.4648 0.591 0.8257 0.671 0.4118 0.3607 0.1166 0.504 0.6584 0.3013 0.0234 0.6125 0.5189 0.6421 0.0191 0.8508 0.2103 0.4298 0.7551] 2022-08-24 06:16:18 [INFO] [EVAL] Class Recall: [0.8483 0.8899 0.9649 0.8642 0.8687 0.8756 0.8668 0.9277 0.6899 0.8525 0.6276 0.7377 0.8982 0.4118 0.4365 0.6325 0.685 0.5465 0.776 0.5782 0.9139 0.5329 0.7796 0.7497 0.5607 0.6257 0.7351 0.495 0.5794 0.358 0.4269 0.7372 0.4421 0.588 0.6287 0.642 0.6414 0.6889 0.4876 0.539 0.1612 0.1944 0.4734 0.3426 0.4089 0.2959 0.3829 0.6694 0.797 0.746 0.7412 0.6753 0.2542 0.1746 0.9166 0.39 0.954 0.5407 0.4924 0.406 0.2617 0.3067 0.5992 0.1741 0.7006 0.8448 0.4605 0.5016 0.2151 0.4686 0.5995 0.799 0.5216 0.5072 0.5925 0.5922 0.6311 0.3457 0.2736 0.3898 0.8069 0.5053 0.5055 0.0833 0.2124 0.6443 0.1451 0.1453 0.4003 0.7209 0.6458 0.2488 0.2956 0.1485 0.0311 0.0952 0.2467 0.1863 0.5037 0.344 0.2316 0.2246 0.4276 0.8572 0.2009 0.8355 0.2218 0.6352 0.1662 0.4436 0.2712 0.5654 0.2507 0.7675 0.9848 0.0812 0.5221 0.8207 0.1971 0.6027 0.4883 0.0821 0.3585 0.1824 0.3264 0.3031 0.6136 0.6602 0.6343 0.4909 0.7747 0.1062 0.3369 0.4667 0.3052 0.2873 0.2726 0.079 0.2952 0.557 0.3065 0.0331 0.2884 0.3437 0.2496 0.045 0.4226 0.0349 0.2481 0.2203] 2022-08-24 06:16:18 
[INFO] [EVAL] The model with the best validation mIoU (0.3776) was saved at iter 112000. 2022-08-24 06:16:27 [INFO] [TRAIN] epoch: 98, iter: 123050/160000, loss: 0.4306, lr: 0.000280, batch_cost: 0.1767, reader_cost: 0.00491, ips: 45.2690 samples/sec | ETA 01:48:49 2022-08-24 06:16:36 [INFO] [TRAIN] epoch: 98, iter: 123100/160000, loss: 0.4126, lr: 0.000279, batch_cost: 0.1758, reader_cost: 0.00146, ips: 45.5000 samples/sec | ETA 01:48:07 2022-08-24 06:16:45 [INFO] [TRAIN] epoch: 98, iter: 123150/160000, loss: 0.4116, lr: 0.000279, batch_cost: 0.1803, reader_cost: 0.00049, ips: 44.3805 samples/sec | ETA 01:50:42 2022-08-24 06:16:54 [INFO] [TRAIN] epoch: 98, iter: 123200/160000, loss: 0.3926, lr: 0.000279, batch_cost: 0.1848, reader_cost: 0.00057, ips: 43.2821 samples/sec | ETA 01:53:21 2022-08-24 06:17:03 [INFO] [TRAIN] epoch: 98, iter: 123250/160000, loss: 0.4187, lr: 0.000278, batch_cost: 0.1831, reader_cost: 0.00053, ips: 43.6844 samples/sec | ETA 01:52:10 2022-08-24 06:17:12 [INFO] [TRAIN] epoch: 98, iter: 123300/160000, loss: 0.4410, lr: 0.000278, batch_cost: 0.1721, reader_cost: 0.00123, ips: 46.4772 samples/sec | ETA 01:45:17 2022-08-24 06:17:20 [INFO] [TRAIN] epoch: 98, iter: 123350/160000, loss: 0.4120, lr: 0.000277, batch_cost: 0.1638, reader_cost: 0.00053, ips: 48.8456 samples/sec | ETA 01:40:02 2022-08-24 06:17:28 [INFO] [TRAIN] epoch: 98, iter: 123400/160000, loss: 0.4230, lr: 0.000277, batch_cost: 0.1726, reader_cost: 0.00058, ips: 46.3376 samples/sec | ETA 01:45:18 2022-08-24 06:17:37 [INFO] [TRAIN] epoch: 98, iter: 123450/160000, loss: 0.3963, lr: 0.000277, batch_cost: 0.1817, reader_cost: 0.00042, ips: 44.0340 samples/sec | ETA 01:50:40 2022-08-24 06:17:46 [INFO] [TRAIN] epoch: 98, iter: 123500/160000, loss: 0.4140, lr: 0.000276, batch_cost: 0.1700, reader_cost: 0.00065, ips: 47.0702 samples/sec | ETA 01:43:23 2022-08-24 06:17:54 [INFO] [TRAIN] epoch: 98, iter: 123550/160000, loss: 0.3876, lr: 0.000276, batch_cost: 0.1653, reader_cost: 0.00097, ips: 48.3864 samples/sec | ETA 01:40:26 2022-08-24 06:18:03 [INFO] [TRAIN] epoch: 98, iter: 123600/160000, loss: 0.4127, lr: 0.000276, batch_cost: 0.1673, reader_cost: 0.00068, ips: 47.8051 samples/sec | ETA 01:41:31 2022-08-24 06:18:11 [INFO] [TRAIN] epoch: 98, iter: 123650/160000, loss: 0.3986, lr: 0.000275, batch_cost: 0.1681, reader_cost: 0.00072, ips: 47.5812 samples/sec | ETA 01:41:51 2022-08-24 06:18:21 [INFO] [TRAIN] epoch: 98, iter: 123700/160000, loss: 0.4050, lr: 0.000275, batch_cost: 0.1933, reader_cost: 0.00052, ips: 41.3802 samples/sec | ETA 01:56:57 2022-08-24 06:18:32 [INFO] [TRAIN] epoch: 98, iter: 123750/160000, loss: 0.3961, lr: 0.000274, batch_cost: 0.2162, reader_cost: 0.00130, ips: 37.0075 samples/sec | ETA 02:10:36 2022-08-24 06:18:48 [INFO] [TRAIN] epoch: 99, iter: 123800/160000, loss: 0.3970, lr: 0.000274, batch_cost: 0.3312, reader_cost: 0.15315, ips: 24.1516 samples/sec | ETA 03:19:50 2022-08-24 06:18:59 [INFO] [TRAIN] epoch: 99, iter: 123850/160000, loss: 0.4010, lr: 0.000274, batch_cost: 0.2098, reader_cost: 0.00055, ips: 38.1255 samples/sec | ETA 02:06:25 2022-08-24 06:19:08 [INFO] [TRAIN] epoch: 99, iter: 123900/160000, loss: 0.3792, lr: 0.000273, batch_cost: 0.1970, reader_cost: 0.00076, ips: 40.6095 samples/sec | ETA 01:58:31 2022-08-24 06:19:18 [INFO] [TRAIN] epoch: 99, iter: 123950/160000, loss: 0.4329, lr: 0.000273, batch_cost: 0.1909, reader_cost: 0.00044, ips: 41.9057 samples/sec | ETA 01:54:42 2022-08-24 06:19:28 [INFO] [TRAIN] epoch: 99, iter: 124000/160000, loss: 0.4104, lr: 0.000273, 
batch_cost: 0.1910, reader_cost: 0.00074, ips: 41.8805 samples/sec | ETA 01:54:36 2022-08-24 06:19:28 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 173s - batch_cost: 0.1732 - reader cost: 6.6413e-04 2022-08-24 06:22:21 [INFO] [EVAL] #Images: 2000 mIoU: 0.3723 Acc: 0.7755 Kappa: 0.7583 Dice: 0.5102 2022-08-24 06:22:21 [INFO] [EVAL] Class IoU: [0.6919 0.7919 0.9335 0.7421 0.6946 0.7746 0.7816 0.8112 0.5435 0.6575 0.4885 0.5679 0.7179 0.2969 0.3081 0.4476 0.5448 0.4306 0.6047 0.4393 0.7533 0.4307 0.6343 0.5274 0.3309 0.4237 0.4476 0.4542 0.4733 0.3067 0.256 0.5344 0.3296 0.3539 0.3583 0.4307 0.4822 0.5821 0.3107 0.3689 0.1669 0.1319 0.3604 0.2685 0.2308 0.2488 0.3075 0.5294 0.6189 0.5424 0.6002 0.3969 0.143 0.1589 0.6828 0.3272 0.8828 0.4433 0.339 0.2709 0.0849 0.2308 0.3049 0.1516 0.5195 0.7373 0.2625 0.3804 0.1238 0.3701 0.4853 0.5283 0.3709 0.2427 0.4844 0.406 0.5064 0.2902 0.2661 0.3325 0.697 0.435 0.4254 0.0511 0.1478 0.514 0.1212 0.1415 0.3027 0.5523 0.4108 0.0917 0.2107 0.0816 0.0298 0.0708 0.137 0.1318 0.2472 0.2958 0.2791 0.119 0.257 0.7272 0.1878 0.521 0.1334 0.547 0.0743 0.2089 0.1951 0.4428 0.194 0.5879 0.6473 0.0839 0.4034 0.6536 0.1523 0.4057 0.4569 0.0731 0.3186 0.2244 0.2941 0.286 0.5116 0.4693 0.5654 0.3693 0.5922 0.0669 0.2656 0.4172 0.287 0.2127 0.1733 0.0433 0.2381 0.4238 0.3596 0.0275 0.257 0.1918 0.2097 0.0116 0.3908 0.0398 0.1668 0.1925] 2022-08-24 06:22:21 [INFO] [EVAL] Class Precision: [0.7854 0.8751 0.9656 0.8335 0.7743 0.8699 0.8834 0.8681 0.7119 0.7642 0.6998 0.6979 0.7921 0.5383 0.5502 0.605 0.7161 0.669 0.7493 0.6291 0.8076 0.6216 0.7991 0.6258 0.5015 0.619 0.5449 0.7923 0.7615 0.4342 0.5175 0.652 0.5476 0.464 0.47 0.5507 0.6769 0.7602 0.4743 0.6063 0.3292 0.3482 0.5764 0.4698 0.3096 0.4992 0.6436 0.6976 0.7094 0.7437 0.7394 0.4836 0.27 0.7372 0.7201 0.6907 0.9154 0.6827 0.7587 0.4658 0.1287 0.3981 0.4007 0.7014 0.6525 0.8558 0.4312 0.5189 0.2558 0.6523 0.723 0.6525 0.5214 0.3335 0.7278 0.5945 0.6448 0.6855 0.7463 0.6028 0.8302 0.6943 0.8034 0.1657 0.2439 0.7244 0.4279 0.3818 0.7361 0.719 0.5459 0.121 0.4258 0.3032 0.059 0.1868 0.6066 0.3689 0.3382 0.69 0.699 0.1702 0.5633 0.8407 0.8807 0.5876 0.4005 0.727 0.1661 0.3478 0.4925 0.575 0.6213 0.8468 0.6526 0.2755 0.7728 0.7766 0.4946 0.5452 0.6867 0.3309 0.5844 0.5572 0.7627 0.6527 0.7479 0.599 0.8438 0.5065 0.7425 0.3871 0.5479 0.827 0.6401 0.3967 0.3468 0.0756 0.5011 0.6712 0.4407 0.0406 0.6087 0.6075 0.6437 0.0177 0.857 0.1598 0.4507 0.6656] 2022-08-24 06:22:21 [INFO] [EVAL] Class Recall: [0.8533 0.8928 0.9656 0.8713 0.8709 0.876 0.8715 0.9252 0.6967 0.8249 0.618 0.7531 0.8845 0.3984 0.4119 0.6323 0.6949 0.5471 0.758 0.5928 0.918 0.5837 0.7546 0.7703 0.4931 0.5732 0.7149 0.5157 0.5556 0.5109 0.3362 0.7476 0.4529 0.5986 0.6012 0.6641 0.6264 0.713 0.474 0.4851 0.253 0.1751 0.4903 0.3853 0.4755 0.3316 0.3706 0.6871 0.8291 0.6671 0.7612 0.6887 0.2332 0.1685 0.9294 0.3834 0.9611 0.5584 0.38 0.3931 0.1995 0.3544 0.5607 0.1621 0.7182 0.8419 0.4015 0.5877 0.1936 0.461 0.5962 0.7352 0.5622 0.4714 0.5917 0.5616 0.7023 0.3348 0.2926 0.4258 0.8128 0.538 0.4748 0.0689 0.2726 0.639 0.1446 0.1835 0.3396 0.7044 0.6241 0.2749 0.2943 0.1004 0.0569 0.1024 0.1504 0.1701 0.4788 0.3411 0.3172 0.2832 0.3209 0.8435 0.1927 0.8213 0.1666 0.6884 0.1185 0.3435 0.2442 0.6582 0.22 0.6579 0.9876 0.1077 0.4577 0.805 0.1803 0.6132 0.5773 0.0858 0.412 0.2731 0.3237 0.3374 0.6182 0.6844 0.6315 0.5769 0.7453 0.0748 0.3401 0.4571 0.3422 0.3145 0.2572 0.0922 0.312 0.5348 0.6616 0.078 0.3079 0.219 0.2372 0.0322 
0.4181 0.0503 0.2094 0.2131] 2022-08-24 06:22:21 [INFO] [EVAL] The model with the best validation mIoU (0.3776) was saved at iter 112000. 2022-08-24 06:22:29 [INFO] [TRAIN] epoch: 99, iter: 124050/160000, loss: 0.4212, lr: 0.000272, batch_cost: 0.1662, reader_cost: 0.00374, ips: 48.1410 samples/sec | ETA 01:39:34 2022-08-24 06:22:38 [INFO] [TRAIN] epoch: 99, iter: 124100/160000, loss: 0.4203, lr: 0.000272, batch_cost: 0.1798, reader_cost: 0.00094, ips: 44.4869 samples/sec | ETA 01:47:35 2022-08-24 06:22:47 [INFO] [TRAIN] epoch: 99, iter: 124150/160000, loss: 0.4193, lr: 0.000271, batch_cost: 0.1782, reader_cost: 0.00124, ips: 44.8880 samples/sec | ETA 01:46:29 2022-08-24 06:22:56 [INFO] [TRAIN] epoch: 99, iter: 124200/160000, loss: 0.3813, lr: 0.000271, batch_cost: 0.1740, reader_cost: 0.00095, ips: 45.9879 samples/sec | ETA 01:43:47 2022-08-24 06:23:04 [INFO] [TRAIN] epoch: 99, iter: 124250/160000, loss: 0.4010, lr: 0.000271, batch_cost: 0.1528, reader_cost: 0.00052, ips: 52.3472 samples/sec | ETA 01:31:03 2022-08-24 06:23:11 [INFO] [TRAIN] epoch: 99, iter: 124300/160000, loss: 0.3945, lr: 0.000270, batch_cost: 0.1538, reader_cost: 0.00064, ips: 52.0260 samples/sec | ETA 01:31:29 2022-08-24 06:23:19 [INFO] [TRAIN] epoch: 99, iter: 124350/160000, loss: 0.4202, lr: 0.000270, batch_cost: 0.1559, reader_cost: 0.00059, ips: 51.3062 samples/sec | ETA 01:32:38 2022-08-24 06:23:27 [INFO] [TRAIN] epoch: 99, iter: 124400/160000, loss: 0.3910, lr: 0.000270, batch_cost: 0.1646, reader_cost: 0.00052, ips: 48.6114 samples/sec | ETA 01:37:38 2022-08-24 06:23:38 [INFO] [TRAIN] epoch: 99, iter: 124450/160000, loss: 0.4053, lr: 0.000269, batch_cost: 0.2020, reader_cost: 0.00036, ips: 39.6099 samples/sec | ETA 01:59:40 2022-08-24 06:23:46 [INFO] [TRAIN] epoch: 99, iter: 124500/160000, loss: 0.4143, lr: 0.000269, batch_cost: 0.1751, reader_cost: 0.00193, ips: 45.6877 samples/sec | ETA 01:43:36 2022-08-24 06:23:55 [INFO] [TRAIN] epoch: 99, iter: 124550/160000, loss: 0.4009, lr: 0.000268, batch_cost: 0.1783, reader_cost: 0.00071, ips: 44.8652 samples/sec | ETA 01:45:21 2022-08-24 06:24:04 [INFO] [TRAIN] epoch: 99, iter: 124600/160000, loss: 0.4259, lr: 0.000268, batch_cost: 0.1701, reader_cost: 0.00052, ips: 47.0350 samples/sec | ETA 01:40:21 2022-08-24 06:24:13 [INFO] [TRAIN] epoch: 99, iter: 124650/160000, loss: 0.4101, lr: 0.000268, batch_cost: 0.1761, reader_cost: 0.00248, ips: 45.4381 samples/sec | ETA 01:43:43 2022-08-24 06:24:23 [INFO] [TRAIN] epoch: 99, iter: 124700/160000, loss: 0.4081, lr: 0.000267, batch_cost: 0.2018, reader_cost: 0.00054, ips: 39.6449 samples/sec | ETA 01:58:43 2022-08-24 06:24:32 [INFO] [TRAIN] epoch: 99, iter: 124750/160000, loss: 0.4011, lr: 0.000267, batch_cost: 0.1978, reader_cost: 0.00315, ips: 40.4391 samples/sec | ETA 01:56:13 2022-08-24 06:24:42 [INFO] [TRAIN] epoch: 99, iter: 124800/160000, loss: 0.4069, lr: 0.000267, batch_cost: 0.1959, reader_cost: 0.00047, ips: 40.8402 samples/sec | ETA 01:54:55 2022-08-24 06:24:54 [INFO] [TRAIN] epoch: 99, iter: 124850/160000, loss: 0.4102, lr: 0.000266, batch_cost: 0.2244, reader_cost: 0.00038, ips: 35.6498 samples/sec | ETA 02:11:27 2022-08-24 06:25:03 [INFO] [TRAIN] epoch: 99, iter: 124900/160000, loss: 0.4200, lr: 0.000266, batch_cost: 0.1898, reader_cost: 0.00047, ips: 42.1561 samples/sec | ETA 01:51:00 2022-08-24 06:25:14 [INFO] [TRAIN] epoch: 99, iter: 124950/160000, loss: 0.4349, lr: 0.000265, batch_cost: 0.2222, reader_cost: 0.00046, ips: 36.0006 samples/sec | ETA 02:09:48 2022-08-24 06:25:24 [INFO] [TRAIN] epoch: 99, iter: 
125000/160000, loss: 0.4112, lr: 0.000265, batch_cost: 0.1949, reader_cost: 0.00062, ips: 41.0532 samples/sec | ETA 01:53:40 2022-08-24 06:25:24 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 172s - batch_cost: 0.1718 - reader cost: 7.1921e-04 2022-08-24 06:28:16 [INFO] [EVAL] #Images: 2000 mIoU: 0.3768 Acc: 0.7751 Kappa: 0.7578 Dice: 0.5152 2022-08-24 06:28:16 [INFO] [EVAL] Class IoU: [0.6918 0.7924 0.9316 0.742 0.6886 0.7705 0.7779 0.8166 0.5364 0.6516 0.4943 0.5737 0.7159 0.2781 0.3137 0.4476 0.5249 0.4438 0.6144 0.4431 0.7725 0.3643 0.6399 0.5137 0.3568 0.4603 0.488 0.4715 0.395 0.3051 0.2492 0.5431 0.3147 0.3623 0.362 0.4308 0.4891 0.5725 0.3041 0.3909 0.1416 0.1294 0.3543 0.2639 0.2495 0.2138 0.3715 0.5323 0.6163 0.5468 0.6069 0.3791 0.1567 0.1553 0.6819 0.3233 0.8912 0.4188 0.3995 0.2604 0.101 0.2279 0.321 0.1582 0.4959 0.7058 0.2933 0.4092 0.1349 0.3581 0.5071 0.525 0.3675 0.248 0.4912 0.4224 0.4103 0.2745 0.2758 0.2429 0.7069 0.427 0.4128 0.0655 0.1753 0.5183 0.1174 0.1325 0.3171 0.5623 0.4334 0.0655 0.2052 0.0775 0.0369 0.0646 0.2845 0.1272 0.2689 0.3219 0.2726 0.1116 0.2939 0.6714 0.195 0.5709 0.1658 0.5653 0.0703 0.3038 0.2332 0.3799 0.1921 0.6756 0.7107 0.0842 0.43 0.7062 0.1974 0.3626 0.4119 0.0783 0.3564 0.2337 0.2914 0.279 0.5141 0.4813 0.665 0.4115 0.5369 0.0916 0.2748 0.4457 0.287 0.2085 0.1745 0.0453 0.2209 0.411 0.1117 0.0588 0.2352 0.4367 0.2352 0.0128 0.3994 0.0312 0.1586 0.2065] 2022-08-24 06:28:16 [INFO] [EVAL] Class Precision: [0.7787 0.8783 0.962 0.8279 0.7639 0.8864 0.8951 0.881 0.679 0.7608 0.7263 0.7032 0.7925 0.4983 0.5546 0.5875 0.6977 0.6757 0.7688 0.6325 0.8412 0.6777 0.8045 0.6022 0.4862 0.6236 0.5612 0.7333 0.8396 0.4839 0.4899 0.6981 0.5964 0.5303 0.4772 0.5687 0.6646 0.8027 0.43 0.5892 0.3231 0.3644 0.5564 0.4944 0.3374 0.4654 0.6491 0.7423 0.6964 0.6854 0.7559 0.4423 0.305 0.5793 0.7258 0.6513 0.9343 0.7011 0.6687 0.4513 0.1294 0.36 0.4555 0.6093 0.603 0.7977 0.468 0.5903 0.3001 0.7273 0.7158 0.6368 0.5819 0.3355 0.7509 0.6502 0.6728 0.4948 0.766 0.6551 0.8531 0.6686 0.8288 0.2287 0.2656 0.6869 0.5804 0.4114 0.7714 0.7213 0.5782 0.0777 0.3967 0.3363 0.0819 0.1605 0.5275 0.3668 0.3933 0.7646 0.6151 0.1839 0.5573 0.7114 0.8673 0.6235 0.4212 0.7211 0.186 0.4307 0.486 0.4468 0.6663 0.8356 0.7166 0.2717 0.7238 0.7794 0.4288 0.5962 0.7229 0.3014 0.6593 0.5756 0.7827 0.6702 0.7822 0.6403 0.8539 0.5626 0.6563 0.4808 0.6191 0.8171 0.674 0.424 0.353 0.0898 0.4603 0.7039 0.3652 0.0966 0.5874 0.6588 0.5442 0.0182 0.8796 0.2591 0.5208 0.7205] 2022-08-24 06:28:16 [INFO] [EVAL] Class Recall: [0.8611 0.8902 0.9672 0.8773 0.8747 0.8549 0.8559 0.9179 0.7186 0.8195 0.6074 0.757 0.881 0.3863 0.4194 0.6528 0.6793 0.5638 0.7536 0.5967 0.9044 0.4406 0.7577 0.7776 0.5727 0.6373 0.7891 0.5691 0.4272 0.4523 0.3366 0.7097 0.3998 0.5334 0.6 0.6398 0.6494 0.6663 0.5096 0.5373 0.2012 0.1671 0.4938 0.3614 0.4891 0.2833 0.4648 0.6529 0.8427 0.7301 0.7548 0.7264 0.2438 0.1751 0.9185 0.391 0.9507 0.5098 0.4981 0.381 0.3153 0.3832 0.5209 0.1761 0.7362 0.8597 0.44 0.5714 0.1967 0.4136 0.635 0.7494 0.4993 0.4874 0.5868 0.5466 0.5126 0.3815 0.3011 0.2785 0.8048 0.5416 0.4513 0.0842 0.3403 0.6787 0.1282 0.1635 0.35 0.7184 0.6337 0.2935 0.2982 0.0915 0.0631 0.0976 0.3819 0.163 0.4594 0.3573 0.3286 0.2212 0.3833 0.9226 0.201 0.8712 0.2148 0.7234 0.1016 0.5076 0.3096 0.7175 0.2125 0.7792 0.9886 0.1088 0.5144 0.8827 0.2679 0.4807 0.4892 0.0956 0.4368 0.2824 0.3171 0.3234 0.5999 0.6597 0.7503 0.6051 0.747 0.1016 0.3307 0.495 0.3333 0.2909 0.2565 0.0838 0.2981 
0.497 0.1386 0.1308 0.2818 0.5644 0.2929 0.0414 0.4225 0.0343 0.1857 0.2245] 2022-08-24 06:28:16 [INFO] [EVAL] The model with the best validation mIoU (0.3776) was saved at iter 112000. 2022-08-24 06:28:28 [INFO] [TRAIN] epoch: 100, iter: 125050/160000, loss: 0.4004, lr: 0.000265, batch_cost: 0.2327, reader_cost: 0.06705, ips: 34.3785 samples/sec | ETA 02:15:33 2022-08-24 06:28:36 [INFO] [TRAIN] epoch: 100, iter: 125100/160000, loss: 0.3934, lr: 0.000264, batch_cost: 0.1575, reader_cost: 0.00083, ips: 50.8063 samples/sec | ETA 01:31:35 2022-08-24 06:28:44 [INFO] [TRAIN] epoch: 100, iter: 125150/160000, loss: 0.3782, lr: 0.000264, batch_cost: 0.1724, reader_cost: 0.00064, ips: 46.4114 samples/sec | ETA 01:40:07 2022-08-24 06:28:54 [INFO] [TRAIN] epoch: 100, iter: 125200/160000, loss: 0.4347, lr: 0.000263, batch_cost: 0.1974, reader_cost: 0.00045, ips: 40.5345 samples/sec | ETA 01:54:28 2022-08-24 06:29:03 [INFO] [TRAIN] epoch: 100, iter: 125250/160000, loss: 0.4316, lr: 0.000263, batch_cost: 0.1731, reader_cost: 0.00076, ips: 46.2061 samples/sec | ETA 01:40:16 2022-08-24 06:29:11 [INFO] [TRAIN] epoch: 100, iter: 125300/160000, loss: 0.4200, lr: 0.000263, batch_cost: 0.1701, reader_cost: 0.00063, ips: 47.0400 samples/sec | ETA 01:38:21 2022-08-24 06:29:22 [INFO] [TRAIN] epoch: 100, iter: 125350/160000, loss: 0.4026, lr: 0.000262, batch_cost: 0.2080, reader_cost: 0.00030, ips: 38.4694 samples/sec | ETA 02:00:05 2022-08-24 06:29:31 [INFO] [TRAIN] epoch: 100, iter: 125400/160000, loss: 0.4246, lr: 0.000262, batch_cost: 0.1777, reader_cost: 0.00300, ips: 45.0256 samples/sec | ETA 01:42:27 2022-08-24 06:29:40 [INFO] [TRAIN] epoch: 100, iter: 125450/160000, loss: 0.3998, lr: 0.000262, batch_cost: 0.1789, reader_cost: 0.00077, ips: 44.7220 samples/sec | ETA 01:43:00 2022-08-24 06:29:48 [INFO] [TRAIN] epoch: 100, iter: 125500/160000, loss: 0.4250, lr: 0.000261, batch_cost: 0.1643, reader_cost: 0.00061, ips: 48.6929 samples/sec | ETA 01:34:28 2022-08-24 06:29:57 [INFO] [TRAIN] epoch: 100, iter: 125550/160000, loss: 0.4544, lr: 0.000261, batch_cost: 0.1899, reader_cost: 0.00048, ips: 42.1217 samples/sec | ETA 01:49:02 2022-08-24 06:30:07 [INFO] [TRAIN] epoch: 100, iter: 125600/160000, loss: 0.4025, lr: 0.000260, batch_cost: 0.1939, reader_cost: 0.00361, ips: 41.2681 samples/sec | ETA 01:51:08 2022-08-24 06:30:18 [INFO] [TRAIN] epoch: 100, iter: 125650/160000, loss: 0.3872, lr: 0.000260, batch_cost: 0.2162, reader_cost: 0.00078, ips: 36.9961 samples/sec | ETA 02:03:47 2022-08-24 06:30:26 [INFO] [TRAIN] epoch: 100, iter: 125700/160000, loss: 0.4310, lr: 0.000260, batch_cost: 0.1738, reader_cost: 0.00916, ips: 46.0377 samples/sec | ETA 01:39:20 2022-08-24 06:30:38 [INFO] [TRAIN] epoch: 100, iter: 125750/160000, loss: 0.4036, lr: 0.000259, batch_cost: 0.2327, reader_cost: 0.00097, ips: 34.3829 samples/sec | ETA 02:12:49 2022-08-24 06:30:49 [INFO] [TRAIN] epoch: 100, iter: 125800/160000, loss: 0.4397, lr: 0.000259, batch_cost: 0.2090, reader_cost: 0.00147, ips: 38.2866 samples/sec | ETA 01:59:06 2022-08-24 06:30:58 [INFO] [TRAIN] epoch: 100, iter: 125850/160000, loss: 0.4241, lr: 0.000259, batch_cost: 0.1818, reader_cost: 0.00069, ips: 43.9965 samples/sec | ETA 01:43:29 2022-08-24 06:31:07 [INFO] [TRAIN] epoch: 100, iter: 125900/160000, loss: 0.4123, lr: 0.000258, batch_cost: 0.1823, reader_cost: 0.00081, ips: 43.8893 samples/sec | ETA 01:43:35 2022-08-24 06:31:17 [INFO] [TRAIN] epoch: 100, iter: 125950/160000, loss: 0.4335, lr: 0.000258, batch_cost: 0.1992, reader_cost: 0.00046, ips: 40.1598 samples/sec | 
ETA 01:53:02 2022-08-24 06:31:27 [INFO] [TRAIN] epoch: 100, iter: 126000/160000, loss: 0.3715, lr: 0.000257, batch_cost: 0.1979, reader_cost: 0.00040, ips: 40.4287 samples/sec | ETA 01:52:07 2022-08-24 06:31:27 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 169s - batch_cost: 0.1689 - reader cost: 8.4678e-04 2022-08-24 06:34:16 [INFO] [EVAL] #Images: 2000 mIoU: 0.3785 Acc: 0.7774 Kappa: 0.7604 Dice: 0.5158 2022-08-24 06:34:16 [INFO] [EVAL] Class IoU: [0.6965 0.7949 0.9326 0.7494 0.6814 0.7772 0.7836 0.8081 0.5376 0.6529 0.505 0.5721 0.7183 0.3009 0.3108 0.4511 0.5116 0.418 0.6161 0.4326 0.7757 0.4333 0.6435 0.5159 0.3512 0.4269 0.4704 0.4785 0.4616 0.2378 0.312 0.5471 0.3263 0.3711 0.3396 0.4359 0.4956 0.5694 0.3076 0.3894 0.1387 0.1374 0.3624 0.2591 0.2619 0.2203 0.3569 0.5304 0.6146 0.5334 0.6134 0.3788 0.1913 0.1753 0.6935 0.3352 0.8938 0.4119 0.2474 0.2816 0.0943 0.1946 0.3353 0.1612 0.4997 0.7046 0.2973 0.4011 0.1279 0.3593 0.5295 0.5836 0.3531 0.2459 0.501 0.3976 0.488 0.2893 0.2413 0.2555 0.7193 0.4412 0.4119 0.0742 0.1318 0.5171 0.1056 0.1168 0.3725 0.5435 0.4571 0.0666 0.2113 0.1109 0.0499 0.0516 0.2024 0.1715 0.2688 0.3082 0.3148 0.1318 0.2998 0.7631 0.1787 0.6094 0.1322 0.5521 0.0812 0.2173 0.2331 0.4946 0.1987 0.6862 0.8291 0.0668 0.3461 0.6347 0.1607 0.3698 0.428 0.0735 0.3857 0.2081 0.2908 0.2523 0.5294 0.4691 0.5799 0.4103 0.5915 0.0701 0.2979 0.4558 0.3183 0.2108 0.1665 0.0426 0.2113 0.4357 0.1072 0.0647 0.2463 0.4384 0.256 0.013 0.3919 0.0365 0.1463 0.178 ] 2022-08-24 06:34:16 [INFO] [EVAL] Class Precision: [0.7942 0.8543 0.9609 0.8458 0.7635 0.8769 0.8811 0.8627 0.6812 0.7365 0.7047 0.7224 0.7889 0.5239 0.5204 0.6269 0.6898 0.7113 0.7765 0.6403 0.8511 0.6556 0.7929 0.6462 0.4795 0.6652 0.5767 0.7593 0.7054 0.4713 0.4551 0.7155 0.5403 0.5282 0.4457 0.5911 0.6652 0.8093 0.4476 0.6067 0.3262 0.3291 0.5828 0.4558 0.3533 0.4519 0.7161 0.7473 0.7314 0.6431 0.8191 0.4774 0.3701 0.6548 0.7414 0.6132 0.9394 0.7052 0.6031 0.4883 0.1375 0.3381 0.4787 0.7352 0.6125 0.8012 0.458 0.5391 0.2534 0.6793 0.7064 0.7507 0.5716 0.3273 0.7363 0.6208 0.8231 0.5772 0.6795 0.3584 0.8601 0.6992 0.8284 0.2122 0.2368 0.7358 0.4802 0.4171 0.7887 0.6924 0.6465 0.0929 0.5484 0.3231 0.1515 0.147 0.5973 0.3831 0.369 0.8007 0.727 0.243 0.5793 0.875 0.7798 0.7025 0.3782 0.7773 0.178 0.3545 0.4557 0.6734 0.6341 0.8484 0.8395 0.3071 0.6359 0.7571 0.2347 0.5333 0.7331 0.2598 0.7211 0.6362 0.8051 0.6671 0.8063 0.6168 0.925 0.6435 0.7582 0.4736 0.4958 0.7702 0.7064 0.3906 0.3895 0.0813 0.4586 0.6649 0.4838 0.1158 0.5852 0.6292 0.5924 0.0163 0.8222 0.2203 0.3702 0.7917] 2022-08-24 06:34:16 [INFO] [EVAL] Class Recall: [0.8499 0.9197 0.9694 0.868 0.8638 0.8723 0.8763 0.9273 0.7182 0.8519 0.6406 0.7332 0.8892 0.4142 0.4355 0.6168 0.6645 0.5034 0.7489 0.5714 0.8974 0.5611 0.7735 0.7189 0.5676 0.5437 0.7184 0.564 0.5718 0.3244 0.4982 0.6993 0.4516 0.5552 0.5878 0.6241 0.6603 0.6576 0.4957 0.5208 0.1944 0.191 0.4893 0.3752 0.5032 0.3007 0.4157 0.6463 0.7938 0.7577 0.7095 0.647 0.2837 0.1932 0.9148 0.425 0.9485 0.4975 0.2955 0.3996 0.2308 0.3143 0.5282 0.1711 0.7307 0.854 0.4587 0.6104 0.2053 0.4327 0.6788 0.7239 0.4802 0.4974 0.6106 0.5251 0.5452 0.3671 0.2722 0.471 0.8146 0.5445 0.4503 0.1025 0.229 0.6351 0.1193 0.1396 0.4138 0.7166 0.6094 0.1903 0.2558 0.1444 0.0693 0.0737 0.2343 0.237 0.4975 0.3339 0.357 0.2235 0.3832 0.8565 0.1882 0.8214 0.1689 0.6559 0.1299 0.3596 0.323 0.6507 0.2244 0.7821 0.9853 0.0787 0.4316 0.7969 0.3376 0.5468 0.507 0.093 0.4534 0.2362 0.3128 0.2887 0.6065 
0.6619 0.6085 0.531 0.729 0.076 0.4275 0.5276 0.3669 0.3141 0.2253 0.0823 0.2815 0.5582 0.121 0.1278 0.2983 0.5911 0.3108 0.0603 0.4281 0.0419 0.1948 0.1868] 2022-08-24 06:34:16 [INFO] [EVAL] The model with the best validation mIoU (0.3785) was saved at iter 126000. 2022-08-24 06:34:24 [INFO] [TRAIN] epoch: 100, iter: 126050/160000, loss: 0.4255, lr: 0.000257, batch_cost: 0.1659, reader_cost: 0.00296, ips: 48.2156 samples/sec | ETA 01:33:53 2022-08-24 06:34:32 [INFO] [TRAIN] epoch: 100, iter: 126100/160000, loss: 0.3936, lr: 0.000257, batch_cost: 0.1525, reader_cost: 0.00168, ips: 52.4510 samples/sec | ETA 01:26:10 2022-08-24 06:34:42 [INFO] [TRAIN] epoch: 100, iter: 126150/160000, loss: 0.4163, lr: 0.000256, batch_cost: 0.1942, reader_cost: 0.00062, ips: 41.1845 samples/sec | ETA 01:49:35 2022-08-24 06:34:49 [INFO] [TRAIN] epoch: 100, iter: 126200/160000, loss: 0.4206, lr: 0.000256, batch_cost: 0.1528, reader_cost: 0.00069, ips: 52.3496 samples/sec | ETA 01:26:05 2022-08-24 06:34:57 [INFO] [TRAIN] epoch: 100, iter: 126250/160000, loss: 0.4107, lr: 0.000256, batch_cost: 0.1533, reader_cost: 0.00043, ips: 52.1759 samples/sec | ETA 01:26:14 2022-08-24 06:35:05 [INFO] [TRAIN] epoch: 100, iter: 126300/160000, loss: 0.4142, lr: 0.000255, batch_cost: 0.1523, reader_cost: 0.00143, ips: 52.5275 samples/sec | ETA 01:25:32 2022-08-24 06:35:17 [INFO] [TRAIN] epoch: 101, iter: 126350/160000, loss: 0.3944, lr: 0.000255, batch_cost: 0.2573, reader_cost: 0.06440, ips: 31.0929 samples/sec | ETA 02:24:17 2022-08-24 06:35:27 [INFO] [TRAIN] epoch: 101, iter: 126400/160000, loss: 0.4071, lr: 0.000254, batch_cost: 0.1897, reader_cost: 0.00081, ips: 42.1690 samples/sec | ETA 01:46:14 2022-08-24 06:35:36 [INFO] [TRAIN] epoch: 101, iter: 126450/160000, loss: 0.4127, lr: 0.000254, batch_cost: 0.1755, reader_cost: 0.00051, ips: 45.5931 samples/sec | ETA 01:38:06 2022-08-24 06:35:44 [INFO] [TRAIN] epoch: 101, iter: 126500/160000, loss: 0.3962, lr: 0.000254, batch_cost: 0.1600, reader_cost: 0.00067, ips: 50.0045 samples/sec | ETA 01:29:19 2022-08-24 06:35:52 [INFO] [TRAIN] epoch: 101, iter: 126550/160000, loss: 0.4091, lr: 0.000253, batch_cost: 0.1696, reader_cost: 0.00062, ips: 47.1815 samples/sec | ETA 01:34:31 2022-08-24 06:36:02 [INFO] [TRAIN] epoch: 101, iter: 126600/160000, loss: 0.4014, lr: 0.000253, batch_cost: 0.1899, reader_cost: 0.00082, ips: 42.1213 samples/sec | ETA 01:45:43 2022-08-24 06:36:12 [INFO] [TRAIN] epoch: 101, iter: 126650/160000, loss: 0.3832, lr: 0.000252, batch_cost: 0.2147, reader_cost: 0.00067, ips: 37.2600 samples/sec | ETA 01:59:20 2022-08-24 06:36:24 [INFO] [TRAIN] epoch: 101, iter: 126700/160000, loss: 0.4093, lr: 0.000252, batch_cost: 0.2229, reader_cost: 0.00059, ips: 35.8859 samples/sec | ETA 02:03:43 2022-08-24 06:36:33 [INFO] [TRAIN] epoch: 101, iter: 126750/160000, loss: 0.4072, lr: 0.000252, batch_cost: 0.1907, reader_cost: 0.00089, ips: 41.9499 samples/sec | ETA 01:45:40 2022-08-24 06:36:44 [INFO] [TRAIN] epoch: 101, iter: 126800/160000, loss: 0.4144, lr: 0.000251, batch_cost: 0.2142, reader_cost: 0.00056, ips: 37.3547 samples/sec | ETA 01:58:30 2022-08-24 06:36:53 [INFO] [TRAIN] epoch: 101, iter: 126850/160000, loss: 0.4080, lr: 0.000251, batch_cost: 0.1868, reader_cost: 0.00035, ips: 42.8197 samples/sec | ETA 01:43:13 2022-08-24 06:37:04 [INFO] [TRAIN] epoch: 101, iter: 126900/160000, loss: 0.4007, lr: 0.000251, batch_cost: 0.2236, reader_cost: 0.00034, ips: 35.7844 samples/sec | ETA 02:03:19 2022-08-24 06:37:14 [INFO] [TRAIN] epoch: 101, iter: 126950/160000, loss: 0.4468, 
lr: 0.000250, batch_cost: 0.1979, reader_cost: 0.00086, ips: 40.4317 samples/sec | ETA 01:48:59 2022-08-24 06:37:25 [INFO] [TRAIN] epoch: 101, iter: 127000/160000, loss: 0.4244, lr: 0.000250, batch_cost: 0.2185, reader_cost: 0.00049, ips: 36.6064 samples/sec | ETA 02:00:11 2022-08-24 06:37:25 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 165s - batch_cost: 0.1647 - reader cost: 7.7796e-04 2022-08-24 06:40:10 [INFO] [EVAL] #Images: 2000 mIoU: 0.3767 Acc: 0.7744 Kappa: 0.7573 Dice: 0.5144 2022-08-24 06:40:10 [INFO] [EVAL] Class IoU: [0.689 0.7927 0.9316 0.7446 0.6895 0.7715 0.7769 0.8048 0.5337 0.6306 0.4973 0.5606 0.7136 0.3162 0.3203 0.4558 0.4743 0.444 0.6214 0.4475 0.7663 0.4097 0.6402 0.5302 0.3367 0.423 0.4689 0.468 0.473 0.2888 0.274 0.5366 0.3236 0.3712 0.3808 0.4059 0.4837 0.5737 0.3029 0.3651 0.1179 0.1261 0.3576 0.2614 0.2381 0.2642 0.3397 0.5279 0.6409 0.5167 0.6037 0.3982 0.158 0.1698 0.6851 0.3372 0.8826 0.4477 0.39 0.2782 0.0806 0.2302 0.3281 0.1342 0.5035 0.7133 0.2754 0.3853 0.1307 0.3574 0.523 0.5372 0.3869 0.2518 0.5 0.4118 0.5097 0.2942 0.2419 0.2502 0.686 0.4551 0.4526 0.077 0.1447 0.5309 0.1042 0.1243 0.3217 0.541 0.4443 0.0481 0.2195 0.127 0.0434 0.0501 0.2072 0.1709 0.2503 0.3128 0.2288 0.1277 0.291 0.7564 0.1761 0.6185 0.1545 0.548 0.0807 0.2588 0.2211 0.4635 0.2067 0.6455 0.6972 0.0688 0.4285 0.6295 0.2033 0.3535 0.4331 0.0718 0.3627 0.2115 0.3075 0.2499 0.526 0.4762 0.5427 0.4116 0.6396 0.0531 0.2839 0.4269 0.2999 0.2042 0.1726 0.044 0.2168 0.4211 0.1046 0.0068 0.2741 0.4021 0.2354 0.012 0.4032 0.0366 0.1704 0.2078] 2022-08-24 06:40:10 [INFO] [EVAL] Class Precision: [0.7874 0.8659 0.9619 0.8413 0.7763 0.8852 0.8786 0.8561 0.676 0.7623 0.6644 0.7029 0.7792 0.5018 0.5101 0.6513 0.7209 0.6735 0.774 0.6473 0.8246 0.6394 0.7866 0.6353 0.4472 0.6166 0.5858 0.7339 0.6927 0.4791 0.4929 0.7163 0.5387 0.5159 0.4989 0.5238 0.7156 0.8106 0.4486 0.6241 0.2971 0.2887 0.5863 0.4618 0.3505 0.4872 0.6211 0.7409 0.7317 0.6211 0.8116 0.5006 0.3309 0.6437 0.7313 0.6645 0.9272 0.6685 0.7362 0.4723 0.1161 0.422 0.4597 0.7845 0.6443 0.8252 0.4254 0.5278 0.2988 0.6978 0.713 0.6763 0.5711 0.3809 0.7579 0.6149 0.7985 0.7452 0.7181 0.3706 0.8092 0.7015 0.7634 0.1781 0.2633 0.6882 0.473 0.4573 0.7689 0.6897 0.6204 0.0545 0.4622 0.3422 0.1664 0.1811 0.5609 0.4036 0.3498 0.7759 0.7415 0.1927 0.5385 0.904 0.8363 0.8042 0.3393 0.7932 0.1387 0.3975 0.5023 0.6695 0.506 0.8096 0.7062 0.2722 0.7197 0.7208 0.2988 0.6037 0.7373 0.2727 0.6784 0.6272 0.763 0.6427 0.7797 0.6125 0.7793 0.6357 0.7859 0.3753 0.5897 0.8562 0.642 0.4324 0.3961 0.0926 0.4467 0.699 0.4132 0.0148 0.5674 0.6572 0.6289 0.0174 0.8324 0.206 0.4241 0.7599] 2022-08-24 06:40:10 [INFO] [EVAL] Class Recall: [0.8465 0.9037 0.9672 0.8662 0.8605 0.8573 0.8703 0.9307 0.7171 0.785 0.6641 0.7347 0.8945 0.4609 0.4627 0.603 0.5809 0.5658 0.7592 0.5918 0.9155 0.5328 0.7747 0.7622 0.5767 0.5739 0.7014 0.5637 0.5986 0.421 0.3815 0.6814 0.4477 0.5695 0.6168 0.6432 0.5989 0.6625 0.4826 0.468 0.1634 0.1829 0.4783 0.376 0.4262 0.3661 0.4286 0.6475 0.8377 0.7545 0.7021 0.6606 0.2321 0.1874 0.9156 0.4063 0.9483 0.5756 0.4533 0.4037 0.2088 0.3361 0.5342 0.1393 0.6973 0.8402 0.4386 0.588 0.1885 0.4229 0.6625 0.7231 0.5453 0.4262 0.595 0.555 0.585 0.3271 0.2673 0.435 0.8184 0.5644 0.5264 0.1194 0.2432 0.6991 0.1179 0.1458 0.3562 0.715 0.6101 0.2906 0.2947 0.1679 0.0555 0.0648 0.2473 0.2286 0.4681 0.3439 0.2486 0.2746 0.3877 0.8225 0.1823 0.7281 0.2211 0.6394 0.1618 0.4259 0.2831 0.6011 0.2589 0.761 0.982 0.0843 0.5143 0.8324 
0.3889 0.4604 0.5122 0.0887 0.438 0.2419 0.34 0.2903 0.6178 0.6816 0.6413 0.5386 0.7746 0.0583 0.3538 0.4599 0.3601 0.2789 0.2342 0.0775 0.2964 0.5144 0.1229 0.0123 0.3465 0.5088 0.2734 0.0376 0.4388 0.0427 0.2217 0.2224] 2022-08-24 06:40:10 [INFO] [EVAL] The model with the best validation mIoU (0.3785) was saved at iter 126000. 2022-08-24 06:40:19 [INFO] [TRAIN] epoch: 101, iter: 127050/160000, loss: 0.4175, lr: 0.000249, batch_cost: 0.1795, reader_cost: 0.00324, ips: 44.5618 samples/sec | ETA 01:38:35 2022-08-24 06:40:29 [INFO] [TRAIN] epoch: 101, iter: 127100/160000, loss: 0.4125, lr: 0.000249, batch_cost: 0.1951, reader_cost: 0.00147, ips: 41.0134 samples/sec | ETA 01:46:57 2022-08-24 06:40:37 [INFO] [TRAIN] epoch: 101, iter: 127150/160000, loss: 0.4292, lr: 0.000249, batch_cost: 0.1662, reader_cost: 0.00078, ips: 48.1413 samples/sec | ETA 01:30:58 2022-08-24 06:40:47 [INFO] [TRAIN] epoch: 101, iter: 127200/160000, loss: 0.4327, lr: 0.000248, batch_cost: 0.1907, reader_cost: 0.00061, ips: 41.9520 samples/sec | ETA 01:44:14 2022-08-24 06:40:56 [INFO] [TRAIN] epoch: 101, iter: 127250/160000, loss: 0.4279, lr: 0.000248, batch_cost: 0.1803, reader_cost: 0.00074, ips: 44.3610 samples/sec | ETA 01:38:26 2022-08-24 06:41:06 [INFO] [TRAIN] epoch: 101, iter: 127300/160000, loss: 0.3979, lr: 0.000248, batch_cost: 0.1895, reader_cost: 0.00079, ips: 42.2131 samples/sec | ETA 01:43:17 2022-08-24 06:41:14 [INFO] [TRAIN] epoch: 101, iter: 127350/160000, loss: 0.4138, lr: 0.000247, batch_cost: 0.1725, reader_cost: 0.00067, ips: 46.3726 samples/sec | ETA 01:33:52 2022-08-24 06:41:24 [INFO] [TRAIN] epoch: 101, iter: 127400/160000, loss: 0.3954, lr: 0.000247, batch_cost: 0.1902, reader_cost: 0.00047, ips: 42.0654 samples/sec | ETA 01:43:19 2022-08-24 06:41:34 [INFO] [TRAIN] epoch: 101, iter: 127450/160000, loss: 0.4197, lr: 0.000246, batch_cost: 0.2025, reader_cost: 0.00055, ips: 39.5114 samples/sec | ETA 01:49:50 2022-08-24 06:41:45 [INFO] [TRAIN] epoch: 101, iter: 127500/160000, loss: 0.4216, lr: 0.000246, batch_cost: 0.2228, reader_cost: 0.00054, ips: 35.9085 samples/sec | ETA 02:00:40 2022-08-24 06:41:56 [INFO] [TRAIN] epoch: 101, iter: 127550/160000, loss: 0.4043, lr: 0.000246, batch_cost: 0.2114, reader_cost: 0.00049, ips: 37.8466 samples/sec | ETA 01:54:19 2022-08-24 06:42:09 [INFO] [TRAIN] epoch: 102, iter: 127600/160000, loss: 0.4031, lr: 0.000245, batch_cost: 0.2793, reader_cost: 0.07977, ips: 28.6422 samples/sec | ETA 02:30:49 2022-08-24 06:42:19 [INFO] [TRAIN] epoch: 102, iter: 127650/160000, loss: 0.4445, lr: 0.000245, batch_cost: 0.1997, reader_cost: 0.00263, ips: 40.0584 samples/sec | ETA 01:47:40 2022-08-24 06:42:30 [INFO] [TRAIN] epoch: 102, iter: 127700/160000, loss: 0.4244, lr: 0.000245, batch_cost: 0.2056, reader_cost: 0.00077, ips: 38.9158 samples/sec | ETA 01:50:39 2022-08-24 06:42:40 [INFO] [TRAIN] epoch: 102, iter: 127750/160000, loss: 0.4031, lr: 0.000244, batch_cost: 0.2037, reader_cost: 0.00044, ips: 39.2649 samples/sec | ETA 01:49:30 2022-08-24 06:42:50 [INFO] [TRAIN] epoch: 102, iter: 127800/160000, loss: 0.4076, lr: 0.000244, batch_cost: 0.2026, reader_cost: 0.00058, ips: 39.4871 samples/sec | ETA 01:48:43 2022-08-24 06:43:00 [INFO] [TRAIN] epoch: 102, iter: 127850/160000, loss: 0.3910, lr: 0.000243, batch_cost: 0.1952, reader_cost: 0.00052, ips: 40.9881 samples/sec | ETA 01:44:34 2022-08-24 06:43:10 [INFO] [TRAIN] epoch: 102, iter: 127900/160000, loss: 0.4118, lr: 0.000243, batch_cost: 0.2032, reader_cost: 0.00064, ips: 39.3714 samples/sec | ETA 01:48:42 2022-08-24 06:43:21 
[INFO] [TRAIN] epoch: 102, iter: 127950/160000, loss: 0.4100, lr: 0.000243, batch_cost: 0.2237, reader_cost: 0.00046, ips: 35.7609 samples/sec | ETA 01:59:29 2022-08-24 06:43:32 [INFO] [TRAIN] epoch: 102, iter: 128000/160000, loss: 0.4154, lr: 0.000242, batch_cost: 0.2145, reader_cost: 0.00046, ips: 37.2997 samples/sec | ETA 01:54:23 2022-08-24 06:43:32 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1592 - reader cost: 0.0011 2022-08-24 06:46:11 [INFO] [EVAL] #Images: 2000 mIoU: 0.3760 Acc: 0.7745 Kappa: 0.7573 Dice: 0.5130 2022-08-24 06:46:11 [INFO] [EVAL] Class IoU: [0.6912 0.7901 0.9327 0.7394 0.6922 0.7769 0.7745 0.8129 0.5377 0.634 0.4982 0.572 0.7218 0.2826 0.3012 0.4573 0.4762 0.4299 0.6129 0.4357 0.7717 0.4271 0.6404 0.5134 0.3342 0.4391 0.4628 0.4429 0.415 0.2469 0.2943 0.5354 0.3044 0.3601 0.3812 0.4271 0.481 0.573 0.2996 0.3935 0.1305 0.1303 0.3592 0.2648 0.2326 0.2283 0.3603 0.5313 0.6138 0.5468 0.6024 0.3831 0.1542 0.2079 0.6689 0.3319 0.8824 0.4466 0.4398 0.2906 0.0842 0.2478 0.3234 0.2001 0.5227 0.7163 0.2558 0.3751 0.1339 0.3601 0.5352 0.4796 0.39 0.2325 0.4974 0.3944 0.5819 0.2959 0.3547 0.324 0.7022 0.449 0.4409 0.0355 0.1281 0.5276 0.1264 0.1255 0.3704 0.5621 0.4341 0.054 0.2178 0.1045 0.0532 0.0736 0.1008 0.1688 0.2798 0.3204 0.269 0.0974 0.2874 0.8101 0.18 0.7055 0.1418 0.5652 0.0849 0.2437 0.2095 0.4408 0.1972 0.6227 0.6585 0.0756 0.4056 0.6653 0.2007 0.3812 0.4655 0.0632 0.3511 0.1823 0.3156 0.2471 0.5398 0.4569 0.5525 0.3889 0.5881 0.0952 0.1861 0.4159 0.2988 0.212 0.1594 0.0326 0.2171 0.4132 0.1465 0.009 0.2986 0.1856 0.2125 0.0122 0.4069 0.0368 0.1572 0.2083] 2022-08-24 06:46:11 [INFO] [EVAL] Class Precision: [0.7845 0.873 0.9614 0.8225 0.7712 0.8743 0.8813 0.8713 0.6841 0.731 0.7007 0.7182 0.7969 0.5032 0.6067 0.6225 0.7055 0.6964 0.7474 0.6615 0.8414 0.6622 0.7793 0.6195 0.4888 0.559 0.556 0.6418 0.7713 0.4148 0.4738 0.6625 0.5367 0.4985 0.4933 0.5597 0.6918 0.8109 0.445 0.5926 0.3104 0.2997 0.6521 0.4645 0.3218 0.4745 0.6943 0.7118 0.7412 0.6609 0.7772 0.4524 0.3148 0.685 0.7012 0.6654 0.96 0.6839 0.7409 0.5278 0.1223 0.4884 0.4182 0.6621 0.6643 0.8259 0.3697 0.5465 0.2719 0.6826 0.7118 0.5645 0.5499 0.333 0.7007 0.5332 0.7254 0.6545 0.7561 0.5865 0.833 0.7213 0.7707 0.1279 0.2446 0.7094 0.4632 0.4291 0.8488 0.7265 0.6146 0.0596 0.5012 0.3412 0.1188 0.1982 0.5218 0.4076 0.4137 0.6709 0.6574 0.1471 0.5763 0.8916 0.7803 0.8549 0.4179 0.7681 0.2147 0.4096 0.4785 0.5684 0.663 0.8363 0.6646 0.257 0.6392 0.7904 0.4908 0.5435 0.7324 0.1777 0.6366 0.7159 0.7386 0.7052 0.796 0.5609 0.8087 0.5842 0.7049 0.505 0.4855 0.8204 0.5963 0.3953 0.3831 0.0839 0.4515 0.6304 0.3656 0.0137 0.6046 0.5908 0.6342 0.0167 0.8226 0.2126 0.3765 0.7179] 2022-08-24 06:46:11 [INFO] [EVAL] Class Recall: [0.8532 0.8927 0.969 0.8798 0.8711 0.8745 0.8647 0.9237 0.7153 0.8269 0.6328 0.7376 0.8846 0.392 0.3743 0.6328 0.5943 0.529 0.773 0.5606 0.903 0.5462 0.7824 0.75 0.5138 0.6719 0.7341 0.5884 0.4732 0.3788 0.4372 0.7362 0.4129 0.5647 0.6264 0.6432 0.6122 0.6615 0.4784 0.5395 0.1837 0.1874 0.4443 0.3812 0.4563 0.3055 0.4282 0.6768 0.7813 0.76 0.7281 0.7144 0.2321 0.2298 0.9357 0.3984 0.9161 0.5628 0.5197 0.3927 0.2131 0.3347 0.5877 0.2228 0.7104 0.8437 0.4537 0.5445 0.2088 0.4325 0.6833 0.7611 0.5728 0.4352 0.6316 0.6025 0.7462 0.3507 0.4005 0.42 0.8173 0.5433 0.5075 0.0469 0.212 0.6731 0.148 0.1507 0.3966 0.7129 0.5965 0.3639 0.278 0.131 0.088 0.1049 0.1111 0.2236 0.4637 0.3801 0.3129 0.2237 0.3644 0.8985 0.1896 0.8015 0.1767 0.6815 0.1232 
0.3757 0.2715 0.6626 0.2192 0.7092 0.9862 0.0967 0.5261 0.8077 0.2534 0.5608 0.5609 0.0893 0.4391 0.1965 0.3553 0.2756 0.6266 0.7115 0.6355 0.5377 0.7802 0.105 0.2318 0.4576 0.3746 0.3137 0.2144 0.0507 0.2948 0.5453 0.1964 0.0255 0.3711 0.2129 0.2422 0.043 0.446 0.0426 0.2126 0.2269] 2022-08-24 06:46:12 [INFO] [EVAL] The model with the best validation mIoU (0.3785) was saved at iter 126000. 2022-08-24 06:46:22 [INFO] [TRAIN] epoch: 102, iter: 128050/160000, loss: 0.4350, lr: 0.000242, batch_cost: 0.2101, reader_cost: 0.00429, ips: 38.0732 samples/sec | ETA 01:51:53 2022-08-24 06:46:32 [INFO] [TRAIN] epoch: 102, iter: 128100/160000, loss: 0.4650, lr: 0.000242, batch_cost: 0.1958, reader_cost: 0.00090, ips: 40.8600 samples/sec | ETA 01:44:05 2022-08-24 06:46:41 [INFO] [TRAIN] epoch: 102, iter: 128150/160000, loss: 0.4127, lr: 0.000241, batch_cost: 0.1834, reader_cost: 0.00055, ips: 43.6134 samples/sec | ETA 01:37:22 2022-08-24 06:46:50 [INFO] [TRAIN] epoch: 102, iter: 128200/160000, loss: 0.4253, lr: 0.000241, batch_cost: 0.1774, reader_cost: 0.00043, ips: 45.1072 samples/sec | ETA 01:33:59 2022-08-24 06:46:59 [INFO] [TRAIN] epoch: 102, iter: 128250/160000, loss: 0.4067, lr: 0.000240, batch_cost: 0.1868, reader_cost: 0.00103, ips: 42.8274 samples/sec | ETA 01:38:50 2022-08-24 06:47:09 [INFO] [TRAIN] epoch: 102, iter: 128300/160000, loss: 0.3859, lr: 0.000240, batch_cost: 0.1973, reader_cost: 0.00054, ips: 40.5501 samples/sec | ETA 01:44:13 2022-08-24 06:47:20 [INFO] [TRAIN] epoch: 102, iter: 128350/160000, loss: 0.3913, lr: 0.000240, batch_cost: 0.2071, reader_cost: 0.00058, ips: 38.6256 samples/sec | ETA 01:49:15 2022-08-24 06:47:30 [INFO] [TRAIN] epoch: 102, iter: 128400/160000, loss: 0.4312, lr: 0.000239, batch_cost: 0.1998, reader_cost: 0.00149, ips: 40.0425 samples/sec | ETA 01:45:13 2022-08-24 06:47:38 [INFO] [TRAIN] epoch: 102, iter: 128450/160000, loss: 0.4007, lr: 0.000239, batch_cost: 0.1733, reader_cost: 0.00294, ips: 46.1634 samples/sec | ETA 01:31:07 2022-08-24 06:47:49 [INFO] [TRAIN] epoch: 102, iter: 128500/160000, loss: 0.4050, lr: 0.000238, batch_cost: 0.2077, reader_cost: 0.00290, ips: 38.5091 samples/sec | ETA 01:49:03 2022-08-24 06:48:00 [INFO] [TRAIN] epoch: 102, iter: 128550/160000, loss: 0.4168, lr: 0.000238, batch_cost: 0.2361, reader_cost: 0.00043, ips: 33.8801 samples/sec | ETA 02:03:46 2022-08-24 06:48:10 [INFO] [TRAIN] epoch: 102, iter: 128600/160000, loss: 0.3985, lr: 0.000238, batch_cost: 0.1902, reader_cost: 0.00048, ips: 42.0595 samples/sec | ETA 01:39:32 2022-08-24 06:48:20 [INFO] [TRAIN] epoch: 102, iter: 128650/160000, loss: 0.4556, lr: 0.000237, batch_cost: 0.2003, reader_cost: 0.00046, ips: 39.9447 samples/sec | ETA 01:44:38 2022-08-24 06:48:30 [INFO] [TRAIN] epoch: 102, iter: 128700/160000, loss: 0.3916, lr: 0.000237, batch_cost: 0.1959, reader_cost: 0.00038, ips: 40.8302 samples/sec | ETA 01:42:12 2022-08-24 06:48:41 [INFO] [TRAIN] epoch: 102, iter: 128750/160000, loss: 0.4171, lr: 0.000237, batch_cost: 0.2204, reader_cost: 0.00071, ips: 36.3032 samples/sec | ETA 01:54:46 2022-08-24 06:48:51 [INFO] [TRAIN] epoch: 102, iter: 128800/160000, loss: 0.3960, lr: 0.000236, batch_cost: 0.2045, reader_cost: 0.00054, ips: 39.1204 samples/sec | ETA 01:46:20 2022-08-24 06:49:03 [INFO] [TRAIN] epoch: 103, iter: 128850/160000, loss: 0.4322, lr: 0.000236, batch_cost: 0.2374, reader_cost: 0.04943, ips: 33.6999 samples/sec | ETA 02:03:14 2022-08-24 06:49:14 [INFO] [TRAIN] epoch: 103, iter: 128900/160000, loss: 0.3776, lr: 0.000235, batch_cost: 0.2128, reader_cost: 
0.00441, ips: 37.5995 samples/sec | ETA 01:50:17 2022-08-24 06:49:23 [INFO] [TRAIN] epoch: 103, iter: 128950/160000, loss: 0.3980, lr: 0.000235, batch_cost: 0.1997, reader_cost: 0.00108, ips: 40.0638 samples/sec | ETA 01:43:20 2022-08-24 06:49:33 [INFO] [TRAIN] epoch: 103, iter: 129000/160000, loss: 0.4418, lr: 0.000235, batch_cost: 0.1909, reader_cost: 0.01656, ips: 41.9152 samples/sec | ETA 01:38:36 2022-08-24 06:49:33 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 139s - batch_cost: 0.1389 - reader cost: 6.2591e-04 2022-08-24 06:51:52 [INFO] [EVAL] #Images: 2000 mIoU: 0.3749 Acc: 0.7750 Kappa: 0.7577 Dice: 0.5111 2022-08-24 06:51:52 [INFO] [EVAL] Class IoU: [0.69 0.794 0.9335 0.7423 0.6852 0.7769 0.778 0.8146 0.5388 0.6332 0.4997 0.583 0.7255 0.2914 0.3168 0.4479 0.4843 0.4227 0.6173 0.4434 0.7749 0.4261 0.6392 0.5118 0.3369 0.443 0.4602 0.4536 0.4488 0.2439 0.2825 0.5228 0.3186 0.3717 0.3619 0.4226 0.487 0.5929 0.3029 0.3896 0.0918 0.1333 0.3496 0.2594 0.246 0.2322 0.3817 0.5292 0.649 0.5417 0.606 0.3869 0.1684 0.176 0.6925 0.3406 0.8857 0.3949 0.3296 0.2879 0.0757 0.2503 0.3321 0.1404 0.4657 0.7152 0.261 0.3976 0.1461 0.3725 0.5335 0.525 0.4108 0.2342 0.495 0.4198 0.5668 0.2934 0.1951 0.3092 0.7359 0.4284 0.3986 0.127 0.1474 0.5251 0.1224 0.1464 0.2982 0.5413 0.4539 0.0631 0.2077 0.1065 0.0604 0.0514 0.1061 0.1352 0.2627 0.2962 0.1968 0.1013 0.2946 0.7217 0.1875 0.7667 0.1234 0.5661 0.0863 0.2259 0.2192 0.4594 0.1995 0.7047 0.6639 0.0584 0.3984 0.672 0.1952 0.4049 0.451 0.0717 0.3409 0.2032 0.3045 0.2573 0.5486 0.4427 0.5619 0.4074 0.6188 0.1025 0.1918 0.4235 0.3188 0.2132 0.1709 0.0385 0.2343 0.4183 0.1088 0.0222 0.313 0.1129 0.2163 0.0083 0.4328 0.0351 0.1747 0.2054] 2022-08-24 06:51:52 [INFO] [EVAL] Class Precision: [0.7802 0.8702 0.9678 0.8286 0.756 0.8797 0.8854 0.8722 0.6787 0.7506 0.6884 0.7188 0.7971 0.4968 0.5464 0.6233 0.6866 0.6837 0.7675 0.6463 0.8509 0.667 0.7876 0.6063 0.5125 0.672 0.5709 0.7631 0.741 0.381 0.4845 0.6936 0.6104 0.5209 0.4962 0.5798 0.6828 0.8105 0.4363 0.6008 0.3376 0.2905 0.54 0.5403 0.3353 0.498 0.6806 0.7387 0.7168 0.693 0.7876 0.5079 0.3008 0.5934 0.7387 0.6346 0.9392 0.7443 0.6971 0.4837 0.1067 0.519 0.4712 0.6861 0.5556 0.8067 0.394 0.5309 0.3498 0.6362 0.6994 0.6257 0.5592 0.319 0.7257 0.6329 0.8002 0.6425 0.7865 0.4867 0.8974 0.7226 0.829 0.2877 0.2821 0.7227 0.459 0.4121 0.8175 0.6998 0.637 0.0746 0.4146 0.332 0.1882 0.163 0.5778 0.3924 0.3641 0.7517 0.7264 0.142 0.6211 0.8931 0.7118 0.8893 0.3661 0.7842 0.1878 0.3648 0.4424 0.6094 0.4885 0.8571 0.67 0.2588 0.716 0.812 0.2983 0.5788 0.6672 0.2395 0.6356 0.6522 0.7406 0.6773 0.8629 0.5337 0.8306 0.684 0.7388 0.4689 0.514 0.7687 0.5978 0.4009 0.3628 0.0821 0.443 0.6883 0.3669 0.0322 0.5131 0.6645 0.5409 0.0147 0.8624 0.2306 0.498 0.7642] 2022-08-24 06:51:52 [INFO] [EVAL] Class Recall: [0.8565 0.9007 0.9634 0.8769 0.8797 0.8693 0.8651 0.925 0.7233 0.802 0.6457 0.7552 0.8899 0.4134 0.4297 0.6141 0.6217 0.5255 0.7593 0.5855 0.8967 0.5413 0.7723 0.7664 0.4957 0.5651 0.7036 0.5279 0.5323 0.4041 0.404 0.6797 0.4 0.5649 0.5721 0.609 0.6294 0.6883 0.4976 0.5258 0.112 0.1977 0.4978 0.3329 0.4801 0.3031 0.465 0.6511 0.8728 0.7128 0.7243 0.6189 0.2767 0.2001 0.9173 0.4237 0.9396 0.4569 0.3847 0.4157 0.2068 0.3259 0.5295 0.15 0.742 0.8631 0.4362 0.613 0.2006 0.4732 0.6923 0.7654 0.6076 0.4684 0.609 0.5549 0.6602 0.3506 0.206 0.4588 0.8035 0.5127 0.4344 0.1853 0.236 0.6576 0.1431 0.1851 0.3195 0.705 0.6124 0.2908 0.2939 0.1356 0.0817 0.0699 0.1151 0.171 0.4853 0.3284 0.2126 
0.2614 0.3591 0.7899 0.2029 0.8475 0.1569 0.6706 0.1378 0.3725 0.3029 0.6512 0.2521 0.7986 0.9864 0.0701 0.4732 0.7958 0.3611 0.574 0.582 0.0928 0.4237 0.2279 0.3409 0.2932 0.601 0.7219 0.6346 0.5019 0.7921 0.1159 0.2344 0.4853 0.4058 0.3128 0.2441 0.0677 0.3322 0.5161 0.1339 0.0667 0.4453 0.1197 0.265 0.0185 0.4649 0.0398 0.2121 0.2193] 2022-08-24 06:51:52 [INFO] [EVAL] The model with the best validation mIoU (0.3785) was saved at iter 126000. 2022-08-24 06:52:01 [INFO] [TRAIN] epoch: 103, iter: 129050/160000, loss: 0.3930, lr: 0.000234, batch_cost: 0.1722, reader_cost: 0.00390, ips: 46.4537 samples/sec | ETA 01:28:50 2022-08-24 06:52:10 [INFO] [TRAIN] epoch: 103, iter: 129100/160000, loss: 0.4034, lr: 0.000234, batch_cost: 0.1704, reader_cost: 0.00086, ips: 46.9524 samples/sec | ETA 01:27:44 2022-08-24 06:52:17 [INFO] [TRAIN] epoch: 103, iter: 129150/160000, loss: 0.4190, lr: 0.000234, batch_cost: 0.1589, reader_cost: 0.00041, ips: 50.3426 samples/sec | ETA 01:21:42 2022-08-24 06:52:25 [INFO] [TRAIN] epoch: 103, iter: 129200/160000, loss: 0.4214, lr: 0.000233, batch_cost: 0.1555, reader_cost: 0.00072, ips: 51.4527 samples/sec | ETA 01:19:48 2022-08-24 06:52:36 [INFO] [TRAIN] epoch: 103, iter: 129250/160000, loss: 0.3963, lr: 0.000233, batch_cost: 0.2073, reader_cost: 0.00065, ips: 38.5843 samples/sec | ETA 01:46:15 2022-08-24 06:52:43 [INFO] [TRAIN] epoch: 103, iter: 129300/160000, loss: 0.4142, lr: 0.000232, batch_cost: 0.1549, reader_cost: 0.00095, ips: 51.6487 samples/sec | ETA 01:19:15 2022-08-24 06:52:52 [INFO] [TRAIN] epoch: 103, iter: 129350/160000, loss: 0.3774, lr: 0.000232, batch_cost: 0.1736, reader_cost: 0.00079, ips: 46.0714 samples/sec | ETA 01:28:42 2022-08-24 06:53:02 [INFO] [TRAIN] epoch: 103, iter: 129400/160000, loss: 0.4173, lr: 0.000232, batch_cost: 0.1963, reader_cost: 0.00305, ips: 40.7523 samples/sec | ETA 01:40:07 2022-08-24 06:53:12 [INFO] [TRAIN] epoch: 103, iter: 129450/160000, loss: 0.4178, lr: 0.000231, batch_cost: 0.2113, reader_cost: 0.00069, ips: 37.8570 samples/sec | ETA 01:47:35 2022-08-24 06:53:23 [INFO] [TRAIN] epoch: 103, iter: 129500/160000, loss: 0.4390, lr: 0.000231, batch_cost: 0.2055, reader_cost: 0.00134, ips: 38.9201 samples/sec | ETA 01:44:29 2022-08-24 06:53:33 [INFO] [TRAIN] epoch: 103, iter: 129550/160000, loss: 0.4234, lr: 0.000231, batch_cost: 0.2140, reader_cost: 0.00071, ips: 37.3807 samples/sec | ETA 01:48:36 2022-08-24 06:53:43 [INFO] [TRAIN] epoch: 103, iter: 129600/160000, loss: 0.4288, lr: 0.000230, batch_cost: 0.1904, reader_cost: 0.00065, ips: 42.0253 samples/sec | ETA 01:36:26 2022-08-24 06:53:52 [INFO] [TRAIN] epoch: 103, iter: 129650/160000, loss: 0.3999, lr: 0.000230, batch_cost: 0.1859, reader_cost: 0.00032, ips: 43.0427 samples/sec | ETA 01:34:00 2022-08-24 06:54:02 [INFO] [TRAIN] epoch: 103, iter: 129700/160000, loss: 0.4073, lr: 0.000229, batch_cost: 0.2027, reader_cost: 0.00080, ips: 39.4743 samples/sec | ETA 01:42:20 2022-08-24 06:54:13 [INFO] [TRAIN] epoch: 103, iter: 129750/160000, loss: 0.4234, lr: 0.000229, batch_cost: 0.2055, reader_cost: 0.00573, ips: 38.9201 samples/sec | ETA 01:43:37 2022-08-24 06:54:23 [INFO] [TRAIN] epoch: 103, iter: 129800/160000, loss: 0.4328, lr: 0.000229, batch_cost: 0.2050, reader_cost: 0.00089, ips: 39.0250 samples/sec | ETA 01:43:10 2022-08-24 06:54:33 [INFO] [TRAIN] epoch: 103, iter: 129850/160000, loss: 0.4298, lr: 0.000228, batch_cost: 0.2071, reader_cost: 0.00057, ips: 38.6333 samples/sec | ETA 01:44:03 2022-08-24 06:54:44 [INFO] [TRAIN] epoch: 103, iter: 129900/160000, loss: 
0.4246, lr: 0.000228, batch_cost: 0.2117, reader_cost: 0.00132, ips: 37.7912 samples/sec | ETA 01:46:11 2022-08-24 06:54:54 [INFO] [TRAIN] epoch: 103, iter: 129950/160000, loss: 0.3977, lr: 0.000228, batch_cost: 0.1958, reader_cost: 0.00038, ips: 40.8559 samples/sec | ETA 01:38:04 2022-08-24 06:55:04 [INFO] [TRAIN] epoch: 103, iter: 130000/160000, loss: 0.4154, lr: 0.000227, batch_cost: 0.2081, reader_cost: 0.00086, ips: 38.4399 samples/sec | ETA 01:44:03 2022-08-24 06:55:04 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 161s - batch_cost: 0.1605 - reader cost: 0.0011 2022-08-24 06:57:45 [INFO] [EVAL] #Images: 2000 mIoU: 0.3770 Acc: 0.7769 Kappa: 0.7597 Dice: 0.5148 2022-08-24 06:57:45 [INFO] [EVAL] Class IoU: [0.6956 0.7968 0.9323 0.7429 0.6903 0.7761 0.7832 0.806 0.5367 0.6229 0.5136 0.5776 0.7267 0.2814 0.3195 0.4496 0.4761 0.4205 0.6154 0.4372 0.777 0.4418 0.6375 0.5155 0.3407 0.4923 0.4627 0.461 0.4549 0.2406 0.3012 0.5338 0.3158 0.372 0.3535 0.4292 0.4794 0.5679 0.2927 0.3832 0.0932 0.1397 0.3651 0.2655 0.2508 0.2444 0.3756 0.5272 0.6354 0.5449 0.6107 0.3933 0.1532 0.1689 0.6851 0.3281 0.8768 0.4184 0.3666 0.2796 0.0743 0.2431 0.3318 0.1653 0.4662 0.7167 0.2506 0.3906 0.1294 0.3787 0.4912 0.5389 0.395 0.2521 0.5044 0.3996 0.5565 0.3112 0.2157 0.3481 0.7202 0.4332 0.4241 0.1202 0.1249 0.5305 0.1322 0.1328 0.3276 0.5621 0.4377 0.1117 0.2286 0.0946 0.0549 0.0732 0.1176 0.1981 0.293 0.295 0.2748 0.1053 0.2972 0.7428 0.1843 0.6044 0.1321 0.5785 0.0857 0.2371 0.2153 0.5513 0.1788 0.5956 0.6651 0.0588 0.4046 0.6446 0.1973 0.4082 0.4337 0.0673 0.3292 0.1731 0.294 0.26 0.5478 0.4835 0.5458 0.4158 0.6143 0.0704 0.2532 0.4169 0.2683 0.1992 0.1543 0.047 0.2026 0.4017 0.1555 0.003 0.299 0.3141 0.2318 0.0099 0.4331 0.0382 0.1766 0.2026] 2022-08-24 06:57:45 [INFO] [EVAL] Class Precision: [0.7891 0.8635 0.9673 0.8284 0.7703 0.8767 0.8808 0.8553 0.6798 0.7297 0.6876 0.721 0.8091 0.5029 0.5819 0.635 0.6427 0.7153 0.7697 0.6371 0.8433 0.6773 0.8008 0.6192 0.4644 0.6675 0.5674 0.7818 0.7522 0.3976 0.4903 0.6842 0.6154 0.528 0.466 0.5682 0.6569 0.8162 0.4111 0.6137 0.3195 0.3022 0.5906 0.4711 0.3829 0.4845 0.7547 0.7462 0.7303 0.6595 0.8012 0.4816 0.3192 0.542 0.7269 0.6664 0.9216 0.7137 0.7118 0.4442 0.1144 0.3989 0.4542 0.626 0.5481 0.8209 0.3747 0.5306 0.2561 0.6897 0.7611 0.6576 0.5689 0.3546 0.7348 0.6093 0.7475 0.7028 0.7308 0.6823 0.8548 0.719 0.816 0.2945 0.2291 0.7054 0.4467 0.443 0.8922 0.7479 0.6002 0.1836 0.5038 0.3598 0.123 0.2126 0.5224 0.4021 0.4642 0.6525 0.6588 0.1658 0.5864 0.8478 0.8427 0.6809 0.2976 0.7715 0.1841 0.3912 0.4631 0.7762 0.5229 0.865 0.6718 0.2392 0.7949 0.7553 0.4054 0.5536 0.6975 0.3402 0.6431 0.7654 0.7873 0.6005 0.8311 0.6344 0.7914 0.7303 0.7485 0.3704 0.6051 0.8367 0.6095 0.421 0.3744 0.1222 0.4862 0.7144 0.3411 0.0054 0.5528 0.712 0.5881 0.0137 0.8542 0.2267 0.5412 0.8481] 2022-08-24 06:57:45 [INFO] [EVAL] Class Recall: [0.8544 0.9117 0.9626 0.878 0.8693 0.8711 0.8761 0.9332 0.7182 0.8097 0.6698 0.7439 0.877 0.3898 0.4148 0.6063 0.6475 0.5051 0.7544 0.5822 0.9081 0.5596 0.7577 0.7547 0.5611 0.6522 0.7149 0.5291 0.5351 0.3787 0.4385 0.7084 0.3934 0.5575 0.5941 0.6368 0.6395 0.6512 0.5041 0.505 0.1163 0.2063 0.4888 0.3782 0.421 0.3304 0.4278 0.6424 0.8301 0.7581 0.7198 0.6822 0.2276 0.197 0.9226 0.3926 0.9474 0.5028 0.4305 0.43 0.1748 0.3837 0.5519 0.1834 0.7573 0.8495 0.4308 0.597 0.2073 0.4564 0.5807 0.7492 0.5637 0.4659 0.6167 0.5372 0.6853 0.3583 0.2343 0.4154 0.8206 0.5215 0.4689 0.1687 0.2155 0.6815 0.1582 0.1594 0.3412 0.6934 
0.6177 0.2219 0.295 0.1138 0.0901 0.1004 0.1318 0.2807 0.4428 0.35 0.3204 0.2238 0.376 0.8571 0.1908 0.8433 0.1918 0.6982 0.1382 0.3756 0.2869 0.6555 0.2137 0.6566 0.9852 0.0723 0.4517 0.8148 0.2777 0.6085 0.5342 0.0775 0.4027 0.1828 0.3194 0.3143 0.6165 0.6703 0.6375 0.4912 0.774 0.0799 0.3033 0.4538 0.3239 0.2744 0.2079 0.0709 0.2578 0.4786 0.2223 0.0066 0.3945 0.3598 0.2768 0.0354 0.4676 0.0439 0.2077 0.2102] 2022-08-24 06:57:45 [INFO] [EVAL] The model with the best validation mIoU (0.3785) was saved at iter 126000. 2022-08-24 06:57:53 [INFO] [TRAIN] epoch: 103, iter: 130050/160000, loss: 0.4144, lr: 0.000227, batch_cost: 0.1660, reader_cost: 0.00409, ips: 48.2069 samples/sec | ETA 01:22:50 2022-08-24 06:58:04 [INFO] [TRAIN] epoch: 104, iter: 130100/160000, loss: 0.3955, lr: 0.000226, batch_cost: 0.2235, reader_cost: 0.04628, ips: 35.7961 samples/sec | ETA 01:51:22 2022-08-24 06:58:13 [INFO] [TRAIN] epoch: 104, iter: 130150/160000, loss: 0.3928, lr: 0.000226, batch_cost: 0.1667, reader_cost: 0.00068, ips: 47.9972 samples/sec | ETA 01:22:55 2022-08-24 06:58:21 [INFO] [TRAIN] epoch: 104, iter: 130200/160000, loss: 0.4170, lr: 0.000226, batch_cost: 0.1549, reader_cost: 0.00074, ips: 51.6587 samples/sec | ETA 01:16:54 2022-08-24 06:58:29 [INFO] [TRAIN] epoch: 104, iter: 130250/160000, loss: 0.3937, lr: 0.000225, batch_cost: 0.1634, reader_cost: 0.00057, ips: 48.9457 samples/sec | ETA 01:21:02 2022-08-24 06:58:38 [INFO] [TRAIN] epoch: 104, iter: 130300/160000, loss: 0.3753, lr: 0.000225, batch_cost: 0.1834, reader_cost: 0.00061, ips: 43.6182 samples/sec | ETA 01:30:47 2022-08-24 06:58:47 [INFO] [TRAIN] epoch: 104, iter: 130350/160000, loss: 0.3839, lr: 0.000224, batch_cost: 0.1869, reader_cost: 0.00069, ips: 42.7967 samples/sec | ETA 01:32:22 2022-08-24 06:58:57 [INFO] [TRAIN] epoch: 104, iter: 130400/160000, loss: 0.4258, lr: 0.000224, batch_cost: 0.1952, reader_cost: 0.00053, ips: 40.9871 samples/sec | ETA 01:36:17 2022-08-24 06:59:07 [INFO] [TRAIN] epoch: 104, iter: 130450/160000, loss: 0.4113, lr: 0.000224, batch_cost: 0.1947, reader_cost: 0.00061, ips: 41.0838 samples/sec | ETA 01:35:54 2022-08-24 06:59:18 [INFO] [TRAIN] epoch: 104, iter: 130500/160000, loss: 0.3760, lr: 0.000223, batch_cost: 0.2237, reader_cost: 0.00050, ips: 35.7559 samples/sec | ETA 01:50:00 2022-08-24 06:59:28 [INFO] [TRAIN] epoch: 104, iter: 130550/160000, loss: 0.3959, lr: 0.000223, batch_cost: 0.1980, reader_cost: 0.00043, ips: 40.4050 samples/sec | ETA 01:37:10 2022-08-24 06:59:38 [INFO] [TRAIN] epoch: 104, iter: 130600/160000, loss: 0.4140, lr: 0.000223, batch_cost: 0.2041, reader_cost: 0.00035, ips: 39.1886 samples/sec | ETA 01:40:01 2022-08-24 06:59:48 [INFO] [TRAIN] epoch: 104, iter: 130650/160000, loss: 0.3761, lr: 0.000222, batch_cost: 0.1982, reader_cost: 0.00045, ips: 40.3679 samples/sec | ETA 01:36:56 2022-08-24 06:59:59 [INFO] [TRAIN] epoch: 104, iter: 130700/160000, loss: 0.4011, lr: 0.000222, batch_cost: 0.2183, reader_cost: 0.00045, ips: 36.6551 samples/sec | ETA 01:46:34 2022-08-24 07:00:09 [INFO] [TRAIN] epoch: 104, iter: 130750/160000, loss: 0.4292, lr: 0.000221, batch_cost: 0.2060, reader_cost: 0.00039, ips: 38.8324 samples/sec | ETA 01:40:25 2022-08-24 07:00:20 [INFO] [TRAIN] epoch: 104, iter: 130800/160000, loss: 0.3950, lr: 0.000221, batch_cost: 0.2156, reader_cost: 0.00059, ips: 37.1086 samples/sec | ETA 01:44:55 2022-08-24 07:00:30 [INFO] [TRAIN] epoch: 104, iter: 130850/160000, loss: 0.3946, lr: 0.000221, batch_cost: 0.2019, reader_cost: 0.00180, ips: 39.6249 samples/sec | ETA 01:38:05 
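Note on reading these lines: the [TRAIN] entries are internally consistent, in that ips * batch_cost recovers the global batch size (about 8 samples here) and (160000 - iter) * batch_cost reproduces the printed ETA to within a couple of seconds, while each [EVAL] block reports mIoU as the mean of the per-class IoU values (150 classes for ADE20K) and then records the best checkpoint seen so far. Below is a minimal sketch of how a log in this format could be summarized offline; the file name trainer.log, the regular expressions, and the use of the iter: x/160000 field for the total iteration count are assumptions based on the line format shown here, not a utility from the original run.

import re

# Illustrative sketch (assumed, not part of this run): summarize a PaddleSeg-style
# log such as the one above by pulling out TRAIN points, EVAL mIoU values and the
# best-checkpoint notices.
TRAIN_RE = re.compile(
    r"\[TRAIN\] epoch: (\d+), iter: (\d+)/(\d+), loss: ([\d.]+), lr: [\d.]+, "
    r"batch_cost: ([\d.]+), reader_cost: [\d.]+, ips: ([\d.]+)"
)
EVAL_RE = re.compile(r"\[EVAL\] #Images: \d+ mIoU: ([\d.]+) Acc: ([\d.]+)")
BEST_RE = re.compile(r"best validation mIoU \(([\d.]+)\) was saved at iter (\d+)")

with open("trainer.log") as f:   # assumed log file name
    text = f.read()

train = [(int(m.group(2)), int(m.group(3)), float(m.group(4)),
          float(m.group(5)), float(m.group(6))) for m in TRAIN_RE.finditer(text)]
evals = [float(m.group(1)) for m in EVAL_RE.finditer(text)]
best = [(float(m.group(1)), int(m.group(2))) for m in BEST_RE.finditer(text)]

if train:
    it, total, loss, batch_cost, ips = train[-1]
    # ips * batch_cost recovers the global batch size (~8 samples here);
    # (total - it) * batch_cost approximates the printed ETA.
    print("iter %d  loss %.4f  batch size ~%.1f  ETA ~%.1f min"
          % (it, loss, ips * batch_cost, (total - it) * batch_cost / 60))
print("eval mIoU history:", evals)
if best:
    print("best mIoU %.4f saved at iter %d" % best[-1])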
2022-08-24 07:00:39 [INFO] [TRAIN] epoch: 104, iter: 130900/160000, loss: 0.4284, lr: 0.000220, batch_cost: 0.1875, reader_cost: 0.00046, ips: 42.6609 samples/sec | ETA 01:30:56 2022-08-24 07:00:49 [INFO] [TRAIN] epoch: 104, iter: 130950/160000, loss: 0.3913, lr: 0.000220, batch_cost: 0.1973, reader_cost: 0.00063, ips: 40.5519 samples/sec | ETA 01:35:30 2022-08-24 07:00:58 [INFO] [TRAIN] epoch: 104, iter: 131000/160000, loss: 0.4190, lr: 0.000220, batch_cost: 0.1746, reader_cost: 0.00072, ips: 45.8296 samples/sec | ETA 01:24:22 2022-08-24 07:00:58 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 150s - batch_cost: 0.1496 - reader cost: 0.0011 2022-08-24 07:03:28 [INFO] [EVAL] #Images: 2000 mIoU: 0.3799 Acc: 0.7765 Kappa: 0.7594 Dice: 0.5177 2022-08-24 07:03:28 [INFO] [EVAL] Class IoU: [0.6946 0.7953 0.9326 0.7432 0.6859 0.775 0.7782 0.8162 0.538 0.6301 0.4983 0.5832 0.7163 0.2884 0.308 0.4469 0.4937 0.4205 0.6118 0.4338 0.7759 0.4556 0.641 0.5139 0.3239 0.4592 0.4463 0.4682 0.4285 0.2954 0.3253 0.5584 0.3333 0.3657 0.3751 0.4285 0.49 0.576 0.3039 0.375 0.127 0.1487 0.3566 0.2632 0.2389 0.2434 0.3467 0.5364 0.6373 0.547 0.5908 0.368 0.156 0.1897 0.6971 0.3348 0.8771 0.4694 0.4162 0.2885 0.089 0.2212 0.3294 0.1556 0.507 0.7215 0.2527 0.3899 0.1289 0.388 0.5071 0.533 0.3873 0.26 0.4992 0.4008 0.5673 0.3071 0.4173 0.344 0.7152 0.4346 0.4251 0.1498 0.1975 0.5358 0.1274 0.1519 0.3193 0.562 0.453 0.0359 0.1964 0.1118 0.058 0.0456 0.1369 0.2019 0.2724 0.2882 0.2736 0.1308 0.29 0.7597 0.192 0.6916 0.1443 0.5701 0.0987 0.2162 0.2191 0.4726 0.2011 0.7105 0.6937 0.0666 0.4246 0.6164 0.1445 0.4041 0.4018 0.0736 0.3476 0.1736 0.3116 0.2562 0.5434 0.4758 0.5595 0.3915 0.5949 0.0607 0.2061 0.4195 0.2887 0.1948 0.1764 0.0655 0.2228 0.4043 0.2035 0.0027 0.2787 0.143 0.2523 0.0028 0.4239 0.0308 0.176 0.1943] 2022-08-24 07:03:28 [INFO] [EVAL] Class Precision: [0.7847 0.8717 0.9646 0.8281 0.7619 0.8699 0.8923 0.8812 0.6988 0.7632 0.6954 0.6993 0.7836 0.4932 0.5861 0.611 0.6896 0.7031 0.7699 0.6513 0.8394 0.6787 0.7996 0.6505 0.4544 0.6074 0.5546 0.7915 0.7554 0.4526 0.4801 0.7017 0.5626 0.4991 0.4752 0.5953 0.6875 0.8343 0.4362 0.5878 0.3123 0.3143 0.5901 0.4671 0.327 0.4799 0.672 0.7846 0.7155 0.6776 0.7714 0.4522 0.3666 0.5144 0.7397 0.671 0.9132 0.6444 0.6564 0.4895 0.1406 0.4206 0.4411 0.6311 0.6412 0.8178 0.3935 0.5496 0.2673 0.6638 0.7348 0.618 0.5883 0.3453 0.7521 0.5898 0.8076 0.6168 0.7878 0.63 0.8563 0.7246 0.8077 0.3368 0.2832 0.7236 0.4872 0.3994 0.875 0.7308 0.6311 0.0582 0.3971 0.3312 0.1879 0.1392 0.5271 0.3999 0.3899 0.6986 0.6801 0.2302 0.6288 0.8873 0.8007 0.781 0.3257 0.7567 0.2359 0.3735 0.4833 0.583 0.5313 0.8486 0.7001 0.3078 0.7688 0.7102 0.2361 0.5568 0.7477 0.2967 0.6662 0.7016 0.7458 0.6391 0.8129 0.6267 0.8243 0.6549 0.6947 0.3063 0.5421 0.8385 0.6138 0.4145 0.3253 0.1316 0.4383 0.7137 0.5501 0.0042 0.6127 0.6158 0.5517 0.0047 0.8828 0.2204 0.4788 0.7405] 2022-08-24 07:03:28 [INFO] [EVAL] Class Recall: [0.8582 0.9007 0.9656 0.8787 0.8731 0.8766 0.8589 0.9172 0.7004 0.7833 0.6374 0.7783 0.8929 0.4098 0.3936 0.6246 0.6347 0.5113 0.7488 0.565 0.9111 0.581 0.7637 0.7099 0.5301 0.6531 0.6955 0.5341 0.4976 0.4596 0.5022 0.7323 0.4499 0.5778 0.6405 0.6047 0.6305 0.6504 0.5007 0.5088 0.1763 0.2201 0.474 0.3762 0.4698 0.3307 0.4173 0.629 0.8536 0.7395 0.7162 0.6642 0.2136 0.231 0.9237 0.4006 0.9569 0.6335 0.5321 0.4126 0.195 0.3182 0.5655 0.1712 0.7077 0.8597 0.4139 0.5731 0.1994 0.4829 0.6207 0.7947 0.5312 0.5129 0.5975 0.5558 0.6559 0.3795 0.4701 0.4311 0.8127 
0.5206 0.473 0.2124 0.3947 0.6738 0.1471 0.1968 0.3346 0.7087 0.6161 0.0853 0.2798 0.1444 0.0775 0.0635 0.1561 0.2898 0.4747 0.3291 0.314 0.2324 0.35 0.8407 0.2016 0.858 0.2058 0.6981 0.145 0.3393 0.2861 0.714 0.2444 0.8137 0.987 0.0783 0.4868 0.8236 0.2714 0.5958 0.4649 0.0892 0.4209 0.1875 0.3486 0.2995 0.6211 0.6639 0.6353 0.4933 0.8055 0.0703 0.2496 0.4564 0.3527 0.2687 0.2781 0.1152 0.3118 0.4825 0.2441 0.0077 0.3382 0.1571 0.3173 0.0071 0.4492 0.0346 0.2178 0.2084] 2022-08-24 07:03:28 [INFO] [EVAL] The model with the best validation mIoU (0.3799) was saved at iter 131000. 2022-08-24 07:03:36 [INFO] [TRAIN] epoch: 104, iter: 131050/160000, loss: 0.4254, lr: 0.000219, batch_cost: 0.1542, reader_cost: 0.00345, ips: 51.8783 samples/sec | ETA 01:14:24 2022-08-24 07:03:44 [INFO] [TRAIN] epoch: 104, iter: 131100/160000, loss: 0.4300, lr: 0.000219, batch_cost: 0.1603, reader_cost: 0.00901, ips: 49.9092 samples/sec | ETA 01:17:12 2022-08-24 07:03:52 [INFO] [TRAIN] epoch: 104, iter: 131150/160000, loss: 0.4299, lr: 0.000218, batch_cost: 0.1534, reader_cost: 0.00062, ips: 52.1488 samples/sec | ETA 01:13:45 2022-08-24 07:04:00 [INFO] [TRAIN] epoch: 104, iter: 131200/160000, loss: 0.4222, lr: 0.000218, batch_cost: 0.1614, reader_cost: 0.00074, ips: 49.5592 samples/sec | ETA 01:17:28 2022-08-24 07:04:08 [INFO] [TRAIN] epoch: 104, iter: 131250/160000, loss: 0.3881, lr: 0.000218, batch_cost: 0.1626, reader_cost: 0.00083, ips: 49.1868 samples/sec | ETA 01:17:56 2022-08-24 07:04:16 [INFO] [TRAIN] epoch: 104, iter: 131300/160000, loss: 0.3998, lr: 0.000217, batch_cost: 0.1619, reader_cost: 0.00064, ips: 49.4210 samples/sec | ETA 01:17:25 2022-08-24 07:04:24 [INFO] [TRAIN] epoch: 104, iter: 131350/160000, loss: 0.4075, lr: 0.000217, batch_cost: 0.1647, reader_cost: 0.00094, ips: 48.5611 samples/sec | ETA 01:18:39 2022-08-24 07:04:37 [INFO] [TRAIN] epoch: 105, iter: 131400/160000, loss: 0.3923, lr: 0.000217, batch_cost: 0.2660, reader_cost: 0.07784, ips: 30.0808 samples/sec | ETA 02:06:46 2022-08-24 07:04:47 [INFO] [TRAIN] epoch: 105, iter: 131450/160000, loss: 0.3806, lr: 0.000216, batch_cost: 0.1995, reader_cost: 0.00075, ips: 40.0993 samples/sec | ETA 01:34:55 2022-08-24 07:04:56 [INFO] [TRAIN] epoch: 105, iter: 131500/160000, loss: 0.3927, lr: 0.000216, batch_cost: 0.1766, reader_cost: 0.00196, ips: 45.3083 samples/sec | ETA 01:23:52 2022-08-24 07:05:05 [INFO] [TRAIN] epoch: 105, iter: 131550/160000, loss: 0.4007, lr: 0.000215, batch_cost: 0.1840, reader_cost: 0.00081, ips: 43.4900 samples/sec | ETA 01:27:13 2022-08-24 07:05:15 [INFO] [TRAIN] epoch: 105, iter: 131600/160000, loss: 0.3980, lr: 0.000215, batch_cost: 0.1948, reader_cost: 0.00069, ips: 41.0647 samples/sec | ETA 01:32:12 2022-08-24 07:05:25 [INFO] [TRAIN] epoch: 105, iter: 131650/160000, loss: 0.3909, lr: 0.000215, batch_cost: 0.1991, reader_cost: 0.00045, ips: 40.1728 samples/sec | ETA 01:34:05 2022-08-24 07:05:34 [INFO] [TRAIN] epoch: 105, iter: 131700/160000, loss: 0.3936, lr: 0.000214, batch_cost: 0.1863, reader_cost: 0.00061, ips: 42.9392 samples/sec | ETA 01:27:52 2022-08-24 07:05:43 [INFO] [TRAIN] epoch: 105, iter: 131750/160000, loss: 0.4282, lr: 0.000214, batch_cost: 0.1746, reader_cost: 0.00038, ips: 45.8194 samples/sec | ETA 01:22:12 2022-08-24 07:05:53 [INFO] [TRAIN] epoch: 105, iter: 131800/160000, loss: 0.3977, lr: 0.000214, batch_cost: 0.1983, reader_cost: 0.00033, ips: 40.3491 samples/sec | ETA 01:33:11 2022-08-24 07:06:03 [INFO] [TRAIN] epoch: 105, iter: 131850/160000, loss: 0.4121, lr: 0.000213, batch_cost: 0.1974, 
reader_cost: 0.00051, ips: 40.5314 samples/sec | ETA 01:32:36 2022-08-24 07:06:13 [INFO] [TRAIN] epoch: 105, iter: 131900/160000, loss: 0.4124, lr: 0.000213, batch_cost: 0.1957, reader_cost: 0.00169, ips: 40.8764 samples/sec | ETA 01:31:39 2022-08-24 07:06:23 [INFO] [TRAIN] epoch: 105, iter: 131950/160000, loss: 0.4248, lr: 0.000212, batch_cost: 0.2019, reader_cost: 0.00044, ips: 39.6143 samples/sec | ETA 01:34:24 2022-08-24 07:06:33 [INFO] [TRAIN] epoch: 105, iter: 132000/160000, loss: 0.4227, lr: 0.000212, batch_cost: 0.1988, reader_cost: 0.00061, ips: 40.2466 samples/sec | ETA 01:32:45 2022-08-24 07:06:33 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 156s - batch_cost: 0.1556 - reader cost: 6.3533e-04 2022-08-24 07:09:09 [INFO] [EVAL] #Images: 2000 mIoU: 0.3793 Acc: 0.7786 Kappa: 0.7615 Dice: 0.5166 2022-08-24 07:09:09 [INFO] [EVAL] Class IoU: [0.6963 0.7967 0.9312 0.7428 0.6924 0.7771 0.7761 0.816 0.5355 0.6464 0.5024 0.5735 0.7227 0.3032 0.3069 0.4561 0.5008 0.4262 0.6118 0.4482 0.7722 0.4282 0.6347 0.5151 0.3477 0.4433 0.4556 0.4682 0.4251 0.2722 0.2898 0.5567 0.3216 0.3715 0.3708 0.4237 0.4851 0.5799 0.3079 0.3554 0.1224 0.1537 0.3575 0.2575 0.2421 0.246 0.3839 0.5369 0.6403 0.5631 0.5955 0.3798 0.156 0.2227 0.6916 0.3398 0.8926 0.4433 0.4206 0.2956 0.0934 0.2103 0.3563 0.1364 0.4861 0.7397 0.2641 0.4003 0.1424 0.3977 0.4926 0.5747 0.3977 0.2538 0.5036 0.4126 0.5362 0.3215 0.3418 0.3032 0.7281 0.44 0.4103 0.067 0.174 0.5342 0.1245 0.1282 0.3038 0.5301 0.4601 0.1128 0.2388 0.1015 0.0504 0.058 0.1639 0.19 0.2788 0.3165 0.2119 0.124 0.2842 0.7747 0.1805 0.6436 0.137 0.5737 0.0879 0.2279 0.191 0.4551 0.1915 0.6218 0.7476 0.0544 0.4323 0.6309 0.1453 0.4033 0.459 0.0634 0.3395 0.1914 0.3095 0.2489 0.5379 0.482 0.5865 0.3809 0.5744 0.0471 0.3021 0.4375 0.2931 0.1987 0.1719 0.0495 0.216 0.4236 0.152 0.0231 0.2626 0.14 0.2163 0.0022 0.4576 0.0398 0.156 0.2049] 2022-08-24 07:09:09 [INFO] [EVAL] Class Precision: [0.7852 0.8596 0.9617 0.8235 0.7717 0.8786 0.8797 0.8771 0.6617 0.7681 0.7062 0.7002 0.7936 0.5287 0.5984 0.6321 0.696 0.7035 0.7711 0.6287 0.8334 0.6608 0.7876 0.6086 0.4851 0.6936 0.5603 0.7664 0.7693 0.4229 0.4942 0.7407 0.5683 0.5695 0.483 0.573 0.6647 0.8342 0.4537 0.6305 0.361 0.336 0.6145 0.4908 0.327 0.5061 0.6913 0.6933 0.7402 0.6962 0.8046 0.476 0.3249 0.6569 0.7411 0.6807 0.9357 0.6827 0.7172 0.5385 0.1321 0.3627 0.5154 0.7103 0.5983 0.8671 0.4152 0.5649 0.3381 0.6889 0.7471 0.6753 0.5885 0.3555 0.7504 0.6145 0.7965 0.7141 0.7775 0.6184 0.8729 0.6963 0.8364 0.1859 0.2847 0.7105 0.4968 0.4686 0.7991 0.644 0.6626 0.2019 0.5424 0.3277 0.1791 0.1758 0.51 0.4434 0.4307 0.7514 0.67 0.2121 0.5763 0.8748 0.7577 0.7291 0.3504 0.7788 0.1777 0.3743 0.473 0.5911 0.6421 0.8623 0.7597 0.2726 0.7562 0.7321 0.2491 0.5845 0.7157 0.2791 0.6723 0.6686 0.7601 0.655 0.7837 0.6218 0.8888 0.567 0.7502 0.3401 0.6624 0.8079 0.6152 0.4197 0.3404 0.1211 0.428 0.6545 0.4771 0.0334 0.606 0.495 0.6135 0.0039 0.8214 0.2157 0.4351 0.7655] 2022-08-24 07:09:09 [INFO] [EVAL] Class Recall: [0.8602 0.9159 0.967 0.8834 0.8708 0.8706 0.8683 0.9213 0.7373 0.8032 0.6352 0.7603 0.89 0.4155 0.3866 0.621 0.6411 0.5195 0.7475 0.6095 0.9132 0.5488 0.7658 0.7702 0.5509 0.5513 0.7093 0.5462 0.4872 0.4329 0.412 0.6915 0.4255 0.5166 0.6147 0.6191 0.6424 0.6555 0.4892 0.4489 0.1563 0.2207 0.4609 0.3514 0.4826 0.3238 0.4633 0.7041 0.826 0.7465 0.6962 0.6528 0.2308 0.252 0.912 0.4042 0.951 0.5583 0.5043 0.3959 0.2415 0.3336 0.5358 0.1444 0.7216 0.8344 0.4205 0.5787 0.1975 0.4847 0.5912 0.7941 
0.5509 0.4702 0.6049 0.5567 0.6213 0.369 0.3789 0.3729 0.8145 0.5445 0.4461 0.0948 0.309 0.6829 0.1425 0.15 0.329 0.7499 0.6009 0.2034 0.2991 0.1283 0.0656 0.0797 0.1945 0.2494 0.4416 0.3535 0.2366 0.2299 0.3593 0.8713 0.1915 0.8459 0.1837 0.6854 0.1482 0.3681 0.2426 0.6642 0.2144 0.6904 0.9792 0.0636 0.5023 0.8203 0.2585 0.5653 0.5614 0.0758 0.4068 0.2115 0.343 0.2864 0.6316 0.682 0.6329 0.5372 0.7103 0.0519 0.357 0.4883 0.3589 0.274 0.2578 0.0773 0.3036 0.5455 0.1825 0.0697 0.3167 0.1634 0.2504 0.0048 0.5082 0.0465 0.1957 0.2187] 2022-08-24 07:09:09 [INFO] [EVAL] The model with the best validation mIoU (0.3799) was saved at iter 131000. 2022-08-24 07:09:17 [INFO] [TRAIN] epoch: 105, iter: 132050/160000, loss: 0.3986, lr: 0.000212, batch_cost: 0.1673, reader_cost: 0.00398, ips: 47.8136 samples/sec | ETA 01:17:56 2022-08-24 07:09:27 [INFO] [TRAIN] epoch: 105, iter: 132100/160000, loss: 0.4338, lr: 0.000211, batch_cost: 0.1891, reader_cost: 0.00121, ips: 42.3011 samples/sec | ETA 01:27:56 2022-08-24 07:09:35 [INFO] [TRAIN] epoch: 105, iter: 132150/160000, loss: 0.4036, lr: 0.000211, batch_cost: 0.1632, reader_cost: 0.00085, ips: 49.0208 samples/sec | ETA 01:15:45 2022-08-24 07:09:44 [INFO] [TRAIN] epoch: 105, iter: 132200/160000, loss: 0.4015, lr: 0.000210, batch_cost: 0.1748, reader_cost: 0.00105, ips: 45.7721 samples/sec | ETA 01:20:58 2022-08-24 07:09:52 [INFO] [TRAIN] epoch: 105, iter: 132250/160000, loss: 0.3783, lr: 0.000210, batch_cost: 0.1648, reader_cost: 0.00309, ips: 48.5450 samples/sec | ETA 01:16:13 2022-08-24 07:10:02 [INFO] [TRAIN] epoch: 105, iter: 132300/160000, loss: 0.3904, lr: 0.000210, batch_cost: 0.1975, reader_cost: 0.00044, ips: 40.5130 samples/sec | ETA 01:31:09 2022-08-24 07:10:10 [INFO] [TRAIN] epoch: 105, iter: 132350/160000, loss: 0.4236, lr: 0.000209, batch_cost: 0.1729, reader_cost: 0.00185, ips: 46.2774 samples/sec | ETA 01:19:39 2022-08-24 07:10:19 [INFO] [TRAIN] epoch: 105, iter: 132400/160000, loss: 0.4083, lr: 0.000209, batch_cost: 0.1635, reader_cost: 0.00463, ips: 48.9323 samples/sec | ETA 01:15:12 2022-08-24 07:10:27 [INFO] [TRAIN] epoch: 105, iter: 132450/160000, loss: 0.3887, lr: 0.000209, batch_cost: 0.1755, reader_cost: 0.00088, ips: 45.5935 samples/sec | ETA 01:20:34 2022-08-24 07:10:36 [INFO] [TRAIN] epoch: 105, iter: 132500/160000, loss: 0.4246, lr: 0.000208, batch_cost: 0.1780, reader_cost: 0.00368, ips: 44.9552 samples/sec | ETA 01:21:33 2022-08-24 07:10:46 [INFO] [TRAIN] epoch: 105, iter: 132550/160000, loss: 0.4236, lr: 0.000208, batch_cost: 0.1910, reader_cost: 0.00042, ips: 41.8760 samples/sec | ETA 01:27:24 2022-08-24 07:10:55 [INFO] [TRAIN] epoch: 105, iter: 132600/160000, loss: 0.4151, lr: 0.000207, batch_cost: 0.1839, reader_cost: 0.00620, ips: 43.5027 samples/sec | ETA 01:23:58 2022-08-24 07:11:10 [INFO] [TRAIN] epoch: 106, iter: 132650/160000, loss: 0.4343, lr: 0.000207, batch_cost: 0.2924, reader_cost: 0.07818, ips: 27.3598 samples/sec | ETA 02:13:17 2022-08-24 07:11:20 [INFO] [TRAIN] epoch: 106, iter: 132700/160000, loss: 0.3962, lr: 0.000207, batch_cost: 0.2097, reader_cost: 0.00051, ips: 38.1461 samples/sec | ETA 01:35:25 2022-08-24 07:11:30 [INFO] [TRAIN] epoch: 106, iter: 132750/160000, loss: 0.3993, lr: 0.000206, batch_cost: 0.1972, reader_cost: 0.00036, ips: 40.5695 samples/sec | ETA 01:29:33 2022-08-24 07:11:40 [INFO] [TRAIN] epoch: 106, iter: 132800/160000, loss: 0.3881, lr: 0.000206, batch_cost: 0.2027, reader_cost: 0.00052, ips: 39.4675 samples/sec | ETA 01:31:53 2022-08-24 07:11:50 [INFO] [TRAIN] epoch: 106, iter: 
132850/160000, loss: 0.4228, lr: 0.000206, batch_cost: 0.2039, reader_cost: 0.00053, ips: 39.2261 samples/sec | ETA 01:32:17 2022-08-24 07:12:00 [INFO] [TRAIN] epoch: 106, iter: 132900/160000, loss: 0.4285, lr: 0.000205, batch_cost: 0.1916, reader_cost: 0.00042, ips: 41.7538 samples/sec | ETA 01:26:32 2022-08-24 07:12:11 [INFO] [TRAIN] epoch: 106, iter: 132950/160000, loss: 0.4331, lr: 0.000205, batch_cost: 0.2223, reader_cost: 0.00035, ips: 35.9802 samples/sec | ETA 01:40:14 2022-08-24 07:12:21 [INFO] [TRAIN] epoch: 106, iter: 133000/160000, loss: 0.4208, lr: 0.000204, batch_cost: 0.1934, reader_cost: 0.00050, ips: 41.3682 samples/sec | ETA 01:27:01 2022-08-24 07:12:21 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 161s - batch_cost: 0.1614 - reader cost: 8.0618e-04 2022-08-24 07:15:02 [INFO] [EVAL] #Images: 2000 mIoU: 0.3799 Acc: 0.7772 Kappa: 0.7602 Dice: 0.5180 2022-08-24 07:15:02 [INFO] [EVAL] Class IoU: [0.695 0.7945 0.9326 0.7416 0.698 0.772 0.7797 0.8107 0.529 0.6376 0.5061 0.5749 0.7234 0.2849 0.3214 0.4494 0.4995 0.4313 0.6159 0.4405 0.7778 0.4477 0.6349 0.5184 0.3356 0.4033 0.4503 0.4644 0.4225 0.2975 0.3094 0.5504 0.3383 0.3789 0.3719 0.4264 0.4891 0.6032 0.3012 0.3775 0.1516 0.1291 0.3555 0.2636 0.2481 0.2274 0.3191 0.5413 0.6461 0.53 0.605 0.3946 0.1682 0.2224 0.6937 0.3202 0.8834 0.4373 0.3596 0.2996 0.0832 0.263 0.3185 0.1821 0.4913 0.734 0.2539 0.4033 0.1277 0.3829 0.4673 0.5856 0.4041 0.2788 0.4927 0.4171 0.5321 0.2908 0.3437 0.2961 0.7239 0.4252 0.4187 0.1763 0.216 0.5299 0.1152 0.1295 0.4286 0.5622 0.4523 0.082 0.2416 0.1226 0.0388 0.0817 0.107 0.2023 0.2934 0.3108 0.2126 0.1137 0.2906 0.8067 0.1899 0.7321 0.1323 0.5671 0.0853 0.1975 0.202 0.3357 0.2042 0.6174 0.7639 0.054 0.4371 0.6429 0.1435 0.4055 0.4338 0.076 0.3578 0.177 0.2874 0.2454 0.5432 0.4875 0.5638 0.3773 0.5254 0.0805 0.2401 0.4051 0.2753 0.1952 0.1651 0.0546 0.2074 0.418 0.2217 0.0259 0.2638 0.2117 0.2195 0.0021 0.4361 0.0359 0.165 0.2114] 2022-08-24 07:15:02 [INFO] [EVAL] Class Precision: [0.7893 0.8618 0.9649 0.8182 0.7846 0.8796 0.8858 0.8628 0.6693 0.7406 0.692 0.7238 0.7931 0.5364 0.5309 0.6304 0.6908 0.7082 0.7733 0.6326 0.8482 0.6613 0.7936 0.631 0.4978 0.6391 0.5555 0.7927 0.7441 0.4358 0.4748 0.6936 0.5538 0.528 0.4728 0.5798 0.6942 0.7919 0.4778 0.6064 0.3168 0.3506 0.5911 0.4788 0.3897 0.4903 0.5531 0.7631 0.7382 0.6508 0.7825 0.5001 0.2885 0.6545 0.7376 0.6911 0.9351 0.6945 0.7183 0.5296 0.1289 0.4297 0.443 0.5919 0.6049 0.832 0.427 0.5532 0.3649 0.655 0.7409 0.7261 0.5477 0.3042 0.7532 0.6247 0.7317 0.6193 0.7627 0.4509 0.8597 0.7007 0.8317 0.3287 0.3012 0.7447 0.53 0.4409 0.8185 0.7616 0.6293 0.122 0.4676 0.3179 0.1157 0.1954 0.6112 0.4234 0.4418 0.7167 0.7473 0.186 0.604 0.8999 0.7436 0.8513 0.3278 0.779 0.1636 0.3463 0.4399 0.4023 0.6201 0.7919 0.776 0.2952 0.7522 0.7664 0.2164 0.5787 0.7288 0.3053 0.6674 0.6772 0.7989 0.6672 0.8018 0.6307 0.8325 0.5902 0.757 0.3765 0.5327 0.8205 0.6449 0.3942 0.3617 0.1395 0.4299 0.7117 0.5351 0.0363 0.563 0.5862 0.5801 0.0033 0.8776 0.1757 0.5216 0.6724] 2022-08-24 07:15:02 [INFO] [EVAL] Class Recall: [0.8533 0.9106 0.9653 0.8879 0.8635 0.8632 0.8669 0.9306 0.7162 0.821 0.6532 0.7365 0.8917 0.378 0.4488 0.6103 0.6434 0.5245 0.7515 0.592 0.9036 0.5808 0.7605 0.744 0.5075 0.5222 0.7039 0.5285 0.4943 0.4838 0.4704 0.7273 0.465 0.573 0.6354 0.6171 0.6234 0.7168 0.449 0.5001 0.2252 0.1696 0.4715 0.3697 0.4058 0.2977 0.4299 0.6506 0.8382 0.7406 0.7274 0.6518 0.2873 0.252 0.9209 0.3737 0.9411 0.5415 0.4186 0.4081 0.19 0.404 
0.5313 0.2083 0.7234 0.8616 0.385 0.598 0.1643 0.4796 0.5586 0.7517 0.6065 0.7696 0.5876 0.5565 0.6611 0.3541 0.3849 0.4631 0.8209 0.5196 0.4575 0.2755 0.4332 0.6476 0.1283 0.1549 0.4736 0.6823 0.6166 0.2 0.3334 0.1664 0.0551 0.1231 0.1149 0.2792 0.4661 0.3544 0.2291 0.2264 0.3589 0.8863 0.2033 0.8395 0.1815 0.6759 0.1513 0.3148 0.2719 0.6698 0.2334 0.737 0.9801 0.0621 0.5106 0.7995 0.2988 0.5754 0.5173 0.092 0.4355 0.1933 0.3099 0.2796 0.6274 0.6823 0.6359 0.5113 0.632 0.0929 0.3042 0.4445 0.3245 0.2789 0.233 0.0824 0.2861 0.5032 0.2746 0.0826 0.3318 0.2489 0.261 0.0053 0.4644 0.0431 0.1944 0.2357] 2022-08-24 07:15:03 [INFO] [EVAL] The model with the best validation mIoU (0.3799) was saved at iter 131000. 2022-08-24 07:15:11 [INFO] [TRAIN] epoch: 106, iter: 133050/160000, loss: 0.4125, lr: 0.000204, batch_cost: 0.1678, reader_cost: 0.00455, ips: 47.6736 samples/sec | ETA 01:15:22 2022-08-24 07:15:20 [INFO] [TRAIN] epoch: 106, iter: 133100/160000, loss: 0.3727, lr: 0.000204, batch_cost: 0.1837, reader_cost: 0.00124, ips: 43.5499 samples/sec | ETA 01:22:21 2022-08-24 07:15:29 [INFO] [TRAIN] epoch: 106, iter: 133150/160000, loss: 0.3879, lr: 0.000203, batch_cost: 0.1696, reader_cost: 0.00103, ips: 47.1633 samples/sec | ETA 01:15:54 2022-08-24 07:15:37 [INFO] [TRAIN] epoch: 106, iter: 133200/160000, loss: 0.4017, lr: 0.000203, batch_cost: 0.1604, reader_cost: 0.00068, ips: 49.8827 samples/sec | ETA 01:11:38 2022-08-24 07:15:45 [INFO] [TRAIN] epoch: 106, iter: 133250/160000, loss: 0.3914, lr: 0.000203, batch_cost: 0.1620, reader_cost: 0.00064, ips: 49.3972 samples/sec | ETA 01:12:12 2022-08-24 07:15:53 [INFO] [TRAIN] epoch: 106, iter: 133300/160000, loss: 0.3991, lr: 0.000202, batch_cost: 0.1652, reader_cost: 0.00111, ips: 48.4156 samples/sec | ETA 01:13:31 2022-08-24 07:16:03 [INFO] [TRAIN] epoch: 106, iter: 133350/160000, loss: 0.4296, lr: 0.000202, batch_cost: 0.2000, reader_cost: 0.00056, ips: 40.0086 samples/sec | ETA 01:28:48 2022-08-24 07:16:12 [INFO] [TRAIN] epoch: 106, iter: 133400/160000, loss: 0.4107, lr: 0.000201, batch_cost: 0.1782, reader_cost: 0.00486, ips: 44.9011 samples/sec | ETA 01:18:59 2022-08-24 07:16:22 [INFO] [TRAIN] epoch: 106, iter: 133450/160000, loss: 0.4135, lr: 0.000201, batch_cost: 0.2085, reader_cost: 0.00151, ips: 38.3630 samples/sec | ETA 01:32:16 2022-08-24 07:16:32 [INFO] [TRAIN] epoch: 106, iter: 133500/160000, loss: 0.3997, lr: 0.000201, batch_cost: 0.2005, reader_cost: 0.00096, ips: 39.9025 samples/sec | ETA 01:28:32 2022-08-24 07:16:42 [INFO] [TRAIN] epoch: 106, iter: 133550/160000, loss: 0.4064, lr: 0.000200, batch_cost: 0.1880, reader_cost: 0.00058, ips: 42.5636 samples/sec | ETA 01:22:51 2022-08-24 07:16:52 [INFO] [TRAIN] epoch: 106, iter: 133600/160000, loss: 0.3625, lr: 0.000200, batch_cost: 0.1980, reader_cost: 0.00062, ips: 40.4057 samples/sec | ETA 01:27:06 2022-08-24 07:17:01 [INFO] [TRAIN] epoch: 106, iter: 133650/160000, loss: 0.4087, lr: 0.000200, batch_cost: 0.1879, reader_cost: 0.00060, ips: 42.5852 samples/sec | ETA 01:22:30 2022-08-24 07:17:10 [INFO] [TRAIN] epoch: 106, iter: 133700/160000, loss: 0.4168, lr: 0.000199, batch_cost: 0.1847, reader_cost: 0.00152, ips: 43.3019 samples/sec | ETA 01:20:58 2022-08-24 07:17:20 [INFO] [TRAIN] epoch: 106, iter: 133750/160000, loss: 0.4010, lr: 0.000199, batch_cost: 0.1943, reader_cost: 0.00081, ips: 41.1821 samples/sec | ETA 01:24:59 2022-08-24 07:17:29 [INFO] [TRAIN] epoch: 106, iter: 133800/160000, loss: 0.3855, lr: 0.000198, batch_cost: 0.1806, reader_cost: 0.00088, ips: 44.3029 samples/sec | ETA 
01:18:51 2022-08-24 07:17:39 [INFO] [TRAIN] epoch: 106, iter: 133850/160000, loss: 0.3974, lr: 0.000198, batch_cost: 0.1930, reader_cost: 0.00057, ips: 41.4467 samples/sec | ETA 01:24:07 2022-08-24 07:17:53 [INFO] [TRAIN] epoch: 107, iter: 133900/160000, loss: 0.3929, lr: 0.000198, batch_cost: 0.2917, reader_cost: 0.10259, ips: 27.4246 samples/sec | ETA 02:06:53 2022-08-24 07:18:03 [INFO] [TRAIN] epoch: 107, iter: 133950/160000, loss: 0.3852, lr: 0.000197, batch_cost: 0.1899, reader_cost: 0.00055, ips: 42.1361 samples/sec | ETA 01:22:25 2022-08-24 07:18:12 [INFO] [TRAIN] epoch: 107, iter: 134000/160000, loss: 0.4056, lr: 0.000197, batch_cost: 0.1896, reader_cost: 0.00038, ips: 42.1887 samples/sec | ETA 01:22:10 2022-08-24 07:18:12 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 155s - batch_cost: 0.1553 - reader cost: 7.1220e-04 2022-08-24 07:20:48 [INFO] [EVAL] #Images: 2000 mIoU: 0.3828 Acc: 0.7778 Kappa: 0.7609 Dice: 0.5214 2022-08-24 07:20:48 [INFO] [EVAL] Class IoU: [0.6951 0.7951 0.9315 0.7457 0.6946 0.7734 0.7821 0.8184 0.5306 0.6577 0.4957 0.5754 0.7188 0.3079 0.3126 0.4548 0.488 0.4451 0.6114 0.4478 0.7816 0.438 0.6388 0.5209 0.333 0.4298 0.4501 0.4677 0.443 0.3333 0.3141 0.5508 0.328 0.3718 0.3493 0.4299 0.4809 0.6102 0.2956 0.3916 0.1541 0.1259 0.3523 0.2532 0.2412 0.2075 0.3617 0.5474 0.635 0.5449 0.5954 0.3647 0.1765 0.1973 0.6847 0.3328 0.89 0.4507 0.4472 0.2852 0.0983 0.2334 0.3129 0.1543 0.4936 0.7468 0.2758 0.3905 0.1358 0.3647 0.5118 0.5815 0.3491 0.2624 0.4986 0.4176 0.4793 0.3208 0.3621 0.3117 0.7258 0.4182 0.4256 0.168 0.1077 0.5318 0.142 0.1255 0.3998 0.555 0.4489 0.0517 0.2238 0.1207 0.0528 0.0565 0.1927 0.1635 0.294 0.3113 0.2699 0.1311 0.3032 0.7838 0.1921 0.5789 0.1562 0.5749 0.0762 0.1971 0.1919 0.5059 0.2059 0.6765 0.6822 0.0581 0.4399 0.6358 0.1854 0.3856 0.4447 0.0727 0.3717 0.2261 0.2959 0.2527 0.5132 0.4788 0.5685 0.4043 0.543 0.0646 0.3014 0.4113 0.2745 0.2089 0.1655 0.0377 0.2115 0.4254 0.2227 0.0118 0.2819 0.467 0.2152 0.0016 0.4179 0.0386 0.1485 0.209 ] 2022-08-24 07:20:48 [INFO] [EVAL] Class Precision: [0.7903 0.8695 0.9644 0.8397 0.7689 0.8637 0.8815 0.8842 0.6845 0.7839 0.6981 0.7199 0.7845 0.5367 0.5528 0.623 0.6738 0.6906 0.7778 0.6273 0.8581 0.669 0.791 0.6548 0.4545 0.593 0.5587 0.7783 0.728 0.4835 0.4795 0.7159 0.5497 0.4918 0.4539 0.6001 0.6486 0.8073 0.4352 0.5991 0.3761 0.3578 0.5908 0.4947 0.3174 0.466 0.6563 0.7682 0.7393 0.6776 0.7578 0.4644 0.3285 0.5601 0.7241 0.6423 0.9302 0.6864 0.6059 0.4964 0.1433 0.4142 0.4553 0.6862 0.5971 0.874 0.4518 0.5179 0.2692 0.5843 0.7345 0.7367 0.5936 0.3361 0.7415 0.6502 0.7189 0.6164 0.783 0.5265 0.8519 0.6949 0.8243 0.3294 0.1991 0.7135 0.5021 0.466 0.8079 0.7101 0.6486 0.0664 0.4597 0.3479 0.1826 0.1627 0.5083 0.3565 0.4413 0.7057 0.6371 0.2115 0.5456 0.8509 0.7816 0.6232 0.3161 0.7819 0.1802 0.3372 0.4211 0.6707 0.6196 0.7818 0.6884 0.313 0.747 0.7442 0.257 0.5505 0.674 0.2597 0.6646 0.5419 0.7651 0.649 0.7386 0.6213 0.8429 0.6026 0.688 0.4372 0.5846 0.843 0.6218 0.406 0.3258 0.125 0.4832 0.7028 0.4163 0.0276 0.5889 0.66 0.5446 0.0021 0.8685 0.2749 0.4213 0.6775] 2022-08-24 07:20:48 [INFO] [EVAL] Class Recall: [0.8522 0.9028 0.9646 0.8695 0.8778 0.8809 0.8741 0.9166 0.7023 0.8033 0.631 0.7414 0.8957 0.4193 0.4184 0.6275 0.639 0.556 0.7408 0.6102 0.8975 0.5592 0.7685 0.7182 0.5546 0.6097 0.6984 0.5395 0.5309 0.5176 0.4766 0.7048 0.4486 0.6036 0.6025 0.6025 0.6504 0.7142 0.4796 0.5307 0.207 0.1626 0.466 0.3416 0.501 0.2723 0.4462 0.6558 0.8182 0.7355 0.7353 0.6295 0.276 
0.2334 0.9264 0.4085 0.9537 0.5676 0.6305 0.4014 0.2384 0.3485 0.5001 0.166 0.74 0.8369 0.4146 0.6135 0.215 0.4926 0.628 0.734 0.4587 0.5448 0.6035 0.5386 0.5899 0.4008 0.4025 0.4331 0.8306 0.5122 0.4681 0.2554 0.1901 0.6761 0.1652 0.1466 0.4418 0.7176 0.5931 0.1887 0.3037 0.156 0.0691 0.0797 0.2369 0.232 0.4684 0.3577 0.3189 0.2565 0.4056 0.9086 0.203 0.8905 0.236 0.6847 0.1166 0.3216 0.2607 0.6731 0.2356 0.8341 0.9869 0.0666 0.5169 0.8136 0.3993 0.5627 0.5666 0.0917 0.4575 0.2795 0.3254 0.2928 0.6271 0.6761 0.6359 0.5513 0.7203 0.0705 0.3836 0.4454 0.3295 0.3009 0.2518 0.0512 0.2733 0.5187 0.3238 0.0202 0.351 0.615 0.2625 0.0063 0.4461 0.043 0.1865 0.2321] 2022-08-24 07:20:48 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 07:20:56 [INFO] [TRAIN] epoch: 107, iter: 134050/160000, loss: 0.4124, lr: 0.000196, batch_cost: 0.1625, reader_cost: 0.00427, ips: 49.2206 samples/sec | ETA 01:10:17 2022-08-24 07:21:04 [INFO] [TRAIN] epoch: 107, iter: 134100/160000, loss: 0.4027, lr: 0.000196, batch_cost: 0.1563, reader_cost: 0.00103, ips: 51.1758 samples/sec | ETA 01:07:28 2022-08-24 07:21:12 [INFO] [TRAIN] epoch: 107, iter: 134150/160000, loss: 0.3953, lr: 0.000196, batch_cost: 0.1593, reader_cost: 0.00099, ips: 50.2284 samples/sec | ETA 01:08:37 2022-08-24 07:21:21 [INFO] [TRAIN] epoch: 107, iter: 134200/160000, loss: 0.4004, lr: 0.000195, batch_cost: 0.1737, reader_cost: 0.00098, ips: 46.0685 samples/sec | ETA 01:14:40 2022-08-24 07:21:29 [INFO] [TRAIN] epoch: 107, iter: 134250/160000, loss: 0.3840, lr: 0.000195, batch_cost: 0.1716, reader_cost: 0.00049, ips: 46.6091 samples/sec | ETA 01:13:39 2022-08-24 07:21:39 [INFO] [TRAIN] epoch: 107, iter: 134300/160000, loss: 0.3986, lr: 0.000195, batch_cost: 0.2050, reader_cost: 0.00053, ips: 39.0187 samples/sec | ETA 01:27:49 2022-08-24 07:21:49 [INFO] [TRAIN] epoch: 107, iter: 134350/160000, loss: 0.4384, lr: 0.000194, batch_cost: 0.1926, reader_cost: 0.00074, ips: 41.5267 samples/sec | ETA 01:22:21 2022-08-24 07:21:58 [INFO] [TRAIN] epoch: 107, iter: 134400/160000, loss: 0.4066, lr: 0.000194, batch_cost: 0.1765, reader_cost: 0.00078, ips: 45.3371 samples/sec | ETA 01:15:17 2022-08-24 07:22:07 [INFO] [TRAIN] epoch: 107, iter: 134450/160000, loss: 0.4086, lr: 0.000193, batch_cost: 0.1869, reader_cost: 0.00135, ips: 42.8000 samples/sec | ETA 01:19:35 2022-08-24 07:22:17 [INFO] [TRAIN] epoch: 107, iter: 134500/160000, loss: 0.3729, lr: 0.000193, batch_cost: 0.1868, reader_cost: 0.00058, ips: 42.8300 samples/sec | ETA 01:19:23 2022-08-24 07:22:27 [INFO] [TRAIN] epoch: 107, iter: 134550/160000, loss: 0.4335, lr: 0.000193, batch_cost: 0.2076, reader_cost: 0.00047, ips: 38.5313 samples/sec | ETA 01:28:04 2022-08-24 07:22:36 [INFO] [TRAIN] epoch: 107, iter: 134600/160000, loss: 0.3993, lr: 0.000192, batch_cost: 0.1828, reader_cost: 0.00059, ips: 43.7728 samples/sec | ETA 01:17:22 2022-08-24 07:22:46 [INFO] [TRAIN] epoch: 107, iter: 134650/160000, loss: 0.3804, lr: 0.000192, batch_cost: 0.2019, reader_cost: 0.00061, ips: 39.6218 samples/sec | ETA 01:25:18 2022-08-24 07:22:56 [INFO] [TRAIN] epoch: 107, iter: 134700/160000, loss: 0.3984, lr: 0.000192, batch_cost: 0.1991, reader_cost: 0.00190, ips: 40.1782 samples/sec | ETA 01:23:57 2022-08-24 07:23:06 [INFO] [TRAIN] epoch: 107, iter: 134750/160000, loss: 0.3929, lr: 0.000191, batch_cost: 0.2047, reader_cost: 0.00064, ips: 39.0885 samples/sec | ETA 01:26:07 2022-08-24 07:23:17 [INFO] [TRAIN] epoch: 107, iter: 134800/160000, loss: 0.3854, lr: 0.000191, batch_cost: 
0.2054, reader_cost: 0.00041, ips: 38.9398 samples/sec | ETA 01:26:17 2022-08-24 07:23:25 [INFO] [TRAIN] epoch: 107, iter: 134850/160000, loss: 0.4076, lr: 0.000190, batch_cost: 0.1757, reader_cost: 0.00070, ips: 45.5386 samples/sec | ETA 01:13:38 2022-08-24 07:23:35 [INFO] [TRAIN] epoch: 107, iter: 134900/160000, loss: 0.4187, lr: 0.000190, batch_cost: 0.1905, reader_cost: 0.00070, ips: 41.9967 samples/sec | ETA 01:19:41 2022-08-24 07:23:43 [INFO] [TRAIN] epoch: 107, iter: 134950/160000, loss: 0.4237, lr: 0.000190, batch_cost: 0.1682, reader_cost: 0.00052, ips: 47.5707 samples/sec | ETA 01:10:12 2022-08-24 07:23:52 [INFO] [TRAIN] epoch: 107, iter: 135000/160000, loss: 0.4158, lr: 0.000189, batch_cost: 0.1815, reader_cost: 0.00034, ips: 44.0742 samples/sec | ETA 01:15:37 2022-08-24 07:23:52 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 178s - batch_cost: 0.1782 - reader cost: 9.7695e-04 2022-08-24 07:26:51 [INFO] [EVAL] #Images: 2000 mIoU: 0.3775 Acc: 0.7783 Kappa: 0.7614 Dice: 0.5148 2022-08-24 07:26:51 [INFO] [EVAL] Class IoU: [0.6977 0.7983 0.9315 0.7448 0.6923 0.7772 0.7767 0.8123 0.5367 0.6545 0.5 0.5737 0.7219 0.3102 0.3197 0.4503 0.5006 0.448 0.6149 0.4399 0.7782 0.4437 0.6345 0.5175 0.3351 0.4601 0.4563 0.4705 0.4306 0.2779 0.2856 0.5504 0.3316 0.3652 0.3473 0.4331 0.4867 0.594 0.3146 0.3766 0.1395 0.1357 0.3579 0.252 0.2478 0.2048 0.3344 0.5501 0.6449 0.5513 0.6043 0.394 0.1522 0.1822 0.669 0.3285 0.8806 0.4533 0.2385 0.2766 0.1018 0.2577 0.3333 0.2006 0.4708 0.7388 0.3018 0.3807 0.137 0.3834 0.4941 0.5683 0.3772 0.2523 0.4981 0.4147 0.4951 0.307 0.3394 0.3075 0.728 0.4403 0.428 0.1062 0.0888 0.5279 0.1283 0.1373 0.3651 0.5548 0.4253 0.0701 0.2075 0.0894 0.0547 0.0536 0.0927 0.1521 0.2602 0.302 0.2152 0.1087 0.2974 0.7852 0.1849 0.5452 0.1995 0.5928 0.0687 0.2197 0.2161 0.4246 0.2088 0.6876 0.7058 0.0714 0.4236 0.636 0.2054 0.4239 0.4408 0.0789 0.3654 0.207 0.2863 0.2654 0.5234 0.4872 0.5938 0.3873 0.5648 0.0607 0.2783 0.4316 0.2959 0.2043 0.1714 0.0564 0.2118 0.4048 0.1039 0.0085 0.2889 0.2474 0.2356 0.0027 0.4272 0.0344 0.167 0.2146] 2022-08-24 07:26:51 [INFO] [EVAL] Class Precision: [0.798 0.8672 0.9643 0.8262 0.7709 0.8756 0.8795 0.8716 0.6705 0.757 0.7157 0.7142 0.7912 0.5283 0.5038 0.61 0.6543 0.6916 0.7766 0.6476 0.8467 0.6563 0.7828 0.6106 0.49 0.6617 0.5874 0.781 0.7534 0.4927 0.4883 0.7012 0.5702 0.4895 0.471 0.6072 0.6587 0.8288 0.4905 0.612 0.3716 0.3757 0.5897 0.5146 0.3318 0.4246 0.668 0.7314 0.7106 0.6911 0.7835 0.489 0.3195 0.5465 0.7126 0.723 0.9103 0.6631 0.6109 0.4706 0.1462 0.4921 0.445 0.5843 0.5527 0.8428 0.4363 0.5187 0.3126 0.641 0.7421 0.6792 0.5534 0.3481 0.7372 0.6153 0.7358 0.5925 0.7589 0.6188 0.8641 0.6667 0.8196 0.2518 0.1602 0.707 0.5732 0.3972 0.8583 0.6998 0.5809 0.0971 0.4363 0.3095 0.2229 0.1695 0.5384 0.3637 0.357 0.6503 0.7091 0.1628 0.5906 0.8803 0.7013 0.601 0.4117 0.7936 0.1654 0.3895 0.4316 0.5193 0.622 0.8051 0.7134 0.259 0.7424 0.7396 0.3737 0.5808 0.7448 0.2814 0.6196 0.6201 0.7805 0.6208 0.7791 0.6379 0.9058 0.5759 0.6918 0.3554 0.5536 0.7382 0.6089 0.3547 0.3603 0.1302 0.4719 0.7096 0.4159 0.0146 0.579 0.6557 0.5184 0.0044 0.8149 0.2262 0.4541 0.7305] 2022-08-24 07:26:51 [INFO] [EVAL] Class Recall: [0.8473 0.9095 0.9647 0.8833 0.8716 0.8737 0.8692 0.9227 0.729 0.8286 0.6239 0.7446 0.8918 0.4289 0.4666 0.6322 0.6806 0.5598 0.7472 0.5784 0.9058 0.578 0.77 0.7724 0.5145 0.6016 0.6714 0.542 0.5013 0.3893 0.4077 0.7191 0.4422 0.5899 0.5694 0.6017 0.6509 0.6771 0.4674 0.4947 0.1825 0.1753 0.4765 0.3306 
0.4946 0.2835 0.401 0.6894 0.8746 0.7316 0.7255 0.6697 0.2251 0.2147 0.9161 0.3758 0.9642 0.589 0.2812 0.4015 0.2508 0.351 0.5704 0.2341 0.7606 0.8568 0.4949 0.5888 0.196 0.4883 0.5965 0.7767 0.5423 0.4783 0.6056 0.5599 0.6021 0.3892 0.3805 0.3794 0.8221 0.5646 0.4725 0.1552 0.1661 0.6757 0.1418 0.1734 0.3885 0.728 0.6135 0.2015 0.2835 0.1117 0.0677 0.0727 0.1007 0.2073 0.4898 0.3605 0.236 0.2466 0.3746 0.8791 0.2007 0.8545 0.2791 0.7009 0.1051 0.335 0.3021 0.6997 0.2392 0.8249 0.9851 0.0897 0.4966 0.8196 0.3133 0.6108 0.5192 0.0988 0.4711 0.237 0.3114 0.3167 0.6146 0.6735 0.6329 0.5418 0.7546 0.0682 0.3588 0.5097 0.3653 0.3252 0.2465 0.0906 0.2776 0.4852 0.1216 0.02 0.3657 0.2843 0.3016 0.007 0.4731 0.039 0.2089 0.233 ] 2022-08-24 07:26:51 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 07:26:59 [INFO] [TRAIN] epoch: 107, iter: 135050/160000, loss: 0.3887, lr: 0.000189, batch_cost: 0.1681, reader_cost: 0.00298, ips: 47.5773 samples/sec | ETA 01:09:55 2022-08-24 07:27:09 [INFO] [TRAIN] epoch: 107, iter: 135100/160000, loss: 0.3953, lr: 0.000189, batch_cost: 0.1974, reader_cost: 0.00069, ips: 40.5179 samples/sec | ETA 01:21:56 2022-08-24 07:27:20 [INFO] [TRAIN] epoch: 108, iter: 135150/160000, loss: 0.3998, lr: 0.000188, batch_cost: 0.2215, reader_cost: 0.05508, ips: 36.1145 samples/sec | ETA 01:31:44 2022-08-24 07:27:29 [INFO] [TRAIN] epoch: 108, iter: 135200/160000, loss: 0.4096, lr: 0.000188, batch_cost: 0.1668, reader_cost: 0.00057, ips: 47.9504 samples/sec | ETA 01:08:57 2022-08-24 07:27:37 [INFO] [TRAIN] epoch: 108, iter: 135250/160000, loss: 0.3947, lr: 0.000187, batch_cost: 0.1671, reader_cost: 0.00068, ips: 47.8697 samples/sec | ETA 01:08:56 2022-08-24 07:27:46 [INFO] [TRAIN] epoch: 108, iter: 135300/160000, loss: 0.3774, lr: 0.000187, batch_cost: 0.1751, reader_cost: 0.00057, ips: 45.6836 samples/sec | ETA 01:12:05 2022-08-24 07:27:56 [INFO] [TRAIN] epoch: 108, iter: 135350/160000, loss: 0.3902, lr: 0.000187, batch_cost: 0.1971, reader_cost: 0.00088, ips: 40.5847 samples/sec | ETA 01:20:58 2022-08-24 07:28:05 [INFO] [TRAIN] epoch: 108, iter: 135400/160000, loss: 0.3732, lr: 0.000186, batch_cost: 0.1909, reader_cost: 0.00093, ips: 41.9172 samples/sec | ETA 01:18:14 2022-08-24 07:28:15 [INFO] [TRAIN] epoch: 108, iter: 135450/160000, loss: 0.3641, lr: 0.000186, batch_cost: 0.1886, reader_cost: 0.00046, ips: 42.4153 samples/sec | ETA 01:17:10 2022-08-24 07:28:25 [INFO] [TRAIN] epoch: 108, iter: 135500/160000, loss: 0.3906, lr: 0.000185, batch_cost: 0.1969, reader_cost: 0.00060, ips: 40.6258 samples/sec | ETA 01:20:24 2022-08-24 07:28:34 [INFO] [TRAIN] epoch: 108, iter: 135550/160000, loss: 0.4075, lr: 0.000185, batch_cost: 0.1809, reader_cost: 0.00045, ips: 44.2230 samples/sec | ETA 01:13:43 2022-08-24 07:28:43 [INFO] [TRAIN] epoch: 108, iter: 135600/160000, loss: 0.3786, lr: 0.000185, batch_cost: 0.1824, reader_cost: 0.00090, ips: 43.8669 samples/sec | ETA 01:14:09 2022-08-24 07:28:53 [INFO] [TRAIN] epoch: 108, iter: 135650/160000, loss: 0.3988, lr: 0.000184, batch_cost: 0.2118, reader_cost: 0.00050, ips: 37.7632 samples/sec | ETA 01:25:58 2022-08-24 07:29:03 [INFO] [TRAIN] epoch: 108, iter: 135700/160000, loss: 0.4095, lr: 0.000184, batch_cost: 0.1909, reader_cost: 0.00108, ips: 41.9014 samples/sec | ETA 01:17:19 2022-08-24 07:29:12 [INFO] [TRAIN] epoch: 108, iter: 135750/160000, loss: 0.3917, lr: 0.000184, batch_cost: 0.1906, reader_cost: 0.00056, ips: 41.9742 samples/sec | ETA 01:17:01 2022-08-24 07:29:23 [INFO] [TRAIN] epoch: 
108, iter: 135800/160000, loss: 0.4052, lr: 0.000183, batch_cost: 0.2074, reader_cost: 0.00033, ips: 38.5801 samples/sec | ETA 01:23:38 2022-08-24 07:29:32 [INFO] [TRAIN] epoch: 108, iter: 135850/160000, loss: 0.3882, lr: 0.000183, batch_cost: 0.1925, reader_cost: 0.00063, ips: 41.5653 samples/sec | ETA 01:17:28 2022-08-24 07:29:42 [INFO] [TRAIN] epoch: 108, iter: 135900/160000, loss: 0.3821, lr: 0.000182, batch_cost: 0.1864, reader_cost: 0.00042, ips: 42.9162 samples/sec | ETA 01:14:52 2022-08-24 07:29:50 [INFO] [TRAIN] epoch: 108, iter: 135950/160000, loss: 0.4023, lr: 0.000182, batch_cost: 0.1653, reader_cost: 0.00054, ips: 48.3948 samples/sec | ETA 01:06:15 2022-08-24 07:30:01 [INFO] [TRAIN] epoch: 108, iter: 136000/160000, loss: 0.3983, lr: 0.000182, batch_cost: 0.2211, reader_cost: 0.00083, ips: 36.1901 samples/sec | ETA 01:28:25 2022-08-24 07:30:01 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 146s - batch_cost: 0.1459 - reader cost: 0.0012 2022-08-24 07:32:27 [INFO] [EVAL] #Images: 2000 mIoU: 0.3826 Acc: 0.7789 Kappa: 0.7622 Dice: 0.5199 2022-08-24 07:32:27 [INFO] [EVAL] Class IoU: [0.6974 0.7966 0.932 0.7474 0.6949 0.778 0.7753 0.8152 0.5377 0.6501 0.511 0.5624 0.7226 0.3182 0.3338 0.4512 0.504 0.4406 0.6161 0.4482 0.7698 0.4464 0.6374 0.527 0.3305 0.449 0.4548 0.4844 0.4571 0.2721 0.3192 0.5556 0.3278 0.3692 0.3609 0.4351 0.4843 0.6032 0.2996 0.3886 0.1371 0.1307 0.3523 0.2678 0.2467 0.2263 0.3449 0.5351 0.6444 0.5509 0.595 0.3672 0.1658 0.1378 0.6751 0.3373 0.8879 0.4379 0.4118 0.2656 0.0788 0.2268 0.3326 0.1589 0.5033 0.7231 0.2883 0.3837 0.1313 0.3764 0.5154 0.5944 0.3953 0.285 0.5012 0.4227 0.5629 0.2934 0.4467 0.2593 0.7226 0.437 0.446 0.0854 0.1127 0.5235 0.1308 0.1298 0.3877 0.5514 0.4359 0.0785 0.2353 0.1199 0.0307 0.0648 0.1681 0.2009 0.2445 0.3499 0.2573 0.1218 0.2983 0.7487 0.1834 0.6908 0.1261 0.5843 0.0818 0.223 0.2088 0.5062 0.2051 0.6486 0.675 0.062 0.3918 0.6382 0.1612 0.4177 0.4699 0.0794 0.3637 0.2249 0.2997 0.2707 0.5277 0.4551 0.5547 0.3855 0.6068 0.0634 0.23 0.4656 0.2975 0.2034 0.1777 0.0485 0.2265 0.4182 0.0748 0.003 0.2819 0.4076 0.2322 0.0026 0.4234 0.0343 0.1583 0.2161] 2022-08-24 07:32:27 [INFO] [EVAL] Class Precision: [0.7998 0.8784 0.9626 0.8293 0.7755 0.8639 0.869 0.8777 0.6817 0.7452 0.7093 0.7313 0.7885 0.515 0.5267 0.6179 0.6786 0.6925 0.769 0.6275 0.8369 0.6543 0.7703 0.6567 0.4802 0.5774 0.5618 0.7533 0.7319 0.5355 0.4961 0.7482 0.536 0.4915 0.4819 0.5929 0.6779 0.8369 0.4777 0.604 0.3679 0.3127 0.5763 0.4822 0.3552 0.4281 0.6517 0.7115 0.7438 0.6884 0.7511 0.4684 0.351 0.5266 0.722 0.7215 0.9268 0.7144 0.6711 0.4423 0.1279 0.3943 0.4573 0.6795 0.6289 0.8143 0.4454 0.5325 0.2514 0.6451 0.7072 0.7738 0.5641 0.3396 0.7419 0.6343 0.7899 0.6482 0.7474 0.481 0.8469 0.7222 0.7987 0.2438 0.1991 0.7195 0.5124 0.4874 0.8348 0.698 0.6192 0.1013 0.4639 0.3367 0.1242 0.1798 0.5558 0.4074 0.3396 0.7527 0.7236 0.2358 0.5616 0.8957 0.8049 0.7837 0.3049 0.8155 0.1714 0.3764 0.4656 0.6795 0.6487 0.8433 0.6815 0.263 0.6818 0.7485 0.2185 0.5873 0.6949 0.2614 0.6468 0.5869 0.7644 0.6505 0.7889 0.5841 0.7875 0.5685 0.7456 0.3032 0.5908 0.7497 0.6401 0.4254 0.3351 0.1003 0.4444 0.6857 0.2985 0.0065 0.5688 0.6637 0.6244 0.0043 0.8629 0.2016 0.3868 0.6697] 2022-08-24 07:32:27 [INFO] [EVAL] Class Recall: [0.8449 0.8954 0.967 0.8832 0.8698 0.8866 0.8778 0.9197 0.718 0.8359 0.6464 0.7088 0.8963 0.4543 0.4769 0.6257 0.6621 0.5477 0.7561 0.6107 0.9057 0.5842 0.787 0.7274 0.5146 0.6688 0.7047 0.5758 0.549 0.3562 0.4723 0.6834 0.4576 0.5973 
0.5897 0.6205 0.6291 0.6835 0.4455 0.5215 0.1794 0.1834 0.4754 0.3758 0.4469 0.3243 0.4229 0.6833 0.8283 0.7338 0.7412 0.6294 0.2391 0.1573 0.9123 0.3878 0.9548 0.5309 0.5159 0.3994 0.1701 0.348 0.5495 0.1718 0.716 0.8659 0.4498 0.5785 0.2156 0.4747 0.6552 0.7193 0.5692 0.6391 0.607 0.559 0.662 0.349 0.5261 0.36 0.8313 0.5253 0.5025 0.1161 0.2061 0.6578 0.1495 0.1504 0.4199 0.7242 0.5956 0.2589 0.3231 0.157 0.0391 0.092 0.1942 0.2838 0.466 0.3954 0.2854 0.2013 0.3888 0.8202 0.1919 0.8536 0.1769 0.6733 0.1352 0.3536 0.2746 0.6649 0.2307 0.7374 0.9859 0.075 0.4794 0.8124 0.3809 0.5912 0.592 0.1023 0.4538 0.2672 0.3303 0.3168 0.6145 0.6733 0.6523 0.5449 0.7653 0.0742 0.2736 0.5513 0.3573 0.2805 0.2744 0.0859 0.3161 0.5173 0.0908 0.0056 0.3585 0.5137 0.2699 0.0069 0.454 0.0397 0.2114 0.2419] 2022-08-24 07:32:27 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 07:32:36 [INFO] [TRAIN] epoch: 108, iter: 136050/160000, loss: 0.4095, lr: 0.000181, batch_cost: 0.1626, reader_cost: 0.00406, ips: 49.1868 samples/sec | ETA 01:04:55 2022-08-24 07:32:44 [INFO] [TRAIN] epoch: 108, iter: 136100/160000, loss: 0.3820, lr: 0.000181, batch_cost: 0.1668, reader_cost: 0.00121, ips: 47.9545 samples/sec | ETA 01:06:27 2022-08-24 07:32:52 [INFO] [TRAIN] epoch: 108, iter: 136150/160000, loss: 0.3796, lr: 0.000181, batch_cost: 0.1532, reader_cost: 0.00030, ips: 52.2137 samples/sec | ETA 01:00:54 2022-08-24 07:32:59 [INFO] [TRAIN] epoch: 108, iter: 136200/160000, loss: 0.4036, lr: 0.000180, batch_cost: 0.1514, reader_cost: 0.00073, ips: 52.8384 samples/sec | ETA 01:00:03 2022-08-24 07:33:07 [INFO] [TRAIN] epoch: 108, iter: 136250/160000, loss: 0.4098, lr: 0.000180, batch_cost: 0.1507, reader_cost: 0.00031, ips: 53.0699 samples/sec | ETA 00:59:40 2022-08-24 07:33:15 [INFO] [TRAIN] epoch: 108, iter: 136300/160000, loss: 0.4418, lr: 0.000179, batch_cost: 0.1638, reader_cost: 0.00051, ips: 48.8498 samples/sec | ETA 01:04:41 2022-08-24 07:33:24 [INFO] [TRAIN] epoch: 108, iter: 136350/160000, loss: 0.4005, lr: 0.000179, batch_cost: 0.1775, reader_cost: 0.00116, ips: 45.0597 samples/sec | ETA 01:09:58 2022-08-24 07:33:32 [INFO] [TRAIN] epoch: 108, iter: 136400/160000, loss: 0.4216, lr: 0.000179, batch_cost: 0.1645, reader_cost: 0.00109, ips: 48.6182 samples/sec | ETA 01:04:43 2022-08-24 07:33:46 [INFO] [TRAIN] epoch: 109, iter: 136450/160000, loss: 0.3831, lr: 0.000178, batch_cost: 0.2768, reader_cost: 0.08803, ips: 28.9002 samples/sec | ETA 01:48:38 2022-08-24 07:33:57 [INFO] [TRAIN] epoch: 109, iter: 136500/160000, loss: 0.3986, lr: 0.000178, batch_cost: 0.2197, reader_cost: 0.00053, ips: 36.4174 samples/sec | ETA 01:26:02 2022-08-24 07:34:07 [INFO] [TRAIN] epoch: 109, iter: 136550/160000, loss: 0.3936, lr: 0.000178, batch_cost: 0.2050, reader_cost: 0.00056, ips: 39.0257 samples/sec | ETA 01:20:07 2022-08-24 07:34:17 [INFO] [TRAIN] epoch: 109, iter: 136600/160000, loss: 0.4092, lr: 0.000177, batch_cost: 0.2047, reader_cost: 0.00050, ips: 39.0759 samples/sec | ETA 01:19:50 2022-08-24 07:34:27 [INFO] [TRAIN] epoch: 109, iter: 136650/160000, loss: 0.3960, lr: 0.000177, batch_cost: 0.1945, reader_cost: 0.00049, ips: 41.1360 samples/sec | ETA 01:15:41 2022-08-24 07:34:35 [INFO] [TRAIN] epoch: 109, iter: 136700/160000, loss: 0.4279, lr: 0.000176, batch_cost: 0.1686, reader_cost: 0.00056, ips: 47.4465 samples/sec | ETA 01:05:28 2022-08-24 07:34:46 [INFO] [TRAIN] epoch: 109, iter: 136750/160000, loss: 0.3759, lr: 0.000176, batch_cost: 0.2097, reader_cost: 0.00054, ips: 38.1494 
samples/sec | ETA 01:21:15 2022-08-24 07:34:56 [INFO] [TRAIN] epoch: 109, iter: 136800/160000, loss: 0.3959, lr: 0.000176, batch_cost: 0.1932, reader_cost: 0.00107, ips: 41.4125 samples/sec | ETA 01:14:41 2022-08-24 07:35:05 [INFO] [TRAIN] epoch: 109, iter: 136850/160000, loss: 0.3730, lr: 0.000175, batch_cost: 0.1918, reader_cost: 0.00094, ips: 41.7004 samples/sec | ETA 01:14:01 2022-08-24 07:35:15 [INFO] [TRAIN] epoch: 109, iter: 136900/160000, loss: 0.4098, lr: 0.000175, batch_cost: 0.1925, reader_cost: 0.00105, ips: 41.5496 samples/sec | ETA 01:14:07 2022-08-24 07:35:25 [INFO] [TRAIN] epoch: 109, iter: 136950/160000, loss: 0.3861, lr: 0.000175, batch_cost: 0.1982, reader_cost: 0.00054, ips: 40.3662 samples/sec | ETA 01:16:08 2022-08-24 07:35:34 [INFO] [TRAIN] epoch: 109, iter: 137000/160000, loss: 0.4209, lr: 0.000174, batch_cost: 0.1935, reader_cost: 0.00075, ips: 41.3487 samples/sec | ETA 01:14:09 2022-08-24 07:35:34 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 236s - batch_cost: 0.2363 - reader cost: 9.5776e-04 2022-08-24 07:39:31 [INFO] [EVAL] #Images: 2000 mIoU: 0.3765 Acc: 0.7772 Kappa: 0.7602 Dice: 0.5142 2022-08-24 07:39:31 [INFO] [EVAL] Class IoU: [0.6972 0.7956 0.9323 0.7442 0.688 0.776 0.7778 0.8136 0.5295 0.6484 0.5008 0.5653 0.7195 0.3054 0.3198 0.4507 0.5081 0.4418 0.6148 0.4458 0.7682 0.4236 0.6328 0.5162 0.3404 0.4233 0.4742 0.467 0.4502 0.2625 0.2686 0.5617 0.3373 0.3759 0.3727 0.4326 0.4875 0.5972 0.3009 0.3835 0.1145 0.1325 0.3489 0.2619 0.2465 0.2196 0.3311 0.5419 0.6479 0.5458 0.5771 0.3514 0.1631 0.2089 0.6754 0.3362 0.8807 0.4424 0.3617 0.2911 0.0854 0.2164 0.3465 0.1956 0.4902 0.7244 0.2922 0.4001 0.1333 0.3652 0.4896 0.5448 0.3918 0.2773 0.5023 0.4115 0.5454 0.2944 0.2737 0.3092 0.7196 0.4363 0.4043 0.0629 0.1102 0.5231 0.1248 0.1305 0.369 0.5467 0.4208 0.0914 0.221 0.1124 0.0517 0.0784 0.1547 0.1792 0.249 0.2981 0.2136 0.1086 0.2906 0.7741 0.1547 0.6653 0.1565 0.5813 0.0755 0.2335 0.1939 0.4881 0.2004 0.5654 0.704 0.0643 0.4267 0.6231 0.1908 0.4058 0.4472 0.0826 0.3248 0.2159 0.2808 0.2723 0.5204 0.4825 0.541 0.3856 0.5708 0.0471 0.3084 0.4228 0.2992 0.2094 0.1746 0.0559 0.2033 0.3988 0.1444 0.0079 0.2452 0.2406 0.2339 0.0043 0.4413 0.0367 0.1489 0.2069] 2022-08-24 07:39:31 [INFO] [EVAL] Class Precision: [0.7916 0.8648 0.9639 0.831 0.7682 0.8686 0.8721 0.8773 0.6809 0.7618 0.6998 0.7383 0.7869 0.5059 0.5336 0.6082 0.716 0.6918 0.7991 0.6335 0.828 0.6606 0.776 0.6085 0.5048 0.6439 0.5772 0.7658 0.733 0.4339 0.4694 0.7248 0.5787 0.5256 0.4734 0.5958 0.6771 0.8154 0.4655 0.6181 0.3303 0.2969 0.6147 0.5037 0.3332 0.4426 0.643 0.7236 0.7279 0.6885 0.7476 0.4207 0.3234 0.6904 0.7288 0.6994 0.9175 0.6818 0.6847 0.5656 0.1195 0.3899 0.5465 0.6004 0.5958 0.8278 0.4666 0.55 0.2742 0.6353 0.7494 0.6557 0.5302 0.3432 0.7277 0.643 0.7113 0.6049 0.7246 0.6042 0.8665 0.7077 0.8422 0.1764 0.2241 0.6916 0.4012 0.433 0.8274 0.7038 0.5638 0.1174 0.448 0.353 0.2385 0.2149 0.5678 0.3925 0.343 0.734 0.7535 0.1669 0.5733 0.8469 0.768 0.7678 0.3649 0.7883 0.1697 0.3789 0.4568 0.6158 0.6135 0.835 0.7116 0.2857 0.6298 0.7224 0.289 0.5723 0.7003 0.3125 0.6524 0.6214 0.7755 0.6041 0.7573 0.6361 0.7817 0.5308 0.7498 0.3031 0.6019 0.8259 0.6114 0.3553 0.3652 0.1214 0.4832 0.7259 0.3664 0.0139 0.5532 0.6589 0.5474 0.0069 0.8515 0.1989 0.3616 0.6986] 2022-08-24 07:39:31 [INFO] [EVAL] Class Recall: [0.8539 0.9087 0.966 0.8769 0.8683 0.8792 0.878 0.918 0.7043 0.8133 0.6379 0.7071 0.8935 0.4352 0.444 0.6351 0.6363 0.5501 0.7273 0.6007 0.9142 0.5415 0.7742 
0.773 0.511 0.5526 0.7266 0.5447 0.5385 0.3992 0.3857 0.7139 0.4471 0.5688 0.6367 0.6123 0.635 0.6906 0.4598 0.5025 0.1491 0.1931 0.4465 0.353 0.4864 0.3036 0.4057 0.6834 0.8549 0.7248 0.7168 0.681 0.2475 0.2305 0.9022 0.393 0.9564 0.5574 0.4339 0.375 0.2308 0.3273 0.4863 0.2248 0.7344 0.8529 0.4388 0.5947 0.206 0.462 0.5854 0.7631 0.6002 0.5906 0.6185 0.5333 0.7005 0.3646 0.3055 0.3877 0.8093 0.5322 0.4374 0.0891 0.1782 0.6822 0.1533 0.1574 0.3998 0.71 0.6239 0.2914 0.3036 0.1416 0.062 0.1099 0.1753 0.248 0.476 0.3342 0.2296 0.2374 0.3707 0.9 0.1623 0.8329 0.2151 0.6889 0.1198 0.3783 0.252 0.7018 0.2294 0.6365 0.985 0.0766 0.5696 0.8193 0.3595 0.5825 0.5531 0.1009 0.3929 0.2487 0.3057 0.3314 0.6245 0.6664 0.6373 0.5849 0.7051 0.0529 0.3875 0.4641 0.3694 0.3377 0.2507 0.0938 0.2598 0.4695 0.1924 0.0177 0.3057 0.2749 0.29 0.0109 0.4782 0.0431 0.202 0.2272] 2022-08-24 07:39:31 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 07:39:41 [INFO] [TRAIN] epoch: 109, iter: 137050/160000, loss: 0.4087, lr: 0.000174, batch_cost: 0.1844, reader_cost: 0.00681, ips: 43.3912 samples/sec | ETA 01:10:31 2022-08-24 07:39:50 [INFO] [TRAIN] epoch: 109, iter: 137100/160000, loss: 0.4101, lr: 0.000173, batch_cost: 0.1814, reader_cost: 0.00090, ips: 44.1045 samples/sec | ETA 01:09:13 2022-08-24 07:39:59 [INFO] [TRAIN] epoch: 109, iter: 137150/160000, loss: 0.4207, lr: 0.000173, batch_cost: 0.1785, reader_cost: 0.00063, ips: 44.8155 samples/sec | ETA 01:07:58 2022-08-24 07:40:07 [INFO] [TRAIN] epoch: 109, iter: 137200/160000, loss: 0.4320, lr: 0.000173, batch_cost: 0.1777, reader_cost: 0.00081, ips: 45.0154 samples/sec | ETA 01:07:31 2022-08-24 07:40:15 [INFO] [TRAIN] epoch: 109, iter: 137250/160000, loss: 0.3949, lr: 0.000172, batch_cost: 0.1607, reader_cost: 0.00079, ips: 49.7951 samples/sec | ETA 01:00:54 2022-08-24 07:40:24 [INFO] [TRAIN] epoch: 109, iter: 137300/160000, loss: 0.4066, lr: 0.000172, batch_cost: 0.1667, reader_cost: 0.00058, ips: 47.9888 samples/sec | ETA 01:03:04 2022-08-24 07:40:32 [INFO] [TRAIN] epoch: 109, iter: 137350/160000, loss: 0.4029, lr: 0.000171, batch_cost: 0.1613, reader_cost: 0.00038, ips: 49.5861 samples/sec | ETA 01:00:54 2022-08-24 07:40:41 [INFO] [TRAIN] epoch: 109, iter: 137400/160000, loss: 0.4097, lr: 0.000171, batch_cost: 0.1827, reader_cost: 0.00062, ips: 43.7791 samples/sec | ETA 01:08:49 2022-08-24 07:40:51 [INFO] [TRAIN] epoch: 109, iter: 137450/160000, loss: 0.3792, lr: 0.000171, batch_cost: 0.2060, reader_cost: 0.00081, ips: 38.8319 samples/sec | ETA 01:17:25 2022-08-24 07:41:01 [INFO] [TRAIN] epoch: 109, iter: 137500/160000, loss: 0.4142, lr: 0.000170, batch_cost: 0.1945, reader_cost: 0.00073, ips: 41.1342 samples/sec | ETA 01:12:55 2022-08-24 07:41:11 [INFO] [TRAIN] epoch: 109, iter: 137550/160000, loss: 0.4257, lr: 0.000170, batch_cost: 0.2084, reader_cost: 0.00147, ips: 38.3876 samples/sec | ETA 01:17:58 2022-08-24 07:41:20 [INFO] [TRAIN] epoch: 109, iter: 137600/160000, loss: 0.4234, lr: 0.000170, batch_cost: 0.1700, reader_cost: 0.00065, ips: 47.0465 samples/sec | ETA 01:03:28 2022-08-24 07:41:28 [INFO] [TRAIN] epoch: 109, iter: 137650/160000, loss: 0.3763, lr: 0.000169, batch_cost: 0.1683, reader_cost: 0.00043, ips: 47.5414 samples/sec | ETA 01:02:40 2022-08-24 07:41:41 [INFO] [TRAIN] epoch: 110, iter: 137700/160000, loss: 0.3842, lr: 0.000169, batch_cost: 0.2464, reader_cost: 0.04879, ips: 32.4654 samples/sec | ETA 01:31:35 2022-08-24 07:41:50 [INFO] [TRAIN] epoch: 110, iter: 137750/160000, loss: 0.3775, lr: 
0.000168, batch_cost: 0.1928, reader_cost: 0.00076, ips: 41.4858 samples/sec | ETA 01:11:30 2022-08-24 07:42:00 [INFO] [TRAIN] epoch: 110, iter: 137800/160000, loss: 0.4368, lr: 0.000168, batch_cost: 0.1910, reader_cost: 0.00076, ips: 41.8930 samples/sec | ETA 01:10:39 2022-08-24 07:42:10 [INFO] [TRAIN] epoch: 110, iter: 137850/160000, loss: 0.3918, lr: 0.000168, batch_cost: 0.1978, reader_cost: 0.00101, ips: 40.4538 samples/sec | ETA 01:13:00 2022-08-24 07:42:20 [INFO] [TRAIN] epoch: 110, iter: 137900/160000, loss: 0.3717, lr: 0.000167, batch_cost: 0.1941, reader_cost: 0.00043, ips: 41.2155 samples/sec | ETA 01:11:29 2022-08-24 07:42:30 [INFO] [TRAIN] epoch: 110, iter: 137950/160000, loss: 0.4209, lr: 0.000167, batch_cost: 0.2084, reader_cost: 0.00113, ips: 38.3912 samples/sec | ETA 01:16:34 2022-08-24 07:42:40 [INFO] [TRAIN] epoch: 110, iter: 138000/160000, loss: 0.3962, lr: 0.000167, batch_cost: 0.2028, reader_cost: 0.00046, ips: 39.4521 samples/sec | ETA 01:14:21 2022-08-24 07:42:40 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 142s - batch_cost: 0.1419 - reader cost: 5.8159e-04 2022-08-24 07:45:02 [INFO] [EVAL] #Images: 2000 mIoU: 0.3786 Acc: 0.7775 Kappa: 0.7605 Dice: 0.5163 2022-08-24 07:45:02 [INFO] [EVAL] Class IoU: [0.6954 0.7981 0.9321 0.7452 0.6885 0.7783 0.7877 0.8085 0.5283 0.6449 0.4907 0.5786 0.7225 0.306 0.3295 0.4411 0.4968 0.433 0.6091 0.4395 0.7709 0.4429 0.6241 0.5247 0.3393 0.4191 0.4455 0.4756 0.4575 0.2716 0.3128 0.5533 0.3255 0.3606 0.3722 0.4154 0.4885 0.6019 0.3059 0.3739 0.1201 0.1253 0.3514 0.2517 0.2463 0.2488 0.349 0.5422 0.6527 0.5204 0.5961 0.3816 0.1742 0.1885 0.6779 0.3419 0.89 0.418 0.4145 0.2775 0.0939 0.2201 0.3376 0.1988 0.5085 0.723 0.2772 0.38 0.1291 0.3848 0.4974 0.5917 0.3707 0.2721 0.498 0.4007 0.531 0.2999 0.281 0.2705 0.7203 0.4395 0.442 0.0735 0.1251 0.519 0.1156 0.1282 0.347 0.5425 0.4297 0.0884 0.213 0.1469 0.0428 0.0685 0.1616 0.1771 0.2477 0.3028 0.2077 0.0999 0.2856 0.7702 0.1845 0.6636 0.1146 0.566 0.0774 0.2464 0.2154 0.4648 0.1957 0.5742 0.713 0.0744 0.4292 0.6422 0.1669 0.4071 0.3804 0.0782 0.3365 0.1985 0.2936 0.2427 0.5289 0.4853 0.5808 0.3997 0.6101 0.0692 0.3101 0.4237 0.3167 0.2026 0.1765 0.0431 0.2031 0.4257 0.1786 0.0032 0.2501 0.3795 0.2333 0.0049 0.4389 0.037 0.1518 0.2034] 2022-08-24 07:45:02 [INFO] [EVAL] Class Precision: [0.7922 0.8675 0.965 0.8328 0.7611 0.8699 0.8898 0.8605 0.6827 0.758 0.6884 0.7333 0.7972 0.4933 0.5448 0.6313 0.7093 0.7021 0.7568 0.6422 0.837 0.6557 0.7412 0.6368 0.4883 0.6156 0.5571 0.7515 0.7029 0.4661 0.4848 0.699 0.6195 0.485 0.5092 0.5362 0.6859 0.8084 0.4461 0.6263 0.3216 0.3758 0.6021 0.5187 0.3469 0.4912 0.6873 0.7529 0.7513 0.6447 0.7541 0.4727 0.317 0.5853 0.7163 0.6344 0.9463 0.7054 0.6875 0.487 0.1362 0.4288 0.4999 0.5646 0.6411 0.8116 0.4411 0.498 0.2465 0.7143 0.7317 0.7447 0.5546 0.3312 0.7382 0.6125 0.7669 0.6493 0.7382 0.5147 0.8524 0.692 0.7993 0.1904 0.2286 0.7229 0.4792 0.4739 0.7782 0.6791 0.6074 0.1233 0.4256 0.3607 0.147 0.213 0.5584 0.3676 0.3486 0.7655 0.7002 0.1585 0.5508 0.8678 0.7435 0.7779 0.2818 0.8007 0.1685 0.3874 0.488 0.6031 0.6802 0.7923 0.7202 0.3014 0.7354 0.7675 0.2351 0.5575 0.7319 0.2851 0.7161 0.6225 0.7653 0.6384 0.7715 0.6449 0.8721 0.6479 0.7232 0.4002 0.6307 0.8569 0.6226 0.4266 0.3707 0.1439 0.4758 0.695 0.3452 0.0067 0.5357 0.6619 0.5284 0.0074 0.8077 0.1584 0.3909 0.6972] 2022-08-24 07:45:02 [INFO] [EVAL] Class Recall: [0.8504 0.909 0.9647 0.8763 0.8784 0.8809 0.8729 0.9304 0.7002 0.8121 0.6308 0.7327 0.8852 0.4462 0.4546 
0.5942 0.6238 0.5304 0.7573 0.5821 0.9072 0.5772 0.7981 0.7488 0.5264 0.5676 0.6899 0.5643 0.5671 0.3943 0.4686 0.7264 0.4068 0.5843 0.5803 0.6484 0.6293 0.702 0.4933 0.4812 0.1609 0.1582 0.4577 0.3284 0.4594 0.3352 0.4148 0.6595 0.8326 0.7297 0.7399 0.6646 0.2789 0.2175 0.9268 0.4257 0.9374 0.5064 0.5108 0.3921 0.2322 0.3114 0.5098 0.2349 0.7109 0.8689 0.4272 0.6159 0.2132 0.4548 0.6083 0.7422 0.5277 0.6039 0.6049 0.5368 0.6333 0.3578 0.3121 0.3631 0.823 0.5464 0.4972 0.1069 0.2164 0.6479 0.1321 0.1495 0.3851 0.7295 0.5949 0.2379 0.299 0.1985 0.0569 0.0917 0.1852 0.2547 0.461 0.3337 0.228 0.2127 0.3723 0.8725 0.197 0.8187 0.1619 0.6588 0.1252 0.4036 0.2783 0.6696 0.2155 0.676 0.9861 0.0899 0.5076 0.7973 0.3653 0.6015 0.442 0.0973 0.3883 0.2257 0.3226 0.2814 0.6271 0.6622 0.6349 0.5106 0.796 0.0772 0.3789 0.456 0.3919 0.2784 0.252 0.0579 0.2616 0.5234 0.27 0.0059 0.3193 0.4707 0.2947 0.0146 0.4901 0.0461 0.1988 0.2231] 2022-08-24 07:45:02 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 07:45:11 [INFO] [TRAIN] epoch: 110, iter: 138050/160000, loss: 0.4068, lr: 0.000166, batch_cost: 0.1643, reader_cost: 0.00410, ips: 48.6983 samples/sec | ETA 01:00:05 2022-08-24 07:45:19 [INFO] [TRAIN] epoch: 110, iter: 138100/160000, loss: 0.4086, lr: 0.000166, batch_cost: 0.1561, reader_cost: 0.00112, ips: 51.2600 samples/sec | ETA 00:56:57 2022-08-24 07:45:27 [INFO] [TRAIN] epoch: 110, iter: 138150/160000, loss: 0.4023, lr: 0.000165, batch_cost: 0.1628, reader_cost: 0.00084, ips: 49.1547 samples/sec | ETA 00:59:16 2022-08-24 07:45:35 [INFO] [TRAIN] epoch: 110, iter: 138200/160000, loss: 0.4242, lr: 0.000165, batch_cost: 0.1704, reader_cost: 0.00106, ips: 46.9522 samples/sec | ETA 01:01:54 2022-08-24 07:45:44 [INFO] [TRAIN] epoch: 110, iter: 138250/160000, loss: 0.3846, lr: 0.000165, batch_cost: 0.1788, reader_cost: 0.00052, ips: 44.7334 samples/sec | ETA 01:04:49 2022-08-24 07:45:52 [INFO] [TRAIN] epoch: 110, iter: 138300/160000, loss: 0.4137, lr: 0.000164, batch_cost: 0.1581, reader_cost: 0.00070, ips: 50.6030 samples/sec | ETA 00:57:10 2022-08-24 07:46:01 [INFO] [TRAIN] epoch: 110, iter: 138350/160000, loss: 0.3920, lr: 0.000164, batch_cost: 0.1779, reader_cost: 0.00106, ips: 44.9654 samples/sec | ETA 01:04:11 2022-08-24 07:46:10 [INFO] [TRAIN] epoch: 110, iter: 138400/160000, loss: 0.4089, lr: 0.000164, batch_cost: 0.1769, reader_cost: 0.00032, ips: 45.2189 samples/sec | ETA 01:03:41 2022-08-24 07:46:19 [INFO] [TRAIN] epoch: 110, iter: 138450/160000, loss: 0.3864, lr: 0.000163, batch_cost: 0.1859, reader_cost: 0.00054, ips: 43.0249 samples/sec | ETA 01:06:46 2022-08-24 07:46:29 [INFO] [TRAIN] epoch: 110, iter: 138500/160000, loss: 0.4026, lr: 0.000163, batch_cost: 0.1973, reader_cost: 0.00089, ips: 40.5553 samples/sec | ETA 01:10:41 2022-08-24 07:46:38 [INFO] [TRAIN] epoch: 110, iter: 138550/160000, loss: 0.4145, lr: 0.000162, batch_cost: 0.1777, reader_cost: 0.00064, ips: 45.0157 samples/sec | ETA 01:03:32 2022-08-24 07:46:47 [INFO] [TRAIN] epoch: 110, iter: 138600/160000, loss: 0.3943, lr: 0.000162, batch_cost: 0.1841, reader_cost: 0.00060, ips: 43.4632 samples/sec | ETA 01:05:38 2022-08-24 07:46:58 [INFO] [TRAIN] epoch: 110, iter: 138650/160000, loss: 0.4159, lr: 0.000162, batch_cost: 0.2185, reader_cost: 0.00054, ips: 36.6131 samples/sec | ETA 01:17:44 2022-08-24 07:47:07 [INFO] [TRAIN] epoch: 110, iter: 138700/160000, loss: 0.3946, lr: 0.000161, batch_cost: 0.1740, reader_cost: 0.00142, ips: 45.9750 samples/sec | ETA 01:01:46 2022-08-24 07:47:16 
[INFO] [TRAIN] epoch: 110, iter: 138750/160000, loss: 0.3992, lr: 0.000161, batch_cost: 0.1882, reader_cost: 0.00086, ips: 42.5126 samples/sec | ETA 01:06:38 2022-08-24 07:47:26 [INFO] [TRAIN] epoch: 110, iter: 138800/160000, loss: 0.4148, lr: 0.000161, batch_cost: 0.1895, reader_cost: 0.00065, ips: 42.2070 samples/sec | ETA 01:06:58 2022-08-24 07:47:35 [INFO] [TRAIN] epoch: 110, iter: 138850/160000, loss: 0.3755, lr: 0.000160, batch_cost: 0.1969, reader_cost: 0.00065, ips: 40.6397 samples/sec | ETA 01:09:23 2022-08-24 07:47:44 [INFO] [TRAIN] epoch: 110, iter: 138900/160000, loss: 0.4036, lr: 0.000160, batch_cost: 0.1703, reader_cost: 0.00042, ips: 46.9806 samples/sec | ETA 00:59:52 2022-08-24 07:47:57 [INFO] [TRAIN] epoch: 111, iter: 138950/160000, loss: 0.3839, lr: 0.000159, batch_cost: 0.2662, reader_cost: 0.07735, ips: 30.0509 samples/sec | ETA 01:33:23 2022-08-24 07:48:07 [INFO] [TRAIN] epoch: 111, iter: 139000/160000, loss: 0.3761, lr: 0.000159, batch_cost: 0.1941, reader_cost: 0.00309, ips: 41.2065 samples/sec | ETA 01:07:57 2022-08-24 07:48:07 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 149s - batch_cost: 0.1494 - reader cost: 6.3005e-04 2022-08-24 07:50:37 [INFO] [EVAL] #Images: 2000 mIoU: 0.3759 Acc: 0.7776 Kappa: 0.7607 Dice: 0.5129 2022-08-24 07:50:37 [INFO] [EVAL] Class IoU: [0.7 0.7945 0.9313 0.7483 0.695 0.7763 0.7818 0.8114 0.5311 0.6546 0.4971 0.5666 0.719 0.2896 0.3157 0.4467 0.5162 0.4428 0.6185 0.45 0.7753 0.4438 0.6359 0.5199 0.3204 0.4089 0.4481 0.4664 0.4684 0.2875 0.2676 0.5614 0.3347 0.361 0.3558 0.4287 0.4826 0.6035 0.313 0.3715 0.1387 0.1442 0.3585 0.2587 0.2324 0.2438 0.3308 0.5351 0.6327 0.5448 0.5793 0.3874 0.1638 0.1213 0.6825 0.3407 0.8826 0.4399 0.376 0.2557 0.0686 0.2297 0.3284 0.1671 0.5041 0.7356 0.2675 0.3825 0.1232 0.382 0.4754 0.5497 0.399 0.2444 0.4996 0.3973 0.4712 0.3084 0.2801 0.2452 0.7282 0.4389 0.4493 0.053 0.1963 0.5311 0.1176 0.1336 0.3409 0.5513 0.4285 0.0863 0.2222 0.1224 0.0232 0.0709 0.1571 0.1682 0.2443 0.2941 0.2334 0.1146 0.2889 0.7823 0.1902 0.6893 0.1069 0.5637 0.0742 0.2237 0.2053 0.5089 0.1982 0.5839 0.7012 0.0559 0.4127 0.6262 0.1647 0.4151 0.4201 0.0804 0.3359 0.2143 0.2797 0.2412 0.5129 0.4623 0.5484 0.3902 0.6103 0.0427 0.283 0.4373 0.3392 0.2048 0.1763 0.062 0.2172 0.4127 0.1451 0.0162 0.2824 0.2759 0.2141 0.0045 0.4357 0.0375 0.1513 0.2066] 2022-08-24 07:50:37 [INFO] [EVAL] Class Precision: [0.803 0.8521 0.9629 0.8414 0.7824 0.8669 0.8825 0.8675 0.6479 0.7631 0.6883 0.7099 0.7911 0.5319 0.5559 0.6092 0.693 0.6847 0.7944 0.6294 0.8443 0.6597 0.7814 0.6284 0.4525 0.6703 0.5504 0.7793 0.7301 0.4238 0.5031 0.6967 0.5646 0.4962 0.4823 0.5839 0.7067 0.827 0.4809 0.6292 0.3298 0.3109 0.6009 0.4838 0.3112 0.4572 0.6029 0.7157 0.726 0.6822 0.7905 0.4676 0.3176 0.5942 0.7228 0.7282 0.9248 0.6471 0.6794 0.4081 0.1068 0.4815 0.4406 0.6537 0.6086 0.8552 0.4066 0.5386 0.2303 0.6753 0.7221 0.6658 0.536 0.3417 0.7348 0.6149 0.708 0.6754 0.7535 0.469 0.8589 0.6575 0.7918 0.1613 0.3059 0.703 0.4759 0.4366 0.7894 0.7259 0.5918 0.1056 0.4709 0.3498 0.0726 0.2171 0.6001 0.3875 0.3278 0.6418 0.7771 0.1832 0.5696 0.8411 0.7412 0.79 0.2648 0.7786 0.1582 0.3572 0.4308 0.6965 0.5923 0.7733 0.708 0.2549 0.7376 0.729 0.2618 0.5574 0.7129 0.2672 0.6443 0.6035 0.7641 0.6252 0.7116 0.5744 0.7867 0.6265 0.7381 0.3515 0.6624 0.7999 0.5664 0.363 0.3671 0.1678 0.451 0.6723 0.3368 0.0309 0.5342 0.6402 0.5574 0.007 0.8251 0.1935 0.3997 0.6986] 2022-08-24 07:50:37 [INFO] [EVAL] Class Recall: [0.8451 0.9217 0.966 0.8712 0.8615 
0.8814 0.8726 0.9262 0.7465 0.8214 0.6416 0.7374 0.8874 0.3885 0.4222 0.626 0.6692 0.5562 0.7364 0.6123 0.9047 0.5755 0.7735 0.7506 0.5232 0.5118 0.7068 0.5373 0.5664 0.4719 0.3637 0.7431 0.4511 0.5699 0.5756 0.6173 0.6035 0.6908 0.4727 0.4756 0.1932 0.212 0.4705 0.3572 0.4785 0.3431 0.423 0.6796 0.8311 0.7301 0.6844 0.693 0.2527 0.1323 0.9245 0.3904 0.9509 0.5787 0.4571 0.4065 0.161 0.3052 0.5633 0.1833 0.7459 0.8403 0.4389 0.5689 0.2093 0.4679 0.5819 0.7591 0.6095 0.462 0.6094 0.5289 0.5849 0.362 0.3083 0.3393 0.8271 0.569 0.5094 0.0732 0.3541 0.6848 0.1351 0.1614 0.375 0.6963 0.6084 0.3212 0.2961 0.1584 0.0331 0.0952 0.1755 0.2292 0.4898 0.3518 0.2502 0.2342 0.3697 0.9179 0.2038 0.844 0.152 0.6714 0.1225 0.3744 0.2816 0.6539 0.2295 0.7046 0.9863 0.0669 0.4837 0.8162 0.3073 0.6193 0.5057 0.1031 0.4123 0.2494 0.3062 0.282 0.6475 0.7032 0.6442 0.5085 0.779 0.0463 0.3307 0.4911 0.4582 0.3197 0.2533 0.0894 0.2952 0.5166 0.2031 0.0328 0.3747 0.3265 0.258 0.0121 0.4801 0.0444 0.1958 0.2268] 2022-08-24 07:50:37 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 07:50:47 [INFO] [TRAIN] epoch: 111, iter: 139050/160000, loss: 0.3783, lr: 0.000159, batch_cost: 0.2001, reader_cost: 0.00522, ips: 39.9821 samples/sec | ETA 01:09:51 2022-08-24 07:50:56 [INFO] [TRAIN] epoch: 111, iter: 139100/160000, loss: 0.4196, lr: 0.000158, batch_cost: 0.1750, reader_cost: 0.00123, ips: 45.7060 samples/sec | ETA 01:00:58 2022-08-24 07:51:04 [INFO] [TRAIN] epoch: 111, iter: 139150/160000, loss: 0.4104, lr: 0.000158, batch_cost: 0.1750, reader_cost: 0.00067, ips: 45.7043 samples/sec | ETA 01:00:49 2022-08-24 07:51:13 [INFO] [TRAIN] epoch: 111, iter: 139200/160000, loss: 0.4024, lr: 0.000157, batch_cost: 0.1699, reader_cost: 0.00055, ips: 47.0955 samples/sec | ETA 00:58:53 2022-08-24 07:51:21 [INFO] [TRAIN] epoch: 111, iter: 139250/160000, loss: 0.3876, lr: 0.000157, batch_cost: 0.1697, reader_cost: 0.00080, ips: 47.1381 samples/sec | ETA 00:58:41 2022-08-24 07:51:31 [INFO] [TRAIN] epoch: 111, iter: 139300/160000, loss: 0.4575, lr: 0.000157, batch_cost: 0.1901, reader_cost: 0.00040, ips: 42.0840 samples/sec | ETA 01:05:34 2022-08-24 07:51:40 [INFO] [TRAIN] epoch: 111, iter: 139350/160000, loss: 0.4339, lr: 0.000156, batch_cost: 0.1758, reader_cost: 0.00071, ips: 45.5067 samples/sec | ETA 01:00:30 2022-08-24 07:51:48 [INFO] [TRAIN] epoch: 111, iter: 139400/160000, loss: 0.4024, lr: 0.000156, batch_cost: 0.1783, reader_cost: 0.00070, ips: 44.8662 samples/sec | ETA 01:01:13 2022-08-24 07:51:59 [INFO] [TRAIN] epoch: 111, iter: 139450/160000, loss: 0.3986, lr: 0.000156, batch_cost: 0.2014, reader_cost: 0.00065, ips: 39.7266 samples/sec | ETA 01:08:58 2022-08-24 07:52:07 [INFO] [TRAIN] epoch: 111, iter: 139500/160000, loss: 0.3859, lr: 0.000155, batch_cost: 0.1784, reader_cost: 0.00122, ips: 44.8454 samples/sec | ETA 01:00:57 2022-08-24 07:52:17 [INFO] [TRAIN] epoch: 111, iter: 139550/160000, loss: 0.4080, lr: 0.000155, batch_cost: 0.1935, reader_cost: 0.00048, ips: 41.3392 samples/sec | ETA 01:05:57 2022-08-24 07:52:27 [INFO] [TRAIN] epoch: 111, iter: 139600/160000, loss: 0.4172, lr: 0.000154, batch_cost: 0.2054, reader_cost: 0.00036, ips: 38.9417 samples/sec | ETA 01:09:50 2022-08-24 07:52:37 [INFO] [TRAIN] epoch: 111, iter: 139650/160000, loss: 0.3985, lr: 0.000154, batch_cost: 0.1936, reader_cost: 0.00064, ips: 41.3225 samples/sec | ETA 01:05:39 2022-08-24 07:52:46 [INFO] [TRAIN] epoch: 111, iter: 139700/160000, loss: 0.3971, lr: 0.000154, batch_cost: 0.1797, reader_cost: 
0.00047, ips: 44.5184 samples/sec | ETA 01:00:47 2022-08-24 07:52:56 [INFO] [TRAIN] epoch: 111, iter: 139750/160000, loss: 0.3924, lr: 0.000153, batch_cost: 0.1978, reader_cost: 0.00125, ips: 40.4531 samples/sec | ETA 01:06:44 2022-08-24 07:53:06 [INFO] [TRAIN] epoch: 111, iter: 139800/160000, loss: 0.3742, lr: 0.000153, batch_cost: 0.1985, reader_cost: 0.00051, ips: 40.3080 samples/sec | ETA 01:06:49 2022-08-24 07:53:16 [INFO] [TRAIN] epoch: 111, iter: 139850/160000, loss: 0.3872, lr: 0.000153, batch_cost: 0.1959, reader_cost: 0.00077, ips: 40.8450 samples/sec | ETA 01:05:46 2022-08-24 07:53:26 [INFO] [TRAIN] epoch: 111, iter: 139900/160000, loss: 0.4029, lr: 0.000152, batch_cost: 0.2002, reader_cost: 0.00080, ips: 39.9673 samples/sec | ETA 01:07:03 2022-08-24 07:53:36 [INFO] [TRAIN] epoch: 111, iter: 139950/160000, loss: 0.4006, lr: 0.000152, batch_cost: 0.2046, reader_cost: 0.00047, ips: 39.1021 samples/sec | ETA 01:08:22 2022-08-24 07:53:46 [INFO] [TRAIN] epoch: 111, iter: 140000/160000, loss: 0.3930, lr: 0.000151, batch_cost: 0.1924, reader_cost: 0.00060, ips: 41.5820 samples/sec | ETA 01:04:07 2022-08-24 07:53:46 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 183s - batch_cost: 0.1826 - reader cost: 9.5692e-04 2022-08-24 07:56:48 [INFO] [EVAL] #Images: 2000 mIoU: 0.3743 Acc: 0.7766 Kappa: 0.7594 Dice: 0.5117 2022-08-24 07:56:48 [INFO] [EVAL] Class IoU: [0.6981 0.7963 0.9319 0.7434 0.7027 0.771 0.7811 0.8142 0.5397 0.6316 0.4943 0.5768 0.7237 0.2924 0.3175 0.4455 0.4883 0.4544 0.6085 0.4381 0.7728 0.4263 0.634 0.5179 0.3413 0.4028 0.4656 0.456 0.447 0.2435 0.2734 0.5496 0.3283 0.3664 0.3657 0.422 0.487 0.6117 0.2964 0.3755 0.1294 0.1307 0.3579 0.2542 0.2359 0.2365 0.3489 0.5446 0.6291 0.5361 0.5808 0.3783 0.1734 0.1961 0.6861 0.3756 0.8905 0.4538 0.3739 0.2833 0.0874 0.2384 0.3294 0.1605 0.5133 0.7429 0.2546 0.3872 0.1262 0.3837 0.4892 0.5685 0.3926 0.2551 0.5028 0.4056 0.4824 0.3067 0.2656 0.1816 0.7398 0.425 0.4383 0.0636 0.1359 0.5208 0.1209 0.1335 0.3296 0.5538 0.4524 0.0953 0.2298 0.1213 0.0361 0.0749 0.1709 0.1838 0.2814 0.3071 0.2366 0.1009 0.2814 0.7199 0.1905 0.6428 0.1335 0.5619 0.0775 0.2089 0.1813 0.4662 0.206 0.5485 0.7033 0.0724 0.4408 0.6281 0.1663 0.3756 0.4277 0.0811 0.2936 0.1988 0.2881 0.26 0.5228 0.4764 0.5773 0.3982 0.5365 0.0391 0.2854 0.4381 0.3026 0.2084 0.1756 0.0407 0.2104 0.3946 0.1489 0.0413 0.2426 0.2519 0.2247 0.0034 0.4433 0.0352 0.1381 0.2002] 2022-08-24 07:56:48 [INFO] [EVAL] Class Precision: [0.7875 0.8713 0.9642 0.8251 0.784 0.8842 0.8876 0.8694 0.6814 0.7347 0.708 0.7094 0.8048 0.4995 0.5703 0.6063 0.7208 0.6689 0.7595 0.6331 0.8392 0.6557 0.7798 0.6345 0.4824 0.6039 0.5613 0.7994 0.7359 0.382 0.4623 0.6958 0.562 0.4941 0.4613 0.5662 0.6896 0.8117 0.4091 0.6165 0.3542 0.3174 0.5985 0.4765 0.3249 0.5112 0.6691 0.7208 0.7336 0.6706 0.7504 0.4512 0.3609 0.5383 0.7273 0.6537 0.936 0.6598 0.6853 0.4967 0.1279 0.459 0.4621 0.6939 0.6449 0.8528 0.4624 0.5483 0.241 0.6886 0.7184 0.7121 0.5847 0.3479 0.74 0.6259 0.7224 0.7001 0.7778 0.4916 0.8776 0.7037 0.7982 0.1637 0.2302 0.7362 0.575 0.4308 0.7656 0.7064 0.6541 0.124 0.4661 0.3644 0.1071 0.2057 0.5394 0.3862 0.4257 0.7207 0.6928 0.1495 0.5667 0.7751 0.8183 0.704 0.3477 0.7515 0.187 0.3577 0.4417 0.6176 0.5667 0.7829 0.7106 0.2918 0.7422 0.7357 0.2342 0.5793 0.6858 0.2705 0.591 0.6441 0.7378 0.6226 0.7798 0.6247 0.8639 0.5852 0.7497 0.3213 0.6007 0.8449 0.6032 0.3801 0.3836 0.0987 0.4467 0.7127 0.3541 0.0649 0.5554 0.6191 0.5771 0.0048 0.8538 0.2369 0.3352 0.7536] 
2022-08-24 07:56:48 [INFO] [EVAL] Class Recall: [0.8601 0.9024 0.9652 0.8825 0.8713 0.8576 0.8669 0.9277 0.7218 0.8183 0.6209 0.7552 0.8779 0.4136 0.4174 0.6269 0.6022 0.5862 0.7538 0.5871 0.9071 0.5492 0.7722 0.7381 0.5384 0.5474 0.7321 0.5149 0.5324 0.4018 0.4009 0.7235 0.4411 0.5862 0.6384 0.6237 0.6237 0.7129 0.5184 0.4899 0.1694 0.1818 0.4709 0.3526 0.4627 0.3056 0.4217 0.6902 0.8153 0.7278 0.7198 0.7008 0.2503 0.2358 0.9236 0.4689 0.9482 0.5924 0.4514 0.3974 0.2163 0.3315 0.5343 0.1727 0.7155 0.8522 0.3617 0.5685 0.2094 0.4643 0.6053 0.7382 0.5444 0.4889 0.6107 0.5354 0.5921 0.3531 0.2874 0.2236 0.8249 0.5176 0.493 0.0942 0.2492 0.6403 0.1327 0.1621 0.3666 0.7194 0.5947 0.2923 0.3119 0.1538 0.0516 0.1054 0.2002 0.2597 0.4535 0.3486 0.2643 0.2368 0.3585 0.91 0.1989 0.8809 0.1782 0.6901 0.1168 0.3344 0.2352 0.6554 0.2446 0.6469 0.9856 0.0878 0.5204 0.8111 0.3647 0.5166 0.532 0.1038 0.3685 0.2233 0.321 0.3086 0.6133 0.6674 0.6351 0.5549 0.6535 0.0427 0.3523 0.4764 0.3778 0.3157 0.2447 0.0647 0.2845 0.4693 0.2045 0.1019 0.3011 0.2981 0.269 0.0123 0.4797 0.0397 0.1902 0.2142] 2022-08-24 07:56:49 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 07:56:57 [INFO] [TRAIN] epoch: 111, iter: 140050/160000, loss: 0.4221, lr: 0.000151, batch_cost: 0.1618, reader_cost: 0.00336, ips: 49.4579 samples/sec | ETA 00:53:46 2022-08-24 07:57:05 [INFO] [TRAIN] epoch: 111, iter: 140100/160000, loss: 0.3559, lr: 0.000151, batch_cost: 0.1593, reader_cost: 0.00125, ips: 50.2229 samples/sec | ETA 00:52:49 2022-08-24 07:57:13 [INFO] [TRAIN] epoch: 111, iter: 140150/160000, loss: 0.3848, lr: 0.000150, batch_cost: 0.1634, reader_cost: 0.00061, ips: 48.9466 samples/sec | ETA 00:54:04 2022-08-24 07:57:25 [INFO] [TRAIN] epoch: 112, iter: 140200/160000, loss: 0.3885, lr: 0.000150, batch_cost: 0.2332, reader_cost: 0.06584, ips: 34.3007 samples/sec | ETA 01:16:57 2022-08-24 07:57:34 [INFO] [TRAIN] epoch: 112, iter: 140250/160000, loss: 0.3883, lr: 0.000150, batch_cost: 0.1842, reader_cost: 0.00135, ips: 43.4202 samples/sec | ETA 01:00:38 2022-08-24 07:57:42 [INFO] [TRAIN] epoch: 112, iter: 140300/160000, loss: 0.3875, lr: 0.000149, batch_cost: 0.1699, reader_cost: 0.00070, ips: 47.0968 samples/sec | ETA 00:55:46 2022-08-24 07:57:51 [INFO] [TRAIN] epoch: 112, iter: 140350/160000, loss: 0.4048, lr: 0.000149, batch_cost: 0.1812, reader_cost: 0.00029, ips: 44.1548 samples/sec | ETA 00:59:20 2022-08-24 07:58:01 [INFO] [TRAIN] epoch: 112, iter: 140400/160000, loss: 0.3703, lr: 0.000148, batch_cost: 0.1985, reader_cost: 0.00126, ips: 40.3123 samples/sec | ETA 01:04:49 2022-08-24 07:58:11 [INFO] [TRAIN] epoch: 112, iter: 140450/160000, loss: 0.3918, lr: 0.000148, batch_cost: 0.1928, reader_cost: 0.00116, ips: 41.4942 samples/sec | ETA 01:02:49 2022-08-24 07:58:20 [INFO] [TRAIN] epoch: 112, iter: 140500/160000, loss: 0.4096, lr: 0.000148, batch_cost: 0.1791, reader_cost: 0.00040, ips: 44.6578 samples/sec | ETA 00:58:13 2022-08-24 07:58:29 [INFO] [TRAIN] epoch: 112, iter: 140550/160000, loss: 0.3840, lr: 0.000147, batch_cost: 0.1801, reader_cost: 0.00035, ips: 44.4253 samples/sec | ETA 00:58:22 2022-08-24 07:58:38 [INFO] [TRAIN] epoch: 112, iter: 140600/160000, loss: 0.3685, lr: 0.000147, batch_cost: 0.1792, reader_cost: 0.00056, ips: 44.6432 samples/sec | ETA 00:57:56 2022-08-24 07:58:48 [INFO] [TRAIN] epoch: 112, iter: 140650/160000, loss: 0.3946, lr: 0.000147, batch_cost: 0.1957, reader_cost: 0.00312, ips: 40.8864 samples/sec | ETA 01:03:06 2022-08-24 07:58:58 [INFO] [TRAIN] 
epoch: 112, iter: 140700/160000, loss: 0.3959, lr: 0.000146, batch_cost: 0.2134, reader_cost: 0.00033, ips: 37.4957 samples/sec | ETA 01:08:37 2022-08-24 07:59:08 [INFO] [TRAIN] epoch: 112, iter: 140750/160000, loss: 0.3733, lr: 0.000146, batch_cost: 0.1913, reader_cost: 0.00069, ips: 41.8190 samples/sec | ETA 01:01:22 2022-08-24 07:59:18 [INFO] [TRAIN] epoch: 112, iter: 140800/160000, loss: 0.3991, lr: 0.000145, batch_cost: 0.1972, reader_cost: 0.00070, ips: 40.5736 samples/sec | ETA 01:03:05 2022-08-24 07:59:28 [INFO] [TRAIN] epoch: 112, iter: 140850/160000, loss: 0.3782, lr: 0.000145, batch_cost: 0.2136, reader_cost: 0.00055, ips: 37.4559 samples/sec | ETA 01:08:10 2022-08-24 07:59:38 [INFO] [TRAIN] epoch: 112, iter: 140900/160000, loss: 0.3969, lr: 0.000145, batch_cost: 0.1995, reader_cost: 0.00094, ips: 40.0921 samples/sec | ETA 01:03:31 2022-08-24 07:59:49 [INFO] [TRAIN] epoch: 112, iter: 140950/160000, loss: 0.3864, lr: 0.000144, batch_cost: 0.2036, reader_cost: 0.00059, ips: 39.2835 samples/sec | ETA 01:04:39 2022-08-24 07:59:58 [INFO] [TRAIN] epoch: 112, iter: 141000/160000, loss: 0.3963, lr: 0.000144, batch_cost: 0.1839, reader_cost: 0.00060, ips: 43.5096 samples/sec | ETA 00:58:13 2022-08-24 07:59:58 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 161s - batch_cost: 0.1608 - reader cost: 7.2391e-04 2022-08-24 08:02:39 [INFO] [EVAL] #Images: 2000 mIoU: 0.3720 Acc: 0.7753 Kappa: 0.7581 Dice: 0.5091 2022-08-24 08:02:39 [INFO] [EVAL] Class IoU: [0.6974 0.7932 0.9318 0.7418 0.6903 0.7732 0.7823 0.8098 0.5394 0.6376 0.4919 0.5766 0.7179 0.2912 0.3107 0.4402 0.4894 0.4261 0.6105 0.4333 0.7701 0.4351 0.6308 0.5269 0.3438 0.4456 0.4476 0.4676 0.3905 0.2934 0.2958 0.5378 0.3423 0.359 0.359 0.4141 0.4887 0.6085 0.2882 0.3775 0.1267 0.1162 0.3565 0.256 0.2392 0.2479 0.347 0.543 0.604 0.5562 0.5886 0.36 0.1745 0.1454 0.6733 0.345 0.8923 0.4408 0.4011 0.272 0.0884 0.2332 0.3119 0.1503 0.5081 0.7418 0.2708 0.3806 0.1031 0.3866 0.4952 0.523 0.385 0.2741 0.4941 0.4088 0.5255 0.3031 0.3546 0.24 0.7386 0.4275 0.3907 0.0782 0.0949 0.5253 0.1208 0.1359 0.3081 0.5462 0.434 0.0985 0.2231 0.1117 0.0347 0.0651 0.1571 0.1653 0.2749 0.3099 0.2278 0.1147 0.2646 0.7483 0.186 0.6393 0.1024 0.5453 0.0838 0.2128 0.2061 0.4794 0.1968 0.657 0.6373 0.0738 0.3548 0.6335 0.1751 0.3779 0.4441 0.0757 0.3037 0.211 0.2919 0.2537 0.5136 0.4241 0.558 0.3713 0.5262 0.0512 0.3106 0.4439 0.2843 0.2193 0.172 0.0541 0.2089 0.4182 0.1498 0.0183 0.2457 0.1659 0.2135 0.0099 0.441 0.0343 0.1441 0.2145] 2022-08-24 08:02:39 [INFO] [EVAL] Class Precision: [0.7894 0.8659 0.9644 0.8181 0.7644 0.8805 0.8803 0.8652 0.689 0.7625 0.7144 0.7319 0.7876 0.5089 0.5663 0.5837 0.7002 0.7164 0.7676 0.6534 0.8429 0.6567 0.762 0.6555 0.4885 0.6487 0.5591 0.7917 0.797 0.4116 0.4824 0.6959 0.5633 0.477 0.468 0.5577 0.6957 0.8108 0.4108 0.6205 0.3239 0.3421 0.6179 0.5108 0.3251 0.4715 0.6014 0.7445 0.6816 0.751 0.7513 0.42 0.3374 0.5491 0.7132 0.7108 0.939 0.6329 0.6778 0.4503 0.1272 0.4393 0.4331 0.6681 0.6342 0.8468 0.4285 0.5139 0.1635 0.6903 0.7309 0.6292 0.5615 0.363 0.72 0.6005 0.6538 0.6526 0.7873 0.5187 0.8874 0.7172 0.8453 0.181 0.1745 0.73 0.4188 0.4156 0.7924 0.7144 0.6177 0.1276 0.3958 0.34 0.1197 0.1914 0.6307 0.3896 0.422 0.7234 0.6609 0.182 0.5656 0.8261 0.8011 0.7152 0.2418 0.7553 0.1855 0.3627 0.4376 0.6743 0.5018 0.8472 0.6446 0.3156 0.6172 0.7476 0.2718 0.5717 0.6845 0.326 0.5967 0.5959 0.7343 0.6495 0.7358 0.5137 0.8176 0.5513 0.6654 0.362 0.6026 0.801 0.6485 0.3842 0.3595 0.1308 0.4854 0.6617 
0.3884 0.0278 0.592 0.6003 0.5356 0.0147 0.8194 0.2143 0.5002 0.6981] 2022-08-24 08:02:39 [INFO] [EVAL] Class Recall: [0.8568 0.9043 0.965 0.8883 0.8769 0.8638 0.8755 0.9267 0.713 0.7956 0.6123 0.7311 0.8904 0.4051 0.4078 0.6416 0.6192 0.5125 0.7489 0.5626 0.8992 0.5633 0.7856 0.7287 0.537 0.5873 0.6919 0.5332 0.4337 0.5054 0.4333 0.7031 0.4659 0.592 0.6064 0.6165 0.6216 0.7092 0.4913 0.4908 0.1722 0.1497 0.4573 0.3391 0.4751 0.3433 0.4507 0.6673 0.8415 0.682 0.731 0.7159 0.2656 0.1651 0.9233 0.4013 0.9472 0.5922 0.4957 0.4073 0.2249 0.3321 0.527 0.1624 0.7186 0.8567 0.4238 0.5946 0.2183 0.4677 0.6056 0.756 0.5506 0.5283 0.6116 0.5615 0.7282 0.3614 0.3922 0.3088 0.815 0.5141 0.4208 0.1209 0.1723 0.652 0.1452 0.1681 0.3351 0.6988 0.5933 0.3017 0.3384 0.1427 0.0466 0.0899 0.173 0.2231 0.4408 0.3515 0.2579 0.2367 0.3322 0.8882 0.195 0.8576 0.1509 0.6623 0.1325 0.3398 0.2803 0.6238 0.2447 0.7453 0.9825 0.0878 0.455 0.8059 0.33 0.5271 0.5584 0.0898 0.3822 0.2462 0.3263 0.294 0.6297 0.7085 0.6373 0.5321 0.7156 0.0562 0.3905 0.4989 0.3362 0.3383 0.248 0.0844 0.2684 0.532 0.1961 0.0513 0.2958 0.1865 0.262 0.0297 0.4885 0.0392 0.1684 0.2365] 2022-08-24 08:02:39 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 08:02:47 [INFO] [TRAIN] epoch: 112, iter: 141050/160000, loss: 0.3770, lr: 0.000143, batch_cost: 0.1634, reader_cost: 0.00428, ips: 48.9577 samples/sec | ETA 00:51:36 2022-08-24 08:02:55 [INFO] [TRAIN] epoch: 112, iter: 141100/160000, loss: 0.3887, lr: 0.000143, batch_cost: 0.1671, reader_cost: 0.00103, ips: 47.8860 samples/sec | ETA 00:52:37 2022-08-24 08:03:05 [INFO] [TRAIN] epoch: 112, iter: 141150/160000, loss: 0.4034, lr: 0.000143, batch_cost: 0.1917, reader_cost: 0.00071, ips: 41.7388 samples/sec | ETA 01:00:12 2022-08-24 08:03:13 [INFO] [TRAIN] epoch: 112, iter: 141200/160000, loss: 0.3909, lr: 0.000142, batch_cost: 0.1695, reader_cost: 0.00083, ips: 47.2073 samples/sec | ETA 00:53:05 2022-08-24 08:03:21 [INFO] [TRAIN] epoch: 112, iter: 141250/160000, loss: 0.3711, lr: 0.000142, batch_cost: 0.1585, reader_cost: 0.00061, ips: 50.4608 samples/sec | ETA 00:49:32 2022-08-24 08:03:31 [INFO] [TRAIN] epoch: 112, iter: 141300/160000, loss: 0.4005, lr: 0.000142, batch_cost: 0.1841, reader_cost: 0.00065, ips: 43.4439 samples/sec | ETA 00:57:23 2022-08-24 08:03:41 [INFO] [TRAIN] epoch: 112, iter: 141350/160000, loss: 0.3681, lr: 0.000141, batch_cost: 0.2079, reader_cost: 0.00087, ips: 38.4877 samples/sec | ETA 01:04:36 2022-08-24 08:03:51 [INFO] [TRAIN] epoch: 112, iter: 141400/160000, loss: 0.4026, lr: 0.000141, batch_cost: 0.2017, reader_cost: 0.00078, ips: 39.6588 samples/sec | ETA 01:02:32 2022-08-24 08:04:00 [INFO] [TRAIN] epoch: 112, iter: 141450/160000, loss: 0.4063, lr: 0.000140, batch_cost: 0.1874, reader_cost: 0.00073, ips: 42.6791 samples/sec | ETA 00:57:57 2022-08-24 08:04:14 [INFO] [TRAIN] epoch: 113, iter: 141500/160000, loss: 0.3741, lr: 0.000140, batch_cost: 0.2797, reader_cost: 0.08841, ips: 28.5971 samples/sec | ETA 01:26:15 2022-08-24 08:04:25 [INFO] [TRAIN] epoch: 113, iter: 141550/160000, loss: 0.3762, lr: 0.000140, batch_cost: 0.2073, reader_cost: 0.00054, ips: 38.5908 samples/sec | ETA 01:03:44 2022-08-24 08:04:34 [INFO] [TRAIN] epoch: 113, iter: 141600/160000, loss: 0.3886, lr: 0.000139, batch_cost: 0.1760, reader_cost: 0.00202, ips: 45.4644 samples/sec | ETA 00:53:57 2022-08-24 08:04:43 [INFO] [TRAIN] epoch: 113, iter: 141650/160000, loss: 0.3998, lr: 0.000139, batch_cost: 0.1938, reader_cost: 0.00033, ips: 41.2758 
samples/sec | ETA 00:59:16 2022-08-24 08:04:52 [INFO] [TRAIN] epoch: 113, iter: 141700/160000, loss: 0.3956, lr: 0.000139, batch_cost: 0.1784, reader_cost: 0.00090, ips: 44.8321 samples/sec | ETA 00:54:25 2022-08-24 08:05:01 [INFO] [TRAIN] epoch: 113, iter: 141750/160000, loss: 0.4210, lr: 0.000138, batch_cost: 0.1769, reader_cost: 0.00164, ips: 45.2357 samples/sec | ETA 00:53:47 2022-08-24 08:05:10 [INFO] [TRAIN] epoch: 113, iter: 141800/160000, loss: 0.3961, lr: 0.000138, batch_cost: 0.1824, reader_cost: 0.00263, ips: 43.8645 samples/sec | ETA 00:55:19 2022-08-24 08:05:19 [INFO] [TRAIN] epoch: 113, iter: 141850/160000, loss: 0.3943, lr: 0.000137, batch_cost: 0.1769, reader_cost: 0.00038, ips: 45.2335 samples/sec | ETA 00:53:30 2022-08-24 08:05:29 [INFO] [TRAIN] epoch: 113, iter: 141900/160000, loss: 0.3901, lr: 0.000137, batch_cost: 0.1961, reader_cost: 0.00055, ips: 40.8055 samples/sec | ETA 00:59:08 2022-08-24 08:05:38 [INFO] [TRAIN] epoch: 113, iter: 141950/160000, loss: 0.3868, lr: 0.000137, batch_cost: 0.1900, reader_cost: 0.00045, ips: 42.1009 samples/sec | ETA 00:57:09 2022-08-24 08:05:47 [INFO] [TRAIN] epoch: 113, iter: 142000/160000, loss: 0.4204, lr: 0.000136, batch_cost: 0.1795, reader_cost: 0.00074, ips: 44.5582 samples/sec | ETA 00:53:51 2022-08-24 08:05:47 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 162s - batch_cost: 0.1617 - reader cost: 8.3904e-04 2022-08-24 08:08:29 [INFO] [EVAL] #Images: 2000 mIoU: 0.3741 Acc: 0.7767 Kappa: 0.7597 Dice: 0.5117 2022-08-24 08:08:29 [INFO] [EVAL] Class IoU: [0.6964 0.7963 0.9318 0.7439 0.6917 0.7743 0.7837 0.8118 0.5383 0.6455 0.4969 0.5803 0.7139 0.299 0.3261 0.4393 0.4955 0.4344 0.6195 0.4471 0.7634 0.4409 0.6361 0.5191 0.3413 0.423 0.4539 0.4698 0.4616 0.248 0.2821 0.5492 0.3455 0.3672 0.3514 0.4229 0.4899 0.582 0.2953 0.4077 0.1315 0.1338 0.3491 0.26 0.2522 0.231 0.3409 0.5272 0.6281 0.5516 0.5892 0.3871 0.1767 0.208 0.6678 0.329 0.8893 0.3918 0.3596 0.2888 0.0887 0.2381 0.3352 0.1299 0.4811 0.7257 0.2612 0.3873 0.1022 0.3736 0.4918 0.5416 0.3872 0.2576 0.4984 0.4051 0.5231 0.3006 0.361 0.2651 0.7253 0.4478 0.4107 0.05 0.1081 0.5296 0.1253 0.128 0.3286 0.5516 0.4147 0.0809 0.2077 0.1359 0.0421 0.0804 0.2086 0.1855 0.2917 0.3 0.224 0.127 0.2205 0.7583 0.1822 0.6846 0.1651 0.5665 0.0817 0.2791 0.2254 0.396 0.2023 0.6355 0.5979 0.0768 0.3701 0.6394 0.1346 0.3944 0.392 0.0797 0.3329 0.1929 0.2966 0.2669 0.5295 0.4551 0.5738 0.4074 0.5619 0.0771 0.3255 0.4245 0.2886 0.2119 0.1801 0.0473 0.2038 0.4029 0.078 0.0194 0.2752 0.1777 0.2185 0.0007 0.438 0.0379 0.1339 0.2108] 2022-08-24 08:08:29 [INFO] [EVAL] Class Precision: [0.7929 0.8736 0.9661 0.8345 0.7639 0.8664 0.8911 0.8725 0.6837 0.7581 0.6855 0.709 0.7783 0.5209 0.5306 0.6175 0.6841 0.6874 0.7876 0.6319 0.8237 0.673 0.7686 0.6274 0.4976 0.6218 0.5506 0.7568 0.7424 0.3989 0.4968 0.7017 0.5841 0.4976 0.4465 0.5608 0.6759 0.818 0.4258 0.5761 0.3212 0.3438 0.5718 0.4686 0.346 0.5269 0.625 0.7339 0.7176 0.7004 0.7382 0.4788 0.3521 0.6108 0.7283 0.6662 0.9299 0.7153 0.6783 0.5499 0.1306 0.5222 0.5047 0.7123 0.5669 0.8257 0.4015 0.539 0.231 0.6388 0.7272 0.6466 0.5285 0.348 0.7334 0.5884 0.6584 0.6577 0.7968 0.515 0.8692 0.699 0.8365 0.1498 0.2087 0.7017 0.4127 0.458 0.789 0.7364 0.5452 0.1447 0.4004 0.3589 0.1357 0.2116 0.6014 0.3753 0.4298 0.7257 0.7288 0.2156 0.6102 0.8342 0.7943 0.7869 0.3476 0.7644 0.1748 0.4114 0.4289 0.503 0.6201 0.8231 0.6031 0.2853 0.5866 0.7514 0.2049 0.5621 0.7643 0.2809 0.6716 0.6536 0.7627 0.633 0.7616 0.5732 0.8349 0.6682 0.6639 
0.315 0.5627 0.8396 0.6251 0.4381 0.3624 0.1297 0.4669 0.7194 0.3064 0.0322 0.5891 0.6651 0.5304 0.0011 0.8403 0.1939 0.4133 0.6408] 2022-08-24 08:08:29 [INFO] [EVAL] Class Recall: [0.8513 0.9 0.9633 0.8727 0.8798 0.8792 0.8667 0.9211 0.7168 0.8129 0.6437 0.7616 0.8962 0.4125 0.4583 0.6035 0.6425 0.5414 0.7437 0.6046 0.9125 0.561 0.7868 0.7505 0.5206 0.5696 0.7211 0.5534 0.5497 0.396 0.3949 0.7164 0.4583 0.5835 0.6225 0.6323 0.6403 0.6686 0.4906 0.5824 0.1822 0.1797 0.4728 0.3688 0.482 0.2914 0.4286 0.6518 0.8343 0.7219 0.7449 0.6691 0.2619 0.2398 0.8893 0.394 0.9533 0.4642 0.4335 0.3783 0.2164 0.3045 0.4996 0.1371 0.7608 0.8569 0.4278 0.579 0.1548 0.4737 0.6031 0.7693 0.5914 0.4978 0.6086 0.5652 0.7179 0.3564 0.3976 0.3533 0.8141 0.5547 0.4466 0.0698 0.1833 0.6836 0.1524 0.1509 0.3602 0.6872 0.634 0.1549 0.3014 0.1795 0.0574 0.1148 0.2421 0.2684 0.4758 0.3383 0.2444 0.2361 0.2567 0.8928 0.1912 0.8404 0.2391 0.6864 0.133 0.4646 0.322 0.6506 0.2309 0.7361 0.9858 0.0951 0.5007 0.8109 0.2818 0.5694 0.4459 0.1001 0.3976 0.2149 0.3267 0.3158 0.6347 0.6882 0.6472 0.5107 0.7854 0.0927 0.4357 0.4619 0.349 0.291 0.2636 0.0693 0.2655 0.478 0.0947 0.0465 0.3406 0.1951 0.2709 0.0019 0.4778 0.045 0.1653 0.239 ] 2022-08-24 08:08:30 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 08:08:38 [INFO] [TRAIN] epoch: 113, iter: 142050/160000, loss: 0.4092, lr: 0.000136, batch_cost: 0.1648, reader_cost: 0.00405, ips: 48.5529 samples/sec | ETA 00:49:17 2022-08-24 08:08:46 [INFO] [TRAIN] epoch: 113, iter: 142100/160000, loss: 0.3889, lr: 0.000136, batch_cost: 0.1652, reader_cost: 0.00085, ips: 48.4145 samples/sec | ETA 00:49:17 2022-08-24 08:08:54 [INFO] [TRAIN] epoch: 113, iter: 142150/160000, loss: 0.4189, lr: 0.000135, batch_cost: 0.1665, reader_cost: 0.00050, ips: 48.0526 samples/sec | ETA 00:49:31 2022-08-24 08:09:03 [INFO] [TRAIN] epoch: 113, iter: 142200/160000, loss: 0.3696, lr: 0.000135, batch_cost: 0.1797, reader_cost: 0.00077, ips: 44.5144 samples/sec | ETA 00:53:18 2022-08-24 08:09:12 [INFO] [TRAIN] epoch: 113, iter: 142250/160000, loss: 0.4128, lr: 0.000134, batch_cost: 0.1755, reader_cost: 0.00048, ips: 45.5904 samples/sec | ETA 00:51:54 2022-08-24 08:09:20 [INFO] [TRAIN] epoch: 113, iter: 142300/160000, loss: 0.3978, lr: 0.000134, batch_cost: 0.1496, reader_cost: 0.00058, ips: 53.4598 samples/sec | ETA 00:44:08 2022-08-24 08:09:28 [INFO] [TRAIN] epoch: 113, iter: 142350/160000, loss: 0.3876, lr: 0.000134, batch_cost: 0.1729, reader_cost: 0.00068, ips: 46.2683 samples/sec | ETA 00:50:51 2022-08-24 08:09:37 [INFO] [TRAIN] epoch: 113, iter: 142400/160000, loss: 0.4087, lr: 0.000133, batch_cost: 0.1775, reader_cost: 0.00431, ips: 45.0662 samples/sec | ETA 00:52:04 2022-08-24 08:09:47 [INFO] [TRAIN] epoch: 113, iter: 142450/160000, loss: 0.3958, lr: 0.000133, batch_cost: 0.1969, reader_cost: 0.00098, ips: 40.6282 samples/sec | ETA 00:57:35 2022-08-24 08:09:57 [INFO] [TRAIN] epoch: 113, iter: 142500/160000, loss: 0.3962, lr: 0.000132, batch_cost: 0.1993, reader_cost: 0.00045, ips: 40.1408 samples/sec | ETA 00:58:07 2022-08-24 08:10:07 [INFO] [TRAIN] epoch: 113, iter: 142550/160000, loss: 0.4012, lr: 0.000132, batch_cost: 0.1987, reader_cost: 0.00053, ips: 40.2564 samples/sec | ETA 00:57:47 2022-08-24 08:10:17 [INFO] [TRAIN] epoch: 113, iter: 142600/160000, loss: 0.3836, lr: 0.000132, batch_cost: 0.1943, reader_cost: 0.00059, ips: 41.1802 samples/sec | ETA 00:56:20 2022-08-24 08:10:26 [INFO] [TRAIN] epoch: 113, iter: 142650/160000, loss: 0.3855, lr: 
0.000131, batch_cost: 0.1913, reader_cost: 0.00093, ips: 41.8096 samples/sec | ETA 00:55:19 2022-08-24 08:10:35 [INFO] [TRAIN] epoch: 113, iter: 142700/160000, loss: 0.3851, lr: 0.000131, batch_cost: 0.1729, reader_cost: 0.00033, ips: 46.2582 samples/sec | ETA 00:49:51 2022-08-24 08:10:49 [INFO] [TRAIN] epoch: 114, iter: 142750/160000, loss: 0.3997, lr: 0.000131, batch_cost: 0.2854, reader_cost: 0.09543, ips: 28.0281 samples/sec | ETA 01:22:03 2022-08-24 08:11:00 [INFO] [TRAIN] epoch: 114, iter: 142800/160000, loss: 0.3963, lr: 0.000130, batch_cost: 0.2109, reader_cost: 0.00079, ips: 37.9328 samples/sec | ETA 01:00:27 2022-08-24 08:11:09 [INFO] [TRAIN] epoch: 114, iter: 142850/160000, loss: 0.4219, lr: 0.000130, batch_cost: 0.1920, reader_cost: 0.00076, ips: 41.6688 samples/sec | ETA 00:54:52 2022-08-24 08:11:19 [INFO] [TRAIN] epoch: 114, iter: 142900/160000, loss: 0.3997, lr: 0.000129, batch_cost: 0.1898, reader_cost: 0.00058, ips: 42.1566 samples/sec | ETA 00:54:05 2022-08-24 08:11:29 [INFO] [TRAIN] epoch: 114, iter: 142950/160000, loss: 0.3848, lr: 0.000129, batch_cost: 0.2000, reader_cost: 0.00078, ips: 39.9975 samples/sec | ETA 00:56:50 2022-08-24 08:11:37 [INFO] [TRAIN] epoch: 114, iter: 143000/160000, loss: 0.3823, lr: 0.000129, batch_cost: 0.1728, reader_cost: 0.00037, ips: 46.2839 samples/sec | ETA 00:48:58 2022-08-24 08:11:37 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 150s - batch_cost: 0.1502 - reader cost: 5.1934e-04 2022-08-24 08:14:08 [INFO] [EVAL] #Images: 2000 mIoU: 0.3767 Acc: 0.7770 Kappa: 0.7600 Dice: 0.5141 2022-08-24 08:14:08 [INFO] [EVAL] Class IoU: [0.6974 0.7964 0.9324 0.7399 0.6956 0.7735 0.7811 0.8095 0.5405 0.6598 0.501 0.5713 0.7181 0.3122 0.3103 0.4359 0.4953 0.4411 0.6134 0.4392 0.7746 0.4419 0.6328 0.5208 0.3429 0.4037 0.4579 0.4703 0.4467 0.2538 0.2925 0.5523 0.3282 0.3544 0.3486 0.4246 0.4865 0.587 0.2976 0.3922 0.1315 0.1179 0.3448 0.2569 0.2398 0.2527 0.326 0.538 0.6169 0.527 0.6087 0.4034 0.1673 0.164 0.6741 0.3254 0.8932 0.4049 0.3782 0.2569 0.0941 0.2399 0.3305 0.1916 0.4709 0.7246 0.2788 0.387 0.1183 0.3786 0.5189 0.5855 0.3457 0.2689 0.5006 0.4198 0.5203 0.299 0.3635 0.2975 0.7235 0.4519 0.4294 0.0525 0.166 0.5259 0.1221 0.1327 0.3475 0.532 0.4204 0.1237 0.2251 0.1215 0.0568 0.0689 0.1442 0.1793 0.282 0.3162 0.2354 0.1078 0.2961 0.7224 0.1857 0.7314 0.1624 0.5705 0.079 0.2519 0.2094 0.5158 0.2035 0.6346 0.7504 0.0666 0.3352 0.6264 0.1361 0.3859 0.4148 0.0847 0.33 0.2081 0.2905 0.2618 0.5234 0.4416 0.574 0.4058 0.5629 0.0413 0.3173 0.4056 0.2897 0.2116 0.1816 0.0572 0.2071 0.42 0.1517 0.0187 0.306 0.0776 0.2366 0.0051 0.4373 0.0384 0.1313 0.2152] 2022-08-24 08:14:08 [INFO] [EVAL] Class Precision: [0.7964 0.8708 0.9669 0.8271 0.7737 0.8657 0.8792 0.8623 0.6824 0.7478 0.6884 0.723 0.7825 0.5441 0.5588 0.6116 0.6808 0.6859 0.7708 0.629 0.8447 0.6624 0.7691 0.6325 0.5008 0.6633 0.5577 0.746 0.7276 0.4096 0.4693 0.7067 0.5566 0.4632 0.4599 0.5779 0.6607 0.8081 0.4255 0.6146 0.3139 0.3285 0.5591 0.4641 0.359 0.4937 0.7099 0.7831 0.7362 0.6455 0.778 0.4899 0.3476 0.5332 0.7145 0.6642 0.9274 0.6948 0.6538 0.4306 0.1383 0.542 0.489 0.5704 0.5555 0.8117 0.4417 0.5307 0.2385 0.6441 0.6641 0.7638 0.5889 0.3492 0.7256 0.5907 0.7786 0.687 0.7706 0.528 0.8554 0.6757 0.818 0.1382 0.2596 0.698 0.5245 0.4244 0.7569 0.6644 0.5654 0.2 0.4434 0.342 0.2279 0.1804 0.6138 0.3632 0.4151 0.6987 0.7302 0.1701 0.5431 0.8166 0.7336 0.8295 0.4074 0.7705 0.1732 0.3972 0.4445 0.7403 0.5235 0.8286 0.7612 0.3239 0.6659 0.7309 0.1825 0.5616 0.7 
0.3228 0.5805 0.6214 0.7647 0.6455 0.7442 0.5509 0.8554 0.6458 0.6694 0.3616 0.5164 0.8467 0.6181 0.3692 0.3388 0.115 0.4497 0.6917 0.385 0.0267 0.5868 0.3889 0.5293 0.0066 0.8016 0.1836 0.3957 0.7027] 2022-08-24 08:14:08 [INFO] [EVAL] Class Recall: [0.8488 0.9031 0.9631 0.8752 0.8734 0.8789 0.875 0.9297 0.7221 0.8486 0.6479 0.7314 0.8972 0.4228 0.4109 0.6027 0.6452 0.5528 0.7502 0.5928 0.9032 0.5704 0.7812 0.7469 0.5211 0.5077 0.719 0.5601 0.5364 0.4001 0.437 0.7165 0.4444 0.6014 0.5903 0.6155 0.6485 0.6822 0.4974 0.52 0.1846 0.1554 0.4736 0.3653 0.4195 0.3411 0.376 0.6322 0.7921 0.7415 0.7366 0.6957 0.244 0.1915 0.9225 0.3895 0.9604 0.4924 0.4729 0.389 0.2276 0.3008 0.5049 0.2239 0.7557 0.8711 0.4305 0.5882 0.19 0.4788 0.7035 0.715 0.4557 0.539 0.6176 0.592 0.6106 0.3462 0.4076 0.4052 0.8243 0.5771 0.4748 0.0781 0.3153 0.6808 0.1373 0.1618 0.3911 0.7275 0.6212 0.2449 0.3138 0.1585 0.0703 0.1004 0.1586 0.2615 0.4681 0.3662 0.2579 0.2275 0.3944 0.8623 0.1991 0.8607 0.2126 0.6874 0.1269 0.4077 0.2836 0.6297 0.2498 0.7305 0.9815 0.0774 0.403 0.8141 0.3486 0.5523 0.5045 0.103 0.4334 0.2383 0.319 0.3058 0.6382 0.69 0.6357 0.522 0.7796 0.0445 0.4514 0.4378 0.3528 0.3313 0.2813 0.1021 0.2774 0.5168 0.2003 0.0585 0.39 0.0883 0.2997 0.0231 0.4903 0.0462 0.1642 0.2368] 2022-08-24 08:14:08 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 08:14:16 [INFO] [TRAIN] epoch: 114, iter: 143050/160000, loss: 0.3724, lr: 0.000128, batch_cost: 0.1600, reader_cost: 0.00371, ips: 49.9932 samples/sec | ETA 00:45:12 2022-08-24 08:14:25 [INFO] [TRAIN] epoch: 114, iter: 143100/160000, loss: 0.4014, lr: 0.000128, batch_cost: 0.1713, reader_cost: 0.00158, ips: 46.6983 samples/sec | ETA 00:48:15 2022-08-24 08:14:33 [INFO] [TRAIN] epoch: 114, iter: 143150/160000, loss: 0.3832, lr: 0.000128, batch_cost: 0.1653, reader_cost: 0.00053, ips: 48.3844 samples/sec | ETA 00:46:26 2022-08-24 08:14:41 [INFO] [TRAIN] epoch: 114, iter: 143200/160000, loss: 0.3974, lr: 0.000127, batch_cost: 0.1681, reader_cost: 0.00062, ips: 47.5818 samples/sec | ETA 00:47:04 2022-08-24 08:14:49 [INFO] [TRAIN] epoch: 114, iter: 143250/160000, loss: 0.3849, lr: 0.000127, batch_cost: 0.1531, reader_cost: 0.00051, ips: 52.2610 samples/sec | ETA 00:42:44 2022-08-24 08:14:57 [INFO] [TRAIN] epoch: 114, iter: 143300/160000, loss: 0.3973, lr: 0.000126, batch_cost: 0.1687, reader_cost: 0.00074, ips: 47.4315 samples/sec | ETA 00:46:56 2022-08-24 08:15:07 [INFO] [TRAIN] epoch: 114, iter: 143350/160000, loss: 0.3999, lr: 0.000126, batch_cost: 0.2031, reader_cost: 0.00057, ips: 39.3839 samples/sec | ETA 00:56:22 2022-08-24 08:15:16 [INFO] [TRAIN] epoch: 114, iter: 143400/160000, loss: 0.3722, lr: 0.000126, batch_cost: 0.1741, reader_cost: 0.00064, ips: 45.9552 samples/sec | ETA 00:48:09 2022-08-24 08:15:24 [INFO] [TRAIN] epoch: 114, iter: 143450/160000, loss: 0.3756, lr: 0.000125, batch_cost: 0.1628, reader_cost: 0.00079, ips: 49.1392 samples/sec | ETA 00:44:54 2022-08-24 08:15:35 [INFO] [TRAIN] epoch: 114, iter: 143500/160000, loss: 0.3867, lr: 0.000125, batch_cost: 0.2164, reader_cost: 0.00038, ips: 36.9631 samples/sec | ETA 00:59:31 2022-08-24 08:15:45 [INFO] [TRAIN] epoch: 114, iter: 143550/160000, loss: 0.3968, lr: 0.000125, batch_cost: 0.1873, reader_cost: 0.00054, ips: 42.7130 samples/sec | ETA 00:51:21 2022-08-24 08:15:54 [INFO] [TRAIN] epoch: 114, iter: 143600/160000, loss: 0.4045, lr: 0.000124, batch_cost: 0.1889, reader_cost: 0.00049, ips: 42.3451 samples/sec | ETA 00:51:38 2022-08-24 08:16:03 [INFO] 
[TRAIN] epoch: 114, iter: 143650/160000, loss: 0.4080, lr: 0.000124, batch_cost: 0.1732, reader_cost: 0.00053, ips: 46.1941 samples/sec | ETA 00:47:11 2022-08-24 08:16:13 [INFO] [TRAIN] epoch: 114, iter: 143700/160000, loss: 0.3871, lr: 0.000123, batch_cost: 0.2146, reader_cost: 0.00043, ips: 37.2734 samples/sec | ETA 00:58:18 2022-08-24 08:16:23 [INFO] [TRAIN] epoch: 114, iter: 143750/160000, loss: 0.3636, lr: 0.000123, batch_cost: 0.1985, reader_cost: 0.00071, ips: 40.2954 samples/sec | ETA 00:53:46 2022-08-24 08:16:33 [INFO] [TRAIN] epoch: 114, iter: 143800/160000, loss: 0.3650, lr: 0.000123, batch_cost: 0.1968, reader_cost: 0.00059, ips: 40.6504 samples/sec | ETA 00:53:08 2022-08-24 08:16:44 [INFO] [TRAIN] epoch: 114, iter: 143850/160000, loss: 0.3701, lr: 0.000122, batch_cost: 0.2112, reader_cost: 0.00112, ips: 37.8776 samples/sec | ETA 00:56:50 2022-08-24 08:16:53 [INFO] [TRAIN] epoch: 114, iter: 143900/160000, loss: 0.4153, lr: 0.000122, batch_cost: 0.1927, reader_cost: 0.00042, ips: 41.5114 samples/sec | ETA 00:51:42 2022-08-24 08:17:03 [INFO] [TRAIN] epoch: 114, iter: 143950/160000, loss: 0.4087, lr: 0.000122, batch_cost: 0.1904, reader_cost: 0.00059, ips: 42.0193 samples/sec | ETA 00:50:55 2022-08-24 08:17:17 [INFO] [TRAIN] epoch: 115, iter: 144000/160000, loss: 0.4017, lr: 0.000121, batch_cost: 0.2840, reader_cost: 0.06970, ips: 28.1728 samples/sec | ETA 01:15:43 2022-08-24 08:17:17 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 146s - batch_cost: 0.1459 - reader cost: 7.2090e-04 2022-08-24 08:19:43 [INFO] [EVAL] #Images: 2000 mIoU: 0.3787 Acc: 0.7782 Kappa: 0.7613 Dice: 0.5162 2022-08-24 08:19:43 [INFO] [EVAL] Class IoU: [0.6973 0.7966 0.9322 0.7444 0.6911 0.7766 0.7826 0.8129 0.5306 0.6565 0.495 0.5802 0.7163 0.3119 0.3049 0.4441 0.4916 0.4556 0.6141 0.4422 0.7727 0.4329 0.6375 0.5237 0.3413 0.4761 0.4685 0.4883 0.4567 0.2672 0.3051 0.546 0.323 0.3555 0.3481 0.4232 0.4851 0.5875 0.2967 0.3949 0.1398 0.1306 0.3557 0.248 0.2273 0.2594 0.3349 0.5402 0.608 0.5438 0.5969 0.4093 0.1665 0.1696 0.6685 0.3305 0.8919 0.4308 0.4423 0.2687 0.0881 0.2216 0.3405 0.21 0.4957 0.7258 0.2793 0.4009 0.1161 0.3857 0.5101 0.5577 0.3696 0.2898 0.4988 0.4158 0.525 0.2877 0.4178 0.277 0.7348 0.4352 0.4306 0.0423 0.1569 0.5197 0.118 0.129 0.3469 0.5497 0.4259 0.1312 0.2241 0.0977 0.0365 0.0699 0.1842 0.1796 0.2815 0.2997 0.2399 0.1095 0.2938 0.7513 0.187 0.577 0.1486 0.5698 0.0843 0.2304 0.1987 0.5198 0.207 0.6737 0.7413 0.0657 0.3824 0.6283 0.2008 0.376 0.432 0.0726 0.3107 0.2059 0.2972 0.2781 0.5331 0.4626 0.6137 0.3947 0.5069 0.0496 0.2866 0.4367 0.2938 0.2159 0.1711 0.0522 0.2133 0.4335 0.1932 0.0185 0.2626 0.0687 0.223 0.002 0.4356 0.0354 0.141 0.2058] 2022-08-24 08:19:43 [INFO] [EVAL] Class Precision: [0.7948 0.8637 0.9643 0.8348 0.7725 0.8699 0.8837 0.871 0.6651 0.7553 0.6796 0.7267 0.7833 0.5416 0.5485 0.6209 0.6733 0.6747 0.7744 0.6473 0.8383 0.6941 0.7907 0.6409 0.4734 0.6583 0.567 0.7704 0.7562 0.4819 0.4776 0.7325 0.5709 0.4765 0.4653 0.5553 0.6919 0.8024 0.4243 0.622 0.3303 0.3414 0.6091 0.4887 0.3095 0.4685 0.7161 0.7538 0.7396 0.7013 0.7781 0.5031 0.3059 0.5835 0.7069 0.6306 0.9378 0.6857 0.6465 0.473 0.1332 0.4466 0.506 0.5946 0.6056 0.8215 0.4373 0.5545 0.2345 0.7243 0.7138 0.6922 0.5681 0.3694 0.7322 0.5895 0.7674 0.6922 0.7954 0.5712 0.8706 0.7058 0.8166 0.1469 0.2562 0.7431 0.5437 0.4105 0.8222 0.7163 0.5829 0.2065 0.4276 0.3517 0.1411 0.1959 0.611 0.3813 0.4196 0.6765 0.6661 0.1774 0.5333 0.8191 0.8118 0.642 0.5093 0.7759 0.1487 0.3876 0.4486 
0.7687 0.588 0.823 0.7511 0.2989 0.6939 0.7268 0.3255 0.5764 0.7295 0.2636 0.6562 0.6215 0.7586 0.6522 0.8057 0.5975 0.8189 0.6624 0.5671 0.2935 0.5787 0.8314 0.5926 0.4056 0.3576 0.104 0.5003 0.6778 0.4754 0.0261 0.5705 0.3533 0.5609 0.0031 0.8479 0.2302 0.3312 0.7032] 2022-08-24 08:19:43 [INFO] [EVAL] Class Recall: [0.8503 0.9112 0.9655 0.8729 0.8677 0.8786 0.8725 0.9241 0.724 0.8338 0.6458 0.7422 0.8934 0.4237 0.407 0.6093 0.6455 0.5838 0.7478 0.5826 0.908 0.5349 0.7669 0.7412 0.5502 0.6323 0.7295 0.5715 0.5355 0.3749 0.4579 0.682 0.4265 0.5834 0.5802 0.6402 0.6188 0.6868 0.4966 0.5196 0.1951 0.1746 0.461 0.3349 0.461 0.3675 0.3862 0.6559 0.7736 0.7077 0.7194 0.6871 0.2677 0.1929 0.9248 0.4098 0.948 0.5368 0.5833 0.3834 0.2061 0.3054 0.51 0.2451 0.7321 0.8617 0.4361 0.5915 0.1869 0.4521 0.6413 0.7417 0.514 0.5734 0.61 0.5853 0.6243 0.33 0.4681 0.3497 0.8249 0.5316 0.4767 0.0561 0.2883 0.6334 0.131 0.1583 0.375 0.7027 0.6127 0.2647 0.32 0.1191 0.047 0.0981 0.2087 0.2534 0.461 0.3499 0.2727 0.2223 0.3955 0.9008 0.1955 0.8507 0.1734 0.682 0.1628 0.3622 0.2629 0.6162 0.2422 0.7879 0.9827 0.0776 0.46 0.8226 0.344 0.5195 0.5144 0.0911 0.3711 0.2354 0.3282 0.3265 0.6118 0.672 0.71 0.4941 0.827 0.0563 0.3622 0.4792 0.3682 0.3158 0.2471 0.0949 0.271 0.5461 0.2455 0.0597 0.3273 0.0786 0.2702 0.0058 0.4725 0.0402 0.1971 0.2254] 2022-08-24 08:19:43 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 08:19:52 [INFO] [TRAIN] epoch: 115, iter: 144050/160000, loss: 0.3838, lr: 0.000121, batch_cost: 0.1706, reader_cost: 0.00422, ips: 46.8889 samples/sec | ETA 00:45:21 2022-08-24 08:20:00 [INFO] [TRAIN] epoch: 115, iter: 144100/160000, loss: 0.3861, lr: 0.000120, batch_cost: 0.1664, reader_cost: 0.00154, ips: 48.0653 samples/sec | ETA 00:44:06 2022-08-24 08:20:08 [INFO] [TRAIN] epoch: 115, iter: 144150/160000, loss: 0.4150, lr: 0.000120, batch_cost: 0.1517, reader_cost: 0.00093, ips: 52.7333 samples/sec | ETA 00:40:04 2022-08-24 08:20:16 [INFO] [TRAIN] epoch: 115, iter: 144200/160000, loss: 0.3823, lr: 0.000120, batch_cost: 0.1643, reader_cost: 0.00103, ips: 48.7009 samples/sec | ETA 00:43:15 2022-08-24 08:20:25 [INFO] [TRAIN] epoch: 115, iter: 144250/160000, loss: 0.3708, lr: 0.000119, batch_cost: 0.1690, reader_cost: 0.00120, ips: 47.3431 samples/sec | ETA 00:44:21 2022-08-24 08:20:33 [INFO] [TRAIN] epoch: 115, iter: 144300/160000, loss: 0.3682, lr: 0.000119, batch_cost: 0.1733, reader_cost: 0.00081, ips: 46.1666 samples/sec | ETA 00:45:20 2022-08-24 08:20:43 [INFO] [TRAIN] epoch: 115, iter: 144350/160000, loss: 0.3968, lr: 0.000118, batch_cost: 0.2049, reader_cost: 0.00026, ips: 39.0472 samples/sec | ETA 00:53:26 2022-08-24 08:20:53 [INFO] [TRAIN] epoch: 115, iter: 144400/160000, loss: 0.3655, lr: 0.000118, batch_cost: 0.1997, reader_cost: 0.00055, ips: 40.0513 samples/sec | ETA 00:51:56 2022-08-24 08:21:02 [INFO] [TRAIN] epoch: 115, iter: 144450/160000, loss: 0.4294, lr: 0.000118, batch_cost: 0.1806, reader_cost: 0.00060, ips: 44.3016 samples/sec | ETA 00:46:48 2022-08-24 08:21:11 [INFO] [TRAIN] epoch: 115, iter: 144500/160000, loss: 0.3835, lr: 0.000117, batch_cost: 0.1696, reader_cost: 0.00044, ips: 47.1767 samples/sec | ETA 00:43:48 2022-08-24 08:21:20 [INFO] [TRAIN] epoch: 115, iter: 144550/160000, loss: 0.4026, lr: 0.000117, batch_cost: 0.1805, reader_cost: 0.00261, ips: 44.3162 samples/sec | ETA 00:46:29 2022-08-24 08:21:30 [INFO] [TRAIN] epoch: 115, iter: 144600/160000, loss: 0.3906, lr: 0.000117, batch_cost: 0.1953, reader_cost: 0.00061, ips: 
40.9595 samples/sec | ETA 00:50:07 2022-08-24 08:21:40 [INFO] [TRAIN] epoch: 115, iter: 144650/160000, loss: 0.3952, lr: 0.000116, batch_cost: 0.2048, reader_cost: 0.00044, ips: 39.0615 samples/sec | ETA 00:52:23 2022-08-24 08:21:50 [INFO] [TRAIN] epoch: 115, iter: 144700/160000, loss: 0.3762, lr: 0.000116, batch_cost: 0.2005, reader_cost: 0.00051, ips: 39.8937 samples/sec | ETA 00:51:08 2022-08-24 08:22:00 [INFO] [TRAIN] epoch: 115, iter: 144750/160000, loss: 0.4168, lr: 0.000115, batch_cost: 0.2076, reader_cost: 0.00138, ips: 38.5380 samples/sec | ETA 00:52:45 2022-08-24 08:22:10 [INFO] [TRAIN] epoch: 115, iter: 144800/160000, loss: 0.3936, lr: 0.000115, batch_cost: 0.2001, reader_cost: 0.00070, ips: 39.9743 samples/sec | ETA 00:50:41 2022-08-24 08:22:21 [INFO] [TRAIN] epoch: 115, iter: 144850/160000, loss: 0.3794, lr: 0.000115, batch_cost: 0.2111, reader_cost: 0.00071, ips: 37.8915 samples/sec | ETA 00:53:18 2022-08-24 08:22:30 [INFO] [TRAIN] epoch: 115, iter: 144900/160000, loss: 0.3866, lr: 0.000114, batch_cost: 0.1913, reader_cost: 0.00069, ips: 41.8288 samples/sec | ETA 00:48:07 2022-08-24 08:22:39 [INFO] [TRAIN] epoch: 115, iter: 144950/160000, loss: 0.3819, lr: 0.000114, batch_cost: 0.1760, reader_cost: 0.00675, ips: 45.4443 samples/sec | ETA 00:44:09 2022-08-24 08:22:50 [INFO] [TRAIN] epoch: 115, iter: 145000/160000, loss: 0.3769, lr: 0.000114, batch_cost: 0.2157, reader_cost: 0.00091, ips: 37.0832 samples/sec | ETA 00:53:55 2022-08-24 08:22:50 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 163s - batch_cost: 0.1632 - reader cost: 6.1865e-04 2022-08-24 08:25:34 [INFO] [EVAL] #Images: 2000 mIoU: 0.3775 Acc: 0.7776 Kappa: 0.7607 Dice: 0.5147 2022-08-24 08:25:34 [INFO] [EVAL] Class IoU: [0.6982 0.7936 0.9323 0.7453 0.6948 0.7805 0.7822 0.8106 0.538 0.6442 0.4983 0.5761 0.7221 0.3055 0.3062 0.4401 0.505 0.4445 0.6186 0.4441 0.7737 0.4239 0.6354 0.5181 0.335 0.4746 0.4522 0.484 0.4437 0.2571 0.2736 0.5504 0.3341 0.3621 0.3464 0.4062 0.4886 0.6007 0.2986 0.3987 0.1096 0.1194 0.3616 0.2502 0.2313 0.2568 0.362 0.5395 0.5983 0.5507 0.6206 0.413 0.1657 0.1539 0.6762 0.3234 0.8883 0.3925 0.418 0.2738 0.0916 0.2283 0.3346 0.169 0.5 0.736 0.2811 0.3967 0.1376 0.3825 0.5216 0.5867 0.3654 0.3143 0.5007 0.4056 0.5526 0.308 0.3388 0.2715 0.7127 0.4402 0.4312 0.085 0.1074 0.5292 0.1265 0.1413 0.3185 0.5393 0.4246 0.0774 0.1924 0.1112 0.0387 0.0643 0.1956 0.1587 0.2754 0.3236 0.2475 0.1135 0.2766 0.7701 0.1705 0.4776 0.1633 0.5633 0.0821 0.211 0.2087 0.5345 0.2065 0.6752 0.7149 0.0839 0.401 0.6266 0.2023 0.4019 0.4531 0.0784 0.2918 0.1984 0.2983 0.2644 0.5511 0.478 0.5748 0.3814 0.5637 0.0433 0.3334 0.4345 0.3055 0.206 0.1669 0.0404 0.23 0.4223 0.17 0.0134 0.2831 0.1455 0.2006 0.0039 0.423 0.0384 0.1368 0.2055] 2022-08-24 08:25:34 [INFO] [EVAL] Class Precision: [0.7977 0.8651 0.9647 0.8303 0.775 0.8704 0.8806 0.8657 0.6691 0.7413 0.7012 0.7285 0.7917 0.5324 0.5555 0.6064 0.6843 0.697 0.7846 0.6335 0.8392 0.662 0.7869 0.6216 0.4635 0.638 0.5591 0.7366 0.7584 0.4231 0.4861 0.7233 0.5645 0.4874 0.4447 0.5379 0.6602 0.8218 0.4392 0.5986 0.3315 0.3191 0.6341 0.4722 0.318 0.4696 0.6767 0.7357 0.7223 0.683 0.7803 0.5175 0.3253 0.6116 0.7243 0.6229 0.9316 0.6888 0.7053 0.4662 0.1269 0.4233 0.5001 0.6836 0.6197 0.838 0.4552 0.5331 0.3209 0.6957 0.7068 0.7083 0.579 0.3721 0.739 0.6326 0.7872 0.6505 0.7805 0.5412 0.8408 0.7188 0.8085 0.1822 0.1966 0.7152 0.4857 0.4077 0.8101 0.6759 0.5703 0.0958 0.4414 0.3191 0.1103 0.2052 0.5768 0.3874 0.3908 0.7124 0.528 0.1943 0.5363 0.8666 
0.7516 0.52 0.4177 0.7693 0.1589 0.3568 0.4461 0.7196 0.5889 0.8525 0.7232 0.3052 0.6721 0.7237 0.4351 0.5326 0.7221 0.306 0.6099 0.6528 0.7575 0.6238 0.8117 0.6329 0.8518 0.6485 0.6942 0.2784 0.6297 0.8019 0.6328 0.415 0.3752 0.1247 0.4997 0.7166 0.4059 0.0204 0.5672 0.54 0.5043 0.0058 0.8347 0.2286 0.3499 0.724 ] 2022-08-24 08:25:34 [INFO] [EVAL] Class Recall: [0.8485 0.9056 0.9652 0.8793 0.8703 0.8831 0.8751 0.9273 0.733 0.831 0.6327 0.7336 0.8915 0.4175 0.4056 0.6161 0.6583 0.551 0.7452 0.5977 0.9084 0.541 0.7675 0.7568 0.5472 0.6495 0.7028 0.5853 0.5168 0.3959 0.3851 0.6972 0.4501 0.5849 0.6105 0.6238 0.6528 0.6907 0.4825 0.5442 0.1408 0.1601 0.457 0.3474 0.459 0.3618 0.4376 0.6692 0.7771 0.7398 0.752 0.6717 0.2524 0.1706 0.9106 0.4022 0.9503 0.4772 0.5064 0.3988 0.248 0.3314 0.5028 0.1834 0.7213 0.8581 0.4236 0.608 0.1941 0.4594 0.6656 0.7736 0.4975 0.6692 0.6082 0.5305 0.6496 0.369 0.3745 0.3526 0.8238 0.5317 0.4803 0.1374 0.1913 0.6705 0.1461 0.1779 0.3441 0.7275 0.6244 0.2866 0.2544 0.1458 0.0563 0.0857 0.2284 0.2118 0.4825 0.3723 0.3178 0.2145 0.3636 0.8737 0.1807 0.8543 0.2115 0.6778 0.1453 0.3406 0.2816 0.6752 0.2413 0.7645 0.9843 0.1038 0.4985 0.8236 0.2744 0.621 0.5488 0.0953 0.3587 0.2219 0.3298 0.3145 0.6319 0.6613 0.6387 0.4808 0.75 0.0488 0.4147 0.4868 0.3713 0.2903 0.2312 0.0564 0.2987 0.507 0.2264 0.0377 0.3611 0.166 0.2499 0.0114 0.4617 0.0441 0.1835 0.223 ] 2022-08-24 08:25:34 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 08:25:42 [INFO] [TRAIN] epoch: 115, iter: 145050/160000, loss: 0.3830, lr: 0.000113, batch_cost: 0.1635, reader_cost: 0.00366, ips: 48.9178 samples/sec | ETA 00:40:44 2022-08-24 08:25:50 [INFO] [TRAIN] epoch: 115, iter: 145100/160000, loss: 0.3716, lr: 0.000113, batch_cost: 0.1609, reader_cost: 0.00135, ips: 49.7276 samples/sec | ETA 00:39:57 2022-08-24 08:25:59 [INFO] [TRAIN] epoch: 115, iter: 145150/160000, loss: 0.3657, lr: 0.000112, batch_cost: 0.1745, reader_cost: 0.00108, ips: 45.8538 samples/sec | ETA 00:43:10 2022-08-24 08:26:07 [INFO] [TRAIN] epoch: 115, iter: 145200/160000, loss: 0.3714, lr: 0.000112, batch_cost: 0.1652, reader_cost: 0.00071, ips: 48.4345 samples/sec | ETA 00:40:44 2022-08-24 08:26:18 [INFO] [TRAIN] epoch: 116, iter: 145250/160000, loss: 0.3787, lr: 0.000112, batch_cost: 0.2273, reader_cost: 0.05474, ips: 35.1992 samples/sec | ETA 00:55:52 2022-08-24 08:26:26 [INFO] [TRAIN] epoch: 116, iter: 145300/160000, loss: 0.3675, lr: 0.000111, batch_cost: 0.1577, reader_cost: 0.00085, ips: 50.7384 samples/sec | ETA 00:38:37 2022-08-24 08:26:35 [INFO] [TRAIN] epoch: 116, iter: 145350/160000, loss: 0.3921, lr: 0.000111, batch_cost: 0.1736, reader_cost: 0.00052, ips: 46.0938 samples/sec | ETA 00:42:22 2022-08-24 08:26:44 [INFO] [TRAIN] epoch: 116, iter: 145400/160000, loss: 0.3884, lr: 0.000111, batch_cost: 0.1861, reader_cost: 0.00064, ips: 42.9833 samples/sec | ETA 00:45:17 2022-08-24 08:26:54 [INFO] [TRAIN] epoch: 116, iter: 145450/160000, loss: 0.3823, lr: 0.000110, batch_cost: 0.1987, reader_cost: 0.00032, ips: 40.2566 samples/sec | ETA 00:48:11 2022-08-24 08:27:04 [INFO] [TRAIN] epoch: 116, iter: 145500/160000, loss: 0.3700, lr: 0.000110, batch_cost: 0.1976, reader_cost: 0.00079, ips: 40.4869 samples/sec | ETA 00:47:45 2022-08-24 08:27:13 [INFO] [TRAIN] epoch: 116, iter: 145550/160000, loss: 0.4130, lr: 0.000109, batch_cost: 0.1882, reader_cost: 0.00123, ips: 42.5016 samples/sec | ETA 00:45:19 2022-08-24 08:27:22 [INFO] [TRAIN] epoch: 116, iter: 145600/160000, loss: 0.3745, lr: 
0.000109, batch_cost: 0.1767, reader_cost: 0.00090, ips: 45.2677 samples/sec | ETA 00:42:24 2022-08-24 08:27:32 [INFO] [TRAIN] epoch: 116, iter: 145650/160000, loss: 0.3785, lr: 0.000109, batch_cost: 0.1913, reader_cost: 0.00100, ips: 41.8178 samples/sec | ETA 00:45:45 2022-08-24 08:27:41 [INFO] [TRAIN] epoch: 116, iter: 145700/160000, loss: 0.3927, lr: 0.000108, batch_cost: 0.1854, reader_cost: 0.00300, ips: 43.1552 samples/sec | ETA 00:44:10 2022-08-24 08:27:51 [INFO] [TRAIN] epoch: 116, iter: 145750/160000, loss: 0.3938, lr: 0.000108, batch_cost: 0.1899, reader_cost: 0.00534, ips: 42.1238 samples/sec | ETA 00:45:06 2022-08-24 08:28:01 [INFO] [TRAIN] epoch: 116, iter: 145800/160000, loss: 0.4143, lr: 0.000108, batch_cost: 0.1997, reader_cost: 0.00079, ips: 40.0520 samples/sec | ETA 00:47:16 2022-08-24 08:28:11 [INFO] [TRAIN] epoch: 116, iter: 145850/160000, loss: 0.3893, lr: 0.000107, batch_cost: 0.2083, reader_cost: 0.00074, ips: 38.4068 samples/sec | ETA 00:49:07 2022-08-24 08:28:21 [INFO] [TRAIN] epoch: 116, iter: 145900/160000, loss: 0.3947, lr: 0.000107, batch_cost: 0.1991, reader_cost: 0.00418, ips: 40.1797 samples/sec | ETA 00:46:47 2022-08-24 08:28:31 [INFO] [TRAIN] epoch: 116, iter: 145950/160000, loss: 0.3914, lr: 0.000106, batch_cost: 0.1928, reader_cost: 0.00086, ips: 41.4875 samples/sec | ETA 00:45:09 2022-08-24 08:28:40 [INFO] [TRAIN] epoch: 116, iter: 146000/160000, loss: 0.4175, lr: 0.000106, batch_cost: 0.1953, reader_cost: 0.00091, ips: 40.9731 samples/sec | ETA 00:45:33 2022-08-24 08:28:40 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 162s - batch_cost: 0.1616 - reader cost: 7.3195e-04 2022-08-24 08:31:22 [INFO] [EVAL] #Images: 2000 mIoU: 0.3784 Acc: 0.7783 Kappa: 0.7614 Dice: 0.5160 2022-08-24 08:31:22 [INFO] [EVAL] Class IoU: [0.7001 0.7958 0.9331 0.7397 0.6944 0.7794 0.7805 0.8112 0.5348 0.6519 0.4977 0.5815 0.7184 0.3118 0.3122 0.4452 0.4995 0.4425 0.6173 0.4445 0.7755 0.431 0.6315 0.5177 0.3374 0.48 0.4662 0.4698 0.3956 0.2671 0.2748 0.5567 0.3215 0.3588 0.3644 0.4201 0.4877 0.5995 0.301 0.392 0.1228 0.1449 0.3595 0.2517 0.233 0.2578 0.362 0.5426 0.5996 0.5478 0.6037 0.3954 0.1653 0.164 0.6788 0.3252 0.8838 0.4196 0.4056 0.2709 0.0826 0.2376 0.3537 0.1719 0.5019 0.7385 0.2677 0.3837 0.1218 0.3771 0.5081 0.5653 0.3815 0.265 0.5015 0.4101 0.5476 0.2975 0.4541 0.3006 0.71 0.4448 0.4397 0.041 0.096 0.5255 0.1218 0.1294 0.338 0.5559 0.4216 0.1236 0.2209 0.1054 0.0317 0.0526 0.1743 0.1666 0.2803 0.3167 0.2388 0.124 0.2926 0.7624 0.1779 0.5046 0.1576 0.5693 0.087 0.2251 0.2088 0.5281 0.2095 0.6662 0.6694 0.074 0.3809 0.6296 0.2321 0.4054 0.4537 0.0827 0.2999 0.1941 0.2938 0.2631 0.5427 0.4505 0.6356 0.3726 0.5661 0.0499 0.289 0.4385 0.3106 0.2179 0.1746 0.0472 0.2045 0.4135 0.1418 0.0272 0.2617 0.2224 0.1936 0.0044 0.4366 0.0378 0.1425 0.2079] 2022-08-24 08:31:22 [INFO] [EVAL] Class Precision: [0.7977 0.8661 0.9641 0.8224 0.7715 0.8715 0.8886 0.8644 0.668 0.7523 0.6954 0.7068 0.785 0.531 0.5631 0.608 0.7009 0.6866 0.7703 0.6264 0.8414 0.6827 0.7887 0.6279 0.4941 0.6749 0.5687 0.7942 0.7808 0.4743 0.4887 0.6895 0.5509 0.4979 0.4644 0.5616 0.6724 0.8353 0.4539 0.6087 0.3256 0.3125 0.5995 0.4401 0.3121 0.5052 0.7214 0.7523 0.7241 0.6949 0.7619 0.4894 0.3267 0.5813 0.7245 0.6609 0.9189 0.6917 0.6918 0.4892 0.1204 0.4442 0.5167 0.6014 0.6233 0.8393 0.4042 0.5314 0.2568 0.6889 0.719 0.6957 0.5397 0.3583 0.7507 0.6298 0.7234 0.6799 0.7619 0.584 0.8313 0.677 0.8023 0.1024 0.1752 0.7242 0.4251 0.4444 0.827 0.7116 0.5769 0.1818 0.469 0.3382 0.1173 
0.1638 0.5663 0.4169 0.4093 0.7068 0.5972 0.2199 0.5156 0.866 0.8302 0.5523 0.3907 0.7571 0.1557 0.3663 0.398 0.7392 0.5178 0.8355 0.6758 0.2628 0.7621 0.7335 0.4819 0.557 0.7197 0.3245 0.5928 0.6296 0.7463 0.6256 0.8069 0.5701 0.83 0.5515 0.6895 0.2816 0.6639 0.8193 0.6025 0.4129 0.3941 0.0965 0.4686 0.7101 0.431 0.0442 0.5639 0.6286 0.5534 0.0058 0.8273 0.216 0.3609 0.6801] 2022-08-24 08:31:22 [INFO] [EVAL] Class Recall: [0.8513 0.9075 0.9666 0.8803 0.8743 0.8805 0.8651 0.9295 0.7285 0.8301 0.6364 0.7665 0.8943 0.4302 0.412 0.6243 0.6349 0.5545 0.7566 0.6048 0.9083 0.539 0.7601 0.7467 0.5156 0.6243 0.721 0.5349 0.445 0.3795 0.3857 0.743 0.4358 0.5623 0.6286 0.6251 0.6397 0.6798 0.4719 0.524 0.1646 0.2128 0.4731 0.3702 0.4789 0.3448 0.4208 0.6607 0.7773 0.7213 0.744 0.673 0.2508 0.186 0.915 0.3903 0.9586 0.5161 0.4951 0.3778 0.2083 0.3381 0.5285 0.194 0.7204 0.8601 0.4422 0.58 0.1881 0.4544 0.6341 0.751 0.5656 0.5045 0.6017 0.5403 0.6926 0.346 0.5292 0.3825 0.8296 0.5646 0.4931 0.0641 0.175 0.6569 0.1459 0.1544 0.3637 0.7176 0.6103 0.2784 0.2946 0.1327 0.0416 0.0719 0.2012 0.2172 0.4708 0.3646 0.2847 0.2213 0.4036 0.8644 0.1846 0.8539 0.209 0.6965 0.1646 0.3686 0.3052 0.6491 0.2602 0.7668 0.9862 0.0934 0.4323 0.8163 0.3092 0.5982 0.5511 0.0999 0.3776 0.2191 0.3264 0.3123 0.6237 0.6822 0.7307 0.5345 0.7597 0.0572 0.3385 0.4855 0.3906 0.3156 0.2386 0.0845 0.2663 0.4974 0.1744 0.0659 0.3281 0.256 0.2294 0.0175 0.4804 0.0438 0.1907 0.2304] 2022-08-24 08:31:23 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 08:31:31 [INFO] [TRAIN] epoch: 116, iter: 146050/160000, loss: 0.4016, lr: 0.000106, batch_cost: 0.1695, reader_cost: 0.00443, ips: 47.2009 samples/sec | ETA 00:39:24 2022-08-24 08:31:39 [INFO] [TRAIN] epoch: 116, iter: 146100/160000, loss: 0.4126, lr: 0.000105, batch_cost: 0.1631, reader_cost: 0.00132, ips: 49.0590 samples/sec | ETA 00:37:46 2022-08-24 08:31:48 [INFO] [TRAIN] epoch: 116, iter: 146150/160000, loss: 0.3903, lr: 0.000105, batch_cost: 0.1684, reader_cost: 0.00154, ips: 47.5171 samples/sec | ETA 00:38:51 2022-08-24 08:31:56 [INFO] [TRAIN] epoch: 116, iter: 146200/160000, loss: 0.3772, lr: 0.000104, batch_cost: 0.1622, reader_cost: 0.00108, ips: 49.3294 samples/sec | ETA 00:37:18 2022-08-24 08:32:04 [INFO] [TRAIN] epoch: 116, iter: 146250/160000, loss: 0.3760, lr: 0.000104, batch_cost: 0.1694, reader_cost: 0.00059, ips: 47.2162 samples/sec | ETA 00:38:49 2022-08-24 08:32:12 [INFO] [TRAIN] epoch: 116, iter: 146300/160000, loss: 0.3927, lr: 0.000104, batch_cost: 0.1657, reader_cost: 0.00125, ips: 48.2857 samples/sec | ETA 00:37:49 2022-08-24 08:32:21 [INFO] [TRAIN] epoch: 116, iter: 146350/160000, loss: 0.3945, lr: 0.000103, batch_cost: 0.1777, reader_cost: 0.00083, ips: 45.0189 samples/sec | ETA 00:40:25 2022-08-24 08:32:31 [INFO] [TRAIN] epoch: 116, iter: 146400/160000, loss: 0.3841, lr: 0.000103, batch_cost: 0.2006, reader_cost: 0.00042, ips: 39.8780 samples/sec | ETA 00:45:28 2022-08-24 08:32:41 [INFO] [TRAIN] epoch: 116, iter: 146450/160000, loss: 0.3895, lr: 0.000103, batch_cost: 0.1834, reader_cost: 0.00708, ips: 43.6171 samples/sec | ETA 00:41:25 2022-08-24 08:32:50 [INFO] [TRAIN] epoch: 116, iter: 146500/160000, loss: 0.3852, lr: 0.000102, batch_cost: 0.1935, reader_cost: 0.00036, ips: 41.3502 samples/sec | ETA 00:43:31 2022-08-24 08:33:06 [INFO] [TRAIN] epoch: 117, iter: 146550/160000, loss: 0.3580, lr: 0.000102, batch_cost: 0.3242, reader_cost: 0.15186, ips: 24.6733 samples/sec | ETA 01:12:40 2022-08-24 08:33:15 [INFO] 
[TRAIN] epoch: 117, iter: 146600/160000, loss: 0.3814, lr: 0.000101, batch_cost: 0.1795, reader_cost: 0.00122, ips: 44.5797 samples/sec | ETA 00:40:04 2022-08-24 08:33:25 [INFO] [TRAIN] epoch: 117, iter: 146650/160000, loss: 0.3656, lr: 0.000101, batch_cost: 0.1955, reader_cost: 0.00202, ips: 40.9228 samples/sec | ETA 00:43:29 2022-08-24 08:33:34 [INFO] [TRAIN] epoch: 117, iter: 146700/160000, loss: 0.4159, lr: 0.000101, batch_cost: 0.1735, reader_cost: 0.00062, ips: 46.1030 samples/sec | ETA 00:38:27 2022-08-24 08:33:43 [INFO] [TRAIN] epoch: 117, iter: 146750/160000, loss: 0.3861, lr: 0.000100, batch_cost: 0.1896, reader_cost: 0.00494, ips: 42.1938 samples/sec | ETA 00:41:52 2022-08-24 08:33:53 [INFO] [TRAIN] epoch: 117, iter: 146800/160000, loss: 0.3688, lr: 0.000100, batch_cost: 0.1911, reader_cost: 0.00048, ips: 41.8572 samples/sec | ETA 00:42:02 2022-08-24 08:34:03 [INFO] [TRAIN] epoch: 117, iter: 146850/160000, loss: 0.4031, lr: 0.000100, batch_cost: 0.2006, reader_cost: 0.00058, ips: 39.8745 samples/sec | ETA 00:43:58 2022-08-24 08:34:13 [INFO] [TRAIN] epoch: 117, iter: 146900/160000, loss: 0.4179, lr: 0.000099, batch_cost: 0.1976, reader_cost: 0.00036, ips: 40.4757 samples/sec | ETA 00:43:09 2022-08-24 08:34:23 [INFO] [TRAIN] epoch: 117, iter: 146950/160000, loss: 0.3688, lr: 0.000099, batch_cost: 0.2067, reader_cost: 0.00125, ips: 38.7061 samples/sec | ETA 00:44:57 2022-08-24 08:34:33 [INFO] [TRAIN] epoch: 117, iter: 147000/160000, loss: 0.4021, lr: 0.000098, batch_cost: 0.1921, reader_cost: 0.00055, ips: 41.6541 samples/sec | ETA 00:41:36 2022-08-24 08:34:33 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 193s - batch_cost: 0.1929 - reader cost: 7.1948e-04 2022-08-24 08:37:46 [INFO] [EVAL] #Images: 2000 mIoU: 0.3776 Acc: 0.7774 Kappa: 0.7603 Dice: 0.5153 2022-08-24 08:37:46 [INFO] [EVAL] Class IoU: [0.6983 0.7974 0.9337 0.7397 0.6948 0.7806 0.7795 0.8122 0.5398 0.6246 0.5015 0.5724 0.7163 0.3012 0.3218 0.4419 0.4975 0.4301 0.6145 0.4487 0.7682 0.4118 0.634 0.5156 0.3417 0.4611 0.4586 0.4796 0.409 0.2631 0.2826 0.5628 0.3232 0.3487 0.3744 0.4212 0.4923 0.6013 0.3019 0.3862 0.1371 0.1271 0.3563 0.2594 0.2424 0.2618 0.3518 0.5356 0.6202 0.5532 0.6092 0.3883 0.1589 0.172 0.6686 0.3384 0.8886 0.442 0.3463 0.2828 0.0939 0.2404 0.3192 0.172 0.4972 0.7403 0.2669 0.3918 0.129 0.3761 0.5176 0.4997 0.3827 0.2708 0.5015 0.4214 0.5539 0.2969 0.392 0.2982 0.7142 0.4272 0.4227 0.0385 0.0971 0.5199 0.1171 0.1257 0.3385 0.5299 0.4271 0.1152 0.2271 0.1022 0.0478 0.0767 0.1714 0.1859 0.2918 0.3179 0.2309 0.1253 0.2749 0.7602 0.1712 0.7332 0.146 0.5674 0.0813 0.2302 0.2053 0.4824 0.206 0.6712 0.6806 0.0821 0.3725 0.6296 0.2065 0.3977 0.4196 0.0822 0.3307 0.1989 0.2959 0.2354 0.5515 0.4345 0.5793 0.3679 0.572 0.0502 0.2903 0.4443 0.279 0.2178 0.1787 0.048 0.2199 0.4065 0.1458 0.0044 0.265 0.2704 0.2167 0.0017 0.4143 0.0365 0.1495 0.2066] 2022-08-24 08:37:46 [INFO] [EVAL] Class Precision: [0.7905 0.8656 0.9664 0.8275 0.7725 0.8785 0.8723 0.8683 0.6935 0.7509 0.7125 0.7242 0.7868 0.5258 0.5512 0.6024 0.6959 0.7027 0.7685 0.6253 0.8275 0.657 0.7884 0.618 0.4638 0.7335 0.5491 0.7992 0.7629 0.3746 0.4842 0.7181 0.5534 0.4662 0.4849 0.5733 0.662 0.827 0.4432 0.6201 0.33 0.343 0.6175 0.4809 0.3325 0.507 0.7311 0.7287 0.732 0.6717 0.7471 0.4678 0.3389 0.6029 0.705 0.6676 0.9315 0.687 0.6615 0.466 0.1341 0.5326 0.4854 0.6131 0.6045 0.8517 0.4112 0.5273 0.2426 0.6323 0.701 0.5912 0.5693 0.3694 0.7511 0.6079 0.7297 0.6382 0.7755 0.5595 0.8456 0.7167 0.8289 0.1121 0.1755 0.7352 
0.4893 0.4138 0.824 0.6558 0.5997 0.181 0.4397 0.3411 0.1575 0.216 0.6105 0.4528 0.4255 0.7532 0.6608 0.2313 0.5335 0.8635 0.7422 0.8578 0.5089 0.7278 0.1872 0.3743 0.4299 0.6113 0.557 0.8291 0.6877 0.2558 0.7163 0.7297 0.4318 0.5355 0.7385 0.3058 0.6753 0.6424 0.7563 0.6406 0.8263 0.5277 0.8142 0.5158 0.7124 0.3394 0.5923 0.8118 0.6414 0.4214 0.3535 0.1419 0.4721 0.721 0.4454 0.0098 0.5941 0.6155 0.5136 0.0024 0.868 0.2531 0.3544 0.732 ] 2022-08-24 08:37:46 [INFO] [EVAL] Class Recall: [0.8569 0.9101 0.965 0.8746 0.8735 0.8751 0.8799 0.9263 0.7089 0.7879 0.6288 0.7319 0.8887 0.4135 0.436 0.6238 0.6358 0.5259 0.7542 0.6137 0.9147 0.5245 0.764 0.7567 0.5649 0.5539 0.7356 0.5453 0.4686 0.4691 0.4042 0.7224 0.4371 0.5805 0.6216 0.6136 0.6576 0.6879 0.4864 0.5059 0.19 0.168 0.4572 0.3602 0.472 0.3512 0.4041 0.6691 0.8024 0.7582 0.7675 0.6955 0.2303 0.194 0.9284 0.407 0.9507 0.5534 0.4209 0.4184 0.2388 0.3047 0.4825 0.1929 0.7368 0.8498 0.4319 0.6039 0.2159 0.4814 0.6642 0.7635 0.5387 0.5035 0.6014 0.5788 0.697 0.357 0.4422 0.3896 0.8212 0.514 0.4631 0.0553 0.1784 0.6397 0.1334 0.1529 0.3648 0.734 0.5973 0.2406 0.3196 0.1273 0.0642 0.1063 0.1925 0.2397 0.4815 0.3549 0.262 0.2148 0.3618 0.864 0.182 0.8346 0.1699 0.7201 0.1256 0.3742 0.2821 0.6959 0.2464 0.779 0.9849 0.1078 0.4369 0.8211 0.2836 0.6071 0.4928 0.1011 0.3932 0.2236 0.3271 0.2712 0.6238 0.7111 0.6676 0.5619 0.7438 0.0556 0.3628 0.4953 0.3306 0.3106 0.2654 0.0677 0.2916 0.4824 0.1782 0.008 0.3236 0.3254 0.2726 0.0051 0.4422 0.0409 0.2054 0.2236] 2022-08-24 08:37:46 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 08:37:56 [INFO] [TRAIN] epoch: 117, iter: 147050/160000, loss: 0.3931, lr: 0.000098, batch_cost: 0.2069, reader_cost: 0.00388, ips: 38.6740 samples/sec | ETA 00:44:38 2022-08-24 08:38:05 [INFO] [TRAIN] epoch: 117, iter: 147100/160000, loss: 0.3868, lr: 0.000098, batch_cost: 0.1635, reader_cost: 0.00093, ips: 48.9441 samples/sec | ETA 00:35:08 2022-08-24 08:38:13 [INFO] [TRAIN] epoch: 117, iter: 147150/160000, loss: 0.3616, lr: 0.000097, batch_cost: 0.1570, reader_cost: 0.00123, ips: 50.9443 samples/sec | ETA 00:33:37 2022-08-24 08:38:21 [INFO] [TRAIN] epoch: 117, iter: 147200/160000, loss: 0.3818, lr: 0.000097, batch_cost: 0.1666, reader_cost: 0.00094, ips: 48.0284 samples/sec | ETA 00:35:32 2022-08-24 08:38:30 [INFO] [TRAIN] epoch: 117, iter: 147250/160000, loss: 0.4009, lr: 0.000097, batch_cost: 0.1761, reader_cost: 0.00076, ips: 45.4359 samples/sec | ETA 00:37:24 2022-08-24 08:38:39 [INFO] [TRAIN] epoch: 117, iter: 147300/160000, loss: 0.3972, lr: 0.000096, batch_cost: 0.1828, reader_cost: 0.00070, ips: 43.7563 samples/sec | ETA 00:38:41 2022-08-24 08:38:51 [INFO] [TRAIN] epoch: 117, iter: 147350/160000, loss: 0.3902, lr: 0.000096, batch_cost: 0.2429, reader_cost: 0.00082, ips: 32.9384 samples/sec | ETA 00:51:12 2022-08-24 08:39:01 [INFO] [TRAIN] epoch: 117, iter: 147400/160000, loss: 0.3797, lr: 0.000095, batch_cost: 0.2099, reader_cost: 0.00070, ips: 38.1103 samples/sec | ETA 00:44:04 2022-08-24 08:39:10 [INFO] [TRAIN] epoch: 117, iter: 147450/160000, loss: 0.3813, lr: 0.000095, batch_cost: 0.1686, reader_cost: 0.00034, ips: 47.4476 samples/sec | ETA 00:35:16 2022-08-24 08:39:19 [INFO] [TRAIN] epoch: 117, iter: 147500/160000, loss: 0.3943, lr: 0.000095, batch_cost: 0.1771, reader_cost: 0.00122, ips: 45.1731 samples/sec | ETA 00:36:53 2022-08-24 08:39:30 [INFO] [TRAIN] epoch: 117, iter: 147550/160000, loss: 0.3958, lr: 0.000094, batch_cost: 0.2163, reader_cost: 0.00059, 
ips: 36.9777 samples/sec | ETA 00:44:53 2022-08-24 08:39:39 [INFO] [TRAIN] epoch: 117, iter: 147600/160000, loss: 0.3792, lr: 0.000094, batch_cost: 0.1966, reader_cost: 0.00080, ips: 40.6935 samples/sec | ETA 00:40:37 2022-08-24 08:39:49 [INFO] [TRAIN] epoch: 117, iter: 147650/160000, loss: 0.3791, lr: 0.000094, batch_cost: 0.1951, reader_cost: 0.00061, ips: 41.0089 samples/sec | ETA 00:40:09 2022-08-24 08:39:58 [INFO] [TRAIN] epoch: 117, iter: 147700/160000, loss: 0.3676, lr: 0.000093, batch_cost: 0.1788, reader_cost: 0.00111, ips: 44.7357 samples/sec | ETA 00:36:39 2022-08-24 08:40:07 [INFO] [TRAIN] epoch: 117, iter: 147750/160000, loss: 0.4063, lr: 0.000093, batch_cost: 0.1797, reader_cost: 0.00050, ips: 44.5129 samples/sec | ETA 00:36:41 2022-08-24 08:40:21 [INFO] [TRAIN] epoch: 118, iter: 147800/160000, loss: 0.3824, lr: 0.000092, batch_cost: 0.2722, reader_cost: 0.08773, ips: 29.3865 samples/sec | ETA 00:55:21 2022-08-24 08:40:30 [INFO] [TRAIN] epoch: 118, iter: 147850/160000, loss: 0.4085, lr: 0.000092, batch_cost: 0.1804, reader_cost: 0.00531, ips: 44.3576 samples/sec | ETA 00:36:31 2022-08-24 08:40:39 [INFO] [TRAIN] epoch: 118, iter: 147900/160000, loss: 0.4097, lr: 0.000092, batch_cost: 0.1841, reader_cost: 0.00033, ips: 43.4575 samples/sec | ETA 00:37:07 2022-08-24 08:40:48 [INFO] [TRAIN] epoch: 118, iter: 147950/160000, loss: 0.4223, lr: 0.000091, batch_cost: 0.1904, reader_cost: 0.00092, ips: 42.0164 samples/sec | ETA 00:38:14 2022-08-24 08:40:58 [INFO] [TRAIN] epoch: 118, iter: 148000/160000, loss: 0.3865, lr: 0.000091, batch_cost: 0.2006, reader_cost: 0.00048, ips: 39.8764 samples/sec | ETA 00:40:07 2022-08-24 08:40:58 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 176s - batch_cost: 0.1755 - reader cost: 6.9061e-04 2022-08-24 08:43:54 [INFO] [EVAL] #Images: 2000 mIoU: 0.3763 Acc: 0.7775 Kappa: 0.7604 Dice: 0.5141 2022-08-24 08:43:54 [INFO] [EVAL] Class IoU: [0.6971 0.7953 0.9319 0.7364 0.6954 0.7755 0.782 0.8157 0.5321 0.6448 0.5047 0.583 0.7201 0.2912 0.3116 0.4419 0.5134 0.4527 0.6068 0.4419 0.7744 0.4385 0.6303 0.5211 0.3378 0.4777 0.463 0.4701 0.4112 0.2462 0.2867 0.5697 0.3177 0.3597 0.3405 0.4139 0.4918 0.6017 0.3075 0.3768 0.1249 0.1226 0.3568 0.2548 0.2468 0.2613 0.3787 0.5483 0.6071 0.5416 0.6146 0.3918 0.171 0.1963 0.6713 0.3439 0.8849 0.4339 0.3194 0.2829 0.0808 0.2329 0.3177 0.1654 0.4847 0.7409 0.2591 0.3823 0.1188 0.3758 0.496 0.5423 0.3844 0.2411 0.5006 0.3942 0.5908 0.3034 0.3105 0.2932 0.7217 0.4279 0.3982 0.0603 0.1053 0.5194 0.1314 0.144 0.3444 0.5398 0.4163 0.0876 0.2217 0.1274 0.0506 0.0569 0.1495 0.2054 0.2871 0.3125 0.2144 0.1271 0.2871 0.7633 0.1718 0.5803 0.1364 0.5706 0.08 0.2162 0.2178 0.4817 0.2116 0.6219 0.6716 0.0808 0.4068 0.6344 0.2003 0.3661 0.4347 0.0723 0.3189 0.2118 0.3 0.2578 0.5661 0.4662 0.5601 0.3861 0.5658 0.0491 0.3197 0.4241 0.3187 0.2087 0.1758 0.0424 0.2077 0.4079 0.1812 0.0233 0.2814 0.1693 0.2256 0.0124 0.4487 0.0365 0.1484 0.2017] 2022-08-24 08:43:54 [INFO] [EVAL] Class Precision: [0.7885 0.874 0.9652 0.8131 0.7727 0.8715 0.8841 0.874 0.6759 0.7306 0.6935 0.7437 0.7938 0.5382 0.5494 0.6147 0.6824 0.6961 0.7493 0.6498 0.8405 0.6704 0.7756 0.6259 0.4805 0.6529 0.5586 0.7875 0.7726 0.4262 0.4833 0.7368 0.5418 0.4958 0.4383 0.5857 0.6652 0.8123 0.461 0.6166 0.3167 0.3241 0.5891 0.4975 0.3449 0.5415 0.7206 0.7263 0.7297 0.6982 0.8151 0.489 0.3179 0.6422 0.7221 0.6878 0.9252 0.6831 0.6652 0.53 0.1234 0.4286 0.4588 0.6139 0.5847 0.8604 0.4452 0.5331 0.207 0.6413 0.7264 0.6614 0.5902 0.3561 0.7403 
0.5652 0.7206 0.6676 0.7387 0.5658 0.8646 0.6665 0.837 0.1533 0.186 0.7184 0.4162 0.431 0.8518 0.6798 0.5555 0.1171 0.4316 0.3276 0.1131 0.1846 0.6404 0.4251 0.416 0.7458 0.6093 0.2264 0.5142 0.8579 0.8146 0.6473 0.398 0.7782 0.1713 0.3616 0.4317 0.652 0.5443 0.8049 0.6782 0.2913 0.7877 0.747 0.4152 0.5229 0.7144 0.2971 0.6623 0.6312 0.7362 0.6126 0.886 0.5756 0.8043 0.5844 0.7083 0.3301 0.6054 0.831 0.6178 0.4189 0.3936 0.146 0.4754 0.7215 0.3942 0.0378 0.5569 0.63 0.5377 0.0154 0.8228 0.2469 0.3995 0.7051] 2022-08-24 08:43:54 [INFO] [EVAL] Class Recall: [0.8575 0.8983 0.9643 0.8865 0.8742 0.8756 0.8712 0.9244 0.7144 0.8459 0.6496 0.7296 0.8857 0.3882 0.4187 0.6113 0.6746 0.5642 0.7614 0.58 0.9078 0.559 0.7709 0.7568 0.5322 0.6404 0.7301 0.5385 0.4679 0.3684 0.4135 0.7152 0.4345 0.567 0.6041 0.5853 0.6535 0.6988 0.4802 0.4921 0.1711 0.1647 0.475 0.3431 0.4644 0.3355 0.4438 0.6912 0.7833 0.7071 0.7141 0.6634 0.27 0.2203 0.9051 0.4075 0.9531 0.5433 0.3806 0.3777 0.1893 0.3377 0.5081 0.1847 0.7392 0.8421 0.3826 0.5747 0.2181 0.4758 0.6099 0.7508 0.5244 0.4274 0.6073 0.5657 0.7663 0.3573 0.3488 0.3783 0.8137 0.5445 0.4316 0.0903 0.1953 0.6521 0.1611 0.1778 0.3663 0.7238 0.6242 0.258 0.313 0.1725 0.084 0.076 0.1632 0.2844 0.4808 0.3497 0.2486 0.2248 0.394 0.8738 0.1788 0.8487 0.1719 0.6815 0.1305 0.3497 0.3054 0.6483 0.2572 0.7324 0.9857 0.1006 0.4568 0.808 0.279 0.5496 0.5261 0.0872 0.3808 0.2417 0.3361 0.308 0.6105 0.7105 0.6485 0.5323 0.7378 0.0545 0.4039 0.4642 0.3969 0.2937 0.2411 0.0564 0.2695 0.4841 0.2512 0.0574 0.3626 0.188 0.2799 0.0594 0.4967 0.041 0.191 0.2202] 2022-08-24 08:43:54 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 08:44:03 [INFO] [TRAIN] epoch: 118, iter: 148050/160000, loss: 0.3931, lr: 0.000090, batch_cost: 0.1710, reader_cost: 0.00389, ips: 46.7795 samples/sec | ETA 00:34:03 2022-08-24 08:44:12 [INFO] [TRAIN] epoch: 118, iter: 148100/160000, loss: 0.3966, lr: 0.000090, batch_cost: 0.1864, reader_cost: 0.00128, ips: 42.9237 samples/sec | ETA 00:36:57 2022-08-24 08:44:22 [INFO] [TRAIN] epoch: 118, iter: 148150/160000, loss: 0.3746, lr: 0.000090, batch_cost: 0.1997, reader_cost: 0.00095, ips: 40.0683 samples/sec | ETA 00:39:25 2022-08-24 08:44:32 [INFO] [TRAIN] epoch: 118, iter: 148200/160000, loss: 0.3802, lr: 0.000089, batch_cost: 0.1907, reader_cost: 0.00042, ips: 41.9467 samples/sec | ETA 00:37:30 2022-08-24 08:44:41 [INFO] [TRAIN] epoch: 118, iter: 148250/160000, loss: 0.4154, lr: 0.000089, batch_cost: 0.1863, reader_cost: 0.00071, ips: 42.9517 samples/sec | ETA 00:36:28 2022-08-24 08:44:50 [INFO] [TRAIN] epoch: 118, iter: 148300/160000, loss: 0.3768, lr: 0.000089, batch_cost: 0.1726, reader_cost: 0.00077, ips: 46.3509 samples/sec | ETA 00:33:39 2022-08-24 08:44:59 [INFO] [TRAIN] epoch: 118, iter: 148350/160000, loss: 0.3796, lr: 0.000088, batch_cost: 0.1762, reader_cost: 0.00033, ips: 45.3906 samples/sec | ETA 00:34:13 2022-08-24 08:45:07 [INFO] [TRAIN] epoch: 118, iter: 148400/160000, loss: 0.4005, lr: 0.000088, batch_cost: 0.1719, reader_cost: 0.00173, ips: 46.5370 samples/sec | ETA 00:33:14 2022-08-24 08:45:18 [INFO] [TRAIN] epoch: 118, iter: 148450/160000, loss: 0.3793, lr: 0.000087, batch_cost: 0.2088, reader_cost: 0.00051, ips: 38.3110 samples/sec | ETA 00:40:11 2022-08-24 08:45:27 [INFO] [TRAIN] epoch: 118, iter: 148500/160000, loss: 0.3928, lr: 0.000087, batch_cost: 0.1929, reader_cost: 0.00055, ips: 41.4746 samples/sec | ETA 00:36:58 2022-08-24 08:45:35 [INFO] [TRAIN] epoch: 118, iter: 148550/160000, 
loss: 0.3721, lr: 0.000087, batch_cost: 0.1570, reader_cost: 0.00147, ips: 50.9469 samples/sec | ETA 00:29:57 2022-08-24 08:45:45 [INFO] [TRAIN] epoch: 118, iter: 148600/160000, loss: 0.4087, lr: 0.000086, batch_cost: 0.1985, reader_cost: 0.00061, ips: 40.3119 samples/sec | ETA 00:37:42 2022-08-24 08:45:56 [INFO] [TRAIN] epoch: 118, iter: 148650/160000, loss: 0.3851, lr: 0.000086, batch_cost: 0.2166, reader_cost: 0.00054, ips: 36.9284 samples/sec | ETA 00:40:58 2022-08-24 08:46:06 [INFO] [TRAIN] epoch: 118, iter: 148700/160000, loss: 0.4034, lr: 0.000086, batch_cost: 0.2100, reader_cost: 0.00077, ips: 38.0935 samples/sec | ETA 00:39:33 2022-08-24 08:46:16 [INFO] [TRAIN] epoch: 118, iter: 148750/160000, loss: 0.3909, lr: 0.000085, batch_cost: 0.1916, reader_cost: 0.00093, ips: 41.7631 samples/sec | ETA 00:35:55 2022-08-24 08:46:26 [INFO] [TRAIN] epoch: 118, iter: 148800/160000, loss: 0.3908, lr: 0.000085, batch_cost: 0.2066, reader_cost: 0.00071, ips: 38.7312 samples/sec | ETA 00:38:33 2022-08-24 08:46:37 [INFO] [TRAIN] epoch: 118, iter: 148850/160000, loss: 0.3800, lr: 0.000084, batch_cost: 0.2061, reader_cost: 0.00041, ips: 38.8243 samples/sec | ETA 00:38:17 2022-08-24 08:46:47 [INFO] [TRAIN] epoch: 118, iter: 148900/160000, loss: 0.3806, lr: 0.000084, batch_cost: 0.2036, reader_cost: 0.00042, ips: 39.2986 samples/sec | ETA 00:37:39 2022-08-24 08:46:57 [INFO] [TRAIN] epoch: 118, iter: 148950/160000, loss: 0.3947, lr: 0.000084, batch_cost: 0.2115, reader_cost: 0.00069, ips: 37.8236 samples/sec | ETA 00:38:57 2022-08-24 08:47:07 [INFO] [TRAIN] epoch: 118, iter: 149000/160000, loss: 0.3925, lr: 0.000083, batch_cost: 0.1943, reader_cost: 0.00044, ips: 41.1773 samples/sec | ETA 00:35:37 2022-08-24 08:47:07 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 
1000/1000 - 134s - batch_cost: 0.1335 - reader cost: 7.0176e-04 2022-08-24 08:49:21 [INFO] [EVAL] #Images: 2000 mIoU: 0.3772 Acc: 0.7780 Kappa: 0.7610 Dice: 0.5141 2022-08-24 08:49:21 [INFO] [EVAL] Class IoU: [0.6979 0.7942 0.9324 0.7418 0.6946 0.7765 0.7818 0.8144 0.5312 0.6459 0.5085 0.5813 0.7195 0.3119 0.319 0.4458 0.5102 0.4519 0.6165 0.4548 0.7746 0.4188 0.6392 0.5165 0.3389 0.4608 0.4491 0.4784 0.4673 0.2476 0.2743 0.5696 0.34 0.3659 0.3435 0.417 0.4888 0.5991 0.3059 0.3763 0.1257 0.1302 0.3554 0.2618 0.2594 0.2669 0.4001 0.5471 0.6206 0.5474 0.6127 0.3818 0.1726 0.1951 0.6779 0.3382 0.8889 0.4293 0.2308 0.2803 0.092 0.2328 0.3043 0.191 0.493 0.7401 0.2598 0.381 0.1207 0.3696 0.5133 0.5592 0.3952 0.2817 0.4992 0.4171 0.5623 0.3048 0.2104 0.2669 0.7295 0.4402 0.4398 0.0384 0.09 0.5308 0.1356 0.1291 0.4033 0.5385 0.4349 0.0774 0.2135 0.1052 0.0487 0.0676 0.1604 0.1961 0.3034 0.3227 0.2052 0.1145 0.2869 0.778 0.1758 0.5905 0.1225 0.565 0.0852 0.2035 0.2073 0.4076 0.2059 0.6608 0.7145 0.0739 0.3954 0.6308 0.1887 0.4097 0.4411 0.0819 0.3125 0.1955 0.3 0.2481 0.5628 0.4651 0.5665 0.3892 0.5729 0.0529 0.2968 0.4258 0.3022 0.2161 0.1698 0.048 0.2114 0.4287 0.1818 0.019 0.2742 0.2229 0.2083 0.0036 0.4501 0.0381 0.1366 0.2114] 2022-08-24 08:49:21 [INFO] [EVAL] Class Precision: [0.7936 0.8673 0.9633 0.8265 0.7817 0.8729 0.8873 0.8719 0.6776 0.7589 0.7119 0.7174 0.7882 0.5456 0.516 0.6163 0.6806 0.7089 0.7649 0.6331 0.8498 0.6511 0.7815 0.6118 0.4895 0.6389 0.5463 0.7754 0.7461 0.387 0.5 0.7081 0.5681 0.5008 0.4487 0.5574 0.6751 0.8282 0.4642 0.6257 0.3316 0.3252 0.5741 0.4778 0.3688 0.5379 0.7899 0.7625 0.7149 0.7004 0.7913 0.4649 0.3355 0.5669 0.7296 0.677 0.93 0.6781 0.5715 0.523 0.1266 0.4236 0.4117 0.6387 0.6045 0.8543 0.4357 0.5157 0.2244 0.6271 0.7058 0.6826 0.5764 0.366 0.7478 0.6205 0.7533 0.6521 0.7619 0.5061 0.876 0.6825 0.8046 0.1182 0.1884 0.7055 0.4969 0.3979 0.8082 0.7053 0.6073 0.0997 0.4306 0.3373 0.1423 0.1829 0.6427 0.4077 0.4759 0.7418 0.7195 0.1876 0.5284 0.8867 0.7626 0.6485 0.3349 0.778 0.1647 0.3432 0.4452 0.4973 0.6101 0.8126 0.7217 0.3073 0.7092 0.7344 0.3369 0.557 0.7121 0.2794 0.669 0.6528 0.7695 0.6169 0.8455 0.5778 0.8336 0.62 0.7338 0.2994 0.5366 0.8314 0.6041 0.4004 0.3703 0.1174 0.5006 0.6929 0.4029 0.0358 0.5958 0.594 0.5502 0.0051 0.8187 0.226 0.3677 0.7203] 2022-08-24 08:49:21 [INFO] [EVAL] Class Recall: [0.8527 0.9041 0.9668 0.8787 0.8618 0.8755 0.868 0.9252 0.7109 0.8126 0.6403 0.7539 0.892 0.4214 0.4552 0.6171 0.6709 0.5549 0.7607 0.6175 0.8975 0.54 0.7783 0.7683 0.5242 0.6231 0.7163 0.5553 0.5556 0.4074 0.3779 0.7445 0.4585 0.576 0.5944 0.6234 0.6392 0.6842 0.4728 0.4856 0.1683 0.1784 0.4827 0.3668 0.4664 0.3463 0.4477 0.6595 0.8246 0.7148 0.7308 0.6811 0.2622 0.2293 0.9054 0.4033 0.9527 0.5392 0.2791 0.3766 0.2517 0.3408 0.5383 0.2141 0.7276 0.847 0.3915 0.5933 0.2072 0.4737 0.653 0.7557 0.557 0.5499 0.6002 0.5599 0.6892 0.364 0.2252 0.3608 0.8135 0.5536 0.4924 0.0539 0.147 0.682 0.1572 0.1604 0.446 0.6948 0.605 0.2571 0.2975 0.1325 0.0689 0.0969 0.1761 0.2741 0.4555 0.3635 0.2231 0.2272 0.3856 0.864 0.186 0.8683 0.1619 0.6736 0.1501 0.3334 0.2795 0.6931 0.2371 0.7796 0.9863 0.0886 0.4719 0.8173 0.3002 0.6076 0.5369 0.1039 0.3696 0.2182 0.3297 0.2933 0.6273 0.7046 0.6387 0.5111 0.7231 0.0604 0.3991 0.4661 0.3768 0.3195 0.2388 0.075 0.2679 0.5292 0.2488 0.0389 0.3368 0.2629 0.2511 0.0125 0.4999 0.0438 0.1785 0.2303] 2022-08-24 08:49:21 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 
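(Note on the [EVAL] summaries above: mIoU, Acc, Kappa and Dice follow the usual semantic-segmentation definitions — mIoU is, in effect, the mean of the per-class values printed in the Class IoU array, Acc is overall pixel accuracy, and Kappa and Dice are derived from the same confusion matrix. The snippet below is a minimal NumPy sketch of those standard formulas for a hypothetical K x K confusion matrix `conf`; it is an illustration under those assumptions, not the PaddleSeg evaluation code.)

import numpy as np

def summarize(conf):
    # conf: KxK confusion matrix, rows = ground truth, cols = prediction.
    # Illustrative sketch of the standard metric definitions (assumption:
    # the mean is taken over all K classes, zero-union classes count as 0).
    tp = np.diag(conf).astype(float)
    gt = conf.sum(axis=1).astype(float)        # ground-truth pixels per class
    pred = conf.sum(axis=0).astype(float)      # predicted pixels per class
    union = gt + pred - tp

    iou = np.divide(tp, union, out=np.zeros_like(tp), where=union > 0)
    dice = np.divide(2 * tp, gt + pred, out=np.zeros_like(tp), where=(gt + pred) > 0)

    total = conf.sum()
    acc = tp.sum() / total                             # overall pixel accuracy
    expected = (gt * pred).sum() / (total * total)     # chance agreement
    kappa = (acc - expected) / (1.0 - expected)        # Cohen's kappa

    return iou.mean(), acc, kappa, dice.mean()         # mIoU, Acc, Kappa, mean Dice

(Under these definitions, averaging the printed Class IoU entries should reproduce the reported mIoU of roughly 0.377, and the per-class Precision and Recall arrays correspond to tp/pred and tp/gt respectively.)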
2022-08-24 08:49:32 [INFO] [TRAIN] epoch: 119, iter: 149050/160000, loss: 0.3861, lr: 0.000083, batch_cost: 0.2154, reader_cost: 0.05366, ips: 37.1376 samples/sec | ETA 00:39:18 2022-08-24 08:49:40 [INFO] [TRAIN] epoch: 119, iter: 149100/160000, loss: 0.3862, lr: 0.000083, batch_cost: 0.1694, reader_cost: 0.00082, ips: 47.2239 samples/sec | ETA 00:30:46 2022-08-24 08:49:49 [INFO] [TRAIN] epoch: 119, iter: 149150/160000, loss: 0.3764, lr: 0.000082, batch_cost: 0.1703, reader_cost: 0.00075, ips: 46.9858 samples/sec | ETA 00:30:47 2022-08-24 08:49:59 [INFO] [TRAIN] epoch: 119, iter: 149200/160000, loss: 0.3737, lr: 0.000082, batch_cost: 0.2038, reader_cost: 0.00054, ips: 39.2604 samples/sec | ETA 00:36:40 2022-08-24 08:50:09 [INFO] [TRAIN] epoch: 119, iter: 149250/160000, loss: 0.3779, lr: 0.000081, batch_cost: 0.1986, reader_cost: 0.00046, ips: 40.2829 samples/sec | ETA 00:35:34 2022-08-24 08:50:18 [INFO] [TRAIN] epoch: 119, iter: 149300/160000, loss: 0.4021, lr: 0.000081, batch_cost: 0.1848, reader_cost: 0.00058, ips: 43.2879 samples/sec | ETA 00:32:57 2022-08-24 08:50:28 [INFO] [TRAIN] epoch: 119, iter: 149350/160000, loss: 0.3838, lr: 0.000081, batch_cost: 0.2037, reader_cost: 0.00058, ips: 39.2748 samples/sec | ETA 00:36:09 2022-08-24 08:50:38 [INFO] [TRAIN] epoch: 119, iter: 149400/160000, loss: 0.3880, lr: 0.000080, batch_cost: 0.1939, reader_cost: 0.00075, ips: 41.2649 samples/sec | ETA 00:34:15 2022-08-24 08:50:48 [INFO] [TRAIN] epoch: 119, iter: 149450/160000, loss: 0.3985, lr: 0.000080, batch_cost: 0.2049, reader_cost: 0.00107, ips: 39.0374 samples/sec | ETA 00:36:02 2022-08-24 08:50:58 [INFO] [TRAIN] epoch: 119, iter: 149500/160000, loss: 0.3754, lr: 0.000080, batch_cost: 0.1928, reader_cost: 0.00048, ips: 41.4854 samples/sec | ETA 00:33:44 2022-08-24 08:51:08 [INFO] [TRAIN] epoch: 119, iter: 149550/160000, loss: 0.4132, lr: 0.000079, batch_cost: 0.1950, reader_cost: 0.00056, ips: 41.0166 samples/sec | ETA 00:33:58 2022-08-24 08:51:18 [INFO] [TRAIN] epoch: 119, iter: 149600/160000, loss: 0.4152, lr: 0.000079, batch_cost: 0.2012, reader_cost: 0.00037, ips: 39.7666 samples/sec | ETA 00:34:52 2022-08-24 08:51:28 [INFO] [TRAIN] epoch: 119, iter: 149650/160000, loss: 0.3882, lr: 0.000078, batch_cost: 0.2129, reader_cost: 0.00072, ips: 37.5734 samples/sec | ETA 00:36:43 2022-08-24 08:51:39 [INFO] [TRAIN] epoch: 119, iter: 149700/160000, loss: 0.3856, lr: 0.000078, batch_cost: 0.2192, reader_cost: 0.00067, ips: 36.4895 samples/sec | ETA 00:37:38 2022-08-24 08:51:48 [INFO] [TRAIN] epoch: 119, iter: 149750/160000, loss: 0.4120, lr: 0.000078, batch_cost: 0.1752, reader_cost: 0.00044, ips: 45.6545 samples/sec | ETA 00:29:56 2022-08-24 08:51:58 [INFO] [TRAIN] epoch: 119, iter: 149800/160000, loss: 0.3870, lr: 0.000077, batch_cost: 0.1997, reader_cost: 0.00344, ips: 40.0687 samples/sec | ETA 00:33:56 2022-08-24 08:52:07 [INFO] [TRAIN] epoch: 119, iter: 149850/160000, loss: 0.3888, lr: 0.000077, batch_cost: 0.1833, reader_cost: 0.00038, ips: 43.6352 samples/sec | ETA 00:31:00 2022-08-24 08:52:18 [INFO] [TRAIN] epoch: 119, iter: 149900/160000, loss: 0.3730, lr: 0.000076, batch_cost: 0.2070, reader_cost: 0.00046, ips: 38.6453 samples/sec | ETA 00:34:50 2022-08-24 08:52:28 [INFO] [TRAIN] epoch: 119, iter: 149950/160000, loss: 0.3945, lr: 0.000076, batch_cost: 0.2057, reader_cost: 0.00043, ips: 38.8879 samples/sec | ETA 00:34:27 2022-08-24 08:52:38 [INFO] [TRAIN] epoch: 119, iter: 150000/160000, loss: 0.3937, lr: 0.000076, batch_cost: 0.2021, reader_cost: 0.00055, ips: 39.5874 samples/sec | ETA 
00:33:40 2022-08-24 08:52:38 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 135s - batch_cost: 0.1350 - reader cost: 0.0015 2022-08-24 08:54:53 [INFO] [EVAL] #Images: 2000 mIoU: 0.3778 Acc: 0.7783 Kappa: 0.7613 Dice: 0.5144 2022-08-24 08:54:53 [INFO] [EVAL] Class IoU: [0.6966 0.7959 0.9331 0.7448 0.6973 0.7788 0.7769 0.8149 0.533 0.6414 0.5022 0.5725 0.7257 0.3261 0.3177 0.4437 0.4919 0.4472 0.6185 0.4463 0.7788 0.434 0.6352 0.516 0.3221 0.4579 0.4707 0.4826 0.4291 0.2612 0.2777 0.5704 0.3365 0.3584 0.374 0.4207 0.4892 0.5818 0.3058 0.3888 0.1235 0.1366 0.352 0.2627 0.245 0.2509 0.384 0.5454 0.6083 0.5497 0.6251 0.3935 0.164 0.1296 0.6692 0.3332 0.8925 0.4216 0.3902 0.2603 0.1061 0.2433 0.3138 0.1653 0.5012 0.7328 0.2593 0.3819 0.1392 0.3803 0.5019 0.5478 0.3681 0.2726 0.4963 0.4069 0.5412 0.2954 0.2053 0.2794 0.7316 0.4374 0.4383 0.0546 0.101 0.5252 0.1176 0.1359 0.4151 0.5369 0.427 0.0589 0.2057 0.1068 0.0487 0.0686 0.1675 0.1814 0.294 0.3367 0.2052 0.1105 0.2873 0.7725 0.1851 0.6899 0.1404 0.5595 0.0798 0.1948 0.2034 0.5366 0.2026 0.6577 0.7272 0.0709 0.424 0.6326 0.1718 0.3881 0.4203 0.0785 0.3306 0.2033 0.307 0.2435 0.5625 0.4792 0.5718 0.3768 0.5766 0.054 0.2913 0.4284 0.3022 0.21 0.1722 0.0486 0.2109 0.4186 0.1537 0.0234 0.2818 0.1839 0.218 0.0016 0.44 0.0377 0.14 0.2087] 2022-08-24 08:54:53 [INFO] [EVAL] Class Precision: [0.7883 0.8647 0.9629 0.8318 0.7859 0.8761 0.8738 0.8754 0.6751 0.7536 0.6832 0.7218 0.8043 0.5334 0.5505 0.6082 0.7024 0.7039 0.7698 0.6433 0.8505 0.6733 0.7808 0.6097 0.4623 0.6881 0.5594 0.7479 0.7473 0.4448 0.5031 0.7099 0.5677 0.4819 0.4689 0.5703 0.6699 0.8416 0.4584 0.6142 0.3289 0.3151 0.5844 0.485 0.3346 0.4744 0.7929 0.7503 0.6972 0.705 0.7784 0.5017 0.3472 0.5701 0.7165 0.635 0.9377 0.6785 0.6875 0.4472 0.1562 0.4977 0.4383 0.6046 0.6245 0.8379 0.4185 0.5145 0.3047 0.6522 0.7079 0.6719 0.6099 0.3502 0.7267 0.5862 0.7532 0.656 0.7277 0.5576 0.8769 0.7067 0.8039 0.1583 0.1984 0.7188 0.4781 0.4277 0.8024 0.6838 0.5994 0.0738 0.4197 0.3381 0.1776 0.1819 0.6093 0.4112 0.4417 0.6863 0.6653 0.178 0.5309 0.8953 0.7646 0.774 0.3872 0.7669 0.1851 0.338 0.4266 0.7165 0.5889 0.8187 0.7351 0.3001 0.7088 0.7367 0.2934 0.5434 0.7169 0.2654 0.6854 0.6247 0.7376 0.6229 0.8672 0.6077 0.8409 0.5505 0.712 0.325 0.5909 0.8272 0.6106 0.4098 0.3611 0.107 0.4813 0.7071 0.426 0.0374 0.5934 0.6315 0.5107 0.0027 0.8512 0.2153 0.3594 0.7055] 2022-08-24 08:54:53 [INFO] [EVAL] Class Recall: [0.8569 0.9091 0.9679 0.8768 0.8608 0.8752 0.8751 0.9217 0.7169 0.8116 0.6547 0.7345 0.8813 0.4564 0.429 0.6213 0.6214 0.5508 0.7589 0.5931 0.9023 0.5498 0.773 0.7706 0.5151 0.5778 0.748 0.5764 0.5019 0.3875 0.3827 0.7438 0.4525 0.5832 0.6487 0.6159 0.6446 0.6533 0.4787 0.5144 0.165 0.1944 0.4696 0.3644 0.4778 0.3475 0.4269 0.6663 0.8267 0.7139 0.7604 0.6458 0.237 0.1436 0.9102 0.4122 0.9488 0.5269 0.4743 0.3838 0.2487 0.3225 0.5249 0.1853 0.7174 0.8538 0.4054 0.597 0.204 0.4771 0.633 0.7478 0.4815 0.5517 0.6102 0.5708 0.6579 0.3496 0.2223 0.359 0.8153 0.5344 0.4908 0.077 0.1707 0.661 0.1349 0.166 0.4624 0.7142 0.5975 0.2256 0.2874 0.1351 0.0629 0.0991 0.1876 0.245 0.4678 0.3979 0.2288 0.2256 0.3851 0.8492 0.1963 0.864 0.1806 0.6742 0.1231 0.3149 0.28 0.6812 0.236 0.7699 0.9854 0.085 0.5135 0.8174 0.2931 0.576 0.504 0.1003 0.3898 0.2316 0.3446 0.2857 0.6155 0.6938 0.6412 0.5442 0.7519 0.0608 0.365 0.4705 0.3743 0.3011 0.2477 0.0817 0.2729 0.5065 0.1938 0.0591 0.3492 0.206 0.2756 0.0037 0.4766 0.0438 0.1865 0.2286] 2022-08-24 08:54:53 [INFO] [EVAL] The model with 
the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 08:55:03 [INFO] [TRAIN] epoch: 119, iter: 150050/160000, loss: 0.3832, lr: 0.000075, batch_cost: 0.1950, reader_cost: 0.00379, ips: 41.0260 samples/sec | ETA 00:32:20 2022-08-24 08:55:12 [INFO] [TRAIN] epoch: 119, iter: 150100/160000, loss: 0.3763, lr: 0.000075, batch_cost: 0.1798, reader_cost: 0.00124, ips: 44.5020 samples/sec | ETA 00:29:39 2022-08-24 08:55:21 [INFO] [TRAIN] epoch: 119, iter: 150150/160000, loss: 0.3633, lr: 0.000075, batch_cost: 0.1702, reader_cost: 0.00076, ips: 46.9956 samples/sec | ETA 00:27:56 2022-08-24 08:55:32 [INFO] [TRAIN] epoch: 119, iter: 150200/160000, loss: 0.3806, lr: 0.000074, batch_cost: 0.2177, reader_cost: 0.00043, ips: 36.7432 samples/sec | ETA 00:35:33 2022-08-24 08:55:41 [INFO] [TRAIN] epoch: 119, iter: 150250/160000, loss: 0.3957, lr: 0.000074, batch_cost: 0.1842, reader_cost: 0.00214, ips: 43.4417 samples/sec | ETA 00:29:55 2022-08-24 08:55:54 [INFO] [TRAIN] epoch: 120, iter: 150300/160000, loss: 0.3802, lr: 0.000073, batch_cost: 0.2677, reader_cost: 0.06475, ips: 29.8821 samples/sec | ETA 00:43:16 2022-08-24 08:56:04 [INFO] [TRAIN] epoch: 120, iter: 150350/160000, loss: 0.3984, lr: 0.000073, batch_cost: 0.1930, reader_cost: 0.00070, ips: 41.4476 samples/sec | ETA 00:31:02 2022-08-24 08:56:13 [INFO] [TRAIN] epoch: 120, iter: 150400/160000, loss: 0.4001, lr: 0.000073, batch_cost: 0.1861, reader_cost: 0.00076, ips: 42.9895 samples/sec | ETA 00:29:46 2022-08-24 08:56:23 [INFO] [TRAIN] epoch: 120, iter: 150450/160000, loss: 0.3699, lr: 0.000072, batch_cost: 0.1992, reader_cost: 0.00054, ips: 40.1549 samples/sec | ETA 00:31:42 2022-08-24 08:56:33 [INFO] [TRAIN] epoch: 120, iter: 150500/160000, loss: 0.4105, lr: 0.000072, batch_cost: 0.1924, reader_cost: 0.00059, ips: 41.5739 samples/sec | ETA 00:30:28 2022-08-24 08:56:42 [INFO] [TRAIN] epoch: 120, iter: 150550/160000, loss: 0.3731, lr: 0.000072, batch_cost: 0.1949, reader_cost: 0.00092, ips: 41.0458 samples/sec | ETA 00:30:41 2022-08-24 08:56:52 [INFO] [TRAIN] epoch: 120, iter: 150600/160000, loss: 0.3669, lr: 0.000071, batch_cost: 0.1949, reader_cost: 0.00091, ips: 41.0506 samples/sec | ETA 00:30:31 2022-08-24 08:57:02 [INFO] [TRAIN] epoch: 120, iter: 150650/160000, loss: 0.3726, lr: 0.000071, batch_cost: 0.2018, reader_cost: 0.00041, ips: 39.6389 samples/sec | ETA 00:31:27 2022-08-24 08:57:11 [INFO] [TRAIN] epoch: 120, iter: 150700/160000, loss: 0.3877, lr: 0.000070, batch_cost: 0.1839, reader_cost: 0.00050, ips: 43.4977 samples/sec | ETA 00:28:30 2022-08-24 08:57:20 [INFO] [TRAIN] epoch: 120, iter: 150750/160000, loss: 0.3740, lr: 0.000070, batch_cost: 0.1661, reader_cost: 0.00033, ips: 48.1655 samples/sec | ETA 00:25:36 2022-08-24 08:57:30 [INFO] [TRAIN] epoch: 120, iter: 150800/160000, loss: 0.3861, lr: 0.000070, batch_cost: 0.2010, reader_cost: 0.00120, ips: 39.8018 samples/sec | ETA 00:30:49 2022-08-24 08:57:41 [INFO] [TRAIN] epoch: 120, iter: 150850/160000, loss: 0.3777, lr: 0.000069, batch_cost: 0.2156, reader_cost: 0.00070, ips: 37.1118 samples/sec | ETA 00:32:52 2022-08-24 08:57:51 [INFO] [TRAIN] epoch: 120, iter: 150900/160000, loss: 0.4006, lr: 0.000069, batch_cost: 0.1998, reader_cost: 0.00049, ips: 40.0497 samples/sec | ETA 00:30:17 2022-08-24 08:58:00 [INFO] [TRAIN] epoch: 120, iter: 150950/160000, loss: 0.3798, lr: 0.000069, batch_cost: 0.1811, reader_cost: 0.00043, ips: 44.1725 samples/sec | ETA 00:27:19 2022-08-24 08:58:10 [INFO] [TRAIN] epoch: 120, iter: 151000/160000, loss: 0.3949, lr: 0.000068, batch_cost: 
0.2067, reader_cost: 0.00064, ips: 38.7111 samples/sec | ETA 00:30:59 2022-08-24 08:58:10 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 144s - batch_cost: 0.1438 - reader cost: 0.0014 2022-08-24 09:00:34 [INFO] [EVAL] #Images: 2000 mIoU: 0.3781 Acc: 0.7786 Kappa: 0.7617 Dice: 0.5155 2022-08-24 09:00:34 [INFO] [EVAL] Class IoU: [0.6989 0.7953 0.9336 0.7447 0.6944 0.7784 0.7801 0.8131 0.5359 0.6437 0.5002 0.5838 0.717 0.3271 0.3108 0.4452 0.5061 0.45 0.6182 0.4469 0.776 0.4355 0.6336 0.5227 0.3316 0.4503 0.4538 0.4567 0.4248 0.257 0.2855 0.5552 0.3327 0.3494 0.345 0.4163 0.4939 0.5973 0.3043 0.4009 0.1221 0.1326 0.356 0.2578 0.2428 0.2748 0.3731 0.5481 0.6148 0.5427 0.6194 0.3968 0.1715 0.1546 0.6674 0.3294 0.8908 0.4208 0.4508 0.2714 0.0985 0.2437 0.2944 0.1512 0.4793 0.7299 0.2849 0.38 0.1372 0.3669 0.5165 0.524 0.398 0.2747 0.491 0.4168 0.5618 0.2928 0.3074 0.2566 0.7176 0.4293 0.4403 0.0781 0.0887 0.5207 0.1252 0.1396 0.3764 0.5581 0.412 0.0811 0.2181 0.1095 0.0387 0.075 0.1812 0.1762 0.299 0.3261 0.2065 0.1202 0.2839 0.7803 0.1693 0.7201 0.1448 0.5648 0.0824 0.2393 0.1982 0.5033 0.2087 0.6464 0.6665 0.0635 0.3922 0.6163 0.1873 0.4042 0.4141 0.0805 0.3006 0.2183 0.3011 0.2591 0.5436 0.4655 0.5708 0.3723 0.5352 0.0463 0.2921 0.436 0.301 0.2191 0.1779 0.0454 0.2188 0.4287 0.1691 0.0142 0.2845 0.2247 0.2093 0.0026 0.4222 0.0378 0.1322 0.208 ] 2022-08-24 09:00:34 [INFO] [EVAL] Class Precision: [0.7935 0.8663 0.9655 0.8331 0.7716 0.8791 0.8803 0.8689 0.6828 0.7524 0.6941 0.7175 0.7831 0.5626 0.5803 0.6067 0.6882 0.6941 0.7628 0.6157 0.8437 0.6668 0.7704 0.638 0.4678 0.6508 0.5677 0.8248 0.7234 0.4425 0.4944 0.7113 0.5432 0.4632 0.4345 0.5608 0.6736 0.8091 0.4559 0.593 0.3221 0.3362 0.6018 0.4908 0.3286 0.5112 0.697 0.7684 0.7404 0.6842 0.7846 0.5044 0.346 0.5382 0.7092 0.669 0.9344 0.6972 0.7105 0.473 0.141 0.5219 0.3919 0.6582 0.5723 0.8242 0.4418 0.5216 0.285 0.6345 0.7196 0.6333 0.5854 0.3821 0.705 0.6214 0.7518 0.6913 0.7717 0.5176 0.854 0.6684 0.803 0.1883 0.1917 0.7403 0.4625 0.4182 0.8106 0.7338 0.5617 0.1053 0.4259 0.3313 0.1195 0.2046 0.6093 0.3786 0.4762 0.6759 0.6425 0.2087 0.5612 0.8628 0.8087 0.8384 0.359 0.7613 0.1689 0.373 0.4401 0.6551 0.5797 0.8262 0.6729 0.2848 0.7106 0.7081 0.3512 0.5806 0.7222 0.2585 0.6166 0.5824 0.7599 0.6296 0.8109 0.5849 0.8085 0.5856 0.6224 0.3642 0.5867 0.819 0.6454 0.413 0.3487 0.1289 0.476 0.6906 0.4369 0.0249 0.5906 0.6221 0.5055 0.0043 0.8376 0.2693 0.3197 0.7009] 2022-08-24 09:00:34 [INFO] [EVAL] Class Recall: [0.8544 0.9066 0.9658 0.8752 0.8741 0.8717 0.8727 0.9267 0.7136 0.8167 0.6416 0.7581 0.8947 0.4387 0.401 0.6257 0.6567 0.5614 0.7654 0.6198 0.9062 0.5567 0.7811 0.743 0.5324 0.5937 0.6934 0.5057 0.5072 0.3801 0.4032 0.7167 0.4621 0.5873 0.6264 0.6178 0.6493 0.6953 0.4778 0.5531 0.1643 0.1796 0.4658 0.3519 0.4817 0.3727 0.4453 0.6565 0.7837 0.7241 0.7463 0.6504 0.2538 0.1783 0.919 0.3935 0.9502 0.515 0.5522 0.3891 0.2465 0.3138 0.542 0.164 0.7469 0.8644 0.4452 0.5834 0.2093 0.4652 0.6466 0.7523 0.5542 0.4942 0.618 0.5587 0.6897 0.3368 0.3382 0.3373 0.8179 0.5454 0.4936 0.1177 0.1418 0.6371 0.1465 0.1732 0.4127 0.6998 0.6071 0.2606 0.309 0.1406 0.0542 0.1059 0.205 0.2478 0.4455 0.3865 0.2334 0.2207 0.3649 0.8909 0.1764 0.8362 0.1954 0.6863 0.1385 0.4004 0.265 0.6847 0.246 0.7481 0.9859 0.0756 0.4667 0.8261 0.2864 0.5708 0.4926 0.1047 0.3697 0.2589 0.3327 0.3057 0.6225 0.6952 0.6601 0.5054 0.7926 0.0503 0.3677 0.4825 0.3607 0.3182 0.2664 0.0656 0.2882 0.5306 0.2163 0.0323 0.3544 0.2602 0.2631 0.0065 0.4599 
0.0422 0.184 0.2283] 2022-08-24 09:00:34 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 09:00:42 [INFO] [TRAIN] epoch: 120, iter: 151050/160000, loss: 0.3967, lr: 0.000068, batch_cost: 0.1650, reader_cost: 0.00397, ips: 48.4991 samples/sec | ETA 00:24:36 2022-08-24 09:00:50 [INFO] [TRAIN] epoch: 120, iter: 151100/160000, loss: 0.4113, lr: 0.000067, batch_cost: 0.1584, reader_cost: 0.00081, ips: 50.5036 samples/sec | ETA 00:23:29 2022-08-24 09:00:59 [INFO] [TRAIN] epoch: 120, iter: 151150/160000, loss: 0.3673, lr: 0.000067, batch_cost: 0.1633, reader_cost: 0.00131, ips: 48.9881 samples/sec | ETA 00:24:05 2022-08-24 09:01:07 [INFO] [TRAIN] epoch: 120, iter: 151200/160000, loss: 0.3838, lr: 0.000067, batch_cost: 0.1677, reader_cost: 0.00072, ips: 47.6919 samples/sec | ETA 00:24:36 2022-08-24 09:01:16 [INFO] [TRAIN] epoch: 120, iter: 151250/160000, loss: 0.3806, lr: 0.000066, batch_cost: 0.1879, reader_cost: 0.00049, ips: 42.5816 samples/sec | ETA 00:27:23 2022-08-24 09:01:26 [INFO] [TRAIN] epoch: 120, iter: 151300/160000, loss: 0.4103, lr: 0.000066, batch_cost: 0.2007, reader_cost: 0.00105, ips: 39.8691 samples/sec | ETA 00:29:05 2022-08-24 09:01:36 [INFO] [TRAIN] epoch: 120, iter: 151350/160000, loss: 0.3879, lr: 0.000065, batch_cost: 0.1869, reader_cost: 0.00087, ips: 42.8001 samples/sec | ETA 00:26:56 2022-08-24 09:01:45 [INFO] [TRAIN] epoch: 120, iter: 151400/160000, loss: 0.3882, lr: 0.000065, batch_cost: 0.1777, reader_cost: 0.00039, ips: 45.0087 samples/sec | ETA 00:25:28 2022-08-24 09:01:55 [INFO] [TRAIN] epoch: 120, iter: 151450/160000, loss: 0.3910, lr: 0.000065, batch_cost: 0.2023, reader_cost: 0.00081, ips: 39.5416 samples/sec | ETA 00:28:49 2022-08-24 09:02:05 [INFO] [TRAIN] epoch: 120, iter: 151500/160000, loss: 0.3993, lr: 0.000064, batch_cost: 0.2102, reader_cost: 0.00074, ips: 38.0563 samples/sec | ETA 00:29:46 2022-08-24 09:02:15 [INFO] [TRAIN] epoch: 120, iter: 151550/160000, loss: 0.3937, lr: 0.000064, batch_cost: 0.1870, reader_cost: 0.00073, ips: 42.7830 samples/sec | ETA 00:26:20 2022-08-24 09:02:28 [INFO] [TRAIN] epoch: 121, iter: 151600/160000, loss: 0.4032, lr: 0.000064, batch_cost: 0.2690, reader_cost: 0.08182, ips: 29.7440 samples/sec | ETA 00:37:39 2022-08-24 09:02:38 [INFO] [TRAIN] epoch: 121, iter: 151650/160000, loss: 0.3810, lr: 0.000063, batch_cost: 0.1950, reader_cost: 0.00055, ips: 41.0257 samples/sec | ETA 00:27:08 2022-08-24 09:02:48 [INFO] [TRAIN] epoch: 121, iter: 151700/160000, loss: 0.3712, lr: 0.000063, batch_cost: 0.2126, reader_cost: 0.00058, ips: 37.6282 samples/sec | ETA 00:29:24 2022-08-24 09:02:58 [INFO] [TRAIN] epoch: 121, iter: 151750/160000, loss: 0.3856, lr: 0.000062, batch_cost: 0.1836, reader_cost: 0.00072, ips: 43.5621 samples/sec | ETA 00:25:15 2022-08-24 09:03:07 [INFO] [TRAIN] epoch: 121, iter: 151800/160000, loss: 0.3771, lr: 0.000062, batch_cost: 0.1814, reader_cost: 0.00049, ips: 44.1103 samples/sec | ETA 00:24:47 2022-08-24 09:03:16 [INFO] [TRAIN] epoch: 121, iter: 151850/160000, loss: 0.3967, lr: 0.000062, batch_cost: 0.1848, reader_cost: 0.00053, ips: 43.2807 samples/sec | ETA 00:25:06 2022-08-24 09:03:25 [INFO] [TRAIN] epoch: 121, iter: 151900/160000, loss: 0.3701, lr: 0.000061, batch_cost: 0.1836, reader_cost: 0.00070, ips: 43.5803 samples/sec | ETA 00:24:46 2022-08-24 09:03:34 [INFO] [TRAIN] epoch: 121, iter: 151950/160000, loss: 0.3978, lr: 0.000061, batch_cost: 0.1731, reader_cost: 0.00035, ips: 46.2105 samples/sec | ETA 00:23:13 2022-08-24 09:03:42 [INFO] [TRAIN] epoch: 
121, iter: 152000/160000, loss: 0.3829, lr: 0.000061, batch_cost: 0.1748, reader_cost: 0.00072, ips: 45.7797 samples/sec | ETA 00:23:18 2022-08-24 09:03:43 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 153s - batch_cost: 0.1530 - reader cost: 5.6990e-04 2022-08-24 09:06:16 [INFO] [EVAL] #Images: 2000 mIoU: 0.3783 Acc: 0.7775 Kappa: 0.7605 Dice: 0.5153 2022-08-24 09:06:16 [INFO] [EVAL] Class IoU: [0.6966 0.7924 0.9329 0.7426 0.6941 0.7765 0.7806 0.8149 0.5352 0.6357 0.5085 0.5884 0.7181 0.3204 0.3149 0.4479 0.4952 0.4506 0.6201 0.4482 0.7746 0.404 0.6364 0.5277 0.3276 0.4569 0.4551 0.4775 0.4319 0.2509 0.2948 0.5749 0.3267 0.3564 0.3773 0.4305 0.4898 0.5853 0.3063 0.3896 0.125 0.1184 0.3526 0.2586 0.2451 0.2549 0.3781 0.5399 0.6101 0.554 0.6114 0.3923 0.1566 0.1524 0.6676 0.3341 0.8944 0.4419 0.3976 0.2693 0.0935 0.2428 0.3081 0.1761 0.4883 0.7178 0.2574 0.3855 0.1458 0.3706 0.5163 0.5449 0.3965 0.2666 0.494 0.4069 0.5658 0.2996 0.2913 0.2692 0.722 0.4494 0.4371 0.0391 0.1017 0.5273 0.1136 0.1407 0.3409 0.5351 0.4163 0.075 0.2093 0.0942 0.033 0.0787 0.1559 0.1806 0.2971 0.3156 0.2383 0.1162 0.2847 0.7674 0.1816 0.7177 0.1689 0.5657 0.0845 0.2188 0.2123 0.5267 0.2151 0.618 0.7068 0.0669 0.381 0.6224 0.1753 0.3702 0.4309 0.0759 0.31 0.2058 0.3023 0.2462 0.5439 0.4525 0.6456 0.3809 0.575 0.0516 0.2998 0.4238 0.3129 0.2156 0.1742 0.0482 0.2276 0.4244 0.1635 0.0157 0.3133 0.1847 0.2204 0.0023 0.437 0.04 0.141 0.2015] 2022-08-24 09:06:16 [INFO] [EVAL] Class Precision: [0.787 0.8674 0.9642 0.8284 0.7763 0.8761 0.8989 0.8739 0.6854 0.7546 0.7103 0.6979 0.7866 0.5297 0.5388 0.6162 0.7211 0.7009 0.7803 0.6381 0.8429 0.6447 0.7734 0.6417 0.4623 0.6275 0.5683 0.7853 0.7608 0.3919 0.4967 0.734 0.5777 0.4758 0.4803 0.5937 0.6718 0.8119 0.4516 0.6202 0.3353 0.3197 0.5969 0.4848 0.3422 0.5579 0.7222 0.7424 0.7014 0.7145 0.7698 0.5053 0.3176 0.5811 0.7057 0.6587 0.933 0.6627 0.6817 0.4634 0.1243 0.4985 0.4407 0.6157 0.5927 0.8109 0.4065 0.5174 0.3191 0.6477 0.7036 0.6743 0.5682 0.3785 0.7271 0.5908 0.7573 0.5997 0.7696 0.5632 0.8593 0.7124 0.8057 0.1179 0.1832 0.7115 0.4658 0.3888 0.7779 0.6746 0.5599 0.1058 0.4881 0.3227 0.1131 0.19 0.6061 0.3877 0.4254 0.762 0.6466 0.1997 0.571 0.8952 0.8327 0.815 0.4219 0.7749 0.1701 0.3559 0.438 0.7125 0.5606 0.7939 0.7154 0.2945 0.724 0.7121 0.2635 0.5253 0.7199 0.2691 0.637 0.6126 0.7474 0.6252 0.8258 0.58 0.8486 0.5707 0.6895 0.3281 0.575 0.8422 0.6181 0.3926 0.3661 0.1144 0.4487 0.6977 0.4085 0.0291 0.5898 0.6613 0.5 0.0036 0.8227 0.2415 0.3268 0.7273] 2022-08-24 09:06:16 [INFO] [EVAL] Class Recall: [0.8584 0.9016 0.9663 0.8775 0.8676 0.8723 0.8558 0.9234 0.7095 0.8013 0.6416 0.7895 0.8918 0.4478 0.4312 0.6213 0.6126 0.5579 0.7513 0.601 0.9053 0.5196 0.7822 0.7481 0.5292 0.6269 0.6955 0.5493 0.4997 0.4108 0.4204 0.7263 0.4292 0.5867 0.6377 0.6103 0.644 0.6772 0.4877 0.5117 0.1661 0.1583 0.4628 0.3567 0.4637 0.3194 0.4424 0.6643 0.8241 0.7115 0.7481 0.6369 0.236 0.1713 0.9251 0.4041 0.9558 0.5701 0.4882 0.3914 0.2739 0.3212 0.5059 0.1978 0.7349 0.862 0.4124 0.6019 0.2117 0.4642 0.6598 0.7396 0.5676 0.4742 0.6065 0.5665 0.691 0.3745 0.3192 0.3402 0.8188 0.549 0.4887 0.0553 0.186 0.6707 0.1306 0.1807 0.3776 0.7214 0.6189 0.2048 0.2681 0.1174 0.0445 0.1185 0.1735 0.2526 0.4963 0.3501 0.274 0.2176 0.3622 0.8431 0.1885 0.8573 0.2198 0.6769 0.1438 0.3623 0.2919 0.6689 0.2588 0.736 0.9833 0.0797 0.4458 0.8317 0.3438 0.5563 0.5177 0.0956 0.3765 0.2366 0.3368 0.2888 0.6144 0.6731 0.7296 0.5338 0.7759 0.0577 0.3851 0.4603 0.388 0.3235 0.2494 
0.0769 0.316 0.52 0.2143 0.0331 0.4005 0.204 0.2828 0.0064 0.4824 0.0458 0.1986 0.218 ] 2022-08-24 09:06:16 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 09:06:25 [INFO] [TRAIN] epoch: 121, iter: 152050/160000, loss: 0.3736, lr: 0.000060, batch_cost: 0.1782, reader_cost: 0.00504, ips: 44.8889 samples/sec | ETA 00:23:36 2022-08-24 09:06:33 [INFO] [TRAIN] epoch: 121, iter: 152100/160000, loss: 0.3996, lr: 0.000060, batch_cost: 0.1687, reader_cost: 0.00111, ips: 47.4233 samples/sec | ETA 00:22:12 2022-08-24 09:06:42 [INFO] [TRAIN] epoch: 121, iter: 152150/160000, loss: 0.4276, lr: 0.000059, batch_cost: 0.1715, reader_cost: 0.00087, ips: 46.6372 samples/sec | ETA 00:22:26 2022-08-24 09:06:51 [INFO] [TRAIN] epoch: 121, iter: 152200/160000, loss: 0.3855, lr: 0.000059, batch_cost: 0.1863, reader_cost: 0.00044, ips: 42.9449 samples/sec | ETA 00:24:13 2022-08-24 09:07:01 [INFO] [TRAIN] epoch: 121, iter: 152250/160000, loss: 0.4050, lr: 0.000059, batch_cost: 0.1945, reader_cost: 0.00142, ips: 41.1279 samples/sec | ETA 00:25:07 2022-08-24 09:07:11 [INFO] [TRAIN] epoch: 121, iter: 152300/160000, loss: 0.3877, lr: 0.000058, batch_cost: 0.2055, reader_cost: 0.00072, ips: 38.9206 samples/sec | ETA 00:26:22 2022-08-24 09:07:21 [INFO] [TRAIN] epoch: 121, iter: 152350/160000, loss: 0.3757, lr: 0.000058, batch_cost: 0.1893, reader_cost: 0.00075, ips: 42.2622 samples/sec | ETA 00:24:08 2022-08-24 09:07:30 [INFO] [TRAIN] epoch: 121, iter: 152400/160000, loss: 0.3819, lr: 0.000058, batch_cost: 0.1754, reader_cost: 0.00066, ips: 45.6029 samples/sec | ETA 00:22:13 2022-08-24 09:07:38 [INFO] [TRAIN] epoch: 121, iter: 152450/160000, loss: 0.3791, lr: 0.000057, batch_cost: 0.1640, reader_cost: 0.00087, ips: 48.7727 samples/sec | ETA 00:20:38 2022-08-24 09:07:47 [INFO] [TRAIN] epoch: 121, iter: 152500/160000, loss: 0.3656, lr: 0.000057, batch_cost: 0.1819, reader_cost: 0.00066, ips: 43.9804 samples/sec | ETA 00:22:44 2022-08-24 09:07:55 [INFO] [TRAIN] epoch: 121, iter: 152550/160000, loss: 0.3870, lr: 0.000056, batch_cost: 0.1683, reader_cost: 0.00471, ips: 47.5270 samples/sec | ETA 00:20:54 2022-08-24 09:08:04 [INFO] [TRAIN] epoch: 121, iter: 152600/160000, loss: 0.3909, lr: 0.000056, batch_cost: 0.1751, reader_cost: 0.00049, ips: 45.6939 samples/sec | ETA 00:21:35 2022-08-24 09:08:13 [INFO] [TRAIN] epoch: 121, iter: 152650/160000, loss: 0.3499, lr: 0.000056, batch_cost: 0.1718, reader_cost: 0.00071, ips: 46.5631 samples/sec | ETA 00:21:02 2022-08-24 09:08:23 [INFO] [TRAIN] epoch: 121, iter: 152700/160000, loss: 0.3810, lr: 0.000055, batch_cost: 0.1991, reader_cost: 0.00046, ips: 40.1853 samples/sec | ETA 00:24:13 2022-08-24 09:08:32 [INFO] [TRAIN] epoch: 121, iter: 152750/160000, loss: 0.3788, lr: 0.000055, batch_cost: 0.1815, reader_cost: 0.00043, ips: 44.0764 samples/sec | ETA 00:21:55 2022-08-24 09:08:40 [INFO] [TRAIN] epoch: 121, iter: 152800/160000, loss: 0.3495, lr: 0.000055, batch_cost: 0.1764, reader_cost: 0.00162, ips: 45.3630 samples/sec | ETA 00:21:09 2022-08-24 09:08:53 [INFO] [TRAIN] epoch: 122, iter: 152850/160000, loss: 0.4112, lr: 0.000054, batch_cost: 0.2600, reader_cost: 0.06659, ips: 30.7720 samples/sec | ETA 00:30:58 2022-08-24 09:09:03 [INFO] [TRAIN] epoch: 122, iter: 152900/160000, loss: 0.3792, lr: 0.000054, batch_cost: 0.1836, reader_cost: 0.00055, ips: 43.5842 samples/sec | ETA 00:21:43 2022-08-24 09:09:13 [INFO] [TRAIN] epoch: 122, iter: 152950/160000, loss: 0.3892, lr: 0.000053, batch_cost: 0.2128, reader_cost: 0.00069, ips: 37.5982 
samples/sec | ETA 00:25:00 2022-08-24 09:09:24 [INFO] [TRAIN] epoch: 122, iter: 153000/160000, loss: 0.3789, lr: 0.000053, batch_cost: 0.2229, reader_cost: 0.00062, ips: 35.8910 samples/sec | ETA 00:26:00 2022-08-24 09:09:24 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 151s - batch_cost: 0.1508 - reader cost: 6.9404e-04 2022-08-24 09:11:55 [INFO] [EVAL] #Images: 2000 mIoU: 0.3802 Acc: 0.7787 Kappa: 0.7617 Dice: 0.5177 2022-08-24 09:11:55 [INFO] [EVAL] Class IoU: [0.6984 0.7933 0.9332 0.7438 0.69 0.776 0.7817 0.8127 0.5321 0.6411 0.5069 0.5823 0.7247 0.3261 0.3103 0.4462 0.5081 0.4348 0.6186 0.4414 0.7738 0.428 0.6383 0.5298 0.3344 0.4421 0.4525 0.4766 0.4453 0.2457 0.3051 0.5694 0.3278 0.3569 0.3633 0.4194 0.4926 0.5868 0.3055 0.3742 0.1161 0.1238 0.3571 0.2648 0.2506 0.2507 0.3595 0.5469 0.6165 0.5445 0.6244 0.3863 0.1648 0.1763 0.6671 0.336 0.8884 0.4421 0.3962 0.2782 0.0943 0.2404 0.3156 0.1799 0.4893 0.7273 0.251 0.3876 0.1398 0.3866 0.5176 0.557 0.4065 0.2709 0.4941 0.4092 0.5271 0.307 0.2914 0.2905 0.7255 0.4492 0.4382 0.0721 0.1018 0.5297 0.1286 0.132 0.366 0.5318 0.416 0.101 0.2161 0.1138 0.0396 0.0806 0.1667 0.1838 0.2875 0.3196 0.2305 0.1049 0.287 0.7782 0.1788 0.6457 0.144 0.5659 0.0857 0.219 0.2094 0.5238 0.2022 0.6427 0.7426 0.0622 0.3964 0.6252 0.1722 0.3679 0.4368 0.0761 0.3271 0.2062 0.2988 0.2591 0.5479 0.468 0.6372 0.3801 0.5879 0.0462 0.3274 0.4349 0.3171 0.2196 0.1776 0.0414 0.2256 0.4279 0.1568 0.0118 0.2862 0.2206 0.2263 0.0097 0.4512 0.0392 0.15 0.2026] 2022-08-24 09:11:55 [INFO] [EVAL] Class Precision: [0.791 0.8668 0.9629 0.8252 0.7714 0.8716 0.8751 0.8694 0.6643 0.7533 0.7067 0.7291 0.7974 0.5622 0.5606 0.6075 0.6931 0.7128 0.7805 0.6591 0.8431 0.6566 0.7738 0.6419 0.4774 0.618 0.5714 0.7753 0.7501 0.3875 0.4928 0.7397 0.566 0.4778 0.4682 0.5771 0.6773 0.8139 0.4627 0.6323 0.3162 0.3198 0.5891 0.4801 0.3432 0.5104 0.7349 0.7484 0.7304 0.688 0.8104 0.4928 0.3526 0.6068 0.7051 0.6421 0.928 0.6571 0.7037 0.4957 0.1323 0.4518 0.4468 0.6357 0.5943 0.8231 0.4127 0.5336 0.3286 0.6497 0.7047 0.6741 0.5575 0.3724 0.7125 0.5803 0.7541 0.6661 0.7564 0.5701 0.8603 0.6966 0.8026 0.1913 0.1935 0.7395 0.5352 0.4145 0.8073 0.6638 0.5609 0.1393 0.4683 0.3292 0.1336 0.214 0.5853 0.3926 0.4214 0.7985 0.7321 0.1692 0.5544 0.8902 0.8234 0.7107 0.4305 0.7655 0.1703 0.365 0.432 0.6885 0.6718 0.7873 0.7518 0.2907 0.7134 0.7258 0.2983 0.5684 0.7064 0.2908 0.6643 0.6241 0.7589 0.6574 0.8253 0.6112 0.8677 0.6036 0.6919 0.3536 0.5753 0.8137 0.5974 0.4171 0.3548 0.1045 0.4747 0.6969 0.4326 0.0202 0.5949 0.6188 0.5165 0.0146 0.8516 0.2471 0.3798 0.7209] 2022-08-24 09:11:55 [INFO] [EVAL] Class Recall: [0.8565 0.9034 0.968 0.883 0.8673 0.8762 0.8798 0.9258 0.7279 0.8115 0.6419 0.7432 0.8882 0.4372 0.41 0.627 0.6557 0.5271 0.7489 0.572 0.9039 0.5514 0.7847 0.752 0.5276 0.6084 0.6848 0.553 0.5229 0.4015 0.4449 0.7122 0.4378 0.5851 0.6185 0.6055 0.6436 0.6778 0.4734 0.4783 0.155 0.168 0.4756 0.3713 0.4815 0.33 0.4131 0.6701 0.7981 0.723 0.7312 0.6414 0.2363 0.1991 0.9252 0.4134 0.9542 0.5747 0.4755 0.3879 0.2471 0.3393 0.5181 0.2006 0.7347 0.862 0.3906 0.5861 0.1958 0.4885 0.661 0.7623 0.6002 0.4986 0.6172 0.5811 0.6365 0.3628 0.3216 0.3719 0.8223 0.5585 0.4911 0.1036 0.1769 0.6513 0.1448 0.1622 0.401 0.7278 0.6169 0.2689 0.2864 0.1482 0.0533 0.1144 0.1891 0.2569 0.475 0.3476 0.2517 0.2163 0.3731 0.8608 0.1859 0.876 0.1778 0.6845 0.1472 0.3539 0.289 0.6864 0.2243 0.7778 0.9838 0.0733 0.4715 0.8184 0.2894 0.5105 0.5336 0.0934 0.3918 0.2355 0.3302 0.2995 0.6198 
0.6665 0.7057 0.5066 0.7964 0.0505 0.4318 0.4831 0.4033 0.3169 0.2623 0.0643 0.3006 0.5258 0.1974 0.0276 0.3554 0.2553 0.287 0.0282 0.4897 0.0445 0.1986 0.2199] 2022-08-24 09:11:56 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 09:12:04 [INFO] [TRAIN] epoch: 122, iter: 153050/160000, loss: 0.3590, lr: 0.000053, batch_cost: 0.1652, reader_cost: 0.00362, ips: 48.4359 samples/sec | ETA 00:19:07 2022-08-24 09:12:12 [INFO] [TRAIN] epoch: 122, iter: 153100/160000, loss: 0.3796, lr: 0.000052, batch_cost: 0.1551, reader_cost: 0.00145, ips: 51.5738 samples/sec | ETA 00:17:50 2022-08-24 09:12:22 [INFO] [TRAIN] epoch: 122, iter: 153150/160000, loss: 0.4016, lr: 0.000052, batch_cost: 0.2011, reader_cost: 0.00073, ips: 39.7786 samples/sec | ETA 00:22:57 2022-08-24 09:12:31 [INFO] [TRAIN] epoch: 122, iter: 153200/160000, loss: 0.3772, lr: 0.000051, batch_cost: 0.1797, reader_cost: 0.00080, ips: 44.5250 samples/sec | ETA 00:20:21 2022-08-24 09:12:40 [INFO] [TRAIN] epoch: 122, iter: 153250/160000, loss: 0.3834, lr: 0.000051, batch_cost: 0.1880, reader_cost: 0.00042, ips: 42.5616 samples/sec | ETA 00:21:08 2022-08-24 09:12:49 [INFO] [TRAIN] epoch: 122, iter: 153300/160000, loss: 0.3868, lr: 0.000051, batch_cost: 0.1839, reader_cost: 0.00035, ips: 43.4915 samples/sec | ETA 00:20:32 2022-08-24 09:12:59 [INFO] [TRAIN] epoch: 122, iter: 153350/160000, loss: 0.3779, lr: 0.000050, batch_cost: 0.1846, reader_cost: 0.00057, ips: 43.3408 samples/sec | ETA 00:20:27 2022-08-24 09:13:08 [INFO] [TRAIN] epoch: 122, iter: 153400/160000, loss: 0.3751, lr: 0.000050, batch_cost: 0.1797, reader_cost: 0.00079, ips: 44.5098 samples/sec | ETA 00:19:46 2022-08-24 09:13:17 [INFO] [TRAIN] epoch: 122, iter: 153450/160000, loss: 0.3825, lr: 0.000050, batch_cost: 0.1831, reader_cost: 0.00089, ips: 43.6952 samples/sec | ETA 00:19:59 2022-08-24 09:13:25 [INFO] [TRAIN] epoch: 122, iter: 153500/160000, loss: 0.3900, lr: 0.000049, batch_cost: 0.1735, reader_cost: 0.00063, ips: 46.0974 samples/sec | ETA 00:18:48 2022-08-24 09:13:34 [INFO] [TRAIN] epoch: 122, iter: 153550/160000, loss: 0.4153, lr: 0.000049, batch_cost: 0.1691, reader_cost: 0.00035, ips: 47.3220 samples/sec | ETA 00:18:10 2022-08-24 09:13:43 [INFO] [TRAIN] epoch: 122, iter: 153600/160000, loss: 0.3861, lr: 0.000048, batch_cost: 0.1827, reader_cost: 0.00065, ips: 43.7966 samples/sec | ETA 00:19:29 2022-08-24 09:13:54 [INFO] [TRAIN] epoch: 122, iter: 153650/160000, loss: 0.4075, lr: 0.000048, batch_cost: 0.2190, reader_cost: 0.00040, ips: 36.5224 samples/sec | ETA 00:23:10 2022-08-24 09:14:03 [INFO] [TRAIN] epoch: 122, iter: 153700/160000, loss: 0.3823, lr: 0.000048, batch_cost: 0.1828, reader_cost: 0.00056, ips: 43.7637 samples/sec | ETA 00:19:11 2022-08-24 09:14:13 [INFO] [TRAIN] epoch: 122, iter: 153750/160000, loss: 0.3772, lr: 0.000047, batch_cost: 0.2087, reader_cost: 0.00064, ips: 38.3308 samples/sec | ETA 00:21:44 2022-08-24 09:14:24 [INFO] [TRAIN] epoch: 122, iter: 153800/160000, loss: 0.3758, lr: 0.000047, batch_cost: 0.2156, reader_cost: 0.00036, ips: 37.1096 samples/sec | ETA 00:22:16 2022-08-24 09:14:34 [INFO] [TRAIN] epoch: 122, iter: 153850/160000, loss: 0.4067, lr: 0.000047, batch_cost: 0.2039, reader_cost: 0.00059, ips: 39.2340 samples/sec | ETA 00:20:54 2022-08-24 09:14:44 [INFO] [TRAIN] epoch: 122, iter: 153900/160000, loss: 0.4272, lr: 0.000046, batch_cost: 0.1837, reader_cost: 0.00033, ips: 43.5584 samples/sec | ETA 00:18:40 2022-08-24 09:14:53 [INFO] [TRAIN] epoch: 122, iter: 153950/160000, loss: 
0.3920, lr: 0.000046, batch_cost: 0.1787, reader_cost: 0.00353, ips: 44.7675 samples/sec | ETA 00:18:01 2022-08-24 09:15:03 [INFO] [TRAIN] epoch: 122, iter: 154000/160000, loss: 0.3964, lr: 0.000045, batch_cost: 0.2000, reader_cost: 0.00501, ips: 40.0014 samples/sec | ETA 00:19:59 2022-08-24 09:15:03 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 146s - batch_cost: 0.1455 - reader cost: 5.8051e-04 2022-08-24 09:17:28 [INFO] [EVAL] #Images: 2000 mIoU: 0.3794 Acc: 0.7789 Kappa: 0.7619 Dice: 0.5168 2022-08-24 09:17:28 [INFO] [EVAL] Class IoU: [0.6982 0.7928 0.9333 0.7449 0.6864 0.7773 0.7815 0.8162 0.5351 0.6348 0.5043 0.5885 0.722 0.3249 0.3153 0.45 0.5181 0.4422 0.6237 0.4531 0.7755 0.4393 0.6374 0.5256 0.336 0.434 0.4565 0.4779 0.4529 0.2546 0.2911 0.5681 0.3336 0.3658 0.3604 0.4218 0.4932 0.5855 0.2924 0.3721 0.1365 0.1304 0.3556 0.2605 0.2428 0.2525 0.3666 0.5518 0.6161 0.5558 0.6139 0.4009 0.156 0.1964 0.674 0.3365 0.8934 0.4319 0.373 0.2768 0.0933 0.2505 0.3364 0.1399 0.4956 0.7208 0.2726 0.3922 0.1426 0.3657 0.5014 0.5812 0.3891 0.2637 0.503 0.4135 0.5403 0.3013 0.3102 0.2822 0.7173 0.4372 0.4415 0.0659 0.0897 0.5229 0.1258 0.1342 0.3746 0.5465 0.4239 0.0709 0.2062 0.1176 0.0269 0.0712 0.1908 0.1884 0.2978 0.3088 0.2372 0.1187 0.2861 0.7757 0.1887 0.6919 0.1575 0.5688 0.0768 0.2467 0.2199 0.4898 0.2121 0.6375 0.6988 0.0614 0.3594 0.625 0.1549 0.3848 0.4148 0.0789 0.3004 0.2167 0.3015 0.2461 0.5504 0.4436 0.6192 0.3999 0.5866 0.0515 0.3011 0.4078 0.3148 0.2156 0.1752 0.0507 0.2281 0.413 0.1533 0.0143 0.3051 0.1988 0.2101 0.0089 0.4387 0.0397 0.1405 0.2033] 2022-08-24 09:17:28 [INFO] [EVAL] Class Precision: [0.7894 0.8678 0.9648 0.8294 0.7579 0.8759 0.8841 0.8758 0.6719 0.7602 0.6836 0.7186 0.7932 0.5555 0.5356 0.6301 0.7014 0.7119 0.7875 0.6283 0.8423 0.6633 0.7844 0.6273 0.4802 0.659 0.5555 0.786 0.7584 0.4081 0.5024 0.725 0.5664 0.4949 0.4716 0.5901 0.6954 0.8284 0.4235 0.6251 0.3237 0.3037 0.6172 0.4835 0.338 0.4952 0.7204 0.7393 0.7217 0.7078 0.7882 0.5071 0.3499 0.6246 0.7151 0.6434 0.9406 0.6797 0.6846 0.5084 0.1402 0.5644 0.5165 0.7065 0.6039 0.8214 0.4399 0.5304 0.3326 0.6091 0.7315 0.7426 0.5712 0.374 0.7478 0.6285 0.7521 0.6457 0.7618 0.5283 0.8544 0.6742 0.8014 0.1851 0.1774 0.7272 0.4632 0.4234 0.8219 0.7005 0.577 0.1002 0.4337 0.3379 0.0996 0.1925 0.5634 0.4018 0.4321 0.7599 0.6117 0.2086 0.5501 0.883 0.8468 0.7887 0.3766 0.7586 0.1675 0.3812 0.4657 0.635 0.6119 0.7768 0.707 0.2619 0.7103 0.728 0.2292 0.5469 0.7403 0.3014 0.6246 0.5932 0.7529 0.633 0.8141 0.5585 0.8216 0.6355 0.7532 0.3644 0.6283 0.8494 0.5972 0.4125 0.3618 0.1415 0.4642 0.7221 0.3916 0.0246 0.5918 0.6526 0.4862 0.0128 0.8029 0.219 0.3915 0.7506] 2022-08-24 09:17:28 [INFO] [EVAL] Class Recall: [0.858 0.9017 0.9661 0.8797 0.8793 0.8735 0.8708 0.923 0.7244 0.7937 0.6579 0.7648 0.8895 0.439 0.4338 0.6115 0.6648 0.5386 0.7499 0.619 0.9073 0.5654 0.7727 0.7643 0.5279 0.5597 0.7193 0.5494 0.5292 0.4037 0.409 0.7242 0.448 0.5838 0.6044 0.5966 0.6291 0.6663 0.4856 0.4789 0.1911 0.186 0.4563 0.3609 0.4629 0.3401 0.4274 0.6851 0.8081 0.7213 0.7352 0.6567 0.2196 0.2227 0.9214 0.4137 0.9468 0.5422 0.4504 0.378 0.2178 0.3106 0.491 0.1485 0.7344 0.8547 0.4176 0.6009 0.1998 0.4778 0.6145 0.7278 0.5496 0.4721 0.6059 0.5473 0.6573 0.361 0.3435 0.3773 0.8172 0.5543 0.4957 0.0928 0.1537 0.6505 0.1473 0.1642 0.4077 0.7131 0.615 0.1951 0.2822 0.1529 0.0355 0.1016 0.2239 0.2619 0.4893 0.3421 0.2793 0.216 0.3734 0.8645 0.1954 0.8492 0.2131 0.6945 0.1243 0.4115 0.2941 0.6816 0.245 0.7805 0.9838 
0.0742 0.4211 0.8155 0.3234 0.5649 0.4855 0.0966 0.3665 0.2545 0.3346 0.2871 0.6296 0.6831 0.7154 0.5189 0.7262 0.0566 0.3664 0.4396 0.3996 0.3112 0.2536 0.0731 0.3097 0.491 0.2012 0.033 0.3865 0.2224 0.2701 0.0285 0.4916 0.0462 0.1798 0.218 ] 2022-08-24 09:17:29 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 09:17:37 [INFO] [TRAIN] epoch: 122, iter: 154050/160000, loss: 0.4187, lr: 0.000045, batch_cost: 0.1728, reader_cost: 0.00408, ips: 46.2838 samples/sec | ETA 00:17:08 2022-08-24 09:17:48 [INFO] [TRAIN] epoch: 123, iter: 154100/160000, loss: 0.3832, lr: 0.000045, batch_cost: 0.2154, reader_cost: 0.04539, ips: 37.1418 samples/sec | ETA 00:21:10 2022-08-24 09:17:56 [INFO] [TRAIN] epoch: 123, iter: 154150/160000, loss: 0.3653, lr: 0.000044, batch_cost: 0.1652, reader_cost: 0.00105, ips: 48.4133 samples/sec | ETA 00:16:06 2022-08-24 09:18:05 [INFO] [TRAIN] epoch: 123, iter: 154200/160000, loss: 0.3914, lr: 0.000044, batch_cost: 0.1680, reader_cost: 0.00086, ips: 47.6322 samples/sec | ETA 00:16:14 2022-08-24 09:18:14 [INFO] [TRAIN] epoch: 123, iter: 154250/160000, loss: 0.3618, lr: 0.000044, batch_cost: 0.1882, reader_cost: 0.00067, ips: 42.5129 samples/sec | ETA 00:18:02 2022-08-24 09:18:24 [INFO] [TRAIN] epoch: 123, iter: 154300/160000, loss: 0.4049, lr: 0.000043, batch_cost: 0.2020, reader_cost: 0.00081, ips: 39.5988 samples/sec | ETA 00:19:11 2022-08-24 09:18:34 [INFO] [TRAIN] epoch: 123, iter: 154350/160000, loss: 0.3979, lr: 0.000043, batch_cost: 0.1963, reader_cost: 0.00077, ips: 40.7638 samples/sec | ETA 00:18:28 2022-08-24 09:18:43 [INFO] [TRAIN] epoch: 123, iter: 154400/160000, loss: 0.3720, lr: 0.000042, batch_cost: 0.1870, reader_cost: 0.00038, ips: 42.7768 samples/sec | ETA 00:17:27 2022-08-24 09:18:53 [INFO] [TRAIN] epoch: 123, iter: 154450/160000, loss: 0.4143, lr: 0.000042, batch_cost: 0.1876, reader_cost: 0.00054, ips: 42.6532 samples/sec | ETA 00:17:20 2022-08-24 09:19:02 [INFO] [TRAIN] epoch: 123, iter: 154500/160000, loss: 0.3686, lr: 0.000042, batch_cost: 0.1902, reader_cost: 0.00064, ips: 42.0716 samples/sec | ETA 00:17:25 2022-08-24 09:19:12 [INFO] [TRAIN] epoch: 123, iter: 154550/160000, loss: 0.3917, lr: 0.000041, batch_cost: 0.1949, reader_cost: 0.00071, ips: 41.0418 samples/sec | ETA 00:17:42 2022-08-24 09:19:21 [INFO] [TRAIN] epoch: 123, iter: 154600/160000, loss: 0.3906, lr: 0.000041, batch_cost: 0.1812, reader_cost: 0.00041, ips: 44.1558 samples/sec | ETA 00:16:18 2022-08-24 09:19:31 [INFO] [TRAIN] epoch: 123, iter: 154650/160000, loss: 0.3656, lr: 0.000041, batch_cost: 0.1868, reader_cost: 0.00256, ips: 42.8194 samples/sec | ETA 00:16:39 2022-08-24 09:19:40 [INFO] [TRAIN] epoch: 123, iter: 154700/160000, loss: 0.3955, lr: 0.000040, batch_cost: 0.1846, reader_cost: 0.00085, ips: 43.3413 samples/sec | ETA 00:16:18 2022-08-24 09:19:50 [INFO] [TRAIN] epoch: 123, iter: 154750/160000, loss: 0.3757, lr: 0.000040, batch_cost: 0.2038, reader_cost: 0.00056, ips: 39.2567 samples/sec | ETA 00:17:49 2022-08-24 09:20:00 [INFO] [TRAIN] epoch: 123, iter: 154800/160000, loss: 0.4277, lr: 0.000039, batch_cost: 0.2038, reader_cost: 0.00068, ips: 39.2535 samples/sec | ETA 00:17:39 2022-08-24 09:20:09 [INFO] [TRAIN] epoch: 123, iter: 154850/160000, loss: 0.3822, lr: 0.000039, batch_cost: 0.1784, reader_cost: 0.00071, ips: 44.8550 samples/sec | ETA 00:15:18 2022-08-24 09:20:22 [INFO] [TRAIN] epoch: 123, iter: 154900/160000, loss: 0.3922, lr: 0.000039, batch_cost: 0.2533, reader_cost: 0.00039, ips: 31.5825 samples/sec | ETA 
00:21:31 2022-08-24 09:20:31 [INFO] [TRAIN] epoch: 123, iter: 154950/160000, loss: 0.3636, lr: 0.000038, batch_cost: 0.1896, reader_cost: 0.00056, ips: 42.1914 samples/sec | ETA 00:15:57 2022-08-24 09:20:40 [INFO] [TRAIN] epoch: 123, iter: 155000/160000, loss: 0.3799, lr: 0.000038, batch_cost: 0.1841, reader_cost: 0.00055, ips: 43.4550 samples/sec | ETA 00:15:20 2022-08-24 09:20:40 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 144s - batch_cost: 0.1442 - reader cost: 6.0216e-04 2022-08-24 09:23:05 [INFO] [EVAL] #Images: 2000 mIoU: 0.3783 Acc: 0.7784 Kappa: 0.7613 Dice: 0.5158 2022-08-24 09:23:05 [INFO] [EVAL] Class IoU: [0.6985 0.7932 0.933 0.7414 0.6944 0.7768 0.7826 0.8123 0.5364 0.6437 0.4988 0.5806 0.7192 0.3194 0.3092 0.4447 0.5154 0.4519 0.613 0.4456 0.777 0.4344 0.6367 0.5244 0.3307 0.434 0.4629 0.485 0.4185 0.2481 0.2777 0.5621 0.3323 0.3579 0.3593 0.421 0.49 0.5903 0.296 0.3756 0.1274 0.129 0.356 0.262 0.237 0.2669 0.3842 0.5462 0.6104 0.542 0.6038 0.3977 0.1681 0.165 0.6763 0.3353 0.8939 0.431 0.4088 0.2762 0.1024 0.2498 0.3043 0.1554 0.4965 0.7242 0.2598 0.3798 0.1359 0.3706 0.5268 0.559 0.3768 0.2678 0.4995 0.4128 0.5243 0.2964 0.3587 0.2712 0.7248 0.4291 0.4412 0.0547 0.1004 0.5248 0.1205 0.1366 0.3433 0.5507 0.4381 0.0994 0.2061 0.11 0.0413 0.0751 0.1971 0.1831 0.2956 0.3164 0.2376 0.1155 0.2815 0.7895 0.1873 0.614 0.1493 0.5654 0.0837 0.2218 0.2117 0.5472 0.2108 0.6306 0.6565 0.0578 0.3949 0.6217 0.1758 0.3964 0.4344 0.0799 0.3112 0.2028 0.2978 0.2413 0.5473 0.4654 0.62 0.3933 0.5587 0.0425 0.2981 0.4372 0.3115 0.2158 0.1741 0.0452 0.2044 0.4189 0.1797 0.015 0.299 0.1876 0.1927 0.0057 0.434 0.0382 0.1385 0.2083] 2022-08-24 09:23:05 [INFO] [EVAL] Class Precision: [0.7887 0.8674 0.9652 0.826 0.77 0.8747 0.8794 0.8676 0.682 0.7504 0.7052 0.7241 0.7858 0.5467 0.5693 0.6141 0.6881 0.711 0.7744 0.6442 0.8453 0.6682 0.7809 0.6244 0.4714 0.6478 0.5683 0.7761 0.7653 0.4088 0.5023 0.716 0.5589 0.4678 0.4706 0.5784 0.6934 0.8231 0.4358 0.6287 0.3306 0.3128 0.6161 0.4802 0.3161 0.4891 0.753 0.7564 0.7215 0.6836 0.7738 0.4962 0.3666 0.5976 0.7102 0.6568 0.9412 0.6809 0.6578 0.4851 0.1464 0.5089 0.4071 0.6634 0.6051 0.828 0.3982 0.5258 0.3047 0.6365 0.7181 0.68 0.6071 0.3614 0.7378 0.5905 0.725 0.6886 0.7603 0.5128 0.8607 0.7004 0.8015 0.1463 0.179 0.7194 0.4836 0.4187 0.801 0.7104 0.6116 0.1431 0.3924 0.3362 0.1306 0.1974 0.5777 0.3813 0.4433 0.7166 0.7114 0.1953 0.5633 0.8805 0.8304 0.6785 0.4049 0.7626 0.1713 0.3662 0.4616 0.7441 0.6476 0.8118 0.6634 0.294 0.7536 0.7245 0.293 0.5506 0.7173 0.3204 0.6087 0.6341 0.7486 0.6444 0.8625 0.615 0.8533 0.6408 0.6831 0.3985 0.6444 0.8244 0.6065 0.4096 0.3639 0.126 0.4638 0.7079 0.4584 0.0243 0.5953 0.6152 0.5031 0.0083 0.8771 0.2481 0.3904 0.7483] 2022-08-24 09:23:05 [INFO] [EVAL] Class Recall: [0.8593 0.9026 0.9655 0.8786 0.8761 0.874 0.8766 0.9272 0.7152 0.8191 0.6302 0.7455 0.8945 0.4344 0.4036 0.6172 0.6724 0.5537 0.7464 0.5911 0.9058 0.5538 0.7752 0.7661 0.5257 0.5681 0.7138 0.5639 0.4801 0.3869 0.3832 0.7233 0.4505 0.6038 0.6031 0.6074 0.6256 0.676 0.4798 0.4826 0.1716 0.1801 0.4575 0.3657 0.4863 0.37 0.4396 0.6628 0.7985 0.7234 0.7333 0.6671 0.2369 0.1856 0.9341 0.4065 0.9468 0.54 0.5192 0.3907 0.2544 0.3291 0.5467 0.1687 0.7345 0.8525 0.4278 0.5778 0.197 0.4701 0.6641 0.7584 0.4984 0.5083 0.6073 0.5783 0.6545 0.3422 0.4044 0.3653 0.8212 0.5256 0.4953 0.0803 0.1863 0.6598 0.1384 0.1686 0.3753 0.7101 0.6069 0.2453 0.3026 0.1405 0.057 0.1082 0.2303 0.2606 0.4702 0.3617 0.263 0.2203 0.3601 0.8843 0.1947 
0.8659 0.1913 0.6862 0.1406 0.3599 0.2811 0.674 0.2382 0.7386 0.9845 0.0671 0.4535 0.8141 0.3053 0.5861 0.5241 0.0962 0.389 0.2296 0.3309 0.2784 0.5997 0.6567 0.694 0.5046 0.754 0.0454 0.3568 0.4821 0.3905 0.3133 0.2503 0.0658 0.2677 0.5065 0.2282 0.0377 0.3752 0.2125 0.238 0.0175 0.4621 0.0432 0.1767 0.224 ] 2022-08-24 09:23:05 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 09:23:13 [INFO] [TRAIN] epoch: 123, iter: 155050/160000, loss: 0.3808, lr: 0.000037, batch_cost: 0.1641, reader_cost: 0.00340, ips: 48.7412 samples/sec | ETA 00:13:32 2022-08-24 09:23:21 [INFO] [TRAIN] epoch: 123, iter: 155100/160000, loss: 0.4107, lr: 0.000037, batch_cost: 0.1629, reader_cost: 0.00167, ips: 49.0958 samples/sec | ETA 00:13:18 2022-08-24 09:23:30 [INFO] [TRAIN] epoch: 123, iter: 155150/160000, loss: 0.3826, lr: 0.000037, batch_cost: 0.1753, reader_cost: 0.00138, ips: 45.6454 samples/sec | ETA 00:14:10 2022-08-24 09:23:38 [INFO] [TRAIN] epoch: 123, iter: 155200/160000, loss: 0.3940, lr: 0.000036, batch_cost: 0.1609, reader_cost: 0.00074, ips: 49.7176 samples/sec | ETA 00:12:52 2022-08-24 09:23:47 [INFO] [TRAIN] epoch: 123, iter: 155250/160000, loss: 0.3900, lr: 0.000036, batch_cost: 0.1675, reader_cost: 0.00059, ips: 47.7520 samples/sec | ETA 00:13:15 2022-08-24 09:23:55 [INFO] [TRAIN] epoch: 123, iter: 155300/160000, loss: 0.3926, lr: 0.000036, batch_cost: 0.1657, reader_cost: 0.00059, ips: 48.2864 samples/sec | ETA 00:12:58 2022-08-24 09:24:07 [INFO] [TRAIN] epoch: 124, iter: 155350/160000, loss: 0.3829, lr: 0.000035, batch_cost: 0.2482, reader_cost: 0.04587, ips: 32.2347 samples/sec | ETA 00:19:14 2022-08-24 09:24:17 [INFO] [TRAIN] epoch: 124, iter: 155400/160000, loss: 0.3852, lr: 0.000035, batch_cost: 0.1975, reader_cost: 0.00123, ips: 40.4981 samples/sec | ETA 00:15:08 2022-08-24 09:24:27 [INFO] [TRAIN] epoch: 124, iter: 155450/160000, loss: 0.3754, lr: 0.000034, batch_cost: 0.1905, reader_cost: 0.00063, ips: 41.9854 samples/sec | ETA 00:14:26 2022-08-24 09:24:36 [INFO] [TRAIN] epoch: 124, iter: 155500/160000, loss: 0.3647, lr: 0.000034, batch_cost: 0.1898, reader_cost: 0.00067, ips: 42.1587 samples/sec | ETA 00:14:13 2022-08-24 09:24:46 [INFO] [TRAIN] epoch: 124, iter: 155550/160000, loss: 0.3882, lr: 0.000034, batch_cost: 0.1878, reader_cost: 0.00059, ips: 42.5947 samples/sec | ETA 00:13:55 2022-08-24 09:24:56 [INFO] [TRAIN] epoch: 124, iter: 155600/160000, loss: 0.3842, lr: 0.000033, batch_cost: 0.2030, reader_cost: 0.00059, ips: 39.4077 samples/sec | ETA 00:14:53 2022-08-24 09:25:05 [INFO] [TRAIN] epoch: 124, iter: 155650/160000, loss: 0.3622, lr: 0.000033, batch_cost: 0.1865, reader_cost: 0.00039, ips: 42.9011 samples/sec | ETA 00:13:31 2022-08-24 09:25:14 [INFO] [TRAIN] epoch: 124, iter: 155700/160000, loss: 0.3948, lr: 0.000033, batch_cost: 0.1827, reader_cost: 0.00070, ips: 43.7898 samples/sec | ETA 00:13:05 2022-08-24 09:25:23 [INFO] [TRAIN] epoch: 124, iter: 155750/160000, loss: 0.3834, lr: 0.000032, batch_cost: 0.1763, reader_cost: 0.00653, ips: 45.3680 samples/sec | ETA 00:12:29 2022-08-24 09:25:33 [INFO] [TRAIN] epoch: 124, iter: 155800/160000, loss: 0.3733, lr: 0.000032, batch_cost: 0.1912, reader_cost: 0.00155, ips: 41.8403 samples/sec | ETA 00:13:23 2022-08-24 09:25:42 [INFO] [TRAIN] epoch: 124, iter: 155850/160000, loss: 0.3750, lr: 0.000031, batch_cost: 0.1882, reader_cost: 0.00099, ips: 42.5132 samples/sec | ETA 00:13:00 2022-08-24 09:25:53 [INFO] [TRAIN] epoch: 124, iter: 155900/160000, loss: 0.3501, lr: 0.000031, batch_cost: 
0.2120, reader_cost: 0.00085, ips: 37.7370 samples/sec | ETA 00:14:29 2022-08-24 09:26:03 [INFO] [TRAIN] epoch: 124, iter: 155950/160000, loss: 0.3937, lr: 0.000031, batch_cost: 0.2043, reader_cost: 0.00041, ips: 39.1584 samples/sec | ETA 00:13:47 2022-08-24 09:26:13 [INFO] [TRAIN] epoch: 124, iter: 156000/160000, loss: 0.3758, lr: 0.000030, batch_cost: 0.2000, reader_cost: 0.00058, ips: 40.0027 samples/sec | ETA 00:13:19 2022-08-24 09:26:13 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 153s - batch_cost: 0.1525 - reader cost: 7.7445e-04 2022-08-24 09:28:46 [INFO] [EVAL] #Images: 2000 mIoU: 0.3775 Acc: 0.7776 Kappa: 0.7606 Dice: 0.5150 2022-08-24 09:28:46 [INFO] [EVAL] Class IoU: [0.6982 0.7914 0.9332 0.7406 0.6943 0.7754 0.7792 0.8144 0.5341 0.6457 0.4989 0.5804 0.7192 0.3236 0.3162 0.4474 0.5139 0.4521 0.6164 0.4464 0.7719 0.4279 0.6356 0.5202 0.3277 0.4279 0.4554 0.4754 0.4384 0.2405 0.2934 0.561 0.3355 0.355 0.3723 0.4169 0.49 0.5864 0.299 0.3829 0.1257 0.1252 0.3529 0.2632 0.2424 0.2752 0.3718 0.5497 0.6109 0.5425 0.5946 0.3995 0.1722 0.1685 0.6738 0.327 0.8932 0.4236 0.3694 0.2788 0.0928 0.2451 0.3123 0.1635 0.4943 0.7246 0.2724 0.3853 0.1386 0.3805 0.5322 0.5604 0.398 0.2763 0.4979 0.4162 0.547 0.2946 0.3909 0.2727 0.7211 0.4303 0.4331 0.0536 0.1009 0.5239 0.1149 0.1334 0.3301 0.5529 0.4327 0.0939 0.2261 0.1179 0.0418 0.0746 0.1759 0.1999 0.3006 0.3183 0.2219 0.1126 0.2765 0.7596 0.1908 0.6656 0.1433 0.561 0.0811 0.2179 0.2007 0.5015 0.2069 0.6061 0.7053 0.0617 0.3877 0.6265 0.1683 0.3854 0.4195 0.0804 0.2996 0.2035 0.2971 0.2455 0.5416 0.4629 0.5887 0.3721 0.5551 0.0503 0.2893 0.44 0.3014 0.2152 0.1777 0.053 0.2092 0.431 0.1399 0.0123 0.2981 0.1719 0.1964 0.0134 0.4255 0.0358 0.1484 0.2031] 2022-08-24 09:28:46 [INFO] [EVAL] Class Precision: [0.7923 0.8686 0.9637 0.8212 0.7748 0.8692 0.8819 0.8751 0.6783 0.7666 0.7037 0.7164 0.7906 0.5425 0.5538 0.6077 0.7122 0.7038 0.7753 0.6453 0.8383 0.6561 0.7901 0.6223 0.4748 0.6195 0.5522 0.7954 0.7485 0.3887 0.5026 0.7067 0.5702 0.4746 0.4724 0.5636 0.6694 0.808 0.4421 0.6206 0.327 0.3172 0.5779 0.4809 0.324 0.5163 0.7498 0.7603 0.7102 0.665 0.7767 0.4857 0.3428 0.5501 0.7064 0.6261 0.9399 0.6738 0.6901 0.4747 0.1331 0.4957 0.428 0.6343 0.6045 0.823 0.4216 0.5179 0.2754 0.6572 0.7227 0.6936 0.5827 0.3776 0.7368 0.5984 0.7463 0.6863 0.7835 0.4572 0.854 0.7042 0.8146 0.1491 0.1834 0.7246 0.487 0.4287 0.8041 0.7088 0.5994 0.162 0.481 0.3309 0.1406 0.2023 0.5827 0.3954 0.4615 0.7181 0.7125 0.1839 0.5614 0.8638 0.7619 0.7434 0.3485 0.7682 0.2054 0.3701 0.4356 0.6433 0.5824 0.8325 0.7138 0.2921 0.7616 0.7332 0.2508 0.5664 0.7169 0.3145 0.6281 0.6108 0.7717 0.6229 0.8101 0.5835 0.8119 0.5969 0.7234 0.4196 0.5796 0.8233 0.6267 0.4244 0.3602 0.1164 0.4563 0.692 0.4217 0.0197 0.5923 0.5064 0.5244 0.0186 0.8501 0.2299 0.4094 0.7314] 2022-08-24 09:28:46 [INFO] [EVAL] Class Recall: [0.8546 0.899 0.9672 0.883 0.8698 0.8778 0.87 0.9215 0.7153 0.8037 0.6316 0.7536 0.8884 0.4451 0.4244 0.6292 0.6486 0.5584 0.7504 0.5915 0.9069 0.5516 0.7647 0.7603 0.514 0.5804 0.7221 0.5417 0.5142 0.387 0.4134 0.7313 0.4491 0.5849 0.6373 0.6156 0.6465 0.6814 0.4803 0.4999 0.1696 0.1713 0.4755 0.3676 0.4903 0.3708 0.4244 0.6649 0.8138 0.7466 0.7172 0.6925 0.2571 0.1954 0.9359 0.4064 0.9473 0.5329 0.4428 0.4033 0.2347 0.3265 0.5359 0.1805 0.7305 0.8583 0.435 0.6008 0.2181 0.4747 0.6688 0.7447 0.5567 0.5075 0.6056 0.5774 0.6719 0.3404 0.4383 0.4034 0.8225 0.5252 0.4805 0.0773 0.1832 0.6541 0.1308 0.1623 0.3589 0.7154 0.6088 0.1826 0.2991 0.1547 
0.0561 0.1057 0.2013 0.2879 0.4631 0.3637 0.2437 0.225 0.3527 0.8629 0.2029 0.8641 0.1958 0.6753 0.1182 0.3463 0.2713 0.6946 0.2429 0.6903 0.9833 0.0725 0.4412 0.8114 0.3384 0.5466 0.5028 0.0975 0.3641 0.2338 0.3257 0.2884 0.6203 0.6913 0.6816 0.497 0.7046 0.054 0.366 0.4859 0.3673 0.3039 0.2597 0.0886 0.2787 0.5333 0.1732 0.0318 0.3751 0.2066 0.239 0.0463 0.46 0.0407 0.1888 0.2195] 2022-08-24 09:28:46 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 09:28:55 [INFO] [TRAIN] epoch: 124, iter: 156050/160000, loss: 0.3611, lr: 0.000030, batch_cost: 0.1748, reader_cost: 0.00473, ips: 45.7759 samples/sec | ETA 00:11:30 2022-08-24 09:29:03 [INFO] [TRAIN] epoch: 124, iter: 156100/160000, loss: 0.4136, lr: 0.000030, batch_cost: 0.1703, reader_cost: 0.00106, ips: 46.9680 samples/sec | ETA 00:11:04 2022-08-24 09:29:12 [INFO] [TRAIN] epoch: 124, iter: 156150/160000, loss: 0.3953, lr: 0.000029, batch_cost: 0.1859, reader_cost: 0.00083, ips: 43.0259 samples/sec | ETA 00:11:55 2022-08-24 09:29:22 [INFO] [TRAIN] epoch: 124, iter: 156200/160000, loss: 0.3848, lr: 0.000029, batch_cost: 0.1950, reader_cost: 0.00056, ips: 41.0352 samples/sec | ETA 00:12:20 2022-08-24 09:29:31 [INFO] [TRAIN] epoch: 124, iter: 156250/160000, loss: 0.3924, lr: 0.000028, batch_cost: 0.1760, reader_cost: 0.00049, ips: 45.4669 samples/sec | ETA 00:10:59 2022-08-24 09:29:42 [INFO] [TRAIN] epoch: 124, iter: 156300/160000, loss: 0.4011, lr: 0.000028, batch_cost: 0.2190, reader_cost: 0.00055, ips: 36.5237 samples/sec | ETA 00:13:30 2022-08-24 09:29:51 [INFO] [TRAIN] epoch: 124, iter: 156350/160000, loss: 0.3912, lr: 0.000028, batch_cost: 0.1829, reader_cost: 0.00042, ips: 43.7493 samples/sec | ETA 00:11:07 2022-08-24 09:30:00 [INFO] [TRAIN] epoch: 124, iter: 156400/160000, loss: 0.3750, lr: 0.000027, batch_cost: 0.1729, reader_cost: 0.00074, ips: 46.2793 samples/sec | ETA 00:10:22 2022-08-24 09:30:08 [INFO] [TRAIN] epoch: 124, iter: 156450/160000, loss: 0.3928, lr: 0.000027, batch_cost: 0.1743, reader_cost: 0.00093, ips: 45.8959 samples/sec | ETA 00:10:18 2022-08-24 09:30:18 [INFO] [TRAIN] epoch: 124, iter: 156500/160000, loss: 0.3876, lr: 0.000027, batch_cost: 0.1911, reader_cost: 0.00076, ips: 41.8668 samples/sec | ETA 00:11:08 2022-08-24 09:30:26 [INFO] [TRAIN] epoch: 124, iter: 156550/160000, loss: 0.3630, lr: 0.000026, batch_cost: 0.1703, reader_cost: 0.00033, ips: 46.9824 samples/sec | ETA 00:09:47 2022-08-24 09:30:38 [INFO] [TRAIN] epoch: 124, iter: 156600/160000, loss: 0.3587, lr: 0.000026, batch_cost: 0.2287, reader_cost: 0.00086, ips: 34.9868 samples/sec | ETA 00:12:57 2022-08-24 09:30:54 [INFO] [TRAIN] epoch: 125, iter: 156650/160000, loss: 0.3623, lr: 0.000025, batch_cost: 0.3181, reader_cost: 0.12583, ips: 25.1476 samples/sec | ETA 00:17:45 2022-08-24 09:31:03 [INFO] [TRAIN] epoch: 125, iter: 156700/160000, loss: 0.3944, lr: 0.000025, batch_cost: 0.1872, reader_cost: 0.00377, ips: 42.7432 samples/sec | ETA 00:10:17 2022-08-24 09:31:12 [INFO] [TRAIN] epoch: 125, iter: 156750/160000, loss: 0.3727, lr: 0.000025, batch_cost: 0.1843, reader_cost: 0.00218, ips: 43.4028 samples/sec | ETA 00:09:59 2022-08-24 09:31:22 [INFO] [TRAIN] epoch: 125, iter: 156800/160000, loss: 0.3603, lr: 0.000024, batch_cost: 0.2006, reader_cost: 0.00051, ips: 39.8815 samples/sec | ETA 00:10:41 2022-08-24 09:31:31 [INFO] [TRAIN] epoch: 125, iter: 156850/160000, loss: 0.3728, lr: 0.000024, batch_cost: 0.1748, reader_cost: 0.00036, ips: 45.7748 samples/sec | ETA 00:09:10 2022-08-24 09:31:41 [INFO] 
[TRAIN] epoch: 125, iter: 156900/160000, loss: 0.3806, lr: 0.000023, batch_cost: 0.1948, reader_cost: 0.00048, ips: 41.0728 samples/sec | ETA 00:10:03 2022-08-24 09:31:51 [INFO] [TRAIN] epoch: 125, iter: 156950/160000, loss: 0.3737, lr: 0.000023, batch_cost: 0.2011, reader_cost: 0.00062, ips: 39.7739 samples/sec | ETA 00:10:13 2022-08-24 09:32:01 [INFO] [TRAIN] epoch: 125, iter: 157000/160000, loss: 0.3865, lr: 0.000023, batch_cost: 0.2083, reader_cost: 0.00052, ips: 38.4085 samples/sec | ETA 00:10:24 2022-08-24 09:32:01 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 150s - batch_cost: 0.1497 - reader cost: 6.8763e-04 2022-08-24 09:34:31 [INFO] [EVAL] #Images: 2000 mIoU: 0.3801 Acc: 0.7792 Kappa: 0.7622 Dice: 0.5175 2022-08-24 09:34:31 [INFO] [EVAL] Class IoU: [0.6987 0.7923 0.9329 0.7444 0.695 0.7763 0.7834 0.8136 0.5341 0.6447 0.5156 0.586 0.7205 0.3159 0.3176 0.4479 0.524 0.4517 0.6186 0.4428 0.7752 0.4256 0.6356 0.5261 0.3295 0.448 0.4643 0.4847 0.441 0.2443 0.3023 0.5682 0.3259 0.3548 0.3538 0.42 0.4913 0.5864 0.308 0.3932 0.1317 0.1261 0.3586 0.2657 0.2673 0.2613 0.3809 0.5435 0.6124 0.5523 0.6045 0.3888 0.1683 0.1809 0.6752 0.3369 0.8936 0.4326 0.3788 0.2803 0.1004 0.2491 0.3226 0.17 0.5032 0.7204 0.2542 0.387 0.1352 0.3731 0.5171 0.5906 0.3981 0.2627 0.4984 0.4129 0.5492 0.3061 0.307 0.288 0.7327 0.4426 0.435 0.0512 0.1043 0.5265 0.1247 0.1428 0.3494 0.55 0.4373 0.094 0.2243 0.1089 0.0375 0.073 0.1742 0.1776 0.2916 0.3066 0.2286 0.1199 0.2837 0.7617 0.1884 0.6544 0.1368 0.5639 0.0793 0.2003 0.2086 0.5138 0.2127 0.6487 0.6945 0.0625 0.3959 0.626 0.1682 0.3777 0.43 0.0783 0.3187 0.2051 0.2978 0.2512 0.5481 0.46 0.6325 0.3872 0.57 0.0453 0.3053 0.43 0.3142 0.2179 0.1742 0.0473 0.2162 0.4166 0.1124 0.0181 0.2845 0.2381 0.2244 0.0101 0.4399 0.0379 0.1644 0.2057] 2022-08-24 09:34:31 [INFO] [EVAL] Class Precision: [0.7915 0.8614 0.9643 0.8314 0.7745 0.8754 0.8909 0.8709 0.6847 0.7532 0.7064 0.7107 0.7878 0.5571 0.547 0.608 0.7004 0.7031 0.7754 0.6374 0.8427 0.6731 0.7839 0.6416 0.4651 0.6717 0.5659 0.7882 0.7587 0.4046 0.4943 0.7183 0.5756 0.4774 0.4431 0.5824 0.6699 0.827 0.4537 0.6121 0.3284 0.311 0.5961 0.489 0.3842 0.5297 0.7111 0.7282 0.7279 0.7153 0.7593 0.4879 0.3163 0.6147 0.7107 0.6697 0.9355 0.6879 0.7129 0.5026 0.1401 0.502 0.4539 0.6359 0.6212 0.8144 0.4234 0.5182 0.2903 0.6417 0.7334 0.7554 0.5856 0.35 0.7351 0.5967 0.7772 0.6521 0.7545 0.5058 0.876 0.7036 0.8126 0.1727 0.1849 0.7282 0.4388 0.4166 0.8198 0.7014 0.6042 0.1501 0.4628 0.3243 0.1138 0.1866 0.5984 0.3821 0.4247 0.7494 0.7253 0.2083 0.5484 0.8921 0.7834 0.7324 0.3449 0.7583 0.1619 0.3548 0.4351 0.738 0.6045 0.844 0.7019 0.2747 0.7527 0.7315 0.2318 0.5594 0.7058 0.3294 0.6378 0.5974 0.7587 0.6171 0.8392 0.586 0.8791 0.5655 0.7426 0.351 0.6029 0.8214 0.6039 0.4207 0.3791 0.1218 0.4498 0.7163 0.4277 0.0299 0.5676 0.6175 0.4931 0.0147 0.8652 0.1968 0.4357 0.7238] 2022-08-24 09:34:31 [INFO] [EVAL] Class Recall: [0.8562 0.9081 0.9663 0.8768 0.8713 0.8727 0.8665 0.9252 0.7082 0.8174 0.6563 0.7696 0.894 0.4218 0.4309 0.6297 0.6754 0.5581 0.7536 0.5919 0.9065 0.5365 0.7706 0.7451 0.5307 0.5735 0.7212 0.5573 0.5129 0.3815 0.4377 0.7311 0.4289 0.5802 0.637 0.6011 0.6482 0.6684 0.4895 0.5237 0.1803 0.175 0.4737 0.3679 0.4675 0.3402 0.4507 0.6818 0.7942 0.708 0.7479 0.6567 0.2645 0.204 0.9312 0.404 0.9522 0.5382 0.447 0.3879 0.2619 0.3308 0.5272 0.1884 0.7259 0.8619 0.3889 0.6045 0.2019 0.4713 0.6368 0.7302 0.5543 0.513 0.6075 0.5727 0.6518 0.3659 0.341 0.4007 0.8175 0.544 0.4835 0.0678 0.193 
0.6553 0.1484 0.1785 0.3784 0.7182 0.6129 0.2007 0.3032 0.1409 0.0529 0.107 0.1973 0.2491 0.4821 0.3416 0.2502 0.2203 0.3703 0.839 0.1987 0.8602 0.1848 0.6875 0.1346 0.3151 0.286 0.6284 0.2471 0.7371 0.9851 0.0748 0.4551 0.8128 0.3797 0.5375 0.5239 0.0932 0.3891 0.2379 0.3289 0.2975 0.6124 0.6815 0.6927 0.5512 0.7104 0.0494 0.3822 0.4743 0.3957 0.3113 0.2437 0.0719 0.2939 0.4989 0.1323 0.0438 0.3632 0.2793 0.2917 0.0316 0.4723 0.0448 0.2089 0.2232] 2022-08-24 09:34:31 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 09:34:39 [INFO] [TRAIN] epoch: 125, iter: 157050/160000, loss: 0.3543, lr: 0.000022, batch_cost: 0.1603, reader_cost: 0.00401, ips: 49.8933 samples/sec | ETA 00:07:53 2022-08-24 09:34:48 [INFO] [TRAIN] epoch: 125, iter: 157100/160000, loss: 0.3638, lr: 0.000022, batch_cost: 0.1644, reader_cost: 0.00132, ips: 48.6492 samples/sec | ETA 00:07:56 2022-08-24 09:34:58 [INFO] [TRAIN] epoch: 125, iter: 157150/160000, loss: 0.4100, lr: 0.000022, batch_cost: 0.2136, reader_cost: 0.00048, ips: 37.4563 samples/sec | ETA 00:10:08 2022-08-24 09:35:08 [INFO] [TRAIN] epoch: 125, iter: 157200/160000, loss: 0.3879, lr: 0.000021, batch_cost: 0.1888, reader_cost: 0.00065, ips: 42.3657 samples/sec | ETA 00:08:48 2022-08-24 09:35:18 [INFO] [TRAIN] epoch: 125, iter: 157250/160000, loss: 0.3911, lr: 0.000021, batch_cost: 0.2015, reader_cost: 0.00271, ips: 39.7017 samples/sec | ETA 00:09:14 2022-08-24 09:35:28 [INFO] [TRAIN] epoch: 125, iter: 157300/160000, loss: 0.3774, lr: 0.000020, batch_cost: 0.1961, reader_cost: 0.00090, ips: 40.8026 samples/sec | ETA 00:08:49 2022-08-24 09:35:37 [INFO] [TRAIN] epoch: 125, iter: 157350/160000, loss: 0.3928, lr: 0.000020, batch_cost: 0.1830, reader_cost: 0.00071, ips: 43.7056 samples/sec | ETA 00:08:05 2022-08-24 09:35:45 [INFO] [TRAIN] epoch: 125, iter: 157400/160000, loss: 0.3902, lr: 0.000020, batch_cost: 0.1691, reader_cost: 0.00035, ips: 47.3023 samples/sec | ETA 00:07:19 2022-08-24 09:35:55 [INFO] [TRAIN] epoch: 125, iter: 157450/160000, loss: 0.3767, lr: 0.000019, batch_cost: 0.1981, reader_cost: 0.00083, ips: 40.3806 samples/sec | ETA 00:08:25 2022-08-24 09:36:06 [INFO] [TRAIN] epoch: 125, iter: 157500/160000, loss: 0.3819, lr: 0.000019, batch_cost: 0.2068, reader_cost: 0.00041, ips: 38.6862 samples/sec | ETA 00:08:36 2022-08-24 09:36:15 [INFO] [TRAIN] epoch: 125, iter: 157550/160000, loss: 0.4013, lr: 0.000019, batch_cost: 0.1980, reader_cost: 0.00037, ips: 40.3959 samples/sec | ETA 00:08:05 2022-08-24 09:36:25 [INFO] [TRAIN] epoch: 125, iter: 157600/160000, loss: 0.3619, lr: 0.000018, batch_cost: 0.1972, reader_cost: 0.00121, ips: 40.5666 samples/sec | ETA 00:07:53 2022-08-24 09:36:36 [INFO] [TRAIN] epoch: 125, iter: 157650/160000, loss: 0.3875, lr: 0.000018, batch_cost: 0.2102, reader_cost: 0.00057, ips: 38.0555 samples/sec | ETA 00:08:14 2022-08-24 09:36:46 [INFO] [TRAIN] epoch: 125, iter: 157700/160000, loss: 0.4077, lr: 0.000017, batch_cost: 0.1960, reader_cost: 0.00112, ips: 40.8242 samples/sec | ETA 00:07:30 2022-08-24 09:36:57 [INFO] [TRAIN] epoch: 125, iter: 157750/160000, loss: 0.3890, lr: 0.000017, batch_cost: 0.2182, reader_cost: 0.00072, ips: 36.6709 samples/sec | ETA 00:08:10 2022-08-24 09:37:08 [INFO] [TRAIN] epoch: 125, iter: 157800/160000, loss: 0.4162, lr: 0.000017, batch_cost: 0.2322, reader_cost: 0.00075, ips: 34.4489 samples/sec | ETA 00:08:30 2022-08-24 09:37:19 [INFO] [TRAIN] epoch: 125, iter: 157850/160000, loss: 0.3524, lr: 0.000016, batch_cost: 0.2152, reader_cost: 0.00066, ips: 
37.1773 samples/sec | ETA 00:07:42 2022-08-24 09:37:33 [INFO] [TRAIN] epoch: 126, iter: 157900/160000, loss: 0.3709, lr: 0.000016, batch_cost: 0.2775, reader_cost: 0.10184, ips: 28.8244 samples/sec | ETA 00:09:42 2022-08-24 09:37:41 [INFO] [TRAIN] epoch: 126, iter: 157950/160000, loss: 0.3909, lr: 0.000016, batch_cost: 0.1704, reader_cost: 0.00068, ips: 46.9444 samples/sec | ETA 00:05:49 2022-08-24 09:37:52 [INFO] [TRAIN] epoch: 126, iter: 158000/160000, loss: 0.3818, lr: 0.000015, batch_cost: 0.2045, reader_cost: 0.00915, ips: 39.1143 samples/sec | ETA 00:06:49 2022-08-24 09:37:52 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 146s - batch_cost: 0.1461 - reader cost: 7.9639e-04 2022-08-24 09:40:18 [INFO] [EVAL] #Images: 2000 mIoU: 0.3802 Acc: 0.7795 Kappa: 0.7625 Dice: 0.5175 2022-08-24 09:40:18 [INFO] [EVAL] Class IoU: [0.6992 0.7959 0.9333 0.7434 0.6909 0.7776 0.7835 0.8111 0.5323 0.6422 0.5067 0.5862 0.7232 0.3268 0.3165 0.4479 0.5067 0.4547 0.6185 0.4454 0.7759 0.4269 0.6408 0.5268 0.329 0.439 0.4623 0.4888 0.4402 0.2615 0.2948 0.5653 0.3271 0.3625 0.366 0.4158 0.4918 0.5828 0.3032 0.3865 0.1084 0.1292 0.3575 0.2618 0.2539 0.2657 0.3896 0.549 0.6257 0.5449 0.6011 0.4023 0.1666 0.1829 0.68 0.335 0.8892 0.4408 0.3741 0.2743 0.0961 0.2372 0.3259 0.1738 0.4956 0.7304 0.2753 0.3871 0.1398 0.38 0.5195 0.5502 0.4101 0.2592 0.5003 0.4116 0.5424 0.3109 0.317 0.284 0.7395 0.4469 0.4342 0.099 0.093 0.5267 0.1168 0.1345 0.3204 0.5425 0.4393 0.0698 0.2158 0.1223 0.04 0.0738 0.1636 0.1715 0.2957 0.3072 0.2323 0.1094 0.2728 0.7546 0.1945 0.6879 0.1306 0.5654 0.0882 0.2078 0.2059 0.525 0.2165 0.653 0.7207 0.0668 0.423 0.6237 0.1841 0.3902 0.429 0.0803 0.3053 0.1976 0.2965 0.2386 0.5509 0.4804 0.5818 0.372 0.5833 0.045 0.2819 0.4213 0.3131 0.2161 0.1731 0.053 0.2069 0.4301 0.1191 0.0092 0.2892 0.2929 0.1955 0.0148 0.4419 0.0376 0.1494 0.2082] 2022-08-24 09:40:18 [INFO] [EVAL] Class Precision: [0.7924 0.8645 0.9637 0.8258 0.7691 0.8714 0.8857 0.866 0.6765 0.7593 0.6954 0.7279 0.7952 0.5316 0.5555 0.6176 0.7109 0.7198 0.7791 0.6475 0.8461 0.6653 0.7782 0.6373 0.4523 0.693 0.568 0.7697 0.7637 0.4515 0.4946 0.7099 0.5725 0.4858 0.4597 0.5583 0.687 0.8241 0.4402 0.6301 0.3028 0.3251 0.5853 0.4855 0.3569 0.5559 0.7612 0.7429 0.7221 0.689 0.7805 0.5115 0.337 0.6212 0.7166 0.6784 0.933 0.6765 0.706 0.5131 0.1338 0.459 0.4746 0.6221 0.6082 0.8363 0.453 0.5244 0.2837 0.6558 0.7254 0.669 0.5655 0.362 0.7369 0.6077 0.7562 0.679 0.7571 0.5314 0.8795 0.7127 0.8106 0.2572 0.1736 0.7291 0.4775 0.4238 0.8501 0.6812 0.6066 0.099 0.4316 0.3411 0.1243 0.2014 0.5982 0.3924 0.4394 0.728 0.6809 0.1862 0.5477 0.8628 0.8066 0.765 0.3771 0.7796 0.1766 0.3559 0.4501 0.6847 0.5955 0.8268 0.7294 0.2781 0.7466 0.7222 0.2822 0.5706 0.7257 0.3264 0.6044 0.6179 0.7585 0.6154 0.8275 0.6142 0.8577 0.5776 0.7216 0.3014 0.5888 0.8371 0.6132 0.4065 0.3635 0.138 0.4748 0.6878 0.421 0.0182 0.5771 0.6706 0.5144 0.0208 0.8589 0.2202 0.4219 0.7184] 2022-08-24 09:40:18 [INFO] [EVAL] Class Recall: [0.856 0.9092 0.9672 0.8816 0.8716 0.8784 0.8717 0.9275 0.7141 0.8063 0.6511 0.7508 0.8887 0.459 0.4238 0.6199 0.6381 0.5525 0.7501 0.5879 0.9034 0.5437 0.784 0.7524 0.5469 0.545 0.7129 0.5725 0.5096 0.3832 0.4219 0.7352 0.4328 0.5883 0.6424 0.6196 0.6338 0.6656 0.4933 0.4999 0.1444 0.1766 0.4787 0.3623 0.4679 0.3373 0.4438 0.6778 0.8241 0.7227 0.7234 0.6533 0.2478 0.2059 0.93 0.3983 0.9498 0.5586 0.4432 0.3707 0.2546 0.3293 0.5099 0.1943 0.728 0.8522 0.4123 0.5965 0.216 0.4747 0.6467 0.756 0.5987 0.4773 0.6092 0.5605 
0.6574 0.3644 0.3529 0.3788 0.8229 0.5451 0.4833 0.1387 0.1669 0.6549 0.1339 0.1646 0.3396 0.7271 0.6144 0.1913 0.3014 0.1602 0.0558 0.1044 0.1838 0.2335 0.4749 0.3471 0.2606 0.2096 0.3522 0.8575 0.2041 0.8721 0.1666 0.6729 0.1497 0.3331 0.2751 0.6925 0.2539 0.7565 0.9839 0.0808 0.494 0.8205 0.3461 0.5524 0.5121 0.0962 0.3815 0.2251 0.3274 0.2804 0.6223 0.6879 0.6439 0.511 0.7526 0.0502 0.351 0.4589 0.3902 0.3158 0.2484 0.0792 0.2682 0.5345 0.1424 0.0184 0.367 0.3422 0.2398 0.0493 0.4764 0.0434 0.1879 0.2267] 2022-08-24 09:40:18 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 09:40:26 [INFO] [TRAIN] epoch: 126, iter: 158050/160000, loss: 0.3650, lr: 0.000015, batch_cost: 0.1677, reader_cost: 0.00393, ips: 47.7032 samples/sec | ETA 00:05:27 2022-08-24 09:40:35 [INFO] [TRAIN] epoch: 126, iter: 158100/160000, loss: 0.3629, lr: 0.000014, batch_cost: 0.1614, reader_cost: 0.00140, ips: 49.5802 samples/sec | ETA 00:05:06 2022-08-24 09:40:42 [INFO] [TRAIN] epoch: 126, iter: 158150/160000, loss: 0.3933, lr: 0.000014, batch_cost: 0.1508, reader_cost: 0.00111, ips: 53.0344 samples/sec | ETA 00:04:39 2022-08-24 09:40:50 [INFO] [TRAIN] epoch: 126, iter: 158200/160000, loss: 0.3926, lr: 0.000014, batch_cost: 0.1568, reader_cost: 0.00111, ips: 51.0256 samples/sec | ETA 00:04:42 2022-08-24 09:40:58 [INFO] [TRAIN] epoch: 126, iter: 158250/160000, loss: 0.3707, lr: 0.000013, batch_cost: 0.1649, reader_cost: 0.00108, ips: 48.5023 samples/sec | ETA 00:04:48 2022-08-24 09:41:06 [INFO] [TRAIN] epoch: 126, iter: 158300/160000, loss: 0.3633, lr: 0.000013, batch_cost: 0.1609, reader_cost: 0.00122, ips: 49.7230 samples/sec | ETA 00:04:33 2022-08-24 09:41:16 [INFO] [TRAIN] epoch: 126, iter: 158350/160000, loss: 0.3785, lr: 0.000012, batch_cost: 0.1868, reader_cost: 0.00067, ips: 42.8320 samples/sec | ETA 00:05:08 2022-08-24 09:41:24 [INFO] [TRAIN] epoch: 126, iter: 158400/160000, loss: 0.3963, lr: 0.000012, batch_cost: 0.1623, reader_cost: 0.00071, ips: 49.2764 samples/sec | ETA 00:04:19 2022-08-24 09:41:32 [INFO] [TRAIN] epoch: 126, iter: 158450/160000, loss: 0.3728, lr: 0.000012, batch_cost: 0.1717, reader_cost: 0.00385, ips: 46.5810 samples/sec | ETA 00:04:26 2022-08-24 09:41:41 [INFO] [TRAIN] epoch: 126, iter: 158500/160000, loss: 0.4048, lr: 0.000011, batch_cost: 0.1757, reader_cost: 0.00058, ips: 45.5194 samples/sec | ETA 00:04:23 2022-08-24 09:41:49 [INFO] [TRAIN] epoch: 126, iter: 158550/160000, loss: 0.3795, lr: 0.000011, batch_cost: 0.1503, reader_cost: 0.00096, ips: 53.2354 samples/sec | ETA 00:03:37 2022-08-24 09:41:56 [INFO] [TRAIN] epoch: 126, iter: 158600/160000, loss: 0.3678, lr: 0.000011, batch_cost: 0.1547, reader_cost: 0.00072, ips: 51.7063 samples/sec | ETA 00:03:36 2022-08-24 09:42:04 [INFO] [TRAIN] epoch: 126, iter: 158650/160000, loss: 0.3840, lr: 0.000010, batch_cost: 0.1573, reader_cost: 0.00081, ips: 50.8729 samples/sec | ETA 00:03:32 2022-08-24 09:42:13 [INFO] [TRAIN] epoch: 126, iter: 158700/160000, loss: 0.3642, lr: 0.000010, batch_cost: 0.1688, reader_cost: 0.00054, ips: 47.3986 samples/sec | ETA 00:03:39 2022-08-24 09:42:21 [INFO] [TRAIN] epoch: 126, iter: 158750/160000, loss: 0.3598, lr: 0.000009, batch_cost: 0.1580, reader_cost: 0.00082, ips: 50.6235 samples/sec | ETA 00:03:17 2022-08-24 09:42:28 [INFO] [TRAIN] epoch: 126, iter: 158800/160000, loss: 0.3820, lr: 0.000009, batch_cost: 0.1589, reader_cost: 0.00062, ips: 50.3360 samples/sec | ETA 00:03:10 2022-08-24 09:42:36 [INFO] [TRAIN] epoch: 126, iter: 158850/160000, loss: 0.3800, 
lr: 0.000009, batch_cost: 0.1599, reader_cost: 0.00070, ips: 50.0465 samples/sec | ETA 00:03:03 2022-08-24 09:42:45 [INFO] [TRAIN] epoch: 126, iter: 158900/160000, loss: 0.3644, lr: 0.000008, batch_cost: 0.1683, reader_cost: 0.00061, ips: 47.5348 samples/sec | ETA 00:03:05 2022-08-24 09:42:54 [INFO] [TRAIN] epoch: 126, iter: 158950/160000, loss: 0.4204, lr: 0.000008, batch_cost: 0.1810, reader_cost: 0.00072, ips: 44.2014 samples/sec | ETA 00:03:10 2022-08-24 09:43:02 [INFO] [TRAIN] epoch: 126, iter: 159000/160000, loss: 0.3926, lr: 0.000008, batch_cost: 0.1603, reader_cost: 0.00077, ips: 49.9212 samples/sec | ETA 00:02:40 2022-08-24 09:43:02 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 142s - batch_cost: 0.1416 - reader cost: 9.1574e-04 2022-08-24 09:45:24 [INFO] [EVAL] #Images: 2000 mIoU: 0.3799 Acc: 0.7789 Kappa: 0.7619 Dice: 0.5174 2022-08-24 09:45:24 [INFO] [EVAL] Class IoU: [0.6974 0.7921 0.9331 0.742 0.693 0.776 0.7832 0.8139 0.5351 0.6544 0.5016 0.5848 0.7202 0.3188 0.3136 0.4493 0.5094 0.4555 0.6141 0.4498 0.7792 0.4353 0.631 0.5236 0.3299 0.403 0.4593 0.4824 0.4452 0.2551 0.2946 0.5605 0.3332 0.3591 0.3563 0.4198 0.493 0.5864 0.3035 0.3927 0.1216 0.124 0.3595 0.2632 0.2473 0.2571 0.3698 0.5423 0.6216 0.5409 0.5938 0.3923 0.1663 0.2025 0.6832 0.3316 0.8896 0.4355 0.3791 0.2872 0.0942 0.2485 0.3205 0.1556 0.4951 0.7273 0.2795 0.3823 0.1357 0.3736 0.5202 0.523 0.4002 0.2682 0.5013 0.4154 0.5419 0.3064 0.3569 0.2764 0.7319 0.4407 0.4418 0.0669 0.1001 0.5252 0.1154 0.1329 0.3265 0.5568 0.4372 0.0722 0.2136 0.1218 0.0475 0.0694 0.1832 0.1887 0.3001 0.3078 0.2334 0.1204 0.273 0.7452 0.1941 0.6952 0.145 0.5573 0.0829 0.2347 0.2131 0.545 0.2128 0.6259 0.7413 0.0757 0.3766 0.6319 0.1883 0.3755 0.4013 0.0791 0.3072 0.2074 0.2983 0.2594 0.5559 0.4663 0.6183 0.3908 0.5763 0.0554 0.3077 0.4202 0.3111 0.2191 0.1706 0.0528 0.221 0.4194 0.0999 0.0113 0.2771 0.2404 0.2126 0.0041 0.4344 0.0396 0.1507 0.2146] 2022-08-24 09:45:24 [INFO] [EVAL] Class Precision: [0.7915 0.859 0.9654 0.8265 0.7704 0.8753 0.8917 0.8725 0.6797 0.7479 0.6985 0.7097 0.789 0.5575 0.5456 0.6178 0.6963 0.7143 0.7574 0.6419 0.8461 0.6654 0.7615 0.6307 0.473 0.664 0.5653 0.7787 0.7494 0.4478 0.5034 0.7169 0.5732 0.4907 0.4553 0.5703 0.6903 0.8213 0.4549 0.6057 0.3055 0.3222 0.6049 0.4792 0.3444 0.5306 0.7155 0.74 0.7203 0.6847 0.7659 0.4841 0.3439 0.6095 0.7248 0.6387 0.9424 0.6673 0.7008 0.5369 0.1355 0.5428 0.4603 0.6625 0.6022 0.8295 0.4605 0.513 0.2979 0.6403 0.7191 0.631 0.5564 0.373 0.7475 0.627 0.7533 0.66 0.7923 0.4764 0.8679 0.6861 0.8033 0.1825 0.1932 0.7182 0.4554 0.4231 0.7967 0.7172 0.6015 0.1108 0.421 0.3486 0.1482 0.1864 0.5356 0.3973 0.4553 0.7377 0.6315 0.2047 0.5495 0.8781 0.7837 0.7839 0.3774 0.7665 0.1732 0.384 0.4536 0.7361 0.6047 0.825 0.7522 0.2696 0.7341 0.7375 0.3158 0.5903 0.7523 0.2991 0.6161 0.6084 0.7536 0.6491 0.8316 0.5953 0.8461 0.6083 0.7394 0.3425 0.567 0.845 0.6154 0.4073 0.3751 0.1294 0.4819 0.7014 0.3756 0.0203 0.5746 0.6712 0.5153 0.0063 0.8455 0.2333 0.4188 0.6796] 2022-08-24 09:45:24 [INFO] [EVAL] Class Recall: [0.8543 0.9105 0.9655 0.879 0.8733 0.8725 0.8655 0.9238 0.7154 0.8396 0.6402 0.7687 0.892 0.4268 0.4245 0.6222 0.6549 0.557 0.7645 0.6005 0.908 0.5573 0.7864 0.7552 0.5215 0.5062 0.7103 0.5591 0.5231 0.3722 0.4154 0.7198 0.4431 0.5724 0.6211 0.6141 0.6331 0.6721 0.477 0.5276 0.1681 0.1677 0.4698 0.3686 0.4672 0.3328 0.4335 0.67 0.8195 0.7204 0.7255 0.6741 0.2435 0.2327 0.9225 0.4082 0.9408 0.5563 0.4523 0.3817 0.2362 0.3143 0.5133 0.1691 0.7357 
0.8551 0.4157 0.6001 0.1996 0.4728 0.6528 0.7534 0.5878 0.4884 0.6034 0.5517 0.6588 0.3638 0.3937 0.397 0.8236 0.5521 0.4954 0.0954 0.172 0.6615 0.1339 0.1623 0.3561 0.7135 0.6155 0.1713 0.3025 0.1577 0.0653 0.0996 0.2178 0.2644 0.4682 0.3457 0.2702 0.2262 0.3517 0.8312 0.2051 0.86 0.1905 0.6712 0.1372 0.3763 0.2866 0.6773 0.2472 0.7218 0.9809 0.0952 0.4361 0.8152 0.3181 0.508 0.4624 0.0971 0.3799 0.2393 0.3306 0.3017 0.6265 0.6828 0.6966 0.5223 0.7233 0.062 0.4022 0.4552 0.3863 0.3216 0.2383 0.0818 0.2899 0.5106 0.1198 0.0248 0.3486 0.2725 0.2657 0.0118 0.4719 0.0455 0.1906 0.2387] 2022-08-24 09:45:24 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000. 2022-08-24 09:45:32 [INFO] [TRAIN] epoch: 126, iter: 159050/160000, loss: 0.3586, lr: 0.000007, batch_cost: 0.1621, reader_cost: 0.00446, ips: 49.3418 samples/sec | ETA 00:02:34 2022-08-24 09:45:41 [INFO] [TRAIN] epoch: 126, iter: 159100/160000, loss: 0.3921, lr: 0.000007, batch_cost: 0.1743, reader_cost: 0.00132, ips: 45.9085 samples/sec | ETA 00:02:36 2022-08-24 09:45:53 [INFO] [TRAIN] epoch: 127, iter: 159150/160000, loss: 0.3787, lr: 0.000006, batch_cost: 0.2362, reader_cost: 0.07587, ips: 33.8627 samples/sec | ETA 00:03:20 2022-08-24 09:46:01 [INFO] [TRAIN] epoch: 127, iter: 159200/160000, loss: 0.3658, lr: 0.000006, batch_cost: 0.1718, reader_cost: 0.00090, ips: 46.5758 samples/sec | ETA 00:02:17 2022-08-24 09:46:10 [INFO] [TRAIN] epoch: 127, iter: 159250/160000, loss: 0.3639, lr: 0.000006, batch_cost: 0.1855, reader_cost: 0.00048, ips: 43.1238 samples/sec | ETA 00:02:19 2022-08-24 09:46:18 [INFO] [TRAIN] epoch: 127, iter: 159300/160000, loss: 0.3789, lr: 0.000005, batch_cost: 0.1492, reader_cost: 0.00050, ips: 53.6015 samples/sec | ETA 00:01:44 2022-08-24 09:46:26 [INFO] [TRAIN] epoch: 127, iter: 159350/160000, loss: 0.3965, lr: 0.000005, batch_cost: 0.1563, reader_cost: 0.00062, ips: 51.1948 samples/sec | ETA 00:01:41 2022-08-24 09:46:33 [INFO] [TRAIN] epoch: 127, iter: 159400/160000, loss: 0.3836, lr: 0.000005, batch_cost: 0.1531, reader_cost: 0.00075, ips: 52.2399 samples/sec | ETA 00:01:31 2022-08-24 09:46:42 [INFO] [TRAIN] epoch: 127, iter: 159450/160000, loss: 0.4079, lr: 0.000004, batch_cost: 0.1707, reader_cost: 0.00054, ips: 46.8781 samples/sec | ETA 00:01:33 2022-08-24 09:46:50 [INFO] [TRAIN] epoch: 127, iter: 159500/160000, loss: 0.3939, lr: 0.000004, batch_cost: 0.1654, reader_cost: 0.00110, ips: 48.3557 samples/sec | ETA 00:01:22 2022-08-24 09:46:58 [INFO] [TRAIN] epoch: 127, iter: 159550/160000, loss: 0.3670, lr: 0.000003, batch_cost: 0.1639, reader_cost: 0.00110, ips: 48.8031 samples/sec | ETA 00:01:13 2022-08-24 09:47:07 [INFO] [TRAIN] epoch: 127, iter: 159600/160000, loss: 0.3730, lr: 0.000003, batch_cost: 0.1618, reader_cost: 0.00115, ips: 49.4344 samples/sec | ETA 00:01:04 2022-08-24 09:47:15 [INFO] [TRAIN] epoch: 127, iter: 159650/160000, loss: 0.3887, lr: 0.000003, batch_cost: 0.1762, reader_cost: 0.00087, ips: 45.4100 samples/sec | ETA 00:01:01 2022-08-24 09:47:24 [INFO] [TRAIN] epoch: 127, iter: 159700/160000, loss: 0.3716, lr: 0.000002, batch_cost: 0.1671, reader_cost: 0.00090, ips: 47.8782 samples/sec | ETA 00:00:50 2022-08-24 09:47:31 [INFO] [TRAIN] epoch: 127, iter: 159750/160000, loss: 0.4059, lr: 0.000002, batch_cost: 0.1511, reader_cost: 0.00124, ips: 52.9430 samples/sec | ETA 00:00:37 2022-08-24 09:47:39 [INFO] [TRAIN] epoch: 127, iter: 159800/160000, loss: 0.3723, lr: 0.000002, batch_cost: 0.1537, reader_cost: 0.00052, ips: 52.0512 samples/sec | ETA 00:00:30 
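Every [TRAIN] entry above uses the same key/value layout (epoch, iter, loss, lr, batch_cost, reader_cost, ips, ETA), printed every 50 iterations per the --log_iters 50 argument. A minimal log-mining sketch, assuming exactly that layout; the log path and function name below are illustrative, not part of PaddleSeg:

# Sketch: extract (iter, loss, ips) from the [TRAIN] entries of a PaddleSeg log.
# The regex mirrors the field order shown in the entries above.
import re

TRAIN_RE = re.compile(
    r"\[TRAIN\] epoch: (?P<epoch>\d+), iter: (?P<iter>\d+)/\d+, "
    r"loss: (?P<loss>[\d.]+), lr: (?P<lr>[\d.]+), "
    r"batch_cost: (?P<batch_cost>[\d.]+), reader_cost: (?P<reader_cost>[\d.]+), "
    r"ips: (?P<ips>[\d.]+)"
)

def parse_train_log(path="trainer.log"):
    """Yield (iter, loss, ips) for every [TRAIN] entry found in the log file."""
    with open(path) as f:
        text = f.read()
    for m in TRAIN_RE.finditer(text):
        yield int(m["iter"]), float(m["loss"]), float(m["ips"])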
2022-08-24 09:47:46 [INFO] [TRAIN] epoch: 127, iter: 159850/160000, loss: 0.3730, lr: 0.000001, batch_cost: 0.1496, reader_cost: 0.00060, ips: 53.4827 samples/sec | ETA 00:00:22 2022-08-24 09:47:54 [INFO] [TRAIN] epoch: 127, iter: 159900/160000, loss: 0.3744, lr: 0.000001, batch_cost: 0.1515, reader_cost: 0.00058, ips: 52.7969 samples/sec | ETA 00:00:15 2022-08-24 09:48:02 [INFO] [TRAIN] epoch: 127, iter: 159950/160000, loss: 0.3796, lr: 0.000000, batch_cost: 0.1621, reader_cost: 0.00027, ips: 49.3493 samples/sec | ETA 00:00:08 2022-08-24 09:48:10 [INFO] [TRAIN] epoch: 127, iter: 160000/160000, loss: 0.3546, lr: 0.000000, batch_cost: 0.1604, reader_cost: 0.00134, ips: 49.8831 samples/sec | ETA 00:00:00 2022-08-24 09:48:10 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 136s - batch_cost: 0.1361 - reader cost: 0.0012 2022-08-24 09:50:27 [INFO] [EVAL] #Images: 2000 mIoU: 0.3799 Acc: 0.7794 Kappa: 0.7625 Dice: 0.5177 2022-08-24 09:50:27 [INFO] [EVAL] Class IoU: [0.6973 0.7914 0.9335 0.7461 0.695 0.7774 0.7815 0.8124 0.5344 0.6522 0.5106 0.5872 0.7235 0.32 0.3149 0.4481 0.4995 0.4595 0.6184 0.4452 0.7762 0.4437 0.6349 0.5291 0.3377 0.4504 0.466 0.4873 0.4578 0.2542 0.3066 0.5591 0.3359 0.3567 0.3599 0.4148 0.4912 0.5925 0.3009 0.3804 0.1131 0.1274 0.3578 0.266 0.2534 0.2593 0.3974 0.5495 0.5995 0.5517 0.6016 0.3946 0.1635 0.1753 0.6776 0.3387 0.8916 0.434 0.3462 0.275 0.1014 0.2487 0.3358 0.1582 0.4921 0.7338 0.2706 0.3886 0.1299 0.3783 0.5317 0.5799 0.4036 0.2669 0.4998 0.413 0.5636 0.3057 0.3465 0.2617 0.7228 0.4357 0.4396 0.0484 0.0996 0.5234 0.1303 0.14 0.3301 0.5468 0.4223 0.0759 0.2282 0.1057 0.0439 0.0734 0.1842 0.1852 0.294 0.3113 0.239 0.1133 0.2886 0.704 0.194 0.581 0.1383 0.5641 0.0808 0.2213 0.217 0.5614 0.2148 0.6321 0.7252 0.059 0.3775 0.6298 0.1829 0.3607 0.4279 0.0798 0.3348 0.2042 0.2982 0.2496 0.5472 0.458 0.6122 0.3897 0.5332 0.0507 0.2994 0.4302 0.3175 0.2177 0.1734 0.0475 0.2117 0.4234 0.1312 0.0164 0.29 0.2856 0.2065 0.013 0.4439 0.0364 0.1565 0.2104] 2022-08-24 09:50:27 [INFO] [EVAL] Class Precision: [0.7899 0.8662 0.9629 0.8295 0.7766 0.875 0.8851 0.8675 0.673 0.7513 0.711 0.7156 0.798 0.5587 0.535 0.624 0.7023 0.7042 0.7728 0.6426 0.8421 0.6759 0.775 0.638 0.4829 0.6276 0.5761 0.7858 0.7772 0.4195 0.5053 0.6983 0.5581 0.4803 0.461 0.5707 0.6858 0.8121 0.4392 0.6154 0.3193 0.3403 0.6121 0.4858 0.353 0.4979 0.7506 0.7585 0.695 0.69 0.7761 0.4941 0.3224 0.5794 0.7192 0.6619 0.936 0.6761 0.6935 0.4787 0.1462 0.4809 0.4691 0.6704 0.5966 0.8326 0.45 0.5363 0.2605 0.6494 0.7261 0.7228 0.5867 0.3592 0.7336 0.5996 0.7553 0.6403 0.7934 0.5383 LAUNCH INFO 2022-08-24 09:50:30,929 Pod failed INFO 2022-08-24 09:50:30,929 controller.py:99] Pod failed LAUNCH ERROR 2022-08-24 09:50:30,929 Container failed !!! 
Container rank 0 status failed cmd ['/ssd3/pengjuncai/anaconda3/bin/python', '-u', 'train.py', '--config', 'configs/topformer/topformer_base_ade20k_512x512_160k.yml', '--save_dir', 'output/topformer/topformer_base_ade20k_512x512_160k/test_0', '--num_workers', '3', '--do_eval', '--use_vdl', '--log_iters', '50'] code 1 log output/topformer/topformer_base_ade20k_512x512_160k/test_0/log_dir/default.xybidw.0.log env {'XDG_SESSION_ID': '5', 'HOSTNAME': 'instance-mqcyj27y-2', 'SHELL': '/bin/bash', 'TERM': 'screen', 'HISTSIZE': '50000', 'SSH_CLIENT': '172.31.22.20 26694 22', 'CONDA_SHLVL': '1', 'CONDA_PROMPT_MODIFIER': '(base) ', 'QTDIR': '/usr/lib64/qt-3.3', 'QTINC': '/usr/lib64/qt-3.3/include', 'SSH_TTY': '/dev/pts/2', 'ZSH': '/ssd3/pengjuncai/.oh-my-zsh', 'USER': 'pengjuncai', 'LS_COLORS': 'rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=01;05;37;41:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=01;36:*.au=01;36:*.flac=01;36:*.mid=01;36:*.midi=01;36:*.mka=01;36:*.mp3=01;36:*.mpc=01;36:*.ogg=01;36:*.ra=01;36:*.wav=01;36:*.axa=01;36:*.oga=01;36:*.spx=01;36:*.xspf=01;36:', 'LD_LIBRARY_PATH': '/usr/local/cuda/lib64', 'CONDA_EXE': '/ssd3/pengjuncai/anaconda3/bin/conda', 'TMOUT': '172800', 'base_model': 'topformer', 'PAGER': 'less', 'TMUX': '/tmp/tmux-1032/default,17077,0', 'LSCOLORS': 'Gxfxcxdxbxegedabagacad', '_CE_CONDA': '', 'MAIL': '/var/spool/mail/pengjuncai', 'PATH': '/ssd3/pengjuncai/.BCloud/bin:/usr/local/cuda/bin:/ssd3/pengjuncai/anaconda3/bin:/ssd3/pengjuncai/anaconda3/condabin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/ssd3/pengjuncai/.local/bin:/ssd3/pengjuncai/bin:/opt/bin:/home/opt/bin', 'tag': 'test_0', 'CONDA_PREFIX': '/ssd3/pengjuncai/anaconda3', 'PWD': '/ssd3/pengjuncai/PaddleSeg', 'CUDA_VISIBLE_DEVICES': '2,3', 'LANG': 'en_US.UTF-8', 'TMUX_PANE': '%8', 'HISTCONTROL': 'ignoredups', '_CE_M': '', 'HOME': '/ssd3/pengjuncai', 'SHLVL': '8', 'CONDA_PYTHON_EXE': '/ssd3/pengjuncai/anaconda3/bin/python', 'LESS': '-R', 'LOGNAME': 'pengjuncai', 'QTLIB': '/usr/lib64/qt-3.3/lib', 'SSH_CONNECTION': '172.31.43.62 55146 10.9.189.6 22', 'XDG_DATA_DIRS': '/ssd3/pengjuncai/.local/share/flatpak/exports/share:/var/lib/flatpak/exports/share:/usr/local/share:/usr/share', 'CONDA_DEFAULT_ENV': 'base', 'LESSOPEN': '||/usr/bin/lesspipe.sh %s', 
'XDG_RUNTIME_DIR': '/run/user/1032', 'HISTTIMEFORMAT': '%Y-%m-%d %H:%M:%S ', 'model': 'topformer_base_ade20k_512x512_160k', '_': '/usr/bin/nohup', 'OLDPWD': '/ssd3/pengjuncai/PaddleSeg/output/topformer/topformer_base_ade20k_512x512_160k/test_0', 'CUSTOM_DEVICE_ROOT': '', 'OMP_NUM_THREADS': '1', 'QT_QPA_PLATFORM_PLUGIN_PATH': '/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/cv2/qt/plugins', 'QT_QPA_FONTDIR': '/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/cv2/qt/fonts', 'PADDLE_MASTER': '10.9.189.6:33028', 'PADDLE_GLOBAL_SIZE': '2', 'PADDLE_LOCAL_SIZE': '2', 'PADDLE_GLOBAL_RANK': '0', 'PADDLE_LOCAL_RANK': '0', 'PADDLE_TRAINER_ENDPOINTS': '10.9.189.6:44808,10.9.189.6:32912', 'PADDLE_CURRENT_ENDPOINT': '10.9.189.6:44808', 'PADDLE_TRAINER_ID': '0', 'PADDLE_TRAINERS_NUM': '2', 'PADDLE_RANK_IN_NODE': '0', 'FLAGS_selected_gpus': '0'} ERROR 2022-08-24 09:50:30,929 controller.py:100] Container failed !!! Container rank 0 status failed cmd ['/ssd3/pengjuncai/anaconda3/bin/python', '-u', 'train.py', '--config', 'configs/topformer/topformer_base_ade20k_512x512_160k.yml', '--save_dir', 'output/topformer/topformer_base_ade20k_512x512_160k/test_0', '--num_workers', '3', '--do_eval', '--use_vdl', '--log_iters', '50'] code 1 log output/topformer/topformer_base_ade20k_512x512_160k/test_0/log_dir/default.xybidw.0.log env {'XDG_SESSION_ID': '5', 'HOSTNAME': 'instance-mqcyj27y-2', 'SHELL': '/bin/bash', 'TERM': 'screen', 'HISTSIZE': '50000', 'SSH_CLIENT': '172.31.22.20 26694 22', 'CONDA_SHLVL': '1', 'CONDA_PROMPT_MODIFIER': '(base) ', 'QTDIR': '/usr/lib64/qt-3.3', 'QTINC': '/usr/lib64/qt-3.3/include', 'SSH_TTY': '/dev/pts/2', 'ZSH': '/ssd3/pengjuncai/.oh-my-zsh', 'USER': 'pengjuncai', 'LS_COLORS': 'rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=01;05;37;41:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=01;36:*.au=01;36:*.flac=01;36:*.mid=01;36:*.midi=01;36:*.mka=01;36:*.mp3=01;36:*.mpc=01;36:*.ogg=01;36:*.ra=01;36:*.wav=01;36:*.axa=01;36:*.oga=01;36:*.spx=01;36:*.xspf=01;36:', 'LD_LIBRARY_PATH': '/usr/local/cuda/lib64', 'CONDA_EXE': '/ssd3/pengjuncai/anaconda3/bin/conda', 'TMOUT': '172800', 'base_model': 'topformer', 'PAGER': 'less', 'TMUX': '/tmp/tmux-1032/default,17077,0', 'LSCOLORS': 'Gxfxcxdxbxegedabagacad', '_CE_CONDA': '', 'MAIL': '/var/spool/mail/pengjuncai', 'PATH': 
'/ssd3/pengjuncai/.BCloud/bin:/usr/local/cuda/bin:/ssd3/pengjuncai/anaconda3/bin:/ssd3/pengjuncai/anaconda3/condabin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/ssd3/pengjuncai/.local/bin:/ssd3/pengjuncai/bin:/opt/bin:/home/opt/bin', 'tag': 'test_0', 'CONDA_PREFIX': '/ssd3/pengjuncai/anaconda3', 'PWD': '/ssd3/pengjuncai/PaddleSeg', 'CUDA_VISIBLE_DEVICES': '2,3', 'LANG': 'en_US.UTF-8', 'TMUX_PANE': '%8', 'HISTCONTROL': 'ignoredups', '_CE_M': '', 'HOME': '/ssd3/pengjuncai', 'SHLVL': '8', 'CONDA_PYTHON_EXE': '/ssd3/pengjuncai/anaconda3/bin/python', 'LESS': '-R', 'LOGNAME': 'pengjuncai', 'QTLIB': '/usr/lib64/qt-3.3/lib', 'SSH_CONNECTION': '172.31.43.62 55146 10.9.189.6 22', 'XDG_DATA_DIRS': '/ssd3/pengjuncai/.local/share/flatpak/exports/share:/var/lib/flatpak/exports/share:/usr/local/share:/usr/share', 'CONDA_DEFAULT_ENV': 'base', 'LESSOPEN': '||/usr/bin/lesspipe.sh %s', 'XDG_RUNTIME_DIR': '/run/user/1032', 'HISTTIMEFORMAT': '%Y-%m-%d %H:%M:%S ', 'model': 'topformer_base_ade20k_512x512_160k', '_': '/usr/bin/nohup', 'OLDPWD': '/ssd3/pengjuncai/PaddleSeg/output/topformer/topformer_base_ade20k_512x512_160k/test_0', 'CUSTOM_DEVICE_ROOT': '', 'OMP_NUM_THREADS': '1', 'QT_QPA_PLATFORM_PLUGIN_PATH': '/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/cv2/qt/plugins', 'QT_QPA_FONTDIR': '/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/cv2/qt/fonts', 'PADDLE_MASTER': '10.9.189.6:33028', 'PADDLE_GLOBAL_SIZE': '2', 'PADDLE_LOCAL_SIZE': '2', 'PADDLE_GLOBAL_RANK': '0', 'PADDLE_LOCAL_RANK': '0', 'PADDLE_TRAINER_ENDPOINTS': '10.9.189.6:44808,10.9.189.6:32912', 'PADDLE_CURRENT_ENDPOINT': '10.9.189.6:44808', 'PADDLE_TRAINER_ID': '0', 'PADDLE_TRAINERS_NUM': '2', 'PADDLE_RANK_IN_NODE': '0', 'FLAGS_selected_gpus': '0'} LAUNCH INFO 2022-08-24 09:50:31,904 Exit code 1 INFO 2022-08-24 09:50:31,904 controller.py:124] Exit code 1 0.859 0.7068 0.8085 0.1424 0.1851 0.7261 0.4883 0.4302 0.8148 0.6998 0.5719 0.1134 0.4559 0.3279 0.1468 0.1952 0.5765 0.3784 0.4415 0.7674 0.6477 0.1991 0.5319 0.8587 0.8361 0.6339 0.3529 0.764 0.183 0.3627 0.4516 0.7652 0.6304 0.8342 0.7337 0.2804 0.7808 0.7355 0.3059 0.5397 0.7093 0.3162 0.6561 0.5989 0.7749 0.6238 0.8352 0.5858 0.8625 0.591 0.6988 0.3401 0.6101 0.8267 0.6124 0.4187 0.3699 0.124 0.4521 0.6974 0.4115 0.0293 0.5857 0.6782 0.4843 0.0175 0.8513 0.212 0.4799 0.7276] 2022-08-24 09:50:27 [INFO] [EVAL] Class Recall: [0.856 0.9017 0.9683 0.8812 0.8687 0.8746 0.8698 0.9275 0.7219 0.8318 0.6443 0.766 0.8857 0.4282 0.4336 0.6139 0.6337 0.5693 0.7557 0.5918 0.9084 0.5636 0.7783 0.7561 0.529 0.6146 0.7092 0.5619 0.5269 0.3922 0.4382 0.7373 0.4576 0.5808 0.6214 0.603 0.6338 0.6866 0.4887 0.499 0.149 0.1692 0.4627 0.3702 0.473 0.3512 0.4579 0.666 0.8136 0.7335 0.728 0.6622 0.2491 0.2008 0.9214 0.4096 0.9495 0.548 0.4087 0.3925 0.2488 0.3399 0.5417 0.1715 0.7375 0.8607 0.4044 0.5852 0.2057 0.4753 0.665 0.7458 0.564 0.5094 0.6106 0.5703 0.6896 0.3691 0.3808 0.3374 0.82 0.5318 0.4907 0.0682 0.1773 0.6521 0.1509 0.1718 0.3569 0.7143 0.6176 0.1865 0.3136 0.135 0.0589 0.1053 0.213 0.2661 0.4681 0.3437 0.2747 0.2081 0.3869 0.7962 0.2017 0.8744 0.1854 0.6832 0.1263 0.3622 0.2946 0.6782 0.2458 0.7229 0.9842 0.0695 0.4223 0.8142 0.3126 0.5209 0.519 0.0965 0.406 0.2365 0.3265 0.2939 0.6134 0.6773 0.6784 0.5335 0.6922 0.0563 0.3702 0.4728 0.3974 0.3121 0.2461 0.0714 0.2847 0.5186 0.1614 0.0361 0.3649 0.3304 0.2648 0.0486 0.4812 0.0421 0.1884 0.2284] 
2022-08-24 09:50:27 [INFO] [EVAL] The model with the best validation mIoU (0.3828) was saved at iter 134000.
's flops has been counted
Customize Function has been applied to
's flops has been counted
Cannot find suitable count function for . Treat it as zero FLOPs.
's flops has been counted
Cannot find suitable count function for . Treat it as zero FLOPs.
's flops has been counted
Cannot find suitable count function for . Treat it as zero FLOPs.
Traceback (most recent call last):
  File "/ssd3/pengjuncai/PaddleSeg/train.py", line 240, in <module>
    main(args)
  File "/ssd3/pengjuncai/PaddleSeg/train.py", line 216, in main
    train(
  File "/ssd3/pengjuncai/PaddleSeg/paddleseg/core/train.py", line 327, in train
    _ = paddle.flops(
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/hapi/dynamic_flops.py", line 109, in flops
    return dynamic_flops(
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/hapi/dynamic_flops.py", line 257, in dynamic_flops
    model(inputs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 930, in __call__
    return self._dygraph_call_func(*inputs, **kwargs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 915, in _dygraph_call_func
    outputs = self.forward(*inputs, **kwargs)
  File "/ssd3/pengjuncai/PaddleSeg/paddleseg/models/topformer.py", line 76, in forward
    x = self.backbone(x)  # len=3, 1/8,1/16,1/32
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 930, in __call__
    return self._dygraph_call_func(*inputs, **kwargs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 915, in _dygraph_call_func
    outputs = self.forward(*inputs, **kwargs)
  File "/ssd3/pengjuncai/PaddleSeg/paddleseg/models/backbones/top_transformer.py", line 573, in forward
    out = self.ppa(ouputs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 930, in __call__
    return self._dygraph_call_func(*inputs, **kwargs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 918, in _dygraph_call_func
    hook_result = forward_post_hook(self, inputs, outputs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/hapi/dynamic_flops.py", line 183, in count_io_info
    m.register_buffer('input_shape', paddle.to_tensor(x[0].shape))
AttributeError: 'list' object has no attribute 'shape'
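The traceback comes from the FLOPs report that paddleseg/core/train.py runs after the last iteration: the count_io_info hook in paddle.hapi.dynamic_flops reads x[0].shape, but the sub-layer invoked through self.ppa receives a list of feature maps (the backbone returns three scales, per the "# len=3, 1/8,1/16,1/32" comment), so x[0] is a list rather than a tensor. Training itself had already reached iter 160000/160000 and the best checkpoint (mIoU 0.3828, iter 134000) was saved, so the failure only affects the final FLOPs printout. A defensive sketch, not the PaddleSeg implementation (the wrapper name and the 1x3x512x512 input size are illustrative):

# Guard the post-training FLOPs report so a counting failure cannot turn an
# otherwise finished run into a non-zero exit code.
import paddle

def report_flops_safely(model, input_size=(1, 3, 512, 512)):
    try:
        paddle.flops(model, list(input_size), print_detail=False)
    except Exception as exc:
        # e.g. the AttributeError raised by count_io_info when a layer's input is a list
        print(f"FLOPs counting skipped: {exc}")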
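Even with the crash, the outcome of the run is recorded in the [EVAL] summaries above: the best validation mIoU (0.3828) was reached at iter 134000, and that checkpoint was saved before the FLOPs step failed. A small sketch for recovering the evaluation curve from such a log, assuming the exact "[EVAL] #Images: ... mIoU: ..." format shown (the path and helper name are illustrative):

# Sketch: collect the per-evaluation mIoU values printed by the [EVAL] summaries.
import re

EVAL_RE = re.compile(
    r"\[EVAL\] #Images: \d+ mIoU: (?P<miou>[\d.]+) Acc: (?P<acc>[\d.]+) "
    r"Kappa: (?P<kappa>[\d.]+) Dice: (?P<dice>[\d.]+)"
)

def eval_miou_curve(path="trainer.log"):
    """Return the list of mIoU values in the order they were evaluated."""
    with open(path) as f:
        text = f.read()
    return [float(m["miou"]) for m in EVAL_RE.finditer(text)]

# max(eval_miou_curve("trainer.log")) should agree with the 0.3828 reported above.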