LAUNCH INFO 2022-08-22 18:00:12,865 ----------- Configuration ----------------------
LAUNCH INFO 2022-08-22 18:00:12,865 devices: None
LAUNCH INFO 2022-08-22 18:00:12,865 elastic_level: -1
LAUNCH INFO 2022-08-22 18:00:12,865 elastic_timeout: 30
LAUNCH INFO 2022-08-22 18:00:12,865 gloo_port: 6767
LAUNCH INFO 2022-08-22 18:00:12,865 host: None
LAUNCH INFO 2022-08-22 18:00:12,865 job_id: default
LAUNCH INFO 2022-08-22 18:00:12,865 legacy: False
LAUNCH INFO 2022-08-22 18:00:12,865 log_dir: output/topformer/topformer_small_ade20k_512x512_160k/test_6/log_dir
LAUNCH INFO 2022-08-22 18:00:12,865 log_level: INFO
LAUNCH INFO 2022-08-22 18:00:12,865 master: None
LAUNCH INFO 2022-08-22 18:00:12,865 max_restart: 3
LAUNCH INFO 2022-08-22 18:00:12,866 nnodes: 1
LAUNCH INFO 2022-08-22 18:00:12,866 nproc_per_node: None
LAUNCH INFO 2022-08-22 18:00:12,866 rank: -1
LAUNCH INFO 2022-08-22 18:00:12,866 run_mode: collective
LAUNCH INFO 2022-08-22 18:00:12,866 server_num: None
LAUNCH INFO 2022-08-22 18:00:12,866 servers:
LAUNCH INFO 2022-08-22 18:00:12,866 trainer_num: None
LAUNCH INFO 2022-08-22 18:00:12,866 trainers:
LAUNCH INFO 2022-08-22 18:00:12,866 training_script: train.py
LAUNCH INFO 2022-08-22 18:00:12,866 training_script_args: ['--config', 'configs/topformer/topformer_small_ade20k_512x512_160k.yml', '--save_dir', 'output/topformer/topformer_small_ade20k_512x512_160k/test_6', '--num_workers', '3', '--do_eval', '--use_vdl', '--log_iters', '50']
LAUNCH INFO 2022-08-22 18:00:12,866 with_gloo: 0
LAUNCH INFO 2022-08-22 18:00:12,866 --------------------------------------------------
LAUNCH INFO 2022-08-22 18:00:12,871 Job: default, mode collective, replicas 1[1:1], elastic False
LAUNCH INFO 2022-08-22 18:00:12,872 Run Pod: lkbwfy, replicas 2, status ready
LAUNCH INFO 2022-08-22 18:00:12,897 Watching Pod: lkbwfy, replicas 2, status running
2022-08-22 18:00:15 [INFO] ------------Environment Information-------------
platform: Linux-3.10.0-1062.18.1.el7.x86_64-x86_64-with-glibc2.18
Python: 3.9.7 (default, Sep 16 2021, 13:09:58) [GCC 7.5.0]
Paddle compiled with cuda: True
NVCC: Cuda compilation tools, release 10.2, V10.2.89
cudnn: 7.6
GPUs used: 2
CUDA_VISIBLE_DEVICES: 2,3
GPU: ['GPU 0: Tesla V100-SXM2-16GB', 'GPU 1: Tesla V100-SXM2-16GB', 'GPU 2: Tesla V100-SXM2-16GB', 'GPU 3: Tesla V100-SXM2-16GB', 'GPU 4: Tesla V100-SXM2-16GB', 'GPU 5: Tesla V100-SXM2-16GB', 'GPU 6: Tesla V100-SXM2-16GB', 'GPU 7: Tesla V100-SXM2-16GB']
GCC: gcc (GCC) 9.1.0
PaddleSeg: develop
PaddlePaddle: 2.3.0
OpenCV: 4.4.0
------------------------------------------------
2022-08-22 18:00:15 [INFO] ---------------Config Information---------------
batch_size: 8
iters: 160000
loss:
  coef:
  - 1
  types:
  - ignore_index: 255
    type: CrossEntropyLoss
lr_scheduler:
  end_lr: 0
  learning_rate: 0.0012
  power: 1.0
  type: PolynomialDecay
  warmup_iters: 1500
  warmup_start_lr: 1.0e-06
model:
  backbone:
    lr_mult: 0.1
    pretrained: pretrained_model/topformer_small_classification_from_torch.pdparams
    type: TopTransformer_Small
  type: TopFormer
optimizer:
  type: AdamW
  weight_decay: 0.01
train_dataset:
  dataset_root: data/ADEChallengeData2016/
  mode: train
  transforms:
  - max_scale_factor: 2.0
    min_scale_factor: 0.5
    scale_step_size: 0.25
    type: ResizeStepScaling
  - crop_size:
    - 512
    - 512
    type: RandomPaddingCrop
  - type: RandomHorizontalFlip
  - brightness_range: 0.4
    contrast_range: 0.4
    saturation_range: 0.4
    type: RandomDistort
  - mean:
    - 0.485
    - 0.456
    - 0.406
    std:
    - 0.229
    - 0.224
    - 0.225
    type: Normalize
  type: ADE20K
val_dataset:
  dataset_root: data/ADEChallengeData2016/
  mode: val
  transforms:
  - keep_ratio: true
    size_divisor: 32
    target_size:
    - 2048
    - 512
    type: Resize
  - mean:
    - 0.485
    - 0.456
    - 0.406
    std:
    - 0.229
    - 0.224
    - 0.225
    type: Normalize
  type: ADE20K
------------------------------------------------
W0822 18:00:15.353842 92614 gpu_context.cc:278] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 10.2, Runtime API Version: 10.2
W0822 18:00:15.353920 92614 gpu_context.cc:306] device: 0, cuDNN Version: 7.6.
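The lr values that appear in the [TRAIN] lines below (0.000040 at iter 50, about 0.001200 around iter 1500, then a slow decay) follow from the lr_scheduler block above: a linear warmup from warmup_start_lr to learning_rate over the first warmup_iters steps, then a PolynomialDecay with power 1.0 (i.e. linear) towards end_lr over the remaining iterations. The sketch below is illustrative only, assuming this is how PaddleSeg wires paddle.optimizer.lr.LinearWarmup around PolynomialDecay; the logged values may differ by a one-step offset.

```python
# Rough reproduction of the logged learning-rate curve (illustrative sketch,
# not PaddleSeg's scheduler code).
def lr_at(step, base_lr=0.0012, warmup_start_lr=1.0e-6, warmup_iters=1500,
          total_iters=160000, end_lr=0.0, power=1.0):
    if step < warmup_iters:
        # linear warmup: warmup_start_lr -> base_lr over warmup_iters steps
        return warmup_start_lr + (base_lr - warmup_start_lr) * step / warmup_iters
    # polynomial decay (power=1.0 is linear): base_lr -> end_lr over the rest
    frac = (step - warmup_iters) / (total_iters - warmup_iters)
    return (base_lr - end_lr) * (1.0 - frac) ** power + end_lr

for it in (50, 100, 1000, 2000, 3000):
    print(it, f"{lr_at(it):.6f}")
# 0.000041, 0.000081, 0.000800, 0.001196, 0.001189, which matches the logged
# lr values (0.000040, 0.000080, 0.000800, 0.001196, 0.001189) to within
# rounding and a one-step offset.
```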
2022-08-22 18:00:21 [INFO] Loading pretrained model from pretrained_model/topformer_small_classification_from_torch.pdparams
2022-08-22 18:00:21 [WARNING] SIM.1.local_embedding.conv.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.1.local_embedding.bn.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.1.local_embedding.bn.bias is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.1.local_embedding.bn._mean is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.1.local_embedding.bn._variance is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.1.global_embedding.conv.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.1.global_embedding.bn.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.1.global_embedding.bn.bias is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.1.global_embedding.bn._mean is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.1.global_embedding.bn._variance is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.1.global_act.conv.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.1.global_act.bn.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.1.global_act.bn.bias is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.1.global_act.bn._mean is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.1.global_act.bn._variance is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.local_embedding.conv.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.local_embedding.bn.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.local_embedding.bn.bias is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.local_embedding.bn._mean is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.local_embedding.bn._variance is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.global_embedding.conv.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.global_embedding.bn.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.global_embedding.bn.bias is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.global_embedding.bn._mean is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.global_embedding.bn._variance is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.global_act.conv.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.global_act.bn.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.global_act.bn.bias is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.global_act.bn._mean is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.2.global_act.bn._variance is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.local_embedding.conv.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.local_embedding.bn.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.local_embedding.bn.bias is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.local_embedding.bn._mean is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.local_embedding.bn._variance is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.global_embedding.conv.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.global_embedding.bn.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.global_embedding.bn.bias is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.global_embedding.bn._mean is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.global_embedding.bn._variance is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.global_act.conv.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.global_act.bn.weight is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.global_act.bn.bias is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.global_act.bn._mean is not in pretrained model
2022-08-22 18:00:21 [WARNING] SIM.3.global_act.bn._variance is not in pretrained model
2022-08-22 18:00:21 [INFO] There are 278/323 variables loaded into TopTransformer.
server not ready, wait 3 sec to retry... not ready endpoints:['10.9.189.6:51740']
I0822 18:00:24.317682 92614 nccl_context.cc:83] init nccl context nranks: 2 local rank: 0 gpu id: 0 ring id: 0
I0822 18:00:24.724807 92614 nccl_context.cc:115] init nccl context nranks: 2 local rank: 0 gpu id: 0 ring id: 10
/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/tensor/creation.py:125: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe. Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  if data.dtype == np.object:
2022-08-22 18:00:24,792-INFO: [topology.py:169:__init__] HybridParallelInfo: rank_id: 0, mp_degree: 1, sharding_degree: 1, pp_degree: 1, dp_degree: 2, mp_group: [0], sharding_group: [0], pp_group: [0], dp_group: [0, 1], check/clip group: [0]
/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/math_op_patch.py:276: UserWarning: The dtype of left and right variables are not the same, left dtype is paddle.float32, but right dtype is paddle.int64, the right dtype will convert to paddle.float32
  warnings.warn(
2022-08-22 18:00:36 [INFO] [TRAIN] epoch: 1, iter: 50/160000, loss: 5.8003, lr: 0.000040, batch_cost: 0.2353, reader_cost: 0.03565, ips: 34.0059 samples/sec | ETA 10:27:08
2022-08-22 18:00:45 [INFO] [TRAIN] epoch: 1, iter: 100/160000, loss: 5.6049, lr: 0.000080, batch_cost: 0.1795, reader_cost: 0.00038, ips: 44.5593 samples/sec | ETA 07:58:27
2022-08-22 18:00:53 [INFO] [TRAIN] epoch: 1, iter: 150/160000, loss: 5.1686, lr: 0.000120, batch_cost: 0.1634, reader_cost: 0.00079, ips: 48.9715 samples/sec | ETA 07:15:13
2022-08-22 18:01:02 [INFO] [TRAIN] epoch: 1, iter: 200/160000, loss: 4.5436, lr: 0.000160, batch_cost: 0.1799, reader_cost: 0.00066, ips: 44.4759 samples/sec | ETA 07:59:03
2022-08-22 18:01:11 [INFO] [TRAIN] epoch: 1, iter: 250/160000, loss: 3.8711, lr: 0.000200, batch_cost: 0.1753, reader_cost: 0.00067, ips: 45.6444 samples/sec | ETA 07:46:39
2022-08-22 18:01:20 [INFO] [TRAIN] epoch: 1, iter: 300/160000, loss: 3.3567, lr: 0.000240, batch_cost: 0.1792, reader_cost: 0.00036, ips: 44.6342 samples/sec | ETA 07:57:03
2022-08-22 18:01:29 [INFO] [TRAIN] epoch: 1, iter: 350/160000, loss: 3.0835, lr: 0.000280, batch_cost: 0.1771, reader_cost: 0.00072, ips: 45.1624 samples/sec | ETA 07:51:20
2022-08-22 18:01:39 [INFO] [TRAIN] epoch: 1, iter: 400/160000, loss: 2.8102, lr: 0.000320, batch_cost: 0.1929, reader_cost: 0.00036, ips: 41.4772 samples/sec | ETA 08:33:03
2022-08-22 18:01:48 [INFO] [TRAIN] epoch: 1, iter: 450/160000, loss: 2.5947, lr: 0.000360, batch_cost: 0.1794, reader_cost: 0.00038, ips: 44.6028 samples/sec | ETA 07:56:57
2022-08-22 18:01:56 [INFO] [TRAIN] epoch: 1, iter: 500/160000, loss: 2.6448, lr: 0.000400, batch_cost: 0.1618,
reader_cost: 0.00315, ips: 49.4334 samples/sec | ETA 07:10:12 2022-08-22 18:02:04 [INFO] [TRAIN] epoch: 1, iter: 550/160000, loss: 2.4674, lr: 0.000440, batch_cost: 0.1674, reader_cost: 0.00181, ips: 47.7849 samples/sec | ETA 07:24:54 2022-08-22 18:02:12 [INFO] [TRAIN] epoch: 1, iter: 600/160000, loss: 2.3611, lr: 0.000480, batch_cost: 0.1597, reader_cost: 0.00058, ips: 50.0997 samples/sec | ETA 07:04:13 2022-08-22 18:02:22 [INFO] [TRAIN] epoch: 1, iter: 650/160000, loss: 2.2977, lr: 0.000520, batch_cost: 0.1898, reader_cost: 0.00069, ips: 42.1461 samples/sec | ETA 08:24:07 2022-08-22 18:02:30 [INFO] [TRAIN] epoch: 1, iter: 700/160000, loss: 2.1568, lr: 0.000560, batch_cost: 0.1624, reader_cost: 0.00040, ips: 49.2481 samples/sec | ETA 07:11:17 2022-08-22 18:02:40 [INFO] [TRAIN] epoch: 1, iter: 750/160000, loss: 2.2465, lr: 0.000600, batch_cost: 0.2054, reader_cost: 0.00064, ips: 38.9478 samples/sec | ETA 09:05:10 2022-08-22 18:02:52 [INFO] [TRAIN] epoch: 1, iter: 800/160000, loss: 1.9612, lr: 0.000640, batch_cost: 0.2479, reader_cost: 0.00064, ips: 32.2733 samples/sec | ETA 10:57:42 2022-08-22 18:03:03 [INFO] [TRAIN] epoch: 1, iter: 850/160000, loss: 2.0061, lr: 0.000680, batch_cost: 0.2048, reader_cost: 0.00041, ips: 39.0560 samples/sec | ETA 09:03:19 2022-08-22 18:03:14 [INFO] [TRAIN] epoch: 1, iter: 900/160000, loss: 2.0588, lr: 0.000720, batch_cost: 0.2237, reader_cost: 0.00053, ips: 35.7694 samples/sec | ETA 09:53:03 2022-08-22 18:03:25 [INFO] [TRAIN] epoch: 1, iter: 950/160000, loss: 1.9637, lr: 0.000760, batch_cost: 0.2199, reader_cost: 0.00049, ips: 36.3758 samples/sec | ETA 09:42:59 2022-08-22 18:03:35 [INFO] [TRAIN] epoch: 1, iter: 1000/160000, loss: 1.9829, lr: 0.000800, batch_cost: 0.2010, reader_cost: 0.00377, ips: 39.8002 samples/sec | ETA 08:52:39 2022-08-22 18:03:35 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 181s - batch_cost: 0.1810 - reader cost: 0.0014 2022-08-22 18:06:36 [INFO] [EVAL] #Images: 2000 mIoU: 0.0727 Acc: 0.6192 Kappa: 0.5836 Dice: 0.1027 2022-08-22 18:06:36 [INFO] [EVAL] Class IoU: [0.4969 0.6327 0.8631 0.5485 0.5277 0.5742 0.5915 0.4839 0.3672 0.5533 0.2917 0.3298 0.551 0.1407 0.0296 0.0963 0.3036 0.2292 0.3172 0.2364 0.585 0.2904 0.2042 0.2168 0.1742 0.1174 0.2808 0.0015 0.0432 0.1163 0.0115 0.0495 0.0027 0.0234 0.0825 0.0002 0.0352 0.035 0. 0.0035 0. 0. 0. 0.0081 0.0085 0. 0.0004 0.1481 0.0047 0.0001 0. 0. 0.0025 0. 0. 0.0039 0.1576 0.0303 0. 0. 0.0002 0. 0.0084 0. 0.004 0.0003 0.0042 0.0661 0. 0. 0. 0.0098 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0042 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. ] 2022-08-22 18:06:36 [INFO] [EVAL] Class Precision: [0.5752 0.6719 0.9104 0.6581 0.6156 0.7037 0.6757 0.6128 0.5244 0.6411 0.4045 0.5578 0.6441 0.5455 0.2037 0.253 0.3806 0.5851 0.5778 0.3177 0.704 0.5493 0.4383 0.2587 0.2345 0.2822 0.4575 0.2704 0.6495 0.2366 0.1059 0.6545 0.1773 0.2728 0.2328 0.2957 0.6257 0.3765 0. 0.1197 0. 0.0057 0. 0.6425 0.4531 0. 0.6502 0.548 0.7329 0.3176 0. 0.0104 0.17 0. 0.0414 0.535 0.9239 0.3373 0. 0. 0.0086 0. 0.4307 0. 0.1431 1. 0.2414 0.6221 0. 0. 0. 0.2472 0.0761 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 
] 2022-08-22 18:06:36 [INFO] [EVAL] Class Recall: [0.7849 0.9156 0.9432 0.7671 0.787 0.7573 0.826 0.697 0.5506 0.8017 0.5114 0.4466 0.7923 0.1594 0.0335 0.1345 0.6002 0.2737 0.4129 0.4799 0.7758 0.3813 0.2765 0.5726 0.4038 0.1673 0.4209 0.0015 0.0442 0.1861 0.0127 0.0508 0.0027 0.0249 0.1133 0.0002 0.036 0.0371 0. 0.0036 0. 0. 0. 0.0082 0.0086 0. 0.0004 0.1687 0.0047 0.0001 0. 0. 0.0025 0. 0. 0.0039 0.1597 0.0322 0. 0. 0.0002 0. 0.0085 0. 0.0041 0.0003 0.0043 0.0689 0. 0. 0. 0.0101 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0042 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. ] 2022-08-22 18:06:36 [INFO] [EVAL] The model with the best validation mIoU (0.0727) was saved at iter 1000. 2022-08-22 18:06:46 [INFO] [TRAIN] epoch: 1, iter: 1050/160000, loss: 1.8589, lr: 0.000840, batch_cost: 0.1896, reader_cost: 0.00507, ips: 42.2030 samples/sec | ETA 08:22:10 2022-08-22 18:06:55 [INFO] [TRAIN] epoch: 1, iter: 1100/160000, loss: 1.8975, lr: 0.000879, batch_cost: 0.1869, reader_cost: 0.00166, ips: 42.7989 samples/sec | ETA 08:15:01 2022-08-22 18:07:05 [INFO] [TRAIN] epoch: 1, iter: 1150/160000, loss: 1.9220, lr: 0.000919, batch_cost: 0.2023, reader_cost: 0.00065, ips: 39.5511 samples/sec | ETA 08:55:30 2022-08-22 18:07:14 [INFO] [TRAIN] epoch: 1, iter: 1200/160000, loss: 1.8193, lr: 0.000959, batch_cost: 0.1810, reader_cost: 0.00034, ips: 44.2101 samples/sec | ETA 07:58:55 2022-08-22 18:07:24 [INFO] [TRAIN] epoch: 1, iter: 1250/160000, loss: 1.9016, lr: 0.000999, batch_cost: 0.1977, reader_cost: 0.00048, ips: 40.4621 samples/sec | ETA 08:43:07 2022-08-22 18:07:38 [INFO] [TRAIN] epoch: 2, iter: 1300/160000, loss: 1.8071, lr: 0.001039, batch_cost: 0.2698, reader_cost: 0.07413, ips: 29.6465 samples/sec | ETA 11:53:44 2022-08-22 18:07:46 [INFO] [TRAIN] epoch: 2, iter: 1350/160000, loss: 1.6687, lr: 0.001079, batch_cost: 0.1736, reader_cost: 0.00042, ips: 46.0721 samples/sec | ETA 07:39:08 2022-08-22 18:07:55 [INFO] [TRAIN] epoch: 2, iter: 1400/160000, loss: 1.5507, lr: 0.001119, batch_cost: 0.1633, reader_cost: 0.00287, ips: 48.9974 samples/sec | ETA 07:11:35 2022-08-22 18:08:04 [INFO] [TRAIN] epoch: 2, iter: 1450/160000, loss: 1.6704, lr: 0.001159, batch_cost: 0.1919, reader_cost: 0.00052, ips: 41.6790 samples/sec | ETA 08:27:12 2022-08-22 18:08:13 [INFO] [TRAIN] epoch: 2, iter: 1500/160000, loss: 1.5898, lr: 0.001199, batch_cost: 0.1728, reader_cost: 0.00060, ips: 46.2937 samples/sec | ETA 07:36:30 2022-08-22 18:08:23 [INFO] [TRAIN] epoch: 2, iter: 1550/160000, loss: 1.6207, lr: 0.001200, batch_cost: 0.2041, reader_cost: 0.00074, ips: 39.1871 samples/sec | ETA 08:59:07 2022-08-22 18:08:33 [INFO] [TRAIN] epoch: 2, iter: 1600/160000, loss: 1.5430, lr: 0.001199, batch_cost: 0.2011, reader_cost: 0.00084, ips: 39.7767 samples/sec | ETA 08:50:57 2022-08-22 18:08:42 [INFO] [TRAIN] epoch: 2, iter: 1650/160000, loss: 1.6116, lr: 0.001199, batch_cost: 0.1785, reader_cost: 0.00114, ips: 44.8109 samples/sec | ETA 07:51:09 2022-08-22 18:08:51 [INFO] [TRAIN] epoch: 2, iter: 1700/160000, loss: 1.6507, lr: 0.001198, batch_cost: 0.1809, reader_cost: 0.00060, ips: 44.2129 samples/sec | ETA 07:57:23 2022-08-22 18:09:00 [INFO] [TRAIN] epoch: 2, iter: 1750/160000, loss: 1.6357, lr: 0.001198, batch_cost: 0.1825, reader_cost: 0.00040, ips: 43.8403 samples/sec | ETA 08:01:17 2022-08-22 18:09:12 [INFO] [TRAIN] epoch: 2, iter: 1800/160000, loss: 1.5982, lr: 0.001198, batch_cost: 
0.2410, reader_cost: 0.00083, ips: 33.2003 samples/sec | ETA 10:35:20 2022-08-22 18:09:24 [INFO] [TRAIN] epoch: 2, iter: 1850/160000, loss: 1.5222, lr: 0.001197, batch_cost: 0.2374, reader_cost: 0.00063, ips: 33.6927 samples/sec | ETA 10:25:51 2022-08-22 18:09:35 [INFO] [TRAIN] epoch: 2, iter: 1900/160000, loss: 1.4320, lr: 0.001197, batch_cost: 0.2255, reader_cost: 0.00060, ips: 35.4829 samples/sec | ETA 09:54:05 2022-08-22 18:09:47 [INFO] [TRAIN] epoch: 2, iter: 1950/160000, loss: 1.6004, lr: 0.001197, batch_cost: 0.2235, reader_cost: 0.00072, ips: 35.7884 samples/sec | ETA 09:48:49 2022-08-22 18:09:56 [INFO] [TRAIN] epoch: 2, iter: 2000/160000, loss: 1.5265, lr: 0.001196, batch_cost: 0.1916, reader_cost: 0.00294, ips: 41.7464 samples/sec | ETA 08:24:38 2022-08-22 18:09:56 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 185s - batch_cost: 0.1846 - reader cost: 8.0862e-04 2022-08-22 18:13:01 [INFO] [EVAL] #Images: 2000 mIoU: 0.1460 Acc: 0.6679 Kappa: 0.6398 Dice: 0.2146 2022-08-22 18:13:01 [INFO] [EVAL] Class IoU: [0.5576 0.6985 0.8954 0.624 0.5864 0.6402 0.6529 0.5982 0.4068 0.5974 0.3485 0.4305 0.5964 0.2613 0.0599 0.197 0.3787 0.2643 0.4018 0.219 0.617 0.3586 0.3729 0.27 0.2102 0.2194 0.3171 0.0543 0.277 0.1983 0.1262 0.3595 0.1445 0.1219 0.1483 0.0117 0.2131 0.2745 0.0106 0.0911 0.0023 0.0419 0.0041 0.1669 0.1788 0. 0.1706 0.2797 0.3949 0.231 0.1117 0.2309 0. 0.1443 0.1604 0.2192 0.5244 0.2284 0.0003 0.0347 0.0743 0.0143 0.1931 0.1367 0.0431 0.3918 0.1603 0.2279 0.0001 0.0013 0.068 0.2365 0.0988 0.0019 0.2253 0.0643 0.1101 0.0002 0.0006 0. 0. 0.0696 0.0029 0.0025 0. 0.2431 0. 0. 0. 0.2644 0.3436 0. 0.0845 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.024 0. 0. 0. 0. 0.2538 0.0002 0. 0. 0.0575 0. 0.2602 0.052 0. 0.0909 0.3221 0. 0.2192 0.0068 0. 0. 0.0235 0.0543 0.0005 0.0045 0.1568 0. 0.0058 0.2498 0. 0. 0. 0. 0.0022 0. 0. 0. 0. 0. 0. 0.0693 0.0494 0. 0. 0. 0. 0. 0. ] 2022-08-22 18:13:01 [INFO] [EVAL] Class Precision: [0.6501 0.7631 0.9402 0.7361 0.6658 0.7658 0.7839 0.7156 0.5348 0.7563 0.4264 0.5644 0.6698 0.4748 0.2918 0.3885 0.5875 0.6798 0.52 0.4925 0.6724 0.571 0.639 0.4655 0.2727 0.4545 0.3627 0.5345 0.5595 0.2932 0.1459 0.5541 0.3648 0.3606 0.4074 0.771 0.4911 0.5561 0.5152 0.4716 0.4212 0.2218 0.4704 0.417 0.3854 0. 0.3996 0.4706 0.5534 0.4935 0.5375 0.361 0. 0.3281 0.7083 0.2605 0.7765 0.4611 0.8468 0.2447 0.1248 0.2325 0.4768 0.6932 0.3222 0.4448 0.233 0.5687 0.0319 0.5475 0.4374 0.5132 0.9398 0.063 0.6388 0.4932 0.7387 0.0277 0.0184 0. 0. 0.5902 0.7413 0.0116 0.0025 0.5443 0. 0. 0. 0.5382 0.4934 0. 0.5925 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.4597 0. 0. 0. 0. 0.9514 0.0072 0. 0. 0.1207 0. 0.5299 0.9997 0. 0.822 0.3878 0. 0.2515 0.1461 0. 0. 0.377 0.4803 0.4046 1. 0.6917 0. 0.5027 0.7708 0. 0. 0. 0. 0.5177 0. 0. 0. 0.2424 0. 0. 0.9139 0.2124 0. 0. 0. 0. 0. 0. ] 2022-08-22 18:13:01 [INFO] [EVAL] Class Recall: [0.7968 0.8919 0.9495 0.804 0.8311 0.796 0.7962 0.7849 0.6296 0.7399 0.6559 0.6446 0.8448 0.3676 0.0701 0.2855 0.5159 0.3019 0.6386 0.2828 0.8822 0.4909 0.4724 0.3913 0.4783 0.2978 0.7158 0.057 0.3543 0.3798 0.482 0.5059 0.1931 0.1555 0.189 0.0118 0.2734 0.3515 0.0107 0.1014 0.0023 0.0491 0.0041 0.2177 0.2501 0. 0.2293 0.4081 0.5796 0.3027 0.1236 0.3904 0. 0.2048 0.1718 0.58 0.6177 0.3115 0.0003 0.0388 0.1551 0.0151 0.2451 0.1455 0.0474 0.7667 0.3393 0.2755 0.0001 0.0013 0.0746 0.305 0.0994 0.002 0.2582 0.0688 0.1145 0.0002 0.0007 0. 0. 0.0731 0.0029 0.0032 0. 0.3052 0. 0. 0. 0.342 0.5309 0. 0.0897 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0247 0. 0. 0. 0. 0.2571 0.0002 0. 0. 
0.0989 0. 0.3383 0.052 0. 0.0927 0.6553 0. 0.6302 0.0071 0. 0. 0.0244 0.0576 0.0005 0.0045 0.1686 0. 0.0058 0.2699 0. 0. 0. 0. 0.0022 0. 0. 0. 0. 0. 0. 0.0698 0.0605 0. 0. 0. 0. 0. 0. ] 2022-08-22 18:13:01 [INFO] [EVAL] The model with the best validation mIoU (0.1460) was saved at iter 2000. 2022-08-22 18:13:09 [INFO] [TRAIN] epoch: 2, iter: 2050/160000, loss: 1.5564, lr: 0.001196, batch_cost: 0.1591, reader_cost: 0.00403, ips: 50.2851 samples/sec | ETA 06:58:48 2022-08-22 18:13:18 [INFO] [TRAIN] epoch: 2, iter: 2100/160000, loss: 1.5218, lr: 0.001195, batch_cost: 0.1840, reader_cost: 0.00071, ips: 43.4842 samples/sec | ETA 08:04:09 2022-08-22 18:13:28 [INFO] [TRAIN] epoch: 2, iter: 2150/160000, loss: 1.4259, lr: 0.001195, batch_cost: 0.1856, reader_cost: 0.00032, ips: 43.1070 samples/sec | ETA 08:08:14 2022-08-22 18:13:36 [INFO] [TRAIN] epoch: 2, iter: 2200/160000, loss: 1.4329, lr: 0.001195, batch_cost: 0.1732, reader_cost: 0.00067, ips: 46.1884 samples/sec | ETA 07:35:31 2022-08-22 18:13:45 [INFO] [TRAIN] epoch: 2, iter: 2250/160000, loss: 1.4882, lr: 0.001194, batch_cost: 0.1841, reader_cost: 0.00063, ips: 43.4635 samples/sec | ETA 08:03:55 2022-08-22 18:13:53 [INFO] [TRAIN] epoch: 2, iter: 2300/160000, loss: 1.4280, lr: 0.001194, batch_cost: 0.1500, reader_cost: 0.00051, ips: 53.3362 samples/sec | ETA 06:34:13 2022-08-22 18:14:02 [INFO] [TRAIN] epoch: 2, iter: 2350/160000, loss: 1.3967, lr: 0.001194, batch_cost: 0.1740, reader_cost: 0.00051, ips: 45.9666 samples/sec | ETA 07:37:17 2022-08-22 18:14:13 [INFO] [TRAIN] epoch: 2, iter: 2400/160000, loss: 1.3919, lr: 0.001193, batch_cost: 0.2209, reader_cost: 0.00057, ips: 36.2176 samples/sec | ETA 09:40:11 2022-08-22 18:14:23 [INFO] [TRAIN] epoch: 2, iter: 2450/160000, loss: 1.4433, lr: 0.001193, batch_cost: 0.1967, reader_cost: 0.00104, ips: 40.6810 samples/sec | ETA 08:36:22 2022-08-22 18:14:31 [INFO] [TRAIN] epoch: 2, iter: 2500/160000, loss: 1.3665, lr: 0.001192, batch_cost: 0.1610, reader_cost: 0.00049, ips: 49.6995 samples/sec | ETA 07:02:32 2022-08-22 18:14:43 [INFO] [TRAIN] epoch: 3, iter: 2550/160000, loss: 1.3321, lr: 0.001192, batch_cost: 0.2435, reader_cost: 0.07378, ips: 32.8490 samples/sec | ETA 10:39:05 2022-08-22 18:14:51 [INFO] [TRAIN] epoch: 3, iter: 2600/160000, loss: 1.3821, lr: 0.001192, batch_cost: 0.1564, reader_cost: 0.00749, ips: 51.1574 samples/sec | ETA 06:50:14 2022-08-22 18:14:59 [INFO] [TRAIN] epoch: 3, iter: 2650/160000, loss: 1.3564, lr: 0.001191, batch_cost: 0.1779, reader_cost: 0.00879, ips: 44.9667 samples/sec | ETA 07:46:34 2022-08-22 18:15:07 [INFO] [TRAIN] epoch: 3, iter: 2700/160000, loss: 1.4322, lr: 0.001191, batch_cost: 0.1563, reader_cost: 0.00057, ips: 51.1946 samples/sec | ETA 06:49:40 2022-08-22 18:15:16 [INFO] [TRAIN] epoch: 3, iter: 2750/160000, loss: 1.3392, lr: 0.001191, batch_cost: 0.1749, reader_cost: 0.00040, ips: 45.7299 samples/sec | ETA 07:38:29 2022-08-22 18:15:25 [INFO] [TRAIN] epoch: 3, iter: 2800/160000, loss: 1.3563, lr: 0.001190, batch_cost: 0.1765, reader_cost: 0.00051, ips: 45.3287 samples/sec | ETA 07:42:24 2022-08-22 18:15:34 [INFO] [TRAIN] epoch: 3, iter: 2850/160000, loss: 1.3199, lr: 0.001190, batch_cost: 0.1878, reader_cost: 0.00256, ips: 42.5886 samples/sec | ETA 08:11:59 2022-08-22 18:15:46 [INFO] [TRAIN] epoch: 3, iter: 2900/160000, loss: 1.4366, lr: 0.001189, batch_cost: 0.2339, reader_cost: 0.00035, ips: 34.2051 samples/sec | ETA 10:12:23 2022-08-22 18:15:57 [INFO] [TRAIN] epoch: 3, iter: 2950/160000, loss: 1.2908, lr: 0.001189, batch_cost: 0.2143, reader_cost: 
0.00045, ips: 37.3319 samples/sec | ETA 09:20:54 2022-08-22 18:16:06 [INFO] [TRAIN] epoch: 3, iter: 3000/160000, loss: 1.3973, lr: 0.001189, batch_cost: 0.1875, reader_cost: 0.00142, ips: 42.6620 samples/sec | ETA 08:10:40 2022-08-22 18:16:06 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 182s - batch_cost: 0.1815 - reader cost: 9.5816e-04 2022-08-22 18:19:08 [INFO] [EVAL] #Images: 2000 mIoU: 0.1821 Acc: 0.6833 Kappa: 0.6577 Dice: 0.2649 2022-08-22 18:19:08 [INFO] [EVAL] Class IoU: [0.5807 0.7026 0.9073 0.6384 0.624 0.6601 0.6572 0.608 0.4222 0.5005 0.3879 0.4298 0.6138 0.2615 0.0501 0.2178 0.4102 0.3052 0.4505 0.2994 0.6301 0.4647 0.4163 0.3399 0.1413 0.3312 0.256 0.1073 0.1273 0.1738 0.136 0.3848 0.2102 0.1657 0.2484 0.1923 0.2667 0.3143 0.1158 0.0742 0.0654 0.0113 0.0075 0.1729 0.2922 0.0223 0.2143 0.2871 0.5108 0.2752 0.1496 0.1232 0.0387 0.1653 0.5288 0.2684 0.6658 0.1341 0.0045 0.1283 0.0002 0.0993 0.166 0.1527 0.0755 0.4419 0.1807 0.3139 0.0036 0.0236 0.0732 0.2891 0.2788 0.0263 0.3046 0.1588 0.3474 0.0007 0.0067 0.0201 0.3241 0.1908 0.0699 0.0066 0.015 0.2839 0. 0. 0. 0.2738 0.2608 0. 0.1159 0.0038 0. 0. 0.0018 0. 0.0037 0. 0. 0. 0.0001 0.0024 0. 0. 0. 0.3943 0.0043 0. 0. 0.0326 0. 0.3496 0.304 0. 0.1609 0.393 0.002 0.2955 0.3036 0. 0. 0.1031 0.1088 0.0145 0.348 0.2302 0. 0.1376 0.3839 0. 0. 0.0178 0. 0.0366 0. 0. 0. 0.0837 0.0002 0. 0.1848 0. 0.0149 0. 0.0017 0. 0. 0. ] 2022-08-22 18:19:08 [INFO] [EVAL] Class Precision: [0.6797 0.8147 0.9525 0.7458 0.6989 0.7302 0.8051 0.676 0.5616 0.6827 0.5565 0.535 0.687 0.5288 0.3764 0.4632 0.5088 0.7239 0.6539 0.4175 0.6881 0.5181 0.6918 0.4873 0.3136 0.373 0.7648 0.7452 0.7818 0.2103 0.222 0.5627 0.4115 0.2936 0.364 0.528 0.499 0.4993 0.4239 0.4936 0.1747 0.236 0.6174 0.4834 0.4483 0.5779 0.4777 0.4963 0.5835 0.5499 0.5286 0.1412 0.4493 0.6517 0.6262 0.3091 0.9014 0.5138 0.6334 0.2877 0.0228 0.1835 0.2032 0.6696 0.6132 0.4905 0.2675 0.4571 0.1392 0.2902 0.7059 0.4519 0.6868 0.2577 0.5782 0.239 0.4216 0.6269 0.0688 0.3426 0.8899 0.6292 0.617 0.048 0.5784 0.6352 0. 0.2679 0. 0.6627 0.3028 0. 0.4069 0.0894 0. 0. 0.2669 0. 0.2768 0. 0. 0. 0.1439 0.4769 0. 0. 0. 0.7165 0.1667 0. 0. 0.0361 0. 0.6995 0.7657 0. 0.8077 0.659 0.0796 0.3563 0.708 0. 0. 0.423 0.5332 0.4695 0.8655 0.2921 0. 0.1945 0.6508 0. 0. 0.3664 0. 0.1616 0. 0. 0. 0.7039 0.0205 0. 0.794 0. 0.2183 0. 1. 0. 0. 0. ] 2022-08-22 18:19:08 [INFO] [EVAL] Class Recall: [0.7996 0.8363 0.9504 0.816 0.8533 0.873 0.7815 0.8579 0.6297 0.6523 0.5615 0.6861 0.852 0.3409 0.0547 0.2913 0.6793 0.3454 0.5915 0.5142 0.8821 0.8185 0.5111 0.529 0.2046 0.7472 0.2778 0.1114 0.132 0.4999 0.26 0.5489 0.3005 0.2755 0.4388 0.2322 0.3643 0.4591 0.1375 0.0803 0.0947 0.0117 0.0075 0.2121 0.4562 0.0227 0.2798 0.4051 0.8038 0.3552 0.1727 0.4907 0.0406 0.1813 0.7727 0.6713 0.7181 0.1536 0.0045 0.188 0.0002 0.1779 0.4756 0.1651 0.0793 0.8171 0.3578 0.5004 0.0037 0.025 0.0755 0.4452 0.3194 0.0284 0.3916 0.3212 0.6636 0.0007 0.0073 0.0209 0.3376 0.2149 0.0731 0.0075 0.0152 0.3393 0. 0. 0. 0.3182 0.6528 0. 0.1394 0.0039 0. 0. 0.0019 0. 0.0037 0. 0. 0. 0.0001 0.0024 0. 0. 0. 0.4672 0.0044 0. 0. 0.2525 0. 0.4114 0.3352 0. 0.1673 0.4933 0.002 0.6342 0.3471 0. 0. 0.12 0.1202 0.0147 0.3679 0.521 0. 0.3199 0.4834 0. 0. 0.0184 0. 0.0451 0. 0. 0. 0.0868 0.0002 0. 0.1941 0. 0.0157 0. 0.0017 0. 0. 0. ] 2022-08-22 18:19:08 [INFO] [EVAL] The model with the best validation mIoU (0.1821) was saved at iter 3000. 
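In the [EVAL] blocks above, the scalar metrics and the 150-entry per-class arrays are tied together in the usual way for ADE20K: per-class IoU = TP/(TP+FP+FN), Precision = TP/(TP+FP), Recall = TP/(TP+FN), per-class Dice = 2*IoU/(1+IoU), mIoU is the unweighted mean of the Class IoU vector (zeros included, which is why the early mIoU values are far below the IoU of the frequent classes at the head of the array), and Acc is overall pixel accuracy. A small numpy sketch of these relations follows; it is illustrative only and is not PaddleSeg's evaluation code.

```python
import numpy as np

def per_class_metrics(tp, fp, fn):
    """Per-class segmentation metrics from pixel counts (length-150 arrays)."""
    eps = 1e-10                                # avoid 0/0 for classes that never occur
    iou = tp / (tp + fp + fn + eps)
    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    dice = 2 * tp / (2 * tp + fp + fn + eps)   # identical to 2*iou/(1+iou) per class
    miou = iou.mean()                          # the scalar mIoU reported in the log
    return iou, precision, recall, dice, miou
```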
2022-08-22 18:19:17 [INFO] [TRAIN] epoch: 3, iter: 3050/160000, loss: 1.4123, lr: 0.001188, batch_cost: 0.1823, reader_cost: 0.00356, ips: 43.8808 samples/sec | ETA 07:56:53 2022-08-22 18:19:26 [INFO] [TRAIN] epoch: 3, iter: 3100/160000, loss: 1.3316, lr: 0.001188, batch_cost: 0.1760, reader_cost: 0.00100, ips: 45.4517 samples/sec | ETA 07:40:16 2022-08-22 18:19:35 [INFO] [TRAIN] epoch: 3, iter: 3150/160000, loss: 1.3629, lr: 0.001188, batch_cost: 0.1773, reader_cost: 0.00092, ips: 45.1241 samples/sec | ETA 07:43:27 2022-08-22 18:19:44 [INFO] [TRAIN] epoch: 3, iter: 3200/160000, loss: 1.4456, lr: 0.001187, batch_cost: 0.1759, reader_cost: 0.00036, ips: 45.4926 samples/sec | ETA 07:39:33 2022-08-22 18:19:52 [INFO] [TRAIN] epoch: 3, iter: 3250/160000, loss: 1.3128, lr: 0.001187, batch_cost: 0.1618, reader_cost: 0.00165, ips: 49.4506 samples/sec | ETA 07:02:38 2022-08-22 18:19:59 [INFO] [TRAIN] epoch: 3, iter: 3300/160000, loss: 1.2744, lr: 0.001186, batch_cost: 0.1510, reader_cost: 0.00069, ips: 52.9813 samples/sec | ETA 06:34:21 2022-08-22 18:20:09 [INFO] [TRAIN] epoch: 3, iter: 3350/160000, loss: 1.2714, lr: 0.001186, batch_cost: 0.2019, reader_cost: 0.00064, ips: 39.6210 samples/sec | ETA 08:47:09 2022-08-22 18:20:19 [INFO] [TRAIN] epoch: 3, iter: 3400/160000, loss: 1.2995, lr: 0.001186, batch_cost: 0.1871, reader_cost: 0.00102, ips: 42.7687 samples/sec | ETA 08:08:12 2022-08-22 18:20:29 [INFO] [TRAIN] epoch: 3, iter: 3450/160000, loss: 1.2157, lr: 0.001185, batch_cost: 0.1992, reader_cost: 0.00057, ips: 40.1522 samples/sec | ETA 08:39:51 2022-08-22 18:20:38 [INFO] [TRAIN] epoch: 3, iter: 3500/160000, loss: 1.3181, lr: 0.001185, batch_cost: 0.1897, reader_cost: 0.00070, ips: 42.1685 samples/sec | ETA 08:14:50 2022-08-22 18:20:46 [INFO] [TRAIN] epoch: 3, iter: 3550/160000, loss: 1.3733, lr: 0.001184, batch_cost: 0.1673, reader_cost: 0.00036, ips: 47.8321 samples/sec | ETA 07:16:06 2022-08-22 18:20:54 [INFO] [TRAIN] epoch: 3, iter: 3600/160000, loss: 1.2731, lr: 0.001184, batch_cost: 0.1528, reader_cost: 0.00257, ips: 52.3460 samples/sec | ETA 06:38:22 2022-08-22 18:21:02 [INFO] [TRAIN] epoch: 3, iter: 3650/160000, loss: 1.3556, lr: 0.001184, batch_cost: 0.1674, reader_cost: 0.00074, ips: 47.7767 samples/sec | ETA 07:16:20 2022-08-22 18:21:11 [INFO] [TRAIN] epoch: 3, iter: 3700/160000, loss: 1.3364, lr: 0.001183, batch_cost: 0.1670, reader_cost: 0.00046, ips: 47.9038 samples/sec | ETA 07:15:02 2022-08-22 18:21:19 [INFO] [TRAIN] epoch: 3, iter: 3750/160000, loss: 1.2870, lr: 0.001183, batch_cost: 0.1719, reader_cost: 0.00076, ips: 46.5498 samples/sec | ETA 07:27:32 2022-08-22 18:21:36 [INFO] [TRAIN] epoch: 4, iter: 3800/160000, loss: 1.3057, lr: 0.001183, batch_cost: 0.3306, reader_cost: 0.12438, ips: 24.1976 samples/sec | ETA 14:20:41 2022-08-22 18:21:47 [INFO] [TRAIN] epoch: 4, iter: 3850/160000, loss: 1.1918, lr: 0.001182, batch_cost: 0.2306, reader_cost: 0.00093, ips: 34.6995 samples/sec | ETA 10:00:00 2022-08-22 18:21:58 [INFO] [TRAIN] epoch: 4, iter: 3900/160000, loss: 1.1508, lr: 0.001182, batch_cost: 0.2017, reader_cost: 0.00569, ips: 39.6570 samples/sec | ETA 08:44:50 2022-08-22 18:22:07 [INFO] [TRAIN] epoch: 4, iter: 3950/160000, loss: 1.2229, lr: 0.001181, batch_cost: 0.1906, reader_cost: 0.00255, ips: 41.9705 samples/sec | ETA 08:15:44 2022-08-22 18:22:18 [INFO] [TRAIN] epoch: 4, iter: 4000/160000, loss: 1.1661, lr: 0.001181, batch_cost: 0.2213, reader_cost: 0.00309, ips: 36.1532 samples/sec | ETA 09:35:19 2022-08-22 18:22:18 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 170s - batch_cost: 0.1702 - reader cost: 8.7978e-04 2022-08-22 18:25:09 [INFO] [EVAL] #Images: 2000 mIoU: 0.2010 Acc: 0.6980 Kappa: 0.6729 Dice: 0.2877 2022-08-22 18:25:09 [INFO] [EVAL] Class IoU: [0.5938 0.708 0.9119 0.648 0.6124 0.6723 0.673 0.6471 0.4233 0.6029 0.3961 0.4619 0.6259 0.2843 0.0624 0.2853 0.4348 0.3036 0.4614 0.2915 0.6852 0.4639 0.4661 0.3732 0.2124 0.366 0.4629 0.222 0.3255 0.1942 0.1083 0.3644 0.0755 0.1547 0.1387 0.1303 0.3042 0.2688 0.1212 0.1585 0.0222 0.0527 0.1189 0.1698 0.2609 0.0098 0.3022 0.3392 0.5261 0.3406 0.2758 0.2115 0.0615 0.0094 0.618 0.2393 0.7235 0.1903 0.0266 0.1449 0.0032 0.0716 0.2046 0.1214 0.2552 0.4755 0.1636 0.2997 0.005 0.2009 0.2323 0.3327 0.283 0.012 0.3213 0.0661 0.0888 0.1111 0.0465 0.0092 0.4317 0.1338 0.0325 0.0287 0.0146 0.3508 0. 0.0037 0. 0.1841 0.3062 0. 0.2283 0.0127 0. 0. 0.0243 0. 0.0027 0. 0. 0.0006 0.0225 0.1465 0. 0.2088 0. 0.4605 0.0391 0.115 0. 0.1164 0.0009 0.3545 0.7817 0. 0.2131 0.5529 0. 0.1587 0.0418 0. 0. 0.0942 0.1642 0.0127 0.2783 0.2948 0. 0.0156 0.3611 0. 0. 0.0072 0.0015 0.0104 0.0078 0.0003 0. 0.0865 0.016 0. 0.1234 0.0003 0.0135 0.0109 0.0433 0. 0.0027 0. ] 2022-08-22 18:25:09 [INFO] [EVAL] Class Precision: [0.6793 0.7713 0.9673 0.7679 0.6783 0.8023 0.8435 0.7271 0.5916 0.6905 0.6139 0.5806 0.698 0.5057 0.3082 0.4256 0.5239 0.7063 0.6817 0.4577 0.8553 0.6062 0.6983 0.4868 0.286 0.4755 0.5885 0.5835 0.4968 0.2946 0.2602 0.6436 0.4351 0.3741 0.459 0.6323 0.5411 0.7476 0.4372 0.5313 0.1254 0.3126 0.5727 0.5615 0.4143 0.2042 0.604 0.5975 0.591 0.4483 0.648 0.2698 0.4696 0.591 0.7206 0.2681 0.8384 0.54 0.6861 0.3917 0.2775 0.2616 0.3302 0.692 0.4142 0.5473 0.4243 0.455 0.5366 0.4968 0.4588 0.4265 0.7434 0.2272 0.547 0.4203 0.6005 0.2399 0.0586 0.0764 0.7285 0.7993 0.7805 0.0872 0.2689 0.6469 0. 0.3211 0. 0.824 0.3671 0. 0.3542 0.2312 0. 0. 0.2664 0. 0.1442 0. 0. 0.0018 0.9934 0.2308 0. 0.7486 0. 0.7334 0.1462 0.1871 0. 0.1448 0.0653 0.7085 0.9084 0. 0.7863 0.7063 0. 0.5187 0.6052 0. 0. 0.4874 0.5139 0.6996 0.9731 0.4663 0.0001 0.2651 0.43 0. 0. 0.4996 0.9565 0.2448 0.5786 0.0373 0. 0.7213 0.5687 0. 0.7753 0.5179 0.995 0.049 0.9444 0. 0.2823 0. ] 2022-08-22 18:25:09 [INFO] [EVAL] Class Recall: [0.8251 0.8961 0.9409 0.8057 0.8631 0.8058 0.7691 0.8548 0.5982 0.8261 0.5276 0.6933 0.8583 0.3937 0.0725 0.4641 0.7188 0.3474 0.5881 0.4452 0.7751 0.6639 0.5837 0.6152 0.4522 0.6138 0.6844 0.2638 0.4856 0.3628 0.1565 0.4565 0.0837 0.2086 0.1658 0.141 0.41 0.2956 0.1436 0.1842 0.0263 0.0597 0.1305 0.1957 0.4134 0.0102 0.3769 0.4397 0.8273 0.5863 0.3244 0.4944 0.0661 0.0095 0.8126 0.6896 0.8407 0.2272 0.0269 0.187 0.0032 0.0898 0.3497 0.1284 0.3993 0.7836 0.2102 0.4675 0.005 0.2522 0.32 0.602 0.3136 0.0125 0.4378 0.0727 0.0944 0.1714 0.1832 0.0103 0.5145 0.1384 0.0328 0.041 0.0153 0.4338 0. 0.0038 0. 0.1916 0.6486 0. 0.391 0.0133 0. 0. 0.0261 0. 0.0027 0. 0. 0.0009 0.0225 0.2861 0. 0.2245 0. 0.553 0.0507 0.2299 0. 0.3729 0.0009 0.4151 0.8486 0. 0.2262 0.7181 0. 0.1862 0.0429 0. 0. 0.1046 0.1944 0.0127 0.2804 0.4449 0. 0.0163 0.6927 0. 0. 0.0073 0.0015 0.0107 0.0079 0.0003 0. 0.0895 0.0162 0. 0.128 0.0003 0.0135 0.0138 0.0434 0. 0.0027 0. ] 2022-08-22 18:25:09 [INFO] [EVAL] The model with the best validation mIoU (0.2010) was saved at iter 4000. 
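The batch_cost, ips and ETA fields in the [TRAIN] lines are mutually consistent: with batch_size: 8 per card (2 cards in use), ips is approximately batch_size / batch_cost, and ETA is approximately (160000 - iter) * batch_cost. A quick check against the iter 4000 entry above, approximate because the log rounds batch_cost to four decimals:

```python
# Sanity check of the logged throughput numbers at iter 4000.
batch_size = 8                  # per-card batch size from the Config Information block
batch_cost = 0.2213             # seconds per iteration (logged value)
iters_total, it = 160000, 4000

ips = batch_size / batch_cost                    # ~36.15 samples/sec (log: 36.1532)
eta = (iters_total - it) * batch_cost            # remaining iterations * time per iteration
h, rem = divmod(int(eta), 3600)
m, s = divmod(rem, 60)
print(f"ips={ips:.2f} samples/sec, ETA={h:02d}:{m:02d}:{s:02d}")   # ETA ~ 09:35:22 (log: 09:35:19)
```

Since the logged ips matches batch_size / batch_cost for a single card, the total throughput across both GPUs is roughly twice the printed figure.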
2022-08-22 18:25:17 [INFO] [TRAIN] epoch: 4, iter: 4050/160000, loss: 1.2634, lr: 0.001181, batch_cost: 0.1644, reader_cost: 0.00319, ips: 48.6580 samples/sec | ETA 07:07:20 2022-08-22 18:25:27 [INFO] [TRAIN] epoch: 4, iter: 4100/160000, loss: 1.2748, lr: 0.001180, batch_cost: 0.2065, reader_cost: 0.00092, ips: 38.7469 samples/sec | ETA 08:56:28 2022-08-22 18:25:37 [INFO] [TRAIN] epoch: 4, iter: 4150/160000, loss: 1.2280, lr: 0.001180, batch_cost: 0.1958, reader_cost: 0.00064, ips: 40.8557 samples/sec | ETA 08:28:37 2022-08-22 18:25:47 [INFO] [TRAIN] epoch: 4, iter: 4200/160000, loss: 1.3055, lr: 0.001180, batch_cost: 0.1922, reader_cost: 0.00076, ips: 41.6152 samples/sec | ETA 08:19:10 2022-08-22 18:25:55 [INFO] [TRAIN] epoch: 4, iter: 4250/160000, loss: 1.2810, lr: 0.001179, batch_cost: 0.1664, reader_cost: 0.00105, ips: 48.0896 samples/sec | ETA 07:11:49 2022-08-22 18:26:04 [INFO] [TRAIN] epoch: 4, iter: 4300/160000, loss: 1.2466, lr: 0.001179, batch_cost: 0.1803, reader_cost: 0.00049, ips: 44.3780 samples/sec | ETA 07:47:47 2022-08-22 18:26:13 [INFO] [TRAIN] epoch: 4, iter: 4350/160000, loss: 1.2072, lr: 0.001178, batch_cost: 0.1885, reader_cost: 0.00064, ips: 42.4322 samples/sec | ETA 08:09:05 2022-08-22 18:26:23 [INFO] [TRAIN] epoch: 4, iter: 4400/160000, loss: 1.1712, lr: 0.001178, batch_cost: 0.1828, reader_cost: 0.00056, ips: 43.7591 samples/sec | ETA 07:54:06 2022-08-22 18:26:30 [INFO] [TRAIN] epoch: 4, iter: 4450/160000, loss: 1.1763, lr: 0.001178, batch_cost: 0.1553, reader_cost: 0.00051, ips: 51.5288 samples/sec | ETA 06:42:29 2022-08-22 18:26:41 [INFO] [TRAIN] epoch: 4, iter: 4500/160000, loss: 1.2800, lr: 0.001177, batch_cost: 0.2064, reader_cost: 0.00052, ips: 38.7520 samples/sec | ETA 08:55:01 2022-08-22 18:26:50 [INFO] [TRAIN] epoch: 4, iter: 4550/160000, loss: 1.2597, lr: 0.001177, batch_cost: 0.1823, reader_cost: 0.00041, ips: 43.8793 samples/sec | ETA 07:52:21 2022-08-22 18:26:59 [INFO] [TRAIN] epoch: 4, iter: 4600/160000, loss: 1.1769, lr: 0.001177, batch_cost: 0.1800, reader_cost: 0.00066, ips: 44.4553 samples/sec | ETA 07:46:05 2022-08-22 18:27:07 [INFO] [TRAIN] epoch: 4, iter: 4650/160000, loss: 1.3075, lr: 0.001176, batch_cost: 0.1660, reader_cost: 0.00617, ips: 48.1956 samples/sec | ETA 07:09:46 2022-08-22 18:27:17 [INFO] [TRAIN] epoch: 4, iter: 4700/160000, loss: 1.1699, lr: 0.001176, batch_cost: 0.1975, reader_cost: 0.00078, ips: 40.5008 samples/sec | ETA 08:31:15 2022-08-22 18:27:27 [INFO] [TRAIN] epoch: 4, iter: 4750/160000, loss: 1.2756, lr: 0.001175, batch_cost: 0.1932, reader_cost: 0.01202, ips: 41.4098 samples/sec | ETA 08:19:52 2022-08-22 18:27:37 [INFO] [TRAIN] epoch: 4, iter: 4800/160000, loss: 1.2135, lr: 0.001175, batch_cost: 0.2042, reader_cost: 0.00071, ips: 39.1783 samples/sec | ETA 08:48:11 2022-08-22 18:27:46 [INFO] [TRAIN] epoch: 4, iter: 4850/160000, loss: 1.2111, lr: 0.001175, batch_cost: 0.1757, reader_cost: 0.00390, ips: 45.5412 samples/sec | ETA 07:34:14 2022-08-22 18:27:56 [INFO] [TRAIN] epoch: 4, iter: 4900/160000, loss: 1.2391, lr: 0.001174, batch_cost: 0.2051, reader_cost: 0.00759, ips: 39.0059 samples/sec | ETA 08:50:10 2022-08-22 18:28:06 [INFO] [TRAIN] epoch: 4, iter: 4950/160000, loss: 1.2428, lr: 0.001174, batch_cost: 0.2100, reader_cost: 0.00980, ips: 38.0917 samples/sec | ETA 09:02:43 2022-08-22 18:28:17 [INFO] [TRAIN] epoch: 4, iter: 5000/160000, loss: 1.2763, lr: 0.001174, batch_cost: 0.2181, reader_cost: 0.00557, ips: 36.6851 samples/sec | ETA 09:23:21 2022-08-22 18:28:17 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 171s - batch_cost: 0.1707 - reader cost: 9.4467e-04 2022-08-22 18:31:08 [INFO] [EVAL] #Images: 2000 mIoU: 0.2244 Acc: 0.7079 Kappa: 0.6845 Dice: 0.3215 2022-08-22 18:31:08 [INFO] [EVAL] Class IoU: [0.6109 0.7435 0.9188 0.6508 0.6329 0.7031 0.7045 0.6591 0.4331 0.5661 0.401 0.4536 0.6226 0.2903 0.0896 0.2514 0.4707 0.3668 0.4424 0.2644 0.6836 0.374 0.4806 0.3842 0.2671 0.384 0.332 0.2473 0.3188 0.1957 0.1066 0.3734 0.1613 0.2158 0.2596 0.331 0.3093 0.3225 0.2162 0.1614 0.0285 0.0635 0.1668 0.1777 0.241 0.056 0.2065 0.2886 0.3273 0.3815 0.3455 0.163 0.0661 0.2094 0.6542 0.4489 0.7688 0.2476 0.1156 0.2065 0.0189 0.0939 0.0661 0.1281 0.3077 0.3653 0.14 0.3015 0. 0.1487 0.2647 0.3176 0.3529 0.0736 0.3212 0.1784 0.4417 0.0627 0.0499 0.077 0.412 0.1396 0.1283 0.0083 0.0426 0.4336 0.0067 0.0156 0. 0.3397 0.4019 0. 0.2063 0.0263 0. 0.0154 0.021 0.0002 0.0457 0.0229 0. 0.0005 0.0372 0. 0.0027 0.0277 0.0922 0.4952 0.001 0.0219 0.0002 0.0791 0.0036 0.5247 0.7598 0. 0.194 0.3778 0.0443 0.3284 0.3991 0. 0. 0.1457 0.1812 0.0287 0.3981 0.3096 0. 0.1572 0.4384 0. 0. 0.0687 0.0235 0.0559 0.0182 0. 0.0107 0.2651 0.1354 0.0208 0.2845 0. 0.1179 0.0006 0.0688 0.0004 0.0011 0. ] 2022-08-22 18:31:08 [INFO] [EVAL] Class Precision: [0.7095 0.8221 0.959 0.746 0.7277 0.7825 0.822 0.7207 0.5329 0.8165 0.6545 0.6137 0.6948 0.408 0.4073 0.5577 0.6202 0.6715 0.6821 0.5212 0.7607 0.5714 0.7319 0.4496 0.4207 0.4914 0.4046 0.6168 0.6212 0.267 0.2365 0.5688 0.3739 0.3168 0.391 0.646 0.5702 0.6921 0.4318 0.4813 0.2514 0.2534 0.7446 0.6574 0.281 0.6867 0.2622 0.6547 0.6526 0.5036 0.5268 0.2009 0.3182 0.4389 0.7279 0.6046 0.9044 0.4825 0.8601 0.6996 0.0793 0.3993 0.4002 0.7151 0.3896 0.3842 0.3092 0.3596 0.3902 0.716 0.4189 0.4258 0.6054 0.4477 0.6495 0.3012 0.5091 0.3297 0.0926 0.1223 0.9419 0.7232 0.6315 0.0306 0.9233 0.6047 0.2785 0.3445 0. 0.5912 0.5687 0. 0.4765 0.2705 0. 0.067 0.473 1. 0.3617 0.4536 0. 0.0011 0.3652 0. 0.6736 0.817 0.8013 0.8484 0.009 0.0721 0.0474 0.091 0.5446 0.6197 0.8265 0. 0.8401 0.9066 0.1596 0.446 0.6962 0. 0. 0.5103 0.5194 0.8434 0.8409 0.4684 0. 0.2006 0.6921 0. 0. 0.403 0.7476 0.2447 0.418 0. 0.2109 0.5074 0.5091 0.0916 0.5427 0. 0.686 0.003 0.9952 0.2692 0.0855 0. ] 2022-08-22 18:31:08 [INFO] [EVAL] Class Recall: [0.8146 0.886 0.9564 0.836 0.8293 0.8738 0.8313 0.8852 0.698 0.6487 0.5086 0.6348 0.857 0.5015 0.103 0.3139 0.6613 0.447 0.5573 0.3491 0.8709 0.5199 0.5833 0.7252 0.4225 0.6373 0.649 0.2923 0.3957 0.4232 0.1625 0.5209 0.221 0.4036 0.4358 0.4044 0.4033 0.3766 0.3021 0.1954 0.0312 0.0781 0.1769 0.1958 0.6283 0.0575 0.4928 0.3404 0.3963 0.6115 0.501 0.4633 0.077 0.2859 0.866 0.6354 0.8367 0.3372 0.1179 0.2266 0.0242 0.1094 0.0734 0.135 0.5942 0.881 0.2037 0.6511 0. 0.158 0.4182 0.5555 0.4583 0.0809 0.3885 0.3045 0.7695 0.0718 0.0976 0.1721 0.4227 0.1475 0.1387 0.0113 0.0428 0.6052 0.0068 0.0161 0. 0.444 0.5781 0. 0.2668 0.0283 0. 0.0196 0.0215 0.0002 0.0498 0.0235 0. 0.0008 0.0397 0. 0.0027 0.0279 0.0944 0.5432 0.0012 0.0305 0.0002 0.3779 0.0036 0.7739 0.904 0. 0.2014 0.3931 0.0577 0.5547 0.4833 0. 0. 0.1694 0.2176 0.0289 0.4306 0.4773 0. 0.4212 0.5446 0. 0. 0.0765 0.0237 0.0675 0.0187 0. 0.0112 0.357 0.1558 0.0262 0.3743 0. 0.1246 0.0008 0.0688 0.0004 0.0011 0. ] 2022-08-22 18:31:08 [INFO] [EVAL] The model with the best validation mIoU (0.2244) was saved at iter 5000. 
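Kappa in the [EVAL] lines is Cohen's kappa over the pixel-level confusion matrix, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed pixel accuracy (the Acc column) and p_e the agreement expected by chance from the row and column marginals. A minimal sketch, assuming this standard definition rather than quoting PaddleSeg's own implementation:

```python
import numpy as np

def cohen_kappa(conf):
    """Cohen's kappa from a (num_classes x num_classes) pixel confusion matrix."""
    total = conf.sum()
    p_o = np.trace(conf) / total                                   # observed agreement (pixel Acc)
    p_e = (conf.sum(axis=0) * conf.sum(axis=1)).sum() / total**2   # chance agreement from marginals
    return (p_o - p_e) / (1.0 - p_e)

# Toy check: conf = [[6, 2], [1, 1]] gives p_o = 0.7, p_e = 0.62, kappa ~ 0.21.
print(cohen_kappa(np.array([[6.0, 2.0], [1.0, 1.0]])))
```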
2022-08-22 18:31:17 [INFO] [TRAIN] epoch: 4, iter: 5050/160000, loss: 1.3143, lr: 0.001173, batch_cost: 0.1690, reader_cost: 0.00384, ips: 47.3399 samples/sec | ETA 07:16:25 2022-08-22 18:31:27 [INFO] [TRAIN] epoch: 5, iter: 5100/160000, loss: 1.2235, lr: 0.001173, batch_cost: 0.2042, reader_cost: 0.04418, ips: 39.1777 samples/sec | ETA 08:47:10 2022-08-22 18:31:35 [INFO] [TRAIN] epoch: 5, iter: 5150/160000, loss: 1.2675, lr: 0.001172, batch_cost: 0.1601, reader_cost: 0.00474, ips: 49.9597 samples/sec | ETA 06:53:15 2022-08-22 18:31:45 [INFO] [TRAIN] epoch: 5, iter: 5200/160000, loss: 1.2032, lr: 0.001172, batch_cost: 0.2096, reader_cost: 0.00089, ips: 38.1737 samples/sec | ETA 09:00:41 2022-08-22 18:31:55 [INFO] [TRAIN] epoch: 5, iter: 5250/160000, loss: 1.1755, lr: 0.001172, batch_cost: 0.1824, reader_cost: 0.00111, ips: 43.8647 samples/sec | ETA 07:50:23 2022-08-22 18:32:03 [INFO] [TRAIN] epoch: 5, iter: 5300/160000, loss: 1.1261, lr: 0.001171, batch_cost: 0.1773, reader_cost: 0.00045, ips: 45.1113 samples/sec | ETA 07:37:14 2022-08-22 18:32:13 [INFO] [TRAIN] epoch: 5, iter: 5350/160000, loss: 1.2259, lr: 0.001171, batch_cost: 0.1839, reader_cost: 0.00067, ips: 43.4924 samples/sec | ETA 07:54:06 2022-08-22 18:32:21 [INFO] [TRAIN] epoch: 5, iter: 5400/160000, loss: 1.2621, lr: 0.001170, batch_cost: 0.1706, reader_cost: 0.00070, ips: 46.9004 samples/sec | ETA 07:19:30 2022-08-22 18:32:31 [INFO] [TRAIN] epoch: 5, iter: 5450/160000, loss: 1.0715, lr: 0.001170, batch_cost: 0.1897, reader_cost: 0.01023, ips: 42.1637 samples/sec | ETA 08:08:43 2022-08-22 18:32:40 [INFO] [TRAIN] epoch: 5, iter: 5500/160000, loss: 1.1997, lr: 0.001170, batch_cost: 0.1795, reader_cost: 0.00058, ips: 44.5691 samples/sec | ETA 07:42:12 2022-08-22 18:32:49 [INFO] [TRAIN] epoch: 5, iter: 5550/160000, loss: 1.1765, lr: 0.001169, batch_cost: 0.1821, reader_cost: 0.00043, ips: 43.9274 samples/sec | ETA 07:48:48 2022-08-22 18:32:57 [INFO] [TRAIN] epoch: 5, iter: 5600/160000, loss: 1.1211, lr: 0.001169, batch_cost: 0.1699, reader_cost: 0.00236, ips: 47.0929 samples/sec | ETA 07:17:09 2022-08-22 18:33:05 [INFO] [TRAIN] epoch: 5, iter: 5650/160000, loss: 1.1754, lr: 0.001169, batch_cost: 0.1568, reader_cost: 0.00071, ips: 51.0286 samples/sec | ETA 06:43:18 2022-08-22 18:33:13 [INFO] [TRAIN] epoch: 5, iter: 5700/160000, loss: 1.1169, lr: 0.001168, batch_cost: 0.1589, reader_cost: 0.00069, ips: 50.3404 samples/sec | ETA 06:48:41 2022-08-22 18:33:22 [INFO] [TRAIN] epoch: 5, iter: 5750/160000, loss: 1.2384, lr: 0.001168, batch_cost: 0.1743, reader_cost: 0.00090, ips: 45.8985 samples/sec | ETA 07:28:05 2022-08-22 18:33:32 [INFO] [TRAIN] epoch: 5, iter: 5800/160000, loss: 1.1902, lr: 0.001167, batch_cost: 0.2127, reader_cost: 0.00078, ips: 37.6198 samples/sec | ETA 09:06:31 2022-08-22 18:33:43 [INFO] [TRAIN] epoch: 5, iter: 5850/160000, loss: 1.1758, lr: 0.001167, batch_cost: 0.2042, reader_cost: 0.00044, ips: 39.1763 samples/sec | ETA 08:44:38 2022-08-22 18:33:53 [INFO] [TRAIN] epoch: 5, iter: 5900/160000, loss: 1.1791, lr: 0.001167, batch_cost: 0.2040, reader_cost: 0.00142, ips: 39.2097 samples/sec | ETA 08:44:01 2022-08-22 18:34:04 [INFO] [TRAIN] epoch: 5, iter: 5950/160000, loss: 1.2274, lr: 0.001166, batch_cost: 0.2239, reader_cost: 0.00627, ips: 35.7360 samples/sec | ETA 09:34:46 2022-08-22 18:34:15 [INFO] [TRAIN] epoch: 5, iter: 6000/160000, loss: 1.1702, lr: 0.001166, batch_cost: 0.2263, reader_cost: 0.00035, ips: 35.3502 samples/sec | ETA 09:40:51 2022-08-22 18:34:15 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 176s - batch_cost: 0.1757 - reader cost: 9.3044e-04 2022-08-22 18:37:11 [INFO] [EVAL] #Images: 2000 mIoU: 0.2363 Acc: 0.7111 Kappa: 0.6878 Dice: 0.3399 2022-08-22 18:37:11 [INFO] [EVAL] Class IoU: [0.6178 0.7243 0.913 0.6652 0.6227 0.7115 0.7162 0.6822 0.4362 0.5687 0.4116 0.4782 0.6174 0.2392 0.1417 0.2704 0.4763 0.3481 0.4724 0.3091 0.6992 0.4545 0.4858 0.402 0.2673 0.1941 0.5092 0.2861 0.266 0.1788 0.0986 0.3947 0.2099 0.2009 0.2387 0.2191 0.3315 0.3304 0.1874 0.2696 0.091 0.1009 0.2286 0.2028 0.2321 0.1365 0.2629 0.3528 0.4077 0.3419 0.3773 0.2431 0.155 0.1406 0.6112 0.4258 0.7707 0.2646 0.1639 0.2249 0.0563 0.0875 0.2651 0.0759 0.336 0.5235 0.18 0.3622 0.0304 0.1676 0.264 0.2993 0.3815 0.1308 0.3135 0.1934 0.3871 0.0214 0.0975 0.0486 0.2898 0.2432 0.1995 0.0266 0.0345 0.4317 0.0234 0.0064 0. 0.3425 0.3435 0. 0.1889 0.0372 0. 0.0124 0.0336 0. 0.0598 0.06 0. 0.0107 0.0363 0.2046 0.0116 0.2633 0.0104 0.4566 0.0741 0.0767 0. 0.1379 0.0165 0.4038 0.6008 0. 0.3877 0.448 0.0001 0.1244 0.4453 0. 0.0011 0.0778 0.1765 0.0727 0.4064 0.3356 0. 0.0402 0.4475 0. 0.0006 0.0667 0.0268 0.0722 0.0792 0. 0.0239 0.2288 0.0706 0.0072 0.2799 0.0001 0.1722 0.0034 0.2094 0.0002 0.0007 0. ] 2022-08-22 18:37:11 [INFO] [EVAL] Class Precision: [0.7197 0.7762 0.9435 0.7516 0.738 0.835 0.8644 0.7944 0.5813 0.6761 0.6968 0.635 0.6638 0.4953 0.3282 0.5132 0.5668 0.6319 0.7075 0.4553 0.8333 0.591 0.7749 0.5249 0.4781 0.5191 0.6047 0.4917 0.676 0.238 0.2674 0.6036 0.3999 0.2586 0.4617 0.7191 0.5713 0.8099 0.469 0.4276 0.2466 0.2244 0.5192 0.5723 0.283 0.3202 0.4623 0.507 0.6377 0.4035 0.6395 0.3413 0.3206 0.6418 0.7398 0.5574 0.885 0.5854 0.7969 0.4755 0.2201 0.2768 0.3771 0.9268 0.535 0.6186 0.3024 0.5682 0.0685 0.7462 0.5201 0.3232 0.5681 0.2928 0.6943 0.3383 0.492 0.5923 0.1598 0.091 0.9265 0.6932 0.6732 0.0379 0.6651 0.699 0.246 0.3378 0. 0.6602 0.4242 0. 0.2325 0.2416 0. 0.0617 0.1771 0. 0.3359 0.6267 0. 0.0156 0.809 0.7862 0.1872 0.5987 0.9918 0.7579 0.229 0.1903 0. 0.1739 0.6262 0.5758 0.6046 0. 0.5756 0.466 0.0132 0.6824 0.5845 0. 0.0953 0.6689 0.4981 0.6255 0.8387 0.5057 0. 0.6075 0.6114 0. 0.1303 0.2655 0.8891 0.2538 0.2202 0. 0.3937 0.6347 0.0954 0.0123 0.5629 0.0104 0.816 0.0061 0.857 0.0839 0.0264 0. ] 2022-08-22 18:37:11 [INFO] [EVAL] Class Recall: [0.8136 0.9154 0.9659 0.8527 0.7995 0.8279 0.8069 0.8285 0.6361 0.7815 0.5013 0.6596 0.8982 0.3163 0.1995 0.3638 0.749 0.4366 0.587 0.4905 0.8129 0.6631 0.5656 0.6319 0.3774 0.2367 0.7633 0.4062 0.3049 0.4184 0.1351 0.5329 0.3064 0.4737 0.3308 0.2396 0.4412 0.3581 0.2378 0.4218 0.126 0.155 0.2899 0.239 0.5633 0.1921 0.3786 0.537 0.5305 0.6914 0.4793 0.458 0.2308 0.1526 0.7786 0.6432 0.8565 0.3256 0.171 0.299 0.0703 0.1135 0.4718 0.0764 0.4745 0.773 0.3078 0.4998 0.052 0.1777 0.349 0.8022 0.5374 0.1911 0.3637 0.3111 0.6447 0.0218 0.1999 0.0946 0.2967 0.2725 0.2208 0.0817 0.0351 0.5303 0.0252 0.0065 0. 0.4157 0.6435 0. 0.5015 0.0421 0. 0.0153 0.0398 0. 0.0678 0.0622 0. 0.0329 0.0366 0.2166 0.0122 0.3197 0.0104 0.5345 0.0987 0.1138 0. 0.4002 0.0167 0.5748 0.9895 0. 0.543 0.9206 0.0001 0.1321 0.6517 0. 0.0012 0.0809 0.2146 0.076 0.4409 0.4994 0. 0.0413 0.6253 0. 0.0006 0.0818 0.0269 0.0916 0.1101 0. 0.0249 0.2635 0.2136 0.0169 0.3576 0.0001 0.1791 0.0078 0.217 0.0002 0.0007 0. ] 2022-08-22 18:37:12 [INFO] [EVAL] The model with the best validation mIoU (0.2363) was saved at iter 6000. 
2022-08-22 18:37:20 [INFO] [TRAIN] epoch: 5, iter: 6050/160000, loss: 1.1445, lr: 0.001166, batch_cost: 0.1752, reader_cost: 0.00395, ips: 45.6709 samples/sec | ETA 07:29:26 2022-08-22 18:37:28 [INFO] [TRAIN] epoch: 5, iter: 6100/160000, loss: 1.1962, lr: 0.001165, batch_cost: 0.1587, reader_cost: 0.00091, ips: 50.3951 samples/sec | ETA 06:47:10 2022-08-22 18:37:37 [INFO] [TRAIN] epoch: 5, iter: 6150/160000, loss: 1.1482, lr: 0.001165, batch_cost: 0.1728, reader_cost: 0.00057, ips: 46.2841 samples/sec | ETA 07:23:12 2022-08-22 18:37:47 [INFO] [TRAIN] epoch: 5, iter: 6200/160000, loss: 1.1676, lr: 0.001164, batch_cost: 0.1950, reader_cost: 0.00047, ips: 41.0196 samples/sec | ETA 08:19:55 2022-08-22 18:37:55 [INFO] [TRAIN] epoch: 5, iter: 6250/160000, loss: 1.1223, lr: 0.001164, batch_cost: 0.1748, reader_cost: 0.00043, ips: 45.7575 samples/sec | ETA 07:28:00 2022-08-22 18:38:04 [INFO] [TRAIN] epoch: 5, iter: 6300/160000, loss: 1.1563, lr: 0.001164, batch_cost: 0.1773, reader_cost: 0.00048, ips: 45.1121 samples/sec | ETA 07:34:16 2022-08-22 18:38:14 [INFO] [TRAIN] epoch: 6, iter: 6350/160000, loss: 1.0594, lr: 0.001163, batch_cost: 0.1982, reader_cost: 0.03909, ips: 40.3622 samples/sec | ETA 08:27:34 2022-08-22 18:38:23 [INFO] [TRAIN] epoch: 6, iter: 6400/160000, loss: 1.1181, lr: 0.001163, batch_cost: 0.1746, reader_cost: 0.00083, ips: 45.8065 samples/sec | ETA 07:27:05 2022-08-22 18:38:32 [INFO] [TRAIN] epoch: 6, iter: 6450/160000, loss: 1.1950, lr: 0.001163, batch_cost: 0.1757, reader_cost: 0.00056, ips: 45.5197 samples/sec | ETA 07:29:46 2022-08-22 18:38:41 [INFO] [TRAIN] epoch: 6, iter: 6500/160000, loss: 1.1167, lr: 0.001162, batch_cost: 0.1832, reader_cost: 0.00052, ips: 43.6736 samples/sec | ETA 07:48:37 2022-08-22 18:38:49 [INFO] [TRAIN] epoch: 6, iter: 6550/160000, loss: 1.1181, lr: 0.001162, batch_cost: 0.1591, reader_cost: 0.00042, ips: 50.2753 samples/sec | ETA 06:46:57 2022-08-22 18:38:57 [INFO] [TRAIN] epoch: 6, iter: 6600/160000, loss: 1.0672, lr: 0.001161, batch_cost: 0.1647, reader_cost: 0.00058, ips: 48.5837 samples/sec | ETA 07:00:59 2022-08-22 18:39:06 [INFO] [TRAIN] epoch: 6, iter: 6650/160000, loss: 1.1461, lr: 0.001161, batch_cost: 0.1725, reader_cost: 0.00097, ips: 46.3672 samples/sec | ETA 07:20:58 2022-08-22 18:39:16 [INFO] [TRAIN] epoch: 6, iter: 6700/160000, loss: 1.1302, lr: 0.001161, batch_cost: 0.1993, reader_cost: 0.00051, ips: 40.1462 samples/sec | ETA 08:29:08 2022-08-22 18:39:25 [INFO] [TRAIN] epoch: 6, iter: 6750/160000, loss: 1.1418, lr: 0.001160, batch_cost: 0.1805, reader_cost: 0.00049, ips: 44.3113 samples/sec | ETA 07:41:07 2022-08-22 18:39:34 [INFO] [TRAIN] epoch: 6, iter: 6800/160000, loss: 1.1485, lr: 0.001160, batch_cost: 0.1788, reader_cost: 0.00085, ips: 44.7485 samples/sec | ETA 07:36:28 2022-08-22 18:39:44 [INFO] [TRAIN] epoch: 6, iter: 6850/160000, loss: 1.1301, lr: 0.001160, batch_cost: 0.2036, reader_cost: 0.00045, ips: 39.2874 samples/sec | ETA 08:39:45 2022-08-22 18:39:53 [INFO] [TRAIN] epoch: 6, iter: 6900/160000, loss: 1.1031, lr: 0.001159, batch_cost: 0.1853, reader_cost: 0.00636, ips: 43.1840 samples/sec | ETA 07:52:42 2022-08-22 18:40:03 [INFO] [TRAIN] epoch: 6, iter: 6950/160000, loss: 1.1233, lr: 0.001159, batch_cost: 0.1948, reader_cost: 0.00156, ips: 41.0683 samples/sec | ETA 08:16:53 2022-08-22 18:40:14 [INFO] [TRAIN] epoch: 6, iter: 7000/160000, loss: 1.2012, lr: 0.001158, batch_cost: 0.2260, reader_cost: 0.00035, ips: 35.3977 samples/sec | ETA 09:36:18 2022-08-22 18:40:14 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 177s - batch_cost: 0.1769 - reader cost: 0.0011 2022-08-22 18:43:11 [INFO] [EVAL] #Images: 2000 mIoU: 0.2474 Acc: 0.7167 Kappa: 0.6943 Dice: 0.3543 2022-08-22 18:43:11 [INFO] [EVAL] Class IoU: [0.6198 0.7423 0.9141 0.6692 0.6446 0.7064 0.7152 0.6936 0.4408 0.5654 0.4217 0.4682 0.6428 0.2659 0.1125 0.3069 0.48 0.4094 0.5186 0.2992 0.6508 0.2993 0.45 0.3983 0.2743 0.2571 0.3721 0.2337 0.2632 0.191 0.1427 0.4083 0.172 0.2145 0.2817 0.3518 0.3204 0.4434 0.2132 0.1719 0.0658 0.0695 0.2478 0.1917 0.3138 0.1572 0.2599 0.3545 0.5207 0.4144 0.3326 0.2766 0.0559 0.2123 0.6704 0.4194 0.7382 0.3227 0.3573 0.234 0.0438 0.135 0.1875 0.1335 0.3323 0.5334 0.1419 0.3771 0.014 0.2281 0.2636 0.3696 0.3663 0.1405 0.3458 0.1694 0.3467 0.088 0.0685 0.1422 0.6055 0.2432 0.1804 0.016 0.2943 0.411 0.0138 0.0179 0. 0.2777 0.338 0. 0.1974 0.0476 0. 0. 0.0217 0. 0.1095 0.0279 0. 0.0106 0.0176 0.1992 0.0031 0.2873 0.0083 0.4604 0.0546 0.1846 0.0007 0.1867 0.0085 0.3389 0.6606 0. 0.1065 0.5939 0.0145 0.3437 0.4025 0. 0.1126 0.2001 0.1754 0.0649 0.371 0.3013 0. 0.2044 0.4213 0. 0.0049 0.0349 0.0089 0.0668 0.0571 0. 0.0497 0.2544 0.216 0.0162 0.2351 0. 0.1631 0. 0.0639 0.0002 0.02 0. ] 2022-08-22 18:43:11 [INFO] [EVAL] Class Precision: [0.7211 0.8181 0.9583 0.7458 0.7429 0.8404 0.8327 0.7785 0.5847 0.6752 0.6275 0.6273 0.7217 0.4996 0.4658 0.5349 0.6116 0.6083 0.6598 0.5648 0.701 0.5127 0.756 0.603 0.3843 0.6006 0.4061 0.731 0.7316 0.2605 0.2753 0.5358 0.3069 0.3171 0.4621 0.5233 0.4464 0.6775 0.4955 0.5048 0.3556 0.3714 0.5009 0.6353 0.5043 0.5905 0.6383 0.4775 0.5463 0.5431 0.4136 0.3612 0.2581 0.4532 0.7716 0.5568 0.7833 0.4904 0.5825 0.5911 0.0754 0.3875 0.5604 0.7901 0.5257 0.6271 0.3369 0.5238 0.2772 0.7729 0.5762 0.4423 0.6039 0.2229 0.4619 0.5878 0.3954 0.744 0.0917 0.2016 0.8288 0.6294 0.7177 0.079 0.6097 0.7159 0.1807 0.3018 0. 0.4846 0.4169 0. 0.3761 0.3423 0. 0.0068 0.1704 0. 0.2297 0.7676 0. 0.017 0.2831 0.4924 0.1245 0.3406 0.7136 0.7167 0.1345 0.2352 0.0335 0.406 0.151 0.5316 0.6658 0. 0.9159 0.7567 0.407 0.452 0.5124 0. 0.6351 0.4418 0.424 0.6046 0.8828 0.4611 0. 0.3291 0.5303 0. 0.0693 0.3308 0.9857 0.2964 0.3482 0. 0.3881 0.6079 0.5523 0.0537 0.6871 0. 0.6168 0. 0.9967 0.1473 0.4011 0. ] 2022-08-22 18:43:11 [INFO] [EVAL] Class Recall: [0.8152 0.8891 0.952 0.867 0.8298 0.8159 0.8352 0.8642 0.6417 0.7767 0.5625 0.6485 0.8547 0.3623 0.1291 0.4186 0.6906 0.5559 0.708 0.3888 0.9009 0.4183 0.5265 0.5399 0.4893 0.3101 0.8161 0.2557 0.2914 0.4174 0.2285 0.6316 0.2813 0.3985 0.4191 0.5176 0.5316 0.5621 0.2723 0.2067 0.0748 0.0788 0.329 0.2154 0.4538 0.1765 0.3048 0.5793 0.9174 0.6362 0.6295 0.5413 0.0666 0.2855 0.8363 0.6294 0.9276 0.4855 0.4802 0.2792 0.0945 0.1715 0.2198 0.1384 0.4747 0.7811 0.1968 0.5739 0.0145 0.2445 0.327 0.6922 0.4821 0.2754 0.5789 0.1923 0.7379 0.0908 0.2134 0.3253 0.6921 0.2838 0.1941 0.0196 0.3627 0.491 0.0148 0.0187 0. 0.3941 0.641 0. 0.2935 0.0524 0. 0. 0.0243 0. 0.1732 0.0281 0. 0.0278 0.0185 0.2506 0.0032 0.6475 0.0083 0.5629 0.0841 0.462 0.0007 0.2568 0.0089 0.4831 0.9885 0. 0.1076 0.734 0.0148 0.5893 0.6525 0. 0.1204 0.2678 0.2304 0.0678 0.3902 0.4651 0. 0.3503 0.6722 0. 0.0053 0.0376 0.009 0.0794 0.064 0. 0.054 0.3044 0.2619 0.0226 0.2633 0. 0.1815 0. 0.0639 0.0002 0.0207 0. ] 2022-08-22 18:43:12 [INFO] [EVAL] The model with the best validation mIoU (0.2474) was saved at iter 7000. 
2022-08-22 18:43:20 [INFO] [TRAIN] epoch: 6, iter: 7050/160000, loss: 1.0931, lr: 0.001158, batch_cost: 0.1641, reader_cost: 0.00301, ips: 48.7630 samples/sec | ETA 06:58:12 2022-08-22 18:43:29 [INFO] [TRAIN] epoch: 6, iter: 7100/160000, loss: 1.1472, lr: 0.001158, batch_cost: 0.1793, reader_cost: 0.00081, ips: 44.6297 samples/sec | ETA 07:36:47 2022-08-22 18:43:39 [INFO] [TRAIN] epoch: 6, iter: 7150/160000, loss: 1.1708, lr: 0.001157, batch_cost: 0.2104, reader_cost: 0.00048, ips: 38.0201 samples/sec | ETA 08:56:01 2022-08-22 18:43:48 [INFO] [TRAIN] epoch: 6, iter: 7200/160000, loss: 1.0806, lr: 0.001157, batch_cost: 0.1741, reader_cost: 0.00057, ips: 45.9386 samples/sec | ETA 07:23:29 2022-08-22 18:43:57 [INFO] [TRAIN] epoch: 6, iter: 7250/160000, loss: 1.0980, lr: 0.001156, batch_cost: 0.1787, reader_cost: 0.00051, ips: 44.7592 samples/sec | ETA 07:35:01 2022-08-22 18:44:06 [INFO] [TRAIN] epoch: 6, iter: 7300/160000, loss: 1.1950, lr: 0.001156, batch_cost: 0.1800, reader_cost: 0.00096, ips: 44.4563 samples/sec | ETA 07:37:58 2022-08-22 18:44:15 [INFO] [TRAIN] epoch: 6, iter: 7350/160000, loss: 1.1379, lr: 0.001156, batch_cost: 0.1798, reader_cost: 0.00041, ips: 44.4858 samples/sec | ETA 07:37:31 2022-08-22 18:44:23 [INFO] [TRAIN] epoch: 6, iter: 7400/160000, loss: 1.1084, lr: 0.001155, batch_cost: 0.1563, reader_cost: 0.00060, ips: 51.1810 samples/sec | ETA 06:37:32 2022-08-22 18:44:32 [INFO] [TRAIN] epoch: 6, iter: 7450/160000, loss: 1.1257, lr: 0.001155, batch_cost: 0.1901, reader_cost: 0.00100, ips: 42.0938 samples/sec | ETA 08:03:12 2022-08-22 18:44:40 [INFO] [TRAIN] epoch: 6, iter: 7500/160000, loss: 1.1071, lr: 0.001155, batch_cost: 0.1651, reader_cost: 0.00090, ips: 48.4460 samples/sec | ETA 06:59:42 2022-08-22 18:44:50 [INFO] [TRAIN] epoch: 6, iter: 7550/160000, loss: 1.1772, lr: 0.001154, batch_cost: 0.1826, reader_cost: 0.00087, ips: 43.8147 samples/sec | ETA 07:43:55 2022-08-22 18:44:59 [INFO] [TRAIN] epoch: 7, iter: 7600/160000, loss: 1.1219, lr: 0.001154, batch_cost: 0.1932, reader_cost: 0.02480, ips: 41.4131 samples/sec | ETA 08:10:39 2022-08-22 18:45:08 [INFO] [TRAIN] epoch: 7, iter: 7650/160000, loss: 1.0588, lr: 0.001153, batch_cost: 0.1788, reader_cost: 0.00048, ips: 44.7397 samples/sec | ETA 07:34:02 2022-08-22 18:45:16 [INFO] [TRAIN] epoch: 7, iter: 7700/160000, loss: 1.0006, lr: 0.001153, batch_cost: 0.1651, reader_cost: 0.00210, ips: 48.4505 samples/sec | ETA 06:59:07 2022-08-22 18:45:25 [INFO] [TRAIN] epoch: 7, iter: 7750/160000, loss: 1.0175, lr: 0.001153, batch_cost: 0.1641, reader_cost: 0.00042, ips: 48.7525 samples/sec | ETA 06:56:23 2022-08-22 18:45:32 [INFO] [TRAIN] epoch: 7, iter: 7800/160000, loss: 1.0917, lr: 0.001152, batch_cost: 0.1553, reader_cost: 0.00049, ips: 51.5069 samples/sec | ETA 06:33:59 2022-08-22 18:45:42 [INFO] [TRAIN] epoch: 7, iter: 7850/160000, loss: 1.0722, lr: 0.001152, batch_cost: 0.1895, reader_cost: 0.00073, ips: 42.2226 samples/sec | ETA 08:00:28 2022-08-22 18:45:52 [INFO] [TRAIN] epoch: 7, iter: 7900/160000, loss: 1.1686, lr: 0.001152, batch_cost: 0.1990, reader_cost: 0.00179, ips: 40.1960 samples/sec | ETA 08:24:31 2022-08-22 18:46:01 [INFO] [TRAIN] epoch: 7, iter: 7950/160000, loss: 0.9903, lr: 0.001151, batch_cost: 0.1925, reader_cost: 0.00538, ips: 41.5542 samples/sec | ETA 08:07:52 2022-08-22 18:46:12 [INFO] [TRAIN] epoch: 7, iter: 8000/160000, loss: 1.0641, lr: 0.001151, batch_cost: 0.2163, reader_cost: 0.00068, ips: 36.9819 samples/sec | ETA 09:08:00 2022-08-22 18:46:12 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 194s - batch_cost: 0.1936 - reader cost: 0.0014 2022-08-22 18:49:26 [INFO] [EVAL] #Images: 2000 mIoU: 0.2617 Acc: 0.7237 Kappa: 0.7022 Dice: 0.3715 2022-08-22 18:49:26 [INFO] [EVAL] Class IoU: [0.6245 0.7545 0.917 0.6838 0.6451 0.7133 0.7182 0.7079 0.4406 0.5788 0.4103 0.4769 0.6597 0.2944 0.1768 0.3035 0.4681 0.4196 0.4885 0.3261 0.7089 0.4343 0.5188 0.3926 0.2032 0.3542 0.4564 0.2787 0.2957 0.2308 0.1489 0.4228 0.2297 0.2401 0.3176 0.3208 0.3426 0.4292 0.2053 0.2134 0.0757 0.0389 0.2097 0.192 0.2611 0.1312 0.2659 0.3701 0.5079 0.4066 0.3244 0.2638 0.107 0.1477 0.6769 0.4003 0.7915 0.3236 0.3618 0.218 0.0297 0.3294 0.2212 0.1019 0.3585 0.482 0.2124 0.3703 0.0101 0.2628 0.2781 0.4306 0.3547 0.1629 0.3531 0.2454 0.3856 0.0742 0.0987 0.1262 0.5444 0.2525 0.2373 0.032 0.1828 0.4501 0.0347 0.0189 0. 0.3782 0.3138 0. 0.1815 0.0262 0. 0. 0.0417 0. 0.1288 0.0213 0.0004 0.0049 0.0458 0.6214 0.0771 0.3723 0.0116 0.4558 0.0173 0.2455 0. 0.0846 0.0316 0.3509 0.5441 0. 0.2345 0.6106 0.0551 0.3604 0.4123 0. 0.03 0.1483 0.1619 0.0831 0.3944 0.2883 0. 0.168 0.5743 0. 0.1016 0.0653 0.039 0.0701 0.0443 0. 0.0785 0.2638 0.3232 0. 0.3066 0.0315 0.0443 0.001 0.1115 0.0214 0.0066 0.0016] 2022-08-22 18:49:26 [INFO] [EVAL] Class Precision: [0.7351 0.8408 0.949 0.7635 0.7324 0.8384 0.8445 0.796 0.5734 0.7829 0.5321 0.634 0.7478 0.5446 0.4325 0.5519 0.6077 0.6526 0.5745 0.4707 0.8098 0.6035 0.6841 0.554 0.4131 0.5356 0.5312 0.7087 0.6601 0.293 0.2762 0.5549 0.4444 0.3665 0.436 0.4429 0.5627 0.7849 0.3751 0.5286 0.2055 0.4546 0.6883 0.7094 0.3497 0.4727 0.5204 0.495 0.5728 0.5365 0.3783 0.3652 0.3255 0.6623 0.7754 0.5086 0.8857 0.5063 0.6782 0.5777 0.3002 0.3878 0.5836 0.8304 0.5377 0.8357 0.4285 0.502 0.9056 0.6332 0.5267 0.6368 0.599 0.1873 0.6403 0.377 0.4467 0.7202 0.1388 0.3022 0.6453 0.5097 0.6747 0.1533 0.7503 0.6529 0.5697 0.2256 0. 0.7105 0.3855 0. 0.3615 0.2713 0. 0. 0.2079 0. 0.518 0.3705 0.0387 0.0074 0.5928 0.8324 0.2398 0.7021 0.7509 0.6548 0.1007 0.2877 0. 0.0871 0.366 0.4023 0.5468 0. 0.652 0.7353 0.2439 0.5095 0.6785 0. 0.3388 0.5654 0.61 0.5701 0.7549 0.3739 0. 0.73 0.8119 0. 0.4024 0.6029 0.7583 0.388 0.4718 0. 0.4338 0.6357 0.7718 0. 0.5956 0.0665 0.9168 0.0048 0.9825 0.2837 0.7839 0.652 ] 2022-08-22 18:49:26 [INFO] [EVAL] Class Recall: [0.8059 0.8802 0.9646 0.8675 0.8441 0.827 0.8277 0.8648 0.6555 0.6894 0.6418 0.658 0.8485 0.3906 0.2302 0.4028 0.6708 0.5403 0.7654 0.5149 0.8504 0.6077 0.6822 0.5741 0.2856 0.5113 0.7642 0.3147 0.3488 0.5208 0.2443 0.6399 0.3223 0.4104 0.5391 0.538 0.4669 0.4863 0.3121 0.2635 0.1069 0.0408 0.2317 0.2084 0.5077 0.1537 0.3522 0.5947 0.8177 0.6268 0.6951 0.4873 0.1375 0.1598 0.842 0.6526 0.8816 0.4727 0.4367 0.2593 0.0319 0.6861 0.2627 0.1041 0.5182 0.5324 0.2963 0.5854 0.0101 0.3101 0.3707 0.5707 0.4651 0.5555 0.4405 0.4128 0.738 0.0764 0.2545 0.1781 0.7768 0.3334 0.2679 0.0389 0.1946 0.5916 0.0357 0.0202 0. 0.4471 0.628 0. 0.2671 0.0282 0. 0. 0.0495 0. 0.1464 0.0221 0.0004 0.014 0.0473 0.7103 0.1021 0.4422 0.0116 0.6 0.0205 0.6258 0. 0.7438 0.0335 0.733 0.9912 0. 0.2681 0.7826 0.0664 0.552 0.5124 0. 0.0318 0.1674 0.1806 0.0886 0.4523 0.5574 0. 0.1791 0.6625 0. 0.1197 0.0683 0.0395 0.0789 0.0467 0. 0.0874 0.3107 0.3573 0. 0.3872 0.0563 0.0445 0.0013 0.1117 0.0226 0.0066 0.0016] 2022-08-22 18:49:26 [INFO] [EVAL] The model with the best validation mIoU (0.2617) was saved at iter 8000. 
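Each evaluation pass above runs 1000 steps over 2000 validation images (two images per evaluation step), and its wall time matches batch_cost times the step count (1000 x 0.1936 s is about 194 s). The scalar summaries are then reductions of the per-class vectors that follow. A hedged numpy illustration of the usual reductions, mirroring the standard definitions rather than the evaluator's actual code:

import numpy as np

# Illustration only: how the scalar summaries plausibly reduce the per-class vectors.
# `class_iou` is truncated to the first three entries of the iter-8000 vector printed
# above; with the full per-class vector (150 ADE20K classes), the unweighted mean
# gives the reported mIoU of 0.2617.
class_iou = np.array([0.6245, 0.7545, 0.9170])    # first entries only, for brevity
miou = class_iou.mean()                           # unweighted mean over all classes
# Per class, Dice and IoU are tied by dice = 2*iou / (1 + iou); the reported Dice
# is plausibly the mean of these per-class Dice values.
class_dice = 2 * class_iou / (1 + class_iou)
print(miou, class_dice.mean())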
2022-08-22 18:49:36 [INFO] [TRAIN] epoch: 7, iter: 8050/160000, loss: 1.1427, lr: 0.001150, batch_cost: 0.1958, reader_cost: 0.00364, ips: 40.8500 samples/sec | ETA 08:15:57 2022-08-22 18:49:45 [INFO] [TRAIN] epoch: 7, iter: 8100/160000, loss: 1.0631, lr: 0.001150, batch_cost: 0.1832, reader_cost: 0.00114, ips: 43.6589 samples/sec | ETA 07:43:53 2022-08-22 18:49:54 [INFO] [TRAIN] epoch: 7, iter: 8150/160000, loss: 1.0645, lr: 0.001150, batch_cost: 0.1803, reader_cost: 0.00054, ips: 44.3806 samples/sec | ETA 07:36:12 2022-08-22 18:50:03 [INFO] [TRAIN] epoch: 7, iter: 8200/160000, loss: 1.1075, lr: 0.001149, batch_cost: 0.1700, reader_cost: 0.00032, ips: 47.0556 samples/sec | ETA 07:10:07 2022-08-22 18:50:12 [INFO] [TRAIN] epoch: 7, iter: 8250/160000, loss: 1.0616, lr: 0.001149, batch_cost: 0.1804, reader_cost: 0.00038, ips: 44.3456 samples/sec | ETA 07:36:15 2022-08-22 18:50:20 [INFO] [TRAIN] epoch: 7, iter: 8300/160000, loss: 1.1448, lr: 0.001149, batch_cost: 0.1621, reader_cost: 0.00129, ips: 49.3465 samples/sec | ETA 06:49:53 2022-08-22 18:50:28 [INFO] [TRAIN] epoch: 7, iter: 8350/160000, loss: 1.1070, lr: 0.001148, batch_cost: 0.1707, reader_cost: 0.00042, ips: 46.8620 samples/sec | ETA 07:11:28 2022-08-22 18:50:37 [INFO] [TRAIN] epoch: 7, iter: 8400/160000, loss: 1.0915, lr: 0.001148, batch_cost: 0.1633, reader_cost: 0.00048, ips: 48.9801 samples/sec | ETA 06:52:41 2022-08-22 18:50:45 [INFO] [TRAIN] epoch: 7, iter: 8450/160000, loss: 1.0826, lr: 0.001147, batch_cost: 0.1778, reader_cost: 0.00051, ips: 44.9984 samples/sec | ETA 07:29:03 2022-08-22 18:50:55 [INFO] [TRAIN] epoch: 7, iter: 8500/160000, loss: 1.0871, lr: 0.001147, batch_cost: 0.1889, reader_cost: 0.00041, ips: 42.3614 samples/sec | ETA 07:56:50 2022-08-22 18:51:04 [INFO] [TRAIN] epoch: 7, iter: 8550/160000, loss: 1.0899, lr: 0.001147, batch_cost: 0.1790, reader_cost: 0.00065, ips: 44.6835 samples/sec | ETA 07:31:55 2022-08-22 18:51:14 [INFO] [TRAIN] epoch: 7, iter: 8600/160000, loss: 1.1288, lr: 0.001146, batch_cost: 0.2076, reader_cost: 0.00032, ips: 38.5351 samples/sec | ETA 08:43:51 2022-08-22 18:51:22 [INFO] [TRAIN] epoch: 7, iter: 8650/160000, loss: 1.0640, lr: 0.001146, batch_cost: 0.1648, reader_cost: 0.00068, ips: 48.5364 samples/sec | ETA 06:55:46 2022-08-22 18:51:31 [INFO] [TRAIN] epoch: 7, iter: 8700/160000, loss: 1.1663, lr: 0.001145, batch_cost: 0.1759, reader_cost: 0.00067, ips: 45.4896 samples/sec | ETA 07:23:28 2022-08-22 18:51:40 [INFO] [TRAIN] epoch: 7, iter: 8750/160000, loss: 1.1019, lr: 0.001145, batch_cost: 0.1710, reader_cost: 0.00046, ips: 46.7741 samples/sec | ETA 07:11:09 2022-08-22 18:51:49 [INFO] [TRAIN] epoch: 7, iter: 8800/160000, loss: 0.9628, lr: 0.001145, batch_cost: 0.1811, reader_cost: 0.00030, ips: 44.1858 samples/sec | ETA 07:36:15 2022-08-22 18:52:05 [INFO] [TRAIN] epoch: 8, iter: 8850/160000, loss: 1.0724, lr: 0.001144, batch_cost: 0.3161, reader_cost: 0.09446, ips: 25.3103 samples/sec | ETA 13:16:14 2022-08-22 18:52:15 [INFO] [TRAIN] epoch: 8, iter: 8900/160000, loss: 1.0896, lr: 0.001144, batch_cost: 0.2102, reader_cost: 0.00063, ips: 38.0623 samples/sec | ETA 08:49:18 2022-08-22 18:52:25 [INFO] [TRAIN] epoch: 8, iter: 8950/160000, loss: 1.0956, lr: 0.001144, batch_cost: 0.2035, reader_cost: 0.00036, ips: 39.3198 samples/sec | ETA 08:32:12 2022-08-22 18:52:35 [INFO] [TRAIN] epoch: 8, iter: 9000/160000, loss: 1.0343, lr: 0.001143, batch_cost: 0.2010, reader_cost: 0.00125, ips: 39.7988 samples/sec | ETA 08:25:52 2022-08-22 18:52:35 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 182s - batch_cost: 0.1816 - reader cost: 0.0011 2022-08-22 18:55:37 [INFO] [EVAL] #Images: 2000 mIoU: 0.2581 Acc: 0.7272 Kappa: 0.7053 Dice: 0.3691 2022-08-22 18:55:37 [INFO] [EVAL] Class IoU: [0.6231 0.7484 0.9174 0.6861 0.642 0.7175 0.7275 0.7023 0.46 0.6314 0.4377 0.4793 0.6492 0.2736 0.0766 0.3296 0.4802 0.4064 0.5064 0.3339 0.7091 0.4418 0.5401 0.4067 0.293 0.12 0.4453 0.2937 0.3717 0.2144 0.1579 0.397 0.2134 0.2207 0.3695 0.3346 0.3523 0.3953 0.2102 0.2242 0.0164 0.1128 0.2765 0.2038 0.2825 0.1855 0.2425 0.3947 0.5504 0.4111 0.3987 0.1907 0.154 0.1029 0.5994 0.3722 0.776 0.3032 0.2346 0.2266 0.0977 0.2307 0.3028 0.1494 0.342 0.5331 0.2457 0.3802 0.0783 0.2254 0.3034 0.4133 0.3329 0.2159 0.3635 0.2418 0.4653 0.1376 0.1284 0.1944 0.5796 0.23 0.1898 0.0207 0.2684 0.4399 0.0301 0.0259 0. 0.3036 0.3439 0.0853 0.1244 0.0496 0.0157 0.0034 0.0329 0.1183 0.1327 0. 0.0001 0.0192 0.0638 0.5053 0.0226 0.2652 0.0008 0.463 0.0651 0.1662 0.0194 0.1722 0.0755 0.3049 0.4452 0. 0.2032 0.376 0.0176 0.1285 0.2485 0.0006 0.0661 0.0324 0.1905 0.0795 0.4219 0.3511 0. 0.0171 0.5671 0. 0.0267 0.2097 0.0629 0.0727 0.0834 0. 0.0512 0.2689 0.2704 0.0037 0.044 0.0724 0.0478 0. 0.2223 0.0162 0.0155 0.0057] 2022-08-22 18:55:37 [INFO] [EVAL] Class Precision: [0.721 0.8057 0.9558 0.8251 0.7158 0.7791 0.8467 0.7857 0.5892 0.722 0.6972 0.6328 0.7413 0.5456 0.4398 0.4592 0.7071 0.6866 0.6297 0.5299 0.7982 0.5838 0.752 0.5706 0.4813 0.4902 0.5651 0.6216 0.5814 0.338 0.2987 0.5525 0.4524 0.4413 0.421 0.6163 0.5512 0.7709 0.4877 0.5279 0.297 0.2371 0.4965 0.5623 0.4087 0.3515 0.5008 0.5782 0.652 0.5043 0.6005 0.2422 0.3629 0.5796 0.6475 0.4409 0.8268 0.5397 0.5014 0.3858 0.2426 0.342 0.5508 0.7599 0.5115 0.6701 0.3749 0.5783 0.2379 0.6534 0.6041 0.5085 0.5342 0.3129 0.6425 0.4042 0.5974 0.6637 0.1955 0.2792 0.7726 0.7444 0.7046 0.1755 0.6233 0.6283 0.3641 0.311 0. 0.6995 0.4934 0.1739 0.3832 0.2262 0.4464 0.0169 0.2119 0.2893 0.4164 0. 0.2566 0.0271 0.6645 0.8756 0.6354 0.3049 0.4566 0.8801 0.2126 0.2339 0.3224 0.2018 0.6652 0.8573 0.4464 0. 0.6721 0.3859 0.1725 0.3842 0.7604 0.8197 0.6733 0.8006 0.6263 0.6549 0.7922 0.4896 0. 0.7092 0.7485 0. 0.2018 0.3981 0.7134 0.3293 0.4456 0. 0.4079 0.5262 0.3984 0.0115 0.8951 0.3331 0.5924 0. 0.8157 0.298 0.6724 0.5744] 2022-08-22 18:55:37 [INFO] [EVAL] Class Recall: [0.8211 0.9131 0.958 0.8029 0.8616 0.9007 0.8379 0.8688 0.6773 0.8341 0.5404 0.6639 0.8393 0.3544 0.0849 0.5387 0.5994 0.499 0.7211 0.4745 0.864 0.6449 0.6572 0.586 0.4282 0.1371 0.6775 0.3576 0.5075 0.3695 0.2509 0.5851 0.2877 0.3062 0.7511 0.4227 0.494 0.448 0.2698 0.2805 0.0171 0.1772 0.3843 0.2422 0.478 0.2819 0.3199 0.5542 0.7794 0.6898 0.5427 0.4724 0.2111 0.1112 0.8896 0.705 0.9266 0.4089 0.306 0.3545 0.1406 0.415 0.4022 0.1568 0.508 0.7227 0.4163 0.526 0.1045 0.256 0.3786 0.6882 0.469 0.4105 0.4557 0.3756 0.6777 0.148 0.2723 0.3902 0.6987 0.2497 0.2062 0.0229 0.3204 0.5947 0.0318 0.0275 0. 0.3491 0.5317 0.1436 0.1555 0.0597 0.016 0.0043 0.0376 0.1667 0.163 0. 0.0001 0.0617 0.0659 0.5444 0.0229 0.671 0.0008 0.4942 0.0857 0.3647 0.0202 0.5394 0.0785 0.3212 0.9941 0. 0.2255 0.9363 0.0193 0.1618 0.2696 0.0006 0.0683 0.0327 0.2149 0.083 0.4744 0.5539 0. 0.0172 0.7005 0. 0.0299 0.3071 0.0646 0.0853 0.093 0. 0.0553 0.3549 0.4569 0.0054 0.0442 0.0847 0.0494 0. 0.234 0.0168 0.0156 0.0057] 2022-08-22 18:55:37 [INFO] [EVAL] The model with the best validation mIoU (0.2617) was saved at iter 8000. 
2022-08-22 18:55:46 [INFO] [TRAIN] epoch: 8, iter: 9050/160000, loss: 1.0198, lr: 0.001143, batch_cost: 0.1823, reader_cost: 0.00424, ips: 43.8852 samples/sec | ETA 07:38:37 2022-08-22 18:55:55 [INFO] [TRAIN] epoch: 8, iter: 9100/160000, loss: 1.0825, lr: 0.001142, batch_cost: 0.1756, reader_cost: 0.00122, ips: 45.5687 samples/sec | ETA 07:21:31 2022-08-22 18:56:03 [INFO] [TRAIN] epoch: 8, iter: 9150/160000, loss: 1.0363, lr: 0.001142, batch_cost: 0.1661, reader_cost: 0.00063, ips: 48.1620 samples/sec | ETA 06:57:37 2022-08-22 18:56:12 [INFO] [TRAIN] epoch: 8, iter: 9200/160000, loss: 1.0339, lr: 0.001142, batch_cost: 0.1640, reader_cost: 0.00121, ips: 48.7897 samples/sec | ETA 06:52:06 2022-08-22 18:56:21 [INFO] [TRAIN] epoch: 8, iter: 9250/160000, loss: 1.0566, lr: 0.001141, batch_cost: 0.1933, reader_cost: 0.00069, ips: 41.3786 samples/sec | ETA 08:05:45 2022-08-22 18:56:30 [INFO] [TRAIN] epoch: 8, iter: 9300/160000, loss: 1.1068, lr: 0.001141, batch_cost: 0.1783, reader_cost: 0.00048, ips: 44.8676 samples/sec | ETA 07:27:50 2022-08-22 18:56:40 [INFO] [TRAIN] epoch: 8, iter: 9350/160000, loss: 1.0285, lr: 0.001141, batch_cost: 0.1936, reader_cost: 0.00053, ips: 41.3225 samples/sec | ETA 08:06:05 2022-08-22 18:56:49 [INFO] [TRAIN] epoch: 8, iter: 9400/160000, loss: 1.0920, lr: 0.001140, batch_cost: 0.1745, reader_cost: 0.00085, ips: 45.8508 samples/sec | ETA 07:17:56 2022-08-22 18:56:58 [INFO] [TRAIN] epoch: 8, iter: 9450/160000, loss: 0.9248, lr: 0.001140, batch_cost: 0.1870, reader_cost: 0.00075, ips: 42.7712 samples/sec | ETA 07:49:19 2022-08-22 18:57:08 [INFO] [TRAIN] epoch: 8, iter: 9500/160000, loss: 1.0216, lr: 0.001139, batch_cost: 0.2008, reader_cost: 0.00089, ips: 39.8358 samples/sec | ETA 08:23:44 2022-08-22 18:57:16 [INFO] [TRAIN] epoch: 8, iter: 9550/160000, loss: 1.0386, lr: 0.001139, batch_cost: 0.1575, reader_cost: 0.00031, ips: 50.8000 samples/sec | ETA 06:34:52 2022-08-22 18:57:25 [INFO] [TRAIN] epoch: 8, iter: 9600/160000, loss: 1.1074, lr: 0.001139, batch_cost: 0.1783, reader_cost: 0.00041, ips: 44.8683 samples/sec | ETA 07:26:56 2022-08-22 18:57:34 [INFO] [TRAIN] epoch: 8, iter: 9650/160000, loss: 1.0809, lr: 0.001138, batch_cost: 0.1848, reader_cost: 0.00042, ips: 43.2797 samples/sec | ETA 07:43:11 2022-08-22 18:57:43 [INFO] [TRAIN] epoch: 8, iter: 9700/160000, loss: 1.1028, lr: 0.001138, batch_cost: 0.1698, reader_cost: 0.00054, ips: 47.1027 samples/sec | ETA 07:05:27 2022-08-22 18:57:52 [INFO] [TRAIN] epoch: 8, iter: 9750/160000, loss: 1.0906, lr: 0.001138, batch_cost: 0.1827, reader_cost: 0.00058, ips: 43.7880 samples/sec | ETA 07:37:30 2022-08-22 18:58:01 [INFO] [TRAIN] epoch: 8, iter: 9800/160000, loss: 0.9795, lr: 0.001137, batch_cost: 0.1848, reader_cost: 0.00055, ips: 43.2967 samples/sec | ETA 07:42:32 2022-08-22 18:58:10 [INFO] [TRAIN] epoch: 8, iter: 9850/160000, loss: 0.9813, lr: 0.001137, batch_cost: 0.1824, reader_cost: 0.00359, ips: 43.8693 samples/sec | ETA 07:36:21 2022-08-22 18:58:22 [INFO] [TRAIN] epoch: 8, iter: 9900/160000, loss: 1.0781, lr: 0.001136, batch_cost: 0.2430, reader_cost: 0.00103, ips: 32.9171 samples/sec | ETA 10:07:59 2022-08-22 18:58:33 [INFO] [TRAIN] epoch: 8, iter: 9950/160000, loss: 1.0846, lr: 0.001136, batch_cost: 0.2182, reader_cost: 0.00038, ips: 36.6701 samples/sec | ETA 09:05:35 2022-08-22 18:58:44 [INFO] [TRAIN] epoch: 8, iter: 10000/160000, loss: 1.0252, lr: 0.001136, batch_cost: 0.2261, reader_cost: 0.00065, ips: 35.3849 samples/sec | ETA 09:25:12 2022-08-22 18:58:44 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 176s - batch_cost: 0.1762 - reader cost: 8.6443e-04 2022-08-22 19:01:41 [INFO] [EVAL] #Images: 2000 mIoU: 0.2639 Acc: 0.7267 Kappa: 0.7046 Dice: 0.3757 2022-08-22 19:01:41 [INFO] [EVAL] Class IoU: [0.6259 0.7482 0.9232 0.6854 0.6421 0.7201 0.7323 0.6917 0.4496 0.6102 0.4305 0.4727 0.6556 0.2591 0.1669 0.3022 0.4475 0.3124 0.5133 0.3421 0.7069 0.4319 0.5317 0.4157 0.2883 0.3894 0.47 0.2637 0.2956 0.2438 0.1419 0.3545 0.2347 0.2196 0.3037 0.3425 0.3449 0.4534 0.2044 0.2536 0.0774 0.0873 0.2253 0.2092 0.2992 0.1874 0.2909 0.4239 0.5401 0.467 0.3748 0.264 0.066 0.1208 0.6873 0.3732 0.755 0.2397 0.0724 0.253 0.0572 0.2419 0.263 0.2131 0.3518 0.5506 0.176 0.4108 0.0113 0.1551 0.3208 0.4264 0.3225 0.2269 0.3763 0.2493 0.3942 0.1682 0.0884 0.0585 0.5166 0.2512 0.2045 0.0588 0.1987 0.3894 0.0293 0.0392 0.0801 0.3512 0.2489 0.0081 0.2096 0.0202 0.0015 0.0004 0.1484 0.0853 0.035 0.2361 0.001 0.0134 0.0924 0.1461 0.0388 0.3245 0.0204 0.4679 0.0476 0.0002 0.0263 0.1785 0.1074 0.5813 0.7685 0. 0.4282 0.5399 0.1071 0.1284 0.2443 0.0026 0.1316 0.0661 0.1467 0.1312 0.4219 0.4074 0. 0.0548 0.5466 0. 0.1451 0.1629 0.0491 0.0639 0.0536 0.0001 0.1026 0.2673 0.2283 0.0172 0.2598 0.0044 0.0718 0. 0.1233 0.0001 0.0366 0.0116] 2022-08-22 19:01:41 [INFO] [EVAL] Class Precision: [0.7065 0.8375 0.9608 0.7847 0.7124 0.8119 0.8374 0.7604 0.6737 0.7133 0.6784 0.6606 0.7543 0.5601 0.4201 0.523 0.5481 0.6534 0.7183 0.5346 0.7892 0.6231 0.7304 0.6002 0.4687 0.4511 0.5034 0.6941 0.6674 0.3334 0.3488 0.4749 0.4456 0.3336 0.5145 0.5848 0.585 0.7556 0.3427 0.489 0.3373 0.3638 0.7106 0.5811 0.4146 0.5279 0.5013 0.6291 0.5961 0.6651 0.6919 0.3283 0.3423 0.7005 0.7337 0.4553 0.7901 0.6566 0.9828 0.3898 0.1842 0.3334 0.4787 0.6345 0.4983 0.6686 0.2378 0.5515 0.2237 0.7463 0.4699 0.6585 0.514 0.3062 0.579 0.4789 0.4901 0.7109 0.4237 0.1209 0.9565 0.6213 0.7007 0.1313 0.6038 0.7175 0.3631 0.2625 0.6189 0.7371 0.2853 0.0221 0.34 0.231 0.055 0.0056 0.2339 0.3208 0.4473 0.592 0.0492 0.0155 0.4122 0.9918 0.1972 0.9198 0.1387 0.8544 0.2095 0.0008 0.5764 0.2459 0.4128 0.6519 0.8918 0. 0.7559 0.6755 0.2313 0.2552 0.6651 0.4527 0.8709 0.7682 0.7372 0.7296 0.7603 0.6214 0. 0.803 0.7601 0. 0.4077 0.4017 0.4059 0.4021 0.5169 0.0014 0.3714 0.5471 0.2981 0.0422 0.6541 0.3085 0.4361 0. 0.8318 0.3966 0.6636 0.9086] 2022-08-22 19:01:41 [INFO] [EVAL] Class Recall: [0.8459 0.8752 0.9594 0.8442 0.8668 0.8642 0.8537 0.8845 0.5748 0.8085 0.5409 0.6244 0.8336 0.3252 0.2168 0.4172 0.709 0.3745 0.6427 0.4873 0.8715 0.5847 0.6616 0.5749 0.4283 0.7401 0.8761 0.2984 0.3466 0.4757 0.1931 0.5832 0.3316 0.391 0.4258 0.4527 0.4567 0.5314 0.3363 0.3451 0.0913 0.103 0.2481 0.2463 0.5179 0.2252 0.4094 0.5651 0.8518 0.6106 0.4499 0.5737 0.0756 0.1274 0.9158 0.6743 0.9444 0.2741 0.0725 0.4189 0.0766 0.4686 0.3686 0.2429 0.5449 0.7574 0.4037 0.6169 0.0117 0.1637 0.5027 0.5476 0.4641 0.4672 0.518 0.342 0.6681 0.1806 0.1005 0.1019 0.529 0.2966 0.2241 0.0963 0.2285 0.4599 0.0309 0.0441 0.0842 0.4014 0.6611 0.0126 0.3535 0.0217 0.0016 0.0004 0.2886 0.1041 0.0365 0.282 0.001 0.0905 0.1065 0.1463 0.0461 0.334 0.0234 0.5085 0.0581 0.0003 0.0269 0.3944 0.1267 0.843 0.8476 0. 0.4969 0.729 0.1664 0.2053 0.2786 0.0026 0.1342 0.0674 0.1548 0.1379 0.4867 0.5419 0. 0.0555 0.6606 0. 0.1838 0.2151 0.0529 0.0706 0.0564 0.0001 0.1241 0.3433 0.4937 0.0282 0.3012 0.0045 0.0791 0. 0.1264 0.0001 0.0373 0.0117] 2022-08-22 19:01:41 [INFO] [EVAL] The model with the best validation mIoU (0.2639) was saved at iter 10000. 
2022-08-22 19:01:50 [INFO] [TRAIN] epoch: 8, iter: 10050/160000, loss: 1.0059, lr: 0.001135, batch_cost: 0.1881, reader_cost: 0.00411, ips: 42.5263 samples/sec | ETA 07:50:08 2022-08-22 19:02:00 [INFO] [TRAIN] epoch: 8, iter: 10100/160000, loss: 1.0076, lr: 0.001135, batch_cost: 0.1981, reader_cost: 0.00117, ips: 40.3805 samples/sec | ETA 08:14:57 2022-08-22 19:02:14 [INFO] [TRAIN] epoch: 9, iter: 10150/160000, loss: 1.0112, lr: 0.001135, batch_cost: 0.2734, reader_cost: 0.09711, ips: 29.2575 samples/sec | ETA 11:22:54 2022-08-22 19:02:23 [INFO] [TRAIN] epoch: 9, iter: 10200/160000, loss: 0.9641, lr: 0.001134, batch_cost: 0.1762, reader_cost: 0.00684, ips: 45.4087 samples/sec | ETA 07:19:51 2022-08-22 19:02:32 [INFO] [TRAIN] epoch: 9, iter: 10250/160000, loss: 1.1308, lr: 0.001134, batch_cost: 0.1911, reader_cost: 0.00064, ips: 41.8581 samples/sec | ETA 07:57:00 2022-08-22 19:02:42 [INFO] [TRAIN] epoch: 9, iter: 10300/160000, loss: 1.0352, lr: 0.001133, batch_cost: 0.1913, reader_cost: 0.00068, ips: 41.8090 samples/sec | ETA 07:57:24 2022-08-22 19:02:52 [INFO] [TRAIN] epoch: 9, iter: 10350/160000, loss: 0.9440, lr: 0.001133, batch_cost: 0.2022, reader_cost: 0.00037, ips: 39.5659 samples/sec | ETA 08:24:18 2022-08-22 19:03:02 [INFO] [TRAIN] epoch: 9, iter: 10400/160000, loss: 1.0600, lr: 0.001133, batch_cost: 0.1954, reader_cost: 0.00060, ips: 40.9490 samples/sec | ETA 08:07:06 2022-08-22 19:03:12 [INFO] [TRAIN] epoch: 9, iter: 10450/160000, loss: 1.0211, lr: 0.001132, batch_cost: 0.1961, reader_cost: 0.00087, ips: 40.7887 samples/sec | ETA 08:08:51 2022-08-22 19:03:22 [INFO] [TRAIN] epoch: 9, iter: 10500/160000, loss: 0.9593, lr: 0.001132, batch_cost: 0.2152, reader_cost: 0.00053, ips: 37.1686 samples/sec | ETA 08:56:17 2022-08-22 19:03:31 [INFO] [TRAIN] epoch: 9, iter: 10550/160000, loss: 1.0580, lr: 0.001131, batch_cost: 0.1797, reader_cost: 0.00050, ips: 44.5122 samples/sec | ETA 07:27:40 2022-08-22 19:03:39 [INFO] [TRAIN] epoch: 9, iter: 10600/160000, loss: 1.0275, lr: 0.001131, batch_cost: 0.1505, reader_cost: 0.00142, ips: 53.1414 samples/sec | ETA 06:14:50 2022-08-22 19:03:47 [INFO] [TRAIN] epoch: 9, iter: 10650/160000, loss: 1.0273, lr: 0.001131, batch_cost: 0.1599, reader_cost: 0.00047, ips: 50.0208 samples/sec | ETA 06:38:06 2022-08-22 19:03:56 [INFO] [TRAIN] epoch: 9, iter: 10700/160000, loss: 0.9330, lr: 0.001130, batch_cost: 0.1785, reader_cost: 0.00048, ips: 44.8056 samples/sec | ETA 07:24:17 2022-08-22 19:04:04 [INFO] [TRAIN] epoch: 9, iter: 10750/160000, loss: 0.9612, lr: 0.001130, batch_cost: 0.1647, reader_cost: 0.00243, ips: 48.5713 samples/sec | ETA 06:49:42 2022-08-22 19:04:13 [INFO] [TRAIN] epoch: 9, iter: 10800/160000, loss: 1.0539, lr: 0.001130, batch_cost: 0.1693, reader_cost: 0.00032, ips: 47.2490 samples/sec | ETA 07:01:01 2022-08-22 19:04:21 [INFO] [TRAIN] epoch: 9, iter: 10850/160000, loss: 0.9951, lr: 0.001129, batch_cost: 0.1741, reader_cost: 0.00056, ips: 45.9568 samples/sec | ETA 07:12:43 2022-08-22 19:04:32 [INFO] [TRAIN] epoch: 9, iter: 10900/160000, loss: 1.0030, lr: 0.001129, batch_cost: 0.2120, reader_cost: 0.00778, ips: 37.7282 samples/sec | ETA 08:46:55 2022-08-22 19:04:43 [INFO] [TRAIN] epoch: 9, iter: 10950/160000, loss: 1.0084, lr: 0.001128, batch_cost: 0.2256, reader_cost: 0.00053, ips: 35.4538 samples/sec | ETA 09:20:32 2022-08-22 19:04:54 [INFO] [TRAIN] epoch: 9, iter: 11000/160000, loss: 0.9886, lr: 0.001128, batch_cost: 0.2246, reader_cost: 0.00148, ips: 35.6262 samples/sec | ETA 09:17:38 2022-08-22 19:04:54 [INFO] Start evaluating 
(total_samples: 2000, total_iters: 1000)... 1000/1000 - 184s - batch_cost: 0.1841 - reader cost: 0.0015 2022-08-22 19:07:59 [INFO] [EVAL] #Images: 2000 mIoU: 0.2767 Acc: 0.7307 Kappa: 0.7101 Dice: 0.3943 2022-08-22 19:07:59 [INFO] [EVAL] Class IoU: [0.6378 0.7367 0.9197 0.6811 0.6412 0.723 0.7384 0.69 0.4504 0.6671 0.4301 0.4864 0.6364 0.3397 0.1669 0.2847 0.4918 0.3806 0.5221 0.3259 0.7175 0.4156 0.5456 0.4404 0.3061 0.3455 0.4504 0.363 0.2601 0.2346 0.1535 0.3952 0.2199 0.2294 0.4053 0.3333 0.3542 0.4467 0.2529 0.2811 0.1122 0.1039 0.2894 0.2282 0.2687 0.2354 0.2663 0.4119 0.5036 0.4387 0.389 0.3003 0.1306 0.264 0.7064 0.4273 0.7709 0.2927 0.407 0.2887 0.059 0.194 0.3041 0.1273 0.365 0.5879 0.2065 0.3898 0.0582 0.2932 0.3452 0.197 0.2665 0.1999 0.3947 0.2536 0.2562 0.1375 0.2461 0.0725 0.6606 0.2676 0.2955 0.0228 0.2034 0.4097 0.047 0.0434 0.0971 0.407 0.241 0. 0.222 0.0534 0.0057 0.0017 0.0601 0. 0.0727 0.1755 0.0078 0.012 0.0375 0.2662 0.0414 0.3632 0.0913 0.4984 0.0557 0.0881 0.0738 0.1513 0.1024 0.4733 0.5386 0. 0.1308 0.5068 0.0968 0.264 0.3264 0. 0.1409 0.0893 0.1755 0.1654 0.3905 0.3409 0. 0.187 0.5462 0. 0.1684 0.0776 0.0378 0.126 0.0922 0. 0.0412 0.2673 0.393 0.0129 0.3624 0. 0.223 0. 0.2512 0.0175 0.0686 0.0215] 2022-08-22 19:07:59 [INFO] [EVAL] Class Precision: [0.758 0.8301 0.954 0.7843 0.7036 0.8368 0.8324 0.7399 0.563 0.8248 0.5464 0.6701 0.6992 0.5287 0.4766 0.6208 0.6775 0.6754 0.6722 0.5614 0.8214 0.6279 0.742 0.5892 0.4888 0.4186 0.4922 0.5917 0.7118 0.4019 0.3032 0.4856 0.5055 0.3451 0.4965 0.563 0.6037 0.7241 0.4284 0.5341 0.3019 0.3053 0.5285 0.5558 0.3368 0.4315 0.643 0.6289 0.5359 0.5563 0.5777 0.4434 0.477 0.4742 0.7527 0.5383 0.8521 0.6278 0.8086 0.6076 0.2759 0.4195 0.514 0.7628 0.5412 0.7694 0.4873 0.5556 0.1626 0.4741 0.5902 0.7996 0.5575 0.262 0.534 0.3742 0.2824 0.6573 0.5716 0.1376 0.7732 0.6969 0.695 0.1041 0.5705 0.4618 0.2384 0.2059 0.5703 0.626 0.2832 0. 0.3173 0.3317 0.5039 0.0217 0.1175 0. 0.2887 0.4236 0.2124 0.0198 0.4485 0.5966 0.2554 0.4889 0.5489 0.7905 0.1492 0.1694 0.2313 0.1937 0.3625 0.4819 0.5417 0.0054 0.5793 0.6065 0.1206 0.4327 0.6318 0. 0.9373 0.614 0.6145 0.5348 0.8349 0.4472 0.0034 0.4068 0.7747 0. 0.4926 0.5333 0.6674 0.2228 0.364 0. 0.4411 0.6324 0.8926 0.0313 0.5794 0. 0.5998 0. 0.8986 0.4632 0.6323 0.8932] 2022-08-22 19:07:59 [INFO] [EVAL] Class Recall: [0.8008 0.8675 0.9624 0.8381 0.8784 0.8417 0.8673 0.9109 0.6925 0.7772 0.6689 0.6395 0.8763 0.4872 0.2044 0.3447 0.6421 0.4657 0.7004 0.4373 0.8501 0.5514 0.6733 0.6356 0.4502 0.6643 0.8416 0.4842 0.2907 0.3605 0.2371 0.6799 0.2801 0.4061 0.6881 0.4496 0.4616 0.5383 0.3816 0.3724 0.1514 0.1361 0.3902 0.2791 0.5708 0.3413 0.3125 0.5442 0.8932 0.6748 0.5435 0.4819 0.1524 0.3733 0.9199 0.6745 0.89 0.3542 0.4504 0.3548 0.0698 0.2651 0.4268 0.1326 0.5286 0.7136 0.2637 0.5665 0.083 0.4346 0.4541 0.2072 0.3381 0.4572 0.6021 0.4402 0.7347 0.1481 0.3017 0.1329 0.8193 0.3028 0.3395 0.0284 0.2401 0.7843 0.0553 0.0521 0.1047 0.5377 0.6183 0. 0.425 0.0598 0.0057 0.0019 0.1096 0. 0.0886 0.2306 0.008 0.0296 0.0393 0.3246 0.0471 0.5855 0.0987 0.5743 0.0817 0.1551 0.0977 0.409 0.1249 0.9635 0.9895 0. 0.1445 0.7551 0.3292 0.4038 0.4031 0. 0.1423 0.0946 0.1973 0.1932 0.4232 0.5891 0. 0.2572 0.6493 0. 0.2037 0.0833 0.0385 0.2248 0.1099 0. 0.0435 0.3165 0.4126 0.0216 0.4917 0. 0.262 0. 0.2585 0.0178 0.0714 0.0216] 2022-08-22 19:07:59 [INFO] [EVAL] The model with the best validation mIoU (0.2767) was saved at iter 11000. 
2022-08-22 19:08:06 [INFO] [TRAIN] epoch: 9, iter: 11050/160000, loss: 0.9702, lr: 0.001128, batch_cost: 0.1499, reader_cost: 0.00352, ips: 53.3812 samples/sec | ETA 06:12:02 2022-08-22 19:08:14 [INFO] [TRAIN] epoch: 9, iter: 11100/160000, loss: 1.1091, lr: 0.001127, batch_cost: 0.1508, reader_cost: 0.00056, ips: 53.0422 samples/sec | ETA 06:14:17 2022-08-22 19:08:22 [INFO] [TRAIN] epoch: 9, iter: 11150/160000, loss: 1.0637, lr: 0.001127, batch_cost: 0.1678, reader_cost: 0.00074, ips: 47.6751 samples/sec | ETA 06:56:17 2022-08-22 19:08:31 [INFO] [TRAIN] epoch: 9, iter: 11200/160000, loss: 0.9721, lr: 0.001127, batch_cost: 0.1640, reader_cost: 0.00051, ips: 48.7866 samples/sec | ETA 06:46:40 2022-08-22 19:08:39 [INFO] [TRAIN] epoch: 9, iter: 11250/160000, loss: 1.0653, lr: 0.001126, batch_cost: 0.1653, reader_cost: 0.00195, ips: 48.3902 samples/sec | ETA 06:49:51 2022-08-22 19:08:46 [INFO] [TRAIN] epoch: 9, iter: 11300/160000, loss: 1.0577, lr: 0.001126, batch_cost: 0.1516, reader_cost: 0.00033, ips: 52.7840 samples/sec | ETA 06:15:37 2022-08-22 19:08:55 [INFO] [TRAIN] epoch: 9, iter: 11350/160000, loss: 0.9904, lr: 0.001125, batch_cost: 0.1739, reader_cost: 0.00034, ips: 46.0081 samples/sec | ETA 07:10:47 2022-08-22 19:09:07 [INFO] [TRAIN] epoch: 10, iter: 11400/160000, loss: 1.0876, lr: 0.001125, batch_cost: 0.2415, reader_cost: 0.04887, ips: 33.1222 samples/sec | ETA 09:58:11 2022-08-22 19:09:16 [INFO] [TRAIN] epoch: 10, iter: 11450/160000, loss: 0.9886, lr: 0.001125, batch_cost: 0.1792, reader_cost: 0.00050, ips: 44.6416 samples/sec | ETA 07:23:40 2022-08-22 19:09:25 [INFO] [TRAIN] epoch: 10, iter: 11500/160000, loss: 0.9547, lr: 0.001124, batch_cost: 0.1697, reader_cost: 0.00031, ips: 47.1450 samples/sec | ETA 06:59:58 2022-08-22 19:09:34 [INFO] [TRAIN] epoch: 10, iter: 11550/160000, loss: 1.0440, lr: 0.001124, batch_cost: 0.1899, reader_cost: 0.00068, ips: 42.1363 samples/sec | ETA 07:49:44 2022-08-22 19:09:43 [INFO] [TRAIN] epoch: 10, iter: 11600/160000, loss: 0.9626, lr: 0.001124, batch_cost: 0.1677, reader_cost: 0.00229, ips: 47.7018 samples/sec | ETA 06:54:47 2022-08-22 19:09:53 [INFO] [TRAIN] epoch: 10, iter: 11650/160000, loss: 1.0142, lr: 0.001123, batch_cost: 0.2026, reader_cost: 0.00079, ips: 39.4789 samples/sec | ETA 08:21:01 2022-08-22 19:10:01 [INFO] [TRAIN] epoch: 10, iter: 11700/160000, loss: 0.9679, lr: 0.001123, batch_cost: 0.1740, reader_cost: 0.00056, ips: 45.9753 samples/sec | ETA 07:10:05 2022-08-22 19:10:09 [INFO] [TRAIN] epoch: 10, iter: 11750/160000, loss: 1.0196, lr: 0.001122, batch_cost: 0.1558, reader_cost: 0.00050, ips: 51.3433 samples/sec | ETA 06:24:59 2022-08-22 19:10:18 [INFO] [TRAIN] epoch: 10, iter: 11800/160000, loss: 0.9621, lr: 0.001122, batch_cost: 0.1777, reader_cost: 0.00031, ips: 45.0303 samples/sec | ETA 07:18:48 2022-08-22 19:10:27 [INFO] [TRAIN] epoch: 10, iter: 11850/160000, loss: 1.0554, lr: 0.001122, batch_cost: 0.1866, reader_cost: 0.00064, ips: 42.8737 samples/sec | ETA 07:40:44 2022-08-22 19:10:38 [INFO] [TRAIN] epoch: 10, iter: 11900/160000, loss: 0.9945, lr: 0.001121, batch_cost: 0.2151, reader_cost: 0.01174, ips: 37.1900 samples/sec | ETA 08:50:58 2022-08-22 19:10:50 [INFO] [TRAIN] epoch: 10, iter: 11950/160000, loss: 1.0225, lr: 0.001121, batch_cost: 0.2332, reader_cost: 0.00058, ips: 34.3094 samples/sec | ETA 09:35:21 2022-08-22 19:11:01 [INFO] [TRAIN] epoch: 10, iter: 12000/160000, loss: 0.9921, lr: 0.001121, batch_cost: 0.2279, reader_cost: 0.00051, ips: 35.0972 samples/sec | ETA 09:22:14 2022-08-22 19:11:01 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 182s - batch_cost: 0.1815 - reader cost: 7.5099e-04 2022-08-22 19:14:03 [INFO] [EVAL] #Images: 2000 mIoU: 0.2827 Acc: 0.7352 Kappa: 0.7149 Dice: 0.3999 2022-08-22 19:14:03 [INFO] [EVAL] Class IoU: [0.6358 0.7508 0.9215 0.6958 0.6655 0.7185 0.7367 0.7029 0.4616 0.6777 0.4577 0.5267 0.668 0.28 0.2185 0.3405 0.4752 0.4125 0.5306 0.3525 0.7137 0.4147 0.5479 0.4206 0.2827 0.3622 0.4364 0.3439 0.3438 0.2765 0.1706 0.4692 0.1435 0.2768 0.2896 0.3155 0.3566 0.4455 0.2626 0.2909 0.0999 0.1007 0.3021 0.2095 0.2673 0.253 0.2612 0.3645 0.5974 0.4084 0.4216 0.2973 0.1708 0.2141 0.6671 0.4354 0.7184 0.3242 0.4288 0.17 0.0623 0.2785 0.2553 0.14 0.3722 0.5791 0.2328 0.3657 0.0212 0.2749 0.3647 0.4301 0.3958 0.1937 0.3866 0.2792 0.2957 0.2161 0.1987 0.212 0.5401 0.257 0.1785 0.0552 0.3073 0.4824 0.063 0.0272 0.1215 0.3484 0.3421 0. 0.1776 0.0779 0.0086 0.0022 0.0097 0.0435 0.0721 0.181 0.0269 0.0148 0.0947 0.3901 0.0186 0.4012 0.0391 0.5069 0.0637 0.0574 0.008 0.423 0.0761 0.5906 0.5471 0. 0.3016 0.4404 0.0149 0.3315 0.4744 0. 0.1238 0.1236 0.2394 0.097 0.423 0.3994 0. 0.0407 0.5335 0. 0.0138 0.154 0.0259 0.1088 0.0986 0.0147 0.0508 0.2937 0.2558 0.0349 0.1706 0.0032 0.1819 0. 0.1736 0.0005 0.0255 0.044 ] 2022-08-22 19:14:03 [INFO] [EVAL] Class Precision: [0.7464 0.8469 0.953 0.7942 0.7515 0.8328 0.9196 0.7631 0.6261 0.7587 0.7113 0.626 0.7695 0.4856 0.4068 0.5257 0.5623 0.6633 0.6661 0.4992 0.8092 0.5968 0.7155 0.5237 0.5559 0.398 0.5317 0.6538 0.6368 0.464 0.3186 0.7456 0.3336 0.4078 0.4355 0.6141 0.6 0.758 0.4293 0.5434 0.3309 0.3258 0.5475 0.5683 0.4233 0.653 0.3628 0.828 0.6691 0.4967 0.5422 0.3886 0.2631 0.6354 0.6943 0.6449 0.7513 0.6029 0.6708 0.5814 0.1983 0.325 0.4558 0.7064 0.4935 0.6644 0.3549 0.4627 0.3906 0.5194 0.4932 0.5222 0.6304 0.254 0.5831 0.4039 0.5054 0.7046 0.8328 0.2786 0.6281 0.679 0.7524 0.1883 0.6024 0.6142 0.5387 0.3518 0.9471 0.8587 0.4487 0. 0.2759 0.3177 0.0998 0.022 0.043 0.3185 0.2852 0.7491 0.3926 0.0194 0.4261 0.5559 0.3777 0.5686 0.2071 0.7905 0.1512 0.1069 0.227 0.6952 0.5992 0.8491 0.5494 0. 0.711 0.5795 0.5679 0.5038 0.6385 0. 0.9941 0.6981 0.5432 0.8228 0.8543 0.5598 0. 0.6252 0.7616 0. 0.3058 0.5565 0.7997 0.2035 0.406 0.0523 0.3701 0.6191 0.3235 0.1031 0.7982 0.0617 0.5618 0. 0.9618 0.758 0.9334 0.5705] 2022-08-22 19:14:03 [INFO] [EVAL] Class Recall: [0.811 0.8688 0.9653 0.8489 0.8533 0.8396 0.7874 0.8991 0.6372 0.864 0.5622 0.7686 0.8351 0.3982 0.3208 0.4915 0.7541 0.5218 0.7229 0.5453 0.8581 0.5761 0.7005 0.6811 0.3651 0.8009 0.7088 0.4205 0.4276 0.4063 0.2685 0.5587 0.2012 0.463 0.4637 0.3935 0.4679 0.5193 0.4035 0.3849 0.1252 0.1272 0.4025 0.2492 0.4203 0.2923 0.4827 0.3944 0.848 0.6966 0.6547 0.5587 0.3274 0.2441 0.9445 0.5727 0.9425 0.4123 0.5431 0.1937 0.0832 0.6607 0.3673 0.1487 0.6023 0.8184 0.4037 0.6356 0.0219 0.3687 0.5832 0.7093 0.5154 0.4491 0.5342 0.475 0.4161 0.2376 0.207 0.4701 0.794 0.2925 0.1896 0.0724 0.3854 0.6922 0.0666 0.0286 0.1224 0.3696 0.5903 0. 0.3325 0.0935 0.0093 0.0024 0.0124 0.0479 0.088 0.1927 0.028 0.0592 0.1086 0.5667 0.0192 0.5767 0.046 0.5856 0.0991 0.1104 0.0082 0.5193 0.0802 0.6598 0.9924 0. 0.3438 0.6472 0.0151 0.4921 0.6487 0. 0.1239 0.1306 0.2998 0.0991 0.4559 0.5823 0. 0.0417 0.6405 0. 0.0143 0.1755 0.0261 0.1895 0.1152 0.02 0.0556 0.3585 0.5498 0.0501 0.1783 0.0034 0.2119 0. 0.1748 0.0005 0.0255 0.0456] 2022-08-22 19:14:03 [INFO] [EVAL] The model with the best validation mIoU (0.2827) was saved at iter 12000. 
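Between evaluations the lr column falls by a small, constant amount per iteration. Fitting two of the logged points shows a straight line that reaches zero at roughly the 160000-iteration budget; the linear form is an inference from the logged values, not a statement of the scheduler's implementation.

# Check that the logged learning rates fall on a straight line ending near iter 160000.
it0, lr0 = 6050, 0.001166      # from the [TRAIN] line at iter 6050 above
it1, lr1 = 12000, 0.001121     # from the [TRAIN] line at iter 12000 above
slope = (lr1 - lr0) / (it1 - it0)      # ~ -7.6e-9 per iteration
zero_iter = it1 - lr1 / slope          # ~ 1.6e5, close to the 160000-iteration budget
print(f"slope={slope:.3e}/iter, lr reaches 0 near iter {zero_iter:.0f}")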
2022-08-22 19:14:13 [INFO] [TRAIN] epoch: 10, iter: 12050/160000, loss: 0.9783, lr: 0.001120, batch_cost: 0.1890, reader_cost: 0.00436, ips: 42.3369 samples/sec | ETA 07:45:56 2022-08-22 19:14:21 [INFO] [TRAIN] epoch: 10, iter: 12100/160000, loss: 0.9764, lr: 0.001120, batch_cost: 0.1756, reader_cost: 0.00322, ips: 45.5708 samples/sec | ETA 07:12:43 2022-08-22 19:14:31 [INFO] [TRAIN] epoch: 10, iter: 12150/160000, loss: 1.0455, lr: 0.001119, batch_cost: 0.1935, reader_cost: 0.00043, ips: 41.3351 samples/sec | ETA 07:56:54 2022-08-22 19:14:42 [INFO] [TRAIN] epoch: 10, iter: 12200/160000, loss: 1.0453, lr: 0.001119, batch_cost: 0.2121, reader_cost: 0.00048, ips: 37.7133 samples/sec | ETA 08:42:32 2022-08-22 19:14:51 [INFO] [TRAIN] epoch: 10, iter: 12250/160000, loss: 1.0746, lr: 0.001119, batch_cost: 0.1871, reader_cost: 0.00076, ips: 42.7628 samples/sec | ETA 07:40:40 2022-08-22 19:15:00 [INFO] [TRAIN] epoch: 10, iter: 12300/160000, loss: 1.0247, lr: 0.001118, batch_cost: 0.1814, reader_cost: 0.00396, ips: 44.1006 samples/sec | ETA 07:26:33 2022-08-22 19:15:08 [INFO] [TRAIN] epoch: 10, iter: 12350/160000, loss: 0.9572, lr: 0.001118, batch_cost: 0.1637, reader_cost: 0.00166, ips: 48.8672 samples/sec | ETA 06:42:51 2022-08-22 19:15:17 [INFO] [TRAIN] epoch: 10, iter: 12400/160000, loss: 0.9571, lr: 0.001117, batch_cost: 0.1830, reader_cost: 0.00030, ips: 43.7208 samples/sec | ETA 07:30:07 2022-08-22 19:15:27 [INFO] [TRAIN] epoch: 10, iter: 12450/160000, loss: 1.0047, lr: 0.001117, batch_cost: 0.1954, reader_cost: 0.00031, ips: 40.9446 samples/sec | ETA 08:00:29 2022-08-22 19:15:36 [INFO] [TRAIN] epoch: 10, iter: 12500/160000, loss: 0.9950, lr: 0.001117, batch_cost: 0.1671, reader_cost: 0.00052, ips: 47.8883 samples/sec | ETA 06:50:40 2022-08-22 19:15:44 [INFO] [TRAIN] epoch: 10, iter: 12550/160000, loss: 0.9780, lr: 0.001116, batch_cost: 0.1762, reader_cost: 0.00084, ips: 45.3960 samples/sec | ETA 07:13:04 2022-08-22 19:15:56 [INFO] [TRAIN] epoch: 10, iter: 12600/160000, loss: 1.0524, lr: 0.001116, batch_cost: 0.2239, reader_cost: 0.00075, ips: 35.7243 samples/sec | ETA 09:10:08 2022-08-22 19:16:09 [INFO] [TRAIN] epoch: 11, iter: 12650/160000, loss: 0.9740, lr: 0.001116, batch_cost: 0.2754, reader_cost: 0.07750, ips: 29.0502 samples/sec | ETA 11:16:18 2022-08-22 19:16:18 [INFO] [TRAIN] epoch: 11, iter: 12700/160000, loss: 0.9386, lr: 0.001115, batch_cost: 0.1696, reader_cost: 0.00059, ips: 47.1687 samples/sec | ETA 06:56:22 2022-08-22 19:16:27 [INFO] [TRAIN] epoch: 11, iter: 12750/160000, loss: 0.9739, lr: 0.001115, batch_cost: 0.1907, reader_cost: 0.00050, ips: 41.9499 samples/sec | ETA 07:48:01 2022-08-22 19:16:38 [INFO] [TRAIN] epoch: 11, iter: 12800/160000, loss: 0.9334, lr: 0.001114, batch_cost: 0.2055, reader_cost: 0.02163, ips: 38.9358 samples/sec | ETA 08:24:04 2022-08-22 19:16:48 [INFO] [TRAIN] epoch: 11, iter: 12850/160000, loss: 0.9521, lr: 0.001114, batch_cost: 0.2052, reader_cost: 0.00193, ips: 38.9941 samples/sec | ETA 08:23:09 2022-08-22 19:16:57 [INFO] [TRAIN] epoch: 11, iter: 12900/160000, loss: 0.9351, lr: 0.001114, batch_cost: 0.1900, reader_cost: 0.00033, ips: 42.1157 samples/sec | ETA 07:45:42 2022-08-22 19:17:08 [INFO] [TRAIN] epoch: 11, iter: 12950/160000, loss: 0.9154, lr: 0.001113, batch_cost: 0.2106, reader_cost: 0.00395, ips: 37.9794 samples/sec | ETA 08:36:14 2022-08-22 19:17:18 [INFO] [TRAIN] epoch: 11, iter: 13000/160000, loss: 1.0469, lr: 0.001113, batch_cost: 0.1935, reader_cost: 0.00060, ips: 41.3343 samples/sec | ETA 07:54:10 2022-08-22 19:17:18 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 174s - batch_cost: 0.1739 - reader cost: 0.0015 2022-08-22 19:20:12 [INFO] [EVAL] #Images: 2000 mIoU: 0.2793 Acc: 0.7319 Kappa: 0.7111 Dice: 0.3949 2022-08-22 19:20:12 [INFO] [EVAL] Class IoU: [0.6444 0.7538 0.9264 0.6943 0.6625 0.6907 0.7258 0.7066 0.4607 0.5924 0.4464 0.5148 0.6726 0.3129 0.2045 0.3312 0.502 0.3675 0.5357 0.3473 0.7138 0.3552 0.5645 0.4387 0.2753 0.3963 0.4979 0.3268 0.3314 0.2657 0.054 0.4134 0.2016 0.2586 0.3988 0.3335 0.3619 0.4212 0.2316 0.2543 0.1085 0.0975 0.291 0.2035 0.3047 0.0268 0.2459 0.4176 0.4619 0.466 0.4326 0.2889 0.0862 0.2199 0.6273 0.3749 0.7438 0.1828 0.3855 0.3119 0.0842 0.3432 0.2533 0.1087 0.355 0.6109 0.1647 0.3968 0.0209 0.2854 0.3269 0.4554 0.3901 0.1723 0.3808 0.2732 0.2978 0.1829 0.174 0.0243 0.588 0.2689 0.1913 0.0264 0.2955 0.4495 0.0522 0.0243 0.0589 0.4253 0.3212 0.0766 0.1514 0.1204 0.0119 0.0051 0.0074 0.0014 0.0516 0.1463 0.0586 0.0231 0.0388 0.6427 0.0895 0.3573 0.0672 0.4781 0.0256 0.028 0.0131 0.2507 0.0713 0.4965 0.6747 0. 0.1862 0.4854 0.0403 0.1904 0.4171 0.0009 0.2015 0.1355 0.1979 0.1003 0.423 0.3537 0. 0.2028 0.575 0. 0.0564 0.1545 0.0167 0.0801 0.1151 0.035 0.0704 0.3158 0.395 0.0288 0.251 0.0022 0.1672 0. 0.2193 0.0037 0.097 0.0302] 2022-08-22 19:20:12 [INFO] [EVAL] Class Precision: [0.7394 0.8433 0.9647 0.7929 0.7401 0.8815 0.913 0.7643 0.5854 0.7643 0.6621 0.5946 0.7586 0.5547 0.4058 0.5271 0.6881 0.7198 0.7166 0.5555 0.8056 0.5463 0.7338 0.5531 0.4828 0.4642 0.5386 0.7021 0.6248 0.3229 0.3421 0.5435 0.4796 0.3185 0.4776 0.4166 0.6087 0.6495 0.4022 0.5741 0.1497 0.3672 0.7234 0.578 0.4843 0.8767 0.5554 0.6168 0.4711 0.6196 0.5639 0.3532 0.2873 0.7437 0.6461 0.9079 0.7718 0.6925 0.7138 0.559 0.1121 0.4107 0.4061 0.853 0.4737 0.743 0.2422 0.512 0.7857 0.5206 0.4239 0.5707 0.6548 0.2142 0.6353 0.3987 0.3207 0.7287 0.8272 0.1513 0.7597 0.695 0.7353 0.1793 0.818 0.5847 0.4176 0.4048 0.4352 0.7125 0.3993 0.1404 0.3655 0.2917 0.2714 0.0182 0.0423 0.0894 0.3391 0.6477 0.2365 0.0316 0.4435 0.7276 0.3655 0.4498 0.548 0.7595 0.1192 0.0688 0.3635 0.3878 0.3347 0.6882 0.6822 0. 0.8189 0.5871 0.2389 0.471 0.6123 0.0457 0.79 0.4916 0.6621 0.4894 0.8435 0.5178 0. 0.316 0.7317 0. 0.7003 0.4347 0.703 0.4569 0.3553 0.1104 0.5064 0.5404 0.5263 0.0355 0.6475 0.0546 0.4308 0. 0.8545 0.1916 0.2676 0.6994] 2022-08-22 19:20:12 [INFO] [EVAL] Class Recall: [0.8338 0.8766 0.9589 0.8481 0.8634 0.7614 0.7797 0.9034 0.6838 0.7248 0.5781 0.7933 0.8558 0.4179 0.2919 0.4713 0.6498 0.4289 0.6798 0.4809 0.8624 0.5039 0.7099 0.6796 0.3904 0.7303 0.8682 0.3795 0.4137 0.6002 0.0602 0.6333 0.258 0.5789 0.7073 0.6258 0.4717 0.5451 0.3531 0.3134 0.283 0.1172 0.3275 0.239 0.4512 0.0269 0.3062 0.5639 0.9598 0.6527 0.6502 0.6135 0.1096 0.2379 0.9555 0.3897 0.9534 0.1989 0.4559 0.4137 0.2528 0.6762 0.4023 0.1107 0.5864 0.7745 0.3399 0.6383 0.021 0.3871 0.5881 0.6926 0.4911 0.4686 0.4874 0.4648 0.8064 0.1963 0.1805 0.0281 0.7223 0.3048 0.2055 0.03 0.3162 0.6602 0.0563 0.0252 0.0637 0.5134 0.6216 0.1442 0.2053 0.1701 0.0123 0.007 0.0089 0.0015 0.0573 0.1589 0.0723 0.0794 0.0408 0.8463 0.106 0.6348 0.0711 0.5634 0.0316 0.0452 0.0135 0.4151 0.0831 0.6406 0.984 0. 0.1942 0.737 0.0462 0.2422 0.5668 0.0009 0.2129 0.1575 0.2201 0.112 0.459 0.5275 0. 0.3614 0.7285 0. 0.0578 0.1934 0.0168 0.0886 0.1455 0.0487 0.0755 0.4318 0.6129 0.1322 0.2907 0.0022 0.2146 0. 0.2278 0.0037 0.1321 0.0306] 2022-08-22 19:20:12 [INFO] [EVAL] The model with the best validation mIoU (0.2827) was saved at iter 12000. 
2022-08-22 19:20:20 [INFO] [TRAIN] epoch: 11, iter: 13050/160000, loss: 0.9779, lr: 0.001113, batch_cost: 0.1681, reader_cost: 0.00397, ips: 47.5934 samples/sec | ETA 06:51:40 2022-08-22 19:20:28 [INFO] [TRAIN] epoch: 11, iter: 13100/160000, loss: 0.9527, lr: 0.001112, batch_cost: 0.1640, reader_cost: 0.00081, ips: 48.7833 samples/sec | ETA 06:41:30 2022-08-22 19:20:37 [INFO] [TRAIN] epoch: 11, iter: 13150/160000, loss: 0.9862, lr: 0.001112, batch_cost: 0.1649, reader_cost: 0.00030, ips: 48.5040 samples/sec | ETA 06:43:40 2022-08-22 19:20:46 [INFO] [TRAIN] epoch: 11, iter: 13200/160000, loss: 0.9601, lr: 0.001111, batch_cost: 0.1893, reader_cost: 0.00160, ips: 42.2667 samples/sec | ETA 07:43:05 2022-08-22 19:20:57 [INFO] [TRAIN] epoch: 11, iter: 13250/160000, loss: 1.0309, lr: 0.001111, batch_cost: 0.2124, reader_cost: 0.00062, ips: 37.6668 samples/sec | ETA 08:39:27 2022-08-22 19:21:05 [INFO] [TRAIN] epoch: 11, iter: 13300/160000, loss: 0.9490, lr: 0.001111, batch_cost: 0.1735, reader_cost: 0.00241, ips: 46.1152 samples/sec | ETA 07:04:09 2022-08-22 19:21:14 [INFO] [TRAIN] epoch: 11, iter: 13350/160000, loss: 0.8945, lr: 0.001110, batch_cost: 0.1639, reader_cost: 0.00034, ips: 48.7964 samples/sec | ETA 06:40:42 2022-08-22 19:21:22 [INFO] [TRAIN] epoch: 11, iter: 13400/160000, loss: 1.0011, lr: 0.001110, batch_cost: 0.1694, reader_cost: 0.00042, ips: 47.2390 samples/sec | ETA 06:53:46 2022-08-22 19:21:32 [INFO] [TRAIN] epoch: 11, iter: 13450/160000, loss: 0.9046, lr: 0.001110, batch_cost: 0.1982, reader_cost: 0.00051, ips: 40.3630 samples/sec | ETA 08:04:06 2022-08-22 19:21:41 [INFO] [TRAIN] epoch: 11, iter: 13500/160000, loss: 0.9864, lr: 0.001109, batch_cost: 0.1871, reader_cost: 0.00277, ips: 42.7604 samples/sec | ETA 07:36:48 2022-08-22 19:21:50 [INFO] [TRAIN] epoch: 11, iter: 13550/160000, loss: 0.9947, lr: 0.001109, batch_cost: 0.1716, reader_cost: 0.00058, ips: 46.6081 samples/sec | ETA 06:58:57 2022-08-22 19:21:58 [INFO] [TRAIN] epoch: 11, iter: 13600/160000, loss: 0.9548, lr: 0.001108, batch_cost: 0.1665, reader_cost: 0.00066, ips: 48.0551 samples/sec | ETA 06:46:12 2022-08-22 19:22:07 [INFO] [TRAIN] epoch: 11, iter: 13650/160000, loss: 0.9911, lr: 0.001108, batch_cost: 0.1780, reader_cost: 0.00039, ips: 44.9389 samples/sec | ETA 07:14:13 2022-08-22 19:22:15 [INFO] [TRAIN] epoch: 11, iter: 13700/160000, loss: 0.9590, lr: 0.001108, batch_cost: 0.1638, reader_cost: 0.00035, ips: 48.8301 samples/sec | ETA 06:39:28 2022-08-22 19:22:23 [INFO] [TRAIN] epoch: 11, iter: 13750/160000, loss: 0.9756, lr: 0.001107, batch_cost: 0.1530, reader_cost: 0.00043, ips: 52.2908 samples/sec | ETA 06:12:54 2022-08-22 19:22:31 [INFO] [TRAIN] epoch: 11, iter: 13800/160000, loss: 0.9526, lr: 0.001107, batch_cost: 0.1683, reader_cost: 0.00045, ips: 47.5331 samples/sec | ETA 06:50:06 2022-08-22 19:22:41 [INFO] [TRAIN] epoch: 11, iter: 13850/160000, loss: 1.0201, lr: 0.001107, batch_cost: 0.1908, reader_cost: 0.00034, ips: 41.9321 samples/sec | ETA 07:44:43 2022-08-22 19:22:55 [INFO] [TRAIN] epoch: 12, iter: 13900/160000, loss: 0.9662, lr: 0.001106, batch_cost: 0.2762, reader_cost: 0.07779, ips: 28.9635 samples/sec | ETA 11:12:34 2022-08-22 19:23:06 [INFO] [TRAIN] epoch: 12, iter: 13950/160000, loss: 0.9134, lr: 0.001106, batch_cost: 0.2335, reader_cost: 0.00374, ips: 34.2573 samples/sec | ETA 09:28:26 2022-08-22 19:23:16 [INFO] [TRAIN] epoch: 12, iter: 14000/160000, loss: 0.9690, lr: 0.001105, batch_cost: 0.1997, reader_cost: 0.00046, ips: 40.0675 samples/sec | ETA 08:05:50 2022-08-22 19:23:16 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 189s - batch_cost: 0.1894 - reader cost: 9.2856e-04 2022-08-22 19:26:26 [INFO] [EVAL] #Images: 2000 mIoU: 0.2817 Acc: 0.7379 Kappa: 0.7174 Dice: 0.4000 2022-08-22 19:26:26 [INFO] [EVAL] Class IoU: [0.6454 0.7529 0.9239 0.6957 0.6584 0.7111 0.745 0.731 0.477 0.6502 0.4511 0.5115 0.6719 0.2992 0.1775 0.352 0.4985 0.4032 0.5468 0.3378 0.6959 0.3637 0.5647 0.4462 0.2935 0.3411 0.4609 0.3509 0.3286 0.3011 0.1594 0.4226 0.2382 0.2771 0.3433 0.3092 0.3688 0.3316 0.2524 0.2551 0.1127 0.117 0.2987 0.2175 0.2903 0.2085 0.2554 0.4237 0.5057 0.4271 0.367 0.2788 0.1819 0.0539 0.669 0.4191 0.8116 0.3668 0.2559 0.2561 0.0592 0.3251 0.3314 0.1517 0.3737 0.5405 0.205 0.3799 0.0746 0.2439 0.376 0.464 0.3724 0.2225 0.4063 0.2543 0.3401 0.2461 0.3224 0.3157 0.521 0.3012 0.2712 0.028 0.3358 0.5004 0.015 0.028 0.041 0.4688 0.3762 0. 0.1579 0.0777 0.0157 0.0077 0.0044 0.0686 0.16 0.1424 0.0397 0.0294 0.0536 0.2076 0.0164 0.3173 0.0916 0.4871 0.063 0.1023 0.0637 0.0865 0.0919 0.3674 0.6213 0.0001 0.2471 0.5204 0.057 0.2312 0.3848 0.0031 0.1853 0.0097 0.2211 0.1689 0.4048 0.3339 0. 0.201 0.5083 0. 0.0302 0.1766 0.0461 0.0858 0.0995 0.002 0.0774 0.2882 0.1961 0.0034 0.2997 0.0038 0.0569 0. 0.2192 0.0332 0.0852 0.0501] 2022-08-22 19:26:26 [INFO] [EVAL] Class Precision: [0.7382 0.839 0.9628 0.7921 0.7381 0.8676 0.8605 0.8125 0.6174 0.7194 0.6444 0.6511 0.7829 0.5314 0.4409 0.591 0.6945 0.6762 0.6909 0.5861 0.7678 0.6555 0.7124 0.5469 0.4643 0.46 0.505 0.628 0.5374 0.446 0.3574 0.5562 0.3754 0.5089 0.4433 0.601 0.6099 0.6956 0.4208 0.5594 0.2912 0.258 0.6576 0.4839 0.4611 0.3552 0.7763 0.6117 0.6571 0.5294 0.4333 0.3408 0.3594 0.6201 0.6921 0.5337 0.8685 0.554 0.6976 0.3584 0.134 0.3756 0.4387 0.7611 0.51 0.5989 0.4771 0.6023 0.6448 0.4178 0.6212 0.6091 0.5973 0.2859 0.6356 0.4614 0.4003 0.5726 0.5802 0.3871 0.5915 0.6074 0.7448 0.0929 0.4917 0.6512 0.428 0.3965 0.5199 0.6507 0.4896 0. 0.2349 0.3715 0.1263 0.05 0.1373 0.453 0.4773 0.7761 0.1615 0.0437 0.4077 0.9971 0.1497 0.4241 0.2461 0.7787 0.1841 0.1752 0.2714 0.0923 0.1993 0.6071 0.6283 0.007 0.7231 0.5914 0.2313 0.4768 0.6861 0.2293 0.9622 0.7349 0.5769 0.4579 0.8574 0.4427 0. 0.2687 0.5932 0. 0.2445 0.5383 0.4702 0.4471 0.299 0.1104 0.4583 0.6625 0.8456 0.0073 0.6445 0.2193 0.4674 0. 0.8881 0.3615 0.2764 0.8913] 2022-08-22 19:26:26 [INFO] [EVAL] Class Recall: [0.837 0.8801 0.9582 0.8511 0.8591 0.7976 0.8474 0.8793 0.6771 0.8711 0.6006 0.7047 0.8257 0.4064 0.2291 0.4653 0.6385 0.4997 0.7239 0.4435 0.8813 0.4496 0.7315 0.7078 0.4438 0.569 0.8408 0.443 0.4582 0.4809 0.2234 0.6375 0.3945 0.3783 0.6035 0.3891 0.4826 0.388 0.3868 0.3193 0.1552 0.1763 0.3537 0.2832 0.4394 0.3356 0.2757 0.5796 0.6869 0.6883 0.7056 0.6051 0.2693 0.0557 0.9525 0.6613 0.9253 0.5205 0.2878 0.4729 0.0959 0.7075 0.5754 0.1593 0.583 0.8471 0.2643 0.5072 0.0778 0.3694 0.4878 0.6607 0.4973 0.501 0.5297 0.3617 0.6932 0.3014 0.4204 0.6312 0.814 0.374 0.299 0.0385 0.5143 0.6835 0.0153 0.0292 0.0426 0.6264 0.6189 0. 0.3249 0.0895 0.0176 0.009 0.0045 0.0748 0.1941 0.1485 0.05 0.0827 0.0582 0.2077 0.0181 0.5576 0.1274 0.5654 0.0874 0.1972 0.0768 0.5787 0.1457 0.4821 0.9825 0.0001 0.2729 0.8125 0.0703 0.3097 0.4671 0.0032 0.1867 0.0098 0.2638 0.2112 0.434 0.5762 0. 0.444 0.7804 0. 0.0333 0.2081 0.0486 0.096 0.1297 0.0021 0.0852 0.3378 0.2034 0.0064 0.3591 0.0039 0.0608 0. 0.2254 0.0353 0.1096 0.0504] 2022-08-22 19:26:26 [INFO] [EVAL] The model with the best validation mIoU (0.2827) was saved at iter 12000. 
2022-08-22 19:26:35 [INFO] [TRAIN] epoch: 12, iter: 14050/160000, loss: 0.9301, lr: 0.001105, batch_cost: 0.1763, reader_cost: 0.00385, ips: 45.3745 samples/sec | ETA 07:08:52 2022-08-22 19:26:44 [INFO] [TRAIN] epoch: 12, iter: 14100/160000, loss: 0.9506, lr: 0.001105, batch_cost: 0.1808, reader_cost: 0.00099, ips: 44.2577 samples/sec | ETA 07:19:32 2022-08-22 19:26:52 [INFO] [TRAIN] epoch: 12, iter: 14150/160000, loss: 1.0344, lr: 0.001104, batch_cost: 0.1553, reader_cost: 0.00060, ips: 51.5152 samples/sec | ETA 06:17:29 2022-08-22 19:27:00 [INFO] [TRAIN] epoch: 12, iter: 14200/160000, loss: 0.9929, lr: 0.001104, batch_cost: 0.1680, reader_cost: 0.00057, ips: 47.6143 samples/sec | ETA 06:48:16 2022-08-22 19:27:08 [INFO] [TRAIN] epoch: 12, iter: 14250/160000, loss: 0.9774, lr: 0.001103, batch_cost: 0.1521, reader_cost: 0.00061, ips: 52.5983 samples/sec | ETA 06:09:28 2022-08-22 19:27:16 [INFO] [TRAIN] epoch: 12, iter: 14300/160000, loss: 0.9540, lr: 0.001103, batch_cost: 0.1716, reader_cost: 0.00858, ips: 46.6203 samples/sec | ETA 06:56:42 2022-08-22 19:27:25 [INFO] [TRAIN] epoch: 12, iter: 14350/160000, loss: 1.0077, lr: 0.001103, batch_cost: 0.1645, reader_cost: 0.00041, ips: 48.6376 samples/sec | ETA 06:39:16 2022-08-22 19:27:33 [INFO] [TRAIN] epoch: 12, iter: 14400/160000, loss: 0.9573, lr: 0.001102, batch_cost: 0.1705, reader_cost: 0.00059, ips: 46.9235 samples/sec | ETA 06:53:43 2022-08-22 19:27:41 [INFO] [TRAIN] epoch: 12, iter: 14450/160000, loss: 0.9134, lr: 0.001102, batch_cost: 0.1489, reader_cost: 0.00177, ips: 53.7222 samples/sec | ETA 06:01:14 2022-08-22 19:27:49 [INFO] [TRAIN] epoch: 12, iter: 14500/160000, loss: 0.9544, lr: 0.001102, batch_cost: 0.1731, reader_cost: 0.00079, ips: 46.2121 samples/sec | ETA 06:59:48 2022-08-22 19:27:57 [INFO] [TRAIN] epoch: 12, iter: 14550/160000, loss: 0.9887, lr: 0.001101, batch_cost: 0.1635, reader_cost: 0.00039, ips: 48.9441 samples/sec | ETA 06:36:14 2022-08-22 19:28:06 [INFO] [TRAIN] epoch: 12, iter: 14600/160000, loss: 0.9831, lr: 0.001101, batch_cost: 0.1609, reader_cost: 0.00048, ips: 49.7270 samples/sec | ETA 06:29:51 2022-08-22 19:28:15 [INFO] [TRAIN] epoch: 12, iter: 14650/160000, loss: 0.9613, lr: 0.001100, batch_cost: 0.1918, reader_cost: 0.00219, ips: 41.7152 samples/sec | ETA 07:44:34 2022-08-22 19:28:25 [INFO] [TRAIN] epoch: 12, iter: 14700/160000, loss: 0.9425, lr: 0.001100, batch_cost: 0.1980, reader_cost: 0.00297, ips: 40.3979 samples/sec | ETA 07:59:33 2022-08-22 19:28:35 [INFO] [TRAIN] epoch: 12, iter: 14750/160000, loss: 0.9611, lr: 0.001100, batch_cost: 0.1912, reader_cost: 0.00058, ips: 41.8346 samples/sec | ETA 07:42:56 2022-08-22 19:28:44 [INFO] [TRAIN] epoch: 12, iter: 14800/160000, loss: 0.9050, lr: 0.001099, batch_cost: 0.1958, reader_cost: 0.00081, ips: 40.8673 samples/sec | ETA 07:53:43 2022-08-22 19:28:54 [INFO] [TRAIN] epoch: 12, iter: 14850/160000, loss: 0.9655, lr: 0.001099, batch_cost: 0.1942, reader_cost: 0.01159, ips: 41.1943 samples/sec | ETA 07:49:48 2022-08-22 19:29:05 [INFO] [TRAIN] epoch: 12, iter: 14900/160000, loss: 0.9569, lr: 0.001099, batch_cost: 0.2102, reader_cost: 0.00469, ips: 38.0506 samples/sec | ETA 08:28:26 2022-08-22 19:29:14 [INFO] [TRAIN] epoch: 12, iter: 14950/160000, loss: 0.9365, lr: 0.001098, batch_cost: 0.1879, reader_cost: 0.00055, ips: 42.5721 samples/sec | ETA 07:34:17 2022-08-22 19:29:25 [INFO] [TRAIN] epoch: 12, iter: 15000/160000, loss: 0.9495, lr: 0.001098, batch_cost: 0.2244, reader_cost: 0.00080, ips: 35.6540 samples/sec | ETA 09:02:14 2022-08-22 19:29:25 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 161s - batch_cost: 0.1610 - reader cost: 5.7509e-04 2022-08-22 19:32:06 [INFO] [EVAL] #Images: 2000 mIoU: 0.2839 Acc: 0.7363 Kappa: 0.7155 Dice: 0.4015 2022-08-22 19:32:06 [INFO] [EVAL] Class IoU: [0.6408 0.7579 0.9251 0.7009 0.6545 0.6922 0.7403 0.7235 0.4685 0.6312 0.4578 0.4978 0.6598 0.3473 0.1982 0.3529 0.5132 0.3041 0.5291 0.3554 0.6934 0.5135 0.5582 0.4361 0.3143 0.3683 0.4818 0.2582 0.3529 0.255 0.1941 0.4322 0.1865 0.2449 0.2718 0.2924 0.3617 0.4479 0.2371 0.3219 0.0854 0.0745 0.3053 0.2229 0.2672 0.173 0.3481 0.4134 0.5568 0.4054 0.4376 0.3033 0.1113 0.1235 0.7018 0.4212 0.8063 0.1936 0.0942 0.2322 0.0283 0.4517 0.2908 0.122 0.3715 0.629 0.1797 0.401 0.0581 0.2506 0.2894 0.433 0.3697 0.2391 0.4031 0.285 0.2834 0.1373 0.1858 0.2579 0.6284 0.2253 0.2089 0.1842 0.2493 0.4966 0.0365 0.0553 0.0997 0.3477 0.3624 0.0159 0.1239 0.0615 0. 0.0162 0.0714 0.0374 0.1303 0.1939 0.003 0.0084 0.1136 0.3393 0.0054 0.3331 0.0629 0.4494 0.1102 0.0007 0.0038 0.1631 0.0888 0.5202 0.6962 0.0011 0.2678 0.5184 0.1097 0.206 0.4301 0. 0.1976 0.0905 0.1809 0.1533 0.4236 0.3464 0. 0.1905 0.5712 0. 0.0645 0.1514 0.0338 0.1307 0.083 0.0139 0.1418 0.2728 0.3958 0.0044 0.262 0.0023 0.2015 0.0022 0.2756 0.0101 0.0533 0.0377] 2022-08-22 19:32:06 [INFO] [EVAL] Class Precision: [0.7231 0.8563 0.9609 0.8265 0.7263 0.865 0.8898 0.8144 0.6432 0.7992 0.6421 0.5754 0.7429 0.5062 0.384 0.5426 0.5939 0.7175 0.6417 0.4877 0.7642 0.5964 0.6804 0.581 0.4227 0.4912 0.7531 0.8122 0.6684 0.3745 0.3117 0.6151 0.3976 0.54 0.5064 0.5541 0.626 0.6939 0.4204 0.4444 0.3315 0.369 0.6524 0.5746 0.3662 0.4493 0.8106 0.6674 0.6057 0.4801 0.5689 0.4541 0.342 0.7249 0.7238 0.5642 0.8707 0.6773 0.6823 0.452 0.1507 0.6037 0.5312 0.8237 0.4973 0.7757 0.3344 0.5603 0.2763 0.4633 0.6655 0.6037 0.6072 0.311 0.5699 0.4581 0.3559 0.7247 0.5967 0.3531 0.8025 0.8022 0.7124 0.3549 0.6484 0.647 0.3461 0.2933 0.2035 0.4533 0.4664 0.0449 0.4499 0.3483 0. 0.0621 0.4441 0.3744 0.6337 0.7713 0.0272 0.0143 0.5237 0.5223 0.0741 0.4214 0.4831 0.6629 0.2171 0.0017 0.0953 0.1702 0.4108 0.8152 0.7 0.0549 0.7078 0.5793 0.1652 0.4152 0.6009 0.0126 0.8389 0.6955 0.5773 0.6606 0.8501 0.4439 0. 0.4621 0.8212 0. 0.3578 0.4691 0.7921 0.3129 0.3287 0.1242 0.4555 0.7106 0.6562 0.0135 0.6739 0.1195 0.6096 0.003 0.8038 0.3733 0.6084 0.9625] 2022-08-22 19:32:06 [INFO] [EVAL] Class Recall: [0.8491 0.8683 0.9613 0.8219 0.8688 0.7761 0.8151 0.8663 0.633 0.7502 0.6146 0.7869 0.8549 0.5252 0.2906 0.5024 0.7907 0.3455 0.751 0.5672 0.8823 0.7871 0.7565 0.6362 0.5507 0.5954 0.5723 0.2746 0.4278 0.4441 0.3395 0.5925 0.26 0.3095 0.3697 0.3824 0.4613 0.5582 0.3521 0.5388 0.1032 0.0853 0.3646 0.2669 0.497 0.2196 0.379 0.5206 0.8734 0.7226 0.6547 0.4772 0.1417 0.1296 0.9586 0.6243 0.916 0.2133 0.0986 0.3231 0.0337 0.6421 0.3911 0.1252 0.5949 0.7688 0.2798 0.5852 0.0685 0.353 0.3387 0.605 0.486 0.5083 0.5792 0.43 0.5818 0.1448 0.2125 0.4888 0.7433 0.2385 0.2281 0.277 0.2883 0.6811 0.0392 0.0638 0.1636 0.599 0.6191 0.024 0.146 0.0695 0. 0.0214 0.0784 0.0399 0.1409 0.2057 0.0034 0.0196 0.1266 0.4919 0.0058 0.6137 0.0675 0.5825 0.1829 0.0013 0.004 0.796 0.1017 0.5897 0.9922 0.0011 0.3011 0.8313 0.2463 0.2902 0.6021 0. 0.2054 0.0942 0.2085 0.1665 0.4578 0.6119 0. 0.2448 0.6524 0. 0.073 0.1827 0.0341 0.1833 0.1 0.0155 0.1708 0.3069 0.4994 0.0064 0.3 0.0023 0.2313 0.0083 0.2955 0.0103 0.0552 0.0378] 2022-08-22 19:32:07 [INFO] [EVAL] The model with the best validation mIoU (0.2839) was saved at iter 15000. 
2022-08-22 19:32:17 [INFO] [TRAIN] epoch: 12, iter: 15050/160000, loss: 0.9451, lr: 0.001097, batch_cost: 0.2080, reader_cost: 0.00390, ips: 38.4582 samples/sec | ETA 08:22:32 2022-08-22 19:32:26 [INFO] [TRAIN] epoch: 12, iter: 15100/160000, loss: 0.9736, lr: 0.001097, batch_cost: 0.1727, reader_cost: 0.00101, ips: 46.3146 samples/sec | ETA 06:57:08 2022-08-22 19:32:35 [INFO] [TRAIN] epoch: 12, iter: 15150/160000, loss: 1.0015, lr: 0.001097, batch_cost: 0.1838, reader_cost: 0.00050, ips: 43.5287 samples/sec | ETA 07:23:41 2022-08-22 19:32:46 [INFO] [TRAIN] epoch: 13, iter: 15200/160000, loss: 0.9381, lr: 0.001096, batch_cost: 0.2229, reader_cost: 0.05112, ips: 35.8910 samples/sec | ETA 08:57:55 2022-08-22 19:32:53 [INFO] [TRAIN] epoch: 13, iter: 15250/160000, loss: 0.9574, lr: 0.001096, batch_cost: 0.1502, reader_cost: 0.00266, ips: 53.2451 samples/sec | ETA 06:02:28 2022-08-22 19:33:02 [INFO] [TRAIN] epoch: 13, iter: 15300/160000, loss: 0.9691, lr: 0.001096, batch_cost: 0.1613, reader_cost: 0.00109, ips: 49.5997 samples/sec | ETA 06:28:58 2022-08-22 19:33:09 [INFO] [TRAIN] epoch: 13, iter: 15350/160000, loss: 0.8982, lr: 0.001095, batch_cost: 0.1500, reader_cost: 0.00062, ips: 53.3198 samples/sec | ETA 06:01:43 2022-08-22 19:33:17 [INFO] [TRAIN] epoch: 13, iter: 15400/160000, loss: 0.9804, lr: 0.001095, batch_cost: 0.1637, reader_cost: 0.00069, ips: 48.8786 samples/sec | ETA 06:34:26 2022-08-22 19:33:27 [INFO] [TRAIN] epoch: 13, iter: 15450/160000, loss: 0.9187, lr: 0.001094, batch_cost: 0.1852, reader_cost: 0.00062, ips: 43.1962 samples/sec | ETA 07:26:10 2022-08-22 19:33:35 [INFO] [TRAIN] epoch: 13, iter: 15500/160000, loss: 0.8818, lr: 0.001094, batch_cost: 0.1745, reader_cost: 0.00052, ips: 45.8576 samples/sec | ETA 07:00:08 2022-08-22 19:33:44 [INFO] [TRAIN] epoch: 13, iter: 15550/160000, loss: 0.9606, lr: 0.001094, batch_cost: 0.1768, reader_cost: 0.00123, ips: 45.2464 samples/sec | ETA 07:05:40 2022-08-22 19:33:54 [INFO] [TRAIN] epoch: 13, iter: 15600/160000, loss: 0.9473, lr: 0.001093, batch_cost: 0.1902, reader_cost: 0.00043, ips: 42.0673 samples/sec | ETA 07:37:40 2022-08-22 19:34:02 [INFO] [TRAIN] epoch: 13, iter: 15650/160000, loss: 0.9518, lr: 0.001093, batch_cost: 0.1769, reader_cost: 0.00074, ips: 45.2262 samples/sec | ETA 07:05:33 2022-08-22 19:34:12 [INFO] [TRAIN] epoch: 13, iter: 15700/160000, loss: 0.9093, lr: 0.001092, batch_cost: 0.1896, reader_cost: 0.00062, ips: 42.1887 samples/sec | ETA 07:36:02 2022-08-22 19:34:21 [INFO] [TRAIN] epoch: 13, iter: 15750/160000, loss: 0.9121, lr: 0.001092, batch_cost: 0.1901, reader_cost: 0.01262, ips: 42.0766 samples/sec | ETA 07:37:06 2022-08-22 19:34:31 [INFO] [TRAIN] epoch: 13, iter: 15800/160000, loss: 0.9665, lr: 0.001092, batch_cost: 0.1974, reader_cost: 0.00045, ips: 40.5194 samples/sec | ETA 07:54:30 2022-08-22 19:34:41 [INFO] [TRAIN] epoch: 13, iter: 15850/160000, loss: 0.9034, lr: 0.001091, batch_cost: 0.2000, reader_cost: 0.00523, ips: 39.9921 samples/sec | ETA 08:00:35 2022-08-22 19:34:52 [INFO] [TRAIN] epoch: 13, iter: 15900/160000, loss: 0.8615, lr: 0.001091, batch_cost: 0.2073, reader_cost: 0.00044, ips: 38.5872 samples/sec | ETA 08:17:55 2022-08-22 19:35:02 [INFO] [TRAIN] epoch: 13, iter: 15950/160000, loss: 0.9058, lr: 0.001091, batch_cost: 0.2050, reader_cost: 0.00044, ips: 39.0187 samples/sec | ETA 08:12:14 2022-08-22 19:35:13 [INFO] [TRAIN] epoch: 13, iter: 16000/160000, loss: 0.9102, lr: 0.001090, batch_cost: 0.2233, reader_cost: 0.00081, ips: 35.8274 samples/sec | ETA 08:55:54 2022-08-22 19:35:13 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 179s - batch_cost: 0.1788 - reader cost: 9.0158e-04 2022-08-22 19:38:12 [INFO] [EVAL] #Images: 2000 mIoU: 0.2860 Acc: 0.7390 Kappa: 0.7183 Dice: 0.4068 2022-08-22 19:38:12 [INFO] [EVAL] Class IoU: [0.6459 0.7591 0.9255 0.6992 0.6795 0.7073 0.7411 0.7392 0.4607 0.6267 0.4464 0.4599 0.677 0.3263 0.2093 0.3509 0.4553 0.444 0.5473 0.3673 0.7118 0.4129 0.5582 0.434 0.2756 0.2515 0.5118 0.3024 0.2626 0.2295 0.1183 0.4307 0.2459 0.3281 0.1943 0.377 0.3667 0.3896 0.2286 0.2585 0.1111 0.1077 0.3314 0.2121 0.2586 0.2541 0.3544 0.3862 0.5807 0.4347 0.4307 0.3258 0.0836 0.2425 0.6735 0.4702 0.7982 0.3743 0.1635 0.2551 0.0924 0.1806 0.2757 0.0489 0.3438 0.5619 0.2483 0.3324 0.0372 0.2759 0.3004 0.4501 0.3091 0.1828 0.3817 0.26 0.3297 0.1183 0.4055 0.2646 0.4804 0.2815 0.1498 0.065 0.2855 0.4876 0.0597 0.0342 0.0325 0.4355 0.307 0.0583 0.1579 0.0492 0.0175 0.0008 0.2097 0.0349 0.1833 0.3108 0.0084 0.0231 0.0852 0.3601 0.0094 0.3127 0.1909 0.4897 0.0805 0.0634 0.0244 0.3345 0.1132 0.3509 0.5022 0. 0.3852 0.5903 0.0646 0.162 0.4446 0.0012 0.2184 0.057 0.1943 0.1606 0.4208 0.4058 0. 0.2006 0.5098 0. 0.222 0.1393 0.039 0.0951 0.0942 0.0063 0.1659 0.2977 0.2208 0.0268 0.1551 0.0391 0.1854 0.0009 0.2493 0.0024 0.0819 0.0746] 2022-08-22 19:38:12 [INFO] [EVAL] Class Precision: [0.7295 0.8464 0.9592 0.8021 0.7596 0.819 0.8109 0.8476 0.6398 0.6939 0.7013 0.7147 0.7868 0.5396 0.4385 0.4806 0.6428 0.7076 0.732 0.5459 0.8163 0.5487 0.8087 0.5256 0.4457 0.5468 0.5856 0.6964 0.6076 0.4327 0.3747 0.5713 0.3924 0.5244 0.5327 0.5542 0.5847 0.6669 0.4273 0.5716 0.3454 0.2703 0.6235 0.5206 0.4171 0.38 0.6463 0.4802 0.6458 0.5249 0.6872 0.4579 0.3254 0.6377 0.7046 0.649 0.8463 0.4822 0.7077 0.6707 0.2009 0.3833 0.3399 0.8898 0.3921 0.6659 0.513 0.5409 0.3004 0.545 0.5775 0.6136 0.3981 0.2156 0.5091 0.5101 0.4887 0.5557 0.6054 0.4038 0.5207 0.6854 0.7667 0.1268 0.5743 0.6701 0.2648 0.3225 0.3601 0.6417 0.3731 0.0975 0.37 0.2766 0.09 0.0089 0.3075 0.2945 0.5058 0.5335 0.1737 0.0316 0.729 0.5059 0.4427 0.39 0.5808 0.7828 0.2554 0.1273 0.2709 0.6363 0.3086 0.8048 0.5047 0.0056 0.7125 0.6643 0.1322 0.4917 0.6979 0.2677 0.8788 0.7717 0.6583 0.5761 0.7251 0.5505 0. 0.2548 0.719 0. 0.5441 0.5471 0.6695 0.488 0.2818 0.0882 0.4059 0.6279 0.263 0.0346 0.8406 0.4327 0.7741 0.0018 0.8969 0.1754 0.4942 0.8905] 2022-08-22 19:38:12 [INFO] [EVAL] Class Recall: [0.8493 0.8803 0.9634 0.845 0.8656 0.8383 0.896 0.8525 0.622 0.8663 0.5512 0.5633 0.8291 0.4522 0.2859 0.5651 0.6094 0.5437 0.6843 0.5289 0.8476 0.6251 0.6431 0.7136 0.4193 0.3178 0.8024 0.3483 0.3163 0.3283 0.1474 0.6363 0.397 0.4672 0.2343 0.5411 0.4958 0.4838 0.3295 0.3207 0.1408 0.1519 0.4142 0.2636 0.4048 0.4342 0.4396 0.6634 0.8522 0.7166 0.5356 0.5305 0.1012 0.2812 0.9386 0.6306 0.9336 0.6259 0.1753 0.2917 0.1461 0.2545 0.5932 0.0492 0.7364 0.7825 0.3249 0.4631 0.0407 0.3586 0.3851 0.6282 0.5805 0.5459 0.6041 0.3465 0.5032 0.1306 0.5513 0.4341 0.8613 0.3233 0.157 0.1175 0.3622 0.6416 0.0715 0.0368 0.0344 0.5754 0.6344 0.1266 0.216 0.0565 0.0213 0.0009 0.3973 0.0381 0.2233 0.4267 0.0088 0.0787 0.088 0.5554 0.0096 0.6121 0.2214 0.5667 0.1053 0.112 0.0262 0.4136 0.1517 0.3835 0.9902 0. 0.4561 0.8412 0.1121 0.1946 0.5505 0.0012 0.2252 0.058 0.216 0.1821 0.5006 0.6068 0. 0.4852 0.6367 0. 0.2727 0.1575 0.0397 0.1056 0.1239 0.0067 0.219 0.3615 0.5789 0.106 0.1598 0.0412 0.196 0.002 0.2567 0.0024 0.0894 0.0753] 2022-08-22 19:38:12 [INFO] [EVAL] The model with the best validation mIoU (0.2860) was saved at iter 16000. 
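Note: the throughput and ETA figures in the [TRAIN] lines are internally consistent and easy to sanity-check by hand: ips is roughly the number of samples per step divided by batch_cost, and the ETA is roughly the remaining iterations times the current batch_cost. A rough check against the iter 15050 entry above (numbers copied from the log; small rounding differences aside):

# Rough sanity check of the [TRAIN] throughput/ETA arithmetic (illustrative only).
batch_cost = 0.2080                         # seconds per iteration at iter 15050
ips = 38.4582                               # samples/sec reported on the same line
print(ips * batch_cost)                     # ~8.0 samples processed per iteration
remaining_iters = 160000 - 15050
print(remaining_iters * batch_cost / 3600)  # ~8.37 h, matching the reported ETA 08:22:32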
2022-08-22 19:38:20 [INFO] [TRAIN] epoch: 13, iter: 16050/160000, loss: 0.9701, lr: 0.001090, batch_cost: 0.1562, reader_cost: 0.00383, ips: 51.2010 samples/sec | ETA 06:14:51 2022-08-22 19:38:29 [INFO] [TRAIN] epoch: 13, iter: 16100/160000, loss: 0.9029, lr: 0.001089, batch_cost: 0.1722, reader_cost: 0.00100, ips: 46.4451 samples/sec | ETA 06:53:06 2022-08-22 19:38:38 [INFO] [TRAIN] epoch: 13, iter: 16150/160000, loss: 0.9394, lr: 0.001089, batch_cost: 0.1808, reader_cost: 0.00064, ips: 44.2487 samples/sec | ETA 07:13:27 2022-08-22 19:38:47 [INFO] [TRAIN] epoch: 13, iter: 16200/160000, loss: 1.0104, lr: 0.001089, batch_cost: 0.1951, reader_cost: 0.00035, ips: 41.0066 samples/sec | ETA 07:47:34 2022-08-22 19:38:56 [INFO] [TRAIN] epoch: 13, iter: 16250/160000, loss: 0.9542, lr: 0.001088, batch_cost: 0.1740, reader_cost: 0.00049, ips: 45.9901 samples/sec | ETA 06:56:45 2022-08-22 19:39:04 [INFO] [TRAIN] epoch: 13, iter: 16300/160000, loss: 0.9414, lr: 0.001088, batch_cost: 0.1539, reader_cost: 0.00211, ips: 51.9934 samples/sec | ETA 06:08:30 2022-08-22 19:39:13 [INFO] [TRAIN] epoch: 13, iter: 16350/160000, loss: 0.9698, lr: 0.001088, batch_cost: 0.1818, reader_cost: 0.00050, ips: 43.9949 samples/sec | ETA 07:15:21 2022-08-22 19:39:21 [INFO] [TRAIN] epoch: 13, iter: 16400/160000, loss: 0.9504, lr: 0.001087, batch_cost: 0.1553, reader_cost: 0.00070, ips: 51.5128 samples/sec | ETA 06:11:41 2022-08-22 19:39:33 [INFO] [TRAIN] epoch: 14, iter: 16450/160000, loss: 0.8804, lr: 0.001087, batch_cost: 0.2436, reader_cost: 0.09398, ips: 32.8453 samples/sec | ETA 09:42:43 2022-08-22 19:39:41 [INFO] [TRAIN] epoch: 14, iter: 16500/160000, loss: 0.8621, lr: 0.001086, batch_cost: 0.1690, reader_cost: 0.00086, ips: 47.3267 samples/sec | ETA 06:44:16 2022-08-22 19:39:51 [INFO] [TRAIN] epoch: 14, iter: 16550/160000, loss: 0.9384, lr: 0.001086, batch_cost: 0.1941, reader_cost: 0.00049, ips: 41.2141 samples/sec | ETA 07:44:04 2022-08-22 19:40:00 [INFO] [TRAIN] epoch: 14, iter: 16600/160000, loss: 0.9024, lr: 0.001086, batch_cost: 0.1774, reader_cost: 0.00053, ips: 45.1059 samples/sec | ETA 07:03:53 2022-08-22 19:40:08 [INFO] [TRAIN] epoch: 14, iter: 16650/160000, loss: 0.8623, lr: 0.001085, batch_cost: 0.1709, reader_cost: 0.00069, ips: 46.8178 samples/sec | ETA 06:48:14 2022-08-22 19:40:17 [INFO] [TRAIN] epoch: 14, iter: 16700/160000, loss: 0.9378, lr: 0.001085, batch_cost: 0.1666, reader_cost: 0.00056, ips: 48.0325 samples/sec | ETA 06:37:47 2022-08-22 19:40:27 [INFO] [TRAIN] epoch: 14, iter: 16750/160000, loss: 0.8896, lr: 0.001085, batch_cost: 0.1996, reader_cost: 0.00061, ips: 40.0704 samples/sec | ETA 07:56:39 2022-08-22 19:40:37 [INFO] [TRAIN] epoch: 14, iter: 16800/160000, loss: 0.9163, lr: 0.001084, batch_cost: 0.2022, reader_cost: 0.00087, ips: 39.5696 samples/sec | ETA 08:02:31 2022-08-22 19:40:47 [INFO] [TRAIN] epoch: 14, iter: 16850/160000, loss: 1.0005, lr: 0.001084, batch_cost: 0.1938, reader_cost: 0.00317, ips: 41.2829 samples/sec | ETA 07:42:20 2022-08-22 19:40:57 [INFO] [TRAIN] epoch: 14, iter: 16900/160000, loss: 0.8944, lr: 0.001083, batch_cost: 0.2138, reader_cost: 0.00555, ips: 37.4264 samples/sec | ETA 08:29:48 2022-08-22 19:41:07 [INFO] [TRAIN] epoch: 14, iter: 16950/160000, loss: 0.8853, lr: 0.001083, batch_cost: 0.1987, reader_cost: 0.00033, ips: 40.2654 samples/sec | ETA 07:53:41 2022-08-22 19:41:18 [INFO] [TRAIN] epoch: 14, iter: 17000/160000, loss: 0.9177, lr: 0.001083, batch_cost: 0.2107, reader_cost: 0.00049, ips: 37.9700 samples/sec | ETA 08:22:09 2022-08-22 19:41:18 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 189s - batch_cost: 0.1891 - reader cost: 0.0010 2022-08-22 19:44:27 [INFO] [EVAL] #Images: 2000 mIoU: 0.2895 Acc: 0.7363 Kappa: 0.7155 Dice: 0.4107 2022-08-22 19:44:27 [INFO] [EVAL] Class IoU: [0.643 0.7546 0.9241 0.7 0.6694 0.6828 0.7371 0.7312 0.4736 0.6355 0.4674 0.5112 0.6729 0.2664 0.1695 0.3403 0.4801 0.4093 0.5565 0.3359 0.7359 0.4463 0.5575 0.431 0.2379 0.392 0.4542 0.3788 0.375 0.1861 0.1784 0.4812 0.2449 0.2743 0.2813 0.3409 0.3711 0.4658 0.2444 0.3227 0.128 0.104 0.3319 0.2247 0.2677 0.1686 0.2806 0.396 0.5013 0.3813 0.4308 0.2864 0.1038 0.0691 0.6864 0.3 0.7965 0.2781 0.3767 0.2139 0.0362 0.1789 0.2504 0.173 0.3993 0.5756 0.2183 0.3483 0.0267 0.249 0.2352 0.4442 0.3254 0.1901 0.3887 0.2728 0.4322 0.1551 0.2431 0.0719 0.4546 0.3117 0.2509 0.0117 0.2657 0.4892 0.0415 0.037 0.1845 0.4141 0.3244 0.0269 0.21 0.0938 0.0259 0.0091 0.024 0.0279 0.2834 0.2998 0.0107 0.0138 0.129 0.4672 0.0212 0.3645 0.1588 0.4791 0.092 0.1387 0.0568 0.3757 0.1173 0.3412 0.6717 0. 0.372 0.5535 0.0557 0.1528 0.4574 0. 0.2232 0.1133 0.2194 0.1469 0.3987 0.3595 0. 0.1764 0.4534 0. 0.2105 0.2529 0.0602 0.1157 0.0796 0.001 0.1226 0.296 0.2612 0. 0.3642 0.02 0.2141 0. 0.3311 0.028 0.051 0.0153] 2022-08-22 19:44:27 [INFO] [EVAL] Class Precision: [0.7239 0.8457 0.9712 0.804 0.7607 0.8885 0.8859 0.8108 0.5908 0.7223 0.6703 0.5924 0.7615 0.5798 0.4834 0.5444 0.565 0.6542 0.7804 0.5933 0.8481 0.5558 0.7986 0.5974 0.4894 0.442 0.6322 0.6562 0.6181 0.3269 0.3185 0.6808 0.5703 0.3317 0.5433 0.5408 0.5427 0.6357 0.4131 0.5292 0.3541 0.2943 0.6221 0.5845 0.3852 0.3164 0.4284 0.6266 0.6113 0.4308 0.5465 0.4043 0.2547 0.5108 0.7212 0.3531 0.8636 0.6535 0.8718 0.3441 0.2922 0.3311 0.2876 0.6773 0.5157 0.6465 0.4994 0.5965 0.3104 0.4755 0.6861 0.5496 0.5889 0.2287 0.5757 0.4136 0.6772 0.48 0.5902 0.2676 0.4825 0.7111 0.751 0.0583 0.6571 0.7268 0.6864 0.3058 0.4268 0.722 0.4059 0.0381 0.3479 0.2934 0.0864 0.0234 0.2312 0.2606 0.5097 0.4916 0.3243 0.0206 0.7721 0.6888 0.2027 0.4825 0.6498 0.8554 0.1677 0.2331 0.1913 0.7676 0.2459 0.7369 0.6773 0. 0.7879 0.6284 0.123 0.5245 0.6058 0. 0.8976 0.6183 0.6059 0.5315 0.6943 0.4531 0. 0.4966 0.5081 0. 0.7876 0.5839 0.6574 0.2834 0.2965 0.1071 0.511 0.6238 0.4571 0. 0.6397 0.1047 0.6435 0. 0.8647 0.4692 0.5774 0.9282] 2022-08-22 19:44:27 [INFO] [EVAL] Class Recall: [0.852 0.8751 0.9501 0.8441 0.8479 0.7468 0.8145 0.8816 0.7048 0.8409 0.6069 0.7886 0.8526 0.3301 0.207 0.4757 0.7617 0.5223 0.6598 0.4363 0.8476 0.6938 0.6487 0.6075 0.3164 0.7762 0.6173 0.4726 0.488 0.3016 0.2886 0.6214 0.3003 0.6131 0.3684 0.4797 0.54 0.6355 0.3745 0.4526 0.167 0.1385 0.4157 0.2674 0.4675 0.2651 0.4484 0.5183 0.7359 0.7685 0.6704 0.4954 0.1491 0.074 0.9342 0.6661 0.9112 0.3262 0.3988 0.3611 0.0397 0.2802 0.6596 0.1886 0.6388 0.84 0.2795 0.4556 0.0284 0.3432 0.2636 0.6983 0.4211 0.5297 0.5448 0.4448 0.5444 0.1864 0.2925 0.0895 0.8872 0.3569 0.2736 0.0145 0.3085 0.5994 0.0423 0.0403 0.2453 0.4927 0.6178 0.0832 0.3462 0.1211 0.0356 0.0147 0.0261 0.0304 0.3896 0.4345 0.0109 0.0398 0.1341 0.5923 0.0231 0.5983 0.1736 0.5213 0.1693 0.2551 0.0748 0.4239 0.1831 0.3886 0.9879 0. 0.4134 0.8227 0.0923 0.1774 0.6513 0. 0.2291 0.1218 0.2559 0.1687 0.4835 0.6349 0. 0.2148 0.8083 0. 0.2232 0.3085 0.0621 0.1635 0.0981 0.001 0.139 0.3604 0.3787 0. 0.4582 0.0242 0.2429 0. 0.3492 0.0289 0.053 0.0153] 2022-08-22 19:44:27 [INFO] [EVAL] The model with the best validation mIoU (0.2895) was saved at iter 17000. 
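Note: the three per-class arrays are redundant in a useful way: for any class, IoU, precision (P) and recall (R) satisfy IoU = P*R / (P + R - P*R), which is handy for spotting transcription errors when copying numbers out of the log. For example, class 0 of the iter 17000 evaluation above:

# Per-class consistency check: IoU = P*R / (P + R - P*R)  (standard set identity)
p, r = 0.7239, 0.852               # class 0 precision / recall at iter 17000
print(p * r / (p + r - p * r))     # ~0.643, the class 0 IoU reported above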
2022-08-22 19:44:36 [INFO] [TRAIN] epoch: 14, iter: 17050/160000, loss: 0.9397, lr: 0.001082, batch_cost: 0.1746, reader_cost: 0.00266, ips: 45.8219 samples/sec | ETA 06:55:57 2022-08-22 19:44:46 [INFO] [TRAIN] epoch: 14, iter: 17100/160000, loss: 0.9518, lr: 0.001082, batch_cost: 0.2012, reader_cost: 0.00083, ips: 39.7620 samples/sec | ETA 07:59:11 2022-08-22 19:44:54 [INFO] [TRAIN] epoch: 14, iter: 17150/160000, loss: 0.9490, lr: 0.001082, batch_cost: 0.1651, reader_cost: 0.00053, ips: 48.4488 samples/sec | ETA 06:33:07 2022-08-22 19:45:02 [INFO] [TRAIN] epoch: 14, iter: 17200/160000, loss: 0.8749, lr: 0.001081, batch_cost: 0.1603, reader_cost: 0.00544, ips: 49.9050 samples/sec | ETA 06:21:31 2022-08-22 19:45:12 [INFO] [TRAIN] epoch: 14, iter: 17250/160000, loss: 0.9089, lr: 0.001081, batch_cost: 0.1911, reader_cost: 0.00252, ips: 41.8684 samples/sec | ETA 07:34:35 2022-08-22 19:45:20 [INFO] [TRAIN] epoch: 14, iter: 17300/160000, loss: 0.9308, lr: 0.001080, batch_cost: 0.1685, reader_cost: 0.00041, ips: 47.4646 samples/sec | ETA 06:40:51 2022-08-22 19:45:30 [INFO] [TRAIN] epoch: 14, iter: 17350/160000, loss: 0.9143, lr: 0.001080, batch_cost: 0.1968, reader_cost: 0.00096, ips: 40.6563 samples/sec | ETA 07:47:49 2022-08-22 19:45:39 [INFO] [TRAIN] epoch: 14, iter: 17400/160000, loss: 0.9426, lr: 0.001080, batch_cost: 0.1851, reader_cost: 0.00042, ips: 43.2129 samples/sec | ETA 07:19:59 2022-08-22 19:45:49 [INFO] [TRAIN] epoch: 14, iter: 17450/160000, loss: 0.8660, lr: 0.001079, batch_cost: 0.1980, reader_cost: 0.00058, ips: 40.3950 samples/sec | ETA 07:50:31 2022-08-22 19:45:57 [INFO] [TRAIN] epoch: 14, iter: 17500/160000, loss: 0.9443, lr: 0.001079, batch_cost: 0.1543, reader_cost: 0.00050, ips: 51.8554 samples/sec | ETA 06:06:24 2022-08-22 19:46:06 [INFO] [TRAIN] epoch: 14, iter: 17550/160000, loss: 0.8745, lr: 0.001078, batch_cost: 0.1736, reader_cost: 0.00060, ips: 46.0957 samples/sec | ETA 06:52:02 2022-08-22 19:46:14 [INFO] [TRAIN] epoch: 14, iter: 17600/160000, loss: 0.8854, lr: 0.001078, batch_cost: 0.1572, reader_cost: 0.00315, ips: 50.9029 samples/sec | ETA 06:12:59 2022-08-22 19:46:22 [INFO] [TRAIN] epoch: 14, iter: 17650/160000, loss: 0.8944, lr: 0.001078, batch_cost: 0.1635, reader_cost: 0.00053, ips: 48.9195 samples/sec | ETA 06:27:59 2022-08-22 19:46:32 [INFO] [TRAIN] epoch: 15, iter: 17700/160000, loss: 1.0108, lr: 0.001077, batch_cost: 0.2050, reader_cost: 0.04542, ips: 39.0221 samples/sec | ETA 08:06:13 2022-08-22 19:46:41 [INFO] [TRAIN] epoch: 15, iter: 17750/160000, loss: 0.8537, lr: 0.001077, batch_cost: 0.1897, reader_cost: 0.00050, ips: 42.1627 samples/sec | ETA 07:29:50 2022-08-22 19:46:52 [INFO] [TRAIN] epoch: 15, iter: 17800/160000, loss: 0.9118, lr: 0.001077, batch_cost: 0.2095, reader_cost: 0.00492, ips: 38.1879 samples/sec | ETA 08:16:29 2022-08-22 19:47:02 [INFO] [TRAIN] epoch: 15, iter: 17850/160000, loss: 0.9618, lr: 0.001076, batch_cost: 0.2078, reader_cost: 0.00101, ips: 38.5015 samples/sec | ETA 08:12:16 2022-08-22 19:47:14 [INFO] [TRAIN] epoch: 15, iter: 17900/160000, loss: 0.8932, lr: 0.001076, batch_cost: 0.2287, reader_cost: 0.00066, ips: 34.9759 samples/sec | ETA 09:01:42 2022-08-22 19:47:25 [INFO] [TRAIN] epoch: 15, iter: 17950/160000, loss: 0.8821, lr: 0.001075, batch_cost: 0.2221, reader_cost: 0.00058, ips: 36.0225 samples/sec | ETA 08:45:46 2022-08-22 19:47:36 [INFO] [TRAIN] epoch: 15, iter: 18000/160000, loss: 0.8926, lr: 0.001075, batch_cost: 0.2299, reader_cost: 0.00047, ips: 34.7942 samples/sec | ETA 09:04:09 2022-08-22 19:47:36 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 179s - batch_cost: 0.1789 - reader cost: 0.0011 2022-08-22 19:50:35 [INFO] [EVAL] #Images: 2000 mIoU: 0.2948 Acc: 0.7441 Kappa: 0.7241 Dice: 0.4160 2022-08-22 19:50:35 [INFO] [EVAL] Class IoU: [0.6538 0.7539 0.928 0.695 0.6779 0.7311 0.7345 0.7416 0.4741 0.6585 0.471 0.4875 0.6716 0.2975 0.1793 0.3636 0.5201 0.409 0.5607 0.3587 0.7067 0.4382 0.5768 0.4417 0.3238 0.4191 0.4763 0.3572 0.2397 0.3009 0.1754 0.4113 0.2715 0.3084 0.3553 0.3632 0.3629 0.4627 0.244 0.3089 0.1501 0.0658 0.32 0.2257 0.2418 0.2381 0.3045 0.4418 0.5606 0.4299 0.4395 0.1897 0.1187 0.2289 0.7267 0.4435 0.7368 0.2627 0.4442 0.3293 0.051 0.2349 0.2952 0.0401 0.3723 0.6007 0.2337 0.3735 0.03 0.2667 0.3347 0.3766 0.3445 0.2113 0.3906 0.2965 0.4867 0.2193 0.1513 0.2508 0.5712 0.3327 0.2322 0.0266 0.0812 0.4859 0.1022 0.0405 0.204 0.4302 0.432 0.0575 0.1791 0.1021 0.0062 0.0016 0.0267 0.0448 0.0297 0.2959 0.0366 0.0041 0.0431 0.2277 0.0263 0.3025 0.0977 0.5445 0.1251 0.1615 0.1037 0.2023 0.0961 0.5137 0.6921 0.0001 0.2017 0.4957 0.1182 0.1532 0.4471 0. 0.2251 0.1581 0.1717 0.1472 0.4399 0.2773 0.0154 0.2824 0.4922 0. 0.1039 0.1967 0.0339 0.1039 0.0461 0.0018 0.117 0.3104 0.328 0.0448 0.3471 0.0018 0.2766 0. 0.2742 0.0057 0.0998 0.069 ] 2022-08-22 19:50:35 [INFO] [EVAL] Class Precision: [0.7555 0.8206 0.9653 0.7891 0.7984 0.8379 0.8225 0.8314 0.6398 0.7322 0.6454 0.649 0.7489 0.4933 0.4864 0.5923 0.6783 0.7115 0.7604 0.5361 0.7717 0.619 0.7782 0.5608 0.5029 0.5407 0.5945 0.7167 0.6985 0.4693 0.3232 0.5128 0.4487 0.4068 0.4284 0.567 0.5683 0.7749 0.4699 0.608 0.2626 0.2857 0.5218 0.5105 0.3929 0.468 0.4875 0.6995 0.6583 0.502 0.6484 0.236 0.2959 0.5978 0.7651 0.7573 0.762 0.7037 0.7431 0.4947 0.1572 0.3982 0.4574 0.8817 0.4439 0.6849 0.4414 0.6036 0.5649 0.4981 0.5635 0.8004 0.419 0.3182 0.5822 0.4348 0.7999 0.6636 0.2935 0.3793 0.6692 0.6427 0.7633 0.172 0.3741 0.6786 0.2758 0.3944 0.3439 0.6683 0.6614 0.1089 0.4271 0.2968 0.6024 0.0085 0.2169 0.2469 0.7859 0.6538 0.3923 0.0071 0.4348 0.5527 0.1356 0.3644 0.7459 0.7611 0.2835 0.2343 0.3116 0.2465 0.7615 0.7063 0.6982 0.0128 0.7103 0.6612 0.1609 0.4804 0.5231 0. 0.8607 0.5712 0.6598 0.62 0.8076 0.3417 0.0499 0.4849 0.663 0. 0.3485 0.6778 0.6849 0.3569 0.5144 0.0147 0.4978 0.5185 0.5251 0.0512 0.5101 0.0416 0.605 0. 0.8577 0.2453 0.2535 0.8997] 2022-08-22 19:50:35 [INFO] [EVAL] Class Recall: [0.8294 0.9028 0.96 0.8536 0.8179 0.8516 0.8728 0.8728 0.6468 0.8675 0.6354 0.6621 0.8668 0.4284 0.2211 0.4851 0.6905 0.4903 0.6811 0.5202 0.8935 0.6001 0.6902 0.6754 0.4763 0.6509 0.7054 0.4159 0.2674 0.4562 0.2773 0.6752 0.4073 0.5606 0.6753 0.5027 0.501 0.5346 0.3366 0.3857 0.2595 0.0788 0.4528 0.2881 0.3861 0.3264 0.448 0.5453 0.7907 0.7496 0.577 0.4918 0.1655 0.2706 0.9355 0.517 0.9571 0.2954 0.5248 0.4963 0.0702 0.3643 0.4543 0.0404 0.6976 0.83 0.3319 0.495 0.0307 0.3646 0.4518 0.4156 0.6596 0.3862 0.5427 0.4825 0.5542 0.2467 0.2379 0.4255 0.796 0.4082 0.2502 0.0305 0.0939 0.6311 0.1397 0.0432 0.334 0.547 0.5546 0.1087 0.2358 0.1346 0.0062 0.002 0.0295 0.0519 0.0299 0.3509 0.0388 0.0094 0.0456 0.2791 0.0316 0.6402 0.1011 0.6567 0.1829 0.3422 0.1345 0.5303 0.0991 0.6533 0.9876 0.0001 0.2198 0.6644 0.3083 0.1836 0.7549 0. 0.2336 0.1794 0.1884 0.1619 0.4914 0.5954 0.0218 0.4034 0.6565 0. 0.129 0.217 0.0344 0.1278 0.0482 0.0021 0.1326 0.4361 0.4663 0.2639 0.5206 0.0019 0.3375 0. 0.2873 0.0058 0.1413 0.0696] 2022-08-22 19:50:36 [INFO] [EVAL] The model with the best validation mIoU (0.2948) was saved at iter 18000. 
2022-08-22 19:50:43 [INFO] [TRAIN] epoch: 15, iter: 18050/160000, loss: 0.9561, lr: 0.001075, batch_cost: 0.1563, reader_cost: 0.00333, ips: 51.1861 samples/sec | ETA 06:09:45 2022-08-22 19:50:51 [INFO] [TRAIN] epoch: 15, iter: 18100/160000, loss: 0.8911, lr: 0.001074, batch_cost: 0.1537, reader_cost: 0.00084, ips: 52.0350 samples/sec | ETA 06:03:36 2022-08-22 19:51:00 [INFO] [TRAIN] epoch: 15, iter: 18150/160000, loss: 0.8493, lr: 0.001074, batch_cost: 0.1666, reader_cost: 0.00657, ips: 48.0143 samples/sec | ETA 06:33:54 2022-08-22 19:51:07 [INFO] [TRAIN] epoch: 15, iter: 18200/160000, loss: 0.9204, lr: 0.001074, batch_cost: 0.1578, reader_cost: 0.00050, ips: 50.6909 samples/sec | ETA 06:12:58 2022-08-22 19:51:16 [INFO] [TRAIN] epoch: 15, iter: 18250/160000, loss: 0.8685, lr: 0.001073, batch_cost: 0.1619, reader_cost: 0.00341, ips: 49.4246 samples/sec | ETA 06:22:24 2022-08-22 19:51:24 [INFO] [TRAIN] epoch: 15, iter: 18300/160000, loss: 1.0267, lr: 0.001073, batch_cost: 0.1668, reader_cost: 0.00045, ips: 47.9711 samples/sec | ETA 06:33:50 2022-08-22 19:51:32 [INFO] [TRAIN] epoch: 15, iter: 18350/160000, loss: 0.9605, lr: 0.001072, batch_cost: 0.1730, reader_cost: 0.00063, ips: 46.2447 samples/sec | ETA 06:48:24 2022-08-22 19:51:40 [INFO] [TRAIN] epoch: 15, iter: 18400/160000, loss: 0.9241, lr: 0.001072, batch_cost: 0.1566, reader_cost: 0.00040, ips: 51.0696 samples/sec | ETA 06:09:41 2022-08-22 19:51:48 [INFO] [TRAIN] epoch: 15, iter: 18450/160000, loss: 0.8924, lr: 0.001072, batch_cost: 0.1595, reader_cost: 0.00050, ips: 50.1528 samples/sec | ETA 06:16:18 2022-08-22 19:51:57 [INFO] [TRAIN] epoch: 15, iter: 18500/160000, loss: 0.9247, lr: 0.001071, batch_cost: 0.1671, reader_cost: 0.00072, ips: 47.8771 samples/sec | ETA 06:34:03 2022-08-22 19:52:05 [INFO] [TRAIN] epoch: 15, iter: 18550/160000, loss: 0.9005, lr: 0.001071, batch_cost: 0.1687, reader_cost: 0.00074, ips: 47.4249 samples/sec | ETA 06:37:40 2022-08-22 19:52:13 [INFO] [TRAIN] epoch: 15, iter: 18600/160000, loss: 0.8697, lr: 0.001071, batch_cost: 0.1501, reader_cost: 0.00101, ips: 53.3097 samples/sec | ETA 05:53:39 2022-08-22 19:52:21 [INFO] [TRAIN] epoch: 15, iter: 18650/160000, loss: 0.9220, lr: 0.001070, batch_cost: 0.1597, reader_cost: 0.00259, ips: 50.1089 samples/sec | ETA 06:16:06 2022-08-22 19:52:29 [INFO] [TRAIN] epoch: 15, iter: 18700/160000, loss: 0.9360, lr: 0.001070, batch_cost: 0.1584, reader_cost: 0.00946, ips: 50.5204 samples/sec | ETA 06:12:55 2022-08-22 19:52:36 [INFO] [TRAIN] epoch: 15, iter: 18750/160000, loss: 0.9376, lr: 0.001069, batch_cost: 0.1548, reader_cost: 0.00187, ips: 51.6898 samples/sec | ETA 06:04:21 2022-08-22 19:52:45 [INFO] [TRAIN] epoch: 15, iter: 18800/160000, loss: 0.8439, lr: 0.001069, batch_cost: 0.1762, reader_cost: 0.00130, ips: 45.4096 samples/sec | ETA 06:54:35 2022-08-22 19:52:55 [INFO] [TRAIN] epoch: 15, iter: 18850/160000, loss: 0.9209, lr: 0.001069, batch_cost: 0.2056, reader_cost: 0.00036, ips: 38.9090 samples/sec | ETA 08:03:41 2022-08-22 19:53:05 [INFO] [TRAIN] epoch: 15, iter: 18900/160000, loss: 0.9590, lr: 0.001068, batch_cost: 0.1984, reader_cost: 0.00224, ips: 40.3149 samples/sec | ETA 07:46:39 2022-08-22 19:53:21 [INFO] [TRAIN] epoch: 16, iter: 18950/160000, loss: 0.9430, lr: 0.001068, batch_cost: 0.3062, reader_cost: 0.08380, ips: 26.1242 samples/sec | ETA 11:59:53 2022-08-22 19:53:31 [INFO] [TRAIN] epoch: 16, iter: 19000/160000, loss: 0.8765, lr: 0.001068, batch_cost: 0.2113, reader_cost: 0.00153, ips: 37.8547 samples/sec | ETA 08:16:38 2022-08-22 19:53:31 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 192s - batch_cost: 0.1917 - reader cost: 0.0011 2022-08-22 19:56:43 [INFO] [EVAL] #Images: 2000 mIoU: 0.2939 Acc: 0.7384 Kappa: 0.7182 Dice: 0.4159 2022-08-22 19:56:43 [INFO] [EVAL] Class IoU: [0.6517 0.7628 0.9222 0.6963 0.6693 0.7268 0.7404 0.7347 0.4804 0.5918 0.4487 0.515 0.6718 0.2805 0.206 0.3651 0.4396 0.3838 0.5528 0.3644 0.6285 0.3469 0.5908 0.4469 0.3037 0.3808 0.4037 0.3874 0.3572 0.2031 0.13 0.4384 0.2725 0.3128 0.2772 0.3688 0.3726 0.4668 0.2276 0.3088 0.1199 0.1155 0.2909 0.1814 0.2457 0.2116 0.2385 0.4182 0.5433 0.5021 0.4609 0.3503 0.1782 0.2379 0.7332 0.4114 0.7946 0.3701 0.1823 0.3227 0.1124 0.1381 0.2649 0.0849 0.3979 0.6004 0.2813 0.4067 0.081 0.2661 0.2999 0.4676 0.3415 0.2649 0.3849 0.2722 0.2102 0.2406 0.0745 0.3231 0.6843 0.3229 0.24 0.0954 0.3676 0.4945 0.0936 0.0334 0.2301 0.3845 0.3904 0.0016 0.1786 0.0591 0.0082 0.0063 0.0726 0.1272 0.2408 0.2966 0.0009 0.0251 0.1188 0.2076 0.0268 0.3819 0.1624 0.4638 0.0746 0.0097 0.0452 0.2704 0.1092 0.5844 0.3939 0. 0.3561 0.6152 0.0474 0.1067 0.4703 0. 0.2101 0.0262 0.1843 0.1256 0.3521 0.4185 0. 0.0796 0.5889 0. 0.2278 0.1578 0.0751 0.1098 0.0768 0.0064 0.1338 0.3004 0.3266 0.0176 0.2468 0.0237 0.2494 0. 0.2906 0.0232 0.1 0.0894] 2022-08-22 19:56:43 [INFO] [EVAL] Class Precision: [0.7554 0.8444 0.9604 0.7856 0.7456 0.8345 0.8843 0.8238 0.6265 0.8031 0.7064 0.6356 0.7533 0.5166 0.4117 0.5246 0.5729 0.7139 0.6774 0.5777 0.6716 0.576 0.7557 0.5722 0.4675 0.602 0.4373 0.7152 0.5265 0.2408 0.3543 0.7432 0.5106 0.4365 0.4986 0.4884 0.5709 0.6737 0.375 0.4894 0.2762 0.283 0.7094 0.6374 0.4519 0.4117 0.7544 0.5905 0.5613 0.6074 0.5896 0.5042 0.3852 0.639 0.7853 0.5081 0.8549 0.5068 0.8225 0.5292 0.1913 0.2459 0.5096 0.7541 0.5795 0.7103 0.5343 0.6536 0.2002 0.4661 0.7441 0.6092 0.56 0.3766 0.5676 0.5052 0.2398 0.4713 0.6618 0.4568 0.8239 0.7766 0.7722 0.3056 0.5242 0.6866 0.1959 0.4028 0.6115 0.8071 0.5308 0.0021 0.4357 0.2609 0.0727 0.0508 0.1629 0.3981 0.5091 0.4501 0.1921 0.043 0.5581 0.4665 0.1941 0.4932 0.5333 0.8965 0.1642 0.0283 0.4227 0.3242 0.2937 0.6463 0.3943 0. 0.7407 0.6944 0.3514 0.5894 0.7345 0. 0.9338 0.9127 0.7035 0.5833 0.5525 0.5742 0. 0.7885 0.7379 0. 0.4691 0.6769 0.3667 0.4156 0.3941 0.2762 0.4554 0.4131 0.4374 0.0334 0.7308 0.4522 0.4485 0. 0.9076 0.4042 0.6631 0.8475] 2022-08-22 19:56:43 [INFO] [EVAL] Class Recall: [0.826 0.8876 0.9586 0.8597 0.8674 0.8492 0.8199 0.8717 0.6733 0.6923 0.5516 0.7307 0.8612 0.3804 0.2919 0.5457 0.6538 0.4535 0.7504 0.4968 0.9073 0.4659 0.7302 0.6712 0.4644 0.509 0.84 0.4581 0.5262 0.5652 0.1703 0.5167 0.3688 0.5246 0.3844 0.6009 0.5176 0.6031 0.3667 0.4555 0.1747 0.1632 0.3303 0.2023 0.3499 0.3033 0.2585 0.589 0.9443 0.7433 0.6786 0.5344 0.2491 0.2749 0.9171 0.6838 0.9185 0.5784 0.1898 0.4527 0.2141 0.2396 0.3555 0.0873 0.5594 0.795 0.3727 0.5184 0.1198 0.3828 0.3345 0.6678 0.4667 0.4718 0.5445 0.3712 0.6296 0.3295 0.0775 0.5248 0.8016 0.356 0.2583 0.1217 0.5515 0.6385 0.1519 0.0352 0.2694 0.4234 0.5961 0.0064 0.2324 0.071 0.0092 0.0071 0.1157 0.1574 0.3137 0.4653 0.0009 0.0568 0.1311 0.2722 0.0302 0.6286 0.1893 0.49 0.1204 0.0145 0.0481 0.6198 0.1481 0.8591 0.9973 0. 0.4068 0.8436 0.052 0.1153 0.5666 0. 0.2133 0.0263 0.1998 0.1379 0.4927 0.6068 0. 0.0813 0.7447 0. 0.307 0.1706 0.0863 0.1299 0.087 0.0065 0.1593 0.524 0.5631 0.0357 0.2715 0.0244 0.3597 0. 0.2994 0.0241 0.1053 0.0909] 2022-08-22 19:56:43 [INFO] [EVAL] The model with the best validation mIoU (0.2948) was saved at iter 18000. 
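Note: at iter 19000 the validation mIoU (0.2939) is below the running best (0.2948 from iter 18000), so the "best validation mIoU" line above repeats the earlier checkpoint instead of advancing. A minimal sketch of that keep-the-best pattern, purely for illustration (not PaddleSeg's actual code; save_best is a hypothetical helper):

# Keep-the-best-checkpoint pattern seen in the log (illustrative sketch only).
best_miou, best_iter = float("-inf"), -1
for it, miou in [(18000, 0.2948), (19000, 0.2939)]:   # values taken from the log
    if miou > best_miou:
        best_miou, best_iter = miou, it
        # save_best(it)   # hypothetical helper; a real run would write the best_model checkpoint here
    print(f"best mIoU {best_miou:.4f} saved at iter {best_iter}")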
2022-08-22 19:56:54 [INFO] [TRAIN] epoch: 16, iter: 19050/160000, loss: 0.8454, lr: 0.001067, batch_cost: 0.2088, reader_cost: 0.00563, ips: 38.3121 samples/sec | ETA 08:10:31 2022-08-22 19:57:02 [INFO] [TRAIN] epoch: 16, iter: 19100/160000, loss: 0.8965, lr: 0.001067, batch_cost: 0.1678, reader_cost: 0.00103, ips: 47.6734 samples/sec | ETA 06:34:04 2022-08-22 19:57:11 [INFO] [TRAIN] epoch: 16, iter: 19150/160000, loss: 0.8339, lr: 0.001066, batch_cost: 0.1752, reader_cost: 0.00331, ips: 45.6686 samples/sec | ETA 06:51:13 2022-08-22 19:57:19 [INFO] [TRAIN] epoch: 16, iter: 19200/160000, loss: 0.8761, lr: 0.001066, batch_cost: 0.1571, reader_cost: 0.00069, ips: 50.9284 samples/sec | ETA 06:08:37 2022-08-22 19:57:29 [INFO] [TRAIN] epoch: 16, iter: 19250/160000, loss: 0.8920, lr: 0.001066, batch_cost: 0.1989, reader_cost: 0.00046, ips: 40.2262 samples/sec | ETA 07:46:31 2022-08-22 19:57:37 [INFO] [TRAIN] epoch: 16, iter: 19300/160000, loss: 0.9583, lr: 0.001065, batch_cost: 0.1705, reader_cost: 0.00359, ips: 46.9238 samples/sec | ETA 06:39:47 2022-08-22 19:57:45 [INFO] [TRAIN] epoch: 16, iter: 19350/160000, loss: 0.8767, lr: 0.001065, batch_cost: 0.1545, reader_cost: 0.00716, ips: 51.7959 samples/sec | ETA 06:02:03 2022-08-22 19:57:52 [INFO] [TRAIN] epoch: 16, iter: 19400/160000, loss: 0.9150, lr: 0.001064, batch_cost: 0.1486, reader_cost: 0.00031, ips: 53.8401 samples/sec | ETA 05:48:11 2022-08-22 19:58:01 [INFO] [TRAIN] epoch: 16, iter: 19450/160000, loss: 0.9149, lr: 0.001064, batch_cost: 0.1644, reader_cost: 0.00638, ips: 48.6672 samples/sec | ETA 06:25:03 2022-08-22 19:58:09 [INFO] [TRAIN] epoch: 16, iter: 19500/160000, loss: 0.9573, lr: 0.001064, batch_cost: 0.1760, reader_cost: 0.00092, ips: 45.4527 samples/sec | ETA 06:52:08 2022-08-22 19:58:19 [INFO] [TRAIN] epoch: 16, iter: 19550/160000, loss: 0.8529, lr: 0.001063, batch_cost: 0.1846, reader_cost: 0.00136, ips: 43.3317 samples/sec | ETA 07:12:10 2022-08-22 19:58:29 [INFO] [TRAIN] epoch: 16, iter: 19600/160000, loss: 0.9039, lr: 0.001063, batch_cost: 0.2045, reader_cost: 0.00049, ips: 39.1132 samples/sec | ETA 07:58:36 2022-08-22 19:58:39 [INFO] [TRAIN] epoch: 16, iter: 19650/160000, loss: 0.9036, lr: 0.001063, batch_cost: 0.1937, reader_cost: 0.00034, ips: 41.3098 samples/sec | ETA 07:33:00 2022-08-22 19:58:47 [INFO] [TRAIN] epoch: 16, iter: 19700/160000, loss: 0.8532, lr: 0.001062, batch_cost: 0.1775, reader_cost: 0.00040, ips: 45.0630 samples/sec | ETA 06:55:07 2022-08-22 19:58:56 [INFO] [TRAIN] epoch: 16, iter: 19750/160000, loss: 0.8981, lr: 0.001062, batch_cost: 0.1794, reader_cost: 0.00054, ips: 44.5983 samples/sec | ETA 06:59:17 2022-08-22 19:59:06 [INFO] [TRAIN] epoch: 16, iter: 19800/160000, loss: 0.8897, lr: 0.001061, batch_cost: 0.1877, reader_cost: 0.00046, ips: 42.6222 samples/sec | ETA 07:18:34 2022-08-22 19:59:16 [INFO] [TRAIN] epoch: 16, iter: 19850/160000, loss: 0.9044, lr: 0.001061, batch_cost: 0.2050, reader_cost: 0.00033, ips: 39.0154 samples/sec | ETA 07:58:57 2022-08-22 19:59:27 [INFO] [TRAIN] epoch: 16, iter: 19900/160000, loss: 0.8733, lr: 0.001061, batch_cost: 0.2207, reader_cost: 0.00210, ips: 36.2496 samples/sec | ETA 08:35:18 2022-08-22 19:59:39 [INFO] [TRAIN] epoch: 16, iter: 19950/160000, loss: 0.9503, lr: 0.001060, batch_cost: 0.2473, reader_cost: 0.00113, ips: 32.3436 samples/sec | ETA 09:37:20 2022-08-22 19:59:51 [INFO] [TRAIN] epoch: 16, iter: 20000/160000, loss: 0.9006, lr: 0.001060, batch_cost: 0.2239, reader_cost: 0.00035, ips: 35.7294 samples/sec | ETA 08:42:26 2022-08-22 19:59:51 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 177s - batch_cost: 0.1766 - reader cost: 9.1503e-04 2022-08-22 20:02:47 [INFO] [EVAL] #Images: 2000 mIoU: 0.3020 Acc: 0.7402 Kappa: 0.7208 Dice: 0.4277 2022-08-22 20:02:47 [INFO] [EVAL] Class IoU: [0.6518 0.7596 0.9261 0.6961 0.6712 0.7283 0.7535 0.7412 0.4828 0.588 0.4509 0.4977 0.6679 0.3014 0.222 0.3622 0.5137 0.4086 0.5318 0.3551 0.7118 0.4516 0.5797 0.4472 0.2671 0.3038 0.4947 0.3428 0.3053 0.2397 0.1255 0.4408 0.2637 0.2963 0.3206 0.3679 0.3721 0.4578 0.2317 0.2931 0.1733 0.1309 0.3451 0.2451 0.2599 0.101 0.2479 0.4107 0.6097 0.4145 0.4199 0.3468 0.1826 0.2531 0.6835 0.2971 0.8071 0.3638 0.2746 0.3426 0.1205 0.3191 0.2838 0.138 0.3615 0.5744 0.3198 0.3887 0.0758 0.2905 0.3621 0.4407 0.351 0.2594 0.3827 0.2777 0.2751 0.2324 0.2218 0.2777 0.5507 0.3313 0.3616 0.042 0.3625 0.5011 0.0454 0.0396 0.2353 0.33 0.2433 0. 0.1805 0.0283 0.1332 0.023 0.0172 0.1196 0.1345 0.2681 0.0061 0.0202 0.1413 0.4824 0.0905 0.3713 0.1375 0.4455 0.0753 0.2626 0.0183 0.3397 0.0964 0.6092 0.723 0.0012 0.2944 0.5526 0.0743 0.1581 0.335 0. 0.2432 0.1535 0.1806 0.2023 0.4015 0.4223 0. 0.1953 0.404 0.0001 0.1773 0.1375 0.0644 0.1001 0.1265 0.0001 0.1553 0.3325 0.3016 0.0051 0.2202 0.0161 0.2079 0. 0.3419 0.0056 0.1309 0.106 ] 2022-08-22 20:02:47 [INFO] [EVAL] Class Precision: [0.7831 0.8488 0.959 0.7811 0.7468 0.8652 0.8618 0.8118 0.6232 0.7576 0.5906 0.6316 0.7351 0.5279 0.4425 0.4849 0.7097 0.6781 0.7005 0.5851 0.7859 0.6757 0.7281 0.5651 0.4588 0.5078 0.5463 0.7837 0.6759 0.3024 0.3596 0.5388 0.4072 0.4241 0.4792 0.5412 0.5022 0.7525 0.3691 0.5308 0.2886 0.3174 0.6022 0.4898 0.4131 0.375 0.4067 0.5216 0.6977 0.4825 0.6643 0.5403 0.3111 0.5725 0.7219 0.3411 0.8529 0.541 0.7613 0.5334 0.4118 0.3873 0.4074 0.7855 0.4341 0.7214 0.5005 0.5046 0.113 0.5116 0.6418 0.5048 0.582 0.3815 0.5793 0.3894 0.4304 0.4686 0.7144 0.3696 0.6342 0.6496 0.661 0.0845 0.6263 0.6404 0.254 0.3123 0.467 0.4359 0.2814 0. 0.3459 0.3196 0.3062 0.102 0.1655 0.3917 0.6288 0.7577 0.2426 0.044 0.5769 0.5322 0.3108 0.4695 0.3369 0.5383 0.16 0.3269 0.233 0.7625 0.3795 0.6645 0.7386 0.1145 0.7878 0.5869 0.1549 0.4792 0.6592 0. 0.6526 0.5522 0.5449 0.5953 0.7744 0.5881 0. 0.6331 0.4964 0.0088 0.6097 0.5255 0.6853 0.3876 0.3343 0.0052 0.4478 0.5891 0.6375 0.0179 0.5648 0.5169 0.8508 0. 0.8132 0.4073 0.7442 0.8768] 2022-08-22 20:02:47 [INFO] [EVAL] Class Recall: [0.7954 0.8785 0.9642 0.8648 0.869 0.8215 0.857 0.8951 0.6818 0.7242 0.6559 0.7012 0.8796 0.4126 0.3082 0.5886 0.6504 0.5069 0.6882 0.4745 0.883 0.5766 0.7398 0.6818 0.3899 0.4306 0.8396 0.3787 0.3577 0.5364 0.1616 0.7079 0.4279 0.4957 0.4921 0.5346 0.5896 0.539 0.3837 0.3955 0.3024 0.1822 0.447 0.3291 0.412 0.1215 0.3884 0.6589 0.8286 0.7463 0.533 0.4919 0.3066 0.3121 0.9278 0.6976 0.9377 0.5262 0.3005 0.4893 0.1456 0.6442 0.4834 0.1434 0.6838 0.7381 0.4696 0.6286 0.1872 0.402 0.4539 0.7764 0.4694 0.4477 0.5301 0.4919 0.4325 0.3155 0.2434 0.5277 0.807 0.4035 0.4439 0.0771 0.4625 0.6973 0.0524 0.0434 0.3216 0.5759 0.6424 0. 0.274 0.0301 0.1908 0.0288 0.0188 0.147 0.1461 0.2933 0.0062 0.0361 0.1576 0.8375 0.1132 0.6395 0.1885 0.721 0.1246 0.5715 0.0195 0.3799 0.1144 0.8798 0.9715 0.0012 0.3198 0.9042 0.125 0.1909 0.4052 0. 0.2794 0.1753 0.2127 0.2346 0.4547 0.5997 0. 0.2202 0.6846 0.0001 0.2 0.157 0.0664 0.1189 0.169 0.0001 0.192 0.4329 0.364 0.007 0.2652 0.0164 0.2158 0. 0.371 0.0057 0.1371 0.1076] 2022-08-22 20:02:48 [INFO] [EVAL] The model with the best validation mIoU (0.3020) was saved at iter 20000. 
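Note: the evaluation wall time reported at the top of each [EVAL] block is just total_iters times the evaluation batch_cost, and total_samples / total_iters gives the number of images consumed per evaluation step. Checking the iter 20000 pass above:

# Rough check of the evaluation wall-time figures (illustrative only).
eval_iters, eval_batch_cost = 1000, 0.1766
print(eval_iters * eval_batch_cost)    # ~177 s, the "177s" shown for this pass
print(2000 / eval_iters)               # 2 images per evaluation step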
2022-08-22 20:02:58 [INFO] [TRAIN] epoch: 16, iter: 20050/160000, loss: 0.8961, lr: 0.001060, batch_cost: 0.2040, reader_cost: 0.00345, ips: 39.2226 samples/sec | ETA 07:55:44 2022-08-22 20:03:06 [INFO] [TRAIN] epoch: 16, iter: 20100/160000, loss: 0.8524, lr: 0.001059, batch_cost: 0.1663, reader_cost: 0.00120, ips: 48.1082 samples/sec | ETA 06:27:44 2022-08-22 20:03:16 [INFO] [TRAIN] epoch: 16, iter: 20150/160000, loss: 0.8461, lr: 0.001059, batch_cost: 0.1890, reader_cost: 0.00053, ips: 42.3225 samples/sec | ETA 07:20:35 2022-08-22 20:03:24 [INFO] [TRAIN] epoch: 16, iter: 20200/160000, loss: 0.8657, lr: 0.001058, batch_cost: 0.1719, reader_cost: 0.00042, ips: 46.5338 samples/sec | ETA 06:40:34 2022-08-22 20:03:37 [INFO] [TRAIN] epoch: 17, iter: 20250/160000, loss: 0.9272, lr: 0.001058, batch_cost: 0.2655, reader_cost: 0.08502, ips: 30.1354 samples/sec | ETA 10:18:19 2022-08-22 20:03:47 [INFO] [TRAIN] epoch: 17, iter: 20300/160000, loss: 0.8847, lr: 0.001058, batch_cost: 0.1817, reader_cost: 0.00889, ips: 44.0245 samples/sec | ETA 07:03:05 2022-08-22 20:03:55 [INFO] [TRAIN] epoch: 17, iter: 20350/160000, loss: 0.8757, lr: 0.001057, batch_cost: 0.1708, reader_cost: 0.00055, ips: 46.8428 samples/sec | ETA 06:37:29 2022-08-22 20:04:04 [INFO] [TRAIN] epoch: 17, iter: 20400/160000, loss: 0.9414, lr: 0.001057, batch_cost: 0.1824, reader_cost: 0.00041, ips: 43.8600 samples/sec | ETA 07:04:22 2022-08-22 20:04:12 [INFO] [TRAIN] epoch: 17, iter: 20450/160000, loss: 0.8053, lr: 0.001057, batch_cost: 0.1600, reader_cost: 0.00040, ips: 50.0150 samples/sec | ETA 06:12:01 2022-08-22 20:04:20 [INFO] [TRAIN] epoch: 17, iter: 20500/160000, loss: 0.8310, lr: 0.001056, batch_cost: 0.1640, reader_cost: 0.00041, ips: 48.7892 samples/sec | ETA 06:21:13 2022-08-22 20:04:28 [INFO] [TRAIN] epoch: 17, iter: 20550/160000, loss: 0.8057, lr: 0.001056, batch_cost: 0.1587, reader_cost: 0.00059, ips: 50.4141 samples/sec | ETA 06:08:48 2022-08-22 20:04:37 [INFO] [TRAIN] epoch: 17, iter: 20600/160000, loss: 0.8331, lr: 0.001055, batch_cost: 0.1711, reader_cost: 0.00743, ips: 46.7682 samples/sec | ETA 06:37:25 2022-08-22 20:04:46 [INFO] [TRAIN] epoch: 17, iter: 20650/160000, loss: 0.8270, lr: 0.001055, batch_cost: 0.1865, reader_cost: 0.00030, ips: 42.9027 samples/sec | ETA 07:13:04 2022-08-22 20:04:55 [INFO] [TRAIN] epoch: 17, iter: 20700/160000, loss: 0.9002, lr: 0.001055, batch_cost: 0.1812, reader_cost: 0.00388, ips: 44.1449 samples/sec | ETA 07:00:44 2022-08-22 20:05:03 [INFO] [TRAIN] epoch: 17, iter: 20750/160000, loss: 0.9028, lr: 0.001054, batch_cost: 0.1548, reader_cost: 0.00069, ips: 51.6810 samples/sec | ETA 05:59:15 2022-08-22 20:05:11 [INFO] [TRAIN] epoch: 17, iter: 20800/160000, loss: 0.9455, lr: 0.001054, batch_cost: 0.1599, reader_cost: 0.00817, ips: 50.0246 samples/sec | ETA 06:11:01 2022-08-22 20:05:20 [INFO] [TRAIN] epoch: 17, iter: 20850/160000, loss: 0.9482, lr: 0.001054, batch_cost: 0.1744, reader_cost: 0.00055, ips: 45.8784 samples/sec | ETA 06:44:24 2022-08-22 20:05:31 [INFO] [TRAIN] epoch: 17, iter: 20900/160000, loss: 0.9267, lr: 0.001053, batch_cost: 0.2191, reader_cost: 0.00286, ips: 36.5099 samples/sec | ETA 08:27:59 2022-08-22 20:05:42 [INFO] [TRAIN] epoch: 17, iter: 20950/160000, loss: 0.8665, lr: 0.001053, batch_cost: 0.2198, reader_cost: 0.00055, ips: 36.3943 samples/sec | ETA 08:29:25 2022-08-22 20:05:52 [INFO] [TRAIN] epoch: 17, iter: 21000/160000, loss: 0.8951, lr: 0.001052, batch_cost: 0.2069, reader_cost: 0.00068, ips: 38.6674 samples/sec | ETA 07:59:18 2022-08-22 20:05:52 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 184s - batch_cost: 0.1835 - reader cost: 0.0013 2022-08-22 20:08:56 [INFO] [EVAL] #Images: 2000 mIoU: 0.3036 Acc: 0.7450 Kappa: 0.7257 Dice: 0.4260 2022-08-22 20:08:56 [INFO] [EVAL] Class IoU: [0.6537 0.7436 0.9275 0.7076 0.6718 0.7405 0.7565 0.7357 0.4783 0.6505 0.4598 0.5161 0.6709 0.3044 0.2063 0.376 0.5083 0.4159 0.564 0.3836 0.7205 0.4865 0.5812 0.4567 0.281 0.3484 0.5444 0.3742 0.3214 0.2804 0.1661 0.4416 0.2454 0.3078 0.341 0.3526 0.3739 0.4419 0.2644 0.2908 0.1128 0.1028 0.3241 0.2228 0.2587 0.1651 0.3456 0.3778 0.6076 0.3403 0.423 0.2959 0.1133 0.2453 0.7244 0.4459 0.7946 0.3031 0.4999 0.2373 0.0791 0.1834 0.2783 0.0561 0.3915 0.4976 0.3086 0.3902 0.0538 0.2637 0.3436 0.4728 0.3863 0.2362 0.4197 0.2535 0.5014 0.211 0.465 0.118 0.5313 0.3047 0.1894 0.0979 0.2982 0.5142 0.0906 0.0334 0.2403 0.3855 0.3976 0.0246 0.1963 0.0793 0.0006 0.0025 0.005 0.0343 0.2526 0.2987 0.0472 0.0181 0.1174 0.3606 0.0344 0.3244 0.1482 0.5056 0.0607 0.0836 0.0395 0.107 0.0989 0.5279 0.7049 0.001 0.266 0.5799 0.042 0.1698 0.5012 0. 0.2225 0.162 0.1735 0.1892 0.4387 0.3979 0. 0.0591 0.5799 0. 0.2078 0.1832 0.1658 0.1127 0.0748 0.0007 0.1135 0.2651 0.3784 0.0006 0.3518 0.0062 0.2477 0. 0.3281 0.0095 0.1153 0.0959] 2022-08-22 20:08:56 [INFO] [EVAL] Class Precision: [0.7666 0.8485 0.9648 0.8087 0.7535 0.8381 0.8853 0.793 0.579 0.7652 0.6355 0.623 0.7365 0.5446 0.4756 0.523 0.6626 0.7466 0.7523 0.5337 0.828 0.6223 0.7509 0.5957 0.4808 0.3803 0.631 0.7352 0.6212 0.3959 0.3946 0.5846 0.4708 0.5409 0.4062 0.5891 0.5945 0.771 0.4719 0.5483 0.2716 0.2855 0.6311 0.5152 0.3593 0.47 0.6141 0.4577 0.7363 0.3779 0.7196 0.418 0.2937 0.5605 0.7648 0.6675 0.8347 0.5901 0.6392 0.5136 0.3153 0.3197 0.3694 0.8832 0.5021 0.7773 0.4276 0.5307 0.2005 0.503 0.435 0.5701 0.5542 0.4149 0.5927 0.5022 0.5687 0.492 0.6307 0.2416 0.6 0.6333 0.7942 0.1561 0.6115 0.7136 0.4243 0.4218 0.7174 0.7622 0.5467 0.0322 0.4094 0.3359 0.021 0.0195 0.4923 0.35 0.5001 0.7047 0.3912 0.0306 0.5753 0.9152 0.5333 0.3588 0.308 0.6921 0.1433 0.1912 0.3738 0.1192 0.3188 0.6336 0.7119 0.0472 0.812 0.6585 0.1046 0.5514 0.6499 0. 0.6226 0.5737 0.7098 0.6205 0.7619 0.5458 0. 0.8291 0.7538 0. 0.6046 0.4962 0.5326 0.312 0.5207 0.0055 0.5373 0.7033 0.5386 0.0014 0.5058 0.2812 0.506 0. 0.8666 0.5045 0.532 0.907 ] 2022-08-22 20:08:56 [INFO] [EVAL] Class Recall: [0.8161 0.8574 0.96 0.8498 0.8611 0.8641 0.8387 0.9105 0.7333 0.8127 0.6246 0.7507 0.8828 0.4083 0.2671 0.5723 0.6857 0.4843 0.6925 0.5769 0.8474 0.6903 0.72 0.6619 0.4033 0.8061 0.7985 0.4326 0.3998 0.49 0.2229 0.6435 0.3389 0.4166 0.6797 0.4675 0.5019 0.5086 0.3755 0.3825 0.1617 0.1383 0.3999 0.2819 0.4803 0.2028 0.4414 0.684 0.7766 0.7739 0.5065 0.5033 0.1557 0.3036 0.9319 0.5733 0.943 0.3839 0.6965 0.3062 0.0955 0.3008 0.53 0.0565 0.6401 0.5803 0.5256 0.5959 0.0685 0.3566 0.6207 0.7347 0.5605 0.3541 0.5898 0.3387 0.809 0.2697 0.639 0.1873 0.8227 0.3699 0.1992 0.2078 0.3679 0.6479 0.1033 0.035 0.2654 0.4382 0.593 0.0947 0.2739 0.0941 0.0006 0.0029 0.005 0.0367 0.3379 0.3415 0.0509 0.0422 0.1285 0.3731 0.0355 0.7721 0.2222 0.6523 0.0954 0.1292 0.0423 0.5103 0.1253 0.7599 0.9861 0.001 0.2835 0.8293 0.0657 0.1971 0.6866 0. 0.2572 0.1842 0.1867 0.214 0.5085 0.5948 0. 0.0598 0.7154 0. 0.2405 0.225 0.1941 0.1499 0.0803 0.0008 0.1258 0.2985 0.5599 0.0012 0.5359 0.0063 0.3268 0. 0.3456 0.0096 0.1283 0.0969] 2022-08-22 20:08:56 [INFO] [EVAL] The model with the best validation mIoU (0.3036) was saved at iter 21000. 
2022-08-22 20:09:04 [INFO] [TRAIN] epoch: 17, iter: 21050/160000, loss: 0.8565, lr: 0.001052, batch_cost: 0.1635, reader_cost: 0.00291, ips: 48.9325 samples/sec | ETA 06:18:37 2022-08-22 20:09:12 [INFO] [TRAIN] epoch: 17, iter: 21100/160000, loss: 0.8833, lr: 0.001052, batch_cost: 0.1646, reader_cost: 0.00104, ips: 48.6083 samples/sec | ETA 06:21:00 2022-08-22 20:09:20 [INFO] [TRAIN] epoch: 17, iter: 21150/160000, loss: 0.8857, lr: 0.001051, batch_cost: 0.1537, reader_cost: 0.00032, ips: 52.0624 samples/sec | ETA 05:55:35 2022-08-22 20:09:28 [INFO] [TRAIN] epoch: 17, iter: 21200/160000, loss: 0.8917, lr: 0.001051, batch_cost: 0.1564, reader_cost: 0.00043, ips: 51.1569 samples/sec | ETA 06:01:45 2022-08-22 20:09:36 [INFO] [TRAIN] epoch: 17, iter: 21250/160000, loss: 0.9015, lr: 0.001050, batch_cost: 0.1613, reader_cost: 0.00033, ips: 49.5949 samples/sec | ETA 06:13:01 2022-08-22 20:09:44 [INFO] [TRAIN] epoch: 17, iter: 21300/160000, loss: 0.8957, lr: 0.001050, batch_cost: 0.1593, reader_cost: 0.00076, ips: 50.2137 samples/sec | ETA 06:08:17 2022-08-22 20:09:53 [INFO] [TRAIN] epoch: 17, iter: 21350/160000, loss: 0.8346, lr: 0.001050, batch_cost: 0.1722, reader_cost: 0.00045, ips: 46.4623 samples/sec | ETA 06:37:53 2022-08-22 20:10:00 [INFO] [TRAIN] epoch: 17, iter: 21400/160000, loss: 0.8658, lr: 0.001049, batch_cost: 0.1537, reader_cost: 0.00041, ips: 52.0402 samples/sec | ETA 05:55:06 2022-08-22 20:10:08 [INFO] [TRAIN] epoch: 17, iter: 21450/160000, loss: 0.8605, lr: 0.001049, batch_cost: 0.1635, reader_cost: 0.00070, ips: 48.9365 samples/sec | ETA 06:17:29 2022-08-22 20:10:20 [INFO] [TRAIN] epoch: 18, iter: 21500/160000, loss: 0.7898, lr: 0.001049, batch_cost: 0.2371, reader_cost: 0.07160, ips: 33.7470 samples/sec | ETA 09:07:12 2022-08-22 20:10:30 [INFO] [TRAIN] epoch: 18, iter: 21550/160000, loss: 0.8615, lr: 0.001048, batch_cost: 0.1925, reader_cost: 0.00067, ips: 41.5518 samples/sec | ETA 07:24:15 2022-08-22 20:10:40 [INFO] [TRAIN] epoch: 18, iter: 21600/160000, loss: 0.8633, lr: 0.001048, batch_cost: 0.2020, reader_cost: 0.00117, ips: 39.5952 samples/sec | ETA 07:46:02 2022-08-22 20:10:49 [INFO] [TRAIN] epoch: 18, iter: 21650/160000, loss: 0.8439, lr: 0.001047, batch_cost: 0.1873, reader_cost: 0.00113, ips: 42.7159 samples/sec | ETA 07:11:50 2022-08-22 20:10:58 [INFO] [TRAIN] epoch: 18, iter: 21700/160000, loss: 0.8709, lr: 0.001047, batch_cost: 0.1779, reader_cost: 0.00067, ips: 44.9581 samples/sec | ETA 06:50:09 2022-08-22 20:11:08 [INFO] [TRAIN] epoch: 18, iter: 21750/160000, loss: 0.7769, lr: 0.001047, batch_cost: 0.1905, reader_cost: 0.00076, ips: 41.9874 samples/sec | ETA 07:19:01 2022-08-22 20:11:17 [INFO] [TRAIN] epoch: 18, iter: 21800/160000, loss: 0.8098, lr: 0.001046, batch_cost: 0.1890, reader_cost: 0.00109, ips: 42.3190 samples/sec | ETA 07:15:25 2022-08-22 20:11:27 [INFO] [TRAIN] epoch: 18, iter: 21850/160000, loss: 0.8411, lr: 0.001046, batch_cost: 0.1943, reader_cost: 0.00051, ips: 41.1764 samples/sec | ETA 07:27:20 2022-08-22 20:11:35 [INFO] [TRAIN] epoch: 18, iter: 21900/160000, loss: 0.9355, lr: 0.001046, batch_cost: 0.1708, reader_cost: 0.00064, ips: 46.8424 samples/sec | ETA 06:33:05 2022-08-22 20:11:45 [INFO] [TRAIN] epoch: 18, iter: 21950/160000, loss: 0.8045, lr: 0.001045, batch_cost: 0.1806, reader_cost: 0.00040, ips: 44.2856 samples/sec | ETA 06:55:38 2022-08-22 20:11:55 [INFO] [TRAIN] epoch: 18, iter: 22000/160000, loss: 0.8031, lr: 0.001045, batch_cost: 0.2024, reader_cost: 0.00337, ips: 39.5171 samples/sec | ETA 07:45:37 2022-08-22 20:11:55 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 194s - batch_cost: 0.1936 - reader cost: 7.7229e-04 2022-08-22 20:15:09 [INFO] [EVAL] #Images: 2000 mIoU: 0.3026 Acc: 0.7444 Kappa: 0.7246 Dice: 0.4274 2022-08-22 20:15:09 [INFO] [EVAL] Class IoU: [0.6595 0.7568 0.9258 0.7044 0.6676 0.7293 0.7561 0.7116 0.4947 0.6088 0.4364 0.4939 0.6644 0.3162 0.2292 0.3833 0.4842 0.4109 0.5792 0.3664 0.7272 0.5074 0.5941 0.4599 0.2763 0.2809 0.5294 0.3672 0.3572 0.2532 0.1867 0.4288 0.2404 0.3338 0.316 0.3627 0.3814 0.3787 0.2405 0.2636 0.0794 0.1127 0.3403 0.2318 0.2445 0.1922 0.2731 0.4046 0.5921 0.4595 0.4651 0.1942 0.1948 0.2094 0.6746 0.4028 0.7817 0.2315 0.1323 0.2824 0.1178 0.2274 0.2524 0.1654 0.4229 0.5446 0.2875 0.3736 0.0146 0.2601 0.3491 0.4379 0.3684 0.2668 0.3662 0.2839 0.4136 0.2386 0.1161 0.1369 0.6202 0.3032 0.3269 0.0632 0.3564 0.4963 0.0493 0.0476 0.1499 0.4244 0.3679 0.0506 0.1546 0.0605 0.0584 0.0004 0.0155 0.0101 0.1079 0.3153 0.0192 0.021 0.0941 0.4639 0.084 0.3559 0.2489 0.527 0.0526 0.0913 0.0578 0.3825 0.0958 0.5836 0.5467 0.0001 0.3363 0.5417 0.0901 0.2126 0.4383 0. 0.2258 0.0969 0.3417 0.216 0.3863 0.3638 0. 0.1706 0.5067 0. 0.3068 0.15 0.0701 0.1254 0.1222 0.0037 0.1278 0.3347 0.3042 0. 0.3373 0.1503 0.2506 0.0054 0.3344 0.0121 0.0867 0.139 ] 2022-08-22 20:15:09 [INFO] [EVAL] Class Precision: [0.7567 0.8404 0.9709 0.806 0.7369 0.8559 0.8387 0.7656 0.6423 0.7998 0.7602 0.7168 0.7314 0.5572 0.4769 0.5394 0.6484 0.6893 0.722 0.5467 0.8135 0.6557 0.764 0.5992 0.4244 0.5125 0.6473 0.7678 0.5941 0.3205 0.4125 0.6398 0.4385 0.4993 0.4177 0.4935 0.6044 0.762 0.3349 0.69 0.2887 0.2544 0.6096 0.5028 0.3371 0.3719 0.4386 0.5825 0.6566 0.5452 0.6203 0.2359 0.3714 0.7623 0.7027 0.4988 0.8377 0.6902 0.8899 0.5572 0.1874 0.3533 0.3271 0.8054 0.584 0.6217 0.4514 0.5353 0.3337 0.4844 0.416 0.5065 0.575 0.3263 0.4639 0.4856 0.609 0.5289 0.9309 0.411 0.7185 0.6846 0.7118 0.1639 0.5514 0.6188 0.4331 0.4079 0.2313 0.629 0.4843 0.0761 0.2943 0.3086 0.229 0.0026 0.1433 0.3265 0.6166 0.6278 0.3119 0.0316 0.4993 0.5815 0.6639 0.4479 0.5387 0.734 0.1336 0.2157 0.3904 0.6003 0.2423 0.6516 0.548 0.0017 0.7909 0.6254 0.1436 0.5092 0.7012 0.0003 0.8024 0.6252 0.7278 0.5576 0.6311 0.4708 0. 0.4211 0.5763 0. 0.6105 0.6526 0.7397 0.3158 0.3237 0.0478 0.4198 0.533 0.42 0. 0.6347 0.6291 0.7543 0.0064 0.8219 0.3471 0.1851 0.7369] 2022-08-22 20:15:09 [INFO] [EVAL] Class Recall: [0.837 0.8839 0.9522 0.8483 0.8767 0.8314 0.8847 0.9098 0.6828 0.7183 0.506 0.6136 0.8788 0.4222 0.3062 0.5698 0.6565 0.5044 0.7455 0.5263 0.8727 0.6916 0.7277 0.6643 0.4418 0.3833 0.7441 0.4131 0.4726 0.5468 0.2543 0.5652 0.3474 0.5016 0.5649 0.5776 0.5083 0.4295 0.4603 0.2991 0.0987 0.1682 0.4351 0.3008 0.471 0.2845 0.4198 0.5698 0.8576 0.7452 0.6501 0.5233 0.2907 0.224 0.944 0.6766 0.9212 0.2583 0.1345 0.3641 0.2409 0.3896 0.5251 0.1723 0.6051 0.8145 0.4419 0.5529 0.0151 0.3597 0.6845 0.764 0.5063 0.5943 0.6348 0.4059 0.5631 0.303 0.1171 0.1703 0.8192 0.3524 0.3768 0.0932 0.5019 0.7149 0.0527 0.0511 0.2987 0.5661 0.6048 0.1308 0.2456 0.0699 0.0727 0.0005 0.0171 0.0103 0.1157 0.3878 0.02 0.059 0.1039 0.6963 0.0878 0.6341 0.3163 0.6514 0.0798 0.1367 0.0636 0.5133 0.1367 0.8484 0.9957 0.0001 0.3691 0.8019 0.1945 0.2674 0.539 0. 0.2391 0.1029 0.3918 0.2606 0.4991 0.6155 0. 0.2229 0.8076 0. 0.3814 0.1631 0.0719 0.1722 0.1641 0.004 0.1553 0.4735 0.5247 0. 0.4185 0.1649 0.2729 0.0343 0.3605 0.0124 0.1402 0.1463] 2022-08-22 20:15:09 [INFO] [EVAL] The model with the best validation mIoU (0.3036) was saved at iter 21000. 
2022-08-22 20:15:18 [INFO] [TRAIN] epoch: 18, iter: 22050/160000, loss: 0.8349, lr: 0.001044, batch_cost: 0.1777, reader_cost: 0.00282, ips: 45.0164 samples/sec | ETA 06:48:35 2022-08-22 20:15:27 [INFO] [TRAIN] epoch: 18, iter: 22100/160000, loss: 0.8253, lr: 0.001044, batch_cost: 0.1925, reader_cost: 0.00115, ips: 41.5528 samples/sec | ETA 07:22:29 2022-08-22 20:15:38 [INFO] [TRAIN] epoch: 18, iter: 22150/160000, loss: 0.9003, lr: 0.001044, batch_cost: 0.2121, reader_cost: 0.00074, ips: 37.7206 samples/sec | ETA 08:07:16 2022-08-22 20:15:47 [INFO] [TRAIN] epoch: 18, iter: 22200/160000, loss: 0.8342, lr: 0.001043, batch_cost: 0.1743, reader_cost: 0.00064, ips: 45.8899 samples/sec | ETA 06:40:22 2022-08-22 20:15:56 [INFO] [TRAIN] epoch: 18, iter: 22250/160000, loss: 0.8465, lr: 0.001043, batch_cost: 0.1982, reader_cost: 0.00106, ips: 40.3595 samples/sec | ETA 07:35:04 2022-08-22 20:16:06 [INFO] [TRAIN] epoch: 18, iter: 22300/160000, loss: 0.8759, lr: 0.001043, batch_cost: 0.1879, reader_cost: 0.00057, ips: 42.5662 samples/sec | ETA 07:11:19 2022-08-22 20:16:14 [INFO] [TRAIN] epoch: 18, iter: 22350/160000, loss: 0.8319, lr: 0.001042, batch_cost: 0.1582, reader_cost: 0.00066, ips: 50.5614 samples/sec | ETA 06:02:59 2022-08-22 20:16:24 [INFO] [TRAIN] epoch: 18, iter: 22400/160000, loss: 0.8945, lr: 0.001042, batch_cost: 0.1998, reader_cost: 0.00063, ips: 40.0355 samples/sec | ETA 07:38:15 2022-08-22 20:16:33 [INFO] [TRAIN] epoch: 18, iter: 22450/160000, loss: 0.8545, lr: 0.001041, batch_cost: 0.1837, reader_cost: 0.00049, ips: 43.5401 samples/sec | ETA 07:01:13 2022-08-22 20:16:42 [INFO] [TRAIN] epoch: 18, iter: 22500/160000, loss: 0.8952, lr: 0.001041, batch_cost: 0.1738, reader_cost: 0.00067, ips: 46.0257 samples/sec | ETA 06:38:19 2022-08-22 20:16:50 [INFO] [TRAIN] epoch: 18, iter: 22550/160000, loss: 0.8942, lr: 0.001041, batch_cost: 0.1683, reader_cost: 0.00096, ips: 47.5266 samples/sec | ETA 06:25:36 2022-08-22 20:16:58 [INFO] [TRAIN] epoch: 18, iter: 22600/160000, loss: 0.8965, lr: 0.001040, batch_cost: 0.1672, reader_cost: 0.00136, ips: 47.8476 samples/sec | ETA 06:22:52 2022-08-22 20:17:08 [INFO] [TRAIN] epoch: 18, iter: 22650/160000, loss: 0.8705, lr: 0.001040, batch_cost: 0.1915, reader_cost: 0.00063, ips: 41.7856 samples/sec | ETA 07:18:16 2022-08-22 20:17:17 [INFO] [TRAIN] epoch: 18, iter: 22700/160000, loss: 0.8695, lr: 0.001040, batch_cost: 0.1788, reader_cost: 0.00059, ips: 44.7418 samples/sec | ETA 06:49:09 2022-08-22 20:17:29 [INFO] [TRAIN] epoch: 19, iter: 22750/160000, loss: 0.8722, lr: 0.001039, batch_cost: 0.2333, reader_cost: 0.06430, ips: 34.2901 samples/sec | ETA 08:53:40 2022-08-22 20:17:37 [INFO] [TRAIN] epoch: 19, iter: 22800/160000, loss: 0.8488, lr: 0.001039, batch_cost: 0.1767, reader_cost: 0.00102, ips: 45.2788 samples/sec | ETA 06:44:00 2022-08-22 20:17:47 [INFO] [TRAIN] epoch: 19, iter: 22850/160000, loss: 0.8292, lr: 0.001038, batch_cost: 0.1973, reader_cost: 0.00077, ips: 40.5463 samples/sec | ETA 07:31:00 2022-08-22 20:17:56 [INFO] [TRAIN] epoch: 19, iter: 22900/160000, loss: 0.8338, lr: 0.001038, batch_cost: 0.1833, reader_cost: 0.00040, ips: 43.6461 samples/sec | ETA 06:58:49 2022-08-22 20:18:07 [INFO] [TRAIN] epoch: 19, iter: 22950/160000, loss: 0.8268, lr: 0.001038, batch_cost: 0.2064, reader_cost: 0.00057, ips: 38.7551 samples/sec | ETA 07:51:30 2022-08-22 20:18:17 [INFO] [TRAIN] epoch: 19, iter: 23000/160000, loss: 0.9594, lr: 0.001037, batch_cost: 0.2128, reader_cost: 0.00224, ips: 37.5854 samples/sec | ETA 08:06:00 2022-08-22 20:18:17 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 194s - batch_cost: 0.1936 - reader cost: 7.8003e-04 2022-08-22 20:21:31 [INFO] [EVAL] #Images: 2000 mIoU: 0.2913 Acc: 0.7384 Kappa: 0.7191 Dice: 0.4123 2022-08-22 20:21:31 [INFO] [EVAL] Class IoU: [0.6599 0.7426 0.9271 0.7063 0.6746 0.73 0.7484 0.7111 0.4887 0.614 0.4681 0.5118 0.6631 0.2835 0.2353 0.3632 0.5218 0.3814 0.5721 0.3573 0.7018 0.4153 0.5969 0.4418 0.2741 0.3738 0.4402 0.3867 0.3466 0.2442 0.1243 0.4573 0.2778 0.3179 0.3216 0.3636 0.3807 0.4764 0.2523 0.3312 0.1686 0.0933 0.3345 0.2271 0.2906 0.2134 0.1439 0.4148 0.5969 0.4731 0.3978 0.2332 0.1408 0.2418 0.6782 0.3455 0.7473 0.1651 0.3926 0.2586 0.1064 0.2305 0.2322 0.1634 0.3598 0.6086 0.2329 0.4037 0.0446 0.3208 0.3502 0.4329 0.2879 0.258 0.3678 0.3038 0.3081 0.2328 0.2879 0.1049 0.6493 0.3475 0.3107 0.1122 0.3646 0.5102 0.0458 0.0606 0.2236 0.4519 0.3022 0.0264 0.1656 0.0556 0.0113 0.0167 0.0043 0.0235 0.0183 0.2768 0.0244 0.0089 0.0777 0.3523 0.0105 0.297 0.0868 0.4805 0.0795 0.1474 0.0295 0.1592 0.1177 0.5095 0.4816 0. 0.3391 0.5622 0.0462 0.1608 0.4064 0. 0.2306 0.0341 0.2208 0.2058 0.3891 0.4137 0. 0.0279 0.3963 0. 0.2271 0.1519 0.067 0.1064 0.1123 0.0024 0.0934 0.3242 0.2748 0.0147 0.193 0. 0.274 0. 0.3129 0.0064 0.0955 0.0936] 2022-08-22 20:21:31 [INFO] [EVAL] Class Precision: [0.7753 0.8871 0.9657 0.819 0.7607 0.8515 0.8564 0.7517 0.6653 0.8079 0.6091 0.686 0.7362 0.4369 0.4793 0.4772 0.6232 0.6871 0.7116 0.5372 0.7672 0.5763 0.7905 0.5795 0.5092 0.3946 0.4649 0.7185 0.6258 0.316 0.3801 0.6785 0.4676 0.4214 0.516 0.5254 0.5419 0.6168 0.4508 0.5219 0.3386 0.2834 0.5245 0.5363 0.4634 0.4117 0.4376 0.6349 0.6465 0.5911 0.7794 0.2817 0.3142 0.6898 0.7039 0.4099 0.7747 0.7505 0.5838 0.4438 0.5676 0.3763 0.3036 0.6789 0.4243 0.7457 0.2861 0.5259 0.4273 0.5755 0.5443 0.5017 0.4825 0.3482 0.499 0.4559 0.3502 0.4721 0.6503 0.3885 0.8161 0.6064 0.7759 0.2186 0.5507 0.7358 0.5038 0.3209 0.3827 0.6599 0.3783 0.033 0.2532 0.3542 0.0879 0.0345 0.7866 0.2072 0.5338 0.6906 0.3439 0.0113 0.4022 0.5053 0.1869 0.3556 0.1208 0.7506 0.1626 0.276 0.3948 0.1685 0.3202 0.5212 0.4826 0. 0.8153 0.6544 0.0667 0.4238 0.759 0. 0.7331 0.6541 0.5682 0.6039 0.8324 0.5927 0. 1. 0.4368 0. 0.6034 0.463 0.559 0.3558 0.3468 0.0374 0.431 0.6503 0.322 0.0262 0.7307 0. 0.6399 0. 0.8713 0.5413 0.8692 0.8053] 2022-08-22 20:21:31 [INFO] [EVAL] Class Recall: [0.8159 0.8201 0.9587 0.8369 0.8562 0.8364 0.8559 0.9293 0.648 0.7189 0.6692 0.6685 0.8697 0.4469 0.3161 0.6033 0.7624 0.4616 0.7447 0.5162 0.8918 0.5979 0.709 0.6503 0.3726 0.8764 0.8924 0.4558 0.4373 0.518 0.1559 0.5838 0.4062 0.5642 0.4606 0.5414 0.5613 0.6767 0.3642 0.4755 0.2514 0.1221 0.4801 0.2826 0.438 0.307 0.1765 0.5448 0.8862 0.7032 0.4483 0.5749 0.2033 0.2713 0.9488 0.6875 0.955 0.1747 0.5453 0.3825 0.1158 0.3729 0.4968 0.1771 0.703 0.768 0.5564 0.6346 0.0474 0.4202 0.4955 0.7594 0.4164 0.4988 0.5831 0.4767 0.7194 0.3147 0.3406 0.1256 0.7605 0.4488 0.3413 0.1874 0.519 0.6245 0.048 0.0695 0.3497 0.5891 0.6005 0.1172 0.3238 0.0618 0.0128 0.0315 0.0043 0.0258 0.0186 0.316 0.0255 0.0398 0.0879 0.5378 0.011 0.6431 0.2358 0.5718 0.1347 0.2402 0.031 0.7414 0.157 0.9577 0.9956 0. 0.3673 0.7995 0.1305 0.2058 0.4666 0. 0.2518 0.0347 0.2653 0.2379 0.4222 0.5779 0. 0.0279 0.8102 0. 0.2669 0.1844 0.0708 0.1318 0.1424 0.0026 0.1065 0.3926 0.6523 0.0325 0.2078 0. 0.3239 0. 0.3281 0.0065 0.0969 0.0958] 2022-08-22 20:21:31 [INFO] [EVAL] The model with the best validation mIoU (0.3036) was saved at iter 21000. 
2022-08-22 20:21:40 [INFO] [TRAIN] epoch: 19, iter: 23050/160000, loss: 0.8518, lr: 0.001037, batch_cost: 0.1651, reader_cost: 0.00388, ips: 48.4687 samples/sec | ETA 06:16:44 2022-08-22 20:21:49 [INFO] [TRAIN] epoch: 19, iter: 23100/160000, loss: 0.8417, lr: 0.001036, batch_cost: 0.1957, reader_cost: 0.00112, ips: 40.8750 samples/sec | ETA 07:26:33 2022-08-22 20:21:58 [INFO] [TRAIN] epoch: 19, iter: 23150/160000, loss: 0.8919, lr: 0.001036, batch_cost: 0.1743, reader_cost: 0.00041, ips: 45.8882 samples/sec | ETA 06:37:38 2022-08-22 20:22:08 [INFO] [TRAIN] epoch: 19, iter: 23200/160000, loss: 0.8832, lr: 0.001036, batch_cost: 0.1901, reader_cost: 0.00032, ips: 42.0768 samples/sec | ETA 07:13:29 2022-08-22 20:22:16 [INFO] [TRAIN] epoch: 19, iter: 23250/160000, loss: 0.8051, lr: 0.001035, batch_cost: 0.1664, reader_cost: 0.00773, ips: 48.0876 samples/sec | ETA 06:19:10 2022-08-22 20:22:25 [INFO] [TRAIN] epoch: 19, iter: 23300/160000, loss: 0.9506, lr: 0.001035, batch_cost: 0.1733, reader_cost: 0.00121, ips: 46.1755 samples/sec | ETA 06:34:43 2022-08-22 20:22:33 [INFO] [TRAIN] epoch: 19, iter: 23350/160000, loss: 0.7914, lr: 0.001035, batch_cost: 0.1623, reader_cost: 0.00034, ips: 49.2851 samples/sec | ETA 06:09:41 2022-08-22 20:22:41 [INFO] [TRAIN] epoch: 19, iter: 23400/160000, loss: 0.8333, lr: 0.001034, batch_cost: 0.1595, reader_cost: 0.00034, ips: 50.1654 samples/sec | ETA 06:03:03 2022-08-22 20:22:50 [INFO] [TRAIN] epoch: 19, iter: 23450/160000, loss: 0.8670, lr: 0.001034, batch_cost: 0.1792, reader_cost: 0.00059, ips: 44.6466 samples/sec | ETA 06:47:47 2022-08-22 20:23:00 [INFO] [TRAIN] epoch: 19, iter: 23500/160000, loss: 0.8625, lr: 0.001033, batch_cost: 0.2013, reader_cost: 0.00038, ips: 39.7379 samples/sec | ETA 07:38:00 2022-08-22 20:23:09 [INFO] [TRAIN] epoch: 19, iter: 23550/160000, loss: 0.9403, lr: 0.001033, batch_cost: 0.1780, reader_cost: 0.00072, ips: 44.9464 samples/sec | ETA 06:44:46 2022-08-22 20:23:18 [INFO] [TRAIN] epoch: 19, iter: 23600/160000, loss: 0.8669, lr: 0.001033, batch_cost: 0.1854, reader_cost: 0.00038, ips: 43.1392 samples/sec | ETA 07:01:34 2022-08-22 20:23:26 [INFO] [TRAIN] epoch: 19, iter: 23650/160000, loss: 0.9103, lr: 0.001032, batch_cost: 0.1604, reader_cost: 0.00035, ips: 49.8723 samples/sec | ETA 06:04:31 2022-08-22 20:23:35 [INFO] [TRAIN] epoch: 19, iter: 23700/160000, loss: 0.8962, lr: 0.001032, batch_cost: 0.1723, reader_cost: 0.00342, ips: 46.4321 samples/sec | ETA 06:31:23 2022-08-22 20:23:43 [INFO] [TRAIN] epoch: 19, iter: 23750/160000, loss: 0.8845, lr: 0.001032, batch_cost: 0.1670, reader_cost: 0.00159, ips: 47.8982 samples/sec | ETA 06:19:16 2022-08-22 20:23:53 [INFO] [TRAIN] epoch: 19, iter: 23800/160000, loss: 0.8357, lr: 0.001031, batch_cost: 0.1954, reader_cost: 0.00386, ips: 40.9396 samples/sec | ETA 07:23:34 2022-08-22 20:24:02 [INFO] [TRAIN] epoch: 19, iter: 23850/160000, loss: 0.8126, lr: 0.001031, batch_cost: 0.1854, reader_cost: 0.00049, ips: 43.1557 samples/sec | ETA 07:00:38 2022-08-22 20:24:11 [INFO] [TRAIN] epoch: 19, iter: 23900/160000, loss: 0.8119, lr: 0.001030, batch_cost: 0.1803, reader_cost: 0.00083, ips: 44.3770 samples/sec | ETA 06:48:55 2022-08-22 20:24:20 [INFO] [TRAIN] epoch: 19, iter: 23950/160000, loss: 0.8429, lr: 0.001030, batch_cost: 0.1750, reader_cost: 0.00032, ips: 45.7183 samples/sec | ETA 06:36:46 2022-08-22 20:24:34 [INFO] [TRAIN] epoch: 20, iter: 24000/160000, loss: 0.8730, lr: 0.001030, batch_cost: 0.2773, reader_cost: 0.05280, ips: 28.8491 samples/sec | ETA 10:28:33 2022-08-22 20:24:34 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 204s - batch_cost: 0.2035 - reader cost: 7.4918e-04 2022-08-22 20:27:57 [INFO] [EVAL] #Images: 2000 mIoU: 0.2985 Acc: 0.7444 Kappa: 0.7249 Dice: 0.4212 2022-08-22 20:27:57 [INFO] [EVAL] Class IoU: [0.6542 0.7552 0.929 0.6962 0.6741 0.7415 0.7508 0.7483 0.4831 0.6407 0.4282 0.5247 0.6736 0.3181 0.2109 0.3802 0.4858 0.4451 0.5621 0.3626 0.7227 0.4225 0.5845 0.4622 0.3128 0.3543 0.4648 0.3686 0.3615 0.2856 0.1712 0.4567 0.1971 0.3285 0.2855 0.3307 0.3774 0.465 0.2751 0.3343 0.1433 0.1078 0.3297 0.2268 0.3306 0.2125 0.2375 0.4047 0.474 0.4726 0.4478 0.2228 0.1872 0.0475 0.6385 0.4988 0.834 0.3272 0.376 0.2004 0.0664 0.2931 0.2492 0.152 0.4 0.5295 0.2499 0.4217 0.0403 0.3031 0.3098 0.4207 0.3376 0.2246 0.4156 0.2805 0.353 0.1254 0.0723 0.13 0.5324 0.33 0.2855 0.0259 0.361 0.5212 0.1092 0.0405 0.1937 0.4221 0.4038 0.028 0.1941 0.0756 0.0003 0.0072 0.052 0.0298 0.2657 0.3447 0.0052 0.0144 0.2188 0.1299 0.0452 0.4194 0.031 0.5078 0.0909 0.1682 0.0339 0.1896 0.1153 0.5345 0.5858 0. 0.3718 0.5616 0.0521 0.1589 0.4822 0. 0.1859 0.1642 0.2588 0.1555 0.4011 0.3912 0.0253 0.131 0.4956 0.0029 0.22 0.1488 0.0877 0.1216 0.1058 0.0068 0.1863 0.2438 0.2854 0.0017 0.3303 0.004 0.2481 0. 0.3236 0.0094 0.0923 0.0937] 2022-08-22 20:27:57 [INFO] [EVAL] Class Precision: [0.7677 0.8392 0.9684 0.7763 0.7538 0.8369 0.8739 0.8293 0.5903 0.778 0.771 0.6717 0.7577 0.5617 0.4681 0.5836 0.6116 0.6535 0.7126 0.5844 0.8248 0.586 0.7275 0.5932 0.5313 0.5448 0.5344 0.7123 0.6237 0.3825 0.375 0.6054 0.4544 0.5274 0.4015 0.4856 0.5913 0.6502 0.4531 0.4863 0.2124 0.2831 0.5052 0.5074 0.5556 0.3287 0.6803 0.542 0.4813 0.5264 0.6914 0.2545 0.3321 0.8206 0.661 0.719 0.9359 0.6608 0.644 0.2992 0.1312 0.3921 0.5101 0.8098 0.4852 0.577 0.4923 0.568 0.3175 0.5171 0.6233 0.4968 0.5509 0.2859 0.6005 0.4403 0.6535 0.4711 0.8685 0.2537 0.726 0.592 0.7167 0.0705 0.6148 0.7183 0.279 0.3818 0.3965 0.6824 0.5795 0.0394 0.3623 0.3252 0.005 0.0505 0.3946 0.1614 0.4252 0.7483 0.2108 0.0261 0.5493 0.5178 0.2464 0.4825 0.3522 0.7545 0.1738 0.2889 0.1799 0.2043 0.4365 0.6095 0.7898 0.0002 0.6656 0.6085 0.0738 0.512 0.7238 0. 0.3169 0.6028 0.4857 0.6275 0.7716 0.5475 0.0634 0.4567 0.5756 0.1483 0.5296 0.6397 0.7102 0.2849 0.3486 0.1953 0.4546 0.7691 0.3264 0.003 0.6931 0.0573 0.458 0. 0.8021 0.5353 0.5911 0.7017] 2022-08-22 20:27:57 [INFO] [EVAL] Class Recall: [0.8156 0.883 0.958 0.8709 0.8643 0.8667 0.8419 0.8845 0.7267 0.7841 0.4906 0.7056 0.8585 0.4231 0.2774 0.5217 0.7026 0.5827 0.7269 0.4886 0.8537 0.6022 0.7484 0.6766 0.432 0.5033 0.7813 0.433 0.4624 0.53 0.2395 0.6503 0.2582 0.4655 0.4969 0.509 0.5107 0.6202 0.4118 0.5167 0.3057 0.1482 0.4868 0.2909 0.4494 0.3753 0.2674 0.6151 0.9688 0.8221 0.5596 0.6413 0.3001 0.048 0.9495 0.6196 0.8845 0.3932 0.4746 0.3777 0.1186 0.5373 0.3277 0.1576 0.6951 0.8656 0.3367 0.6208 0.0442 0.4228 0.3812 0.7333 0.4657 0.5115 0.5744 0.4359 0.4342 0.1459 0.0731 0.2104 0.6663 0.4272 0.3218 0.0393 0.4665 0.6552 0.1521 0.0433 0.2747 0.5253 0.5711 0.088 0.2947 0.0897 0.0003 0.0083 0.0565 0.0353 0.4146 0.3899 0.0053 0.0311 0.2667 0.1478 0.0524 0.7623 0.0328 0.6083 0.16 0.2871 0.0401 0.7256 0.1354 0.8129 0.694 0. 0.4572 0.8793 0.1505 0.1873 0.5909 0. 0.3104 0.1841 0.3565 0.1713 0.4551 0.5781 0.0405 0.1551 0.7809 0.0029 0.2734 0.1624 0.091 0.175 0.1319 0.007 0.2399 0.263 0.6948 0.0039 0.3869 0.0043 0.3511 0. 0.3517 0.0095 0.0986 0.0976] 2022-08-22 20:27:57 [INFO] [EVAL] The model with the best validation mIoU (0.3036) was saved at iter 21000. 
2022-08-22 20:28:08 [INFO] [TRAIN] epoch: 20, iter: 24050/160000, loss: 0.8652, lr: 0.001029, batch_cost: 0.2169, reader_cost: 0.00337, ips: 36.8912 samples/sec | ETA 08:11:21 2022-08-22 20:28:19 [INFO] [TRAIN] epoch: 20, iter: 24100/160000, loss: 0.8384, lr: 0.001029, batch_cost: 0.2148, reader_cost: 0.00076, ips: 37.2400 samples/sec | ETA 08:06:34 2022-08-22 20:28:29 [INFO] [TRAIN] epoch: 20, iter: 24150/160000, loss: 0.8200, lr: 0.001029, batch_cost: 0.1956, reader_cost: 0.00105, ips: 40.9102 samples/sec | ETA 07:22:45 2022-08-22 20:28:38 [INFO] [TRAIN] epoch: 20, iter: 24200/160000, loss: 0.8247, lr: 0.001028, batch_cost: 0.1854, reader_cost: 0.00247, ips: 43.1430 samples/sec | ETA 06:59:41 2022-08-22 20:28:46 [INFO] [TRAIN] epoch: 20, iter: 24250/160000, loss: 0.8734, lr: 0.001028, batch_cost: 0.1614, reader_cost: 0.00152, ips: 49.5721 samples/sec | ETA 06:05:07 2022-08-22 20:28:56 [INFO] [TRAIN] epoch: 20, iter: 24300/160000, loss: 0.8992, lr: 0.001027, batch_cost: 0.1954, reader_cost: 0.00052, ips: 40.9485 samples/sec | ETA 07:21:51 2022-08-22 20:29:06 [INFO] [TRAIN] epoch: 20, iter: 24350/160000, loss: 0.8095, lr: 0.001027, batch_cost: 0.2065, reader_cost: 0.00042, ips: 38.7493 samples/sec | ETA 07:46:45 2022-08-22 20:29:15 [INFO] [TRAIN] epoch: 20, iter: 24400/160000, loss: 0.8171, lr: 0.001027, batch_cost: 0.1800, reader_cost: 0.00058, ips: 44.4529 samples/sec | ETA 06:46:43 2022-08-22 20:29:26 [INFO] [TRAIN] epoch: 20, iter: 24450/160000, loss: 0.7767, lr: 0.001026, batch_cost: 0.2054, reader_cost: 0.00069, ips: 38.9487 samples/sec | ETA 07:44:01 2022-08-22 20:29:34 [INFO] [TRAIN] epoch: 20, iter: 24500/160000, loss: 0.8099, lr: 0.001026, batch_cost: 0.1686, reader_cost: 0.00037, ips: 47.4569 samples/sec | ETA 06:20:41 2022-08-22 20:29:42 [INFO] [TRAIN] epoch: 20, iter: 24550/160000, loss: 0.8499, lr: 0.001025, batch_cost: 0.1592, reader_cost: 0.00285, ips: 50.2515 samples/sec | ETA 05:59:23 2022-08-22 20:29:52 [INFO] [TRAIN] epoch: 20, iter: 24600/160000, loss: 0.8521, lr: 0.001025, batch_cost: 0.1951, reader_cost: 0.00031, ips: 41.0106 samples/sec | ETA 07:20:12 2022-08-22 20:30:03 [INFO] [TRAIN] epoch: 20, iter: 24650/160000, loss: 0.8036, lr: 0.001025, batch_cost: 0.2163, reader_cost: 0.00053, ips: 36.9800 samples/sec | ETA 08:08:00 2022-08-22 20:30:11 [INFO] [TRAIN] epoch: 20, iter: 24700/160000, loss: 0.8665, lr: 0.001024, batch_cost: 0.1722, reader_cost: 0.00058, ips: 46.4691 samples/sec | ETA 06:28:12 2022-08-22 20:30:20 [INFO] [TRAIN] epoch: 20, iter: 24750/160000, loss: 0.8734, lr: 0.001024, batch_cost: 0.1760, reader_cost: 0.00054, ips: 45.4556 samples/sec | ETA 06:36:43 2022-08-22 20:30:29 [INFO] [TRAIN] epoch: 20, iter: 24800/160000, loss: 0.8458, lr: 0.001024, batch_cost: 0.1818, reader_cost: 0.00045, ips: 44.0002 samples/sec | ETA 06:49:41 2022-08-22 20:30:41 [INFO] [TRAIN] epoch: 20, iter: 24850/160000, loss: 0.8107, lr: 0.001023, batch_cost: 0.2317, reader_cost: 0.00440, ips: 34.5261 samples/sec | ETA 08:41:55 2022-08-22 20:30:50 [INFO] [TRAIN] epoch: 20, iter: 24900/160000, loss: 0.8404, lr: 0.001023, batch_cost: 0.1956, reader_cost: 0.00069, ips: 40.9061 samples/sec | ETA 07:20:21 2022-08-22 20:31:00 [INFO] [TRAIN] epoch: 20, iter: 24950/160000, loss: 0.8864, lr: 0.001022, batch_cost: 0.1957, reader_cost: 0.00131, ips: 40.8794 samples/sec | ETA 07:20:28 2022-08-22 20:31:11 [INFO] [TRAIN] epoch: 20, iter: 25000/160000, loss: 0.8264, lr: 0.001022, batch_cost: 0.2139, reader_cost: 0.00038, ips: 37.4094 samples/sec | ETA 08:01:09 2022-08-22 20:31:11 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 168s - batch_cost: 0.1681 - reader cost: 8.3851e-04 2022-08-22 20:33:59 [INFO] [EVAL] #Images: 2000 mIoU: 0.3074 Acc: 0.7465 Kappa: 0.7270 Dice: 0.4312 2022-08-22 20:33:59 [INFO] [EVAL] Class IoU: [0.6567 0.7656 0.9295 0.7065 0.6633 0.7425 0.7611 0.7386 0.4928 0.6242 0.4587 0.5218 0.6656 0.2592 0.2276 0.3737 0.4652 0.3877 0.5814 0.3776 0.6958 0.4961 0.5897 0.4171 0.3126 0.3364 0.5159 0.3982 0.3685 0.2522 0.2292 0.4289 0.2017 0.3381 0.2935 0.3517 0.387 0.4402 0.2353 0.3111 0.1704 0.12 0.3219 0.2376 0.3122 0.1626 0.401 0.3643 0.5652 0.5179 0.4394 0.3785 0.1495 0.2499 0.6966 0.3711 0.7901 0.3398 0.4123 0.2512 0.1099 0.3184 0.2597 0.1289 0.4055 0.4973 0.2767 0.4116 0.0144 0.274 0.3724 0.4487 0.3428 0.2311 0.3968 0.3102 0.4714 0.2501 0.1768 0.1382 0.5985 0.3381 0.31 0.0018 0.2826 0.5143 0.0403 0.0356 0.3222 0.3939 0.2934 0.0512 0.1867 0.0795 0.0061 0.0078 0.0029 0.0482 0.262 0.3367 0.0441 0.0064 0.0369 0.7839 0.0014 0.2363 0.0601 0.5404 0.1193 0.2783 0.0358 0.0516 0.1267 0.5599 0.6625 0.0034 0.3492 0.5551 0.0671 0.1757 0.3833 0. 0.1673 0.1422 0.2015 0.1437 0.4067 0.386 0. 0.1461 0.4385 0.0025 0.2087 0.2431 0.0867 0.1259 0.0989 0.0067 0.1207 0.3141 0.2944 0.0238 0.3137 0.1959 0.1971 0. 0.3646 0.0223 0.1199 0.0646] 2022-08-22 20:33:59 [INFO] [EVAL] Class Precision: [0.7571 0.8567 0.9641 0.8182 0.7462 0.8441 0.8651 0.8033 0.655 0.7169 0.7289 0.7285 0.7323 0.517 0.4737 0.5417 0.5375 0.6943 0.7338 0.538 0.7689 0.6415 0.7477 0.6582 0.4497 0.5205 0.6618 0.6113 0.6268 0.4117 0.3802 0.6317 0.4093 0.4468 0.4082 0.5561 0.6494 0.6836 0.4265 0.5226 0.311 0.2753 0.7579 0.5589 0.3871 0.4622 0.8044 0.4414 0.5889 0.6601 0.6322 0.5979 0.3802 0.7775 0.7275 0.4536 0.8512 0.6207 0.6469 0.4584 0.2025 0.3583 0.3044 0.7179 0.5249 0.7526 0.3623 0.5905 0.3013 0.5084 0.6545 0.5825 0.6376 0.2794 0.5983 0.4573 0.6442 0.6035 0.2728 0.2359 0.6901 0.6278 0.7511 0.0069 0.6088 0.6563 0.4334 0.2457 0.5972 0.6717 0.3436 0.2185 0.4648 0.2689 0.0585 0.0341 0.05 0.2178 0.469 0.7628 0.317 0.0122 0.4137 0.9593 0.1517 0.243 0.2777 0.8197 0.189 0.429 0.2598 0.0521 0.396 0.767 0.668 0.1046 0.544 0.6171 0.1262 0.437 0.5906 0. 0.3492 0.7328 0.6624 0.5487 0.9314 0.5909 0. 0.5598 0.7326 0.0574 0.5757 0.6128 0.6253 0.2657 0.2506 0.2413 0.3367 0.6874 0.3362 0.034 0.5567 0.3622 0.6499 0. 0.8284 0.4464 0.5478 0.9877] 2022-08-22 20:33:59 [INFO] [EVAL] Class Recall: [0.8319 0.8781 0.9628 0.8379 0.8565 0.8605 0.8636 0.9016 0.6655 0.8284 0.5531 0.6478 0.8797 0.3421 0.3046 0.5464 0.7755 0.4675 0.7367 0.5587 0.8798 0.6865 0.7361 0.5325 0.5063 0.4875 0.7006 0.5332 0.4721 0.3943 0.366 0.5718 0.2845 0.5817 0.5107 0.4889 0.4891 0.5529 0.3441 0.4347 0.2737 0.1755 0.3588 0.2925 0.6175 0.2006 0.4443 0.6758 0.9337 0.7062 0.5903 0.5077 0.1976 0.2692 0.9424 0.6713 0.9167 0.4289 0.5321 0.3573 0.1939 0.7409 0.6388 0.1358 0.6406 0.5944 0.5393 0.5759 0.0149 0.3728 0.4635 0.6614 0.4257 0.5719 0.5409 0.491 0.6374 0.2993 0.3343 0.2502 0.8185 0.4229 0.3455 0.0024 0.3453 0.7038 0.0425 0.0399 0.4116 0.4879 0.6673 0.0628 0.2378 0.1015 0.0068 0.01 0.0031 0.0583 0.3725 0.376 0.0487 0.0134 0.0389 0.8109 0.0014 0.8967 0.0713 0.6134 0.2445 0.442 0.0399 0.8446 0.1571 0.6746 0.9876 0.0035 0.4937 0.8468 0.1252 0.227 0.5219 0. 0.2431 0.15 0.2246 0.163 0.4193 0.5268 0. 0.1651 0.5221 0.0026 0.2466 0.2872 0.0915 0.1932 0.1405 0.0069 0.1584 0.3664 0.7029 0.0739 0.4182 0.2992 0.2206 0. 0.3944 0.0229 0.1331 0.0646] 2022-08-22 20:33:59 [INFO] [EVAL] The model with the best validation mIoU (0.3074) was saved at iter 25000. 
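Each evaluation pass runs the 2000 validation images (1000 iterations) and prints the aggregate mIoU, Acc, Kappa and Dice, followed by per-class IoU, Precision and Recall for every ADE20K class; mIoU is the mean of the Class IoU array. Because the three per-class values are ratios of the same true-positive, false-positive and false-negative pixel counts, IoU = TP / (TP + FP + FN) can be recovered from Precision and Recall as 1 / (1/Precision + 1/Recall - 1), and the corresponding per-class Dice would be 2*IoU / (1 + IoU). A small consistency check against the iter-25000 results above (a sketch, not part of the training code):

def iou_from_precision_recall(precision, recall):
    # IoU = TP / (TP + FP + FN), rewritten in terms of precision and recall
    return 1.0 / (1.0 / precision + 1.0 / recall - 1.0)

# Class 0 of the iter-25000 evaluation: precision 0.7571, recall 0.8319
print(round(iou_from_precision_recall(0.7571, 0.8319), 4))  # 0.6566 vs. reported 0.6567

The "best validation mIoU" line only advances when a new evaluation improves on the best value seen so far, which is why it moves from 0.3036 (iter 21000) to 0.3074 here and stays pinned at a past iteration whenever a later evaluation scores lower.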
2022-08-22 20:34:07 [INFO] [TRAIN] epoch: 20, iter: 25050/160000, loss: 0.7643, lr: 0.001022, batch_cost: 0.1606, reader_cost: 0.00376, ips: 49.8167 samples/sec | ETA 06:01:11 2022-08-22 20:34:16 [INFO] [TRAIN] epoch: 20, iter: 25100/160000, loss: 0.8698, lr: 0.001021, batch_cost: 0.1702, reader_cost: 0.00180, ips: 47.0016 samples/sec | ETA 06:22:40 2022-08-22 20:34:24 [INFO] [TRAIN] epoch: 20, iter: 25150/160000, loss: 0.8264, lr: 0.001021, batch_cost: 0.1642, reader_cost: 0.00051, ips: 48.7185 samples/sec | ETA 06:09:03 2022-08-22 20:34:33 [INFO] [TRAIN] epoch: 20, iter: 25200/160000, loss: 0.7714, lr: 0.001021, batch_cost: 0.1738, reader_cost: 0.00044, ips: 46.0201 samples/sec | ETA 06:30:33 2022-08-22 20:34:41 [INFO] [TRAIN] epoch: 20, iter: 25250/160000, loss: 0.8062, lr: 0.001020, batch_cost: 0.1598, reader_cost: 0.00032, ips: 50.0635 samples/sec | ETA 05:58:52 2022-08-22 20:34:53 [INFO] [TRAIN] epoch: 21, iter: 25300/160000, loss: 0.8683, lr: 0.001020, batch_cost: 0.2416, reader_cost: 0.06709, ips: 33.1174 samples/sec | ETA 09:02:18 2022-08-22 20:35:02 [INFO] [TRAIN] epoch: 21, iter: 25350/160000, loss: 0.8078, lr: 0.001019, batch_cost: 0.1900, reader_cost: 0.00073, ips: 42.1053 samples/sec | ETA 07:06:23 2022-08-22 20:35:11 [INFO] [TRAIN] epoch: 21, iter: 25400/160000, loss: 0.7826, lr: 0.001019, batch_cost: 0.1690, reader_cost: 0.00042, ips: 47.3317 samples/sec | ETA 06:19:10 2022-08-22 20:35:19 [INFO] [TRAIN] epoch: 21, iter: 25450/160000, loss: 0.8684, lr: 0.001019, batch_cost: 0.1592, reader_cost: 0.00128, ips: 50.2508 samples/sec | ETA 05:57:00 2022-08-22 20:35:29 [INFO] [TRAIN] epoch: 21, iter: 25500/160000, loss: 0.7966, lr: 0.001018, batch_cost: 0.1953, reader_cost: 0.00046, ips: 40.9716 samples/sec | ETA 07:17:42 2022-08-22 20:35:37 [INFO] [TRAIN] epoch: 21, iter: 25550/160000, loss: 0.8238, lr: 0.001018, batch_cost: 0.1706, reader_cost: 0.00083, ips: 46.9035 samples/sec | ETA 06:22:12 2022-08-22 20:35:45 [INFO] [TRAIN] epoch: 21, iter: 25600/160000, loss: 0.8636, lr: 0.001018, batch_cost: 0.1504, reader_cost: 0.00051, ips: 53.1940 samples/sec | ETA 05:36:52 2022-08-22 20:35:52 [INFO] [TRAIN] epoch: 21, iter: 25650/160000, loss: 0.8383, lr: 0.001017, batch_cost: 0.1541, reader_cost: 0.00056, ips: 51.9017 samples/sec | ETA 05:45:08 2022-08-22 20:36:01 [INFO] [TRAIN] epoch: 21, iter: 25700/160000, loss: 0.8530, lr: 0.001017, batch_cost: 0.1650, reader_cost: 0.00082, ips: 48.4731 samples/sec | ETA 06:09:24 2022-08-22 20:36:10 [INFO] [TRAIN] epoch: 21, iter: 25750/160000, loss: 0.8482, lr: 0.001016, batch_cost: 0.1864, reader_cost: 0.00060, ips: 42.9294 samples/sec | ETA 06:56:57 2022-08-22 20:36:19 [INFO] [TRAIN] epoch: 21, iter: 25800/160000, loss: 0.8364, lr: 0.001016, batch_cost: 0.1875, reader_cost: 0.00082, ips: 42.6603 samples/sec | ETA 06:59:26 2022-08-22 20:36:29 [INFO] [TRAIN] epoch: 21, iter: 25850/160000, loss: 0.8256, lr: 0.001016, batch_cost: 0.1893, reader_cost: 0.00058, ips: 42.2573 samples/sec | ETA 07:03:16 2022-08-22 20:36:38 [INFO] [TRAIN] epoch: 21, iter: 25900/160000, loss: 0.8706, lr: 0.001015, batch_cost: 0.1839, reader_cost: 0.00041, ips: 43.5107 samples/sec | ETA 06:50:56 2022-08-22 20:36:48 [INFO] [TRAIN] epoch: 21, iter: 25950/160000, loss: 0.8303, lr: 0.001015, batch_cost: 0.1961, reader_cost: 0.00054, ips: 40.7971 samples/sec | ETA 07:18:06 2022-08-22 20:36:57 [INFO] [TRAIN] epoch: 21, iter: 26000/160000, loss: 0.7890, lr: 0.001015, batch_cost: 0.1932, reader_cost: 0.00077, ips: 41.4172 samples/sec | ETA 07:11:22 2022-08-22 20:36:57 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 193s - batch_cost: 0.1932 - reader cost: 7.2253e-04 2022-08-22 20:40:11 [INFO] [EVAL] #Images: 2000 mIoU: 0.3071 Acc: 0.7452 Kappa: 0.7258 Dice: 0.4332 2022-08-22 20:40:11 [INFO] [EVAL] Class IoU: [0.6614 0.7609 0.929 0.7046 0.6672 0.7488 0.7524 0.7507 0.4952 0.6105 0.467 0.5234 0.677 0.2928 0.2214 0.3803 0.4519 0.4364 0.5763 0.3593 0.6901 0.4222 0.5995 0.4463 0.2915 0.3916 0.4266 0.3536 0.3577 0.251 0.1503 0.4236 0.2675 0.3048 0.299 0.3575 0.3789 0.4166 0.2457 0.3258 0.1597 0.1074 0.3293 0.2035 0.2996 0.1558 0.3255 0.3812 0.5917 0.4633 0.4383 0.3923 0.1847 0.2062 0.6331 0.2855 0.7567 0.3357 0.4117 0.23 0.1319 0.3463 0.2896 0.2253 0.3943 0.4996 0.3088 0.3892 0.0165 0.2789 0.352 0.3735 0.3235 0.2491 0.4197 0.2804 0.4345 0.2907 0.2275 0.2595 0.5049 0.3428 0.2318 0.0625 0.3824 0.5053 0.0458 0.0665 0.2089 0.3849 0.308 0.0294 0.234 0.038 0.0911 0.0208 0.0172 0.0602 0.0917 0.3075 0.0357 0.0152 0.0365 0.2558 0.0715 0.5504 0.0486 0.4607 0.0823 0.2537 0.0819 0.4077 0.1187 0.4814 0.7457 0.0004 0.3748 0.5325 0.1115 0.2108 0.3792 0.0003 0.2203 0.1306 0.1695 0.1942 0.4141 0.4095 0. 0.0525 0.573 0.0099 0.2736 0.1726 0.0691 0.1374 0.1114 0.0107 0.1092 0.2933 0.4489 0.0027 0.3295 0.0121 0.2213 0. 0.37 0.0164 0.0967 0.1758] 2022-08-22 20:40:11 [INFO] [EVAL] Class Precision: [0.7753 0.8305 0.9668 0.8142 0.7716 0.8469 0.8412 0.8347 0.6707 0.8066 0.6634 0.7118 0.7582 0.539 0.4797 0.5508 0.508 0.7037 0.7335 0.5751 0.7681 0.6761 0.755 0.5478 0.4853 0.5943 0.5525 0.8074 0.6298 0.3242 0.3534 0.5322 0.5781 0.4617 0.4997 0.5672 0.5103 0.715 0.4292 0.4939 0.2932 0.2534 0.7448 0.4054 0.4759 0.4932 0.5348 0.6327 0.6707 0.5484 0.6125 0.5916 0.3329 0.699 0.6506 0.3255 0.7864 0.6452 0.6727 0.3697 0.1738 0.4148 0.367 0.579 0.501 0.5403 0.4499 0.6247 0.151 0.5075 0.6145 0.412 0.6279 0.2887 0.5786 0.4283 0.583 0.4718 0.5821 0.3534 0.5509 0.6134 0.8169 0.1684 0.4255 0.718 0.3655 0.3137 0.3229 0.5287 0.3866 0.0499 0.3606 0.2926 0.1445 0.0425 0.0536 0.1968 0.3027 0.6335 0.2492 0.0228 0.3402 0.3774 0.2812 0.6465 0.3187 0.6942 0.1544 0.3332 0.2027 0.8929 0.5053 0.7859 0.7632 0.0089 0.5622 0.5438 0.1386 0.5326 0.6122 0.0106 0.8129 0.5712 0.6309 0.5354 0.6441 0.5784 0. 0.5232 0.7479 0.0787 0.5402 0.437 0.5106 0.2928 0.2906 0.0397 0.3322 0.6676 0.5751 0.0053 0.6742 0.0828 0.5586 0. 0.8092 0.5432 0.7759 0.9077] 2022-08-22 20:40:11 [INFO] [EVAL] Class Recall: [0.8183 0.9008 0.9596 0.8396 0.8314 0.8659 0.8769 0.8819 0.6543 0.7151 0.6121 0.6641 0.8635 0.3906 0.2914 0.5514 0.8037 0.5347 0.729 0.4892 0.8717 0.5292 0.7443 0.7068 0.422 0.5345 0.6518 0.3861 0.453 0.5266 0.2073 0.6748 0.3324 0.4728 0.4269 0.4917 0.5955 0.4995 0.3649 0.4891 0.2597 0.1571 0.3712 0.29 0.4471 0.1855 0.4542 0.4896 0.834 0.7491 0.6065 0.5381 0.2932 0.2262 0.9594 0.6992 0.9525 0.4116 0.5148 0.3785 0.3539 0.677 0.5786 0.2695 0.6494 0.8692 0.4962 0.508 0.0182 0.3825 0.4518 0.7995 0.4003 0.6446 0.6045 0.4481 0.6305 0.4309 0.2719 0.494 0.8581 0.4373 0.2446 0.0905 0.7902 0.6304 0.0497 0.0778 0.3717 0.5859 0.6025 0.0665 0.4 0.0419 0.198 0.0391 0.0247 0.0798 0.1163 0.3741 0.04 0.0436 0.0393 0.4427 0.0874 0.7874 0.0542 0.5781 0.1497 0.5151 0.1209 0.4287 0.1343 0.5541 0.9702 0.0004 0.5292 0.9623 0.3635 0.2587 0.4991 0.0003 0.232 0.1448 0.1882 0.2336 0.537 0.5838 0. 0.0551 0.7103 0.0113 0.3566 0.2219 0.074 0.2056 0.1531 0.0145 0.1399 0.3434 0.6716 0.0053 0.3918 0.014 0.2682 0. 
0.4053 0.0167 0.0995 0.179 ] 2022-08-22 20:40:11 [INFO] [EVAL] The model with the best validation mIoU (0.3074) was saved at iter 25000. 2022-08-22 20:40:20 [INFO] [TRAIN] epoch: 21, iter: 26050/160000, loss: 0.8350, lr: 0.001014, batch_cost: 0.1750, reader_cost: 0.00438, ips: 45.7175 samples/sec | ETA 06:30:39 2022-08-22 20:40:30 [INFO] [TRAIN] epoch: 21, iter: 26100/160000, loss: 0.8573, lr: 0.001014, batch_cost: 0.2114, reader_cost: 0.00116, ips: 37.8511 samples/sec | ETA 07:51:40 2022-08-22 20:40:39 [INFO] [TRAIN] epoch: 21, iter: 26150/160000, loss: 0.8324, lr: 0.001013, batch_cost: 0.1798, reader_cost: 0.00154, ips: 44.4969 samples/sec | ETA 06:41:04 2022-08-22 20:40:48 [INFO] [TRAIN] epoch: 21, iter: 26200/160000, loss: 0.8038, lr: 0.001013, batch_cost: 0.1760, reader_cost: 0.00049, ips: 45.4591 samples/sec | ETA 06:32:26 2022-08-22 20:40:57 [INFO] [TRAIN] epoch: 21, iter: 26250/160000, loss: 0.8107, lr: 0.001013, batch_cost: 0.1846, reader_cost: 0.00041, ips: 43.3444 samples/sec | ETA 06:51:25 2022-08-22 20:41:06 [INFO] [TRAIN] epoch: 21, iter: 26300/160000, loss: 0.8200, lr: 0.001012, batch_cost: 0.1717, reader_cost: 0.00034, ips: 46.5796 samples/sec | ETA 06:22:42 2022-08-22 20:41:14 [INFO] [TRAIN] epoch: 21, iter: 26350/160000, loss: 0.8162, lr: 0.001012, batch_cost: 0.1535, reader_cost: 0.00086, ips: 52.1037 samples/sec | ETA 05:42:00 2022-08-22 20:41:23 [INFO] [TRAIN] epoch: 21, iter: 26400/160000, loss: 0.8597, lr: 0.001011, batch_cost: 0.1813, reader_cost: 0.00042, ips: 44.1254 samples/sec | ETA 06:43:41 2022-08-22 20:41:31 [INFO] [TRAIN] epoch: 21, iter: 26450/160000, loss: 0.8735, lr: 0.001011, batch_cost: 0.1743, reader_cost: 0.00069, ips: 45.9018 samples/sec | ETA 06:27:55 2022-08-22 20:41:40 [INFO] [TRAIN] epoch: 21, iter: 26500/160000, loss: 0.8129, lr: 0.001011, batch_cost: 0.1801, reader_cost: 0.00076, ips: 44.4256 samples/sec | ETA 06:40:40 2022-08-22 20:41:51 [INFO] [TRAIN] epoch: 22, iter: 26550/160000, loss: 0.8486, lr: 0.001010, batch_cost: 0.2098, reader_cost: 0.04549, ips: 38.1324 samples/sec | ETA 07:46:37 2022-08-22 20:41:59 [INFO] [TRAIN] epoch: 22, iter: 26600/160000, loss: 0.7913, lr: 0.001010, batch_cost: 0.1519, reader_cost: 0.00051, ips: 52.6644 samples/sec | ETA 05:37:44 2022-08-22 20:42:06 [INFO] [TRAIN] epoch: 22, iter: 26650/160000, loss: 0.8822, lr: 0.001010, batch_cost: 0.1570, reader_cost: 0.00069, ips: 50.9627 samples/sec | ETA 05:48:52 2022-08-22 20:42:15 [INFO] [TRAIN] epoch: 22, iter: 26700/160000, loss: 0.8115, lr: 0.001009, batch_cost: 0.1649, reader_cost: 0.00032, ips: 48.4996 samples/sec | ETA 06:06:27 2022-08-22 20:42:23 [INFO] [TRAIN] epoch: 22, iter: 26750/160000, loss: 0.8009, lr: 0.001009, batch_cost: 0.1663, reader_cost: 0.00046, ips: 48.1047 samples/sec | ETA 06:09:19 2022-08-22 20:42:31 [INFO] [TRAIN] epoch: 22, iter: 26800/160000, loss: 0.7807, lr: 0.001008, batch_cost: 0.1560, reader_cost: 0.00067, ips: 51.2969 samples/sec | ETA 05:46:13 2022-08-22 20:42:41 [INFO] [TRAIN] epoch: 22, iter: 26850/160000, loss: 0.8197, lr: 0.001008, batch_cost: 0.2006, reader_cost: 0.00043, ips: 39.8759 samples/sec | ETA 07:25:12 2022-08-22 20:42:51 [INFO] [TRAIN] epoch: 22, iter: 26900/160000, loss: 0.8205, lr: 0.001008, batch_cost: 0.2130, reader_cost: 0.00050, ips: 37.5586 samples/sec | ETA 07:52:30 2022-08-22 20:43:00 [INFO] [TRAIN] epoch: 22, iter: 26950/160000, loss: 0.7897, lr: 0.001007, batch_cost: 0.1807, reader_cost: 0.00058, ips: 44.2674 samples/sec | ETA 06:40:44 2022-08-22 20:43:10 [INFO] [TRAIN] epoch: 22, iter: 27000/160000, loss: 
0.8480, lr: 0.001007, batch_cost: 0.1898, reader_cost: 0.01364, ips: 42.1477 samples/sec | ETA 07:00:44 2022-08-22 20:43:10 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 190s - batch_cost: 0.1903 - reader cost: 0.0013 2022-08-22 20:46:21 [INFO] [EVAL] #Images: 2000 mIoU: 0.3120 Acc: 0.7480 Kappa: 0.7289 Dice: 0.4381 2022-08-22 20:46:21 [INFO] [EVAL] Class IoU: [0.655 0.7582 0.9266 0.7034 0.665 0.7332 0.7561 0.7447 0.4955 0.6377 0.4598 0.5407 0.6641 0.3034 0.2488 0.3963 0.468 0.3811 0.5735 0.3865 0.7191 0.5066 0.6039 0.4513 0.335 0.4088 0.4486 0.3818 0.369 0.2396 0.1875 0.4268 0.2418 0.3477 0.2714 0.3758 0.378 0.4909 0.2677 0.3461 0.1329 0.0844 0.3378 0.2099 0.2999 0.1985 0.3548 0.4068 0.5584 0.4427 0.4802 0.3065 0.0955 0.2543 0.6854 0.4321 0.7536 0.3268 0.3819 0.2537 0.1473 0.4389 0.2802 0.1662 0.4219 0.5852 0.2326 0.4057 0.0576 0.2442 0.3898 0.4565 0.3388 0.264 0.4133 0.2787 0.3753 0.2366 0.1244 0.1099 0.5091 0.3559 0.3604 0.065 0.3437 0.5184 0.0288 0.0433 0.3061 0.4302 0.3713 0.092 0.1655 0.0694 0. 0.0104 0.0566 0.0464 0.2138 0.2585 0.0168 0.0197 0.1879 0.0832 0.0157 0.4975 0.1263 0.4956 0.0859 0.2299 0.0702 0.4995 0.106 0.5514 0.5305 0.0012 0.3787 0.6061 0.1521 0.3451 0.3935 0.0076 0.1988 0.0533 0.2092 0.1999 0.4471 0.3866 0. 0.0925 0.4601 0. 0.2359 0.1523 0.0937 0.1263 0.0838 0.0041 0.1894 0.278 0.4718 0.029 0.3468 0.0058 0.2709 0. 0.3473 0.0228 0.107 0.0877] 2022-08-22 20:46:21 [INFO] [EVAL] Class Precision: [0.7755 0.8275 0.9683 0.8005 0.7477 0.8742 0.8567 0.8219 0.6405 0.7652 0.6703 0.6674 0.746 0.521 0.4836 0.5763 0.5591 0.6544 0.6817 0.5602 0.8155 0.6719 0.7549 0.59 0.4973 0.5123 0.6377 0.5264 0.6186 0.3751 0.3797 0.7 0.5017 0.4993 0.4818 0.4833 0.6525 0.6439 0.5891 0.5379 0.2015 0.2729 0.5106 0.5494 0.5964 0.5072 0.6104 0.6343 0.5789 0.4942 0.7329 0.411 0.3978 0.5591 0.7117 0.5693 0.7786 0.6621 0.4491 0.4966 0.2741 0.5732 0.4051 0.7301 0.5656 0.6869 0.4326 0.6149 0.7244 0.6748 0.583 0.5349 0.556 0.3555 0.6111 0.4472 0.4134 0.5542 0.2059 0.1975 0.5715 0.6916 0.6766 0.185 0.6129 0.6699 0.2867 0.3524 0.7461 0.7519 0.502 0.151 0.3696 0.2866 0. 0.0327 0.3333 0.3627 0.5031 0.6689 0.7453 0.0376 0.5488 0.8895 0.1729 0.6509 0.2556 0.7757 0.1722 0.3499 0.215 0.7591 0.5402 0.7018 0.5329 0.0513 0.6181 0.7131 0.3338 0.5475 0.6342 0.1956 0.551 0.8624 0.6806 0.6168 0.8971 0.5978 0. 0.7672 0.6389 0. 0.4317 0.683 0.5371 0.2944 0.43 0.0514 0.4905 0.6899 0.823 0.0427 0.4786 0.1055 0.5596 0. 0.7827 0.3997 0.5073 0.8232] 2022-08-22 20:46:21 [INFO] [EVAL] Class Recall: [0.8082 0.9004 0.9556 0.8529 0.8574 0.8196 0.8655 0.888 0.6863 0.7928 0.5942 0.7402 0.8581 0.4208 0.3388 0.5593 0.7416 0.4771 0.7832 0.5549 0.8589 0.6732 0.7512 0.6575 0.5065 0.6694 0.602 0.5816 0.4777 0.3988 0.2703 0.5224 0.3181 0.5338 0.3832 0.6281 0.4733 0.6737 0.3291 0.4925 0.2808 0.1089 0.4996 0.2535 0.3763 0.2459 0.4587 0.5314 0.9403 0.8095 0.582 0.5466 0.1117 0.318 0.9489 0.6418 0.9591 0.3922 0.7187 0.3415 0.2415 0.652 0.4761 0.1771 0.624 0.7981 0.3346 0.5438 0.0589 0.2768 0.5406 0.757 0.4644 0.5065 0.5609 0.4252 0.8025 0.2921 0.2391 0.1986 0.8234 0.4231 0.4355 0.0911 0.439 0.6962 0.031 0.047 0.3417 0.5014 0.5877 0.1906 0.2305 0.0839 0. 0.0151 0.0638 0.0505 0.271 0.2964 0.0168 0.0397 0.2222 0.084 0.0169 0.6786 0.1999 0.5785 0.1463 0.4014 0.0943 0.5936 0.1165 0.72 0.9914 0.0012 0.4943 0.8016 0.2185 0.4829 0.509 0.0079 0.2372 0.0538 0.232 0.2283 0.4713 0.5225 0. 0.0952 0.6217 0. 0.3421 0.1639 0.102 0.181 0.0943 0.0044 0.2358 0.3177 0.5251 0.0834 0.5574 0.0061 0.3442 0. 
0.3844 0.0236 0.1194 0.0894] 2022-08-22 20:46:21 [INFO] [EVAL] The model with the best validation mIoU (0.3120) was saved at iter 27000. 2022-08-22 20:46:29 [INFO] [TRAIN] epoch: 22, iter: 27050/160000, loss: 0.7524, lr: 0.001007, batch_cost: 0.1613, reader_cost: 0.00405, ips: 49.6020 samples/sec | ETA 05:57:22 2022-08-22 20:46:38 [INFO] [TRAIN] epoch: 22, iter: 27100/160000, loss: 0.8194, lr: 0.001006, batch_cost: 0.1866, reader_cost: 0.00118, ips: 42.8811 samples/sec | ETA 06:53:14 2022-08-22 20:46:47 [INFO] [TRAIN] epoch: 22, iter: 27150/160000, loss: 0.7994, lr: 0.001006, batch_cost: 0.1837, reader_cost: 0.00072, ips: 43.5383 samples/sec | ETA 06:46:50 2022-08-22 20:46:56 [INFO] [TRAIN] epoch: 22, iter: 27200/160000, loss: 0.8931, lr: 0.001005, batch_cost: 0.1751, reader_cost: 0.00140, ips: 45.6843 samples/sec | ETA 06:27:35 2022-08-22 20:47:06 [INFO] [TRAIN] epoch: 22, iter: 27250/160000, loss: 0.7726, lr: 0.001005, batch_cost: 0.2087, reader_cost: 0.00050, ips: 38.3260 samples/sec | ETA 07:41:49 2022-08-22 20:47:17 [INFO] [TRAIN] epoch: 22, iter: 27300/160000, loss: 0.7937, lr: 0.001005, batch_cost: 0.2014, reader_cost: 0.00059, ips: 39.7264 samples/sec | ETA 07:25:22 2022-08-22 20:47:26 [INFO] [TRAIN] epoch: 22, iter: 27350/160000, loss: 0.8346, lr: 0.001004, batch_cost: 0.1934, reader_cost: 0.00057, ips: 41.3668 samples/sec | ETA 07:07:33 2022-08-22 20:47:36 [INFO] [TRAIN] epoch: 22, iter: 27400/160000, loss: 0.8571, lr: 0.001004, batch_cost: 0.1903, reader_cost: 0.00057, ips: 42.0405 samples/sec | ETA 07:00:32 2022-08-22 20:47:45 [INFO] [TRAIN] epoch: 22, iter: 27450/160000, loss: 0.8343, lr: 0.001004, batch_cost: 0.1841, reader_cost: 0.00057, ips: 43.4639 samples/sec | ETA 06:46:37 2022-08-22 20:47:55 [INFO] [TRAIN] epoch: 22, iter: 27500/160000, loss: 0.7805, lr: 0.001003, batch_cost: 0.1926, reader_cost: 0.00066, ips: 41.5350 samples/sec | ETA 07:05:20 2022-08-22 20:48:03 [INFO] [TRAIN] epoch: 22, iter: 27550/160000, loss: 0.8061, lr: 0.001003, batch_cost: 0.1639, reader_cost: 0.00058, ips: 48.8230 samples/sec | ETA 06:01:42 2022-08-22 20:48:11 [INFO] [TRAIN] epoch: 22, iter: 27600/160000, loss: 0.8149, lr: 0.001002, batch_cost: 0.1692, reader_cost: 0.00061, ips: 47.2694 samples/sec | ETA 06:13:27 2022-08-22 20:48:21 [INFO] [TRAIN] epoch: 22, iter: 27650/160000, loss: 0.8133, lr: 0.001002, batch_cost: 0.1893, reader_cost: 0.00111, ips: 42.2684 samples/sec | ETA 06:57:29 2022-08-22 20:48:31 [INFO] [TRAIN] epoch: 22, iter: 27700/160000, loss: 0.8221, lr: 0.001002, batch_cost: 0.2092, reader_cost: 0.00075, ips: 38.2334 samples/sec | ETA 07:41:22 2022-08-22 20:48:41 [INFO] [TRAIN] epoch: 22, iter: 27750/160000, loss: 0.8948, lr: 0.001001, batch_cost: 0.1974, reader_cost: 0.00063, ips: 40.5246 samples/sec | ETA 07:15:07 2022-08-22 20:48:55 [INFO] [TRAIN] epoch: 23, iter: 27800/160000, loss: 0.7536, lr: 0.001001, batch_cost: 0.2794, reader_cost: 0.06317, ips: 28.6314 samples/sec | ETA 10:15:38 2022-08-22 20:49:09 [INFO] [TRAIN] epoch: 23, iter: 27850/160000, loss: 0.8329, lr: 0.001001, batch_cost: 0.2717, reader_cost: 0.00093, ips: 29.4444 samples/sec | ETA 09:58:24 2022-08-22 20:49:20 [INFO] [TRAIN] epoch: 23, iter: 27900/160000, loss: 0.7848, lr: 0.001000, batch_cost: 0.2294, reader_cost: 0.00634, ips: 34.8733 samples/sec | ETA 08:25:03 2022-08-22 20:49:32 [INFO] [TRAIN] epoch: 23, iter: 27950/160000, loss: 0.8002, lr: 0.001000, batch_cost: 0.2403, reader_cost: 0.00065, ips: 33.2933 samples/sec | ETA 08:48:50 2022-08-22 20:49:44 [INFO] [TRAIN] epoch: 23, iter: 28000/160000, loss: 
0.7982, lr: 0.000999, batch_cost: 0.2401, reader_cost: 0.00079, ips: 33.3225 samples/sec | ETA 08:48:10 2022-08-22 20:49:44 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 206s - batch_cost: 0.2057 - reader cost: 7.2066e-04 2022-08-22 20:53:10 [INFO] [EVAL] #Images: 2000 mIoU: 0.3042 Acc: 0.7425 Kappa: 0.7232 Dice: 0.4288 2022-08-22 20:53:10 [INFO] [EVAL] Class IoU: [0.6621 0.7614 0.9265 0.7047 0.6714 0.736 0.7471 0.7451 0.4852 0.6212 0.4494 0.5182 0.6742 0.2674 0.1829 0.3839 0.4592 0.4023 0.5755 0.3516 0.706 0.3947 0.5931 0.4607 0.3148 0.3512 0.4258 0.3763 0.3587 0.2623 0.1409 0.4715 0.2561 0.3106 0.246 0.3993 0.3819 0.4042 0.2582 0.3309 0.1713 0.1016 0.3469 0.2259 0.3155 0.2209 0.2792 0.3943 0.5013 0.4619 0.4264 0.2246 0.0735 0.1159 0.6731 0.4971 0.7548 0.3002 0.4009 0.2469 0.1398 0.3444 0.2794 0.1261 0.3538 0.4163 0.2699 0.3863 0.131 0.29 0.4063 0.4029 0.3387 0.1902 0.4046 0.2676 0.473 0.2633 0.1613 0.2924 0.5674 0.3543 0.2788 0.026 0.3277 0.5078 0.0442 0.042 0.2361 0.4078 0.3363 0.0601 0.2171 0.0632 0.0124 0.0083 0.0243 0.0653 0.2819 0.3265 0.1038 0.0134 0.0403 0.6726 0.1337 0.3675 0.1699 0.4518 0.1312 0.2293 0.049 0.0722 0.1281 0.4778 0.6246 0.009 0.4029 0.5803 0.0319 0.1673 0.4768 0. 0.2363 0.022 0.1927 0.1485 0.3812 0.4098 0. 0.1655 0.594 0.0001 0.2607 0.2401 0.1383 0.125 0.093 0.0019 0.0863 0.3364 0.3893 0.0197 0.2255 0.0067 0.0818 0. 0.3333 0.0007 0.1086 0.0761] 2022-08-22 20:53:10 [INFO] [EVAL] Class Precision: [0.7881 0.8307 0.9689 0.8035 0.7584 0.8801 0.8564 0.806 0.5749 0.7723 0.6846 0.6615 0.7653 0.5196 0.4799 0.5189 0.5424 0.6711 0.7303 0.6068 0.7777 0.6627 0.7172 0.5414 0.4953 0.5564 0.5421 0.7263 0.583 0.3286 0.3775 0.6462 0.488 0.4608 0.3853 0.5051 0.5641 0.7091 0.4475 0.5604 0.2377 0.271 0.6255 0.424 0.5087 0.3991 0.7452 0.6712 0.6469 0.5235 0.6168 0.2695 0.1961 0.608 0.6908 0.7141 0.7852 0.6726 0.5145 0.3569 0.1881 0.4625 0.4523 0.7721 0.4152 0.4344 0.356 0.5639 0.4545 0.451 0.6888 0.4792 0.5765 0.2036 0.6586 0.3663 0.5612 0.5691 0.6082 0.4463 0.6672 0.5651 0.7683 0.1532 0.4975 0.745 0.3058 0.3428 0.3656 0.6579 0.4322 0.075 0.4082 0.2675 0.0459 0.0255 0.1387 0.2633 0.4309 0.5701 0.2758 0.0178 0.3672 0.9082 0.5293 0.4252 0.4798 0.6436 0.1989 0.3433 0.1765 0.0751 0.3922 0.7248 0.6308 0.1909 0.674 0.6183 0.0912 0.3877 0.6924 0. 0.7235 0.7473 0.6588 0.679 0.8888 0.5785 0. 0.6246 0.7051 0.0023 0.873 0.3316 0.4394 0.3395 0.2387 0.1324 0.3521 0.483 0.5793 0.0481 0.7462 0.1221 0.6061 0. 0.8368 0.776 0.5786 0.813 ] 2022-08-22 20:53:10 [INFO] [EVAL] Class Recall: [0.8055 0.9013 0.9549 0.8514 0.8541 0.818 0.8541 0.908 0.7567 0.7604 0.5667 0.7052 0.85 0.3552 0.2281 0.5961 0.7496 0.501 0.7308 0.4554 0.8845 0.4939 0.7742 0.7554 0.4635 0.4878 0.665 0.4385 0.4825 0.5654 0.1836 0.6355 0.3501 0.4879 0.405 0.656 0.5419 0.4845 0.379 0.4469 0.3803 0.1398 0.4378 0.326 0.4538 0.3309 0.3086 0.4886 0.6901 0.797 0.5801 0.574 0.1052 0.1253 0.9633 0.6206 0.9512 0.3515 0.6447 0.4448 0.3522 0.5742 0.4223 0.131 0.7051 0.9092 0.5276 0.5509 0.1554 0.4483 0.4976 0.7168 0.4509 0.7433 0.512 0.4984 0.7506 0.3288 0.18 0.4588 0.7913 0.4871 0.3044 0.0304 0.4898 0.6146 0.0492 0.0457 0.3999 0.5175 0.6024 0.2326 0.3169 0.0764 0.0167 0.0122 0.0286 0.0799 0.4491 0.4331 0.1426 0.0518 0.0433 0.7216 0.1517 0.7302 0.2083 0.6025 0.2781 0.4083 0.0635 0.6512 0.1599 0.5838 0.9847 0.0094 0.5003 0.904 0.0468 0.2274 0.6049 0. 0.2597 0.0222 0.214 0.1597 0.4003 0.5843 0. 0.1838 0.7903 0.0001 0.271 0.4651 0.1679 0.1652 0.1321 0.0019 0.1026 0.5258 0.5429 0.0323 0.2443 0.007 0.0864 0. 
0.3565 0.0007 0.118 0.0774] 2022-08-22 20:53:10 [INFO] [EVAL] The model with the best validation mIoU (0.3120) was saved at iter 27000. 2022-08-22 20:53:21 [INFO] [TRAIN] epoch: 23, iter: 28050/160000, loss: 0.7996, lr: 0.000999, batch_cost: 0.2091, reader_cost: 0.00352, ips: 38.2568 samples/sec | ETA 07:39:52 2022-08-22 20:53:30 [INFO] [TRAIN] epoch: 23, iter: 28100/160000, loss: 0.7605, lr: 0.000999, batch_cost: 0.1856, reader_cost: 0.00086, ips: 43.0970 samples/sec | ETA 06:48:04 2022-08-22 20:53:39 [INFO] [TRAIN] epoch: 23, iter: 28150/160000, loss: 0.7190, lr: 0.000998, batch_cost: 0.1773, reader_cost: 0.00093, ips: 45.1088 samples/sec | ETA 06:29:43 2022-08-22 20:53:48 [INFO] [TRAIN] epoch: 23, iter: 28200/160000, loss: 0.7851, lr: 0.000998, batch_cost: 0.1766, reader_cost: 0.00050, ips: 45.3098 samples/sec | ETA 06:27:50 2022-08-22 20:53:56 [INFO] [TRAIN] epoch: 23, iter: 28250/160000, loss: 0.8364, lr: 0.000997, batch_cost: 0.1631, reader_cost: 0.00177, ips: 49.0500 samples/sec | ETA 05:58:08 2022-08-22 20:54:04 [INFO] [TRAIN] epoch: 23, iter: 28300/160000, loss: 0.7749, lr: 0.000997, batch_cost: 0.1594, reader_cost: 0.00049, ips: 50.1959 samples/sec | ETA 05:49:49 2022-08-22 20:54:14 [INFO] [TRAIN] epoch: 23, iter: 28350/160000, loss: 0.8101, lr: 0.000997, batch_cost: 0.1984, reader_cost: 0.00051, ips: 40.3237 samples/sec | ETA 07:15:18 2022-08-22 20:54:22 [INFO] [TRAIN] epoch: 23, iter: 28400/160000, loss: 0.8172, lr: 0.000996, batch_cost: 0.1677, reader_cost: 0.00056, ips: 47.7044 samples/sec | ETA 06:07:49 2022-08-22 20:54:32 [INFO] [TRAIN] epoch: 23, iter: 28450/160000, loss: 0.8131, lr: 0.000996, batch_cost: 0.1921, reader_cost: 0.00043, ips: 41.6368 samples/sec | ETA 07:01:15 2022-08-22 20:54:41 [INFO] [TRAIN] epoch: 23, iter: 28500/160000, loss: 0.7751, lr: 0.000996, batch_cost: 0.1969, reader_cost: 0.00064, ips: 40.6214 samples/sec | ETA 07:11:37 2022-08-22 20:54:52 [INFO] [TRAIN] epoch: 23, iter: 28550/160000, loss: 0.8070, lr: 0.000995, batch_cost: 0.2052, reader_cost: 0.00049, ips: 38.9841 samples/sec | ETA 07:29:35 2022-08-22 20:55:00 [INFO] [TRAIN] epoch: 23, iter: 28600/160000, loss: 0.8036, lr: 0.000995, batch_cost: 0.1740, reader_cost: 0.00094, ips: 45.9897 samples/sec | ETA 06:20:57 2022-08-22 20:55:09 [INFO] [TRAIN] epoch: 23, iter: 28650/160000, loss: 0.7995, lr: 0.000994, batch_cost: 0.1646, reader_cost: 0.00068, ips: 48.5987 samples/sec | ETA 06:00:21 2022-08-22 20:55:18 [INFO] [TRAIN] epoch: 23, iter: 28700/160000, loss: 0.8799, lr: 0.000994, batch_cost: 0.1787, reader_cost: 0.00049, ips: 44.7712 samples/sec | ETA 06:31:01 2022-08-22 20:55:27 [INFO] [TRAIN] epoch: 23, iter: 28750/160000, loss: 0.7788, lr: 0.000994, batch_cost: 0.1853, reader_cost: 0.00061, ips: 43.1715 samples/sec | ETA 06:45:21 2022-08-22 20:55:37 [INFO] [TRAIN] epoch: 23, iter: 28800/160000, loss: 0.7831, lr: 0.000993, batch_cost: 0.1957, reader_cost: 0.00066, ips: 40.8886 samples/sec | ETA 07:07:49 2022-08-22 20:55:47 [INFO] [TRAIN] epoch: 23, iter: 28850/160000, loss: 0.9206, lr: 0.000993, batch_cost: 0.1971, reader_cost: 0.00068, ips: 40.5857 samples/sec | ETA 07:10:51 2022-08-22 20:55:56 [INFO] [TRAIN] epoch: 23, iter: 28900/160000, loss: 0.8013, lr: 0.000993, batch_cost: 0.1979, reader_cost: 0.00031, ips: 40.4229 samples/sec | ETA 07:12:25 2022-08-22 20:56:06 [INFO] [TRAIN] epoch: 23, iter: 28950/160000, loss: 0.7619, lr: 0.000992, batch_cost: 0.1889, reader_cost: 0.00082, ips: 42.3534 samples/sec | ETA 06:52:33 2022-08-22 20:56:17 [INFO] [TRAIN] epoch: 23, iter: 29000/160000, loss: 
0.7944, lr: 0.000992, batch_cost: 0.2243, reader_cost: 0.00067, ips: 35.6695 samples/sec | ETA 08:09:40 2022-08-22 20:56:17 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 201s - batch_cost: 0.2011 - reader cost: 9.2400e-04 2022-08-22 20:59:38 [INFO] [EVAL] #Images: 2000 mIoU: 0.3148 Acc: 0.7503 Kappa: 0.7310 Dice: 0.4405 2022-08-22 20:59:38 [INFO] [EVAL] Class IoU: [0.6598 0.7636 0.9263 0.7131 0.6696 0.7213 0.7564 0.7577 0.4911 0.6452 0.468 0.5316 0.6781 0.274 0.2278 0.3999 0.5035 0.4272 0.5629 0.3821 0.7134 0.4565 0.5991 0.4517 0.3393 0.3762 0.4848 0.4326 0.3383 0.2548 0.2228 0.462 0.2655 0.3563 0.3709 0.3583 0.3887 0.4607 0.248 0.3343 0.1641 0.11 0.334 0.2324 0.32 0.1698 0.2658 0.4427 0.5029 0.4202 0.4518 0.2186 0.0541 0.1263 0.6622 0.5022 0.7971 0.3089 0.4579 0.248 0.1056 0.3858 0.3119 0.0722 0.414 0.63 0.2763 0.4267 0.1357 0.315 0.3893 0.4278 0.3939 0.2812 0.3962 0.2649 0.3778 0.2602 0.1783 0.0069 0.5799 0.3398 0.2344 0.0139 0.2978 0.519 0.0476 0.053 0.2936 0.4 0.3996 0.0755 0.235 0.0427 0.0506 0.0179 0.0176 0.0476 0.0667 0.3234 0.0473 0.0215 0.2057 0.544 0.0199 0.5399 0.0942 0.4725 0.0958 0.2326 0.0763 0.4931 0.1148 0.5162 0.6117 0.0002 0.3038 0.5536 0.0704 0.2307 0.4339 0.0001 0.2142 0.068 0.2458 0.1571 0.4323 0.3644 0.0001 0.1797 0.5438 0.0014 0.2458 0.1642 0.1511 0.1242 0.12 0.008 0.1571 0.2903 0.3633 0.001 0.3631 0.2086 0.2369 0. 0.3498 0.0206 0.0516 0.1093] 2022-08-22 20:59:38 [INFO] [EVAL] Class Precision: [0.7582 0.8434 0.9653 0.8071 0.7559 0.8667 0.8647 0.8223 0.6145 0.771 0.7144 0.653 0.7623 0.533 0.497 0.5648 0.6749 0.6779 0.6815 0.5815 0.7958 0.6029 0.74 0.6138 0.5222 0.5406 0.5798 0.6627 0.6461 0.3676 0.35 0.6641 0.5004 0.4798 0.4962 0.5193 0.6034 0.7317 0.3921 0.5978 0.2628 0.3122 0.7034 0.4377 0.4469 0.4692 0.4172 0.6568 0.6411 0.4842 0.6787 0.2778 0.217 0.8562 0.6776 0.7001 0.8595 0.6512 0.5837 0.3999 0.2352 0.4582 0.4527 0.9297 0.5331 0.7305 0.4697 0.6009 0.2448 0.5285 0.5645 0.4983 0.5768 0.4616 0.5643 0.4903 0.5067 0.5901 0.312 0.0262 0.6872 0.6749 0.8079 0.0494 0.4261 0.7483 0.4612 0.3332 0.86 0.6198 0.606 0.109 0.4571 0.27 0.2772 0.0386 0.2505 0.3819 0.663 0.65 0.2861 0.0442 0.4156 0.7059 0.303 0.5789 0.3415 0.7185 0.1807 0.3306 0.1821 0.7429 0.3121 0.7182 0.614 0.006 0.8308 0.5855 0.1595 0.4916 0.7792 0.0027 0.9413 0.8284 0.5826 0.5872 0.8668 0.5349 0.005 0.3586 0.654 0.0771 0.3867 0.5536 0.435 0.3148 0.2212 0.1057 0.5242 0.7358 0.4672 0.0031 0.5308 0.709 0.6407 0. 0.8636 0.3254 0.8003 0.8606] 2022-08-22 20:59:38 [INFO] [EVAL] Class Recall: [0.8356 0.8898 0.9583 0.8596 0.8544 0.8113 0.8579 0.9061 0.7099 0.7982 0.5758 0.7409 0.8599 0.3605 0.296 0.5779 0.6646 0.536 0.7637 0.527 0.8732 0.6529 0.7588 0.6311 0.492 0.553 0.7473 0.5548 0.4152 0.4536 0.3799 0.603 0.3613 0.5807 0.5949 0.5362 0.522 0.5543 0.403 0.4313 0.304 0.1452 0.3888 0.3313 0.53 0.2101 0.4228 0.576 0.6999 0.7608 0.5747 0.5067 0.0672 0.129 0.9668 0.6398 0.9165 0.3701 0.6799 0.395 0.1608 0.7095 0.5008 0.0726 0.6496 0.8208 0.4015 0.5955 0.2333 0.4382 0.5565 0.7516 0.5539 0.4184 0.5709 0.3656 0.5976 0.3175 0.2937 0.0094 0.7879 0.4063 0.2482 0.0189 0.4973 0.6288 0.0504 0.0593 0.3083 0.53 0.5398 0.1972 0.3259 0.0483 0.0583 0.0323 0.0186 0.0515 0.069 0.3916 0.0536 0.0401 0.2894 0.7034 0.0208 0.8893 0.115 0.5798 0.1693 0.4397 0.116 0.5946 0.1536 0.6473 0.9939 0.0002 0.3239 0.9105 0.1119 0.3029 0.4947 0.0001 0.2171 0.069 0.2983 0.1767 0.4631 0.5334 0.0001 0.2649 0.7635 0.0014 0.4028 0.1892 0.188 0.1702 0.2078 0.0086 0.1832 0.324 0.6203 0.0014 0.5348 0.2281 0.2731 0. 
0.3702 0.0216 0.0523 0.1112] 2022-08-22 20:59:39 [INFO] [EVAL] The model with the best validation mIoU (0.3148) was saved at iter 29000. 2022-08-22 20:59:52 [INFO] [TRAIN] epoch: 24, iter: 29050/160000, loss: 0.8106, lr: 0.000991, batch_cost: 0.2587, reader_cost: 0.05260, ips: 30.9205 samples/sec | ETA 09:24:40 2022-08-22 21:00:04 [INFO] [TRAIN] epoch: 24, iter: 29100/160000, loss: 0.7859, lr: 0.000991, batch_cost: 0.2444, reader_cost: 0.04188, ips: 32.7359 samples/sec | ETA 08:53:09 2022-08-22 21:00:14 [INFO] [TRAIN] epoch: 24, iter: 29150/160000, loss: 0.7919, lr: 0.000991, batch_cost: 0.2067, reader_cost: 0.00054, ips: 38.6953 samples/sec | ETA 07:30:52 2022-08-22 21:00:25 [INFO] [TRAIN] epoch: 24, iter: 29200/160000, loss: 0.7873, lr: 0.000990, batch_cost: 0.2185, reader_cost: 0.00102, ips: 36.6168 samples/sec | ETA 07:56:17 2022-08-22 21:00:35 [INFO] [TRAIN] epoch: 24, iter: 29250/160000, loss: 0.7745, lr: 0.000990, batch_cost: 0.2062, reader_cost: 0.00089, ips: 38.8040 samples/sec | ETA 07:29:15 2022-08-22 21:00:46 [INFO] [TRAIN] epoch: 24, iter: 29300/160000, loss: 0.7587, lr: 0.000990, batch_cost: 0.2129, reader_cost: 0.00109, ips: 37.5755 samples/sec | ETA 07:43:46 2022-08-22 21:00:56 [INFO] [TRAIN] epoch: 24, iter: 29350/160000, loss: 0.8006, lr: 0.000989, batch_cost: 0.1979, reader_cost: 0.00076, ips: 40.4335 samples/sec | ETA 07:10:49 2022-08-22 21:01:06 [INFO] [TRAIN] epoch: 24, iter: 29400/160000, loss: 0.7910, lr: 0.000989, batch_cost: 0.2008, reader_cost: 0.00055, ips: 39.8327 samples/sec | ETA 07:17:09 2022-08-22 21:01:16 [INFO] [TRAIN] epoch: 24, iter: 29450/160000, loss: 0.7915, lr: 0.000988, batch_cost: 0.1962, reader_cost: 0.00048, ips: 40.7774 samples/sec | ETA 07:06:52 2022-08-22 21:01:25 [INFO] [TRAIN] epoch: 24, iter: 29500/160000, loss: 0.7347, lr: 0.000988, batch_cost: 0.1798, reader_cost: 0.00057, ips: 44.4974 samples/sec | ETA 06:31:02 2022-08-22 21:01:35 [INFO] [TRAIN] epoch: 24, iter: 29550/160000, loss: 0.8219, lr: 0.000988, batch_cost: 0.2041, reader_cost: 0.00040, ips: 39.1898 samples/sec | ETA 07:23:49 2022-08-22 21:01:44 [INFO] [TRAIN] epoch: 24, iter: 29600/160000, loss: 0.7932, lr: 0.000987, batch_cost: 0.1739, reader_cost: 0.00067, ips: 45.9978 samples/sec | ETA 06:17:59 2022-08-22 21:01:53 [INFO] [TRAIN] epoch: 24, iter: 29650/160000, loss: 0.7839, lr: 0.000987, batch_cost: 0.1921, reader_cost: 0.00058, ips: 41.6541 samples/sec | ETA 06:57:14 2022-08-22 21:02:03 [INFO] [TRAIN] epoch: 24, iter: 29700/160000, loss: 0.8902, lr: 0.000987, batch_cost: 0.1972, reader_cost: 0.00102, ips: 40.5722 samples/sec | ETA 07:08:12 2022-08-22 21:02:12 [INFO] [TRAIN] epoch: 24, iter: 29750/160000, loss: 0.8351, lr: 0.000986, batch_cost: 0.1820, reader_cost: 0.00062, ips: 43.9488 samples/sec | ETA 06:35:09 2022-08-22 21:02:21 [INFO] [TRAIN] epoch: 24, iter: 29800/160000, loss: 0.7961, lr: 0.000986, batch_cost: 0.1832, reader_cost: 0.00084, ips: 43.6626 samples/sec | ETA 06:37:35 2022-08-22 21:02:32 [INFO] [TRAIN] epoch: 24, iter: 29850/160000, loss: 0.8342, lr: 0.000985, batch_cost: 0.2070, reader_cost: 0.00054, ips: 38.6454 samples/sec | ETA 07:29:02 2022-08-22 21:02:45 [INFO] [TRAIN] epoch: 24, iter: 29900/160000, loss: 0.7928, lr: 0.000985, batch_cost: 0.2579, reader_cost: 0.00046, ips: 31.0202 samples/sec | ETA 09:19:12 2022-08-22 21:02:55 [INFO] [TRAIN] epoch: 24, iter: 29950/160000, loss: 0.8322, lr: 0.000985, batch_cost: 0.2138, reader_cost: 0.00103, ips: 37.4113 samples/sec | ETA 07:43:29 2022-08-22 21:03:06 [INFO] [TRAIN] epoch: 24, iter: 30000/160000, loss: 
0.7837, lr: 0.000984, batch_cost: 0.2099, reader_cost: 0.00074, ips: 38.1142 samples/sec | ETA 07:34:46 2022-08-22 21:03:06 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 183s - batch_cost: 0.1833 - reader cost: 5.8172e-04 2022-08-22 21:06:09 [INFO] [EVAL] #Images: 2000 mIoU: 0.3142 Acc: 0.7503 Kappa: 0.7313 Dice: 0.4388 2022-08-22 21:06:09 [INFO] [EVAL] Class IoU: [0.6642 0.7569 0.926 0.7121 0.6625 0.7367 0.753 0.7563 0.4903 0.6465 0.4764 0.5221 0.6822 0.335 0.2403 0.3826 0.5222 0.3623 0.5729 0.3866 0.7231 0.4494 0.5872 0.458 0.3421 0.4114 0.4489 0.332 0.3522 0.2686 0.2295 0.4495 0.2454 0.3096 0.35 0.3578 0.3892 0.4812 0.2463 0.3602 0.1898 0.1346 0.337 0.2191 0.334 0.2272 0.3696 0.433 0.5523 0.5264 0.4638 0.2301 0.0936 0.2318 0.7469 0.508 0.829 0.2621 0.3068 0.274 0.1042 0.4544 0.2903 0.1261 0.4046 0.5976 0.2596 0.4192 0.0139 0.3136 0.4067 0.4477 0.4009 0.2148 0.4155 0.2791 0.3322 0.2696 0.1666 0.0743 0.5904 0.3642 0.2616 0.0109 0.275 0.5221 0.0788 0.0266 0.3215 0.4324 0.3471 0.051 0.2021 0.062 0.0114 0.0069 0.0203 0.0263 0.2273 0.3491 0.0301 0.0176 0.1399 0.4139 0.0108 0.3604 0.1045 0.5926 0.1138 0.0954 0.0317 0.4491 0.1179 0.5655 0.6148 0.0001 0.3653 0.6732 0.1173 0.3052 0.399 0. 0.2443 0.0808 0.2222 0.1966 0.4675 0.3846 0. 0.1555 0.468 0. 0.2613 0.2185 0.0916 0.1204 0.1171 0.0162 0.1124 0.3376 0.2334 0. 0.2511 0.0071 0.2697 0.0008 0.2814 0.005 0.1011 0.1409] 2022-08-22 21:06:09 [INFO] [EVAL] Class Precision: [0.7756 0.8487 0.9651 0.803 0.7201 0.8478 0.8558 0.8256 0.6157 0.7748 0.6638 0.6769 0.7698 0.4871 0.4762 0.5551 0.6953 0.7108 0.7479 0.5252 0.8227 0.6397 0.7821 0.636 0.4805 0.5268 0.5587 0.8169 0.6619 0.4037 0.3862 0.5747 0.4394 0.5183 0.5199 0.5243 0.6134 0.6402 0.3464 0.5144 0.3084 0.2518 0.6391 0.5509 0.4701 0.4921 0.6897 0.6153 0.5915 0.6335 0.657 0.3148 0.2848 0.7939 0.8327 0.7168 0.8912 0.7198 0.4498 0.4939 0.2075 0.5909 0.3876 0.7567 0.4961 0.7044 0.3674 0.6998 0.761 0.7283 0.6411 0.5795 0.6108 0.325 0.6725 0.4236 0.5786 0.6462 0.6191 0.2166 0.7068 0.5926 0.7515 0.0493 0.5782 0.7309 0.311 0.476 0.8449 0.6643 0.4183 0.0745 0.3892 0.2712 0.0427 0.043 0.2843 0.351 0.5878 0.6483 0.2592 0.0286 0.477 0.6619 0.1935 0.9927 0.2653 0.808 0.1834 0.1903 0.174 0.6626 0.5616 0.6366 0.6202 0.0028 0.6594 0.7441 0.2621 0.5093 0.6732 0.0008 0.8412 0.6818 0.6925 0.5356 0.905 0.5174 0.0004 0.5612 0.4924 0.0008 0.7774 0.6975 0.4732 0.3018 0.2379 0.0548 0.3472 0.5383 0.3546 0. 0.7425 0.7579 0.5568 0.0009 0.9738 0.2346 0.4847 0.6677] 2022-08-22 21:06:09 [INFO] [EVAL] Class Recall: [0.8223 0.875 0.9581 0.8627 0.8923 0.849 0.8625 0.9 0.7065 0.7961 0.6279 0.6953 0.857 0.5175 0.3266 0.5517 0.6772 0.4249 0.71 0.5943 0.8566 0.6017 0.702 0.6207 0.5427 0.6525 0.6955 0.3587 0.4295 0.4453 0.3614 0.6736 0.3572 0.4347 0.5172 0.5299 0.5156 0.6595 0.4601 0.5457 0.3304 0.2242 0.4162 0.2668 0.5355 0.2967 0.4434 0.5937 0.8931 0.7569 0.612 0.4609 0.1223 0.2466 0.8787 0.6355 0.9224 0.2919 0.4911 0.3809 0.173 0.663 0.5363 0.1315 0.6869 0.7976 0.4695 0.5111 0.014 0.3552 0.5267 0.663 0.5385 0.3878 0.521 0.45 0.4382 0.3163 0.1856 0.1016 0.7819 0.4859 0.2863 0.0137 0.3441 0.6463 0.0955 0.0274 0.3417 0.5533 0.671 0.1394 0.2959 0.0743 0.0153 0.0082 0.0214 0.0276 0.2704 0.4306 0.0329 0.0437 0.1652 0.5249 0.0113 0.3614 0.1471 0.6897 0.2307 0.1607 0.0373 0.5823 0.1298 0.8351 0.986 0.0001 0.4503 0.876 0.1752 0.4324 0.4949 0. 0.2562 0.0839 0.2465 0.2371 0.4917 0.5998 0. 0.177 0.9045 0. 
0.2825 0.2414 0.102 0.167 0.1874 0.0225 0.1425 0.4752 0.4058 0.0001 0.2751 0.0071 0.3434 0.0215 0.2835 0.0051 0.1132 0.1515] 2022-08-22 21:06:09 [INFO] [EVAL] The model with the best validation mIoU (0.3148) was saved at iter 29000. 2022-08-22 21:06:19 [INFO] [TRAIN] epoch: 24, iter: 30050/160000, loss: 0.8392, lr: 0.000984, batch_cost: 0.1820, reader_cost: 0.00430, ips: 43.9662 samples/sec | ETA 06:34:05 2022-08-22 21:06:29 [INFO] [TRAIN] epoch: 24, iter: 30100/160000, loss: 0.8729, lr: 0.000983, batch_cost: 0.2103, reader_cost: 0.00075, ips: 38.0374 samples/sec | ETA 07:35:20 2022-08-22 21:06:38 [INFO] [TRAIN] epoch: 24, iter: 30150/160000, loss: 0.7341, lr: 0.000983, batch_cost: 0.1872, reader_cost: 0.00106, ips: 42.7317 samples/sec | ETA 06:45:09 2022-08-22 21:06:48 [INFO] [TRAIN] epoch: 24, iter: 30200/160000, loss: 0.8568, lr: 0.000983, batch_cost: 0.1959, reader_cost: 0.00086, ips: 40.8349 samples/sec | ETA 07:03:49 2022-08-22 21:06:58 [INFO] [TRAIN] epoch: 24, iter: 30250/160000, loss: 0.8869, lr: 0.000982, batch_cost: 0.1848, reader_cost: 0.00031, ips: 43.2869 samples/sec | ETA 06:39:39 2022-08-22 21:07:06 [INFO] [TRAIN] epoch: 24, iter: 30300/160000, loss: 0.8325, lr: 0.000982, batch_cost: 0.1683, reader_cost: 0.00062, ips: 47.5332 samples/sec | ETA 06:03:48 2022-08-22 21:07:19 [INFO] [TRAIN] epoch: 25, iter: 30350/160000, loss: 0.7539, lr: 0.000982, batch_cost: 0.2705, reader_cost: 0.07227, ips: 29.5802 samples/sec | ETA 09:44:23 2022-08-22 21:07:28 [INFO] [TRAIN] epoch: 25, iter: 30400/160000, loss: 0.7701, lr: 0.000981, batch_cost: 0.1754, reader_cost: 0.00054, ips: 45.6073 samples/sec | ETA 06:18:53 2022-08-22 21:07:39 [INFO] [TRAIN] epoch: 25, iter: 30450/160000, loss: 0.8123, lr: 0.000981, batch_cost: 0.2103, reader_cost: 0.00061, ips: 38.0486 samples/sec | ETA 07:33:58 2022-08-22 21:07:49 [INFO] [TRAIN] epoch: 25, iter: 30500/160000, loss: 0.8561, lr: 0.000980, batch_cost: 0.2060, reader_cost: 0.00111, ips: 38.8443 samples/sec | ETA 07:24:30 2022-08-22 21:07:58 [INFO] [TRAIN] epoch: 25, iter: 30550/160000, loss: 0.7864, lr: 0.000980, batch_cost: 0.1884, reader_cost: 0.00281, ips: 42.4555 samples/sec | ETA 06:46:32 2022-08-22 21:08:08 [INFO] [TRAIN] epoch: 25, iter: 30600/160000, loss: 0.8230, lr: 0.000980, batch_cost: 0.1939, reader_cost: 0.00045, ips: 41.2560 samples/sec | ETA 06:58:12 2022-08-22 21:08:17 [INFO] [TRAIN] epoch: 25, iter: 30650/160000, loss: 0.8183, lr: 0.000979, batch_cost: 0.1827, reader_cost: 0.00040, ips: 43.7835 samples/sec | ETA 06:33:54 2022-08-22 21:08:28 [INFO] [TRAIN] epoch: 25, iter: 30700/160000, loss: 0.7867, lr: 0.000979, batch_cost: 0.2098, reader_cost: 0.00087, ips: 38.1267 samples/sec | ETA 07:32:10 2022-08-22 21:08:38 [INFO] [TRAIN] epoch: 25, iter: 30750/160000, loss: 0.7565, lr: 0.000979, batch_cost: 0.1992, reader_cost: 0.00073, ips: 40.1607 samples/sec | ETA 07:09:06 2022-08-22 21:08:48 [INFO] [TRAIN] epoch: 25, iter: 30800/160000, loss: 0.7790, lr: 0.000978, batch_cost: 0.1978, reader_cost: 0.00062, ips: 40.4420 samples/sec | ETA 07:05:57 2022-08-22 21:08:58 [INFO] [TRAIN] epoch: 25, iter: 30850/160000, loss: 0.8706, lr: 0.000978, batch_cost: 0.2080, reader_cost: 0.00259, ips: 38.4676 samples/sec | ETA 07:27:38 2022-08-22 21:09:10 [INFO] [TRAIN] epoch: 25, iter: 30900/160000, loss: 0.8145, lr: 0.000977, batch_cost: 0.2363, reader_cost: 0.00047, ips: 33.8557 samples/sec | ETA 08:28:25 2022-08-22 21:09:22 [INFO] [TRAIN] epoch: 25, iter: 30950/160000, loss: 0.8017, lr: 0.000977, batch_cost: 0.2433, reader_cost: 0.00045, ips: 32.8752 
samples/sec | ETA 08:43:23 2022-08-22 21:09:35 [INFO] [TRAIN] epoch: 25, iter: 31000/160000, loss: 0.8090, lr: 0.000977, batch_cost: 0.2558, reader_cost: 0.00040, ips: 31.2789 samples/sec | ETA 09:09:53 2022-08-22 21:09:35 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 212s - batch_cost: 0.2116 - reader cost: 0.0012 2022-08-22 21:13:07 [INFO] [EVAL] #Images: 2000 mIoU: 0.3123 Acc: 0.7494 Kappa: 0.7300 Dice: 0.4386 2022-08-22 21:13:07 [INFO] [EVAL] Class IoU: [0.6654 0.7623 0.9268 0.7125 0.6661 0.751 0.7527 0.75 0.4938 0.6557 0.4652 0.4953 0.6813 0.293 0.2088 0.3949 0.5236 0.4245 0.5582 0.3613 0.7168 0.404 0.5939 0.4753 0.3042 0.3427 0.3842 0.3999 0.3621 0.2543 0.2052 0.4379 0.2075 0.3346 0.2931 0.3175 0.3906 0.3883 0.254 0.3285 0.1852 0.0963 0.3131 0.212 0.2841 0.1687 0.2788 0.4218 0.5797 0.5102 0.4795 0.2469 0.0996 0.2266 0.7125 0.3962 0.8081 0.3177 0.4756 0.2861 0.0695 0.3902 0.2866 0.1575 0.4228 0.618 0.269 0.4242 0.099 0.3095 0.3076 0.4324 0.3487 0.2285 0.4001 0.2719 0.2846 0.278 0.1825 0.1261 0.4856 0.3497 0.3374 0.0229 0.3734 0.5007 0.056 0.0683 0.2716 0.3728 0.4213 0.0485 0.2213 0.058 0.0089 0.003 0.0607 0.025 0.2917 0.3078 0.0294 0.0254 0.2132 0.6683 0.021 0.5861 0.1513 0.454 0.0689 0.1058 0.0764 0.4357 0.116 0.685 0.5858 0.0019 0.365 0.5512 0.1511 0.1804 0.2781 0.0004 0.1959 0.1804 0.1938 0.1782 0.3967 0.3567 0. 0.163 0.4241 0. 0.259 0.2214 0.0971 0.112 0.1096 0.0014 0.101 0.3292 0.3467 0.0177 0.2024 0.2235 0.2421 0. 0.322 0.0106 0.0767 0.1014] 2022-08-22 21:13:07 [INFO] [EVAL] Class Precision: [0.7643 0.8368 0.967 0.8082 0.7569 0.8592 0.8269 0.842 0.5961 0.7889 0.6977 0.6987 0.774 0.5119 0.4832 0.5938 0.6983 0.7063 0.7071 0.586 0.793 0.5986 0.7174 0.6295 0.4792 0.4966 0.4605 0.6981 0.6275 0.4073 0.401 0.5593 0.4097 0.4575 0.4761 0.6037 0.5993 0.6559 0.5566 0.5799 0.2644 0.253 0.5016 0.6079 0.3519 0.4021 0.3756 0.5537 0.6373 0.6228 0.6165 0.3135 0.3076 0.7597 0.744 0.4826 0.8403 0.6471 0.749 0.4824 0.224 0.4813 0.4271 0.8837 0.5991 0.717 0.4477 0.6033 0.4994 0.6977 0.7112 0.6546 0.4791 0.3042 0.6851 0.3414 0.4636 0.5164 0.4069 0.1823 0.5552 0.5482 0.6708 0.063 0.455 0.6439 0.6368 0.2863 0.6659 0.4418 0.5799 0.0605 0.3266 0.2678 0.0468 0.0169 0.2057 0.2826 0.5115 0.5894 0.2956 0.0972 0.4591 0.7194 0.4414 0.6465 0.4547 0.6453 0.1645 0.2393 0.2113 0.8163 0.2799 0.7406 0.5866 0.0378 0.8064 0.5608 0.1856 0.4201 0.7292 0.0046 0.4511 0.598 0.6315 0.5898 0.7459 0.5085 0.0002 0.2444 0.6208 0. 0.5229 0.6351 0.6039 0.2806 0.2845 0.0195 0.3526 0.7329 0.4503 0.0251 0.8035 0.5583 0.6034 0. 0.899 0.3314 0.6524 0.673 ] 2022-08-22 21:13:07 [INFO] [EVAL] Class Recall: [0.8372 0.8955 0.957 0.8576 0.8474 0.8564 0.8935 0.8729 0.7422 0.7952 0.5827 0.6298 0.8505 0.4067 0.2689 0.5411 0.6767 0.5155 0.7261 0.4852 0.8819 0.5541 0.7753 0.6599 0.4545 0.5251 0.6988 0.4835 0.4612 0.4039 0.296 0.6685 0.2959 0.5547 0.4326 0.401 0.5286 0.4877 0.3184 0.4311 0.382 0.1345 0.4545 0.2455 0.5956 0.2252 0.5196 0.6391 0.8651 0.7384 0.6834 0.5374 0.1284 0.2441 0.9439 0.6886 0.9547 0.3844 0.5657 0.4128 0.0916 0.6736 0.4657 0.1609 0.5895 0.8174 0.4026 0.5884 0.11 0.3575 0.3515 0.5601 0.5617 0.4789 0.4903 0.5717 0.4242 0.3758 0.2487 0.29 0.7949 0.4912 0.4043 0.0348 0.6757 0.6924 0.0578 0.0823 0.3145 0.7047 0.6065 0.196 0.407 0.0689 0.0108 0.0037 0.0794 0.0267 0.4042 0.3918 0.0316 0.0332 0.2847 0.904 0.0216 0.8626 0.1848 0.6049 0.1059 0.1593 0.1069 0.483 0.1654 0.9012 0.9975 0.0019 0.4 0.9697 0.4485 0.2402 0.3102 0.0004 0.2573 0.2053 0.2185 0.2033 0.4587 0.5445 0. 0.3285 0.5723 0. 
0.3392 0.2537 0.1038 0.1571 0.1513 0.0015 0.124 0.374 0.6012 0.056 0.213 0.2716 0.2879 0. 0.3341 0.0108 0.08 0.1066] 2022-08-22 21:13:07 [INFO] [EVAL] The model with the best validation mIoU (0.3148) was saved at iter 29000. 2022-08-22 21:13:16 [INFO] [TRAIN] epoch: 25, iter: 31050/160000, loss: 0.7358, lr: 0.000976, batch_cost: 0.1817, reader_cost: 0.00330, ips: 44.0305 samples/sec | ETA 06:30:29 2022-08-22 21:13:25 [INFO] [TRAIN] epoch: 25, iter: 31100/160000, loss: 0.7737, lr: 0.000976, batch_cost: 0.1786, reader_cost: 0.00075, ips: 44.8032 samples/sec | ETA 06:23:36 2022-08-22 21:13:34 [INFO] [TRAIN] epoch: 25, iter: 31150/160000, loss: 0.7494, lr: 0.000976, batch_cost: 0.1770, reader_cost: 0.00044, ips: 45.1974 samples/sec | ETA 06:20:06 2022-08-22 21:13:43 [INFO] [TRAIN] epoch: 25, iter: 31200/160000, loss: 0.8375, lr: 0.000975, batch_cost: 0.1872, reader_cost: 0.00045, ips: 42.7377 samples/sec | ETA 06:41:49 2022-08-22 21:13:53 [INFO] [TRAIN] epoch: 25, iter: 31250/160000, loss: 0.7491, lr: 0.000975, batch_cost: 0.1987, reader_cost: 0.00037, ips: 40.2569 samples/sec | ETA 07:06:25 2022-08-22 21:14:02 [INFO] [TRAIN] epoch: 25, iter: 31300/160000, loss: 0.8305, lr: 0.000974, batch_cost: 0.1844, reader_cost: 0.00068, ips: 43.3750 samples/sec | ETA 06:35:37 2022-08-22 21:14:11 [INFO] [TRAIN] epoch: 25, iter: 31350/160000, loss: 0.7803, lr: 0.000974, batch_cost: 0.1697, reader_cost: 0.00321, ips: 47.1344 samples/sec | ETA 06:03:55 2022-08-22 21:14:19 [INFO] [TRAIN] epoch: 25, iter: 31400/160000, loss: 0.7836, lr: 0.000974, batch_cost: 0.1641, reader_cost: 0.00050, ips: 48.7588 samples/sec | ETA 05:51:39 2022-08-22 21:14:27 [INFO] [TRAIN] epoch: 25, iter: 31450/160000, loss: 0.7707, lr: 0.000973, batch_cost: 0.1694, reader_cost: 0.00042, ips: 47.2324 samples/sec | ETA 06:02:53 2022-08-22 21:14:37 [INFO] [TRAIN] epoch: 25, iter: 31500/160000, loss: 0.7991, lr: 0.000973, batch_cost: 0.1885, reader_cost: 0.00079, ips: 42.4432 samples/sec | ETA 06:43:40 2022-08-22 21:14:45 [INFO] [TRAIN] epoch: 25, iter: 31550/160000, loss: 0.8619, lr: 0.000972, batch_cost: 0.1746, reader_cost: 0.00056, ips: 45.8266 samples/sec | ETA 06:13:43 2022-08-22 21:14:59 [INFO] [TRAIN] epoch: 26, iter: 31600/160000, loss: 0.8328, lr: 0.000972, batch_cost: 0.2701, reader_cost: 0.06736, ips: 29.6196 samples/sec | ETA 09:37:59 2022-08-22 21:15:08 [INFO] [TRAIN] epoch: 26, iter: 31650/160000, loss: 0.7501, lr: 0.000972, batch_cost: 0.1765, reader_cost: 0.00054, ips: 45.3322 samples/sec | ETA 06:17:30 2022-08-22 21:15:17 [INFO] [TRAIN] epoch: 26, iter: 31700/160000, loss: 0.7633, lr: 0.000971, batch_cost: 0.1909, reader_cost: 0.00045, ips: 41.9029 samples/sec | ETA 06:48:14 2022-08-22 21:15:29 [INFO] [TRAIN] epoch: 26, iter: 31750/160000, loss: 0.7574, lr: 0.000971, batch_cost: 0.2350, reader_cost: 0.00060, ips: 34.0448 samples/sec | ETA 08:22:16 2022-08-22 21:15:40 [INFO] [TRAIN] epoch: 26, iter: 31800/160000, loss: 0.7761, lr: 0.000971, batch_cost: 0.2264, reader_cost: 0.00051, ips: 35.3412 samples/sec | ETA 08:03:39 2022-08-22 21:15:53 [INFO] [TRAIN] epoch: 26, iter: 31850/160000, loss: 0.7250, lr: 0.000970, batch_cost: 0.2536, reader_cost: 0.00088, ips: 31.5412 samples/sec | ETA 09:01:43 2022-08-22 21:16:05 [INFO] [TRAIN] epoch: 26, iter: 31900/160000, loss: 0.8069, lr: 0.000970, batch_cost: 0.2312, reader_cost: 0.01274, ips: 34.5968 samples/sec | ETA 08:13:41 2022-08-22 21:16:16 [INFO] [TRAIN] epoch: 26, iter: 31950/160000, loss: 0.7094, lr: 0.000969, batch_cost: 0.2284, reader_cost: 0.00040, ips: 35.0206 samples/sec 
| ETA 08:07:31 2022-08-22 21:16:28 [INFO] [TRAIN] epoch: 26, iter: 32000/160000, loss: 0.8158, lr: 0.000969, batch_cost: 0.2333, reader_cost: 0.00232, ips: 34.2948 samples/sec | ETA 08:17:38 2022-08-22 21:16:28 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 195s - batch_cost: 0.1953 - reader cost: 0.0014 2022-08-22 21:19:43 [INFO] [EVAL] #Images: 2000 mIoU: 0.3213 Acc: 0.7503 Kappa: 0.7316 Dice: 0.4506 2022-08-22 21:19:43 [INFO] [EVAL] Class IoU: [0.6614 0.7632 0.9284 0.7063 0.6711 0.7363 0.7591 0.7382 0.4968 0.6313 0.4679 0.515 0.6707 0.216 0.2359 0.3861 0.5012 0.4422 0.5737 0.3855 0.7085 0.4614 0.5952 0.4636 0.3325 0.2525 0.4845 0.3833 0.4021 0.2699 0.196 0.4238 0.2673 0.3162 0.2893 0.4131 0.3916 0.4959 0.2801 0.3328 0.1277 0.1119 0.3293 0.2308 0.2897 0.255 0.3288 0.4625 0.6087 0.4975 0.4907 0.2589 0.1667 0.1863 0.7595 0.4486 0.7983 0.3414 0.4561 0.1946 0.1319 0.3584 0.3114 0.1811 0.4321 0.6018 0.2655 0.4232 0.0699 0.2753 0.4092 0.4635 0.3231 0.19 0.4232 0.3381 0.3207 0.2265 0.2197 0.1982 0.5213 0.3844 0.251 0.0799 0.2712 0.5117 0.0817 0.0554 0.2847 0.4439 0.3999 0.0549 0.2194 0.0416 0.0317 0.0094 0.0234 0.0902 0.2631 0.3489 0.105 0.0321 0.1089 0.1649 0.1287 0.5752 0.0755 0.5112 0.0899 0.2877 0.0914 0.4236 0.1178 0.6514 0.6253 0.0014 0.323 0.5295 0.1349 0.3264 0.4753 0.0001 0.2201 0.1114 0.2674 0.1775 0.3841 0.3863 0.0013 0.2424 0.4923 0.0062 0.2686 0.2051 0.1018 0.1294 0.1118 0.0009 0.1518 0.283 0.2834 0. 0.359 0.2924 0.2756 0. 0.356 0.016 0.1058 0.0733] 2022-08-22 21:19:43 [INFO] [EVAL] Class Precision: [0.7909 0.8398 0.9636 0.8129 0.7443 0.7887 0.8423 0.7934 0.6682 0.736 0.6703 0.705 0.7431 0.5292 0.4867 0.5646 0.6285 0.6908 0.7156 0.55 0.783 0.6531 0.7385 0.6196 0.5045 0.4239 0.5784 0.7288 0.5953 0.3516 0.3831 0.5841 0.5207 0.4067 0.4121 0.5829 0.5454 0.7162 0.495 0.5527 0.248 0.3249 0.5473 0.5045 0.4046 0.5211 0.6493 0.6773 0.7004 0.6068 0.6348 0.3452 0.309 0.558 0.7991 0.594 0.827 0.5975 0.6789 0.354 0.2315 0.491 0.4487 0.756 0.5882 0.6722 0.3819 0.6117 0.3335 0.4852 0.5853 0.6904 0.6332 0.3415 0.6523 0.5161 0.7307 0.5615 0.6269 0.3082 0.6117 0.5413 0.7676 0.2321 0.5947 0.6171 0.4161 0.3032 0.5578 0.7557 0.5864 0.088 0.3584 0.2473 0.1523 0.0313 0.1381 0.3182 0.4164 0.5231 0.4347 0.0399 0.4604 0.5902 0.9481 0.6719 0.2524 0.7908 0.1942 0.3979 0.2382 0.5331 0.2961 0.6734 0.6305 0.0389 0.8596 0.5667 0.227 0.5618 0.6257 0.019 0.7013 0.7158 0.6416 0.6762 0.572 0.6194 0.0122 0.3879 0.5298 0.0926 0.7375 0.8043 0.6084 0.2455 0.2739 0.0136 0.4877 0.755 0.4448 0. 0.6626 0.5924 0.5309 0. 
0.7797 0.4741 0.3113 0.9112] 2022-08-22 21:19:43 [INFO] [EVAL] Class Recall: [0.8016 0.8932 0.9621 0.8433 0.8722 0.9173 0.8849 0.9139 0.6594 0.8161 0.6077 0.6565 0.8732 0.2674 0.3141 0.5497 0.7121 0.5513 0.7432 0.5631 0.8816 0.6112 0.7542 0.6481 0.4938 0.3845 0.7491 0.447 0.5534 0.5373 0.2863 0.607 0.3545 0.5869 0.4926 0.5864 0.5813 0.6172 0.3922 0.4554 0.2084 0.1457 0.4526 0.2985 0.505 0.333 0.3997 0.5932 0.823 0.7343 0.6838 0.5087 0.2659 0.2185 0.9388 0.6469 0.9584 0.4434 0.5816 0.3018 0.2346 0.5703 0.5044 0.1923 0.6196 0.8518 0.4656 0.5786 0.0813 0.389 0.5762 0.5851 0.3975 0.3 0.5465 0.495 0.3637 0.2752 0.2528 0.3572 0.7791 0.5702 0.2716 0.1086 0.3328 0.7498 0.0923 0.0635 0.3676 0.5182 0.557 0.1272 0.3611 0.0477 0.0384 0.0133 0.0274 0.1117 0.4168 0.5115 0.1216 0.1416 0.1249 0.1862 0.1296 0.7999 0.0972 0.5912 0.1434 0.5095 0.1292 0.6735 0.1635 0.9523 0.987 0.0015 0.3409 0.8897 0.2494 0.4379 0.6641 0.0001 0.2428 0.1166 0.3144 0.194 0.5391 0.5066 0.0015 0.3924 0.8741 0.0066 0.297 0.2158 0.109 0.2148 0.1589 0.0009 0.1806 0.3116 0.4387 0. 0.4394 0.3661 0.3643 0. 0.3959 0.0163 0.1382 0.0738] 2022-08-22 21:19:43 [INFO] [EVAL] The model with the best validation mIoU (0.3213) was saved at iter 32000. 2022-08-22 21:19:53 [INFO] [TRAIN] epoch: 26, iter: 32050/160000, loss: 0.8219, lr: 0.000969, batch_cost: 0.1937, reader_cost: 0.00436, ips: 41.3036 samples/sec | ETA 06:53:02 2022-08-22 21:20:03 [INFO] [TRAIN] epoch: 26, iter: 32100/160000, loss: 0.8353, lr: 0.000968, batch_cost: 0.2011, reader_cost: 0.00082, ips: 39.7793 samples/sec | ETA 07:08:41 2022-08-22 21:20:14 [INFO] [TRAIN] epoch: 26, iter: 32150/160000, loss: 0.7928, lr: 0.000968, batch_cost: 0.2140, reader_cost: 0.00085, ips: 37.3833 samples/sec | ETA 07:35:59 2022-08-22 21:20:23 [INFO] [TRAIN] epoch: 26, iter: 32200/160000, loss: 0.7498, lr: 0.000968, batch_cost: 0.1798, reader_cost: 0.00056, ips: 44.4911 samples/sec | ETA 06:22:59 2022-08-22 21:20:33 [INFO] [TRAIN] epoch: 26, iter: 32250/160000, loss: 0.8156, lr: 0.000967, batch_cost: 0.2021, reader_cost: 0.00106, ips: 39.5856 samples/sec | ETA 07:10:17 2022-08-22 21:20:44 [INFO] [TRAIN] epoch: 26, iter: 32300/160000, loss: 0.7386, lr: 0.000967, batch_cost: 0.2133, reader_cost: 0.00094, ips: 37.5130 samples/sec | ETA 07:33:53 2022-08-22 21:20:55 [INFO] [TRAIN] epoch: 26, iter: 32350/160000, loss: 0.8196, lr: 0.000966, batch_cost: 0.2207, reader_cost: 0.00064, ips: 36.2498 samples/sec | ETA 07:49:31 2022-08-22 21:21:05 [INFO] [TRAIN] epoch: 26, iter: 32400/160000, loss: 0.8149, lr: 0.000966, batch_cost: 0.1998, reader_cost: 0.00070, ips: 40.0331 samples/sec | ETA 07:04:58 2022-08-22 21:21:14 [INFO] [TRAIN] epoch: 26, iter: 32450/160000, loss: 0.7501, lr: 0.000966, batch_cost: 0.1938, reader_cost: 0.00070, ips: 41.2855 samples/sec | ETA 06:51:55 2022-08-22 21:21:23 [INFO] [TRAIN] epoch: 26, iter: 32500/160000, loss: 0.7804, lr: 0.000965, batch_cost: 0.1829, reader_cost: 0.00095, ips: 43.7300 samples/sec | ETA 06:28:44 2022-08-22 21:21:33 [INFO] [TRAIN] epoch: 26, iter: 32550/160000, loss: 0.7784, lr: 0.000965, batch_cost: 0.1877, reader_cost: 0.00056, ips: 42.6262 samples/sec | ETA 06:38:39 2022-08-22 21:21:41 [INFO] [TRAIN] epoch: 26, iter: 32600/160000, loss: 0.7623, lr: 0.000965, batch_cost: 0.1709, reader_cost: 0.00055, ips: 46.8068 samples/sec | ETA 06:02:54 2022-08-22 21:21:52 [INFO] [TRAIN] epoch: 26, iter: 32650/160000, loss: 0.7948, lr: 0.000964, batch_cost: 0.2115, reader_cost: 0.00055, ips: 37.8218 samples/sec | ETA 07:28:56 2022-08-22 21:22:02 [INFO] [TRAIN] epoch: 26, iter: 
32700/160000, loss: 0.8139, lr: 0.000964, batch_cost: 0.1953, reader_cost: 0.00103, ips: 40.9617 samples/sec | ETA 06:54:22 2022-08-22 21:22:13 [INFO] [TRAIN] epoch: 26, iter: 32750/160000, loss: 0.8103, lr: 0.000963, batch_cost: 0.2266, reader_cost: 0.00051, ips: 35.3019 samples/sec | ETA 08:00:36 2022-08-22 21:22:24 [INFO] [TRAIN] epoch: 26, iter: 32800/160000, loss: 0.7741, lr: 0.000963, batch_cost: 0.2111, reader_cost: 0.00075, ips: 37.9000 samples/sec | ETA 07:27:29 2022-08-22 21:22:38 [INFO] [TRAIN] epoch: 27, iter: 32850/160000, loss: 0.6767, lr: 0.000963, batch_cost: 0.2828, reader_cost: 0.04535, ips: 28.2858 samples/sec | ETA 09:59:21 2022-08-22 21:22:49 [INFO] [TRAIN] epoch: 27, iter: 32900/160000, loss: 0.7711, lr: 0.000962, batch_cost: 0.2143, reader_cost: 0.00464, ips: 37.3265 samples/sec | ETA 07:34:00 2022-08-22 21:22:59 [INFO] [TRAIN] epoch: 27, iter: 32950/160000, loss: 0.7754, lr: 0.000962, batch_cost: 0.2139, reader_cost: 0.00500, ips: 37.3933 samples/sec | ETA 07:33:01 2022-08-22 21:23:11 [INFO] [TRAIN] epoch: 27, iter: 33000/160000, loss: 0.7331, lr: 0.000962, batch_cost: 0.2291, reader_cost: 0.00041, ips: 34.9174 samples/sec | ETA 08:04:57 2022-08-22 21:23:11 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 176s - batch_cost: 0.1755 - reader cost: 0.0012 2022-08-22 21:26:06 [INFO] [EVAL] #Images: 2000 mIoU: 0.3185 Acc: 0.7509 Kappa: 0.7320 Dice: 0.4471 2022-08-22 21:26:06 [INFO] [EVAL] Class IoU: [0.6712 0.7699 0.9258 0.7125 0.6686 0.7331 0.7577 0.7572 0.4823 0.6206 0.4554 0.5175 0.6766 0.323 0.248 0.403 0.5329 0.3801 0.5796 0.3939 0.7315 0.454 0.5947 0.4552 0.2738 0.4368 0.492 0.4031 0.3595 0.2455 0.1976 0.4483 0.2775 0.3559 0.3203 0.345 0.3953 0.4665 0.2015 0.354 0.1856 0.1199 0.3295 0.2461 0.3053 0.2027 0.383 0.4471 0.5657 0.4883 0.4644 0.3333 0.1358 0.1767 0.6988 0.3392 0.7958 0.3269 0.4184 0.2475 0.068 0.386 0.2717 0.255 0.3748 0.5775 0.2339 0.3775 0.0796 0.2949 0.3621 0.5017 0.2876 0.2793 0.4079 0.3164 0.42 0.2839 0.3619 0.2208 0.6656 0.371 0.2588 0.1035 0.3099 0.5059 0.0987 0.0806 0.205 0.3501 0.3802 0.0473 0.2453 0.1026 0.004 0.0043 0.0377 0.0741 0.3046 0.3712 0.1163 0.0288 0.1879 0.0841 0.0228 0.4977 0.1716 0.5739 0.1122 0.247 0.0458 0.3745 0.1012 0.5788 0.5097 0.0018 0.4 0.593 0.0618 0.179 0.4124 0.0251 0.1958 0.0734 0.2216 0.1622 0.4466 0.4007 0. 0.141 0.545 0. 0.2575 0.1866 0.1361 0.1324 0.1158 0.0026 0.1704 0.3356 0.1771 0. 0.2722 0.2174 0.1343 0. 0.3502 0.0212 0.0963 0.143 ] 2022-08-22 21:26:06 [INFO] [EVAL] Class Precision: [0.7713 0.863 0.9695 0.811 0.7307 0.8645 0.8513 0.8643 0.7021 0.7694 0.7361 0.677 0.7749 0.475 0.3983 0.5809 0.7063 0.7273 0.7159 0.5151 0.8287 0.6134 0.7661 0.5837 0.4895 0.5144 0.5892 0.7137 0.6722 0.3983 0.4005 0.5666 0.5565 0.4858 0.5069 0.5715 0.6145 0.6845 0.304 0.5272 0.2606 0.3098 0.5598 0.5428 0.4039 0.4521 0.7731 0.5989 0.6123 0.5801 0.647 0.4386 0.3156 0.6311 0.7333 0.4154 0.8346 0.6451 0.6503 0.46 0.1447 0.5236 0.3584 0.5557 0.4261 0.6785 0.2919 0.5171 0.2696 0.5022 0.6156 0.6591 0.5439 0.3869 0.593 0.5132 0.7045 0.5395 0.5977 0.405 0.8539 0.5412 0.7749 0.1649 0.5199 0.668 0.3602 0.328 0.292 0.4861 0.4727 0.061 0.3977 0.2814 0.017 0.0229 0.2327 0.263 0.4445 0.5942 0.3397 0.0583 0.5288 0.4829 0.301 0.646 0.4049 0.8489 0.2197 0.4068 0.4786 0.4801 0.2591 0.7575 0.5111 0.0323 0.5908 0.6862 0.2485 0.4536 0.4983 0.4767 0.4327 0.8221 0.7223 0.5858 0.8775 0.6122 0. 0.8657 0.7073 0. 0.4296 0.7847 0.5887 0.2375 0.2709 0.0812 0.4948 0.6686 0.1911 0. 0.6764 0.6066 0.4482 0. 
0.8261 0.4731 0.6289 0.6566] 2022-08-22 21:26:06 [INFO] [EVAL] Class Recall: [0.838 0.8771 0.9536 0.8543 0.8872 0.8282 0.8732 0.8594 0.6064 0.7624 0.5442 0.6872 0.842 0.5024 0.3965 0.5682 0.6845 0.4433 0.7528 0.626 0.8617 0.636 0.7266 0.674 0.3833 0.7435 0.7489 0.4809 0.4359 0.3903 0.2805 0.6822 0.3563 0.5709 0.4652 0.4654 0.5256 0.5942 0.3741 0.5187 0.3918 0.1636 0.4448 0.3104 0.5558 0.2688 0.4315 0.6381 0.8814 0.7551 0.6219 0.5813 0.1924 0.197 0.937 0.6493 0.9449 0.3986 0.5399 0.3488 0.1138 0.5949 0.5291 0.3204 0.7569 0.795 0.5407 0.583 0.1016 0.4167 0.4679 0.6775 0.3791 0.501 0.5665 0.4522 0.5098 0.3748 0.4784 0.3269 0.7511 0.5413 0.2799 0.2175 0.4341 0.6758 0.1196 0.0966 0.4076 0.556 0.6602 0.1742 0.3902 0.139 0.0051 0.0052 0.0431 0.0936 0.4918 0.4973 0.1503 0.0538 0.2257 0.0924 0.024 0.6844 0.2295 0.6391 0.1865 0.3859 0.0482 0.6299 0.1423 0.7105 0.9948 0.0019 0.5533 0.8136 0.0759 0.2282 0.7054 0.0258 0.2634 0.0746 0.2422 0.1832 0.4763 0.537 0. 0.1442 0.7037 0. 0.3912 0.1967 0.1504 0.2303 0.1683 0.0027 0.2063 0.4025 0.7071 0. 0.3129 0.253 0.1609 0. 0.378 0.0217 0.1021 0.1546] 2022-08-22 21:26:07 [INFO] [EVAL] The model with the best validation mIoU (0.3213) was saved at iter 32000. 2022-08-22 21:26:17 [INFO] [TRAIN] epoch: 27, iter: 33050/160000, loss: 0.7695, lr: 0.000961, batch_cost: 0.2056, reader_cost: 0.00399, ips: 38.9014 samples/sec | ETA 07:15:07 2022-08-22 21:26:28 [INFO] [TRAIN] epoch: 27, iter: 33100/160000, loss: 0.7284, lr: 0.000961, batch_cost: 0.2231, reader_cost: 0.00165, ips: 35.8517 samples/sec | ETA 07:51:56 2022-08-22 21:26:38 [INFO] [TRAIN] epoch: 27, iter: 33150/160000, loss: 0.8116, lr: 0.000960, batch_cost: 0.1988, reader_cost: 0.00059, ips: 40.2373 samples/sec | ETA 07:00:20 2022-08-22 21:26:48 [INFO] [TRAIN] epoch: 27, iter: 33200/160000, loss: 0.8182, lr: 0.000960, batch_cost: 0.1994, reader_cost: 0.00032, ips: 40.1124 samples/sec | ETA 07:01:28 2022-08-22 21:26:57 [INFO] [TRAIN] epoch: 27, iter: 33250/160000, loss: 0.8101, lr: 0.000960, batch_cost: 0.1879, reader_cost: 0.00084, ips: 42.5776 samples/sec | ETA 06:36:55 2022-08-22 21:27:07 [INFO] [TRAIN] epoch: 27, iter: 33300/160000, loss: 0.7242, lr: 0.000959, batch_cost: 0.1884, reader_cost: 0.00128, ips: 42.4689 samples/sec | ETA 06:37:46 2022-08-22 21:27:17 [INFO] [TRAIN] epoch: 27, iter: 33350/160000, loss: 0.8243, lr: 0.000959, batch_cost: 0.2038, reader_cost: 0.00056, ips: 39.2475 samples/sec | ETA 07:10:15 2022-08-22 21:27:27 [INFO] [TRAIN] epoch: 27, iter: 33400/160000, loss: 0.7738, lr: 0.000958, batch_cost: 0.1983, reader_cost: 0.00063, ips: 40.3462 samples/sec | ETA 06:58:22 2022-08-22 21:27:37 [INFO] [TRAIN] epoch: 27, iter: 33450/160000, loss: 0.7188, lr: 0.000958, batch_cost: 0.2054, reader_cost: 0.00036, ips: 38.9487 samples/sec | ETA 07:13:13 2022-08-22 21:27:47 [INFO] [TRAIN] epoch: 27, iter: 33500/160000, loss: 0.7569, lr: 0.000958, batch_cost: 0.1940, reader_cost: 0.00072, ips: 41.2345 samples/sec | ETA 06:49:02 2022-08-22 21:27:57 [INFO] [TRAIN] epoch: 27, iter: 33550/160000, loss: 0.8096, lr: 0.000957, batch_cost: 0.2015, reader_cost: 0.00054, ips: 39.6984 samples/sec | ETA 07:04:42 2022-08-22 21:28:06 [INFO] [TRAIN] epoch: 27, iter: 33600/160000, loss: 0.7135, lr: 0.000957, batch_cost: 0.1886, reader_cost: 0.00051, ips: 42.4082 samples/sec | ETA 06:37:24 2022-08-22 21:28:16 [INFO] [TRAIN] epoch: 27, iter: 33650/160000, loss: 0.7370, lr: 0.000957, batch_cost: 0.1830, reader_cost: 0.00033, ips: 43.7173 samples/sec | ETA 06:25:21 2022-08-22 21:28:26 [INFO] [TRAIN] epoch: 27, iter: 
33700/160000, loss: 0.7608, lr: 0.000956, batch_cost: 0.2191, reader_cost: 0.00088, ips: 36.5175 samples/sec | ETA 07:41:08 2022-08-22 21:28:39 [INFO] [TRAIN] epoch: 27, iter: 33750/160000, loss: 0.8334, lr: 0.000956, batch_cost: 0.2435, reader_cost: 0.00069, ips: 32.8508 samples/sec | ETA 08:32:25 2022-08-22 21:28:50 [INFO] [TRAIN] epoch: 27, iter: 33800/160000, loss: 0.7415, lr: 0.000955, batch_cost: 0.2328, reader_cost: 0.00694, ips: 34.3589 samples/sec | ETA 08:09:43 2022-08-22 21:29:01 [INFO] [TRAIN] epoch: 27, iter: 33850/160000, loss: 0.7740, lr: 0.000955, batch_cost: 0.2211, reader_cost: 0.00095, ips: 36.1869 samples/sec | ETA 07:44:48 2022-08-22 21:29:12 [INFO] [TRAIN] epoch: 27, iter: 33900/160000, loss: 0.7977, lr: 0.000955, batch_cost: 0.2121, reader_cost: 0.01243, ips: 37.7119 samples/sec | ETA 07:25:50 2022-08-22 21:29:23 [INFO] [TRAIN] epoch: 27, iter: 33950/160000, loss: 0.8207, lr: 0.000954, batch_cost: 0.2282, reader_cost: 0.00091, ips: 35.0606 samples/sec | ETA 07:59:21 2022-08-22 21:29:36 [INFO] [TRAIN] epoch: 27, iter: 34000/160000, loss: 0.7375, lr: 0.000954, batch_cost: 0.2442, reader_cost: 0.00123, ips: 32.7577 samples/sec | ETA 08:32:51 2022-08-22 21:29:36 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 197s - batch_cost: 0.1972 - reader cost: 7.4904e-04 2022-08-22 21:32:53 [INFO] [EVAL] #Images: 2000 mIoU: 0.3134 Acc: 0.7491 Kappa: 0.7300 Dice: 0.4410 2022-08-22 21:32:53 [INFO] [EVAL] Class IoU: [0.6638 0.7574 0.9272 0.7117 0.6772 0.7375 0.7598 0.7489 0.5031 0.6073 0.4524 0.5278 0.6729 0.2959 0.1642 0.4063 0.5012 0.4551 0.5739 0.3767 0.7172 0.4547 0.605 0.4661 0.3339 0.411 0.4563 0.3849 0.3919 0.2503 0.2092 0.4566 0.2876 0.3356 0.3277 0.3516 0.4007 0.4493 0.2113 0.3407 0.1507 0.1275 0.3345 0.2405 0.3307 0.1335 0.324 0.4307 0.5926 0.4791 0.4782 0.3255 0.1445 0.1971 0.6459 0.4095 0.7408 0.3311 0.4082 0.2014 0.0803 0.2564 0.2384 0.1766 0.4172 0.6421 0.2715 0.3824 0.0789 0.2686 0.3654 0.4358 0.3916 0.2514 0.3501 0.2803 0.3177 0.2805 0.2029 0.2153 0.5569 0.3785 0.3307 0.1935 0.2871 0.5291 0.065 0.05 0.2238 0.422 0.3337 0.0221 0.2281 0.0928 0. 0.014 0.0246 0.0694 0.2482 0.3683 0.1241 0.0303 0.0982 0.1452 0.0324 0.4044 0.1487 0.4906 0.1257 0.1537 0.0983 0.4793 0.1224 0.6217 0.5443 0. 0.4335 0.6304 0.0735 0.277 0.4078 0. 0.2043 0.0701 0.3035 0.2012 0.3411 0.3871 0.0002 0.1528 0.3961 0. 0.2699 0.3091 0.0955 0.116 0.0787 0.0029 0.1071 0.3359 0.3238 0. 0.3616 0.0341 0.1964 0. 0.3432 0.0201 0.1054 0.0888] 2022-08-22 21:32:53 [INFO] [EVAL] Class Precision: [0.768 0.8401 0.9643 0.8305 0.7702 0.8816 0.8843 0.807 0.6476 0.7793 0.7151 0.6247 0.7596 0.4477 0.6039 0.5794 0.6466 0.6885 0.6669 0.5753 0.8097 0.5823 0.737 0.6279 0.46 0.471 0.5632 0.5794 0.6362 0.345 0.3864 0.6942 0.48 0.4213 0.5689 0.4849 0.6083 0.723 0.4967 0.5985 0.2916 0.3606 0.5999 0.5169 0.4633 0.3432 0.6149 0.5688 0.6942 0.5579 0.6705 0.4301 0.2755 0.6789 0.659 0.5055 0.7564 0.6556 0.4398 0.4113 0.1965 0.4898 0.5014 0.656 0.5304 0.8017 0.456 0.4774 0.3206 0.4649 0.6443 0.4877 0.6251 0.3314 0.7148 0.3759 0.4003 0.4771 0.4609 0.4676 0.6397 0.62 0.7062 0.3581 0.4419 0.7377 0.497 0.3546 0.3808 0.8395 0.4349 0.0686 0.3711 0.3299 0. 0.0514 0.1691 0.3068 0.5224 0.6779 0.3018 0.0505 0.5557 0.9521 0.4357 0.4707 0.3569 0.7934 0.2319 0.2958 0.2112 0.7514 0.4943 0.7611 0.5466 0. 0.7752 0.7303 0.1177 0.5292 0.7129 0.0002 0.3755 0.5673 0.7267 0.5433 0.7881 0.5317 0.1002 0.7128 0.4264 0. 0.3578 0.627 0.487 0.3238 0.33 0.0731 0.3713 0.6382 0.4179 0. 0.7074 0.137 0.4531 0. 
0.8065 0.3963 0.5106 0.9449] 2022-08-22 21:32:53 [INFO] [EVAL] Class Recall: [0.8303 0.8849 0.9601 0.8327 0.8486 0.8186 0.8437 0.9122 0.6928 0.7335 0.5519 0.7729 0.855 0.466 0.184 0.5763 0.6903 0.5732 0.8045 0.5218 0.8626 0.6749 0.7715 0.644 0.549 0.7635 0.7062 0.5342 0.505 0.477 0.3132 0.5715 0.4177 0.6227 0.4359 0.5613 0.54 0.5428 0.2688 0.4417 0.2377 0.1647 0.4306 0.3102 0.536 0.1793 0.4065 0.6395 0.8018 0.7721 0.6251 0.5724 0.2332 0.2173 0.9701 0.6833 0.973 0.4008 0.8501 0.283 0.1196 0.3498 0.3124 0.1946 0.6615 0.7633 0.4016 0.6576 0.0947 0.3887 0.4577 0.8036 0.5118 0.5103 0.4069 0.5242 0.6061 0.4049 0.266 0.2852 0.8115 0.4929 0.3835 0.2963 0.4504 0.6518 0.0695 0.055 0.3519 0.459 0.5893 0.0316 0.3718 0.1144 0. 0.0188 0.0279 0.0823 0.321 0.4464 0.1742 0.0706 0.1066 0.1463 0.0338 0.7416 0.2032 0.5625 0.2155 0.2425 0.1552 0.5696 0.1399 0.7724 0.9924 0. 0.4958 0.8217 0.1636 0.3676 0.488 0. 0.3093 0.074 0.3426 0.2422 0.3756 0.5874 0.0002 0.1629 0.848 0. 0.5236 0.3787 0.1062 0.1531 0.0937 0.003 0.1309 0.415 0.5897 0. 0.4251 0.0435 0.2574 0. 0.3739 0.0208 0.1173 0.0893] 2022-08-22 21:32:53 [INFO] [EVAL] The model with the best validation mIoU (0.3213) was saved at iter 32000. 2022-08-22 21:33:03 [INFO] [TRAIN] epoch: 27, iter: 34050/160000, loss: 0.8067, lr: 0.000954, batch_cost: 0.1890, reader_cost: 0.00362, ips: 42.3198 samples/sec | ETA 06:36:49 2022-08-22 21:33:11 [INFO] [TRAIN] epoch: 27, iter: 34100/160000, loss: 0.7940, lr: 0.000953, batch_cost: 0.1786, reader_cost: 0.00118, ips: 44.7815 samples/sec | ETA 06:14:51 2022-08-22 21:33:24 [INFO] [TRAIN] epoch: 28, iter: 34150/160000, loss: 0.8033, lr: 0.000953, batch_cost: 0.2476, reader_cost: 0.06581, ips: 32.3164 samples/sec | ETA 08:39:14 2022-08-22 21:33:32 [INFO] [TRAIN] epoch: 28, iter: 34200/160000, loss: 0.7750, lr: 0.000952, batch_cost: 0.1700, reader_cost: 0.00062, ips: 47.0512 samples/sec | ETA 05:56:29 2022-08-22 21:33:41 [INFO] [TRAIN] epoch: 28, iter: 34250/160000, loss: 0.7948, lr: 0.000952, batch_cost: 0.1816, reader_cost: 0.00109, ips: 44.0491 samples/sec | ETA 06:20:38 2022-08-22 21:33:51 [INFO] [TRAIN] epoch: 28, iter: 34300/160000, loss: 0.7641, lr: 0.000952, batch_cost: 0.1929, reader_cost: 0.00076, ips: 41.4664 samples/sec | ETA 06:44:10 2022-08-22 21:34:01 [INFO] [TRAIN] epoch: 28, iter: 34350/160000, loss: 0.7488, lr: 0.000951, batch_cost: 0.1901, reader_cost: 0.00071, ips: 42.0801 samples/sec | ETA 06:38:07 2022-08-22 21:34:10 [INFO] [TRAIN] epoch: 28, iter: 34400/160000, loss: 0.7700, lr: 0.000951, batch_cost: 0.1885, reader_cost: 0.00166, ips: 42.4398 samples/sec | ETA 06:34:35 2022-08-22 21:34:19 [INFO] [TRAIN] epoch: 28, iter: 34450/160000, loss: 0.7684, lr: 0.000951, batch_cost: 0.1829, reader_cost: 0.00104, ips: 43.7400 samples/sec | ETA 06:22:42 2022-08-22 21:34:28 [INFO] [TRAIN] epoch: 28, iter: 34500/160000, loss: 0.7759, lr: 0.000950, batch_cost: 0.1866, reader_cost: 0.00058, ips: 42.8676 samples/sec | ETA 06:30:20 2022-08-22 21:34:38 [INFO] [TRAIN] epoch: 28, iter: 34550/160000, loss: 0.7521, lr: 0.000950, batch_cost: 0.1840, reader_cost: 0.00046, ips: 43.4770 samples/sec | ETA 06:24:43 2022-08-22 21:34:48 [INFO] [TRAIN] epoch: 28, iter: 34600/160000, loss: 0.7573, lr: 0.000949, batch_cost: 0.2018, reader_cost: 0.00071, ips: 39.6481 samples/sec | ETA 07:01:42 2022-08-22 21:34:58 [INFO] [TRAIN] epoch: 28, iter: 34650/160000, loss: 0.7953, lr: 0.000949, batch_cost: 0.2027, reader_cost: 0.00075, ips: 39.4706 samples/sec | ETA 07:03:26 2022-08-22 21:35:09 [INFO] [TRAIN] epoch: 28, iter: 34700/160000, loss: 
0.7940, lr: 0.000949, batch_cost: 0.2278, reader_cost: 0.00036, ips: 35.1140 samples/sec | ETA 07:55:47 2022-08-22 21:35:20 [INFO] [TRAIN] epoch: 28, iter: 34750/160000, loss: 0.7571, lr: 0.000948, batch_cost: 0.2135, reader_cost: 0.00472, ips: 37.4767 samples/sec | ETA 07:25:36 2022-08-22 21:35:33 [INFO] [TRAIN] epoch: 28, iter: 34800/160000, loss: 0.7944, lr: 0.000948, batch_cost: 0.2554, reader_cost: 0.00830, ips: 31.3209 samples/sec | ETA 08:52:58 2022-08-22 21:35:44 [INFO] [TRAIN] epoch: 28, iter: 34850/160000, loss: 0.7611, lr: 0.000948, batch_cost: 0.2301, reader_cost: 0.00154, ips: 34.7705 samples/sec | ETA 07:59:54 2022-08-22 21:35:55 [INFO] [TRAIN] epoch: 28, iter: 34900/160000, loss: 0.7977, lr: 0.000947, batch_cost: 0.2191, reader_cost: 0.00158, ips: 36.5107 samples/sec | ETA 07:36:51 2022-08-22 21:36:06 [INFO] [TRAIN] epoch: 28, iter: 34950/160000, loss: 0.7990, lr: 0.000947, batch_cost: 0.2150, reader_cost: 0.00079, ips: 37.2143 samples/sec | ETA 07:28:02 2022-08-22 21:36:19 [INFO] [TRAIN] epoch: 28, iter: 35000/160000, loss: 0.7556, lr: 0.000946, batch_cost: 0.2517, reader_cost: 0.00068, ips: 31.7817 samples/sec | ETA 08:44:24 2022-08-22 21:36:19 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 202s - batch_cost: 0.2018 - reader cost: 7.6146e-04 2022-08-22 21:39:41 [INFO] [EVAL] #Images: 2000 mIoU: 0.3147 Acc: 0.7474 Kappa: 0.7283 Dice: 0.4433 2022-08-22 21:39:41 [INFO] [EVAL] Class IoU: [0.6573 0.7635 0.9271 0.7108 0.676 0.7451 0.7558 0.7272 0.4953 0.6165 0.4544 0.5226 0.6818 0.3078 0.2929 0.4086 0.5109 0.4481 0.5877 0.3634 0.7316 0.4264 0.5869 0.4593 0.3109 0.3673 0.412 0.4204 0.332 0.2092 0.1938 0.4305 0.2518 0.3414 0.31 0.3453 0.3865 0.4631 0.2441 0.2736 0.1598 0.1436 0.3431 0.2177 0.3168 0.1614 0.2799 0.4251 0.5388 0.5161 0.4824 0.278 0.1588 0.1667 0.7268 0.3963 0.8026 0.3734 0.3064 0.2141 0.1245 0.2582 0.2698 0.1438 0.4275 0.6522 0.2842 0.41 0.102 0.272 0.4141 0.4891 0.3334 0.2718 0.4133 0.3093 0.3635 0.2281 0.207 0.2197 0.6091 0.3521 0.3665 0.1657 0.2968 0.5102 0.0734 0.0599 0.233 0.467 0.3768 0.0356 0.1694 0.0935 0.001 0.0097 0.0816 0.1421 0.2601 0.3026 0.0422 0.0177 0.1822 0.0456 0.1084 0.2848 0.1685 0.485 0.0712 0.1776 0.0798 0.4235 0.1113 0.642 0.6032 0.0003 0.231 0.556 0.126 0.1862 0.3555 0. 0.2066 0.0914 0.4283 0.211 0.4257 0.3855 0. 0.2184 0.4326 0. 0.2976 0.2112 0.1005 0.125 0.1083 0.001 0.1414 0.3397 0.2541 0.0182 0.3666 0.0569 0.314 0. 0.3276 0.0219 0.1132 0.1489] 2022-08-22 21:39:41 [INFO] [EVAL] Class Precision: [0.7707 0.8576 0.9641 0.8114 0.7452 0.873 0.86 0.773 0.6757 0.7935 0.7081 0.6914 0.78 0.4451 0.3764 0.5678 0.7044 0.7228 0.7419 0.6383 0.8214 0.6486 0.7093 0.6076 0.5127 0.5138 0.4848 0.6807 0.721 0.3753 0.3811 0.543 0.5503 0.4943 0.4927 0.4281 0.6042 0.7484 0.3991 0.601 0.3169 0.3102 0.628 0.5781 0.5636 0.3372 0.433 0.5445 0.5649 0.6111 0.6456 0.325 0.3572 0.6707 0.7561 0.486 0.8401 0.5534 0.7115 0.3407 0.2409 0.3867 0.466 0.707 0.5761 0.7493 0.4757 0.6232 0.2841 0.4478 0.6683 0.5816 0.5799 0.3461 0.7103 0.4815 0.5953 0.624 0.7424 0.3608 0.7224 0.6545 0.6874 0.2924 0.4077 0.7487 0.3706 0.3357 0.4319 0.6649 0.4898 0.0408 0.3562 0.2998 0.0028 0.0266 0.3926 0.2435 0.5481 0.6297 0.5971 0.0485 0.4774 0.5136 0.7267 0.3251 0.3292 0.7088 0.1588 0.2842 0.2174 0.6298 0.5755 0.6843 0.607 0.0146 0.764 0.5826 0.2713 0.6552 0.6888 0. 0.3309 0.6567 0.6622 0.6652 0.7741 0.5247 0. 0.4345 0.6001 0. 0.6481 0.6308 0.4494 0.1836 0.2966 0.0212 0.4818 0.625 0.3016 0.0226 0.7315 0.2786 0.6016 0. 
0.788 0.3235 0.4256 0.7093] 2022-08-22 21:39:41 [INFO] [EVAL] Class Recall: [0.8172 0.8744 0.9602 0.8514 0.8792 0.8357 0.8618 0.9246 0.6497 0.7344 0.5591 0.6816 0.8442 0.4995 0.5692 0.593 0.6503 0.5411 0.7387 0.4576 0.87 0.5545 0.7728 0.653 0.4413 0.563 0.7329 0.5237 0.3809 0.3209 0.2827 0.6751 0.317 0.5245 0.4554 0.641 0.5175 0.5485 0.3859 0.3344 0.2438 0.2109 0.4307 0.2588 0.4199 0.2364 0.4419 0.6597 0.9209 0.7684 0.6562 0.6576 0.2224 0.1815 0.9493 0.6822 0.9473 0.5344 0.3498 0.3656 0.2048 0.4374 0.3905 0.153 0.6237 0.8343 0.4139 0.5452 0.1374 0.4093 0.5212 0.7548 0.4395 0.5586 0.497 0.4639 0.4829 0.2644 0.223 0.3596 0.7953 0.4325 0.4398 0.2765 0.5218 0.6157 0.0838 0.068 0.336 0.6109 0.6203 0.2196 0.2441 0.1197 0.0015 0.0151 0.0934 0.2543 0.3312 0.3682 0.0434 0.0272 0.2276 0.0477 0.113 0.697 0.2565 0.6056 0.1144 0.3212 0.112 0.5639 0.1212 0.9121 0.9897 0.0003 0.2488 0.9243 0.1905 0.2064 0.4235 0. 0.3548 0.0959 0.5481 0.236 0.4861 0.5923 0. 0.3051 0.6079 0. 0.355 0.241 0.1146 0.2812 0.1457 0.001 0.1668 0.4267 0.6172 0.086 0.4236 0.0667 0.3965 0. 0.3593 0.0229 0.1336 0.1586] 2022-08-22 21:39:41 [INFO] [EVAL] The model with the best validation mIoU (0.3213) was saved at iter 32000. 2022-08-22 21:39:50 [INFO] [TRAIN] epoch: 28, iter: 35050/160000, loss: 0.7035, lr: 0.000946, batch_cost: 0.1910, reader_cost: 0.00280, ips: 41.8838 samples/sec | ETA 06:37:46 2022-08-22 21:40:01 [INFO] [TRAIN] epoch: 28, iter: 35100/160000, loss: 0.7451, lr: 0.000946, batch_cost: 0.2102, reader_cost: 0.00161, ips: 38.0594 samples/sec | ETA 07:17:33 2022-08-22 21:40:10 [INFO] [TRAIN] epoch: 28, iter: 35150/160000, loss: 0.8325, lr: 0.000945, batch_cost: 0.1852, reader_cost: 0.00073, ips: 43.1931 samples/sec | ETA 06:25:24 2022-08-22 21:40:20 [INFO] [TRAIN] epoch: 28, iter: 35200/160000, loss: 0.8002, lr: 0.000945, batch_cost: 0.1960, reader_cost: 0.00040, ips: 40.8136 samples/sec | ETA 06:47:42 2022-08-22 21:40:30 [INFO] [TRAIN] epoch: 28, iter: 35250/160000, loss: 0.7504, lr: 0.000944, batch_cost: 0.1956, reader_cost: 0.00052, ips: 40.8953 samples/sec | ETA 06:46:43 2022-08-22 21:40:40 [INFO] [TRAIN] epoch: 28, iter: 35300/160000, loss: 0.7497, lr: 0.000944, batch_cost: 0.2040, reader_cost: 0.00049, ips: 39.2100 samples/sec | ETA 07:04:02 2022-08-22 21:40:49 [INFO] [TRAIN] epoch: 28, iter: 35350/160000, loss: 0.7782, lr: 0.000944, batch_cost: 0.1798, reader_cost: 0.00044, ips: 44.4853 samples/sec | ETA 06:13:36 2022-08-22 21:41:03 [INFO] [TRAIN] epoch: 29, iter: 35400/160000, loss: 0.7529, lr: 0.000943, batch_cost: 0.2753, reader_cost: 0.06449, ips: 29.0561 samples/sec | ETA 09:31:46 2022-08-22 21:41:13 [INFO] [TRAIN] epoch: 29, iter: 35450/160000, loss: 0.7316, lr: 0.000943, batch_cost: 0.2029, reader_cost: 0.00067, ips: 39.4325 samples/sec | ETA 07:01:08 2022-08-22 21:41:23 [INFO] [TRAIN] epoch: 29, iter: 35500/160000, loss: 0.7412, lr: 0.000943, batch_cost: 0.2136, reader_cost: 0.00086, ips: 37.4530 samples/sec | ETA 07:23:13 2022-08-22 21:41:34 [INFO] [TRAIN] epoch: 29, iter: 35550/160000, loss: 0.7523, lr: 0.000942, batch_cost: 0.2120, reader_cost: 0.00043, ips: 37.7394 samples/sec | ETA 07:19:40 2022-08-22 21:41:43 [INFO] [TRAIN] epoch: 29, iter: 35600/160000, loss: 0.7560, lr: 0.000942, batch_cost: 0.1772, reader_cost: 0.00044, ips: 45.1523 samples/sec | ETA 06:07:20 2022-08-22 21:41:54 [INFO] [TRAIN] epoch: 29, iter: 35650/160000, loss: 0.7466, lr: 0.000941, batch_cost: 0.2188, reader_cost: 0.00054, ips: 36.5561 samples/sec | ETA 07:33:32 2022-08-22 21:42:04 [INFO] [TRAIN] epoch: 29, iter: 35700/160000, 
loss: 0.7373, lr: 0.000941, batch_cost: 0.1938, reader_cost: 0.00047, ips: 41.2822 samples/sec | ETA 06:41:27 2022-08-22 21:42:15 [INFO] [TRAIN] epoch: 29, iter: 35750/160000, loss: 0.7303, lr: 0.000941, batch_cost: 0.2334, reader_cost: 0.00044, ips: 34.2814 samples/sec | ETA 08:03:15 2022-08-22 21:42:25 [INFO] [TRAIN] epoch: 29, iter: 35800/160000, loss: 0.7831, lr: 0.000940, batch_cost: 0.2008, reader_cost: 0.00049, ips: 39.8492 samples/sec | ETA 06:55:34 2022-08-22 21:42:37 [INFO] [TRAIN] epoch: 29, iter: 35850/160000, loss: 0.7469, lr: 0.000940, batch_cost: 0.2456, reader_cost: 0.00090, ips: 32.5681 samples/sec | ETA 08:28:16 2022-08-22 21:42:49 [INFO] [TRAIN] epoch: 29, iter: 35900/160000, loss: 0.7158, lr: 0.000940, batch_cost: 0.2389, reader_cost: 0.00048, ips: 33.4812 samples/sec | ETA 08:14:12 2022-08-22 21:43:00 [INFO] [TRAIN] epoch: 29, iter: 35950/160000, loss: 0.7893, lr: 0.000939, batch_cost: 0.2060, reader_cost: 0.00059, ips: 38.8287 samples/sec | ETA 07:05:58 2022-08-22 21:43:11 [INFO] [TRAIN] epoch: 29, iter: 36000/160000, loss: 0.7548, lr: 0.000939, batch_cost: 0.2225, reader_cost: 0.00054, ips: 35.9524 samples/sec | ETA 07:39:52 2022-08-22 21:43:11 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 202s - batch_cost: 0.2015 - reader cost: 8.9068e-04 2022-08-22 21:46:33 [INFO] [EVAL] #Images: 2000 mIoU: 0.3141 Acc: 0.7503 Kappa: 0.7312 Dice: 0.4408 2022-08-22 21:46:33 [INFO] [EVAL] Class IoU: [0.6648 0.7661 0.921 0.7168 0.6743 0.7473 0.7628 0.745 0.4996 0.602 0.4717 0.5288 0.6852 0.3087 0.22 0.4129 0.452 0.4352 0.5783 0.371 0.7317 0.3766 0.5974 0.47 0.3232 0.4242 0.3895 0.4183 0.366 0.2567 0.2053 0.4354 0.2851 0.3472 0.269 0.3502 0.3948 0.5306 0.25 0.3217 0.1677 0.0977 0.3488 0.2082 0.2995 0.2444 0.29 0.4519 0.6102 0.4542 0.5087 0.4004 0.193 0.2172 0.6693 0.3777 0.8043 0.37 0.4035 0.2453 0.133 0.1771 0.241 0.1417 0.4444 0.6002 0.2296 0.3959 0.1288 0.2987 0.4047 0.4592 0.3543 0.2328 0.4233 0.3152 0.4456 0.2525 0.0644 0.0974 0.4858 0.352 0.2869 0.1734 0.2306 0.5227 0.0935 0.0547 0.2935 0.4241 0.2872 0.0892 0.2199 0.06 0.0159 0.0038 0.1489 0.037 0.2323 0.2589 0.0619 0.0386 0.1702 0.805 0.0314 0.5109 0.1166 0.5005 0.081 0.0164 0.0485 0.3566 0.132 0.6365 0.5343 0.0011 0.2618 0.5099 0.1641 0.1575 0.3015 0.0123 0.2155 0.196 0.2964 0.2286 0.4211 0.3636 0. 0.2332 0.5167 0. 0.2608 0.152 0.1589 0.1078 0.1137 0.0069 0.1368 0.3109 0.0851 0.0143 0.2689 0.0136 0.2103 0. 0.353 0.0182 0.1241 0.0816] 2022-08-22 21:46:33 [INFO] [EVAL] Class Precision: [0.7759 0.8384 0.9498 0.8359 0.7598 0.8399 0.8694 0.8007 0.6297 0.8081 0.6366 0.6797 0.7774 0.4795 0.5101 0.6007 0.6586 0.7326 0.7279 0.6256 0.8331 0.6407 0.7193 0.575 0.5099 0.5313 0.4588 0.7183 0.6636 0.3407 0.3627 0.5504 0.531 0.5437 0.3756 0.5181 0.6275 0.6749 0.4362 0.6315 0.2297 0.2917 0.6376 0.603 0.5366 0.3772 0.4664 0.7448 0.6822 0.5077 0.6861 0.6933 0.3239 0.6823 0.6904 0.4648 0.8319 0.6 0.7333 0.4242 0.1888 0.2887 0.3191 0.7441 0.5818 0.6607 0.4035 0.5956 0.2503 0.5223 0.7323 0.6415 0.5739 0.3048 0.5768 0.465 0.5615 0.506 0.6862 0.2863 0.5408 0.6437 0.7746 0.3694 0.3152 0.671 0.3959 0.3509 0.4755 0.6223 0.3442 0.1547 0.3802 0.2725 0.0519 0.0188 0.4102 0.4394 0.5114 0.563 0.3 0.0779 0.5215 0.955 0.345 0.7069 0.2675 0.7748 0.2123 0.0393 0.2269 0.3834 0.4391 0.7102 0.5356 0.0652 0.849 0.5963 0.247 0.4501 0.6434 0.1349 0.665 0.5464 0.7172 0.6586 0.8284 0.5277 0. 0.5712 0.6795 0. 0.6328 0.7581 0.6012 0.2447 0.3115 0.1161 0.3898 0.7143 0.1759 0.0186 0.6738 0.2639 0.7461 0. 
0.8272 0.4556 0.3549 0.7528] 2022-08-22 21:46:33 [INFO] [EVAL] Class Recall: [0.8229 0.8988 0.9682 0.8343 0.857 0.8714 0.8615 0.9146 0.7076 0.7025 0.6457 0.7044 0.8524 0.4644 0.279 0.5692 0.5903 0.5174 0.7378 0.4768 0.8574 0.4775 0.7791 0.7201 0.4689 0.6778 0.7206 0.5004 0.4494 0.5102 0.3213 0.6757 0.3811 0.49 0.4866 0.5194 0.5157 0.7129 0.3694 0.3961 0.383 0.128 0.4351 0.2413 0.404 0.4096 0.4339 0.5347 0.8526 0.8118 0.663 0.4866 0.3233 0.2416 0.9563 0.6683 0.9604 0.4912 0.4728 0.3677 0.3102 0.3142 0.4961 0.149 0.6529 0.8676 0.3476 0.5414 0.2098 0.411 0.4749 0.6177 0.4807 0.4964 0.6141 0.4945 0.6833 0.3352 0.0664 0.1286 0.8269 0.4371 0.313 0.2463 0.462 0.7029 0.1091 0.0609 0.434 0.5711 0.6343 0.174 0.3428 0.0714 0.0224 0.0047 0.1894 0.0388 0.2986 0.3241 0.0723 0.071 0.2016 0.8368 0.0334 0.6482 0.1713 0.5858 0.1158 0.0275 0.0582 0.836 0.1587 0.8597 0.9952 0.0011 0.2745 0.7787 0.3283 0.1951 0.362 0.0133 0.2418 0.2341 0.3357 0.2593 0.4613 0.5389 0. 0.2827 0.6833 0. 0.3073 0.1597 0.1776 0.1616 0.1518 0.0072 0.1741 0.3551 0.1416 0.0586 0.3091 0.0141 0.2265 0. 0.3811 0.0186 0.1602 0.0839] 2022-08-22 21:46:33 [INFO] [EVAL] The model with the best validation mIoU (0.3213) was saved at iter 32000. 2022-08-22 21:46:43 [INFO] [TRAIN] epoch: 29, iter: 36050/160000, loss: 0.7760, lr: 0.000938, batch_cost: 0.2076, reader_cost: 0.00366, ips: 38.5371 samples/sec | ETA 07:08:51 2022-08-22 21:46:54 [INFO] [TRAIN] epoch: 29, iter: 36100/160000, loss: 0.7138, lr: 0.000938, batch_cost: 0.2138, reader_cost: 0.00120, ips: 37.4131 samples/sec | ETA 07:21:33 2022-08-22 21:47:03 [INFO] [TRAIN] epoch: 29, iter: 36150/160000, loss: 0.7547, lr: 0.000938, batch_cost: 0.1916, reader_cost: 0.00069, ips: 41.7447 samples/sec | ETA 06:35:34 2022-08-22 21:47:15 [INFO] [TRAIN] epoch: 29, iter: 36200/160000, loss: 0.7182, lr: 0.000937, batch_cost: 0.2225, reader_cost: 0.00041, ips: 35.9574 samples/sec | ETA 07:39:03 2022-08-22 21:47:25 [INFO] [TRAIN] epoch: 29, iter: 36250/160000, loss: 0.7921, lr: 0.000937, batch_cost: 0.2167, reader_cost: 0.00053, ips: 36.9116 samples/sec | ETA 07:27:00 2022-08-22 21:47:37 [INFO] [TRAIN] epoch: 29, iter: 36300/160000, loss: 0.7568, lr: 0.000937, batch_cost: 0.2366, reader_cost: 0.00061, ips: 33.8159 samples/sec | ETA 08:07:44 2022-08-22 21:47:47 [INFO] [TRAIN] epoch: 29, iter: 36350/160000, loss: 0.7355, lr: 0.000936, batch_cost: 0.1950, reader_cost: 0.00093, ips: 41.0254 samples/sec | ETA 06:41:51 2022-08-22 21:47:56 [INFO] [TRAIN] epoch: 29, iter: 36400/160000, loss: 0.7464, lr: 0.000936, batch_cost: 0.1891, reader_cost: 0.00071, ips: 42.3013 samples/sec | ETA 06:29:35 2022-08-22 21:48:06 [INFO] [TRAIN] epoch: 29, iter: 36450/160000, loss: 0.7309, lr: 0.000935, batch_cost: 0.2003, reader_cost: 0.00042, ips: 39.9325 samples/sec | ETA 06:52:31 2022-08-22 21:48:16 [INFO] [TRAIN] epoch: 29, iter: 36500/160000, loss: 0.7830, lr: 0.000935, batch_cost: 0.1831, reader_cost: 0.00068, ips: 43.6945 samples/sec | ETA 06:16:51 2022-08-22 21:48:26 [INFO] [TRAIN] epoch: 29, iter: 36550/160000, loss: 0.8332, lr: 0.000935, batch_cost: 0.2086, reader_cost: 0.00046, ips: 38.3575 samples/sec | ETA 07:09:07 2022-08-22 21:48:36 [INFO] [TRAIN] epoch: 29, iter: 36600/160000, loss: 0.7510, lr: 0.000934, batch_cost: 0.1972, reader_cost: 0.00056, ips: 40.5605 samples/sec | ETA 06:45:38 2022-08-22 21:48:51 [INFO] [TRAIN] epoch: 30, iter: 36650/160000, loss: 0.7478, lr: 0.000934, batch_cost: 0.3113, reader_cost: 0.09167, ips: 25.7008 samples/sec | ETA 10:39:55 2022-08-22 21:49:05 [INFO] [TRAIN] epoch: 30, iter: 
36700/160000, loss: 0.7415, lr: 0.000934, batch_cost: 0.2634, reader_cost: 0.00103, ips: 30.3742 samples/sec | ETA 09:01:14 2022-08-22 21:49:16 [INFO] [TRAIN] epoch: 30, iter: 36750/160000, loss: 0.7345, lr: 0.000933, batch_cost: 0.2364, reader_cost: 0.00092, ips: 33.8345 samples/sec | ETA 08:05:41 2022-08-22 21:49:29 [INFO] [TRAIN] epoch: 30, iter: 36800/160000, loss: 0.7642, lr: 0.000933, batch_cost: 0.2425, reader_cost: 0.00214, ips: 32.9848 samples/sec | ETA 08:18:00 2022-08-22 21:49:40 [INFO] [TRAIN] epoch: 30, iter: 36850/160000, loss: 0.7745, lr: 0.000932, batch_cost: 0.2326, reader_cost: 0.00157, ips: 34.3988 samples/sec | ETA 07:57:20 2022-08-22 21:49:53 [INFO] [TRAIN] epoch: 30, iter: 36900/160000, loss: 0.6975, lr: 0.000932, batch_cost: 0.2521, reader_cost: 0.00040, ips: 31.7275 samples/sec | ETA 08:37:19 2022-08-22 21:50:05 [INFO] [TRAIN] epoch: 30, iter: 36950/160000, loss: 0.8117, lr: 0.000932, batch_cost: 0.2493, reader_cost: 0.00058, ips: 32.0908 samples/sec | ETA 08:31:15 2022-08-22 21:50:16 [INFO] [TRAIN] epoch: 30, iter: 37000/160000, loss: 0.7307, lr: 0.000931, batch_cost: 0.2229, reader_cost: 0.00261, ips: 35.8945 samples/sec | ETA 07:36:53 2022-08-22 21:50:16 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 185s - batch_cost: 0.1853 - reader cost: 8.2179e-04 2022-08-22 21:53:22 [INFO] [EVAL] #Images: 2000 mIoU: 0.3174 Acc: 0.7502 Kappa: 0.7313 Dice: 0.4448 2022-08-22 21:53:22 [INFO] [EVAL] Class IoU: [0.674 0.7694 0.9276 0.7065 0.6754 0.7436 0.7516 0.7524 0.4977 0.595 0.4696 0.5309 0.679 0.238 0.2877 0.3952 0.5219 0.4131 0.5828 0.3823 0.7273 0.4425 0.5938 0.4724 0.3427 0.3625 0.4255 0.421 0.338 0.2221 0.1993 0.4503 0.2699 0.32 0.2821 0.3398 0.3974 0.5148 0.2422 0.248 0.1283 0.1053 0.3421 0.2353 0.3104 0.2443 0.3252 0.4646 0.6011 0.5307 0.4702 0.2619 0.1767 0.1859 0.6842 0.4771 0.8347 0.3747 0.4639 0.2481 0.1405 0.1646 0.3032 0.1304 0.3983 0.617 0.2408 0.4313 0.0346 0.2657 0.3463 0.3576 0.3513 0.2577 0.4293 0.3056 0.305 0.2907 0.1401 0.2032 0.5981 0.3694 0.2203 0.0921 0.2524 0.5371 0.0908 0.0739 0.2652 0.4392 0.3591 0.0377 0.1676 0.0619 0.0235 0.0148 0.0892 0.1113 0.2793 0.293 0.0207 0.0144 0.1547 0.2048 0.056 0.4708 0.114 0.5125 0.1248 0.0512 0.069 0.3274 0.1204 0.5445 0.8082 0.0007 0.3901 0.6459 0.1384 0.1638 0.409 0.0021 0.191 0.1105 0.2944 0.2392 0.303 0.4081 0.4139 0.2442 0.4641 0. 0.2999 0.2052 0.0694 0.106 0.0999 0.0028 0.1463 0.3467 0.2256 0.0322 0.3739 0.017 0.3262 0. 0.353 0.0111 0.1144 0.1083] 2022-08-22 21:53:22 [INFO] [EVAL] Class Precision: [0.7879 0.8497 0.9646 0.8019 0.744 0.837 0.8626 0.8282 0.6378 0.7572 0.6648 0.6557 0.7514 0.5084 0.5089 0.5371 0.7343 0.7077 0.7559 0.624 0.8291 0.6612 0.778 0.585 0.5252 0.4854 0.5242 0.6734 0.7183 0.2598 0.4341 0.5839 0.4756 0.4423 0.44 0.4553 0.6244 0.7644 0.4418 0.7251 0.2494 0.2553 0.589 0.5477 0.4858 0.5287 0.503 0.6362 0.6779 0.6033 0.7663 0.3323 0.28 0.7883 0.7023 0.6267 0.891 0.5575 0.581 0.3847 0.2712 0.2497 0.4949 0.654 0.4784 0.709 0.3673 0.6255 0.0625 0.4275 0.7161 0.7124 0.4793 0.3332 0.655 0.434 0.6201 0.5596 0.8168 0.3075 0.6799 0.7849 0.7702 0.1636 0.3099 0.7389 0.3321 0.3006 0.6216 0.6092 0.5057 0.0492 0.3343 0.2871 0.068 0.0368 0.3485 0.4893 0.4507 0.5117 0.7278 0.0273 0.5934 0.7441 0.53 0.6364 0.5904 0.7749 0.2885 0.1302 0.1849 0.5017 0.4705 0.6213 0.8152 0.1182 0.7434 0.7043 0.1559 0.499 0.723 0.1891 0.4781 0.6551 0.7824 0.5756 0.8021 0.5742 0.569 0.3603 0.6382 0. 0.6303 0.5792 0.6191 0.3731 0.3806 0.1098 0.3924 0.6001 0.2549 0.0403 0.648 0.2041 0.6049 0. 
0.761 0.5316 0.2783 0.8339] 2022-08-22 21:53:22 [INFO] [EVAL] Class Recall: [0.8234 0.8906 0.9602 0.8558 0.8798 0.8695 0.8539 0.8916 0.6938 0.7352 0.6152 0.7361 0.8756 0.3092 0.3982 0.5993 0.6433 0.4981 0.7178 0.4967 0.8556 0.5722 0.7149 0.7105 0.4966 0.5888 0.6933 0.5291 0.3897 0.6044 0.2693 0.6631 0.3844 0.5365 0.4401 0.5727 0.5222 0.6119 0.3491 0.2737 0.2089 0.152 0.4494 0.2921 0.4623 0.3123 0.4792 0.6327 0.8414 0.8151 0.5488 0.5529 0.3238 0.1956 0.9637 0.6666 0.9297 0.5333 0.6971 0.4113 0.2256 0.3256 0.439 0.14 0.7038 0.8263 0.4115 0.5814 0.0718 0.4124 0.4015 0.4179 0.568 0.5321 0.5547 0.508 0.375 0.377 0.1446 0.3748 0.8325 0.4111 0.2358 0.1741 0.5762 0.6629 0.111 0.0892 0.3162 0.6116 0.5534 0.138 0.2516 0.0731 0.0348 0.0242 0.107 0.126 0.4236 0.4067 0.0208 0.0294 0.173 0.2203 0.0589 0.644 0.1238 0.6022 0.1804 0.0777 0.0992 0.4853 0.1392 0.815 0.9895 0.0007 0.4508 0.8861 0.5526 0.196 0.485 0.0021 0.2412 0.1173 0.3206 0.2904 0.3275 0.5851 0.6028 0.4312 0.6298 0. 0.3639 0.2412 0.0725 0.129 0.1194 0.0028 0.1891 0.4509 0.6623 0.1387 0.4692 0.0182 0.4146 0. 0.397 0.0112 0.1627 0.1107] 2022-08-22 21:53:22 [INFO] [EVAL] The model with the best validation mIoU (0.3213) was saved at iter 32000. 2022-08-22 21:53:32 [INFO] [TRAIN] epoch: 30, iter: 37050/160000, loss: 0.8043, lr: 0.000931, batch_cost: 0.2046, reader_cost: 0.00480, ips: 39.0956 samples/sec | ETA 06:59:18 2022-08-22 21:53:42 [INFO] [TRAIN] epoch: 30, iter: 37100/160000, loss: 0.7290, lr: 0.000930, batch_cost: 0.1896, reader_cost: 0.00115, ips: 42.1938 samples/sec | ETA 06:28:21 2022-08-22 21:53:51 [INFO] [TRAIN] epoch: 30, iter: 37150/160000, loss: 0.7259, lr: 0.000930, batch_cost: 0.1903, reader_cost: 0.00053, ips: 42.0369 samples/sec | ETA 06:29:39 2022-08-22 21:54:01 [INFO] [TRAIN] epoch: 30, iter: 37200/160000, loss: 0.7377, lr: 0.000930, batch_cost: 0.1979, reader_cost: 0.00053, ips: 40.4280 samples/sec | ETA 06:45:00 2022-08-22 21:54:10 [INFO] [TRAIN] epoch: 30, iter: 37250/160000, loss: 0.7253, lr: 0.000929, batch_cost: 0.1844, reader_cost: 0.00075, ips: 43.3890 samples/sec | ETA 06:17:12 2022-08-22 21:54:22 [INFO] [TRAIN] epoch: 30, iter: 37300/160000, loss: 0.7566, lr: 0.000929, batch_cost: 0.2324, reader_cost: 0.00089, ips: 34.4207 samples/sec | ETA 07:55:17 2022-08-22 21:54:33 [INFO] [TRAIN] epoch: 30, iter: 37350/160000, loss: 0.7572, lr: 0.000929, batch_cost: 0.2150, reader_cost: 0.00061, ips: 37.2085 samples/sec | ETA 07:19:30 2022-08-22 21:54:44 [INFO] [TRAIN] epoch: 30, iter: 37400/160000, loss: 0.7010, lr: 0.000928, batch_cost: 0.2237, reader_cost: 0.00084, ips: 35.7636 samples/sec | ETA 07:37:04 2022-08-22 21:54:55 [INFO] [TRAIN] epoch: 30, iter: 37450/160000, loss: 0.7616, lr: 0.000928, batch_cost: 0.2238, reader_cost: 0.00055, ips: 35.7422 samples/sec | ETA 07:37:09 2022-08-22 21:55:07 [INFO] [TRAIN] epoch: 30, iter: 37500/160000, loss: 0.7917, lr: 0.000927, batch_cost: 0.2260, reader_cost: 0.00069, ips: 35.4060 samples/sec | ETA 07:41:18 2022-08-22 21:55:17 [INFO] [TRAIN] epoch: 30, iter: 37550/160000, loss: 0.7486, lr: 0.000927, batch_cost: 0.2157, reader_cost: 0.00353, ips: 37.0825 samples/sec | ETA 07:20:16 2022-08-22 21:55:32 [INFO] [TRAIN] epoch: 30, iter: 37600/160000, loss: 0.7460, lr: 0.000927, batch_cost: 0.2868, reader_cost: 0.00463, ips: 27.8931 samples/sec | ETA 09:45:05 2022-08-22 21:55:45 [INFO] [TRAIN] epoch: 30, iter: 37650/160000, loss: 0.7507, lr: 0.000926, batch_cost: 0.2683, reader_cost: 0.00089, ips: 29.8130 samples/sec | ETA 09:07:11 2022-08-22 21:55:59 [INFO] [TRAIN] epoch: 30, iter: 
37700/160000, loss: 0.6970, lr: 0.000926, batch_cost: 0.2806, reader_cost: 0.00449, ips: 28.5070 samples/sec | ETA 09:32:01 2022-08-22 21:56:13 [INFO] [TRAIN] epoch: 30, iter: 37750/160000, loss: 0.7550, lr: 0.000926, batch_cost: 0.2719, reader_cost: 0.00304, ips: 29.4244 samples/sec | ETA 09:13:57 2022-08-22 21:56:26 [INFO] [TRAIN] epoch: 30, iter: 37800/160000, loss: 0.7063, lr: 0.000925, batch_cost: 0.2685, reader_cost: 0.00653, ips: 29.7932 samples/sec | ETA 09:06:52 2022-08-22 21:56:40 [INFO] [TRAIN] epoch: 30, iter: 37850/160000, loss: 0.7746, lr: 0.000925, batch_cost: 0.2782, reader_cost: 0.00236, ips: 28.7544 samples/sec | ETA 09:26:24 2022-08-22 21:56:57 [INFO] [TRAIN] epoch: 31, iter: 37900/160000, loss: 0.7664, lr: 0.000924, batch_cost: 0.3336, reader_cost: 0.06274, ips: 23.9829 samples/sec | ETA 11:18:49 2022-08-22 21:57:11 [INFO] [TRAIN] epoch: 31, iter: 37950/160000, loss: 0.7045, lr: 0.000924, batch_cost: 0.2838, reader_cost: 0.00590, ips: 28.1851 samples/sec | ETA 09:37:22 2022-08-22 21:57:24 [INFO] [TRAIN] epoch: 31, iter: 38000/160000, loss: 0.6937, lr: 0.000924, batch_cost: 0.2676, reader_cost: 0.00244, ips: 29.8980 samples/sec | ETA 09:04:04 2022-08-22 21:57:24 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 201s - batch_cost: 0.2011 - reader cost: 0.0015 2022-08-22 22:00:46 [INFO] [EVAL] #Images: 2000 mIoU: 0.3218 Acc: 0.7524 Kappa: 0.7333 Dice: 0.4493 2022-08-22 22:00:46 [INFO] [EVAL] Class IoU: [0.671 0.7561 0.929 0.7112 0.6735 0.7443 0.7495 0.7635 0.5138 0.5916 0.4687 0.5283 0.6735 0.3182 0.262 0.4157 0.5264 0.412 0.5686 0.3889 0.72 0.4653 0.6044 0.4656 0.3363 0.2187 0.4448 0.4144 0.3678 0.2427 0.2275 0.4623 0.2721 0.3638 0.32 0.3655 0.4033 0.5237 0.2521 0.3247 0.1926 0.0961 0.3278 0.2356 0.3158 0.2104 0.3518 0.4695 0.6394 0.5801 0.464 0.2162 0.1525 0.1208 0.6964 0.4641 0.8352 0.4217 0.4216 0.2402 0.0849 0.2997 0.2923 0.1516 0.4293 0.6173 0.2604 0.4071 0.0765 0.3276 0.3534 0.4825 0.3391 0.2999 0.4135 0.3313 0.4211 0.3018 0.147 0.1111 0.4958 0.3765 0.328 0.0867 0.2386 0.5197 0.0597 0.0805 0.3 0.4731 0.3866 0.0304 0.1746 0.0667 0.0001 0.0154 0.0256 0.1264 0.2012 0.367 0.1256 0.0396 0.1754 0.1231 0.1171 0.3898 0.0663 0.5065 0.1469 0.1902 0.0727 0.4533 0.1297 0.5362 0.7681 0.0055 0.3367 0.5847 0.0387 0.1522 0.3266 0. 0.222 0.1127 0.5389 0.2426 0.3801 0.4255 0.0499 0.2081 0.5832 0.0007 0.2429 0.1634 0.1324 0.1123 0.1179 0.0052 0.1148 0.3191 0.3412 0.0004 0.3804 0.057 0.2607 0. 0.3307 0.0333 0.1134 0.0849] 2022-08-22 22:00:46 [INFO] [EVAL] Class Precision: [0.7697 0.8246 0.9688 0.8178 0.7528 0.8727 0.8664 0.8469 0.6862 0.7777 0.7175 0.6797 0.7397 0.4756 0.5228 0.5829 0.7232 0.7292 0.6682 0.6002 0.7993 0.6407 0.7717 0.6196 0.4669 0.4519 0.567 0.7445 0.7005 0.3657 0.3622 0.6161 0.4732 0.4862 0.3897 0.4874 0.614 0.7481 0.4552 0.5588 0.3081 0.2259 0.5414 0.4685 0.476 0.3533 0.6779 0.6478 0.7034 0.7026 0.7233 0.256 0.3701 0.5734 0.722 0.6573 0.8735 0.5594 0.5433 0.3624 0.1482 0.4961 0.4459 0.6809 0.5415 0.6933 0.3483 0.6597 0.3334 0.6034 0.7506 0.5768 0.5789 0.3888 0.658 0.5073 0.7016 0.5125 0.6398 0.3133 0.5532 0.6467 0.7456 0.1747 0.3542 0.7512 0.5312 0.3353 0.5484 0.6589 0.508 0.034 0.288 0.2397 0.0007 0.0466 0.1433 0.3137 0.4929 0.574 0.4367 0.0821 0.6394 0.5773 0.7244 0.5061 0.4735 0.7264 0.2919 0.436 0.21 0.7786 0.4411 0.6628 0.777 0.07 0.8542 0.6288 0.4975 0.5561 0.6443 0.0011 0.6934 0.564 0.7995 0.5779 0.6409 0.6266 0.2667 0.6182 0.6847 0.0102 0.3363 0.7318 0.5465 0.3499 0.2816 0.0932 0.4623 0.4577 0.8408 0.0006 0.6335 0.4153 0.5896 0. 
0.7727 0.3292 0.5227 0.8567] 2022-08-22 22:00:46 [INFO] [EVAL] Class Recall: [0.8396 0.9009 0.9577 0.8451 0.8648 0.8349 0.8474 0.8858 0.6716 0.7119 0.5748 0.7034 0.8826 0.4902 0.3443 0.5917 0.6591 0.4865 0.7922 0.5248 0.8789 0.6296 0.7359 0.6521 0.5459 0.2976 0.6735 0.4831 0.4363 0.4193 0.3794 0.6494 0.3903 0.5911 0.6417 0.5937 0.5403 0.6358 0.361 0.4367 0.3393 0.1432 0.4539 0.3216 0.484 0.3422 0.4224 0.6305 0.8755 0.7688 0.5641 0.5812 0.2059 0.1327 0.9517 0.6122 0.9501 0.6315 0.6529 0.416 0.166 0.4309 0.459 0.1632 0.6744 0.8492 0.5079 0.5153 0.0903 0.4175 0.4004 0.7469 0.4501 0.5674 0.5266 0.4883 0.513 0.4234 0.1603 0.1469 0.827 0.474 0.3693 0.1469 0.4224 0.6277 0.063 0.0958 0.3984 0.6266 0.6179 0.2251 0.3071 0.0846 0.0002 0.0226 0.0302 0.1747 0.2537 0.5044 0.1499 0.0709 0.1947 0.1353 0.1226 0.629 0.0716 0.626 0.2282 0.2523 0.1 0.5204 0.1553 0.7373 0.9853 0.0059 0.3572 0.893 0.0403 0.1732 0.3985 0. 0.2461 0.1235 0.623 0.2949 0.4829 0.5701 0.0579 0.2388 0.7972 0.0008 0.4663 0.1738 0.1488 0.142 0.1686 0.0055 0.1325 0.5131 0.3647 0.0009 0.4878 0.0619 0.3186 0. 0.3663 0.0357 0.1265 0.0861] 2022-08-22 22:00:46 [INFO] [EVAL] The model with the best validation mIoU (0.3218) was saved at iter 38000. 2022-08-22 22:00:55 [INFO] [TRAIN] epoch: 31, iter: 38050/160000, loss: 0.7513, lr: 0.000923, batch_cost: 0.1869, reader_cost: 0.00377, ips: 42.7952 samples/sec | ETA 06:19:56 2022-08-22 22:01:04 [INFO] [TRAIN] epoch: 31, iter: 38100/160000, loss: 0.7575, lr: 0.000923, batch_cost: 0.1797, reader_cost: 0.00094, ips: 44.5277 samples/sec | ETA 06:05:00 2022-08-22 22:01:12 [INFO] [TRAIN] epoch: 31, iter: 38150/160000, loss: 0.6980, lr: 0.000923, batch_cost: 0.1629, reader_cost: 0.00075, ips: 49.1199 samples/sec | ETA 05:30:45 2022-08-22 22:01:23 [INFO] [TRAIN] epoch: 31, iter: 38200/160000, loss: 0.6785, lr: 0.000922, batch_cost: 0.2172, reader_cost: 0.00070, ips: 36.8324 samples/sec | ETA 07:20:54 2022-08-22 22:01:33 [INFO] [TRAIN] epoch: 31, iter: 38250/160000, loss: 0.7593, lr: 0.000922, batch_cost: 0.2036, reader_cost: 0.00085, ips: 39.2948 samples/sec | ETA 06:53:07 2022-08-22 22:01:43 [INFO] [TRAIN] epoch: 31, iter: 38300/160000, loss: 0.7204, lr: 0.000921, batch_cost: 0.1851, reader_cost: 0.00093, ips: 43.2213 samples/sec | ETA 06:15:25 2022-08-22 22:01:54 [INFO] [TRAIN] epoch: 31, iter: 38350/160000, loss: 0.7495, lr: 0.000921, batch_cost: 0.2202, reader_cost: 0.00076, ips: 36.3259 samples/sec | ETA 07:26:30 2022-08-22 22:02:08 [INFO] [TRAIN] epoch: 31, iter: 38400/160000, loss: 0.7617, lr: 0.000921, batch_cost: 0.2862, reader_cost: 0.00469, ips: 27.9530 samples/sec | ETA 09:40:01 2022-08-22 22:02:21 [INFO] [TRAIN] epoch: 31, iter: 38450/160000, loss: 0.8045, lr: 0.000920, batch_cost: 0.2657, reader_cost: 0.00042, ips: 30.1140 samples/sec | ETA 08:58:10 2022-08-22 22:02:35 [INFO] [TRAIN] epoch: 31, iter: 38500/160000, loss: 0.7337, lr: 0.000920, batch_cost: 0.2734, reader_cost: 0.00087, ips: 29.2624 samples/sec | ETA 09:13:36 2022-08-22 22:02:48 [INFO] [TRAIN] epoch: 31, iter: 38550/160000, loss: 0.7183, lr: 0.000920, batch_cost: 0.2674, reader_cost: 0.00730, ips: 29.9125 samples/sec | ETA 09:01:21 2022-08-22 22:03:01 [INFO] [TRAIN] epoch: 31, iter: 38600/160000, loss: 0.7511, lr: 0.000919, batch_cost: 0.2598, reader_cost: 0.00787, ips: 30.7908 samples/sec | ETA 08:45:41 2022-08-22 22:03:15 [INFO] [TRAIN] epoch: 31, iter: 38650/160000, loss: 0.8064, lr: 0.000919, batch_cost: 0.2690, reader_cost: 0.00630, ips: 29.7428 samples/sec | ETA 09:03:59 2022-08-22 22:03:28 [INFO] [TRAIN] epoch: 31, iter: 
38700/160000, loss: 0.7713, lr: 0.000918, batch_cost: 0.2742, reader_cost: 0.00196, ips: 29.1804 samples/sec | ETA 09:14:15 2022-08-22 22:03:43 [INFO] [TRAIN] epoch: 31, iter: 38750/160000, loss: 0.6816, lr: 0.000918, batch_cost: 0.3023, reader_cost: 0.00212, ips: 26.4676 samples/sec | ETA 10:10:48 2022-08-22 22:03:57 [INFO] [TRAIN] epoch: 31, iter: 38800/160000, loss: 0.7468, lr: 0.000918, batch_cost: 0.2779, reader_cost: 0.01292, ips: 28.7881 samples/sec | ETA 09:21:20 2022-08-22 22:04:10 [INFO] [TRAIN] epoch: 31, iter: 38850/160000, loss: 0.7985, lr: 0.000917, batch_cost: 0.2561, reader_cost: 0.00256, ips: 31.2436 samples/sec | ETA 08:37:00 2022-08-22 22:04:25 [INFO] [TRAIN] epoch: 31, iter: 38900/160000, loss: 0.7107, lr: 0.000917, batch_cost: 0.2903, reader_cost: 0.00079, ips: 27.5544 samples/sec | ETA 09:45:59 2022-08-22 22:04:38 [INFO] [TRAIN] epoch: 31, iter: 38950/160000, loss: 0.7460, lr: 0.000916, batch_cost: 0.2669, reader_cost: 0.00078, ips: 29.9719 samples/sec | ETA 08:58:30 2022-08-22 22:04:51 [INFO] [TRAIN] epoch: 31, iter: 39000/160000, loss: 0.7295, lr: 0.000916, batch_cost: 0.2561, reader_cost: 0.00078, ips: 31.2385 samples/sec | ETA 08:36:27 2022-08-22 22:04:51 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 195s - batch_cost: 0.1953 - reader cost: 0.0011 2022-08-22 22:08:07 [INFO] [EVAL] #Images: 2000 mIoU: 0.3206 Acc: 0.7544 Kappa: 0.7354 Dice: 0.4465 2022-08-22 22:08:07 [INFO] [EVAL] Class IoU: [0.6675 0.7743 0.929 0.704 0.6756 0.7376 0.7678 0.7629 0.5091 0.6238 0.4628 0.5288 0.6841 0.3031 0.2627 0.4034 0.4682 0.4321 0.5846 0.3735 0.7257 0.5124 0.5891 0.4622 0.3425 0.4305 0.4704 0.4283 0.3825 0.1968 0.2079 0.4521 0.2837 0.324 0.2219 0.347 0.393 0.5526 0.2549 0.3494 0.155 0.0857 0.3356 0.2265 0.2695 0.2077 0.3164 0.4437 0.6573 0.5394 0.5141 0.2766 0.1938 0.2088 0.6894 0.4665 0.8261 0.2134 0.4511 0.2708 0.0485 0.1773 0.2938 0.0893 0.3894 0.594 0.2481 0.416 0.053 0.2898 0.4174 0.4604 0.3769 0.2711 0.4166 0.3462 0.5317 0.324 0.1873 0.0801 0.6551 0.3629 0.2272 0.0086 0.2827 0.536 0.0911 0.0731 0.2671 0.4967 0.3636 0.0731 0.1378 0.0425 0.0304 0.0027 0.0305 0.0477 0.0432 0.3683 0.0868 0.0246 0.2704 0.1544 0.146 0.3962 0.1296 0.4965 0.1607 0.3233 0.0716 0.263 0.1151 0.6006 0.6185 0.001 0.3571 0.6686 0.0896 0.1888 0.4249 0.0022 0.2165 0.168 0.2328 0.259 0.4156 0.4374 0.0262 0.1427 0.5653 0.013 0.2766 0.2058 0.091 0.1011 0.1328 0.0099 0.1226 0.3479 0.5153 0. 0.3141 0.0561 0.215 0. 0.3131 0.0331 0.1048 0.1082] 2022-08-22 22:08:07 [INFO] [EVAL] Class Precision: [0.7559 0.8595 0.9657 0.8081 0.7844 0.8834 0.8519 0.8422 0.6642 0.7591 0.6868 0.6781 0.7612 0.5459 0.5197 0.6213 0.532 0.6962 0.7589 0.6199 0.8171 0.6419 0.7066 0.5722 0.5075 0.5088 0.6522 0.6781 0.6617 0.3578 0.3771 0.6175 0.53 0.4177 0.3366 0.4431 0.6297 0.7422 0.4163 0.5039 0.2778 0.2516 0.6472 0.4581 0.4511 0.4506 0.5315 0.7012 0.7159 0.6407 0.7884 0.3263 0.3611 0.7315 0.7123 0.6386 0.8673 0.7563 0.6502 0.432 0.1078 0.3212 0.552 0.4923 0.4558 0.6733 0.3198 0.6081 0.4056 0.4904 0.6694 0.534 0.5208 0.3357 0.6386 0.5446 0.684 0.5706 0.6329 0.202 0.7598 0.7357 0.8129 0.0294 0.5023 0.7106 0.3556 0.3836 0.5876 0.7489 0.4951 0.4778 0.3331 0.2795 0.1503 0.0119 0.0973 0.3243 0.2747 0.5226 0.3133 0.0511 0.507 0.8075 0.8377 0.4912 0.5042 0.7725 0.2925 0.5544 0.1795 0.4776 0.4117 0.7118 0.6209 0.0736 0.7549 0.7523 0.1569 0.6363 0.5141 0.1425 0.828 0.6207 0.7559 0.6459 0.8756 0.6215 0.6593 0.3256 0.6714 0.3742 0.5313 0.7668 0.6122 0.3398 0.345 0.0824 0.3911 0.6183 0.7986 0. 0.7521 0.4267 0.5665 0. 
0.8352 0.2177 0.5048 0.8538] 2022-08-22 22:08:07 [INFO] [EVAL] Class Recall: [0.8509 0.8866 0.9607 0.8452 0.8297 0.8171 0.8862 0.8901 0.6855 0.7777 0.5866 0.7061 0.871 0.4053 0.3469 0.5349 0.796 0.5326 0.718 0.4844 0.8665 0.7174 0.78 0.7063 0.513 0.7367 0.6279 0.5376 0.4755 0.3043 0.3167 0.6279 0.3791 0.5909 0.3944 0.6154 0.5112 0.6839 0.3966 0.5326 0.2595 0.115 0.4107 0.3093 0.4011 0.2782 0.4387 0.5471 0.8894 0.7734 0.5964 0.6448 0.2951 0.2262 0.9554 0.6338 0.9455 0.2292 0.5957 0.4206 0.081 0.2834 0.3858 0.0984 0.7278 0.8346 0.5253 0.5684 0.0575 0.4146 0.5258 0.7694 0.577 0.5851 0.5452 0.4873 0.7048 0.4284 0.2101 0.1171 0.8262 0.4174 0.2397 0.0121 0.3928 0.6856 0.1092 0.0828 0.3287 0.5959 0.5779 0.0795 0.1903 0.0478 0.0368 0.0035 0.0425 0.0529 0.0488 0.5552 0.1071 0.0452 0.3668 0.1602 0.1502 0.6721 0.1486 0.5815 0.263 0.4368 0.1065 0.3692 0.1377 0.7936 0.9937 0.0011 0.4039 0.8572 0.1728 0.2116 0.7101 0.0022 0.2267 0.1872 0.2517 0.3019 0.4416 0.5963 0.0265 0.2025 0.7814 0.0133 0.3659 0.2195 0.0966 0.1259 0.1776 0.0112 0.1515 0.4431 0.5922 0. 0.3504 0.0607 0.2574 0. 0.3337 0.0375 0.1168 0.1102] 2022-08-22 22:08:07 [INFO] [EVAL] The model with the best validation mIoU (0.3218) was saved at iter 38000. 2022-08-22 22:08:16 [INFO] [TRAIN] epoch: 31, iter: 39050/160000, loss: 0.7173, lr: 0.000916, batch_cost: 0.1829, reader_cost: 0.00275, ips: 43.7378 samples/sec | ETA 06:08:42 2022-08-22 22:08:26 [INFO] [TRAIN] epoch: 31, iter: 39100/160000, loss: 0.7485, lr: 0.000915, batch_cost: 0.2030, reader_cost: 0.00155, ips: 39.4155 samples/sec | ETA 06:48:58 2022-08-22 22:08:35 [INFO] [TRAIN] epoch: 31, iter: 39150/160000, loss: 0.7942, lr: 0.000915, batch_cost: 0.1798, reader_cost: 0.00042, ips: 44.4881 samples/sec | ETA 06:02:11 2022-08-22 22:08:50 [INFO] [TRAIN] epoch: 32, iter: 39200/160000, loss: 0.7643, lr: 0.000915, batch_cost: 0.3000, reader_cost: 0.09213, ips: 26.6641 samples/sec | ETA 10:04:03 2022-08-22 22:09:02 [INFO] [TRAIN] epoch: 32, iter: 39250/160000, loss: 0.7626, lr: 0.000914, batch_cost: 0.2374, reader_cost: 0.00194, ips: 33.6964 samples/sec | ETA 07:57:47 2022-08-22 22:09:17 [INFO] [TRAIN] epoch: 32, iter: 39300/160000, loss: 0.7410, lr: 0.000914, batch_cost: 0.2993, reader_cost: 0.00205, ips: 26.7297 samples/sec | ETA 10:02:04 2022-08-22 22:09:31 [INFO] [TRAIN] epoch: 32, iter: 39350/160000, loss: 0.7320, lr: 0.000913, batch_cost: 0.2779, reader_cost: 0.00093, ips: 28.7883 samples/sec | ETA 09:18:47 2022-08-22 22:09:44 [INFO] [TRAIN] epoch: 32, iter: 39400/160000, loss: 0.7635, lr: 0.000913, batch_cost: 0.2612, reader_cost: 0.01493, ips: 30.6317 samples/sec | ETA 08:44:56 2022-08-22 22:09:58 [INFO] [TRAIN] epoch: 32, iter: 39450/160000, loss: 0.6952, lr: 0.000913, batch_cost: 0.2911, reader_cost: 0.03851, ips: 27.4826 samples/sec | ETA 09:44:51 2022-08-22 22:10:12 [INFO] [TRAIN] epoch: 32, iter: 39500/160000, loss: 0.7204, lr: 0.000912, batch_cost: 0.2729, reader_cost: 0.00059, ips: 29.3127 samples/sec | ETA 09:08:06 2022-08-22 22:10:24 [INFO] [TRAIN] epoch: 32, iter: 39550/160000, loss: 0.7478, lr: 0.000912, batch_cost: 0.2507, reader_cost: 0.00070, ips: 31.9151 samples/sec | ETA 08:23:12 2022-08-22 22:10:38 [INFO] [TRAIN] epoch: 32, iter: 39600/160000, loss: 0.7611, lr: 0.000912, batch_cost: 0.2627, reader_cost: 0.00436, ips: 30.4499 samples/sec | ETA 08:47:12 2022-08-22 22:10:51 [INFO] [TRAIN] epoch: 32, iter: 39650/160000, loss: 0.7498, lr: 0.000911, batch_cost: 0.2673, reader_cost: 0.00267, ips: 29.9256 samples/sec | ETA 08:56:13 2022-08-22 22:11:05 [INFO] [TRAIN] epoch: 32, 
iter: 39700/160000, loss: 0.7561, lr: 0.000911, batch_cost: 0.2767, reader_cost: 0.00047, ips: 28.9109 samples/sec | ETA 09:14:48 2022-08-22 22:11:18 [INFO] [TRAIN] epoch: 32, iter: 39750/160000, loss: 0.7025, lr: 0.000910, batch_cost: 0.2680, reader_cost: 0.00197, ips: 29.8511 samples/sec | ETA 08:57:06 2022-08-22 22:11:33 [INFO] [TRAIN] epoch: 32, iter: 39800/160000, loss: 0.7276, lr: 0.000910, batch_cost: 0.3017, reader_cost: 0.00156, ips: 26.5123 samples/sec | ETA 10:04:30 2022-08-22 22:11:46 [INFO] [TRAIN] epoch: 32, iter: 39850/160000, loss: 0.6838, lr: 0.000910, batch_cost: 0.2616, reader_cost: 0.00100, ips: 30.5839 samples/sec | ETA 08:43:48 2022-08-22 22:12:00 [INFO] [TRAIN] epoch: 32, iter: 39900/160000, loss: 0.7539, lr: 0.000909, batch_cost: 0.2761, reader_cost: 0.00086, ips: 28.9753 samples/sec | ETA 09:12:39 2022-08-22 22:12:14 [INFO] [TRAIN] epoch: 32, iter: 39950/160000, loss: 0.7117, lr: 0.000909, batch_cost: 0.2828, reader_cost: 0.00115, ips: 28.2851 samples/sec | ETA 09:25:54 2022-08-22 22:12:26 [INFO] [TRAIN] epoch: 32, iter: 40000/160000, loss: 0.7569, lr: 0.000909, batch_cost: 0.2411, reader_cost: 0.00662, ips: 33.1798 samples/sec | ETA 08:02:13 2022-08-22 22:12:26 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 192s - batch_cost: 0.1923 - reader cost: 7.4396e-04 2022-08-22 22:15:39 [INFO] [EVAL] #Images: 2000 mIoU: 0.3182 Acc: 0.7524 Kappa: 0.7333 Dice: 0.4437 2022-08-22 22:15:39 [INFO] [EVAL] Class IoU: [0.6711 0.7588 0.927 0.7104 0.6798 0.7438 0.7748 0.7494 0.5016 0.5852 0.4819 0.5362 0.6822 0.2969 0.2715 0.393 0.5282 0.4296 0.5687 0.395 0.7146 0.3867 0.6028 0.461 0.3391 0.3917 0.4097 0.4092 0.3731 0.1779 0.2068 0.4667 0.2577 0.3249 0.2928 0.342 0.4042 0.5723 0.2197 0.3463 0.17 0.1171 0.3432 0.2248 0.3097 0.2267 0.2951 0.4746 0.5744 0.4686 0.4944 0.2655 0.1425 0.2351 0.6791 0.4868 0.845 0.38 0.3589 0.2724 0.095 0.2047 0.2937 0.1633 0.3732 0.6238 0.2539 0.4145 0.1069 0.2751 0.4301 0.4498 0.3345 0.2899 0.4092 0.3167 0.5359 0.2354 0.2216 0.2776 0.5126 0.3486 0.3737 0.0167 0.0663 0.5238 0.0702 0.0827 0.1687 0.4294 0.3759 0.0328 0.1943 0.0503 0.0023 0.0143 0.0279 0.0406 0.0791 0.3826 0.0453 0.016 0.1901 0.0948 0.001 0.4068 0.21 0.5328 0.1355 0.0882 0.0768 0.4844 0.1131 0.5798 0.6864 0.0016 0.3506 0.548 0.1332 0.1007 0.3748 0.0074 0.2061 0.0522 0.5335 0.2415 0.3629 0.3995 0.0009 0.2743 0.6262 0.0005 0.2508 0.212 0.0719 0.1194 0.1056 0.0013 0.1094 0.3233 0.3806 0.0001 0.3518 0.1552 0.3077 0. 
0.3596 0.0167 0.0961 0.145 ] 2022-08-22 22:15:39 [INFO] [EVAL] Class Precision: [0.7715 0.8337 0.9602 0.8122 0.7648 0.8605 0.8624 0.8068 0.651 0.6849 0.7137 0.7132 0.7818 0.4924 0.4761 0.6196 0.696 0.7134 0.73 0.5938 0.8253 0.6631 0.7657 0.6115 0.5055 0.5727 0.5045 0.7312 0.6808 0.2913 0.3893 0.5892 0.5035 0.4083 0.4126 0.6346 0.5816 0.7173 0.375 0.5608 0.2593 0.2799 0.59 0.4871 0.4439 0.5012 0.4729 0.659 0.6925 0.5348 0.7901 0.3146 0.3144 0.7764 0.7202 0.6719 0.8846 0.6309 0.7868 0.5243 0.1372 0.4209 0.5132 0.6611 0.4494 0.7398 0.3523 0.5275 0.2129 0.4502 0.7241 0.6851 0.5411 0.421 0.6716 0.555 0.7952 0.6373 0.6687 0.4815 0.5706 0.5551 0.6597 0.0395 0.2664 0.7291 0.3003 0.3661 0.2087 0.6115 0.5043 0.042 0.2789 0.2819 0.0091 0.0343 0.1536 0.2469 0.4052 0.6278 0.5001 0.0262 0.7383 0.6968 0.0471 0.5454 0.4158 0.8128 0.2158 0.19 0.1733 0.6746 0.3217 0.8065 0.6925 0.0728 0.6343 0.6308 0.2863 0.675 0.507 0.1305 0.6528 0.5113 0.6803 0.6294 0.8656 0.5638 0.0211 0.4071 0.7226 0.0377 0.4339 0.6501 0.6024 0.2719 0.3741 0.0431 0.3769 0.7201 0.7315 0.0001 0.7717 0.4665 0.4873 0. 0.8652 0.5824 0.3939 0.723 ] 2022-08-22 22:15:39 [INFO] [EVAL] Class Recall: [0.8375 0.8941 0.964 0.85 0.8595 0.8457 0.8842 0.9132 0.686 0.8009 0.5973 0.6836 0.8427 0.4278 0.3872 0.5181 0.6866 0.5192 0.7202 0.5413 0.8419 0.4813 0.7391 0.652 0.5074 0.5534 0.6857 0.4817 0.4522 0.3135 0.3062 0.6918 0.3456 0.6141 0.502 0.4259 0.57 0.7389 0.3465 0.4752 0.3307 0.1677 0.4507 0.2945 0.5061 0.2928 0.4398 0.6291 0.7712 0.791 0.5692 0.6296 0.2067 0.2522 0.9224 0.6386 0.9497 0.4886 0.3976 0.3619 0.2362 0.2849 0.4072 0.1782 0.6877 0.7992 0.4763 0.6593 0.1769 0.4143 0.5144 0.567 0.4671 0.4821 0.5115 0.4244 0.6217 0.2718 0.249 0.3959 0.8344 0.4839 0.4629 0.0281 0.0811 0.6503 0.084 0.0965 0.4679 0.5905 0.5962 0.1295 0.3905 0.0577 0.003 0.024 0.0329 0.0463 0.0894 0.4949 0.0474 0.0392 0.2039 0.0989 0.001 0.6156 0.2978 0.6074 0.2671 0.1413 0.1212 0.6322 0.1486 0.6735 0.9874 0.0017 0.4394 0.8068 0.1993 0.1059 0.5897 0.0078 0.2315 0.055 0.7121 0.2816 0.3846 0.5781 0.001 0.4569 0.8244 0.0005 0.3727 0.2393 0.0755 0.1754 0.1283 0.0013 0.1335 0.3698 0.4424 0.0002 0.3927 0.1887 0.455 0. 0.3809 0.0169 0.1127 0.1535] 2022-08-22 22:15:39 [INFO] [EVAL] The model with the best validation mIoU (0.3218) was saved at iter 38000. 
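Note on reading the [EVAL] blocks above: the per-class IoU, Precision, and Recall arrays are mutually consistent under the standard definitions, and the headline mIoU is the unweighted mean of the 150-entry Class IoU array. The sketch below is an illustration of those standard identities only, not PaddleSeg's evaluation code; the class-0 precision/recall values are copied from the iter-33000 evaluation as an example.

    import numpy as np

    # Illustration only: class-0 precision/recall copied from the iter-33000 [EVAL] block above.
    precision, recall = 0.7713, 0.8380

    # With P = TP/(TP+FP) and R = TP/(TP+FN), per-class IoU follows from precision and recall
    # via 1/IoU = 1/P + 1/R - 1:
    iou = precision * recall / (precision + recall - precision * recall)
    print(round(iou, 4))      # 0.6712, matching the first entry of the Class IoU array

    # Per-class Dice (F1) is a monotone function of IoU: Dice = 2*IoU / (1 + IoU).
    dice = 2 * iou / (1 + iou)
    print(round(dice, 4))

    # The headline mIoU of each [EVAL] summary is the unweighted mean of the full
    # 150-entry Class IoU array (zero-IoU classes included); truncated here for brevity:
    class_iou = np.array([0.6712, 0.7699, 0.9258])
    print(class_iou.mean())

The same identity, IoU = P·R / (P + R − P·R), reproduces any entry in these arrays, which is a quick way to sanity-check a partially mangled eval printout.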
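The [TRAIN] entries admit a similar sanity check: the logged ips and ETA values are consistent with simple arithmetic on batch_cost, assuming the configured batch_size of 8. The relationships below (ips = batch_size / batch_cost, ETA = remaining iterations × batch_cost) are inferred from the logged numbers rather than taken from PaddleSeg's source, and agree only up to rounding.

    from datetime import timedelta

    # Values copied from the iter-32050 [TRAIN] entry; batch_size is the configured value of 8.
    batch_size = 8
    batch_cost = 0.1937                        # seconds per iteration
    cur_iter, total_iters = 32050, 160000

    ips = batch_size / batch_cost              # ~41.30 samples/sec, as logged
    remaining = (total_iters - cur_iter) * batch_cost
    eta = timedelta(seconds=round(remaining))  # ~6:53:04 vs. the logged ETA 06:53:02

    print(f"ips ~ {ips:.2f} samples/sec, ETA ~ {eta}")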
2022-08-22 22:15:48 [INFO] [TRAIN] epoch: 32, iter: 40050/160000, loss: 0.7085, lr: 0.000908, batch_cost: 0.1831, reader_cost: 0.00444, ips: 43.6985 samples/sec | ETA 06:05:59 2022-08-22 22:15:57 [INFO] [TRAIN] epoch: 32, iter: 40100/160000, loss: 0.7478, lr: 0.000908, batch_cost: 0.1673, reader_cost: 0.00111, ips: 47.8118 samples/sec | ETA 05:34:21 2022-08-22 22:16:08 [INFO] [TRAIN] epoch: 32, iter: 40150/160000, loss: 0.7143, lr: 0.000907, batch_cost: 0.2210, reader_cost: 0.00117, ips: 36.1910 samples/sec | ETA 07:21:32 2022-08-22 22:16:20 [INFO] [TRAIN] epoch: 32, iter: 40200/160000, loss: 0.7176, lr: 0.000907, batch_cost: 0.2562, reader_cost: 0.00648, ips: 31.2215 samples/sec | ETA 08:31:36 2022-08-22 22:16:35 [INFO] [TRAIN] epoch: 32, iter: 40250/160000, loss: 0.6774, lr: 0.000907, batch_cost: 0.2901, reader_cost: 0.00063, ips: 27.5813 samples/sec | ETA 09:38:53 2022-08-22 22:16:49 [INFO] [TRAIN] epoch: 32, iter: 40300/160000, loss: 0.7072, lr: 0.000906, batch_cost: 0.2785, reader_cost: 0.00128, ips: 28.7273 samples/sec | ETA 09:15:34 2022-08-22 22:17:04 [INFO] [TRAIN] epoch: 32, iter: 40350/160000, loss: 0.7164, lr: 0.000906, batch_cost: 0.3055, reader_cost: 0.00225, ips: 26.1852 samples/sec | ETA 10:09:14 2022-08-22 22:17:19 [INFO] [TRAIN] epoch: 32, iter: 40400/160000, loss: 0.7945, lr: 0.000905, batch_cost: 0.2908, reader_cost: 0.00141, ips: 27.5096 samples/sec | ETA 09:39:40 2022-08-22 22:17:36 [INFO] [TRAIN] epoch: 33, iter: 40450/160000, loss: 0.6809, lr: 0.000905, batch_cost: 0.3401, reader_cost: 0.05654, ips: 23.5238 samples/sec | ETA 11:17:36 2022-08-22 22:17:50 [INFO] [TRAIN] epoch: 33, iter: 40500/160000, loss: 0.6691, lr: 0.000905, batch_cost: 0.2812, reader_cost: 0.00048, ips: 28.4499 samples/sec | ETA 09:20:02 2022-08-22 22:18:03 [INFO] [TRAIN] epoch: 33, iter: 40550/160000, loss: 0.7281, lr: 0.000904, batch_cost: 0.2576, reader_cost: 0.00350, ips: 31.0606 samples/sec | ETA 08:32:45 2022-08-22 22:18:16 [INFO] [TRAIN] epoch: 33, iter: 40600/160000, loss: 0.7279, lr: 0.000904, batch_cost: 0.2699, reader_cost: 0.00055, ips: 29.6415 samples/sec | ETA 08:57:05 2022-08-22 22:18:30 [INFO] [TRAIN] epoch: 33, iter: 40650/160000, loss: 0.7599, lr: 0.000904, batch_cost: 0.2875, reader_cost: 0.00643, ips: 27.8271 samples/sec | ETA 09:31:51 2022-08-22 22:18:44 [INFO] [TRAIN] epoch: 33, iter: 40700/160000, loss: 0.7530, lr: 0.000903, batch_cost: 0.2786, reader_cost: 0.00079, ips: 28.7165 samples/sec | ETA 09:13:55 2022-08-22 22:18:57 [INFO] [TRAIN] epoch: 33, iter: 40750/160000, loss: 0.6521, lr: 0.000903, batch_cost: 0.2510, reader_cost: 0.00223, ips: 31.8675 samples/sec | ETA 08:18:56 2022-08-22 22:19:11 [INFO] [TRAIN] epoch: 33, iter: 40800/160000, loss: 0.7312, lr: 0.000902, batch_cost: 0.2724, reader_cost: 0.01964, ips: 29.3711 samples/sec | ETA 09:01:07 2022-08-22 22:19:24 [INFO] [TRAIN] epoch: 33, iter: 40850/160000, loss: 0.7374, lr: 0.000902, batch_cost: 0.2700, reader_cost: 0.01779, ips: 29.6325 samples/sec | ETA 08:56:07 2022-08-22 22:19:37 [INFO] [TRAIN] epoch: 33, iter: 40900/160000, loss: 0.6836, lr: 0.000902, batch_cost: 0.2620, reader_cost: 0.00161, ips: 30.5391 samples/sec | ETA 08:39:59 2022-08-22 22:19:51 [INFO] [TRAIN] epoch: 33, iter: 40950/160000, loss: 0.7049, lr: 0.000901, batch_cost: 0.2688, reader_cost: 0.01275, ips: 29.7605 samples/sec | ETA 08:53:22 2022-08-22 22:20:03 [INFO] [TRAIN] epoch: 33, iter: 41000/160000, loss: 0.7528, lr: 0.000901, batch_cost: 0.2505, reader_cost: 0.00090, ips: 31.9360 samples/sec | ETA 08:16:49 2022-08-22 22:20:03 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 169s - batch_cost: 0.1685 - reader cost: 9.7932e-04 2022-08-22 22:22:52 [INFO] [EVAL] #Images: 2000 mIoU: 0.3159 Acc: 0.7487 Kappa: 0.7299 Dice: 0.4419 2022-08-22 22:22:52 [INFO] [EVAL] Class IoU: [0.6713 0.741 0.9277 0.707 0.6683 0.7372 0.7624 0.7641 0.4974 0.6272 0.4683 0.5228 0.6826 0.2987 0.2524 0.4149 0.4706 0.364 0.5905 0.368 0.7194 0.5206 0.6069 0.4708 0.3403 0.2957 0.4862 0.4438 0.369 0.2824 0.2361 0.4368 0.2983 0.3518 0.1892 0.3587 0.4087 0.5004 0.2348 0.344 0.2029 0.1171 0.3391 0.2342 0.2783 0.2517 0.2887 0.443 0.4997 0.5289 0.5165 0.3144 0.1365 0.2092 0.6497 0.4783 0.8414 0.3609 0.4705 0.2805 0.0906 0.2023 0.2594 0.2231 0.4207 0.6022 0.1781 0.4005 0.0379 0.2801 0.4068 0.4273 0.3327 0.2396 0.4218 0.333 0.5013 0.3138 0.3148 0.2723 0.5756 0.3801 0.2525 0.0297 0.1685 0.5118 0.0566 0.0351 0.2201 0.4456 0.4022 0.0138 0.1477 0.0795 0.0013 0.0151 0.0219 0.162 0.074 0.369 0.0419 0.0133 0.0986 0.3086 0.0014 0.3733 0.1173 0.4473 0.1223 0.1544 0.0831 0.1901 0.1035 0.5819 0.656 0.0092 0.3665 0.6164 0.0468 0.36 0.3514 0.0014 0.195 0.0485 0.4128 0.276 0.464 0.4197 0.0366 0.1975 0.4981 0.0124 0.2789 0.1981 0.0927 0.1023 0.093 0.0178 0.1356 0.3467 0.2021 0.0343 0.3295 0.0044 0.269 0. 0.3342 0.0255 0.0921 0.1339] 2022-08-22 22:22:52 [INFO] [EVAL] Class Precision: [0.7959 0.813 0.9679 0.8018 0.7554 0.873 0.8684 0.8315 0.6004 0.8201 0.6615 0.6688 0.7587 0.5127 0.4763 0.5411 0.5488 0.6974 0.7695 0.6241 0.8039 0.6979 0.7359 0.6049 0.5038 0.5145 0.6808 0.6535 0.6641 0.3889 0.3905 0.6023 0.4962 0.5059 0.3745 0.455 0.6097 0.8109 0.4381 0.574 0.3268 0.2859 0.6382 0.5364 0.546 0.4459 0.5221 0.6311 0.6549 0.6205 0.7783 0.4032 0.3208 0.4374 0.6621 0.6844 0.8886 0.6364 0.7577 0.4321 0.1326 0.2933 0.4784 0.7133 0.4979 0.6867 0.2202 0.5523 0.5037 0.4439 0.7098 0.4814 0.4452 0.2785 0.6404 0.4903 0.7339 0.6177 0.488 0.4539 0.6533 0.6015 0.7899 0.0642 0.1889 0.7631 0.2962 0.2851 0.3258 0.7422 0.6053 0.0384 0.2517 0.2996 0.0071 0.057 0.1483 0.5072 0.5368 0.5587 0.7831 0.018 0.707 0.5731 0.0255 0.4732 0.2416 0.6482 0.26 0.2972 0.2617 0.8777 0.3427 0.681 0.6589 0.0749 0.798 0.6483 0.2863 0.4675 0.7583 0.0479 0.5342 0.7111 0.8548 0.6192 0.8581 0.6309 0.1511 0.4213 0.6648 0.2639 0.5172 0.7033 0.4734 0.4302 0.3804 0.175 0.4983 0.5717 0.2166 0.0431 0.639 0.1541 0.5106 0. 0.8444 0.3192 0.6141 0.7219] 2022-08-22 22:22:52 [INFO] [EVAL] Class Recall: [0.8109 0.8931 0.9571 0.8567 0.8529 0.8257 0.862 0.9041 0.7435 0.7273 0.6159 0.7055 0.8719 0.4172 0.3494 0.6401 0.7676 0.4323 0.7174 0.4729 0.8726 0.6721 0.7759 0.6799 0.5118 0.4101 0.6299 0.5803 0.4537 0.5078 0.3738 0.6139 0.4278 0.5359 0.2766 0.6289 0.5536 0.5665 0.3359 0.4619 0.3485 0.1656 0.4199 0.2936 0.3621 0.3663 0.3924 0.5979 0.6784 0.7817 0.6056 0.588 0.1921 0.2863 0.9721 0.6137 0.9407 0.4547 0.5538 0.4444 0.2223 0.3946 0.3617 0.2451 0.7305 0.8305 0.4827 0.5931 0.0394 0.4316 0.4879 0.7919 0.5684 0.6317 0.5528 0.5093 0.6127 0.3895 0.4702 0.405 0.8287 0.508 0.2707 0.0525 0.6087 0.6084 0.0654 0.0384 0.4042 0.5273 0.5452 0.021 0.2633 0.0977 0.0016 0.0202 0.0251 0.1922 0.079 0.5208 0.0424 0.0494 0.1027 0.4007 0.0015 0.6389 0.1856 0.5906 0.1876 0.2432 0.1086 0.1952 0.1291 0.7999 0.9935 0.0104 0.404 0.9262 0.0529 0.6101 0.3958 0.0014 0.235 0.0495 0.444 0.3325 0.5026 0.5563 0.0461 0.271 0.665 0.0128 0.377 0.2162 0.1033 0.1183 0.1097 0.0195 0.1571 0.4684 0.7514 0.1449 0.4048 0.0045 0.3625 0. 
0.3561 0.027 0.0978 0.1412] 2022-08-22 22:22:52 [INFO] [EVAL] The model with the best validation mIoU (0.3218) was saved at iter 38000. 2022-08-22 22:23:01 [INFO] [TRAIN] epoch: 33, iter: 41050/160000, loss: 0.7248, lr: 0.000901, batch_cost: 0.1777, reader_cost: 0.00282, ips: 45.0138 samples/sec | ETA 05:52:20 2022-08-22 22:23:11 [INFO] [TRAIN] epoch: 33, iter: 41100/160000, loss: 0.7414, lr: 0.000900, batch_cost: 0.2060, reader_cost: 0.00280, ips: 38.8297 samples/sec | ETA 06:48:16 2022-08-22 22:23:22 [INFO] [TRAIN] epoch: 33, iter: 41150/160000, loss: 0.7019, lr: 0.000900, batch_cost: 0.2067, reader_cost: 0.00041, ips: 38.7100 samples/sec | ETA 06:49:22 2022-08-22 22:23:30 [INFO] [TRAIN] epoch: 33, iter: 41200/160000, loss: 0.7467, lr: 0.000899, batch_cost: 0.1753, reader_cost: 0.00064, ips: 45.6411 samples/sec | ETA 05:47:03 2022-08-22 22:23:39 [INFO] [TRAIN] epoch: 33, iter: 41250/160000, loss: 0.7073, lr: 0.000899, batch_cost: 0.1635, reader_cost: 0.00066, ips: 48.9339 samples/sec | ETA 05:23:33 2022-08-22 22:23:49 [INFO] [TRAIN] epoch: 33, iter: 41300/160000, loss: 0.6931, lr: 0.000899, batch_cost: 0.2120, reader_cost: 0.00064, ips: 37.7390 samples/sec | ETA 06:59:22 2022-08-22 22:24:03 [INFO] [TRAIN] epoch: 33, iter: 41350/160000, loss: 0.7056, lr: 0.000898, batch_cost: 0.2837, reader_cost: 0.00291, ips: 28.1944 samples/sec | ETA 09:21:06 2022-08-22 22:24:16 [INFO] [TRAIN] epoch: 33, iter: 41400/160000, loss: 0.6899, lr: 0.000898, batch_cost: 0.2478, reader_cost: 0.00140, ips: 32.2863 samples/sec | ETA 08:09:47 2022-08-22 22:24:28 [INFO] [TRAIN] epoch: 33, iter: 41450/160000, loss: 0.7150, lr: 0.000898, batch_cost: 0.2502, reader_cost: 0.00430, ips: 31.9708 samples/sec | ETA 08:14:24 2022-08-22 22:24:42 [INFO] [TRAIN] epoch: 33, iter: 41500/160000, loss: 0.7239, lr: 0.000897, batch_cost: 0.2813, reader_cost: 0.01649, ips: 28.4377 samples/sec | ETA 09:15:36 2022-08-22 22:24:57 [INFO] [TRAIN] epoch: 33, iter: 41550/160000, loss: 0.7642, lr: 0.000897, batch_cost: 0.2880, reader_cost: 0.00136, ips: 27.7759 samples/sec | ETA 09:28:35 2022-08-22 22:25:10 [INFO] [TRAIN] epoch: 33, iter: 41600/160000, loss: 0.8071, lr: 0.000896, batch_cost: 0.2679, reader_cost: 0.00320, ips: 29.8610 samples/sec | ETA 08:48:40 2022-08-22 22:25:23 [INFO] [TRAIN] epoch: 33, iter: 41650/160000, loss: 0.7704, lr: 0.000896, batch_cost: 0.2544, reader_cost: 0.00035, ips: 31.4523 samples/sec | ETA 08:21:42 2022-08-22 22:25:42 [INFO] [TRAIN] epoch: 34, iter: 41700/160000, loss: 0.6866, lr: 0.000896, batch_cost: 0.3858, reader_cost: 0.13310, ips: 20.7359 samples/sec | ETA 12:40:40 2022-08-22 22:25:57 [INFO] [TRAIN] epoch: 34, iter: 41750/160000, loss: 0.6880, lr: 0.000895, batch_cost: 0.2928, reader_cost: 0.01728, ips: 27.3202 samples/sec | ETA 09:37:06 2022-08-22 22:26:10 [INFO] [TRAIN] epoch: 34, iter: 41800/160000, loss: 0.7187, lr: 0.000895, batch_cost: 0.2720, reader_cost: 0.00485, ips: 29.4091 samples/sec | ETA 08:55:53 2022-08-22 22:26:24 [INFO] [TRAIN] epoch: 34, iter: 41850/160000, loss: 0.7350, lr: 0.000895, batch_cost: 0.2659, reader_cost: 0.02192, ips: 30.0821 samples/sec | ETA 08:43:40 2022-08-22 22:26:37 [INFO] [TRAIN] epoch: 34, iter: 41900/160000, loss: 0.6692, lr: 0.000894, batch_cost: 0.2606, reader_cost: 0.00548, ips: 30.7005 samples/sec | ETA 08:32:54 2022-08-22 22:26:51 [INFO] [TRAIN] epoch: 34, iter: 41950/160000, loss: 0.7583, lr: 0.000894, batch_cost: 0.2851, reader_cost: 0.00178, ips: 28.0580 samples/sec | ETA 09:20:58 2022-08-22 22:27:05 [INFO] [TRAIN] epoch: 34, iter: 42000/160000, loss: 
0.7154, lr: 0.000893, batch_cost: 0.2867, reader_cost: 0.00550, ips: 27.9071 samples/sec | ETA 09:23:46 2022-08-22 22:27:05 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 204s - batch_cost: 0.2036 - reader cost: 7.1981e-04 2022-08-22 22:30:29 [INFO] [EVAL] #Images: 2000 mIoU: 0.3298 Acc: 0.7535 Kappa: 0.7343 Dice: 0.4586 2022-08-22 22:30:29 [INFO] [EVAL] Class IoU: [0.6698 0.7623 0.9281 0.699 0.6689 0.7488 0.7505 0.7691 0.4985 0.6711 0.469 0.5357 0.6891 0.2732 0.1578 0.4049 0.4939 0.4254 0.5759 0.3724 0.7382 0.4271 0.6228 0.4617 0.3616 0.333 0.4115 0.4286 0.3644 0.2961 0.2354 0.4352 0.2583 0.3408 0.3072 0.3652 0.404 0.5234 0.2705 0.3558 0.1631 0.0917 0.3252 0.248 0.3249 0.2566 0.3196 0.4711 0.6038 0.5064 0.5146 0.3971 0.1889 0.237 0.6869 0.515 0.8505 0.2984 0.3946 0.3491 0.0609 0.2078 0.2934 0.1129 0.41 0.6739 0.2641 0.4144 0.151 0.2902 0.4144 0.4592 0.3454 0.2707 0.4154 0.3174 0.5378 0.276 0.1769 0.1862 0.584 0.3913 0.2739 0.055 0.1472 0.5395 0.0763 0.0528 0.2396 0.4677 0.3587 0.0249 0.2151 0.1029 0.001 0.026 0.0447 0.1036 0.0536 0.4174 0.038 0.0202 0.2548 0.4889 0.0087 0.4085 0.1192 0.4611 0.1007 0.104 0.0498 0.4599 0.1177 0.6186 0.6306 0.0017 0.3035 0.664 0.1639 0.3415 0.4825 0.0023 0.1879 0.1612 0.4075 0.3142 0.4594 0.4008 0.1854 0.2518 0.5361 0.0012 0.2394 0.2528 0.124 0.1164 0.1105 0.0062 0.1552 0.3557 0.1897 0. 0.4058 0.2039 0.2996 0. 0.343 0.0215 0.0775 0.1193] 2022-08-22 22:30:29 [INFO] [EVAL] Class Precision: [0.7629 0.843 0.9649 0.7811 0.746 0.8654 0.883 0.8559 0.6228 0.7557 0.7619 0.6762 0.7735 0.5164 0.529 0.6246 0.6342 0.6896 0.7419 0.6063 0.8384 0.6799 0.7967 0.655 0.4657 0.4944 0.5443 0.7416 0.5799 0.4532 0.3839 0.5969 0.4916 0.4396 0.4714 0.4788 0.6009 0.7568 0.5357 0.4439 0.2673 0.3002 0.4958 0.4885 0.4509 0.4506 0.5458 0.6927 0.6507 0.6576 0.7734 0.6551 0.347 0.6158 0.7109 0.764 0.903 0.7133 0.4798 0.5718 0.0916 0.3631 0.446 0.6991 0.5102 0.824 0.404 0.6107 0.439 0.4757 0.6177 0.6339 0.5454 0.321 0.7514 0.4575 0.6804 0.6444 0.3908 0.3616 0.6656 0.714 0.782 0.1516 0.4274 0.7614 0.2885 0.4115 0.4011 0.7185 0.4837 0.028 0.3826 0.3081 0.0037 0.0581 0.4471 0.2833 0.573 0.6017 0.5044 0.0303 0.6433 0.6411 0.0926 0.5379 0.2558 0.7785 0.2135 0.2379 0.1839 0.7622 0.4204 0.6655 0.6323 0.0397 0.6788 0.6964 0.2646 0.6407 0.6668 0.4125 0.4808 0.721 0.813 0.6154 0.8542 0.5368 0.2708 0.4275 0.6021 0.0723 0.4976 0.4975 0.5185 0.2295 0.3668 0.1165 0.4376 0.6013 0.2123 0. 0.5732 0.507 0.7419 0. 0.7907 0.5054 0.6745 0.7871] 2022-08-22 22:30:29 [INFO] [EVAL] Class Recall: [0.8459 0.8885 0.9605 0.8693 0.8662 0.8474 0.8333 0.8835 0.7142 0.857 0.5495 0.7205 0.8632 0.3672 0.1836 0.535 0.6908 0.5261 0.7202 0.4912 0.8606 0.5347 0.7405 0.6101 0.6181 0.5049 0.6279 0.5038 0.4952 0.4607 0.3784 0.6164 0.3523 0.6025 0.4685 0.606 0.5522 0.6292 0.3534 0.6419 0.2951 0.1166 0.4859 0.335 0.5377 0.3735 0.4353 0.5956 0.8934 0.6877 0.6059 0.5021 0.2931 0.2782 0.953 0.6125 0.936 0.3391 0.6896 0.4727 0.1539 0.3268 0.4617 0.1186 0.6762 0.7872 0.4326 0.5631 0.187 0.4266 0.5574 0.6251 0.4851 0.6337 0.4816 0.509 0.7195 0.3256 0.2443 0.2774 0.8265 0.464 0.2965 0.0795 0.1834 0.6493 0.0939 0.0571 0.373 0.5727 0.5813 0.1875 0.3294 0.1337 0.0015 0.0448 0.0473 0.1404 0.0558 0.5768 0.0394 0.0575 0.2968 0.6731 0.0095 0.6294 0.1825 0.5307 0.1602 0.1559 0.0639 0.537 0.1405 0.8978 0.9958 0.0018 0.3544 0.9345 0.3012 0.4224 0.6359 0.0023 0.2358 0.1719 0.4497 0.391 0.4985 0.6126 0.3703 0.3799 0.8303 0.0012 0.3158 0.3394 0.1401 0.191 0.1365 0.0065 0.1939 0.4654 0.6404 0. 0.5815 0.2543 0.3344 0. 
0.3772 0.022 0.0805 0.1232] 2022-08-22 22:30:29 [INFO] [EVAL] The model with the best validation mIoU (0.3298) was saved at iter 42000. 2022-08-22 22:30:41 [INFO] [TRAIN] epoch: 34, iter: 42050/160000, loss: 0.7245, lr: 0.000893, batch_cost: 0.2323, reader_cost: 0.00469, ips: 34.4428 samples/sec | ETA 07:36:36 2022-08-22 22:30:52 [INFO] [TRAIN] epoch: 34, iter: 42100/160000, loss: 0.6747, lr: 0.000893, batch_cost: 0.2249, reader_cost: 0.00143, ips: 35.5667 samples/sec | ETA 07:21:59 2022-08-22 22:31:05 [INFO] [TRAIN] epoch: 34, iter: 42150/160000, loss: 0.6804, lr: 0.000892, batch_cost: 0.2606, reader_cost: 0.01140, ips: 30.6991 samples/sec | ETA 08:31:51 2022-08-22 22:31:19 [INFO] [TRAIN] epoch: 34, iter: 42200/160000, loss: 0.7058, lr: 0.000892, batch_cost: 0.2670, reader_cost: 0.00484, ips: 29.9647 samples/sec | ETA 08:44:10 2022-08-22 22:31:33 [INFO] [TRAIN] epoch: 34, iter: 42250/160000, loss: 0.7116, lr: 0.000891, batch_cost: 0.2870, reader_cost: 0.01123, ips: 27.8728 samples/sec | ETA 09:23:16 2022-08-22 22:31:48 [INFO] [TRAIN] epoch: 34, iter: 42300/160000, loss: 0.7510, lr: 0.000891, batch_cost: 0.2984, reader_cost: 0.00065, ips: 26.8112 samples/sec | ETA 09:45:19 2022-08-22 22:32:02 [INFO] [TRAIN] epoch: 34, iter: 42350/160000, loss: 0.7320, lr: 0.000891, batch_cost: 0.2746, reader_cost: 0.00558, ips: 29.1370 samples/sec | ETA 08:58:22 2022-08-22 22:32:15 [INFO] [TRAIN] epoch: 34, iter: 42400/160000, loss: 0.7381, lr: 0.000890, batch_cost: 0.2684, reader_cost: 0.01516, ips: 29.8059 samples/sec | ETA 08:46:04 2022-08-22 22:32:29 [INFO] [TRAIN] epoch: 34, iter: 42450/160000, loss: 0.7718, lr: 0.000890, batch_cost: 0.2807, reader_cost: 0.00058, ips: 28.5020 samples/sec | ETA 09:09:54 2022-08-22 22:32:43 [INFO] [TRAIN] epoch: 34, iter: 42500/160000, loss: 0.7310, lr: 0.000890, batch_cost: 0.2890, reader_cost: 0.01169, ips: 27.6826 samples/sec | ETA 09:25:56 2022-08-22 22:32:57 [INFO] [TRAIN] epoch: 34, iter: 42550/160000, loss: 0.7738, lr: 0.000889, batch_cost: 0.2778, reader_cost: 0.02034, ips: 28.7997 samples/sec | ETA 09:03:45 2022-08-22 22:33:10 [INFO] [TRAIN] epoch: 34, iter: 42600/160000, loss: 0.7175, lr: 0.000889, batch_cost: 0.2537, reader_cost: 0.01372, ips: 31.5349 samples/sec | ETA 08:16:22 2022-08-22 22:33:23 [INFO] [TRAIN] epoch: 34, iter: 42650/160000, loss: 0.7777, lr: 0.000888, batch_cost: 0.2553, reader_cost: 0.00119, ips: 31.3381 samples/sec | ETA 08:19:17 2022-08-22 22:33:37 [INFO] [TRAIN] epoch: 34, iter: 42700/160000, loss: 0.7826, lr: 0.000888, batch_cost: 0.2862, reader_cost: 0.00085, ips: 27.9571 samples/sec | ETA 09:19:25 2022-08-22 22:33:51 [INFO] [TRAIN] epoch: 34, iter: 42750/160000, loss: 0.6821, lr: 0.000888, batch_cost: 0.2691, reader_cost: 0.00364, ips: 29.7310 samples/sec | ETA 08:45:49 2022-08-22 22:34:06 [INFO] [TRAIN] epoch: 34, iter: 42800/160000, loss: 0.7420, lr: 0.000887, batch_cost: 0.2997, reader_cost: 0.00057, ips: 26.6895 samples/sec | ETA 09:45:29 2022-08-22 22:34:19 [INFO] [TRAIN] epoch: 34, iter: 42850/160000, loss: 0.6965, lr: 0.000887, batch_cost: 0.2767, reader_cost: 0.00146, ips: 28.9105 samples/sec | ETA 09:00:17 2022-08-22 22:34:34 [INFO] [TRAIN] epoch: 34, iter: 42900/160000, loss: 0.7155, lr: 0.000887, batch_cost: 0.2834, reader_cost: 0.02674, ips: 28.2247 samples/sec | ETA 09:13:10 2022-08-22 22:34:51 [INFO] [TRAIN] epoch: 35, iter: 42950/160000, loss: 0.7083, lr: 0.000886, batch_cost: 0.3450, reader_cost: 0.05926, ips: 23.1870 samples/sec | ETA 11:13:04 2022-08-22 22:35:04 [INFO] [TRAIN] epoch: 35, iter: 43000/160000, loss: 
0.6462, lr: 0.000886, batch_cost: 0.2563, reader_cost: 0.00067, ips: 31.2096 samples/sec | ETA 08:19:50 2022-08-22 22:35:04 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 202s - batch_cost: 0.2018 - reader cost: 0.0011 2022-08-22 22:38:26 [INFO] [EVAL] #Images: 2000 mIoU: 0.3265 Acc: 0.7534 Kappa: 0.7351 Dice: 0.4568 2022-08-22 22:38:26 [INFO] [EVAL] Class IoU: [0.6698 0.7599 0.9264 0.7112 0.6836 0.7488 0.7694 0.7604 0.5029 0.6418 0.4716 0.5311 0.6676 0.3063 0.249 0.4082 0.5266 0.4533 0.5696 0.3962 0.7201 0.436 0.6192 0.4819 0.3403 0.3564 0.439 0.4387 0.3881 0.2772 0.2408 0.4136 0.2697 0.3536 0.3228 0.3511 0.4023 0.5169 0.2509 0.3217 0.1523 0.0993 0.3331 0.2396 0.3068 0.2555 0.2478 0.4885 0.559 0.4764 0.4592 0.2525 0.1372 0.2366 0.6533 0.4063 0.8245 0.3654 0.4539 0.2758 0.1002 0.2239 0.3182 0.1712 0.4078 0.6452 0.2764 0.3732 0.055 0.2886 0.4225 0.4896 0.3608 0.2251 0.4164 0.3314 0.5183 0.2964 0.2548 0.1262 0.5718 0.3544 0.2676 0.0403 0.2582 0.537 0.0738 0.0549 0.2622 0.4787 0.3204 0.0567 0.1836 0.0817 0.0456 0.032 0.1081 0.1117 0.2841 0.3441 0.0578 0.046 0.2364 0.1259 0.1131 0.4211 0.0628 0.498 0.1258 0.0527 0.1078 0.3747 0.152 0.5904 0.6146 0. 0.3892 0.5939 0.094 0.3306 0.4385 0.0022 0.2108 0.1843 0.4448 0.2984 0.4304 0.4394 0.0001 0.3034 0.4789 0. 0.2227 0.2406 0.1358 0.1168 0.1184 0.0108 0.1707 0.3541 0.3341 0.0323 0.3656 0.0316 0.2925 0. 0.3463 0.0138 0.0965 0.1861] 2022-08-22 22:38:26 [INFO] [EVAL] Class Precision: [0.7888 0.8592 0.9632 0.812 0.7837 0.8418 0.8845 0.8264 0.6551 0.762 0.6767 0.6819 0.7471 0.4766 0.5294 0.5608 0.709 0.6874 0.669 0.5651 0.8019 0.5951 0.7699 0.6297 0.4827 0.4708 0.5149 0.7401 0.6388 0.4081 0.4543 0.4885 0.5044 0.4504 0.503 0.4841 0.6164 0.7758 0.4036 0.6818 0.2607 0.3033 0.5379 0.5604 0.5267 0.4958 0.4089 0.6449 0.6627 0.5688 0.5352 0.2921 0.2934 0.6181 0.6723 0.5104 0.854 0.6451 0.5543 0.3955 0.1958 0.4714 0.4188 0.8044 0.4998 0.7325 0.4098 0.6692 0.3318 0.5579 0.635 0.6052 0.6334 0.3555 0.5784 0.4889 0.6823 0.5827 0.6453 0.4117 0.6618 0.7258 0.7911 0.102 0.3685 0.7151 0.3039 0.4305 0.3578 0.7019 0.4003 0.1284 0.2871 0.2796 0.0647 0.0675 0.2495 0.3368 0.555 0.639 0.642 0.0621 0.5651 0.2835 0.7429 0.5647 0.2171 0.7494 0.2043 0.1323 0.2071 0.451 0.5297 0.7036 0.6163 0. 0.6933 0.7297 0.1886 0.6382 0.6411 0.1774 0.6352 0.6043 0.6886 0.643 0.7418 0.672 0.145 0.5635 0.5932 0.0096 0.4195 0.5904 0.4421 0.3329 0.3472 0.0435 0.5357 0.6129 0.4105 0.04 0.7237 0.3823 0.621 0. 0.7768 0.5634 0.5668 0.6523] 2022-08-22 22:38:26 [INFO] [EVAL] Class Recall: [0.8162 0.868 0.9604 0.8514 0.8426 0.8714 0.8554 0.9049 0.6839 0.8027 0.6087 0.7061 0.8625 0.4617 0.3198 0.6 0.6717 0.571 0.7931 0.5699 0.8759 0.6198 0.7598 0.6725 0.5356 0.5946 0.7486 0.5186 0.4972 0.4636 0.3388 0.7296 0.3668 0.622 0.4738 0.5611 0.5367 0.6077 0.3987 0.3785 0.2682 0.1287 0.4667 0.2951 0.4237 0.3453 0.3861 0.6682 0.7812 0.7458 0.7638 0.6502 0.205 0.2771 0.9585 0.6659 0.9598 0.4573 0.7148 0.4767 0.1704 0.2989 0.5697 0.1787 0.6891 0.844 0.4592 0.4577 0.0618 0.3742 0.558 0.7194 0.4561 0.3804 0.5978 0.5071 0.6831 0.3762 0.2963 0.1539 0.8078 0.4091 0.2879 0.0626 0.4633 0.6832 0.0888 0.0592 0.4952 0.6009 0.6163 0.0921 0.3375 0.1035 0.1337 0.0575 0.1601 0.1432 0.3678 0.4271 0.0597 0.1514 0.2889 0.1846 0.1177 0.6235 0.0813 0.5975 0.2468 0.0806 0.1836 0.689 0.1758 0.7859 0.9956 0. 0.4701 0.7613 0.1578 0.4068 0.5812 0.0022 0.2398 0.2096 0.5568 0.3576 0.5063 0.5594 0.0001 0.3966 0.713 0. 0.3218 0.2887 0.1638 0.1525 0.1523 0.0141 0.2004 0.456 0.6421 0.145 0.425 0.0333 0.3561 0. 
0.3846 0.0139 0.1042 0.2066] 2022-08-22 22:38:26 [INFO] [EVAL] The model with the best validation mIoU (0.3298) was saved at iter 42000. 2022-08-22 22:38:38 [INFO] [TRAIN] epoch: 35, iter: 43050/160000, loss: 0.7325, lr: 0.000885, batch_cost: 0.2352, reader_cost: 0.00481, ips: 34.0120 samples/sec | ETA 07:38:27 2022-08-22 22:38:49 [INFO] [TRAIN] epoch: 35, iter: 43100/160000, loss: 0.7131, lr: 0.000885, batch_cost: 0.2345, reader_cost: 0.00241, ips: 34.1152 samples/sec | ETA 07:36:52 2022-08-22 22:39:04 [INFO] [TRAIN] epoch: 35, iter: 43150/160000, loss: 0.7432, lr: 0.000885, batch_cost: 0.2973, reader_cost: 0.00082, ips: 26.9060 samples/sec | ETA 09:39:03 2022-08-22 22:39:19 [INFO] [TRAIN] epoch: 35, iter: 43200/160000, loss: 0.7509, lr: 0.000884, batch_cost: 0.2875, reader_cost: 0.00132, ips: 27.8230 samples/sec | ETA 09:19:43 2022-08-22 22:39:33 [INFO] [TRAIN] epoch: 35, iter: 43250/160000, loss: 0.6935, lr: 0.000884, batch_cost: 0.2832, reader_cost: 0.00104, ips: 28.2439 samples/sec | ETA 09:11:09 2022-08-22 22:39:46 [INFO] [TRAIN] epoch: 35, iter: 43300/160000, loss: 0.7410, lr: 0.000884, batch_cost: 0.2571, reader_cost: 0.00143, ips: 31.1103 samples/sec | ETA 08:20:09 2022-08-22 22:39:59 [INFO] [TRAIN] epoch: 35, iter: 43350/160000, loss: 0.7079, lr: 0.000883, batch_cost: 0.2586, reader_cost: 0.01239, ips: 30.9385 samples/sec | ETA 08:22:43 2022-08-22 22:40:13 [INFO] [TRAIN] epoch: 35, iter: 43400/160000, loss: 0.7011, lr: 0.000883, batch_cost: 0.2942, reader_cost: 0.00054, ips: 27.1927 samples/sec | ETA 09:31:43 2022-08-22 22:40:26 [INFO] [TRAIN] epoch: 35, iter: 43450/160000, loss: 0.6824, lr: 0.000882, batch_cost: 0.2653, reader_cost: 0.01407, ips: 30.1522 samples/sec | ETA 08:35:23 2022-08-22 22:40:40 [INFO] [TRAIN] epoch: 35, iter: 43500/160000, loss: 0.7105, lr: 0.000882, batch_cost: 0.2640, reader_cost: 0.00266, ips: 30.3054 samples/sec | ETA 08:32:33 2022-08-22 22:40:54 [INFO] [TRAIN] epoch: 35, iter: 43550/160000, loss: 0.6916, lr: 0.000882, batch_cost: 0.2868, reader_cost: 0.00062, ips: 27.8906 samples/sec | ETA 09:16:41 2022-08-22 22:41:07 [INFO] [TRAIN] epoch: 35, iter: 43600/160000, loss: 0.7104, lr: 0.000881, batch_cost: 0.2588, reader_cost: 0.00545, ips: 30.9157 samples/sec | ETA 08:22:00 2022-08-22 22:41:21 [INFO] [TRAIN] epoch: 35, iter: 43650/160000, loss: 0.7083, lr: 0.000881, batch_cost: 0.2711, reader_cost: 0.00279, ips: 29.5064 samples/sec | ETA 08:45:45 2022-08-22 22:41:34 [INFO] [TRAIN] epoch: 35, iter: 43700/160000, loss: 0.6863, lr: 0.000881, batch_cost: 0.2690, reader_cost: 0.00335, ips: 29.7431 samples/sec | ETA 08:41:21 2022-08-22 22:41:49 [INFO] [TRAIN] epoch: 35, iter: 43750/160000, loss: 0.7083, lr: 0.000880, batch_cost: 0.3018, reader_cost: 0.00444, ips: 26.5078 samples/sec | ETA 09:44:43 2022-08-22 22:42:03 [INFO] [TRAIN] epoch: 35, iter: 43800/160000, loss: 0.7092, lr: 0.000880, batch_cost: 0.2763, reader_cost: 0.00084, ips: 28.9588 samples/sec | ETA 08:55:00 2022-08-22 22:42:17 [INFO] [TRAIN] epoch: 35, iter: 43850/160000, loss: 0.7306, lr: 0.000879, batch_cost: 0.2791, reader_cost: 0.01781, ips: 28.6601 samples/sec | ETA 09:00:21 2022-08-22 22:42:31 [INFO] [TRAIN] epoch: 35, iter: 43900/160000, loss: 0.7166, lr: 0.000879, batch_cost: 0.2759, reader_cost: 0.00146, ips: 28.9936 samples/sec | ETA 08:53:54 2022-08-22 22:42:41 [INFO] [TRAIN] epoch: 35, iter: 43950/160000, loss: 0.7067, lr: 0.000879, batch_cost: 0.2028, reader_cost: 0.00162, ips: 39.4458 samples/sec | ETA 06:32:16 2022-08-22 22:42:52 [INFO] [TRAIN] epoch: 35, iter: 44000/160000, loss: 
0.7126, lr: 0.000878, batch_cost: 0.2217, reader_cost: 0.00359, ips: 36.0875 samples/sec | ETA 07:08:35 2022-08-22 22:42:52 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 160s - batch_cost: 0.1602 - reader cost: 0.0016 2022-08-22 22:45:32 [INFO] [EVAL] #Images: 2000 mIoU: 0.3250 Acc: 0.7523 Kappa: 0.7335 Dice: 0.4553 2022-08-22 22:45:32 [INFO] [EVAL] Class IoU: [0.6706 0.7574 0.9293 0.7054 0.6725 0.7394 0.7686 0.7483 0.5025 0.6388 0.4718 0.5473 0.6784 0.3021 0.2588 0.3962 0.5056 0.4277 0.5834 0.39 0.7424 0.4067 0.6049 0.4796 0.314 0.399 0.4462 0.4194 0.3732 0.255 0.2135 0.4481 0.2105 0.315 0.2649 0.3388 0.4099 0.5243 0.2165 0.32 0.1483 0.14 0.3491 0.2331 0.3088 0.1366 0.2834 0.4757 0.579 0.4747 0.5054 0.2897 0.1558 0.253 0.6593 0.4933 0.8358 0.4125 0.4717 0.2923 0.0997 0.3885 0.2912 0.1163 0.4282 0.625 0.2289 0.3834 0.0835 0.3083 0.3897 0.4918 0.3819 0.2502 0.4109 0.3265 0.386 0.3067 0.2143 0.2092 0.5763 0.3984 0.3091 0.0124 0.2981 0.5293 0.061 0.0554 0.244 0.4596 0.3603 0.0607 0.204 0.1004 0.0024 0.0213 0.0652 0.1016 0.2828 0.3436 0.101 0.0338 0.2987 0.0894 0.0954 0.3678 0.1027 0.4941 0.1147 0.1336 0.0508 0.4279 0.1278 0.5529 0.5914 0.0036 0.3787 0.5583 0.1238 0.0895 0.3936 0. 0.1884 0.1663 0.3236 0.3103 0.4184 0.4444 0.1614 0.2667 0.5379 0.0095 0.2898 0.2904 0.1483 0.1308 0.1242 0.0072 0.1199 0.3465 0.2391 0. 0.3721 0.1718 0.2624 0. 0.3442 0.0171 0.1233 0.1024] 2022-08-22 22:45:32 [INFO] [EVAL] Class Precision: [0.7761 0.8471 0.9696 0.8098 0.753 0.8623 0.8884 0.8047 0.6316 0.7811 0.7287 0.6822 0.7528 0.4798 0.5173 0.5899 0.7016 0.7033 0.7427 0.6057 0.852 0.6495 0.7409 0.591 0.4688 0.466 0.5772 0.6889 0.6963 0.3532 0.4466 0.5915 0.4955 0.3764 0.4536 0.436 0.5818 0.7844 0.3977 0.6236 0.2603 0.3385 0.6584 0.5317 0.4542 0.4395 0.5475 0.6596 0.6208 0.5696 0.6751 0.3465 0.3389 0.5404 0.6739 0.6582 0.8738 0.5555 0.5863 0.424 0.1348 0.49 0.3631 0.7285 0.5591 0.7319 0.3623 0.5208 0.2426 0.5554 0.6114 0.6675 0.5061 0.3569 0.6243 0.5158 0.5145 0.8188 0.523 0.2798 0.6233 0.6232 0.7677 0.0395 0.3456 0.7129 0.3511 0.3687 0.3787 0.7161 0.4847 0.1 0.4373 0.2791 0.0285 0.0689 0.3421 0.3338 0.4976 0.5679 0.514 0.0904 0.6118 0.4586 0.6447 0.4374 0.2501 0.7659 0.3081 0.2914 0.3319 0.6269 0.4182 0.7549 0.5929 0.0487 0.5681 0.5966 0.2172 0.666 0.6774 0. 0.4962 0.6098 0.6585 0.6935 0.7145 0.6263 0.3661 0.4 0.6611 0.8297 0.5103 0.5774 0.5402 0.3191 0.2794 0.0549 0.4228 0.6277 0.2623 0. 0.6096 0.5324 0.5986 0. 0.8095 0.5001 0.3864 0.8624] 2022-08-22 22:45:32 [INFO] [EVAL] Class Recall: [0.8315 0.8773 0.9571 0.8455 0.8628 0.8383 0.8508 0.9144 0.7108 0.7781 0.5723 0.7345 0.8727 0.4492 0.3413 0.5467 0.6441 0.5219 0.7311 0.5227 0.8523 0.5211 0.7672 0.7178 0.4875 0.735 0.6629 0.5173 0.4457 0.4785 0.2903 0.6489 0.268 0.6588 0.389 0.6032 0.5811 0.6126 0.322 0.3967 0.2563 0.1926 0.4263 0.2933 0.491 0.1655 0.3701 0.6304 0.8958 0.7402 0.6679 0.6385 0.2238 0.3224 0.9682 0.6631 0.9506 0.6156 0.7071 0.4848 0.2769 0.6522 0.5951 0.1216 0.6465 0.8105 0.3833 0.5924 0.113 0.4093 0.518 0.6515 0.6088 0.4557 0.5459 0.4708 0.6071 0.329 0.2664 0.4534 0.8841 0.5249 0.3409 0.0177 0.6845 0.6727 0.0687 0.0612 0.4068 0.562 0.5839 0.1337 0.2767 0.1356 0.0026 0.0299 0.0745 0.1274 0.3959 0.4652 0.1116 0.0512 0.3685 0.0999 0.1007 0.698 0.1485 0.582 0.1544 0.1978 0.0566 0.5741 0.1555 0.6739 0.9958 0.0039 0.5318 0.897 0.2234 0.0937 0.4844 0. 0.233 0.1861 0.3888 0.3597 0.5025 0.6049 0.224 0.4445 0.7426 0.0095 0.4016 0.3687 0.1697 0.1814 0.1828 0.0083 0.1433 0.4362 0.7301 0. 0.4885 0.2023 0.3185 0. 
0.3746 0.0173 0.1533 0.1041] 2022-08-22 22:45:33 [INFO] [EVAL] The model with the best validation mIoU (0.3298) was saved at iter 42000. 2022-08-22 22:45:43 [INFO] [TRAIN] epoch: 35, iter: 44050/160000, loss: 0.7431, lr: 0.000878, batch_cost: 0.2054, reader_cost: 0.00427, ips: 38.9433 samples/sec | ETA 06:36:59 2022-08-22 22:45:55 [INFO] [TRAIN] epoch: 35, iter: 44100/160000, loss: 0.6931, lr: 0.000877, batch_cost: 0.2428, reader_cost: 0.00117, ips: 32.9544 samples/sec | ETA 07:48:55 2022-08-22 22:46:06 [INFO] [TRAIN] epoch: 35, iter: 44150/160000, loss: 0.7295, lr: 0.000877, batch_cost: 0.2131, reader_cost: 0.00421, ips: 37.5420 samples/sec | ETA 06:51:26 2022-08-22 22:46:19 [INFO] [TRAIN] epoch: 35, iter: 44200/160000, loss: 0.7359, lr: 0.000877, batch_cost: 0.2667, reader_cost: 0.00900, ips: 29.9978 samples/sec | ETA 08:34:42 2022-08-22 22:46:37 [INFO] [TRAIN] epoch: 36, iter: 44250/160000, loss: 0.7029, lr: 0.000876, batch_cost: 0.3561, reader_cost: 0.07248, ips: 22.4674 samples/sec | ETA 11:26:55 2022-08-22 22:46:52 [INFO] [TRAIN] epoch: 36, iter: 44300/160000, loss: 0.7273, lr: 0.000876, batch_cost: 0.2978, reader_cost: 0.00229, ips: 26.8627 samples/sec | ETA 09:34:16 2022-08-22 22:47:06 [INFO] [TRAIN] epoch: 36, iter: 44350/160000, loss: 0.7149, lr: 0.000876, batch_cost: 0.2817, reader_cost: 0.00068, ips: 28.3989 samples/sec | ETA 09:02:58 2022-08-22 22:47:21 [INFO] [TRAIN] epoch: 36, iter: 44400/160000, loss: 0.7269, lr: 0.000875, batch_cost: 0.3042, reader_cost: 0.00057, ips: 26.2971 samples/sec | ETA 09:46:07 2022-08-22 22:47:34 [INFO] [TRAIN] epoch: 36, iter: 44450/160000, loss: 0.6942, lr: 0.000875, batch_cost: 0.2698, reader_cost: 0.00811, ips: 29.6476 samples/sec | ETA 08:39:39 2022-08-22 22:47:49 [INFO] [TRAIN] epoch: 36, iter: 44500/160000, loss: 0.6651, lr: 0.000874, batch_cost: 0.2902, reader_cost: 0.01419, ips: 27.5719 samples/sec | ETA 09:18:32 2022-08-22 22:48:04 [INFO] [TRAIN] epoch: 36, iter: 44550/160000, loss: 0.6574, lr: 0.000874, batch_cost: 0.2919, reader_cost: 0.02121, ips: 27.4055 samples/sec | ETA 09:21:41 2022-08-22 22:48:17 [INFO] [TRAIN] epoch: 36, iter: 44600/160000, loss: 0.7159, lr: 0.000874, batch_cost: 0.2791, reader_cost: 0.00676, ips: 28.6620 samples/sec | ETA 08:56:49 2022-08-22 22:48:32 [INFO] [TRAIN] epoch: 36, iter: 44650/160000, loss: 0.7194, lr: 0.000873, batch_cost: 0.2969, reader_cost: 0.00115, ips: 26.9452 samples/sec | ETA 09:30:47 2022-08-22 22:48:46 [INFO] [TRAIN] epoch: 36, iter: 44700/160000, loss: 0.7071, lr: 0.000873, batch_cost: 0.2639, reader_cost: 0.01124, ips: 30.3147 samples/sec | ETA 08:27:07 2022-08-22 22:48:59 [INFO] [TRAIN] epoch: 36, iter: 44750/160000, loss: 0.7420, lr: 0.000873, batch_cost: 0.2678, reader_cost: 0.00052, ips: 29.8736 samples/sec | ETA 08:34:23 2022-08-22 22:49:13 [INFO] [TRAIN] epoch: 36, iter: 44800/160000, loss: 0.7193, lr: 0.000872, batch_cost: 0.2836, reader_cost: 0.01174, ips: 28.2111 samples/sec | ETA 09:04:28 2022-08-22 22:49:27 [INFO] [TRAIN] epoch: 36, iter: 44850/160000, loss: 0.7067, lr: 0.000872, batch_cost: 0.2726, reader_cost: 0.00386, ips: 29.3504 samples/sec | ETA 08:43:06 2022-08-22 22:49:39 [INFO] [TRAIN] epoch: 36, iter: 44900/160000, loss: 0.6999, lr: 0.000871, batch_cost: 0.2419, reader_cost: 0.00347, ips: 33.0682 samples/sec | ETA 07:44:05 2022-08-22 22:49:50 [INFO] [TRAIN] epoch: 36, iter: 44950/160000, loss: 0.7473, lr: 0.000871, batch_cost: 0.2252, reader_cost: 0.00321, ips: 35.5282 samples/sec | ETA 07:11:46 2022-08-22 22:50:01 [INFO] [TRAIN] epoch: 36, iter: 45000/160000, loss: 
0.7075, lr: 0.000871, batch_cost: 0.2184, reader_cost: 0.00394, ips: 36.6369 samples/sec | ETA 06:58:31 2022-08-22 22:50:01 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 160s - batch_cost: 0.1603 - reader cost: 0.0014 2022-08-22 22:52:42 [INFO] [EVAL] #Images: 2000 mIoU: 0.3219 Acc: 0.7522 Kappa: 0.7333 Dice: 0.4519 2022-08-22 22:52:42 [INFO] [EVAL] Class IoU: [0.673 0.7646 0.9301 0.7032 0.6655 0.7431 0.7473 0.7607 0.509 0.6205 0.4719 0.5276 0.6867 0.3139 0.207 0.4222 0.4952 0.4422 0.5867 0.396 0.7378 0.4294 0.5764 0.4767 0.3213 0.2932 0.3825 0.4177 0.3468 0.2717 0.2402 0.4553 0.2889 0.3519 0.2876 0.3312 0.4056 0.5288 0.2103 0.3325 0.2001 0.0969 0.3517 0.2405 0.297 0.2328 0.3077 0.4466 0.5036 0.4942 0.5235 0.2982 0.1178 0.2042 0.6646 0.5229 0.8422 0.3583 0.3918 0.2775 0.0744 0.3398 0.2561 0.1797 0.4299 0.6415 0.2276 0.3828 0.0858 0.3158 0.3661 0.4619 0.3774 0.2611 0.4288 0.3156 0.4308 0.2794 0.2836 0.2949 0.5215 0.3781 0.236 0.0352 0.183 0.529 0.0774 0.0611 0.245 0.5135 0.2139 0.0508 0.2152 0.0661 0.0035 0.0176 0.0267 0.1165 0.2604 0.3708 0.0654 0.0368 0.2904 0.0656 0.1576 0.3743 0.1247 0.4886 0.1372 0.2921 0.0805 0.3481 0.1177 0.5661 0.5194 0.0068 0.3161 0.6167 0.0799 0.267 0.43 0. 0.1853 0.1199 0.3761 0.3098 0.4064 0.4121 0.2359 0.2693 0.4829 0.0266 0.2744 0.212 0.0834 0.1252 0.112 0.0105 0.1198 0.3608 0.4961 0.0399 0.2595 0.0129 0.1126 0. 0.3592 0.0154 0.0909 0.12 ] 2022-08-22 22:52:42 [INFO] [EVAL] Class Precision: [0.7838 0.8295 0.9648 0.7799 0.7979 0.8627 0.8705 0.8158 0.6449 0.8097 0.693 0.6369 0.7859 0.48 0.525 0.5962 0.5713 0.6912 0.7469 0.6247 0.847 0.6548 0.7112 0.6157 0.4838 0.4571 0.5287 0.6288 0.7089 0.3796 0.4207 0.6063 0.4706 0.4701 0.5112 0.4517 0.6248 0.7339 0.3159 0.6132 0.297 0.3506 0.6179 0.5193 0.4599 0.4017 0.4711 0.7658 0.6179 0.6023 0.756 0.4213 0.2405 0.5132 0.6811 0.7661 0.9124 0.6258 0.6804 0.4536 0.1577 0.447 0.3674 0.6942 0.5447 0.7186 0.3736 0.4765 0.508 0.568 0.671 0.6377 0.6447 0.3446 0.7014 0.6349 0.5216 0.6527 0.8027 0.3985 0.6575 0.6913 0.822 0.0614 0.5484 0.769 0.3653 0.3786 0.4488 0.7323 0.2217 0.0983 0.3803 0.2764 0.011 0.0457 0.1489 0.4538 0.5582 0.6139 0.3487 0.0675 0.4831 0.6944 0.5845 0.4571 0.5136 0.8019 0.2598 0.6261 0.2206 0.9732 0.5722 0.782 0.5198 0.0721 0.692 0.6329 0.1483 0.6081 0.5947 0. 0.5337 0.6667 0.7475 0.716 0.6898 0.5953 0.2969 0.6 0.6808 0.2601 0.4815 0.8366 0.4524 0.3652 0.2324 0.0934 0.5363 0.6211 0.6722 0.0515 0.7471 0.2563 0.5792 0. 0.5984 0.4715 0.6444 0.8674] 2022-08-22 22:52:42 [INFO] [EVAL] Class Recall: [0.8265 0.9072 0.9628 0.8774 0.8005 0.8427 0.8407 0.9184 0.7072 0.7264 0.5967 0.7546 0.8447 0.4756 0.2547 0.5914 0.788 0.5511 0.7322 0.5196 0.8513 0.5552 0.7526 0.6787 0.4889 0.4499 0.5805 0.5545 0.4044 0.4886 0.3588 0.6464 0.428 0.5834 0.3967 0.5539 0.5362 0.6543 0.3861 0.4208 0.3803 0.1181 0.4494 0.3094 0.456 0.3563 0.4701 0.5172 0.7313 0.7337 0.63 0.5049 0.1877 0.2532 0.9648 0.6222 0.9162 0.4559 0.4802 0.4168 0.1234 0.5862 0.4582 0.1952 0.6711 0.8568 0.3679 0.6606 0.0936 0.4156 0.4461 0.6261 0.4765 0.5186 0.5245 0.3856 0.7122 0.3282 0.3049 0.5315 0.716 0.4549 0.2487 0.0762 0.2155 0.6289 0.0894 0.0679 0.3505 0.6321 0.8586 0.0952 0.3313 0.0799 0.0051 0.0277 0.0315 0.1356 0.328 0.4836 0.0745 0.0748 0.4213 0.0676 0.1774 0.674 0.1414 0.5557 0.2253 0.3538 0.1125 0.3515 0.1291 0.6722 0.9982 0.0075 0.3679 0.9602 0.1478 0.3226 0.6082 0. 
0.2212 0.1276 0.4308 0.3533 0.4972 0.5724 0.5344 0.3282 0.6243 0.0288 0.3894 0.2211 0.0927 0.1601 0.1777 0.0117 0.1336 0.4627 0.6544 0.1498 0.2845 0.0134 0.1226 0. 0.4733 0.0156 0.0958 0.1222] 2022-08-22 22:52:42 [INFO] [EVAL] The model with the best validation mIoU (0.3298) was saved at iter 42000. 2022-08-22 22:52:51 [INFO] [TRAIN] epoch: 36, iter: 45050/160000, loss: 0.7008, lr: 0.000870, batch_cost: 0.1912, reader_cost: 0.00382, ips: 41.8307 samples/sec | ETA 06:06:23 2022-08-22 22:53:02 [INFO] [TRAIN] epoch: 36, iter: 45100/160000, loss: 0.7020, lr: 0.000870, batch_cost: 0.2111, reader_cost: 0.00133, ips: 37.8986 samples/sec | ETA 06:44:14 2022-08-22 22:53:13 [INFO] [TRAIN] epoch: 36, iter: 45150/160000, loss: 0.7262, lr: 0.000870, batch_cost: 0.2212, reader_cost: 0.00047, ips: 36.1615 samples/sec | ETA 07:03:28 2022-08-22 22:53:24 [INFO] [TRAIN] epoch: 36, iter: 45200/160000, loss: 0.7084, lr: 0.000869, batch_cost: 0.2240, reader_cost: 0.00035, ips: 35.7207 samples/sec | ETA 07:08:30 2022-08-22 22:53:36 [INFO] [TRAIN] epoch: 36, iter: 45250/160000, loss: 0.7237, lr: 0.000869, batch_cost: 0.2407, reader_cost: 0.01228, ips: 33.2394 samples/sec | ETA 07:40:17 2022-08-22 22:53:48 [INFO] [TRAIN] epoch: 36, iter: 45300/160000, loss: 0.6801, lr: 0.000868, batch_cost: 0.2473, reader_cost: 0.00142, ips: 32.3500 samples/sec | ETA 07:52:44 2022-08-22 22:54:05 [INFO] [TRAIN] epoch: 36, iter: 45350/160000, loss: 0.7125, lr: 0.000868, batch_cost: 0.3229, reader_cost: 0.00080, ips: 24.7777 samples/sec | ETA 10:16:57 2022-08-22 22:54:20 [INFO] [TRAIN] epoch: 36, iter: 45400/160000, loss: 0.7309, lr: 0.000868, batch_cost: 0.3103, reader_cost: 0.00140, ips: 25.7782 samples/sec | ETA 09:52:44 2022-08-22 22:54:35 [INFO] [TRAIN] epoch: 36, iter: 45450/160000, loss: 0.7218, lr: 0.000867, batch_cost: 0.2876, reader_cost: 0.00081, ips: 27.8190 samples/sec | ETA 09:09:01 2022-08-22 22:54:54 [INFO] [TRAIN] epoch: 37, iter: 45500/160000, loss: 0.6887, lr: 0.000867, batch_cost: 0.3954, reader_cost: 0.09944, ips: 20.2311 samples/sec | ETA 12:34:36 2022-08-22 22:55:09 [INFO] [TRAIN] epoch: 37, iter: 45550/160000, loss: 0.6716, lr: 0.000867, batch_cost: 0.2893, reader_cost: 0.00551, ips: 27.6487 samples/sec | ETA 09:11:55 2022-08-22 22:55:24 [INFO] [TRAIN] epoch: 37, iter: 45600/160000, loss: 0.7003, lr: 0.000866, batch_cost: 0.2973, reader_cost: 0.00044, ips: 26.9081 samples/sec | ETA 09:26:52 2022-08-22 22:55:38 [INFO] [TRAIN] epoch: 37, iter: 45650/160000, loss: 0.7276, lr: 0.000866, batch_cost: 0.2948, reader_cost: 0.00067, ips: 27.1413 samples/sec | ETA 09:21:45 2022-08-22 22:55:53 [INFO] [TRAIN] epoch: 37, iter: 45700/160000, loss: 0.6843, lr: 0.000865, batch_cost: 0.2893, reader_cost: 0.00117, ips: 27.6483 samples/sec | ETA 09:11:12 2022-08-22 22:56:08 [INFO] [TRAIN] epoch: 37, iter: 45750/160000, loss: 0.6631, lr: 0.000865, batch_cost: 0.2949, reader_cost: 0.00090, ips: 27.1317 samples/sec | ETA 09:21:27 2022-08-22 22:56:19 [INFO] [TRAIN] epoch: 37, iter: 45800/160000, loss: 0.6541, lr: 0.000865, batch_cost: 0.2370, reader_cost: 0.00332, ips: 33.7547 samples/sec | ETA 07:31:05 2022-08-22 22:56:32 [INFO] [TRAIN] epoch: 37, iter: 45850/160000, loss: 0.7258, lr: 0.000864, batch_cost: 0.2494, reader_cost: 0.00088, ips: 32.0754 samples/sec | ETA 07:54:30 2022-08-22 22:56:46 [INFO] [TRAIN] epoch: 37, iter: 45900/160000, loss: 0.6747, lr: 0.000864, batch_cost: 0.2760, reader_cost: 0.01593, ips: 28.9901 samples/sec | ETA 08:44:46 2022-08-22 22:56:59 [INFO] [TRAIN] epoch: 37, iter: 45950/160000, loss: 0.7139, lr: 
0.000863, batch_cost: 0.2559, reader_cost: 0.00755, ips: 31.2640 samples/sec | ETA 08:06:23 2022-08-22 22:57:11 [INFO] [TRAIN] epoch: 37, iter: 46000/160000, loss: 0.6936, lr: 0.000863, batch_cost: 0.2515, reader_cost: 0.00290, ips: 31.8105 samples/sec | ETA 07:57:49 2022-08-22 22:57:11 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 181s - batch_cost: 0.1813 - reader cost: 6.1535e-04 2022-08-22 23:00:13 [INFO] [EVAL] #Images: 2000 mIoU: 0.3276 Acc: 0.7517 Kappa: 0.7331 Dice: 0.4574 2022-08-22 23:00:13 [INFO] [EVAL] Class IoU: [0.6709 0.7655 0.9284 0.7116 0.6717 0.7553 0.7632 0.7715 0.5132 0.6082 0.474 0.5427 0.6885 0.258 0.2809 0.4135 0.5002 0.4415 0.5823 0.391 0.7322 0.4531 0.587 0.4752 0.3572 0.3369 0.4423 0.4078 0.3904 0.2547 0.2354 0.4782 0.2741 0.3114 0.2698 0.3435 0.4064 0.4786 0.2551 0.3856 0.1734 0.1253 0.3343 0.2413 0.2823 0.2386 0.2681 0.4536 0.4567 0.4916 0.5122 0.2499 0.1386 0.161 0.6527 0.4025 0.8571 0.3117 0.4322 0.2505 0.0704 0.3746 0.3054 0.1383 0.4219 0.6106 0.2415 0.3668 0.0577 0.2622 0.3567 0.39 0.3513 0.2563 0.4118 0.33 0.5073 0.3065 0.202 0.2077 0.438 0.3714 0.2616 0.0644 0.0486 0.5247 0.0843 0.0766 0.3331 0.4478 0.3899 0.0326 0.2176 0.039 0.0003 0.0227 0.0297 0.1861 0.2731 0.378 0.0614 0.044 0.2539 0.12 0.039 0.4411 0.1039 0.484 0.145 0.3109 0.0984 0.398 0.1449 0.6727 0.845 0.0004 0.3832 0.5303 0.1512 0.2236 0.3899 0.0037 0.2099 0.0896 0.4354 0.3374 0.4207 0.4591 0.3824 0.2989 0.5458 0.0012 0.2471 0.2802 0.1154 0.1224 0.1236 0.0328 0.0994 0.3246 0.4717 0. 0.3291 0.0587 0.2971 0. 0.3464 0.0252 0.0781 0.133 ] 2022-08-22 23:00:13 [INFO] [EVAL] Class Precision: [0.778 0.8595 0.9706 0.8273 0.7509 0.8616 0.887 0.8604 0.6599 0.7627 0.6727 0.6545 0.7862 0.5072 0.4525 0.5717 0.6227 0.6952 0.7138 0.5707 0.829 0.732 0.7158 0.6072 0.4898 0.4297 0.5538 0.6879 0.6051 0.3135 0.4117 0.6298 0.409 0.4067 0.436 0.5157 0.6035 0.7129 0.4524 0.5653 0.3108 0.2811 0.5134 0.4936 0.3895 0.4224 0.5064 0.6548 0.6339 0.5936 0.6858 0.3109 0.3145 0.8052 0.6665 0.5079 0.9196 0.7182 0.5969 0.408 0.1084 0.5364 0.4938 0.8768 0.5238 0.7039 0.3355 0.6122 0.298 0.4669 0.6047 0.7637 0.6319 0.3388 0.5539 0.573 0.6323 0.5889 0.8311 0.2976 0.639 0.6777 0.7917 0.1121 0.231 0.6653 0.3215 0.4234 0.8045 0.7287 0.5197 0.0359 0.37 0.2771 0.0021 0.0669 0.5392 0.3585 0.5257 0.5984 0.3478 0.0684 0.5793 0.5889 0.5168 0.5891 0.2398 0.778 0.2246 0.4314 0.1714 0.7176 0.3801 0.7901 0.9063 0.0106 0.5449 0.6227 0.1929 0.5504 0.6179 0.068 0.6222 0.6974 0.7991 0.7051 0.6797 0.6395 0.6838 0.5534 0.681 0.027 0.3586 0.553 0.6017 0.2841 0.2469 0.1048 0.4127 0.6618 0.6877 0. 0.7752 0.4851 0.578 0. 
0.8449 0.4995 0.5823 0.6409] 2022-08-22 23:00:13 [INFO] [EVAL] Class Recall: [0.8298 0.875 0.9553 0.8357 0.8643 0.8596 0.8454 0.8819 0.6977 0.7502 0.6162 0.7606 0.8472 0.3442 0.4256 0.5991 0.7178 0.5475 0.7596 0.5539 0.8625 0.5433 0.7653 0.6862 0.5689 0.6094 0.6871 0.5004 0.5239 0.5759 0.3548 0.6651 0.4539 0.5705 0.4144 0.507 0.5544 0.5929 0.369 0.5481 0.2817 0.1845 0.4893 0.3207 0.5063 0.3541 0.363 0.5962 0.6203 0.7411 0.6693 0.5601 0.1985 0.1675 0.9691 0.6598 0.9265 0.3551 0.6103 0.3935 0.1674 0.5539 0.4446 0.141 0.6844 0.8217 0.4631 0.4778 0.0667 0.3743 0.4651 0.4435 0.4417 0.5128 0.6162 0.4376 0.7195 0.39 0.2107 0.4076 0.582 0.4511 0.281 0.1313 0.0579 0.7128 0.1025 0.0855 0.3624 0.5374 0.6096 0.2587 0.3455 0.0435 0.0003 0.0331 0.0305 0.279 0.3623 0.5065 0.0694 0.1096 0.3113 0.1309 0.0405 0.6371 0.155 0.5615 0.2902 0.5267 0.1877 0.472 0.1897 0.8191 0.9259 0.0004 0.5636 0.7814 0.4118 0.2736 0.5139 0.0039 0.2406 0.0932 0.4889 0.3929 0.5246 0.6194 0.4645 0.394 0.7332 0.0012 0.4428 0.3623 0.1249 0.177 0.1983 0.0456 0.1157 0.3891 0.6003 0. 0.3639 0.0626 0.3794 0. 0.3699 0.0259 0.0827 0.1437] 2022-08-22 23:00:13 [INFO] [EVAL] The model with the best validation mIoU (0.3298) was saved at iter 42000. 2022-08-22 23:00:23 [INFO] [TRAIN] epoch: 37, iter: 46050/160000, loss: 0.6943, lr: 0.000863, batch_cost: 0.1989, reader_cost: 0.00340, ips: 40.2212 samples/sec | ETA 06:17:44 2022-08-22 23:00:35 [INFO] [TRAIN] epoch: 37, iter: 46100/160000, loss: 0.6979, lr: 0.000862, batch_cost: 0.2358, reader_cost: 0.00208, ips: 33.9242 samples/sec | ETA 07:27:39 2022-08-22 23:00:44 [INFO] [TRAIN] epoch: 37, iter: 46150/160000, loss: 0.6827, lr: 0.000862, batch_cost: 0.1949, reader_cost: 0.00325, ips: 41.0547 samples/sec | ETA 06:09:45 2022-08-22 23:00:55 [INFO] [TRAIN] epoch: 37, iter: 46200/160000, loss: 0.6774, lr: 0.000862, batch_cost: 0.2052, reader_cost: 0.00039, ips: 38.9837 samples/sec | ETA 06:29:13 2022-08-22 23:01:07 [INFO] [TRAIN] epoch: 37, iter: 46250/160000, loss: 0.6618, lr: 0.000861, batch_cost: 0.2498, reader_cost: 0.00052, ips: 32.0258 samples/sec | ETA 07:53:34 2022-08-22 23:01:20 [INFO] [TRAIN] epoch: 37, iter: 46300/160000, loss: 0.6921, lr: 0.000861, batch_cost: 0.2674, reader_cost: 0.00061, ips: 29.9232 samples/sec | ETA 08:26:37 2022-08-22 23:01:34 [INFO] [TRAIN] epoch: 37, iter: 46350/160000, loss: 0.7064, lr: 0.000860, batch_cost: 0.2819, reader_cost: 0.00116, ips: 28.3816 samples/sec | ETA 08:53:54 2022-08-22 23:01:48 [INFO] [TRAIN] epoch: 37, iter: 46400/160000, loss: 0.6968, lr: 0.000860, batch_cost: 0.2789, reader_cost: 0.00065, ips: 28.6865 samples/sec | ETA 08:48:00 2022-08-22 23:02:02 [INFO] [TRAIN] epoch: 37, iter: 46450/160000, loss: 0.7647, lr: 0.000860, batch_cost: 0.2644, reader_cost: 0.00131, ips: 30.2542 samples/sec | ETA 08:20:25 2022-08-22 23:02:14 [INFO] [TRAIN] epoch: 37, iter: 46500/160000, loss: 0.6901, lr: 0.000859, batch_cost: 0.2446, reader_cost: 0.00051, ips: 32.7031 samples/sec | ETA 07:42:44 2022-08-22 23:02:28 [INFO] [TRAIN] epoch: 37, iter: 46550/160000, loss: 0.6899, lr: 0.000859, batch_cost: 0.2760, reader_cost: 0.00080, ips: 28.9835 samples/sec | ETA 08:41:54 2022-08-22 23:02:40 [INFO] [TRAIN] epoch: 37, iter: 46600/160000, loss: 0.7257, lr: 0.000859, batch_cost: 0.2504, reader_cost: 0.01615, ips: 31.9475 samples/sec | ETA 07:53:16 2022-08-22 23:02:54 [INFO] [TRAIN] epoch: 37, iter: 46650/160000, loss: 0.6996, lr: 0.000858, batch_cost: 0.2706, reader_cost: 0.00061, ips: 29.5595 samples/sec | ETA 08:31:17 2022-08-22 23:03:07 [INFO] [TRAIN] epoch: 37, iter: 
46700/160000, loss: 0.6923, lr: 0.000858, batch_cost: 0.2657, reader_cost: 0.00283, ips: 30.1101 samples/sec | ETA 08:21:42 2022-08-22 23:03:24 [INFO] [TRAIN] epoch: 38, iter: 46750/160000, loss: 0.7402, lr: 0.000857, batch_cost: 0.3324, reader_cost: 0.06024, ips: 24.0642 samples/sec | ETA 10:27:29 2022-08-22 23:03:38 [INFO] [TRAIN] epoch: 38, iter: 46800/160000, loss: 0.7146, lr: 0.000857, batch_cost: 0.2788, reader_cost: 0.00057, ips: 28.6944 samples/sec | ETA 08:46:00 2022-08-22 23:03:51 [INFO] [TRAIN] epoch: 38, iter: 46850/160000, loss: 0.7186, lr: 0.000857, batch_cost: 0.2700, reader_cost: 0.00190, ips: 29.6286 samples/sec | ETA 08:29:11 2022-08-22 23:04:04 [INFO] [TRAIN] epoch: 38, iter: 46900/160000, loss: 0.7380, lr: 0.000856, batch_cost: 0.2484, reader_cost: 0.00089, ips: 32.2030 samples/sec | ETA 07:48:16 2022-08-22 23:04:15 [INFO] [TRAIN] epoch: 38, iter: 46950/160000, loss: 0.6615, lr: 0.000856, batch_cost: 0.2210, reader_cost: 0.00063, ips: 36.1959 samples/sec | ETA 06:56:26 2022-08-22 23:04:26 [INFO] [TRAIN] epoch: 38, iter: 47000/160000, loss: 0.7260, lr: 0.000856, batch_cost: 0.2215, reader_cost: 0.00054, ips: 36.1179 samples/sec | ETA 06:57:09 2022-08-22 23:04:26 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 148s - batch_cost: 0.1482 - reader cost: 6.2911e-04 2022-08-22 23:06:54 [INFO] [EVAL] #Images: 2000 mIoU: 0.3236 Acc: 0.7538 Kappa: 0.7349 Dice: 0.4518 2022-08-22 23:06:54 [INFO] [EVAL] Class IoU: [0.6763 0.7579 0.9302 0.7074 0.6806 0.7483 0.76 0.7689 0.5022 0.6218 0.4552 0.539 0.6839 0.3012 0.224 0.4231 0.5329 0.4619 0.5831 0.3914 0.7466 0.4575 0.6055 0.4705 0.3156 0.3697 0.4426 0.429 0.3318 0.2101 0.2238 0.4317 0.2431 0.3523 0.2878 0.3421 0.3724 0.5035 0.2353 0.3403 0.1611 0.1303 0.3261 0.2407 0.3343 0.179 0.1714 0.4649 0.5386 0.5171 0.5185 0.2051 0.0926 0.2645 0.6415 0.4193 0.8307 0.3992 0.4486 0.2828 0.1082 0.2411 0.3019 0.0572 0.4347 0.6234 0.2336 0.3583 0.0453 0.3109 0.3701 0.5169 0.3772 0.2565 0.4092 0.3454 0.4227 0.285 0.2104 0.097 0.6423 0.3888 0.3195 0.0173 0.1476 0.5317 0.0755 0.0646 0.3093 0.4679 0.342 0.0304 0.1997 0.0663 0. 0.0071 0.034 0.086 0.2681 0.4231 0.0822 0.0334 0.279 0.0853 0.1458 0.4605 0.1109 0.4845 0.1284 0.1227 0.0609 0.2862 0.1556 0.5726 0.5817 0.0028 0.3057 0.5908 0.0781 0.3519 0.4546 0.0291 0.198 0.1307 0.4637 0.3538 0.4511 0.4318 0.2152 0.21 0.5616 0.001 0.2664 0.1961 0.1912 0.1053 0.1059 0.0404 0.1078 0.3085 0.557 0.008 0.3149 0.1106 0.1487 0.0052 0.3259 0.0486 0.0929 0.1296] 2022-08-22 23:06:54 [INFO] [EVAL] Class Precision: [0.7749 0.8378 0.969 0.7978 0.7909 0.8763 0.8676 0.8284 0.6019 0.7645 0.7233 0.6753 0.7648 0.5211 0.4681 0.5919 0.6483 0.6633 0.7409 0.5935 0.8632 0.6971 0.7584 0.641 0.4674 0.4707 0.5055 0.7291 0.6695 0.3407 0.4054 0.5304 0.5273 0.44 0.4759 0.4762 0.6442 0.7825 0.384 0.5862 0.2373 0.3068 0.6434 0.5744 0.4665 0.3473 0.3952 0.6439 0.6452 0.6457 0.7928 0.2428 0.2438 0.62 0.6592 0.7994 0.8777 0.6028 0.6878 0.419 0.1753 0.4553 0.4146 0.8942 0.5685 0.6992 0.387 0.544 0.4412 0.5344 0.608 0.6266 0.6217 0.3029 0.7451 0.4844 0.4738 0.5471 0.409 0.3746 0.768 0.6873 0.7388 0.0808 0.4343 0.7311 0.5714 0.4247 0.6251 0.7493 0.4365 0.0338 0.3191 0.3066 0. 
0.0232 0.6469 0.4289 0.427 0.7005 0.2752 0.0574 0.5603 0.5506 0.5329 0.6646 0.1771 0.7599 0.2811 0.2331 0.2059 0.3688 0.5797 0.7057 0.5838 0.047 0.6019 0.6347 0.2228 0.5925 0.6451 0.3017 0.7322 0.6886 0.7217 0.6631 0.9119 0.6451 0.609 0.5164 0.6784 0.012 0.4467 0.7524 0.3403 0.3676 0.3119 0.136 0.3855 0.7328 0.7481 0.0115 0.7027 0.5504 0.6547 0.006 0.9076 0.3295 0.5179 0.7788] 2022-08-22 23:06:54 [INFO] [EVAL] Class Recall: [0.8416 0.8882 0.9588 0.8618 0.83 0.8367 0.8596 0.9146 0.752 0.7691 0.5512 0.7275 0.866 0.4165 0.3004 0.5974 0.7495 0.6034 0.7325 0.5348 0.8467 0.571 0.7502 0.6389 0.4929 0.6328 0.7807 0.5104 0.3968 0.3542 0.3331 0.6987 0.3108 0.6388 0.4213 0.5485 0.4688 0.5854 0.3779 0.4478 0.3341 0.1848 0.3981 0.2929 0.5412 0.2698 0.2324 0.6258 0.7652 0.7218 0.5997 0.5689 0.1299 0.3156 0.9597 0.4686 0.9394 0.5417 0.5632 0.4653 0.2203 0.3388 0.5263 0.0576 0.6488 0.8519 0.3707 0.512 0.048 0.4265 0.4862 0.747 0.4896 0.6262 0.4758 0.5462 0.7965 0.373 0.3023 0.1157 0.7969 0.4724 0.3602 0.0216 0.1827 0.6609 0.0801 0.0708 0.3797 0.5547 0.6126 0.2323 0.3481 0.0779 0. 0.0102 0.0346 0.0971 0.4187 0.5165 0.1049 0.0742 0.3572 0.0917 0.1672 0.6 0.2287 0.5721 0.1912 0.2057 0.0795 0.5609 0.1754 0.7522 0.994 0.0029 0.3831 0.8953 0.1074 0.4643 0.6062 0.0312 0.2134 0.1389 0.5646 0.4313 0.4717 0.5663 0.2497 0.2614 0.7655 0.0011 0.3977 0.2096 0.3039 0.1287 0.1381 0.0544 0.1301 0.3476 0.6856 0.0257 0.3633 0.1216 0.1614 0.0349 0.3371 0.054 0.1017 0.1346] 2022-08-22 23:06:54 [INFO] [EVAL] The model with the best validation mIoU (0.3298) was saved at iter 42000. 2022-08-22 23:07:04 [INFO] [TRAIN] epoch: 38, iter: 47050/160000, loss: 0.6956, lr: 0.000855, batch_cost: 0.2013, reader_cost: 0.00439, ips: 39.7413 samples/sec | ETA 06:18:57 2022-08-22 23:07:15 [INFO] [TRAIN] epoch: 38, iter: 47100/160000, loss: 0.6891, lr: 0.000855, batch_cost: 0.2247, reader_cost: 0.00099, ips: 35.6074 samples/sec | ETA 07:02:45 2022-08-22 23:07:27 [INFO] [TRAIN] epoch: 38, iter: 47150/160000, loss: 0.6871, lr: 0.000854, batch_cost: 0.2248, reader_cost: 0.00063, ips: 35.5822 samples/sec | ETA 07:02:52 2022-08-22 23:07:37 [INFO] [TRAIN] epoch: 38, iter: 47200/160000, loss: 0.6376, lr: 0.000854, batch_cost: 0.2104, reader_cost: 0.00084, ips: 38.0167 samples/sec | ETA 06:35:36 2022-08-22 23:07:48 [INFO] [TRAIN] epoch: 38, iter: 47250/160000, loss: 0.6948, lr: 0.000854, batch_cost: 0.2207, reader_cost: 0.00158, ips: 36.2418 samples/sec | ETA 06:54:48 2022-08-22 23:07:59 [INFO] [TRAIN] epoch: 38, iter: 47300/160000, loss: 0.6800, lr: 0.000853, batch_cost: 0.2080, reader_cost: 0.00086, ips: 38.4650 samples/sec | ETA 06:30:39 2022-08-22 23:08:10 [INFO] [TRAIN] epoch: 38, iter: 47350/160000, loss: 0.7111, lr: 0.000853, batch_cost: 0.2237, reader_cost: 0.00085, ips: 35.7664 samples/sec | ETA 06:59:56 2022-08-22 23:08:21 [INFO] [TRAIN] epoch: 38, iter: 47400/160000, loss: 0.7379, lr: 0.000852, batch_cost: 0.2287, reader_cost: 0.00066, ips: 34.9771 samples/sec | ETA 07:09:13 2022-08-22 23:08:33 [INFO] [TRAIN] epoch: 38, iter: 47450/160000, loss: 0.7230, lr: 0.000852, batch_cost: 0.2256, reader_cost: 0.00045, ips: 35.4550 samples/sec | ETA 07:03:15 2022-08-22 23:08:43 [INFO] [TRAIN] epoch: 38, iter: 47500/160000, loss: 0.6682, lr: 0.000852, batch_cost: 0.2177, reader_cost: 0.00046, ips: 36.7527 samples/sec | ETA 06:48:07 2022-08-22 23:08:54 [INFO] [TRAIN] epoch: 38, iter: 47550/160000, loss: 0.6871, lr: 0.000851, batch_cost: 0.2081, reader_cost: 0.00045, ips: 38.4479 samples/sec | ETA 06:29:57 2022-08-22 23:09:07 [INFO] [TRAIN] epoch: 38, 
iter: 47600/160000, loss: 0.6792, lr: 0.000851, batch_cost: 0.2686, reader_cost: 0.00039, ips: 29.7824 samples/sec | ETA 08:23:12 2022-08-22 23:09:21 [INFO] [TRAIN] epoch: 38, iter: 47650/160000, loss: 0.6548, lr: 0.000851, batch_cost: 0.2728, reader_cost: 0.00065, ips: 29.3287 samples/sec | ETA 08:30:45 2022-08-22 23:09:35 [INFO] [TRAIN] epoch: 38, iter: 47700/160000, loss: 0.7151, lr: 0.000850, batch_cost: 0.2870, reader_cost: 0.00106, ips: 27.8726 samples/sec | ETA 08:57:12 2022-08-22 23:09:48 [INFO] [TRAIN] epoch: 38, iter: 47750/160000, loss: 0.6938, lr: 0.000850, batch_cost: 0.2603, reader_cost: 0.00065, ips: 30.7285 samples/sec | ETA 08:07:03 2022-08-22 23:10:01 [INFO] [TRAIN] epoch: 38, iter: 47800/160000, loss: 0.7199, lr: 0.000849, batch_cost: 0.2535, reader_cost: 0.00068, ips: 31.5606 samples/sec | ETA 07:54:00 2022-08-22 23:10:14 [INFO] [TRAIN] epoch: 38, iter: 47850/160000, loss: 0.6686, lr: 0.000849, batch_cost: 0.2696, reader_cost: 0.00139, ips: 29.6770 samples/sec | ETA 08:23:52 2022-08-22 23:10:27 [INFO] [TRAIN] epoch: 38, iter: 47900/160000, loss: 0.6551, lr: 0.000849, batch_cost: 0.2526, reader_cost: 0.00513, ips: 31.6712 samples/sec | ETA 07:51:55 2022-08-22 23:10:41 [INFO] [TRAIN] epoch: 38, iter: 47950/160000, loss: 0.7364, lr: 0.000848, batch_cost: 0.2800, reader_cost: 0.00134, ips: 28.5713 samples/sec | ETA 08:42:54 2022-08-22 23:10:57 [INFO] [TRAIN] epoch: 39, iter: 48000/160000, loss: 0.6990, lr: 0.000848, batch_cost: 0.3133, reader_cost: 0.05063, ips: 25.5323 samples/sec | ETA 09:44:52 2022-08-22 23:10:57 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 184s - batch_cost: 0.1835 - reader cost: 0.0012 2022-08-22 23:14:00 [INFO] [EVAL] #Images: 2000 mIoU: 0.3285 Acc: 0.7546 Kappa: 0.7360 Dice: 0.4585 2022-08-22 23:14:00 [INFO] [EVAL] Class IoU: [0.6753 0.7654 0.9302 0.7168 0.6764 0.7444 0.7709 0.734 0.5087 0.6207 0.455 0.5314 0.6904 0.3093 0.2412 0.4036 0.508 0.3985 0.5761 0.3837 0.7226 0.442 0.6104 0.4586 0.3521 0.3504 0.4274 0.4049 0.3822 0.2643 0.1659 0.4787 0.2689 0.3386 0.2851 0.3645 0.4125 0.5336 0.2496 0.3468 0.1217 0.0889 0.3568 0.2436 0.3442 0.2349 0.2408 0.4791 0.5959 0.4775 0.5203 0.2818 0.1799 0.1883 0.763 0.4441 0.8282 0.3701 0.5428 0.2437 0.0856 0.3924 0.2484 0.131 0.4244 0.6891 0.2235 0.4087 0.0828 0.3057 0.4253 0.4759 0.3594 0.2536 0.4229 0.3489 0.3918 0.2763 0.2746 0.205 0.6577 0.3725 0.2666 0.0493 0.2538 0.5448 0.084 0.0663 0.276 0.4959 0.3699 0.0489 0.1842 0.0521 0. 0.0056 0.0224 0.1607 0.2593 0.3209 0.1179 0.0262 0.1501 0.0956 0.1621 0.3905 0.116 0.4954 0.0839 0.1536 0.0923 0.4787 0.1436 0.4815 0.5963 0.0132 0.3419 0.5599 0.0658 0.1728 0.4599 0.0013 0.1714 0.149 0.2538 0.3256 0.4584 0.4376 0.3386 0.1996 0.5078 0.0014 0.3698 0.2616 0.1166 0.1214 0.1233 0.0318 0.1212 0.3385 0.3766 0.0184 0.3197 0.3089 0.1559 0. 
0.3914 0.0125 0.0929 0.1069] 2022-08-22 23:14:00 [INFO] [EVAL] Class Precision: [0.7764 0.8704 0.9648 0.8079 0.7529 0.8654 0.8723 0.7729 0.6355 0.7767 0.7059 0.694 0.7732 0.4824 0.5646 0.5425 0.6512 0.7387 0.7641 0.6061 0.806 0.6205 0.7748 0.5711 0.5297 0.4002 0.535 0.7666 0.6986 0.3417 0.4516 0.5961 0.5022 0.4516 0.5116 0.4735 0.6348 0.6558 0.4278 0.5116 0.3061 0.2879 0.588 0.5909 0.4988 0.4479 0.447 0.7269 0.6692 0.5682 0.7091 0.3492 0.3277 0.7278 0.7922 0.6266 0.8725 0.6537 0.7099 0.3947 0.1267 0.5648 0.4864 0.7621 0.5485 0.8064 0.3367 0.508 0.4896 0.5099 0.6128 0.5862 0.5518 0.2882 0.7016 0.5615 0.4257 0.5493 0.7094 0.3362 0.7527 0.6477 0.7513 0.0789 0.4472 0.7156 0.3498 0.3959 0.524 0.7362 0.4643 0.0873 0.3952 0.2978 0. 0.0196 0.8263 0.3864 0.5472 0.5636 0.2864 0.0393 0.6497 0.707 0.666 0.4563 0.3033 0.7654 0.2259 0.4046 0.2556 0.7273 0.6166 0.7248 0.5982 0.1453 0.6821 0.6283 0.1189 0.5066 0.6133 0.1191 0.8014 0.6608 0.6432 0.6611 0.7706 0.5826 0.5908 0.5824 0.6673 0.0207 0.649 0.4776 0.4864 0.332 0.2856 0.068 0.3593 0.6824 0.4413 0.0307 0.7176 0.5678 0.3886 0. 0.8088 0.5289 0.5598 0.8256] 2022-08-22 23:14:00 [INFO] [EVAL] Class Recall: [0.8385 0.8639 0.9629 0.8642 0.8694 0.8418 0.869 0.9358 0.7183 0.7554 0.5615 0.694 0.8657 0.463 0.2963 0.6118 0.6978 0.4639 0.7007 0.5111 0.8747 0.6058 0.742 0.6994 0.5122 0.7378 0.6801 0.4618 0.4577 0.5384 0.2077 0.7086 0.3666 0.575 0.3918 0.6128 0.5409 0.7413 0.3747 0.5184 0.168 0.114 0.4758 0.2931 0.5262 0.3307 0.3428 0.5843 0.8447 0.7495 0.6615 0.5935 0.2851 0.2026 0.954 0.6039 0.9422 0.4604 0.6975 0.3892 0.209 0.5625 0.3367 0.1365 0.6523 0.8257 0.3992 0.6764 0.0906 0.4329 0.5816 0.7167 0.5076 0.6788 0.5157 0.4796 0.831 0.3573 0.3095 0.3445 0.8389 0.4671 0.2924 0.1159 0.3698 0.6954 0.0996 0.0738 0.3685 0.6031 0.6455 0.1001 0.2565 0.0594 0. 0.0078 0.0225 0.2157 0.3301 0.427 0.1669 0.0727 0.1633 0.0996 0.1765 0.7304 0.1582 0.584 0.1178 0.1984 0.1263 0.5835 0.1577 0.5893 0.9946 0.0143 0.4067 0.8372 0.1284 0.2078 0.6477 0.0013 0.179 0.1614 0.2953 0.3908 0.5309 0.6375 0.4424 0.233 0.6799 0.0016 0.4623 0.3664 0.1329 0.1605 0.1783 0.0563 0.1546 0.4018 0.7196 0.044 0.3657 0.4038 0.2066 0. 0.4312 0.0126 0.1002 0.1094] 2022-08-22 23:14:01 [INFO] [EVAL] The model with the best validation mIoU (0.3298) was saved at iter 42000. 
2022-08-22 23:14:12 [INFO] [TRAIN] epoch: 39, iter: 48050/160000, loss: 0.6694, lr: 0.000848, batch_cost: 0.2208, reader_cost: 0.00322, ips: 36.2398 samples/sec | ETA 06:51:53
2022-08-22 23:14:22 [INFO] [TRAIN] epoch: 39, iter: 48100/160000, loss: 0.6949, lr: 0.000847, batch_cost: 0.2052, reader_cost: 0.00496, ips: 38.9831 samples/sec | ETA 06:22:43
2022-08-22 23:14:31 [INFO] [TRAIN] epoch: 39, iter: 48150/160000, loss: 0.6826, lr: 0.000847, batch_cost: 0.1876, reader_cost: 0.00045, ips: 42.6397 samples/sec | ETA 05:49:45
2022-08-22 23:14:42 [INFO] [TRAIN] epoch: 39, iter: 48200/160000, loss: 0.6874, lr: 0.000846, batch_cost: 0.2102, reader_cost: 0.00037, ips: 38.0560 samples/sec | ETA 06:31:42
2022-08-22 23:14:53 [INFO] [TRAIN] epoch: 39, iter: 48250/160000, loss: 0.6470, lr: 0.000846, batch_cost: 0.2227, reader_cost: 0.00205, ips: 35.9163 samples/sec | ETA 06:54:51
2022-08-22 23:15:05 [INFO] [TRAIN] epoch: 39, iter: 48300/160000, loss: 0.7520, lr: 0.000846, batch_cost: 0.2392, reader_cost: 0.00068, ips: 33.4442 samples/sec | ETA 07:25:19
2022-08-22 23:15:16 [INFO] [TRAIN] epoch: 39, iter: 48350/160000, loss: 0.7339, lr: 0.000845, batch_cost: 0.2147, reader_cost: 0.00052, ips: 37.2638 samples/sec | ETA 06:39:29
2022-08-22 23:15:26 [INFO] [TRAIN] epoch: 39, iter: 48400/160000, loss: 0.6997, lr: 0.000845, batch_cost: 0.2049, reader_cost: 0.00107, ips: 39.0505 samples/sec | ETA 06:21:02
2022-08-22 23:15:36 [INFO] [TRAIN] epoch: 39, iter: 48450/160000, loss: 0.7198, lr: 0.000845, batch_cost: 0.2120, reader_cost: 0.00100, ips: 37.7292 samples/sec | ETA 06:34:12
2022-08-22 23:15:48 [INFO] [TRAIN] epoch: 39, iter: 48500/160000, loss: 0.7011, lr: 0.000844, batch_cost: 0.2305, reader_cost: 0.00107, ips: 34.7046 samples/sec | ETA 07:08:22
2022-08-22 23:15:58 [INFO] [TRAIN] epoch: 39, iter: 48550/160000, loss: 0.7073, lr: 0.000844, batch_cost: 0.2095, reader_cost: 0.00068, ips: 38.1943 samples/sec | ETA 06:29:03
2022-08-22 23:16:09 [INFO] [TRAIN] epoch: 39, iter: 48600/160000, loss: 0.7279, lr: 0.000843, batch_cost: 0.2104, reader_cost: 0.00151, ips: 38.0236 samples/sec | ETA 06:30:38
2022-08-22 23:16:21 [INFO] [TRAIN] epoch: 39, iter: 48650/160000, loss: 0.6895, lr: 0.000843, batch_cost: 0.2329, reader_cost: 0.00071, ips: 34.3430 samples/sec | ETA 07:12:18
2022-08-22 23:16:32 [INFO] [TRAIN] epoch: 39, iter: 48700/160000, loss: 0.6816, lr: 0.000843, batch_cost: 0.2336, reader_cost: 0.00072, ips: 34.2445 samples/sec | ETA 07:13:21
2022-08-22 23:16:47 [INFO] [TRAIN] epoch: 39, iter: 48750/160000, loss: 0.6911, lr: 0.000842, batch_cost: 0.2862, reader_cost: 0.00372, ips: 27.9509 samples/sec | ETA 08:50:41
2022-08-22 23:16:58 [INFO] [TRAIN] epoch: 39, iter: 48800/160000, loss: 0.6917, lr: 0.000842, batch_cost: 0.2268, reader_cost: 0.01259, ips: 35.2708 samples/sec | ETA 07:00:21
2022-08-22 23:17:11 [INFO] [TRAIN] epoch: 39, iter: 48850/160000, loss: 0.6645, lr: 0.000842, batch_cost: 0.2643, reader_cost: 0.01045, ips: 30.2728 samples/sec | ETA 08:09:32
2022-08-22 23:17:24 [INFO] [TRAIN] epoch: 39, iter: 48900/160000, loss: 0.7021, lr: 0.000841, batch_cost: 0.2507, reader_cost: 0.00101, ips: 31.9160 samples/sec | ETA 07:44:08
2022-08-22 23:17:36 [INFO] [TRAIN] epoch: 39, iter: 48950/160000, loss: 0.6938, lr: 0.000841, batch_cost: 0.2503, reader_cost: 0.00567, ips: 31.9574 samples/sec | ETA 07:43:19
2022-08-22 23:17:47 [INFO] [TRAIN] epoch: 39, iter: 49000/160000, loss: 0.7055, lr: 0.000840, batch_cost: 0.2107, reader_cost: 0.00049, ips: 37.9746 samples/sec | ETA 06:29:44
2022-08-22 23:17:47 [INFO] Start
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 197s - batch_cost: 0.1974 - reader cost: 0.0012 2022-08-22 23:21:04 [INFO] [EVAL] #Images: 2000 mIoU: 0.3325 Acc: 0.7571 Kappa: 0.7387 Dice: 0.4639 2022-08-22 23:21:04 [INFO] [EVAL] Class IoU: [0.6725 0.7646 0.9308 0.7223 0.6736 0.7512 0.77 0.7615 0.5161 0.6229 0.4564 0.5371 0.6811 0.3124 0.2829 0.4075 0.5156 0.446 0.5888 0.4017 0.7228 0.4821 0.6063 0.4963 0.3225 0.34 0.4167 0.4147 0.3969 0.246 0.227 0.4674 0.2591 0.3212 0.2919 0.3518 0.3967 0.5311 0.2522 0.3574 0.1899 0.1178 0.3299 0.2431 0.313 0.2079 0.3208 0.4781 0.608 0.5146 0.5083 0.3713 0.2123 0.2861 0.7425 0.4304 0.8384 0.2987 0.5301 0.3008 0.0683 0.2666 0.2735 0.2099 0.4296 0.6933 0.256 0.4036 0.1154 0.3269 0.3639 0.4651 0.3709 0.2266 0.4014 0.3226 0.4489 0.3054 0.1517 0.1553 0.5861 0.3663 0.3257 0.0402 0.3167 0.5426 0.0512 0.0696 0.2227 0.4741 0.3917 0.1433 0.1898 0.1268 0.0001 0.0131 0.0213 0.1554 0.253 0.3932 0.0664 0.0077 0.134 0.0532 0.1583 0.3647 0.0957 0.471 0.1392 0.263 0.0277 0.2565 0.1532 0.5142 0.6485 0.0081 0.1819 0.7086 0.1418 0.4022 0.428 0.0019 0.1968 0.1214 0.4106 0.3184 0.414 0.3629 0.3209 0.273 0.4404 0.003 0.3446 0.1981 0.1141 0.1342 0.1064 0.0299 0.1705 0.3528 0.3587 0.003 0.3761 0.2038 0.2242 0. 0.3866 0.0145 0.1337 0.1411] 2022-08-22 23:21:04 [INFO] [EVAL] Class Precision: [0.7858 0.8348 0.9647 0.8296 0.7727 0.8722 0.8718 0.8337 0.6637 0.7557 0.6425 0.7099 0.7457 0.4918 0.4723 0.6329 0.6371 0.6971 0.7749 0.5731 0.8042 0.6771 0.7737 0.639 0.4984 0.5714 0.5509 0.5652 0.6038 0.3655 0.48 0.7048 0.4184 0.4003 0.5311 0.4639 0.6181 0.7342 0.3754 0.5617 0.386 0.3226 0.5233 0.488 0.4559 0.4235 0.5182 0.6631 0.7118 0.6573 0.7019 0.5061 0.3726 0.6143 0.7842 0.5688 0.8783 0.6846 0.6315 0.5026 0.1316 0.4022 0.4425 0.6177 0.5257 0.8369 0.4764 0.5127 0.3733 0.5411 0.5535 0.6333 0.562 0.2792 0.6226 0.5869 0.6551 0.4623 0.1767 0.3354 0.6452 0.6884 0.7492 0.0988 0.4032 0.6939 0.4673 0.4057 0.3103 0.6795 0.4987 0.3549 0.4797 0.2854 0.0008 0.0573 0.7209 0.3919 0.5742 0.6 0.3179 0.015 0.7562 0.6796 0.7128 0.4298 0.2954 0.8072 0.2663 0.4942 0.2193 0.4559 0.412 0.6877 0.6512 0.1328 0.773 0.7804 0.1964 0.5983 0.5842 0.3187 0.7768 0.6855 0.6513 0.6448 0.91 0.4291 0.8394 0.5712 0.5566 0.206 0.5288 0.7106 0.5205 0.2716 0.4107 0.0537 0.3348 0.6909 0.4086 0.0071 0.6715 0.5192 0.6744 0. 0.7845 0.5902 0.4969 0.5733] 2022-08-22 23:21:04 [INFO] [EVAL] Class Recall: [0.8233 0.9009 0.9636 0.8481 0.8401 0.844 0.8683 0.8979 0.6988 0.7799 0.6118 0.6881 0.8872 0.4613 0.4136 0.5337 0.7301 0.5532 0.7103 0.5732 0.8772 0.6261 0.7369 0.6897 0.4774 0.4563 0.6309 0.6089 0.5367 0.4295 0.3011 0.5811 0.4049 0.6192 0.3933 0.5929 0.5255 0.6575 0.4346 0.4957 0.2722 0.1566 0.4716 0.3264 0.4997 0.29 0.4571 0.6314 0.8065 0.7034 0.6482 0.5824 0.3304 0.3488 0.9331 0.639 0.9486 0.3463 0.7675 0.4283 0.1244 0.4416 0.4173 0.2412 0.7015 0.8016 0.3562 0.6548 0.1431 0.4522 0.515 0.6365 0.5217 0.5462 0.5305 0.4174 0.5878 0.4736 0.5175 0.2243 0.8649 0.4391 0.3656 0.0634 0.5961 0.7134 0.0543 0.0776 0.4407 0.6106 0.6462 0.1937 0.239 0.1859 0.0002 0.0168 0.0215 0.2048 0.3115 0.5329 0.0774 0.0155 0.14 0.0546 0.1691 0.7064 0.1239 0.5307 0.2257 0.3599 0.0307 0.3697 0.196 0.6709 0.9936 0.0085 0.1922 0.8851 0.3378 0.551 0.6154 0.0019 0.2086 0.1285 0.5264 0.3861 0.4317 0.7017 0.3418 0.3433 0.6785 0.003 0.4972 0.2155 0.1275 0.2096 0.1256 0.0633 0.258 0.4189 0.746 0.0052 0.4608 0.2513 0.2514 0. 
0.4325 0.0146 0.1546 0.1577] 2022-08-22 23:21:05 [INFO] [EVAL] The model with the best validation mIoU (0.3325) was saved at iter 49000. 2022-08-22 23:21:15 [INFO] [TRAIN] epoch: 39, iter: 49050/160000, loss: 0.6603, lr: 0.000840, batch_cost: 0.2069, reader_cost: 0.00341, ips: 38.6695 samples/sec | ETA 06:22:33 2022-08-22 23:21:26 [INFO] [TRAIN] epoch: 39, iter: 49100/160000, loss: 0.6225, lr: 0.000840, batch_cost: 0.2253, reader_cost: 0.00112, ips: 35.5125 samples/sec | ETA 06:56:22 2022-08-22 23:21:38 [INFO] [TRAIN] epoch: 39, iter: 49150/160000, loss: 0.7311, lr: 0.000839, batch_cost: 0.2240, reader_cost: 0.00092, ips: 35.7217 samples/sec | ETA 06:53:45 2022-08-22 23:21:49 [INFO] [TRAIN] epoch: 39, iter: 49200/160000, loss: 0.6620, lr: 0.000839, batch_cost: 0.2274, reader_cost: 0.00287, ips: 35.1870 samples/sec | ETA 06:59:51 2022-08-22 23:21:59 [INFO] [TRAIN] epoch: 39, iter: 49250/160000, loss: 0.6605, lr: 0.000838, batch_cost: 0.2068, reader_cost: 0.00043, ips: 38.6937 samples/sec | ETA 06:21:37 2022-08-22 23:22:12 [INFO] [TRAIN] epoch: 40, iter: 49300/160000, loss: 0.6528, lr: 0.000838, batch_cost: 0.2604, reader_cost: 0.04307, ips: 30.7270 samples/sec | ETA 08:00:21 2022-08-22 23:22:23 [INFO] [TRAIN] epoch: 40, iter: 49350/160000, loss: 0.6913, lr: 0.000838, batch_cost: 0.2152, reader_cost: 0.00057, ips: 37.1788 samples/sec | ETA 06:36:49 2022-08-22 23:22:33 [INFO] [TRAIN] epoch: 40, iter: 49400/160000, loss: 0.6731, lr: 0.000837, batch_cost: 0.2066, reader_cost: 0.00089, ips: 38.7129 samples/sec | ETA 06:20:55 2022-08-22 23:22:44 [INFO] [TRAIN] epoch: 40, iter: 49450/160000, loss: 0.6723, lr: 0.000837, batch_cost: 0.2219, reader_cost: 0.00621, ips: 36.0577 samples/sec | ETA 06:48:47 2022-08-22 23:22:54 [INFO] [TRAIN] epoch: 40, iter: 49500/160000, loss: 0.7261, lr: 0.000837, batch_cost: 0.1823, reader_cost: 0.00067, ips: 43.8767 samples/sec | ETA 05:35:47 2022-08-22 23:23:04 [INFO] [TRAIN] epoch: 40, iter: 49550/160000, loss: 0.7192, lr: 0.000836, batch_cost: 0.2104, reader_cost: 0.00031, ips: 38.0252 samples/sec | ETA 06:27:17 2022-08-22 23:23:15 [INFO] [TRAIN] epoch: 40, iter: 49600/160000, loss: 0.6784, lr: 0.000836, batch_cost: 0.2207, reader_cost: 0.00092, ips: 36.2447 samples/sec | ETA 06:46:07 2022-08-22 23:23:25 [INFO] [TRAIN] epoch: 40, iter: 49650/160000, loss: 0.6637, lr: 0.000835, batch_cost: 0.1971, reader_cost: 0.00080, ips: 40.5982 samples/sec | ETA 06:02:24 2022-08-22 23:23:35 [INFO] [TRAIN] epoch: 40, iter: 49700/160000, loss: 0.6819, lr: 0.000835, batch_cost: 0.2092, reader_cost: 0.00425, ips: 38.2359 samples/sec | ETA 06:24:37 2022-08-22 23:23:47 [INFO] [TRAIN] epoch: 40, iter: 49750/160000, loss: 0.6516, lr: 0.000835, batch_cost: 0.2254, reader_cost: 0.00049, ips: 35.4932 samples/sec | ETA 06:54:09 2022-08-22 23:23:57 [INFO] [TRAIN] epoch: 40, iter: 49800/160000, loss: 0.6973, lr: 0.000834, batch_cost: 0.2036, reader_cost: 0.00045, ips: 39.2969 samples/sec | ETA 06:13:54 2022-08-22 23:24:07 [INFO] [TRAIN] epoch: 40, iter: 49850/160000, loss: 0.6635, lr: 0.000834, batch_cost: 0.1968, reader_cost: 0.00033, ips: 40.6414 samples/sec | ETA 06:01:22 2022-08-22 23:24:17 [INFO] [TRAIN] epoch: 40, iter: 49900/160000, loss: 0.6624, lr: 0.000834, batch_cost: 0.1963, reader_cost: 0.00045, ips: 40.7546 samples/sec | ETA 06:00:12 2022-08-22 23:24:26 [INFO] [TRAIN] epoch: 40, iter: 49950/160000, loss: 0.6865, lr: 0.000833, batch_cost: 0.1927, reader_cost: 0.00468, ips: 41.5143 samples/sec | ETA 05:53:27 2022-08-22 23:24:35 [INFO] [TRAIN] epoch: 40, iter: 50000/160000, loss: 
0.6828, lr: 0.000833, batch_cost: 0.1811, reader_cost: 0.00046, ips: 44.1642 samples/sec | ETA 05:32:05 2022-08-22 23:24:35 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 211s - batch_cost: 0.2106 - reader cost: 8.2049e-04 2022-08-22 23:28:06 [INFO] [EVAL] #Images: 2000 mIoU: 0.3267 Acc: 0.7533 Kappa: 0.7344 Dice: 0.4557 2022-08-22 23:28:06 [INFO] [EVAL] Class IoU: [0.6736 0.7625 0.9288 0.7197 0.6605 0.7531 0.7508 0.7713 0.5043 0.6318 0.4688 0.5227 0.683 0.2616 0.2518 0.4129 0.5009 0.4096 0.5633 0.3977 0.7243 0.4765 0.6006 0.4768 0.3126 0.3165 0.4761 0.3912 0.3846 0.211 0.2172 0.5031 0.2557 0.3597 0.2948 0.3752 0.4112 0.5366 0.2353 0.3633 0.1743 0.1102 0.2902 0.246 0.3161 0.2193 0.2796 0.4925 0.5922 0.5094 0.5192 0.3327 0.2065 0.2661 0.6331 0.4382 0.8663 0.3726 0.4468 0.2616 0.0631 0.2193 0.2561 0.1385 0.4018 0.6387 0.2837 0.3844 0.0831 0.3491 0.4115 0.4297 0.345 0.235 0.403 0.3144 0.3897 0.3163 0.197 0.1068 0.6463 0.3918 0.2751 0.0164 0.3062 0.5645 0.0764 0.0646 0.2121 0.5012 0.4147 0.0244 0.256 0.0751 0.0175 0.0233 0.01 0.1777 0.3271 0.262 0.1712 0.0233 0.219 0.0517 0.1255 0.4399 0.0546 0.483 0.1174 0.0643 0.0949 0.3791 0.1398 0.4392 0.7542 0.0086 0.2797 0.6537 0.1876 0.0783 0.4241 0. 0.1847 0.1471 0.3868 0.2568 0.4448 0.4338 0.229 0.3084 0.473 0.0018 0.3026 0.2157 0.1233 0.1191 0.1035 0.0064 0.1626 0.3726 0.5573 0.0461 0.2992 0.0088 0.2728 0. 0.3217 0.023 0.125 0.1624] 2022-08-22 23:28:06 [INFO] [EVAL] Class Precision: [0.7707 0.8494 0.9574 0.8318 0.7385 0.8551 0.8598 0.8554 0.6418 0.7564 0.699 0.6963 0.7471 0.4561 0.5214 0.6527 0.6463 0.7321 0.6664 0.627 0.819 0.6674 0.8329 0.5762 0.4609 0.4321 0.5968 0.8094 0.6166 0.3312 0.4383 0.6409 0.5511 0.4771 0.4594 0.5433 0.6308 0.7815 0.3626 0.5725 0.2393 0.2849 0.3978 0.5261 0.5076 0.4535 0.4932 0.7189 0.6665 0.6319 0.7055 0.5126 0.3915 0.5389 0.6473 0.5814 0.9314 0.5843 0.6937 0.4586 0.1008 0.3805 0.4916 0.7357 0.4603 0.7077 0.4326 0.4781 0.2002 0.6446 0.5665 0.8084 0.61 0.27 0.6589 0.4689 0.4409 0.519 0.805 0.1903 0.7532 0.6478 0.7896 0.0354 0.4136 0.7138 0.5752 0.3495 0.3171 0.7796 0.6031 0.0273 0.4337 0.3345 0.0773 0.094 0.0603 0.4956 0.4709 0.6059 0.4374 0.0564 0.614 0.5092 0.6582 0.5805 0.2056 0.7749 0.2491 0.2688 0.2243 0.6379 0.4826 0.8617 0.7601 0.1224 0.6687 0.757 0.2652 0.453 0.7032 0. 0.4978 0.6292 0.7093 0.6522 0.7723 0.609 0.6881 0.5207 0.7065 0.0493 0.4755 0.6217 0.5625 0.3391 0.4245 0.0888 0.4125 0.6243 0.7808 0.0589 0.7595 0.4634 0.5871 0. 0.9047 0.6154 0.5204 0.7398] 2022-08-22 23:28:06 [INFO] [EVAL] Class Recall: [0.8424 0.8818 0.9689 0.8422 0.862 0.8633 0.8555 0.8869 0.702 0.7931 0.5874 0.6771 0.8884 0.3802 0.3275 0.5291 0.69 0.4819 0.7845 0.521 0.8623 0.6249 0.6829 0.7341 0.4928 0.542 0.7018 0.4309 0.5055 0.3675 0.301 0.7005 0.323 0.5937 0.4513 0.5481 0.5415 0.6313 0.4012 0.4986 0.3907 0.1523 0.5174 0.3161 0.4559 0.2981 0.3922 0.6099 0.8415 0.7243 0.6629 0.4867 0.304 0.3445 0.9666 0.6402 0.9253 0.507 0.5566 0.3785 0.1444 0.341 0.3484 0.1458 0.7597 0.8676 0.4519 0.6624 0.1245 0.4323 0.6006 0.4784 0.4427 0.6443 0.5092 0.4883 0.7705 0.4474 0.2069 0.1956 0.8199 0.4978 0.2969 0.0296 0.5413 0.7296 0.081 0.0734 0.3907 0.5839 0.5704 0.1853 0.3846 0.0883 0.0221 0.03 0.0118 0.2169 0.5172 0.3158 0.2195 0.0381 0.2539 0.0545 0.1342 0.645 0.0693 0.5619 0.1817 0.078 0.1412 0.4831 0.1645 0.4725 0.9899 0.0092 0.3247 0.8273 0.3906 0.0864 0.5166 0. 0.227 0.161 0.4596 0.2976 0.5119 0.6012 0.2555 0.4306 0.5886 0.0018 0.4543 0.2483 0.1364 0.1551 0.1204 0.0068 0.2116 0.4803 0.6607 0.1755 0.3305 0.0089 0.3375 0. 
0.333 0.0233 0.1413 0.1722] 2022-08-22 23:28:06 [INFO] [EVAL] The model with the best validation mIoU (0.3325) was saved at iter 49000. 2022-08-22 23:28:18 [INFO] [TRAIN] epoch: 40, iter: 50050/160000, loss: 0.6926, lr: 0.000832, batch_cost: 0.2281, reader_cost: 0.00311, ips: 35.0727 samples/sec | ETA 06:57:59 2022-08-22 23:28:29 [INFO] [TRAIN] epoch: 40, iter: 50100/160000, loss: 0.6640, lr: 0.000832, batch_cost: 0.2211, reader_cost: 0.00121, ips: 36.1857 samples/sec | ETA 06:44:56 2022-08-22 23:28:41 [INFO] [TRAIN] epoch: 40, iter: 50150/160000, loss: 0.6821, lr: 0.000832, batch_cost: 0.2391, reader_cost: 0.00100, ips: 33.4559 samples/sec | ETA 07:17:47 2022-08-22 23:28:50 [INFO] [TRAIN] epoch: 40, iter: 50200/160000, loss: 0.6934, lr: 0.000831, batch_cost: 0.1887, reader_cost: 0.00048, ips: 42.3985 samples/sec | ETA 05:45:17 2022-08-22 23:29:00 [INFO] [TRAIN] epoch: 40, iter: 50250/160000, loss: 0.6460, lr: 0.000831, batch_cost: 0.2022, reader_cost: 0.00161, ips: 39.5718 samples/sec | ETA 06:09:47 2022-08-22 23:29:11 [INFO] [TRAIN] epoch: 40, iter: 50300/160000, loss: 0.7516, lr: 0.000831, batch_cost: 0.2078, reader_cost: 0.00125, ips: 38.4904 samples/sec | ETA 06:20:00 2022-08-22 23:29:21 [INFO] [TRAIN] epoch: 40, iter: 50350/160000, loss: 0.7139, lr: 0.000830, batch_cost: 0.2168, reader_cost: 0.00034, ips: 36.9043 samples/sec | ETA 06:36:09 2022-08-22 23:29:32 [INFO] [TRAIN] epoch: 40, iter: 50400/160000, loss: 0.6976, lr: 0.000830, batch_cost: 0.2054, reader_cost: 0.00047, ips: 38.9444 samples/sec | ETA 06:15:14 2022-08-22 23:29:43 [INFO] [TRAIN] epoch: 40, iter: 50450/160000, loss: 0.6825, lr: 0.000829, batch_cost: 0.2350, reader_cost: 0.00068, ips: 34.0407 samples/sec | ETA 07:09:05 2022-08-22 23:29:54 [INFO] [TRAIN] epoch: 40, iter: 50500/160000, loss: 0.6600, lr: 0.000829, batch_cost: 0.2005, reader_cost: 0.00064, ips: 39.8987 samples/sec | ETA 06:05:55 2022-08-22 23:30:08 [INFO] [TRAIN] epoch: 41, iter: 50550/160000, loss: 0.6369, lr: 0.000829, batch_cost: 0.2842, reader_cost: 0.07626, ips: 28.1480 samples/sec | ETA 08:38:26 2022-08-22 23:30:18 [INFO] [TRAIN] epoch: 41, iter: 50600/160000, loss: 0.6780, lr: 0.000828, batch_cost: 0.2098, reader_cost: 0.01301, ips: 38.1236 samples/sec | ETA 06:22:36 2022-08-22 23:30:29 [INFO] [TRAIN] epoch: 41, iter: 50650/160000, loss: 0.6512, lr: 0.000828, batch_cost: 0.2145, reader_cost: 0.00041, ips: 37.3030 samples/sec | ETA 06:30:51 2022-08-22 23:30:40 [INFO] [TRAIN] epoch: 41, iter: 50700/160000, loss: 0.6956, lr: 0.000828, batch_cost: 0.2229, reader_cost: 0.00734, ips: 35.8926 samples/sec | ETA 06:46:01 2022-08-22 23:30:52 [INFO] [TRAIN] epoch: 41, iter: 50750/160000, loss: 0.6397, lr: 0.000827, batch_cost: 0.2314, reader_cost: 0.00071, ips: 34.5743 samples/sec | ETA 07:01:18 2022-08-22 23:31:02 [INFO] [TRAIN] epoch: 41, iter: 50800/160000, loss: 0.6788, lr: 0.000827, batch_cost: 0.1974, reader_cost: 0.00543, ips: 40.5310 samples/sec | ETA 05:59:13 2022-08-22 23:31:10 [INFO] [TRAIN] epoch: 41, iter: 50850/160000, loss: 0.6903, lr: 0.000826, batch_cost: 0.1754, reader_cost: 0.00159, ips: 45.6000 samples/sec | ETA 05:19:09 2022-08-22 23:31:19 [INFO] [TRAIN] epoch: 41, iter: 50900/160000, loss: 0.6571, lr: 0.000826, batch_cost: 0.1724, reader_cost: 0.00040, ips: 46.3991 samples/sec | ETA 05:13:30 2022-08-22 23:31:29 [INFO] [TRAIN] epoch: 41, iter: 50950/160000, loss: 0.6475, lr: 0.000826, batch_cost: 0.1994, reader_cost: 0.00075, ips: 40.1223 samples/sec | ETA 06:02:23 2022-08-22 23:31:37 [INFO] [TRAIN] epoch: 41, iter: 51000/160000, loss: 
0.6821, lr: 0.000825, batch_cost: 0.1713, reader_cost: 0.00032, ips: 46.7147 samples/sec | ETA 05:11:06 2022-08-22 23:31:37 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 191s - batch_cost: 0.1910 - reader cost: 6.7881e-04 2022-08-22 23:34:49 [INFO] [EVAL] #Images: 2000 mIoU: 0.3252 Acc: 0.7551 Kappa: 0.7363 Dice: 0.4548 2022-08-22 23:34:49 [INFO] [EVAL] Class IoU: [0.6748 0.7721 0.9272 0.711 0.6576 0.7492 0.7635 0.7575 0.5129 0.6193 0.4779 0.5327 0.6913 0.2768 0.2894 0.4139 0.5108 0.3781 0.5756 0.3853 0.7178 0.4781 0.614 0.4747 0.3283 0.3082 0.4788 0.4457 0.3944 0.2711 0.2461 0.474 0.2587 0.3105 0.2381 0.3259 0.4012 0.5206 0.2478 0.3468 0.1759 0.0863 0.3288 0.2566 0.308 0.2314 0.3068 0.4804 0.5654 0.4785 0.4938 0.2439 0.1218 0.2587 0.6039 0.4482 0.8184 0.3374 0.4368 0.268 0.1372 0.3284 0.3005 0.1513 0.4228 0.6656 0.2277 0.4209 0.1025 0.3394 0.3956 0.5257 0.3949 0.2337 0.4106 0.346 0.4685 0.2931 0.2848 0.131 0.5277 0.3821 0.2708 0.0132 0.0652 0.558 0.0598 0.0781 0.3163 0.4305 0.3667 0.0455 0.2307 0.0961 0.0015 0.0216 0.002 0.1159 0.2823 0.2968 0.1011 0.0073 0.1989 0.0529 0.1486 0.3621 0.059 0.5538 0.1147 0.2655 0.0878 0.1483 0.1317 0.3782 0.713 0.0136 0.3589 0.615 0.172 0.1561 0.4806 0.0079 0.1905 0.0493 0.3573 0.3137 0.4083 0.4215 0.2547 0.3033 0.544 0.0079 0.3483 0.2267 0.1725 0.1332 0.1129 0.007 0.143 0.3301 0.4866 0.0098 0.4023 0.121 0.1854 0. 0.3107 0.0297 0.1309 0.1078] 2022-08-22 23:34:49 [INFO] [EVAL] Class Precision: [0.7779 0.8396 0.9652 0.8152 0.7379 0.8661 0.8737 0.8184 0.6538 0.7542 0.7086 0.6565 0.7851 0.5167 0.487 0.604 0.6507 0.7461 0.7291 0.6451 0.7891 0.6492 0.77 0.6174 0.4515 0.4448 0.6263 0.6883 0.6581 0.3557 0.3664 0.6281 0.467 0.4814 0.4403 0.5528 0.6031 0.7376 0.4482 0.6234 0.3599 0.2716 0.5526 0.5424 0.4728 0.5074 0.5701 0.67 0.7087 0.5359 0.6845 0.2864 0.2642 0.6705 0.6168 0.6311 0.8538 0.6354 0.6239 0.4285 0.189 0.4834 0.4167 0.6958 0.5069 0.7593 0.3117 0.581 0.2526 0.5869 0.6426 0.695 0.6262 0.3357 0.5638 0.5205 0.5322 0.5924 0.7124 0.3783 0.597 0.7143 0.7814 0.0523 0.2359 0.6889 0.3129 0.3652 0.8703 0.6317 0.4893 0.0581 0.4087 0.2681 0.0057 0.0664 0.0231 0.3398 0.4154 0.5896 0.3096 0.0112 0.6156 0.4484 0.5818 0.4502 0.3438 0.7669 0.1835 0.5053 0.2588 0.5111 0.5881 0.6516 0.7167 0.1558 0.6492 0.7164 0.3598 0.4316 0.6539 0.5499 0.5298 0.6059 0.7571 0.6505 0.8845 0.6403 0.7095 0.6074 0.6886 0.1563 0.6241 0.671 0.4655 0.324 0.3188 0.0441 0.4117 0.6846 0.6567 0.0139 0.5983 0.349 0.6211 0. 
0.9258 0.4025 0.6157 0.838 ] 2022-08-22 23:34:49 [INFO] [EVAL] Class Recall: [0.8358 0.9057 0.9593 0.8477 0.858 0.8473 0.8582 0.9106 0.7042 0.7759 0.5948 0.7386 0.8527 0.3736 0.4164 0.5682 0.7038 0.434 0.7321 0.4889 0.8881 0.6446 0.7519 0.6725 0.5459 0.501 0.6703 0.5585 0.496 0.5327 0.4284 0.6589 0.3672 0.4665 0.3416 0.4426 0.5452 0.639 0.3565 0.4387 0.2559 0.1123 0.448 0.3275 0.4691 0.2984 0.3992 0.6293 0.7365 0.817 0.6393 0.6218 0.1842 0.2964 0.9666 0.6072 0.9518 0.4185 0.593 0.4172 0.3334 0.506 0.5187 0.162 0.7181 0.8436 0.458 0.6043 0.1471 0.446 0.5072 0.6834 0.5167 0.4349 0.6017 0.508 0.7965 0.3672 0.3218 0.1669 0.8197 0.451 0.293 0.0174 0.0826 0.7459 0.0688 0.0904 0.332 0.5747 0.5941 0.1733 0.3463 0.1303 0.002 0.031 0.0022 0.1496 0.4684 0.3741 0.1305 0.021 0.2271 0.0566 0.1664 0.6493 0.0665 0.6659 0.2344 0.3588 0.1172 0.1728 0.1451 0.474 0.9927 0.0147 0.4453 0.8129 0.2478 0.1965 0.6446 0.008 0.2293 0.051 0.4036 0.3773 0.4313 0.5522 0.2844 0.3773 0.7215 0.0082 0.4407 0.2551 0.2151 0.1844 0.1488 0.0082 0.1797 0.3893 0.6525 0.0327 0.5512 0.1563 0.2091 0. 0.3186 0.0311 0.1425 0.1101] 2022-08-22 23:34:49 [INFO] [EVAL] The model with the best validation mIoU (0.3325) was saved at iter 49000. 2022-08-22 23:35:03 [INFO] [TRAIN] epoch: 41, iter: 51050/160000, loss: 0.6657, lr: 0.000825, batch_cost: 0.2825, reader_cost: 0.00566, ips: 28.3181 samples/sec | ETA 08:32:58 2022-08-22 23:35:16 [INFO] [TRAIN] epoch: 41, iter: 51100/160000, loss: 0.6912, lr: 0.000824, batch_cost: 0.2502, reader_cost: 0.00736, ips: 31.9752 samples/sec | ETA 07:34:06 2022-08-22 23:35:28 [INFO] [TRAIN] epoch: 41, iter: 51150/160000, loss: 0.6456, lr: 0.000824, batch_cost: 0.2403, reader_cost: 0.00852, ips: 33.2922 samples/sec | ETA 07:15:56 2022-08-22 23:35:39 [INFO] [TRAIN] epoch: 41, iter: 51200/160000, loss: 0.6844, lr: 0.000824, batch_cost: 0.2250, reader_cost: 0.00079, ips: 35.5513 samples/sec | ETA 06:48:02 2022-08-22 23:35:51 [INFO] [TRAIN] epoch: 41, iter: 51250/160000, loss: 0.7018, lr: 0.000823, batch_cost: 0.2394, reader_cost: 0.00088, ips: 33.4125 samples/sec | ETA 07:13:58 2022-08-22 23:36:04 [INFO] [TRAIN] epoch: 41, iter: 51300/160000, loss: 0.7002, lr: 0.000823, batch_cost: 0.2676, reader_cost: 0.00065, ips: 29.9001 samples/sec | ETA 08:04:43 2022-08-22 23:36:16 [INFO] [TRAIN] epoch: 41, iter: 51350/160000, loss: 0.7595, lr: 0.000823, batch_cost: 0.2258, reader_cost: 0.00132, ips: 35.4307 samples/sec | ETA 06:48:52 2022-08-22 23:36:27 [INFO] [TRAIN] epoch: 41, iter: 51400/160000, loss: 0.7375, lr: 0.000822, batch_cost: 0.2192, reader_cost: 0.00389, ips: 36.4909 samples/sec | ETA 06:36:48 2022-08-22 23:36:39 [INFO] [TRAIN] epoch: 41, iter: 51450/160000, loss: 0.7333, lr: 0.000822, batch_cost: 0.2414, reader_cost: 0.00034, ips: 33.1366 samples/sec | ETA 07:16:46 2022-08-22 23:36:49 [INFO] [TRAIN] epoch: 41, iter: 51500/160000, loss: 0.6860, lr: 0.000821, batch_cost: 0.2149, reader_cost: 0.00072, ips: 37.2261 samples/sec | ETA 06:28:36 2022-08-22 23:37:01 [INFO] [TRAIN] epoch: 41, iter: 51550/160000, loss: 0.6496, lr: 0.000821, batch_cost: 0.2369, reader_cost: 0.00265, ips: 33.7671 samples/sec | ETA 07:08:13 2022-08-22 23:37:13 [INFO] [TRAIN] epoch: 41, iter: 51600/160000, loss: 0.7379, lr: 0.000821, batch_cost: 0.2275, reader_cost: 0.00053, ips: 35.1723 samples/sec | ETA 06:50:55 2022-08-22 23:37:23 [INFO] [TRAIN] epoch: 41, iter: 51650/160000, loss: 0.6965, lr: 0.000820, batch_cost: 0.2032, reader_cost: 0.00487, ips: 39.3659 samples/sec | ETA 06:06:59 2022-08-22 23:37:33 [INFO] [TRAIN] epoch: 41, iter: 
51700/160000, loss: 0.6861, lr: 0.000820, batch_cost: 0.2039, reader_cost: 0.00262, ips: 39.2291 samples/sec | ETA 06:08:05 2022-08-22 23:37:45 [INFO] [TRAIN] epoch: 41, iter: 51750/160000, loss: 0.6817, lr: 0.000820, batch_cost: 0.2388, reader_cost: 0.00037, ips: 33.5045 samples/sec | ETA 07:10:47 2022-08-22 23:37:59 [INFO] [TRAIN] epoch: 42, iter: 51800/160000, loss: 0.6590, lr: 0.000819, batch_cost: 0.2873, reader_cost: 0.05420, ips: 27.8415 samples/sec | ETA 08:38:10 2022-08-22 23:38:09 [INFO] [TRAIN] epoch: 42, iter: 51850/160000, loss: 0.6473, lr: 0.000819, batch_cost: 0.1959, reader_cost: 0.00078, ips: 40.8426 samples/sec | ETA 05:53:03 2022-08-22 23:38:17 [INFO] [TRAIN] epoch: 42, iter: 51900/160000, loss: 0.6942, lr: 0.000818, batch_cost: 0.1609, reader_cost: 0.00048, ips: 49.7328 samples/sec | ETA 04:49:48 2022-08-22 23:38:26 [INFO] [TRAIN] epoch: 42, iter: 51950/160000, loss: 0.7054, lr: 0.000818, batch_cost: 0.1858, reader_cost: 0.00148, ips: 43.0497 samples/sec | ETA 05:34:39 2022-08-22 23:38:35 [INFO] [TRAIN] epoch: 42, iter: 52000/160000, loss: 0.6414, lr: 0.000818, batch_cost: 0.1687, reader_cost: 0.00048, ips: 47.4232 samples/sec | ETA 05:03:38 2022-08-22 23:38:35 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 183s - batch_cost: 0.1834 - reader cost: 6.2719e-04 2022-08-22 23:41:38 [INFO] [EVAL] #Images: 2000 mIoU: 0.3312 Acc: 0.7565 Kappa: 0.7379 Dice: 0.4615 2022-08-22 23:41:38 [INFO] [EVAL] Class IoU: [0.6711 0.7717 0.9317 0.7164 0.6685 0.7566 0.7565 0.7676 0.5064 0.6152 0.4848 0.5407 0.6903 0.2536 0.2419 0.4185 0.5049 0.4391 0.5829 0.3929 0.7296 0.508 0.6222 0.4765 0.3095 0.3506 0.5203 0.431 0.3811 0.212 0.1932 0.4835 0.2345 0.3688 0.2521 0.3216 0.4078 0.4875 0.2058 0.3692 0.2146 0.1047 0.3336 0.2465 0.341 0.2434 0.3058 0.5051 0.6124 0.5408 0.4951 0.2629 0.1873 0.243 0.7189 0.4676 0.8502 0.3685 0.4773 0.2856 0.1441 0.4264 0.2787 0.0698 0.4653 0.6552 0.2543 0.3522 0.0546 0.2855 0.4348 0.4854 0.3733 0.2303 0.4405 0.3453 0.4242 0.3022 0.2354 0.1345 0.5535 0.3727 0.3802 0.0157 0.348 0.5396 0.0884 0.0721 0.2387 0.4349 0.4024 0.149 0.235 0.0905 0.0479 0.011 0.0258 0.1409 0.2868 0.305 0.073 0.0233 0.2484 0.1373 0.0959 0.3491 0.0512 0.4996 0.0927 0.2059 0.0475 0.3547 0.141 0.5769 0.6519 0.017 0.387 0.5934 0.1001 0.1712 0.4072 0.0119 0.1933 0.0826 0.3216 0.3155 0.4448 0.4535 0.23 0.2788 0.5219 0.0214 0.2383 0.1769 0.1254 0.1207 0.1035 0.0223 0.1352 0.3823 0.3061 0.0001 0.3973 0.1249 0.2485 0. 
0.3954 0.028 0.1239 0.136 ] 2022-08-22 23:41:38 [INFO] [EVAL] Class Precision: [0.7735 0.8522 0.9668 0.8106 0.7515 0.8733 0.8801 0.8551 0.6025 0.7334 0.7035 0.6697 0.7821 0.4974 0.5427 0.5433 0.5923 0.6539 0.7783 0.6419 0.8159 0.7455 0.7482 0.5712 0.5032 0.5366 0.6589 0.7201 0.6427 0.3172 0.4524 0.5986 0.4227 0.4944 0.5234 0.5051 0.5959 0.6488 0.4413 0.5498 0.3345 0.2957 0.5489 0.56 0.4978 0.4599 0.5621 0.7446 0.7046 0.6863 0.7757 0.3168 0.2985 0.6131 0.7431 0.7242 0.8991 0.6657 0.5628 0.4885 0.178 0.5505 0.4122 0.5818 0.611 0.7381 0.4409 0.5098 0.2056 0.4916 0.5417 0.679 0.6395 0.3063 0.6945 0.5607 0.5242 0.6182 0.7899 0.2518 0.6539 0.6238 0.6819 0.0453 0.493 0.7796 0.4297 0.3081 0.399 0.6948 0.5371 0.3279 0.3426 0.2799 0.1268 0.0438 0.2168 0.3875 0.5093 0.6717 0.3442 0.0339 0.549 0.5957 0.6273 0.4204 0.2405 0.8086 0.2634 0.3736 0.3059 0.5978 0.4137 0.6255 0.6548 0.1711 0.7148 0.6678 0.2226 0.5325 0.4624 0.2055 0.6654 0.526 0.7953 0.586 0.8497 0.6861 0.5799 0.4146 0.561 0.3308 0.4051 0.7998 0.6664 0.3464 0.4021 0.0684 0.5006 0.6412 0.3472 0.0002 0.5755 0.6507 0.6313 0. 0.8441 0.3482 0.6269 0.6894] 2022-08-22 23:41:38 [INFO] [EVAL] Class Recall: [0.8353 0.8909 0.9625 0.8605 0.8583 0.8499 0.8435 0.8823 0.7605 0.7924 0.6093 0.7373 0.8547 0.341 0.3038 0.6456 0.7737 0.5721 0.6989 0.5031 0.8734 0.6146 0.7871 0.7417 0.4457 0.5028 0.7121 0.5177 0.4836 0.39 0.2522 0.7155 0.345 0.5922 0.3272 0.4696 0.5637 0.6623 0.2783 0.5293 0.3743 0.1395 0.4595 0.3057 0.5197 0.3408 0.4015 0.611 0.8239 0.7184 0.5779 0.6073 0.3347 0.287 0.9567 0.5689 0.9399 0.4521 0.7587 0.4075 0.4314 0.6541 0.4624 0.0735 0.6612 0.8537 0.3752 0.5325 0.0692 0.4051 0.6878 0.6299 0.4728 0.4815 0.5464 0.4734 0.6897 0.3716 0.2511 0.224 0.7827 0.4807 0.4621 0.0235 0.5419 0.6368 0.1002 0.0861 0.3727 0.5375 0.6161 0.2145 0.428 0.118 0.0714 0.0145 0.0284 0.1812 0.3962 0.3585 0.0848 0.0694 0.3121 0.1515 0.1017 0.673 0.061 0.5666 0.1251 0.3144 0.0532 0.4658 0.1763 0.8813 0.9933 0.0185 0.4577 0.842 0.154 0.2014 0.7734 0.0124 0.2141 0.0893 0.3507 0.4061 0.4828 0.5722 0.276 0.4598 0.8824 0.0223 0.3666 0.1851 0.1338 0.1563 0.1224 0.032 0.1563 0.4863 0.7213 0.0004 0.5619 0.1339 0.2907 0. 0.4265 0.0295 0.1337 0.1448] 2022-08-22 23:41:39 [INFO] [EVAL] The model with the best validation mIoU (0.3325) was saved at iter 49000. 
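A note on reading the [EVAL] blocks above: the per-class IoU, Precision and Recall arrays are all derived from the same per-class confusion-matrix counts, so any one of them follows from the other two. Assuming the standard definitions (IoU = TP/(TP+FP+FN), Precision = TP/(TP+FP), Recall = TP/(TP+FN)), which the log itself does not spell out, the per-class identity IoU = 1/(1/Precision + 1/Recall - 1) holds, and the per-class Dice is 2*IoU/(1+IoU). A minimal sketch checking class 0 of the evaluation logged just above (the one run after iter 52000):

# Hedged sketch: recover per-class IoU from the Precision/Recall arrays above,
# assuming the standard confusion-matrix definitions (not printed by the log).
def iou_from_pr(precision: float, recall: float) -> float:
    # IoU = TP / (TP + FP + FN) = 1 / (1/P + 1/R - 1)
    return 1.0 / (1.0 / precision + 1.0 / recall - 1.0)

def dice_from_iou(iou: float) -> float:
    # per-class Dice (F1) = 2 * IoU / (1 + IoU)
    return 2.0 * iou / (1.0 + iou)

# Class 0 of the iter-52000 evaluation: Precision 0.7735, Recall 0.8353, reported IoU 0.6711.
print(round(iou_from_pr(0.7735, 0.8353), 4))  # -> 0.6711, matching the logged value
print(round(dice_from_iou(0.6711), 4))        # -> 0.8032, the corresponding per-class Dice

Note that the aggregate Dice printed next to mIoU/Acc/Kappa appears to be a class-average of per-class Dice, so it is not expected to equal 2*mIoU/(1+mIoU) exactly.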
2022-08-22 23:41:52 [INFO] [TRAIN] epoch: 42, iter: 52050/160000, loss: 0.6789, lr: 0.000817, batch_cost: 0.2596, reader_cost: 0.00708, ips: 30.8223 samples/sec | ETA 07:46:58
2022-08-22 23:42:05 [INFO] [TRAIN] epoch: 42, iter: 52100/160000, loss: 0.7122, lr: 0.000817, batch_cost: 0.2599, reader_cost: 0.00920, ips: 30.7869 samples/sec | ETA 07:47:17
2022-08-22 23:42:18 [INFO] [TRAIN] epoch: 42, iter: 52150/160000, loss: 0.7599, lr: 0.000817, batch_cost: 0.2637, reader_cost: 0.00054, ips: 30.3397 samples/sec | ETA 07:53:57
2022-08-22 23:42:30 [INFO] [TRAIN] epoch: 42, iter: 52200/160000, loss: 0.6482, lr: 0.000816, batch_cost: 0.2426, reader_cost: 0.00095, ips: 32.9702 samples/sec | ETA 07:15:56
2022-08-22 23:42:43 [INFO] [TRAIN] epoch: 42, iter: 52250/160000, loss: 0.6621, lr: 0.000816, batch_cost: 0.2560, reader_cost: 0.00053, ips: 31.2497 samples/sec | ETA 07:39:44
2022-08-22 23:42:56 [INFO] [TRAIN] epoch: 42, iter: 52300/160000, loss: 0.6845, lr: 0.000815, batch_cost: 0.2615, reader_cost: 0.00061, ips: 30.5922 samples/sec | ETA 07:49:24
2022-08-22 23:43:10 [INFO] [TRAIN] epoch: 42, iter: 52350/160000, loss: 0.6447, lr: 0.000815, batch_cost: 0.2752, reader_cost: 0.00066, ips: 29.0682 samples/sec | ETA 08:13:46
2022-08-22 23:43:22 [INFO] [TRAIN] epoch: 42, iter: 52400/160000, loss: 0.6730, lr: 0.000815, batch_cost: 0.2502, reader_cost: 0.00123, ips: 31.9735 samples/sec | ETA 07:28:42
2022-08-22 23:43:35 [INFO] [TRAIN] epoch: 42, iter: 52450/160000, loss: 0.6764, lr: 0.000814, batch_cost: 0.2529, reader_cost: 0.00140, ips: 31.6297 samples/sec | ETA 07:33:22
2022-08-22 23:43:47 [INFO] [TRAIN] epoch: 42, iter: 52500/160000, loss: 0.6418, lr: 0.000814, batch_cost: 0.2450, reader_cost: 0.01990, ips: 32.6514 samples/sec | ETA 07:18:58
2022-08-22 23:44:00 [INFO] [TRAIN] epoch: 42, iter: 52550/160000, loss: 0.6891, lr: 0.000814, batch_cost: 0.2591, reader_cost: 0.01263, ips: 30.8726 samples/sec | ETA 07:44:03
2022-08-22 23:44:10 [INFO] [TRAIN] epoch: 42, iter: 52600/160000, loss: 0.6529, lr: 0.000813, batch_cost: 0.2051, reader_cost: 0.00052, ips: 39.0015 samples/sec | ETA 06:07:09
2022-08-22 23:44:21 [INFO] [TRAIN] epoch: 42, iter: 52650/160000, loss: 0.7044, lr: 0.000813, batch_cost: 0.2092, reader_cost: 0.01074, ips: 38.2473 samples/sec | ETA 06:14:13
2022-08-22 23:44:33 [INFO] [TRAIN] epoch: 42, iter: 52700/160000, loss: 0.6547, lr: 0.000812, batch_cost: 0.2424, reader_cost: 0.01106, ips: 33.0053 samples/sec | ETA 07:13:27
2022-08-22 23:44:43 [INFO] [TRAIN] epoch: 42, iter: 52750/160000, loss: 0.6824, lr: 0.000812, batch_cost: 0.1956, reader_cost: 0.00221, ips: 40.9012 samples/sec | ETA 05:49:37
2022-08-22 23:44:53 [INFO] [TRAIN] epoch: 42, iter: 52800/160000, loss: 0.6538, lr: 0.000812, batch_cost: 0.2043, reader_cost: 0.00035, ips: 39.1498 samples/sec | ETA 06:05:05
2022-08-22 23:45:02 [INFO] [TRAIN] epoch: 42, iter: 52850/160000, loss: 0.6836, lr: 0.000811, batch_cost: 0.1821, reader_cost: 0.00050, ips: 43.9248 samples/sec | ETA 05:25:15
2022-08-22 23:45:10 [INFO] [TRAIN] epoch: 42, iter: 52900/160000, loss: 0.6827, lr: 0.000811, batch_cost: 0.1648, reader_cost: 0.00054, ips: 48.5373 samples/sec | ETA 04:54:12
2022-08-22 23:45:20 [INFO] [TRAIN] epoch: 42, iter: 52950/160000, loss: 0.6766, lr: 0.000810, batch_cost: 0.1869, reader_cost: 0.00155, ips: 42.7984 samples/sec | ETA 05:33:30
2022-08-22 23:45:29 [INFO] [TRAIN] epoch: 42, iter: 53000/160000, loss: 0.7031, lr: 0.000810, batch_cost: 0.1895, reader_cost: 0.00039, ips: 42.2270 samples/sec | ETA 05:37:51
2022-08-22 23:45:29 [INFO] Start
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 169s - batch_cost: 0.1690 - reader cost: 9.5164e-04 2022-08-22 23:48:18 [INFO] [EVAL] #Images: 2000 mIoU: 0.3241 Acc: 0.7547 Kappa: 0.7357 Dice: 0.4530 2022-08-22 23:48:18 [INFO] [EVAL] Class IoU: [0.6713 0.7688 0.9309 0.7162 0.6739 0.7497 0.7683 0.7693 0.5114 0.5966 0.4555 0.5474 0.6928 0.3077 0.2223 0.4183 0.4943 0.4004 0.5925 0.4014 0.7343 0.4696 0.6243 0.4844 0.3191 0.3754 0.4804 0.3929 0.3639 0.1957 0.196 0.4773 0.2479 0.3517 0.2724 0.3641 0.4079 0.451 0.2256 0.3561 0.2368 0.1154 0.3454 0.2476 0.2875 0.2086 0.269 0.4447 0.5173 0.5098 0.5058 0.3784 0.1553 0.1594 0.6864 0.4546 0.8149 0.3726 0.4756 0.2658 0.1189 0.3533 0.2955 0.1518 0.4281 0.6289 0.2468 0.3825 0.1112 0.2857 0.3673 0.486 0.366 0.2199 0.4362 0.2985 0.4534 0.265 0.2762 0.1878 0.4752 0.3529 0.2833 0.0147 0.1976 0.5524 0.066 0.0625 0.2741 0.5282 0.3593 0.0335 0.0952 0.1146 0. 0.0037 0.0342 0.1883 0.1696 0.3756 0.0352 0.0117 0.205 0.0676 0.0997 0.3793 0.0585 0.4753 0.1263 0.2395 0.1054 0.2217 0.1474 0.5236 0.7419 0.0112 0.3263 0.6499 0.1363 0.1575 0.4536 0.0014 0.1291 0.153 0.3 0.3185 0.4146 0.4048 0.4794 0.2804 0.5355 0.013 0.2933 0.2179 0.0994 0.1068 0.1017 0.0104 0.1447 0.361 0.267 0.0196 0.3877 0.0436 0.2452 0. 0.3813 0.0477 0.1045 0.111 ] 2022-08-22 23:48:18 [INFO] [EVAL] Class Precision: [0.7648 0.8469 0.9648 0.8277 0.7759 0.8642 0.8806 0.8444 0.6524 0.7595 0.6143 0.7124 0.7779 0.4555 0.5462 0.6114 0.6309 0.6869 0.7599 0.5776 0.8239 0.6948 0.7671 0.601 0.5038 0.4594 0.5907 0.7234 0.6674 0.3159 0.5023 0.6144 0.4168 0.5391 0.4259 0.4497 0.6206 0.7146 0.3858 0.5862 0.3803 0.2772 0.6594 0.612 0.3909 0.4336 0.5621 0.6098 0.6555 0.678 0.6352 0.5615 0.4089 0.7162 0.7033 0.6279 0.8483 0.6526 0.6285 0.3669 0.1778 0.5147 0.4089 0.7117 0.5284 0.7342 0.3239 0.6338 0.3996 0.4365 0.5995 0.5847 0.6481 0.3037 0.7038 0.5521 0.6272 0.4869 0.7791 0.3849 0.5422 0.6941 0.7869 0.0459 0.4613 0.7148 0.6217 0.3742 0.4472 0.7373 0.4653 0.0395 0.3892 0.3126 0. 0.0174 0.5349 0.3698 0.5595 0.5536 0.2268 0.0188 0.572 0.5047 0.916 0.4752 0.277 0.7511 0.3294 0.5403 0.227 0.2624 0.5509 0.7657 0.7466 0.0965 0.6077 0.7289 0.1673 0.4791 0.5736 0.8279 0.8294 0.5454 0.5682 0.6314 0.8441 0.5448 0.8117 0.415 0.6328 0.3478 0.6338 0.7063 0.5627 0.3027 0.4339 0.0443 0.3568 0.6827 0.2983 0.0266 0.6167 0.3631 0.5546 0. 0.9054 0.3555 0.5516 0.7974] 2022-08-22 23:48:18 [INFO] [EVAL] Class Recall: [0.8459 0.893 0.9637 0.8416 0.8369 0.8497 0.8576 0.8964 0.7028 0.7355 0.638 0.7027 0.8635 0.4867 0.2727 0.5698 0.6954 0.4898 0.729 0.5682 0.871 0.5916 0.7704 0.7142 0.4654 0.6724 0.72 0.4624 0.4445 0.3397 0.2432 0.6814 0.3796 0.503 0.4304 0.6567 0.5435 0.5501 0.3521 0.4756 0.3855 0.1651 0.4204 0.2937 0.5207 0.2868 0.3403 0.6215 0.7104 0.6726 0.713 0.5371 0.2002 0.1702 0.9661 0.6223 0.9539 0.4648 0.6616 0.4911 0.2639 0.5298 0.5158 0.1617 0.6928 0.8143 0.5091 0.491 0.1334 0.4527 0.4867 0.7423 0.4568 0.4434 0.5343 0.3939 0.6206 0.3677 0.2997 0.2683 0.7938 0.418 0.3068 0.0212 0.2569 0.7086 0.0687 0.0698 0.4145 0.6507 0.6119 0.1793 0.1119 0.1533 0. 0.0048 0.0352 0.2773 0.1958 0.5388 0.0401 0.0302 0.2422 0.0724 0.1006 0.6526 0.069 0.5642 0.1701 0.3008 0.1644 0.5882 0.1676 0.6235 0.9916 0.0125 0.4134 0.857 0.4243 0.19 0.6844 0.0014 0.1326 0.1754 0.3886 0.3912 0.449 0.6115 0.5394 0.4635 0.7768 0.0133 0.3531 0.2396 0.1077 0.1417 0.1173 0.0134 0.1958 0.4338 0.7177 0.0687 0.5108 0.0472 0.3053 0. 
0.3971 0.0522 0.1142 0.1142] 2022-08-22 23:48:18 [INFO] [EVAL] The model with the best validation mIoU (0.3325) was saved at iter 49000. 2022-08-22 23:48:34 [INFO] [TRAIN] epoch: 43, iter: 53050/160000, loss: 0.7302, lr: 0.000810, batch_cost: 0.3067, reader_cost: 0.04942, ips: 26.0825 samples/sec | ETA 09:06:43 2022-08-22 23:48:47 [INFO] [TRAIN] epoch: 43, iter: 53100/160000, loss: 0.6128, lr: 0.000809, batch_cost: 0.2720, reader_cost: 0.00405, ips: 29.4120 samples/sec | ETA 08:04:36 2022-08-22 23:49:00 [INFO] [TRAIN] epoch: 43, iter: 53150/160000, loss: 0.7070, lr: 0.000809, batch_cost: 0.2508, reader_cost: 0.00784, ips: 31.9027 samples/sec | ETA 07:26:33 2022-08-22 23:49:13 [INFO] [TRAIN] epoch: 43, iter: 53200/160000, loss: 0.6525, lr: 0.000809, batch_cost: 0.2552, reader_cost: 0.00077, ips: 31.3466 samples/sec | ETA 07:34:16 2022-08-22 23:49:26 [INFO] [TRAIN] epoch: 43, iter: 53250/160000, loss: 0.6846, lr: 0.000808, batch_cost: 0.2725, reader_cost: 0.00169, ips: 29.3556 samples/sec | ETA 08:04:51 2022-08-22 23:49:39 [INFO] [TRAIN] epoch: 43, iter: 53300/160000, loss: 0.6928, lr: 0.000808, batch_cost: 0.2549, reader_cost: 0.00083, ips: 31.3809 samples/sec | ETA 07:33:21 2022-08-22 23:49:51 [INFO] [TRAIN] epoch: 43, iter: 53350/160000, loss: 0.6637, lr: 0.000807, batch_cost: 0.2347, reader_cost: 0.00220, ips: 34.0918 samples/sec | ETA 06:57:06 2022-08-22 23:50:04 [INFO] [TRAIN] epoch: 43, iter: 53400/160000, loss: 0.7113, lr: 0.000807, batch_cost: 0.2564, reader_cost: 0.00103, ips: 31.2037 samples/sec | ETA 07:35:30 2022-08-22 23:50:16 [INFO] [TRAIN] epoch: 43, iter: 53450/160000, loss: 0.7207, lr: 0.000807, batch_cost: 0.2454, reader_cost: 0.00107, ips: 32.6023 samples/sec | ETA 07:15:45 2022-08-22 23:50:29 [INFO] [TRAIN] epoch: 43, iter: 53500/160000, loss: 0.6997, lr: 0.000806, batch_cost: 0.2548, reader_cost: 0.00062, ips: 31.4005 samples/sec | ETA 07:32:13 2022-08-22 23:50:42 [INFO] [TRAIN] epoch: 43, iter: 53550/160000, loss: 0.6460, lr: 0.000806, batch_cost: 0.2729, reader_cost: 0.00062, ips: 29.3129 samples/sec | ETA 08:04:12 2022-08-22 23:50:54 [INFO] [TRAIN] epoch: 43, iter: 53600/160000, loss: 0.6340, lr: 0.000806, batch_cost: 0.2405, reader_cost: 0.00178, ips: 33.2651 samples/sec | ETA 07:06:28 2022-08-22 23:51:07 [INFO] [TRAIN] epoch: 43, iter: 53650/160000, loss: 0.6658, lr: 0.000805, batch_cost: 0.2490, reader_cost: 0.00211, ips: 32.1272 samples/sec | ETA 07:21:22 2022-08-22 23:51:19 [INFO] [TRAIN] epoch: 43, iter: 53700/160000, loss: 0.6745, lr: 0.000805, batch_cost: 0.2553, reader_cost: 0.01058, ips: 31.3359 samples/sec | ETA 07:32:18 2022-08-22 23:51:32 [INFO] [TRAIN] epoch: 43, iter: 53750/160000, loss: 0.6563, lr: 0.000804, batch_cost: 0.2531, reader_cost: 0.00785, ips: 31.6074 samples/sec | ETA 07:28:12 2022-08-22 23:51:43 [INFO] [TRAIN] epoch: 43, iter: 53800/160000, loss: 0.6560, lr: 0.000804, batch_cost: 0.2123, reader_cost: 0.00132, ips: 37.6852 samples/sec | ETA 06:15:44 2022-08-22 23:51:51 [INFO] [TRAIN] epoch: 43, iter: 53850/160000, loss: 0.6607, lr: 0.000804, batch_cost: 0.1749, reader_cost: 0.00106, ips: 45.7458 samples/sec | ETA 05:09:23 2022-08-22 23:52:01 [INFO] [TRAIN] epoch: 43, iter: 53900/160000, loss: 0.6604, lr: 0.000803, batch_cost: 0.1964, reader_cost: 0.00033, ips: 40.7347 samples/sec | ETA 05:47:17 2022-08-22 23:52:12 [INFO] [TRAIN] epoch: 43, iter: 53950/160000, loss: 0.6420, lr: 0.000803, batch_cost: 0.2243, reader_cost: 0.00078, ips: 35.6671 samples/sec | ETA 06:36:26 2022-08-22 23:52:23 [INFO] [TRAIN] epoch: 43, iter: 54000/160000, loss: 
0.6706, lr: 0.000803, batch_cost: 0.2087, reader_cost: 0.00062, ips: 38.3337 samples/sec | ETA 06:08:41 2022-08-22 23:52:23 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 179s - batch_cost: 0.1786 - reader cost: 0.0011 2022-08-22 23:55:22 [INFO] [EVAL] #Images: 2000 mIoU: 0.3334 Acc: 0.7538 Kappa: 0.7352 Dice: 0.4664 2022-08-22 23:55:22 [INFO] [EVAL] Class IoU: [0.6712 0.7621 0.9301 0.7114 0.6695 0.7521 0.7581 0.7726 0.5142 0.5882 0.4701 0.5396 0.6833 0.293 0.2537 0.4201 0.4858 0.4158 0.5866 0.4093 0.7384 0.4887 0.6067 0.4853 0.3153 0.26 0.4686 0.4247 0.4065 0.2245 0.2211 0.4411 0.2683 0.3607 0.2501 0.3587 0.4068 0.5187 0.2547 0.3178 0.2255 0.115 0.3516 0.2648 0.34 0.2574 0.2686 0.4624 0.5241 0.5117 0.5009 0.2974 0.1876 0.2734 0.6709 0.4163 0.8541 0.4098 0.4131 0.2594 0.1706 0.3878 0.2326 0.207 0.4231 0.6409 0.2184 0.4133 0.1012 0.312 0.4107 0.5231 0.4126 0.2087 0.4396 0.3322 0.4041 0.2524 0.2329 0.2963 0.6143 0.3514 0.2843 0.0181 0.2655 0.5412 0.075 0.0768 0.2531 0.505 0.3928 0.0286 0.2248 0.0712 0.0399 0.0165 0.0215 0.1906 0.2591 0.3616 0.1283 0.0304 0.219 0.1427 0.1778 0.3419 0.0825 0.4915 0.1288 0.2233 0.0853 0.2408 0.1525 0.5982 0.5636 0.0043 0.2226 0.7009 0.0991 0.3336 0.4459 0.0046 0.1898 0.105 0.3951 0.3256 0.5082 0.3624 0.3139 0.2902 0.5011 0.0238 0.3 0.185 0.2123 0.1211 0.0937 0.0158 0.1449 0.3364 0.4089 0. 0.3658 0.1871 0.285 0. 0.3327 0.0302 0.1178 0.1227] 2022-08-22 23:55:22 [INFO] [EVAL] Class Precision: [0.7855 0.8453 0.9649 0.8172 0.7441 0.8512 0.8708 0.8403 0.6837 0.7952 0.6537 0.7202 0.7628 0.4844 0.5244 0.579 0.6136 0.6762 0.7043 0.5825 0.8305 0.7323 0.7261 0.5966 0.5688 0.4406 0.5884 0.7033 0.5646 0.3404 0.4561 0.605 0.5081 0.4941 0.3649 0.5105 0.6007 0.755 0.4561 0.624 0.3071 0.3493 0.5796 0.4755 0.4974 0.4796 0.4474 0.6159 0.6302 0.6053 0.6056 0.3635 0.3251 0.5996 0.6864 0.5413 0.9238 0.5923 0.746 0.4865 0.2241 0.4729 0.5685 0.6371 0.5593 0.8209 0.2606 0.575 0.3122 0.5856 0.64 0.601 0.5415 0.2853 0.6355 0.643 0.5715 0.7085 0.3122 0.5652 0.7029 0.6525 0.7734 0.049 0.294 0.7228 0.292 0.347 0.4594 0.7675 0.5033 0.0348 0.3812 0.3224 0.0787 0.0455 0.1033 0.3554 0.4273 0.6584 0.4583 0.0361 0.681 0.6979 0.9318 0.4083 0.2505 0.8183 0.2484 0.4916 0.2744 0.2589 0.4653 0.8227 0.5649 0.0535 0.7713 0.7494 0.2225 0.646 0.6705 0.0996 0.867 0.5993 0.6908 0.6845 0.8188 0.4284 0.5226 0.5823 0.5315 0.3284 0.4722 0.7283 0.533 0.277 0.3756 0.0413 0.4475 0.7012 0.5101 0. 0.6819 0.7687 0.6092 0. 0.9249 0.329 0.5138 0.4793] 2022-08-22 23:55:22 [INFO] [EVAL] Class Recall: [0.8219 0.8855 0.9627 0.846 0.8697 0.8659 0.8542 0.9055 0.6747 0.6933 0.626 0.6827 0.8676 0.4258 0.3295 0.6049 0.6998 0.5192 0.7783 0.5791 0.8694 0.595 0.7867 0.7223 0.4143 0.3881 0.6971 0.5174 0.5922 0.3973 0.3002 0.6194 0.3624 0.5719 0.443 0.5468 0.5575 0.6236 0.3658 0.393 0.4592 0.1464 0.4719 0.3741 0.5179 0.3572 0.402 0.6497 0.7569 0.7679 0.7434 0.6205 0.3073 0.3345 0.9675 0.6432 0.9189 0.5709 0.4808 0.3572 0.417 0.6831 0.2825 0.2346 0.6347 0.745 0.574 0.5952 0.1302 0.4004 0.534 0.8015 0.6341 0.4373 0.5877 0.4073 0.5797 0.2816 0.4783 0.3837 0.8297 0.4324 0.3101 0.0278 0.7326 0.6829 0.0916 0.0898 0.3605 0.5962 0.6416 0.1382 0.354 0.0837 0.075 0.0253 0.0264 0.2914 0.397 0.445 0.1512 0.161 0.244 0.1521 0.1801 0.6777 0.1096 0.5517 0.2111 0.2903 0.1102 0.7754 0.1849 0.6868 0.9959 0.0046 0.2383 0.9155 0.1515 0.4082 0.571 0.0048 0.1955 0.1129 0.4799 0.3831 0.5726 0.702 0.4401 0.3665 0.8974 0.025 0.4514 0.1987 0.2607 0.1771 0.111 0.025 0.1764 0.3927 0.6733 0. 0.4411 0.1983 0.3488 0. 
0.3419 0.0321 0.1326 0.1416] 2022-08-22 23:55:22 [INFO] [EVAL] The model with the best validation mIoU (0.3334) was saved at iter 54000. 2022-08-22 23:55:35 [INFO] [TRAIN] epoch: 43, iter: 54050/160000, loss: 0.7176, lr: 0.000802, batch_cost: 0.2513, reader_cost: 0.00587, ips: 31.8388 samples/sec | ETA 07:23:41 2022-08-22 23:55:48 [INFO] [TRAIN] epoch: 43, iter: 54100/160000, loss: 0.7021, lr: 0.000802, batch_cost: 0.2723, reader_cost: 0.00659, ips: 29.3827 samples/sec | ETA 08:00:33 2022-08-22 23:56:01 [INFO] [TRAIN] epoch: 43, iter: 54150/160000, loss: 0.6414, lr: 0.000801, batch_cost: 0.2588, reader_cost: 0.00855, ips: 30.9115 samples/sec | ETA 07:36:34 2022-08-22 23:56:14 [INFO] [TRAIN] epoch: 43, iter: 54200/160000, loss: 0.7407, lr: 0.000801, batch_cost: 0.2559, reader_cost: 0.00102, ips: 31.2581 samples/sec | ETA 07:31:17 2022-08-22 23:56:27 [INFO] [TRAIN] epoch: 43, iter: 54250/160000, loss: 0.7106, lr: 0.000801, batch_cost: 0.2687, reader_cost: 0.01493, ips: 29.7762 samples/sec | ETA 07:53:31 2022-08-22 23:56:41 [INFO] [TRAIN] epoch: 43, iter: 54300/160000, loss: 0.7114, lr: 0.000800, batch_cost: 0.2751, reader_cost: 0.01418, ips: 29.0853 samples/sec | ETA 08:04:33 2022-08-22 23:57:00 [INFO] [TRAIN] epoch: 44, iter: 54350/160000, loss: 0.6348, lr: 0.000800, batch_cost: 0.3721, reader_cost: 0.11643, ips: 21.5024 samples/sec | ETA 10:55:07 2022-08-22 23:57:14 [INFO] [TRAIN] epoch: 44, iter: 54400/160000, loss: 0.6507, lr: 0.000800, batch_cost: 0.2828, reader_cost: 0.00086, ips: 28.2902 samples/sec | ETA 08:17:41 2022-08-22 23:57:27 [INFO] [TRAIN] epoch: 44, iter: 54450/160000, loss: 0.6362, lr: 0.000799, batch_cost: 0.2704, reader_cost: 0.00094, ips: 29.5817 samples/sec | ETA 07:55:44 2022-08-22 23:57:41 [INFO] [TRAIN] epoch: 44, iter: 54500/160000, loss: 0.6785, lr: 0.000799, batch_cost: 0.2699, reader_cost: 0.01067, ips: 29.6395 samples/sec | ETA 07:54:35 2022-08-22 23:57:56 [INFO] [TRAIN] epoch: 44, iter: 54550/160000, loss: 0.7155, lr: 0.000798, batch_cost: 0.3028, reader_cost: 0.00998, ips: 26.4217 samples/sec | ETA 08:52:08 2022-08-22 23:58:11 [INFO] [TRAIN] epoch: 44, iter: 54600/160000, loss: 0.6510, lr: 0.000798, batch_cost: 0.2956, reader_cost: 0.00056, ips: 27.0629 samples/sec | ETA 08:39:17 2022-08-22 23:58:26 [INFO] [TRAIN] epoch: 44, iter: 54650/160000, loss: 0.6863, lr: 0.000798, batch_cost: 0.3085, reader_cost: 0.00211, ips: 25.9344 samples/sec | ETA 09:01:37 2022-08-22 23:58:39 [INFO] [TRAIN] epoch: 44, iter: 54700/160000, loss: 0.6388, lr: 0.000797, batch_cost: 0.2644, reader_cost: 0.00190, ips: 30.2563 samples/sec | ETA 07:44:02 2022-08-22 23:58:54 [INFO] [TRAIN] epoch: 44, iter: 54750/160000, loss: 0.6234, lr: 0.000797, batch_cost: 0.2974, reader_cost: 0.00312, ips: 26.9023 samples/sec | ETA 08:41:38 2022-08-22 23:59:08 [INFO] [TRAIN] epoch: 44, iter: 54800/160000, loss: 0.6549, lr: 0.000796, batch_cost: 0.2761, reader_cost: 0.01172, ips: 28.9745 samples/sec | ETA 08:04:06 2022-08-22 23:59:21 [INFO] [TRAIN] epoch: 44, iter: 54850/160000, loss: 0.6445, lr: 0.000796, batch_cost: 0.2612, reader_cost: 0.00053, ips: 30.6270 samples/sec | ETA 07:37:45 2022-08-22 23:59:32 [INFO] [TRAIN] epoch: 44, iter: 54900/160000, loss: 0.6705, lr: 0.000796, batch_cost: 0.2150, reader_cost: 0.00760, ips: 37.2082 samples/sec | ETA 06:16:37 2022-08-22 23:59:43 [INFO] [TRAIN] epoch: 44, iter: 54950/160000, loss: 0.6485, lr: 0.000795, batch_cost: 0.2214, reader_cost: 0.00043, ips: 36.1336 samples/sec | ETA 06:27:38 2022-08-22 23:59:53 [INFO] [TRAIN] epoch: 44, iter: 55000/160000, loss: 
0.6241, lr: 0.000795, batch_cost: 0.2085, reader_cost: 0.00057, ips: 38.3707 samples/sec | ETA 06:04:51 2022-08-22 23:59:53 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 173s - batch_cost: 0.1728 - reader cost: 8.7826e-04 2022-08-23 00:02:47 [INFO] [EVAL] #Images: 2000 mIoU: 0.3372 Acc: 0.7553 Kappa: 0.7365 Dice: 0.4698 2022-08-23 00:02:47 [INFO] [EVAL] Class IoU: [0.6748 0.7582 0.9296 0.7188 0.6753 0.7386 0.7713 0.764 0.5187 0.6056 0.474 0.5602 0.688 0.3071 0.261 0.4161 0.4914 0.4139 0.5947 0.3794 0.7336 0.4613 0.6083 0.4768 0.2869 0.3261 0.4381 0.4221 0.3516 0.2779 0.1732 0.4731 0.2763 0.3498 0.2126 0.3604 0.4061 0.48 0.2417 0.3648 0.1748 0.1138 0.3371 0.2667 0.3306 0.2078 0.3377 0.4528 0.5534 0.4481 0.4914 0.3498 0.1661 0.2768 0.6845 0.3859 0.8426 0.3848 0.4643 0.3064 0.1596 0.3943 0.2691 0.144 0.4426 0.6213 0.248 0.3957 0.0899 0.3145 0.4267 0.5257 0.3854 0.229 0.4329 0.33 0.4699 0.2408 0.3522 0.0968 0.6101 0.3564 0.3217 0.0097 0.3977 0.5513 0.0814 0.0549 0.3148 0.4388 0.3359 0.0482 0.2523 0.1083 0.0016 0.011 0.0224 0.1108 0.3047 0.366 0.057 0.0131 0.2329 0.2088 0.1901 0.5762 0.0951 0.5028 0.1097 0.3471 0.1191 0.2449 0.1503 0.5958 0.7693 0.0089 0.3286 0.6118 0.1139 0.16 0.4271 0. 0.2269 0.1377 0.3231 0.2791 0.4439 0.4082 0.3947 0.2918 0.5584 0.0274 0.2953 0.2446 0.1832 0.1207 0.0941 0.0175 0.1959 0.3658 0.3709 0.0116 0.351 0.1721 0.2555 0. 0.3241 0.0317 0.1449 0.1416] 2022-08-23 00:02:47 [INFO] [EVAL] Class Precision: [0.761 0.8576 0.9677 0.8239 0.7469 0.8926 0.8851 0.8246 0.647 0.8159 0.7052 0.6913 0.78 0.4663 0.5376 0.5782 0.6348 0.6698 0.7541 0.6049 0.8138 0.733 0.7572 0.6089 0.5648 0.3948 0.5426 0.6922 0.6542 0.3882 0.4004 0.6642 0.4939 0.4853 0.3748 0.4926 0.6111 0.8383 0.5104 0.563 0.3844 0.4263 0.6967 0.5288 0.4703 0.3807 0.5701 0.5835 0.5721 0.5124 0.5934 0.4732 0.3006 0.6374 0.7079 0.5038 0.8784 0.6612 0.7993 0.5241 0.2137 0.5282 0.4537 0.779 0.5596 0.7315 0.3497 0.5438 0.1882 0.4728 0.6518 0.6402 0.6176 0.27 0.6479 0.4939 0.566 0.4505 0.6638 0.1732 0.6983 0.6597 0.7791 0.0536 0.536 0.7109 0.4769 0.3401 0.7688 0.6751 0.4335 0.0615 0.37 0.3321 0.0084 0.049 0.1514 0.3966 0.531 0.5931 0.4092 0.0201 0.668 0.7409 0.8461 0.6925 0.2613 0.7699 0.2404 0.5999 0.2492 0.4876 0.3756 0.8079 0.7736 0.0562 0.7306 0.723 0.1594 0.5353 0.5172 0. 0.6299 0.5939 0.6956 0.611 0.9011 0.6318 0.724 0.5897 0.7166 0.3924 0.4322 0.4657 0.5472 0.2772 0.4759 0.0895 0.4202 0.6772 0.5068 0.0167 0.6777 0.454 0.5595 0. 0.8123 0.335 0.6006 0.5783] 2022-08-23 00:02:47 [INFO] [EVAL] Class Recall: [0.8564 0.8674 0.9594 0.8493 0.8756 0.8106 0.8571 0.9122 0.7235 0.7015 0.5912 0.7471 0.8537 0.4736 0.3366 0.5974 0.6851 0.52 0.7378 0.5044 0.8815 0.5544 0.7558 0.6872 0.3683 0.6521 0.6948 0.5196 0.4318 0.4945 0.2339 0.6219 0.3854 0.5562 0.3294 0.5732 0.5477 0.5289 0.3147 0.5089 0.2427 0.1344 0.3951 0.3499 0.5268 0.3139 0.453 0.6692 0.9442 0.7813 0.7408 0.5729 0.2708 0.3285 0.9539 0.6226 0.9539 0.4793 0.5256 0.4245 0.3867 0.6086 0.398 0.1501 0.6793 0.8049 0.4602 0.5924 0.1469 0.4843 0.5526 0.7461 0.5062 0.6016 0.566 0.4985 0.7347 0.3409 0.4287 0.1801 0.8284 0.4367 0.354 0.0118 0.6066 0.7105 0.0893 0.0614 0.3477 0.5563 0.5986 0.1823 0.4422 0.1384 0.002 0.014 0.0256 0.1333 0.4168 0.4886 0.0621 0.0366 0.2634 0.2252 0.1969 0.7743 0.1301 0.5917 0.168 0.4517 0.1857 0.3298 0.2003 0.6941 0.9927 0.0104 0.3739 0.799 0.2849 0.1858 0.7103 0. 0.2618 0.152 0.3763 0.3394 0.4666 0.5355 0.4646 0.3661 0.7167 0.0286 0.4825 0.3401 0.2159 0.1761 0.105 0.0212 0.2685 0.4431 0.5803 0.0372 0.4214 0.217 0.3198 0. 
0.3503 0.0339 0.1603 0.1579] 2022-08-23 00:02:47 [INFO] [EVAL] The model with the best validation mIoU (0.3372) was saved at iter 55000. 2022-08-23 00:03:02 [INFO] [TRAIN] epoch: 44, iter: 55050/160000, loss: 0.6514, lr: 0.000795, batch_cost: 0.2945, reader_cost: 0.00478, ips: 27.1659 samples/sec | ETA 08:35:06 2022-08-23 00:03:16 [INFO] [TRAIN] epoch: 44, iter: 55100/160000, loss: 0.6288, lr: 0.000794, batch_cost: 0.2950, reader_cost: 0.01610, ips: 27.1211 samples/sec | ETA 08:35:42 2022-08-23 00:03:32 [INFO] [TRAIN] epoch: 44, iter: 55150/160000, loss: 0.6554, lr: 0.000794, batch_cost: 0.3183, reader_cost: 0.00153, ips: 25.1309 samples/sec | ETA 09:16:17 2022-08-23 00:03:45 [INFO] [TRAIN] epoch: 44, iter: 55200/160000, loss: 0.7008, lr: 0.000793, batch_cost: 0.2568, reader_cost: 0.00723, ips: 31.1566 samples/sec | ETA 07:28:29 2022-08-23 00:03:59 [INFO] [TRAIN] epoch: 44, iter: 55250/160000, loss: 0.6584, lr: 0.000793, batch_cost: 0.2685, reader_cost: 0.00807, ips: 29.7968 samples/sec | ETA 07:48:43 2022-08-23 00:04:13 [INFO] [TRAIN] epoch: 44, iter: 55300/160000, loss: 0.7218, lr: 0.000793, batch_cost: 0.2833, reader_cost: 0.00272, ips: 28.2396 samples/sec | ETA 08:14:20 2022-08-23 00:04:26 [INFO] [TRAIN] epoch: 44, iter: 55350/160000, loss: 0.6486, lr: 0.000792, batch_cost: 0.2706, reader_cost: 0.00063, ips: 29.5676 samples/sec | ETA 07:51:54 2022-08-23 00:04:41 [INFO] [TRAIN] epoch: 44, iter: 55400/160000, loss: 0.6888, lr: 0.000792, batch_cost: 0.2990, reader_cost: 0.00745, ips: 26.7573 samples/sec | ETA 08:41:13 2022-08-23 00:04:55 [INFO] [TRAIN] epoch: 44, iter: 55450/160000, loss: 0.7300, lr: 0.000792, batch_cost: 0.2753, reader_cost: 0.01695, ips: 29.0594 samples/sec | ETA 07:59:42 2022-08-23 00:05:09 [INFO] [TRAIN] epoch: 44, iter: 55500/160000, loss: 0.6887, lr: 0.000791, batch_cost: 0.2710, reader_cost: 0.00818, ips: 29.5193 samples/sec | ETA 07:52:00 2022-08-23 00:05:20 [INFO] [TRAIN] epoch: 44, iter: 55550/160000, loss: 0.6354, lr: 0.000791, batch_cost: 0.2362, reader_cost: 0.00414, ips: 33.8767 samples/sec | ETA 06:51:05 2022-08-23 00:05:41 [INFO] [TRAIN] epoch: 45, iter: 55600/160000, loss: 0.6761, lr: 0.000790, batch_cost: 0.4066, reader_cost: 0.14771, ips: 19.6770 samples/sec | ETA 11:47:25 2022-08-23 00:05:54 [INFO] [TRAIN] epoch: 45, iter: 55650/160000, loss: 0.6389, lr: 0.000790, batch_cost: 0.2690, reader_cost: 0.00113, ips: 29.7450 samples/sec | ETA 07:47:45 2022-08-23 00:06:08 [INFO] [TRAIN] epoch: 45, iter: 55700/160000, loss: 0.6516, lr: 0.000790, batch_cost: 0.2764, reader_cost: 0.00364, ips: 28.9431 samples/sec | ETA 08:00:28 2022-08-23 00:06:21 [INFO] [TRAIN] epoch: 45, iter: 55750/160000, loss: 0.6051, lr: 0.000789, batch_cost: 0.2635, reader_cost: 0.01686, ips: 30.3557 samples/sec | ETA 07:37:54 2022-08-23 00:06:35 [INFO] [TRAIN] epoch: 45, iter: 55800/160000, loss: 0.6455, lr: 0.000789, batch_cost: 0.2792, reader_cost: 0.00093, ips: 28.6522 samples/sec | ETA 08:04:53 2022-08-23 00:06:45 [INFO] [TRAIN] epoch: 45, iter: 55850/160000, loss: 0.6560, lr: 0.000789, batch_cost: 0.1912, reader_cost: 0.00033, ips: 41.8449 samples/sec | ETA 05:31:51 2022-08-23 00:06:55 [INFO] [TRAIN] epoch: 45, iter: 55900/160000, loss: 0.6421, lr: 0.000788, batch_cost: 0.2071, reader_cost: 0.01065, ips: 38.6206 samples/sec | ETA 05:59:23 2022-08-23 00:07:04 [INFO] [TRAIN] epoch: 45, iter: 55950/160000, loss: 0.6288, lr: 0.000788, batch_cost: 0.1843, reader_cost: 0.00098, ips: 43.4014 samples/sec | ETA 05:19:39 2022-08-23 00:07:13 [INFO] [TRAIN] epoch: 45, iter: 56000/160000, loss: 
0.6873, lr: 0.000787, batch_cost: 0.1754, reader_cost: 0.00067, ips: 45.6196 samples/sec | ETA 05:03:57 2022-08-23 00:07:13 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 170s - batch_cost: 0.1703 - reader cost: 8.4768e-04 2022-08-23 00:10:04 [INFO] [EVAL] #Images: 2000 mIoU: 0.3294 Acc: 0.7566 Kappa: 0.7380 Dice: 0.4587 2022-08-23 00:10:04 [INFO] [EVAL] Class IoU: [0.6764 0.7675 0.931 0.7117 0.6788 0.7588 0.7741 0.7629 0.5171 0.5989 0.4736 0.5579 0.6783 0.2682 0.2763 0.4257 0.4839 0.4318 0.593 0.3965 0.7217 0.4808 0.6202 0.4762 0.3151 0.2947 0.5071 0.44 0.3649 0.2582 0.1882 0.4815 0.2535 0.3508 0.2795 0.3754 0.3932 0.4913 0.2361 0.3716 0.2385 0.1465 0.3342 0.2447 0.3106 0.255 0.3065 0.4654 0.5451 0.4861 0.516 0.2674 0.185 0.1544 0.7227 0.4734 0.8061 0.2904 0.4529 0.2417 0.1738 0.3993 0.2916 0.1521 0.4252 0.6074 0.2179 0.4235 0.0397 0.3272 0.4255 0.4937 0.3895 0.2228 0.4224 0.325 0.4619 0.2781 0.3012 0.2108 0.5745 0.3585 0.279 0.0326 0.1765 0.5376 0.0678 0.0641 0.3241 0.4484 0.3445 0.0368 0.1724 0.111 0.0016 0.0187 0.0038 0.1494 0.2171 0.3538 0.0749 0.0305 0.1505 0.0465 0.0603 0.4804 0.092 0.4965 0.0841 0.3183 0.0946 0.3377 0.1425 0.5997 0.7327 0.0153 0.352 0.6024 0.1668 0.3598 0.3756 0.0342 0.2186 0.0621 0.4521 0.2464 0.4542 0.431 0.0599 0.3119 0.5724 0.0132 0.2024 0.1531 0.1766 0.1138 0.1152 0.0169 0.158 0.358 0.3716 0.0014 0.3484 0.0058 0.2431 0. 0.3744 0.03 0.1341 0.1737] 2022-08-23 00:10:04 [INFO] [EVAL] Class Precision: [0.7839 0.8407 0.9614 0.8025 0.7682 0.8629 0.8953 0.8147 0.6662 0.7609 0.6991 0.6904 0.7656 0.4565 0.5408 0.6307 0.7355 0.6884 0.7575 0.5804 0.8043 0.6817 0.7638 0.612 0.5031 0.45 0.6245 0.6766 0.6189 0.311 0.437 0.6484 0.567 0.4574 0.4067 0.4679 0.6331 0.7802 0.3854 0.5866 0.4595 0.2879 0.5287 0.5824 0.5089 0.4411 0.4917 0.6115 0.6168 0.5801 0.6493 0.3235 0.3042 0.5884 0.7536 0.6447 0.831 0.712 0.6133 0.3589 0.2297 0.5326 0.4902 0.6656 0.5584 0.7159 0.4075 0.5679 0.1892 0.5753 0.7066 0.6837 0.6003 0.3003 0.6032 0.4684 0.6028 0.5407 0.6868 0.5242 0.6773 0.5576 0.7931 0.0895 0.4114 0.7599 0.1817 0.3767 0.6121 0.7389 0.515 0.0436 0.4497 0.3365 0.0055 0.0641 0.1278 0.2797 0.304 0.5885 0.2518 0.0579 0.5226 0.3583 0.652 0.6165 0.3501 0.7802 0.2183 0.5192 0.2729 0.683 0.5067 0.6566 0.7401 0.2265 0.7505 0.6821 0.2172 0.6382 0.4498 0.444 0.659 0.5555 0.6903 0.6981 0.8148 0.5559 0.4661 0.4757 0.6251 0.1968 0.2854 0.7551 0.4271 0.311 0.3515 0.0543 0.462 0.6767 0.4186 0.0019 0.604 0.1155 0.5095 0. 
0.7449 0.3037 0.6311 0.7163] 2022-08-23 00:10:04 [INFO] [EVAL] Class Recall: [0.8315 0.8981 0.9671 0.8629 0.8536 0.8628 0.8511 0.923 0.6979 0.7378 0.5948 0.7441 0.856 0.394 0.361 0.5671 0.5859 0.5367 0.7319 0.5559 0.8753 0.6201 0.7674 0.6821 0.4574 0.4605 0.7296 0.5572 0.4707 0.6029 0.2485 0.6516 0.3143 0.6009 0.4718 0.6551 0.5093 0.5702 0.3787 0.5034 0.3316 0.2297 0.476 0.2968 0.4435 0.3768 0.4486 0.6608 0.8243 0.7501 0.7154 0.6066 0.3207 0.173 0.9463 0.6405 0.9642 0.329 0.6339 0.4254 0.4167 0.6147 0.4184 0.1647 0.6405 0.8004 0.3188 0.6248 0.0478 0.4314 0.5168 0.6399 0.5258 0.4636 0.5849 0.5148 0.664 0.3641 0.3491 0.2607 0.7911 0.501 0.301 0.0488 0.2361 0.6476 0.0976 0.0718 0.4079 0.5329 0.5101 0.1896 0.2185 0.1421 0.0023 0.0258 0.0039 0.2427 0.4319 0.4702 0.0963 0.0608 0.1745 0.0507 0.0623 0.6851 0.1109 0.5772 0.1204 0.4513 0.1265 0.4004 0.1654 0.8737 0.9866 0.0161 0.3987 0.8375 0.4182 0.452 0.6947 0.0357 0.2465 0.0654 0.5671 0.2758 0.5065 0.6572 0.0643 0.4753 0.8718 0.0139 0.4101 0.1611 0.2314 0.1522 0.1463 0.024 0.1936 0.4318 0.768 0.0049 0.4516 0.0061 0.3174 0. 0.4295 0.0323 0.1455 0.1865] 2022-08-23 00:10:04 [INFO] [EVAL] The model with the best validation mIoU (0.3372) was saved at iter 55000. 2022-08-23 00:10:17 [INFO] [TRAIN] epoch: 45, iter: 56050/160000, loss: 0.6314, lr: 0.000787, batch_cost: 0.2706, reader_cost: 0.00809, ips: 29.5596 samples/sec | ETA 07:48:53 2022-08-23 00:10:31 [INFO] [TRAIN] epoch: 45, iter: 56100/160000, loss: 0.6719, lr: 0.000787, batch_cost: 0.2715, reader_cost: 0.00596, ips: 29.4615 samples/sec | ETA 07:50:13 2022-08-23 00:10:44 [INFO] [TRAIN] epoch: 45, iter: 56150/160000, loss: 0.6884, lr: 0.000786, batch_cost: 0.2673, reader_cost: 0.00091, ips: 29.9310 samples/sec | ETA 07:42:37 2022-08-23 00:10:58 [INFO] [TRAIN] epoch: 45, iter: 56200/160000, loss: 0.6503, lr: 0.000786, batch_cost: 0.2747, reader_cost: 0.00069, ips: 29.1240 samples/sec | ETA 07:55:12 2022-08-23 00:11:11 [INFO] [TRAIN] epoch: 45, iter: 56250/160000, loss: 0.6623, lr: 0.000785, batch_cost: 0.2649, reader_cost: 0.00058, ips: 30.1976 samples/sec | ETA 07:38:05 2022-08-23 00:11:25 [INFO] [TRAIN] epoch: 45, iter: 56300/160000, loss: 0.6933, lr: 0.000785, batch_cost: 0.2659, reader_cost: 0.00086, ips: 30.0901 samples/sec | ETA 07:39:30 2022-08-23 00:11:36 [INFO] [TRAIN] epoch: 45, iter: 56350/160000, loss: 0.7295, lr: 0.000785, batch_cost: 0.2350, reader_cost: 0.00806, ips: 34.0453 samples/sec | ETA 06:45:55 2022-08-23 00:11:50 [INFO] [TRAIN] epoch: 45, iter: 56400/160000, loss: 0.6547, lr: 0.000784, batch_cost: 0.2710, reader_cost: 0.00071, ips: 29.5229 samples/sec | ETA 07:47:53 2022-08-23 00:12:02 [INFO] [TRAIN] epoch: 45, iter: 56450/160000, loss: 0.6965, lr: 0.000784, batch_cost: 0.2331, reader_cost: 0.00661, ips: 34.3230 samples/sec | ETA 06:42:15 2022-08-23 00:12:14 [INFO] [TRAIN] epoch: 45, iter: 56500/160000, loss: 0.6192, lr: 0.000784, batch_cost: 0.2414, reader_cost: 0.01136, ips: 33.1358 samples/sec | ETA 06:56:28 2022-08-23 00:12:26 [INFO] [TRAIN] epoch: 45, iter: 56550/160000, loss: 0.6607, lr: 0.000783, batch_cost: 0.2469, reader_cost: 0.01449, ips: 32.3961 samples/sec | ETA 07:05:46 2022-08-23 00:12:39 [INFO] [TRAIN] epoch: 45, iter: 56600/160000, loss: 0.6751, lr: 0.000783, batch_cost: 0.2660, reader_cost: 0.00816, ips: 30.0775 samples/sec | ETA 07:38:22 2022-08-23 00:12:53 [INFO] [TRAIN] epoch: 45, iter: 56650/160000, loss: 0.6870, lr: 0.000782, batch_cost: 0.2676, reader_cost: 0.00230, ips: 29.8918 samples/sec | ETA 07:40:59 2022-08-23 00:13:06 [INFO] [TRAIN] epoch: 45, 
iter: 56700/160000, loss: 0.7290, lr: 0.000782, batch_cost: 0.2682, reader_cost: 0.00541, ips: 29.8260 samples/sec | ETA 07:41:47 2022-08-23 00:13:19 [INFO] [TRAIN] epoch: 45, iter: 56750/160000, loss: 0.6828, lr: 0.000782, batch_cost: 0.2508, reader_cost: 0.00068, ips: 31.9019 samples/sec | ETA 07:11:31 2022-08-23 00:13:31 [INFO] [TRAIN] epoch: 45, iter: 56800/160000, loss: 0.6903, lr: 0.000781, batch_cost: 0.2539, reader_cost: 0.00085, ips: 31.5127 samples/sec | ETA 07:16:38 2022-08-23 00:13:46 [INFO] [TRAIN] epoch: 46, iter: 56850/160000, loss: 0.6374, lr: 0.000781, batch_cost: 0.2912, reader_cost: 0.05455, ips: 27.4725 samples/sec | ETA 08:20:37 2022-08-23 00:14:00 [INFO] [TRAIN] epoch: 46, iter: 56900/160000, loss: 0.6462, lr: 0.000781, batch_cost: 0.2862, reader_cost: 0.02125, ips: 27.9571 samples/sec | ETA 08:11:42 2022-08-23 00:14:11 [INFO] [TRAIN] epoch: 46, iter: 56950/160000, loss: 0.6959, lr: 0.000780, batch_cost: 0.2110, reader_cost: 0.00080, ips: 37.9063 samples/sec | ETA 06:02:28 2022-08-23 00:14:20 [INFO] [TRAIN] epoch: 46, iter: 57000/160000, loss: 0.6386, lr: 0.000780, batch_cost: 0.1828, reader_cost: 0.00062, ips: 43.7657 samples/sec | ETA 05:13:47 2022-08-23 00:14:20 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 175s - batch_cost: 0.1745 - reader cost: 7.6823e-04 2022-08-23 00:17:15 [INFO] [EVAL] #Images: 2000 mIoU: 0.3325 Acc: 0.7567 Kappa: 0.7382 Dice: 0.4624 2022-08-23 00:17:15 [INFO] [EVAL] Class IoU: [0.6765 0.7643 0.9296 0.7198 0.6797 0.7546 0.7708 0.7822 0.5145 0.5993 0.4813 0.5314 0.6884 0.2891 0.2026 0.4257 0.524 0.4418 0.605 0.4001 0.7166 0.4816 0.6107 0.4849 0.3349 0.3245 0.4101 0.3946 0.3349 0.2593 0.2472 0.4968 0.2566 0.3671 0.2789 0.3844 0.4123 0.5221 0.2544 0.368 0.23 0.1238 0.3643 0.2594 0.3432 0.2182 0.3246 0.4781 0.5426 0.5297 0.5162 0.387 0.1851 0.2091 0.7286 0.4403 0.854 0.3453 0.4469 0.3054 0.118 0.2461 0.309 0.1494 0.4107 0.5572 0.2548 0.4283 0.0524 0.3258 0.357 0.5141 0.3792 0.2232 0.4269 0.341 0.5003 0.2565 0.2572 0.1217 0.5598 0.3834 0.2945 0.0497 0.0685 0.5361 0.0948 0.0853 0.2222 0.4827 0.4096 0.0315 0.2424 0.0773 0.0015 0.0138 0.0112 0.107 0.3017 0.2807 0.0604 0.0119 0.2257 0.3174 0.0101 0.4633 0.0995 0.5063 0.0829 0.2567 0.0575 0.383 0.1176 0.6192 0.7228 0.0023 0.3346 0.6061 0.143 0.2744 0.4726 0.0273 0.2404 0.1823 0.3907 0.3194 0.43 0.4356 0.1557 0.3088 0.5837 0.0258 0.2718 0.2382 0.1187 0.123 0.1336 0.0113 0.1481 0.3348 0.3886 0.0042 0.2774 0.0348 0.2808 0. 
0.2847 0.0311 0.1353 0.1514] 2022-08-23 00:17:15 [INFO] [EVAL] Class Precision: [0.7725 0.8743 0.9634 0.8301 0.7583 0.8635 0.8598 0.8396 0.6224 0.7544 0.6675 0.7344 0.7754 0.4596 0.5901 0.5935 0.6827 0.6672 0.7765 0.5845 0.7868 0.6886 0.7706 0.6107 0.5052 0.403 0.5902 0.7914 0.6469 0.3399 0.4267 0.6369 0.3979 0.4662 0.399 0.5232 0.6005 0.7523 0.4233 0.6288 0.3647 0.2781 0.6116 0.5197 0.5316 0.4844 0.4701 0.6658 0.5996 0.6667 0.6931 0.5121 0.3359 0.7065 0.7973 0.5925 0.9051 0.6902 0.6083 0.5125 0.1727 0.4189 0.4654 0.7145 0.4828 0.5938 0.4321 0.6067 0.3985 0.5215 0.7795 0.6561 0.6409 0.2691 0.6727 0.4777 0.5906 0.5718 0.6292 0.2321 0.6289 0.6015 0.7732 0.095 0.2544 0.7419 0.4173 0.3366 0.3094 0.6571 0.5633 0.035 0.3952 0.3158 0.0076 0.0576 0.0626 0.2967 0.5428 0.6295 0.2929 0.0251 0.687 0.8122 0.2317 0.5662 0.3927 0.7829 0.238 0.3881 0.2737 0.5053 0.7111 0.7724 0.7267 0.0477 0.7357 0.6247 0.3079 0.6031 0.5727 0.4537 0.4564 0.4882 0.7974 0.671 0.9051 0.5936 0.8364 0.5664 0.6858 0.5666 0.4863 0.6567 0.514 0.3078 0.2798 0.0674 0.374 0.6952 0.4302 0.0058 0.6763 0.2695 0.4818 0. 0.8706 0.3609 0.6841 0.6826] 2022-08-23 00:17:15 [INFO] [EVAL] Class Recall: [0.8449 0.8587 0.9636 0.8442 0.8676 0.8568 0.8816 0.9197 0.748 0.7445 0.633 0.6579 0.8599 0.438 0.2358 0.6008 0.6928 0.5667 0.7325 0.5591 0.8893 0.6157 0.7463 0.7019 0.4983 0.6251 0.5734 0.4404 0.4099 0.5225 0.3701 0.6931 0.4194 0.6334 0.4809 0.5916 0.5681 0.6305 0.3892 0.4702 0.3838 0.1824 0.474 0.3411 0.4918 0.2842 0.512 0.6291 0.8511 0.7206 0.6692 0.613 0.292 0.229 0.8943 0.6315 0.9379 0.4086 0.6274 0.4304 0.2716 0.3737 0.4789 0.1588 0.7332 0.9003 0.383 0.5929 0.0569 0.4647 0.3971 0.7038 0.4815 0.5664 0.5388 0.5437 0.766 0.3175 0.3032 0.2036 0.8361 0.5139 0.3223 0.0944 0.0857 0.6589 0.1092 0.1025 0.4408 0.6453 0.6001 0.2397 0.3853 0.0928 0.0018 0.0179 0.0134 0.1434 0.4045 0.3363 0.0707 0.0221 0.2515 0.3426 0.0105 0.7182 0.1176 0.589 0.1129 0.4312 0.0679 0.6128 0.1235 0.7574 0.9926 0.0024 0.3803 0.9531 0.2108 0.3348 0.7298 0.0283 0.3368 0.2254 0.4338 0.3787 0.4502 0.6208 0.1606 0.4044 0.7969 0.0263 0.3813 0.272 0.1337 0.17 0.2037 0.0134 0.197 0.3925 0.8007 0.0152 0.3199 0.0384 0.4023 0. 0.2973 0.0329 0.1444 0.1628] 2022-08-23 00:17:15 [INFO] [EVAL] The model with the best validation mIoU (0.3372) was saved at iter 55000. 
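Because runs like this interleave thousands of [TRAIN] records with periodic [EVAL] summaries, it is usually easier to pull the validation curve out of the log than to scan it by eye. A minimal sketch that relies only on the line formats visible above; it assumes one log record per line, and the file name train.log is hypothetical:

import re

# Hedged sketch: extract (iteration, mIoU) pairs from a PaddleSeg-style training log.
# The regexes mirror the [TRAIN]/[EVAL] formats seen in this log; 'train.log' is a
# hypothetical path, and pairing each [EVAL] with the most recent [TRAIN] iteration
# is a heuristic that matches how evaluations are interleaved here.
TRAIN_RE = re.compile(r"\[TRAIN\] epoch: \d+, iter: (\d+)/\d+")
EVAL_RE = re.compile(r"\[EVAL\] #Images: \d+ mIoU: (\d+\.\d+)")

def eval_curve(path="train.log"):
    points, last_iter = [], None
    with open(path) as f:
        for line in f:
            train_hits = TRAIN_RE.findall(line)
            if train_hits:
                last_iter = int(train_hits[-1])
            eval_hit = EVAL_RE.search(line)
            if eval_hit and last_iter is not None:
                points.append((last_iter, float(eval_hit.group(1))))
    return points

if __name__ == "__main__":
    curve = eval_curve()
    print(curve)  # e.g. [(49000, 0.3325), (50000, 0.3267), ...]
    if curve:
        print("best:", max(curve, key=lambda p: p[1]))  # e.g. (58000, 0.3413) over this excerpt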
2022-08-23 00:17:28 [INFO] [TRAIN] epoch: 46, iter: 57050/160000, loss: 0.6460, lr: 0.000779, batch_cost: 0.2666, reader_cost: 0.00432, ips: 30.0108 samples/sec | ETA 07:37:23 2022-08-23 00:17:41 [INFO] [TRAIN] epoch: 46, iter: 57100/160000, loss: 0.6197, lr: 0.000779, batch_cost: 0.2514, reader_cost: 0.00094, ips: 31.8198 samples/sec | ETA 07:11:10 2022-08-23 00:17:55 [INFO] [TRAIN] epoch: 46, iter: 57150/160000, loss: 0.6409, lr: 0.000779, batch_cost: 0.2735, reader_cost: 0.00066, ips: 29.2510 samples/sec | ETA 07:48:48 2022-08-23 00:18:08 [INFO] [TRAIN] epoch: 46, iter: 57200/160000, loss: 0.6242, lr: 0.000778, batch_cost: 0.2582, reader_cost: 0.00113, ips: 30.9845 samples/sec | ETA 07:22:22 2022-08-23 00:18:21 [INFO] [TRAIN] epoch: 46, iter: 57250/160000, loss: 0.6373, lr: 0.000778, batch_cost: 0.2639, reader_cost: 0.01694, ips: 30.3175 samples/sec | ETA 07:31:53 2022-08-23 00:18:34 [INFO] [TRAIN] epoch: 46, iter: 57300/160000, loss: 0.6428, lr: 0.000778, batch_cost: 0.2583, reader_cost: 0.00115, ips: 30.9719 samples/sec | ETA 07:22:07 2022-08-23 00:18:46 [INFO] [TRAIN] epoch: 46, iter: 57350/160000, loss: 0.6576, lr: 0.000777, batch_cost: 0.2526, reader_cost: 0.00062, ips: 31.6656 samples/sec | ETA 07:12:13 2022-08-23 00:18:59 [INFO] [TRAIN] epoch: 46, iter: 57400/160000, loss: 0.6167, lr: 0.000777, batch_cost: 0.2589, reader_cost: 0.00757, ips: 30.9026 samples/sec | ETA 07:22:40 2022-08-23 00:19:12 [INFO] [TRAIN] epoch: 46, iter: 57450/160000, loss: 0.6577, lr: 0.000776, batch_cost: 0.2515, reader_cost: 0.01209, ips: 31.8079 samples/sec | ETA 07:09:52 2022-08-23 00:19:26 [INFO] [TRAIN] epoch: 46, iter: 57500/160000, loss: 0.6198, lr: 0.000776, batch_cost: 0.2774, reader_cost: 0.00254, ips: 28.8358 samples/sec | ETA 07:53:56 2022-08-23 00:19:39 [INFO] [TRAIN] epoch: 46, iter: 57550/160000, loss: 0.6729, lr: 0.000776, batch_cost: 0.2589, reader_cost: 0.00466, ips: 30.8964 samples/sec | ETA 07:22:07 2022-08-23 00:19:52 [INFO] [TRAIN] epoch: 46, iter: 57600/160000, loss: 0.6775, lr: 0.000775, batch_cost: 0.2642, reader_cost: 0.00058, ips: 30.2790 samples/sec | ETA 07:30:55 2022-08-23 00:20:05 [INFO] [TRAIN] epoch: 46, iter: 57650/160000, loss: 0.6574, lr: 0.000775, batch_cost: 0.2717, reader_cost: 0.00193, ips: 29.4405 samples/sec | ETA 07:43:32 2022-08-23 00:20:19 [INFO] [TRAIN] epoch: 46, iter: 57700/160000, loss: 0.6299, lr: 0.000775, batch_cost: 0.2660, reader_cost: 0.00114, ips: 30.0710 samples/sec | ETA 07:33:35 2022-08-23 00:20:31 [INFO] [TRAIN] epoch: 46, iter: 57750/160000, loss: 0.6314, lr: 0.000774, batch_cost: 0.2516, reader_cost: 0.00813, ips: 31.7934 samples/sec | ETA 07:08:48 2022-08-23 00:20:44 [INFO] [TRAIN] epoch: 46, iter: 57800/160000, loss: 0.6638, lr: 0.000774, batch_cost: 0.2526, reader_cost: 0.00051, ips: 31.6685 samples/sec | ETA 07:10:17 2022-08-23 00:20:56 [INFO] [TRAIN] epoch: 46, iter: 57850/160000, loss: 0.6561, lr: 0.000773, batch_cost: 0.2321, reader_cost: 0.00052, ips: 34.4672 samples/sec | ETA 06:35:09 2022-08-23 00:21:06 [INFO] [TRAIN] epoch: 46, iter: 57900/160000, loss: 0.6243, lr: 0.000773, batch_cost: 0.2132, reader_cost: 0.00056, ips: 37.5228 samples/sec | ETA 06:02:48 2022-08-23 00:21:17 [INFO] [TRAIN] epoch: 46, iter: 57950/160000, loss: 0.6754, lr: 0.000773, batch_cost: 0.2145, reader_cost: 0.00055, ips: 37.2956 samples/sec | ETA 06:04:49 2022-08-23 00:21:26 [INFO] [TRAIN] epoch: 46, iter: 58000/160000, loss: 0.6518, lr: 0.000772, batch_cost: 0.1727, reader_cost: 0.00056, ips: 46.3224 samples/sec | ETA 04:53:35 2022-08-23 00:21:26 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 148s - batch_cost: 0.1477 - reader cost: 7.0422e-04 2022-08-23 00:23:53 [INFO] [EVAL] #Images: 2000 mIoU: 0.3413 Acc: 0.7604 Kappa: 0.7421 Dice: 0.4717 2022-08-23 00:23:53 [INFO] [EVAL] Class IoU: [0.672 0.7687 0.932 0.713 0.6821 0.7429 0.7767 0.7683 0.5229 0.6125 0.489 0.5499 0.6872 0.3136 0.2463 0.4242 0.5832 0.4458 0.5913 0.4033 0.7456 0.4785 0.6199 0.4671 0.339 0.3455 0.4111 0.4056 0.3478 0.214 0.2399 0.5011 0.2358 0.35 0.2827 0.3556 0.4033 0.5209 0.261 0.3789 0.2199 0.0844 0.3417 0.2548 0.3292 0.2761 0.2558 0.484 0.5931 0.5515 0.5203 0.3599 0.1899 0.267 0.7397 0.4747 0.835 0.4029 0.4717 0.2936 0.1087 0.3912 0.3402 0.1334 0.4359 0.6528 0.2566 0.3767 0.0645 0.3034 0.4199 0.5094 0.3045 0.2294 0.4189 0.3253 0.6126 0.2759 0.2925 0.3132 0.643 0.3773 0.3013 0.0172 0.423 0.5297 0.0646 0.0489 0.267 0.4557 0.4066 0.0564 0.2292 0.0682 0.1017 0.0187 0.0114 0.1411 0.2493 0.3167 0.0602 0.0265 0.2644 0.0987 0.0968 0.3867 0.0856 0.4966 0.118 0.1761 0.0357 0.2 0.0961 0.5166 0.7871 0.006 0.3586 0.6422 0.1304 0.3432 0.4429 0.0034 0.2155 0.0935 0.5661 0.296 0.3573 0.4335 0.4842 0.265 0.5998 0.0508 0.2521 0.1952 0.1524 0.133 0.1232 0.0208 0.1278 0.3605 0.4889 0.0241 0.3345 0.0215 0.2768 0. 0.3994 0.0335 0.1225 0.1342] 2022-08-23 00:23:53 [INFO] [EVAL] Class Precision: [0.7723 0.8599 0.9639 0.8066 0.7654 0.8523 0.8646 0.8498 0.6554 0.74 0.6994 0.7125 0.7662 0.5556 0.5 0.5709 0.7114 0.6877 0.7482 0.59 0.8344 0.6775 0.7933 0.5994 0.4885 0.435 0.5453 0.7145 0.6586 0.3324 0.42 0.6871 0.4399 0.4964 0.5124 0.5474 0.6174 0.7535 0.4567 0.5453 0.3233 0.2756 0.5445 0.524 0.5835 0.465 0.4197 0.6993 0.7025 0.6707 0.6516 0.4571 0.372 0.5852 0.763 0.6549 0.8837 0.6356 0.5427 0.5134 0.1816 0.4895 0.5164 0.7856 0.5403 0.7798 0.4613 0.5787 0.2726 0.5074 0.6827 0.5935 0.6571 0.2798 0.6714 0.5419 0.8675 0.559 0.6971 0.5511 0.8073 0.5555 0.7798 0.06 0.5658 0.645 0.3852 0.445 0.4679 0.7985 0.5777 0.0795 0.5414 0.2749 0.2053 0.0502 0.0916 0.3964 0.554 0.5678 0.4091 0.0397 0.5701 0.4452 0.6637 0.4939 0.1627 0.8418 0.2806 0.3531 0.406 0.2988 0.5642 0.5622 0.7912 0.0446 0.7777 0.6763 0.5105 0.6073 0.6196 0.2534 0.583 0.615 0.7937 0.6077 0.7063 0.6111 0.6894 0.5687 0.687 0.4057 0.3167 0.7721 0.5363 0.343 0.3009 0.0548 0.4514 0.6407 0.6095 0.0317 0.6484 0.2017 0.4758 0. 0.8398 0.3947 0.7471 0.7382] 2022-08-23 00:23:53 [INFO] [EVAL] Class Recall: [0.8381 0.8788 0.9657 0.86 0.8624 0.8527 0.8842 0.889 0.7212 0.7805 0.6192 0.7067 0.8696 0.4186 0.3269 0.6227 0.764 0.559 0.7383 0.5602 0.8752 0.6196 0.7393 0.6791 0.5255 0.6266 0.6255 0.4841 0.4243 0.3752 0.3588 0.6493 0.3369 0.5427 0.3867 0.5037 0.5378 0.6279 0.3786 0.554 0.4075 0.1085 0.4785 0.3315 0.4304 0.4046 0.3958 0.6112 0.792 0.7563 0.7207 0.6286 0.2795 0.3293 0.9603 0.633 0.9381 0.5239 0.7829 0.4068 0.213 0.6608 0.4993 0.1385 0.6929 0.8004 0.3663 0.5191 0.0779 0.4301 0.5217 0.7823 0.3621 0.56 0.527 0.4488 0.6759 0.3526 0.3351 0.4205 0.7596 0.5404 0.3294 0.0235 0.6262 0.7477 0.072 0.0521 0.3834 0.5148 0.5786 0.1625 0.2844 0.0832 0.1677 0.0289 0.0129 0.1798 0.3119 0.4174 0.0659 0.0738 0.3303 0.1125 0.1018 0.6406 0.1529 0.5477 0.1692 0.26 0.0377 0.3769 0.1039 0.8643 0.9934 0.0069 0.3996 0.9273 0.149 0.4411 0.6083 0.0034 0.2548 0.0993 0.6638 0.3659 0.4197 0.5987 0.6194 0.3317 0.8252 0.0549 0.5527 0.2071 0.1755 0.1786 0.1726 0.0325 0.1513 0.4519 0.712 0.0917 0.4086 0.0235 0.3983 0. 0.4323 0.0353 0.1278 0.1409] 2022-08-23 00:23:54 [INFO] [EVAL] The model with the best validation mIoU (0.3413) was saved at iter 58000. 
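The per-iteration training statistics are tied together by simple arithmetic: ips is (up to rounding) the per-process batch size divided by batch_cost, and the ETA is the remaining iteration count times the current batch_cost. A minimal sketch checking this against the training line for iter 58000 above (batch_cost 0.1727, ips 46.3224, ETA 04:53:35), assuming ips is computed from the per-card batch size of 8 given in the config:

import datetime

batch_size = 8            # per-process batch size from the config (assumed basis for ips)
batch_cost = 0.1727       # seconds per iteration, as logged at iter 58000
cur_iter, total_iters = 58000, 160000

ips = batch_size / batch_cost                       # ~46.32 samples/sec (logged: 46.3224)
eta = datetime.timedelta(seconds=int((total_iters - cur_iter) * batch_cost))
print(f"ips: {ips:.4f} samples/sec | ETA {eta}")    # ETA 4:53:35, matching the logged value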
2022-08-23 00:24:04 [INFO] [TRAIN] epoch: 46, iter: 58050/160000, loss: 0.6521, lr: 0.000772, batch_cost: 0.2015, reader_cost: 0.00257, ips: 39.7082 samples/sec | ETA 05:42:19 2022-08-23 00:24:18 [INFO] [TRAIN] epoch: 47, iter: 58100/160000, loss: 0.6876, lr: 0.000771, batch_cost: 0.2788, reader_cost: 0.03637, ips: 28.6977 samples/sec | ETA 07:53:26 2022-08-23 00:24:30 [INFO] [TRAIN] epoch: 47, iter: 58150/160000, loss: 0.6885, lr: 0.000771, batch_cost: 0.2547, reader_cost: 0.00674, ips: 31.4147 samples/sec | ETA 07:12:16 2022-08-23 00:24:44 [INFO] [TRAIN] epoch: 47, iter: 58200/160000, loss: 0.6535, lr: 0.000771, batch_cost: 0.2812, reader_cost: 0.00117, ips: 28.4453 samples/sec | ETA 07:57:10 2022-08-23 00:24:58 [INFO] [TRAIN] epoch: 47, iter: 58250/160000, loss: 0.6349, lr: 0.000770, batch_cost: 0.2790, reader_cost: 0.00138, ips: 28.6753 samples/sec | ETA 07:53:06 2022-08-23 00:25:10 [INFO] [TRAIN] epoch: 47, iter: 58300/160000, loss: 0.6393, lr: 0.000770, batch_cost: 0.2376, reader_cost: 0.00084, ips: 33.6641 samples/sec | ETA 06:42:48 2022-08-23 00:25:23 [INFO] [TRAIN] epoch: 47, iter: 58350/160000, loss: 0.6139, lr: 0.000770, batch_cost: 0.2605, reader_cost: 0.00067, ips: 30.7147 samples/sec | ETA 07:21:15 2022-08-23 00:25:35 [INFO] [TRAIN] epoch: 47, iter: 58400/160000, loss: 0.6391, lr: 0.000769, batch_cost: 0.2304, reader_cost: 0.00069, ips: 34.7186 samples/sec | ETA 06:30:11 2022-08-23 00:25:47 [INFO] [TRAIN] epoch: 47, iter: 58450/160000, loss: 0.6174, lr: 0.000769, batch_cost: 0.2362, reader_cost: 0.00138, ips: 33.8670 samples/sec | ETA 06:39:47 2022-08-23 00:26:01 [INFO] [TRAIN] epoch: 47, iter: 58500/160000, loss: 0.6350, lr: 0.000768, batch_cost: 0.2866, reader_cost: 0.00048, ips: 27.9169 samples/sec | ETA 08:04:46 2022-08-23 00:26:14 [INFO] [TRAIN] epoch: 47, iter: 58550/160000, loss: 0.6480, lr: 0.000768, batch_cost: 0.2509, reader_cost: 0.00071, ips: 31.8804 samples/sec | ETA 07:04:17 2022-08-23 00:26:27 [INFO] [TRAIN] epoch: 47, iter: 58600/160000, loss: 0.6164, lr: 0.000768, batch_cost: 0.2607, reader_cost: 0.00093, ips: 30.6848 samples/sec | ETA 07:20:36 2022-08-23 00:26:38 [INFO] [TRAIN] epoch: 47, iter: 58650/160000, loss: 0.6707, lr: 0.000767, batch_cost: 0.2311, reader_cost: 0.00162, ips: 34.6244 samples/sec | ETA 06:30:17 2022-08-23 00:26:51 [INFO] [TRAIN] epoch: 47, iter: 58700/160000, loss: 0.6511, lr: 0.000767, batch_cost: 0.2569, reader_cost: 0.00116, ips: 31.1356 samples/sec | ETA 07:13:48 2022-08-23 00:27:04 [INFO] [TRAIN] epoch: 47, iter: 58750/160000, loss: 0.6190, lr: 0.000767, batch_cost: 0.2558, reader_cost: 0.00119, ips: 31.2742 samples/sec | ETA 07:11:39 2022-08-23 00:27:17 [INFO] [TRAIN] epoch: 47, iter: 58800/160000, loss: 0.6380, lr: 0.000766, batch_cost: 0.2744, reader_cost: 0.00093, ips: 29.1519 samples/sec | ETA 07:42:51 2022-08-23 00:27:30 [INFO] [TRAIN] epoch: 47, iter: 58850/160000, loss: 0.6864, lr: 0.000766, batch_cost: 0.2516, reader_cost: 0.00378, ips: 31.8009 samples/sec | ETA 07:04:05 2022-08-23 00:27:44 [INFO] [TRAIN] epoch: 47, iter: 58900/160000, loss: 0.6924, lr: 0.000765, batch_cost: 0.2777, reader_cost: 0.00138, ips: 28.8057 samples/sec | ETA 07:47:57 2022-08-23 00:27:56 [INFO] [TRAIN] epoch: 47, iter: 58950/160000, loss: 0.6589, lr: 0.000765, batch_cost: 0.2395, reader_cost: 0.00272, ips: 33.4018 samples/sec | ETA 06:43:22 2022-08-23 00:28:09 [INFO] [TRAIN] epoch: 47, iter: 59000/160000, loss: 0.6633, lr: 0.000765, batch_cost: 0.2646, reader_cost: 0.00456, ips: 30.2335 samples/sec | ETA 07:25:25 2022-08-23 00:28:09 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 176s - batch_cost: 0.1756 - reader cost: 0.0013 2022-08-23 00:31:05 [INFO] [EVAL] #Images: 2000 mIoU: 0.3319 Acc: 0.7549 Kappa: 0.7366 Dice: 0.4614 2022-08-23 00:31:05 [INFO] [EVAL] Class IoU: [0.6738 0.7534 0.9294 0.7244 0.6658 0.7455 0.7585 0.7742 0.5152 0.6479 0.4726 0.5481 0.6852 0.2818 0.2775 0.4107 0.5133 0.4122 0.585 0.4076 0.7361 0.4402 0.6246 0.4737 0.3184 0.3491 0.4101 0.4338 0.387 0.2566 0.2482 0.4931 0.275 0.3366 0.2802 0.3754 0.4162 0.5202 0.2676 0.3692 0.192 0.1563 0.331 0.2405 0.2737 0.2556 0.346 0.4914 0.4563 0.5319 0.5167 0.3752 0.2164 0.2459 0.7606 0.4469 0.8438 0.4214 0.4282 0.2776 0.0701 0.18 0.3253 0.2283 0.4314 0.6301 0.2885 0.3932 0.1135 0.3553 0.4516 0.4795 0.392 0.2239 0.4264 0.3331 0.4447 0.2422 0.259 0.1943 0.5379 0.369 0.3754 0.0173 0.1398 0.534 0.0524 0.0739 0.3245 0.4951 0.4107 0.0572 0.2378 0.1118 0.0021 0.0163 0.011 0.1386 0.1704 0.3713 0.0623 0.0288 0.2759 0.2097 0.1389 0.4595 0.0793 0.4974 0.0751 0.2416 0.0667 0.0636 0.1361 0.6438 0.8228 0.0095 0.3416 0.6682 0.1206 0.2742 0.4331 0.0114 0.2667 0.0503 0.2518 0.2549 0.301 0.4396 0.1567 0.2927 0.5989 0.0436 0.2834 0.2386 0.1208 0.1299 0.0851 0.0169 0.0835 0.36 0.3139 0.0296 0.3788 0.0434 0.2206 0. 0.3534 0.0272 0.1254 0.1512] 2022-08-23 00:31:05 [INFO] [EVAL] Class Precision: [0.7912 0.8345 0.971 0.8455 0.7363 0.8533 0.8984 0.833 0.7053 0.7819 0.6764 0.6471 0.7744 0.5134 0.4869 0.6014 0.6473 0.6872 0.6887 0.5588 0.8221 0.6623 0.7866 0.6249 0.4515 0.451 0.5456 0.7426 0.5575 0.3496 0.4386 0.6304 0.4203 0.4163 0.4747 0.4839 0.5986 0.7071 0.3879 0.6175 0.4273 0.2879 0.4896 0.5529 0.3783 0.4447 0.5448 0.7293 0.5952 0.6501 0.6952 0.4893 0.3782 0.4969 0.8064 0.578 0.8905 0.623 0.6657 0.5005 0.1278 0.379 0.5639 0.6362 0.5321 0.698 0.4068 0.5775 0.3367 0.654 0.5944 0.6436 0.6046 0.342 0.6472 0.4766 0.6113 0.5552 0.8359 0.2988 0.6086 0.668 0.7052 0.0414 0.3075 0.7578 0.3234 0.3564 0.4225 0.6758 0.5333 0.0745 0.3624 0.324 0.0083 0.058 0.1065 0.383 0.4307 0.5762 0.303 0.0694 0.5614 0.5451 0.982 0.6318 0.2171 0.7265 0.1918 0.4252 0.1916 0.0932 0.3673 0.7063 0.8286 0.069 0.7504 0.7949 0.1434 0.5714 0.5929 0.3446 0.6202 0.6886 0.737 0.4774 0.7006 0.669 0.5229 0.4627 0.7205 0.403 0.4004 0.7231 0.5136 0.3058 0.4274 0.0475 0.4817 0.6435 0.3463 0.0388 0.5151 0.4122 0.6504 0. 0.8798 0.3905 0.7606 0.8216] 2022-08-23 00:31:05 [INFO] [EVAL] Class Recall: [0.8195 0.8856 0.9559 0.835 0.8743 0.8551 0.8297 0.9165 0.6566 0.7908 0.6107 0.7818 0.8561 0.3845 0.3923 0.5644 0.7125 0.5074 0.7953 0.601 0.8756 0.5675 0.7521 0.6619 0.5193 0.6069 0.6228 0.5105 0.5586 0.4909 0.3639 0.6936 0.4432 0.6373 0.4062 0.626 0.5773 0.6631 0.463 0.4786 0.2585 0.2547 0.5053 0.2985 0.4973 0.3755 0.4867 0.6011 0.6616 0.7453 0.668 0.6167 0.336 0.3274 0.9304 0.6633 0.9415 0.5656 0.5454 0.3839 0.1343 0.2552 0.4347 0.2625 0.6952 0.8663 0.498 0.552 0.1462 0.4376 0.6526 0.6528 0.5272 0.3932 0.5554 0.5251 0.62 0.3005 0.2728 0.3572 0.8224 0.4519 0.4452 0.029 0.2041 0.6438 0.0588 0.0853 0.5832 0.6493 0.6413 0.1971 0.4089 0.1458 0.0027 0.0222 0.0121 0.1785 0.2199 0.5107 0.0727 0.0469 0.3518 0.2541 0.1393 0.6275 0.111 0.612 0.1098 0.3587 0.0929 0.1668 0.1777 0.879 0.9917 0.0109 0.3854 0.8075 0.4307 0.3452 0.6164 0.0117 0.3187 0.0515 0.2766 0.3536 0.3454 0.5618 0.1828 0.4435 0.7801 0.0466 0.4924 0.2625 0.1365 0.1842 0.0961 0.0255 0.0918 0.4497 0.7704 0.111 0.5889 0.0463 0.2503 0. 
0.3714 0.0284 0.1306 0.1564] 2022-08-23 00:31:05 [INFO] [EVAL] The model with the best validation mIoU (0.3413) was saved at iter 58000. 2022-08-23 00:31:18 [INFO] [TRAIN] epoch: 47, iter: 59050/160000, loss: 0.6527, lr: 0.000764, batch_cost: 0.2606, reader_cost: 0.00429, ips: 30.7005 samples/sec | ETA 07:18:25 2022-08-23 00:31:31 [INFO] [TRAIN] epoch: 47, iter: 59100/160000, loss: 0.6460, lr: 0.000764, batch_cost: 0.2623, reader_cost: 0.00707, ips: 30.5027 samples/sec | ETA 07:21:03 2022-08-23 00:31:46 [INFO] [TRAIN] epoch: 47, iter: 59150/160000, loss: 0.7276, lr: 0.000764, batch_cost: 0.2879, reader_cost: 0.00084, ips: 27.7878 samples/sec | ETA 08:03:54 2022-08-23 00:31:59 [INFO] [TRAIN] epoch: 47, iter: 59200/160000, loss: 0.6495, lr: 0.000763, batch_cost: 0.2693, reader_cost: 0.00057, ips: 29.7014 samples/sec | ETA 07:32:30 2022-08-23 00:32:12 [INFO] [TRAIN] epoch: 47, iter: 59250/160000, loss: 0.7206, lr: 0.000763, batch_cost: 0.2569, reader_cost: 0.00093, ips: 31.1373 samples/sec | ETA 07:11:25 2022-08-23 00:32:24 [INFO] [TRAIN] epoch: 47, iter: 59300/160000, loss: 0.6338, lr: 0.000762, batch_cost: 0.2457, reader_cost: 0.01223, ips: 32.5637 samples/sec | ETA 06:52:19 2022-08-23 00:32:38 [INFO] [TRAIN] epoch: 47, iter: 59350/160000, loss: 0.6512, lr: 0.000762, batch_cost: 0.2628, reader_cost: 0.00513, ips: 30.4363 samples/sec | ETA 07:20:55 2022-08-23 00:32:55 [INFO] [TRAIN] epoch: 48, iter: 59400/160000, loss: 0.6834, lr: 0.000762, batch_cost: 0.3411, reader_cost: 0.07718, ips: 23.4548 samples/sec | ETA 09:31:52 2022-08-23 00:33:07 [INFO] [TRAIN] epoch: 48, iter: 59450/160000, loss: 0.6331, lr: 0.000761, batch_cost: 0.2422, reader_cost: 0.00164, ips: 33.0349 samples/sec | ETA 06:45:49 2022-08-23 00:33:18 [INFO] [TRAIN] epoch: 48, iter: 59500/160000, loss: 0.6505, lr: 0.000761, batch_cost: 0.2321, reader_cost: 0.00845, ips: 34.4752 samples/sec | ETA 06:28:41 2022-08-23 00:33:30 [INFO] [TRAIN] epoch: 48, iter: 59550/160000, loss: 0.6562, lr: 0.000761, batch_cost: 0.2419, reader_cost: 0.00046, ips: 33.0681 samples/sec | ETA 06:45:01 2022-08-23 00:33:43 [INFO] [TRAIN] epoch: 48, iter: 59600/160000, loss: 0.6580, lr: 0.000760, batch_cost: 0.2492, reader_cost: 0.00206, ips: 32.0986 samples/sec | ETA 06:57:02 2022-08-23 00:33:57 [INFO] [TRAIN] epoch: 48, iter: 59650/160000, loss: 0.6684, lr: 0.000760, batch_cost: 0.2748, reader_cost: 0.00105, ips: 29.1118 samples/sec | ETA 07:39:36 2022-08-23 00:34:10 [INFO] [TRAIN] epoch: 48, iter: 59700/160000, loss: 0.6813, lr: 0.000759, batch_cost: 0.2750, reader_cost: 0.00899, ips: 29.0879 samples/sec | ETA 07:39:45 2022-08-23 00:34:22 [INFO] [TRAIN] epoch: 48, iter: 59750/160000, loss: 0.6126, lr: 0.000759, batch_cost: 0.2393, reader_cost: 0.00183, ips: 33.4328 samples/sec | ETA 06:39:48 2022-08-23 00:34:37 [INFO] [TRAIN] epoch: 48, iter: 59800/160000, loss: 0.6512, lr: 0.000759, batch_cost: 0.2886, reader_cost: 0.00120, ips: 27.7191 samples/sec | ETA 08:01:58 2022-08-23 00:34:49 [INFO] [TRAIN] epoch: 48, iter: 59850/160000, loss: 0.6987, lr: 0.000758, batch_cost: 0.2465, reader_cost: 0.00747, ips: 32.4531 samples/sec | ETA 06:51:27 2022-08-23 00:35:02 [INFO] [TRAIN] epoch: 48, iter: 59900/160000, loss: 0.6408, lr: 0.000758, batch_cost: 0.2584, reader_cost: 0.00345, ips: 30.9586 samples/sec | ETA 07:11:06 2022-08-23 00:35:13 [INFO] [TRAIN] epoch: 48, iter: 59950/160000, loss: 0.6607, lr: 0.000757, batch_cost: 0.2199, reader_cost: 0.00319, ips: 36.3816 samples/sec | ETA 06:06:40 2022-08-23 00:35:23 [INFO] [TRAIN] epoch: 48, iter: 60000/160000, loss: 
0.5952, lr: 0.000757, batch_cost: 0.1937, reader_cost: 0.00081, ips: 41.2920 samples/sec | ETA 05:22:54 2022-08-23 00:35:23 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 157s - batch_cost: 0.1568 - reader cost: 8.3676e-04 2022-08-23 00:38:00 [INFO] [EVAL] #Images: 2000 mIoU: 0.3361 Acc: 0.7564 Kappa: 0.7376 Dice: 0.4664 2022-08-23 00:38:00 [INFO] [EVAL] Class IoU: [0.6707 0.7696 0.9302 0.722 0.6939 0.7525 0.7689 0.7721 0.5124 0.6046 0.462 0.5594 0.689 0.2876 0.222 0.4137 0.504 0.4482 0.602 0.3932 0.7296 0.4244 0.592 0.4798 0.3224 0.3216 0.4361 0.4333 0.3522 0.268 0.2194 0.4949 0.2817 0.3288 0.2807 0.3776 0.4161 0.49 0.2485 0.3808 0.2302 0.1232 0.3569 0.2636 0.2763 0.2163 0.3359 0.4598 0.59 0.5424 0.5091 0.3071 0.1596 0.2411 0.7079 0.4719 0.8494 0.4041 0.3736 0.2971 0.1187 0.21 0.3419 0.2256 0.4221 0.6794 0.2973 0.4024 0.0922 0.3417 0.4065 0.4887 0.3785 0.225 0.4349 0.3414 0.3824 0.2507 0.2835 0.1841 0.6095 0.3486 0.3245 0.0179 0.4411 0.5229 0.1195 0.0801 0.3022 0.4717 0.3849 0.0335 0.2087 0.0781 0.002 0.0196 0.0215 0.0808 0.1203 0.3124 0.0423 0.0235 0.2656 0.6753 0.1493 0.4542 0.0888 0.5113 0.1136 0.1182 0.0723 0.198 0.127 0.5281 0.8033 0.0111 0.3431 0.6703 0.1009 0.056 0.397 0.0149 0.2222 0.2059 0.3932 0.2487 0.3556 0.4056 0.2072 0.2216 0.569 0.0029 0.3363 0.2401 0.0766 0.0973 0.1287 0.0203 0.1286 0.3629 0.4326 0. 0.3525 0.2737 0.3106 0. 0.3207 0.027 0.1324 0.1959] 2022-08-23 00:38:00 [INFO] [EVAL] Class Precision: [0.7554 0.8743 0.962 0.8384 0.796 0.8626 0.8741 0.8549 0.6687 0.7615 0.7054 0.7172 0.7833 0.46 0.5602 0.5663 0.685 0.6696 0.7376 0.614 0.8078 0.6478 0.7201 0.6026 0.5175 0.4011 0.5533 0.7619 0.6507 0.3325 0.435 0.6373 0.4056 0.4815 0.478 0.5455 0.5548 0.6639 0.4251 0.608 0.4283 0.2534 0.6027 0.5344 0.3291 0.4607 0.528 0.573 0.6332 0.6276 0.7045 0.3984 0.3101 0.5213 0.7281 0.6392 0.9 0.6226 0.6893 0.4891 0.1603 0.431 0.5279 0.5589 0.5017 0.8534 0.4936 0.5781 0.1499 0.6584 0.6713 0.5662 0.6152 0.2682 0.6497 0.4879 0.43 0.4928 0.7368 0.4423 0.711 0.6784 0.7834 0.0441 0.6157 0.7759 0.4293 0.3763 0.549 0.7106 0.5053 0.038 0.3339 0.2952 0.0086 0.0552 0.0844 0.3609 0.5701 0.5809 0.4976 0.0639 0.5795 0.832 0.8903 0.5996 0.4364 0.7888 0.2017 0.4479 0.2297 0.218 0.5039 0.7704 0.809 0.1046 0.6527 0.7001 0.1451 0.5651 0.4902 0.3384 0.606 0.5057 0.7359 0.5822 0.6941 0.5118 0.606 0.7163 0.6279 0.0658 0.6816 0.6624 0.745 0.3337 0.2975 0.0784 0.5229 0.6291 0.6683 0.0001 0.634 0.6465 0.5275 0. 0.8343 0.3532 0.6249 0.6839] 2022-08-23 00:38:00 [INFO] [EVAL] Class Recall: [0.8567 0.8653 0.9656 0.8387 0.844 0.855 0.8646 0.8886 0.6868 0.7458 0.5724 0.7178 0.8513 0.4341 0.2689 0.6056 0.6559 0.5754 0.7661 0.5222 0.8829 0.5516 0.7688 0.7019 0.461 0.6187 0.6731 0.5012 0.4343 0.5803 0.3068 0.689 0.4797 0.509 0.4048 0.551 0.6246 0.6517 0.3743 0.5047 0.3322 0.1933 0.4667 0.3422 0.6327 0.2896 0.48 0.6994 0.8963 0.7998 0.6472 0.5727 0.2475 0.3096 0.9624 0.6433 0.938 0.5352 0.4493 0.4308 0.3137 0.2906 0.4925 0.2744 0.7267 0.7692 0.4278 0.5698 0.193 0.4153 0.5076 0.7813 0.4959 0.5829 0.5681 0.532 0.7756 0.3378 0.3155 0.2397 0.8103 0.4177 0.3565 0.0292 0.6086 0.6159 0.1421 0.0924 0.402 0.5838 0.6175 0.22 0.3576 0.096 0.0026 0.0294 0.028 0.0943 0.1323 0.4033 0.0441 0.0357 0.329 0.782 0.1521 0.652 0.1003 0.5923 0.2063 0.1384 0.0955 0.6827 0.1451 0.6267 0.9913 0.0122 0.4197 0.9404 0.249 0.0586 0.6761 0.0153 0.2598 0.2578 0.4578 0.3027 0.4216 0.6616 0.2394 0.2429 0.8585 0.003 0.399 0.2736 0.0787 0.1208 0.1849 0.0266 0.1457 0.4617 0.5509 0.0001 0.4426 0.3218 0.4304 0. 
0.3425 0.0284 0.1438 0.2154] 2022-08-23 00:38:00 [INFO] [EVAL] The model with the best validation mIoU (0.3413) was saved at iter 58000. 2022-08-23 00:38:09 [INFO] [TRAIN] epoch: 48, iter: 60050/160000, loss: 0.6518, lr: 0.000757, batch_cost: 0.1790, reader_cost: 0.00417, ips: 44.6827 samples/sec | ETA 04:58:15 2022-08-23 00:38:18 [INFO] [TRAIN] epoch: 48, iter: 60100/160000, loss: 0.6545, lr: 0.000756, batch_cost: 0.1787, reader_cost: 0.00093, ips: 44.7582 samples/sec | ETA 04:57:35 2022-08-23 00:38:27 [INFO] [TRAIN] epoch: 48, iter: 60150/160000, loss: 0.6397, lr: 0.000756, batch_cost: 0.1826, reader_cost: 0.00067, ips: 43.8092 samples/sec | ETA 05:03:53 2022-08-23 00:38:37 [INFO] [TRAIN] epoch: 48, iter: 60200/160000, loss: 0.6916, lr: 0.000756, batch_cost: 0.2000, reader_cost: 0.00047, ips: 40.0066 samples/sec | ETA 05:32:36 2022-08-23 00:38:50 [INFO] [TRAIN] epoch: 48, iter: 60250/160000, loss: 0.6231, lr: 0.000755, batch_cost: 0.2618, reader_cost: 0.00084, ips: 30.5579 samples/sec | ETA 07:15:14 2022-08-23 00:39:01 [INFO] [TRAIN] epoch: 48, iter: 60300/160000, loss: 0.6174, lr: 0.000755, batch_cost: 0.2264, reader_cost: 0.01008, ips: 35.3353 samples/sec | ETA 06:16:12 2022-08-23 00:39:15 [INFO] [TRAIN] epoch: 48, iter: 60350/160000, loss: 0.6685, lr: 0.000754, batch_cost: 0.2778, reader_cost: 0.00070, ips: 28.7940 samples/sec | ETA 07:41:26 2022-08-23 00:39:28 [INFO] [TRAIN] epoch: 48, iter: 60400/160000, loss: 0.6313, lr: 0.000754, batch_cost: 0.2561, reader_cost: 0.00240, ips: 31.2406 samples/sec | ETA 07:05:05 2022-08-23 00:39:41 [INFO] [TRAIN] epoch: 48, iter: 60450/160000, loss: 0.6836, lr: 0.000754, batch_cost: 0.2643, reader_cost: 0.01108, ips: 30.2720 samples/sec | ETA 07:18:28 2022-08-23 00:39:53 [INFO] [TRAIN] epoch: 48, iter: 60500/160000, loss: 0.6181, lr: 0.000753, batch_cost: 0.2441, reader_cost: 0.00755, ips: 32.7706 samples/sec | ETA 06:44:50 2022-08-23 00:40:06 [INFO] [TRAIN] epoch: 48, iter: 60550/160000, loss: 0.6681, lr: 0.000753, batch_cost: 0.2563, reader_cost: 0.02091, ips: 31.2110 samples/sec | ETA 07:04:50 2022-08-23 00:40:19 [INFO] [TRAIN] epoch: 48, iter: 60600/160000, loss: 0.6419, lr: 0.000753, batch_cost: 0.2553, reader_cost: 0.00041, ips: 31.3327 samples/sec | ETA 07:02:59 2022-08-23 00:40:37 [INFO] [TRAIN] epoch: 49, iter: 60650/160000, loss: 0.6430, lr: 0.000752, batch_cost: 0.3664, reader_cost: 0.08650, ips: 21.8322 samples/sec | ETA 10:06:44 2022-08-23 00:40:51 [INFO] [TRAIN] epoch: 49, iter: 60700/160000, loss: 0.6583, lr: 0.000752, batch_cost: 0.2733, reader_cost: 0.00121, ips: 29.2724 samples/sec | ETA 07:32:18 2022-08-23 00:41:04 [INFO] [TRAIN] epoch: 49, iter: 60750/160000, loss: 0.6352, lr: 0.000751, batch_cost: 0.2670, reader_cost: 0.00045, ips: 29.9579 samples/sec | ETA 07:21:43 2022-08-23 00:41:17 [INFO] [TRAIN] epoch: 49, iter: 60800/160000, loss: 0.6961, lr: 0.000751, batch_cost: 0.2538, reader_cost: 0.01200, ips: 31.5152 samples/sec | ETA 06:59:41 2022-08-23 00:41:30 [INFO] [TRAIN] epoch: 49, iter: 60850/160000, loss: 0.6411, lr: 0.000751, batch_cost: 0.2562, reader_cost: 0.02695, ips: 31.2271 samples/sec | ETA 07:03:20 2022-08-23 00:41:44 [INFO] [TRAIN] epoch: 49, iter: 60900/160000, loss: 0.6183, lr: 0.000750, batch_cost: 0.2824, reader_cost: 0.00045, ips: 28.3242 samples/sec | ETA 07:46:30 2022-08-23 00:41:56 [INFO] [TRAIN] epoch: 49, iter: 60950/160000, loss: 0.6303, lr: 0.000750, batch_cost: 0.2476, reader_cost: 0.00052, ips: 32.3083 samples/sec | ETA 06:48:46 2022-08-23 00:42:11 [INFO] [TRAIN] epoch: 49, iter: 61000/160000, loss: 
0.6357, lr: 0.000750, batch_cost: 0.2953, reader_cost: 0.00047, ips: 27.0948 samples/sec | ETA 08:07:10 2022-08-23 00:42:11 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 156s - batch_cost: 0.1563 - reader cost: 0.0013 2022-08-23 00:44:48 [INFO] [EVAL] #Images: 2000 mIoU: 0.3362 Acc: 0.7574 Kappa: 0.7391 Dice: 0.4678 2022-08-23 00:44:48 [INFO] [EVAL] Class IoU: [0.6732 0.7648 0.9299 0.7206 0.6772 0.7554 0.7603 0.7594 0.5205 0.651 0.4625 0.5121 0.6863 0.3097 0.2601 0.4054 0.5081 0.4391 0.6047 0.4076 0.73 0.5073 0.608 0.4723 0.3602 0.2938 0.4233 0.4179 0.3685 0.2872 0.2294 0.4771 0.2383 0.3109 0.2946 0.3563 0.4088 0.5265 0.2517 0.3776 0.2518 0.0821 0.3495 0.2442 0.354 0.2348 0.3106 0.4893 0.495 0.4982 0.4935 0.3751 0.1983 0.2361 0.6909 0.5281 0.8476 0.3461 0.4501 0.2813 0.1737 0.3592 0.3398 0.2231 0.4109 0.6657 0.2516 0.4256 0.1021 0.3371 0.4237 0.5473 0.3822 0.2263 0.4062 0.3351 0.4274 0.2875 0.1799 0.117 0.528 0.3728 0.2907 0.0218 0.3579 0.5531 0.0825 0.0787 0.2797 0.46 0.4034 0.0299 0.2254 0.0804 0.0085 0.0363 0.0502 0.1538 0.265 0.4056 0.0775 0.0298 0.247 0.5865 0.1034 0.3618 0.08 0.5285 0.0828 0.1347 0.0786 0.4367 0.1424 0.5571 0.7728 0.0091 0.3384 0.6376 0.1841 0.1541 0.4471 0.0098 0.2333 0.1595 0.2026 0.2888 0.4423 0.3906 0.0785 0.2938 0.4637 0.0324 0.2944 0.2139 0.1561 0.1266 0.1281 0.018 0.1783 0.2998 0.4323 0.0137 0.2543 0.0285 0.1954 0. 0.3549 0.0338 0.1296 0.182 ] 2022-08-23 00:44:48 [INFO] [EVAL] Class Precision: [0.7829 0.8616 0.9576 0.8328 0.7462 0.8634 0.84 0.8157 0.6624 0.7824 0.7048 0.7564 0.7588 0.4638 0.5018 0.6367 0.7162 0.6951 0.7283 0.6136 0.8168 0.6719 0.7291 0.57 0.4835 0.4242 0.6243 0.6664 0.6488 0.4109 0.4484 0.6956 0.4994 0.3671 0.4615 0.4715 0.5457 0.7328 0.479 0.6126 0.3348 0.2301 0.5318 0.5542 0.4813 0.4514 0.4753 0.6747 0.6168 0.5876 0.7354 0.5459 0.3092 0.637 0.7139 0.7583 0.9 0.6762 0.5627 0.4891 0.2427 0.5132 0.4854 0.6596 0.4895 0.7895 0.3654 0.5966 0.2761 0.5369 0.6953 0.6942 0.5911 0.282 0.7047 0.5288 0.5282 0.6567 0.3843 0.3185 0.5852 0.6671 0.778 0.0733 0.5198 0.683 0.5645 0.3047 0.5368 0.6535 0.5863 0.0348 0.3982 0.3006 0.0159 0.0994 0.1878 0.4166 0.4298 0.6065 0.243 0.0458 0.7089 0.8602 0.6903 0.44 0.4561 0.801 0.1915 0.516 0.2494 0.7265 0.4261 0.7917 0.7773 0.2147 0.6575 0.7033 0.3051 0.452 0.5478 0.118 0.4759 0.6087 0.7162 0.6319 0.8717 0.482 0.4389 0.6508 0.4901 0.1672 0.5285 0.5624 0.4392 0.2476 0.3303 0.0889 0.544 0.7149 0.5321 0.0182 0.6827 0.2789 0.5503 0. 0.8905 0.314 0.5556 0.7558] 2022-08-23 00:44:48 [INFO] [EVAL] Class Recall: [0.8277 0.872 0.9698 0.8425 0.8799 0.8579 0.889 0.9168 0.7084 0.7949 0.5737 0.6132 0.8779 0.4825 0.3507 0.5274 0.6362 0.5439 0.7808 0.5484 0.8729 0.6742 0.7854 0.7337 0.5854 0.4886 0.568 0.5285 0.4603 0.4882 0.3196 0.6031 0.3132 0.6702 0.4488 0.5933 0.6197 0.6516 0.3466 0.4961 0.5039 0.1132 0.5049 0.3039 0.5725 0.3286 0.4728 0.6405 0.715 0.766 0.6 0.5451 0.3561 0.2728 0.9556 0.635 0.9357 0.4148 0.6923 0.3983 0.3793 0.5448 0.5313 0.2522 0.719 0.8094 0.4468 0.5976 0.1394 0.4752 0.5202 0.7212 0.5195 0.534 0.4896 0.4778 0.6914 0.3383 0.2527 0.1561 0.8438 0.458 0.317 0.0302 0.5347 0.7441 0.0881 0.096 0.3687 0.6084 0.5639 0.1741 0.3418 0.0989 0.0181 0.0542 0.0641 0.196 0.4087 0.5505 0.1021 0.0786 0.2749 0.6482 0.1085 0.6703 0.0885 0.6084 0.1274 0.1542 0.1029 0.5226 0.1763 0.6528 0.9926 0.0094 0.4108 0.8722 0.317 0.1894 0.7087 0.0106 0.3139 0.1777 0.2203 0.3472 0.473 0.6732 0.0873 0.3489 0.8961 0.0386 0.3993 0.2566 0.195 0.2057 0.173 0.0221 0.2096 0.3405 0.6974 0.0527 0.2884 0.0308 0.2326 0. 
0.3711 0.0365 0.1446 0.1933] 2022-08-23 00:44:48 [INFO] [EVAL] The model with the best validation mIoU (0.3413) was saved at iter 58000. 2022-08-23 00:44:57 [INFO] [TRAIN] epoch: 49, iter: 61050/160000, loss: 0.6159, lr: 0.000749, batch_cost: 0.1818, reader_cost: 0.00465, ips: 44.0029 samples/sec | ETA 04:59:49 2022-08-23 00:45:07 [INFO] [TRAIN] epoch: 49, iter: 61100/160000, loss: 0.6332, lr: 0.000749, batch_cost: 0.1951, reader_cost: 0.00082, ips: 40.9981 samples/sec | ETA 05:21:38 2022-08-23 00:45:16 [INFO] [TRAIN] epoch: 49, iter: 61150/160000, loss: 0.6543, lr: 0.000748, batch_cost: 0.1819, reader_cost: 0.00049, ips: 43.9848 samples/sec | ETA 04:59:38 2022-08-23 00:45:25 [INFO] [TRAIN] epoch: 49, iter: 61200/160000, loss: 0.6405, lr: 0.000748, batch_cost: 0.1876, reader_cost: 0.00059, ips: 42.6354 samples/sec | ETA 05:08:58 2022-08-23 00:45:35 [INFO] [TRAIN] epoch: 49, iter: 61250/160000, loss: 0.6468, lr: 0.000748, batch_cost: 0.2039, reader_cost: 0.00044, ips: 39.2403 samples/sec | ETA 05:35:32 2022-08-23 00:45:48 [INFO] [TRAIN] epoch: 49, iter: 61300/160000, loss: 0.6359, lr: 0.000747, batch_cost: 0.2471, reader_cost: 0.00075, ips: 32.3712 samples/sec | ETA 06:46:32 2022-08-23 00:46:01 [INFO] [TRAIN] epoch: 49, iter: 61350/160000, loss: 0.6261, lr: 0.000747, batch_cost: 0.2751, reader_cost: 0.02265, ips: 29.0840 samples/sec | ETA 07:32:15 2022-08-23 00:46:15 [INFO] [TRAIN] epoch: 49, iter: 61400/160000, loss: 0.6337, lr: 0.000747, batch_cost: 0.2621, reader_cost: 0.00219, ips: 30.5261 samples/sec | ETA 07:10:40 2022-08-23 00:46:28 [INFO] [TRAIN] epoch: 49, iter: 61450/160000, loss: 0.6680, lr: 0.000746, batch_cost: 0.2677, reader_cost: 0.00040, ips: 29.8843 samples/sec | ETA 07:19:41 2022-08-23 00:46:42 [INFO] [TRAIN] epoch: 49, iter: 61500/160000, loss: 0.6130, lr: 0.000746, batch_cost: 0.2839, reader_cost: 0.00923, ips: 28.1777 samples/sec | ETA 07:46:05 2022-08-23 00:46:56 [INFO] [TRAIN] epoch: 49, iter: 61550/160000, loss: 0.6662, lr: 0.000745, batch_cost: 0.2734, reader_cost: 0.00050, ips: 29.2608 samples/sec | ETA 07:28:36 2022-08-23 00:47:09 [INFO] [TRAIN] epoch: 49, iter: 61600/160000, loss: 0.5912, lr: 0.000745, batch_cost: 0.2649, reader_cost: 0.01437, ips: 30.2011 samples/sec | ETA 07:14:25 2022-08-23 00:47:24 [INFO] [TRAIN] epoch: 49, iter: 61650/160000, loss: 0.6211, lr: 0.000745, batch_cost: 0.2879, reader_cost: 0.01168, ips: 27.7846 samples/sec | ETA 07:51:57 2022-08-23 00:47:37 [INFO] [TRAIN] epoch: 49, iter: 61700/160000, loss: 0.6500, lr: 0.000744, batch_cost: 0.2726, reader_cost: 0.00055, ips: 29.3462 samples/sec | ETA 07:26:37 2022-08-23 00:47:51 [INFO] [TRAIN] epoch: 49, iter: 61750/160000, loss: 0.6464, lr: 0.000744, batch_cost: 0.2811, reader_cost: 0.00262, ips: 28.4632 samples/sec | ETA 07:40:14 2022-08-23 00:48:05 [INFO] [TRAIN] epoch: 49, iter: 61800/160000, loss: 0.6450, lr: 0.000743, batch_cost: 0.2815, reader_cost: 0.00411, ips: 28.4173 samples/sec | ETA 07:40:45 2022-08-23 00:48:18 [INFO] [TRAIN] epoch: 49, iter: 61850/160000, loss: 0.6258, lr: 0.000743, batch_cost: 0.2482, reader_cost: 0.00082, ips: 32.2325 samples/sec | ETA 06:46:00 2022-08-23 00:48:35 [INFO] [TRAIN] epoch: 50, iter: 61900/160000, loss: 0.6780, lr: 0.000743, batch_cost: 0.3469, reader_cost: 0.07952, ips: 23.0594 samples/sec | ETA 09:27:13 2022-08-23 00:48:49 [INFO] [TRAIN] epoch: 50, iter: 61950/160000, loss: 0.6060, lr: 0.000742, batch_cost: 0.2729, reader_cost: 0.02087, ips: 29.3111 samples/sec | ETA 07:26:01 2022-08-23 00:49:02 [INFO] [TRAIN] epoch: 50, iter: 62000/160000, loss: 
0.6675, lr: 0.000742, batch_cost: 0.2748, reader_cost: 0.00699, ips: 29.1142 samples/sec | ETA 07:28:48 2022-08-23 00:49:02 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 197s - batch_cost: 0.1966 - reader cost: 0.0014 2022-08-23 00:52:19 [INFO] [EVAL] #Images: 2000 mIoU: 0.3359 Acc: 0.7583 Kappa: 0.7397 Dice: 0.4671 2022-08-23 00:52:19 [INFO] [EVAL] Class IoU: [0.6798 0.7596 0.9315 0.7192 0.6752 0.7499 0.7571 0.7644 0.5217 0.608 0.4766 0.5475 0.6913 0.2913 0.2961 0.4294 0.5151 0.4086 0.6016 0.4146 0.7372 0.4838 0.6214 0.4498 0.3459 0.2314 0.4512 0.4004 0.4145 0.2716 0.2852 0.5042 0.1949 0.3569 0.291 0.3816 0.4056 0.5086 0.2507 0.3764 0.2402 0.1053 0.3408 0.2497 0.3407 0.2537 0.2909 0.4805 0.5254 0.5477 0.5208 0.3306 0.1902 0.2163 0.7258 0.4311 0.8461 0.3826 0.4141 0.2763 0.078 0.2193 0.3092 0.1846 0.4367 0.6712 0.2729 0.4435 0.1239 0.3538 0.3968 0.5264 0.4117 0.228 0.4439 0.3607 0.3909 0.29 0.244 0.1004 0.6705 0.3711 0.264 0.0263 0.2094 0.5591 0.1415 0.077 0.3265 0.4929 0.4092 0.0259 0.2148 0.0643 0.022 0.0125 0.038 0.1353 0.312 0.3849 0.0916 0.0379 0.2066 0.0803 0.1556 0.5088 0.0893 0.5407 0.1123 0.1206 0.0746 0.4259 0.1462 0.5528 0.7493 0.0137 0.2659 0.5634 0.1753 0.1766 0.437 0. 0.2331 0.0494 0.3175 0.3143 0.279 0.3689 0.3386 0.3125 0.6021 0.0102 0.2644 0.2212 0.232 0.1269 0.1158 0.0266 0.1862 0.3253 0.447 0.0242 0.3324 0.0404 0.2485 0. 0.3021 0.0444 0.1328 0.2118] 2022-08-23 00:52:19 [INFO] [EVAL] Class Precision: [0.7801 0.8314 0.9627 0.8253 0.7539 0.8737 0.8873 0.8228 0.6776 0.7839 0.7363 0.6773 0.7792 0.4303 0.4893 0.6022 0.6337 0.7079 0.7156 0.5833 0.8098 0.667 0.7707 0.6735 0.4848 0.4408 0.5766 0.7292 0.6706 0.3798 0.3977 0.6616 0.4239 0.5066 0.5489 0.5142 0.6372 0.7438 0.525 0.6258 0.4416 0.2552 0.5955 0.5536 0.5135 0.4213 0.4736 0.6141 0.6136 0.6969 0.6528 0.4018 0.341 0.5203 0.7943 0.6048 0.886 0.664 0.5712 0.5195 0.1375 0.4751 0.5547 0.6614 0.5459 0.7815 0.4531 0.5825 0.2807 0.574 0.6608 0.6291 0.6372 0.3211 0.6862 0.5495 0.4908 0.6312 0.5921 0.333 0.7952 0.6451 0.7749 0.0784 0.4459 0.7247 0.3657 0.3738 0.641 0.6379 0.532 0.03 0.4949 0.3077 0.088 0.0416 0.2485 0.2959 0.4828 0.6306 0.3529 0.0693 0.6861 0.7321 0.707 0.654 0.1739 0.8107 0.1944 0.4411 0.2526 0.8308 0.3337 0.8582 0.7517 0.0677 0.7099 0.6022 0.3722 0.4643 0.6733 0. 0.6722 0.8119 0.7596 0.5569 0.9246 0.4826 0.7772 0.6475 0.6836 0.0905 0.5246 0.5973 0.4475 0.344 0.2646 0.0612 0.4648 0.6515 0.5488 0.0338 0.6577 0.4639 0.5155 0. 0.8713 0.2763 0.4571 0.6608] 2022-08-23 00:52:19 [INFO] [EVAL] Class Recall: [0.841 0.8979 0.9664 0.8484 0.8661 0.8411 0.8377 0.9151 0.694 0.7304 0.5747 0.7406 0.8597 0.4741 0.4286 0.5995 0.7334 0.4914 0.7907 0.589 0.8916 0.6379 0.7623 0.5752 0.5471 0.3275 0.6748 0.4704 0.5205 0.4882 0.5021 0.6795 0.2652 0.5471 0.3824 0.5968 0.5275 0.6166 0.3242 0.4858 0.3449 0.152 0.4435 0.3127 0.5031 0.3895 0.43 0.6883 0.7852 0.719 0.7203 0.6509 0.3008 0.2702 0.8938 0.6002 0.9494 0.4744 0.6009 0.3712 0.1527 0.2894 0.4112 0.2039 0.6859 0.8262 0.407 0.65 0.1816 0.4798 0.4982 0.7633 0.5378 0.4401 0.5569 0.5121 0.6575 0.3491 0.2933 0.1257 0.8105 0.4663 0.2859 0.038 0.2831 0.7099 0.1876 0.0884 0.3996 0.6844 0.6393 0.161 0.2751 0.0751 0.0285 0.0176 0.0429 0.1995 0.4687 0.497 0.1101 0.077 0.2282 0.0827 0.1663 0.6963 0.1552 0.6189 0.2101 0.1424 0.0957 0.4664 0.2064 0.6084 0.9957 0.0168 0.2983 0.8974 0.2489 0.2218 0.5546 0. 0.263 0.05 0.3529 0.4191 0.2855 0.6102 0.375 0.3766 0.8347 0.0113 0.3477 0.26 0.3252 0.1675 0.1708 0.045 0.2371 0.3939 0.7068 0.0785 0.4019 0.0424 0.3243 0. 
0.3162 0.0502 0.1577 0.2376] 2022-08-23 00:52:19 [INFO] [EVAL] The model with the best validation mIoU (0.3413) was saved at iter 58000. 2022-08-23 00:52:29 [INFO] [TRAIN] epoch: 50, iter: 62050/160000, loss: 0.5637, lr: 0.000742, batch_cost: 0.1884, reader_cost: 0.00380, ips: 42.4589 samples/sec | ETA 05:07:35 2022-08-23 00:52:39 [INFO] [TRAIN] epoch: 50, iter: 62100/160000, loss: 0.6184, lr: 0.000741, batch_cost: 0.1973, reader_cost: 0.00186, ips: 40.5422 samples/sec | ETA 05:21:58 2022-08-23 00:52:51 [INFO] [TRAIN] epoch: 50, iter: 62150/160000, loss: 0.6175, lr: 0.000741, batch_cost: 0.2521, reader_cost: 0.00049, ips: 31.7356 samples/sec | ETA 06:51:06 2022-08-23 00:53:05 [INFO] [TRAIN] epoch: 50, iter: 62200/160000, loss: 0.6415, lr: 0.000740, batch_cost: 0.2716, reader_cost: 0.00443, ips: 29.4532 samples/sec | ETA 07:22:44 2022-08-23 00:53:17 [INFO] [TRAIN] epoch: 50, iter: 62250/160000, loss: 0.6324, lr: 0.000740, batch_cost: 0.2378, reader_cost: 0.00339, ips: 33.6356 samples/sec | ETA 06:27:29 2022-08-23 00:53:29 [INFO] [TRAIN] epoch: 50, iter: 62300/160000, loss: 0.5982, lr: 0.000740, batch_cost: 0.2438, reader_cost: 0.00816, ips: 32.8079 samples/sec | ETA 06:37:03 2022-08-23 00:53:41 [INFO] [TRAIN] epoch: 50, iter: 62350/160000, loss: 0.6357, lr: 0.000739, batch_cost: 0.2404, reader_cost: 0.01208, ips: 33.2765 samples/sec | ETA 06:31:16 2022-08-23 00:53:53 [INFO] [TRAIN] epoch: 50, iter: 62400/160000, loss: 0.6287, lr: 0.000739, batch_cost: 0.2325, reader_cost: 0.00048, ips: 34.4023 samples/sec | ETA 06:18:16 2022-08-23 00:54:07 [INFO] [TRAIN] epoch: 50, iter: 62450/160000, loss: 0.6271, lr: 0.000739, batch_cost: 0.2800, reader_cost: 0.00035, ips: 28.5689 samples/sec | ETA 07:35:16 2022-08-23 00:54:19 [INFO] [TRAIN] epoch: 50, iter: 62500/160000, loss: 0.6457, lr: 0.000738, batch_cost: 0.2423, reader_cost: 0.00040, ips: 33.0205 samples/sec | ETA 06:33:41 2022-08-23 00:54:31 [INFO] [TRAIN] epoch: 50, iter: 62550/160000, loss: 0.6527, lr: 0.000738, batch_cost: 0.2465, reader_cost: 0.00611, ips: 32.4510 samples/sec | ETA 06:40:23 2022-08-23 00:54:44 [INFO] [TRAIN] epoch: 50, iter: 62600/160000, loss: 0.6413, lr: 0.000737, batch_cost: 0.2615, reader_cost: 0.00335, ips: 30.5964 samples/sec | ETA 07:04:27 2022-08-23 00:54:57 [INFO] [TRAIN] epoch: 50, iter: 62650/160000, loss: 0.6471, lr: 0.000737, batch_cost: 0.2623, reader_cost: 0.00048, ips: 30.4997 samples/sec | ETA 07:05:34 2022-08-23 00:55:11 [INFO] [TRAIN] epoch: 50, iter: 62700/160000, loss: 0.6581, lr: 0.000737, batch_cost: 0.2700, reader_cost: 0.00172, ips: 29.6263 samples/sec | ETA 07:17:53 2022-08-23 00:55:24 [INFO] [TRAIN] epoch: 50, iter: 62750/160000, loss: 0.6338, lr: 0.000736, batch_cost: 0.2570, reader_cost: 0.00055, ips: 31.1274 samples/sec | ETA 06:56:34 2022-08-23 00:55:35 [INFO] [TRAIN] epoch: 50, iter: 62800/160000, loss: 0.6112, lr: 0.000736, batch_cost: 0.2373, reader_cost: 0.00113, ips: 33.7159 samples/sec | ETA 06:24:23 2022-08-23 00:55:49 [INFO] [TRAIN] epoch: 50, iter: 62850/160000, loss: 0.6159, lr: 0.000736, batch_cost: 0.2619, reader_cost: 0.00432, ips: 30.5462 samples/sec | ETA 07:04:03 2022-08-23 00:56:01 [INFO] [TRAIN] epoch: 50, iter: 62900/160000, loss: 0.6150, lr: 0.000735, batch_cost: 0.2563, reader_cost: 0.00762, ips: 31.2143 samples/sec | ETA 06:54:46 2022-08-23 00:56:14 [INFO] [TRAIN] epoch: 50, iter: 62950/160000, loss: 0.6517, lr: 0.000735, batch_cost: 0.2615, reader_cost: 0.00661, ips: 30.5937 samples/sec | ETA 07:02:57 2022-08-23 00:56:27 [INFO] [TRAIN] epoch: 50, iter: 63000/160000, loss: 
0.6389, lr: 0.000734, batch_cost: 0.2459, reader_cost: 0.00823, ips: 32.5325 samples/sec | ETA 06:37:33 2022-08-23 00:56:27 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 203s - batch_cost: 0.2025 - reader cost: 0.0011 2022-08-23 00:59:50 [INFO] [EVAL] #Images: 2000 mIoU: 0.3350 Acc: 0.7582 Kappa: 0.7396 Dice: 0.4648 2022-08-23 00:59:50 [INFO] [EVAL] Class IoU: [0.6755 0.7682 0.9301 0.7196 0.6652 0.756 0.7661 0.7717 0.5168 0.6301 0.4865 0.5347 0.6897 0.2688 0.2776 0.4279 0.4855 0.408 0.607 0.4105 0.7459 0.4607 0.6133 0.4742 0.3455 0.2964 0.4069 0.4022 0.3602 0.2774 0.272 0.5085 0.2451 0.3413 0.2821 0.3376 0.4212 0.4806 0.2453 0.3772 0.2156 0.0642 0.3536 0.2576 0.3336 0.206 0.2631 0.4907 0.496 0.5398 0.5251 0.3641 0.2036 0.189 0.7158 0.448 0.8441 0.396 0.3252 0.2893 0.1779 0.1837 0.3257 0.2244 0.4467 0.6612 0.2718 0.4294 0.0859 0.3439 0.4077 0.5404 0.4028 0.2283 0.4417 0.3068 0.4706 0.3163 0.2521 0.0699 0.672 0.3503 0.2779 0.0151 0.1629 0.5616 0.1062 0.0698 0.2495 0.5045 0.4071 0.0446 0.1775 0.0534 0.0381 0.0226 0.0191 0.0701 0.3148 0.4176 0.1373 0.0127 0.2905 0.469 0.0729 0.3533 0.0756 0.5322 0.0862 0.1685 0.0619 0.4946 0.1213 0.5023 0.8663 0.0074 0.3737 0.6655 0.1082 0.1535 0.3972 0.0166 0.2364 0.1746 0.3683 0.2782 0.3891 0.4505 0.15 0.3106 0.5809 0.0242 0.3099 0.1892 0.141 0.1131 0.1377 0.0092 0.2096 0.338 0.169 0.0233 0.3155 0.1101 0.2322 0. 0.3782 0.0413 0.1187 0.1492] 2022-08-23 00:59:50 [INFO] [EVAL] Class Precision: [0.7713 0.8511 0.9674 0.8197 0.7359 0.8426 0.8674 0.8461 0.688 0.7433 0.7192 0.7087 0.7663 0.4694 0.4895 0.6282 0.6062 0.6984 0.7335 0.6019 0.831 0.7121 0.789 0.6335 0.4887 0.4678 0.5119 0.7193 0.7353 0.4176 0.4141 0.681 0.4406 0.4575 0.4923 0.5597 0.6431 0.6293 0.4338 0.5889 0.3516 0.1972 0.6268 0.5608 0.4539 0.4482 0.4324 0.6554 0.6447 0.6641 0.7284 0.4959 0.3498 0.5947 0.7371 0.5801 0.8953 0.6723 0.4939 0.4089 0.2154 0.2876 0.5372 0.6978 0.5822 0.7552 0.3659 0.5964 0.2314 0.6108 0.6954 0.7035 0.6474 0.3345 0.6879 0.4642 0.5821 0.5101 0.7198 0.2872 0.7884 0.6896 0.8112 0.0539 0.3958 0.7423 0.4629 0.3653 0.4053 0.7493 0.6478 0.0607 0.3696 0.2798 0.0878 0.0561 0.2477 0.3277 0.4977 0.6589 0.391 0.0341 0.5669 0.8983 0.8866 0.4289 0.2456 0.8463 0.1888 0.4345 0.1716 0.7972 0.6493 0.7512 0.8864 0.0849 0.6806 0.7393 0.1937 0.4463 0.5017 0.2069 0.7046 0.5747 0.7722 0.6855 0.8428 0.6236 0.8307 0.6281 0.6617 0.1571 0.508 0.6457 0.5971 0.3649 0.2872 0.0637 0.4796 0.6948 0.2056 0.0332 0.6454 0.4175 0.4946 0. 
0.806 0.3045 0.4202 0.8558] 2022-08-23 00:59:50 [INFO] [EVAL] Class Recall: [0.8447 0.8875 0.9602 0.8549 0.8738 0.8804 0.8678 0.8978 0.675 0.8054 0.6006 0.6853 0.8734 0.386 0.3906 0.5731 0.7092 0.4953 0.7787 0.5635 0.8792 0.5661 0.7337 0.6535 0.541 0.4472 0.665 0.4771 0.4139 0.4524 0.4423 0.6674 0.3558 0.5733 0.3979 0.4598 0.5497 0.6705 0.3609 0.5121 0.3577 0.087 0.4478 0.3228 0.5573 0.2759 0.4019 0.6614 0.6826 0.7427 0.6529 0.578 0.3274 0.2169 0.9612 0.6629 0.9366 0.4907 0.4878 0.4974 0.5059 0.337 0.4528 0.2485 0.6574 0.8415 0.5137 0.6053 0.1203 0.4404 0.4964 0.6997 0.516 0.4183 0.5523 0.4749 0.7107 0.4543 0.2795 0.0846 0.8199 0.4158 0.2971 0.0205 0.2168 0.6976 0.1211 0.0794 0.3935 0.6069 0.5228 0.1442 0.2546 0.0619 0.0632 0.0365 0.0203 0.0819 0.4615 0.5328 0.1747 0.0198 0.3734 0.4953 0.0736 0.6673 0.0984 0.5891 0.1368 0.2158 0.0883 0.5658 0.1298 0.6026 0.9744 0.008 0.4532 0.8696 0.1969 0.1896 0.6559 0.0177 0.2624 0.2005 0.4132 0.3189 0.4195 0.6187 0.1547 0.3807 0.8263 0.0279 0.4429 0.2111 0.1558 0.1409 0.2093 0.0107 0.2713 0.3969 0.487 0.0724 0.3816 0.1301 0.3044 0. 0.4161 0.0456 0.1419 0.1531] 2022-08-23 00:59:50 [INFO] [EVAL] The model with the best validation mIoU (0.3413) was saved at iter 58000. 2022-08-23 01:00:01 [INFO] [TRAIN] epoch: 50, iter: 63050/160000, loss: 0.6530, lr: 0.000734, batch_cost: 0.2337, reader_cost: 0.00308, ips: 34.2252 samples/sec | ETA 06:17:41 2022-08-23 01:00:14 [INFO] [TRAIN] epoch: 50, iter: 63100/160000, loss: 0.6366, lr: 0.000734, batch_cost: 0.2544, reader_cost: 0.00124, ips: 31.4495 samples/sec | ETA 06:50:49 2022-08-23 01:00:25 [INFO] [TRAIN] epoch: 50, iter: 63150/160000, loss: 0.6332, lr: 0.000733, batch_cost: 0.2075, reader_cost: 0.00385, ips: 38.5518 samples/sec | ETA 05:34:57 2022-08-23 01:00:41 [INFO] [TRAIN] epoch: 51, iter: 63200/160000, loss: 0.5807, lr: 0.000733, batch_cost: 0.3239, reader_cost: 0.07461, ips: 24.7023 samples/sec | ETA 08:42:29 2022-08-23 01:00:52 [INFO] [TRAIN] epoch: 51, iter: 63250/160000, loss: 0.6172, lr: 0.000732, batch_cost: 0.2256, reader_cost: 0.00411, ips: 35.4591 samples/sec | ETA 06:03:47 2022-08-23 01:01:04 [INFO] [TRAIN] epoch: 51, iter: 63300/160000, loss: 0.6642, lr: 0.000732, batch_cost: 0.2326, reader_cost: 0.00052, ips: 34.3894 samples/sec | ETA 06:14:55 2022-08-23 01:01:16 [INFO] [TRAIN] epoch: 51, iter: 63350/160000, loss: 0.6221, lr: 0.000732, batch_cost: 0.2400, reader_cost: 0.00518, ips: 33.3268 samples/sec | ETA 06:26:40 2022-08-23 01:01:29 [INFO] [TRAIN] epoch: 51, iter: 63400/160000, loss: 0.6403, lr: 0.000731, batch_cost: 0.2717, reader_cost: 0.00802, ips: 29.4447 samples/sec | ETA 07:17:25 2022-08-23 01:01:42 [INFO] [TRAIN] epoch: 51, iter: 63450/160000, loss: 0.6020, lr: 0.000731, batch_cost: 0.2488, reader_cost: 0.00047, ips: 32.1512 samples/sec | ETA 06:40:24 2022-08-23 01:01:54 [INFO] [TRAIN] epoch: 51, iter: 63500/160000, loss: 0.6412, lr: 0.000731, batch_cost: 0.2473, reader_cost: 0.00165, ips: 32.3441 samples/sec | ETA 06:37:48 2022-08-23 01:02:06 [INFO] [TRAIN] epoch: 51, iter: 63550/160000, loss: 0.6102, lr: 0.000730, batch_cost: 0.2451, reader_cost: 0.00093, ips: 32.6460 samples/sec | ETA 06:33:55 2022-08-23 01:02:18 [INFO] [TRAIN] epoch: 51, iter: 63600/160000, loss: 0.6263, lr: 0.000730, batch_cost: 0.2423, reader_cost: 0.00082, ips: 33.0121 samples/sec | ETA 06:29:21 2022-08-23 01:02:32 [INFO] [TRAIN] epoch: 51, iter: 63650/160000, loss: 0.6105, lr: 0.000729, batch_cost: 0.2619, reader_cost: 0.00035, ips: 30.5404 samples/sec | ETA 07:00:38 2022-08-23 01:02:45 [INFO] [TRAIN] epoch: 
51, iter: 63700/160000, loss: 0.6671, lr: 0.000729, batch_cost: 0.2613, reader_cost: 0.00044, ips: 30.6169 samples/sec | ETA 06:59:22 2022-08-23 01:02:57 [INFO] [TRAIN] epoch: 51, iter: 63750/160000, loss: 0.6560, lr: 0.000729, batch_cost: 0.2477, reader_cost: 0.00170, ips: 32.3017 samples/sec | ETA 06:37:17 2022-08-23 01:03:10 [INFO] [TRAIN] epoch: 51, iter: 63800/160000, loss: 0.6884, lr: 0.000728, batch_cost: 0.2680, reader_cost: 0.00074, ips: 29.8486 samples/sec | ETA 07:09:43 2022-08-23 01:03:24 [INFO] [TRAIN] epoch: 51, iter: 63850/160000, loss: 0.6699, lr: 0.000728, batch_cost: 0.2652, reader_cost: 0.00069, ips: 30.1692 samples/sec | ETA 07:04:56 2022-08-23 01:03:36 [INFO] [TRAIN] epoch: 51, iter: 63900/160000, loss: 0.6235, lr: 0.000728, batch_cost: 0.2428, reader_cost: 0.00038, ips: 32.9462 samples/sec | ETA 06:28:54 2022-08-23 01:03:47 [INFO] [TRAIN] epoch: 51, iter: 63950/160000, loss: 0.5987, lr: 0.000727, batch_cost: 0.2178, reader_cost: 0.00033, ips: 36.7348 samples/sec | ETA 05:48:37 2022-08-23 01:03:56 [INFO] [TRAIN] epoch: 51, iter: 64000/160000, loss: 0.6650, lr: 0.000727, batch_cost: 0.1911, reader_cost: 0.00058, ips: 41.8554 samples/sec | ETA 05:05:48 2022-08-23 01:03:56 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 155s - batch_cost: 0.1551 - reader cost: 8.4304e-04 2022-08-23 01:06:32 [INFO] [EVAL] #Images: 2000 mIoU: 0.3293 Acc: 0.7541 Kappa: 0.7354 Dice: 0.4599 2022-08-23 01:06:32 [INFO] [EVAL] Class IoU: [0.6728 0.7663 0.9297 0.7072 0.6754 0.7453 0.7649 0.7781 0.5108 0.5908 0.4883 0.5412 0.6854 0.2981 0.2471 0.422 0.4588 0.4178 0.6001 0.3918 0.7412 0.4627 0.6026 0.4792 0.3167 0.3819 0.4295 0.4212 0.2957 0.2996 0.1981 0.4777 0.2931 0.3512 0.27 0.3465 0.4098 0.5384 0.2593 0.3718 0.2375 0.1002 0.3241 0.2316 0.3291 0.2181 0.3438 0.5012 0.4833 0.5473 0.5224 0.2363 0.1737 0.208 0.691 0.3635 0.8377 0.3289 0.4368 0.2673 0.1469 0.218 0.3203 0.2826 0.4203 0.649 0.2809 0.3935 0.0397 0.2812 0.46 0.5125 0.4093 0.2527 0.4272 0.3055 0.369 0.3368 0.2515 0.1732 0.6082 0.3712 0.3613 0.0135 0.2835 0.5514 0.0786 0.0734 0.267 0.4833 0.3529 0.0173 0.2273 0.1002 0.0005 0.0446 0.0313 0.1364 0.3038 0.3015 0.0795 0.0193 0.2371 0.0647 0.1196 0.3325 0.0504 0.5091 0.0701 0.188 0.0925 0.4463 0.1334 0.5524 0.8314 0.007 0.2311 0.5891 0.1117 0.3399 0.4095 0.0214 0.2373 0.1782 0.2746 0.2232 0.3831 0.3855 0.1639 0.2699 0.5678 0.0165 0.354 0.2095 0.1768 0.1221 0.1215 0.0302 0.1592 0.338 0.2245 0.0036 0.3756 0.1204 0.254 0. 
0.3335 0.0403 0.0943 0.1375] 2022-08-23 01:06:32 [INFO] [EVAL] Class Precision: [0.7723 0.8638 0.9619 0.7879 0.7608 0.874 0.8573 0.8388 0.6854 0.7836 0.6879 0.7195 0.7722 0.4306 0.5326 0.6004 0.6695 0.7178 0.7444 0.634 0.8289 0.6666 0.7532 0.5825 0.5302 0.4739 0.5416 0.7354 0.7199 0.4047 0.4395 0.6112 0.4718 0.4638 0.4319 0.4952 0.5868 0.698 0.4536 0.5768 0.4179 0.2555 0.5262 0.5939 0.5309 0.4342 0.6195 0.7218 0.577 0.6672 0.7337 0.2763 0.365 0.4888 0.7057 0.4902 0.8805 0.721 0.6177 0.4134 0.2322 0.3226 0.4919 0.5756 0.5168 0.7376 0.3912 0.5416 0.089 0.4621 0.7245 0.6357 0.6404 0.3138 0.6116 0.4673 0.5718 0.7046 0.5173 0.3412 0.7006 0.5735 0.7807 0.0481 0.5068 0.6711 0.503 0.3543 0.459 0.7622 0.4668 0.0204 0.4001 0.3195 0.002 0.1216 0.2235 0.2354 0.454 0.652 0.3854 0.0254 0.6198 0.4173 0.9062 0.4022 0.139 0.7694 0.1816 0.3116 0.2538 0.5829 0.5089 0.8264 0.841 0.0813 0.7542 0.6443 0.1566 0.6633 0.6412 0.2635 0.4463 0.5833 0.6023 0.6507 0.8205 0.5017 0.6401 0.6239 0.6435 0.1742 0.58 0.6184 0.5169 0.3236 0.3442 0.0473 0.4503 0.6956 0.2365 0.0051 0.5203 0.5454 0.5306 0. 0.889 0.2823 0.6121 0.7808] 2022-08-23 01:06:32 [INFO] [EVAL] Class Recall: [0.8393 0.8716 0.9653 0.8736 0.8576 0.835 0.8766 0.9149 0.6672 0.7061 0.6273 0.6859 0.8591 0.4921 0.3155 0.5868 0.5931 0.4998 0.7559 0.5063 0.8751 0.6021 0.751 0.7301 0.4402 0.663 0.6749 0.4964 0.3342 0.5359 0.2651 0.6862 0.4362 0.5912 0.4188 0.5357 0.576 0.702 0.3771 0.5113 0.355 0.1414 0.4576 0.2752 0.464 0.3047 0.4359 0.6213 0.7485 0.7528 0.6445 0.6199 0.2489 0.2658 0.9709 0.5846 0.9451 0.3769 0.5986 0.4308 0.2858 0.4019 0.4788 0.357 0.6924 0.8438 0.4989 0.59 0.0669 0.418 0.5575 0.7256 0.5314 0.5645 0.5863 0.4689 0.5098 0.3922 0.3287 0.2603 0.8217 0.5126 0.4022 0.0184 0.3915 0.7556 0.0852 0.0847 0.3895 0.5691 0.5912 0.1016 0.3447 0.1275 0.0006 0.0658 0.0351 0.2448 0.4786 0.3593 0.0911 0.0736 0.2774 0.0711 0.1211 0.6573 0.0732 0.6008 0.1024 0.3216 0.127 0.6557 0.1531 0.6248 0.9865 0.0076 0.25 0.8732 0.2807 0.4108 0.5313 0.0227 0.3364 0.2042 0.3355 0.2535 0.4182 0.6247 0.1806 0.3224 0.8285 0.0178 0.476 0.2406 0.2119 0.1639 0.1581 0.0767 0.1976 0.3967 0.816 0.0118 0.5746 0.1338 0.3277 0. 0.348 0.0449 0.1003 0.143 ] 2022-08-23 01:06:32 [INFO] [EVAL] The model with the best validation mIoU (0.3413) was saved at iter 58000. 
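Dice and IoU are two views of the same overlap counts: per class, Dice = 2*TP/(2*TP+FP+FN) = 2*IoU/(1+IoU), and because x -> 2x/(1+x) is at least x on [0, 1], the mean Dice (0.4599 here) always sits above the mIoU (0.3293). A small sketch of that conversion, assuming the reported Dice is an unweighted mean of per-class Dice scores (the evaluator's exact aggregation is not visible in this log):

def dice_from_iou(iou):
    # Per class: Dice = 2*TP / (2*TP + FP + FN) = 2*IoU / (1 + IoU)
    return 2.0 * iou / (1.0 + iou)

# First few class IoUs from the evaluation above -> their implied per-class Dice scores
for iou in (0.6728, 0.7663, 0.9297):
    print(f"IoU {iou:.4f} -> Dice {dice_from_iou(iou):.4f}")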
2022-08-23 01:06:42 [INFO] [TRAIN] epoch: 51, iter: 64050/160000, loss: 0.6496, lr: 0.000726, batch_cost: 0.2135, reader_cost: 0.00361, ips: 37.4638 samples/sec | ETA 05:41:29 2022-08-23 01:06:53 [INFO] [TRAIN] epoch: 51, iter: 64100/160000, loss: 0.6534, lr: 0.000726, batch_cost: 0.2177, reader_cost: 0.00045, ips: 36.7507 samples/sec | ETA 05:47:55 2022-08-23 01:07:04 [INFO] [TRAIN] epoch: 51, iter: 64150/160000, loss: 0.6533, lr: 0.000726, batch_cost: 0.2098, reader_cost: 0.00054, ips: 38.1344 samples/sec | ETA 05:35:07 2022-08-23 01:07:15 [INFO] [TRAIN] epoch: 51, iter: 64200/160000, loss: 0.6067, lr: 0.000725, batch_cost: 0.2183, reader_cost: 0.00072, ips: 36.6416 samples/sec | ETA 05:48:36 2022-08-23 01:07:27 [INFO] [TRAIN] epoch: 51, iter: 64250/160000, loss: 0.6208, lr: 0.000725, batch_cost: 0.2393, reader_cost: 0.00056, ips: 33.4345 samples/sec | ETA 06:21:50 2022-08-23 01:07:37 [INFO] [TRAIN] epoch: 51, iter: 64300/160000, loss: 0.6473, lr: 0.000725, batch_cost: 0.2089, reader_cost: 0.00099, ips: 38.2873 samples/sec | ETA 05:33:16 2022-08-23 01:07:48 [INFO] [TRAIN] epoch: 51, iter: 64350/160000, loss: 0.6813, lr: 0.000724, batch_cost: 0.2213, reader_cost: 0.00052, ips: 36.1491 samples/sec | ETA 05:52:47 2022-08-23 01:08:00 [INFO] [TRAIN] epoch: 51, iter: 64400/160000, loss: 0.6343, lr: 0.000724, batch_cost: 0.2278, reader_cost: 0.00065, ips: 35.1240 samples/sec | ETA 06:02:54 2022-08-23 01:08:14 [INFO] [TRAIN] epoch: 52, iter: 64450/160000, loss: 0.6565, lr: 0.000723, batch_cost: 0.2975, reader_cost: 0.09438, ips: 26.8940 samples/sec | ETA 07:53:42 2022-08-23 01:08:26 [INFO] [TRAIN] epoch: 52, iter: 64500/160000, loss: 0.6160, lr: 0.000723, batch_cost: 0.2237, reader_cost: 0.00039, ips: 35.7679 samples/sec | ETA 05:55:59 2022-08-23 01:08:36 [INFO] [TRAIN] epoch: 52, iter: 64550/160000, loss: 0.6540, lr: 0.000723, batch_cost: 0.2180, reader_cost: 0.00050, ips: 36.6953 samples/sec | ETA 05:46:49 2022-08-23 01:08:46 [INFO] [TRAIN] epoch: 52, iter: 64600/160000, loss: 0.6070, lr: 0.000722, batch_cost: 0.1969, reader_cost: 0.00047, ips: 40.6285 samples/sec | ETA 05:13:04 2022-08-23 01:08:57 [INFO] [TRAIN] epoch: 52, iter: 64650/160000, loss: 0.6892, lr: 0.000722, batch_cost: 0.2196, reader_cost: 0.00157, ips: 36.4251 samples/sec | ETA 05:49:01 2022-08-23 01:09:08 [INFO] [TRAIN] epoch: 52, iter: 64700/160000, loss: 0.6407, lr: 0.000722, batch_cost: 0.2158, reader_cost: 0.00050, ips: 37.0634 samples/sec | ETA 05:42:50 2022-08-23 01:09:20 [INFO] [TRAIN] epoch: 52, iter: 64750/160000, loss: 0.5919, lr: 0.000721, batch_cost: 0.2420, reader_cost: 0.00040, ips: 33.0601 samples/sec | ETA 06:24:08 2022-08-23 01:09:31 [INFO] [TRAIN] epoch: 52, iter: 64800/160000, loss: 0.6135, lr: 0.000721, batch_cost: 0.2242, reader_cost: 0.00041, ips: 35.6878 samples/sec | ETA 05:55:40 2022-08-23 01:09:42 [INFO] [TRAIN] epoch: 52, iter: 64850/160000, loss: 0.6815, lr: 0.000720, batch_cost: 0.2146, reader_cost: 0.00072, ips: 37.2701 samples/sec | ETA 05:40:23 2022-08-23 01:09:53 [INFO] [TRAIN] epoch: 52, iter: 64900/160000, loss: 0.6114, lr: 0.000720, batch_cost: 0.2143, reader_cost: 0.00066, ips: 37.3364 samples/sec | ETA 05:39:36 2022-08-23 01:10:05 [INFO] [TRAIN] epoch: 52, iter: 64950/160000, loss: 0.6246, lr: 0.000720, batch_cost: 0.2420, reader_cost: 0.00049, ips: 33.0633 samples/sec | ETA 06:23:18 2022-08-23 01:10:18 [INFO] [TRAIN] epoch: 52, iter: 65000/160000, loss: 0.6353, lr: 0.000719, batch_cost: 0.2632, reader_cost: 0.00458, ips: 30.3965 samples/sec | ETA 06:56:42 2022-08-23 01:10:18 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 182s - batch_cost: 0.1822 - reader cost: 0.0012 2022-08-23 01:13:21 [INFO] [EVAL] #Images: 2000 mIoU: 0.3327 Acc: 0.7558 Kappa: 0.7375 Dice: 0.4644 2022-08-23 01:13:21 [INFO] [EVAL] Class IoU: [0.675 0.7626 0.9299 0.717 0.6791 0.7491 0.7752 0.7746 0.5071 0.6015 0.4883 0.5454 0.6768 0.2688 0.2833 0.4246 0.4854 0.4378 0.5925 0.3776 0.744 0.4933 0.6162 0.4785 0.3048 0.3906 0.4465 0.4424 0.377 0.2424 0.2112 0.4516 0.2586 0.3597 0.2569 0.3529 0.41 0.5355 0.2748 0.331 0.2074 0.0879 0.3407 0.2466 0.3067 0.24 0.3139 0.4979 0.4778 0.5045 0.5256 0.3038 0.1671 0.2045 0.7098 0.4486 0.8384 0.3741 0.458 0.2787 0.1866 0.223 0.3073 0.1765 0.4242 0.6651 0.2509 0.4021 0.1126 0.3519 0.3996 0.5427 0.4271 0.2128 0.4361 0.288 0.412 0.2681 0.2078 0.1548 0.5107 0.3602 0.3607 0.0421 0.3393 0.5436 0.0844 0.075 0.2382 0.47 0.3526 0.0128 0.2175 0.0745 0.0164 0.0318 0.0322 0.1254 0.259 0.4207 0.0837 0.015 0.2477 0.231 0.0896 0.3642 0.0392 0.4901 0.0526 0.1176 0.0743 0.4961 0.1382 0.4764 0.7179 0.0025 0.261 0.5807 0.1238 0.1735 0.3366 0.0204 0.2742 0.134 0.2751 0.2662 0.4155 0.3921 0.4127 0.2906 0.5733 0.0194 0.3588 0.2027 0.1641 0.13 0.0965 0.0129 0.1852 0.3435 0.3924 0.0282 0.2644 0.3705 0.2006 0. 0.4177 0.0297 0.1174 0.1313] 2022-08-23 01:13:21 [INFO] [EVAL] Class Precision: [0.7912 0.8442 0.9675 0.8097 0.7676 0.8623 0.8714 0.864 0.6154 0.803 0.6812 0.6934 0.7471 0.4772 0.4941 0.5876 0.6483 0.6226 0.7184 0.6415 0.8395 0.739 0.7539 0.5681 0.4626 0.4917 0.5741 0.7305 0.639 0.3263 0.4224 0.5856 0.53 0.4646 0.5476 0.4778 0.6353 0.6804 0.5047 0.6548 0.3436 0.2549 0.5607 0.5331 0.4931 0.4146 0.5118 0.6665 0.6217 0.5996 0.765 0.4037 0.3208 0.424 0.7306 0.581 0.8732 0.6527 0.6455 0.4523 0.2402 0.364 0.4033 0.5978 0.515 0.7712 0.4402 0.5253 0.2298 0.5791 0.6962 0.7002 0.5909 0.4466 0.6377 0.3975 0.5061 0.6325 0.4739 0.2956 0.5904 0.6553 0.75 0.0725 0.4347 0.7245 0.4733 0.309 0.3671 0.6281 0.4629 0.0187 0.3564 0.2786 0.0287 0.0848 0.1812 0.2494 0.6321 0.5768 0.3589 0.023 0.5165 0.7317 0.6915 0.4474 0.0919 0.7482 0.1193 0.2449 0.1914 0.6069 0.4094 0.5984 0.7219 0.0589 0.7503 0.6291 0.1733 0.4748 0.6056 0.233 0.7098 0.6844 0.678 0.5822 0.8475 0.5293 0.8698 0.4778 0.6573 0.1292 0.7365 0.679 0.447 0.3657 0.3422 0.0673 0.5175 0.6943 0.4427 0.0672 0.7237 0.521 0.5137 0. 0.7881 0.3604 0.4949 0.7227] 2022-08-23 01:13:21 [INFO] [EVAL] Class Recall: [0.8213 0.8875 0.9598 0.8623 0.8549 0.8508 0.8753 0.8822 0.7422 0.7056 0.6329 0.7188 0.8779 0.3811 0.3991 0.6048 0.6589 0.5959 0.7717 0.4786 0.8674 0.5974 0.7714 0.7521 0.4719 0.6551 0.6677 0.5286 0.4791 0.4851 0.2969 0.6636 0.3356 0.6146 0.3261 0.5744 0.5361 0.7154 0.3763 0.401 0.3435 0.1182 0.4648 0.3145 0.4479 0.363 0.4481 0.663 0.6737 0.7607 0.6269 0.5513 0.2585 0.2832 0.9614 0.6632 0.9546 0.4671 0.612 0.4208 0.4558 0.3654 0.5635 0.2003 0.7064 0.8286 0.3685 0.6316 0.1809 0.4728 0.484 0.707 0.6064 0.289 0.5797 0.511 0.6892 0.3175 0.2701 0.2452 0.7909 0.4444 0.41 0.0914 0.6071 0.6852 0.0932 0.0901 0.4041 0.6512 0.5968 0.0395 0.3583 0.0923 0.0369 0.0484 0.0376 0.2015 0.305 0.6085 0.0984 0.041 0.3225 0.2523 0.0934 0.662 0.064 0.5869 0.086 0.1844 0.1084 0.7309 0.1726 0.7002 0.9923 0.0026 0.2858 0.883 0.3024 0.2146 0.4312 0.0218 0.3089 0.1428 0.3165 0.329 0.4491 0.602 0.4399 0.4258 0.8176 0.0223 0.4117 0.2241 0.2059 0.1679 0.1185 0.0158 0.2238 0.4048 0.7756 0.0464 0.2941 0.5618 0.2477 0. 0.4705 0.0314 0.1334 0.1383] 2022-08-23 01:13:21 [INFO] [EVAL] The model with the best validation mIoU (0.3413) was saved at iter 58000. 
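Acc and Kappa summarize the same pixel-level confusion matrix that the per-class arrays are derived from: Acc is the fraction of correctly labelled pixels, and Kappa is Cohen's kappa, which discounts the agreement expected by chance from the class frequencies. A minimal sketch of the usual definitions (the evaluator's exact implementation is not shown in this log), using a toy 3x3 confusion matrix in place of the real 150x150 one:

import numpy as np

def acc_and_kappa(cm):
    """cm[i, j] = number of pixels of true class i predicted as class j."""
    total = cm.sum()
    po = np.trace(cm) / total                      # observed agreement = pixel accuracy
    pe = (cm.sum(0) * cm.sum(1)).sum() / total**2  # agreement expected by chance
    return po, (po - pe) / (1.0 - pe)              # Cohen's kappa

cm = np.array([[50,  2,  3],
               [ 4, 40,  6],
               [ 5,  1, 39]], dtype=float)
acc, kappa = acc_and_kappa(cm)
print(f"Acc: {acc:.4f} Kappa: {kappa:.4f}")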
2022-08-23 01:13:35 [INFO] [TRAIN] epoch: 52, iter: 65050/160000, loss: 0.6186, lr: 0.000719, batch_cost: 0.2823, reader_cost: 0.00237, ips: 28.3427 samples/sec | ETA 07:26:40 2022-08-23 01:13:49 [INFO] [TRAIN] epoch: 52, iter: 65100/160000, loss: 0.6361, lr: 0.000718, batch_cost: 0.2807, reader_cost: 0.00830, ips: 28.4954 samples/sec | ETA 07:24:02 2022-08-23 01:14:02 [INFO] [TRAIN] epoch: 52, iter: 65150/160000, loss: 0.6333, lr: 0.000718, batch_cost: 0.2536, reader_cost: 0.00685, ips: 31.5437 samples/sec | ETA 06:40:55 2022-08-23 01:14:14 [INFO] [TRAIN] epoch: 52, iter: 65200/160000, loss: 0.6345, lr: 0.000718, batch_cost: 0.2493, reader_cost: 0.01332, ips: 32.0933 samples/sec | ETA 06:33:51 2022-08-23 01:14:27 [INFO] [TRAIN] epoch: 52, iter: 65250/160000, loss: 0.6746, lr: 0.000717, batch_cost: 0.2687, reader_cost: 0.00990, ips: 29.7720 samples/sec | ETA 07:04:20 2022-08-23 01:14:42 [INFO] [TRAIN] epoch: 52, iter: 65300/160000, loss: 0.6167, lr: 0.000717, batch_cost: 0.2852, reader_cost: 0.00363, ips: 28.0507 samples/sec | ETA 07:30:08 2022-08-23 01:14:55 [INFO] [TRAIN] epoch: 52, iter: 65350/160000, loss: 0.6571, lr: 0.000717, batch_cost: 0.2695, reader_cost: 0.00075, ips: 29.6793 samples/sec | ETA 07:05:12 2022-08-23 01:15:06 [INFO] [TRAIN] epoch: 52, iter: 65400/160000, loss: 0.6601, lr: 0.000716, batch_cost: 0.2149, reader_cost: 0.00416, ips: 37.2286 samples/sec | ETA 05:38:48 2022-08-23 01:15:18 [INFO] [TRAIN] epoch: 52, iter: 65450/160000, loss: 0.6103, lr: 0.000716, batch_cost: 0.2398, reader_cost: 0.00042, ips: 33.3672 samples/sec | ETA 06:17:48 2022-08-23 01:15:30 [INFO] [TRAIN] epoch: 52, iter: 65500/160000, loss: 0.6370, lr: 0.000715, batch_cost: 0.2425, reader_cost: 0.00050, ips: 32.9851 samples/sec | ETA 06:21:59 2022-08-23 01:15:40 [INFO] [TRAIN] epoch: 52, iter: 65550/160000, loss: 0.6496, lr: 0.000715, batch_cost: 0.2088, reader_cost: 0.00680, ips: 38.3129 samples/sec | ETA 05:28:41 2022-08-23 01:15:51 [INFO] [TRAIN] epoch: 52, iter: 65600/160000, loss: 0.6149, lr: 0.000715, batch_cost: 0.2065, reader_cost: 0.01223, ips: 38.7480 samples/sec | ETA 05:24:50 2022-08-23 01:16:05 [INFO] [TRAIN] epoch: 52, iter: 65650/160000, loss: 0.6642, lr: 0.000714, batch_cost: 0.2764, reader_cost: 0.00111, ips: 28.9449 samples/sec | ETA 07:14:37 2022-08-23 01:16:19 [INFO] [TRAIN] epoch: 53, iter: 65700/160000, loss: 0.6649, lr: 0.000714, batch_cost: 0.2913, reader_cost: 0.05568, ips: 27.4633 samples/sec | ETA 07:37:49 2022-08-23 01:16:30 [INFO] [TRAIN] epoch: 53, iter: 65750/160000, loss: 0.6233, lr: 0.000714, batch_cost: 0.2083, reader_cost: 0.00049, ips: 38.4053 samples/sec | ETA 05:27:12 2022-08-23 01:16:41 [INFO] [TRAIN] epoch: 53, iter: 65800/160000, loss: 0.6286, lr: 0.000713, batch_cost: 0.2186, reader_cost: 0.00055, ips: 36.5969 samples/sec | ETA 05:43:11 2022-08-23 01:16:53 [INFO] [TRAIN] epoch: 53, iter: 65850/160000, loss: 0.6499, lr: 0.000713, batch_cost: 0.2466, reader_cost: 0.00054, ips: 32.4452 samples/sec | ETA 06:26:54 2022-08-23 01:17:03 [INFO] [TRAIN] epoch: 53, iter: 65900/160000, loss: 0.6477, lr: 0.000712, batch_cost: 0.2061, reader_cost: 0.00136, ips: 38.8113 samples/sec | ETA 05:23:16 2022-08-23 01:17:15 [INFO] [TRAIN] epoch: 53, iter: 65950/160000, loss: 0.5997, lr: 0.000712, batch_cost: 0.2278, reader_cost: 0.00372, ips: 35.1174 samples/sec | ETA 05:57:05 2022-08-23 01:17:24 [INFO] [TRAIN] epoch: 53, iter: 66000/160000, loss: 0.6281, lr: 0.000712, batch_cost: 0.1821, reader_cost: 0.00443, ips: 43.9294 samples/sec | ETA 04:45:18 2022-08-23 01:17:24 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 190s - batch_cost: 0.1895 - reader cost: 6.6697e-04 2022-08-23 01:20:33 [INFO] [EVAL] #Images: 2000 mIoU: 0.3317 Acc: 0.7578 Kappa: 0.7393 Dice: 0.4617 2022-08-23 01:20:33 [INFO] [EVAL] Class IoU: [0.6764 0.7709 0.9312 0.7233 0.6787 0.7395 0.7766 0.7787 0.5135 0.6036 0.4729 0.5539 0.6958 0.2961 0.3 0.419 0.5115 0.4299 0.5943 0.3974 0.7345 0.4974 0.6134 0.4787 0.3145 0.3265 0.4876 0.4397 0.4024 0.2204 0.2285 0.4597 0.2575 0.3576 0.3044 0.3714 0.4151 0.5439 0.2768 0.3448 0.2296 0.091 0.3422 0.2566 0.2931 0.2611 0.3135 0.4874 0.5354 0.5194 0.5105 0.3324 0.183 0.2413 0.6877 0.4231 0.8545 0.3695 0.4869 0.2489 0.1144 0.2317 0.3105 0.1497 0.4199 0.6499 0.2354 0.4148 0.0983 0.3382 0.4744 0.4789 0.4203 0.2116 0.4197 0.3227 0.3493 0.249 0.2164 0.2708 0.5419 0.3603 0.3083 0.0297 0.115 0.5449 0.0779 0.0596 0.2698 0.4964 0.3148 0.0663 0.2397 0.0397 0.036 0.0219 0.0296 0.0983 0.2853 0.4292 0.0269 0.0353 0.2379 0.2293 0.077 0.3437 0.065 0.4459 0.1048 0.0753 0.1364 0.4115 0.1524 0.4663 0.8259 0.0064 0.3987 0.5961 0.0904 0.3375 0.4081 0.0426 0.2288 0.0979 0.2402 0.2543 0.4359 0.4349 0. 0.2817 0.5999 0.0115 0.2953 0.2091 0.1279 0.1286 0.1267 0.0028 0.195 0.3556 0.3919 0.0193 0.2985 0.2878 0.2313 0. 0.3748 0.0318 0.1145 0.1247] 2022-08-23 01:20:33 [INFO] [EVAL] Class Precision: [0.7756 0.8451 0.9636 0.8239 0.7756 0.8926 0.8865 0.8489 0.6772 0.7492 0.6806 0.6997 0.7779 0.4332 0.5361 0.5766 0.723 0.7041 0.7144 0.6197 0.8364 0.673 0.752 0.6048 0.4913 0.5002 0.6441 0.7357 0.6497 0.3436 0.4908 0.6203 0.4593 0.4647 0.4556 0.4817 0.5995 0.7514 0.5029 0.6023 0.3008 0.2675 0.6019 0.4857 0.4354 0.4854 0.4977 0.6596 0.6479 0.6164 0.7298 0.4223 0.3586 0.5484 0.711 0.5227 0.8964 0.685 0.6915 0.3935 0.1562 0.387 0.5207 0.742 0.5063 0.7743 0.4528 0.5696 0.2126 0.5461 0.66 0.5853 0.6374 0.2522 0.693 0.4984 0.3877 0.547 0.3144 0.5418 0.6061 0.6203 0.8023 0.0622 0.3467 0.7055 0.384 0.3855 0.5695 0.7191 0.3775 0.0953 0.418 0.2973 0.0689 0.0658 0.3503 0.2472 0.5577 0.5894 0.1938 0.0552 0.6351 0.805 0.5551 0.4074 0.1564 0.8406 0.1609 0.178 0.2712 0.9112 0.391 0.7334 0.8392 0.2246 0.649 0.6968 0.1187 0.5832 0.5791 0.3501 0.7721 0.6552 0.698 0.6178 0.795 0.7096 0. 0.5601 0.6952 0.162 0.4934 0.6794 0.4345 0.3085 0.2448 0.0154 0.5348 0.6472 0.4438 0.0346 0.6323 0.594 0.6729 0. 0.8648 0.2551 0.5109 0.7462] 2022-08-23 01:20:33 [INFO] [EVAL] Class Recall: [0.841 0.8978 0.9651 0.8556 0.8445 0.8117 0.8623 0.904 0.6799 0.7564 0.6078 0.7266 0.8683 0.4835 0.4051 0.6051 0.6361 0.5247 0.7794 0.5257 0.8577 0.6558 0.769 0.6965 0.4664 0.4846 0.6674 0.5221 0.5139 0.3807 0.2994 0.6398 0.3694 0.6081 0.4783 0.6186 0.5744 0.6632 0.381 0.4464 0.4923 0.1211 0.4424 0.3523 0.4728 0.361 0.4586 0.6512 0.7552 0.7675 0.6294 0.6097 0.272 0.3012 0.9545 0.6896 0.9481 0.4452 0.6221 0.4038 0.2998 0.3659 0.4347 0.1579 0.7111 0.8018 0.329 0.6041 0.1545 0.4704 0.6279 0.7247 0.5524 0.5678 0.5156 0.4779 0.7791 0.3137 0.4097 0.3513 0.8363 0.4623 0.3337 0.0538 0.1469 0.7053 0.0891 0.0659 0.339 0.6157 0.6548 0.179 0.3598 0.0438 0.0702 0.0317 0.0313 0.1402 0.3688 0.6123 0.0303 0.0893 0.2755 0.2427 0.0821 0.6873 0.1 0.4871 0.231 0.1154 0.2152 0.4287 0.1998 0.5615 0.9811 0.0065 0.5083 0.8049 0.2751 0.4448 0.5802 0.0462 0.2453 0.1032 0.268 0.3017 0.4911 0.5291 0. 0.3617 0.814 0.0122 0.4238 0.2319 0.1534 0.1808 0.2081 0.0034 0.2348 0.4411 0.77 0.042 0.3612 0.3583 0.2606 0. 0.3981 0.035 0.1286 0.1302] 2022-08-23 01:20:34 [INFO] [EVAL] The model with the best validation mIoU (0.3413) was saved at iter 58000. 
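The batch_cost, ips, and ETA fields in the [TRAIN] lines are consistent with simple arithmetic: ips ≈ batch_size / batch_cost per process (batch_size is 8 in the config dump), and ETA ≈ (total iters − current iter) × batch_cost. A quick check against the iter-65050 line above; this is illustrative arithmetic, not the trainer's internal bookkeeping.

```python
# Rough consistency check of the [TRAIN] log fields (illustrative, not trainer code).
# Values are taken from the iter-65050 line above; batch_size=8 comes from the config dump.
batch_size = 8
iter_now, total_iters = 65050, 160000
ips = 28.3427                       # samples/sec as logged
batch_cost = batch_size / ips       # ~0.2822 s, matching the logged batch_cost of 0.2823

eta_seconds = (total_iters - iter_now) * batch_cost
h, rem = divmod(int(eta_seconds), 3600)
m, s = divmod(rem, 60)
print(f"ETA {h:02d}:{m:02d}:{s:02d}")   # -> ETA 07:26:40, matching the log
```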
2022-08-23 01:20:47 [INFO] [TRAIN] epoch: 53, iter: 66050/160000, loss: 0.5982, lr: 0.000711, batch_cost: 0.2673, reader_cost: 0.00453, ips: 29.9343 samples/sec | ETA 06:58:28 2022-08-23 01:20:59 [INFO] [TRAIN] epoch: 53, iter: 66100/160000, loss: 0.5894, lr: 0.000711, batch_cost: 0.2353, reader_cost: 0.00178, ips: 34.0012 samples/sec | ETA 06:08:13 2022-08-23 01:21:12 [INFO] [TRAIN] epoch: 53, iter: 66150/160000, loss: 0.6748, lr: 0.000711, batch_cost: 0.2569, reader_cost: 0.00287, ips: 31.1353 samples/sec | ETA 06:41:54 2022-08-23 01:21:24 [INFO] [TRAIN] epoch: 53, iter: 66200/160000, loss: 0.6726, lr: 0.000710, batch_cost: 0.2566, reader_cost: 0.00089, ips: 31.1824 samples/sec | ETA 06:41:04 2022-08-23 01:21:36 [INFO] [TRAIN] epoch: 53, iter: 66250/160000, loss: 0.6300, lr: 0.000710, batch_cost: 0.2403, reader_cost: 0.00128, ips: 33.2957 samples/sec | ETA 06:15:25 2022-08-23 01:21:50 [INFO] [TRAIN] epoch: 53, iter: 66300/160000, loss: 0.6342, lr: 0.000709, batch_cost: 0.2643, reader_cost: 0.00775, ips: 30.2696 samples/sec | ETA 06:52:44 2022-08-23 01:22:03 [INFO] [TRAIN] epoch: 53, iter: 66350/160000, loss: 0.5739, lr: 0.000709, batch_cost: 0.2577, reader_cost: 0.00905, ips: 31.0497 samples/sec | ETA 06:42:09 2022-08-23 01:22:15 [INFO] [TRAIN] epoch: 53, iter: 66400/160000, loss: 0.5985, lr: 0.000709, batch_cost: 0.2437, reader_cost: 0.00142, ips: 32.8222 samples/sec | ETA 06:20:13 2022-08-23 01:22:29 [INFO] [TRAIN] epoch: 53, iter: 66450/160000, loss: 0.6149, lr: 0.000708, batch_cost: 0.2814, reader_cost: 0.00042, ips: 28.4246 samples/sec | ETA 07:18:49 2022-08-23 01:22:40 [INFO] [TRAIN] epoch: 53, iter: 66500/160000, loss: 0.6470, lr: 0.000708, batch_cost: 0.2182, reader_cost: 0.00477, ips: 36.6652 samples/sec | ETA 05:40:00 2022-08-23 01:22:52 [INFO] [TRAIN] epoch: 53, iter: 66550/160000, loss: 0.6545, lr: 0.000708, batch_cost: 0.2401, reader_cost: 0.00104, ips: 33.3224 samples/sec | ETA 06:13:55 2022-08-23 01:23:03 [INFO] [TRAIN] epoch: 53, iter: 66600/160000, loss: 0.6552, lr: 0.000707, batch_cost: 0.2318, reader_cost: 0.00050, ips: 34.5149 samples/sec | ETA 06:00:48 2022-08-23 01:23:14 [INFO] [TRAIN] epoch: 53, iter: 66650/160000, loss: 0.6221, lr: 0.000707, batch_cost: 0.2214, reader_cost: 0.00063, ips: 36.1292 samples/sec | ETA 05:44:30 2022-08-23 01:23:27 [INFO] [TRAIN] epoch: 53, iter: 66700/160000, loss: 0.6825, lr: 0.000706, batch_cost: 0.2486, reader_cost: 0.00055, ips: 32.1863 samples/sec | ETA 06:26:29 2022-08-23 01:23:39 [INFO] [TRAIN] epoch: 53, iter: 66750/160000, loss: 0.6299, lr: 0.000706, batch_cost: 0.2462, reader_cost: 0.00084, ips: 32.4883 samples/sec | ETA 06:22:42 2022-08-23 01:23:52 [INFO] [TRAIN] epoch: 53, iter: 66800/160000, loss: 0.6029, lr: 0.000706, batch_cost: 0.2493, reader_cost: 0.00064, ips: 32.0905 samples/sec | ETA 06:27:14 2022-08-23 01:24:04 [INFO] [TRAIN] epoch: 53, iter: 66850/160000, loss: 0.5725, lr: 0.000705, batch_cost: 0.2407, reader_cost: 0.00106, ips: 33.2296 samples/sec | ETA 06:13:45 2022-08-23 01:24:15 [INFO] [TRAIN] epoch: 53, iter: 66900/160000, loss: 0.6523, lr: 0.000705, batch_cost: 0.2215, reader_cost: 0.00089, ips: 36.1161 samples/sec | ETA 05:43:42 2022-08-23 01:24:30 [INFO] [TRAIN] epoch: 54, iter: 66950/160000, loss: 0.6398, lr: 0.000704, batch_cost: 0.2960, reader_cost: 0.06743, ips: 27.0227 samples/sec | ETA 07:39:07 2022-08-23 01:24:39 [INFO] [TRAIN] epoch: 54, iter: 67000/160000, loss: 0.6075, lr: 0.000704, batch_cost: 0.1982, reader_cost: 0.00066, ips: 40.3607 samples/sec | ETA 05:07:13 2022-08-23 01:24:39 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 195s - batch_cost: 0.1946 - reader cost: 6.2705e-04 2022-08-23 01:27:54 [INFO] [EVAL] #Images: 2000 mIoU: 0.3307 Acc: 0.7591 Kappa: 0.7406 Dice: 0.4603 2022-08-23 01:27:54 [INFO] [EVAL] Class IoU: [0.677 0.7705 0.9297 0.725 0.6795 0.7443 0.774 0.7731 0.5128 0.6191 0.4806 0.5545 0.6938 0.297 0.3059 0.4162 0.5191 0.4421 0.5923 0.409 0.7347 0.4932 0.6211 0.4691 0.3011 0.282 0.4054 0.4184 0.3693 0.2527 0.2194 0.4924 0.2498 0.3193 0.2828 0.3483 0.4233 0.5549 0.2833 0.3372 0.2473 0.1084 0.3337 0.2369 0.3088 0.2311 0.2898 0.4964 0.569 0.493 0.5194 0.3669 0.2088 0.2032 0.6323 0.3808 0.8539 0.4138 0.4596 0.2762 0.0579 0.2389 0.2967 0.1807 0.4276 0.6454 0.2426 0.4027 0.1242 0.3501 0.3995 0.5323 0.3798 0.2264 0.394 0.3237 0.3644 0.2791 0.2383 0.1498 0.6806 0.3857 0.4071 0.026 0.2783 0.535 0.0866 0.0753 0.2398 0.4775 0.3496 0.0346 0.2083 0.0991 0.0058 0.0286 0.023 0.119 0.2631 0.3968 0.062 0.0366 0.1906 0.1555 0.0032 0.3911 0.074 0.5024 0.085 0.1639 0.1178 0.3504 0.1376 0.5179 0.8622 0.009 0.4141 0.6585 0.1294 0.1507 0.3195 0.0075 0.201 0.1113 0.3898 0.2262 0.366 0.41 0.0307 0.3188 0.5863 0.0172 0.2213 0.2594 0.1607 0.1327 0.1329 0.0107 0.1668 0.3722 0.2992 0.0341 0.3034 0.2501 0.2845 0. 0.3277 0.0222 0.1088 0.1433] 2022-08-23 01:27:54 [INFO] [EVAL] Class Precision: [0.7804 0.8415 0.9592 0.8221 0.7712 0.8726 0.8677 0.839 0.7034 0.7503 0.6794 0.7209 0.7832 0.5023 0.52 0.6308 0.6959 0.6786 0.7045 0.5825 0.8145 0.6147 0.7842 0.5666 0.4727 0.4416 0.6661 0.7181 0.7704 0.3457 0.3992 0.6095 0.4647 0.3993 0.4628 0.4668 0.6275 0.8611 0.4852 0.6728 0.4283 0.2798 0.5204 0.528 0.4464 0.4615 0.4337 0.6564 0.731 0.554 0.733 0.5129 0.36 0.5861 0.6984 0.4603 0.902 0.6237 0.7004 0.4651 0.1027 0.4269 0.4027 0.6659 0.5444 0.7493 0.5254 0.5441 0.3561 0.6108 0.6854 0.6426 0.6261 0.2673 0.7292 0.5139 0.4812 0.4765 0.609 0.4051 0.8303 0.6572 0.6641 0.066 0.4141 0.7199 0.5105 0.3075 0.5161 0.7032 0.4552 0.042 0.326 0.2989 0.0266 0.0695 0.2105 0.3183 0.3611 0.6523 0.3116 0.0653 0.6296 0.6844 0.1071 0.4835 0.2397 0.6722 0.2438 0.3101 0.1974 0.3685 0.5447 0.6862 0.8721 0.157 0.6318 0.6881 0.3856 0.3562 0.5194 0.1344 0.6403 0.6503 0.6719 0.6618 0.7976 0.5549 0.3382 0.5376 0.6569 0.2084 0.298 0.672 0.546 0.3052 0.3898 0.0471 0.4507 0.6104 0.34 0.0536 0.6671 0.5215 0.628 0. 0.8496 0.4136 0.5385 0.7283] 2022-08-23 01:27:54 [INFO] [EVAL] Class Recall: [0.8363 0.9012 0.968 0.8598 0.8511 0.8351 0.8776 0.9078 0.6542 0.7799 0.6216 0.706 0.8587 0.4209 0.4262 0.5502 0.6713 0.5591 0.7882 0.5788 0.8822 0.7139 0.7492 0.7317 0.4534 0.4382 0.5088 0.5006 0.4149 0.4844 0.3275 0.7193 0.3507 0.6145 0.421 0.5784 0.5653 0.6094 0.405 0.4033 0.3691 0.1503 0.4818 0.3005 0.5003 0.3164 0.4663 0.6708 0.7198 0.8172 0.6406 0.5632 0.332 0.2372 0.8698 0.6881 0.9412 0.5514 0.5721 0.4047 0.1174 0.3517 0.5299 0.1987 0.666 0.8231 0.3108 0.6077 0.1601 0.4507 0.4892 0.756 0.4912 0.5965 0.4616 0.4666 0.6002 0.4026 0.2813 0.192 0.7906 0.4828 0.5127 0.0412 0.459 0.6756 0.0945 0.0907 0.3094 0.598 0.601 0.1638 0.366 0.1291 0.0074 0.0462 0.0252 0.1598 0.4922 0.5033 0.0718 0.0769 0.2147 0.1676 0.0033 0.6719 0.0967 0.6654 0.1154 0.258 0.226 0.877 0.1555 0.6787 0.987 0.0094 0.5458 0.9388 0.1629 0.2072 0.4535 0.0078 0.2266 0.1184 0.4815 0.2557 0.4035 0.6108 0.0326 0.4393 0.8452 0.0184 0.4624 0.2969 0.1855 0.1902 0.1678 0.0136 0.2093 0.4882 0.7139 0.0855 0.3575 0.3246 0.3422 0. 0.3479 0.0229 0.12 0.1514] 2022-08-23 01:27:55 [INFO] [EVAL] The model with the best validation mIoU (0.3413) was saved at iter 58000. 
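The three per-class arrays are tied together by a simple identity: with TP/FP/FN pixel counts, 1/IoU = 1/Precision + 1/Recall − 1, and the per-class Dice score is 2·IoU/(1 + IoU). A quick check against class 0 of the iter-67000 evaluation above, assuming the standard definitions (not PaddleSeg's code):

```python
# Per-class identities implied by the standard definitions (illustrative check only):
#   IoU = TP/(TP+FP+FN),  Precision = TP/(TP+FP),  Recall = TP/(TP+FN)
#   =>  1/IoU = 1/Precision + 1/Recall - 1,   Dice = 2*IoU / (1 + IoU)
precision, recall = 0.7804, 0.8363      # class 0 of the iter-67000 evaluation above
iou = 1.0 / (1.0 / precision + 1.0 / recall - 1.0)
dice = 2.0 * iou / (1.0 + iou)
print(round(iou, 4))    # -> 0.677, matching the logged Class IoU for class 0
print(round(dice, 4))   # per-class Dice (~0.8074); the log only reports the mean Dice
```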
2022-08-23 01:28:08 [INFO] [TRAIN] epoch: 54, iter: 67050/160000, loss: 0.6187, lr: 0.000704, batch_cost: 0.2683, reader_cost: 0.00992, ips: 29.8163 samples/sec | ETA 06:55:39 2022-08-23 01:28:20 [INFO] [TRAIN] epoch: 54, iter: 67100/160000, loss: 0.6534, lr: 0.000703, batch_cost: 0.2457, reader_cost: 0.01735, ips: 32.5655 samples/sec | ETA 06:20:21 2022-08-23 01:28:33 [INFO] [TRAIN] epoch: 54, iter: 67150/160000, loss: 0.6006, lr: 0.000703, batch_cost: 0.2473, reader_cost: 0.03311, ips: 32.3518 samples/sec | ETA 06:22:40 2022-08-23 01:28:45 [INFO] [TRAIN] epoch: 54, iter: 67200/160000, loss: 0.5920, lr: 0.000703, batch_cost: 0.2551, reader_cost: 0.00498, ips: 31.3572 samples/sec | ETA 06:34:35 2022-08-23 01:28:59 [INFO] [TRAIN] epoch: 54, iter: 67250/160000, loss: 0.6216, lr: 0.000702, batch_cost: 0.2631, reader_cost: 0.00214, ips: 30.4110 samples/sec | ETA 06:46:39 2022-08-23 01:29:12 [INFO] [TRAIN] epoch: 54, iter: 67300/160000, loss: 0.5980, lr: 0.000702, batch_cost: 0.2715, reader_cost: 0.00132, ips: 29.4666 samples/sec | ETA 06:59:27 2022-08-23 01:29:24 [INFO] [TRAIN] epoch: 54, iter: 67350/160000, loss: 0.6073, lr: 0.000701, batch_cost: 0.2429, reader_cost: 0.02974, ips: 32.9335 samples/sec | ETA 06:15:05 2022-08-23 01:29:37 [INFO] [TRAIN] epoch: 54, iter: 67400/160000, loss: 0.6395, lr: 0.000701, batch_cost: 0.2476, reader_cost: 0.00905, ips: 32.3045 samples/sec | ETA 06:22:11 2022-08-23 01:29:50 [INFO] [TRAIN] epoch: 54, iter: 67450/160000, loss: 0.6217, lr: 0.000701, batch_cost: 0.2663, reader_cost: 0.00625, ips: 30.0455 samples/sec | ETA 06:50:42 2022-08-23 01:30:03 [INFO] [TRAIN] epoch: 54, iter: 67500/160000, loss: 0.6335, lr: 0.000700, batch_cost: 0.2695, reader_cost: 0.00420, ips: 29.6865 samples/sec | ETA 06:55:27 2022-08-23 01:30:16 [INFO] [TRAIN] epoch: 54, iter: 67550/160000, loss: 0.5812, lr: 0.000700, batch_cost: 0.2585, reader_cost: 0.00138, ips: 30.9441 samples/sec | ETA 06:38:21 2022-08-23 01:30:27 [INFO] [TRAIN] epoch: 54, iter: 67600/160000, loss: 0.6033, lr: 0.000700, batch_cost: 0.2205, reader_cost: 0.00049, ips: 36.2796 samples/sec | ETA 05:39:35 2022-08-23 01:30:38 [INFO] [TRAIN] epoch: 54, iter: 67650/160000, loss: 0.6456, lr: 0.000699, batch_cost: 0.2174, reader_cost: 0.00057, ips: 36.7991 samples/sec | ETA 05:34:36 2022-08-23 01:30:50 [INFO] [TRAIN] epoch: 54, iter: 67700/160000, loss: 0.6354, lr: 0.000699, batch_cost: 0.2295, reader_cost: 0.00051, ips: 34.8648 samples/sec | ETA 05:52:58 2022-08-23 01:30:59 [INFO] [TRAIN] epoch: 54, iter: 67750/160000, loss: 0.6121, lr: 0.000698, batch_cost: 0.1945, reader_cost: 0.00043, ips: 41.1261 samples/sec | ETA 04:59:04 2022-08-23 01:31:09 [INFO] [TRAIN] epoch: 54, iter: 67800/160000, loss: 0.6791, lr: 0.000698, batch_cost: 0.1920, reader_cost: 0.00046, ips: 41.6700 samples/sec | ETA 04:55:01 2022-08-23 01:31:19 [INFO] [TRAIN] epoch: 54, iter: 67850/160000, loss: 0.6382, lr: 0.000698, batch_cost: 0.2016, reader_cost: 0.00050, ips: 39.6816 samples/sec | ETA 05:09:37 2022-08-23 01:31:30 [INFO] [TRAIN] epoch: 54, iter: 67900/160000, loss: 0.6222, lr: 0.000697, batch_cost: 0.2232, reader_cost: 0.00057, ips: 35.8472 samples/sec | ETA 05:42:33 2022-08-23 01:31:42 [INFO] [TRAIN] epoch: 54, iter: 67950/160000, loss: 0.6465, lr: 0.000697, batch_cost: 0.2280, reader_cost: 0.00142, ips: 35.0920 samples/sec | ETA 05:49:44 2022-08-23 01:31:52 [INFO] [TRAIN] epoch: 54, iter: 68000/160000, loss: 0.6268, lr: 0.000697, batch_cost: 0.2014, reader_cost: 0.00088, ips: 39.7159 samples/sec | ETA 05:08:51 2022-08-23 01:31:52 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 190s - batch_cost: 0.1903 - reader cost: 6.7057e-04 2022-08-23 01:35:02 [INFO] [EVAL] #Images: 2000 mIoU: 0.3416 Acc: 0.7580 Kappa: 0.7397 Dice: 0.4727 2022-08-23 01:35:02 [INFO] [EVAL] Class IoU: [0.6756 0.7657 0.9297 0.7142 0.6806 0.75 0.7617 0.7743 0.516 0.6185 0.4731 0.5612 0.6967 0.2866 0.2873 0.426 0.5206 0.4532 0.5974 0.3902 0.7433 0.4072 0.6168 0.4659 0.3322 0.4126 0.4191 0.419 0.3919 0.246 0.2429 0.4941 0.2688 0.344 0.27 0.3657 0.4124 0.5645 0.2848 0.336 0.1963 0.0876 0.3346 0.2513 0.3256 0.2685 0.2103 0.5071 0.5111 0.5757 0.4915 0.4423 0.1969 0.1499 0.6753 0.511 0.8346 0.427 0.4697 0.2307 0.0794 0.2524 0.3209 0.1883 0.4123 0.6674 0.2909 0.4456 0.1215 0.3549 0.4792 0.5206 0.3912 0.2472 0.4421 0.3009 0.449 0.2775 0.2903 0.0826 0.6294 0.3697 0.3219 0.0176 0.3205 0.5283 0.0842 0.086 0.2508 0.4608 0.3444 0.0207 0.2448 0.0778 0.004 0.0253 0.0275 0.0936 0.2278 0.4068 0.0645 0.0253 0.2709 0.2056 0.0362 0.4912 0.054 0.5327 0.0735 0.3157 0.0894 0.4593 0.142 0.593 0.8487 0.0063 0.457 0.596 0.1838 0.1553 0.441 0.0086 0.2138 0.1205 0.3378 0.2368 0.4063 0.4202 0.1912 0.3232 0.5668 0.0583 0.2899 0.2152 0.2116 0.1274 0.1198 0.0104 0.1856 0.3302 0.4812 0.019 0.3213 0.3008 0.2839 0. 0.375 0.0209 0.1326 0.1253] 2022-08-23 01:35:02 [INFO] [EVAL] Class Precision: [0.7756 0.869 0.9616 0.8028 0.7848 0.8741 0.8809 0.8429 0.6793 0.7546 0.7258 0.7184 0.7884 0.492 0.4975 0.5724 0.6951 0.6476 0.7367 0.617 0.8266 0.6416 0.7569 0.5951 0.4949 0.4539 0.5126 0.5495 0.6972 0.331 0.3714 0.6138 0.4346 0.4442 0.4222 0.4742 0.6335 0.8271 0.5384 0.6502 0.3741 0.2372 0.5153 0.5672 0.4599 0.5007 0.4557 0.7247 0.5337 0.6615 0.5932 0.6585 0.3345 0.443 0.7115 0.7812 0.8657 0.6335 0.7609 0.3615 0.1173 0.4071 0.5479 0.6268 0.4847 0.7884 0.5577 0.5725 0.3173 0.6579 0.6477 0.6683 0.5869 0.3382 0.7201 0.551 0.5152 0.566 0.6371 0.2553 0.7346 0.7113 0.7805 0.0668 0.479 0.7474 0.4604 0.2967 0.5274 0.6655 0.4364 0.0376 0.3803 0.3624 0.0253 0.0759 0.3977 0.3747 0.5008 0.6233 0.3945 0.0427 0.548 0.6115 0.699 0.5475 0.1384 0.7617 0.1934 0.4584 0.2705 0.7532 0.5971 0.7062 0.8587 0.1118 0.749 0.6771 0.3238 0.3695 0.6019 0.2587 0.4778 0.6729 0.683 0.6586 0.774 0.5943 0.4185 0.5661 0.6317 0.3027 0.5467 0.6263 0.524 0.3385 0.4147 0.046 0.3521 0.7215 0.5614 0.0381 0.6695 0.515 0.6335 0. 0.7564 0.5091 0.5727 0.8454] 2022-08-23 01:35:02 [INFO] [EVAL] Class Recall: [0.8396 0.8656 0.9656 0.8662 0.8367 0.8408 0.8492 0.9049 0.6821 0.7742 0.576 0.7194 0.857 0.4072 0.4047 0.625 0.6746 0.6016 0.7596 0.5148 0.8807 0.5271 0.7693 0.682 0.5026 0.819 0.6967 0.6383 0.4722 0.4893 0.4124 0.717 0.4135 0.6037 0.4282 0.6151 0.5417 0.64 0.3768 0.4102 0.2923 0.1219 0.4883 0.3109 0.5272 0.3668 0.2809 0.6281 0.9235 0.8161 0.7414 0.574 0.3238 0.1847 0.93 0.5964 0.9588 0.567 0.551 0.3894 0.1975 0.3991 0.4365 0.2121 0.7339 0.813 0.3781 0.6678 0.1645 0.4352 0.6483 0.702 0.5398 0.4789 0.5338 0.3986 0.7775 0.3524 0.3479 0.1089 0.8146 0.435 0.354 0.0233 0.492 0.6432 0.0934 0.1081 0.3235 0.5997 0.6202 0.0439 0.4074 0.0901 0.0047 0.0365 0.0287 0.1109 0.2947 0.5394 0.0716 0.0587 0.3489 0.2365 0.0368 0.8268 0.0813 0.6392 0.106 0.5036 0.1177 0.5407 0.1571 0.7873 0.9864 0.0066 0.5396 0.8326 0.2984 0.2113 0.6226 0.0089 0.279 0.128 0.4006 0.2699 0.461 0.5893 0.2603 0.4296 0.8464 0.0673 0.3816 0.2469 0.2619 0.1696 0.1442 0.0133 0.2819 0.3785 0.7711 0.0366 0.3818 0.4196 0.3398 0. 0.4264 0.0213 0.1472 0.1282] 2022-08-23 01:35:02 [INFO] [EVAL] The model with the best validation mIoU (0.3416) was saved at iter 68000. 
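The evaluation above improves the best validation mIoU to 0.3416 at iter 68000, and later [EVAL] blocks keep referring back to that checkpoint. The bookkeeping behind such a message is typically just a running maximum that triggers a checkpoint copy; the sketch below shows that pattern with a hypothetical helper, and is not PaddleSeg's API.

```python
# Minimal sketch of "keep the best checkpoint" bookkeeping (hypothetical helper, not PaddleSeg's API).
import os
import shutil

best_miou, best_iter = float("-inf"), None

def maybe_save_best(miou, cur_iter, save_dir, cur_ckpt_dir):
    """Copy the current checkpoint to <save_dir>/best_model whenever mIoU improves."""
    global best_miou, best_iter
    if miou > best_miou:
        best_miou, best_iter = miou, cur_iter
        best_dir = os.path.join(save_dir, "best_model")
        if os.path.isdir(best_dir):
            shutil.rmtree(best_dir)
        shutil.copytree(cur_ckpt_dir, best_dir)
    print(f"The model with the best validation mIoU ({best_miou:.4f}) "
          f"was saved at iter {best_iter}.")
```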
2022-08-23 01:35:16 [INFO] [TRAIN] epoch: 54, iter: 68050/160000, loss: 0.6394, lr: 0.000696, batch_cost: 0.2611, reader_cost: 0.00483, ips: 30.6394 samples/sec | ETA 06:40:08 2022-08-23 01:35:27 [INFO] [TRAIN] epoch: 54, iter: 68100/160000, loss: 0.5839, lr: 0.000696, batch_cost: 0.2387, reader_cost: 0.00124, ips: 33.5119 samples/sec | ETA 06:05:38 2022-08-23 01:35:42 [INFO] [TRAIN] epoch: 54, iter: 68150/160000, loss: 0.6172, lr: 0.000695, batch_cost: 0.2817, reader_cost: 0.00091, ips: 28.3971 samples/sec | ETA 07:11:15 2022-08-23 01:35:54 [INFO] [TRAIN] epoch: 54, iter: 68200/160000, loss: 0.6417, lr: 0.000695, batch_cost: 0.2529, reader_cost: 0.00536, ips: 31.6370 samples/sec | ETA 06:26:53 2022-08-23 01:36:11 [INFO] [TRAIN] epoch: 55, iter: 68250/160000, loss: 0.6161, lr: 0.000695, batch_cost: 0.3429, reader_cost: 0.04930, ips: 23.3272 samples/sec | ETA 08:44:25 2022-08-23 01:36:24 [INFO] [TRAIN] epoch: 55, iter: 68300/160000, loss: 0.5859, lr: 0.000694, batch_cost: 0.2525, reader_cost: 0.00123, ips: 31.6840 samples/sec | ETA 06:25:53 2022-08-23 01:36:36 [INFO] [TRAIN] epoch: 55, iter: 68350/160000, loss: 0.6580, lr: 0.000694, batch_cost: 0.2410, reader_cost: 0.02075, ips: 33.1987 samples/sec | ETA 06:08:05 2022-08-23 01:36:49 [INFO] [TRAIN] epoch: 55, iter: 68400/160000, loss: 0.6094, lr: 0.000694, batch_cost: 0.2604, reader_cost: 0.00850, ips: 30.7211 samples/sec | ETA 06:37:33 2022-08-23 01:37:02 [INFO] [TRAIN] epoch: 55, iter: 68450/160000, loss: 0.6611, lr: 0.000693, batch_cost: 0.2540, reader_cost: 0.00084, ips: 31.4910 samples/sec | ETA 06:27:37 2022-08-23 01:37:15 [INFO] [TRAIN] epoch: 55, iter: 68500/160000, loss: 0.6821, lr: 0.000693, batch_cost: 0.2688, reader_cost: 0.01650, ips: 29.7602 samples/sec | ETA 06:49:56 2022-08-23 01:37:28 [INFO] [TRAIN] epoch: 55, iter: 68550/160000, loss: 0.6077, lr: 0.000692, batch_cost: 0.2654, reader_cost: 0.00048, ips: 30.1409 samples/sec | ETA 06:44:32 2022-08-23 01:37:41 [INFO] [TRAIN] epoch: 55, iter: 68600/160000, loss: 0.6271, lr: 0.000692, batch_cost: 0.2507, reader_cost: 0.00116, ips: 31.9147 samples/sec | ETA 06:21:51 2022-08-23 01:37:52 [INFO] [TRAIN] epoch: 55, iter: 68650/160000, loss: 0.5921, lr: 0.000692, batch_cost: 0.2099, reader_cost: 0.00074, ips: 38.1064 samples/sec | ETA 05:19:37 2022-08-23 01:38:03 [INFO] [TRAIN] epoch: 55, iter: 68700/160000, loss: 0.6399, lr: 0.000691, batch_cost: 0.2205, reader_cost: 0.00056, ips: 36.2750 samples/sec | ETA 05:35:35 2022-08-23 01:38:14 [INFO] [TRAIN] epoch: 55, iter: 68750/160000, loss: 0.5653, lr: 0.000691, batch_cost: 0.2308, reader_cost: 0.00082, ips: 34.6598 samples/sec | ETA 05:51:01 2022-08-23 01:38:26 [INFO] [TRAIN] epoch: 55, iter: 68800/160000, loss: 0.6348, lr: 0.000690, batch_cost: 0.2306, reader_cost: 0.00070, ips: 34.6898 samples/sec | ETA 05:50:32 2022-08-23 01:38:36 [INFO] [TRAIN] epoch: 55, iter: 68850/160000, loss: 0.6117, lr: 0.000690, batch_cost: 0.2043, reader_cost: 0.00097, ips: 39.1657 samples/sec | ETA 05:10:18 2022-08-23 01:38:46 [INFO] [TRAIN] epoch: 55, iter: 68900/160000, loss: 0.6114, lr: 0.000690, batch_cost: 0.1968, reader_cost: 0.00235, ips: 40.6495 samples/sec | ETA 04:58:48 2022-08-23 01:38:55 [INFO] [TRAIN] epoch: 55, iter: 68950/160000, loss: 0.6868, lr: 0.000689, batch_cost: 0.1851, reader_cost: 0.00042, ips: 43.2269 samples/sec | ETA 04:40:50 2022-08-23 01:39:05 [INFO] [TRAIN] epoch: 55, iter: 69000/160000, loss: 0.6139, lr: 0.000689, batch_cost: 0.1998, reader_cost: 0.00052, ips: 40.0338 samples/sec | ETA 05:03:04 2022-08-23 01:39:05 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 174s - batch_cost: 0.1735 - reader cost: 6.6730e-04 2022-08-23 01:41:59 [INFO] [EVAL] #Images: 2000 mIoU: 0.3329 Acc: 0.7606 Kappa: 0.7425 Dice: 0.4624 2022-08-23 01:41:59 [INFO] [EVAL] Class IoU: [0.6815 0.7754 0.9287 0.7266 0.6669 0.7555 0.7633 0.7655 0.5159 0.6542 0.4759 0.5506 0.6968 0.3017 0.2836 0.432 0.5163 0.4169 0.5961 0.411 0.7362 0.4644 0.6189 0.4741 0.3178 0.4088 0.4367 0.4362 0.4086 0.2524 0.221 0.4752 0.2865 0.3428 0.251 0.3824 0.4159 0.5122 0.2646 0.3852 0.2254 0.1038 0.3444 0.255 0.2927 0.2587 0.2999 0.4831 0.5651 0.5506 0.5245 0.4094 0.1842 0.2208 0.699 0.4419 0.8602 0.3743 0.4916 0.2665 0.0602 0.1868 0.3445 0.1629 0.43 0.655 0.292 0.3814 0.0945 0.3307 0.4902 0.5425 0.3149 0.2206 0.4426 0.3237 0.4398 0.2392 0.2777 0.1489 0.1713 0.3509 0.3143 0.0095 0.3466 0.5313 0.0905 0.0637 0.2408 0.4201 0.396 0.0212 0.2032 0.0804 0.0256 0.0168 0.0293 0.0935 0.1876 0.3767 0.0754 0.0154 0.2013 0.1969 0.0007 0.5172 0.05 0.4624 0.044 0.3806 0.0905 0.4404 0.1271 0.4798 0.8013 0.0136 0.3235 0.5989 0.1044 0.2804 0.4332 0.005 0.1929 0.132 0.301 0.2296 0.444 0.4225 0.2228 0.311 0.61 0.0297 0.331 0.2063 0.186 0.1374 0.1095 0.0146 0.147 0.3352 0.4489 0.0447 0.3292 0.1403 0.1808 0. 0.3328 0.0251 0.1321 0.0948] 2022-08-23 01:41:59 [INFO] [EVAL] Class Precision: [0.7909 0.852 0.9687 0.8326 0.736 0.8367 0.8916 0.8335 0.6419 0.7805 0.6661 0.6941 0.7846 0.5038 0.5434 0.5917 0.6743 0.7041 0.7405 0.5746 0.8093 0.6166 0.7713 0.5871 0.4948 0.5405 0.6306 0.7263 0.6555 0.3972 0.4294 0.6517 0.5324 0.4746 0.3793 0.5239 0.6049 0.8234 0.4926 0.5929 0.3971 0.3236 0.5842 0.5402 0.4343 0.4553 0.4904 0.6614 0.7158 0.636 0.7635 0.5586 0.239 0.5276 0.7194 0.7214 0.9239 0.6922 0.8225 0.5234 0.0954 0.3745 0.4969 0.7289 0.5312 0.8161 0.4232 0.5791 0.1857 0.518 0.6508 0.6527 0.6005 0.2824 0.714 0.4702 0.5087 0.5927 0.2845 0.4181 0.8847 0.6964 0.7724 0.0224 0.4558 0.6761 0.4338 0.3479 0.4209 0.6159 0.5541 0.023 0.3296 0.3297 0.138 0.0483 0.3502 0.3408 0.3425 0.6135 0.2359 0.0309 0.4201 0.7969 0.0121 0.5695 0.1456 0.7986 0.1996 0.7434 0.3191 0.7974 0.4924 0.5348 0.8085 0.0797 0.6817 0.6953 0.2758 0.5695 0.4987 0.2175 0.739 0.5407 0.7074 0.7097 0.912 0.5962 0.582 0.4876 0.7133 0.2494 0.5349 0.5964 0.4716 0.2533 0.3511 0.059 0.6034 0.6356 0.516 0.0669 0.699 0.3949 0.5769 0. 0.8567 0.3337 0.5897 0.5997] 2022-08-23 01:41:59 [INFO] [EVAL] Class Recall: [0.8313 0.8961 0.9574 0.851 0.8766 0.8862 0.8414 0.9036 0.7244 0.8017 0.6249 0.7271 0.8617 0.4293 0.3723 0.6154 0.6878 0.5055 0.7535 0.5907 0.8908 0.6531 0.758 0.7112 0.4703 0.6265 0.5868 0.522 0.5204 0.4091 0.313 0.637 0.3828 0.5523 0.4259 0.586 0.571 0.5755 0.3638 0.5236 0.3426 0.1326 0.4562 0.3257 0.473 0.3747 0.4358 0.6419 0.7286 0.8039 0.6263 0.605 0.4455 0.2751 0.961 0.5327 0.9259 0.449 0.5499 0.3519 0.1404 0.2715 0.529 0.1734 0.6931 0.7684 0.485 0.5278 0.1612 0.4778 0.6651 0.7626 0.3984 0.502 0.5379 0.5096 0.7644 0.2862 0.9205 0.1878 0.1752 0.4143 0.3463 0.0162 0.5913 0.7128 0.1026 0.0724 0.3602 0.5692 0.5813 0.215 0.3464 0.0961 0.0304 0.025 0.031 0.1142 0.2931 0.4938 0.0997 0.0298 0.2788 0.2073 0.0007 0.8493 0.0709 0.5235 0.0534 0.4382 0.1122 0.4958 0.1463 0.8235 0.9889 0.0161 0.381 0.812 0.1438 0.3558 0.7672 0.0051 0.207 0.1487 0.3438 0.2533 0.4638 0.5918 0.2653 0.4619 0.8081 0.0326 0.4647 0.2398 0.2349 0.2309 0.1373 0.0191 0.1628 0.4149 0.7752 0.1189 0.3836 0.1788 0.2084 0. 
0.3525 0.0265 0.1455 0.1013] 2022-08-23 01:41:59 [INFO] [EVAL] The model with the best validation mIoU (0.3416) was saved at iter 68000. 2022-08-23 01:42:11 [INFO] [TRAIN] epoch: 55, iter: 69050/160000, loss: 0.6037, lr: 0.000689, batch_cost: 0.2501, reader_cost: 0.00422, ips: 31.9858 samples/sec | ETA 06:19:07 2022-08-23 01:42:24 [INFO] [TRAIN] epoch: 55, iter: 69100/160000, loss: 0.6341, lr: 0.000688, batch_cost: 0.2592, reader_cost: 0.00175, ips: 30.8698 samples/sec | ETA 06:32:36 2022-08-23 01:42:37 [INFO] [TRAIN] epoch: 55, iter: 69150/160000, loss: 0.6610, lr: 0.000688, batch_cost: 0.2582, reader_cost: 0.01225, ips: 30.9826 samples/sec | ETA 06:30:58 2022-08-23 01:42:50 [INFO] [TRAIN] epoch: 55, iter: 69200/160000, loss: 0.6222, lr: 0.000687, batch_cost: 0.2641, reader_cost: 0.00069, ips: 30.2929 samples/sec | ETA 06:39:39 2022-08-23 01:43:04 [INFO] [TRAIN] epoch: 55, iter: 69250/160000, loss: 0.6456, lr: 0.000687, batch_cost: 0.2700, reader_cost: 0.02025, ips: 29.6334 samples/sec | ETA 06:48:19 2022-08-23 01:43:17 [INFO] [TRAIN] epoch: 55, iter: 69300/160000, loss: 0.6255, lr: 0.000687, batch_cost: 0.2595, reader_cost: 0.01180, ips: 30.8298 samples/sec | ETA 06:32:15 2022-08-23 01:43:29 [INFO] [TRAIN] epoch: 55, iter: 69350/160000, loss: 0.6549, lr: 0.000686, batch_cost: 0.2477, reader_cost: 0.00268, ips: 32.3001 samples/sec | ETA 06:14:11 2022-08-23 01:43:44 [INFO] [TRAIN] epoch: 55, iter: 69400/160000, loss: 0.6198, lr: 0.000686, batch_cost: 0.2978, reader_cost: 0.00122, ips: 26.8657 samples/sec | ETA 07:29:38 2022-08-23 01:43:59 [INFO] [TRAIN] epoch: 55, iter: 69450/160000, loss: 0.6774, lr: 0.000686, batch_cost: 0.2948, reader_cost: 0.00113, ips: 27.1405 samples/sec | ETA 07:24:50 2022-08-23 01:44:17 [INFO] [TRAIN] epoch: 56, iter: 69500/160000, loss: 0.5934, lr: 0.000685, batch_cost: 0.3547, reader_cost: 0.08190, ips: 22.5542 samples/sec | ETA 08:55:00 2022-08-23 01:44:29 [INFO] [TRAIN] epoch: 56, iter: 69550/160000, loss: 0.5964, lr: 0.000685, batch_cost: 0.2551, reader_cost: 0.00151, ips: 31.3623 samples/sec | ETA 06:24:32 2022-08-23 01:44:42 [INFO] [TRAIN] epoch: 56, iter: 69600/160000, loss: 0.5490, lr: 0.000684, batch_cost: 0.2615, reader_cost: 0.00048, ips: 30.5870 samples/sec | ETA 06:34:04 2022-08-23 01:44:54 [INFO] [TRAIN] epoch: 56, iter: 69650/160000, loss: 0.5927, lr: 0.000684, batch_cost: 0.2234, reader_cost: 0.00077, ips: 35.8112 samples/sec | ETA 05:36:23 2022-08-23 01:45:04 [INFO] [TRAIN] epoch: 56, iter: 69700/160000, loss: 0.5818, lr: 0.000684, batch_cost: 0.2115, reader_cost: 0.00089, ips: 37.8326 samples/sec | ETA 05:18:14 2022-08-23 01:45:15 [INFO] [TRAIN] epoch: 56, iter: 69750/160000, loss: 0.6101, lr: 0.000683, batch_cost: 0.2204, reader_cost: 0.00116, ips: 36.2987 samples/sec | ETA 05:31:30 2022-08-23 01:45:27 [INFO] [TRAIN] epoch: 56, iter: 69800/160000, loss: 0.5911, lr: 0.000683, batch_cost: 0.2256, reader_cost: 0.00034, ips: 35.4567 samples/sec | ETA 05:39:11 2022-08-23 01:45:38 [INFO] [TRAIN] epoch: 56, iter: 69850/160000, loss: 0.6185, lr: 0.000683, batch_cost: 0.2257, reader_cost: 0.00067, ips: 35.4529 samples/sec | ETA 05:39:02 2022-08-23 01:45:50 [INFO] [TRAIN] epoch: 56, iter: 69900/160000, loss: 0.5954, lr: 0.000682, batch_cost: 0.2360, reader_cost: 0.00085, ips: 33.8921 samples/sec | ETA 05:54:27 2022-08-23 01:46:00 [INFO] [TRAIN] epoch: 56, iter: 69950/160000, loss: 0.5852, lr: 0.000682, batch_cost: 0.2127, reader_cost: 0.00047, ips: 37.6182 samples/sec | ETA 05:19:10 2022-08-23 01:46:11 [INFO] [TRAIN] epoch: 56, iter: 70000/160000, loss: 
0.6181, lr: 0.000681, batch_cost: 0.2059, reader_cost: 0.00046, ips: 38.8538 samples/sec | ETA 05:08:50 2022-08-23 01:46:11 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 201s - batch_cost: 0.2008 - reader cost: 0.0014 2022-08-23 01:49:32 [INFO] [EVAL] #Images: 2000 mIoU: 0.3378 Acc: 0.7588 Kappa: 0.7405 Dice: 0.4690 2022-08-23 01:49:32 [INFO] [EVAL] Class IoU: [0.6761 0.773 0.9301 0.718 0.6786 0.7492 0.7725 0.7725 0.5257 0.6283 0.4676 0.5438 0.6889 0.3174 0.2741 0.4228 0.5129 0.4274 0.608 0.3958 0.7246 0.4358 0.6119 0.481 0.303 0.3494 0.4465 0.4238 0.4122 0.2326 0.2373 0.4659 0.2938 0.3418 0.2353 0.399 0.3994 0.5365 0.2743 0.3878 0.2054 0.133 0.3627 0.248 0.2789 0.2627 0.3048 0.4943 0.4869 0.5661 0.5116 0.2782 0.2015 0.2297 0.7065 0.4118 0.8505 0.4118 0.4698 0.2581 0.0833 0.2978 0.3185 0.1858 0.4113 0.6628 0.276 0.3975 0.0841 0.3456 0.4642 0.5176 0.42 0.2025 0.4186 0.2874 0.4084 0.2339 0.3562 0.1763 0.6578 0.3763 0.3296 0.0209 0.1561 0.5412 0.1307 0.071 0.2169 0.5034 0.4245 0.0946 0.1707 0.0689 0.0026 0.0205 0.0313 0.1986 0.2049 0.4076 0.0375 0.0152 0.1614 0.0889 0.0384 0.4476 0.0783 0.4874 0.1328 0.3128 0.0512 0.4337 0.1375 0.5909 0.8176 0.0117 0.2783 0.6146 0.1429 0.191 0.4268 0.0169 0.2701 0.1588 0.3442 0.3144 0.405 0.3972 0.2686 0.2743 0.5315 0.052 0.3072 0.2118 0.1369 0.1229 0.1301 0.0287 0.1918 0.3753 0.5256 0.0076 0.397 0.1127 0.2218 0. 0.3259 0.0207 0.1329 0.1706] 2022-08-23 01:49:32 [INFO] [EVAL] Class Precision: [0.7747 0.8625 0.9707 0.816 0.7688 0.8783 0.8873 0.8656 0.683 0.7382 0.6722 0.7238 0.7668 0.5147 0.5304 0.5506 0.6586 0.6922 0.7457 0.581 0.7964 0.611 0.7108 0.6093 0.525 0.4997 0.5775 0.6056 0.6319 0.3775 0.4311 0.6137 0.4556 0.4349 0.4327 0.5426 0.5432 0.7428 0.5513 0.53 0.4592 0.2825 0.6899 0.5195 0.3873 0.4453 0.5106 0.6875 0.73 0.6994 0.6439 0.3522 0.3342 0.5525 0.7483 0.5245 0.8918 0.603 0.6946 0.4498 0.1491 0.4427 0.4684 0.777 0.4924 0.768 0.3528 0.5246 0.2459 0.5907 0.6301 0.6066 0.6221 0.2802 0.5782 0.4811 0.4622 0.4877 0.6556 0.4854 0.842 0.7171 0.7699 0.0361 0.3749 0.6981 0.4372 0.4145 0.3566 0.7268 0.6029 0.2338 0.4062 0.3559 0.0079 0.0597 0.376 0.3411 0.2626 0.6154 0.252 0.0365 0.7077 0.5487 0.5781 0.4745 0.1658 0.7618 0.2495 0.4585 0.2794 0.5103 0.4369 0.686 0.8278 0.1288 0.7867 0.7105 0.3066 0.5957 0.6447 0.2236 0.5894 0.5863 0.6817 0.6597 0.8648 0.5579 0.4897 0.3963 0.5814 0.3328 0.5545 0.5967 0.4434 0.3807 0.3269 0.046 0.5197 0.6271 0.6266 0.0112 0.6262 0.3184 0.5419 0. 
0.8897 0.4073 0.5837 0.7499] 2022-08-23 01:49:32 [INFO] [EVAL] Class Recall: [0.8416 0.8817 0.9569 0.8566 0.8527 0.8359 0.8566 0.8779 0.6953 0.8084 0.6058 0.6862 0.8716 0.453 0.362 0.6455 0.6987 0.5277 0.7669 0.5539 0.8893 0.6031 0.8148 0.6956 0.4175 0.5372 0.6631 0.5854 0.5425 0.3773 0.3455 0.6592 0.4528 0.6147 0.3402 0.6012 0.6014 0.6589 0.3532 0.591 0.271 0.2009 0.4334 0.3219 0.4992 0.3905 0.4306 0.6375 0.5939 0.7481 0.7135 0.5699 0.3365 0.2822 0.9267 0.6572 0.9484 0.565 0.5921 0.3772 0.1589 0.4764 0.4988 0.1962 0.7141 0.8288 0.5591 0.6214 0.1133 0.4545 0.6382 0.7791 0.5639 0.4221 0.6027 0.4165 0.7785 0.3102 0.4382 0.2168 0.7504 0.442 0.3656 0.0472 0.211 0.7065 0.1572 0.0789 0.3563 0.6209 0.5892 0.137 0.2274 0.0788 0.0038 0.0302 0.033 0.3222 0.4823 0.547 0.0421 0.0254 0.173 0.0959 0.0395 0.8874 0.1292 0.5751 0.2211 0.4961 0.059 0.7427 0.1672 0.8101 0.9851 0.0127 0.301 0.8199 0.211 0.2195 0.5581 0.018 0.3328 0.1789 0.4101 0.3753 0.4324 0.5797 0.3729 0.4711 0.861 0.0581 0.4078 0.2472 0.1654 0.1537 0.1777 0.0706 0.2331 0.4831 0.7653 0.0236 0.5203 0.1486 0.273 0. 0.3396 0.0213 0.1468 0.1808] 2022-08-23 01:49:32 [INFO] [EVAL] The model with the best validation mIoU (0.3416) was saved at iter 68000. 2022-08-23 01:49:43 [INFO] [TRAIN] epoch: 56, iter: 70050/160000, loss: 0.5642, lr: 0.000681, batch_cost: 0.2206, reader_cost: 0.00286, ips: 36.2659 samples/sec | ETA 05:30:42 2022-08-23 01:49:56 [INFO] [TRAIN] epoch: 56, iter: 70100/160000, loss: 0.6372, lr: 0.000681, batch_cost: 0.2668, reader_cost: 0.00138, ips: 29.9828 samples/sec | ETA 06:39:47 2022-08-23 01:50:09 [INFO] [TRAIN] epoch: 56, iter: 70150/160000, loss: 0.6644, lr: 0.000680, batch_cost: 0.2491, reader_cost: 0.00053, ips: 32.1213 samples/sec | ETA 06:12:57 2022-08-23 01:50:21 [INFO] [TRAIN] epoch: 56, iter: 70200/160000, loss: 0.5991, lr: 0.000680, batch_cost: 0.2553, reader_cost: 0.00094, ips: 31.3406 samples/sec | ETA 06:22:02 2022-08-23 01:50:33 [INFO] [TRAIN] epoch: 56, iter: 70250/160000, loss: 0.6014, lr: 0.000680, batch_cost: 0.2365, reader_cost: 0.00210, ips: 33.8205 samples/sec | ETA 05:53:49 2022-08-23 01:50:46 [INFO] [TRAIN] epoch: 56, iter: 70300/160000, loss: 0.6585, lr: 0.000679, batch_cost: 0.2557, reader_cost: 0.00045, ips: 31.2866 samples/sec | ETA 06:22:16 2022-08-23 01:50:58 [INFO] [TRAIN] epoch: 56, iter: 70350/160000, loss: 0.6245, lr: 0.000679, batch_cost: 0.2350, reader_cost: 0.00063, ips: 34.0419 samples/sec | ETA 05:51:08 2022-08-23 01:51:12 [INFO] [TRAIN] epoch: 56, iter: 70400/160000, loss: 0.6103, lr: 0.000678, batch_cost: 0.2826, reader_cost: 0.00046, ips: 28.3073 samples/sec | ETA 07:02:02 2022-08-23 01:51:26 [INFO] [TRAIN] epoch: 56, iter: 70450/160000, loss: 0.5965, lr: 0.000678, batch_cost: 0.2796, reader_cost: 0.00080, ips: 28.6076 samples/sec | ETA 06:57:22 2022-08-23 01:51:38 [INFO] [TRAIN] epoch: 56, iter: 70500/160000, loss: 0.6452, lr: 0.000678, batch_cost: 0.2492, reader_cost: 0.00078, ips: 32.0981 samples/sec | ETA 06:11:46 2022-08-23 01:51:51 [INFO] [TRAIN] epoch: 56, iter: 70550/160000, loss: 0.6248, lr: 0.000677, batch_cost: 0.2590, reader_cost: 0.00163, ips: 30.8834 samples/sec | ETA 06:26:11 2022-08-23 01:52:05 [INFO] [TRAIN] epoch: 56, iter: 70600/160000, loss: 0.6203, lr: 0.000677, batch_cost: 0.2667, reader_cost: 0.00100, ips: 29.9957 samples/sec | ETA 06:37:23 2022-08-23 01:52:18 [INFO] [TRAIN] epoch: 56, iter: 70650/160000, loss: 0.6333, lr: 0.000676, batch_cost: 0.2683, reader_cost: 0.00294, ips: 29.8137 samples/sec | ETA 06:39:35 2022-08-23 01:52:30 [INFO] [TRAIN] epoch: 56, 
iter: 70700/160000, loss: 0.6819, lr: 0.000676, batch_cost: 0.2476, reader_cost: 0.00161, ips: 32.3075 samples/sec | ETA 06:08:32 2022-08-23 01:52:45 [INFO] [TRAIN] epoch: 57, iter: 70750/160000, loss: 0.6221, lr: 0.000676, batch_cost: 0.2874, reader_cost: 0.06522, ips: 27.8344 samples/sec | ETA 07:07:31 2022-08-23 01:52:54 [INFO] [TRAIN] epoch: 57, iter: 70800/160000, loss: 0.6293, lr: 0.000675, batch_cost: 0.1765, reader_cost: 0.00060, ips: 45.3136 samples/sec | ETA 04:22:28 2022-08-23 01:53:04 [INFO] [TRAIN] epoch: 57, iter: 70850/160000, loss: 0.6019, lr: 0.000675, batch_cost: 0.2021, reader_cost: 0.00092, ips: 39.5810 samples/sec | ETA 05:00:18 2022-08-23 01:53:14 [INFO] [TRAIN] epoch: 57, iter: 70900/160000, loss: 0.6099, lr: 0.000675, batch_cost: 0.1952, reader_cost: 0.00043, ips: 40.9808 samples/sec | ETA 04:49:53 2022-08-23 01:53:23 [INFO] [TRAIN] epoch: 57, iter: 70950/160000, loss: 0.6001, lr: 0.000674, batch_cost: 0.1878, reader_cost: 0.00050, ips: 42.6098 samples/sec | ETA 04:38:39 2022-08-23 01:53:31 [INFO] [TRAIN] epoch: 57, iter: 71000/160000, loss: 0.5705, lr: 0.000674, batch_cost: 0.1671, reader_cost: 0.00060, ips: 47.8675 samples/sec | ETA 04:07:54 2022-08-23 01:53:31 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 156s - batch_cost: 0.1556 - reader cost: 8.9498e-04 2022-08-23 01:56:07 [INFO] [EVAL] #Images: 2000 mIoU: 0.3371 Acc: 0.7579 Kappa: 0.7396 Dice: 0.4664 2022-08-23 01:56:07 [INFO] [EVAL] Class IoU: [0.6745 0.7676 0.9277 0.7265 0.6896 0.7476 0.7729 0.7672 0.5145 0.5845 0.4904 0.5438 0.677 0.3041 0.2555 0.4207 0.5396 0.4396 0.6027 0.3971 0.724 0.513 0.5961 0.4757 0.3068 0.3498 0.4624 0.4282 0.3997 0.2352 0.2013 0.4682 0.2731 0.3533 0.3151 0.3472 0.4106 0.5927 0.28 0.3629 0.2222 0.116 0.3653 0.2539 0.2834 0.2372 0.3046 0.5012 0.6153 0.4943 0.5092 0.3234 0.1825 0.2074 0.6902 0.3974 0.8481 0.3965 0.5013 0.2754 0.0803 0.3395 0.299 0.1674 0.4231 0.6953 0.3064 0.3508 0.0873 0.3127 0.4663 0.5356 0.352 0.2142 0.4366 0.3176 0.4723 0.2752 0.3478 0.0686 0.6848 0.3747 0.3653 0.0196 0.288 0.525 0.1261 0.0786 0.2198 0.4765 0.3744 0.0935 0.2037 0.0817 0.0116 0.0141 0.0233 0.1344 0.2345 0.4343 0.0573 0.0101 0.1387 0.1925 0.0251 0.4054 0.0673 0.4856 0.0877 0.1867 0.0763 0.5018 0.1401 0.5871 0.8402 0.0157 0.3574 0.5809 0.1422 0.1503 0.4454 0.0179 0.2276 0.1864 0.3399 0.2489 0.4174 0.3894 0.0018 0.3102 0.5922 0.0431 0.3796 0.2113 0.1516 0.1318 0.133 0.0189 0.2119 0.3675 0.466 0.0137 0.3592 0.0212 0.2475 0. 
0.2904 0.0259 0.1239 0.1734] 2022-08-23 01:56:07 [INFO] [EVAL] Class Precision: [0.7744 0.862 0.963 0.844 0.7717 0.8739 0.8843 0.8286 0.6395 0.7113 0.707 0.7001 0.7538 0.4857 0.5594 0.6058 0.6936 0.6858 0.7639 0.6042 0.7937 0.6871 0.7154 0.5777 0.4833 0.446 0.6675 0.6593 0.6537 0.3069 0.4313 0.6557 0.4447 0.4663 0.4579 0.4481 0.6157 0.7371 0.5262 0.6345 0.3886 0.2716 0.6424 0.5258 0.4646 0.4414 0.4875 0.7344 0.722 0.549 0.81 0.409 0.3796 0.5429 0.8017 0.629 0.9003 0.6626 0.7124 0.4547 0.1314 0.5034 0.3511 0.6996 0.5177 0.8132 0.4241 0.6365 0.1957 0.5338 0.6789 0.7039 0.6399 0.2862 0.7265 0.4489 0.5533 0.5687 0.4118 0.1746 0.8453 0.6316 0.7485 0.0395 0.3903 0.6328 0.4534 0.3436 0.3672 0.7134 0.5226 0.185 0.4194 0.3413 0.0281 0.0364 0.565 0.2732 0.3606 0.6443 0.2113 0.0144 0.6376 0.7209 0.483 0.5137 0.1568 0.8016 0.2146 0.5068 0.2465 0.6935 0.423 0.6798 0.8515 0.1691 0.629 0.6674 0.1969 0.5158 0.5218 0.1408 0.517 0.4857 0.725 0.5596 0.8252 0.4889 0.0294 0.5255 0.6637 0.2371 0.6041 0.5786 0.5283 0.2309 0.3029 0.0557 0.4494 0.6101 0.5625 0.0186 0.6593 0.2323 0.4807 0. 0.8896 0.3377 0.4986 0.687 ] 2022-08-23 01:56:07 [INFO] [EVAL] Class Recall: [0.8395 0.8751 0.962 0.8392 0.8664 0.8379 0.8597 0.9119 0.7247 0.7664 0.6155 0.709 0.8692 0.4485 0.3199 0.5793 0.7086 0.5505 0.7406 0.5367 0.8918 0.6693 0.7815 0.7292 0.4566 0.6185 0.6009 0.5498 0.5071 0.5016 0.274 0.6209 0.4145 0.5931 0.5026 0.6067 0.5522 0.7516 0.3744 0.4589 0.3415 0.1684 0.4585 0.3292 0.4209 0.3389 0.4482 0.6121 0.8064 0.8323 0.5783 0.607 0.26 0.2512 0.8322 0.5191 0.9361 0.4968 0.6285 0.4111 0.1713 0.5105 0.6684 0.1804 0.6983 0.8274 0.5248 0.4387 0.1361 0.4301 0.5983 0.6914 0.4389 0.46 0.5225 0.5206 0.7634 0.3478 0.6912 0.1016 0.7829 0.4796 0.4165 0.0373 0.5236 0.7549 0.1488 0.0925 0.354 0.5893 0.569 0.1589 0.2837 0.0969 0.0193 0.0225 0.0238 0.2092 0.4015 0.5714 0.0729 0.0323 0.1505 0.2081 0.0258 0.6578 0.1056 0.552 0.1291 0.2281 0.0996 0.6448 0.1731 0.8115 0.9844 0.017 0.453 0.8175 0.3385 0.175 0.7527 0.0201 0.289 0.2323 0.3902 0.3095 0.4579 0.6569 0.0019 0.431 0.8461 0.0501 0.5052 0.2497 0.1753 0.235 0.1916 0.0278 0.2861 0.4803 0.7308 0.0495 0.4411 0.0228 0.3379 0. 0.3013 0.0273 0.1415 0.1882] 2022-08-23 01:56:07 [INFO] [EVAL] The model with the best validation mIoU (0.3416) was saved at iter 68000. 
2022-08-23 01:56:19 [INFO] [TRAIN] epoch: 57, iter: 71050/160000, loss: 0.6128, lr: 0.000673, batch_cost: 0.2306, reader_cost: 0.00457, ips: 34.6933 samples/sec | ETA 05:41:51 2022-08-23 01:56:32 [INFO] [TRAIN] epoch: 57, iter: 71100/160000, loss: 0.6172, lr: 0.000673, batch_cost: 0.2569, reader_cost: 0.00151, ips: 31.1447 samples/sec | ETA 06:20:35 2022-08-23 01:56:43 [INFO] [TRAIN] epoch: 57, iter: 71150/160000, loss: 0.6176, lr: 0.000673, batch_cost: 0.2316, reader_cost: 0.00727, ips: 34.5445 samples/sec | ETA 05:42:56 2022-08-23 01:56:56 [INFO] [TRAIN] epoch: 57, iter: 71200/160000, loss: 0.6457, lr: 0.000672, batch_cost: 0.2540, reader_cost: 0.01610, ips: 31.4935 samples/sec | ETA 06:15:57 2022-08-23 01:57:09 [INFO] [TRAIN] epoch: 57, iter: 71250/160000, loss: 0.6097, lr: 0.000672, batch_cost: 0.2565, reader_cost: 0.00569, ips: 31.1921 samples/sec | ETA 06:19:22 2022-08-23 01:57:22 [INFO] [TRAIN] epoch: 57, iter: 71300/160000, loss: 0.6408, lr: 0.000672, batch_cost: 0.2709, reader_cost: 0.00062, ips: 29.5316 samples/sec | ETA 06:40:28 2022-08-23 01:57:34 [INFO] [TRAIN] epoch: 57, iter: 71350/160000, loss: 0.5927, lr: 0.000671, batch_cost: 0.2379, reader_cost: 0.00131, ips: 33.6270 samples/sec | ETA 05:51:30 2022-08-23 01:57:48 [INFO] [TRAIN] epoch: 57, iter: 71400/160000, loss: 0.6274, lr: 0.000671, batch_cost: 0.2731, reader_cost: 0.00041, ips: 29.2961 samples/sec | ETA 06:43:14 2022-08-23 01:58:00 [INFO] [TRAIN] epoch: 57, iter: 71450/160000, loss: 0.5804, lr: 0.000670, batch_cost: 0.2509, reader_cost: 0.00146, ips: 31.8827 samples/sec | ETA 06:10:18 2022-08-23 01:58:13 [INFO] [TRAIN] epoch: 57, iter: 71500/160000, loss: 0.6562, lr: 0.000670, batch_cost: 0.2568, reader_cost: 0.00912, ips: 31.1566 samples/sec | ETA 06:18:43 2022-08-23 01:58:28 [INFO] [TRAIN] epoch: 57, iter: 71550/160000, loss: 0.6238, lr: 0.000670, batch_cost: 0.2911, reader_cost: 0.00055, ips: 27.4854 samples/sec | ETA 07:09:04 2022-08-23 01:58:40 [INFO] [TRAIN] epoch: 57, iter: 71600/160000, loss: 0.5937, lr: 0.000669, batch_cost: 0.2399, reader_cost: 0.00034, ips: 33.3424 samples/sec | ETA 05:53:30 2022-08-23 01:58:55 [INFO] [TRAIN] epoch: 57, iter: 71650/160000, loss: 0.5926, lr: 0.000669, batch_cost: 0.2972, reader_cost: 0.00073, ips: 26.9144 samples/sec | ETA 07:17:41 2022-08-23 01:59:07 [INFO] [TRAIN] epoch: 57, iter: 71700/160000, loss: 0.6991, lr: 0.000669, batch_cost: 0.2453, reader_cost: 0.00592, ips: 32.6133 samples/sec | ETA 06:00:59 2022-08-23 01:59:21 [INFO] [TRAIN] epoch: 57, iter: 71750/160000, loss: 0.6106, lr: 0.000668, batch_cost: 0.2777, reader_cost: 0.00085, ips: 28.8091 samples/sec | ETA 06:48:26 2022-08-23 01:59:33 [INFO] [TRAIN] epoch: 57, iter: 71800/160000, loss: 0.6200, lr: 0.000668, batch_cost: 0.2414, reader_cost: 0.00115, ips: 33.1343 samples/sec | ETA 05:54:55 2022-08-23 01:59:45 [INFO] [TRAIN] epoch: 57, iter: 71850/160000, loss: 0.6144, lr: 0.000667, batch_cost: 0.2336, reader_cost: 0.01022, ips: 34.2482 samples/sec | ETA 05:43:10 2022-08-23 01:59:55 [INFO] [TRAIN] epoch: 57, iter: 71900/160000, loss: 0.6322, lr: 0.000667, batch_cost: 0.2124, reader_cost: 0.00063, ips: 37.6566 samples/sec | ETA 05:11:56 2022-08-23 02:00:06 [INFO] [TRAIN] epoch: 57, iter: 71950/160000, loss: 0.6418, lr: 0.000667, batch_cost: 0.2215, reader_cost: 0.00075, ips: 36.1107 samples/sec | ETA 05:25:06 2022-08-23 02:00:19 [INFO] [TRAIN] epoch: 58, iter: 72000/160000, loss: 0.6231, lr: 0.000666, batch_cost: 0.2538, reader_cost: 0.04252, ips: 31.5160 samples/sec | ETA 06:12:17 2022-08-23 02:00:19 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 224s - batch_cost: 0.2240 - reader cost: 9.2139e-04 2022-08-23 02:04:03 [INFO] [EVAL] #Images: 2000 mIoU: 0.3357 Acc: 0.7592 Kappa: 0.7409 Dice: 0.4647 2022-08-23 02:04:03 [INFO] [EVAL] Class IoU: [0.6828 0.7661 0.9296 0.7206 0.6752 0.7427 0.7655 0.7773 0.5264 0.5813 0.4895 0.5576 0.6983 0.3194 0.3091 0.4322 0.5273 0.4205 0.6105 0.4115 0.7428 0.4408 0.6242 0.4833 0.333 0.3676 0.4286 0.4142 0.3697 0.26 0.2051 0.4807 0.2613 0.3604 0.3195 0.3802 0.4124 0.5354 0.2689 0.3724 0.2082 0.062 0.3592 0.2417 0.3047 0.1876 0.2589 0.4879 0.491 0.4885 0.5285 0.3534 0.1831 0.1785 0.6982 0.4 0.8571 0.3974 0.2922 0.2269 0.1076 0.3694 0.3449 0.1668 0.4368 0.6613 0.2741 0.404 0.098 0.3349 0.4537 0.5081 0.3552 0.2042 0.4386 0.3111 0.4404 0.2896 0.2523 0.113 0.6839 0.3889 0.3521 0.0188 0.1986 0.5385 0.1009 0.0755 0.2664 0.4976 0.4212 0.0279 0.2782 0.0464 0.0337 0.0138 0.0331 0.135 0.2919 0.4118 0.0126 0.0197 0.2986 0.0655 0.0144 0.5731 0.0611 0.5224 0.1216 0.1245 0.1108 0.4276 0.1396 0.5714 0.7594 0.011 0.2785 0.6269 0.098 0.152 0.4838 0.0227 0.2145 0.1066 0.3817 0.2795 0.4707 0.3803 0.2326 0.2848 0.5645 0.0207 0.3492 0.2187 0.1176 0.1149 0.1232 0.039 0.163 0.3559 0.4636 0.0173 0.3626 0.0031 0.3095 0. 0.3549 0.0262 0.1468 0.1765] 2022-08-23 02:04:03 [INFO] [EVAL] Class Precision: [0.7872 0.8377 0.9695 0.8109 0.7687 0.8845 0.8763 0.8476 0.6662 0.7923 0.7151 0.7006 0.7873 0.4727 0.4663 0.5851 0.7079 0.7038 0.7387 0.5483 0.8267 0.6176 0.7792 0.6257 0.4679 0.4592 0.5285 0.697 0.6516 0.3538 0.4317 0.6442 0.4387 0.4861 0.5036 0.554 0.6105 0.7086 0.4364 0.6564 0.4945 0.245 0.6089 0.5701 0.4464 0.4079 0.5301 0.6409 0.7722 0.5574 0.7044 0.5169 0.3173 0.5007 0.7213 0.6564 0.9241 0.639 0.6257 0.3993 0.1682 0.5458 0.4947 0.7025 0.5869 0.7973 0.4044 0.5776 0.1941 0.5191 0.6326 0.639 0.623 0.2864 0.6947 0.5637 0.531 0.614 0.7355 0.3606 0.807 0.6387 0.7607 0.0462 0.4882 0.6787 0.4988 0.3678 0.5893 0.6922 0.5571 0.0385 0.4828 0.3684 0.0711 0.0441 0.3601 0.2248 0.5011 0.6337 0.2814 0.0459 0.5452 0.5852 0.7298 0.6368 0.1064 0.7593 0.2545 0.3405 0.2629 0.6116 0.5313 0.7161 0.8281 0.1025 0.6402 0.6964 0.1377 0.4224 0.6105 0.1825 0.7203 0.5804 0.6391 0.5507 0.8455 0.4738 0.8752 0.4597 0.6359 0.2133 0.4643 0.5871 0.5445 0.2975 0.3329 0.0744 0.4548 0.622 0.6262 0.0254 0.7024 0.2917 0.7012 0. 0.8649 0.3224 0.6371 0.6043] 2022-08-23 02:04:03 [INFO] [EVAL] Class Recall: [0.8374 0.8997 0.9576 0.8662 0.8474 0.8225 0.8583 0.9035 0.7149 0.6858 0.6081 0.7321 0.8606 0.4962 0.4784 0.6232 0.6739 0.511 0.7787 0.6226 0.8797 0.6063 0.7584 0.6798 0.5361 0.6485 0.694 0.5052 0.4608 0.4953 0.2809 0.6545 0.3925 0.5822 0.4663 0.5478 0.5597 0.6864 0.412 0.4626 0.2645 0.0767 0.4668 0.2955 0.4897 0.2578 0.3359 0.6714 0.5742 0.7979 0.6791 0.5277 0.3022 0.2171 0.9562 0.5059 0.922 0.5124 0.354 0.3445 0.2298 0.5335 0.5326 0.1794 0.6308 0.795 0.4595 0.5735 0.1652 0.4855 0.616 0.7126 0.4524 0.4155 0.5434 0.4098 0.7208 0.3541 0.2774 0.1413 0.8177 0.4985 0.396 0.0308 0.2508 0.7227 0.1123 0.0867 0.3271 0.6389 0.6333 0.092 0.3963 0.0504 0.0602 0.0198 0.0352 0.2525 0.4115 0.5404 0.0131 0.0335 0.3976 0.0687 0.0144 0.8515 0.1256 0.6261 0.189 0.164 0.1607 0.5871 0.1592 0.7388 0.9015 0.0122 0.3302 0.8626 0.2537 0.1919 0.6998 0.0252 0.234 0.1155 0.4866 0.3621 0.515 0.6585 0.2405 0.428 0.8341 0.0224 0.5847 0.2585 0.1305 0.1578 0.1635 0.0757 0.2026 0.4541 0.641 0.0519 0.4284 0.0032 0.3566 0. 
0.3758 0.0277 0.1602 0.1995] 2022-08-23 02:04:03 [INFO] [EVAL] The model with the best validation mIoU (0.3416) was saved at iter 68000. 2022-08-23 02:04:17 [INFO] [TRAIN] epoch: 58, iter: 72050/160000, loss: 0.5770, lr: 0.000666, batch_cost: 0.2760, reader_cost: 0.00476, ips: 28.9858 samples/sec | ETA 06:44:33 2022-08-23 02:04:31 [INFO] [TRAIN] epoch: 58, iter: 72100/160000, loss: 0.5723, lr: 0.000665, batch_cost: 0.2737, reader_cost: 0.01789, ips: 29.2246 samples/sec | ETA 06:41:01 2022-08-23 02:04:45 [INFO] [TRAIN] epoch: 58, iter: 72150/160000, loss: 0.6037, lr: 0.000665, batch_cost: 0.2755, reader_cost: 0.00279, ips: 29.0416 samples/sec | ETA 06:43:19 2022-08-23 02:05:00 [INFO] [TRAIN] epoch: 58, iter: 72200/160000, loss: 0.5698, lr: 0.000665, batch_cost: 0.3032, reader_cost: 0.00762, ips: 26.3893 samples/sec | ETA 07:23:36 2022-08-23 02:05:13 [INFO] [TRAIN] epoch: 58, iter: 72250/160000, loss: 0.6169, lr: 0.000664, batch_cost: 0.2690, reader_cost: 0.00123, ips: 29.7380 samples/sec | ETA 06:33:26 2022-08-23 02:05:26 [INFO] [TRAIN] epoch: 58, iter: 72300/160000, loss: 0.6036, lr: 0.000664, batch_cost: 0.2573, reader_cost: 0.00059, ips: 31.0882 samples/sec | ETA 06:16:08 2022-08-23 02:05:40 [INFO] [TRAIN] epoch: 58, iter: 72350/160000, loss: 0.6242, lr: 0.000664, batch_cost: 0.2811, reader_cost: 0.00085, ips: 28.4646 samples/sec | ETA 06:50:34 2022-08-23 02:05:55 [INFO] [TRAIN] epoch: 58, iter: 72400/160000, loss: 0.6515, lr: 0.000663, batch_cost: 0.2972, reader_cost: 0.00094, ips: 26.9135 samples/sec | ETA 07:13:58 2022-08-23 02:06:09 [INFO] [TRAIN] epoch: 58, iter: 72450/160000, loss: 0.5568, lr: 0.000663, batch_cost: 0.2745, reader_cost: 0.00201, ips: 29.1487 samples/sec | ETA 06:40:28 2022-08-23 02:06:22 [INFO] [TRAIN] epoch: 58, iter: 72500/160000, loss: 0.6362, lr: 0.000662, batch_cost: 0.2579, reader_cost: 0.00193, ips: 31.0140 samples/sec | ETA 06:16:10 2022-08-23 02:06:35 [INFO] [TRAIN] epoch: 58, iter: 72550/160000, loss: 0.6613, lr: 0.000662, batch_cost: 0.2621, reader_cost: 0.01086, ips: 30.5275 samples/sec | ETA 06:21:57 2022-08-23 02:06:45 [INFO] [TRAIN] epoch: 58, iter: 72600/160000, loss: 0.6410, lr: 0.000662, batch_cost: 0.2071, reader_cost: 0.00061, ips: 38.6239 samples/sec | ETA 05:01:42 2022-08-23 02:06:58 [INFO] [TRAIN] epoch: 58, iter: 72650/160000, loss: 0.5782, lr: 0.000661, batch_cost: 0.2560, reader_cost: 0.00063, ips: 31.2504 samples/sec | ETA 06:12:41 2022-08-23 02:07:10 [INFO] [TRAIN] epoch: 58, iter: 72700/160000, loss: 0.5873, lr: 0.000661, batch_cost: 0.2312, reader_cost: 0.00359, ips: 34.6076 samples/sec | ETA 05:36:20 2022-08-23 02:07:21 [INFO] [TRAIN] epoch: 58, iter: 72750/160000, loss: 0.5999, lr: 0.000661, batch_cost: 0.2293, reader_cost: 0.00059, ips: 34.8852 samples/sec | ETA 05:33:28 2022-08-23 02:07:32 [INFO] [TRAIN] epoch: 58, iter: 72800/160000, loss: 0.6108, lr: 0.000660, batch_cost: 0.2106, reader_cost: 0.01000, ips: 37.9791 samples/sec | ETA 05:06:07 2022-08-23 02:07:43 [INFO] [TRAIN] epoch: 58, iter: 72850/160000, loss: 0.6429, lr: 0.000660, batch_cost: 0.2347, reader_cost: 0.00073, ips: 34.0903 samples/sec | ETA 05:40:51 2022-08-23 02:07:53 [INFO] [TRAIN] epoch: 58, iter: 72900/160000, loss: 0.6481, lr: 0.000659, batch_cost: 0.1961, reader_cost: 0.00075, ips: 40.7936 samples/sec | ETA 04:44:41 2022-08-23 02:08:03 [INFO] [TRAIN] epoch: 58, iter: 72950/160000, loss: 0.6017, lr: 0.000659, batch_cost: 0.1972, reader_cost: 0.00052, ips: 40.5594 samples/sec | ETA 04:46:09 2022-08-23 02:08:12 [INFO] [TRAIN] epoch: 58, iter: 73000/160000, loss: 
0.6402, lr: 0.000659, batch_cost: 0.1861, reader_cost: 0.00072, ips: 42.9874 samples/sec | ETA 04:29:50 2022-08-23 02:08:12 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 183s - batch_cost: 0.1834 - reader cost: 6.2940e-04 2022-08-23 02:11:16 [INFO] [EVAL] #Images: 2000 mIoU: 0.3409 Acc: 0.7631 Kappa: 0.7449 Dice: 0.4716 2022-08-23 02:11:16 [INFO] [EVAL] Class IoU: [0.6834 0.783 0.929 0.722 0.68 0.7604 0.7701 0.7669 0.5155 0.6107 0.4714 0.53 0.6949 0.3254 0.2498 0.424 0.514 0.4329 0.6084 0.3984 0.7332 0.4758 0.6352 0.4703 0.3309 0.2789 0.47 0.4311 0.3431 0.2725 0.2633 0.4814 0.2903 0.3462 0.2942 0.387 0.4081 0.5262 0.2758 0.3917 0.2335 0.1233 0.3556 0.2524 0.3132 0.2714 0.2487 0.4915 0.6335 0.5262 0.5495 0.3382 0.1525 0.2052 0.746 0.4763 0.8572 0.3589 0.4534 0.3506 0.0845 0.4076 0.3603 0.1868 0.4237 0.6855 0.2751 0.4058 0.0916 0.3322 0.417 0.5135 0.3882 0.2279 0.4306 0.3141 0.3601 0.2789 0.2198 0.1437 0.6265 0.3767 0.3341 0.0374 0.1424 0.5459 0.1109 0.0723 0.2174 0.4971 0.3925 0.0896 0.2354 0.0983 0.0096 0.0132 0.0259 0.1381 0.257 0.3831 0.0721 0.0364 0.2102 0.58 0.0456 0.3999 0.0843 0.4933 0.0874 0.1054 0.1192 0.4686 0.1253 0.3901 0.8201 0.0191 0.3399 0.589 0.1127 0.0451 0.4456 0.0093 0.2107 0.179 0.2549 0.2093 0.4436 0.4369 0.4192 0.297 0.5634 0.0177 0.3199 0.2138 0.1383 0.1224 0.1255 0.0343 0.1836 0.3635 0.4442 0.0027 0.4156 0.0206 0.2783 0. 0.3978 0.0376 0.1518 0.1976] 2022-08-23 02:11:16 [INFO] [EVAL] Class Precision: [0.7876 0.8444 0.967 0.8108 0.7606 0.8547 0.8644 0.8249 0.6939 0.784 0.7107 0.6599 0.7664 0.4885 0.5386 0.5995 0.6868 0.718 0.7491 0.6463 0.8102 0.6245 0.793 0.6702 0.498 0.5838 0.5581 0.7299 0.6795 0.391 0.4094 0.6335 0.4691 0.4672 0.5224 0.4931 0.6414 0.7292 0.448 0.5608 0.3736 0.2885 0.6104 0.4879 0.41 0.4197 0.4743 0.6734 0.7479 0.6465 0.7983 0.4634 0.3052 0.5584 0.8524 0.6873 0.9024 0.6836 0.5968 0.4997 0.2558 0.5041 0.5174 0.6937 0.5212 0.8346 0.3694 0.5538 0.2871 0.5223 0.687 0.6507 0.6503 0.2863 0.6734 0.5351 0.4091 0.5738 0.7096 0.3377 0.7317 0.6551 0.7819 0.0761 0.3443 0.7102 0.5953 0.3339 0.3564 0.6897 0.5433 0.2068 0.4399 0.3206 0.0491 0.054 0.2789 0.298 0.3557 0.6147 0.2255 0.0521 0.7519 0.7013 0.3951 0.4523 0.1771 0.7797 0.2172 0.2837 0.3159 0.8474 0.5411 0.7267 0.826 0.114 0.7464 0.6379 0.1389 0.5161 0.7666 0.3739 0.6034 0.5727 0.7067 0.6237 0.8784 0.6126 0.7388 0.5355 0.6612 0.1194 0.6006 0.5708 0.5358 0.3283 0.3216 0.1232 0.35 0.6807 0.5322 0.0041 0.6431 0.3522 0.4772 0. 
0.792 0.3644 0.5819 0.4902] 2022-08-23 02:11:16 [INFO] [EVAL] Class Recall: [0.8378 0.9151 0.9594 0.8683 0.8651 0.8733 0.8759 0.916 0.6673 0.7343 0.5833 0.7292 0.8817 0.4935 0.3178 0.5915 0.6714 0.5217 0.7641 0.5096 0.8852 0.6665 0.7615 0.6118 0.4966 0.3481 0.7486 0.513 0.4094 0.4734 0.4247 0.6672 0.4324 0.5721 0.4025 0.6428 0.5287 0.654 0.4177 0.565 0.3837 0.1772 0.46 0.3433 0.5702 0.4345 0.3434 0.6453 0.8054 0.7387 0.638 0.5558 0.2336 0.245 0.8566 0.6081 0.9448 0.4304 0.6536 0.5401 0.112 0.6804 0.5426 0.2036 0.6938 0.7932 0.5187 0.603 0.1185 0.4771 0.5148 0.7089 0.4906 0.5275 0.5443 0.432 0.7506 0.3517 0.2415 0.2002 0.8133 0.4698 0.3685 0.0684 0.1953 0.7023 0.1199 0.0845 0.358 0.6404 0.5858 0.1365 0.3361 0.1241 0.0118 0.0172 0.0278 0.2047 0.481 0.5042 0.0958 0.1079 0.2259 0.7703 0.0491 0.7755 0.1385 0.5732 0.1275 0.1437 0.1607 0.5118 0.1402 0.4571 0.9914 0.0224 0.3843 0.885 0.3744 0.0471 0.5155 0.0094 0.2446 0.2065 0.2851 0.2395 0.4727 0.6037 0.4922 0.4001 0.7921 0.0203 0.4063 0.2548 0.1571 0.1633 0.1706 0.0454 0.2787 0.4383 0.7288 0.008 0.5401 0.0215 0.4003 0. 0.4442 0.0402 0.1704 0.2487] 2022-08-23 02:11:16 [INFO] [EVAL] The model with the best validation mIoU (0.3416) was saved at iter 68000. 2022-08-23 02:11:29 [INFO] [TRAIN] epoch: 58, iter: 73050/160000, loss: 0.6595, lr: 0.000658, batch_cost: 0.2508, reader_cost: 0.02547, ips: 31.9031 samples/sec | ETA 06:03:23 2022-08-23 02:11:42 [INFO] [TRAIN] epoch: 58, iter: 73100/160000, loss: 0.6054, lr: 0.000658, batch_cost: 0.2574, reader_cost: 0.00781, ips: 31.0757 samples/sec | ETA 06:12:51 2022-08-23 02:11:54 [INFO] [TRAIN] epoch: 58, iter: 73150/160000, loss: 0.6246, lr: 0.000658, batch_cost: 0.2428, reader_cost: 0.00040, ips: 32.9550 samples/sec | ETA 05:51:23 2022-08-23 02:12:06 [INFO] [TRAIN] epoch: 58, iter: 73200/160000, loss: 0.6133, lr: 0.000657, batch_cost: 0.2399, reader_cost: 0.00048, ips: 33.3443 samples/sec | ETA 05:47:05 2022-08-23 02:12:19 [INFO] [TRAIN] epoch: 58, iter: 73250/160000, loss: 0.6198, lr: 0.000657, batch_cost: 0.2597, reader_cost: 0.00104, ips: 30.7989 samples/sec | ETA 06:15:33 2022-08-23 02:12:36 [INFO] [TRAIN] epoch: 59, iter: 73300/160000, loss: 0.5867, lr: 0.000656, batch_cost: 0.3379, reader_cost: 0.06360, ips: 23.6738 samples/sec | ETA 08:08:18 2022-08-23 02:12:49 [INFO] [TRAIN] epoch: 59, iter: 73350/160000, loss: 0.6283, lr: 0.000656, batch_cost: 0.2658, reader_cost: 0.00742, ips: 30.0924 samples/sec | ETA 06:23:55 2022-08-23 02:13:02 [INFO] [TRAIN] epoch: 59, iter: 73400/160000, loss: 0.5469, lr: 0.000656, batch_cost: 0.2526, reader_cost: 0.00065, ips: 31.6765 samples/sec | ETA 06:04:31 2022-08-23 02:13:16 [INFO] [TRAIN] epoch: 59, iter: 73450/160000, loss: 0.5536, lr: 0.000655, batch_cost: 0.2831, reader_cost: 0.00048, ips: 28.2562 samples/sec | ETA 06:48:24 2022-08-23 02:13:26 [INFO] [TRAIN] epoch: 59, iter: 73500/160000, loss: 0.6433, lr: 0.000655, batch_cost: 0.2120, reader_cost: 0.00059, ips: 37.7329 samples/sec | ETA 05:05:39 2022-08-23 02:13:37 [INFO] [TRAIN] epoch: 59, iter: 73550/160000, loss: 0.5653, lr: 0.000655, batch_cost: 0.2193, reader_cost: 0.00100, ips: 36.4796 samples/sec | ETA 05:15:58 2022-08-23 02:13:49 [INFO] [TRAIN] epoch: 59, iter: 73600/160000, loss: 0.5935, lr: 0.000654, batch_cost: 0.2273, reader_cost: 0.00048, ips: 35.1904 samples/sec | ETA 05:27:21 2022-08-23 02:13:58 [INFO] [TRAIN] epoch: 59, iter: 73650/160000, loss: 0.6279, lr: 0.000654, batch_cost: 0.1935, reader_cost: 0.00096, ips: 41.3450 samples/sec | ETA 04:38:28 2022-08-23 02:14:09 [INFO] [TRAIN] epoch: 59, 
iter: 73700/160000, loss: 0.5952, lr: 0.000653, batch_cost: 0.2016, reader_cost: 0.00062, ips: 39.6869 samples/sec | ETA 04:49:56 2022-08-23 02:14:20 [INFO] [TRAIN] epoch: 59, iter: 73750/160000, loss: 0.5789, lr: 0.000653, batch_cost: 0.2274, reader_cost: 0.00081, ips: 35.1772 samples/sec | ETA 05:26:54 2022-08-23 02:14:30 [INFO] [TRAIN] epoch: 59, iter: 73800/160000, loss: 0.6659, lr: 0.000653, batch_cost: 0.2095, reader_cost: 0.00077, ips: 38.1878 samples/sec | ETA 05:00:58 2022-08-23 02:14:41 [INFO] [TRAIN] epoch: 59, iter: 73850/160000, loss: 0.6034, lr: 0.000652, batch_cost: 0.2164, reader_cost: 0.00079, ips: 36.9727 samples/sec | ETA 05:10:40 2022-08-23 02:14:53 [INFO] [TRAIN] epoch: 59, iter: 73900/160000, loss: 0.6259, lr: 0.000652, batch_cost: 0.2324, reader_cost: 0.00054, ips: 34.4190 samples/sec | ETA 05:33:32 2022-08-23 02:15:04 [INFO] [TRAIN] epoch: 59, iter: 73950/160000, loss: 0.6129, lr: 0.000651, batch_cost: 0.2228, reader_cost: 0.00055, ips: 35.8994 samples/sec | ETA 05:19:35 2022-08-23 02:15:13 [INFO] [TRAIN] epoch: 59, iter: 74000/160000, loss: 0.6545, lr: 0.000651, batch_cost: 0.1874, reader_cost: 0.00076, ips: 42.6948 samples/sec | ETA 04:28:34 2022-08-23 02:15:13 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 180s - batch_cost: 0.1794 - reader cost: 7.4221e-04 2022-08-23 02:18:13 [INFO] [EVAL] #Images: 2000 mIoU: 0.3387 Acc: 0.7572 Kappa: 0.7388 Dice: 0.4705 2022-08-23 02:18:13 [INFO] [EVAL] Class IoU: [0.6793 0.7651 0.9308 0.7153 0.6709 0.7489 0.7721 0.7767 0.5267 0.5962 0.473 0.5521 0.703 0.3063 0.2785 0.4237 0.5084 0.4026 0.607 0.4061 0.7406 0.4626 0.6235 0.4584 0.2919 0.3223 0.4075 0.4279 0.378 0.2168 0.2384 0.4945 0.2926 0.3294 0.3492 0.3864 0.4135 0.5191 0.2859 0.3702 0.2188 0.1248 0.3668 0.2514 0.2892 0.2902 0.3223 0.4884 0.5812 0.5313 0.5263 0.3753 0.1896 0.1526 0.7065 0.4689 0.7626 0.4025 0.3648 0.218 0.1375 0.193 0.3274 0.2122 0.455 0.6394 0.2332 0.3849 0.0882 0.3258 0.4348 0.5441 0.3645 0.2314 0.4209 0.3345 0.5613 0.271 0.2209 0.1468 0.5677 0.3595 0.2928 0.0643 0.2375 0.5416 0.1106 0.0655 0.2171 0.4652 0.4118 0.0663 0.1933 0.1099 0.004 0.0108 0.0215 0.1388 0.2858 0.4249 0.0285 0.0265 0.2617 0.2292 0.0049 0.5112 0.0988 0.4912 0.1137 0.2024 0.1276 0.4209 0.1536 0.4996 0.8809 0.0221 0.2981 0.6195 0.0899 0.0882 0.431 0.0145 0.2027 0.1721 0.4364 0.2495 0.423 0.3407 0.4217 0.3101 0.5479 0.0421 0.3666 0.228 0.1541 0.1143 0.1318 0.0349 0.1703 0.3524 0.4192 0.0112 0.3795 0.1301 0.2766 0. 
0.3547 0.0198 0.1485 0.1516] 2022-08-23 02:18:13 [INFO] [EVAL] Class Precision: [0.7792 0.8605 0.9585 0.8043 0.7461 0.8838 0.8894 0.8542 0.6607 0.8027 0.6694 0.7026 0.8008 0.4377 0.5169 0.5917 0.7742 0.7078 0.7624 0.5994 0.8333 0.7122 0.7554 0.5925 0.461 0.4143 0.5239 0.6919 0.6192 0.3396 0.4127 0.6263 0.45 0.3929 0.5292 0.512 0.6018 0.7521 0.5156 0.5986 0.4064 0.2642 0.67 0.5905 0.3958 0.5218 0.5084 0.6937 0.6771 0.6489 0.692 0.552 0.3028 0.5333 0.7739 0.6377 0.7821 0.6458 0.6293 0.3552 0.1935 0.4377 0.4686 0.6299 0.5731 0.7221 0.2843 0.5147 0.1773 0.5023 0.5699 0.6558 0.6678 0.3054 0.6766 0.5437 0.7383 0.6172 0.7459 0.371 0.6622 0.6649 0.7876 0.122 0.5113 0.6544 0.6185 0.3851 0.3408 0.7524 0.5741 0.0944 0.377 0.3122 0.0132 0.0435 0.6785 0.3112 0.4595 0.6822 0.2704 0.0414 0.5837 0.8443 0.4232 0.5789 0.1991 0.7992 0.2735 0.4278 0.3452 0.6271 0.4797 0.7604 0.891 0.166 0.6856 0.7572 0.1502 0.4368 0.5811 0.3818 0.509 0.5959 0.7492 0.6633 0.7481 0.4302 0.558 0.5044 0.6315 0.281 0.5512 0.5536 0.6735 0.3446 0.2625 0.1107 0.4019 0.6225 0.5309 0.0161 0.6449 0.4216 0.5243 0. 0.8641 0.3597 0.6302 0.7221] 2022-08-23 02:18:13 [INFO] [EVAL] Class Recall: [0.8413 0.8735 0.9699 0.866 0.8695 0.8308 0.8541 0.8955 0.7219 0.6986 0.6172 0.7205 0.852 0.5052 0.3765 0.5987 0.5969 0.4828 0.7486 0.5573 0.8695 0.5689 0.7813 0.6695 0.4431 0.592 0.6471 0.5286 0.4926 0.3748 0.3609 0.7015 0.4554 0.6709 0.5066 0.6115 0.5693 0.6262 0.3908 0.4924 0.3216 0.1912 0.4476 0.3045 0.5177 0.3953 0.4682 0.6227 0.8041 0.7456 0.6873 0.5397 0.3366 0.1762 0.8903 0.6391 0.9684 0.5165 0.4646 0.3608 0.322 0.2566 0.5207 0.2424 0.6882 0.8482 0.5645 0.6042 0.1493 0.4811 0.6471 0.7616 0.4452 0.4886 0.5269 0.4651 0.7008 0.3257 0.2389 0.1955 0.7991 0.439 0.3179 0.1196 0.3073 0.7586 0.1187 0.0731 0.3743 0.5493 0.5929 0.1819 0.284 0.1451 0.0058 0.0142 0.0217 0.2005 0.4306 0.5298 0.0309 0.0685 0.3218 0.2394 0.0049 0.8138 0.1641 0.5603 0.1629 0.2775 0.1684 0.5615 0.1842 0.5929 0.9874 0.0249 0.3454 0.773 0.1829 0.0995 0.6254 0.0148 0.252 0.1948 0.5111 0.2857 0.4933 0.621 0.6333 0.446 0.8054 0.0472 0.5226 0.2794 0.1665 0.1461 0.2092 0.0485 0.2282 0.4482 0.666 0.0354 0.4798 0.1584 0.3692 0. 0.3756 0.0206 0.1627 0.161 ] 2022-08-23 02:18:13 [INFO] [EVAL] The model with the best validation mIoU (0.3416) was saved at iter 68000. 
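The summary figures printed at the end of each evaluation pass are aggregates of the per-class arrays: the mIoU is (up to rounding) the plain mean of the Class IoU array, and each class's Dice follows from its IoU through the identity Dice = 2*IoU/(1+IoU). A minimal sketch of that aggregation, assuming the summary Dice is the unweighted class mean; the helper name is illustrative, not the PaddleSeg API.

import numpy as np

def summarize_eval(class_iou):
    """Aggregate a per-class IoU array (150 ADE20K classes) into summary numbers.

    mIoU is taken as the unweighted mean over classes; per class,
    Dice = 2 * IoU / (1 + IoU), so a mean Dice can be recovered from the
    same array when it is averaged the same way.
    """
    class_iou = np.asarray(class_iou, dtype=np.float64)
    miou = float(class_iou.mean())
    mean_dice = float((2.0 * class_iou / (1.0 + class_iou)).mean())
    return miou, mean_dice

# Feeding the 150 Class IoU values from the iter-74000 pass above should give
# back mIoU ~ 0.3387, matching the summary line up to its 4-decimal rounding.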
2022-08-23 02:18:25 [INFO] [TRAIN] epoch: 59, iter: 74050/160000, loss: 0.6031, lr: 0.000651, batch_cost: 0.2275, reader_cost: 0.00706, ips: 35.1655 samples/sec | ETA 05:25:53 2022-08-23 02:18:37 [INFO] [TRAIN] epoch: 59, iter: 74100/160000, loss: 0.6150, lr: 0.000650, batch_cost: 0.2470, reader_cost: 0.00131, ips: 32.3862 samples/sec | ETA 05:53:38 2022-08-23 02:18:48 [INFO] [TRAIN] epoch: 59, iter: 74150/160000, loss: 0.6108, lr: 0.000650, batch_cost: 0.2239, reader_cost: 0.00063, ips: 35.7346 samples/sec | ETA 05:20:19 2022-08-23 02:18:59 [INFO] [TRAIN] epoch: 59, iter: 74200/160000, loss: 0.5963, lr: 0.000650, batch_cost: 0.2230, reader_cost: 0.00061, ips: 35.8679 samples/sec | ETA 05:18:56 2022-08-23 02:19:12 [INFO] [TRAIN] epoch: 59, iter: 74250/160000, loss: 0.5861, lr: 0.000649, batch_cost: 0.2502, reader_cost: 0.00284, ips: 31.9713 samples/sec | ETA 05:57:36 2022-08-23 02:19:26 [INFO] [TRAIN] epoch: 59, iter: 74300/160000, loss: 0.6214, lr: 0.000649, batch_cost: 0.2771, reader_cost: 0.00487, ips: 28.8716 samples/sec | ETA 06:35:46 2022-08-23 02:19:37 [INFO] [TRAIN] epoch: 59, iter: 74350/160000, loss: 0.6164, lr: 0.000648, batch_cost: 0.2343, reader_cost: 0.00080, ips: 34.1466 samples/sec | ETA 05:34:26 2022-08-23 02:19:49 [INFO] [TRAIN] epoch: 59, iter: 74400/160000, loss: 0.6122, lr: 0.000648, batch_cost: 0.2306, reader_cost: 0.00248, ips: 34.6862 samples/sec | ETA 05:29:02 2022-08-23 02:20:01 [INFO] [TRAIN] epoch: 59, iter: 74450/160000, loss: 0.6268, lr: 0.000648, batch_cost: 0.2456, reader_cost: 0.00055, ips: 32.5735 samples/sec | ETA 05:50:10 2022-08-23 02:20:13 [INFO] [TRAIN] epoch: 59, iter: 74500/160000, loss: 0.6058, lr: 0.000647, batch_cost: 0.2366, reader_cost: 0.00037, ips: 33.8127 samples/sec | ETA 05:37:09 2022-08-23 02:20:26 [INFO] [TRAIN] epoch: 60, iter: 74550/160000, loss: 0.6220, lr: 0.000647, batch_cost: 0.2598, reader_cost: 0.05116, ips: 30.7884 samples/sec | ETA 06:10:03 2022-08-23 02:20:37 [INFO] [TRAIN] epoch: 60, iter: 74600/160000, loss: 0.6042, lr: 0.000647, batch_cost: 0.2158, reader_cost: 0.00062, ips: 37.0699 samples/sec | ETA 05:07:10 2022-08-23 02:20:48 [INFO] [TRAIN] epoch: 60, iter: 74650/160000, loss: 0.6097, lr: 0.000646, batch_cost: 0.2153, reader_cost: 0.00059, ips: 37.1585 samples/sec | ETA 05:06:15 2022-08-23 02:20:59 [INFO] [TRAIN] epoch: 60, iter: 74700/160000, loss: 0.6153, lr: 0.000646, batch_cost: 0.2250, reader_cost: 0.00056, ips: 35.5605 samples/sec | ETA 05:19:49 2022-08-23 02:21:09 [INFO] [TRAIN] epoch: 60, iter: 74750/160000, loss: 0.5973, lr: 0.000645, batch_cost: 0.2069, reader_cost: 0.00080, ips: 38.6598 samples/sec | ETA 04:54:01 2022-08-23 02:21:19 [INFO] [TRAIN] epoch: 60, iter: 74800/160000, loss: 0.6151, lr: 0.000645, batch_cost: 0.2041, reader_cost: 0.00109, ips: 39.2023 samples/sec | ETA 04:49:46 2022-08-23 02:21:30 [INFO] [TRAIN] epoch: 60, iter: 74850/160000, loss: 0.5726, lr: 0.000645, batch_cost: 0.2167, reader_cost: 0.00080, ips: 36.9185 samples/sec | ETA 05:07:31 2022-08-23 02:21:40 [INFO] [TRAIN] epoch: 60, iter: 74900/160000, loss: 0.6065, lr: 0.000644, batch_cost: 0.1896, reader_cost: 0.00076, ips: 42.1962 samples/sec | ETA 04:28:54 2022-08-23 02:21:49 [INFO] [TRAIN] epoch: 60, iter: 74950/160000, loss: 0.5937, lr: 0.000644, batch_cost: 0.1921, reader_cost: 0.00088, ips: 41.6524 samples/sec | ETA 04:32:15 2022-08-23 02:22:01 [INFO] [TRAIN] epoch: 60, iter: 75000/160000, loss: 0.6126, lr: 0.000644, batch_cost: 0.2266, reader_cost: 0.00080, ips: 35.3022 samples/sec | ETA 05:21:02 2022-08-23 02:22:01 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 193s - batch_cost: 0.1932 - reader cost: 0.0011 2022-08-23 02:25:14 [INFO] [EVAL] #Images: 2000 mIoU: 0.3412 Acc: 0.7591 Kappa: 0.7405 Dice: 0.4725 2022-08-23 02:25:14 [INFO] [EVAL] Class IoU: [0.6776 0.7708 0.9301 0.7245 0.6877 0.74 0.7662 0.7848 0.5188 0.5968 0.4882 0.5323 0.6983 0.3068 0.2895 0.4237 0.4867 0.4315 0.5975 0.4055 0.7422 0.4429 0.6134 0.4632 0.3012 0.1045 0.4127 0.4335 0.3804 0.2827 0.2234 0.454 0.2739 0.3457 0.329 0.4041 0.4202 0.5797 0.2554 0.3723 0.2653 0.1081 0.3584 0.2508 0.3139 0.2808 0.2574 0.4825 0.6141 0.4601 0.5538 0.3966 0.1575 0.2365 0.728 0.4763 0.823 0.4202 0.3541 0.2574 0.0802 0.2789 0.3205 0.2392 0.4363 0.6557 0.26 0.3997 0.0996 0.3251 0.4686 0.4911 0.4017 0.1979 0.4258 0.3458 0.5827 0.2814 0.2636 0.1251 0.6102 0.3848 0.3797 0.0427 0.2179 0.5525 0.1179 0.0835 0.2143 0.4747 0.4089 0.0499 0.2163 0.0863 0.0078 0.0143 0.022 0.1468 0.2792 0.3691 0.0686 0.015 0.25 0.7178 0.0768 0.4001 0.0851 0.5019 0.0873 0.1743 0.1083 0.4158 0.1303 0.423 0.588 0.0284 0.3691 0.6326 0.0884 0.0842 0.4526 0.0179 0.214 0.1411 0.3621 0.2649 0.4281 0.4374 0.5735 0.3381 0.544 0.0353 0.3481 0.2293 0.1549 0.1217 0.1289 0.0193 0.2116 0.3585 0.3447 0.0285 0.3898 0.006 0.2803 0. 0.3581 0.0198 0.1447 0.1439] 2022-08-23 02:25:14 [INFO] [EVAL] Class Precision: [0.7738 0.8334 0.9685 0.8218 0.7793 0.8951 0.8744 0.8532 0.6694 0.752 0.7018 0.6725 0.7851 0.4842 0.5144 0.5691 0.7282 0.6726 0.7617 0.6144 0.8332 0.6454 0.7506 0.5819 0.4983 0.3984 0.5148 0.7268 0.6177 0.3772 0.4149 0.5675 0.4745 0.4545 0.5171 0.5941 0.6155 0.7266 0.4253 0.6192 0.4067 0.2542 0.5599 0.5762 0.4676 0.453 0.3737 0.7498 0.718 0.536 0.7078 0.5634 0.3191 0.5149 0.8239 0.701 0.8565 0.6289 0.6297 0.4489 0.1471 0.4666 0.417 0.5858 0.5377 0.7265 0.3605 0.5397 0.2178 0.6674 0.6738 0.5786 0.5861 0.281 0.6494 0.538 0.7341 0.57 0.5978 0.3061 0.7216 0.6349 0.7057 0.1101 0.4286 0.7537 0.5105 0.3104 0.3075 0.654 0.5283 0.0652 0.4283 0.3509 0.0425 0.0439 0.2362 0.2932 0.4293 0.7573 0.3116 0.0322 0.5319 0.8687 0.622 0.4751 0.1682 0.7398 0.2337 0.4889 0.2345 0.7392 0.5046 0.7762 0.5892 0.0981 0.6683 0.6995 0.13 0.4859 0.6357 0.1624 0.5255 0.6347 0.6852 0.6178 0.8251 0.6157 0.9148 0.5468 0.6271 0.1797 0.4529 0.5916 0.5885 0.3007 0.3483 0.0547 0.4485 0.6344 0.3889 0.0368 0.6306 0.1839 0.5392 0. 0.8006 0.5319 0.5471 0.8323] 2022-08-23 02:25:14 [INFO] [EVAL] Class Recall: [0.845 0.9111 0.9591 0.8596 0.8541 0.8103 0.861 0.9073 0.6975 0.743 0.6159 0.7187 0.8633 0.4557 0.3983 0.6238 0.5948 0.5462 0.7348 0.544 0.8717 0.5854 0.7705 0.6943 0.4322 0.1241 0.6756 0.5179 0.4976 0.5301 0.3262 0.6941 0.3931 0.5909 0.4749 0.5582 0.5698 0.7414 0.39 0.4829 0.4328 0.1583 0.4989 0.3075 0.4885 0.4248 0.4525 0.5752 0.8093 0.7647 0.718 0.5726 0.2374 0.3042 0.8622 0.5977 0.9547 0.5588 0.4473 0.3762 0.1497 0.4094 0.5808 0.2879 0.6982 0.8705 0.4827 0.6065 0.155 0.3879 0.6061 0.7645 0.5607 0.4007 0.5529 0.4918 0.7385 0.3572 0.3204 0.1747 0.7981 0.4941 0.4511 0.0651 0.3071 0.6743 0.1329 0.1025 0.4143 0.634 0.6441 0.176 0.3041 0.1027 0.0095 0.0207 0.0237 0.2273 0.444 0.4187 0.0809 0.0275 0.3205 0.8052 0.0805 0.7172 0.1469 0.6095 0.1223 0.2131 0.1676 0.4873 0.1494 0.4817 0.9968 0.0385 0.4518 0.8686 0.2165 0.0924 0.6111 0.0197 0.2653 0.1536 0.4344 0.3168 0.4708 0.6016 0.6058 0.4698 0.8042 0.0421 0.6009 0.2724 0.1737 0.1698 0.1698 0.0289 0.286 0.4519 0.7517 0.112 0.5052 0.0062 0.3685 0. 
0.3932 0.0202 0.1645 0.1482] 2022-08-23 02:25:14 [INFO] [EVAL] The model with the best validation mIoU (0.3416) was saved at iter 68000. 2022-08-23 02:25:25 [INFO] [TRAIN] epoch: 60, iter: 75050/160000, loss: 0.5742, lr: 0.000643, batch_cost: 0.2266, reader_cost: 0.00327, ips: 35.2985 samples/sec | ETA 05:20:52 2022-08-23 02:25:37 [INFO] [TRAIN] epoch: 60, iter: 75100/160000, loss: 0.5967, lr: 0.000643, batch_cost: 0.2301, reader_cost: 0.00129, ips: 34.7739 samples/sec | ETA 05:25:31 2022-08-23 02:25:50 [INFO] [TRAIN] epoch: 60, iter: 75150/160000, loss: 0.6083, lr: 0.000642, batch_cost: 0.2641, reader_cost: 0.00076, ips: 30.2886 samples/sec | ETA 06:13:31 2022-08-23 02:26:00 [INFO] [TRAIN] epoch: 60, iter: 75200/160000, loss: 0.5913, lr: 0.000642, batch_cost: 0.1913, reader_cost: 0.00633, ips: 41.8178 samples/sec | ETA 04:30:22 2022-08-23 02:26:10 [INFO] [TRAIN] epoch: 60, iter: 75250/160000, loss: 0.5631, lr: 0.000642, batch_cost: 0.2021, reader_cost: 0.00252, ips: 39.5924 samples/sec | ETA 04:45:24 2022-08-23 02:26:21 [INFO] [TRAIN] epoch: 60, iter: 75300/160000, loss: 0.5978, lr: 0.000641, batch_cost: 0.2199, reader_cost: 0.00572, ips: 36.3869 samples/sec | ETA 05:10:22 2022-08-23 02:26:32 [INFO] [TRAIN] epoch: 60, iter: 75350/160000, loss: 0.6024, lr: 0.000641, batch_cost: 0.2331, reader_cost: 0.00035, ips: 34.3247 samples/sec | ETA 05:28:49 2022-08-23 02:26:45 [INFO] [TRAIN] epoch: 60, iter: 75400/160000, loss: 0.6220, lr: 0.000641, batch_cost: 0.2477, reader_cost: 0.00035, ips: 32.2938 samples/sec | ETA 05:49:17 2022-08-23 02:26:55 [INFO] [TRAIN] epoch: 60, iter: 75450/160000, loss: 0.6039, lr: 0.000640, batch_cost: 0.2112, reader_cost: 0.00078, ips: 37.8854 samples/sec | ETA 04:57:33 2022-08-23 02:27:08 [INFO] [TRAIN] epoch: 60, iter: 75500/160000, loss: 0.5936, lr: 0.000640, batch_cost: 0.2512, reader_cost: 0.00065, ips: 31.8506 samples/sec | ETA 05:53:44 2022-08-23 02:27:20 [INFO] [TRAIN] epoch: 60, iter: 75550/160000, loss: 0.6124, lr: 0.000639, batch_cost: 0.2398, reader_cost: 0.00118, ips: 33.3669 samples/sec | ETA 05:37:27 2022-08-23 02:27:30 [INFO] [TRAIN] epoch: 60, iter: 75600/160000, loss: 0.5929, lr: 0.000639, batch_cost: 0.1982, reader_cost: 0.00096, ips: 40.3669 samples/sec | ETA 04:38:46 2022-08-23 02:27:40 [INFO] [TRAIN] epoch: 60, iter: 75650/160000, loss: 0.6036, lr: 0.000639, batch_cost: 0.1919, reader_cost: 0.00055, ips: 41.6959 samples/sec | ETA 04:29:43 2022-08-23 02:27:48 [INFO] [TRAIN] epoch: 60, iter: 75700/160000, loss: 0.6343, lr: 0.000638, batch_cost: 0.1796, reader_cost: 0.00122, ips: 44.5466 samples/sec | ETA 04:12:19 2022-08-23 02:27:58 [INFO] [TRAIN] epoch: 60, iter: 75750/160000, loss: 0.6290, lr: 0.000638, batch_cost: 0.1921, reader_cost: 0.00052, ips: 41.6459 samples/sec | ETA 04:29:44 2022-08-23 02:28:11 [INFO] [TRAIN] epoch: 61, iter: 75800/160000, loss: 0.5907, lr: 0.000637, batch_cost: 0.2659, reader_cost: 0.05796, ips: 30.0831 samples/sec | ETA 06:13:11 2022-08-23 02:28:21 [INFO] [TRAIN] epoch: 61, iter: 75850/160000, loss: 0.6131, lr: 0.000637, batch_cost: 0.1973, reader_cost: 0.00063, ips: 40.5414 samples/sec | ETA 04:36:45 2022-08-23 02:28:32 [INFO] [TRAIN] epoch: 61, iter: 75900/160000, loss: 0.5987, lr: 0.000637, batch_cost: 0.2084, reader_cost: 0.00052, ips: 38.3871 samples/sec | ETA 04:52:06 2022-08-23 02:28:42 [INFO] [TRAIN] epoch: 61, iter: 75950/160000, loss: 0.5803, lr: 0.000636, batch_cost: 0.2001, reader_cost: 0.00065, ips: 39.9891 samples/sec | ETA 04:40:14 2022-08-23 02:28:51 [INFO] [TRAIN] epoch: 61, iter: 76000/160000, loss: 
0.5484, lr: 0.000636, batch_cost: 0.1944, reader_cost: 0.00032, ips: 41.1427 samples/sec | ETA 04:32:13 2022-08-23 02:28:51 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 203s - batch_cost: 0.2027 - reader cost: 7.0044e-04 2022-08-23 02:32:14 [INFO] [EVAL] #Images: 2000 mIoU: 0.3384 Acc: 0.7605 Kappa: 0.7422 Dice: 0.4700 2022-08-23 02:32:14 [INFO] [EVAL] Class IoU: [0.6788 0.7637 0.9317 0.7188 0.6747 0.744 0.7817 0.7772 0.5297 0.6259 0.4993 0.558 0.688 0.3187 0.3028 0.4301 0.4956 0.4374 0.6022 0.4011 0.7352 0.4687 0.6224 0.4863 0.3012 0.3013 0.4063 0.4453 0.341 0.223 0.2471 0.4725 0.2798 0.3695 0.305 0.3806 0.4267 0.5383 0.2696 0.3516 0.2295 0.1022 0.3516 0.2571 0.3259 0.2738 0.2889 0.4976 0.4619 0.5381 0.5238 0.3085 0.2216 0.2259 0.6489 0.4447 0.8579 0.391 0.494 0.2736 0.0896 0.2053 0.3415 0.1875 0.4606 0.6808 0.2903 0.4149 0.1212 0.3373 0.4601 0.5389 0.3941 0.2027 0.4456 0.3163 0.4478 0.2802 0.2408 0.1368 0.5973 0.3475 0.2852 0.0617 0.1747 0.5411 0.1022 0.0898 0.238 0.4646 0.4341 0.0929 0.1925 0.0789 0.0149 0.0178 0.0311 0.1059 0.2795 0.4024 0.0621 0.0538 0.2411 0.0607 0.0228 0.4476 0.0899 0.5038 0.0877 0.1921 0.0951 0.4724 0.1518 0.5425 0.7162 0.037 0.2921 0.6196 0.1267 0.1643 0.4372 0.0044 0.2285 0.1433 0.5226 0.2244 0.4082 0.3724 0.4827 0.3468 0.5719 0.0179 0.3182 0.1912 0.1424 0.1268 0.1202 0.0138 0.1873 0.3177 0.3985 0.0411 0.3818 0.0377 0.2176 0. 0.3027 0.0365 0.1487 0.2485] 2022-08-23 02:32:14 [INFO] [EVAL] Class Precision: [0.7842 0.8377 0.969 0.8066 0.7486 0.8892 0.8963 0.836 0.7043 0.7937 0.6975 0.7011 0.7589 0.4866 0.5129 0.6074 0.6634 0.6736 0.7516 0.5772 0.822 0.6882 0.7486 0.6211 0.4658 0.4612 0.5039 0.6064 0.7252 0.3464 0.4144 0.6072 0.4604 0.5384 0.426 0.5294 0.6371 0.7697 0.4249 0.6555 0.4436 0.246 0.6203 0.4802 0.4698 0.4575 0.545 0.6705 0.7549 0.6298 0.7054 0.4099 0.3579 0.5383 0.7221 0.6094 0.9452 0.6573 0.6079 0.4872 0.1708 0.4092 0.5287 0.6326 0.6299 0.7925 0.4851 0.5332 0.2354 0.5621 0.6524 0.6801 0.6055 0.2931 0.6667 0.4698 0.5579 0.4881 0.7344 0.3219 0.6686 0.7267 0.8208 0.1522 0.414 0.7349 0.4356 0.3086 0.4462 0.7723 0.5875 0.166 0.4295 0.3277 0.0731 0.0398 0.5716 0.3907 0.4377 0.5097 0.4109 0.1215 0.5709 0.4659 0.1097 0.599 0.1917 0.7916 0.2754 0.5427 0.2454 0.8599 0.4629 0.6115 0.7185 0.4371 0.5904 0.7442 0.2176 0.5018 0.5565 0.048 0.6955 0.5214 0.7117 0.6295 0.7858 0.4554 0.8641 0.5997 0.6732 0.176 0.4238 0.7039 0.6658 0.3015 0.4406 0.0771 0.4963 0.6988 0.5376 0.0561 0.6223 0.3234 0.4436 0. 
0.8615 0.3544 0.5671 0.6255] 2022-08-23 02:32:14 [INFO] [EVAL] Class Recall: [0.8347 0.8963 0.9603 0.8685 0.8723 0.82 0.8595 0.917 0.6813 0.7475 0.6373 0.7322 0.8805 0.4802 0.4249 0.5957 0.6621 0.5551 0.7518 0.568 0.8744 0.595 0.7869 0.6914 0.4601 0.465 0.677 0.6263 0.3916 0.3849 0.3796 0.6805 0.4163 0.5407 0.5177 0.5753 0.5637 0.6417 0.4245 0.4313 0.3222 0.1487 0.448 0.3562 0.5156 0.4055 0.3808 0.6587 0.5434 0.7871 0.6705 0.555 0.3678 0.2802 0.8649 0.622 0.9028 0.4911 0.7251 0.3842 0.1584 0.2918 0.491 0.2104 0.6315 0.8284 0.4196 0.6516 0.1998 0.4575 0.6095 0.7219 0.5302 0.3967 0.5733 0.4919 0.6942 0.3969 0.2638 0.1921 0.8485 0.3998 0.3042 0.0939 0.2321 0.6723 0.1178 0.1125 0.3377 0.5384 0.6244 0.1744 0.2586 0.0941 0.0183 0.0313 0.0318 0.1268 0.436 0.6565 0.0682 0.088 0.2945 0.0652 0.028 0.6391 0.1448 0.5809 0.114 0.2293 0.1345 0.5118 0.1843 0.8279 0.9956 0.0389 0.3663 0.7873 0.2327 0.1964 0.6711 0.0049 0.2539 0.165 0.663 0.2586 0.4594 0.6716 0.5224 0.4512 0.7917 0.0196 0.5607 0.2079 0.1534 0.1796 0.1419 0.0165 0.2312 0.368 0.6063 0.1336 0.497 0.0409 0.2993 0. 0.3182 0.0391 0.1677 0.2919] 2022-08-23 02:32:15 [INFO] [EVAL] The model with the best validation mIoU (0.3416) was saved at iter 68000. 2022-08-23 02:32:27 [INFO] [TRAIN] epoch: 61, iter: 76050/160000, loss: 0.6102, lr: 0.000636, batch_cost: 0.2541, reader_cost: 0.00291, ips: 31.4893 samples/sec | ETA 05:55:27 2022-08-23 02:32:38 [INFO] [TRAIN] epoch: 61, iter: 76100/160000, loss: 0.5191, lr: 0.000635, batch_cost: 0.2084, reader_cost: 0.00109, ips: 38.3908 samples/sec | ETA 04:51:23 2022-08-23 02:32:50 [INFO] [TRAIN] epoch: 61, iter: 76150/160000, loss: 0.6242, lr: 0.000635, batch_cost: 0.2432, reader_cost: 0.00116, ips: 32.8970 samples/sec | ETA 05:39:50 2022-08-23 02:33:01 [INFO] [TRAIN] epoch: 61, iter: 76200/160000, loss: 0.5811, lr: 0.000634, batch_cost: 0.2231, reader_cost: 0.00111, ips: 35.8504 samples/sec | ETA 05:11:39 2022-08-23 02:33:13 [INFO] [TRAIN] epoch: 61, iter: 76250/160000, loss: 0.6581, lr: 0.000634, batch_cost: 0.2314, reader_cost: 0.00042, ips: 34.5683 samples/sec | ETA 05:23:01 2022-08-23 02:33:24 [INFO] [TRAIN] epoch: 61, iter: 76300/160000, loss: 0.5787, lr: 0.000634, batch_cost: 0.2296, reader_cost: 0.00063, ips: 34.8362 samples/sec | ETA 05:20:21 2022-08-23 02:33:35 [INFO] [TRAIN] epoch: 61, iter: 76350/160000, loss: 0.6326, lr: 0.000633, batch_cost: 0.2104, reader_cost: 0.00065, ips: 38.0311 samples/sec | ETA 04:53:16 2022-08-23 02:33:47 [INFO] [TRAIN] epoch: 61, iter: 76400/160000, loss: 0.5680, lr: 0.000633, batch_cost: 0.2486, reader_cost: 0.00091, ips: 32.1861 samples/sec | ETA 05:46:19 2022-08-23 02:33:59 [INFO] [TRAIN] epoch: 61, iter: 76450/160000, loss: 0.6238, lr: 0.000633, batch_cost: 0.2427, reader_cost: 0.00075, ips: 32.9607 samples/sec | ETA 05:37:58 2022-08-23 02:34:11 [INFO] [TRAIN] epoch: 61, iter: 76500/160000, loss: 0.5639, lr: 0.000632, batch_cost: 0.2335, reader_cost: 0.00062, ips: 34.2642 samples/sec | ETA 05:24:55 2022-08-23 02:34:22 [INFO] [TRAIN] epoch: 61, iter: 76550/160000, loss: 0.5809, lr: 0.000632, batch_cost: 0.2229, reader_cost: 0.00090, ips: 35.8904 samples/sec | ETA 05:10:01 2022-08-23 02:34:32 [INFO] [TRAIN] epoch: 61, iter: 76600/160000, loss: 0.5728, lr: 0.000631, batch_cost: 0.2065, reader_cost: 0.00035, ips: 38.7436 samples/sec | ETA 04:47:00 2022-08-23 02:34:43 [INFO] [TRAIN] epoch: 61, iter: 76650/160000, loss: 0.6282, lr: 0.000631, batch_cost: 0.2047, reader_cost: 0.00088, ips: 39.0825 samples/sec | ETA 04:44:21 2022-08-23 02:34:53 [INFO] [TRAIN] epoch: 61, iter: 
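The three per-class arrays printed after each pass are mutually consistent: with the standard definitions P = TP/(TP+FP) and R = TP/(TP+FN), the per-class IoU is TP/(TP+FP+FN) = P*R/(P+R-P*R). A short check against the iter-76000 numbers above; the function name is illustrative.

def iou_from_precision_recall(precision, recall):
    """Per-class IoU implied by per-class precision and recall.

    With pixel counts TP/FP/FN: P = TP/(TP+FP), R = TP/(TP+FN), and therefore
    IoU = TP/(TP+FP+FN) = P*R / (P + R - P*R). Returns 0.0 when P = R = 0.
    """
    denom = precision + recall - precision * recall
    return precision * recall / denom if denom > 0 else 0.0

# Spot check, class 0 of the iter-76000 evaluation above:
# iou_from_precision_recall(0.7842, 0.8347) -> ~0.6788, matching the Class IoU array.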
76700/160000, loss: 0.5812, lr: 0.000631, batch_cost: 0.2131, reader_cost: 0.00040, ips: 37.5370 samples/sec | ETA 04:55:53 2022-08-23 02:35:03 [INFO] [TRAIN] epoch: 61, iter: 76750/160000, loss: 0.5911, lr: 0.000630, batch_cost: 0.1919, reader_cost: 0.00037, ips: 41.6919 samples/sec | ETA 04:26:14 2022-08-23 02:35:13 [INFO] [TRAIN] epoch: 61, iter: 76800/160000, loss: 0.6360, lr: 0.000630, batch_cost: 0.2023, reader_cost: 0.00042, ips: 39.5523 samples/sec | ETA 04:40:28 2022-08-23 02:35:23 [INFO] [TRAIN] epoch: 61, iter: 76850/160000, loss: 0.6538, lr: 0.000630, batch_cost: 0.2013, reader_cost: 0.00043, ips: 39.7438 samples/sec | ETA 04:38:57 2022-08-23 02:35:33 [INFO] [TRAIN] epoch: 61, iter: 76900/160000, loss: 0.5786, lr: 0.000629, batch_cost: 0.1922, reader_cost: 0.00042, ips: 41.6334 samples/sec | ETA 04:26:07 2022-08-23 02:35:42 [INFO] [TRAIN] epoch: 61, iter: 76950/160000, loss: 0.6391, lr: 0.000629, batch_cost: 0.1782, reader_cost: 0.00073, ips: 44.9011 samples/sec | ETA 04:06:36 2022-08-23 02:35:51 [INFO] [TRAIN] epoch: 61, iter: 77000/160000, loss: 0.5704, lr: 0.000628, batch_cost: 0.1786, reader_cost: 0.00048, ips: 44.7903 samples/sec | ETA 04:07:04 2022-08-23 02:35:51 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 210s - batch_cost: 0.2097 - reader cost: 7.2478e-04 2022-08-23 02:39:21 [INFO] [EVAL] #Images: 2000 mIoU: 0.3370 Acc: 0.7617 Kappa: 0.7435 Dice: 0.4667 2022-08-23 02:39:21 [INFO] [EVAL] Class IoU: [0.6871 0.7716 0.9309 0.7295 0.6743 0.7554 0.7653 0.7777 0.518 0.6397 0.4972 0.5417 0.6887 0.3048 0.2796 0.4229 0.5145 0.4262 0.5956 0.4069 0.7415 0.4113 0.6194 0.4825 0.3137 0.1663 0.4036 0.4495 0.3264 0.232 0.2481 0.4919 0.2607 0.3385 0.3351 0.3912 0.4052 0.5782 0.2634 0.3498 0.249 0.1353 0.3566 0.2395 0.3243 0.2457 0.1937 0.5025 0.5505 0.551 0.5302 0.3705 0.1426 0.1752 0.6893 0.3911 0.8451 0.4076 0.4924 0.2783 0.0747 0.3632 0.3324 0.2051 0.4405 0.6813 0.2934 0.4009 0.1146 0.323 0.4725 0.5457 0.3759 0.1851 0.4339 0.3406 0.4595 0.2812 0.194 0.1255 0.5467 0.3608 0.2804 0.012 0.4133 0.5459 0.0903 0.0559 0.2563 0.5048 0.3984 0.0287 0.2156 0.1171 0.0092 0.012 0.0255 0.1121 0.2853 0.3875 0.0571 0.0219 0.2225 0.1143 0.0131 0.4604 0.0807 0.4894 0.1122 0.1537 0.0492 0.4034 0.1441 0.5833 0.7982 0.0144 0.3565 0.602 0.1143 0.155 0.3893 0.0192 0.2531 0.1009 0.4047 0.2974 0.4389 0.4261 0.0667 0.353 0.5575 0.0231 0.3781 0.2383 0.2392 0.1203 0.1395 0.0147 0.1698 0.3371 0.4042 0.0013 0.3833 0.0581 0.3056 0. 
0.329 0.0311 0.1828 0.2409] 2022-08-23 02:39:21 [INFO] [EVAL] Class Precision: [0.7989 0.8275 0.9671 0.8295 0.7665 0.8617 0.8772 0.8348 0.64 0.7779 0.6756 0.6817 0.7565 0.4951 0.5487 0.5556 0.6764 0.6827 0.6994 0.5791 0.8327 0.6373 0.7448 0.6009 0.5044 0.47 0.4735 0.7412 0.6841 0.3921 0.4541 0.6362 0.4621 0.5651 0.5131 0.5214 0.6143 0.8089 0.3885 0.648 0.456 0.3037 0.6125 0.5233 0.4753 0.481 0.3979 0.7015 0.638 0.679 0.7357 0.4796 0.3271 0.4776 0.7241 0.5067 0.8993 0.6226 0.5948 0.4089 0.1105 0.4812 0.5578 0.7288 0.5424 0.7881 0.4503 0.5434 0.2181 0.5206 0.6327 0.6908 0.6476 0.3014 0.7677 0.5275 0.7715 0.57 0.7103 0.3095 0.6163 0.6441 0.7929 0.0702 0.5746 0.7014 0.5239 0.3703 0.4814 0.6918 0.5365 0.0325 0.384 0.3343 0.0251 0.0301 0.1662 0.3244 0.441 0.6538 0.3731 0.0533 0.5466 0.7549 0.2888 0.6152 0.1946 0.8949 0.2097 0.3227 0.252 0.7639 0.5176 0.6744 0.8042 0.1453 0.8019 0.6753 0.1738 0.5671 0.5644 0.1976 0.7007 0.6044 0.7623 0.5771 0.8154 0.6231 0.505 0.5483 0.6158 0.2008 0.6021 0.6317 0.4322 0.3217 0.3593 0.0523 0.4867 0.6971 0.5074 0.002 0.6522 0.4739 0.6098 0. 0.8186 0.3609 0.6857 0.6861] 2022-08-23 02:39:21 [INFO] [EVAL] Class Recall: [0.8308 0.9195 0.9614 0.8582 0.8486 0.8597 0.8572 0.9192 0.7309 0.7826 0.6531 0.725 0.885 0.4424 0.3631 0.6392 0.6825 0.5316 0.8004 0.5777 0.8712 0.537 0.7864 0.71 0.4535 0.2047 0.7322 0.5331 0.3843 0.3624 0.3536 0.6845 0.3743 0.4578 0.4914 0.6103 0.5434 0.6696 0.4501 0.4319 0.3542 0.1961 0.4605 0.3064 0.5051 0.3343 0.2741 0.6391 0.8006 0.7451 0.655 0.6196 0.2018 0.2168 0.9349 0.6315 0.9334 0.5414 0.741 0.4657 0.1874 0.5969 0.4514 0.2221 0.7011 0.8341 0.4572 0.6045 0.1946 0.4596 0.651 0.7221 0.4726 0.3242 0.4995 0.4902 0.5319 0.3569 0.2107 0.1744 0.8288 0.4507 0.3026 0.0143 0.5955 0.7112 0.0984 0.0618 0.3541 0.6512 0.6074 0.1975 0.3296 0.1526 0.0142 0.0195 0.0292 0.1463 0.4469 0.4876 0.0631 0.0358 0.2729 0.1187 0.0135 0.6466 0.1212 0.5192 0.1945 0.2269 0.0577 0.4609 0.1665 0.812 0.9907 0.0157 0.3909 0.8473 0.2502 0.1758 0.5565 0.0208 0.2838 0.1081 0.4631 0.3803 0.4873 0.5741 0.0714 0.4976 0.8548 0.0254 0.504 0.2768 0.3489 0.1611 0.1858 0.0201 0.2069 0.3949 0.6653 0.0041 0.4818 0.0621 0.3798 0. 0.3549 0.0329 0.1995 0.2708] 2022-08-23 02:39:21 [INFO] [EVAL] The model with the best validation mIoU (0.3416) was saved at iter 68000. 
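In the [TRAIN] lines, the ips and ETA fields follow directly from batch_cost: throughput is the batch size divided by the averaged per-iteration time, and the ETA is the remaining iteration count times that same time. A minimal sketch of that arithmetic, assuming the batch_size of 8 from the config above (ETA zero-padding aside); the helper name is illustrative.

from datetime import timedelta

def train_line_fields(cur_iter, batch_cost, total_iters=160000, batch_size=8):
    """Reproduce the ips and ETA fields of a [TRAIN] log line.

    batch_cost is the averaged seconds per iteration over the 50-iter logging
    window; ips = batch_size / batch_cost, and the ETA is the remaining
    iteration count times that same batch_cost.
    """
    ips = batch_size / batch_cost
    eta = timedelta(seconds=int((total_iters - cur_iter) * batch_cost))
    return ips, eta

# Example, the entry logged just before the iter-78000 evaluation below:
# train_line_fields(78000, 0.1704) -> (~46.95, 3:52:52), in line with
# "ips: 46.9485 samples/sec | ETA 03:52:52".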
2022-08-23 02:39:36 [INFO] [TRAIN] epoch: 62, iter: 77050/160000, loss: 0.6444, lr: 0.000628, batch_cost: 0.2988, reader_cost: 0.07937, ips: 26.7744 samples/sec | ETA 06:53:04 2022-08-23 02:39:46 [INFO] [TRAIN] epoch: 62, iter: 77100/160000, loss: 0.6133, lr: 0.000628, batch_cost: 0.2158, reader_cost: 0.01033, ips: 37.0684 samples/sec | ETA 04:58:11 2022-08-23 02:39:59 [INFO] [TRAIN] epoch: 62, iter: 77150/160000, loss: 0.5323, lr: 0.000627, batch_cost: 0.2471, reader_cost: 0.00037, ips: 32.3739 samples/sec | ETA 05:41:13 2022-08-23 02:40:10 [INFO] [TRAIN] epoch: 62, iter: 77200/160000, loss: 0.5459, lr: 0.000627, batch_cost: 0.2248, reader_cost: 0.00104, ips: 35.5898 samples/sec | ETA 05:10:12 2022-08-23 02:40:21 [INFO] [TRAIN] epoch: 62, iter: 77250/160000, loss: 0.6034, lr: 0.000627, batch_cost: 0.2145, reader_cost: 0.00153, ips: 37.2990 samples/sec | ETA 04:55:48 2022-08-23 02:40:32 [INFO] [TRAIN] epoch: 62, iter: 77300/160000, loss: 0.6117, lr: 0.000626, batch_cost: 0.2191, reader_cost: 0.00764, ips: 36.5157 samples/sec | ETA 05:01:58 2022-08-23 02:40:43 [INFO] [TRAIN] epoch: 62, iter: 77350/160000, loss: 0.5576, lr: 0.000626, batch_cost: 0.2297, reader_cost: 0.00058, ips: 34.8237 samples/sec | ETA 05:16:27 2022-08-23 02:40:55 [INFO] [TRAIN] epoch: 62, iter: 77400/160000, loss: 0.5668, lr: 0.000625, batch_cost: 0.2272, reader_cost: 0.00068, ips: 35.2069 samples/sec | ETA 05:12:49 2022-08-23 02:41:05 [INFO] [TRAIN] epoch: 62, iter: 77450/160000, loss: 0.5850, lr: 0.000625, batch_cost: 0.2150, reader_cost: 0.00128, ips: 37.2134 samples/sec | ETA 04:55:46 2022-08-23 02:41:17 [INFO] [TRAIN] epoch: 62, iter: 77500/160000, loss: 0.6127, lr: 0.000625, batch_cost: 0.2342, reader_cost: 0.00154, ips: 34.1650 samples/sec | ETA 05:21:58 2022-08-23 02:41:29 [INFO] [TRAIN] epoch: 62, iter: 77550/160000, loss: 0.6227, lr: 0.000624, batch_cost: 0.2347, reader_cost: 0.00083, ips: 34.0877 samples/sec | ETA 05:22:30 2022-08-23 02:41:40 [INFO] [TRAIN] epoch: 62, iter: 77600/160000, loss: 0.6203, lr: 0.000624, batch_cost: 0.2282, reader_cost: 0.00097, ips: 35.0551 samples/sec | ETA 05:13:24 2022-08-23 02:41:50 [INFO] [TRAIN] epoch: 62, iter: 77650/160000, loss: 0.6066, lr: 0.000623, batch_cost: 0.2024, reader_cost: 0.00050, ips: 39.5238 samples/sec | ETA 04:37:48 2022-08-23 02:42:00 [INFO] [TRAIN] epoch: 62, iter: 77700/160000, loss: 0.5745, lr: 0.000623, batch_cost: 0.1895, reader_cost: 0.00056, ips: 42.2099 samples/sec | ETA 04:19:58 2022-08-23 02:42:10 [INFO] [TRAIN] epoch: 62, iter: 77750/160000, loss: 0.6027, lr: 0.000623, batch_cost: 0.1995, reader_cost: 0.00093, ips: 40.0970 samples/sec | ETA 04:33:30 2022-08-23 02:42:20 [INFO] [TRAIN] epoch: 62, iter: 77800/160000, loss: 0.5978, lr: 0.000622, batch_cost: 0.1965, reader_cost: 0.00088, ips: 40.7087 samples/sec | ETA 04:29:13 2022-08-23 02:42:28 [INFO] [TRAIN] epoch: 62, iter: 77850/160000, loss: 0.5690, lr: 0.000622, batch_cost: 0.1771, reader_cost: 0.00111, ips: 45.1750 samples/sec | ETA 04:02:27 2022-08-23 02:42:37 [INFO] [TRAIN] epoch: 62, iter: 77900/160000, loss: 0.5987, lr: 0.000622, batch_cost: 0.1674, reader_cost: 0.00054, ips: 47.7862 samples/sec | ETA 03:49:04 2022-08-23 02:42:45 [INFO] [TRAIN] epoch: 62, iter: 77950/160000, loss: 0.5893, lr: 0.000621, batch_cost: 0.1661, reader_cost: 0.00068, ips: 48.1745 samples/sec | ETA 03:47:05 2022-08-23 02:42:54 [INFO] [TRAIN] epoch: 62, iter: 78000/160000, loss: 0.6477, lr: 0.000621, batch_cost: 0.1704, reader_cost: 0.00041, ips: 46.9485 samples/sec | ETA 03:52:52 2022-08-23 02:42:54 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 202s - batch_cost: 0.2019 - reader cost: 6.5588e-04 2022-08-23 02:46:16 [INFO] [EVAL] #Images: 2000 mIoU: 0.3357 Acc: 0.7599 Kappa: 0.7416 Dice: 0.4668 2022-08-23 02:46:16 [INFO] [EVAL] Class IoU: [0.6827 0.7652 0.932 0.7226 0.6729 0.7556 0.7688 0.7843 0.5232 0.6277 0.4871 0.5511 0.7009 0.2732 0.2718 0.4238 0.4911 0.4449 0.6061 0.3987 0.7414 0.4511 0.6153 0.4882 0.3112 0.327 0.4483 0.4583 0.3508 0.25 0.249 0.4691 0.2502 0.347 0.305 0.4 0.416 0.5634 0.2804 0.3456 0.2131 0.1076 0.3534 0.2466 0.3151 0.2123 0.2642 0.5047 0.4661 0.5601 0.5032 0.3929 0.2003 0.2187 0.6393 0.4106 0.8631 0.4187 0.3687 0.2809 0.1172 0.266 0.3236 0.2394 0.469 0.7003 0.2871 0.3982 0.0831 0.2977 0.4642 0.5116 0.422 0.2099 0.4401 0.3269 0.4709 0.2685 0.2014 0.1106 0.622 0.3725 0.3137 0.0176 0.1969 0.5452 0.1078 0.0923 0.2492 0.5044 0.4107 0.0406 0.187 0.0837 0.0321 0.0043 0.0254 0.141 0.2756 0.3598 0.0722 0.0581 0.1971 0.0999 0.0615 0.4481 0.0709 0.5034 0.1102 0.2028 0.1008 0.373 0.1454 0.5841 0.7458 0.016 0.2529 0.6602 0.0858 0.1597 0.3825 0.0045 0.2165 0.159 0.3211 0.222 0.4235 0.3959 0.1846 0.304 0.548 0.0345 0.3961 0.2534 0.1474 0.1207 0.1365 0.0078 0.2106 0.344 0.4455 0.0005 0.3814 0.1269 0.2981 0. 0.3139 0.0398 0.1545 0.1577] 2022-08-23 02:46:16 [INFO] [EVAL] Class Precision: [0.7864 0.8463 0.9625 0.8121 0.7537 0.8495 0.8732 0.863 0.6822 0.738 0.7289 0.7021 0.7793 0.4972 0.5078 0.5835 0.7157 0.7013 0.7327 0.5919 0.8286 0.7193 0.758 0.6106 0.4989 0.4321 0.5743 0.6941 0.7012 0.395 0.4419 0.5792 0.4353 0.4492 0.4185 0.5188 0.6136 0.7415 0.4904 0.6559 0.3903 0.2847 0.6024 0.5916 0.4776 0.3999 0.4313 0.7348 0.5972 0.6447 0.7017 0.5823 0.3667 0.5182 0.7624 0.5058 0.9083 0.6094 0.6945 0.4908 0.1548 0.4017 0.5375 0.6685 0.6075 0.8436 0.3994 0.5484 0.1138 0.4977 0.5691 0.6686 0.6179 0.2743 0.6898 0.5068 0.5683 0.4763 0.6342 0.2882 0.7381 0.6593 0.7581 0.0585 0.4297 0.6632 0.4863 0.3374 0.4286 0.704 0.5918 0.048 0.4248 0.2799 0.1155 0.023 0.1531 0.3821 0.5179 0.6491 0.5205 0.103 0.5533 0.6189 0.7975 0.61 0.1334 0.7845 0.208 0.3905 0.2351 0.9696 0.6119 0.7022 0.7513 0.1333 0.744 0.6875 0.1436 0.538 0.5859 0.1746 0.7067 0.5618 0.751 0.6107 0.7789 0.5135 0.4262 0.4275 0.6065 0.2454 0.6062 0.5672 0.646 0.3515 0.3129 0.0916 0.366 0.6098 0.6587 0.0008 0.7343 0.4542 0.5363 0. 0.8601 0.2932 0.5901 0.8118] 2022-08-23 02:46:16 [INFO] [EVAL] Class Recall: [0.8382 0.8887 0.9671 0.8676 0.8625 0.8724 0.8654 0.8958 0.6919 0.8077 0.5948 0.7193 0.8746 0.3774 0.369 0.6075 0.6101 0.5489 0.7781 0.5498 0.8758 0.5475 0.7658 0.7089 0.4527 0.5735 0.6714 0.5743 0.4124 0.4052 0.3633 0.7115 0.3705 0.6039 0.5294 0.636 0.5637 0.7011 0.3956 0.4221 0.3195 0.1474 0.4608 0.2972 0.4807 0.3116 0.4055 0.6171 0.6798 0.8103 0.6401 0.5471 0.3061 0.2745 0.7985 0.6858 0.9454 0.5723 0.44 0.3964 0.326 0.4405 0.4485 0.2716 0.6729 0.8047 0.5052 0.5926 0.2352 0.4255 0.7158 0.6854 0.5709 0.4717 0.5486 0.4793 0.7331 0.3809 0.2279 0.1523 0.7982 0.4613 0.3486 0.0246 0.2665 0.754 0.1216 0.1127 0.3731 0.6401 0.573 0.2079 0.2505 0.1066 0.0426 0.0053 0.0295 0.1826 0.3707 0.4467 0.0774 0.1174 0.2344 0.1064 0.0624 0.6279 0.1316 0.5842 0.1899 0.2967 0.15 0.3774 0.1602 0.7765 0.9902 0.0178 0.277 0.9431 0.1756 0.1851 0.5242 0.0046 0.2379 0.1815 0.3594 0.2587 0.4813 0.6334 0.2456 0.5126 0.8504 0.0386 0.5334 0.3142 0.1604 0.1553 0.195 0.0084 0.3316 0.4411 0.5792 0.0014 0.4425 0.1498 0.4015 0. 
0.3308 0.044 0.1731 0.1637] 2022-08-23 02:46:16 [INFO] [EVAL] The model with the best validation mIoU (0.3416) was saved at iter 68000. 2022-08-23 02:46:31 [INFO] [TRAIN] epoch: 62, iter: 78050/160000, loss: 0.5628, lr: 0.000620, batch_cost: 0.3029, reader_cost: 0.00565, ips: 26.4156 samples/sec | ETA 06:53:38 2022-08-23 02:46:44 [INFO] [TRAIN] epoch: 62, iter: 78100/160000, loss: 0.5858, lr: 0.000620, batch_cost: 0.2643, reader_cost: 0.00168, ips: 30.2668 samples/sec | ETA 06:00:47 2022-08-23 02:46:59 [INFO] [TRAIN] epoch: 62, iter: 78150/160000, loss: 0.5944, lr: 0.000620, batch_cost: 0.2885, reader_cost: 0.00162, ips: 27.7299 samples/sec | ETA 06:33:33 2022-08-23 02:47:13 [INFO] [TRAIN] epoch: 62, iter: 78200/160000, loss: 0.6063, lr: 0.000619, batch_cost: 0.2762, reader_cost: 0.00776, ips: 28.9689 samples/sec | ETA 06:16:29 2022-08-23 02:47:26 [INFO] [TRAIN] epoch: 62, iter: 78250/160000, loss: 0.6202, lr: 0.000619, batch_cost: 0.2736, reader_cost: 0.00522, ips: 29.2354 samples/sec | ETA 06:12:50 2022-08-23 02:47:41 [INFO] [TRAIN] epoch: 62, iter: 78300/160000, loss: 0.5975, lr: 0.000619, batch_cost: 0.2944, reader_cost: 0.00066, ips: 27.1730 samples/sec | ETA 06:40:53 2022-08-23 02:47:58 [INFO] [TRAIN] epoch: 63, iter: 78350/160000, loss: 0.5950, lr: 0.000618, batch_cost: 0.3364, reader_cost: 0.07005, ips: 23.7787 samples/sec | ETA 07:37:50 2022-08-23 02:48:10 [INFO] [TRAIN] epoch: 63, iter: 78400/160000, loss: 0.5719, lr: 0.000618, batch_cost: 0.2356, reader_cost: 0.00057, ips: 33.9502 samples/sec | ETA 05:20:28 2022-08-23 02:48:21 [INFO] [TRAIN] epoch: 63, iter: 78450/160000, loss: 0.5594, lr: 0.000617, batch_cost: 0.2211, reader_cost: 0.00069, ips: 36.1851 samples/sec | ETA 05:00:29 2022-08-23 02:48:31 [INFO] [TRAIN] epoch: 63, iter: 78500/160000, loss: 0.5741, lr: 0.000617, batch_cost: 0.2077, reader_cost: 0.00069, ips: 38.5172 samples/sec | ETA 04:42:07 2022-08-23 02:48:41 [INFO] [TRAIN] epoch: 63, iter: 78550/160000, loss: 0.5695, lr: 0.000617, batch_cost: 0.2038, reader_cost: 0.00069, ips: 39.2532 samples/sec | ETA 04:36:39 2022-08-23 02:48:52 [INFO] [TRAIN] epoch: 63, iter: 78600/160000, loss: 0.5470, lr: 0.000616, batch_cost: 0.2158, reader_cost: 0.00097, ips: 37.0700 samples/sec | ETA 04:52:46 2022-08-23 02:49:01 [INFO] [TRAIN] epoch: 63, iter: 78650/160000, loss: 0.5731, lr: 0.000616, batch_cost: 0.1766, reader_cost: 0.00102, ips: 45.3050 samples/sec | ETA 03:59:24 2022-08-23 02:49:10 [INFO] [TRAIN] epoch: 63, iter: 78700/160000, loss: 0.5522, lr: 0.000616, batch_cost: 0.1783, reader_cost: 0.00042, ips: 44.8683 samples/sec | ETA 04:01:35 2022-08-23 02:49:20 [INFO] [TRAIN] epoch: 63, iter: 78750/160000, loss: 0.6016, lr: 0.000615, batch_cost: 0.1968, reader_cost: 0.00052, ips: 40.6537 samples/sec | ETA 04:26:28 2022-08-23 02:49:28 [INFO] [TRAIN] epoch: 63, iter: 78800/160000, loss: 0.6024, lr: 0.000615, batch_cost: 0.1655, reader_cost: 0.00054, ips: 48.3241 samples/sec | ETA 03:44:02 2022-08-23 02:49:36 [INFO] [TRAIN] epoch: 63, iter: 78850/160000, loss: 0.5837, lr: 0.000614, batch_cost: 0.1602, reader_cost: 0.00031, ips: 49.9445 samples/sec | ETA 03:36:38 2022-08-23 02:49:44 [INFO] [TRAIN] epoch: 63, iter: 78900/160000, loss: 0.5641, lr: 0.000614, batch_cost: 0.1648, reader_cost: 0.00073, ips: 48.5556 samples/sec | ETA 03:42:41 2022-08-23 02:49:55 [INFO] [TRAIN] epoch: 63, iter: 78950/160000, loss: 0.6036, lr: 0.000614, batch_cost: 0.2075, reader_cost: 0.00045, ips: 38.5490 samples/sec | ETA 04:40:20 2022-08-23 02:50:03 [INFO] [TRAIN] epoch: 63, iter: 79000/160000, loss: 
0.5840, lr: 0.000613, batch_cost: 0.1776, reader_cost: 0.00081, ips: 45.0380 samples/sec | ETA 03:59:47 2022-08-23 02:50:03 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 209s - batch_cost: 0.2094 - reader cost: 7.8588e-04 2022-08-23 02:53:33 [INFO] [EVAL] #Images: 2000 mIoU: 0.3433 Acc: 0.7609 Kappa: 0.7427 Dice: 0.4760 2022-08-23 02:53:33 [INFO] [EVAL] Class IoU: [0.6824 0.7667 0.9318 0.7288 0.6707 0.7535 0.7642 0.781 0.5201 0.6202 0.4794 0.5366 0.6961 0.3128 0.2735 0.4246 0.4972 0.3944 0.6063 0.4022 0.7354 0.4224 0.6062 0.4915 0.3339 0.4022 0.4417 0.4229 0.392 0.2118 0.2521 0.4814 0.2585 0.3478 0.2905 0.4196 0.4257 0.5446 0.2662 0.3839 0.2295 0.0616 0.3534 0.2455 0.31 0.2607 0.2835 0.5021 0.5112 0.5403 0.5196 0.3578 0.1745 0.2125 0.7329 0.4228 0.8502 0.3919 0.4917 0.2707 0.1414 0.2589 0.3449 0.158 0.4307 0.7126 0.2857 0.4283 0.1596 0.3101 0.4777 0.525 0.3971 0.204 0.3831 0.3324 0.4894 0.2885 0.2072 0.1206 0.5772 0.3595 0.3217 0.0208 0.3647 0.5334 0.1336 0.0932 0.2446 0.4356 0.3807 0.0789 0.1988 0.0747 0.0151 0.0057 0.0358 0.1653 0.3 0.4001 0.0361 0.0202 0.2029 0.4863 0.1189 0.5091 0.0594 0.5164 0.0999 0.2243 0.1091 0.4207 0.1584 0.5474 0.7919 0.0116 0.334 0.5979 0.1218 0.1871 0.3564 0.0097 0.2111 0.1575 0.3428 0.246 0.364 0.4391 0.3908 0.2953 0.5418 0.0087 0.2645 0.221 0.1445 0.1257 0.1184 0.0274 0.1803 0.3463 0.5722 0.0171 0.3298 0.1985 0.2834 0. 0.3528 0.0372 0.1358 0.135 ] 2022-08-23 02:53:33 [INFO] [EVAL] Class Precision: [0.7863 0.8416 0.9681 0.8283 0.7488 0.8811 0.8854 0.8453 0.6324 0.76 0.7148 0.6508 0.7931 0.4818 0.5532 0.5657 0.6798 0.6823 0.746 0.6059 0.823 0.7076 0.714 0.6018 0.5181 0.5352 0.5386 0.7469 0.6857 0.3541 0.4257 0.6393 0.5007 0.5115 0.3979 0.5579 0.5562 0.7813 0.4963 0.6196 0.3777 0.2339 0.542 0.5877 0.463 0.4726 0.5215 0.6791 0.5771 0.5993 0.6668 0.4732 0.3908 0.5929 0.7583 0.5401 0.8962 0.6535 0.6563 0.4773 0.19 0.5212 0.5275 0.7185 0.5093 0.8428 0.3702 0.5875 0.4116 0.5462 0.6226 0.6921 0.64 0.2475 0.6336 0.5426 0.6317 0.5259 0.6634 0.3717 0.6654 0.6302 0.7623 0.0699 0.5037 0.6969 0.3557 0.3269 0.4138 0.7605 0.489 0.2127 0.4696 0.3349 0.0637 0.0231 0.2156 0.3583 0.5058 0.6127 0.2359 0.0414 0.5714 0.8682 0.7033 0.7218 0.1317 0.8204 0.2049 0.3787 0.2878 0.8391 0.4388 0.7962 0.7959 0.133 0.6683 0.6073 0.2178 0.4389 0.5091 0.1238 0.618 0.6066 0.7138 0.6144 0.7677 0.6124 0.4862 0.4025 0.6018 0.1218 0.3231 0.7208 0.5532 0.27 0.401 0.0914 0.5349 0.6407 0.8414 0.0255 0.7212 0.5858 0.3709 0. 
0.8486 0.2895 0.5287 0.8372] 2022-08-23 02:53:33 [INFO] [EVAL] Class Recall: [0.8377 0.8961 0.9612 0.8584 0.8655 0.8388 0.8481 0.9113 0.7455 0.7712 0.5928 0.7535 0.8505 0.4713 0.351 0.63 0.6493 0.4831 0.764 0.5447 0.8735 0.5118 0.8007 0.7285 0.4844 0.6181 0.7106 0.4936 0.4779 0.3451 0.382 0.661 0.3482 0.5207 0.5183 0.6286 0.6447 0.6425 0.3647 0.5023 0.3691 0.0772 0.5039 0.2966 0.4842 0.3677 0.3831 0.6582 0.8174 0.846 0.7018 0.5948 0.2397 0.2488 0.9564 0.6606 0.9431 0.4948 0.6622 0.3847 0.3561 0.3397 0.4992 0.1684 0.7364 0.8217 0.5558 0.6125 0.2068 0.4177 0.6724 0.6849 0.5113 0.5374 0.4922 0.4617 0.6848 0.3898 0.2316 0.1515 0.8131 0.4556 0.3576 0.0288 0.5692 0.6945 0.1763 0.1154 0.3743 0.5048 0.6321 0.1115 0.2563 0.0877 0.0194 0.0075 0.0411 0.2347 0.4244 0.5356 0.0408 0.0378 0.2394 0.5251 0.1252 0.6334 0.0975 0.5822 0.1632 0.3549 0.1494 0.4576 0.1986 0.6366 0.9937 0.0126 0.4003 0.9747 0.2167 0.246 0.5429 0.0104 0.2427 0.1754 0.3975 0.2909 0.409 0.608 0.6655 0.5259 0.8447 0.0093 0.5931 0.2416 0.1636 0.1904 0.1439 0.0376 0.2138 0.4298 0.6414 0.0497 0.3781 0.2309 0.5457 0. 0.3764 0.041 0.1545 0.1386] 2022-08-23 02:53:33 [INFO] [EVAL] The model with the best validation mIoU (0.3433) was saved at iter 79000. 2022-08-23 02:53:47 [INFO] [TRAIN] epoch: 63, iter: 79050/160000, loss: 0.6049, lr: 0.000613, batch_cost: 0.2767, reader_cost: 0.00469, ips: 28.9116 samples/sec | ETA 06:13:19 2022-08-23 02:54:01 [INFO] [TRAIN] epoch: 63, iter: 79100/160000, loss: 0.6029, lr: 0.000612, batch_cost: 0.2830, reader_cost: 0.00150, ips: 28.2676 samples/sec | ETA 06:21:35 2022-08-23 02:54:16 [INFO] [TRAIN] epoch: 63, iter: 79150/160000, loss: 0.5983, lr: 0.000612, batch_cost: 0.2868, reader_cost: 0.00132, ips: 27.8914 samples/sec | ETA 06:26:29 2022-08-23 02:54:30 [INFO] [TRAIN] epoch: 63, iter: 79200/160000, loss: 0.5562, lr: 0.000612, batch_cost: 0.2778, reader_cost: 0.01231, ips: 28.7987 samples/sec | ETA 06:14:05 2022-08-23 02:54:45 [INFO] [TRAIN] epoch: 63, iter: 79250/160000, loss: 0.5807, lr: 0.000611, batch_cost: 0.3001, reader_cost: 0.00307, ips: 26.6536 samples/sec | ETA 06:43:56 2022-08-23 02:54:58 [INFO] [TRAIN] epoch: 63, iter: 79300/160000, loss: 0.6152, lr: 0.000611, batch_cost: 0.2695, reader_cost: 0.01482, ips: 29.6804 samples/sec | ETA 06:02:31 2022-08-23 02:55:11 [INFO] [TRAIN] epoch: 63, iter: 79350/160000, loss: 0.6087, lr: 0.000611, batch_cost: 0.2578, reader_cost: 0.00158, ips: 31.0339 samples/sec | ETA 05:46:30 2022-08-23 02:55:23 [INFO] [TRAIN] epoch: 63, iter: 79400/160000, loss: 0.5860, lr: 0.000610, batch_cost: 0.2289, reader_cost: 0.01067, ips: 34.9457 samples/sec | ETA 05:07:31 2022-08-23 02:55:33 [INFO] [TRAIN] epoch: 63, iter: 79450/160000, loss: 0.6040, lr: 0.000610, batch_cost: 0.2123, reader_cost: 0.00097, ips: 37.6753 samples/sec | ETA 04:45:04 2022-08-23 02:55:45 [INFO] [TRAIN] epoch: 63, iter: 79500/160000, loss: 0.5981, lr: 0.000609, batch_cost: 0.2380, reader_cost: 0.00107, ips: 33.6125 samples/sec | ETA 05:19:19 2022-08-23 02:55:55 [INFO] [TRAIN] epoch: 63, iter: 79550/160000, loss: 0.6041, lr: 0.000609, batch_cost: 0.2047, reader_cost: 0.00034, ips: 39.0911 samples/sec | ETA 04:34:24 2022-08-23 02:56:07 [INFO] [TRAIN] epoch: 64, iter: 79600/160000, loss: 0.6098, lr: 0.000609, batch_cost: 0.2375, reader_cost: 0.05924, ips: 33.6878 samples/sec | ETA 05:18:12 2022-08-23 02:56:17 [INFO] [TRAIN] epoch: 64, iter: 79650/160000, loss: 0.5577, lr: 0.000608, batch_cost: 0.1873, reader_cost: 0.00045, ips: 42.7036 samples/sec | ETA 04:10:52 2022-08-23 02:56:27 [INFO] [TRAIN] epoch: 
64, iter: 79700/160000, loss: 0.5555, lr: 0.000608, batch_cost: 0.2101, reader_cost: 0.00063, ips: 38.0849 samples/sec | ETA 04:41:07 2022-08-23 02:56:37 [INFO] [TRAIN] epoch: 64, iter: 79750/160000, loss: 0.6039, lr: 0.000608, batch_cost: 0.1997, reader_cost: 0.00107, ips: 40.0587 samples/sec | ETA 04:27:06 2022-08-23 02:56:47 [INFO] [TRAIN] epoch: 64, iter: 79800/160000, loss: 0.5682, lr: 0.000607, batch_cost: 0.2064, reader_cost: 0.00033, ips: 38.7672 samples/sec | ETA 04:35:50 2022-08-23 02:56:56 [INFO] [TRAIN] epoch: 64, iter: 79850/160000, loss: 0.5768, lr: 0.000607, batch_cost: 0.1649, reader_cost: 0.00063, ips: 48.5093 samples/sec | ETA 03:40:18 2022-08-23 02:57:05 [INFO] [TRAIN] epoch: 64, iter: 79900/160000, loss: 0.6438, lr: 0.000606, batch_cost: 0.1790, reader_cost: 0.00063, ips: 44.6977 samples/sec | ETA 03:58:56 2022-08-23 02:57:14 [INFO] [TRAIN] epoch: 64, iter: 79950/160000, loss: 0.5976, lr: 0.000606, batch_cost: 0.1928, reader_cost: 0.00065, ips: 41.5015 samples/sec | ETA 04:17:10 2022-08-23 02:57:24 [INFO] [TRAIN] epoch: 64, iter: 80000/160000, loss: 0.6188, lr: 0.000606, batch_cost: 0.1919, reader_cost: 0.00056, ips: 41.6849 samples/sec | ETA 04:15:53 2022-08-23 02:57:24 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 211s - batch_cost: 0.2110 - reader cost: 0.0011 2022-08-23 03:00:55 [INFO] [EVAL] #Images: 2000 mIoU: 0.3416 Acc: 0.7608 Kappa: 0.7429 Dice: 0.4734 2022-08-23 03:00:55 [INFO] [EVAL] Class IoU: [0.684 0.7717 0.9309 0.7225 0.6773 0.7626 0.7592 0.7744 0.5219 0.6281 0.4769 0.5431 0.6935 0.2704 0.2854 0.4254 0.5213 0.4191 0.6113 0.3902 0.7478 0.4469 0.6271 0.4961 0.3006 0.3753 0.4347 0.4319 0.4102 0.2893 0.2773 0.4791 0.2794 0.3696 0.2732 0.3664 0.4164 0.5671 0.2804 0.3683 0.1612 0.1148 0.3495 0.2455 0.3039 0.2709 0.2487 0.5193 0.5713 0.5666 0.5259 0.3707 0.1612 0.1907 0.7505 0.4665 0.8094 0.3492 0.504 0.2477 0.1368 0.2907 0.3504 0.2125 0.451 0.6796 0.2742 0.3857 0.1244 0.2958 0.4569 0.5353 0.3634 0.2592 0.4295 0.3243 0.354 0.2467 0.2417 0.0827 0.6126 0.3857 0.3221 0.0492 0.3739 0.5413 0.1244 0.0713 0.207 0.4802 0.3611 0.0344 0.1832 0.0889 0.0201 0.0102 0.0278 0.1418 0.2826 0.3985 0.0376 0.0217 0.2039 0.5023 0.0114 0.436 0.0529 0.4974 0.105 0.0977 0.102 0.3498 0.1418 0.5835 0.7792 0.0312 0.3236 0.5987 0.1174 0.2132 0.313 0.0056 0.225 0.1542 0.336 0.2644 0.4375 0.42 0.3541 0.3407 0.5428 0.0392 0.3477 0.2523 0.1913 0.1306 0.1336 0.0226 0.1818 0.3349 0.4454 0.0122 0.3722 0.1151 0.3074 0. 
0.3706 0.0319 0.1383 0.1689] 2022-08-23 03:00:55 [INFO] [EVAL] Class Precision: [0.7973 0.8525 0.9633 0.8097 0.7884 0.8521 0.8639 0.8325 0.6674 0.7833 0.7031 0.7029 0.7619 0.4551 0.5031 0.5968 0.6701 0.6835 0.7501 0.5986 0.8544 0.6496 0.7488 0.6313 0.4739 0.4614 0.5216 0.6732 0.6403 0.4254 0.3976 0.6155 0.435 0.5192 0.4437 0.4426 0.5812 0.7231 0.4462 0.6496 0.3728 0.279 0.5849 0.553 0.463 0.5656 0.4394 0.7786 0.6188 0.6677 0.74 0.5665 0.3193 0.6144 0.7718 0.6276 0.8364 0.7382 0.6981 0.413 0.1971 0.6248 0.5811 0.6786 0.5574 0.7692 0.4069 0.5375 0.2744 0.5059 0.6634 0.6451 0.6571 0.3373 0.6534 0.4874 0.4331 0.6088 0.522 0.1829 0.7608 0.621 0.7797 0.0925 0.5522 0.7412 0.3916 0.3596 0.3481 0.6363 0.4526 0.0359 0.4175 0.3561 0.0722 0.0345 0.3412 0.4016 0.421 0.5984 0.5028 0.0324 0.5733 0.8145 0.2477 0.5751 0.1071 0.7736 0.2102 0.2594 0.3012 0.8361 0.5457 0.7855 0.7843 0.1607 0.7146 0.6875 0.2554 0.5269 0.5985 0.0731 0.6947 0.6837 0.6675 0.6256 0.7671 0.5469 0.8819 0.6591 0.591 0.384 0.5256 0.6579 0.4432 0.3223 0.3716 0.1064 0.5051 0.6535 0.5046 0.0184 0.6716 0.7275 0.4521 0. 0.8942 0.3791 0.612 0.8456] 2022-08-23 03:00:55 [INFO] [EVAL] Class Recall: [0.828 0.8906 0.9652 0.8702 0.8277 0.879 0.8624 0.9172 0.7054 0.7603 0.5972 0.7048 0.8854 0.3998 0.3975 0.5969 0.7013 0.5201 0.7677 0.5284 0.857 0.5889 0.7943 0.6986 0.4512 0.668 0.723 0.5464 0.5331 0.4748 0.4782 0.6838 0.4386 0.5619 0.4155 0.6803 0.5948 0.7245 0.43 0.4596 0.2212 0.1632 0.4647 0.3063 0.4694 0.3421 0.3642 0.6092 0.8816 0.7891 0.6451 0.5175 0.2456 0.2166 0.9646 0.645 0.9617 0.3986 0.6445 0.3823 0.309 0.3521 0.4687 0.2363 0.7025 0.8536 0.4569 0.5773 0.1853 0.416 0.5947 0.7587 0.4484 0.5282 0.5563 0.4921 0.6598 0.2932 0.3105 0.1311 0.7587 0.5045 0.3543 0.0952 0.5365 0.6674 0.1542 0.0816 0.3381 0.6619 0.641 0.4562 0.2461 0.106 0.0272 0.0143 0.0294 0.1798 0.4623 0.5439 0.039 0.0615 0.2404 0.5672 0.0118 0.6431 0.0946 0.5821 0.1735 0.1355 0.1336 0.3756 0.1608 0.6941 0.9917 0.0373 0.3716 0.8225 0.1784 0.2636 0.3962 0.006 0.2497 0.1661 0.4036 0.3141 0.5045 0.6441 0.3717 0.4136 0.8694 0.0419 0.5068 0.2904 0.2518 0.18 0.1726 0.0279 0.2212 0.4072 0.7917 0.035 0.4549 0.1203 0.4899 0. 0.3876 0.0336 0.1515 0.1743] 2022-08-23 03:00:55 [INFO] [EVAL] The model with the best validation mIoU (0.3433) was saved at iter 79000. 
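The closing line of each evaluation is simple best-so-far bookkeeping on the validation mIoU. An illustrative sketch of that logic follows; it is not PaddleSeg's actual implementation.

class BestMIoUTracker:
    """Bookkeeping behind the 'best validation mIoU ... saved at iter ...' line."""

    def __init__(self):
        self.best_miou = float("-inf")
        self.best_iter = None

    def update(self, cur_iter, miou):
        """Return True when this evaluation improves on the best mIoU so far."""
        if miou > self.best_miou:
            self.best_miou, self.best_iter = miou, cur_iter
            return True  # the caller would save the best-model checkpoint here
        return False

    def report(self):
        return ("[EVAL] The model with the best validation mIoU "
                f"({self.best_miou:.4f}) was saved at iter {self.best_iter}.")

# Replaying the evaluations shown here: 0.3416 at iter 68000 stays best through
# the iter 74000-78000 passes, 0.3433 at iter 79000 replaces it, and the
# iter-80000 result (0.3416) leaves it unchanged, as the log lines report.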
2022-08-23 03:01:09 [INFO] [TRAIN] epoch: 64, iter: 80050/160000, loss: 0.6148, lr: 0.000605, batch_cost: 0.2682, reader_cost: 0.00568, ips: 29.8332 samples/sec | ETA 05:57:19 2022-08-23 03:01:23 [INFO] [TRAIN] epoch: 64, iter: 80100/160000, loss: 0.5946, lr: 0.000605, batch_cost: 0.2793, reader_cost: 0.00941, ips: 28.6405 samples/sec | ETA 06:11:58 2022-08-23 03:01:37 [INFO] [TRAIN] epoch: 64, iter: 80150/160000, loss: 0.6191, lr: 0.000605, batch_cost: 0.2809, reader_cost: 0.00452, ips: 28.4840 samples/sec | ETA 06:13:46 2022-08-23 03:01:49 [INFO] [TRAIN] epoch: 64, iter: 80200/160000, loss: 0.5706, lr: 0.000604, batch_cost: 0.2375, reader_cost: 0.01039, ips: 33.6791 samples/sec | ETA 05:15:55 2022-08-23 03:02:02 [INFO] [TRAIN] epoch: 64, iter: 80250/160000, loss: 0.5997, lr: 0.000604, batch_cost: 0.2709, reader_cost: 0.00797, ips: 29.5275 samples/sec | ETA 06:00:07 2022-08-23 03:02:16 [INFO] [TRAIN] epoch: 64, iter: 80300/160000, loss: 0.5420, lr: 0.000603, batch_cost: 0.2798, reader_cost: 0.01535, ips: 28.5901 samples/sec | ETA 06:11:41 2022-08-23 03:02:27 [INFO] [TRAIN] epoch: 64, iter: 80350/160000, loss: 0.6137, lr: 0.000603, batch_cost: 0.2253, reader_cost: 0.01299, ips: 35.5020 samples/sec | ETA 04:59:08 2022-08-23 03:02:38 [INFO] [TRAIN] epoch: 64, iter: 80400/160000, loss: 0.6007, lr: 0.000603, batch_cost: 0.2060, reader_cost: 0.00076, ips: 38.8307 samples/sec | ETA 04:33:19 2022-08-23 03:02:48 [INFO] [TRAIN] epoch: 64, iter: 80450/160000, loss: 0.5664, lr: 0.000602, batch_cost: 0.2142, reader_cost: 0.00043, ips: 37.3400 samples/sec | ETA 04:44:03 2022-08-23 03:02:57 [INFO] [TRAIN] epoch: 64, iter: 80500/160000, loss: 0.5272, lr: 0.000602, batch_cost: 0.1769, reader_cost: 0.00042, ips: 45.2202 samples/sec | ETA 03:54:24 2022-08-23 03:03:06 [INFO] [TRAIN] epoch: 64, iter: 80550/160000, loss: 0.6104, lr: 0.000602, batch_cost: 0.1726, reader_cost: 0.00048, ips: 46.3521 samples/sec | ETA 03:48:32 2022-08-23 03:03:14 [INFO] [TRAIN] epoch: 64, iter: 80600/160000, loss: 0.6251, lr: 0.000601, batch_cost: 0.1602, reader_cost: 0.00049, ips: 49.9394 samples/sec | ETA 03:31:59 2022-08-23 03:03:24 [INFO] [TRAIN] epoch: 64, iter: 80650/160000, loss: 0.5733, lr: 0.000601, batch_cost: 0.1994, reader_cost: 0.00062, ips: 40.1148 samples/sec | ETA 04:23:44 2022-08-23 03:03:32 [INFO] [TRAIN] epoch: 64, iter: 80700/160000, loss: 0.6038, lr: 0.000600, batch_cost: 0.1690, reader_cost: 0.00052, ips: 47.3308 samples/sec | ETA 03:43:23 2022-08-23 03:03:43 [INFO] [TRAIN] epoch: 64, iter: 80750/160000, loss: 0.5624, lr: 0.000600, batch_cost: 0.2162, reader_cost: 0.00036, ips: 37.0044 samples/sec | ETA 04:45:33 2022-08-23 03:03:52 [INFO] [TRAIN] epoch: 64, iter: 80800/160000, loss: 0.6883, lr: 0.000600, batch_cost: 0.1840, reader_cost: 0.00054, ips: 43.4696 samples/sec | ETA 04:02:55 2022-08-23 03:04:06 [INFO] [TRAIN] epoch: 65, iter: 80850/160000, loss: 0.5649, lr: 0.000599, batch_cost: 0.2679, reader_cost: 0.07939, ips: 29.8655 samples/sec | ETA 05:53:21 2022-08-23 03:04:15 [INFO] [TRAIN] epoch: 65, iter: 80900/160000, loss: 0.5687, lr: 0.000599, batch_cost: 0.1774, reader_cost: 0.00043, ips: 45.1039 samples/sec | ETA 03:53:49 2022-08-23 03:04:24 [INFO] [TRAIN] epoch: 65, iter: 80950/160000, loss: 0.5909, lr: 0.000598, batch_cost: 0.1897, reader_cost: 0.00050, ips: 42.1673 samples/sec | ETA 04:09:57 2022-08-23 03:04:34 [INFO] [TRAIN] epoch: 65, iter: 81000/160000, loss: 0.6230, lr: 0.000598, batch_cost: 0.1907, reader_cost: 0.00092, ips: 41.9481 samples/sec | ETA 04:11:06 2022-08-23 03:04:34 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 240s - batch_cost: 0.2400 - reader cost: 6.6221e-04 2022-08-23 03:08:34 [INFO] [EVAL] #Images: 2000 mIoU: 0.3381 Acc: 0.7617 Kappa: 0.7435 Dice: 0.4687 2022-08-23 03:08:34 [INFO] [EVAL] Class IoU: [0.6796 0.7666 0.9317 0.7241 0.6816 0.7538 0.7662 0.7821 0.5253 0.6207 0.4977 0.5339 0.6914 0.288 0.2872 0.4283 0.509 0.437 0.6144 0.4082 0.7378 0.4311 0.6237 0.49 0.3038 0.3434 0.4691 0.4561 0.382 0.2881 0.2161 0.4886 0.2911 0.3562 0.2857 0.4214 0.4242 0.5631 0.2565 0.3669 0.134 0.0993 0.3532 0.259 0.3244 0.2776 0.2195 0.5146 0.5385 0.5452 0.5124 0.3583 0.1966 0.2326 0.6705 0.4558 0.8516 0.3934 0.4304 0.29 0.0812 0.4328 0.3059 0.1934 0.4434 0.6933 0.2625 0.3828 0.0499 0.2881 0.4693 0.5372 0.4026 0.2063 0.422 0.3288 0.3852 0.2481 0.2164 0.1759 0.5903 0.361 0.3391 0.0284 0.1109 0.5332 0.0822 0.074 0.2196 0.4575 0.3968 0.0073 0.2061 0.1216 0.0058 0.0131 0.0267 0.1224 0.2915 0.3713 0.0517 0.0262 0.2251 0.2426 0.0381 0.4379 0.0446 0.5061 0.1189 0.1812 0.1033 0.3245 0.1498 0.5483 0.834 0.0101 0.2961 0.6384 0.0981 0.2422 0.3379 0.0079 0.1973 0.1072 0.3473 0.2268 0.3655 0.4207 0.4521 0.3137 0.5591 0.0169 0.3463 0.2519 0.1511 0.1364 0.1556 0.0177 0.1955 0.3588 0.4304 0.0015 0.3547 0.3084 0.2622 0. 0.3253 0.0327 0.133 0.1239] 2022-08-23 03:08:34 [INFO] [EVAL] Class Precision: [0.7756 0.8482 0.9623 0.8194 0.7734 0.8743 0.8666 0.8597 0.6796 0.7524 0.6935 0.7091 0.7704 0.4995 0.5583 0.5537 0.6758 0.6791 0.7882 0.6229 0.8213 0.6273 0.7288 0.5774 0.5208 0.4826 0.526 0.7084 0.6912 0.4087 0.4912 0.6324 0.4498 0.4948 0.4231 0.5975 0.5867 0.7496 0.5231 0.6499 0.3927 0.2735 0.6611 0.4817 0.4937 0.4458 0.4155 0.6678 0.6186 0.6303 0.7279 0.4936 0.3356 0.5827 0.6864 0.5845 0.8949 0.6931 0.7912 0.573 0.2076 0.5557 0.4194 0.7006 0.5612 0.8282 0.3768 0.5398 0.1514 0.4395 0.6676 0.6446 0.6511 0.2862 0.6177 0.5091 0.4577 0.5027 0.6362 0.3277 0.6936 0.655 0.7975 0.0603 0.2965 0.7257 0.5945 0.4121 0.3491 0.6749 0.5525 0.0164 0.4517 0.3487 0.0473 0.0374 0.0736 0.3591 0.4565 0.6817 0.4803 0.0387 0.5536 0.7632 0.3532 0.5946 0.0864 0.8489 0.2301 0.3362 0.2875 0.4937 0.4404 0.7304 0.8429 0.1223 0.7474 0.673 0.1629 0.6123 0.5576 0.0854 0.4419 0.7211 0.7052 0.6351 0.7134 0.6036 0.8071 0.5632 0.6158 0.2992 0.5028 0.4136 0.5319 0.245 0.313 0.0846 0.5107 0.6276 0.5089 0.0026 0.6011 0.6453 0.5621 0. 0.8939 0.3136 0.5527 0.8925] 2022-08-23 03:08:34 [INFO] [EVAL] Class Recall: [0.8459 0.8886 0.9669 0.8616 0.8517 0.8455 0.8687 0.8965 0.6983 0.7799 0.638 0.6835 0.8708 0.4048 0.3716 0.6541 0.6735 0.5508 0.7359 0.5423 0.8789 0.5796 0.8122 0.7641 0.4217 0.5435 0.8127 0.5615 0.4606 0.4939 0.2784 0.6824 0.4521 0.5597 0.4682 0.5885 0.605 0.6936 0.3348 0.4573 0.1691 0.1349 0.4313 0.3591 0.4861 0.4239 0.3176 0.6916 0.8061 0.8014 0.6338 0.5666 0.322 0.2791 0.9666 0.6744 0.9463 0.4764 0.4855 0.3699 0.1176 0.6618 0.5305 0.2108 0.6786 0.8098 0.4639 0.5683 0.0693 0.4553 0.6123 0.7632 0.5133 0.4247 0.5712 0.4813 0.7087 0.3288 0.247 0.2752 0.7985 0.4457 0.371 0.0508 0.1505 0.6678 0.087 0.0827 0.3718 0.5868 0.5848 0.013 0.2748 0.1573 0.0066 0.0197 0.0402 0.1567 0.4464 0.4492 0.0548 0.0747 0.275 0.2623 0.0409 0.6243 0.0845 0.5562 0.1975 0.282 0.1389 0.4864 0.1851 0.6875 0.9876 0.0109 0.329 0.9256 0.1977 0.286 0.4616 0.0086 0.2627 0.1118 0.4062 0.2608 0.4283 0.5813 0.5069 0.4146 0.8586 0.0175 0.5267 0.3919 0.1742 0.2352 0.2362 0.0218 0.2406 0.4558 0.7362 0.0033 0.4638 0.3714 0.3295 0. 
0.3384 0.0352 0.149 0.1258] 2022-08-23 03:08:34 [INFO] [EVAL] The model with the best validation mIoU (0.3433) was saved at iter 79000. 2022-08-23 03:08:48 [INFO] [TRAIN] epoch: 65, iter: 81050/160000, loss: 0.5705, lr: 0.000598, batch_cost: 0.2820, reader_cost: 0.00378, ips: 28.3687 samples/sec | ETA 06:11:03 2022-08-23 03:09:02 [INFO] [TRAIN] epoch: 65, iter: 81100/160000, loss: 0.5903, lr: 0.000597, batch_cost: 0.2803, reader_cost: 0.00252, ips: 28.5423 samples/sec | ETA 06:08:34 2022-08-23 03:09:16 [INFO] [TRAIN] epoch: 65, iter: 81150/160000, loss: 0.5833, lr: 0.000597, batch_cost: 0.2719, reader_cost: 0.00091, ips: 29.4256 samples/sec | ETA 05:57:17 2022-08-23 03:09:28 [INFO] [TRAIN] epoch: 65, iter: 81200/160000, loss: 0.5901, lr: 0.000597, batch_cost: 0.2431, reader_cost: 0.00124, ips: 32.9096 samples/sec | ETA 05:19:15 2022-08-23 03:09:39 [INFO] [TRAIN] epoch: 65, iter: 81250/160000, loss: 0.6001, lr: 0.000596, batch_cost: 0.2119, reader_cost: 0.00361, ips: 37.7459 samples/sec | ETA 04:38:10 2022-08-23 03:09:48 [INFO] [TRAIN] epoch: 65, iter: 81300/160000, loss: 0.5981, lr: 0.000596, batch_cost: 0.1888, reader_cost: 0.00056, ips: 42.3642 samples/sec | ETA 04:07:41 2022-08-23 03:09:58 [INFO] [TRAIN] epoch: 65, iter: 81350/160000, loss: 0.6015, lr: 0.000595, batch_cost: 0.1856, reader_cost: 0.00073, ips: 43.1050 samples/sec | ETA 04:03:16 2022-08-23 03:10:06 [INFO] [TRAIN] epoch: 65, iter: 81400/160000, loss: 0.6152, lr: 0.000595, batch_cost: 0.1667, reader_cost: 0.00035, ips: 47.9807 samples/sec | ETA 03:38:25 2022-08-23 03:10:16 [INFO] [TRAIN] epoch: 65, iter: 81450/160000, loss: 0.5854, lr: 0.000595, batch_cost: 0.2100, reader_cost: 0.00054, ips: 38.0866 samples/sec | ETA 04:34:59 2022-08-23 03:10:26 [INFO] [TRAIN] epoch: 65, iter: 81500/160000, loss: 0.6571, lr: 0.000594, batch_cost: 0.1947, reader_cost: 0.00066, ips: 41.0805 samples/sec | ETA 04:14:47 2022-08-23 03:10:35 [INFO] [TRAIN] epoch: 65, iter: 81550/160000, loss: 0.6246, lr: 0.000594, batch_cost: 0.1839, reader_cost: 0.00040, ips: 43.4979 samples/sec | ETA 04:00:28 2022-08-23 03:10:45 [INFO] [TRAIN] epoch: 65, iter: 81600/160000, loss: 0.6299, lr: 0.000594, batch_cost: 0.1895, reader_cost: 0.00062, ips: 42.2261 samples/sec | ETA 04:07:33 2022-08-23 03:10:55 [INFO] [TRAIN] epoch: 65, iter: 81650/160000, loss: 0.5605, lr: 0.000593, batch_cost: 0.2055, reader_cost: 0.00074, ips: 38.9316 samples/sec | ETA 04:28:20 2022-08-23 03:11:03 [INFO] [TRAIN] epoch: 65, iter: 81700/160000, loss: 0.5742, lr: 0.000593, batch_cost: 0.1681, reader_cost: 0.00039, ips: 47.5893 samples/sec | ETA 03:39:22 2022-08-23 03:11:13 [INFO] [TRAIN] epoch: 65, iter: 81750/160000, loss: 0.5736, lr: 0.000592, batch_cost: 0.1878, reader_cost: 0.00055, ips: 42.5924 samples/sec | ETA 04:04:57 2022-08-23 03:11:22 [INFO] [TRAIN] epoch: 65, iter: 81800/160000, loss: 0.5757, lr: 0.000592, batch_cost: 0.1853, reader_cost: 0.00042, ips: 43.1835 samples/sec | ETA 04:01:27 2022-08-23 03:11:31 [INFO] [TRAIN] epoch: 65, iter: 81850/160000, loss: 0.6112, lr: 0.000592, batch_cost: 0.1783, reader_cost: 0.00058, ips: 44.8715 samples/sec | ETA 03:52:13 2022-08-23 03:11:40 [INFO] [TRAIN] epoch: 65, iter: 81900/160000, loss: 0.5681, lr: 0.000591, batch_cost: 0.1819, reader_cost: 0.00053, ips: 43.9741 samples/sec | ETA 03:56:48 2022-08-23 03:11:50 [INFO] [TRAIN] epoch: 65, iter: 81950/160000, loss: 0.5723, lr: 0.000591, batch_cost: 0.1903, reader_cost: 0.00040, ips: 42.0309 samples/sec | ETA 04:07:35 2022-08-23 03:11:59 [INFO] [TRAIN] epoch: 65, iter: 82000/160000, loss: 
0.5841, lr: 0.000591, batch_cost: 0.1816, reader_cost: 0.00061, ips: 44.0540 samples/sec | ETA 03:56:04 2022-08-23 03:11:59 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 233s - batch_cost: 0.2333 - reader cost: 8.7873e-04 2022-08-23 03:15:52 [INFO] [EVAL] #Images: 2000 mIoU: 0.3374 Acc: 0.7617 Kappa: 0.7436 Dice: 0.4682 2022-08-23 03:15:52 [INFO] [EVAL] Class IoU: [0.6827 0.7747 0.9304 0.7235 0.6801 0.7553 0.769 0.7809 0.5202 0.5973 0.5054 0.5552 0.6909 0.3062 0.299 0.4419 0.4945 0.43 0.6008 0.4038 0.7424 0.4649 0.6299 0.4678 0.3317 0.3157 0.4725 0.4338 0.4082 0.2247 0.2704 0.4811 0.3016 0.3693 0.2714 0.434 0.4214 0.5408 0.247 0.3736 0.2038 0.0967 0.3395 0.2419 0.3233 0.2441 0.2446 0.4774 0.4756 0.5436 0.5261 0.3638 0.2025 0.2234 0.6123 0.4033 0.8643 0.4078 0.466 0.2883 0.0777 0.3612 0.3399 0.2444 0.4379 0.6558 0.2938 0.4344 0.0488 0.3467 0.4775 0.5438 0.4011 0.2055 0.4389 0.3449 0.4357 0.2762 0.1714 0.1599 0.5684 0.3847 0.28 0.0175 0.3069 0.5631 0.0807 0.0917 0.2116 0.492 0.4101 0.0566 0.1996 0.0969 0.0284 0.0083 0.0126 0.1128 0.3031 0.4121 0.0611 0.0154 0.1315 0.0566 0.0521 0.3571 0.0544 0.4978 0.1061 0.2783 0.0708 0.437 0.1645 0.5074 0.7821 0.0132 0.1934 0.5746 0.1045 0.0875 0.4017 0.0026 0.2385 0.1637 0.2487 0.3151 0.4287 0.36 0.2007 0.366 0.6413 0.002 0.3165 0.2216 0.1549 0.1176 0.1413 0.0261 0.2138 0.3459 0.3785 0.0238 0.3925 0.4128 0.2802 0. 0.296 0.0288 0.138 0.1846] 2022-08-23 03:15:52 [INFO] [EVAL] Class Precision: [0.7893 0.8503 0.9612 0.8167 0.774 0.8527 0.9007 0.84 0.6911 0.7966 0.6642 0.6716 0.7621 0.4511 0.5306 0.5943 0.6871 0.6732 0.7834 0.6124 0.8368 0.637 0.779 0.6825 0.536 0.4918 0.6115 0.7655 0.631 0.3431 0.4038 0.7421 0.4547 0.5165 0.3713 0.6203 0.5857 0.7225 0.4714 0.6179 0.4483 0.2215 0.5424 0.6237 0.5158 0.4708 0.4375 0.6181 0.6741 0.6178 0.6868 0.5008 0.3415 0.4365 0.6228 0.5041 0.9277 0.6388 0.5606 0.5138 0.1193 0.4871 0.6146 0.5063 0.5513 0.8523 0.4934 0.6178 0.1908 0.5684 0.6387 0.6819 0.6138 0.281 0.6826 0.5546 0.5146 0.5151 0.7052 0.3596 0.644 0.635 0.8104 0.0437 0.3987 0.7241 0.4151 0.3441 0.4366 0.7501 0.5699 0.0792 0.3987 0.3438 0.1372 0.0323 0.3645 0.2948 0.4914 0.638 0.3616 0.0297 0.6382 0.5453 0.5163 0.439 0.1851 0.8483 0.2561 0.3527 0.2849 0.9364 0.4599 0.7266 0.7953 0.1612 0.7275 0.6804 0.1347 0.5444 0.5416 0.0761 0.653 0.5927 0.6405 0.5553 0.8548 0.4453 0.318 0.6476 0.786 0.0515 0.5293 0.6344 0.4951 0.3391 0.3905 0.0838 0.4741 0.6446 0.4812 0.0502 0.6349 0.6653 0.6526 0. 
0.9113 0.4589 0.5244 0.7451] 2022-08-23 03:15:52 [INFO] [EVAL] Class Recall: [0.8349 0.8971 0.9668 0.8638 0.8486 0.8687 0.8403 0.9174 0.6778 0.7047 0.6788 0.7621 0.8809 0.4881 0.4065 0.6329 0.6383 0.5434 0.7205 0.5424 0.8681 0.6325 0.767 0.5979 0.4652 0.4686 0.6752 0.5004 0.5362 0.3944 0.4501 0.5776 0.4724 0.5644 0.502 0.591 0.6004 0.6825 0.3416 0.4858 0.2721 0.1466 0.4758 0.2833 0.4642 0.3364 0.3567 0.6771 0.6176 0.819 0.6922 0.5709 0.3323 0.314 0.9732 0.6686 0.9267 0.53 0.7342 0.3965 0.1823 0.583 0.432 0.3208 0.6803 0.7399 0.4207 0.594 0.0616 0.4707 0.6542 0.7287 0.5365 0.4335 0.5514 0.4771 0.7399 0.3732 0.1846 0.2236 0.8288 0.4939 0.2997 0.0284 0.5712 0.7169 0.091 0.1111 0.2912 0.5885 0.5939 0.1659 0.2855 0.1188 0.0346 0.011 0.0129 0.1545 0.4417 0.5378 0.0685 0.0309 0.1421 0.0594 0.0548 0.6569 0.0715 0.5464 0.1534 0.5686 0.0861 0.4503 0.2039 0.6272 0.9792 0.0141 0.2085 0.7869 0.3182 0.0944 0.6087 0.0027 0.2732 0.1844 0.2891 0.4214 0.4624 0.6525 0.3525 0.4571 0.7769 0.0021 0.4404 0.2541 0.184 0.1525 0.1812 0.0365 0.2803 0.4274 0.6395 0.0433 0.5069 0.5209 0.3293 0. 0.3048 0.0299 0.1578 0.1971] 2022-08-23 03:15:52 [INFO] [EVAL] The model with the best validation mIoU (0.3433) was saved at iter 79000. 2022-08-23 03:16:04 [INFO] [TRAIN] epoch: 65, iter: 82050/160000, loss: 0.5948, lr: 0.000590, batch_cost: 0.2232, reader_cost: 0.00335, ips: 35.8485 samples/sec | ETA 04:49:55 2022-08-23 03:16:19 [INFO] [TRAIN] epoch: 66, iter: 82100/160000, loss: 0.5449, lr: 0.000590, batch_cost: 0.3009, reader_cost: 0.06410, ips: 26.5891 samples/sec | ETA 06:30:38 2022-08-23 03:16:30 [INFO] [TRAIN] epoch: 66, iter: 82150/160000, loss: 0.5778, lr: 0.000589, batch_cost: 0.2334, reader_cost: 0.00747, ips: 34.2757 samples/sec | ETA 05:02:50 2022-08-23 03:16:40 [INFO] [TRAIN] epoch: 66, iter: 82200/160000, loss: 0.5799, lr: 0.000589, batch_cost: 0.1939, reader_cost: 0.00034, ips: 41.2593 samples/sec | ETA 04:11:25 2022-08-23 03:16:50 [INFO] [TRAIN] epoch: 66, iter: 82250/160000, loss: 0.5559, lr: 0.000589, batch_cost: 0.1955, reader_cost: 0.00049, ips: 40.9238 samples/sec | ETA 04:13:18 2022-08-23 03:17:00 [INFO] [TRAIN] epoch: 66, iter: 82300/160000, loss: 0.5662, lr: 0.000588, batch_cost: 0.1990, reader_cost: 0.00046, ips: 40.2056 samples/sec | ETA 04:17:40 2022-08-23 03:17:09 [INFO] [TRAIN] epoch: 66, iter: 82350/160000, loss: 0.5862, lr: 0.000588, batch_cost: 0.1865, reader_cost: 0.00143, ips: 42.8848 samples/sec | ETA 04:01:25 2022-08-23 03:17:19 [INFO] [TRAIN] epoch: 66, iter: 82400/160000, loss: 0.5835, lr: 0.000588, batch_cost: 0.1935, reader_cost: 0.00050, ips: 41.3343 samples/sec | ETA 04:10:18 2022-08-23 03:17:28 [INFO] [TRAIN] epoch: 66, iter: 82450/160000, loss: 0.5776, lr: 0.000587, batch_cost: 0.1936, reader_cost: 0.00049, ips: 41.3302 samples/sec | ETA 04:10:10 2022-08-23 03:17:37 [INFO] [TRAIN] epoch: 66, iter: 82500/160000, loss: 0.5496, lr: 0.000587, batch_cost: 0.1824, reader_cost: 0.00081, ips: 43.8632 samples/sec | ETA 03:55:34 2022-08-23 03:17:47 [INFO] [TRAIN] epoch: 66, iter: 82550/160000, loss: 0.5832, lr: 0.000586, batch_cost: 0.1973, reader_cost: 0.00054, ips: 40.5387 samples/sec | ETA 04:14:44 2022-08-23 03:17:57 [INFO] [TRAIN] epoch: 66, iter: 82600/160000, loss: 0.6029, lr: 0.000586, batch_cost: 0.1935, reader_cost: 0.00097, ips: 41.3535 samples/sec | ETA 04:09:33 2022-08-23 03:18:07 [INFO] [TRAIN] epoch: 66, iter: 82650/160000, loss: 0.5932, lr: 0.000586, batch_cost: 0.2041, reader_cost: 0.00053, ips: 39.2013 samples/sec | ETA 04:23:05 2022-08-23 03:18:16 [INFO] [TRAIN] epoch: 66, 
iter: 82700/160000, loss: 0.6294, lr: 0.000585, batch_cost: 0.1676, reader_cost: 0.00091, ips: 47.7404 samples/sec | ETA 03:35:53 2022-08-23 03:18:25 [INFO] [TRAIN] epoch: 66, iter: 82750/160000, loss: 0.5890, lr: 0.000585, batch_cost: 0.1811, reader_cost: 0.00049, ips: 44.1807 samples/sec | ETA 03:53:08 2022-08-23 03:18:34 [INFO] [TRAIN] epoch: 66, iter: 82800/160000, loss: 0.5919, lr: 0.000584, batch_cost: 0.1882, reader_cost: 0.00050, ips: 42.5057 samples/sec | ETA 04:02:09 2022-08-23 03:18:44 [INFO] [TRAIN] epoch: 66, iter: 82850/160000, loss: 0.5946, lr: 0.000584, batch_cost: 0.1969, reader_cost: 0.00036, ips: 40.6241 samples/sec | ETA 04:13:12 2022-08-23 03:18:54 [INFO] [TRAIN] epoch: 66, iter: 82900/160000, loss: 0.5881, lr: 0.000584, batch_cost: 0.2040, reader_cost: 0.00062, ips: 39.2251 samples/sec | ETA 04:22:04 2022-08-23 03:19:03 [INFO] [TRAIN] epoch: 66, iter: 82950/160000, loss: 0.6121, lr: 0.000583, batch_cost: 0.1846, reader_cost: 0.00042, ips: 43.3395 samples/sec | ETA 03:57:02 2022-08-23 03:19:13 [INFO] [TRAIN] epoch: 66, iter: 83000/160000, loss: 0.5579, lr: 0.000583, batch_cost: 0.1828, reader_cost: 0.00048, ips: 43.7697 samples/sec | ETA 03:54:33 2022-08-23 03:19:13 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 162s - batch_cost: 0.1622 - reader cost: 7.3907e-04 2022-08-23 03:21:55 [INFO] [EVAL] #Images: 2000 mIoU: 0.3421 Acc: 0.7624 Kappa: 0.7442 Dice: 0.4732 2022-08-23 03:21:55 [INFO] [EVAL] Class IoU: [0.686 0.7707 0.9307 0.7282 0.6758 0.7497 0.7694 0.7762 0.5219 0.6154 0.4778 0.5443 0.6934 0.2984 0.3109 0.43 0.4989 0.4198 0.6063 0.4053 0.7334 0.4451 0.6409 0.4744 0.3511 0.1962 0.4141 0.4668 0.4084 0.2009 0.2555 0.4893 0.2574 0.3423 0.2759 0.4058 0.4291 0.5483 0.261 0.3946 0.2372 0.0726 0.3459 0.2522 0.2853 0.2267 0.2952 0.5028 0.6186 0.5426 0.4776 0.3405 0.2112 0.2133 0.6948 0.4964 0.8413 0.3868 0.4203 0.3166 0.0682 0.3715 0.3475 0.2144 0.4001 0.6817 0.3045 0.4234 0.0518 0.319 0.4937 0.5338 0.3558 0.2124 0.4308 0.3661 0.4482 0.2557 0.22 0.1464 0.6029 0.392 0.3686 0.0197 0.127 0.5423 0.1056 0.0695 0.2358 0.4846 0.3835 0.0727 0.2363 0.0903 0.0216 0.0133 0.0282 0.1405 0.2825 0.3564 0.136 0.0349 0.1763 0.7117 0.0503 0.3592 0.0755 0.5084 0.1009 0.0615 0.1262 0.4092 0.126 0.6269 0.8081 0.0268 0.3237 0.5987 0.081 0.1861 0.3436 0.0072 0.2416 0.114 0.2757 0.3008 0.4342 0.4318 0.4418 0.3021 0.613 0.0401 0.2943 0.2179 0.1726 0.1168 0.1495 0.0165 0.2193 0.346 0.3519 0.0177 0.2872 0.2243 0.2488 0. 
0.3237 0.0217 0.1561 0.1885] 2022-08-23 03:21:55 [INFO] [EVAL] Class Precision: [0.7941 0.8279 0.9642 0.823 0.762 0.852 0.8655 0.8387 0.6723 0.7333 0.6998 0.7191 0.7808 0.4797 0.5459 0.6105 0.7019 0.6675 0.7687 0.6064 0.8186 0.6638 0.8237 0.578 0.5049 0.4092 0.5334 0.6463 0.7422 0.3587 0.4221 0.7047 0.5062 0.4569 0.4094 0.5759 0.5641 0.8384 0.4788 0.6122 0.4185 0.1983 0.5347 0.5888 0.3848 0.4159 0.4327 0.6792 0.7525 0.5984 0.7248 0.4288 0.3439 0.5739 0.7132 0.6589 0.888 0.671 0.6408 0.4596 0.1124 0.5881 0.5122 0.6471 0.4631 0.7888 0.4088 0.616 0.3535 0.5094 0.6616 0.619 0.5457 0.286 0.7331 0.535 0.5249 0.5433 0.6832 0.3881 0.7126 0.694 0.7579 0.0647 0.3178 0.7463 0.4212 0.3804 0.4372 0.7298 0.5014 0.2555 0.4597 0.3039 0.1217 0.0403 0.126 0.4319 0.4805 0.6732 0.4259 0.0918 0.4976 0.8268 0.4546 0.4221 0.1403 0.8358 0.2404 0.145 0.3938 0.6251 0.7199 0.7007 0.8186 0.2076 0.6404 0.7047 0.1191 0.4789 0.6265 0.079 0.6359 0.7497 0.6865 0.6993 0.8386 0.6037 0.8109 0.4592 0.7058 0.3555 0.4456 0.5622 0.6033 0.3433 0.3156 0.0816 0.4954 0.6767 0.3892 0.0286 0.6264 0.6435 0.7699 0. 0.8778 0.4754 0.6615 0.5242] 2022-08-23 03:21:55 [INFO] [EVAL] Class Recall: [0.8345 0.9177 0.9641 0.8633 0.8566 0.862 0.8738 0.9124 0.7 0.7928 0.6011 0.6912 0.861 0.4412 0.4194 0.5926 0.633 0.5309 0.7416 0.55 0.8758 0.5747 0.7428 0.7257 0.5355 0.2738 0.6492 0.6269 0.4759 0.3134 0.3928 0.6155 0.3437 0.5771 0.4583 0.5788 0.642 0.6131 0.3646 0.5261 0.3538 0.1028 0.4949 0.3061 0.5244 0.3326 0.4816 0.6594 0.7766 0.8534 0.5834 0.6232 0.3538 0.2534 0.9642 0.668 0.9412 0.4773 0.5499 0.5043 0.1477 0.5021 0.5193 0.2428 0.7462 0.834 0.5442 0.5753 0.0572 0.4605 0.6606 0.795 0.5056 0.4523 0.5109 0.5369 0.7542 0.3258 0.245 0.1903 0.7966 0.4739 0.4178 0.0275 0.1746 0.6649 0.1236 0.0784 0.3386 0.5906 0.6199 0.0922 0.3273 0.1139 0.0256 0.0195 0.0351 0.1723 0.4068 0.4309 0.1665 0.0533 0.2144 0.8364 0.0535 0.707 0.1406 0.5648 0.148 0.0964 0.1567 0.5423 0.1325 0.8563 0.9843 0.0299 0.3956 0.7991 0.2019 0.2334 0.4322 0.0078 0.2804 0.1185 0.3155 0.3455 0.4738 0.6027 0.4925 0.469 0.8233 0.0432 0.4643 0.2624 0.1947 0.1503 0.2213 0.0203 0.2824 0.4145 0.7858 0.0442 0.3465 0.2562 0.2688 0. 0.3389 0.0222 0.1696 0.2274] 2022-08-23 03:21:55 [INFO] [EVAL] The model with the best validation mIoU (0.3433) was saved at iter 79000. 
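Note on the [EVAL] Class IoU / Class Precision / Class Recall arrays above: for each of the 150 ADE20K classes the three numbers are tied together, since IoU = TP/(TP+FP+FN), Precision = TP/(TP+FP) and Recall = TP/(TP+FN), which gives IoU = P·R/(P + R − P·R). A minimal NumPy sketch (not PaddleSeg's evaluation code) that checks this against the first class of the iter-83000 evaluation logged above:

import numpy as np

def iou_from_pr(precision, recall):
    # Per-class identity: IoU = P*R / (P + R - P*R), since
    # IoU = TP/(TP+FP+FN), P = TP/(TP+FP), R = TP/(TP+FN).
    p = np.asarray(precision, dtype=float)
    r = np.asarray(recall, dtype=float)
    return p * r / (p + r - p * r)

# First entry of the iter-83000 evaluation: P = 0.7941, R = 0.8345.
print(iou_from_pr(0.7941, 0.8345))  # ~0.686, matching the logged Class IoU[0]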
2022-08-23 03:22:05 [INFO] [TRAIN] epoch: 66, iter: 83050/160000, loss: 0.5745, lr: 0.000583, batch_cost: 0.2009, reader_cost: 0.00381, ips: 39.8136 samples/sec | ETA 04:17:42 2022-08-23 03:22:15 [INFO] [TRAIN] epoch: 66, iter: 83100/160000, loss: 0.5680, lr: 0.000582, batch_cost: 0.1909, reader_cost: 0.00146, ips: 41.9132 samples/sec | ETA 04:04:37 2022-08-23 03:22:25 [INFO] [TRAIN] epoch: 66, iter: 83150/160000, loss: 0.5853, lr: 0.000582, batch_cost: 0.1992, reader_cost: 0.00179, ips: 40.1636 samples/sec | ETA 04:15:07 2022-08-23 03:22:34 [INFO] [TRAIN] epoch: 66, iter: 83200/160000, loss: 0.5578, lr: 0.000581, batch_cost: 0.1969, reader_cost: 0.00047, ips: 40.6372 samples/sec | ETA 04:11:59 2022-08-23 03:22:45 [INFO] [TRAIN] epoch: 66, iter: 83250/160000, loss: 0.5866, lr: 0.000581, batch_cost: 0.2024, reader_cost: 0.00054, ips: 39.5169 samples/sec | ETA 04:18:57 2022-08-23 03:22:53 [INFO] [TRAIN] epoch: 66, iter: 83300/160000, loss: 0.5793, lr: 0.000581, batch_cost: 0.1726, reader_cost: 0.00078, ips: 46.3468 samples/sec | ETA 03:40:39 2022-08-23 03:23:03 [INFO] [TRAIN] epoch: 66, iter: 83350/160000, loss: 0.5765, lr: 0.000580, batch_cost: 0.2003, reader_cost: 0.00060, ips: 39.9342 samples/sec | ETA 04:15:55 2022-08-23 03:23:14 [INFO] [TRAIN] epoch: 67, iter: 83400/160000, loss: 0.6258, lr: 0.000580, batch_cost: 0.2257, reader_cost: 0.03813, ips: 35.4468 samples/sec | ETA 04:48:07 2022-08-23 03:23:25 [INFO] [TRAIN] epoch: 67, iter: 83450/160000, loss: 0.5616, lr: 0.000580, batch_cost: 0.2085, reader_cost: 0.00077, ips: 38.3682 samples/sec | ETA 04:26:01 2022-08-23 03:23:34 [INFO] [TRAIN] epoch: 67, iter: 83500/160000, loss: 0.5904, lr: 0.000579, batch_cost: 0.1820, reader_cost: 0.00046, ips: 43.9639 samples/sec | ETA 03:52:00 2022-08-23 03:23:44 [INFO] [TRAIN] epoch: 67, iter: 83550/160000, loss: 0.5813, lr: 0.000579, batch_cost: 0.1933, reader_cost: 0.00145, ips: 41.3915 samples/sec | ETA 04:06:15 2022-08-23 03:23:53 [INFO] [TRAIN] epoch: 67, iter: 83600/160000, loss: 0.5988, lr: 0.000578, batch_cost: 0.1817, reader_cost: 0.00113, ips: 44.0402 samples/sec | ETA 03:51:18 2022-08-23 03:24:02 [INFO] [TRAIN] epoch: 67, iter: 83650/160000, loss: 0.5920, lr: 0.000578, batch_cost: 0.1831, reader_cost: 0.00054, ips: 43.6816 samples/sec | ETA 03:53:03 2022-08-23 03:24:12 [INFO] [TRAIN] epoch: 67, iter: 83700/160000, loss: 0.5350, lr: 0.000578, batch_cost: 0.2003, reader_cost: 0.00073, ips: 39.9391 samples/sec | ETA 04:14:43 2022-08-23 03:24:21 [INFO] [TRAIN] epoch: 67, iter: 83750/160000, loss: 0.5385, lr: 0.000577, batch_cost: 0.1812, reader_cost: 0.00186, ips: 44.1537 samples/sec | ETA 03:50:15 2022-08-23 03:24:31 [INFO] [TRAIN] epoch: 67, iter: 83800/160000, loss: 0.5539, lr: 0.000577, batch_cost: 0.1953, reader_cost: 0.00069, ips: 40.9545 samples/sec | ETA 04:08:04 2022-08-23 03:24:41 [INFO] [TRAIN] epoch: 67, iter: 83850/160000, loss: 0.5366, lr: 0.000577, batch_cost: 0.2093, reader_cost: 0.00138, ips: 38.2259 samples/sec | ETA 04:25:36 2022-08-23 03:24:52 [INFO] [TRAIN] epoch: 67, iter: 83900/160000, loss: 0.5546, lr: 0.000576, batch_cost: 0.2192, reader_cost: 0.00048, ips: 36.4898 samples/sec | ETA 04:38:04 2022-08-23 03:25:02 [INFO] [TRAIN] epoch: 67, iter: 83950/160000, loss: 0.5907, lr: 0.000576, batch_cost: 0.1951, reader_cost: 0.00067, ips: 41.0131 samples/sec | ETA 04:07:14 2022-08-23 03:25:12 [INFO] [TRAIN] epoch: 67, iter: 84000/160000, loss: 0.5891, lr: 0.000575, batch_cost: 0.2054, reader_cost: 0.00085, ips: 38.9559 samples/sec | ETA 04:20:07 2022-08-23 03:25:12 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 166s - batch_cost: 0.1663 - reader cost: 7.5341e-04 2022-08-23 03:27:59 [INFO] [EVAL] #Images: 2000 mIoU: 0.3423 Acc: 0.7620 Kappa: 0.7440 Dice: 0.4735 2022-08-23 03:27:59 [INFO] [EVAL] Class IoU: [0.6824 0.767 0.9321 0.7267 0.6796 0.7574 0.7743 0.77 0.5217 0.582 0.492 0.5583 0.6855 0.2994 0.3112 0.422 0.5133 0.4134 0.6125 0.4181 0.7474 0.4765 0.6238 0.48 0.3475 0.3355 0.49 0.4635 0.4227 0.2068 0.2711 0.4642 0.2452 0.3823 0.2945 0.4144 0.4254 0.5909 0.2741 0.3734 0.2356 0.0898 0.339 0.2559 0.3071 0.2263 0.2754 0.4987 0.6063 0.5076 0.5122 0.3655 0.1323 0.2022 0.6877 0.4735 0.8467 0.4216 0.4824 0.2399 0.0593 0.4539 0.3068 0.2358 0.4253 0.6855 0.2629 0.4395 0.1127 0.3198 0.4611 0.5503 0.4015 0.2055 0.45 0.3312 0.4463 0.2895 0.3236 0.1173 0.599 0.3848 0.2913 0.0211 0.2929 0.5489 0.0982 0.0829 0.2177 0.5113 0.336 0.0795 0.2133 0.0901 0.0025 0.0126 0.0174 0.1685 0.2924 0.3981 0.0272 0.0241 0.1758 0.5815 0.054 0.3801 0.0975 0.518 0.0996 0.1747 0.1063 0.4595 0.1537 0.4946 0.7552 0.0119 0.3772 0.5864 0.1155 0.1269 0.393 0.0077 0.1983 0.1155 0.3069 0.2946 0.4353 0.437 0.2459 0.3229 0.5302 0.0086 0.2506 0.2296 0.1159 0.1196 0.1284 0.0218 0.206 0.3657 0.3423 0.0393 0.3021 0.1127 0.3405 0. 0.3193 0.0308 0.1572 0.1561] 2022-08-23 03:27:59 [INFO] [EVAL] Class Precision: [0.7887 0.8541 0.9657 0.8344 0.7557 0.8557 0.8785 0.8287 0.6792 0.7557 0.6781 0.6978 0.7692 0.4401 0.5513 0.5523 0.7102 0.7199 0.7626 0.5989 0.8512 0.5941 0.7995 0.6621 0.5165 0.4405 0.6843 0.7282 0.7455 0.3311 0.4528 0.6598 0.5156 0.5358 0.4911 0.5431 0.5754 0.8028 0.4775 0.6155 0.39 0.2211 0.5096 0.5092 0.4565 0.4816 0.4423 0.6797 0.7077 0.5968 0.7329 0.5714 0.3214 0.5519 0.7124 0.6491 0.8814 0.6575 0.6414 0.3363 0.1027 0.5845 0.6266 0.5584 0.527 0.7807 0.4208 0.6106 0.2497 0.4873 0.6844 0.7031 0.5726 0.3067 0.6519 0.4986 0.5889 0.6061 0.3622 0.4498 0.715 0.6145 0.8108 0.0739 0.3938 0.7382 0.4749 0.3282 0.3993 0.704 0.414 0.2123 0.3469 0.3576 0.0247 0.0351 0.1324 0.363 0.5094 0.6593 0.3433 0.0573 0.5064 0.895 0.5327 0.4778 0.2418 0.7749 0.2934 0.3592 0.2797 0.7315 0.536 0.7991 0.76 0.1136 0.6904 0.6871 0.1577 0.4991 0.5224 0.1337 0.383 0.7271 0.6778 0.6713 0.7633 0.643 0.5262 0.538 0.5913 0.1101 0.3286 0.6795 0.7442 0.2356 0.4315 0.054 0.476 0.6156 0.3799 0.0597 0.7285 0.5184 0.5943 0. 0.8793 0.3261 0.6335 0.6016] 2022-08-23 03:27:59 [INFO] [EVAL] Class Recall: [0.835 0.8827 0.964 0.8492 0.8709 0.8682 0.8672 0.9158 0.6924 0.7168 0.6419 0.7364 0.8631 0.4837 0.4167 0.6415 0.6492 0.4927 0.7569 0.5807 0.8598 0.7064 0.7395 0.6357 0.515 0.5847 0.6332 0.5604 0.494 0.3551 0.4032 0.6103 0.3186 0.5716 0.4238 0.6361 0.62 0.6912 0.3916 0.4871 0.373 0.1313 0.5032 0.3396 0.4842 0.2991 0.4218 0.652 0.809 0.7726 0.6298 0.5035 0.1836 0.2419 0.952 0.6364 0.9556 0.5402 0.6605 0.4556 0.1228 0.6701 0.3755 0.2898 0.688 0.8489 0.4119 0.6107 0.1703 0.482 0.5856 0.7169 0.5732 0.3838 0.5922 0.4966 0.6481 0.3566 0.7525 0.137 0.787 0.5073 0.3126 0.0286 0.5334 0.6816 0.1101 0.0999 0.3236 0.6514 0.6405 0.1127 0.3564 0.1075 0.0027 0.0193 0.0196 0.2393 0.407 0.5012 0.0286 0.0399 0.2121 0.6241 0.0567 0.6503 0.1405 0.6097 0.1311 0.2538 0.1463 0.5527 0.1772 0.5648 0.9917 0.0131 0.454 0.8001 0.3018 0.1454 0.6135 0.0081 0.2914 0.1207 0.3592 0.3443 0.5032 0.5769 0.3159 0.4467 0.8368 0.0093 0.5135 0.2575 0.1207 0.1953 0.1545 0.0354 0.2664 0.4739 0.7758 0.1032 0.3404 0.1259 0.4436 0. 
0.3339 0.0329 0.1729 0.1741] 2022-08-23 03:27:59 [INFO] [EVAL] The model with the best validation mIoU (0.3433) was saved at iter 79000. 2022-08-23 03:28:09 [INFO] [TRAIN] epoch: 67, iter: 84050/160000, loss: 0.5409, lr: 0.000575, batch_cost: 0.2071, reader_cost: 0.00403, ips: 38.6272 samples/sec | ETA 04:22:09 2022-08-23 03:28:18 [INFO] [TRAIN] epoch: 67, iter: 84100/160000, loss: 0.5568, lr: 0.000575, batch_cost: 0.1728, reader_cost: 0.00104, ips: 46.2993 samples/sec | ETA 03:38:34 2022-08-23 03:28:28 [INFO] [TRAIN] epoch: 67, iter: 84150/160000, loss: 0.5912, lr: 0.000574, batch_cost: 0.2047, reader_cost: 0.00066, ips: 39.0907 samples/sec | ETA 04:18:42 2022-08-23 03:28:38 [INFO] [TRAIN] epoch: 67, iter: 84200/160000, loss: 0.5769, lr: 0.000574, batch_cost: 0.1988, reader_cost: 0.00052, ips: 40.2419 samples/sec | ETA 04:11:08 2022-08-23 03:28:48 [INFO] [TRAIN] epoch: 67, iter: 84250/160000, loss: 0.5457, lr: 0.000574, batch_cost: 0.1923, reader_cost: 0.00041, ips: 41.6012 samples/sec | ETA 04:02:46 2022-08-23 03:28:58 [INFO] [TRAIN] epoch: 67, iter: 84300/160000, loss: 0.5836, lr: 0.000573, batch_cost: 0.2037, reader_cost: 0.00046, ips: 39.2714 samples/sec | ETA 04:17:00 2022-08-23 03:29:08 [INFO] [TRAIN] epoch: 67, iter: 84350/160000, loss: 0.5713, lr: 0.000573, batch_cost: 0.1959, reader_cost: 0.00051, ips: 40.8271 samples/sec | ETA 04:07:03 2022-08-23 03:29:16 [INFO] [TRAIN] epoch: 67, iter: 84400/160000, loss: 0.5797, lr: 0.000572, batch_cost: 0.1675, reader_cost: 0.00052, ips: 47.7717 samples/sec | ETA 03:31:00 2022-08-23 03:29:27 [INFO] [TRAIN] epoch: 67, iter: 84450/160000, loss: 0.5467, lr: 0.000572, batch_cost: 0.2140, reader_cost: 0.00053, ips: 37.3783 samples/sec | ETA 04:29:29 2022-08-23 03:29:37 [INFO] [TRAIN] epoch: 67, iter: 84500/160000, loss: 0.5362, lr: 0.000572, batch_cost: 0.2015, reader_cost: 0.00083, ips: 39.6942 samples/sec | ETA 04:13:36 2022-08-23 03:29:48 [INFO] [TRAIN] epoch: 67, iter: 84550/160000, loss: 0.5594, lr: 0.000571, batch_cost: 0.2191, reader_cost: 0.00095, ips: 36.5108 samples/sec | ETA 04:35:32 2022-08-23 03:29:59 [INFO] [TRAIN] epoch: 67, iter: 84600/160000, loss: 0.6051, lr: 0.000571, batch_cost: 0.2221, reader_cost: 0.00032, ips: 36.0167 samples/sec | ETA 04:39:07 2022-08-23 03:30:12 [INFO] [TRAIN] epoch: 68, iter: 84650/160000, loss: 0.5302, lr: 0.000570, batch_cost: 0.2713, reader_cost: 0.06854, ips: 29.4902 samples/sec | ETA 05:40:40 2022-08-23 03:30:23 [INFO] [TRAIN] epoch: 68, iter: 84700/160000, loss: 0.5814, lr: 0.000570, batch_cost: 0.2158, reader_cost: 0.00070, ips: 37.0640 samples/sec | ETA 04:30:52 2022-08-23 03:30:34 [INFO] [TRAIN] epoch: 68, iter: 84750/160000, loss: 0.5771, lr: 0.000570, batch_cost: 0.2090, reader_cost: 0.00087, ips: 38.2809 samples/sec | ETA 04:22:05 2022-08-23 03:30:43 [INFO] [TRAIN] epoch: 68, iter: 84800/160000, loss: 0.5797, lr: 0.000569, batch_cost: 0.1957, reader_cost: 0.00073, ips: 40.8706 samples/sec | ETA 04:05:19 2022-08-23 03:30:53 [INFO] [TRAIN] epoch: 68, iter: 84850/160000, loss: 0.5473, lr: 0.000569, batch_cost: 0.2017, reader_cost: 0.00100, ips: 39.6582 samples/sec | ETA 04:12:39 2022-08-23 03:31:03 [INFO] [TRAIN] epoch: 68, iter: 84900/160000, loss: 0.6183, lr: 0.000569, batch_cost: 0.1963, reader_cost: 0.00124, ips: 40.7641 samples/sec | ETA 04:05:38 2022-08-23 03:31:14 [INFO] [TRAIN] epoch: 68, iter: 84950/160000, loss: 0.5617, lr: 0.000568, batch_cost: 0.2044, reader_cost: 0.00088, ips: 39.1335 samples/sec | ETA 04:15:42 2022-08-23 03:31:24 [INFO] [TRAIN] epoch: 68, iter: 85000/160000, loss: 
0.5863, lr: 0.000568, batch_cost: 0.2092, reader_cost: 0.00073, ips: 38.2436 samples/sec | ETA 04:21:28 2022-08-23 03:31:24 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 170s - batch_cost: 0.1700 - reader cost: 7.3879e-04 2022-08-23 03:34:14 [INFO] [EVAL] #Images: 2000 mIoU: 0.3417 Acc: 0.7613 Kappa: 0.7430 Dice: 0.4732 2022-08-23 03:34:14 [INFO] [EVAL] Class IoU: [0.6846 0.7689 0.932 0.7244 0.6707 0.7494 0.7687 0.7846 0.5147 0.584 0.4835 0.5377 0.6959 0.2711 0.3137 0.4245 0.5298 0.4219 0.5991 0.4095 0.739 0.4957 0.6234 0.4853 0.3296 0.2888 0.5175 0.4414 0.4073 0.2612 0.2629 0.4995 0.3107 0.3472 0.298 0.3889 0.4292 0.5501 0.256 0.3566 0.2354 0.0883 0.3525 0.258 0.3266 0.2336 0.2445 0.5033 0.5536 0.5936 0.5116 0.2866 0.2079 0.2356 0.691 0.4388 0.8533 0.3961 0.4805 0.2723 0.0596 0.4515 0.3415 0.2475 0.4504 0.6635 0.2415 0.4002 0.0502 0.3307 0.4684 0.5115 0.3851 0.2183 0.4254 0.3314 0.5399 0.3079 0.2826 0.0616 0.5716 0.3485 0.394 0.0385 0.2439 0.5441 0.0916 0.0824 0.2295 0.4752 0.4228 0.004 0.2209 0.0771 0.0322 0.0118 0.0309 0.167 0.2872 0.4389 0.0771 0.0204 0.2309 0.4337 0.0163 0.3576 0.0424 0.4903 0.1073 0.1834 0.0942 0.4407 0.1641 0.5871 0.6753 0.0062 0.3347 0.6223 0.121 0.1563 0.2232 0.002 0.2002 0.195 0.2843 0.2796 0.4263 0.4415 0.1187 0.3084 0.5571 0.024 0.3033 0.1978 0.172 0.1355 0.1558 0.0074 0.2167 0.3461 0.3447 0.0189 0.3714 0.2913 0.3159 0. 0.4021 0.0362 0.1398 0.1702] 2022-08-23 03:34:14 [INFO] [EVAL] Class Precision: [0.7769 0.86 0.9649 0.8273 0.7549 0.8862 0.86 0.8527 0.7229 0.7205 0.7342 0.6905 0.7728 0.4841 0.4949 0.5889 0.6752 0.6905 0.7218 0.5894 0.8314 0.6236 0.758 0.6189 0.5303 0.4244 0.6943 0.7634 0.6631 0.3468 0.443 0.648 0.5324 0.4677 0.495 0.5393 0.6158 0.758 0.4703 0.6184 0.3962 0.2246 0.5747 0.5177 0.4318 0.3901 0.4189 0.6947 0.5826 0.7569 0.7426 0.3763 0.358 0.5685 0.7984 0.5527 0.9075 0.6327 0.6056 0.4885 0.092 0.59 0.5269 0.5068 0.5747 0.7454 0.4229 0.6067 0.1425 0.5755 0.5937 0.6225 0.6062 0.326 0.5731 0.5126 0.6808 0.5527 0.445 0.2717 0.6425 0.6901 0.735 0.0834 0.3931 0.7518 0.4175 0.3706 0.5184 0.6542 0.6011 0.0077 0.4463 0.3527 0.2021 0.0449 0.172 0.3802 0.5028 0.6556 0.4255 0.0542 0.5161 0.68 0.2643 0.4261 0.1841 0.7595 0.2898 0.3387 0.2093 0.6352 0.42 0.8005 0.683 0.0957 0.7041 0.7428 0.2083 0.4973 0.5678 0.1202 0.4351 0.5785 0.6784 0.6182 0.8386 0.6139 0.31 0.4326 0.6335 0.3191 0.5309 0.5332 0.6619 0.2746 0.3685 0.0709 0.4422 0.6205 0.4033 0.0347 0.5451 0.5389 0.5629 0. 
0.8465 0.2382 0.4955 0.7851] 2022-08-23 03:34:14 [INFO] [EVAL] Class Recall: [0.8521 0.8789 0.9646 0.8535 0.8575 0.8293 0.8787 0.9076 0.6413 0.755 0.5861 0.7084 0.8748 0.3813 0.4613 0.6032 0.7111 0.5203 0.779 0.5729 0.8693 0.7074 0.7783 0.6921 0.4654 0.4748 0.6703 0.5114 0.5136 0.5141 0.3926 0.6856 0.4273 0.5741 0.4281 0.5823 0.5861 0.6673 0.3597 0.4572 0.367 0.127 0.4768 0.3395 0.5728 0.3681 0.37 0.6463 0.9175 0.7333 0.6219 0.546 0.3314 0.2869 0.837 0.6806 0.9347 0.5144 0.6993 0.3809 0.145 0.658 0.4925 0.326 0.6757 0.8579 0.3603 0.5404 0.0719 0.4373 0.6894 0.7414 0.5136 0.3979 0.6228 0.484 0.7229 0.4101 0.4365 0.0738 0.8381 0.4131 0.4593 0.0669 0.3913 0.6632 0.105 0.0957 0.2917 0.6347 0.5877 0.0081 0.3042 0.0898 0.0369 0.0157 0.0363 0.2295 0.4011 0.5704 0.0861 0.0316 0.2947 0.5448 0.0171 0.6899 0.0522 0.5804 0.1455 0.2857 0.1462 0.5901 0.2122 0.6877 0.9836 0.0066 0.3895 0.7932 0.2241 0.1857 0.269 0.0021 0.2705 0.2273 0.3286 0.338 0.4644 0.6113 0.1612 0.5177 0.8221 0.0253 0.4144 0.2392 0.1885 0.211 0.2125 0.0082 0.2982 0.439 0.7034 0.04 0.5384 0.388 0.4185 0. 0.4338 0.041 0.163 0.1785] 2022-08-23 03:34:14 [INFO] [EVAL] The model with the best validation mIoU (0.3433) was saved at iter 79000. 2022-08-23 03:34:24 [INFO] [TRAIN] epoch: 68, iter: 85050/160000, loss: 0.6015, lr: 0.000567, batch_cost: 0.1916, reader_cost: 0.00368, ips: 41.7640 samples/sec | ETA 03:59:16 2022-08-23 03:34:33 [INFO] [TRAIN] epoch: 68, iter: 85100/160000, loss: 0.5831, lr: 0.000567, batch_cost: 0.1788, reader_cost: 0.00105, ips: 44.7465 samples/sec | ETA 03:43:10 2022-08-23 03:34:43 [INFO] [TRAIN] epoch: 68, iter: 85150/160000, loss: 0.5392, lr: 0.000567, batch_cost: 0.1997, reader_cost: 0.00109, ips: 40.0686 samples/sec | ETA 04:09:04 2022-08-23 03:34:54 [INFO] [TRAIN] epoch: 68, iter: 85200/160000, loss: 0.5729, lr: 0.000566, batch_cost: 0.2180, reader_cost: 0.00044, ips: 36.7050 samples/sec | ETA 04:31:42 2022-08-23 03:35:03 [INFO] [TRAIN] epoch: 68, iter: 85250/160000, loss: 0.5474, lr: 0.000566, batch_cost: 0.1821, reader_cost: 0.00056, ips: 43.9431 samples/sec | ETA 03:46:48 2022-08-23 03:35:12 [INFO] [TRAIN] epoch: 68, iter: 85300/160000, loss: 0.5648, lr: 0.000566, batch_cost: 0.1871, reader_cost: 0.00059, ips: 42.7687 samples/sec | ETA 03:52:52 2022-08-23 03:35:22 [INFO] [TRAIN] epoch: 68, iter: 85350/160000, loss: 0.6142, lr: 0.000565, batch_cost: 0.1962, reader_cost: 0.00064, ips: 40.7775 samples/sec | ETA 04:04:05 2022-08-23 03:35:32 [INFO] [TRAIN] epoch: 68, iter: 85400/160000, loss: 0.5760, lr: 0.000565, batch_cost: 0.1945, reader_cost: 0.00054, ips: 41.1233 samples/sec | ETA 04:01:52 2022-08-23 03:35:42 [INFO] [TRAIN] epoch: 68, iter: 85450/160000, loss: 0.6229, lr: 0.000564, batch_cost: 0.2015, reader_cost: 0.00079, ips: 39.6931 samples/sec | ETA 04:10:25 2022-08-23 03:35:51 [INFO] [TRAIN] epoch: 68, iter: 85500/160000, loss: 0.5777, lr: 0.000564, batch_cost: 0.1808, reader_cost: 0.00031, ips: 44.2459 samples/sec | ETA 03:44:30 2022-08-23 03:36:00 [INFO] [TRAIN] epoch: 68, iter: 85550/160000, loss: 0.5609, lr: 0.000564, batch_cost: 0.1873, reader_cost: 0.00052, ips: 42.7109 samples/sec | ETA 03:52:24 2022-08-23 03:36:09 [INFO] [TRAIN] epoch: 68, iter: 85600/160000, loss: 0.5794, lr: 0.000563, batch_cost: 0.1820, reader_cost: 0.00050, ips: 43.9455 samples/sec | ETA 03:45:44 2022-08-23 03:36:18 [INFO] [TRAIN] epoch: 68, iter: 85650/160000, loss: 0.5559, lr: 0.000563, batch_cost: 0.1749, reader_cost: 0.00053, ips: 45.7501 samples/sec | ETA 03:36:41 2022-08-23 03:36:28 [INFO] [TRAIN] epoch: 68, iter: 
85700/160000, loss: 0.5905, lr: 0.000563, batch_cost: 0.2021, reader_cost: 0.00040, ips: 39.5779 samples/sec | ETA 04:10:18 2022-08-23 03:36:38 [INFO] [TRAIN] epoch: 68, iter: 85750/160000, loss: 0.5407, lr: 0.000562, batch_cost: 0.1974, reader_cost: 0.00166, ips: 40.5326 samples/sec | ETA 04:04:14 2022-08-23 03:36:47 [INFO] [TRAIN] epoch: 68, iter: 85800/160000, loss: 0.5477, lr: 0.000562, batch_cost: 0.1794, reader_cost: 0.00078, ips: 44.6012 samples/sec | ETA 03:41:49 2022-08-23 03:36:56 [INFO] [TRAIN] epoch: 68, iter: 85850/160000, loss: 0.5928, lr: 0.000561, batch_cost: 0.1726, reader_cost: 0.00043, ips: 46.3581 samples/sec | ETA 03:33:16 2022-08-23 03:37:06 [INFO] [TRAIN] epoch: 69, iter: 85900/160000, loss: 0.5839, lr: 0.000561, batch_cost: 0.2136, reader_cost: 0.03218, ips: 37.4448 samples/sec | ETA 04:23:51 2022-08-23 03:37:15 [INFO] [TRAIN] epoch: 69, iter: 85950/160000, loss: 0.5906, lr: 0.000561, batch_cost: 0.1751, reader_cost: 0.00072, ips: 45.6764 samples/sec | ETA 03:36:09 2022-08-23 03:37:24 [INFO] [TRAIN] epoch: 69, iter: 86000/160000, loss: 0.5494, lr: 0.000560, batch_cost: 0.1792, reader_cost: 0.00269, ips: 44.6380 samples/sec | ETA 03:41:02 2022-08-23 03:37:24 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 136s - batch_cost: 0.1363 - reader cost: 9.2938e-04 2022-08-23 03:39:41 [INFO] [EVAL] #Images: 2000 mIoU: 0.3437 Acc: 0.7627 Kappa: 0.7447 Dice: 0.4757 2022-08-23 03:39:41 [INFO] [EVAL] Class IoU: [0.6875 0.7711 0.9311 0.7247 0.6753 0.762 0.7703 0.7705 0.5242 0.6043 0.4759 0.5622 0.6933 0.2767 0.3059 0.4299 0.4934 0.4111 0.613 0.4005 0.7352 0.4712 0.624 0.4825 0.3315 0.3331 0.4232 0.4646 0.3934 0.2429 0.2671 0.5012 0.2503 0.3476 0.3204 0.3954 0.4292 0.5309 0.2758 0.3967 0.2096 0.1049 0.3506 0.2413 0.2882 0.2319 0.2623 0.4865 0.5825 0.5276 0.5025 0.3801 0.1803 0.2029 0.69 0.4383 0.8373 0.359 0.5011 0.2407 0.0539 0.4783 0.3307 0.2605 0.4127 0.6881 0.2249 0.4252 0.084 0.3445 0.4895 0.522 0.3868 0.1918 0.445 0.3539 0.5396 0.279 0.3329 0.2472 0.5997 0.4114 0.3817 0.0234 0.0634 0.5661 0.1003 0.095 0.2503 0.5111 0.4041 0.0399 0.2223 0.0985 0.0265 0.0109 0.0256 0.1461 0.1944 0.3116 0.1196 0.0177 0.1844 0.2775 0.0392 0.4318 0.0829 0.5006 0.0962 0.1885 0.1394 0.4255 0.1618 0.5163 0.6846 0.0142 0.3325 0.5902 0.1483 0.1675 0.3979 0.005 0.2441 0.1529 0.4416 0.2671 0.4684 0.4125 0.4734 0.3592 0.5778 0.0285 0.2281 0.213 0.2113 0.114 0.1259 0.0395 0.1946 0.3658 0.38 0.052 0.3745 0.0067 0.2964 0. 
0.4021 0.0358 0.1403 0.1327] 2022-08-23 03:39:41 [INFO] [EVAL] Class Precision: [0.7907 0.845 0.9613 0.8317 0.7544 0.8658 0.8806 0.8302 0.6903 0.7481 0.6172 0.6904 0.7819 0.5043 0.534 0.5971 0.6381 0.6723 0.7885 0.6423 0.8164 0.6871 0.7828 0.6193 0.5104 0.5596 0.5675 0.6931 0.6926 0.3492 0.4538 0.6729 0.4828 0.4662 0.447 0.526 0.6257 0.7801 0.5635 0.547 0.3618 0.2412 0.6115 0.5957 0.3787 0.4433 0.4241 0.6226 0.6851 0.6332 0.7302 0.5069 0.3829 0.5969 0.7237 0.6647 0.872 0.7123 0.7275 0.3748 0.0939 0.5965 0.512 0.5637 0.4945 0.8054 0.3612 0.5443 0.1908 0.5921 0.6077 0.7484 0.578 0.3055 0.6804 0.5081 0.6955 0.5052 0.5978 0.4412 0.6885 0.678 0.7542 0.0638 0.1867 0.721 0.446 0.3317 0.504 0.6964 0.5851 0.0482 0.3883 0.3368 0.0895 0.036 0.3783 0.3479 0.5651 0.7844 0.4402 0.0478 0.5143 0.7747 0.3316 0.5193 0.2181 0.7607 0.2158 0.342 0.2422 0.5893 0.3712 0.7398 0.6885 0.126 0.6729 0.6789 0.2059 0.4441 0.5025 0.0789 0.5505 0.6385 0.7357 0.6452 0.8017 0.5408 0.7936 0.5656 0.6982 0.1182 0.3592 0.6426 0.4824 0.2194 0.4746 0.0657 0.4493 0.6378 0.4463 0.0667 0.6129 0.2053 0.5737 0. 0.8611 0.3753 0.6008 0.7476] 2022-08-23 03:39:41 [INFO] [EVAL] Class Recall: [0.8405 0.8981 0.9673 0.8493 0.8655 0.8641 0.8601 0.9147 0.6853 0.7587 0.6753 0.7516 0.8596 0.3801 0.4173 0.6056 0.6851 0.5141 0.7335 0.5155 0.8809 0.5999 0.7547 0.6861 0.4861 0.4514 0.6247 0.585 0.4766 0.4438 0.3937 0.6626 0.3421 0.5774 0.5307 0.6142 0.5774 0.6243 0.3507 0.5909 0.3326 0.1566 0.4511 0.2886 0.5466 0.3272 0.4075 0.69 0.7956 0.76 0.6171 0.6031 0.2543 0.2351 0.9368 0.5627 0.9547 0.4198 0.617 0.4022 0.1123 0.7071 0.4829 0.3263 0.7138 0.8253 0.3735 0.6602 0.1305 0.4517 0.7156 0.6331 0.5391 0.34 0.5626 0.5385 0.7065 0.3838 0.429 0.3599 0.823 0.5114 0.436 0.0357 0.0876 0.7249 0.1146 0.1175 0.3321 0.6577 0.5663 0.187 0.3421 0.1223 0.0362 0.0154 0.0267 0.2012 0.2286 0.3408 0.141 0.0273 0.2232 0.3018 0.0426 0.7193 0.118 0.5943 0.1479 0.2959 0.2471 0.605 0.2229 0.6308 0.9918 0.0157 0.3966 0.8189 0.3467 0.2119 0.6566 0.0053 0.3048 0.1673 0.5249 0.3131 0.5298 0.6349 0.5398 0.496 0.7701 0.0362 0.3846 0.2416 0.2733 0.1917 0.1463 0.0902 0.2556 0.4616 0.719 0.1906 0.4906 0.0068 0.3801 0. 0.43 0.0381 0.1548 0.1389] 2022-08-23 03:39:41 [INFO] [EVAL] The model with the best validation mIoU (0.3437) was saved at iter 86000. 
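The recurring line "The model with the best validation mIoU (...) was saved at iter ..." reflects simple best-checkpoint bookkeeping: after each periodic evaluation the new mIoU is compared with the best seen so far and the weights are written out only on improvement (here the best moves from 0.3433 at iter 79000 to 0.3437 at iter 86000). A rough sketch of that logic, using hypothetical helper and directory names rather than PaddleSeg's actual training loop:

import os
import paddle  # assumption: the same PaddlePaddle install this log was produced with

def maybe_save_best(model, save_dir, mean_iou, cur_iter, state):
    # Keep only the checkpoint with the highest validation mIoU so far.
    # `state` carries best_mean_iou / best_model_iter between evaluations.
    if mean_iou > state.get("best_mean_iou", -1.0):
        state["best_mean_iou"] = mean_iou
        state["best_model_iter"] = cur_iter
        best_dir = os.path.join(save_dir, "best_model")
        os.makedirs(best_dir, exist_ok=True)
        paddle.save(model.state_dict(), os.path.join(best_dir, "model.pdparams"))
    print("[EVAL] The model with the best validation mIoU ({:.4f}) was saved "
          "at iter {}.".format(state["best_mean_iou"], state["best_model_iter"]))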
2022-08-23 03:39:51 [INFO] [TRAIN] epoch: 69, iter: 86050/160000, loss: 0.5766, lr: 0.000560, batch_cost: 0.1940, reader_cost: 0.00435, ips: 41.2464 samples/sec | ETA 03:59:03 2022-08-23 03:40:01 [INFO] [TRAIN] epoch: 69, iter: 86100/160000, loss: 0.6345, lr: 0.000560, batch_cost: 0.2065, reader_cost: 0.00138, ips: 38.7487 samples/sec | ETA 04:14:17 2022-08-23 03:40:11 [INFO] [TRAIN] epoch: 69, iter: 86150/160000, loss: 0.5624, lr: 0.000559, batch_cost: 0.2011, reader_cost: 0.00057, ips: 39.7860 samples/sec | ETA 04:07:29 2022-08-23 03:40:20 [INFO] [TRAIN] epoch: 69, iter: 86200/160000, loss: 0.5680, lr: 0.000559, batch_cost: 0.1907, reader_cost: 0.00069, ips: 41.9500 samples/sec | ETA 03:54:33 2022-08-23 03:40:30 [INFO] [TRAIN] epoch: 69, iter: 86250/160000, loss: 0.5741, lr: 0.000558, batch_cost: 0.1943, reader_cost: 0.00051, ips: 41.1708 samples/sec | ETA 03:58:50 2022-08-23 03:40:41 [INFO] [TRAIN] epoch: 69, iter: 86300/160000, loss: 0.6068, lr: 0.000558, batch_cost: 0.2147, reader_cost: 0.00072, ips: 37.2699 samples/sec | ETA 04:23:39 2022-08-23 03:40:51 [INFO] [TRAIN] epoch: 69, iter: 86350/160000, loss: 0.5593, lr: 0.000558, batch_cost: 0.1932, reader_cost: 0.00053, ips: 41.3978 samples/sec | ETA 03:57:12 2022-08-23 03:41:00 [INFO] [TRAIN] epoch: 69, iter: 86400/160000, loss: 0.5803, lr: 0.000557, batch_cost: 0.1879, reader_cost: 0.00181, ips: 42.5670 samples/sec | ETA 03:50:32 2022-08-23 03:41:11 [INFO] [TRAIN] epoch: 69, iter: 86450/160000, loss: 0.5422, lr: 0.000557, batch_cost: 0.2198, reader_cost: 0.00056, ips: 36.3896 samples/sec | ETA 04:29:29 2022-08-23 03:41:21 [INFO] [TRAIN] epoch: 69, iter: 86500/160000, loss: 0.5942, lr: 0.000556, batch_cost: 0.2084, reader_cost: 0.00044, ips: 38.3904 samples/sec | ETA 04:15:16 2022-08-23 03:41:31 [INFO] [TRAIN] epoch: 69, iter: 86550/160000, loss: 0.6127, lr: 0.000556, batch_cost: 0.2023, reader_cost: 0.00054, ips: 39.5550 samples/sec | ETA 04:07:35 2022-08-23 03:41:41 [INFO] [TRAIN] epoch: 69, iter: 86600/160000, loss: 0.5760, lr: 0.000556, batch_cost: 0.1886, reader_cost: 0.00085, ips: 42.4184 samples/sec | ETA 03:50:43 2022-08-23 03:41:51 [INFO] [TRAIN] epoch: 69, iter: 86650/160000, loss: 0.5446, lr: 0.000555, batch_cost: 0.2005, reader_cost: 0.00088, ips: 39.9074 samples/sec | ETA 04:05:04 2022-08-23 03:42:00 [INFO] [TRAIN] epoch: 69, iter: 86700/160000, loss: 0.5506, lr: 0.000555, batch_cost: 0.1826, reader_cost: 0.00060, ips: 43.8004 samples/sec | ETA 03:43:08 2022-08-23 03:42:10 [INFO] [TRAIN] epoch: 69, iter: 86750/160000, loss: 0.6210, lr: 0.000555, batch_cost: 0.2060, reader_cost: 0.00071, ips: 38.8336 samples/sec | ETA 04:11:30 2022-08-23 03:42:20 [INFO] [TRAIN] epoch: 69, iter: 86800/160000, loss: 0.5839, lr: 0.000554, batch_cost: 0.1897, reader_cost: 0.00034, ips: 42.1704 samples/sec | ETA 03:51:26 2022-08-23 03:42:30 [INFO] [TRAIN] epoch: 69, iter: 86850/160000, loss: 0.6128, lr: 0.000554, batch_cost: 0.2033, reader_cost: 0.00044, ips: 39.3557 samples/sec | ETA 04:07:49 2022-08-23 03:42:38 [INFO] [TRAIN] epoch: 69, iter: 86900/160000, loss: 0.5840, lr: 0.000553, batch_cost: 0.1693, reader_cost: 0.00358, ips: 47.2603 samples/sec | ETA 03:26:14 2022-08-23 03:42:48 [INFO] [TRAIN] epoch: 69, iter: 86950/160000, loss: 0.5917, lr: 0.000553, batch_cost: 0.1908, reader_cost: 0.00184, ips: 41.9286 samples/sec | ETA 03:52:17 2022-08-23 03:42:59 [INFO] [TRAIN] epoch: 69, iter: 87000/160000, loss: 0.5663, lr: 0.000553, batch_cost: 0.2143, reader_cost: 0.00135, ips: 37.3295 samples/sec | ETA 04:20:44 2022-08-23 03:42:59 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 170s - batch_cost: 0.1694 - reader cost: 7.9953e-04 2022-08-23 03:45:48 [INFO] [EVAL] #Images: 2000 mIoU: 0.3442 Acc: 0.7631 Kappa: 0.7453 Dice: 0.4761 2022-08-23 03:45:48 [INFO] [EVAL] Class IoU: [0.6861 0.771 0.9309 0.7248 0.682 0.7649 0.7681 0.7749 0.5212 0.6306 0.4787 0.5388 0.6977 0.3045 0.3177 0.4303 0.4998 0.4292 0.6193 0.4184 0.7449 0.4524 0.6377 0.4911 0.322 0.289 0.3971 0.4467 0.3733 0.2016 0.2184 0.4929 0.3047 0.3601 0.2749 0.4043 0.428 0.5392 0.2772 0.3309 0.1865 0.0983 0.3335 0.2468 0.2843 0.2112 0.2763 0.5107 0.6114 0.5343 0.5358 0.384 0.1601 0.2216 0.6805 0.3894 0.8529 0.3944 0.3953 0.2673 0.0713 0.4851 0.3258 0.1808 0.4457 0.6729 0.2521 0.3953 0.0686 0.3197 0.4811 0.5283 0.3951 0.2207 0.4487 0.3318 0.5772 0.2816 0.2712 0.2213 0.6084 0.3555 0.3779 0.029 0.3455 0.5638 0.0925 0.0859 0.2254 0.5307 0.3882 0.0595 0.2117 0.0793 0.0289 0.0086 0.0373 0.1224 0.2233 0.4517 0.0994 0.0197 0.235 0.5084 0.0149 0.4034 0.0467 0.5253 0.091 0.2193 0.1629 0.4504 0.1492 0.5318 0.7544 0.0084 0.2832 0.5676 0.1413 0.0469 0.3306 0.0077 0.2413 0.1719 0.3842 0.3056 0.4022 0.4117 0.4488 0.3156 0.5467 0.0132 0.2838 0.2475 0.1595 0.1159 0.132 0.0192 0.1792 0.2724 0.3438 0.0603 0.3716 0.2547 0.25 0.0031 0.4095 0.0229 0.167 0.1444] 2022-08-23 03:45:48 [INFO] [EVAL] Class Precision: [0.7987 0.8527 0.9663 0.8204 0.768 0.838 0.8664 0.837 0.6935 0.7475 0.6709 0.6955 0.7822 0.5014 0.4984 0.5915 0.6767 0.6769 0.7481 0.5813 0.8387 0.6386 0.7629 0.5818 0.4914 0.421 0.5184 0.8024 0.6872 0.3467 0.5215 0.6464 0.5203 0.5428 0.4335 0.589 0.6153 0.7292 0.4346 0.6764 0.3892 0.2828 0.5262 0.4576 0.3887 0.3936 0.429 0.7178 0.7039 0.6248 0.6806 0.518 0.3155 0.4956 0.7022 0.4727 0.8966 0.6157 0.7368 0.4044 0.1295 0.665 0.4482 0.7436 0.5733 0.7563 0.4588 0.5265 0.1916 0.5233 0.7078 0.6577 0.6184 0.2712 0.6737 0.5339 0.7074 0.5262 0.6226 0.4575 0.7265 0.6573 0.7743 0.0814 0.5157 0.7688 0.4358 0.3625 0.4958 0.7286 0.5163 0.079 0.3932 0.3722 0.0877 0.0346 0.1967 0.3511 0.5945 0.6468 0.4192 0.0755 0.4859 0.8717 0.2789 0.491 0.1838 0.8878 0.2484 0.3994 0.3276 0.7295 0.5601 0.6551 0.7595 0.0845 0.7518 0.6464 0.2152 0.4122 0.4897 0.1328 0.6024 0.5911 0.6815 0.6617 0.8497 0.5957 0.732 0.4879 0.6121 0.1956 0.6369 0.5082 0.5316 0.336 0.4047 0.0531 0.4192 0.7624 0.4008 0.0955 0.6843 0.4877 0.5738 0.0033 0.8336 0.3904 0.516 0.5993] 2022-08-23 03:45:48 [INFO] [EVAL] Class Recall: [0.8295 0.8895 0.9621 0.8614 0.859 0.8976 0.8714 0.9127 0.6772 0.8013 0.6256 0.7052 0.8658 0.4367 0.4671 0.6122 0.6565 0.5398 0.7824 0.5988 0.8695 0.6081 0.7953 0.7591 0.4829 0.4797 0.6292 0.5019 0.4497 0.325 0.2732 0.6749 0.4237 0.5169 0.4291 0.5632 0.5844 0.6741 0.4335 0.3932 0.2636 0.1309 0.4767 0.3489 0.5143 0.3131 0.4371 0.639 0.8232 0.7867 0.7157 0.5976 0.2452 0.286 0.9566 0.6884 0.946 0.5233 0.4603 0.4409 0.1369 0.642 0.544 0.1928 0.6669 0.8592 0.3588 0.6134 0.0966 0.4511 0.6002 0.7287 0.5225 0.5423 0.5732 0.467 0.7581 0.3772 0.3246 0.3 0.7891 0.4364 0.4247 0.0431 0.5114 0.6789 0.1051 0.1012 0.2923 0.6615 0.6101 0.1944 0.3145 0.0915 0.0414 0.0113 0.044 0.1582 0.2634 0.5997 0.1153 0.026 0.3128 0.5496 0.0155 0.6933 0.0589 0.5626 0.1255 0.3273 0.2446 0.5407 0.1689 0.7386 0.9912 0.0092 0.3124 0.8232 0.2917 0.0502 0.5044 0.0081 0.287 0.1952 0.4683 0.3622 0.433 0.5713 0.537 0.472 0.8363 0.014 0.3385 0.3254 0.1856 0.1503 0.1637 0.0293 0.2383 0.2977 0.7074 0.1409 0.4485 0.3477 0.307 0.0431 0.4459 0.0238 0.198 0.1598] 2022-08-23 03:45:49 [INFO] [EVAL] The model with the best validation mIoU 
(0.3442) was saved at iter 87000. 2022-08-23 03:45:59 [INFO] [TRAIN] epoch: 69, iter: 87050/160000, loss: 0.5706, lr: 0.000552, batch_cost: 0.2101, reader_cost: 0.00450, ips: 38.0801 samples/sec | ETA 04:15:25 2022-08-23 03:46:08 [INFO] [TRAIN] epoch: 69, iter: 87100/160000, loss: 0.5958, lr: 0.000552, batch_cost: 0.1743, reader_cost: 0.00069, ips: 45.8990 samples/sec | ETA 03:31:46 2022-08-23 03:46:20 [INFO] [TRAIN] epoch: 70, iter: 87150/160000, loss: 0.5594, lr: 0.000552, batch_cost: 0.2358, reader_cost: 0.07405, ips: 33.9217 samples/sec | ETA 04:46:20 2022-08-23 03:46:29 [INFO] [TRAIN] epoch: 70, iter: 87200/160000, loss: 0.5666, lr: 0.000551, batch_cost: 0.1871, reader_cost: 0.00074, ips: 42.7580 samples/sec | ETA 03:47:00 2022-08-23 03:46:38 [INFO] [TRAIN] epoch: 70, iter: 87250/160000, loss: 0.5909, lr: 0.000551, batch_cost: 0.1797, reader_cost: 0.00077, ips: 44.5151 samples/sec | ETA 03:37:54 2022-08-23 03:46:48 [INFO] [TRAIN] epoch: 70, iter: 87300/160000, loss: 0.5546, lr: 0.000550, batch_cost: 0.1935, reader_cost: 0.00111, ips: 41.3427 samples/sec | ETA 03:54:27 2022-08-23 03:46:58 [INFO] [TRAIN] epoch: 70, iter: 87350/160000, loss: 0.5575, lr: 0.000550, batch_cost: 0.2023, reader_cost: 0.00054, ips: 39.5492 samples/sec | ETA 04:04:55 2022-08-23 03:47:08 [INFO] [TRAIN] epoch: 70, iter: 87400/160000, loss: 0.5880, lr: 0.000550, batch_cost: 0.2061, reader_cost: 0.00040, ips: 38.8173 samples/sec | ETA 04:09:22 2022-08-23 03:47:18 [INFO] [TRAIN] epoch: 70, iter: 87450/160000, loss: 0.5556, lr: 0.000549, batch_cost: 0.2031, reader_cost: 0.00040, ips: 39.3798 samples/sec | ETA 04:05:38 2022-08-23 03:47:28 [INFO] [TRAIN] epoch: 70, iter: 87500/160000, loss: 0.5352, lr: 0.000549, batch_cost: 0.1894, reader_cost: 0.00090, ips: 42.2429 samples/sec | ETA 03:48:50 2022-08-23 03:47:38 [INFO] [TRAIN] epoch: 70, iter: 87550/160000, loss: 0.5493, lr: 0.000549, batch_cost: 0.2053, reader_cost: 0.00033, ips: 38.9689 samples/sec | ETA 04:07:53 2022-08-23 03:47:47 [INFO] [TRAIN] epoch: 70, iter: 87600/160000, loss: 0.5942, lr: 0.000548, batch_cost: 0.1867, reader_cost: 0.00099, ips: 42.8571 samples/sec | ETA 03:45:14 2022-08-23 03:47:57 [INFO] [TRAIN] epoch: 70, iter: 87650/160000, loss: 0.5399, lr: 0.000548, batch_cost: 0.1986, reader_cost: 0.00058, ips: 40.2864 samples/sec | ETA 03:59:27 2022-08-23 03:48:06 [INFO] [TRAIN] epoch: 70, iter: 87700/160000, loss: 0.5723, lr: 0.000547, batch_cost: 0.1848, reader_cost: 0.00052, ips: 43.2789 samples/sec | ETA 03:42:44 2022-08-23 03:48:15 [INFO] [TRAIN] epoch: 70, iter: 87750/160000, loss: 0.5618, lr: 0.000547, batch_cost: 0.1710, reader_cost: 0.00044, ips: 46.7735 samples/sec | ETA 03:25:57 2022-08-23 03:48:25 [INFO] [TRAIN] epoch: 70, iter: 87800/160000, loss: 0.6309, lr: 0.000547, batch_cost: 0.1919, reader_cost: 0.00103, ips: 41.6847 samples/sec | ETA 03:50:56 2022-08-23 03:48:35 [INFO] [TRAIN] epoch: 70, iter: 87850/160000, loss: 0.5651, lr: 0.000546, batch_cost: 0.2100, reader_cost: 0.00040, ips: 38.1037 samples/sec | ETA 04:12:28 2022-08-23 03:48:45 [INFO] [TRAIN] epoch: 70, iter: 87900/160000, loss: 0.5371, lr: 0.000546, batch_cost: 0.2071, reader_cost: 0.00095, ips: 38.6233 samples/sec | ETA 04:08:53 2022-08-23 03:48:56 [INFO] [TRAIN] epoch: 70, iter: 87950/160000, loss: 0.5530, lr: 0.000545, batch_cost: 0.2099, reader_cost: 0.00064, ips: 38.1201 samples/sec | ETA 04:12:00 2022-08-23 03:49:05 [INFO] [TRAIN] epoch: 70, iter: 88000/160000, loss: 0.5916, lr: 0.000545, batch_cost: 0.1853, reader_cost: 0.00063, ips: 43.1765 samples/sec | ETA 03:42:20 
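The [TRAIN] lines can be sanity-checked from the config printed at the start of this log: with batch_size 8 per step, ips is roughly batch_size / batch_cost, and the ETA is the remaining iterations times the current batch_cost. A quick check against the iter-88000 entry above (batch_cost 0.1853 s, 160000 total iters); this mirrors the logged numbers but is not PaddleSeg's exact logging code:

import datetime

batch_size = 8        # from the config block at the top of this log
batch_cost = 0.1853   # seconds per step, logged at iter 88000
cur_iter, total_iters = 88000, 160000

ips = batch_size / batch_cost                        # ~43.2 samples/sec (logged 43.1765)
eta = datetime.timedelta(seconds=int((total_iters - cur_iter) * batch_cost))
print(round(ips, 2), eta)                            # ~3:42:21 vs the logged ETA 03:42:20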
2022-08-23 03:49:05 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 137s - batch_cost: 0.1374 - reader cost: 5.6471e-04 2022-08-23 03:51:23 [INFO] [EVAL] #Images: 2000 mIoU: 0.3415 Acc: 0.7614 Kappa: 0.7432 Dice: 0.4719 2022-08-23 03:51:23 [INFO] [EVAL] Class IoU: [0.6837 0.77 0.9339 0.7285 0.6793 0.7595 0.77 0.7793 0.533 0.5846 0.4822 0.5648 0.6935 0.282 0.3043 0.4318 0.4957 0.4349 0.6173 0.4066 0.7305 0.4423 0.6334 0.4912 0.339 0.2955 0.4206 0.4788 0.3976 0.237 0.2677 0.4887 0.2584 0.3537 0.2546 0.3454 0.4205 0.5262 0.2615 0.3713 0.2413 0.0809 0.3367 0.2546 0.3098 0.1978 0.2537 0.4968 0.5695 0.5671 0.5013 0.3774 0.1943 0.1459 0.6497 0.4495 0.8551 0.4148 0.4791 0.2702 0.0804 0.469 0.2889 0.2024 0.4525 0.6896 0.2976 0.4386 0.1092 0.3172 0.4636 0.4962 0.335 0.1898 0.4364 0.3371 0.4342 0.2387 0.2207 0.1843 0.6539 0.3729 0.3507 0.0364 0.2441 0.5618 0.1088 0.0672 0.2303 0.5192 0.427 0.0387 0.142 0.1046 0.0177 0.0164 0.0191 0.1063 0.1698 0.4545 0.0868 0.019 0.178 0.7091 0.0162 0.558 0.0607 0.4922 0.114 0.1998 0.1495 0.4734 0.1613 0.3974 0.7716 0.0154 0.2653 0.619 0.1379 0.2119 0.4285 0.0067 0.24 0.1635 0.3451 0.2857 0.4059 0.3908 0.009 0.3205 0.6011 0.0154 0.2808 0.2558 0.2066 0.1298 0.1424 0.0241 0.1904 0.3538 0.2466 0.0016 0.3555 0.2453 0.2916 0. 0.3911 0.0292 0.1646 0.1508] 2022-08-23 03:51:23 [INFO] [EVAL] Class Precision: [0.78 0.852 0.9642 0.8297 0.7671 0.8746 0.8803 0.8469 0.6636 0.7548 0.7056 0.7048 0.7677 0.5067 0.5106 0.6119 0.6745 0.6922 0.782 0.6548 0.8037 0.5739 0.7766 0.634 0.5124 0.4238 0.5193 0.6931 0.6438 0.3152 0.4786 0.6474 0.4844 0.4448 0.3986 0.4485 0.6772 0.828 0.4198 0.5971 0.3526 0.2186 0.5325 0.5297 0.4657 0.4344 0.4008 0.6537 0.6649 0.7016 0.7019 0.5756 0.3577 0.5239 0.7168 0.6089 0.8966 0.6286 0.7012 0.4096 0.1864 0.6858 0.6764 0.7186 0.5711 0.797 0.4418 0.5984 0.1727 0.5228 0.7317 0.6157 0.4966 0.2889 0.7842 0.4934 0.5311 0.5265 0.709 0.479 0.7947 0.6818 0.787 0.0874 0.3872 0.7207 0.4741 0.3914 0.5397 0.7665 0.5571 0.0466 0.3056 0.3395 0.0709 0.0418 0.2789 0.3211 0.5455 0.6467 0.309 0.0376 0.5545 0.8961 0.3248 0.6411 0.1471 0.9363 0.2428 0.3571 0.3359 0.8661 0.4576 0.641 0.7794 0.0994 0.7437 0.7204 0.1819 0.5377 0.5464 0.4504 0.6319 0.6283 0.7156 0.6135 0.7884 0.4904 0.1293 0.474 0.6856 0.0926 0.4416 0.5436 0.5087 0.3065 0.3576 0.0766 0.4685 0.5716 0.2661 0.0031 0.7022 0.6516 0.6236 0. 0.8185 0.4842 0.6193 0.6426] 2022-08-23 03:51:23 [INFO] [EVAL] Class Recall: [0.847 0.8889 0.9675 0.8566 0.8557 0.8523 0.86 0.9071 0.7304 0.7217 0.6037 0.7399 0.8777 0.3887 0.4295 0.5947 0.6516 0.5392 0.7456 0.5175 0.8892 0.6587 0.7746 0.6856 0.5004 0.4939 0.6887 0.6076 0.5097 0.4885 0.3778 0.666 0.3565 0.6333 0.4134 0.6004 0.5259 0.5907 0.4096 0.4955 0.4331 0.1139 0.478 0.3289 0.4808 0.2664 0.4087 0.6742 0.7988 0.7474 0.6369 0.523 0.2984 0.1681 0.874 0.632 0.9487 0.5495 0.6019 0.4426 0.1238 0.5974 0.3352 0.2198 0.6854 0.8366 0.477 0.6216 0.2291 0.4466 0.5586 0.7189 0.5072 0.3561 0.496 0.5156 0.7041 0.304 0.2427 0.2305 0.7868 0.4515 0.3875 0.0588 0.3977 0.7181 0.1237 0.0751 0.2866 0.6167 0.6464 0.1858 0.2097 0.1314 0.023 0.0263 0.0201 0.1371 0.1978 0.6045 0.1077 0.0371 0.2078 0.7726 0.0167 0.8116 0.0936 0.5093 0.1768 0.3121 0.2123 0.5108 0.1995 0.5111 0.9871 0.0179 0.292 0.8147 0.3629 0.259 0.665 0.0068 0.279 0.1811 0.4 0.3484 0.4555 0.658 0.0095 0.4974 0.8298 0.0181 0.4353 0.3258 0.2581 0.1838 0.1914 0.034 0.2429 0.4814 0.7712 0.0032 0.4187 0.2823 0.3538 0. 
0.4283 0.0301 0.1832 0.1646] 2022-08-23 03:51:23 [INFO] [EVAL] The model with the best validation mIoU (0.3442) was saved at iter 87000. 2022-08-23 03:51:32 [INFO] [TRAIN] epoch: 70, iter: 88050/160000, loss: 0.5647, lr: 0.000545, batch_cost: 0.1766, reader_cost: 0.00293, ips: 45.3100 samples/sec | ETA 03:31:43 2022-08-23 03:51:42 [INFO] [TRAIN] epoch: 70, iter: 88100/160000, loss: 0.6059, lr: 0.000544, batch_cost: 0.2030, reader_cost: 0.00139, ips: 39.4061 samples/sec | ETA 04:03:16 2022-08-23 03:51:53 [INFO] [TRAIN] epoch: 70, iter: 88150/160000, loss: 0.5373, lr: 0.000544, batch_cost: 0.2209, reader_cost: 0.00083, ips: 36.2161 samples/sec | ETA 04:24:31 2022-08-23 03:52:04 [INFO] [TRAIN] epoch: 70, iter: 88200/160000, loss: 0.6351, lr: 0.000544, batch_cost: 0.2187, reader_cost: 0.00094, ips: 36.5858 samples/sec | ETA 04:21:40 2022-08-23 03:52:13 [INFO] [TRAIN] epoch: 70, iter: 88250/160000, loss: 0.6029, lr: 0.000543, batch_cost: 0.1888, reader_cost: 0.00100, ips: 42.3751 samples/sec | ETA 03:45:45 2022-08-23 03:52:23 [INFO] [TRAIN] epoch: 70, iter: 88300/160000, loss: 0.5729, lr: 0.000543, batch_cost: 0.1893, reader_cost: 0.00055, ips: 42.2720 samples/sec | ETA 03:46:09 2022-08-23 03:52:33 [INFO] [TRAIN] epoch: 70, iter: 88350/160000, loss: 0.5609, lr: 0.000542, batch_cost: 0.2001, reader_cost: 0.00083, ips: 39.9797 samples/sec | ETA 03:58:57 2022-08-23 03:52:43 [INFO] [TRAIN] epoch: 70, iter: 88400/160000, loss: 0.5640, lr: 0.000542, batch_cost: 0.1937, reader_cost: 0.00063, ips: 41.2905 samples/sec | ETA 03:51:12 2022-08-23 03:52:56 [INFO] [TRAIN] epoch: 71, iter: 88450/160000, loss: 0.5534, lr: 0.000542, batch_cost: 0.2602, reader_cost: 0.07928, ips: 30.7466 samples/sec | ETA 05:10:16 2022-08-23 03:53:04 [INFO] [TRAIN] epoch: 71, iter: 88500/160000, loss: 0.5705, lr: 0.000541, batch_cost: 0.1779, reader_cost: 0.00113, ips: 44.9598 samples/sec | ETA 03:32:02 2022-08-23 03:53:14 [INFO] [TRAIN] epoch: 71, iter: 88550/160000, loss: 0.5444, lr: 0.000541, batch_cost: 0.1919, reader_cost: 0.00106, ips: 41.6944 samples/sec | ETA 03:48:29 2022-08-23 03:53:24 [INFO] [TRAIN] epoch: 71, iter: 88600/160000, loss: 0.5783, lr: 0.000541, batch_cost: 0.1944, reader_cost: 0.00114, ips: 41.1425 samples/sec | ETA 03:51:23 2022-08-23 03:53:34 [INFO] [TRAIN] epoch: 71, iter: 88650/160000, loss: 0.5434, lr: 0.000540, batch_cost: 0.2043, reader_cost: 0.00118, ips: 39.1671 samples/sec | ETA 04:02:53 2022-08-23 03:53:43 [INFO] [TRAIN] epoch: 71, iter: 88700/160000, loss: 0.5557, lr: 0.000540, batch_cost: 0.1841, reader_cost: 0.00120, ips: 43.4567 samples/sec | ETA 03:38:45 2022-08-23 03:53:52 [INFO] [TRAIN] epoch: 71, iter: 88750/160000, loss: 0.5759, lr: 0.000539, batch_cost: 0.1707, reader_cost: 0.00059, ips: 46.8759 samples/sec | ETA 03:22:39 2022-08-23 03:54:02 [INFO] [TRAIN] epoch: 71, iter: 88800/160000, loss: 0.5321, lr: 0.000539, batch_cost: 0.1989, reader_cost: 0.00058, ips: 40.2282 samples/sec | ETA 03:55:59 2022-08-23 03:54:10 [INFO] [TRAIN] epoch: 71, iter: 88850/160000, loss: 0.5750, lr: 0.000539, batch_cost: 0.1739, reader_cost: 0.00085, ips: 46.0087 samples/sec | ETA 03:26:11 2022-08-23 03:54:21 [INFO] [TRAIN] epoch: 71, iter: 88900/160000, loss: 0.6272, lr: 0.000538, batch_cost: 0.2045, reader_cost: 0.00076, ips: 39.1115 samples/sec | ETA 04:02:23 2022-08-23 03:54:31 [INFO] [TRAIN] epoch: 71, iter: 88950/160000, loss: 0.5633, lr: 0.000538, batch_cost: 0.2084, reader_cost: 0.00076, ips: 38.3787 samples/sec | ETA 04:06:50 2022-08-23 03:54:40 [INFO] [TRAIN] epoch: 71, iter: 89000/160000, loss: 
0.5542, lr: 0.000538, batch_cost: 0.1891, reader_cost: 0.00078, ips: 42.3132 samples/sec | ETA 03:43:43 2022-08-23 03:54:40 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 180s - batch_cost: 0.1799 - reader cost: 8.8772e-04 2022-08-23 03:57:41 [INFO] [EVAL] #Images: 2000 mIoU: 0.3455 Acc: 0.7655 Kappa: 0.7475 Dice: 0.4762 2022-08-23 03:57:41 [INFO] [EVAL] Class IoU: [0.6839 0.7778 0.9325 0.7264 0.6628 0.7657 0.7778 0.7805 0.5348 0.6134 0.4896 0.5629 0.6893 0.2866 0.2955 0.4291 0.4875 0.4091 0.6072 0.425 0.7397 0.4969 0.6317 0.5015 0.34 0.3306 0.4466 0.48 0.3792 0.2225 0.2492 0.4951 0.2696 0.3515 0.2868 0.3884 0.4284 0.5436 0.2683 0.3753 0.246 0.0906 0.3281 0.242 0.3242 0.202 0.2571 0.496 0.5949 0.5533 0.515 0.3516 0.2133 0.2149 0.6707 0.4448 0.8687 0.4072 0.5371 0.2692 0.064 0.4514 0.3445 0.2782 0.4603 0.6861 0.2636 0.4388 0.1302 0.3218 0.4631 0.4994 0.4013 0.2024 0.4611 0.3191 0.548 0.2827 0.2486 0.2152 0.6204 0.3748 0.3067 0.0285 0.1834 0.5493 0.096 0.0584 0.2137 0.5265 0.3924 0.047 0.21 0.0705 0.0177 0.0162 0.0299 0.1412 0.2749 0.4308 0.0594 0.0192 0.2293 0.539 0.0755 0.518 0.0489 0.5193 0.1147 0.2665 0.0712 0.4557 0.1491 0.5039 0.7021 0.0319 0.3254 0.6019 0.1315 0.0057 0.4012 0.013 0.2034 0.088 0.459 0.2944 0.461 0.3992 0.3188 0.3059 0.5446 0.0027 0.1272 0.2695 0.2009 0.1314 0.1512 0.0121 0.1729 0.3449 0.4282 0.0113 0.3734 0.0999 0.2883 0. 0.3583 0.0427 0.1807 0.1091] 2022-08-23 03:57:41 [INFO] [EVAL] Class Precision: [0.7851 0.8468 0.9648 0.8178 0.7386 0.8744 0.8811 0.8457 0.6705 0.7413 0.7044 0.7134 0.7539 0.4928 0.5165 0.6093 0.6899 0.6809 0.753 0.5931 0.8229 0.648 0.8058 0.6107 0.5382 0.5859 0.5779 0.7145 0.6708 0.3473 0.4899 0.663 0.4767 0.5079 0.4443 0.5986 0.6379 0.7622 0.4755 0.6361 0.4279 0.2447 0.5167 0.4991 0.4249 0.4415 0.4397 0.6698 0.6612 0.6566 0.7388 0.4891 0.3634 0.5468 0.6965 0.5962 0.9063 0.6532 0.7523 0.4644 0.1506 0.5606 0.5662 0.6358 0.6065 0.8046 0.4671 0.5973 0.2656 0.5516 0.5668 0.6017 0.6277 0.2762 0.7072 0.5634 0.7093 0.4978 0.6561 0.4229 0.7495 0.6834 0.8273 0.0691 0.3461 0.7402 0.4609 0.3833 0.6022 0.7084 0.5551 0.0583 0.4572 0.3303 0.0512 0.049 0.1567 0.3751 0.4268 0.7103 0.3429 0.0298 0.5036 0.891 0.7666 0.7577 0.1494 0.8706 0.2363 0.6285 0.2822 0.6152 0.5613 0.712 0.7061 0.1441 0.7403 0.682 0.2606 0.5603 0.5473 0.1611 0.5334 0.7037 0.7662 0.6627 0.8735 0.5727 0.7497 0.4671 0.5986 0.027 0.2413 0.5842 0.4913 0.2854 0.2857 0.0543 0.4524 0.6126 0.5216 0.0181 0.7421 0.4256 0.703 0. 
0.8552 0.2649 0.5989 0.9135] 2022-08-23 03:57:41 [INFO] [EVAL] Class Recall: [0.8414 0.9052 0.9654 0.8667 0.8659 0.8603 0.869 0.91 0.7254 0.7806 0.6161 0.7274 0.8895 0.4066 0.4085 0.592 0.6243 0.5062 0.7581 0.5999 0.8797 0.6805 0.7451 0.7372 0.48 0.4314 0.6629 0.5938 0.4659 0.3825 0.3366 0.6616 0.3829 0.533 0.4473 0.5252 0.5661 0.6546 0.3811 0.4779 0.3666 0.1257 0.4734 0.3196 0.5776 0.2714 0.3823 0.6565 0.8557 0.7787 0.6296 0.5557 0.3405 0.2614 0.9476 0.6367 0.9544 0.5196 0.6524 0.3903 0.1001 0.6986 0.468 0.3309 0.6564 0.8233 0.3769 0.6231 0.2034 0.4359 0.7168 0.746 0.5267 0.4312 0.5699 0.4239 0.7067 0.3955 0.2859 0.3048 0.7827 0.4535 0.3277 0.0462 0.2807 0.6805 0.1081 0.0645 0.2489 0.6722 0.5725 0.1953 0.2797 0.0823 0.0264 0.0236 0.0357 0.1847 0.4357 0.5227 0.067 0.0513 0.2964 0.577 0.0773 0.6208 0.0677 0.5627 0.1823 0.3163 0.087 0.6373 0.1688 0.633 0.9919 0.0394 0.3673 0.8367 0.2097 0.0057 0.6004 0.0139 0.2474 0.0914 0.5337 0.3463 0.4939 0.5686 0.3567 0.4699 0.8578 0.003 0.2119 0.3335 0.2537 0.1959 0.2431 0.0153 0.2186 0.4412 0.705 0.0293 0.4291 0.1155 0.3283 0. 0.3815 0.0484 0.2056 0.1102] 2022-08-23 03:57:41 [INFO] [EVAL] The model with the best validation mIoU (0.3455) was saved at iter 89000. 2022-08-23 03:57:51 [INFO] [TRAIN] epoch: 71, iter: 89050/160000, loss: 0.5583, lr: 0.000537, batch_cost: 0.1974, reader_cost: 0.00410, ips: 40.5355 samples/sec | ETA 03:53:22 2022-08-23 03:58:00 [INFO] [TRAIN] epoch: 71, iter: 89100/160000, loss: 0.5393, lr: 0.000537, batch_cost: 0.1896, reader_cost: 0.00128, ips: 42.1850 samples/sec | ETA 03:44:05 2022-08-23 03:58:11 [INFO] [TRAIN] epoch: 71, iter: 89150/160000, loss: 0.5517, lr: 0.000536, batch_cost: 0.2129, reader_cost: 0.00044, ips: 37.5743 samples/sec | ETA 04:11:24 2022-08-23 03:58:21 [INFO] [TRAIN] epoch: 71, iter: 89200/160000, loss: 0.6068, lr: 0.000536, batch_cost: 0.2137, reader_cost: 0.00098, ips: 37.4418 samples/sec | ETA 04:12:07 2022-08-23 03:58:32 [INFO] [TRAIN] epoch: 71, iter: 89250/160000, loss: 0.5556, lr: 0.000536, batch_cost: 0.2028, reader_cost: 0.00115, ips: 39.4409 samples/sec | ETA 03:59:10 2022-08-23 03:58:42 [INFO] [TRAIN] epoch: 71, iter: 89300/160000, loss: 0.5558, lr: 0.000535, batch_cost: 0.2028, reader_cost: 0.00059, ips: 39.4407 samples/sec | ETA 03:59:00 2022-08-23 03:58:53 [INFO] [TRAIN] epoch: 71, iter: 89350/160000, loss: 0.5864, lr: 0.000535, batch_cost: 0.2197, reader_cost: 0.00101, ips: 36.4129 samples/sec | ETA 04:18:41 2022-08-23 03:59:03 [INFO] [TRAIN] epoch: 71, iter: 89400/160000, loss: 0.5719, lr: 0.000535, batch_cost: 0.2082, reader_cost: 0.00102, ips: 38.4187 samples/sec | ETA 04:05:01 2022-08-23 03:59:13 [INFO] [TRAIN] epoch: 71, iter: 89450/160000, loss: 0.5888, lr: 0.000534, batch_cost: 0.1962, reader_cost: 0.00048, ips: 40.7695 samples/sec | ETA 03:50:43 2022-08-23 03:59:23 [INFO] [TRAIN] epoch: 71, iter: 89500/160000, loss: 0.5760, lr: 0.000534, batch_cost: 0.1955, reader_cost: 0.00059, ips: 40.9255 samples/sec | ETA 03:49:41 2022-08-23 03:59:32 [INFO] [TRAIN] epoch: 71, iter: 89550/160000, loss: 0.5671, lr: 0.000533, batch_cost: 0.1817, reader_cost: 0.00059, ips: 44.0368 samples/sec | ETA 03:33:18 2022-08-23 03:59:41 [INFO] [TRAIN] epoch: 71, iter: 89600/160000, loss: 0.5402, lr: 0.000533, batch_cost: 0.1795, reader_cost: 0.00068, ips: 44.5794 samples/sec | ETA 03:30:33 2022-08-23 03:59:49 [INFO] [TRAIN] epoch: 71, iter: 89650/160000, loss: 0.5612, lr: 0.000533, batch_cost: 0.1642, reader_cost: 0.00067, ips: 48.7144 samples/sec | ETA 03:12:33 2022-08-23 04:00:03 [INFO] [TRAIN] epoch: 72, 
iter: 89700/160000, loss: 0.5526, lr: 0.000532, batch_cost: 0.2845, reader_cost: 0.05884, ips: 28.1150 samples/sec | ETA 05:33:23 2022-08-23 04:00:13 [INFO] [TRAIN] epoch: 72, iter: 89750/160000, loss: 0.5779, lr: 0.000532, batch_cost: 0.1927, reader_cost: 0.00082, ips: 41.5104 samples/sec | ETA 03:45:38 2022-08-23 04:00:23 [INFO] [TRAIN] epoch: 72, iter: 89800/160000, loss: 0.5489, lr: 0.000531, batch_cost: 0.2078, reader_cost: 0.00083, ips: 38.4906 samples/sec | ETA 04:03:10 2022-08-23 04:00:35 [INFO] [TRAIN] epoch: 72, iter: 89850/160000, loss: 0.5591, lr: 0.000531, batch_cost: 0.2293, reader_cost: 0.00062, ips: 34.8931 samples/sec | ETA 04:28:03 2022-08-23 04:00:44 [INFO] [TRAIN] epoch: 72, iter: 89900/160000, loss: 0.5569, lr: 0.000531, batch_cost: 0.1947, reader_cost: 0.00059, ips: 41.0951 samples/sec | ETA 03:47:26 2022-08-23 04:00:55 [INFO] [TRAIN] epoch: 72, iter: 89950/160000, loss: 0.5657, lr: 0.000530, batch_cost: 0.2077, reader_cost: 0.00074, ips: 38.5101 samples/sec | ETA 04:02:32 2022-08-23 04:01:05 [INFO] [TRAIN] epoch: 72, iter: 90000/160000, loss: 0.5760, lr: 0.000530, batch_cost: 0.1940, reader_cost: 0.00090, ips: 41.2408 samples/sec | ETA 03:46:18 2022-08-23 04:01:05 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 158s - batch_cost: 0.1583 - reader cost: 6.8414e-04 2022-08-23 04:03:43 [INFO] [EVAL] #Images: 2000 mIoU: 0.3429 Acc: 0.7639 Kappa: 0.7459 Dice: 0.4733 2022-08-23 04:03:43 [INFO] [EVAL] Class IoU: [0.6849 0.7817 0.9314 0.7299 0.6721 0.7604 0.7775 0.7777 0.5321 0.5619 0.4882 0.5599 0.6906 0.3055 0.3036 0.4267 0.4932 0.4306 0.6054 0.4135 0.7408 0.5345 0.6274 0.4958 0.3202 0.3729 0.4658 0.4681 0.4151 0.2268 0.2531 0.4978 0.2969 0.3792 0.3024 0.3942 0.422 0.5385 0.2634 0.3233 0.2376 0.062 0.3539 0.2493 0.2901 0.1828 0.2804 0.498 0.4945 0.5722 0.5277 0.3716 0.1979 0.251 0.6837 0.4116 0.8769 0.3764 0.4927 0.2756 0.0618 0.441 0.3243 0.2339 0.4412 0.6927 0.2921 0.4135 0.1071 0.3364 0.4798 0.494 0.3575 0.2258 0.4363 0.3366 0.5184 0.2852 0.2594 0.1034 0.6539 0.3899 0.4048 0.0301 0.2927 0.5685 0.1127 0.1123 0.2293 0.5072 0.4236 0.0793 0.2103 0.0777 0.0129 0.0174 0.0281 0.1339 0.2924 0.4671 0.0438 0.0255 0.2477 0.1507 0.0017 0.443 0.0458 0.5203 0.0778 0.2566 0.1123 0.411 0.1346 0.6008 0.7592 0.0126 0.2741 0.6058 0.0929 0.0765 0.3507 0.0113 0.2127 0.151 0.304 0.2935 0.4244 0.4019 0.4692 0.2955 0.5738 0.0015 0.2782 0.2374 0.1698 0.1313 0.1511 0.0084 0.1736 0.3054 0.2845 0.0268 0.3561 0.0358 0.3035 0. 
0.3897 0.0319 0.1579 0.1749] 2022-08-23 04:03:43 [INFO] [EVAL] Class Precision: [0.7758 0.8732 0.9633 0.8315 0.7479 0.8816 0.8876 0.8422 0.6924 0.7798 0.7232 0.6991 0.7605 0.4507 0.5333 0.5734 0.7189 0.693 0.7441 0.5747 0.8208 0.6579 0.7939 0.602 0.5231 0.4862 0.6453 0.6753 0.6943 0.3103 0.4675 0.6711 0.4737 0.5285 0.5273 0.5719 0.6131 0.7556 0.4142 0.6997 0.3558 0.2178 0.6082 0.4676 0.4022 0.4455 0.4236 0.666 0.66 0.6829 0.7672 0.497 0.3416 0.6693 0.7056 0.5122 0.9276 0.6804 0.8053 0.5773 0.1129 0.5655 0.5182 0.6389 0.5478 0.7949 0.4196 0.5559 0.2018 0.5826 0.6477 0.5942 0.5318 0.2873 0.7487 0.4691 0.6457 0.5043 0.6627 0.2465 0.7753 0.66 0.744 0.0689 0.5034 0.7039 0.4334 0.3092 0.4554 0.7599 0.5784 0.1412 0.3989 0.3501 0.0505 0.0581 0.302 0.4095 0.5008 0.6925 0.4362 0.048 0.6205 0.7235 0.0857 0.5877 0.1369 0.8767 0.2479 0.7871 0.2894 0.5366 0.5467 0.7805 0.7655 0.1203 0.6389 0.6487 0.1689 0.566 0.5268 0.1506 0.4881 0.6935 0.7436 0.6409 0.7496 0.5365 0.7654 0.5122 0.6414 0.0498 0.3923 0.6426 0.4765 0.2856 0.3267 0.0556 0.5099 0.6737 0.3109 0.0365 0.6747 0.2878 0.5548 0. 0.8455 0.3057 0.59 0.569 ] 2022-08-23 04:03:43 [INFO] [EVAL] Class Recall: [0.8538 0.8818 0.9656 0.8565 0.869 0.8469 0.8624 0.9103 0.6969 0.6679 0.6004 0.7377 0.8825 0.4866 0.4134 0.6251 0.6111 0.532 0.7647 0.5958 0.8837 0.7403 0.7494 0.7376 0.4523 0.6156 0.6261 0.6041 0.508 0.4572 0.3557 0.6585 0.4431 0.5731 0.4149 0.5593 0.5752 0.6521 0.4196 0.3754 0.4171 0.0798 0.4583 0.348 0.5102 0.2367 0.4533 0.6637 0.6635 0.7792 0.6282 0.5957 0.3199 0.2866 0.9567 0.677 0.9413 0.4573 0.5593 0.3453 0.1201 0.667 0.4643 0.2696 0.6938 0.8435 0.4901 0.6176 0.1859 0.4432 0.6492 0.7455 0.5217 0.5133 0.5112 0.5436 0.7245 0.3964 0.2989 0.1513 0.8068 0.4879 0.4704 0.0508 0.4115 0.7472 0.1322 0.1499 0.3159 0.604 0.6127 0.1532 0.3079 0.0907 0.017 0.0243 0.0301 0.1659 0.4128 0.5893 0.0465 0.0515 0.292 0.1599 0.0017 0.6428 0.0644 0.5613 0.1019 0.2757 0.155 0.6372 0.1515 0.7229 0.9894 0.0139 0.3244 0.9014 0.1712 0.0813 0.5119 0.0121 0.2737 0.1618 0.3396 0.3513 0.4945 0.6157 0.5479 0.4113 0.8447 0.0015 0.4889 0.2734 0.2087 0.1954 0.2195 0.0098 0.2084 0.3584 0.77 0.0912 0.43 0.0393 0.4012 0. 0.4196 0.0344 0.1774 0.2016] 2022-08-23 04:03:43 [INFO] [EVAL] The model with the best validation mIoU (0.3455) was saved at iter 89000. 
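Likewise, each "Start evaluating (total_samples: 2000, total_iters: 1000)" pass is consistent with the 2-GPU setup recorded in the environment info: 2000 ADE20K val images spread over 2 cards at (presumably) one image per card per step gives 1000 iterations, and the wall time is roughly total_iters × batch_cost. A small check against the iter-90000 evaluation above (158 s at batch_cost 0.1583), with the per-card eval batch size as an assumption:

total_samples = 2000      # ADE20K val images, as logged
num_gpus = 2              # "GPUs used: 2" in the environment info
val_batch_per_card = 1    # assumption: default evaluation batch size

total_iters = total_samples // (num_gpus * val_batch_per_card)   # -> 1000, as logged
wall_time = total_iters * 0.1583                                  # batch_cost at iter 90000
print(total_iters, round(wall_time), "s")                         # 1000, ~158 s ("1000/1000 - 158s")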
2022-08-23 04:03:52 [INFO] [TRAIN] epoch: 72, iter: 90050/160000, loss: 0.6130, lr: 0.000530, batch_cost: 0.1833, reader_cost: 0.00317, ips: 43.6377 samples/sec | ETA 03:33:43 2022-08-23 04:04:02 [INFO] [TRAIN] epoch: 72, iter: 90100/160000, loss: 0.5730, lr: 0.000529, batch_cost: 0.1902, reader_cost: 0.00083, ips: 42.0719 samples/sec | ETA 03:41:31 2022-08-23 04:04:12 [INFO] [TRAIN] epoch: 72, iter: 90150/160000, loss: 0.5644, lr: 0.000529, batch_cost: 0.1917, reader_cost: 0.00099, ips: 41.7418 samples/sec | ETA 03:43:07 2022-08-23 04:04:20 [INFO] [TRAIN] epoch: 72, iter: 90200/160000, loss: 0.5639, lr: 0.000528, batch_cost: 0.1658, reader_cost: 0.00031, ips: 48.2562 samples/sec | ETA 03:12:51 2022-08-23 04:04:28 [INFO] [TRAIN] epoch: 72, iter: 90250/160000, loss: 0.5283, lr: 0.000528, batch_cost: 0.1653, reader_cost: 0.00063, ips: 48.3879 samples/sec | ETA 03:12:11 2022-08-23 04:04:38 [INFO] [TRAIN] epoch: 72, iter: 90300/160000, loss: 0.5724, lr: 0.000528, batch_cost: 0.2022, reader_cost: 0.00051, ips: 39.5654 samples/sec | ETA 03:54:53 2022-08-23 04:04:48 [INFO] [TRAIN] epoch: 72, iter: 90350/160000, loss: 0.5911, lr: 0.000527, batch_cost: 0.1923, reader_cost: 0.00045, ips: 41.6077 samples/sec | ETA 03:43:11 2022-08-23 04:04:57 [INFO] [TRAIN] epoch: 72, iter: 90400/160000, loss: 0.5641, lr: 0.000527, batch_cost: 0.1908, reader_cost: 0.00061, ips: 41.9337 samples/sec | ETA 03:41:18 2022-08-23 04:05:08 [INFO] [TRAIN] epoch: 72, iter: 90450/160000, loss: 0.5611, lr: 0.000527, batch_cost: 0.2097, reader_cost: 0.00064, ips: 38.1457 samples/sec | ETA 04:03:06 2022-08-23 04:05:17 [INFO] [TRAIN] epoch: 72, iter: 90500/160000, loss: 0.6129, lr: 0.000526, batch_cost: 0.1905, reader_cost: 0.00074, ips: 41.9946 samples/sec | ETA 03:40:39 2022-08-23 04:05:27 [INFO] [TRAIN] epoch: 72, iter: 90550/160000, loss: 0.5565, lr: 0.000526, batch_cost: 0.1891, reader_cost: 0.00059, ips: 42.3110 samples/sec | ETA 03:38:51 2022-08-23 04:05:36 [INFO] [TRAIN] epoch: 72, iter: 90600/160000, loss: 0.5643, lr: 0.000525, batch_cost: 0.1826, reader_cost: 0.00064, ips: 43.8182 samples/sec | ETA 03:31:10 2022-08-23 04:05:44 [INFO] [TRAIN] epoch: 72, iter: 90650/160000, loss: 0.5731, lr: 0.000525, batch_cost: 0.1703, reader_cost: 0.00049, ips: 46.9815 samples/sec | ETA 03:16:48 2022-08-23 04:05:54 [INFO] [TRAIN] epoch: 72, iter: 90700/160000, loss: 0.5997, lr: 0.000525, batch_cost: 0.1841, reader_cost: 0.00075, ips: 43.4597 samples/sec | ETA 03:32:36 2022-08-23 04:06:02 [INFO] [TRAIN] epoch: 72, iter: 90750/160000, loss: 0.5813, lr: 0.000524, batch_cost: 0.1755, reader_cost: 0.00054, ips: 45.5822 samples/sec | ETA 03:22:33 2022-08-23 04:06:11 [INFO] [TRAIN] epoch: 72, iter: 90800/160000, loss: 0.5937, lr: 0.000524, batch_cost: 0.1674, reader_cost: 0.00095, ips: 47.7835 samples/sec | ETA 03:13:05 2022-08-23 04:06:19 [INFO] [TRAIN] epoch: 72, iter: 90850/160000, loss: 0.5685, lr: 0.000524, batch_cost: 0.1639, reader_cost: 0.00071, ips: 48.8159 samples/sec | ETA 03:08:52 2022-08-23 04:06:28 [INFO] [TRAIN] epoch: 72, iter: 90900/160000, loss: 0.5443, lr: 0.000523, batch_cost: 0.1781, reader_cost: 0.00063, ips: 44.9299 samples/sec | ETA 03:25:03 2022-08-23 04:06:39 [INFO] [TRAIN] epoch: 73, iter: 90950/160000, loss: 0.5574, lr: 0.000523, batch_cost: 0.2228, reader_cost: 0.05292, ips: 35.9038 samples/sec | ETA 04:16:25 2022-08-23 04:06:49 [INFO] [TRAIN] epoch: 73, iter: 91000/160000, loss: 0.5368, lr: 0.000522, batch_cost: 0.1890, reader_cost: 0.00076, ips: 42.3342 samples/sec | ETA 03:37:19 2022-08-23 04:06:49 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 186s - batch_cost: 0.1864 - reader cost: 6.5087e-04 2022-08-23 04:09:55 [INFO] [EVAL] #Images: 2000 mIoU: 0.3413 Acc: 0.7643 Kappa: 0.7464 Dice: 0.4725 2022-08-23 04:09:55 [INFO] [EVAL] Class IoU: [0.6873 0.7733 0.9313 0.731 0.6747 0.7647 0.769 0.7657 0.5337 0.6114 0.5006 0.5498 0.6897 0.3039 0.2897 0.4202 0.5262 0.4241 0.6087 0.4057 0.7332 0.4793 0.641 0.4921 0.2939 0.2515 0.4454 0.464 0.4284 0.1967 0.2734 0.4924 0.253 0.3669 0.3169 0.3829 0.422 0.5537 0.2798 0.3721 0.2219 0.0795 0.3405 0.2501 0.2987 0.1852 0.2917 0.5093 0.5328 0.4874 0.5139 0.4116 0.2084 0.2361 0.6635 0.4086 0.8645 0.3991 0.4618 0.2706 0.111 0.4413 0.3127 0.2633 0.4265 0.6959 0.2942 0.379 0.1098 0.3214 0.4984 0.5263 0.392 0.2002 0.441 0.3266 0.5587 0.282 0.2284 0.1043 0.6298 0.3985 0.3568 0.0248 0.0662 0.5562 0.1327 0.0927 0.1808 0.488 0.3742 0.0753 0.1877 0.1007 0.0138 0.0182 0.0241 0.1436 0.2759 0.4636 0.1396 0.0196 0.2323 0.0799 0.004 0.4592 0.0548 0.5069 0.0795 0.2143 0.153 0.4084 0.1297 0.479 0.8081 0.0143 0.332 0.5961 0.1173 0.1492 0.3925 0.0064 0.2563 0.1872 0.2514 0.2878 0.4401 0.4388 0.3625 0.322 0.584 0.011 0.2543 0.2508 0.2058 0.1337 0.1508 0.0136 0.2152 0.3425 0.3599 0.0001 0.3043 0.119 0.2726 0. 0.4088 0.0264 0.1947 0.1795] 2022-08-23 04:09:55 [INFO] [EVAL] Class Precision: [0.7906 0.8501 0.9666 0.8289 0.7466 0.8668 0.8673 0.8162 0.6751 0.7241 0.6962 0.7069 0.7643 0.5127 0.5473 0.5742 0.6683 0.6669 0.7615 0.6449 0.8101 0.6483 0.8145 0.6161 0.5027 0.4466 0.5761 0.6527 0.6541 0.3365 0.4498 0.6367 0.4308 0.5183 0.4612 0.6034 0.6185 0.7339 0.4453 0.6612 0.3845 0.2482 0.5525 0.5384 0.4144 0.4332 0.5159 0.7522 0.717 0.5678 0.8015 0.572 0.4053 0.5726 0.738 0.5128 0.9146 0.6773 0.8193 0.4566 0.1867 0.6372 0.4211 0.5808 0.5045 0.8009 0.4287 0.5194 0.3858 0.5195 0.6922 0.7064 0.5833 0.3081 0.643 0.4895 0.6934 0.5433 0.7703 0.3122 0.7533 0.6819 0.7933 0.0748 0.2082 0.7557 0.4653 0.3515 0.2926 0.6685 0.4799 0.1467 0.3102 0.293 0.044 0.0459 0.4313 0.3432 0.4298 0.6916 0.522 0.0478 0.5623 0.641 0.1953 0.6221 0.1594 0.8129 0.2796 0.4347 0.2671 0.6347 0.489 0.6888 0.8141 0.1247 0.7584 0.6856 0.1499 0.536 0.5376 0.1572 0.632 0.5953 0.694 0.6764 0.8849 0.6298 0.9328 0.511 0.6592 0.1111 0.3406 0.6606 0.4651 0.312 0.3338 0.081 0.4489 0.6494 0.4208 0.0001 0.7154 0.4654 0.5079 0. 0.7962 0.3438 0.6808 0.6073] 2022-08-23 04:09:55 [INFO] [EVAL] Class Recall: [0.8403 0.8954 0.9624 0.8608 0.8751 0.8665 0.8715 0.9252 0.7181 0.7971 0.6406 0.7121 0.8761 0.4274 0.381 0.6103 0.7122 0.538 0.752 0.5224 0.8853 0.6476 0.7506 0.7097 0.4145 0.3654 0.6626 0.6161 0.5539 0.3213 0.4107 0.6847 0.3801 0.5567 0.5032 0.5116 0.5705 0.6928 0.4294 0.4597 0.3442 0.1048 0.4702 0.3184 0.5167 0.2444 0.4016 0.612 0.6747 0.7749 0.5889 0.5948 0.3001 0.2867 0.868 0.6679 0.9405 0.4928 0.5141 0.3991 0.2149 0.5893 0.5484 0.3251 0.734 0.8415 0.484 0.5837 0.1331 0.4573 0.6403 0.6737 0.5444 0.3638 0.584 0.4952 0.742 0.3696 0.245 0.1355 0.7935 0.4895 0.3934 0.0357 0.0884 0.6781 0.1566 0.1118 0.3211 0.6438 0.6294 0.1341 0.3221 0.133 0.0198 0.0293 0.0249 0.198 0.4353 0.5843 0.1601 0.0322 0.2836 0.0837 0.0041 0.6369 0.077 0.5738 0.1 0.2972 0.2637 0.534 0.15 0.6113 0.9908 0.0159 0.3713 0.8205 0.3506 0.1714 0.5925 0.0067 0.3013 0.2145 0.2828 0.3338 0.4668 0.5913 0.3722 0.4655 0.8366 0.012 0.5011 0.2879 0.2697 0.1896 0.2157 0.016 0.2926 0.4202 0.7132 0.0002 0.3463 0.1378 0.3705 0. 
0.4565 0.0278 0.2142 0.2031] 2022-08-23 04:09:55 [INFO] [EVAL] The model with the best validation mIoU (0.3455) was saved at iter 89000. 2022-08-23 04:10:05 [INFO] [TRAIN] epoch: 73, iter: 91050/160000, loss: 0.5499, lr: 0.000522, batch_cost: 0.2012, reader_cost: 0.00418, ips: 39.7622 samples/sec | ETA 03:51:12 2022-08-23 04:10:15 [INFO] [TRAIN] epoch: 73, iter: 91100/160000, loss: 0.5319, lr: 0.000522, batch_cost: 0.1962, reader_cost: 0.00096, ips: 40.7781 samples/sec | ETA 03:45:17 2022-08-23 04:10:26 [INFO] [TRAIN] epoch: 73, iter: 91150/160000, loss: 0.5745, lr: 0.000521, batch_cost: 0.2093, reader_cost: 0.00065, ips: 38.2310 samples/sec | ETA 04:00:07 2022-08-23 04:10:36 [INFO] [TRAIN] epoch: 73, iter: 91200/160000, loss: 0.5557, lr: 0.000521, batch_cost: 0.1990, reader_cost: 0.00044, ips: 40.2085 samples/sec | ETA 03:48:08 2022-08-23 04:10:46 [INFO] [TRAIN] epoch: 73, iter: 91250/160000, loss: 0.5576, lr: 0.000521, batch_cost: 0.2026, reader_cost: 0.00059, ips: 39.4852 samples/sec | ETA 03:52:09 2022-08-23 04:10:56 [INFO] [TRAIN] epoch: 73, iter: 91300/160000, loss: 0.5766, lr: 0.000520, batch_cost: 0.1985, reader_cost: 0.00063, ips: 40.3034 samples/sec | ETA 03:47:16 2022-08-23 04:11:06 [INFO] [TRAIN] epoch: 73, iter: 91350/160000, loss: 0.5465, lr: 0.000520, batch_cost: 0.2059, reader_cost: 0.00074, ips: 38.8597 samples/sec | ETA 03:55:32 2022-08-23 04:11:16 [INFO] [TRAIN] epoch: 73, iter: 91400/160000, loss: 0.5646, lr: 0.000519, batch_cost: 0.1930, reader_cost: 0.00071, ips: 41.4531 samples/sec | ETA 03:40:39 2022-08-23 04:11:26 [INFO] [TRAIN] epoch: 73, iter: 91450/160000, loss: 0.5676, lr: 0.000519, batch_cost: 0.1989, reader_cost: 0.00053, ips: 40.2240 samples/sec | ETA 03:47:13 2022-08-23 04:11:35 [INFO] [TRAIN] epoch: 73, iter: 91500/160000, loss: 0.6230, lr: 0.000519, batch_cost: 0.1840, reader_cost: 0.00059, ips: 43.4695 samples/sec | ETA 03:30:06 2022-08-23 04:11:45 [INFO] [TRAIN] epoch: 73, iter: 91550/160000, loss: 0.5868, lr: 0.000518, batch_cost: 0.2003, reader_cost: 0.00077, ips: 39.9310 samples/sec | ETA 03:48:33 2022-08-23 04:11:55 [INFO] [TRAIN] epoch: 73, iter: 91600/160000, loss: 0.5863, lr: 0.000518, batch_cost: 0.2014, reader_cost: 0.00058, ips: 39.7270 samples/sec | ETA 03:49:34 2022-08-23 04:12:05 [INFO] [TRAIN] epoch: 73, iter: 91650/160000, loss: 0.5683, lr: 0.000517, batch_cost: 0.2124, reader_cost: 0.00101, ips: 37.6579 samples/sec | ETA 04:02:00 2022-08-23 04:12:17 [INFO] [TRAIN] epoch: 73, iter: 91700/160000, loss: 0.5313, lr: 0.000517, batch_cost: 0.2302, reader_cost: 0.00064, ips: 34.7598 samples/sec | ETA 04:21:59 2022-08-23 04:12:27 [INFO] [TRAIN] epoch: 73, iter: 91750/160000, loss: 0.5749, lr: 0.000517, batch_cost: 0.2041, reader_cost: 0.00073, ips: 39.1976 samples/sec | ETA 03:52:09 2022-08-23 04:12:36 [INFO] [TRAIN] epoch: 73, iter: 91800/160000, loss: 0.5307, lr: 0.000516, batch_cost: 0.1827, reader_cost: 0.00217, ips: 43.7991 samples/sec | ETA 03:27:36 2022-08-23 04:12:47 [INFO] [TRAIN] epoch: 73, iter: 91850/160000, loss: 0.5755, lr: 0.000516, batch_cost: 0.2097, reader_cost: 0.00080, ips: 38.1444 samples/sec | ETA 03:58:13 2022-08-23 04:12:58 [INFO] [TRAIN] epoch: 73, iter: 91900/160000, loss: 0.5579, lr: 0.000516, batch_cost: 0.2192, reader_cost: 0.00059, ips: 36.4889 samples/sec | ETA 04:08:50 2022-08-23 04:13:09 [INFO] [TRAIN] epoch: 73, iter: 91950/160000, loss: 0.5429, lr: 0.000515, batch_cost: 0.2164, reader_cost: 0.00091, ips: 36.9719 samples/sec | ETA 04:05:24 2022-08-23 04:13:19 [INFO] [TRAIN] epoch: 73, iter: 92000/160000, loss: 
0.5926, lr: 0.000515, batch_cost: 0.2122, reader_cost: 0.00052, ips: 37.7076 samples/sec | ETA 04:00:26 2022-08-23 04:13:19 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 136s - batch_cost: 0.1361 - reader cost: 7.7102e-04 2022-08-23 04:15:35 [INFO] [EVAL] #Images: 2000 mIoU: 0.3429 Acc: 0.7648 Kappa: 0.7470 Dice: 0.4741 2022-08-23 04:15:35 [INFO] [EVAL] Class IoU: [0.6864 0.7741 0.9319 0.7283 0.6782 0.7659 0.773 0.7764 0.5233 0.609 0.4919 0.5521 0.6906 0.3219 0.2806 0.4249 0.5524 0.4354 0.6096 0.4098 0.7379 0.4922 0.6361 0.498 0.3118 0.2333 0.4906 0.4471 0.4105 0.201 0.2906 0.5003 0.2768 0.3448 0.2993 0.3666 0.4287 0.5719 0.2472 0.39 0.2645 0.1114 0.3365 0.249 0.2987 0.2258 0.3142 0.5217 0.6044 0.5468 0.5198 0.3905 0.2253 0.2638 0.6778 0.4559 0.8567 0.4059 0.5002 0.1906 0.1123 0.4072 0.3053 0.1884 0.438 0.6752 0.2929 0.3976 0.0953 0.3162 0.4705 0.5067 0.4013 0.2078 0.4223 0.3244 0.577 0.283 0.1921 0.116 0.5468 0.3596 0.3617 0.0258 0.0649 0.5533 0.1318 0.1038 0.2108 0.5388 0.3814 0.0485 0.2315 0.0769 0.0328 0.0108 0.0241 0.1524 0.286 0.4409 0.0897 0.0292 0.2489 0.15 0.0061 0.4661 0.0657 0.5082 0.0901 0.2686 0.0982 0.4632 0.1663 0.4494 0.827 0.0299 0.3311 0.6094 0.0744 0.1519 0.3773 0.0249 0.2315 0.1439 0.2709 0.3034 0.3981 0.4173 0.3949 0.2796 0.5455 0.0127 0.2233 0.2606 0.1658 0.1172 0.1374 0.0281 0.2038 0.3257 0.4021 0.0021 0.3802 0.1847 0.2611 0. 0.3934 0.0344 0.1662 0.1625] 2022-08-23 04:15:35 [INFO] [EVAL] Class Precision: [0.7957 0.8512 0.9677 0.8258 0.7607 0.8398 0.8696 0.8323 0.6692 0.7444 0.6725 0.7169 0.7592 0.5456 0.5548 0.5952 0.7275 0.6695 0.7559 0.6196 0.8241 0.7189 0.7876 0.6355 0.494 0.4607 0.6148 0.7705 0.682 0.303 0.4438 0.6789 0.4971 0.4769 0.441 0.6522 0.5848 0.7045 0.3782 0.6267 0.4543 0.2357 0.5497 0.4722 0.3986 0.443 0.4675 0.6954 0.6912 0.6795 0.7125 0.5925 0.3761 0.4513 0.6959 0.6241 0.8954 0.6936 0.5929 0.3281 0.2011 0.5782 0.4502 0.6876 0.5329 0.7747 0.4309 0.4982 0.2207 0.5188 0.5997 0.6375 0.5405 0.2772 0.7188 0.5452 0.7328 0.5774 0.8793 0.3591 0.6153 0.6993 0.7653 0.069 0.2241 0.7045 0.4274 0.3286 0.3845 0.7857 0.5167 0.0612 0.4211 0.3143 0.1052 0.0322 0.3959 0.3793 0.4634 0.6354 0.3817 0.0636 0.6005 0.8042 0.2784 0.5045 0.253 0.81 0.1624 0.4322 0.2368 0.6399 0.4067 0.5437 0.8385 0.1476 0.7321 0.7014 0.1061 0.5649 0.5161 0.3058 0.4204 0.6547 0.6893 0.5467 0.7806 0.5926 0.729 0.3973 0.6024 0.0826 0.297 0.5596 0.5667 0.372 0.3087 0.0756 0.4747 0.7269 0.5134 0.0032 0.5225 0.5578 0.6073 0. 
0.8448 0.3343 0.7088 0.8297] 2022-08-23 04:15:35 [INFO] [EVAL] Class Recall: [0.8333 0.8953 0.9618 0.8605 0.862 0.897 0.8743 0.9205 0.7058 0.7701 0.6469 0.7061 0.8843 0.4398 0.3621 0.5976 0.6966 0.5546 0.7591 0.5476 0.8758 0.6095 0.7678 0.6971 0.4582 0.321 0.7084 0.5158 0.5077 0.3739 0.4571 0.6555 0.3844 0.5546 0.4822 0.4557 0.6163 0.7523 0.4164 0.5081 0.3878 0.1744 0.4645 0.3451 0.5438 0.3153 0.4894 0.6763 0.8281 0.7369 0.6578 0.5338 0.3598 0.3883 0.963 0.6285 0.952 0.4946 0.7618 0.3125 0.2028 0.5792 0.4868 0.206 0.7109 0.8402 0.4777 0.6631 0.1436 0.4476 0.6858 0.7118 0.6091 0.4536 0.5058 0.4447 0.7307 0.357 0.1973 0.1463 0.8307 0.4254 0.4069 0.0395 0.0837 0.7206 0.16 0.1317 0.3181 0.6317 0.593 0.1891 0.3396 0.0924 0.0455 0.0159 0.025 0.203 0.4277 0.5902 0.1049 0.0514 0.2983 0.1557 0.0062 0.8598 0.0816 0.577 0.1685 0.415 0.1437 0.6266 0.2196 0.7216 0.9836 0.0362 0.3768 0.823 0.1997 0.1721 0.5837 0.0264 0.34 0.1558 0.3085 0.4054 0.4482 0.5852 0.4629 0.4856 0.8524 0.0148 0.4738 0.3279 0.1899 0.1461 0.1985 0.0429 0.2632 0.3711 0.6497 0.0063 0.5826 0.2164 0.3142 0. 0.4241 0.0369 0.1784 0.1681] 2022-08-23 04:15:36 [INFO] [EVAL] The model with the best validation mIoU (0.3455) was saved at iter 89000. 2022-08-23 04:15:46 [INFO] [TRAIN] epoch: 73, iter: 92050/160000, loss: 0.5564, lr: 0.000514, batch_cost: 0.2025, reader_cost: 0.00411, ips: 39.5155 samples/sec | ETA 03:49:16 2022-08-23 04:15:56 [INFO] [TRAIN] epoch: 73, iter: 92100/160000, loss: 0.5583, lr: 0.000514, batch_cost: 0.2033, reader_cost: 0.00105, ips: 39.3594 samples/sec | ETA 03:50:01 2022-08-23 04:16:05 [INFO] [TRAIN] epoch: 73, iter: 92150/160000, loss: 0.5728, lr: 0.000514, batch_cost: 0.1852, reader_cost: 0.00356, ips: 43.2004 samples/sec | ETA 03:29:24 2022-08-23 04:16:16 [INFO] [TRAIN] epoch: 74, iter: 92200/160000, loss: 0.5714, lr: 0.000513, batch_cost: 0.2096, reader_cost: 0.04719, ips: 38.1710 samples/sec | ETA 03:56:49 2022-08-23 04:16:25 [INFO] [TRAIN] epoch: 74, iter: 92250/160000, loss: 0.5661, lr: 0.000513, batch_cost: 0.1779, reader_cost: 0.00773, ips: 44.9623 samples/sec | ETA 03:20:54 2022-08-23 04:16:33 [INFO] [TRAIN] epoch: 74, iter: 92300/160000, loss: 0.5577, lr: 0.000513, batch_cost: 0.1775, reader_cost: 0.00107, ips: 45.0693 samples/sec | ETA 03:20:17 2022-08-23 04:16:43 [INFO] [TRAIN] epoch: 74, iter: 92350/160000, loss: 0.5926, lr: 0.000512, batch_cost: 0.1828, reader_cost: 0.00050, ips: 43.7706 samples/sec | ETA 03:26:04 2022-08-23 04:16:53 [INFO] [TRAIN] epoch: 74, iter: 92400/160000, loss: 0.5829, lr: 0.000512, batch_cost: 0.2088, reader_cost: 0.00055, ips: 38.3192 samples/sec | ETA 03:55:13 2022-08-23 04:17:04 [INFO] [TRAIN] epoch: 74, iter: 92450/160000, loss: 0.5191, lr: 0.000511, batch_cost: 0.2127, reader_cost: 0.00139, ips: 37.6093 samples/sec | ETA 03:59:28 2022-08-23 04:17:14 [INFO] [TRAIN] epoch: 74, iter: 92500/160000, loss: 0.5410, lr: 0.000511, batch_cost: 0.2114, reader_cost: 0.00963, ips: 37.8377 samples/sec | ETA 03:57:51 2022-08-23 04:17:24 [INFO] [TRAIN] epoch: 74, iter: 92550/160000, loss: 0.5473, lr: 0.000511, batch_cost: 0.1980, reader_cost: 0.01295, ips: 40.3943 samples/sec | ETA 03:42:38 2022-08-23 04:17:35 [INFO] [TRAIN] epoch: 74, iter: 92600/160000, loss: 0.5996, lr: 0.000510, batch_cost: 0.2116, reader_cost: 0.00731, ips: 37.8076 samples/sec | ETA 03:57:41 2022-08-23 04:17:46 [INFO] [TRAIN] epoch: 74, iter: 92650/160000, loss: 0.5487, lr: 0.000510, batch_cost: 0.2221, reader_cost: 0.01090, ips: 36.0233 samples/sec | ETA 04:09:16 2022-08-23 04:17:56 [INFO] [TRAIN] epoch: 74, 
iter: 92700/160000, loss: 0.5085, lr: 0.000510, batch_cost: 0.2027, reader_cost: 0.00074, ips: 39.4628 samples/sec | ETA 03:47:23 2022-08-23 04:18:06 [INFO] [TRAIN] epoch: 74, iter: 92750/160000, loss: 0.5627, lr: 0.000509, batch_cost: 0.2014, reader_cost: 0.00071, ips: 39.7297 samples/sec | ETA 03:45:41 2022-08-23 04:18:16 [INFO] [TRAIN] epoch: 74, iter: 92800/160000, loss: 0.5255, lr: 0.000509, batch_cost: 0.1987, reader_cost: 0.00072, ips: 40.2660 samples/sec | ETA 03:42:31 2022-08-23 04:18:27 [INFO] [TRAIN] epoch: 74, iter: 92850/160000, loss: 0.5490, lr: 0.000508, batch_cost: 0.2108, reader_cost: 0.00052, ips: 37.9419 samples/sec | ETA 03:55:58 2022-08-23 04:18:37 [INFO] [TRAIN] epoch: 74, iter: 92900/160000, loss: 0.5709, lr: 0.000508, batch_cost: 0.2132, reader_cost: 0.00078, ips: 37.5237 samples/sec | ETA 03:58:25 2022-08-23 04:18:47 [INFO] [TRAIN] epoch: 74, iter: 92950/160000, loss: 0.5893, lr: 0.000508, batch_cost: 0.1946, reader_cost: 0.00052, ips: 41.1037 samples/sec | ETA 03:37:29 2022-08-23 04:18:58 [INFO] [TRAIN] epoch: 74, iter: 93000/160000, loss: 0.5875, lr: 0.000507, batch_cost: 0.2307, reader_cost: 0.00080, ips: 34.6700 samples/sec | ETA 04:17:40 2022-08-23 04:18:58 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 173s - batch_cost: 0.1726 - reader cost: 6.0585e-04 2022-08-23 04:21:51 [INFO] [EVAL] #Images: 2000 mIoU: 0.3410 Acc: 0.7645 Kappa: 0.7466 Dice: 0.4715 2022-08-23 04:21:51 [INFO] [EVAL] Class IoU: [0.6866 0.7735 0.9308 0.7309 0.6866 0.7612 0.7755 0.7734 0.5213 0.5862 0.4928 0.5563 0.6929 0.3118 0.3282 0.4217 0.5351 0.4358 0.6129 0.4014 0.7399 0.5111 0.6323 0.4895 0.3048 0.3172 0.5296 0.4648 0.4021 0.2191 0.2758 0.4862 0.2507 0.3607 0.3428 0.3984 0.4322 0.5735 0.27 0.3867 0.2094 0.0867 0.3509 0.2483 0.3185 0.2496 0.2284 0.5266 0.4728 0.5255 0.5415 0.39 0.2046 0.2246 0.7157 0.3823 0.8633 0.4166 0.5224 0.2053 0.1328 0.4471 0.3308 0.2447 0.4234 0.6727 0.2602 0.4201 0.0743 0.3282 0.4968 0.505 0.3869 0.2253 0.4213 0.3444 0.544 0.3208 0.2333 0.1094 0.5749 0.3887 0.3143 0.0119 0.0604 0.5471 0.11 0.0652 0.2515 0.5335 0.4386 0.0637 0.2477 0.1337 0.0411 0.0095 0.0274 0.1458 0.2861 0.4106 0.0418 0.0189 0.2421 0.1126 0.0129 0.5601 0.0657 0.5155 0.1109 0.2028 0.1171 0.2228 0.1308 0.3591 0.7719 0.0107 0.2942 0.6058 0.0751 0.2204 0.38 0.0118 0.219 0.1157 0.2587 0.2717 0.4225 0.3979 0.4493 0.3291 0.569 0.0165 0.2517 0.2637 0.1729 0.1168 0.1367 0.026 0.2274 0.3627 0.3637 0.0005 0.3485 0.1035 0.246 0. 
0.3475 0.0274 0.1908 0.169 ] 2022-08-23 04:21:51 [INFO] [EVAL] Class Precision: [0.7898 0.8505 0.9593 0.8175 0.7818 0.8653 0.891 0.8373 0.6958 0.7489 0.7273 0.7091 0.7553 0.5078 0.5126 0.5881 0.7215 0.672 0.742 0.6287 0.8329 0.7069 0.7565 0.6159 0.5092 0.4711 0.6902 0.755 0.7142 0.3063 0.4261 0.6289 0.4476 0.4911 0.4904 0.5382 0.6212 0.7713 0.436 0.5824 0.3318 0.2472 0.587 0.565 0.4608 0.5093 0.4652 0.721 0.6681 0.626 0.7476 0.5391 0.3721 0.3797 0.7624 0.4905 0.937 0.6457 0.6757 0.3707 0.1617 0.5773 0.4812 0.6578 0.5042 0.7653 0.3407 0.6027 0.1825 0.4987 0.7004 0.6285 0.6138 0.2932 0.7523 0.5312 0.6544 0.6175 0.7743 0.4017 0.6594 0.6611 0.81 0.0534 0.1986 0.7541 0.3357 0.4236 0.4764 0.7162 0.6442 0.0944 0.4504 0.3536 0.1223 0.0298 0.3793 0.3468 0.3938 0.6786 0.2563 0.0371 0.4132 0.6932 0.1799 0.8389 0.196 0.8423 0.2288 0.3758 0.252 0.2545 0.3698 0.5775 0.7792 0.0906 0.7347 0.7103 0.1264 0.523 0.5213 0.1533 0.5458 0.6259 0.6917 0.5878 0.9056 0.5126 0.6649 0.635 0.6345 0.1257 0.3281 0.5773 0.655 0.2244 0.3499 0.0691 0.465 0.5555 0.4283 0.0007 0.6226 0.3542 0.6447 0. 0.8918 0.3514 0.6403 0.8336] 2022-08-23 04:21:51 [INFO] [EVAL] Class Recall: [0.8401 0.8952 0.9691 0.8735 0.8494 0.8635 0.8568 0.9101 0.6752 0.7296 0.6045 0.7207 0.8935 0.4468 0.4771 0.5984 0.6744 0.5536 0.7788 0.5261 0.8688 0.6485 0.7938 0.7046 0.4317 0.4927 0.6947 0.5473 0.4792 0.4347 0.4387 0.6818 0.3629 0.576 0.5324 0.6055 0.5869 0.6909 0.4149 0.535 0.3619 0.1179 0.466 0.3069 0.5077 0.3286 0.3097 0.6614 0.6179 0.7661 0.6626 0.585 0.3126 0.3547 0.9212 0.6341 0.9165 0.54 0.6972 0.3152 0.4258 0.6648 0.5142 0.2804 0.7255 0.8475 0.524 0.5811 0.1114 0.4899 0.6308 0.7199 0.5113 0.4933 0.4892 0.4948 0.7634 0.4004 0.2504 0.1307 0.8176 0.4854 0.3393 0.015 0.0799 0.6659 0.1406 0.0715 0.3475 0.6765 0.5787 0.1635 0.355 0.177 0.0584 0.0137 0.0286 0.201 0.5113 0.5097 0.0476 0.037 0.3689 0.1185 0.0137 0.6276 0.0899 0.5705 0.1771 0.3059 0.1795 0.6414 0.1683 0.4871 0.988 0.012 0.3292 0.8046 0.1562 0.2758 0.5837 0.0126 0.2678 0.1243 0.2924 0.3356 0.4419 0.6401 0.5809 0.4059 0.8465 0.0187 0.5195 0.3268 0.1902 0.196 0.1832 0.04 0.3079 0.5111 0.7067 0.0014 0.4419 0.1275 0.2846 0. 0.3628 0.0289 0.2136 0.1749] 2022-08-23 04:21:51 [INFO] [EVAL] The model with the best validation mIoU (0.3455) was saved at iter 89000. 
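
The [TRAIN] lines are also internally consistent: ips multiplied by batch_cost recovers the number of samples this process counts per step (≈ 8 throughout), and the ETA is just the remaining iterations times the current batch_cost. The following is a sketch of that arithmetic under an assumed reading of the fields (it is not taken from train.py), using the values from the iter 92700 line above; small second-level differences come from batch_cost being rounded in the log.

    from datetime import timedelta

    # Values copied from the [TRAIN] line at iter 92700 above; batch_cost is in s/step.
    total_iters, it = 160000, 92700
    batch_cost, ips = 0.2027, 39.4628

    print(round(ips * batch_cost, 3))  # ~8.0 samples per step for this process
    eta = timedelta(seconds=round((total_iters - it) * batch_cost))
    print(eta)                         # 3:47:22, within a couple of seconds of the logged ETA 03:47:23
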
2022-08-23 04:22:01 [INFO] [TRAIN] epoch: 74, iter: 93050/160000, loss: 0.5512, lr: 0.000507, batch_cost: 0.1899, reader_cost: 0.00386, ips: 42.1347 samples/sec | ETA 03:31:51 2022-08-23 04:22:10 [INFO] [TRAIN] epoch: 74, iter: 93100/160000, loss: 0.5197, lr: 0.000507, batch_cost: 0.1848, reader_cost: 0.00125, ips: 43.2843 samples/sec | ETA 03:26:04 2022-08-23 04:22:19 [INFO] [TRAIN] epoch: 74, iter: 93150/160000, loss: 0.5374, lr: 0.000506, batch_cost: 0.1834, reader_cost: 0.00064, ips: 43.6100 samples/sec | ETA 03:24:23 2022-08-23 04:22:27 [INFO] [TRAIN] epoch: 74, iter: 93200/160000, loss: 0.5849, lr: 0.000506, batch_cost: 0.1627, reader_cost: 0.00146, ips: 49.1627 samples/sec | ETA 03:01:10 2022-08-23 04:22:36 [INFO] [TRAIN] epoch: 74, iter: 93250/160000, loss: 0.6063, lr: 0.000505, batch_cost: 0.1759, reader_cost: 0.00094, ips: 45.4824 samples/sec | ETA 03:15:40 2022-08-23 04:22:45 [INFO] [TRAIN] epoch: 74, iter: 93300/160000, loss: 0.5075, lr: 0.000505, batch_cost: 0.1660, reader_cost: 0.00062, ips: 48.2023 samples/sec | ETA 03:04:30 2022-08-23 04:22:55 [INFO] [TRAIN] epoch: 74, iter: 93350/160000, loss: 0.5540, lr: 0.000505, batch_cost: 0.2021, reader_cost: 0.00091, ips: 39.5826 samples/sec | ETA 03:44:30 2022-08-23 04:23:03 [INFO] [TRAIN] epoch: 74, iter: 93400/160000, loss: 0.5631, lr: 0.000504, batch_cost: 0.1635, reader_cost: 0.00075, ips: 48.9325 samples/sec | ETA 03:01:28 2022-08-23 04:23:12 [INFO] [TRAIN] epoch: 74, iter: 93450/160000, loss: 0.5559, lr: 0.000504, batch_cost: 0.1868, reader_cost: 0.00058, ips: 42.8323 samples/sec | ETA 03:27:09 2022-08-23 04:23:28 [INFO] [TRAIN] epoch: 75, iter: 93500/160000, loss: 0.5680, lr: 0.000503, batch_cost: 0.3092, reader_cost: 0.07135, ips: 25.8733 samples/sec | ETA 05:42:41 2022-08-23 04:23:37 [INFO] [TRAIN] epoch: 75, iter: 93550/160000, loss: 0.5537, lr: 0.000503, batch_cost: 0.1900, reader_cost: 0.00093, ips: 42.1073 samples/sec | ETA 03:30:24 2022-08-23 04:23:49 [INFO] [TRAIN] epoch: 75, iter: 93600/160000, loss: 0.5861, lr: 0.000503, batch_cost: 0.2305, reader_cost: 0.00054, ips: 34.7051 samples/sec | ETA 04:15:06 2022-08-23 04:24:00 [INFO] [TRAIN] epoch: 75, iter: 93650/160000, loss: 0.5537, lr: 0.000502, batch_cost: 0.2325, reader_cost: 0.00068, ips: 34.4067 samples/sec | ETA 04:17:07 2022-08-23 04:24:11 [INFO] [TRAIN] epoch: 75, iter: 93700/160000, loss: 0.5401, lr: 0.000502, batch_cost: 0.2090, reader_cost: 0.00073, ips: 38.2783 samples/sec | ETA 03:50:56 2022-08-23 04:24:22 [INFO] [TRAIN] epoch: 75, iter: 93750/160000, loss: 0.5537, lr: 0.000502, batch_cost: 0.2319, reader_cost: 0.00084, ips: 34.4933 samples/sec | ETA 04:16:05 2022-08-23 04:24:34 [INFO] [TRAIN] epoch: 75, iter: 93800/160000, loss: 0.5437, lr: 0.000501, batch_cost: 0.2280, reader_cost: 0.00058, ips: 35.0809 samples/sec | ETA 04:11:36 2022-08-23 04:24:45 [INFO] [TRAIN] epoch: 75, iter: 93850/160000, loss: 0.5443, lr: 0.000501, batch_cost: 0.2274, reader_cost: 0.00059, ips: 35.1854 samples/sec | ETA 04:10:40 2022-08-23 04:24:55 [INFO] [TRAIN] epoch: 75, iter: 93900/160000, loss: 0.5456, lr: 0.000500, batch_cost: 0.2041, reader_cost: 0.00096, ips: 39.1938 samples/sec | ETA 03:44:51 2022-08-23 04:25:07 [INFO] [TRAIN] epoch: 75, iter: 93950/160000, loss: 0.5431, lr: 0.000500, batch_cost: 0.2280, reader_cost: 0.00054, ips: 35.0839 samples/sec | ETA 04:11:01 2022-08-23 04:25:17 [INFO] [TRAIN] epoch: 75, iter: 94000/160000, loss: 0.5149, lr: 0.000500, batch_cost: 0.2074, reader_cost: 0.00068, ips: 38.5733 samples/sec | ETA 03:48:08 2022-08-23 04:25:17 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 186s - batch_cost: 0.1858 - reader cost: 0.0012 2022-08-23 04:28:23 [INFO] [EVAL] #Images: 2000 mIoU: 0.3424 Acc: 0.7641 Kappa: 0.7460 Dice: 0.4734 2022-08-23 04:28:23 [INFO] [EVAL] Class IoU: [0.6835 0.7807 0.9318 0.7257 0.6644 0.7552 0.7752 0.7892 0.5148 0.6034 0.4862 0.5523 0.6905 0.3082 0.3004 0.4176 0.5143 0.3861 0.6147 0.4125 0.7356 0.5238 0.6287 0.4882 0.3229 0.319 0.5209 0.4535 0.3861 0.2353 0.2602 0.487 0.2504 0.3561 0.3206 0.3881 0.4131 0.5554 0.2838 0.3699 0.2651 0.0919 0.3384 0.248 0.2978 0.1984 0.2712 0.5212 0.6271 0.5225 0.524 0.3951 0.203 0.2109 0.6893 0.3735 0.8591 0.4345 0.4864 0.2349 0.0977 0.4626 0.3448 0.2279 0.4369 0.6881 0.2497 0.4125 0.0792 0.3403 0.4834 0.4773 0.4025 0.2246 0.4412 0.321 0.4887 0.263 0.2272 0.1491 0.5608 0.3786 0.3816 0.0293 0.0998 0.5556 0.0858 0.106 0.1941 0.4704 0.4131 0.0566 0.2189 0.0765 0.0219 0.0105 0.0225 0.1388 0.2797 0.452 0.1054 0.0226 0.2639 0.1028 0.0844 0.5877 0.0886 0.5051 0.0632 0.2326 0.1398 0.0667 0.141 0.5576 0.7677 0.0115 0.3079 0.5652 0.1393 0.1372 0.4043 0.0141 0.2216 0.1213 0.3658 0.3005 0.45 0.4371 0.4262 0.3345 0.506 0.0251 0.2524 0.2295 0.174 0.1215 0.1398 0.037 0.1983 0.374 0.4569 0.013 0.3941 0.0674 0.281 0. 0.3708 0.0352 0.1844 0.1689] 2022-08-23 04:28:23 [INFO] [EVAL] Class Precision: [0.7791 0.8488 0.968 0.8161 0.7364 0.8742 0.8687 0.87 0.6878 0.7566 0.6944 0.7432 0.7527 0.5072 0.5442 0.5907 0.7306 0.6719 0.7443 0.6368 0.8225 0.6861 0.8178 0.5872 0.4732 0.5106 0.6769 0.69 0.6733 0.3332 0.4751 0.6447 0.5615 0.4738 0.4655 0.5926 0.5785 0.7987 0.496 0.623 0.4018 0.2034 0.6001 0.5518 0.4208 0.4025 0.4666 0.7307 0.7552 0.6277 0.7071 0.6635 0.3141 0.5523 0.7307 0.5034 0.9062 0.5982 0.6912 0.3942 0.1566 0.6478 0.5249 0.7046 0.5158 0.7919 0.3405 0.5619 0.3523 0.5409 0.6624 0.7374 0.5503 0.2999 0.7268 0.5588 0.5656 0.5373 0.7385 0.4292 0.6465 0.6477 0.7574 0.1044 0.2909 0.7396 0.414 0.3782 0.307 0.7425 0.5634 0.0754 0.3509 0.3596 0.1435 0.0345 0.4888 0.3507 0.4743 0.6018 0.2839 0.0802 0.4239 0.6147 0.5356 0.6934 0.245 0.8792 0.2382 0.5114 0.2503 0.0683 0.397 0.6404 0.775 0.0726 0.7806 0.6397 0.1876 0.5157 0.4862 0.2251 0.7309 0.7264 0.7346 0.5086 0.8128 0.6681 0.8346 0.5422 0.7374 0.196 0.4052 0.5912 0.555 0.2783 0.2683 0.1033 0.4593 0.6356 0.5906 0.019 0.6111 0.3041 0.5619 0. 0.8795 0.3643 0.6457 0.6112] 2022-08-23 04:28:23 [INFO] [EVAL] Class Recall: [0.8477 0.9068 0.9613 0.8676 0.8718 0.8472 0.878 0.8948 0.6719 0.7487 0.6185 0.6825 0.8931 0.4399 0.4015 0.5876 0.6346 0.4757 0.7792 0.5395 0.8744 0.6889 0.7311 0.7433 0.5041 0.4594 0.6934 0.5694 0.4751 0.4449 0.3653 0.6657 0.3112 0.5891 0.5075 0.5294 0.5911 0.6458 0.3989 0.4766 0.438 0.1435 0.4369 0.3106 0.5049 0.2811 0.3931 0.6451 0.7872 0.7571 0.6692 0.4941 0.3648 0.2544 0.9242 0.5914 0.943 0.6136 0.6215 0.3676 0.2063 0.6181 0.5012 0.252 0.7407 0.84 0.4837 0.608 0.0927 0.4786 0.6415 0.575 0.5997 0.4724 0.5289 0.4299 0.7824 0.34 0.2471 0.186 0.8089 0.4767 0.4347 0.0392 0.1319 0.6908 0.0977 0.1284 0.3456 0.5621 0.6077 0.1851 0.3677 0.0886 0.0252 0.0149 0.023 0.1867 0.4054 0.6449 0.1436 0.0306 0.4116 0.1099 0.0911 0.794 0.1218 0.5428 0.0792 0.2991 0.2406 0.7374 0.1795 0.8119 0.9878 0.0135 0.3371 0.8291 0.3511 0.1575 0.7058 0.0148 0.2412 0.1271 0.4215 0.4234 0.5021 0.5583 0.4655 0.4662 0.6172 0.0279 0.4009 0.2728 0.2022 0.1774 0.226 0.0545 0.2586 0.476 0.6687 0.0394 0.5261 0.0797 0.3598 0. 
0.3906 0.0375 0.2052 0.1893] 2022-08-23 04:28:23 [INFO] [EVAL] The model with the best validation mIoU (0.3455) was saved at iter 89000. 2022-08-23 04:28:32 [INFO] [TRAIN] epoch: 75, iter: 94050/160000, loss: 0.5381, lr: 0.000499, batch_cost: 0.1765, reader_cost: 0.00416, ips: 45.3190 samples/sec | ETA 03:14:01 2022-08-23 04:28:40 [INFO] [TRAIN] epoch: 75, iter: 94100/160000, loss: 0.5828, lr: 0.000499, batch_cost: 0.1680, reader_cost: 0.00081, ips: 47.6326 samples/sec | ETA 03:04:28 2022-08-23 04:28:50 [INFO] [TRAIN] epoch: 75, iter: 94150/160000, loss: 0.5944, lr: 0.000499, batch_cost: 0.1833, reader_cost: 0.00099, ips: 43.6551 samples/sec | ETA 03:21:07 2022-08-23 04:28:59 [INFO] [TRAIN] epoch: 75, iter: 94200/160000, loss: 0.5903, lr: 0.000498, batch_cost: 0.1791, reader_cost: 0.00042, ips: 44.6575 samples/sec | ETA 03:16:27 2022-08-23 04:29:09 [INFO] [TRAIN] epoch: 75, iter: 94250/160000, loss: 0.5045, lr: 0.000498, batch_cost: 0.2042, reader_cost: 0.00235, ips: 39.1678 samples/sec | ETA 03:43:49 2022-08-23 04:29:20 [INFO] [TRAIN] epoch: 75, iter: 94300/160000, loss: 0.5173, lr: 0.000497, batch_cost: 0.2200, reader_cost: 0.00052, ips: 36.3596 samples/sec | ETA 04:00:55 2022-08-23 04:29:28 [INFO] [TRAIN] epoch: 75, iter: 94350/160000, loss: 0.5514, lr: 0.000497, batch_cost: 0.1668, reader_cost: 0.00083, ips: 47.9581 samples/sec | ETA 03:02:31 2022-08-23 04:29:38 [INFO] [TRAIN] epoch: 75, iter: 94400/160000, loss: 0.5871, lr: 0.000497, batch_cost: 0.1972, reader_cost: 0.00260, ips: 40.5680 samples/sec | ETA 03:35:36 2022-08-23 04:29:49 [INFO] [TRAIN] epoch: 75, iter: 94450/160000, loss: 0.5502, lr: 0.000496, batch_cost: 0.2262, reader_cost: 0.00045, ips: 35.3594 samples/sec | ETA 04:07:10 2022-08-23 04:30:02 [INFO] [TRAIN] epoch: 75, iter: 94500/160000, loss: 0.5292, lr: 0.000496, batch_cost: 0.2562, reader_cost: 0.00082, ips: 31.2298 samples/sec | ETA 04:39:38 2022-08-23 04:30:14 [INFO] [TRAIN] epoch: 75, iter: 94550/160000, loss: 0.5647, lr: 0.000496, batch_cost: 0.2333, reader_cost: 0.00071, ips: 34.2913 samples/sec | ETA 04:14:29 2022-08-23 04:30:27 [INFO] [TRAIN] epoch: 75, iter: 94600/160000, loss: 0.5899, lr: 0.000495, batch_cost: 0.2539, reader_cost: 0.00093, ips: 31.5081 samples/sec | ETA 04:36:45 2022-08-23 04:30:39 [INFO] [TRAIN] epoch: 75, iter: 94650/160000, loss: 0.5674, lr: 0.000495, batch_cost: 0.2435, reader_cost: 0.00042, ips: 32.8540 samples/sec | ETA 04:25:12 2022-08-23 04:30:50 [INFO] [TRAIN] epoch: 75, iter: 94700/160000, loss: 0.5475, lr: 0.000494, batch_cost: 0.2161, reader_cost: 0.00067, ips: 37.0138 samples/sec | ETA 03:55:13 2022-08-23 04:31:02 [INFO] [TRAIN] epoch: 76, iter: 94750/160000, loss: 0.5408, lr: 0.000494, batch_cost: 0.2568, reader_cost: 0.04700, ips: 31.1529 samples/sec | ETA 04:39:16 2022-08-23 04:31:13 [INFO] [TRAIN] epoch: 76, iter: 94800/160000, loss: 0.5277, lr: 0.000494, batch_cost: 0.2156, reader_cost: 0.00067, ips: 37.1007 samples/sec | ETA 03:54:19 2022-08-23 04:31:24 [INFO] [TRAIN] epoch: 76, iter: 94850/160000, loss: 0.5664, lr: 0.000493, batch_cost: 0.2093, reader_cost: 0.00036, ips: 38.2156 samples/sec | ETA 03:47:18 2022-08-23 04:31:36 [INFO] [TRAIN] epoch: 76, iter: 94900/160000, loss: 0.5685, lr: 0.000493, batch_cost: 0.2401, reader_cost: 0.00043, ips: 33.3149 samples/sec | ETA 04:20:32 2022-08-23 04:31:47 [INFO] [TRAIN] epoch: 76, iter: 94950/160000, loss: 0.5227, lr: 0.000492, batch_cost: 0.2199, reader_cost: 0.00058, ips: 36.3816 samples/sec | ETA 03:58:23 2022-08-23 04:31:56 [INFO] [TRAIN] epoch: 76, iter: 95000/160000, loss: 
0.5564, lr: 0.000492, batch_cost: 0.1940, reader_cost: 0.00899, ips: 41.2462 samples/sec | ETA 03:30:07 2022-08-23 04:31:56 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 177s - batch_cost: 0.1770 - reader cost: 8.1710e-04 2022-08-23 04:34:53 [INFO] [EVAL] #Images: 2000 mIoU: 0.3393 Acc: 0.7648 Kappa: 0.7467 Dice: 0.4683 2022-08-23 04:34:53 [INFO] [EVAL] Class IoU: [0.6863 0.7736 0.9335 0.7305 0.6783 0.7677 0.7748 0.7766 0.5318 0.6 0.4921 0.542 0.6982 0.3123 0.318 0.4201 0.4979 0.3905 0.6123 0.4121 0.7387 0.4737 0.6333 0.5012 0.2762 0.198 0.5116 0.4576 0.4262 0.2453 0.2813 0.4883 0.3144 0.3615 0.292 0.4089 0.41 0.5585 0.2809 0.3901 0.2009 0.0923 0.3538 0.2582 0.3153 0.2021 0.2505 0.5221 0.6639 0.5698 0.5281 0.3732 0.2214 0.228 0.7052 0.3468 0.8658 0.3383 0.4326 0.2242 0.1108 0.3552 0.3433 0.1774 0.4543 0.6825 0.2488 0.4 0.1125 0.3234 0.492 0.5296 0.4153 0.1916 0.4371 0.3299 0.5576 0.3141 0.2243 0.102 0.5911 0.3666 0.3251 0.0248 0.0613 0.5616 0.0992 0.0751 0.2325 0.5311 0.421 0.0385 0.2119 0.1162 0.0365 0.0096 0.0288 0.1429 0.2756 0.4195 0.0656 0.054 0.2209 0.1282 0.007 0.5543 0.057 0.5166 0.0799 0.3232 0.1085 0.1132 0.1278 0.4928 0.8457 0.0158 0.3301 0.6445 0.1212 0.1608 0.3887 0.0122 0.2087 0.1378 0.3403 0.2855 0.4324 0.4163 0.3053 0.3311 0.5224 0.0448 0.3055 0.2522 0.1547 0.1184 0.1519 0.0327 0.1852 0.3576 0.3992 0.0014 0.369 0.0079 0.2791 0.0036 0.3249 0.0348 0.1212 0.1381] 2022-08-23 04:34:53 [INFO] [EVAL] Class Precision: [0.784 0.8377 0.9625 0.8291 0.7683 0.8688 0.8626 0.8276 0.6769 0.7261 0.727 0.7166 0.7773 0.5255 0.5528 0.5858 0.702 0.6988 0.7835 0.6368 0.8211 0.6771 0.7645 0.6357 0.5121 0.4015 0.6606 0.6974 0.6845 0.3337 0.4762 0.5876 0.4986 0.4775 0.4764 0.5371 0.6107 0.7506 0.499 0.6209 0.4198 0.234 0.596 0.5339 0.4027 0.4853 0.5009 0.7435 0.7682 0.6825 0.6926 0.5558 0.3791 0.509 0.7424 0.4214 0.8961 0.7074 0.6199 0.4682 0.1556 0.593 0.5796 0.6485 0.5696 0.764 0.3167 0.5371 0.219 0.5071 0.6486 0.6903 0.594 0.2873 0.6902 0.5317 0.6482 0.537 0.8127 0.2177 0.6833 0.6149 0.8158 0.1045 0.226 0.7452 0.3733 0.4238 0.482 0.7281 0.6163 0.0752 0.3914 0.3918 0.1544 0.0303 0.3813 0.3532 0.3744 0.6204 0.3837 0.1518 0.5643 0.5964 0.6955 0.6583 0.2393 0.8451 0.1898 0.522 0.2367 0.1211 0.5237 0.755 0.8596 0.1487 0.7916 0.7638 0.1896 0.4282 0.5255 0.1798 0.6401 0.6204 0.7038 0.6727 0.7512 0.6337 0.383 0.5168 0.5857 0.249 0.4466 0.5975 0.4734 0.3196 0.3303 0.0826 0.417 0.6344 0.4716 0.0021 0.6863 0.1545 0.5388 0.0039 0.8773 0.3674 0.5506 0.8071] 2022-08-23 04:34:53 [INFO] [EVAL] Class Recall: [0.8463 0.91 0.9687 0.86 0.8527 0.8684 0.8839 0.9265 0.7127 0.7755 0.6036 0.6899 0.8728 0.435 0.4281 0.5977 0.6314 0.4695 0.737 0.5388 0.8804 0.612 0.7868 0.7033 0.375 0.281 0.694 0.571 0.5305 0.4807 0.4075 0.743 0.4598 0.5981 0.4301 0.6313 0.5551 0.6857 0.3912 0.5121 0.2782 0.1323 0.4654 0.3333 0.5921 0.2571 0.3337 0.6368 0.8302 0.7754 0.6898 0.5318 0.3474 0.2923 0.9335 0.6619 0.9624 0.3933 0.5888 0.3009 0.2779 0.4697 0.4571 0.1963 0.6918 0.8648 0.5373 0.6105 0.1878 0.4717 0.6708 0.6947 0.58 0.3653 0.5437 0.4651 0.7996 0.4307 0.2365 0.1612 0.8141 0.4758 0.3508 0.0314 0.0775 0.695 0.119 0.0836 0.31 0.6624 0.5705 0.073 0.3161 0.1417 0.0456 0.0139 0.0302 0.1936 0.5106 0.5644 0.0733 0.0774 0.2664 0.1403 0.007 0.7782 0.0695 0.5707 0.1213 0.4591 0.1669 0.6332 0.1446 0.5866 0.9812 0.0173 0.3615 0.8049 0.2515 0.2048 0.5989 0.013 0.2364 0.1504 0.3972 0.3316 0.5046 0.5483 0.6007 0.4795 0.8285 0.0517 0.4917 0.3038 0.1869 0.1582 0.2195 0.0513 0.2499 0.4504 0.7223 0.0045 0.4439 0.0083 
0.3667 0.0431 0.3404 0.0371 0.1345 0.1428] 2022-08-23 04:34:54 [INFO] [EVAL] The model with the best validation mIoU (0.3455) was saved at iter 89000. 2022-08-23 04:35:03 [INFO] [TRAIN] epoch: 76, iter: 95050/160000, loss: 0.5292, lr: 0.000492, batch_cost: 0.1881, reader_cost: 0.00332, ips: 42.5356 samples/sec | ETA 03:23:35 2022-08-23 04:35:14 [INFO] [TRAIN] epoch: 76, iter: 95100/160000, loss: 0.5359, lr: 0.000491, batch_cost: 0.2144, reader_cost: 0.00159, ips: 37.3162 samples/sec | ETA 03:51:53 2022-08-23 04:35:23 [INFO] [TRAIN] epoch: 76, iter: 95150/160000, loss: 0.5643, lr: 0.000491, batch_cost: 0.1873, reader_cost: 0.00078, ips: 42.7067 samples/sec | ETA 03:22:27 2022-08-23 04:35:33 [INFO] [TRAIN] epoch: 76, iter: 95200/160000, loss: 0.5816, lr: 0.000491, batch_cost: 0.1993, reader_cost: 0.00076, ips: 40.1306 samples/sec | ETA 03:35:17 2022-08-23 04:35:43 [INFO] [TRAIN] epoch: 76, iter: 95250/160000, loss: 0.5444, lr: 0.000490, batch_cost: 0.1926, reader_cost: 0.00069, ips: 41.5270 samples/sec | ETA 03:27:53 2022-08-23 04:35:53 [INFO] [TRAIN] epoch: 76, iter: 95300/160000, loss: 0.5707, lr: 0.000490, batch_cost: 0.2054, reader_cost: 0.00060, ips: 38.9559 samples/sec | ETA 03:41:26 2022-08-23 04:36:05 [INFO] [TRAIN] epoch: 76, iter: 95350/160000, loss: 0.5698, lr: 0.000489, batch_cost: 0.2326, reader_cost: 0.00082, ips: 34.3955 samples/sec | ETA 04:10:36 2022-08-23 04:36:16 [INFO] [TRAIN] epoch: 76, iter: 95400/160000, loss: 0.5447, lr: 0.000489, batch_cost: 0.2217, reader_cost: 0.00056, ips: 36.0913 samples/sec | ETA 03:58:39 2022-08-23 04:36:26 [INFO] [TRAIN] epoch: 76, iter: 95450/160000, loss: 0.5494, lr: 0.000489, batch_cost: 0.2101, reader_cost: 0.00070, ips: 38.0692 samples/sec | ETA 03:46:04 2022-08-23 04:36:36 [INFO] [TRAIN] epoch: 76, iter: 95500/160000, loss: 0.5618, lr: 0.000488, batch_cost: 0.1986, reader_cost: 0.00034, ips: 40.2865 samples/sec | ETA 03:33:28 2022-08-23 04:36:47 [INFO] [TRAIN] epoch: 76, iter: 95550/160000, loss: 0.5561, lr: 0.000488, batch_cost: 0.2238, reader_cost: 0.00050, ips: 35.7389 samples/sec | ETA 04:00:26 2022-08-23 04:36:57 [INFO] [TRAIN] epoch: 76, iter: 95600/160000, loss: 0.5738, lr: 0.000488, batch_cost: 0.1910, reader_cost: 0.00061, ips: 41.8866 samples/sec | ETA 03:24:59 2022-08-23 04:37:07 [INFO] [TRAIN] epoch: 76, iter: 95650/160000, loss: 0.5359, lr: 0.000487, batch_cost: 0.2082, reader_cost: 0.00044, ips: 38.4293 samples/sec | ETA 03:43:16 2022-08-23 04:37:17 [INFO] [TRAIN] epoch: 76, iter: 95700/160000, loss: 0.5885, lr: 0.000487, batch_cost: 0.1950, reader_cost: 0.00282, ips: 41.0344 samples/sec | ETA 03:28:55 2022-08-23 04:37:28 [INFO] [TRAIN] epoch: 76, iter: 95750/160000, loss: 0.5376, lr: 0.000486, batch_cost: 0.2125, reader_cost: 0.00051, ips: 37.6510 samples/sec | ETA 03:47:31 2022-08-23 04:37:38 [INFO] [TRAIN] epoch: 76, iter: 95800/160000, loss: 0.5727, lr: 0.000486, batch_cost: 0.2092, reader_cost: 0.00038, ips: 38.2389 samples/sec | ETA 03:43:51 2022-08-23 04:37:49 [INFO] [TRAIN] epoch: 76, iter: 95850/160000, loss: 0.5434, lr: 0.000486, batch_cost: 0.2072, reader_cost: 0.00054, ips: 38.6018 samples/sec | ETA 03:41:34 2022-08-23 04:38:00 [INFO] [TRAIN] epoch: 76, iter: 95900/160000, loss: 0.5890, lr: 0.000485, batch_cost: 0.2295, reader_cost: 0.00073, ips: 34.8604 samples/sec | ETA 04:05:10 2022-08-23 04:38:11 [INFO] [TRAIN] epoch: 76, iter: 95950/160000, loss: 0.5449, lr: 0.000485, batch_cost: 0.2151, reader_cost: 0.00048, ips: 37.1889 samples/sec | ETA 03:49:38 2022-08-23 04:38:24 [INFO] [TRAIN] epoch: 77, iter: 
96000/160000, loss: 0.5444, lr: 0.000485, batch_cost: 0.2726, reader_cost: 0.06583, ips: 29.3450 samples/sec | ETA 04:50:47 2022-08-23 04:38:24 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 172s - batch_cost: 0.1719 - reader cost: 7.4961e-04 2022-08-23 04:41:16 [INFO] [EVAL] #Images: 2000 mIoU: 0.3424 Acc: 0.7651 Kappa: 0.7470 Dice: 0.4728 2022-08-23 04:41:16 [INFO] [EVAL] Class IoU: [0.6896 0.7769 0.9321 0.7316 0.675 0.7602 0.771 0.774 0.5283 0.6199 0.4948 0.5319 0.6929 0.3231 0.2996 0.4215 0.4961 0.4185 0.6235 0.4054 0.7438 0.4445 0.6336 0.4941 0.3174 0.2519 0.4561 0.4577 0.3614 0.2556 0.2741 0.4894 0.31 0.3793 0.2786 0.3904 0.4258 0.5255 0.2732 0.3965 0.2261 0.0908 0.3592 0.2591 0.3091 0.1726 0.3333 0.5148 0.677 0.5125 0.522 0.3901 0.2243 0.229 0.6931 0.4116 0.8783 0.3734 0.425 0.1965 0.0765 0.3302 0.322 0.2359 0.4497 0.676 0.2841 0.4077 0.0851 0.3467 0.4868 0.467 0.409 0.1981 0.421 0.342 0.5104 0.2855 0.2283 0.1125 0.5704 0.3974 0.3497 0.0348 0.0706 0.5645 0.0925 0.0913 0.2881 0.4907 0.3951 0.0378 0.1772 0.0826 0.0323 0.0063 0.03 0.1336 0.2564 0.4657 0.0636 0.026 0.2872 0.6721 0.0488 0.5079 0.0599 0.5291 0.0861 0.2007 0.1106 0.3748 0.1395 0.49 0.8251 0.0778 0.258 0.6314 0.1521 0.1123 0.3778 0.02 0.2152 0.168 0.2583 0.2906 0.4622 0.4087 0.3005 0.3183 0.5474 0.0175 0.268 0.2308 0.1508 0.1244 0.1418 0.0247 0.1834 0.3452 0.2983 0.0078 0.2773 0.0761 0.2968 0.0053 0.445 0.0319 0.1141 0.1295] 2022-08-23 04:41:16 [INFO] [EVAL] Class Precision: [0.7855 0.8357 0.9651 0.8312 0.7555 0.8726 0.8492 0.83 0.6856 0.8028 0.6841 0.7329 0.7762 0.4769 0.5426 0.5633 0.7203 0.6823 0.7922 0.6352 0.8442 0.6427 0.7613 0.6263 0.4971 0.4496 0.5841 0.7257 0.7802 0.3551 0.4361 0.6707 0.5184 0.5484 0.4171 0.506 0.6195 0.7757 0.5048 0.6299 0.3747 0.2586 0.63 0.5367 0.481 0.3945 0.5824 0.7037 0.8045 0.6074 0.7256 0.5464 0.3801 0.5376 0.713 0.5336 0.9322 0.7057 0.6588 0.5178 0.1276 0.569 0.4912 0.6562 0.564 0.7611 0.425 0.5714 0.1817 0.5786 0.6267 0.7692 0.6293 0.2562 0.7985 0.5065 0.6203 0.4917 0.7849 0.2922 0.6327 0.5938 0.7841 0.1045 0.2212 0.7651 0.3053 0.3823 0.6158 0.6839 0.5381 0.0458 0.336 0.3305 0.0938 0.0214 0.5297 0.3568 0.4676 0.6517 0.3439 0.0695 0.5594 0.9373 0.4729 0.7192 0.1639 0.7987 0.2364 0.5262 0.2368 0.4886 0.4529 0.6341 0.8341 0.3851 0.778 0.7571 0.1919 0.4907 0.7311 0.2562 0.6064 0.5848 0.6903 0.576 0.9448 0.6287 0.423 0.5509 0.6504 0.3283 0.499 0.6109 0.5126 0.3478 0.3562 0.118 0.4491 0.6753 0.3295 0.0129 0.702 0.3011 0.6228 0.0061 0.7647 0.4108 0.6702 0.7948] 2022-08-23 04:41:16 [INFO] [EVAL] Class Recall: [0.8497 0.9169 0.9646 0.8593 0.8637 0.855 0.8933 0.9198 0.6972 0.7312 0.6413 0.6597 0.8659 0.5004 0.4008 0.6262 0.6145 0.5198 0.7455 0.5284 0.8622 0.5904 0.7908 0.7006 0.4676 0.3642 0.6756 0.5535 0.4024 0.4771 0.4246 0.6442 0.4354 0.5516 0.4562 0.6308 0.5766 0.6197 0.3732 0.5169 0.363 0.1228 0.4552 0.3337 0.4637 0.2347 0.438 0.6572 0.8102 0.7663 0.6505 0.577 0.3538 0.2852 0.9612 0.6429 0.9382 0.4423 0.5449 0.2405 0.1603 0.4403 0.4831 0.2692 0.6894 0.8581 0.4615 0.5874 0.138 0.4638 0.6856 0.5431 0.5388 0.4662 0.4711 0.513 0.7423 0.4051 0.2435 0.1547 0.8529 0.5458 0.3869 0.0495 0.094 0.6829 0.1172 0.1071 0.3512 0.6346 0.5978 0.1773 0.2727 0.0992 0.047 0.0089 0.0308 0.176 0.3621 0.6201 0.0724 0.0399 0.3712 0.7038 0.0516 0.6334 0.0861 0.6105 0.1193 0.245 0.1718 0.6168 0.1677 0.6831 0.987 0.0889 0.2785 0.7918 0.4229 0.1271 0.4388 0.0212 0.2502 0.1907 0.2922 0.3697 0.4751 0.5387 0.5093 0.4299 0.7756 0.0181 0.3666 0.2705 0.176 0.1623 0.1906 0.0303 0.2366 0.4139 
0.7585 0.0194 0.3143 0.0924 0.3618 0.0383 0.5156 0.0334 0.1209 0.134 ] 2022-08-23 04:41:17 [INFO] [EVAL] The model with the best validation mIoU (0.3455) was saved at iter 89000. 2022-08-23 04:41:26 [INFO] [TRAIN] epoch: 77, iter: 96050/160000, loss: 0.5689, lr: 0.000484, batch_cost: 0.1807, reader_cost: 0.00230, ips: 44.2640 samples/sec | ETA 03:12:37 2022-08-23 04:41:35 [INFO] [TRAIN] epoch: 77, iter: 96100/160000, loss: 0.5519, lr: 0.000484, batch_cost: 0.1903, reader_cost: 0.00136, ips: 42.0486 samples/sec | ETA 03:22:37 2022-08-23 04:41:45 [INFO] [TRAIN] epoch: 77, iter: 96150/160000, loss: 0.5516, lr: 0.000483, batch_cost: 0.1919, reader_cost: 0.00063, ips: 41.6982 samples/sec | ETA 03:24:09 2022-08-23 04:41:54 [INFO] [TRAIN] epoch: 77, iter: 96200/160000, loss: 0.5306, lr: 0.000483, batch_cost: 0.1910, reader_cost: 0.00087, ips: 41.8952 samples/sec | ETA 03:23:02 2022-08-23 04:42:04 [INFO] [TRAIN] epoch: 77, iter: 96250/160000, loss: 0.5686, lr: 0.000483, batch_cost: 0.1929, reader_cost: 0.00096, ips: 41.4716 samples/sec | ETA 03:24:57 2022-08-23 04:42:13 [INFO] [TRAIN] epoch: 77, iter: 96300/160000, loss: 0.5672, lr: 0.000482, batch_cost: 0.1818, reader_cost: 0.00103, ips: 43.9938 samples/sec | ETA 03:13:03 2022-08-23 04:42:23 [INFO] [TRAIN] epoch: 77, iter: 96350/160000, loss: 0.5809, lr: 0.000482, batch_cost: 0.1960, reader_cost: 0.00182, ips: 40.8136 samples/sec | ETA 03:27:56 2022-08-23 04:42:32 [INFO] [TRAIN] epoch: 77, iter: 96400/160000, loss: 0.5286, lr: 0.000482, batch_cost: 0.1764, reader_cost: 0.00068, ips: 45.3408 samples/sec | ETA 03:07:01 2022-08-23 04:42:41 [INFO] [TRAIN] epoch: 77, iter: 96450/160000, loss: 0.5936, lr: 0.000481, batch_cost: 0.1838, reader_cost: 0.00082, ips: 43.5320 samples/sec | ETA 03:14:38 2022-08-23 04:42:50 [INFO] [TRAIN] epoch: 77, iter: 96500/160000, loss: 0.5423, lr: 0.000481, batch_cost: 0.1809, reader_cost: 0.00061, ips: 44.2166 samples/sec | ETA 03:11:28 2022-08-23 04:43:02 [INFO] [TRAIN] epoch: 77, iter: 96550/160000, loss: 0.5347, lr: 0.000480, batch_cost: 0.2359, reader_cost: 0.00049, ips: 33.9155 samples/sec | ETA 04:09:26 2022-08-23 04:43:13 [INFO] [TRAIN] epoch: 77, iter: 96600/160000, loss: 0.5342, lr: 0.000480, batch_cost: 0.2190, reader_cost: 0.00086, ips: 36.5265 samples/sec | ETA 03:51:25 2022-08-23 04:43:23 [INFO] [TRAIN] epoch: 77, iter: 96650/160000, loss: 0.5730, lr: 0.000480, batch_cost: 0.2153, reader_cost: 0.00099, ips: 37.1513 samples/sec | ETA 03:47:21 2022-08-23 04:43:34 [INFO] [TRAIN] epoch: 77, iter: 96700/160000, loss: 0.5741, lr: 0.000479, batch_cost: 0.2125, reader_cost: 0.00044, ips: 37.6417 samples/sec | ETA 03:44:13 2022-08-23 04:43:45 [INFO] [TRAIN] epoch: 77, iter: 96750/160000, loss: 0.5360, lr: 0.000479, batch_cost: 0.2226, reader_cost: 0.00051, ips: 35.9448 samples/sec | ETA 03:54:37 2022-08-23 04:43:55 [INFO] [TRAIN] epoch: 77, iter: 96800/160000, loss: 0.5650, lr: 0.000478, batch_cost: 0.1933, reader_cost: 0.00046, ips: 41.3816 samples/sec | ETA 03:23:38 2022-08-23 04:44:04 [INFO] [TRAIN] epoch: 77, iter: 96850/160000, loss: 0.5612, lr: 0.000478, batch_cost: 0.1735, reader_cost: 0.00571, ips: 46.1111 samples/sec | ETA 03:02:36 2022-08-23 04:44:13 [INFO] [TRAIN] epoch: 77, iter: 96900/160000, loss: 0.5655, lr: 0.000478, batch_cost: 0.1951, reader_cost: 0.00071, ips: 41.0078 samples/sec | ETA 03:25:09 2022-08-23 04:44:23 [INFO] [TRAIN] epoch: 77, iter: 96950/160000, loss: 0.5404, lr: 0.000477, batch_cost: 0.1987, reader_cost: 0.00656, ips: 40.2647 samples/sec | ETA 03:28:47 2022-08-23 04:44:34 [INFO] 
[TRAIN] epoch: 77, iter: 97000/160000, loss: 0.5335, lr: 0.000477, batch_cost: 0.2055, reader_cost: 0.00085, ips: 38.9310 samples/sec | ETA 03:35:45 2022-08-23 04:44:34 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 185s - batch_cost: 0.1851 - reader cost: 0.0015 2022-08-23 04:47:39 [INFO] [EVAL] #Images: 2000 mIoU: 0.3446 Acc: 0.7637 Kappa: 0.7455 Dice: 0.4756 2022-08-23 04:47:39 [INFO] [EVAL] Class IoU: [0.6831 0.7783 0.9321 0.7237 0.6801 0.7517 0.7685 0.7853 0.5216 0.6102 0.4979 0.5304 0.7022 0.2941 0.3248 0.4258 0.5095 0.4311 0.6113 0.4131 0.7392 0.4184 0.6297 0.4846 0.3148 0.3711 0.4419 0.4487 0.4105 0.2497 0.2851 0.5038 0.2604 0.3807 0.3062 0.3809 0.4184 0.5551 0.2614 0.4027 0.2322 0.0904 0.335 0.2688 0.3092 0.2097 0.331 0.499 0.6742 0.4966 0.527 0.3088 0.2237 0.2152 0.6832 0.3711 0.871 0.3969 0.3805 0.2291 0.0739 0.4624 0.3044 0.2651 0.449 0.6763 0.293 0.4142 0.1142 0.3519 0.5022 0.5011 0.4207 0.2008 0.4398 0.3339 0.5634 0.3134 0.2273 0.1069 0.5226 0.3739 0.3844 0.0303 0.2357 0.5677 0.1018 0.0691 0.2806 0.5027 0.3949 0.075 0.2405 0.0861 0.018 0.0226 0.0269 0.1371 0.2851 0.424 0.0569 0.0556 0.2454 0.0862 0.0159 0.4847 0.0463 0.5282 0.0965 0.277 0.072 0.4487 0.1375 0.5505 0.8735 0.0135 0.3009 0.6179 0.135 0.1545 0.4406 0.0118 0.2067 0.1465 0.3571 0.3005 0.4266 0.4091 0.3196 0.2953 0.5231 0.0469 0.2549 0.2561 0.1142 0.1296 0.1369 0.0266 0.2093 0.3696 0.408 0.0007 0.3929 0.0271 0.2846 0. 0.3943 0.0356 0.1609 0.1288] 2022-08-23 04:47:39 [INFO] [EVAL] Class Precision: [0.7754 0.8547 0.9679 0.8139 0.774 0.8774 0.8576 0.8521 0.696 0.7681 0.6939 0.7384 0.7921 0.4899 0.5171 0.5847 0.6834 0.6974 0.7756 0.6302 0.8241 0.6537 0.749 0.6378 0.4785 0.5301 0.5242 0.7732 0.6786 0.3271 0.4537 0.6854 0.4979 0.5023 0.5051 0.6172 0.6147 0.7654 0.4109 0.6289 0.4342 0.2434 0.5359 0.4953 0.45 0.4263 0.4988 0.6875 0.7681 0.5597 0.7262 0.4404 0.3317 0.6005 0.751 0.6048 0.9187 0.6912 0.5927 0.4647 0.1249 0.6039 0.4879 0.6332 0.5414 0.8276 0.4225 0.5477 0.1841 0.6521 0.6699 0.6052 0.6153 0.2566 0.6206 0.5105 0.6851 0.5376 0.8301 0.2513 0.591 0.6399 0.7543 0.1013 0.4246 0.7661 0.4347 0.4099 0.8981 0.7671 0.5003 0.1116 0.3752 0.329 0.1457 0.0659 0.4018 0.3595 0.4398 0.6983 0.3641 0.112 0.5636 0.6529 0.3015 0.6587 0.1708 0.809 0.2518 0.5161 0.2276 0.6384 0.4968 0.704 0.8865 0.1002 0.7105 0.7183 0.208 0.4158 0.51 0.1613 0.6891 0.6125 0.7095 0.5555 0.8558 0.5521 0.4681 0.5644 0.5793 0.2189 0.4795 0.5365 0.4366 0.3109 0.2597 0.0776 0.5096 0.6186 0.5028 0.0011 0.6196 0.3414 0.5822 0. 
0.8428 0.286 0.3718 0.7224] 2022-08-23 04:47:39 [INFO] [EVAL] Class Recall: [0.8516 0.8969 0.9619 0.8671 0.8486 0.84 0.8809 0.9092 0.6755 0.748 0.638 0.6531 0.8608 0.424 0.4662 0.6104 0.6669 0.5303 0.7427 0.5453 0.8776 0.5376 0.7982 0.6686 0.4792 0.5531 0.7379 0.5166 0.5096 0.5134 0.4341 0.6553 0.3532 0.6113 0.4375 0.4987 0.5672 0.6689 0.418 0.5282 0.3331 0.1258 0.472 0.3702 0.4972 0.2922 0.496 0.6454 0.8464 0.815 0.6576 0.5081 0.4072 0.2511 0.8832 0.4899 0.9438 0.4824 0.5151 0.3112 0.153 0.6637 0.4473 0.3132 0.7246 0.7872 0.4887 0.6295 0.2311 0.4333 0.6673 0.7446 0.5709 0.4799 0.6016 0.4912 0.7602 0.4292 0.2384 0.1568 0.8187 0.4736 0.4395 0.0415 0.3464 0.6867 0.1173 0.0767 0.2898 0.5932 0.6522 0.1865 0.4012 0.1045 0.0202 0.0332 0.0281 0.1814 0.4476 0.5191 0.0632 0.0993 0.303 0.0903 0.0165 0.6473 0.0597 0.6035 0.1352 0.3743 0.0953 0.6015 0.1597 0.7163 0.9835 0.0154 0.3429 0.8155 0.2777 0.1974 0.764 0.0126 0.228 0.1615 0.4183 0.3956 0.4597 0.6122 0.5019 0.3825 0.8437 0.0563 0.3525 0.3288 0.1339 0.1818 0.2244 0.0389 0.2622 0.4787 0.6837 0.0024 0.5178 0.0286 0.3577 0. 0.4256 0.0391 0.221 0.1355] 2022-08-23 04:47:39 [INFO] [EVAL] The model with the best validation mIoU (0.3455) was saved at iter 89000. 2022-08-23 04:47:48 [INFO] [TRAIN] epoch: 77, iter: 97050/160000, loss: 0.5426, lr: 0.000477, batch_cost: 0.1869, reader_cost: 0.00467, ips: 42.7946 samples/sec | ETA 03:16:07 2022-08-23 04:47:58 [INFO] [TRAIN] epoch: 77, iter: 97100/160000, loss: 0.5482, lr: 0.000476, batch_cost: 0.2034, reader_cost: 0.00070, ips: 39.3372 samples/sec | ETA 03:33:11 2022-08-23 04:48:09 [INFO] [TRAIN] epoch: 77, iter: 97150/160000, loss: 0.5461, lr: 0.000476, batch_cost: 0.2027, reader_cost: 0.00100, ips: 39.4579 samples/sec | ETA 03:32:22 2022-08-23 04:48:17 [INFO] [TRAIN] epoch: 77, iter: 97200/160000, loss: 0.5594, lr: 0.000475, batch_cost: 0.1742, reader_cost: 0.00244, ips: 45.9327 samples/sec | ETA 03:02:17 2022-08-23 04:48:26 [INFO] [TRAIN] epoch: 77, iter: 97250/160000, loss: 0.6074, lr: 0.000475, batch_cost: 0.1770, reader_cost: 0.00070, ips: 45.1910 samples/sec | ETA 03:05:08 2022-08-23 04:48:39 [INFO] [TRAIN] epoch: 78, iter: 97300/160000, loss: 0.5374, lr: 0.000475, batch_cost: 0.2590, reader_cost: 0.08367, ips: 30.8922 samples/sec | ETA 04:30:37 2022-08-23 04:48:49 [INFO] [TRAIN] epoch: 78, iter: 97350/160000, loss: 0.5582, lr: 0.000474, batch_cost: 0.2012, reader_cost: 0.00063, ips: 39.7540 samples/sec | ETA 03:30:07 2022-08-23 04:48:59 [INFO] [TRAIN] epoch: 78, iter: 97400/160000, loss: 0.5432, lr: 0.000474, batch_cost: 0.1941, reader_cost: 0.00073, ips: 41.2170 samples/sec | ETA 03:22:30 2022-08-23 04:49:09 [INFO] [TRAIN] epoch: 78, iter: 97450/160000, loss: 0.5256, lr: 0.000474, batch_cost: 0.1973, reader_cost: 0.00320, ips: 40.5550 samples/sec | ETA 03:25:38 2022-08-23 04:49:21 [INFO] [TRAIN] epoch: 78, iter: 97500/160000, loss: 0.5265, lr: 0.000473, batch_cost: 0.2362, reader_cost: 0.00051, ips: 33.8635 samples/sec | ETA 04:06:05 2022-08-23 04:49:31 [INFO] [TRAIN] epoch: 78, iter: 97550/160000, loss: 0.5690, lr: 0.000473, batch_cost: 0.2024, reader_cost: 0.00152, ips: 39.5289 samples/sec | ETA 03:30:38 2022-08-23 04:49:42 [INFO] [TRAIN] epoch: 78, iter: 97600/160000, loss: 0.5564, lr: 0.000472, batch_cost: 0.2175, reader_cost: 0.00224, ips: 36.7805 samples/sec | ETA 03:46:12 2022-08-23 04:49:51 [INFO] [TRAIN] epoch: 78, iter: 97650/160000, loss: 0.5243, lr: 0.000472, batch_cost: 0.1972, reader_cost: 0.00047, ips: 40.5589 samples/sec | ETA 03:24:58 2022-08-23 04:50:03 [INFO] [TRAIN] epoch: 78, 
iter: 97700/160000, loss: 0.5448, lr: 0.000472, batch_cost: 0.2330, reader_cost: 0.00527, ips: 34.3376 samples/sec | ETA 04:01:54 2022-08-23 04:50:14 [INFO] [TRAIN] epoch: 78, iter: 97750/160000, loss: 0.5500, lr: 0.000471, batch_cost: 0.2160, reader_cost: 0.00046, ips: 37.0438 samples/sec | ETA 03:44:03 2022-08-23 04:50:24 [INFO] [TRAIN] epoch: 78, iter: 97800/160000, loss: 0.5455, lr: 0.000471, batch_cost: 0.2087, reader_cost: 0.00057, ips: 38.3327 samples/sec | ETA 03:36:21 2022-08-23 04:50:35 [INFO] [TRAIN] epoch: 78, iter: 97850/160000, loss: 0.5176, lr: 0.000471, batch_cost: 0.2071, reader_cost: 0.00074, ips: 38.6321 samples/sec | ETA 03:34:30 2022-08-23 04:50:46 [INFO] [TRAIN] epoch: 78, iter: 97900/160000, loss: 0.5456, lr: 0.000470, batch_cost: 0.2275, reader_cost: 0.00058, ips: 35.1640 samples/sec | ETA 03:55:28 2022-08-23 04:50:58 [INFO] [TRAIN] epoch: 78, iter: 97950/160000, loss: 0.5633, lr: 0.000470, batch_cost: 0.2305, reader_cost: 0.00041, ips: 34.7101 samples/sec | ETA 03:58:21 2022-08-23 04:51:07 [INFO] [TRAIN] epoch: 78, iter: 98000/160000, loss: 0.5534, lr: 0.000469, batch_cost: 0.1898, reader_cost: 0.00062, ips: 42.1422 samples/sec | ETA 03:16:09 2022-08-23 04:51:07 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 179s - batch_cost: 0.1786 - reader cost: 0.0012 2022-08-23 04:54:06 [INFO] [EVAL] #Images: 2000 mIoU: 0.3423 Acc: 0.7628 Kappa: 0.7448 Dice: 0.4741 2022-08-23 04:54:06 [INFO] [EVAL] Class IoU: [0.6851 0.7776 0.9319 0.7225 0.6787 0.7525 0.7735 0.7689 0.5228 0.598 0.4853 0.5462 0.6988 0.2858 0.294 0.4274 0.5058 0.41 0.6145 0.4238 0.7409 0.4554 0.6394 0.4904 0.2677 0.3865 0.4598 0.4606 0.3985 0.2595 0.2627 0.5196 0.2813 0.349 0.2658 0.3908 0.4311 0.5163 0.2656 0.3873 0.1987 0.0801 0.3453 0.2499 0.3038 0.2086 0.335 0.4997 0.6964 0.5263 0.5525 0.2586 0.2276 0.2161 0.6687 0.3802 0.8658 0.3946 0.3707 0.2703 0.0724 0.4717 0.3203 0.2238 0.4762 0.6702 0.2801 0.3887 0.1057 0.321 0.4868 0.4932 0.399 0.219 0.4324 0.3086 0.5247 0.3191 0.2183 0.2206 0.5749 0.3836 0.3754 0.0287 0.1792 0.5442 0.0936 0.078 0.2081 0.5019 0.4171 0.0272 0.2268 0.096 0.0112 0.0187 0.0296 0.1106 0.2871 0.4288 0.1109 0.0546 0.2733 0.4224 0.0199 0.4401 0.0607 0.514 0.1031 0.2708 0.1021 0.3004 0.1324 0.5527 0.7192 0.0279 0.3091 0.6281 0.0945 0.1538 0.4666 0.0083 0.2071 0.1284 0.3509 0.2938 0.446 0.3876 0.284 0.2902 0.5141 0.042 0.2374 0.2422 0.1555 0.1337 0.1354 0.0166 0.1942 0.3771 0.2891 0.0025 0.3596 0.0779 0.285 0. 
0.4245 0.0343 0.1215 0.1953] 2022-08-23 04:54:06 [INFO] [EVAL] Class Precision: [0.7852 0.8582 0.9688 0.8018 0.755 0.8729 0.8701 0.8227 0.685 0.7807 0.7094 0.7386 0.7764 0.4719 0.5486 0.5917 0.6791 0.6949 0.7567 0.5922 0.829 0.6512 0.7884 0.6356 0.482 0.5221 0.5713 0.7532 0.7291 0.3347 0.5105 0.6579 0.4651 0.4606 0.4365 0.5236 0.617 0.6994 0.4855 0.631 0.3564 0.2346 0.5238 0.5407 0.4202 0.3733 0.5097 0.6529 0.8092 0.6169 0.7329 0.319 0.4064 0.4956 0.7157 0.4684 0.9222 0.7141 0.7706 0.534 0.1364 0.595 0.4877 0.5956 0.6083 0.7434 0.4136 0.5488 0.1769 0.5277 0.7002 0.6053 0.6114 0.2915 0.6758 0.5324 0.6208 0.6523 0.7364 0.3631 0.664 0.6787 0.7548 0.0987 0.3978 0.7461 0.4267 0.3506 0.4184 0.7534 0.5553 0.0296 0.4439 0.3175 0.0587 0.0502 0.664 0.3868 0.4335 0.6579 0.4519 0.1327 0.5682 0.8014 0.7395 0.5864 0.1522 0.8139 0.1898 0.436 0.2494 0.3701 0.4861 0.7144 0.7228 0.1903 0.7744 0.7328 0.1488 0.4559 0.6432 0.1304 0.5418 0.7086 0.6975 0.597 0.8733 0.512 0.7949 0.568 0.5668 0.2027 0.331 0.5785 0.5055 0.3269 0.3559 0.0941 0.4221 0.5927 0.3273 0.0039 0.6507 0.3854 0.6253 0. 0.8685 0.4468 0.3739 0.4484] 2022-08-23 04:54:06 [INFO] [EVAL] Class Recall: [0.8431 0.8922 0.9607 0.8795 0.8704 0.8451 0.8744 0.9215 0.6882 0.7188 0.6058 0.6771 0.8748 0.4201 0.3878 0.6062 0.6648 0.5 0.7658 0.5985 0.8746 0.6023 0.7719 0.6822 0.3757 0.5981 0.702 0.5424 0.4678 0.5359 0.3511 0.7118 0.4159 0.5902 0.4046 0.6065 0.5885 0.6635 0.3697 0.5006 0.31 0.1084 0.5032 0.3173 0.5232 0.321 0.4944 0.6806 0.8332 0.7817 0.6917 0.5773 0.3409 0.277 0.9106 0.6688 0.934 0.4687 0.4167 0.3537 0.1335 0.6948 0.4828 0.264 0.6869 0.8719 0.4645 0.5712 0.2078 0.4504 0.615 0.7269 0.5345 0.4681 0.5455 0.4233 0.7721 0.3845 0.2368 0.3598 0.8108 0.4687 0.4276 0.0389 0.2459 0.6679 0.1071 0.0912 0.2928 0.6005 0.6263 0.2525 0.3169 0.1209 0.0137 0.029 0.0301 0.1341 0.4594 0.5519 0.1281 0.0848 0.3449 0.4718 0.0201 0.6382 0.0917 0.5825 0.1842 0.4167 0.1475 0.6147 0.154 0.7096 0.9931 0.0316 0.3397 0.8146 0.2058 0.1884 0.6295 0.0088 0.251 0.1355 0.4139 0.3665 0.4768 0.6147 0.3064 0.3724 0.8468 0.0503 0.4565 0.2941 0.1834 0.1845 0.1794 0.0198 0.2645 0.509 0.7124 0.0066 0.4457 0.0889 0.3436 0. 0.4537 0.0358 0.1526 0.2571] 2022-08-23 04:54:06 [INFO] [EVAL] The model with the best validation mIoU (0.3455) was saved at iter 89000. 
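
Every evaluation pass above ends by restating which checkpoint currently holds the best validation mIoU. To track that curve outside the console, the one-line [EVAL] summaries can be scraped from a saved log. The helper below is a hypothetical sketch (the regex, function name and "train.log" path are mine, not part of PaddleSeg) that assumes the exact summary format printed above.

    import re

    # Matches e.g. "[EVAL] #Images: 2000 mIoU: 0.3429 Acc: 0.7639 Kappa: 0.7459 Dice: 0.4733"
    EVAL_RE = re.compile(
        r"\[EVAL\] #Images: (\d+) mIoU: ([\d.]+) Acc: ([\d.]+) Kappa: ([\d.]+) Dice: ([\d.]+)")

    def eval_summaries(log_path):
        """Yield one dict of summary metrics per evaluation pass found in the log."""
        with open(log_path) as f:
            for line in f:
                m = EVAL_RE.search(line)
                if m:
                    yield {"images": int(m.group(1)), "miou": float(m.group(2)),
                           "acc": float(m.group(3)), "kappa": float(m.group(4)),
                           "dice": float(m.group(5))}

    # Example use: best = max(eval_summaries("train.log"), key=lambda s: s["miou"])
    # mirrors the "best validation mIoU" bookkeeping reported after each evaluation.
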
2022-08-23 04:54:15 [INFO] [TRAIN] epoch: 78, iter: 98050/160000, loss: 0.5415, lr: 0.000469, batch_cost: 0.1706, reader_cost: 0.00314, ips: 46.8873 samples/sec | ETA 02:56:10 2022-08-23 04:54:23 [INFO] [TRAIN] epoch: 78, iter: 98100/160000, loss: 0.5836, lr: 0.000469, batch_cost: 0.1602, reader_cost: 0.00214, ips: 49.9457 samples/sec | ETA 02:45:14 2022-08-23 04:54:31 [INFO] [TRAIN] epoch: 78, iter: 98150/160000, loss: 0.5190, lr: 0.000468, batch_cost: 0.1735, reader_cost: 0.00057, ips: 46.1196 samples/sec | ETA 02:58:48 2022-08-23 04:54:40 [INFO] [TRAIN] epoch: 78, iter: 98200/160000, loss: 0.5527, lr: 0.000468, batch_cost: 0.1721, reader_cost: 0.00042, ips: 46.4932 samples/sec | ETA 02:57:13 2022-08-23 04:54:50 [INFO] [TRAIN] epoch: 78, iter: 98250/160000, loss: 0.5712, lr: 0.000468, batch_cost: 0.2048, reader_cost: 0.00061, ips: 39.0541 samples/sec | ETA 03:30:49 2022-08-23 04:55:01 [INFO] [TRAIN] epoch: 78, iter: 98300/160000, loss: 0.6069, lr: 0.000467, batch_cost: 0.2164, reader_cost: 0.00068, ips: 36.9610 samples/sec | ETA 03:42:34 2022-08-23 04:55:10 [INFO] [TRAIN] epoch: 78, iter: 98350/160000, loss: 0.5618, lr: 0.000467, batch_cost: 0.1865, reader_cost: 0.00058, ips: 42.8983 samples/sec | ETA 03:11:36 2022-08-23 04:55:20 [INFO] [TRAIN] epoch: 78, iter: 98400/160000, loss: 0.5243, lr: 0.000466, batch_cost: 0.1855, reader_cost: 0.00062, ips: 43.1244 samples/sec | ETA 03:10:27 2022-08-23 04:55:28 [INFO] [TRAIN] epoch: 78, iter: 98450/160000, loss: 0.5608, lr: 0.000466, batch_cost: 0.1735, reader_cost: 0.00040, ips: 46.0969 samples/sec | ETA 02:58:01 2022-08-23 04:55:39 [INFO] [TRAIN] epoch: 78, iter: 98500/160000, loss: 0.5648, lr: 0.000466, batch_cost: 0.2165, reader_cost: 0.00066, ips: 36.9446 samples/sec | ETA 03:41:57 2022-08-23 04:55:53 [INFO] [TRAIN] epoch: 79, iter: 98550/160000, loss: 0.5094, lr: 0.000465, batch_cost: 0.2857, reader_cost: 0.09040, ips: 27.9986 samples/sec | ETA 04:52:38 2022-08-23 04:56:03 [INFO] [TRAIN] epoch: 79, iter: 98600/160000, loss: 0.5230, lr: 0.000465, batch_cost: 0.1854, reader_cost: 0.00051, ips: 43.1490 samples/sec | ETA 03:09:43 2022-08-23 04:56:13 [INFO] [TRAIN] epoch: 79, iter: 98650/160000, loss: 0.5238, lr: 0.000464, batch_cost: 0.2006, reader_cost: 0.00032, ips: 39.8716 samples/sec | ETA 03:25:09 2022-08-23 04:56:22 [INFO] [TRAIN] epoch: 79, iter: 98700/160000, loss: 0.5262, lr: 0.000464, batch_cost: 0.1930, reader_cost: 0.00065, ips: 41.4563 samples/sec | ETA 03:17:09 2022-08-23 04:56:32 [INFO] [TRAIN] epoch: 79, iter: 98750/160000, loss: 0.5322, lr: 0.000464, batch_cost: 0.2010, reader_cost: 0.00055, ips: 39.7948 samples/sec | ETA 03:25:13 2022-08-23 04:56:43 [INFO] [TRAIN] epoch: 79, iter: 98800/160000, loss: 0.5357, lr: 0.000463, batch_cost: 0.2135, reader_cost: 0.00071, ips: 37.4720 samples/sec | ETA 03:37:45 2022-08-23 04:56:53 [INFO] [TRAIN] epoch: 79, iter: 98850/160000, loss: 0.5604, lr: 0.000463, batch_cost: 0.1959, reader_cost: 0.00146, ips: 40.8402 samples/sec | ETA 03:19:38 2022-08-23 04:57:03 [INFO] [TRAIN] epoch: 79, iter: 98900/160000, loss: 0.5239, lr: 0.000463, batch_cost: 0.1979, reader_cost: 0.00032, ips: 40.4255 samples/sec | ETA 03:21:31 2022-08-23 04:57:13 [INFO] [TRAIN] epoch: 79, iter: 98950/160000, loss: 0.5704, lr: 0.000462, batch_cost: 0.2041, reader_cost: 0.00040, ips: 39.1993 samples/sec | ETA 03:27:39 2022-08-23 04:57:23 [INFO] [TRAIN] epoch: 79, iter: 99000/160000, loss: 0.5806, lr: 0.000462, batch_cost: 0.1980, reader_cost: 0.00040, ips: 40.4073 samples/sec | ETA 03:21:17 2022-08-23 04:57:23 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 162s - batch_cost: 0.1616 - reader cost: 5.6508e-04 2022-08-23 05:00:05 [INFO] [EVAL] #Images: 2000 mIoU: 0.3461 Acc: 0.7653 Kappa: 0.7472 Dice: 0.4771 2022-08-23 05:00:05 [INFO] [EVAL] Class IoU: [0.6836 0.7774 0.9334 0.7319 0.6764 0.754 0.7746 0.7629 0.5288 0.6346 0.4885 0.5409 0.6903 0.3204 0.3041 0.4204 0.4812 0.4174 0.6098 0.4193 0.7427 0.4793 0.6243 0.4891 0.3494 0.3757 0.4725 0.4561 0.4232 0.22 0.2898 0.5101 0.2681 0.3677 0.2856 0.3901 0.4289 0.5281 0.2591 0.4022 0.2219 0.0712 0.3487 0.251 0.2757 0.1626 0.2766 0.5104 0.6438 0.5249 0.5303 0.2427 0.2285 0.1675 0.7007 0.4076 0.8685 0.3922 0.4376 0.2512 0.0639 0.4316 0.3384 0.2341 0.477 0.7022 0.2906 0.3941 0.1309 0.3167 0.499 0.5089 0.3957 0.2115 0.4502 0.3266 0.5247 0.3275 0.2263 0.1141 0.6165 0.3785 0.3579 0.0246 0.2971 0.5498 0.1185 0.0766 0.2782 0.5234 0.3845 0.0425 0.2387 0.1024 0.012 0.0149 0.0166 0.1333 0.2944 0.4528 0.0246 0.0277 0.2518 0.5559 0.0418 0.3672 0.0656 0.5237 0.0891 0.2196 0.1193 0.3769 0.1321 0.6072 0.796 0.0124 0.2408 0.6453 0.0973 0.159 0.4443 0.0123 0.2177 0.1586 0.2809 0.2877 0.45 0.3977 0.3931 0.3166 0.5417 0.0437 0.2965 0.2396 0.1784 0.1229 0.1384 0.0298 0.1808 0.3494 0.26 0.0279 0.4007 0.0124 0.279 0. 0.4049 0.0325 0.1854 0.1737] 2022-08-23 05:00:05 [INFO] [EVAL] Class Precision: [0.7744 0.8649 0.9677 0.8299 0.757 0.8866 0.8601 0.8181 0.6636 0.7654 0.7086 0.7062 0.7679 0.5149 0.5313 0.5651 0.6808 0.6648 0.7506 0.6374 0.8372 0.6797 0.7448 0.6625 0.5087 0.5081 0.5635 0.7873 0.757 0.3903 0.4597 0.6945 0.4797 0.4766 0.4516 0.5987 0.5921 0.7529 0.4254 0.6148 0.3704 0.245 0.6362 0.5604 0.3675 0.4001 0.4875 0.6916 0.7247 0.634 0.6941 0.2864 0.4134 0.5553 0.7671 0.5493 0.9363 0.6882 0.7329 0.4135 0.1255 0.5521 0.4838 0.656 0.6075 0.8072 0.3724 0.6581 0.3015 0.5039 0.6797 0.6459 0.6265 0.3125 0.706 0.5365 0.6067 0.6513 0.6582 0.3066 0.7208 0.6634 0.7878 0.0804 0.4727 0.7424 0.3228 0.4033 0.7734 0.7289 0.4997 0.0552 0.3834 0.3348 0.039 0.0519 0.467 0.3518 0.446 0.6509 0.2389 0.0547 0.574 0.7974 0.9433 0.4602 0.1732 0.8447 0.2538 0.4654 0.2888 0.4824 0.4779 0.7946 0.801 0.1248 0.6113 0.7842 0.1666 0.4568 0.5494 0.157 0.4819 0.613 0.6965 0.6172 0.8221 0.536 0.722 0.5349 0.6037 0.1662 0.4976 0.5756 0.5202 0.3031 0.4052 0.0695 0.4851 0.6874 0.279 0.0378 0.6078 0.1153 0.6248 0. 0.8766 0.3529 0.6176 0.7777] 2022-08-23 05:00:05 [INFO] [EVAL] Class Recall: [0.8536 0.8848 0.9634 0.861 0.8641 0.8345 0.8862 0.9186 0.7224 0.7879 0.6113 0.6979 0.8724 0.459 0.4156 0.6215 0.6215 0.5286 0.7647 0.5507 0.8682 0.6192 0.7942 0.6514 0.5274 0.5905 0.7453 0.5201 0.4897 0.3353 0.4396 0.6577 0.378 0.6168 0.4373 0.5282 0.6088 0.6388 0.3987 0.5377 0.3562 0.0913 0.4356 0.3125 0.5246 0.2151 0.39 0.6607 0.8522 0.7532 0.6921 0.614 0.3381 0.1934 0.8901 0.6125 0.923 0.477 0.5206 0.3902 0.1151 0.6641 0.5296 0.2669 0.6896 0.8437 0.5696 0.4956 0.1879 0.4602 0.6524 0.7059 0.5179 0.3956 0.5541 0.4551 0.7951 0.3971 0.2564 0.1538 0.81 0.4685 0.396 0.0343 0.4444 0.6795 0.1576 0.0864 0.3029 0.65 0.6251 0.1561 0.3874 0.1285 0.0171 0.0205 0.017 0.1766 0.4642 0.598 0.0267 0.053 0.3096 0.6473 0.0419 0.645 0.0956 0.5795 0.1207 0.2937 0.1689 0.633 0.1544 0.7203 0.9921 0.0136 0.2843 0.7846 0.1896 0.196 0.699 0.0132 0.2843 0.1763 0.3201 0.3502 0.4985 0.6067 0.4632 0.4368 0.8406 0.056 0.4231 0.291 0.2135 0.1712 0.1736 0.0497 0.2238 0.4154 0.7929 0.0962 0.5405 0.0137 0.3352 0. 
0.4294 0.0346 0.2095 0.1828] 2022-08-23 05:00:05 [INFO] [EVAL] The model with the best validation mIoU (0.3461) was saved at iter 99000. 2022-08-23 05:00:16 [INFO] [TRAIN] epoch: 79, iter: 99050/160000, loss: 0.5721, lr: 0.000461, batch_cost: 0.2205, reader_cost: 0.00456, ips: 36.2784 samples/sec | ETA 03:44:00 2022-08-23 05:00:26 [INFO] [TRAIN] epoch: 79, iter: 99100/160000, loss: 0.5051, lr: 0.000461, batch_cost: 0.2029, reader_cost: 0.00145, ips: 39.4235 samples/sec | ETA 03:25:58 2022-08-23 05:00:36 [INFO] [TRAIN] epoch: 79, iter: 99150/160000, loss: 0.5475, lr: 0.000461, batch_cost: 0.1932, reader_cost: 0.00050, ips: 41.4126 samples/sec | ETA 03:15:54 2022-08-23 05:00:47 [INFO] [TRAIN] epoch: 79, iter: 99200/160000, loss: 0.5153, lr: 0.000460, batch_cost: 0.2211, reader_cost: 0.00047, ips: 36.1815 samples/sec | ETA 03:44:03 2022-08-23 05:00:57 [INFO] [TRAIN] epoch: 79, iter: 99250/160000, loss: 0.5249, lr: 0.000460, batch_cost: 0.2092, reader_cost: 0.00081, ips: 38.2457 samples/sec | ETA 03:31:47 2022-08-23 05:01:07 [INFO] [TRAIN] epoch: 79, iter: 99300/160000, loss: 0.5688, lr: 0.000460, batch_cost: 0.1934, reader_cost: 0.00108, ips: 41.3666 samples/sec | ETA 03:15:38 2022-08-23 05:01:16 [INFO] [TRAIN] epoch: 79, iter: 99350/160000, loss: 0.5913, lr: 0.000459, batch_cost: 0.1865, reader_cost: 0.00055, ips: 42.9000 samples/sec | ETA 03:08:30 2022-08-23 05:01:26 [INFO] [TRAIN] epoch: 79, iter: 99400/160000, loss: 0.5795, lr: 0.000459, batch_cost: 0.1956, reader_cost: 0.00156, ips: 40.8989 samples/sec | ETA 03:17:33 2022-08-23 05:01:35 [INFO] [TRAIN] epoch: 79, iter: 99450/160000, loss: 0.5291, lr: 0.000458, batch_cost: 0.1766, reader_cost: 0.00084, ips: 45.2946 samples/sec | ETA 02:58:14 2022-08-23 05:01:44 [INFO] [TRAIN] epoch: 79, iter: 99500/160000, loss: 0.5198, lr: 0.000458, batch_cost: 0.1928, reader_cost: 0.00097, ips: 41.5002 samples/sec | ETA 03:14:22 2022-08-23 05:01:54 [INFO] [TRAIN] epoch: 79, iter: 99550/160000, loss: 0.5873, lr: 0.000458, batch_cost: 0.1871, reader_cost: 0.00043, ips: 42.7676 samples/sec | ETA 03:08:27 2022-08-23 05:02:02 [INFO] [TRAIN] epoch: 79, iter: 99600/160000, loss: 0.5334, lr: 0.000457, batch_cost: 0.1723, reader_cost: 0.00061, ips: 46.4303 samples/sec | ETA 02:53:27 2022-08-23 05:02:11 [INFO] [TRAIN] epoch: 79, iter: 99650/160000, loss: 0.5203, lr: 0.000457, batch_cost: 0.1669, reader_cost: 0.00049, ips: 47.9267 samples/sec | ETA 02:47:53 2022-08-23 05:02:20 [INFO] [TRAIN] epoch: 79, iter: 99700/160000, loss: 0.5374, lr: 0.000457, batch_cost: 0.1814, reader_cost: 0.00115, ips: 44.0963 samples/sec | ETA 03:02:19 2022-08-23 05:02:29 [INFO] [TRAIN] epoch: 79, iter: 99750/160000, loss: 0.5660, lr: 0.000456, batch_cost: 0.1778, reader_cost: 0.00049, ips: 44.9913 samples/sec | ETA 02:58:33 2022-08-23 05:02:41 [INFO] [TRAIN] epoch: 80, iter: 99800/160000, loss: 0.5310, lr: 0.000456, batch_cost: 0.2384, reader_cost: 0.04436, ips: 33.5573 samples/sec | ETA 03:59:11 2022-08-23 05:02:51 [INFO] [TRAIN] epoch: 80, iter: 99850/160000, loss: 0.5478, lr: 0.000455, batch_cost: 0.2146, reader_cost: 0.00042, ips: 37.2769 samples/sec | ETA 03:35:08 2022-08-23 05:03:02 [INFO] [TRAIN] epoch: 80, iter: 99900/160000, loss: 0.5168, lr: 0.000455, batch_cost: 0.2145, reader_cost: 0.00045, ips: 37.2997 samples/sec | ETA 03:34:50 2022-08-23 05:03:13 [INFO] [TRAIN] epoch: 80, iter: 99950/160000, loss: 0.5168, lr: 0.000455, batch_cost: 0.2087, reader_cost: 0.00045, ips: 38.3341 samples/sec | ETA 03:28:51 2022-08-23 05:03:22 [INFO] [TRAIN] epoch: 80, iter: 100000/160000, loss: 
0.5395, lr: 0.000454, batch_cost: 0.1839, reader_cost: 0.00055, ips: 43.5020 samples/sec | ETA 03:03:53 2022-08-23 05:03:22 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 157s - batch_cost: 0.1572 - reader cost: 5.7663e-04 2022-08-23 05:05:59 [INFO] [EVAL] #Images: 2000 mIoU: 0.3470 Acc: 0.7651 Kappa: 0.7474 Dice: 0.4783 2022-08-23 05:05:59 [INFO] [EVAL] Class IoU: [0.6865 0.7752 0.9334 0.7287 0.678 0.7619 0.7759 0.7683 0.5303 0.5877 0.4859 0.5486 0.6951 0.2742 0.3372 0.4309 0.5089 0.4133 0.6148 0.4161 0.7361 0.5016 0.624 0.47 0.3316 0.3748 0.4956 0.4758 0.4031 0.2665 0.2928 0.4832 0.2782 0.3533 0.3014 0.3957 0.4251 0.5369 0.2676 0.4041 0.2244 0.087 0.3532 0.2477 0.2905 0.152 0.3206 0.5155 0.6438 0.566 0.5615 0.2718 0.2161 0.2277 0.6821 0.4152 0.871 0.3225 0.4593 0.281 0.0592 0.2428 0.3472 0.2583 0.4795 0.7146 0.273 0.4132 0.1022 0.3378 0.5083 0.5028 0.3961 0.1874 0.452 0.3308 0.5755 0.3173 0.2199 0.0905 0.5809 0.4002 0.3267 0.0273 0.2672 0.5446 0.0948 0.0869 0.2047 0.5232 0.4288 0.0426 0.2198 0.0972 0.0009 0.0192 0.0468 0.125 0.2876 0.4477 0.0415 0.0262 0.2626 0.0988 0.1171 0.4835 0.0693 0.544 0.0914 0.2313 0.1206 0.4739 0.1385 0.5187 0.7822 0.0117 0.3149 0.6434 0.105 0.1419 0.4392 0.0103 0.2181 0.1224 0.3461 0.3156 0.4416 0.4026 0.3493 0.3195 0.5609 0.0769 0.2917 0.2545 0.243 0.1234 0.1456 0.0279 0.2347 0.3806 0.4447 0.0402 0.333 0.0119 0.2951 0. 0.3984 0.0266 0.1772 0.1407] 2022-08-23 05:05:59 [INFO] [EVAL] Class Precision: [0.7967 0.8527 0.963 0.8179 0.7623 0.8685 0.8769 0.825 0.6609 0.7232 0.7136 0.6914 0.7714 0.4858 0.518 0.5856 0.6903 0.6861 0.7614 0.6256 0.8207 0.6559 0.7524 0.6721 0.5094 0.4953 0.6296 0.6788 0.7531 0.3552 0.4013 0.623 0.4765 0.4894 0.496 0.5348 0.6186 0.6456 0.411 0.619 0.4121 0.2312 0.5642 0.5378 0.3949 0.3727 0.5368 0.7458 0.7585 0.6843 0.7749 0.3439 0.3629 0.5226 0.7855 0.5857 0.9377 0.7407 0.6383 0.483 0.1161 0.43 0.5181 0.7208 0.6281 0.8271 0.4165 0.6199 0.503 0.6549 0.7107 0.6289 0.6286 0.2822 0.6966 0.4862 0.7447 0.5569 0.6556 0.325 0.6785 0.6041 0.8137 0.0949 0.4897 0.6827 0.4683 0.3518 0.6497 0.7516 0.4779 0.0563 0.3742 0.3117 0.004 0.0483 0.7304 0.3541 0.5092 0.6311 0.2711 0.0621 0.5972 0.6728 0.9726 0.674 0.1791 0.8229 0.2344 0.4472 0.2569 0.684 0.5448 0.8099 0.7862 0.1109 0.5777 0.7354 0.1725 0.52 0.5743 0.3036 0.6061 0.6591 0.6299 0.6671 0.7953 0.5715 0.6813 0.5057 0.6567 0.2929 0.4161 0.6432 0.455 0.3276 0.2772 0.0765 0.5246 0.6201 0.5493 0.0567 0.6473 0.1907 0.5349 0. 
0.8651 0.3382 0.6184 0.7776] 2022-08-23 05:05:59 [INFO] [EVAL] Class Recall: [0.8323 0.8951 0.9681 0.8699 0.8598 0.8612 0.8708 0.9179 0.7286 0.7583 0.6036 0.7265 0.8755 0.3863 0.4913 0.6198 0.6596 0.5097 0.7615 0.554 0.8771 0.6808 0.7852 0.6099 0.4872 0.6063 0.6996 0.614 0.4645 0.5162 0.5199 0.683 0.4007 0.5595 0.4345 0.6035 0.576 0.7613 0.434 0.5379 0.3301 0.1225 0.4857 0.3147 0.5235 0.2042 0.4431 0.6255 0.8098 0.7661 0.6709 0.5647 0.3483 0.2875 0.8383 0.588 0.9246 0.3636 0.6209 0.4019 0.1076 0.3581 0.5128 0.287 0.6697 0.8401 0.4421 0.5535 0.1137 0.411 0.6408 0.7149 0.5171 0.3583 0.5627 0.5087 0.717 0.4244 0.2486 0.1115 0.8015 0.5425 0.3531 0.037 0.3702 0.7292 0.1063 0.1034 0.2302 0.6326 0.8067 0.1493 0.3475 0.1238 0.0011 0.0308 0.0476 0.162 0.3979 0.6064 0.0467 0.0434 0.3192 0.1038 0.1175 0.6311 0.1016 0.6161 0.1303 0.324 0.1853 0.6068 0.1567 0.5906 0.9936 0.0129 0.409 0.8373 0.2115 0.1632 0.6512 0.0106 0.2541 0.1307 0.4345 0.3746 0.4982 0.5767 0.4176 0.4646 0.7937 0.0945 0.4939 0.2963 0.3428 0.1653 0.2347 0.042 0.2981 0.4963 0.7002 0.1212 0.4068 0.0125 0.397 0. 0.4248 0.0281 0.1989 0.1466] 2022-08-23 05:05:59 [INFO] [EVAL] The model with the best validation mIoU (0.3470) was saved at iter 100000. 2022-08-23 05:06:09 [INFO] [TRAIN] epoch: 80, iter: 100050/160000, loss: 0.5994, lr: 0.000454, batch_cost: 0.1968, reader_cost: 0.00398, ips: 40.6564 samples/sec | ETA 03:16:36 2022-08-23 05:06:19 [INFO] [TRAIN] epoch: 80, iter: 100100/160000, loss: 0.5250, lr: 0.000454, batch_cost: 0.1989, reader_cost: 0.00137, ips: 40.2138 samples/sec | ETA 03:18:36 2022-08-23 05:06:29 [INFO] [TRAIN] epoch: 80, iter: 100150/160000, loss: 0.5984, lr: 0.000453, batch_cost: 0.1900, reader_cost: 0.00041, ips: 42.1127 samples/sec | ETA 03:09:29 2022-08-23 05:06:39 [INFO] [TRAIN] epoch: 80, iter: 100200/160000, loss: 0.5381, lr: 0.000453, batch_cost: 0.2141, reader_cost: 0.00085, ips: 37.3593 samples/sec | ETA 03:33:25 2022-08-23 05:06:48 [INFO] [TRAIN] epoch: 80, iter: 100250/160000, loss: 0.5428, lr: 0.000452, batch_cost: 0.1801, reader_cost: 0.00101, ips: 44.4200 samples/sec | ETA 02:59:20 2022-08-23 05:06:58 [INFO] [TRAIN] epoch: 80, iter: 100300/160000, loss: 0.5481, lr: 0.000452, batch_cost: 0.1957, reader_cost: 0.00050, ips: 40.8778 samples/sec | ETA 03:14:43 2022-08-23 05:07:07 [INFO] [TRAIN] epoch: 80, iter: 100350/160000, loss: 0.5585, lr: 0.000452, batch_cost: 0.1836, reader_cost: 0.00045, ips: 43.5813 samples/sec | ETA 03:02:29 2022-08-23 05:07:16 [INFO] [TRAIN] epoch: 80, iter: 100400/160000, loss: 0.5597, lr: 0.000451, batch_cost: 0.1828, reader_cost: 0.00565, ips: 43.7754 samples/sec | ETA 03:01:31 2022-08-23 05:07:26 [INFO] [TRAIN] epoch: 80, iter: 100450/160000, loss: 0.5665, lr: 0.000451, batch_cost: 0.1998, reader_cost: 0.00064, ips: 40.0352 samples/sec | ETA 03:18:19 2022-08-23 05:07:37 [INFO] [TRAIN] epoch: 80, iter: 100500/160000, loss: 0.5571, lr: 0.000450, batch_cost: 0.2115, reader_cost: 0.00063, ips: 37.8257 samples/sec | ETA 03:29:44 2022-08-23 05:07:47 [INFO] [TRAIN] epoch: 80, iter: 100550/160000, loss: 0.5470, lr: 0.000450, batch_cost: 0.1991, reader_cost: 0.00103, ips: 40.1875 samples/sec | ETA 03:17:14 2022-08-23 05:07:57 [INFO] [TRAIN] epoch: 80, iter: 100600/160000, loss: 0.5356, lr: 0.000450, batch_cost: 0.1996, reader_cost: 0.00044, ips: 40.0732 samples/sec | ETA 03:17:38 2022-08-23 05:08:08 [INFO] [TRAIN] epoch: 80, iter: 100650/160000, loss: 0.5758, lr: 0.000449, batch_cost: 0.2128, reader_cost: 0.00060, ips: 37.5992 samples/sec | ETA 03:30:27 2022-08-23 05:08:18 [INFO] [TRAIN] 
epoch: 80, iter: 100700/160000, loss: 0.4904, lr: 0.000449, batch_cost: 0.2014, reader_cost: 0.00072, ips: 39.7196 samples/sec | ETA 03:19:03 2022-08-23 05:08:28 [INFO] [TRAIN] epoch: 80, iter: 100750/160000, loss: 0.5520, lr: 0.000449, batch_cost: 0.2089, reader_cost: 0.00061, ips: 38.2988 samples/sec | ETA 03:26:16 2022-08-23 05:08:39 [INFO] [TRAIN] epoch: 80, iter: 100800/160000, loss: 0.5772, lr: 0.000448, batch_cost: 0.2175, reader_cost: 0.00059, ips: 36.7811 samples/sec | ETA 03:34:36 2022-08-23 05:08:49 [INFO] [TRAIN] epoch: 80, iter: 100850/160000, loss: 0.5427, lr: 0.000448, batch_cost: 0.2048, reader_cost: 0.00088, ips: 39.0683 samples/sec | ETA 03:21:52 2022-08-23 05:09:00 [INFO] [TRAIN] epoch: 80, iter: 100900/160000, loss: 0.5681, lr: 0.000447, batch_cost: 0.2089, reader_cost: 0.00053, ips: 38.2916 samples/sec | ETA 03:25:47 2022-08-23 05:09:09 [INFO] [TRAIN] epoch: 80, iter: 100950/160000, loss: 0.5419, lr: 0.000447, batch_cost: 0.1946, reader_cost: 0.00053, ips: 41.1056 samples/sec | ETA 03:11:32 2022-08-23 05:09:19 [INFO] [TRAIN] epoch: 80, iter: 101000/160000, loss: 0.5450, lr: 0.000447, batch_cost: 0.1827, reader_cost: 0.00053, ips: 43.7929 samples/sec | ETA 02:59:38 2022-08-23 05:09:19 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 164s - batch_cost: 0.1644 - reader cost: 7.0679e-04 2022-08-23 05:12:03 [INFO] [EVAL] #Images: 2000 mIoU: 0.3459 Acc: 0.7647 Kappa: 0.7467 Dice: 0.4786 2022-08-23 05:12:03 [INFO] [EVAL] Class IoU: [0.6831 0.775 0.932 0.726 0.6747 0.7585 0.7793 0.7831 0.529 0.5952 0.4956 0.5445 0.6889 0.2939 0.2856 0.4224 0.4971 0.4047 0.6126 0.4129 0.745 0.5548 0.6358 0.4991 0.3296 0.3686 0.5725 0.4722 0.4636 0.2183 0.2543 0.5095 0.2527 0.3627 0.2857 0.3808 0.4269 0.5458 0.2342 0.3947 0.2464 0.0959 0.3472 0.2524 0.3194 0.1692 0.3363 0.5225 0.435 0.5198 0.5598 0.352 0.2246 0.1955 0.6748 0.4347 0.8573 0.4209 0.4456 0.2644 0.0794 0.406 0.3564 0.2546 0.4585 0.6916 0.2859 0.4248 0.1148 0.3793 0.497 0.5153 0.4162 0.1905 0.4524 0.3415 0.4641 0.2995 0.2442 0.1179 0.5172 0.4166 0.3323 0.049 0.2893 0.5558 0.1192 0.0705 0.2505 0.5268 0.374 0.052 0.2292 0.1246 0.0183 0.0112 0.0355 0.1284 0.2875 0.3972 0.0737 0.0556 0.303 0.0927 0.0676 0.4115 0.0583 0.5015 0.1056 0.254 0.1045 0.3609 0.1291 0.5384 0.8487 0.0178 0.2611 0.6372 0.1454 0.1422 0.3869 0.0273 0.1991 0.1749 0.2432 0.2259 0.4656 0.388 0.5271 0.3225 0.5107 0.0474 0.2269 0.2778 0.1983 0.1002 0.1541 0.038 0.2115 0.3617 0.3189 0.01 0.3301 0.1357 0.308 0. 
0.3823 0.0255 0.1948 0.136 ] 2022-08-23 05:12:03 [INFO] [EVAL] Class Precision: [0.7795 0.8571 0.9606 0.8218 0.7513 0.859 0.8632 0.8577 0.6893 0.7677 0.7011 0.7434 0.78 0.4688 0.5835 0.5874 0.6737 0.7041 0.7816 0.6387 0.8425 0.6928 0.7883 0.5953 0.4923 0.4925 0.7467 0.6629 0.6973 0.323 0.4366 0.65 0.4718 0.4988 0.4225 0.4974 0.659 0.767 0.4346 0.6078 0.4114 0.2347 0.5113 0.549 0.4961 0.4079 0.5883 0.6927 0.6774 0.6188 0.7344 0.4933 0.39 0.5075 0.6966 0.6125 0.9075 0.6414 0.7242 0.4518 0.1273 0.5589 0.5335 0.6419 0.5549 0.7943 0.4076 0.5924 0.2061 0.7171 0.7787 0.6755 0.6017 0.2918 0.6897 0.5041 0.5823 0.5699 0.6177 0.331 0.5788 0.733 0.8067 0.111 0.5069 0.7286 0.4062 0.3915 0.4999 0.76 0.4708 0.0661 0.4355 0.3446 0.0828 0.028 0.2587 0.3742 0.4733 0.6553 0.3457 0.0901 0.5704 0.5499 0.91 0.5252 0.153 0.8411 0.2617 0.5752 0.3228 0.4897 0.4883 0.7163 0.8586 0.0896 0.5964 0.7501 0.205 0.2804 0.5149 0.222 0.5186 0.6153 0.6635 0.619 0.8837 0.5107 0.8023 0.6273 0.5728 0.4484 0.3791 0.5618 0.51 0.1913 0.3198 0.0798 0.5604 0.6444 0.3644 0.0177 0.6386 0.6163 0.5714 0. 0.8625 0.3405 0.6721 0.6491] 2022-08-23 05:12:03 [INFO] [EVAL] Class Recall: [0.8466 0.89 0.9691 0.8617 0.8687 0.8664 0.8891 0.9001 0.6946 0.7259 0.6283 0.6705 0.855 0.4407 0.3587 0.6006 0.6548 0.4876 0.7391 0.5386 0.8655 0.7358 0.7667 0.7556 0.4994 0.5945 0.7104 0.6215 0.5804 0.4026 0.3785 0.7021 0.3524 0.5707 0.4688 0.6191 0.548 0.6543 0.3369 0.5295 0.3806 0.1395 0.5197 0.3185 0.4728 0.2244 0.4398 0.6801 0.5487 0.7648 0.7019 0.5514 0.3463 0.2413 0.9556 0.5996 0.9395 0.5504 0.5366 0.3893 0.1743 0.5975 0.5178 0.2968 0.7252 0.8425 0.4893 0.6003 0.2057 0.4461 0.5788 0.6849 0.5744 0.3544 0.568 0.5143 0.6958 0.387 0.2877 0.1547 0.8292 0.4911 0.3611 0.0805 0.4027 0.701 0.1444 0.0791 0.3343 0.632 0.6452 0.1951 0.3261 0.1632 0.023 0.0183 0.0395 0.1635 0.4228 0.5022 0.0856 0.1266 0.3926 0.1003 0.068 0.6553 0.086 0.554 0.1504 0.3126 0.1339 0.5783 0.1493 0.6843 0.9866 0.0218 0.3171 0.809 0.3335 0.2238 0.6088 0.0302 0.2443 0.1964 0.2774 0.2624 0.496 0.6175 0.6057 0.3989 0.8247 0.0503 0.3612 0.3546 0.245 0.1738 0.2292 0.0677 0.2535 0.4519 0.7185 0.0226 0.4059 0.1482 0.4006 0. 0.4071 0.0268 0.2152 0.1468] 2022-08-23 05:12:03 [INFO] [EVAL] The model with the best validation mIoU (0.3470) was saved at iter 100000. 
2022-08-23 05:12:16 [INFO] [TRAIN] epoch: 81, iter: 101050/160000, loss: 0.5558, lr: 0.000446, batch_cost: 0.2581, reader_cost: 0.06209, ips: 30.9957 samples/sec | ETA 04:13:35 2022-08-23 05:12:27 [INFO] [TRAIN] epoch: 81, iter: 101100/160000, loss: 0.5842, lr: 0.000446, batch_cost: 0.2079, reader_cost: 0.00071, ips: 38.4738 samples/sec | ETA 03:24:07 2022-08-23 05:12:37 [INFO] [TRAIN] epoch: 81, iter: 101150/160000, loss: 0.5741, lr: 0.000446, batch_cost: 0.2113, reader_cost: 0.00060, ips: 37.8691 samples/sec | ETA 03:27:12 2022-08-23 05:12:48 [INFO] [TRAIN] epoch: 81, iter: 101200/160000, loss: 0.5252, lr: 0.000445, batch_cost: 0.2108, reader_cost: 0.00060, ips: 37.9567 samples/sec | ETA 03:26:33 2022-08-23 05:12:58 [INFO] [TRAIN] epoch: 81, iter: 101250/160000, loss: 0.5438, lr: 0.000445, batch_cost: 0.1980, reader_cost: 0.00055, ips: 40.4017 samples/sec | ETA 03:13:53 2022-08-23 05:13:08 [INFO] [TRAIN] epoch: 81, iter: 101300/160000, loss: 0.5559, lr: 0.000444, batch_cost: 0.2009, reader_cost: 0.00069, ips: 39.8293 samples/sec | ETA 03:16:30 2022-08-23 05:13:18 [INFO] [TRAIN] epoch: 81, iter: 101350/160000, loss: 0.5371, lr: 0.000444, batch_cost: 0.2026, reader_cost: 0.00042, ips: 39.4922 samples/sec | ETA 03:18:00 2022-08-23 05:13:29 [INFO] [TRAIN] epoch: 81, iter: 101400/160000, loss: 0.5414, lr: 0.000444, batch_cost: 0.2163, reader_cost: 0.00093, ips: 36.9799 samples/sec | ETA 03:31:17 2022-08-23 05:13:40 [INFO] [TRAIN] epoch: 81, iter: 101450/160000, loss: 0.5473, lr: 0.000443, batch_cost: 0.2255, reader_cost: 0.00069, ips: 35.4839 samples/sec | ETA 03:40:00 2022-08-23 05:13:50 [INFO] [TRAIN] epoch: 81, iter: 101500/160000, loss: 0.5509, lr: 0.000443, batch_cost: 0.2061, reader_cost: 0.00049, ips: 38.8170 samples/sec | ETA 03:20:56 2022-08-23 05:14:00 [INFO] [TRAIN] epoch: 81, iter: 101550/160000, loss: 0.5207, lr: 0.000443, batch_cost: 0.1919, reader_cost: 0.00044, ips: 41.6926 samples/sec | ETA 03:06:55 2022-08-23 05:14:11 [INFO] [TRAIN] epoch: 81, iter: 101600/160000, loss: 0.5517, lr: 0.000442, batch_cost: 0.2265, reader_cost: 0.00123, ips: 35.3190 samples/sec | ETA 03:40:28 2022-08-23 05:14:21 [INFO] [TRAIN] epoch: 81, iter: 101650/160000, loss: 0.5171, lr: 0.000442, batch_cost: 0.1939, reader_cost: 0.00087, ips: 41.2585 samples/sec | ETA 03:08:34 2022-08-23 05:14:30 [INFO] [TRAIN] epoch: 81, iter: 101700/160000, loss: 0.5517, lr: 0.000441, batch_cost: 0.1906, reader_cost: 0.00046, ips: 41.9837 samples/sec | ETA 03:05:09 2022-08-23 05:14:41 [INFO] [TRAIN] epoch: 81, iter: 101750/160000, loss: 0.5216, lr: 0.000441, batch_cost: 0.2038, reader_cost: 0.00087, ips: 39.2505 samples/sec | ETA 03:17:52 2022-08-23 05:14:50 [INFO] [TRAIN] epoch: 81, iter: 101800/160000, loss: 0.5503, lr: 0.000441, batch_cost: 0.1832, reader_cost: 0.00098, ips: 43.6787 samples/sec | ETA 02:57:39 2022-08-23 05:14:59 [INFO] [TRAIN] epoch: 81, iter: 101850/160000, loss: 0.6002, lr: 0.000440, batch_cost: 0.1811, reader_cost: 0.00050, ips: 44.1663 samples/sec | ETA 02:55:32 2022-08-23 05:15:08 [INFO] [TRAIN] epoch: 81, iter: 101900/160000, loss: 0.5355, lr: 0.000440, batch_cost: 0.1835, reader_cost: 0.00062, ips: 43.6061 samples/sec | ETA 02:57:39 2022-08-23 05:15:18 [INFO] [TRAIN] epoch: 81, iter: 101950/160000, loss: 0.5368, lr: 0.000440, batch_cost: 0.2023, reader_cost: 0.00082, ips: 39.5387 samples/sec | ETA 03:15:45 2022-08-23 05:15:28 [INFO] [TRAIN] epoch: 81, iter: 102000/160000, loss: 0.5476, lr: 0.000439, batch_cost: 0.1930, reader_cost: 0.00063, ips: 41.4574 samples/sec | ETA 03:06:32 2022-08-23 
05:15:28 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 174s - batch_cost: 0.1739 - reader cost: 0.0013 2022-08-23 05:18:22 [INFO] [EVAL] #Images: 2000 mIoU: 0.3422 Acc: 0.7642 Kappa: 0.7461 Dice: 0.4740 2022-08-23 05:18:22 [INFO] [EVAL] Class IoU: [0.6882 0.7743 0.9319 0.7245 0.6853 0.7567 0.7688 0.7747 0.5182 0.6307 0.467 0.5403 0.7015 0.3109 0.3029 0.4227 0.5073 0.4254 0.6157 0.4138 0.725 0.525 0.6402 0.5028 0.2262 0.3166 0.5771 0.4637 0.4161 0.2235 0.2768 0.4629 0.2842 0.3487 0.288 0.3688 0.4267 0.5602 0.2672 0.3856 0.1909 0.1299 0.3433 0.256 0.3015 0.1758 0.3075 0.5205 0.5759 0.4952 0.5374 0.3609 0.2051 0.1905 0.6938 0.4689 0.8746 0.3427 0.4405 0.2104 0.0742 0.2033 0.3552 0.2084 0.4776 0.6935 0.2963 0.4005 0.1011 0.3198 0.4921 0.5006 0.4103 0.2038 0.4565 0.3136 0.5142 0.2426 0.1924 0.1069 0.5226 0.4067 0.2943 0.0304 0.2251 0.5546 0.1046 0.1052 0.2338 0.5033 0.423 0.0506 0.2241 0.0786 0.0273 0.011 0.0191 0.1341 0.3053 0.4057 0.091 0.0495 0.2547 0.2229 0.1161 0.4384 0.0707 0.5233 0.0686 0.2374 0.1395 0.3774 0.1327 0.5623 0.843 0.0067 0.2859 0.6386 0.1801 0.131 0.4709 0.0114 0.2204 0.1139 0.2726 0.259 0.4448 0.4079 0.2806 0.3222 0.5465 0.0528 0.2628 0.245 0.1526 0.1148 0.1602 0.0338 0.206 0.3522 0.3536 0.0017 0.2859 0.2284 0.2311 0. 0.4097 0.0302 0.2161 0.1775] 2022-08-23 05:18:22 [INFO] [EVAL] Class Precision: [0.7818 0.8559 0.9678 0.815 0.7627 0.8648 0.8644 0.8357 0.6643 0.8046 0.6369 0.7006 0.7843 0.4643 0.5435 0.5947 0.7474 0.672 0.7532 0.6335 0.7942 0.6723 0.7723 0.6433 0.5102 0.4736 0.7205 0.7512 0.7304 0.3443 0.4779 0.5791 0.4884 0.4613 0.5161 0.4726 0.6346 0.8156 0.4589 0.634 0.3479 0.2875 0.5295 0.5149 0.4132 0.4001 0.4881 0.6941 0.7143 0.5774 0.7592 0.469 0.3501 0.4505 0.8072 0.6813 0.9315 0.7237 0.623 0.3679 0.1166 0.4167 0.5267 0.6532 0.6303 0.8179 0.4301 0.5227 0.2071 0.5772 0.782 0.6473 0.6104 0.2859 0.6967 0.5917 0.6021 0.4958 0.3448 0.3355 0.5978 0.6519 0.8412 0.0715 0.4507 0.7743 0.4738 0.3522 0.4539 0.7753 0.6001 0.0991 0.4467 0.3548 0.0763 0.0341 0.5424 0.3195 0.5048 0.6146 0.284 0.0895 0.6257 0.5752 0.848 0.5419 0.2837 0.7993 0.2247 0.5513 0.3614 0.6025 0.3775 0.7598 0.8502 0.0606 0.7561 0.7767 0.2707 0.4405 0.6056 0.1683 0.5233 0.6576 0.6845 0.6415 0.8162 0.5341 0.6743 0.6631 0.6261 0.2871 0.4501 0.6197 0.5023 0.2324 0.414 0.102 0.4593 0.6781 0.4069 0.0034 0.6435 0.6387 0.4434 0. 0.8476 0.3453 0.5168 0.61 ] 2022-08-23 05:18:22 [INFO] [EVAL] Class Recall: [0.8519 0.8904 0.9618 0.8671 0.8709 0.8582 0.8742 0.9139 0.702 0.7448 0.6363 0.7025 0.8691 0.4849 0.4062 0.5938 0.6123 0.5369 0.7713 0.5441 0.8928 0.7055 0.7891 0.6971 0.289 0.4886 0.7436 0.5478 0.4916 0.389 0.3968 0.6975 0.4047 0.5882 0.3946 0.6268 0.5658 0.6414 0.39 0.496 0.2972 0.1915 0.494 0.3373 0.5273 0.2387 0.4539 0.6754 0.7482 0.7768 0.6477 0.6103 0.3312 0.2482 0.8317 0.6006 0.9348 0.3944 0.6006 0.3294 0.1695 0.2841 0.5217 0.2343 0.6634 0.8202 0.4878 0.6314 0.1651 0.4176 0.5703 0.6883 0.5558 0.4149 0.5697 0.4002 0.7788 0.322 0.3033 0.1356 0.806 0.5194 0.3116 0.0501 0.3102 0.6615 0.1183 0.1304 0.3254 0.5892 0.589 0.0937 0.3101 0.0917 0.0407 0.016 0.0194 0.1877 0.4359 0.5441 0.1181 0.0996 0.3005 0.2668 0.1186 0.6966 0.0861 0.6025 0.0899 0.2942 0.1851 0.5026 0.1698 0.6839 0.9901 0.0075 0.3149 0.7822 0.3499 0.1572 0.6792 0.0121 0.2757 0.1211 0.3118 0.3029 0.4942 0.6333 0.3246 0.3853 0.8114 0.0608 0.3871 0.2883 0.1797 0.185 0.2073 0.0482 0.272 0.4229 0.7296 0.0034 0.3397 0.2623 0.3256 0. 
0.4422 0.032 0.2708 0.2002] 2022-08-23 05:18:22 [INFO] [EVAL] The model with the best validation mIoU (0.3470) was saved at iter 100000. 2022-08-23 05:18:31 [INFO] [TRAIN] epoch: 81, iter: 102050/160000, loss: 0.5564, lr: 0.000439, batch_cost: 0.1805, reader_cost: 0.00463, ips: 44.3149 samples/sec | ETA 02:54:21 2022-08-23 05:18:40 [INFO] [TRAIN] epoch: 81, iter: 102100/160000, loss: 0.5299, lr: 0.000438, batch_cost: 0.1830, reader_cost: 0.00093, ips: 43.7275 samples/sec | ETA 02:56:32 2022-08-23 05:18:51 [INFO] [TRAIN] epoch: 81, iter: 102150/160000, loss: 0.5314, lr: 0.000438, batch_cost: 0.2064, reader_cost: 0.00066, ips: 38.7609 samples/sec | ETA 03:18:59 2022-08-23 05:19:00 [INFO] [TRAIN] epoch: 81, iter: 102200/160000, loss: 0.5448, lr: 0.000438, batch_cost: 0.1976, reader_cost: 0.00080, ips: 40.4904 samples/sec | ETA 03:10:19 2022-08-23 05:19:10 [INFO] [TRAIN] epoch: 81, iter: 102250/160000, loss: 0.5397, lr: 0.000437, batch_cost: 0.1957, reader_cost: 0.00051, ips: 40.8815 samples/sec | ETA 03:08:20 2022-08-23 05:19:19 [INFO] [TRAIN] epoch: 81, iter: 102300/160000, loss: 0.5214, lr: 0.000437, batch_cost: 0.1807, reader_cost: 0.00060, ips: 44.2646 samples/sec | ETA 02:53:48 2022-08-23 05:19:32 [INFO] [TRAIN] epoch: 82, iter: 102350/160000, loss: 0.5314, lr: 0.000436, batch_cost: 0.2607, reader_cost: 0.07159, ips: 30.6869 samples/sec | ETA 04:10:29 2022-08-23 05:19:42 [INFO] [TRAIN] epoch: 82, iter: 102400/160000, loss: 0.5626, lr: 0.000436, batch_cost: 0.1847, reader_cost: 0.00078, ips: 43.3081 samples/sec | ETA 02:57:20 2022-08-23 05:19:51 [INFO] [TRAIN] epoch: 82, iter: 102450/160000, loss: 0.5454, lr: 0.000436, batch_cost: 0.1991, reader_cost: 0.00043, ips: 40.1735 samples/sec | ETA 03:11:00 2022-08-23 05:20:01 [INFO] [TRAIN] epoch: 82, iter: 102500/160000, loss: 0.5273, lr: 0.000435, batch_cost: 0.1951, reader_cost: 0.00031, ips: 41.0132 samples/sec | ETA 03:06:55 2022-08-23 05:20:10 [INFO] [TRAIN] epoch: 82, iter: 102550/160000, loss: 0.5245, lr: 0.000435, batch_cost: 0.1751, reader_cost: 0.00107, ips: 45.6753 samples/sec | ETA 02:47:42 2022-08-23 05:20:19 [INFO] [TRAIN] epoch: 82, iter: 102600/160000, loss: 0.5164, lr: 0.000435, batch_cost: 0.1821, reader_cost: 0.00032, ips: 43.9408 samples/sec | ETA 02:54:10 2022-08-23 05:20:28 [INFO] [TRAIN] epoch: 82, iter: 102650/160000, loss: 0.5232, lr: 0.000434, batch_cost: 0.1735, reader_cost: 0.00088, ips: 46.1094 samples/sec | ETA 02:45:50 2022-08-23 05:20:38 [INFO] [TRAIN] epoch: 82, iter: 102700/160000, loss: 0.5535, lr: 0.000434, batch_cost: 0.2062, reader_cost: 0.00155, ips: 38.7936 samples/sec | ETA 03:16:56 2022-08-23 05:20:49 [INFO] [TRAIN] epoch: 82, iter: 102750/160000, loss: 0.5522, lr: 0.000433, batch_cost: 0.2174, reader_cost: 0.00091, ips: 36.8007 samples/sec | ETA 03:27:25 2022-08-23 05:20:59 [INFO] [TRAIN] epoch: 82, iter: 102800/160000, loss: 0.5663, lr: 0.000433, batch_cost: 0.1945, reader_cost: 0.00042, ips: 41.1347 samples/sec | ETA 03:05:24 2022-08-23 05:21:08 [INFO] [TRAIN] epoch: 82, iter: 102850/160000, loss: 0.5471, lr: 0.000433, batch_cost: 0.1763, reader_cost: 0.00086, ips: 45.3826 samples/sec | ETA 02:47:54 2022-08-23 05:21:18 [INFO] [TRAIN] epoch: 82, iter: 102900/160000, loss: 0.5417, lr: 0.000432, batch_cost: 0.2038, reader_cost: 0.00086, ips: 39.2560 samples/sec | ETA 03:13:56 2022-08-23 05:21:28 [INFO] [TRAIN] epoch: 82, iter: 102950/160000, loss: 0.5564, lr: 0.000432, batch_cost: 0.2062, reader_cost: 0.00034, ips: 38.7925 samples/sec | ETA 03:16:05 2022-08-23 05:21:39 [INFO] [TRAIN] epoch: 82, iter: 
103000/160000, loss: 0.5719, lr: 0.000432, batch_cost: 0.2150, reader_cost: 0.00032, ips: 37.2087 samples/sec | ETA 03:24:15 2022-08-23 05:21:39 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 182s - batch_cost: 0.1819 - reader cost: 8.4610e-04 2022-08-23 05:24:41 [INFO] [EVAL] #Images: 2000 mIoU: 0.3436 Acc: 0.7654 Kappa: 0.7473 Dice: 0.4743 2022-08-23 05:24:41 [INFO] [EVAL] Class IoU: [0.6902 0.7681 0.9315 0.7313 0.6823 0.7493 0.7752 0.7682 0.5229 0.5892 0.4883 0.5537 0.7006 0.3108 0.3107 0.4217 0.547 0.4365 0.6128 0.4159 0.731 0.4891 0.6363 0.4911 0.3052 0.2162 0.4908 0.4651 0.4357 0.2412 0.2907 0.5135 0.2806 0.3584 0.2766 0.3912 0.437 0.5468 0.2589 0.402 0.2106 0.1241 0.3491 0.2495 0.3132 0.1458 0.3124 0.5118 0.6387 0.5321 0.5182 0.3586 0.231 0.2041 0.7079 0.4704 0.8655 0.3778 0.342 0.2284 0.0599 0.3231 0.3391 0.2485 0.4782 0.6919 0.2874 0.4046 0.0997 0.3552 0.524 0.5166 0.4128 0.1769 0.4475 0.2951 0.4285 0.2801 0.2461 0.1586 0.5546 0.4201 0.2867 0.0272 0.3624 0.5534 0.0992 0.087 0.2526 0.5048 0.3907 0.0557 0.236 0.0687 0.0164 0.0156 0.0206 0.0954 0.2913 0.4595 0.0691 0.0516 0.1621 0.0644 0.0689 0.535 0.0248 0.5256 0.1034 0.241 0.0824 0.5461 0.1356 0.5381 0.755 0.023 0.3244 0.6718 0.0996 0.1554 0.491 0.0121 0.2062 0.1501 0.2709 0.2879 0.409 0.4141 0.4547 0.2582 0.5318 0.0859 0.2224 0.2557 0.1514 0.117 0.1479 0.0151 0.2259 0.3703 0.3178 0.0188 0.222 0.2333 0.2497 0.0017 0.4093 0.0226 0.1635 0.1208] 2022-08-23 05:24:41 [INFO] [EVAL] Class Precision: [0.7832 0.8327 0.9626 0.8309 0.7786 0.8836 0.8644 0.8187 0.6846 0.7556 0.7006 0.711 0.7771 0.4956 0.5625 0.5691 0.7321 0.6774 0.7513 0.6044 0.8096 0.6817 0.7643 0.6417 0.5183 0.4342 0.6149 0.7156 0.7274 0.3266 0.415 0.6561 0.4705 0.4691 0.4288 0.5379 0.629 0.7468 0.485 0.5709 0.434 0.2778 0.7264 0.5298 0.453 0.3595 0.4826 0.7783 0.7051 0.6478 0.7636 0.4928 0.3952 0.5567 0.7764 0.6557 0.9147 0.7075 0.7367 0.4912 0.1025 0.5531 0.5548 0.5908 0.6453 0.7949 0.3897 0.5173 0.2062 0.6898 0.7008 0.705 0.6081 0.2863 0.7305 0.507 0.5167 0.5627 0.697 0.4398 0.6364 0.7326 0.8371 0.0636 0.525 0.7627 0.4315 0.3873 0.4132 0.7713 0.5242 0.0743 0.4429 0.3722 0.0469 0.0617 0.5727 0.2884 0.5527 0.6567 0.3222 0.0938 0.5244 0.5913 0.8711 0.5855 0.1153 0.8124 0.2509 0.574 0.3078 0.8047 0.5648 0.7317 0.76 0.1326 0.625 0.803 0.1367 0.482 0.6231 0.1536 0.5588 0.5359 0.6 0.6296 0.7565 0.5896 0.757 0.3246 0.5896 0.4329 0.3335 0.5898 0.585 0.3817 0.3352 0.0945 0.5552 0.5941 0.3567 0.0351 0.6625 0.4685 0.454 0.0031 0.8561 0.3505 0.6408 0.6963] 2022-08-23 05:24:41 [INFO] [EVAL] Class Recall: [0.8532 0.9083 0.9665 0.8591 0.8465 0.8314 0.8825 0.9257 0.6889 0.7279 0.6171 0.7145 0.8768 0.4546 0.4098 0.6194 0.6839 0.551 0.7688 0.5714 0.8827 0.6338 0.7917 0.6767 0.4261 0.301 0.7086 0.5706 0.5207 0.4797 0.4925 0.7026 0.4102 0.6031 0.4379 0.5892 0.5887 0.6712 0.357 0.5761 0.2903 0.1831 0.4019 0.3204 0.5036 0.197 0.4697 0.5991 0.8716 0.7487 0.6172 0.5684 0.3574 0.2436 0.8892 0.6248 0.9416 0.4478 0.3896 0.2991 0.1261 0.4372 0.466 0.3002 0.6487 0.8423 0.5224 0.6502 0.1617 0.4228 0.675 0.659 0.5624 0.3166 0.5359 0.4138 0.7151 0.3581 0.2756 0.1988 0.8118 0.4961 0.3037 0.0452 0.5392 0.6686 0.1141 0.1009 0.3939 0.5936 0.6054 0.1814 0.3356 0.0777 0.0245 0.0204 0.0209 0.1247 0.3812 0.6049 0.0808 0.103 0.19 0.0674 0.0696 0.8612 0.0306 0.5983 0.1496 0.2935 0.1011 0.6295 0.1514 0.6703 0.9914 0.0271 0.4028 0.8044 0.2685 0.1865 0.6985 0.0129 0.2463 0.1725 0.3306 0.3466 0.4711 0.5818 0.5324 0.5578 0.8443 0.0968 0.4005 0.3111 0.1696 0.1443 0.2093 0.0176 0.2758 
0.4957 0.7444 0.039 0.2503 0.3173 0.3569 0.0037 0.4395 0.0236 0.18 0.1275] 2022-08-23 05:24:41 [INFO] [EVAL] The model with the best validation mIoU (0.3470) was saved at iter 100000. 2022-08-23 05:24:51 [INFO] [TRAIN] epoch: 82, iter: 103050/160000, loss: 0.5408, lr: 0.000431, batch_cost: 0.2070, reader_cost: 0.00359, ips: 38.6502 samples/sec | ETA 03:16:27 2022-08-23 05:25:01 [INFO] [TRAIN] epoch: 82, iter: 103100/160000, loss: 0.5386, lr: 0.000431, batch_cost: 0.1919, reader_cost: 0.00130, ips: 41.6885 samples/sec | ETA 03:01:59 2022-08-23 05:25:11 [INFO] [TRAIN] epoch: 82, iter: 103150/160000, loss: 0.5270, lr: 0.000430, batch_cost: 0.1928, reader_cost: 0.00080, ips: 41.4943 samples/sec | ETA 03:02:40 2022-08-23 05:25:21 [INFO] [TRAIN] epoch: 82, iter: 103200/160000, loss: 0.5135, lr: 0.000430, batch_cost: 0.2025, reader_cost: 0.00060, ips: 39.5041 samples/sec | ETA 03:11:42 2022-08-23 05:25:30 [INFO] [TRAIN] epoch: 82, iter: 103250/160000, loss: 0.5431, lr: 0.000430, batch_cost: 0.1828, reader_cost: 0.00067, ips: 43.7676 samples/sec | ETA 02:52:52 2022-08-23 05:25:39 [INFO] [TRAIN] epoch: 82, iter: 103300/160000, loss: 0.5386, lr: 0.000429, batch_cost: 0.1779, reader_cost: 0.00071, ips: 44.9567 samples/sec | ETA 02:48:09 2022-08-23 05:25:49 [INFO] [TRAIN] epoch: 82, iter: 103350/160000, loss: 0.5291, lr: 0.000429, batch_cost: 0.1963, reader_cost: 0.00063, ips: 40.7617 samples/sec | ETA 03:05:18 2022-08-23 05:25:59 [INFO] [TRAIN] epoch: 82, iter: 103400/160000, loss: 0.5290, lr: 0.000429, batch_cost: 0.2078, reader_cost: 0.00074, ips: 38.4985 samples/sec | ETA 03:16:01 2022-08-23 05:26:08 [INFO] [TRAIN] epoch: 82, iter: 103450/160000, loss: 0.5479, lr: 0.000428, batch_cost: 0.1876, reader_cost: 0.00041, ips: 42.6342 samples/sec | ETA 02:56:51 2022-08-23 05:26:18 [INFO] [TRAIN] epoch: 82, iter: 103500/160000, loss: 0.5178, lr: 0.000428, batch_cost: 0.1961, reader_cost: 0.00099, ips: 40.7997 samples/sec | ETA 03:04:38 2022-08-23 05:26:28 [INFO] [TRAIN] epoch: 82, iter: 103550/160000, loss: 0.5064, lr: 0.000427, batch_cost: 0.2034, reader_cost: 0.00058, ips: 39.3255 samples/sec | ETA 03:11:23 2022-08-23 05:26:41 [INFO] [TRAIN] epoch: 83, iter: 103600/160000, loss: 0.5331, lr: 0.000427, batch_cost: 0.2574, reader_cost: 0.08373, ips: 31.0787 samples/sec | ETA 04:01:57 2022-08-23 05:26:52 [INFO] [TRAIN] epoch: 83, iter: 103650/160000, loss: 0.5650, lr: 0.000427, batch_cost: 0.2071, reader_cost: 0.00078, ips: 38.6333 samples/sec | ETA 03:14:28 2022-08-23 05:27:02 [INFO] [TRAIN] epoch: 83, iter: 103700/160000, loss: 0.5379, lr: 0.000426, batch_cost: 0.2098, reader_cost: 0.00042, ips: 38.1360 samples/sec | ETA 03:16:50 2022-08-23 05:27:11 [INFO] [TRAIN] epoch: 83, iter: 103750/160000, loss: 0.5295, lr: 0.000426, batch_cost: 0.1741, reader_cost: 0.00052, ips: 45.9537 samples/sec | ETA 02:43:12 2022-08-23 05:27:20 [INFO] [TRAIN] epoch: 83, iter: 103800/160000, loss: 0.5577, lr: 0.000425, batch_cost: 0.1873, reader_cost: 0.00043, ips: 42.7112 samples/sec | ETA 02:55:26 2022-08-23 05:27:30 [INFO] [TRAIN] epoch: 83, iter: 103850/160000, loss: 0.5371, lr: 0.000425, batch_cost: 0.1873, reader_cost: 0.00066, ips: 42.7190 samples/sec | ETA 02:55:15 2022-08-23 05:27:38 [INFO] [TRAIN] epoch: 83, iter: 103900/160000, loss: 0.5628, lr: 0.000425, batch_cost: 0.1666, reader_cost: 0.00081, ips: 48.0320 samples/sec | ETA 02:35:43 2022-08-23 05:27:47 [INFO] [TRAIN] epoch: 83, iter: 103950/160000, loss: 0.5106, lr: 0.000424, batch_cost: 0.1886, reader_cost: 0.00105, ips: 42.4188 samples/sec | ETA 02:56:10 
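The throughput and ETA figures in the [TRAIN] lines are consistent with ips being a per-step sample count divided by batch_cost (ips × batch_cost ≈ 8 on every line) and the ETA being the current batch_cost extrapolated over the remaining iterations out of 160000. A rough sketch of that arithmetic, assuming exactly those two formulas:

```python
import datetime

def train_line_stats(cur_iter, batch_cost, total_iters=160000, samples_per_step=8):
    # Throughput as printed in the log: samples processed per second.
    ips = samples_per_step / batch_cost
    # ETA: remaining iterations at the current per-step cost.
    eta = datetime.timedelta(seconds=int((total_iters - cur_iter) * batch_cost))
    return round(ips, 4), str(eta)

# Spot check against the iter-103950 line just above; the small differences
# come from the rounding of the printed batch_cost.
print(train_line_stats(103950, 0.1886))   # (42.4178, '2:56:11') vs logged 42.4188 / 02:56:10
```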
2022-08-23 05:27:58 [INFO] [TRAIN] epoch: 83, iter: 104000/160000, loss: 0.5476, lr: 0.000424, batch_cost: 0.2090, reader_cost: 0.00037, ips: 38.2697 samples/sec | ETA 03:15:06 2022-08-23 05:27:58 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 174s - batch_cost: 0.1735 - reader cost: 5.9358e-04 2022-08-23 05:30:51 [INFO] [EVAL] #Images: 2000 mIoU: 0.3455 Acc: 0.7637 Kappa: 0.7455 Dice: 0.4769 2022-08-23 05:30:51 [INFO] [EVAL] Class IoU: [0.6889 0.7782 0.9312 0.7229 0.6787 0.7613 0.773 0.7833 0.5224 0.5862 0.4858 0.5554 0.6969 0.2554 0.3154 0.4247 0.5031 0.4411 0.6147 0.4174 0.7317 0.4407 0.6378 0.4996 0.2706 0.2965 0.4362 0.4644 0.4133 0.2393 0.2793 0.4887 0.2934 0.3685 0.2793 0.3707 0.428 0.5626 0.2452 0.3955 0.264 0.1251 0.3584 0.2527 0.2883 0.1968 0.314 0.5194 0.6251 0.5318 0.5241 0.4162 0.2199 0.1859 0.6911 0.4137 0.8606 0.3999 0.4618 0.2394 0.0733 0.4093 0.3041 0.1872 0.4394 0.6836 0.3045 0.3779 0.1218 0.3288 0.5201 0.5405 0.414 0.2246 0.4298 0.3094 0.5527 0.2446 0.2566 0.1415 0.619 0.404 0.3244 0.0265 0.1462 0.5586 0.1214 0.0868 0.2747 0.4701 0.39 0.0362 0.2315 0.09 0.0029 0.017 0.0268 0.115 0.2995 0.3742 0.052 0.0503 0.2966 0.1179 0.0465 0.5735 0.0445 0.5176 0.0998 0.2491 0.0917 0.5141 0.1313 0.5742 0.7514 0.0126 0.2859 0.6622 0.1743 0.3075 0.4022 0.0165 0.2182 0.1432 0.2481 0.2985 0.3877 0.3985 0.5048 0.3104 0.5836 0.0618 0.2716 0.2344 0.1657 0.1248 0.1439 0.0298 0.2341 0.3617 0.2625 0.0045 0.3734 0.0423 0.2878 0.0094 0.397 0.0273 0.1388 0.1499] 2022-08-23 05:30:51 [INFO] [EVAL] Class Precision: [0.7825 0.8474 0.9644 0.8151 0.7848 0.8631 0.8702 0.8467 0.6741 0.7353 0.6887 0.7134 0.7645 0.474 0.5541 0.6206 0.6825 0.6918 0.7815 0.6245 0.8349 0.689 0.8047 0.6458 0.5262 0.5261 0.5309 0.745 0.6956 0.319 0.4636 0.6129 0.5218 0.4651 0.4158 0.5256 0.6109 0.7423 0.4153 0.5969 0.3894 0.3013 0.6062 0.5562 0.3677 0.3859 0.5136 0.6838 0.6984 0.6294 0.7694 0.628 0.3481 0.5206 0.7138 0.5157 0.9019 0.669 0.6546 0.4604 0.1047 0.5048 0.5588 0.602 0.5357 0.8025 0.469 0.4428 0.2453 0.5726 0.7441 0.7089 0.6026 0.3015 0.6438 0.4672 0.6604 0.4982 0.7281 0.3141 0.7032 0.6358 0.8106 0.0504 0.3536 0.7548 0.4086 0.411 0.5023 0.7407 0.6064 0.045 0.4026 0.342 0.0287 0.0464 0.6785 0.3335 0.4893 0.6509 0.324 0.095 0.6044 0.372 0.2969 0.6132 0.1606 0.7921 0.2193 0.5408 0.2839 0.7505 0.4621 0.7864 0.7557 0.0907 0.7032 0.7928 0.2581 0.5444 0.5919 0.1264 0.8293 0.666 0.6508 0.6003 0.8516 0.5405 0.8871 0.4621 0.6493 0.3255 0.4364 0.6684 0.48 0.2044 0.3332 0.0796 0.4429 0.6719 0.2911 0.007 0.6226 0.5736 0.5653 0.0255 0.8871 0.3293 0.418 0.4942] 2022-08-23 05:30:51 [INFO] [EVAL] Class Recall: [0.8521 0.905 0.9644 0.8646 0.8339 0.8658 0.8737 0.9127 0.6989 0.743 0.6225 0.715 0.8874 0.3563 0.4227 0.5737 0.6568 0.5489 0.7423 0.5572 0.8556 0.5502 0.7546 0.6883 0.3577 0.4045 0.7097 0.5521 0.5045 0.4893 0.4127 0.7069 0.4012 0.6397 0.4596 0.5572 0.5884 0.6992 0.3744 0.5397 0.4506 0.1763 0.4672 0.3165 0.5718 0.2864 0.4469 0.6836 0.8563 0.7744 0.6218 0.5524 0.3739 0.2242 0.9559 0.6765 0.9495 0.4986 0.6106 0.3327 0.1964 0.6838 0.4003 0.2136 0.7097 0.8218 0.4646 0.7205 0.1947 0.4358 0.6333 0.6946 0.5694 0.4684 0.5638 0.478 0.7722 0.3246 0.2838 0.2049 0.8379 0.5257 0.351 0.0531 0.1995 0.6824 0.1473 0.0992 0.3775 0.5628 0.5222 0.1556 0.3526 0.1088 0.0032 0.0262 0.0271 0.1493 0.4356 0.4681 0.0583 0.0965 0.368 0.1472 0.0522 0.8984 0.058 0.599 0.1548 0.3159 0.1193 0.6201 0.155 0.6804 0.9923 0.0145 0.3252 0.8008 0.3494 0.414 0.5565 0.0186 0.2285 0.1543 0.2862 0.3725 0.4158 0.6028 0.5395 0.486 0.8524 
0.0709 0.4184 0.2653 0.202 0.2426 0.2021 0.0454 0.3319 0.4392 0.7277 0.0122 0.4826 0.0436 0.3696 0.0148 0.4181 0.0289 0.172 0.177 ] 2022-08-23 05:30:52 [INFO] [EVAL] The model with the best validation mIoU (0.3470) was saved at iter 100000. 2022-08-23 05:31:01 [INFO] [TRAIN] epoch: 83, iter: 104050/160000, loss: 0.5514, lr: 0.000424, batch_cost: 0.1917, reader_cost: 0.00318, ips: 41.7220 samples/sec | ETA 02:58:48 2022-08-23 05:31:10 [INFO] [TRAIN] epoch: 83, iter: 104100/160000, loss: 0.5689, lr: 0.000423, batch_cost: 0.1812, reader_cost: 0.00126, ips: 44.1454 samples/sec | ETA 02:48:50 2022-08-23 05:31:20 [INFO] [TRAIN] epoch: 83, iter: 104150/160000, loss: 0.5264, lr: 0.000423, batch_cost: 0.1933, reader_cost: 0.00045, ips: 41.3967 samples/sec | ETA 02:59:53 2022-08-23 05:31:31 [INFO] [TRAIN] epoch: 83, iter: 104200/160000, loss: 0.5342, lr: 0.000422, batch_cost: 0.2146, reader_cost: 0.00084, ips: 37.2809 samples/sec | ETA 03:19:33 2022-08-23 05:31:42 [INFO] [TRAIN] epoch: 83, iter: 104250/160000, loss: 0.5384, lr: 0.000422, batch_cost: 0.2235, reader_cost: 0.00060, ips: 35.7988 samples/sec | ETA 03:27:38 2022-08-23 05:31:53 [INFO] [TRAIN] epoch: 83, iter: 104300/160000, loss: 0.5799, lr: 0.000422, batch_cost: 0.2172, reader_cost: 0.00059, ips: 36.8395 samples/sec | ETA 03:21:35 2022-08-23 05:32:03 [INFO] [TRAIN] epoch: 83, iter: 104350/160000, loss: 0.5128, lr: 0.000421, batch_cost: 0.2071, reader_cost: 0.00051, ips: 38.6323 samples/sec | ETA 03:12:04 2022-08-23 05:32:13 [INFO] [TRAIN] epoch: 83, iter: 104400/160000, loss: 0.5086, lr: 0.000421, batch_cost: 0.2018, reader_cost: 0.00056, ips: 39.6377 samples/sec | ETA 03:07:01 2022-08-23 05:32:23 [INFO] [TRAIN] epoch: 83, iter: 104450/160000, loss: 0.5267, lr: 0.000421, batch_cost: 0.2010, reader_cost: 0.00095, ips: 39.7929 samples/sec | ETA 03:06:07 2022-08-23 05:32:34 [INFO] [TRAIN] epoch: 83, iter: 104500/160000, loss: 0.5710, lr: 0.000420, batch_cost: 0.2111, reader_cost: 0.00050, ips: 37.8966 samples/sec | ETA 03:15:16 2022-08-23 05:32:43 [INFO] [TRAIN] epoch: 83, iter: 104550/160000, loss: 0.5633, lr: 0.000420, batch_cost: 0.1923, reader_cost: 0.00087, ips: 41.5980 samples/sec | ETA 02:57:43 2022-08-23 05:32:53 [INFO] [TRAIN] epoch: 83, iter: 104600/160000, loss: 0.5141, lr: 0.000419, batch_cost: 0.1922, reader_cost: 0.00073, ips: 41.6135 samples/sec | ETA 02:57:30 2022-08-23 05:33:03 [INFO] [TRAIN] epoch: 83, iter: 104650/160000, loss: 0.5213, lr: 0.000419, batch_cost: 0.2026, reader_cost: 0.00049, ips: 39.4843 samples/sec | ETA 03:06:54 2022-08-23 05:33:13 [INFO] [TRAIN] epoch: 83, iter: 104700/160000, loss: 0.5211, lr: 0.000419, batch_cost: 0.1982, reader_cost: 0.00083, ips: 40.3595 samples/sec | ETA 03:02:41 2022-08-23 05:33:23 [INFO] [TRAIN] epoch: 83, iter: 104750/160000, loss: 0.5439, lr: 0.000418, batch_cost: 0.1996, reader_cost: 0.00065, ips: 40.0744 samples/sec | ETA 03:03:49 2022-08-23 05:33:33 [INFO] [TRAIN] epoch: 83, iter: 104800/160000, loss: 0.5422, lr: 0.000418, batch_cost: 0.2003, reader_cost: 0.00089, ips: 39.9459 samples/sec | ETA 03:04:14 2022-08-23 05:33:47 [INFO] [TRAIN] epoch: 84, iter: 104850/160000, loss: 0.5469, lr: 0.000418, batch_cost: 0.2760, reader_cost: 0.07711, ips: 28.9837 samples/sec | ETA 04:13:42 2022-08-23 05:33:58 [INFO] [TRAIN] epoch: 84, iter: 104900/160000, loss: 0.5574, lr: 0.000417, batch_cost: 0.2193, reader_cost: 0.00137, ips: 36.4806 samples/sec | ETA 03:21:23 2022-08-23 05:34:09 [INFO] [TRAIN] epoch: 84, iter: 104950/160000, loss: 0.5525, lr: 0.000417, batch_cost: 0.2203, reader_cost: 
0.00073, ips: 36.3110 samples/sec | ETA 03:22:08 2022-08-23 05:34:18 [INFO] [TRAIN] epoch: 84, iter: 105000/160000, loss: 0.5093, lr: 0.000416, batch_cost: 0.1742, reader_cost: 0.00041, ips: 45.9280 samples/sec | ETA 02:39:40 2022-08-23 05:34:18 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 154s - batch_cost: 0.1542 - reader cost: 0.0015 2022-08-23 05:36:52 [INFO] [EVAL] #Images: 2000 mIoU: 0.3412 Acc: 0.7639 Kappa: 0.7457 Dice: 0.4730 2022-08-23 05:36:52 [INFO] [EVAL] Class IoU: [0.6848 0.7763 0.9302 0.7303 0.678 0.7555 0.7821 0.7819 0.5302 0.6326 0.4969 0.5359 0.6909 0.3005 0.296 0.4184 0.4775 0.4138 0.6186 0.4149 0.7356 0.5001 0.6407 0.4889 0.3166 0.3444 0.4965 0.4511 0.3601 0.2167 0.2993 0.4922 0.2725 0.3638 0.2905 0.3966 0.4277 0.5309 0.255 0.4062 0.205 0.1059 0.3579 0.2557 0.2842 0.2121 0.2747 0.5141 0.5984 0.5029 0.5249 0.3336 0.2245 0.1531 0.6655 0.4107 0.8601 0.4229 0.4011 0.2288 0.0505 0.3641 0.322 0.2343 0.4434 0.6871 0.2981 0.4012 0.1071 0.2892 0.52 0.5253 0.4039 0.1939 0.4569 0.2956 0.5342 0.2596 0.2665 0.1791 0.6235 0.4131 0.3197 0.036 0.1862 0.5598 0.0856 0.0998 0.255 0.4876 0.4185 0.047 0.23 0.0816 0.0132 0.0129 0.0318 0.1071 0.2461 0.4158 0.0771 0.0438 0.274 0.0579 0.0886 0.4941 0.0774 0.5146 0.075 0.2339 0.1365 0.4809 0.1442 0.5744 0.4825 0.0107 0.2725 0.6209 0.0798 0.1422 0.3796 0.0131 0.2439 0.1906 0.2457 0.3196 0.4653 0.4016 0.4655 0.3047 0.5438 0.1244 0.2699 0.2592 0.147 0.1253 0.1443 0.0313 0.2273 0.3549 0.3384 0.0207 0.359 0.0057 0.2666 0. 0.4175 0.0269 0.1856 0.1508] 2022-08-23 05:36:52 [INFO] [EVAL] Class Precision: [0.7778 0.8522 0.9684 0.8287 0.7642 0.8733 0.8614 0.8678 0.6708 0.7777 0.6933 0.7421 0.7723 0.4769 0.5711 0.5591 0.6785 0.6848 0.7839 0.6298 0.8229 0.6539 0.7664 0.6097 0.479 0.5418 0.6282 0.742 0.7124 0.3581 0.4714 0.6296 0.5051 0.5124 0.4467 0.6054 0.6358 0.7592 0.3823 0.601 0.3902 0.2897 0.5434 0.5307 0.3939 0.3918 0.441 0.6902 0.6455 0.5891 0.7919 0.4471 0.3629 0.4634 0.6968 0.5479 0.9018 0.6512 0.7391 0.3942 0.086 0.5299 0.5803 0.6858 0.532 0.7837 0.4604 0.511 0.2781 0.4686 0.6876 0.6608 0.57 0.3019 0.7369 0.4849 0.6359 0.4903 0.661 0.3883 0.7364 0.6695 0.8098 0.0775 0.4255 0.7573 0.4579 0.3246 0.4728 0.7317 0.529 0.0585 0.3131 0.3462 0.0585 0.0358 0.3169 0.2891 0.3533 0.6755 0.3928 0.0719 0.5431 0.4594 0.6475 0.5187 0.1483 0.7866 0.248 0.5536 0.2816 0.6764 0.4293 0.782 0.4834 0.1061 0.7759 0.7198 0.115 0.5309 0.5043 0.143 0.6334 0.555 0.6102 0.6845 0.8093 0.5607 0.8595 0.4416 0.6143 0.3471 0.4376 0.5393 0.5026 0.2793 0.3376 0.0608 0.4875 0.6575 0.4064 0.0296 0.6563 0.1714 0.5057 0. 
0.8538 0.3025 0.6093 0.7993] 2022-08-23 05:36:52 [INFO] [EVAL] Class Recall: [0.8513 0.8971 0.9593 0.8602 0.8574 0.8485 0.8946 0.8875 0.7167 0.7722 0.6369 0.6586 0.8676 0.4482 0.3806 0.6244 0.6171 0.5111 0.7458 0.5488 0.874 0.6801 0.7961 0.7117 0.4829 0.4859 0.703 0.535 0.4213 0.3542 0.4505 0.6928 0.3717 0.5565 0.4537 0.5349 0.5665 0.6383 0.4338 0.5563 0.3015 0.1429 0.5117 0.3303 0.505 0.3161 0.4216 0.6684 0.8914 0.7747 0.6089 0.568 0.3706 0.1861 0.9369 0.6211 0.949 0.5467 0.4673 0.353 0.1087 0.5378 0.4198 0.2625 0.7268 0.8479 0.4581 0.6512 0.1483 0.4304 0.6809 0.7193 0.5809 0.3517 0.546 0.4309 0.7696 0.3556 0.3087 0.2496 0.8027 0.5189 0.3456 0.0631 0.2487 0.6822 0.0952 0.1259 0.3564 0.5937 0.667 0.1928 0.4641 0.0965 0.0168 0.0198 0.0342 0.1453 0.4477 0.5196 0.0875 0.1006 0.3561 0.0621 0.0931 0.9122 0.1394 0.5981 0.0971 0.2883 0.2094 0.6245 0.1783 0.6839 0.9961 0.0118 0.2957 0.8188 0.207 0.1627 0.6056 0.0142 0.2839 0.225 0.2915 0.3748 0.5226 0.5859 0.5039 0.4955 0.8257 0.1624 0.4133 0.3329 0.172 0.1852 0.2013 0.0606 0.2987 0.4354 0.6692 0.0649 0.4421 0.0059 0.3605 0. 0.4497 0.0287 0.2106 0.1568] 2022-08-23 05:36:52 [INFO] [EVAL] The model with the best validation mIoU (0.3470) was saved at iter 100000. 2022-08-23 05:37:03 [INFO] [TRAIN] epoch: 84, iter: 105050/160000, loss: 0.5152, lr: 0.000416, batch_cost: 0.2110, reader_cost: 0.00281, ips: 37.9065 samples/sec | ETA 03:13:16 2022-08-23 05:37:15 [INFO] [TRAIN] epoch: 84, iter: 105100/160000, loss: 0.5540, lr: 0.000416, batch_cost: 0.2374, reader_cost: 0.00125, ips: 33.6961 samples/sec | ETA 03:37:14 2022-08-23 05:37:26 [INFO] [TRAIN] epoch: 84, iter: 105150/160000, loss: 0.5345, lr: 0.000415, batch_cost: 0.2315, reader_cost: 0.00069, ips: 34.5531 samples/sec | ETA 03:31:39 2022-08-23 05:37:37 [INFO] [TRAIN] epoch: 84, iter: 105200/160000, loss: 0.5248, lr: 0.000415, batch_cost: 0.2097, reader_cost: 0.01698, ips: 38.1423 samples/sec | ETA 03:11:33 2022-08-23 05:37:48 [INFO] [TRAIN] epoch: 84, iter: 105250/160000, loss: 0.5323, lr: 0.000415, batch_cost: 0.2271, reader_cost: 0.00066, ips: 35.2267 samples/sec | ETA 03:27:13 2022-08-23 05:37:59 [INFO] [TRAIN] epoch: 84, iter: 105300/160000, loss: 0.5630, lr: 0.000414, batch_cost: 0.2132, reader_cost: 0.00057, ips: 37.5230 samples/sec | ETA 03:14:22 2022-08-23 05:38:09 [INFO] [TRAIN] epoch: 84, iter: 105350/160000, loss: 0.5432, lr: 0.000414, batch_cost: 0.2060, reader_cost: 0.00137, ips: 38.8333 samples/sec | ETA 03:07:38 2022-08-23 05:38:21 [INFO] [TRAIN] epoch: 84, iter: 105400/160000, loss: 0.5290, lr: 0.000413, batch_cost: 0.2413, reader_cost: 0.00063, ips: 33.1489 samples/sec | ETA 03:39:36 2022-08-23 05:38:32 [INFO] [TRAIN] epoch: 84, iter: 105450/160000, loss: 0.5538, lr: 0.000413, batch_cost: 0.2193, reader_cost: 0.00084, ips: 36.4736 samples/sec | ETA 03:19:24 2022-08-23 05:38:43 [INFO] [TRAIN] epoch: 84, iter: 105500/160000, loss: 0.5223, lr: 0.000413, batch_cost: 0.2122, reader_cost: 0.00063, ips: 37.7014 samples/sec | ETA 03:12:44 2022-08-23 05:38:54 [INFO] [TRAIN] epoch: 84, iter: 105550/160000, loss: 0.5544, lr: 0.000412, batch_cost: 0.2251, reader_cost: 0.00058, ips: 35.5434 samples/sec | ETA 03:24:15 2022-08-23 05:39:05 [INFO] [TRAIN] epoch: 84, iter: 105600/160000, loss: 0.5545, lr: 0.000412, batch_cost: 0.2275, reader_cost: 0.00083, ips: 35.1690 samples/sec | ETA 03:26:14 2022-08-23 05:39:17 [INFO] [TRAIN] epoch: 84, iter: 105650/160000, loss: 0.5388, lr: 0.000411, batch_cost: 0.2367, reader_cost: 0.00044, ips: 33.7982 samples/sec | ETA 03:34:24 2022-08-23 05:39:27 [INFO] 
[TRAIN] epoch: 84, iter: 105700/160000, loss: 0.5156, lr: 0.000411, batch_cost: 0.1965, reader_cost: 0.00202, ips: 40.7222 samples/sec | ETA 02:57:47 2022-08-23 05:39:38 [INFO] [TRAIN] epoch: 84, iter: 105750/160000, loss: 0.5334, lr: 0.000411, batch_cost: 0.2096, reader_cost: 0.00116, ips: 38.1687 samples/sec | ETA 03:09:30 2022-08-23 05:39:49 [INFO] [TRAIN] epoch: 84, iter: 105800/160000, loss: 0.5219, lr: 0.000410, batch_cost: 0.2289, reader_cost: 0.00076, ips: 34.9445 samples/sec | ETA 03:26:48 2022-08-23 05:39:59 [INFO] [TRAIN] epoch: 84, iter: 105850/160000, loss: 0.5604, lr: 0.000410, batch_cost: 0.2094, reader_cost: 0.00087, ips: 38.2074 samples/sec | ETA 03:08:58 2022-08-23 05:40:09 [INFO] [TRAIN] epoch: 84, iter: 105900/160000, loss: 0.5462, lr: 0.000410, batch_cost: 0.1997, reader_cost: 0.00072, ips: 40.0542 samples/sec | ETA 03:00:05 2022-08-23 05:40:21 [INFO] [TRAIN] epoch: 84, iter: 105950/160000, loss: 0.5309, lr: 0.000409, batch_cost: 0.2323, reader_cost: 0.00065, ips: 34.4341 samples/sec | ETA 03:29:17 2022-08-23 05:40:31 [INFO] [TRAIN] epoch: 84, iter: 106000/160000, loss: 0.5224, lr: 0.000409, batch_cost: 0.1905, reader_cost: 0.00046, ips: 42.0056 samples/sec | ETA 02:51:24 2022-08-23 05:40:31 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 164s - batch_cost: 0.1640 - reader cost: 9.5322e-04 2022-08-23 05:43:15 [INFO] [EVAL] #Images: 2000 mIoU: 0.3461 Acc: 0.7649 Kappa: 0.7468 Dice: 0.4786 2022-08-23 05:43:15 [INFO] [EVAL] Class IoU: [0.6879 0.7747 0.9289 0.7214 0.6868 0.7596 0.7756 0.7891 0.5207 0.6086 0.4744 0.5587 0.6947 0.307 0.2941 0.4262 0.5219 0.4419 0.6176 0.4038 0.7313 0.4737 0.6358 0.4935 0.3146 0.3711 0.451 0.4451 0.4069 0.2687 0.2771 0.4934 0.2618 0.3518 0.302 0.3967 0.4294 0.5204 0.2583 0.3905 0.2331 0.1119 0.359 0.251 0.2679 0.2361 0.2721 0.5144 0.5999 0.5406 0.5326 0.2908 0.1971 0.1998 0.6636 0.4394 0.8637 0.3962 0.4516 0.2777 0.0437 0.4052 0.3385 0.2558 0.4641 0.7004 0.2994 0.3909 0.1006 0.3414 0.5013 0.5129 0.4097 0.1892 0.4494 0.3131 0.5345 0.2924 0.2508 0.1319 0.578 0.3974 0.3226 0.023 0.1917 0.5572 0.1029 0.0823 0.2328 0.5087 0.4058 0.0974 0.2404 0.098 0.0128 0.0184 0.0357 0.1008 0.282 0.4386 0.1259 0.0295 0.2675 0.2218 0.0394 0.5409 0.0719 0.5061 0.1098 0.2167 0.0958 0.471 0.133 0.5781 0.8158 0.01 0.3086 0.654 0.154 0.2058 0.4379 0.0083 0.2379 0.1807 0.2368 0.2921 0.4477 0.4091 0.374 0.2764 0.5684 0.0572 0.2348 0.2399 0.1518 0.1304 0.1614 0.0345 0.2092 0.3626 0.2826 0.0109 0.3144 0.1473 0.2514 0.0029 0.3921 0.0253 0.1392 0.1327] 2022-08-23 05:43:15 [INFO] [EVAL] Class Precision: [0.7748 0.8678 0.9651 0.8108 0.7623 0.8657 0.883 0.8614 0.6604 0.7703 0.728 0.7039 0.7741 0.4895 0.5833 0.5657 0.7336 0.6882 0.788 0.5951 0.8375 0.6667 0.798 0.6152 0.5073 0.5237 0.5743 0.8132 0.6835 0.3475 0.4407 0.6622 0.5096 0.4561 0.4772 0.5702 0.6375 0.7733 0.4994 0.6257 0.4808 0.2885 0.571 0.5722 0.3712 0.4211 0.4016 0.6943 0.6884 0.6319 0.763 0.368 0.3543 0.5525 0.7133 0.6117 0.913 0.6819 0.6645 0.445 0.0706 0.5793 0.4672 0.6526 0.5899 0.8165 0.4114 0.5684 0.2012 0.5589 0.6868 0.6597 0.6275 0.2084 0.6881 0.4814 0.6383 0.5257 0.6092 0.4021 0.657 0.7198 0.8237 0.0465 0.3959 0.7244 0.402 0.4064 0.4018 0.7 0.5645 0.2429 0.4949 0.3682 0.0664 0.0427 0.7139 0.2771 0.4871 0.6607 0.4103 0.0762 0.5674 0.6587 0.9538 0.5858 0.2127 0.8893 0.3299 0.4703 0.2882 0.6644 0.5123 0.7923 0.8213 0.0789 0.7225 0.8186 0.2788 0.4601 0.5662 0.1167 0.6295 0.6083 0.6993 0.6522 0.8379 0.5482 0.7204 0.4735 0.6374 0.6489 0.3452 0.5947 0.526 0.2486 0.3276 0.0694 0.4231 
0.5951 0.3137 0.0197 0.6151 0.5205 0.4979 0.0043 0.8652 0.3503 0.6305 0.7969] 2022-08-23 05:43:15 [INFO] [EVAL] Class Recall: [0.8597 0.8784 0.9612 0.8674 0.8739 0.8611 0.8645 0.9039 0.711 0.7435 0.5767 0.7304 0.8713 0.4517 0.3723 0.6335 0.6439 0.5525 0.7406 0.5567 0.8522 0.6206 0.7577 0.7139 0.4531 0.5602 0.6775 0.4958 0.5013 0.5422 0.4274 0.6593 0.35 0.6061 0.4512 0.566 0.5681 0.6141 0.3485 0.5095 0.3115 0.1546 0.4917 0.3089 0.4905 0.3495 0.4575 0.665 0.8235 0.7891 0.6382 0.5809 0.3076 0.2384 0.905 0.6093 0.9411 0.4861 0.5849 0.4248 0.1029 0.5742 0.5513 0.2961 0.6852 0.8313 0.5238 0.5559 0.1674 0.4673 0.6498 0.6975 0.5414 0.6728 0.5643 0.4724 0.7667 0.3972 0.2989 0.1641 0.8279 0.4702 0.3465 0.0433 0.2709 0.7072 0.1215 0.0935 0.3562 0.6505 0.5906 0.1398 0.3186 0.1178 0.0157 0.0314 0.0362 0.1367 0.4011 0.5661 0.1537 0.0461 0.336 0.2506 0.0395 0.8759 0.0979 0.5401 0.1414 0.2866 0.1254 0.618 0.1523 0.6814 0.9918 0.0113 0.3501 0.7648 0.2561 0.2712 0.6591 0.0089 0.2766 0.2046 0.2637 0.346 0.4901 0.6172 0.4375 0.3989 0.8401 0.0591 0.4234 0.2868 0.1759 0.2152 0.2413 0.0643 0.2927 0.4814 0.7401 0.0237 0.3914 0.1704 0.3367 0.0085 0.4176 0.0265 0.1515 0.1373] 2022-08-23 05:43:15 [INFO] [EVAL] The model with the best validation mIoU (0.3470) was saved at iter 100000. 2022-08-23 05:43:24 [INFO] [TRAIN] epoch: 84, iter: 106050/160000, loss: 0.5373, lr: 0.000408, batch_cost: 0.1725, reader_cost: 0.00338, ips: 46.3786 samples/sec | ETA 02:35:06 2022-08-23 05:43:36 [INFO] [TRAIN] epoch: 85, iter: 106100/160000, loss: 0.5203, lr: 0.000408, batch_cost: 0.2408, reader_cost: 0.03376, ips: 33.2256 samples/sec | ETA 03:36:17 2022-08-23 05:43:45 [INFO] [TRAIN] epoch: 85, iter: 106150/160000, loss: 0.5220, lr: 0.000408, batch_cost: 0.1894, reader_cost: 0.00092, ips: 42.2326 samples/sec | ETA 02:50:00 2022-08-23 05:43:55 [INFO] [TRAIN] epoch: 85, iter: 106200/160000, loss: 0.5274, lr: 0.000407, batch_cost: 0.1927, reader_cost: 0.00067, ips: 41.5065 samples/sec | ETA 02:52:49 2022-08-23 05:44:05 [INFO] [TRAIN] epoch: 85, iter: 106250/160000, loss: 0.5648, lr: 0.000407, batch_cost: 0.2001, reader_cost: 0.00044, ips: 39.9796 samples/sec | ETA 02:59:15 2022-08-23 05:44:15 [INFO] [TRAIN] epoch: 85, iter: 106300/160000, loss: 0.5234, lr: 0.000407, batch_cost: 0.2035, reader_cost: 0.00034, ips: 39.3112 samples/sec | ETA 03:02:08 2022-08-23 05:44:24 [INFO] [TRAIN] epoch: 85, iter: 106350/160000, loss: 0.5519, lr: 0.000406, batch_cost: 0.1857, reader_cost: 0.00050, ips: 43.0859 samples/sec | ETA 02:46:01 2022-08-23 05:44:34 [INFO] [TRAIN] epoch: 85, iter: 106400/160000, loss: 0.4953, lr: 0.000406, batch_cost: 0.1996, reader_cost: 0.00048, ips: 40.0706 samples/sec | ETA 02:58:21 2022-08-23 05:44:45 [INFO] [TRAIN] epoch: 85, iter: 106450/160000, loss: 0.5332, lr: 0.000405, batch_cost: 0.2085, reader_cost: 0.00069, ips: 38.3761 samples/sec | ETA 03:06:03 2022-08-23 05:44:55 [INFO] [TRAIN] epoch: 85, iter: 106500/160000, loss: 0.5197, lr: 0.000405, batch_cost: 0.2087, reader_cost: 0.00104, ips: 38.3323 samples/sec | ETA 03:06:05 2022-08-23 05:45:06 [INFO] [TRAIN] epoch: 85, iter: 106550/160000, loss: 0.5368, lr: 0.000405, batch_cost: 0.2117, reader_cost: 0.00082, ips: 37.7953 samples/sec | ETA 03:08:33 2022-08-23 05:45:16 [INFO] [TRAIN] epoch: 85, iter: 106600/160000, loss: 0.5522, lr: 0.000404, batch_cost: 0.2033, reader_cost: 0.00068, ips: 39.3459 samples/sec | ETA 03:00:57 2022-08-23 05:45:26 [INFO] [TRAIN] epoch: 85, iter: 106650/160000, loss: 0.5436, lr: 0.000404, batch_cost: 0.2122, reader_cost: 0.00046, ips: 37.7042 
samples/sec | ETA 03:08:39 2022-08-23 05:45:38 [INFO] [TRAIN] epoch: 85, iter: 106700/160000, loss: 0.5168, lr: 0.000404, batch_cost: 0.2287, reader_cost: 0.00070, ips: 34.9737 samples/sec | ETA 03:23:12 2022-08-23 05:45:49 [INFO] [TRAIN] epoch: 85, iter: 106750/160000, loss: 0.5637, lr: 0.000403, batch_cost: 0.2266, reader_cost: 0.00063, ips: 35.3070 samples/sec | ETA 03:21:05 2022-08-23 05:46:01 [INFO] [TRAIN] epoch: 85, iter: 106800/160000, loss: 0.5195, lr: 0.000403, batch_cost: 0.2448, reader_cost: 0.00090, ips: 32.6769 samples/sec | ETA 03:37:04 2022-08-23 05:46:12 [INFO] [TRAIN] epoch: 85, iter: 106850/160000, loss: 0.5100, lr: 0.000402, batch_cost: 0.2037, reader_cost: 0.00057, ips: 39.2792 samples/sec | ETA 03:00:25 2022-08-23 05:46:23 [INFO] [TRAIN] epoch: 85, iter: 106900/160000, loss: 0.5281, lr: 0.000402, batch_cost: 0.2344, reader_cost: 0.00064, ips: 34.1284 samples/sec | ETA 03:27:27 2022-08-23 05:46:34 [INFO] [TRAIN] epoch: 85, iter: 106950/160000, loss: 0.5306, lr: 0.000402, batch_cost: 0.2083, reader_cost: 0.00044, ips: 38.4144 samples/sec | ETA 03:04:07 2022-08-23 05:46:46 [INFO] [TRAIN] epoch: 85, iter: 107000/160000, loss: 0.5188, lr: 0.000401, batch_cost: 0.2419, reader_cost: 0.00060, ips: 33.0775 samples/sec | ETA 03:33:38 2022-08-23 05:46:46 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 173s - batch_cost: 0.1731 - reader cost: 0.0016 2022-08-23 05:49:39 [INFO] [EVAL] #Images: 2000 mIoU: 0.3461 Acc: 0.7648 Kappa: 0.7469 Dice: 0.4786 2022-08-23 05:49:39 [INFO] [EVAL] Class IoU: [0.684 0.7748 0.9311 0.7238 0.6761 0.7629 0.7743 0.7761 0.5309 0.6152 0.4844 0.5357 0.6941 0.299 0.2846 0.421 0.5246 0.4265 0.6212 0.4114 0.7365 0.5036 0.6569 0.4961 0.3352 0.4088 0.4805 0.4662 0.3656 0.2692 0.2897 0.4841 0.2556 0.3651 0.2846 0.4044 0.4229 0.5035 0.2388 0.4008 0.2303 0.1116 0.3482 0.2621 0.2812 0.2245 0.2787 0.5181 0.5945 0.5504 0.5526 0.4305 0.2294 0.2218 0.7162 0.4291 0.8679 0.3954 0.4325 0.2367 0.0674 0.3348 0.3361 0.2161 0.4394 0.6847 0.2942 0.4155 0.1182 0.3319 0.4948 0.5015 0.3979 0.2085 0.4538 0.3238 0.5521 0.2454 0.2803 0.1096 0.5879 0.4095 0.3583 0.026 0.2412 0.5616 0.1094 0.0962 0.2318 0.5164 0.4202 0.0572 0.2312 0.0949 0.0176 0.0273 0.036 0.1247 0.249 0.3793 0.0578 0.072 0.2509 0.1345 0.0243 0.5046 0.1118 0.5154 0.1169 0.2061 0.12 0.4306 0.1453 0.6127 0.7746 0.0236 0.3036 0.6326 0.1021 0.1537 0.4244 0.012 0.2086 0.1645 0.2718 0.2419 0.4096 0.4046 0.3804 0.3488 0.5245 0.0515 0.2523 0.2665 0.1778 0.1255 0.1378 0.0391 0.1949 0.3569 0.3785 0.0039 0.3372 0.1051 0.1996 0.0022 0.3958 0.0489 0.1491 0.1962] 2022-08-23 05:49:39 [INFO] [EVAL] Class Precision: [0.7841 0.8618 0.9634 0.8177 0.7551 0.8623 0.857 0.8453 0.6783 0.785 0.6524 0.7243 0.7659 0.5172 0.5554 0.6241 0.6835 0.7054 0.7668 0.6253 0.8256 0.6795 0.8102 0.6134 0.4547 0.5533 0.6411 0.6976 0.7315 0.3587 0.458 0.6053 0.5229 0.523 0.448 0.5684 0.5868 0.7186 0.4104 0.6401 0.406 0.2415 0.6303 0.5667 0.3834 0.391 0.4278 0.7055 0.6588 0.6459 0.6948 0.6875 0.4164 0.5473 0.7382 0.5859 0.9122 0.6944 0.6205 0.4074 0.1049 0.4769 0.5169 0.7079 0.5525 0.7853 0.4401 0.5353 0.3088 0.6133 0.6531 0.6561 0.5631 0.303 0.6596 0.4504 0.6887 0.5095 0.7983 0.3489 0.657 0.6882 0.7941 0.0698 0.4607 0.7069 0.4111 0.341 0.4252 0.7243 0.5805 0.074 0.3991 0.3591 0.0481 0.0715 0.7365 0.3074 0.5164 0.7512 0.2525 0.1295 0.6527 0.5876 0.7515 0.527 0.3022 0.7802 0.3034 0.536 0.2225 0.5608 0.4287 0.7613 0.7784 0.1278 0.7455 0.7396 0.1331 0.4759 0.6486 0.2526 0.6915 0.6385 0.6161 0.6293 0.7856 0.5266 0.7281 0.6372 
0.5746 0.3485 0.4259 0.6033 0.618 0.2191 0.4991 0.0654 0.3807 0.6118 0.4389 0.0059 0.6049 0.3772 0.5902 0.0025 0.8647 0.2806 0.5526 0.7659] 2022-08-23 05:49:39 [INFO] [EVAL] Class Recall: [0.8427 0.8848 0.9652 0.863 0.8659 0.8688 0.8892 0.9045 0.7096 0.7398 0.6529 0.673 0.8809 0.4148 0.3685 0.5641 0.6929 0.5189 0.7659 0.546 0.8722 0.6605 0.7763 0.7218 0.5605 0.6102 0.6574 0.5842 0.4222 0.519 0.4408 0.7075 0.3333 0.5474 0.4384 0.5836 0.6022 0.6272 0.3636 0.5174 0.3475 0.1718 0.4376 0.3278 0.5134 0.3451 0.4444 0.6611 0.8589 0.7884 0.7297 0.5352 0.3381 0.2716 0.96 0.6158 0.9469 0.4787 0.5881 0.3609 0.1586 0.5292 0.4899 0.2372 0.6821 0.8424 0.4702 0.6501 0.1607 0.4197 0.6712 0.6803 0.5757 0.4007 0.5927 0.5353 0.7356 0.3212 0.3016 0.1378 0.8482 0.5027 0.395 0.0398 0.3361 0.732 0.1297 0.1181 0.3376 0.6427 0.6036 0.2015 0.3546 0.1143 0.0269 0.0424 0.0365 0.1734 0.3247 0.4338 0.0697 0.1394 0.2895 0.1485 0.0245 0.9224 0.1506 0.6029 0.1597 0.2509 0.2067 0.6497 0.1802 0.7585 0.9936 0.0281 0.3386 0.8138 0.3048 0.185 0.551 0.0124 0.23 0.1814 0.3272 0.2821 0.4612 0.6359 0.4434 0.4353 0.8576 0.057 0.3824 0.3231 0.1998 0.2271 0.1599 0.0886 0.2854 0.4614 0.7334 0.0116 0.4325 0.1272 0.2317 0.0209 0.4219 0.0559 0.1696 0.2087] 2022-08-23 05:49:39 [INFO] [EVAL] The model with the best validation mIoU (0.3470) was saved at iter 100000. 2022-08-23 05:49:48 [INFO] [TRAIN] epoch: 85, iter: 107050/160000, loss: 0.5822, lr: 0.000401, batch_cost: 0.1799, reader_cost: 0.00373, ips: 44.4738 samples/sec | ETA 02:38:44 2022-08-23 05:49:57 [INFO] [TRAIN] epoch: 85, iter: 107100/160000, loss: 0.5469, lr: 0.000401, batch_cost: 0.1741, reader_cost: 0.00087, ips: 45.9420 samples/sec | ETA 02:33:31 2022-08-23 05:50:06 [INFO] [TRAIN] epoch: 85, iter: 107150/160000, loss: 0.5280, lr: 0.000400, batch_cost: 0.1779, reader_cost: 0.00069, ips: 44.9568 samples/sec | ETA 02:36:44 2022-08-23 05:50:16 [INFO] [TRAIN] epoch: 85, iter: 107200/160000, loss: 0.5199, lr: 0.000400, batch_cost: 0.1961, reader_cost: 0.00045, ips: 40.8044 samples/sec | ETA 02:52:31 2022-08-23 05:50:26 [INFO] [TRAIN] epoch: 85, iter: 107250/160000, loss: 0.5484, lr: 0.000399, batch_cost: 0.2034, reader_cost: 0.00090, ips: 39.3218 samples/sec | ETA 02:58:51 2022-08-23 05:50:37 [INFO] [TRAIN] epoch: 85, iter: 107300/160000, loss: 0.5808, lr: 0.000399, batch_cost: 0.2142, reader_cost: 0.00135, ips: 37.3462 samples/sec | ETA 03:08:08 2022-08-23 05:50:48 [INFO] [TRAIN] epoch: 85, iter: 107350/160000, loss: 0.5277, lr: 0.000399, batch_cost: 0.2202, reader_cost: 0.00150, ips: 36.3249 samples/sec | ETA 03:13:15 2022-08-23 05:51:02 [INFO] [TRAIN] epoch: 86, iter: 107400/160000, loss: 0.5148, lr: 0.000398, batch_cost: 0.2925, reader_cost: 0.07757, ips: 27.3503 samples/sec | ETA 04:16:25 2022-08-23 05:51:14 [INFO] [TRAIN] epoch: 86, iter: 107450/160000, loss: 0.5257, lr: 0.000398, batch_cost: 0.2285, reader_cost: 0.00093, ips: 35.0156 samples/sec | ETA 03:20:06 2022-08-23 05:51:23 [INFO] [TRAIN] epoch: 86, iter: 107500/160000, loss: 0.4934, lr: 0.000397, batch_cost: 0.1891, reader_cost: 0.00384, ips: 42.3075 samples/sec | ETA 02:45:27 2022-08-23 05:51:34 [INFO] [TRAIN] epoch: 86, iter: 107550/160000, loss: 0.5288, lr: 0.000397, batch_cost: 0.2103, reader_cost: 0.00083, ips: 38.0415 samples/sec | ETA 03:03:50 2022-08-23 05:51:44 [INFO] [TRAIN] epoch: 86, iter: 107600/160000, loss: 0.5927, lr: 0.000397, batch_cost: 0.1966, reader_cost: 0.00370, ips: 40.6927 samples/sec | ETA 02:51:41 2022-08-23 05:51:54 [INFO] [TRAIN] epoch: 86, iter: 107650/160000, loss: 0.5043, lr: 0.000396, 
batch_cost: 0.2050, reader_cost: 0.00043, ips: 39.0188 samples/sec | ETA 02:58:53 2022-08-23 05:52:04 [INFO] [TRAIN] epoch: 86, iter: 107700/160000, loss: 0.5468, lr: 0.000396, batch_cost: 0.2119, reader_cost: 0.00037, ips: 37.7516 samples/sec | ETA 03:04:42 2022-08-23 05:52:16 [INFO] [TRAIN] epoch: 86, iter: 107750/160000, loss: 0.5582, lr: 0.000396, batch_cost: 0.2254, reader_cost: 0.00728, ips: 35.4893 samples/sec | ETA 03:16:18 2022-08-23 05:52:26 [INFO] [TRAIN] epoch: 86, iter: 107800/160000, loss: 0.5311, lr: 0.000395, batch_cost: 0.2111, reader_cost: 0.00064, ips: 37.9010 samples/sec | ETA 03:03:38 2022-08-23 05:52:36 [INFO] [TRAIN] epoch: 86, iter: 107850/160000, loss: 0.5595, lr: 0.000395, batch_cost: 0.2034, reader_cost: 0.00046, ips: 39.3340 samples/sec | ETA 02:56:46 2022-08-23 05:52:46 [INFO] [TRAIN] epoch: 86, iter: 107900/160000, loss: 0.4912, lr: 0.000394, batch_cost: 0.1974, reader_cost: 0.00615, ips: 40.5178 samples/sec | ETA 02:51:26 2022-08-23 05:52:56 [INFO] [TRAIN] epoch: 86, iter: 107950/160000, loss: 0.5173, lr: 0.000394, batch_cost: 0.2024, reader_cost: 0.00093, ips: 39.5328 samples/sec | ETA 02:55:33 2022-08-23 05:53:07 [INFO] [TRAIN] epoch: 86, iter: 108000/160000, loss: 0.5269, lr: 0.000394, batch_cost: 0.2087, reader_cost: 0.00092, ips: 38.3367 samples/sec | ETA 03:00:51 2022-08-23 05:53:07 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1587 - reader cost: 0.0014 2022-08-23 05:55:46 [INFO] [EVAL] #Images: 2000 mIoU: 0.3481 Acc: 0.7649 Kappa: 0.7467 Dice: 0.4794 2022-08-23 05:55:46 [INFO] [EVAL] Class IoU: [0.6847 0.7751 0.9312 0.7227 0.6833 0.7556 0.7758 0.7825 0.5274 0.5992 0.4859 0.5369 0.6997 0.2977 0.3095 0.4291 0.4566 0.4574 0.6172 0.413 0.7363 0.5299 0.6433 0.4865 0.3241 0.2709 0.4781 0.4369 0.4207 0.2086 0.295 0.4925 0.2915 0.3584 0.2706 0.4045 0.4259 0.5424 0.2392 0.3953 0.2293 0.0766 0.3633 0.2677 0.2936 0.1982 0.2892 0.5286 0.6518 0.5338 0.5505 0.3808 0.1926 0.1602 0.7133 0.4644 0.8568 0.3904 0.4814 0.2721 0.0361 0.4178 0.3511 0.2845 0.4454 0.6887 0.3013 0.3793 0.1216 0.3147 0.4988 0.5442 0.3955 0.2098 0.4359 0.3208 0.5562 0.2453 0.2824 0.118 0.6438 0.4133 0.3659 0.0281 0.1138 0.5683 0.0984 0.0858 0.2713 0.5108 0.4143 0.0555 0.2212 0.1049 0.0154 0.0275 0.0196 0.1306 0.1803 0.4408 0.0411 0.0351 0.2304 0.1855 0.0611 0.5066 0.0832 0.5217 0.0951 0.267 0.1058 0.3236 0.1266 0.6113 0.8024 0.0175 0.3105 0.6283 0.1009 0.1462 0.4344 0.0136 0.2266 0.1824 0.2607 0.293 0.3873 0.416 0.5073 0.3331 0.5486 0.05 0.2845 0.2294 0.1817 0.1331 0.1612 0.0396 0.1702 0.3467 0.406 0. 
0.3595 0.1818 0.264 0.004 0.4415 0.0321 0.1792 0.2029] 2022-08-23 05:55:46 [INFO] [EVAL] Class Precision: [0.7755 0.8479 0.967 0.8166 0.7751 0.8693 0.8701 0.8452 0.708 0.7704 0.7079 0.7043 0.7818 0.5143 0.5247 0.5719 0.6787 0.6743 0.7535 0.6457 0.8235 0.6779 0.7658 0.6412 0.5084 0.5374 0.6772 0.789 0.6577 0.3126 0.4289 0.643 0.4561 0.4781 0.4626 0.5661 0.6147 0.7075 0.3521 0.6084 0.4198 0.2464 0.6365 0.5351 0.3979 0.3901 0.4026 0.7564 0.7108 0.6031 0.7391 0.4977 0.4204 0.5 0.7338 0.6562 0.9012 0.6773 0.6347 0.4131 0.0736 0.5164 0.5113 0.5807 0.5653 0.7808 0.4419 0.4787 0.2253 0.5815 0.7323 0.6669 0.5845 0.2755 0.7281 0.5152 0.682 0.5105 0.7464 0.3423 0.7614 0.644 0.7967 0.0803 0.2993 0.7158 0.3715 0.3727 0.5629 0.7438 0.5395 0.068 0.4129 0.3241 0.0443 0.0678 0.7187 0.2926 0.6039 0.6994 0.3592 0.1195 0.5218 0.5312 0.8821 0.5424 0.2655 0.7936 0.258 0.5082 0.2589 0.3956 0.5504 0.8314 0.8063 0.1323 0.7419 0.74 0.1445 0.5538 0.584 0.1331 0.7332 0.5867 0.663 0.6551 0.9606 0.5772 0.8352 0.5717 0.6103 0.2908 0.4375 0.6767 0.5373 0.3069 0.3776 0.0743 0.4077 0.6627 0.4602 0.0001 0.6725 0.5776 0.5979 0.0045 0.8044 0.4071 0.515 0.7635] 2022-08-23 05:55:46 [INFO] [EVAL] Class Recall: [0.854 0.9002 0.9618 0.8628 0.8522 0.8524 0.8773 0.9134 0.674 0.7295 0.6078 0.6932 0.8695 0.4142 0.4301 0.6322 0.5824 0.5871 0.7732 0.534 0.8743 0.7082 0.8009 0.6685 0.472 0.3533 0.6192 0.4947 0.5387 0.3855 0.4858 0.6778 0.4468 0.5886 0.3948 0.5862 0.581 0.6991 0.4274 0.5302 0.3357 0.1 0.4584 0.3488 0.5283 0.2873 0.5065 0.6371 0.887 0.8229 0.6833 0.6185 0.2623 0.1908 0.9622 0.6138 0.9456 0.4796 0.6658 0.4435 0.0662 0.6865 0.5284 0.358 0.6773 0.8537 0.4864 0.6462 0.2091 0.4069 0.6101 0.7473 0.5502 0.4682 0.5206 0.4595 0.7509 0.3208 0.3123 0.1525 0.8066 0.5357 0.4036 0.0415 0.1551 0.7339 0.1181 0.1002 0.3437 0.6199 0.6411 0.2314 0.3228 0.1343 0.023 0.0443 0.0197 0.191 0.2045 0.5438 0.0444 0.0474 0.2921 0.2218 0.0616 0.8846 0.1081 0.6035 0.1309 0.36 0.1518 0.6401 0.1412 0.6978 0.9941 0.0198 0.348 0.8062 0.2507 0.1657 0.6291 0.0149 0.2469 0.2093 0.3006 0.3465 0.3936 0.5983 0.5638 0.4438 0.8443 0.0569 0.4485 0.2576 0.2155 0.1903 0.2196 0.078 0.2262 0.4209 0.775 0.0001 0.4358 0.2097 0.321 0.0301 0.4946 0.0336 0.2156 0.2166] 2022-08-23 05:55:46 [INFO] [EVAL] The model with the best validation mIoU (0.3481) was saved at iter 108000. 
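For reading the [EVAL] blocks above and below: the summary figures (mIoU, Acc, Kappa, Dice) and the per-class IoU / Precision / Recall arrays can all be derived from a single class-by-class confusion matrix accumulated over the 2000 validation images. The sketch below only shows the standard definitions; it is not necessarily the exact code path PaddleSeg's evaluator uses, the names (eval_summary, conf) are illustrative, and zero-division for classes that never occur is ignored.

import numpy as np

def eval_summary(conf):
    # conf: (C, C) confusion matrix, rows = ground-truth class, cols = predicted class
    conf = conf.astype(np.float64)
    tp = np.diag(conf)
    fp = conf.sum(axis=0) - tp              # pixels predicted as class c but belonging elsewhere
    fn = conf.sum(axis=1) - tp              # pixels of class c predicted as something else
    iou = tp / (tp + fp + fn)               # "Class IoU" array
    precision = tp / (tp + fp)              # "Class Precision" array
    recall = tp / (tp + fn)                 # "Class Recall" array
    dice = 2 * tp / (2 * tp + fp + fn)      # per-class Dice
    acc = tp.sum() / conf.sum()             # "Acc" (overall pixel accuracy)
    # Cohen's kappa: observed agreement vs. chance agreement of the class marginals
    pe = (conf.sum(axis=0) * conf.sum(axis=1)).sum() / conf.sum() ** 2
    kappa = (acc - pe) / (1 - pe)
    return {"mIoU": iou.mean(), "Acc": acc, "Kappa": kappa, "Dice": dice.mean(),
            "Class IoU": iou, "Class Precision": precision, "Class Recall": recall}

Treating the reported Dice as the per-class mean (the same convention as mIoU) is an assumption; the checkpoint selection in this log is driven by mIoU either way.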
2022-08-23 05:55:55 [INFO] [TRAIN] epoch: 86, iter: 108050/160000, loss: 0.5065, lr: 0.000393, batch_cost: 0.1810, reader_cost: 0.00389, ips: 44.1933 samples/sec | ETA 02:36:44 2022-08-23 05:56:06 [INFO] [TRAIN] epoch: 86, iter: 108100/160000, loss: 0.5568, lr: 0.000393, batch_cost: 0.2134, reader_cost: 0.00104, ips: 37.4892 samples/sec | ETA 03:04:35 2022-08-23 05:56:16 [INFO] [TRAIN] epoch: 86, iter: 108150/160000, loss: 0.5316, lr: 0.000393, batch_cost: 0.1980, reader_cost: 0.00075, ips: 40.4127 samples/sec | ETA 02:51:04 2022-08-23 05:56:24 [INFO] [TRAIN] epoch: 86, iter: 108200/160000, loss: 0.5631, lr: 0.000392, batch_cost: 0.1712, reader_cost: 0.00065, ips: 46.7228 samples/sec | ETA 02:27:49 2022-08-23 05:56:34 [INFO] [TRAIN] epoch: 86, iter: 108250/160000, loss: 0.5477, lr: 0.000392, batch_cost: 0.1855, reader_cost: 0.00072, ips: 43.1202 samples/sec | ETA 02:40:01 2022-08-23 05:56:42 [INFO] [TRAIN] epoch: 86, iter: 108300/160000, loss: 0.5182, lr: 0.000391, batch_cost: 0.1662, reader_cost: 0.00112, ips: 48.1259 samples/sec | ETA 02:23:14 2022-08-23 05:56:52 [INFO] [TRAIN] epoch: 86, iter: 108350/160000, loss: 0.5361, lr: 0.000391, batch_cost: 0.2080, reader_cost: 0.00056, ips: 38.4670 samples/sec | ETA 02:59:01 2022-08-23 05:57:02 [INFO] [TRAIN] epoch: 86, iter: 108400/160000, loss: 0.5441, lr: 0.000391, batch_cost: 0.1861, reader_cost: 0.00079, ips: 42.9888 samples/sec | ETA 02:40:02 2022-08-23 05:57:11 [INFO] [TRAIN] epoch: 86, iter: 108450/160000, loss: 0.5175, lr: 0.000390, batch_cost: 0.1836, reader_cost: 0.00177, ips: 43.5687 samples/sec | ETA 02:37:45 2022-08-23 05:57:20 [INFO] [TRAIN] epoch: 86, iter: 108500/160000, loss: 0.5532, lr: 0.000390, batch_cost: 0.1936, reader_cost: 0.00046, ips: 41.3165 samples/sec | ETA 02:46:11 2022-08-23 05:57:30 [INFO] [TRAIN] epoch: 86, iter: 108550/160000, loss: 0.5241, lr: 0.000390, batch_cost: 0.1889, reader_cost: 0.00137, ips: 42.3447 samples/sec | ETA 02:42:00 2022-08-23 05:57:41 [INFO] [TRAIN] epoch: 86, iter: 108600/160000, loss: 0.5577, lr: 0.000389, batch_cost: 0.2179, reader_cost: 0.00076, ips: 36.7125 samples/sec | ETA 03:06:40 2022-08-23 05:57:55 [INFO] [TRAIN] epoch: 87, iter: 108650/160000, loss: 0.4943, lr: 0.000389, batch_cost: 0.2789, reader_cost: 0.07465, ips: 28.6874 samples/sec | ETA 03:58:39 2022-08-23 05:58:05 [INFO] [TRAIN] epoch: 87, iter: 108700/160000, loss: 0.5472, lr: 0.000388, batch_cost: 0.2069, reader_cost: 0.00241, ips: 38.6749 samples/sec | ETA 02:56:51 2022-08-23 05:58:16 [INFO] [TRAIN] epoch: 87, iter: 108750/160000, loss: 0.5139, lr: 0.000388, batch_cost: 0.2140, reader_cost: 0.00115, ips: 37.3894 samples/sec | ETA 03:02:45 2022-08-23 05:58:25 [INFO] [TRAIN] epoch: 87, iter: 108800/160000, loss: 0.4963, lr: 0.000388, batch_cost: 0.1858, reader_cost: 0.00042, ips: 43.0492 samples/sec | ETA 02:38:34 2022-08-23 05:58:34 [INFO] [TRAIN] epoch: 87, iter: 108850/160000, loss: 0.5169, lr: 0.000387, batch_cost: 0.1884, reader_cost: 0.00044, ips: 42.4666 samples/sec | ETA 02:40:35 2022-08-23 05:58:45 [INFO] [TRAIN] epoch: 87, iter: 108900/160000, loss: 0.5347, lr: 0.000387, batch_cost: 0.2034, reader_cost: 0.00086, ips: 39.3396 samples/sec | ETA 02:53:11 2022-08-23 05:58:55 [INFO] [TRAIN] epoch: 87, iter: 108950/160000, loss: 0.5134, lr: 0.000387, batch_cost: 0.2001, reader_cost: 0.00033, ips: 39.9816 samples/sec | ETA 02:50:14 2022-08-23 05:59:04 [INFO] [TRAIN] epoch: 87, iter: 109000/160000, loss: 0.5580, lr: 0.000386, batch_cost: 0.1801, reader_cost: 0.00129, ips: 44.4141 samples/sec | ETA 02:33:06 2022-08-23 
05:59:04 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 185s - batch_cost: 0.1850 - reader cost: 8.6966e-04 2022-08-23 06:02:09 [INFO] [EVAL] #Images: 2000 mIoU: 0.3488 Acc: 0.7677 Kappa: 0.7500 Dice: 0.4801 2022-08-23 06:02:09 [INFO] [EVAL] Class IoU: [0.6909 0.7783 0.9311 0.7289 0.6885 0.766 0.7772 0.7822 0.5252 0.6255 0.4955 0.5512 0.6889 0.2926 0.2819 0.4268 0.5125 0.4443 0.6172 0.4116 0.7424 0.492 0.6478 0.5023 0.3181 0.3066 0.4563 0.4441 0.4019 0.2104 0.2765 0.4848 0.2667 0.3649 0.3003 0.3805 0.4316 0.5497 0.2499 0.378 0.197 0.1102 0.3674 0.2619 0.3093 0.2061 0.2671 0.523 0.6637 0.5807 0.5188 0.4484 0.2094 0.2473 0.7141 0.4479 0.8646 0.4074 0.4577 0.2816 0.043 0.4128 0.3119 0.2066 0.4617 0.6971 0.2815 0.4174 0.0822 0.3018 0.5113 0.5278 0.3969 0.1963 0.4546 0.3353 0.5277 0.2477 0.2853 0.1368 0.6306 0.4247 0.3254 0.0302 0.329 0.5716 0.087 0.0957 0.2779 0.5177 0.4062 0.1126 0.2241 0.1197 0.0041 0.0208 0.031 0.1073 0.2998 0.3783 0.061 0.0392 0.2565 0.0741 0.0262 0.5082 0.0705 0.5216 0.0954 0.2183 0.1156 0.4479 0.1444 0.5753 0.8046 0.0179 0.2849 0.6453 0.1189 0.1619 0.453 0.0144 0.1989 0.1758 0.2456 0.2456 0.4424 0.411 0.491 0.3368 0.5689 0.0465 0.2606 0.2491 0.1773 0.1188 0.1571 0.0181 0.2328 0.3469 0.3198 0.0077 0.3617 0.1352 0.2261 0. 0.3578 0.0287 0.2012 0.1705] 2022-08-23 06:02:09 [INFO] [EVAL] Class Precision: [0.7941 0.8522 0.9637 0.8235 0.7668 0.8507 0.8732 0.8501 0.6729 0.7613 0.7033 0.6873 0.7646 0.486 0.5505 0.5854 0.6755 0.6491 0.7416 0.6384 0.8365 0.6364 0.7738 0.6175 0.4942 0.5636 0.5959 0.707 0.6805 0.3331 0.4645 0.6063 0.5106 0.4755 0.4634 0.5002 0.6365 0.773 0.4303 0.6603 0.4468 0.2578 0.6117 0.4574 0.4292 0.4181 0.4273 0.7438 0.7736 0.6792 0.7718 0.605 0.3984 0.5539 0.7509 0.5978 0.923 0.6696 0.6578 0.4658 0.0771 0.528 0.5572 0.6961 0.6196 0.7893 0.4082 0.5316 0.2052 0.5341 0.6766 0.707 0.589 0.2841 0.7165 0.4709 0.7208 0.5278 0.7193 0.4173 0.7133 0.7222 0.8184 0.1003 0.4645 0.7381 0.4209 0.3634 0.5795 0.6996 0.5624 0.1904 0.4662 0.3408 0.0195 0.0619 0.5564 0.287 0.4975 0.6235 0.2467 0.0982 0.6308 0.4842 0.6618 0.5342 0.265 0.8153 0.2476 0.6036 0.2499 0.5957 0.4706 0.7826 0.8091 0.1131 0.7797 0.7842 0.1613 0.464 0.5561 0.2144 0.5849 0.607 0.6645 0.6628 0.954 0.6031 0.8728 0.5403 0.6579 0.4806 0.4519 0.7469 0.5517 0.2274 0.4823 0.0795 0.4762 0.6979 0.3532 0.0118 0.6461 0.4405 0.5172 0. 0.8592 0.5109 0.649 0.7793] 2022-08-23 06:02:09 [INFO] [EVAL] Class Recall: [0.8417 0.8997 0.965 0.8639 0.8709 0.8849 0.8761 0.9074 0.7053 0.7781 0.6265 0.7356 0.8742 0.4237 0.3661 0.6117 0.6798 0.5849 0.7864 0.5368 0.8684 0.6843 0.7991 0.7292 0.4717 0.4021 0.6609 0.5443 0.4954 0.3636 0.4058 0.7077 0.3583 0.6108 0.4604 0.6139 0.5729 0.6555 0.3735 0.4692 0.2606 0.1614 0.4791 0.3799 0.5255 0.2889 0.4161 0.638 0.8236 0.8002 0.6127 0.6339 0.3062 0.3087 0.9357 0.6411 0.9318 0.51 0.6008 0.4159 0.0888 0.6544 0.4147 0.2271 0.6443 0.8566 0.4756 0.6603 0.1207 0.4097 0.6767 0.6755 0.5489 0.3882 0.5544 0.5379 0.6633 0.3183 0.3211 0.1692 0.8447 0.5077 0.3507 0.0415 0.5301 0.717 0.0988 0.115 0.3481 0.6657 0.594 0.2162 0.3015 0.1557 0.0052 0.0303 0.0318 0.1463 0.43 0.4903 0.075 0.0612 0.3018 0.0804 0.0265 0.9126 0.0876 0.5915 0.1344 0.2549 0.177 0.6435 0.1723 0.6847 0.9932 0.0208 0.3099 0.7847 0.3113 0.1991 0.7095 0.0152 0.2315 0.1983 0.2804 0.2807 0.452 0.5633 0.5289 0.472 0.808 0.0489 0.3811 0.2721 0.2072 0.1992 0.1889 0.0229 0.3129 0.4082 0.7714 0.0218 0.4511 0.1632 0.2866 0. 
0.3801 0.0295 0.2257 0.1791] 2022-08-23 06:02:09 [INFO] [EVAL] The model with the best validation mIoU (0.3488) was saved at iter 109000. 2022-08-23 06:02:18 [INFO] [TRAIN] epoch: 87, iter: 109050/160000, loss: 0.5351, lr: 0.000386, batch_cost: 0.1869, reader_cost: 0.00416, ips: 42.8111 samples/sec | ETA 02:38:40 2022-08-23 06:02:27 [INFO] [TRAIN] epoch: 87, iter: 109100/160000, loss: 0.5349, lr: 0.000385, batch_cost: 0.1713, reader_cost: 0.00109, ips: 46.6881 samples/sec | ETA 02:25:21 2022-08-23 06:02:36 [INFO] [TRAIN] epoch: 87, iter: 109150/160000, loss: 0.5147, lr: 0.000385, batch_cost: 0.1788, reader_cost: 0.00069, ips: 44.7420 samples/sec | ETA 02:31:32 2022-08-23 06:02:46 [INFO] [TRAIN] epoch: 87, iter: 109200/160000, loss: 0.5499, lr: 0.000385, batch_cost: 0.1918, reader_cost: 0.00051, ips: 41.7134 samples/sec | ETA 02:42:22 2022-08-23 06:02:55 [INFO] [TRAIN] epoch: 87, iter: 109250/160000, loss: 0.5197, lr: 0.000384, batch_cost: 0.1910, reader_cost: 0.00035, ips: 41.8888 samples/sec | ETA 02:41:32 2022-08-23 06:03:05 [INFO] [TRAIN] epoch: 87, iter: 109300/160000, loss: 0.4808, lr: 0.000384, batch_cost: 0.1962, reader_cost: 0.00042, ips: 40.7675 samples/sec | ETA 02:45:49 2022-08-23 06:03:14 [INFO] [TRAIN] epoch: 87, iter: 109350/160000, loss: 0.5257, lr: 0.000383, batch_cost: 0.1870, reader_cost: 0.00173, ips: 42.7858 samples/sec | ETA 02:37:50 2022-08-23 06:03:23 [INFO] [TRAIN] epoch: 87, iter: 109400/160000, loss: 0.5359, lr: 0.000383, batch_cost: 0.1812, reader_cost: 0.00035, ips: 44.1471 samples/sec | ETA 02:32:49 2022-08-23 06:03:32 [INFO] [TRAIN] epoch: 87, iter: 109450/160000, loss: 0.5298, lr: 0.000383, batch_cost: 0.1724, reader_cost: 0.00076, ips: 46.3908 samples/sec | ETA 02:25:17 2022-08-23 06:03:40 [INFO] [TRAIN] epoch: 87, iter: 109500/160000, loss: 0.5492, lr: 0.000382, batch_cost: 0.1681, reader_cost: 0.00077, ips: 47.5922 samples/sec | ETA 02:21:28 2022-08-23 06:03:49 [INFO] [TRAIN] epoch: 87, iter: 109550/160000, loss: 0.5072, lr: 0.000382, batch_cost: 0.1743, reader_cost: 0.00045, ips: 45.8950 samples/sec | ETA 02:26:33 2022-08-23 06:03:59 [INFO] [TRAIN] epoch: 87, iter: 109600/160000, loss: 0.5119, lr: 0.000382, batch_cost: 0.1898, reader_cost: 0.00045, ips: 42.1446 samples/sec | ETA 02:39:27 2022-08-23 06:04:09 [INFO] [TRAIN] epoch: 87, iter: 109650/160000, loss: 0.5388, lr: 0.000381, batch_cost: 0.2085, reader_cost: 0.00119, ips: 38.3768 samples/sec | ETA 02:54:55 2022-08-23 06:04:20 [INFO] [TRAIN] epoch: 87, iter: 109700/160000, loss: 0.5344, lr: 0.000381, batch_cost: 0.2139, reader_cost: 0.00034, ips: 37.3967 samples/sec | ETA 02:59:20 2022-08-23 06:04:29 [INFO] [TRAIN] epoch: 87, iter: 109750/160000, loss: 0.5129, lr: 0.000380, batch_cost: 0.1866, reader_cost: 0.00091, ips: 42.8773 samples/sec | ETA 02:36:15 2022-08-23 06:04:40 [INFO] [TRAIN] epoch: 87, iter: 109800/160000, loss: 0.5899, lr: 0.000380, batch_cost: 0.2124, reader_cost: 0.00090, ips: 37.6681 samples/sec | ETA 02:57:41 2022-08-23 06:04:49 [INFO] [TRAIN] epoch: 87, iter: 109850/160000, loss: 0.5312, lr: 0.000380, batch_cost: 0.1863, reader_cost: 0.00052, ips: 42.9311 samples/sec | ETA 02:35:45 2022-08-23 06:05:01 [INFO] [TRAIN] epoch: 88, iter: 109900/160000, loss: 0.5337, lr: 0.000379, batch_cost: 0.2384, reader_cost: 0.05483, ips: 33.5528 samples/sec | ETA 03:19:05 2022-08-23 06:05:10 [INFO] [TRAIN] epoch: 88, iter: 109950/160000, loss: 0.5316, lr: 0.000379, batch_cost: 0.1773, reader_cost: 0.00213, ips: 45.1148 samples/sec | ETA 02:27:55 2022-08-23 06:05:20 [INFO] [TRAIN] epoch: 88, iter: 
110000/160000, loss: 0.4840, lr: 0.000379, batch_cost: 0.2002, reader_cost: 0.00062, ips: 39.9557 samples/sec | ETA 02:46:51 2022-08-23 06:05:20 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 173s - batch_cost: 0.1727 - reader cost: 0.0012 2022-08-23 06:08:13 [INFO] [EVAL] #Images: 2000 mIoU: 0.3470 Acc: 0.7644 Kappa: 0.7463 Dice: 0.4792 2022-08-23 06:08:13 [INFO] [EVAL] Class IoU: [0.6862 0.7699 0.931 0.7276 0.6804 0.7597 0.7786 0.7799 0.5248 0.5893 0.501 0.5554 0.6976 0.3028 0.2941 0.423 0.4919 0.407 0.6176 0.4228 0.7419 0.4802 0.6404 0.4996 0.283 0.2864 0.4533 0.464 0.3902 0.2351 0.2969 0.4868 0.2764 0.3679 0.3084 0.3903 0.4358 0.5546 0.2657 0.3762 0.2367 0.0673 0.3646 0.2536 0.2949 0.2053 0.276 0.5233 0.5977 0.532 0.5451 0.3341 0.2238 0.2205 0.6825 0.361 0.8697 0.4162 0.3845 0.2622 0.0889 0.4206 0.3243 0.2091 0.471 0.6962 0.2823 0.3976 0.1076 0.3162 0.5114 0.4992 0.4061 0.1872 0.4493 0.347 0.5104 0.2697 0.2302 0.1342 0.5958 0.4196 0.3861 0.0237 0.3337 0.5692 0.086 0.0977 0.2882 0.5223 0.439 0.0509 0.2111 0.0868 0.0019 0.0323 0.0179 0.0862 0.2758 0.3181 0.0948 0.0417 0.302 0.1544 0.0395 0.5659 0.0627 0.511 0.1011 0.2778 0.1256 0.432 0.1477 0.5229 0.7823 0.0303 0.2965 0.6309 0.1677 0.0937 0.4627 0.0131 0.2139 0.1899 0.2434 0.305 0.5216 0.399 0.4497 0.3229 0.5671 0.059 0.2582 0.2937 0.172 0.1263 0.1483 0.0356 0.2135 0.3591 0.3469 0.0194 0.3594 0.0599 0.2092 0.0014 0.3757 0.0384 0.1657 0.2119] 2022-08-23 06:08:13 [INFO] [EVAL] Class Precision: [0.7803 0.852 0.9632 0.8115 0.7605 0.8744 0.8694 0.8441 0.6414 0.7711 0.6998 0.718 0.7737 0.4479 0.5389 0.6166 0.695 0.7158 0.7999 0.6382 0.8289 0.6882 0.7746 0.646 0.4992 0.5324 0.5667 0.7482 0.6908 0.3492 0.4579 0.6666 0.4806 0.5104 0.4956 0.5592 0.621 0.7768 0.4245 0.6779 0.4145 0.2352 0.6418 0.543 0.3875 0.4249 0.4687 0.7229 0.6578 0.6063 0.7427 0.4036 0.368 0.5029 0.6972 0.468 0.921 0.6424 0.7607 0.4505 0.1373 0.5481 0.5132 0.7659 0.5874 0.8231 0.4125 0.5123 0.274 0.5423 0.6492 0.6928 0.6248 0.2846 0.6517 0.5373 0.6562 0.5707 0.6412 0.349 0.6763 0.6134 0.7631 0.0745 0.4568 0.7031 0.4426 0.3481 0.6333 0.7193 0.6464 0.0654 0.338 0.3806 0.0071 0.104 0.2728 0.252 0.5411 0.6921 0.3347 0.14 0.5483 0.656 0.8955 0.6146 0.3042 0.8033 0.2985 0.5448 0.3093 0.5677 0.4915 0.8108 0.7873 0.12 0.6287 0.7501 0.2219 0.6253 0.6073 0.1326 0.6057 0.5367 0.7074 0.6247 0.8782 0.5484 0.7412 0.5148 0.6512 0.3144 0.3504 0.6315 0.5985 0.2474 0.4149 0.0615 0.5227 0.6658 0.3981 0.0273 0.6897 0.4244 0.4902 0.0037 0.8295 0.3937 0.443 0.5872] 2022-08-23 06:08:13 [INFO] [EVAL] Class Recall: [0.8506 0.8887 0.9654 0.8756 0.8659 0.8528 0.8817 0.9112 0.7427 0.7143 0.6382 0.7103 0.8765 0.4832 0.3931 0.574 0.6273 0.4855 0.7304 0.5561 0.8761 0.6136 0.787 0.6879 0.3952 0.3827 0.6938 0.5499 0.4728 0.4183 0.4577 0.6435 0.3942 0.5686 0.4495 0.5638 0.5937 0.6597 0.4153 0.4581 0.3556 0.0862 0.4577 0.3225 0.5523 0.2843 0.4017 0.6546 0.8672 0.8127 0.672 0.6599 0.3636 0.282 0.97 0.6121 0.9398 0.5416 0.4374 0.3855 0.2013 0.6439 0.4684 0.2234 0.7038 0.8187 0.472 0.6398 0.1505 0.4313 0.7068 0.641 0.5371 0.3536 0.5912 0.4948 0.6967 0.3384 0.2642 0.1791 0.8335 0.5705 0.4387 0.0337 0.5532 0.7493 0.0964 0.1196 0.346 0.656 0.5778 0.1867 0.3599 0.1011 0.0025 0.0447 0.0188 0.1159 0.3601 0.3706 0.1169 0.0561 0.402 0.168 0.0397 0.8772 0.0732 0.5841 0.1326 0.3618 0.1745 0.6438 0.1744 0.5955 0.9919 0.0389 0.3594 0.7988 0.4073 0.0993 0.6602 0.0143 0.2486 0.2272 0.2706 0.3734 0.5623 0.5943 0.5334 0.4642 0.8145 0.0677 0.4954 0.3544 0.1944 0.2052 0.1875 0.0778 0.2652 0.438 
0.7297 0.0633 0.4287 0.0652 0.2674 0.0023 0.4071 0.0408 0.2093 0.2491] 2022-08-23 06:08:13 [INFO] [EVAL] The model with the best validation mIoU (0.3488) was saved at iter 109000. 2022-08-23 06:08:23 [INFO] [TRAIN] epoch: 88, iter: 110050/160000, loss: 0.5499, lr: 0.000378, batch_cost: 0.2085, reader_cost: 0.00473, ips: 38.3688 samples/sec | ETA 02:53:34 2022-08-23 06:08:35 [INFO] [TRAIN] epoch: 88, iter: 110100/160000, loss: 0.5608, lr: 0.000378, batch_cost: 0.2274, reader_cost: 0.00171, ips: 35.1780 samples/sec | ETA 03:09:07 2022-08-23 06:08:45 [INFO] [TRAIN] epoch: 88, iter: 110150/160000, loss: 0.5307, lr: 0.000377, batch_cost: 0.2041, reader_cost: 0.00072, ips: 39.2017 samples/sec | ETA 02:49:33 2022-08-23 06:08:54 [INFO] [TRAIN] epoch: 88, iter: 110200/160000, loss: 0.5184, lr: 0.000377, batch_cost: 0.1846, reader_cost: 0.00115, ips: 43.3455 samples/sec | ETA 02:33:11 2022-08-23 06:09:05 [INFO] [TRAIN] epoch: 88, iter: 110250/160000, loss: 0.5310, lr: 0.000377, batch_cost: 0.2104, reader_cost: 0.00077, ips: 38.0211 samples/sec | ETA 02:54:27 2022-08-23 06:09:15 [INFO] [TRAIN] epoch: 88, iter: 110300/160000, loss: 0.5638, lr: 0.000376, batch_cost: 0.2097, reader_cost: 0.00097, ips: 38.1503 samples/sec | ETA 02:53:41 2022-08-23 06:09:25 [INFO] [TRAIN] epoch: 88, iter: 110350/160000, loss: 0.5172, lr: 0.000376, batch_cost: 0.1884, reader_cost: 0.00079, ips: 42.4518 samples/sec | ETA 02:35:56 2022-08-23 06:09:34 [INFO] [TRAIN] epoch: 88, iter: 110400/160000, loss: 0.5678, lr: 0.000376, batch_cost: 0.1887, reader_cost: 0.00077, ips: 42.4056 samples/sec | ETA 02:35:57 2022-08-23 06:09:44 [INFO] [TRAIN] epoch: 88, iter: 110450/160000, loss: 0.5113, lr: 0.000375, batch_cost: 0.2033, reader_cost: 0.00055, ips: 39.3538 samples/sec | ETA 02:47:52 2022-08-23 06:09:54 [INFO] [TRAIN] epoch: 88, iter: 110500/160000, loss: 0.5239, lr: 0.000375, batch_cost: 0.1921, reader_cost: 0.00218, ips: 41.6478 samples/sec | ETA 02:38:28 2022-08-23 06:10:03 [INFO] [TRAIN] epoch: 88, iter: 110550/160000, loss: 0.5101, lr: 0.000374, batch_cost: 0.1810, reader_cost: 0.00151, ips: 44.2043 samples/sec | ETA 02:29:09 2022-08-23 06:10:13 [INFO] [TRAIN] epoch: 88, iter: 110600/160000, loss: 0.5218, lr: 0.000374, batch_cost: 0.2101, reader_cost: 0.00049, ips: 38.0818 samples/sec | ETA 02:52:57 2022-08-23 06:10:22 [INFO] [TRAIN] epoch: 88, iter: 110650/160000, loss: 0.5194, lr: 0.000374, batch_cost: 0.1756, reader_cost: 0.00087, ips: 45.5482 samples/sec | ETA 02:24:27 2022-08-23 06:10:31 [INFO] [TRAIN] epoch: 88, iter: 110700/160000, loss: 0.5259, lr: 0.000373, batch_cost: 0.1811, reader_cost: 0.00052, ips: 44.1741 samples/sec | ETA 02:28:48 2022-08-23 06:10:39 [INFO] [TRAIN] epoch: 88, iter: 110750/160000, loss: 0.5198, lr: 0.000373, batch_cost: 0.1665, reader_cost: 0.00076, ips: 48.0438 samples/sec | ETA 02:16:40 2022-08-23 06:10:48 [INFO] [TRAIN] epoch: 88, iter: 110800/160000, loss: 0.5219, lr: 0.000372, batch_cost: 0.1786, reader_cost: 0.00040, ips: 44.7966 samples/sec | ETA 02:26:26 2022-08-23 06:10:59 [INFO] [TRAIN] epoch: 88, iter: 110850/160000, loss: 0.5344, lr: 0.000372, batch_cost: 0.2085, reader_cost: 0.00060, ips: 38.3645 samples/sec | ETA 02:50:49 2022-08-23 06:11:10 [INFO] [TRAIN] epoch: 88, iter: 110900/160000, loss: 0.5170, lr: 0.000372, batch_cost: 0.2151, reader_cost: 0.00223, ips: 37.1981 samples/sec | ETA 02:55:59 2022-08-23 06:11:19 [INFO] [TRAIN] epoch: 88, iter: 110950/160000, loss: 0.5380, lr: 0.000371, batch_cost: 0.1937, reader_cost: 0.00065, ips: 41.3001 samples/sec | ETA 02:38:21 2022-08-23 
06:11:29 [INFO] [TRAIN] epoch: 88, iter: 111000/160000, loss: 0.5156, lr: 0.000371, batch_cost: 0.1913, reader_cost: 0.00088, ips: 41.8265 samples/sec | ETA 02:36:12 2022-08-23 06:11:29 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 165s - batch_cost: 0.1652 - reader cost: 5.9088e-04 2022-08-23 06:14:14 [INFO] [EVAL] #Images: 2000 mIoU: 0.3448 Acc: 0.7663 Kappa: 0.7484 Dice: 0.4758 2022-08-23 06:14:14 [INFO] [EVAL] Class IoU: [0.6895 0.772 0.9308 0.7306 0.6818 0.7668 0.7806 0.7796 0.5331 0.6162 0.4914 0.5471 0.6933 0.3086 0.3098 0.4219 0.5135 0.3886 0.6223 0.4177 0.7432 0.5186 0.6007 0.4994 0.3237 0.3211 0.4564 0.4571 0.4242 0.2059 0.2839 0.491 0.2582 0.3398 0.308 0.4032 0.4296 0.5303 0.256 0.3802 0.2285 0.0806 0.3636 0.2603 0.2991 0.1826 0.2947 0.5085 0.6148 0.533 0.5303 0.4476 0.1933 0.2345 0.7275 0.4216 0.8642 0.3835 0.5128 0.251 0.0514 0.3794 0.3435 0.2588 0.423 0.6799 0.2383 0.3541 0.1129 0.3279 0.515 0.5484 0.3858 0.1929 0.4502 0.3533 0.5577 0.2905 0.2413 0.1188 0.6 0.408 0.3109 0.0292 0.2419 0.5727 0.0666 0.0735 0.2269 0.4799 0.433 0.0779 0.2016 0.1113 0.0004 0.0371 0.0309 0.1334 0.2715 0.4095 0.087 0.0465 0.29 0.0824 0.0473 0.5667 0.0811 0.553 0.1123 0.2421 0.0994 0.3218 0.1319 0.5447 0.7084 0.0371 0.3136 0.6596 0.1584 0.132 0.4531 0.0113 0.22 0.1079 0.2601 0.2153 0.4841 0.4226 0.4452 0.3347 0.5327 0.0306 0.3131 0.2803 0.1804 0.1064 0.1581 0.0422 0.1962 0.3544 0.3826 0.014 0.3059 0.0074 0.2008 0. 0.3496 0.0226 0.1998 0.1828] 2022-08-23 06:14:14 [INFO] [EVAL] Class Precision: [0.7934 0.8401 0.9607 0.8244 0.7758 0.8623 0.8701 0.8403 0.6838 0.7691 0.7031 0.718 0.7679 0.498 0.5597 0.5912 0.693 0.6592 0.7636 0.6373 0.8329 0.665 0.6892 0.6201 0.5209 0.538 0.6364 0.7262 0.6398 0.315 0.4273 0.649 0.4524 0.4394 0.4794 0.5417 0.6294 0.7379 0.434 0.6626 0.3588 0.2258 0.5961 0.5171 0.4088 0.4423 0.4655 0.654 0.7265 0.6284 0.6893 0.6589 0.3865 0.5004 0.7752 0.6274 0.9361 0.6848 0.7267 0.4347 0.0955 0.5613 0.4934 0.6815 0.4952 0.8014 0.286 0.4717 0.1997 0.5415 0.6724 0.7136 0.6456 0.2904 0.7314 0.507 0.6782 0.5517 0.7095 0.4125 0.7005 0.6851 0.8225 0.0782 0.4532 0.7229 0.5817 0.3674 0.3608 0.7761 0.6043 0.1177 0.3497 0.3343 0.0019 0.1114 0.3772 0.337 0.5927 0.6543 0.3486 0.1033 0.6047 0.375 0.7229 0.6097 0.2404 0.8981 0.2512 0.6695 0.287 0.3916 0.6581 0.7405 0.7119 0.3066 0.6103 0.8527 0.2776 0.572 0.6435 0.1463 0.7242 0.6729 0.696 0.6373 0.8439 0.5927 0.7035 0.5748 0.5892 0.333 0.4407 0.6611 0.5521 0.1944 0.432 0.0711 0.4405 0.6579 0.4657 0.0191 0.7041 0.1468 0.3939 0. 
0.8778 0.4813 0.5735 0.646 ] 2022-08-23 06:14:14 [INFO] [EVAL] Class Recall: [0.8404 0.905 0.9676 0.8652 0.849 0.8738 0.8835 0.9152 0.7075 0.7561 0.6201 0.6969 0.877 0.448 0.4097 0.5956 0.6647 0.4863 0.7709 0.5479 0.8735 0.702 0.8239 0.7194 0.461 0.4433 0.6175 0.5523 0.5573 0.3728 0.4583 0.6684 0.3755 0.6 0.4627 0.6119 0.5751 0.6533 0.3842 0.4715 0.3864 0.1115 0.4824 0.3438 0.5271 0.2373 0.4455 0.6957 0.8 0.7785 0.6968 0.5826 0.2788 0.3062 0.9219 0.5624 0.9184 0.4657 0.6353 0.3725 0.1001 0.5393 0.5307 0.2944 0.7438 0.8176 0.5882 0.5867 0.2061 0.4539 0.6875 0.7032 0.4895 0.365 0.5394 0.5383 0.7585 0.3804 0.2678 0.143 0.807 0.5022 0.3332 0.0445 0.3415 0.7337 0.07 0.0842 0.3794 0.557 0.6044 0.187 0.3224 0.1429 0.0006 0.0527 0.0326 0.181 0.3338 0.5226 0.1038 0.0779 0.3578 0.0955 0.0482 0.8893 0.109 0.5901 0.1687 0.275 0.1321 0.6435 0.1416 0.6733 0.9931 0.0405 0.3921 0.7445 0.2695 0.1465 0.605 0.0121 0.2401 0.1139 0.2935 0.2453 0.5317 0.5955 0.548 0.4449 0.8474 0.0326 0.5194 0.3274 0.2113 0.1904 0.1996 0.0942 0.2614 0.4345 0.6821 0.0506 0.351 0.0078 0.2905 0. 0.3675 0.0232 0.2347 0.2031] 2022-08-23 06:14:14 [INFO] [EVAL] The model with the best validation mIoU (0.3488) was saved at iter 109000. 2022-08-23 06:14:25 [INFO] [TRAIN] epoch: 88, iter: 111050/160000, loss: 0.5399, lr: 0.000371, batch_cost: 0.2090, reader_cost: 0.00512, ips: 38.2754 samples/sec | ETA 02:50:31 2022-08-23 06:14:35 [INFO] [TRAIN] epoch: 88, iter: 111100/160000, loss: 0.4912, lr: 0.000370, batch_cost: 0.1956, reader_cost: 0.00090, ips: 40.9010 samples/sec | ETA 02:39:24 2022-08-23 06:14:48 [INFO] [TRAIN] epoch: 89, iter: 111150/160000, loss: 0.5188, lr: 0.000370, batch_cost: 0.2654, reader_cost: 0.05362, ips: 30.1488 samples/sec | ETA 03:36:02 2022-08-23 06:14:59 [INFO] [TRAIN] epoch: 89, iter: 111200/160000, loss: 0.5175, lr: 0.000369, batch_cost: 0.2175, reader_cost: 0.00061, ips: 36.7745 samples/sec | ETA 02:56:56 2022-08-23 06:15:09 [INFO] [TRAIN] epoch: 89, iter: 111250/160000, loss: 0.5394, lr: 0.000369, batch_cost: 0.2142, reader_cost: 0.00074, ips: 37.3527 samples/sec | ETA 02:54:01 2022-08-23 06:15:20 [INFO] [TRAIN] epoch: 89, iter: 111300/160000, loss: 0.5023, lr: 0.000369, batch_cost: 0.2167, reader_cost: 0.00058, ips: 36.9214 samples/sec | ETA 02:55:52 2022-08-23 06:15:29 [INFO] [TRAIN] epoch: 89, iter: 111350/160000, loss: 0.5208, lr: 0.000368, batch_cost: 0.1789, reader_cost: 0.00047, ips: 44.7054 samples/sec | ETA 02:25:05 2022-08-23 06:15:38 [INFO] [TRAIN] epoch: 89, iter: 111400/160000, loss: 0.5347, lr: 0.000368, batch_cost: 0.1795, reader_cost: 0.00314, ips: 44.5650 samples/sec | ETA 02:25:24 2022-08-23 06:15:48 [INFO] [TRAIN] epoch: 89, iter: 111450/160000, loss: 0.5486, lr: 0.000368, batch_cost: 0.1966, reader_cost: 0.00056, ips: 40.6841 samples/sec | ETA 02:39:06 2022-08-23 06:15:58 [INFO] [TRAIN] epoch: 89, iter: 111500/160000, loss: 0.4763, lr: 0.000367, batch_cost: 0.1973, reader_cost: 0.00127, ips: 40.5413 samples/sec | ETA 02:39:30 2022-08-23 06:16:09 [INFO] [TRAIN] epoch: 89, iter: 111550/160000, loss: 0.5191, lr: 0.000367, batch_cost: 0.2171, reader_cost: 0.00065, ips: 36.8445 samples/sec | ETA 02:55:19 2022-08-23 06:16:18 [INFO] [TRAIN] epoch: 89, iter: 111600/160000, loss: 0.5087, lr: 0.000366, batch_cost: 0.1860, reader_cost: 0.00096, ips: 43.0066 samples/sec | ETA 02:30:03 2022-08-23 06:16:27 [INFO] [TRAIN] epoch: 89, iter: 111650/160000, loss: 0.5096, lr: 0.000366, batch_cost: 0.1871, reader_cost: 0.00056, ips: 42.7549 samples/sec | ETA 02:30:46 2022-08-23 06:16:38 [INFO] [TRAIN] epoch: 
89, iter: 111700/160000, loss: 0.4660, lr: 0.000366, batch_cost: 0.2102, reader_cost: 0.00045, ips: 38.0575 samples/sec | ETA 02:49:13 2022-08-23 06:16:47 [INFO] [TRAIN] epoch: 89, iter: 111750/160000, loss: 0.5234, lr: 0.000365, batch_cost: 0.1857, reader_cost: 0.00033, ips: 43.0711 samples/sec | ETA 02:29:21 2022-08-23 06:16:56 [INFO] [TRAIN] epoch: 89, iter: 111800/160000, loss: 0.5254, lr: 0.000365, batch_cost: 0.1798, reader_cost: 0.00068, ips: 44.5051 samples/sec | ETA 02:24:24 2022-08-23 06:17:07 [INFO] [TRAIN] epoch: 89, iter: 111850/160000, loss: 0.5557, lr: 0.000365, batch_cost: 0.2252, reader_cost: 0.00107, ips: 35.5235 samples/sec | ETA 03:00:43 2022-08-23 06:17:18 [INFO] [TRAIN] epoch: 89, iter: 111900/160000, loss: 0.5230, lr: 0.000364, batch_cost: 0.2017, reader_cost: 0.00138, ips: 39.6701 samples/sec | ETA 02:41:39 2022-08-23 06:17:27 [INFO] [TRAIN] epoch: 89, iter: 111950/160000, loss: 0.4962, lr: 0.000364, batch_cost: 0.1854, reader_cost: 0.00071, ips: 43.1478 samples/sec | ETA 02:28:28 2022-08-23 06:17:36 [INFO] [TRAIN] epoch: 89, iter: 112000/160000, loss: 0.5617, lr: 0.000363, batch_cost: 0.1897, reader_cost: 0.00046, ips: 42.1783 samples/sec | ETA 02:31:44 2022-08-23 06:17:36 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 183s - batch_cost: 0.1830 - reader cost: 5.7631e-04 2022-08-23 06:20:39 [INFO] [EVAL] #Images: 2000 mIoU: 0.3454 Acc: 0.7656 Kappa: 0.7476 Dice: 0.4777 2022-08-23 06:20:39 [INFO] [EVAL] Class IoU: [0.6884 0.7776 0.9306 0.7308 0.6856 0.7585 0.7712 0.7818 0.518 0.6272 0.49 0.5274 0.6882 0.3096 0.3194 0.4255 0.4498 0.4279 0.6175 0.4093 0.7401 0.4841 0.6433 0.4884 0.3142 0.3605 0.4587 0.4481 0.4328 0.2161 0.2907 0.4766 0.2569 0.3593 0.2922 0.3935 0.4175 0.5395 0.2506 0.4083 0.2336 0.0809 0.3545 0.2645 0.2966 0.2491 0.3292 0.519 0.6454 0.5553 0.5116 0.3723 0.2121 0.1913 0.6672 0.3559 0.8608 0.3701 0.5537 0.218 0.0554 0.4693 0.3377 0.203 0.4399 0.6891 0.2908 0.4008 0.1116 0.3348 0.5268 0.5282 0.4196 0.1932 0.4434 0.3489 0.5469 0.254 0.2585 0.122 0.5568 0.3981 0.3531 0.0293 0.3472 0.5677 0.0924 0.083 0.2763 0.5108 0.3877 0.0509 0.2227 0.0899 0.0097 0.0131 0.0491 0.1348 0.2874 0.375 0.0886 0.0459 0.2855 0.0776 0.0633 0.5104 0.0451 0.5194 0.1091 0.2699 0.1318 0.3712 0.1517 0.4055 0.5883 0.0216 0.2842 0.6589 0.1163 0.1813 0.4221 0.015 0.2684 0.1484 0.2755 0.2627 0.4591 0.4004 0.419 0.3342 0.5521 0.0352 0.3025 0.2902 0.1573 0.1211 0.1626 0.0221 0.2014 0.3624 0.4561 0.0071 0.3431 0.008 0.2261 0. 
0.3658 0.0352 0.1798 0.1931] 2022-08-23 06:20:39 [INFO] [EVAL] Class Precision: [0.7792 0.8548 0.9621 0.8393 0.7851 0.8783 0.8473 0.8422 0.7052 0.7456 0.6967 0.7083 0.7543 0.4963 0.525 0.5977 0.6897 0.6638 0.7412 0.6506 0.8294 0.646 0.7774 0.6307 0.503 0.5699 0.6057 0.7662 0.6327 0.3597 0.4185 0.6459 0.5007 0.4896 0.4366 0.5681 0.6124 0.7199 0.4122 0.5569 0.4421 0.2284 0.6303 0.5237 0.399 0.4163 0.5189 0.7107 0.718 0.6391 0.7313 0.499 0.3731 0.4791 0.7453 0.4932 0.9068 0.7182 0.7219 0.4145 0.1109 0.6197 0.5279 0.7893 0.5349 0.7701 0.3862 0.5375 0.2667 0.573 0.6926 0.6907 0.6278 0.2655 0.7047 0.5086 0.6416 0.4962 0.6906 0.3025 0.6216 0.6746 0.7845 0.114 0.4992 0.721 0.3498 0.3638 0.5718 0.7572 0.5035 0.0632 0.421 0.3622 0.0293 0.0414 0.3313 0.3244 0.478 0.6096 0.3819 0.1254 0.5889 0.3896 0.574 0.5355 0.1611 0.8721 0.2127 0.553 0.2804 0.4854 0.475 0.7096 0.5901 0.1052 0.6599 0.785 0.1602 0.6287 0.5011 0.1552 0.6896 0.6307 0.6524 0.6482 0.8005 0.5762 0.6553 0.561 0.6212 0.2974 0.3926 0.6511 0.5025 0.2182 0.4479 0.0501 0.4636 0.6354 0.6204 0.0103 0.6696 0.168 0.5577 0. 0.8299 0.4042 0.6302 0.7406] 2022-08-23 06:20:39 [INFO] [EVAL] Class Recall: [0.8553 0.896 0.9661 0.8497 0.844 0.8476 0.8957 0.9159 0.6612 0.798 0.6228 0.6737 0.887 0.4514 0.4493 0.5963 0.564 0.5463 0.7873 0.5247 0.8729 0.6588 0.7886 0.684 0.4557 0.4953 0.654 0.5191 0.578 0.3512 0.4878 0.6452 0.3454 0.5745 0.4691 0.5615 0.5675 0.6829 0.3899 0.6047 0.3313 0.1114 0.4475 0.3483 0.536 0.3828 0.4739 0.658 0.8645 0.809 0.63 0.5945 0.3296 0.2416 0.8643 0.5611 0.9443 0.433 0.7038 0.315 0.0997 0.6591 0.4838 0.2147 0.7124 0.8675 0.5406 0.6118 0.161 0.4461 0.6875 0.6919 0.5585 0.4148 0.5445 0.5264 0.7874 0.3424 0.2923 0.1698 0.8425 0.4927 0.3911 0.038 0.5327 0.7274 0.1115 0.0971 0.3483 0.6109 0.6277 0.2068 0.321 0.1068 0.0143 0.0187 0.0545 0.1875 0.4188 0.4935 0.1035 0.0674 0.3565 0.0884 0.0664 0.9159 0.059 0.5622 0.183 0.3452 0.1991 0.6121 0.1823 0.4861 0.9949 0.0264 0.3329 0.804 0.2979 0.203 0.7282 0.0163 0.3052 0.1625 0.3229 0.3063 0.5185 0.5675 0.5374 0.4525 0.8324 0.0384 0.5687 0.3437 0.1863 0.2138 0.2034 0.038 0.2626 0.4575 0.6328 0.023 0.4131 0.0083 0.2755 0. 0.3955 0.0371 0.201 0.2072] 2022-08-23 06:20:40 [INFO] [EVAL] The model with the best validation mIoU (0.3488) was saved at iter 109000. 
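One relation worth knowing when scanning these per-class arrays: IoU is fully determined by Precision and Recall, since dividing TP/(TP+FP+FN) through by TP gives 1/IoU = 1/P + 1/R - 1. Below is a quick check against class 0 of the iter-112000 evaluation above (Precision 0.7792, Recall 0.8553, logged IoU 0.6884); the helper name is illustrative and not part of the training code.

def iou_from_pr(p, r):
    # IoU = TP / (TP + FP + FN); divide numerator and denominator by TP
    return 1.0 / (1.0 / p + 1.0 / r - 1.0)

print(iou_from_pr(0.7792, 0.8553))   # ~0.6884, matching the logged class-0 IoU up to rounding

The same identity explains why a class with high Precision but near-zero Recall (or vice versa) still shows a near-zero IoU in these dumps.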
2022-08-23 06:20:50 [INFO] [TRAIN] epoch: 89, iter: 112050/160000, loss: 0.5148, lr: 0.000363, batch_cost: 0.2131, reader_cost: 0.00315, ips: 37.5408 samples/sec | ETA 02:50:18 2022-08-23 06:21:00 [INFO] [TRAIN] epoch: 89, iter: 112100/160000, loss: 0.5158, lr: 0.000363, batch_cost: 0.1942, reader_cost: 0.00071, ips: 41.2006 samples/sec | ETA 02:35:00 2022-08-23 06:21:10 [INFO] [TRAIN] epoch: 89, iter: 112150/160000, loss: 0.5098, lr: 0.000362, batch_cost: 0.1986, reader_cost: 0.00098, ips: 40.2915 samples/sec | ETA 02:38:20 2022-08-23 06:21:19 [INFO] [TRAIN] epoch: 89, iter: 112200/160000, loss: 0.5027, lr: 0.000362, batch_cost: 0.1843, reader_cost: 0.00058, ips: 43.4134 samples/sec | ETA 02:26:48 2022-08-23 06:21:30 [INFO] [TRAIN] epoch: 89, iter: 112250/160000, loss: 0.5161, lr: 0.000362, batch_cost: 0.2186, reader_cost: 0.00090, ips: 36.6033 samples/sec | ETA 02:53:56 2022-08-23 06:21:42 [INFO] [TRAIN] epoch: 89, iter: 112300/160000, loss: 0.5256, lr: 0.000361, batch_cost: 0.2334, reader_cost: 0.00094, ips: 34.2740 samples/sec | ETA 03:05:33 2022-08-23 06:21:53 [INFO] [TRAIN] epoch: 89, iter: 112350/160000, loss: 0.5148, lr: 0.000361, batch_cost: 0.2216, reader_cost: 0.00069, ips: 36.1032 samples/sec | ETA 02:55:58 2022-08-23 06:22:03 [INFO] [TRAIN] epoch: 89, iter: 112400/160000, loss: 0.5062, lr: 0.000360, batch_cost: 0.2000, reader_cost: 0.00068, ips: 40.0084 samples/sec | ETA 02:38:38 2022-08-23 06:22:16 [INFO] [TRAIN] epoch: 90, iter: 112450/160000, loss: 0.5775, lr: 0.000360, batch_cost: 0.2646, reader_cost: 0.07469, ips: 30.2294 samples/sec | ETA 03:29:43 2022-08-23 06:22:27 [INFO] [TRAIN] epoch: 90, iter: 112500/160000, loss: 0.5176, lr: 0.000360, batch_cost: 0.2088, reader_cost: 0.00069, ips: 38.3145 samples/sec | ETA 02:45:17 2022-08-23 06:22:37 [INFO] [TRAIN] epoch: 90, iter: 112550/160000, loss: 0.5499, lr: 0.000359, batch_cost: 0.2070, reader_cost: 0.00069, ips: 38.6472 samples/sec | ETA 02:43:42 2022-08-23 06:22:48 [INFO] [TRAIN] epoch: 90, iter: 112600/160000, loss: 0.5204, lr: 0.000359, batch_cost: 0.2170, reader_cost: 0.00036, ips: 36.8586 samples/sec | ETA 02:51:27 2022-08-23 06:22:58 [INFO] [TRAIN] epoch: 90, iter: 112650/160000, loss: 0.4957, lr: 0.000358, batch_cost: 0.2056, reader_cost: 0.00094, ips: 38.9144 samples/sec | ETA 02:42:14 2022-08-23 06:23:08 [INFO] [TRAIN] epoch: 90, iter: 112700/160000, loss: 0.5389, lr: 0.000358, batch_cost: 0.1990, reader_cost: 0.00048, ips: 40.2084 samples/sec | ETA 02:36:50 2022-08-23 06:23:18 [INFO] [TRAIN] epoch: 90, iter: 112750/160000, loss: 0.5372, lr: 0.000358, batch_cost: 0.1947, reader_cost: 0.00050, ips: 41.0848 samples/sec | ETA 02:33:20 2022-08-23 06:23:27 [INFO] [TRAIN] epoch: 90, iter: 112800/160000, loss: 0.5419, lr: 0.000357, batch_cost: 0.1933, reader_cost: 0.00067, ips: 41.3890 samples/sec | ETA 02:32:03 2022-08-23 06:23:38 [INFO] [TRAIN] epoch: 90, iter: 112850/160000, loss: 0.5345, lr: 0.000357, batch_cost: 0.2043, reader_cost: 0.00046, ips: 39.1525 samples/sec | ETA 02:40:34 2022-08-23 06:23:48 [INFO] [TRAIN] epoch: 90, iter: 112900/160000, loss: 0.5639, lr: 0.000357, batch_cost: 0.2017, reader_cost: 0.00056, ips: 39.6646 samples/sec | ETA 02:38:19 2022-08-23 06:23:58 [INFO] [TRAIN] epoch: 90, iter: 112950/160000, loss: 0.5093, lr: 0.000356, batch_cost: 0.2024, reader_cost: 0.00074, ips: 39.5199 samples/sec | ETA 02:38:44 2022-08-23 06:24:07 [INFO] [TRAIN] epoch: 90, iter: 113000/160000, loss: 0.5136, lr: 0.000356, batch_cost: 0.1871, reader_cost: 0.00056, ips: 42.7563 samples/sec | ETA 02:26:34 2022-08-23 
06:24:07 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 165s - batch_cost: 0.1652 - reader cost: 7.2423e-04 2022-08-23 06:26:52 [INFO] [EVAL] #Images: 2000 mIoU: 0.3495 Acc: 0.7674 Kappa: 0.7496 Dice: 0.4819 2022-08-23 06:26:52 [INFO] [EVAL] Class IoU: [0.6898 0.7792 0.93 0.727 0.6818 0.7614 0.7709 0.7736 0.5236 0.6397 0.491 0.5127 0.7047 0.291 0.3342 0.4229 0.5029 0.4333 0.6267 0.4221 0.7384 0.5393 0.6455 0.5068 0.3193 0.3241 0.4538 0.4573 0.4221 0.2344 0.2896 0.49 0.2585 0.3371 0.302 0.3845 0.4297 0.5827 0.2398 0.3974 0.2208 0.1098 0.3431 0.2559 0.3256 0.2143 0.3221 0.5302 0.5967 0.5481 0.5357 0.3578 0.2183 0.2314 0.6957 0.4076 0.859 0.3771 0.4694 0.2486 0.0895 0.4218 0.3338 0.2375 0.4462 0.6699 0.3012 0.4247 0.0762 0.3334 0.5287 0.5318 0.4221 0.2042 0.4624 0.3597 0.5404 0.2495 0.2616 0.1273 0.6313 0.4073 0.382 0.0275 0.3234 0.5666 0.0939 0.0776 0.2445 0.513 0.4134 0.0441 0.202 0.1187 0.0136 0.0412 0.0237 0.1119 0.296 0.3939 0.0542 0.0296 0.269 0.0779 0.0966 0.5619 0.087 0.5477 0.1031 0.2653 0.1397 0.3582 0.1394 0.4524 0.6968 0.011 0.2657 0.6489 0.1291 0.2919 0.4075 0.0126 0.2201 0.1288 0.2678 0.3133 0.4842 0.3982 0.4063 0.336 0.5334 0.0246 0.2386 0.3088 0.1961 0.1255 0.1614 0.0313 0.2066 0.3615 0.4029 0.0091 0.3694 0.0275 0.2866 0. 0.3599 0.0229 0.1827 0.1882] 2022-08-23 06:26:52 [INFO] [EVAL] Class Precision: [0.7908 0.8469 0.9651 0.828 0.7708 0.8762 0.8433 0.8286 0.6659 0.7675 0.696 0.7208 0.7922 0.488 0.5021 0.5744 0.7084 0.6771 0.7677 0.6094 0.8276 0.7092 0.7749 0.6572 0.5316 0.6402 0.6037 0.7591 0.651 0.3751 0.4486 0.6639 0.4592 0.467 0.4365 0.5263 0.6101 0.7611 0.4338 0.5944 0.3948 0.2785 0.532 0.5203 0.4581 0.4016 0.489 0.7396 0.728 0.64 0.7662 0.4699 0.3899 0.4561 0.7655 0.5216 0.9185 0.7146 0.7823 0.4842 0.1598 0.5483 0.5235 0.67 0.5366 0.7537 0.4361 0.5824 0.2051 0.5661 0.7363 0.6745 0.628 0.2924 0.691 0.5779 0.6381 0.5497 0.651 0.3719 0.734 0.7017 0.7661 0.0684 0.5086 0.7281 0.4266 0.3751 0.4276 0.7317 0.5466 0.0525 0.3801 0.3295 0.0341 0.1108 0.561 0.2902 0.4732 0.6619 0.3244 0.0809 0.6075 0.533 0.8601 0.5995 0.2005 0.8594 0.2637 0.8025 0.3112 0.4481 0.5271 0.6722 0.6991 0.0972 0.7004 0.8019 0.2445 0.6675 0.5341 0.2883 0.733 0.6951 0.7118 0.652 0.7427 0.5513 0.6227 0.5901 0.6044 0.2491 0.3635 0.6222 0.62 0.2823 0.3385 0.0716 0.4425 0.6446 0.4778 0.0129 0.6168 0.2768 0.5204 0. 0.8805 0.4723 0.4399 0.7433] 2022-08-23 06:26:52 [INFO] [EVAL] Class Recall: [0.8438 0.907 0.9624 0.8563 0.8552 0.8532 0.8997 0.9209 0.7103 0.7935 0.625 0.6397 0.8645 0.419 0.4999 0.616 0.6342 0.5462 0.7733 0.5787 0.8726 0.6924 0.7944 0.689 0.4443 0.3963 0.6463 0.535 0.5456 0.3845 0.4495 0.6517 0.3716 0.5478 0.495 0.588 0.5924 0.7132 0.349 0.5453 0.3336 0.1534 0.4913 0.3349 0.5297 0.3149 0.4856 0.6518 0.7679 0.7923 0.6404 0.5999 0.3316 0.3196 0.8842 0.651 0.9299 0.4439 0.54 0.3382 0.1691 0.6465 0.4795 0.269 0.726 0.8576 0.4933 0.6106 0.1082 0.4479 0.6521 0.7154 0.5629 0.4038 0.5829 0.4879 0.7793 0.3135 0.3043 0.1622 0.8186 0.4927 0.4324 0.044 0.4704 0.7186 0.1074 0.0891 0.3635 0.6319 0.6291 0.2146 0.3013 0.1566 0.022 0.0615 0.0241 0.1541 0.4415 0.4932 0.0611 0.0445 0.3255 0.0836 0.0982 0.8995 0.1332 0.6016 0.1448 0.2838 0.2023 0.641 0.1593 0.5804 0.9955 0.0123 0.2997 0.7728 0.2147 0.3415 0.6322 0.013 0.2393 0.1365 0.3004 0.3763 0.5817 0.5892 0.539 0.4384 0.8194 0.0266 0.4097 0.3801 0.2228 0.1842 0.2357 0.0526 0.2793 0.4514 0.72 0.0296 0.4794 0.0296 0.3894 0. 
0.3783 0.0235 0.2381 0.2013] 2022-08-23 06:26:53 [INFO] [EVAL] The model with the best validation mIoU (0.3495) was saved at iter 113000. 2022-08-23 06:27:02 [INFO] [TRAIN] epoch: 90, iter: 113050/160000, loss: 0.5068, lr: 0.000355, batch_cost: 0.1936, reader_cost: 0.00416, ips: 41.3119 samples/sec | ETA 02:31:31 2022-08-23 06:27:12 [INFO] [TRAIN] epoch: 90, iter: 113100/160000, loss: 0.5230, lr: 0.000355, batch_cost: 0.1841, reader_cost: 0.00090, ips: 43.4533 samples/sec | ETA 02:23:54 2022-08-23 06:27:21 [INFO] [TRAIN] epoch: 90, iter: 113150/160000, loss: 0.5209, lr: 0.000355, batch_cost: 0.1945, reader_cost: 0.00032, ips: 41.1345 samples/sec | ETA 02:31:51 2022-08-23 06:27:31 [INFO] [TRAIN] epoch: 90, iter: 113200/160000, loss: 0.4823, lr: 0.000354, batch_cost: 0.1903, reader_cost: 0.00093, ips: 42.0329 samples/sec | ETA 02:28:27 2022-08-23 06:27:42 [INFO] [TRAIN] epoch: 90, iter: 113250/160000, loss: 0.5335, lr: 0.000354, batch_cost: 0.2256, reader_cost: 0.00042, ips: 35.4600 samples/sec | ETA 02:55:47 2022-08-23 06:27:53 [INFO] [TRAIN] epoch: 90, iter: 113300/160000, loss: 0.5555, lr: 0.000354, batch_cost: 0.2104, reader_cost: 0.00039, ips: 38.0257 samples/sec | ETA 02:43:44 2022-08-23 06:28:04 [INFO] [TRAIN] epoch: 90, iter: 113350/160000, loss: 0.5492, lr: 0.000353, batch_cost: 0.2219, reader_cost: 0.00056, ips: 36.0454 samples/sec | ETA 02:52:33 2022-08-23 06:28:14 [INFO] [TRAIN] epoch: 90, iter: 113400/160000, loss: 0.5340, lr: 0.000353, batch_cost: 0.2150, reader_cost: 0.00113, ips: 37.2082 samples/sec | ETA 02:46:59 2022-08-23 06:28:26 [INFO] [TRAIN] epoch: 90, iter: 113450/160000, loss: 0.5640, lr: 0.000352, batch_cost: 0.2209, reader_cost: 0.00054, ips: 36.2122 samples/sec | ETA 02:51:23 2022-08-23 06:28:36 [INFO] [TRAIN] epoch: 90, iter: 113500/160000, loss: 0.5160, lr: 0.000352, batch_cost: 0.2081, reader_cost: 0.00057, ips: 38.4516 samples/sec | ETA 02:41:14 2022-08-23 06:28:46 [INFO] [TRAIN] epoch: 90, iter: 113550/160000, loss: 0.5170, lr: 0.000352, batch_cost: 0.2000, reader_cost: 0.00168, ips: 40.0077 samples/sec | ETA 02:34:48 2022-08-23 06:28:56 [INFO] [TRAIN] epoch: 90, iter: 113600/160000, loss: 0.5469, lr: 0.000351, batch_cost: 0.2074, reader_cost: 0.00043, ips: 38.5663 samples/sec | ETA 02:40:24 2022-08-23 06:29:08 [INFO] [TRAIN] epoch: 90, iter: 113650/160000, loss: 0.5047, lr: 0.000351, batch_cost: 0.2240, reader_cost: 0.00110, ips: 35.7095 samples/sec | ETA 02:53:03 2022-08-23 06:29:22 [INFO] [TRAIN] epoch: 91, iter: 113700/160000, loss: 0.5105, lr: 0.000351, batch_cost: 0.2987, reader_cost: 0.09316, ips: 26.7788 samples/sec | ETA 03:50:31 2022-08-23 06:29:33 [INFO] [TRAIN] epoch: 91, iter: 113750/160000, loss: 0.5445, lr: 0.000350, batch_cost: 0.2121, reader_cost: 0.00077, ips: 37.7123 samples/sec | ETA 02:43:31 2022-08-23 06:29:43 [INFO] [TRAIN] epoch: 91, iter: 113800/160000, loss: 0.5362, lr: 0.000350, batch_cost: 0.2003, reader_cost: 0.00481, ips: 39.9490 samples/sec | ETA 02:34:11 2022-08-23 06:29:54 [INFO] [TRAIN] epoch: 91, iter: 113850/160000, loss: 0.5487, lr: 0.000349, batch_cost: 0.2106, reader_cost: 0.00066, ips: 37.9832 samples/sec | ETA 02:42:00 2022-08-23 06:30:04 [INFO] [TRAIN] epoch: 91, iter: 113900/160000, loss: 0.4884, lr: 0.000349, batch_cost: 0.2001, reader_cost: 0.00144, ips: 39.9872 samples/sec | ETA 02:33:42 2022-08-23 06:30:15 [INFO] [TRAIN] epoch: 91, iter: 113950/160000, loss: 0.5096, lr: 0.000349, batch_cost: 0.2349, reader_cost: 0.00063, ips: 34.0514 samples/sec | ETA 03:00:18 2022-08-23 06:30:26 [INFO] [TRAIN] epoch: 91, iter: 
114000/160000, loss: 0.5385, lr: 0.000348, batch_cost: 0.2211, reader_cost: 0.00052, ips: 36.1801 samples/sec | ETA 02:49:31 2022-08-23 06:30:26 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 148s - batch_cost: 0.1483 - reader cost: 6.9790e-04 2022-08-23 06:32:55 [INFO] [EVAL] #Images: 2000 mIoU: 0.3440 Acc: 0.7666 Kappa: 0.7487 Dice: 0.4741 2022-08-23 06:32:55 [INFO] [EVAL] Class IoU: [0.689 0.7813 0.9301 0.7248 0.6841 0.7599 0.78 0.7824 0.5274 0.6075 0.4929 0.5537 0.7005 0.2804 0.3153 0.4202 0.5058 0.4458 0.6191 0.4206 0.7426 0.5178 0.6441 0.5116 0.3258 0.3408 0.4427 0.4165 0.4044 0.2096 0.2883 0.4599 0.2667 0.3699 0.3083 0.3736 0.4287 0.5645 0.2415 0.356 0.1639 0.0979 0.3456 0.2624 0.2802 0.1489 0.3158 0.5277 0.5727 0.5673 0.5292 0.366 0.223 0.1639 0.6639 0.4335 0.8691 0.4234 0.5461 0.2001 0.0767 0.4179 0.3103 0.2585 0.4417 0.7009 0.2912 0.4227 0.1319 0.3144 0.4835 0.4867 0.4019 0.1967 0.4716 0.3513 0.5224 0.2821 0.272 0.1583 0.567 0.403 0.352 0.0193 0.0684 0.5657 0.0831 0.0888 0.2636 0.5256 0.4226 0.0354 0.2109 0.1063 0.0004 0.0135 0.0326 0.1091 0.2679 0.4576 0.0875 0.0376 0.2968 0.063 0.0399 0.535 0.0589 0.5086 0.105 0.2557 0.1135 0.4717 0.1316 0.5562 0.7518 0.0164 0.2776 0.631 0.1456 0.1675 0.4574 0.012 0.2133 0.1476 0.2512 0.1965 0.4464 0.3814 0.4399 0.3458 0.5381 0.0376 0.2429 0.2917 0.162 0.1094 0.1469 0.0463 0.2137 0.3695 0.4771 0.0062 0.3299 0.068 0.1373 0.0023 0.3461 0.0262 0.2074 0.179 ] 2022-08-23 06:32:55 [INFO] [EVAL] Class Precision: [0.7826 0.8567 0.9615 0.8153 0.7834 0.851 0.8723 0.8437 0.6947 0.7324 0.7116 0.7075 0.7853 0.4818 0.5293 0.6007 0.6666 0.6685 0.7657 0.6118 0.8258 0.718 0.7803 0.6317 0.4968 0.6265 0.5921 0.8148 0.6685 0.3192 0.4468 0.5682 0.4588 0.5159 0.4255 0.4748 0.6558 0.7753 0.4697 0.6884 0.3198 0.2645 0.6433 0.5588 0.4098 0.3971 0.4831 0.7665 0.6541 0.6773 0.7582 0.4774 0.3803 0.4502 0.7372 0.7618 0.9219 0.6332 0.7066 0.3495 0.1362 0.5272 0.4493 0.6554 0.5256 0.8027 0.4226 0.5731 0.3936 0.5595 0.7467 0.7578 0.5868 0.3327 0.695 0.5146 0.6103 0.5199 0.6634 0.4754 0.6342 0.7006 0.7995 0.0803 0.2196 0.731 0.5255 0.363 0.5829 0.7048 0.5915 0.0433 0.3553 0.3981 0.0018 0.04 0.5527 0.3477 0.4618 0.6394 0.3283 0.0808 0.5945 0.4875 0.7975 0.5611 0.1628 0.8135 0.2574 0.5876 0.281 0.6643 0.4978 0.7332 0.7551 0.1218 0.6594 0.7683 0.1808 0.5521 0.5994 0.2464 0.8133 0.6437 0.6927 0.6272 0.8332 0.4988 0.7194 0.573 0.6384 0.4304 0.3845 0.6792 0.5776 0.1824 0.4614 0.0889 0.4535 0.6794 0.5675 0.0091 0.6597 0.2942 0.4112 0.0062 0.8877 0.4858 0.6079 0.8174] 2022-08-23 06:32:55 [INFO] [EVAL] Class Recall: [0.8521 0.8988 0.9661 0.8672 0.8436 0.8765 0.8805 0.915 0.6865 0.7808 0.6159 0.7182 0.8665 0.4015 0.4381 0.5831 0.6771 0.5723 0.7638 0.5738 0.8805 0.6501 0.7868 0.7291 0.4862 0.4277 0.637 0.46 0.5059 0.379 0.4484 0.7069 0.3891 0.5666 0.5282 0.6367 0.5532 0.675 0.332 0.4243 0.2515 0.1345 0.4275 0.3309 0.4696 0.1923 0.4769 0.6289 0.8215 0.7775 0.6367 0.6106 0.3503 0.205 0.8699 0.5015 0.9381 0.5609 0.7062 0.3187 0.1492 0.6684 0.5009 0.2991 0.7343 0.8469 0.4835 0.6169 0.1656 0.4178 0.5783 0.5764 0.5605 0.3247 0.5946 0.5254 0.7839 0.3815 0.3156 0.1918 0.8425 0.4868 0.386 0.0248 0.0904 0.7144 0.0898 0.1052 0.3249 0.6739 0.5968 0.1618 0.3416 0.1266 0.0005 0.02 0.0335 0.1373 0.3895 0.6168 0.1066 0.0658 0.3722 0.0675 0.0403 0.92 0.0845 0.5758 0.1506 0.3117 0.1599 0.6193 0.1517 0.6974 0.9942 0.0186 0.324 0.7793 0.4277 0.1938 0.6588 0.0125 0.2243 0.1608 0.2826 0.2225 0.4902 0.6184 0.531 0.4659 0.774 0.0396 0.3974 0.3383 0.1837 0.2148 0.1773 0.0879 
0.2878 0.4475 0.7497 0.0193 0.3975 0.0813 0.171 0.0036 0.3619 0.027 0.2395 0.1865] 2022-08-23 06:32:55 [INFO] [EVAL] The model with the best validation mIoU (0.3495) was saved at iter 113000. 2022-08-23 06:33:04 [INFO] [TRAIN] epoch: 91, iter: 114050/160000, loss: 0.5301, lr: 0.000348, batch_cost: 0.1790, reader_cost: 0.00343, ips: 44.6893 samples/sec | ETA 02:17:05 2022-08-23 06:33:13 [INFO] [TRAIN] epoch: 91, iter: 114100/160000, loss: 0.5303, lr: 0.000348, batch_cost: 0.1856, reader_cost: 0.00132, ips: 43.0959 samples/sec | ETA 02:22:00 2022-08-23 06:33:22 [INFO] [TRAIN] epoch: 91, iter: 114150/160000, loss: 0.5433, lr: 0.000347, batch_cost: 0.1712, reader_cost: 0.00058, ips: 46.7252 samples/sec | ETA 02:10:50 2022-08-23 06:33:31 [INFO] [TRAIN] epoch: 91, iter: 114200/160000, loss: 0.5261, lr: 0.000347, batch_cost: 0.1789, reader_cost: 0.00053, ips: 44.7191 samples/sec | ETA 02:16:33 2022-08-23 06:33:41 [INFO] [TRAIN] epoch: 91, iter: 114250/160000, loss: 0.4946, lr: 0.000346, batch_cost: 0.2010, reader_cost: 0.00052, ips: 39.7911 samples/sec | ETA 02:33:18 2022-08-23 06:33:51 [INFO] [TRAIN] epoch: 91, iter: 114300/160000, loss: 0.4925, lr: 0.000346, batch_cost: 0.2078, reader_cost: 0.00032, ips: 38.5001 samples/sec | ETA 02:38:16 2022-08-23 06:34:03 [INFO] [TRAIN] epoch: 91, iter: 114350/160000, loss: 0.5821, lr: 0.000346, batch_cost: 0.2281, reader_cost: 0.00053, ips: 35.0745 samples/sec | ETA 02:53:32 2022-08-23 06:34:14 [INFO] [TRAIN] epoch: 91, iter: 114400/160000, loss: 0.5377, lr: 0.000345, batch_cost: 0.2236, reader_cost: 0.00059, ips: 35.7757 samples/sec | ETA 02:49:56 2022-08-23 06:34:26 [INFO] [TRAIN] epoch: 91, iter: 114450/160000, loss: 0.4981, lr: 0.000345, batch_cost: 0.2342, reader_cost: 0.00079, ips: 34.1542 samples/sec | ETA 02:57:49 2022-08-23 06:34:37 [INFO] [TRAIN] epoch: 91, iter: 114500/160000, loss: 0.5231, lr: 0.000344, batch_cost: 0.2268, reader_cost: 0.00049, ips: 35.2696 samples/sec | ETA 02:52:00 2022-08-23 06:34:47 [INFO] [TRAIN] epoch: 91, iter: 114550/160000, loss: 0.5459, lr: 0.000344, batch_cost: 0.2073, reader_cost: 0.00253, ips: 38.5832 samples/sec | ETA 02:37:03 2022-08-23 06:34:59 [INFO] [TRAIN] epoch: 91, iter: 114600/160000, loss: 0.5100, lr: 0.000344, batch_cost: 0.2362, reader_cost: 0.00050, ips: 33.8741 samples/sec | ETA 02:58:42 2022-08-23 06:35:10 [INFO] [TRAIN] epoch: 91, iter: 114650/160000, loss: 0.5202, lr: 0.000343, batch_cost: 0.2131, reader_cost: 0.00055, ips: 37.5381 samples/sec | ETA 02:41:04 2022-08-23 06:35:20 [INFO] [TRAIN] epoch: 91, iter: 114700/160000, loss: 0.5436, lr: 0.000343, batch_cost: 0.2146, reader_cost: 0.01692, ips: 37.2771 samples/sec | ETA 02:42:01 2022-08-23 06:35:31 [INFO] [TRAIN] epoch: 91, iter: 114750/160000, loss: 0.5061, lr: 0.000343, batch_cost: 0.2141, reader_cost: 0.00058, ips: 37.3651 samples/sec | ETA 02:41:28 2022-08-23 06:35:42 [INFO] [TRAIN] epoch: 91, iter: 114800/160000, loss: 0.5185, lr: 0.000342, batch_cost: 0.2103, reader_cost: 0.00066, ips: 38.0436 samples/sec | ETA 02:38:24 2022-08-23 06:35:53 [INFO] [TRAIN] epoch: 91, iter: 114850/160000, loss: 0.5373, lr: 0.000342, batch_cost: 0.2179, reader_cost: 0.00053, ips: 36.7115 samples/sec | ETA 02:43:58 2022-08-23 06:36:04 [INFO] [TRAIN] epoch: 91, iter: 114900/160000, loss: 0.4899, lr: 0.000341, batch_cost: 0.2196, reader_cost: 0.00057, ips: 36.4338 samples/sec | ETA 02:45:02 2022-08-23 06:36:16 [INFO] [TRAIN] epoch: 92, iter: 114950/160000, loss: 0.5084, lr: 0.000341, batch_cost: 0.2439, reader_cost: 0.06758, ips: 32.8008 samples/sec | ETA 03:03:07 
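The ips and ETA fields in the [TRAIN] lines are consistent with simple arithmetic on batch_cost: ips is about batch_size / batch_cost (the numbers imply a per-process batch size of 8, since ips * batch_cost is close to 8 in every line), and ETA is about (160000 - iter) * batch_cost. A small sketch follows, using the iter-114950 line above as the worked example; the function and argument names are made up for illustration and are not part of train.py, and whether the training loop computes ETA exactly this way is an inference from the numbers (they agree to within a second).

def train_line_estimates(iter_now, batch_cost, total_iters=160000, batch_size=8):
    ips = batch_size / batch_cost                         # samples/sec per process
    eta_sec = int((total_iters - iter_now) * batch_cost)  # time left at the current speed
    h, rem = divmod(eta_sec, 3600)
    m, s = divmod(rem, 60)
    return round(ips, 2), f"{h:02d}:{m:02d}:{s:02d}"

print(train_line_estimates(114950, 0.2439))   # -> (32.8, '03:03:07'), matching the logged 32.8008 / 03:03:07 up to rounding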
2022-08-23 06:36:25 [INFO] [TRAIN] epoch: 92, iter: 115000/160000, loss: 0.5354, lr: 0.000341, batch_cost: 0.1908, reader_cost: 0.00124, ips: 41.9294 samples/sec | ETA 02:23:05 2022-08-23 06:36:25 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 193s - batch_cost: 0.1930 - reader cost: 0.0012 2022-08-23 06:39:38 [INFO] [EVAL] #Images: 2000 mIoU: 0.3482 Acc: 0.7681 Kappa: 0.7503 Dice: 0.4792 2022-08-23 06:39:38 [INFO] [EVAL] Class IoU: [0.6929 0.7772 0.9314 0.7275 0.6834 0.7628 0.7741 0.7757 0.5203 0.6278 0.4974 0.551 0.6944 0.2862 0.3271 0.4277 0.5057 0.4363 0.6133 0.4298 0.7435 0.5438 0.6374 0.5005 0.3258 0.3267 0.521 0.4659 0.3312 0.2402 0.3033 0.4703 0.2689 0.3575 0.3076 0.3874 0.4281 0.5418 0.267 0.3831 0.17 0.0762 0.3421 0.2608 0.2873 0.1925 0.3269 0.5117 0.5806 0.5984 0.5271 0.3608 0.2128 0.2043 0.6643 0.4586 0.8657 0.3972 0.4837 0.3117 0.0809 0.3507 0.3314 0.2006 0.48 0.7043 0.312 0.4255 0.0421 0.3276 0.5089 0.5018 0.3802 0.2319 0.4573 0.3447 0.4871 0.2746 0.2639 0.1341 0.6181 0.3968 0.344 0.0334 0.3722 0.5714 0.1156 0.0868 0.2746 0.5182 0.3882 0.0387 0.2248 0.1316 0.0182 0.0168 0.035 0.0994 0.2779 0.4369 0.0419 0.0304 0.2753 0.1289 0.0357 0.5705 0.0514 0.4792 0.1033 0.2476 0.129 0.4225 0.1379 0.5569 0.7898 0.0192 0.3269 0.6326 0.1205 0.1509 0.4276 0.0087 0.243 0.1626 0.25 0.2739 0.5036 0.4247 0.4812 0.3571 0.5814 0.032 0.2753 0.2456 0.1986 0.1169 0.1659 0.0155 0.1844 0.3599 0.2472 0.0029 0.3666 0.074 0.1621 0. 0.3808 0.0338 0.1754 0.1683] 2022-08-23 06:39:38 [INFO] [EVAL] Class Precision: [0.7865 0.858 0.9628 0.8155 0.7712 0.8642 0.864 0.8285 0.6911 0.7449 0.6981 0.7451 0.767 0.4691 0.5192 0.6063 0.6765 0.6689 0.7414 0.6287 0.8296 0.6792 0.7782 0.6732 0.4887 0.5732 0.7079 0.7382 0.6784 0.3831 0.4351 0.6219 0.4796 0.4776 0.463 0.5449 0.6455 0.7926 0.485 0.6494 0.3682 0.2365 0.5647 0.5659 0.3979 0.448 0.5332 0.7074 0.6491 0.7363 0.7651 0.4846 0.3257 0.4517 0.7619 0.655 0.91 0.669 0.7981 0.5102 0.1393 0.5736 0.4844 0.7518 0.6012 0.8123 0.4248 0.5742 0.1396 0.6061 0.6643 0.7181 0.6257 0.3069 0.7225 0.4813 0.6874 0.5759 0.7443 0.4279 0.7324 0.639 0.8131 0.0859 0.4635 0.7585 0.4775 0.3883 0.638 0.7266 0.5033 0.0436 0.3955 0.3552 0.0795 0.0534 0.5727 0.3514 0.4636 0.6615 0.1874 0.0708 0.5205 0.4754 0.5418 0.6391 0.2572 0.8605 0.2812 0.5605 0.3017 0.5554 0.4958 0.6935 0.7956 0.1005 0.6217 0.7663 0.1559 0.5586 0.5819 0.3717 0.5959 0.5536 0.6947 0.624 0.8173 0.6038 0.794 0.7476 0.6528 0.2768 0.3977 0.6195 0.5679 0.2404 0.3316 0.0511 0.482 0.6343 0.2622 0.0047 0.6532 0.6286 0.4054 0. 
0.8393 0.3926 0.398 0.7778] 2022-08-23 06:39:38 [INFO] [EVAL] Class Recall: [0.8534 0.892 0.9662 0.8709 0.8573 0.8667 0.8815 0.924 0.678 0.7997 0.6337 0.6789 0.8801 0.4233 0.4692 0.5921 0.667 0.5566 0.7803 0.5761 0.8776 0.7318 0.779 0.6612 0.4943 0.4316 0.6637 0.5581 0.3929 0.3917 0.5003 0.6586 0.3797 0.587 0.4782 0.5727 0.5597 0.6313 0.3727 0.4829 0.24 0.1011 0.4646 0.3261 0.5082 0.2523 0.458 0.6491 0.8461 0.7616 0.6289 0.5854 0.3805 0.2717 0.8382 0.6047 0.9467 0.4944 0.5511 0.4448 0.1618 0.4743 0.5119 0.2149 0.7041 0.8412 0.5402 0.6216 0.0569 0.4162 0.6851 0.6249 0.4922 0.4871 0.5547 0.5483 0.6257 0.3443 0.2902 0.1634 0.7983 0.5114 0.3735 0.0518 0.6538 0.6985 0.1323 0.1006 0.3252 0.6437 0.6293 0.2527 0.3424 0.173 0.0231 0.0239 0.036 0.1217 0.4097 0.5626 0.0513 0.0507 0.3689 0.1503 0.0368 0.8415 0.0603 0.5196 0.1403 0.3073 0.1839 0.6385 0.1604 0.7386 0.9908 0.0231 0.408 0.7838 0.3465 0.1713 0.6173 0.0088 0.291 0.1872 0.2809 0.328 0.5675 0.5887 0.5498 0.406 0.8418 0.035 0.4722 0.2892 0.2339 0.1854 0.2493 0.0219 0.2299 0.4542 0.8122 0.0075 0.4552 0.0774 0.2126 0. 0.4108 0.0357 0.2386 0.1768] 2022-08-23 06:39:39 [INFO] [EVAL] The model with the best validation mIoU (0.3495) was saved at iter 113000. 2022-08-23 06:39:48 [INFO] [TRAIN] epoch: 92, iter: 115050/160000, loss: 0.5289, lr: 0.000340, batch_cost: 0.1857, reader_cost: 0.00491, ips: 43.0756 samples/sec | ETA 02:19:08 2022-08-23 06:39:57 [INFO] [TRAIN] epoch: 92, iter: 115100/160000, loss: 0.5183, lr: 0.000340, batch_cost: 0.1831, reader_cost: 0.00105, ips: 43.6930 samples/sec | ETA 02:17:00 2022-08-23 06:40:07 [INFO] [TRAIN] epoch: 92, iter: 115150/160000, loss: 0.5253, lr: 0.000340, batch_cost: 0.1921, reader_cost: 0.00042, ips: 41.6376 samples/sec | ETA 02:23:37 2022-08-23 06:40:17 [INFO] [TRAIN] epoch: 92, iter: 115200/160000, loss: 0.5337, lr: 0.000339, batch_cost: 0.2039, reader_cost: 0.00048, ips: 39.2371 samples/sec | ETA 02:32:14 2022-08-23 06:40:26 [INFO] [TRAIN] epoch: 92, iter: 115250/160000, loss: 0.5154, lr: 0.000339, batch_cost: 0.1814, reader_cost: 0.00054, ips: 44.1055 samples/sec | ETA 02:15:16 2022-08-23 06:40:36 [INFO] [TRAIN] epoch: 92, iter: 115300/160000, loss: 0.5376, lr: 0.000338, batch_cost: 0.1988, reader_cost: 0.00055, ips: 40.2465 samples/sec | ETA 02:28:05 2022-08-23 06:40:46 [INFO] [TRAIN] epoch: 92, iter: 115350/160000, loss: 0.5195, lr: 0.000338, batch_cost: 0.2081, reader_cost: 0.00089, ips: 38.4460 samples/sec | ETA 02:34:50 2022-08-23 06:40:55 [INFO] [TRAIN] epoch: 92, iter: 115400/160000, loss: 0.5652, lr: 0.000338, batch_cost: 0.1845, reader_cost: 0.00035, ips: 43.3507 samples/sec | ETA 02:17:10 2022-08-23 06:41:05 [INFO] [TRAIN] epoch: 92, iter: 115450/160000, loss: 0.5322, lr: 0.000337, batch_cost: 0.1862, reader_cost: 0.00068, ips: 42.9681 samples/sec | ETA 02:18:14 2022-08-23 06:41:15 [INFO] [TRAIN] epoch: 92, iter: 115500/160000, loss: 0.5191, lr: 0.000337, batch_cost: 0.2021, reader_cost: 0.00064, ips: 39.5807 samples/sec | ETA 02:29:54 2022-08-23 06:41:25 [INFO] [TRAIN] epoch: 92, iter: 115550/160000, loss: 0.4765, lr: 0.000337, batch_cost: 0.1964, reader_cost: 0.00072, ips: 40.7307 samples/sec | ETA 02:25:30 2022-08-23 06:41:36 [INFO] [TRAIN] epoch: 92, iter: 115600/160000, loss: 0.5535, lr: 0.000336, batch_cost: 0.2201, reader_cost: 0.00058, ips: 36.3524 samples/sec | ETA 02:42:51 2022-08-23 06:41:47 [INFO] [TRAIN] epoch: 92, iter: 115650/160000, loss: 0.5280, lr: 0.000336, batch_cost: 0.2156, reader_cost: 0.00060, ips: 37.1125 samples/sec | ETA 02:39:20 2022-08-23 06:41:57 [INFO] [TRAIN] 
epoch: 92, iter: 115700/160000, loss: 0.5197, lr: 0.000335, batch_cost: 0.2129, reader_cost: 0.00092, ips: 37.5732 samples/sec | ETA 02:37:12 2022-08-23 06:42:09 [INFO] [TRAIN] epoch: 92, iter: 115750/160000, loss: 0.5224, lr: 0.000335, batch_cost: 0.2266, reader_cost: 0.00073, ips: 35.3013 samples/sec | ETA 02:47:07 2022-08-23 06:42:19 [INFO] [TRAIN] epoch: 92, iter: 115800/160000, loss: 0.5244, lr: 0.000335, batch_cost: 0.2171, reader_cost: 0.00051, ips: 36.8436 samples/sec | ETA 02:39:57 2022-08-23 06:42:30 [INFO] [TRAIN] epoch: 92, iter: 115850/160000, loss: 0.5280, lr: 0.000334, batch_cost: 0.2135, reader_cost: 0.00068, ips: 37.4732 samples/sec | ETA 02:37:05 2022-08-23 06:42:40 [INFO] [TRAIN] epoch: 92, iter: 115900/160000, loss: 0.5174, lr: 0.000334, batch_cost: 0.1999, reader_cost: 0.00094, ips: 40.0259 samples/sec | ETA 02:26:54 2022-08-23 06:42:49 [INFO] [TRAIN] epoch: 92, iter: 115950/160000, loss: 0.4950, lr: 0.000334, batch_cost: 0.1787, reader_cost: 0.00054, ips: 44.7714 samples/sec | ETA 02:11:11 2022-08-23 06:42:59 [INFO] [TRAIN] epoch: 92, iter: 116000/160000, loss: 0.5147, lr: 0.000333, batch_cost: 0.2096, reader_cost: 0.00096, ips: 38.1601 samples/sec | ETA 02:33:44 2022-08-23 06:42:59 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 172s - batch_cost: 0.1719 - reader cost: 9.2270e-04 2022-08-23 06:45:52 [INFO] [EVAL] #Images: 2000 mIoU: 0.3498 Acc: 0.7687 Kappa: 0.7509 Dice: 0.4813 2022-08-23 06:45:52 [INFO] [EVAL] Class IoU: [0.6892 0.7785 0.9306 0.7298 0.6898 0.7666 0.784 0.7841 0.5191 0.6357 0.4912 0.5592 0.7056 0.3009 0.2894 0.4342 0.5271 0.4329 0.6157 0.4257 0.7345 0.5215 0.6339 0.5068 0.3104 0.2753 0.486 0.4591 0.3921 0.2591 0.2787 0.4762 0.268 0.3735 0.3129 0.3972 0.4329 0.5478 0.257 0.3637 0.1918 0.1126 0.3576 0.2412 0.2846 0.1811 0.3151 0.5052 0.6072 0.5313 0.5594 0.4122 0.2158 0.172 0.6438 0.4368 0.8719 0.4238 0.5204 0.2248 0.0647 0.3986 0.3183 0.2694 0.4605 0.6976 0.2686 0.4119 0.1179 0.3151 0.4833 0.5048 0.4077 0.194 0.4508 0.3585 0.5067 0.306 0.2533 0.1551 0.6037 0.3952 0.3547 0.0219 0.3552 0.5695 0.1175 0.0924 0.1874 0.5262 0.4201 0.0492 0.2179 0.1222 0.0028 0.0184 0.0343 0.1307 0.268 0.4331 0.0373 0.0295 0.3006 0.1261 0.054 0.5906 0.0529 0.5108 0.0998 0.2514 0.1183 0.5317 0.1484 0.5202 0.8158 0.0144 0.2718 0.641 0.1076 0.1594 0.4385 0.0138 0.229 0.186 0.2618 0.2102 0.496 0.382 0.4686 0.3413 0.5908 0.0316 0.2777 0.2644 0.2141 0.1016 0.1471 0.0228 0.1711 0.3638 0.2735 0.0082 0.3544 0.1961 0.2367 0.0009 0.3962 0.0283 0.1923 0.149 ] 2022-08-23 06:45:52 [INFO] [EVAL] Class Precision: [0.783 0.8461 0.9658 0.8289 0.7801 0.8583 0.8797 0.8442 0.6762 0.7838 0.6811 0.716 0.7965 0.4662 0.5528 0.5767 0.7003 0.6448 0.7715 0.6188 0.8116 0.6827 0.7952 0.6435 0.4999 0.6337 0.6013 0.7395 0.6384 0.4061 0.5111 0.6128 0.5095 0.5423 0.4888 0.5734 0.6435 0.727 0.4357 0.6827 0.3676 0.2432 0.6351 0.5806 0.3902 0.4135 0.552 0.7331 0.7206 0.6034 0.7453 0.6096 0.3824 0.492 0.7396 0.6027 0.9176 0.62 0.6681 0.3873 0.1421 0.5577 0.4522 0.6087 0.554 0.8131 0.3537 0.6086 0.2981 0.598 0.6602 0.6349 0.6054 0.2968 0.6697 0.5463 0.7201 0.5341 0.4974 0.4153 0.6977 0.67 0.8035 0.0521 0.544 0.7586 0.389 0.4044 0.2952 0.7687 0.5896 0.0606 0.348 0.3854 0.0137 0.0537 0.3848 0.366 0.4612 0.6597 0.2311 0.0759 0.5127 0.4463 0.7638 0.6544 0.3336 0.7719 0.251 0.4529 0.271 0.8842 0.3933 0.7006 0.8213 0.0726 0.7212 0.778 0.1372 0.5438 0.5561 0.1682 0.6097 0.4936 0.6273 0.5371 0.8309 0.4892 0.8024 0.721 0.668 0.1958 0.4952 0.6516 0.6135 0.2006 0.4569 0.0627 0.4224 0.5903 
0.3482 0.0151 0.6361 0.5815 0.5701 0.0012 0.8618 0.5029 0.6561 0.806 ] 2022-08-23 06:45:52 [INFO] [EVAL] Class Recall: [0.852 0.9069 0.9622 0.8592 0.8563 0.8777 0.8781 0.9168 0.6908 0.771 0.638 0.7186 0.8607 0.4591 0.3779 0.6372 0.6806 0.5685 0.753 0.577 0.8855 0.6884 0.7575 0.7047 0.4503 0.3274 0.717 0.5477 0.5039 0.4172 0.38 0.6812 0.3612 0.5455 0.465 0.5639 0.5694 0.6897 0.3852 0.4377 0.2863 0.1732 0.4501 0.2921 0.5125 0.2438 0.4234 0.619 0.794 0.8163 0.6916 0.5601 0.3313 0.2091 0.8324 0.6133 0.946 0.5725 0.7018 0.3488 0.1062 0.5828 0.5181 0.3258 0.7317 0.8308 0.5275 0.5602 0.1632 0.3998 0.6433 0.7111 0.5553 0.3592 0.5796 0.5104 0.631 0.4174 0.3405 0.1984 0.8175 0.4908 0.3884 0.0363 0.5059 0.6956 0.1442 0.1069 0.3393 0.6253 0.5937 0.2077 0.3681 0.1517 0.0035 0.0272 0.0362 0.169 0.3902 0.5577 0.0426 0.0459 0.4208 0.1495 0.0549 0.8584 0.0592 0.6016 0.142 0.3612 0.1735 0.5715 0.1924 0.6689 0.9919 0.0176 0.3037 0.7845 0.3328 0.184 0.6746 0.0148 0.2683 0.2299 0.31 0.2567 0.5516 0.6353 0.5297 0.3932 0.8365 0.0363 0.3873 0.3079 0.2475 0.1706 0.1783 0.0345 0.2233 0.4867 0.5605 0.0178 0.4446 0.2283 0.2881 0.0039 0.423 0.0291 0.2139 0.1545] 2022-08-23 06:45:52 [INFO] [EVAL] The model with the best validation mIoU (0.3498) was saved at iter 116000. 2022-08-23 06:46:02 [INFO] [TRAIN] epoch: 92, iter: 116050/160000, loss: 0.5656, lr: 0.000333, batch_cost: 0.1962, reader_cost: 0.00347, ips: 40.7765 samples/sec | ETA 02:23:42 2022-08-23 06:46:11 [INFO] [TRAIN] epoch: 92, iter: 116100/160000, loss: 0.4985, lr: 0.000332, batch_cost: 0.1813, reader_cost: 0.00155, ips: 44.1338 samples/sec | ETA 02:12:37 2022-08-23 06:46:20 [INFO] [TRAIN] epoch: 92, iter: 116150/160000, loss: 0.5230, lr: 0.000332, batch_cost: 0.1803, reader_cost: 0.00084, ips: 44.3667 samples/sec | ETA 02:11:46 2022-08-23 06:46:32 [INFO] [TRAIN] epoch: 93, iter: 116200/160000, loss: 0.5167, lr: 0.000332, batch_cost: 0.2467, reader_cost: 0.05493, ips: 32.4337 samples/sec | ETA 03:00:03 2022-08-23 06:46:41 [INFO] [TRAIN] epoch: 93, iter: 116250/160000, loss: 0.4969, lr: 0.000331, batch_cost: 0.1727, reader_cost: 0.00032, ips: 46.3316 samples/sec | ETA 02:05:54 2022-08-23 06:46:50 [INFO] [TRAIN] epoch: 93, iter: 116300/160000, loss: 0.5786, lr: 0.000331, batch_cost: 0.1967, reader_cost: 0.00039, ips: 40.6702 samples/sec | ETA 02:23:15 2022-08-23 06:46:59 [INFO] [TRAIN] epoch: 93, iter: 116350/160000, loss: 0.4789, lr: 0.000330, batch_cost: 0.1758, reader_cost: 0.00084, ips: 45.4954 samples/sec | ETA 02:07:55 2022-08-23 06:47:09 [INFO] [TRAIN] epoch: 93, iter: 116400/160000, loss: 0.5088, lr: 0.000330, batch_cost: 0.1989, reader_cost: 0.00060, ips: 40.2260 samples/sec | ETA 02:24:31 2022-08-23 06:47:18 [INFO] [TRAIN] epoch: 93, iter: 116450/160000, loss: 0.5398, lr: 0.000330, batch_cost: 0.1698, reader_cost: 0.00043, ips: 47.1166 samples/sec | ETA 02:03:14 2022-08-23 06:47:27 [INFO] [TRAIN] epoch: 93, iter: 116500/160000, loss: 0.5111, lr: 0.000329, batch_cost: 0.1865, reader_cost: 0.00030, ips: 42.9000 samples/sec | ETA 02:15:11 2022-08-23 06:47:36 [INFO] [TRAIN] epoch: 93, iter: 116550/160000, loss: 0.5492, lr: 0.000329, batch_cost: 0.1774, reader_cost: 0.00103, ips: 45.1055 samples/sec | ETA 02:08:26 2022-08-23 06:47:45 [INFO] [TRAIN] epoch: 93, iter: 116600/160000, loss: 0.5125, lr: 0.000329, batch_cost: 0.1903, reader_cost: 0.00052, ips: 42.0386 samples/sec | ETA 02:17:39 2022-08-23 06:47:54 [INFO] [TRAIN] epoch: 93, iter: 116650/160000, loss: 0.5247, lr: 0.000328, batch_cost: 0.1814, reader_cost: 0.00090, ips: 44.1087 samples/sec | ETA 
02:11:02 2022-08-23 06:48:04 [INFO] [TRAIN] epoch: 93, iter: 116700/160000, loss: 0.5234, lr: 0.000328, batch_cost: 0.1956, reader_cost: 0.00068, ips: 40.9095 samples/sec | ETA 02:21:07 2022-08-23 06:48:14 [INFO] [TRAIN] epoch: 93, iter: 116750/160000, loss: 0.4925, lr: 0.000327, batch_cost: 0.1923, reader_cost: 0.00083, ips: 41.5924 samples/sec | ETA 02:18:38 2022-08-23 06:48:23 [INFO] [TRAIN] epoch: 93, iter: 116800/160000, loss: 0.5314, lr: 0.000327, batch_cost: 0.1903, reader_cost: 0.00036, ips: 42.0464 samples/sec | ETA 02:16:59 2022-08-23 06:48:33 [INFO] [TRAIN] epoch: 93, iter: 116850/160000, loss: 0.5312, lr: 0.000327, batch_cost: 0.1911, reader_cost: 0.00041, ips: 41.8631 samples/sec | ETA 02:17:25 2022-08-23 06:48:43 [INFO] [TRAIN] epoch: 93, iter: 116900/160000, loss: 0.5206, lr: 0.000326, batch_cost: 0.1989, reader_cost: 0.00054, ips: 40.2178 samples/sec | ETA 02:22:53 2022-08-23 06:48:53 [INFO] [TRAIN] epoch: 93, iter: 116950/160000, loss: 0.5266, lr: 0.000326, batch_cost: 0.2026, reader_cost: 0.00086, ips: 39.4815 samples/sec | ETA 02:25:23 2022-08-23 06:49:05 [INFO] [TRAIN] epoch: 93, iter: 117000/160000, loss: 0.5156, lr: 0.000326, batch_cost: 0.2314, reader_cost: 0.00105, ips: 34.5695 samples/sec | ETA 02:45:50 2022-08-23 06:49:05 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 168s - batch_cost: 0.1681 - reader cost: 8.2613e-04 2022-08-23 06:51:53 [INFO] [EVAL] #Images: 2000 mIoU: 0.3466 Acc: 0.7680 Kappa: 0.7504 Dice: 0.4782 2022-08-23 06:51:53 [INFO] [EVAL] Class IoU: [0.6929 0.7815 0.9303 0.7325 0.6843 0.7666 0.7803 0.7801 0.5231 0.6209 0.4919 0.5656 0.6976 0.2947 0.3117 0.4346 0.5006 0.441 0.6109 0.4272 0.7331 0.5261 0.6262 0.4984 0.329 0.3812 0.498 0.4511 0.4038 0.2463 0.3032 0.4749 0.2625 0.3699 0.3097 0.3984 0.4298 0.5443 0.2647 0.3624 0.2304 0.095 0.361 0.2646 0.2897 0.2083 0.3072 0.5232 0.5541 0.5572 0.5231 0.4021 0.213 0.1665 0.6507 0.4022 0.8718 0.4554 0.433 0.247 0.0624 0.35 0.3121 0.2514 0.4701 0.6714 0.3002 0.4133 0.0912 0.3263 0.5069 0.5173 0.3919 0.2009 0.4552 0.3388 0.5636 0.305 0.2153 0.2092 0.6203 0.3944 0.3651 0.0225 0.2197 0.5772 0.1198 0.093 0.2659 0.4957 0.3973 0.0511 0.1989 0.104 0.0064 0.0488 0.0312 0.109 0.2955 0.4453 0.0485 0.0284 0.2339 0.1834 0.0065 0.5447 0.0936 0.5222 0.0878 0.2226 0.1381 0.5268 0.1374 0.5241 0.7496 0.0148 0.1945 0.6287 0.1118 0.1298 0.4752 0.014 0.2028 0.1388 0.2623 0.2599 0.4908 0.3653 0.4005 0.3489 0.4688 0.0405 0.308 0.2889 0.1671 0.1146 0.1472 0.0477 0.176 0.3602 0.2896 0.0009 0.3891 0.0657 0.2607 0. 
0.3406 0.0239 0.1914 0.1702] 2022-08-23 06:51:53 [INFO] [EVAL] Class Precision: [0.7926 0.8569 0.9663 0.8305 0.7709 0.8635 0.8844 0.8399 0.6635 0.7612 0.7256 0.7354 0.7705 0.4445 0.5304 0.6056 0.7375 0.6613 0.75 0.6114 0.807 0.6838 0.7509 0.6467 0.5034 0.5689 0.6885 0.7618 0.6711 0.3852 0.4777 0.6743 0.4721 0.4888 0.4552 0.5405 0.6191 0.7505 0.4341 0.6094 0.3672 0.2231 0.581 0.5367 0.4153 0.4344 0.4827 0.7198 0.698 0.6561 0.6876 0.5651 0.3486 0.4791 0.6977 0.6145 0.9342 0.5644 0.6867 0.3962 0.1173 0.5311 0.4708 0.6332 0.5862 0.7465 0.4908 0.5819 0.1624 0.5763 0.721 0.7132 0.6267 0.2893 0.6928 0.5135 0.7313 0.5765 0.8165 0.3869 0.7065 0.6538 0.7903 0.081 0.457 0.7581 0.468 0.3861 0.507 0.7912 0.5628 0.0632 0.408 0.4188 0.0141 0.1316 0.2519 0.3577 0.4505 0.6502 0.2794 0.073 0.6378 0.6198 0.2578 0.5861 0.2434 0.8146 0.2255 0.5308 0.3208 0.8775 0.4143 0.7078 0.7538 0.0878 0.5717 0.7442 0.1441 0.5977 0.5733 0.2986 0.598 0.6516 0.6482 0.6354 0.8736 0.4653 0.5421 0.5789 0.5007 0.2696 0.463 0.6028 0.5409 0.3102 0.4988 0.0841 0.513 0.6212 0.3154 0.0014 0.5874 0.5713 0.5619 0. 0.8876 0.527 0.6446 0.7978] 2022-08-23 06:51:53 [INFO] [EVAL] Class Recall: [0.8463 0.8987 0.9614 0.8612 0.859 0.8723 0.8689 0.9164 0.7119 0.7712 0.6043 0.71 0.8806 0.4665 0.4305 0.6063 0.6092 0.5696 0.7671 0.5864 0.889 0.6952 0.7905 0.6849 0.4871 0.536 0.6429 0.5252 0.5034 0.406 0.4536 0.6163 0.3717 0.6033 0.492 0.6024 0.5844 0.6646 0.4041 0.4721 0.3821 0.142 0.488 0.3429 0.4892 0.2859 0.4579 0.657 0.7288 0.787 0.6862 0.5823 0.3538 0.2033 0.9063 0.538 0.9289 0.7023 0.5396 0.396 0.1174 0.5064 0.4807 0.2943 0.7035 0.8697 0.4361 0.5879 0.1722 0.4293 0.6306 0.6531 0.5112 0.3965 0.5704 0.499 0.7107 0.3931 0.2263 0.3129 0.8357 0.4985 0.4042 0.0302 0.2973 0.7075 0.1387 0.1091 0.3587 0.5704 0.5747 0.2108 0.2796 0.1215 0.0117 0.072 0.0343 0.1355 0.4622 0.5855 0.0554 0.0443 0.2697 0.2066 0.0066 0.885 0.132 0.5926 0.1257 0.2771 0.1952 0.5687 0.1705 0.6688 0.9926 0.0175 0.2277 0.802 0.3324 0.1422 0.7354 0.0145 0.2348 0.1499 0.3059 0.3055 0.5283 0.6295 0.6053 0.4676 0.8802 0.0455 0.479 0.3568 0.1947 0.1539 0.1727 0.0994 0.2112 0.4616 0.7797 0.0022 0.5355 0.0691 0.3272 0. 0.3559 0.0244 0.2139 0.1778] 2022-08-23 06:51:53 [INFO] [EVAL] The model with the best validation mIoU (0.3498) was saved at iter 116000. 
2022-08-23 06:52:04 [INFO] [TRAIN] epoch: 93, iter: 117050/160000, loss: 0.5326, lr: 0.000325, batch_cost: 0.2129, reader_cost: 0.00441, ips: 37.5718 samples/sec | ETA 02:32:25 2022-08-23 06:52:15 [INFO] [TRAIN] epoch: 93, iter: 117100/160000, loss: 0.5327, lr: 0.000325, batch_cost: 0.2241, reader_cost: 0.00120, ips: 35.6984 samples/sec | ETA 02:40:13 2022-08-23 06:52:25 [INFO] [TRAIN] epoch: 93, iter: 117150/160000, loss: 0.5081, lr: 0.000324, batch_cost: 0.2105, reader_cost: 0.00087, ips: 38.0095 samples/sec | ETA 02:30:18 2022-08-23 06:52:34 [INFO] [TRAIN] epoch: 93, iter: 117200/160000, loss: 0.5388, lr: 0.000324, batch_cost: 0.1796, reader_cost: 0.00041, ips: 44.5329 samples/sec | ETA 02:08:08 2022-08-23 06:52:45 [INFO] [TRAIN] epoch: 93, iter: 117250/160000, loss: 0.5138, lr: 0.000324, batch_cost: 0.2044, reader_cost: 0.00104, ips: 39.1420 samples/sec | ETA 02:25:37 2022-08-23 06:52:56 [INFO] [TRAIN] epoch: 93, iter: 117300/160000, loss: 0.5544, lr: 0.000323, batch_cost: 0.2175, reader_cost: 0.00105, ips: 36.7857 samples/sec | ETA 02:34:46 2022-08-23 06:53:05 [INFO] [TRAIN] epoch: 93, iter: 117350/160000, loss: 0.5061, lr: 0.000323, batch_cost: 0.1904, reader_cost: 0.00096, ips: 42.0072 samples/sec | ETA 02:15:22 2022-08-23 06:53:16 [INFO] [TRAIN] epoch: 93, iter: 117400/160000, loss: 0.5508, lr: 0.000323, batch_cost: 0.2136, reader_cost: 0.00100, ips: 37.4463 samples/sec | ETA 02:31:41 2022-08-23 06:53:27 [INFO] [TRAIN] epoch: 93, iter: 117450/160000, loss: 0.5027, lr: 0.000322, batch_cost: 0.2254, reader_cost: 0.00056, ips: 35.4906 samples/sec | ETA 02:39:51 2022-08-23 06:53:40 [INFO] [TRAIN] epoch: 94, iter: 117500/160000, loss: 0.5018, lr: 0.000322, batch_cost: 0.2550, reader_cost: 0.05652, ips: 31.3668 samples/sec | ETA 03:00:39 2022-08-23 06:53:48 [INFO] [TRAIN] epoch: 94, iter: 117550/160000, loss: 0.5331, lr: 0.000321, batch_cost: 0.1605, reader_cost: 0.00119, ips: 49.8316 samples/sec | ETA 01:53:34 2022-08-23 06:53:56 [INFO] [TRAIN] epoch: 94, iter: 117600/160000, loss: 0.5067, lr: 0.000321, batch_cost: 0.1627, reader_cost: 0.00395, ips: 49.1802 samples/sec | ETA 01:54:57 2022-08-23 06:54:05 [INFO] [TRAIN] epoch: 94, iter: 117650/160000, loss: 0.4964, lr: 0.000321, batch_cost: 0.1838, reader_cost: 0.00088, ips: 43.5295 samples/sec | ETA 02:09:43 2022-08-23 06:54:14 [INFO] [TRAIN] epoch: 94, iter: 117700/160000, loss: 0.5352, lr: 0.000320, batch_cost: 0.1849, reader_cost: 0.00046, ips: 43.2761 samples/sec | ETA 02:10:19 2022-08-23 06:54:23 [INFO] [TRAIN] epoch: 94, iter: 117750/160000, loss: 0.4835, lr: 0.000320, batch_cost: 0.1724, reader_cost: 0.00070, ips: 46.4163 samples/sec | ETA 02:01:21 2022-08-23 06:54:32 [INFO] [TRAIN] epoch: 94, iter: 117800/160000, loss: 0.5484, lr: 0.000320, batch_cost: 0.1805, reader_cost: 0.00092, ips: 44.3170 samples/sec | ETA 02:06:57 2022-08-23 06:54:42 [INFO] [TRAIN] epoch: 94, iter: 117850/160000, loss: 0.5112, lr: 0.000319, batch_cost: 0.1902, reader_cost: 0.00052, ips: 42.0717 samples/sec | ETA 02:13:34 2022-08-23 06:54:51 [INFO] [TRAIN] epoch: 94, iter: 117900/160000, loss: 0.4866, lr: 0.000319, batch_cost: 0.1825, reader_cost: 0.00062, ips: 43.8274 samples/sec | ETA 02:08:04 2022-08-23 06:54:59 [INFO] [TRAIN] epoch: 94, iter: 117950/160000, loss: 0.4762, lr: 0.000318, batch_cost: 0.1705, reader_cost: 0.00245, ips: 46.9125 samples/sec | ETA 01:59:30 2022-08-23 06:55:09 [INFO] [TRAIN] epoch: 94, iter: 118000/160000, loss: 0.4768, lr: 0.000318, batch_cost: 0.2006, reader_cost: 0.00049, ips: 39.8871 samples/sec | ETA 02:20:23 2022-08-23 
06:55:09 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 176s - batch_cost: 0.1758 - reader cost: 5.7788e-04 2022-08-23 06:58:05 [INFO] [EVAL] #Images: 2000 mIoU: 0.3490 Acc: 0.7677 Kappa: 0.7501 Dice: 0.4808 2022-08-23 06:58:05 [INFO] [EVAL] Class IoU: [0.6922 0.7844 0.9313 0.7281 0.6795 0.7654 0.7797 0.7827 0.5272 0.6031 0.495 0.5566 0.698 0.2871 0.3321 0.4242 0.503 0.4304 0.6182 0.4079 0.7392 0.5086 0.6192 0.4981 0.2831 0.4436 0.4549 0.4616 0.4039 0.2559 0.2853 0.4559 0.2588 0.3758 0.3051 0.3761 0.4421 0.553 0.2696 0.4023 0.1895 0.1011 0.3627 0.2603 0.2837 0.1508 0.3053 0.5162 0.5759 0.5647 0.5258 0.3845 0.213 0.1986 0.6467 0.4173 0.87 0.3554 0.5125 0.2241 0.0555 0.3306 0.2732 0.2311 0.4625 0.6882 0.2872 0.3752 0.1101 0.3262 0.502 0.4925 0.3926 0.1932 0.4445 0.3479 0.4819 0.2952 0.2352 0.1986 0.6764 0.415 0.3381 0.0261 0.2979 0.5642 0.1174 0.0778 0.2768 0.5129 0.3863 0.0483 0.2235 0.1218 0.0046 0.0165 0.0272 0.0984 0.2927 0.4094 0.0966 0.0246 0.2945 0.5977 0.0325 0.5555 0.0473 0.5082 0.0897 0.2457 0.1194 0.5205 0.129 0.5371 0.7346 0.0129 0.3478 0.6519 0.0984 0.1938 0.4532 0.0154 0.2193 0.1152 0.2608 0.2059 0.4836 0.4225 0.3221 0.3195 0.5676 0.0574 0.2782 0.2834 0.1676 0.107 0.1682 0.0521 0.1908 0.3427 0.3221 0.0155 0.3521 0.0368 0.2459 0. 0.3357 0.0263 0.2338 0.1779] 2022-08-23 06:58:05 [INFO] [EVAL] Class Precision: [0.7971 0.8669 0.9693 0.8089 0.7493 0.844 0.8709 0.8509 0.6819 0.7538 0.7238 0.7377 0.7838 0.4874 0.5336 0.5703 0.6707 0.6559 0.7579 0.6627 0.8214 0.6553 0.7439 0.6162 0.5363 0.6159 0.6248 0.7471 0.7054 0.3624 0.4603 0.6081 0.4938 0.4947 0.4797 0.5296 0.6231 0.7824 0.4601 0.557 0.3588 0.2672 0.5817 0.5202 0.3931 0.3921 0.4733 0.7297 0.6952 0.6395 0.7553 0.5722 0.3282 0.5065 0.6606 0.636 0.9348 0.7335 0.7212 0.4069 0.118 0.531 0.3257 0.6864 0.565 0.7987 0.3943 0.5962 0.2071 0.5729 0.6518 0.6118 0.6159 0.2553 0.7402 0.4984 0.5911 0.5273 0.8631 0.5263 0.7979 0.6188 0.8074 0.0625 0.5026 0.7484 0.3592 0.3978 0.48 0.736 0.5598 0.0696 0.5041 0.3656 0.0157 0.0708 0.463 0.4065 0.4579 0.6856 0.369 0.0576 0.5934 0.75 0.7492 0.5884 0.1524 0.8352 0.2587 0.5887 0.2958 0.8392 0.4563 0.6463 0.7378 0.0653 0.6151 0.7942 0.139 0.6022 0.5361 0.4559 0.6905 0.7023 0.6627 0.6122 0.8467 0.5727 0.5606 0.5286 0.6336 0.3626 0.3895 0.6661 0.6248 0.2204 0.3492 0.0881 0.5329 0.6507 0.3626 0.023 0.6401 0.3483 0.5415 0. 0.8737 0.3375 0.6279 0.858 ] 2022-08-23 06:58:05 [INFO] [EVAL] Class Recall: [0.8402 0.8919 0.9596 0.8794 0.8795 0.8915 0.8816 0.9072 0.6991 0.751 0.6103 0.6939 0.8644 0.4112 0.4678 0.6235 0.6679 0.556 0.7703 0.5148 0.8808 0.6944 0.7869 0.7222 0.3749 0.6132 0.626 0.5471 0.4858 0.4656 0.4287 0.6455 0.3523 0.6099 0.4561 0.5647 0.6035 0.6535 0.3943 0.5917 0.2866 0.1399 0.4907 0.3424 0.5047 0.1969 0.4623 0.6382 0.7704 0.8284 0.6338 0.5397 0.3776 0.2463 0.9685 0.5483 0.9261 0.4081 0.6391 0.3327 0.0949 0.4671 0.6289 0.2584 0.7183 0.8325 0.514 0.503 0.1901 0.4311 0.686 0.7163 0.5199 0.4424 0.5267 0.5355 0.7228 0.4015 0.2443 0.2418 0.8161 0.5575 0.3677 0.0429 0.4224 0.6962 0.1484 0.0881 0.3954 0.6285 0.5549 0.1361 0.2865 0.1545 0.0065 0.0211 0.0281 0.1149 0.448 0.504 0.1156 0.0412 0.3689 0.7465 0.0329 0.9087 0.0642 0.5649 0.1208 0.2966 0.1668 0.5782 0.1524 0.7608 0.9941 0.0159 0.4446 0.7844 0.2522 0.2223 0.7455 0.0157 0.2433 0.1212 0.3007 0.2367 0.53 0.6171 0.4309 0.4469 0.8449 0.0638 0.4932 0.3304 0.1863 0.1721 0.245 0.1132 0.2291 0.4199 0.7429 0.0452 0.439 0.0395 0.3106 0. 
0.3528 0.0278 0.2714 0.1833] 2022-08-23 06:58:05 [INFO] [EVAL] The model with the best validation mIoU (0.3498) was saved at iter 116000. 2022-08-23 06:58:15 [INFO] [TRAIN] epoch: 94, iter: 118050/160000, loss: 0.4997, lr: 0.000318, batch_cost: 0.1811, reader_cost: 0.00585, ips: 44.1634 samples/sec | ETA 02:06:39 2022-08-23 06:58:24 [INFO] [TRAIN] epoch: 94, iter: 118100/160000, loss: 0.5154, lr: 0.000317, batch_cost: 0.1855, reader_cost: 0.00400, ips: 43.1151 samples/sec | ETA 02:09:34 2022-08-23 06:58:33 [INFO] [TRAIN] epoch: 94, iter: 118150/160000, loss: 0.5248, lr: 0.000317, batch_cost: 0.1809, reader_cost: 0.00076, ips: 44.2134 samples/sec | ETA 02:06:12 2022-08-23 06:58:43 [INFO] [TRAIN] epoch: 94, iter: 118200/160000, loss: 0.5044, lr: 0.000316, batch_cost: 0.1991, reader_cost: 0.00041, ips: 40.1769 samples/sec | ETA 02:18:43 2022-08-23 06:58:54 [INFO] [TRAIN] epoch: 94, iter: 118250/160000, loss: 0.4860, lr: 0.000316, batch_cost: 0.2203, reader_cost: 0.00082, ips: 36.3184 samples/sec | ETA 02:33:16 2022-08-23 06:59:04 [INFO] [TRAIN] epoch: 94, iter: 118300/160000, loss: 0.5306, lr: 0.000316, batch_cost: 0.1999, reader_cost: 0.00076, ips: 40.0189 samples/sec | ETA 02:18:56 2022-08-23 06:59:14 [INFO] [TRAIN] epoch: 94, iter: 118350/160000, loss: 0.5178, lr: 0.000315, batch_cost: 0.1960, reader_cost: 0.00042, ips: 40.8167 samples/sec | ETA 02:16:03 2022-08-23 06:59:24 [INFO] [TRAIN] epoch: 94, iter: 118400/160000, loss: 0.5152, lr: 0.000315, batch_cost: 0.2047, reader_cost: 0.00134, ips: 39.0822 samples/sec | ETA 02:21:55 2022-08-23 06:59:34 [INFO] [TRAIN] epoch: 94, iter: 118450/160000, loss: 0.5648, lr: 0.000315, batch_cost: 0.2088, reader_cost: 0.00049, ips: 38.3168 samples/sec | ETA 02:24:35 2022-08-23 06:59:44 [INFO] [TRAIN] epoch: 94, iter: 118500/160000, loss: 0.5430, lr: 0.000314, batch_cost: 0.2003, reader_cost: 0.00074, ips: 39.9330 samples/sec | ETA 02:18:33 2022-08-23 06:59:55 [INFO] [TRAIN] epoch: 94, iter: 118550/160000, loss: 0.5092, lr: 0.000314, batch_cost: 0.2126, reader_cost: 0.00075, ips: 37.6251 samples/sec | ETA 02:26:53 2022-08-23 07:00:04 [INFO] [TRAIN] epoch: 94, iter: 118600/160000, loss: 0.5722, lr: 0.000313, batch_cost: 0.1896, reader_cost: 0.00082, ips: 42.1853 samples/sec | ETA 02:10:51 2022-08-23 07:00:13 [INFO] [TRAIN] epoch: 94, iter: 118650/160000, loss: 0.5448, lr: 0.000313, batch_cost: 0.1799, reader_cost: 0.00068, ips: 44.4783 samples/sec | ETA 02:03:57 2022-08-23 07:00:24 [INFO] [TRAIN] epoch: 94, iter: 118700/160000, loss: 0.5296, lr: 0.000313, batch_cost: 0.2184, reader_cost: 0.00065, ips: 36.6265 samples/sec | ETA 02:30:20 2022-08-23 07:00:38 [INFO] [TRAIN] epoch: 95, iter: 118750/160000, loss: 0.5134, lr: 0.000312, batch_cost: 0.2644, reader_cost: 0.06531, ips: 30.2515 samples/sec | ETA 03:01:48 2022-08-23 07:00:49 [INFO] [TRAIN] epoch: 95, iter: 118800/160000, loss: 0.4927, lr: 0.000312, batch_cost: 0.2201, reader_cost: 0.00060, ips: 36.3539 samples/sec | ETA 02:31:06 2022-08-23 07:01:00 [INFO] [TRAIN] epoch: 95, iter: 118850/160000, loss: 0.5239, lr: 0.000312, batch_cost: 0.2212, reader_cost: 0.00052, ips: 36.1587 samples/sec | ETA 02:31:44 2022-08-23 07:01:08 [INFO] [TRAIN] epoch: 95, iter: 118900/160000, loss: 0.4782, lr: 0.000311, batch_cost: 0.1736, reader_cost: 0.00057, ips: 46.0759 samples/sec | ETA 01:58:56 2022-08-23 07:01:19 [INFO] [TRAIN] epoch: 95, iter: 118950/160000, loss: 0.4685, lr: 0.000311, batch_cost: 0.2041, reader_cost: 0.00043, ips: 39.1948 samples/sec | ETA 02:19:38 2022-08-23 07:01:28 [INFO] [TRAIN] epoch: 95, iter: 
119000/160000, loss: 0.4806, lr: 0.000310, batch_cost: 0.1895, reader_cost: 0.00072, ips: 42.2180 samples/sec | ETA 02:09:29 2022-08-23 07:01:28 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 133s - batch_cost: 0.1326 - reader cost: 6.6310e-04 2022-08-23 07:03:41 [INFO] [EVAL] #Images: 2000 mIoU: 0.3473 Acc: 0.7683 Kappa: 0.7503 Dice: 0.4793 2022-08-23 07:03:41 [INFO] [EVAL] Class IoU: [0.692 0.7873 0.9302 0.7253 0.6825 0.763 0.7778 0.7824 0.5306 0.6079 0.4851 0.5483 0.7022 0.2967 0.3193 0.4296 0.4989 0.4167 0.6185 0.412 0.7323 0.4737 0.6384 0.4989 0.3375 0.4119 0.4454 0.4421 0.3762 0.244 0.2739 0.4574 0.2712 0.3693 0.2781 0.4025 0.4377 0.5487 0.2714 0.3926 0.2167 0.1062 0.35 0.2633 0.2918 0.1877 0.3139 0.5215 0.6104 0.5949 0.519 0.2826 0.1929 0.2183 0.6469 0.4448 0.8764 0.3085 0.4859 0.224 0.0868 0.4331 0.3361 0.2486 0.4464 0.6988 0.281 0.4086 0.1136 0.3334 0.49 0.5047 0.4057 0.1931 0.4594 0.3228 0.5604 0.2655 0.259 0.1577 0.6263 0.3885 0.3408 0.0216 0.1694 0.5643 0.088 0.0957 0.2691 0.5205 0.4006 0.0425 0.2315 0.1112 0.0017 0.0193 0.0307 0.1222 0.3054 0.4152 0.0976 0.0343 0.2407 0.2257 0.0834 0.4551 0.0366 0.5217 0.0949 0.2186 0.1409 0.4576 0.1226 0.5514 0.8093 0.01 0.3025 0.626 0.0975 0.1697 0.4511 0.0133 0.2223 0.1443 0.3348 0.2661 0.4836 0.4193 0.3642 0.349 0.576 0.0303 0.2701 0.2867 0.1434 0.1214 0.1426 0.0629 0.21 0.3574 0.325 0.0078 0.3752 0.0477 0.2928 0.0002 0.3427 0.028 0.2201 0.1731] 2022-08-23 07:03:41 [INFO] [EVAL] Class Precision: [0.7812 0.8661 0.965 0.8098 0.7622 0.8484 0.8607 0.8444 0.6852 0.7529 0.6938 0.7238 0.7753 0.4948 0.5407 0.5941 0.7061 0.6914 0.7862 0.6477 0.8165 0.6928 0.773 0.6178 0.5176 0.7132 0.5624 0.79 0.6791 0.3512 0.4574 0.6201 0.5207 0.5205 0.4592 0.542 0.6287 0.7788 0.4657 0.651 0.3861 0.2606 0.6005 0.5286 0.3752 0.3747 0.5078 0.7407 0.6856 0.6798 0.7459 0.3658 0.3747 0.5306 0.6617 0.6531 0.9275 0.7604 0.7166 0.4606 0.1335 0.5653 0.5281 0.6765 0.5266 0.8096 0.3819 0.6015 0.2527 0.5812 0.7186 0.6257 0.5836 0.2817 0.6864 0.497 0.7248 0.5603 0.6834 0.4295 0.7256 0.7066 0.8125 0.0541 0.3826 0.7623 0.4256 0.4178 0.5249 0.7354 0.5631 0.0515 0.4078 0.3219 0.0066 0.0584 0.381 0.3489 0.5256 0.6649 0.394 0.085 0.6011 0.6687 0.8566 0.4956 0.2354 0.8558 0.259 0.5137 0.3189 0.6912 0.4612 0.7118 0.8145 0.0817 0.6264 0.7427 0.1643 0.6428 0.5981 0.2406 0.7515 0.6511 0.6892 0.5894 0.8142 0.5699 0.6728 0.6605 0.646 0.4806 0.4948 0.602 0.4997 0.2593 0.4725 0.1139 0.431 0.6453 0.3623 0.0119 0.6655 0.3424 0.553 0.0004 0.8515 0.4735 0.6515 0.7399] 2022-08-23 07:03:41 [INFO] [EVAL] Class Recall: [0.8584 0.8965 0.9626 0.8743 0.8671 0.8834 0.8898 0.9142 0.7017 0.7594 0.6172 0.6933 0.8817 0.4257 0.4381 0.6081 0.6297 0.5119 0.7436 0.5311 0.8766 0.5997 0.7856 0.7217 0.4923 0.4937 0.6816 0.501 0.4575 0.4444 0.4056 0.6354 0.3614 0.5597 0.4135 0.61 0.5902 0.65 0.3941 0.4973 0.3307 0.1519 0.4563 0.344 0.5676 0.2734 0.4512 0.6379 0.8477 0.8266 0.6305 0.5542 0.2844 0.2705 0.9666 0.5824 0.9408 0.3417 0.6015 0.3037 0.1987 0.6492 0.4805 0.2822 0.7455 0.8362 0.5156 0.5603 0.1711 0.4389 0.6063 0.723 0.5709 0.3804 0.5814 0.4795 0.7119 0.3354 0.2943 0.1996 0.8206 0.4633 0.3699 0.0346 0.233 0.6848 0.0999 0.1104 0.3557 0.6405 0.5814 0.195 0.3486 0.1452 0.0023 0.028 0.0323 0.1583 0.4216 0.5251 0.1149 0.0543 0.2864 0.2541 0.0845 0.8478 0.0415 0.572 0.1302 0.2757 0.2015 0.5752 0.1431 0.7099 0.9922 0.0113 0.3691 0.7994 0.1933 0.1874 0.6472 0.0139 0.24 0.1564 0.3944 0.3267 0.5436 0.6134 0.4426 0.4252 0.8416 0.0313 0.373 0.3537 0.1674 0.1858 0.1696 0.1232 0.2906 0.4448 
0.7595 0.0218 0.4624 0.0525 0.3836 0.0003 0.3645 0.0289 0.2495 0.1843] 2022-08-23 07:03:41 [INFO] [EVAL] The model with the best validation mIoU (0.3498) was saved at iter 116000. 2022-08-23 07:03:50 [INFO] [TRAIN] epoch: 95, iter: 119050/160000, loss: 0.4805, lr: 0.000310, batch_cost: 0.1733, reader_cost: 0.00382, ips: 46.1594 samples/sec | ETA 01:58:17 2022-08-23 07:04:00 [INFO] [TRAIN] epoch: 95, iter: 119100/160000, loss: 0.4676, lr: 0.000310, batch_cost: 0.1994, reader_cost: 0.00094, ips: 40.1165 samples/sec | ETA 02:15:56 2022-08-23 07:04:09 [INFO] [TRAIN] epoch: 95, iter: 119150/160000, loss: 0.4877, lr: 0.000309, batch_cost: 0.1961, reader_cost: 0.00061, ips: 40.7952 samples/sec | ETA 02:13:30 2022-08-23 07:04:18 [INFO] [TRAIN] epoch: 95, iter: 119200/160000, loss: 0.5084, lr: 0.000309, batch_cost: 0.1819, reader_cost: 0.00077, ips: 43.9863 samples/sec | ETA 02:03:40 2022-08-23 07:04:28 [INFO] [TRAIN] epoch: 95, iter: 119250/160000, loss: 0.5159, lr: 0.000309, batch_cost: 0.1951, reader_cost: 0.00128, ips: 40.9965 samples/sec | ETA 02:12:31 2022-08-23 07:04:39 [INFO] [TRAIN] epoch: 95, iter: 119300/160000, loss: 0.5298, lr: 0.000308, batch_cost: 0.2233, reader_cost: 0.00141, ips: 35.8322 samples/sec | ETA 02:31:26 2022-08-23 07:04:49 [INFO] [TRAIN] epoch: 95, iter: 119350/160000, loss: 0.5249, lr: 0.000308, batch_cost: 0.2016, reader_cost: 0.00068, ips: 39.6823 samples/sec | ETA 02:16:35 2022-08-23 07:04:59 [INFO] [TRAIN] epoch: 95, iter: 119400/160000, loss: 0.5154, lr: 0.000307, batch_cost: 0.1963, reader_cost: 0.00136, ips: 40.7509 samples/sec | ETA 02:12:50 2022-08-23 07:05:09 [INFO] [TRAIN] epoch: 95, iter: 119450/160000, loss: 0.5075, lr: 0.000307, batch_cost: 0.1858, reader_cost: 0.00080, ips: 43.0608 samples/sec | ETA 02:05:33 2022-08-23 07:05:19 [INFO] [TRAIN] epoch: 95, iter: 119500/160000, loss: 0.4967, lr: 0.000307, batch_cost: 0.2007, reader_cost: 0.00067, ips: 39.8564 samples/sec | ETA 02:15:29 2022-08-23 07:05:28 [INFO] [TRAIN] epoch: 95, iter: 119550/160000, loss: 0.5188, lr: 0.000306, batch_cost: 0.1945, reader_cost: 0.00052, ips: 41.1359 samples/sec | ETA 02:11:06 2022-08-23 07:05:38 [INFO] [TRAIN] epoch: 95, iter: 119600/160000, loss: 0.4813, lr: 0.000306, batch_cost: 0.1855, reader_cost: 0.00064, ips: 43.1184 samples/sec | ETA 02:04:55 2022-08-23 07:05:48 [INFO] [TRAIN] epoch: 95, iter: 119650/160000, loss: 0.5101, lr: 0.000305, batch_cost: 0.2104, reader_cost: 0.00106, ips: 38.0317 samples/sec | ETA 02:21:27 2022-08-23 07:05:58 [INFO] [TRAIN] epoch: 95, iter: 119700/160000, loss: 0.4897, lr: 0.000305, batch_cost: 0.1904, reader_cost: 0.00053, ips: 42.0131 samples/sec | ETA 02:07:53 2022-08-23 07:06:07 [INFO] [TRAIN] epoch: 95, iter: 119750/160000, loss: 0.4970, lr: 0.000305, batch_cost: 0.1863, reader_cost: 0.00056, ips: 42.9420 samples/sec | ETA 02:04:58 2022-08-23 07:06:19 [INFO] [TRAIN] epoch: 95, iter: 119800/160000, loss: 0.4705, lr: 0.000304, batch_cost: 0.2343, reader_cost: 0.00065, ips: 34.1458 samples/sec | ETA 02:36:58 2022-08-23 07:06:29 [INFO] [TRAIN] epoch: 95, iter: 119850/160000, loss: 0.5033, lr: 0.000304, batch_cost: 0.2074, reader_cost: 0.00033, ips: 38.5661 samples/sec | ETA 02:18:48 2022-08-23 07:06:39 [INFO] [TRAIN] epoch: 95, iter: 119900/160000, loss: 0.5043, lr: 0.000304, batch_cost: 0.1993, reader_cost: 0.00074, ips: 40.1410 samples/sec | ETA 02:13:11 2022-08-23 07:06:49 [INFO] [TRAIN] epoch: 95, iter: 119950/160000, loss: 0.5340, lr: 0.000303, batch_cost: 0.1955, reader_cost: 0.00059, ips: 40.9170 samples/sec | ETA 02:10:30 2022-08-23 
07:07:02 [INFO] [TRAIN] epoch: 96, iter: 120000/160000, loss: 0.5239, lr: 0.000303, batch_cost: 0.2663, reader_cost: 0.03980, ips: 30.0365 samples/sec | ETA 02:57:33 2022-08-23 07:07:02 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 164s - batch_cost: 0.1644 - reader cost: 9.0266e-04 2022-08-23 07:09:47 [INFO] [EVAL] #Images: 2000 mIoU: 0.3512 Acc: 0.7697 Kappa: 0.7520 Dice: 0.4834 2022-08-23 07:09:47 [INFO] [EVAL] Class IoU: [0.6911 0.7826 0.9312 0.7329 0.6881 0.7648 0.7798 0.7867 0.5297 0.6248 0.4904 0.5649 0.697 0.3201 0.3028 0.4309 0.5247 0.4351 0.6137 0.412 0.7467 0.4946 0.6376 0.4978 0.333 0.4147 0.4664 0.4537 0.3887 0.2352 0.2897 0.472 0.2662 0.3677 0.2876 0.3928 0.4348 0.5609 0.2587 0.4093 0.2239 0.1204 0.3584 0.2582 0.2827 0.1811 0.3027 0.5212 0.6182 0.5533 0.537 0.3595 0.2113 0.2055 0.669 0.4193 0.87 0.3954 0.4841 0.2525 0.0589 0.4434 0.3162 0.3106 0.4485 0.703 0.3045 0.3992 0.1023 0.3233 0.5208 0.5128 0.3799 0.1906 0.4606 0.325 0.5537 0.249 0.2521 0.1161 0.6077 0.41 0.3826 0.0277 0.292 0.5708 0.1069 0.0999 0.2138 0.5146 0.4297 0.0761 0.2081 0.1278 0.0009 0.0256 0.0283 0.1159 0.2741 0.4003 0.1513 0.0325 0.2966 0.1207 0.0486 0.489 0.0567 0.5235 0.0787 0.2382 0.1125 0.5198 0.1272 0.5427 0.851 0.0163 0.3199 0.5916 0.1028 0.2328 0.4293 0.015 0.2188 0.149 0.2369 0.3108 0.4743 0.4285 0.2624 0.3589 0.5603 0.0525 0.264 0.3032 0.1791 0.1287 0.1469 0.0401 0.2086 0.3726 0.3928 0.0008 0.3484 0.0507 0.2976 0.0063 0.3592 0.0235 0.2137 0.1819] 2022-08-23 07:09:47 [INFO] [EVAL] Class Precision: [0.78 0.8623 0.9683 0.8256 0.7686 0.8721 0.8798 0.8461 0.7053 0.751 0.7007 0.7141 0.7696 0.5192 0.5389 0.5868 0.7325 0.6955 0.7817 0.6381 0.8342 0.6544 0.7781 0.634 0.5134 0.6331 0.5876 0.7455 0.6946 0.3465 0.4199 0.6413 0.4671 0.5492 0.454 0.5623 0.6494 0.724 0.4292 0.6121 0.395 0.246 0.6736 0.5334 0.3952 0.4005 0.5101 0.7397 0.6868 0.6225 0.7249 0.4928 0.3752 0.5479 0.6894 0.5531 0.9134 0.6903 0.7036 0.4465 0.1248 0.5982 0.4885 0.5591 0.5384 0.8217 0.4471 0.53 0.1879 0.5316 0.6688 0.6171 0.6338 0.3338 0.6966 0.5286 0.6967 0.5428 0.7268 0.4065 0.679 0.6691 0.7708 0.0825 0.4655 0.7466 0.4108 0.3642 0.3364 0.7295 0.6086 0.107 0.3087 0.3841 0.0039 0.0785 0.4167 0.2778 0.5762 0.6595 0.4536 0.1042 0.5881 0.5706 0.8146 0.5706 0.2011 0.8281 0.2249 0.5812 0.3094 0.7781 0.4756 0.7997 0.8586 0.0764 0.5956 0.6889 0.1348 0.7198 0.5271 0.2044 0.7256 0.6211 0.6642 0.656 0.921 0.6312 0.4219 0.6205 0.6193 0.3381 0.4798 0.4948 0.574 0.2794 0.4369 0.0821 0.5252 0.6066 0.4788 0.0014 0.6577 0.311 0.6167 0.0084 0.8362 0.4589 0.6628 0.7428] 2022-08-23 07:09:47 [INFO] [EVAL] Class Recall: [0.8584 0.8944 0.9605 0.8672 0.8678 0.8615 0.8727 0.918 0.6803 0.7881 0.6204 0.7299 0.8808 0.4549 0.4087 0.6185 0.649 0.5375 0.7407 0.5376 0.8769 0.6695 0.7792 0.6985 0.4866 0.5458 0.6933 0.5368 0.4688 0.4226 0.4829 0.6412 0.3824 0.5266 0.4397 0.5657 0.5681 0.7134 0.3943 0.5526 0.3406 0.1908 0.4337 0.3334 0.4984 0.2485 0.4268 0.6382 0.8608 0.8327 0.6744 0.5705 0.3261 0.2475 0.9576 0.6342 0.9482 0.4806 0.6081 0.3676 0.1002 0.6316 0.4728 0.4113 0.7287 0.8295 0.4883 0.618 0.1835 0.4521 0.7018 0.7521 0.4867 0.3076 0.5763 0.4577 0.7295 0.3151 0.2785 0.1398 0.8527 0.5143 0.4318 0.04 0.4393 0.7079 0.1262 0.121 0.3697 0.636 0.5938 0.2084 0.3897 0.1607 0.0012 0.0367 0.0295 0.1658 0.3433 0.5046 0.185 0.045 0.3744 0.1328 0.0491 0.7738 0.0732 0.5873 0.1081 0.2876 0.1502 0.6102 0.1479 0.6281 0.9897 0.0203 0.4087 0.8073 0.3022 0.256 0.6983 0.016 0.2386 0.164 0.2692 0.3713 0.4944 0.5716 0.4097 0.4598 0.8548 0.0585 0.3699 
0.4391 0.2066 0.1927 0.1812 0.0727 0.2571 0.4913 0.6863 0.0021 0.4256 0.0571 0.3652 0.0248 0.3864 0.0242 0.2397 0.1941] 2022-08-23 07:09:47 [INFO] [EVAL] The model with the best validation mIoU (0.3512) was saved at iter 120000. 2022-08-23 07:09:55 [INFO] [TRAIN] epoch: 96, iter: 120050/160000, loss: 0.5224, lr: 0.000302, batch_cost: 0.1707, reader_cost: 0.00377, ips: 46.8683 samples/sec | ETA 01:53:39 2022-08-23 07:10:03 [INFO] [TRAIN] epoch: 96, iter: 120100/160000, loss: 0.5028, lr: 0.000302, batch_cost: 0.1603, reader_cost: 0.00083, ips: 49.9137 samples/sec | ETA 01:46:35 2022-08-23 07:10:12 [INFO] [TRAIN] epoch: 96, iter: 120150/160000, loss: 0.5099, lr: 0.000302, batch_cost: 0.1688, reader_cost: 0.00066, ips: 47.3822 samples/sec | ETA 01:52:08 2022-08-23 07:10:20 [INFO] [TRAIN] epoch: 96, iter: 120200/160000, loss: 0.4959, lr: 0.000301, batch_cost: 0.1593, reader_cost: 0.00044, ips: 50.2194 samples/sec | ETA 01:45:40 2022-08-23 07:10:28 [INFO] [TRAIN] epoch: 96, iter: 120250/160000, loss: 0.4824, lr: 0.000301, batch_cost: 0.1610, reader_cost: 0.00048, ips: 49.6979 samples/sec | ETA 01:46:38 2022-08-23 07:10:37 [INFO] [TRAIN] epoch: 96, iter: 120300/160000, loss: 0.4818, lr: 0.000301, batch_cost: 0.1842, reader_cost: 0.00041, ips: 43.4221 samples/sec | ETA 02:01:54 2022-08-23 07:10:47 [INFO] [TRAIN] epoch: 96, iter: 120350/160000, loss: 0.5513, lr: 0.000300, batch_cost: 0.1921, reader_cost: 0.00059, ips: 41.6427 samples/sec | ETA 02:06:57 2022-08-23 07:10:55 [INFO] [TRAIN] epoch: 96, iter: 120400/160000, loss: 0.5464, lr: 0.000300, batch_cost: 0.1648, reader_cost: 0.00066, ips: 48.5563 samples/sec | ETA 01:48:44 2022-08-23 07:11:05 [INFO] [TRAIN] epoch: 96, iter: 120450/160000, loss: 0.5568, lr: 0.000299, batch_cost: 0.2071, reader_cost: 0.00070, ips: 38.6203 samples/sec | ETA 02:16:32 2022-08-23 07:11:15 [INFO] [TRAIN] epoch: 96, iter: 120500/160000, loss: 0.5106, lr: 0.000299, batch_cost: 0.1967, reader_cost: 0.00047, ips: 40.6760 samples/sec | ETA 02:09:28 2022-08-23 07:11:25 [INFO] [TRAIN] epoch: 96, iter: 120550/160000, loss: 0.5125, lr: 0.000299, batch_cost: 0.2017, reader_cost: 0.00055, ips: 39.6669 samples/sec | ETA 02:12:36 2022-08-23 07:11:33 [INFO] [TRAIN] epoch: 96, iter: 120600/160000, loss: 0.5138, lr: 0.000298, batch_cost: 0.1608, reader_cost: 0.00048, ips: 49.7556 samples/sec | ETA 01:45:34 2022-08-23 07:11:43 [INFO] [TRAIN] epoch: 96, iter: 120650/160000, loss: 0.5057, lr: 0.000298, batch_cost: 0.1907, reader_cost: 0.00051, ips: 41.9456 samples/sec | ETA 02:05:04 2022-08-23 07:11:52 [INFO] [TRAIN] epoch: 96, iter: 120700/160000, loss: 0.5204, lr: 0.000298, batch_cost: 0.1767, reader_cost: 0.00048, ips: 45.2785 samples/sec | ETA 01:55:43 2022-08-23 07:12:02 [INFO] [TRAIN] epoch: 96, iter: 120750/160000, loss: 0.5152, lr: 0.000297, batch_cost: 0.2044, reader_cost: 0.00109, ips: 39.1463 samples/sec | ETA 02:13:41 2022-08-23 07:12:11 [INFO] [TRAIN] epoch: 96, iter: 120800/160000, loss: 0.5068, lr: 0.000297, batch_cost: 0.1857, reader_cost: 0.00042, ips: 43.0852 samples/sec | ETA 02:01:18 2022-08-23 07:12:20 [INFO] [TRAIN] epoch: 96, iter: 120850/160000, loss: 0.4989, lr: 0.000296, batch_cost: 0.1744, reader_cost: 0.00360, ips: 45.8793 samples/sec | ETA 01:53:46 2022-08-23 07:12:29 [INFO] [TRAIN] epoch: 96, iter: 120900/160000, loss: 0.4919, lr: 0.000296, batch_cost: 0.1892, reader_cost: 0.00043, ips: 42.2879 samples/sec | ETA 02:03:16 2022-08-23 07:12:38 [INFO] [TRAIN] epoch: 96, iter: 120950/160000, loss: 0.5281, lr: 0.000296, batch_cost: 0.1823, reader_cost: 0.00062, 
ips: 43.8891 samples/sec | ETA 01:58:37 2022-08-23 07:12:48 [INFO] [TRAIN] epoch: 96, iter: 121000/160000, loss: 0.5235, lr: 0.000295, batch_cost: 0.1985, reader_cost: 0.00067, ips: 40.3121 samples/sec | ETA 02:08:59 2022-08-23 07:12:48 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 191s - batch_cost: 0.1904 - reader cost: 7.0416e-04 2022-08-23 07:15:59 [INFO] [EVAL] #Images: 2000 mIoU: 0.3504 Acc: 0.7684 Kappa: 0.7505 Dice: 0.4831 2022-08-23 07:15:59 [INFO] [EVAL] Class IoU: [0.6933 0.7861 0.931 0.7312 0.6795 0.7582 0.7877 0.7839 0.5302 0.614 0.477 0.5722 0.7059 0.2939 0.3053 0.4309 0.5108 0.4208 0.6087 0.4129 0.7527 0.4344 0.6326 0.5011 0.3172 0.4124 0.424 0.4324 0.4021 0.2443 0.2912 0.4763 0.2687 0.3632 0.3128 0.3881 0.4441 0.5314 0.261 0.4 0.2501 0.1116 0.3558 0.264 0.2735 0.1468 0.3129 0.5049 0.6671 0.5518 0.5521 0.3966 0.2139 0.193 0.6945 0.4419 0.8528 0.3642 0.4168 0.2475 0.0549 0.4454 0.3252 0.2161 0.4871 0.6804 0.2778 0.4139 0.1247 0.3065 0.4847 0.5091 0.3795 0.1767 0.4506 0.3375 0.493 0.2774 0.2514 0.1916 0.5856 0.3905 0.3808 0.0313 0.3578 0.578 0.1097 0.0907 0.2303 0.515 0.42 0.0473 0.2135 0.1119 0.0086 0.0291 0.0293 0.1434 0.2876 0.4065 0.087 0.025 0.2967 0.2486 0.0777 0.5735 0.0603 0.5334 0.0839 0.201 0.1688 0.5251 0.1461 0.5091 0.7844 0.0154 0.3127 0.6081 0.1022 0.2029 0.4778 0.0148 0.2343 0.1634 0.247 0.2879 0.4715 0.4209 0.4062 0.3577 0.5826 0.049 0.2698 0.2366 0.1568 0.1297 0.1648 0.0462 0.2114 0.3532 0.3338 0.0096 0.3565 0.0215 0.2615 0. 0.3613 0.0262 0.1926 0.1685] 2022-08-23 07:15:59 [INFO] [EVAL] Class Precision: [0.7817 0.8579 0.9624 0.8229 0.7657 0.8797 0.8814 0.8448 0.6703 0.7633 0.6877 0.7203 0.7806 0.517 0.5427 0.5935 0.7155 0.6957 0.7452 0.6489 0.8432 0.6209 0.768 0.6453 0.5489 0.579 0.5263 0.8106 0.6573 0.3456 0.4423 0.6362 0.4871 0.4923 0.4669 0.5379 0.6343 0.7821 0.4417 0.6421 0.4037 0.2695 0.5945 0.5647 0.3562 0.3777 0.4438 0.6433 0.7747 0.6554 0.7217 0.6678 0.3529 0.5109 0.722 0.7799 0.9096 0.696 0.7086 0.4218 0.0986 0.6028 0.4828 0.6478 0.6308 0.7568 0.3798 0.5723 0.3092 0.5445 0.704 0.6346 0.6002 0.2495 0.7577 0.4865 0.6243 0.5 0.7064 0.4545 0.6495 0.657 0.7602 0.0914 0.5247 0.7485 0.418 0.3951 0.4164 0.782 0.587 0.0601 0.3155 0.3932 0.0396 0.0682 0.4244 0.2587 0.4352 0.6853 0.3752 0.0816 0.5602 0.6558 0.7933 0.6405 0.1456 0.8784 0.2399 0.4808 0.2794 0.7775 0.4707 0.7873 0.7878 0.0937 0.5926 0.7122 0.1352 0.7004 0.5897 0.2329 0.7169 0.6432 0.6467 0.6818 0.9012 0.5954 0.6894 0.6215 0.6577 0.367 0.4498 0.6348 0.6392 0.2802 0.3344 0.0976 0.4118 0.6311 0.3817 0.0146 0.621 0.296 0.577 0. 
0.8232 0.3721 0.6581 0.7718] 2022-08-23 07:15:59 [INFO] [EVAL] Class Recall: [0.8599 0.9039 0.9661 0.8678 0.8579 0.846 0.881 0.9158 0.7172 0.7585 0.6089 0.7357 0.8807 0.4052 0.4109 0.6113 0.641 0.5157 0.7686 0.5316 0.8753 0.5912 0.7821 0.6916 0.429 0.589 0.6855 0.481 0.5087 0.4546 0.4602 0.6546 0.3748 0.5807 0.4867 0.5821 0.5969 0.6237 0.3895 0.5148 0.3966 0.16 0.4698 0.3315 0.5409 0.1936 0.5147 0.7013 0.8276 0.7773 0.7015 0.4941 0.3519 0.2368 0.948 0.5048 0.9317 0.4331 0.503 0.3747 0.1103 0.6304 0.499 0.2448 0.6813 0.8708 0.5083 0.5993 0.1729 0.4121 0.6087 0.7201 0.5079 0.3771 0.5264 0.5244 0.7009 0.3839 0.2807 0.2489 0.8562 0.4905 0.4328 0.0454 0.5293 0.7173 0.1295 0.1054 0.34 0.6013 0.5962 0.1815 0.3977 0.1353 0.0109 0.0483 0.0305 0.2434 0.4588 0.4998 0.1018 0.0348 0.3868 0.2859 0.0793 0.8457 0.0934 0.5759 0.1142 0.2566 0.299 0.6179 0.1748 0.5903 0.9945 0.0181 0.3983 0.8063 0.2951 0.2222 0.7156 0.0156 0.2581 0.1797 0.2855 0.3326 0.4972 0.5895 0.4971 0.4573 0.8361 0.0535 0.4027 0.2739 0.172 0.1945 0.2452 0.0807 0.3028 0.4451 0.7269 0.0271 0.4556 0.0227 0.3236 0. 0.3917 0.0274 0.214 0.1773] 2022-08-23 07:15:59 [INFO] [EVAL] The model with the best validation mIoU (0.3512) was saved at iter 120000. 2022-08-23 07:16:09 [INFO] [TRAIN] epoch: 96, iter: 121050/160000, loss: 0.5127, lr: 0.000295, batch_cost: 0.1984, reader_cost: 0.00224, ips: 40.3200 samples/sec | ETA 02:08:48 2022-08-23 07:16:18 [INFO] [TRAIN] epoch: 96, iter: 121100/160000, loss: 0.4860, lr: 0.000295, batch_cost: 0.1747, reader_cost: 0.00074, ips: 45.7902 samples/sec | ETA 01:53:16 2022-08-23 07:16:26 [INFO] [TRAIN] epoch: 96, iter: 121150/160000, loss: 0.4916, lr: 0.000294, batch_cost: 0.1693, reader_cost: 0.00427, ips: 47.2418 samples/sec | ETA 01:49:38 2022-08-23 07:16:36 [INFO] [TRAIN] epoch: 96, iter: 121200/160000, loss: 0.4734, lr: 0.000294, batch_cost: 0.1975, reader_cost: 0.00055, ips: 40.5073 samples/sec | ETA 02:07:42 2022-08-23 07:16:46 [INFO] [TRAIN] epoch: 97, iter: 121250/160000, loss: 0.5323, lr: 0.000293, batch_cost: 0.2007, reader_cost: 0.02940, ips: 39.8557 samples/sec | ETA 02:09:38 2022-08-23 07:16:55 [INFO] [TRAIN] epoch: 97, iter: 121300/160000, loss: 0.5054, lr: 0.000293, batch_cost: 0.1826, reader_cost: 0.00146, ips: 43.8056 samples/sec | ETA 01:57:47 2022-08-23 07:17:05 [INFO] [TRAIN] epoch: 97, iter: 121350/160000, loss: 0.4936, lr: 0.000293, batch_cost: 0.1902, reader_cost: 0.00076, ips: 42.0575 samples/sec | ETA 02:02:31 2022-08-23 07:17:15 [INFO] [TRAIN] epoch: 97, iter: 121400/160000, loss: 0.5023, lr: 0.000292, batch_cost: 0.1948, reader_cost: 0.00037, ips: 41.0771 samples/sec | ETA 02:05:17 2022-08-23 07:17:24 [INFO] [TRAIN] epoch: 97, iter: 121450/160000, loss: 0.5219, lr: 0.000292, batch_cost: 0.1863, reader_cost: 0.00056, ips: 42.9504 samples/sec | ETA 01:59:40 2022-08-23 07:17:33 [INFO] [TRAIN] epoch: 97, iter: 121500/160000, loss: 0.5341, lr: 0.000291, batch_cost: 0.1798, reader_cost: 0.00038, ips: 44.5029 samples/sec | ETA 01:55:20 2022-08-23 07:17:42 [INFO] [TRAIN] epoch: 97, iter: 121550/160000, loss: 0.5177, lr: 0.000291, batch_cost: 0.1766, reader_cost: 0.00047, ips: 45.3125 samples/sec | ETA 01:53:08 2022-08-23 07:17:51 [INFO] [TRAIN] epoch: 97, iter: 121600/160000, loss: 0.5025, lr: 0.000291, batch_cost: 0.1873, reader_cost: 0.00031, ips: 42.7199 samples/sec | ETA 01:59:51 2022-08-23 07:18:00 [INFO] [TRAIN] epoch: 97, iter: 121650/160000, loss: 0.4901, lr: 0.000290, batch_cost: 0.1702, reader_cost: 0.00061, ips: 47.0009 samples/sec | ETA 01:48:47 2022-08-23 07:18:08 [INFO] [TRAIN] 
epoch: 97, iter: 121700/160000, loss: 0.5464, lr: 0.000290, batch_cost: 0.1678, reader_cost: 0.00056, ips: 47.6735 samples/sec | ETA 01:47:07 2022-08-23 07:18:18 [INFO] [TRAIN] epoch: 97, iter: 121750/160000, loss: 0.5114, lr: 0.000290, batch_cost: 0.1919, reader_cost: 0.00039, ips: 41.6965 samples/sec | ETA 02:02:18 2022-08-23 07:18:27 [INFO] [TRAIN] epoch: 97, iter: 121800/160000, loss: 0.4904, lr: 0.000289, batch_cost: 0.1829, reader_cost: 0.00041, ips: 43.7505 samples/sec | ETA 01:56:25 2022-08-23 07:18:36 [INFO] [TRAIN] epoch: 97, iter: 121850/160000, loss: 0.4738, lr: 0.000289, batch_cost: 0.1826, reader_cost: 0.00036, ips: 43.8146 samples/sec | ETA 01:56:05 2022-08-23 07:18:46 [INFO] [TRAIN] epoch: 97, iter: 121900/160000, loss: 0.4901, lr: 0.000288, batch_cost: 0.1975, reader_cost: 0.00055, ips: 40.5029 samples/sec | ETA 02:05:25 2022-08-23 07:18:55 [INFO] [TRAIN] epoch: 97, iter: 121950/160000, loss: 0.5277, lr: 0.000288, batch_cost: 0.1735, reader_cost: 0.00065, ips: 46.1174 samples/sec | ETA 01:50:00 2022-08-23 07:19:04 [INFO] [TRAIN] epoch: 97, iter: 122000/160000, loss: 0.5394, lr: 0.000288, batch_cost: 0.1918, reader_cost: 0.00045, ips: 41.7076 samples/sec | ETA 02:01:28 2022-08-23 07:19:04 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 187s - batch_cost: 0.1865 - reader cost: 7.6289e-04 2022-08-23 07:22:11 [INFO] [EVAL] #Images: 2000 mIoU: 0.3485 Acc: 0.7677 Kappa: 0.7498 Dice: 0.4803 2022-08-23 07:22:11 [INFO] [EVAL] Class IoU: [0.6921 0.7773 0.9308 0.7298 0.6822 0.7646 0.7757 0.7822 0.5254 0.6098 0.4899 0.5629 0.7049 0.301 0.3175 0.4389 0.4899 0.4259 0.6168 0.4206 0.7324 0.4924 0.6184 0.4999 0.3081 0.2908 0.4789 0.4691 0.4198 0.2256 0.2836 0.4789 0.2671 0.3767 0.2499 0.4098 0.4393 0.5397 0.2558 0.385 0.2273 0.1203 0.349 0.2575 0.2893 0.2043 0.3355 0.5008 0.6562 0.5118 0.5617 0.4486 0.2203 0.1975 0.6795 0.4453 0.8612 0.3606 0.45 0.1992 0.0921 0.3845 0.3224 0.2745 0.4674 0.6971 0.3011 0.3888 0.121 0.3116 0.4947 0.528 0.4121 0.1838 0.4479 0.3352 0.466 0.241 0.2648 0.1828 0.5986 0.4076 0.3414 0.0344 0.3375 0.5763 0.1177 0.0973 0.2083 0.5183 0.4461 0.0675 0.2329 0.0822 0.0118 0.0149 0.0181 0.1356 0.2896 0.4665 0.1209 0.0239 0.2424 0.128 0.0416 0.5768 0.0718 0.5225 0.0601 0.1984 0.1172 0.5251 0.1487 0.4991 0.8129 0.0171 0.2807 0.6305 0.0821 0.1831 0.4717 0.0103 0.2124 0.1292 0.2742 0.2996 0.5021 0.4179 0.3189 0.354 0.5884 0.0477 0.3037 0.2655 0.1791 0.1354 0.159 0.0621 0.2007 0.3389 0.3568 0.0301 0.3468 0.0222 0.1687 0.0022 0.3435 0.0304 0.209 0.1582] 2022-08-23 07:22:11 [INFO] [EVAL] Class Precision: [0.7926 0.8351 0.9667 0.8301 0.7698 0.8595 0.8699 0.8362 0.6996 0.7595 0.6734 0.7295 0.7914 0.4769 0.5647 0.6116 0.6579 0.6884 0.7639 0.6195 0.8112 0.7185 0.717 0.6122 0.5036 0.6293 0.6107 0.7527 0.6505 0.3491 0.4789 0.633 0.4989 0.5018 0.4101 0.5511 0.6388 0.7911 0.4646 0.661 0.4714 0.2656 0.5307 0.5305 0.4358 0.4046 0.5509 0.7002 0.7898 0.5676 0.7375 0.674 0.3839 0.4028 0.7069 0.6796 0.9246 0.6914 0.7727 0.3749 0.1435 0.5022 0.4534 0.5983 0.5755 0.805 0.4478 0.5677 0.2488 0.579 0.6558 0.6401 0.5804 0.2605 0.6936 0.4991 0.5418 0.4463 0.7289 0.5284 0.6711 0.6122 0.8076 0.0926 0.5187 0.7376 0.3936 0.4042 0.359 0.7442 0.6532 0.0946 0.4131 0.3697 0.0576 0.0429 0.23 0.3013 0.478 0.6648 0.3225 0.0819 0.6217 0.5015 0.8823 0.6669 0.2396 0.8001 0.2265 0.5213 0.2795 0.7882 0.4482 0.7743 0.8176 0.1092 0.6474 0.747 0.1143 0.5982 0.617 0.1913 0.7401 0.6755 0.5934 0.6687 0.8876 0.5779 0.5487 0.598 0.6852 0.4528 0.5137 0.6946 0.6261 0.3121 0.3479 0.1092 0.4297 
0.6248 0.4017 0.0405 0.6529 0.1855 0.4757 0.0032 0.8468 0.3622 0.5842 0.7915] 2022-08-23 07:22:11 [INFO] [EVAL] Class Recall: [0.8452 0.9182 0.9616 0.8579 0.857 0.8739 0.8774 0.9236 0.6784 0.7557 0.6425 0.7114 0.8657 0.4494 0.4204 0.6084 0.6574 0.5277 0.762 0.5672 0.8828 0.6101 0.818 0.7315 0.4424 0.3509 0.6892 0.5546 0.5421 0.3894 0.4102 0.663 0.365 0.6018 0.3902 0.6152 0.5845 0.6293 0.3627 0.4798 0.3051 0.1804 0.5048 0.3336 0.4626 0.2921 0.4619 0.6375 0.7951 0.8389 0.702 0.5729 0.3407 0.2793 0.9461 0.5636 0.9263 0.4298 0.5187 0.2983 0.2047 0.6212 0.5274 0.3366 0.7132 0.8387 0.4788 0.5523 0.1905 0.4028 0.6682 0.751 0.587 0.3843 0.5585 0.5051 0.7693 0.3438 0.2938 0.2185 0.8473 0.5495 0.3717 0.0519 0.4914 0.7249 0.1438 0.1136 0.3316 0.6306 0.5845 0.1909 0.348 0.0957 0.0146 0.0223 0.0193 0.1978 0.4234 0.61 0.1621 0.0327 0.2843 0.1467 0.0418 0.8103 0.0929 0.601 0.0756 0.2427 0.168 0.6113 0.1821 0.584 0.993 0.0198 0.3313 0.8017 0.2259 0.2088 0.667 0.0108 0.2295 0.1378 0.3376 0.3519 0.5361 0.6015 0.4322 0.4646 0.8063 0.0506 0.4262 0.3006 0.2006 0.193 0.2265 0.126 0.2736 0.4256 0.7614 0.1044 0.4252 0.0246 0.2073 0.007 0.3663 0.0321 0.2455 0.1651] 2022-08-23 07:22:11 [INFO] [EVAL] The model with the best validation mIoU (0.3512) was saved at iter 120000. 2022-08-23 07:22:21 [INFO] [TRAIN] epoch: 97, iter: 122050/160000, loss: 0.5074, lr: 0.000287, batch_cost: 0.2067, reader_cost: 0.00269, ips: 38.6969 samples/sec | ETA 02:10:45 2022-08-23 07:22:32 [INFO] [TRAIN] epoch: 97, iter: 122100/160000, loss: 0.5101, lr: 0.000287, batch_cost: 0.2068, reader_cost: 0.00144, ips: 38.6882 samples/sec | ETA 02:10:37 2022-08-23 07:22:42 [INFO] [TRAIN] epoch: 97, iter: 122150/160000, loss: 0.5166, lr: 0.000287, batch_cost: 0.2141, reader_cost: 0.00097, ips: 37.3574 samples/sec | ETA 02:15:05 2022-08-23 07:22:52 [INFO] [TRAIN] epoch: 97, iter: 122200/160000, loss: 0.4986, lr: 0.000286, batch_cost: 0.1914, reader_cost: 0.00046, ips: 41.8021 samples/sec | ETA 02:00:34 2022-08-23 07:23:02 [INFO] [TRAIN] epoch: 97, iter: 122250/160000, loss: 0.4987, lr: 0.000286, batch_cost: 0.2112, reader_cost: 0.00067, ips: 37.8845 samples/sec | ETA 02:12:51 2022-08-23 07:23:12 [INFO] [TRAIN] epoch: 97, iter: 122300/160000, loss: 0.4864, lr: 0.000285, batch_cost: 0.1998, reader_cost: 0.00058, ips: 40.0363 samples/sec | ETA 02:05:33 2022-08-23 07:23:22 [INFO] [TRAIN] epoch: 97, iter: 122350/160000, loss: 0.4800, lr: 0.000285, batch_cost: 0.1981, reader_cost: 0.00043, ips: 40.3757 samples/sec | ETA 02:04:19 2022-08-23 07:23:32 [INFO] [TRAIN] epoch: 97, iter: 122400/160000, loss: 0.4938, lr: 0.000285, batch_cost: 0.2023, reader_cost: 0.00081, ips: 39.5398 samples/sec | ETA 02:06:47 2022-08-23 07:23:43 [INFO] [TRAIN] epoch: 97, iter: 122450/160000, loss: 0.5383, lr: 0.000284, batch_cost: 0.2136, reader_cost: 0.00069, ips: 37.4488 samples/sec | ETA 02:13:41 2022-08-23 07:23:53 [INFO] [TRAIN] epoch: 97, iter: 122500/160000, loss: 0.4812, lr: 0.000284, batch_cost: 0.2017, reader_cost: 0.00065, ips: 39.6638 samples/sec | ETA 02:06:03 2022-08-23 07:24:06 [INFO] [TRAIN] epoch: 98, iter: 122550/160000, loss: 0.4931, lr: 0.000284, batch_cost: 0.2531, reader_cost: 0.06946, ips: 31.6028 samples/sec | ETA 02:38:00 2022-08-23 07:24:15 [INFO] [TRAIN] epoch: 98, iter: 122600/160000, loss: 0.5344, lr: 0.000283, batch_cost: 0.1819, reader_cost: 0.00076, ips: 43.9845 samples/sec | ETA 01:53:22 2022-08-23 07:24:25 [INFO] [TRAIN] epoch: 98, iter: 122650/160000, loss: 0.5193, lr: 0.000283, batch_cost: 0.2040, reader_cost: 0.00046, ips: 39.2188 samples/sec 
| ETA 02:06:58 2022-08-23 07:24:34 [INFO] [TRAIN] epoch: 98, iter: 122700/160000, loss: 0.5012, lr: 0.000282, batch_cost: 0.1717, reader_cost: 0.00105, ips: 46.5983 samples/sec | ETA 01:46:43 2022-08-23 07:24:44 [INFO] [TRAIN] epoch: 98, iter: 122750/160000, loss: 0.4832, lr: 0.000282, batch_cost: 0.1985, reader_cost: 0.00059, ips: 40.3079 samples/sec | ETA 02:03:13 2022-08-23 07:24:52 [INFO] [TRAIN] epoch: 98, iter: 122800/160000, loss: 0.5321, lr: 0.000282, batch_cost: 0.1748, reader_cost: 0.00033, ips: 45.7723 samples/sec | ETA 01:48:21 2022-08-23 07:25:01 [INFO] [TRAIN] epoch: 98, iter: 122850/160000, loss: 0.5169, lr: 0.000281, batch_cost: 0.1800, reader_cost: 0.00193, ips: 44.4390 samples/sec | ETA 01:51:27 2022-08-23 07:25:10 [INFO] [TRAIN] epoch: 98, iter: 122900/160000, loss: 0.4760, lr: 0.000281, batch_cost: 0.1698, reader_cost: 0.00758, ips: 47.1260 samples/sec | ETA 01:44:58 2022-08-23 07:25:19 [INFO] [TRAIN] epoch: 98, iter: 122950/160000, loss: 0.5192, lr: 0.000281, batch_cost: 0.1808, reader_cost: 0.00061, ips: 44.2486 samples/sec | ETA 01:51:38 2022-08-23 07:25:28 [INFO] [TRAIN] epoch: 98, iter: 123000/160000, loss: 0.5283, lr: 0.000280, batch_cost: 0.1822, reader_cost: 0.00044, ips: 43.9133 samples/sec | ETA 01:52:20 2022-08-23 07:25:28 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 145s - batch_cost: 0.1445 - reader cost: 5.9366e-04 2022-08-23 07:27:53 [INFO] [EVAL] #Images: 2000 mIoU: 0.3477 Acc: 0.7675 Kappa: 0.7498 Dice: 0.4797 2022-08-23 07:27:53 [INFO] [EVAL] Class IoU: [0.6931 0.7917 0.9315 0.7322 0.676 0.7657 0.7839 0.7897 0.5281 0.5978 0.4909 0.5513 0.6952 0.3088 0.3174 0.4322 0.4831 0.4188 0.6114 0.414 0.7411 0.4315 0.6297 0.4935 0.3115 0.4446 0.4157 0.474 0.4 0.2023 0.2925 0.4851 0.2663 0.3655 0.2876 0.3779 0.4287 0.5407 0.2607 0.4126 0.2011 0.0908 0.3505 0.2635 0.27 0.1942 0.3068 0.5131 0.6789 0.5387 0.5522 0.4241 0.2183 0.1793 0.6857 0.3892 0.8553 0.3939 0.4274 0.2362 0.0956 0.3607 0.3254 0.2435 0.4796 0.7069 0.2922 0.4037 0.1165 0.3219 0.4979 0.5203 0.3979 0.1989 0.4353 0.3165 0.5002 0.2762 0.2738 0.2701 0.6354 0.3906 0.3637 0.0299 0.3417 0.5747 0.1152 0.0786 0.2482 0.501 0.4202 0.062 0.2171 0.1231 0.0094 0.019 0.0102 0.1474 0.2909 0.4446 0.0699 0.0364 0.2772 0.1595 0.0184 0.4921 0.0722 0.5386 0.0933 0.2275 0.1494 0.5027 0.1395 0.4164 0.8123 0.0115 0.2544 0.6349 0.0897 0.1002 0.4482 0.0121 0.2295 0.152 0.2771 0.2907 0.4753 0.4003 0.2929 0.3411 0.5843 0.0812 0.3351 0.2944 0.1475 0.1172 0.1693 0.0491 0.2052 0.3568 0.3255 0.0106 0.3202 0.011 0.2865 0. 
0.3245 0.0284 0.2069 0.1812] 2022-08-23 07:27:53 [INFO] [EVAL] Class Precision: [0.7912 0.8678 0.9642 0.8282 0.7482 0.8547 0.8727 0.8514 0.6903 0.7558 0.7034 0.7606 0.7611 0.4607 0.5261 0.5753 0.6833 0.7033 0.7519 0.6346 0.8203 0.6396 0.7521 0.6459 0.5061 0.678 0.5165 0.6995 0.6849 0.3027 0.4686 0.686 0.4804 0.5346 0.4804 0.554 0.6506 0.6959 0.4055 0.5909 0.4239 0.2489 0.5281 0.5087 0.4052 0.4197 0.5116 0.7017 0.7783 0.6363 0.7838 0.6238 0.3299 0.4754 0.7382 0.5636 0.9048 0.6958 0.7214 0.357 0.1734 0.5737 0.4918 0.6173 0.6109 0.8394 0.3897 0.5555 0.2797 0.5683 0.7033 0.6235 0.6045 0.2837 0.6587 0.5036 0.692 0.512 0.7363 0.5216 0.7321 0.6797 0.7855 0.0942 0.5163 0.7684 0.4216 0.4521 0.4457 0.7671 0.5805 0.0845 0.3611 0.3344 0.0288 0.0495 0.1272 0.2709 0.4459 0.6951 0.5238 0.1087 0.5333 0.7437 0.7422 0.6428 0.1623 0.7976 0.2495 0.5447 0.3015 0.719 0.4107 0.7196 0.8185 0.0703 0.6187 0.7483 0.1375 0.7867 0.5221 0.1789 0.6205 0.5899 0.6395 0.5672 0.8513 0.5218 0.4637 0.5978 0.6621 0.4592 0.4893 0.6255 0.4935 0.2832 0.364 0.0921 0.4201 0.6121 0.3743 0.015 0.6586 0.1438 0.5344 0. 0.8628 0.3401 0.54 0.6615] 2022-08-23 07:27:53 [INFO] [EVAL] Class Recall: [0.8482 0.9002 0.9648 0.8633 0.8752 0.8803 0.8851 0.9159 0.692 0.741 0.619 0.6671 0.8892 0.4837 0.4444 0.6348 0.6225 0.5086 0.7659 0.5435 0.8848 0.5702 0.7946 0.6765 0.4475 0.5636 0.6805 0.5953 0.4902 0.3789 0.4378 0.6235 0.3741 0.5362 0.4174 0.5432 0.5568 0.708 0.422 0.5777 0.2767 0.1251 0.5103 0.3535 0.4471 0.2655 0.4339 0.6562 0.8417 0.7784 0.6514 0.5699 0.3922 0.2236 0.9062 0.5572 0.9399 0.4759 0.512 0.411 0.1756 0.4928 0.4903 0.2868 0.6907 0.8174 0.5389 0.5964 0.1665 0.426 0.6304 0.7587 0.5379 0.3995 0.562 0.46 0.6434 0.3748 0.3036 0.359 0.8279 0.4787 0.4038 0.042 0.5027 0.6951 0.1368 0.0868 0.3591 0.5908 0.6034 0.1891 0.3525 0.1631 0.0137 0.0298 0.0109 0.2444 0.4556 0.5523 0.0746 0.0519 0.366 0.1688 0.0185 0.6774 0.1152 0.6238 0.1298 0.2809 0.2285 0.6257 0.1744 0.497 0.9908 0.0136 0.3017 0.8074 0.2053 0.103 0.76 0.0128 0.2669 0.1699 0.3284 0.3735 0.5183 0.6324 0.4431 0.4427 0.8326 0.0898 0.5152 0.3574 0.1738 0.1667 0.2404 0.0952 0.2864 0.461 0.7139 0.0346 0.3839 0.0118 0.3819 0. 0.3421 0.03 0.2512 0.1997] 2022-08-23 07:27:53 [INFO] [EVAL] The model with the best validation mIoU (0.3512) was saved at iter 120000. 
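A note on reading the [TRAIN] lines: the logged fields are mutually consistent. ips is the number of samples consumed per step divided by batch_cost (at iter 122050, 38.6969 samples/sec * 0.2067 s/step is almost exactly 8 samples per step), and ETA is simply the remaining iterations times the current batch_cost: (160000 - 122050) * 0.2067 s is about 7844 s, i.e. the 02:10:45 shown. Below is a minimal sketch of that bookkeeping, assuming exactly this arithmetic; the helper name is illustrative and not part of PaddleSeg.

    import datetime

    def throughput_and_eta(samples_per_step, batch_cost, cur_iter, total_iters):
        # ips as reported in the [TRAIN] lines: samples per second over the logging window
        ips = samples_per_step / batch_cost
        # ETA as reported: time to finish the remaining iterations at the current speed
        remaining_s = (total_iters - cur_iter) * batch_cost
        return ips, str(datetime.timedelta(seconds=int(remaining_s)))

    # Checked against the log entry at iter 122050 (batch_cost 0.2067 s):
    print(throughput_and_eta(8, 0.2067, 122050, 160000))
    # -> (38.70..., '2:10:44'), vs. logged ips 38.6969 and ETA 02:10:45
    # (small differences come from batch_cost being printed rounded to 4 decimals)

The same arithmetic covers the evaluation passes: 1000 eval iterations at the reported batch_cost of 0.1445 s is roughly the 145 s the progress line shows for the 2000 validation images.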
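For the [EVAL] blocks, the per-class IoU / Precision / Recall arrays (one entry per class) and the aggregate mIoU, Acc, Kappa and Dice are all standard confusion-matrix statistics accumulated over the 2000 validation images. The sketch below gives the conventional formulation as a reference for reading the numbers; it is not claimed to be PaddleSeg's exact implementation. Note that per-class Dice equals 2*IoU/(1+IoU), which is why the mean Dice is always at least the mIoU (e.g. 0.4797 vs 0.3477 above).

    import numpy as np

    def eval_metrics(conf):
        # conf[i, j] = number of pixels with ground-truth class i predicted as class j
        tp = np.diag(conf).astype(np.float64)
        fp = conf.sum(axis=0) - tp               # predicted as the class, but wrong
        fn = conf.sum(axis=1) - tp               # belong to the class, but missed
        iou = tp / np.maximum(tp + fp + fn, 1)   # per-class IoU
        precision = tp / np.maximum(tp + fp, 1)  # per-class Precision
        recall = tp / np.maximum(tp + fn, 1)     # per-class Recall
        dice = 2 * tp / np.maximum(2 * tp + fp + fn, 1)
        acc = tp.sum() / conf.sum()              # overall pixel accuracy
        pe = (conf.sum(axis=0) * conf.sum(axis=1)).sum() / conf.sum() ** 2
        kappa = (acc - pe) / (1 - pe)            # Cohen's kappa
        return {"mIoU": iou.mean(), "Acc": acc, "Kappa": kappa, "Dice": dice.mean(),
                "Class IoU": iou, "Class Precision": precision, "Class Recall": recall}

    # Toy 3-class confusion matrix, just to show the shapes involved
    conf = np.array([[50,  5,  0],
                     [ 4, 30,  6],
                     [ 1,  2, 20]])
    print(eval_metrics(conf)["mIoU"])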
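Two smaller patterns are visible in this stretch of the log. First, the learning rate falls almost linearly (0.000287 near iter 122050, 0.000212 at iter 132000) and extrapolates to zero at iter 160000, consistent with a power-1 polynomial decay; a base value of about 1.21e-3 reproduces the printed numbers, but that base and any warmup handling are inferred rather than read from this section. Second, each evaluation compares the fresh mIoU with the best seen so far before printing "The model with the best validation mIoU (...) was saved at iter ...", which is why the report stays at 0.3512 / iter 120000 until the 0.3544 result at iter 129000 later in the log replaces it. A rough sketch of both follows; the function names and the save hook are purely illustrative.

    def poly_lr(base_lr, cur_iter, total_iters, power=1.0):
        # Power-1 ("linear") polynomial decay reaching zero at total_iters.
        return base_lr * (1.0 - cur_iter / float(total_iters)) ** power

    # poly_lr(1.21e-3, 122050, 160000) ~= 0.000287 ; poly_lr(1.21e-3, 132000, 160000) ~= 0.000212

    def report_best(miou, cur_iter, best, save_fn):
        # best is a dict {"miou": ..., "iter": ...}; save_fn persists the weights
        # (in PaddlePaddle that would typically be paddle.save on the model's state dict).
        if miou > best["miou"]:
            best["miou"], best["iter"] = miou, cur_iter
            save_fn()
        print("The model with the best validation mIoU (%.4f) was saved at iter %d."
              % (best["miou"], best["iter"]))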
2022-08-23 07:28:03 [INFO] [TRAIN] epoch: 98, iter: 123050/160000, loss: 0.4839, lr: 0.000280, batch_cost: 0.2015, reader_cost: 0.00379, ips: 39.6937 samples/sec | ETA 02:04:07 2022-08-23 07:28:12 [INFO] [TRAIN] epoch: 98, iter: 123100/160000, loss: 0.5099, lr: 0.000279, batch_cost: 0.1853, reader_cost: 0.00098, ips: 43.1640 samples/sec | ETA 01:53:59 2022-08-23 07:28:22 [INFO] [TRAIN] epoch: 98, iter: 123150/160000, loss: 0.4990, lr: 0.000279, batch_cost: 0.1900, reader_cost: 0.00108, ips: 42.1153 samples/sec | ETA 01:56:39 2022-08-23 07:28:31 [INFO] [TRAIN] epoch: 98, iter: 123200/160000, loss: 0.5051, lr: 0.000279, batch_cost: 0.1837, reader_cost: 0.00089, ips: 43.5460 samples/sec | ETA 01:52:40 2022-08-23 07:28:40 [INFO] [TRAIN] epoch: 98, iter: 123250/160000, loss: 0.5129, lr: 0.000278, batch_cost: 0.1801, reader_cost: 0.00035, ips: 44.4278 samples/sec | ETA 01:50:17 2022-08-23 07:28:50 [INFO] [TRAIN] epoch: 98, iter: 123300/160000, loss: 0.5207, lr: 0.000278, batch_cost: 0.1987, reader_cost: 0.00074, ips: 40.2529 samples/sec | ETA 02:01:33 2022-08-23 07:29:00 [INFO] [TRAIN] epoch: 98, iter: 123350/160000, loss: 0.4876, lr: 0.000277, batch_cost: 0.1900, reader_cost: 0.00058, ips: 42.0951 samples/sec | ETA 01:56:05 2022-08-23 07:29:09 [INFO] [TRAIN] epoch: 98, iter: 123400/160000, loss: 0.5194, lr: 0.000277, batch_cost: 0.1915, reader_cost: 0.00037, ips: 41.7668 samples/sec | ETA 01:56:50 2022-08-23 07:29:20 [INFO] [TRAIN] epoch: 98, iter: 123450/160000, loss: 0.4919, lr: 0.000277, batch_cost: 0.2084, reader_cost: 0.00079, ips: 38.3831 samples/sec | ETA 02:06:57 2022-08-23 07:29:30 [INFO] [TRAIN] epoch: 98, iter: 123500/160000, loss: 0.5088, lr: 0.000276, batch_cost: 0.2049, reader_cost: 0.00043, ips: 39.0394 samples/sec | ETA 02:04:39 2022-08-23 07:29:40 [INFO] [TRAIN] epoch: 98, iter: 123550/160000, loss: 0.5081, lr: 0.000276, batch_cost: 0.2050, reader_cost: 0.00081, ips: 39.0286 samples/sec | ETA 02:04:31 2022-08-23 07:29:50 [INFO] [TRAIN] epoch: 98, iter: 123600/160000, loss: 0.5357, lr: 0.000276, batch_cost: 0.1913, reader_cost: 0.00065, ips: 41.8136 samples/sec | ETA 01:56:04 2022-08-23 07:30:00 [INFO] [TRAIN] epoch: 98, iter: 123650/160000, loss: 0.4962, lr: 0.000275, batch_cost: 0.2030, reader_cost: 0.00080, ips: 39.4107 samples/sec | ETA 02:02:58 2022-08-23 07:30:10 [INFO] [TRAIN] epoch: 98, iter: 123700/160000, loss: 0.5087, lr: 0.000275, batch_cost: 0.2015, reader_cost: 0.00096, ips: 39.7002 samples/sec | ETA 02:01:54 2022-08-23 07:30:20 [INFO] [TRAIN] epoch: 98, iter: 123750/160000, loss: 0.4811, lr: 0.000274, batch_cost: 0.2049, reader_cost: 0.00075, ips: 39.0345 samples/sec | ETA 02:03:49 2022-08-23 07:30:33 [INFO] [TRAIN] epoch: 99, iter: 123800/160000, loss: 0.4848, lr: 0.000274, batch_cost: 0.2596, reader_cost: 0.07041, ips: 30.8206 samples/sec | ETA 02:36:36 2022-08-23 07:30:43 [INFO] [TRAIN] epoch: 99, iter: 123850/160000, loss: 0.5133, lr: 0.000274, batch_cost: 0.2033, reader_cost: 0.00056, ips: 39.3590 samples/sec | ETA 02:02:27 2022-08-23 07:30:53 [INFO] [TRAIN] epoch: 99, iter: 123900/160000, loss: 0.4720, lr: 0.000273, batch_cost: 0.2056, reader_cost: 0.00067, ips: 38.9169 samples/sec | ETA 02:03:40 2022-08-23 07:31:03 [INFO] [TRAIN] epoch: 99, iter: 123950/160000, loss: 0.5127, lr: 0.000273, batch_cost: 0.1879, reader_cost: 0.00072, ips: 42.5748 samples/sec | ETA 01:52:53 2022-08-23 07:31:12 [INFO] [TRAIN] epoch: 99, iter: 124000/160000, loss: 0.5008, lr: 0.000273, batch_cost: 0.1889, reader_cost: 0.00102, ips: 42.3457 samples/sec | ETA 01:53:21 2022-08-23 
07:31:12 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1586 - reader cost: 9.2779e-04 2022-08-23 07:33:51 [INFO] [EVAL] #Images: 2000 mIoU: 0.3426 Acc: 0.7656 Kappa: 0.7478 Dice: 0.4752 2022-08-23 07:33:51 [INFO] [EVAL] Class IoU: [0.6878 0.7847 0.93 0.73 0.6898 0.7643 0.7871 0.785 0.5266 0.6078 0.4932 0.5611 0.7078 0.302 0.317 0.4287 0.4579 0.4524 0.6079 0.4209 0.7355 0.3985 0.6477 0.5003 0.3232 0.4246 0.3767 0.4614 0.4276 0.2253 0.2898 0.4494 0.2777 0.3574 0.2775 0.3919 0.4323 0.5461 0.2491 0.4156 0.1935 0.0895 0.3573 0.2698 0.2856 0.1843 0.2966 0.4971 0.5788 0.5096 0.5521 0.3751 0.2066 0.1824 0.663 0.4077 0.8499 0.3851 0.3651 0.2475 0.0633 0.3197 0.3059 0.2263 0.4462 0.6699 0.285 0.402 0.1146 0.3319 0.4939 0.5234 0.3876 0.1974 0.4472 0.3162 0.5285 0.2395 0.2818 0.1942 0.6202 0.3912 0.3488 0.0345 0.15 0.5714 0.123 0.0933 0.2367 0.5034 0.4131 0.052 0.2242 0.0872 0.0135 0.0125 0.0237 0.1315 0.2736 0.4047 0.1366 0.0678 0.2444 0.277 0.0278 0.478 0.0662 0.5373 0.083 0.1848 0.1201 0.4997 0.1361 0.4124 0.7637 0.011 0.2565 0.6452 0.1082 0.2465 0.436 0.0153 0.2255 0.1685 0.2523 0.2862 0.4579 0.4012 0.2226 0.3592 0.5746 0.0847 0.307 0.2566 0.175 0.1346 0.1656 0.0558 0.1986 0.3549 0.3103 0.008 0.303 0.0403 0.2698 0.0005 0.3604 0.0275 0.2461 0.1573] 2022-08-23 07:33:51 [INFO] [EVAL] Class Precision: [0.7838 0.8719 0.9666 0.8296 0.7794 0.8574 0.8708 0.8427 0.6745 0.7388 0.7034 0.747 0.79 0.4926 0.5323 0.561 0.6757 0.7161 0.7415 0.6175 0.812 0.6082 0.7883 0.6415 0.533 0.5389 0.4496 0.7261 0.6196 0.3253 0.4512 0.5782 0.4993 0.543 0.438 0.5539 0.6495 0.776 0.4152 0.6079 0.4224 0.2513 0.5654 0.5305 0.3864 0.3564 0.47 0.6292 0.7389 0.5967 0.7285 0.513 0.3122 0.5182 0.712 0.6438 0.8907 0.719 0.7188 0.4121 0.1303 0.4904 0.4639 0.7053 0.535 0.7765 0.3986 0.5532 0.2486 0.5481 0.6838 0.6422 0.6041 0.3096 0.6938 0.5027 0.6413 0.474 0.775 0.4299 0.7256 0.6584 0.7959 0.1128 0.3646 0.7616 0.3825 0.3988 0.4862 0.7586 0.564 0.0673 0.4093 0.3716 0.0533 0.0333 0.5642 0.3355 0.5347 0.726 0.6097 0.1325 0.5993 0.7393 0.7222 0.5467 0.1783 0.8227 0.21 0.5096 0.3332 0.6949 0.4275 0.7095 0.7688 0.0788 0.6202 0.7642 0.1496 0.6524 0.5823 0.2002 0.6554 0.5711 0.6267 0.6484 0.8685 0.5082 0.6684 0.5689 0.6431 0.5056 0.4886 0.6415 0.5055 0.2979 0.3397 0.0849 0.385 0.5916 0.3772 0.012 0.6674 0.3888 0.5831 0.0012 0.8563 0.4021 0.5859 0.7368] 2022-08-23 07:33:51 [INFO] [EVAL] Class Recall: [0.8489 0.887 0.9608 0.8587 0.8572 0.8757 0.8911 0.9199 0.706 0.7743 0.6228 0.6928 0.8718 0.4384 0.4395 0.6451 0.5868 0.5512 0.7714 0.5693 0.8865 0.5361 0.784 0.6945 0.4509 0.667 0.6993 0.5587 0.5797 0.423 0.4475 0.6687 0.3848 0.5112 0.4309 0.5726 0.5639 0.6483 0.3837 0.5677 0.2631 0.1221 0.4925 0.3545 0.5226 0.2763 0.4457 0.703 0.7276 0.7772 0.6951 0.5824 0.3792 0.2196 0.9059 0.5264 0.9488 0.4534 0.426 0.3825 0.1096 0.4788 0.4732 0.2499 0.7287 0.8299 0.5 0.5954 0.1754 0.457 0.64 0.7388 0.5195 0.3527 0.5571 0.4601 0.7503 0.3262 0.3069 0.2616 0.8102 0.4908 0.3831 0.0473 0.2031 0.6959 0.1535 0.1086 0.3156 0.5994 0.6068 0.1862 0.3315 0.1024 0.0178 0.0196 0.0241 0.1777 0.359 0.4776 0.1497 0.122 0.2921 0.307 0.0281 0.7919 0.0953 0.6077 0.1208 0.2248 0.1581 0.6402 0.1665 0.4962 0.9914 0.0127 0.3043 0.8055 0.281 0.2838 0.6344 0.0163 0.2558 0.1929 0.2969 0.3388 0.4921 0.656 0.2502 0.4934 0.8435 0.0923 0.4524 0.2996 0.2111 0.1972 0.2443 0.1402 0.2908 0.4701 0.6362 0.0236 0.3568 0.0431 0.3343 0.0009 0.3836 0.0287 0.2979 0.1666] 2022-08-23 07:33:51 [INFO] [EVAL] The model with the best validation mIoU 
(0.3512) was saved at iter 120000. 2022-08-23 07:34:01 [INFO] [TRAIN] epoch: 99, iter: 124050/160000, loss: 0.5130, lr: 0.000272, batch_cost: 0.1851, reader_cost: 0.00344, ips: 43.2132 samples/sec | ETA 01:50:55 2022-08-23 07:34:10 [INFO] [TRAIN] epoch: 99, iter: 124100/160000, loss: 0.4863, lr: 0.000272, batch_cost: 0.1964, reader_cost: 0.00109, ips: 40.7333 samples/sec | ETA 01:57:30 2022-08-23 07:34:19 [INFO] [TRAIN] epoch: 99, iter: 124150/160000, loss: 0.5159, lr: 0.000271, batch_cost: 0.1783, reader_cost: 0.00075, ips: 44.8591 samples/sec | ETA 01:46:33 2022-08-23 07:34:28 [INFO] [TRAIN] epoch: 99, iter: 124200/160000, loss: 0.4968, lr: 0.000271, batch_cost: 0.1793, reader_cost: 0.00047, ips: 44.6148 samples/sec | ETA 01:46:59 2022-08-23 07:34:38 [INFO] [TRAIN] epoch: 99, iter: 124250/160000, loss: 0.5160, lr: 0.000271, batch_cost: 0.1923, reader_cost: 0.00102, ips: 41.6120 samples/sec | ETA 01:54:33 2022-08-23 07:34:47 [INFO] [TRAIN] epoch: 99, iter: 124300/160000, loss: 0.4981, lr: 0.000270, batch_cost: 0.1911, reader_cost: 0.00075, ips: 41.8559 samples/sec | ETA 01:53:43 2022-08-23 07:34:57 [INFO] [TRAIN] epoch: 99, iter: 124350/160000, loss: 0.5103, lr: 0.000270, batch_cost: 0.1863, reader_cost: 0.00132, ips: 42.9458 samples/sec | ETA 01:50:40 2022-08-23 07:35:06 [INFO] [TRAIN] epoch: 99, iter: 124400/160000, loss: 0.4918, lr: 0.000270, batch_cost: 0.1899, reader_cost: 0.00076, ips: 42.1360 samples/sec | ETA 01:52:39 2022-08-23 07:35:16 [INFO] [TRAIN] epoch: 99, iter: 124450/160000, loss: 0.4893, lr: 0.000269, batch_cost: 0.1957, reader_cost: 0.00047, ips: 40.8742 samples/sec | ETA 01:55:57 2022-08-23 07:35:27 [INFO] [TRAIN] epoch: 99, iter: 124500/160000, loss: 0.4945, lr: 0.000269, batch_cost: 0.2135, reader_cost: 0.00114, ips: 37.4677 samples/sec | ETA 02:06:19 2022-08-23 07:35:36 [INFO] [TRAIN] epoch: 99, iter: 124550/160000, loss: 0.4711, lr: 0.000268, batch_cost: 0.1903, reader_cost: 0.00107, ips: 42.0338 samples/sec | ETA 01:52:26 2022-08-23 07:35:46 [INFO] [TRAIN] epoch: 99, iter: 124600/160000, loss: 0.5241, lr: 0.000268, batch_cost: 0.1880, reader_cost: 0.00091, ips: 42.5432 samples/sec | ETA 01:50:56 2022-08-23 07:35:55 [INFO] [TRAIN] epoch: 99, iter: 124650/160000, loss: 0.5307, lr: 0.000268, batch_cost: 0.1851, reader_cost: 0.00040, ips: 43.2206 samples/sec | ETA 01:49:03 2022-08-23 07:36:04 [INFO] [TRAIN] epoch: 99, iter: 124700/160000, loss: 0.4853, lr: 0.000267, batch_cost: 0.1885, reader_cost: 0.00155, ips: 42.4377 samples/sec | ETA 01:50:54 2022-08-23 07:36:13 [INFO] [TRAIN] epoch: 99, iter: 124750/160000, loss: 0.4741, lr: 0.000267, batch_cost: 0.1774, reader_cost: 0.00055, ips: 45.1044 samples/sec | ETA 01:44:12 2022-08-23 07:36:22 [INFO] [TRAIN] epoch: 99, iter: 124800/160000, loss: 0.4744, lr: 0.000267, batch_cost: 0.1848, reader_cost: 0.00073, ips: 43.2789 samples/sec | ETA 01:48:26 2022-08-23 07:36:32 [INFO] [TRAIN] epoch: 99, iter: 124850/160000, loss: 0.5214, lr: 0.000266, batch_cost: 0.1905, reader_cost: 0.00091, ips: 42.0046 samples/sec | ETA 01:51:34 2022-08-23 07:36:41 [INFO] [TRAIN] epoch: 99, iter: 124900/160000, loss: 0.4875, lr: 0.000266, batch_cost: 0.1828, reader_cost: 0.00096, ips: 43.7696 samples/sec | ETA 01:46:55 2022-08-23 07:36:50 [INFO] [TRAIN] epoch: 99, iter: 124950/160000, loss: 0.5176, lr: 0.000265, batch_cost: 0.1839, reader_cost: 0.00060, ips: 43.5030 samples/sec | ETA 01:47:25 2022-08-23 07:37:00 [INFO] [TRAIN] epoch: 99, iter: 125000/160000, loss: 0.4635, lr: 0.000265, batch_cost: 0.1986, reader_cost: 0.00066, ips: 40.2753 
samples/sec | ETA 01:55:52 2022-08-23 07:37:00 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 172s - batch_cost: 0.1722 - reader cost: 0.0011 2022-08-23 07:39:53 [INFO] [EVAL] #Images: 2000 mIoU: 0.3466 Acc: 0.7674 Kappa: 0.7496 Dice: 0.4778 2022-08-23 07:39:53 [INFO] [EVAL] Class IoU: [0.6907 0.786 0.9314 0.7304 0.6767 0.7612 0.7832 0.7891 0.5324 0.6021 0.4865 0.5653 0.7 0.2889 0.318 0.4344 0.5157 0.4258 0.6107 0.4189 0.7422 0.4736 0.6388 0.4927 0.3108 0.4187 0.4352 0.4676 0.3925 0.2329 0.2613 0.4347 0.2736 0.3623 0.2631 0.3824 0.4327 0.5478 0.2545 0.3963 0.2499 0.1032 0.3638 0.2641 0.2876 0.1965 0.2707 0.5202 0.6249 0.5245 0.5497 0.3531 0.1787 0.2129 0.6862 0.3995 0.8601 0.3906 0.4013 0.2312 0.0652 0.3932 0.3217 0.2403 0.4529 0.6807 0.2839 0.4009 0.0759 0.321 0.5025 0.5466 0.4113 0.1737 0.449 0.3187 0.5604 0.2564 0.2604 0.1845 0.6199 0.3955 0.3769 0.0333 0.0862 0.5714 0.1061 0.0931 0.279 0.5156 0.4109 0.0405 0.2193 0.1144 0.0053 0.0079 0.0272 0.1508 0.3092 0.4051 0.1184 0.0824 0.2669 0.3985 0.0212 0.5226 0.0509 0.5501 0.0745 0.2208 0.1436 0.4788 0.1305 0.4706 0.775 0.0126 0.3287 0.6457 0.0794 0.1096 0.4678 0.0137 0.2129 0.134 0.2435 0.2927 0.4886 0.4318 0.1864 0.3225 0.5859 0.0656 0.3074 0.2552 0.1471 0.139 0.1659 0.048 0.1809 0.3396 0.4436 0.013 0.373 0.0291 0.237 0.0054 0.3501 0.0236 0.2267 0.177 ] 2022-08-23 07:39:53 [INFO] [EVAL] Class Precision: [0.7841 0.863 0.9677 0.8181 0.753 0.8766 0.8833 0.8565 0.6935 0.7654 0.6887 0.711 0.7841 0.5011 0.5181 0.5913 0.653 0.6945 0.7327 0.6298 0.825 0.6202 0.7748 0.5892 0.5151 0.6105 0.5775 0.7711 0.6893 0.328 0.4817 0.5457 0.4994 0.492 0.4521 0.5113 0.6518 0.7758 0.4469 0.608 0.4513 0.2785 0.6169 0.5312 0.4112 0.3879 0.4345 0.72 0.811 0.6254 0.7208 0.4563 0.3443 0.4239 0.7088 0.6043 0.9229 0.6877 0.7619 0.4213 0.1349 0.5043 0.5001 0.6679 0.5518 0.7618 0.4094 0.5257 0.4009 0.5525 0.7294 0.7172 0.6343 0.3004 0.7035 0.5474 0.7074 0.532 0.6401 0.4467 0.7011 0.6611 0.7654 0.0904 0.2515 0.7557 0.3937 0.3886 0.4908 0.7402 0.547 0.0489 0.3548 0.38 0.0251 0.0284 0.4337 0.4311 0.444 0.6691 0.4115 0.2207 0.6515 0.7189 0.7166 0.6064 0.1755 0.8144 0.2516 0.5363 0.3001 0.6495 0.4831 0.6901 0.7803 0.0681 0.639 0.7722 0.1196 0.5725 0.5601 0.1686 0.6368 0.6039 0.6813 0.657 0.8486 0.5802 0.7274 0.462 0.6684 0.4592 0.4584 0.6407 0.5281 0.2659 0.357 0.0877 0.4242 0.6427 0.5855 0.019 0.6594 0.3446 0.5524 0.0067 0.861 0.4362 0.4811 0.7241] 2022-08-23 07:39:53 [INFO] [EVAL] Class Recall: [0.8528 0.8981 0.9613 0.8721 0.8697 0.8526 0.8736 0.9093 0.6963 0.7384 0.6237 0.734 0.8671 0.4055 0.4515 0.6209 0.7103 0.524 0.7858 0.5557 0.8809 0.6671 0.7844 0.7503 0.4394 0.5713 0.6386 0.5429 0.4768 0.4457 0.3634 0.6812 0.377 0.5788 0.3862 0.6027 0.5627 0.6508 0.3716 0.5323 0.359 0.1409 0.47 0.3443 0.4889 0.2848 0.418 0.6521 0.7314 0.7649 0.6985 0.6095 0.2708 0.2995 0.9556 0.5411 0.9267 0.4748 0.4588 0.3388 0.1119 0.641 0.4741 0.2729 0.7164 0.8648 0.4809 0.6281 0.0857 0.4338 0.6177 0.6968 0.5392 0.2915 0.5538 0.4327 0.7295 0.3312 0.3051 0.2391 0.8426 0.496 0.4261 0.05 0.1159 0.7009 0.1268 0.109 0.3925 0.6295 0.6228 0.1918 0.3649 0.1406 0.0067 0.0108 0.0282 0.1883 0.5046 0.5065 0.1426 0.1162 0.3114 0.4721 0.0213 0.7908 0.0669 0.6289 0.0958 0.2728 0.2158 0.6456 0.1517 0.5967 0.9912 0.0152 0.4036 0.7976 0.1913 0.1193 0.7396 0.0147 0.2424 0.1469 0.2748 0.3456 0.5352 0.628 0.2004 0.5164 0.8259 0.0711 0.4826 0.2978 0.1693 0.2255 0.2367 0.096 0.2397 0.4187 0.6466 0.0399 0.4619 0.0308 0.2933 0.0278 0.3711 0.0244 0.3001 0.1897] 2022-08-23 07:39:53 [INFO] 
[EVAL] The model with the best validation mIoU (0.3512) was saved at iter 120000. 2022-08-23 07:40:06 [INFO] [TRAIN] epoch: 100, iter: 125050/160000, loss: 0.5257, lr: 0.000265, batch_cost: 0.2579, reader_cost: 0.06950, ips: 31.0149 samples/sec | ETA 02:30:15 2022-08-23 07:40:15 [INFO] [TRAIN] epoch: 100, iter: 125100/160000, loss: 0.5095, lr: 0.000264, batch_cost: 0.1844, reader_cost: 0.00114, ips: 43.3879 samples/sec | ETA 01:47:14 2022-08-23 07:40:23 [INFO] [TRAIN] epoch: 100, iter: 125150/160000, loss: 0.4918, lr: 0.000264, batch_cost: 0.1688, reader_cost: 0.00061, ips: 47.3929 samples/sec | ETA 01:38:02 2022-08-23 07:40:32 [INFO] [TRAIN] epoch: 100, iter: 125200/160000, loss: 0.5127, lr: 0.000263, batch_cost: 0.1830, reader_cost: 0.00054, ips: 43.7277 samples/sec | ETA 01:46:06 2022-08-23 07:40:42 [INFO] [TRAIN] epoch: 100, iter: 125250/160000, loss: 0.5112, lr: 0.000263, batch_cost: 0.1825, reader_cost: 0.00033, ips: 43.8405 samples/sec | ETA 01:45:41 2022-08-23 07:40:51 [INFO] [TRAIN] epoch: 100, iter: 125300/160000, loss: 0.5240, lr: 0.000263, batch_cost: 0.1907, reader_cost: 0.00332, ips: 41.9615 samples/sec | ETA 01:50:15 2022-08-23 07:41:00 [INFO] [TRAIN] epoch: 100, iter: 125350/160000, loss: 0.5152, lr: 0.000262, batch_cost: 0.1827, reader_cost: 0.00054, ips: 43.7928 samples/sec | ETA 01:45:29 2022-08-23 07:41:10 [INFO] [TRAIN] epoch: 100, iter: 125400/160000, loss: 0.5425, lr: 0.000262, batch_cost: 0.1934, reader_cost: 0.00055, ips: 41.3609 samples/sec | ETA 01:51:32 2022-08-23 07:41:20 [INFO] [TRAIN] epoch: 100, iter: 125450/160000, loss: 0.5036, lr: 0.000262, batch_cost: 0.2074, reader_cost: 0.00113, ips: 38.5817 samples/sec | ETA 01:59:24 2022-08-23 07:41:29 [INFO] [TRAIN] epoch: 100, iter: 125500/160000, loss: 0.5546, lr: 0.000261, batch_cost: 0.1768, reader_cost: 0.00066, ips: 45.2434 samples/sec | ETA 01:41:40 2022-08-23 07:41:38 [INFO] [TRAIN] epoch: 100, iter: 125550/160000, loss: 0.5330, lr: 0.000261, batch_cost: 0.1751, reader_cost: 0.00082, ips: 45.6801 samples/sec | ETA 01:40:33 2022-08-23 07:41:47 [INFO] [TRAIN] epoch: 100, iter: 125600/160000, loss: 0.5364, lr: 0.000260, batch_cost: 0.1727, reader_cost: 0.00087, ips: 46.3333 samples/sec | ETA 01:38:59 2022-08-23 07:41:56 [INFO] [TRAIN] epoch: 100, iter: 125650/160000, loss: 0.4692, lr: 0.000260, batch_cost: 0.1800, reader_cost: 0.00212, ips: 44.4515 samples/sec | ETA 01:43:02 2022-08-23 07:42:04 [INFO] [TRAIN] epoch: 100, iter: 125700/160000, loss: 0.5326, lr: 0.000260, batch_cost: 0.1712, reader_cost: 0.00062, ips: 46.7166 samples/sec | ETA 01:37:53 2022-08-23 07:42:12 [INFO] [TRAIN] epoch: 100, iter: 125750/160000, loss: 0.4898, lr: 0.000259, batch_cost: 0.1635, reader_cost: 0.00083, ips: 48.9269 samples/sec | ETA 01:33:20 2022-08-23 07:42:21 [INFO] [TRAIN] epoch: 100, iter: 125800/160000, loss: 0.5410, lr: 0.000259, batch_cost: 0.1817, reader_cost: 0.00061, ips: 44.0357 samples/sec | ETA 01:43:33 2022-08-23 07:42:31 [INFO] [TRAIN] epoch: 100, iter: 125850/160000, loss: 0.4591, lr: 0.000259, batch_cost: 0.1899, reader_cost: 0.00046, ips: 42.1357 samples/sec | ETA 01:48:03 2022-08-23 07:42:40 [INFO] [TRAIN] epoch: 100, iter: 125900/160000, loss: 0.4743, lr: 0.000258, batch_cost: 0.1774, reader_cost: 0.00080, ips: 45.0996 samples/sec | ETA 01:40:48 2022-08-23 07:42:49 [INFO] [TRAIN] epoch: 100, iter: 125950/160000, loss: 0.5036, lr: 0.000258, batch_cost: 0.1934, reader_cost: 0.00078, ips: 41.3734 samples/sec | ETA 01:49:43 2022-08-23 07:42:59 [INFO] [TRAIN] epoch: 100, iter: 126000/160000, loss: 0.4956, lr: 
0.000257, batch_cost: 0.1862, reader_cost: 0.00042, ips: 42.9674 samples/sec | ETA 01:45:30 2022-08-23 07:42:59 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 170s - batch_cost: 0.1698 - reader cost: 8.0184e-04 2022-08-23 07:45:49 [INFO] [EVAL] #Images: 2000 mIoU: 0.3505 Acc: 0.7693 Kappa: 0.7516 Dice: 0.4832 2022-08-23 07:45:49 [INFO] [EVAL] Class IoU: [0.6924 0.7784 0.9317 0.7349 0.6783 0.7692 0.7864 0.7797 0.5341 0.6054 0.492 0.5506 0.7007 0.3115 0.3142 0.4345 0.5047 0.4233 0.6193 0.4218 0.739 0.5179 0.6372 0.4967 0.3383 0.3207 0.5107 0.4612 0.4263 0.2358 0.2797 0.4772 0.2775 0.3673 0.2647 0.4014 0.4411 0.5305 0.2538 0.3868 0.2646 0.0981 0.3661 0.266 0.2878 0.2089 0.2892 0.5114 0.6334 0.5581 0.5328 0.3667 0.2138 0.1834 0.6916 0.4336 0.8656 0.3949 0.5089 0.2362 0.0631 0.3627 0.3305 0.2225 0.4559 0.7219 0.2923 0.4207 0.1302 0.3509 0.5055 0.5179 0.4005 0.2075 0.4523 0.3451 0.5897 0.2945 0.2752 0.1445 0.594 0.4031 0.3359 0.0357 0.0685 0.5658 0.117 0.0891 0.2736 0.4904 0.4308 0.0718 0.2234 0.1015 0.0029 0.0186 0.0387 0.1521 0.2962 0.4229 0.1329 0.0416 0.2766 0.1234 0.1423 0.4846 0.0878 0.5263 0.0872 0.2253 0.1424 0.4529 0.131 0.4976 0.77 0.0357 0.3252 0.6256 0.1332 0.1312 0.4442 0.0127 0.2162 0.1559 0.2476 0.2285 0.4866 0.4392 0.345 0.3494 0.5577 0.0513 0.3271 0.2867 0.17 0.1223 0.1529 0.0421 0.2283 0.3545 0.421 0.0265 0.3306 0.0136 0.2638 0.005 0.3597 0.0304 0.2269 0.1555] 2022-08-23 07:45:49 [INFO] [EVAL] Class Precision: [0.785 0.8601 0.9658 0.834 0.7498 0.8561 0.8646 0.8335 0.6813 0.7623 0.7403 0.7283 0.7784 0.4895 0.5495 0.5738 0.6917 0.7167 0.7676 0.6167 0.8204 0.6718 0.7589 0.6234 0.5628 0.5348 0.6674 0.7595 0.6562 0.3389 0.4467 0.6036 0.4907 0.5212 0.444 0.5858 0.6233 0.7575 0.4201 0.6386 0.4394 0.2718 0.6217 0.4987 0.3911 0.4342 0.5201 0.7154 0.7198 0.6331 0.8403 0.4652 0.4327 0.4411 0.758 0.5791 0.9182 0.6787 0.7062 0.4006 0.1184 0.5174 0.5325 0.6624 0.5448 0.8365 0.3912 0.5721 0.2747 0.6041 0.6346 0.7137 0.6094 0.2954 0.709 0.5169 0.777 0.5765 0.7613 0.4089 0.6883 0.6802 0.8107 0.0912 0.2164 0.7604 0.4338 0.422 0.5687 0.7783 0.5948 0.0993 0.3902 0.3763 0.0099 0.0584 0.5923 0.4027 0.5087 0.6922 0.4422 0.0997 0.6529 0.6497 0.7914 0.5297 0.2191 0.8038 0.2702 0.5255 0.3257 0.6377 0.4984 0.7552 0.7735 0.1851 0.68 0.7374 0.1728 0.597 0.6434 0.1595 0.6201 0.5613 0.7057 0.6235 0.853 0.6104 0.7315 0.5943 0.6279 0.539 0.464 0.6824 0.6215 0.2378 0.3278 0.0897 0.4498 0.6136 0.5248 0.0357 0.6727 0.1435 0.488 0.0092 0.8291 0.3758 0.6842 0.7182] 2022-08-23 07:45:49 [INFO] [EVAL] Class Recall: [0.8544 0.8912 0.9634 0.8608 0.8768 0.8834 0.8969 0.9235 0.7121 0.7463 0.5946 0.6928 0.8753 0.4614 0.4233 0.6416 0.6512 0.5083 0.7622 0.5717 0.8817 0.6932 0.7989 0.7097 0.459 0.4448 0.685 0.5401 0.5488 0.4366 0.4279 0.695 0.3898 0.5544 0.3959 0.5605 0.6014 0.6391 0.3907 0.4952 0.3993 0.1331 0.4711 0.3631 0.5214 0.2871 0.3944 0.642 0.8407 0.825 0.5928 0.634 0.2972 0.239 0.8876 0.6332 0.9378 0.4857 0.6456 0.3653 0.1191 0.5482 0.4656 0.2509 0.7362 0.8405 0.5362 0.6138 0.1985 0.4557 0.7132 0.6537 0.5388 0.4108 0.5555 0.5093 0.7098 0.3758 0.3011 0.1826 0.8127 0.4973 0.3645 0.0554 0.0911 0.6885 0.1381 0.1014 0.3452 0.57 0.6098 0.206 0.3431 0.122 0.0041 0.0266 0.0397 0.1964 0.4149 0.5209 0.1597 0.0667 0.3243 0.1322 0.1478 0.8508 0.1278 0.6039 0.114 0.2828 0.2019 0.6097 0.151 0.5934 0.9941 0.0423 0.3839 0.8051 0.3678 0.144 0.5893 0.0136 0.2493 0.1775 0.2761 0.265 0.5311 0.6102 0.395 0.4589 0.833 0.0536 0.5259 0.3309 0.1897 0.2013 0.2228 0.0736 0.3167 0.4564 0.6804 0.0931 0.3939 
0.0148 0.3647 0.0106 0.3885 0.032 0.2535 0.1656] 2022-08-23 07:45:49 [INFO] [EVAL] The model with the best validation mIoU (0.3512) was saved at iter 120000. 2022-08-23 07:45:59 [INFO] [TRAIN] epoch: 100, iter: 126050/160000, loss: 0.5083, lr: 0.000257, batch_cost: 0.2085, reader_cost: 0.00430, ips: 38.3609 samples/sec | ETA 01:58:00 2022-08-23 07:46:08 [INFO] [TRAIN] epoch: 100, iter: 126100/160000, loss: 0.5025, lr: 0.000257, batch_cost: 0.1764, reader_cost: 0.00140, ips: 45.3512 samples/sec | ETA 01:39:40 2022-08-23 07:46:17 [INFO] [TRAIN] epoch: 100, iter: 126150/160000, loss: 0.5032, lr: 0.000256, batch_cost: 0.1813, reader_cost: 0.00065, ips: 44.1185 samples/sec | ETA 01:42:18 2022-08-23 07:46:27 [INFO] [TRAIN] epoch: 100, iter: 126200/160000, loss: 0.5180, lr: 0.000256, batch_cost: 0.1887, reader_cost: 0.00043, ips: 42.4045 samples/sec | ETA 01:46:16 2022-08-23 07:46:36 [INFO] [TRAIN] epoch: 100, iter: 126250/160000, loss: 0.5086, lr: 0.000256, batch_cost: 0.1909, reader_cost: 0.00104, ips: 41.9116 samples/sec | ETA 01:47:22 2022-08-23 07:46:46 [INFO] [TRAIN] epoch: 100, iter: 126300/160000, loss: 0.4888, lr: 0.000255, batch_cost: 0.2027, reader_cost: 0.00058, ips: 39.4660 samples/sec | ETA 01:53:51 2022-08-23 07:47:01 [INFO] [TRAIN] epoch: 101, iter: 126350/160000, loss: 0.4884, lr: 0.000255, batch_cost: 0.2842, reader_cost: 0.07402, ips: 28.1526 samples/sec | ETA 02:39:22 2022-08-23 07:47:10 [INFO] [TRAIN] epoch: 101, iter: 126400/160000, loss: 0.4700, lr: 0.000254, batch_cost: 0.1919, reader_cost: 0.00060, ips: 41.6879 samples/sec | ETA 01:47:27 2022-08-23 07:47:20 [INFO] [TRAIN] epoch: 101, iter: 126450/160000, loss: 0.5316, lr: 0.000254, batch_cost: 0.2032, reader_cost: 0.00080, ips: 39.3688 samples/sec | ETA 01:53:37 2022-08-23 07:47:30 [INFO] [TRAIN] epoch: 101, iter: 126500/160000, loss: 0.5181, lr: 0.000254, batch_cost: 0.1851, reader_cost: 0.00119, ips: 43.2278 samples/sec | ETA 01:43:19 2022-08-23 07:47:40 [INFO] [TRAIN] epoch: 101, iter: 126550/160000, loss: 0.5022, lr: 0.000253, batch_cost: 0.2008, reader_cost: 0.00084, ips: 39.8382 samples/sec | ETA 01:51:57 2022-08-23 07:47:49 [INFO] [TRAIN] epoch: 101, iter: 126600/160000, loss: 0.4836, lr: 0.000253, batch_cost: 0.1798, reader_cost: 0.00081, ips: 44.5019 samples/sec | ETA 01:40:04 2022-08-23 07:47:58 [INFO] [TRAIN] epoch: 101, iter: 126650/160000, loss: 0.4938, lr: 0.000252, batch_cost: 0.1856, reader_cost: 0.00072, ips: 43.0944 samples/sec | ETA 01:43:11 2022-08-23 07:48:07 [INFO] [TRAIN] epoch: 101, iter: 126700/160000, loss: 0.5205, lr: 0.000252, batch_cost: 0.1888, reader_cost: 0.00059, ips: 42.3691 samples/sec | ETA 01:44:47 2022-08-23 07:48:17 [INFO] [TRAIN] epoch: 101, iter: 126750/160000, loss: 0.5193, lr: 0.000252, batch_cost: 0.1957, reader_cost: 0.00095, ips: 40.8810 samples/sec | ETA 01:48:26 2022-08-23 07:48:26 [INFO] [TRAIN] epoch: 101, iter: 126800/160000, loss: 0.4903, lr: 0.000251, batch_cost: 0.1828, reader_cost: 0.00064, ips: 43.7710 samples/sec | ETA 01:41:07 2022-08-23 07:48:35 [INFO] [TRAIN] epoch: 101, iter: 126850/160000, loss: 0.5217, lr: 0.000251, batch_cost: 0.1790, reader_cost: 0.00102, ips: 44.6972 samples/sec | ETA 01:38:53 2022-08-23 07:48:44 [INFO] [TRAIN] epoch: 101, iter: 126900/160000, loss: 0.4881, lr: 0.000251, batch_cost: 0.1724, reader_cost: 0.00050, ips: 46.4171 samples/sec | ETA 01:35:04 2022-08-23 07:48:52 [INFO] [TRAIN] epoch: 101, iter: 126950/160000, loss: 0.4706, lr: 0.000250, batch_cost: 0.1664, reader_cost: 0.00041, ips: 48.0691 samples/sec | ETA 01:31:40 2022-08-23 
07:49:01 [INFO] [TRAIN] epoch: 101, iter: 127000/160000, loss: 0.5060, lr: 0.000250, batch_cost: 0.1716, reader_cost: 0.00061, ips: 46.6145 samples/sec | ETA 01:34:23 2022-08-23 07:49:01 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1587 - reader cost: 7.0603e-04 2022-08-23 07:51:40 [INFO] [EVAL] #Images: 2000 mIoU: 0.3493 Acc: 0.7696 Kappa: 0.7520 Dice: 0.4807 2022-08-23 07:51:40 [INFO] [EVAL] Class IoU: [0.6924 0.7826 0.9328 0.7298 0.6843 0.7705 0.7892 0.7905 0.5285 0.6018 0.5096 0.5611 0.6912 0.291 0.3091 0.4423 0.5026 0.4471 0.6056 0.4232 0.7356 0.5155 0.6344 0.4972 0.3178 0.3407 0.5124 0.4518 0.4209 0.2629 0.288 0.4585 0.2701 0.3671 0.2448 0.3998 0.4311 0.527 0.2511 0.3965 0.2693 0.1035 0.364 0.2671 0.2889 0.1792 0.3177 0.5155 0.6528 0.5383 0.5737 0.4 0.1857 0.2223 0.6968 0.4531 0.8578 0.3989 0.3972 0.2607 0.0857 0.4293 0.3406 0.2463 0.4652 0.6885 0.289 0.3891 0.1349 0.3235 0.5161 0.5413 0.4132 0.1958 0.4367 0.3464 0.5425 0.256 0.2777 0.163 0.58 0.4008 0.4008 0.0334 0.0924 0.5691 0.122 0.1003 0.2669 0.4903 0.4437 0.0511 0.2205 0.1155 0.0028 0.0101 0.0286 0.1326 0.2987 0.4205 0.159 0.0349 0.2839 0.1485 0.0819 0.5586 0.0377 0.5271 0.1093 0.2342 0.1371 0.5038 0.1423 0.4762 0.8386 0.0199 0.3691 0.6269 0.1048 0.1313 0.4768 0.0084 0.2223 0.1563 0.2393 0.2172 0.4827 0.4509 0.0576 0.3587 0.562 0.0702 0.3287 0.2292 0.1995 0.1184 0.1564 0.0395 0.2108 0.3731 0.2918 0.0034 0.3392 0.0588 0.232 0.0018 0.3807 0.0236 0.221 0.1484] 2022-08-23 07:51:40 [INFO] [EVAL] Class Precision: [0.7857 0.8613 0.9635 0.825 0.7702 0.8745 0.87 0.8553 0.7022 0.7403 0.6846 0.7349 0.7627 0.4998 0.5488 0.5971 0.6659 0.6894 0.7158 0.6412 0.8165 0.662 0.7612 0.6408 0.4998 0.5742 0.714 0.77 0.6444 0.3757 0.4143 0.6131 0.5216 0.4954 0.4225 0.5369 0.6288 0.7709 0.4283 0.6477 0.4245 0.2626 0.6066 0.5265 0.4554 0.3757 0.5013 0.7212 0.7674 0.6247 0.7618 0.5777 0.3644 0.4655 0.7383 0.6191 0.9173 0.6765 0.8178 0.4776 0.1364 0.6096 0.5182 0.682 0.5631 0.7696 0.3875 0.5303 0.3214 0.568 0.6601 0.7198 0.6262 0.3473 0.7398 0.5129 0.7611 0.5128 0.7183 0.4413 0.6753 0.6674 0.741 0.0991 0.255 0.7314 0.4373 0.4033 0.5071 0.7277 0.6276 0.0646 0.4097 0.38 0.0172 0.0298 0.6786 0.3995 0.4794 0.6472 0.4417 0.0923 0.5945 0.6029 0.8141 0.6008 0.1492 0.7763 0.2365 0.5174 0.2872 0.745 0.4015 0.6162 0.8478 0.1082 0.638 0.7318 0.1376 0.647 0.5855 0.1606 0.6781 0.6167 0.6647 0.5805 0.8924 0.6623 0.1307 0.5585 0.6269 0.4292 0.5032 0.6519 0.59 0.2375 0.3262 0.1019 0.4539 0.5996 0.335 0.0049 0.6254 0.3488 0.5679 0.0035 0.8076 0.346 0.6088 0.7582] 2022-08-23 07:51:40 [INFO] [EVAL] Class Recall: [0.8536 0.8955 0.967 0.8634 0.8599 0.8662 0.8948 0.9126 0.6812 0.7628 0.6659 0.7035 0.8805 0.4106 0.4143 0.6304 0.672 0.56 0.7972 0.5545 0.8812 0.6996 0.7921 0.6894 0.4661 0.4559 0.6447 0.5224 0.5483 0.4668 0.4857 0.6452 0.3591 0.5862 0.3679 0.6102 0.5783 0.6249 0.3776 0.5055 0.4242 0.1459 0.4765 0.3515 0.4414 0.2551 0.4645 0.6438 0.8138 0.7955 0.6991 0.5652 0.2748 0.2985 0.9253 0.6282 0.9297 0.4928 0.4358 0.3647 0.1872 0.5921 0.4984 0.2783 0.728 0.8673 0.5321 0.5939 0.1885 0.4291 0.7028 0.6858 0.5486 0.31 0.5159 0.5163 0.6538 0.3383 0.3116 0.2055 0.8042 0.5009 0.4661 0.0481 0.1266 0.7194 0.1447 0.1178 0.3604 0.6005 0.6022 0.1972 0.3231 0.1423 0.0033 0.015 0.029 0.1657 0.4422 0.5456 0.199 0.0532 0.3521 0.1646 0.0835 0.8883 0.048 0.6215 0.1689 0.2997 0.2078 0.6088 0.1806 0.677 0.9872 0.0238 0.4669 0.8138 0.305 0.1414 0.7199 0.0088 0.2485 0.1732 0.2722 0.2576 0.5126 0.5854 0.0933 0.5006 0.8445 0.0774 0.4866 
0.2611 0.2316 0.1909 0.231 0.0606 0.2824 0.497 0.6931 0.0106 0.4257 0.066 0.2818 0.0035 0.4187 0.0247 0.2575 0.1557] 2022-08-23 07:51:40 [INFO] [EVAL] The model with the best validation mIoU (0.3512) was saved at iter 120000. 2022-08-23 07:51:49 [INFO] [TRAIN] epoch: 101, iter: 127050/160000, loss: 0.5115, lr: 0.000249, batch_cost: 0.1802, reader_cost: 0.00377, ips: 44.4057 samples/sec | ETA 01:38:56 2022-08-23 07:51:58 [INFO] [TRAIN] epoch: 101, iter: 127100/160000, loss: 0.4924, lr: 0.000249, batch_cost: 0.1898, reader_cost: 0.00158, ips: 42.1427 samples/sec | ETA 01:44:05 2022-08-23 07:52:07 [INFO] [TRAIN] epoch: 101, iter: 127150/160000, loss: 0.5498, lr: 0.000249, batch_cost: 0.1823, reader_cost: 0.00106, ips: 43.8791 samples/sec | ETA 01:39:49 2022-08-23 07:52:16 [INFO] [TRAIN] epoch: 101, iter: 127200/160000, loss: 0.5355, lr: 0.000248, batch_cost: 0.1717, reader_cost: 0.00081, ips: 46.5948 samples/sec | ETA 01:33:51 2022-08-23 07:52:26 [INFO] [TRAIN] epoch: 101, iter: 127250/160000, loss: 0.5184, lr: 0.000248, batch_cost: 0.1931, reader_cost: 0.00054, ips: 41.4223 samples/sec | ETA 01:45:25 2022-08-23 07:52:35 [INFO] [TRAIN] epoch: 101, iter: 127300/160000, loss: 0.4994, lr: 0.000248, batch_cost: 0.1916, reader_cost: 0.00046, ips: 41.7624 samples/sec | ETA 01:44:24 2022-08-23 07:52:45 [INFO] [TRAIN] epoch: 101, iter: 127350/160000, loss: 0.4845, lr: 0.000247, batch_cost: 0.1931, reader_cost: 0.00066, ips: 41.4194 samples/sec | ETA 01:45:06 2022-08-23 07:52:55 [INFO] [TRAIN] epoch: 101, iter: 127400/160000, loss: 0.4991, lr: 0.000247, batch_cost: 0.1929, reader_cost: 0.00112, ips: 41.4741 samples/sec | ETA 01:44:48 2022-08-23 07:53:05 [INFO] [TRAIN] epoch: 101, iter: 127450/160000, loss: 0.4990, lr: 0.000246, batch_cost: 0.2133, reader_cost: 0.00032, ips: 37.5096 samples/sec | ETA 01:55:42 2022-08-23 07:53:14 [INFO] [TRAIN] epoch: 101, iter: 127500/160000, loss: 0.4967, lr: 0.000246, batch_cost: 0.1714, reader_cost: 0.00094, ips: 46.6674 samples/sec | ETA 01:32:51 2022-08-23 07:53:23 [INFO] [TRAIN] epoch: 101, iter: 127550/160000, loss: 0.5079, lr: 0.000246, batch_cost: 0.1819, reader_cost: 0.00054, ips: 43.9699 samples/sec | ETA 01:38:24 2022-08-23 07:53:36 [INFO] [TRAIN] epoch: 102, iter: 127600/160000, loss: 0.4710, lr: 0.000245, batch_cost: 0.2702, reader_cost: 0.09160, ips: 29.6074 samples/sec | ETA 02:25:54 2022-08-23 07:53:47 [INFO] [TRAIN] epoch: 102, iter: 127650/160000, loss: 0.4902, lr: 0.000245, batch_cost: 0.2052, reader_cost: 0.00147, ips: 38.9943 samples/sec | ETA 01:50:36 2022-08-23 07:53:57 [INFO] [TRAIN] epoch: 102, iter: 127700/160000, loss: 0.5465, lr: 0.000245, batch_cost: 0.2015, reader_cost: 0.00073, ips: 39.7097 samples/sec | ETA 01:48:27 2022-08-23 07:54:07 [INFO] [TRAIN] epoch: 102, iter: 127750/160000, loss: 0.4815, lr: 0.000244, batch_cost: 0.1969, reader_cost: 0.00052, ips: 40.6352 samples/sec | ETA 01:45:49 2022-08-23 07:54:16 [INFO] [TRAIN] epoch: 102, iter: 127800/160000, loss: 0.5125, lr: 0.000244, batch_cost: 0.1860, reader_cost: 0.00093, ips: 43.0050 samples/sec | ETA 01:39:50 2022-08-23 07:54:26 [INFO] [TRAIN] epoch: 102, iter: 127850/160000, loss: 0.4603, lr: 0.000243, batch_cost: 0.1966, reader_cost: 0.00088, ips: 40.6937 samples/sec | ETA 01:45:20 2022-08-23 07:54:36 [INFO] [TRAIN] epoch: 102, iter: 127900/160000, loss: 0.4852, lr: 0.000243, batch_cost: 0.2020, reader_cost: 0.00041, ips: 39.5997 samples/sec | ETA 01:48:04 2022-08-23 07:54:46 [INFO] [TRAIN] epoch: 102, iter: 127950/160000, loss: 0.4787, lr: 0.000243, batch_cost: 0.1980, 
reader_cost: 0.00068, ips: 40.3944 samples/sec | ETA 01:45:47 2022-08-23 07:54:55 [INFO] [TRAIN] epoch: 102, iter: 128000/160000, loss: 0.5348, lr: 0.000242, batch_cost: 0.1945, reader_cost: 0.00065, ips: 41.1322 samples/sec | ETA 01:43:43 2022-08-23 07:54:55 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 186s - batch_cost: 0.1856 - reader cost: 5.7851e-04 2022-08-23 07:58:01 [INFO] [EVAL] #Images: 2000 mIoU: 0.3493 Acc: 0.7702 Kappa: 0.7527 Dice: 0.4805 2022-08-23 07:58:01 [INFO] [EVAL] Class IoU: [0.6936 0.7813 0.9323 0.7357 0.6838 0.7679 0.7897 0.7884 0.5287 0.6213 0.4971 0.5679 0.7051 0.311 0.3233 0.4414 0.5237 0.4355 0.6133 0.4271 0.7372 0.5024 0.6368 0.4938 0.3186 0.4189 0.4779 0.448 0.4065 0.2386 0.2427 0.4617 0.2815 0.3644 0.3036 0.3974 0.4375 0.5369 0.2531 0.4084 0.2593 0.122 0.3632 0.2688 0.2865 0.1709 0.296 0.5241 0.5851 0.5379 0.5494 0.4554 0.2212 0.1682 0.689 0.4537 0.863 0.4095 0.4615 0.2588 0.0553 0.4044 0.2922 0.2598 0.4796 0.703 0.2914 0.3748 0.0633 0.3124 0.5192 0.5252 0.4145 0.1965 0.4312 0.3509 0.5285 0.2791 0.2799 0.1614 0.6217 0.4078 0.3828 0.0337 0.09 0.5731 0.1019 0.0881 0.2509 0.4976 0.4493 0.053 0.2351 0.1154 0.0023 0.0093 0.0385 0.1743 0.2942 0.4441 0.1184 0.0627 0.2992 0.1556 0.0189 0.5268 0.0492 0.5131 0.0974 0.2124 0.1518 0.4121 0.1285 0.4953 0.8202 0.0221 0.3363 0.6364 0.097 0.137 0.4752 0.0154 0.2063 0.1419 0.2551 0.2585 0.4851 0.4364 0.1724 0.3635 0.5964 0.0898 0.3299 0.2938 0.1871 0.1273 0.1638 0.0327 0.2099 0.3454 0.2835 0.0108 0.3447 0.0112 0.2453 0.0011 0.3642 0.0291 0.2265 0.1401] 2022-08-23 07:58:01 [INFO] [EVAL] Class Precision: [0.79 0.8651 0.9633 0.8337 0.7618 0.8736 0.8806 0.8512 0.6843 0.7772 0.7227 0.7222 0.7856 0.4837 0.5473 0.6011 0.6899 0.6926 0.7602 0.6295 0.8202 0.6446 0.7638 0.5907 0.4912 0.5452 0.6317 0.7633 0.683 0.379 0.478 0.5784 0.4376 0.4836 0.4766 0.5282 0.641 0.7552 0.4506 0.6479 0.4512 0.2586 0.6205 0.5061 0.379 0.3879 0.4875 0.7167 0.6432 0.6348 0.8056 0.6584 0.3898 0.4333 0.7188 0.6937 0.9408 0.6911 0.674 0.4417 0.1247 0.5749 0.3594 0.648 0.5967 0.7941 0.3931 0.5133 0.2241 0.5655 0.679 0.7017 0.6227 0.2978 0.6466 0.5119 0.6692 0.5604 0.7782 0.3543 0.7148 0.6482 0.7662 0.0835 0.2474 0.7332 0.3537 0.4199 0.4439 0.7654 0.6084 0.0664 0.4092 0.3431 0.0118 0.0285 0.4234 0.4118 0.4905 0.647 0.3857 0.1416 0.6219 0.6079 0.5819 0.5607 0.1818 0.804 0.2619 0.528 0.3029 0.5464 0.4274 0.7402 0.8264 0.1764 0.6227 0.7568 0.1417 0.5675 0.596 0.1532 0.5991 0.604 0.6309 0.6185 0.7648 0.5843 0.7411 0.5892 0.669 0.4352 0.4365 0.6533 0.6467 0.2616 0.3624 0.0841 0.474 0.6686 0.3275 0.015 0.5916 0.1278 0.5871 0.0028 0.8161 0.3411 0.4843 0.6761] 2022-08-23 07:58:01 [INFO] [EVAL] Class Recall: [0.8503 0.8898 0.9666 0.8623 0.8697 0.8639 0.8844 0.9144 0.6992 0.756 0.6143 0.7267 0.8732 0.4655 0.4413 0.6242 0.6849 0.5399 0.7604 0.5704 0.8792 0.6949 0.793 0.7506 0.4756 0.6438 0.6625 0.5203 0.5009 0.3916 0.3302 0.6959 0.441 0.5965 0.4555 0.616 0.5794 0.65 0.3661 0.5249 0.3788 0.1877 0.4669 0.3644 0.5401 0.234 0.4298 0.6611 0.8662 0.7789 0.6334 0.5964 0.3384 0.2156 0.9431 0.5673 0.9125 0.5012 0.5941 0.3846 0.0903 0.5768 0.6098 0.3025 0.7096 0.8597 0.5297 0.5815 0.0811 0.4111 0.688 0.6763 0.5536 0.366 0.5642 0.5272 0.7155 0.3574 0.3042 0.2287 0.8267 0.5237 0.4334 0.0536 0.1239 0.7241 0.1253 0.1003 0.3659 0.5871 0.632 0.2081 0.3559 0.1481 0.0029 0.0137 0.0406 0.232 0.4237 0.5861 0.1459 0.1011 0.3657 0.173 0.0192 0.897 0.0631 0.5864 0.1343 0.2622 0.2332 0.6264 0.1552 0.5995 0.991 0.0246 0.4223 0.8 0.2354 0.153 0.701 0.0168 0.2394 
0.1565 0.2999 0.3075 0.5701 0.633 0.1834 0.4869 0.8459 0.1017 0.5746 0.348 0.2084 0.1987 0.2301 0.0507 0.2737 0.4168 0.6787 0.0373 0.4524 0.0121 0.2964 0.0019 0.3968 0.0309 0.2984 0.1502] 2022-08-23 07:58:01 [INFO] [EVAL] The model with the best validation mIoU (0.3512) was saved at iter 120000. 2022-08-23 07:58:12 [INFO] [TRAIN] epoch: 102, iter: 128050/160000, loss: 0.5192, lr: 0.000242, batch_cost: 0.2020, reader_cost: 0.00456, ips: 39.5969 samples/sec | ETA 01:47:35 2022-08-23 07:58:22 [INFO] [TRAIN] epoch: 102, iter: 128100/160000, loss: 0.5332, lr: 0.000242, batch_cost: 0.2002, reader_cost: 0.00127, ips: 39.9613 samples/sec | ETA 01:46:26 2022-08-23 07:58:31 [INFO] [TRAIN] epoch: 102, iter: 128150/160000, loss: 0.5087, lr: 0.000241, batch_cost: 0.1896, reader_cost: 0.00042, ips: 42.2006 samples/sec | ETA 01:40:37 2022-08-23 07:58:41 [INFO] [TRAIN] epoch: 102, iter: 128200/160000, loss: 0.5266, lr: 0.000241, batch_cost: 0.1999, reader_cost: 0.00050, ips: 40.0300 samples/sec | ETA 01:45:55 2022-08-23 07:58:51 [INFO] [TRAIN] epoch: 102, iter: 128250/160000, loss: 0.4834, lr: 0.000240, batch_cost: 0.1946, reader_cost: 0.00066, ips: 41.1026 samples/sec | ETA 01:42:59 2022-08-23 07:59:00 [INFO] [TRAIN] epoch: 102, iter: 128300/160000, loss: 0.4641, lr: 0.000240, batch_cost: 0.1806, reader_cost: 0.00081, ips: 44.2854 samples/sec | ETA 01:35:26 2022-08-23 07:59:08 [INFO] [TRAIN] epoch: 102, iter: 128350/160000, loss: 0.5217, lr: 0.000240, batch_cost: 0.1669, reader_cost: 0.00060, ips: 47.9412 samples/sec | ETA 01:28:01 2022-08-23 07:59:18 [INFO] [TRAIN] epoch: 102, iter: 128400/160000, loss: 0.5284, lr: 0.000239, batch_cost: 0.1950, reader_cost: 0.00052, ips: 41.0254 samples/sec | ETA 01:42:42 2022-08-23 07:59:28 [INFO] [TRAIN] epoch: 102, iter: 128450/160000, loss: 0.4891, lr: 0.000239, batch_cost: 0.1953, reader_cost: 0.00094, ips: 40.9549 samples/sec | ETA 01:42:42 2022-08-23 07:59:38 [INFO] [TRAIN] epoch: 102, iter: 128500/160000, loss: 0.5350, lr: 0.000238, batch_cost: 0.2103, reader_cost: 0.00074, ips: 38.0485 samples/sec | ETA 01:50:23 2022-08-23 07:59:47 [INFO] [TRAIN] epoch: 102, iter: 128550/160000, loss: 0.5026, lr: 0.000238, batch_cost: 0.1780, reader_cost: 0.00063, ips: 44.9497 samples/sec | ETA 01:33:17 2022-08-23 07:59:56 [INFO] [TRAIN] epoch: 102, iter: 128600/160000, loss: 0.4808, lr: 0.000238, batch_cost: 0.1877, reader_cost: 0.00078, ips: 42.6143 samples/sec | ETA 01:38:14 2022-08-23 08:00:06 [INFO] [TRAIN] epoch: 102, iter: 128650/160000, loss: 0.5112, lr: 0.000237, batch_cost: 0.1946, reader_cost: 0.00081, ips: 41.1108 samples/sec | ETA 01:41:40 2022-08-23 08:00:16 [INFO] [TRAIN] epoch: 102, iter: 128700/160000, loss: 0.4856, lr: 0.000237, batch_cost: 0.2027, reader_cost: 0.00074, ips: 39.4675 samples/sec | ETA 01:45:44 2022-08-23 08:00:26 [INFO] [TRAIN] epoch: 102, iter: 128750/160000, loss: 0.5005, lr: 0.000237, batch_cost: 0.1918, reader_cost: 0.00085, ips: 41.7014 samples/sec | ETA 01:39:54 2022-08-23 08:00:37 [INFO] [TRAIN] epoch: 102, iter: 128800/160000, loss: 0.4892, lr: 0.000236, batch_cost: 0.2169, reader_cost: 0.00081, ips: 36.8810 samples/sec | ETA 01:52:47 2022-08-23 08:00:49 [INFO] [TRAIN] epoch: 103, iter: 128850/160000, loss: 0.5249, lr: 0.000236, batch_cost: 0.2402, reader_cost: 0.05246, ips: 33.3112 samples/sec | ETA 02:04:40 2022-08-23 08:00:59 [INFO] [TRAIN] epoch: 103, iter: 128900/160000, loss: 0.4962, lr: 0.000235, batch_cost: 0.1975, reader_cost: 0.00040, ips: 40.5123 samples/sec | ETA 01:42:21 2022-08-23 08:01:09 [INFO] [TRAIN] epoch: 103, iter: 
128950/160000, loss: 0.4902, lr: 0.000235, batch_cost: 0.2031, reader_cost: 0.00122, ips: 39.3945 samples/sec | ETA 01:45:05 2022-08-23 08:01:19 [INFO] [TRAIN] epoch: 103, iter: 129000/160000, loss: 0.5036, lr: 0.000235, batch_cost: 0.1966, reader_cost: 0.00083, ips: 40.6933 samples/sec | ETA 01:41:34 2022-08-23 08:01:19 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 160s - batch_cost: 0.1602 - reader cost: 5.0226e-04 2022-08-23 08:03:59 [INFO] [EVAL] #Images: 2000 mIoU: 0.3544 Acc: 0.7726 Kappa: 0.7550 Dice: 0.4856 2022-08-23 08:03:59 [INFO] [EVAL] Class IoU: [0.6939 0.7867 0.9321 0.7307 0.6816 0.7675 0.7925 0.7884 0.5354 0.6097 0.4989 0.57 0.7091 0.3076 0.3248 0.4389 0.5206 0.4282 0.6157 0.4314 0.742 0.5106 0.6317 0.5062 0.327 0.3633 0.48 0.4692 0.4172 0.2637 0.2855 0.4851 0.2726 0.366 0.3252 0.383 0.4347 0.5553 0.2639 0.4064 0.2217 0.1067 0.3578 0.2637 0.2969 0.2343 0.3131 0.5222 0.6404 0.5477 0.5628 0.4661 0.2218 0.2153 0.7221 0.4156 0.8679 0.4051 0.4885 0.2703 0.0856 0.4088 0.3355 0.247 0.4827 0.7076 0.3058 0.4 0.1234 0.3151 0.5326 0.5392 0.4023 0.2174 0.4401 0.3519 0.6083 0.2513 0.2702 0.1525 0.6394 0.4091 0.4136 0.0363 0.0711 0.5713 0.1065 0.1018 0.2873 0.4983 0.4328 0.0439 0.2016 0.1147 0.0076 0.023 0.0189 0.1508 0.3026 0.4044 0.0329 0.037 0.3055 0.1171 0.0067 0.5728 0.0443 0.5111 0.1098 0.2371 0.141 0.3801 0.1437 0.5013 0.8266 0.0273 0.3059 0.6181 0.096 0.1469 0.4848 0.0151 0.2169 0.1484 0.2672 0.2934 0.4744 0.4271 0.4271 0.3644 0.5646 0.0495 0.3514 0.2733 0.214 0.1346 0.1588 0.0419 0.2028 0.3704 0.3161 0.0085 0.346 0.0524 0.2538 0.0059 0.3848 0.0252 0.1912 0.1235] 2022-08-23 08:03:59 [INFO] [EVAL] Class Precision: [0.7816 0.8559 0.9678 0.8226 0.7639 0.8818 0.8813 0.8532 0.6656 0.7408 0.7268 0.7133 0.7923 0.4991 0.5271 0.6067 0.6999 0.7222 0.7742 0.6308 0.8303 0.6808 0.7541 0.6425 0.512 0.5895 0.6095 0.7098 0.6762 0.3706 0.4799 0.6647 0.5035 0.5026 0.5543 0.4931 0.6461 0.753 0.421 0.6368 0.471 0.2768 0.6618 0.525 0.4337 0.4314 0.5135 0.707 0.8125 0.6393 0.7559 0.6847 0.4129 0.6106 0.7525 0.5941 0.9237 0.7018 0.7625 0.5163 0.1728 0.5752 0.565 0.6913 0.6009 0.8134 0.4478 0.5456 0.3443 0.5471 0.7176 0.7296 0.6051 0.319 0.7205 0.5306 0.8111 0.5274 0.7457 0.2999 0.757 0.6673 0.7262 0.1062 0.2002 0.7423 0.3868 0.4029 0.721 0.7381 0.5924 0.06 0.4506 0.3655 0.0487 0.061 0.4951 0.4464 0.4714 0.657 0.2616 0.0991 0.6007 0.6342 0.3789 0.6155 0.1882 0.7979 0.2952 0.5577 0.3066 0.4958 0.3645 0.8077 0.8331 0.1286 0.6472 0.7231 0.1318 0.5319 0.6704 0.1553 0.6976 0.6139 0.6513 0.6543 0.7697 0.5511 0.8031 0.5894 0.627 0.3696 0.5245 0.5205 0.5096 0.2827 0.3625 0.0779 0.4109 0.6089 0.385 0.0123 0.599 0.3218 0.5073 0.0092 0.8123 0.3463 0.6713 0.713 ] 2022-08-23 08:03:59 [INFO] [EVAL] Class Recall: [0.8608 0.9068 0.9619 0.8674 0.8636 0.8555 0.8871 0.9122 0.7324 0.775 0.614 0.7395 0.871 0.445 0.4584 0.6135 0.6701 0.5127 0.7504 0.577 0.8746 0.6713 0.7956 0.7048 0.475 0.4864 0.6931 0.5806 0.5213 0.4775 0.4135 0.6422 0.3729 0.5739 0.4404 0.6318 0.5705 0.679 0.4143 0.529 0.2952 0.148 0.4378 0.3462 0.4849 0.339 0.445 0.6665 0.7515 0.7926 0.6878 0.5934 0.3241 0.2496 0.947 0.5803 0.9349 0.4893 0.5762 0.3619 0.1449 0.5857 0.4523 0.2776 0.7105 0.8447 0.4909 0.5999 0.1613 0.4263 0.6739 0.6739 0.5455 0.4059 0.5307 0.5108 0.7087 0.3243 0.2976 0.2368 0.8045 0.5139 0.4901 0.0522 0.0994 0.7126 0.1281 0.1199 0.3232 0.6053 0.6163 0.1409 0.2672 0.1433 0.0089 0.0355 0.0192 0.1855 0.4581 0.5127 0.0362 0.0558 0.3833 0.1256 0.0068 0.8919 0.0547 0.5872 0.1488 0.292 0.2071 0.6195 0.1917 0.5692 
0.9907 0.0334 0.3671 0.8098 0.2613 0.1687 0.6365 0.0165 0.2395 0.1636 0.3118 0.3473 0.5528 0.6549 0.477 0.4883 0.8502 0.0541 0.5158 0.3653 0.2695 0.2045 0.2203 0.0831 0.2859 0.4859 0.6385 0.0263 0.4503 0.0589 0.3368 0.0158 0.4224 0.0265 0.2109 0.13 ] 2022-08-23 08:03:59 [INFO] [EVAL] The model with the best validation mIoU (0.3544) was saved at iter 129000. 2022-08-23 08:04:08 [INFO] [TRAIN] epoch: 103, iter: 129050/160000, loss: 0.4794, lr: 0.000234, batch_cost: 0.1811, reader_cost: 0.00468, ips: 44.1678 samples/sec | ETA 01:33:25 2022-08-23 08:04:17 [INFO] [TRAIN] epoch: 103, iter: 129100/160000, loss: 0.4927, lr: 0.000234, batch_cost: 0.1667, reader_cost: 0.00121, ips: 47.9871 samples/sec | ETA 01:25:51 2022-08-23 08:04:25 [INFO] [TRAIN] epoch: 103, iter: 129150/160000, loss: 0.5165, lr: 0.000234, batch_cost: 0.1665, reader_cost: 0.00087, ips: 48.0456 samples/sec | ETA 01:25:36 2022-08-23 08:04:34 [INFO] [TRAIN] epoch: 103, iter: 129200/160000, loss: 0.5217, lr: 0.000233, batch_cost: 0.1736, reader_cost: 0.00070, ips: 46.0760 samples/sec | ETA 01:29:07 2022-08-23 08:04:44 [INFO] [TRAIN] epoch: 103, iter: 129250/160000, loss: 0.4895, lr: 0.000233, batch_cost: 0.1954, reader_cost: 0.00105, ips: 40.9361 samples/sec | ETA 01:40:09 2022-08-23 08:04:53 [INFO] [TRAIN] epoch: 103, iter: 129300/160000, loss: 0.4920, lr: 0.000232, batch_cost: 0.1799, reader_cost: 0.00056, ips: 44.4674 samples/sec | ETA 01:32:03 2022-08-23 08:05:01 [INFO] [TRAIN] epoch: 103, iter: 129350/160000, loss: 0.4624, lr: 0.000232, batch_cost: 0.1710, reader_cost: 0.00038, ips: 46.7746 samples/sec | ETA 01:27:22 2022-08-23 08:05:11 [INFO] [TRAIN] epoch: 103, iter: 129400/160000, loss: 0.5357, lr: 0.000232, batch_cost: 0.1960, reader_cost: 0.00072, ips: 40.8077 samples/sec | ETA 01:39:58 2022-08-23 08:05:22 [INFO] [TRAIN] epoch: 103, iter: 129450/160000, loss: 0.5067, lr: 0.000231, batch_cost: 0.2185, reader_cost: 0.00048, ips: 36.6154 samples/sec | ETA 01:51:14 2022-08-23 08:05:31 [INFO] [TRAIN] epoch: 103, iter: 129500/160000, loss: 0.5340, lr: 0.000231, batch_cost: 0.1743, reader_cost: 0.00053, ips: 45.9074 samples/sec | ETA 01:28:35 2022-08-23 08:05:39 [INFO] [TRAIN] epoch: 103, iter: 129550/160000, loss: 0.4994, lr: 0.000231, batch_cost: 0.1720, reader_cost: 0.00097, ips: 46.4998 samples/sec | ETA 01:27:18 2022-08-23 08:05:47 [INFO] [TRAIN] epoch: 103, iter: 129600/160000, loss: 0.4923, lr: 0.000230, batch_cost: 0.1674, reader_cost: 0.00157, ips: 47.7763 samples/sec | ETA 01:24:50 2022-08-23 08:05:56 [INFO] [TRAIN] epoch: 103, iter: 129650/160000, loss: 0.4986, lr: 0.000230, batch_cost: 0.1706, reader_cost: 0.00064, ips: 46.8829 samples/sec | ETA 01:26:18 2022-08-23 08:06:05 [INFO] [TRAIN] epoch: 103, iter: 129700/160000, loss: 0.5189, lr: 0.000229, batch_cost: 0.1843, reader_cost: 0.00094, ips: 43.4131 samples/sec | ETA 01:33:03 2022-08-23 08:06:15 [INFO] [TRAIN] epoch: 103, iter: 129750/160000, loss: 0.5356, lr: 0.000229, batch_cost: 0.1975, reader_cost: 0.00045, ips: 40.5103 samples/sec | ETA 01:39:33 2022-08-23 08:06:26 [INFO] [TRAIN] epoch: 103, iter: 129800/160000, loss: 0.5034, lr: 0.000229, batch_cost: 0.2088, reader_cost: 0.00053, ips: 38.3119 samples/sec | ETA 01:45:06 2022-08-23 08:06:36 [INFO] [TRAIN] epoch: 103, iter: 129850/160000, loss: 0.5053, lr: 0.000228, batch_cost: 0.2050, reader_cost: 0.00076, ips: 39.0154 samples/sec | ETA 01:43:02 2022-08-23 08:06:45 [INFO] [TRAIN] epoch: 103, iter: 129900/160000, loss: 0.5370, lr: 0.000228, batch_cost: 0.1771, reader_cost: 0.00327, ips: 45.1753 samples/sec | ETA 
01:28:50 2022-08-23 08:06:54 [INFO] [TRAIN] epoch: 103, iter: 129950/160000, loss: 0.4769, lr: 0.000228, batch_cost: 0.1950, reader_cost: 0.00115, ips: 41.0320 samples/sec | ETA 01:37:38 2022-08-23 08:07:04 [INFO] [TRAIN] epoch: 103, iter: 130000/160000, loss: 0.5398, lr: 0.000227, batch_cost: 0.1988, reader_cost: 0.00035, ips: 40.2442 samples/sec | ETA 01:39:23 2022-08-23 08:07:04 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 165s - batch_cost: 0.1652 - reader cost: 6.9225e-04 2022-08-23 08:09:50 [INFO] [EVAL] #Images: 2000 mIoU: 0.3500 Acc: 0.7713 Kappa: 0.7540 Dice: 0.4815 2022-08-23 08:09:50 [INFO] [EVAL] Class IoU: [0.6954 0.7869 0.9329 0.7293 0.6839 0.7723 0.7861 0.7919 0.5313 0.6208 0.496 0.5558 0.7012 0.2932 0.331 0.4372 0.5324 0.4531 0.6156 0.4217 0.7434 0.4635 0.6387 0.5033 0.3441 0.4062 0.4422 0.4627 0.4251 0.2434 0.2902 0.4793 0.2658 0.3686 0.3278 0.3951 0.4319 0.5261 0.2571 0.4054 0.2617 0.1231 0.3531 0.2626 0.2818 0.2275 0.2951 0.5148 0.6487 0.5219 0.5865 0.3973 0.2311 0.1889 0.6963 0.4366 0.8553 0.4304 0.4651 0.2586 0.0825 0.3923 0.3493 0.2136 0.4608 0.7012 0.3074 0.4023 0.126 0.3098 0.4969 0.5391 0.3836 0.2063 0.4517 0.3465 0.5618 0.2873 0.2746 0.1025 0.5967 0.4029 0.3288 0.0354 0.0887 0.5624 0.105 0.0945 0.2663 0.4917 0.4305 0.0616 0.2295 0.1085 0.011 0.0324 0.0169 0.1582 0.3109 0.4481 0.0708 0.0305 0.3026 0.1953 0.008 0.5677 0.0836 0.4989 0.1033 0.2135 0.1537 0.3441 0.1392 0.5501 0.8225 0.011 0.2978 0.6116 0.1023 0.1291 0.4844 0.0097 0.2119 0.1545 0.248 0.301 0.4506 0.4201 0.1484 0.3388 0.5557 0.0662 0.3096 0.2724 0.1911 0.1261 0.1594 0.0395 0.2189 0.3647 0.351 0.0059 0.3761 0.0304 0.2878 0.0028 0.366 0.0298 0.2029 0.1369] 2022-08-23 08:09:50 [INFO] [EVAL] Class Precision: [0.7977 0.8627 0.9682 0.8216 0.7622 0.8654 0.8679 0.8548 0.686 0.7374 0.6797 0.7528 0.769 0.5066 0.5398 0.5992 0.7193 0.6988 0.7678 0.6013 0.8321 0.6241 0.7669 0.6484 0.5372 0.5605 0.5467 0.7622 0.6366 0.3668 0.4762 0.6552 0.4774 0.5232 0.4986 0.5328 0.6166 0.75 0.4235 0.6607 0.4725 0.2747 0.565 0.5372 0.4244 0.421 0.4759 0.6913 0.7483 0.6113 0.7678 0.5639 0.4226 0.4618 0.7341 0.5986 0.902 0.6801 0.7324 0.4278 0.1603 0.5319 0.4746 0.6137 0.5545 0.7865 0.4768 0.5864 0.3914 0.5233 0.7711 0.7245 0.6112 0.2865 0.7038 0.5125 0.7185 0.5606 0.6382 0.324 0.6781 0.6896 0.8208 0.1053 0.2364 0.7567 0.4059 0.3962 0.5245 0.7961 0.641 0.0858 0.4243 0.3527 0.0464 0.08 0.3844 0.3053 0.5553 0.6558 0.4152 0.0878 0.6372 0.5963 0.2937 0.6002 0.2139 0.8381 0.2879 0.5303 0.3019 0.4359 0.4031 0.7923 0.8282 0.0787 0.6948 0.7153 0.1399 0.472 0.6198 0.1937 0.5513 0.6013 0.6789 0.6408 0.7722 0.574 0.5201 0.5238 0.6054 0.4715 0.4282 0.6355 0.5204 0.3121 0.3834 0.0849 0.4193 0.6152 0.4299 0.0083 0.6034 0.2241 0.5287 0.0051 0.7355 0.2835 0.6694 0.6631] 2022-08-23 08:09:50 [INFO] [EVAL] Class Recall: [0.8444 0.8996 0.9623 0.8665 0.8694 0.8778 0.8929 0.915 0.7021 0.7971 0.6472 0.6798 0.8883 0.4104 0.4611 0.6178 0.6719 0.5631 0.7564 0.5853 0.8746 0.6431 0.7926 0.6922 0.4891 0.596 0.6983 0.5407 0.5614 0.4197 0.4263 0.641 0.3748 0.555 0.4889 0.6045 0.5905 0.638 0.3956 0.512 0.3697 0.1823 0.485 0.3394 0.4562 0.3311 0.4373 0.6685 0.8298 0.7812 0.713 0.5735 0.3377 0.2422 0.9312 0.6174 0.9429 0.5397 0.5603 0.3952 0.1452 0.5992 0.5697 0.2468 0.7317 0.866 0.4639 0.5616 0.1568 0.4315 0.5828 0.6781 0.5074 0.4242 0.5578 0.517 0.7203 0.3708 0.3253 0.1304 0.8326 0.4922 0.3543 0.0506 0.1242 0.6865 0.1241 0.1104 0.3511 0.5625 0.5673 0.1794 0.3333 0.1355 0.0143 0.0517 0.0174 0.2472 0.4139 0.5859 0.0787 0.0446 0.3656 
0.2251 0.0082 0.9129 0.1206 0.5521 0.1388 0.2633 0.2385 0.6202 0.1753 0.6428 0.9917 0.0127 0.3426 0.8083 0.2754 0.1509 0.6892 0.0101 0.2561 0.1721 0.281 0.362 0.5197 0.6105 0.172 0.4896 0.8713 0.0715 0.5276 0.3228 0.2319 0.1746 0.2143 0.0689 0.3141 0.4725 0.6567 0.0196 0.4996 0.034 0.3872 0.0064 0.4215 0.0322 0.2256 0.1471] 2022-08-23 08:09:50 [INFO] [EVAL] The model with the best validation mIoU (0.3544) was saved at iter 129000. 2022-08-23 08:09:59 [INFO] [TRAIN] epoch: 103, iter: 130050/160000, loss: 0.4967, lr: 0.000227, batch_cost: 0.1834, reader_cost: 0.00355, ips: 43.6309 samples/sec | ETA 01:31:31 2022-08-23 08:10:11 [INFO] [TRAIN] epoch: 104, iter: 130100/160000, loss: 0.4991, lr: 0.000226, batch_cost: 0.2404, reader_cost: 0.04647, ips: 33.2822 samples/sec | ETA 01:59:47 2022-08-23 08:10:21 [INFO] [TRAIN] epoch: 104, iter: 130150/160000, loss: 0.4746, lr: 0.000226, batch_cost: 0.1917, reader_cost: 0.00053, ips: 41.7232 samples/sec | ETA 01:35:23 2022-08-23 08:10:30 [INFO] [TRAIN] epoch: 104, iter: 130200/160000, loss: 0.5055, lr: 0.000226, batch_cost: 0.1888, reader_cost: 0.00087, ips: 42.3632 samples/sec | ETA 01:33:47 2022-08-23 08:10:40 [INFO] [TRAIN] epoch: 104, iter: 130250/160000, loss: 0.4714, lr: 0.000225, batch_cost: 0.1965, reader_cost: 0.00051, ips: 40.7150 samples/sec | ETA 01:37:25 2022-08-23 08:10:49 [INFO] [TRAIN] epoch: 104, iter: 130300/160000, loss: 0.4764, lr: 0.000225, batch_cost: 0.1857, reader_cost: 0.00112, ips: 43.0907 samples/sec | ETA 01:31:53 2022-08-23 08:10:59 [INFO] [TRAIN] epoch: 104, iter: 130350/160000, loss: 0.4702, lr: 0.000224, batch_cost: 0.2021, reader_cost: 0.00064, ips: 39.5801 samples/sec | ETA 01:39:52 2022-08-23 08:11:09 [INFO] [TRAIN] epoch: 104, iter: 130400/160000, loss: 0.5303, lr: 0.000224, batch_cost: 0.2031, reader_cost: 0.00087, ips: 39.3980 samples/sec | ETA 01:40:10 2022-08-23 08:11:19 [INFO] [TRAIN] epoch: 104, iter: 130450/160000, loss: 0.4847, lr: 0.000224, batch_cost: 0.1841, reader_cost: 0.00070, ips: 43.4442 samples/sec | ETA 01:30:41 2022-08-23 08:11:28 [INFO] [TRAIN] epoch: 104, iter: 130500/160000, loss: 0.4713, lr: 0.000223, batch_cost: 0.1811, reader_cost: 0.00035, ips: 44.1842 samples/sec | ETA 01:29:01 2022-08-23 08:11:37 [INFO] [TRAIN] epoch: 104, iter: 130550/160000, loss: 0.4844, lr: 0.000223, batch_cost: 0.1797, reader_cost: 0.00056, ips: 44.5303 samples/sec | ETA 01:28:10 2022-08-23 08:11:46 [INFO] [TRAIN] epoch: 104, iter: 130600/160000, loss: 0.5195, lr: 0.000223, batch_cost: 0.1765, reader_cost: 0.00093, ips: 45.3242 samples/sec | ETA 01:26:29 2022-08-23 08:11:55 [INFO] [TRAIN] epoch: 104, iter: 130650/160000, loss: 0.4440, lr: 0.000222, batch_cost: 0.1830, reader_cost: 0.00091, ips: 43.7223 samples/sec | ETA 01:29:30 2022-08-23 08:12:04 [INFO] [TRAIN] epoch: 104, iter: 130700/160000, loss: 0.4719, lr: 0.000222, batch_cost: 0.1835, reader_cost: 0.00065, ips: 43.5951 samples/sec | ETA 01:29:36 2022-08-23 08:12:13 [INFO] [TRAIN] epoch: 104, iter: 130750/160000, loss: 0.4821, lr: 0.000221, batch_cost: 0.1872, reader_cost: 0.00069, ips: 42.7407 samples/sec | ETA 01:31:14 2022-08-23 08:12:22 [INFO] [TRAIN] epoch: 104, iter: 130800/160000, loss: 0.5061, lr: 0.000221, batch_cost: 0.1855, reader_cost: 0.00051, ips: 43.1371 samples/sec | ETA 01:30:15 2022-08-23 08:12:31 [INFO] [TRAIN] epoch: 104, iter: 130850/160000, loss: 0.5001, lr: 0.000221, batch_cost: 0.1666, reader_cost: 0.00065, ips: 48.0312 samples/sec | ETA 01:20:55 2022-08-23 08:12:40 [INFO] [TRAIN] epoch: 104, iter: 130900/160000, loss: 0.5070, lr: 
0.000220, batch_cost: 0.1892, reader_cost: 0.00053, ips: 42.2765 samples/sec | ETA 01:31:46 2022-08-23 08:12:50 [INFO] [TRAIN] epoch: 104, iter: 130950/160000, loss: 0.4709, lr: 0.000220, batch_cost: 0.1944, reader_cost: 0.00099, ips: 41.1566 samples/sec | ETA 01:34:06 2022-08-23 08:13:00 [INFO] [TRAIN] epoch: 104, iter: 131000/160000, loss: 0.5138, lr: 0.000220, batch_cost: 0.2040, reader_cost: 0.00297, ips: 39.2210 samples/sec | ETA 01:38:35 2022-08-23 08:13:00 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 152s - batch_cost: 0.1516 - reader cost: 0.0015 2022-08-23 08:15:32 [INFO] [EVAL] #Images: 2000 mIoU: 0.3496 Acc: 0.7699 Kappa: 0.7523 Dice: 0.4814 2022-08-23 08:15:32 [INFO] [EVAL] Class IoU: [0.6919 0.7777 0.9322 0.7294 0.679 0.7627 0.7915 0.7889 0.5284 0.6265 0.5075 0.5542 0.7 0.315 0.3239 0.4389 0.5348 0.422 0.6168 0.423 0.7414 0.541 0.6406 0.494 0.3177 0.3194 0.4992 0.4709 0.3751 0.2371 0.2767 0.4779 0.2654 0.3604 0.3427 0.3983 0.4317 0.5404 0.2446 0.4063 0.2256 0.1134 0.3423 0.2603 0.318 0.1799 0.308 0.5081 0.5954 0.5526 0.5831 0.3316 0.2227 0.1918 0.7103 0.435 0.8641 0.4363 0.5032 0.2732 0.0827 0.3771 0.3376 0.2609 0.491 0.6991 0.2516 0.4117 0.1186 0.31 0.5221 0.5365 0.3894 0.1961 0.4413 0.3391 0.5729 0.2867 0.2663 0.1664 0.5861 0.4082 0.3622 0.0249 0.0894 0.5658 0.1239 0.0919 0.2406 0.5202 0.4039 0.0879 0.2159 0.1183 0.0102 0.0459 0.0526 0.1347 0.2938 0.3567 0.1378 0.034 0.3029 0.092 0.0057 0.581 0.057 0.5095 0.1042 0.2198 0.1051 0.449 0.1405 0.4597 0.807 0.0228 0.3303 0.6291 0.1178 0.1151 0.41 0.0096 0.2147 0.1618 0.2659 0.2562 0.4462 0.4242 0.3856 0.3384 0.5535 0.0831 0.2913 0.3001 0.2002 0.1228 0.1731 0.0265 0.213 0.3482 0.2978 0.0068 0.3655 0.0073 0.2643 0.0125 0.364 0.0266 0.2293 0.1478] 2022-08-23 08:15:32 [INFO] [EVAL] Class Precision: [0.786 0.8535 0.9643 0.8226 0.764 0.8731 0.8755 0.8495 0.6894 0.7741 0.6891 0.7339 0.7801 0.507 0.5422 0.6112 0.7178 0.6624 0.7616 0.6126 0.8305 0.6785 0.7681 0.6209 0.4949 0.5331 0.6779 0.6855 0.6928 0.3502 0.4596 0.6248 0.4448 0.4972 0.534 0.6181 0.6339 0.7502 0.4453 0.6437 0.4214 0.2665 0.6416 0.5293 0.4794 0.4001 0.4786 0.6855 0.7923 0.6613 0.7209 0.4454 0.3788 0.4866 0.7371 0.5521 0.9122 0.659 0.6898 0.464 0.1567 0.5427 0.5282 0.6713 0.6244 0.8074 0.331 0.5433 0.2784 0.5552 0.7254 0.6502 0.6164 0.2929 0.6404 0.4858 0.6977 0.5361 0.577 0.4262 0.664 0.6651 0.7887 0.1018 0.2355 0.7309 0.4187 0.3959 0.57 0.738 0.5422 0.1566 0.3672 0.3348 0.0598 0.0974 0.4911 0.3889 0.4398 0.6662 0.4283 0.1245 0.5654 0.546 0.4562 0.6158 0.1778 0.8089 0.2799 0.5419 0.2779 0.6283 0.4894 0.7222 0.8117 0.0895 0.7329 0.7467 0.1777 0.595 0.5411 0.1859 0.6346 0.5952 0.6377 0.6098 0.8079 0.5805 0.9162 0.529 0.6066 0.427 0.4763 0.578 0.5194 0.2966 0.3547 0.0557 0.4478 0.6534 0.4162 0.0095 0.6381 0.1101 0.4798 0.0184 0.75 0.4699 0.623 0.7184] 2022-08-23 08:15:32 [INFO] [EVAL] Class Recall: [0.8525 0.8975 0.9656 0.8656 0.8593 0.8577 0.8919 0.9172 0.6935 0.7667 0.6583 0.6936 0.8721 0.4541 0.4457 0.6089 0.6772 0.5376 0.7643 0.5776 0.8735 0.7276 0.7942 0.7073 0.4703 0.4435 0.6544 0.6008 0.4499 0.4236 0.4101 0.6701 0.3969 0.5671 0.4889 0.5284 0.5751 0.659 0.3517 0.5241 0.3268 0.1649 0.4232 0.3388 0.4857 0.2464 0.4634 0.6626 0.7056 0.7707 0.7531 0.5648 0.3508 0.2405 0.9513 0.6723 0.9424 0.5636 0.6503 0.3992 0.1491 0.5528 0.4835 0.2991 0.6969 0.8389 0.5118 0.6296 0.1713 0.4125 0.6507 0.7541 0.5139 0.3722 0.5866 0.5291 0.762 0.3813 0.3309 0.2145 0.8332 0.5139 0.4011 0.0319 0.126 0.7147 0.1496 0.1069 0.294 0.6381 0.6129 0.1668 0.3439 0.1546 
0.0121 0.0797 0.0556 0.1709 0.4696 0.4343 0.1689 0.0446 0.3948 0.0996 0.0058 0.9115 0.0774 0.5792 0.1424 0.27 0.1445 0.6113 0.1646 0.5584 0.9929 0.0297 0.3755 0.7998 0.2592 0.1249 0.6285 0.01 0.2449 0.1819 0.3133 0.3064 0.4992 0.6118 0.3997 0.4843 0.8634 0.0935 0.4286 0.3843 0.2457 0.1732 0.2526 0.0482 0.2888 0.4271 0.5115 0.0232 0.4611 0.0078 0.3704 0.0374 0.4143 0.0274 0.2662 0.1569] 2022-08-23 08:15:32 [INFO] [EVAL] The model with the best validation mIoU (0.3544) was saved at iter 129000. 2022-08-23 08:15:42 [INFO] [TRAIN] epoch: 104, iter: 131050/160000, loss: 0.5243, lr: 0.000219, batch_cost: 0.1868, reader_cost: 0.00387, ips: 42.8325 samples/sec | ETA 01:30:07 2022-08-23 08:15:50 [INFO] [TRAIN] epoch: 104, iter: 131100/160000, loss: 0.4930, lr: 0.000219, batch_cost: 0.1683, reader_cost: 0.00122, ips: 47.5261 samples/sec | ETA 01:21:04 2022-08-23 08:16:00 [INFO] [TRAIN] epoch: 104, iter: 131150/160000, loss: 0.5071, lr: 0.000218, batch_cost: 0.2004, reader_cost: 0.00045, ips: 39.9260 samples/sec | ETA 01:36:20 2022-08-23 08:16:10 [INFO] [TRAIN] epoch: 104, iter: 131200/160000, loss: 0.5386, lr: 0.000218, batch_cost: 0.1947, reader_cost: 0.00041, ips: 41.0968 samples/sec | ETA 01:33:26 2022-08-23 08:16:18 [INFO] [TRAIN] epoch: 104, iter: 131250/160000, loss: 0.4826, lr: 0.000218, batch_cost: 0.1673, reader_cost: 0.00056, ips: 47.8094 samples/sec | ETA 01:20:10 2022-08-23 08:16:27 [INFO] [TRAIN] epoch: 104, iter: 131300/160000, loss: 0.4598, lr: 0.000217, batch_cost: 0.1829, reader_cost: 0.00042, ips: 43.7318 samples/sec | ETA 01:27:30 2022-08-23 08:16:37 [INFO] [TRAIN] epoch: 104, iter: 131350/160000, loss: 0.4845, lr: 0.000217, batch_cost: 0.1947, reader_cost: 0.00063, ips: 41.0835 samples/sec | ETA 01:32:58 2022-08-23 08:16:49 [INFO] [TRAIN] epoch: 105, iter: 131400/160000, loss: 0.4846, lr: 0.000217, batch_cost: 0.2468, reader_cost: 0.06239, ips: 32.4163 samples/sec | ETA 01:57:38 2022-08-23 08:16:59 [INFO] [TRAIN] epoch: 105, iter: 131450/160000, loss: 0.4768, lr: 0.000216, batch_cost: 0.1880, reader_cost: 0.00084, ips: 42.5556 samples/sec | ETA 01:29:27 2022-08-23 08:17:07 [INFO] [TRAIN] epoch: 105, iter: 131500/160000, loss: 0.4813, lr: 0.000216, batch_cost: 0.1758, reader_cost: 0.00077, ips: 45.5010 samples/sec | ETA 01:23:30 2022-08-23 08:17:16 [INFO] [TRAIN] epoch: 105, iter: 131550/160000, loss: 0.5002, lr: 0.000215, batch_cost: 0.1728, reader_cost: 0.00101, ips: 46.2982 samples/sec | ETA 01:21:55 2022-08-23 08:17:25 [INFO] [TRAIN] epoch: 105, iter: 131600/160000, loss: 0.4917, lr: 0.000215, batch_cost: 0.1818, reader_cost: 0.00130, ips: 44.0146 samples/sec | ETA 01:26:01 2022-08-23 08:17:35 [INFO] [TRAIN] epoch: 105, iter: 131650/160000, loss: 0.4812, lr: 0.000215, batch_cost: 0.1976, reader_cost: 0.00087, ips: 40.4869 samples/sec | ETA 01:33:21 2022-08-23 08:17:46 [INFO] [TRAIN] epoch: 105, iter: 131700/160000, loss: 0.5306, lr: 0.000214, batch_cost: 0.2102, reader_cost: 0.00050, ips: 38.0613 samples/sec | ETA 01:39:08 2022-08-23 08:17:55 [INFO] [TRAIN] epoch: 105, iter: 131750/160000, loss: 0.4996, lr: 0.000214, batch_cost: 0.1966, reader_cost: 0.00093, ips: 40.6944 samples/sec | ETA 01:32:33 2022-08-23 08:18:05 [INFO] [TRAIN] epoch: 105, iter: 131800/160000, loss: 0.4828, lr: 0.000214, batch_cost: 0.1957, reader_cost: 0.00045, ips: 40.8716 samples/sec | ETA 01:31:59 2022-08-23 08:18:15 [INFO] [TRAIN] epoch: 105, iter: 131850/160000, loss: 0.5090, lr: 0.000213, batch_cost: 0.1999, reader_cost: 0.00067, ips: 40.0166 samples/sec | ETA 01:33:47 2022-08-23 08:18:24 [INFO] 
[TRAIN] epoch: 105, iter: 131900/160000, loss: 0.4915, lr: 0.000213, batch_cost: 0.1843, reader_cost: 0.00089, ips: 43.4097 samples/sec | ETA 01:26:18 2022-08-23 08:18:34 [INFO] [TRAIN] epoch: 105, iter: 131950/160000, loss: 0.5452, lr: 0.000212, batch_cost: 0.1865, reader_cost: 0.00118, ips: 42.8938 samples/sec | ETA 01:27:11 2022-08-23 08:18:44 [INFO] [TRAIN] epoch: 105, iter: 132000/160000, loss: 0.5378, lr: 0.000212, batch_cost: 0.1977, reader_cost: 0.00032, ips: 40.4575 samples/sec | ETA 01:32:16 2022-08-23 08:18:44 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 156s - batch_cost: 0.1563 - reader cost: 5.3031e-04 2022-08-23 08:21:20 [INFO] [EVAL] #Images: 2000 mIoU: 0.3458 Acc: 0.7682 Kappa: 0.7502 Dice: 0.4768 2022-08-23 08:21:20 [INFO] [EVAL] Class IoU: [0.6905 0.7743 0.932 0.7274 0.6809 0.7677 0.7762 0.7907 0.5225 0.6165 0.4898 0.5423 0.7036 0.3001 0.2979 0.4356 0.5267 0.4277 0.6153 0.4212 0.7398 0.5445 0.6407 0.4996 0.2944 0.2932 0.5239 0.4417 0.4173 0.2304 0.2632 0.4658 0.2698 0.3634 0.2858 0.392 0.4265 0.5496 0.2484 0.375 0.23 0.0986 0.3442 0.2684 0.2844 0.2327 0.2998 0.5195 0.6098 0.5122 0.5605 0.3741 0.1906 0.2097 0.7193 0.4841 0.8653 0.4453 0.4986 0.2702 0.0782 0.4466 0.3105 0.2595 0.4673 0.709 0.2894 0.3895 0.1197 0.3138 0.4855 0.5289 0.4113 0.2042 0.4554 0.3257 0.568 0.2502 0.2641 0.1181 0.593 0.4009 0.3442 0.0259 0.0943 0.5604 0.1109 0.0846 0.2096 0.5152 0.4405 0.0476 0.2244 0.1017 0.0013 0.0194 0.029 0.1317 0.293 0.3109 0.0894 0.0375 0.3036 0.0953 0.0047 0.4923 0.0526 0.5182 0.0989 0.2236 0.1357 0.3338 0.1354 0.5234 0.7214 0.0187 0.3005 0.6347 0.1242 0.1819 0.4503 0.0091 0.2293 0.1505 0.2598 0.2805 0.4394 0.435 0.4069 0.3302 0.5348 0.0785 0.3518 0.293 0.186 0.129 0.1641 0.0303 0.2099 0.365 0.2847 0.0005 0.3301 0.0179 0.1862 0.0059 0.4086 0.0282 0.2403 0.1546] 2022-08-23 08:21:20 [INFO] [EVAL] Class Precision: [0.7779 0.8553 0.9628 0.8131 0.7681 0.869 0.8594 0.8586 0.6904 0.7589 0.7012 0.7081 0.7831 0.5128 0.5643 0.6173 0.708 0.6736 0.7468 0.6263 0.8261 0.6721 0.7727 0.605 0.5565 0.5325 0.7032 0.7871 0.6602 0.3434 0.45 0.6351 0.4737 0.5189 0.5127 0.5037 0.6278 0.7497 0.4122 0.697 0.4177 0.2857 0.614 0.533 0.4046 0.4167 0.4593 0.7056 0.7342 0.5672 0.7641 0.5146 0.3458 0.4771 0.759 0.6453 0.9131 0.6407 0.7191 0.4738 0.1376 0.5948 0.4756 0.593 0.5736 0.8136 0.3991 0.49 0.2567 0.5693 0.7278 0.6634 0.5964 0.2771 0.6932 0.538 0.7986 0.5569 0.6517 0.3371 0.678 0.6307 0.803 0.0778 0.2297 0.7181 0.3536 0.4064 0.3202 0.7433 0.649 0.0598 0.4581 0.3564 0.0053 0.0631 0.3656 0.4286 0.4613 0.6634 0.3415 0.1129 0.5892 0.5978 0.488 0.5161 0.2204 0.8207 0.2624 0.5496 0.3419 0.4187 0.4494 0.7654 0.7246 0.0997 0.6908 0.7715 0.1908 0.5017 0.6064 0.1648 0.7031 0.6317 0.6709 0.6513 0.784 0.6047 0.7067 0.5307 0.5793 0.3543 0.497 0.6592 0.5038 0.2956 0.3055 0.0783 0.4481 0.6138 0.3415 0.0008 0.7008 0.1864 0.5137 0.0089 0.8468 0.4451 0.6786 0.6864] 2022-08-23 08:21:20 [INFO] [EVAL] Class Recall: [0.86 0.891 0.9668 0.8735 0.8571 0.8682 0.8891 0.9091 0.6825 0.7666 0.619 0.6984 0.8739 0.4198 0.3869 0.5969 0.6729 0.5396 0.7775 0.5626 0.8763 0.7414 0.7896 0.7414 0.3847 0.3948 0.6727 0.5017 0.5314 0.4117 0.3879 0.6361 0.3853 0.5481 0.3925 0.6388 0.5709 0.6731 0.3847 0.4481 0.3385 0.1308 0.4392 0.3509 0.4891 0.3451 0.4634 0.6633 0.7826 0.8409 0.6778 0.5781 0.2982 0.2723 0.9321 0.6596 0.9429 0.5935 0.6193 0.3861 0.1534 0.6419 0.4722 0.3157 0.7161 0.8465 0.5129 0.6551 0.1833 0.4115 0.5932 0.7229 0.5699 0.4367 0.5704 0.4521 0.663 0.3124 0.3076 0.1539 0.8255 0.5239 0.376 
0.0374 0.1378 0.7185 0.1391 0.0966 0.3778 0.6267 0.5783 0.1888 0.3056 0.1246 0.0018 0.0273 0.0306 0.1597 0.4455 0.3691 0.1081 0.0532 0.3852 0.1019 0.0047 0.9143 0.0646 0.5844 0.1371 0.2737 0.1837 0.6219 0.1624 0.6234 0.9939 0.0226 0.3472 0.7817 0.2626 0.222 0.6363 0.0095 0.2539 0.165 0.2977 0.3301 0.4999 0.6079 0.4896 0.4663 0.8745 0.0916 0.5463 0.3453 0.2277 0.1864 0.2618 0.0472 0.2831 0.4739 0.6311 0.0016 0.3843 0.0194 0.2261 0.0175 0.4412 0.0292 0.2711 0.1663] 2022-08-23 08:21:20 [INFO] [EVAL] The model with the best validation mIoU (0.3544) was saved at iter 129000. 2022-08-23 08:21:30 [INFO] [TRAIN] epoch: 105, iter: 132050/160000, loss: 0.4811, lr: 0.000212, batch_cost: 0.1892, reader_cost: 0.00346, ips: 42.2845 samples/sec | ETA 01:28:07 2022-08-23 08:21:39 [INFO] [TRAIN] epoch: 105, iter: 132100/160000, loss: 0.5174, lr: 0.000211, batch_cost: 0.1771, reader_cost: 0.00136, ips: 45.1612 samples/sec | ETA 01:22:22 2022-08-23 08:21:47 [INFO] [TRAIN] epoch: 105, iter: 132150/160000, loss: 0.4998, lr: 0.000211, batch_cost: 0.1694, reader_cost: 0.00073, ips: 47.2378 samples/sec | ETA 01:18:36 2022-08-23 08:21:56 [INFO] [TRAIN] epoch: 105, iter: 132200/160000, loss: 0.5236, lr: 0.000210, batch_cost: 0.1694, reader_cost: 0.00061, ips: 47.2318 samples/sec | ETA 01:18:28 2022-08-23 08:22:04 [INFO] [TRAIN] epoch: 105, iter: 132250/160000, loss: 0.4877, lr: 0.000210, batch_cost: 0.1700, reader_cost: 0.00198, ips: 47.0458 samples/sec | ETA 01:18:38 2022-08-23 08:22:13 [INFO] [TRAIN] epoch: 105, iter: 132300/160000, loss: 0.4922, lr: 0.000210, batch_cost: 0.1845, reader_cost: 0.00060, ips: 43.3546 samples/sec | ETA 01:25:11 2022-08-23 08:22:22 [INFO] [TRAIN] epoch: 105, iter: 132350/160000, loss: 0.5259, lr: 0.000209, batch_cost: 0.1754, reader_cost: 0.00058, ips: 45.5990 samples/sec | ETA 01:20:50 2022-08-23 08:22:32 [INFO] [TRAIN] epoch: 105, iter: 132400/160000, loss: 0.5178, lr: 0.000209, batch_cost: 0.1933, reader_cost: 0.00040, ips: 41.3921 samples/sec | ETA 01:28:54 2022-08-23 08:22:41 [INFO] [TRAIN] epoch: 105, iter: 132450/160000, loss: 0.4924, lr: 0.000209, batch_cost: 0.1859, reader_cost: 0.00045, ips: 43.0332 samples/sec | ETA 01:25:21 2022-08-23 08:22:51 [INFO] [TRAIN] epoch: 105, iter: 132500/160000, loss: 0.5072, lr: 0.000208, batch_cost: 0.1955, reader_cost: 0.00059, ips: 40.9215 samples/sec | ETA 01:29:36 2022-08-23 08:23:00 [INFO] [TRAIN] epoch: 105, iter: 132550/160000, loss: 0.5271, lr: 0.000208, batch_cost: 0.1859, reader_cost: 0.00088, ips: 43.0313 samples/sec | ETA 01:25:03 2022-08-23 08:23:09 [INFO] [TRAIN] epoch: 105, iter: 132600/160000, loss: 0.4833, lr: 0.000207, batch_cost: 0.1814, reader_cost: 0.00033, ips: 44.1041 samples/sec | ETA 01:22:50 2022-08-23 08:23:23 [INFO] [TRAIN] epoch: 106, iter: 132650/160000, loss: 0.5095, lr: 0.000207, batch_cost: 0.2735, reader_cost: 0.08385, ips: 29.2452 samples/sec | ETA 02:04:41 2022-08-23 08:23:33 [INFO] [TRAIN] epoch: 106, iter: 132700/160000, loss: 0.4990, lr: 0.000207, batch_cost: 0.2054, reader_cost: 0.00069, ips: 38.9561 samples/sec | ETA 01:33:26 2022-08-23 08:23:43 [INFO] [TRAIN] epoch: 106, iter: 132750/160000, loss: 0.4896, lr: 0.000206, batch_cost: 0.2050, reader_cost: 0.00044, ips: 39.0270 samples/sec | ETA 01:33:05 2022-08-23 08:23:53 [INFO] [TRAIN] epoch: 106, iter: 132800/160000, loss: 0.4647, lr: 0.000206, batch_cost: 0.1971, reader_cost: 0.00089, ips: 40.5952 samples/sec | ETA 01:29:20 2022-08-23 08:24:02 [INFO] [TRAIN] epoch: 106, iter: 132850/160000, loss: 0.4917, lr: 0.000206, batch_cost: 0.1854, reader_cost: 
0.00083, ips: 43.1477 samples/sec | ETA 01:23:53 2022-08-23 08:24:12 [INFO] [TRAIN] epoch: 106, iter: 132900/160000, loss: 0.5143, lr: 0.000205, batch_cost: 0.1884, reader_cost: 0.00038, ips: 42.4561 samples/sec | ETA 01:25:06 2022-08-23 08:24:20 [INFO] [TRAIN] epoch: 106, iter: 132950/160000, loss: 0.5089, lr: 0.000205, batch_cost: 0.1638, reader_cost: 0.00064, ips: 48.8416 samples/sec | ETA 01:13:50 2022-08-23 08:24:30 [INFO] [TRAIN] epoch: 106, iter: 133000/160000, loss: 0.4694, lr: 0.000204, batch_cost: 0.1890, reader_cost: 0.00078, ips: 42.3172 samples/sec | ETA 01:25:04 2022-08-23 08:24:30 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 182s - batch_cost: 0.1821 - reader cost: 9.7131e-04 2022-08-23 08:27:32 [INFO] [EVAL] #Images: 2000 mIoU: 0.3470 Acc: 0.7687 Kappa: 0.7510 Dice: 0.4786 2022-08-23 08:27:32 [INFO] [EVAL] Class IoU: [0.6924 0.7735 0.9325 0.7276 0.6741 0.7697 0.7836 0.7832 0.5331 0.6148 0.4926 0.5607 0.7039 0.3072 0.3256 0.4381 0.5302 0.4188 0.6236 0.4234 0.7428 0.5217 0.6416 0.5006 0.3074 0.2149 0.5002 0.4507 0.3908 0.2265 0.2777 0.4705 0.26 0.3711 0.3121 0.388 0.4339 0.5454 0.2464 0.3859 0.2499 0.1128 0.3489 0.2649 0.2869 0.1822 0.3042 0.5185 0.6175 0.5226 0.5687 0.4126 0.2327 0.2147 0.7154 0.4425 0.842 0.417 0.4642 0.2493 0.0793 0.4445 0.335 0.2414 0.4732 0.6976 0.2881 0.4056 0.1275 0.3145 0.5262 0.5226 0.4176 0.2014 0.4599 0.3444 0.5591 0.2888 0.2739 0.1907 0.6194 0.4087 0.3712 0.0161 0.1577 0.5698 0.1064 0.0942 0.2403 0.503 0.4285 0.0769 0.2179 0.1283 0.0011 0.0297 0.033 0.1419 0.2914 0.4129 0.0762 0.0369 0.2844 0.096 0.0061 0.4749 0.0541 0.511 0.1034 0.2348 0.1458 0.3234 0.1336 0.5034 0.7346 0.0229 0.3268 0.6059 0.0994 0.0625 0.4351 0.0202 0.219 0.1297 0.2562 0.2887 0.4452 0.4229 0.3498 0.3412 0.5505 0.0542 0.3285 0.2751 0.1632 0.1284 0.161 0.0317 0.221 0.3595 0.3155 0.0062 0.3269 0.0471 0.2515 0.0048 0.3755 0.0314 0.2201 0.146 ] 2022-08-23 08:27:32 [INFO] [EVAL] Class Precision: [0.7919 0.8448 0.9644 0.8168 0.76 0.8676 0.8785 0.8388 0.6693 0.7555 0.6991 0.7186 0.7779 0.5073 0.529 0.6214 0.7225 0.6872 0.7523 0.6179 0.838 0.6639 0.7803 0.6233 0.5371 0.6241 0.6338 0.749 0.6864 0.3369 0.4573 0.6065 0.4574 0.5162 0.4753 0.5259 0.6426 0.7557 0.414 0.703 0.4171 0.2783 0.6052 0.5316 0.402 0.4173 0.4776 0.7253 0.6819 0.6136 0.7549 0.5904 0.423 0.4972 0.7451 0.5588 0.8816 0.6653 0.6527 0.4461 0.1536 0.6417 0.5106 0.6852 0.5835 0.784 0.4099 0.5576 0.3091 0.5477 0.7106 0.6501 0.6009 0.2894 0.7261 0.5028 0.7379 0.5298 0.6952 0.4424 0.7145 0.6314 0.7864 0.0559 0.3323 0.732 0.3161 0.3838 0.4117 0.7128 0.6028 0.117 0.3703 0.352 0.0034 0.0731 0.3783 0.4027 0.4392 0.6222 0.3718 0.0929 0.5968 0.4505 0.3727 0.4938 0.2266 0.7945 0.2585 0.5545 0.334 0.3963 0.4669 0.7253 0.738 0.1143 0.672 0.7055 0.1345 0.5499 0.5727 0.156 0.6296 0.6383 0.6959 0.6562 0.7516 0.5513 0.7846 0.5331 0.607 0.3275 0.4761 0.6409 0.5634 0.2892 0.4248 0.0923 0.4411 0.6478 0.373 0.0092 0.6218 0.248 0.5651 0.0072 0.8702 0.3168 0.588 0.7142] 2022-08-23 08:27:32 [INFO] [EVAL] Class Recall: [0.8465 0.9016 0.9657 0.8695 0.8564 0.8722 0.8788 0.922 0.7237 0.7675 0.6252 0.7184 0.8809 0.4379 0.4585 0.5977 0.6657 0.5175 0.7847 0.5735 0.8673 0.7089 0.7831 0.7178 0.4181 0.2469 0.7036 0.5309 0.4757 0.4087 0.4142 0.6773 0.376 0.569 0.4762 0.5966 0.572 0.6621 0.3784 0.461 0.384 0.1594 0.4517 0.3455 0.5004 0.2443 0.4559 0.6452 0.8674 0.7788 0.6974 0.5781 0.3409 0.2743 0.9472 0.6802 0.9493 0.5277 0.6165 0.361 0.1409 0.5912 0.4934 0.2715 0.7146 0.8635 0.4921 0.598 0.1783 0.4249 0.6697 0.7272 0.5779 
0.3985 0.5564 0.5223 0.6976 0.3883 0.3113 0.2511 0.8232 0.5367 0.4129 0.0221 0.231 0.7201 0.1383 0.111 0.3659 0.6309 0.597 0.1832 0.3462 0.168 0.0015 0.0477 0.0349 0.1798 0.4641 0.551 0.0875 0.0576 0.3521 0.1087 0.0062 0.9254 0.0663 0.5888 0.1471 0.2894 0.2056 0.6377 0.1577 0.6219 0.9937 0.0279 0.3889 0.8111 0.2762 0.0659 0.6442 0.0227 0.2514 0.1399 0.2886 0.3401 0.522 0.6448 0.3869 0.4866 0.8555 0.061 0.5146 0.3252 0.1869 0.1876 0.2059 0.046 0.307 0.4468 0.6719 0.0187 0.4081 0.0549 0.3119 0.0141 0.3978 0.0337 0.2602 0.1551] 2022-08-23 08:27:32 [INFO] [EVAL] The model with the best validation mIoU (0.3544) was saved at iter 129000. 2022-08-23 08:27:41 [INFO] [TRAIN] epoch: 106, iter: 133050/160000, loss: 0.5243, lr: 0.000204, batch_cost: 0.1777, reader_cost: 0.00350, ips: 45.0229 samples/sec | ETA 01:19:48 2022-08-23 08:27:50 [INFO] [TRAIN] epoch: 106, iter: 133100/160000, loss: 0.4709, lr: 0.000204, batch_cost: 0.1882, reader_cost: 0.00081, ips: 42.5184 samples/sec | ETA 01:24:21 2022-08-23 08:28:00 [INFO] [TRAIN] epoch: 106, iter: 133150/160000, loss: 0.4966, lr: 0.000203, batch_cost: 0.1987, reader_cost: 0.00074, ips: 40.2708 samples/sec | ETA 01:28:53 2022-08-23 08:28:10 [INFO] [TRAIN] epoch: 106, iter: 133200/160000, loss: 0.4679, lr: 0.000203, batch_cost: 0.1953, reader_cost: 0.00155, ips: 40.9531 samples/sec | ETA 01:27:15 2022-08-23 08:28:19 [INFO] [TRAIN] epoch: 106, iter: 133250/160000, loss: 0.4707, lr: 0.000203, batch_cost: 0.1834, reader_cost: 0.00138, ips: 43.6095 samples/sec | ETA 01:21:47 2022-08-23 08:28:29 [INFO] [TRAIN] epoch: 106, iter: 133300/160000, loss: 0.5307, lr: 0.000202, batch_cost: 0.1977, reader_cost: 0.00036, ips: 40.4632 samples/sec | ETA 01:27:58 2022-08-23 08:28:38 [INFO] [TRAIN] epoch: 106, iter: 133350/160000, loss: 0.4903, lr: 0.000202, batch_cost: 0.1825, reader_cost: 0.00091, ips: 43.8456 samples/sec | ETA 01:21:02 2022-08-23 08:28:47 [INFO] [TRAIN] epoch: 106, iter: 133400/160000, loss: 0.5014, lr: 0.000201, batch_cost: 0.1732, reader_cost: 0.00087, ips: 46.1934 samples/sec | ETA 01:16:46 2022-08-23 08:28:56 [INFO] [TRAIN] epoch: 106, iter: 133450/160000, loss: 0.5119, lr: 0.000201, batch_cost: 0.1908, reader_cost: 0.00069, ips: 41.9208 samples/sec | ETA 01:24:26 2022-08-23 08:29:07 [INFO] [TRAIN] epoch: 106, iter: 133500/160000, loss: 0.4966, lr: 0.000201, batch_cost: 0.2076, reader_cost: 0.00040, ips: 38.5445 samples/sec | ETA 01:31:40 2022-08-23 08:29:16 [INFO] [TRAIN] epoch: 106, iter: 133550/160000, loss: 0.4976, lr: 0.000200, batch_cost: 0.1840, reader_cost: 0.00096, ips: 43.4872 samples/sec | ETA 01:21:05 2022-08-23 08:29:26 [INFO] [TRAIN] epoch: 106, iter: 133600/160000, loss: 0.4916, lr: 0.000200, batch_cost: 0.2083, reader_cost: 0.00078, ips: 38.4147 samples/sec | ETA 01:31:37 2022-08-23 08:29:36 [INFO] [TRAIN] epoch: 106, iter: 133650/160000, loss: 0.5332, lr: 0.000200, batch_cost: 0.1950, reader_cost: 0.00313, ips: 41.0238 samples/sec | ETA 01:25:38 2022-08-23 08:29:47 [INFO] [TRAIN] epoch: 106, iter: 133700/160000, loss: 0.4873, lr: 0.000199, batch_cost: 0.2195, reader_cost: 0.00368, ips: 36.4421 samples/sec | ETA 01:36:13 2022-08-23 08:29:58 [INFO] [TRAIN] epoch: 106, iter: 133750/160000, loss: 0.4628, lr: 0.000199, batch_cost: 0.2177, reader_cost: 0.00056, ips: 36.7525 samples/sec | ETA 01:35:13 2022-08-23 08:30:08 [INFO] [TRAIN] epoch: 106, iter: 133800/160000, loss: 0.4785, lr: 0.000198, batch_cost: 0.1890, reader_cost: 0.00038, ips: 42.3304 samples/sec | ETA 01:22:31 2022-08-23 08:30:19 [INFO] [TRAIN] epoch: 106, iter: 
133850/160000, loss: 0.5036, lr: 0.000198, batch_cost: 0.2210, reader_cost: 0.00117, ips: 36.1977 samples/sec | ETA 01:36:19 2022-08-23 08:30:31 [INFO] [TRAIN] epoch: 107, iter: 133900/160000, loss: 0.5066, lr: 0.000198, batch_cost: 0.2402, reader_cost: 0.03210, ips: 33.3074 samples/sec | ETA 01:44:28 2022-08-23 08:30:42 [INFO] [TRAIN] epoch: 107, iter: 133950/160000, loss: 0.4734, lr: 0.000197, batch_cost: 0.2265, reader_cost: 0.00910, ips: 35.3161 samples/sec | ETA 01:38:20 2022-08-23 08:30:53 [INFO] [TRAIN] epoch: 107, iter: 134000/160000, loss: 0.4858, lr: 0.000197, batch_cost: 0.2210, reader_cost: 0.00051, ips: 36.1944 samples/sec | ETA 01:35:46 2022-08-23 08:30:53 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 176s - batch_cost: 0.1756 - reader cost: 6.4120e-04 2022-08-23 08:33:49 [INFO] [EVAL] #Images: 2000 mIoU: 0.3471 Acc: 0.7680 Kappa: 0.7501 Dice: 0.4784 2022-08-23 08:33:49 [INFO] [EVAL] Class IoU: [0.69 0.7746 0.9326 0.7319 0.6788 0.7617 0.7865 0.7885 0.5303 0.5973 0.4975 0.5399 0.6986 0.3039 0.3186 0.4381 0.5188 0.4352 0.6217 0.4233 0.7482 0.5264 0.6451 0.4901 0.2975 0.2442 0.4796 0.4623 0.399 0.2508 0.2766 0.4708 0.2607 0.359 0.3122 0.3874 0.4326 0.5394 0.2288 0.4055 0.2253 0.1026 0.3593 0.2611 0.2798 0.176 0.2701 0.509 0.6573 0.5516 0.5958 0.407 0.2056 0.2278 0.665 0.4395 0.8628 0.4387 0.4634 0.2376 0.0698 0.42 0.3397 0.2276 0.4828 0.6936 0.3046 0.3871 0.1299 0.2952 0.5095 0.5353 0.384 0.189 0.4568 0.3159 0.5516 0.296 0.2758 0.1509 0.6122 0.4079 0.3476 0.026 0.1196 0.569 0.1108 0.0932 0.2143 0.5264 0.3985 0.0478 0.2186 0.1291 0.005 0.0297 0.0329 0.1318 0.2906 0.381 0.0966 0.0266 0.2991 0.1552 0.0036 0.551 0.0626 0.4969 0.0847 0.2328 0.1443 0.3449 0.1366 0.5229 0.8238 0.0137 0.313 0.6151 0.164 0.1344 0.4468 0.0109 0.2036 0.1615 0.2633 0.2973 0.4342 0.4256 0.3409 0.3585 0.5759 0.0752 0.3087 0.2825 0.2035 0.1219 0.1644 0.0348 0.2232 0.3676 0.2749 0.0059 0.3244 0.014 0.2215 0.011 0.3632 0.035 0.1951 0.1659] 2022-08-23 08:33:49 [INFO] [EVAL] Class Precision: [0.7808 0.8485 0.9642 0.8244 0.7616 0.8725 0.868 0.8513 0.6746 0.7453 0.709 0.7302 0.7704 0.5282 0.5468 0.5769 0.6956 0.6957 0.7761 0.6236 0.8407 0.6532 0.794 0.6338 0.5288 0.482 0.644 0.7421 0.682 0.3424 0.454 0.62 0.4591 0.4802 0.5085 0.5218 0.6339 0.7864 0.4426 0.6685 0.4252 0.2762 0.5862 0.5295 0.4115 0.3768 0.4061 0.6692 0.7427 0.6644 0.7614 0.6061 0.3383 0.5144 0.7076 0.6184 0.9191 0.6628 0.6892 0.4809 0.1356 0.5749 0.5002 0.7368 0.6133 0.8221 0.4627 0.5453 0.2843 0.4874 0.6752 0.7114 0.6609 0.2839 0.7257 0.519 0.7126 0.5314 0.7944 0.3578 0.7088 0.677 0.8165 0.0734 0.2978 0.7508 0.3062 0.3913 0.3918 0.7564 0.5227 0.0599 0.4055 0.378 0.0268 0.0771 0.4643 0.3779 0.4486 0.6611 0.4504 0.0725 0.5302 0.5861 0.3317 0.5807 0.2316 0.8071 0.2792 0.5442 0.321 0.4426 0.4713 0.7535 0.8312 0.08 0.6455 0.7269 0.246 0.5515 0.5552 0.1223 0.6042 0.6098 0.6704 0.6721 0.7564 0.6011 0.8029 0.577 0.6454 0.3029 0.4469 0.6097 0.5551 0.2757 0.4087 0.0668 0.4807 0.6213 0.3294 0.0092 0.6049 0.2842 0.4863 0.015 0.8643 0.3162 0.691 0.6243] 2022-08-23 08:33:49 [INFO] [EVAL] Class Recall: [0.8558 0.899 0.966 0.8671 0.862 0.8571 0.8934 0.9144 0.7125 0.7506 0.6252 0.6745 0.8823 0.4172 0.4329 0.6455 0.6711 0.5375 0.7576 0.5685 0.8718 0.7306 0.7747 0.6837 0.4048 0.3312 0.6527 0.5509 0.4902 0.4837 0.4145 0.6618 0.3762 0.5873 0.4471 0.6005 0.5767 0.632 0.3215 0.5076 0.3239 0.1403 0.4814 0.34 0.4664 0.2483 0.4463 0.6801 0.8511 0.7646 0.7326 0.5534 0.3439 0.2902 0.917 0.6031 0.9337 0.5647 0.5858 0.3195 0.1256 0.6092 0.5142 
0.2477 0.694 0.816 0.4714 0.5716 0.193 0.4282 0.6749 0.6838 0.4781 0.3612 0.5521 0.4466 0.7093 0.4005 0.297 0.2069 0.818 0.5064 0.377 0.0386 0.1666 0.7015 0.1479 0.109 0.3211 0.6338 0.6266 0.1914 0.3218 0.1639 0.0062 0.0461 0.0342 0.1683 0.4521 0.4736 0.1095 0.0403 0.4071 0.1744 0.0036 0.9152 0.079 0.5639 0.1084 0.2892 0.2076 0.6099 0.1613 0.6308 0.9892 0.0162 0.3779 0.8 0.3299 0.1509 0.6958 0.0119 0.2349 0.1801 0.3024 0.3477 0.5047 0.5932 0.372 0.4862 0.8425 0.091 0.4996 0.3449 0.2431 0.1794 0.2158 0.0676 0.294 0.4738 0.6246 0.0165 0.4115 0.0145 0.2892 0.0401 0.3851 0.0379 0.2137 0.1843] 2022-08-23 08:33:49 [INFO] [EVAL] The model with the best validation mIoU (0.3544) was saved at iter 129000. 2022-08-23 08:33:57 [INFO] [TRAIN] epoch: 107, iter: 134050/160000, loss: 0.4980, lr: 0.000196, batch_cost: 0.1700, reader_cost: 0.00389, ips: 47.0714 samples/sec | ETA 01:13:30 2022-08-23 08:34:05 [INFO] [TRAIN] epoch: 107, iter: 134100/160000, loss: 0.4619, lr: 0.000196, batch_cost: 0.1589, reader_cost: 0.00074, ips: 50.3612 samples/sec | ETA 01:08:34 2022-08-23 08:34:14 [INFO] [TRAIN] epoch: 107, iter: 134150/160000, loss: 0.4803, lr: 0.000196, batch_cost: 0.1786, reader_cost: 0.00052, ips: 44.7836 samples/sec | ETA 01:16:57 2022-08-23 08:34:24 [INFO] [TRAIN] epoch: 107, iter: 134200/160000, loss: 0.4850, lr: 0.000195, batch_cost: 0.1847, reader_cost: 0.00054, ips: 43.3028 samples/sec | ETA 01:19:26 2022-08-23 08:34:32 [INFO] [TRAIN] epoch: 107, iter: 134250/160000, loss: 0.4977, lr: 0.000195, batch_cost: 0.1746, reader_cost: 0.00043, ips: 45.8288 samples/sec | ETA 01:14:54 2022-08-23 08:34:42 [INFO] [TRAIN] epoch: 107, iter: 134300/160000, loss: 0.5224, lr: 0.000195, batch_cost: 0.1865, reader_cost: 0.00054, ips: 42.8897 samples/sec | ETA 01:19:53 2022-08-23 08:34:51 [INFO] [TRAIN] epoch: 107, iter: 134350/160000, loss: 0.4950, lr: 0.000194, batch_cost: 0.1870, reader_cost: 0.00071, ips: 42.7843 samples/sec | ETA 01:19:56 2022-08-23 08:35:01 [INFO] [TRAIN] epoch: 107, iter: 134400/160000, loss: 0.4706, lr: 0.000194, batch_cost: 0.1920, reader_cost: 0.00046, ips: 41.6645 samples/sec | ETA 01:21:55 2022-08-23 08:35:10 [INFO] [TRAIN] epoch: 107, iter: 134450/160000, loss: 0.5362, lr: 0.000193, batch_cost: 0.1922, reader_cost: 0.00062, ips: 41.6221 samples/sec | ETA 01:21:50 2022-08-23 08:35:20 [INFO] [TRAIN] epoch: 107, iter: 134500/160000, loss: 0.4836, lr: 0.000193, batch_cost: 0.1887, reader_cost: 0.00071, ips: 42.3972 samples/sec | ETA 01:20:11 2022-08-23 08:35:29 [INFO] [TRAIN] epoch: 107, iter: 134550/160000, loss: 0.4963, lr: 0.000193, batch_cost: 0.1804, reader_cost: 0.00041, ips: 44.3410 samples/sec | ETA 01:16:31 2022-08-23 08:35:38 [INFO] [TRAIN] epoch: 107, iter: 134600/160000, loss: 0.4921, lr: 0.000192, batch_cost: 0.1897, reader_cost: 0.00066, ips: 42.1778 samples/sec | ETA 01:20:17 2022-08-23 08:35:48 [INFO] [TRAIN] epoch: 107, iter: 134650/160000, loss: 0.4926, lr: 0.000192, batch_cost: 0.2065, reader_cost: 0.00035, ips: 38.7414 samples/sec | ETA 01:27:14 2022-08-23 08:35:58 [INFO] [TRAIN] epoch: 107, iter: 134700/160000, loss: 0.5028, lr: 0.000192, batch_cost: 0.1878, reader_cost: 0.00095, ips: 42.5944 samples/sec | ETA 01:19:11 2022-08-23 08:36:07 [INFO] [TRAIN] epoch: 107, iter: 134750/160000, loss: 0.4821, lr: 0.000191, batch_cost: 0.1909, reader_cost: 0.00093, ips: 41.9134 samples/sec | ETA 01:20:19 2022-08-23 08:36:17 [INFO] [TRAIN] epoch: 107, iter: 134800/160000, loss: 0.4896, lr: 0.000191, batch_cost: 0.1925, reader_cost: 0.00070, ips: 41.5653 samples/sec | ETA 01:20:50 
2022-08-23 08:36:27 [INFO] [TRAIN] epoch: 107, iter: 134850/160000, loss: 0.4829, lr: 0.000190, batch_cost: 0.1943, reader_cost: 0.00096, ips: 41.1803 samples/sec | ETA 01:21:25 2022-08-23 08:36:36 [INFO] [TRAIN] epoch: 107, iter: 134900/160000, loss: 0.5109, lr: 0.000190, batch_cost: 0.1874, reader_cost: 0.00202, ips: 42.6822 samples/sec | ETA 01:18:24 2022-08-23 08:36:46 [INFO] [TRAIN] epoch: 107, iter: 134950/160000, loss: 0.5014, lr: 0.000190, batch_cost: 0.1971, reader_cost: 0.00167, ips: 40.5813 samples/sec | ETA 01:22:18 2022-08-23 08:36:54 [INFO] [TRAIN] epoch: 107, iter: 135000/160000, loss: 0.4920, lr: 0.000189, batch_cost: 0.1680, reader_cost: 0.00055, ips: 47.6154 samples/sec | ETA 01:10:00 2022-08-23 08:36:54 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 149s - batch_cost: 0.1486 - reader cost: 8.4989e-04 2022-08-23 08:39:23 [INFO] [EVAL] #Images: 2000 mIoU: 0.3488 Acc: 0.7696 Kappa: 0.7520 Dice: 0.4810 2022-08-23 08:39:23 [INFO] [EVAL] Class IoU: [0.6929 0.7762 0.9327 0.7318 0.6802 0.77 0.7882 0.7842 0.5282 0.6047 0.5008 0.5599 0.7045 0.3186 0.3151 0.4333 0.5292 0.4217 0.6276 0.4201 0.7403 0.5163 0.6421 0.4898 0.3282 0.3398 0.479 0.4701 0.386 0.2629 0.2721 0.4576 0.26 0.3652 0.3181 0.3816 0.4342 0.5313 0.2486 0.3907 0.2208 0.0862 0.3491 0.2586 0.2991 0.1804 0.288 0.5135 0.6142 0.5265 0.5785 0.4131 0.2207 0.2307 0.7142 0.4668 0.8654 0.4412 0.4289 0.2308 0.0795 0.428 0.3329 0.2746 0.487 0.6942 0.3057 0.3974 0.1454 0.3033 0.4972 0.5424 0.3992 0.2315 0.4581 0.3305 0.5278 0.2686 0.2683 0.1423 0.5984 0.4114 0.3758 0.0318 0.1293 0.5728 0.1008 0.1007 0.2386 0.5312 0.4131 0.0825 0.235 0.1113 0.0011 0.0211 0.0303 0.1372 0.2989 0.4203 0.1162 0.0358 0.2847 0.1375 0.0115 0.5152 0.0632 0.4953 0.0956 0.2259 0.1273 0.335 0.1489 0.5268 0.7982 0.0127 0.295 0.6231 0.126 0.1451 0.4261 0.0091 0.213 0.17 0.2628 0.2536 0.4631 0.4166 0.3765 0.3347 0.5321 0.0598 0.2933 0.3065 0.1866 0.1167 0.1549 0.0483 0.2415 0.357 0.3238 0.0063 0.316 0.0491 0.182 0.0011 0.382 0.0285 0.2229 0.1578] 2022-08-23 08:39:23 [INFO] [EVAL] Class Precision: [0.792 0.8593 0.9644 0.8199 0.7595 0.8612 0.8813 0.8456 0.6695 0.7608 0.6845 0.7206 0.7837 0.5046 0.5468 0.5743 0.6912 0.7133 0.7875 0.6282 0.8246 0.6585 0.7777 0.637 0.5259 0.5068 0.6678 0.756 0.7009 0.3514 0.4389 0.6029 0.5023 0.5076 0.5181 0.5439 0.6166 0.7768 0.4489 0.7027 0.3944 0.2534 0.5936 0.5292 0.4512 0.3872 0.4534 0.689 0.6881 0.6234 0.7325 0.5962 0.4013 0.486 0.7459 0.6355 0.9161 0.6241 0.704 0.4519 0.1398 0.5801 0.4757 0.6485 0.6183 0.7785 0.4114 0.5893 0.4023 0.5492 0.6875 0.6873 0.5789 0.313 0.7047 0.5017 0.6141 0.4748 0.5785 0.3847 0.6854 0.6792 0.7938 0.0851 0.303 0.7262 0.3274 0.3753 0.429 0.7133 0.549 0.1236 0.4995 0.3646 0.0035 0.0549 0.2834 0.3657 0.4614 0.6311 0.3935 0.1322 0.6216 0.5871 0.289 0.5397 0.1678 0.7785 0.2877 0.5516 0.3211 0.4225 0.4603 0.7647 0.8037 0.1256 0.7601 0.7388 0.1966 0.5712 0.6357 0.1277 0.6245 0.6017 0.6478 0.6519 0.7792 0.571 0.8039 0.5169 0.5896 0.409 0.375 0.5833 0.5848 0.209 0.4026 0.1095 0.4761 0.6482 0.3999 0.0096 0.6104 0.3804 0.4095 0.003 0.8267 0.3655 0.6028 0.8004] 2022-08-23 08:39:23 [INFO] [EVAL] Class Recall: [0.8471 0.8892 0.966 0.8719 0.8669 0.879 0.8817 0.9152 0.7145 0.7467 0.6512 0.7152 0.8745 0.4636 0.4265 0.6383 0.693 0.5078 0.7556 0.5591 0.8787 0.705 0.7865 0.6795 0.4662 0.5077 0.6289 0.5541 0.4621 0.5106 0.4172 0.6552 0.3502 0.5655 0.4518 0.5612 0.5948 0.6271 0.3577 0.468 0.3341 0.1156 0.4587 0.3359 0.4702 0.2525 0.4412 0.6684 0.8512 0.7722 0.7335 0.5736 0.329 0.3051 
0.9439 0.6374 0.9399 0.6009 0.5233 0.3204 0.1556 0.6201 0.5259 0.3227 0.6963 0.865 0.5433 0.5497 0.1854 0.4038 0.6424 0.7202 0.5625 0.4705 0.567 0.4921 0.7898 0.3822 0.3335 0.1843 0.825 0.5106 0.4164 0.0484 0.184 0.7305 0.1271 0.1209 0.3497 0.6754 0.6253 0.1989 0.3074 0.1381 0.0016 0.0332 0.0328 0.18 0.459 0.5571 0.1416 0.0468 0.3444 0.1522 0.0118 0.9189 0.0921 0.5766 0.1253 0.2767 0.1742 0.618 0.1804 0.6287 0.9916 0.014 0.3253 0.7991 0.2595 0.1629 0.5638 0.0097 0.2443 0.1916 0.3066 0.2933 0.533 0.6064 0.4146 0.4871 0.845 0.0655 0.574 0.3925 0.2151 0.209 0.2012 0.0795 0.329 0.4428 0.63 0.0179 0.3958 0.0534 0.2467 0.0018 0.4152 0.0299 0.2613 0.1643] 2022-08-23 08:39:23 [INFO] [EVAL] The model with the best validation mIoU (0.3544) was saved at iter 129000. 2022-08-23 08:39:32 [INFO] [TRAIN] epoch: 107, iter: 135050/160000, loss: 0.5001, lr: 0.000189, batch_cost: 0.1809, reader_cost: 0.00350, ips: 44.2126 samples/sec | ETA 01:15:14 2022-08-23 08:39:42 [INFO] [TRAIN] epoch: 107, iter: 135100/160000, loss: 0.4865, lr: 0.000189, batch_cost: 0.1970, reader_cost: 0.00077, ips: 40.6081 samples/sec | ETA 01:21:45 2022-08-23 08:39:56 [INFO] [TRAIN] epoch: 108, iter: 135150/160000, loss: 0.4846, lr: 0.000188, batch_cost: 0.2758, reader_cost: 0.08483, ips: 29.0087 samples/sec | ETA 01:54:13 2022-08-23 08:40:05 [INFO] [TRAIN] epoch: 108, iter: 135200/160000, loss: 0.4879, lr: 0.000188, batch_cost: 0.1711, reader_cost: 0.00132, ips: 46.7647 samples/sec | ETA 01:10:42 2022-08-23 08:40:14 [INFO] [TRAIN] epoch: 108, iter: 135250/160000, loss: 0.4802, lr: 0.000187, batch_cost: 0.1923, reader_cost: 0.00072, ips: 41.5936 samples/sec | ETA 01:19:20 2022-08-23 08:40:24 [INFO] [TRAIN] epoch: 108, iter: 135300/160000, loss: 0.4969, lr: 0.000187, batch_cost: 0.1988, reader_cost: 0.00050, ips: 40.2445 samples/sec | ETA 01:21:49 2022-08-23 08:40:34 [INFO] [TRAIN] epoch: 108, iter: 135350/160000, loss: 0.5184, lr: 0.000187, batch_cost: 0.1982, reader_cost: 0.00035, ips: 40.3590 samples/sec | ETA 01:21:26 2022-08-23 08:40:44 [INFO] [TRAIN] epoch: 108, iter: 135400/160000, loss: 0.4891, lr: 0.000186, batch_cost: 0.2049, reader_cost: 0.00144, ips: 39.0479 samples/sec | ETA 01:23:59 2022-08-23 08:40:54 [INFO] [TRAIN] epoch: 108, iter: 135450/160000, loss: 0.4668, lr: 0.000186, batch_cost: 0.2049, reader_cost: 0.00056, ips: 39.0488 samples/sec | ETA 01:23:49 2022-08-23 08:41:04 [INFO] [TRAIN] epoch: 108, iter: 135500/160000, loss: 0.4610, lr: 0.000185, batch_cost: 0.1855, reader_cost: 0.00075, ips: 43.1359 samples/sec | ETA 01:15:43 2022-08-23 08:41:13 [INFO] [TRAIN] epoch: 108, iter: 135550/160000, loss: 0.5057, lr: 0.000185, batch_cost: 0.1785, reader_cost: 0.00075, ips: 44.8261 samples/sec | ETA 01:12:43 2022-08-23 08:41:23 [INFO] [TRAIN] epoch: 108, iter: 135600/160000, loss: 0.4868, lr: 0.000185, batch_cost: 0.2031, reader_cost: 0.00067, ips: 39.3892 samples/sec | ETA 01:22:35 2022-08-23 08:41:32 [INFO] [TRAIN] epoch: 108, iter: 135650/160000, loss: 0.4765, lr: 0.000184, batch_cost: 0.1849, reader_cost: 0.00070, ips: 43.2759 samples/sec | ETA 01:15:01 2022-08-23 08:41:42 [INFO] [TRAIN] epoch: 108, iter: 135700/160000, loss: 0.4616, lr: 0.000184, batch_cost: 0.1969, reader_cost: 0.00176, ips: 40.6335 samples/sec | ETA 01:19:44 2022-08-23 08:41:52 [INFO] [TRAIN] epoch: 108, iter: 135750/160000, loss: 0.4966, lr: 0.000184, batch_cost: 0.1935, reader_cost: 0.00082, ips: 41.3501 samples/sec | ETA 01:18:11 2022-08-23 08:42:01 [INFO] [TRAIN] epoch: 108, iter: 135800/160000, loss: 0.4924, lr: 0.000183, batch_cost: 0.1885, 
reader_cost: 0.00041, ips: 42.4448 samples/sec | ETA 01:16:01 2022-08-23 08:42:10 [INFO] [TRAIN] epoch: 108, iter: 135850/160000, loss: 0.4833, lr: 0.000183, batch_cost: 0.1835, reader_cost: 0.00072, ips: 43.6079 samples/sec | ETA 01:13:50 2022-08-23 08:42:19 [INFO] [TRAIN] epoch: 108, iter: 135900/160000, loss: 0.4555, lr: 0.000182, batch_cost: 0.1837, reader_cost: 0.00124, ips: 43.5457 samples/sec | ETA 01:13:47 2022-08-23 08:42:29 [INFO] [TRAIN] epoch: 108, iter: 135950/160000, loss: 0.5144, lr: 0.000182, batch_cost: 0.1827, reader_cost: 0.00089, ips: 43.7890 samples/sec | ETA 01:13:13 2022-08-23 08:42:38 [INFO] [TRAIN] epoch: 108, iter: 136000/160000, loss: 0.5149, lr: 0.000182, batch_cost: 0.1856, reader_cost: 0.00083, ips: 43.1112 samples/sec | ETA 01:14:13 2022-08-23 08:42:38 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 151s - batch_cost: 0.1514 - reader cost: 5.5848e-04 2022-08-23 08:45:09 [INFO] [EVAL] #Images: 2000 mIoU: 0.3490 Acc: 0.7703 Kappa: 0.7527 Dice: 0.4803 2022-08-23 08:45:09 [INFO] [EVAL] Class IoU: [0.691 0.778 0.9324 0.7348 0.683 0.7666 0.7876 0.7894 0.5312 0.6319 0.4926 0.5599 0.7052 0.3165 0.3246 0.4344 0.5187 0.43 0.6278 0.4133 0.7404 0.5265 0.6461 0.4907 0.3237 0.3767 0.4977 0.4608 0.4121 0.2509 0.2771 0.4807 0.2728 0.3451 0.33 0.3901 0.4331 0.5243 0.2514 0.423 0.2 0.104 0.3531 0.2597 0.2875 0.1911 0.2656 0.5104 0.63 0.5209 0.5617 0.4008 0.2038 0.247 0.6856 0.4777 0.8676 0.4302 0.4316 0.2517 0.067 0.3871 0.3306 0.2019 0.4631 0.7011 0.3024 0.3928 0.129 0.3165 0.5034 0.5455 0.4031 0.2089 0.447 0.3335 0.5459 0.284 0.2699 0.1899 0.5849 0.4169 0.3611 0.0304 0.0666 0.5763 0.1048 0.0874 0.2407 0.5066 0.432 0.0684 0.2252 0.1239 0.0003 0.022 0.0282 0.1335 0.2937 0.4287 0.1218 0.0242 0.2855 0.1352 0.009 0.5747 0.0352 0.5004 0.0979 0.2325 0.1257 0.4171 0.1373 0.568 0.7765 0.0126 0.2801 0.6202 0.1664 0.1581 0.44 0.01 0.2074 0.1581 0.2629 0.2671 0.4606 0.4042 0.2113 0.3355 0.588 0.0724 0.3352 0.3045 0.1913 0.115 0.1577 0.0441 0.2224 0.3631 0.3305 0.002 0.3443 0.0147 0.2563 0.0006 0.3577 0.0288 0.1951 0.1596] 2022-08-23 08:45:09 [INFO] [EVAL] Class Precision: [0.7808 0.8615 0.9623 0.8292 0.7718 0.874 0.8792 0.8539 0.6813 0.7635 0.6981 0.7228 0.7745 0.4946 0.5315 0.5987 0.6939 0.7088 0.7695 0.6225 0.8253 0.6617 0.7822 0.6438 0.5365 0.5269 0.6491 0.7616 0.6732 0.3846 0.436 0.6497 0.5103 0.4321 0.5385 0.5652 0.621 0.8055 0.4493 0.6297 0.4157 0.2677 0.5973 0.5329 0.3836 0.3998 0.4343 0.6714 0.7428 0.5975 0.7484 0.5811 0.361 0.4861 0.7064 0.6453 0.915 0.6883 0.779 0.5222 0.1188 0.5589 0.476 0.6977 0.5646 0.8021 0.4344 0.5531 0.3462 0.551 0.6913 0.7166 0.6344 0.2681 0.7578 0.4973 0.6626 0.5013 0.7415 0.4503 0.6627 0.6286 0.8139 0.0928 0.1986 0.7451 0.3421 0.4092 0.4087 0.7435 0.6225 0.0941 0.414 0.3743 0.0011 0.0552 0.3173 0.3662 0.4848 0.6422 0.5167 0.134 0.5935 0.597 0.3225 0.6285 0.1385 0.7937 0.2576 0.5344 0.3089 0.5561 0.4738 0.7618 0.7799 0.0751 0.6472 0.745 0.2203 0.6214 0.6547 0.1518 0.6905 0.6105 0.6959 0.6686 0.8253 0.5293 0.5402 0.5461 0.6575 0.4446 0.4563 0.5811 0.5644 0.2505 0.3853 0.1268 0.467 0.6399 0.4149 0.003 0.5989 0.2351 0.4 0.0015 0.8556 0.445 0.6894 0.7886] 2022-08-23 08:45:09 [INFO] [EVAL] Class Recall: [0.8573 0.8892 0.9678 0.8659 0.8559 0.8619 0.8832 0.9127 0.7068 0.7856 0.626 0.713 0.8874 0.4678 0.4547 0.6129 0.6726 0.5223 0.7732 0.5515 0.8781 0.7203 0.7878 0.6734 0.4494 0.5691 0.6809 0.5384 0.5151 0.4192 0.4318 0.6489 0.3696 0.6315 0.4602 0.5573 0.5888 0.6003 0.3634 0.563 0.2783 0.1454 0.4635 0.3362 0.5343 0.2681 0.4061 
0.6803 0.8058 0.8024 0.6924 0.5636 0.3187 0.3344 0.959 0.6478 0.9437 0.5342 0.4918 0.3271 0.1333 0.5574 0.5198 0.2213 0.7204 0.8478 0.4989 0.5754 0.1705 0.4266 0.6494 0.6956 0.5251 0.486 0.5215 0.5032 0.7561 0.3959 0.2979 0.2472 0.8327 0.5532 0.3936 0.0433 0.0911 0.7178 0.1313 0.1 0.3695 0.6139 0.5854 0.2001 0.3305 0.1563 0.0005 0.0353 0.03 0.1736 0.427 0.5632 0.1375 0.0287 0.355 0.1488 0.0092 0.8705 0.0451 0.5752 0.1364 0.2915 0.175 0.6253 0.162 0.6906 0.9944 0.015 0.3305 0.7874 0.4046 0.175 0.5729 0.0106 0.2287 0.1758 0.297 0.3079 0.5104 0.6309 0.2576 0.4652 0.8476 0.0795 0.558 0.3902 0.2245 0.1753 0.2107 0.0633 0.2982 0.4563 0.6189 0.0058 0.4475 0.0155 0.4163 0.0011 0.3807 0.0298 0.2139 0.1667] 2022-08-23 08:45:10 [INFO] [EVAL] The model with the best validation mIoU (0.3544) was saved at iter 129000. 2022-08-23 08:45:19 [INFO] [TRAIN] epoch: 108, iter: 136050/160000, loss: 0.5015, lr: 0.000181, batch_cost: 0.1884, reader_cost: 0.00330, ips: 42.4520 samples/sec | ETA 01:15:13 2022-08-23 08:45:29 [INFO] [TRAIN] epoch: 108, iter: 136100/160000, loss: 0.5115, lr: 0.000181, batch_cost: 0.2052, reader_cost: 0.00187, ips: 38.9922 samples/sec | ETA 01:21:43 2022-08-23 08:45:39 [INFO] [TRAIN] epoch: 108, iter: 136150/160000, loss: 0.4945, lr: 0.000181, batch_cost: 0.1913, reader_cost: 0.00068, ips: 41.8184 samples/sec | ETA 01:16:02 2022-08-23 08:45:48 [INFO] [TRAIN] epoch: 108, iter: 136200/160000, loss: 0.4990, lr: 0.000180, batch_cost: 0.1757, reader_cost: 0.00052, ips: 45.5257 samples/sec | ETA 01:09:42 2022-08-23 08:45:58 [INFO] [TRAIN] epoch: 108, iter: 136250/160000, loss: 0.4721, lr: 0.000180, batch_cost: 0.2037, reader_cost: 0.00059, ips: 39.2782 samples/sec | ETA 01:20:37 2022-08-23 08:46:08 [INFO] [TRAIN] epoch: 108, iter: 136300/160000, loss: 0.5196, lr: 0.000179, batch_cost: 0.2059, reader_cost: 0.00237, ips: 38.8568 samples/sec | ETA 01:21:19 2022-08-23 08:46:19 [INFO] [TRAIN] epoch: 108, iter: 136350/160000, loss: 0.5122, lr: 0.000179, batch_cost: 0.2090, reader_cost: 0.00077, ips: 38.2828 samples/sec | ETA 01:22:22 2022-08-23 08:46:30 [INFO] [TRAIN] epoch: 108, iter: 136400/160000, loss: 0.5262, lr: 0.000179, batch_cost: 0.2286, reader_cost: 0.00119, ips: 34.9934 samples/sec | ETA 01:29:55 2022-08-23 08:46:44 [INFO] [TRAIN] epoch: 109, iter: 136450/160000, loss: 0.4611, lr: 0.000178, batch_cost: 0.2736, reader_cost: 0.06405, ips: 29.2440 samples/sec | ETA 01:47:22 2022-08-23 08:46:55 [INFO] [TRAIN] epoch: 109, iter: 136500/160000, loss: 0.4971, lr: 0.000178, batch_cost: 0.2248, reader_cost: 0.00080, ips: 35.5807 samples/sec | ETA 01:28:03 2022-08-23 08:47:04 [INFO] [TRAIN] epoch: 109, iter: 136550/160000, loss: 0.4627, lr: 0.000178, batch_cost: 0.1788, reader_cost: 0.00072, ips: 44.7452 samples/sec | ETA 01:09:52 2022-08-23 08:47:14 [INFO] [TRAIN] epoch: 109, iter: 136600/160000, loss: 0.5021, lr: 0.000177, batch_cost: 0.2018, reader_cost: 0.00073, ips: 39.6348 samples/sec | ETA 01:18:43 2022-08-23 08:47:24 [INFO] [TRAIN] epoch: 109, iter: 136650/160000, loss: 0.4856, lr: 0.000177, batch_cost: 0.1948, reader_cost: 0.00093, ips: 41.0676 samples/sec | ETA 01:15:48 2022-08-23 08:47:32 [INFO] [TRAIN] epoch: 109, iter: 136700/160000, loss: 0.4794, lr: 0.000176, batch_cost: 0.1661, reader_cost: 0.00040, ips: 48.1592 samples/sec | ETA 01:04:30 2022-08-23 08:47:41 [INFO] [TRAIN] epoch: 109, iter: 136750/160000, loss: 0.4467, lr: 0.000176, batch_cost: 0.1811, reader_cost: 0.00083, ips: 44.1864 samples/sec | ETA 01:10:09 2022-08-23 08:47:51 [INFO] [TRAIN] epoch: 109, iter: 136800/160000, 
loss: 0.4790, lr: 0.000176, batch_cost: 0.1909, reader_cost: 0.00069, ips: 41.9105 samples/sec | ETA 01:13:48 2022-08-23 08:48:00 [INFO] [TRAIN] epoch: 109, iter: 136850/160000, loss: 0.4472, lr: 0.000175, batch_cost: 0.1959, reader_cost: 0.00067, ips: 40.8469 samples/sec | ETA 01:15:34 2022-08-23 08:48:09 [INFO] [TRAIN] epoch: 109, iter: 136900/160000, loss: 0.5225, lr: 0.000175, batch_cost: 0.1699, reader_cost: 0.00061, ips: 47.0773 samples/sec | ETA 01:05:25 2022-08-23 08:48:18 [INFO] [TRAIN] epoch: 109, iter: 136950/160000, loss: 0.4741, lr: 0.000175, batch_cost: 0.1793, reader_cost: 0.00065, ips: 44.6207 samples/sec | ETA 01:08:52 2022-08-23 08:48:27 [INFO] [TRAIN] epoch: 109, iter: 137000/160000, loss: 0.5098, lr: 0.000174, batch_cost: 0.1901, reader_cost: 0.00093, ips: 42.0852 samples/sec | ETA 01:12:52 2022-08-23 08:48:27 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 163s - batch_cost: 0.1629 - reader cost: 7.1741e-04 2022-08-23 08:51:10 [INFO] [EVAL] #Images: 2000 mIoU: 0.3477 Acc: 0.7692 Kappa: 0.7515 Dice: 0.4792 2022-08-23 08:51:10 [INFO] [EVAL] Class IoU: [0.691 0.7786 0.9325 0.732 0.6791 0.7691 0.7902 0.7831 0.5261 0.613 0.4931 0.5703 0.6979 0.3027 0.3122 0.4348 0.5368 0.4407 0.6246 0.415 0.7434 0.4912 0.6477 0.4993 0.311 0.3069 0.4563 0.4467 0.4076 0.2679 0.2685 0.4868 0.2607 0.3578 0.3353 0.3907 0.4294 0.5359 0.2489 0.4133 0.2317 0.0977 0.3431 0.2604 0.2763 0.2201 0.2816 0.5155 0.5973 0.5106 0.5749 0.4151 0.2202 0.2339 0.6631 0.4509 0.8622 0.4164 0.4589 0.256 0.0553 0.4423 0.3328 0.1941 0.4708 0.6978 0.3154 0.3974 0.1286 0.3194 0.5203 0.5383 0.4096 0.2004 0.4465 0.3326 0.524 0.2744 0.2683 0.1796 0.6026 0.4045 0.3999 0.0314 0.1538 0.5685 0.0983 0.1049 0.2143 0.5054 0.4241 0.0469 0.2084 0.1134 0.0027 0.0127 0.022 0.1827 0.2936 0.4196 0.1278 0.0435 0.2684 0.325 0.0049 0.5671 0.0385 0.4925 0.1103 0.1957 0.1413 0.434 0.152 0.4894 0.8244 0.0175 0.2951 0.6209 0.1271 0.0815 0.4726 0.0115 0.1826 0.1692 0.2499 0.2721 0.4297 0.4145 0.1642 0.3532 0.5537 0.0732 0.3483 0.2374 0.1817 0.1276 0.1598 0.0398 0.2225 0.3602 0.2835 0.0064 0.2919 0.0328 0.2434 0.002 0.3754 0.0303 0.1813 0.1521] 2022-08-23 08:51:10 [INFO] [EVAL] Class Precision: [0.7834 0.8545 0.9641 0.8262 0.7606 0.8649 0.8922 0.842 0.6696 0.7591 0.6779 0.7292 0.77 0.5113 0.5434 0.6075 0.699 0.6966 0.7829 0.6256 0.8305 0.638 0.787 0.6267 0.5154 0.5595 0.6375 0.7822 0.6466 0.3705 0.4758 0.6853 0.4771 0.4767 0.5028 0.5361 0.6378 0.7877 0.4561 0.647 0.4167 0.2585 0.6033 0.5329 0.3888 0.4229 0.4163 0.6885 0.7085 0.6042 0.7548 0.5763 0.3836 0.5418 0.6814 0.6249 0.9093 0.6884 0.7677 0.5219 0.093 0.6133 0.5215 0.7188 0.6034 0.8012 0.4676 0.5273 0.3087 0.5999 0.7524 0.6729 0.6162 0.294 0.7155 0.4971 0.6396 0.5384 0.7179 0.3761 0.6872 0.596 0.7573 0.0991 0.3237 0.737 0.3192 0.3865 0.3644 0.768 0.6038 0.0578 0.3629 0.3642 0.011 0.0398 0.403 0.3921 0.4171 0.685 0.479 0.1213 0.615 0.674 0.1722 0.6174 0.1334 0.8112 0.2498 0.5376 0.289 0.6456 0.4471 0.7378 0.8316 0.0959 0.6367 0.7293 0.1936 0.5412 0.5875 0.1734 0.625 0.5647 0.6462 0.6805 0.8061 0.5673 0.439 0.5902 0.6081 0.3562 0.4718 0.6467 0.5613 0.2827 0.3261 0.0837 0.5288 0.6503 0.3376 0.0095 0.6236 0.4155 0.4647 0.003 0.8146 0.4426 0.591 0.7804] 2022-08-23 08:51:10 [INFO] [EVAL] Class Recall: [0.8542 0.8976 0.966 0.8652 0.8638 0.8741 0.8736 0.918 0.7105 0.761 0.6439 0.7235 0.8818 0.426 0.4231 0.6046 0.6983 0.5455 0.7554 0.5521 0.8763 0.6811 0.7855 0.7106 0.4396 0.4046 0.6162 0.5101 0.5244 0.4918 0.3812 0.6269 0.3649 0.5892 0.5016 0.5903 0.5679 0.6263 
0.354 0.5336 0.343 0.1357 0.4431 0.3373 0.4884 0.3145 0.4654 0.6723 0.7919 0.7673 0.7069 0.5975 0.3408 0.2917 0.9611 0.6182 0.9433 0.5131 0.5329 0.3344 0.1198 0.6133 0.4791 0.2101 0.6818 0.8439 0.4922 0.6174 0.1806 0.4059 0.6277 0.7291 0.5498 0.3864 0.5429 0.5013 0.7435 0.3588 0.2999 0.2558 0.8303 0.5573 0.4587 0.0439 0.2267 0.7132 0.1243 0.1258 0.3421 0.5965 0.5877 0.1994 0.3286 0.1414 0.0035 0.0183 0.0227 0.255 0.498 0.5199 0.1484 0.0635 0.3226 0.3857 0.005 0.8743 0.0514 0.5563 0.165 0.2352 0.2166 0.5697 0.1871 0.5924 0.9896 0.021 0.3549 0.8068 0.27 0.0875 0.7073 0.0122 0.2051 0.1945 0.2895 0.3119 0.4793 0.6061 0.2078 0.4679 0.8608 0.0843 0.5708 0.2727 0.2117 0.1887 0.2387 0.0704 0.2775 0.4467 0.6388 0.0187 0.3543 0.0344 0.3383 0.0061 0.4104 0.0315 0.2073 0.1589] 2022-08-23 08:51:11 [INFO] [EVAL] The model with the best validation mIoU (0.3544) was saved at iter 129000. 2022-08-23 08:51:20 [INFO] [TRAIN] epoch: 109, iter: 137050/160000, loss: 0.4822, lr: 0.000174, batch_cost: 0.1836, reader_cost: 0.00370, ips: 43.5728 samples/sec | ETA 01:10:13 2022-08-23 08:51:28 [INFO] [TRAIN] epoch: 109, iter: 137100/160000, loss: 0.5175, lr: 0.000173, batch_cost: 0.1749, reader_cost: 0.00110, ips: 45.7395 samples/sec | ETA 01:06:45 2022-08-23 08:51:38 [INFO] [TRAIN] epoch: 109, iter: 137150/160000, loss: 0.4744, lr: 0.000173, batch_cost: 0.1896, reader_cost: 0.00084, ips: 42.1862 samples/sec | ETA 01:12:13 2022-08-23 08:51:47 [INFO] [TRAIN] epoch: 109, iter: 137200/160000, loss: 0.5123, lr: 0.000173, batch_cost: 0.1786, reader_cost: 0.00030, ips: 44.8016 samples/sec | ETA 01:07:51 2022-08-23 08:51:56 [INFO] [TRAIN] epoch: 109, iter: 137250/160000, loss: 0.4901, lr: 0.000172, batch_cost: 0.1810, reader_cost: 0.00049, ips: 44.1997 samples/sec | ETA 01:08:37 2022-08-23 08:52:05 [INFO] [TRAIN] epoch: 109, iter: 137300/160000, loss: 0.4991, lr: 0.000172, batch_cost: 0.1833, reader_cost: 0.00084, ips: 43.6381 samples/sec | ETA 01:09:21 2022-08-23 08:52:13 [INFO] [TRAIN] epoch: 109, iter: 137350/160000, loss: 0.5334, lr: 0.000171, batch_cost: 0.1616, reader_cost: 0.00088, ips: 49.5025 samples/sec | ETA 01:01:00 2022-08-23 08:52:22 [INFO] [TRAIN] epoch: 109, iter: 137400/160000, loss: 0.4672, lr: 0.000171, batch_cost: 0.1791, reader_cost: 0.00065, ips: 44.6665 samples/sec | ETA 01:07:27 2022-08-23 08:52:32 [INFO] [TRAIN] epoch: 109, iter: 137450/160000, loss: 0.4746, lr: 0.000171, batch_cost: 0.1970, reader_cost: 0.00065, ips: 40.6023 samples/sec | ETA 01:14:03 2022-08-23 08:52:41 [INFO] [TRAIN] epoch: 109, iter: 137500/160000, loss: 0.4723, lr: 0.000170, batch_cost: 0.1731, reader_cost: 0.00182, ips: 46.2042 samples/sec | ETA 01:04:55 2022-08-23 08:52:49 [INFO] [TRAIN] epoch: 109, iter: 137550/160000, loss: 0.5048, lr: 0.000170, batch_cost: 0.1684, reader_cost: 0.00074, ips: 47.5005 samples/sec | ETA 01:03:01 2022-08-23 08:52:57 [INFO] [TRAIN] epoch: 109, iter: 137600/160000, loss: 0.5178, lr: 0.000170, batch_cost: 0.1682, reader_cost: 0.00041, ips: 47.5505 samples/sec | ETA 01:02:48 2022-08-23 08:53:06 [INFO] [TRAIN] epoch: 109, iter: 137650/160000, loss: 0.4676, lr: 0.000169, batch_cost: 0.1626, reader_cost: 0.00070, ips: 49.2145 samples/sec | ETA 01:00:33 2022-08-23 08:53:19 [INFO] [TRAIN] epoch: 110, iter: 137700/160000, loss: 0.4672, lr: 0.000169, batch_cost: 0.2635, reader_cost: 0.08224, ips: 30.3584 samples/sec | ETA 01:37:56 2022-08-23 08:53:27 [INFO] [TRAIN] epoch: 110, iter: 137750/160000, loss: 0.4785, lr: 0.000168, batch_cost: 0.1702, reader_cost: 0.00061, ips: 46.9949 samples/sec | ETA 01:03:07 
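The three per-class rows in each [EVAL] block are mutually consistent, because IoU, precision and recall are all ratios of the same confusion-matrix counts: IoU = TP/(TP+FP+FN) = 1/(1/P + 1/R - 1), and the per-class Dice score is 2*IoU/(1+IoU) (the summary Dice reported for the whole run is presumably the mean of those per-class values). A minimal check against class 0 of the iter-137000 evaluation above:

def iou_from_pr(precision, recall):
    # IoU = TP/(TP+FP+FN); with P = TP/(TP+FP) and R = TP/(TP+FN) this equals
    # 1 / (1/P + 1/R - 1)
    return 1.0 / (1.0 / precision + 1.0 / recall - 1.0)

def dice_from_iou(iou):
    # per-class Dice = 2*TP/(2*TP+FP+FN) = 2*IoU/(1+IoU)
    return 2.0 * iou / (1.0 + iou)

print(round(iou_from_pr(0.7834, 0.8542), 3))  # 0.691, matching the logged Class IoU[0]
print(round(dice_from_iou(0.691), 3))         # 0.817, the per-class Dice implied by that IoU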
2022-08-23 08:53:36 [INFO] [TRAIN] epoch: 110, iter: 137800/160000, loss: 0.5311, lr: 0.000168, batch_cost: 0.1704, reader_cost: 0.00061, ips: 46.9621 samples/sec | ETA 01:03:01 2022-08-23 08:53:44 [INFO] [TRAIN] epoch: 110, iter: 137850/160000, loss: 0.5017, lr: 0.000168, batch_cost: 0.1619, reader_cost: 0.00097, ips: 49.4155 samples/sec | ETA 00:59:45 2022-08-23 08:53:53 [INFO] [TRAIN] epoch: 110, iter: 137900/160000, loss: 0.4618, lr: 0.000167, batch_cost: 0.1832, reader_cost: 0.00063, ips: 43.6674 samples/sec | ETA 01:07:28 2022-08-23 08:54:03 [INFO] [TRAIN] epoch: 110, iter: 137950/160000, loss: 0.5159, lr: 0.000167, batch_cost: 0.1924, reader_cost: 0.00090, ips: 41.5834 samples/sec | ETA 01:10:42 2022-08-23 08:54:12 [INFO] [TRAIN] epoch: 110, iter: 138000/160000, loss: 0.5133, lr: 0.000167, batch_cost: 0.1842, reader_cost: 0.00070, ips: 43.4257 samples/sec | ETA 01:07:32 2022-08-23 08:54:12 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 155s - batch_cost: 0.1549 - reader cost: 7.3147e-04 2022-08-23 08:56:47 [INFO] [EVAL] #Images: 2000 mIoU: 0.3496 Acc: 0.7701 Kappa: 0.7526 Dice: 0.4818 2022-08-23 08:56:47 [INFO] [EVAL] Class IoU: [0.694 0.7798 0.9319 0.7335 0.6822 0.7705 0.7909 0.7901 0.5281 0.6097 0.4866 0.5589 0.704 0.3102 0.3268 0.4359 0.5292 0.4335 0.6263 0.4204 0.7449 0.5119 0.6427 0.4937 0.3215 0.3113 0.4938 0.4663 0.4196 0.2472 0.2706 0.481 0.2813 0.3564 0.32 0.3864 0.4317 0.5396 0.2604 0.4063 0.2218 0.0999 0.3512 0.2494 0.2942 0.1716 0.2641 0.5193 0.6176 0.5054 0.5611 0.4275 0.2153 0.2308 0.659 0.4548 0.8628 0.4073 0.4937 0.2641 0.0457 0.3724 0.3384 0.2415 0.4666 0.7053 0.3103 0.4004 0.1164 0.3083 0.5074 0.5393 0.4135 0.2014 0.4537 0.3323 0.5622 0.2737 0.2741 0.172 0.5812 0.4208 0.3996 0.0335 0.2017 0.5657 0.0791 0.0897 0.2451 0.5031 0.385 0.1228 0.2241 0.1091 0.0009 0.0111 0.0324 0.1536 0.3037 0.406 0.1109 0.0333 0.2854 0.4185 0.0088 0.551 0.0689 0.4997 0.092 0.241 0.1365 0.2855 0.1424 0.5268 0.7965 0.0142 0.3055 0.6033 0.1222 0.1853 0.4643 0.0105 0.2036 0.125 0.251 0.2807 0.439 0.4123 0.2804 0.3403 0.5576 0.068 0.2989 0.3057 0.2032 0.1223 0.1524 0.039 0.2315 0.3689 0.2586 0.0034 0.2941 0.0127 0.2332 0. 
0.3782 0.0309 0.1959 0.146 ] 2022-08-23 08:56:47 [INFO] [EVAL] Class Precision: [0.795 0.8537 0.9659 0.8236 0.7599 0.8658 0.8756 0.8542 0.6837 0.7455 0.6709 0.7335 0.7854 0.4947 0.545 0.6195 0.7459 0.6912 0.7583 0.6157 0.8382 0.658 0.7775 0.6154 0.4954 0.5418 0.6508 0.747 0.6257 0.3586 0.4386 0.6081 0.5012 0.4968 0.4775 0.5572 0.6405 0.7537 0.4447 0.6587 0.4437 0.2664 0.6137 0.5435 0.3868 0.3758 0.4334 0.696 0.7525 0.5889 0.7827 0.5922 0.366 0.4493 0.6785 0.6321 0.9296 0.7033 0.7409 0.5046 0.0804 0.5788 0.5426 0.6374 0.5855 0.819 0.4268 0.5378 0.2601 0.5395 0.6979 0.6857 0.6126 0.2995 0.7168 0.5262 0.6822 0.5032 0.6863 0.351 0.6588 0.6663 0.7603 0.0999 0.3914 0.7442 0.3264 0.3735 0.5341 0.7509 0.5014 0.2739 0.4105 0.3873 0.0027 0.0361 0.533 0.3376 0.4544 0.6803 0.3642 0.1315 0.593 0.7124 0.3357 0.5816 0.2382 0.7882 0.2609 0.5477 0.3249 0.3538 0.5511 0.8178 0.8024 0.0895 0.7167 0.6968 0.1649 0.6775 0.5777 0.1222 0.5974 0.6118 0.6498 0.6311 0.8048 0.5572 0.7787 0.6294 0.6185 0.4099 0.4144 0.6333 0.5592 0.272 0.4201 0.0791 0.5268 0.6502 0.2947 0.0053 0.6165 0.2979 0.459 0.0001 0.8131 0.3457 0.6273 0.7118] 2022-08-23 08:56:47 [INFO] [EVAL] Class Recall: [0.8452 0.9001 0.9636 0.8703 0.8697 0.875 0.8909 0.9133 0.6988 0.7699 0.6392 0.7014 0.8716 0.454 0.4494 0.5953 0.6456 0.5376 0.7825 0.57 0.8699 0.6975 0.7876 0.7141 0.478 0.4225 0.6719 0.5538 0.5603 0.4431 0.414 0.697 0.3907 0.5578 0.4925 0.5577 0.5698 0.6551 0.3858 0.5146 0.3073 0.1378 0.4509 0.3155 0.5513 0.2399 0.4034 0.6717 0.7751 0.781 0.6647 0.6059 0.3433 0.3219 0.9583 0.6186 0.9231 0.4918 0.5968 0.3565 0.096 0.5108 0.4735 0.28 0.6968 0.8356 0.5319 0.6105 0.174 0.4185 0.6503 0.7163 0.5599 0.381 0.5528 0.4742 0.7616 0.3751 0.3134 0.2522 0.8315 0.5332 0.4572 0.0479 0.294 0.7023 0.0945 0.1056 0.3117 0.6038 0.6239 0.182 0.3304 0.1319 0.0014 0.0157 0.0334 0.2199 0.478 0.5018 0.1376 0.0427 0.3549 0.5035 0.009 0.9126 0.0884 0.5772 0.1245 0.301 0.1905 0.5967 0.1611 0.5968 0.9908 0.0166 0.3475 0.818 0.3208 0.2032 0.7029 0.0113 0.2359 0.1357 0.2903 0.3359 0.4913 0.6133 0.3047 0.4255 0.8498 0.0754 0.5174 0.3714 0.2419 0.1818 0.1929 0.0714 0.2923 0.4603 0.6785 0.0095 0.36 0.0131 0.3216 0.0001 0.4142 0.0328 0.2217 0.1551] 2022-08-23 08:56:47 [INFO] [EVAL] The model with the best validation mIoU (0.3544) was saved at iter 129000. 
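Every evaluation in this stretch ends with the same reminder: the intermediate mIoU values fluctuate around 0.346-0.352, so the best validation mIoU remains the 0.3544 obtained at iter 129000, and that is the checkpoint that stays saved. A rough sketch of that keep-best bookkeeping, with illustrative names rather than PaddleSeg's actual implementation:

# Illustrative sketch only -- not PaddleSeg's actual train loop.
best_miou, best_iter = 0.0, -1

def on_evaluate(cur_iter, miou):
    """Track the best validation mIoU seen so far and report it after every evaluation."""
    global best_miou, best_iter
    if miou > best_miou:
        best_miou, best_iter = miou, cur_iter
        # a real loop would also write the checkpoint here (e.g. under save_dir/best_model)
    print(f"The model with the best validation mIoU ({best_miou:.4f}) was saved at iter {best_iter}.")

# Replaying a few of the evaluations above reproduces the logged message:
for it, m in [(129000, 0.3544), (136000, 0.3490), (138000, 0.3496)]:
    on_evaluate(it, m)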
2022-08-23 08:56:57 [INFO] [TRAIN] epoch: 110, iter: 138050/160000, loss: 0.4662, lr: 0.000166, batch_cost: 0.1931, reader_cost: 0.00324, ips: 41.4339 samples/sec | ETA 01:10:38 2022-08-23 08:57:07 [INFO] [TRAIN] epoch: 110, iter: 138100/160000, loss: 0.4990, lr: 0.000166, batch_cost: 0.2007, reader_cost: 0.00104, ips: 39.8509 samples/sec | ETA 01:13:16 2022-08-23 08:57:17 [INFO] [TRAIN] epoch: 110, iter: 138150/160000, loss: 0.4848, lr: 0.000165, batch_cost: 0.2039, reader_cost: 0.00048, ips: 39.2410 samples/sec | ETA 01:14:14 2022-08-23 08:57:26 [INFO] [TRAIN] epoch: 110, iter: 138200/160000, loss: 0.4806, lr: 0.000165, batch_cost: 0.1866, reader_cost: 0.00048, ips: 42.8732 samples/sec | ETA 01:07:47 2022-08-23 08:57:37 [INFO] [TRAIN] epoch: 110, iter: 138250/160000, loss: 0.4762, lr: 0.000165, batch_cost: 0.2102, reader_cost: 0.00220, ips: 38.0550 samples/sec | ETA 01:16:12 2022-08-23 08:57:46 [INFO] [TRAIN] epoch: 110, iter: 138300/160000, loss: 0.5048, lr: 0.000164, batch_cost: 0.1907, reader_cost: 0.00076, ips: 41.9500 samples/sec | ETA 01:08:58 2022-08-23 08:57:56 [INFO] [TRAIN] epoch: 110, iter: 138350/160000, loss: 0.4912, lr: 0.000164, batch_cost: 0.2002, reader_cost: 0.00071, ips: 39.9650 samples/sec | ETA 01:12:13 2022-08-23 08:58:06 [INFO] [TRAIN] epoch: 110, iter: 138400/160000, loss: 0.4785, lr: 0.000164, batch_cost: 0.1898, reader_cost: 0.00082, ips: 42.1446 samples/sec | ETA 01:08:20 2022-08-23 08:58:15 [INFO] [TRAIN] epoch: 110, iter: 138450/160000, loss: 0.4782, lr: 0.000163, batch_cost: 0.1851, reader_cost: 0.00069, ips: 43.2263 samples/sec | ETA 01:06:28 2022-08-23 08:58:25 [INFO] [TRAIN] epoch: 110, iter: 138500/160000, loss: 0.4959, lr: 0.000163, batch_cost: 0.2017, reader_cost: 0.00050, ips: 39.6699 samples/sec | ETA 01:12:15 2022-08-23 08:58:35 [INFO] [TRAIN] epoch: 110, iter: 138550/160000, loss: 0.5018, lr: 0.000162, batch_cost: 0.1909, reader_cost: 0.00094, ips: 41.8965 samples/sec | ETA 01:08:15 2022-08-23 08:58:44 [INFO] [TRAIN] epoch: 110, iter: 138600/160000, loss: 0.5235, lr: 0.000162, batch_cost: 0.1850, reader_cost: 0.00066, ips: 43.2362 samples/sec | ETA 01:05:59 2022-08-23 08:58:53 [INFO] [TRAIN] epoch: 110, iter: 138650/160000, loss: 0.5080, lr: 0.000162, batch_cost: 0.1849, reader_cost: 0.00054, ips: 43.2566 samples/sec | ETA 01:05:48 2022-08-23 08:59:02 [INFO] [TRAIN] epoch: 110, iter: 138700/160000, loss: 0.4674, lr: 0.000161, batch_cost: 0.1794, reader_cost: 0.00076, ips: 44.6045 samples/sec | ETA 01:03:40 2022-08-23 08:59:12 [INFO] [TRAIN] epoch: 110, iter: 138750/160000, loss: 0.4928, lr: 0.000161, batch_cost: 0.2012, reader_cost: 0.00068, ips: 39.7705 samples/sec | ETA 01:11:14 2022-08-23 08:59:23 [INFO] [TRAIN] epoch: 110, iter: 138800/160000, loss: 0.5181, lr: 0.000161, batch_cost: 0.2034, reader_cost: 0.00132, ips: 39.3333 samples/sec | ETA 01:11:51 2022-08-23 08:59:32 [INFO] [TRAIN] epoch: 110, iter: 138850/160000, loss: 0.5051, lr: 0.000160, batch_cost: 0.1937, reader_cost: 0.00050, ips: 41.3106 samples/sec | ETA 01:08:15 2022-08-23 08:59:41 [INFO] [TRAIN] epoch: 110, iter: 138900/160000, loss: 0.4911, lr: 0.000160, batch_cost: 0.1690, reader_cost: 0.00077, ips: 47.3490 samples/sec | ETA 00:59:25 2022-08-23 08:59:53 [INFO] [TRAIN] epoch: 111, iter: 138950/160000, loss: 0.4945, lr: 0.000159, batch_cost: 0.2525, reader_cost: 0.06308, ips: 31.6791 samples/sec | ETA 01:28:35 2022-08-23 09:00:04 [INFO] [TRAIN] epoch: 111, iter: 139000/160000, loss: 0.4531, lr: 0.000159, batch_cost: 0.2055, reader_cost: 0.00071, ips: 38.9314 samples/sec | ETA 
01:11:55 2022-08-23 09:00:04 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 165s - batch_cost: 0.1645 - reader cost: 8.5337e-04 2022-08-23 09:02:48 [INFO] [EVAL] #Images: 2000 mIoU: 0.3518 Acc: 0.7702 Kappa: 0.7526 Dice: 0.4839 2022-08-23 09:02:48 [INFO] [EVAL] Class IoU: [0.6945 0.7767 0.9324 0.7329 0.683 0.7693 0.7864 0.7851 0.5306 0.6008 0.497 0.5559 0.704 0.3067 0.3233 0.4315 0.5427 0.4357 0.6286 0.4207 0.7482 0.5363 0.635 0.4969 0.3162 0.276 0.4979 0.467 0.4311 0.2703 0.2748 0.5049 0.2695 0.3613 0.3178 0.3892 0.4319 0.5386 0.2662 0.4192 0.2237 0.1031 0.3579 0.2581 0.2773 0.1898 0.2885 0.5121 0.6296 0.5082 0.5615 0.4124 0.209 0.2187 0.6832 0.4631 0.8707 0.3994 0.4325 0.2754 0.0417 0.4144 0.3291 0.1766 0.4671 0.6912 0.3059 0.3982 0.1153 0.3341 0.4914 0.5429 0.4011 0.1892 0.4533 0.3148 0.5885 0.2489 0.2712 0.1472 0.5697 0.4096 0.3889 0.0375 0.3282 0.5743 0.0972 0.0898 0.2409 0.4982 0.4485 0.0459 0.1947 0.0951 0.0003 0.0191 0.0263 0.1497 0.2917 0.3997 0.1219 0.0402 0.2696 0.3228 0.0144 0.6454 0.0525 0.4993 0.093 0.2835 0.1203 0.3373 0.1645 0.4985 0.8515 0.0124 0.3465 0.6415 0.145 0.2412 0.4554 0.0108 0.2126 0.1746 0.249 0.2686 0.4394 0.4161 0.2353 0.3392 0.562 0.0828 0.3317 0.2723 0.2199 0.1185 0.1544 0.0368 0.2102 0.3608 0.3372 0.0056 0.344 0.0305 0.2037 0. 0.3489 0.0313 0.1858 0.1486] 2022-08-23 09:02:48 [INFO] [EVAL] Class Precision: [0.7902 0.8489 0.9632 0.8258 0.7694 0.8701 0.8712 0.8438 0.6741 0.768 0.6866 0.7338 0.7782 0.5069 0.5369 0.6018 0.7377 0.6903 0.7826 0.6172 0.8386 0.6683 0.7593 0.6225 0.5137 0.5257 0.7563 0.7505 0.637 0.3518 0.4658 0.6774 0.4915 0.5298 0.5166 0.5361 0.641 0.7793 0.4518 0.6451 0.447 0.2514 0.6083 0.5451 0.3929 0.4182 0.4267 0.6838 0.7043 0.5852 0.7787 0.5861 0.3449 0.5358 0.7036 0.682 0.9271 0.7085 0.7049 0.4574 0.0737 0.5724 0.4723 0.7138 0.5865 0.7822 0.4293 0.5326 0.2223 0.5846 0.6779 0.7019 0.6136 0.304 0.6918 0.5021 0.7677 0.4852 0.7548 0.317 0.6343 0.6493 0.793 0.1022 0.4845 0.7348 0.414 0.4002 0.4351 0.7772 0.646 0.0573 0.3309 0.3711 0.0012 0.0531 0.6162 0.3913 0.4431 0.6485 0.3697 0.1538 0.5972 0.758 0.5986 0.7204 0.1667 0.751 0.2145 0.6039 0.3052 0.4531 0.47 0.8133 0.8581 0.0877 0.6448 0.764 0.214 0.6581 0.5864 0.0943 0.713 0.5688 0.6581 0.6145 0.7486 0.5486 0.5131 0.5392 0.6234 0.3694 0.4437 0.6558 0.5538 0.2186 0.3175 0.0733 0.4344 0.6439 0.4226 0.0082 0.6234 0.2806 0.5267 0. 0.7771 0.3174 0.4818 0.7214] 2022-08-23 09:02:48 [INFO] [EVAL] Class Recall: [0.8515 0.9013 0.9669 0.8669 0.8588 0.8691 0.8898 0.9186 0.7138 0.734 0.6427 0.6963 0.8807 0.437 0.4484 0.6039 0.6725 0.5416 0.7615 0.5692 0.8741 0.7308 0.7951 0.7113 0.4513 0.3676 0.593 0.5529 0.5715 0.5385 0.4013 0.6647 0.3736 0.5319 0.4523 0.5868 0.5697 0.6356 0.3931 0.5448 0.3093 0.1487 0.465 0.3289 0.4852 0.2579 0.4711 0.671 0.8558 0.7943 0.6681 0.5819 0.3465 0.2698 0.9594 0.5905 0.9347 0.478 0.5281 0.409 0.0878 0.6002 0.5205 0.1901 0.6965 0.856 0.5156 0.6122 0.1934 0.438 0.6411 0.7056 0.5367 0.334 0.568 0.4577 0.716 0.3381 0.2974 0.2155 0.8483 0.526 0.4328 0.0558 0.5042 0.7245 0.1127 0.1037 0.3505 0.5812 0.5946 0.1873 0.3212 0.1134 0.0004 0.0291 0.0267 0.1951 0.4605 0.5102 0.154 0.0516 0.3295 0.3599 0.0145 0.8611 0.0711 0.5984 0.1409 0.3483 0.1656 0.5689 0.202 0.563 0.9911 0.0142 0.4283 0.8 0.3102 0.2758 0.6709 0.0121 0.2325 0.2012 0.286 0.323 0.5154 0.6328 0.303 0.4776 0.8508 0.0964 0.5677 0.3176 0.2672 0.2056 0.2311 0.0689 0.2894 0.4507 0.6254 0.0173 0.4343 0.0331 0.2494 0. 
0.3877 0.0335 0.2322 0.1576] 2022-08-23 09:02:48 [INFO] [EVAL] The model with the best validation mIoU (0.3544) was saved at iter 129000. 2022-08-23 09:02:57 [INFO] [TRAIN] epoch: 111, iter: 139050/160000, loss: 0.4713, lr: 0.000159, batch_cost: 0.1784, reader_cost: 0.00399, ips: 44.8357 samples/sec | ETA 01:02:18 2022-08-23 09:03:07 [INFO] [TRAIN] epoch: 111, iter: 139100/160000, loss: 0.5047, lr: 0.000158, batch_cost: 0.1878, reader_cost: 0.00090, ips: 42.5887 samples/sec | ETA 01:05:25 2022-08-23 09:03:16 [INFO] [TRAIN] epoch: 111, iter: 139150/160000, loss: 0.5101, lr: 0.000158, batch_cost: 0.1909, reader_cost: 0.00039, ips: 41.9034 samples/sec | ETA 01:06:20 2022-08-23 09:03:26 [INFO] [TRAIN] epoch: 111, iter: 139200/160000, loss: 0.4872, lr: 0.000157, batch_cost: 0.1972, reader_cost: 0.00137, ips: 40.5709 samples/sec | ETA 01:08:21 2022-08-23 09:03:36 [INFO] [TRAIN] epoch: 111, iter: 139250/160000, loss: 0.4935, lr: 0.000157, batch_cost: 0.1885, reader_cost: 0.00095, ips: 42.4451 samples/sec | ETA 01:05:10 2022-08-23 09:03:45 [INFO] [TRAIN] epoch: 111, iter: 139300/160000, loss: 0.5103, lr: 0.000157, batch_cost: 0.1907, reader_cost: 0.00073, ips: 41.9440 samples/sec | ETA 01:05:48 2022-08-23 09:03:54 [INFO] [TRAIN] epoch: 111, iter: 139350/160000, loss: 0.5054, lr: 0.000156, batch_cost: 0.1819, reader_cost: 0.00061, ips: 43.9854 samples/sec | ETA 01:02:35 2022-08-23 09:04:03 [INFO] [TRAIN] epoch: 111, iter: 139400/160000, loss: 0.5011, lr: 0.000156, batch_cost: 0.1809, reader_cost: 0.00069, ips: 44.2136 samples/sec | ETA 01:02:07 2022-08-23 09:04:14 [INFO] [TRAIN] epoch: 111, iter: 139450/160000, loss: 0.4796, lr: 0.000156, batch_cost: 0.2142, reader_cost: 0.00048, ips: 37.3456 samples/sec | ETA 01:13:22 2022-08-23 09:04:24 [INFO] [TRAIN] epoch: 111, iter: 139500/160000, loss: 0.4953, lr: 0.000155, batch_cost: 0.1924, reader_cost: 0.00074, ips: 41.5817 samples/sec | ETA 01:05:44 2022-08-23 09:04:34 [INFO] [TRAIN] epoch: 111, iter: 139550/160000, loss: 0.4945, lr: 0.000155, batch_cost: 0.2035, reader_cost: 0.00051, ips: 39.3165 samples/sec | ETA 01:09:21 2022-08-23 09:04:44 [INFO] [TRAIN] epoch: 111, iter: 139600/160000, loss: 0.5061, lr: 0.000154, batch_cost: 0.1945, reader_cost: 0.00057, ips: 41.1387 samples/sec | ETA 01:06:07 2022-08-23 09:04:54 [INFO] [TRAIN] epoch: 111, iter: 139650/160000, loss: 0.4779, lr: 0.000154, batch_cost: 0.2112, reader_cost: 0.00067, ips: 37.8770 samples/sec | ETA 01:11:38 2022-08-23 09:05:04 [INFO] [TRAIN] epoch: 111, iter: 139700/160000, loss: 0.4498, lr: 0.000154, batch_cost: 0.1996, reader_cost: 0.00075, ips: 40.0891 samples/sec | ETA 01:07:30 2022-08-23 09:05:14 [INFO] [TRAIN] epoch: 111, iter: 139750/160000, loss: 0.4967, lr: 0.000153, batch_cost: 0.2081, reader_cost: 0.00060, ips: 38.4357 samples/sec | ETA 01:10:14 2022-08-23 09:05:25 [INFO] [TRAIN] epoch: 111, iter: 139800/160000, loss: 0.4522, lr: 0.000153, batch_cost: 0.2136, reader_cost: 0.00064, ips: 37.4505 samples/sec | ETA 01:11:55 2022-08-23 09:05:35 [INFO] [TRAIN] epoch: 111, iter: 139850/160000, loss: 0.4849, lr: 0.000153, batch_cost: 0.1918, reader_cost: 0.00078, ips: 41.7181 samples/sec | ETA 01:04:24 2022-08-23 09:05:46 [INFO] [TRAIN] epoch: 111, iter: 139900/160000, loss: 0.5098, lr: 0.000152, batch_cost: 0.2169, reader_cost: 0.00233, ips: 36.8879 samples/sec | ETA 01:12:39 2022-08-23 09:05:56 [INFO] [TRAIN] epoch: 111, iter: 139950/160000, loss: 0.5220, lr: 0.000152, batch_cost: 0.1992, reader_cost: 0.00068, ips: 40.1640 samples/sec | ETA 01:06:33 2022-08-23 09:06:06 [INFO] [TRAIN] 
epoch: 111, iter: 140000/160000, loss: 0.4824, lr: 0.000151, batch_cost: 0.2185, reader_cost: 0.00048, ips: 36.6164 samples/sec | ETA 01:12:49 2022-08-23 09:06:06 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 155s - batch_cost: 0.1552 - reader cost: 7.7885e-04 2022-08-23 09:08:42 [INFO] [EVAL] #Images: 2000 mIoU: 0.3508 Acc: 0.7692 Kappa: 0.7514 Dice: 0.4828 2022-08-23 09:08:42 [INFO] [EVAL] Class IoU: [0.6933 0.7779 0.9324 0.7371 0.6855 0.7645 0.7865 0.7847 0.5345 0.6055 0.488 0.5653 0.7108 0.3075 0.2987 0.4325 0.527 0.4368 0.6345 0.4283 0.7492 0.4661 0.6358 0.4938 0.3275 0.3322 0.4315 0.4414 0.416 0.2669 0.2586 0.4896 0.2789 0.3662 0.3238 0.3907 0.4282 0.529 0.2619 0.4007 0.2339 0.0972 0.3494 0.2611 0.2809 0.2192 0.2743 0.5131 0.5564 0.4868 0.5554 0.4138 0.2218 0.2265 0.6807 0.4208 0.8576 0.4223 0.4714 0.2536 0.0453 0.3743 0.3213 0.2278 0.4688 0.7022 0.3032 0.395 0.1 0.3179 0.5021 0.5339 0.4048 0.198 0.4588 0.3014 0.5841 0.2707 0.2619 0.0989 0.6063 0.4063 0.3692 0.0363 0.3263 0.5697 0.095 0.0995 0.2516 0.4968 0.4426 0.0629 0.2104 0.1073 0.0005 0.0311 0.0243 0.1711 0.2928 0.438 0.109 0.036 0.2901 0.6066 0.0094 0.5833 0.0621 0.5084 0.0902 0.2444 0.1456 0.3247 0.1529 0.5224 0.823 0.015 0.3272 0.6279 0.1026 0.1312 0.4566 0.0162 0.1946 0.1487 0.2511 0.2881 0.4294 0.4278 0.2692 0.3334 0.5718 0.0617 0.2955 0.2523 0.2132 0.1302 0.1678 0.037 0.2223 0.3525 0.3372 0.0088 0.3311 0.0178 0.2596 0.0009 0.3601 0.0269 0.193 0.1296] 2022-08-23 09:08:42 [INFO] [EVAL] Class Precision: [0.7801 0.8592 0.9657 0.8309 0.7651 0.8746 0.886 0.8451 0.6687 0.7594 0.7009 0.723 0.7918 0.4983 0.5577 0.6035 0.7171 0.6799 0.7801 0.6182 0.8447 0.6527 0.7976 0.6197 0.521 0.5139 0.5343 0.7888 0.6424 0.3567 0.504 0.6553 0.4887 0.5245 0.4845 0.5577 0.6503 0.7421 0.398 0.6459 0.4349 0.2436 0.5879 0.5254 0.397 0.4499 0.4638 0.7018 0.7335 0.5599 0.7785 0.5748 0.3887 0.5012 0.701 0.5587 0.9038 0.6728 0.7121 0.4558 0.0806 0.5692 0.4612 0.7068 0.5955 0.8148 0.4542 0.5151 0.2164 0.5456 0.6726 0.6757 0.6323 0.2773 0.7594 0.5243 0.7664 0.5179 0.7511 0.3246 0.6872 0.6658 0.7972 0.0981 0.4863 0.7625 0.3574 0.3742 0.4728 0.7317 0.6402 0.0831 0.3534 0.3908 0.0018 0.0731 0.449 0.3661 0.5353 0.6721 0.4608 0.1011 0.5903 0.8248 0.2209 0.6275 0.2093 0.8146 0.237 0.5876 0.3124 0.4231 0.4594 0.812 0.8295 0.106 0.6825 0.739 0.1478 0.5253 0.5608 0.1091 0.6867 0.6009 0.7107 0.6167 0.7637 0.5838 0.753 0.5828 0.6334 0.2637 0.3685 0.6446 0.5462 0.2599 0.4366 0.0804 0.4869 0.6397 0.428 0.0134 0.6166 0.2953 0.524 0.0013 0.8211 0.4468 0.586 0.7751] 2022-08-23 09:08:42 [INFO] [EVAL] Class Recall: [0.8617 0.8915 0.9644 0.8672 0.8682 0.8586 0.875 0.9165 0.7269 0.7493 0.6163 0.7216 0.8742 0.4455 0.3914 0.6041 0.6654 0.5498 0.7727 0.5822 0.8689 0.6197 0.7581 0.7086 0.4685 0.4844 0.6915 0.5005 0.5413 0.5148 0.347 0.6594 0.3939 0.5482 0.4941 0.5662 0.5563 0.6482 0.4339 0.5136 0.3361 0.1391 0.4627 0.3418 0.4899 0.2995 0.4017 0.6561 0.6975 0.7886 0.6596 0.5963 0.3406 0.2924 0.9593 0.6304 0.9438 0.5314 0.5824 0.3637 0.0939 0.5223 0.5145 0.2516 0.6878 0.8355 0.477 0.6288 0.1568 0.4324 0.6645 0.7177 0.5294 0.4091 0.5368 0.4147 0.7107 0.3619 0.2868 0.1245 0.8374 0.5104 0.4075 0.0546 0.4979 0.6926 0.1146 0.1194 0.3497 0.6075 0.5891 0.2055 0.342 0.1288 0.0007 0.0513 0.0251 0.2431 0.3926 0.557 0.1249 0.053 0.3632 0.6963 0.0097 0.8923 0.0811 0.575 0.1272 0.295 0.2142 0.5825 0.1864 0.5943 0.9906 0.0171 0.386 0.8067 0.2512 0.1489 0.7109 0.0186 0.2135 0.165 0.2796 0.351 0.4952 0.6156 0.2952 0.4379 0.8547 0.0746 0.5986 0.2931 0.2591 0.2068 
0.2142 0.0641 0.2903 0.4398 0.6137 0.0254 0.417 0.0186 0.3397 0.0033 0.3908 0.0278 0.2234 0.1346] 2022-08-23 09:08:42 [INFO] [EVAL] The model with the best validation mIoU (0.3544) was saved at iter 129000. 2022-08-23 09:08:51 [INFO] [TRAIN] epoch: 111, iter: 140050/160000, loss: 0.4821, lr: 0.000151, batch_cost: 0.1715, reader_cost: 0.00332, ips: 46.6337 samples/sec | ETA 00:57:02 2022-08-23 09:09:01 [INFO] [TRAIN] epoch: 111, iter: 140100/160000, loss: 0.4520, lr: 0.000151, batch_cost: 0.2081, reader_cost: 0.00147, ips: 38.4365 samples/sec | ETA 01:09:01 2022-08-23 09:09:10 [INFO] [TRAIN] epoch: 111, iter: 140150/160000, loss: 0.4675, lr: 0.000150, batch_cost: 0.1866, reader_cost: 0.00052, ips: 42.8802 samples/sec | ETA 01:01:43 2022-08-23 09:09:21 [INFO] [TRAIN] epoch: 112, iter: 140200/160000, loss: 0.4801, lr: 0.000150, batch_cost: 0.2172, reader_cost: 0.05538, ips: 36.8271 samples/sec | ETA 01:11:41 2022-08-23 09:09:30 [INFO] [TRAIN] epoch: 112, iter: 140250/160000, loss: 0.4706, lr: 0.000150, batch_cost: 0.1646, reader_cost: 0.00080, ips: 48.5885 samples/sec | ETA 00:54:11 2022-08-23 09:09:39 [INFO] [TRAIN] epoch: 112, iter: 140300/160000, loss: 0.4738, lr: 0.000149, batch_cost: 0.1906, reader_cost: 0.00063, ips: 41.9746 samples/sec | ETA 01:02:34 2022-08-23 09:09:48 [INFO] [TRAIN] epoch: 112, iter: 140350/160000, loss: 0.5276, lr: 0.000149, batch_cost: 0.1832, reader_cost: 0.00095, ips: 43.6716 samples/sec | ETA 00:59:59 2022-08-23 09:09:58 [INFO] [TRAIN] epoch: 112, iter: 140400/160000, loss: 0.4753, lr: 0.000148, batch_cost: 0.1977, reader_cost: 0.00033, ips: 40.4718 samples/sec | ETA 01:04:34 2022-08-23 09:10:08 [INFO] [TRAIN] epoch: 112, iter: 140450/160000, loss: 0.4830, lr: 0.000148, batch_cost: 0.1966, reader_cost: 0.00064, ips: 40.6955 samples/sec | ETA 01:04:03 2022-08-23 09:10:17 [INFO] [TRAIN] epoch: 112, iter: 140500/160000, loss: 0.4620, lr: 0.000148, batch_cost: 0.1868, reader_cost: 0.00067, ips: 42.8212 samples/sec | ETA 01:00:43 2022-08-23 09:10:25 [INFO] [TRAIN] epoch: 112, iter: 140550/160000, loss: 0.4784, lr: 0.000147, batch_cost: 0.1636, reader_cost: 0.00031, ips: 48.9146 samples/sec | ETA 00:53:01 2022-08-23 09:10:35 [INFO] [TRAIN] epoch: 112, iter: 140600/160000, loss: 0.4471, lr: 0.000147, batch_cost: 0.1916, reader_cost: 0.00241, ips: 41.7458 samples/sec | ETA 01:01:57 2022-08-23 09:10:44 [INFO] [TRAIN] epoch: 112, iter: 140650/160000, loss: 0.4863, lr: 0.000147, batch_cost: 0.1838, reader_cost: 0.00050, ips: 43.5369 samples/sec | ETA 00:59:15 2022-08-23 09:10:52 [INFO] [TRAIN] epoch: 112, iter: 140700/160000, loss: 0.4768, lr: 0.000146, batch_cost: 0.1643, reader_cost: 0.00131, ips: 48.6874 samples/sec | ETA 00:52:51 2022-08-23 09:11:02 [INFO] [TRAIN] epoch: 112, iter: 140750/160000, loss: 0.4880, lr: 0.000146, batch_cost: 0.1948, reader_cost: 0.00129, ips: 41.0780 samples/sec | ETA 01:02:28 2022-08-23 09:11:11 [INFO] [TRAIN] epoch: 112, iter: 140800/160000, loss: 0.4764, lr: 0.000145, batch_cost: 0.1727, reader_cost: 0.00045, ips: 46.3205 samples/sec | ETA 00:55:16 2022-08-23 09:11:21 [INFO] [TRAIN] epoch: 112, iter: 140850/160000, loss: 0.4687, lr: 0.000145, batch_cost: 0.2028, reader_cost: 0.00215, ips: 39.4483 samples/sec | ETA 01:04:43 2022-08-23 09:11:31 [INFO] [TRAIN] epoch: 112, iter: 140900/160000, loss: 0.4767, lr: 0.000145, batch_cost: 0.1990, reader_cost: 0.00037, ips: 40.2099 samples/sec | ETA 01:03:20 2022-08-23 09:11:42 [INFO] [TRAIN] epoch: 112, iter: 140950/160000, loss: 0.4866, lr: 0.000144, batch_cost: 0.2309, reader_cost: 0.00071, ips: 
34.6412 samples/sec | ETA 01:13:19 2022-08-23 09:11:53 [INFO] [TRAIN] epoch: 112, iter: 141000/160000, loss: 0.4796, lr: 0.000144, batch_cost: 0.2134, reader_cost: 0.00059, ips: 37.4916 samples/sec | ETA 01:07:34 2022-08-23 09:11:53 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 175s - batch_cost: 0.1745 - reader cost: 6.4025e-04 2022-08-23 09:14:48 [INFO] [EVAL] #Images: 2000 mIoU: 0.3549 Acc: 0.7714 Kappa: 0.7537 Dice: 0.4876 2022-08-23 09:14:48 [INFO] [EVAL] Class IoU: [0.6957 0.7806 0.9309 0.7356 0.6821 0.7722 0.7842 0.788 0.528 0.6022 0.4924 0.5649 0.7071 0.2913 0.3263 0.4374 0.5307 0.4428 0.6228 0.4221 0.7402 0.5001 0.6342 0.4931 0.3206 0.3505 0.483 0.4522 0.3902 0.2659 0.2817 0.4781 0.271 0.3716 0.2985 0.3856 0.434 0.5416 0.2748 0.3862 0.219 0.0893 0.3503 0.2631 0.2896 0.1759 0.3254 0.5247 0.5173 0.5617 0.5733 0.4684 0.2196 0.2242 0.692 0.4613 0.8415 0.4456 0.5114 0.2451 0.0412 0.3429 0.3401 0.256 0.4796 0.7006 0.3189 0.4033 0.1233 0.3191 0.4994 0.5356 0.4013 0.2008 0.4651 0.3217 0.5779 0.2757 0.2626 0.1803 0.614 0.4071 0.361 0.027 0.3644 0.5662 0.1157 0.0977 0.2413 0.5172 0.4294 0.0652 0.2306 0.1133 0.003 0.0151 0.0307 0.1257 0.2925 0.4095 0.0888 0.0648 0.2853 0.6405 0.0202 0.6038 0.0577 0.4951 0.0873 0.2663 0.1481 0.3605 0.1427 0.4877 0.8031 0.0172 0.3173 0.6367 0.1239 0.1594 0.4347 0.0155 0.1955 0.1553 0.2526 0.2993 0.4474 0.4211 0.2674 0.3422 0.5352 0.0695 0.3291 0.3174 0.1943 0.128 0.1675 0.0391 0.2313 0.3519 0.32 0.0046 0.3302 0.0221 0.2362 0.0003 0.3605 0.0259 0.2072 0.1608] 2022-08-23 09:14:48 [INFO] [EVAL] Class Precision: [0.7853 0.8522 0.9621 0.8242 0.7603 0.8649 0.8746 0.8503 0.6911 0.7676 0.7198 0.7214 0.7873 0.4996 0.535 0.6081 0.7156 0.6898 0.756 0.6272 0.8228 0.6696 0.7713 0.6453 0.5225 0.5796 0.6347 0.7908 0.6677 0.3605 0.4393 0.6276 0.499 0.5231 0.4631 0.5483 0.6258 0.7335 0.4733 0.6811 0.4135 0.2583 0.6149 0.529 0.4171 0.3863 0.5065 0.7063 0.7259 0.6649 0.7735 0.698 0.3876 0.5251 0.7141 0.6279 0.8899 0.6503 0.6962 0.4274 0.0747 0.5414 0.5039 0.7038 0.6087 0.8131 0.4444 0.5729 0.285 0.58 0.7289 0.6784 0.632 0.292 0.737 0.4834 0.6991 0.5203 0.6408 0.4377 0.7068 0.6421 0.8158 0.07 0.5146 0.7437 0.3514 0.3932 0.5117 0.7455 0.601 0.093 0.4184 0.3938 0.0101 0.0413 0.5967 0.3666 0.4614 0.6596 0.3861 0.1749 0.5723 0.8011 0.7114 0.6598 0.21 0.852 0.2597 0.5901 0.2995 0.4911 0.4726 0.7736 0.8079 0.1284 0.5849 0.7646 0.1797 0.5212 0.5796 0.193 0.6892 0.6036 0.6745 0.6299 0.8053 0.5586 0.548 0.5687 0.578 0.3876 0.443 0.5018 0.5952 0.2887 0.371 0.0835 0.4511 0.6439 0.3858 0.0069 0.6545 0.31 0.5008 0.0006 0.8404 0.383 0.5094 0.6569] 2022-08-23 09:14:48 [INFO] [EVAL] Class Recall: [0.8591 0.9027 0.9664 0.8725 0.869 0.8782 0.8835 0.9149 0.6912 0.7364 0.6091 0.7225 0.8742 0.4112 0.4556 0.6091 0.6725 0.5529 0.7795 0.5635 0.8805 0.6639 0.7811 0.6764 0.4535 0.4699 0.669 0.5136 0.4842 0.5032 0.4399 0.6675 0.3723 0.5619 0.4564 0.565 0.5861 0.6744 0.3958 0.4714 0.3176 0.1201 0.4488 0.3437 0.4863 0.2441 0.4764 0.6711 0.6429 0.7834 0.6889 0.5875 0.3362 0.2812 0.9572 0.6348 0.9392 0.5861 0.6584 0.365 0.0841 0.4833 0.5113 0.287 0.6934 0.8351 0.5303 0.5768 0.1785 0.415 0.6133 0.7178 0.5236 0.3914 0.5577 0.4904 0.7692 0.3697 0.3079 0.2347 0.8238 0.5266 0.3931 0.042 0.5552 0.7035 0.1472 0.115 0.3135 0.628 0.6007 0.1789 0.3393 0.1373 0.0042 0.0232 0.0313 0.1606 0.444 0.5193 0.1033 0.0934 0.3627 0.7617 0.0204 0.8768 0.0736 0.5417 0.1163 0.3268 0.2265 0.5755 0.1697 0.5688 0.9927 0.0195 0.4095 0.792 0.2854 0.1867 0.6349 0.0166 0.2144 0.1729 0.2876 0.3631 0.5017 
0.631 0.3431 0.4622 0.8785 0.078 0.5614 0.4636 0.2238 0.187 0.2339 0.0685 0.3218 0.4369 0.6522 0.0134 0.3999 0.0233 0.3089 0.0006 0.387 0.027 0.2589 0.1756] 2022-08-23 09:14:48 [INFO] [EVAL] The model with the best validation mIoU (0.3549) was saved at iter 141000. 2022-08-23 09:14:58 [INFO] [TRAIN] epoch: 112, iter: 141050/160000, loss: 0.4820, lr: 0.000143, batch_cost: 0.1952, reader_cost: 0.00363, ips: 40.9835 samples/sec | ETA 01:01:39 2022-08-23 09:15:08 [INFO] [TRAIN] epoch: 112, iter: 141100/160000, loss: 0.4839, lr: 0.000143, batch_cost: 0.2111, reader_cost: 0.00222, ips: 37.8881 samples/sec | ETA 01:06:30 2022-08-23 09:15:19 [INFO] [TRAIN] epoch: 112, iter: 141150/160000, loss: 0.5022, lr: 0.000143, batch_cost: 0.2059, reader_cost: 0.00060, ips: 38.8618 samples/sec | ETA 01:04:40 2022-08-23 09:15:28 [INFO] [TRAIN] epoch: 112, iter: 141200/160000, loss: 0.4600, lr: 0.000142, batch_cost: 0.1912, reader_cost: 0.00072, ips: 41.8386 samples/sec | ETA 00:59:54 2022-08-23 09:15:37 [INFO] [TRAIN] epoch: 112, iter: 141250/160000, loss: 0.4756, lr: 0.000142, batch_cost: 0.1830, reader_cost: 0.00051, ips: 43.7250 samples/sec | ETA 00:57:10 2022-08-23 09:15:47 [INFO] [TRAIN] epoch: 112, iter: 141300/160000, loss: 0.4868, lr: 0.000142, batch_cost: 0.1836, reader_cost: 0.00061, ips: 43.5750 samples/sec | ETA 00:57:13 2022-08-23 09:15:55 [INFO] [TRAIN] epoch: 112, iter: 141350/160000, loss: 0.4804, lr: 0.000141, batch_cost: 0.1614, reader_cost: 0.00030, ips: 49.5676 samples/sec | ETA 00:50:10 2022-08-23 09:16:03 [INFO] [TRAIN] epoch: 112, iter: 141400/160000, loss: 0.4869, lr: 0.000141, batch_cost: 0.1613, reader_cost: 0.00030, ips: 49.5974 samples/sec | ETA 00:50:00 2022-08-23 09:16:12 [INFO] [TRAIN] epoch: 112, iter: 141450/160000, loss: 0.4805, lr: 0.000140, batch_cost: 0.1760, reader_cost: 0.00057, ips: 45.4512 samples/sec | ETA 00:54:25 2022-08-23 09:16:23 [INFO] [TRAIN] epoch: 113, iter: 141500/160000, loss: 0.4874, lr: 0.000140, batch_cost: 0.2263, reader_cost: 0.04789, ips: 35.3436 samples/sec | ETA 01:09:47 2022-08-23 09:16:31 [INFO] [TRAIN] epoch: 113, iter: 141550/160000, loss: 0.4738, lr: 0.000140, batch_cost: 0.1721, reader_cost: 0.00052, ips: 46.4822 samples/sec | ETA 00:52:55 2022-08-23 09:16:41 [INFO] [TRAIN] epoch: 113, iter: 141600/160000, loss: 0.5064, lr: 0.000139, batch_cost: 0.1907, reader_cost: 0.00091, ips: 41.9613 samples/sec | ETA 00:58:27 2022-08-23 09:16:51 [INFO] [TRAIN] epoch: 113, iter: 141650/160000, loss: 0.5162, lr: 0.000139, batch_cost: 0.2014, reader_cost: 0.00042, ips: 39.7171 samples/sec | ETA 01:01:36 2022-08-23 09:16:59 [INFO] [TRAIN] epoch: 113, iter: 141700/160000, loss: 0.4917, lr: 0.000139, batch_cost: 0.1662, reader_cost: 0.00052, ips: 48.1261 samples/sec | ETA 00:50:42 2022-08-23 09:17:08 [INFO] [TRAIN] epoch: 113, iter: 141750/160000, loss: 0.5172, lr: 0.000138, batch_cost: 0.1810, reader_cost: 0.00041, ips: 44.2090 samples/sec | ETA 00:55:02 2022-08-23 09:17:17 [INFO] [TRAIN] epoch: 113, iter: 141800/160000, loss: 0.5105, lr: 0.000138, batch_cost: 0.1700, reader_cost: 0.00074, ips: 47.0497 samples/sec | ETA 00:51:34 2022-08-23 09:17:26 [INFO] [TRAIN] epoch: 113, iter: 141850/160000, loss: 0.4811, lr: 0.000137, batch_cost: 0.1799, reader_cost: 0.00043, ips: 44.4621 samples/sec | ETA 00:54:25 2022-08-23 09:17:35 [INFO] [TRAIN] epoch: 113, iter: 141900/160000, loss: 0.4895, lr: 0.000137, batch_cost: 0.1797, reader_cost: 0.00070, ips: 44.5144 samples/sec | ETA 00:54:12 2022-08-23 09:17:45 [INFO] [TRAIN] epoch: 113, iter: 141950/160000, loss: 0.4745, 
lr: 0.000137, batch_cost: 0.2036, reader_cost: 0.00181, ips: 39.2953 samples/sec | ETA 01:01:14 2022-08-23 09:17:54 [INFO] [TRAIN] epoch: 113, iter: 142000/160000, loss: 0.5049, lr: 0.000136, batch_cost: 0.1742, reader_cost: 0.00188, ips: 45.9358 samples/sec | ETA 00:52:14 2022-08-23 09:17:54 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 160s - batch_cost: 0.1597 - reader cost: 0.0031 2022-08-23 09:20:34 [INFO] [EVAL] #Images: 2000 mIoU: 0.3531 Acc: 0.7699 Kappa: 0.7523 Dice: 0.4868 2022-08-23 09:20:34 [INFO] [EVAL] Class IoU: [0.6952 0.7766 0.9318 0.7364 0.6789 0.7657 0.7851 0.7843 0.5295 0.6037 0.4912 0.5739 0.7042 0.2994 0.3242 0.4373 0.5365 0.4382 0.6285 0.4155 0.7408 0.4764 0.639 0.4931 0.3207 0.3124 0.4512 0.4701 0.4108 0.2584 0.2783 0.4949 0.2616 0.3564 0.3005 0.3945 0.4263 0.5411 0.2637 0.4124 0.2327 0.091 0.3537 0.2632 0.303 0.1996 0.301 0.5122 0.5414 0.5252 0.5856 0.4233 0.2299 0.1947 0.6954 0.4657 0.8541 0.3953 0.4662 0.2366 0.0402 0.3745 0.3377 0.252 0.4515 0.7018 0.3193 0.4018 0.1291 0.3139 0.5229 0.5247 0.4148 0.2003 0.4451 0.3279 0.5532 0.2741 0.2792 0.1418 0.6077 0.4186 0.3611 0.0211 0.3412 0.563 0.1208 0.0938 0.2697 0.5105 0.4241 0.0636 0.2181 0.1009 0.0002 0.031 0.0314 0.1584 0.2995 0.404 0.1317 0.0492 0.2779 0.3561 0.009 0.5413 0.0525 0.5089 0.107 0.2724 0.1542 0.3825 0.1517 0.5096 0.7403 0.0221 0.3586 0.6367 0.1159 0.1667 0.4676 0.0096 0.2065 0.1624 0.2628 0.3005 0.4392 0.4297 0.3182 0.3393 0.5534 0.0846 0.3265 0.3161 0.2041 0.1323 0.1666 0.0348 0.2093 0.3523 0.3486 0.0086 0.3577 0.038 0.2252 0. 0.356 0.0262 0.2227 0.1624] 2022-08-23 09:20:34 [INFO] [EVAL] Class Precision: [0.7926 0.8545 0.9672 0.8315 0.7443 0.8519 0.8829 0.8459 0.6614 0.7625 0.7189 0.7162 0.7739 0.518 0.5556 0.6061 0.7282 0.6996 0.7804 0.6115 0.823 0.6319 0.7817 0.628 0.5274 0.5376 0.5867 0.7289 0.6631 0.3514 0.4291 0.6698 0.4665 0.4558 0.4697 0.569 0.6223 0.8051 0.4584 0.6357 0.3879 0.2415 0.5915 0.5245 0.4081 0.4241 0.4865 0.6971 0.7251 0.6045 0.7719 0.5842 0.4306 0.551 0.7239 0.6315 0.9053 0.7226 0.7553 0.4185 0.0782 0.5194 0.5336 0.6909 0.5404 0.8137 0.4658 0.5452 0.3759 0.5374 0.6955 0.6704 0.6311 0.2779 0.7554 0.4839 0.6544 0.5396 0.697 0.4497 0.6928 0.6637 0.8158 0.068 0.5056 0.7108 0.3229 0.3976 0.5981 0.7747 0.5781 0.0839 0.3672 0.3578 0.0006 0.0781 0.3313 0.3854 0.4636 0.6694 0.4228 0.1788 0.6115 0.7161 0.3896 0.6095 0.1915 0.8148 0.261 0.5775 0.3404 0.5018 0.4348 0.7732 0.7438 0.1356 0.636 0.7662 0.1582 0.5221 0.5975 0.1946 0.6515 0.5951 0.6683 0.648 0.7734 0.6014 0.7452 0.567 0.6073 0.3499 0.4341 0.6114 0.5396 0.2949 0.322 0.0795 0.5065 0.6508 0.4218 0.013 0.6409 0.3162 0.5108 0. 
0.8807 0.3796 0.5558 0.5599] 2022-08-23 09:20:34 [INFO] [EVAL] Class Recall: [0.8498 0.895 0.9622 0.8656 0.8855 0.8833 0.8763 0.915 0.7265 0.7434 0.608 0.7428 0.8866 0.415 0.4377 0.6111 0.6708 0.5398 0.7636 0.5645 0.8812 0.6594 0.7778 0.6967 0.4499 0.4272 0.6615 0.5696 0.5191 0.494 0.4421 0.6546 0.3732 0.6205 0.4547 0.5627 0.575 0.6227 0.383 0.5401 0.3678 0.1274 0.4681 0.3457 0.5405 0.2738 0.4412 0.6588 0.6813 0.8001 0.7081 0.6058 0.3304 0.2314 0.9465 0.6395 0.938 0.4659 0.5492 0.3525 0.0763 0.5731 0.4791 0.284 0.7329 0.8361 0.5037 0.6044 0.1643 0.4302 0.6782 0.7072 0.5475 0.4178 0.5201 0.5043 0.7815 0.3578 0.3177 0.1716 0.8319 0.5313 0.3931 0.0297 0.512 0.7302 0.1618 0.1093 0.3294 0.5995 0.6143 0.2083 0.3494 0.1232 0.0003 0.0489 0.0335 0.212 0.4582 0.5047 0.1605 0.0636 0.3375 0.4146 0.0092 0.8287 0.0674 0.5754 0.1534 0.3402 0.2199 0.6166 0.1889 0.5992 0.9936 0.0258 0.4513 0.7902 0.3024 0.1967 0.6826 0.01 0.2321 0.1825 0.3022 0.3591 0.5041 0.6008 0.3571 0.458 0.8618 0.1004 0.5684 0.3955 0.2471 0.1936 0.2567 0.0582 0.2629 0.4344 0.6676 0.0251 0.4474 0.0414 0.2871 0. 0.374 0.0274 0.2709 0.1862] 2022-08-23 09:20:34 [INFO] [EVAL] The model with the best validation mIoU (0.3549) was saved at iter 141000. 2022-08-23 09:20:43 [INFO] [TRAIN] epoch: 113, iter: 142050/160000, loss: 0.4857, lr: 0.000136, batch_cost: 0.1892, reader_cost: 0.00339, ips: 42.2747 samples/sec | ETA 00:56:36 2022-08-23 09:20:53 [INFO] [TRAIN] epoch: 113, iter: 142100/160000, loss: 0.4734, lr: 0.000136, batch_cost: 0.1981, reader_cost: 0.00165, ips: 40.3893 samples/sec | ETA 00:59:05 2022-08-23 09:21:04 [INFO] [TRAIN] epoch: 113, iter: 142150/160000, loss: 0.5085, lr: 0.000135, batch_cost: 0.2063, reader_cost: 0.00060, ips: 38.7807 samples/sec | ETA 01:01:22 2022-08-23 09:21:14 [INFO] [TRAIN] epoch: 113, iter: 142200/160000, loss: 0.4540, lr: 0.000135, batch_cost: 0.2069, reader_cost: 0.00078, ips: 38.6637 samples/sec | ETA 01:01:23 2022-08-23 09:21:24 [INFO] [TRAIN] epoch: 113, iter: 142250/160000, loss: 0.5010, lr: 0.000134, batch_cost: 0.1981, reader_cost: 0.00046, ips: 40.3797 samples/sec | ETA 00:58:36 2022-08-23 09:21:34 [INFO] [TRAIN] epoch: 113, iter: 142300/160000, loss: 0.4716, lr: 0.000134, batch_cost: 0.2020, reader_cost: 0.00156, ips: 39.6027 samples/sec | ETA 00:59:35 2022-08-23 09:21:46 [INFO] [TRAIN] epoch: 113, iter: 142350/160000, loss: 0.4699, lr: 0.000134, batch_cost: 0.2336, reader_cost: 0.00052, ips: 34.2512 samples/sec | ETA 01:08:42 2022-08-23 09:21:56 [INFO] [TRAIN] epoch: 113, iter: 142400/160000, loss: 0.4673, lr: 0.000133, batch_cost: 0.2029, reader_cost: 0.00066, ips: 39.4324 samples/sec | ETA 00:59:30 2022-08-23 09:22:07 [INFO] [TRAIN] epoch: 113, iter: 142450/160000, loss: 0.4760, lr: 0.000133, batch_cost: 0.2132, reader_cost: 0.00044, ips: 37.5199 samples/sec | ETA 01:02:22 2022-08-23 09:22:16 [INFO] [TRAIN] epoch: 113, iter: 142500/160000, loss: 0.5041, lr: 0.000132, batch_cost: 0.1887, reader_cost: 0.00034, ips: 42.3975 samples/sec | ETA 00:55:02 2022-08-23 09:22:25 [INFO] [TRAIN] epoch: 113, iter: 142550/160000, loss: 0.4830, lr: 0.000132, batch_cost: 0.1799, reader_cost: 0.00102, ips: 44.4751 samples/sec | ETA 00:52:18 2022-08-23 09:22:35 [INFO] [TRAIN] epoch: 113, iter: 142600/160000, loss: 0.4836, lr: 0.000132, batch_cost: 0.2051, reader_cost: 0.00035, ips: 39.0025 samples/sec | ETA 00:59:29 2022-08-23 09:22:47 [INFO] [TRAIN] epoch: 113, iter: 142650/160000, loss: 0.4799, lr: 0.000131, batch_cost: 0.2271, reader_cost: 0.00116, ips: 35.2314 samples/sec | ETA 01:05:39 2022-08-23 09:22:56 
[INFO] [TRAIN] epoch: 113, iter: 142700/160000, loss: 0.5155, lr: 0.000131, batch_cost: 0.1944, reader_cost: 0.00062, ips: 41.1597 samples/sec | ETA 00:56:02 2022-08-23 09:23:09 [INFO] [TRAIN] epoch: 114, iter: 142750/160000, loss: 0.4547, lr: 0.000131, batch_cost: 0.2587, reader_cost: 0.09146, ips: 30.9199 samples/sec | ETA 01:14:23 2022-08-23 09:23:18 [INFO] [TRAIN] epoch: 114, iter: 142800/160000, loss: 0.5066, lr: 0.000130, batch_cost: 0.1682, reader_cost: 0.00073, ips: 47.5553 samples/sec | ETA 00:48:13 2022-08-23 09:23:28 [INFO] [TRAIN] epoch: 114, iter: 142850/160000, loss: 0.4698, lr: 0.000130, batch_cost: 0.2017, reader_cost: 0.00044, ips: 39.6531 samples/sec | ETA 00:57:40 2022-08-23 09:23:37 [INFO] [TRAIN] epoch: 114, iter: 142900/160000, loss: 0.4671, lr: 0.000129, batch_cost: 0.1851, reader_cost: 0.00047, ips: 43.2161 samples/sec | ETA 00:52:45 2022-08-23 09:23:47 [INFO] [TRAIN] epoch: 114, iter: 142950/160000, loss: 0.4822, lr: 0.000129, batch_cost: 0.2016, reader_cost: 0.00029, ips: 39.6920 samples/sec | ETA 00:57:16 2022-08-23 09:23:56 [INFO] [TRAIN] epoch: 114, iter: 143000/160000, loss: 0.4902, lr: 0.000129, batch_cost: 0.1717, reader_cost: 0.00054, ips: 46.5968 samples/sec | ETA 00:48:38 2022-08-23 09:23:56 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 163s - batch_cost: 0.1629 - reader cost: 5.8933e-04 2022-08-23 09:26:39 [INFO] [EVAL] #Images: 2000 mIoU: 0.3505 Acc: 0.7693 Kappa: 0.7517 Dice: 0.4830 2022-08-23 09:26:39 [INFO] [EVAL] Class IoU: [0.6925 0.775 0.9327 0.7331 0.687 0.7696 0.7867 0.782 0.534 0.6093 0.4933 0.5651 0.7072 0.299 0.3272 0.4367 0.5089 0.4492 0.6286 0.4189 0.7397 0.5063 0.6362 0.4993 0.3435 0.294 0.464 0.4696 0.4092 0.2744 0.2762 0.4917 0.2647 0.3568 0.294 0.3956 0.4394 0.5335 0.2686 0.408 0.2164 0.0993 0.3523 0.2635 0.2791 0.2005 0.2978 0.5151 0.5878 0.5294 0.567 0.4201 0.2213 0.2251 0.6773 0.4609 0.8618 0.4225 0.4581 0.248 0.0606 0.3456 0.331 0.2454 0.454 0.6925 0.313 0.3932 0.1185 0.312 0.5022 0.5343 0.4024 0.194 0.4563 0.3306 0.511 0.2851 0.2663 0.1326 0.6201 0.4058 0.3783 0.028 0.1087 0.5654 0.0934 0.0987 0.2735 0.5168 0.3964 0.0654 0.2147 0.1046 0.0102 0.0145 0.0212 0.1677 0.2723 0.3963 0.1543 0.074 0.2865 0.4468 0.0081 0.5304 0.0734 0.4995 0.0977 0.2502 0.1312 0.3754 0.157 0.5688 0.7725 0.0108 0.3103 0.6381 0.1115 0.1055 0.4447 0.0098 0.2162 0.1454 0.2725 0.3023 0.4376 0.422 0.3363 0.3459 0.544 0.0668 0.2918 0.2759 0.1913 0.1272 0.159 0.0311 0.2191 0.3649 0.3343 0.0045 0.3401 0.0215 0.2433 0.0055 0.3728 0.0285 0.2163 0.1612] 2022-08-23 09:26:39 [INFO] [EVAL] Class Precision: [0.7903 0.8523 0.9655 0.8299 0.7757 0.8701 0.8754 0.8374 0.6775 0.7587 0.6978 0.7418 0.7872 0.4965 0.546 0.5991 0.7087 0.6645 0.7586 0.6158 0.8231 0.6809 0.7735 0.6355 0.509 0.5089 0.5888 0.7849 0.6615 0.3793 0.4544 0.657 0.4631 0.4976 0.4534 0.5827 0.6239 0.7682 0.4316 0.6325 0.4012 0.2386 0.5992 0.5185 0.3848 0.4383 0.472 0.6915 0.6828 0.6126 0.7722 0.5696 0.3768 0.5227 0.7034 0.614 0.9118 0.6934 0.7476 0.4619 0.1114 0.5393 0.4705 0.7043 0.5457 0.7814 0.435 0.5267 0.2619 0.5315 0.6669 0.6891 0.6189 0.2868 0.6747 0.5024 0.6043 0.5095 0.6732 0.3922 0.705 0.6372 0.8012 0.0779 0.2841 0.7292 0.3471 0.4025 0.6053 0.7114 0.5238 0.0874 0.3547 0.3595 0.0283 0.0368 0.353 0.3694 0.557 0.7042 0.4619 0.1999 0.5937 0.7279 0.2627 0.6056 0.2001 0.7789 0.2341 0.5715 0.312 0.5038 0.4407 0.7653 0.7773 0.0867 0.6637 0.7642 0.1506 0.5599 0.5611 0.1857 0.5781 0.5679 0.6657 0.6526 0.7795 0.5588 0.5958 0.5749 0.5964 0.3781 0.414 0.662 0.6556 0.2983 0.4152 
0.085 0.464 0.6325 0.4006 0.0067 0.6209 0.2993 0.5017 0.0076 0.8555 0.4634 0.5785 0.7451] 2022-08-23 09:26:39 [INFO] [EVAL] Class Recall: [0.8484 0.8952 0.9649 0.8626 0.8573 0.8696 0.8859 0.9219 0.716 0.7558 0.6272 0.7035 0.8745 0.429 0.4496 0.6171 0.6435 0.581 0.7857 0.5671 0.8796 0.6639 0.7819 0.6996 0.5138 0.4105 0.6865 0.5389 0.5176 0.498 0.4132 0.6614 0.3819 0.5578 0.4555 0.5519 0.5976 0.6358 0.4157 0.5347 0.3196 0.1454 0.4609 0.3489 0.5041 0.2699 0.4465 0.6687 0.8085 0.7959 0.6809 0.6155 0.3491 0.2834 0.948 0.649 0.9402 0.5195 0.542 0.3488 0.1175 0.4904 0.5274 0.2736 0.7299 0.8589 0.5275 0.608 0.1779 0.4304 0.6703 0.7041 0.535 0.3747 0.5851 0.4916 0.7681 0.3929 0.3058 0.1669 0.8374 0.5277 0.4175 0.042 0.1497 0.7157 0.1133 0.1156 0.3328 0.6539 0.6198 0.2058 0.3523 0.1285 0.0156 0.0233 0.022 0.2349 0.3476 0.4754 0.1881 0.1051 0.3563 0.5364 0.0083 0.8103 0.1039 0.582 0.1436 0.3079 0.1847 0.5955 0.1961 0.689 0.992 0.0122 0.3682 0.7946 0.3005 0.115 0.6819 0.0102 0.2568 0.1635 0.3157 0.3603 0.4995 0.6329 0.4357 0.4648 0.861 0.0751 0.4973 0.3211 0.2126 0.1815 0.2048 0.0468 0.2932 0.4631 0.6689 0.0134 0.4293 0.0227 0.3208 0.02 0.3978 0.0295 0.2568 0.1706] 2022-08-23 09:26:39 [INFO] [EVAL] The model with the best validation mIoU (0.3549) was saved at iter 141000. 2022-08-23 09:26:50 [INFO] [TRAIN] epoch: 114, iter: 143050/160000, loss: 0.4694, lr: 0.000128, batch_cost: 0.2212, reader_cost: 0.00422, ips: 36.1585 samples/sec | ETA 01:02:30 2022-08-23 09:27:01 [INFO] [TRAIN] epoch: 114, iter: 143100/160000, loss: 0.4828, lr: 0.000128, batch_cost: 0.2132, reader_cost: 0.00121, ips: 37.5210 samples/sec | ETA 01:00:03 2022-08-23 09:27:10 [INFO] [TRAIN] epoch: 114, iter: 143150/160000, loss: 0.4906, lr: 0.000128, batch_cost: 0.1808, reader_cost: 0.00051, ips: 44.2537 samples/sec | ETA 00:50:46 2022-08-23 09:27:20 [INFO] [TRAIN] epoch: 114, iter: 143200/160000, loss: 0.4903, lr: 0.000127, batch_cost: 0.2025, reader_cost: 0.00069, ips: 39.5137 samples/sec | ETA 00:56:41 2022-08-23 09:27:29 [INFO] [TRAIN] epoch: 114, iter: 143250/160000, loss: 0.5127, lr: 0.000127, batch_cost: 0.1909, reader_cost: 0.00094, ips: 41.8970 samples/sec | ETA 00:53:18 2022-08-23 09:27:38 [INFO] [TRAIN] epoch: 114, iter: 143300/160000, loss: 0.4888, lr: 0.000126, batch_cost: 0.1793, reader_cost: 0.00058, ips: 44.6097 samples/sec | ETA 00:49:54 2022-08-23 09:27:48 [INFO] [TRAIN] epoch: 114, iter: 143350/160000, loss: 0.4869, lr: 0.000126, batch_cost: 0.1912, reader_cost: 0.00058, ips: 41.8397 samples/sec | ETA 00:53:03 2022-08-23 09:27:57 [INFO] [TRAIN] epoch: 114, iter: 143400/160000, loss: 0.4756, lr: 0.000126, batch_cost: 0.1784, reader_cost: 0.00075, ips: 44.8487 samples/sec | ETA 00:49:21 2022-08-23 09:28:07 [INFO] [TRAIN] epoch: 114, iter: 143450/160000, loss: 0.4531, lr: 0.000125, batch_cost: 0.2046, reader_cost: 0.00052, ips: 39.0954 samples/sec | ETA 00:56:26 2022-08-23 09:28:18 [INFO] [TRAIN] epoch: 114, iter: 143500/160000, loss: 0.4646, lr: 0.000125, batch_cost: 0.2119, reader_cost: 0.00052, ips: 37.7593 samples/sec | ETA 00:58:15 2022-08-23 09:28:27 [INFO] [TRAIN] epoch: 114, iter: 143550/160000, loss: 0.4950, lr: 0.000125, batch_cost: 0.1806, reader_cost: 0.00078, ips: 44.2932 samples/sec | ETA 00:49:31 2022-08-23 09:28:36 [INFO] [TRAIN] epoch: 114, iter: 143600/160000, loss: 0.5148, lr: 0.000124, batch_cost: 0.1946, reader_cost: 0.00075, ips: 41.1085 samples/sec | ETA 00:53:11 2022-08-23 09:28:47 [INFO] [TRAIN] epoch: 114, iter: 143650/160000, loss: 0.5092, lr: 0.000124, batch_cost: 0.2055, reader_cost: 0.00066, 
ips: 38.9304 samples/sec | ETA 00:55:59 2022-08-23 09:28:57 [INFO] [TRAIN] epoch: 114, iter: 143700/160000, loss: 0.4547, lr: 0.000123, batch_cost: 0.2069, reader_cost: 0.00067, ips: 38.6683 samples/sec | ETA 00:56:12 2022-08-23 09:29:08 [INFO] [TRAIN] epoch: 114, iter: 143750/160000, loss: 0.4822, lr: 0.000123, batch_cost: 0.2146, reader_cost: 0.00035, ips: 37.2859 samples/sec | ETA 00:58:06 2022-08-23 09:29:17 [INFO] [TRAIN] epoch: 114, iter: 143800/160000, loss: 0.4993, lr: 0.000123, batch_cost: 0.1800, reader_cost: 0.00076, ips: 44.4373 samples/sec | ETA 00:48:36 2022-08-23 09:29:26 [INFO] [TRAIN] epoch: 114, iter: 143850/160000, loss: 0.4437, lr: 0.000122, batch_cost: 0.1854, reader_cost: 0.00084, ips: 43.1536 samples/sec | ETA 00:49:53 2022-08-23 09:29:36 [INFO] [TRAIN] epoch: 114, iter: 143900/160000, loss: 0.5189, lr: 0.000122, batch_cost: 0.1995, reader_cost: 0.00080, ips: 40.0939 samples/sec | ETA 00:53:32 2022-08-23 09:29:46 [INFO] [TRAIN] epoch: 114, iter: 143950/160000, loss: 0.4912, lr: 0.000122, batch_cost: 0.1919, reader_cost: 0.00130, ips: 41.6842 samples/sec | ETA 00:51:20 2022-08-23 09:29:58 [INFO] [TRAIN] epoch: 115, iter: 144000/160000, loss: 0.4837, lr: 0.000121, batch_cost: 0.2392, reader_cost: 0.04962, ips: 33.4395 samples/sec | ETA 01:03:47 2022-08-23 09:29:58 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1590 - reader cost: 8.9421e-04 2022-08-23 09:32:37 [INFO] [EVAL] #Images: 2000 mIoU: 0.3513 Acc: 0.7707 Kappa: 0.7531 Dice: 0.4835 2022-08-23 09:32:37 [INFO] [EVAL] Class IoU: [0.6946 0.7781 0.932 0.7341 0.6814 0.7661 0.7846 0.7885 0.5299 0.6127 0.491 0.5601 0.7014 0.2927 0.3157 0.441 0.5109 0.4393 0.6246 0.4213 0.7405 0.5566 0.6381 0.4928 0.3342 0.4306 0.5612 0.4738 0.4097 0.2492 0.2814 0.4825 0.2533 0.3643 0.3093 0.3953 0.433 0.5418 0.278 0.4117 0.2279 0.0898 0.3554 0.2595 0.2986 0.1818 0.3015 0.5181 0.5597 0.533 0.5764 0.4066 0.2148 0.22 0.6715 0.442 0.8641 0.4401 0.4699 0.2395 0.0507 0.4168 0.3466 0.259 0.473 0.6908 0.3246 0.4125 0.0981 0.3103 0.5186 0.5186 0.407 0.1925 0.4589 0.3394 0.5669 0.2578 0.2706 0.156 0.6001 0.4018 0.3854 0.0274 0.0648 0.5519 0.1117 0.0856 0.2628 0.513 0.4407 0.0504 0.2076 0.1262 0.0013 0.0147 0.0267 0.1544 0.2757 0.3897 0.1524 0.0818 0.3084 0.2661 0.0027 0.5715 0.06 0.5095 0.1037 0.221 0.1208 0.3951 0.1558 0.522 0.7709 0.014 0.3163 0.6263 0.1125 0.1774 0.4457 0.0115 0.2135 0.1443 0.2636 0.3067 0.422 0.4217 0.3054 0.3293 0.5455 0.072 0.2983 0.2709 0.2096 0.1284 0.1661 0.0292 0.211 0.3594 0.3215 0.0035 0.3267 0.0247 0.2361 0. 
0.3724 0.0321 0.2239 0.1487] 2022-08-23 09:32:37 [INFO] [EVAL] Class Precision: [0.7873 0.8607 0.9611 0.8309 0.7674 0.8665 0.8759 0.8513 0.6794 0.7539 0.6726 0.7457 0.7756 0.4946 0.5626 0.6148 0.6774 0.6682 0.7676 0.6288 0.8253 0.6973 0.7857 0.6264 0.5202 0.5391 0.7552 0.7539 0.659 0.3652 0.4527 0.6208 0.4699 0.5082 0.4822 0.5535 0.6456 0.7782 0.4642 0.6339 0.3921 0.2227 0.6341 0.5103 0.4132 0.4198 0.4955 0.7062 0.6925 0.622 0.7775 0.5456 0.3541 0.5467 0.7101 0.6243 0.9246 0.6843 0.7806 0.4284 0.0808 0.5781 0.5422 0.6754 0.5919 0.7828 0.4895 0.5496 0.3228 0.5607 0.6717 0.6665 0.6155 0.2947 0.7076 0.5144 0.6805 0.542 0.7441 0.3545 0.6807 0.6541 0.7975 0.0811 0.2034 0.755 0.3274 0.3965 0.5222 0.7444 0.624 0.0621 0.3689 0.3499 0.0038 0.0445 0.6228 0.443 0.5327 0.7075 0.4287 0.2197 0.5703 0.6701 0.3043 0.6248 0.2328 0.8136 0.2479 0.536 0.2964 0.5508 0.4736 0.8154 0.7746 0.0951 0.6625 0.739 0.1473 0.5725 0.518 0.1753 0.6712 0.6097 0.6601 0.6289 0.7905 0.573 0.7069 0.5628 0.5915 0.3041 0.4229 0.6588 0.5889 0.3095 0.4138 0.0799 0.5088 0.6477 0.3777 0.0053 0.6367 0.2568 0.5008 0. 0.8596 0.5015 0.629 0.7244] 2022-08-23 09:32:37 [INFO] [EVAL] Class Recall: [0.855 0.8902 0.9686 0.863 0.8588 0.8687 0.8827 0.9145 0.7066 0.7659 0.6453 0.6923 0.88 0.4177 0.4184 0.6094 0.6751 0.5618 0.7703 0.5608 0.878 0.7339 0.7725 0.6979 0.483 0.6815 0.6861 0.5604 0.52 0.4397 0.4265 0.684 0.3545 0.5625 0.4632 0.5803 0.5681 0.6407 0.4093 0.5401 0.3524 0.1309 0.4471 0.3456 0.5185 0.2428 0.435 0.6604 0.7448 0.7884 0.6903 0.6147 0.3531 0.2691 0.925 0.6022 0.9296 0.5522 0.5415 0.3521 0.1198 0.599 0.4901 0.2959 0.702 0.8545 0.4906 0.6231 0.1236 0.4101 0.6947 0.7002 0.5458 0.3569 0.5663 0.4994 0.7725 0.3296 0.2984 0.2179 0.8352 0.5103 0.4272 0.0397 0.0868 0.6723 0.1449 0.0985 0.346 0.6227 0.6 0.2111 0.322 0.1648 0.0019 0.0216 0.0272 0.1916 0.3638 0.4646 0.1912 0.1153 0.4017 0.3062 0.0027 0.8701 0.0749 0.5768 0.1513 0.2733 0.1694 0.583 0.1884 0.5919 0.9939 0.0161 0.377 0.8042 0.3226 0.2045 0.7614 0.0121 0.2384 0.159 0.305 0.3745 0.4751 0.615 0.3497 0.4425 0.8752 0.0862 0.5031 0.3152 0.2456 0.18 0.2171 0.0441 0.265 0.4467 0.6835 0.0106 0.4015 0.0266 0.3088 0. 0.3965 0.0331 0.258 0.1577] 2022-08-23 09:32:37 [INFO] [EVAL] The model with the best validation mIoU (0.3549) was saved at iter 141000. 
2022-08-23 09:32:47 [INFO] [TRAIN] epoch: 115, iter: 144050/160000, loss: 0.4957, lr: 0.000121, batch_cost: 0.1911, reader_cost: 0.00350, ips: 41.8691 samples/sec | ETA 00:50:47 2022-08-23 09:32:55 [INFO] [TRAIN] epoch: 115, iter: 144100/160000, loss: 0.4838, lr: 0.000120, batch_cost: 0.1591, reader_cost: 0.00096, ips: 50.2756 samples/sec | ETA 00:42:10 2022-08-23 09:33:03 [INFO] [TRAIN] epoch: 115, iter: 144150/160000, loss: 0.5092, lr: 0.000120, batch_cost: 0.1720, reader_cost: 0.00062, ips: 46.5002 samples/sec | ETA 00:45:26 2022-08-23 09:33:13 [INFO] [TRAIN] epoch: 115, iter: 144200/160000, loss: 0.4915, lr: 0.000120, batch_cost: 0.2031, reader_cost: 0.00038, ips: 39.3970 samples/sec | ETA 00:53:28 2022-08-23 09:33:23 [INFO] [TRAIN] epoch: 115, iter: 144250/160000, loss: 0.4851, lr: 0.000119, batch_cost: 0.1964, reader_cost: 0.00048, ips: 40.7231 samples/sec | ETA 00:51:34 2022-08-23 09:33:32 [INFO] [TRAIN] epoch: 115, iter: 144300/160000, loss: 0.4342, lr: 0.000119, batch_cost: 0.1847, reader_cost: 0.00103, ips: 43.3166 samples/sec | ETA 00:48:19 2022-08-23 09:33:43 [INFO] [TRAIN] epoch: 115, iter: 144350/160000, loss: 0.4991, lr: 0.000118, batch_cost: 0.2167, reader_cost: 0.00088, ips: 36.9203 samples/sec | ETA 00:56:31 2022-08-23 09:33:53 [INFO] [TRAIN] epoch: 115, iter: 144400/160000, loss: 0.4569, lr: 0.000118, batch_cost: 0.2041, reader_cost: 0.00060, ips: 39.1898 samples/sec | ETA 00:53:04 2022-08-23 09:34:03 [INFO] [TRAIN] epoch: 115, iter: 144450/160000, loss: 0.5181, lr: 0.000118, batch_cost: 0.1847, reader_cost: 0.00045, ips: 43.3102 samples/sec | ETA 00:47:52 2022-08-23 09:34:13 [INFO] [TRAIN] epoch: 115, iter: 144500/160000, loss: 0.4613, lr: 0.000117, batch_cost: 0.2135, reader_cost: 0.00055, ips: 37.4747 samples/sec | ETA 00:55:08 2022-08-23 09:34:23 [INFO] [TRAIN] epoch: 115, iter: 144550/160000, loss: 0.4622, lr: 0.000117, batch_cost: 0.2034, reader_cost: 0.01040, ips: 39.3270 samples/sec | ETA 00:52:22 2022-08-23 09:34:34 [INFO] [TRAIN] epoch: 115, iter: 144600/160000, loss: 0.4712, lr: 0.000117, batch_cost: 0.2050, reader_cost: 0.00032, ips: 39.0258 samples/sec | ETA 00:52:36 2022-08-23 09:34:44 [INFO] [TRAIN] epoch: 115, iter: 144650/160000, loss: 0.4923, lr: 0.000116, batch_cost: 0.1974, reader_cost: 0.00107, ips: 40.5186 samples/sec | ETA 00:50:30 2022-08-23 09:34:54 [INFO] [TRAIN] epoch: 115, iter: 144700/160000, loss: 0.4770, lr: 0.000116, batch_cost: 0.2051, reader_cost: 0.00065, ips: 39.0148 samples/sec | ETA 00:52:17 2022-08-23 09:35:03 [INFO] [TRAIN] epoch: 115, iter: 144750/160000, loss: 0.5010, lr: 0.000115, batch_cost: 0.1866, reader_cost: 0.00075, ips: 42.8708 samples/sec | ETA 00:47:25 2022-08-23 09:35:12 [INFO] [TRAIN] epoch: 115, iter: 144800/160000, loss: 0.4957, lr: 0.000115, batch_cost: 0.1837, reader_cost: 0.00061, ips: 43.5555 samples/sec | ETA 00:46:31 2022-08-23 09:35:22 [INFO] [TRAIN] epoch: 115, iter: 144850/160000, loss: 0.4623, lr: 0.000115, batch_cost: 0.2005, reader_cost: 0.00050, ips: 39.9087 samples/sec | ETA 00:50:36 2022-08-23 09:35:32 [INFO] [TRAIN] epoch: 115, iter: 144900/160000, loss: 0.4808, lr: 0.000114, batch_cost: 0.2007, reader_cost: 0.00056, ips: 39.8619 samples/sec | ETA 00:50:30 2022-08-23 09:35:42 [INFO] [TRAIN] epoch: 115, iter: 144950/160000, loss: 0.4940, lr: 0.000114, batch_cost: 0.1904, reader_cost: 0.00064, ips: 42.0082 samples/sec | ETA 00:47:46 2022-08-23 09:35:51 [INFO] [TRAIN] epoch: 115, iter: 145000/160000, loss: 0.4818, lr: 0.000114, batch_cost: 0.1755, reader_cost: 0.00555, ips: 45.5906 samples/sec | ETA 
00:43:52 2022-08-23 09:35:51 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 177s - batch_cost: 0.1773 - reader cost: 9.2602e-04 2022-08-23 09:38:48 [INFO] [EVAL] #Images: 2000 mIoU: 0.3530 Acc: 0.7697 Kappa: 0.7520 Dice: 0.4857 2022-08-23 09:38:48 [INFO] [EVAL] Class IoU: [0.6932 0.7725 0.9316 0.7295 0.6777 0.7643 0.788 0.787 0.5298 0.6081 0.4936 0.5661 0.7039 0.2927 0.3235 0.444 0.5164 0.4391 0.6264 0.4174 0.7447 0.5341 0.6322 0.4941 0.3347 0.3193 0.4725 0.4601 0.4163 0.2844 0.2688 0.4922 0.2413 0.364 0.3369 0.3949 0.4334 0.5415 0.2594 0.4123 0.2173 0.0742 0.3536 0.2596 0.2779 0.1807 0.3045 0.5175 0.5864 0.5308 0.5698 0.4454 0.2249 0.2092 0.6857 0.4545 0.8577 0.396 0.4421 0.2152 0.0546 0.3743 0.3458 0.2482 0.467 0.7071 0.317 0.4103 0.131 0.3116 0.5099 0.5262 0.4033 0.2072 0.4578 0.337 0.6206 0.2949 0.2661 0.1846 0.6066 0.4127 0.3727 0.0313 0.1155 0.5565 0.1052 0.1052 0.2549 0.506 0.4485 0.0524 0.216 0.0985 0.0004 0.0132 0.0354 0.144 0.2993 0.4659 0.1453 0.052 0.3067 0.1983 0.0035 0.5764 0.0604 0.5069 0.1097 0.2451 0.1394 0.4266 0.1528 0.5209 0.7836 0.0118 0.3565 0.6307 0.125 0.2185 0.4467 0.019 0.2148 0.1595 0.2558 0.2855 0.4434 0.4148 0.3326 0.3468 0.5644 0.0672 0.3481 0.3187 0.206 0.1222 0.162 0.0358 0.2137 0.3585 0.3228 0.0035 0.3474 0.0455 0.2297 0. 0.3591 0.0413 0.2153 0.1462] 2022-08-23 09:38:48 [INFO] [EVAL] Class Precision: [0.7906 0.8541 0.9658 0.8193 0.7577 0.8412 0.882 0.8502 0.6844 0.7414 0.6965 0.732 0.7784 0.5081 0.5561 0.6214 0.715 0.688 0.7672 0.6233 0.8345 0.7055 0.7504 0.6081 0.4865 0.5074 0.6318 0.7314 0.6696 0.3934 0.4701 0.6422 0.4541 0.4787 0.5025 0.5513 0.6136 0.7966 0.428 0.629 0.4082 0.2234 0.6287 0.5125 0.3903 0.4324 0.4754 0.7077 0.6887 0.6184 0.7917 0.5981 0.4094 0.5028 0.7195 0.5832 0.9088 0.7248 0.7542 0.4077 0.1012 0.5383 0.5588 0.7213 0.5756 0.8175 0.4424 0.5577 0.3407 0.5835 0.6721 0.6933 0.6386 0.2913 0.6871 0.493 0.7782 0.5296 0.6979 0.4708 0.6898 0.6728 0.8018 0.0824 0.2807 0.7288 0.3655 0.3792 0.5359 0.7621 0.6281 0.0665 0.4014 0.3561 0.0016 0.0331 0.3714 0.4104 0.4949 0.6412 0.4099 0.2129 0.594 0.6688 0.3973 0.6401 0.2239 0.7947 0.2687 0.5758 0.2959 0.5969 0.4193 0.7611 0.7889 0.1051 0.6655 0.7473 0.1723 0.5297 0.6224 0.1337 0.7066 0.6251 0.6865 0.6197 0.7815 0.5445 0.6101 0.5999 0.6233 0.3419 0.4545 0.6338 0.5955 0.2576 0.3382 0.0907 0.4688 0.6382 0.3691 0.0051 0.6229 0.2853 0.4592 0. 0.8603 0.3854 0.6318 0.6987] 2022-08-23 09:38:48 [INFO] [EVAL] Class Recall: [0.8492 0.89 0.9633 0.8694 0.8653 0.8932 0.8809 0.9137 0.7011 0.7719 0.6289 0.7141 0.8803 0.4085 0.4362 0.6086 0.6503 0.5484 0.7733 0.5583 0.8738 0.6873 0.8004 0.725 0.5176 0.4628 0.6521 0.5537 0.5239 0.5066 0.3856 0.6782 0.3399 0.603 0.5055 0.5819 0.596 0.6283 0.3971 0.5448 0.3173 0.1 0.4469 0.3446 0.491 0.2369 0.4585 0.6582 0.798 0.7893 0.6703 0.6356 0.3328 0.2638 0.9359 0.6732 0.9385 0.4661 0.5166 0.3131 0.1062 0.5512 0.4756 0.2745 0.7122 0.8396 0.5281 0.6083 0.1754 0.4007 0.6788 0.6859 0.5225 0.4178 0.5784 0.5158 0.754 0.3995 0.3007 0.233 0.8342 0.5163 0.4105 0.0481 0.164 0.7018 0.1288 0.1271 0.3271 0.601 0.6106 0.1985 0.3186 0.1199 0.0005 0.0214 0.0377 0.1815 0.431 0.6302 0.1838 0.0643 0.388 0.2199 0.0035 0.8527 0.0764 0.5833 0.1564 0.2991 0.2087 0.5992 0.1938 0.6227 0.9914 0.0131 0.4343 0.8017 0.3131 0.271 0.6128 0.0216 0.2359 0.1764 0.2897 0.3462 0.5062 0.6353 0.4224 0.4512 0.8566 0.0772 0.598 0.3906 0.2395 0.1887 0.2371 0.0559 0.282 0.4499 0.7202 0.0108 0.4398 0.0513 0.3149 0. 
0.3814 0.0442 0.2462 0.1561] 2022-08-23 09:38:48 [INFO] [EVAL] The model with the best validation mIoU (0.3549) was saved at iter 141000. 2022-08-23 09:38:58 [INFO] [TRAIN] epoch: 115, iter: 145050/160000, loss: 0.4680, lr: 0.000113, batch_cost: 0.1951, reader_cost: 0.00447, ips: 41.0041 samples/sec | ETA 00:48:36 2022-08-23 09:39:08 [INFO] [TRAIN] epoch: 115, iter: 145100/160000, loss: 0.4633, lr: 0.000113, batch_cost: 0.1949, reader_cost: 0.00107, ips: 41.0450 samples/sec | ETA 00:48:24 2022-08-23 09:39:17 [INFO] [TRAIN] epoch: 115, iter: 145150/160000, loss: 0.4748, lr: 0.000112, batch_cost: 0.1820, reader_cost: 0.00046, ips: 43.9637 samples/sec | ETA 00:45:02 2022-08-23 09:39:26 [INFO] [TRAIN] epoch: 115, iter: 145200/160000, loss: 0.4494, lr: 0.000112, batch_cost: 0.1809, reader_cost: 0.00047, ips: 44.2169 samples/sec | ETA 00:44:37 2022-08-23 09:39:39 [INFO] [TRAIN] epoch: 116, iter: 145250/160000, loss: 0.4629, lr: 0.000112, batch_cost: 0.2590, reader_cost: 0.08251, ips: 30.8846 samples/sec | ETA 01:03:40 2022-08-23 09:39:48 [INFO] [TRAIN] epoch: 116, iter: 145300/160000, loss: 0.4899, lr: 0.000111, batch_cost: 0.1825, reader_cost: 0.00043, ips: 43.8433 samples/sec | ETA 00:44:42 2022-08-23 09:39:56 [INFO] [TRAIN] epoch: 116, iter: 145350/160000, loss: 0.5001, lr: 0.000111, batch_cost: 0.1599, reader_cost: 0.00054, ips: 50.0245 samples/sec | ETA 00:39:02 2022-08-23 09:40:05 [INFO] [TRAIN] epoch: 116, iter: 145400/160000, loss: 0.4952, lr: 0.000111, batch_cost: 0.1738, reader_cost: 0.00082, ips: 46.0397 samples/sec | ETA 00:42:16 2022-08-23 09:40:14 [INFO] [TRAIN] epoch: 116, iter: 145450/160000, loss: 0.4810, lr: 0.000110, batch_cost: 0.1802, reader_cost: 0.00042, ips: 44.3872 samples/sec | ETA 00:43:42 2022-08-23 09:40:25 [INFO] [TRAIN] epoch: 116, iter: 145500/160000, loss: 0.4320, lr: 0.000110, batch_cost: 0.2253, reader_cost: 0.00032, ips: 35.5133 samples/sec | ETA 00:54:26 2022-08-23 09:40:36 [INFO] [TRAIN] epoch: 116, iter: 145550/160000, loss: 0.5087, lr: 0.000109, batch_cost: 0.2119, reader_cost: 0.00054, ips: 37.7603 samples/sec | ETA 00:51:01 2022-08-23 09:40:47 [INFO] [TRAIN] epoch: 116, iter: 145600/160000, loss: 0.4693, lr: 0.000109, batch_cost: 0.2242, reader_cost: 0.00054, ips: 35.6797 samples/sec | ETA 00:53:48 2022-08-23 09:40:57 [INFO] [TRAIN] epoch: 116, iter: 145650/160000, loss: 0.4579, lr: 0.000109, batch_cost: 0.2102, reader_cost: 0.00048, ips: 38.0651 samples/sec | ETA 00:50:15 2022-08-23 09:41:07 [INFO] [TRAIN] epoch: 116, iter: 145700/160000, loss: 0.4786, lr: 0.000108, batch_cost: 0.1964, reader_cost: 0.00267, ips: 40.7311 samples/sec | ETA 00:46:48 2022-08-23 09:41:17 [INFO] [TRAIN] epoch: 116, iter: 145750/160000, loss: 0.4885, lr: 0.000108, batch_cost: 0.1963, reader_cost: 0.00152, ips: 40.7544 samples/sec | ETA 00:46:37 2022-08-23 09:41:28 [INFO] [TRAIN] epoch: 116, iter: 145800/160000, loss: 0.4666, lr: 0.000108, batch_cost: 0.2269, reader_cost: 0.00115, ips: 35.2597 samples/sec | ETA 00:53:41 2022-08-23 09:41:38 [INFO] [TRAIN] epoch: 116, iter: 145850/160000, loss: 0.4673, lr: 0.000107, batch_cost: 0.1855, reader_cost: 0.00192, ips: 43.1226 samples/sec | ETA 00:43:45 2022-08-23 09:41:47 [INFO] [TRAIN] epoch: 116, iter: 145900/160000, loss: 0.4680, lr: 0.000107, batch_cost: 0.1907, reader_cost: 0.00043, ips: 41.9571 samples/sec | ETA 00:44:48 2022-08-23 09:41:57 [INFO] [TRAIN] epoch: 116, iter: 145950/160000, loss: 0.4756, lr: 0.000106, batch_cost: 0.1993, reader_cost: 0.00574, ips: 40.1451 samples/sec | ETA 00:46:39 2022-08-23 09:42:09 [INFO] [TRAIN] 
epoch: 116, iter: 146000/160000, loss: 0.4958, lr: 0.000106, batch_cost: 0.2299, reader_cost: 0.00072, ips: 34.7918 samples/sec | ETA 00:53:39 2022-08-23 09:42:09 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 178s - batch_cost: 0.1778 - reader cost: 9.9182e-04 2022-08-23 09:45:07 [INFO] [EVAL] #Images: 2000 mIoU: 0.3522 Acc: 0.7701 Kappa: 0.7526 Dice: 0.4851 2022-08-23 09:45:07 [INFO] [EVAL] Class IoU: [0.696 0.7786 0.9316 0.7304 0.6788 0.7652 0.7888 0.7896 0.5301 0.6104 0.4919 0.5637 0.701 0.3046 0.3268 0.4397 0.5224 0.4397 0.62 0.4211 0.7454 0.5391 0.639 0.4939 0.3257 0.3389 0.4739 0.4692 0.401 0.2874 0.2838 0.4896 0.2731 0.3642 0.3094 0.3958 0.4367 0.5379 0.2621 0.3987 0.2383 0.0921 0.3577 0.254 0.2877 0.2116 0.3009 0.5226 0.6182 0.5575 0.5808 0.409 0.2187 0.2171 0.6789 0.4094 0.8624 0.4249 0.4398 0.2372 0.0549 0.397 0.3286 0.2584 0.463 0.7018 0.3231 0.4018 0.1145 0.3074 0.5152 0.5138 0.3963 0.1947 0.4556 0.3369 0.5794 0.3029 0.2645 0.1363 0.6351 0.4061 0.3721 0.0272 0.1741 0.5605 0.0996 0.0939 0.2541 0.4957 0.4429 0.063 0.2259 0.1143 0.0002 0.0193 0.0298 0.1394 0.2958 0.4192 0.137 0.0839 0.3016 0.2671 0.0008 0.4999 0.0597 0.5127 0.1035 0.2448 0.1493 0.4571 0.1559 0.5127 0.7668 0.0179 0.3189 0.6342 0.132 0.1497 0.496 0.0144 0.2158 0.1306 0.2586 0.2932 0.408 0.3879 0.309 0.3343 0.5624 0.0664 0.3628 0.3017 0.1819 0.1152 0.1637 0.0367 0.2229 0.3727 0.3142 0.0042 0.3243 0.0583 0.1898 0.0014 0.3756 0.0348 0.2058 0.1569] 2022-08-23 09:45:07 [INFO] [EVAL] Class Precision: [0.7958 0.859 0.9677 0.822 0.7562 0.8561 0.8769 0.8551 0.6876 0.7799 0.6978 0.743 0.7709 0.4942 0.5539 0.5989 0.7264 0.6735 0.7426 0.611 0.8295 0.6823 0.774 0.6359 0.5119 0.5265 0.6564 0.789 0.6789 0.3683 0.4374 0.6289 0.4643 0.5071 0.4911 0.5632 0.6106 0.7567 0.4224 0.6765 0.4163 0.2476 0.6282 0.5223 0.3917 0.4355 0.4466 0.7058 0.6976 0.652 0.7243 0.5296 0.3655 0.5257 0.7015 0.613 0.9122 0.6781 0.7126 0.4565 0.0975 0.5363 0.4926 0.6966 0.5715 0.8032 0.4453 0.5511 0.2445 0.5491 0.6846 0.6903 0.6202 0.2759 0.6988 0.5059 0.7976 0.5694 0.6519 0.3975 0.7242 0.6377 0.814 0.0828 0.3769 0.7369 0.3213 0.4059 0.4921 0.7539 0.6237 0.0869 0.4077 0.3523 0.0008 0.0453 0.5264 0.4096 0.4791 0.6881 0.4086 0.2553 0.5671 0.7304 0.0989 0.5209 0.1866 0.8022 0.2473 0.5716 0.3408 0.6468 0.4671 0.7104 0.77 0.1005 0.6779 0.7518 0.1677 0.5893 0.6052 0.2175 0.5573 0.6203 0.6784 0.6323 0.7708 0.4864 0.6572 0.5492 0.6146 0.305 0.4993 0.6576 0.6304 0.2702 0.3782 0.06 0.4805 0.6036 0.3785 0.0065 0.6043 0.4391 0.4636 0.002 0.8438 0.4334 0.6578 0.7579] 2022-08-23 09:45:07 [INFO] [EVAL] Class Recall: [0.8472 0.8926 0.9615 0.8676 0.8691 0.8781 0.887 0.9116 0.6982 0.7374 0.6249 0.7002 0.8854 0.4426 0.4435 0.6231 0.6503 0.5588 0.7898 0.5753 0.8802 0.7197 0.7856 0.6886 0.4725 0.4874 0.6302 0.5365 0.4949 0.5665 0.447 0.6885 0.3987 0.5637 0.4555 0.5712 0.6054 0.6504 0.4086 0.4927 0.3578 0.1279 0.4537 0.3308 0.52 0.2916 0.4799 0.668 0.8445 0.7937 0.7457 0.6424 0.3525 0.2699 0.9547 0.5521 0.9405 0.5322 0.5346 0.3306 0.1117 0.6044 0.4968 0.2911 0.7092 0.8475 0.5408 0.5974 0.1772 0.4112 0.6756 0.6677 0.5232 0.398 0.5669 0.5022 0.6793 0.393 0.308 0.1718 0.8377 0.5279 0.4067 0.0389 0.2444 0.7008 0.1261 0.1088 0.3445 0.5914 0.6044 0.1863 0.3362 0.1448 0.0003 0.0327 0.0306 0.1744 0.436 0.5175 0.1709 0.1111 0.3918 0.2963 0.0008 0.9256 0.0807 0.5869 0.1511 0.2998 0.21 0.6092 0.1896 0.6482 0.9947 0.0214 0.3758 0.8021 0.3826 0.1671 0.7333 0.0152 0.2605 0.142 0.2947 0.3534 0.4644 0.6569 0.3683 0.4608 0.8688 0.0782 0.5704 0.3579 0.2036 0.1672 
0.224 0.0862 0.2937 0.4935 0.6491 0.0118 0.4118 0.063 0.2432 0.0045 0.4036 0.0365 0.2305 0.1652] 2022-08-23 09:45:07 [INFO] [EVAL] The model with the best validation mIoU (0.3549) was saved at iter 141000. 2022-08-23 09:45:15 [INFO] [TRAIN] epoch: 116, iter: 146050/160000, loss: 0.4886, lr: 0.000106, batch_cost: 0.1679, reader_cost: 0.00369, ips: 47.6586 samples/sec | ETA 00:39:01 2022-08-23 09:45:23 [INFO] [TRAIN] epoch: 116, iter: 146100/160000, loss: 0.5238, lr: 0.000105, batch_cost: 0.1669, reader_cost: 0.00102, ips: 47.9307 samples/sec | ETA 00:38:40 2022-08-23 09:45:33 [INFO] [TRAIN] epoch: 116, iter: 146150/160000, loss: 0.4767, lr: 0.000105, batch_cost: 0.1975, reader_cost: 0.00134, ips: 40.5058 samples/sec | ETA 00:45:35 2022-08-23 09:45:42 [INFO] [TRAIN] epoch: 116, iter: 146200/160000, loss: 0.4782, lr: 0.000104, batch_cost: 0.1816, reader_cost: 0.00051, ips: 44.0590 samples/sec | ETA 00:41:45 2022-08-23 09:45:51 [INFO] [TRAIN] epoch: 116, iter: 146250/160000, loss: 0.4634, lr: 0.000104, batch_cost: 0.1753, reader_cost: 0.00095, ips: 45.6477 samples/sec | ETA 00:40:09 2022-08-23 09:46:00 [INFO] [TRAIN] epoch: 116, iter: 146300/160000, loss: 0.5044, lr: 0.000104, batch_cost: 0.1714, reader_cost: 0.00040, ips: 46.6793 samples/sec | ETA 00:39:07 2022-08-23 09:46:09 [INFO] [TRAIN] epoch: 116, iter: 146350/160000, loss: 0.5013, lr: 0.000103, batch_cost: 0.1914, reader_cost: 0.00074, ips: 41.8029 samples/sec | ETA 00:43:32 2022-08-23 09:46:18 [INFO] [TRAIN] epoch: 116, iter: 146400/160000, loss: 0.4959, lr: 0.000103, batch_cost: 0.1648, reader_cost: 0.00045, ips: 48.5577 samples/sec | ETA 00:37:20 2022-08-23 09:46:27 [INFO] [TRAIN] epoch: 116, iter: 146450/160000, loss: 0.4675, lr: 0.000103, batch_cost: 0.1841, reader_cost: 0.00076, ips: 43.4638 samples/sec | ETA 00:41:34 2022-08-23 09:46:35 [INFO] [TRAIN] epoch: 116, iter: 146500/160000, loss: 0.4683, lr: 0.000102, batch_cost: 0.1720, reader_cost: 0.00040, ips: 46.5206 samples/sec | ETA 00:38:41 2022-08-23 09:46:51 [INFO] [TRAIN] epoch: 117, iter: 146550/160000, loss: 0.4534, lr: 0.000102, batch_cost: 0.3206, reader_cost: 0.09203, ips: 24.9568 samples/sec | ETA 01:11:51 2022-08-23 09:47:01 [INFO] [TRAIN] epoch: 117, iter: 146600/160000, loss: 0.4941, lr: 0.000101, batch_cost: 0.1954, reader_cost: 0.00158, ips: 40.9487 samples/sec | ETA 00:43:37 2022-08-23 09:47:12 [INFO] [TRAIN] epoch: 117, iter: 146650/160000, loss: 0.4396, lr: 0.000101, batch_cost: 0.2244, reader_cost: 0.00078, ips: 35.6532 samples/sec | ETA 00:49:55 2022-08-23 09:47:23 [INFO] [TRAIN] epoch: 117, iter: 146700/160000, loss: 0.5055, lr: 0.000101, batch_cost: 0.2022, reader_cost: 0.00422, ips: 39.5603 samples/sec | ETA 00:44:49 2022-08-23 09:47:33 [INFO] [TRAIN] epoch: 117, iter: 146750/160000, loss: 0.4811, lr: 0.000100, batch_cost: 0.2149, reader_cost: 0.00116, ips: 37.2277 samples/sec | ETA 00:47:27 2022-08-23 09:47:44 [INFO] [TRAIN] epoch: 117, iter: 146800/160000, loss: 0.4971, lr: 0.000100, batch_cost: 0.2105, reader_cost: 0.00066, ips: 38.0002 samples/sec | ETA 00:46:18 2022-08-23 09:47:53 [INFO] [TRAIN] epoch: 117, iter: 146850/160000, loss: 0.5173, lr: 0.000100, batch_cost: 0.1915, reader_cost: 0.00120, ips: 41.7707 samples/sec | ETA 00:41:58 2022-08-23 09:48:04 [INFO] [TRAIN] epoch: 117, iter: 146900/160000, loss: 0.4601, lr: 0.000099, batch_cost: 0.2198, reader_cost: 0.00229, ips: 36.3992 samples/sec | ETA 00:47:59 2022-08-23 09:48:15 [INFO] [TRAIN] epoch: 117, iter: 146950/160000, loss: 0.4578, lr: 0.000099, batch_cost: 0.2200, reader_cost: 0.00059, ips: 
36.3703 samples/sec | ETA 00:47:50 2022-08-23 09:48:27 [INFO] [TRAIN] epoch: 117, iter: 147000/160000, loss: 0.5065, lr: 0.000098, batch_cost: 0.2318, reader_cost: 0.00083, ips: 34.5089 samples/sec | ETA 00:50:13 2022-08-23 09:48:27 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 162s - batch_cost: 0.1620 - reader cost: 6.8116e-04 2022-08-23 09:51:09 [INFO] [EVAL] #Images: 2000 mIoU: 0.3504 Acc: 0.7703 Kappa: 0.7527 Dice: 0.4827 2022-08-23 09:51:09 [INFO] [EVAL] Class IoU: [0.6949 0.7759 0.9316 0.7329 0.6847 0.7656 0.7896 0.783 0.5332 0.6107 0.4894 0.5721 0.7062 0.3238 0.3133 0.4397 0.5071 0.4291 0.6276 0.4212 0.7438 0.5556 0.6476 0.4954 0.3372 0.29 0.4836 0.4536 0.4176 0.2768 0.272 0.4864 0.2782 0.3651 0.3046 0.4006 0.4375 0.5478 0.2529 0.3818 0.2241 0.0976 0.361 0.2577 0.2923 0.1992 0.2956 0.5264 0.6353 0.5136 0.5822 0.384 0.2259 0.2002 0.67 0.4428 0.8694 0.4188 0.4026 0.2489 0.0524 0.376 0.3257 0.2586 0.4695 0.7098 0.3015 0.4076 0.0995 0.301 0.5069 0.531 0.4021 0.1923 0.4553 0.3377 0.5837 0.3078 0.2756 0.1603 0.6014 0.4116 0.3774 0.0255 0.1164 0.5638 0.0955 0.0953 0.233 0.4719 0.4354 0.0715 0.2099 0.0989 0.0013 0.013 0.0312 0.1541 0.2899 0.4131 0.1076 0.0664 0.2887 0.3099 0.0018 0.5141 0.0616 0.5168 0.0998 0.2335 0.1632 0.3313 0.1568 0.567 0.795 0.0196 0.3119 0.636 0.1082 0.2108 0.4531 0.0199 0.2101 0.1475 0.2634 0.298 0.4487 0.4199 0.3607 0.327 0.5752 0.0545 0.3199 0.3062 0.163 0.1171 0.1596 0.0365 0.2287 0.356 0.306 0.0029 0.2687 0.0674 0.2396 0.0007 0.3465 0.0337 0.2136 0.1507] 2022-08-23 09:51:09 [INFO] [EVAL] Class Precision: [0.7908 0.851 0.9673 0.8237 0.7663 0.8574 0.8834 0.84 0.686 0.7848 0.7164 0.7247 0.7809 0.4924 0.5542 0.6068 0.7157 0.6748 0.7726 0.6253 0.8323 0.6967 0.792 0.6087 0.498 0.5323 0.6712 0.7975 0.6744 0.3756 0.4586 0.6228 0.4793 0.4824 0.4953 0.5657 0.6103 0.784 0.4055 0.6898 0.4296 0.2423 0.6501 0.5465 0.3868 0.4226 0.4364 0.7343 0.7384 0.5943 0.7732 0.497 0.3926 0.4731 0.6895 0.581 0.9218 0.6674 0.6981 0.3882 0.0976 0.5396 0.4651 0.6183 0.5808 0.8094 0.4249 0.5444 0.2152 0.5311 0.6612 0.6884 0.622 0.2763 0.7381 0.5505 0.7389 0.5785 0.706 0.4539 0.6809 0.6511 0.7947 0.0756 0.2768 0.7499 0.3362 0.4037 0.4358 0.7473 0.6124 0.0987 0.3438 0.3558 0.0033 0.0357 0.5287 0.3854 0.5684 0.6826 0.3886 0.2068 0.5724 0.6978 0.1074 0.541 0.2844 0.8224 0.3211 0.536 0.3078 0.4192 0.4303 0.7802 0.7993 0.1189 0.6691 0.7709 0.1476 0.5785 0.5961 0.2158 0.6344 0.627 0.6863 0.6082 0.8075 0.5668 0.7481 0.5461 0.6325 0.2906 0.4708 0.6281 0.626 0.2817 0.3204 0.0861 0.5284 0.6559 0.351 0.0046 0.6352 0.4782 0.581 0.0011 0.8426 0.381 0.5753 0.7479] 2022-08-23 09:51:09 [INFO] [EVAL] Class Recall: [0.8514 0.8979 0.9619 0.8693 0.8654 0.8773 0.8815 0.9202 0.7053 0.7335 0.607 0.731 0.8807 0.4861 0.4189 0.6149 0.635 0.541 0.7698 0.5634 0.875 0.7329 0.7803 0.7269 0.5108 0.3893 0.6337 0.5127 0.5231 0.5126 0.4007 0.6896 0.3987 0.6003 0.4418 0.5785 0.6072 0.6452 0.4019 0.461 0.3191 0.1404 0.448 0.3278 0.5447 0.2738 0.4782 0.6502 0.8199 0.7908 0.702 0.6281 0.3471 0.2577 0.9595 0.6505 0.9386 0.5294 0.4875 0.4096 0.1016 0.5536 0.5208 0.3078 0.71 0.8522 0.5094 0.6187 0.1562 0.41 0.6847 0.6989 0.5322 0.3877 0.5431 0.4662 0.7353 0.3968 0.3113 0.1986 0.8375 0.5281 0.4181 0.037 0.1672 0.6944 0.1177 0.1109 0.3337 0.5615 0.601 0.2056 0.3502 0.1204 0.0022 0.0201 0.0321 0.2044 0.3718 0.5114 0.1295 0.0891 0.3681 0.3579 0.0018 0.9118 0.0728 0.5818 0.1265 0.2927 0.2578 0.6125 0.1979 0.6747 0.9933 0.0229 0.3688 0.7843 0.2886 0.2491 0.6538 0.0214 0.2391 0.1617 0.2995 0.3688 0.5025 
0.6183 0.4106 0.449 0.864 0.0628 0.4995 0.374 0.1806 0.1669 0.2412 0.0596 0.2874 0.4378 0.7046 0.0075 0.3177 0.0728 0.2897 0.0019 0.3705 0.0357 0.2536 0.1588] 2022-08-23 09:51:09 [INFO] [EVAL] The model with the best validation mIoU (0.3549) was saved at iter 141000. 2022-08-23 09:51:19 [INFO] [TRAIN] epoch: 117, iter: 147050/160000, loss: 0.4460, lr: 0.000098, batch_cost: 0.1943, reader_cost: 0.00417, ips: 41.1829 samples/sec | ETA 00:41:55 2022-08-23 09:51:29 [INFO] [TRAIN] epoch: 117, iter: 147100/160000, loss: 0.5106, lr: 0.000098, batch_cost: 0.1933, reader_cost: 0.00092, ips: 41.3771 samples/sec | ETA 00:41:34 2022-08-23 09:51:39 [INFO] [TRAIN] epoch: 117, iter: 147150/160000, loss: 0.4497, lr: 0.000097, batch_cost: 0.2127, reader_cost: 0.00068, ips: 37.6035 samples/sec | ETA 00:45:33 2022-08-23 09:51:49 [INFO] [TRAIN] epoch: 117, iter: 147200/160000, loss: 0.4737, lr: 0.000097, batch_cost: 0.1880, reader_cost: 0.00070, ips: 42.5549 samples/sec | ETA 00:40:06 2022-08-23 09:51:58 [INFO] [TRAIN] epoch: 117, iter: 147250/160000, loss: 0.5280, lr: 0.000097, batch_cost: 0.1902, reader_cost: 0.00055, ips: 42.0699 samples/sec | ETA 00:40:24 2022-08-23 09:52:08 [INFO] [TRAIN] epoch: 117, iter: 147300/160000, loss: 0.5103, lr: 0.000096, batch_cost: 0.1950, reader_cost: 0.00051, ips: 41.0216 samples/sec | ETA 00:41:16 2022-08-23 09:52:18 [INFO] [TRAIN] epoch: 117, iter: 147350/160000, loss: 0.4908, lr: 0.000096, batch_cost: 0.1914, reader_cost: 0.00039, ips: 41.7872 samples/sec | ETA 00:40:21 2022-08-23 09:52:28 [INFO] [TRAIN] epoch: 117, iter: 147400/160000, loss: 0.4827, lr: 0.000095, batch_cost: 0.2026, reader_cost: 0.00079, ips: 39.4828 samples/sec | ETA 00:42:33 2022-08-23 09:52:37 [INFO] [TRAIN] epoch: 117, iter: 147450/160000, loss: 0.4452, lr: 0.000095, batch_cost: 0.1806, reader_cost: 0.00035, ips: 44.3088 samples/sec | ETA 00:37:45 2022-08-23 09:52:47 [INFO] [TRAIN] epoch: 117, iter: 147500/160000, loss: 0.4688, lr: 0.000095, batch_cost: 0.1958, reader_cost: 0.00557, ips: 40.8604 samples/sec | ETA 00:40:47 2022-08-23 09:52:58 [INFO] [TRAIN] epoch: 117, iter: 147550/160000, loss: 0.4937, lr: 0.000094, batch_cost: 0.2237, reader_cost: 0.00172, ips: 35.7567 samples/sec | ETA 00:46:25 2022-08-23 09:53:08 [INFO] [TRAIN] epoch: 117, iter: 147600/160000, loss: 0.4662, lr: 0.000094, batch_cost: 0.2017, reader_cost: 0.00357, ips: 39.6593 samples/sec | ETA 00:41:41 2022-08-23 09:53:18 [INFO] [TRAIN] epoch: 117, iter: 147650/160000, loss: 0.4748, lr: 0.000094, batch_cost: 0.2004, reader_cost: 0.00071, ips: 39.9217 samples/sec | ETA 00:41:14 2022-08-23 09:53:28 [INFO] [TRAIN] epoch: 117, iter: 147700/160000, loss: 0.4655, lr: 0.000093, batch_cost: 0.2110, reader_cost: 0.00043, ips: 37.9125 samples/sec | ETA 00:43:15 2022-08-23 09:53:39 [INFO] [TRAIN] epoch: 117, iter: 147750/160000, loss: 0.5106, lr: 0.000093, batch_cost: 0.2188, reader_cost: 0.00123, ips: 36.5694 samples/sec | ETA 00:44:39 2022-08-23 09:53:52 [INFO] [TRAIN] epoch: 118, iter: 147800/160000, loss: 0.4607, lr: 0.000092, batch_cost: 0.2576, reader_cost: 0.05395, ips: 31.0594 samples/sec | ETA 00:52:22 2022-08-23 09:54:03 [INFO] [TRAIN] epoch: 118, iter: 147850/160000, loss: 0.5160, lr: 0.000092, batch_cost: 0.2232, reader_cost: 0.00067, ips: 35.8416 samples/sec | ETA 00:45:11 2022-08-23 09:54:15 [INFO] [TRAIN] epoch: 118, iter: 147900/160000, loss: 0.4845, lr: 0.000092, batch_cost: 0.2272, reader_cost: 0.00066, ips: 35.2039 samples/sec | ETA 00:45:49 2022-08-23 09:54:25 [INFO] [TRAIN] epoch: 118, iter: 147950/160000, loss: 0.5103, 
lr: 0.000091, batch_cost: 0.2114, reader_cost: 0.00033, ips: 37.8373 samples/sec | ETA 00:42:27 2022-08-23 09:54:36 [INFO] [TRAIN] epoch: 118, iter: 148000/160000, loss: 0.4635, lr: 0.000091, batch_cost: 0.2155, reader_cost: 0.00054, ips: 37.1197 samples/sec | ETA 00:43:06 2022-08-23 09:54:36 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 156s - batch_cost: 0.1562 - reader cost: 0.0010 2022-08-23 09:57:13 [INFO] [EVAL] #Images: 2000 mIoU: 0.3561 Acc: 0.7711 Kappa: 0.7536 Dice: 0.4886 2022-08-23 09:57:13 [INFO] [EVAL] Class IoU: [0.6933 0.7792 0.9316 0.7342 0.6782 0.7651 0.7858 0.7884 0.5313 0.6199 0.4876 0.5638 0.7055 0.3035 0.3172 0.4361 0.5246 0.4397 0.6255 0.4257 0.754 0.5562 0.6353 0.4962 0.3357 0.341 0.5094 0.4605 0.4209 0.2703 0.2684 0.502 0.2609 0.364 0.3103 0.3912 0.4331 0.5336 0.2604 0.3965 0.2402 0.0818 0.3475 0.2657 0.2854 0.2017 0.3004 0.5159 0.6034 0.5591 0.5674 0.4585 0.2238 0.2143 0.6784 0.4382 0.8705 0.418 0.4326 0.2146 0.0549 0.4037 0.3209 0.2281 0.4684 0.6748 0.2999 0.4072 0.1212 0.3085 0.5201 0.5409 0.3983 0.1932 0.4581 0.3254 0.5613 0.2782 0.2699 0.1487 0.6216 0.4039 0.3738 0.0288 0.2971 0.5614 0.0957 0.0967 0.2365 0.5189 0.4342 0.055 0.2235 0.1198 0.0012 0.011 0.0319 0.1615 0.3016 0.4425 0.1382 0.0588 0.2963 0.3935 0.001 0.608 0.0536 0.5294 0.0956 0.2558 0.136 0.5339 0.1563 0.5631 0.7841 0.0137 0.3771 0.6385 0.0978 0.1252 0.4785 0.011 0.1994 0.1464 0.2723 0.2744 0.4284 0.4211 0.4133 0.3467 0.5852 0.0651 0.3311 0.262 0.1865 0.1207 0.1657 0.0318 0.206 0.3677 0.3049 0.008 0.3426 0.1394 0.1752 0.0012 0.3671 0.0332 0.199 0.1276] 2022-08-23 09:57:13 [INFO] [EVAL] Class Precision: [0.7846 0.8586 0.9674 0.8315 0.7534 0.8668 0.8719 0.8512 0.6918 0.7714 0.6889 0.7366 0.7814 0.512 0.5438 0.5842 0.7034 0.6945 0.7673 0.6041 0.8502 0.7003 0.7601 0.6093 0.5069 0.5345 0.6709 0.7567 0.6587 0.3844 0.4754 0.6612 0.481 0.4999 0.4967 0.5825 0.6232 0.7754 0.4193 0.664 0.4122 0.2236 0.646 0.5225 0.3895 0.4234 0.5077 0.6886 0.6936 0.679 0.7475 0.6731 0.3832 0.4851 0.7005 0.6278 0.9278 0.68 0.7109 0.4113 0.1042 0.5637 0.4753 0.6614 0.5834 0.7546 0.4255 0.5659 0.2778 0.523 0.6757 0.6863 0.646 0.2773 0.7133 0.5035 0.6769 0.4947 0.7213 0.4377 0.7064 0.6437 0.8045 0.0983 0.4973 0.7417 0.3496 0.3931 0.4833 0.767 0.6105 0.0701 0.3968 0.3913 0.005 0.0307 0.5857 0.4031 0.5163 0.6712 0.4078 0.2743 0.5746 0.7181 0.0481 0.6498 0.1831 0.8377 0.3063 0.5117 0.3242 0.8103 0.4848 0.793 0.7884 0.1045 0.6857 0.7708 0.1337 0.5508 0.5902 0.2048 0.6743 0.61 0.6615 0.6071 0.8043 0.5806 0.7913 0.5613 0.6573 0.3483 0.4537 0.7056 0.5323 0.2369 0.3729 0.0681 0.4675 0.6298 0.3518 0.0132 0.616 0.5126 0.4371 0.0017 0.8282 0.3851 0.5578 0.7011] 2022-08-23 09:57:13 [INFO] [EVAL] Class Recall: [0.8562 0.8939 0.9618 0.8626 0.8718 0.8671 0.8884 0.9145 0.696 0.7594 0.6253 0.7062 0.879 0.427 0.4322 0.6325 0.6736 0.5451 0.772 0.5904 0.8695 0.73 0.7946 0.7278 0.4986 0.4851 0.6791 0.5405 0.5384 0.4765 0.3814 0.6759 0.3632 0.5724 0.4527 0.5437 0.5868 0.6311 0.4074 0.496 0.3653 0.1143 0.4292 0.3509 0.5166 0.2781 0.4239 0.6729 0.8226 0.7599 0.7019 0.5899 0.3499 0.2775 0.9556 0.592 0.9338 0.5203 0.5249 0.3098 0.1039 0.5871 0.4969 0.2583 0.7038 0.8645 0.5039 0.5922 0.177 0.4292 0.6932 0.7185 0.5094 0.3891 0.5615 0.479 0.7666 0.3886 0.3014 0.1838 0.8382 0.5201 0.4111 0.0391 0.4246 0.6978 0.1165 0.1136 0.3165 0.6161 0.6006 0.2036 0.3386 0.1473 0.0017 0.0168 0.0327 0.2123 0.4204 0.5649 0.1729 0.0697 0.3795 0.4653 0.001 0.9042 0.0705 0.5899 0.1219 0.3384 0.1898 0.6101 0.1875 0.6602 0.9931 0.0155 0.456 0.7882 
0.2672 0.1395 0.7165 0.0114 0.2206 0.1615 0.3164 0.3336 0.4783 0.6051 0.4639 0.4756 0.8421 0.0741 0.5508 0.2942 0.223 0.1975 0.2296 0.0564 0.2692 0.4692 0.6958 0.0199 0.4356 0.1606 0.2262 0.004 0.3973 0.0351 0.2362 0.1349] 2022-08-23 09:57:13 [INFO] [EVAL] The model with the best validation mIoU (0.3561) was saved at iter 148000. 2022-08-23 09:57:22 [INFO] [TRAIN] epoch: 118, iter: 148050/160000, loss: 0.4729, lr: 0.000090, batch_cost: 0.1792, reader_cost: 0.00435, ips: 44.6345 samples/sec | ETA 00:35:41 2022-08-23 09:57:30 [INFO] [TRAIN] epoch: 118, iter: 148100/160000, loss: 0.4597, lr: 0.000090, batch_cost: 0.1643, reader_cost: 0.00031, ips: 48.6955 samples/sec | ETA 00:32:35 2022-08-23 09:57:38 [INFO] [TRAIN] epoch: 118, iter: 148150/160000, loss: 0.4463, lr: 0.000090, batch_cost: 0.1669, reader_cost: 0.00074, ips: 47.9429 samples/sec | ETA 00:32:57 2022-08-23 09:57:48 [INFO] [TRAIN] epoch: 118, iter: 148200/160000, loss: 0.4674, lr: 0.000089, batch_cost: 0.1923, reader_cost: 0.00090, ips: 41.6061 samples/sec | ETA 00:37:48 2022-08-23 09:57:57 [INFO] [TRAIN] epoch: 118, iter: 148250/160000, loss: 0.4616, lr: 0.000089, batch_cost: 0.1871, reader_cost: 0.00073, ips: 42.7685 samples/sec | ETA 00:36:37 2022-08-23 09:58:07 [INFO] [TRAIN] epoch: 118, iter: 148300/160000, loss: 0.4691, lr: 0.000089, batch_cost: 0.1852, reader_cost: 0.00069, ips: 43.2007 samples/sec | ETA 00:36:06 2022-08-23 09:58:16 [INFO] [TRAIN] epoch: 118, iter: 148350/160000, loss: 0.4838, lr: 0.000088, batch_cost: 0.1914, reader_cost: 0.00055, ips: 41.8056 samples/sec | ETA 00:37:09 2022-08-23 09:58:26 [INFO] [TRAIN] epoch: 118, iter: 148400/160000, loss: 0.4785, lr: 0.000088, batch_cost: 0.2051, reader_cost: 0.00044, ips: 39.0076 samples/sec | ETA 00:39:39 2022-08-23 09:58:35 [INFO] [TRAIN] epoch: 118, iter: 148450/160000, loss: 0.4757, lr: 0.000087, batch_cost: 0.1770, reader_cost: 0.00044, ips: 45.1861 samples/sec | ETA 00:34:04 2022-08-23 09:58:44 [INFO] [TRAIN] epoch: 118, iter: 148500/160000, loss: 0.4893, lr: 0.000087, batch_cost: 0.1807, reader_cost: 0.00106, ips: 44.2830 samples/sec | ETA 00:34:37 2022-08-23 09:58:54 [INFO] [TRAIN] epoch: 118, iter: 148550/160000, loss: 0.4739, lr: 0.000087, batch_cost: 0.1904, reader_cost: 0.00124, ips: 42.0238 samples/sec | ETA 00:36:19 2022-08-23 09:59:04 [INFO] [TRAIN] epoch: 118, iter: 148600/160000, loss: 0.4771, lr: 0.000086, batch_cost: 0.2123, reader_cost: 0.00068, ips: 37.6837 samples/sec | ETA 00:40:20 2022-08-23 09:59:14 [INFO] [TRAIN] epoch: 118, iter: 148650/160000, loss: 0.4554, lr: 0.000086, batch_cost: 0.2003, reader_cost: 0.00058, ips: 39.9441 samples/sec | ETA 00:37:53 2022-08-23 09:59:25 [INFO] [TRAIN] epoch: 118, iter: 148700/160000, loss: 0.4741, lr: 0.000086, batch_cost: 0.2111, reader_cost: 0.01064, ips: 37.9023 samples/sec | ETA 00:39:45 2022-08-23 09:59:35 [INFO] [TRAIN] epoch: 118, iter: 148750/160000, loss: 0.4570, lr: 0.000085, batch_cost: 0.1919, reader_cost: 0.00052, ips: 41.6881 samples/sec | ETA 00:35:58 2022-08-23 09:59:45 [INFO] [TRAIN] epoch: 118, iter: 148800/160000, loss: 0.4969, lr: 0.000085, batch_cost: 0.2049, reader_cost: 0.00048, ips: 39.0504 samples/sec | ETA 00:38:14 2022-08-23 09:59:56 [INFO] [TRAIN] epoch: 118, iter: 148850/160000, loss: 0.4704, lr: 0.000084, batch_cost: 0.2153, reader_cost: 0.00056, ips: 37.1594 samples/sec | ETA 00:40:00 2022-08-23 10:00:07 [INFO] [TRAIN] epoch: 118, iter: 148900/160000, loss: 0.4899, lr: 0.000084, batch_cost: 0.2210, reader_cost: 0.00050, ips: 36.1925 samples/sec | ETA 00:40:53 2022-08-23 
10:00:17 [INFO] [TRAIN] epoch: 118, iter: 148950/160000, loss: 0.4917, lr: 0.000084, batch_cost: 0.2116, reader_cost: 0.00091, ips: 37.8039 samples/sec | ETA 00:38:58 2022-08-23 10:00:28 [INFO] [TRAIN] epoch: 118, iter: 149000/160000, loss: 0.4918, lr: 0.000083, batch_cost: 0.2166, reader_cost: 0.00056, ips: 36.9359 samples/sec | ETA 00:39:42 2022-08-23 10:00:28 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1590 - reader cost: 7.2332e-04 2022-08-23 10:03:07 [INFO] [EVAL] #Images: 2000 mIoU: 0.3526 Acc: 0.7703 Kappa: 0.7527 Dice: 0.4851 2022-08-23 10:03:07 [INFO] [EVAL] Class IoU: [0.6934 0.7775 0.9328 0.732 0.6876 0.763 0.7888 0.7878 0.5377 0.597 0.4928 0.5668 0.7079 0.3061 0.3257 0.4346 0.5252 0.4457 0.6301 0.4176 0.7445 0.5204 0.6481 0.4967 0.3332 0.35 0.4797 0.4638 0.4115 0.2815 0.2845 0.4902 0.2647 0.3701 0.3138 0.388 0.4316 0.5405 0.2563 0.4152 0.232 0.0932 0.348 0.2661 0.289 0.1859 0.3224 0.5218 0.596 0.5421 0.5691 0.4221 0.214 0.2088 0.6851 0.4338 0.8689 0.4192 0.4326 0.2375 0.0653 0.3962 0.3302 0.26 0.4656 0.7049 0.3164 0.4173 0.1209 0.2998 0.5184 0.5312 0.4084 0.2031 0.4578 0.305 0.5607 0.3007 0.2557 0.1776 0.6082 0.4091 0.3953 0.0226 0.149 0.5673 0.1094 0.1023 0.2099 0.5046 0.4507 0.0464 0.2096 0.1069 0.0022 0.0149 0.0278 0.1571 0.2992 0.4141 0.1231 0.0927 0.272 0.1255 0.0028 0.5714 0.0641 0.521 0.105 0.2762 0.155 0.3876 0.1528 0.5174 0.789 0.0296 0.2994 0.636 0.1202 0.0902 0.4822 0.0143 0.2171 0.1618 0.274 0.3019 0.4569 0.411 0.3935 0.3304 0.5789 0.0832 0.3453 0.289 0.1802 0.1253 0.159 0.0349 0.2213 0.3716 0.3537 0.0037 0.3328 0.1247 0.1868 0.005 0.3454 0.0325 0.1592 0.1565] 2022-08-23 10:03:07 [INFO] [EVAL] Class Precision: [0.7838 0.8622 0.9667 0.8182 0.7777 0.8682 0.8789 0.8517 0.6922 0.7554 0.7109 0.7233 0.7774 0.5013 0.53 0.6104 0.7079 0.6845 0.786 0.6317 0.8299 0.6925 0.7949 0.6458 0.5391 0.5288 0.6416 0.7743 0.6809 0.372 0.4416 0.634 0.4658 0.5049 0.472 0.547 0.6138 0.7701 0.422 0.6365 0.3809 0.2358 0.616 0.5008 0.4024 0.4261 0.4952 0.6956 0.6617 0.6361 0.7997 0.5843 0.3493 0.5258 0.723 0.631 0.922 0.6888 0.6896 0.444 0.1111 0.5431 0.5176 0.6746 0.5724 0.8211 0.4686 0.5504 0.2943 0.5187 0.6631 0.6848 0.6221 0.2713 0.7346 0.5174 0.6798 0.5452 0.7673 0.4817 0.6837 0.6469 0.7823 0.0798 0.348 0.7462 0.3067 0.4109 0.3863 0.7591 0.644 0.0566 0.3515 0.3622 0.0068 0.0394 0.4393 0.3089 0.5066 0.6674 0.3757 0.2567 0.6037 0.6107 0.1093 0.6127 0.1938 0.8225 0.2749 0.5759 0.3382 0.5366 0.4402 0.7851 0.7942 0.1466 0.649 0.7625 0.1685 0.6606 0.614 0.1813 0.6527 0.6143 0.6711 0.6103 0.797 0.5613 0.6473 0.5255 0.6388 0.373 0.4636 0.6846 0.5151 0.2755 0.3056 0.0711 0.4618 0.6261 0.4292 0.0061 0.5965 0.4972 0.4399 0.0068 0.8407 0.4135 0.5668 0.6984] 2022-08-23 10:03:07 [INFO] [EVAL] Class Recall: [0.8573 0.8878 0.9638 0.8742 0.8558 0.863 0.8849 0.913 0.7066 0.7401 0.6163 0.7237 0.888 0.4402 0.4579 0.6015 0.6705 0.5609 0.7606 0.552 0.8786 0.6767 0.7783 0.6827 0.4659 0.5086 0.6553 0.5363 0.5097 0.5365 0.4442 0.6838 0.3801 0.5809 0.4837 0.5718 0.5924 0.6445 0.3949 0.5444 0.3724 0.1336 0.4444 0.3622 0.5064 0.248 0.4801 0.6763 0.8573 0.7858 0.6637 0.6033 0.3557 0.2573 0.929 0.5813 0.9378 0.5171 0.5372 0.3381 0.1365 0.5942 0.4771 0.2972 0.714 0.8329 0.4934 0.6332 0.1703 0.4153 0.7037 0.7032 0.5431 0.4467 0.5485 0.4262 0.762 0.4014 0.2772 0.2195 0.8463 0.5267 0.4442 0.0305 0.2067 0.703 0.1454 0.1199 0.3149 0.6008 0.6003 0.2053 0.3416 0.1317 0.0031 0.0234 0.0288 0.2423 0.4223 0.5217 0.1548 0.1267 0.3312 0.1364 0.0029 0.8943 0.0874 0.5871 0.1453 
0.3468 0.2224 0.5827 0.1896 0.6028 0.9918 0.0357 0.3572 0.7932 0.2953 0.0946 0.692 0.0153 0.2455 0.1801 0.3165 0.374 0.517 0.6054 0.5008 0.4709 0.8606 0.0967 0.5751 0.3334 0.217 0.1868 0.249 0.0642 0.2982 0.4776 0.6676 0.0096 0.4296 0.1427 0.2451 0.018 0.3696 0.0341 0.1813 0.1679] 2022-08-23 10:03:07 [INFO] [EVAL] The model with the best validation mIoU (0.3561) was saved at iter 148000. 2022-08-23 10:03:18 [INFO] [TRAIN] epoch: 119, iter: 149050/160000, loss: 0.4794, lr: 0.000083, batch_cost: 0.2097, reader_cost: 0.03834, ips: 38.1534 samples/sec | ETA 00:38:15 2022-08-23 10:03:27 [INFO] [TRAIN] epoch: 119, iter: 149100/160000, loss: 0.4981, lr: 0.000083, batch_cost: 0.1782, reader_cost: 0.00357, ips: 44.8919 samples/sec | ETA 00:32:22 2022-08-23 10:03:35 [INFO] [TRAIN] epoch: 119, iter: 149150/160000, loss: 0.4770, lr: 0.000082, batch_cost: 0.1628, reader_cost: 0.00043, ips: 49.1536 samples/sec | ETA 00:29:25 2022-08-23 10:03:44 [INFO] [TRAIN] epoch: 119, iter: 149200/160000, loss: 0.4692, lr: 0.000082, batch_cost: 0.1803, reader_cost: 0.00033, ips: 44.3802 samples/sec | ETA 00:32:26 2022-08-23 10:03:53 [INFO] [TRAIN] epoch: 119, iter: 149250/160000, loss: 0.4578, lr: 0.000081, batch_cost: 0.1719, reader_cost: 0.00062, ips: 46.5374 samples/sec | ETA 00:30:47 2022-08-23 10:04:02 [INFO] [TRAIN] epoch: 119, iter: 149300/160000, loss: 0.4930, lr: 0.000081, batch_cost: 0.1926, reader_cost: 0.00049, ips: 41.5290 samples/sec | ETA 00:34:21 2022-08-23 10:04:12 [INFO] [TRAIN] epoch: 119, iter: 149350/160000, loss: 0.4761, lr: 0.000081, batch_cost: 0.2003, reader_cost: 0.00078, ips: 39.9456 samples/sec | ETA 00:35:32 2022-08-23 10:04:21 [INFO] [TRAIN] epoch: 119, iter: 149400/160000, loss: 0.4778, lr: 0.000080, batch_cost: 0.1799, reader_cost: 0.00050, ips: 44.4760 samples/sec | ETA 00:31:46 2022-08-23 10:04:31 [INFO] [TRAIN] epoch: 119, iter: 149450/160000, loss: 0.4892, lr: 0.000080, batch_cost: 0.1949, reader_cost: 0.00052, ips: 41.0414 samples/sec | ETA 00:34:16 2022-08-23 10:04:41 [INFO] [TRAIN] epoch: 119, iter: 149500/160000, loss: 0.4554, lr: 0.000080, batch_cost: 0.1928, reader_cost: 0.00044, ips: 41.4864 samples/sec | ETA 00:33:44 2022-08-23 10:04:51 [INFO] [TRAIN] epoch: 119, iter: 149550/160000, loss: 0.4938, lr: 0.000079, batch_cost: 0.1996, reader_cost: 0.00051, ips: 40.0812 samples/sec | ETA 00:34:45 2022-08-23 10:05:00 [INFO] [TRAIN] epoch: 119, iter: 149600/160000, loss: 0.5286, lr: 0.000079, batch_cost: 0.1863, reader_cost: 0.00051, ips: 42.9371 samples/sec | ETA 00:32:17 2022-08-23 10:05:09 [INFO] [TRAIN] epoch: 119, iter: 149650/160000, loss: 0.4919, lr: 0.000078, batch_cost: 0.1772, reader_cost: 0.00068, ips: 45.1346 samples/sec | ETA 00:30:34 2022-08-23 10:05:19 [INFO] [TRAIN] epoch: 119, iter: 149700/160000, loss: 0.4635, lr: 0.000078, batch_cost: 0.2042, reader_cost: 0.00037, ips: 39.1845 samples/sec | ETA 00:35:02 2022-08-23 10:05:30 [INFO] [TRAIN] epoch: 119, iter: 149750/160000, loss: 0.4679, lr: 0.000078, batch_cost: 0.2149, reader_cost: 0.00063, ips: 37.2269 samples/sec | ETA 00:36:42 2022-08-23 10:05:40 [INFO] [TRAIN] epoch: 119, iter: 149800/160000, loss: 0.4850, lr: 0.000077, batch_cost: 0.2123, reader_cost: 0.00062, ips: 37.6794 samples/sec | ETA 00:36:05 2022-08-23 10:05:52 [INFO] [TRAIN] epoch: 119, iter: 149850/160000, loss: 0.4807, lr: 0.000077, batch_cost: 0.2299, reader_cost: 0.00085, ips: 34.7972 samples/sec | ETA 00:38:53 2022-08-23 10:06:02 [INFO] [TRAIN] epoch: 119, iter: 149900/160000, loss: 0.4786, lr: 0.000076, batch_cost: 0.2075, reader_cost: 0.00050, 
ips: 38.5497 samples/sec | ETA 00:34:55 2022-08-23 10:06:13 [INFO] [TRAIN] epoch: 119, iter: 149950/160000, loss: 0.4505, lr: 0.000076, batch_cost: 0.2201, reader_cost: 0.00048, ips: 36.3552 samples/sec | ETA 00:36:51 2022-08-23 10:06:24 [INFO] [TRAIN] epoch: 119, iter: 150000/160000, loss: 0.4828, lr: 0.000076, batch_cost: 0.2130, reader_cost: 0.00196, ips: 37.5501 samples/sec | ETA 00:35:30 2022-08-23 10:06:24 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 184s - batch_cost: 0.1844 - reader cost: 6.9211e-04 2022-08-23 10:09:28 [INFO] [EVAL] #Images: 2000 mIoU: 0.3542 Acc: 0.7707 Kappa: 0.7531 Dice: 0.4876 2022-08-23 10:09:28 [INFO] [EVAL] Class IoU: [0.6956 0.7764 0.9322 0.7341 0.6787 0.7636 0.7906 0.7891 0.5349 0.6278 0.4953 0.562 0.7074 0.2973 0.3204 0.4343 0.5203 0.4346 0.6264 0.4215 0.7462 0.5112 0.6404 0.495 0.3048 0.2991 0.4913 0.4744 0.4265 0.2711 0.2879 0.4894 0.2624 0.3657 0.3177 0.3956 0.4365 0.5309 0.2642 0.4143 0.2247 0.1094 0.3414 0.2705 0.2799 0.1845 0.2953 0.5223 0.6123 0.5388 0.5905 0.3647 0.2132 0.2205 0.6577 0.3994 0.8652 0.4216 0.4265 0.2364 0.0647 0.4033 0.3288 0.2443 0.4767 0.6992 0.3035 0.3872 0.1287 0.3013 0.5227 0.5313 0.3951 0.1802 0.4602 0.3391 0.5892 0.2897 0.2679 0.1888 0.6498 0.4124 0.3876 0.036 0.1956 0.5655 0.1059 0.1033 0.239 0.4924 0.3944 0.0514 0.2133 0.101 0.0005 0.0149 0.0293 0.1593 0.2897 0.4287 0.1264 0.085 0.2873 0.3671 0.0064 0.5731 0.0606 0.5124 0.0978 0.26 0.1478 0.3493 0.1569 0.5633 0.7847 0.0139 0.3697 0.6442 0.1051 0.2146 0.4671 0.0181 0.2208 0.1525 0.2713 0.2998 0.4547 0.4306 0.3437 0.3423 0.5567 0.0826 0.3171 0.3274 0.1963 0.1185 0.1605 0.0319 0.2242 0.3645 0.3157 0.0041 0.3308 0.0848 0.2408 0.0012 0.3603 0.0349 0.1881 0.1531] 2022-08-23 10:09:28 [INFO] [EVAL] Class Precision: [0.788 0.8541 0.9656 0.8275 0.7644 0.8682 0.8728 0.8494 0.6912 0.7528 0.688 0.7361 0.7859 0.5038 0.5488 0.5895 0.7101 0.6894 0.7618 0.6213 0.8393 0.6957 0.7688 0.6351 0.5208 0.5312 0.6186 0.7747 0.6619 0.4076 0.4558 0.6306 0.4993 0.4992 0.4908 0.5527 0.6282 0.7717 0.4261 0.6295 0.4312 0.2529 0.6359 0.5063 0.4192 0.4028 0.5064 0.7147 0.697 0.6271 0.7765 0.4825 0.3785 0.5246 0.6755 0.5543 0.922 0.7027 0.7645 0.4511 0.109 0.5527 0.4932 0.7218 0.6007 0.7898 0.4135 0.5159 0.3469 0.5241 0.6737 0.6739 0.655 0.2487 0.6917 0.4825 0.7378 0.5339 0.788 0.3706 0.7574 0.6645 0.7958 0.0875 0.4066 0.7326 0.3189 0.3984 0.4316 0.7369 0.529 0.0637 0.3529 0.3744 0.0022 0.037 0.3808 0.3673 0.4597 0.6661 0.4253 0.2479 0.5782 0.7389 0.4161 0.6128 0.2419 0.8039 0.2632 0.5716 0.313 0.4486 0.4876 0.8305 0.7892 0.1142 0.6964 0.7749 0.1434 0.6544 0.5764 0.1524 0.6046 0.6044 0.6562 0.6386 0.8125 0.6189 0.7017 0.5578 0.6113 0.3904 0.451 0.6308 0.5939 0.2588 0.372 0.0614 0.4626 0.6493 0.3736 0.0062 0.6083 0.4231 0.4744 0.0015 0.8532 0.3581 0.6324 0.7608] 2022-08-23 10:09:28 [INFO] [EVAL] Class Recall: [0.8558 0.8952 0.9642 0.8668 0.8581 0.8637 0.8936 0.9175 0.7029 0.7908 0.6388 0.7038 0.8762 0.4204 0.4351 0.6226 0.6607 0.5404 0.7789 0.5673 0.8705 0.6584 0.7932 0.6917 0.4236 0.4063 0.7048 0.5504 0.5453 0.4473 0.4388 0.6862 0.356 0.5777 0.4739 0.5818 0.5885 0.6298 0.41 0.5479 0.3193 0.1617 0.4243 0.3675 0.4571 0.2539 0.4145 0.6599 0.8345 0.7928 0.7114 0.5989 0.3281 0.2755 0.9615 0.5884 0.9335 0.5132 0.4909 0.3318 0.1375 0.5986 0.4966 0.2696 0.6977 0.859 0.5328 0.6082 0.1698 0.4147 0.6999 0.7151 0.4989 0.3954 0.5789 0.5328 0.7453 0.3877 0.2887 0.2779 0.8205 0.5207 0.4304 0.0577 0.2738 0.7126 0.1369 0.1224 0.3489 0.5975 0.608 0.2095 0.3504 0.1215 0.0007 0.0244 0.0308 
0.2195 0.4392 0.546 0.1524 0.1145 0.3635 0.4218 0.0065 0.8984 0.0749 0.5856 0.1347 0.323 0.2188 0.6121 0.1878 0.6365 0.9929 0.0155 0.4408 0.7925 0.2826 0.242 0.7112 0.0201 0.2581 0.1694 0.3162 0.3611 0.508 0.586 0.4026 0.4697 0.8618 0.0949 0.5164 0.4049 0.2268 0.1793 0.2202 0.0622 0.3032 0.4539 0.6708 0.0119 0.4204 0.0959 0.3284 0.0048 0.3841 0.0372 0.2113 0.1609] 2022-08-23 10:09:29 [INFO] [EVAL] The model with the best validation mIoU (0.3561) was saved at iter 148000. 2022-08-23 10:09:38 [INFO] [TRAIN] epoch: 119, iter: 150050/160000, loss: 0.4889, lr: 0.000075, batch_cost: 0.1840, reader_cost: 0.00380, ips: 43.4899 samples/sec | ETA 00:30:30 2022-08-23 10:09:47 [INFO] [TRAIN] epoch: 119, iter: 150100/160000, loss: 0.4426, lr: 0.000075, batch_cost: 0.1910, reader_cost: 0.00112, ips: 41.8890 samples/sec | ETA 00:31:30 2022-08-23 10:09:58 [INFO] [TRAIN] epoch: 119, iter: 150150/160000, loss: 0.4545, lr: 0.000075, batch_cost: 0.2046, reader_cost: 0.00089, ips: 39.1099 samples/sec | ETA 00:33:34 2022-08-23 10:10:07 [INFO] [TRAIN] epoch: 119, iter: 150200/160000, loss: 0.4804, lr: 0.000074, batch_cost: 0.1879, reader_cost: 0.00116, ips: 42.5651 samples/sec | ETA 00:30:41 2022-08-23 10:10:16 [INFO] [TRAIN] epoch: 119, iter: 150250/160000, loss: 0.4449, lr: 0.000074, batch_cost: 0.1738, reader_cost: 0.00074, ips: 46.0409 samples/sec | ETA 00:28:14 2022-08-23 10:10:26 [INFO] [TRAIN] epoch: 120, iter: 150300/160000, loss: 0.5094, lr: 0.000073, batch_cost: 0.1991, reader_cost: 0.04117, ips: 40.1708 samples/sec | ETA 00:32:11 2022-08-23 10:10:34 [INFO] [TRAIN] epoch: 120, iter: 150350/160000, loss: 0.4802, lr: 0.000073, batch_cost: 0.1736, reader_cost: 0.00052, ips: 46.0700 samples/sec | ETA 00:27:55 2022-08-23 10:10:44 [INFO] [TRAIN] epoch: 120, iter: 150400/160000, loss: 0.4619, lr: 0.000073, batch_cost: 0.2025, reader_cost: 0.00069, ips: 39.5079 samples/sec | ETA 00:32:23 2022-08-23 10:10:53 [INFO] [TRAIN] epoch: 120, iter: 150450/160000, loss: 0.4827, lr: 0.000072, batch_cost: 0.1818, reader_cost: 0.00051, ips: 44.0040 samples/sec | ETA 00:28:56 2022-08-23 10:11:02 [INFO] [TRAIN] epoch: 120, iter: 150500/160000, loss: 0.5053, lr: 0.000072, batch_cost: 0.1641, reader_cost: 0.00037, ips: 48.7506 samples/sec | ETA 00:25:58 2022-08-23 10:11:10 [INFO] [TRAIN] epoch: 120, iter: 150550/160000, loss: 0.4718, lr: 0.000072, batch_cost: 0.1656, reader_cost: 0.00053, ips: 48.3060 samples/sec | ETA 00:26:05 2022-08-23 10:11:20 [INFO] [TRAIN] epoch: 120, iter: 150600/160000, loss: 0.4524, lr: 0.000071, batch_cost: 0.2059, reader_cost: 0.00042, ips: 38.8482 samples/sec | ETA 00:32:15 2022-08-23 10:11:30 [INFO] [TRAIN] epoch: 120, iter: 150650/160000, loss: 0.4798, lr: 0.000071, batch_cost: 0.1953, reader_cost: 0.00077, ips: 40.9557 samples/sec | ETA 00:30:26 2022-08-23 10:11:40 [INFO] [TRAIN] epoch: 120, iter: 150700/160000, loss: 0.5115, lr: 0.000070, batch_cost: 0.2077, reader_cost: 0.00068, ips: 38.5087 samples/sec | ETA 00:32:12 2022-08-23 10:11:52 [INFO] [TRAIN] epoch: 120, iter: 150750/160000, loss: 0.4659, lr: 0.000070, batch_cost: 0.2386, reader_cost: 0.00055, ips: 33.5260 samples/sec | ETA 00:36:47 2022-08-23 10:12:03 [INFO] [TRAIN] epoch: 120, iter: 150800/160000, loss: 0.4662, lr: 0.000070, batch_cost: 0.2210, reader_cost: 0.00067, ips: 36.1991 samples/sec | ETA 00:33:53 2022-08-23 10:12:15 [INFO] [TRAIN] epoch: 120, iter: 150850/160000, loss: 0.4435, lr: 0.000069, batch_cost: 0.2372, reader_cost: 0.00099, ips: 33.7256 samples/sec | ETA 00:36:10 2022-08-23 10:12:26 [INFO] [TRAIN] epoch: 120, iter: 
150900/160000, loss: 0.4858, lr: 0.000069, batch_cost: 0.2245, reader_cost: 0.00055, ips: 35.6379 samples/sec | ETA 00:34:02 2022-08-23 10:12:38 [INFO] [TRAIN] epoch: 120, iter: 150950/160000, loss: 0.4782, lr: 0.000069, batch_cost: 0.2236, reader_cost: 0.00088, ips: 35.7811 samples/sec | ETA 00:33:43 2022-08-23 10:12:48 [INFO] [TRAIN] epoch: 120, iter: 151000/160000, loss: 0.4822, lr: 0.000068, batch_cost: 0.2163, reader_cost: 0.00070, ips: 36.9918 samples/sec | ETA 00:32:26 2022-08-23 10:12:48 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 182s - batch_cost: 0.1817 - reader cost: 0.0010 2022-08-23 10:15:50 [INFO] [EVAL] #Images: 2000 mIoU: 0.3547 Acc: 0.7718 Kappa: 0.7544 Dice: 0.4875 2022-08-23 10:15:50 [INFO] [EVAL] Class IoU: [0.6958 0.7792 0.9318 0.7333 0.6844 0.763 0.7863 0.7944 0.5364 0.6351 0.4936 0.5667 0.7064 0.3135 0.317 0.434 0.5166 0.4477 0.6251 0.425 0.7427 0.5284 0.6433 0.4973 0.3369 0.3633 0.474 0.4513 0.4262 0.2886 0.2841 0.4921 0.2462 0.3563 0.3121 0.3935 0.4361 0.5415 0.2653 0.4047 0.2139 0.0903 0.3467 0.262 0.2897 0.2034 0.3038 0.5267 0.6032 0.5504 0.5886 0.4352 0.2172 0.1924 0.6866 0.4498 0.8711 0.4241 0.4424 0.2463 0.0681 0.3817 0.3179 0.24 0.4665 0.7069 0.3186 0.3877 0.1201 0.3114 0.5003 0.5352 0.4158 0.1883 0.4581 0.3105 0.5859 0.2991 0.2801 0.1646 0.6249 0.4121 0.3938 0.0377 0.1992 0.5695 0.1053 0.1023 0.2432 0.5147 0.433 0.073 0.2248 0.1128 0.0007 0.0197 0.0296 0.1623 0.2831 0.4186 0.1321 0.0587 0.2895 0.2527 0.0037 0.5846 0.0542 0.5192 0.1052 0.2708 0.1362 0.4027 0.1535 0.5039 0.7982 0.0213 0.3233 0.6386 0.0977 0.1446 0.4804 0.0163 0.2071 0.1528 0.2701 0.3155 0.4527 0.4121 0.3874 0.3388 0.5828 0.083 0.3335 0.3096 0.1754 0.1224 0.1603 0.0326 0.2167 0.3648 0.3001 0.007 0.3346 0.0796 0.2197 0.0037 0.3785 0.0297 0.1771 0.1422] 2022-08-23 10:15:50 [INFO] [EVAL] Class Precision: [0.7917 0.8567 0.9656 0.825 0.7717 0.8619 0.8822 0.8582 0.6846 0.7691 0.6965 0.7203 0.7769 0.5112 0.5532 0.5857 0.7093 0.6868 0.7649 0.5999 0.8273 0.6742 0.7647 0.6395 0.5311 0.5282 0.6428 0.7979 0.662 0.4176 0.4449 0.6688 0.4542 0.4657 0.4734 0.5807 0.6207 0.7607 0.4049 0.6705 0.3825 0.2285 0.6048 0.5093 0.3951 0.4202 0.4723 0.6988 0.7847 0.6425 0.7827 0.6252 0.3663 0.4676 0.7101 0.6464 0.9206 0.6848 0.7128 0.3969 0.1283 0.5825 0.4356 0.7037 0.568 0.8026 0.4615 0.5345 0.3117 0.5247 0.6835 0.6731 0.6494 0.2659 0.699 0.51 0.729 0.5751 0.7226 0.4507 0.7141 0.6703 0.7841 0.0982 0.4191 0.7617 0.3235 0.4025 0.4525 0.7698 0.5863 0.1012 0.4007 0.3678 0.0028 0.0458 0.3506 0.3898 0.6223 0.6811 0.412 0.2277 0.6022 0.7808 0.339 0.6238 0.2166 0.7824 0.2572 0.5679 0.32 0.5351 0.4399 0.7932 0.803 0.1568 0.6872 0.7722 0.1356 0.5466 0.6406 0.1903 0.7077 0.6339 0.6522 0.6248 0.7948 0.554 0.8095 0.574 0.6498 0.3631 0.4684 0.6544 0.5418 0.2819 0.3847 0.0658 0.4715 0.6133 0.3532 0.0105 0.6205 0.3733 0.4804 0.0061 0.8385 0.4128 0.6286 0.7971] 2022-08-23 10:15:50 [INFO] [EVAL] Class Recall: [0.8517 0.896 0.9638 0.8684 0.8581 0.8692 0.8785 0.9144 0.7124 0.7847 0.6288 0.7267 0.8861 0.4476 0.4261 0.6263 0.6553 0.5626 0.7738 0.593 0.879 0.7096 0.8022 0.691 0.4795 0.5378 0.6434 0.5096 0.5448 0.483 0.44 0.6507 0.3497 0.6026 0.4782 0.5496 0.5945 0.6527 0.4348 0.5052 0.3267 0.1298 0.4482 0.3504 0.5205 0.2827 0.4599 0.6815 0.7228 0.7935 0.7036 0.5887 0.3478 0.2464 0.9541 0.5965 0.9419 0.527 0.5383 0.3938 0.1269 0.5255 0.5405 0.2669 0.7231 0.8557 0.5072 0.5852 0.1635 0.4338 0.6511 0.723 0.5363 0.3921 0.5706 0.4425 0.749 0.3839 0.3138 0.2059 0.8335 0.5169 0.4417 0.0576 0.2751 0.693 0.1351 
0.1206 0.3446 0.6084 0.6235 0.2072 0.3386 0.1399 0.001 0.0335 0.0313 0.2176 0.3418 0.5207 0.1628 0.0733 0.358 0.272 0.0037 0.9029 0.0673 0.6068 0.1512 0.341 0.1917 0.6194 0.1908 0.5802 0.9925 0.0241 0.3791 0.7869 0.259 0.1643 0.6576 0.0175 0.2265 0.1676 0.3156 0.3892 0.5126 0.6168 0.4263 0.4526 0.8497 0.0972 0.5366 0.3701 0.2059 0.1778 0.2156 0.0608 0.2863 0.4737 0.6662 0.0204 0.4207 0.0918 0.2881 0.0093 0.4082 0.031 0.1978 0.1475] 2022-08-23 10:15:51 [INFO] [EVAL] The model with the best validation mIoU (0.3561) was saved at iter 148000. 2022-08-23 10:15:59 [INFO] [TRAIN] epoch: 120, iter: 151050/160000, loss: 0.4992, lr: 0.000068, batch_cost: 0.1767, reader_cost: 0.00318, ips: 45.2683 samples/sec | ETA 00:26:21 2022-08-23 10:16:08 [INFO] [TRAIN] epoch: 120, iter: 151100/160000, loss: 0.5026, lr: 0.000067, batch_cost: 0.1699, reader_cost: 0.00068, ips: 47.0844 samples/sec | ETA 00:25:12 2022-08-23 10:16:17 [INFO] [TRAIN] epoch: 120, iter: 151150/160000, loss: 0.4454, lr: 0.000067, batch_cost: 0.1778, reader_cost: 0.00093, ips: 44.9847 samples/sec | ETA 00:26:13 2022-08-23 10:16:26 [INFO] [TRAIN] epoch: 120, iter: 151200/160000, loss: 0.4887, lr: 0.000067, batch_cost: 0.1897, reader_cost: 0.00034, ips: 42.1690 samples/sec | ETA 00:27:49 2022-08-23 10:16:35 [INFO] [TRAIN] epoch: 120, iter: 151250/160000, loss: 0.4677, lr: 0.000066, batch_cost: 0.1803, reader_cost: 0.00056, ips: 44.3756 samples/sec | ETA 00:26:17 2022-08-23 10:16:45 [INFO] [TRAIN] epoch: 120, iter: 151300/160000, loss: 0.4984, lr: 0.000066, batch_cost: 0.1936, reader_cost: 0.00050, ips: 41.3286 samples/sec | ETA 00:28:04 2022-08-23 10:16:54 [INFO] [TRAIN] epoch: 120, iter: 151350/160000, loss: 0.4391, lr: 0.000065, batch_cost: 0.1738, reader_cost: 0.00447, ips: 46.0412 samples/sec | ETA 00:25:03 2022-08-23 10:17:02 [INFO] [TRAIN] epoch: 120, iter: 151400/160000, loss: 0.4750, lr: 0.000065, batch_cost: 0.1771, reader_cost: 0.00068, ips: 45.1725 samples/sec | ETA 00:25:23 2022-08-23 10:17:11 [INFO] [TRAIN] epoch: 120, iter: 151450/160000, loss: 0.4898, lr: 0.000065, batch_cost: 0.1785, reader_cost: 0.00056, ips: 44.8065 samples/sec | ETA 00:25:26 2022-08-23 10:17:22 [INFO] [TRAIN] epoch: 120, iter: 151500/160000, loss: 0.4632, lr: 0.000064, batch_cost: 0.2122, reader_cost: 0.00057, ips: 37.7076 samples/sec | ETA 00:30:03 2022-08-23 10:17:32 [INFO] [TRAIN] epoch: 120, iter: 151550/160000, loss: 0.4690, lr: 0.000064, batch_cost: 0.2027, reader_cost: 0.00062, ips: 39.4736 samples/sec | ETA 00:28:32 2022-08-23 10:17:46 [INFO] [TRAIN] epoch: 121, iter: 151600/160000, loss: 0.4929, lr: 0.000064, batch_cost: 0.2824, reader_cost: 0.08278, ips: 28.3238 samples/sec | ETA 00:39:32 2022-08-23 10:17:58 [INFO] [TRAIN] epoch: 121, iter: 151650/160000, loss: 0.5155, lr: 0.000063, batch_cost: 0.2308, reader_cost: 0.00056, ips: 34.6686 samples/sec | ETA 00:32:06 2022-08-23 10:18:09 [INFO] [TRAIN] epoch: 121, iter: 151700/160000, loss: 0.4456, lr: 0.000063, batch_cost: 0.2324, reader_cost: 0.00067, ips: 34.4241 samples/sec | ETA 00:32:08 2022-08-23 10:18:19 [INFO] [TRAIN] epoch: 121, iter: 151750/160000, loss: 0.4852, lr: 0.000062, batch_cost: 0.1904, reader_cost: 0.00049, ips: 42.0072 samples/sec | ETA 00:26:11 2022-08-23 10:18:29 [INFO] [TRAIN] epoch: 121, iter: 151800/160000, loss: 0.4847, lr: 0.000062, batch_cost: 0.1943, reader_cost: 0.00089, ips: 41.1812 samples/sec | ETA 00:26:32 2022-08-23 10:18:41 [INFO] [TRAIN] epoch: 121, iter: 151850/160000, loss: 0.4836, lr: 0.000062, batch_cost: 0.2431, reader_cost: 0.00074, ips: 32.9099 
samples/sec | ETA 00:33:01 2022-08-23 10:18:54 [INFO] [TRAIN] epoch: 121, iter: 151900/160000, loss: 0.4628, lr: 0.000061, batch_cost: 0.2560, reader_cost: 0.00080, ips: 31.2561 samples/sec | ETA 00:34:33 2022-08-23 10:19:05 [INFO] [TRAIN] epoch: 121, iter: 151950/160000, loss: 0.4626, lr: 0.000061, batch_cost: 0.2259, reader_cost: 0.00034, ips: 35.4118 samples/sec | ETA 00:30:18 2022-08-23 10:19:16 [INFO] [TRAIN] epoch: 121, iter: 152000/160000, loss: 0.4646, lr: 0.000061, batch_cost: 0.2256, reader_cost: 0.00083, ips: 35.4590 samples/sec | ETA 00:30:04 2022-08-23 10:19:16 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 213s - batch_cost: 0.2127 - reader cost: 0.0011 2022-08-23 10:22:49 [INFO] [EVAL] #Images: 2000 mIoU: 0.3532 Acc: 0.7706 Kappa: 0.7531 Dice: 0.4854 2022-08-23 10:22:49 [INFO] [EVAL] Class IoU: [0.6945 0.7785 0.9321 0.7293 0.6828 0.7681 0.7891 0.7912 0.5334 0.6208 0.4961 0.5567 0.7046 0.3072 0.332 0.4313 0.5182 0.4324 0.6266 0.4259 0.7411 0.5128 0.6428 0.496 0.3151 0.3718 0.4616 0.4554 0.4095 0.2788 0.2662 0.4993 0.255 0.3611 0.3023 0.3958 0.4339 0.5395 0.2659 0.4162 0.224 0.0905 0.349 0.2638 0.2829 0.1883 0.2931 0.5164 0.6153 0.5427 0.591 0.3816 0.2251 0.2323 0.6793 0.4435 0.8745 0.4395 0.4804 0.2463 0.0623 0.443 0.3194 0.2189 0.4799 0.693 0.3109 0.4028 0.1132 0.3096 0.5153 0.5201 0.406 0.1824 0.4526 0.3502 0.5641 0.2888 0.2732 0.1835 0.6072 0.4187 0.3975 0.0303 0.1541 0.5665 0.0956 0.0987 0.2591 0.496 0.4304 0.0489 0.2234 0.1047 0.0009 0.0174 0.029 0.1649 0.2846 0.4109 0.1007 0.0773 0.2658 0.2026 0.0034 0.5942 0.0611 0.5318 0.1016 0.2501 0.1366 0.4115 0.1577 0.5463 0.8102 0.025 0.3479 0.6473 0.1165 0.1344 0.4528 0.0131 0.2143 0.1554 0.2644 0.2915 0.4637 0.4241 0.4027 0.3394 0.5633 0.0836 0.3089 0.2893 0.1897 0.1211 0.1574 0.0309 0.1974 0.3615 0.2832 0.0041 0.3483 0.0595 0.2163 0.0066 0.3768 0.0375 0.1957 0.1446] 2022-08-23 10:22:49 [INFO] [EVAL] Class Precision: [0.7938 0.857 0.9651 0.8208 0.7656 0.8545 0.8722 0.8573 0.6923 0.7711 0.6956 0.7409 0.7793 0.4849 0.5342 0.5874 0.7188 0.6908 0.7742 0.6416 0.8244 0.6594 0.7608 0.6132 0.5304 0.5411 0.6107 0.7957 0.6633 0.3917 0.5037 0.6563 0.473 0.4822 0.4904 0.5466 0.6181 0.7656 0.4205 0.6465 0.4023 0.2465 0.5798 0.5478 0.3961 0.4025 0.4595 0.6959 0.7851 0.6265 0.7546 0.5094 0.3912 0.5206 0.7001 0.5958 0.9311 0.6896 0.6526 0.4275 0.1164 0.6305 0.4756 0.6931 0.5948 0.7857 0.4567 0.5376 0.2788 0.5489 0.6816 0.6955 0.6399 0.2603 0.7054 0.5279 0.7259 0.5176 0.7326 0.4643 0.6845 0.6417 0.7733 0.0795 0.3487 0.7563 0.3299 0.3993 0.4859 0.7687 0.6149 0.0625 0.4118 0.3648 0.0022 0.0434 0.3397 0.3947 0.4778 0.6685 0.3744 0.2298 0.632 0.7033 0.4574 0.6323 0.2256 0.7771 0.2864 0.5553 0.3121 0.5661 0.4637 0.7484 0.8154 0.1746 0.745 0.7869 0.1488 0.5616 0.6146 0.216 0.5979 0.5968 0.656 0.6445 0.8048 0.5822 0.8046 0.5309 0.6199 0.2737 0.4326 0.6764 0.5441 0.2597 0.3537 0.073 0.4475 0.6386 0.3318 0.0061 0.6029 0.3119 0.501 0.0087 0.8523 0.374 0.619 0.7713] 2022-08-23 10:22:49 [INFO] [EVAL] Class Recall: [0.8474 0.8947 0.9646 0.8675 0.8633 0.8836 0.8922 0.9113 0.699 0.7611 0.6336 0.6914 0.8803 0.456 0.4673 0.6188 0.65 0.5362 0.7667 0.5588 0.88 0.6975 0.8057 0.7218 0.437 0.5431 0.6542 0.5157 0.517 0.4919 0.3609 0.676 0.3563 0.5899 0.4408 0.5892 0.5927 0.6463 0.4197 0.5388 0.3357 0.1251 0.4671 0.3372 0.4975 0.2613 0.4473 0.6669 0.7399 0.8023 0.7317 0.6035 0.3465 0.2954 0.9581 0.6345 0.9349 0.5479 0.6455 0.3675 0.1183 0.5984 0.4931 0.2424 0.713 0.8545 0.4934 0.6162 0.1601 0.4153 0.6787 0.6735 0.5262 0.3788 0.5581 
0.51 0.7167 0.3952 0.3035 0.2328 0.8432 0.5464 0.4499 0.0467 0.2164 0.6931 0.1186 0.1159 0.357 0.583 0.5893 0.183 0.3282 0.1281 0.0015 0.0281 0.0307 0.2207 0.4131 0.516 0.1211 0.1044 0.3145 0.2215 0.0034 0.9079 0.0773 0.6275 0.1361 0.3127 0.1955 0.6011 0.1929 0.6692 0.9922 0.0283 0.395 0.7849 0.3496 0.1502 0.6323 0.0137 0.2504 0.1737 0.3069 0.3473 0.5224 0.6097 0.4464 0.4848 0.8604 0.1074 0.5194 0.3358 0.2256 0.1849 0.2208 0.0509 0.2609 0.4544 0.659 0.0124 0.452 0.0685 0.2757 0.0266 0.4032 0.0401 0.2225 0.151 ] 2022-08-23 10:22:49 [INFO] [EVAL] The model with the best validation mIoU (0.3561) was saved at iter 148000. 2022-08-23 10:22:59 [INFO] [TRAIN] epoch: 121, iter: 152050/160000, loss: 0.4625, lr: 0.000060, batch_cost: 0.1964, reader_cost: 0.00294, ips: 40.7243 samples/sec | ETA 00:26:01 2022-08-23 10:23:08 [INFO] [TRAIN] epoch: 121, iter: 152100/160000, loss: 0.4998, lr: 0.000060, batch_cost: 0.1812, reader_cost: 0.00150, ips: 44.1466 samples/sec | ETA 00:23:51 2022-08-23 10:23:18 [INFO] [TRAIN] epoch: 121, iter: 152150/160000, loss: 0.4864, lr: 0.000059, batch_cost: 0.1940, reader_cost: 0.00075, ips: 41.2344 samples/sec | ETA 00:25:22 2022-08-23 10:23:27 [INFO] [TRAIN] epoch: 121, iter: 152200/160000, loss: 0.4877, lr: 0.000059, batch_cost: 0.1786, reader_cost: 0.00054, ips: 44.7912 samples/sec | ETA 00:23:13 2022-08-23 10:23:36 [INFO] [TRAIN] epoch: 121, iter: 152250/160000, loss: 0.5142, lr: 0.000059, batch_cost: 0.1884, reader_cost: 0.00252, ips: 42.4650 samples/sec | ETA 00:24:20 2022-08-23 10:23:47 [INFO] [TRAIN] epoch: 121, iter: 152300/160000, loss: 0.4984, lr: 0.000058, batch_cost: 0.2079, reader_cost: 0.00058, ips: 38.4770 samples/sec | ETA 00:26:40 2022-08-23 10:23:57 [INFO] [TRAIN] epoch: 121, iter: 152350/160000, loss: 0.4711, lr: 0.000058, batch_cost: 0.2010, reader_cost: 0.00062, ips: 39.8059 samples/sec | ETA 00:25:37 2022-08-23 10:24:07 [INFO] [TRAIN] epoch: 121, iter: 152400/160000, loss: 0.4509, lr: 0.000058, batch_cost: 0.2021, reader_cost: 0.00112, ips: 39.5874 samples/sec | ETA 00:25:35 2022-08-23 10:24:16 [INFO] [TRAIN] epoch: 121, iter: 152450/160000, loss: 0.4496, lr: 0.000057, batch_cost: 0.1828, reader_cost: 0.00047, ips: 43.7543 samples/sec | ETA 00:23:00 2022-08-23 10:24:25 [INFO] [TRAIN] epoch: 121, iter: 152500/160000, loss: 0.4516, lr: 0.000057, batch_cost: 0.1892, reader_cost: 0.00057, ips: 42.2807 samples/sec | ETA 00:23:39 2022-08-23 10:24:35 [INFO] [TRAIN] epoch: 121, iter: 152550/160000, loss: 0.4382, lr: 0.000056, batch_cost: 0.2002, reader_cost: 0.00073, ips: 39.9504 samples/sec | ETA 00:24:51 2022-08-23 10:24:45 [INFO] [TRAIN] epoch: 121, iter: 152600/160000, loss: 0.4710, lr: 0.000056, batch_cost: 0.2018, reader_cost: 0.00036, ips: 39.6494 samples/sec | ETA 00:24:53 2022-08-23 10:24:55 [INFO] [TRAIN] epoch: 121, iter: 152650/160000, loss: 0.4510, lr: 0.000056, batch_cost: 0.1919, reader_cost: 0.00079, ips: 41.6924 samples/sec | ETA 00:23:30 2022-08-23 10:25:04 [INFO] [TRAIN] epoch: 121, iter: 152700/160000, loss: 0.4879, lr: 0.000055, batch_cost: 0.1824, reader_cost: 0.00054, ips: 43.8637 samples/sec | ETA 00:22:11 2022-08-23 10:25:14 [INFO] [TRAIN] epoch: 121, iter: 152750/160000, loss: 0.4595, lr: 0.000055, batch_cost: 0.2039, reader_cost: 0.00065, ips: 39.2307 samples/sec | ETA 00:24:38 2022-08-23 10:25:24 [INFO] [TRAIN] epoch: 121, iter: 152800/160000, loss: 0.4612, lr: 0.000055, batch_cost: 0.1947, reader_cost: 0.00048, ips: 41.0787 samples/sec | ETA 00:23:22 2022-08-23 10:25:38 [INFO] [TRAIN] epoch: 122, iter: 152850/160000, loss: 
0.4704, lr: 0.000054, batch_cost: 0.2714, reader_cost: 0.04223, ips: 29.4778 samples/sec | ETA 00:32:20 2022-08-23 10:25:49 [INFO] [TRAIN] epoch: 122, iter: 152900/160000, loss: 0.4641, lr: 0.000054, batch_cost: 0.2349, reader_cost: 0.00076, ips: 34.0559 samples/sec | ETA 00:27:47 2022-08-23 10:26:00 [INFO] [TRAIN] epoch: 122, iter: 152950/160000, loss: 0.4730, lr: 0.000053, batch_cost: 0.2171, reader_cost: 0.00063, ips: 36.8499 samples/sec | ETA 00:25:30 2022-08-23 10:26:10 [INFO] [TRAIN] epoch: 122, iter: 153000/160000, loss: 0.4686, lr: 0.000053, batch_cost: 0.1919, reader_cost: 0.00036, ips: 41.6977 samples/sec | ETA 00:22:22 2022-08-23 10:26:10 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 167s - batch_cost: 0.1668 - reader cost: 9.0676e-04 2022-08-23 10:28:57 [INFO] [EVAL] #Images: 2000 mIoU: 0.3529 Acc: 0.7717 Kappa: 0.7541 Dice: 0.4845 2022-08-23 10:28:57 [INFO] [EVAL] Class IoU: [0.6942 0.78 0.9326 0.7328 0.6822 0.7665 0.786 0.7885 0.5359 0.6343 0.4935 0.5579 0.7089 0.2999 0.3126 0.4356 0.5216 0.442 0.6285 0.4207 0.7454 0.525 0.6393 0.503 0.3236 0.3337 0.488 0.4604 0.4167 0.3008 0.2794 0.497 0.2527 0.3668 0.3087 0.3961 0.4347 0.536 0.2643 0.3956 0.2301 0.0867 0.3514 0.264 0.2709 0.1926 0.3049 0.5192 0.59 0.5483 0.5915 0.4573 0.2258 0.2056 0.6874 0.4563 0.8734 0.4439 0.4754 0.2373 0.0638 0.398 0.3302 0.2097 0.4663 0.6897 0.3099 0.396 0.1272 0.3024 0.5053 0.5289 0.4248 0.2015 0.4539 0.335 0.5883 0.2951 0.2854 0.1621 0.6145 0.4116 0.3586 0.0385 0.0783 0.566 0.1014 0.0965 0.2487 0.5064 0.4376 0.0553 0.2078 0.1051 0.0008 0.0189 0.0308 0.1708 0.2893 0.3838 0.104 0.0585 0.2893 0.1505 0.0066 0.5709 0.0644 0.5165 0.1153 0.2374 0.1468 0.4063 0.1535 0.5658 0.8156 0.0188 0.3315 0.6298 0.09 0.1018 0.4711 0.0159 0.2114 0.1667 0.2651 0.2971 0.4604 0.4299 0.4658 0.3501 0.5638 0.0728 0.3223 0.2898 0.1988 0.1261 0.1603 0.0343 0.1964 0.3543 0.3444 0.0034 0.3367 0.0649 0.1829 0.0051 0.3782 0.0332 0.2048 0.134 ] 2022-08-23 10:28:57 [INFO] [EVAL] Class Precision: [0.7842 0.8553 0.9654 0.8218 0.7608 0.8726 0.87 0.8497 0.6672 0.7724 0.7061 0.7339 0.7838 0.4991 0.5658 0.6024 0.6999 0.6921 0.7843 0.6455 0.8356 0.6802 0.7689 0.6281 0.5152 0.5438 0.659 0.742 0.6573 0.4396 0.4586 0.6418 0.4835 0.5033 0.4762 0.5581 0.6439 0.7542 0.4276 0.6795 0.434 0.2322 0.5914 0.5517 0.3842 0.4329 0.5082 0.6926 0.7498 0.6382 0.7842 0.6563 0.3963 0.5255 0.7094 0.6269 0.9329 0.6592 0.7438 0.4014 0.1097 0.5411 0.5057 0.7171 0.5669 0.7738 0.4565 0.5411 0.3182 0.5435 0.6844 0.6686 0.6506 0.2817 0.683 0.5006 0.7609 0.5474 0.7826 0.4266 0.6998 0.6339 0.8213 0.0961 0.2177 0.7593 0.3322 0.3979 0.5039 0.7402 0.613 0.0702 0.3527 0.3776 0.0023 0.0453 0.3641 0.3984 0.4806 0.6711 0.371 0.1767 0.5898 0.6677 0.3331 0.611 0.2379 0.7998 0.2507 0.5559 0.3138 0.5563 0.4715 0.7922 0.8209 0.1292 0.6544 0.7495 0.1235 0.5892 0.6279 0.2188 0.667 0.6037 0.6538 0.6324 0.7966 0.5975 0.835 0.5979 0.6218 0.3155 0.4264 0.6691 0.5208 0.257 0.3915 0.0741 0.4431 0.6278 0.4503 0.0052 0.6324 0.3889 0.5507 0.0063 0.8005 0.4187 0.6173 0.7348] 2022-08-23 10:28:57 [INFO] [EVAL] Class Recall: [0.8581 0.8986 0.9649 0.8712 0.8685 0.8631 0.8906 0.9162 0.7313 0.7801 0.621 0.6993 0.8812 0.4291 0.4113 0.6115 0.6719 0.5502 0.7598 0.5472 0.8735 0.6971 0.7914 0.7164 0.4652 0.4635 0.6529 0.5481 0.5324 0.4878 0.4169 0.6878 0.3461 0.575 0.4674 0.5771 0.5722 0.6495 0.4091 0.4863 0.3288 0.1216 0.464 0.336 0.4788 0.2576 0.4326 0.6747 0.7345 0.7955 0.7064 0.6014 0.3442 0.2524 0.9569 0.6264 0.9319 0.5762 0.5685 0.3674 0.1324 0.6007 0.4876 0.2286 
0.7243 0.8639 0.4911 0.5962 0.1748 0.4053 0.6588 0.7168 0.5504 0.4143 0.575 0.5031 0.7218 0.3904 0.31 0.2073 0.8344 0.5399 0.389 0.0604 0.109 0.6897 0.1273 0.113 0.3293 0.6158 0.6046 0.2067 0.3359 0.1271 0.0012 0.0313 0.0326 0.2302 0.4209 0.4727 0.1263 0.0804 0.3621 0.1627 0.0067 0.897 0.0812 0.5932 0.1759 0.2929 0.2161 0.6011 0.1854 0.6644 0.992 0.0216 0.4019 0.7978 0.249 0.1096 0.6535 0.0168 0.2363 0.1872 0.3084 0.3591 0.5217 0.6052 0.513 0.4578 0.8581 0.0865 0.5691 0.3383 0.2433 0.1985 0.2134 0.0601 0.2609 0.4484 0.5941 0.0095 0.4186 0.0723 0.215 0.0277 0.4175 0.0348 0.2346 0.1408] 2022-08-23 10:28:57 [INFO] [EVAL] The model with the best validation mIoU (0.3561) was saved at iter 148000. 2022-08-23 10:29:07 [INFO] [TRAIN] epoch: 122, iter: 153050/160000, loss: 0.4665, lr: 0.000053, batch_cost: 0.1926, reader_cost: 0.00477, ips: 41.5439 samples/sec | ETA 00:22:18 2022-08-23 10:29:15 [INFO] [TRAIN] epoch: 122, iter: 153100/160000, loss: 0.4777, lr: 0.000052, batch_cost: 0.1756, reader_cost: 0.00085, ips: 45.5536 samples/sec | ETA 00:20:11 2022-08-23 10:29:26 [INFO] [TRAIN] epoch: 122, iter: 153150/160000, loss: 0.4577, lr: 0.000052, batch_cost: 0.2011, reader_cost: 0.00125, ips: 39.7730 samples/sec | ETA 00:22:57 2022-08-23 10:29:35 [INFO] [TRAIN] epoch: 122, iter: 153200/160000, loss: 0.4756, lr: 0.000051, batch_cost: 0.1916, reader_cost: 0.00048, ips: 41.7499 samples/sec | ETA 00:21:42 2022-08-23 10:29:45 [INFO] [TRAIN] epoch: 122, iter: 153250/160000, loss: 0.4653, lr: 0.000051, batch_cost: 0.1983, reader_cost: 0.00065, ips: 40.3354 samples/sec | ETA 00:22:18 2022-08-23 10:29:55 [INFO] [TRAIN] epoch: 122, iter: 153300/160000, loss: 0.4720, lr: 0.000051, batch_cost: 0.2074, reader_cost: 0.00062, ips: 38.5638 samples/sec | ETA 00:23:09 2022-08-23 10:30:05 [INFO] [TRAIN] epoch: 122, iter: 153350/160000, loss: 0.4537, lr: 0.000050, batch_cost: 0.1964, reader_cost: 0.00035, ips: 40.7261 samples/sec | ETA 00:21:46 2022-08-23 10:30:15 [INFO] [TRAIN] epoch: 122, iter: 153400/160000, loss: 0.4778, lr: 0.000050, batch_cost: 0.1912, reader_cost: 0.00100, ips: 41.8428 samples/sec | ETA 00:21:01 2022-08-23 10:30:25 [INFO] [TRAIN] epoch: 122, iter: 153450/160000, loss: 0.4429, lr: 0.000050, batch_cost: 0.1974, reader_cost: 0.00068, ips: 40.5302 samples/sec | ETA 00:21:32 2022-08-23 10:30:35 [INFO] [TRAIN] epoch: 122, iter: 153500/160000, loss: 0.4737, lr: 0.000049, batch_cost: 0.1983, reader_cost: 0.00071, ips: 40.3439 samples/sec | ETA 00:21:28 2022-08-23 10:30:44 [INFO] [TRAIN] epoch: 122, iter: 153550/160000, loss: 0.4994, lr: 0.000049, batch_cost: 0.1882, reader_cost: 0.00073, ips: 42.5165 samples/sec | ETA 00:20:13 2022-08-23 10:30:54 [INFO] [TRAIN] epoch: 122, iter: 153600/160000, loss: 0.4880, lr: 0.000048, batch_cost: 0.1930, reader_cost: 0.00077, ips: 41.4529 samples/sec | ETA 00:20:35 2022-08-23 10:31:04 [INFO] [TRAIN] epoch: 122, iter: 153650/160000, loss: 0.4818, lr: 0.000048, batch_cost: 0.1994, reader_cost: 0.00057, ips: 40.1147 samples/sec | ETA 00:21:06 2022-08-23 10:31:13 [INFO] [TRAIN] epoch: 122, iter: 153700/160000, loss: 0.4802, lr: 0.000048, batch_cost: 0.1947, reader_cost: 0.00054, ips: 41.0988 samples/sec | ETA 00:20:26 2022-08-23 10:31:22 [INFO] [TRAIN] epoch: 122, iter: 153750/160000, loss: 0.4388, lr: 0.000047, batch_cost: 0.1786, reader_cost: 0.00075, ips: 44.7917 samples/sec | ETA 00:18:36 2022-08-23 10:31:31 [INFO] [TRAIN] epoch: 122, iter: 153800/160000, loss: 0.5089, lr: 0.000047, batch_cost: 0.1798, reader_cost: 0.00038, ips: 44.4881 samples/sec | ETA 00:18:34 
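As a brief aside on how the quantities in these [TRAIN] records fit together: ips is reported in samples/sec, so ips × batch_cost recovers the per-process batch size (about 8 here), and the printed ETA matches the remaining iterations multiplied by the current batch_cost; likewise, the mIoU in each [EVAL] block is presumably just the mean of the per-class IoU vector printed beside it. The Python sketch below is illustrative only and is not code from the training script: it re-parses the iter-153800 line verbatim and checks those two relationships (the regex and variable names are ad hoc).

```python
import re
from datetime import timedelta

# One [TRAIN] record copied verbatim from the log above (iter 153800).
line = ("2022-08-23 10:31:31 [INFO] [TRAIN] epoch: 122, iter: 153800/160000, "
        "loss: 0.5089, lr: 0.000047, batch_cost: 0.1798, reader_cost: 0.00038, "
        "ips: 44.4881 samples/sec | ETA 00:18:34")

m = re.search(r"iter: (\d+)/(\d+).*?batch_cost: ([\d.]+).*?ips: ([\d.]+)", line)
it, total = int(m.group(1)), int(m.group(2))
batch_cost, ips = float(m.group(3)), float(m.group(4))

# ips is samples/sec, so ips * batch_cost gives the per-process batch size.
print(round(ips * batch_cost))                            # -> 8

# ETA is (remaining iterations) * (current batch_cost), truncated to seconds.
print(timedelta(seconds=int((total - it) * batch_cost)))  # -> 0:18:34
```

Running the sketch prints 8 and 0:18:34, matching the batch size and the ETA shown in the logged line.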
2022-08-23 10:31:40 [INFO] [TRAIN] epoch: 122, iter: 153850/160000, loss: 0.4967, lr: 0.000047, batch_cost: 0.1808, reader_cost: 0.00115, ips: 44.2372 samples/sec | ETA 00:18:32 2022-08-23 10:31:48 [INFO] [TRAIN] epoch: 122, iter: 153900/160000, loss: 0.4991, lr: 0.000046, batch_cost: 0.1612, reader_cost: 0.00080, ips: 49.6425 samples/sec | ETA 00:16:23 2022-08-23 10:31:57 [INFO] [TRAIN] epoch: 122, iter: 153950/160000, loss: 0.4560, lr: 0.000046, batch_cost: 0.1775, reader_cost: 0.00062, ips: 45.0713 samples/sec | ETA 00:17:53 2022-08-23 10:32:06 [INFO] [TRAIN] epoch: 122, iter: 154000/160000, loss: 0.4871, lr: 0.000045, batch_cost: 0.1694, reader_cost: 0.00082, ips: 47.2313 samples/sec | ETA 00:16:56 2022-08-23 10:32:06 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 167s - batch_cost: 0.1668 - reader cost: 5.6590e-04 2022-08-23 10:34:53 [INFO] [EVAL] #Images: 2000 mIoU: 0.3528 Acc: 0.7713 Kappa: 0.7538 Dice: 0.4851 2022-08-23 10:34:53 [INFO] [EVAL] Class IoU: [0.6955 0.7802 0.9327 0.7318 0.6794 0.7687 0.789 0.7893 0.535 0.6169 0.4952 0.5663 0.7071 0.3036 0.3198 0.4351 0.5255 0.4337 0.6288 0.4265 0.7427 0.515 0.64 0.5011 0.3199 0.4003 0.4857 0.4528 0.4168 0.2514 0.2846 0.5041 0.2606 0.3681 0.3009 0.3964 0.4336 0.5289 0.2653 0.4059 0.2499 0.0935 0.3397 0.2646 0.281 0.1917 0.3166 0.5117 0.616 0.5594 0.5943 0.4022 0.2212 0.2041 0.692 0.426 0.8736 0.4235 0.4817 0.2331 0.061 0.3654 0.3325 0.2048 0.4694 0.6816 0.3134 0.4016 0.1159 0.3012 0.5003 0.5377 0.4114 0.1922 0.4551 0.328 0.5813 0.2662 0.2802 0.1689 0.6154 0.4152 0.3847 0.029 0.1076 0.5611 0.1056 0.0973 0.2259 0.519 0.4399 0.0501 0.2195 0.1113 0.0007 0.0137 0.0304 0.1567 0.2981 0.4148 0.1142 0.0507 0.2752 0.2772 0.0038 0.5657 0.0603 0.527 0.1014 0.2581 0.1524 0.4 0.1607 0.5104 0.7681 0.0156 0.3265 0.6359 0.0997 0.1187 0.4658 0.0146 0.2154 0.1592 0.2654 0.2906 0.4551 0.4272 0.408 0.3524 0.5681 0.0768 0.338 0.2959 0.1958 0.1283 0.1647 0.0367 0.1935 0.3605 0.2979 0.0062 0.3307 0.1072 0.1872 0.0037 0.4032 0.0368 0.1836 0.1586] 2022-08-23 10:34:53 [INFO] [EVAL] Class Precision: [0.7933 0.8595 0.9661 0.8232 0.7564 0.8637 0.8744 0.8497 0.6692 0.7617 0.6832 0.7267 0.7849 0.5089 0.5587 0.604 0.7025 0.7041 0.7704 0.6156 0.8267 0.6803 0.7686 0.6342 0.5175 0.5558 0.6238 0.7459 0.6553 0.3755 0.4791 0.6506 0.4868 0.5043 0.4681 0.5621 0.6282 0.7544 0.4047 0.6503 0.4055 0.2346 0.6143 0.536 0.3977 0.4154 0.4947 0.696 0.7524 0.6739 0.7839 0.5716 0.3973 0.5299 0.7241 0.5936 0.9247 0.6823 0.7093 0.3903 0.1099 0.5994 0.5061 0.7008 0.5802 0.7603 0.4577 0.553 0.2776 0.5178 0.6397 0.7036 0.6361 0.2787 0.7375 0.5397 0.7316 0.5094 0.7973 0.4504 0.704 0.6452 0.7826 0.0851 0.2735 0.7637 0.3097 0.3933 0.4647 0.7598 0.6192 0.0625 0.3938 0.3797 0.0021 0.0365 0.27 0.3788 0.4724 0.6666 0.375 0.1703 0.6001 0.7544 0.6046 0.6026 0.2089 0.8196 0.2609 0.5799 0.3251 0.5476 0.4317 0.8057 0.7719 0.1199 0.6258 0.7453 0.1302 0.6115 0.6236 0.161 0.7162 0.5862 0.6688 0.6343 0.7794 0.6029 0.7767 0.5768 0.6294 0.2709 0.4733 0.6794 0.5336 0.2637 0.3945 0.0698 0.4411 0.6227 0.3582 0.0097 0.6082 0.4962 0.4639 0.0046 0.791 0.3724 0.6009 0.6803] 2022-08-23 10:34:53 [INFO] [EVAL] Class Recall: [0.8495 0.8942 0.9642 0.8682 0.8697 0.8748 0.8899 0.9174 0.7273 0.7644 0.6427 0.7196 0.8771 0.4295 0.4278 0.6089 0.6758 0.5304 0.7738 0.5813 0.8797 0.6794 0.7927 0.7048 0.4559 0.5885 0.6869 0.5354 0.5338 0.4319 0.4122 0.6912 0.3593 0.5767 0.4573 0.5735 0.5834 0.639 0.4351 0.5193 0.3943 0.1346 0.4319 0.3431 0.4892 0.2625 0.4679 0.659 0.7726 0.7671 0.7107 0.5758 0.333 
0.2493 0.9399 0.6015 0.9406 0.5275 0.6001 0.3666 0.1207 0.4835 0.4922 0.2244 0.7109 0.8681 0.4985 0.5946 0.1659 0.4186 0.6966 0.6951 0.5381 0.3823 0.5431 0.4553 0.7388 0.3581 0.3017 0.2128 0.8303 0.538 0.4308 0.042 0.1507 0.679 0.1381 0.1144 0.3053 0.6208 0.603 0.2018 0.3315 0.136 0.0011 0.0215 0.0332 0.211 0.4469 0.5233 0.1411 0.0674 0.337 0.3047 0.0039 0.9023 0.0782 0.5962 0.1422 0.3174 0.2229 0.5973 0.2039 0.582 0.9936 0.0176 0.4057 0.8126 0.2984 0.1284 0.648 0.0159 0.2355 0.1793 0.3056 0.3491 0.5223 0.5945 0.4622 0.4753 0.8536 0.0968 0.5418 0.3439 0.2363 0.1998 0.2204 0.0719 0.2563 0.4612 0.6388 0.017 0.4203 0.1203 0.2389 0.0192 0.4513 0.0393 0.2091 0.1714] 2022-08-23 10:34:53 [INFO] [EVAL] The model with the best validation mIoU (0.3561) was saved at iter 148000. 2022-08-23 10:35:03 [INFO] [TRAIN] epoch: 122, iter: 154050/160000, loss: 0.4714, lr: 0.000045, batch_cost: 0.1973, reader_cost: 0.00317, ips: 40.5425 samples/sec | ETA 00:19:34 2022-08-23 10:35:16 [INFO] [TRAIN] epoch: 123, iter: 154100/160000, loss: 0.4609, lr: 0.000045, batch_cost: 0.2578, reader_cost: 0.07616, ips: 31.0263 samples/sec | ETA 00:25:21 2022-08-23 10:35:26 [INFO] [TRAIN] epoch: 123, iter: 154150/160000, loss: 0.4455, lr: 0.000044, batch_cost: 0.2043, reader_cost: 0.00053, ips: 39.1647 samples/sec | ETA 00:19:54 2022-08-23 10:35:35 [INFO] [TRAIN] epoch: 123, iter: 154200/160000, loss: 0.5011, lr: 0.000044, batch_cost: 0.1770, reader_cost: 0.00051, ips: 45.1998 samples/sec | ETA 00:17:06 2022-08-23 10:35:44 [INFO] [TRAIN] epoch: 123, iter: 154250/160000, loss: 0.4549, lr: 0.000044, batch_cost: 0.1884, reader_cost: 0.00062, ips: 42.4696 samples/sec | ETA 00:18:03 2022-08-23 10:35:54 [INFO] [TRAIN] epoch: 123, iter: 154300/160000, loss: 0.4684, lr: 0.000043, batch_cost: 0.1954, reader_cost: 0.00088, ips: 40.9401 samples/sec | ETA 00:18:33 2022-08-23 10:36:04 [INFO] [TRAIN] epoch: 123, iter: 154350/160000, loss: 0.4879, lr: 0.000043, batch_cost: 0.2038, reader_cost: 0.00050, ips: 39.2580 samples/sec | ETA 00:19:11 2022-08-23 10:36:15 [INFO] [TRAIN] epoch: 123, iter: 154400/160000, loss: 0.4573, lr: 0.000042, batch_cost: 0.2152, reader_cost: 0.00140, ips: 37.1819 samples/sec | ETA 00:20:04 2022-08-23 10:36:25 [INFO] [TRAIN] epoch: 123, iter: 154450/160000, loss: 0.5233, lr: 0.000042, batch_cost: 0.2085, reader_cost: 0.00070, ips: 38.3620 samples/sec | ETA 00:19:17 2022-08-23 10:36:35 [INFO] [TRAIN] epoch: 123, iter: 154500/160000, loss: 0.4839, lr: 0.000042, batch_cost: 0.1991, reader_cost: 0.00066, ips: 40.1885 samples/sec | ETA 00:18:14 2022-08-23 10:36:46 [INFO] [TRAIN] epoch: 123, iter: 154550/160000, loss: 0.4609, lr: 0.000041, batch_cost: 0.2082, reader_cost: 0.00047, ips: 38.4304 samples/sec | ETA 00:18:54 2022-08-23 10:36:54 [INFO] [TRAIN] epoch: 123, iter: 154600/160000, loss: 0.4818, lr: 0.000041, batch_cost: 0.1775, reader_cost: 0.00048, ips: 45.0640 samples/sec | ETA 00:15:58 2022-08-23 10:37:05 [INFO] [TRAIN] epoch: 123, iter: 154650/160000, loss: 0.4664, lr: 0.000041, batch_cost: 0.2071, reader_cost: 0.00046, ips: 38.6243 samples/sec | ETA 00:18:28 2022-08-23 10:37:14 [INFO] [TRAIN] epoch: 123, iter: 154700/160000, loss: 0.4725, lr: 0.000040, batch_cost: 0.1907, reader_cost: 0.00042, ips: 41.9516 samples/sec | ETA 00:16:50 2022-08-23 10:37:24 [INFO] [TRAIN] epoch: 123, iter: 154750/160000, loss: 0.4579, lr: 0.000040, batch_cost: 0.1861, reader_cost: 0.00083, ips: 42.9763 samples/sec | ETA 00:16:17 2022-08-23 10:37:33 [INFO] [TRAIN] epoch: 123, iter: 154800/160000, loss: 0.4841, lr: 0.000039, 
batch_cost: 0.1831, reader_cost: 0.00079, ips: 43.6942 samples/sec | ETA 00:15:52 2022-08-23 10:37:43 [INFO] [TRAIN] epoch: 123, iter: 154850/160000, loss: 0.4973, lr: 0.000039, batch_cost: 0.1944, reader_cost: 0.00043, ips: 41.1617 samples/sec | ETA 00:16:40 2022-08-23 10:37:53 [INFO] [TRAIN] epoch: 123, iter: 154900/160000, loss: 0.4966, lr: 0.000039, batch_cost: 0.2085, reader_cost: 0.00082, ips: 38.3628 samples/sec | ETA 00:17:43 2022-08-23 10:38:04 [INFO] [TRAIN] epoch: 123, iter: 154950/160000, loss: 0.4796, lr: 0.000038, batch_cost: 0.2160, reader_cost: 0.00089, ips: 37.0394 samples/sec | ETA 00:18:10 2022-08-23 10:38:14 [INFO] [TRAIN] epoch: 123, iter: 155000/160000, loss: 0.4804, lr: 0.000038, batch_cost: 0.2003, reader_cost: 0.00083, ips: 39.9372 samples/sec | ETA 00:16:41 2022-08-23 10:38:14 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 174s - batch_cost: 0.1743 - reader cost: 7.5428e-04 2022-08-23 10:41:08 [INFO] [EVAL] #Images: 2000 mIoU: 0.3539 Acc: 0.7718 Kappa: 0.7543 Dice: 0.4861 2022-08-23 10:41:08 [INFO] [EVAL] Class IoU: [0.6942 0.7796 0.9328 0.7336 0.6834 0.766 0.7893 0.7872 0.536 0.6229 0.4931 0.555 0.7014 0.3109 0.32 0.4382 0.5193 0.4367 0.6258 0.4216 0.7483 0.549 0.6397 0.4987 0.3301 0.3957 0.5304 0.4696 0.406 0.2835 0.2775 0.4823 0.2658 0.3692 0.3064 0.3987 0.4333 0.5283 0.2558 0.4014 0.2507 0.0897 0.341 0.2636 0.2837 0.1896 0.3105 0.5199 0.566 0.5488 0.5903 0.387 0.2247 0.2025 0.681 0.4309 0.8698 0.4343 0.4875 0.2419 0.0621 0.4121 0.3262 0.2375 0.4853 0.6938 0.3044 0.3948 0.1166 0.3065 0.5171 0.5285 0.4107 0.195 0.4561 0.3391 0.595 0.2893 0.285 0.1517 0.6162 0.416 0.3751 0.0301 0.1206 0.5613 0.0997 0.0938 0.2409 0.5073 0.451 0.0576 0.2213 0.1171 0.0006 0.0132 0.0296 0.1452 0.2877 0.4078 0.1483 0.082 0.2949 0.1078 0.0017 0.5947 0.0594 0.5208 0.1055 0.2412 0.1396 0.4631 0.1501 0.505 0.7797 0.0163 0.3233 0.6371 0.1211 0.1556 0.4812 0.0123 0.2126 0.1719 0.2652 0.2939 0.4468 0.4268 0.4376 0.3499 0.5667 0.0741 0.3437 0.2847 0.2087 0.1272 0.1608 0.0329 0.2107 0.3565 0.3142 0.0033 0.3135 0.0759 0.1721 0.003 0.3695 0.0354 0.1906 0.1676] 2022-08-23 10:41:08 [INFO] [EVAL] Class Precision: [0.7855 0.8613 0.9668 0.8254 0.7636 0.8687 0.8664 0.8477 0.6788 0.7578 0.7084 0.7492 0.7733 0.4976 0.5643 0.5938 0.7032 0.7017 0.7534 0.6388 0.8373 0.6862 0.7776 0.6254 0.5098 0.5521 0.7321 0.7564 0.6754 0.4143 0.4559 0.6265 0.468 0.5061 0.4833 0.5548 0.637 0.7594 0.4406 0.6561 0.4073 0.2275 0.6286 0.5358 0.3877 0.4163 0.4981 0.7134 0.7352 0.6564 0.7776 0.5188 0.4076 0.5308 0.7084 0.6516 0.9204 0.6643 0.7387 0.4019 0.1063 0.6097 0.4923 0.6949 0.6184 0.7811 0.45 0.5335 0.3 0.5482 0.6676 0.687 0.6593 0.2797 0.7079 0.4984 0.7786 0.5379 0.7345 0.4448 0.6962 0.6416 0.8038 0.0832 0.2922 0.7651 0.3491 0.4112 0.4481 0.7476 0.6627 0.0731 0.3954 0.3839 0.0019 0.0362 0.3337 0.3819 0.5065 0.6894 0.3831 0.2175 0.5687 0.6029 0.2249 0.6398 0.2282 0.7969 0.2665 0.5712 0.33 0.6639 0.4396 0.8012 0.7845 0.0978 0.6786 0.7577 0.1579 0.5518 0.5963 0.1734 0.6178 0.5844 0.6594 0.6713 0.7987 0.5871 0.7726 0.5548 0.6237 0.3086 0.4656 0.6626 0.542 0.2664 0.3639 0.0621 0.4697 0.6241 0.3758 0.005 0.6235 0.3696 0.4664 0.0039 0.841 0.3988 0.6135 0.7444] 2022-08-23 10:41:08 [INFO] [EVAL] Class Recall: [0.8565 0.8916 0.9636 0.8684 0.8667 0.8663 0.8987 0.9168 0.7182 0.7778 0.6188 0.6817 0.8829 0.4532 0.425 0.6257 0.6651 0.5362 0.7869 0.5537 0.8756 0.733 0.783 0.7111 0.4835 0.5827 0.6581 0.5532 0.5044 0.4732 0.4149 0.677 0.3809 0.5772 0.4556 0.5863 0.5754 0.6345 0.3787 0.5083 0.3947 0.1291 
0.427 0.3417 0.514 0.2583 0.452 0.6572 0.711 0.77 0.7103 0.6037 0.3338 0.2467 0.9464 0.56 0.9405 0.5564 0.5891 0.378 0.13 0.5598 0.4915 0.2652 0.6928 0.8612 0.4848 0.603 0.1603 0.41 0.6964 0.6962 0.5213 0.3917 0.5619 0.5148 0.7162 0.385 0.3177 0.1871 0.8429 0.5419 0.4129 0.0451 0.1704 0.6782 0.1225 0.1083 0.3426 0.6121 0.5854 0.2138 0.3345 0.1442 0.001 0.0203 0.0314 0.1898 0.3997 0.4996 0.1948 0.1163 0.3798 0.1161 0.0017 0.8941 0.0744 0.6005 0.1487 0.2945 0.1948 0.6048 0.1855 0.5774 0.9922 0.0192 0.3817 0.8001 0.3423 0.1781 0.7137 0.013 0.2448 0.1958 0.3073 0.3433 0.5035 0.6099 0.5023 0.4864 0.8613 0.0889 0.5677 0.333 0.2534 0.1958 0.2237 0.0654 0.2764 0.454 0.6571 0.0095 0.3866 0.0872 0.2144 0.0131 0.3973 0.0374 0.2166 0.1778] 2022-08-23 10:41:08 [INFO] [EVAL] The model with the best validation mIoU (0.3561) was saved at iter 148000. 2022-08-23 10:41:18 [INFO] [TRAIN] epoch: 123, iter: 155050/160000, loss: 0.4708, lr: 0.000037, batch_cost: 0.1909, reader_cost: 0.00391, ips: 41.9046 samples/sec | ETA 00:15:45 2022-08-23 10:41:27 [INFO] [TRAIN] epoch: 123, iter: 155100/160000, loss: 0.4666, lr: 0.000037, batch_cost: 0.1885, reader_cost: 0.00134, ips: 42.4505 samples/sec | ETA 00:15:23 2022-08-23 10:41:37 [INFO] [TRAIN] epoch: 123, iter: 155150/160000, loss: 0.4892, lr: 0.000037, batch_cost: 0.1939, reader_cost: 0.00070, ips: 41.2533 samples/sec | ETA 00:15:40 2022-08-23 10:41:47 [INFO] [TRAIN] epoch: 123, iter: 155200/160000, loss: 0.4766, lr: 0.000036, batch_cost: 0.1921, reader_cost: 0.00075, ips: 41.6422 samples/sec | ETA 00:15:22 2022-08-23 10:41:56 [INFO] [TRAIN] epoch: 123, iter: 155250/160000, loss: 0.4469, lr: 0.000036, batch_cost: 0.1803, reader_cost: 0.00062, ips: 44.3782 samples/sec | ETA 00:14:16 2022-08-23 10:42:05 [INFO] [TRAIN] epoch: 123, iter: 155300/160000, loss: 0.4951, lr: 0.000036, batch_cost: 0.1934, reader_cost: 0.00074, ips: 41.3731 samples/sec | ETA 00:15:08 2022-08-23 10:42:19 [INFO] [TRAIN] epoch: 124, iter: 155350/160000, loss: 0.4802, lr: 0.000035, batch_cost: 0.2631, reader_cost: 0.06004, ips: 30.4027 samples/sec | ETA 00:20:23 2022-08-23 10:42:27 [INFO] [TRAIN] epoch: 124, iter: 155400/160000, loss: 0.4834, lr: 0.000035, batch_cost: 0.1781, reader_cost: 0.00367, ips: 44.9161 samples/sec | ETA 00:13:39 2022-08-23 10:42:38 [INFO] [TRAIN] epoch: 124, iter: 155450/160000, loss: 0.4735, lr: 0.000034, batch_cost: 0.2110, reader_cost: 0.00067, ips: 37.9149 samples/sec | ETA 00:16:00 2022-08-23 10:42:49 [INFO] [TRAIN] epoch: 124, iter: 155500/160000, loss: 0.4854, lr: 0.000034, batch_cost: 0.2106, reader_cost: 0.00071, ips: 37.9798 samples/sec | ETA 00:15:47 2022-08-23 10:42:59 [INFO] [TRAIN] epoch: 124, iter: 155550/160000, loss: 0.4581, lr: 0.000034, batch_cost: 0.2143, reader_cost: 0.00076, ips: 37.3267 samples/sec | ETA 00:15:53 2022-08-23 10:43:09 [INFO] [TRAIN] epoch: 124, iter: 155600/160000, loss: 0.4711, lr: 0.000033, batch_cost: 0.2012, reader_cost: 0.00064, ips: 39.7572 samples/sec | ETA 00:14:45 2022-08-23 10:43:18 [INFO] [TRAIN] epoch: 124, iter: 155650/160000, loss: 0.5015, lr: 0.000033, batch_cost: 0.1696, reader_cost: 0.00052, ips: 47.1607 samples/sec | ETA 00:12:17 2022-08-23 10:43:28 [INFO] [TRAIN] epoch: 124, iter: 155700/160000, loss: 0.4787, lr: 0.000033, batch_cost: 0.1952, reader_cost: 0.00037, ips: 40.9805 samples/sec | ETA 00:13:59 2022-08-23 10:43:37 [INFO] [TRAIN] epoch: 124, iter: 155750/160000, loss: 0.4818, lr: 0.000032, batch_cost: 0.1868, reader_cost: 0.00038, ips: 42.8370 samples/sec | ETA 00:13:13 2022-08-23 10:43:47 [INFO] [TRAIN] 
epoch: 124, iter: 155800/160000, loss: 0.4793, lr: 0.000032, batch_cost: 0.2043, reader_cost: 0.00044, ips: 39.1570 samples/sec | ETA 00:14:18 2022-08-23 10:43:57 [INFO] [TRAIN] epoch: 124, iter: 155850/160000, loss: 0.4679, lr: 0.000031, batch_cost: 0.2019, reader_cost: 0.00094, ips: 39.6297 samples/sec | ETA 00:13:57 2022-08-23 10:44:08 [INFO] [TRAIN] epoch: 124, iter: 155900/160000, loss: 0.4688, lr: 0.000031, batch_cost: 0.2054, reader_cost: 0.00047, ips: 38.9437 samples/sec | ETA 00:14:02 2022-08-23 10:44:17 [INFO] [TRAIN] epoch: 124, iter: 155950/160000, loss: 0.4659, lr: 0.000031, batch_cost: 0.1837, reader_cost: 0.00110, ips: 43.5497 samples/sec | ETA 00:12:23 2022-08-23 10:44:27 [INFO] [TRAIN] epoch: 124, iter: 156000/160000, loss: 0.4692, lr: 0.000030, batch_cost: 0.1963, reader_cost: 0.00137, ips: 40.7550 samples/sec | ETA 00:13:05 2022-08-23 10:44:27 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 163s - batch_cost: 0.1633 - reader cost: 9.4909e-04 2022-08-23 10:47:10 [INFO] [EVAL] #Images: 2000 mIoU: 0.3524 Acc: 0.7711 Kappa: 0.7535 Dice: 0.4849 2022-08-23 10:47:10 [INFO] [EVAL] Class IoU: [0.6956 0.7797 0.9321 0.7287 0.6819 0.768 0.7855 0.7924 0.5327 0.61 0.4905 0.5654 0.7026 0.3123 0.3176 0.4352 0.5265 0.433 0.6278 0.4206 0.7483 0.5174 0.6445 0.4976 0.3253 0.343 0.5033 0.4627 0.4138 0.2783 0.2886 0.4986 0.2588 0.3643 0.3052 0.3997 0.4322 0.5388 0.2623 0.4172 0.2402 0.0883 0.3538 0.2594 0.2751 0.1934 0.3088 0.5191 0.5752 0.5521 0.5798 0.4077 0.2279 0.2083 0.6669 0.4356 0.8736 0.4147 0.4935 0.2431 0.0638 0.4363 0.3331 0.2432 0.4745 0.7089 0.3127 0.3932 0.1075 0.3098 0.5225 0.5364 0.4285 0.2018 0.4503 0.3453 0.5462 0.2955 0.2714 0.1603 0.6463 0.4155 0.3734 0.0311 0.2484 0.5615 0.0959 0.0945 0.2525 0.5153 0.4014 0.0518 0.2266 0.1131 0.0025 0.0143 0.0303 0.1695 0.2922 0.383 0.1257 0.0759 0.2959 0.1748 0.0033 0.5889 0.0653 0.5188 0.1065 0.2406 0.1303 0.4057 0.1482 0.4197 0.7844 0.0152 0.3113 0.6291 0.1152 0.1153 0.4622 0.0126 0.204 0.1556 0.2592 0.3013 0.4619 0.4213 0.3705 0.3394 0.5588 0.0721 0.3344 0.2943 0.19 0.1252 0.1634 0.0329 0.2142 0.3579 0.2783 0.0037 0.3232 0.0773 0.2145 0.0043 0.3794 0.0314 0.1854 0.1426] 2022-08-23 10:47:10 [INFO] [EVAL] Class Precision: [0.791 0.8552 0.9653 0.8147 0.7638 0.8563 0.8736 0.8581 0.6723 0.7824 0.6991 0.7354 0.7778 0.4988 0.5651 0.5978 0.7109 0.7122 0.772 0.6402 0.839 0.6759 0.7864 0.64 0.5271 0.5263 0.647 0.7863 0.6658 0.3746 0.4561 0.6476 0.474 0.4831 0.4887 0.5619 0.6153 0.7626 0.4356 0.6543 0.4051 0.2537 0.6055 0.5418 0.3844 0.4148 0.4823 0.6934 0.7365 0.6311 0.7872 0.5557 0.4073 0.5605 0.6853 0.5926 0.9278 0.7065 0.7044 0.4123 0.1093 0.6062 0.4881 0.6779 0.5955 0.8082 0.4639 0.5329 0.2454 0.5604 0.6923 0.7022 0.6331 0.2749 0.7102 0.5207 0.6715 0.5352 0.7688 0.4418 0.7312 0.6679 0.7979 0.0808 0.4468 0.7446 0.3267 0.3959 0.4624 0.7363 0.5485 0.0651 0.4324 0.3776 0.0057 0.0434 0.436 0.3947 0.5346 0.679 0.3965 0.1954 0.6179 0.6207 0.3599 0.6327 0.2499 0.8056 0.2694 0.548 0.317 0.5605 0.4545 0.8036 0.7885 0.1139 0.7108 0.7446 0.1495 0.5578 0.5921 0.1938 0.6805 0.6053 0.6713 0.6591 0.7969 0.5883 0.6621 0.5777 0.6145 0.2993 0.4812 0.6265 0.5456 0.2789 0.3944 0.0742 0.5104 0.6259 0.3205 0.0056 0.6114 0.3413 0.5005 0.0071 0.8586 0.4059 0.5986 0.7527] 2022-08-23 10:47:10 [INFO] [EVAL] Class Recall: [0.8522 0.8983 0.9644 0.8734 0.8642 0.8816 0.8863 0.9119 0.7195 0.7347 0.6218 0.7097 0.8791 0.4552 0.4203 0.6154 0.6699 0.5249 0.7708 0.5507 0.8737 0.6881 0.7813 0.691 0.4593 0.4963 0.6939 0.5293 0.5222 0.52 0.44 0.6842 
0.3631 0.5969 0.4483 0.5807 0.5922 0.6474 0.3973 0.5351 0.3712 0.1192 0.4598 0.3323 0.492 0.2659 0.4619 0.6737 0.7242 0.8152 0.6875 0.6048 0.3409 0.249 0.9614 0.6219 0.9373 0.501 0.6224 0.372 0.1329 0.6089 0.5119 0.275 0.7003 0.8522 0.4898 0.6001 0.1605 0.4093 0.6806 0.6943 0.5701 0.4313 0.5517 0.5063 0.7453 0.3975 0.2955 0.201 0.8476 0.5237 0.4124 0.0482 0.3586 0.6955 0.1195 0.1104 0.3574 0.632 0.5995 0.2019 0.3226 0.1391 0.0045 0.0209 0.0316 0.2289 0.392 0.4677 0.1555 0.1104 0.3622 0.1958 0.0033 0.8948 0.0812 0.593 0.1498 0.3002 0.1812 0.5949 0.1802 0.4676 0.9934 0.0172 0.3565 0.8022 0.3339 0.1269 0.6781 0.0132 0.2256 0.1732 0.2969 0.357 0.5236 0.5975 0.4569 0.4514 0.8603 0.0867 0.5228 0.357 0.2257 0.1851 0.2181 0.0558 0.2696 0.4553 0.6787 0.011 0.4067 0.0909 0.2729 0.0106 0.4047 0.0329 0.2118 0.1496] 2022-08-23 10:47:10 [INFO] [EVAL] The model with the best validation mIoU (0.3561) was saved at iter 148000. 2022-08-23 10:47:18 [INFO] [TRAIN] epoch: 124, iter: 156050/160000, loss: 0.4724, lr: 0.000030, batch_cost: 0.1648, reader_cost: 0.00324, ips: 48.5500 samples/sec | ETA 00:10:50 2022-08-23 10:47:28 [INFO] [TRAIN] epoch: 124, iter: 156100/160000, loss: 0.5142, lr: 0.000030, batch_cost: 0.1896, reader_cost: 0.00148, ips: 42.1917 samples/sec | ETA 00:12:19 2022-08-23 10:47:37 [INFO] [TRAIN] epoch: 124, iter: 156150/160000, loss: 0.4702, lr: 0.000029, batch_cost: 0.1737, reader_cost: 0.00125, ips: 46.0600 samples/sec | ETA 00:11:08 2022-08-23 10:47:46 [INFO] [TRAIN] epoch: 124, iter: 156200/160000, loss: 0.4701, lr: 0.000029, batch_cost: 0.1858, reader_cost: 0.00061, ips: 43.0667 samples/sec | ETA 00:11:45 2022-08-23 10:47:55 [INFO] [TRAIN] epoch: 124, iter: 156250/160000, loss: 0.4693, lr: 0.000028, batch_cost: 0.1860, reader_cost: 0.00046, ips: 43.0055 samples/sec | ETA 00:11:37 2022-08-23 10:48:04 [INFO] [TRAIN] epoch: 124, iter: 156300/160000, loss: 0.4966, lr: 0.000028, batch_cost: 0.1796, reader_cost: 0.00098, ips: 44.5457 samples/sec | ETA 00:11:04 2022-08-23 10:48:14 [INFO] [TRAIN] epoch: 124, iter: 156350/160000, loss: 0.4878, lr: 0.000028, batch_cost: 0.2023, reader_cost: 0.00064, ips: 39.5498 samples/sec | ETA 00:12:18 2022-08-23 10:48:24 [INFO] [TRAIN] epoch: 124, iter: 156400/160000, loss: 0.4772, lr: 0.000027, batch_cost: 0.1940, reader_cost: 0.00072, ips: 41.2412 samples/sec | ETA 00:11:38 2022-08-23 10:48:34 [INFO] [TRAIN] epoch: 124, iter: 156450/160000, loss: 0.4541, lr: 0.000027, batch_cost: 0.2050, reader_cost: 0.00084, ips: 39.0336 samples/sec | ETA 00:12:07 2022-08-23 10:48:44 [INFO] [TRAIN] epoch: 124, iter: 156500/160000, loss: 0.4952, lr: 0.000027, batch_cost: 0.1968, reader_cost: 0.00046, ips: 40.6543 samples/sec | ETA 00:11:28 2022-08-23 10:48:55 [INFO] [TRAIN] epoch: 124, iter: 156550/160000, loss: 0.4441, lr: 0.000026, batch_cost: 0.2133, reader_cost: 0.00114, ips: 37.5122 samples/sec | ETA 00:12:15 2022-08-23 10:49:05 [INFO] [TRAIN] epoch: 124, iter: 156600/160000, loss: 0.4204, lr: 0.000026, batch_cost: 0.2094, reader_cost: 0.00034, ips: 38.2075 samples/sec | ETA 00:11:51 2022-08-23 10:49:18 [INFO] [TRAIN] epoch: 125, iter: 156650/160000, loss: 0.4753, lr: 0.000025, batch_cost: 0.2554, reader_cost: 0.04845, ips: 31.3258 samples/sec | ETA 00:14:15 2022-08-23 10:49:28 [INFO] [TRAIN] epoch: 125, iter: 156700/160000, loss: 0.4869, lr: 0.000025, batch_cost: 0.2101, reader_cost: 0.00070, ips: 38.0753 samples/sec | ETA 00:11:33 2022-08-23 10:49:40 [INFO] [TRAIN] epoch: 125, iter: 156750/160000, loss: 0.4505, lr: 0.000025, batch_cost: 0.2223, reader_cost: 0.00082, 
ips: 35.9912 samples/sec | ETA 00:12:02 2022-08-23 10:49:50 [INFO] [TRAIN] epoch: 125, iter: 156800/160000, loss: 0.4591, lr: 0.000024, batch_cost: 0.2154, reader_cost: 0.00066, ips: 37.1416 samples/sec | ETA 00:11:29 2022-08-23 10:50:01 [INFO] [TRAIN] epoch: 125, iter: 156850/160000, loss: 0.4917, lr: 0.000024, batch_cost: 0.2089, reader_cost: 0.00061, ips: 38.3039 samples/sec | ETA 00:10:57 2022-08-23 10:50:11 [INFO] [TRAIN] epoch: 125, iter: 156900/160000, loss: 0.4924, lr: 0.000023, batch_cost: 0.2055, reader_cost: 0.00064, ips: 38.9293 samples/sec | ETA 00:10:37 2022-08-23 10:50:21 [INFO] [TRAIN] epoch: 125, iter: 156950/160000, loss: 0.4714, lr: 0.000023, batch_cost: 0.2045, reader_cost: 0.00089, ips: 39.1152 samples/sec | ETA 00:10:23 2022-08-23 10:50:31 [INFO] [TRAIN] epoch: 125, iter: 157000/160000, loss: 0.4801, lr: 0.000023, batch_cost: 0.1901, reader_cost: 0.00077, ips: 42.0757 samples/sec | ETA 00:09:30 2022-08-23 10:50:31 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 158s - batch_cost: 0.1582 - reader cost: 7.5545e-04 2022-08-23 10:53:09 [INFO] [EVAL] #Images: 2000 mIoU: 0.3529 Acc: 0.7722 Kappa: 0.7548 Dice: 0.4848 2022-08-23 10:53:09 [INFO] [EVAL] Class IoU: [0.6958 0.7823 0.9327 0.7324 0.6852 0.7691 0.789 0.788 0.5345 0.6187 0.4976 0.5675 0.7055 0.3152 0.3234 0.4366 0.5285 0.4415 0.6255 0.4287 0.7465 0.5274 0.6401 0.5005 0.3365 0.3682 0.4848 0.457 0.4205 0.2752 0.2807 0.4869 0.2538 0.3683 0.3041 0.3927 0.4382 0.5346 0.2579 0.4113 0.2362 0.0891 0.3542 0.2598 0.2777 0.1829 0.3151 0.5202 0.6044 0.5617 0.5769 0.4825 0.2195 0.2081 0.6906 0.453 0.8732 0.4269 0.4498 0.2519 0.0708 0.4304 0.3258 0.2242 0.4826 0.7049 0.31 0.4068 0.1213 0.3021 0.5186 0.5321 0.4164 0.1936 0.454 0.3332 0.5513 0.3021 0.2769 0.1388 0.6131 0.4141 0.3678 0.0294 0.1211 0.5647 0.1159 0.1042 0.231 0.5136 0.4465 0.0678 0.2112 0.1047 0.0004 0.0167 0.0265 0.1652 0.287 0.4065 0.1408 0.0765 0.2855 0.0975 0.0036 0.6076 0.0626 0.5183 0.1145 0.2289 0.1328 0.4346 0.1489 0.4971 0.7693 0.0229 0.3063 0.6278 0.1093 0.1239 0.4797 0.0162 0.2062 0.1673 0.2627 0.3035 0.4489 0.4138 0.3825 0.3475 0.5697 0.0709 0.3051 0.2926 0.19 0.1232 0.1616 0.0331 0.2253 0.3646 0.3158 0.0047 0.317 0.048 0.1988 0.0051 0.3739 0.0285 0.2102 0.1423] 2022-08-23 10:53:09 [INFO] [EVAL] Class Precision: [0.7908 0.854 0.966 0.8277 0.7677 0.8697 0.8799 0.8489 0.6786 0.7741 0.7117 0.7243 0.7804 0.5163 0.5569 0.5948 0.7069 0.6876 0.7573 0.6212 0.8307 0.6836 0.7737 0.6338 0.5166 0.5386 0.6646 0.7815 0.668 0.3871 0.4681 0.6381 0.461 0.5002 0.4707 0.5472 0.6296 0.7619 0.3965 0.6523 0.3861 0.2379 0.6037 0.5474 0.4032 0.436 0.4876 0.6915 0.7773 0.6492 0.7589 0.6983 0.3703 0.5283 0.7141 0.6381 0.9209 0.6818 0.7098 0.4225 0.1185 0.5821 0.4903 0.6932 0.6069 0.8017 0.4627 0.5385 0.3062 0.528 0.6948 0.6939 0.6297 0.2839 0.6893 0.5079 0.6841 0.552 0.7481 0.4052 0.6926 0.6533 0.8077 0.0862 0.2946 0.755 0.3138 0.3862 0.4127 0.727 0.6314 0.093 0.3583 0.3781 0.001 0.0443 0.4273 0.417 0.5242 0.6797 0.3957 0.2043 0.5957 0.5949 0.2872 0.6639 0.2039 0.8073 0.2605 0.5408 0.3174 0.6046 0.4456 0.7833 0.7729 0.1294 0.6182 0.7353 0.1441 0.597 0.6184 0.2005 0.7105 0.5936 0.6634 0.6397 0.7966 0.5684 0.7911 0.5971 0.6349 0.2977 0.4285 0.6732 0.5427 0.2743 0.373 0.081 0.4746 0.6197 0.3876 0.0069 0.5924 0.2812 0.4521 0.0066 0.829 0.4378 0.5908 0.74 ] 2022-08-23 10:53:09 [INFO] [EVAL] Class Recall: [0.8528 0.903 0.9643 0.8641 0.8644 0.8693 0.8841 0.9166 0.7157 0.7551 0.6231 0.7238 0.8803 0.4472 0.4355 0.6213 0.6768 0.5522 0.7823 0.5805 0.8805 0.6977 
0.7874 0.7042 0.4911 0.5378 0.6418 0.5239 0.5315 0.4877 0.4122 0.6726 0.3608 0.5826 0.4622 0.5817 0.5903 0.6418 0.4245 0.5268 0.3783 0.1247 0.4615 0.3308 0.4716 0.2396 0.4711 0.6774 0.731 0.8064 0.7064 0.6095 0.3502 0.2556 0.9545 0.6096 0.944 0.5331 0.5512 0.3841 0.1496 0.6228 0.4927 0.2489 0.7021 0.8538 0.4843 0.6244 0.1672 0.4139 0.6715 0.6954 0.5514 0.3783 0.5708 0.4921 0.7395 0.4003 0.3053 0.1743 0.8422 0.5308 0.4031 0.0426 0.1706 0.6914 0.1552 0.1249 0.3441 0.6363 0.604 0.2004 0.3396 0.1264 0.0006 0.0262 0.0274 0.2147 0.3881 0.5028 0.1794 0.109 0.3541 0.1044 0.0037 0.8775 0.0828 0.5914 0.1697 0.2841 0.1859 0.6071 0.1828 0.5764 0.9939 0.027 0.3778 0.8111 0.3113 0.1352 0.6815 0.0174 0.2251 0.189 0.3031 0.3661 0.507 0.6033 0.4255 0.454 0.8474 0.0852 0.5144 0.3411 0.2262 0.1827 0.2219 0.0531 0.3002 0.4697 0.6304 0.0148 0.4054 0.0546 0.2619 0.0234 0.4052 0.0296 0.246 0.1498] 2022-08-23 10:53:09 [INFO] [EVAL] The model with the best validation mIoU (0.3561) was saved at iter 148000. 2022-08-23 10:53:20 [INFO] [TRAIN] epoch: 125, iter: 157050/160000, loss: 0.4590, lr: 0.000022, batch_cost: 0.2092, reader_cost: 0.00476, ips: 38.2392 samples/sec | ETA 00:10:17 2022-08-23 10:53:30 [INFO] [TRAIN] epoch: 125, iter: 157100/160000, loss: 0.4633, lr: 0.000022, batch_cost: 0.2118, reader_cost: 0.00102, ips: 37.7701 samples/sec | ETA 00:10:14 2022-08-23 10:53:41 [INFO] [TRAIN] epoch: 125, iter: 157150/160000, loss: 0.5115, lr: 0.000022, batch_cost: 0.2178, reader_cost: 0.00094, ips: 36.7241 samples/sec | ETA 00:10:20 2022-08-23 10:53:51 [INFO] [TRAIN] epoch: 125, iter: 157200/160000, loss: 0.4882, lr: 0.000021, batch_cost: 0.1908, reader_cost: 0.00075, ips: 41.9219 samples/sec | ETA 00:08:54 2022-08-23 10:54:01 [INFO] [TRAIN] epoch: 125, iter: 157250/160000, loss: 0.4756, lr: 0.000021, batch_cost: 0.2091, reader_cost: 0.00051, ips: 38.2552 samples/sec | ETA 00:09:35 2022-08-23 10:54:11 [INFO] [TRAIN] epoch: 125, iter: 157300/160000, loss: 0.4750, lr: 0.000020, batch_cost: 0.1867, reader_cost: 0.00162, ips: 42.8387 samples/sec | ETA 00:08:24 2022-08-23 10:54:20 [INFO] [TRAIN] epoch: 125, iter: 157350/160000, loss: 0.4650, lr: 0.000020, batch_cost: 0.1851, reader_cost: 0.00063, ips: 43.2157 samples/sec | ETA 00:08:10 2022-08-23 10:54:31 [INFO] [TRAIN] epoch: 125, iter: 157400/160000, loss: 0.4464, lr: 0.000020, batch_cost: 0.2127, reader_cost: 0.00068, ips: 37.6166 samples/sec | ETA 00:09:12 2022-08-23 10:54:41 [INFO] [TRAIN] epoch: 125, iter: 157450/160000, loss: 0.4670, lr: 0.000019, batch_cost: 0.2126, reader_cost: 0.00037, ips: 37.6308 samples/sec | ETA 00:09:02 2022-08-23 10:54:52 [INFO] [TRAIN] epoch: 125, iter: 157500/160000, loss: 0.4541, lr: 0.000019, batch_cost: 0.2148, reader_cost: 0.00042, ips: 37.2378 samples/sec | ETA 00:08:57 2022-08-23 10:55:01 [INFO] [TRAIN] epoch: 125, iter: 157550/160000, loss: 0.5065, lr: 0.000019, batch_cost: 0.1792, reader_cost: 0.00045, ips: 44.6346 samples/sec | ETA 00:07:19 2022-08-23 10:55:10 [INFO] [TRAIN] epoch: 125, iter: 157600/160000, loss: 0.4579, lr: 0.000018, batch_cost: 0.1906, reader_cost: 0.00058, ips: 41.9706 samples/sec | ETA 00:07:37 2022-08-23 10:55:22 [INFO] [TRAIN] epoch: 125, iter: 157650/160000, loss: 0.5022, lr: 0.000018, batch_cost: 0.2222, reader_cost: 0.00085, ips: 36.0049 samples/sec | ETA 00:08:42 2022-08-23 10:55:31 [INFO] [TRAIN] epoch: 125, iter: 157700/160000, loss: 0.4742, lr: 0.000017, batch_cost: 0.1859, reader_cost: 0.00037, ips: 43.0266 samples/sec | ETA 00:07:07 2022-08-23 10:55:41 [INFO] [TRAIN] epoch: 125, iter: 
157750/160000, loss: 0.4575, lr: 0.000017, batch_cost: 0.1965, reader_cost: 0.00071, ips: 40.7200 samples/sec | ETA 00:07:22 2022-08-23 10:55:52 [INFO] [TRAIN] epoch: 125, iter: 157800/160000, loss: 0.4952, lr: 0.000017, batch_cost: 0.2174, reader_cost: 0.00062, ips: 36.7955 samples/sec | ETA 00:07:58 2022-08-23 10:56:01 [INFO] [TRAIN] epoch: 125, iter: 157850/160000, loss: 0.4460, lr: 0.000016, batch_cost: 0.1942, reader_cost: 0.00067, ips: 41.1856 samples/sec | ETA 00:06:57 2022-08-23 10:56:14 [INFO] [TRAIN] epoch: 126, iter: 157900/160000, loss: 0.4408, lr: 0.000016, batch_cost: 0.2583, reader_cost: 0.04934, ips: 30.9724 samples/sec | ETA 00:09:02 2022-08-23 10:56:24 [INFO] [TRAIN] epoch: 126, iter: 157950/160000, loss: 0.4648, lr: 0.000016, batch_cost: 0.1977, reader_cost: 0.00055, ips: 40.4639 samples/sec | ETA 00:06:45 2022-08-23 10:56:35 [INFO] [TRAIN] epoch: 126, iter: 158000/160000, loss: 0.4676, lr: 0.000015, batch_cost: 0.2092, reader_cost: 0.00054, ips: 38.2492 samples/sec | ETA 00:06:58 2022-08-23 10:56:35 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 164s - batch_cost: 0.1642 - reader cost: 5.1766e-04 2022-08-23 10:59:19 [INFO] [EVAL] #Images: 2000 mIoU: 0.3516 Acc: 0.7712 Kappa: 0.7536 Dice: 0.4833 2022-08-23 10:59:19 [INFO] [EVAL] Class IoU: [0.6944 0.78 0.9324 0.7302 0.6832 0.7658 0.7868 0.7899 0.5324 0.6195 0.4978 0.5615 0.7059 0.3045 0.3137 0.436 0.5192 0.4322 0.6291 0.4239 0.748 0.528 0.638 0.4995 0.3307 0.3537 0.5073 0.4636 0.4177 0.2608 0.2675 0.4995 0.2488 0.3712 0.3023 0.4006 0.4361 0.5378 0.2601 0.4051 0.217 0.0894 0.3517 0.2599 0.2802 0.1766 0.3094 0.5229 0.545 0.5415 0.5809 0.4512 0.2204 0.2104 0.6831 0.441 0.8714 0.4173 0.4614 0.2338 0.0602 0.4401 0.3308 0.2509 0.4652 0.7009 0.3081 0.4103 0.1114 0.3056 0.5142 0.5344 0.4102 0.1847 0.4576 0.323 0.5734 0.2876 0.2705 0.1468 0.6259 0.4195 0.3729 0.0335 0.1257 0.5669 0.1003 0.1006 0.2576 0.5011 0.4393 0.051 0.2246 0.1025 0.0051 0.0156 0.031 0.168 0.2909 0.4036 0.1471 0.06 0.2904 0.1164 0.0019 0.6293 0.059 0.5235 0.1102 0.2376 0.1386 0.4187 0.1505 0.4881 0.8072 0.0197 0.3143 0.6365 0.1112 0.1533 0.4644 0.0151 0.2002 0.1648 0.2644 0.2875 0.446 0.4225 0.3301 0.3524 0.5664 0.0752 0.3246 0.2847 0.1956 0.1287 0.1621 0.0393 0.2102 0.3641 0.3353 0.0057 0.3329 0.06 0.1556 0.0039 0.3738 0.0337 0.184 0.1424] 2022-08-23 10:59:19 [INFO] [EVAL] Class Precision: [0.7859 0.8548 0.9667 0.8173 0.7584 0.8646 0.8685 0.8508 0.6843 0.7619 0.6909 0.741 0.7817 0.5059 0.5572 0.6016 0.7238 0.7063 0.7736 0.6338 0.8391 0.6722 0.7689 0.6225 0.4969 0.5425 0.6874 0.7781 0.6631 0.3787 0.4816 0.6593 0.477 0.5123 0.4769 0.5653 0.6406 0.7842 0.4266 0.673 0.4103 0.2398 0.5985 0.5334 0.4015 0.438 0.484 0.6964 0.7296 0.625 0.7763 0.6485 0.3744 0.5231 0.7076 0.6638 0.9246 0.6827 0.7194 0.4287 0.1008 0.5968 0.5088 0.6827 0.5718 0.7947 0.4494 0.5695 0.2524 0.5339 0.6906 0.6936 0.6082 0.2772 0.7049 0.505 0.7422 0.5274 0.7879 0.4464 0.7021 0.6685 0.804 0.0892 0.3036 0.7442 0.3257 0.3768 0.5407 0.7471 0.6133 0.0644 0.3964 0.3819 0.0158 0.0429 0.3889 0.3771 0.4891 0.6811 0.3716 0.1967 0.6155 0.6097 0.1442 0.6848 0.2081 0.8307 0.249 0.5433 0.2989 0.5693 0.4026 0.7937 0.8128 0.118 0.6634 0.7597 0.1458 0.5727 0.5924 0.1685 0.651 0.6043 0.66 0.6385 0.7752 0.5894 0.7304 0.5699 0.6309 0.3009 0.4445 0.6739 0.5698 0.2607 0.3617 0.0769 0.464 0.6176 0.419 0.0084 0.6159 0.3622 0.4776 0.0052 0.8389 0.3852 0.5985 0.7324] 2022-08-23 10:59:19 [INFO] [EVAL] Class Recall: [0.8564 0.8992 0.9634 0.8726 0.8733 0.8703 0.8932 0.9169 0.7057 0.7682 
0.6405 0.6987 0.8792 0.4335 0.4178 0.6129 0.6474 0.5269 0.7711 0.5614 0.8733 0.711 0.7893 0.7166 0.4972 0.5041 0.6593 0.5343 0.5302 0.4559 0.3757 0.6732 0.3421 0.574 0.4522 0.579 0.5773 0.6312 0.4 0.5043 0.3154 0.1247 0.4604 0.3364 0.4812 0.2284 0.4617 0.6772 0.6829 0.8022 0.6977 0.5972 0.3488 0.2603 0.9518 0.5677 0.9381 0.5178 0.5627 0.3397 0.1299 0.6263 0.4861 0.2841 0.7138 0.8559 0.4948 0.5947 0.1662 0.4169 0.6681 0.6996 0.5575 0.3563 0.566 0.4727 0.7159 0.3875 0.2918 0.1795 0.8522 0.5298 0.4101 0.0509 0.1767 0.7041 0.1267 0.1207 0.3297 0.6035 0.6075 0.1977 0.3414 0.1229 0.0075 0.0239 0.0325 0.2326 0.4178 0.4977 0.1959 0.0795 0.3548 0.1258 0.0019 0.8859 0.0761 0.586 0.1651 0.2968 0.2055 0.6128 0.1938 0.559 0.9915 0.0231 0.3739 0.7968 0.3194 0.1731 0.6824 0.0163 0.2243 0.1847 0.306 0.3433 0.5123 0.5987 0.3759 0.4801 0.847 0.0911 0.5463 0.3302 0.2295 0.2028 0.2271 0.0742 0.2775 0.47 0.6265 0.0178 0.4201 0.0671 0.1875 0.016 0.4027 0.0356 0.2099 0.1503] 2022-08-23 10:59:19 [INFO] [EVAL] The model with the best validation mIoU (0.3561) was saved at iter 148000. 2022-08-23 10:59:31 [INFO] [TRAIN] epoch: 126, iter: 158050/160000, loss: 0.4488, lr: 0.000015, batch_cost: 0.2288, reader_cost: 0.00405, ips: 34.9723 samples/sec | ETA 00:07:26 2022-08-23 10:59:42 [INFO] [TRAIN] epoch: 126, iter: 158100/160000, loss: 0.5004, lr: 0.000014, batch_cost: 0.2256, reader_cost: 0.00178, ips: 35.4606 samples/sec | ETA 00:07:08 2022-08-23 10:59:53 [INFO] [TRAIN] epoch: 126, iter: 158150/160000, loss: 0.4953, lr: 0.000014, batch_cost: 0.2207, reader_cost: 0.00052, ips: 36.2403 samples/sec | ETA 00:06:48 2022-08-23 11:00:03 [INFO] [TRAIN] epoch: 126, iter: 158200/160000, loss: 0.4654, lr: 0.000014, batch_cost: 0.1985, reader_cost: 0.00078, ips: 40.2934 samples/sec | ETA 00:05:57 2022-08-23 11:00:13 [INFO] [TRAIN] epoch: 126, iter: 158250/160000, loss: 0.4678, lr: 0.000013, batch_cost: 0.2053, reader_cost: 0.00095, ips: 38.9762 samples/sec | ETA 00:05:59 2022-08-23 11:00:23 [INFO] [TRAIN] epoch: 126, iter: 158300/160000, loss: 0.4657, lr: 0.000013, batch_cost: 0.2057, reader_cost: 0.00111, ips: 38.8899 samples/sec | ETA 00:05:49 2022-08-23 11:00:34 [INFO] [TRAIN] epoch: 126, iter: 158350/160000, loss: 0.5082, lr: 0.000012, batch_cost: 0.2107, reader_cost: 0.00273, ips: 37.9745 samples/sec | ETA 00:05:47 2022-08-23 11:00:43 [INFO] [TRAIN] epoch: 126, iter: 158400/160000, loss: 0.4756, lr: 0.000012, batch_cost: 0.1840, reader_cost: 0.00052, ips: 43.4867 samples/sec | ETA 00:04:54 2022-08-23 11:00:53 [INFO] [TRAIN] epoch: 126, iter: 158450/160000, loss: 0.4666, lr: 0.000012, batch_cost: 0.1917, reader_cost: 0.00046, ips: 41.7365 samples/sec | ETA 00:04:57 2022-08-23 11:01:03 [INFO] [TRAIN] epoch: 126, iter: 158500/160000, loss: 0.4702, lr: 0.000011, batch_cost: 0.2086, reader_cost: 0.00084, ips: 38.3513 samples/sec | ETA 00:05:12 2022-08-23 11:01:13 [INFO] [TRAIN] epoch: 126, iter: 158550/160000, loss: 0.4804, lr: 0.000011, batch_cost: 0.2001, reader_cost: 0.00033, ips: 39.9721 samples/sec | ETA 00:04:50 2022-08-23 11:01:24 [INFO] [TRAIN] epoch: 126, iter: 158600/160000, loss: 0.4524, lr: 0.000011, batch_cost: 0.2131, reader_cost: 0.00097, ips: 37.5331 samples/sec | ETA 00:04:58 2022-08-23 11:01:35 [INFO] [TRAIN] epoch: 126, iter: 158650/160000, loss: 0.4765, lr: 0.000010, batch_cost: 0.2155, reader_cost: 0.00090, ips: 37.1225 samples/sec | ETA 00:04:50 2022-08-23 11:01:44 [INFO] [TRAIN] epoch: 126, iter: 158700/160000, loss: 0.4835, lr: 0.000010, batch_cost: 0.1955, reader_cost: 0.00044, ips: 40.9301 samples/sec | 
ETA 00:04:14 2022-08-23 11:01:55 [INFO] [TRAIN] epoch: 126, iter: 158750/160000, loss: 0.4453, lr: 0.000009, batch_cost: 0.2054, reader_cost: 0.00060, ips: 38.9535 samples/sec | ETA 00:04:16 2022-08-23 11:02:04 [INFO] [TRAIN] epoch: 126, iter: 158800/160000, loss: 0.4587, lr: 0.000009, batch_cost: 0.1807, reader_cost: 0.00068, ips: 44.2800 samples/sec | ETA 00:03:36 2022-08-23 11:02:15 [INFO] [TRAIN] epoch: 126, iter: 158850/160000, loss: 0.4614, lr: 0.000009, batch_cost: 0.2193, reader_cost: 0.00040, ips: 36.4765 samples/sec | ETA 00:04:12 2022-08-23 11:02:25 [INFO] [TRAIN] epoch: 126, iter: 158900/160000, loss: 0.4615, lr: 0.000008, batch_cost: 0.1998, reader_cost: 0.00059, ips: 40.0469 samples/sec | ETA 00:03:39 2022-08-23 11:02:34 [INFO] [TRAIN] epoch: 126, iter: 158950/160000, loss: 0.4790, lr: 0.000008, batch_cost: 0.1956, reader_cost: 0.00057, ips: 40.8931 samples/sec | ETA 00:03:25 2022-08-23 11:02:45 [INFO] [TRAIN] epoch: 126, iter: 159000/160000, loss: 0.4672, lr: 0.000008, batch_cost: 0.2151, reader_cost: 0.00038, ips: 37.1934 samples/sec | ETA 00:03:35 2022-08-23 11:02:45 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 174s - batch_cost: 0.1741 - reader cost: 7.0915e-04 2022-08-23 11:05:39 [INFO] [EVAL] #Images: 2000 mIoU: 0.3519 Acc: 0.7710 Kappa: 0.7533 Dice: 0.4837 2022-08-23 11:05:39 [INFO] [EVAL] Class IoU: [0.6933 0.7779 0.9328 0.7322 0.6832 0.7657 0.7863 0.7906 0.5352 0.6183 0.4991 0.5649 0.7071 0.2996 0.3153 0.4386 0.5217 0.4357 0.6233 0.4248 0.746 0.5232 0.6362 0.5004 0.3269 0.3154 0.4829 0.4616 0.4232 0.2758 0.2826 0.4933 0.2571 0.3708 0.2971 0.3971 0.4368 0.5332 0.2645 0.4147 0.2378 0.0884 0.3533 0.2634 0.2816 0.1851 0.3134 0.523 0.5867 0.5416 0.5822 0.4632 0.2131 0.2117 0.6863 0.4436 0.876 0.4142 0.4719 0.2332 0.0658 0.4382 0.3261 0.2445 0.4689 0.697 0.3083 0.3962 0.1168 0.2979 0.5169 0.5301 0.4122 0.1857 0.4539 0.3232 0.5783 0.2856 0.2845 0.1415 0.6364 0.4162 0.3763 0.0291 0.1463 0.5628 0.101 0.0967 0.2346 0.5094 0.4302 0.0513 0.2147 0.1104 0.0022 0.0135 0.03 0.1724 0.2878 0.4131 0.1452 0.0787 0.2851 0.1364 0.0016 0.5647 0.0527 0.5172 0.0976 0.2491 0.1333 0.4247 0.1547 0.5047 0.8104 0.0285 0.3313 0.6452 0.1162 0.125 0.4722 0.0213 0.1986 0.1607 0.2646 0.2901 0.4495 0.4222 0.346 0.3526 0.5743 0.08 0.3402 0.28 0.1927 0.1309 0.1623 0.0356 0.1916 0.3582 0.3154 0.0055 0.323 0.0656 0.1684 0.0046 0.3752 0.029 0.1726 0.1389] 2022-08-23 11:05:39 [INFO] [EVAL] Class Precision: [0.7844 0.8506 0.9663 0.8231 0.7618 0.8712 0.8753 0.8521 0.6916 0.7617 0.7045 0.7201 0.7834 0.5097 0.5499 0.6156 0.7173 0.697 0.7559 0.6248 0.8326 0.6782 0.7571 0.6312 0.514 0.5235 0.6422 0.77 0.6452 0.3939 0.4615 0.6471 0.4824 0.5188 0.4765 0.5737 0.6349 0.7553 0.4189 0.6346 0.4102 0.2452 0.6021 0.5219 0.4069 0.4315 0.4799 0.7091 0.7454 0.6344 0.7593 0.6659 0.3768 0.5009 0.7156 0.6105 0.9357 0.6893 0.7149 0.4143 0.1186 0.6162 0.4861 0.6747 0.5802 0.7904 0.4415 0.5387 0.2776 0.5165 0.7126 0.6829 0.6125 0.2848 0.6981 0.5058 0.7319 0.5545 0.7764 0.4406 0.7199 0.6455 0.7968 0.0822 0.3317 0.7496 0.3363 0.3851 0.46 0.7604 0.5895 0.0646 0.3642 0.3702 0.0067 0.037 0.3665 0.4184 0.484 0.6782 0.3711 0.2152 0.589 0.6139 0.2616 0.6209 0.1799 0.8161 0.2588 0.5566 0.3142 0.6033 0.4187 0.7992 0.8156 0.1404 0.6365 0.7693 0.1521 0.6025 0.6237 0.202 0.6794 0.5839 0.6442 0.628 0.7917 0.5803 0.7655 0.5787 0.6406 0.3071 0.4604 0.6311 0.5808 0.268 0.391 0.0749 0.4413 0.6148 0.3882 0.0082 0.6174 0.3924 0.44 0.006 0.8414 0.4111 0.5622 0.7126] 2022-08-23 11:05:39 [INFO] [EVAL] Class Recall: 
[0.8565 0.9009 0.9641 0.8689 0.8689 0.8634 0.8855 0.9163 0.7029 0.7666 0.6312 0.7239 0.8789 0.421 0.4249 0.6041 0.6567 0.5376 0.7804 0.5703 0.8777 0.696 0.7994 0.7071 0.4731 0.4424 0.6606 0.5354 0.5515 0.4791 0.4215 0.6748 0.3551 0.5651 0.4412 0.5633 0.5834 0.6446 0.4179 0.5448 0.3613 0.1215 0.4609 0.3472 0.4777 0.2448 0.4747 0.666 0.7337 0.7874 0.7139 0.6034 0.3291 0.2683 0.9436 0.6187 0.9321 0.5093 0.5812 0.3479 0.1287 0.6028 0.4977 0.2771 0.7097 0.8551 0.5053 0.5996 0.1678 0.4131 0.6531 0.7032 0.5576 0.3478 0.5647 0.4723 0.7336 0.3706 0.3098 0.1725 0.8459 0.5395 0.4162 0.0432 0.2074 0.6932 0.1261 0.1143 0.3238 0.6068 0.6142 0.199 0.3436 0.1359 0.0034 0.0209 0.0316 0.2268 0.4153 0.5138 0.1925 0.1104 0.3559 0.1491 0.0016 0.862 0.0694 0.5855 0.1355 0.3107 0.1879 0.5893 0.1971 0.5779 0.9921 0.0345 0.4086 0.8 0.3297 0.1362 0.6604 0.0233 0.2192 0.1815 0.3099 0.3503 0.5098 0.6078 0.3871 0.4744 0.8472 0.0977 0.5657 0.3348 0.2238 0.2039 0.2172 0.0634 0.253 0.4618 0.6271 0.0165 0.4038 0.073 0.2144 0.0193 0.4037 0.0303 0.1994 0.1471] 2022-08-23 11:05:40 [INFO] [EVAL] The model with the best validation mIoU (0.3561) was saved at iter 148000. 2022-08-23 11:05:48 [INFO] [TRAIN] epoch: 126, iter: 159050/160000, loss: 0.4848, lr: 0.000007, batch_cost: 0.1681, reader_cost: 0.00376, ips: 47.5795 samples/sec | ETA 00:02:39 2022-08-23 11:05:57 [INFO] [TRAIN] epoch: 126, iter: 159100/160000, loss: 0.4412, lr: 0.000007, batch_cost: 0.1775, reader_cost: 0.00083, ips: 45.0726 samples/sec | ETA 00:02:39 2022-08-23 11:06:09 [INFO] [TRAIN] epoch: 127, iter: 159150/160000, loss: 0.4589, lr: 0.000006, batch_cost: 0.2414, reader_cost: 0.04651, ips: 33.1357 samples/sec | ETA 00:03:25 2022-08-23 11:06:20 [INFO] [TRAIN] epoch: 127, iter: 159200/160000, loss: 0.4566, lr: 0.000006, batch_cost: 0.2207, reader_cost: 0.00063, ips: 36.2410 samples/sec | ETA 00:02:56 2022-08-23 11:06:30 [INFO] [TRAIN] epoch: 127, iter: 159250/160000, loss: 0.4476, lr: 0.000006, batch_cost: 0.1972, reader_cost: 0.00048, ips: 40.5618 samples/sec | ETA 00:02:27 2022-08-23 11:06:40 [INFO] [TRAIN] epoch: 127, iter: 159300/160000, loss: 0.4696, lr: 0.000005, batch_cost: 0.1997, reader_cost: 0.00125, ips: 40.0637 samples/sec | ETA 00:02:19 2022-08-23 11:06:49 [INFO] [TRAIN] epoch: 127, iter: 159350/160000, loss: 0.4694, lr: 0.000005, batch_cost: 0.1920, reader_cost: 0.00068, ips: 41.6704 samples/sec | ETA 00:02:04 2022-08-23 11:06:58 [INFO] [TRAIN] epoch: 127, iter: 159400/160000, loss: 0.4748, lr: 0.000005, batch_cost: 0.1620, reader_cost: 0.00031, ips: 49.3931 samples/sec | ETA 00:01:37 2022-08-23 11:07:06 [INFO] [TRAIN] epoch: 127, iter: 159450/160000, loss: 0.5142, lr: 0.000004, batch_cost: 0.1597, reader_cost: 0.00065, ips: 50.1003 samples/sec | ETA 00:01:27 2022-08-23 11:07:14 [INFO] [TRAIN] epoch: 127, iter: 159500/160000, loss: 0.4813, lr: 0.000004, batch_cost: 0.1614, reader_cost: 0.00056, ips: 49.5673 samples/sec | ETA 00:01:20 2022-08-23 11:07:23 [INFO] [TRAIN] epoch: 127, iter: 159550/160000, loss: 0.4695, lr: 0.000003, batch_cost: 0.1856, reader_cost: 0.00043, ips: 43.1134 samples/sec | ETA 00:01:23 2022-08-23 11:07:32 [INFO] [TRAIN] epoch: 127, iter: 159600/160000, loss: 0.4581, lr: 0.000003, batch_cost: 0.1812, reader_cost: 0.00097, ips: 44.1436 samples/sec | ETA 00:01:12 2022-08-23 11:07:41 [INFO] [TRAIN] epoch: 127, iter: 159650/160000, loss: 0.4821, lr: 0.000003, batch_cost: 0.1837, reader_cost: 0.00040, ips: 43.5377 samples/sec | ETA 00:01:04 2022-08-23 11:07:51 [INFO] [TRAIN] epoch: 127, iter: 159700/160000, loss: 0.4581, lr: 
0.000002, batch_cost: 0.1951, reader_cost: 0.00097, ips: 41.0051 samples/sec | ETA 00:00:58 2022-08-23 11:08:00 [INFO] [TRAIN] epoch: 127, iter: 159750/160000, loss: 0.4751, lr: 0.000002, batch_cost: 0.1746, reader_cost: 0.00041, ips: 45.8306 samples/sec | ETA 00:00:43 2022-08-23 11:08:09 [INFO] [TRAIN] epoch: 127, iter: 159800/160000, loss: 0.4633, lr: 0.000002, batch_cost: 0.1805, reader_cost: 0.00072, ips: 44.3131 samples/sec | ETA 00:00:36 2022-08-23 11:08:18 [INFO] [TRAIN] epoch: 127, iter: 159850/160000, loss: 0.4653, lr: 0.000001, batch_cost: 0.1787, reader_cost: 0.00051, ips: 44.7686 samples/sec | ETA 00:00:26 2022-08-23 11:08:27 [INFO] [TRAIN] epoch: 127, iter: 159900/160000, loss: 0.4682, lr: 0.000001, batch_cost: 0.1840, reader_cost: 0.00079, ips: 43.4802 samples/sec | ETA 00:00:18 2022-08-23 11:08:36 [INFO] [TRAIN] epoch: 127, iter: 159950/160000, loss: 0.4770, lr: 0.000000, batch_cost: 0.1815, reader_cost: 0.00077, ips: 44.0721 samples/sec | ETA 00:00:09 2022-08-23 11:08:46 [INFO] [TRAIN] epoch: 127, iter: 160000/160000, loss: 0.4242, lr: 0.000000, batch_cost: 0.1945, reader_cost: 0.00154, ips: 41.1299 samples/sec | ETA 00:00:00 2022-08-23 11:08:46 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 152s - batch_cost: 0.1518 - reader cost: 5.3957e-04 2022-08-23 11:11:18 [INFO] [EVAL] #Images: 2000 mIoU: 0.3523 Acc: 0.7716 Kappa: 0.7540 Dice: 0.4843 2022-08-23 11:11:18 [INFO] [EVAL] Class IoU: [0.6953 0.7779 0.9324 0.7331 0.6864 0.768 0.787 0.7875 0.5359 0.6201 0.4964 0.5592 0.7077 0.3105 0.3144 0.4399 0.5219 0.4338 0.6258 0.4263 0.7467 0.5246 0.6451 0.495 0.3291 0.3341 0.4902 0.4663 0.4127 0.2585 0.2685 0.4822 0.2619 0.3677 0.3076 0.3995 0.435 0.5326 0.2664 0.4093 0.2246 0.0927 0.3522 0.2653 0.2904 0.1873 0.3073 0.5208 0.5456 0.5556 0.5866 0.4435 0.2242 0.2092 0.6767 0.4537 0.8712 0.4186 0.4709 0.2509 0.0719 0.4181 0.3264 0.2668 0.4727 0.7036 0.313 0.4008 0.1063 0.3021 0.5331 0.5314 0.4034 0.1957 0.4567 0.3365 0.5785 0.297 0.2791 0.1467 0.6248 0.4151 0.3697 0.0296 0.08 0.5654 0.1046 0.0963 0.2328 0.516 0.4364 0.0509 0.2211 0.1074 0.001 0.0135 0.03 0.1609 0.2827 0.401 0.139 0.081 0.2821 0.1843 0.0024 0.5955 0.0536 0.5215 0.1083 0.2362 0.1406 0.3779 0.1418 0.519 0.8022 0.0173 0.316 0.6296 0.1173 0.1542 0.4607 0.0159 0.2057 0.1629 0.2629 0.3007 0.4495 0.4274 0.3937 0.3562 0.5717 0.073 0.325 0.2941 0.2025 0.1269 0.1625 0.0358 0.2051 0.3572 0.3082 0.008 0.3173 0.0548 0.1891 0.0022 0.3705 0.0333 0.1844 0.1515] 2022-08-23 11:11:18 [INFO] [EVAL] Class Precision: [0.7874 0.8524 0.9663 0.8258 0.7654 0.8639 0.8675 0.8472 0.6894 0.7506 0.7065 0.7326 0.7862 0.5094 0.5711 0.6096 0.7223 0.6953 0.7592 0.6399 0.8362 0.6866 0.7853 0.6208 0.5172 0.5291 0.6394 0.7608 0.6529 0.3846 0.4595 0.6147 0.4796 0.4911 0.4949 0.5685 0.6353 0.7739 0.4246 0.6566 0.4115 0.2462 0.5963 0.5308 0.3994 0.4204 0.4805 0.7005 0.7115 0.6555 0.7746 0.634 0.3807 0.5089 0.698 0.6318 0.9293 0.6801 0.7137 0.4136 0.1214 0.5831 0.4893 0.6901 0.5812 0.8024 0.4622 0.5395 0.2473 0.5351 0.6945 0.6997 0.6281 0.2839 0.695 0.5112 0.7242 0.5403 0.7524 0.4256 0.7075 0.6516 0.8009 0.0771 0.2215 0.7555 0.3267 0.374 0.4326 0.7495 0.6072 0.0635 0.3881 0.3704 0.0026 0.0366 0.4932 0.3817 0.48 0.6965 0.3896 0.2074 0.5992 0.6768 0.248 0.6437 0.1738 0.807 0.2834 0.5509 0.3088 0.5053 0.4564 0.7946 0.8075 0.1157 0.6627 0.7478 0.1585 0.6093 0.5831 0.2025 0.676 0.5978 0.6676 0.6503 0.802 0.6096 0.7715 0.6041 0.6336 0.3128 0.465 0.6376 0.5672 0.2783 0.3978 0.0758 0.4406 0.6348 0.3714 0.012 0.617 0.3385 0.4793 0.0027 
0.835 0.374 0.6338 0.7578] 2022-08-23 11:11:18 [INFO] [EVAL] Class Recall: [0.856 0.8991 0.9638 0.8672 0.8694 0.8737 0.8945 0.9179 0.7064 0.7811 0.6254 0.7026 0.8763 0.443 0.4117 0.6124 0.6529 0.5356 0.7808 0.5608 LAUNCH INFO 2022-08-23 11:11:21,870 Pod failed INFO 2022-08-23 11:11:21,870 controller.py:99] Pod failed LAUNCH ERROR 2022-08-23 11:11:21,870 Container failed !!! Container rank 0 status failed cmd ['/ssd3/pengjuncai/anaconda3/bin/python', '-u', 'train.py', '--config', 'configs/topformer/topformer_small_ade20k_512x512_160k.yml', '--save_dir', 'output/topformer/topformer_small_ade20k_512x512_160k/test_6', '--num_workers', '3', '--do_eval', '--use_vdl', '--log_iters', '50'] code 1 log output/topformer/topformer_small_ade20k_512x512_160k/test_6/log_dir/default.lkbwfy.0.log env {'XDG_SESSION_ID': '5', 'HOSTNAME': 'instance-mqcyj27y-2', 'SHELL': '/bin/bash', 'TERM': 'screen', 'HISTSIZE': '50000', 'SSH_CLIENT': '172.31.22.20 26694 22', 'CONDA_SHLVL': '1', 'CONDA_PROMPT_MODIFIER': '(base) ', 'QTDIR': '/usr/lib64/qt-3.3', 'QTINC': '/usr/lib64/qt-3.3/include', 'SSH_TTY': '/dev/pts/2', 'ZSH': '/ssd3/pengjuncai/.oh-my-zsh', 'USER': 'pengjuncai', 'LS_COLORS': 'rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=01;05;37;41:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=01;36:*.au=01;36:*.flac=01;36:*.mid=01;36:*.midi=01;36:*.mka=01;36:*.mp3=01;36:*.mpc=01;36:*.ogg=01;36:*.ra=01;36:*.wav=01;36:*.axa=01;36:*.oga=01;36:*.spx=01;36:*.xspf=01;36:', 'LD_LIBRARY_PATH': '/usr/local/cuda/lib64', 'CONDA_EXE': '/ssd3/pengjuncai/anaconda3/bin/conda', 'TMOUT': '172800', 'base_model': 'topformer', 'PAGER': 'less', 'TMUX': '/tmp/tmux-1032/default,17077,0', 'LSCOLORS': 'Gxfxcxdxbxegedabagacad', '_CE_CONDA': '', 'MAIL': '/var/spool/mail/pengjuncai', 'PATH': '/ssd3/pengjuncai/.BCloud/bin:/usr/local/cuda/bin:/ssd3/pengjuncai/anaconda3/bin:/ssd3/pengjuncai/anaconda3/condabin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/ssd3/pengjuncai/.local/bin:/ssd3/pengjuncai/bin:/opt/bin:/home/opt/bin', 'tag': 'test_6', 'CONDA_PREFIX': '/ssd3/pengjuncai/anaconda3', 'PWD': '/ssd3/pengjuncai/PaddleSeg', 'CUDA_VISIBLE_DEVICES': '2,3', 'LANG': 'en_US.UTF-8', 'TMUX_PANE': '%12', 'HISTCONTROL': 'ignoredups', '_CE_M': '', 'HOME': '/ssd3/pengjuncai', 'SHLVL': '8', 
'CONDA_PYTHON_EXE': '/ssd3/pengjuncai/anaconda3/bin/python', 'LESS': '-R', 'LOGNAME': 'pengjuncai', 'QTLIB': '/usr/lib64/qt-3.3/lib', 'SSH_CONNECTION': '172.31.22.20 23106 10.9.189.6 22', 'XDG_DATA_DIRS': '/ssd3/pengjuncai/.local/share/flatpak/exports/share:/var/lib/flatpak/exports/share:/usr/local/share:/usr/share', 'CONDA_DEFAULT_ENV': 'base', 'LESSOPEN': '||/usr/bin/lesspipe.sh %s', 'XDG_RUNTIME_DIR': '/run/user/1032', 'HISTTIMEFORMAT': '%Y-%m-%d %H:%M:%S ', 'model': 'topformer_small_ade20k_512x512_160k', '_': '/usr/bin/nohup', 'OLDPWD': '/ssd3/pengjuncai/PaddleSeg/output/topformer/topformer_small_ade20k_512x512_160k/test_6', 'CUSTOM_DEVICE_ROOT': '', 'OMP_NUM_THREADS': '1', 'QT_QPA_PLATFORM_PLUGIN_PATH': '/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/cv2/qt/plugins', 'QT_QPA_FONTDIR': '/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/cv2/qt/fonts', 'PADDLE_MASTER': '10.9.189.6:60702', 'PADDLE_GLOBAL_SIZE': '2', 'PADDLE_LOCAL_SIZE': '2', 'PADDLE_GLOBAL_RANK': '0', 'PADDLE_LOCAL_RANK': '0', 'PADDLE_TRAINER_ENDPOINTS': '10.9.189.6:38568,10.9.189.6:51740', 'PADDLE_CURRENT_ENDPOINT': '10.9.189.6:38568', 'PADDLE_TRAINER_ID': '0', 'PADDLE_TRAINERS_NUM': '2', 'PADDLE_RANK_IN_NODE': '0', 'FLAGS_selected_gpus': '0'}
LAUNCH INFO 2022-08-23 11:11:22,472 Exit code 1 INFO 2022-08-23 11:11:22,472 controller.py:124] Exit code 1 0.8746 0.6898 0.7832 0.7096 0.4751 0.4755 0.6774 0.5465 0.5286 0.4409 0.3924 0.6911 0.3659 0.5939 0.4484 0.5733 0.5798 0.6307 0.4169 0.5208 0.3308 0.1294 0.4625 0.3466 0.5154 0.2524 0.4601 0.6701 0.7007 0.7847 0.7073 0.5962 0.353 0.2622 0.957 0.6168 0.9331 0.5212 0.5806 0.3894 0.1497 0.5964 0.495 0.3031 0.717 0.8512 0.4924 0.6092 0.1572 0.4096 0.6964 0.6884 0.53 0.3865 0.5713 0.4962 0.7421 0.3974 0.3073 0.1829 0.8424 0.5335 0.4071 0.0459 0.1114 0.6921 0.1333 0.1148 0.3351 0.6236 0.6081 0.2037 0.3393 0.1314 0.0015 0.0209 0.0309 0.2176 0.4074 0.486 0.1778 0.1172 0.3478 0.2021 0.0025 0.8883 0.072 0.5958 0.1491 0.2925 0.2053 0.5998 0.1706 0.5995 0.992 0.0199 0.3766 0.7992 0.3106 0.1711 0.687 0.0169 0.2281 0.1829 0.3026 0.3587 0.5057 0.5884 0.4456 0.4647 0.8541 0.087 0.5191 0.3531 0.2395 0.1892 0.2155 0.0634 0.2773 0.4496 0.6443 0.0232 0.3952 0.0614 0.238 0.0103 0.3998 0.0353 0.2064 0.1593] 2022-08-23 11:11:18 [INFO] [EVAL] The model with the best validation mIoU (0.3561) was saved at iter 148000. 's flops has been counted Customize Function has been applied to 's flops has been counted Cannot find suitable count function for . Treat it as zero FLOPs. 's flops has been counted Cannot find suitable count function for . Treat it as zero FLOPs.
's flops has been counted Cannot find suitable count function for . Treat it as zero FLOPs.
Traceback (most recent call last):
  File "/ssd3/pengjuncai/PaddleSeg/train.py", line 240, in <module>
    main(args)
  File "/ssd3/pengjuncai/PaddleSeg/train.py", line 216, in main
    train(
  File "/ssd3/pengjuncai/PaddleSeg/paddleseg/core/train.py", line 327, in train
    _ = paddle.flops(
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/hapi/dynamic_flops.py", line 109, in flops
    return dynamic_flops(
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/hapi/dynamic_flops.py", line 257, in dynamic_flops
    model(inputs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 930, in __call__
    return self._dygraph_call_func(*inputs, **kwargs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 915, in _dygraph_call_func
    outputs = self.forward(*inputs, **kwargs)
  File "/ssd3/pengjuncai/PaddleSeg/paddleseg/models/topformer.py", line 81, in forward
    return [x]
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 930, in __call__
    return self._dygraph_call_func(*inputs, **kwargs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 915, in _dygraph_call_func
    outputs = self.forward(*inputs, **kwargs)
  File "/ssd3/pengjuncai/PaddleSeg/paddleseg/models/backbones/top_transformer.py", line 598, in forward
    [3, 1, 16, 1], # 1/2
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 930, in __call__
    return self._dygraph_call_func(*inputs, **kwargs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 918, in _dygraph_call_func
    hook_result = forward_post_hook(self, inputs, outputs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/hapi/dynamic_flops.py", line 183, in count_io_info
    m.register_buffer('input_shape', paddle.to_tensor(x[0].shape))
AttributeError: 'list' object has no attribute 'shape'
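The AttributeError is raised inside paddle.flops: the count_io_info forward-post hook assumes every recorded input x[0] is a Tensor and reads x[0].shape, but here at least one sub-layer of the TopFormer model apparently receives a Python list of Tensors as its input, so the hook crashes. Note that training itself had already reached iter 160000/160000 and the final evaluation completed; only the post-training FLOPs report failed, which is what the launcher then reports as the container exiting with code 1. Below is a minimal workaround sketch, not the upstream fix: it assumes the Paddle 2.3.0 hook signature (m, x, y), the helper names _first_tensor and safe_count_io_info are made up for illustration, and the 'output_shape' buffer is an assumption inferred from the 'input_shape' naming. The simplest alternative is to guard or skip the paddle.flops() call in paddleseg/core/train.py.

# Hypothetical workaround: make the FLOPs I/O hook tolerant of layers whose
# inputs/outputs are lists of Tensors instead of a single Tensor.
# Assumes PaddlePaddle 2.3.0, where paddle.hapi.dynamic_flops.count_io_info
# reads x[0].shape directly (see the traceback above).
import paddle
from paddle.hapi import dynamic_flops


def _first_tensor(obj):
    """Return the first paddle.Tensor found in a (possibly nested) list/tuple."""
    if isinstance(obj, paddle.Tensor):
        return obj
    if isinstance(obj, (list, tuple)):
        for item in obj:
            t = _first_tensor(item)
            if t is not None:
                return t
    return None


def safe_count_io_info(m, x, y):
    # x is the tuple of forward inputs and y the forward output; either may
    # contain lists of Tensors when a sub-layer passes feature lists around.
    tin = _first_tensor(x)
    tout = _first_tensor(y)
    if tin is not None:
        m.register_buffer('input_shape', paddle.to_tensor(tin.shape))
    if tout is not None:
        # Assumption: the stock hook also records an 'output_shape' buffer.
        m.register_buffer('output_shape', paddle.to_tensor(tout.shape))


# Patch before paddle.flops() runs; this relies on dynamic_flops looking the
# hook up from its module globals when it registers the forward-post hooks.
dynamic_flops.count_io_info = safe_count_io_info

With a patch like this applied before train.py starts (or with the paddle.flops() call wrapped in try/except), the final FLOPs report would no longer bring the worker down after an otherwise completed 160k-iteration run; whether FLOPs should be reported at all for list-valued layers is a separate question.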