LAUNCH INFO 2022-08-24 10:15:04,834 -----------  Configuration  ----------------------
LAUNCH INFO 2022-08-24 10:15:04,834 devices: None
LAUNCH INFO 2022-08-24 10:15:04,834 elastic_level: -1
LAUNCH INFO 2022-08-24 10:15:04,834 elastic_timeout: 30
LAUNCH INFO 2022-08-24 10:15:04,834 gloo_port: 6767
LAUNCH INFO 2022-08-24 10:15:04,834 host: None
LAUNCH INFO 2022-08-24 10:15:04,834 job_id: default
LAUNCH INFO 2022-08-24 10:15:04,834 legacy: False
LAUNCH INFO 2022-08-24 10:15:04,834 log_dir: output/topformer/topformer_tiny_ade20k_512x512_160k/test_0/log_dir
LAUNCH INFO 2022-08-24 10:15:04,834 log_level: INFO
LAUNCH INFO 2022-08-24 10:15:04,834 master: None
LAUNCH INFO 2022-08-24 10:15:04,834 max_restart: 3
LAUNCH INFO 2022-08-24 10:15:04,834 nnodes: 1
LAUNCH INFO 2022-08-24 10:15:04,834 nproc_per_node: None
LAUNCH INFO 2022-08-24 10:15:04,834 rank: -1
LAUNCH INFO 2022-08-24 10:15:04,834 run_mode: collective
LAUNCH INFO 2022-08-24 10:15:04,834 server_num: None
LAUNCH INFO 2022-08-24 10:15:04,835 servers:
LAUNCH INFO 2022-08-24 10:15:04,835 trainer_num: None
LAUNCH INFO 2022-08-24 10:15:04,835 trainers:
LAUNCH INFO 2022-08-24 10:15:04,835 training_script: train.py
LAUNCH INFO 2022-08-24 10:15:04,835 training_script_args: ['--config', 'configs/topformer/topformer_tiny_ade20k_512x512_160k.yml', '--save_dir', 'output/topformer/topformer_tiny_ade20k_512x512_160k/test_0', '--num_workers', '3', '--do_eval', '--use_vdl', '--log_iters', '50']
LAUNCH INFO 2022-08-24 10:15:04,835 with_gloo: 0
LAUNCH INFO 2022-08-24 10:15:04,835 --------------------------------------------------
LAUNCH INFO 2022-08-24 10:15:04,840 Job: default, mode collective, replicas 1[1:1], elastic False
LAUNCH INFO 2022-08-24 10:15:04,840 Run Pod: rrrvdl, replicas 2, status ready
LAUNCH INFO 2022-08-24 10:15:04,869 Watching Pod: rrrvdl, replicas 2, status running
2022-08-24 10:15:12 [INFO] ------------Environment Information-------------
platform: Linux-3.10.0-1062.18.1.el7.x86_64-x86_64-with-glibc2.18
Python: 3.9.7 (default, Sep 16 2021, 13:09:58) [GCC 7.5.0]
Paddle compiled with cuda: True
NVCC: Cuda compilation tools, release 10.2, V10.2.89
cudnn: 7.6
GPUs used: 2
CUDA_VISIBLE_DEVICES: 3,4
GPU: ['GPU 0: Tesla V100-SXM2-16GB', 'GPU 1: Tesla V100-SXM2-16GB', 'GPU 2: Tesla V100-SXM2-16GB', 'GPU 3: Tesla V100-SXM2-16GB', 'GPU 4: Tesla V100-SXM2-16GB', 'GPU 5: Tesla V100-SXM2-16GB', 'GPU 6: Tesla V100-SXM2-16GB', 'GPU 7: Tesla V100-SXM2-16GB']
GCC: gcc (GCC) 9.1.0
PaddleSeg: develop
PaddlePaddle: 2.3.0
OpenCV: 4.4.0
------------------------------------------------
2022-08-24 10:15:12 [INFO] ---------------Config Information---------------
batch_size: 8
export:
  transforms:
  - keep_ratio: true
    size_divisor: 32
    target_size:
    - 2048
    - 512
    type: Resize
  - mean:
    - 0.485
    - 0.456
    - 0.406
    std:
    - 0.229
    - 0.224
    - 0.225
    type: Normalize
iters: 160000
loss:
  coef:
  - 1
  types:
  - ignore_index: 255
    type: CrossEntropyLoss
lr_scheduler:
  end_lr: 0
  learning_rate: 0.0012
  power: 1.0
  type: PolynomialDecay
  warmup_iters: 1500
  warmup_start_lr: 1.0e-06
model:
  backbone:
    lr_mult: 0.1
    pretrained: https://paddleseg.bj.bcebos.com/dygraph/backbone/topformer_tiny_imagenet_pretrained.zip
    type: TopTransformer_Tiny
  head_use_dw: true
  type: TopFormer
optimizer:
  type: AdamW
  weight_decay: 0.01
train_dataset:
  dataset_root: data/ADEChallengeData2016/
  mode: train
  transforms:
  - max_scale_factor: 2.0
    min_scale_factor: 0.5
    scale_step_size: 0.25
    type: ResizeStepScaling
  - crop_size:
    - 512
    - 512
    type: RandomPaddingCrop
  - type: RandomHorizontalFlip
  - brightness_range: 0.4
    contrast_range: 0.4
    saturation_range: 0.4
    type: RandomDistort
  - mean:
    - 0.485
    - 0.456
    - 0.406
    std:
    - 0.229
    - 0.224
    - 0.225
    type: Normalize
  type: ADE20K
val_dataset:
  dataset_root: data/ADEChallengeData2016/
  mode: val
  transforms:
  - keep_ratio: true
    size_divisor: 32
    target_size:
    - 2048
    - 512
    type: Resize
  - mean:
    - 0.485
    - 0.456
    - 0.406
    std:
    - 0.229
    - 0.224
    - 0.225
    type: Normalize
  type: ADE20K
------------------------------------------------
W0824 10:15:12.686036 78851 gpu_context.cc:278] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 10.2, Runtime API Version: 10.2
W0824 10:15:12.686110 78851 gpu_context.cc:306] device: 0, cuDNN Version: 7.6.
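For reference, the learning-rate values printed in the [TRAIN] lines below (0.000040 at iter 50, about 0.0012 around iter 1500, 0.001151 at iter 8000) are consistent with the lr_scheduler section above: linear warmup over warmup_iters, then power-1.0 polynomial decay to end_lr over the remaining iterations. A minimal sketch of that schedule, assuming the logged value corresponds to the scheduler state after iter - 1 steps (this is my reconstruction, not PaddleSeg's scheduler code):

```python
# Rough reconstruction of the lr_scheduler above: linear warmup from
# warmup_start_lr to learning_rate over warmup_iters, then PolynomialDecay
# (power=1.0) from learning_rate down to end_lr over the remaining iterations.
# Assumption: the lr printed at iteration `it` is the value after `it - 1` steps.

def lr_at(it, base_lr=0.0012, end_lr=0.0, power=1.0,
          warmup_iters=1500, warmup_start_lr=1.0e-06, total_iters=160000):
    step = it - 1
    if step < warmup_iters:
        return warmup_start_lr + (base_lr - warmup_start_lr) * step / warmup_iters
    decay_steps = total_iters - warmup_iters
    frac = min(step - warmup_iters, decay_steps) / decay_steps
    return (base_lr - end_lr) * (1 - frac) ** power + end_lr

for it in (50, 100, 1250, 1550, 3000, 8000):
    print(f"{it:6d}  {lr_at(it):.6f}")
# 50 -> 0.000040, 100 -> 0.000080, 1250 -> 0.000999,
# 1550 -> 0.001200, 3000 -> 0.001189, 8000 -> 0.001151  (matches the log)
```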
2022-08-24 10:15:23 [INFO] Loading pretrained model from https://paddleseg.bj.bcebos.com/dygraph/backbone/topformer_tiny_imagenet_pretrained.zip
2022-08-24 10:15:23 [WARNING] SIM.1.local_embedding.conv.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.1.local_embedding.bn.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.1.local_embedding.bn.bias is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.1.local_embedding.bn._mean is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.1.local_embedding.bn._variance is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.1.global_embedding.conv.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.1.global_embedding.bn.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.1.global_embedding.bn.bias is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.1.global_embedding.bn._mean is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.1.global_embedding.bn._variance is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.1.global_act.conv.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.1.global_act.bn.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.1.global_act.bn.bias is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.1.global_act.bn._mean is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.1.global_act.bn._variance is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.local_embedding.conv.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.local_embedding.bn.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.local_embedding.bn.bias is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.local_embedding.bn._mean is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.local_embedding.bn._variance is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.global_embedding.conv.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.global_embedding.bn.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.global_embedding.bn.bias is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.global_embedding.bn._mean is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.global_embedding.bn._variance is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.global_act.conv.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.global_act.bn.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.global_act.bn.bias is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.global_act.bn._mean is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.2.global_act.bn._variance is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.local_embedding.conv.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.local_embedding.bn.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.local_embedding.bn.bias is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.local_embedding.bn._mean is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.local_embedding.bn._variance is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.global_embedding.conv.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.global_embedding.bn.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.global_embedding.bn.bias is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.global_embedding.bn._mean is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.global_embedding.bn._variance is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.global_act.conv.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.global_act.bn.weight is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.global_act.bn.bias is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.global_act.bn._mean is not in pretrained model
2022-08-24 10:15:23 [WARNING] SIM.3.global_act.bn._variance is not in pretrained model
2022-08-24 10:15:23 [INFO] There are 263/308 variables loaded into TopTransformer.
I0824 10:15:25.101281 78851 nccl_context.cc:83] init nccl context nranks: 2 local rank: 0 gpu id: 0 ring id: 0
I0824 10:15:25.897094 78851 nccl_context.cc:115] init nccl context nranks: 2 local rank: 0 gpu id: 0 ring id: 10
/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/tensor/creation.py:125: DeprecationWarning: `np.object` is a deprecated alias for the builtin `object`. To silence this warning, use `object` by itself. Doing this will not modify any behavior and is safe. Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  if data.dtype == np.object:
2022-08-24 10:15:25,976-INFO: [topology.py:169:__init__] HybridParallelInfo: rank_id: 0, mp_degree: 1, sharding_degree: 1, pp_degree: 1, dp_degree: 2, mp_group: [0], sharding_group: [0], pp_group: [0], dp_group: [0, 1], check/clip group: [0]
/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/math_op_patch.py:276: UserWarning: The dtype of left and right variables are not the same, left dtype is paddle.float32, but right dtype is paddle.int64, the right dtype will convert to paddle.float32
  warnings.warn(
2022-08-24 10:15:35 [INFO] [TRAIN] epoch: 1, iter: 50/160000, loss: 5.6392, lr: 0.000040, batch_cost: 0.1968, reader_cost: 0.03765, ips: 40.6423 samples/sec | ETA 08:44:44
2022-08-24 10:15:44 [INFO] [TRAIN] epoch: 1, iter: 100/160000, loss: 5.5590, lr: 0.000080, batch_cost: 0.1741, reader_cost: 0.00065, ips: 45.9597 samples/sec | ETA 07:43:53
2022-08-24 10:15:52 [INFO] [TRAIN] epoch: 1, iter: 150/160000, loss: 5.2439, lr: 0.000120, batch_cost: 0.1470, reader_cost: 0.00057, ips: 54.4036 samples/sec | ETA 06:31:45
2022-08-24 10:15:59 [INFO] [TRAIN] epoch: 1, iter: 200/160000, loss: 4.8935, lr: 0.000160, batch_cost: 0.1591, reader_cost: 0.00064, ips: 50.2971 samples/sec | ETA 07:03:36
2022-08-24 10:16:07 [INFO] [TRAIN] epoch: 1, iter: 250/160000, loss: 4.4877, lr: 0.000200, batch_cost: 0.1491, reader_cost: 0.00068, ips: 53.6597 samples/sec | ETA 06:36:56
2022-08-24 10:16:16 [INFO] [TRAIN] epoch: 1, iter: 300/160000, loss: 4.1306, lr: 0.000240, batch_cost: 0.1742, reader_cost: 0.00104, ips: 45.9276 samples/sec | ETA 07:43:37
2022-08-24 10:16:23 [INFO] [TRAIN] epoch: 1, iter: 350/160000, loss: 3.8073, lr: 0.000280, batch_cost: 0.1561, reader_cost: 0.00044, ips: 51.2541 samples/sec | ETA 06:55:18
2022-08-24 10:16:31 [INFO] [TRAIN] epoch: 1, iter: 400/160000, loss: 3.4868, lr: 0.000320, batch_cost: 0.1532, reader_cost: 0.00052, ips: 52.2129 samples/sec | ETA 06:47:33
2022-08-24 10:16:41 [INFO] [TRAIN] epoch: 1, iter: 450/160000, loss: 3.3037, lr: 0.000360, batch_cost: 0.1876, reader_cost: 0.00045, ips: 42.6502 samples/sec | ETA 08:18:47
2022-08-24 10:16:49 [INFO] [TRAIN] epoch: 1, iter: 500/160000, loss: 3.1397, lr: 0.000400, batch_cost: 0.1713, reader_cost: 0.00590, ips: 46.7079 samples/sec | ETA
07:35:18 2022-08-24 10:16:59 [INFO] [TRAIN] epoch: 1, iter: 550/160000, loss: 3.0387, lr: 0.000440, batch_cost: 0.1995, reader_cost: 0.00060, ips: 40.0980 samples/sec | ETA 08:50:12 2022-08-24 10:17:07 [INFO] [TRAIN] epoch: 1, iter: 600/160000, loss: 2.7688, lr: 0.000480, batch_cost: 0.1597, reader_cost: 0.00283, ips: 50.1005 samples/sec | ETA 07:04:12 2022-08-24 10:17:17 [INFO] [TRAIN] epoch: 1, iter: 650/160000, loss: 2.7803, lr: 0.000520, batch_cost: 0.2033, reader_cost: 0.00266, ips: 39.3440 samples/sec | ETA 09:00:01 2022-08-24 10:17:27 [INFO] [TRAIN] epoch: 1, iter: 700/160000, loss: 2.5710, lr: 0.000560, batch_cost: 0.1906, reader_cost: 0.00396, ips: 41.9629 samples/sec | ETA 08:26:09 2022-08-24 10:17:36 [INFO] [TRAIN] epoch: 1, iter: 750/160000, loss: 2.6445, lr: 0.000600, batch_cost: 0.1828, reader_cost: 0.00071, ips: 43.7686 samples/sec | ETA 08:05:07 2022-08-24 10:17:46 [INFO] [TRAIN] epoch: 1, iter: 800/160000, loss: 2.4538, lr: 0.000640, batch_cost: 0.1956, reader_cost: 0.00062, ips: 40.9043 samples/sec | ETA 08:38:56 2022-08-24 10:17:54 [INFO] [TRAIN] epoch: 1, iter: 850/160000, loss: 2.3232, lr: 0.000680, batch_cost: 0.1677, reader_cost: 0.00220, ips: 47.7138 samples/sec | ETA 07:24:44 2022-08-24 10:18:05 [INFO] [TRAIN] epoch: 1, iter: 900/160000, loss: 2.3011, lr: 0.000720, batch_cost: 0.2169, reader_cost: 0.00049, ips: 36.8854 samples/sec | ETA 09:35:06 2022-08-24 10:18:14 [INFO] [TRAIN] epoch: 1, iter: 950/160000, loss: 2.2763, lr: 0.000760, batch_cost: 0.1889, reader_cost: 0.00681, ips: 42.3408 samples/sec | ETA 08:20:51 2022-08-24 10:18:23 [INFO] [TRAIN] epoch: 1, iter: 1000/160000, loss: 2.3441, lr: 0.000800, batch_cost: 0.1685, reader_cost: 0.00872, ips: 47.4685 samples/sec | ETA 07:26:36 2022-08-24 10:18:23 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 167s - batch_cost: 0.1674 - reader cost: 6.3587e-04 2022-08-24 10:21:10 [INFO] [EVAL] #Images: 2000 mIoU: 0.0490 Acc: 0.5637 Kappa: 0.5178 Dice: 0.0688 2022-08-24 10:21:10 [INFO] [EVAL] Class IoU: [0.4418 0.6088 0.8186 0.4671 0.4646 0.4276 0.5349 0.3519 0.2314 0.5189 0.1848 0.06 0.486 0.0761 0.0567 0.0241 0.2325 0.0872 0.097 0.1567 0.5239 0.1884 0.0148 0.0477 0.0078 0.0309 0.1963 0. 0. 0.0091 0.0007 0. 0. 0. 0.0002 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. ] 2022-08-24 10:21:10 [INFO] [EVAL] Class Precision: [0.4863 0.6762 0.875 0.5378 0.5266 0.734 0.6171 0.4339 0.5063 0.6388 0.3923 0.4474 0.5895 0.3159 0.1625 0.1355 0.3575 0.3307 0.5991 0.2149 0.6101 0.4217 0.123 0.1522 0.153 0.3585 0.2498 0.0002 0.0309 0.1487 0.0301 0. 0. 0.0062 0.0223 0. 0. 0. 0. 0.0156 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0016 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. ] 2022-08-24 10:21:10 [INFO] [EVAL] Class Recall: [0.8285 0.8593 0.9271 0.7803 0.7979 0.506 0.8005 0.6505 0.2988 0.7343 0.2589 0.0649 0.7345 0.0912 0.0801 0.0284 0.3993 0.1059 0.1038 0.3662 0.7876 0.254 0.0166 0.065 0.0081 0.0327 0.4784 0. 0. 0.0096 0.0007 0. 0. 0. 0.0002 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 
0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. ] 2022-08-24 10:21:10 [INFO] [EVAL] The model with the best validation mIoU (0.0490) was saved at iter 1000. 2022-08-24 10:21:18 [INFO] [TRAIN] epoch: 1, iter: 1050/160000, loss: 2.1827, lr: 0.000840, batch_cost: 0.1484, reader_cost: 0.00278, ips: 53.8969 samples/sec | ETA 06:33:13 2022-08-24 10:21:25 [INFO] [TRAIN] epoch: 1, iter: 1100/160000, loss: 2.2028, lr: 0.000879, batch_cost: 0.1402, reader_cost: 0.00158, ips: 57.0646 samples/sec | ETA 06:11:16 2022-08-24 10:21:32 [INFO] [TRAIN] epoch: 1, iter: 1150/160000, loss: 2.2164, lr: 0.000919, batch_cost: 0.1390, reader_cost: 0.00258, ips: 57.5615 samples/sec | ETA 06:07:57 2022-08-24 10:21:39 [INFO] [TRAIN] epoch: 1, iter: 1200/160000, loss: 2.1384, lr: 0.000959, batch_cost: 0.1489, reader_cost: 0.00051, ips: 53.7344 samples/sec | ETA 06:34:02 2022-08-24 10:21:46 [INFO] [TRAIN] epoch: 1, iter: 1250/160000, loss: 2.1985, lr: 0.000999, batch_cost: 0.1423, reader_cost: 0.00039, ips: 56.2207 samples/sec | ETA 06:16:29 2022-08-24 10:21:55 [INFO] [TRAIN] epoch: 2, iter: 1300/160000, loss: 2.1451, lr: 0.001039, batch_cost: 0.1793, reader_cost: 0.02500, ips: 44.6227 samples/sec | ETA 07:54:11 2022-08-24 10:22:03 [INFO] [TRAIN] epoch: 2, iter: 1350/160000, loss: 1.9698, lr: 0.001079, batch_cost: 0.1468, reader_cost: 0.00045, ips: 54.5115 samples/sec | ETA 06:28:03 2022-08-24 10:22:12 [INFO] [TRAIN] epoch: 2, iter: 1400/160000, loss: 1.8480, lr: 0.001119, batch_cost: 0.1898, reader_cost: 0.00055, ips: 42.1395 samples/sec | ETA 08:21:49 2022-08-24 10:22:21 [INFO] [TRAIN] epoch: 2, iter: 1450/160000, loss: 2.0453, lr: 0.001159, batch_cost: 0.1721, reader_cost: 0.00041, ips: 46.4777 samples/sec | ETA 07:34:50 2022-08-24 10:22:30 [INFO] [TRAIN] epoch: 2, iter: 1500/160000, loss: 1.8457, lr: 0.001199, batch_cost: 0.1902, reader_cost: 0.00064, ips: 42.0575 samples/sec | ETA 08:22:29 2022-08-24 10:22:40 [INFO] [TRAIN] epoch: 2, iter: 1550/160000, loss: 1.9245, lr: 0.001200, batch_cost: 0.1949, reader_cost: 0.00086, ips: 41.0432 samples/sec | ETA 08:34:44 2022-08-24 10:22:50 [INFO] [TRAIN] epoch: 2, iter: 1600/160000, loss: 1.8918, lr: 0.001199, batch_cost: 0.1991, reader_cost: 0.00033, ips: 40.1772 samples/sec | ETA 08:45:40 2022-08-24 10:23:00 [INFO] [TRAIN] epoch: 2, iter: 1650/160000, loss: 1.9042, lr: 0.001199, batch_cost: 0.1937, reader_cost: 0.00048, ips: 41.2968 samples/sec | ETA 08:31:15 2022-08-24 10:23:09 [INFO] [TRAIN] epoch: 2, iter: 1700/160000, loss: 1.9196, lr: 0.001198, batch_cost: 0.1766, reader_cost: 0.00042, ips: 45.3014 samples/sec | ETA 07:45:54 2022-08-24 10:23:17 [INFO] [TRAIN] epoch: 2, iter: 1750/160000, loss: 1.9173, lr: 0.001198, batch_cost: 0.1666, reader_cost: 0.00043, ips: 48.0280 samples/sec | ETA 07:19:19 2022-08-24 10:23:26 [INFO] [TRAIN] epoch: 2, iter: 1800/160000, loss: 1.8427, lr: 0.001198, batch_cost: 0.1719, reader_cost: 0.00196, ips: 46.5454 samples/sec | ETA 07:33:10 2022-08-24 10:23:35 [INFO] [TRAIN] epoch: 2, iter: 1850/160000, loss: 1.7378, lr: 0.001197, batch_cost: 0.1821, reader_cost: 0.00113, ips: 43.9278 samples/sec | ETA 08:00:01 2022-08-24 10:23:44 [INFO] [TRAIN] epoch: 2, iter: 1900/160000, loss: 1.6916, lr: 0.001197, batch_cost: 0.1808, reader_cost: 0.00033, ips: 44.2571 samples/sec | ETA 07:56:18 
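A quick sanity check on the throughput fields in the [TRAIN] lines above: with batch_size 8 per GPU, the logged ips is approximately batch_size / batch_cost for a single process, and the ETA is approximately the remaining iterations times batch_cost. A small sketch of that arithmetic (my own check, not PaddleSeg code; small differences come from batch_cost being rounded in the log):

```python
# Consistency check of the logged ips / ETA fields.
# Assumptions: batch_size (8) is per GPU, ips is per-process, and
# ETA = (total_iters - iter) * batch_cost, formatted as H:MM:SS.
import datetime

def check(iter_no, batch_cost, batch_size=8, total_iters=160000):
    ips = batch_size / batch_cost                                   # samples/sec per process
    eta = datetime.timedelta(seconds=round((total_iters - iter_no) * batch_cost))
    return round(ips, 4), str(eta)

print(check(50, 0.1968))    # (~40.65, 8:44:38)  vs. logged 40.6423 / 08:44:44
print(check(1900, 0.1808))  # (~44.25, 7:56:24)  vs. logged 44.2571 / 07:56:18
```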
2022-08-24 10:23:53 [INFO] [TRAIN] epoch: 2, iter: 1950/160000, loss: 1.8199, lr: 0.001197, batch_cost: 0.1845, reader_cost: 0.00036, ips: 43.3664 samples/sec | ETA 08:05:56 2022-08-24 10:24:02 [INFO] [TRAIN] epoch: 2, iter: 2000/160000, loss: 1.7943, lr: 0.001196, batch_cost: 0.1830, reader_cost: 0.00099, ips: 43.7208 samples/sec | ETA 08:01:50 2022-08-24 10:24:02 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 156s - batch_cost: 0.1555 - reader cost: 8.7625e-04 2022-08-24 10:26:38 [INFO] [EVAL] #Images: 2000 mIoU: 0.0820 Acc: 0.6321 Kappa: 0.5984 Dice: 0.1177 2022-08-24 10:26:38 [INFO] [EVAL] Class IoU: [0.5251 0.6596 0.8739 0.5592 0.5805 0.6105 0.596 0.4836 0.3709 0.588 0.3023 0.2907 0.5102 0.2258 0.0111 0.1246 0.3488 0.237 0.3391 0.2177 0.5589 0.3019 0.2543 0.2138 0.1651 0.1269 0.2595 0.0006 0.1562 0.1408 0.0205 0.0471 0.0105 0.0279 0.0151 0. 0.0602 0.0733 0.0018 0.0489 0.0002 0.0008 0. 0.0304 0.1206 0. 0.0499 0.2098 0.1046 0.0447 0.0004 0.017 0. 0.0014 0.0036 0.0237 0.2121 0.0046 0. 0.0005 0.0008 0.0002 0.0107 0.0041 0. 0.0855 0.1109 0.0172 0. 0.0001 0. 0.044 0. 0. 0.0178 0.0044 0.0096 0. 0. 0. 0. 0. 0.0039 0.003 0. 0.0234 0. 0. 0. 0.0033 0.1827 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0099 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0045 0.0004 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0001 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. ] 2022-08-24 10:26:38 [INFO] [EVAL] Class Precision: [0.5961 0.7161 0.9238 0.6449 0.6923 0.786 0.6833 0.58 0.5514 0.699 0.3828 0.4946 0.5774 0.3816 0.2189 0.2849 0.4402 0.6533 0.569 0.3379 0.641 0.551 0.594 0.3009 0.2477 0.5354 0.2813 0.5749 0.4854 0.2578 0.0792 0.404 0.16 0.1494 0.1645 0. 0.4065 0.5153 0.4051 0.3706 0.0762 0.0623 0. 0.384 0.5697 0.7273 0.2152 0.296 0.541 0.4812 0.0772 0.2718 0. 0.3632 0.562 0.1413 0.9634 0.3823 0. 0.0193 0.0421 0.2752 0.3982 0.2274 0. 0.4714 0.2029 0.7906 0. 0.1149 0.013 0.309 0. 0. 0.8834 0.3707 0.6846 0. 0. 0. 0. 0. 0.369 0.025 0. 0.5964 0. 0. 0. 0.5913 0.8424 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.5568 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.6878 1. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.4444 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. ] 2022-08-24 10:26:38 [INFO] [EVAL] Class Recall: [0.8152 0.8931 0.9418 0.8081 0.7823 0.7322 0.8235 0.7442 0.5311 0.7873 0.5897 0.4135 0.8143 0.3561 0.0116 0.1813 0.6269 0.2711 0.4564 0.3796 0.8135 0.4003 0.3078 0.4247 0.3314 0.1425 0.7705 0.0006 0.1873 0.2368 0.0269 0.0506 0.0112 0.0332 0.0163 0. 0.066 0.0787 0.0018 0.0533 0.0002 0.0008 0. 0.0319 0.1327 0. 0.0609 0.4188 0.1148 0.047 0.0004 0.0179 0. 0.0014 0.0036 0.0277 0.2139 0.0047 0. 0.0005 0.0008 0.0002 0.0109 0.0041 0. 0.0945 0.1965 0.0173 0. 0.0001 0. 0.0488 0. 0. 0.0179 0.0045 0.0096 0. 0. 0. 0. 0. 0.0039 0.0034 0. 0.0238 0. 0. 0. 0.0033 0.1891 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.01 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0045 0.0004 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0001 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. ] 2022-08-24 10:26:38 [INFO] [EVAL] The model with the best validation mIoU (0.0820) was saved at iter 2000. 
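Each [EVAL] block reports dataset-level mIoU, Acc, Kappa and Dice plus per-class IoU, Precision and Recall arrays. All of these follow from a class-by-class confusion matrix accumulated over the 2000 validation images (with label 255 ignored); a minimal sketch of the standard definitions, illustrative only and not PaddleSeg's metric implementation:

```python
import numpy as np

def metrics(conf):
    """conf[i, j] = number of pixels with ground-truth class i predicted as class j."""
    tp = np.diag(conf).astype(np.float64)
    gt = conf.sum(axis=1).astype(np.float64)     # pixels per ground-truth class
    pred = conf.sum(axis=0).astype(np.float64)   # pixels per predicted class
    union = gt + pred - tp

    iou = np.where(union > 0, tp / np.maximum(union, 1), 0.0)
    precision = np.where(pred > 0, tp / np.maximum(pred, 1), 0.0)
    recall = np.where(gt > 0, tp / np.maximum(gt, 1), 0.0)
    dice = np.where(gt + pred > 0, 2 * tp / np.maximum(gt + pred, 1), 0.0)

    acc = tp.sum() / conf.sum()                  # overall pixel accuracy
    pe = (gt * pred).sum() / conf.sum() ** 2     # chance agreement for Cohen's kappa
    kappa = (acc - pe) / (1 - pe)
    return iou.mean(), acc, kappa, dice.mean(), iou, precision, recall
```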
2022-08-24 10:26:46 [INFO] [TRAIN] epoch: 2, iter: 2050/160000, loss: 1.7136, lr: 0.001196, batch_cost: 0.1668, reader_cost: 0.00434, ips: 47.9507 samples/sec | ETA 07:19:12 2022-08-24 10:26:54 [INFO] [TRAIN] epoch: 2, iter: 2100/160000, loss: 1.7487, lr: 0.001195, batch_cost: 0.1476, reader_cost: 0.00106, ips: 54.2147 samples/sec | ETA 06:28:19 2022-08-24 10:27:01 [INFO] [TRAIN] epoch: 2, iter: 2150/160000, loss: 1.6632, lr: 0.001195, batch_cost: 0.1468, reader_cost: 0.00086, ips: 54.5061 samples/sec | ETA 06:26:08 2022-08-24 10:27:08 [INFO] [TRAIN] epoch: 2, iter: 2200/160000, loss: 1.6829, lr: 0.001195, batch_cost: 0.1494, reader_cost: 0.00086, ips: 53.5531 samples/sec | ETA 06:32:52 2022-08-24 10:27:16 [INFO] [TRAIN] epoch: 2, iter: 2250/160000, loss: 1.7795, lr: 0.001194, batch_cost: 0.1592, reader_cost: 0.00072, ips: 50.2464 samples/sec | ETA 06:58:36 2022-08-24 10:27:24 [INFO] [TRAIN] epoch: 2, iter: 2300/160000, loss: 1.7126, lr: 0.001194, batch_cost: 0.1541, reader_cost: 0.00109, ips: 51.9145 samples/sec | ETA 06:45:01 2022-08-24 10:27:32 [INFO] [TRAIN] epoch: 2, iter: 2350/160000, loss: 1.6615, lr: 0.001194, batch_cost: 0.1498, reader_cost: 0.00096, ips: 53.4009 samples/sec | ETA 06:33:37 2022-08-24 10:27:39 [INFO] [TRAIN] epoch: 2, iter: 2400/160000, loss: 1.5930, lr: 0.001193, batch_cost: 0.1450, reader_cost: 0.00334, ips: 55.1654 samples/sec | ETA 06:20:54 2022-08-24 10:27:46 [INFO] [TRAIN] epoch: 2, iter: 2450/160000, loss: 1.6419, lr: 0.001193, batch_cost: 0.1520, reader_cost: 0.01161, ips: 52.6238 samples/sec | ETA 06:39:11 2022-08-24 10:27:55 [INFO] [TRAIN] epoch: 2, iter: 2500/160000, loss: 1.6391, lr: 0.001192, batch_cost: 0.1730, reader_cost: 0.00233, ips: 46.2332 samples/sec | ETA 07:34:13 2022-08-24 10:28:06 [INFO] [TRAIN] epoch: 3, iter: 2550/160000, loss: 1.5590, lr: 0.001192, batch_cost: 0.2222, reader_cost: 0.02503, ips: 35.9966 samples/sec | ETA 09:43:12 2022-08-24 10:28:15 [INFO] [TRAIN] epoch: 3, iter: 2600/160000, loss: 1.6361, lr: 0.001192, batch_cost: 0.1810, reader_cost: 0.00120, ips: 44.2051 samples/sec | ETA 07:54:45 2022-08-24 10:28:25 [INFO] [TRAIN] epoch: 3, iter: 2650/160000, loss: 1.5303, lr: 0.001191, batch_cost: 0.1975, reader_cost: 0.00046, ips: 40.5082 samples/sec | ETA 08:37:55 2022-08-24 10:28:33 [INFO] [TRAIN] epoch: 3, iter: 2700/160000, loss: 1.7374, lr: 0.001191, batch_cost: 0.1642, reader_cost: 0.00141, ips: 48.7339 samples/sec | ETA 07:10:21 2022-08-24 10:28:42 [INFO] [TRAIN] epoch: 3, iter: 2750/160000, loss: 1.5643, lr: 0.001191, batch_cost: 0.1691, reader_cost: 0.01187, ips: 47.3184 samples/sec | ETA 07:23:05 2022-08-24 10:28:51 [INFO] [TRAIN] epoch: 3, iter: 2800/160000, loss: 1.5643, lr: 0.001190, batch_cost: 0.1807, reader_cost: 0.00054, ips: 44.2605 samples/sec | ETA 07:53:33 2022-08-24 10:29:00 [INFO] [TRAIN] epoch: 3, iter: 2850/160000, loss: 1.5752, lr: 0.001190, batch_cost: 0.1888, reader_cost: 0.00060, ips: 42.3635 samples/sec | ETA 08:14:36 2022-08-24 10:29:09 [INFO] [TRAIN] epoch: 3, iter: 2900/160000, loss: 1.5899, lr: 0.001189, batch_cost: 0.1810, reader_cost: 0.00046, ips: 44.2038 samples/sec | ETA 07:53:51 2022-08-24 10:29:19 [INFO] [TRAIN] epoch: 3, iter: 2950/160000, loss: 1.4946, lr: 0.001189, batch_cost: 0.1954, reader_cost: 0.00081, ips: 40.9514 samples/sec | ETA 08:31:20 2022-08-24 10:29:29 [INFO] [TRAIN] epoch: 3, iter: 3000/160000, loss: 1.6121, lr: 0.001189, batch_cost: 0.1960, reader_cost: 0.00069, ips: 40.8069 samples/sec | ETA 08:32:59 2022-08-24 10:29:29 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1592 - reader cost: 5.6481e-04 2022-08-24 10:32:08 [INFO] [EVAL] #Images: 2000 mIoU: 0.1261 Acc: 0.6530 Kappa: 0.6231 Dice: 0.1857 2022-08-24 10:32:08 [INFO] [EVAL] Class IoU: [0.5432 0.6845 0.8908 0.5839 0.6083 0.6355 0.6173 0.5068 0.3855 0.555 0.3469 0.3818 0.5455 0.2035 0.0327 0.1794 0.3773 0.3355 0.3831 0.2333 0.6 0.3408 0.297 0.2233 0.1911 0.3217 0.3054 0.0135 0.1694 0.1707 0.0169 0.1809 0.036 0.1109 0.1616 0.0308 0.1253 0.2883 0.0572 0.1495 0.0097 0. 0. 0.058 0.1517 0.0004 0.0833 0.1964 0.4137 0.1106 0.1102 0.2085 0.0058 0.0369 0.2012 0.0779 0.4751 0.0073 0. 0.0511 0.0015 0.0312 0.1115 0.0882 0.0004 0.3829 0.1437 0.2895 0. 0.0038 0.0002 0.218 0.0076 0.0067 0.194 0.0882 0.1783 0. 0. 0. 0.0839 0.012 0.0293 0.0047 0.0019 0.1962 0. 0.0007 0. 0.241 0.2918 0. 0.0406 0. 0. 0. 0. 0. 0. 0.0007 0. 0. 0. 0.0001 0. 0. 0. 0.154 0. 0.0001 0. 0. 0.0002 0.0995 0.1474 0. 0.1164 0.223 0. 0.0295 0.0014 0. 0. 0.0001 0.0058 0. 0.1185 0.3196 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0372 0. 0. 0.001 0. 0. 0. 0. 0. 0. 0. ] 2022-08-24 10:32:08 [INFO] [EVAL] Class Precision: [0.6236 0.7714 0.9534 0.6889 0.7148 0.7493 0.7336 0.5675 0.543 0.6512 0.5431 0.4972 0.6143 0.5076 0.2795 0.3098 0.4699 0.6539 0.5975 0.3348 0.6704 0.4613 0.6411 0.4457 0.2908 0.5035 0.3582 0.7281 0.6188 0.2498 0.124 0.512 0.1974 0.198 0.2493 0.5975 0.3453 0.3983 0.4346 0.3405 0.1149 0.0126 0. 0.3622 0.3696 0.413 0.3197 0.4134 0.5279 0.5047 0.3447 0.3759 0.3517 0.412 0.7072 0.3204 0.8591 0.4633 0. 0.237 0.0456 0.3349 0.2654 0.585 0.1798 0.4952 0.2213 0.4176 0. 0.2294 0.0127 0.3274 0.9272 0.1467 0.5396 0.1099 0.442 1. 0. 0. 0.9034 0.5326 0.5294 0.0426 0.1146 0.5236 0. 0.6855 0. 0.3834 0.3601 0. 0.5476 0. 0. 0. 0. 0. 0.0139 0.0397 0. 0. 0.0303 0.6389 0. 0. 0. 0.9821 0. 0.0067 0. 0. 0.6308 0.5781 0.9138 0. 0.8295 0.7393 0. 0.5112 0.4521 0. 0. 0.6579 0.6128 0. 0.9128 0.6315 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.4014 0. 0. 0.336 0. 0. 0. 0. 0. 0. 0. ] 2022-08-24 10:32:08 [INFO] [EVAL] Class Recall: [0.8081 0.8586 0.9313 0.793 0.8033 0.8072 0.7957 0.8257 0.5707 0.7898 0.4898 0.6219 0.8297 0.2535 0.0358 0.2989 0.6568 0.408 0.5163 0.435 0.851 0.5663 0.3562 0.3092 0.3578 0.4713 0.6744 0.0136 0.1892 0.3503 0.0192 0.2186 0.0421 0.2014 0.3148 0.0314 0.1643 0.5108 0.0618 0.2104 0.0105 0. 0. 0.0645 0.2047 0.0004 0.1012 0.2723 0.6566 0.1241 0.1394 0.3189 0.0058 0.0389 0.2195 0.0933 0.5152 0.0074 0. 0.0611 0.0016 0.0333 0.1612 0.0941 0.0004 0.6281 0.2905 0.4855 0. 0.0039 0.0002 0.3949 0.0076 0.0069 0.2325 0.3089 0.2301 0. 0. 0. 0.0846 0.0121 0.0301 0.0053 0.0019 0.2389 0. 0.0007 0. 0.3936 0.606 0. 0.042 0. 0. 0. 0. 0. 0. 0.0007 0. 0. 0. 0.0001 0. 0. 0. 0.1545 0. 0.0001 0. 0. 0.0002 0.1073 0.1495 0. 0.1193 0.242 0. 0.0304 0.0014 0. 0. 0.0001 0.0058 0. 0.1198 0.3928 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.0394 0. 0. 0.001 0. 0. 0. 0. 0. 0. 0. ] 2022-08-24 10:32:08 [INFO] [EVAL] The model with the best validation mIoU (0.1261) was saved at iter 3000. 
2022-08-24 10:32:17 [INFO] [TRAIN] epoch: 3, iter: 3050/160000, loss: 1.5759, lr: 0.001188, batch_cost: 0.1611, reader_cost: 0.00379, ips: 49.6557 samples/sec | ETA 07:01:26 2022-08-24 10:32:25 [INFO] [TRAIN] epoch: 3, iter: 3100/160000, loss: 1.5322, lr: 0.001188, batch_cost: 0.1726, reader_cost: 0.00116, ips: 46.3386 samples/sec | ETA 07:31:27 2022-08-24 10:32:32 [INFO] [TRAIN] epoch: 3, iter: 3150/160000, loss: 1.5465, lr: 0.001188, batch_cost: 0.1434, reader_cost: 0.00087, ips: 55.8066 samples/sec | ETA 06:14:44 2022-08-24 10:32:39 [INFO] [TRAIN] epoch: 3, iter: 3200/160000, loss: 1.6382, lr: 0.001187, batch_cost: 0.1425, reader_cost: 0.00045, ips: 56.1313 samples/sec | ETA 06:12:27 2022-08-24 10:32:47 [INFO] [TRAIN] epoch: 3, iter: 3250/160000, loss: 1.4945, lr: 0.001187, batch_cost: 0.1549, reader_cost: 0.00029, ips: 51.6611 samples/sec | ETA 06:44:33 2022-08-24 10:32:55 [INFO] [TRAIN] epoch: 3, iter: 3300/160000, loss: 1.4186, lr: 0.001186, batch_cost: 0.1581, reader_cost: 0.00046, ips: 50.6085 samples/sec | ETA 06:52:50 2022-08-24 10:33:03 [INFO] [TRAIN] epoch: 3, iter: 3350/160000, loss: 1.4731, lr: 0.001186, batch_cost: 0.1579, reader_cost: 0.00090, ips: 50.6708 samples/sec | ETA 06:52:12 2022-08-24 10:33:12 [INFO] [TRAIN] epoch: 3, iter: 3400/160000, loss: 1.4384, lr: 0.001186, batch_cost: 0.1881, reader_cost: 0.00072, ips: 42.5413 samples/sec | ETA 08:10:49 2022-08-24 10:33:22 [INFO] [TRAIN] epoch: 3, iter: 3450/160000, loss: 1.4503, lr: 0.001185, batch_cost: 0.1904, reader_cost: 0.00761, ips: 42.0074 samples/sec | ETA 08:16:53 2022-08-24 10:33:31 [INFO] [TRAIN] epoch: 3, iter: 3500/160000, loss: 1.4358, lr: 0.001185, batch_cost: 0.1876, reader_cost: 0.00034, ips: 42.6538 samples/sec | ETA 08:09:12 2022-08-24 10:33:41 [INFO] [TRAIN] epoch: 3, iter: 3550/160000, loss: 1.5321, lr: 0.001184, batch_cost: 0.1865, reader_cost: 0.00036, ips: 42.9001 samples/sec | ETA 08:06:14 2022-08-24 10:33:50 [INFO] [TRAIN] epoch: 3, iter: 3600/160000, loss: 1.5073, lr: 0.001184, batch_cost: 0.1934, reader_cost: 0.00076, ips: 41.3668 samples/sec | ETA 08:24:06 2022-08-24 10:34:01 [INFO] [TRAIN] epoch: 3, iter: 3650/160000, loss: 1.5214, lr: 0.001184, batch_cost: 0.2061, reader_cost: 0.00034, ips: 38.8137 samples/sec | ETA 08:57:05 2022-08-24 10:34:11 [INFO] [TRAIN] epoch: 3, iter: 3700/160000, loss: 1.5289, lr: 0.001183, batch_cost: 0.1996, reader_cost: 0.00150, ips: 40.0855 samples/sec | ETA 08:39:53 2022-08-24 10:34:20 [INFO] [TRAIN] epoch: 3, iter: 3750/160000, loss: 1.4771, lr: 0.001183, batch_cost: 0.1905, reader_cost: 0.00040, ips: 41.9944 samples/sec | ETA 08:16:05 2022-08-24 10:34:32 [INFO] [TRAIN] epoch: 4, iter: 3800/160000, loss: 1.5188, lr: 0.001183, batch_cost: 0.2331, reader_cost: 0.03734, ips: 34.3152 samples/sec | ETA 10:06:55 2022-08-24 10:34:41 [INFO] [TRAIN] epoch: 4, iter: 3850/160000, loss: 1.4682, lr: 0.001182, batch_cost: 0.1788, reader_cost: 0.00085, ips: 44.7398 samples/sec | ETA 07:45:21 2022-08-24 10:34:51 [INFO] [TRAIN] epoch: 4, iter: 3900/160000, loss: 1.3414, lr: 0.001182, batch_cost: 0.1955, reader_cost: 0.00038, ips: 40.9205 samples/sec | ETA 08:28:37 2022-08-24 10:35:00 [INFO] [TRAIN] epoch: 4, iter: 3950/160000, loss: 1.3942, lr: 0.001181, batch_cost: 0.1889, reader_cost: 0.00054, ips: 42.3511 samples/sec | ETA 08:11:17 2022-08-24 10:35:09 [INFO] [TRAIN] epoch: 4, iter: 4000/160000, loss: 1.3891, lr: 0.001181, batch_cost: 0.1872, reader_cost: 0.00034, ips: 42.7333 samples/sec | ETA 08:06:44 2022-08-24 10:35:09 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 152s - batch_cost: 0.1520 - reader cost: 0.0014 2022-08-24 10:37:42 [INFO] [EVAL] #Images: 2000 mIoU: 0.1509 Acc: 0.6670 Kappa: 0.6380 Dice: 0.2216 2022-08-24 10:37:42 [INFO] [EVAL] Class IoU: [0.5545 0.6958 0.8958 0.6093 0.6051 0.6445 0.6462 0.5285 0.392 0.5576 0.3596 0.3916 0.5624 0.2355 0.0322 0.1891 0.4035 0.2977 0.3834 0.2403 0.6365 0.3741 0.3991 0.2871 0.1812 0.3779 0.3387 0.0615 0.258 0.1923 0.0356 0.263 0.0251 0.1035 0.1745 0.0589 0.149 0.1629 0.1239 0.1306 0.0116 0. 0.0002 0.0769 0.1871 0.0055 0.1535 0.1822 0.3512 0.2205 0.0656 0.1536 0.0118 0.0207 0.3385 0.2595 0.5209 0.1164 0.004 0.0725 0.0056 0.0375 0.1721 0.0671 0.1213 0.3821 0.1457 0.2772 0.0002 0.055 0.0986 0.2272 0.0651 0.0022 0.2508 0.0517 0.1123 0.008 0.0347 0. 0.3218 0.0486 0.0245 0.0027 0. 0.2381 0. 0.0041 0. 0.2429 0.3853 0. 0.0962 0.0011 0. 0. 0. 0. 0. 0. 0. 0. 0.0001 0.0141 0. 0.0178 0.009 0.2566 0. 0.079 0. 0. 0. 0.0959 0.5438 0. 0.1274 0.3975 0. 0.2892 0.1556 0. 0. 0.0257 0.0596 0. 0.25 0.3151 0.0076 0. 0.0046 0. 0. 0. 0. 0.001 0. 0. 0.0025 0.0837 0. 0. 0.1145 0. 0. 0. 0.0007 0. 0. 0. ] 2022-08-24 10:37:42 [INFO] [EVAL] Class Precision: [0.6309 0.7568 0.961 0.7321 0.6813 0.7889 0.7479 0.5848 0.55 0.6735 0.5249 0.5445 0.6437 0.4182 0.3463 0.3304 0.5354 0.6732 0.6508 0.4238 0.7241 0.5196 0.625 0.4648 0.2998 0.5323 0.4615 0.613 0.532 0.2665 0.229 0.4034 0.3253 0.2262 0.4225 0.8146 0.4196 0.7505 0.5502 0.4221 0.1716 0. 1. 0.4253 0.3126 0.9359 0.3324 0.2896 0.6429 0.3357 0.6366 0.2014 0.401 0.5304 0.7198 0.3639 0.6415 0.3943 0.4432 0.24 0.3544 0.2272 0.2794 0.5778 0.337 0.4399 0.3316 0.469 0.4936 0.3973 0.2468 0.3977 0.9213 0.2351 0.5886 0.4128 0.5502 0.2202 0.0795 0.0068 0.598 0.6579 0.6059 0.0432 0. 0.6072 0. 0.1755 0.003 0.4826 0.5673 0. 0.4389 0.0775 0. 0. 0. 0. 0.0137 0. 0. 0. 0.2545 0.1125 0. 0.9768 0.5337 0.8016 0. 0.1823 0. 0. 0.0039 0.7972 0.7512 0. 0.5813 0.7009 0. 0.5676 0.6175 0. 0. 0.4707 0.4661 0. 0.7751 0.5987 0.0462 0. 0.698 0. 0. 0.0923 0. 0.3253 0. 0. 1. 0.5123 0. 0. 0.6423 0. 0. 0. 1. 0. 0. 0. ] 2022-08-24 10:37:42 [INFO] [EVAL] Class Recall: [0.8207 0.8962 0.9297 0.7842 0.8441 0.7788 0.826 0.8461 0.577 0.764 0.5332 0.5823 0.8165 0.3503 0.0343 0.3065 0.621 0.348 0.4827 0.3569 0.8404 0.5718 0.5248 0.429 0.3142 0.5657 0.5599 0.064 0.3338 0.4084 0.0405 0.4304 0.0265 0.1602 0.2292 0.0597 0.1877 0.1723 0.1379 0.1591 0.0123 0. 0.0002 0.0858 0.3179 0.0055 0.222 0.3295 0.4363 0.3911 0.0681 0.3934 0.012 0.0211 0.3899 0.4751 0.7348 0.1417 0.004 0.0941 0.0056 0.043 0.3096 0.0706 0.1593 0.7441 0.2062 0.4041 0.0002 0.06 0.141 0.3463 0.0655 0.0022 0.3042 0.0558 0.1236 0.0082 0.058 0. 0.4106 0.0498 0.0249 0.0028 0. 0.2815 0. 0.0042 0. 0.3285 0.5457 0. 0.1097 0.0012 0. 0. 0. 0. 0. 0. 0. 0. 0.0001 0.0158 0. 0.0178 0.0091 0.274 0. 0.1224 0. 0. 0. 0.0982 0.6634 0. 0.1403 0.4787 0. 0.3709 0.1722 0. 0. 0.0265 0.064 0. 0.2696 0.3994 0.009 0. 0.0046 0. 0. 0. 0. 0.001 0. 0. 0.0025 0.0909 0. 0. 0.1223 0. 0. 0. 0.0007 0. 0. 0. ] 2022-08-24 10:37:42 [INFO] [EVAL] The model with the best validation mIoU (0.1509) was saved at iter 4000. 
2022-08-24 10:37:56 [INFO] [TRAIN] epoch: 4, iter: 4050/160000, loss: 1.4033, lr: 0.001181, batch_cost: 0.2829, reader_cost: 0.00493, ips: 28.2791 samples/sec | ETA 12:15:17 2022-08-24 10:38:10 [INFO] [TRAIN] epoch: 4, iter: 4100/160000, loss: 1.3803, lr: 0.001180, batch_cost: 0.2890, reader_cost: 0.00296, ips: 27.6855 samples/sec | ETA 12:30:48 2022-08-24 10:38:24 [INFO] [TRAIN] epoch: 4, iter: 4150/160000, loss: 1.3790, lr: 0.001180, batch_cost: 0.2655, reader_cost: 0.00051, ips: 30.1264 samples/sec | ETA 11:29:45 2022-08-24 10:38:37 [INFO] [TRAIN] epoch: 4, iter: 4200/160000, loss: 1.4146, lr: 0.001180, batch_cost: 0.2650, reader_cost: 0.00039, ips: 30.1905 samples/sec | ETA 11:28:04 2022-08-24 10:38:51 [INFO] [TRAIN] epoch: 4, iter: 4250/160000, loss: 1.5101, lr: 0.001179, batch_cost: 0.2754, reader_cost: 0.00047, ips: 29.0468 samples/sec | ETA 11:54:56 2022-08-24 10:39:04 [INFO] [TRAIN] epoch: 4, iter: 4300/160000, loss: 1.3915, lr: 0.001179, batch_cost: 0.2720, reader_cost: 0.00077, ips: 29.4083 samples/sec | ETA 11:45:55 2022-08-24 10:39:18 [INFO] [TRAIN] epoch: 4, iter: 4350/160000, loss: 1.3496, lr: 0.001178, batch_cost: 0.2714, reader_cost: 0.00065, ips: 29.4738 samples/sec | ETA 11:44:07 2022-08-24 10:39:32 [INFO] [TRAIN] epoch: 4, iter: 4400/160000, loss: 1.3125, lr: 0.001178, batch_cost: 0.2763, reader_cost: 0.00051, ips: 28.9556 samples/sec | ETA 11:56:29 2022-08-24 10:39:45 [INFO] [TRAIN] epoch: 4, iter: 4450/160000, loss: 1.4054, lr: 0.001178, batch_cost: 0.2695, reader_cost: 0.00071, ips: 29.6839 samples/sec | ETA 11:38:41 2022-08-24 10:39:59 [INFO] [TRAIN] epoch: 4, iter: 4500/160000, loss: 1.4595, lr: 0.001177, batch_cost: 0.2685, reader_cost: 0.00067, ips: 29.7999 samples/sec | ETA 11:35:45 2022-08-24 10:40:12 [INFO] [TRAIN] epoch: 4, iter: 4550/160000, loss: 1.4285, lr: 0.001177, batch_cost: 0.2633, reader_cost: 0.00083, ips: 30.3847 samples/sec | ETA 11:22:08 2022-08-24 10:40:25 [INFO] [TRAIN] epoch: 4, iter: 4600/160000, loss: 1.3591, lr: 0.001177, batch_cost: 0.2595, reader_cost: 0.00105, ips: 30.8324 samples/sec | ETA 11:12:01 2022-08-24 10:40:37 [INFO] [TRAIN] epoch: 4, iter: 4650/160000, loss: 1.4598, lr: 0.001176, batch_cost: 0.2509, reader_cost: 0.00063, ips: 31.8815 samples/sec | ETA 10:49:41 2022-08-24 10:40:51 [INFO] [TRAIN] epoch: 4, iter: 4700/160000, loss: 1.2976, lr: 0.001176, batch_cost: 0.2773, reader_cost: 0.00071, ips: 28.8501 samples/sec | ETA 11:57:43 2022-08-24 10:41:05 [INFO] [TRAIN] epoch: 4, iter: 4750/160000, loss: 1.4654, lr: 0.001175, batch_cost: 0.2775, reader_cost: 0.00066, ips: 28.8314 samples/sec | ETA 11:57:58 2022-08-24 10:41:19 [INFO] [TRAIN] epoch: 4, iter: 4800/160000, loss: 1.4217, lr: 0.001175, batch_cost: 0.2809, reader_cost: 0.00077, ips: 28.4836 samples/sec | ETA 12:06:30 2022-08-24 10:41:33 [INFO] [TRAIN] epoch: 4, iter: 4850/160000, loss: 1.3268, lr: 0.001175, batch_cost: 0.2694, reader_cost: 0.00046, ips: 29.6971 samples/sec | ETA 11:36:35 2022-08-24 10:41:45 [INFO] [TRAIN] epoch: 4, iter: 4900/160000, loss: 1.4398, lr: 0.001174, batch_cost: 0.2500, reader_cost: 0.00036, ips: 31.9966 samples/sec | ETA 10:46:19 2022-08-24 10:41:59 [INFO] [TRAIN] epoch: 4, iter: 4950/160000, loss: 1.4392, lr: 0.001174, batch_cost: 0.2749, reader_cost: 0.00037, ips: 29.1043 samples/sec | ETA 11:50:19 2022-08-24 10:42:12 [INFO] [TRAIN] epoch: 4, iter: 5000/160000, loss: 1.3556, lr: 0.001174, batch_cost: 0.2707, reader_cost: 0.00031, ips: 29.5497 samples/sec | ETA 11:39:23 2022-08-24 10:42:12 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 152s - batch_cost: 0.1518 - reader cost: 5.9846e-04 2022-08-24 10:44:44 [INFO] [EVAL] #Images: 2000 mIoU: 0.1765 Acc: 0.6768 Kappa: 0.6498 Dice: 0.2569 2022-08-24 10:44:44 [INFO] [EVAL] Class IoU: [0.568 0.7156 0.9004 0.6181 0.6197 0.652 0.6684 0.5728 0.3918 0.5622 0.3745 0.3985 0.5712 0.2592 0.0335 0.202 0.4079 0.3517 0.3987 0.2459 0.6389 0.3532 0.3988 0.3336 0.1941 0.4089 0.3575 0.0876 0.2651 0.1853 0.0508 0.3232 0.112 0.1479 0.217 0.0783 0.1945 0.3414 0.1128 0.1151 0.0017 0.0125 0. 0.0719 0.2039 0.0264 0.1691 0.1764 0.4635 0.2755 0.2187 0.1693 0.036 0.1879 0.5633 0.3396 0.6425 0.0922 0.0002 0.1 0.0382 0.0404 0.1577 0.0933 0.1885 0.3955 0.1497 0.275 0. 0.0455 0.0759 0.2541 0.1959 0.0006 0.2481 0.1085 0.2774 0.0138 0.0172 0.0134 0.1885 0.0632 0.167 0.0032 0.0256 0.2879 0. 0.0092 0. 0.2984 0.3913 0.0001 0.1392 0.0042 0. 0.001 0.0026 0. 0. 0.0012 0. 0. 0.022 0. 0. 0.0509 0.0007 0.3008 0.0025 0.0327 0. 0.0738 0. 0.1882 0.6915 0. 0.2469 0.4765 0. 0.3278 0.3605 0. 0. 0.1396 0.111 0. 0.2157 0.3011 0. 0.0204 0.0453 0. 0. 0.0022 0.0221 0.0135 0.0048 0. 0.0088 0.1644 0.0047 0.0125 0.203 0.0004 0.0599 0. 0.0237 0. 0.0025 0. ] 2022-08-24 10:44:44 [INFO] [EVAL] Class Precision: [0.6505 0.8152 0.948 0.7298 0.7149 0.8235 0.7514 0.6397 0.496 0.74 0.5895 0.5521 0.6496 0.4323 0.3298 0.3748 0.5523 0.5999 0.5629 0.4215 0.7176 0.4992 0.6382 0.4329 0.2909 0.5569 0.4284 0.5839 0.5896 0.2636 0.1968 0.466 0.3924 0.2047 0.3849 0.7893 0.4021 0.477 0.5255 0.4288 0.3076 0.1669 0. 0.6287 0.2869 0.9117 0.2307 0.6215 0.6638 0.4827 0.4108 0.2195 0.3513 0.5651 0.7869 0.46 0.9159 0.4633 0.0848 0.394 0.0986 0.7702 0.2915 0.5622 0.3597 0.4494 0.2264 0.3466 0. 0.9508 0.3905 0.3268 0.7146 0.1287 0.5592 0.441 0.3752 0.2526 0.1278 0.0825 0.8647 0.5363 0.5479 0.0351 0.4155 0.5603 0. 0.2166 0. 0.5419 0.5335 1. 0.5456 0.1485 0.0188 0.005 0.9174 0. 0.0023 0.6042 0. 0. 0.4127 0. 0.0062 1. 0.0444 0.7067 0.0188 0.0942 0.005 0.349 0.1011 0.6535 0.8912 0. 0.7171 0.6506 0. 0.4166 0.6556 0. 0. 0.4067 0.4232 0. 0.8929 0.5659 0.0059 0.7005 0.9312 0. 0. 0.1414 0.8323 0.1496 0.6385 0. 0.6475 0.4502 0.0159 0.0729 0.5421 0.5962 0.5224 0. 0.9993 0. 0.2972 0. ] 2022-08-24 10:44:44 [INFO] [EVAL] Class Recall: [0.8175 0.8542 0.9471 0.8016 0.8232 0.7579 0.8582 0.8457 0.6509 0.7005 0.5065 0.5888 0.8255 0.393 0.036 0.3047 0.6094 0.4594 0.5776 0.3711 0.8535 0.5471 0.5154 0.5925 0.3685 0.606 0.6836 0.0934 0.3251 0.384 0.0642 0.5135 0.1355 0.3478 0.3323 0.0799 0.2736 0.5456 0.1255 0.136 0.0017 0.0134 0. 0.0751 0.4135 0.0265 0.388 0.1976 0.6058 0.3908 0.3186 0.4252 0.0385 0.2197 0.6648 0.5646 0.6828 0.1032 0.0002 0.1181 0.0587 0.0409 0.2557 0.1006 0.2838 0.7674 0.3064 0.5712 0. 0.0456 0.0861 0.5331 0.2125 0.0006 0.3084 0.1257 0.5155 0.0144 0.0195 0.0158 0.1942 0.0668 0.1937 0.0035 0.0266 0.3719 0. 0.0095 0. 0.399 0.5947 0.0001 0.1575 0.0043 0. 0.0012 0.0026 0. 0. 0.0012 0. 0. 0.0227 0. 0. 0.0509 0.0007 0.3437 0.0028 0.0477 0. 0.0856 0. 0.2091 0.7553 0. 0.2736 0.6403 0. 0.6059 0.4447 0. 0. 0.1753 0.1308 0. 0.2214 0.3916 0. 0.0206 0.0454 0. 0. 0.0023 0.0222 0.0146 0.0048 0. 0.0089 0.2057 0.0066 0.0148 0.2451 0.0004 0.0633 0. 0.0237 0. 0.0026 0. ] 2022-08-24 10:44:45 [INFO] [EVAL] The model with the best validation mIoU (0.1765) was saved at iter 5000. 
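Skimming loss and lr trends in a run this long is easier if the [TRAIN] lines are pulled out of the log. A small hypothetical helper whose regex mirrors the exact field layout shown above (the file path is a placeholder, not something produced by this run):

```python
# Hypothetical helper: extract iter/loss/lr/batch_cost/reader_cost/ips from
# the [TRAIN] lines so the curves can be plotted or dumped to CSV.
import re

TRAIN_RE = re.compile(
    r"\[TRAIN\] epoch: (?P<epoch>\d+), iter: (?P<iter>\d+)/\d+, "
    r"loss: (?P<loss>[\d.]+), lr: (?P<lr>[\d.]+), "
    r"batch_cost: (?P<batch_cost>[\d.]+), reader_cost: (?P<reader_cost>[\d.]+), "
    r"ips: (?P<ips>[\d.]+)"
)

def parse_train_log(path):
    rows = []
    with open(path) as f:
        for line in f:
            m = TRAIN_RE.search(line)
            if m:
                rows.append({k: float(v) for k, v in m.groupdict().items()})
    return rows

# rows = parse_train_log("train.log")   # placeholder path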
2022-08-24 10:44:58 [INFO] [TRAIN] epoch: 4, iter: 5050/160000, loss: 1.4785, lr: 0.001173, batch_cost: 0.2714, reader_cost: 0.00392, ips: 29.4757 samples/sec | ETA 11:40:55 2022-08-24 10:45:14 [INFO] [TRAIN] epoch: 5, iter: 5100/160000, loss: 1.3528, lr: 0.001173, batch_cost: 0.3071, reader_cost: 0.03090, ips: 26.0472 samples/sec | ETA 13:12:55 2022-08-24 10:45:27 [INFO] [TRAIN] epoch: 5, iter: 5150/160000, loss: 1.3816, lr: 0.001172, batch_cost: 0.2719, reader_cost: 0.00112, ips: 29.4243 samples/sec | ETA 11:41:41 2022-08-24 10:45:40 [INFO] [TRAIN] epoch: 5, iter: 5200/160000, loss: 1.3453, lr: 0.001172, batch_cost: 0.2605, reader_cost: 0.00076, ips: 30.7144 samples/sec | ETA 11:11:59 2022-08-24 10:45:53 [INFO] [TRAIN] epoch: 5, iter: 5250/160000, loss: 1.3472, lr: 0.001172, batch_cost: 0.2609, reader_cost: 0.00101, ips: 30.6597 samples/sec | ETA 11:12:58 2022-08-24 10:46:06 [INFO] [TRAIN] epoch: 5, iter: 5300/160000, loss: 1.3193, lr: 0.001171, batch_cost: 0.2549, reader_cost: 0.00079, ips: 31.3787 samples/sec | ETA 10:57:20 2022-08-24 10:46:19 [INFO] [TRAIN] epoch: 5, iter: 5350/160000, loss: 1.3604, lr: 0.001171, batch_cost: 0.2665, reader_cost: 0.00057, ips: 30.0195 samples/sec | ETA 11:26:53 2022-08-24 10:46:33 [INFO] [TRAIN] epoch: 5, iter: 5400/160000, loss: 1.4527, lr: 0.001170, batch_cost: 0.2803, reader_cost: 0.00110, ips: 28.5398 samples/sec | ETA 12:02:16 2022-08-24 10:46:44 [INFO] [TRAIN] epoch: 5, iter: 5450/160000, loss: 1.2515, lr: 0.001170, batch_cost: 0.2229, reader_cost: 0.00088, ips: 35.8952 samples/sec | ETA 09:34:04 2022-08-24 10:46:55 [INFO] [TRAIN] epoch: 5, iter: 5500/160000, loss: 1.3714, lr: 0.001170, batch_cost: 0.2084, reader_cost: 0.00075, ips: 38.3878 samples/sec | ETA 08:56:37 2022-08-24 10:47:05 [INFO] [TRAIN] epoch: 5, iter: 5550/160000, loss: 1.3255, lr: 0.001169, batch_cost: 0.2112, reader_cost: 0.00057, ips: 37.8734 samples/sec | ETA 09:03:44 2022-08-24 10:47:15 [INFO] [TRAIN] epoch: 5, iter: 5600/160000, loss: 1.2560, lr: 0.001169, batch_cost: 0.1840, reader_cost: 0.00077, ips: 43.4711 samples/sec | ETA 07:53:34 2022-08-24 10:47:25 [INFO] [TRAIN] epoch: 5, iter: 5650/160000, loss: 1.3320, lr: 0.001169, batch_cost: 0.2144, reader_cost: 0.00108, ips: 37.3179 samples/sec | ETA 09:11:28 2022-08-24 10:47:37 [INFO] [TRAIN] epoch: 5, iter: 5700/160000, loss: 1.3014, lr: 0.001168, batch_cost: 0.2257, reader_cost: 0.00089, ips: 35.4435 samples/sec | ETA 09:40:27 2022-08-24 10:47:50 [INFO] [TRAIN] epoch: 5, iter: 5750/160000, loss: 1.3661, lr: 0.001168, batch_cost: 0.2648, reader_cost: 0.00037, ips: 30.2065 samples/sec | ETA 11:20:52 2022-08-24 10:48:03 [INFO] [TRAIN] epoch: 5, iter: 5800/160000, loss: 1.4391, lr: 0.001167, batch_cost: 0.2666, reader_cost: 0.00031, ips: 30.0045 samples/sec | ETA 11:25:13 2022-08-24 10:48:16 [INFO] [TRAIN] epoch: 5, iter: 5850/160000, loss: 1.3329, lr: 0.001167, batch_cost: 0.2545, reader_cost: 0.00083, ips: 31.4374 samples/sec | ETA 10:53:47 2022-08-24 10:48:29 [INFO] [TRAIN] epoch: 5, iter: 5900/160000, loss: 1.3294, lr: 0.001167, batch_cost: 0.2584, reader_cost: 0.00049, ips: 30.9579 samples/sec | ETA 11:03:41 2022-08-24 10:48:42 [INFO] [TRAIN] epoch: 5, iter: 5950/160000, loss: 1.3504, lr: 0.001166, batch_cost: 0.2656, reader_cost: 0.00058, ips: 30.1217 samples/sec | ETA 11:21:53 2022-08-24 10:48:55 [INFO] [TRAIN] epoch: 5, iter: 6000/160000, loss: 1.3087, lr: 0.001166, batch_cost: 0.2594, reader_cost: 0.00040, ips: 30.8426 samples/sec | ETA 11:05:44 2022-08-24 10:48:55 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 187s - batch_cost: 0.1867 - reader cost: 5.8526e-04 2022-08-24 10:52:02 [INFO] [EVAL] #Images: 2000 mIoU: 0.1886 Acc: 0.6846 Kappa: 0.6590 Dice: 0.2771 2022-08-24 10:52:02 [INFO] [EVAL] Class IoU: [0.5747 0.6986 0.8976 0.6321 0.6137 0.6784 0.6972 0.5866 0.4054 0.6281 0.3779 0.429 0.575 0.249 0.1015 0.2298 0.4108 0.3445 0.4264 0.2692 0.6547 0.3625 0.4145 0.3276 0.1824 0.3372 0.387 0.1348 0.2851 0.1966 0.0279 0.3227 0.1491 0.1326 0.1745 0.1532 0.202 0.3129 0.1418 0.2227 0.0186 0.0312 0.0169 0.1068 0.212 0.0955 0.1717 0.2792 0.2819 0.3078 0.1906 0.2312 0.0627 0.1051 0.5846 0.2779 0.6882 0.0942 0.0636 0.1484 0.0322 0.0698 0.2344 0.0055 0.1773 0.4488 0.1355 0.2795 0.0329 0.0604 0.159 0.2245 0.2426 0.0406 0.2535 0.1374 0.2597 0.0081 0.0938 0.0313 0.0798 0.1221 0.1089 0.0623 0.1596 0.3451 0.0111 0.013 0. 0.3078 0.2274 0.0795 0.162 0.0178 0. 0. 0.0297 0. 0.004 0.0853 0.0257 0. 0.0009 0.0815 0.0043 0.2156 0.0001 0.2598 0. 0.0858 0. 0.0425 0.0003 0.2486 0.7684 0. 0.2076 0.2826 0. 0.3206 0.3376 0. 0. 0.0075 0.0939 0.0001 0.2278 0.33 0. 0. 0.2301 0. 0. 0.0369 0.0175 0.0177 0.0462 0. 0.0135 0.1616 0.0763 0.0739 0.2555 0.0017 0.0888 0. 0.0685 0. 0.002 0. ] 2022-08-24 10:52:02 [INFO] [EVAL] Class Precision: [0.6889 0.7561 0.9261 0.7601 0.7379 0.7647 0.8165 0.6522 0.5139 0.6989 0.5601 0.6302 0.6439 0.5071 0.3145 0.3536 0.5172 0.6568 0.6367 0.3812 0.7514 0.4746 0.674 0.4718 0.3643 0.6159 0.4576 0.5644 0.5658 0.3451 0.2554 0.4494 0.3766 0.3116 0.3974 0.776 0.4504 0.7689 0.398 0.3751 0.1106 0.1657 0.9607 0.4045 0.3027 0.3394 0.426 0.4401 0.7259 0.4814 0.5219 0.3805 0.3347 0.5217 0.6382 0.3404 0.8168 0.4734 0.5607 0.2762 0.1585 0.3154 0.3457 0.5462 0.3235 0.5588 0.3124 0.5345 0.0925 0.8511 0.3004 0.2518 0.7997 0.1462 0.5394 0.4553 0.3302 0.5254 0.2667 0.4439 0.6799 0.4794 0.5746 0.0898 0.5683 0.5232 0.2019 0.2822 0. 0.5371 0.2626 0.924 0.2866 0.1681 0. 0. 0.365 0. 0.0989 0.2997 0.348 0. 0.9791 0.9797 0.1881 0.8271 0.0072 0.7229 0.0006 0.1713 0. 0.0805 0.0182 0.6532 0.8566 0. 0.9237 0.3096 0. 0.4977 0.4205 0. 0. 0.6668 0.3752 0.0244 0.9186 0.5099 0. 0. 0.7326 0. 0. 0.1953 0.8011 0.1218 0.4448 0. 0.7495 0.5223 0.1907 0.2403 0.4586 0.0867 0.667 0. 0.9387 0. 0.1333 0. ] 2022-08-24 10:52:02 [INFO] [EVAL] Class Recall: [0.7762 0.9019 0.9669 0.7895 0.7848 0.8573 0.8267 0.8536 0.6575 0.8612 0.5374 0.5732 0.8432 0.3286 0.1303 0.3962 0.6663 0.4201 0.5635 0.4784 0.8357 0.6056 0.5184 0.5174 0.2675 0.427 0.7148 0.1505 0.3648 0.3135 0.0304 0.5336 0.1979 0.1876 0.2373 0.1603 0.2681 0.3453 0.1805 0.354 0.0218 0.0369 0.0169 0.1267 0.4142 0.1173 0.2234 0.433 0.3155 0.4605 0.231 0.3706 0.0717 0.1163 0.8742 0.6022 0.8138 0.1052 0.067 0.243 0.0388 0.0823 0.4213 0.0055 0.2817 0.6951 0.1931 0.3695 0.0486 0.061 0.2525 0.6738 0.2583 0.0531 0.3235 0.1645 0.5487 0.0081 0.1263 0.0326 0.0829 0.1407 0.1184 0.1689 0.1816 0.5035 0.0116 0.0134 0. 0.4189 0.6291 0.08 0.2716 0.0196 0. 0. 0.0314 0. 0.0042 0.1065 0.027 0. 0.0009 0.0816 0.0043 0.2257 0.0001 0.2885 0. 0.1467 0. 0.0825 0.0003 0.2865 0.8817 0. 0.2112 0.7643 0. 0.4741 0.6312 0. 0. 0.0075 0.1113 0.0001 0.2325 0.4834 0. 0. 0.2511 0. 0. 0.0435 0.0176 0.0203 0.049 0. 0.0136 0.1897 0.1129 0.0964 0.3659 0.0018 0.0929 0. 0.0688 0. 0.002 0. ] 2022-08-24 10:52:02 [INFO] [EVAL] The model with the best validation mIoU (0.1886) was saved at iter 6000. 
2022-08-24 10:52:17 [INFO] [TRAIN] epoch: 5, iter: 6050/160000, loss: 1.3827, lr: 0.001166, batch_cost: 0.2941, reader_cost: 0.00383, ips: 27.1980 samples/sec | ETA 12:34:42 2022-08-24 10:52:30 [INFO] [TRAIN] epoch: 5, iter: 6100/160000, loss: 1.3481, lr: 0.001165, batch_cost: 0.2583, reader_cost: 0.00126, ips: 30.9714 samples/sec | ETA 11:02:32 2022-08-24 10:52:43 [INFO] [TRAIN] epoch: 5, iter: 6150/160000, loss: 1.2876, lr: 0.001165, batch_cost: 0.2594, reader_cost: 0.00083, ips: 30.8411 samples/sec | ETA 11:05:07 2022-08-24 10:52:56 [INFO] [TRAIN] epoch: 5, iter: 6200/160000, loss: 1.3889, lr: 0.001164, batch_cost: 0.2569, reader_cost: 0.00030, ips: 31.1428 samples/sec | ETA 10:58:28 2022-08-24 10:53:09 [INFO] [TRAIN] epoch: 5, iter: 6250/160000, loss: 1.2850, lr: 0.001164, batch_cost: 0.2627, reader_cost: 0.00052, ips: 30.4482 samples/sec | ETA 11:13:16 2022-08-24 10:53:22 [INFO] [TRAIN] epoch: 5, iter: 6300/160000, loss: 1.3329, lr: 0.001164, batch_cost: 0.2647, reader_cost: 0.00030, ips: 30.2205 samples/sec | ETA 11:18:07 2022-08-24 10:53:38 [INFO] [TRAIN] epoch: 6, iter: 6350/160000, loss: 1.2605, lr: 0.001163, batch_cost: 0.3159, reader_cost: 0.05038, ips: 25.3248 samples/sec | ETA 13:28:57 2022-08-24 10:53:52 [INFO] [TRAIN] epoch: 6, iter: 6400/160000, loss: 1.4025, lr: 0.001163, batch_cost: 0.2796, reader_cost: 0.00048, ips: 28.6107 samples/sec | ETA 11:55:49 2022-08-24 10:54:06 [INFO] [TRAIN] epoch: 6, iter: 6450/160000, loss: 1.2806, lr: 0.001163, batch_cost: 0.2913, reader_cost: 0.00070, ips: 27.4595 samples/sec | ETA 12:25:34 2022-08-24 10:54:20 [INFO] [TRAIN] epoch: 6, iter: 6500/160000, loss: 1.2234, lr: 0.001162, batch_cost: 0.2773, reader_cost: 0.00061, ips: 28.8513 samples/sec | ETA 11:49:23 2022-08-24 10:54:33 [INFO] [TRAIN] epoch: 6, iter: 6550/160000, loss: 1.3257, lr: 0.001162, batch_cost: 0.2597, reader_cost: 0.00044, ips: 30.8006 samples/sec | ETA 11:04:16 2022-08-24 10:54:45 [INFO] [TRAIN] epoch: 6, iter: 6600/160000, loss: 1.2387, lr: 0.001161, batch_cost: 0.2404, reader_cost: 0.00031, ips: 33.2841 samples/sec | ETA 10:14:30 2022-08-24 10:54:59 [INFO] [TRAIN] epoch: 6, iter: 6650/160000, loss: 1.2969, lr: 0.001161, batch_cost: 0.2694, reader_cost: 0.00051, ips: 29.6928 samples/sec | ETA 11:28:36 2022-08-24 10:55:14 [INFO] [TRAIN] epoch: 6, iter: 6700/160000, loss: 1.2691, lr: 0.001161, batch_cost: 0.3026, reader_cost: 0.00142, ips: 26.4338 samples/sec | ETA 12:53:15 2022-08-24 10:55:28 [INFO] [TRAIN] epoch: 6, iter: 6750/160000, loss: 1.2423, lr: 0.001160, batch_cost: 0.2904, reader_cost: 0.00090, ips: 27.5447 samples/sec | ETA 12:21:49 2022-08-24 10:55:43 [INFO] [TRAIN] epoch: 6, iter: 6800/160000, loss: 1.3845, lr: 0.001160, batch_cost: 0.2980, reader_cost: 0.00075, ips: 26.8475 samples/sec | ETA 12:40:50 2022-08-24 10:55:58 [INFO] [TRAIN] epoch: 6, iter: 6850/160000, loss: 1.3664, lr: 0.001160, batch_cost: 0.2880, reader_cost: 0.00076, ips: 27.7752 samples/sec | ETA 12:15:11 2022-08-24 10:56:12 [INFO] [TRAIN] epoch: 6, iter: 6900/160000, loss: 1.3006, lr: 0.001159, batch_cost: 0.2770, reader_cost: 0.00087, ips: 28.8858 samples/sec | ETA 11:46:41 2022-08-24 10:56:26 [INFO] [TRAIN] epoch: 6, iter: 6950/160000, loss: 1.2813, lr: 0.001159, batch_cost: 0.2808, reader_cost: 0.00039, ips: 28.4866 samples/sec | ETA 11:56:21 2022-08-24 10:56:38 [INFO] [TRAIN] epoch: 6, iter: 7000/160000, loss: 1.3493, lr: 0.001158, batch_cost: 0.2498, reader_cost: 0.00037, ips: 32.0236 samples/sec | ETA 10:37:01 2022-08-24 10:56:38 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 169s - batch_cost: 0.1692 - reader cost: 7.5064e-04 2022-08-24 10:59:28 [INFO] [EVAL] #Images: 2000 mIoU: 0.2038 Acc: 0.6926 Kappa: 0.6674 Dice: 0.2967 2022-08-24 10:59:28 [INFO] [EVAL] Class IoU: [0.5841 0.7252 0.9092 0.6232 0.6281 0.6872 0.6944 0.6027 0.4188 0.6128 0.3885 0.4279 0.5892 0.2813 0.0541 0.2296 0.4266 0.3796 0.4363 0.2688 0.6169 0.2696 0.4322 0.3216 0.2177 0.412 0.375 0.1243 0.2287 0.1891 0.0977 0.3113 0.1448 0.1261 0.1869 0.2214 0.2192 0.3592 0.1739 0.2192 0.0191 0.0123 0.1243 0.1188 0.2038 0.1135 0.1903 0.273 0.5237 0.3208 0.252 0.2069 0.0168 0.2084 0.5788 0.369 0.688 0.1809 0.1291 0.1311 0.0403 0.0814 0.2242 0.0573 0.1934 0.4648 0.117 0.3001 0.0144 0.1403 0.0968 0.2799 0.323 0.0857 0.254 0.1508 0.2846 0.0252 0.0966 0.1016 0.1795 0.1424 0.0874 0.0105 0.2897 0.3219 0.0071 0.0209 0. 0.2601 0.3761 0.0551 0.1756 0.0126 0. 0. 0.0141 0. 0.0263 0.0219 0. 0. 0. 0.3601 0. 0.1603 0.0142 0.3887 0.0014 0.1094 0.0013 0.0565 0.0014 0.3331 0.6954 0. 0.1371 0.4559 0. 0.3378 0.4192 0. 0. 0.177 0.0794 0.0004 0.3055 0.3157 0. 0.0077 0.2795 0. 0. 0.0017 0.0019 0.0315 0.0277 0. 0.0286 0.185 0.0009 0.0353 0.2086 0.001 0.0869 0. 0.1097 0. 0.0058 0. ] 2022-08-24 10:59:28 [INFO] [EVAL] Class Precision: [0.6784 0.814 0.9544 0.7122 0.7261 0.7818 0.805 0.6755 0.591 0.7085 0.5499 0.5769 0.6701 0.4731 0.3953 0.4231 0.5875 0.5967 0.6505 0.4427 0.6652 0.4145 0.6831 0.5069 0.4146 0.5529 0.4396 0.6829 0.6326 0.3374 0.2356 0.4371 0.3681 0.2274 0.3884 0.6475 0.4344 0.6863 0.5204 0.347 0.3183 0.219 0.6912 0.5716 0.3028 0.5765 0.3158 0.389 0.5924 0.4079 0.4665 0.2779 0.1746 0.4561 0.7665 0.478 0.7714 0.4338 0.416 0.5344 0.0594 0.5322 0.3442 0.6526 0.4268 0.5462 0.3348 0.5496 0.469 0.616 0.5646 0.391 0.6732 0.214 0.3308 0.5582 0.4906 0.568 0.2924 0.3266 0.8336 0.4333 0.6481 0.038 0.6146 0.6007 0.0666 0.2532 0. 0.5592 0.5319 0.8638 0.4621 0.1143 0. 0. 0.4542 0. 0.1348 0.5477 0. 0. 0. 0.9205 0. 0.8545 0.1764 0.7213 0.0107 0.1796 0.4114 0.2232 0.0865 0.3893 0.8046 0. 0.8989 0.7333 0. 0.5433 0.6196 0. 0. 0.3655 0.399 0.0627 0.8343 0.5238 0. 0.7714 0.6627 0. 0. 0.062 0.8321 0.2186 0.543 0. 0.6103 0.5534 0.0557 0.0558 0.7659 0.6829 0.6334 0. 0.8592 0. 0.0802 0. ] 2022-08-24 10:59:28 [INFO] [EVAL] Class Recall: [0.8078 0.8692 0.9505 0.8331 0.8231 0.8502 0.8349 0.8484 0.5896 0.8194 0.5698 0.6236 0.8299 0.4095 0.0589 0.3342 0.609 0.5106 0.5699 0.4062 0.8947 0.4353 0.5406 0.4681 0.3144 0.6179 0.7185 0.132 0.2638 0.3009 0.143 0.5196 0.1926 0.2206 0.2649 0.2518 0.3067 0.4297 0.2071 0.3731 0.0199 0.0129 0.1316 0.1305 0.384 0.1239 0.3237 0.4779 0.8186 0.6003 0.354 0.4475 0.0182 0.2772 0.7027 0.6179 0.8642 0.2368 0.1576 0.148 0.1118 0.0877 0.3915 0.0591 0.2612 0.7572 0.1524 0.398 0.0147 0.1537 0.1046 0.496 0.3831 0.125 0.5227 0.1712 0.4039 0.0257 0.126 0.1284 0.1861 0.1751 0.0917 0.0143 0.3539 0.4096 0.0079 0.0223 0. 0.3272 0.5622 0.0556 0.2207 0.0139 0. 0. 0.0144 0. 0.0316 0.0223 0. 0. 0. 0.3717 0. 0.1648 0.0153 0.4573 0.0016 0.2186 0.0013 0.0703 0.0014 0.6977 0.8368 0. 0.1393 0.5465 0. 0.4716 0.5646 0. 0. 0.2555 0.0902 0.0004 0.3252 0.4428 0. 0.0077 0.3259 0. 0. 0.0018 0.0019 0.0355 0.0284 0. 0.0292 0.2174 0.0009 0.0877 0.2228 0.001 0.0914 0. 0.1117 0. 0.0062 0. ] 2022-08-24 10:59:28 [INFO] [EVAL] The model with the best validation mIoU (0.2038) was saved at iter 7000. 
2022-08-24 10:59:42 [INFO] [TRAIN] epoch: 6, iter: 7050/160000, loss: 1.2713, lr: 0.001158, batch_cost: 0.2789, reader_cost: 0.00363, ips: 28.6806 samples/sec | ETA 11:51:03 2022-08-24 10:59:55 [INFO] [TRAIN] epoch: 6, iter: 7100/160000, loss: 1.2801, lr: 0.001158, batch_cost: 0.2761, reader_cost: 0.00109, ips: 28.9761 samples/sec | ETA 11:43:34 2022-08-24 11:00:10 [INFO] [TRAIN] epoch: 6, iter: 7150/160000, loss: 1.3259, lr: 0.001157, batch_cost: 0.2848, reader_cost: 0.00045, ips: 28.0907 samples/sec | ETA 12:05:30 2022-08-24 11:00:25 [INFO] [TRAIN] epoch: 6, iter: 7200/160000, loss: 1.2783, lr: 0.001157, batch_cost: 0.2993, reader_cost: 0.00104, ips: 26.7278 samples/sec | ETA 12:42:15 2022-08-24 11:00:38 [INFO] [TRAIN] epoch: 6, iter: 7250/160000, loss: 1.2967, lr: 0.001156, batch_cost: 0.2645, reader_cost: 0.00069, ips: 30.2464 samples/sec | ETA 11:13:21 2022-08-24 11:00:51 [INFO] [TRAIN] epoch: 6, iter: 7300/160000, loss: 1.3812, lr: 0.001156, batch_cost: 0.2612, reader_cost: 0.00056, ips: 30.6250 samples/sec | ETA 11:04:48 2022-08-24 11:01:04 [INFO] [TRAIN] epoch: 6, iter: 7350/160000, loss: 1.3601, lr: 0.001156, batch_cost: 0.2634, reader_cost: 0.00042, ips: 30.3694 samples/sec | ETA 11:10:11 2022-08-24 11:01:18 [INFO] [TRAIN] epoch: 6, iter: 7400/160000, loss: 1.2828, lr: 0.001155, batch_cost: 0.2687, reader_cost: 0.00044, ips: 29.7761 samples/sec | ETA 11:23:19 2022-08-24 11:01:30 [INFO] [TRAIN] epoch: 6, iter: 7450/160000, loss: 1.3164, lr: 0.001155, batch_cost: 0.2524, reader_cost: 0.00053, ips: 31.6978 samples/sec | ETA 10:41:41 2022-08-24 11:01:42 [INFO] [TRAIN] epoch: 6, iter: 7500/160000, loss: 1.2200, lr: 0.001155, batch_cost: 0.2443, reader_cost: 0.00061, ips: 32.7463 samples/sec | ETA 10:20:56 2022-08-24 11:01:56 [INFO] [TRAIN] epoch: 6, iter: 7550/160000, loss: 1.3179, lr: 0.001154, batch_cost: 0.2620, reader_cost: 0.00035, ips: 30.5389 samples/sec | ETA 11:05:35 2022-08-24 11:02:12 [INFO] [TRAIN] epoch: 7, iter: 7600/160000, loss: 1.2555, lr: 0.001154, batch_cost: 0.3200, reader_cost: 0.03454, ips: 24.9985 samples/sec | ETA 13:32:50 2022-08-24 11:02:26 [INFO] [TRAIN] epoch: 7, iter: 7650/160000, loss: 1.1725, lr: 0.001153, batch_cost: 0.2830, reader_cost: 0.00075, ips: 28.2706 samples/sec | ETA 11:58:31 2022-08-24 11:02:40 [INFO] [TRAIN] epoch: 7, iter: 7700/160000, loss: 1.1985, lr: 0.001153, batch_cost: 0.2814, reader_cost: 0.00054, ips: 28.4329 samples/sec | ETA 11:54:11 2022-08-24 11:02:52 [INFO] [TRAIN] epoch: 7, iter: 7750/160000, loss: 1.2472, lr: 0.001153, batch_cost: 0.2493, reader_cost: 0.00064, ips: 32.0910 samples/sec | ETA 10:32:34 2022-08-24 11:03:06 [INFO] [TRAIN] epoch: 7, iter: 7800/160000, loss: 1.2313, lr: 0.001152, batch_cost: 0.2698, reader_cost: 0.00084, ips: 29.6519 samples/sec | ETA 11:24:23 2022-08-24 11:03:19 [INFO] [TRAIN] epoch: 7, iter: 7850/160000, loss: 1.3170, lr: 0.001152, batch_cost: 0.2726, reader_cost: 0.00085, ips: 29.3462 samples/sec | ETA 11:31:17 2022-08-24 11:03:33 [INFO] [TRAIN] epoch: 7, iter: 7900/160000, loss: 1.3257, lr: 0.001152, batch_cost: 0.2806, reader_cost: 0.00099, ips: 28.5142 samples/sec | ETA 11:51:13 2022-08-24 11:03:47 [INFO] [TRAIN] epoch: 7, iter: 7950/160000, loss: 1.1932, lr: 0.001151, batch_cost: 0.2666, reader_cost: 0.00056, ips: 30.0083 samples/sec | ETA 11:15:35 2022-08-24 11:04:01 [INFO] [TRAIN] epoch: 7, iter: 8000/160000, loss: 1.3179, lr: 0.001151, batch_cost: 0.2768, reader_cost: 0.00083, ips: 28.8991 samples/sec | ETA 11:41:17 2022-08-24 11:04:01 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 168s - batch_cost: 0.1680 - reader cost: 5.7847e-04 2022-08-24 11:06:49 [INFO] [EVAL] #Images: 2000 mIoU: 0.2051 Acc: 0.6959 Kappa: 0.6708 Dice: 0.2972 2022-08-24 11:06:49 [INFO] [EVAL] Class IoU: [0.5865 0.7127 0.9084 0.6337 0.6277 0.6901 0.7007 0.6366 0.4131 0.5745 0.3886 0.4514 0.6008 0.2551 0.0598 0.2303 0.4273 0.3427 0.4392 0.2872 0.6768 0.3911 0.4508 0.353 0.1974 0.3985 0.4105 0.167 0.2979 0.2004 0.0469 0.3327 0.1325 0.1436 0.2432 0.3111 0.2242 0.3075 0.1718 0.1614 0.056 0.0062 0.0963 0.1545 0.2221 0.0544 0.2125 0.2594 0.5235 0.3638 0.2816 0.2299 0.0923 0.1671 0.559 0.3612 0.7533 0.1839 0.0963 0.149 0.0057 0.0796 0.1873 0.0586 0.2444 0.4335 0.1102 0.3325 0.0085 0.1863 0.1491 0.2494 0.3213 0.1018 0.2819 0.1885 0.261 0.0184 0.0719 0.1151 0.0592 0.12 0.0613 0.0406 0.1737 0.3462 0. 0.023 0. 0.3418 0.2 0. 0.1484 0.0103 0. 0. 0.0376 0.0043 0.0018 0.0094 0. 0. 0.0324 0.2605 0.0319 0.1315 0.0479 0.3582 0.0011 0.0265 0.0002 0.0854 0.0024 0.2767 0.685 0. 0.2569 0.51 0. 0.3253 0.4407 0. 0. 0.0916 0.1138 0.0006 0.3614 0.2602 0.0009 0.006 0.3981 0. 0. 0.007 0.0279 0.0327 0.017 0.0001 0.0315 0.1262 0.007 0. 0.2541 0.0531 0.0458 0. 0.0606 0.0007 0.0112 0.0004] 2022-08-24 11:06:49 [INFO] [EVAL] Class Precision: [0.6801 0.7825 0.9547 0.7305 0.6988 0.7972 0.8365 0.7275 0.5429 0.7103 0.5159 0.6006 0.7053 0.5199 0.3656 0.4513 0.5694 0.6179 0.6107 0.4603 0.7829 0.5462 0.6439 0.4488 0.4078 0.6327 0.4752 0.6585 0.5326 0.2796 0.2865 0.4863 0.5203 0.2551 0.3682 0.5479 0.477 0.7374 0.4554 0.4684 0.1557 0.7775 0.732 0.4336 0.3637 0.7226 0.3583 0.4555 0.6707 0.5395 0.6152 0.3278 0.1891 0.7162 0.6434 0.4597 0.8979 0.4297 0.6838 0.5684 0.1536 0.1977 0.3628 0.7762 0.4046 0.4907 0.2632 0.4468 0.1274 0.7027 0.4703 0.3963 0.695 0.1812 0.6147 0.4795 0.5459 0.5708 0.1204 0.428 0.6373 0.5293 0.6465 0.0698 0.556 0.4599 0.0005 0.2619 0. 0.6265 0.2224 0. 0.4606 0.0969 0. 0. 0.45 0.5027 0.0878 0.1811 0. 0. 0.6363 0.7429 0.3157 0.9159 0.7485 0.7976 0.0183 0.0607 0.1156 0.1015 0.0654 0.5914 0.7201 0. 0.8865 0.7321 0. 0.4304 0.6245 0. 0. 0.5359 0.4308 0.1922 0.804 0.3573 0.4499 0.7811 0.82 0. 0. 0.2176 0.5998 0.2308 0.6706 0.0078 0.5642 0.612 0.4214 0. 0.7286 0.251 0.8199 0. 0.9244 0.7727 0.1613 0.067 ] 2022-08-24 11:06:49 [INFO] [EVAL] Class Recall: [0.81 0.8887 0.9494 0.8271 0.8606 0.8371 0.8119 0.836 0.6333 0.7502 0.6116 0.645 0.802 0.3338 0.0667 0.3199 0.6313 0.4349 0.61 0.4331 0.8331 0.5794 0.6005 0.6231 0.2768 0.5185 0.751 0.1828 0.4033 0.4142 0.0531 0.5131 0.151 0.2472 0.4174 0.4186 0.2974 0.3453 0.2163 0.1977 0.0804 0.0062 0.0999 0.1936 0.3634 0.0556 0.343 0.3759 0.7047 0.5276 0.3418 0.4352 0.1527 0.179 0.8099 0.6277 0.8238 0.2433 0.1008 0.168 0.0059 0.1176 0.2791 0.0596 0.3816 0.788 0.1593 0.5652 0.009 0.2022 0.1792 0.4023 0.374 0.1884 0.3424 0.237 0.3333 0.0186 0.1515 0.1361 0.0613 0.1344 0.0634 0.0882 0.2016 0.5833 0. 0.0246 0. 0.4293 0.6649 0. 0.1796 0.0114 0. 0. 0.0394 0.0044 0.0019 0.0098 0. 0. 0.033 0.2863 0.0343 0.1331 0.0486 0.3941 0.0012 0.0449 0.0002 0.3504 0.0025 0.342 0.9336 0. 0.2657 0.6271 0. 0.5711 0.5995 0. 0. 0.0995 0.1339 0.0006 0.3963 0.4892 0.0009 0.006 0.4362 0. 0. 0.0071 0.0285 0.0367 0.0171 0.0001 0.0323 0.1372 0.0071 0. 0.2807 0.063 0.0462 0. 0.0609 0.0007 0.0119 0.0004] 2022-08-24 11:06:49 [INFO] [EVAL] The model with the best validation mIoU (0.2051) was saved at iter 8000. 
2022-08-24 11:07:02 [INFO] [TRAIN] epoch: 7, iter: 8050/160000, loss: 1.3046, lr: 0.001150, batch_cost: 0.2654, reader_cost: 0.00330, ips: 30.1484 samples/sec | ETA 11:12:00 2022-08-24 11:07:15 [INFO] [TRAIN] epoch: 7, iter: 8100/160000, loss: 1.2682, lr: 0.001150, batch_cost: 0.2589, reader_cost: 0.00108, ips: 30.9035 samples/sec | ETA 10:55:22 2022-08-24 11:07:30 [INFO] [TRAIN] epoch: 7, iter: 8150/160000, loss: 1.2210, lr: 0.001150, batch_cost: 0.2866, reader_cost: 0.00045, ips: 27.9136 samples/sec | ETA 12:05:19 2022-08-24 11:07:45 [INFO] [TRAIN] epoch: 7, iter: 8200/160000, loss: 1.2527, lr: 0.001149, batch_cost: 0.3035, reader_cost: 0.00040, ips: 26.3551 samples/sec | ETA 12:47:58 2022-08-24 11:08:00 [INFO] [TRAIN] epoch: 7, iter: 8250/160000, loss: 1.2397, lr: 0.001149, batch_cost: 0.3028, reader_cost: 0.00051, ips: 26.4214 samples/sec | ETA 12:45:47 2022-08-24 11:08:14 [INFO] [TRAIN] epoch: 7, iter: 8300/160000, loss: 1.3140, lr: 0.001149, batch_cost: 0.2876, reader_cost: 0.00058, ips: 27.8124 samples/sec | ETA 12:07:15 2022-08-24 11:08:29 [INFO] [TRAIN] epoch: 7, iter: 8350/160000, loss: 1.2308, lr: 0.001148, batch_cost: 0.2951, reader_cost: 0.00052, ips: 27.1105 samples/sec | ETA 12:25:50 2022-08-24 11:08:43 [INFO] [TRAIN] epoch: 7, iter: 8400/160000, loss: 1.2866, lr: 0.001148, batch_cost: 0.2878, reader_cost: 0.00088, ips: 27.8003 samples/sec | ETA 12:07:05 2022-08-24 11:08:56 [INFO] [TRAIN] epoch: 7, iter: 8450/160000, loss: 1.2688, lr: 0.001147, batch_cost: 0.2583, reader_cost: 0.00044, ips: 30.9733 samples/sec | ETA 10:52:23 2022-08-24 11:09:11 [INFO] [TRAIN] epoch: 7, iter: 8500/160000, loss: 1.2135, lr: 0.001147, batch_cost: 0.2830, reader_cost: 0.00052, ips: 28.2667 samples/sec | ETA 11:54:37 2022-08-24 11:09:25 [INFO] [TRAIN] epoch: 7, iter: 8550/160000, loss: 1.2820, lr: 0.001147, batch_cost: 0.2901, reader_cost: 0.00072, ips: 27.5809 samples/sec | ETA 12:12:08 2022-08-24 11:09:39 [INFO] [TRAIN] epoch: 7, iter: 8600/160000, loss: 1.2619, lr: 0.001146, batch_cost: 0.2811, reader_cost: 0.00071, ips: 28.4549 samples/sec | ETA 11:49:25 2022-08-24 11:09:54 [INFO] [TRAIN] epoch: 7, iter: 8650/160000, loss: 1.2131, lr: 0.001146, batch_cost: 0.3007, reader_cost: 0.00087, ips: 26.6031 samples/sec | ETA 12:38:33 2022-08-24 11:10:08 [INFO] [TRAIN] epoch: 7, iter: 8700/160000, loss: 1.3483, lr: 0.001145, batch_cost: 0.2789, reader_cost: 0.00083, ips: 28.6805 samples/sec | ETA 11:43:22 2022-08-24 11:10:22 [INFO] [TRAIN] epoch: 7, iter: 8750/160000, loss: 1.2745, lr: 0.001145, batch_cost: 0.2733, reader_cost: 0.00103, ips: 29.2764 samples/sec | ETA 11:28:50 2022-08-24 11:10:35 [INFO] [TRAIN] epoch: 7, iter: 8800/160000, loss: 1.1612, lr: 0.001145, batch_cost: 0.2735, reader_cost: 0.00056, ips: 29.2475 samples/sec | ETA 11:29:17 2022-08-24 11:10:50 [INFO] [TRAIN] epoch: 8, iter: 8850/160000, loss: 1.2392, lr: 0.001144, batch_cost: 0.2984, reader_cost: 0.03876, ips: 26.8121 samples/sec | ETA 12:31:39 2022-08-24 11:11:04 [INFO] [TRAIN] epoch: 8, iter: 8900/160000, loss: 1.2436, lr: 0.001144, batch_cost: 0.2643, reader_cost: 0.00052, ips: 30.2661 samples/sec | ETA 11:05:39 2022-08-24 11:11:17 [INFO] [TRAIN] epoch: 8, iter: 8950/160000, loss: 1.2666, lr: 0.001144, batch_cost: 0.2705, reader_cost: 0.00041, ips: 29.5700 samples/sec | ETA 11:21:05 2022-08-24 11:11:31 [INFO] [TRAIN] epoch: 8, iter: 9000/160000, loss: 1.1916, lr: 0.001143, batch_cost: 0.2807, reader_cost: 0.00062, ips: 28.4995 samples/sec | ETA 11:46:26 2022-08-24 11:11:31 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 171s - batch_cost: 0.1713 - reader cost: 5.9244e-04 2022-08-24 11:14:23 [INFO] [EVAL] #Images: 2000 mIoU: 0.2134 Acc: 0.7002 Kappa: 0.6755 Dice: 0.3096 2022-08-24 11:14:23 [INFO] [EVAL] Class IoU: [0.5905 0.7336 0.9112 0.646 0.6246 0.6983 0.6948 0.6314 0.4165 0.6074 0.3875 0.4336 0.6088 0.2685 0.0697 0.2663 0.4105 0.3639 0.4513 0.2934 0.6436 0.31 0.462 0.3492 0.2315 0.4213 0.4011 0.1862 0.2766 0.2251 0.1318 0.388 0.1599 0.152 0.2473 0.294 0.2355 0.4204 0.1562 0.1974 0.0108 0.0291 0.2076 0.1512 0.223 0.1328 0.2017 0.317 0.3453 0.3818 0.2855 0.2165 0.0972 0.1984 0.6123 0.2807 0.7723 0.1955 0.115 0.1328 0.0477 0.1192 0.2312 0.0494 0.2758 0.5192 0.1614 0.333 0.0022 0.1599 0.1952 0.2682 0.3445 0.1534 0.2858 0.1517 0.3155 0.0212 0.0392 0.1024 0.4892 0.128 0.138 0. 0.2797 0.3623 0.0043 0.0313 0. 0.3235 0.3254 0.0053 0.1611 0.0176 0.0097 0.001 0.0064 0.0107 0.0043 0.0329 0.0017 0. 0.0002 0.1106 0.0148 0.1486 0.001 0.4323 0.0005 0.0414 0. 0.1979 0. 0.3122 0.4553 0. 0.2999 0.4721 0.0008 0.1265 0.4139 0. 0. 0.0226 0.1683 0.0008 0.2098 0.2631 0. 0.0192 0.4052 0. 0. 0.1158 0.0298 0.0522 0.0143 0.0002 0.0405 0.1913 0.0055 0.0048 0.1557 0.0392 0.1133 0. 0.0987 0.0101 0.0018 0.0068] 2022-08-24 11:14:23 [INFO] [EVAL] Class Precision: [0.6791 0.8058 0.9505 0.7501 0.7207 0.7788 0.8056 0.724 0.5609 0.7398 0.6765 0.5907 0.7224 0.5068 0.3907 0.4303 0.6267 0.6306 0.6402 0.4204 0.696 0.5546 0.6619 0.4865 0.3795 0.5845 0.4638 0.6394 0.6399 0.3267 0.2832 0.5733 0.4196 0.2958 0.2847 0.6894 0.5503 0.6634 0.4044 0.4545 0.3702 0.2 0.6095 0.418 0.3826 0.2393 0.4811 0.6074 0.7324 0.5261 0.5474 0.2997 0.3919 0.6699 0.6776 0.3314 0.886 0.4344 0.4517 0.2983 0.0715 0.2968 0.4512 0.8237 0.444 0.6272 0.3043 0.5371 0.2123 0.7542 0.5296 0.3348 0.5773 0.2717 0.5434 0.6193 0.5394 0.5694 0.3216 0.2419 0.9264 0.6651 0.6042 0.0016 0.5219 0.6097 0.1448 0.284 0. 0.5562 0.4438 1. 0.2382 0.1443 0.4749 0.0163 0.0505 0.3682 0.1405 0.7047 1. 0. 0.2955 0.8728 0.6808 0.1756 0.0575 0.7452 0.008 0.0812 0.0185 0.4646 0. 0.7397 0.4612 0. 0.815 0.6053 0.0053 0.5441 0.6834 0. 0. 0.7055 0.3328 0.1828 0.9458 0.3778 0. 0.38 0.6888 0. 0. 0.2161 0.7134 0.156 0.4319 0.0058 0.4651 0.5383 0.0823 0.0195 0.8907 0.55 0.6764 0. 0.9071 0.4295 0.0549 0.4414] 2022-08-24 11:14:23 [INFO] [EVAL] Class Recall: [0.8189 0.8912 0.9566 0.8232 0.8241 0.8711 0.8348 0.8317 0.618 0.7724 0.4757 0.6198 0.7948 0.3635 0.0782 0.4113 0.5434 0.4625 0.6047 0.4927 0.8953 0.4128 0.6048 0.553 0.3726 0.6014 0.7482 0.208 0.3276 0.4198 0.1977 0.5455 0.2053 0.2383 0.6531 0.3389 0.2917 0.5344 0.2029 0.2587 0.011 0.033 0.2394 0.1915 0.3482 0.2298 0.2578 0.3987 0.3951 0.5821 0.3738 0.4381 0.1144 0.2199 0.8639 0.6471 0.8576 0.2623 0.1337 0.1932 0.1255 0.166 0.3216 0.05 0.4213 0.7509 0.2557 0.4671 0.0023 0.1687 0.2361 0.5741 0.4607 0.2606 0.3761 0.1673 0.4319 0.0216 0.0428 0.1508 0.509 0.1368 0.1518 0.0001 0.3761 0.4718 0.0044 0.034 0. 0.4361 0.5495 0.0053 0.3323 0.0197 0.0098 0.001 0.0072 0.0109 0.0044 0.0333 0.0017 0. 0.0002 0.1124 0.0149 0.4908 0.0011 0.5073 0.0005 0.0779 0. 0.2565 0. 0.3507 0.9724 0. 0.3218 0.6821 0.0009 0.1416 0.5121 0. 0. 0.0228 0.2539 0.0008 0.2123 0.4643 0. 0.0198 0.496 0. 0. 0.1996 0.0301 0.0727 0.0146 0.0002 0.0425 0.2289 0.0058 0.0063 0.1588 0.0405 0.1197 0. 0.0997 0.0102 0.0019 0.0068] 2022-08-24 11:14:23 [INFO] [EVAL] The model with the best validation mIoU (0.2134) was saved at iter 9000. 
2022-08-24 11:14:37 [INFO] [TRAIN] epoch: 8, iter: 9050/160000, loss: 1.1232, lr: 0.001143, batch_cost: 0.2889, reader_cost: 0.00495, ips: 27.6957 samples/sec | ETA 12:06:42 2022-08-24 11:14:52 [INFO] [TRAIN] epoch: 8, iter: 9100/160000, loss: 1.2513, lr: 0.001142, batch_cost: 0.2832, reader_cost: 0.00093, ips: 28.2472 samples/sec | ETA 11:52:16 2022-08-24 11:15:05 [INFO] [TRAIN] epoch: 8, iter: 9150/160000, loss: 1.2338, lr: 0.001142, batch_cost: 0.2618, reader_cost: 0.00124, ips: 30.5518 samples/sec | ETA 10:58:20 2022-08-24 11:15:18 [INFO] [TRAIN] epoch: 8, iter: 9200/160000, loss: 1.2302, lr: 0.001142, batch_cost: 0.2651, reader_cost: 0.00062, ips: 30.1772 samples/sec | ETA 11:06:17 2022-08-24 11:15:31 [INFO] [TRAIN] epoch: 8, iter: 9250/160000, loss: 1.1848, lr: 0.001141, batch_cost: 0.2663, reader_cost: 0.00177, ips: 30.0367 samples/sec | ETA 11:09:10 2022-08-24 11:15:45 [INFO] [TRAIN] epoch: 8, iter: 9300/160000, loss: 1.2667, lr: 0.001141, batch_cost: 0.2656, reader_cost: 0.00117, ips: 30.1204 samples/sec | ETA 11:07:05 2022-08-24 11:15:55 [INFO] [TRAIN] epoch: 8, iter: 9350/160000, loss: 1.1201, lr: 0.001141, batch_cost: 0.2173, reader_cost: 0.00092, ips: 36.8226 samples/sec | ETA 09:05:29 2022-08-24 11:16:06 [INFO] [TRAIN] epoch: 8, iter: 9400/160000, loss: 1.2091, lr: 0.001140, batch_cost: 0.2158, reader_cost: 0.00107, ips: 37.0775 samples/sec | ETA 09:01:34 2022-08-24 11:16:17 [INFO] [TRAIN] epoch: 8, iter: 9450/160000, loss: 1.1389, lr: 0.001140, batch_cost: 0.2187, reader_cost: 0.00067, ips: 36.5756 samples/sec | ETA 09:08:49 2022-08-24 11:16:27 [INFO] [TRAIN] epoch: 8, iter: 9500/160000, loss: 1.2167, lr: 0.001139, batch_cost: 0.2048, reader_cost: 0.00081, ips: 39.0556 samples/sec | ETA 08:33:47 2022-08-24 11:16:38 [INFO] [TRAIN] epoch: 8, iter: 9550/160000, loss: 1.2620, lr: 0.001139, batch_cost: 0.2187, reader_cost: 0.00054, ips: 36.5872 samples/sec | ETA 09:08:16 2022-08-24 11:16:50 [INFO] [TRAIN] epoch: 8, iter: 9600/160000, loss: 1.2245, lr: 0.001139, batch_cost: 0.2332, reader_cost: 0.00059, ips: 34.3101 samples/sec | ETA 09:44:28 2022-08-24 11:17:04 [INFO] [TRAIN] epoch: 8, iter: 9650/160000, loss: 1.2280, lr: 0.001138, batch_cost: 0.2752, reader_cost: 0.00077, ips: 29.0664 samples/sec | ETA 11:29:41 2022-08-24 11:17:16 [INFO] [TRAIN] epoch: 8, iter: 9700/160000, loss: 1.2065, lr: 0.001138, batch_cost: 0.2499, reader_cost: 0.00088, ips: 32.0084 samples/sec | ETA 10:26:05 2022-08-24 11:17:29 [INFO] [TRAIN] epoch: 8, iter: 9750/160000, loss: 1.2546, lr: 0.001138, batch_cost: 0.2609, reader_cost: 0.00056, ips: 30.6650 samples/sec | ETA 10:53:17 2022-08-24 11:17:42 [INFO] [TRAIN] epoch: 8, iter: 9800/160000, loss: 1.1630, lr: 0.001137, batch_cost: 0.2578, reader_cost: 0.00044, ips: 31.0355 samples/sec | ETA 10:45:16 2022-08-24 11:17:57 [INFO] [TRAIN] epoch: 8, iter: 9850/160000, loss: 1.1742, lr: 0.001137, batch_cost: 0.2937, reader_cost: 0.00073, ips: 27.2418 samples/sec | ETA 12:14:54 2022-08-24 11:18:13 [INFO] [TRAIN] epoch: 8, iter: 9900/160000, loss: 1.1286, lr: 0.001136, batch_cost: 0.3152, reader_cost: 0.00092, ips: 25.3793 samples/sec | ETA 13:08:34 2022-08-24 11:18:27 [INFO] [TRAIN] epoch: 8, iter: 9950/160000, loss: 1.2340, lr: 0.001136, batch_cost: 0.2946, reader_cost: 0.00065, ips: 27.1531 samples/sec | ETA 12:16:48 2022-08-24 11:18:41 [INFO] [TRAIN] epoch: 8, iter: 10000/160000, loss: 1.2492, lr: 0.001136, batch_cost: 0.2718, reader_cost: 0.00062, ips: 29.4310 samples/sec | ETA 11:19:33 2022-08-24 11:18:41 [INFO] Start evaluating (total_samples: 2000, 
total_iters: 1000)... 1000/1000 - 203s - batch_cost: 0.2029 - reader cost: 0.0010 2022-08-24 11:22:04 [INFO] [EVAL] #Images: 2000 mIoU: 0.2150 Acc: 0.7002 Kappa: 0.6754 Dice: 0.3111 2022-08-24 11:22:04 [INFO] [EVAL] Class IoU: [0.5965 0.7251 0.9102 0.6481 0.6212 0.6961 0.699 0.6374 0.4226 0.593 0.3909 0.3656 0.6064 0.2233 0.1242 0.2529 0.4503 0.2815 0.4619 0.2754 0.664 0.4063 0.4517 0.3539 0.1707 0.447 0.3675 0.2132 0.2567 0.1658 0.1329 0.3596 0.1867 0.1579 0.3001 0.3126 0.2397 0.2274 0.1664 0.2289 0.0385 0.0242 0.1522 0.1431 0.2207 0.1045 0.2409 0.2597 0.4545 0.3894 0.2832 0.2173 0.0679 0.1175 0.623 0.261 0.6052 0.0609 0.0533 0.1414 0.0508 0.1359 0.1982 0.1121 0.2687 0.4463 0.154 0.3312 0.0577 0.187 0.1662 0.2629 0.3538 0.1782 0.2734 0.2269 0.2742 0.025 0.1203 0.009 0.5877 0.1267 0.1128 0.0023 0.2687 0.3556 0.0045 0.0171 0.008 0.3138 0.3102 0.0007 0.1774 0.0154 0.0235 0. 0.0771 0.0021 0.0018 0.1129 0. 0.0049 0.0229 0.0056 0.0142 0.2108 0.0039 0.3865 0.0117 0.0443 0.0004 0.1533 0.0096 0.3955 0.8136 0. 0.2047 0.4895 0.0134 0.19 0.3063 0. 0.0028 0.0366 0.0822 0.0034 0.3861 0.3276 0. 0.0044 0.4597 0. 0. 0.047 0.0229 0.0193 0.0365 0.0014 0.0148 0.1972 0.0646 0.0325 0.2828 0.0161 0.2031 0. 0.1145 0.0019 0.0065 0.0012] 2022-08-24 11:22:04 [INFO] [EVAL] Class Precision: [0.6827 0.798 0.9501 0.7641 0.7074 0.8034 0.7656 0.7197 0.5999 0.6341 0.6007 0.7095 0.688 0.521 0.3411 0.4462 0.6286 0.6767 0.6745 0.5094 0.7489 0.5317 0.6751 0.5101 0.4084 0.5578 0.4475 0.6314 0.5983 0.3804 0.2508 0.4881 0.4302 0.2849 0.3733 0.5941 0.5384 0.7184 0.4207 0.4659 0.2315 0.3733 0.6821 0.4714 0.2758 0.3002 0.364 0.3576 0.7054 0.5208 0.5863 0.2877 0.341 0.8254 0.7081 0.302 0.6325 0.567 0.7074 0.3013 0.3125 0.2903 0.3067 0.6502 0.3862 0.4948 0.2753 0.4189 0.3037 0.7007 0.3007 0.4101 0.6382 0.3649 0.3572 0.4014 0.588 0.405 0.542 0.2118 0.9102 0.2918 0.6697 0.0203 0.5968 0.5954 0.072 0.3438 0.1374 0.6516 0.3943 0.2235 0.3624 0.1593 0.3618 0. 0.2026 0.268 0.0877 0.3943 1. 0.0127 0.4038 1. 0.0896 0.8013 0.0832 0.5176 0.0699 0.3149 0.0389 0.3746 0.4222 0.6489 0.849 0. 0.8087 0.7295 0.0343 0.5094 0.7756 0. 0.2779 0.618 0.4794 0.2642 0.7042 0.4854 0. 0.2796 0.6413 0. 0. 0.2813 0.5055 0.2143 0.5873 0.0211 0.2991 0.6229 0.0731 0.0469 0.6639 0.1516 0.4891 0. 0.9189 0.5468 0.1111 0.1517] 2022-08-24 11:22:04 [INFO] [EVAL] Class Recall: [0.8253 0.8881 0.9559 0.8102 0.8361 0.839 0.8893 0.8479 0.5885 0.9014 0.5281 0.43 0.8365 0.2809 0.1634 0.3685 0.6135 0.3252 0.5945 0.3749 0.8542 0.6329 0.5772 0.5362 0.2268 0.6923 0.6728 0.2436 0.3101 0.2272 0.2204 0.5773 0.2481 0.2617 0.6047 0.3975 0.3018 0.2496 0.2158 0.3103 0.0441 0.0252 0.1639 0.1705 0.5248 0.1382 0.416 0.4866 0.561 0.607 0.3539 0.4701 0.0781 0.1205 0.8382 0.6577 0.9335 0.0639 0.0545 0.2104 0.0572 0.2035 0.359 0.1193 0.4689 0.8198 0.2589 0.6126 0.0664 0.2032 0.2707 0.4229 0.4425 0.2583 0.5383 0.3429 0.3394 0.0259 0.1339 0.0093 0.6239 0.183 0.1194 0.0025 0.3282 0.469 0.0048 0.0177 0.0084 0.3771 0.5925 0.0007 0.2579 0.0167 0.0245 0. 0.1106 0.0021 0.0018 0.1365 0. 0.0078 0.0237 0.0056 0.0166 0.2224 0.0041 0.6043 0.0139 0.049 0.0004 0.2061 0.0097 0.5031 0.9512 0. 0.2151 0.598 0.0216 0.2325 0.3361 0. 0.0028 0.0374 0.0902 0.0034 0.4609 0.5019 0. 0.0045 0.6188 0. 0. 0.0534 0.0234 0.0207 0.0374 0.0015 0.0154 0.2239 0.358 0.0962 0.3301 0.0177 0.2577 0. 0.1157 0.0019 0.0068 0.0013] 2022-08-24 11:22:04 [INFO] [EVAL] The model with the best validation mIoU (0.2150) was saved at iter 10000. 
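For tracking these runs offline, the [TRAIN] entries are regular enough to parse with a single regular expression. Below is a minimal sketch under that assumption (the pattern mirrors the exact field order shown in this log, and "workerlog.0" is only an example path): it yields one record per 50-iteration entry, from which loss, lr, batch_cost and ips curves can be plotted.

import re

# Minimal log-parsing sketch; assumes the exact [TRAIN] field order shown in
# this log and will need adjusting if the format ever changes.
TRAIN_RE = re.compile(
    r"\[TRAIN\] epoch: (?P<epoch>\d+), iter: (?P<iter>\d+)/(?P<total>\d+), "
    r"loss: (?P<loss>[\d.]+), lr: (?P<lr>[\d.]+), "
    r"batch_cost: (?P<batch_cost>[\d.]+), reader_cost: (?P<reader_cost>[\d.]+), "
    r"ips: (?P<ips>[\d.]+) samples/sec"
)

def parse_train_records(path):
    """Yield one dict per [TRAIN] entry found in the log file."""
    with open(path) as f:
        for line in f:
            # A single physical line in this capture can hold several entries.
            for m in TRAIN_RE.finditer(line):
                rec = {k: float(v) for k, v in m.groupdict().items()}
                for k in ("epoch", "iter", "total"):
                    rec[k] = int(rec[k])
                yield rec

# Example usage (the path is illustrative):
# for rec in parse_train_records("workerlog.0"):
#     print(rec["iter"], rec["loss"], rec["lr"])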
2022-08-24 11:22:19 [INFO] [TRAIN] epoch: 8, iter: 10050/160000, loss: 1.2616, lr: 0.001135, batch_cost: 0.2856, reader_cost: 0.00357, ips: 28.0108 samples/sec | ETA 11:53:46 2022-08-24 11:22:33 [INFO] [TRAIN] epoch: 8, iter: 10100/160000, loss: 1.1736, lr: 0.001135, batch_cost: 0.2881, reader_cost: 0.00209, ips: 27.7699 samples/sec | ETA 11:59:43 2022-08-24 11:22:48 [INFO] [TRAIN] epoch: 9, iter: 10150/160000, loss: 1.1732, lr: 0.001135, batch_cost: 0.2958, reader_cost: 0.03106, ips: 27.0491 samples/sec | ETA 12:18:39 2022-08-24 11:23:01 [INFO] [TRAIN] epoch: 9, iter: 10200/160000, loss: 1.0965, lr: 0.001134, batch_cost: 0.2695, reader_cost: 0.00061, ips: 29.6875 samples/sec | ETA 11:12:47 2022-08-24 11:23:15 [INFO] [TRAIN] epoch: 9, iter: 10250/160000, loss: 1.2356, lr: 0.001134, batch_cost: 0.2789, reader_cost: 0.00049, ips: 28.6806 samples/sec | ETA 11:36:10 2022-08-24 11:23:29 [INFO] [TRAIN] epoch: 9, iter: 10300/160000, loss: 1.2650, lr: 0.001133, batch_cost: 0.2731, reader_cost: 0.00042, ips: 29.2935 samples/sec | ETA 11:21:22 2022-08-24 11:23:43 [INFO] [TRAIN] epoch: 9, iter: 10350/160000, loss: 1.1159, lr: 0.001133, batch_cost: 0.2909, reader_cost: 0.00079, ips: 27.5049 samples/sec | ETA 12:05:26 2022-08-24 11:23:58 [INFO] [TRAIN] epoch: 9, iter: 10400/160000, loss: 1.2025, lr: 0.001133, batch_cost: 0.2825, reader_cost: 0.00042, ips: 28.3232 samples/sec | ETA 11:44:15 2022-08-24 11:24:11 [INFO] [TRAIN] epoch: 9, iter: 10450/160000, loss: 1.1396, lr: 0.001132, batch_cost: 0.2727, reader_cost: 0.00034, ips: 29.3323 samples/sec | ETA 11:19:47 2022-08-24 11:24:25 [INFO] [TRAIN] epoch: 9, iter: 10500/160000, loss: 1.2230, lr: 0.001132, batch_cost: 0.2800, reader_cost: 0.00061, ips: 28.5675 samples/sec | ETA 11:37:45 2022-08-24 11:24:40 [INFO] [TRAIN] epoch: 9, iter: 10550/160000, loss: 1.2508, lr: 0.001131, batch_cost: 0.2997, reader_cost: 0.00065, ips: 26.6962 samples/sec | ETA 12:26:25 2022-08-24 11:24:54 [INFO] [TRAIN] epoch: 9, iter: 10600/160000, loss: 1.2120, lr: 0.001131, batch_cost: 0.2751, reader_cost: 0.00078, ips: 29.0841 samples/sec | ETA 11:24:54 2022-08-24 11:25:08 [INFO] [TRAIN] epoch: 9, iter: 10650/160000, loss: 1.2027, lr: 0.001131, batch_cost: 0.2887, reader_cost: 0.00055, ips: 27.7146 samples/sec | ETA 11:58:30 2022-08-24 11:25:21 [INFO] [TRAIN] epoch: 9, iter: 10700/160000, loss: 1.0991, lr: 0.001130, batch_cost: 0.2577, reader_cost: 0.00043, ips: 31.0412 samples/sec | ETA 10:41:17 2022-08-24 11:25:36 [INFO] [TRAIN] epoch: 9, iter: 10750/160000, loss: 1.1856, lr: 0.001130, batch_cost: 0.2973, reader_cost: 0.00057, ips: 26.9044 samples/sec | ETA 12:19:39 2022-08-24 11:25:50 [INFO] [TRAIN] epoch: 9, iter: 10800/160000, loss: 1.2277, lr: 0.001130, batch_cost: 0.2839, reader_cost: 0.00041, ips: 28.1811 samples/sec | ETA 11:45:54 2022-08-24 11:26:04 [INFO] [TRAIN] epoch: 9, iter: 10850/160000, loss: 1.1491, lr: 0.001129, batch_cost: 0.2748, reader_cost: 0.00102, ips: 29.1149 samples/sec | ETA 11:23:02 2022-08-24 11:26:17 [INFO] [TRAIN] epoch: 9, iter: 10900/160000, loss: 1.1557, lr: 0.001129, batch_cost: 0.2678, reader_cost: 0.00068, ips: 29.8783 samples/sec | ETA 11:05:21 2022-08-24 11:26:27 [INFO] [TRAIN] epoch: 9, iter: 10950/160000, loss: 1.1880, lr: 0.001128, batch_cost: 0.1954, reader_cost: 0.00071, ips: 40.9370 samples/sec | ETA 08:05:27 2022-08-24 11:26:38 [INFO] [TRAIN] epoch: 9, iter: 11000/160000, loss: 1.1691, lr: 0.001128, batch_cost: 0.2205, reader_cost: 0.00062, ips: 36.2829 samples/sec | ETA 09:07:32 2022-08-24 11:26:38 [INFO] Start evaluating 
(total_samples: 2000, total_iters: 1000)... 1000/1000 - 209s - batch_cost: 0.2090 - reader cost: 9.8970e-04 2022-08-24 11:30:08 [INFO] [EVAL] #Images: 2000 mIoU: 0.2257 Acc: 0.7037 Kappa: 0.6799 Dice: 0.3256 2022-08-24 11:30:08 [INFO] [EVAL] Class IoU: [0.6002 0.7226 0.9145 0.6453 0.6142 0.7009 0.7051 0.6458 0.4255 0.6264 0.3792 0.4473 0.6112 0.2708 0.0494 0.2523 0.455 0.2895 0.4381 0.2643 0.6921 0.3575 0.4488 0.3469 0.2477 0.4186 0.3492 0.2196 0.282 0.2277 0.0539 0.3364 0.124 0.1705 0.2319 0.3447 0.2421 0.4082 0.1532 0.2212 0.036 0.0295 0.191 0.1564 0.2206 0.1558 0.2455 0.3048 0.5442 0.377 0.2886 0.2305 0.048 0.207 0.6015 0.3435 0.7442 0.1361 0.2305 0.1376 0.0048 0.0917 0.2586 0.0622 0.2622 0.5029 0.1532 0.3599 0.0482 0.2218 0.2251 0.2188 0.2346 0.2052 0.2962 0.2204 0.2177 0.0688 0.1193 0.0926 0.4839 0.1355 0.1752 0.0004 0.1391 0.3711 0.0083 0.0199 0.0004 0.3652 0.3209 0. 0.1511 0.0091 0. 0. 0.0685 0. 0.0026 0.0413 0. 0.0002 0.054 0.4352 0.0119 0.2681 0.0025 0.4065 0.0194 0.0974 0.0213 0.1551 0.0001 0.3915 0.6723 0. 0.229 0.5416 0.0459 0.3774 0.4402 0. 0. 0.0407 0.1423 0.0079 0.2919 0.3147 0. 0.0577 0.4352 0. 0. 0.0255 0.0395 0.033 0.0353 0.0015 0.0664 0.1678 0.1476 0.0304 0.2543 0.0081 0.12 0. 0.1275 0.0003 0.0113 0.0024] 2022-08-24 11:30:08 [INFO] [EVAL] Class Precision: [0.71 0.7889 0.9516 0.7369 0.6907 0.8096 0.8081 0.7296 0.5427 0.7501 0.4764 0.6276 0.6986 0.5553 0.4668 0.4965 0.5572 0.6369 0.673 0.5132 0.796 0.5703 0.5531 0.4095 0.4596 0.5902 0.3951 0.5992 0.6149 0.422 0.2582 0.4182 0.495 0.3149 0.3492 0.5671 0.5423 0.6577 0.3933 0.3963 0.2181 0.3371 0.6076 0.4906 0.3711 0.3927 0.392 0.5664 0.6857 0.5736 0.5121 0.3049 0.2864 0.5907 0.6325 0.4212 0.8266 0.5247 0.4645 0.467 0.2484 0.2963 0.4634 0.8669 0.4238 0.5731 0.2503 0.5367 0.1206 0.6062 0.5309 0.3884 0.8416 0.3295 0.4276 0.3957 0.4707 0.5287 0.5491 0.3293 0.7093 0.456 0.6525 0.0051 0.5633 0.5076 0.0745 0.3053 0.0779 0.6047 0.4111 0. 0.3352 0.1332 0. 0. 0.2335 0. 0.2346 0.5 0. 0.0007 0.5215 0.8097 0.264 0.5556 0.3171 0.8786 0.1081 0.1615 0.2306 0.2968 0.0078 0.6756 0.6814 0. 0.9098 0.7334 0.0616 0.5628 0.6672 0. 0. 0.7267 0.4102 0.2966 0.8985 0.4444 0. 0.5258 0.6333 0. 0. 0.3792 0.7316 0.2341 0.5033 0.0422 0.3794 0.6352 0.3936 0.0526 0.77 0.3277 0.5454 0. 0.9405 0.9194 0.322 0.1216] 2022-08-24 11:30:08 [INFO] [EVAL] Class Recall: [0.7953 0.8959 0.9591 0.8385 0.8472 0.8393 0.8469 0.8489 0.6632 0.7915 0.6503 0.609 0.8301 0.3457 0.0524 0.339 0.7127 0.3468 0.5566 0.3527 0.8414 0.4893 0.7041 0.6943 0.3496 0.5901 0.7506 0.2574 0.3424 0.3308 0.0637 0.6323 0.1419 0.2709 0.4085 0.4677 0.3043 0.5183 0.2005 0.3337 0.0413 0.0313 0.2178 0.1867 0.3523 0.2052 0.3966 0.3975 0.7252 0.5237 0.398 0.4855 0.0545 0.2416 0.9246 0.6508 0.8819 0.1552 0.314 0.1633 0.0048 0.1173 0.3691 0.0628 0.4074 0.8043 0.2831 0.5221 0.0743 0.2591 0.281 0.3338 0.2455 0.3524 0.4909 0.3323 0.2883 0.0733 0.1323 0.1141 0.6036 0.1616 0.1932 0.0004 0.1559 0.5799 0.0093 0.0208 0.0004 0.4798 0.594 0. 0.2157 0.0097 0. 0. 0.0884 0. 0.0026 0.0431 0. 0.0002 0.0568 0.4848 0.0123 0.3412 0.0025 0.4306 0.0231 0.1972 0.0229 0.2453 0.0001 0.4822 0.9804 0. 0.2343 0.6743 0.1528 0.5339 0.564 0. 0. 0.0413 0.1789 0.0081 0.3018 0.5189 0. 0.0608 0.5818 0. 0. 0.0266 0.0401 0.037 0.0366 0.0015 0.0744 0.1857 0.191 0.0673 0.2752 0.0082 0.1334 0. 0.1285 0.0003 0.0115 0.0025] 2022-08-24 11:30:08 [INFO] [EVAL] The model with the best validation mIoU (0.2257) was saved at iter 11000. 
2022-08-24 11:30:22 [INFO] [TRAIN] epoch: 9, iter: 11050/160000, loss: 1.1267, lr: 0.001128, batch_cost: 0.2825, reader_cost: 0.00403, ips: 28.3191 samples/sec | ETA 11:41:17 2022-08-24 11:30:35 [INFO] [TRAIN] epoch: 9, iter: 11100/160000, loss: 1.2416, lr: 0.001127, batch_cost: 0.2679, reader_cost: 0.00084, ips: 29.8565 samples/sec | ETA 11:04:57 2022-08-24 11:30:50 [INFO] [TRAIN] epoch: 9, iter: 11150/160000, loss: 1.2415, lr: 0.001127, batch_cost: 0.2845, reader_cost: 0.00083, ips: 28.1153 samples/sec | ETA 11:45:54 2022-08-24 11:31:05 [INFO] [TRAIN] epoch: 9, iter: 11200/160000, loss: 1.0846, lr: 0.001127, batch_cost: 0.3095, reader_cost: 0.00052, ips: 25.8502 samples/sec | ETA 12:47:29 2022-08-24 11:31:20 [INFO] [TRAIN] epoch: 9, iter: 11250/160000, loss: 1.1812, lr: 0.001126, batch_cost: 0.2931, reader_cost: 0.00046, ips: 27.2903 samples/sec | ETA 12:06:45 2022-08-24 11:31:35 [INFO] [TRAIN] epoch: 9, iter: 11300/160000, loss: 1.1757, lr: 0.001126, batch_cost: 0.2997, reader_cost: 0.00061, ips: 26.6970 samples/sec | ETA 12:22:39 2022-08-24 11:31:48 [INFO] [TRAIN] epoch: 9, iter: 11350/160000, loss: 1.2559, lr: 0.001125, batch_cost: 0.2770, reader_cost: 0.00033, ips: 28.8808 samples/sec | ETA 11:26:16 2022-08-24 11:32:05 [INFO] [TRAIN] epoch: 10, iter: 11400/160000, loss: 1.2559, lr: 0.001125, batch_cost: 0.3267, reader_cost: 0.04153, ips: 24.4882 samples/sec | ETA 13:29:05 2022-08-24 11:32:19 [INFO] [TRAIN] epoch: 10, iter: 11450/160000, loss: 1.1351, lr: 0.001125, batch_cost: 0.2897, reader_cost: 0.00076, ips: 27.6185 samples/sec | ETA 11:57:09 2022-08-24 11:32:34 [INFO] [TRAIN] epoch: 10, iter: 11500/160000, loss: 1.1040, lr: 0.001124, batch_cost: 0.2903, reader_cost: 0.00064, ips: 27.5606 samples/sec | ETA 11:58:25 2022-08-24 11:32:48 [INFO] [TRAIN] epoch: 10, iter: 11550/160000, loss: 1.2265, lr: 0.001124, batch_cost: 0.2887, reader_cost: 0.00068, ips: 27.7126 samples/sec | ETA 11:54:14 2022-08-24 11:33:01 [INFO] [TRAIN] epoch: 10, iter: 11600/160000, loss: 1.1570, lr: 0.001124, batch_cost: 0.2591, reader_cost: 0.00058, ips: 30.8720 samples/sec | ETA 10:40:55 2022-08-24 11:33:15 [INFO] [TRAIN] epoch: 10, iter: 11650/160000, loss: 1.1577, lr: 0.001123, batch_cost: 0.2822, reader_cost: 0.00095, ips: 28.3514 samples/sec | ETA 11:37:40 2022-08-24 11:33:30 [INFO] [TRAIN] epoch: 10, iter: 11700/160000, loss: 1.1126, lr: 0.001123, batch_cost: 0.2989, reader_cost: 0.00047, ips: 26.7678 samples/sec | ETA 12:18:41 2022-08-24 11:33:45 [INFO] [TRAIN] epoch: 10, iter: 11750/160000, loss: 1.1863, lr: 0.001122, batch_cost: 0.2989, reader_cost: 0.00088, ips: 26.7670 samples/sec | ETA 12:18:28 2022-08-24 11:33:59 [INFO] [TRAIN] epoch: 10, iter: 11800/160000, loss: 1.0560, lr: 0.001122, batch_cost: 0.2739, reader_cost: 0.00090, ips: 29.2079 samples/sec | ETA 11:16:31 2022-08-24 11:34:13 [INFO] [TRAIN] epoch: 10, iter: 11850/160000, loss: 1.1645, lr: 0.001122, batch_cost: 0.2868, reader_cost: 0.00066, ips: 27.8949 samples/sec | ETA 11:48:08 2022-08-24 11:34:28 [INFO] [TRAIN] epoch: 10, iter: 11900/160000, loss: 1.1858, lr: 0.001121, batch_cost: 0.2989, reader_cost: 0.00042, ips: 26.7682 samples/sec | ETA 12:17:41 2022-08-24 11:34:42 [INFO] [TRAIN] epoch: 10, iter: 11950/160000, loss: 1.2785, lr: 0.001121, batch_cost: 0.2725, reader_cost: 0.00065, ips: 29.3585 samples/sec | ETA 11:12:22 2022-08-24 11:34:56 [INFO] [TRAIN] epoch: 10, iter: 12000/160000, loss: 1.1643, lr: 0.001121, batch_cost: 0.2874, reader_cost: 0.00117, ips: 27.8319 samples/sec | ETA 11:49:01 2022-08-24 11:34:56 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 179s - batch_cost: 0.1787 - reader cost: 0.0012 2022-08-24 11:37:55 [INFO] [EVAL] #Images: 2000 mIoU: 0.2368 Acc: 0.7069 Kappa: 0.6843 Dice: 0.3422 2022-08-24 11:37:55 [INFO] [EVAL] Class IoU: [0.6051 0.7266 0.9137 0.6527 0.6288 0.7067 0.713 0.6182 0.4304 0.6262 0.4016 0.459 0.599 0.3072 0.1523 0.2708 0.4122 0.3262 0.4824 0.2845 0.6738 0.4183 0.478 0.3561 0.2458 0.4177 0.4396 0.2688 0.2503 0.2473 0.093 0.4094 0.2057 0.1551 0.2131 0.3229 0.2618 0.4389 0.1627 0.1852 0.0513 0.0439 0.1863 0.1651 0.2366 0.1071 0.2722 0.281 0.5059 0.4078 0.3511 0.2128 0.1212 0.2055 0.609 0.3403 0.7539 0.2454 0.2336 0.1909 0.0491 0.1315 0.2163 0.1073 0.2841 0.507 0.1543 0.3687 0.0159 0.2459 0.273 0.3269 0.3885 0.1846 0.2964 0.2205 0.2967 0.0658 0.1118 0.122 0.5172 0.1268 0.0896 0.0013 0.2685 0.3795 0.0371 0.0251 0.0156 0.369 0.3324 0. 0.1287 0.0245 0.0054 0. 0.0752 0.0041 0.0094 0.1168 0.0247 0.02 0.0465 0.4359 0.0001 0.2641 0. 0.423 0.0168 0.1548 0.0068 0.1835 0.002 0.2719 0.4888 0. 0.1829 0.3487 0. 0.3848 0.4405 0. 0. 0.1377 0.2146 0.0024 0.3281 0.2592 0. 0.0985 0.4962 0. 0. 0.0079 0.0172 0.0384 0.0174 0.0054 0.0374 0.2706 0.1496 0.0027 0.2263 0.2592 0.1172 0. 0.1155 0. 0.0235 0.031 ] 2022-08-24 11:37:55 [INFO] [EVAL] Class Precision: [0.7246 0.8345 0.9524 0.7567 0.7047 0.8253 0.8419 0.6702 0.5874 0.7214 0.5383 0.6122 0.666 0.4807 0.3546 0.4705 0.5226 0.7072 0.6305 0.4613 0.7677 0.572 0.6557 0.4519 0.3713 0.4988 0.5368 0.5184 0.666 0.3982 0.2388 0.6271 0.434 0.308 0.3561 0.5817 0.5106 0.6535 0.3309 0.4261 0.2247 0.2772 0.7561 0.555 0.3593 0.3624 0.4683 0.6743 0.5806 0.5928 0.5616 0.2581 0.3157 0.5851 0.6539 0.4345 0.8455 0.4522 0.4596 0.4736 0.2203 0.332 0.346 0.7801 0.4532 0.5807 0.3351 0.4977 0.319 0.7012 0.4234 0.4452 0.7327 0.3321 0.4925 0.5559 0.4587 0.5449 0.1637 0.2744 0.6225 0.6074 0.7267 0.0081 0.633 0.6865 0.184 0.2678 0.3661 0.7939 0.4341 0. 0.2022 0.1551 0.2599 0. 0.1846 0.4677 0.1036 0.4838 0.6301 0.0312 0.51 0.9629 0.0068 0.5911 0. 0.6055 0.0649 0.2473 0.2083 0.2721 0.1709 0.7807 0.4925 0.0007 0.937 0.3839 0. 0.611 0.5635 0. 0. 0.5737 0.3715 0.4929 0.8986 0.336 0. 0.2746 0.7689 0. 0. 0.256 0.4649 0.2554 0.6838 0.0739 0.388 0.5235 0.2395 0.0073 0.7566 0.4084 0.7032 0. 0.8184 0. 0.4575 0.5154] 2022-08-24 11:37:55 [INFO] [EVAL] Class Recall: [0.7859 0.8489 0.9574 0.8261 0.8538 0.8311 0.8232 0.8886 0.617 0.826 0.6127 0.6472 0.8561 0.4598 0.2108 0.3895 0.6613 0.3772 0.6726 0.426 0.8464 0.609 0.6382 0.6267 0.4212 0.7196 0.7082 0.3583 0.2862 0.3948 0.1321 0.5411 0.2811 0.238 0.3466 0.4205 0.3495 0.572 0.2425 0.2468 0.0624 0.0496 0.1982 0.1903 0.4091 0.132 0.394 0.3251 0.7974 0.5666 0.4836 0.5483 0.1644 0.2405 0.8987 0.6107 0.8743 0.3492 0.3221 0.2423 0.0594 0.1788 0.3658 0.1106 0.4321 0.7998 0.2223 0.5873 0.0164 0.2747 0.4346 0.5516 0.4527 0.2935 0.4268 0.2677 0.4566 0.0696 0.2605 0.1801 0.7535 0.1382 0.0927 0.0015 0.318 0.4591 0.0445 0.0269 0.016 0.4081 0.5865 0. 0.2613 0.0283 0.0055 0. 0.1126 0.0041 0.0102 0.1335 0.0251 0.0526 0.0486 0.4434 0.0001 0.3231 0. 0.5839 0.0221 0.2926 0.007 0.3605 0.0021 0.2944 0.9848 0. 0.1852 0.7918 0. 0.5097 0.6687 0. 0. 0.1534 0.3369 0.0024 0.3407 0.5315 0. 0.1332 0.5832 0. 0. 0.0081 0.0175 0.0432 0.0176 0.0058 0.0397 0.3591 0.285 0.0042 0.2441 0.415 0.1233 0. 0.1185 0. 0.0242 0.0319] 2022-08-24 11:37:55 [INFO] [EVAL] The model with the best validation mIoU (0.2368) was saved at iter 12000. 
2022-08-24 11:38:10 [INFO] [TRAIN] epoch: 10, iter: 12050/160000, loss: 1.0883, lr: 0.001120, batch_cost: 0.2861, reader_cost: 0.00352, ips: 27.9646 samples/sec | ETA 11:45:24 2022-08-24 11:38:25 [INFO] [TRAIN] epoch: 10, iter: 12100/160000, loss: 1.1673, lr: 0.001120, batch_cost: 0.3070, reader_cost: 0.00092, ips: 26.0616 samples/sec | ETA 12:36:40 2022-08-24 11:38:39 [INFO] [TRAIN] epoch: 10, iter: 12150/160000, loss: 1.2405, lr: 0.001119, batch_cost: 0.2850, reader_cost: 0.00070, ips: 28.0662 samples/sec | ETA 11:42:23 2022-08-24 11:38:53 [INFO] [TRAIN] epoch: 10, iter: 12200/160000, loss: 1.2162, lr: 0.001119, batch_cost: 0.2769, reader_cost: 0.00070, ips: 28.8962 samples/sec | ETA 11:21:58 2022-08-24 11:39:07 [INFO] [TRAIN] epoch: 10, iter: 12250/160000, loss: 1.2247, lr: 0.001119, batch_cost: 0.2690, reader_cost: 0.00047, ips: 29.7416 samples/sec | ETA 11:02:22 2022-08-24 11:39:19 [INFO] [TRAIN] epoch: 10, iter: 12300/160000, loss: 1.1926, lr: 0.001118, batch_cost: 0.2592, reader_cost: 0.00043, ips: 30.8599 samples/sec | ETA 10:38:09 2022-08-24 11:39:35 [INFO] [TRAIN] epoch: 10, iter: 12350/160000, loss: 1.1971, lr: 0.001118, batch_cost: 0.3050, reader_cost: 0.00128, ips: 26.2338 samples/sec | ETA 12:30:25 2022-08-24 11:39:50 [INFO] [TRAIN] epoch: 10, iter: 12400/160000, loss: 1.1095, lr: 0.001117, batch_cost: 0.2986, reader_cost: 0.00050, ips: 26.7955 samples/sec | ETA 12:14:27 2022-08-24 11:40:05 [INFO] [TRAIN] epoch: 10, iter: 12450/160000, loss: 1.1128, lr: 0.001117, batch_cost: 0.3030, reader_cost: 0.00071, ips: 26.4024 samples/sec | ETA 12:25:08 2022-08-24 11:40:20 [INFO] [TRAIN] epoch: 10, iter: 12500/160000, loss: 1.1476, lr: 0.001117, batch_cost: 0.2945, reader_cost: 0.00060, ips: 27.1639 samples/sec | ETA 12:03:59 2022-08-24 11:40:36 [INFO] [TRAIN] epoch: 10, iter: 12550/160000, loss: 1.1105, lr: 0.001116, batch_cost: 0.3197, reader_cost: 0.00095, ips: 25.0205 samples/sec | ETA 13:05:45 2022-08-24 11:40:51 [INFO] [TRAIN] epoch: 10, iter: 12600/160000, loss: 1.2444, lr: 0.001116, batch_cost: 0.3073, reader_cost: 0.00114, ips: 26.0367 samples/sec | ETA 12:34:49 2022-08-24 11:41:08 [INFO] [TRAIN] epoch: 11, iter: 12650/160000, loss: 1.1186, lr: 0.001116, batch_cost: 0.3450, reader_cost: 0.04086, ips: 23.1908 samples/sec | ETA 14:07:10 2022-08-24 11:41:23 [INFO] [TRAIN] epoch: 11, iter: 12700/160000, loss: 1.0770, lr: 0.001115, batch_cost: 0.2980, reader_cost: 0.00087, ips: 26.8439 samples/sec | ETA 12:11:38 2022-08-24 11:41:36 [INFO] [TRAIN] epoch: 11, iter: 12750/160000, loss: 1.1730, lr: 0.001115, batch_cost: 0.2688, reader_cost: 0.00052, ips: 29.7575 samples/sec | ETA 10:59:46 2022-08-24 11:41:51 [INFO] [TRAIN] epoch: 11, iter: 12800/160000, loss: 1.0984, lr: 0.001114, batch_cost: 0.2816, reader_cost: 0.00085, ips: 28.4044 samples/sec | ETA 11:30:58 2022-08-24 11:42:05 [INFO] [TRAIN] epoch: 11, iter: 12850/160000, loss: 1.1250, lr: 0.001114, batch_cost: 0.2894, reader_cost: 0.00056, ips: 27.6443 samples/sec | ETA 11:49:43 2022-08-24 11:42:19 [INFO] [TRAIN] epoch: 11, iter: 12900/160000, loss: 1.0794, lr: 0.001114, batch_cost: 0.2774, reader_cost: 0.00051, ips: 28.8417 samples/sec | ETA 11:20:01 2022-08-24 11:42:34 [INFO] [TRAIN] epoch: 11, iter: 12950/160000, loss: 1.0494, lr: 0.001113, batch_cost: 0.3062, reader_cost: 0.00100, ips: 26.1255 samples/sec | ETA 12:30:28 2022-08-24 11:42:49 [INFO] [TRAIN] epoch: 11, iter: 13000/160000, loss: 1.1828, lr: 0.001113, batch_cost: 0.2874, reader_cost: 0.00119, ips: 27.8359 samples/sec | ETA 11:44:07 2022-08-24 11:42:49 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 189s - batch_cost: 0.1893 - reader cost: 7.3582e-04 2022-08-24 11:45:58 [INFO] [EVAL] #Images: 2000 mIoU: 0.2397 Acc: 0.7055 Kappa: 0.6821 Dice: 0.3440 2022-08-24 11:45:58 [INFO] [EVAL] Class IoU: [0.6048 0.7254 0.9134 0.651 0.6376 0.7105 0.7165 0.6475 0.4319 0.6244 0.3987 0.4526 0.6226 0.2511 0.1096 0.2723 0.4045 0.3436 0.4564 0.2974 0.6855 0.2785 0.4854 0.3806 0.26 0.4267 0.4444 0.2324 0.244 0.2454 0.0289 0.3454 0.1497 0.1679 0.1957 0.3758 0.2455 0.3875 0.1711 0.2323 0.0708 0.0581 0.1923 0.1764 0.2542 0.0172 0.2882 0.3098 0.4254 0.4229 0.329 0.2446 0.0436 0.1051 0.6546 0.4418 0.702 0.1504 0.233 0.1703 0.0522 0.1683 0.2338 0.0726 0.2997 0.5302 0.133 0.3603 0.0052 0.1609 0.1405 0.3191 0.3501 0.2011 0.3059 0.2141 0.3058 0.032 0.163 0.0667 0.4455 0.1344 0.1344 0.0044 0.2708 0.3852 0.0323 0.0356 0.0791 0.3743 0.3656 0.0107 0.0902 0.0553 0. 0.0012 0.0798 0. 0.0917 0.1347 0.0371 0.0115 0.0259 0.6801 0.059 0.2703 0.0018 0.4687 0.0015 0.1343 0.0006 0.2231 0.0021 0.425 0.7005 0. 0.3389 0.4747 0. 0.3442 0.4494 0. 0.0313 0.1202 0.2256 0.0141 0.3536 0.326 0.0133 0.0299 0.4817 0. 0. 0.0726 0.0447 0.022 0.0577 0.001 0.0518 0.2015 0.0891 0.0183 0.3161 0.0613 0.1804 0. 0.1239 0.0148 0.0656 0.0124] 2022-08-24 11:45:58 [INFO] [EVAL] Class Precision: [0.7047 0.8203 0.9609 0.7553 0.7314 0.8136 0.8359 0.7126 0.5392 0.7537 0.6005 0.6226 0.7265 0.5196 0.3822 0.5014 0.4798 0.6865 0.6366 0.4572 0.7656 0.5655 0.714 0.4826 0.4221 0.5362 0.5247 0.7159 0.6543 0.3248 0.2644 0.4331 0.4497 0.238 0.302 0.4912 0.5274 0.6757 0.3749 0.4552 0.1137 0.3051 0.6725 0.5375 0.4685 0.2768 0.5002 0.4467 0.433 0.5652 0.5309 0.3034 0.3062 0.7445 0.7108 0.6052 0.7508 0.5307 0.4586 0.3301 0.066 0.3499 0.5872 0.8435 0.4649 0.6305 0.2935 0.5789 0.4806 0.5784 0.5387 0.3844 0.6738 0.261 0.4639 0.5443 0.4584 0.5549 0.497 0.2621 0.8934 0.5054 0.7219 0.0266 0.4545 0.5692 0.122 0.3192 0.7464 0.6068 0.547 0.0697 0.299 0.1738 0.0002 0.0072 0.2314 0. 0.2452 0.4659 0.5494 0.0237 0.4228 0.7423 0.2062 0.5605 0.0945 0.7539 0.0266 0.2612 0.1576 0.3926 0.0837 0.4878 0.7119 0. 0.6695 0.5489 0. 0.6107 0.6373 0. 0.89 0.6179 0.4169 0.3339 0.8666 0.5469 0.0319 0.2537 0.5564 0. 0. 0.261 0.5108 0.2555 0.4881 0.0272 0.3564 0.6711 0.5625 0.0245 0.6488 0.3197 0.4738 0. 0.7279 0.2276 0.2615 0.4848] 2022-08-24 11:45:58 [INFO] [EVAL] Class Recall: [0.81 0.8625 0.9486 0.825 0.8324 0.8486 0.8339 0.8763 0.6846 0.7846 0.5426 0.6238 0.8131 0.3271 0.1332 0.3733 0.7204 0.4075 0.6173 0.4597 0.8676 0.3543 0.6026 0.6431 0.4036 0.6762 0.7439 0.256 0.2802 0.501 0.0314 0.6305 0.1832 0.3633 0.3574 0.6153 0.3147 0.4761 0.2394 0.3217 0.1581 0.067 0.2122 0.208 0.3572 0.018 0.4048 0.5028 0.9601 0.6268 0.4638 0.5579 0.0484 0.1091 0.8922 0.6208 0.9153 0.1735 0.3213 0.2603 0.1999 0.2449 0.2798 0.0736 0.4576 0.7691 0.1957 0.4884 0.0052 0.1822 0.1597 0.6526 0.4216 0.4669 0.4731 0.2608 0.4789 0.0328 0.1952 0.0822 0.4705 0.1547 0.1417 0.0053 0.4013 0.5436 0.0421 0.0385 0.0813 0.4942 0.5244 0.0124 0.1144 0.075 0. 0.0014 0.1085 0. 0.1278 0.1593 0.0383 0.0217 0.0268 0.8903 0.0763 0.343 0.0018 0.5534 0.0016 0.2166 0.0006 0.3408 0.0022 0.7674 0.9776 0. 0.407 0.7783 0. 0.441 0.604 0. 0.0314 0.1298 0.3296 0.0145 0.3739 0.4466 0.0224 0.0328 0.782 0. 0. 0.0914 0.0467 0.0235 0.0614 0.001 0.0572 0.2236 0.0957 0.0672 0.3813 0.0705 0.2256 0. 0.1299 0.0156 0.0806 0.0126] 2022-08-24 11:45:59 [INFO] [EVAL] The model with the best validation mIoU (0.2397) was saved at iter 13000. 
2022-08-24 11:46:14 [INFO] [TRAIN] epoch: 11, iter: 13050/160000, loss: 1.2099, lr: 0.001113, batch_cost: 0.3101, reader_cost: 0.00502, ips: 25.7947 samples/sec | ETA 12:39:35 2022-08-24 11:46:28 [INFO] [TRAIN] epoch: 11, iter: 13100/160000, loss: 1.1176, lr: 0.001112, batch_cost: 0.2782, reader_cost: 0.00136, ips: 28.7516 samples/sec | ETA 11:21:14 2022-08-24 11:46:42 [INFO] [TRAIN] epoch: 11, iter: 13150/160000, loss: 1.1396, lr: 0.001112, batch_cost: 0.2789, reader_cost: 0.00125, ips: 28.6831 samples/sec | ETA 11:22:37 2022-08-24 11:46:54 [INFO] [TRAIN] epoch: 11, iter: 13200/160000, loss: 1.2147, lr: 0.001111, batch_cost: 0.2339, reader_cost: 0.00052, ips: 34.2087 samples/sec | ETA 09:32:10 2022-08-24 11:47:05 [INFO] [TRAIN] epoch: 11, iter: 13250/160000, loss: 1.1563, lr: 0.001111, batch_cost: 0.2303, reader_cost: 0.00062, ips: 34.7334 samples/sec | ETA 09:23:20 2022-08-24 11:47:18 [INFO] [TRAIN] epoch: 11, iter: 13300/160000, loss: 1.0522, lr: 0.001111, batch_cost: 0.2587, reader_cost: 0.00069, ips: 30.9262 samples/sec | ETA 10:32:28 2022-08-24 11:47:31 [INFO] [TRAIN] epoch: 11, iter: 13350/160000, loss: 1.1056, lr: 0.001110, batch_cost: 0.2617, reader_cost: 0.00668, ips: 30.5725 samples/sec | ETA 10:39:34 2022-08-24 11:47:45 [INFO] [TRAIN] epoch: 11, iter: 13400/160000, loss: 1.1411, lr: 0.001110, batch_cost: 0.2760, reader_cost: 0.00175, ips: 28.9822 samples/sec | ETA 11:14:26 2022-08-24 11:47:58 [INFO] [TRAIN] epoch: 11, iter: 13450/160000, loss: 1.1190, lr: 0.001110, batch_cost: 0.2619, reader_cost: 0.00559, ips: 30.5487 samples/sec | ETA 10:39:38 2022-08-24 11:48:12 [INFO] [TRAIN] epoch: 11, iter: 13500/160000, loss: 1.1534, lr: 0.001109, batch_cost: 0.2780, reader_cost: 0.00092, ips: 28.7721 samples/sec | ETA 11:18:53 2022-08-24 11:48:25 [INFO] [TRAIN] epoch: 11, iter: 13550/160000, loss: 1.1645, lr: 0.001109, batch_cost: 0.2613, reader_cost: 0.00073, ips: 30.6156 samples/sec | ETA 10:37:48 2022-08-24 11:48:38 [INFO] [TRAIN] epoch: 11, iter: 13600/160000, loss: 1.1438, lr: 0.001108, batch_cost: 0.2538, reader_cost: 0.00082, ips: 31.5255 samples/sec | ETA 10:19:10 2022-08-24 11:48:52 [INFO] [TRAIN] epoch: 11, iter: 13650/160000, loss: 1.1000, lr: 0.001108, batch_cost: 0.2825, reader_cost: 0.00082, ips: 28.3142 samples/sec | ETA 11:29:10 2022-08-24 11:49:06 [INFO] [TRAIN] epoch: 11, iter: 13700/160000, loss: 1.1339, lr: 0.001108, batch_cost: 0.2884, reader_cost: 0.00091, ips: 27.7418 samples/sec | ETA 11:43:09 2022-08-24 11:49:20 [INFO] [TRAIN] epoch: 11, iter: 13750/160000, loss: 1.1403, lr: 0.001107, batch_cost: 0.2790, reader_cost: 0.00078, ips: 28.6784 samples/sec | ETA 11:19:57 2022-08-24 11:49:35 [INFO] [TRAIN] epoch: 11, iter: 13800/160000, loss: 1.0798, lr: 0.001107, batch_cost: 0.2850, reader_cost: 0.00064, ips: 28.0667 samples/sec | ETA 11:34:32 2022-08-24 11:49:48 [INFO] [TRAIN] epoch: 11, iter: 13850/160000, loss: 1.1728, lr: 0.001107, batch_cost: 0.2654, reader_cost: 0.00070, ips: 30.1463 samples/sec | ETA 10:46:24 2022-08-24 11:50:04 [INFO] [TRAIN] epoch: 12, iter: 13900/160000, loss: 1.1023, lr: 0.001106, batch_cost: 0.3151, reader_cost: 0.04461, ips: 25.3874 samples/sec | ETA 12:47:18 2022-08-24 11:50:18 [INFO] [TRAIN] epoch: 12, iter: 13950/160000, loss: 1.0927, lr: 0.001106, batch_cost: 0.2853, reader_cost: 0.00079, ips: 28.0399 samples/sec | ETA 11:34:29 2022-08-24 11:50:33 [INFO] [TRAIN] epoch: 12, iter: 14000/160000, loss: 1.2338, lr: 0.001105, batch_cost: 0.2957, reader_cost: 0.00053, ips: 27.0529 samples/sec | ETA 11:59:34 2022-08-24 11:50:33 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 197s - batch_cost: 0.1965 - reader cost: 0.0011 2022-08-24 11:53:49 [INFO] [EVAL] #Images: 2000 mIoU: 0.2403 Acc: 0.7102 Kappa: 0.6863 Dice: 0.3448 2022-08-24 11:53:49 [INFO] [EVAL] Class IoU: [0.61 0.734 0.9156 0.6506 0.616 0.6933 0.7175 0.6566 0.4376 0.5993 0.3968 0.4523 0.6109 0.2773 0.0995 0.2984 0.4259 0.2934 0.4458 0.2924 0.6944 0.4226 0.4832 0.3901 0.2336 0.396 0.4582 0.2167 0.281 0.2124 0.1074 0.3552 0.2105 0.1654 0.262 0.3055 0.2523 0.4149 0.1969 0.233 0.0436 0.0683 0.2193 0.161 0.2568 0.1055 0.2193 0.3551 0.5559 0.3979 0.342 0.2475 0.1218 0.039 0.6851 0.3925 0.7841 0.1803 0.1642 0.1319 0.0414 0.1281 0.2179 0.0357 0.3253 0.5436 0.1376 0.345 0.0164 0.2068 0.1662 0.2562 0.3839 0.2078 0.3113 0.2076 0.3613 0.1351 0.1441 0.2321 0.6071 0.1531 0.1655 0.0056 0.2593 0.4029 0.0027 0.0224 0. 0.3507 0.3692 0.0006 0.1574 0.0342 0.0384 0.0043 0.0871 0.0011 0.0333 0.0986 0.0125 0.0171 0.0405 0.265 0.0084 0.2714 0.0002 0.3848 0.0068 0.2297 0.0418 0.1972 0.0046 0.3714 0.6195 0. 0.2949 0.5077 0.0731 0.3917 0.4847 0. 0.0076 0.0528 0.167 0.0119 0.3528 0.233 0. 0.116 0.5382 0. 0. 0.0216 0.0649 0.0147 0.0611 0.002 0.0732 0.2216 0.0459 0.0298 0.2942 0.0068 0.1503 0. 0.1279 0.0041 0.0383 0.0034] 2022-08-24 11:53:49 [INFO] [EVAL] Class Precision: [0.6966 0.7927 0.9578 0.744 0.6906 0.8663 0.818 0.735 0.5506 0.7199 0.6101 0.6041 0.7135 0.593 0.3704 0.4747 0.5675 0.6321 0.6835 0.4772 0.7873 0.6631 0.6887 0.5388 0.4452 0.6788 0.5441 0.7398 0.6177 0.3243 0.2654 0.438 0.4664 0.329 0.3406 0.562 0.5093 0.5406 0.3601 0.435 0.2394 0.2685 0.6295 0.5217 0.6293 0.3521 0.7294 0.5964 0.7589 0.5118 0.6783 0.3152 0.3042 0.6094 0.7736 0.5061 0.8623 0.495 0.4858 0.2427 0.0859 0.2005 0.2961 0.7562 0.4456 0.6468 0.2549 0.4769 0.4373 0.5478 0.5833 0.5212 0.5968 0.3455 0.5098 0.5123 0.5674 0.3893 0.453 0.3228 0.8377 0.5683 0.7379 0.027 0.5915 0.5777 0.0257 0.411 0. 0.7141 0.4943 0.0013 0.2694 0.1846 0.1499 0.0156 0.4116 0.4276 0.2699 0.4885 0.4246 0.0257 0.4919 0.9827 0.0547 0.6411 0.0017 0.6467 0.0474 0.3041 0.2438 0.2859 0.2824 0.5959 0.6312 0.0115 0.767 0.6373 0.1359 0.63 0.7463 0. 0.5273 0.7472 0.5395 0.3432 0.9388 0.2966 0. 0.4373 0.6654 0. 0. 0.5603 0.6724 0.2696 0.4674 0.0191 0.3723 0.6395 0.4227 0.0508 0.745 0.514 0.5539 0. 0.8446 0.3268 0.1747 0.2038] 2022-08-24 11:53:49 [INFO] [EVAL] Class Recall: [0.8306 0.9084 0.954 0.8381 0.8508 0.7764 0.8538 0.8603 0.6808 0.7815 0.5316 0.6429 0.8094 0.3425 0.1198 0.4456 0.6305 0.3538 0.5619 0.4301 0.8547 0.5381 0.6182 0.5857 0.3295 0.4874 0.7438 0.2345 0.3402 0.3811 0.1528 0.6525 0.2773 0.2495 0.532 0.401 0.3333 0.641 0.3028 0.3342 0.0506 0.0839 0.2518 0.1889 0.3025 0.1309 0.2387 0.4675 0.6752 0.6413 0.4081 0.5355 0.1689 0.04 0.8568 0.6363 0.8963 0.221 0.1988 0.2243 0.0739 0.2617 0.4519 0.0362 0.5465 0.7731 0.2302 0.5551 0.0168 0.2493 0.1885 0.335 0.5183 0.3426 0.4443 0.2587 0.4988 0.1715 0.1744 0.4524 0.688 0.1732 0.1758 0.007 0.3159 0.5711 0.003 0.0232 0. 0.408 0.5934 0.0012 0.2745 0.0402 0.0491 0.0059 0.0994 0.0011 0.0366 0.1099 0.0127 0.0484 0.0422 0.2663 0.0099 0.3201 0.0002 0.4872 0.0079 0.4842 0.0481 0.3886 0.0046 0.4965 0.9709 0. 0.3239 0.7139 0.1366 0.5087 0.5802 0. 0.0077 0.0538 0.1947 0.0122 0.3611 0.5206 0. 0.1364 0.7379 0. 0. 0.022 0.0671 0.0153 0.0657 0.0023 0.0835 0.2533 0.049 0.0671 0.3271 0.0069 0.171 0. 0.131 0.0041 0.0468 0.0034] 2022-08-24 11:53:50 [INFO] [EVAL] The model with the best validation mIoU (0.2403) was saved at iter 14000. 
2022-08-24 11:54:04 [INFO] [TRAIN] epoch: 12, iter: 14050/160000, loss: 1.0229, lr: 0.001105, batch_cost: 0.2872, reader_cost: 0.00422, ips: 27.8550 samples/sec | ETA 11:38:37 2022-08-24 11:54:19 [INFO] [TRAIN] epoch: 12, iter: 14100/160000, loss: 1.1250, lr: 0.001105, batch_cost: 0.2930, reader_cost: 0.00171, ips: 27.3028 samples/sec | ETA 11:52:30 2022-08-24 11:54:34 [INFO] [TRAIN] epoch: 12, iter: 14150/160000, loss: 1.2124, lr: 0.001104, batch_cost: 0.2997, reader_cost: 0.00103, ips: 26.6892 samples/sec | ETA 12:08:38 2022-08-24 11:54:47 [INFO] [TRAIN] epoch: 12, iter: 14200/160000, loss: 1.1487, lr: 0.001104, batch_cost: 0.2775, reader_cost: 0.00074, ips: 28.8244 samples/sec | ETA 11:14:25 2022-08-24 11:55:02 [INFO] [TRAIN] epoch: 12, iter: 14250/160000, loss: 1.1609, lr: 0.001103, batch_cost: 0.2890, reader_cost: 0.00049, ips: 27.6832 samples/sec | ETA 11:41:59 2022-08-24 11:55:16 [INFO] [TRAIN] epoch: 12, iter: 14300/160000, loss: 1.1412, lr: 0.001103, batch_cost: 0.2918, reader_cost: 0.00048, ips: 27.4138 samples/sec | ETA 11:48:38 2022-08-24 11:55:30 [INFO] [TRAIN] epoch: 12, iter: 14350/160000, loss: 1.2062, lr: 0.001103, batch_cost: 0.2731, reader_cost: 0.00067, ips: 29.2941 samples/sec | ETA 11:02:55 2022-08-24 11:55:44 [INFO] [TRAIN] epoch: 12, iter: 14400/160000, loss: 1.0750, lr: 0.001102, batch_cost: 0.2707, reader_cost: 0.00103, ips: 29.5553 samples/sec | ETA 10:56:50 2022-08-24 11:55:57 [INFO] [TRAIN] epoch: 12, iter: 14450/160000, loss: 1.0770, lr: 0.001102, batch_cost: 0.2710, reader_cost: 0.00061, ips: 29.5241 samples/sec | ETA 10:57:18 2022-08-24 11:56:11 [INFO] [TRAIN] epoch: 12, iter: 14500/160000, loss: 1.0972, lr: 0.001102, batch_cost: 0.2847, reader_cost: 0.00090, ips: 28.1041 samples/sec | ETA 11:30:17 2022-08-24 11:56:26 [INFO] [TRAIN] epoch: 12, iter: 14550/160000, loss: 1.1918, lr: 0.001101, batch_cost: 0.2978, reader_cost: 0.00088, ips: 26.8607 samples/sec | ETA 12:01:59 2022-08-24 11:56:41 [INFO] [TRAIN] epoch: 12, iter: 14600/160000, loss: 1.1042, lr: 0.001101, batch_cost: 0.2851, reader_cost: 0.00065, ips: 28.0633 samples/sec | ETA 11:30:49 2022-08-24 11:56:55 [INFO] [TRAIN] epoch: 12, iter: 14650/160000, loss: 1.1092, lr: 0.001100, batch_cost: 0.2909, reader_cost: 0.00081, ips: 27.5047 samples/sec | ETA 11:44:36 2022-08-24 11:57:08 [INFO] [TRAIN] epoch: 12, iter: 14700/160000, loss: 1.1252, lr: 0.001100, batch_cost: 0.2602, reader_cost: 0.00038, ips: 30.7490 samples/sec | ETA 10:30:02 2022-08-24 11:57:22 [INFO] [TRAIN] epoch: 12, iter: 14750/160000, loss: 1.0960, lr: 0.001100, batch_cost: 0.2675, reader_cost: 0.00066, ips: 29.9030 samples/sec | ETA 10:47:39 2022-08-24 11:57:32 [INFO] [TRAIN] epoch: 12, iter: 14800/160000, loss: 1.0649, lr: 0.001099, batch_cost: 0.2093, reader_cost: 0.00061, ips: 38.2236 samples/sec | ETA 08:26:29 2022-08-24 11:57:42 [INFO] [TRAIN] epoch: 12, iter: 14850/160000, loss: 1.0844, lr: 0.001099, batch_cost: 0.1960, reader_cost: 0.00056, ips: 40.8162 samples/sec | ETA 07:54:09 2022-08-24 11:57:51 [INFO] [TRAIN] epoch: 12, iter: 14900/160000, loss: 1.1486, lr: 0.001099, batch_cost: 0.1909, reader_cost: 0.00106, ips: 41.8965 samples/sec | ETA 07:41:46 2022-08-24 11:58:01 [INFO] [TRAIN] epoch: 12, iter: 14950/160000, loss: 1.0474, lr: 0.001098, batch_cost: 0.2011, reader_cost: 0.00057, ips: 39.7822 samples/sec | ETA 08:06:08 2022-08-24 11:58:11 [INFO] [TRAIN] epoch: 12, iter: 15000/160000, loss: 1.0555, lr: 0.001098, batch_cost: 0.1934, reader_cost: 0.00042, ips: 41.3670 samples/sec | ETA 07:47:21 2022-08-24 11:58:11 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 224s - batch_cost: 0.2238 - reader cost: 6.1789e-04 2022-08-24 12:01:55 [INFO] [EVAL] #Images: 2000 mIoU: 0.2414 Acc: 0.7125 Kappa: 0.6891 Dice: 0.3455 2022-08-24 12:01:55 [INFO] [EVAL] Class IoU: [0.6057 0.7286 0.9163 0.6551 0.6306 0.6931 0.7299 0.6531 0.4394 0.6442 0.4015 0.4758 0.6331 0.3136 0.113 0.2669 0.467 0.3222 0.4768 0.2832 0.6335 0.4 0.4874 0.39 0.2498 0.4021 0.4093 0.2261 0.2582 0.2427 0.1269 0.4221 0.1606 0.1645 0.2899 0.3578 0.2538 0.4482 0.185 0.232 0.0674 0.0385 0.2007 0.161 0.2472 0.0934 0.2593 0.3182 0.5927 0.3717 0.3544 0.2251 0.1035 0.1985 0.6595 0.3849 0.7704 0.1587 0.1223 0.1705 0.0461 0.146 0.2688 0.0184 0.3126 0.5415 0.1643 0.3745 0.0308 0.1934 0.2611 0.2851 0.3833 0.1986 0.3146 0.2117 0.3279 0.0422 0.1586 0.1024 0.2371 0.1581 0.1808 0.0124 0.2513 0.3939 0.0009 0.0381 0.034 0.3743 0.3499 0. 0.1148 0.0223 0.0172 0. 0.0981 0. 0.0137 0.1772 0.0045 0.0223 0.0145 0.5921 0.0085 0.2914 0.0074 0.3853 0.0049 0.1399 0.0156 0.097 0.0061 0.4222 0.678 0. 0.2286 0.4402 0.0499 0.3771 0.4472 0. 0.046 0.0341 0.1584 0.0267 0.3675 0.2906 0.0001 0.0886 0.6014 0. 0. 0.0398 0.0171 0.0531 0.0107 0.0067 0.0617 0.1418 0.1371 0.0231 0.3204 0.1009 0.1359 0. 0.1399 0.0016 0.0279 0.0016] 2022-08-24 12:01:55 [INFO] [EVAL] Class Precision: [0.6964 0.7982 0.9624 0.7536 0.7261 0.8423 0.8482 0.7181 0.5825 0.7336 0.5916 0.614 0.7349 0.5385 0.3714 0.5198 0.606 0.7054 0.6367 0.5045 0.6837 0.575 0.7063 0.5155 0.4113 0.5272 0.5303 0.7563 0.6648 0.4923 0.2535 0.6328 0.4059 0.236 0.3979 0.4884 0.5381 0.621 0.37 0.4877 0.1909 0.47 0.6926 0.5373 0.3428 0.3974 0.4072 0.6574 0.6924 0.4382 0.4953 0.2663 0.2572 0.7457 0.7182 0.4842 0.8453 0.5757 0.6685 0.4184 0.1668 0.3293 0.5139 0.6173 0.4211 0.6436 0.329 0.5377 0.2853 0.6059 0.4547 0.452 0.6854 0.3692 0.4929 0.4093 0.5321 0.4946 0.383 0.1719 0.8383 0.6553 0.6445 0.0179 0.5333 0.6816 0.0145 0.3285 0.2706 0.6016 0.5617 0.0001 0.2707 0.2232 0.1649 0. 0.3896 0. 0.43 0.4045 0.9074 0.0556 0.5223 0.7552 0.2646 0.6014 0.2803 0.583 0.0324 0.2327 0.466 0.1165 0.2654 0.564 0.686 0. 0.8385 0.499 0.0796 0.5891 0.7217 0. 0.875 0.529 0.5323 0.359 0.8733 0.4175 0.0033 0.2935 0.85 0. 0. 0.5017 0.7938 0.2077 0.3503 0.1853 0.3847 0.7287 0.4345 0.0379 0.6457 0.4651 0.5509 0. 0.9233 0.5942 0.2338 0.0519] 2022-08-24 12:01:55 [INFO] [EVAL] Class Recall: [0.8231 0.8932 0.9503 0.8337 0.8275 0.7965 0.8396 0.8782 0.6412 0.841 0.5554 0.6788 0.8204 0.4289 0.1397 0.3542 0.6707 0.3723 0.655 0.3923 0.8961 0.568 0.6113 0.6157 0.3888 0.6288 0.642 0.2439 0.2968 0.3237 0.2026 0.559 0.2099 0.352 0.5164 0.5722 0.3245 0.6171 0.2701 0.3067 0.0943 0.0403 0.2204 0.1869 0.4699 0.1088 0.4165 0.3814 0.8045 0.7099 0.5548 0.5927 0.1475 0.2129 0.8897 0.6524 0.8968 0.1798 0.1302 0.2235 0.0599 0.2079 0.3604 0.0186 0.5482 0.7734 0.247 0.5523 0.0333 0.2212 0.3802 0.4356 0.4652 0.3005 0.4651 0.3049 0.4608 0.0441 0.213 0.2023 0.2484 0.1724 0.2009 0.0383 0.3221 0.4827 0.001 0.0413 0.0374 0.4978 0.4813 0.0001 0.1662 0.0242 0.0188 0. 0.1159 0. 0.014 0.2397 0.0045 0.0359 0.0147 0.7328 0.0087 0.3611 0.0075 0.5319 0.0058 0.2598 0.0159 0.3677 0.0062 0.6267 0.9831 0. 0.2392 0.7888 0.1182 0.5117 0.5405 0. 0.0463 0.0351 0.1841 0.028 0.3882 0.4888 0.0001 0.1126 0.6729 0. 0. 0.0414 0.0172 0.0666 0.0109 0.0069 0.0685 0.1497 0.1669 0.0559 0.3887 0.1141 0.1529 0. 0.1416 0.0016 0.0307 0.0017] 2022-08-24 12:01:55 [INFO] [EVAL] The model with the best validation mIoU (0.2414) was saved at iter 15000. 
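The throughput columns in the [TRAIN] entries are internally consistent: ips is roughly the per-process batch size divided by batch_cost, and the printed ETA is roughly (total iters - current iter) * batch_cost. For the iter-15000 entry above (batch_cost 0.1934 s, ips 41.3670 samples/sec, ETA 07:47:21), ips * batch_cost is about 8.0, which suggests a batch of 8 samples for the logging process (an inference from the numbers; the log does not state the batch size), and the remaining 145,000 iterations at 0.1934 s each come to roughly 7 h 47 min, matching the printed ETA up to rounding of batch_cost. A quick check:

# Consistency check for the iter-15000 [TRAIN] entry above.
# The batch size of ~8 is inferred from ips * batch_cost, not stated in the log.
batch_cost, ips = 0.1934, 41.3670
cur_iter, total_iters = 15_000, 160_000

print(round(ips * batch_cost, 2))            # ~8.0 samples per step
eta_seconds = (total_iters - cur_iter) * batch_cost
h, rem = divmod(int(eta_seconds), 3600)
m, s = divmod(rem, 60)
print(f"{h:02d}:{m:02d}:{s:02d}")            # 07:47:23 vs. the logged ETA 07:47:21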
2022-08-24 12:02:09 [INFO] [TRAIN] epoch: 12, iter: 15050/160000, loss: 1.1349, lr: 0.001097, batch_cost: 0.2801, reader_cost: 0.00418, ips: 28.5636 samples/sec | ETA 11:16:37 2022-08-24 12:02:23 [INFO] [TRAIN] epoch: 12, iter: 15100/160000, loss: 1.0942, lr: 0.001097, batch_cost: 0.2711, reader_cost: 0.00132, ips: 29.5143 samples/sec | ETA 10:54:35 2022-08-24 12:02:37 [INFO] [TRAIN] epoch: 12, iter: 15150/160000, loss: 1.2556, lr: 0.001097, batch_cost: 0.2785, reader_cost: 0.00067, ips: 28.7216 samples/sec | ETA 11:12:25 2022-08-24 12:02:52 [INFO] [TRAIN] epoch: 13, iter: 15200/160000, loss: 1.0661, lr: 0.001096, batch_cost: 0.3033, reader_cost: 0.03615, ips: 26.3794 samples/sec | ETA 12:11:53 2022-08-24 12:03:07 [INFO] [TRAIN] epoch: 13, iter: 15250/160000, loss: 1.1067, lr: 0.001096, batch_cost: 0.2926, reader_cost: 0.00046, ips: 27.3377 samples/sec | ETA 11:45:59 2022-08-24 12:03:21 [INFO] [TRAIN] epoch: 13, iter: 15300/160000, loss: 1.1101, lr: 0.001096, batch_cost: 0.2792, reader_cost: 0.00111, ips: 28.6501 samples/sec | ETA 11:13:24 2022-08-24 12:03:34 [INFO] [TRAIN] epoch: 13, iter: 15350/160000, loss: 1.0525, lr: 0.001095, batch_cost: 0.2771, reader_cost: 0.00069, ips: 28.8695 samples/sec | ETA 11:08:03 2022-08-24 12:03:48 [INFO] [TRAIN] epoch: 13, iter: 15400/160000, loss: 1.0924, lr: 0.001095, batch_cost: 0.2733, reader_cost: 0.00046, ips: 29.2674 samples/sec | ETA 10:58:45 2022-08-24 12:04:02 [INFO] [TRAIN] epoch: 13, iter: 15450/160000, loss: 1.1325, lr: 0.001094, batch_cost: 0.2786, reader_cost: 0.00097, ips: 28.7165 samples/sec | ETA 11:11:09 2022-08-24 12:04:15 [INFO] [TRAIN] epoch: 13, iter: 15500/160000, loss: 1.0740, lr: 0.001094, batch_cost: 0.2492, reader_cost: 0.00055, ips: 32.1019 samples/sec | ETA 10:00:10 2022-08-24 12:04:27 [INFO] [TRAIN] epoch: 13, iter: 15550/160000, loss: 1.0936, lr: 0.001094, batch_cost: 0.2534, reader_cost: 0.00042, ips: 31.5732 samples/sec | ETA 10:10:00 2022-08-24 12:04:41 [INFO] [TRAIN] epoch: 13, iter: 15600/160000, loss: 1.1056, lr: 0.001093, batch_cost: 0.2665, reader_cost: 0.00119, ips: 30.0241 samples/sec | ETA 10:41:15 2022-08-24 12:04:54 [INFO] [TRAIN] epoch: 13, iter: 15650/160000, loss: 1.0916, lr: 0.001093, batch_cost: 0.2750, reader_cost: 0.00036, ips: 29.0892 samples/sec | ETA 11:01:38 2022-08-24 12:05:10 [INFO] [TRAIN] epoch: 13, iter: 15700/160000, loss: 1.1704, lr: 0.001092, batch_cost: 0.3058, reader_cost: 0.00063, ips: 26.1607 samples/sec | ETA 12:15:27 2022-08-24 12:05:24 [INFO] [TRAIN] epoch: 13, iter: 15750/160000, loss: 1.0894, lr: 0.001092, batch_cost: 0.2918, reader_cost: 0.00095, ips: 27.4183 samples/sec | ETA 11:41:28 2022-08-24 12:05:38 [INFO] [TRAIN] epoch: 13, iter: 15800/160000, loss: 1.1196, lr: 0.001092, batch_cost: 0.2744, reader_cost: 0.00100, ips: 29.1517 samples/sec | ETA 10:59:32 2022-08-24 12:05:53 [INFO] [TRAIN] epoch: 13, iter: 15850/160000, loss: 1.0193, lr: 0.001091, batch_cost: 0.2982, reader_cost: 0.00081, ips: 26.8291 samples/sec | ETA 11:56:23 2022-08-24 12:06:07 [INFO] [TRAIN] epoch: 13, iter: 15900/160000, loss: 1.0949, lr: 0.001091, batch_cost: 0.2853, reader_cost: 0.00036, ips: 28.0449 samples/sec | ETA 11:25:05 2022-08-24 12:06:21 [INFO] [TRAIN] epoch: 13, iter: 15950/160000, loss: 1.0813, lr: 0.001091, batch_cost: 0.2837, reader_cost: 0.00049, ips: 28.2020 samples/sec | ETA 11:21:02 2022-08-24 12:06:35 [INFO] [TRAIN] epoch: 13, iter: 16000/160000, loss: 1.1689, lr: 0.001090, batch_cost: 0.2786, reader_cost: 0.00091, ips: 28.7174 samples/sec | ETA 11:08:35 2022-08-24 12:06:35 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 211s - batch_cost: 0.2109 - reader cost: 0.0011 2022-08-24 12:10:06 [INFO] [EVAL] #Images: 2000 mIoU: 0.2572 Acc: 0.7168 Kappa: 0.6943 Dice: 0.3666 2022-08-24 12:10:06 [INFO] [EVAL] Class IoU: [0.6182 0.7481 0.9165 0.658 0.6445 0.6979 0.7229 0.6642 0.438 0.6198 0.3959 0.4308 0.6234 0.2822 0.1262 0.3072 0.4588 0.3902 0.4766 0.2961 0.7012 0.4697 0.5023 0.3867 0.2777 0.3247 0.4941 0.2735 0.2775 0.2397 0.109 0.3967 0.2044 0.1837 0.2768 0.3767 0.2556 0.4164 0.1852 0.2391 0.0144 0.0685 0.2288 0.1772 0.2516 0.1404 0.2615 0.3422 0.6445 0.4126 0.3844 0.2195 0.094 0.2354 0.6405 0.4436 0.794 0.2823 0.189 0.1986 0.0534 0.1564 0.2661 0.112 0.3124 0.5315 0.1411 0.3301 0.0287 0.2608 0.262 0.3333 0.3788 0.2452 0.3253 0.2328 0.3605 0.0436 0.1545 0.1471 0.5866 0.1795 0.1887 0.0039 0.31 0.4177 0.0063 0.0229 0.1646 0.394 0.3358 0. 0.1418 0.0455 0.0952 0.0102 0.0891 0.099 0.005 0.1547 0.0329 0.0071 0.0133 0.4505 0.0006 0.323 0.0009 0.4213 0.0232 0.0325 0.028 0.2596 0.0215 0.3466 0.8104 0. 0.3759 0.5476 0.0168 0.1129 0.4491 0. 0.0208 0.1284 0.1592 0.0647 0.3325 0.2798 0. 0.1557 0.564 0. 0.0068 0.0364 0.0452 0.0269 0.038 0.0004 0.0711 0.2903 0.2332 0.0879 0.2302 0.1523 0.1647 0.0014 0.0959 0.0004 0.0565 0.0068] 2022-08-24 12:10:06 [INFO] [EVAL] Class Precision: [0.7109 0.8429 0.96 0.7403 0.7286 0.8427 0.8115 0.7625 0.5818 0.7526 0.6608 0.6862 0.7502 0.4377 0.4211 0.4525 0.5994 0.6542 0.6366 0.4428 0.8157 0.6179 0.6976 0.5231 0.3945 0.5573 0.59 0.6456 0.6639 0.3327 0.2757 0.4832 0.3847 0.2932 0.4254 0.5291 0.5806 0.5493 0.3111 0.3957 0.2873 0.2809 0.656 0.513 0.3895 0.3106 0.4433 0.6653 0.7385 0.5374 0.5746 0.2652 0.3682 0.6172 0.6883 0.6064 0.875 0.4466 0.5939 0.7928 0.1786 0.4926 0.3608 0.6194 0.3956 0.6236 0.3262 0.5232 0.1087 0.6044 0.4772 0.4801 0.6219 0.3309 0.5605 0.4498 0.4105 0.4695 0.4729 0.2249 0.7215 0.5852 0.6455 0.0237 0.5589 0.6337 0.0689 0.3799 0.6677 0.6698 0.4959 0. 0.3216 0.2529 0.297 0.0347 0.2103 0.5206 0.1342 0.4846 0.5115 0.0166 0.5768 0.6258 0.033 0.6741 0.0073 0.6236 0.1174 0.0845 0.2879 0.4517 0.1913 0.6964 0.8473 0. 0.4643 0.6988 0.0479 0.5937 0.6771 0. 0.9358 0.6046 0.4725 0.3838 0.9013 0.3991 0. 0.2411 0.7354 0. 0.1022 0.5834 0.7041 0.1966 0.4576 0.0207 0.2562 0.5614 0.3094 0.1097 0.7239 0.439 0.7065 0.0028 0.9248 0.6188 0.4096 0.683 ] 2022-08-24 12:10:06 [INFO] [EVAL] Class Recall: [0.8257 0.8694 0.9528 0.8556 0.8481 0.8024 0.8688 0.8374 0.6392 0.7784 0.4968 0.5366 0.7867 0.4426 0.1527 0.4888 0.6618 0.4916 0.6547 0.472 0.8332 0.6619 0.6421 0.5973 0.4838 0.4375 0.7525 0.3218 0.3228 0.4615 0.1527 0.6891 0.3036 0.3299 0.4421 0.5666 0.3135 0.6324 0.3139 0.3765 0.015 0.0831 0.26 0.213 0.4154 0.2039 0.3893 0.4133 0.8351 0.6399 0.5373 0.5602 0.112 0.2756 0.9021 0.623 0.8956 0.4342 0.217 0.2095 0.0709 0.1864 0.5034 0.1203 0.5978 0.7826 0.1992 0.4721 0.0375 0.3145 0.3676 0.5215 0.4921 0.4865 0.4367 0.3255 0.7473 0.0458 0.1866 0.2982 0.7583 0.2057 0.2106 0.0046 0.4103 0.5507 0.0069 0.0238 0.1793 0.489 0.5098 0. 0.2024 0.0526 0.1229 0.0143 0.1339 0.1089 0.0052 0.1852 0.0339 0.0123 0.0134 0.6166 0.0006 0.3827 0.001 0.565 0.028 0.0502 0.0301 0.379 0.0236 0.4083 0.949 0. 0.6638 0.7168 0.0252 0.1224 0.5714 0. 0.0208 0.1402 0.1936 0.0722 0.345 0.4834 0. 0.3053 0.7075 0. 0.0073 0.0374 0.0461 0.0302 0.0398 0.0004 0.0895 0.3754 0.4865 0.3059 0.2523 0.1892 0.1767 0.003 0.0967 0.0004 0.0615 0.0069] 2022-08-24 12:10:06 [INFO] [EVAL] The model with the best validation mIoU (0.2572) was saved at iter 16000. 
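The ips and ETA columns in the [TRAIN] lines above are simple functions of batch_cost: throughput is the per-step batch size divided by the step time, and the ETA is the remaining iterations times the current step time. A quick sketch of that arithmetic using the iter-15050 entry (the batch size of 8 is inferred from ips x batch_cost; it is not printed in the log):

    import datetime

    batch_size = 8                     # assumption: 28.5636 samples/s * 0.2801 s/step ~= 8
    batch_cost = 0.2801                # seconds per training step, from the log
    cur_iter, max_iter = 15050, 160000

    ips = batch_size / batch_cost
    eta = datetime.timedelta(seconds=round((max_iter - cur_iter) * batch_cost))
    print(f"ips={ips:.4f} samples/sec, ETA={eta}")
    # ips~=28.56, ETA~=11:16:40 (the log prints 28.5636 and 11:16:37 from unrounded values)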
2022-08-24 12:10:21 [INFO] [TRAIN] epoch: 13, iter: 16050/160000, loss: 1.1269, lr: 0.001090, batch_cost: 0.2910, reader_cost: 0.00472, ips: 27.4921 samples/sec | ETA 11:38:08 2022-08-24 12:10:34 [INFO] [TRAIN] epoch: 13, iter: 16100/160000, loss: 1.0189, lr: 0.001089, batch_cost: 0.2558, reader_cost: 0.00109, ips: 31.2727 samples/sec | ETA 10:13:31 2022-08-24 12:10:47 [INFO] [TRAIN] epoch: 13, iter: 16150/160000, loss: 1.1347, lr: 0.001089, batch_cost: 0.2553, reader_cost: 0.00054, ips: 31.3305 samples/sec | ETA 10:12:10 2022-08-24 12:11:00 [INFO] [TRAIN] epoch: 13, iter: 16200/160000, loss: 1.1299, lr: 0.001089, batch_cost: 0.2741, reader_cost: 0.00080, ips: 29.1892 samples/sec | ETA 10:56:51 2022-08-24 12:11:14 [INFO] [TRAIN] epoch: 13, iter: 16250/160000, loss: 1.0572, lr: 0.001088, batch_cost: 0.2742, reader_cost: 0.00042, ips: 29.1759 samples/sec | ETA 10:56:56 2022-08-24 12:11:28 [INFO] [TRAIN] epoch: 13, iter: 16300/160000, loss: 1.0394, lr: 0.001088, batch_cost: 0.2712, reader_cost: 0.00048, ips: 29.5019 samples/sec | ETA 10:49:27 2022-08-24 12:11:42 [INFO] [TRAIN] epoch: 13, iter: 16350/160000, loss: 1.1000, lr: 0.001088, batch_cost: 0.2826, reader_cost: 0.00067, ips: 28.3056 samples/sec | ETA 11:16:39 2022-08-24 12:11:56 [INFO] [TRAIN] epoch: 13, iter: 16400/160000, loss: 1.1196, lr: 0.001087, batch_cost: 0.2918, reader_cost: 0.00119, ips: 27.4197 samples/sec | ETA 11:38:16 2022-08-24 12:12:12 [INFO] [TRAIN] epoch: 14, iter: 16450/160000, loss: 1.0233, lr: 0.001087, batch_cost: 0.3233, reader_cost: 0.04318, ips: 24.7426 samples/sec | ETA 12:53:33 2022-08-24 12:12:26 [INFO] [TRAIN] epoch: 14, iter: 16500/160000, loss: 1.0403, lr: 0.001086, batch_cost: 0.2783, reader_cost: 0.00106, ips: 28.7479 samples/sec | ETA 11:05:33 2022-08-24 12:12:41 [INFO] [TRAIN] epoch: 14, iter: 16550/160000, loss: 1.1322, lr: 0.001086, batch_cost: 0.2968, reader_cost: 0.00052, ips: 26.9559 samples/sec | ETA 11:49:33 2022-08-24 12:12:56 [INFO] [TRAIN] epoch: 14, iter: 16600/160000, loss: 1.1106, lr: 0.001086, batch_cost: 0.2888, reader_cost: 0.00049, ips: 27.6979 samples/sec | ETA 11:30:18 2022-08-24 12:13:09 [INFO] [TRAIN] epoch: 14, iter: 16650/160000, loss: 1.0650, lr: 0.001085, batch_cost: 0.2711, reader_cost: 0.00090, ips: 29.5117 samples/sec | ETA 10:47:39 2022-08-24 12:13:24 [INFO] [TRAIN] epoch: 14, iter: 16700/160000, loss: 1.1037, lr: 0.001085, batch_cost: 0.2909, reader_cost: 0.00063, ips: 27.5043 samples/sec | ETA 11:34:40 2022-08-24 12:13:36 [INFO] [TRAIN] epoch: 14, iter: 16750/160000, loss: 1.0363, lr: 0.001085, batch_cost: 0.2534, reader_cost: 0.00044, ips: 31.5674 samples/sec | ETA 10:05:03 2022-08-24 12:13:49 [INFO] [TRAIN] epoch: 14, iter: 16800/160000, loss: 1.0800, lr: 0.001084, batch_cost: 0.2536, reader_cost: 0.00056, ips: 31.5487 samples/sec | ETA 10:05:12 2022-08-24 12:14:03 [INFO] [TRAIN] epoch: 14, iter: 16850/160000, loss: 1.1368, lr: 0.001084, batch_cost: 0.2679, reader_cost: 0.00066, ips: 29.8588 samples/sec | ETA 10:39:13 2022-08-24 12:14:17 [INFO] [TRAIN] epoch: 14, iter: 16900/160000, loss: 1.0523, lr: 0.001083, batch_cost: 0.2904, reader_cost: 0.00055, ips: 27.5455 samples/sec | ETA 11:32:40 2022-08-24 12:14:34 [INFO] [TRAIN] epoch: 14, iter: 16950/160000, loss: 1.0756, lr: 0.001083, batch_cost: 0.3454, reader_cost: 0.00081, ips: 23.1601 samples/sec | ETA 13:43:32 2022-08-24 12:14:50 [INFO] [TRAIN] epoch: 14, iter: 17000/160000, loss: 1.1235, lr: 0.001083, batch_cost: 0.3056, reader_cost: 0.00111, ips: 26.1791 samples/sec | ETA 12:08:19 2022-08-24 12:14:50 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 199s - batch_cost: 0.1987 - reader cost: 8.4243e-04 2022-08-24 12:18:09 [INFO] [EVAL] #Images: 2000 mIoU: 0.2504 Acc: 0.7176 Kappa: 0.6951 Dice: 0.3595 2022-08-24 12:18:09 [INFO] [EVAL] Class IoU: [0.6153 0.7432 0.9165 0.666 0.639 0.7135 0.7211 0.6698 0.4397 0.66 0.4078 0.4747 0.6236 0.2808 0.113 0.3022 0.4194 0.3638 0.4789 0.3041 0.6834 0.4427 0.4815 0.3958 0.2587 0.415 0.4412 0.27 0.3247 0.1935 0.1075 0.3951 0.1852 0.1776 0.2385 0.3696 0.2645 0.3977 0.1626 0.2165 0.0282 0.0468 0.2309 0.1754 0.2292 0.1399 0.3156 0.338 0.6317 0.4393 0.3224 0.1894 0.0606 0.1703 0.6699 0.2783 0.7859 0.1878 0.2768 0.2169 0.0092 0.1722 0.2751 0.07 0.3282 0.5355 0.1604 0.3049 0.0212 0.2461 0.1828 0.3228 0.3481 0.2467 0.2947 0.2163 0.3536 0.0684 0.1512 0.0536 0.6486 0.1907 0.2234 0.0114 0.3204 0.4052 0.0289 0.0278 0.138 0.392 0.2838 0.0084 0.1867 0.0361 0.0001 0.0059 0.1401 0.0779 0.0976 0.2328 0.0005 0.0397 0.0231 0.4596 0.0162 0.3971 0.0044 0.435 0.044 0.1809 0.0072 0.0736 0.0168 0.3027 0.4934 0. 0.2079 0.4531 0.0375 0.2806 0.3766 0. 0.049 0.1342 0.1802 0.0229 0.3692 0.2744 0. 0.1303 0.5031 0. 0. 0.0974 0.0232 0.0415 0.0636 0.0038 0.0477 0.2268 0.2305 0.0635 0.2605 0.0813 0.1942 0. 0.1451 0.0056 0.0285 0.0047] 2022-08-24 12:18:09 [INFO] [EVAL] Class Precision: [0.7042 0.8458 0.9551 0.7777 0.7166 0.8514 0.8517 0.7365 0.6023 0.7352 0.6152 0.5758 0.7077 0.5309 0.3852 0.5085 0.5231 0.6655 0.7077 0.4902 0.7497 0.5258 0.6691 0.4978 0.4532 0.4787 0.6135 0.681 0.5899 0.551 0.3022 0.5329 0.5086 0.2411 0.4687 0.5369 0.5304 0.5033 0.3624 0.5059 0.3815 0.3041 0.6229 0.5097 0.4768 0.3559 0.6092 0.5324 0.6932 0.586 0.5558 0.2191 0.2225 0.5068 0.7447 0.3222 0.8271 0.5399 0.5184 0.4097 0.1595 0.3517 0.3703 0.7107 0.4366 0.6859 0.2977 0.5416 0.1611 0.7583 0.5853 0.4773 0.6988 0.3703 0.3911 0.3381 0.601 0.4357 0.2519 0.3033 0.8087 0.6871 0.6375 0.0685 0.5852 0.6814 0.2166 0.3426 0.4552 0.6637 0.3626 0.0224 0.3538 0.227 0.0014 0.0186 0.2155 0.8359 0.6235 0.3945 0.07 0.0946 0.4513 0.6282 0.2022 0.5498 0.2369 0.8439 0.0873 0.2485 0.1899 0.0877 0.1576 0.7942 0.4967 0. 0.9135 0.5241 0.0952 0.611 0.7693 0. 0.9879 0.5493 0.5865 0.6626 0.8712 0.3754 0. 0.3534 0.6469 0. 0. 0.3995 0.745 0.2315 0.3498 0.2032 0.4422 0.6941 0.5331 0.08 0.6773 0.2676 0.5087 0. 0.8628 0.4691 0.2367 0.8305] 2022-08-24 12:18:09 [INFO] [EVAL] Class Recall: [0.8297 0.8597 0.9577 0.8225 0.8552 0.815 0.8247 0.881 0.6196 0.8657 0.5474 0.73 0.84 0.3734 0.1378 0.4269 0.6791 0.4452 0.597 0.4447 0.8854 0.7369 0.6319 0.6588 0.3762 0.7573 0.6111 0.3091 0.4193 0.2297 0.1431 0.6045 0.2256 0.4029 0.3268 0.5426 0.3454 0.6547 0.2278 0.2745 0.0296 0.0524 0.2684 0.211 0.3062 0.1873 0.3958 0.4807 0.8768 0.6371 0.4343 0.5831 0.0769 0.2042 0.8695 0.6715 0.9405 0.2235 0.3726 0.3154 0.0097 0.2523 0.5169 0.0721 0.5693 0.7095 0.2581 0.411 0.0238 0.267 0.21 0.4993 0.4096 0.4249 0.5447 0.375 0.462 0.075 0.2744 0.0611 0.7661 0.2089 0.2559 0.0135 0.4145 0.5 0.0322 0.0294 0.1653 0.4892 0.5662 0.0133 0.2834 0.0412 0.0001 0.0087 0.2861 0.0791 0.1037 0.3622 0.0005 0.0641 0.0237 0.6314 0.0173 0.5886 0.0044 0.473 0.0816 0.3992 0.0075 0.3145 0.0184 0.3285 0.9866 0. 0.2121 0.7698 0.0583 0.3416 0.4245 0. 0.049 0.1508 0.2065 0.0232 0.3905 0.5049 0. 0.1711 0.6935 0. 0. 0.1141 0.0234 0.0481 0.0721 0.0039 0.0507 0.252 0.2887 0.2356 0.2974 0.1045 0.2391 0. 0.1485 0.0057 0.0313 0.0047] 2022-08-24 12:18:09 [INFO] [EVAL] The model with the best validation mIoU (0.2572) was saved at iter 16000. 
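The eval at iter 17000 above scored 0.2504, below the 0.2572 reached at iter 16000, so the "best validation mIoU" line keeps pointing at iter 16000 and no new best checkpoint is written. A minimal sketch of that bookkeeping (the function and variable names are hypothetical, not PaddleSeg internals):

    best_miou, best_iter = float("-inf"), -1

    def maybe_save_best(miou, cur_iter, save_checkpoint):
        # Keep only the checkpoint with the highest validation mIoU seen so far.
        global best_miou, best_iter
        if miou > best_miou:
            best_miou, best_iter = miou, cur_iter
            save_checkpoint(cur_iter)      # e.g. copy weights into <save_dir>/best_model
        print(f"[EVAL] The model with the best validation mIoU ({best_miou:.4f}) "
              f"was saved at iter {best_iter}.")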
2022-08-24 12:18:20 [INFO] [TRAIN] epoch: 14, iter: 17050/160000, loss: 1.1277, lr: 0.001082, batch_cost: 0.2230, reader_cost: 0.00441, ips: 35.8781 samples/sec | ETA 08:51:14 2022-08-24 12:18:32 [INFO] [TRAIN] epoch: 14, iter: 17100/160000, loss: 1.1042, lr: 0.001082, batch_cost: 0.2355, reader_cost: 0.00112, ips: 33.9751 samples/sec | ETA 09:20:48 2022-08-24 12:18:43 [INFO] [TRAIN] epoch: 14, iter: 17150/160000, loss: 1.0798, lr: 0.001082, batch_cost: 0.2254, reader_cost: 0.00054, ips: 35.4873 samples/sec | ETA 08:56:43 2022-08-24 12:18:53 [INFO] [TRAIN] epoch: 14, iter: 17200/160000, loss: 1.0986, lr: 0.001081, batch_cost: 0.1928, reader_cost: 0.01251, ips: 41.4881 samples/sec | ETA 07:38:55 2022-08-24 12:19:02 [INFO] [TRAIN] epoch: 14, iter: 17250/160000, loss: 1.0654, lr: 0.001081, batch_cost: 0.1888, reader_cost: 0.00365, ips: 42.3816 samples/sec | ETA 07:29:05 2022-08-24 12:19:11 [INFO] [TRAIN] epoch: 14, iter: 17300/160000, loss: 1.0902, lr: 0.001080, batch_cost: 0.1866, reader_cost: 0.00330, ips: 42.8761 samples/sec | ETA 07:23:45 2022-08-24 12:19:25 [INFO] [TRAIN] epoch: 14, iter: 17350/160000, loss: 1.0491, lr: 0.001080, batch_cost: 0.2678, reader_cost: 0.00093, ips: 29.8677 samples/sec | ETA 10:36:48 2022-08-24 12:19:40 [INFO] [TRAIN] epoch: 14, iter: 17400/160000, loss: 1.1387, lr: 0.001080, batch_cost: 0.2982, reader_cost: 0.00039, ips: 26.8285 samples/sec | ETA 11:48:41 2022-08-24 12:19:54 [INFO] [TRAIN] epoch: 14, iter: 17450/160000, loss: 1.0821, lr: 0.001079, batch_cost: 0.2891, reader_cost: 0.00044, ips: 27.6725 samples/sec | ETA 11:26:50 2022-08-24 12:20:09 [INFO] [TRAIN] epoch: 14, iter: 17500/160000, loss: 1.1202, lr: 0.001079, batch_cost: 0.3035, reader_cost: 0.00065, ips: 26.3633 samples/sec | ETA 12:00:41 2022-08-24 12:20:23 [INFO] [TRAIN] epoch: 14, iter: 17550/160000, loss: 1.0602, lr: 0.001078, batch_cost: 0.2762, reader_cost: 0.00168, ips: 28.9671 samples/sec | ETA 10:55:41 2022-08-24 12:20:38 [INFO] [TRAIN] epoch: 14, iter: 17600/160000, loss: 1.0718, lr: 0.001078, batch_cost: 0.3012, reader_cost: 0.00126, ips: 26.5584 samples/sec | ETA 11:54:54 2022-08-24 12:20:53 [INFO] [TRAIN] epoch: 14, iter: 17650/160000, loss: 1.0936, lr: 0.001078, batch_cost: 0.2983, reader_cost: 0.00103, ips: 26.8207 samples/sec | ETA 11:47:39 2022-08-24 12:21:10 [INFO] [TRAIN] epoch: 15, iter: 17700/160000, loss: 1.0635, lr: 0.001077, batch_cost: 0.3395, reader_cost: 0.04075, ips: 23.5658 samples/sec | ETA 13:25:07 2022-08-24 12:21:25 [INFO] [TRAIN] epoch: 15, iter: 17750/160000, loss: 1.0475, lr: 0.001077, batch_cost: 0.2917, reader_cost: 0.00065, ips: 27.4210 samples/sec | ETA 11:31:41 2022-08-24 12:21:38 [INFO] [TRAIN] epoch: 15, iter: 17800/160000, loss: 1.0594, lr: 0.001077, batch_cost: 0.2731, reader_cost: 0.00121, ips: 29.2944 samples/sec | ETA 10:47:13 2022-08-24 12:21:53 [INFO] [TRAIN] epoch: 15, iter: 17850/160000, loss: 1.0714, lr: 0.001076, batch_cost: 0.3007, reader_cost: 0.00094, ips: 26.6068 samples/sec | ETA 11:52:20 2022-08-24 12:22:07 [INFO] [TRAIN] epoch: 15, iter: 17900/160000, loss: 1.0906, lr: 0.001076, batch_cost: 0.2768, reader_cost: 0.00108, ips: 28.8967 samples/sec | ETA 10:55:40 2022-08-24 12:22:22 [INFO] [TRAIN] epoch: 15, iter: 17950/160000, loss: 1.0110, lr: 0.001075, batch_cost: 0.2937, reader_cost: 0.00042, ips: 27.2431 samples/sec | ETA 11:35:13 2022-08-24 12:22:36 [INFO] [TRAIN] epoch: 15, iter: 18000/160000, loss: 1.1304, lr: 0.001075, batch_cost: 0.2843, reader_cost: 0.00128, ips: 28.1442 samples/sec | ETA 11:12:43 2022-08-24 12:22:36 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 195s - batch_cost: 0.1950 - reader cost: 8.1824e-04 2022-08-24 12:25:51 [INFO] [EVAL] #Images: 2000 mIoU: 0.2510 Acc: 0.7199 Kappa: 0.6974 Dice: 0.3596 2022-08-24 12:25:51 [INFO] [EVAL] Class IoU: [0.6179 0.745 0.9176 0.6719 0.6299 0.7207 0.7063 0.6792 0.4438 0.6476 0.4197 0.4737 0.6371 0.3307 0.1255 0.3056 0.4216 0.3319 0.4836 0.2737 0.6811 0.427 0.4937 0.388 0.2637 0.4256 0.4229 0.2502 0.3232 0.2838 0.133 0.4022 0.1615 0.2161 0.3028 0.4019 0.276 0.4505 0.1696 0.1868 0.0979 0.0556 0.2253 0.1748 0.2502 0.1571 0.2861 0.3309 0.6271 0.3765 0.3621 0.2789 0.0974 0.1865 0.6175 0.4383 0.7497 0.2269 0.2683 0.2156 0.0111 0.1197 0.293 0.0923 0.2687 0.5398 0.1191 0.3193 0.015 0.1505 0.1716 0.2802 0.3873 0.2134 0.3416 0.2451 0.2605 0.0431 0.1263 0.1536 0.4918 0.2065 0.1285 0.0106 0.2613 0.4091 0.0495 0.0387 0.1722 0.333 0.3233 0.0448 0.1086 0.0504 0.0018 0. 0.1153 0.0215 0.0011 0.1163 0.0267 0.017 0.0013 0.0594 0.01 0.3315 0.0018 0.3975 0.0242 0.16 0.0363 0.1394 0.0051 0.3574 0.7121 0. 0.4004 0.4747 0.072 0.3588 0.4429 0. 0.0724 0.1103 0.163 0.0129 0.4079 0.2318 0. 0.1653 0.5336 0. 0.0003 0.1034 0.0233 0.0661 0.0465 0.0041 0.0491 0.249 0.3013 0.0912 0.3321 0.0163 0.1078 0. 0.1466 0. 0.0633 0.0171] 2022-08-24 12:25:51 [INFO] [EVAL] Class Precision: [0.7072 0.8256 0.961 0.7751 0.714 0.8213 0.8442 0.7493 0.593 0.7646 0.5883 0.619 0.7495 0.4878 0.4365 0.4936 0.615 0.6402 0.6939 0.5452 0.749 0.5595 0.714 0.4994 0.4888 0.496 0.5449 0.7071 0.6424 0.5347 0.2707 0.5463 0.43 0.3684 0.4317 0.5997 0.5577 0.6785 0.2948 0.5222 0.1881 0.2778 0.6294 0.5555 0.3644 0.4305 0.5039 0.6482 0.687 0.4499 0.5778 0.3636 0.3056 0.5167 0.6491 0.6135 0.7733 0.4733 0.5127 0.3703 0.2319 0.3194 0.429 0.6206 0.3079 0.7007 0.2806 0.6107 0.197 0.6201 0.745 0.4798 0.5998 0.3804 0.5142 0.4016 0.6508 0.6044 0.2215 0.4144 0.9076 0.6099 0.7084 0.0314 0.5991 0.7245 0.1734 0.3702 0.3891 0.4818 0.4426 0.3582 0.331 0.2102 0.1299 0. 0.5853 0.698 0.1147 0.4113 0.3565 0.0431 0.247 0.6338 0.2256 0.6747 0.0926 0.6298 0.0742 0.2175 0.2364 0.1902 0.2659 0.7243 0.724 0. 0.7615 0.5794 0.1039 0.5489 0.5743 0. 0.8885 0.6681 0.5874 0.4089 0.8264 0.2941 0. 0.3253 0.6552 0. 0.0024 0.3236 0.8731 0.2457 0.6319 0.1737 0.3215 0.6282 0.4121 0.1062 0.5611 0.3236 0.4775 0. 0.857 0. 0.1562 0.381 ] 2022-08-24 12:25:51 [INFO] [EVAL] Class Recall: [0.8304 0.884 0.9531 0.8347 0.8424 0.8547 0.8121 0.8789 0.6382 0.8089 0.5943 0.6686 0.8095 0.5066 0.1497 0.4451 0.5727 0.4081 0.6147 0.3547 0.8826 0.6432 0.6154 0.6348 0.364 0.7498 0.6539 0.2791 0.3941 0.3769 0.2072 0.604 0.2054 0.3432 0.5034 0.5492 0.3533 0.5727 0.2854 0.2253 0.1695 0.065 0.2597 0.2032 0.444 0.1983 0.3984 0.4034 0.8781 0.6978 0.4925 0.5446 0.125 0.2259 0.9268 0.6054 0.961 0.3035 0.3601 0.3405 0.0116 0.1607 0.4803 0.0978 0.6787 0.7016 0.1714 0.401 0.016 0.1658 0.1824 0.4026 0.5223 0.327 0.5044 0.386 0.3028 0.0444 0.2273 0.1962 0.5177 0.238 0.1357 0.0157 0.3167 0.4845 0.0648 0.0414 0.236 0.5187 0.5453 0.0487 0.1391 0.0622 0.0018 0. 0.1256 0.0217 0.0011 0.1395 0.028 0.0274 0.0013 0.0616 0.0104 0.3946 0.0018 0.5188 0.0347 0.3774 0.0411 0.3426 0.0052 0.4137 0.9776 0. 0.4578 0.7242 0.1902 0.509 0.6595 0. 0.0731 0.1167 0.184 0.0131 0.4461 0.5225 0. 0.2516 0.742 0. 0.0003 0.1319 0.0233 0.0829 0.0478 0.0041 0.0548 0.292 0.5285 0.3926 0.4487 0.0169 0.1223 0. 0.1503 0. 0.0962 0.0176] 2022-08-24 12:25:51 [INFO] [EVAL] The model with the best validation mIoU (0.2572) was saved at iter 16000. 
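Dice in the summary line (0.3596 here, vs mIoU 0.2510) is not an independent measurement: per class, Dice and IoU measure the same overlap and are related by Dice = 2*IoU / (1 + IoU), so Dice is always the larger number. A sketch, under the assumption that the reported value is the mean of per-class Dice scores computed this way (the log itself does not say how it aggregates):

    import numpy as np

    def mean_dice_from_iou(class_iou):
        # Per-class Dice (a.k.a. F1) from per-class IoU: Dice = 2*IoU / (1 + IoU)
        iou = np.asarray(class_iou, dtype=np.float64)
        return (2 * iou / (1 + iou)).mean()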
2022-08-24 12:26:05 [INFO] [TRAIN] epoch: 15, iter: 18050/160000, loss: 1.0268, lr: 0.001075, batch_cost: 0.2801, reader_cost: 0.00344, ips: 28.5567 samples/sec | ETA 11:02:46 2022-08-24 12:26:20 [INFO] [TRAIN] epoch: 15, iter: 18100/160000, loss: 1.0309, lr: 0.001074, batch_cost: 0.2914, reader_cost: 0.00183, ips: 27.4567 samples/sec | ETA 11:29:05 2022-08-24 12:26:34 [INFO] [TRAIN] epoch: 15, iter: 18150/160000, loss: 1.0212, lr: 0.001074, batch_cost: 0.2878, reader_cost: 0.00126, ips: 27.8005 samples/sec | ETA 11:20:19 2022-08-24 12:26:49 [INFO] [TRAIN] epoch: 15, iter: 18200/160000, loss: 1.1580, lr: 0.001074, batch_cost: 0.2854, reader_cost: 0.00120, ips: 28.0271 samples/sec | ETA 11:14:35 2022-08-24 12:27:04 [INFO] [TRAIN] epoch: 15, iter: 18250/160000, loss: 1.0275, lr: 0.001073, batch_cost: 0.2996, reader_cost: 0.00100, ips: 26.7066 samples/sec | ETA 11:47:41 2022-08-24 12:27:18 [INFO] [TRAIN] epoch: 15, iter: 18300/160000, loss: 1.1371, lr: 0.001073, batch_cost: 0.2913, reader_cost: 0.00074, ips: 27.4639 samples/sec | ETA 11:27:56 2022-08-24 12:27:33 [INFO] [TRAIN] epoch: 15, iter: 18350/160000, loss: 1.0535, lr: 0.001072, batch_cost: 0.2968, reader_cost: 0.00047, ips: 26.9542 samples/sec | ETA 11:40:41 2022-08-24 12:27:47 [INFO] [TRAIN] epoch: 15, iter: 18400/160000, loss: 1.0963, lr: 0.001072, batch_cost: 0.2850, reader_cost: 0.00084, ips: 28.0718 samples/sec | ETA 11:12:33 2022-08-24 12:28:02 [INFO] [TRAIN] epoch: 15, iter: 18450/160000, loss: 1.0985, lr: 0.001072, batch_cost: 0.2940, reader_cost: 0.00066, ips: 27.2140 samples/sec | ETA 11:33:30 2022-08-24 12:28:16 [INFO] [TRAIN] epoch: 15, iter: 18500/160000, loss: 1.0531, lr: 0.001071, batch_cost: 0.2761, reader_cost: 0.00051, ips: 28.9726 samples/sec | ETA 10:51:11 2022-08-24 12:28:30 [INFO] [TRAIN] epoch: 15, iter: 18550/160000, loss: 1.0197, lr: 0.001071, batch_cost: 0.2737, reader_cost: 0.00049, ips: 29.2284 samples/sec | ETA 10:45:15 2022-08-24 12:28:43 [INFO] [TRAIN] epoch: 15, iter: 18600/160000, loss: 1.0172, lr: 0.001071, batch_cost: 0.2628, reader_cost: 0.00049, ips: 30.4445 samples/sec | ETA 10:19:16 2022-08-24 12:28:57 [INFO] [TRAIN] epoch: 15, iter: 18650/160000, loss: 1.0389, lr: 0.001070, batch_cost: 0.2934, reader_cost: 0.00044, ips: 27.2704 samples/sec | ETA 11:31:06 2022-08-24 12:29:10 [INFO] [TRAIN] epoch: 15, iter: 18700/160000, loss: 1.1111, lr: 0.001070, batch_cost: 0.2505, reader_cost: 0.00091, ips: 31.9342 samples/sec | ETA 09:49:57 2022-08-24 12:29:21 [INFO] [TRAIN] epoch: 15, iter: 18750/160000, loss: 1.0677, lr: 0.001069, batch_cost: 0.2250, reader_cost: 0.00038, ips: 35.5542 samples/sec | ETA 08:49:42 2022-08-24 12:29:31 [INFO] [TRAIN] epoch: 15, iter: 18800/160000, loss: 0.9853, lr: 0.001069, batch_cost: 0.1914, reader_cost: 0.00051, ips: 41.7915 samples/sec | ETA 07:30:29 2022-08-24 12:29:42 [INFO] [TRAIN] epoch: 15, iter: 18850/160000, loss: 1.0817, lr: 0.001069, batch_cost: 0.2272, reader_cost: 0.00048, ips: 35.2154 samples/sec | ETA 08:54:25 2022-08-24 12:29:54 [INFO] [TRAIN] epoch: 15, iter: 18900/160000, loss: 1.0925, lr: 0.001068, batch_cost: 0.2331, reader_cost: 0.00050, ips: 34.3262 samples/sec | ETA 09:08:04 2022-08-24 12:30:05 [INFO] [TRAIN] epoch: 16, iter: 18950/160000, loss: 1.1374, lr: 0.001068, batch_cost: 0.2330, reader_cost: 0.03616, ips: 34.3331 samples/sec | ETA 09:07:46 2022-08-24 12:30:18 [INFO] [TRAIN] epoch: 16, iter: 19000/160000, loss: 1.0113, lr: 0.001068, batch_cost: 0.2575, reader_cost: 0.00677, ips: 31.0658 samples/sec | ETA 10:05:10 2022-08-24 12:30:18 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 187s - batch_cost: 0.1868 - reader cost: 0.0010 2022-08-24 12:33:25 [INFO] [EVAL] #Images: 2000 mIoU: 0.2595 Acc: 0.7221 Kappa: 0.6997 Dice: 0.3691 2022-08-24 12:33:25 [INFO] [EVAL] Class IoU: [0.6188 0.745 0.9167 0.6688 0.6501 0.7086 0.7281 0.6654 0.4468 0.6515 0.4171 0.4644 0.6337 0.2997 0.1185 0.2969 0.452 0.3861 0.4959 0.3111 0.6955 0.4137 0.5095 0.3916 0.2779 0.4106 0.4507 0.2789 0.3447 0.2795 0.1367 0.4001 0.2012 0.1878 0.285 0.3793 0.2653 0.3928 0.1918 0.251 0.0613 0.0426 0.2468 0.1506 0.2303 0.1853 0.2793 0.3036 0.5137 0.4121 0.3483 0.2839 0.1283 0.1997 0.6502 0.3401 0.7828 0.2552 0.1498 0.2622 0.0621 0.117 0.2936 0.0358 0.315 0.507 0.179 0.3452 0.0345 0.2024 0.2566 0.3587 0.3811 0.2459 0.2841 0.2531 0.3578 0.0416 0.1465 0.2593 0.6507 0.2065 0.1788 0.0097 0.3271 0.432 0.0176 0.0278 0.1208 0.3714 0.3292 0.0073 0.1345 0.0261 0.0248 0.0001 0.0693 0.0067 0.0158 0.2118 0.0041 0.0116 0.0326 0.4776 0.0061 0.3116 0.0161 0.4507 0.0447 0.0572 0.0182 0.222 0.0192 0.457 0.6651 0. 0.4091 0.4797 0. 0.1354 0.3649 0. 0.0945 0.0351 0.1644 0.0218 0.3979 0.3357 0.0007 0.1562 0.5694 0. 0. 0.0717 0.0269 0.0271 0.0462 0. 0.0563 0.2919 0.3357 0. 0.2405 0.4225 0.1874 0. 0.1201 0.005 0.0343 0.0093] 2022-08-24 12:33:25 [INFO] [EVAL] Class Precision: [0.7058 0.8267 0.954 0.7625 0.7519 0.8514 0.8327 0.7362 0.6158 0.7242 0.6186 0.66 0.7371 0.5314 0.3576 0.4927 0.6436 0.6832 0.6447 0.5216 0.7664 0.5995 0.7361 0.5582 0.3882 0.5242 0.533 0.6823 0.5897 0.4036 0.3463 0.6468 0.4923 0.3489 0.4057 0.5703 0.5761 0.7502 0.3495 0.434 0.261 0.2633 0.6286 0.569 0.3552 0.3528 0.4963 0.7031 0.529 0.5824 0.6013 0.3658 0.304 0.5883 0.7038 0.4088 0.8287 0.4786 0.5493 0.4001 0.1307 0.2589 0.4121 0.6295 0.4587 0.5711 0.3638 0.5735 0.2296 0.5642 0.5734 0.4816 0.6452 0.457 0.3363 0.4457 0.4372 0.4041 0.5623 0.3249 0.8772 0.5797 0.7238 0.0729 0.5609 0.7 0.1899 0.3826 0.7631 0.7707 0.431 0.0149 0.2978 0.231 0.1577 0.0011 0.1045 0.6888 0.2685 0.3634 0.2587 0.0202 0.4427 0.8572 0.0776 0.6168 0.0935 0.7664 0.1274 0.1127 0.2488 0.3823 0.2009 0.5896 0.6708 0. 0.7669 0.5682 0. 0.6954 0.7597 0. 0.9953 0.8727 0.5511 0.4082 0.683 0.4945 0.0084 0.4405 0.7201 0. 0.0003 0.4419 0.6156 0.2605 0.4402 0. 0.2952 0.5744 0.7594 0. 0.7568 0.6742 0.3912 0. 0.9022 0.5151 0.1962 0.4076] 2022-08-24 12:33:25 [INFO] [EVAL] Class Recall: [0.8338 0.8828 0.9591 0.8447 0.8275 0.8086 0.8528 0.8737 0.6194 0.8664 0.5616 0.6104 0.8188 0.4074 0.1506 0.4277 0.603 0.4703 0.6824 0.4353 0.8826 0.5717 0.6234 0.5675 0.4946 0.6546 0.7448 0.3205 0.4535 0.4762 0.1842 0.512 0.2538 0.2891 0.4891 0.531 0.3296 0.4519 0.2984 0.3731 0.0741 0.0483 0.2889 0.1699 0.3958 0.2807 0.3898 0.3483 0.9467 0.585 0.453 0.5592 0.1817 0.2322 0.8951 0.6695 0.934 0.3534 0.1708 0.4319 0.1059 0.1759 0.5052 0.0366 0.5014 0.8188 0.2607 0.4644 0.039 0.2399 0.3171 0.5844 0.4822 0.3474 0.6465 0.3693 0.6633 0.0444 0.1654 0.5623 0.7159 0.2428 0.1919 0.011 0.4396 0.5301 0.019 0.0291 0.1255 0.4175 0.5821 0.0142 0.1969 0.0286 0.0286 0.0001 0.1706 0.0068 0.0165 0.3368 0.0041 0.0265 0.034 0.5188 0.0066 0.3864 0.0191 0.5225 0.0644 0.104 0.0193 0.3461 0.0208 0.6702 0.9875 0. 0.4672 0.7549 0. 0.1439 0.4125 0. 0.0946 0.0353 0.1898 0.0225 0.488 0.5112 0.0008 0.1949 0.7312 0. 0. 0.0788 0.0274 0.0294 0.0491 0. 0.065 0.3724 0.3756 0. 0.2607 0.5309 0.2646 0. 0.1217 0.005 0.0399 0.0095] 2022-08-24 12:33:26 [INFO] [EVAL] The model with the best validation mIoU (0.2595) was saved at iter 19000. 
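One pattern worth noticing in the [TRAIN] blocks: reader_cost sits around 0.0005-0.001 s for most steps but jumps to roughly 0.03-0.05 s on the first logged step of each new epoch (e.g. iter 18950 above, where the epoch counter ticks from 15 to 16), which is the usual cost of the DataLoader workers being restarted. Comparing reader_cost to batch_cost is a quick way to check whether data loading is ever the bottleneck; a sketch with values taken from the block above:

    steps = [
        (18950, 0.2330, 0.03616),   # first logged step of epoch 16: reader restart spike
        (18900, 0.2331, 0.00050),   # steady state
    ]
    for it, batch_cost, reader_cost in steps:
        share = 100 * reader_cost / batch_cost
        print(f"iter {it}: {share:.1f}% of the step spent waiting on data")
    # ~15.5% at the epoch boundary vs ~0.2% otherwise -- well short of a bottleneck here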
2022-08-24 12:33:39 [INFO] [TRAIN] epoch: 16, iter: 19050/160000, loss: 0.9872, lr: 0.001067, batch_cost: 0.2722, reader_cost: 0.00421, ips: 29.3914 samples/sec | ETA 10:39:25 2022-08-24 12:33:54 [INFO] [TRAIN] epoch: 16, iter: 19100/160000, loss: 1.0726, lr: 0.001067, batch_cost: 0.2945, reader_cost: 0.00141, ips: 27.1607 samples/sec | ETA 11:31:41 2022-08-24 12:34:09 [INFO] [TRAIN] epoch: 16, iter: 19150/160000, loss: 1.0238, lr: 0.001066, batch_cost: 0.3004, reader_cost: 0.00114, ips: 26.6327 samples/sec | ETA 11:45:08 2022-08-24 12:34:23 [INFO] [TRAIN] epoch: 16, iter: 19200/160000, loss: 1.0782, lr: 0.001066, batch_cost: 0.2827, reader_cost: 0.00087, ips: 28.2946 samples/sec | ETA 11:03:29 2022-08-24 12:34:37 [INFO] [TRAIN] epoch: 16, iter: 19250/160000, loss: 1.0648, lr: 0.001066, batch_cost: 0.2883, reader_cost: 0.00057, ips: 27.7462 samples/sec | ETA 11:16:22 2022-08-24 12:34:53 [INFO] [TRAIN] epoch: 16, iter: 19300/160000, loss: 1.1166, lr: 0.001065, batch_cost: 0.3054, reader_cost: 0.00090, ips: 26.1912 samples/sec | ETA 11:56:16 2022-08-24 12:35:07 [INFO] [TRAIN] epoch: 16, iter: 19350/160000, loss: 1.0258, lr: 0.001065, batch_cost: 0.2883, reader_cost: 0.00043, ips: 27.7453 samples/sec | ETA 11:15:54 2022-08-24 12:35:21 [INFO] [TRAIN] epoch: 16, iter: 19400/160000, loss: 1.0925, lr: 0.001064, batch_cost: 0.2811, reader_cost: 0.00079, ips: 28.4601 samples/sec | ETA 10:58:42 2022-08-24 12:35:35 [INFO] [TRAIN] epoch: 16, iter: 19450/160000, loss: 1.0477, lr: 0.001064, batch_cost: 0.2832, reader_cost: 0.00058, ips: 28.2462 samples/sec | ETA 11:03:27 2022-08-24 12:35:50 [INFO] [TRAIN] epoch: 16, iter: 19500/160000, loss: 1.0558, lr: 0.001064, batch_cost: 0.2997, reader_cost: 0.00068, ips: 26.6894 samples/sec | ETA 11:41:54 2022-08-24 12:36:03 [INFO] [TRAIN] epoch: 16, iter: 19550/160000, loss: 1.0166, lr: 0.001063, batch_cost: 0.2522, reader_cost: 0.00047, ips: 31.7149 samples/sec | ETA 09:50:28 2022-08-24 12:36:17 [INFO] [TRAIN] epoch: 16, iter: 19600/160000, loss: 1.0909, lr: 0.001063, batch_cost: 0.2867, reader_cost: 0.00068, ips: 27.9042 samples/sec | ETA 11:10:52 2022-08-24 12:36:32 [INFO] [TRAIN] epoch: 16, iter: 19650/160000, loss: 1.0975, lr: 0.001063, batch_cost: 0.2833, reader_cost: 0.00078, ips: 28.2378 samples/sec | ETA 11:02:42 2022-08-24 12:36:46 [INFO] [TRAIN] epoch: 16, iter: 19700/160000, loss: 0.9986, lr: 0.001062, batch_cost: 0.2994, reader_cost: 0.00056, ips: 26.7185 samples/sec | ETA 11:40:08 2022-08-24 12:37:01 [INFO] [TRAIN] epoch: 16, iter: 19750/160000, loss: 1.0591, lr: 0.001062, batch_cost: 0.2811, reader_cost: 0.00040, ips: 28.4591 samples/sec | ETA 10:57:04 2022-08-24 12:37:15 [INFO] [TRAIN] epoch: 16, iter: 19800/160000, loss: 1.0338, lr: 0.001061, batch_cost: 0.2890, reader_cost: 0.00097, ips: 27.6834 samples/sec | ETA 11:15:15 2022-08-24 12:37:29 [INFO] [TRAIN] epoch: 16, iter: 19850/160000, loss: 1.0440, lr: 0.001061, batch_cost: 0.2715, reader_cost: 0.00070, ips: 29.4641 samples/sec | ETA 10:34:13 2022-08-24 12:37:43 [INFO] [TRAIN] epoch: 16, iter: 19900/160000, loss: 0.9888, lr: 0.001061, batch_cost: 0.2908, reader_cost: 0.00078, ips: 27.5143 samples/sec | ETA 11:18:55 2022-08-24 12:37:53 [INFO] [TRAIN] epoch: 16, iter: 19950/160000, loss: 1.0561, lr: 0.001060, batch_cost: 0.1931, reader_cost: 0.00114, ips: 41.4244 samples/sec | ETA 07:30:46 2022-08-24 12:38:03 [INFO] [TRAIN] epoch: 16, iter: 20000/160000, loss: 1.0706, lr: 0.001060, batch_cost: 0.1959, reader_cost: 0.00073, ips: 40.8344 samples/sec | ETA 07:37:07 2022-08-24 12:38:03 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 177s - batch_cost: 0.1766 - reader cost: 7.3157e-04 2022-08-24 12:41:00 [INFO] [EVAL] #Images: 2000 mIoU: 0.2593 Acc: 0.7242 Kappa: 0.7028 Dice: 0.3696 2022-08-24 12:41:00 [INFO] [EVAL] Class IoU: [0.6257 0.7473 0.9163 0.6756 0.6412 0.721 0.7268 0.6812 0.4486 0.6399 0.4204 0.4758 0.6222 0.2956 0.1304 0.3222 0.4702 0.3752 0.4681 0.3114 0.6954 0.4457 0.5079 0.4138 0.2647 0.4302 0.4611 0.2866 0.2834 0.2589 0.1054 0.4073 0.2188 0.2062 0.268 0.3652 0.298 0.4231 0.1794 0.2038 0.1016 0.0558 0.2578 0.1869 0.2433 0.0715 0.2881 0.343 0.5942 0.3993 0.3775 0.2857 0.1667 0.2309 0.6719 0.2323 0.7871 0.267 0.1127 0.1859 0.0175 0.1739 0.251 0.1284 0.3266 0.5579 0.1467 0.3565 0.046 0.1876 0.2519 0.3519 0.3428 0.1829 0.3622 0.2335 0.4381 0.1396 0.1809 0.1669 0.6687 0.2314 0.1865 0.0115 0.3341 0.433 0.0534 0.0327 0.1583 0.3585 0.3728 0. 0.1815 0.0233 0.0188 0.0008 0.046 0.0457 0.0666 0.1139 0.0186 0.0002 0.0449 0.2067 0.0244 0.3274 0.008 0.4726 0.0273 0.207 0.008 0.1548 0.0091 0.51 0.7838 0. 0.2289 0.4066 0.0057 0.3152 0.3352 0. 0.1223 0.1272 0.1685 0.035 0.3482 0.3242 0. 0.1062 0.5607 0. 0.007 0.0606 0.0243 0.0249 0.0457 0.0029 0.0539 0.2839 0.0993 0.091 0.228 0.2845 0.1326 0. 0.1599 0.0041 0.0317 0.0033] 2022-08-24 12:41:00 [INFO] [EVAL] Class Precision: [0.7352 0.8367 0.9482 0.7867 0.725 0.8352 0.827 0.7481 0.5692 0.7018 0.5911 0.6351 0.6897 0.5406 0.3874 0.4327 0.655 0.6334 0.5768 0.5071 0.7904 0.6115 0.6911 0.5244 0.4364 0.5601 0.5267 0.7137 0.7065 0.5444 0.3152 0.5116 0.426 0.4097 0.5316 0.5035 0.5263 0.7613 0.3883 0.4691 0.2073 0.2602 0.6001 0.4621 0.5086 0.3753 0.5519 0.6271 0.6267 0.4977 0.5447 0.3559 0.3125 0.6142 0.7299 0.261 0.844 0.4504 0.6089 0.3714 0.4313 0.3095 0.3911 0.7415 0.462 0.6581 0.3383 0.5179 0.1225 0.567 0.4403 0.4929 0.6949 0.4323 0.5285 0.435 0.5372 0.5435 0.4881 0.3545 0.8331 0.3991 0.726 0.153 0.5419 0.6352 0.2228 0.2495 0.6786 0.5974 0.5555 0. 0.3327 0.175 0.2157 0.0069 0.1848 0.9111 0.508 0.4026 0.2277 0.0008 0.4218 0.8499 0.2158 0.4936 0.0378 0.7495 0.0907 0.2974 0.1988 0.1995 0.2015 0.6136 0.8046 0. 0.9542 0.4828 0.0295 0.6633 0.7199 0. 0.8114 0.5764 0.6185 0.4549 0.8721 0.4624 0. 0.6009 0.8705 0. 0.0729 0.7069 0.5306 0.2694 0.4792 0.0368 0.3094 0.5444 0.2527 0.1432 0.8252 0.6087 0.4361 0. 0.8498 0.5343 0.2634 0.1517] 2022-08-24 12:41:00 [INFO] [EVAL] Class Recall: [0.8077 0.8749 0.9646 0.8272 0.8473 0.8406 0.8571 0.884 0.6792 0.8788 0.5929 0.6548 0.8641 0.3948 0.1643 0.5579 0.625 0.4793 0.713 0.4465 0.8526 0.6218 0.657 0.6624 0.4023 0.6498 0.7873 0.3238 0.3212 0.3305 0.1367 0.6664 0.3103 0.2932 0.3508 0.5709 0.4071 0.4879 0.2501 0.2648 0.166 0.0663 0.3113 0.2389 0.3181 0.0812 0.3761 0.431 0.9196 0.6689 0.5515 0.5917 0.2633 0.2701 0.8943 0.6786 0.9212 0.3961 0.1215 0.2712 0.0179 0.2842 0.412 0.1344 0.527 0.7857 0.2058 0.5335 0.0687 0.219 0.3706 0.5516 0.4036 0.2408 0.5352 0.3353 0.7037 0.1581 0.2233 0.2398 0.7722 0.3552 0.2007 0.0122 0.4656 0.5763 0.0657 0.0362 0.1711 0.4727 0.5313 0. 0.2855 0.0261 0.0202 0.0009 0.0577 0.046 0.0712 0.1371 0.0199 0.0003 0.0478 0.2146 0.0268 0.4929 0.0101 0.5612 0.0377 0.4051 0.0083 0.4086 0.0094 0.7512 0.9682 0. 0.2315 0.7202 0.0071 0.3753 0.3855 0. 0.1259 0.1403 0.188 0.0366 0.367 0.5204 0. 0.1143 0.6117 0. 0.0077 0.0622 0.0248 0.0267 0.0481 0.0031 0.0612 0.3723 0.1405 0.2 0.2396 0.3482 0.16 0. 0.1646 0.0041 0.0348 0.0033] 2022-08-24 12:41:00 [INFO] [EVAL] The model with the best validation mIoU (0.2595) was saved at iter 19000. 
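The lr column decays very slowly (0.001097 near iter 15000 down to 0.001060 by iter 20000). That pace is what a polynomial ("poly") schedule of the form lr = base_lr * (1 - iter/max_iter)^power produces. The base_lr and power are not printed in this log, so the constants below (0.0012 and 0.9) are assumptions; they reproduce the logged values only to within roughly 5e-6, the residual presumably coming from warmup or a slightly different base value:

    def poly_lr(it, base_lr=0.0012, power=0.9, max_iter=160000):
        # Assumed polynomial decay; base_lr and power are not taken from the log.
        return base_lr * (1 - it / max_iter) ** power

    print(f"{poly_lr(15050):.6f}")   # 0.001098 (log prints 0.001097)
    print(f"{poly_lr(20000):.6f}")   # 0.001064 (log prints 0.001060)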
2022-08-24 12:41:10 [INFO] [TRAIN] epoch: 16, iter: 20050/160000, loss: 1.0761, lr: 0.001060, batch_cost: 0.2112, reader_cost: 0.00374, ips: 37.8876 samples/sec | ETA 08:12:30 2022-08-24 12:41:22 [INFO] [TRAIN] epoch: 16, iter: 20100/160000, loss: 1.0077, lr: 0.001059, batch_cost: 0.2254, reader_cost: 0.01027, ips: 35.4850 samples/sec | ETA 08:45:40 2022-08-24 12:41:35 [INFO] [TRAIN] epoch: 16, iter: 20150/160000, loss: 1.0926, lr: 0.001059, batch_cost: 0.2696, reader_cost: 0.02321, ips: 29.6730 samples/sec | ETA 10:28:24 2022-08-24 12:41:48 [INFO] [TRAIN] epoch: 16, iter: 20200/160000, loss: 1.0205, lr: 0.001058, batch_cost: 0.2641, reader_cost: 0.02472, ips: 30.2929 samples/sec | ETA 10:15:19 2022-08-24 12:42:01 [INFO] [TRAIN] epoch: 17, iter: 20250/160000, loss: 1.1143, lr: 0.001058, batch_cost: 0.2643, reader_cost: 0.05403, ips: 30.2683 samples/sec | ETA 10:15:36 2022-08-24 12:42:12 [INFO] [TRAIN] epoch: 17, iter: 20300/160000, loss: 1.0410, lr: 0.001058, batch_cost: 0.2080, reader_cost: 0.00048, ips: 38.4529 samples/sec | ETA 08:04:24 2022-08-24 12:42:23 [INFO] [TRAIN] epoch: 17, iter: 20350/160000, loss: 1.0487, lr: 0.001057, batch_cost: 0.2175, reader_cost: 0.00075, ips: 36.7877 samples/sec | ETA 08:26:08 2022-08-24 12:42:34 [INFO] [TRAIN] epoch: 17, iter: 20400/160000, loss: 1.0621, lr: 0.001057, batch_cost: 0.2204, reader_cost: 0.00088, ips: 36.2958 samples/sec | ETA 08:32:49 2022-08-24 12:42:46 [INFO] [TRAIN] epoch: 17, iter: 20450/160000, loss: 0.9556, lr: 0.001057, batch_cost: 0.2370, reader_cost: 0.00133, ips: 33.7571 samples/sec | ETA 09:11:11 2022-08-24 12:42:57 [INFO] [TRAIN] epoch: 17, iter: 20500/160000, loss: 1.0391, lr: 0.001056, batch_cost: 0.2332, reader_cost: 0.00061, ips: 34.3110 samples/sec | ETA 09:02:06 2022-08-24 12:43:06 [INFO] [TRAIN] epoch: 17, iter: 20550/160000, loss: 1.0108, lr: 0.001056, batch_cost: 0.1835, reader_cost: 0.00066, ips: 43.5982 samples/sec | ETA 07:06:28 2022-08-24 12:43:20 [INFO] [TRAIN] epoch: 17, iter: 20600/160000, loss: 1.0511, lr: 0.001055, batch_cost: 0.2750, reader_cost: 0.00081, ips: 29.0894 samples/sec | ETA 10:38:57 2022-08-24 12:43:34 [INFO] [TRAIN] epoch: 17, iter: 20650/160000, loss: 0.9986, lr: 0.001055, batch_cost: 0.2661, reader_cost: 0.00076, ips: 30.0661 samples/sec | ETA 10:17:58 2022-08-24 12:43:44 [INFO] [TRAIN] epoch: 17, iter: 20700/160000, loss: 1.1165, lr: 0.001055, batch_cost: 0.2050, reader_cost: 0.00045, ips: 39.0332 samples/sec | ETA 07:55:50 2022-08-24 12:43:54 [INFO] [TRAIN] epoch: 17, iter: 20750/160000, loss: 1.0401, lr: 0.001054, batch_cost: 0.2062, reader_cost: 0.00214, ips: 38.7903 samples/sec | ETA 07:58:38 2022-08-24 12:44:04 [INFO] [TRAIN] epoch: 17, iter: 20800/160000, loss: 1.0788, lr: 0.001054, batch_cost: 0.1963, reader_cost: 0.00042, ips: 40.7560 samples/sec | ETA 07:35:23 2022-08-24 12:44:14 [INFO] [TRAIN] epoch: 17, iter: 20850/160000, loss: 1.1053, lr: 0.001054, batch_cost: 0.2062, reader_cost: 0.00152, ips: 38.8039 samples/sec | ETA 07:58:07 2022-08-24 12:44:24 [INFO] [TRAIN] epoch: 17, iter: 20900/160000, loss: 1.0663, lr: 0.001053, batch_cost: 0.1974, reader_cost: 0.00263, ips: 40.5295 samples/sec | ETA 07:37:36 2022-08-24 12:44:33 [INFO] [TRAIN] epoch: 17, iter: 20950/160000, loss: 1.0122, lr: 0.001053, batch_cost: 0.1840, reader_cost: 0.00033, ips: 43.4767 samples/sec | ETA 07:06:26 2022-08-24 12:44:43 [INFO] [TRAIN] epoch: 17, iter: 21000/160000, loss: 1.0452, lr: 0.001052, batch_cost: 0.1952, reader_cost: 0.00347, ips: 40.9903 samples/sec | ETA 07:32:08 2022-08-24 12:44:43 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 186s - batch_cost: 0.1858 - reader cost: 9.3691e-04 2022-08-24 12:47:49 [INFO] [EVAL] #Images: 2000 mIoU: 0.2621 Acc: 0.7238 Kappa: 0.7020 Dice: 0.3739 2022-08-24 12:47:49 [INFO] [EVAL] Class IoU: [0.6249 0.7411 0.9163 0.6729 0.639 0.7096 0.7339 0.6848 0.4429 0.66 0.4164 0.5037 0.6352 0.3096 0.1172 0.3264 0.4727 0.3186 0.4918 0.3144 0.7033 0.4424 0.5023 0.4144 0.2685 0.3971 0.4441 0.2576 0.3034 0.2932 0.1568 0.3605 0.1781 0.1938 0.2998 0.3999 0.2764 0.4048 0.2053 0.2304 0.0637 0.0728 0.2604 0.1663 0.2596 0.1709 0.2724 0.3412 0.6429 0.3856 0.3424 0.2469 0.1466 0.2316 0.6714 0.3907 0.7363 0.2194 0.2032 0.2082 0.0091 0.1322 0.2589 0.1007 0.3438 0.5723 0.1693 0.3397 0.0484 0.2281 0.2889 0.3282 0.379 0.1786 0.3077 0.2364 0.385 0.1542 0.1909 0.1035 0.5596 0.202 0.1339 0.0241 0.3375 0.4389 0.0287 0.0217 0.1528 0.3865 0.3118 0. 0.1685 0.0358 0.0263 0.0013 0.0835 0.0153 0.0667 0.2359 0.0307 0.0042 0.099 0.6276 0.0048 0.372 0.0346 0.3229 0.03 0.0958 0.0129 0.1721 0.0216 0.4011 0.5722 0. 0.2961 0.4267 0.0281 0.239 0.4376 0. 0.1784 0.0407 0.1819 0.0185 0.3795 0.3231 0. 0.0595 0.6016 0. 0.0004 0.1182 0.0451 0.0369 0.0207 0. 0.0534 0.2469 0.2967 0.0018 0.3042 0.2725 0.2206 0. 0.1583 0.0056 0.0347 0.006 ] 2022-08-24 12:47:49 [INFO] [EVAL] Class Precision: [0.7225 0.83 0.9564 0.7625 0.7197 0.8312 0.8623 0.7821 0.5633 0.7415 0.6174 0.6285 0.7431 0.5138 0.4773 0.4713 0.6489 0.7238 0.7165 0.5118 0.7976 0.5816 0.7235 0.5444 0.4389 0.4416 0.5251 0.7296 0.659 0.5197 0.2942 0.4464 0.4832 0.3207 0.4364 0.5387 0.6054 0.703 0.4536 0.4705 0.2682 0.2203 0.5127 0.5562 0.3784 0.4 0.4699 0.534 0.7209 0.4504 0.5917 0.3056 0.2885 0.4763 0.7207 0.5652 0.7682 0.4944 0.5923 0.5197 0.2774 0.2714 0.3703 0.7756 0.4624 0.6658 0.3262 0.477 0.1373 0.6946 0.4639 0.4572 0.6691 0.4957 0.4097 0.5313 0.4746 0.477 0.456 0.2239 0.8089 0.6456 0.702 0.0744 0.6263 0.6818 0.1478 0.369 0.5745 0.6908 0.3989 0. 0.3365 0.2165 0.3678 0.0238 0.3208 0.9787 0.3086 0.4509 0.2007 0.0086 0.4185 0.8175 0.0634 0.445 0.1288 0.8731 0.0728 0.1795 0.1318 0.2945 0.2223 0.7046 0.5752 0. 0.7997 0.4854 0.075 0.6924 0.5461 0. 0.9527 0.817 0.6016 0.4678 0.652 0.4827 0. 0.3016 0.8135 0. 0.0056 0.2921 0.4202 0.2261 0.7578 0. 0.3929 0.6408 0.5512 0.0042 0.4431 0.5374 0.477 0. 0.8962 0.5051 0.3324 0.1642] 2022-08-24 12:47:49 [INFO] [EVAL] Class Recall: [0.8222 0.8737 0.9563 0.8513 0.8507 0.8292 0.8313 0.8462 0.6745 0.8573 0.5613 0.7172 0.8138 0.4379 0.1344 0.5151 0.6351 0.3628 0.6106 0.449 0.8561 0.6489 0.6216 0.6345 0.4088 0.7975 0.7421 0.2848 0.36 0.4023 0.2514 0.6521 0.2201 0.3288 0.4892 0.6081 0.3371 0.4883 0.2727 0.3111 0.0771 0.098 0.346 0.1917 0.4525 0.2298 0.3933 0.486 0.8559 0.7281 0.4483 0.5623 0.2296 0.3107 0.9075 0.5587 0.9465 0.2829 0.2362 0.2579 0.0093 0.2049 0.4625 0.1037 0.5726 0.8031 0.2602 0.5412 0.0695 0.2536 0.4338 0.5378 0.4665 0.2183 0.5528 0.2987 0.671 0.1856 0.2472 0.1615 0.6448 0.2271 0.142 0.0344 0.4227 0.5519 0.0344 0.0225 0.1723 0.4673 0.5882 0. 0.2522 0.0412 0.0276 0.0014 0.1015 0.0153 0.0784 0.3309 0.035 0.0083 0.1148 0.7299 0.0052 0.694 0.0452 0.3388 0.0485 0.1704 0.0141 0.2928 0.0233 0.4821 0.991 0. 0.3198 0.7792 0.0429 0.2674 0.6878 0. 0.18 0.0411 0.2068 0.0189 0.4759 0.4942 0. 0.0691 0.6978 0. 0.0005 0.1657 0.0481 0.0422 0.0208 0. 0.0582 0.2866 0.3913 0.003 0.4924 0.3561 0.2909 0. 0.1613 0.0057 0.0372 0.0062] 2022-08-24 12:47:49 [INFO] [EVAL] The model with the best validation mIoU (0.2621) was saved at iter 21000. 
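Evaluating every 1000 iterations is not free: the eval above ran 1000 iterations at ~0.186 s each (about 186 s of wall time), while the 1000 training iterations leading up to it (12:38:03 to 12:44:43) took about 400 s, so roughly a third of the wall-clock time in this stretch goes to validation. The arithmetic, with the numbers read off the log:

    eval_iters, eval_batch_cost = 1000, 0.1858     # from the eval progress line above
    train_seconds = 400                            # iters 20000 -> 21000: 12:38:03 to 12:44:43

    eval_seconds = eval_iters * eval_batch_cost    # ~186 s, matching the "186s" in the log
    print(f"eval adds {100 * eval_seconds / train_seconds:.0f}% on top of training time")
    print(f"i.e. {100 * eval_seconds / (train_seconds + eval_seconds):.0f}% of wall time")
    # ~46% extra / ~32% of the wall clock between these two checkpoints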
2022-08-24 12:48:01 [INFO] [TRAIN] epoch: 17, iter: 21050/160000, loss: 0.9645, lr: 0.001052, batch_cost: 0.2324, reader_cost: 0.00329, ips: 34.4160 samples/sec | ETA 08:58:18 2022-08-24 12:48:10 [INFO] [TRAIN] epoch: 17, iter: 21100/160000, loss: 1.0280, lr: 0.001052, batch_cost: 0.1855, reader_cost: 0.00120, ips: 43.1179 samples/sec | ETA 07:09:31 2022-08-24 12:48:24 [INFO] [TRAIN] epoch: 17, iter: 21150/160000, loss: 1.0222, lr: 0.001051, batch_cost: 0.2722, reader_cost: 0.01026, ips: 29.3866 samples/sec | ETA 10:29:59 2022-08-24 12:48:34 [INFO] [TRAIN] epoch: 17, iter: 21200/160000, loss: 1.0477, lr: 0.001051, batch_cost: 0.2001, reader_cost: 0.00087, ips: 39.9895 samples/sec | ETA 07:42:47 2022-08-24 12:48:44 [INFO] [TRAIN] epoch: 17, iter: 21250/160000, loss: 1.0898, lr: 0.001050, batch_cost: 0.1974, reader_cost: 0.00098, ips: 40.5264 samples/sec | ETA 07:36:29 2022-08-24 12:48:55 [INFO] [TRAIN] epoch: 17, iter: 21300/160000, loss: 1.0325, lr: 0.001050, batch_cost: 0.2225, reader_cost: 0.00473, ips: 35.9494 samples/sec | ETA 08:34:25 2022-08-24 12:49:05 [INFO] [TRAIN] epoch: 17, iter: 21350/160000, loss: 1.0253, lr: 0.001050, batch_cost: 0.2015, reader_cost: 0.00079, ips: 39.7119 samples/sec | ETA 07:45:31 2022-08-24 12:49:14 [INFO] [TRAIN] epoch: 17, iter: 21400/160000, loss: 1.0042, lr: 0.001049, batch_cost: 0.1897, reader_cost: 0.00085, ips: 42.1666 samples/sec | ETA 07:18:15 2022-08-24 12:49:24 [INFO] [TRAIN] epoch: 17, iter: 21450/160000, loss: 1.0004, lr: 0.001049, batch_cost: 0.1860, reader_cost: 0.00031, ips: 43.0000 samples/sec | ETA 07:09:36 2022-08-24 12:49:36 [INFO] [TRAIN] epoch: 18, iter: 21500/160000, loss: 0.9674, lr: 0.001049, batch_cost: 0.2511, reader_cost: 0.03579, ips: 31.8585 samples/sec | ETA 09:39:38 2022-08-24 12:49:46 [INFO] [TRAIN] epoch: 18, iter: 21550/160000, loss: 0.9748, lr: 0.001048, batch_cost: 0.2010, reader_cost: 0.00088, ips: 39.8069 samples/sec | ETA 07:43:44 2022-08-24 12:49:56 [INFO] [TRAIN] epoch: 18, iter: 21600/160000, loss: 1.0524, lr: 0.001048, batch_cost: 0.1997, reader_cost: 0.00392, ips: 40.0635 samples/sec | ETA 07:40:36 2022-08-24 12:50:07 [INFO] [TRAIN] epoch: 18, iter: 21650/160000, loss: 0.9613, lr: 0.001047, batch_cost: 0.2061, reader_cost: 0.00547, ips: 38.8075 samples/sec | ETA 07:55:20 2022-08-24 12:50:18 [INFO] [TRAIN] epoch: 18, iter: 21700/160000, loss: 1.0347, lr: 0.001047, batch_cost: 0.2220, reader_cost: 0.00035, ips: 36.0435 samples/sec | ETA 08:31:36 2022-08-24 12:50:28 [INFO] [TRAIN] epoch: 18, iter: 21750/160000, loss: 0.9472, lr: 0.001047, batch_cost: 0.2030, reader_cost: 0.00033, ips: 39.4100 samples/sec | ETA 07:47:43 2022-08-24 12:50:39 [INFO] [TRAIN] epoch: 18, iter: 21800/160000, loss: 0.9812, lr: 0.001046, batch_cost: 0.2196, reader_cost: 0.00034, ips: 36.4276 samples/sec | ETA 08:25:50 2022-08-24 12:50:49 [INFO] [TRAIN] epoch: 18, iter: 21850/160000, loss: 1.0163, lr: 0.001046, batch_cost: 0.2008, reader_cost: 0.00664, ips: 39.8406 samples/sec | ETA 07:42:20 2022-08-24 12:50:58 [INFO] [TRAIN] epoch: 18, iter: 21900/160000, loss: 1.1281, lr: 0.001046, batch_cost: 0.1738, reader_cost: 0.00124, ips: 46.0291 samples/sec | ETA 06:40:02 2022-08-24 12:51:07 [INFO] [TRAIN] epoch: 18, iter: 21950/160000, loss: 1.0255, lr: 0.001045, batch_cost: 0.1903, reader_cost: 0.00148, ips: 42.0402 samples/sec | ETA 07:17:50 2022-08-24 12:51:17 [INFO] [TRAIN] epoch: 18, iter: 22000/160000, loss: 0.9726, lr: 0.001045, batch_cost: 0.2078, reader_cost: 0.00045, ips: 38.4964 samples/sec | ETA 07:57:58 2022-08-24 12:51:17 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 191s - batch_cost: 0.1907 - reader cost: 6.5419e-04 2022-08-24 12:54:28 [INFO] [EVAL] #Images: 2000 mIoU: 0.2610 Acc: 0.7200 Kappa: 0.6980 Dice: 0.3730 2022-08-24 12:54:28 [INFO] [EVAL] Class IoU: [0.6236 0.7446 0.9158 0.6766 0.6383 0.6989 0.7312 0.6891 0.4375 0.6353 0.4109 0.4597 0.6436 0.2936 0.1083 0.314 0.4408 0.331 0.4885 0.3227 0.6936 0.374 0.5171 0.3976 0.2834 0.4472 0.4169 0.2599 0.3441 0.2858 0.1269 0.3974 0.1913 0.1829 0.2994 0.4075 0.3088 0.3883 0.1733 0.2115 0.0445 0.0533 0.2569 0.1779 0.2418 0.1456 0.2737 0.3356 0.6329 0.4109 0.3809 0.2742 0.1571 0.2005 0.6585 0.3326 0.8166 0.2251 0.2001 0.164 0.0789 0.1049 0.2682 0.0847 0.3488 0.5816 0.1358 0.339 0.0027 0.1988 0.2623 0.3532 0.3582 0.2241 0.2696 0.2095 0.4069 0.1306 0.1848 0.2354 0.6578 0.2256 0.2292 0.0135 0.2781 0.4392 0.0119 0.0327 0.1938 0.3998 0.3856 0.039 0.1804 0.0325 0.0583 0.0014 0.0722 0.0089 0.011 0.2172 0.0094 0.0236 0.0406 0.3081 0.0845 0.3651 0.0032 0.3454 0.0111 0.1369 0.0033 0.2014 0.0184 0.4928 0.5901 0. 0.2157 0.4469 0.0509 0.1109 0.4051 0. 0.1914 0.1269 0.1975 0.0484 0.3897 0.3273 0.0009 0.1496 0.5086 0. 0.0062 0.0518 0.0078 0.0508 0.0515 0. 0.0597 0.2931 0.1559 0. 0.2475 0.4154 0.2519 0.0007 0.1383 0.0037 0.0461 0.0135] 2022-08-24 12:54:28 [INFO] [EVAL] Class Precision: [0.7271 0.8161 0.9604 0.8069 0.7098 0.8678 0.8145 0.7689 0.5473 0.7727 0.6571 0.6611 0.745 0.4941 0.5238 0.47 0.5607 0.6585 0.6398 0.4239 0.7609 0.6552 0.7304 0.5375 0.38 0.5682 0.4644 0.6531 0.5101 0.4703 0.3574 0.5605 0.4376 0.2823 0.4464 0.5574 0.5489 0.759 0.2755 0.5528 0.2658 0.2893 0.6217 0.5265 0.4056 0.4723 0.4513 0.6073 0.719 0.5041 0.573 0.3431 0.3469 0.5117 0.7166 0.3987 0.9137 0.5473 0.6839 0.3313 0.1488 0.2635 0.3757 0.7006 0.4567 0.6918 0.2324 0.5549 0.5776 0.7084 0.3602 0.4648 0.7053 0.2829 0.3396 0.6585 0.6014 0.3922 0.6177 0.43 0.8177 0.5699 0.6447 0.0774 0.5637 0.6377 0.0907 0.4099 0.4365 0.7013 0.5702 0.0684 0.3743 0.2824 0.1554 0.0087 0.1071 1. 0.0737 0.411 0.1135 0.0305 0.4617 0.6543 0.6143 0.5259 0.0216 0.5588 0.0928 0.2231 0.1963 0.2818 0.1848 0.6986 0.5925 0. 0.9384 0.5113 0.1472 0.5281 0.6819 0. 0.8943 0.6111 0.4914 0.4064 0.7267 0.4624 0.0049 0.251 0.5833 0. 0.0405 0.4941 0.45 0.2435 0.2947 0. 0.2628 0.6088 0.3423 0. 0.6174 0.4883 0.558 0.005 0.8653 0.5617 0.2699 0.2826] 2022-08-24 12:54:28 [INFO] [EVAL] Class Recall: [0.8143 0.8947 0.9517 0.8073 0.8637 0.7822 0.8772 0.8691 0.6857 0.7813 0.523 0.6015 0.8254 0.4197 0.1201 0.4862 0.6733 0.3996 0.6738 0.575 0.887 0.4656 0.639 0.6044 0.5272 0.6775 0.803 0.3015 0.514 0.4216 0.1645 0.5772 0.2536 0.342 0.4761 0.6026 0.4139 0.4429 0.3183 0.2551 0.0507 0.0614 0.3045 0.2118 0.3746 0.1739 0.4103 0.4286 0.8409 0.6898 0.532 0.577 0.223 0.248 0.8903 0.6676 0.8849 0.2766 0.2204 0.2452 0.144 0.1483 0.4837 0.0879 0.5961 0.785 0.2461 0.4656 0.0027 0.2165 0.4911 0.5954 0.4212 0.5188 0.5667 0.2351 0.5572 0.1638 0.2087 0.3422 0.7709 0.2719 0.2624 0.0161 0.3544 0.5853 0.0135 0.0343 0.2585 0.4818 0.5437 0.0831 0.2582 0.0354 0.0853 0.0016 0.1811 0.0089 0.0127 0.3154 0.0102 0.0952 0.0426 0.368 0.0892 0.5443 0.0037 0.4749 0.0125 0.2616 0.0033 0.4137 0.0201 0.6259 0.993 0. 0.2188 0.7801 0.0722 0.1231 0.4996 0. 0.1958 0.1381 0.2482 0.0521 0.4565 0.5284 0.001 0.2703 0.7989 0. 0.0073 0.0547 0.0079 0.0603 0.0587 0. 0.0717 0.3611 0.2225 0. 0.2924 0.7356 0.3148 0.0008 0.1413 0.0038 0.0527 0.014 ] 2022-08-24 12:54:29 [INFO] [EVAL] The model with the best validation mIoU (0.2621) was saved at iter 21000. 
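Kappa in the summary line is Cohen's kappa: pixel accuracy corrected for the agreement you would get by chance given the class frequencies, which is why it tracks Acc closely but sits a bit lower (0.6980 vs 0.7200 above). A standard formulation from the confusion matrix (whether PaddleSeg computes it exactly this way is not shown in the log):

    import numpy as np

    def cohen_kappa(conf):
        # conf[i, j] = pixels with ground-truth class i predicted as class j
        conf = conf.astype(np.float64)
        n = conf.sum()
        po = np.trace(conf) / n                                     # observed agreement (= pixel Acc)
        pe = (conf.sum(axis=0) * conf.sum(axis=1)).sum() / n ** 2   # chance agreement
        return (po - pe) / (1 - pe)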
2022-08-24 12:54:39 [INFO] [TRAIN] epoch: 18, iter: 22050/160000, loss: 1.0210, lr: 0.001044, batch_cost: 0.2151, reader_cost: 0.00355, ips: 37.2005 samples/sec | ETA 08:14:26 2022-08-24 12:54:50 [INFO] [TRAIN] epoch: 18, iter: 22100/160000, loss: 1.0203, lr: 0.001044, batch_cost: 0.2056, reader_cost: 0.00099, ips: 38.9067 samples/sec | ETA 07:52:35 2022-08-24 12:55:02 [INFO] [TRAIN] epoch: 18, iter: 22150/160000, loss: 1.0358, lr: 0.001044, batch_cost: 0.2523, reader_cost: 0.00035, ips: 31.7130 samples/sec | ETA 09:39:34 2022-08-24 12:55:13 [INFO] [TRAIN] epoch: 18, iter: 22200/160000, loss: 0.9714, lr: 0.001043, batch_cost: 0.2062, reader_cost: 0.00051, ips: 38.8012 samples/sec | ETA 07:53:31 2022-08-24 12:55:23 [INFO] [TRAIN] epoch: 18, iter: 22250/160000, loss: 1.0113, lr: 0.001043, batch_cost: 0.2017, reader_cost: 0.00036, ips: 39.6682 samples/sec | ETA 07:43:00 2022-08-24 12:55:33 [INFO] [TRAIN] epoch: 18, iter: 22300/160000, loss: 1.0252, lr: 0.001043, batch_cost: 0.2057, reader_cost: 0.00064, ips: 38.8858 samples/sec | ETA 07:52:09 2022-08-24 12:55:44 [INFO] [TRAIN] epoch: 18, iter: 22350/160000, loss: 1.0709, lr: 0.001042, batch_cost: 0.2301, reader_cost: 0.00040, ips: 34.7692 samples/sec | ETA 08:47:51 2022-08-24 12:55:55 [INFO] [TRAIN] epoch: 18, iter: 22400/160000, loss: 1.0049, lr: 0.001042, batch_cost: 0.2178, reader_cost: 0.00065, ips: 36.7367 samples/sec | ETA 08:19:24 2022-08-24 12:56:05 [INFO] [TRAIN] epoch: 18, iter: 22450/160000, loss: 1.0181, lr: 0.001041, batch_cost: 0.1924, reader_cost: 0.00036, ips: 41.5716 samples/sec | ETA 07:21:10 2022-08-24 12:56:15 [INFO] [TRAIN] epoch: 18, iter: 22500/160000, loss: 1.0553, lr: 0.001041, batch_cost: 0.2028, reader_cost: 0.00765, ips: 39.4413 samples/sec | ETA 07:44:49 2022-08-24 12:56:25 [INFO] [TRAIN] epoch: 18, iter: 22550/160000, loss: 1.0536, lr: 0.001041, batch_cost: 0.2077, reader_cost: 0.00065, ips: 38.5176 samples/sec | ETA 07:55:47 2022-08-24 12:56:34 [INFO] [TRAIN] epoch: 18, iter: 22600/160000, loss: 1.0834, lr: 0.001040, batch_cost: 0.1773, reader_cost: 0.00853, ips: 45.1159 samples/sec | ETA 06:46:03 2022-08-24 12:56:44 [INFO] [TRAIN] epoch: 18, iter: 22650/160000, loss: 0.9946, lr: 0.001040, batch_cost: 0.1948, reader_cost: 0.01009, ips: 41.0728 samples/sec | ETA 07:25:52 2022-08-24 12:56:54 [INFO] [TRAIN] epoch: 18, iter: 22700/160000, loss: 1.0411, lr: 0.001040, batch_cost: 0.1990, reader_cost: 0.00312, ips: 40.1964 samples/sec | ETA 07:35:25 2022-08-24 12:57:05 [INFO] [TRAIN] epoch: 19, iter: 22750/160000, loss: 1.0480, lr: 0.001039, batch_cost: 0.2137, reader_cost: 0.03284, ips: 37.4419 samples/sec | ETA 08:08:45 2022-08-24 12:57:15 [INFO] [TRAIN] epoch: 19, iter: 22800/160000, loss: 1.0285, lr: 0.001039, batch_cost: 0.2077, reader_cost: 0.00990, ips: 38.5162 samples/sec | ETA 07:54:57 2022-08-24 12:57:24 [INFO] [TRAIN] epoch: 19, iter: 22850/160000, loss: 1.0052, lr: 0.001038, batch_cost: 0.1879, reader_cost: 0.00218, ips: 42.5648 samples/sec | ETA 07:09:37 2022-08-24 12:57:35 [INFO] [TRAIN] epoch: 19, iter: 22900/160000, loss: 0.9569, lr: 0.001038, batch_cost: 0.2016, reader_cost: 0.00057, ips: 39.6901 samples/sec | ETA 07:40:34 2022-08-24 12:57:45 [INFO] [TRAIN] epoch: 19, iter: 22950/160000, loss: 0.9563, lr: 0.001038, batch_cost: 0.2024, reader_cost: 0.00421, ips: 39.5267 samples/sec | ETA 07:42:18 2022-08-24 12:57:55 [INFO] [TRAIN] epoch: 19, iter: 23000/160000, loss: 1.1275, lr: 0.001037, batch_cost: 0.2039, reader_cost: 0.00044, ips: 39.2394 samples/sec | ETA 07:45:31 2022-08-24 12:57:55 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 187s - batch_cost: 0.1869 - reader cost: 7.5847e-04 2022-08-24 13:01:02 [INFO] [EVAL] #Images: 2000 mIoU: 0.2586 Acc: 0.7193 Kappa: 0.6980 Dice: 0.3707 2022-08-24 13:01:02 [INFO] [EVAL] Class IoU: [0.6255 0.7453 0.9147 0.6666 0.6501 0.7176 0.7353 0.6825 0.444 0.5915 0.4251 0.4882 0.6387 0.2662 0.161 0.3239 0.4232 0.3946 0.4793 0.3101 0.6893 0.438 0.4966 0.3988 0.2529 0.3929 0.4497 0.2678 0.2742 0.2552 0.1313 0.4445 0.2151 0.183 0.233 0.3986 0.2839 0.4056 0.2009 0.2379 0.0813 0.0666 0.2693 0.1862 0.2478 0.2118 0.2749 0.3488 0.5336 0.4344 0.3598 0.169 0.1882 0.1975 0.6058 0.2701 0.7806 0.1542 0.2901 0.1863 0.0261 0.1297 0.2755 0.1506 0.3518 0.5003 0.1615 0.3421 0.029 0.2749 0.2771 0.3163 0.3764 0.2505 0.345 0.2381 0.4716 0.1367 0.1232 0.1307 0.6186 0.2406 0.1598 0.0407 0.2838 0.444 0.0435 0.0356 0.2138 0.4183 0.2857 0.0181 0.2107 0.0419 0.008 0.0071 0.1092 0.0541 0.0061 0.1652 0.0049 0.0074 0.0843 0.3348 0.0001 0.3113 0.0718 0.4846 0.039 0.1537 0.0064 0.0327 0.0146 0.3999 0.5019 0. 0.2859 0.4727 0.0443 0.346 0.3547 0. 0.1547 0.0268 0.1911 0.0437 0.3941 0.3385 0. 0.0118 0.5476 0. 0.0054 0.132 0.0159 0.0468 0.0438 0.0064 0.0636 0.2804 0.2744 0.0138 0.2849 0.0454 0.1534 0. 0.165 0.0109 0.058 0.0263] 2022-08-24 13:01:02 [INFO] [EVAL] Class Precision: [0.7479 0.8369 0.9623 0.7509 0.7763 0.8192 0.8517 0.7463 0.6243 0.7259 0.5599 0.642 0.7364 0.5203 0.4233 0.4679 0.5172 0.6146 0.5734 0.4906 0.7599 0.5718 0.6531 0.556 0.4739 0.5291 0.4914 0.7507 0.6988 0.332 0.315 0.6506 0.4568 0.3564 0.3683 0.5798 0.6211 0.7089 0.3495 0.503 0.2602 0.2507 0.6008 0.4773 0.4219 0.4791 0.5223 0.5518 0.7388 0.555 0.5584 0.1849 0.3285 0.7704 0.6495 0.3093 0.8294 0.5544 0.4469 0.2702 0.2666 0.2682 0.3593 0.6516 0.4673 0.5685 0.2611 0.585 0.4818 0.7845 0.4631 0.3968 0.6174 0.4386 0.5109 0.3838 0.6317 0.5127 0.1727 0.213 0.7359 0.5444 0.6838 0.1435 0.4817 0.6905 0.4293 0.4192 0.6313 0.7121 0.3635 0.0564 0.3376 0.2375 0.0425 0.0362 0.4374 0.5955 0.1186 0.4191 0.3133 0.0111 0.4779 0.3993 0.0109 0.485 0.1019 0.694 0.2046 0.2709 0.2647 0.0343 0.226 0.6454 0.5141 0. 0.7594 0.6173 0.125 0.711 0.7076 0. 0.8564 0.9289 0.5352 0.5596 0.7194 0.4679 0.0067 0.8192 0.6575 0. 0.1341 0.3605 0.4569 0.3315 0.4067 0.2311 0.381 0.6117 0.7432 0.0223 0.7068 0.3413 0.6526 0. 0.882 0.4415 0.3843 0.7652] 2022-08-24 13:01:02 [INFO] [EVAL] Class Recall: [0.7927 0.8719 0.9487 0.8559 0.8 0.8527 0.8432 0.8886 0.6059 0.7617 0.6383 0.6709 0.828 0.3529 0.2062 0.5128 0.6996 0.5243 0.745 0.4574 0.8812 0.6517 0.6745 0.5852 0.3517 0.6042 0.8414 0.2939 0.311 0.5244 0.1838 0.5839 0.2891 0.2733 0.3882 0.5606 0.3434 0.4867 0.3209 0.3111 0.1058 0.0832 0.3281 0.2339 0.3752 0.2752 0.3672 0.4867 0.6577 0.6664 0.5029 0.6628 0.3059 0.2099 0.9001 0.6805 0.9299 0.176 0.4527 0.3749 0.0281 0.2008 0.5415 0.1637 0.5874 0.8065 0.2975 0.4517 0.03 0.2974 0.4082 0.6093 0.4909 0.3687 0.5151 0.3854 0.6505 0.1571 0.3003 0.2529 0.7951 0.3012 0.1725 0.0537 0.4086 0.5544 0.0462 0.0374 0.2443 0.5034 0.5715 0.0259 0.3592 0.0484 0.0098 0.0087 0.1271 0.0561 0.0064 0.2142 0.005 0.0216 0.0928 0.6746 0.0001 0.4651 0.1958 0.6162 0.0459 0.2621 0.0065 0.4161 0.0153 0.5125 0.9547 0. 0.3144 0.6685 0.0643 0.4026 0.4157 0. 0.1588 0.0269 0.2291 0.0452 0.4657 0.5504 0. 0.0118 0.7661 0. 0.0056 0.1723 0.0162 0.0517 0.0468 0.0066 0.0709 0.3411 0.3032 0.035 0.323 0.0497 0.1671 0. 0.1687 0.0111 0.0639 0.0265] 2022-08-24 13:01:02 [INFO] [EVAL] The model with the best validation mIoU (0.2621) was saved at iter 21000. 
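The three per-class arrays are most informative when read together: a class with reasonable precision but near-zero recall is one the model almost never predicts (several classes above have recall below 0.01), while low precision combined with high recall means a class is being over-predicted onto other things. A small sketch for flagging such classes, assuming class_iou, class_precision and class_recall are the three 150-element arrays printed above:

    import numpy as np

    def flag_problem_classes(class_iou, class_precision, class_recall, thresh=0.1):
        iou, prec, rec = (np.asarray(a) for a in (class_iou, class_precision, class_recall))
        rarely_predicted = np.flatnonzero((rec < thresh) & (prec > 0.5))   # missed classes
        over_predicted = np.flatnonzero((prec < thresh) & (rec > 0.5))     # spurious classes
        never_right = np.flatnonzero(iou == 0)                             # completely failed classes
        return rarely_predicted, over_predicted, never_right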
2022-08-24 13:01:13 [INFO] [TRAIN] epoch: 19, iter: 23050/160000, loss: 0.9920, lr: 0.001037, batch_cost: 0.2060, reader_cost: 0.00366, ips: 38.8283 samples/sec | ETA 07:50:16 2022-08-24 13:01:24 [INFO] [TRAIN] epoch: 19, iter: 23100/160000, loss: 1.0766, lr: 0.001036, batch_cost: 0.2308, reader_cost: 0.02826, ips: 34.6660 samples/sec | ETA 08:46:32 2022-08-24 13:01:37 [INFO] [TRAIN] epoch: 19, iter: 23150/160000, loss: 0.9964, lr: 0.001036, batch_cost: 0.2502, reader_cost: 0.00047, ips: 31.9709 samples/sec | ETA 09:30:43 2022-08-24 13:01:49 [INFO] [TRAIN] epoch: 19, iter: 23200/160000, loss: 1.0958, lr: 0.001036, batch_cost: 0.2437, reader_cost: 0.01682, ips: 32.8240 samples/sec | ETA 09:15:41 2022-08-24 13:02:00 [INFO] [TRAIN] epoch: 19, iter: 23250/160000, loss: 0.9485, lr: 0.001035, batch_cost: 0.2148, reader_cost: 0.00266, ips: 37.2508 samples/sec | ETA 08:09:28 2022-08-24 13:02:10 [INFO] [TRAIN] epoch: 19, iter: 23300/160000, loss: 1.0143, lr: 0.001035, batch_cost: 0.2039, reader_cost: 0.00127, ips: 39.2353 samples/sec | ETA 07:44:32 2022-08-24 13:02:21 [INFO] [TRAIN] epoch: 19, iter: 23350/160000, loss: 0.9509, lr: 0.001035, batch_cost: 0.2167, reader_cost: 0.00074, ips: 36.9178 samples/sec | ETA 08:13:31 2022-08-24 13:02:32 [INFO] [TRAIN] epoch: 19, iter: 23400/160000, loss: 1.0190, lr: 0.001034, batch_cost: 0.2247, reader_cost: 0.00098, ips: 35.6029 samples/sec | ETA 08:31:34 2022-08-24 13:02:41 [INFO] [TRAIN] epoch: 19, iter: 23450/160000, loss: 1.0672, lr: 0.001034, batch_cost: 0.1845, reader_cost: 0.00103, ips: 43.3648 samples/sec | ETA 06:59:50 2022-08-24 13:02:51 [INFO] [TRAIN] epoch: 19, iter: 23500/160000, loss: 0.9744, lr: 0.001033, batch_cost: 0.2053, reader_cost: 0.00049, ips: 38.9751 samples/sec | ETA 07:46:57 2022-08-24 13:03:01 [INFO] [TRAIN] epoch: 19, iter: 23550/160000, loss: 1.0997, lr: 0.001033, batch_cost: 0.1905, reader_cost: 0.00033, ips: 41.9966 samples/sec | ETA 07:13:12 2022-08-24 13:03:11 [INFO] [TRAIN] epoch: 19, iter: 23600/160000, loss: 1.0180, lr: 0.001033, batch_cost: 0.1972, reader_cost: 0.00322, ips: 40.5766 samples/sec | ETA 07:28:12 2022-08-24 13:03:20 [INFO] [TRAIN] epoch: 19, iter: 23650/160000, loss: 1.0373, lr: 0.001032, batch_cost: 0.1765, reader_cost: 0.00697, ips: 45.3303 samples/sec | ETA 06:41:03 2022-08-24 13:03:31 [INFO] [TRAIN] epoch: 19, iter: 23700/160000, loss: 1.0428, lr: 0.001032, batch_cost: 0.2281, reader_cost: 0.00872, ips: 35.0685 samples/sec | ETA 08:38:13 2022-08-24 13:03:41 [INFO] [TRAIN] epoch: 19, iter: 23750/160000, loss: 0.9522, lr: 0.001032, batch_cost: 0.2108, reader_cost: 0.00052, ips: 37.9457 samples/sec | ETA 07:58:45 2022-08-24 13:03:52 [INFO] [TRAIN] epoch: 19, iter: 23800/160000, loss: 1.0062, lr: 0.001031, batch_cost: 0.2202, reader_cost: 0.00071, ips: 36.3324 samples/sec | ETA 08:19:49 2022-08-24 13:04:04 [INFO] [TRAIN] epoch: 19, iter: 23850/160000, loss: 1.0093, lr: 0.001031, batch_cost: 0.2214, reader_cost: 0.00065, ips: 36.1378 samples/sec | ETA 08:22:20 2022-08-24 13:04:13 [INFO] [TRAIN] epoch: 19, iter: 23900/160000, loss: 1.0130, lr: 0.001030, batch_cost: 0.1942, reader_cost: 0.00035, ips: 41.1870 samples/sec | ETA 07:20:35 2022-08-24 13:04:24 [INFO] [TRAIN] epoch: 19, iter: 23950/160000, loss: 1.0153, lr: 0.001030, batch_cost: 0.2115, reader_cost: 0.00036, ips: 37.8327 samples/sec | ETA 07:59:28 2022-08-24 13:04:36 [INFO] [TRAIN] epoch: 20, iter: 24000/160000, loss: 0.9328, lr: 0.001030, batch_cost: 0.2345, reader_cost: 0.03242, ips: 34.1119 samples/sec | ETA 08:51:35 2022-08-24 13:04:36 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 178s - batch_cost: 0.1775 - reader cost: 7.4510e-04 2022-08-24 13:07:33 [INFO] [EVAL] #Images: 2000 mIoU: 0.2693 Acc: 0.7248 Kappa: 0.7029 Dice: 0.3820 2022-08-24 13:07:33 [INFO] [EVAL] Class IoU: [0.6235 0.7454 0.9188 0.6698 0.6423 0.7191 0.7268 0.693 0.4455 0.6247 0.4223 0.4834 0.6353 0.2686 0.1419 0.334 0.4363 0.3237 0.4992 0.312 0.7006 0.4267 0.5227 0.4069 0.2337 0.4014 0.4255 0.2996 0.349 0.2936 0.1233 0.3894 0.092 0.217 0.2717 0.3995 0.291 0.4636 0.2135 0.2768 0.1195 0.0744 0.2473 0.1774 0.2664 0.195 0.2576 0.3699 0.5895 0.4375 0.3882 0.3456 0.1344 0.1058 0.659 0.439 0.8038 0.1769 0.3183 0.1728 0.0356 0.1435 0.2837 0.1321 0.3757 0.5703 0.1405 0.3616 0.007 0.2318 0.2662 0.3657 0.3553 0.1869 0.3477 0.2279 0.4549 0.1506 0.1261 0.0903 0.5107 0.2205 0.1521 0.0142 0.3279 0.4317 0.0334 0.0388 0.1525 0.3992 0.3453 0.0068 0.1951 0.0466 0. 0.001 0.0951 0.0636 0.0042 0.2738 0.0391 0.0292 0.0496 0.5123 0.0058 0.3752 0.0464 0.4676 0.0606 0.2257 0.0211 0.1243 0.0053 0.5131 0.733 0. 0.3831 0.5337 0.0161 0.3976 0.4566 0. 0.1974 0.0819 0.2212 0.0386 0.3405 0.2788 0.0149 0.1681 0.6072 0. 0.0093 0.1342 0.0194 0.0764 0.0438 0.0056 0.0694 0.1796 0.185 0.0031 0.2042 0.1493 0.2728 0. 0.1334 0.0113 0.0251 0.0191] 2022-08-24 13:07:33 [INFO] [EVAL] Class Precision: [0.7212 0.8057 0.9652 0.7669 0.7377 0.8361 0.8604 0.7809 0.5703 0.7062 0.686 0.5994 0.7382 0.5363 0.4242 0.4975 0.5447 0.6677 0.6897 0.5087 0.7775 0.6126 0.7273 0.5421 0.5062 0.572 0.5031 0.6922 0.6305 0.4328 0.3248 0.5117 0.5237 0.3977 0.4932 0.5194 0.5915 0.6079 0.4464 0.4042 0.2442 0.1833 0.5019 0.5266 0.3914 0.434 0.4031 0.6303 0.6164 0.5919 0.53 0.4465 0.3501 0.9302 0.7034 0.5997 0.8535 0.5631 0.5845 0.2923 0.2039 0.3564 0.4234 0.6192 0.5023 0.6558 0.2169 0.5624 0.5424 0.7575 0.5743 0.4407 0.6639 0.2303 0.5412 0.3644 0.6249 0.5892 0.302 0.389 0.8638 0.5546 0.6872 0.0545 0.5782 0.6901 0.1189 0.3525 0.8298 0.5487 0.4525 0.0406 0.3658 0.2648 0.0014 0.0093 0.2473 0.6086 0.1598 0.5085 0.3908 0.0523 0.501 0.8542 0.0448 0.4838 0.3355 0.7407 0.2074 0.3426 0.232 0.1537 0.2094 0.6959 0.7468 0. 0.7029 0.6466 0.1336 0.5714 0.6615 0. 0.8103 0.7705 0.45 0.6481 0.6035 0.382 0.1052 0.3685 0.8114 0. 0.2278 0.4302 0.5075 0.2664 0.6312 0.2044 0.3657 0.7705 0.2804 0.0048 0.6533 0.2513 0.4566 0. 0.9727 0.4415 0.3519 0.7052] 2022-08-24 13:07:33 [INFO] [EVAL] Class Recall: [0.8216 0.9087 0.9502 0.8411 0.8324 0.8371 0.824 0.8602 0.6706 0.8442 0.5236 0.714 0.8201 0.3498 0.1758 0.504 0.6867 0.3859 0.6438 0.4466 0.8763 0.5843 0.6502 0.62 0.3027 0.5738 0.7338 0.3456 0.4387 0.4773 0.1657 0.6198 0.1004 0.3232 0.377 0.6339 0.3642 0.6614 0.2903 0.4675 0.1897 0.1113 0.3278 0.211 0.4547 0.2614 0.4164 0.4723 0.9311 0.6265 0.5919 0.6047 0.1791 0.1067 0.9125 0.621 0.9324 0.205 0.4114 0.2969 0.0414 0.1938 0.4623 0.1438 0.5985 0.8138 0.2851 0.5032 0.0071 0.2504 0.3316 0.6822 0.4332 0.4975 0.493 0.3782 0.6259 0.1683 0.1779 0.1053 0.5555 0.2679 0.1635 0.0188 0.431 0.5354 0.0444 0.0418 0.1574 0.5943 0.593 0.008 0.2947 0.0535 0. 0.0012 0.1338 0.0663 0.0043 0.3723 0.0416 0.062 0.0522 0.5614 0.0067 0.6257 0.0511 0.5591 0.0789 0.3983 0.0226 0.3944 0.0055 0.6613 0.9754 0. 0.4571 0.7536 0.018 0.5666 0.5957 0. 0.207 0.0839 0.3032 0.0394 0.4387 0.5078 0.017 0.2361 0.7071 0. 0.0096 0.1632 0.0197 0.0967 0.045 0.0057 0.0788 0.1898 0.3521 0.0087 0.229 0.2688 0.4039 0. 0.1339 0.0114 0.0263 0.0193] 2022-08-24 13:07:34 [INFO] [EVAL] The model with the best validation mIoU (0.2693) was saved at iter 24000. 
2022-08-24 13:07:44 [INFO] [TRAIN] epoch: 20, iter: 24050/160000, loss: 0.9529, lr: 0.001029, batch_cost: 0.2064, reader_cost: 0.00376, ips: 38.7645 samples/sec | ETA 07:47:36 2022-08-24 13:07:55 [INFO] [TRAIN] epoch: 20, iter: 24100/160000, loss: 1.0324, lr: 0.001029, batch_cost: 0.2306, reader_cost: 0.00087, ips: 34.6966 samples/sec | ETA 08:42:14 2022-08-24 13:08:09 [INFO] [TRAIN] epoch: 20, iter: 24150/160000, loss: 1.0292, lr: 0.001029, batch_cost: 0.2774, reader_cost: 0.02065, ips: 28.8399 samples/sec | ETA 10:28:03 2022-08-24 13:08:23 [INFO] [TRAIN] epoch: 20, iter: 24200/160000, loss: 0.9952, lr: 0.001028, batch_cost: 0.2713, reader_cost: 0.00330, ips: 29.4876 samples/sec | ETA 10:14:02 2022-08-24 13:08:36 [INFO] [TRAIN] epoch: 20, iter: 24250/160000, loss: 0.9882, lr: 0.001028, batch_cost: 0.2596, reader_cost: 0.00559, ips: 30.8111 samples/sec | ETA 09:47:27 2022-08-24 13:08:48 [INFO] [TRAIN] epoch: 20, iter: 24300/160000, loss: 1.0098, lr: 0.001027, batch_cost: 0.2404, reader_cost: 0.00532, ips: 33.2737 samples/sec | ETA 09:03:46 2022-08-24 13:08:59 [INFO] [TRAIN] epoch: 20, iter: 24350/160000, loss: 0.9890, lr: 0.001027, batch_cost: 0.2293, reader_cost: 0.00501, ips: 34.8833 samples/sec | ETA 08:38:29 2022-08-24 13:09:09 [INFO] [TRAIN] epoch: 20, iter: 24400/160000, loss: 0.9913, lr: 0.001027, batch_cost: 0.1974, reader_cost: 0.00134, ips: 40.5282 samples/sec | ETA 07:26:06 2022-08-24 13:09:20 [INFO] [TRAIN] epoch: 20, iter: 24450/160000, loss: 0.9457, lr: 0.001026, batch_cost: 0.2247, reader_cost: 0.01090, ips: 35.6096 samples/sec | ETA 08:27:32 2022-08-24 13:09:30 [INFO] [TRAIN] epoch: 20, iter: 24500/160000, loss: 0.9176, lr: 0.001026, batch_cost: 0.2014, reader_cost: 0.00037, ips: 39.7137 samples/sec | ETA 07:34:55 2022-08-24 13:09:40 [INFO] [TRAIN] epoch: 20, iter: 24550/160000, loss: 1.0111, lr: 0.001025, batch_cost: 0.1947, reader_cost: 0.00616, ips: 41.0840 samples/sec | ETA 07:19:35 2022-08-24 13:09:50 [INFO] [TRAIN] epoch: 20, iter: 24600/160000, loss: 1.0345, lr: 0.001025, batch_cost: 0.1922, reader_cost: 0.00059, ips: 41.6222 samples/sec | ETA 07:13:44 2022-08-24 13:09:59 [INFO] [TRAIN] epoch: 20, iter: 24650/160000, loss: 1.0211, lr: 0.001025, batch_cost: 0.1916, reader_cost: 0.01135, ips: 41.7582 samples/sec | ETA 07:12:10 2022-08-24 13:10:09 [INFO] [TRAIN] epoch: 20, iter: 24700/160000, loss: 1.1025, lr: 0.001024, batch_cost: 0.1919, reader_cost: 0.00044, ips: 41.6971 samples/sec | ETA 07:12:38 2022-08-24 13:10:18 [INFO] [TRAIN] epoch: 20, iter: 24750/160000, loss: 1.0475, lr: 0.001024, batch_cost: 0.1895, reader_cost: 0.00034, ips: 42.2165 samples/sec | ETA 07:07:09 2022-08-24 13:10:30 [INFO] [TRAIN] epoch: 20, iter: 24800/160000, loss: 0.9889, lr: 0.001024, batch_cost: 0.2336, reader_cost: 0.00035, ips: 34.2444 samples/sec | ETA 08:46:24 2022-08-24 13:10:40 [INFO] [TRAIN] epoch: 20, iter: 24850/160000, loss: 0.9852, lr: 0.001023, batch_cost: 0.2005, reader_cost: 0.00533, ips: 39.9096 samples/sec | ETA 07:31:31 2022-08-24 13:10:52 [INFO] [TRAIN] epoch: 20, iter: 24900/160000, loss: 0.9697, lr: 0.001023, batch_cost: 0.2277, reader_cost: 0.00042, ips: 35.1273 samples/sec | ETA 08:32:48 2022-08-24 13:11:03 [INFO] [TRAIN] epoch: 20, iter: 24950/160000, loss: 1.0521, lr: 0.001022, batch_cost: 0.2198, reader_cost: 0.00151, ips: 36.3983 samples/sec | ETA 08:14:42 2022-08-24 13:11:13 [INFO] [TRAIN] epoch: 20, iter: 25000/160000, loss: 0.9416, lr: 0.001022, batch_cost: 0.2154, reader_cost: 0.00076, ips: 37.1448 samples/sec | ETA 08:04:35 2022-08-24 13:11:13 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 174s - batch_cost: 0.1744 - reader cost: 8.3241e-04 2022-08-24 13:14:08 [INFO] [EVAL] #Images: 2000 mIoU: 0.2633 Acc: 0.7254 Kappa: 0.7036 Dice: 0.3767 2022-08-24 13:14:08 [INFO] [EVAL] Class IoU: [0.622 0.7538 0.9181 0.676 0.6368 0.7168 0.7399 0.698 0.452 0.6659 0.4176 0.4475 0.6337 0.2748 0.1382 0.3283 0.4212 0.3463 0.5068 0.3155 0.6827 0.4551 0.491 0.4174 0.2794 0.4619 0.4538 0.2836 0.3433 0.3004 0.0796 0.4301 0.2314 0.2172 0.2532 0.3538 0.2748 0.4028 0.1856 0.1887 0.041 0.0689 0.2575 0.1982 0.2387 0.174 0.3036 0.3289 0.5002 0.4162 0.385 0.2619 0.1579 0.1962 0.6948 0.4134 0.7792 0.2622 0.2779 0.174 0.0363 0.1419 0.2687 0.179 0.3576 0.5248 0.153 0.3155 0.0088 0.2714 0.2616 0.3093 0.3522 0.2071 0.3663 0.2206 0.5053 0.1579 0.1288 0.1433 0.4508 0.2405 0.1824 0.0066 0.329 0.4092 0.0412 0.0389 0.1944 0.4482 0.2937 0.0194 0.1865 0.0697 0. 0.0028 0.0894 0.0162 0.0367 0.185 0.0501 0.0216 0.0521 0.3679 0.0134 0.4108 0.0275 0.4027 0.0581 0.1518 0.0225 0.1592 0.0048 0.4978 0.6351 0. 0.268 0.4715 0.0688 0.1563 0.3736 0. 0.2511 0.1033 0.2154 0.0678 0.3551 0.3577 0. 0.1391 0.5521 0. 0.0026 0.1777 0.0478 0.0664 0.0756 0.0008 0.0639 0.2702 0.0513 0.0535 0.2077 0.0247 0.1908 0. 0.174 0.0101 0.0451 0.0009] 2022-08-24 13:14:08 [INFO] [EVAL] Class Precision: [0.7148 0.8377 0.9573 0.7856 0.7131 0.8492 0.813 0.7834 0.605 0.7546 0.6646 0.7033 0.7118 0.5888 0.4179 0.4808 0.503 0.6489 0.6573 0.5172 0.7497 0.615 0.6369 0.5312 0.4575 0.579 0.5551 0.7021 0.6429 0.5333 0.379 0.5872 0.4648 0.3098 0.3962 0.5685 0.6286 0.664 0.5034 0.5557 0.193 0.2225 0.5897 0.49 0.3029 0.4449 0.5863 0.4605 0.7222 0.5312 0.5137 0.3309 0.431 0.7744 0.7545 0.5172 0.8214 0.4954 0.6153 0.3421 0.3861 0.2519 0.3367 0.6108 0.4719 0.6193 0.234 0.493 0.5176 0.6361 0.4897 0.3869 0.6732 0.2749 0.5573 0.3674 0.6645 0.5518 0.2092 0.2635 0.7557 0.5559 0.6951 0.1655 0.5966 0.5287 0.3652 0.3761 0.3724 0.6947 0.3601 0.133 0.3641 0.2311 0.0002 0.0209 0.351 0.7735 0.1104 0.4056 0.5814 0.031 0.5219 0.48 0.4108 0.4987 0.0984 0.6629 0.2255 0.2258 0.2382 0.2061 0.1632 0.7199 0.6453 0. 0.8615 0.5615 0.1243 0.6127 0.751 0. 0.8428 0.6401 0.512 0.5717 0.7222 0.5701 0. 0.352 0.6464 0. 0.0608 0.4435 0.5411 0.2639 0.4023 0.1458 0.317 0.7296 0.3876 0.0667 0.5292 0.1083 0.596 0. 0.8281 0.5327 0.2888 1. ] 2022-08-24 13:14:08 [INFO] [EVAL] Class Recall: [0.8273 0.8827 0.9573 0.8289 0.8561 0.8213 0.8916 0.865 0.6412 0.85 0.5291 0.5516 0.8525 0.3401 0.1711 0.5086 0.7215 0.4261 0.6888 0.4473 0.8842 0.6365 0.6819 0.661 0.418 0.6954 0.7132 0.3224 0.4241 0.4075 0.0916 0.6165 0.3154 0.4206 0.4123 0.4838 0.3281 0.5059 0.2273 0.2222 0.0494 0.0907 0.3138 0.2498 0.5294 0.2223 0.3864 0.5352 0.6194 0.6577 0.6056 0.5568 0.1995 0.2081 0.8977 0.673 0.9381 0.3578 0.3363 0.2614 0.0385 0.2454 0.5708 0.2021 0.596 0.7747 0.3065 0.4669 0.0089 0.3213 0.3596 0.6064 0.4248 0.4565 0.5167 0.3557 0.6783 0.1811 0.2511 0.2391 0.5277 0.2977 0.1983 0.0068 0.4232 0.6441 0.0444 0.0416 0.289 0.5581 0.6142 0.0223 0.2766 0.0907 0. 0.0032 0.1071 0.0163 0.0521 0.2539 0.052 0.0669 0.0547 0.6118 0.0137 0.6999 0.0367 0.5064 0.0726 0.3166 0.0243 0.4115 0.0049 0.6174 0.9758 0. 0.2801 0.7464 0.1336 0.1734 0.4264 0. 0.2635 0.1097 0.2711 0.0715 0.4113 0.4898 0. 0.1871 0.7909 0. 0.0027 0.2287 0.0498 0.0815 0.0851 0.0008 0.0741 0.3002 0.0558 0.2125 0.2548 0.031 0.2192 0. 0.1805 0.0102 0.0507 0.0009] 2022-08-24 13:14:08 [INFO] [EVAL] The model with the best validation mIoU (0.2693) was saved at iter 24000. 
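The mIoU at iter 25000 (0.2633) is below the running best, so the closing line still points at iter 24000. A minimal sketch of that keep-best bookkeeping, with a caller-supplied `save_fn` standing in for whatever actually persists the weights (hypothetical, not PaddleSeg's own checkpointing code):

```python
# Keep-best bookkeeping, as reflected in the "best validation mIoU ... was saved at iter N"
# lines. save_fn is a hypothetical stand-in for the code that writes the checkpoint.
def make_best_tracker(save_fn):
    state = {"miou": -1.0, "iter": -1}

    def update(miou, it):
        if miou > state["miou"]:
            state["miou"], state["iter"] = miou, it
            save_fn(it)  # persist the new best model
        print(f"The model with the best validation mIoU ({state['miou']:.4f}) "
              f"was saved at iter {state['iter']}.")

    return update

track = make_best_tracker(save_fn=lambda it: None)  # no-op saver for illustration
track(0.2693, 24000)   # improves -> best moves to iter 24000
track(0.2633, 25000)   # worse   -> message still reports iter 24000
```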
2022-08-24 13:14:18 [INFO] [TRAIN] epoch: 20, iter: 25050/160000, loss: 0.9547, lr: 0.001022, batch_cost: 0.2061, reader_cost: 0.00940, ips: 38.8203 samples/sec | ETA 07:43:30 2022-08-24 13:14:29 [INFO] [TRAIN] epoch: 20, iter: 25100/160000, loss: 1.0593, lr: 0.001021, batch_cost: 0.2119, reader_cost: 0.00533, ips: 37.7465 samples/sec | ETA 07:56:30 2022-08-24 13:14:41 [INFO] [TRAIN] epoch: 20, iter: 25150/160000, loss: 1.0377, lr: 0.001021, batch_cost: 0.2477, reader_cost: 0.01075, ips: 32.3013 samples/sec | ETA 09:16:38 2022-08-24 13:14:54 [INFO] [TRAIN] epoch: 20, iter: 25200/160000, loss: 0.9770, lr: 0.001021, batch_cost: 0.2586, reader_cost: 0.01695, ips: 30.9320 samples/sec | ETA 09:41:03 2022-08-24 13:15:06 [INFO] [TRAIN] epoch: 20, iter: 25250/160000, loss: 1.0151, lr: 0.001020, batch_cost: 0.2316, reader_cost: 0.00765, ips: 34.5406 samples/sec | ETA 08:40:09 2022-08-24 13:15:21 [INFO] [TRAIN] epoch: 21, iter: 25300/160000, loss: 1.0145, lr: 0.001020, batch_cost: 0.2965, reader_cost: 0.03568, ips: 26.9839 samples/sec | ETA 11:05:34 2022-08-24 13:15:34 [INFO] [TRAIN] epoch: 21, iter: 25350/160000, loss: 1.0141, lr: 0.001019, batch_cost: 0.2557, reader_cost: 0.00134, ips: 31.2848 samples/sec | ETA 09:33:52 2022-08-24 13:15:46 [INFO] [TRAIN] epoch: 21, iter: 25400/160000, loss: 0.9635, lr: 0.001019, batch_cost: 0.2552, reader_cost: 0.01488, ips: 31.3443 samples/sec | ETA 09:32:33 2022-08-24 13:15:57 [INFO] [TRAIN] epoch: 21, iter: 25450/160000, loss: 1.0274, lr: 0.001019, batch_cost: 0.2122, reader_cost: 0.00075, ips: 37.6947 samples/sec | ETA 07:55:55 2022-08-24 13:16:07 [INFO] [TRAIN] epoch: 21, iter: 25500/160000, loss: 0.9893, lr: 0.001018, batch_cost: 0.2067, reader_cost: 0.00287, ips: 38.6964 samples/sec | ETA 07:43:26 2022-08-24 13:16:18 [INFO] [TRAIN] epoch: 21, iter: 25550/160000, loss: 1.0007, lr: 0.001018, batch_cost: 0.2212, reader_cost: 0.00050, ips: 36.1740 samples/sec | ETA 08:15:34 2022-08-24 13:16:28 [INFO] [TRAIN] epoch: 21, iter: 25600/160000, loss: 1.0148, lr: 0.001018, batch_cost: 0.2003, reader_cost: 0.00038, ips: 39.9318 samples/sec | ETA 07:28:45 2022-08-24 13:16:39 [INFO] [TRAIN] epoch: 21, iter: 25650/160000, loss: 1.0134, lr: 0.001017, batch_cost: 0.2183, reader_cost: 0.00135, ips: 36.6506 samples/sec | ETA 08:08:45 2022-08-24 13:16:50 [INFO] [TRAIN] epoch: 21, iter: 25700/160000, loss: 1.0065, lr: 0.001017, batch_cost: 0.2192, reader_cost: 0.00038, ips: 36.4984 samples/sec | ETA 08:10:36 2022-08-24 13:17:00 [INFO] [TRAIN] epoch: 21, iter: 25750/160000, loss: 0.9764, lr: 0.001016, batch_cost: 0.1927, reader_cost: 0.00209, ips: 41.5167 samples/sec | ETA 07:11:09 2022-08-24 13:17:11 [INFO] [TRAIN] epoch: 21, iter: 25800/160000, loss: 1.0323, lr: 0.001016, batch_cost: 0.2138, reader_cost: 0.00128, ips: 37.4191 samples/sec | ETA 07:58:11 2022-08-24 13:17:20 [INFO] [TRAIN] epoch: 21, iter: 25850/160000, loss: 1.0563, lr: 0.001016, batch_cost: 0.1973, reader_cost: 0.00192, ips: 40.5408 samples/sec | ETA 07:21:12 2022-08-24 13:17:30 [INFO] [TRAIN] epoch: 21, iter: 25900/160000, loss: 1.0334, lr: 0.001015, batch_cost: 0.1856, reader_cost: 0.00074, ips: 43.1109 samples/sec | ETA 06:54:44 2022-08-24 13:17:41 [INFO] [TRAIN] epoch: 21, iter: 25950/160000, loss: 0.9887, lr: 0.001015, batch_cost: 0.2195, reader_cost: 0.00081, ips: 36.4491 samples/sec | ETA 08:10:21 2022-08-24 13:17:50 [INFO] [TRAIN] epoch: 21, iter: 26000/160000, loss: 0.9631, lr: 0.001015, batch_cost: 0.1886, reader_cost: 0.00676, ips: 42.4251 samples/sec | ETA 07:01:08 2022-08-24 13:17:50 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 192s - batch_cost: 0.1923 - reader cost: 7.4303e-04 2022-08-24 13:21:03 [INFO] [EVAL] #Images: 2000 mIoU: 0.2659 Acc: 0.7216 Kappa: 0.6995 Dice: 0.3796 2022-08-24 13:21:03 [INFO] [EVAL] Class IoU: [0.6226 0.7501 0.9174 0.6691 0.6334 0.701 0.7282 0.681 0.4476 0.6025 0.4252 0.4843 0.6365 0.2337 0.149 0.3329 0.425 0.3413 0.4957 0.3264 0.7075 0.4315 0.4816 0.4026 0.2778 0.4124 0.4428 0.3072 0.3189 0.2404 0.076 0.4098 0.2183 0.2157 0.3519 0.3545 0.3044 0.3879 0.2067 0.2274 0.0698 0.0464 0.2475 0.1848 0.2873 0.1379 0.2908 0.3309 0.5792 0.4343 0.3635 0.2965 0.1359 0.1816 0.6781 0.3731 0.7367 0.1871 0.2332 0.1622 0.0672 0.1294 0.2524 0.147 0.354 0.487 0.1572 0.3398 0.0305 0.2799 0.2974 0.344 0.3394 0.2457 0.3614 0.2565 0.4137 0.1775 0.1709 0.1481 0.626 0.2266 0.201 0.0145 0.3198 0.4516 0.0274 0.0347 0.1944 0.4083 0.3232 0. 0.1953 0.0372 0.0272 0.0094 0.1127 0.0669 0.0109 0.1761 0.0173 0.0065 0.0379 0.514 0.0797 0.4728 0.0489 0.3654 0.0614 0.1691 0.0429 0.1506 0.0142 0.5372 0.7301 0. 0.3836 0.4207 0.0515 0.1801 0.3608 0. 0.2404 0.1234 0.1893 0.042 0.352 0.3139 0.0049 0.0349 0.5652 0. 0.0076 0.0666 0.0242 0.0687 0.0707 0.0041 0.063 0.2891 0.2641 0.0194 0.2532 0.0268 0.184 0. 0.1676 0.0097 0.041 0.0175] 2022-08-24 13:21:03 [INFO] [EVAL] Class Precision: [0.7183 0.8234 0.9641 0.7639 0.7222 0.8492 0.832 0.7334 0.6177 0.8031 0.6492 0.6396 0.7177 0.575 0.3907 0.4984 0.5224 0.6312 0.6257 0.5032 0.7823 0.6661 0.6272 0.5117 0.4578 0.6044 0.5269 0.6985 0.6321 0.314 0.3454 0.5541 0.5379 0.3477 0.4645 0.5458 0.5606 0.7472 0.4351 0.575 0.1886 0.2152 0.6545 0.5485 0.446 0.434 0.5311 0.7088 0.73 0.5552 0.4911 0.3776 0.3011 0.5507 0.7276 0.4576 0.7667 0.5013 0.6511 0.2744 0.1413 0.2276 0.3235 0.568 0.5185 0.5343 0.3113 0.4648 0.3221 0.6685 0.4568 0.4722 0.6653 0.4172 0.5217 0.5422 0.4628 0.38 0.3519 0.4477 0.8595 0.5579 0.7151 0.0623 0.378 0.6878 0.2815 0.3632 0.4403 0.583 0.4349 0. 0.3234 0.2911 0.0786 0.0254 0.2135 0.6317 0.0336 0.5815 0.4404 0.01 0.6628 0.5821 0.4165 0.665 0.1115 0.6947 0.1892 0.2569 0.1871 0.1912 0.3924 0.6886 0.7668 0. 0.6174 0.4628 0.1354 0.5654 0.6355 0. 0.9478 0.5729 0.5506 0.6456 0.6832 0.4311 0.3076 0.3628 0.6971 0. 0.0508 0.44 0.5646 0.314 0.473 0.0565 0.2907 0.6661 0.4365 0.0266 0.7525 0.17 0.5231 0. 0.8393 0.4847 0.4409 0.6316] 2022-08-24 13:21:03 [INFO] [EVAL] Class Recall: [0.8237 0.8939 0.9498 0.8435 0.8374 0.8007 0.8538 0.9049 0.6191 0.7069 0.552 0.6659 0.8491 0.2825 0.1941 0.5006 0.6952 0.4263 0.7046 0.4817 0.8809 0.5506 0.6748 0.6537 0.414 0.5649 0.735 0.3542 0.3915 0.5061 0.0889 0.6115 0.2687 0.3623 0.5922 0.5029 0.3998 0.4464 0.2826 0.2733 0.0997 0.0558 0.2847 0.2179 0.4466 0.1681 0.3912 0.383 0.737 0.666 0.5832 0.5799 0.1986 0.2132 0.909 0.669 0.9496 0.2299 0.2665 0.2839 0.1136 0.2306 0.5346 0.1655 0.5275 0.8462 0.241 0.5583 0.0326 0.3251 0.46 0.5589 0.4093 0.3741 0.5404 0.3275 0.7958 0.2499 0.2494 0.1812 0.6974 0.2762 0.2185 0.0185 0.675 0.5681 0.0295 0.0369 0.2582 0.5767 0.5573 0. 0.3304 0.041 0.0399 0.0148 0.1927 0.0696 0.0159 0.2017 0.0177 0.0186 0.0386 0.8146 0.0897 0.6207 0.0802 0.4353 0.0833 0.3311 0.0527 0.4151 0.0146 0.7096 0.9385 0. 0.5033 0.8221 0.0767 0.2091 0.455 0. 0.2436 0.1359 0.224 0.043 0.4207 0.5359 0.0049 0.0371 0.7491 0. 0.0089 0.0728 0.0246 0.0809 0.0767 0.0044 0.0744 0.3381 0.4007 0.0666 0.2761 0.0308 0.2211 0. 0.1732 0.0098 0.0433 0.0177] 2022-08-24 13:21:03 [INFO] [EVAL] The model with the best validation mIoU (0.2693) was saved at iter 24000. 
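The throughput fields in the TRAIN lines are redundant with each other, which makes them easy to sanity-check. Using the entry at iter 26000 above (batch_cost 0.1886 s, ips 42.4251 samples/sec, ETA 07:01:08): ips times batch_cost implies roughly 8 samples per step (the batch size itself is not printed in this excerpt, so that is an inference, not a quoted config value), and the ETA is simply the remaining iterations times the step time.

```python
import datetime

# Sanity-check the TRAIN fields logged at iter 26000: batch_cost 0.1886 s,
# ips 42.4251 samples/sec, ETA 07:01:08.
it, max_iter = 26000, 160000
logged_batch_cost, logged_ips = 0.1886, 42.4251

batch_size = round(logged_ips * logged_batch_cost)   # ~8 samples per step (inferred)
step_time = batch_size / logged_ips                  # un-rounded seconds per step
eta = datetime.timedelta(seconds=round((max_iter - it) * step_time))
print(batch_size, eta)                               # 8 7:01:08 -> matches the log
```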
2022-08-24 13:21:16 [INFO] [TRAIN] epoch: 21, iter: 26050/160000, loss: 1.0332, lr: 0.001014, batch_cost: 0.2637, reader_cost: 0.00286, ips: 30.3422 samples/sec | ETA 09:48:37 2022-08-24 13:21:29 [INFO] [TRAIN] epoch: 21, iter: 26100/160000, loss: 1.0037, lr: 0.001014, batch_cost: 0.2605, reader_cost: 0.00742, ips: 30.7142 samples/sec | ETA 09:41:16 2022-08-24 13:21:42 [INFO] [TRAIN] epoch: 21, iter: 26150/160000, loss: 0.9810, lr: 0.001013, batch_cost: 0.2525, reader_cost: 0.00165, ips: 31.6782 samples/sec | ETA 09:23:22 2022-08-24 13:21:54 [INFO] [TRAIN] epoch: 21, iter: 26200/160000, loss: 0.9442, lr: 0.001013, batch_cost: 0.2460, reader_cost: 0.02045, ips: 32.5183 samples/sec | ETA 09:08:36 2022-08-24 13:22:08 [INFO] [TRAIN] epoch: 21, iter: 26250/160000, loss: 0.9598, lr: 0.001013, batch_cost: 0.2658, reader_cost: 0.00779, ips: 30.1011 samples/sec | ETA 09:52:26 2022-08-24 13:22:21 [INFO] [TRAIN] epoch: 21, iter: 26300/160000, loss: 0.9710, lr: 0.001012, batch_cost: 0.2704, reader_cost: 0.02236, ips: 29.5907 samples/sec | ETA 10:02:26 2022-08-24 13:22:32 [INFO] [TRAIN] epoch: 21, iter: 26350/160000, loss: 0.9972, lr: 0.001012, batch_cost: 0.2145, reader_cost: 0.00443, ips: 37.2956 samples/sec | ETA 07:57:48 2022-08-24 13:22:41 [INFO] [TRAIN] epoch: 21, iter: 26400/160000, loss: 0.9733, lr: 0.001011, batch_cost: 0.1908, reader_cost: 0.00529, ips: 41.9269 samples/sec | ETA 07:04:51 2022-08-24 13:22:51 [INFO] [TRAIN] epoch: 21, iter: 26450/160000, loss: 1.0152, lr: 0.001011, batch_cost: 0.2022, reader_cost: 0.00100, ips: 39.5571 samples/sec | ETA 07:30:09 2022-08-24 13:23:01 [INFO] [TRAIN] epoch: 21, iter: 26500/160000, loss: 0.9685, lr: 0.001011, batch_cost: 0.1987, reader_cost: 0.01217, ips: 40.2701 samples/sec | ETA 07:22:00 2022-08-24 13:23:13 [INFO] [TRAIN] epoch: 22, iter: 26550/160000, loss: 0.9955, lr: 0.001010, batch_cost: 0.2317, reader_cost: 0.04547, ips: 34.5208 samples/sec | ETA 08:35:26 2022-08-24 13:23:22 [INFO] [TRAIN] epoch: 22, iter: 26600/160000, loss: 0.9584, lr: 0.001010, batch_cost: 0.1886, reader_cost: 0.01093, ips: 42.4141 samples/sec | ETA 06:59:21 2022-08-24 13:23:31 [INFO] [TRAIN] epoch: 22, iter: 26650/160000, loss: 1.0290, lr: 0.001010, batch_cost: 0.1821, reader_cost: 0.00035, ips: 43.9204 samples/sec | ETA 06:44:49 2022-08-24 13:23:41 [INFO] [TRAIN] epoch: 22, iter: 26700/160000, loss: 0.9858, lr: 0.001009, batch_cost: 0.1989, reader_cost: 0.00464, ips: 40.2222 samples/sec | ETA 07:21:52 2022-08-24 13:23:51 [INFO] [TRAIN] epoch: 22, iter: 26750/160000, loss: 0.9604, lr: 0.001009, batch_cost: 0.1987, reader_cost: 0.00068, ips: 40.2651 samples/sec | ETA 07:21:14 2022-08-24 13:24:03 [INFO] [TRAIN] epoch: 22, iter: 26800/160000, loss: 0.9837, lr: 0.001008, batch_cost: 0.2350, reader_cost: 0.00052, ips: 34.0398 samples/sec | ETA 08:41:44 2022-08-24 13:24:13 [INFO] [TRAIN] epoch: 22, iter: 26850/160000, loss: 0.9338, lr: 0.001008, batch_cost: 0.1989, reader_cost: 0.00106, ips: 40.2125 samples/sec | ETA 07:21:29 2022-08-24 13:24:23 [INFO] [TRAIN] epoch: 22, iter: 26900/160000, loss: 0.9761, lr: 0.001008, batch_cost: 0.1924, reader_cost: 0.00033, ips: 41.5860 samples/sec | ETA 07:06:44 2022-08-24 13:24:31 [INFO] [TRAIN] epoch: 22, iter: 26950/160000, loss: 0.9557, lr: 0.001007, batch_cost: 0.1683, reader_cost: 0.00161, ips: 47.5478 samples/sec | ETA 06:13:05 2022-08-24 13:24:39 [INFO] [TRAIN] epoch: 22, iter: 27000/160000, loss: 0.9810, lr: 0.001007, batch_cost: 0.1614, reader_cost: 0.00032, ips: 49.5808 samples/sec | ETA 05:57:39 2022-08-24 13:24:39 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 200s - batch_cost: 0.1995 - reader cost: 0.0011 2022-08-24 13:27:59 [INFO] [EVAL] #Images: 2000 mIoU: 0.2673 Acc: 0.7263 Kappa: 0.7045 Dice: 0.3788 2022-08-24 13:27:59 [INFO] [EVAL] Class IoU: [0.6278 0.7381 0.9182 0.6791 0.6398 0.7086 0.7453 0.6982 0.4512 0.6597 0.4333 0.4897 0.6359 0.2652 0.1284 0.3388 0.4308 0.3533 0.5032 0.3259 0.6861 0.45 0.5192 0.3842 0.2796 0.3125 0.4399 0.3107 0.3234 0.2252 0.1636 0.3894 0.2254 0.2267 0.3236 0.3968 0.2949 0.4381 0.2222 0.2897 0.0682 0.0573 0.2767 0.1689 0.2606 0.1774 0.2974 0.3489 0.6377 0.4023 0.381 0.2571 0.1481 0.2373 0.6431 0.44 0.8085 0.181 0.3295 0.1915 0.0366 0.1351 0.2615 0.0753 0.3435 0.5797 0.1603 0.2592 0.0221 0.2843 0.2479 0.3594 0.3911 0.205 0.3604 0.2364 0.4546 0.1491 0.1429 0.115 0.6286 0.1844 0.2202 0.0646 0.2304 0.4451 0.0401 0.0495 0.1739 0.4272 0.3681 0.027 0.1322 0.0342 0.0008 0.0034 0.0804 0.0049 0.0079 0.218 0.0117 0.012 0.0575 0.5481 0.0191 0.4687 0.0312 0.4177 0.0493 0.2364 0.0285 0.0367 0.0051 0.3873 0.698 0. 0.3396 0.5245 0.0153 0.2971 0.4235 0. 0.2131 0.0618 0.2275 0.053 0.4036 0.237 0. 0.1193 0.6038 0. 0.0066 0.0665 0.0349 0.04 0.0659 0.001 0.0773 0.3052 0.05 0.063 0.2225 0.027 0.2201 0. 0.1719 0.0111 0.0468 0.0155] 2022-08-24 13:27:59 [INFO] [EVAL] Class Precision: [0.7287 0.7951 0.9568 0.7874 0.7282 0.8649 0.8435 0.7804 0.5644 0.7654 0.6491 0.6494 0.7195 0.5132 0.3897 0.4678 0.5262 0.6063 0.7202 0.5077 0.7532 0.6175 0.7494 0.5948 0.4632 0.6668 0.5386 0.6486 0.6459 0.3931 0.3248 0.5375 0.5434 0.3638 0.5804 0.5279 0.6214 0.6914 0.4036 0.4357 0.2614 0.2397 0.5794 0.5583 0.5602 0.4809 0.4888 0.4977 0.6721 0.496 0.4647 0.3153 0.4067 0.6281 0.6978 0.5805 0.8721 0.6183 0.5687 0.5319 0.2825 0.2473 0.3072 0.7792 0.4827 0.7981 0.3049 0.5837 0.3234 0.6659 0.5057 0.4451 0.6583 0.297 0.594 0.3504 0.5475 0.5909 0.2086 0.2134 0.7736 0.7243 0.6868 0.2788 0.6294 0.6454 0.2649 0.324 0.3315 0.6756 0.6117 0.0496 0.3288 0.2356 0.0087 0.0397 0.2257 0.1515 0.0297 0.4842 0.8229 0.0194 0.5663 0.9069 0.2203 0.5762 0.1762 0.7407 0.1974 0.2841 0.2917 0.0387 0.1958 0.6276 0.7186 0. 0.8172 0.6265 0.0848 0.5308 0.6532 0. 0.9649 0.7363 0.4678 0.5018 0.8603 0.2895 0.0121 0.3041 0.8152 0. 0.1138 0.2519 0.6737 0.259 0.4487 0.0356 0.3361 0.5976 0.2456 0.0825 0.653 0.1946 0.5286 0. 0.9291 0.4226 0.2408 0.1689] 2022-08-24 13:27:59 [INFO] [EVAL] Class Recall: [0.8193 0.9115 0.9579 0.8316 0.8404 0.7968 0.865 0.8689 0.6924 0.8269 0.5659 0.6658 0.8455 0.3543 0.1608 0.5512 0.7038 0.4584 0.6255 0.4764 0.885 0.624 0.6283 0.5204 0.4136 0.3703 0.7059 0.3735 0.393 0.3453 0.248 0.5857 0.2781 0.3755 0.4224 0.615 0.3594 0.5446 0.331 0.4637 0.0845 0.07 0.3462 0.1949 0.3276 0.2194 0.4317 0.5384 0.9257 0.6807 0.679 0.582 0.189 0.2762 0.8912 0.6452 0.9173 0.2037 0.4392 0.2304 0.0404 0.2294 0.637 0.0769 0.5436 0.6794 0.2526 0.3181 0.0232 0.3316 0.3272 0.6512 0.4907 0.3982 0.4781 0.4209 0.7284 0.1663 0.3122 0.1995 0.7704 0.1983 0.2448 0.0775 0.2665 0.5893 0.0451 0.0552 0.2678 0.5374 0.4804 0.056 0.1811 0.0385 0.0009 0.0037 0.111 0.005 0.0106 0.284 0.0118 0.0306 0.0602 0.5807 0.0205 0.7153 0.0366 0.4893 0.0616 0.5848 0.0306 0.4161 0.0052 0.5028 0.9606 0. 0.3676 0.7631 0.0184 0.403 0.5462 0. 0.2148 0.0631 0.3069 0.0559 0.4319 0.5665 0. 0.164 0.6995 0. 0.007 0.0828 0.0355 0.0451 0.0716 0.001 0.0913 0.3842 0.059 0.2107 0.2524 0.0305 0.2739 0. 0.1741 0.0112 0.0549 0.0168] 2022-08-24 13:27:59 [INFO] [EVAL] The model with the best validation mIoU (0.2693) was saved at iter 24000. 
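The logged Dice moves together with mIoU because, per class, Dice = 2*IoU/(1+IoU). Assuming the reported Dice is the per-class Dice averaged over classes (an assumption about the aggregation, since the evaluator's code is not shown), it has to fall between mIoU and 2*mIoU/(1+mIoU); for this eval that band is (0.2673, ~0.4218), and the logged 0.3788 sits inside it.

```python
# Per-class Dice from per-class IoU: Dice = 2*IoU / (1 + IoU).
# If the logged Dice is the unweighted mean of per-class Dice (assumption), it must lie
# between mIoU and 2*mIoU/(1+mIoU), because the mapping is increasing and concave.
def dice_from_iou(iou):
    return 2.0 * iou / (1.0 + iou)

miou = 0.2673                  # eval at iter 27000
upper = dice_from_iou(miou)    # ~0.4218
print(miou, upper)             # the logged Dice 0.3788 lies inside this band
```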
2022-08-24 13:28:12 [INFO] [TRAIN] epoch: 22, iter: 27050/160000, loss: 0.9276, lr: 0.001007, batch_cost: 0.2552, reader_cost: 0.00323, ips: 31.3504 samples/sec | ETA 09:25:26 2022-08-24 13:28:24 [INFO] [TRAIN] epoch: 22, iter: 27100/160000, loss: 1.0244, lr: 0.001006, batch_cost: 0.2415, reader_cost: 0.01088, ips: 33.1321 samples/sec | ETA 08:54:49 2022-08-24 13:28:37 [INFO] [TRAIN] epoch: 22, iter: 27150/160000, loss: 0.9608, lr: 0.001006, batch_cost: 0.2531, reader_cost: 0.00072, ips: 31.6041 samples/sec | ETA 09:20:28 2022-08-24 13:28:49 [INFO] [TRAIN] epoch: 22, iter: 27200/160000, loss: 1.0551, lr: 0.001005, batch_cost: 0.2458, reader_cost: 0.00163, ips: 32.5521 samples/sec | ETA 09:03:56 2022-08-24 13:29:02 [INFO] [TRAIN] epoch: 22, iter: 27250/160000, loss: 0.9669, lr: 0.001005, batch_cost: 0.2665, reader_cost: 0.00923, ips: 30.0180 samples/sec | ETA 09:49:38 2022-08-24 13:29:13 [INFO] [TRAIN] epoch: 22, iter: 27300/160000, loss: 0.9661, lr: 0.001005, batch_cost: 0.2095, reader_cost: 0.00149, ips: 38.1927 samples/sec | ETA 07:43:15 2022-08-24 13:29:22 [INFO] [TRAIN] epoch: 22, iter: 27350/160000, loss: 0.9856, lr: 0.001004, batch_cost: 0.1923, reader_cost: 0.00509, ips: 41.5911 samples/sec | ETA 07:05:15 2022-08-24 13:29:32 [INFO] [TRAIN] epoch: 22, iter: 27400/160000, loss: 1.0138, lr: 0.001004, batch_cost: 0.1956, reader_cost: 0.00111, ips: 40.9023 samples/sec | ETA 07:12:14 2022-08-24 13:29:42 [INFO] [TRAIN] epoch: 22, iter: 27450/160000, loss: 0.9650, lr: 0.001004, batch_cost: 0.1907, reader_cost: 0.00083, ips: 41.9438 samples/sec | ETA 07:01:21 2022-08-24 13:29:52 [INFO] [TRAIN] epoch: 22, iter: 27500/160000, loss: 1.0342, lr: 0.001003, batch_cost: 0.2023, reader_cost: 0.00055, ips: 39.5534 samples/sec | ETA 07:26:39 2022-08-24 13:30:01 [INFO] [TRAIN] epoch: 22, iter: 27550/160000, loss: 0.9265, lr: 0.001003, batch_cost: 0.1892, reader_cost: 0.00723, ips: 42.2921 samples/sec | ETA 06:57:34 2022-08-24 13:30:12 [INFO] [TRAIN] epoch: 22, iter: 27600/160000, loss: 1.0233, lr: 0.001002, batch_cost: 0.2151, reader_cost: 0.00067, ips: 37.1959 samples/sec | ETA 07:54:36 2022-08-24 13:30:22 [INFO] [TRAIN] epoch: 22, iter: 27650/160000, loss: 0.9910, lr: 0.001002, batch_cost: 0.2029, reader_cost: 0.00032, ips: 39.4297 samples/sec | ETA 07:27:32 2022-08-24 13:30:32 [INFO] [TRAIN] epoch: 22, iter: 27700/160000, loss: 1.0011, lr: 0.001002, batch_cost: 0.1966, reader_cost: 0.00051, ips: 40.7009 samples/sec | ETA 07:13:24 2022-08-24 13:30:42 [INFO] [TRAIN] epoch: 22, iter: 27750/160000, loss: 1.0439, lr: 0.001001, batch_cost: 0.1959, reader_cost: 0.02366, ips: 40.8405 samples/sec | ETA 07:11:45 2022-08-24 13:30:54 [INFO] [TRAIN] epoch: 23, iter: 27800/160000, loss: 0.9298, lr: 0.001001, batch_cost: 0.2472, reader_cost: 0.04339, ips: 32.3642 samples/sec | ETA 09:04:38 2022-08-24 13:31:06 [INFO] [TRAIN] epoch: 23, iter: 27850/160000, loss: 0.9794, lr: 0.001001, batch_cost: 0.2287, reader_cost: 0.00060, ips: 34.9815 samples/sec | ETA 08:23:41 2022-08-24 13:31:13 [INFO] [TRAIN] epoch: 23, iter: 27900/160000, loss: 1.0233, lr: 0.001000, batch_cost: 0.1571, reader_cost: 0.00122, ips: 50.9096 samples/sec | ETA 05:45:58 2022-08-24 13:31:21 [INFO] [TRAIN] epoch: 23, iter: 27950/160000, loss: 0.9858, lr: 0.001000, batch_cost: 0.1560, reader_cost: 0.00846, ips: 51.2813 samples/sec | ETA 05:43:20 2022-08-24 13:31:30 [INFO] [TRAIN] epoch: 23, iter: 28000/160000, loss: 0.9830, lr: 0.000999, batch_cost: 0.1763, reader_cost: 0.00047, ips: 45.3875 samples/sec | ETA 06:27:46 2022-08-24 13:31:30 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 209s - batch_cost: 0.2092 - reader cost: 5.8978e-04 2022-08-24 13:35:00 [INFO] [EVAL] #Images: 2000 mIoU: 0.2718 Acc: 0.7236 Kappa: 0.7022 Dice: 0.3862 2022-08-24 13:35:00 [INFO] [EVAL] Class IoU: [0.6317 0.7384 0.9164 0.678 0.6464 0.7116 0.739 0.6915 0.4433 0.6251 0.426 0.4888 0.6387 0.2662 0.1494 0.343 0.4691 0.3851 0.5195 0.3215 0.6808 0.235 0.5188 0.414 0.2855 0.4165 0.4125 0.308 0.3232 0.2787 0.1119 0.4204 0.1917 0.2117 0.2818 0.3856 0.3034 0.4056 0.1822 0.2836 0.0919 0.0724 0.2654 0.1795 0.3121 0.1553 0.2789 0.3671 0.5713 0.4184 0.4242 0.2987 0.1141 0.1382 0.6944 0.4657 0.8215 0.2368 0.3518 0.1669 0.0828 0.1107 0.2855 0.1503 0.3553 0.5234 0.1601 0.3953 0.0532 0.3105 0.2311 0.3274 0.3949 0.234 0.3674 0.2249 0.4217 0.1649 0.1584 0.113 0.5612 0.2504 0.2346 0.0795 0.3202 0.4618 0.0492 0.0565 0.2051 0.3911 0.2897 0.0296 0.1659 0.0341 0.046 0.007 0.1104 0.063 0.0423 0.2403 0.0414 0.0069 0.0045 0.4826 0.0004 0.4484 0.0406 0.4908 0.0721 0.0454 0.0374 0.2219 0.0066 0.5365 0.6846 0. 0.3591 0.4956 0.0257 0.0848 0.4191 0. 0.2592 0.0742 0.2145 0.0695 0.3611 0.2477 0. 0.1397 0.5977 0. 0.0005 0.1074 0.0592 0.0776 0.0377 0.0014 0.0646 0.2948 0.2478 0.0448 0.3429 0.0355 0.1109 0. 0.1969 0.0168 0.0401 0.0187] 2022-08-24 13:35:00 [INFO] [EVAL] Class Precision: [0.7496 0.795 0.9534 0.7798 0.796 0.8454 0.8424 0.7497 0.5468 0.7374 0.6172 0.6386 0.7244 0.5401 0.4239 0.5024 0.6033 0.6667 0.7 0.5073 0.7428 0.5628 0.7451 0.5313 0.4733 0.6431 0.481 0.6475 0.6691 0.3916 0.3795 0.6015 0.5149 0.4095 0.3297 0.4666 0.593 0.5499 0.408 0.4841 0.2102 0.2485 0.6096 0.4971 0.5145 0.44 0.4566 0.5658 0.7471 0.5191 0.5798 0.4187 0.2732 0.6013 0.7841 0.6866 0.8944 0.5839 0.5685 0.2898 0.0955 0.1865 0.416 0.522 0.4931 0.5813 0.2413 0.5487 0.1378 0.6246 0.5909 0.4388 0.629 0.2984 0.5139 0.2959 0.4894 0.4243 0.4999 0.263 0.7752 0.5363 0.6458 0.3565 0.6272 0.6714 0.3406 0.3478 0.3471 0.6129 0.3506 0.0951 0.3383 0.227 0.0888 0.0355 0.2415 0.4428 0.0902 0.4263 0.3277 0.0124 0.1409 0.7325 0.0018 0.5749 0.0861 0.7117 0.1363 0.0971 0.2688 0.3276 0.1136 0.6968 0.7136 0. 0.6165 0.5628 0.0769 0.549 0.612 0. 0.868 0.82 0.4847 0.6016 0.7254 0.3339 0.0017 0.3248 0.7476 0. 0.0115 0.4003 0.6277 0.2566 0.5035 0.3207 0.4471 0.6653 0.5837 0.0586 0.5547 0.1233 0.6288 0. 0.827 0.4324 0.2152 0.937 ] 2022-08-24 13:35:00 [INFO] [EVAL] Class Recall: [0.8007 0.912 0.9593 0.8386 0.7748 0.8181 0.8576 0.899 0.7008 0.8041 0.579 0.6756 0.8436 0.3442 0.1875 0.5194 0.6784 0.4768 0.6683 0.4673 0.8907 0.2874 0.6307 0.6522 0.4185 0.5417 0.7434 0.3701 0.3847 0.4915 0.1369 0.5827 0.2339 0.3047 0.6602 0.6896 0.3833 0.6071 0.2476 0.4063 0.1403 0.0927 0.3198 0.2193 0.4424 0.1936 0.4174 0.511 0.7082 0.6831 0.6125 0.5103 0.1639 0.1521 0.8586 0.5915 0.9097 0.2848 0.48 0.2825 0.3838 0.2139 0.4765 0.1743 0.5599 0.8401 0.3224 0.5858 0.0796 0.3817 0.2751 0.5632 0.5148 0.5204 0.563 0.4835 0.7529 0.2125 0.1882 0.1653 0.6702 0.3197 0.2693 0.0928 0.3954 0.5967 0.0544 0.0632 0.3339 0.5193 0.6255 0.0412 0.2455 0.0386 0.087 0.0086 0.169 0.0685 0.0737 0.3551 0.0453 0.0151 0.0047 0.5859 0.0006 0.6709 0.0715 0.6126 0.1327 0.0786 0.0416 0.4077 0.0069 0.6999 0.944 0. 0.4625 0.8058 0.0372 0.0912 0.5708 0. 0.2699 0.0754 0.2779 0.0729 0.4183 0.4895 0. 0.1969 0.7488 0. 0.0005 0.1279 0.0613 0.1001 0.0392 0.0014 0.0702 0.3462 0.301 0.1599 0.4732 0.0475 0.1187 0. 0.2054 0.0172 0.0469 0.0187] 2022-08-24 13:35:00 [INFO] [EVAL] The model with the best validation mIoU (0.2718) was saved at iter 28000. 
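To turn these blocks into curves (loss and lr versus iteration), a regex over the fixed "[TRAIN] epoch: ..., iter: N/160000, loss: ..., lr: ..." layout is enough. A minimal parser; the log path below is a placeholder, and findall is used so it also copes with several entries sharing one physical line:

```python
import re

# Extract (iter, loss, lr) from TRAIN entries such as
# "[TRAIN] epoch: 23, iter: 28000/160000, loss: 0.9830, lr: 0.000999, ..."
TRAIN_RE = re.compile(
    r"\[TRAIN\] epoch: \d+, iter: (\d+)/\d+, loss: ([\d.]+), lr: ([\d.]+)"
)

def parse_train_log(path):
    with open(path) as f:
        text = f.read()
    return [(int(i), float(loss), float(lr))
            for i, loss, lr in TRAIN_RE.findall(text)]

# points = parse_train_log("path/to/trainer.log")  # placeholder path
# iters, losses, lrs = zip(*points)                # ready for plotting
```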
2022-08-24 13:35:12 [INFO] [TRAIN] epoch: 23, iter: 28050/160000, loss: 1.0044, lr: 0.000999, batch_cost: 0.2420, reader_cost: 0.01017, ips: 33.0574 samples/sec | ETA 08:52:12 2022-08-24 13:35:25 [INFO] [TRAIN] epoch: 23, iter: 28100/160000, loss: 0.9909, lr: 0.000999, batch_cost: 0.2619, reader_cost: 0.00647, ips: 30.5494 samples/sec | ETA 09:35:40 2022-08-24 13:35:38 [INFO] [TRAIN] epoch: 23, iter: 28150/160000, loss: 0.9225, lr: 0.000998, batch_cost: 0.2572, reader_cost: 0.00084, ips: 31.1013 samples/sec | ETA 09:25:14 2022-08-24 13:35:50 [INFO] [TRAIN] epoch: 23, iter: 28200/160000, loss: 0.9702, lr: 0.000998, batch_cost: 0.2491, reader_cost: 0.02976, ips: 32.1117 samples/sec | ETA 09:07:15 2022-08-24 13:36:03 [INFO] [TRAIN] epoch: 23, iter: 28250/160000, loss: 1.0536, lr: 0.000997, batch_cost: 0.2623, reader_cost: 0.00064, ips: 30.4949 samples/sec | ETA 09:36:03 2022-08-24 13:36:14 [INFO] [TRAIN] epoch: 23, iter: 28300/160000, loss: 0.9585, lr: 0.000997, batch_cost: 0.2169, reader_cost: 0.00059, ips: 36.8831 samples/sec | ETA 07:56:05 2022-08-24 13:36:25 [INFO] [TRAIN] epoch: 23, iter: 28350/160000, loss: 0.9819, lr: 0.000997, batch_cost: 0.2186, reader_cost: 0.00051, ips: 36.6032 samples/sec | ETA 07:59:33 2022-08-24 13:36:37 [INFO] [TRAIN] epoch: 23, iter: 28400/160000, loss: 0.9520, lr: 0.000996, batch_cost: 0.2286, reader_cost: 0.00060, ips: 35.0028 samples/sec | ETA 08:21:17 2022-08-24 13:36:49 [INFO] [TRAIN] epoch: 23, iter: 28450/160000, loss: 1.0574, lr: 0.000996, batch_cost: 0.2420, reader_cost: 0.00094, ips: 33.0581 samples/sec | ETA 08:50:34 2022-08-24 13:36:59 [INFO] [TRAIN] epoch: 23, iter: 28500/160000, loss: 0.9311, lr: 0.000996, batch_cost: 0.2029, reader_cost: 0.00036, ips: 39.4301 samples/sec | ETA 07:24:40 2022-08-24 13:37:09 [INFO] [TRAIN] epoch: 23, iter: 28550/160000, loss: 0.9785, lr: 0.000995, batch_cost: 0.2127, reader_cost: 0.00052, ips: 37.6118 samples/sec | ETA 07:45:59 2022-08-24 13:37:19 [INFO] [TRAIN] epoch: 23, iter: 28600/160000, loss: 1.0262, lr: 0.000995, batch_cost: 0.1891, reader_cost: 0.01831, ips: 42.2988 samples/sec | ETA 06:54:11 2022-08-24 13:37:30 [INFO] [TRAIN] epoch: 23, iter: 28650/160000, loss: 1.0193, lr: 0.000994, batch_cost: 0.2154, reader_cost: 0.00079, ips: 37.1378 samples/sec | ETA 07:51:34 2022-08-24 13:37:39 [INFO] [TRAIN] epoch: 23, iter: 28700/160000, loss: 0.9982, lr: 0.000994, batch_cost: 0.1817, reader_cost: 0.00267, ips: 44.0338 samples/sec | ETA 06:37:34 2022-08-24 13:37:46 [INFO] [TRAIN] epoch: 23, iter: 28750/160000, loss: 0.9420, lr: 0.000994, batch_cost: 0.1497, reader_cost: 0.00042, ips: 53.4522 samples/sec | ETA 05:27:23 2022-08-24 13:37:55 [INFO] [TRAIN] epoch: 23, iter: 28800/160000, loss: 1.0080, lr: 0.000993, batch_cost: 0.1784, reader_cost: 0.00194, ips: 44.8503 samples/sec | ETA 06:30:02 2022-08-24 13:38:03 [INFO] [TRAIN] epoch: 23, iter: 28850/160000, loss: 1.0793, lr: 0.000993, batch_cost: 0.1668, reader_cost: 0.00154, ips: 47.9487 samples/sec | ETA 06:04:41 2022-08-24 13:38:13 [INFO] [TRAIN] epoch: 23, iter: 28900/160000, loss: 0.9341, lr: 0.000993, batch_cost: 0.1997, reader_cost: 0.00042, ips: 40.0638 samples/sec | ETA 07:16:18 2022-08-24 13:38:23 [INFO] [TRAIN] epoch: 23, iter: 28950/160000, loss: 0.9219, lr: 0.000992, batch_cost: 0.1890, reader_cost: 0.00125, ips: 42.3338 samples/sec | ETA 06:52:45 2022-08-24 13:38:31 [INFO] [TRAIN] epoch: 23, iter: 29000/160000, loss: 0.9988, lr: 0.000992, batch_cost: 0.1701, reader_cost: 0.00078, ips: 47.0187 samples/sec | ETA 06:11:29 2022-08-24 13:38:31 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 175s - batch_cost: 0.1751 - reader cost: 6.5302e-04 2022-08-24 13:41:27 [INFO] [EVAL] #Images: 2000 mIoU: 0.2733 Acc: 0.7286 Kappa: 0.7071 Dice: 0.3866 2022-08-24 13:41:27 [INFO] [EVAL] Class IoU: [0.6272 0.7486 0.9204 0.6743 0.6489 0.7097 0.7494 0.7062 0.4505 0.6353 0.421 0.5003 0.6375 0.263 0.1285 0.3242 0.4725 0.4008 0.5144 0.2984 0.6741 0.4702 0.521 0.4036 0.3183 0.444 0.4697 0.2962 0.2999 0.2744 0.1818 0.3867 0.2194 0.2446 0.3391 0.3929 0.3026 0.4169 0.2188 0.269 0.0787 0.046 0.2766 0.1843 0.2617 0.1755 0.2288 0.364 0.665 0.3564 0.4144 0.2711 0.0954 0.1373 0.6465 0.3504 0.7288 0.2059 0.3461 0.1504 0.0224 0.133 0.3068 0.0312 0.341 0.6021 0.1617 0.381 0.0546 0.2694 0.2298 0.3752 0.3853 0.2259 0.3626 0.2491 0.4758 0.1631 0.1804 0.11 0.6151 0.2202 0.1922 0.0187 0.38 0.3995 0.0288 0.0452 0.2161 0.3955 0.3362 0.0219 0.1649 0.0252 0.005 0.004 0.1176 0.0028 0.0012 0.2735 0.0146 0.0217 0.0833 0.487 0.0008 0.359 0.0828 0.4889 0.0625 0.2107 0.0392 0.2059 0.0123 0.5804 0.7619 0. 0.2658 0.4641 0.0172 0.1306 0.4464 0. 0.2112 0.0441 0.2158 0.0799 0.3932 0.2875 0.0075 0.0285 0.6274 0. 0.1153 0.1285 0.0528 0.0551 0.0897 0.004 0.0634 0.1419 0.3528 0.051 0.3337 0.0417 0.1871 0. 0.2035 0.0079 0.0336 0.019 ] 2022-08-24 13:41:27 [INFO] [EVAL] Class Precision: [0.7234 0.8263 0.9543 0.7667 0.7502 0.8604 0.8449 0.8172 0.5681 0.7767 0.6931 0.6303 0.7189 0.5377 0.4567 0.4987 0.63 0.6291 0.6489 0.5564 0.7361 0.5444 0.6784 0.5857 0.4887 0.5209 0.6638 0.6207 0.6767 0.4377 0.2961 0.5216 0.4089 0.3366 0.51 0.5354 0.575 0.7563 0.4346 0.4896 0.2004 0.2849 0.5384 0.4761 0.4024 0.3216 0.3447 0.5983 0.7039 0.4133 0.6266 0.3315 0.3155 0.7369 0.6866 0.4296 0.7533 0.5996 0.564 0.3119 0.2919 0.2627 0.5068 0.7392 0.4428 0.7483 0.3608 0.5753 0.1405 0.8157 0.5131 0.454 0.6212 0.3996 0.515 0.4938 0.698 0.3724 0.3936 0.2607 0.732 0.7047 0.6896 0.1169 0.5753 0.7505 0.2827 0.4128 0.49 0.7135 0.4581 0.0408 0.3285 0.2352 0.0218 0.0174 0.357 0.5055 0.0054 0.47 0.8503 0.0404 0.5111 0.8519 0.0074 0.4054 0.3094 0.6954 0.1481 0.2623 0.2236 0.3168 0.2219 0.6598 0.7977 0. 0.9279 0.5206 0.0953 0.4177 0.7425 0. 0.9737 0.8796 0.4799 0.501 0.7145 0.4113 0.5485 0.3661 0.7836 0. 0.2865 0.3699 0.5176 0.2758 0.3271 0.1098 0.3337 0.7582 0.539 0.0697 0.6113 0.1185 0.5629 0. 0.8385 0.3027 0.3614 0.6688] 2022-08-24 13:41:27 [INFO] [EVAL] Class Recall: [0.8251 0.8884 0.9628 0.8484 0.8277 0.802 0.869 0.8386 0.6852 0.7773 0.5175 0.7081 0.8492 0.3399 0.1516 0.481 0.654 0.5248 0.7128 0.3915 0.8889 0.7753 0.6918 0.5648 0.4772 0.7506 0.6164 0.3616 0.35 0.4237 0.3202 0.5991 0.3214 0.4722 0.5029 0.5962 0.3898 0.4816 0.3059 0.3739 0.1147 0.052 0.3627 0.2313 0.428 0.2787 0.405 0.4818 0.9232 0.7213 0.5503 0.5982 0.1203 0.1444 0.9172 0.6552 0.9573 0.2388 0.4725 0.2252 0.0237 0.2122 0.4375 0.0315 0.5974 0.7551 0.2267 0.53 0.0819 0.2869 0.2939 0.6838 0.5037 0.342 0.5506 0.3346 0.5991 0.2249 0.2499 0.1598 0.794 0.2426 0.2104 0.0218 0.5282 0.4607 0.0311 0.0483 0.2789 0.4701 0.5582 0.0452 0.2488 0.0274 0.0064 0.0052 0.1491 0.0029 0.0015 0.3955 0.0147 0.0446 0.0905 0.532 0.0008 0.7585 0.1015 0.6221 0.0976 0.5171 0.0454 0.3705 0.0128 0.8282 0.9445 0. 0.2714 0.8105 0.0205 0.1597 0.5282 0. 0.2125 0.0444 0.2816 0.0869 0.4665 0.4886 0.0076 0.03 0.7589 0. 0.1617 0.1644 0.0555 0.0644 0.11 0.0042 0.0726 0.1486 0.5051 0.1602 0.4236 0.0604 0.2188 0. 0.2118 0.0081 0.0357 0.0192] 2022-08-24 13:41:27 [INFO] [EVAL] The model with the best validation mIoU (0.2733) was saved at iter 29000. 
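All of the EVAL numbers (mIoU, Acc, Kappa and the per-class IoU/Precision/Recall arrays) can be derived from a single confusion matrix accumulated over the 2000 validation images. A generic numpy sketch of that computation, not necessarily the evaluator's exact implementation:

```python
import numpy as np

def segmentation_metrics(conf):
    """conf[i, j] = pixels whose ground-truth class is i and predicted class is j."""
    conf = conf.astype(np.float64)
    tp = np.diag(conf)
    fp = conf.sum(axis=0) - tp            # predicted as the class but wrong
    fn = conf.sum(axis=1) - tp            # belong to the class but missed
    iou = tp / np.maximum(tp + fp + fn, 1e-12)
    precision = tp / np.maximum(tp + fp, 1e-12)
    recall = tp / np.maximum(tp + fn, 1e-12)
    acc = tp.sum() / conf.sum()
    pe = (conf.sum(axis=0) * conf.sum(axis=1)).sum() / conf.sum() ** 2
    kappa = (acc - pe) / (1.0 - pe)       # Cohen's kappa over all pixels
    return {"mIoU": iou.mean(), "Acc": acc, "Kappa": kappa,
            "Class IoU": iou, "Class Precision": precision, "Class Recall": recall}

# Tiny 3-class example with made-up pixel counts, just to exercise the function:
conf = np.array([[50, 2, 3], [4, 40, 6], [1, 5, 30]])
print({k: np.round(v, 4) for k, v in segmentation_metrics(conf).items()})
```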
2022-08-24 13:41:42 [INFO] [TRAIN] epoch: 24, iter: 29050/160000, loss: 0.9686, lr: 0.000991, batch_cost: 0.3023, reader_cost: 0.05014, ips: 26.4613 samples/sec | ETA 10:59:49 2022-08-24 13:41:55 [INFO] [TRAIN] epoch: 24, iter: 29100/160000, loss: 0.9897, lr: 0.000991, batch_cost: 0.2578, reader_cost: 0.03642, ips: 31.0316 samples/sec | ETA 09:22:26 2022-08-24 13:42:08 [INFO] [TRAIN] epoch: 24, iter: 29150/160000, loss: 0.9431, lr: 0.000991, batch_cost: 0.2629, reader_cost: 0.00102, ips: 30.4265 samples/sec | ETA 09:33:24 2022-08-24 13:42:21 [INFO] [TRAIN] epoch: 24, iter: 29200/160000, loss: 0.9572, lr: 0.000990, batch_cost: 0.2494, reader_cost: 0.00125, ips: 32.0731 samples/sec | ETA 09:03:45 2022-08-24 13:42:33 [INFO] [TRAIN] epoch: 24, iter: 29250/160000, loss: 0.9898, lr: 0.000990, batch_cost: 0.2468, reader_cost: 0.01197, ips: 32.4100 samples/sec | ETA 08:57:53 2022-08-24 13:42:45 [INFO] [TRAIN] epoch: 24, iter: 29300/160000, loss: 0.9478, lr: 0.000990, batch_cost: 0.2499, reader_cost: 0.00082, ips: 32.0113 samples/sec | ETA 09:04:23 2022-08-24 13:42:58 [INFO] [TRAIN] epoch: 24, iter: 29350/160000, loss: 0.9664, lr: 0.000989, batch_cost: 0.2598, reader_cost: 0.00080, ips: 30.7924 samples/sec | ETA 09:25:43 2022-08-24 13:43:11 [INFO] [TRAIN] epoch: 24, iter: 29400/160000, loss: 0.9985, lr: 0.000989, batch_cost: 0.2518, reader_cost: 0.00107, ips: 31.7709 samples/sec | ETA 09:08:05 2022-08-24 13:43:21 [INFO] [TRAIN] epoch: 24, iter: 29450/160000, loss: 0.9753, lr: 0.000988, batch_cost: 0.2004, reader_cost: 0.00211, ips: 39.9169 samples/sec | ETA 07:16:04 2022-08-24 13:43:32 [INFO] [TRAIN] epoch: 24, iter: 29500/160000, loss: 0.9206, lr: 0.000988, batch_cost: 0.2126, reader_cost: 0.00042, ips: 37.6220 samples/sec | ETA 07:42:29 2022-08-24 13:43:42 [INFO] [TRAIN] epoch: 24, iter: 29550/160000, loss: 1.0013, lr: 0.000988, batch_cost: 0.2028, reader_cost: 0.00500, ips: 39.4552 samples/sec | ETA 07:20:50 2022-08-24 13:43:51 [INFO] [TRAIN] epoch: 24, iter: 29600/160000, loss: 0.9467, lr: 0.000987, batch_cost: 0.1886, reader_cost: 0.00153, ips: 42.4244 samples/sec | ETA 06:49:49 2022-08-24 13:44:02 [INFO] [TRAIN] epoch: 24, iter: 29650/160000, loss: 0.9429, lr: 0.000987, batch_cost: 0.2242, reader_cost: 0.00045, ips: 35.6806 samples/sec | ETA 08:07:06 2022-08-24 13:44:14 [INFO] [TRAIN] epoch: 24, iter: 29700/160000, loss: 1.0653, lr: 0.000987, batch_cost: 0.2270, reader_cost: 0.00800, ips: 35.2376 samples/sec | ETA 08:13:02 2022-08-24 13:44:24 [INFO] [TRAIN] epoch: 24, iter: 29750/160000, loss: 1.0358, lr: 0.000986, batch_cost: 0.1957, reader_cost: 0.00079, ips: 40.8717 samples/sec | ETA 07:04:54 2022-08-24 13:44:33 [INFO] [TRAIN] epoch: 24, iter: 29800/160000, loss: 0.9451, lr: 0.000986, batch_cost: 0.1863, reader_cost: 0.00131, ips: 42.9358 samples/sec | ETA 06:44:19 2022-08-24 13:44:41 [INFO] [TRAIN] epoch: 24, iter: 29850/160000, loss: 0.9930, lr: 0.000985, batch_cost: 0.1722, reader_cost: 0.00051, ips: 46.4561 samples/sec | ETA 06:13:32 2022-08-24 13:44:50 [INFO] [TRAIN] epoch: 24, iter: 29900/160000, loss: 0.9834, lr: 0.000985, batch_cost: 0.1798, reader_cost: 0.00053, ips: 44.4858 samples/sec | ETA 06:29:56 2022-08-24 13:45:00 [INFO] [TRAIN] epoch: 24, iter: 29950/160000, loss: 1.0248, lr: 0.000985, batch_cost: 0.1827, reader_cost: 0.00055, ips: 43.7922 samples/sec | ETA 06:35:57 2022-08-24 13:45:10 [INFO] [TRAIN] epoch: 24, iter: 30000/160000, loss: 0.9039, lr: 0.000984, batch_cost: 0.2049, reader_cost: 0.00225, ips: 39.0385 samples/sec | ETA 07:24:00 2022-08-24 13:45:10 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 178s - batch_cost: 0.1778 - reader cost: 6.3813e-04 2022-08-24 13:48:08 [INFO] [EVAL] #Images: 2000 mIoU: 0.2705 Acc: 0.7288 Kappa: 0.7080 Dice: 0.3836 2022-08-24 13:48:08 [INFO] [EVAL] Class IoU: [0.6277 0.7473 0.9187 0.6745 0.6525 0.7115 0.7508 0.6964 0.4579 0.6718 0.4333 0.49 0.6432 0.2678 0.1709 0.3356 0.4194 0.3961 0.5073 0.3169 0.7102 0.4822 0.524 0.4057 0.3087 0.4227 0.4348 0.2935 0.3118 0.2768 0.1721 0.3976 0.229 0.231 0.3285 0.3956 0.3099 0.2892 0.2303 0.2882 0.0949 0.0543 0.2762 0.1846 0.2821 0.1361 0.2958 0.2902 0.6457 0.4318 0.3841 0.2283 0.0965 0.1974 0.7121 0.415 0.7842 0.1801 0.0275 0.1906 0.0589 0.1785 0.2741 0.1486 0.3527 0.4029 0.1973 0.2759 0.0087 0.3409 0.2805 0.3345 0.3897 0.2266 0.3761 0.2484 0.5181 0.1707 0.1692 0.1598 0.6036 0.2225 0.2142 0.0136 0.3598 0.4405 0.0439 0.0374 0.2628 0.4115 0.3082 0.0216 0.1681 0.0221 0.0285 0.0018 0.1432 0. 0.0011 0.2295 0.0084 0.0207 0.0748 0.5303 0.0021 0.3845 0.0545 0.4764 0.053 0.1604 0.0013 0.1355 0.0106 0.4554 0.6308 0. 0.4062 0.5209 0.0252 0.218 0.3744 0. 0.2309 0.0811 0.173 0.0829 0.4101 0.3055 0.0078 0.0285 0.5605 0. 0.0031 0.1706 0.0509 0.071 0.0807 0.0128 0.072 0.2667 0.1417 0.0084 0.2657 0.0341 0.2541 0.0062 0.156 0.0134 0.0379 0.0144] 2022-08-24 13:48:08 [INFO] [EVAL] Class Precision: [0.7433 0.8486 0.9627 0.7687 0.7382 0.7827 0.8657 0.7558 0.5999 0.7474 0.6207 0.63 0.7261 0.5195 0.4218 0.4816 0.5451 0.59 0.6495 0.52 0.7912 0.5913 0.7328 0.5814 0.4805 0.4804 0.6749 0.7103 0.7283 0.5025 0.3069 0.5209 0.4398 0.3566 0.5414 0.5356 0.5648 0.7976 0.4241 0.4736 0.2121 0.2907 0.5856 0.5465 0.4153 0.377 0.5394 0.5887 0.6831 0.5383 0.5566 0.2844 0.3643 0.7782 0.8018 0.5264 0.8349 0.5814 0.3259 0.396 0.2275 0.359 0.3214 0.5435 0.4317 0.4256 0.3094 0.6041 0.4107 0.6652 0.4349 0.5026 0.6119 0.3616 0.5553 0.371 0.7142 0.4397 0.3699 0.3717 0.7351 0.6701 0.6807 0.1144 0.6117 0.634 0.3788 0.4688 0.4765 0.7112 0.3752 0.0755 0.3357 0.2205 0.1031 0.0118 0.5018 0. 0.1096 0.3625 0.8223 0.0402 0.5382 0.7628 0.0338 0.7585 0.1377 0.9009 0.1706 0.2673 0.1271 0.1718 0.2224 0.5025 0.6456 0. 0.7872 0.6279 0.1034 0.5939 0.7807 0. 0.9844 0.6652 0.6287 0.4978 0.7985 0.4182 0.2301 0.3189 0.6414 0. 0.1178 0.4325 0.6364 0.2869 0.2913 0.0706 0.3547 0.7509 0.6535 0.0136 0.7824 0.179 0.6441 0.0069 0.9434 0.5117 0.3568 0.582 ] 2022-08-24 13:48:08 [INFO] [EVAL] Class Recall: [0.8015 0.8623 0.9527 0.8462 0.849 0.8867 0.8498 0.8985 0.6593 0.8692 0.5893 0.688 0.8492 0.3559 0.2232 0.5252 0.6453 0.5466 0.6985 0.448 0.874 0.7233 0.6478 0.5731 0.4633 0.7787 0.5499 0.3334 0.3528 0.3813 0.2815 0.6268 0.3232 0.3961 0.4551 0.6021 0.4071 0.3121 0.3352 0.4239 0.1466 0.0626 0.3433 0.218 0.4681 0.1756 0.3958 0.3639 0.9219 0.6857 0.5534 0.5367 0.116 0.2092 0.8642 0.6624 0.9281 0.2069 0.0291 0.2687 0.0736 0.262 0.6507 0.1698 0.6584 0.8835 0.3524 0.3368 0.0088 0.4116 0.4413 0.5001 0.5177 0.3777 0.5381 0.4292 0.6537 0.2181 0.2377 0.2189 0.7713 0.2498 0.2381 0.0153 0.4663 0.5907 0.0473 0.0391 0.3694 0.4941 0.633 0.0294 0.2519 0.0239 0.0379 0.0022 0.1669 0. 0.0011 0.3848 0.0084 0.0409 0.08 0.635 0.0022 0.4381 0.0829 0.5027 0.0713 0.2861 0.0013 0.3911 0.011 0.829 0.965 0. 0.4563 0.7535 0.0323 0.2562 0.4184 0. 0.2318 0.0845 0.1926 0.0905 0.4575 0.5313 0.008 0.0304 0.8164 0. 0.0032 0.2198 0.0525 0.0862 0.1005 0.0154 0.0828 0.2926 0.1532 0.0212 0.2869 0.0404 0.2956 0.0573 0.1574 0.0136 0.0407 0.0146] 2022-08-24 13:48:08 [INFO] [EVAL] The model with the best validation mIoU (0.2733) was saved at iter 29000. 
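The lr column decays smoothly (0.001037 at iter 23050 down to 0.000984 at iter 30000), which is consistent with a polynomial schedule running toward 0 over the 160000 iterations. The base learning rate and power live in the training config, which is not part of this excerpt, so the sketch below uses assumed values (base_lr 0.0012, power 0.9) purely for illustration and only roughly tracks the logged column:

```python
# Polynomial decay of the form lr = base_lr * (1 - iter/max_iter) ** power.
# base_lr and power are assumptions for illustration; the real values come from the
# training config, which is not shown here, so the output only approximates the log.
def poly_lr(it, base_lr=0.0012, power=0.9, max_iter=160000):
    return base_lr * (1.0 - it / max_iter) ** power

for it in (23050, 26000, 30000):
    print(it, round(poly_lr(it), 6))   # ~0.001043, ~0.001023, ~0.000995
```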
2022-08-24 13:48:20 [INFO] [TRAIN] epoch: 24, iter: 30050/160000, loss: 0.9100, lr: 0.000984, batch_cost: 0.2435, reader_cost: 0.00648, ips: 32.8493 samples/sec | ETA 08:47:27 2022-08-24 13:48:33 [INFO] [TRAIN] epoch: 24, iter: 30100/160000, loss: 0.9854, lr: 0.000983, batch_cost: 0.2494, reader_cost: 0.00371, ips: 32.0829 samples/sec | ETA 08:59:51 2022-08-24 13:48:45 [INFO] [TRAIN] epoch: 24, iter: 30150/160000, loss: 0.9243, lr: 0.000983, batch_cost: 0.2437, reader_cost: 0.00043, ips: 32.8268 samples/sec | ETA 08:47:24 2022-08-24 13:48:57 [INFO] [TRAIN] epoch: 24, iter: 30200/160000, loss: 0.9955, lr: 0.000983, batch_cost: 0.2422, reader_cost: 0.01684, ips: 33.0315 samples/sec | ETA 08:43:56 2022-08-24 13:49:09 [INFO] [TRAIN] epoch: 24, iter: 30250/160000, loss: 1.0627, lr: 0.000982, batch_cost: 0.2403, reader_cost: 0.01096, ips: 33.2931 samples/sec | ETA 08:39:37 2022-08-24 13:49:22 [INFO] [TRAIN] epoch: 24, iter: 30300/160000, loss: 0.9962, lr: 0.000982, batch_cost: 0.2556, reader_cost: 0.00689, ips: 31.3019 samples/sec | ETA 09:12:28 2022-08-24 13:49:37 [INFO] [TRAIN] epoch: 25, iter: 30350/160000, loss: 0.9560, lr: 0.000982, batch_cost: 0.3073, reader_cost: 0.03908, ips: 26.0307 samples/sec | ETA 11:04:05 2022-08-24 13:49:51 [INFO] [TRAIN] epoch: 25, iter: 30400/160000, loss: 0.9414, lr: 0.000981, batch_cost: 0.2669, reader_cost: 0.00391, ips: 29.9700 samples/sec | ETA 09:36:34 2022-08-24 13:50:02 [INFO] [TRAIN] epoch: 25, iter: 30450/160000, loss: 0.9804, lr: 0.000981, batch_cost: 0.2365, reader_cost: 0.01952, ips: 33.8337 samples/sec | ETA 08:30:32 2022-08-24 13:50:12 [INFO] [TRAIN] epoch: 25, iter: 30500/160000, loss: 1.0561, lr: 0.000980, batch_cost: 0.1970, reader_cost: 0.00158, ips: 40.6040 samples/sec | ETA 07:05:14 2022-08-24 13:50:23 [INFO] [TRAIN] epoch: 25, iter: 30550/160000, loss: 0.9669, lr: 0.000980, batch_cost: 0.2051, reader_cost: 0.00358, ips: 39.0142 samples/sec | ETA 07:22:24 2022-08-24 13:50:34 [INFO] [TRAIN] epoch: 25, iter: 30600/160000, loss: 1.0203, lr: 0.000980, batch_cost: 0.2324, reader_cost: 0.00101, ips: 34.4191 samples/sec | ETA 08:21:16 2022-08-24 13:50:44 [INFO] [TRAIN] epoch: 25, iter: 30650/160000, loss: 0.9568, lr: 0.000979, batch_cost: 0.2020, reader_cost: 0.00118, ips: 39.6041 samples/sec | ETA 07:15:28 2022-08-24 13:50:55 [INFO] [TRAIN] epoch: 25, iter: 30700/160000, loss: 0.9761, lr: 0.000979, batch_cost: 0.2242, reader_cost: 0.00189, ips: 35.6850 samples/sec | ETA 08:03:06 2022-08-24 13:51:07 [INFO] [TRAIN] epoch: 25, iter: 30750/160000, loss: 0.9396, lr: 0.000979, batch_cost: 0.2394, reader_cost: 0.00052, ips: 33.4123 samples/sec | ETA 08:35:46 2022-08-24 13:51:17 [INFO] [TRAIN] epoch: 25, iter: 30800/160000, loss: 1.0042, lr: 0.000978, batch_cost: 0.1837, reader_cost: 0.00075, ips: 43.5526 samples/sec | ETA 06:35:32 2022-08-24 13:51:26 [INFO] [TRAIN] epoch: 25, iter: 30850/160000, loss: 1.0026, lr: 0.000978, batch_cost: 0.1848, reader_cost: 0.00087, ips: 43.2898 samples/sec | ETA 06:37:47 2022-08-24 13:51:34 [INFO] [TRAIN] epoch: 25, iter: 30900/160000, loss: 0.9877, lr: 0.000977, batch_cost: 0.1657, reader_cost: 0.00055, ips: 48.2821 samples/sec | ETA 05:56:30 2022-08-24 13:51:43 [INFO] [TRAIN] epoch: 25, iter: 30950/160000, loss: 0.9320, lr: 0.000977, batch_cost: 0.1743, reader_cost: 0.00047, ips: 45.8937 samples/sec | ETA 06:14:55 2022-08-24 13:51:52 [INFO] [TRAIN] epoch: 25, iter: 31000/160000, loss: 0.9862, lr: 0.000977, batch_cost: 0.1820, reader_cost: 0.00042, ips: 43.9509 samples/sec | ETA 06:31:20 2022-08-24 13:51:52 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 211s - batch_cost: 0.2105 - reader cost: 7.4475e-04 2022-08-24 13:55:23 [INFO] [EVAL] #Images: 2000 mIoU: 0.2793 Acc: 0.7298 Kappa: 0.7085 Dice: 0.3962 2022-08-24 13:55:23 [INFO] [EVAL] Class IoU: [0.6304 0.751 0.9197 0.6827 0.6494 0.7227 0.7368 0.7007 0.4439 0.67 0.4191 0.5008 0.6313 0.277 0.1334 0.3353 0.4388 0.3789 0.5014 0.324 0.711 0.453 0.48 0.4259 0.3073 0.397 0.4804 0.3104 0.3662 0.2559 0.1764 0.3985 0.2215 0.2424 0.2798 0.4009 0.293 0.3988 0.2208 0.2813 0.087 0.0483 0.2851 0.1942 0.2747 0.1182 0.2909 0.3626 0.6462 0.4183 0.4306 0.2656 0.0728 0.1639 0.658 0.4111 0.7717 0.2521 0.3118 0.1827 0.053 0.1816 0.2503 0.0559 0.3661 0.5548 0.189 0.3877 0.0258 0.2794 0.1912 0.3501 0.3859 0.222 0.394 0.226 0.4796 0.171 0.1949 0.1522 0.4863 0.2499 0.2501 0.0076 0.3259 0.4268 0.0287 0.0504 0.2841 0.3968 0.2763 0.0773 0.1555 0.054 0. 0.0033 0.1352 0.1202 0.0655 0.1879 0.0157 0.0049 0.1177 0.6664 0.0003 0.4967 0.0628 0.4789 0.0482 0.1447 0.0143 0.2997 0.0173 0.5276 0.5977 0.0028 0.3202 0.4952 0.1224 0.3315 0.3997 0. 0.2557 0.1249 0.1876 0.0874 0.4074 0.282 0.0495 0.086 0.5183 0. 0.0141 0.1084 0.0739 0.0601 0.0497 0.0097 0.0821 0.2956 0.3006 0.0351 0.357 0.0132 0.2548 0. 0.1491 0.0051 0.0291 0.004 ] 2022-08-24 13:55:23 [INFO] [EVAL] Class Precision: [0.7245 0.8314 0.9625 0.8066 0.7447 0.8455 0.8582 0.769 0.5661 0.7712 0.67 0.6268 0.7129 0.5164 0.4542 0.5507 0.5209 0.6239 0.6624 0.5399 0.8009 0.619 0.5725 0.5821 0.4513 0.5193 0.6051 0.7106 0.6232 0.6015 0.3092 0.4913 0.4905 0.3531 0.4431 0.5645 0.5927 0.7678 0.4947 0.5117 0.14 0.2359 0.5468 0.5133 0.3853 0.2908 0.4725 0.5237 0.6731 0.5135 0.5206 0.3586 0.3451 0.6127 0.7005 0.5227 0.8117 0.5581 0.5646 0.3402 0.2232 0.357 0.4097 0.6311 0.4597 0.6375 0.3821 0.5073 0.3338 0.801 0.5981 0.5015 0.6386 0.2945 0.6479 0.3198 0.6721 0.4052 0.3898 0.2734 0.7925 0.6717 0.6591 0.097 0.6046 0.6979 0.1661 0.4102 0.8228 0.5542 0.3268 0.1874 0.3948 0.2427 0.0004 0.0161 0.5698 0.4847 0.2362 0.4423 0.7204 0.0144 0.4257 0.8236 0.017 0.6562 0.2216 0.8397 0.1953 0.2873 0.2846 0.9219 0.1394 0.5956 0.6014 0.1381 0.8417 0.5752 0.1608 0.4635 0.746 0. 0.9491 0.6323 0.5587 0.5637 0.8541 0.3849 0.2789 0.3308 0.6045 0. 0.1007 0.6289 0.4178 0.2697 0.3787 0.1544 0.2701 0.719 0.6593 0.0432 0.6504 0.1965 0.6367 0. 0.9284 0.4761 0.5384 0.5401] 2022-08-24 13:55:23 [INFO] [EVAL] Class Recall: [0.8292 0.886 0.9539 0.8163 0.8353 0.8326 0.8389 0.8875 0.6728 0.8362 0.5281 0.7136 0.8465 0.374 0.1589 0.4615 0.7358 0.491 0.6734 0.4476 0.8637 0.628 0.7483 0.6134 0.4905 0.6276 0.6999 0.3554 0.4703 0.3081 0.2911 0.6784 0.2877 0.4359 0.4316 0.5805 0.3669 0.4535 0.2851 0.3845 0.187 0.0572 0.3732 0.238 0.4891 0.166 0.4308 0.541 0.9418 0.6928 0.7135 0.5059 0.0845 0.1828 0.9157 0.6581 0.94 0.3149 0.4105 0.2829 0.065 0.2698 0.3915 0.0577 0.6427 0.8104 0.2722 0.6219 0.0272 0.3002 0.2194 0.5369 0.4938 0.4742 0.5013 0.435 0.6261 0.2283 0.2805 0.2556 0.5573 0.2847 0.2873 0.0082 0.4142 0.5235 0.0335 0.0543 0.3026 0.5827 0.6414 0.1164 0.2041 0.065 0. 0.0041 0.1506 0.1379 0.0832 0.2462 0.0158 0.0074 0.14 0.7773 0.0004 0.6714 0.0805 0.5271 0.0601 0.2258 0.0148 0.3075 0.0194 0.8221 0.9896 0.0029 0.3407 0.7809 0.3392 0.5378 0.4626 0. 0.2593 0.1347 0.2202 0.0938 0.4379 0.5134 0.0568 0.1042 0.7843 0. 0.0161 0.1158 0.0824 0.0718 0.0541 0.0103 0.1056 0.3342 0.3559 0.1578 0.4418 0.014 0.2982 0. 0.1508 0.0051 0.0299 0.004 ] 2022-08-24 13:55:23 [INFO] [EVAL] The model with the best validation mIoU (0.2793) was saved at iter 31000. 
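reader_cost is the part of each step spent waiting on the data pipeline. It stays in the millisecond range in steady state (for example 0.00078 s at iter 29000) but spikes at epoch boundaries (0.05014 s at iter 29050, 0.03908 s at iter 30350), which also inflates batch_cost and the ETA for those windows. A small helper to flag such steps, assuming (iter, batch_cost, reader_cost) tuples have already been pulled from the log, for example with a parser like the one sketched earlier:

```python
# Flag steps where data loading takes more than `threshold` of the whole step.
# `records` is assumed to be (iter, batch_cost, reader_cost) tuples from the log.
def dataloader_stalls(records, threshold=0.10):
    return [(it, reader / batch)
            for it, batch, reader in records
            if reader / batch > threshold]

records = [(29000, 0.1701, 0.00078),   # steady state: well under 1% reader time
           (29050, 0.3023, 0.05014),   # epoch-24 start: ~17% of the step is data loading
           (30350, 0.3073, 0.03908)]   # epoch-25 start: ~13%
print(dataloader_stalls(records))
```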
2022-08-24 13:55:35 [INFO] [TRAIN] epoch: 25, iter: 31050/160000, loss: 0.9273, lr: 0.000976, batch_cost: 0.2429, reader_cost: 0.02218, ips: 32.9332 samples/sec | ETA 08:42:04 2022-08-24 13:55:47 [INFO] [TRAIN] epoch: 25, iter: 31100/160000, loss: 0.9919, lr: 0.000976, batch_cost: 0.2344, reader_cost: 0.00176, ips: 34.1233 samples/sec | ETA 08:23:39 2022-08-24 13:55:59 [INFO] [TRAIN] epoch: 25, iter: 31150/160000, loss: 0.9052, lr: 0.000976, batch_cost: 0.2455, reader_cost: 0.00369, ips: 32.5849 samples/sec | ETA 08:47:14 2022-08-24 13:56:12 [INFO] [TRAIN] epoch: 25, iter: 31200/160000, loss: 0.9644, lr: 0.000975, batch_cost: 0.2484, reader_cost: 0.03065, ips: 32.2079 samples/sec | ETA 08:53:12 2022-08-24 13:56:23 [INFO] [TRAIN] epoch: 25, iter: 31250/160000, loss: 0.9678, lr: 0.000975, batch_cost: 0.2323, reader_cost: 0.00318, ips: 34.4353 samples/sec | ETA 08:18:31 2022-08-24 13:56:33 [INFO] [TRAIN] epoch: 25, iter: 31300/160000, loss: 1.0011, lr: 0.000974, batch_cost: 0.2025, reader_cost: 0.00076, ips: 39.5132 samples/sec | ETA 07:14:17 2022-08-24 13:56:45 [INFO] [TRAIN] epoch: 25, iter: 31350/160000, loss: 0.9454, lr: 0.000974, batch_cost: 0.2271, reader_cost: 0.00071, ips: 35.2273 samples/sec | ETA 08:06:55 2022-08-24 13:56:56 [INFO] [TRAIN] epoch: 25, iter: 31400/160000, loss: 0.9481, lr: 0.000974, batch_cost: 0.2338, reader_cost: 0.00060, ips: 34.2159 samples/sec | ETA 08:21:07 2022-08-24 13:57:07 [INFO] [TRAIN] epoch: 25, iter: 31450/160000, loss: 0.9752, lr: 0.000973, batch_cost: 0.2174, reader_cost: 0.00063, ips: 36.7940 samples/sec | ETA 07:45:50 2022-08-24 13:57:17 [INFO] [TRAIN] epoch: 25, iter: 31500/160000, loss: 0.9551, lr: 0.000973, batch_cost: 0.1870, reader_cost: 0.00057, ips: 42.7744 samples/sec | ETA 06:40:33 2022-08-24 13:57:27 [INFO] [TRAIN] epoch: 25, iter: 31550/160000, loss: 1.0315, lr: 0.000972, batch_cost: 0.2001, reader_cost: 0.00065, ips: 39.9825 samples/sec | ETA 07:08:21 2022-08-24 13:57:39 [INFO] [TRAIN] epoch: 26, iter: 31600/160000, loss: 0.9006, lr: 0.000972, batch_cost: 0.2537, reader_cost: 0.04490, ips: 31.5296 samples/sec | ETA 09:02:58 2022-08-24 13:57:50 [INFO] [TRAIN] epoch: 26, iter: 31650/160000, loss: 0.9341, lr: 0.000972, batch_cost: 0.2057, reader_cost: 0.00060, ips: 38.8905 samples/sec | ETA 07:20:02 2022-08-24 13:57:58 [INFO] [TRAIN] epoch: 26, iter: 31700/160000, loss: 0.9458, lr: 0.000971, batch_cost: 0.1627, reader_cost: 0.00456, ips: 49.1594 samples/sec | ETA 05:47:59 2022-08-24 13:58:08 [INFO] [TRAIN] epoch: 26, iter: 31750/160000, loss: 0.9209, lr: 0.000971, batch_cost: 0.2017, reader_cost: 0.00056, ips: 39.6717 samples/sec | ETA 07:11:02 2022-08-24 13:58:16 [INFO] [TRAIN] epoch: 26, iter: 31800/160000, loss: 0.8737, lr: 0.000971, batch_cost: 0.1670, reader_cost: 0.00068, ips: 47.9181 samples/sec | ETA 05:56:43 2022-08-24 13:58:25 [INFO] [TRAIN] epoch: 26, iter: 31850/160000, loss: 0.9455, lr: 0.000970, batch_cost: 0.1717, reader_cost: 0.00055, ips: 46.6026 samples/sec | ETA 06:06:38 2022-08-24 13:58:32 [INFO] [TRAIN] epoch: 26, iter: 31900/160000, loss: 0.9749, lr: 0.000970, batch_cost: 0.1501, reader_cost: 0.00362, ips: 53.2924 samples/sec | ETA 05:20:29 2022-08-24 13:58:40 [INFO] [TRAIN] epoch: 26, iter: 31950/160000, loss: 0.8905, lr: 0.000969, batch_cost: 0.1591, reader_cost: 0.00040, ips: 50.2779 samples/sec | ETA 05:39:34 2022-08-24 13:58:50 [INFO] [TRAIN] epoch: 26, iter: 32000/160000, loss: 0.9375, lr: 0.000969, batch_cost: 0.2023, reader_cost: 0.00057, ips: 39.5373 samples/sec | ETA 07:11:39 2022-08-24 13:58:50 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 195s - batch_cost: 0.1948 - reader cost: 7.5694e-04 2022-08-24 14:02:05 [INFO] [EVAL] #Images: 2000 mIoU: 0.2779 Acc: 0.7296 Kappa: 0.7087 Dice: 0.3943 2022-08-24 14:02:06 [INFO] [EVAL] Class IoU: [0.6387 0.7566 0.9183 0.6789 0.6541 0.721 0.7346 0.7104 0.4541 0.6238 0.4308 0.4842 0.6467 0.2044 0.1411 0.3461 0.4375 0.3985 0.5035 0.3358 0.7013 0.3739 0.5305 0.422 0.2896 0.4347 0.4748 0.3126 0.2948 0.2462 0.1841 0.4191 0.1777 0.2367 0.302 0.3839 0.321 0.4453 0.213 0.2679 0.1029 0.0572 0.2893 0.2082 0.2825 0.1237 0.2874 0.3805 0.6331 0.462 0.4245 0.2267 0.1603 0.1511 0.6399 0.368 0.7491 0.3294 0.2327 0.1731 0.0962 0.2142 0.2546 0.0549 0.3448 0.6101 0.1835 0.3718 0.0267 0.3421 0.2945 0.3351 0.3606 0.1988 0.3581 0.2559 0.3527 0.1793 0.1978 0.1368 0.6322 0.2301 0.2258 0.0136 0.3278 0.4524 0.0396 0.0547 0.2505 0.4114 0.3047 0. 0.1996 0.0367 0.0056 0.0063 0.1467 0.0316 0.003 0.2644 0.0253 0.0096 0.083 0.0089 0.001 0.4084 0.0483 0.4295 0.0525 0.1842 0.0643 0.1822 0.023 0.5765 0.7807 0. 0.3696 0.5257 0.071 0.3132 0.4068 0. 0.2335 0.0975 0.1868 0.0772 0.4496 0.286 0.1759 0.1777 0.5735 0.0019 0.0094 0.0764 0.0509 0.0855 0.0667 0.01 0.0692 0.3092 0.2657 0.0135 0.287 0.2654 0.2141 0.002 0.1336 0.0155 0.0325 0.0243] 2022-08-24 14:02:06 [INFO] [EVAL] Class Precision: [0.7504 0.8438 0.9571 0.7647 0.7317 0.8426 0.8121 0.8008 0.5639 0.6875 0.6055 0.6557 0.7513 0.5352 0.4573 0.5115 0.5314 0.6032 0.6813 0.5044 0.7801 0.6745 0.7381 0.5849 0.5016 0.5334 0.5586 0.773 0.729 0.3914 0.3422 0.6211 0.3874 0.3218 0.4739 0.5419 0.5795 0.6778 0.3832 0.4909 0.1907 0.2619 0.5596 0.4873 0.5017 0.446 0.5321 0.6303 0.7784 0.6219 0.5896 0.2863 0.3195 0.5266 0.6908 0.4517 0.7852 0.502 0.6291 0.264 0.1482 0.3213 0.3868 0.774 0.4101 0.7968 0.3261 0.5516 0.1659 0.7756 0.4557 0.5191 0.6615 0.3451 0.4912 0.5098 0.6002 0.5296 0.4724 0.4066 0.8 0.6686 0.6912 0.0809 0.6526 0.6463 0.1701 0.4039 0.4643 0.6268 0.3959 0. 0.3352 0.2368 0.0333 0.0376 0.5271 0.3785 0.0078 0.4492 0.5584 0.0139 0.6093 0.114 0.011 0.699 0.124 0.6173 0.1765 0.2658 0.2979 0.3193 0.2905 0.6979 0.7973 0. 0.8159 0.6125 0.1321 0.5266 0.7689 0. 0.9268 0.6964 0.4926 0.5348 0.8069 0.3992 0.3237 0.2764 0.7168 0.9821 0.1709 0.5471 0.5117 0.3759 0.3146 0.0878 0.3533 0.6552 0.3539 0.0219 0.6337 0.4038 0.5302 0.0033 0.956 0.3491 0.2389 0.8468] 2022-08-24 14:02:06 [INFO] [EVAL] Class Recall: [0.8109 0.8799 0.9578 0.8582 0.8604 0.8332 0.885 0.8629 0.6999 0.8706 0.5989 0.6493 0.8228 0.2485 0.1695 0.5169 0.7123 0.5401 0.6586 0.5013 0.8741 0.4562 0.6536 0.6024 0.4066 0.7015 0.7598 0.3441 0.3311 0.3989 0.2849 0.5631 0.2471 0.4721 0.4544 0.5685 0.4185 0.5648 0.3242 0.371 0.1828 0.0682 0.3747 0.2666 0.3927 0.1462 0.3846 0.4898 0.7722 0.6423 0.6024 0.5213 0.2433 0.1749 0.8966 0.6651 0.9421 0.4893 0.2698 0.3344 0.2152 0.3913 0.4269 0.0558 0.6839 0.7225 0.2956 0.5329 0.0309 0.3796 0.4543 0.4859 0.4422 0.3194 0.5692 0.3394 0.461 0.2132 0.2539 0.1709 0.7509 0.2597 0.2512 0.0161 0.397 0.6012 0.0491 0.0595 0.3524 0.5448 0.5697 0. 0.3304 0.0417 0.0067 0.0075 0.1689 0.0333 0.0048 0.3911 0.0258 0.0303 0.0876 0.0095 0.0011 0.4956 0.0734 0.5853 0.0696 0.3751 0.0758 0.2979 0.0244 0.7683 0.974 0. 0.4033 0.7876 0.1331 0.436 0.4635 0. 0.2379 0.1018 0.2313 0.0828 0.5038 0.5022 0.2781 0.3323 0.7416 0.0019 0.0098 0.0815 0.0535 0.0997 0.0781 0.0112 0.0792 0.3693 0.516 0.0342 0.3441 0.4366 0.2643 0.0053 0.1344 0.0159 0.0363 0.0244] 2022-08-24 14:02:06 [INFO] [EVAL] The model with the best validation mIoU (0.2793) was saved at iter 31000. 
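Evaluation runs every 1000 iterations in this excerpt and each pass over the 2000 validation images takes roughly 174 to 211 s, time that the training ETA (remaining iterations times batch_cost) does not account for. A rough budget for the whole run, assuming the interval and per-eval time stay in this range:

```python
# Rough wall-clock budget for the periodic evaluation, which the training ETA excludes.
# Assumes the eval interval stays at 1000 iters and each eval takes ~174-211 s, as here.
max_iter, eval_interval = 160000, 1000
evals_total = max_iter // eval_interval            # 160 evaluation passes
for per_eval in (174, 211):
    hours = evals_total * per_eval / 3600
    print(f"{per_eval} s/eval -> ~{hours:.1f} h of evaluation over the whole run")
```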
2022-08-24 14:02:19 [INFO] [TRAIN] epoch: 26, iter: 32050/160000, loss: 0.9589, lr: 0.000969, batch_cost: 0.2583, reader_cost: 0.01100, ips: 30.9727 samples/sec | ETA 09:10:48 2022-08-24 14:02:31 [INFO] [TRAIN] epoch: 26, iter: 32100/160000, loss: 0.9831, lr: 0.000968, batch_cost: 0.2532, reader_cost: 0.00684, ips: 31.5997 samples/sec | ETA 08:59:40 2022-08-24 14:02:43 [INFO] [TRAIN] epoch: 26, iter: 32150/160000, loss: 0.9686, lr: 0.000968, batch_cost: 0.2383, reader_cost: 0.01439, ips: 33.5685 samples/sec | ETA 08:27:49 2022-08-24 14:02:56 [INFO] [TRAIN] epoch: 26, iter: 32200/160000, loss: 0.8967, lr: 0.000968, batch_cost: 0.2544, reader_cost: 0.01452, ips: 31.4449 samples/sec | ETA 09:01:53 2022-08-24 14:03:10 [INFO] [TRAIN] epoch: 26, iter: 32250/160000, loss: 0.9293, lr: 0.000967, batch_cost: 0.2750, reader_cost: 0.00071, ips: 29.0919 samples/sec | ETA 09:45:30 2022-08-24 14:03:19 [INFO] [TRAIN] epoch: 26, iter: 32300/160000, loss: 1.0130, lr: 0.000967, batch_cost: 0.1967, reader_cost: 0.00731, ips: 40.6775 samples/sec | ETA 06:58:34 2022-08-24 14:03:30 [INFO] [TRAIN] epoch: 26, iter: 32350/160000, loss: 0.9598, lr: 0.000966, batch_cost: 0.2128, reader_cost: 0.00085, ips: 37.6003 samples/sec | ETA 07:32:39 2022-08-24 14:03:40 [INFO] [TRAIN] epoch: 26, iter: 32400/160000, loss: 0.9692, lr: 0.000966, batch_cost: 0.1965, reader_cost: 0.00194, ips: 40.7204 samples/sec | ETA 06:57:48 2022-08-24 14:03:50 [INFO] [TRAIN] epoch: 26, iter: 32450/160000, loss: 0.9364, lr: 0.000966, batch_cost: 0.1986, reader_cost: 0.00047, ips: 40.2767 samples/sec | ETA 07:02:14 2022-08-24 14:04:00 [INFO] [TRAIN] epoch: 26, iter: 32500/160000, loss: 0.9306, lr: 0.000965, batch_cost: 0.2001, reader_cost: 0.00037, ips: 39.9859 samples/sec | ETA 07:05:08 2022-08-24 14:04:10 [INFO] [TRAIN] epoch: 26, iter: 32550/160000, loss: 0.9326, lr: 0.000965, batch_cost: 0.1953, reader_cost: 0.00039, ips: 40.9579 samples/sec | ETA 06:54:53 2022-08-24 14:04:19 [INFO] [TRAIN] epoch: 26, iter: 32600/160000, loss: 0.9005, lr: 0.000965, batch_cost: 0.1949, reader_cost: 0.00704, ips: 41.0419 samples/sec | ETA 06:53:53 2022-08-24 14:04:32 [INFO] [TRAIN] epoch: 26, iter: 32650/160000, loss: 1.0068, lr: 0.000964, batch_cost: 0.2498, reader_cost: 0.00074, ips: 32.0212 samples/sec | ETA 08:50:16 2022-08-24 14:04:41 [INFO] [TRAIN] epoch: 26, iter: 32700/160000, loss: 1.0009, lr: 0.000964, batch_cost: 0.1842, reader_cost: 0.00036, ips: 43.4318 samples/sec | ETA 06:30:48 2022-08-24 14:04:49 [INFO] [TRAIN] epoch: 26, iter: 32750/160000, loss: 0.9639, lr: 0.000963, batch_cost: 0.1497, reader_cost: 0.00050, ips: 53.4409 samples/sec | ETA 05:17:29 2022-08-24 14:04:56 [INFO] [TRAIN] epoch: 26, iter: 32800/160000, loss: 0.9238, lr: 0.000963, batch_cost: 0.1543, reader_cost: 0.00082, ips: 51.8628 samples/sec | ETA 05:27:01 2022-08-24 14:05:06 [INFO] [TRAIN] epoch: 27, iter: 32850/160000, loss: 0.8654, lr: 0.000963, batch_cost: 0.2023, reader_cost: 0.02856, ips: 39.5414 samples/sec | ETA 07:08:44 2022-08-24 14:05:14 [INFO] [TRAIN] epoch: 27, iter: 32900/160000, loss: 0.9261, lr: 0.000962, batch_cost: 0.1618, reader_cost: 0.00060, ips: 49.4422 samples/sec | ETA 05:42:45 2022-08-24 14:05:24 [INFO] [TRAIN] epoch: 27, iter: 32950/160000, loss: 0.9798, lr: 0.000962, batch_cost: 0.1830, reader_cost: 0.00065, ips: 43.7136 samples/sec | ETA 06:27:31 2022-08-24 14:05:33 [INFO] [TRAIN] epoch: 27, iter: 33000/160000, loss: 0.8604, lr: 0.000962, batch_cost: 0.1924, reader_cost: 0.00051, ips: 41.5805 samples/sec | ETA 06:47:14 2022-08-24 14:05:33 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 189s - batch_cost: 0.1893 - reader cost: 6.5469e-04 2022-08-24 14:08:43 [INFO] [EVAL] #Images: 2000 mIoU: 0.2815 Acc: 0.7304 Kappa: 0.7094 Dice: 0.3987 2022-08-24 14:08:43 [INFO] [EVAL] Class IoU: [0.6282 0.7534 0.9181 0.6844 0.6461 0.7098 0.7432 0.7204 0.456 0.6162 0.4174 0.4851 0.6469 0.3119 0.1942 0.3462 0.441 0.3503 0.5199 0.3536 0.7161 0.461 0.5197 0.424 0.3024 0.4208 0.4983 0.3102 0.3216 0.2615 0.1378 0.4197 0.2266 0.2414 0.3526 0.3289 0.3177 0.4745 0.2287 0.2688 0.091 0.0432 0.2791 0.2104 0.3025 0.2101 0.2874 0.378 0.6105 0.4477 0.4096 0.2559 0.1789 0.1676 0.668 0.4172 0.8251 0.2597 0.3133 0.1545 0.0817 0.167 0.2907 0.1398 0.3395 0.575 0.1576 0.3785 0.082 0.3012 0.2385 0.3924 0.3672 0.2217 0.3637 0.2616 0.4598 0.1727 0.1513 0.2101 0.6304 0.2601 0.1652 0.0111 0.351 0.3966 0.0311 0.046 0.2482 0.411 0.3517 0.0751 0.2096 0.0679 0.0113 0.0001 0.1581 0.02 0.0247 0.2379 0.0638 0.0159 0.0865 0.4078 0.0001 0.3615 0.0569 0.4363 0.0432 0.0614 0.0005 0.1449 0.0202 0.5748 0.8063 0. 0.3914 0.4865 0.0231 0.3008 0.4217 0. 0.2421 0.1067 0.2196 0.0889 0.3906 0.2923 0.001 0.1264 0.3488 0. 0.0011 0.1015 0.0557 0.1011 0.0535 0.0103 0.0802 0.2531 0.3799 0.0155 0.3333 0.0874 0.2498 0.0003 0.1677 0.0051 0.063 0.0099] 2022-08-24 14:08:43 [INFO] [EVAL] Class Precision: [0.7384 0.8361 0.962 0.785 0.7408 0.7726 0.8292 0.8099 0.5994 0.7548 0.6688 0.7095 0.7408 0.4927 0.3629 0.5397 0.5664 0.7027 0.7061 0.4829 0.8208 0.6253 0.6771 0.5396 0.4932 0.6806 0.5885 0.7611 0.7017 0.429 0.3337 0.5551 0.5365 0.3316 0.5359 0.5648 0.5163 0.6987 0.397 0.4849 0.1697 0.1932 0.5286 0.4879 0.5395 0.4094 0.5497 0.5496 0.6416 0.5885 0.5107 0.3538 0.4224 0.5268 0.7246 0.5318 0.9123 0.5755 0.623 0.2845 0.1325 0.3363 0.5574 0.5393 0.4008 0.6919 0.3188 0.5296 0.2829 0.513 0.4886 0.5329 0.6452 0.3886 0.5039 0.6141 0.6498 0.3954 0.3318 0.2818 0.8219 0.486 0.7547 0.0501 0.6225 0.7375 0.3845 0.4213 0.4754 0.7724 0.4894 0.1431 0.3686 0.2798 0.0359 0.0258 0.4391 0.2353 0.0608 0.4417 0.4925 0.0219 0.6537 0.4286 0.0047 0.6689 0.1425 0.7887 0.1519 0.2013 0.0193 0.1839 0.23 0.7169 0.8189 0. 0.6638 0.5517 0.0902 0.4354 0.7279 0. 0.8965 0.6246 0.548 0.5677 0.7117 0.4089 1. 0.4587 0.367 0. 0.0121 0.6132 0.5493 0.2101 0.3801 0.1013 0.2664 0.7583 0.5206 0.0263 0.7408 0.2507 0.559 0.0004 0.8706 0.439 0.3704 0.4771] 2022-08-24 14:08:43 [INFO] [EVAL] Class Recall: [0.8081 0.8839 0.9527 0.8422 0.8348 0.8972 0.8775 0.8671 0.6558 0.7704 0.5261 0.6053 0.8361 0.4594 0.2947 0.4913 0.6658 0.4112 0.6635 0.5691 0.8488 0.6369 0.6908 0.6643 0.4387 0.5244 0.7647 0.3437 0.3726 0.401 0.1901 0.6324 0.2817 0.4702 0.5075 0.4405 0.4523 0.5966 0.3503 0.3763 0.1639 0.0527 0.3717 0.27 0.4078 0.3015 0.3759 0.5476 0.9265 0.6518 0.6741 0.4805 0.2367 0.1973 0.8953 0.6594 0.8962 0.3212 0.3866 0.2528 0.1759 0.2491 0.378 0.1588 0.6892 0.7729 0.2377 0.5702 0.1035 0.4218 0.3178 0.5981 0.4601 0.3404 0.5665 0.313 0.6113 0.2347 0.2176 0.4521 0.7302 0.3588 0.1746 0.0141 0.446 0.4618 0.0328 0.0491 0.3418 0.4676 0.5555 0.1364 0.3269 0.0822 0.0163 0.0001 0.1982 0.0214 0.04 0.3402 0.0683 0.0543 0.0907 0.8936 0.0001 0.4403 0.0865 0.4941 0.0569 0.0812 0.0006 0.4058 0.0217 0.7436 0.9813 0. 0.4882 0.8048 0.0301 0.4931 0.5007 0. 0.2491 0.1141 0.2682 0.0953 0.464 0.5063 0.001 0.1485 0.8759 0. 0.0012 0.1085 0.0583 0.1631 0.0586 0.0113 0.1028 0.2753 0.5843 0.0363 0.3773 0.1184 0.3111 0.0008 0.172 0.0052 0.0706 0.01 ] 2022-08-24 14:08:43 [INFO] [EVAL] The model with the best validation mIoU (0.2815) was saved at iter 33000. 
2022-08-24 14:08:57 [INFO] [TRAIN] epoch: 27, iter: 33050/160000, loss: 0.9417, lr: 0.000961, batch_cost: 0.2735, reader_cost: 0.00450, ips: 29.2518 samples/sec | ETA 09:38:39 2022-08-24 14:09:11 [INFO] [TRAIN] epoch: 27, iter: 33100/160000, loss: 0.9833, lr: 0.000961, batch_cost: 0.2782, reader_cost: 0.00158, ips: 28.7587 samples/sec | ETA 09:48:20 2022-08-24 14:09:23 [INFO] [TRAIN] epoch: 27, iter: 33150/160000, loss: 0.9581, lr: 0.000960, batch_cost: 0.2519, reader_cost: 0.00035, ips: 31.7557 samples/sec | ETA 08:52:36 2022-08-24 14:09:38 [INFO] [TRAIN] epoch: 27, iter: 33200/160000, loss: 0.9745, lr: 0.000960, batch_cost: 0.2965, reader_cost: 0.00922, ips: 26.9856 samples/sec | ETA 10:26:30 2022-08-24 14:09:50 [INFO] [TRAIN] epoch: 27, iter: 33250/160000, loss: 0.9624, lr: 0.000960, batch_cost: 0.2486, reader_cost: 0.00557, ips: 32.1743 samples/sec | ETA 08:45:15 2022-08-24 14:10:04 [INFO] [TRAIN] epoch: 27, iter: 33300/160000, loss: 0.9212, lr: 0.000959, batch_cost: 0.2685, reader_cost: 0.00426, ips: 29.7980 samples/sec | ETA 09:26:55 2022-08-24 14:10:17 [INFO] [TRAIN] epoch: 27, iter: 33350/160000, loss: 0.9437, lr: 0.000959, batch_cost: 0.2520, reader_cost: 0.00483, ips: 31.7418 samples/sec | ETA 08:52:00 2022-08-24 14:10:27 [INFO] [TRAIN] epoch: 27, iter: 33400/160000, loss: 0.9041, lr: 0.000958, batch_cost: 0.2177, reader_cost: 0.00102, ips: 36.7497 samples/sec | ETA 07:39:19 2022-08-24 14:10:37 [INFO] [TRAIN] epoch: 27, iter: 33450/160000, loss: 0.8886, lr: 0.000958, batch_cost: 0.1944, reader_cost: 0.00071, ips: 41.1472 samples/sec | ETA 06:50:04 2022-08-24 14:10:46 [INFO] [TRAIN] epoch: 27, iter: 33500/160000, loss: 0.9735, lr: 0.000958, batch_cost: 0.1849, reader_cost: 0.01836, ips: 43.2608 samples/sec | ETA 06:29:53 2022-08-24 14:10:57 [INFO] [TRAIN] epoch: 27, iter: 33550/160000, loss: 1.0121, lr: 0.000957, batch_cost: 0.2144, reader_cost: 0.01346, ips: 37.3192 samples/sec | ETA 07:31:46 2022-08-24 14:11:07 [INFO] [TRAIN] epoch: 27, iter: 33600/160000, loss: 0.9065, lr: 0.000957, batch_cost: 0.1971, reader_cost: 0.00205, ips: 40.5814 samples/sec | ETA 06:55:17 2022-08-24 14:11:17 [INFO] [TRAIN] epoch: 27, iter: 33650/160000, loss: 0.9714, lr: 0.000957, batch_cost: 0.1987, reader_cost: 0.00104, ips: 40.2527 samples/sec | ETA 06:58:31 2022-08-24 14:11:27 [INFO] [TRAIN] epoch: 27, iter: 33700/160000, loss: 0.9344, lr: 0.000956, batch_cost: 0.2014, reader_cost: 0.00570, ips: 39.7204 samples/sec | ETA 07:03:57 2022-08-24 14:11:37 [INFO] [TRAIN] epoch: 27, iter: 33750/160000, loss: 0.9385, lr: 0.000956, batch_cost: 0.2104, reader_cost: 0.00414, ips: 38.0253 samples/sec | ETA 07:22:41 2022-08-24 14:11:46 [INFO] [TRAIN] epoch: 27, iter: 33800/160000, loss: 0.8726, lr: 0.000955, batch_cost: 0.1711, reader_cost: 0.00041, ips: 46.7431 samples/sec | ETA 05:59:58 2022-08-24 14:11:54 [INFO] [TRAIN] epoch: 27, iter: 33850/160000, loss: 0.9097, lr: 0.000955, batch_cost: 0.1510, reader_cost: 0.00031, ips: 52.9883 samples/sec | ETA 05:17:25 2022-08-24 14:12:02 [INFO] [TRAIN] epoch: 27, iter: 33900/160000, loss: 0.9819, lr: 0.000955, batch_cost: 0.1622, reader_cost: 0.00071, ips: 49.3189 samples/sec | ETA 05:40:54 2022-08-24 14:12:10 [INFO] [TRAIN] epoch: 27, iter: 33950/160000, loss: 0.9517, lr: 0.000954, batch_cost: 0.1639, reader_cost: 0.00042, ips: 48.8222 samples/sec | ETA 05:44:14 2022-08-24 14:12:20 [INFO] [TRAIN] epoch: 27, iter: 34000/160000, loss: 0.9127, lr: 0.000954, batch_cost: 0.2014, reader_cost: 0.00036, ips: 39.7124 samples/sec | ETA 07:03:02 2022-08-24 14:12:20 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 211s - batch_cost: 0.2105 - reader cost: 7.0957e-04 2022-08-24 14:15:51 [INFO] [EVAL] #Images: 2000 mIoU: 0.2817 Acc: 0.7313 Kappa: 0.7100 Dice: 0.3996 2022-08-24 14:15:51 [INFO] [EVAL] Class IoU: [0.6343 0.7524 0.9223 0.6897 0.6543 0.7026 0.7496 0.7071 0.4536 0.6277 0.4186 0.5046 0.6406 0.2626 0.1391 0.3397 0.4212 0.3873 0.5114 0.3396 0.7035 0.4695 0.5295 0.3941 0.3145 0.4222 0.4446 0.3471 0.3186 0.2745 0.182 0.376 0.2587 0.2439 0.4076 0.386 0.3138 0.4766 0.2013 0.2643 0.0677 0.0564 0.2833 0.1953 0.2777 0.188 0.3027 0.3029 0.6599 0.4509 0.4385 0.2215 0.116 0.2294 0.6447 0.4349 0.7424 0.2198 0.3445 0.1997 0.0734 0.1908 0.2526 0.1106 0.3678 0.573 0.1626 0.3809 0.0108 0.2952 0.2337 0.3858 0.3885 0.2394 0.3681 0.2323 0.35 0.1781 0.1475 0.2116 0.6357 0.2349 0.251 0.0133 0.3763 0.4419 0.0299 0.0495 0.298 0.4219 0.3161 0.0501 0.2047 0.0342 0.0658 0.0013 0.1256 0.0467 0.0128 0.2273 0.0793 0.009 0.0586 0.4338 0.0096 0.3578 0.0684 0.4943 0.0565 0.1039 0.0346 0.3454 0.0126 0.5667 0.6616 0. 0.3429 0.5011 0.018 0.2265 0.4097 0. 0.2254 0.0564 0.2044 0.0627 0.3666 0.2907 0.0946 0.1679 0.5379 0. 0.0227 0.1127 0.0186 0.0664 0.0603 0.0033 0.0786 0.2829 0.2896 0.015 0.2381 0.2055 0.2692 0. 0.1943 0.0141 0.03 0.0109] 2022-08-24 14:15:51 [INFO] [EVAL] Class Precision: [0.725 0.8314 0.9616 0.7991 0.7581 0.8688 0.8665 0.771 0.5731 0.7394 0.6516 0.6305 0.7292 0.4571 0.4519 0.4924 0.5263 0.6392 0.698 0.5143 0.783 0.6286 0.7133 0.6332 0.4397 0.632 0.562 0.6936 0.7371 0.4505 0.3324 0.4717 0.4661 0.35 0.613 0.529 0.5985 0.6256 0.447 0.4933 0.1737 0.2029 0.6154 0.4389 0.3952 0.4115 0.5529 0.6869 0.6955 0.5747 0.6268 0.2597 0.3216 0.5843 0.6892 0.5681 0.77 0.5605 0.4541 0.4423 0.1725 0.323 0.5088 0.5752 0.4951 0.6527 0.3442 0.4961 0.3619 0.6682 0.6111 0.5094 0.6393 0.3914 0.6677 0.3426 0.4064 0.4211 0.3051 0.3335 0.7938 0.626 0.6764 0.0674 0.5573 0.7203 0.2719 0.4216 0.5851 0.7395 0.4039 0.0941 0.317 0.2357 0.1544 0.0127 0.2221 0.7891 0.0717 0.4209 0.4677 0.0126 0.665 0.7968 0.1553 0.541 0.2089 0.797 0.1779 0.2946 0.1631 0.8368 0.321 0.6796 0.6661 0.007 0.8572 0.583 0.1212 0.5317 0.7681 0. 0.9886 0.7791 0.5984 0.6115 0.7433 0.3757 0.9562 0.3984 0.5999 0. 0.1171 0.6956 0.5452 0.2847 0.4625 0.1166 0.3683 0.674 0.8156 0.0793 0.7695 0.6626 0.5962 0. 0.8916 0.2913 0.3595 0.3414] 2022-08-24 14:15:51 [INFO] [EVAL] Class Recall: [0.8353 0.8879 0.9576 0.8344 0.827 0.786 0.8475 0.8951 0.6852 0.8061 0.5393 0.7163 0.8406 0.3817 0.1673 0.5227 0.6783 0.4956 0.6566 0.5 0.874 0.6497 0.6727 0.5107 0.5247 0.5599 0.6803 0.4099 0.3594 0.4126 0.2868 0.6496 0.3677 0.4459 0.5488 0.5881 0.3975 0.6669 0.268 0.3629 0.1 0.0724 0.3442 0.2602 0.4831 0.2572 0.4007 0.3514 0.928 0.6766 0.5935 0.6008 0.1536 0.2742 0.909 0.6497 0.9538 0.2656 0.5879 0.2669 0.1133 0.3179 0.3341 0.1205 0.5886 0.8243 0.2355 0.6214 0.011 0.3459 0.2746 0.6137 0.4976 0.3814 0.4507 0.4191 0.7162 0.2359 0.2222 0.3668 0.7614 0.2732 0.2853 0.0164 0.5367 0.5334 0.0325 0.0531 0.3778 0.4955 0.5926 0.0969 0.3663 0.0384 0.1029 0.0014 0.2242 0.0473 0.0153 0.3306 0.0872 0.0301 0.0604 0.4877 0.0102 0.5137 0.0924 0.5655 0.0765 0.1383 0.0421 0.3703 0.013 0.7733 0.9899 0. 0.3636 0.781 0.0208 0.2829 0.4675 0. 0.226 0.0573 0.2368 0.0653 0.4197 0.5622 0.095 0.2249 0.839 0. 0.0274 0.1185 0.0189 0.0798 0.0648 0.0033 0.0908 0.3277 0.3099 0.0182 0.2563 0.2296 0.3292 0. 0.199 0.0146 0.0317 0.0112] 2022-08-24 14:15:51 [INFO] [EVAL] The model with the best validation mIoU (0.2817) was saved at iter 34000. 
2022-08-24 14:16:04 [INFO] [TRAIN] epoch: 27, iter: 34050/160000, loss: 0.9133, lr: 0.000954, batch_cost: 0.2523, reader_cost: 0.00602, ips: 31.7045 samples/sec | ETA 08:49:40 2022-08-24 14:16:15 [INFO] [TRAIN] epoch: 27, iter: 34100/160000, loss: 0.9465, lr: 0.000953, batch_cost: 0.2372, reader_cost: 0.00911, ips: 33.7308 samples/sec | ETA 08:17:39 2022-08-24 14:16:31 [INFO] [TRAIN] epoch: 28, iter: 34150/160000, loss: 0.8905, lr: 0.000953, batch_cost: 0.3166, reader_cost: 0.05659, ips: 25.2697 samples/sec | ETA 11:04:02 2022-08-24 14:16:45 [INFO] [TRAIN] epoch: 28, iter: 34200/160000, loss: 0.9630, lr: 0.000952, batch_cost: 0.2717, reader_cost: 0.00086, ips: 29.4483 samples/sec | ETA 09:29:35 2022-08-24 14:16:58 [INFO] [TRAIN] epoch: 28, iter: 34250/160000, loss: 0.8918, lr: 0.000952, batch_cost: 0.2722, reader_cost: 0.00104, ips: 29.3861 samples/sec | ETA 09:30:33 2022-08-24 14:17:10 [INFO] [TRAIN] epoch: 28, iter: 34300/160000, loss: 0.9875, lr: 0.000952, batch_cost: 0.2273, reader_cost: 0.00469, ips: 35.1916 samples/sec | ETA 07:56:14 2022-08-24 14:17:19 [INFO] [TRAIN] epoch: 28, iter: 34350/160000, loss: 0.8818, lr: 0.000951, batch_cost: 0.1809, reader_cost: 0.00953, ips: 44.2268 samples/sec | ETA 06:18:48 2022-08-24 14:17:30 [INFO] [TRAIN] epoch: 28, iter: 34400/160000, loss: 0.9046, lr: 0.000951, batch_cost: 0.2242, reader_cost: 0.00058, ips: 35.6840 samples/sec | ETA 07:49:18 2022-08-24 14:17:41 [INFO] [TRAIN] epoch: 28, iter: 34450/160000, loss: 0.9823, lr: 0.000951, batch_cost: 0.2078, reader_cost: 0.00069, ips: 38.4964 samples/sec | ETA 07:14:50 2022-08-24 14:17:51 [INFO] [TRAIN] epoch: 28, iter: 34500/160000, loss: 0.9507, lr: 0.000950, batch_cost: 0.2062, reader_cost: 0.00118, ips: 38.7981 samples/sec | ETA 07:11:17 2022-08-24 14:18:00 [INFO] [TRAIN] epoch: 28, iter: 34550/160000, loss: 0.9056, lr: 0.000950, batch_cost: 0.1872, reader_cost: 0.00194, ips: 42.7290 samples/sec | ETA 06:31:27 2022-08-24 14:18:10 [INFO] [TRAIN] epoch: 28, iter: 34600/160000, loss: 0.9415, lr: 0.000949, batch_cost: 0.1896, reader_cost: 0.00220, ips: 42.1930 samples/sec | ETA 06:36:16 2022-08-24 14:18:19 [INFO] [TRAIN] epoch: 28, iter: 34650/160000, loss: 0.9876, lr: 0.000949, batch_cost: 0.1844, reader_cost: 0.00543, ips: 43.3787 samples/sec | ETA 06:25:17 2022-08-24 14:18:28 [INFO] [TRAIN] epoch: 28, iter: 34700/160000, loss: 1.0144, lr: 0.000949, batch_cost: 0.1758, reader_cost: 0.00058, ips: 45.4966 samples/sec | ETA 06:07:12 2022-08-24 14:18:36 [INFO] [TRAIN] epoch: 28, iter: 34750/160000, loss: 0.9729, lr: 0.000948, batch_cost: 0.1746, reader_cost: 0.00040, ips: 45.8072 samples/sec | ETA 06:04:34 2022-08-24 14:18:45 [INFO] [TRAIN] epoch: 28, iter: 34800/160000, loss: 0.9524, lr: 0.000948, batch_cost: 0.1677, reader_cost: 0.00043, ips: 47.7115 samples/sec | ETA 05:49:52 2022-08-24 14:18:54 [INFO] [TRAIN] epoch: 28, iter: 34850/160000, loss: 0.9098, lr: 0.000948, batch_cost: 0.1895, reader_cost: 0.00055, ips: 42.2188 samples/sec | ETA 06:35:14 2022-08-24 14:19:02 [INFO] [TRAIN] epoch: 28, iter: 34900/160000, loss: 0.9930, lr: 0.000947, batch_cost: 0.1544, reader_cost: 0.00378, ips: 51.8174 samples/sec | ETA 05:21:53 2022-08-24 14:19:10 [INFO] [TRAIN] epoch: 28, iter: 34950/160000, loss: 1.0155, lr: 0.000947, batch_cost: 0.1565, reader_cost: 0.01076, ips: 51.1292 samples/sec | ETA 05:26:06 2022-08-24 14:19:18 [INFO] [TRAIN] epoch: 28, iter: 35000/160000, loss: 0.9450, lr: 0.000946, batch_cost: 0.1598, reader_cost: 0.00096, ips: 50.0514 samples/sec | ETA 05:32:59 2022-08-24 14:19:18 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 196s - batch_cost: 0.1959 - reader cost: 0.0011 2022-08-24 14:22:34 [INFO] [EVAL] #Images: 2000 mIoU: 0.2783 Acc: 0.7279 Kappa: 0.7065 Dice: 0.3967 2022-08-24 14:22:34 [INFO] [EVAL] Class IoU: [0.6297 0.7419 0.9241 0.6889 0.6472 0.7177 0.7444 0.7139 0.4645 0.6486 0.4166 0.4967 0.632 0.2513 0.1994 0.3559 0.4164 0.376 0.5163 0.3291 0.6909 0.4394 0.5143 0.4155 0.2852 0.3918 0.4254 0.3315 0.2995 0.2838 0.1791 0.3966 0.1889 0.2481 0.32 0.3744 0.3107 0.4137 0.2239 0.2616 0.0813 0.0894 0.2848 0.2016 0.2952 0.1557 0.2448 0.3713 0.6282 0.4455 0.4356 0.1562 0.1728 0.1352 0.6526 0.4086 0.8097 0.2361 0.1194 0.1521 0.0576 0.1947 0.3259 0.1117 0.3487 0.5897 0.1805 0.2712 0.096 0.3354 0.2956 0.3685 0.3627 0.1982 0.384 0.2492 0.4355 0.1683 0.1388 0.1555 0.5037 0.2412 0.2278 0.0105 0.4086 0.4249 0.0516 0.0623 0.2633 0.4134 0.2534 0.0036 0.1296 0.0305 0.0727 0.0026 0.103 0.1374 0.0597 0.2082 0.0889 0.0104 0.0563 0.2699 0.0105 0.3401 0.0845 0.4998 0.0375 0.0817 0.0222 0.2465 0.0114 0.458 0.7504 0.0008 0.3274 0.4875 0.0716 0.0444 0.4643 0. 0.325 0.0919 0.2244 0.0855 0.3732 0.2579 0.1816 0.1831 0.6204 0. 0.0007 0.1493 0.0717 0.0798 0.0756 0.0055 0.0743 0.2619 0.2756 0.0662 0.3228 0.3227 0.246 0.0036 0.1552 0.0168 0.0432 0.0162] 2022-08-24 14:22:34 [INFO] [EVAL] Class Precision: [0.7234 0.8277 0.967 0.7946 0.7296 0.8611 0.863 0.7843 0.6077 0.7719 0.6393 0.6394 0.713 0.5361 0.4023 0.5614 0.5531 0.638 0.6676 0.5272 0.7688 0.6641 0.7089 0.5677 0.5202 0.5359 0.5568 0.7388 0.7143 0.4123 0.3417 0.4868 0.588 0.3814 0.5172 0.4769 0.6026 0.767 0.4205 0.501 0.1724 0.2052 0.4637 0.5427 0.5264 0.4079 0.3607 0.6068 0.6543 0.5685 0.6167 0.1718 0.3281 0.624 0.695 0.5362 0.8848 0.525 0.5871 0.2436 0.1424 0.2967 0.4291 0.7167 0.4187 0.7018 0.3595 0.5926 0.2021 0.7532 0.4917 0.5053 0.6522 0.2259 0.6442 0.407 0.648 0.549 0.3407 0.2848 0.8856 0.5583 0.6732 0.026 0.5099 0.679 0.2483 0.3779 0.6902 0.6476 0.3144 0.1023 0.3469 0.2022 0.1693 0.0115 0.3062 0.277 0.1277 0.483 0.3211 0.0146 0.6613 0.4552 0.126 0.4866 0.3835 0.8262 0.1467 0.1452 0.3003 0.4597 0.404 0.478 0.7603 0.0383 0.8443 0.5658 0.1843 0.6278 0.7156 0. 0.7416 0.6768 0.6079 0.5551 0.6896 0.3448 0.362 0.3909 0.765 0. 0.009 0.6058 0.501 0.2508 0.3101 0.1213 0.354 0.732 0.4092 0.1582 0.6621 0.7166 0.5336 0.0055 0.9028 0.2737 0.3162 0.295 ] 2022-08-24 14:22:34 [INFO] [EVAL] Class Recall: [0.8294 0.8774 0.9542 0.8382 0.8515 0.8117 0.8442 0.8884 0.6634 0.8024 0.5446 0.69 0.8476 0.3211 0.2833 0.4929 0.6275 0.478 0.6949 0.467 0.8722 0.5649 0.6519 0.6078 0.3871 0.5931 0.6432 0.3755 0.3403 0.4766 0.2735 0.6816 0.2178 0.4152 0.4562 0.6353 0.3908 0.4731 0.3238 0.3538 0.1333 0.1368 0.4247 0.2428 0.402 0.2012 0.4324 0.4889 0.9402 0.673 0.5972 0.6326 0.2676 0.1472 0.9145 0.6319 0.905 0.3002 0.1303 0.2881 0.0883 0.3617 0.5754 0.1169 0.676 0.7868 0.2661 0.3334 0.1545 0.3768 0.4256 0.5764 0.4497 0.6177 0.4874 0.3912 0.5704 0.1953 0.1898 0.2551 0.5388 0.2981 0.2561 0.0173 0.6728 0.5316 0.0612 0.0694 0.2986 0.5335 0.5662 0.0037 0.1714 0.0346 0.1129 0.0034 0.1344 0.2142 0.1009 0.2679 0.1095 0.0353 0.058 0.3986 0.0113 0.5303 0.0978 0.5586 0.048 0.1574 0.0234 0.347 0.0116 0.9164 0.9829 0.0008 0.3484 0.7788 0.1047 0.0456 0.5693 0. 0.3666 0.0961 0.2624 0.0918 0.4486 0.5059 0.2672 0.2562 0.7665 0. 0.0008 0.1654 0.0772 0.1048 0.0909 0.0057 0.0859 0.2897 0.4577 0.1022 0.3865 0.3699 0.3133 0.0101 0.1578 0.0175 0.0477 0.0169] 2022-08-24 14:22:34 [INFO] [EVAL] The model with the best validation mIoU (0.2817) was saved at iter 34000. 
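The trailing line of each EVAL block reports the best checkpoint so far, not the current one: the eval at iter 35000 scores mIoU 0.2783, yet the message still points to the 0.2817 model saved at iter 34000. The bookkeeping amounts to the following sketch (variable names are hypothetical; this is not PaddleSeg's train/evaluate code):

# Illustrative sketch of the best-mIoU bookkeeping implied by the EVAL lines above.
best_miou, best_iter = 0.2793, 31000   # carried in from iter 31000, per the first EVAL block
evals = [(32000, 0.2779), (33000, 0.2815), (34000, 0.2817), (35000, 0.2783)]

for cur_iter, miou in evals:
    if miou > best_miou:               # checkpoint only when validation mIoU improves
        best_miou, best_iter = miou, cur_iter
        # save_checkpoint(model, best_iter)  # hypothetical save hook
    print(f"The model with the best validation mIoU ({best_miou:.4f}) was saved at iter {best_iter}.")

Running the sketch reproduces the four trailing messages above (0.2793 at 31000, 0.2815 at 33000, then 0.2817 at 34000 twice).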
2022-08-24 14:22:47 [INFO] [TRAIN] epoch: 28, iter: 35050/160000, loss: 0.8949, lr: 0.000946, batch_cost: 0.2617, reader_cost: 0.01206, ips: 30.5704 samples/sec | ETA 09:04:58 2022-08-24 14:23:00 [INFO] [TRAIN] epoch: 28, iter: 35100/160000, loss: 0.9362, lr: 0.000946, batch_cost: 0.2449, reader_cost: 0.00902, ips: 32.6652 samples/sec | ETA 08:29:49 2022-08-24 14:23:13 [INFO] [TRAIN] epoch: 28, iter: 35150/160000, loss: 0.9811, lr: 0.000945, batch_cost: 0.2698, reader_cost: 0.01403, ips: 29.6512 samples/sec | ETA 09:21:24 2022-08-24 14:23:26 [INFO] [TRAIN] epoch: 28, iter: 35200/160000, loss: 0.9784, lr: 0.000945, batch_cost: 0.2527, reader_cost: 0.00124, ips: 31.6612 samples/sec | ETA 08:45:33 2022-08-24 14:23:38 [INFO] [TRAIN] epoch: 28, iter: 35250/160000, loss: 0.9540, lr: 0.000944, batch_cost: 0.2467, reader_cost: 0.00043, ips: 32.4221 samples/sec | ETA 08:33:01 2022-08-24 14:23:52 [INFO] [TRAIN] epoch: 28, iter: 35300/160000, loss: 0.8907, lr: 0.000944, batch_cost: 0.2728, reader_cost: 0.00387, ips: 29.3254 samples/sec | ETA 09:26:58 2022-08-24 14:24:04 [INFO] [TRAIN] epoch: 28, iter: 35350/160000, loss: 0.9251, lr: 0.000944, batch_cost: 0.2514, reader_cost: 0.00046, ips: 31.8164 samples/sec | ETA 08:42:22 2022-08-24 14:24:18 [INFO] [TRAIN] epoch: 29, iter: 35400/160000, loss: 1.0063, lr: 0.000943, batch_cost: 0.2783, reader_cost: 0.06588, ips: 28.7434 samples/sec | ETA 09:37:59 2022-08-24 14:24:29 [INFO] [TRAIN] epoch: 29, iter: 35450/160000, loss: 0.8876, lr: 0.000943, batch_cost: 0.2078, reader_cost: 0.01117, ips: 38.5040 samples/sec | ETA 07:11:17 2022-08-24 14:24:38 [INFO] [TRAIN] epoch: 29, iter: 35500/160000, loss: 0.9390, lr: 0.000943, batch_cost: 0.1982, reader_cost: 0.00073, ips: 40.3631 samples/sec | ETA 06:51:15 2022-08-24 14:24:49 [INFO] [TRAIN] epoch: 29, iter: 35550/160000, loss: 0.8754, lr: 0.000942, batch_cost: 0.2050, reader_cost: 0.00364, ips: 39.0228 samples/sec | ETA 07:05:13 2022-08-24 14:24:59 [INFO] [TRAIN] epoch: 29, iter: 35600/160000, loss: 0.9455, lr: 0.000942, batch_cost: 0.2073, reader_cost: 0.00036, ips: 38.5840 samples/sec | ETA 07:09:53 2022-08-24 14:25:10 [INFO] [TRAIN] epoch: 29, iter: 35650/160000, loss: 0.9721, lr: 0.000941, batch_cost: 0.2253, reader_cost: 0.00062, ips: 35.5064 samples/sec | ETA 07:46:57 2022-08-24 14:25:19 [INFO] [TRAIN] epoch: 29, iter: 35700/160000, loss: 0.8931, lr: 0.000941, batch_cost: 0.1735, reader_cost: 0.00063, ips: 46.1171 samples/sec | ETA 05:59:22 2022-08-24 14:25:28 [INFO] [TRAIN] epoch: 29, iter: 35750/160000, loss: 0.9243, lr: 0.000941, batch_cost: 0.1736, reader_cost: 0.00039, ips: 46.0852 samples/sec | ETA 05:59:28 2022-08-24 14:25:38 [INFO] [TRAIN] epoch: 29, iter: 35800/160000, loss: 0.9450, lr: 0.000940, batch_cost: 0.2026, reader_cost: 0.00063, ips: 39.4781 samples/sec | ETA 06:59:28 2022-08-24 14:25:47 [INFO] [TRAIN] epoch: 29, iter: 35850/160000, loss: 0.8606, lr: 0.000940, batch_cost: 0.1936, reader_cost: 0.00054, ips: 41.3310 samples/sec | ETA 06:40:30 2022-08-24 14:25:56 [INFO] [TRAIN] epoch: 29, iter: 35900/160000, loss: 0.9038, lr: 0.000940, batch_cost: 0.1727, reader_cost: 0.00299, ips: 46.3152 samples/sec | ETA 05:57:15 2022-08-24 14:26:04 [INFO] [TRAIN] epoch: 29, iter: 35950/160000, loss: 0.9425, lr: 0.000939, batch_cost: 0.1632, reader_cost: 0.00080, ips: 49.0057 samples/sec | ETA 05:37:30 2022-08-24 14:26:12 [INFO] [TRAIN] epoch: 29, iter: 36000/160000, loss: 0.9723, lr: 0.000939, batch_cost: 0.1505, reader_cost: 0.00284, ips: 53.1389 samples/sec | ETA 05:11:08 2022-08-24 14:26:12 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 191s - batch_cost: 0.1907 - reader cost: 7.4077e-04 2022-08-24 14:29:23 [INFO] [EVAL] #Images: 2000 mIoU: 0.2792 Acc: 0.7314 Kappa: 0.7105 Dice: 0.3957 2022-08-24 14:29:23 [INFO] [EVAL] Class IoU: [0.6357 0.7514 0.9208 0.6846 0.6562 0.7247 0.7372 0.7147 0.4595 0.6285 0.4246 0.4956 0.6426 0.2816 0.1681 0.3593 0.4612 0.3999 0.5039 0.3467 0.7119 0.4422 0.5217 0.4193 0.3159 0.4035 0.4323 0.3508 0.3965 0.2752 0.201 0.4082 0.2143 0.2206 0.3154 0.366 0.3224 0.4417 0.2288 0.2709 0.0769 0.0649 0.2722 0.1976 0.2491 0.1941 0.2259 0.3656 0.6575 0.4042 0.3998 0.2287 0.186 0.1482 0.6645 0.3184 0.813 0.2618 0.3207 0.1758 0.055 0.1613 0.3313 0.0875 0.3599 0.6032 0.1664 0.338 0.0336 0.3576 0.2806 0.361 0.3501 0.2333 0.396 0.2643 0.4584 0.0955 0.0245 0.1303 0.614 0.2361 0.2002 0.0133 0.333 0.4373 0.0733 0.0386 0.2795 0.3978 0.2923 0.0752 0.1528 0.0523 0.0357 0.0039 0.1341 0.0012 0.1113 0.2485 0.0763 0.0042 0.0459 0.5025 0.0072 0.3514 0.0409 0.46 0.0332 0.2995 0.0042 0.1202 0.0136 0.52 0.7152 0.0012 0.3749 0.4541 0.0691 0.3541 0.3572 0. 0.3197 0.1234 0.2198 0.0746 0.4193 0.299 0.0772 0.0621 0.4881 0. 0.0213 0.1192 0.0605 0.0505 0.0631 0.0064 0.0868 0.3088 0.1745 0.0319 0.2514 0.0408 0.2313 0. 0.1734 0.0177 0.0337 0.0238] 2022-08-24 14:29:23 [INFO] [EVAL] Class Precision: [0.7358 0.83 0.9613 0.7959 0.7581 0.854 0.8629 0.7905 0.5995 0.7622 0.6154 0.6371 0.7394 0.4897 0.4776 0.5344 0.6006 0.6561 0.6755 0.5034 0.7996 0.6167 0.6962 0.5865 0.4563 0.6032 0.5701 0.6609 0.6029 0.4732 0.3244 0.5378 0.4905 0.4272 0.4173 0.5038 0.6008 0.5985 0.4032 0.4681 0.24 0.2353 0.5206 0.5325 0.358 0.3341 0.3358 0.6985 0.7103 0.4879 0.5576 0.2826 0.3667 0.5745 0.6995 0.3754 0.8724 0.5559 0.6562 0.3 0.1146 0.2636 0.4226 0.6697 0.4676 0.7259 0.2599 0.5408 0.3424 0.7205 0.4663 0.5209 0.6097 0.3463 0.5613 0.4915 0.6503 0.4884 0.6168 0.4544 0.7239 0.5866 0.7255 0.0864 0.3884 0.644 0.1699 0.4773 0.589 0.5785 0.3762 0.6374 0.2537 0.2431 0.0812 0.0192 0.2651 0.9104 0.3378 0.5775 0.3673 0.0055 0.6731 0.6275 0.051 0.5461 0.1743 0.8574 0.1125 0.3568 0.1962 0.1462 0.1826 0.5734 0.7202 0.2936 0.8273 0.592 0.142 0.5708 0.7785 0. 0.9464 0.6604 0.5908 0.5863 0.7821 0.4177 0.2443 0.6332 0.5379 0. 0.1823 0.6087 0.5779 0.2922 0.4337 0.1767 0.2893 0.6634 0.4347 0.049 0.6225 0.506 0.7238 0. 0.8379 0.445 0.3063 0.4614] 2022-08-24 14:29:23 [INFO] [EVAL] Class Recall: [0.8237 0.8881 0.9563 0.8303 0.8299 0.8272 0.835 0.8817 0.663 0.7818 0.5779 0.6906 0.8308 0.3987 0.2059 0.523 0.6653 0.5059 0.6649 0.5269 0.8665 0.6098 0.6755 0.5954 0.5066 0.5493 0.6414 0.4278 0.5367 0.3967 0.3458 0.6287 0.2757 0.3132 0.5636 0.5724 0.4104 0.6277 0.346 0.3913 0.1016 0.0822 0.3633 0.2391 0.4502 0.3165 0.4083 0.4341 0.8984 0.7019 0.5856 0.5454 0.2741 0.1664 0.9299 0.6769 0.9228 0.331 0.3855 0.2982 0.0956 0.2936 0.6053 0.0915 0.6098 0.7811 0.3162 0.4741 0.0359 0.4151 0.4134 0.5405 0.4512 0.4168 0.5735 0.3638 0.6084 0.1061 0.0249 0.1545 0.8018 0.2832 0.2166 0.0155 0.7001 0.5767 0.1142 0.0403 0.3472 0.5602 0.5673 0.0786 0.2776 0.0624 0.0598 0.0049 0.2134 0.0012 0.1424 0.3037 0.0878 0.0178 0.0469 0.7162 0.0083 0.4964 0.0508 0.4981 0.0449 0.6507 0.0043 0.4031 0.0144 0.8481 0.9905 0.0012 0.4068 0.6608 0.1186 0.4825 0.3976 0. 0.3256 0.1318 0.2592 0.0788 0.4748 0.5126 0.1014 0.0644 0.8406 0. 0.0235 0.1291 0.0633 0.0575 0.0688 0.0066 0.1103 0.3661 0.2257 0.084 0.2966 0.0425 0.2536 0. 0.1794 0.0181 0.0365 0.0245] 2022-08-24 14:29:23 [INFO] [EVAL] The model with the best validation mIoU (0.2817) was saved at iter 34000. 
2022-08-24 14:29:36 [INFO] [TRAIN] epoch: 29, iter: 36050/160000, loss: 0.9784, lr: 0.000938, batch_cost: 0.2582, reader_cost: 0.01112, ips: 30.9886 samples/sec | ETA 08:53:18 2022-08-24 14:29:49 [INFO] [TRAIN] epoch: 29, iter: 36100/160000, loss: 0.8596, lr: 0.000938, batch_cost: 0.2519, reader_cost: 0.00076, ips: 31.7628 samples/sec | ETA 08:40:06 2022-08-24 14:30:02 [INFO] [TRAIN] epoch: 29, iter: 36150/160000, loss: 0.9639, lr: 0.000938, batch_cost: 0.2608, reader_cost: 0.01614, ips: 30.6746 samples/sec | ETA 08:58:20 2022-08-24 14:30:15 [INFO] [TRAIN] epoch: 29, iter: 36200/160000, loss: 0.8790, lr: 0.000937, batch_cost: 0.2750, reader_cost: 0.01539, ips: 29.0912 samples/sec | ETA 09:27:24 2022-08-24 14:30:29 [INFO] [TRAIN] epoch: 29, iter: 36250/160000, loss: 0.9425, lr: 0.000937, batch_cost: 0.2659, reader_cost: 0.00080, ips: 30.0820 samples/sec | ETA 09:08:30 2022-08-24 14:30:41 [INFO] [TRAIN] epoch: 29, iter: 36300/160000, loss: 0.9064, lr: 0.000937, batch_cost: 0.2420, reader_cost: 0.01880, ips: 33.0591 samples/sec | ETA 08:18:54 2022-08-24 14:30:54 [INFO] [TRAIN] epoch: 29, iter: 36350/160000, loss: 0.8974, lr: 0.000936, batch_cost: 0.2658, reader_cost: 0.01134, ips: 30.1026 samples/sec | ETA 09:07:40 2022-08-24 14:31:06 [INFO] [TRAIN] epoch: 29, iter: 36400/160000, loss: 0.9532, lr: 0.000936, batch_cost: 0.2420, reader_cost: 0.00919, ips: 33.0541 samples/sec | ETA 08:18:34 2022-08-24 14:31:16 [INFO] [TRAIN] epoch: 29, iter: 36450/160000, loss: 0.9132, lr: 0.000935, batch_cost: 0.2071, reader_cost: 0.00253, ips: 38.6327 samples/sec | ETA 07:06:24 2022-08-24 14:31:27 [INFO] [TRAIN] epoch: 29, iter: 36500/160000, loss: 1.0220, lr: 0.000935, batch_cost: 0.2187, reader_cost: 0.00074, ips: 36.5797 samples/sec | ETA 07:30:09 2022-08-24 14:31:37 [INFO] [TRAIN] epoch: 29, iter: 36550/160000, loss: 0.9908, lr: 0.000935, batch_cost: 0.1890, reader_cost: 0.00639, ips: 42.3280 samples/sec | ETA 06:28:52 2022-08-24 14:31:47 [INFO] [TRAIN] epoch: 29, iter: 36600/160000, loss: 0.9142, lr: 0.000934, batch_cost: 0.1986, reader_cost: 0.00313, ips: 40.2887 samples/sec | ETA 06:48:23 2022-08-24 14:32:00 [INFO] [TRAIN] epoch: 30, iter: 36650/160000, loss: 0.9539, lr: 0.000934, batch_cost: 0.2719, reader_cost: 0.05816, ips: 29.4279 samples/sec | ETA 09:18:52 2022-08-24 14:32:09 [INFO] [TRAIN] epoch: 30, iter: 36700/160000, loss: 0.8993, lr: 0.000934, batch_cost: 0.1807, reader_cost: 0.01592, ips: 44.2786 samples/sec | ETA 06:11:17 2022-08-24 14:32:19 [INFO] [TRAIN] epoch: 30, iter: 36750/160000, loss: 0.9438, lr: 0.000933, batch_cost: 0.1946, reader_cost: 0.00031, ips: 41.1138 samples/sec | ETA 06:39:42 2022-08-24 14:32:28 [INFO] [TRAIN] epoch: 30, iter: 36800/160000, loss: 0.9134, lr: 0.000933, batch_cost: 0.1741, reader_cost: 0.00067, ips: 45.9410 samples/sec | ETA 05:57:33 2022-08-24 14:32:36 [INFO] [TRAIN] epoch: 30, iter: 36850/160000, loss: 0.9243, lr: 0.000932, batch_cost: 0.1666, reader_cost: 0.00232, ips: 48.0119 samples/sec | ETA 05:41:59 2022-08-24 14:32:44 [INFO] [TRAIN] epoch: 30, iter: 36900/160000, loss: 0.9001, lr: 0.000932, batch_cost: 0.1613, reader_cost: 0.00237, ips: 49.5895 samples/sec | ETA 05:30:59 2022-08-24 14:32:52 [INFO] [TRAIN] epoch: 30, iter: 36950/160000, loss: 0.9244, lr: 0.000932, batch_cost: 0.1566, reader_cost: 0.00289, ips: 51.0905 samples/sec | ETA 05:21:07 2022-08-24 14:33:00 [INFO] [TRAIN] epoch: 30, iter: 37000/160000, loss: 0.9172, lr: 0.000931, batch_cost: 0.1623, reader_cost: 0.01408, ips: 49.2867 samples/sec | ETA 05:32:44 2022-08-24 14:33:00 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 208s - batch_cost: 0.2082 - reader cost: 6.6929e-04 2022-08-24 14:36:29 [INFO] [EVAL] #Images: 2000 mIoU: 0.2863 Acc: 0.7333 Kappa: 0.7124 Dice: 0.4049 2022-08-24 14:36:29 [INFO] [EVAL] Class IoU: [0.6404 0.7637 0.9156 0.6891 0.6449 0.7214 0.7466 0.7152 0.4581 0.6121 0.4297 0.5149 0.6497 0.2457 0.1939 0.3463 0.4633 0.4066 0.5239 0.3262 0.7139 0.4335 0.5417 0.4325 0.2924 0.3577 0.4363 0.3132 0.266 0.278 0.1329 0.3872 0.246 0.2445 0.3392 0.3656 0.3255 0.4592 0.2248 0.2079 0.0961 0.0642 0.2832 0.1969 0.2711 0.1517 0.215 0.373 0.6001 0.3729 0.4106 0.2385 0.1571 0.1514 0.6408 0.4321 0.7946 0.2949 0.3841 0.1746 0.0559 0.1293 0.2632 0.0995 0.3124 0.6078 0.1488 0.3468 0.0228 0.3131 0.2846 0.3576 0.3638 0.2138 0.3718 0.251 0.3056 0.1854 0.1679 0.3118 0.6451 0.2419 0.1759 0.0133 0.3844 0.4445 0.0509 0.0437 0.2317 0.3257 0.3259 0.0053 0.2192 0.0389 0.0489 0.0079 0.1412 0.039 0.0015 0.271 0.0607 0.0145 0.1368 0.627 0.0017 0.4518 0.0864 0.4985 0.0554 0.2255 0.0262 0.1086 0.0259 0.5502 0.7693 0.0009 0.4058 0.5162 0.0601 0.1728 0.4275 0. 0.3425 0.0879 0.2006 0.0893 0.4251 0.2832 0.3311 0.1427 0.5833 0. 0.1069 0.1046 0.0595 0.0608 0.0391 0.0094 0.0842 0.2978 0.238 0.0045 0.2803 0.3111 0.2748 0.0006 0.2042 0.0172 0.0524 0.0079] 2022-08-24 14:36:29 [INFO] [EVAL] Class Precision: [0.7401 0.8365 0.9434 0.7843 0.7579 0.841 0.8715 0.7957 0.5981 0.7356 0.6657 0.6285 0.7464 0.522 0.4338 0.4833 0.5942 0.6136 0.6705 0.5394 0.8136 0.6384 0.7835 0.5563 0.4521 0.5507 0.5374 0.7005 0.766 0.3866 0.3771 0.5121 0.4554 0.3523 0.5661 0.5213 0.5701 0.6948 0.4676 0.5541 0.2526 0.1808 0.5716 0.4849 0.4241 0.4045 0.3703 0.5589 0.7015 0.4367 0.6426 0.2813 0.4219 0.6278 0.6939 0.5567 0.8428 0.544 0.6206 0.3156 0.1407 0.2949 0.3291 0.6878 0.3732 0.7357 0.2833 0.5098 0.212 0.601 0.5331 0.4257 0.609 0.3128 0.5116 0.4084 0.5854 0.5399 0.5879 0.4241 0.7724 0.5996 0.7499 0.0663 0.5153 0.6204 0.3357 0.433 0.557 0.4265 0.4515 0.0276 0.3743 0.2633 0.1428 0.0197 0.2503 0.8128 0.0483 0.4542 0.7432 0.0201 0.4828 0.6939 0.06 0.6179 0.3222 0.7827 0.2275 0.3202 0.1376 0.1293 0.3359 0.7045 0.7777 0.163 0.8414 0.583 0.163 0.4204 0.698 0. 0.807 0.6982 0.5023 0.6253 0.681 0.3817 0.4572 0.37 0.7063 0. 0.3368 0.6079 0.6948 0.2494 0.4233 0.3941 0.2602 0.6619 0.2898 0.0103 0.6088 0.7679 0.5136 0.0009 0.8275 0.4003 0.2277 0.8187] 2022-08-24 14:36:29 [INFO] [EVAL] Class Recall: [0.8263 0.8976 0.9688 0.8502 0.8122 0.8353 0.839 0.8761 0.6619 0.7847 0.548 0.7403 0.8338 0.317 0.2596 0.55 0.6778 0.5465 0.7056 0.4522 0.8535 0.5746 0.637 0.6603 0.453 0.5051 0.6988 0.3616 0.2895 0.4974 0.1703 0.6135 0.3487 0.4442 0.4583 0.5503 0.4314 0.5752 0.3021 0.2497 0.1344 0.0904 0.3595 0.249 0.429 0.1953 0.3388 0.5286 0.806 0.7185 0.5322 0.6109 0.2002 0.1663 0.8933 0.6587 0.9328 0.3917 0.502 0.2809 0.0849 0.1871 0.5678 0.1042 0.6576 0.7776 0.2387 0.5203 0.025 0.3953 0.3792 0.6911 0.4748 0.4031 0.5764 0.3945 0.39 0.2202 0.1903 0.5409 0.7966 0.2885 0.1868 0.0163 0.602 0.6106 0.0566 0.0463 0.2841 0.5796 0.5395 0.0065 0.3459 0.0437 0.0691 0.0131 0.2446 0.0393 0.0015 0.402 0.062 0.0492 0.1603 0.8668 0.0018 0.6269 0.1055 0.5786 0.0682 0.4324 0.0313 0.4045 0.0273 0.7153 0.9861 0.0009 0.4394 0.8185 0.0869 0.2268 0.5246 0. 0.373 0.0913 0.2504 0.0944 0.5308 0.523 0.5455 0.1885 0.7701 0. 0.1354 0.1122 0.0611 0.0744 0.0413 0.0095 0.1106 0.3513 0.5711 0.0079 0.3419 0.3434 0.3715 0.0024 0.2133 0.0176 0.0637 0.0079] 2022-08-24 14:36:29 [INFO] [EVAL] The model with the best validation mIoU (0.2863) was saved at iter 37000. 
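For completeness, Acc and Kappa in these EVAL summaries follow from the aggregated pixel confusion matrix. The sketch below uses the standard definitions (overall accuracy and Cohen's kappa); it is shown for reference and is an assumption about, not an excerpt of, PaddleSeg's metric code:

# How Acc and Kappa derive from a pixel confusion matrix C under the standard
# definitions (shown for reference; not copied from PaddleSeg's source).
import numpy as np

def acc_and_kappa(conf):
    C = np.asarray(conf, dtype=float)
    n = C.sum()
    po = np.trace(C) / n                                  # overall pixel accuracy (Acc)
    pe = (C.sum(axis=0) * C.sum(axis=1)).sum() / (n * n)  # chance agreement from marginals
    return po, (po - pe) / (1 - pe)                       # Cohen's kappa

# Tiny 2-class example:
print(acc_and_kappa([[70, 10], [5, 15]]))                 # (0.85, ~0.571)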
2022-08-24 14:36:41 [INFO] [TRAIN] epoch: 30, iter: 37050/160000, loss: 1.0035, lr: 0.000931, batch_cost: 0.2504, reader_cost: 0.00258, ips: 31.9523 samples/sec | ETA 08:33:03 2022-08-24 14:36:55 [INFO] [TRAIN] epoch: 30, iter: 37100/160000, loss: 0.9692, lr: 0.000930, batch_cost: 0.2687, reader_cost: 0.02880, ips: 29.7729 samples/sec | ETA 09:10:23 2022-08-24 14:37:08 [INFO] [TRAIN] epoch: 30, iter: 37150/160000, loss: 0.8790, lr: 0.000930, batch_cost: 0.2540, reader_cost: 0.00435, ips: 31.4997 samples/sec | ETA 08:40:00 2022-08-24 14:37:20 [INFO] [TRAIN] epoch: 30, iter: 37200/160000, loss: 0.9534, lr: 0.000930, batch_cost: 0.2565, reader_cost: 0.01585, ips: 31.1920 samples/sec | ETA 08:44:55 2022-08-24 14:37:34 [INFO] [TRAIN] epoch: 30, iter: 37250/160000, loss: 0.9205, lr: 0.000929, batch_cost: 0.2658, reader_cost: 0.00100, ips: 30.0986 samples/sec | ETA 09:03:46 2022-08-24 14:37:47 [INFO] [TRAIN] epoch: 30, iter: 37300/160000, loss: 0.9412, lr: 0.000929, batch_cost: 0.2666, reader_cost: 0.00225, ips: 30.0040 samples/sec | ETA 09:05:15 2022-08-24 14:37:59 [INFO] [TRAIN] epoch: 30, iter: 37350/160000, loss: 0.9225, lr: 0.000929, batch_cost: 0.2465, reader_cost: 0.00038, ips: 32.4531 samples/sec | ETA 08:23:54 2022-08-24 14:38:11 [INFO] [TRAIN] epoch: 30, iter: 37400/160000, loss: 0.8973, lr: 0.000928, batch_cost: 0.2261, reader_cost: 0.00058, ips: 35.3753 samples/sec | ETA 07:42:05 2022-08-24 14:38:21 [INFO] [TRAIN] epoch: 30, iter: 37450/160000, loss: 0.8919, lr: 0.000928, batch_cost: 0.2151, reader_cost: 0.00047, ips: 37.1869 samples/sec | ETA 07:19:24 2022-08-24 14:38:32 [INFO] [TRAIN] epoch: 30, iter: 37500/160000, loss: 0.9516, lr: 0.000927, batch_cost: 0.2151, reader_cost: 0.00033, ips: 37.1921 samples/sec | ETA 07:19:09 2022-08-24 14:38:41 [INFO] [TRAIN] epoch: 30, iter: 37550/160000, loss: 0.9475, lr: 0.000927, batch_cost: 0.1860, reader_cost: 0.00053, ips: 43.0193 samples/sec | ETA 06:19:31 2022-08-24 14:38:50 [INFO] [TRAIN] epoch: 30, iter: 37600/160000, loss: 0.8872, lr: 0.000927, batch_cost: 0.1677, reader_cost: 0.00335, ips: 47.6949 samples/sec | ETA 05:42:10 2022-08-24 14:38:57 [INFO] [TRAIN] epoch: 30, iter: 37650/160000, loss: 0.9261, lr: 0.000926, batch_cost: 0.1522, reader_cost: 0.00764, ips: 52.5703 samples/sec | ETA 05:10:18 2022-08-24 14:39:06 [INFO] [TRAIN] epoch: 30, iter: 37700/160000, loss: 0.8512, lr: 0.000926, batch_cost: 0.1641, reader_cost: 0.00048, ips: 48.7435 samples/sec | ETA 05:34:32 2022-08-24 14:39:14 [INFO] [TRAIN] epoch: 30, iter: 37750/160000, loss: 0.9162, lr: 0.000926, batch_cost: 0.1624, reader_cost: 0.00072, ips: 49.2718 samples/sec | ETA 05:30:49 2022-08-24 14:39:21 [INFO] [TRAIN] epoch: 30, iter: 37800/160000, loss: 0.8796, lr: 0.000925, batch_cost: 0.1514, reader_cost: 0.00342, ips: 52.8477 samples/sec | ETA 05:08:18 2022-08-24 14:39:30 [INFO] [TRAIN] epoch: 30, iter: 37850/160000, loss: 0.9701, lr: 0.000925, batch_cost: 0.1709, reader_cost: 0.00104, ips: 46.8215 samples/sec | ETA 05:47:50 2022-08-24 14:39:41 [INFO] [TRAIN] epoch: 31, iter: 37900/160000, loss: 0.8920, lr: 0.000924, batch_cost: 0.2189, reader_cost: 0.02589, ips: 36.5533 samples/sec | ETA 07:25:22 2022-08-24 14:39:50 [INFO] [TRAIN] epoch: 31, iter: 37950/160000, loss: 0.8879, lr: 0.000924, batch_cost: 0.1822, reader_cost: 0.00055, ips: 43.9120 samples/sec | ETA 06:10:35 2022-08-24 14:39:58 [INFO] [TRAIN] epoch: 31, iter: 38000/160000, loss: 0.9032, lr: 0.000924, batch_cost: 0.1704, reader_cost: 0.00064, ips: 46.9396 samples/sec | ETA 05:46:32 2022-08-24 14:39:58 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 179s - batch_cost: 0.1793 - reader cost: 5.4291e-04 2022-08-24 14:42:58 [INFO] [EVAL] #Images: 2000 mIoU: 0.2813 Acc: 0.7323 Kappa: 0.7111 Dice: 0.3988 2022-08-24 14:42:58 [INFO] [EVAL] Class IoU: [0.6375 0.7592 0.921 0.6825 0.6472 0.7252 0.7488 0.7159 0.454 0.6284 0.4276 0.5089 0.6465 0.2687 0.1669 0.3606 0.4241 0.3664 0.5125 0.3216 0.7037 0.4261 0.5352 0.4362 0.308 0.3733 0.4592 0.3056 0.3021 0.2211 0.1718 0.405 0.2511 0.2431 0.3011 0.3406 0.3065 0.4472 0.2312 0.208 0.0976 0.0607 0.2861 0.2171 0.292 0.25 0.246 0.3613 0.5394 0.4454 0.4018 0.2014 0.1809 0.0401 0.6324 0.466 0.8202 0.2886 0.3837 0.1604 0.1199 0.1475 0.3244 0.093 0.3464 0.6244 0.2035 0.3231 0.0194 0.3158 0.2331 0.391 0.3751 0.2351 0.3787 0.2426 0.4202 0.1728 0.1326 0.1962 0.542 0.2627 0.2215 0.0096 0.3926 0.4352 0.0505 0.0451 0.1881 0.3859 0.2988 0.0326 0.1566 0.0382 0.0902 0.0027 0.1325 0.062 0.0043 0.1847 0.0928 0.0042 0.0546 0.538 0.0065 0.3762 0.0513 0.5047 0.0427 0.2082 0.0893 0.0987 0.0199 0.5484 0.7828 0. 0.3143 0.488 0.016 0.1369 0.402 0.001 0.3093 0.0493 0.198 0.0891 0.4186 0.287 0.0616 0.114 0.5561 0. 0.0085 0.1475 0.0416 0.0707 0.0689 0.0079 0.0874 0.2605 0.355 0.0563 0.254 0.3246 0.2674 0. 0.1998 0.014 0.0329 0.0364] 2022-08-24 14:42:58 [INFO] [EVAL] Class Precision: [0.7292 0.8491 0.9537 0.7718 0.7286 0.8212 0.8664 0.7903 0.6138 0.736 0.6477 0.6558 0.7259 0.4884 0.4539 0.5296 0.5666 0.6438 0.657 0.5609 0.7998 0.6616 0.7504 0.6021 0.5169 0.5688 0.5182 0.7491 0.7185 0.4388 0.3663 0.5722 0.437 0.4377 0.4101 0.4493 0.5858 0.6802 0.4708 0.5463 0.1893 0.2153 0.531 0.4744 0.4806 0.4066 0.4317 0.6432 0.7337 0.5898 0.605 0.2405 0.3196 0.5557 0.6976 0.6171 0.8806 0.4964 0.5949 0.2329 0.2314 0.4108 0.4274 0.6234 0.4398 0.7745 0.3378 0.5921 0.1602 0.844 0.5684 0.4699 0.5605 0.3249 0.6381 0.4687 0.549 0.384 0.2724 0.3885 0.8478 0.5572 0.7025 0.0199 0.5409 0.6467 0.3399 0.4542 0.2629 0.6166 0.3846 0.0418 0.2588 0.2085 0.1644 0.0162 0.2729 0.521 0.1251 0.5369 0.4267 0.0074 0.7231 0.6298 0.0674 0.5252 0.225 0.7967 0.1606 0.2823 0.2201 0.1148 0.3205 0.6346 0.7956 0. 0.8775 0.5262 0.1206 0.5003 0.7846 0.1408 0.8977 0.6907 0.5637 0.5492 0.6922 0.3756 0.1937 0.4289 0.6641 0. 0.0862 0.6581 0.6275 0.2489 0.3211 0.1008 0.2875 0.7229 0.5803 0.0905 0.6142 0.6226 0.6513 0. 0.7942 0.4462 0.2106 0.7345] 2022-08-24 14:42:58 [INFO] [EVAL] Class Recall: [0.8352 0.8777 0.9641 0.855 0.8527 0.8612 0.8466 0.8839 0.6356 0.8113 0.5572 0.6944 0.8554 0.3739 0.2089 0.5305 0.6277 0.4596 0.6996 0.4299 0.8541 0.5448 0.6512 0.6129 0.4326 0.5208 0.8012 0.3405 0.3426 0.3083 0.2444 0.5809 0.3712 0.3534 0.5312 0.5846 0.3913 0.5662 0.3124 0.2514 0.1676 0.0779 0.3828 0.2859 0.4267 0.3937 0.3637 0.4519 0.6707 0.6452 0.5447 0.5536 0.2942 0.0414 0.8712 0.6555 0.9228 0.408 0.5194 0.3402 0.1992 0.1871 0.5736 0.0986 0.62 0.7631 0.3385 0.4157 0.0216 0.3353 0.2833 0.6995 0.5314 0.4595 0.4822 0.3345 0.6418 0.239 0.2054 0.2839 0.6004 0.332 0.2444 0.018 0.5887 0.571 0.056 0.0477 0.3979 0.5078 0.5724 0.1292 0.2839 0.0446 0.1666 0.0032 0.2049 0.0658 0.0044 0.2196 0.106 0.0097 0.0557 0.7868 0.0072 0.57 0.0623 0.5794 0.055 0.4425 0.1306 0.413 0.0208 0.8014 0.9799 0. 0.3288 0.8704 0.0181 0.1585 0.4519 0.001 0.3206 0.0505 0.2339 0.0962 0.5144 0.5488 0.0828 0.1343 0.7737 0. 0.0093 0.1598 0.0427 0.0898 0.0806 0.0085 0.1115 0.2894 0.4776 0.1297 0.3022 0.4041 0.312 0. 0.2107 0.0143 0.0375 0.0369] 2022-08-24 14:42:58 [INFO] [EVAL] The model with the best validation mIoU (0.2863) was saved at iter 37000. 
2022-08-24 14:43:10 [INFO] [TRAIN] epoch: 31, iter: 38050/160000, loss: 0.9591, lr: 0.000923, batch_cost: 0.2333, reader_cost: 0.01088, ips: 34.2909 samples/sec | ETA 07:54:10 2022-08-24 14:43:24 [INFO] [TRAIN] epoch: 31, iter: 38100/160000, loss: 0.9126, lr: 0.000923, batch_cost: 0.2876, reader_cost: 0.00122, ips: 27.8197 samples/sec | ETA 09:44:14 2022-08-24 14:43:37 [INFO] [TRAIN] epoch: 31, iter: 38150/160000, loss: 0.8614, lr: 0.000923, batch_cost: 0.2609, reader_cost: 0.00262, ips: 30.6594 samples/sec | ETA 08:49:54 2022-08-24 14:43:50 [INFO] [TRAIN] epoch: 31, iter: 38200/160000, loss: 0.8474, lr: 0.000922, batch_cost: 0.2611, reader_cost: 0.00070, ips: 30.6383 samples/sec | ETA 08:50:03 2022-08-24 14:44:03 [INFO] [TRAIN] epoch: 31, iter: 38250/160000, loss: 0.8464, lr: 0.000922, batch_cost: 0.2519, reader_cost: 0.00415, ips: 31.7598 samples/sec | ETA 08:31:07 2022-08-24 14:44:16 [INFO] [TRAIN] epoch: 31, iter: 38300/160000, loss: 0.8697, lr: 0.000921, batch_cost: 0.2630, reader_cost: 0.00101, ips: 30.4157 samples/sec | ETA 08:53:29 2022-08-24 14:44:29 [INFO] [TRAIN] epoch: 31, iter: 38350/160000, loss: 1.0039, lr: 0.000921, batch_cost: 0.2618, reader_cost: 0.02059, ips: 30.5524 samples/sec | ETA 08:50:53 2022-08-24 14:44:40 [INFO] [TRAIN] epoch: 31, iter: 38400/160000, loss: 0.9704, lr: 0.000921, batch_cost: 0.2184, reader_cost: 0.00039, ips: 36.6374 samples/sec | ETA 07:22:32 2022-08-24 14:44:51 [INFO] [TRAIN] epoch: 31, iter: 38450/160000, loss: 0.9416, lr: 0.000920, batch_cost: 0.2135, reader_cost: 0.00078, ips: 37.4705 samples/sec | ETA 07:12:31 2022-08-24 14:45:02 [INFO] [TRAIN] epoch: 31, iter: 38500/160000, loss: 0.9047, lr: 0.000920, batch_cost: 0.2273, reader_cost: 0.01313, ips: 35.1886 samples/sec | ETA 07:40:22 2022-08-24 14:45:13 [INFO] [TRAIN] epoch: 31, iter: 38550/160000, loss: 0.8566, lr: 0.000920, batch_cost: 0.2192, reader_cost: 0.00234, ips: 36.4936 samples/sec | ETA 07:23:43 2022-08-24 14:45:24 [INFO] [TRAIN] epoch: 31, iter: 38600/160000, loss: 0.8780, lr: 0.000919, batch_cost: 0.2206, reader_cost: 0.00063, ips: 36.2711 samples/sec | ETA 07:26:16 2022-08-24 14:45:35 [INFO] [TRAIN] epoch: 31, iter: 38650/160000, loss: 0.9940, lr: 0.000919, batch_cost: 0.2242, reader_cost: 0.00054, ips: 35.6754 samples/sec | ETA 07:33:32 2022-08-24 14:45:46 [INFO] [TRAIN] epoch: 31, iter: 38700/160000, loss: 0.9293, lr: 0.000918, batch_cost: 0.2066, reader_cost: 0.00175, ips: 38.7133 samples/sec | ETA 06:57:46 2022-08-24 14:45:57 [INFO] [TRAIN] epoch: 31, iter: 38750/160000, loss: 0.8313, lr: 0.000918, batch_cost: 0.2215, reader_cost: 0.01277, ips: 36.1166 samples/sec | ETA 07:27:37 2022-08-24 14:46:06 [INFO] [TRAIN] epoch: 31, iter: 38800/160000, loss: 0.9075, lr: 0.000918, batch_cost: 0.1745, reader_cost: 0.01375, ips: 45.8360 samples/sec | ETA 05:52:33 2022-08-24 14:46:13 [INFO] [TRAIN] epoch: 31, iter: 38850/160000, loss: 0.9863, lr: 0.000917, batch_cost: 0.1521, reader_cost: 0.00284, ips: 52.6048 samples/sec | ETA 05:07:04 2022-08-24 14:46:23 [INFO] [TRAIN] epoch: 31, iter: 38900/160000, loss: 0.9102, lr: 0.000917, batch_cost: 0.1929, reader_cost: 0.00068, ips: 41.4730 samples/sec | ETA 06:29:19 2022-08-24 14:46:31 [INFO] [TRAIN] epoch: 31, iter: 38950/160000, loss: 0.8872, lr: 0.000916, batch_cost: 0.1709, reader_cost: 0.00059, ips: 46.8006 samples/sec | ETA 05:44:52 2022-08-24 14:46:39 [INFO] [TRAIN] epoch: 31, iter: 39000/160000, loss: 0.9248, lr: 0.000916, batch_cost: 0.1580, reader_cost: 0.00560, ips: 50.6307 samples/sec | ETA 05:18:38 2022-08-24 14:46:39 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 217s - batch_cost: 0.2171 - reader cost: 7.0987e-04 2022-08-24 14:50:17 [INFO] [EVAL] #Images: 2000 mIoU: 0.2831 Acc: 0.7318 Kappa: 0.7111 Dice: 0.4017 2022-08-24 14:50:17 [INFO] [EVAL] Class IoU: [0.6384 0.756 0.9229 0.6848 0.6549 0.724 0.7493 0.7112 0.4593 0.6447 0.4368 0.495 0.6479 0.2461 0.1814 0.3461 0.4305 0.3966 0.5113 0.3314 0.6898 0.4447 0.5226 0.4204 0.3255 0.3336 0.3866 0.3475 0.3706 0.2258 0.1183 0.4299 0.2134 0.2332 0.2361 0.3416 0.3158 0.4458 0.2427 0.2619 0.0959 0.0592 0.2857 0.1996 0.2733 0.2139 0.2376 0.3881 0.6233 0.3984 0.4136 0.276 0.1749 0.1344 0.6478 0.4496 0.7791 0.1856 0.3033 0.1581 0.0582 0.1521 0.2693 0.147 0.3465 0.5773 0.1817 0.3583 0.0242 0.326 0.3028 0.3546 0.3886 0.2263 0.3782 0.262 0.4031 0.1816 0.2575 0.2407 0.6181 0.2432 0.1457 0.0123 0.3815 0.4471 0.0327 0.0447 0.2741 0.3985 0.3434 0. 0.1636 0.048 0.0469 0.0029 0.0435 0.0437 0.0137 0.2602 0.0231 0.0089 0.1826 0.5383 0.0006 0.4057 0.116 0.508 0.0652 0.2449 0.057 0.207 0.028 0.5657 0.6766 0.0019 0.1756 0.5217 0.0771 0.1384 0.4263 0. 0.2897 0.0966 0.2244 0.0638 0.369 0.3434 0.013 0.1766 0.5402 0. 0.0105 0.1717 0.0314 0.0776 0.0895 0.011 0.0902 0.2605 0.3447 0.0809 0.2678 0.2412 0.2912 0. 0.164 0.0172 0.0396 0.0169] 2022-08-24 14:50:17 [INFO] [EVAL] Class Precision: [0.7477 0.8269 0.9631 0.784 0.7568 0.8375 0.8489 0.7972 0.6138 0.7234 0.6637 0.6576 0.7452 0.4444 0.3956 0.5413 0.5114 0.6418 0.7243 0.556 0.7684 0.5647 0.6859 0.491 0.4732 0.5428 0.5363 0.6942 0.6773 0.5537 0.3385 0.5497 0.4843 0.3247 0.377 0.5663 0.6174 0.6781 0.4392 0.4908 0.1877 0.2561 0.4772 0.4969 0.3728 0.4567 0.3474 0.6066 0.7151 0.4691 0.6158 0.3437 0.3031 0.789 0.7076 0.5877 0.8205 0.6019 0.6603 0.3138 0.2708 0.4709 0.5118 0.6419 0.4272 0.6402 0.327 0.5383 0.2641 0.6626 0.4723 0.5097 0.6185 0.2823 0.5875 0.4269 0.5467 0.4788 0.5148 0.3509 0.8274 0.5203 0.728 0.0498 0.6305 0.6016 0.1749 0.4386 0.6211 0.6303 0.4372 0. 0.3628 0.2689 0.2102 0.0226 0.1442 0.7927 0.1429 0.4031 0.5161 0.0111 0.5832 0.5869 0.0125 0.5546 0.488 0.8683 0.1845 0.4471 0.222 0.2974 0.377 0.6467 0.6799 0.1242 0.9261 0.5656 0.1084 0.5379 0.6309 0. 0.8808 0.7501 0.5638 0.5267 0.8629 0.4918 0.1898 0.2771 0.6403 0. 0.1668 0.4808 0.499 0.332 0.3091 0.1308 0.2269 0.7212 0.4803 0.1267 0.6595 0.5294 0.6363 0. 0.8527 0.3415 0.3136 0.4347] 2022-08-24 14:50:17 [INFO] [EVAL] Class Recall: [0.8136 0.8981 0.9566 0.8441 0.8295 0.8424 0.8646 0.8682 0.6461 0.8556 0.561 0.6668 0.8324 0.3555 0.251 0.4898 0.7311 0.5094 0.6348 0.4507 0.8708 0.6766 0.6869 0.745 0.5104 0.464 0.5807 0.4104 0.45 0.2761 0.1539 0.6636 0.2761 0.4528 0.3873 0.4627 0.3927 0.5654 0.3518 0.3596 0.1639 0.0714 0.4159 0.2502 0.506 0.2869 0.4289 0.5187 0.8293 0.7254 0.5575 0.5837 0.2926 0.1394 0.8847 0.6567 0.9392 0.2116 0.3594 0.2417 0.0691 0.1834 0.3623 0.1602 0.647 0.8545 0.2903 0.5173 0.0259 0.3909 0.4577 0.5381 0.5111 0.5327 0.515 0.4041 0.6054 0.2263 0.34 0.434 0.7095 0.3135 0.1541 0.0162 0.4913 0.6351 0.0386 0.0474 0.3291 0.5201 0.6154 0. 0.2296 0.0552 0.0569 0.0034 0.0587 0.0442 0.015 0.4231 0.0237 0.0413 0.2101 0.8667 0.0006 0.6018 0.1321 0.5504 0.0917 0.3513 0.0712 0.4053 0.0293 0.8186 0.9928 0.0019 0.1781 0.8703 0.2105 0.1571 0.5679 0. 0.3015 0.0998 0.2715 0.0677 0.392 0.5323 0.0138 0.3274 0.7755 0. 0.0111 0.2108 0.0325 0.092 0.1119 0.0119 0.1301 0.2897 0.5498 0.1831 0.3107 0.3071 0.3493 0. 0.1688 0.0178 0.0434 0.0173] 2022-08-24 14:50:17 [INFO] [EVAL] The model with the best validation mIoU (0.2863) was saved at iter 37000. 
2022-08-24 14:50:29 [INFO] [TRAIN] epoch: 31, iter: 39050/160000, loss: 0.9385, lr: 0.000916, batch_cost: 0.2427, reader_cost: 0.00933, ips: 32.9671 samples/sec | ETA 08:09:10 2022-08-24 14:50:43 [INFO] [TRAIN] epoch: 31, iter: 39100/160000, loss: 0.9423, lr: 0.000915, batch_cost: 0.2782, reader_cost: 0.02519, ips: 28.7609 samples/sec | ETA 09:20:29 2022-08-24 14:50:57 [INFO] [TRAIN] epoch: 31, iter: 39150/160000, loss: 0.9601, lr: 0.000915, batch_cost: 0.2753, reader_cost: 0.00052, ips: 29.0555 samples/sec | ETA 09:14:34 2022-08-24 14:51:13 [INFO] [TRAIN] epoch: 32, iter: 39200/160000, loss: 0.9398, lr: 0.000915, batch_cost: 0.3317, reader_cost: 0.05685, ips: 24.1208 samples/sec | ETA 11:07:44 2022-08-24 14:51:26 [INFO] [TRAIN] epoch: 32, iter: 39250/160000, loss: 0.9455, lr: 0.000914, batch_cost: 0.2479, reader_cost: 0.00049, ips: 32.2701 samples/sec | ETA 08:18:54 2022-08-24 14:51:37 [INFO] [TRAIN] epoch: 32, iter: 39300/160000, loss: 0.9361, lr: 0.000914, batch_cost: 0.2256, reader_cost: 0.00066, ips: 35.4587 samples/sec | ETA 07:33:51 2022-08-24 14:51:48 [INFO] [TRAIN] epoch: 32, iter: 39350/160000, loss: 0.9282, lr: 0.000913, batch_cost: 0.2177, reader_cost: 0.00062, ips: 36.7480 samples/sec | ETA 07:17:45 2022-08-24 14:51:59 [INFO] [TRAIN] epoch: 32, iter: 39400/160000, loss: 0.9151, lr: 0.000913, batch_cost: 0.2187, reader_cost: 0.00059, ips: 36.5799 samples/sec | ETA 07:19:35 2022-08-24 14:52:10 [INFO] [TRAIN] epoch: 32, iter: 39450/160000, loss: 0.9112, lr: 0.000913, batch_cost: 0.2147, reader_cost: 0.00608, ips: 37.2598 samples/sec | ETA 07:11:23 2022-08-24 14:52:20 [INFO] [TRAIN] epoch: 32, iter: 39500/160000, loss: 0.8739, lr: 0.000912, batch_cost: 0.2165, reader_cost: 0.00613, ips: 36.9513 samples/sec | ETA 07:14:48 2022-08-24 14:52:30 [INFO] [TRAIN] epoch: 32, iter: 39550/160000, loss: 0.9058, lr: 0.000912, batch_cost: 0.1976, reader_cost: 0.00232, ips: 40.4845 samples/sec | ETA 06:36:41 2022-08-24 14:52:40 [INFO] [TRAIN] epoch: 32, iter: 39600/160000, loss: 0.8835, lr: 0.000912, batch_cost: 0.2027, reader_cost: 0.00048, ips: 39.4721 samples/sec | ETA 06:46:42 2022-08-24 14:52:50 [INFO] [TRAIN] epoch: 32, iter: 39650/160000, loss: 0.8925, lr: 0.000911, batch_cost: 0.1968, reader_cost: 0.00403, ips: 40.6540 samples/sec | ETA 06:34:42 2022-08-24 14:52:58 [INFO] [TRAIN] epoch: 32, iter: 39700/160000, loss: 0.9598, lr: 0.000911, batch_cost: 0.1631, reader_cost: 0.00562, ips: 49.0545 samples/sec | ETA 05:26:58 2022-08-24 14:53:06 [INFO] [TRAIN] epoch: 32, iter: 39750/160000, loss: 0.8657, lr: 0.000910, batch_cost: 0.1565, reader_cost: 0.00813, ips: 51.1293 samples/sec | ETA 05:13:35 2022-08-24 14:53:15 [INFO] [TRAIN] epoch: 32, iter: 39800/160000, loss: 0.9323, lr: 0.000910, batch_cost: 0.1765, reader_cost: 0.00085, ips: 45.3265 samples/sec | ETA 05:53:34 2022-08-24 14:53:24 [INFO] [TRAIN] epoch: 32, iter: 39850/160000, loss: 0.8437, lr: 0.000910, batch_cost: 0.1793, reader_cost: 0.00059, ips: 44.6299 samples/sec | ETA 05:58:57 2022-08-24 14:53:34 [INFO] [TRAIN] epoch: 32, iter: 39900/160000, loss: 0.9818, lr: 0.000909, batch_cost: 0.1980, reader_cost: 0.00046, ips: 40.4005 samples/sec | ETA 06:36:21 2022-08-24 14:53:44 [INFO] [TRAIN] epoch: 32, iter: 39950/160000, loss: 0.9318, lr: 0.000909, batch_cost: 0.1974, reader_cost: 0.00072, ips: 40.5301 samples/sec | ETA 06:34:55 2022-08-24 14:53:54 [INFO] [TRAIN] epoch: 32, iter: 40000/160000, loss: 0.8947, lr: 0.000909, batch_cost: 0.1994, reader_cost: 0.00055, ips: 40.1117 samples/sec | ETA 06:38:53 2022-08-24 14:53:54 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 216s - batch_cost: 0.2164 - reader cost: 8.2515e-04 2022-08-24 14:57:30 [INFO] [EVAL] #Images: 2000 mIoU: 0.2834 Acc: 0.7335 Kappa: 0.7126 Dice: 0.4005 2022-08-24 14:57:30 [INFO] [EVAL] Class IoU: [0.6384 0.7502 0.9234 0.6872 0.6601 0.7182 0.7479 0.7055 0.4624 0.6518 0.432 0.5034 0.6432 0.2898 0.189 0.3426 0.4322 0.4061 0.5257 0.3228 0.7126 0.4511 0.5173 0.4139 0.3156 0.3327 0.4202 0.295 0.3761 0.2698 0.1156 0.4172 0.2609 0.2446 0.2822 0.3082 0.3153 0.4531 0.2257 0.2558 0.079 0.0577 0.2831 0.1834 0.2793 0.2229 0.2533 0.3859 0.6882 0.4161 0.4205 0.2043 0.1508 0.1593 0.6643 0.4764 0.7489 0.2714 0.3646 0.1458 0.0853 0.13 0.3254 0.0817 0.3365 0.63 0.2051 0.3251 0.0132 0.3356 0.2701 0.3775 0.3711 0.2257 0.3954 0.2384 0.4278 0.1784 0.1264 0.2387 0.5693 0.2555 0.2439 0.0122 0.3116 0.4513 0.0527 0.0548 0.1945 0.3859 0.3751 0.048 0.2069 0.0268 0.0164 0.0047 0.1546 0.008 0.051 0.2517 0.0068 0.0117 0.0659 0.6436 0. 0.4213 0.0934 0.504 0.0375 0.2594 0.0514 0.2467 0.0215 0.5284 0.652 0.0024 0.258 0.5112 0.0162 0.1392 0.4602 0. 0.2666 0.0538 0.2024 0.0965 0.3788 0.2877 0.0462 0.1866 0.5742 0. 0.0206 0.1866 0.0386 0.0558 0.0471 0.0022 0.0829 0.22 0.1151 0.0876 0.2434 0.4117 0.2806 0. 0.184 0.005 0.0241 0.0176] 2022-08-24 14:57:30 [INFO] [EVAL] Class Precision: [0.7374 0.8166 0.9621 0.7975 0.7854 0.8295 0.856 0.7661 0.6248 0.7344 0.6757 0.6817 0.7298 0.4475 0.3966 0.4963 0.5262 0.6729 0.6839 0.5264 0.8211 0.6368 0.6755 0.5057 0.5304 0.6412 0.5333 0.8108 0.6021 0.5757 0.3607 0.5923 0.5202 0.3863 0.3911 0.612 0.6069 0.7222 0.444 0.5218 0.1661 0.2463 0.6296 0.5403 0.3603 0.4608 0.4015 0.5998 0.7449 0.4952 0.6456 0.2362 0.3774 0.7035 0.728 0.6532 0.8127 0.6026 0.6258 0.2812 0.206 0.2522 0.4679 0.6776 0.4119 0.7997 0.3193 0.5203 0.154 0.6556 0.4909 0.4788 0.6753 0.2756 0.5924 0.5548 0.6931 0.4989 0.3286 0.333 0.8093 0.6293 0.6636 0.0482 0.6517 0.5899 0.1689 0.3875 0.2701 0.5523 0.504 0.272 0.4021 0.229 0.1466 0.0235 0.542 0.9617 0.4438 0.4193 0.7334 0.0165 0.5359 0.7057 0. 0.5258 0.2621 0.7318 0.128 0.3997 0.1882 0.326 0.3832 0.684 0.6648 0.2924 0.879 0.6337 0.0462 0.394 0.7239 0.0016 0.9662 0.7076 0.5614 0.5478 0.7524 0.3614 0.2149 0.3337 0.7032 0. 0.1302 0.5079 0.5469 0.3103 0.4776 0.3102 0.3546 0.7349 0.263 0.199 0.7186 0.4846 0.4655 0. 0.7972 0.4597 0.3445 0.9115] 2022-08-24 14:57:30 [INFO] [EVAL] Class Recall: [0.8262 0.9022 0.9583 0.8325 0.8053 0.8425 0.8556 0.8992 0.6401 0.8528 0.5451 0.6581 0.8442 0.4513 0.2654 0.5252 0.7077 0.506 0.6944 0.455 0.8436 0.6072 0.6883 0.6951 0.4379 0.4088 0.6645 0.3168 0.5005 0.3368 0.1454 0.5853 0.3436 0.4002 0.5034 0.3831 0.3962 0.5488 0.3146 0.3341 0.1309 0.0701 0.3397 0.2173 0.5541 0.3015 0.4069 0.5196 0.9005 0.7225 0.5467 0.6022 0.2007 0.1708 0.8836 0.6377 0.9052 0.3305 0.4663 0.2325 0.1271 0.2116 0.5164 0.085 0.6474 0.7481 0.3645 0.4642 0.0142 0.4074 0.3751 0.6407 0.4518 0.5549 0.5432 0.2948 0.5278 0.2173 0.1705 0.4573 0.6575 0.3007 0.2784 0.016 0.3739 0.6576 0.0711 0.06 0.4099 0.5615 0.5946 0.0551 0.2989 0.0294 0.0181 0.0059 0.1778 0.008 0.0545 0.3864 0.0068 0.0383 0.0699 0.8797 0. 0.6796 0.1268 0.6182 0.0503 0.4249 0.066 0.5035 0.0223 0.6991 0.9714 0.0024 0.2675 0.7256 0.0244 0.1771 0.5582 0. 0.2691 0.0551 0.2405 0.1048 0.4328 0.5852 0.0556 0.2974 0.7579 0. 0.0239 0.2278 0.0399 0.0637 0.0497 0.0022 0.0976 0.239 0.1699 0.1353 0.2691 0.7325 0.4141 0. 0.1931 0.005 0.0253 0.0176] 2022-08-24 14:57:31 [INFO] [EVAL] The model with the best validation mIoU (0.2863) was saved at iter 37000. 
2022-08-24 14:57:45 [INFO] [TRAIN] epoch: 32, iter: 40050/160000, loss: 0.8773, lr: 0.000908, batch_cost: 0.2842, reader_cost: 0.00471, ips: 28.1489 samples/sec | ETA 09:28:10 2022-08-24 14:57:58 [INFO] [TRAIN] epoch: 32, iter: 40100/160000, loss: 0.8968, lr: 0.000908, batch_cost: 0.2729, reader_cost: 0.00144, ips: 29.3102 samples/sec | ETA 09:05:25 2022-08-24 14:58:12 [INFO] [TRAIN] epoch: 32, iter: 40150/160000, loss: 0.8830, lr: 0.000907, batch_cost: 0.2680, reader_cost: 0.00065, ips: 29.8539 samples/sec | ETA 08:55:16 2022-08-24 14:58:25 [INFO] [TRAIN] epoch: 32, iter: 40200/160000, loss: 0.8736, lr: 0.000907, batch_cost: 0.2695, reader_cost: 0.00104, ips: 29.6854 samples/sec | ETA 08:58:05 2022-08-24 14:58:38 [INFO] [TRAIN] epoch: 32, iter: 40250/160000, loss: 0.8941, lr: 0.000907, batch_cost: 0.2575, reader_cost: 0.00957, ips: 31.0728 samples/sec | ETA 08:33:50 2022-08-24 14:58:48 [INFO] [TRAIN] epoch: 32, iter: 40300/160000, loss: 0.9041, lr: 0.000906, batch_cost: 0.1917, reader_cost: 0.00438, ips: 41.7352 samples/sec | ETA 06:22:24 2022-08-24 14:58:58 [INFO] [TRAIN] epoch: 32, iter: 40350/160000, loss: 0.9063, lr: 0.000906, batch_cost: 0.2072, reader_cost: 0.00042, ips: 38.6023 samples/sec | ETA 06:53:16 2022-08-24 14:59:08 [INFO] [TRAIN] epoch: 32, iter: 40400/160000, loss: 1.0122, lr: 0.000905, batch_cost: 0.2004, reader_cost: 0.00051, ips: 39.9131 samples/sec | ETA 06:39:32 2022-08-24 14:59:21 [INFO] [TRAIN] epoch: 33, iter: 40450/160000, loss: 0.8726, lr: 0.000905, batch_cost: 0.2605, reader_cost: 0.06064, ips: 30.7153 samples/sec | ETA 08:38:57 2022-08-24 14:59:32 [INFO] [TRAIN] epoch: 33, iter: 40500/160000, loss: 0.8402, lr: 0.000905, batch_cost: 0.2184, reader_cost: 0.00250, ips: 36.6263 samples/sec | ETA 07:15:01 2022-08-24 14:59:42 [INFO] [TRAIN] epoch: 33, iter: 40550/160000, loss: 0.9515, lr: 0.000904, batch_cost: 0.1905, reader_cost: 0.00040, ips: 41.9975 samples/sec | ETA 06:19:13 2022-08-24 14:59:51 [INFO] [TRAIN] epoch: 33, iter: 40600/160000, loss: 0.9359, lr: 0.000904, batch_cost: 0.1805, reader_cost: 0.00073, ips: 44.3106 samples/sec | ETA 05:59:16 2022-08-24 15:00:00 [INFO] [TRAIN] epoch: 33, iter: 40650/160000, loss: 0.9631, lr: 0.000904, batch_cost: 0.1880, reader_cost: 0.00044, ips: 42.5593 samples/sec | ETA 06:13:54 2022-08-24 15:00:09 [INFO] [TRAIN] epoch: 33, iter: 40700/160000, loss: 0.8820, lr: 0.000903, batch_cost: 0.1696, reader_cost: 0.00058, ips: 47.1678 samples/sec | ETA 05:37:14 2022-08-24 15:00:18 [INFO] [TRAIN] epoch: 33, iter: 40750/160000, loss: 0.8626, lr: 0.000903, batch_cost: 0.1858, reader_cost: 0.00113, ips: 43.0488 samples/sec | ETA 06:09:20 2022-08-24 15:00:27 [INFO] [TRAIN] epoch: 33, iter: 40800/160000, loss: 0.9093, lr: 0.000902, batch_cost: 0.1789, reader_cost: 0.00065, ips: 44.7250 samples/sec | ETA 05:55:21 2022-08-24 15:00:36 [INFO] [TRAIN] epoch: 33, iter: 40850/160000, loss: 0.9160, lr: 0.000902, batch_cost: 0.1752, reader_cost: 0.00052, ips: 45.6521 samples/sec | ETA 05:47:59 2022-08-24 15:00:44 [INFO] [TRAIN] epoch: 33, iter: 40900/160000, loss: 0.8729, lr: 0.000902, batch_cost: 0.1716, reader_cost: 0.00047, ips: 46.6264 samples/sec | ETA 05:40:34 2022-08-24 15:00:53 [INFO] [TRAIN] epoch: 33, iter: 40950/160000, loss: 0.8571, lr: 0.000901, batch_cost: 0.1684, reader_cost: 0.00061, ips: 47.5119 samples/sec | ETA 05:34:05 2022-08-24 15:01:01 [INFO] [TRAIN] epoch: 33, iter: 41000/160000, loss: 0.9329, lr: 0.000901, batch_cost: 0.1760, reader_cost: 0.00042, ips: 45.4555 samples/sec | ETA 05:49:03 2022-08-24 15:01:01 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 205s - batch_cost: 0.2054 - reader cost: 8.4775e-04 2022-08-24 15:04:27 [INFO] [EVAL] #Images: 2000 mIoU: 0.2809 Acc: 0.7317 Kappa: 0.7110 Dice: 0.3973 2022-08-24 15:04:27 [INFO] [EVAL] Class IoU: [0.6429 0.75 0.9226 0.6791 0.6497 0.7226 0.7378 0.7167 0.4481 0.6319 0.4311 0.5001 0.6384 0.2691 0.1747 0.3579 0.4653 0.3694 0.5055 0.3237 0.6999 0.3962 0.5266 0.4354 0.3094 0.4074 0.407 0.3602 0.37 0.2926 0.17 0.3892 0.2414 0.2384 0.2398 0.3545 0.324 0.397 0.244 0.2862 0.0945 0.0511 0.2823 0.2231 0.2884 0.2029 0.2355 0.3753 0.6289 0.446 0.4312 0.2557 0.1496 0.196 0.6656 0.4507 0.8235 0.2232 0.305 0.1674 0.0411 0.1646 0.3297 0.1141 0.3686 0.6267 0.1701 0.3488 0.0134 0.3192 0.2815 0.3862 0.3747 0.2236 0.3694 0.2362 0.4703 0.1659 0.1966 0.0798 0.6567 0.2426 0.216 0.0131 0.2931 0.4461 0.0557 0.0357 0.1873 0.4116 0.359 0.0274 0.1486 0.0535 0.0029 0.0009 0.1209 0.0279 0.0278 0.2012 0.016 0.001 0.0471 0.5289 0.0069 0.3811 0.0768 0.4955 0.0687 0.2105 0.0157 0.3483 0.0502 0.5594 0.6381 0.0049 0.3129 0.4945 0.0155 0.1384 0.4137 0. 0.2457 0.0498 0.2009 0.0708 0.3947 0.2511 0.1825 0.1818 0.6183 0. 0.0222 0.13 0.0315 0.0837 0.0277 0.002 0.0928 0.3083 0.2024 0.0711 0.2462 0.0623 0.3006 0. 0.1364 0.0036 0.0227 0.0459] 2022-08-24 15:04:27 [INFO] [EVAL] Class Precision: [0.7495 0.8313 0.9637 0.779 0.7448 0.8498 0.874 0.7994 0.5457 0.7414 0.6377 0.6024 0.7106 0.5284 0.4598 0.4673 0.561 0.6884 0.7036 0.5357 0.7678 0.6485 0.6926 0.5605 0.4918 0.5174 0.4811 0.6945 0.683 0.4198 0.3412 0.6193 0.5661 0.4821 0.5099 0.4598 0.6031 0.7696 0.4797 0.4588 0.1666 0.2441 0.603 0.4839 0.4572 0.4222 0.4292 0.5501 0.6963 0.5725 0.6248 0.3099 0.3932 0.6037 0.7055 0.5813 0.8904 0.6321 0.719 0.3139 0.0963 0.3427 0.4881 0.5426 0.4663 0.7722 0.2971 0.5091 0.3763 0.7904 0.5283 0.4915 0.5962 0.3311 0.4853 0.4138 0.5598 0.3792 0.4488 0.2696 0.795 0.6454 0.7324 0.0553 0.3276 0.7037 0.2715 0.4506 0.2449 0.6493 0.4945 0.0712 0.2472 0.2425 0.0143 0.0045 0.4656 0.4782 0.3567 0.4968 0.4331 0.0015 0.4883 0.5713 0.084 0.5023 0.1815 0.7647 0.2065 0.3797 0.2085 0.732 0.2773 0.6268 0.6402 0.2769 0.8223 0.5308 0.1208 0.3265 0.796 0. 0.8859 0.6921 0.5724 0.5247 0.7347 0.3084 0.2824 0.275 0.7761 0. 0.1791 0.5247 0.6042 0.3145 0.4814 0.1401 0.3862 0.4727 0.3344 0.0939 0.6284 0.2575 0.6115 0. 0.8498 0.4511 0.4197 0.547 ] 2022-08-24 15:04:27 [INFO] [EVAL] Class Recall: [0.8188 0.8846 0.9558 0.8412 0.8358 0.8283 0.8256 0.8739 0.7147 0.8107 0.5709 0.7465 0.8627 0.3542 0.2199 0.6046 0.7317 0.4436 0.6422 0.45 0.8878 0.5045 0.6873 0.6612 0.4547 0.6571 0.7253 0.4281 0.4467 0.4914 0.2531 0.5115 0.2962 0.3205 0.3116 0.6076 0.4117 0.4506 0.3317 0.4321 0.1793 0.0607 0.3467 0.2928 0.4387 0.2809 0.343 0.5416 0.8667 0.6687 0.5818 0.5939 0.1945 0.2249 0.9217 0.6673 0.9163 0.2565 0.3463 0.2641 0.0669 0.2405 0.504 0.1262 0.6376 0.7688 0.2847 0.5255 0.0137 0.3487 0.376 0.6432 0.5022 0.4078 0.6073 0.3549 0.7464 0.2278 0.2592 0.1019 0.7905 0.2799 0.2345 0.0169 0.7357 0.5493 0.0655 0.0373 0.4431 0.5294 0.5672 0.0427 0.2716 0.0643 0.0037 0.0011 0.1404 0.0287 0.0293 0.2528 0.0163 0.0027 0.0495 0.8772 0.0075 0.6125 0.1175 0.5846 0.0934 0.3209 0.0167 0.3992 0.0577 0.8386 0.995 0.005 0.3356 0.8785 0.0175 0.1938 0.4627 0. 0.2537 0.0509 0.2363 0.0757 0.4603 0.575 0.3402 0.3492 0.7525 0. 0.0247 0.1473 0.0322 0.1024 0.0285 0.002 0.1088 0.47 0.339 0.2266 0.2881 0.076 0.3716 0. 0.1398 0.0037 0.0234 0.0477] 2022-08-24 15:04:27 [INFO] [EVAL] The model with the best validation mIoU (0.2863) was saved at iter 37000. 
2022-08-24 15:04:40 [INFO] [TRAIN] epoch: 33, iter: 41050/160000, loss: 0.9411, lr: 0.000901, batch_cost: 0.2605, reader_cost: 0.01134, ips: 30.7144 samples/sec | ETA 08:36:22 2022-08-24 15:04:55 [INFO] [TRAIN] epoch: 33, iter: 41100/160000, loss: 0.9225, lr: 0.000900, batch_cost: 0.2958, reader_cost: 0.00177, ips: 27.0477 samples/sec | ETA 09:46:07 2022-08-24 15:05:09 [INFO] [TRAIN] epoch: 33, iter: 41150/160000, loss: 0.8462, lr: 0.000900, batch_cost: 0.2742, reader_cost: 0.01701, ips: 29.1755 samples/sec | ETA 09:03:08 2022-08-24 15:05:22 [INFO] [TRAIN] epoch: 33, iter: 41200/160000, loss: 0.9172, lr: 0.000899, batch_cost: 0.2587, reader_cost: 0.00674, ips: 30.9279 samples/sec | ETA 08:32:09 2022-08-24 15:05:35 [INFO] [TRAIN] epoch: 33, iter: 41250/160000, loss: 0.8978, lr: 0.000899, batch_cost: 0.2758, reader_cost: 0.00091, ips: 29.0117 samples/sec | ETA 09:05:45 2022-08-24 15:05:47 [INFO] [TRAIN] epoch: 33, iter: 41300/160000, loss: 0.8579, lr: 0.000899, batch_cost: 0.2351, reader_cost: 0.00051, ips: 34.0228 samples/sec | ETA 07:45:10 2022-08-24 15:06:00 [INFO] [TRAIN] epoch: 33, iter: 41350/160000, loss: 0.9201, lr: 0.000898, batch_cost: 0.2553, reader_cost: 0.00043, ips: 31.3418 samples/sec | ETA 08:24:45 2022-08-24 15:06:12 [INFO] [TRAIN] epoch: 33, iter: 41400/160000, loss: 0.9030, lr: 0.000898, batch_cost: 0.2349, reader_cost: 0.00053, ips: 34.0636 samples/sec | ETA 07:44:13 2022-08-24 15:06:23 [INFO] [TRAIN] epoch: 33, iter: 41450/160000, loss: 0.8341, lr: 0.000898, batch_cost: 0.2155, reader_cost: 0.00037, ips: 37.1276 samples/sec | ETA 07:05:44 2022-08-24 15:06:34 [INFO] [TRAIN] epoch: 33, iter: 41500/160000, loss: 0.8981, lr: 0.000897, batch_cost: 0.2256, reader_cost: 0.00106, ips: 35.4659 samples/sec | ETA 07:25:29 2022-08-24 15:06:43 [INFO] [TRAIN] epoch: 33, iter: 41550/160000, loss: 0.9301, lr: 0.000897, batch_cost: 0.1752, reader_cost: 0.00032, ips: 45.6641 samples/sec | ETA 05:45:51 2022-08-24 15:06:51 [INFO] [TRAIN] epoch: 33, iter: 41600/160000, loss: 0.9340, lr: 0.000896, batch_cost: 0.1758, reader_cost: 0.00031, ips: 45.4953 samples/sec | ETA 05:46:59 2022-08-24 15:07:00 [INFO] [TRAIN] epoch: 33, iter: 41650/160000, loss: 0.9015, lr: 0.000896, batch_cost: 0.1638, reader_cost: 0.00050, ips: 48.8256 samples/sec | ETA 05:23:11 2022-08-24 15:07:11 [INFO] [TRAIN] epoch: 34, iter: 41700/160000, loss: 0.8549, lr: 0.000896, batch_cost: 0.2234, reader_cost: 0.05487, ips: 35.8041 samples/sec | ETA 07:20:32 2022-08-24 15:07:21 [INFO] [TRAIN] epoch: 34, iter: 41750/160000, loss: 0.9149, lr: 0.000895, batch_cost: 0.2036, reader_cost: 0.00052, ips: 39.2985 samples/sec | ETA 06:41:12 2022-08-24 15:07:30 [INFO] [TRAIN] epoch: 34, iter: 41800/160000, loss: 0.9023, lr: 0.000895, batch_cost: 0.1751, reader_cost: 0.00046, ips: 45.6859 samples/sec | ETA 05:44:57 2022-08-24 15:07:39 [INFO] [TRAIN] epoch: 34, iter: 41850/160000, loss: 0.8898, lr: 0.000895, batch_cost: 0.1844, reader_cost: 0.00047, ips: 43.3945 samples/sec | ETA 06:03:01 2022-08-24 15:07:49 [INFO] [TRAIN] epoch: 34, iter: 41900/160000, loss: 0.8619, lr: 0.000894, batch_cost: 0.1923, reader_cost: 0.00087, ips: 41.6110 samples/sec | ETA 06:18:25 2022-08-24 15:07:57 [INFO] [TRAIN] epoch: 34, iter: 41950/160000, loss: 0.8900, lr: 0.000894, batch_cost: 0.1772, reader_cost: 0.00078, ips: 45.1386 samples/sec | ETA 05:48:42 2022-08-24 15:08:06 [INFO] [TRAIN] epoch: 34, iter: 42000/160000, loss: 0.8581, lr: 0.000893, batch_cost: 0.1770, reader_cost: 0.00052, ips: 45.2006 samples/sec | ETA 05:48:04 2022-08-24 15:08:06 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 219s - batch_cost: 0.2189 - reader cost: 0.0013 2022-08-24 15:11:46 [INFO] [EVAL] #Images: 2000 mIoU: 0.2822 Acc: 0.7322 Kappa: 0.7111 Dice: 0.4009 2022-08-24 15:11:46 [INFO] [EVAL] Class IoU: [0.6418 0.7473 0.9227 0.6849 0.6442 0.7286 0.7461 0.7129 0.4575 0.6219 0.4267 0.5073 0.6336 0.2396 0.1425 0.3553 0.4665 0.3584 0.5219 0.3346 0.7125 0.4157 0.5365 0.4388 0.3153 0.3532 0.3477 0.3534 0.3413 0.2551 0.1852 0.4306 0.239 0.2502 0.2619 0.3673 0.3228 0.4471 0.2173 0.2275 0.0829 0.0365 0.2878 0.2093 0.2668 0.2448 0.226 0.3773 0.6525 0.415 0.4193 0.2057 0.1634 0.1633 0.6658 0.3753 0.792 0.2804 0.3617 0.1528 0.0268 0.1626 0.2964 0.0469 0.3723 0.587 0.1588 0.3545 0.0527 0.285 0.2395 0.3681 0.3492 0.2214 0.3787 0.251 0.3475 0.1834 0.1182 0.1096 0.6273 0.2417 0.2247 0.0214 0.351 0.4337 0.0574 0.0386 0.3103 0.4101 0.3546 0.0219 0.234 0.0375 0.0399 0.0062 0.1625 0.1089 0.0161 0.2983 0.0124 0.0073 0.1198 0.5003 0.0005 0.3727 0.1395 0.4806 0.0439 0.1702 0.0351 0.2427 0.0523 0.5376 0.5913 0.0025 0.2831 0.4688 0.015 0.1255 0.4276 0. 0.2698 0.081 0.2076 0.1022 0.3603 0.2991 0.1294 0.2004 0.5488 0. 0.0117 0.1229 0.043 0.0972 0.066 0.0069 0.1071 0.2326 0.3179 0.0063 0.3448 0.4675 0.2536 0.0007 0.1608 0.0086 0.051 0.0103] 2022-08-24 15:11:46 [INFO] [EVAL] Class Precision: [0.7409 0.8162 0.9578 0.7891 0.7404 0.8509 0.8728 0.779 0.5758 0.7169 0.6481 0.6299 0.7032 0.5227 0.4661 0.5158 0.5884 0.6415 0.7003 0.5343 0.8041 0.6044 0.7058 0.6141 0.4317 0.6055 0.494 0.7524 0.6913 0.3952 0.3459 0.5794 0.506 0.3845 0.5636 0.5209 0.5432 0.6568 0.4855 0.5147 0.1934 0.2235 0.5975 0.4815 0.3531 0.528 0.3232 0.5801 0.7059 0.5021 0.5073 0.244 0.3391 0.5528 0.7222 0.468 0.8437 0.5608 0.5152 0.2829 0.1764 0.3493 0.46 0.4651 0.4811 0.6828 0.3419 0.5326 0.1469 0.6212 0.5343 0.4975 0.5898 0.2661 0.6786 0.4129 0.5748 0.545 0.4042 0.3317 0.7406 0.5071 0.6888 0.1391 0.5003 0.6572 0.1987 0.4476 0.6726 0.6168 0.4525 0.0441 0.3601 0.2308 0.1745 0.0173 0.5814 0.615 0.2174 0.4897 0.2369 0.0189 0.6258 0.6446 0.0096 0.5313 0.2787 0.8829 0.1639 0.2955 0.1758 0.3681 0.5142 0.6256 0.5933 0.1914 0.8949 0.4917 0.031 0.483 0.7048 0. 0.7849 0.6525 0.5999 0.6425 0.7838 0.4404 0.2211 0.3557 0.6579 0. 0.4882 0.5909 0.5052 0.2229 0.4672 0.4434 0.3265 0.7456 0.4311 0.0678 0.5774 0.6079 0.5584 0.001 0.9343 0.4299 0.4299 0.5817] 2022-08-24 15:11:46 [INFO] [EVAL] Class Recall: [0.8275 0.8985 0.9618 0.8384 0.8322 0.8353 0.8371 0.8936 0.69 0.8243 0.5554 0.7227 0.8649 0.3068 0.1703 0.533 0.6925 0.4482 0.672 0.4725 0.8622 0.571 0.691 0.6058 0.539 0.4588 0.5401 0.4 0.4027 0.4186 0.2852 0.6263 0.3118 0.4175 0.3285 0.5547 0.443 0.5834 0.2824 0.2896 0.1267 0.0418 0.357 0.2703 0.5218 0.3134 0.4292 0.5191 0.8961 0.7052 0.7073 0.567 0.2398 0.1881 0.895 0.6546 0.9282 0.3593 0.5483 0.2495 0.0307 0.2332 0.4546 0.0496 0.622 0.8071 0.2287 0.5146 0.076 0.3449 0.3027 0.5859 0.4612 0.5686 0.4615 0.3903 0.4677 0.2166 0.1432 0.1406 0.8039 0.316 0.2501 0.0247 0.5405 0.5605 0.0746 0.0405 0.3655 0.5502 0.621 0.0416 0.4006 0.0429 0.0492 0.0096 0.1841 0.1168 0.017 0.4329 0.0129 0.0117 0.1291 0.691 0.0005 0.5553 0.2184 0.5133 0.0566 0.2864 0.042 0.4161 0.055 0.7926 0.9943 0.0026 0.2929 0.9098 0.0281 0.145 0.5208 0. 0.2913 0.0846 0.241 0.1083 0.4001 0.4823 0.238 0.3146 0.768 0. 0.0119 0.1343 0.0449 0.1471 0.0714 0.0069 0.1374 0.2526 0.5476 0.0069 0.4611 0.6693 0.3172 0.0024 0.1626 0.0087 0.0547 0.0104] 2022-08-24 15:11:46 [INFO] [EVAL] The model with the best validation mIoU (0.2863) was saved at iter 37000. 
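For the [TRAIN] lines, ips and ETA follow directly from batch_cost: ips = batch_size / batch_cost (which works out to a batch size of 8 per process in this run) and ETA = (160000 - iter) * batch_cost. A small sketch, with the batch size inferred from the log rather than read from the yml:

    import datetime

    def train_line_stats(iter_id, batch_cost, max_iter=160000, batch_size=8):
        # batch_size=8 is inferred from batch_cost * ips in the lines above,
        # not taken from topformer_tiny_ade20k_512x512_160k.yml.
        ips = batch_size / batch_cost                                  # samples/sec
        eta = datetime.timedelta(seconds=int((max_iter - iter_id) * batch_cost))
        return round(ips, 4), str(eta)

    # iter 40050, batch_cost 0.2842 -> (28.1492, '9:28:09'); the log line shows
    # 28.1489 samples/sec and ETA 09:28:10, the difference being rounding of batch_cost.
    print(train_line_stats(40050, 0.2842))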
2022-08-24 15:12:00 [INFO] [TRAIN] epoch: 34, iter: 42050/160000, loss: 0.9534, lr: 0.000893, batch_cost: 0.2778, reader_cost: 0.00662, ips: 28.7930 samples/sec | ETA 09:06:11 2022-08-24 15:12:12 [INFO] [TRAIN] epoch: 34, iter: 42100/160000, loss: 0.8762, lr: 0.000893, batch_cost: 0.2503, reader_cost: 0.00675, ips: 31.9574 samples/sec | ETA 08:11:54 2022-08-24 15:12:25 [INFO] [TRAIN] epoch: 34, iter: 42150/160000, loss: 0.8939, lr: 0.000892, batch_cost: 0.2670, reader_cost: 0.00650, ips: 29.9680 samples/sec | ETA 08:44:20 2022-08-24 15:12:39 [INFO] [TRAIN] epoch: 34, iter: 42200/160000, loss: 0.8775, lr: 0.000892, batch_cost: 0.2686, reader_cost: 0.01236, ips: 29.7834 samples/sec | ETA 08:47:21 2022-08-24 15:12:51 [INFO] [TRAIN] epoch: 34, iter: 42250/160000, loss: 0.8724, lr: 0.000891, batch_cost: 0.2398, reader_cost: 0.03071, ips: 33.3642 samples/sec | ETA 07:50:33 2022-08-24 15:13:01 [INFO] [TRAIN] epoch: 34, iter: 42300/160000, loss: 0.8898, lr: 0.000891, batch_cost: 0.2110, reader_cost: 0.00046, ips: 37.9134 samples/sec | ETA 06:53:55 2022-08-24 15:13:12 [INFO] [TRAIN] epoch: 34, iter: 42350/160000, loss: 0.8649, lr: 0.000891, batch_cost: 0.2072, reader_cost: 0.00285, ips: 38.6081 samples/sec | ETA 06:46:18 2022-08-24 15:13:23 [INFO] [TRAIN] epoch: 34, iter: 42400/160000, loss: 0.8863, lr: 0.000890, batch_cost: 0.2276, reader_cost: 0.00070, ips: 35.1445 samples/sec | ETA 07:26:09 2022-08-24 15:13:32 [INFO] [TRAIN] epoch: 34, iter: 42450/160000, loss: 0.8586, lr: 0.000890, batch_cost: 0.1816, reader_cost: 0.00737, ips: 44.0446 samples/sec | ETA 05:55:51 2022-08-24 15:13:41 [INFO] [TRAIN] epoch: 34, iter: 42500/160000, loss: 0.8862, lr: 0.000890, batch_cost: 0.1797, reader_cost: 0.00528, ips: 44.5253 samples/sec | ETA 05:51:51 2022-08-24 15:13:50 [INFO] [TRAIN] epoch: 34, iter: 42550/160000, loss: 0.9536, lr: 0.000889, batch_cost: 0.1792, reader_cost: 0.00078, ips: 44.6529 samples/sec | ETA 05:50:42 2022-08-24 15:13:59 [INFO] [TRAIN] epoch: 34, iter: 42600/160000, loss: 0.9036, lr: 0.000889, batch_cost: 0.1699, reader_cost: 0.00044, ips: 47.0966 samples/sec | ETA 05:32:21 2022-08-24 15:14:08 [INFO] [TRAIN] epoch: 34, iter: 42650/160000, loss: 0.9866, lr: 0.000888, batch_cost: 0.1783, reader_cost: 0.00055, ips: 44.8605 samples/sec | ETA 05:48:47 2022-08-24 15:14:16 [INFO] [TRAIN] epoch: 34, iter: 42700/160000, loss: 0.9288, lr: 0.000888, batch_cost: 0.1582, reader_cost: 0.00044, ips: 50.5642 samples/sec | ETA 05:09:18 2022-08-24 15:14:24 [INFO] [TRAIN] epoch: 34, iter: 42750/160000, loss: 0.8580, lr: 0.000888, batch_cost: 0.1771, reader_cost: 0.00037, ips: 45.1828 samples/sec | ETA 05:46:00 2022-08-24 15:14:32 [INFO] [TRAIN] epoch: 34, iter: 42800/160000, loss: 0.8928, lr: 0.000887, batch_cost: 0.1503, reader_cost: 0.00250, ips: 53.2388 samples/sec | ETA 04:53:31 2022-08-24 15:14:41 [INFO] [TRAIN] epoch: 34, iter: 42850/160000, loss: 0.8595, lr: 0.000887, batch_cost: 0.1853, reader_cost: 0.00091, ips: 43.1627 samples/sec | ETA 06:01:53 2022-08-24 15:14:50 [INFO] [TRAIN] epoch: 34, iter: 42900/160000, loss: 0.8939, lr: 0.000887, batch_cost: 0.1752, reader_cost: 0.00069, ips: 45.6503 samples/sec | ETA 05:42:01 2022-08-24 15:15:01 [INFO] [TRAIN] epoch: 35, iter: 42950/160000, loss: 0.8669, lr: 0.000886, batch_cost: 0.2198, reader_cost: 0.02507, ips: 36.4049 samples/sec | ETA 07:08:41 2022-08-24 15:15:10 [INFO] [TRAIN] epoch: 35, iter: 43000/160000, loss: 0.8351, lr: 0.000886, batch_cost: 0.1874, reader_cost: 0.00038, ips: 42.6969 samples/sec | ETA 06:05:21 2022-08-24 15:15:10 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 209s - batch_cost: 0.2093 - reader cost: 6.0675e-04 2022-08-24 15:18:40 [INFO] [EVAL] #Images: 2000 mIoU: 0.2821 Acc: 0.7314 Kappa: 0.7109 Dice: 0.4019 2022-08-24 15:18:40 [INFO] [EVAL] Class IoU: [0.6359 0.7546 0.9188 0.6908 0.6561 0.7194 0.7323 0.7079 0.4521 0.6238 0.4393 0.4721 0.64 0.2865 0.2211 0.361 0.4724 0.4103 0.5028 0.3384 0.7096 0.4479 0.5374 0.4296 0.299 0.3763 0.3489 0.3303 0.3387 0.2432 0.1853 0.4094 0.2554 0.2373 0.2731 0.3419 0.3315 0.3958 0.2293 0.2105 0.0611 0.0511 0.2827 0.2088 0.278 0.2793 0.2369 0.3566 0.6349 0.4677 0.4379 0.2111 0.1491 0.157 0.6712 0.4457 0.8074 0.2341 0.315 0.1542 0.0292 0.202 0.2842 0.1565 0.3694 0.6123 0.162 0.333 0.0139 0.3285 0.2237 0.373 0.3171 0.2122 0.3872 0.2618 0.4042 0.1992 0.1597 0.165 0.612 0.2761 0.1556 0.0135 0.3549 0.4426 0.0549 0.0615 0.2611 0.3192 0.3835 0.0484 0.1532 0.0598 0.0907 0.0118 0.1483 0.0653 0.1451 0.3116 0.0208 0.0088 0.1045 0.5229 0. 0.3929 0.0582 0.4624 0.0362 0.2211 0.0704 0.1485 0.0354 0.5191 0.6101 0.0001 0.2637 0.4941 0.0442 0.1313 0.3889 0. 0.2982 0.1334 0.2194 0.1061 0.3732 0.3124 0. 0.175 0.5132 0. 0.0012 0.1259 0.0645 0.0859 0.0643 0.0035 0.1093 0.2345 0.3267 0.0494 0.2101 0.2135 0.268 0.0031 0.2205 0.0183 0.0359 0.0509] 2022-08-24 15:18:40 [INFO] [EVAL] Class Precision: [0.7533 0.8371 0.9587 0.798 0.7769 0.8183 0.81 0.7666 0.6327 0.7385 0.6424 0.7003 0.7235 0.4694 0.4032 0.4963 0.5874 0.6532 0.5962 0.5018 0.7961 0.5702 0.7097 0.5688 0.4224 0.5954 0.5107 0.7601 0.729 0.6043 0.3346 0.5062 0.4442 0.4055 0.5314 0.5534 0.5676 0.7545 0.5137 0.5895 0.1733 0.2046 0.5559 0.5299 0.4122 0.5556 0.3487 0.4787 0.686 0.5971 0.5817 0.2404 0.3897 0.5461 0.7495 0.5828 0.8696 0.5703 0.5058 0.2904 0.0781 0.395 0.3706 0.6076 0.5174 0.7278 0.2893 0.5193 0.651 0.5937 0.5469 0.4479 0.6541 0.2987 0.4949 0.4166 0.5176 0.5202 0.4105 0.4991 0.7589 0.5534 0.758 0.0627 0.4968 0.6649 0.275 0.4184 0.3943 0.3959 0.5601 0.401 0.2595 0.256 0.1373 0.0338 0.2968 0.687 0.376 0.4885 0.3296 0.0118 0.608 0.6943 0. 0.5036 0.2512 0.819 0.1245 0.307 0.1819 0.1876 0.6327 0.6049 0.6131 0.1457 0.8808 0.523 0.0875 0.2921 0.7998 0. 0.7573 0.6034 0.445 0.5206 0.6833 0.467 0. 0.3848 0.5634 0. 0.0331 0.629 0.5002 0.2794 0.3621 0.2012 0.3066 0.7512 0.4833 0.0673 0.5713 0.5883 0.6029 0.0035 0.8737 0.2958 0.4302 0.6124] 2022-08-24 15:18:40 [INFO] [EVAL] Class Recall: [0.8032 0.8844 0.9567 0.8372 0.8084 0.8561 0.8842 0.9024 0.6129 0.8007 0.5815 0.5915 0.8472 0.4237 0.3287 0.5698 0.7069 0.5247 0.7625 0.5096 0.8673 0.6762 0.6889 0.6372 0.5057 0.5055 0.5241 0.3688 0.3875 0.2893 0.2934 0.6817 0.3754 0.3639 0.3596 0.4722 0.4435 0.4543 0.2929 0.2467 0.0863 0.0638 0.3652 0.2563 0.4606 0.3597 0.425 0.5829 0.895 0.6834 0.6392 0.634 0.1945 0.1806 0.8654 0.6545 0.9186 0.2842 0.455 0.2474 0.0445 0.2925 0.5494 0.174 0.5636 0.7942 0.2691 0.4813 0.0141 0.4238 0.2746 0.6904 0.381 0.423 0.6404 0.4133 0.6485 0.2441 0.2073 0.1978 0.7597 0.3553 0.1637 0.0169 0.5541 0.5698 0.0642 0.0672 0.436 0.6223 0.5489 0.0522 0.2723 0.0724 0.2108 0.0177 0.2286 0.0673 0.1911 0.4626 0.0218 0.0337 0.1121 0.6792 0. 0.6412 0.0704 0.515 0.0486 0.4413 0.1031 0.4161 0.0362 0.7854 0.9919 0.0001 0.2734 0.8992 0.082 0.1926 0.4308 0. 0.3297 0.1463 0.302 0.1175 0.4513 0.4854 0. 0.243 0.8521 0. 0.0012 0.136 0.069 0.1103 0.0725 0.0036 0.1452 0.2543 0.5019 0.1565 0.2494 0.251 0.3255 0.0215 0.2278 0.0191 0.0377 0.0526] 2022-08-24 15:18:40 [INFO] [EVAL] The model with the best validation mIoU (0.2863) was saved at iter 37000. 
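The lr column decays smoothly (about 0.000908 at iter 40k, 0.000886 at iter 43k); PaddleSeg schedules of this kind are usually a polynomial decay towards zero at max_iter. The sketch below uses illustrative values only; base_lr and power are guesses, since the actual yml is not part of this log:

    def poly_lr(iter_id, max_iter=160000, base_lr=1.2e-3, power=1.0):
        # Generic polynomial decay; base_lr / power are placeholders, not the
        # values from topformer_tiny_ade20k_512x512_160k.yml.
        return base_lr * (1.0 - iter_id / max_iter) ** power

    for it in (40000, 44000, 50000):
        print(it, round(poly_lr(it), 6))   # roughly tracks the lr column above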
2022-08-24 15:18:52 [INFO] [TRAIN] epoch: 35, iter: 43050/160000, loss: 0.8816, lr: 0.000885, batch_cost: 0.2443, reader_cost: 0.00493, ips: 32.7476 samples/sec | ETA 07:56:10 2022-08-24 15:19:06 [INFO] [TRAIN] epoch: 35, iter: 43100/160000, loss: 0.8037, lr: 0.000885, batch_cost: 0.2763, reader_cost: 0.00584, ips: 28.9544 samples/sec | ETA 08:58:19 2022-08-24 15:19:21 [INFO] [TRAIN] epoch: 35, iter: 43150/160000, loss: 0.8675, lr: 0.000885, batch_cost: 0.2908, reader_cost: 0.01696, ips: 27.5074 samples/sec | ETA 09:26:23 2022-08-24 15:19:35 [INFO] [TRAIN] epoch: 35, iter: 43200/160000, loss: 0.9902, lr: 0.000884, batch_cost: 0.2825, reader_cost: 0.00493, ips: 28.3198 samples/sec | ETA 09:09:54 2022-08-24 15:19:48 [INFO] [TRAIN] epoch: 35, iter: 43250/160000, loss: 0.8939, lr: 0.000884, batch_cost: 0.2668, reader_cost: 0.00365, ips: 29.9898 samples/sec | ETA 08:39:03 2022-08-24 15:20:01 [INFO] [TRAIN] epoch: 35, iter: 43300/160000, loss: 0.8744, lr: 0.000884, batch_cost: 0.2555, reader_cost: 0.03147, ips: 31.3168 samples/sec | ETA 08:16:51 2022-08-24 15:20:13 [INFO] [TRAIN] epoch: 35, iter: 43350/160000, loss: 0.8948, lr: 0.000883, batch_cost: 0.2376, reader_cost: 0.01119, ips: 33.6718 samples/sec | ETA 07:41:54 2022-08-24 15:20:23 [INFO] [TRAIN] epoch: 35, iter: 43400/160000, loss: 0.8830, lr: 0.000883, batch_cost: 0.2042, reader_cost: 0.00073, ips: 39.1816 samples/sec | ETA 06:36:47 2022-08-24 15:20:33 [INFO] [TRAIN] epoch: 35, iter: 43450/160000, loss: 0.8236, lr: 0.000882, batch_cost: 0.2080, reader_cost: 0.00115, ips: 38.4539 samples/sec | ETA 06:44:07 2022-08-24 15:20:43 [INFO] [TRAIN] epoch: 35, iter: 43500/160000, loss: 0.8672, lr: 0.000882, batch_cost: 0.2008, reader_cost: 0.00062, ips: 39.8369 samples/sec | ETA 06:29:55 2022-08-24 15:20:53 [INFO] [TRAIN] epoch: 35, iter: 43550/160000, loss: 0.8822, lr: 0.000882, batch_cost: 0.1857, reader_cost: 0.00052, ips: 43.0901 samples/sec | ETA 06:00:19 2022-08-24 15:21:03 [INFO] [TRAIN] epoch: 35, iter: 43600/160000, loss: 0.8426, lr: 0.000881, batch_cost: 0.2014, reader_cost: 0.00062, ips: 39.7248 samples/sec | ETA 06:30:41 2022-08-24 15:21:12 [INFO] [TRAIN] epoch: 35, iter: 43650/160000, loss: 0.8918, lr: 0.000881, batch_cost: 0.1776, reader_cost: 0.00058, ips: 45.0501 samples/sec | ETA 05:44:21 2022-08-24 15:21:21 [INFO] [TRAIN] epoch: 35, iter: 43700/160000, loss: 0.8798, lr: 0.000881, batch_cost: 0.1894, reader_cost: 0.00035, ips: 42.2451 samples/sec | ETA 06:07:03 2022-08-24 15:21:29 [INFO] [TRAIN] epoch: 35, iter: 43750/160000, loss: 0.8389, lr: 0.000880, batch_cost: 0.1666, reader_cost: 0.00046, ips: 48.0233 samples/sec | ETA 05:22:45 2022-08-24 15:21:39 [INFO] [TRAIN] epoch: 35, iter: 43800/160000, loss: 0.8559, lr: 0.000880, batch_cost: 0.1968, reader_cost: 0.00052, ips: 40.6539 samples/sec | ETA 06:21:06 2022-08-24 15:21:48 [INFO] [TRAIN] epoch: 35, iter: 43850/160000, loss: 0.9263, lr: 0.000879, batch_cost: 0.1777, reader_cost: 0.00052, ips: 45.0249 samples/sec | ETA 05:43:57 2022-08-24 15:21:57 [INFO] [TRAIN] epoch: 35, iter: 43900/160000, loss: 0.9477, lr: 0.000879, batch_cost: 0.1671, reader_cost: 0.00054, ips: 47.8636 samples/sec | ETA 05:23:25 2022-08-24 15:22:06 [INFO] [TRAIN] epoch: 35, iter: 43950/160000, loss: 0.8500, lr: 0.000879, batch_cost: 0.1857, reader_cost: 0.00106, ips: 43.0837 samples/sec | ETA 05:59:08 2022-08-24 15:22:15 [INFO] [TRAIN] epoch: 35, iter: 44000/160000, loss: 0.8701, lr: 0.000878, batch_cost: 0.1853, reader_cost: 0.00050, ips: 43.1833 samples/sec | ETA 05:58:09 2022-08-24 15:22:15 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 212s - batch_cost: 0.2124 - reader cost: 8.1449e-04 2022-08-24 15:25:48 [INFO] [EVAL] #Images: 2000 mIoU: 0.2877 Acc: 0.7326 Kappa: 0.7121 Dice: 0.4086 2022-08-24 15:25:48 [INFO] [EVAL] Class IoU: [0.6415 0.7524 0.9238 0.6839 0.6552 0.7206 0.7451 0.714 0.4596 0.5968 0.437 0.5169 0.65 0.3003 0.1866 0.3537 0.4434 0.4049 0.5153 0.3345 0.7132 0.4232 0.5316 0.4437 0.2719 0.3847 0.3591 0.3786 0.3684 0.2313 0.1632 0.3909 0.2452 0.243 0.2337 0.3784 0.3289 0.4612 0.2288 0.2694 0.0906 0.0804 0.285 0.1933 0.2641 0.2485 0.2596 0.3894 0.5318 0.448 0.4312 0.1717 0.1574 0.2231 0.6136 0.4435 0.8058 0.2681 0.384 0.2144 0.0648 0.1636 0.2886 0.1001 0.355 0.6213 0.1682 0.3346 0.0256 0.3414 0.2584 0.3751 0.3501 0.2091 0.4027 0.2632 0.3074 0.1882 0.1407 0.2164 0.6247 0.2881 0.2118 0.0156 0.3765 0.4426 0.0536 0.0516 0.2906 0.3934 0.3906 0.0211 0.1986 0.0651 0.0059 0.0097 0.1456 0.1103 0.1268 0.1846 0.0048 0.0051 0.1183 0.5619 0.0009 0.3343 0.0656 0.4974 0.0368 0.3114 0.0118 0.2583 0.0748 0.604 0.5268 0.0001 0.3202 0.4562 0.0713 0.1983 0.406 0. 0.2753 0.1137 0.1872 0.1008 0.4068 0.3375 0.1489 0.1921 0.5893 0. 0.0007 0.1383 0.0602 0.1007 0.0695 0.0059 0.0678 0.2745 0.3199 0.0007 0.1887 0.418 0.2588 0. 0.1796 0.0078 0.0412 0.0341] 2022-08-24 15:25:48 [INFO] [EVAL] Class Precision: [0.7503 0.8428 0.962 0.7796 0.7465 0.8369 0.8775 0.775 0.567 0.7781 0.6479 0.6179 0.7469 0.4685 0.4944 0.5317 0.5405 0.6161 0.6833 0.5381 0.8017 0.6304 0.7241 0.5684 0.5417 0.5942 0.5079 0.6314 0.6662 0.38 0.3823 0.4848 0.5083 0.3272 0.4058 0.5333 0.557 0.6213 0.4486 0.4599 0.2238 0.2091 0.5594 0.5147 0.3901 0.5149 0.4098 0.6093 0.5456 0.566 0.6274 0.1942 0.4652 0.5014 0.774 0.5708 0.8528 0.6311 0.5155 0.3907 0.1059 0.3106 0.3703 0.6104 0.5004 0.7759 0.3779 0.5429 0.4306 0.6881 0.5143 0.5767 0.573 0.2678 0.5968 0.5262 0.4678 0.5285 0.3184 0.4849 0.7295 0.5568 0.6716 0.0742 0.6049 0.7073 0.1957 0.3775 0.4688 0.5601 0.54 0.0356 0.3423 0.2606 0.0438 0.0226 0.4308 0.5752 0.3499 0.5711 0.3382 0.0075 0.6248 0.7072 0.0157 0.4291 0.2568 0.7496 0.153 0.5063 0.3535 0.4051 0.4272 0.7459 0.5291 0.0111 0.8219 0.5114 0.1485 0.3641 0.7159 0. 0.6856 0.673 0.5928 0.5287 0.7848 0.4953 0.3643 0.2809 0.7097 0. 0.0435 0.5616 0.6 0.3029 0.473 0.1342 0.3948 0.6814 0.3943 0.0131 0.7145 0.5896 0.6654 0. 0.9056 0.5317 0.2717 0.7666] 2022-08-24 15:25:48 [INFO] [EVAL] Class Recall: [0.8156 0.8752 0.9588 0.8479 0.8428 0.8383 0.8316 0.9006 0.7081 0.7192 0.573 0.7598 0.8338 0.4556 0.2306 0.5137 0.7116 0.5416 0.677 0.4692 0.8659 0.5629 0.6667 0.6691 0.3531 0.5217 0.5507 0.486 0.4518 0.3715 0.2216 0.6687 0.3215 0.4858 0.3553 0.5658 0.4455 0.6416 0.3184 0.3942 0.132 0.1155 0.3674 0.2363 0.4498 0.3244 0.4147 0.519 0.9545 0.6825 0.5796 0.5972 0.1921 0.2867 0.7475 0.6655 0.936 0.318 0.6008 0.3222 0.143 0.2568 0.5666 0.1069 0.5498 0.7571 0.2326 0.4658 0.0265 0.4039 0.3418 0.5176 0.4737 0.4882 0.5533 0.3449 0.4727 0.2262 0.2014 0.281 0.813 0.3738 0.2362 0.0193 0.4993 0.5418 0.0687 0.0564 0.4332 0.5692 0.5854 0.0495 0.3211 0.0798 0.0068 0.0167 0.1802 0.1201 0.1659 0.2143 0.0049 0.0156 0.1273 0.7322 0.001 0.6019 0.081 0.5965 0.0462 0.4471 0.0121 0.4161 0.0831 0.7605 0.992 0.0001 0.344 0.8085 0.1206 0.3034 0.4839 0. 0.315 0.1204 0.2149 0.1107 0.4579 0.5145 0.2012 0.3778 0.7764 0. 0.0007 0.1551 0.0627 0.1311 0.0753 0.0061 0.0757 0.315 0.6289 0.0008 0.2041 0.5896 0.2975 0. 0.183 0.0079 0.0463 0.0345] 2022-08-24 15:25:48 [INFO] [EVAL] The model with the best validation mIoU (0.2877) was saved at iter 44000. 
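Each eval block ends with "The model with the best validation mIoU (...) was saved at iter ...": the trainer keeps a running best mIoU and only snapshots when it improves, which is why the reported best jumps from iter 37000 to 44000 above. A minimal sketch of that bookkeeping (the save call is a stand-in, not PaddleSeg's actual API):

    def update_best(miou, iter_id, state, save_fn):
        """Keep the best validation mIoU seen so far; save only on improvement."""
        if miou > state.get("best_miou", -1.0):
            state["best_miou"], state["best_iter"] = miou, iter_id
            save_fn(iter_id)   # stand-in for writing <save_dir>/best_model
        print(f"[EVAL] The model with the best validation mIoU "
              f"({state['best_miou']:.4f}) was saved at iter {state['best_iter']}.")

    state = {}
    update_best(0.2863, 37000, state, save_fn=lambda i: None)
    update_best(0.2809, 41000, state, save_fn=lambda i: None)  # no improvement, best stays at 37000
    update_best(0.2877, 44000, state, save_fn=lambda i: None)  # new best, snapshot taken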
2022-08-24 15:26:00 [INFO] [TRAIN] epoch: 35, iter: 44050/160000, loss: 0.9389, lr: 0.000878, batch_cost: 0.2459, reader_cost: 0.00342, ips: 32.5303 samples/sec | ETA 07:55:14 2022-08-24 15:26:12 [INFO] [TRAIN] epoch: 35, iter: 44100/160000, loss: 0.8945, lr: 0.000877, batch_cost: 0.2282, reader_cost: 0.00895, ips: 35.0646 samples/sec | ETA 07:20:42 2022-08-24 15:26:24 [INFO] [TRAIN] epoch: 35, iter: 44150/160000, loss: 0.8966, lr: 0.000877, batch_cost: 0.2541, reader_cost: 0.00788, ips: 31.4873 samples/sec | ETA 08:10:34 2022-08-24 15:26:38 [INFO] [TRAIN] epoch: 35, iter: 44200/160000, loss: 0.9071, lr: 0.000877, batch_cost: 0.2747, reader_cost: 0.00109, ips: 29.1208 samples/sec | ETA 08:50:12 2022-08-24 15:26:53 [INFO] [TRAIN] epoch: 36, iter: 44250/160000, loss: 0.9237, lr: 0.000876, batch_cost: 0.2863, reader_cost: 0.05184, ips: 27.9453 samples/sec | ETA 09:12:16 2022-08-24 15:27:03 [INFO] [TRAIN] epoch: 36, iter: 44300/160000, loss: 0.9063, lr: 0.000876, batch_cost: 0.2131, reader_cost: 0.00046, ips: 37.5473 samples/sec | ETA 06:50:51 2022-08-24 15:27:14 [INFO] [TRAIN] epoch: 36, iter: 44350/160000, loss: 0.9323, lr: 0.000876, batch_cost: 0.2073, reader_cost: 0.00044, ips: 38.5848 samples/sec | ETA 06:39:38 2022-08-24 15:27:24 [INFO] [TRAIN] epoch: 36, iter: 44400/160000, loss: 0.9589, lr: 0.000875, batch_cost: 0.2154, reader_cost: 0.00061, ips: 37.1353 samples/sec | ETA 06:55:03 2022-08-24 15:27:33 [INFO] [TRAIN] epoch: 36, iter: 44450/160000, loss: 0.9246, lr: 0.000875, batch_cost: 0.1751, reader_cost: 0.00058, ips: 45.6860 samples/sec | ETA 05:37:13 2022-08-24 15:27:41 [INFO] [TRAIN] epoch: 36, iter: 44500/160000, loss: 0.8854, lr: 0.000874, batch_cost: 0.1586, reader_cost: 0.00052, ips: 50.4338 samples/sec | ETA 05:05:21 2022-08-24 15:27:49 [INFO] [TRAIN] epoch: 36, iter: 44550/160000, loss: 0.8337, lr: 0.000874, batch_cost: 0.1599, reader_cost: 0.00036, ips: 50.0164 samples/sec | ETA 05:07:45 2022-08-24 15:27:58 [INFO] [TRAIN] epoch: 36, iter: 44600/160000, loss: 0.8330, lr: 0.000874, batch_cost: 0.1709, reader_cost: 0.00047, ips: 46.8087 samples/sec | ETA 05:28:42 2022-08-24 15:28:06 [INFO] [TRAIN] epoch: 36, iter: 44650/160000, loss: 0.8386, lr: 0.000873, batch_cost: 0.1689, reader_cost: 0.00040, ips: 47.3675 samples/sec | ETA 05:24:41 2022-08-24 15:28:16 [INFO] [TRAIN] epoch: 36, iter: 44700/160000, loss: 0.8472, lr: 0.000873, batch_cost: 0.1999, reader_cost: 0.00086, ips: 40.0121 samples/sec | ETA 06:24:13 2022-08-24 15:28:25 [INFO] [TRAIN] epoch: 36, iter: 44750/160000, loss: 0.9554, lr: 0.000873, batch_cost: 0.1870, reader_cost: 0.00056, ips: 42.7789 samples/sec | ETA 05:59:12 2022-08-24 15:28:34 [INFO] [TRAIN] epoch: 36, iter: 44800/160000, loss: 0.9275, lr: 0.000872, batch_cost: 0.1699, reader_cost: 0.00043, ips: 47.0872 samples/sec | ETA 05:26:12 2022-08-24 15:28:43 [INFO] [TRAIN] epoch: 36, iter: 44850/160000, loss: 0.9084, lr: 0.000872, batch_cost: 0.1733, reader_cost: 0.00038, ips: 46.1558 samples/sec | ETA 05:32:38 2022-08-24 15:28:52 [INFO] [TRAIN] epoch: 36, iter: 44900/160000, loss: 0.8545, lr: 0.000871, batch_cost: 0.1829, reader_cost: 0.00103, ips: 43.7336 samples/sec | ETA 05:50:54 2022-08-24 15:29:00 [INFO] [TRAIN] epoch: 36, iter: 44950/160000, loss: 0.8800, lr: 0.000871, batch_cost: 0.1664, reader_cost: 0.00039, ips: 48.0807 samples/sec | ETA 05:19:02 2022-08-24 15:29:10 [INFO] [TRAIN] epoch: 36, iter: 45000/160000, loss: 0.8626, lr: 0.000871, batch_cost: 0.1908, reader_cost: 0.00033, ips: 41.9216 samples/sec | ETA 06:05:45 2022-08-24 15:29:10 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 210s - batch_cost: 0.2097 - reader cost: 7.0225e-04 2022-08-24 15:32:40 [INFO] [EVAL] #Images: 2000 mIoU: 0.2923 Acc: 0.7333 Kappa: 0.7125 Dice: 0.4143 2022-08-24 15:32:40 [INFO] [EVAL] Class IoU: [0.64 0.7604 0.9231 0.6846 0.6454 0.7254 0.7491 0.705 0.4526 0.6108 0.43 0.5188 0.6461 0.2689 0.1921 0.3524 0.4292 0.4199 0.5269 0.3138 0.7076 0.4074 0.5237 0.3814 0.3117 0.3924 0.4308 0.3727 0.3526 0.3067 0.2221 0.3937 0.2202 0.2451 0.3296 0.3641 0.3262 0.4778 0.2387 0.2751 0.0807 0.0529 0.2914 0.2023 0.2959 0.2813 0.2023 0.3801 0.6401 0.4511 0.4552 0.2595 0.164 0.1987 0.6742 0.4435 0.791 0.211 0.3427 0.1644 0.0323 0.1579 0.2438 0.0958 0.3553 0.6043 0.1851 0.3686 0.0329 0.3527 0.3099 0.3735 0.3685 0.2142 0.3805 0.2547 0.4296 0.1905 0.2355 0.1798 0.5594 0.266 0.2037 0.016 0.4062 0.4522 0.0536 0.0436 0.3248 0.4224 0.3665 0.0292 0.1772 0.0785 0.0488 0.0082 0.1075 0.079 0.0086 0.3011 0.0009 0.0061 0.1341 0.4978 0.0009 0.3377 0.0769 0.4543 0.0466 0.3506 0.0327 0.3552 0.0541 0.5223 0.6571 0.0002 0.3204 0.4477 0.0958 0.2392 0.4507 0. 0.1929 0.0852 0.1949 0.1467 0.3975 0.3219 0.233 0.2023 0.5398 0.0053 0.0168 0.1088 0.0789 0.0993 0.0975 0.0088 0.0823 0.2901 0.2694 0.0134 0.2058 0.4418 0.2367 0. 0.2151 0.0156 0.0434 0.0423] 2022-08-24 15:32:40 [INFO] [EVAL] Class Precision: [0.744 0.8294 0.9563 0.7753 0.798 0.8558 0.8775 0.7713 0.6128 0.7771 0.6646 0.6377 0.7264 0.4273 0.4843 0.5006 0.5837 0.6247 0.6679 0.5596 0.79 0.6606 0.6644 0.6202 0.4345 0.5327 0.5288 0.6403 0.7163 0.4743 0.324 0.4926 0.5776 0.3381 0.4633 0.4976 0.5535 0.6465 0.4388 0.4495 0.194 0.2254 0.6119 0.4913 0.3963 0.4713 0.2893 0.7183 0.7339 0.5558 0.6267 0.3037 0.3726 0.6306 0.7308 0.5934 0.8352 0.6726 0.621 0.3831 0.0917 0.334 0.4142 0.571 0.457 0.6973 0.3469 0.4919 0.2917 0.7195 0.5376 0.5619 0.6379 0.2605 0.5005 0.4342 0.6115 0.5885 0.451 0.4094 0.7931 0.5168 0.7283 0.0527 0.651 0.5962 0.1894 0.4228 0.7696 0.6392 0.4879 0.0424 0.2696 0.2608 0.1273 0.0247 0.4006 0.6167 0.2679 0.4273 0.6216 0.0088 0.5106 0.8202 0.0088 0.4304 0.1833 0.7939 0.1457 0.5016 0.2879 0.7944 0.3363 0.7712 0.6597 0.0623 0.8104 0.4997 0.1679 0.427 0.6434 0. 0.9475 0.6681 0.5561 0.6416 0.7595 0.4569 0.3282 0.3941 0.6088 0.3186 0.2579 0.6825 0.5797 0.262 0.3292 0.206 0.3001 0.6758 0.3753 0.0332 0.5823 0.6064 0.7357 0. 0.832 0.3672 0.2077 0.7065] 2022-08-24 15:32:40 [INFO] [EVAL] Class Recall: [0.8208 0.9013 0.9637 0.854 0.7714 0.8264 0.8366 0.8913 0.6338 0.7405 0.5492 0.7355 0.8539 0.4205 0.2414 0.5434 0.6185 0.5615 0.714 0.4168 0.8716 0.5152 0.7121 0.4976 0.5246 0.5984 0.6991 0.4714 0.4098 0.4645 0.4139 0.6621 0.2625 0.4712 0.5332 0.5759 0.4427 0.6468 0.3436 0.4149 0.1215 0.0647 0.3574 0.256 0.5386 0.411 0.4022 0.4467 0.8336 0.7056 0.6245 0.6407 0.2265 0.2248 0.897 0.637 0.9373 0.2352 0.4334 0.2235 0.0475 0.2304 0.3722 0.1033 0.6149 0.8192 0.2841 0.5951 0.0357 0.409 0.4226 0.527 0.466 0.5467 0.6136 0.3812 0.591 0.2198 0.3301 0.2428 0.655 0.3541 0.2205 0.0224 0.5193 0.6518 0.0696 0.0463 0.3597 0.5546 0.5956 0.086 0.341 0.101 0.0734 0.0122 0.1281 0.0831 0.0088 0.5048 0.0009 0.0193 0.1539 0.5588 0.001 0.6106 0.117 0.5151 0.0641 0.538 0.0356 0.3912 0.0605 0.6181 0.994 0.0002 0.3464 0.8113 0.1824 0.3523 0.6009 0. 0.195 0.089 0.2308 0.1598 0.4548 0.5215 0.4456 0.2937 0.8266 0.0053 0.0177 0.1146 0.0836 0.1379 0.1216 0.0091 0.1018 0.337 0.4883 0.0219 0.2415 0.6195 0.2587 0. 0.2248 0.016 0.052 0.043 ] 2022-08-24 15:32:40 [INFO] [EVAL] The model with the best validation mIoU (0.2923) was saved at iter 45000. 
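Acc and Kappa in the [EVAL] summary come from the same accumulated class-by-class confusion matrix as the IoUs: Acc is the trace over the total pixel count, and Kappa is Cohen's kappa, (po - pe)/(1 - pe). A NumPy sketch on a toy 3-class matrix (rows = ground truth, cols = prediction):

    import numpy as np

    def acc_and_kappa(conf):
        conf = conf.astype(np.float64)
        total = conf.sum()
        po = np.trace(conf) / total                                     # pixel accuracy (Acc)
        pe = (conf.sum(axis=0) * conf.sum(axis=1)).sum() / total ** 2   # chance agreement
        return po, (po - pe) / (1.0 - pe)                               # (Acc, Kappa)

    toy = np.array([[50,  2,  3],
                    [ 4, 40,  6],
                    [ 5,  5, 35]])
    print(acc_and_kappa(toy))   # (0.8333..., ~0.749)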
2022-08-24 15:32:54 [INFO] [TRAIN] epoch: 36, iter: 45050/160000, loss: 0.8702, lr: 0.000870, batch_cost: 0.2844, reader_cost: 0.00485, ips: 28.1325 samples/sec | ETA 09:04:48 2022-08-24 15:33:07 [INFO] [TRAIN] epoch: 36, iter: 45100/160000, loss: 0.8961, lr: 0.000870, batch_cost: 0.2543, reader_cost: 0.04461, ips: 31.4644 samples/sec | ETA 08:06:53 2022-08-24 15:33:19 [INFO] [TRAIN] epoch: 36, iter: 45150/160000, loss: 0.9366, lr: 0.000870, batch_cost: 0.2520, reader_cost: 0.02827, ips: 31.7505 samples/sec | ETA 08:02:18 2022-08-24 15:33:32 [INFO] [TRAIN] epoch: 36, iter: 45200/160000, loss: 0.9037, lr: 0.000869, batch_cost: 0.2613, reader_cost: 0.00068, ips: 30.6188 samples/sec | ETA 08:19:54 2022-08-24 15:33:44 [INFO] [TRAIN] epoch: 36, iter: 45250/160000, loss: 0.9448, lr: 0.000869, batch_cost: 0.2423, reader_cost: 0.00325, ips: 33.0225 samples/sec | ETA 07:43:19 2022-08-24 15:33:57 [INFO] [TRAIN] epoch: 36, iter: 45300/160000, loss: 0.8857, lr: 0.000868, batch_cost: 0.2474, reader_cost: 0.00059, ips: 32.3312 samples/sec | ETA 07:53:01 2022-08-24 15:34:07 [INFO] [TRAIN] epoch: 36, iter: 45350/160000, loss: 0.8864, lr: 0.000868, batch_cost: 0.2103, reader_cost: 0.00039, ips: 38.0330 samples/sec | ETA 06:41:55 2022-08-24 15:34:17 [INFO] [TRAIN] epoch: 36, iter: 45400/160000, loss: 0.9077, lr: 0.000868, batch_cost: 0.1840, reader_cost: 0.00053, ips: 43.4803 samples/sec | ETA 05:51:25 2022-08-24 15:34:24 [INFO] [TRAIN] epoch: 36, iter: 45450/160000, loss: 0.8951, lr: 0.000867, batch_cost: 0.1595, reader_cost: 0.00030, ips: 50.1695 samples/sec | ETA 05:04:26 2022-08-24 15:34:34 [INFO] [TRAIN] epoch: 37, iter: 45500/160000, loss: 0.8912, lr: 0.000867, batch_cost: 0.1842, reader_cost: 0.02607, ips: 43.4302 samples/sec | ETA 05:51:31 2022-08-24 15:34:43 [INFO] [TRAIN] epoch: 37, iter: 45550/160000, loss: 0.8932, lr: 0.000867, batch_cost: 0.1917, reader_cost: 0.00080, ips: 41.7343 samples/sec | ETA 06:05:38 2022-08-24 15:34:51 [INFO] [TRAIN] epoch: 37, iter: 45600/160000, loss: 0.8267, lr: 0.000866, batch_cost: 0.1612, reader_cost: 0.00175, ips: 49.6222 samples/sec | ETA 05:07:23 2022-08-24 15:34:59 [INFO] [TRAIN] epoch: 37, iter: 45650/160000, loss: 0.8782, lr: 0.000866, batch_cost: 0.1488, reader_cost: 0.00032, ips: 53.7514 samples/sec | ETA 04:43:39 2022-08-24 15:35:07 [INFO] [TRAIN] epoch: 37, iter: 45700/160000, loss: 0.8388, lr: 0.000865, batch_cost: 0.1665, reader_cost: 0.00049, ips: 48.0566 samples/sec | ETA 05:17:07 2022-08-24 15:35:17 [INFO] [TRAIN] epoch: 37, iter: 45750/160000, loss: 0.8590, lr: 0.000865, batch_cost: 0.1987, reader_cost: 0.00044, ips: 40.2670 samples/sec | ETA 06:18:18 2022-08-24 15:35:26 [INFO] [TRAIN] epoch: 37, iter: 45800/160000, loss: 0.7915, lr: 0.000865, batch_cost: 0.1850, reader_cost: 0.00054, ips: 43.2358 samples/sec | ETA 05:52:10 2022-08-24 15:35:34 [INFO] [TRAIN] epoch: 37, iter: 45850/160000, loss: 0.8995, lr: 0.000864, batch_cost: 0.1574, reader_cost: 0.00415, ips: 50.8401 samples/sec | ETA 04:59:22 2022-08-24 15:35:42 [INFO] [TRAIN] epoch: 37, iter: 45900/160000, loss: 0.9582, lr: 0.000864, batch_cost: 0.1631, reader_cost: 0.00675, ips: 49.0473 samples/sec | ETA 05:10:10 2022-08-24 15:35:52 [INFO] [TRAIN] epoch: 37, iter: 45950/160000, loss: 0.8660, lr: 0.000863, batch_cost: 0.1973, reader_cost: 0.00048, ips: 40.5461 samples/sec | ETA 06:15:02 2022-08-24 15:36:01 [INFO] [TRAIN] epoch: 37, iter: 46000/160000, loss: 0.8518, lr: 0.000863, batch_cost: 0.1834, reader_cost: 0.00063, ips: 43.6261 samples/sec | ETA 05:48:24 2022-08-24 15:36:01 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 202s - batch_cost: 0.2015 - reader cost: 0.0012 2022-08-24 15:39:23 [INFO] [EVAL] #Images: 2000 mIoU: 0.2863 Acc: 0.7305 Kappa: 0.7095 Dice: 0.4048 2022-08-24 15:39:23 [INFO] [EVAL] Class IoU: [0.6416 0.7505 0.9219 0.6893 0.6554 0.7185 0.7159 0.7301 0.4563 0.6346 0.4393 0.5048 0.6371 0.1969 0.1966 0.3567 0.4191 0.4033 0.5352 0.3353 0.7159 0.4097 0.5347 0.4432 0.3289 0.3066 0.3848 0.3457 0.352 0.2843 0.1283 0.4094 0.2681 0.2264 0.2548 0.3758 0.3277 0.4671 0.2369 0.2872 0.1027 0.0708 0.3082 0.1892 0.2843 0.2431 0.1917 0.361 0.5725 0.4555 0.4515 0.2607 0.1221 0.1766 0.6527 0.4343 0.825 0.2115 0.3501 0.1854 0.0419 0.1508 0.2885 0.0974 0.3642 0.6167 0.1867 0.326 0.0147 0.3141 0.2646 0.3655 0.3717 0.2016 0.3467 0.2399 0.3863 0.1612 0.1385 0.0982 0.6285 0.2519 0.2148 0.0135 0.3139 0.4488 0.0327 0.0415 0.2903 0.4402 0.3639 0.0133 0.1424 0.0524 0.0734 0.0001 0.1388 0.0367 0.0382 0.3263 0.0738 0.0012 0.1878 0.4747 0.0002 0.5124 0.1063 0.467 0.0477 0.2911 0.0805 0.2282 0.0689 0.5574 0.7515 0.0041 0.3899 0.5247 0.0826 0.1572 0.4492 0. 0.2278 0.0199 0.2011 0.0857 0.387 0.309 0.0264 0.1623 0.5466 0. 0.0195 0.1786 0.0408 0.0906 0.0682 0.0104 0.0969 0.3026 0.1624 0.0005 0.3032 0.4074 0.2862 0. 0.1918 0.0163 0.0239 0.0153] 2022-08-24 15:39:23 [INFO] [EVAL] Class Precision: [0.7435 0.8128 0.9597 0.8016 0.7686 0.8669 0.9084 0.8049 0.6088 0.7331 0.661 0.5725 0.7135 0.5381 0.4464 0.5227 0.5372 0.6119 0.6673 0.4822 0.8124 0.6116 0.7556 0.5498 0.4936 0.5279 0.4699 0.7566 0.6786 0.4108 0.3693 0.6285 0.4504 0.3111 0.4605 0.5072 0.6066 0.6085 0.4248 0.4214 0.1923 0.2372 0.5559 0.5505 0.4013 0.4717 0.2626 0.5552 0.7638 0.561 0.5922 0.3278 0.2329 0.5538 0.6993 0.5594 0.8987 0.6668 0.6849 0.3103 0.0888 0.2871 0.3647 0.709 0.4752 0.7591 0.2678 0.6001 0.1721 0.6348 0.5141 0.4529 0.6172 0.2457 0.477 0.5075 0.6498 0.5908 0.5723 0.5095 0.7583 0.5371 0.7026 0.0736 0.5474 0.646 0.1333 0.4457 0.7314 0.6687 0.5067 0.0192 0.2256 0.3043 0.2053 0.0006 0.5407 0.5336 0.1913 0.4591 0.5851 0.0015 0.5797 0.5457 0.0035 0.6072 0.2366 0.8423 0.1524 0.4876 0.2525 0.3358 0.3489 0.6967 0.7593 0.3881 0.8583 0.583 0.1573 0.6556 0.7632 0.1358 0.8911 0.904 0.5529 0.6817 0.7343 0.4253 0.1113 0.5386 0.6274 0. 0.265 0.3973 0.6476 0.2119 0.3127 0.1881 0.4086 0.6341 0.3562 0.0009 0.5499 0.5089 0.5815 0. 0.8701 0.336 0.2858 0.7807] 2022-08-24 15:39:23 [INFO] [EVAL] Class Recall: [0.8241 0.9074 0.959 0.8311 0.8165 0.8076 0.7716 0.8871 0.6456 0.8254 0.567 0.8102 0.8562 0.237 0.2601 0.529 0.6558 0.542 0.73 0.524 0.8577 0.5537 0.6465 0.6956 0.4963 0.4225 0.6798 0.389 0.4224 0.48 0.1644 0.5401 0.3984 0.4541 0.3632 0.592 0.4162 0.6679 0.3488 0.4742 0.1807 0.0916 0.4088 0.2237 0.4938 0.3341 0.4151 0.508 0.6957 0.7078 0.6552 0.5601 0.2043 0.206 0.9073 0.6601 0.9097 0.2365 0.4173 0.3154 0.0734 0.2412 0.5798 0.1015 0.6094 0.7668 0.3813 0.4165 0.0159 0.3833 0.3529 0.6545 0.4831 0.5288 0.5593 0.3127 0.4878 0.1814 0.1545 0.1085 0.7859 0.3218 0.2363 0.0162 0.4239 0.5952 0.0415 0.0437 0.325 0.563 0.5636 0.041 0.2788 0.0595 0.1025 0.0002 0.1574 0.0379 0.0456 0.53 0.0778 0.0055 0.2174 0.7848 0.0002 0.7665 0.1619 0.5117 0.065 0.4193 0.1057 0.4161 0.079 0.736 0.9865 0.0041 0.4168 0.8399 0.1482 0.1714 0.522 0. 0.2343 0.0199 0.2402 0.0893 0.45 0.5306 0.0334 0.1885 0.8093 0. 0.0207 0.245 0.0417 0.1366 0.0802 0.0109 0.1127 0.3665 0.2299 0.0011 0.4034 0.6714 0.3604 0. 0.1974 0.0169 0.0254 0.0153] 2022-08-24 15:39:23 [INFO] [EVAL] The model with the best validation mIoU (0.2923) was saved at iter 45000. 
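The per-class arrays themselves are produced by accumulating per-class intersection, prediction and label pixel areas over the 1000 eval iterations (2000 samples) and dividing once at the end. A minimal NumPy sketch of that accumulation (not PaddleSeg's actual implementation, which works on GPU tensors and handles ignored pixels):

    import numpy as np

    NUM_CLASSES = 150  # ADE20K

    def accumulate(pred, label, inter, pred_area, label_area):
        """Add one image's per-class pixel counts to the running totals."""
        for c in range(NUM_CLASSES):
            p, l = (pred == c), (label == c)
            inter[c]      += np.logical_and(p, l).sum()
            pred_area[c]  += p.sum()
            label_area[c] += l.sum()

    def finalize(inter, pred_area, label_area, eps=1e-10):
        union = pred_area + label_area - inter
        iou  = inter / (union + eps)
        prec = inter / (pred_area + eps)
        rec  = inter / (label_area + eps)
        return iou, prec, rec, iou.mean()   # last value is the reported mIoU

    inter = np.zeros(NUM_CLASSES); pred_area = np.zeros(NUM_CLASSES); label_area = np.zeros(NUM_CLASSES)
    # The real eval loop runs this over all 2000 validation images.
    accumulate(np.random.randint(0, NUM_CLASSES, (512, 512)),
               np.random.randint(0, NUM_CLASSES, (512, 512)),
               inter, pred_area, label_area)
    print(finalize(inter, pred_area, label_area)[3])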
2022-08-24 15:39:37 [INFO] [TRAIN] epoch: 37, iter: 46050/160000, loss: 0.8977, lr: 0.000863, batch_cost: 0.2653, reader_cost: 0.01130, ips: 30.1515 samples/sec | ETA 08:23:53 2022-08-24 15:39:50 [INFO] [TRAIN] epoch: 37, iter: 46100/160000, loss: 0.8384, lr: 0.000862, batch_cost: 0.2668, reader_cost: 0.01100, ips: 29.9884 samples/sec | ETA 08:26:25 2022-08-24 15:40:01 [INFO] [TRAIN] epoch: 37, iter: 46150/160000, loss: 0.8554, lr: 0.000862, batch_cost: 0.2294, reader_cost: 0.01284, ips: 34.8793 samples/sec | ETA 07:15:12 2022-08-24 15:40:16 [INFO] [TRAIN] epoch: 37, iter: 46200/160000, loss: 0.8903, lr: 0.000862, batch_cost: 0.2885, reader_cost: 0.00858, ips: 27.7270 samples/sec | ETA 09:07:14 2022-08-24 15:40:28 [INFO] [TRAIN] epoch: 37, iter: 46250/160000, loss: 0.8588, lr: 0.000861, batch_cost: 0.2499, reader_cost: 0.00950, ips: 32.0114 samples/sec | ETA 07:53:47 2022-08-24 15:40:42 [INFO] [TRAIN] epoch: 37, iter: 46300/160000, loss: 0.8659, lr: 0.000861, batch_cost: 0.2765, reader_cost: 0.00111, ips: 28.9359 samples/sec | ETA 08:43:54 2022-08-24 15:40:52 [INFO] [TRAIN] epoch: 37, iter: 46350/160000, loss: 0.9154, lr: 0.000860, batch_cost: 0.2026, reader_cost: 0.00489, ips: 39.4902 samples/sec | ETA 06:23:43 2022-08-24 15:41:02 [INFO] [TRAIN] epoch: 37, iter: 46400/160000, loss: 0.8771, lr: 0.000860, batch_cost: 0.1911, reader_cost: 0.00040, ips: 41.8655 samples/sec | ETA 06:01:47 2022-08-24 15:41:13 [INFO] [TRAIN] epoch: 37, iter: 46450/160000, loss: 0.9964, lr: 0.000860, batch_cost: 0.2291, reader_cost: 0.00034, ips: 34.9225 samples/sec | ETA 07:13:31 2022-08-24 15:41:23 [INFO] [TRAIN] epoch: 37, iter: 46500/160000, loss: 0.8696, lr: 0.000859, batch_cost: 0.1941, reader_cost: 0.00173, ips: 41.2100 samples/sec | ETA 06:07:13 2022-08-24 15:41:32 [INFO] [TRAIN] epoch: 37, iter: 46550/160000, loss: 0.8454, lr: 0.000859, batch_cost: 0.1755, reader_cost: 0.00039, ips: 45.5941 samples/sec | ETA 05:31:46 2022-08-24 15:41:40 [INFO] [TRAIN] epoch: 37, iter: 46600/160000, loss: 0.8818, lr: 0.000859, batch_cost: 0.1675, reader_cost: 0.00044, ips: 47.7482 samples/sec | ETA 05:16:39 2022-08-24 15:41:49 [INFO] [TRAIN] epoch: 37, iter: 46650/160000, loss: 0.8802, lr: 0.000858, batch_cost: 0.1806, reader_cost: 0.00084, ips: 44.2929 samples/sec | ETA 05:41:12 2022-08-24 15:41:57 [INFO] [TRAIN] epoch: 37, iter: 46700/160000, loss: 0.8604, lr: 0.000858, batch_cost: 0.1603, reader_cost: 0.00240, ips: 49.9204 samples/sec | ETA 05:02:36 2022-08-24 15:42:08 [INFO] [TRAIN] epoch: 38, iter: 46750/160000, loss: 0.8989, lr: 0.000857, batch_cost: 0.2071, reader_cost: 0.02723, ips: 38.6375 samples/sec | ETA 06:30:48 2022-08-24 15:42:16 [INFO] [TRAIN] epoch: 38, iter: 46800/160000, loss: 0.9382, lr: 0.000857, batch_cost: 0.1666, reader_cost: 0.00041, ips: 48.0228 samples/sec | ETA 05:14:17 2022-08-24 15:42:25 [INFO] [TRAIN] epoch: 38, iter: 46850/160000, loss: 0.8723, lr: 0.000857, batch_cost: 0.1757, reader_cost: 0.00049, ips: 45.5362 samples/sec | ETA 05:31:18 2022-08-24 15:42:34 [INFO] [TRAIN] epoch: 38, iter: 46900/160000, loss: 0.8742, lr: 0.000856, batch_cost: 0.1839, reader_cost: 0.00043, ips: 43.5050 samples/sec | ETA 05:46:37 2022-08-24 15:42:43 [INFO] [TRAIN] epoch: 38, iter: 46950/160000, loss: 0.8515, lr: 0.000856, batch_cost: 0.1729, reader_cost: 0.00037, ips: 46.2642 samples/sec | ETA 05:25:48 2022-08-24 15:42:53 [INFO] [TRAIN] epoch: 38, iter: 47000/160000, loss: 0.8299, lr: 0.000856, batch_cost: 0.1988, reader_cost: 0.00056, ips: 40.2498 samples/sec | ETA 06:14:19 2022-08-24 15:42:53 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 205s - batch_cost: 0.2051 - reader cost: 8.6667e-04 2022-08-24 15:46:18 [INFO] [EVAL] #Images: 2000 mIoU: 0.2863 Acc: 0.7341 Kappa: 0.7135 Dice: 0.4060 2022-08-24 15:46:18 [INFO] [EVAL] Class IoU: [0.6433 0.7557 0.9199 0.6853 0.657 0.7268 0.7497 0.7091 0.4638 0.6071 0.4476 0.507 0.6409 0.267 0.1859 0.3618 0.4409 0.4053 0.5347 0.332 0.71 0.4309 0.5171 0.4378 0.3137 0.4301 0.3993 0.3621 0.3505 0.2718 0.2136 0.3705 0.2333 0.2559 0.3355 0.3715 0.3011 0.4528 0.2145 0.2495 0.1037 0.0677 0.2933 0.2091 0.2897 0.2131 0.2218 0.3813 0.5903 0.4541 0.4554 0.2257 0.1969 0.1448 0.6324 0.4517 0.807 0.3186 0.2304 0.1589 0.0271 0.1768 0.2774 0.0284 0.3658 0.6 0.1628 0.352 0.0485 0.3321 0.3048 0.3797 0.3717 0.2377 0.3739 0.2588 0.382 0.2103 0.1915 0.3194 0.5646 0.232 0.2204 0.0139 0.4 0.4529 0.0611 0.0414 0.2065 0.4251 0.3422 0.0147 0.198 0.0401 0.0156 0.0014 0.1605 0.0816 0.0676 0.2885 0.0058 0.0087 0.0717 0.5792 0.0004 0.3862 0.1309 0.4076 0.0659 0.2745 0.0439 0.2202 0.0552 0.5015 0.6963 0. 0.378 0.4198 0.0323 0.0958 0.4409 0. 0.2242 0.1143 0.2387 0.0965 0.4125 0.3025 0.0146 0.1958 0.4944 0.0001 0.0747 0.1303 0.0586 0.0782 0.0672 0.0046 0.088 0.2357 0.1653 0.0008 0.2229 0.4088 0.2849 0.0048 0.1774 0.0242 0.0441 0.0313] 2022-08-24 15:46:18 [INFO] [EVAL] Class Precision: [0.7548 0.8224 0.9616 0.7789 0.7768 0.8499 0.8709 0.7651 0.6058 0.7303 0.6508 0.6607 0.7274 0.431 0.4633 0.5298 0.5725 0.6695 0.6948 0.5164 0.7907 0.6205 0.662 0.6363 0.461 0.6125 0.4943 0.6325 0.7154 0.4349 0.3622 0.4456 0.5704 0.3842 0.457 0.4983 0.6084 0.7072 0.4654 0.4898 0.1608 0.2883 0.5767 0.5295 0.4906 0.516 0.3425 0.5651 0.7097 0.5527 0.6442 0.2626 0.4288 0.6634 0.6832 0.6564 0.8564 0.5759 0.6562 0.2712 0.0712 0.6008 0.3511 0.6481 0.4906 0.6908 0.3446 0.5473 0.3676 0.6519 0.4894 0.5002 0.6791 0.3035 0.665 0.4131 0.5962 0.476 0.5365 0.4144 0.8122 0.5953 0.7054 0.1031 0.6227 0.6348 0.2988 0.4163 0.2701 0.6952 0.4485 0.0222 0.2936 0.2248 0.084 0.0097 0.4612 0.6142 0.3858 0.4747 0.4951 0.0102 0.5314 0.9855 0.0143 0.6737 0.375 0.8597 0.1584 0.5625 0.2607 0.3355 0.3113 0.744 0.7003 0. 0.807 0.4543 0.1335 0.5269 0.6257 0. 0.8698 0.7141 0.5605 0.5997 0.6747 0.4331 0.3678 0.3438 0.5422 0.7778 0.3315 0.4906 0.4083 0.3391 0.3301 0.1692 0.36 0.7552 0.3846 0.0085 0.6343 0.5649 0.6352 0.0057 0.9265 0.2147 0.3148 0.6718] 2022-08-24 15:46:18 [INFO] [EVAL] Class Recall: [0.8132 0.903 0.955 0.8508 0.8098 0.8339 0.8434 0.9064 0.6643 0.7825 0.589 0.6855 0.8434 0.4123 0.2369 0.5328 0.6574 0.5066 0.6988 0.4817 0.8742 0.5851 0.7026 0.584 0.4954 0.5909 0.6752 0.4585 0.4073 0.4201 0.3425 0.6873 0.283 0.4338 0.558 0.5936 0.3734 0.5572 0.2847 0.3371 0.2262 0.0812 0.3738 0.2569 0.4143 0.2664 0.3863 0.5396 0.7782 0.7179 0.6085 0.6158 0.2669 0.1563 0.8949 0.5916 0.9333 0.4163 0.2621 0.2772 0.0419 0.2003 0.5692 0.0288 0.5897 0.8203 0.2359 0.4965 0.0529 0.4037 0.4468 0.6118 0.4508 0.523 0.4606 0.4093 0.5154 0.2736 0.2295 0.5822 0.6494 0.2755 0.2427 0.0159 0.5281 0.6125 0.0714 0.0439 0.4672 0.5224 0.5907 0.0419 0.3781 0.0465 0.0188 0.0016 0.1975 0.086 0.0758 0.4237 0.0058 0.0579 0.0766 0.5841 0.0004 0.4751 0.1674 0.4367 0.1013 0.349 0.0501 0.3906 0.0629 0.6061 0.992 0. 0.4156 0.8467 0.0408 0.1048 0.5988 0. 
0.232 0.1198 0.2937 0.1032 0.5149 0.5006 0.0149 0.3128 0.8489 0.0001 0.0879 0.1506 0.064 0.0923 0.0778 0.0047 0.1044 0.2551 0.2248 0.0009 0.2558 0.5966 0.3406 0.0307 0.18 0.0265 0.0488 0.0318] 2022-08-24 15:46:18 [INFO] [EVAL] The model with the best validation mIoU (0.2923) was saved at iter 45000. 2022-08-24 15:46:30 [INFO] [TRAIN] epoch: 38, iter: 47050/160000, loss: 0.8671, lr: 0.000855, batch_cost: 0.2321, reader_cost: 0.00319, ips: 34.4721 samples/sec | ETA 07:16:52 2022-08-24 15:46:42 [INFO] [TRAIN] epoch: 38, iter: 47100/160000, loss: 0.9185, lr: 0.000855, batch_cost: 0.2400, reader_cost: 0.00059, ips: 33.3273 samples/sec | ETA 07:31:40 2022-08-24 15:46:55 [INFO] [TRAIN] epoch: 38, iter: 47150/160000, loss: 0.8933, lr: 0.000854, batch_cost: 0.2637, reader_cost: 0.02043, ips: 30.3384 samples/sec | ETA 08:15:57 2022-08-24 15:47:08 [INFO] [TRAIN] epoch: 38, iter: 47200/160000, loss: 0.8576, lr: 0.000854, batch_cost: 0.2550, reader_cost: 0.00861, ips: 31.3707 samples/sec | ETA 07:59:25 2022-08-24 15:47:20 [INFO] [TRAIN] epoch: 38, iter: 47250/160000, loss: 0.8761, lr: 0.000854, batch_cost: 0.2555, reader_cost: 0.00409, ips: 31.3155 samples/sec | ETA 08:00:03 2022-08-24 15:47:33 [INFO] [TRAIN] epoch: 38, iter: 47300/160000, loss: 0.8532, lr: 0.000853, batch_cost: 0.2538, reader_cost: 0.00609, ips: 31.5190 samples/sec | ETA 07:56:44 2022-08-24 15:47:43 [INFO] [TRAIN] epoch: 38, iter: 47350/160000, loss: 0.8671, lr: 0.000853, batch_cost: 0.1995, reader_cost: 0.00784, ips: 40.1044 samples/sec | ETA 06:14:31 2022-08-24 15:47:54 [INFO] [TRAIN] epoch: 38, iter: 47400/160000, loss: 0.8857, lr: 0.000852, batch_cost: 0.2164, reader_cost: 0.00494, ips: 36.9614 samples/sec | ETA 06:46:11 2022-08-24 15:48:04 [INFO] [TRAIN] epoch: 38, iter: 47450/160000, loss: 0.8812, lr: 0.000852, batch_cost: 0.1943, reader_cost: 0.00059, ips: 41.1689 samples/sec | ETA 06:04:30 2022-08-24 15:48:15 [INFO] [TRAIN] epoch: 38, iter: 47500/160000, loss: 0.8680, lr: 0.000852, batch_cost: 0.2290, reader_cost: 0.00034, ips: 34.9292 samples/sec | ETA 07:09:26 2022-08-24 15:48:25 [INFO] [TRAIN] epoch: 38, iter: 47550/160000, loss: 0.8624, lr: 0.000851, batch_cost: 0.1941, reader_cost: 0.00051, ips: 41.2145 samples/sec | ETA 06:03:47 2022-08-24 15:48:33 [INFO] [TRAIN] epoch: 38, iter: 47600/160000, loss: 0.8680, lr: 0.000851, batch_cost: 0.1534, reader_cost: 0.00049, ips: 52.1610 samples/sec | ETA 04:47:18 2022-08-24 15:48:40 [INFO] [TRAIN] epoch: 38, iter: 47650/160000, loss: 0.8334, lr: 0.000851, batch_cost: 0.1563, reader_cost: 0.00046, ips: 51.1810 samples/sec | ETA 04:52:41 2022-08-24 15:48:48 [INFO] [TRAIN] epoch: 38, iter: 47700/160000, loss: 0.8877, lr: 0.000850, batch_cost: 0.1502, reader_cost: 0.00035, ips: 53.2644 samples/sec | ETA 04:41:06 2022-08-24 15:48:56 [INFO] [TRAIN] epoch: 38, iter: 47750/160000, loss: 0.9065, lr: 0.000850, batch_cost: 0.1601, reader_cost: 0.00034, ips: 49.9728 samples/sec | ETA 04:59:29 2022-08-24 15:49:04 [INFO] [TRAIN] epoch: 38, iter: 47800/160000, loss: 0.8871, lr: 0.000849, batch_cost: 0.1618, reader_cost: 0.00536, ips: 49.4372 samples/sec | ETA 05:02:36 2022-08-24 15:49:13 [INFO] [TRAIN] epoch: 38, iter: 47850/160000, loss: 0.8664, lr: 0.000849, batch_cost: 0.1808, reader_cost: 0.00034, ips: 44.2454 samples/sec | ETA 05:37:57 2022-08-24 15:49:22 [INFO] [TRAIN] epoch: 38, iter: 47900/160000, loss: 0.8679, lr: 0.000849, batch_cost: 0.1848, reader_cost: 0.00057, ips: 43.2902 samples/sec | ETA 05:45:15 2022-08-24 15:49:31 [INFO] [TRAIN] epoch: 38, iter: 47950/160000, loss: 0.8654, lr: 
0.000848, batch_cost: 0.1810, reader_cost: 0.00046, ips: 44.1930 samples/sec | ETA 05:38:03 2022-08-24 15:49:41 [INFO] [TRAIN] epoch: 39, iter: 48000/160000, loss: 0.8777, lr: 0.000848, batch_cost: 0.1939, reader_cost: 0.02694, ips: 41.2596 samples/sec | ETA 06:01:56 2022-08-24 15:49:41 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 225s - batch_cost: 0.2245 - reader cost: 6.9371e-04 2022-08-24 15:53:26 [INFO] [EVAL] #Images: 2000 mIoU: 0.2887 Acc: 0.7352 Kappa: 0.7147 Dice: 0.4080 2022-08-24 15:53:26 [INFO] [EVAL] Class IoU: [0.6449 0.762 0.9233 0.6909 0.6563 0.731 0.7442 0.7291 0.4613 0.6007 0.4268 0.4965 0.6507 0.2723 0.1639 0.3552 0.4388 0.3654 0.5258 0.3306 0.7017 0.4651 0.5403 0.428 0.3122 0.3762 0.4097 0.358 0.3771 0.2636 0.2112 0.4113 0.2168 0.2439 0.2328 0.3734 0.3305 0.4329 0.2444 0.2853 0.0745 0.0825 0.2848 0.2084 0.2792 0.2638 0.2216 0.3897 0.6377 0.453 0.4537 0.2459 0.1383 0.218 0.6478 0.4112 0.8117 0.2445 0.2486 0.1923 0.0583 0.1451 0.2631 0.1249 0.3296 0.5967 0.1727 0.365 0.0097 0.3413 0.3128 0.3726 0.3783 0.1803 0.3665 0.2756 0.4349 0.1821 0.1661 0.2264 0.6198 0.2506 0.2116 0.0151 0.4075 0.4621 0.0461 0.0457 0.3215 0.4187 0.3753 0.0061 0.1297 0.0316 0.0024 0.0059 0.133 0.0489 0.0878 0.278 0.0637 0.002 0.0505 0.6933 0. 0.5252 0.1324 0.4616 0.0879 0.2564 0.0209 0.1952 0.0249 0.513 0.6284 0.0031 0.3431 0.5106 0.0789 0.2441 0.4068 0. 0.2157 0.1099 0.2063 0.0885 0.4116 0.3196 0.0793 0.1128 0.5377 0. 0.0357 0.2037 0.0814 0.0872 0.0799 0.0036 0.0964 0.269 0.113 0.0019 0.2251 0.3614 0.27 0. 0.2016 0.0061 0.0374 0.0214] 2022-08-24 15:53:26 [INFO] [EVAL] Class Precision: [0.7544 0.8427 0.9628 0.7817 0.7376 0.8266 0.835 0.8054 0.6127 0.7348 0.6911 0.6749 0.7362 0.4493 0.4347 0.4835 0.5506 0.6887 0.6543 0.5364 0.7803 0.5985 0.71 0.6177 0.5161 0.6227 0.5932 0.715 0.6728 0.3944 0.3807 0.5574 0.4857 0.333 0.5643 0.5266 0.5339 0.6085 0.4857 0.4424 0.2137 0.238 0.4257 0.5122 0.3482 0.4318 0.3152 0.6559 0.7652 0.5702 0.6657 0.2922 0.3846 0.6018 0.7053 0.5204 0.864 0.6486 0.7465 0.3958 0.132 0.3004 0.3646 0.4806 0.3866 0.6803 0.2976 0.4688 0.2817 0.7028 0.5126 0.4906 0.5921 0.1973 0.6671 0.4536 0.624 0.4332 0.4442 0.3825 0.7905 0.5411 0.7321 0.0543 0.6598 0.6461 0.2848 0.3475 0.7568 0.6212 0.5022 0.0099 0.266 0.194 0.0172 0.0288 0.6052 0.5932 0.4573 0.5061 0.7168 0.0035 0.5495 0.8352 0. 0.6 0.2427 0.7775 0.2084 0.4092 0.2184 0.2705 0.2149 0.688 0.63 0.2966 0.8104 0.5864 0.1466 0.4616 0.6486 0. 0.934 0.6474 0.5746 0.6669 0.7433 0.4315 0.445 0.438 0.6887 0.0037 0.1489 0.3955 0.7195 0.3191 0.3929 0.1999 0.3389 0.7151 0.2007 0.0102 0.5579 0.5715 0.6867 0. 0.8891 0.4973 0.2546 0.8503] 2022-08-24 15:53:26 [INFO] [EVAL] Class Recall: [0.8163 0.8883 0.9575 0.856 0.8562 0.8634 0.8725 0.8849 0.6513 0.767 0.5274 0.6526 0.8486 0.4087 0.2083 0.5723 0.6837 0.4377 0.7279 0.4629 0.8745 0.676 0.6933 0.5822 0.4413 0.4873 0.5697 0.4176 0.4618 0.4427 0.3216 0.6106 0.2815 0.4768 0.2839 0.5622 0.4646 0.6001 0.3297 0.4456 0.1026 0.1122 0.4627 0.26 0.5851 0.4041 0.4275 0.4899 0.7929 0.6878 0.5875 0.6079 0.1776 0.2548 0.8884 0.6623 0.9305 0.2818 0.2715 0.2722 0.0944 0.2191 0.4858 0.1444 0.691 0.8294 0.2915 0.6223 0.0099 0.3989 0.4453 0.6077 0.5117 0.6759 0.4485 0.4125 0.5895 0.239 0.2097 0.3567 0.7416 0.3182 0.2294 0.0205 0.5159 0.6187 0.0521 0.0499 0.3585 0.5622 0.5977 0.0154 0.2021 0.0364 0.0028 0.0074 0.1456 0.0506 0.0981 0.3815 0.0654 0.0044 0.0527 0.8032 0. 0.8081 0.2255 0.5319 0.1319 0.407 0.0226 0.4123 0.0274 0.6685 0.9959 0.0031 0.3731 0.7979 0.1459 0.3412 0.5219 0. 
0.2191 0.1169 0.2435 0.0926 0.4799 0.5521 0.088 0.1319 0.7103 0. 0.0449 0.2959 0.084 0.1072 0.0912 0.0036 0.1187 0.3013 0.2055 0.0024 0.2739 0.4958 0.3079 0. 0.2068 0.0061 0.0421 0.0215] 2022-08-24 15:53:26 [INFO] [EVAL] The model with the best validation mIoU (0.2923) was saved at iter 45000. 2022-08-24 15:53:38 [INFO] [TRAIN] epoch: 39, iter: 48050/160000, loss: 0.8478, lr: 0.000848, batch_cost: 0.2471, reader_cost: 0.00353, ips: 32.3734 samples/sec | ETA 07:41:04 2022-08-24 15:53:52 [INFO] [TRAIN] epoch: 39, iter: 48100/160000, loss: 0.8441, lr: 0.000847, batch_cost: 0.2695, reader_cost: 0.01548, ips: 29.6834 samples/sec | ETA 08:22:38 2022-08-24 15:54:05 [INFO] [TRAIN] epoch: 39, iter: 48150/160000, loss: 0.8903, lr: 0.000847, batch_cost: 0.2743, reader_cost: 0.00788, ips: 29.1670 samples/sec | ETA 08:31:18 2022-08-24 15:54:18 [INFO] [TRAIN] epoch: 39, iter: 48200/160000, loss: 0.8395, lr: 0.000846, batch_cost: 0.2537, reader_cost: 0.00041, ips: 31.5392 samples/sec | ETA 07:52:38 2022-08-24 15:54:31 [INFO] [TRAIN] epoch: 39, iter: 48250/160000, loss: 0.8660, lr: 0.000846, batch_cost: 0.2486, reader_cost: 0.00497, ips: 32.1821 samples/sec | ETA 07:42:59 2022-08-24 15:54:45 [INFO] [TRAIN] epoch: 39, iter: 48300/160000, loss: 0.9422, lr: 0.000846, batch_cost: 0.2811, reader_cost: 0.03414, ips: 28.4546 samples/sec | ETA 08:43:24 2022-08-24 15:54:56 [INFO] [TRAIN] epoch: 39, iter: 48350/160000, loss: 0.9323, lr: 0.000845, batch_cost: 0.2263, reader_cost: 0.00051, ips: 35.3536 samples/sec | ETA 07:01:04 2022-08-24 15:55:06 [INFO] [TRAIN] epoch: 39, iter: 48400/160000, loss: 0.8164, lr: 0.000845, batch_cost: 0.1922, reader_cost: 0.00401, ips: 41.6227 samples/sec | ETA 05:57:29 2022-08-24 15:55:15 [INFO] [TRAIN] epoch: 39, iter: 48450/160000, loss: 0.9043, lr: 0.000845, batch_cost: 0.1899, reader_cost: 0.00804, ips: 42.1328 samples/sec | ETA 05:53:00 2022-08-24 15:55:23 [INFO] [TRAIN] epoch: 39, iter: 48500/160000, loss: 0.8880, lr: 0.000844, batch_cost: 0.1601, reader_cost: 0.00076, ips: 49.9601 samples/sec | ETA 04:57:34 2022-08-24 15:55:32 [INFO] [TRAIN] epoch: 39, iter: 48550/160000, loss: 0.8854, lr: 0.000844, batch_cost: 0.1746, reader_cost: 0.00058, ips: 45.8310 samples/sec | ETA 05:24:14 2022-08-24 15:55:40 [INFO] [TRAIN] epoch: 39, iter: 48600/160000, loss: 0.8705, lr: 0.000843, batch_cost: 0.1628, reader_cost: 0.00259, ips: 49.1418 samples/sec | ETA 05:02:15 2022-08-24 15:55:48 [INFO] [TRAIN] epoch: 39, iter: 48650/160000, loss: 0.8963, lr: 0.000843, batch_cost: 0.1652, reader_cost: 0.00039, ips: 48.4178 samples/sec | ETA 05:06:38 2022-08-24 15:55:58 [INFO] [TRAIN] epoch: 39, iter: 48700/160000, loss: 0.9167, lr: 0.000843, batch_cost: 0.1945, reader_cost: 0.00039, ips: 41.1322 samples/sec | ETA 06:00:47 2022-08-24 15:56:08 [INFO] [TRAIN] epoch: 39, iter: 48750/160000, loss: 0.8819, lr: 0.000842, batch_cost: 0.1961, reader_cost: 0.00067, ips: 40.7911 samples/sec | ETA 06:03:38 2022-08-24 15:56:17 [INFO] [TRAIN] epoch: 39, iter: 48800/160000, loss: 0.8891, lr: 0.000842, batch_cost: 0.1791, reader_cost: 0.00040, ips: 44.6768 samples/sec | ETA 05:31:51 2022-08-24 15:56:24 [INFO] [TRAIN] epoch: 39, iter: 48850/160000, loss: 0.8432, lr: 0.000842, batch_cost: 0.1523, reader_cost: 0.00472, ips: 52.5227 samples/sec | ETA 04:42:09 2022-08-24 15:56:33 [INFO] [TRAIN] epoch: 39, iter: 48900/160000, loss: 0.8327, lr: 0.000841, batch_cost: 0.1641, reader_cost: 0.00237, ips: 48.7523 samples/sec | ETA 05:03:50 2022-08-24 15:56:42 [INFO] [TRAIN] epoch: 39, iter: 48950/160000, loss: 0.8473, lr: 
0.000841, batch_cost: 0.1869, reader_cost: 0.00049, ips: 42.8086 samples/sec | ETA 05:45:52 2022-08-24 15:56:51 [INFO] [TRAIN] epoch: 39, iter: 49000/160000, loss: 0.8655, lr: 0.000840, batch_cost: 0.1863, reader_cost: 0.00034, ips: 42.9448 samples/sec | ETA 05:44:37 2022-08-24 15:56:51 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 216s - batch_cost: 0.2159 - reader cost: 5.9984e-04 2022-08-24 16:00:27 [INFO] [EVAL] #Images: 2000 mIoU: 0.2881 Acc: 0.7390 Kappa: 0.7188 Dice: 0.4070 2022-08-24 16:00:27 [INFO] [EVAL] Class IoU: [0.6429 0.7646 0.9242 0.6872 0.6553 0.7326 0.7495 0.7223 0.4684 0.633 0.435 0.5124 0.6446 0.2772 0.2038 0.3549 0.4565 0.3995 0.5238 0.3427 0.7186 0.4927 0.5131 0.44 0.3116 0.4521 0.4009 0.3828 0.3639 0.2576 0.1256 0.4092 0.2288 0.2395 0.3284 0.3512 0.3253 0.4979 0.2466 0.281 0.0988 0.0484 0.2829 0.194 0.289 0.1402 0.2373 0.3976 0.6612 0.469 0.4221 0.3009 0.1902 0.2448 0.6337 0.4265 0.792 0.2677 0.3393 0.2249 0.024 0.1819 0.2847 0.1299 0.3537 0.6287 0.1532 0.3789 0.0269 0.3254 0.2961 0.359 0.3394 0.1998 0.385 0.2845 0.3272 0.1844 0.1667 0.1903 0.5767 0.2521 0.2329 0.014 0.4282 0.4605 0.0413 0.0409 0.3075 0.4315 0.3835 0.0357 0.0877 0.0464 0. 0.004 0.1514 0.0885 0.0703 0.2662 0.0116 0.0025 0.0746 0.4398 0. 0.3742 0.1102 0.502 0.072 0.252 0.0116 0.0978 0.0752 0.542 0.6921 0. 0.3507 0.538 0.0451 0.1822 0.4642 0. 0.2419 0.1094 0.2306 0.1214 0.3286 0.3108 0.0235 0.193 0.5147 0. 0.0148 0.1169 0.0562 0.0749 0.0754 0.0119 0.1028 0.2771 0.2121 0.0278 0.344 0.2233 0.2217 0. 0.168 0.0101 0.0392 0.0253] 2022-08-24 16:00:27 [INFO] [EVAL] Class Precision: [0.7466 0.8349 0.9559 0.8052 0.7572 0.8551 0.8888 0.779 0.596 0.6971 0.6084 0.6475 0.7214 0.5043 0.4585 0.5315 0.6034 0.6378 0.6929 0.5056 0.8063 0.5815 0.6501 0.5791 0.4574 0.6208 0.6623 0.6934 0.6955 0.563 0.3993 0.508 0.4719 0.3471 0.516 0.4348 0.6003 0.6765 0.4222 0.478 0.2389 0.2974 0.4726 0.5353 0.4558 0.4086 0.3482 0.6317 0.7104 0.6093 0.6485 0.3825 0.3584 0.589 0.742 0.5728 0.8443 0.6215 0.6727 0.4076 0.1455 0.5043 0.4845 0.5375 0.4625 0.7566 0.3959 0.5068 0.3237 0.582 0.5933 0.6119 0.6332 0.2659 0.6194 0.4959 0.6281 0.3048 0.5862 0.3898 0.7388 0.5473 0.7393 0.1328 0.5394 0.6287 0.2325 0.41 0.5618 0.6833 0.5353 0.046 0.2301 0.2008 0.0002 0.0147 0.5463 0.4631 0.3526 0.4657 0.5847 0.0039 0.5624 0.9555 0. 0.5081 0.359 0.8703 0.2462 0.4701 0.1659 0.118 0.3066 0.6472 0.6971 0.0048 0.8291 0.5972 0.133 0.3966 0.6813 0. 0.9601 0.6268 0.5622 0.5829 0.7506 0.393 0.3869 0.3439 0.5966 0. 0.2423 0.5743 0.666 0.326 0.3821 0.0761 0.3397 0.7069 0.3164 0.0634 0.5788 0.6192 0.6573 0. 0.8843 0.3723 0.2725 0.6176] 2022-08-24 16:00:27 [INFO] [EVAL] Class Recall: [0.8223 0.9008 0.9654 0.8242 0.8296 0.8365 0.8271 0.9085 0.6862 0.8733 0.6041 0.7105 0.8583 0.381 0.2684 0.5165 0.6521 0.5167 0.6822 0.5154 0.8685 0.7635 0.7088 0.6469 0.4944 0.6246 0.5038 0.4609 0.4329 0.3219 0.1549 0.6779 0.3076 0.436 0.4747 0.6461 0.4153 0.6535 0.3723 0.4053 0.1442 0.0547 0.4133 0.2333 0.4412 0.1759 0.4271 0.5176 0.9052 0.6708 0.5474 0.5851 0.2885 0.2953 0.8128 0.6255 0.9274 0.3198 0.4063 0.3341 0.028 0.2216 0.4084 0.1462 0.6005 0.7881 0.2 0.6002 0.0285 0.4247 0.3716 0.4649 0.4225 0.4453 0.5044 0.4003 0.4058 0.3183 0.1889 0.2711 0.7244 0.3185 0.2537 0.0154 0.675 0.6325 0.0478 0.0435 0.4046 0.5394 0.5748 0.1372 0.124 0.0569 0. 0.0055 0.1732 0.0987 0.0807 0.3833 0.0117 0.0072 0.0792 0.449 0. 0.5868 0.1372 0.5426 0.0923 0.352 0.0123 0.3633 0.0906 0.7694 0.9898 0. 0.3781 0.8446 0.0639 0.252 0.5929 0. 
0.2444 0.117 0.281 0.1329 0.3688 0.5976 0.0244 0.3054 0.7894 0. 0.0155 0.128 0.0578 0.0887 0.0858 0.014 0.1284 0.3131 0.3913 0.0473 0.4588 0.2589 0.2507 0. 0.1717 0.0103 0.0438 0.0257] 2022-08-24 16:00:28 [INFO] [EVAL] The model with the best validation mIoU (0.2923) was saved at iter 45000. 2022-08-24 16:00:41 [INFO] [TRAIN] epoch: 39, iter: 49050/160000, loss: 0.8384, lr: 0.000840, batch_cost: 0.2754, reader_cost: 0.00420, ips: 29.0532 samples/sec | ETA 08:29:10 2022-08-24 16:00:55 [INFO] [TRAIN] epoch: 39, iter: 49100/160000, loss: 0.8432, lr: 0.000840, batch_cost: 0.2681, reader_cost: 0.02678, ips: 29.8341 samples/sec | ETA 08:15:37 2022-08-24 16:01:09 [INFO] [TRAIN] epoch: 39, iter: 49150/160000, loss: 0.8658, lr: 0.000839, batch_cost: 0.2902, reader_cost: 0.00045, ips: 27.5657 samples/sec | ETA 08:56:10 2022-08-24 16:01:22 [INFO] [TRAIN] epoch: 39, iter: 49200/160000, loss: 0.8267, lr: 0.000839, batch_cost: 0.2549, reader_cost: 0.00107, ips: 31.3815 samples/sec | ETA 07:50:45 2022-08-24 16:01:36 [INFO] [TRAIN] epoch: 39, iter: 49250/160000, loss: 0.8877, lr: 0.000838, batch_cost: 0.2747, reader_cost: 0.00907, ips: 29.1241 samples/sec | ETA 08:27:01 2022-08-24 16:01:52 [INFO] [TRAIN] epoch: 40, iter: 49300/160000, loss: 0.8833, lr: 0.000838, batch_cost: 0.3246, reader_cost: 0.06090, ips: 24.6463 samples/sec | ETA 09:58:52 2022-08-24 16:02:05 [INFO] [TRAIN] epoch: 40, iter: 49350/160000, loss: 0.7975, lr: 0.000838, batch_cost: 0.2674, reader_cost: 0.01231, ips: 29.9166 samples/sec | ETA 08:13:08 2022-08-24 16:02:14 [INFO] [TRAIN] epoch: 40, iter: 49400/160000, loss: 0.8533, lr: 0.000837, batch_cost: 0.1723, reader_cost: 0.00395, ips: 46.4196 samples/sec | ETA 05:17:40 2022-08-24 16:02:23 [INFO] [TRAIN] epoch: 40, iter: 49450/160000, loss: 0.8431, lr: 0.000837, batch_cost: 0.1875, reader_cost: 0.00086, ips: 42.6728 samples/sec | ETA 05:45:25 2022-08-24 16:02:32 [INFO] [TRAIN] epoch: 40, iter: 49500/160000, loss: 0.9424, lr: 0.000837, batch_cost: 0.1803, reader_cost: 0.00059, ips: 44.3769 samples/sec | ETA 05:32:00 2022-08-24 16:02:41 [INFO] [TRAIN] epoch: 40, iter: 49550/160000, loss: 0.9032, lr: 0.000836, batch_cost: 0.1765, reader_cost: 0.00073, ips: 45.3228 samples/sec | ETA 05:24:55 2022-08-24 16:02:50 [INFO] [TRAIN] epoch: 40, iter: 49600/160000, loss: 0.8748, lr: 0.000836, batch_cost: 0.1787, reader_cost: 0.00406, ips: 44.7783 samples/sec | ETA 05:28:43 2022-08-24 16:02:59 [INFO] [TRAIN] epoch: 40, iter: 49650/160000, loss: 0.8769, lr: 0.000835, batch_cost: 0.1813, reader_cost: 0.00032, ips: 44.1376 samples/sec | ETA 05:33:21 2022-08-24 16:03:07 [INFO] [TRAIN] epoch: 40, iter: 49700/160000, loss: 0.9172, lr: 0.000835, batch_cost: 0.1593, reader_cost: 0.00177, ips: 50.2316 samples/sec | ETA 04:52:46 2022-08-24 16:03:15 [INFO] [TRAIN] epoch: 40, iter: 49750/160000, loss: 0.8270, lr: 0.000835, batch_cost: 0.1555, reader_cost: 0.00037, ips: 51.4451 samples/sec | ETA 04:45:44 2022-08-24 16:03:24 [INFO] [TRAIN] epoch: 40, iter: 49800/160000, loss: 0.8667, lr: 0.000834, batch_cost: 0.1864, reader_cost: 0.00031, ips: 42.9217 samples/sec | ETA 05:42:19 2022-08-24 16:03:32 [INFO] [TRAIN] epoch: 40, iter: 49850/160000, loss: 0.8782, lr: 0.000834, batch_cost: 0.1629, reader_cost: 0.00155, ips: 49.1183 samples/sec | ETA 04:59:00 2022-08-24 16:03:41 [INFO] [TRAIN] epoch: 40, iter: 49900/160000, loss: 0.8267, lr: 0.000834, batch_cost: 0.1730, reader_cost: 0.00041, ips: 46.2519 samples/sec | ETA 05:17:23 2022-08-24 16:03:50 [INFO] [TRAIN] epoch: 40, iter: 49950/160000, loss: 0.8559, lr: 0.000833, 
batch_cost: 0.1850, reader_cost: 0.00057, ips: 43.2549 samples/sec | ETA 05:39:13 2022-08-24 16:03:59 [INFO] [TRAIN] epoch: 40, iter: 50000/160000, loss: 0.8837, lr: 0.000833, batch_cost: 0.1826, reader_cost: 0.00522, ips: 43.8146 samples/sec | ETA 05:34:44 2022-08-24 16:03:59 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 232s - batch_cost: 0.2318 - reader cost: 6.9680e-04 2022-08-24 16:07:52 [INFO] [EVAL] #Images: 2000 mIoU: 0.2900 Acc: 0.7364 Kappa: 0.7160 Dice: 0.4113 2022-08-24 16:07:52 [INFO] [EVAL] Class IoU: [0.6459 0.759 0.9189 0.6895 0.6501 0.7329 0.7542 0.7308 0.4642 0.644 0.4373 0.5199 0.6393 0.2644 0.198 0.3538 0.4691 0.3818 0.5264 0.341 0.7101 0.416 0.5262 0.4394 0.2908 0.38 0.3852 0.3239 0.3897 0.2564 0.1943 0.4151 0.2453 0.2362 0.3209 0.3973 0.3238 0.4833 0.244 0.2389 0.1064 0.0546 0.276 0.2111 0.2597 0.1716 0.1381 0.3934 0.602 0.4406 0.4455 0.3079 0.1904 0.2319 0.647 0.4003 0.7981 0.3511 0.3379 0.1872 0.0736 0.1648 0.2437 0.1107 0.3147 0.5971 0.2019 0.3427 0.0793 0.3402 0.2937 0.3865 0.3259 0.1883 0.3732 0.2719 0.3756 0.1882 0.1298 0.1603 0.5539 0.2391 0.231 0.0146 0.3986 0.4496 0.0511 0.0417 0.2958 0.4024 0.3785 0.0157 0.1825 0.0606 0. 0.0184 0.1455 0.0793 0.0867 0.2874 0.0882 0.0015 0.0424 0.6476 0.0025 0.3958 0.0614 0.421 0.0797 0.2343 0.0852 0.13 0.0861 0.5018 0.7406 0. 0.3089 0.5264 0.0264 0.1122 0.3815 0. 0.2807 0.0798 0.216 0.1082 0.4001 0.3435 0.1394 0.1971 0.5101 0.0003 0.0256 0.2241 0.0384 0.0999 0.0673 0.0057 0.0954 0.2827 0.3274 0.0708 0.2057 0.2804 0.3242 0. 0.2066 0.0215 0.043 0.0455] 2022-08-24 16:07:52 [INFO] [EVAL] Class Precision: [0.7504 0.8406 0.9458 0.8097 0.7285 0.8445 0.8806 0.8072 0.598 0.7421 0.6424 0.6732 0.7069 0.5006 0.4595 0.5601 0.6137 0.6648 0.6421 0.539 0.788 0.6506 0.7305 0.5751 0.4828 0.5392 0.5218 0.7784 0.6519 0.4189 0.3544 0.5225 0.624 0.3451 0.4669 0.5505 0.5401 0.6973 0.414 0.4724 0.1709 0.3005 0.4134 0.523 0.4029 0.4407 0.2719 0.6037 0.7047 0.5578 0.6104 0.3655 0.3889 0.5176 0.7062 0.5033 0.8447 0.5661 0.6986 0.4424 0.1281 0.354 0.4059 0.6422 0.3561 0.7064 0.3878 0.4184 0.257 0.679 0.5179 0.4929 0.6559 0.2133 0.5155 0.4321 0.4243 0.4047 0.6155 0.4392 0.7754 0.4899 0.6976 0.0738 0.5401 0.5553 0.1899 0.3392 0.4986 0.6575 0.5096 0.023 0.2822 0.2433 0.0005 0.0338 0.4667 0.768 0.333 0.5813 0.541 0.0024 0.6747 0.8107 0.0468 0.5256 0.3301 0.7062 0.1992 0.5171 0.205 0.159 0.4273 0.7707 0.761 0. 0.8953 0.6073 0.0798 0.433 0.7949 0. 0.8496 0.6467 0.5355 0.6345 0.6838 0.5114 0.4069 0.4463 0.6071 1. 0.1693 0.4747 0.5518 0.2616 0.4073 0.2152 0.3457 0.7236 0.436 0.1065 0.6296 0.6207 0.5789 0. 0.863 0.3853 0.2003 0.4429] 2022-08-24 16:07:52 [INFO] [EVAL] Class Recall: [0.8227 0.8866 0.97 0.8229 0.858 0.8473 0.8402 0.8854 0.6747 0.8297 0.5779 0.6954 0.8699 0.3592 0.2581 0.4899 0.6657 0.4728 0.7449 0.4813 0.8778 0.5357 0.6529 0.6507 0.4224 0.5627 0.5955 0.3568 0.4921 0.3979 0.3007 0.6688 0.2878 0.4282 0.5064 0.5881 0.447 0.6116 0.3729 0.3258 0.22 0.0625 0.4536 0.2614 0.4223 0.2193 0.2191 0.5304 0.8051 0.6772 0.6225 0.6612 0.2716 0.2958 0.8853 0.6616 0.9352 0.4804 0.3955 0.2451 0.1475 0.2357 0.3789 0.118 0.7304 0.7942 0.2964 0.6545 0.1028 0.4054 0.4042 0.6415 0.3931 0.6171 0.5748 0.4231 0.7658 0.2602 0.1413 0.2015 0.6597 0.3184 0.2568 0.0179 0.6033 0.7025 0.0653 0.0454 0.4211 0.5091 0.5954 0.0471 0.3407 0.0747 0.0001 0.0387 0.1745 0.0813 0.1049 0.3624 0.0953 0.0036 0.0433 0.763 0.0026 0.6159 0.0701 0.5104 0.1173 0.3 0.1273 0.4161 0.0973 0.5899 0.9651 0. 0.3205 0.798 0.038 0.1316 0.4231 0. 
0.2954 0.0834 0.2657 0.1154 0.491 0.5113 0.175 0.2608 0.7614 0.0003 0.0293 0.298 0.0397 0.1391 0.0745 0.0058 0.1164 0.3169 0.5682 0.1746 0.234 0.3384 0.4242 0. 0.2136 0.0223 0.0519 0.0482] 2022-08-24 16:07:52 [INFO] [EVAL] The model with the best validation mIoU (0.2923) was saved at iter 45000. 2022-08-24 16:08:05 [INFO] [TRAIN] epoch: 40, iter: 50050/160000, loss: 0.9051, lr: 0.000832, batch_cost: 0.2596, reader_cost: 0.01645, ips: 30.8214 samples/sec | ETA 07:55:38 2022-08-24 16:08:19 [INFO] [TRAIN] epoch: 40, iter: 50100/160000, loss: 0.8782, lr: 0.000832, batch_cost: 0.2748, reader_cost: 0.01142, ips: 29.1164 samples/sec | ETA 08:23:15 2022-08-24 16:08:31 [INFO] [TRAIN] epoch: 40, iter: 50150/160000, loss: 0.8098, lr: 0.000832, batch_cost: 0.2464, reader_cost: 0.00185, ips: 32.4691 samples/sec | ETA 07:31:05 2022-08-24 16:08:45 [INFO] [TRAIN] epoch: 40, iter: 50200/160000, loss: 0.8322, lr: 0.000831, batch_cost: 0.2798, reader_cost: 0.02714, ips: 28.5877 samples/sec | ETA 08:32:06 2022-08-24 16:08:57 [INFO] [TRAIN] epoch: 40, iter: 50250/160000, loss: 0.8181, lr: 0.000831, batch_cost: 0.2461, reader_cost: 0.02634, ips: 32.5041 samples/sec | ETA 07:30:11 2022-08-24 16:09:11 [INFO] [TRAIN] epoch: 40, iter: 50300/160000, loss: 0.8398, lr: 0.000831, batch_cost: 0.2709, reader_cost: 0.01213, ips: 29.5291 samples/sec | ETA 08:15:19 2022-08-24 16:09:20 [INFO] [TRAIN] epoch: 40, iter: 50350/160000, loss: 0.9002, lr: 0.000830, batch_cost: 0.1823, reader_cost: 0.00048, ips: 43.8954 samples/sec | ETA 05:33:03 2022-08-24 16:09:28 [INFO] [TRAIN] epoch: 40, iter: 50400/160000, loss: 0.9766, lr: 0.000830, batch_cost: 0.1626, reader_cost: 0.00083, ips: 49.2067 samples/sec | ETA 04:56:58 2022-08-24 16:09:37 [INFO] [TRAIN] epoch: 40, iter: 50450/160000, loss: 0.8473, lr: 0.000829, batch_cost: 0.1745, reader_cost: 0.00116, ips: 45.8347 samples/sec | ETA 05:18:40 2022-08-24 16:09:46 [INFO] [TRAIN] epoch: 40, iter: 50500/160000, loss: 0.8634, lr: 0.000829, batch_cost: 0.1860, reader_cost: 0.00051, ips: 43.0086 samples/sec | ETA 05:39:28 2022-08-24 16:09:56 [INFO] [TRAIN] epoch: 41, iter: 50550/160000, loss: 0.7948, lr: 0.000829, batch_cost: 0.1934, reader_cost: 0.03097, ips: 41.3562 samples/sec | ETA 05:52:52 2022-08-24 16:10:05 [INFO] [TRAIN] epoch: 41, iter: 50600/160000, loss: 0.8808, lr: 0.000828, batch_cost: 0.1846, reader_cost: 0.00037, ips: 43.3447 samples/sec | ETA 05:36:31 2022-08-24 16:10:14 [INFO] [TRAIN] epoch: 41, iter: 50650/160000, loss: 0.7631, lr: 0.000828, batch_cost: 0.1791, reader_cost: 0.00034, ips: 44.6772 samples/sec | ETA 05:26:20 2022-08-24 16:10:22 [INFO] [TRAIN] epoch: 41, iter: 50700/160000, loss: 0.8557, lr: 0.000828, batch_cost: 0.1616, reader_cost: 0.00901, ips: 49.4960 samples/sec | ETA 04:54:26 2022-08-24 16:10:31 [INFO] [TRAIN] epoch: 41, iter: 50750/160000, loss: 0.8145, lr: 0.000827, batch_cost: 0.1832, reader_cost: 0.00178, ips: 43.6620 samples/sec | ETA 05:33:37 2022-08-24 16:10:41 [INFO] [TRAIN] epoch: 41, iter: 50800/160000, loss: 0.8663, lr: 0.000827, batch_cost: 0.1872, reader_cost: 0.00038, ips: 42.7259 samples/sec | ETA 05:40:46 2022-08-24 16:10:49 [INFO] [TRAIN] epoch: 41, iter: 50850/160000, loss: 0.8578, lr: 0.000826, batch_cost: 0.1791, reader_cost: 0.00052, ips: 44.6692 samples/sec | ETA 05:25:48 2022-08-24 16:10:58 [INFO] [TRAIN] epoch: 41, iter: 50900/160000, loss: 0.8741, lr: 0.000826, batch_cost: 0.1660, reader_cost: 0.00080, ips: 48.2037 samples/sec | ETA 05:01:46 2022-08-24 16:11:05 [INFO] [TRAIN] epoch: 41, iter: 50950/160000, loss: 0.8537, lr: 
0.000826, batch_cost: 0.1536, reader_cost: 0.00053, ips: 52.0779 samples/sec | ETA 04:39:11 2022-08-24 16:11:14 [INFO] [TRAIN] epoch: 41, iter: 51000/160000, loss: 0.9308, lr: 0.000825, batch_cost: 0.1756, reader_cost: 0.00031, ips: 45.5555 samples/sec | ETA 05:19:01 2022-08-24 16:11:14 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 209s - batch_cost: 0.2092 - reader cost: 7.8665e-04 2022-08-24 16:14:44 [INFO] [EVAL] #Images: 2000 mIoU: 0.2946 Acc: 0.7342 Kappa: 0.7139 Dice: 0.4182 2022-08-24 16:14:44 [INFO] [EVAL] Class IoU: [0.6425 0.7593 0.9212 0.6884 0.6619 0.7273 0.7438 0.7279 0.4639 0.5879 0.4315 0.5067 0.6544 0.2334 0.2215 0.3573 0.4884 0.4056 0.5239 0.3379 0.7053 0.4491 0.5351 0.4133 0.305 0.4013 0.4158 0.374 0.3442 0.2445 0.2374 0.4098 0.2686 0.265 0.397 0.3178 0.3185 0.4386 0.2277 0.2331 0.1101 0.0726 0.2857 0.2156 0.2797 0.2073 0.2053 0.3936 0.5317 0.3555 0.4375 0.3369 0.1466 0.1813 0.7031 0.4674 0.8084 0.2621 0.4356 0.1459 0.115 0.2011 0.2707 0.1601 0.348 0.6011 0.1877 0.3638 0.0474 0.3278 0.3023 0.3723 0.3762 0.2029 0.3913 0.2705 0.4478 0.1637 0.1715 0.1581 0.6044 0.2426 0.2575 0.0359 0.384 0.4459 0.0486 0.0483 0.3485 0.4177 0.3369 0.0205 0.1564 0.0849 0.0684 0.0046 0.1672 0.0608 0.1757 0.2902 0.0894 0.003 0.0482 0.5383 0.0293 0.3868 0.1506 0.4529 0.0758 0.2678 0.0729 0.1957 0.0645 0.3606 0.7642 0. 0.377 0.5127 0.0731 0.14 0.4355 0.0028 0.2683 0.0408 0.1968 0.1359 0.3755 0.3439 0.338 0.2096 0.502 0.0035 0.0471 0.1975 0.0615 0.0882 0.0707 0.0035 0.0938 0.2563 0.0702 0.0337 0.3279 0.2882 0.2948 0. 0.1571 0.0172 0.0614 0.0253] 2022-08-24 16:14:44 [INFO] [EVAL] Class Precision: [0.759 0.836 0.9526 0.7869 0.7488 0.8607 0.8771 0.8233 0.5984 0.6834 0.6221 0.6278 0.7499 0.5213 0.3788 0.5138 0.6701 0.7174 0.6886 0.5371 0.8004 0.6398 0.6986 0.6169 0.5052 0.528 0.5575 0.7078 0.6858 0.3607 0.3152 0.5466 0.4884 0.4027 0.5079 0.4597 0.5657 0.7253 0.4479 0.5375 0.1857 0.2228 0.5228 0.4634 0.4476 0.4955 0.2991 0.5991 0.7671 0.4078 0.6096 0.4114 0.2744 0.5201 0.7449 0.646 0.8659 0.6124 0.5555 0.2657 0.1747 0.338 0.3682 0.5374 0.4195 0.6925 0.3551 0.5328 0.1255 0.6048 0.548 0.5034 0.6165 0.2695 0.5415 0.4491 0.6225 0.5888 0.6312 0.4802 0.785 0.5 0.6846 0.0913 0.4435 0.6538 0.5443 0.3634 0.5998 0.6842 0.4205 0.0284 0.2616 0.2466 0.2017 0.0144 0.5534 0.3445 0.4192 0.4117 0.53 0.0044 0.5752 0.6433 0.103 0.5846 0.3329 0.7951 0.1744 0.4396 0.2376 0.2704 0.3925 0.6715 0.7818 0. 0.7215 0.543 0.1183 0.4886 0.7384 0.281 0.7719 0.5897 0.5769 0.6308 0.7486 0.5573 0.6076 0.4663 0.5569 0.3607 0.1613 0.5908 0.6097 0.3129 0.3682 0.2177 0.4188 0.7406 0.1817 0.0763 0.5559 0.5277 0.7309 0. 0.8883 0.3169 0.2677 0.7896] 2022-08-24 16:14:44 [INFO] [EVAL] Class Recall: [0.8071 0.8921 0.9654 0.8461 0.8508 0.8243 0.8304 0.8626 0.6735 0.8079 0.5847 0.7243 0.8371 0.2971 0.3478 0.5397 0.643 0.4827 0.6865 0.4767 0.8559 0.6011 0.6958 0.5561 0.435 0.6258 0.6205 0.4423 0.4086 0.4314 0.4906 0.6208 0.3737 0.4368 0.6452 0.5073 0.4217 0.5259 0.3166 0.2916 0.2127 0.0973 0.3866 0.2874 0.4273 0.2628 0.3957 0.5344 0.634 0.7351 0.6077 0.6504 0.2395 0.2177 0.9261 0.6283 0.9241 0.3142 0.6686 0.2445 0.2517 0.3319 0.5055 0.1857 0.6712 0.82 0.2848 0.5341 0.0709 0.4172 0.4028 0.5884 0.4911 0.4512 0.5851 0.4048 0.6148 0.1848 0.1906 0.1907 0.7242 0.3204 0.2922 0.0559 0.741 0.5838 0.0507 0.0528 0.4541 0.5174 0.6291 0.0692 0.2799 0.1147 0.0937 0.0067 0.1932 0.0688 0.2323 0.496 0.0971 0.01 0.0499 0.7673 0.0394 0.5334 0.2157 0.5128 0.1183 0.4065 0.0951 0.415 0.0717 0.4379 0.9714 0. 
0.4411 0.9019 0.1607 0.1641 0.5149 0.0028 0.2915 0.042 0.23 0.1476 0.4297 0.4731 0.4324 0.2757 0.8358 0.0035 0.0623 0.2287 0.0641 0.1094 0.0804 0.0035 0.1078 0.2816 0.1026 0.0569 0.4442 0.3884 0.3307 0. 0.1603 0.0179 0.0738 0.0255] 2022-08-24 16:14:44 [INFO] [EVAL] The model with the best validation mIoU (0.2946) was saved at iter 51000. 2022-08-24 16:14:57 [INFO] [TRAIN] epoch: 41, iter: 51050/160000, loss: 0.9152, lr: 0.000825, batch_cost: 0.2634, reader_cost: 0.00474, ips: 30.3736 samples/sec | ETA 07:58:15 2022-08-24 16:15:12 [INFO] [TRAIN] epoch: 41, iter: 51100/160000, loss: 0.7977, lr: 0.000824, batch_cost: 0.3000, reader_cost: 0.00094, ips: 26.6691 samples/sec | ETA 09:04:26 2022-08-24 16:15:25 [INFO] [TRAIN] epoch: 41, iter: 51150/160000, loss: 0.7755, lr: 0.000824, batch_cost: 0.2559, reader_cost: 0.01534, ips: 31.2606 samples/sec | ETA 07:44:16 2022-08-24 16:15:39 [INFO] [TRAIN] epoch: 41, iter: 51200/160000, loss: 0.8387, lr: 0.000824, batch_cost: 0.2883, reader_cost: 0.00775, ips: 27.7507 samples/sec | ETA 08:42:44 2022-08-24 16:15:52 [INFO] [TRAIN] epoch: 41, iter: 51250/160000, loss: 0.8606, lr: 0.000823, batch_cost: 0.2479, reader_cost: 0.01557, ips: 32.2708 samples/sec | ETA 07:29:19 2022-08-24 16:16:06 [INFO] [TRAIN] epoch: 41, iter: 51300/160000, loss: 0.8891, lr: 0.000823, batch_cost: 0.2822, reader_cost: 0.00137, ips: 28.3506 samples/sec | ETA 08:31:13 2022-08-24 16:16:15 [INFO] [TRAIN] epoch: 41, iter: 51350/160000, loss: 0.9168, lr: 0.000823, batch_cost: 0.1889, reader_cost: 0.00366, ips: 42.3608 samples/sec | ETA 05:41:58 2022-08-24 16:16:27 [INFO] [TRAIN] epoch: 41, iter: 51400/160000, loss: 0.8765, lr: 0.000822, batch_cost: 0.2351, reader_cost: 0.00083, ips: 34.0329 samples/sec | ETA 07:05:28 2022-08-24 16:16:35 [INFO] [TRAIN] epoch: 41, iter: 51450/160000, loss: 0.9242, lr: 0.000822, batch_cost: 0.1638, reader_cost: 0.00064, ips: 48.8441 samples/sec | ETA 04:56:18 2022-08-24 16:16:45 [INFO] [TRAIN] epoch: 41, iter: 51500/160000, loss: 0.8804, lr: 0.000821, batch_cost: 0.1886, reader_cost: 0.00055, ips: 42.4132 samples/sec | ETA 05:41:05 2022-08-24 16:16:54 [INFO] [TRAIN] epoch: 41, iter: 51550/160000, loss: 0.8522, lr: 0.000821, batch_cost: 0.1797, reader_cost: 0.00066, ips: 44.5303 samples/sec | ETA 05:24:43 2022-08-24 16:17:03 [INFO] [TRAIN] epoch: 41, iter: 51600/160000, loss: 0.8893, lr: 0.000821, batch_cost: 0.1881, reader_cost: 0.00068, ips: 42.5258 samples/sec | ETA 05:39:52 2022-08-24 16:17:11 [INFO] [TRAIN] epoch: 41, iter: 51650/160000, loss: 0.8470, lr: 0.000820, batch_cost: 0.1505, reader_cost: 0.00145, ips: 53.1678 samples/sec | ETA 04:31:43 2022-08-24 16:17:19 [INFO] [TRAIN] epoch: 41, iter: 51700/160000, loss: 0.8524, lr: 0.000820, batch_cost: 0.1705, reader_cost: 0.00568, ips: 46.9074 samples/sec | ETA 05:07:50 2022-08-24 16:17:28 [INFO] [TRAIN] epoch: 41, iter: 51750/160000, loss: 0.8630, lr: 0.000820, batch_cost: 0.1783, reader_cost: 0.00046, ips: 44.8714 samples/sec | ETA 05:21:39 2022-08-24 16:17:38 [INFO] [TRAIN] epoch: 42, iter: 51800/160000, loss: 0.8362, lr: 0.000819, batch_cost: 0.1981, reader_cost: 0.02701, ips: 40.3763 samples/sec | ETA 05:57:18 2022-08-24 16:17:47 [INFO] [TRAIN] epoch: 42, iter: 51850/160000, loss: 0.7711, lr: 0.000819, batch_cost: 0.1825, reader_cost: 0.00089, ips: 43.8329 samples/sec | ETA 05:28:58 2022-08-24 16:17:56 [INFO] [TRAIN] epoch: 42, iter: 51900/160000, loss: 0.9017, lr: 0.000818, batch_cost: 0.1789, reader_cost: 0.00051, ips: 44.7201 samples/sec | ETA 05:22:18 2022-08-24 16:18:06 [INFO] [TRAIN] epoch: 42, 
iter: 51950/160000, loss: 0.9116, lr: 0.000818, batch_cost: 0.2046, reader_cost: 0.00036, ips: 39.1047 samples/sec | ETA 06:08:24 2022-08-24 16:18:15 [INFO] [TRAIN] epoch: 42, iter: 52000/160000, loss: 0.8325, lr: 0.000818, batch_cost: 0.1713, reader_cost: 0.00067, ips: 46.6989 samples/sec | ETA 05:08:21 2022-08-24 16:18:15 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 214s - batch_cost: 0.2143 - reader cost: 7.3212e-04 2022-08-24 16:21:50 [INFO] [EVAL] #Images: 2000 mIoU: 0.2886 Acc: 0.7353 Kappa: 0.7149 Dice: 0.4088 2022-08-24 16:21:50 [INFO] [EVAL] Class IoU: [0.6454 0.742 0.9213 0.6896 0.6592 0.7276 0.758 0.7362 0.4555 0.6254 0.4454 0.5091 0.65 0.2849 0.1333 0.3622 0.4757 0.3955 0.5258 0.3357 0.703 0.4562 0.5455 0.4421 0.3111 0.3682 0.4406 0.3704 0.3938 0.2795 0.1611 0.3988 0.2089 0.2673 0.3524 0.3757 0.315 0.4294 0.2219 0.258 0.1052 0.0567 0.2926 0.2029 0.2913 0.2417 0.2491 0.398 0.6468 0.4327 0.4275 0.1773 0.1791 0.1916 0.621 0.4075 0.8037 0.2686 0.2174 0.1968 0.0731 0.1739 0.2992 0.0746 0.3317 0.6383 0.1908 0.3715 0.0375 0.3343 0.3217 0.3856 0.37 0.2293 0.3921 0.2577 0.3476 0.2405 0.1487 0.1793 0.5878 0.2389 0.2386 0.0135 0.4577 0.4488 0.0613 0.0496 0.2756 0.3926 0.3784 0.0216 0.2346 0.0533 0.0077 0.0048 0.0881 0.1417 0.0562 0.3032 0.0007 0.003 0.0628 0.5721 0. 0.3235 0.1649 0.451 0.0615 0.2327 0.0159 0.1437 0.0722 0.5177 0.7303 0.0001 0.305 0.5333 0.0312 0.0915 0.4528 0.0004 0.1671 0.0836 0.2047 0.1472 0.3902 0.2917 0.2569 0.1537 0.4504 0. 0.0527 0.1362 0.0412 0.0855 0.0647 0.0093 0.1006 0.2782 0.1445 0.0454 0.2877 0.3278 0.2535 0.0054 0.2067 0.0221 0.0408 0.0428] 2022-08-24 16:21:50 [INFO] [EVAL] Class Precision: [0.745 0.8508 0.9661 0.7922 0.7418 0.857 0.8695 0.8078 0.5503 0.7175 0.6461 0.6492 0.7439 0.5183 0.5277 0.5154 0.6105 0.6406 0.7063 0.5509 0.7696 0.5918 0.7357 0.5419 0.4907 0.4139 0.641 0.684 0.6425 0.6442 0.3636 0.4927 0.3922 0.4201 0.6071 0.5932 0.6027 0.7305 0.4795 0.4863 0.2071 0.2776 0.5148 0.5478 0.4122 0.4084 0.4043 0.6501 0.6923 0.5275 0.6666 0.1899 0.3408 0.5085 0.6727 0.5227 0.8462 0.6208 0.6444 0.3924 0.1229 0.465 0.4588 0.5302 0.397 0.7833 0.364 0.5352 0.1391 0.6413 0.5153 0.5122 0.6252 0.2918 0.621 0.5002 0.5518 0.5089 0.2678 0.3765 0.7229 0.468 0.6973 0.07 0.5826 0.6995 0.2485 0.3863 0.3842 0.5672 0.5268 0.0303 0.3963 0.2747 0.0278 0.0202 0.3752 0.621 0.216 0.4776 0.2235 0.0043 0.5087 0.7711 0. 0.4092 0.4655 0.8971 0.1936 0.3758 0.1386 0.1801 0.3057 0.5978 0.7368 0.0019 0.8673 0.5738 0.0851 0.4876 0.7024 0.2598 0.8585 0.6364 0.622 0.541 0.6736 0.3956 0.4835 0.4031 0.5056 0. 0.3606 0.6608 0.5283 0.2706 0.3558 0.1616 0.3712 0.6885 0.218 0.1248 0.5614 0.6597 0.6003 0.0059 0.8129 0.267 0.3678 0.6261] 2022-08-24 16:21:50 [INFO] [EVAL] Class Recall: [0.8284 0.8531 0.9521 0.8419 0.8555 0.8282 0.8554 0.8926 0.7257 0.8298 0.5892 0.7023 0.8374 0.3875 0.1514 0.5493 0.6829 0.5083 0.6729 0.4622 0.8904 0.6656 0.6784 0.706 0.4595 0.7692 0.5849 0.4469 0.5043 0.3305 0.2244 0.6766 0.309 0.4235 0.4565 0.5061 0.3976 0.5103 0.2923 0.3547 0.1761 0.0665 0.4041 0.2437 0.4982 0.372 0.3935 0.5065 0.9077 0.7067 0.5437 0.727 0.274 0.2351 0.8898 0.6491 0.9412 0.3213 0.247 0.2831 0.153 0.2175 0.4623 0.0799 0.6684 0.7752 0.2863 0.5485 0.0489 0.4112 0.4614 0.6093 0.4754 0.5168 0.5154 0.3472 0.4843 0.3132 0.2504 0.2551 0.7587 0.3279 0.2661 0.0165 0.681 0.556 0.0752 0.0539 0.4938 0.5604 0.5733 0.0701 0.365 0.062 0.0105 0.0063 0.1032 0.1552 0.0706 0.4536 0.0007 0.0097 0.0669 0.6891 0. 
0.6072 0.2034 0.4756 0.0828 0.3792 0.0177 0.4161 0.0863 0.7945 0.9881 0.0001 0.3199 0.8829 0.0469 0.1012 0.5603 0.0004 0.1719 0.0878 0.2338 0.1682 0.4812 0.5262 0.3541 0.1989 0.805 0. 0.0581 0.1464 0.0427 0.1111 0.0732 0.0098 0.1212 0.3183 0.2998 0.0666 0.3711 0.3946 0.305 0.0611 0.217 0.0235 0.0438 0.0439] 2022-08-24 16:21:50 [INFO] [EVAL] The model with the best validation mIoU (0.2946) was saved at iter 51000. 2022-08-24 16:22:03 [INFO] [TRAIN] epoch: 42, iter: 52050/160000, loss: 0.8772, lr: 0.000817, batch_cost: 0.2624, reader_cost: 0.00468, ips: 30.4845 samples/sec | ETA 07:52:09 2022-08-24 16:22:16 [INFO] [TRAIN] epoch: 42, iter: 52100/160000, loss: 0.8761, lr: 0.000817, batch_cost: 0.2627, reader_cost: 0.00655, ips: 30.4572 samples/sec | ETA 07:52:21 2022-08-24 16:22:29 [INFO] [TRAIN] epoch: 42, iter: 52150/160000, loss: 0.9533, lr: 0.000817, batch_cost: 0.2599, reader_cost: 0.00157, ips: 30.7853 samples/sec | ETA 07:47:06 2022-08-24 16:22:43 [INFO] [TRAIN] epoch: 42, iter: 52200/160000, loss: 0.8574, lr: 0.000816, batch_cost: 0.2784, reader_cost: 0.01924, ips: 28.7363 samples/sec | ETA 08:20:10 2022-08-24 16:22:56 [INFO] [TRAIN] epoch: 42, iter: 52250/160000, loss: 0.8402, lr: 0.000816, batch_cost: 0.2681, reader_cost: 0.00141, ips: 29.8401 samples/sec | ETA 08:01:27 2022-08-24 16:23:06 [INFO] [TRAIN] epoch: 42, iter: 52300/160000, loss: 0.9093, lr: 0.000815, batch_cost: 0.1903, reader_cost: 0.00053, ips: 42.0365 samples/sec | ETA 05:41:36 2022-08-24 16:23:17 [INFO] [TRAIN] epoch: 42, iter: 52350/160000, loss: 0.8293, lr: 0.000815, batch_cost: 0.2269, reader_cost: 0.00060, ips: 35.2575 samples/sec | ETA 06:47:06 2022-08-24 16:23:28 [INFO] [TRAIN] epoch: 42, iter: 52400/160000, loss: 0.8919, lr: 0.000815, batch_cost: 0.2258, reader_cost: 0.00089, ips: 35.4278 samples/sec | ETA 06:44:57 2022-08-24 16:23:40 [INFO] [TRAIN] epoch: 42, iter: 52450/160000, loss: 0.8321, lr: 0.000814, batch_cost: 0.2230, reader_cost: 0.00035, ips: 35.8775 samples/sec | ETA 06:39:41 2022-08-24 16:23:48 [INFO] [TRAIN] epoch: 42, iter: 52500/160000, loss: 0.8193, lr: 0.000814, batch_cost: 0.1678, reader_cost: 0.00073, ips: 47.6632 samples/sec | ETA 05:00:43 2022-08-24 16:23:56 [INFO] [TRAIN] epoch: 42, iter: 52550/160000, loss: 0.8753, lr: 0.000814, batch_cost: 0.1693, reader_cost: 0.00055, ips: 47.2509 samples/sec | ETA 05:03:12 2022-08-24 16:24:07 [INFO] [TRAIN] epoch: 42, iter: 52600/160000, loss: 0.8540, lr: 0.000813, batch_cost: 0.2107, reader_cost: 0.00040, ips: 37.9695 samples/sec | ETA 06:17:08 2022-08-24 16:24:16 [INFO] [TRAIN] epoch: 42, iter: 52650/160000, loss: 0.8304, lr: 0.000813, batch_cost: 0.1752, reader_cost: 0.00588, ips: 45.6654 samples/sec | ETA 05:13:26 2022-08-24 16:24:25 [INFO] [TRAIN] epoch: 42, iter: 52700/160000, loss: 0.7718, lr: 0.000812, batch_cost: 0.1840, reader_cost: 0.00060, ips: 43.4829 samples/sec | ETA 05:29:01 2022-08-24 16:24:33 [INFO] [TRAIN] epoch: 42, iter: 52750/160000, loss: 0.8535, lr: 0.000812, batch_cost: 0.1633, reader_cost: 0.00067, ips: 48.9815 samples/sec | ETA 04:51:56 2022-08-24 16:24:41 [INFO] [TRAIN] epoch: 42, iter: 52800/160000, loss: 0.8709, lr: 0.000812, batch_cost: 0.1545, reader_cost: 0.00097, ips: 51.7640 samples/sec | ETA 04:36:07 2022-08-24 16:24:49 [INFO] [TRAIN] epoch: 42, iter: 52850/160000, loss: 0.8773, lr: 0.000811, batch_cost: 0.1623, reader_cost: 0.01195, ips: 49.3020 samples/sec | ETA 04:49:46 2022-08-24 16:24:58 [INFO] [TRAIN] epoch: 42, iter: 52900/160000, loss: 0.8229, lr: 0.000811, batch_cost: 0.1769, reader_cost: 0.00399, ips: 
45.2342 samples/sec | ETA 05:15:41 2022-08-24 16:25:07 [INFO] [TRAIN] epoch: 42, iter: 52950/160000, loss: 0.8722, lr: 0.000810, batch_cost: 0.1814, reader_cost: 0.00056, ips: 44.1003 samples/sec | ETA 05:23:39 2022-08-24 16:25:16 [INFO] [TRAIN] epoch: 42, iter: 53000/160000, loss: 0.8741, lr: 0.000810, batch_cost: 0.1870, reader_cost: 0.00037, ips: 42.7772 samples/sec | ETA 05:33:30 2022-08-24 16:25:16 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 230s - batch_cost: 0.2301 - reader cost: 0.0010 2022-08-24 16:29:07 [INFO] [EVAL] #Images: 2000 mIoU: 0.2895 Acc: 0.7373 Kappa: 0.7165 Dice: 0.4110 2022-08-24 16:29:07 [INFO] [EVAL] Class IoU: [0.6435 0.7595 0.9242 0.6935 0.6578 0.7137 0.7625 0.713 0.4767 0.6422 0.4447 0.5266 0.6507 0.2651 0.1899 0.3441 0.4514 0.3915 0.5338 0.3399 0.7186 0.4446 0.535 0.4486 0.3172 0.3221 0.4017 0.3765 0.3783 0.3066 0.1408 0.3884 0.2596 0.26 0.2125 0.3636 0.3219 0.4564 0.2294 0.2414 0.1015 0.0711 0.2984 0.1825 0.3044 0.2159 0.2327 0.3934 0.6232 0.4519 0.4461 0.3014 0.1291 0.1446 0.6133 0.4487 0.8123 0.2817 0.3834 0.2136 0.0896 0.1792 0.2954 0.1222 0.3532 0.6235 0.1986 0.3665 0.0642 0.2933 0.2806 0.3682 0.3734 0.1943 0.3899 0.2673 0.3852 0.1774 0.1802 0.2203 0.5667 0.2393 0.1794 0.0137 0.4507 0.4648 0.0597 0.0448 0.331 0.4119 0.331 0.0341 0.1779 0.0577 0.0033 0.0042 0.1437 0.0938 0.1062 0.2782 0.0004 0.0085 0.0536 0.0955 0.0002 0.3732 0.1039 0.4095 0.0469 0.2053 0.1282 0.1934 0.0586 0.4344 0.7487 0. 0.4717 0.4576 0.0758 0.1185 0.4211 0.0001 0.1975 0.0954 0.246 0.2069 0.385 0.2891 0.3138 0.1534 0.5717 0. 0.1006 0.1557 0.027 0.0663 0.0597 0.0058 0.1183 0.2714 0.2414 0.0725 0.2937 0.2994 0.3148 0. 0.1377 0.0194 0.0349 0.0262] 2022-08-24 16:29:07 [INFO] [EVAL] Class Precision: [0.734 0.82 0.9611 0.8137 0.7723 0.863 0.8745 0.7687 0.6328 0.7719 0.6171 0.6801 0.7354 0.4741 0.4561 0.5318 0.5538 0.6269 0.7034 0.5314 0.8066 0.6714 0.7085 0.5647 0.4887 0.5663 0.5252 0.6417 0.6619 0.4438 0.3775 0.5441 0.486 0.4069 0.4756 0.5326 0.5389 0.6951 0.3739 0.4973 0.1863 0.2632 0.5815 0.5954 0.426 0.4303 0.3542 0.623 0.7037 0.5649 0.6353 0.3928 0.4291 0.5484 0.6686 0.5902 0.8602 0.5894 0.6251 0.2993 0.1821 0.4184 0.4451 0.6178 0.4406 0.7243 0.3211 0.5396 0.157 0.6424 0.5346 0.5031 0.5525 0.2847 0.5378 0.5665 0.59 0.3812 0.6169 0.4165 0.7925 0.4869 0.7458 0.078 0.6587 0.5979 0.1953 0.3718 0.6617 0.6831 0.4275 0.0538 0.4031 0.2296 0.0223 0.0156 0.4149 0.6864 0.5208 0.4973 0.2197 0.012 0.4259 0.339 0.0032 0.5036 0.2462 0.8664 0.1594 0.3605 0.2161 0.404 0.4106 0.6806 0.7563 0.0008 0.8148 0.5217 0.1438 0.4957 0.7475 0.1687 0.9905 0.6103 0.4894 0.5464 0.6663 0.3836 0.5486 0.5042 0.702 0. 0.7399 0.6242 0.5271 0.3706 0.4715 0.1312 0.3581 0.709 0.4442 0.119 0.5991 0.5355 0.6126 0. 
0.8811 0.3671 0.2375 0.8927] 2022-08-24 16:29:07 [INFO] [EVAL] Class Recall: [0.8392 0.9114 0.9602 0.8244 0.8161 0.8049 0.8563 0.9078 0.659 0.7927 0.6142 0.7 0.8497 0.3755 0.2455 0.4937 0.7094 0.5104 0.6888 0.4853 0.8683 0.5683 0.686 0.6857 0.4749 0.4276 0.6309 0.4766 0.4689 0.4979 0.1834 0.5757 0.3579 0.4187 0.2776 0.534 0.4442 0.5705 0.3724 0.3193 0.1823 0.0888 0.38 0.2084 0.516 0.3022 0.4042 0.5163 0.845 0.6932 0.5998 0.5642 0.1559 0.1641 0.8811 0.6517 0.9359 0.3504 0.498 0.4273 0.1498 0.2387 0.4676 0.1322 0.6405 0.8175 0.3423 0.5333 0.098 0.3505 0.3712 0.5785 0.5353 0.3795 0.5863 0.3361 0.5261 0.2492 0.2029 0.3185 0.6654 0.3199 0.1911 0.0164 0.5881 0.6762 0.0792 0.0484 0.3984 0.5093 0.5945 0.0855 0.2416 0.0715 0.0039 0.0058 0.1802 0.098 0.1177 0.3871 0.0004 0.0282 0.0578 0.1174 0.0002 0.5904 0.1524 0.4371 0.0623 0.3228 0.2397 0.2707 0.064 0.5457 0.9867 0. 0.5284 0.7884 0.1383 0.1348 0.491 0.0001 0.1979 0.1016 0.3309 0.2498 0.4769 0.5401 0.423 0.1807 0.7549 0. 0.1043 0.1718 0.0277 0.0747 0.064 0.0061 0.1501 0.3055 0.3459 0.1565 0.3655 0.4045 0.393 0. 0.1404 0.0201 0.0394 0.0262] 2022-08-24 16:29:07 [INFO] [EVAL] The model with the best validation mIoU (0.2946) was saved at iter 51000. 2022-08-24 16:29:24 [INFO] [TRAIN] epoch: 43, iter: 53050/160000, loss: 0.8999, lr: 0.000810, batch_cost: 0.3377, reader_cost: 0.06830, ips: 23.6925 samples/sec | ETA 10:01:52 2022-08-24 16:29:39 [INFO] [TRAIN] epoch: 43, iter: 53100/160000, loss: 0.7684, lr: 0.000809, batch_cost: 0.2966, reader_cost: 0.00261, ips: 26.9718 samples/sec | ETA 08:48:27 2022-08-24 16:29:52 [INFO] [TRAIN] epoch: 43, iter: 53150/160000, loss: 0.9325, lr: 0.000809, batch_cost: 0.2691, reader_cost: 0.00103, ips: 29.7289 samples/sec | ETA 07:59:13 2022-08-24 16:30:02 [INFO] [TRAIN] epoch: 43, iter: 53200/160000, loss: 0.8270, lr: 0.000809, batch_cost: 0.2072, reader_cost: 0.00098, ips: 38.6089 samples/sec | ETA 06:08:49 2022-08-24 16:30:13 [INFO] [TRAIN] epoch: 43, iter: 53250/160000, loss: 0.9169, lr: 0.000808, batch_cost: 0.2206, reader_cost: 0.00054, ips: 36.2724 samples/sec | ETA 06:32:24 2022-08-24 16:30:25 [INFO] [TRAIN] epoch: 43, iter: 53300/160000, loss: 0.8688, lr: 0.000808, batch_cost: 0.2328, reader_cost: 0.00065, ips: 34.3696 samples/sec | ETA 06:53:55 2022-08-24 16:30:33 [INFO] [TRAIN] epoch: 43, iter: 53350/160000, loss: 0.8776, lr: 0.000807, batch_cost: 0.1503, reader_cost: 0.00043, ips: 53.2322 samples/sec | ETA 04:27:07 2022-08-24 16:30:43 [INFO] [TRAIN] epoch: 43, iter: 53400/160000, loss: 0.8597, lr: 0.000807, batch_cost: 0.2001, reader_cost: 0.00285, ips: 39.9811 samples/sec | ETA 05:55:30 2022-08-24 16:30:51 [INFO] [TRAIN] epoch: 43, iter: 53450/160000, loss: 0.8724, lr: 0.000807, batch_cost: 0.1735, reader_cost: 0.00239, ips: 46.0979 samples/sec | ETA 05:08:11 2022-08-24 16:31:00 [INFO] [TRAIN] epoch: 43, iter: 53500/160000, loss: 0.8945, lr: 0.000806, batch_cost: 0.1658, reader_cost: 0.00412, ips: 48.2573 samples/sec | ETA 04:54:15 2022-08-24 16:31:08 [INFO] [TRAIN] epoch: 43, iter: 53550/160000, loss: 0.8428, lr: 0.000806, batch_cost: 0.1602, reader_cost: 0.00058, ips: 49.9347 samples/sec | ETA 04:44:14 2022-08-24 16:31:17 [INFO] [TRAIN] epoch: 43, iter: 53600/160000, loss: 0.7807, lr: 0.000806, batch_cost: 0.1876, reader_cost: 0.00043, ips: 42.6328 samples/sec | ETA 05:32:45 2022-08-24 16:31:26 [INFO] [TRAIN] epoch: 43, iter: 53650/160000, loss: 0.8464, lr: 0.000805, batch_cost: 0.1769, reader_cost: 0.00076, ips: 45.2110 samples/sec | ETA 05:13:38 2022-08-24 16:31:34 [INFO] [TRAIN] epoch: 43, iter: 
53700/160000, loss: 0.8320, lr: 0.000805, batch_cost: 0.1574, reader_cost: 0.00053, ips: 50.8396 samples/sec | ETA 04:38:47 2022-08-24 16:31:43 [INFO] [TRAIN] epoch: 43, iter: 53750/160000, loss: 0.8742, lr: 0.000804, batch_cost: 0.1873, reader_cost: 0.00034, ips: 42.7050 samples/sec | ETA 05:31:43 2022-08-24 16:31:51 [INFO] [TRAIN] epoch: 43, iter: 53800/160000, loss: 0.8604, lr: 0.000804, batch_cost: 0.1672, reader_cost: 0.00040, ips: 47.8434 samples/sec | ETA 04:55:57 2022-08-24 16:32:00 [INFO] [TRAIN] epoch: 43, iter: 53850/160000, loss: 0.8721, lr: 0.000804, batch_cost: 0.1650, reader_cost: 0.00048, ips: 48.4855 samples/sec | ETA 04:51:54 2022-08-24 16:32:09 [INFO] [TRAIN] epoch: 43, iter: 53900/160000, loss: 0.8242, lr: 0.000803, batch_cost: 0.1784, reader_cost: 0.00065, ips: 44.8486 samples/sec | ETA 05:15:25 2022-08-24 16:32:17 [INFO] [TRAIN] epoch: 43, iter: 53950/160000, loss: 0.8549, lr: 0.000803, batch_cost: 0.1687, reader_cost: 0.00032, ips: 47.4305 samples/sec | ETA 04:58:07 2022-08-24 16:32:25 [INFO] [TRAIN] epoch: 43, iter: 54000/160000, loss: 0.8654, lr: 0.000803, batch_cost: 0.1654, reader_cost: 0.00046, ips: 48.3570 samples/sec | ETA 04:52:16 2022-08-24 16:32:25 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 226s - batch_cost: 0.2262 - reader cost: 6.5749e-04 2022-08-24 16:36:12 [INFO] [EVAL] #Images: 2000 mIoU: 0.2869 Acc: 0.7389 Kappa: 0.7185 Dice: 0.4083 2022-08-24 16:36:12 [INFO] [EVAL] Class IoU: [0.6462 0.7529 0.9265 0.6921 0.6688 0.7346 0.7567 0.729 0.4669 0.6241 0.4466 0.5194 0.6535 0.2817 0.1691 0.3612 0.481 0.4214 0.5316 0.3412 0.7145 0.4081 0.5532 0.4506 0.3261 0.3419 0.4027 0.3746 0.3938 0.3047 0.1723 0.4049 0.2765 0.2476 0.343 0.3592 0.3296 0.4505 0.2258 0.2183 0.1076 0.0385 0.3021 0.1928 0.2769 0.2425 0.2339 0.376 0.3963 0.3882 0.4076 0.2381 0.1597 0.1866 0.6408 0.3902 0.7838 0.287 0.3839 0.1946 0.1076 0.1374 0.3016 0.1444 0.3466 0.618 0.2002 0.3655 0.0743 0.3029 0.3317 0.3613 0.3799 0.2119 0.3955 0.268 0.3152 0.2367 0.1155 0.237 0.563 0.2441 0.1529 0.0145 0.4722 0.4287 0.0606 0.0601 0.3316 0.3593 0.388 0.0149 0.1722 0.0349 0.0303 0.0084 0.0933 0.1495 0.1901 0.3013 0.0075 0.0025 0.0672 0.0244 0.0002 0.388 0.1573 0.483 0.0833 0.2626 0.0314 0.3035 0.0742 0.5498 0.6762 0.0014 0.3389 0.4685 0.0512 0.1363 0.3892 0.0028 0.2871 0.1322 0.1939 0.1314 0.4147 0.3593 0.2743 0.1881 0.5267 0. 0.1726 0.1473 0.0406 0.0954 0.0628 0.006 0.1051 0.2024 0.1227 0.0313 0.2367 0.0287 0.2514 0. 
0.2062 0.0122 0.0231 0.0369] 2022-08-24 16:36:12 [INFO] [EVAL] Class Precision: [0.752 0.8112 0.9603 0.794 0.7799 0.842 0.8678 0.7918 0.6423 0.7371 0.6435 0.6589 0.7385 0.4884 0.4689 0.5192 0.6295 0.6707 0.6675 0.5365 0.8054 0.6153 0.7244 0.5597 0.5337 0.6494 0.5327 0.6008 0.6328 0.4165 0.3614 0.548 0.5203 0.4052 0.5285 0.504 0.569 0.603 0.5721 0.5468 0.2134 0.2751 0.5276 0.549 0.4886 0.4494 0.3631 0.6822 0.6693 0.4469 0.5297 0.2885 0.36 0.5802 0.7179 0.4819 0.8295 0.5712 0.6116 0.3866 0.2067 0.2631 0.5097 0.5427 0.4365 0.7711 0.3281 0.5308 0.1886 0.6592 0.5179 0.5305 0.5492 0.2491 0.5241 0.5755 0.7094 0.566 0.4354 0.5461 0.8183 0.4645 0.7473 0.1262 0.5715 0.6577 0.3802 0.325 0.5464 0.5462 0.5693 0.0245 0.3666 0.2365 0.0932 0.0213 0.3998 0.5165 0.4685 0.4835 0.5261 0.0037 0.6152 0.2137 0.0018 0.6785 0.2919 0.7933 0.289 0.3733 0.2211 0.6537 0.4469 0.7247 0.6832 0.1148 0.8584 0.5365 0.0877 0.5496 0.7682 0.7162 0.765 0.5862 0.5387 0.5412 0.6954 0.5276 0.3946 0.6327 0.61 0.0385 0.5591 0.6263 0.5669 0.2327 0.3294 0.1721 0.3764 0.7593 0.3045 0.0398 0.5219 0.2556 0.8722 0. 0.8984 0.3266 0.201 0.7839] 2022-08-24 16:36:12 [INFO] [EVAL] Class Recall: [0.8212 0.9129 0.9635 0.8435 0.8244 0.852 0.8552 0.9018 0.631 0.8027 0.5933 0.7105 0.8502 0.3995 0.2091 0.5427 0.671 0.5313 0.7231 0.4839 0.8635 0.5478 0.7006 0.6982 0.4561 0.4194 0.6226 0.4987 0.5104 0.5316 0.2478 0.608 0.3711 0.3891 0.4942 0.5556 0.4393 0.6404 0.2717 0.2665 0.1784 0.0429 0.4141 0.2291 0.3898 0.3451 0.3966 0.4558 0.4928 0.7472 0.6389 0.5767 0.223 0.2157 0.8563 0.6721 0.9343 0.3659 0.5076 0.2815 0.1832 0.2232 0.4248 0.1644 0.6273 0.7568 0.3391 0.54 0.1093 0.3592 0.48 0.5313 0.5521 0.586 0.617 0.3341 0.3619 0.2892 0.1358 0.2952 0.6434 0.3396 0.1612 0.0161 0.7311 0.5519 0.0673 0.0686 0.4575 0.5123 0.5491 0.0366 0.2451 0.0393 0.043 0.0136 0.1085 0.1738 0.2424 0.4444 0.0075 0.008 0.0702 0.0268 0.0002 0.4754 0.2543 0.5525 0.1048 0.4697 0.0353 0.3617 0.0817 0.6949 0.9851 0.0014 0.3589 0.787 0.1096 0.1534 0.4409 0.0028 0.3149 0.1459 0.2325 0.1479 0.5068 0.5298 0.4735 0.2112 0.7942 0. 0.1998 0.1615 0.0419 0.1392 0.0719 0.0062 0.1273 0.2163 0.1705 0.1281 0.3022 0.0314 0.2611 0. 0.2112 0.0125 0.0254 0.0373] 2022-08-24 16:36:12 [INFO] [EVAL] The model with the best validation mIoU (0.2946) was saved at iter 51000. 
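For reference, the per-class rows printed above (Class IoU, Class Precision, Class Recall) and the summary metrics (mIoU, Acc, Dice) are all simple functions of per-class confusion counts. The sketch below is illustrative only: the tp/fp/fn arrays are made-up numbers, not values from this run, and the code does not reproduce PaddleSeg's evaluator; it only shows the arithmetic behind the quantities reported in these [EVAL] lines.

```python
import numpy as np

# Hypothetical per-class pixel counts (NOT taken from this log), accumulated
# over the validation set: true positives, false positives, false negatives.
tp = np.array([1.2e6, 8.0e5, 5.0e4])
fp = np.array([3.0e5, 1.0e5, 2.0e4])
fn = np.array([2.5e5, 1.5e5, 9.0e4])

iou = tp / (tp + fp + fn)              # "Class IoU" row
precision = tp / (tp + fp)             # "Class Precision" row
recall = tp / (tp + fn)                # "Class Recall" row
dice = 2 * tp / (2 * tp + fp + fn)     # per-class Dice, equal to 2*IoU / (1 + IoU)

miou = iou.mean()                      # reported mIoU: unweighted mean over classes
acc = tp.sum() / (tp + fn).sum()       # reported Acc: overall pixel accuracy
print(f"mIoU: {miou:.4f} Acc: {acc:.4f} Dice: {dice.mean():.4f}")
```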
2022-08-24 16:36:26 [INFO] [TRAIN] epoch: 43, iter: 54050/160000, loss: 0.8500, lr: 0.000802, batch_cost: 0.2826, reader_cost: 0.01288, ips: 28.3037 samples/sec | ETA 08:19:06 2022-08-24 16:36:38 [INFO] [TRAIN] epoch: 43, iter: 54100/160000, loss: 0.8971, lr: 0.000802, batch_cost: 0.2353, reader_cost: 0.00080, ips: 34.0031 samples/sec | ETA 06:55:15 2022-08-24 16:36:48 [INFO] [TRAIN] epoch: 43, iter: 54150/160000, loss: 0.8622, lr: 0.000801, batch_cost: 0.1943, reader_cost: 0.00090, ips: 41.1786 samples/sec | ETA 05:42:44 2022-08-24 16:36:59 [INFO] [TRAIN] epoch: 43, iter: 54200/160000, loss: 0.8985, lr: 0.000801, batch_cost: 0.2250, reader_cost: 0.00077, ips: 35.5477 samples/sec | ETA 06:36:50 2022-08-24 16:37:11 [INFO] [TRAIN] epoch: 43, iter: 54250/160000, loss: 0.8203, lr: 0.000801, batch_cost: 0.2411, reader_cost: 0.00069, ips: 33.1865 samples/sec | ETA 07:04:52 2022-08-24 16:37:23 [INFO] [TRAIN] epoch: 43, iter: 54300/160000, loss: 0.8841, lr: 0.000800, batch_cost: 0.2306, reader_cost: 0.00047, ips: 34.6927 samples/sec | ETA 06:46:13 2022-08-24 16:37:33 [INFO] [TRAIN] epoch: 44, iter: 54350/160000, loss: 0.8206, lr: 0.000800, batch_cost: 0.2154, reader_cost: 0.02865, ips: 37.1479 samples/sec | ETA 06:19:12 2022-08-24 16:37:42 [INFO] [TRAIN] epoch: 44, iter: 54400/160000, loss: 0.8289, lr: 0.000800, batch_cost: 0.1713, reader_cost: 0.00340, ips: 46.7127 samples/sec | ETA 05:01:25 2022-08-24 16:37:52 [INFO] [TRAIN] epoch: 44, iter: 54450/160000, loss: 0.7693, lr: 0.000799, batch_cost: 0.1936, reader_cost: 0.00044, ips: 41.3221 samples/sec | ETA 05:40:34 2022-08-24 16:38:02 [INFO] [TRAIN] epoch: 44, iter: 54500/160000, loss: 0.8160, lr: 0.000799, batch_cost: 0.1985, reader_cost: 0.00044, ips: 40.2938 samples/sec | ETA 05:49:06 2022-08-24 16:38:11 [INFO] [TRAIN] epoch: 44, iter: 54550/160000, loss: 0.8788, lr: 0.000798, batch_cost: 0.1872, reader_cost: 0.00066, ips: 42.7354 samples/sec | ETA 05:29:00 2022-08-24 16:38:20 [INFO] [TRAIN] epoch: 44, iter: 54600/160000, loss: 0.8559, lr: 0.000798, batch_cost: 0.1772, reader_cost: 0.01369, ips: 45.1447 samples/sec | ETA 05:11:17 2022-08-24 16:38:28 [INFO] [TRAIN] epoch: 44, iter: 54650/160000, loss: 0.8593, lr: 0.000798, batch_cost: 0.1568, reader_cost: 0.00039, ips: 51.0141 samples/sec | ETA 04:35:20 2022-08-24 16:38:36 [INFO] [TRAIN] epoch: 44, iter: 54700/160000, loss: 0.7793, lr: 0.000797, batch_cost: 0.1711, reader_cost: 0.00134, ips: 46.7603 samples/sec | ETA 05:00:15 2022-08-24 16:38:45 [INFO] [TRAIN] epoch: 44, iter: 54750/160000, loss: 0.8045, lr: 0.000797, batch_cost: 0.1775, reader_cost: 0.00042, ips: 45.0726 samples/sec | ETA 05:11:20 2022-08-24 16:38:53 [INFO] [TRAIN] epoch: 44, iter: 54800/160000, loss: 0.7974, lr: 0.000796, batch_cost: 0.1573, reader_cost: 0.00216, ips: 50.8593 samples/sec | ETA 04:35:47 2022-08-24 16:39:01 [INFO] [TRAIN] epoch: 44, iter: 54850/160000, loss: 0.8344, lr: 0.000796, batch_cost: 0.1657, reader_cost: 0.00069, ips: 48.2694 samples/sec | ETA 04:50:27 2022-08-24 16:39:10 [INFO] [TRAIN] epoch: 44, iter: 54900/160000, loss: 0.8542, lr: 0.000796, batch_cost: 0.1769, reader_cost: 0.00201, ips: 45.2198 samples/sec | ETA 05:09:53 2022-08-24 16:39:19 [INFO] [TRAIN] epoch: 44, iter: 54950/160000, loss: 0.8054, lr: 0.000795, batch_cost: 0.1818, reader_cost: 0.00104, ips: 44.0055 samples/sec | ETA 05:18:17 2022-08-24 16:39:28 [INFO] [TRAIN] epoch: 44, iter: 55000/160000, loss: 0.8591, lr: 0.000795, batch_cost: 0.1722, reader_cost: 0.00132, ips: 46.4569 samples/sec | ETA 05:01:21 2022-08-24 16:39:28 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 235s - batch_cost: 0.2350 - reader cost: 0.0012 2022-08-24 16:43:23 [INFO] [EVAL] #Images: 2000 mIoU: 0.2901 Acc: 0.7371 Kappa: 0.7168 Dice: 0.4110 2022-08-24 16:43:23 [INFO] [EVAL] Class IoU: [0.6463 0.7631 0.9193 0.6929 0.6607 0.7252 0.757 0.7326 0.4652 0.633 0.4392 0.5167 0.6445 0.2847 0.1783 0.3652 0.454 0.4072 0.5398 0.3451 0.7056 0.4102 0.5324 0.4524 0.2981 0.4268 0.361 0.3495 0.3314 0.2787 0.16 0.4066 0.2411 0.2598 0.2715 0.4 0.3259 0.4152 0.2156 0.2739 0.0844 0.0542 0.2908 0.2044 0.254 0.2326 0.248 0.3677 0.6132 0.431 0.4577 0.268 0.1924 0.2187 0.5982 0.4393 0.7997 0.299 0.3837 0.195 0.1465 0.1341 0.2739 0.0876 0.3849 0.6441 0.1673 0.3473 0.0505 0.2911 0.3154 0.3578 0.3697 0.2304 0.4061 0.2747 0.3346 0.1811 0.1496 0.3305 0.6369 0.2606 0.2296 0.0391 0.42 0.4333 0.0538 0.0446 0.3115 0.4005 0.3593 0.013 0.2013 0.0292 0.0155 0.0034 0.1572 0.1093 0.1084 0.2819 0.0107 0.0061 0.0299 0.5008 0.0004 0.395 0.1479 0.4445 0.0619 0.2565 0.0273 0.2441 0.056 0.4761 0.7036 0.0049 0.3082 0.4565 0.0498 0.0913 0.4329 0. 0.2695 0.0699 0.2151 0.1069 0.4111 0.2829 0.3852 0.2042 0.4675 0.0045 0.0753 0.1586 0.0365 0.0922 0.054 0.0021 0.0951 0.3365 0.0335 0.0381 0.2539 0.1078 0.3024 0. 0.218 0.009 0.0375 0.0334] 2022-08-24 16:43:23 [INFO] [EVAL] Class Precision: [0.7509 0.8478 0.9468 0.8014 0.7366 0.8563 0.8926 0.8074 0.5831 0.7811 0.6856 0.6448 0.7239 0.4454 0.5073 0.5201 0.562 0.6788 0.6839 0.5115 0.7741 0.6433 0.6624 0.6056 0.5156 0.6081 0.5059 0.7597 0.7105 0.4821 0.4143 0.6341 0.4744 0.4494 0.4296 0.5761 0.551 0.7112 0.4186 0.483 0.1671 0.2948 0.5105 0.424 0.3164 0.4591 0.3726 0.568 0.6422 0.5253 0.5913 0.3338 0.3528 0.6515 0.6774 0.5754 0.8388 0.5753 0.6216 0.4087 0.2049 0.2668 0.3958 0.6089 0.4859 0.8003 0.3197 0.4527 0.1164 0.5111 0.5279 0.4268 0.6519 0.2909 0.5439 0.4348 0.5462 0.3826 0.4673 0.5478 0.774 0.6544 0.7222 0.3001 0.6404 0.7153 0.1744 0.3867 0.627 0.5789 0.4554 0.0185 0.2861 0.2238 0.0784 0.0145 0.2392 0.6996 0.3164 0.3983 0.7561 0.0078 0.3696 0.5441 0.002 0.5633 0.351 0.8049 0.2391 0.3225 0.1929 0.3713 0.5206 0.7113 0.7092 0.1804 0.8988 0.505 0.1118 0.571 0.6806 0. 0.592 0.7075 0.4989 0.6975 0.7631 0.3794 0.6294 0.4685 0.5239 0.3218 0.6006 0.6021 0.6337 0.2672 0.3968 0.169 0.3289 0.6011 0.127 0.0593 0.5695 0.3671 0.6126 0. 0.8908 0.3895 0.3582 0.6816] 2022-08-24 16:43:23 [INFO] [EVAL] Class Recall: [0.8226 0.8843 0.9693 0.8365 0.865 0.8257 0.8328 0.8878 0.697 0.7695 0.55 0.7222 0.8546 0.4411 0.2156 0.5507 0.7026 0.5044 0.7192 0.5149 0.8887 0.5309 0.7306 0.6413 0.4141 0.5887 0.5576 0.3929 0.3832 0.3977 0.2068 0.5313 0.3289 0.3811 0.4245 0.5669 0.4437 0.4995 0.3077 0.3874 0.1457 0.0623 0.4033 0.283 0.5629 0.3205 0.4259 0.5104 0.9315 0.7061 0.6696 0.5763 0.2974 0.2476 0.8366 0.65 0.945 0.3837 0.5007 0.2717 0.3398 0.2123 0.4707 0.0929 0.6493 0.7674 0.2599 0.5987 0.082 0.4033 0.4394 0.6888 0.4607 0.5254 0.6158 0.4272 0.4634 0.2558 0.1803 0.4545 0.7824 0.3022 0.2519 0.043 0.5496 0.5236 0.0722 0.048 0.3823 0.5651 0.63 0.0423 0.4044 0.0325 0.019 0.0045 0.3142 0.1147 0.1416 0.491 0.0107 0.0276 0.0315 0.8631 0.0005 0.5693 0.2036 0.4981 0.0771 0.5564 0.0309 0.4161 0.059 0.5901 0.989 0.005 0.3193 0.8264 0.0823 0.098 0.5432 0. 0.331 0.072 0.2744 0.1121 0.4712 0.5264 0.4981 0.2657 0.8127 0.0045 0.0792 0.1771 0.0373 0.1234 0.0588 0.0022 0.118 0.4332 0.0436 0.096 0.3143 0.1324 0.3739 0. 0.2239 0.0091 0.0402 0.0339] 2022-08-24 16:43:23 [INFO] [EVAL] The model with the best validation mIoU (0.2946) was saved at iter 51000. 
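The [TRAIN] lines follow equally simple bookkeeping: ips is roughly the effective batch size divided by batch_cost, and the printed ETA is the remaining iterations times the current batch_cost. A quick sanity check against the iter 55000 line just before this evaluation (batch_cost 0.1722 s, ips 46.4569, ETA 05:01:21); the samples-per-step figure is inferred from ips * batch_cost, it is not stated explicitly in the log.

```python
# Values copied from the TRAIN line at iter 55000 directly above this evaluation.
batch_cost = 0.1722            # seconds per training step
ips = 46.4569                  # samples per second
iters_done, iters_total = 55_000, 160_000

samples_per_step = ips * batch_cost                     # ~8.0 samples per step
eta_seconds = (iters_total - iters_done) * batch_cost   # 105000 * 0.1722 ~= 18081 s

h, rem = divmod(round(eta_seconds), 3600)
m, s = divmod(rem, 60)
print(f"samples/step ~ {samples_per_step:.1f}, ETA {h:02d}:{m:02d}:{s:02d}")  # ETA 05:01:21
```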
2022-08-24 16:43:34 [INFO] [TRAIN] epoch: 44, iter: 55050/160000, loss: 0.8723, lr: 0.000795, batch_cost: 0.2186, reader_cost: 0.00390, ips: 36.6047 samples/sec | ETA 06:22:16 2022-08-24 16:43:44 [INFO] [TRAIN] epoch: 44, iter: 55100/160000, loss: 0.8227, lr: 0.000794, batch_cost: 0.2042, reader_cost: 0.00147, ips: 39.1740 samples/sec | ETA 05:57:02 2022-08-24 16:43:54 [INFO] [TRAIN] epoch: 44, iter: 55150/160000, loss: 0.8347, lr: 0.000794, batch_cost: 0.1980, reader_cost: 0.00089, ips: 40.4078 samples/sec | ETA 05:45:58 2022-08-24 16:44:05 [INFO] [TRAIN] epoch: 44, iter: 55200/160000, loss: 0.8320, lr: 0.000793, batch_cost: 0.2181, reader_cost: 0.00076, ips: 36.6839 samples/sec | ETA 06:20:54 2022-08-24 16:44:14 [INFO] [TRAIN] epoch: 44, iter: 55250/160000, loss: 0.8481, lr: 0.000793, batch_cost: 0.1713, reader_cost: 0.00032, ips: 46.6909 samples/sec | ETA 04:59:07 2022-08-24 16:44:23 [INFO] [TRAIN] epoch: 44, iter: 55300/160000, loss: 0.8737, lr: 0.000793, batch_cost: 0.1824, reader_cost: 0.00037, ips: 43.8611 samples/sec | ETA 05:18:16 2022-08-24 16:44:32 [INFO] [TRAIN] epoch: 44, iter: 55350/160000, loss: 0.8199, lr: 0.000792, batch_cost: 0.1824, reader_cost: 0.00055, ips: 43.8636 samples/sec | ETA 05:18:06 2022-08-24 16:44:40 [INFO] [TRAIN] epoch: 44, iter: 55400/160000, loss: 0.8305, lr: 0.000792, batch_cost: 0.1624, reader_cost: 0.00092, ips: 49.2614 samples/sec | ETA 04:43:06 2022-08-24 16:44:49 [INFO] [TRAIN] epoch: 44, iter: 55450/160000, loss: 0.8980, lr: 0.000792, batch_cost: 0.1767, reader_cost: 0.00042, ips: 45.2690 samples/sec | ETA 05:07:56 2022-08-24 16:44:57 [INFO] [TRAIN] epoch: 44, iter: 55500/160000, loss: 0.8575, lr: 0.000791, batch_cost: 0.1623, reader_cost: 0.00235, ips: 49.2848 samples/sec | ETA 04:42:42 2022-08-24 16:45:05 [INFO] [TRAIN] epoch: 44, iter: 55550/160000, loss: 0.8333, lr: 0.000791, batch_cost: 0.1494, reader_cost: 0.00370, ips: 53.5359 samples/sec | ETA 04:20:08 2022-08-24 16:45:15 [INFO] [TRAIN] epoch: 45, iter: 55600/160000, loss: 0.8527, lr: 0.000790, batch_cost: 0.2185, reader_cost: 0.05438, ips: 36.6145 samples/sec | ETA 06:20:10 2022-08-24 16:45:23 [INFO] [TRAIN] epoch: 45, iter: 55650/160000, loss: 0.8290, lr: 0.000790, batch_cost: 0.1495, reader_cost: 0.00051, ips: 53.5260 samples/sec | ETA 04:19:56 2022-08-24 16:45:32 [INFO] [TRAIN] epoch: 45, iter: 55700/160000, loss: 0.8507, lr: 0.000790, batch_cost: 0.1765, reader_cost: 0.00042, ips: 45.3138 samples/sec | ETA 05:06:53 2022-08-24 16:45:40 [INFO] [TRAIN] epoch: 45, iter: 55750/160000, loss: 0.8709, lr: 0.000789, batch_cost: 0.1723, reader_cost: 0.00057, ips: 46.4379 samples/sec | ETA 04:59:19 2022-08-24 16:45:50 [INFO] [TRAIN] epoch: 45, iter: 55800/160000, loss: 0.8388, lr: 0.000789, batch_cost: 0.1938, reader_cost: 0.00045, ips: 41.2740 samples/sec | ETA 05:36:36 2022-08-24 16:45:59 [INFO] [TRAIN] epoch: 45, iter: 55850/160000, loss: 0.8051, lr: 0.000789, batch_cost: 0.1699, reader_cost: 0.00044, ips: 47.0969 samples/sec | ETA 04:54:51 2022-08-24 16:46:08 [INFO] [TRAIN] epoch: 45, iter: 55900/160000, loss: 0.8830, lr: 0.000788, batch_cost: 0.1869, reader_cost: 0.00059, ips: 42.7951 samples/sec | ETA 05:24:20 2022-08-24 16:46:17 [INFO] [TRAIN] epoch: 45, iter: 55950/160000, loss: 0.7675, lr: 0.000788, batch_cost: 0.1762, reader_cost: 0.00054, ips: 45.3985 samples/sec | ETA 05:05:35 2022-08-24 16:46:25 [INFO] [TRAIN] epoch: 45, iter: 56000/160000, loss: 0.8586, lr: 0.000787, batch_cost: 0.1699, reader_cost: 0.00079, ips: 47.0728 samples/sec | ETA 04:54:34 2022-08-24 16:46:25 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 225s - batch_cost: 0.2250 - reader cost: 6.7048e-04 2022-08-24 16:50:11 [INFO] [EVAL] #Images: 2000 mIoU: 0.2940 Acc: 0.7392 Kappa: 0.7194 Dice: 0.4167 2022-08-24 16:50:11 [INFO] [EVAL] Class IoU: [0.6486 0.7634 0.9246 0.6952 0.6674 0.7319 0.7412 0.7244 0.4678 0.6011 0.4382 0.4976 0.6445 0.2784 0.2347 0.3608 0.4837 0.4267 0.5378 0.3476 0.7105 0.422 0.5283 0.4459 0.32 0.439 0.3642 0.3699 0.3651 0.2729 0.1599 0.3835 0.2497 0.2731 0.3396 0.3677 0.3339 0.4497 0.2444 0.279 0.1066 0.0662 0.2964 0.2079 0.2928 0.279 0.2331 0.3531 0.5933 0.4368 0.4385 0.2979 0.1499 0.1816 0.6234 0.4758 0.8135 0.2359 0.4012 0.1635 0.0465 0.1577 0.2785 0.1381 0.3583 0.5852 0.1824 0.3499 0.0269 0.3232 0.281 0.3794 0.3802 0.2294 0.3822 0.2873 0.4109 0.2163 0.136 0.2413 0.6082 0.2457 0.2076 0.0218 0.4352 0.4352 0.0525 0.0507 0.325 0.4107 0.3947 0.0113 0.1789 0.0371 0.0482 0.0057 0.1256 0.1148 0.0441 0.2865 0.0462 0.0028 0.0807 0.3388 0.0228 0.349 0.1881 0.4478 0.0915 0.2027 0.04 0.3395 0.0611 0.5063 0.7326 0.004 0.4634 0.4897 0.0897 0.12 0.41 0.0046 0.2336 0.0901 0.233 0.133 0.4068 0.2647 0.2773 0.2129 0.4729 0. 0.0161 0.196 0.0347 0.0931 0.0637 0.0082 0.1193 0.3084 0.2815 0.0034 0.2277 0.1128 0.3417 0. 0.2417 0.0207 0.027 0.039 ] 2022-08-24 16:50:11 [INFO] [EVAL] Class Precision: [0.7721 0.825 0.9609 0.8003 0.7695 0.8493 0.843 0.7886 0.5892 0.7632 0.6727 0.6392 0.7232 0.4675 0.4526 0.4991 0.6037 0.6803 0.688 0.5395 0.7909 0.6352 0.6312 0.5556 0.4856 0.612 0.5043 0.6974 0.705 0.3936 0.3882 0.5075 0.5277 0.4215 0.5009 0.4999 0.5358 0.6251 0.4572 0.4508 0.1944 0.2468 0.5777 0.5235 0.4138 0.426 0.3681 0.594 0.6548 0.5524 0.5778 0.3864 0.3507 0.6525 0.6623 0.6351 0.8686 0.6204 0.5541 0.317 0.1247 0.363 0.3944 0.5862 0.4322 0.6537 0.3801 0.5412 0.2522 0.605 0.6165 0.564 0.5625 0.2977 0.4987 0.46 0.5866 0.5958 0.427 0.6393 0.7201 0.4218 0.7581 0.0957 0.6717 0.7182 0.1344 0.432 0.5252 0.602 0.5335 0.0149 0.2739 0.2347 0.4284 0.0248 0.4302 0.7998 0.7179 0.5354 0.8566 0.0062 0.5628 0.9134 0.2075 0.4692 0.5024 0.6519 0.3083 0.3557 0.2019 0.654 0.4721 0.7078 0.7413 0.2298 0.8041 0.586 0.1616 0.5519 0.7502 0.3792 0.9052 0.6021 0.5324 0.6761 0.7061 0.3414 0.535 0.4075 0.5106 0. 0.5258 0.5727 0.4977 0.2435 0.3773 0.1692 0.3236 0.6576 0.5542 0.0049 0.512 0.3556 0.6184 0. 0.7989 0.2932 0.2052 0.6297] 2022-08-24 16:50:11 [INFO] [EVAL] Class Recall: [0.8022 0.911 0.9607 0.8411 0.8341 0.841 0.8599 0.8988 0.6942 0.7389 0.557 0.692 0.8557 0.4078 0.3277 0.5656 0.7087 0.5338 0.7113 0.4943 0.8749 0.5571 0.7642 0.6932 0.484 0.6083 0.5674 0.4406 0.4309 0.4709 0.2137 0.6108 0.3215 0.437 0.5133 0.5818 0.4699 0.6159 0.3444 0.4228 0.1911 0.0829 0.3784 0.2565 0.5003 0.4469 0.3885 0.4655 0.8633 0.6762 0.6454 0.5652 0.2074 0.201 0.9141 0.6548 0.9277 0.2757 0.5925 0.2525 0.069 0.2181 0.4865 0.153 0.677 0.8481 0.2596 0.4975 0.0293 0.4097 0.3405 0.5368 0.5398 0.5 0.6208 0.4335 0.5784 0.2535 0.1664 0.2794 0.7964 0.3705 0.2224 0.0275 0.5528 0.5249 0.0794 0.0543 0.4602 0.5637 0.6027 0.0443 0.3404 0.0421 0.0515 0.0073 0.1507 0.1182 0.0449 0.3814 0.0466 0.0052 0.0861 0.35 0.025 0.5765 0.2312 0.5885 0.1151 0.3202 0.0476 0.4138 0.0655 0.6401 0.9842 0.0041 0.5224 0.7488 0.1679 0.133 0.4748 0.0046 0.2394 0.0958 0.2929 0.1421 0.4897 0.5408 0.3653 0.3083 0.8651 0. 0.0163 0.2296 0.036 0.131 0.0712 0.0085 0.1589 0.3674 0.3639 0.0112 0.2908 0.1418 0.433 0. 0.2574 0.0218 0.0301 0.0399] 2022-08-24 16:50:11 [INFO] [EVAL] The model with the best validation mIoU (0.2946) was saved at iter 51000. 
2022-08-24 16:50:23 [INFO] [TRAIN] epoch: 45, iter: 56050/160000, loss: 0.8309, lr: 0.000787, batch_cost: 0.2393, reader_cost: 0.00306, ips: 33.4279 samples/sec | ETA 06:54:37 2022-08-24 16:50:34 [INFO] [TRAIN] epoch: 45, iter: 56100/160000, loss: 0.8386, lr: 0.000787, batch_cost: 0.2268, reader_cost: 0.00719, ips: 35.2689 samples/sec | ETA 06:32:47 2022-08-24 16:50:44 [INFO] [TRAIN] epoch: 45, iter: 56150/160000, loss: 0.8811, lr: 0.000786, batch_cost: 0.2079, reader_cost: 0.00433, ips: 38.4833 samples/sec | ETA 05:59:48 2022-08-24 16:50:55 [INFO] [TRAIN] epoch: 45, iter: 56200/160000, loss: 0.8149, lr: 0.000786, batch_cost: 0.2054, reader_cost: 0.00054, ips: 38.9420 samples/sec | ETA 05:55:24 2022-08-24 16:51:03 [INFO] [TRAIN] epoch: 45, iter: 56250/160000, loss: 0.8635, lr: 0.000785, batch_cost: 0.1730, reader_cost: 0.00072, ips: 46.2406 samples/sec | ETA 04:59:09 2022-08-24 16:51:12 [INFO] [TRAIN] epoch: 45, iter: 56300/160000, loss: 0.8956, lr: 0.000785, batch_cost: 0.1675, reader_cost: 0.00037, ips: 47.7711 samples/sec | ETA 04:49:26 2022-08-24 16:51:20 [INFO] [TRAIN] epoch: 45, iter: 56350/160000, loss: 0.9160, lr: 0.000785, batch_cost: 0.1619, reader_cost: 0.00063, ips: 49.4091 samples/sec | ETA 04:39:42 2022-08-24 16:51:28 [INFO] [TRAIN] epoch: 45, iter: 56400/160000, loss: 0.8524, lr: 0.000784, batch_cost: 0.1612, reader_cost: 0.00037, ips: 49.6289 samples/sec | ETA 04:38:19 2022-08-24 16:51:35 [INFO] [TRAIN] epoch: 45, iter: 56450/160000, loss: 0.8647, lr: 0.000784, batch_cost: 0.1452, reader_cost: 0.00082, ips: 55.0854 samples/sec | ETA 04:10:38 2022-08-24 16:51:44 [INFO] [TRAIN] epoch: 45, iter: 56500/160000, loss: 0.8099, lr: 0.000784, batch_cost: 0.1811, reader_cost: 0.00115, ips: 44.1710 samples/sec | ETA 05:12:25 2022-08-24 16:51:53 [INFO] [TRAIN] epoch: 45, iter: 56550/160000, loss: 0.8217, lr: 0.000783, batch_cost: 0.1775, reader_cost: 0.00068, ips: 45.0748 samples/sec | ETA 05:06:00 2022-08-24 16:52:02 [INFO] [TRAIN] epoch: 45, iter: 56600/160000, loss: 0.8342, lr: 0.000783, batch_cost: 0.1805, reader_cost: 0.00046, ips: 44.3305 samples/sec | ETA 05:10:59 2022-08-24 16:52:11 [INFO] [TRAIN] epoch: 45, iter: 56650/160000, loss: 0.8503, lr: 0.000782, batch_cost: 0.1861, reader_cost: 0.00039, ips: 42.9858 samples/sec | ETA 05:20:34 2022-08-24 16:52:21 [INFO] [TRAIN] epoch: 45, iter: 56700/160000, loss: 0.8876, lr: 0.000782, batch_cost: 0.1845, reader_cost: 0.00036, ips: 43.3620 samples/sec | ETA 05:17:38 2022-08-24 16:52:29 [INFO] [TRAIN] epoch: 45, iter: 56750/160000, loss: 0.8225, lr: 0.000782, batch_cost: 0.1724, reader_cost: 0.00139, ips: 46.3941 samples/sec | ETA 04:56:43 2022-08-24 16:52:38 [INFO] [TRAIN] epoch: 45, iter: 56800/160000, loss: 0.8288, lr: 0.000781, batch_cost: 0.1805, reader_cost: 0.00075, ips: 44.3169 samples/sec | ETA 05:10:29 2022-08-24 16:52:49 [INFO] [TRAIN] epoch: 46, iter: 56850/160000, loss: 0.8606, lr: 0.000781, batch_cost: 0.2083, reader_cost: 0.02702, ips: 38.4014 samples/sec | ETA 05:58:08 2022-08-24 16:52:58 [INFO] [TRAIN] epoch: 46, iter: 56900/160000, loss: 0.8150, lr: 0.000781, batch_cost: 0.1935, reader_cost: 0.00089, ips: 41.3477 samples/sec | ETA 05:32:27 2022-08-24 16:53:07 [INFO] [TRAIN] epoch: 46, iter: 56950/160000, loss: 0.8223, lr: 0.000780, batch_cost: 0.1689, reader_cost: 0.00043, ips: 47.3537 samples/sec | ETA 04:50:09 2022-08-24 16:53:16 [INFO] [TRAIN] epoch: 46, iter: 57000/160000, loss: 0.8588, lr: 0.000780, batch_cost: 0.1919, reader_cost: 0.00057, ips: 41.6882 samples/sec | ETA 05:29:25 2022-08-24 16:53:16 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 235s - batch_cost: 0.2350 - reader cost: 6.0537e-04 2022-08-24 16:57:12 [INFO] [EVAL] #Images: 2000 mIoU: 0.2926 Acc: 0.7370 Kappa: 0.7165 Dice: 0.4139 2022-08-24 16:57:12 [INFO] [EVAL] Class IoU: [0.648 0.7583 0.9207 0.6929 0.6685 0.722 0.7505 0.728 0.4721 0.6109 0.4392 0.5028 0.6545 0.286 0.1767 0.3488 0.4851 0.4179 0.528 0.3458 0.723 0.45 0.5381 0.4549 0.3244 0.3533 0.3873 0.3724 0.3661 0.2648 0.1824 0.3969 0.2188 0.2375 0.2874 0.3753 0.3381 0.3885 0.2179 0.2318 0.0955 0.0721 0.2848 0.2203 0.2923 0.291 0.2274 0.3516 0.5379 0.4609 0.4574 0.3218 0.1936 0.1543 0.6664 0.4821 0.8096 0.2029 0.3788 0.1554 0.0888 0.1352 0.2782 0.0745 0.3535 0.5221 0.1898 0.3712 0.0202 0.3219 0.3107 0.3896 0.3691 0.2097 0.4191 0.2761 0.4152 0.178 0.1355 0.2287 0.6257 0.2492 0.2928 0.0135 0.4446 0.4531 0.0603 0.0587 0.3566 0.3731 0.4145 0.0201 0.1785 0.0306 0.142 0.002 0.1363 0.0637 0.053 0.3183 0.0562 0.0112 0.0275 0.5388 0.0014 0.516 0.1396 0.4756 0.0714 0.2526 0.0379 0.2744 0.0544 0.4105 0.6999 0.0009 0.3825 0.4749 0.0174 0.0716 0.4723 0. 0.2264 0.1138 0.228 0.1234 0.3898 0.289 0.3092 0.1273 0.5007 0.0049 0.0607 0.1935 0.0468 0.095 0.0737 0.001 0.0914 0.2908 0.1126 0.0308 0.2961 0.1923 0.3352 0. 0.1711 0.0326 0.045 0.0236] 2022-08-24 16:57:12 [INFO] [EVAL] Class Precision: [0.7428 0.8357 0.9499 0.8294 0.7736 0.8863 0.8654 0.7905 0.5799 0.7142 0.6309 0.6413 0.7558 0.4617 0.5187 0.4675 0.6258 0.6731 0.7319 0.5566 0.8211 0.6522 0.6934 0.5777 0.4827 0.5161 0.53 0.6845 0.7173 0.4167 0.387 0.5376 0.415 0.3142 0.4374 0.5638 0.5133 0.6574 0.4647 0.5348 0.2118 0.222 0.55 0.5085 0.3917 0.4488 0.3153 0.501 0.6854 0.5828 0.6215 0.4387 0.3697 0.7446 0.7207 0.7096 0.8605 0.6145 0.6938 0.2997 0.1469 0.2715 0.4267 0.563 0.4524 0.5792 0.2919 0.5122 0.1209 0.6014 0.5491 0.4673 0.6448 0.2619 0.6089 0.4163 0.606 0.5785 0.5342 0.4591 0.7915 0.4822 0.6715 0.0733 0.5888 0.6075 0.1498 0.36 0.7402 0.5287 0.6106 0.0263 0.2844 0.2521 0.3704 0.0156 0.4933 0.7632 0.6308 0.4452 0.8325 0.015 0.579 0.7061 0.0222 0.6747 0.4097 0.8955 0.1722 0.3885 0.2608 0.4463 0.4588 0.6675 0.7046 0.2941 0.8917 0.5024 0.0599 0.549 0.6609 0. 0.8228 0.6449 0.554 0.6444 0.7683 0.3841 0.627 0.4234 0.5789 0.3311 0.5058 0.5635 0.5648 0.187 0.3258 0.0411 0.314 0.656 0.2137 0.0423 0.5381 0.4108 0.7187 0. 0.8835 0.2039 0.2756 0.7027] 2022-08-24 16:57:12 [INFO] [EVAL] Class Recall: [0.8355 0.8912 0.9677 0.8081 0.8311 0.7958 0.8497 0.9021 0.7175 0.8087 0.5912 0.6996 0.83 0.429 0.2114 0.5786 0.6834 0.5243 0.6547 0.4772 0.8583 0.592 0.7061 0.6815 0.4973 0.5284 0.5899 0.4495 0.4279 0.4207 0.2566 0.6027 0.3163 0.4932 0.456 0.5289 0.4976 0.4872 0.2908 0.2903 0.148 0.0965 0.3713 0.2799 0.5352 0.4528 0.4491 0.541 0.7143 0.6878 0.634 0.5469 0.2891 0.1629 0.8983 0.6005 0.9319 0.2325 0.4548 0.2439 0.1834 0.2123 0.4442 0.079 0.6179 0.841 0.3519 0.5742 0.0237 0.4093 0.4171 0.7009 0.4634 0.5123 0.5735 0.4505 0.5687 0.2046 0.1536 0.3131 0.7493 0.3402 0.3418 0.0163 0.6448 0.6407 0.0916 0.0655 0.4076 0.5589 0.5634 0.0783 0.324 0.0337 0.1872 0.0022 0.1585 0.065 0.0547 0.5275 0.0568 0.0421 0.028 0.6946 0.0015 0.6869 0.1748 0.5036 0.1088 0.4193 0.0424 0.4161 0.0582 0.5161 0.9905 0.0009 0.4011 0.8965 0.0239 0.0761 0.6234 0. 0.238 0.1214 0.2792 0.1324 0.4417 0.5385 0.3789 0.1541 0.7875 0.005 0.0645 0.2276 0.0485 0.1619 0.0869 0.001 0.1142 0.3431 0.1922 0.1017 0.397 0.2655 0.3859 0. 0.1751 0.0374 0.0511 0.0239] 2022-08-24 16:57:12 [INFO] [EVAL] The model with the best validation mIoU (0.2946) was saved at iter 51000. 
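Since the best validation mIoU has stayed at 0.2946 (iter 51000) for several evaluations in a row, it can help to pull the evaluation history out of this log and inspect the trend. A minimal parsing sketch, assuming the console output is captured in a file named workerlog.0 (the file name is an assumption) and that entries keep the "[TRAIN] ... iter: N/160000" and "[EVAL] #Images: ... mIoU: ..." formats shown above:

```python
import re

LOG_PATH = "workerlog.0"   # assumed file name for this console output

train_iter_re = re.compile(r"\[TRAIN\] epoch: \d+, iter: (\d+)/\d+")
eval_miou_re = re.compile(r"\[EVAL\] #Images: \d+ mIoU: ([0-9.]+)")

history, last_iter = [], None
with open(LOG_PATH) as f:
    for line in f:
        m = train_iter_re.search(line)
        if m:
            last_iter = int(m.group(1))          # iteration of the latest TRAIN entry
        m = eval_miou_re.search(line)
        if m and last_iter is not None:
            history.append((last_iter, float(m.group(1))))

for it, miou in history:
    print(f"iter {it:6d}: mIoU {miou:.4f}")
```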
2022-08-24 16:57:22 [INFO] [TRAIN] epoch: 46, iter: 57050/160000, loss: 0.8047, lr: 0.000779, batch_cost: 0.1930, reader_cost: 0.00965, ips: 41.4446 samples/sec | ETA 05:31:12 2022-08-24 16:57:32 [INFO] [TRAIN] epoch: 46, iter: 57100/160000, loss: 0.8203, lr: 0.000779, batch_cost: 0.2104, reader_cost: 0.00112, ips: 38.0214 samples/sec | ETA 06:00:50 2022-08-24 16:57:42 [INFO] [TRAIN] epoch: 46, iter: 57150/160000, loss: 0.8092, lr: 0.000779, batch_cost: 0.2060, reader_cost: 0.00074, ips: 38.8360 samples/sec | ETA 05:53:06 2022-08-24 16:57:52 [INFO] [TRAIN] epoch: 46, iter: 57200/160000, loss: 0.8750, lr: 0.000778, batch_cost: 0.1851, reader_cost: 0.00093, ips: 43.2152 samples/sec | ETA 05:17:10 2022-08-24 16:58:01 [INFO] [TRAIN] epoch: 46, iter: 57250/160000, loss: 0.8110, lr: 0.000778, batch_cost: 0.1964, reader_cost: 0.00055, ips: 40.7390 samples/sec | ETA 05:36:17 2022-08-24 16:58:11 [INFO] [TRAIN] epoch: 46, iter: 57300/160000, loss: 0.8444, lr: 0.000778, batch_cost: 0.1908, reader_cost: 0.00159, ips: 41.9305 samples/sec | ETA 05:26:34 2022-08-24 16:58:20 [INFO] [TRAIN] epoch: 46, iter: 57350/160000, loss: 0.7843, lr: 0.000777, batch_cost: 0.1853, reader_cost: 0.00062, ips: 43.1795 samples/sec | ETA 05:16:58 2022-08-24 16:58:29 [INFO] [TRAIN] epoch: 46, iter: 57400/160000, loss: 0.8703, lr: 0.000777, batch_cost: 0.1726, reader_cost: 0.00043, ips: 46.3404 samples/sec | ETA 04:55:12 2022-08-24 16:58:37 [INFO] [TRAIN] epoch: 46, iter: 57450/160000, loss: 0.8628, lr: 0.000776, batch_cost: 0.1595, reader_cost: 0.00106, ips: 50.1505 samples/sec | ETA 04:32:38 2022-08-24 16:58:45 [INFO] [TRAIN] epoch: 46, iter: 57500/160000, loss: 0.8480, lr: 0.000776, batch_cost: 0.1600, reader_cost: 0.00296, ips: 50.0078 samples/sec | ETA 04:33:17 2022-08-24 16:58:53 [INFO] [TRAIN] epoch: 46, iter: 57550/160000, loss: 0.8515, lr: 0.000776, batch_cost: 0.1676, reader_cost: 0.00314, ips: 47.7466 samples/sec | ETA 04:46:05 2022-08-24 16:59:01 [INFO] [TRAIN] epoch: 46, iter: 57600/160000, loss: 0.8924, lr: 0.000775, batch_cost: 0.1640, reader_cost: 0.01311, ips: 48.7672 samples/sec | ETA 04:39:58 2022-08-24 16:59:09 [INFO] [TRAIN] epoch: 46, iter: 57650/160000, loss: 0.8591, lr: 0.000775, batch_cost: 0.1538, reader_cost: 0.00064, ips: 52.0127 samples/sec | ETA 04:22:22 2022-08-24 16:59:18 [INFO] [TRAIN] epoch: 46, iter: 57700/160000, loss: 0.8344, lr: 0.000775, batch_cost: 0.1696, reader_cost: 0.00223, ips: 47.1705 samples/sec | ETA 04:49:09 2022-08-24 16:59:27 [INFO] [TRAIN] epoch: 46, iter: 57750/160000, loss: 0.8279, lr: 0.000774, batch_cost: 0.1831, reader_cost: 0.00096, ips: 43.6979 samples/sec | ETA 05:11:59 2022-08-24 16:59:35 [INFO] [TRAIN] epoch: 46, iter: 57800/160000, loss: 0.8214, lr: 0.000774, batch_cost: 0.1694, reader_cost: 0.00041, ips: 47.2185 samples/sec | ETA 04:48:35 2022-08-24 16:59:45 [INFO] [TRAIN] epoch: 46, iter: 57850/160000, loss: 0.8281, lr: 0.000773, batch_cost: 0.1937, reader_cost: 0.00068, ips: 41.2906 samples/sec | ETA 05:29:51 2022-08-24 16:59:55 [INFO] [TRAIN] epoch: 46, iter: 57900/160000, loss: 0.8145, lr: 0.000773, batch_cost: 0.2072, reader_cost: 0.00061, ips: 38.6101 samples/sec | ETA 05:52:35 2022-08-24 17:00:03 [INFO] [TRAIN] epoch: 46, iter: 57950/160000, loss: 0.8341, lr: 0.000773, batch_cost: 0.1551, reader_cost: 0.00033, ips: 51.5718 samples/sec | ETA 04:23:50 2022-08-24 17:00:12 [INFO] [TRAIN] epoch: 46, iter: 58000/160000, loss: 0.8543, lr: 0.000772, batch_cost: 0.1707, reader_cost: 0.00052, ips: 46.8675 samples/sec | ETA 04:50:10 2022-08-24 17:00:12 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 249s - batch_cost: 0.2488 - reader cost: 6.3015e-04 2022-08-24 17:04:21 [INFO] [EVAL] #Images: 2000 mIoU: 0.2909 Acc: 0.7382 Kappa: 0.7178 Dice: 0.4131 2022-08-24 17:04:21 [INFO] [EVAL] Class IoU: [0.6484 0.7517 0.9247 0.6966 0.6566 0.7337 0.7444 0.7251 0.466 0.5988 0.4591 0.5096 0.6577 0.2218 0.2105 0.3522 0.4674 0.3825 0.54 0.3515 0.719 0.4368 0.5346 0.4527 0.3051 0.3543 0.3916 0.381 0.3533 0.2501 0.1873 0.41 0.2628 0.2703 0.38 0.3958 0.3115 0.4822 0.2423 0.2778 0.098 0.0757 0.3016 0.2014 0.3098 0.2325 0.2341 0.388 0.5147 0.4693 0.4497 0.2253 0.1493 0.2176 0.5919 0.4593 0.7757 0.2156 0.4195 0.1847 0.0765 0.1561 0.2929 0.039 0.3546 0.645 0.1599 0.3472 0.0209 0.3078 0.3192 0.3899 0.3681 0.1836 0.4112 0.2724 0.3607 0.1988 0.1225 0.1731 0.626 0.2415 0.2209 0.0579 0.3934 0.4502 0.0326 0.0558 0.2465 0.436 0.3955 0.0149 0.1805 0.0415 0.0553 0.0038 0.1183 0.1124 0.0339 0.2722 0.0522 0.0036 0.1858 0.3202 0.0178 0.4056 0.1468 0.4678 0.0853 0.2372 0.0358 0.2614 0.0639 0.3562 0.6916 0.0004 0.3813 0.4837 0.0204 0.1802 0.4592 0. 0.126 0.0651 0.2308 0.166 0.3964 0.278 0.3413 0.1677 0.5437 0.005 0.0383 0.1869 0.0895 0.078 0.0666 0.0101 0.1122 0.3069 0.2223 0.0117 0.2778 0.2598 0.2936 0. 0.1996 0.0085 0.0686 0.0231] 2022-08-24 17:04:21 [INFO] [EVAL] Class Precision: [0.749 0.8243 0.9634 0.8012 0.7349 0.8433 0.8458 0.7895 0.5836 0.716 0.639 0.6428 0.7495 0.5247 0.4782 0.499 0.6184 0.6893 0.7064 0.5366 0.8064 0.671 0.7106 0.5851 0.5449 0.5059 0.5315 0.7221 0.6978 0.327 0.3832 0.5749 0.4818 0.4118 0.4954 0.5358 0.6209 0.727 0.4099 0.4768 0.2031 0.2222 0.5651 0.515 0.5611 0.4746 0.343 0.6053 0.7565 0.6511 0.6216 0.2759 0.3503 0.6444 0.7147 0.6395 0.8095 0.6314 0.6743 0.4393 0.1569 0.396 0.4205 0.5359 0.4637 0.7835 0.3171 0.5261 0.4308 0.6016 0.5708 0.4416 0.6817 0.227 0.7142 0.456 0.8065 0.4636 0.5356 0.5886 0.7761 0.5479 0.748 0.2337 0.4955 0.6246 0.2002 0.3921 0.3399 0.6796 0.5646 0.0208 0.3694 0.2557 0.1789 0.0132 0.4314 0.5798 0.3913 0.5758 0.8599 0.0081 0.6444 0.6705 0.106 0.526 0.4335 0.8775 0.2669 0.3264 0.2393 0.4133 0.4009 0.6269 0.6954 0.0263 0.7888 0.5402 0.1073 0.7464 0.7051 0. 0.9402 0.6127 0.5924 0.6722 0.6916 0.4218 0.4687 0.502 0.6513 0.6423 0.1988 0.5582 0.5292 0.3639 0.3662 0.255 0.2996 0.6549 0.4037 0.1105 0.4896 0.3902 0.5806 0. 0.8951 0.2738 0.2776 0.779 ] 2022-08-24 17:04:21 [INFO] [EVAL] Class Recall: [0.8284 0.8951 0.9584 0.8422 0.8605 0.8496 0.8613 0.8989 0.6982 0.7853 0.6199 0.7108 0.8431 0.2776 0.2732 0.5449 0.6569 0.4622 0.6963 0.5046 0.8691 0.5559 0.6835 0.6667 0.4095 0.5418 0.5979 0.4465 0.4171 0.5156 0.2682 0.5885 0.3663 0.4404 0.6199 0.6024 0.3847 0.5888 0.3721 0.3995 0.1592 0.103 0.3927 0.2485 0.409 0.313 0.4243 0.5194 0.6169 0.6269 0.6193 0.5513 0.2065 0.2473 0.7751 0.6198 0.9489 0.2466 0.5262 0.2416 0.13 0.2049 0.4911 0.0403 0.6011 0.7849 0.2438 0.5052 0.0215 0.3866 0.42 0.769 0.4445 0.4898 0.4922 0.4035 0.3948 0.2582 0.1371 0.197 0.764 0.3015 0.2386 0.0715 0.6561 0.6172 0.0375 0.0611 0.4727 0.5488 0.569 0.0496 0.2609 0.0472 0.0742 0.0054 0.1401 0.1224 0.0358 0.3404 0.0527 0.0063 0.207 0.38 0.0209 0.6394 0.1817 0.5005 0.1113 0.4647 0.0405 0.4157 0.0706 0.452 0.9922 0.0004 0.4246 0.8224 0.0245 0.1919 0.5683 0. 0.127 0.0679 0.2743 0.1806 0.4816 0.4493 0.5566 0.2011 0.767 0.005 0.0454 0.2194 0.0972 0.0903 0.0752 0.0104 0.152 0.366 0.331 0.0129 0.3911 0.4374 0.3726 0. 0.2044 0.0087 0.0836 0.0233] 2022-08-24 17:04:21 [INFO] [EVAL] The model with the best validation mIoU (0.2946) was saved at iter 51000. 
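Note: the [TRAIN] fields are internally consistent: batch_cost * ips gives the number of samples processed per step (about 8 here), and the printed ETA is close to remaining_iters * batch_cost, i.e. it budgets training steps only. A small check against the iter-57050 record above; these relations are inferred from the logged numbers, not taken from train.py:

    # Values copied from the [TRAIN] record at iter 57050 above.
    total_iters, cur_iter = 160_000, 57_050
    batch_cost = 0.1930   # seconds per training step
    ips = 41.4446         # samples per second

    samples_per_step = batch_cost * ips                  # ~8.0, i.e. a global batch of about 8
    eta_seconds = (total_iters - cur_iter) * batch_cost  # ~19869 s

    hours, rest = divmod(int(eta_seconds), 3600)
    minutes, seconds = divmod(rest, 60)
    print(f"{samples_per_step:.1f} samples/step, ETA {hours:02d}:{minutes:02d}:{seconds:02d}")
    # -> 8.0 samples/step, ETA 05:31:09 (the log prints 05:31:12, likely from a smoothed batch_cost)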
2022-08-24 17:04:32 [INFO] [TRAIN] epoch: 46, iter: 58050/160000, loss: 0.8488, lr: 0.000772, batch_cost: 0.2189, reader_cost: 0.00376, ips: 36.5384 samples/sec | ETA 06:12:01 2022-08-24 17:04:41 [INFO] [TRAIN] epoch: 47, iter: 58100/160000, loss: 0.8482, lr: 0.000771, batch_cost: 0.1849, reader_cost: 0.03128, ips: 43.2714 samples/sec | ETA 05:13:59 2022-08-24 17:04:50 [INFO] [TRAIN] epoch: 47, iter: 58150/160000, loss: 0.8701, lr: 0.000771, batch_cost: 0.1723, reader_cost: 0.00059, ips: 46.4352 samples/sec | ETA 04:52:27 2022-08-24 17:04:58 [INFO] [TRAIN] epoch: 47, iter: 58200/160000, loss: 0.8515, lr: 0.000771, batch_cost: 0.1643, reader_cost: 0.00048, ips: 48.6829 samples/sec | ETA 04:38:48 2022-08-24 17:05:06 [INFO] [TRAIN] epoch: 47, iter: 58250/160000, loss: 0.8166, lr: 0.000770, batch_cost: 0.1602, reader_cost: 0.00065, ips: 49.9446 samples/sec | ETA 04:31:38 2022-08-24 17:05:14 [INFO] [TRAIN] epoch: 47, iter: 58300/160000, loss: 0.7959, lr: 0.000770, batch_cost: 0.1715, reader_cost: 0.00098, ips: 46.6434 samples/sec | ETA 04:50:42 2022-08-24 17:05:22 [INFO] [TRAIN] epoch: 47, iter: 58350/160000, loss: 0.8187, lr: 0.000770, batch_cost: 0.1596, reader_cost: 0.00032, ips: 50.1127 samples/sec | ETA 04:30:27 2022-08-24 17:05:31 [INFO] [TRAIN] epoch: 47, iter: 58400/160000, loss: 0.8070, lr: 0.000769, batch_cost: 0.1704, reader_cost: 0.00050, ips: 46.9462 samples/sec | ETA 04:48:33 2022-08-24 17:05:39 [INFO] [TRAIN] epoch: 47, iter: 58450/160000, loss: 0.8071, lr: 0.000769, batch_cost: 0.1704, reader_cost: 0.00039, ips: 46.9602 samples/sec | ETA 04:48:19 2022-08-24 17:05:48 [INFO] [TRAIN] epoch: 47, iter: 58500/160000, loss: 0.8157, lr: 0.000768, batch_cost: 0.1738, reader_cost: 0.00070, ips: 46.0298 samples/sec | ETA 04:54:00 2022-08-24 17:05:57 [INFO] [TRAIN] epoch: 47, iter: 58550/160000, loss: 0.8364, lr: 0.000768, batch_cost: 0.1735, reader_cost: 0.00051, ips: 46.1018 samples/sec | ETA 04:53:24 2022-08-24 17:06:05 [INFO] [TRAIN] epoch: 47, iter: 58600/160000, loss: 0.7766, lr: 0.000768, batch_cost: 0.1656, reader_cost: 0.00042, ips: 48.3139 samples/sec | ETA 04:39:50 2022-08-24 17:06:13 [INFO] [TRAIN] epoch: 47, iter: 58650/160000, loss: 0.8364, lr: 0.000767, batch_cost: 0.1626, reader_cost: 0.00032, ips: 49.1864 samples/sec | ETA 04:34:44 2022-08-24 17:06:21 [INFO] [TRAIN] epoch: 47, iter: 58700/160000, loss: 0.8669, lr: 0.000767, batch_cost: 0.1584, reader_cost: 0.00663, ips: 50.5030 samples/sec | ETA 04:27:26 2022-08-24 17:06:30 [INFO] [TRAIN] epoch: 47, iter: 58750/160000, loss: 0.7433, lr: 0.000767, batch_cost: 0.1728, reader_cost: 0.00431, ips: 46.2967 samples/sec | ETA 04:51:35 2022-08-24 17:06:39 [INFO] [TRAIN] epoch: 47, iter: 58800/160000, loss: 0.8106, lr: 0.000766, batch_cost: 0.1916, reader_cost: 0.00043, ips: 41.7614 samples/sec | ETA 05:23:06 2022-08-24 17:06:49 [INFO] [TRAIN] epoch: 47, iter: 58850/160000, loss: 0.7903, lr: 0.000766, batch_cost: 0.1908, reader_cost: 0.00087, ips: 41.9269 samples/sec | ETA 05:21:40 2022-08-24 17:06:56 [INFO] [TRAIN] epoch: 47, iter: 58900/160000, loss: 0.9168, lr: 0.000765, batch_cost: 0.1496, reader_cost: 0.00153, ips: 53.4805 samples/sec | ETA 04:12:03 2022-08-24 17:07:04 [INFO] [TRAIN] epoch: 47, iter: 58950/160000, loss: 0.8181, lr: 0.000765, batch_cost: 0.1576, reader_cost: 0.00424, ips: 50.7759 samples/sec | ETA 04:25:20 2022-08-24 17:07:13 [INFO] [TRAIN] epoch: 47, iter: 59000/160000, loss: 0.8253, lr: 0.000765, batch_cost: 0.1656, reader_cost: 0.00058, ips: 48.3212 samples/sec | ETA 04:38:41 2022-08-24 17:07:13 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 229s - batch_cost: 0.2287 - reader cost: 5.9754e-04 2022-08-24 17:11:02 [INFO] [EVAL] #Images: 2000 mIoU: 0.2955 Acc: 0.7377 Kappa: 0.7175 Dice: 0.4182 2022-08-24 17:11:02 [INFO] [EVAL] Class IoU: [0.6494 0.7529 0.9231 0.695 0.6619 0.7279 0.7507 0.7279 0.4783 0.6219 0.4466 0.5021 0.6531 0.2566 0.1978 0.3389 0.4703 0.4112 0.5388 0.3605 0.7232 0.4321 0.522 0.443 0.3143 0.3669 0.3599 0.3678 0.3498 0.2596 0.2036 0.3876 0.2748 0.2666 0.2683 0.3942 0.3204 0.46 0.2338 0.2171 0.0849 0.0728 0.2885 0.1932 0.301 0.2761 0.2381 0.3857 0.5795 0.4103 0.4381 0.2115 0.1574 0.178 0.6142 0.4211 0.8083 0.3165 0.4035 0.1963 0.0967 0.1489 0.2717 0.0973 0.3535 0.6271 0.1813 0.3567 0.0229 0.3382 0.3172 0.3811 0.3557 0.188 0.4117 0.2692 0.3669 0.1787 0.0801 0.2596 0.625 0.2552 0.2987 0.0523 0.3398 0.4649 0.0399 0.0489 0.2734 0.4338 0.3535 0.0139 0.176 0.0532 0.1249 0.0021 0.1412 0.0999 0.048 0.2864 0.0725 0.0084 0.1137 0.6118 0.0106 0.5457 0.142 0.5031 0.07 0.3315 0.0622 0.0875 0.0884 0.5375 0.6322 0.0044 0.3361 0.49 0.0967 0.2301 0.4398 0. 0.1831 0.0463 0.1899 0.2578 0.4006 0.3176 0.3478 0.1472 0.5807 0.004 0.0551 0.1131 0.0711 0.1121 0.0495 0.0094 0.1187 0.3231 0.2422 0.0233 0.2925 0.0606 0.2808 0. 0.2408 0.0167 0.0361 0.0546] 2022-08-24 17:11:02 [INFO] [EVAL] Class Precision: [0.761 0.8206 0.9569 0.7949 0.7621 0.8711 0.8516 0.79 0.6029 0.7186 0.6329 0.6831 0.7451 0.5347 0.505 0.5514 0.5878 0.6601 0.6504 0.5067 0.8127 0.6644 0.6725 0.613 0.4552 0.5919 0.5461 0.7435 0.6415 0.4187 0.3818 0.5476 0.4933 0.3926 0.4752 0.525 0.5704 0.6593 0.4278 0.5243 0.1869 0.2008 0.4719 0.4553 0.4482 0.4734 0.3498 0.5501 0.7131 0.4958 0.591 0.2528 0.3086 0.499 0.6977 0.5499 0.8622 0.5221 0.5953 0.2948 0.1443 0.3287 0.3654 0.6449 0.441 0.7601 0.2919 0.5028 0.1337 0.6628 0.5333 0.6007 0.539 0.2146 0.6252 0.5853 0.6071 0.5553 0.4731 0.5138 0.7168 0.5477 0.6743 0.1908 0.6643 0.6567 0.185 0.3055 0.3635 0.6969 0.4663 0.0181 0.2741 0.2302 0.2372 0.008 0.3304 0.5183 0.531 0.4475 0.6738 0.0109 0.6064 0.6676 0.0989 0.7718 0.3812 0.8331 0.1843 0.4389 0.2291 0.0997 0.4086 0.6172 0.6341 0.1127 0.8567 0.5163 0.1691 0.8207 0.7697 0. 0.7634 0.6838 0.6581 0.6271 0.6909 0.4629 0.4492 0.4379 0.7106 0.4498 0.2713 0.7203 0.5917 0.2587 0.5243 0.2174 0.4231 0.6346 0.3628 0.0395 0.5647 0.4249 0.822 0. 0.8248 0.4146 0.1864 0.5831] 2022-08-24 17:11:02 [INFO] [EVAL] Class Recall: [0.8158 0.9012 0.9632 0.8469 0.8342 0.8157 0.8637 0.9026 0.6983 0.8221 0.6028 0.6546 0.8411 0.3303 0.2453 0.468 0.7019 0.5217 0.7585 0.5554 0.8679 0.5528 0.7001 0.615 0.5039 0.4911 0.5134 0.4212 0.4348 0.4058 0.3037 0.5703 0.3828 0.4537 0.3813 0.6126 0.4223 0.6034 0.3401 0.2704 0.1348 0.1025 0.426 0.2513 0.4781 0.3984 0.4273 0.5635 0.7557 0.7041 0.6287 0.5644 0.243 0.2168 0.837 0.6427 0.9283 0.4456 0.556 0.3703 0.2267 0.2141 0.5147 0.1028 0.6404 0.7818 0.3236 0.5511 0.0269 0.4085 0.4391 0.5104 0.5112 0.6027 0.5466 0.3326 0.4812 0.2086 0.0879 0.3441 0.8299 0.3234 0.349 0.0671 0.4102 0.6141 0.0484 0.0551 0.5245 0.5348 0.5937 0.057 0.3298 0.0647 0.2088 0.0029 0.1977 0.1102 0.0501 0.4431 0.0752 0.0352 0.1228 0.8799 0.0117 0.6507 0.1845 0.5596 0.1014 0.5755 0.0787 0.4161 0.1014 0.8063 0.9952 0.0046 0.3561 0.906 0.1842 0.2422 0.5064 0. 0.1941 0.0473 0.2106 0.3045 0.488 0.5029 0.6064 0.1816 0.7606 0.004 0.0646 0.1183 0.0748 0.1652 0.0518 0.0098 0.1417 0.397 0.4216 0.0537 0.3776 0.066 0.299 0. 
0.2538 0.0171 0.0429 0.0568] 2022-08-24 17:11:02 [INFO] [EVAL] The model with the best validation mIoU (0.2955) was saved at iter 59000. 2022-08-24 17:11:12 [INFO] [TRAIN] epoch: 47, iter: 59050/160000, loss: 0.8335, lr: 0.000764, batch_cost: 0.2099, reader_cost: 0.00286, ips: 38.1147 samples/sec | ETA 05:53:08 2022-08-24 17:11:23 [INFO] [TRAIN] epoch: 47, iter: 59100/160000, loss: 0.8249, lr: 0.000764, batch_cost: 0.2127, reader_cost: 0.00178, ips: 37.6060 samples/sec | ETA 05:57:44 2022-08-24 17:11:32 [INFO] [TRAIN] epoch: 47, iter: 59150/160000, loss: 0.8812, lr: 0.000764, batch_cost: 0.1827, reader_cost: 0.00065, ips: 43.7774 samples/sec | ETA 05:07:09 2022-08-24 17:11:40 [INFO] [TRAIN] epoch: 47, iter: 59200/160000, loss: 0.8857, lr: 0.000763, batch_cost: 0.1515, reader_cost: 0.00031, ips: 52.8034 samples/sec | ETA 04:14:31 2022-08-24 17:11:48 [INFO] [TRAIN] epoch: 47, iter: 59250/160000, loss: 0.8321, lr: 0.000763, batch_cost: 0.1633, reader_cost: 0.00664, ips: 48.9936 samples/sec | ETA 04:34:11 2022-08-24 17:11:56 [INFO] [TRAIN] epoch: 47, iter: 59300/160000, loss: 0.8259, lr: 0.000762, batch_cost: 0.1558, reader_cost: 0.00064, ips: 51.3353 samples/sec | ETA 04:21:32 2022-08-24 17:12:04 [INFO] [TRAIN] epoch: 47, iter: 59350/160000, loss: 0.8093, lr: 0.000762, batch_cost: 0.1689, reader_cost: 0.00045, ips: 47.3540 samples/sec | ETA 04:43:23 2022-08-24 17:12:15 [INFO] [TRAIN] epoch: 48, iter: 59400/160000, loss: 0.8543, lr: 0.000762, batch_cost: 0.2155, reader_cost: 0.02756, ips: 37.1260 samples/sec | ETA 06:01:17 2022-08-24 17:12:24 [INFO] [TRAIN] epoch: 48, iter: 59450/160000, loss: 0.8468, lr: 0.000761, batch_cost: 0.1815, reader_cost: 0.00098, ips: 44.0786 samples/sec | ETA 05:04:09 2022-08-24 17:12:33 [INFO] [TRAIN] epoch: 48, iter: 59500/160000, loss: 0.8243, lr: 0.000761, batch_cost: 0.1844, reader_cost: 0.00076, ips: 43.3837 samples/sec | ETA 05:08:52 2022-08-24 17:12:43 [INFO] [TRAIN] epoch: 48, iter: 59550/160000, loss: 0.8439, lr: 0.000761, batch_cost: 0.1942, reader_cost: 0.00052, ips: 41.1993 samples/sec | ETA 05:25:05 2022-08-24 17:12:53 [INFO] [TRAIN] epoch: 48, iter: 59600/160000, loss: 0.8382, lr: 0.000760, batch_cost: 0.2082, reader_cost: 0.00040, ips: 38.4168 samples/sec | ETA 05:48:27 2022-08-24 17:13:03 [INFO] [TRAIN] epoch: 48, iter: 59650/160000, loss: 0.8709, lr: 0.000760, batch_cost: 0.1923, reader_cost: 0.00054, ips: 41.5983 samples/sec | ETA 05:21:38 2022-08-24 17:13:12 [INFO] [TRAIN] epoch: 48, iter: 59700/160000, loss: 0.8424, lr: 0.000759, batch_cost: 0.1863, reader_cost: 0.00033, ips: 42.9339 samples/sec | ETA 05:11:29 2022-08-24 17:13:22 [INFO] [TRAIN] epoch: 48, iter: 59750/160000, loss: 0.7681, lr: 0.000759, batch_cost: 0.2030, reader_cost: 0.00037, ips: 39.4153 samples/sec | ETA 05:39:07 2022-08-24 17:13:31 [INFO] [TRAIN] epoch: 48, iter: 59800/160000, loss: 0.7748, lr: 0.000759, batch_cost: 0.1714, reader_cost: 0.00580, ips: 46.6789 samples/sec | ETA 04:46:12 2022-08-24 17:13:40 [INFO] [TRAIN] epoch: 48, iter: 59850/160000, loss: 0.8518, lr: 0.000758, batch_cost: 0.1768, reader_cost: 0.00808, ips: 45.2558 samples/sec | ETA 04:55:03 2022-08-24 17:13:48 [INFO] [TRAIN] epoch: 48, iter: 59900/160000, loss: 0.7916, lr: 0.000758, batch_cost: 0.1677, reader_cost: 0.00075, ips: 47.7158 samples/sec | ETA 04:39:42 2022-08-24 17:13:56 [INFO] [TRAIN] epoch: 48, iter: 59950/160000, loss: 0.8126, lr: 0.000757, batch_cost: 0.1588, reader_cost: 0.00054, ips: 50.3726 samples/sec | ETA 04:24:49 2022-08-24 17:14:04 [INFO] [TRAIN] epoch: 48, iter: 60000/160000, loss: 
0.7804, lr: 0.000757, batch_cost: 0.1537, reader_cost: 0.00570, ips: 52.0477 samples/sec | ETA 04:16:10 2022-08-24 17:14:04 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 239s - batch_cost: 0.2388 - reader cost: 7.5254e-04 2022-08-24 17:18:03 [INFO] [EVAL] #Images: 2000 mIoU: 0.3010 Acc: 0.7378 Kappa: 0.7177 Dice: 0.4236 2022-08-24 17:18:03 [INFO] [EVAL] Class IoU: [0.6459 0.7588 0.9206 0.6993 0.6664 0.724 0.7484 0.7407 0.4717 0.6077 0.4474 0.5075 0.6504 0.3016 0.208 0.3541 0.4427 0.4195 0.5295 0.3434 0.7102 0.4007 0.5271 0.449 0.291 0.4215 0.376 0.3864 0.3691 0.244 0.2074 0.4163 0.2517 0.2533 0.4111 0.3638 0.3453 0.4737 0.2169 0.2708 0.1044 0.0853 0.2949 0.2105 0.2791 0.2556 0.1938 0.408 0.5901 0.4265 0.4596 0.2387 0.1554 0.148 0.6451 0.4806 0.8306 0.2741 0.4194 0.1871 0.0994 0.1223 0.2866 0.1054 0.348 0.6092 0.1744 0.3522 0.0763 0.3346 0.3375 0.3677 0.395 0.2259 0.4216 0.2716 0.3409 0.1911 0.1599 0.3126 0.6485 0.2531 0.2425 0.0162 0.4996 0.4682 0.0591 0.0445 0.3199 0.4188 0.372 0.0204 0.1721 0.0813 0.1127 0.0068 0.0759 0.1278 0.1635 0.3173 0.0541 0.0169 0.0851 0.7498 0.004 0.5937 0.0972 0.41 0.0576 0.2365 0.0527 0.1459 0.0738 0.5498 0.7069 0.0035 0.4005 0.513 0.0612 0.0251 0.4668 0. 0.1985 0.1001 0.2212 0.1649 0.3864 0.2976 0.3669 0.2474 0.4686 0. 0.046 0.1511 0.0472 0.0769 0.0829 0.0101 0.1189 0.3399 0.1403 0.0594 0.2204 0.2511 0.3241 0. 0.2516 0.0061 0.0177 0.0349] 2022-08-24 17:18:03 [INFO] [EVAL] Class Precision: [0.7512 0.8504 0.9556 0.7998 0.7644 0.843 0.8614 0.8166 0.6377 0.725 0.6564 0.6496 0.7351 0.4604 0.4802 0.5126 0.6993 0.6571 0.673 0.5344 0.7925 0.6683 0.6885 0.5716 0.484 0.5596 0.5042 0.7081 0.6412 0.4462 0.3836 0.5015 0.4321 0.3944 0.5434 0.5371 0.5381 0.7296 0.5035 0.4938 0.2057 0.1917 0.4873 0.4735 0.3438 0.4743 0.2758 0.6323 0.6309 0.5264 0.6428 0.2645 0.3831 0.5515 0.7013 0.6733 0.9033 0.6332 0.6565 0.2697 0.1422 0.2383 0.412 0.6947 0.4204 0.6937 0.3369 0.5055 0.1897 0.6218 0.5991 0.6297 0.6205 0.3002 0.6768 0.4798 0.5942 0.3622 0.5745 0.6129 0.7902 0.4912 0.7288 0.126 0.6409 0.695 0.2284 0.3831 0.4995 0.6343 0.515 0.0263 0.3458 0.2548 0.2546 0.0264 0.2825 0.4658 0.29 0.5212 0.6749 0.0206 0.4328 0.927 0.0407 0.8072 0.4522 0.8904 0.3563 0.3576 0.2085 0.212 0.444 0.6977 0.7121 0.2919 0.7627 0.5871 0.1122 0.5007 0.7251 0.0022 0.8367 0.6182 0.649 0.6819 0.6444 0.3883 0.4511 0.4258 0.5146 0.0354 0.2434 0.56 0.7406 0.3745 0.3794 0.26 0.337 0.6464 0.1666 0.2011 0.5702 0.5252 0.4978 0. 0.7454 0.3625 0.0907 0.6466] 2022-08-24 17:18:03 [INFO] [EVAL] Class Recall: [0.8217 0.8757 0.9617 0.8476 0.8386 0.8369 0.8508 0.8885 0.6444 0.7898 0.5841 0.6987 0.8496 0.4665 0.2685 0.5339 0.5468 0.537 0.713 0.49 0.8725 0.5001 0.6921 0.6767 0.422 0.6306 0.5966 0.4597 0.4652 0.35 0.3111 0.7101 0.376 0.4147 0.6281 0.53 0.4907 0.5745 0.2758 0.3748 0.1748 0.1332 0.4276 0.2749 0.5974 0.3566 0.3943 0.5349 0.9012 0.6921 0.6172 0.7097 0.2072 0.1682 0.8897 0.6267 0.9116 0.3258 0.5373 0.3792 0.2481 0.2009 0.485 0.1105 0.6689 0.8333 0.2656 0.5374 0.1131 0.42 0.4359 0.4691 0.5208 0.4773 0.5278 0.385 0.4443 0.2879 0.1814 0.3895 0.7834 0.3429 0.2666 0.0183 0.6937 0.5893 0.0738 0.0479 0.4709 0.5522 0.5726 0.0837 0.2552 0.1067 0.1683 0.0091 0.094 0.1498 0.2726 0.4478 0.0555 0.0844 0.0957 0.7968 0.0044 0.6917 0.1102 0.4318 0.0643 0.4111 0.0658 0.3189 0.0814 0.7217 0.9897 0.0036 0.4576 0.8025 0.1187 0.0258 0.5671 0. 0.2065 0.1067 0.2513 0.1786 0.4911 0.5603 0.6628 0.3713 0.8399 0. 0.0537 0.1715 0.048 0.0882 0.0959 0.0104 0.1553 0.4175 0.4704 0.0778 0.2643 0.3248 0.4816 0. 
0.2753 0.0061 0.0215 0.0356] 2022-08-24 17:18:03 [INFO] [EVAL] The model with the best validation mIoU (0.3010) was saved at iter 60000. 2022-08-24 17:18:11 [INFO] [TRAIN] epoch: 48, iter: 60050/160000, loss: 0.8087, lr: 0.000757, batch_cost: 0.1640, reader_cost: 0.00533, ips: 48.7856 samples/sec | ETA 04:33:10 2022-08-24 17:18:19 [INFO] [TRAIN] epoch: 48, iter: 60100/160000, loss: 0.8288, lr: 0.000756, batch_cost: 0.1551, reader_cost: 0.00724, ips: 51.5913 samples/sec | ETA 04:18:10 2022-08-24 17:18:27 [INFO] [TRAIN] epoch: 48, iter: 60150/160000, loss: 0.8432, lr: 0.000756, batch_cost: 0.1657, reader_cost: 0.00120, ips: 48.2665 samples/sec | ETA 04:35:49 2022-08-24 17:18:37 [INFO] [TRAIN] epoch: 48, iter: 60200/160000, loss: 0.8785, lr: 0.000756, batch_cost: 0.1888, reader_cost: 0.00060, ips: 42.3655 samples/sec | ETA 05:14:05 2022-08-24 17:18:44 [INFO] [TRAIN] epoch: 48, iter: 60250/160000, loss: 0.8101, lr: 0.000755, batch_cost: 0.1497, reader_cost: 0.00043, ips: 53.4320 samples/sec | ETA 04:08:54 2022-08-24 17:18:54 [INFO] [TRAIN] epoch: 48, iter: 60300/160000, loss: 0.8235, lr: 0.000755, batch_cost: 0.1960, reader_cost: 0.00030, ips: 40.8063 samples/sec | ETA 05:25:45 2022-08-24 17:19:03 [INFO] [TRAIN] epoch: 48, iter: 60350/160000, loss: 0.8381, lr: 0.000754, batch_cost: 0.1799, reader_cost: 0.00057, ips: 44.4633 samples/sec | ETA 04:58:49 2022-08-24 17:19:13 [INFO] [TRAIN] epoch: 48, iter: 60400/160000, loss: 0.8748, lr: 0.000754, batch_cost: 0.1892, reader_cost: 0.00253, ips: 42.2771 samples/sec | ETA 05:14:07 2022-08-24 17:19:22 [INFO] [TRAIN] epoch: 48, iter: 60450/160000, loss: 0.9188, lr: 0.000754, batch_cost: 0.1800, reader_cost: 0.00067, ips: 44.4498 samples/sec | ETA 04:58:36 2022-08-24 17:19:30 [INFO] [TRAIN] epoch: 48, iter: 60500/160000, loss: 0.8565, lr: 0.000753, batch_cost: 0.1726, reader_cost: 0.00052, ips: 46.3515 samples/sec | ETA 04:46:13 2022-08-24 17:19:38 [INFO] [TRAIN] epoch: 48, iter: 60550/160000, loss: 0.8917, lr: 0.000753, batch_cost: 0.1618, reader_cost: 0.00471, ips: 49.4517 samples/sec | ETA 04:28:08 2022-08-24 17:19:48 [INFO] [TRAIN] epoch: 48, iter: 60600/160000, loss: 0.8283, lr: 0.000753, batch_cost: 0.1867, reader_cost: 0.00036, ips: 42.8424 samples/sec | ETA 05:09:21 2022-08-24 17:19:57 [INFO] [TRAIN] epoch: 49, iter: 60650/160000, loss: 0.8631, lr: 0.000752, batch_cost: 0.1885, reader_cost: 0.02420, ips: 42.4407 samples/sec | ETA 05:12:07 2022-08-24 17:20:05 [INFO] [TRAIN] epoch: 49, iter: 60700/160000, loss: 0.8010, lr: 0.000752, batch_cost: 0.1573, reader_cost: 0.00053, ips: 50.8576 samples/sec | ETA 04:20:20 2022-08-24 17:20:13 [INFO] [TRAIN] epoch: 49, iter: 60750/160000, loss: 0.8172, lr: 0.000751, batch_cost: 0.1642, reader_cost: 0.00030, ips: 48.7180 samples/sec | ETA 04:31:37 2022-08-24 17:20:21 [INFO] [TRAIN] epoch: 49, iter: 60800/160000, loss: 0.8796, lr: 0.000751, batch_cost: 0.1560, reader_cost: 0.00102, ips: 51.2715 samples/sec | ETA 04:17:58 2022-08-24 17:20:29 [INFO] [TRAIN] epoch: 49, iter: 60850/160000, loss: 0.8546, lr: 0.000751, batch_cost: 0.1568, reader_cost: 0.00221, ips: 51.0143 samples/sec | ETA 04:19:08 2022-08-24 17:20:37 [INFO] [TRAIN] epoch: 49, iter: 60900/160000, loss: 0.7789, lr: 0.000750, batch_cost: 0.1572, reader_cost: 0.00058, ips: 50.8983 samples/sec | ETA 04:19:36 2022-08-24 17:20:46 [INFO] [TRAIN] epoch: 49, iter: 60950/160000, loss: 0.8327, lr: 0.000750, batch_cost: 0.1895, reader_cost: 0.00642, ips: 42.2214 samples/sec | ETA 05:12:47 2022-08-24 17:20:54 [INFO] [TRAIN] epoch: 49, iter: 61000/160000, loss: 
0.7945, lr: 0.000750, batch_cost: 0.1545, reader_cost: 0.00249, ips: 51.7917 samples/sec | ETA 04:14:52 2022-08-24 17:20:54 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 232s - batch_cost: 0.2322 - reader cost: 0.0011 2022-08-24 17:24:46 [INFO] [EVAL] #Images: 2000 mIoU: 0.3002 Acc: 0.7397 Kappa: 0.7193 Dice: 0.4225 2022-08-24 17:24:46 [INFO] [EVAL] Class IoU: [0.6421 0.7582 0.9233 0.7 0.6631 0.7254 0.7598 0.7294 0.4753 0.6299 0.4446 0.5209 0.6523 0.3038 0.209 0.3555 0.4564 0.3965 0.5227 0.3405 0.697 0.4178 0.529 0.4342 0.3321 0.3947 0.4007 0.3938 0.3625 0.2925 0.1895 0.4038 0.2394 0.2605 0.3072 0.411 0.3436 0.4237 0.2531 0.2642 0.0857 0.0507 0.2922 0.1913 0.2834 0.2832 0.2135 0.371 0.6241 0.4618 0.4511 0.1997 0.1986 0.2066 0.6279 0.4743 0.8358 0.2983 0.3926 0.2245 0.0981 0.1592 0.3011 0.0867 0.3435 0.6409 0.2005 0.3869 0.082 0.3202 0.3296 0.3873 0.3716 0.2124 0.4182 0.2586 0.3353 0.1728 0.1201 0.3328 0.6442 0.2526 0.1736 0.0142 0.3782 0.4646 0.0481 0.0679 0.3244 0.4043 0.389 0.0138 0.1837 0.0364 0.09 0.0043 0.0825 0.1088 0.089 0.2659 0.0534 0.0048 0.0523 0.6738 0.0098 0.5948 0.095 0.4823 0.0627 0.3186 0.0283 0.3094 0.0622 0.5442 0.7013 0.0056 0.363 0.5112 0.0443 0.1409 0.4629 0. 0.2275 0.0839 0.1941 0.1244 0.4119 0.314 0.3627 0.0736 0.5436 0. 0.0946 0.1853 0.1233 0.0836 0.0527 0.0081 0.1261 0.2767 0.2451 0.0319 0.2292 0.2166 0.291 0. 0.2165 0.0134 0.0265 0.0372] 2022-08-24 17:24:46 [INFO] [EVAL] Class Precision: [0.736 0.8431 0.9546 0.81 0.7579 0.8526 0.8655 0.7825 0.6409 0.743 0.6508 0.6869 0.7355 0.4735 0.4755 0.5461 0.6579 0.6714 0.7098 0.5768 0.7693 0.6645 0.6705 0.5577 0.4391 0.6223 0.5273 0.7094 0.6934 0.4709 0.3858 0.5203 0.4763 0.3692 0.4542 0.5523 0.5527 0.7344 0.5245 0.5423 0.1932 0.2292 0.5353 0.4621 0.3631 0.5123 0.3186 0.5424 0.6819 0.5838 0.602 0.2259 0.3714 0.5419 0.7069 0.6561 0.9084 0.582 0.6252 0.459 0.1407 0.4231 0.4634 0.6016 0.403 0.7786 0.4052 0.5529 0.1689 0.6216 0.5456 0.5013 0.6739 0.2879 0.6152 0.4573 0.4253 0.694 0.6258 0.4463 0.7991 0.5583 0.7719 0.0556 0.6729 0.6729 0.3296 0.2892 0.5488 0.5488 0.5439 0.0182 0.3086 0.2556 0.205 0.0135 0.3041 0.7119 0.2716 0.4992 0.6681 0.007 0.478 0.8109 0.1758 0.6848 0.4406 0.9072 0.2589 0.4583 0.3712 0.56 0.4301 0.6756 0.7062 0.1818 0.8082 0.5819 0.0771 0.5945 0.7342 0. 0.7258 0.6982 0.6311 0.6594 0.7177 0.4421 0.6437 0.551 0.6255 0. 0.3271 0.6497 0.5882 0.2727 0.5016 0.0602 0.3799 0.7368 0.357 0.058 0.6829 0.5591 0.6667 0. 0.8646 0.529 0.1075 0.4397] 2022-08-24 17:24:46 [INFO] [EVAL] Class Recall: [0.8342 0.8828 0.9657 0.8374 0.8412 0.8295 0.8616 0.9149 0.6478 0.8054 0.584 0.6831 0.8522 0.4589 0.2716 0.5046 0.5985 0.4919 0.6647 0.4538 0.8813 0.5295 0.7149 0.6623 0.5767 0.519 0.6253 0.4695 0.4316 0.4357 0.2713 0.6435 0.3249 0.4696 0.4869 0.6164 0.4759 0.5004 0.3284 0.34 0.1334 0.0611 0.3915 0.246 0.5637 0.3877 0.3927 0.54 0.8804 0.6885 0.6428 0.6326 0.2991 0.2503 0.8489 0.6313 0.9127 0.3796 0.5135 0.3053 0.2446 0.2034 0.4624 0.0919 0.6996 0.7837 0.2842 0.563 0.1375 0.3978 0.4543 0.6301 0.4531 0.4475 0.5663 0.3731 0.6129 0.1871 0.1294 0.5668 0.7687 0.3157 0.183 0.0187 0.4633 0.6001 0.0533 0.0815 0.4423 0.6057 0.5772 0.0537 0.3122 0.0407 0.1382 0.0063 0.1018 0.1138 0.1168 0.3627 0.0548 0.0147 0.0555 0.7994 0.0102 0.819 0.108 0.5073 0.0764 0.5111 0.0297 0.4087 0.0677 0.7367 0.9902 0.0058 0.3973 0.8079 0.0945 0.1559 0.5561 0. 0.2489 0.0871 0.2189 0.1329 0.4915 0.52 0.4538 0.0783 0.8059 0. 0.1175 0.2059 0.135 0.1076 0.0556 0.0092 0.1588 0.307 0.4387 0.0663 0.2565 0.2612 0.3406 0. 
0.2241 0.0136 0.0339 0.039 ] 2022-08-24 17:24:46 [INFO] [EVAL] The model with the best validation mIoU (0.3010) was saved at iter 60000. 2022-08-24 17:24:55 [INFO] [TRAIN] epoch: 49, iter: 61050/160000, loss: 0.7870, lr: 0.000749, batch_cost: 0.1728, reader_cost: 0.00351, ips: 46.3075 samples/sec | ETA 04:44:54 2022-08-24 17:25:03 [INFO] [TRAIN] epoch: 49, iter: 61100/160000, loss: 0.8076, lr: 0.000749, batch_cost: 0.1646, reader_cost: 0.00067, ips: 48.5892 samples/sec | ETA 04:31:23 2022-08-24 17:25:13 [INFO] [TRAIN] epoch: 49, iter: 61150/160000, loss: 0.8388, lr: 0.000748, batch_cost: 0.1962, reader_cost: 0.00093, ips: 40.7678 samples/sec | ETA 05:23:17 2022-08-24 17:25:22 [INFO] [TRAIN] epoch: 49, iter: 61200/160000, loss: 0.8229, lr: 0.000748, batch_cost: 0.1843, reader_cost: 0.00032, ips: 43.4192 samples/sec | ETA 05:03:23 2022-08-24 17:25:31 [INFO] [TRAIN] epoch: 49, iter: 61250/160000, loss: 0.8491, lr: 0.000748, batch_cost: 0.1814, reader_cost: 0.00038, ips: 44.0916 samples/sec | ETA 04:58:37 2022-08-24 17:25:40 [INFO] [TRAIN] epoch: 49, iter: 61300/160000, loss: 0.7903, lr: 0.000747, batch_cost: 0.1738, reader_cost: 0.00065, ips: 46.0168 samples/sec | ETA 04:45:58 2022-08-24 17:25:49 [INFO] [TRAIN] epoch: 49, iter: 61350/160000, loss: 0.7901, lr: 0.000747, batch_cost: 0.1706, reader_cost: 0.00044, ips: 46.8843 samples/sec | ETA 04:40:32 2022-08-24 17:25:58 [INFO] [TRAIN] epoch: 49, iter: 61400/160000, loss: 0.8025, lr: 0.000747, batch_cost: 0.1876, reader_cost: 0.00049, ips: 42.6486 samples/sec | ETA 05:08:15 2022-08-24 17:26:09 [INFO] [TRAIN] epoch: 49, iter: 61450/160000, loss: 0.8352, lr: 0.000746, batch_cost: 0.2111, reader_cost: 0.00070, ips: 37.8920 samples/sec | ETA 05:46:46 2022-08-24 17:26:17 [INFO] [TRAIN] epoch: 49, iter: 61500/160000, loss: 0.7966, lr: 0.000746, batch_cost: 0.1608, reader_cost: 0.00058, ips: 49.7562 samples/sec | ETA 04:23:57 2022-08-24 17:26:24 [INFO] [TRAIN] epoch: 49, iter: 61550/160000, loss: 0.8198, lr: 0.000745, batch_cost: 0.1462, reader_cost: 0.00089, ips: 54.7210 samples/sec | ETA 03:59:53 2022-08-24 17:26:33 [INFO] [TRAIN] epoch: 49, iter: 61600/160000, loss: 0.7826, lr: 0.000745, batch_cost: 0.1837, reader_cost: 0.00238, ips: 43.5602 samples/sec | ETA 05:01:11 2022-08-24 17:26:41 [INFO] [TRAIN] epoch: 49, iter: 61650/160000, loss: 0.8319, lr: 0.000745, batch_cost: 0.1597, reader_cost: 0.00032, ips: 50.0912 samples/sec | ETA 04:21:47 2022-08-24 17:26:51 [INFO] [TRAIN] epoch: 49, iter: 61700/160000, loss: 0.7946, lr: 0.000744, batch_cost: 0.1963, reader_cost: 0.00056, ips: 40.7564 samples/sec | ETA 05:21:35 2022-08-24 17:27:00 [INFO] [TRAIN] epoch: 49, iter: 61750/160000, loss: 0.7979, lr: 0.000744, batch_cost: 0.1797, reader_cost: 0.00091, ips: 44.5145 samples/sec | ETA 04:54:17 2022-08-24 17:27:10 [INFO] [TRAIN] epoch: 49, iter: 61800/160000, loss: 0.8156, lr: 0.000743, batch_cost: 0.1999, reader_cost: 0.00064, ips: 40.0207 samples/sec | ETA 05:27:09 2022-08-24 17:27:19 [INFO] [TRAIN] epoch: 49, iter: 61850/160000, loss: 0.8187, lr: 0.000743, batch_cost: 0.1785, reader_cost: 0.00049, ips: 44.8193 samples/sec | ETA 04:51:59 2022-08-24 17:27:30 [INFO] [TRAIN] epoch: 50, iter: 61900/160000, loss: 0.8222, lr: 0.000743, batch_cost: 0.2238, reader_cost: 0.02700, ips: 35.7442 samples/sec | ETA 06:05:56 2022-08-24 17:27:39 [INFO] [TRAIN] epoch: 50, iter: 61950/160000, loss: 0.7912, lr: 0.000742, batch_cost: 0.1868, reader_cost: 0.00061, ips: 42.8298 samples/sec | ETA 05:05:14 2022-08-24 17:27:48 [INFO] [TRAIN] epoch: 50, iter: 62000/160000, loss: 
0.8591, lr: 0.000742, batch_cost: 0.1715, reader_cost: 0.00082, ips: 46.6359 samples/sec | ETA 04:40:11 2022-08-24 17:27:48 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 245s - batch_cost: 0.2451 - reader cost: 8.1244e-04 2022-08-24 17:31:53 [INFO] [EVAL] #Images: 2000 mIoU: 0.2943 Acc: 0.7389 Kappa: 0.7183 Dice: 0.4175 2022-08-24 17:31:53 [INFO] [EVAL] Class IoU: [0.6471 0.7531 0.9245 0.6988 0.6601 0.7277 0.7547 0.728 0.4768 0.618 0.434 0.5221 0.6471 0.3036 0.18 0.3498 0.4866 0.38 0.5348 0.3525 0.6945 0.4267 0.5444 0.4414 0.3167 0.4166 0.3697 0.3745 0.3606 0.2743 0.1874 0.4108 0.2099 0.2909 0.2644 0.4062 0.3169 0.4836 0.2457 0.2365 0.0803 0.067 0.2999 0.1986 0.2743 0.3069 0.1974 0.3978 0.5633 0.4781 0.424 0.1692 0.1648 0.1984 0.6309 0.4433 0.8355 0.2624 0.3935 0.2407 0.072 0.1579 0.2777 0.1196 0.3645 0.6386 0.1854 0.3673 0.0365 0.3114 0.3258 0.3669 0.3698 0.2152 0.4209 0.2682 0.2941 0.2014 0.1096 0.411 0.6364 0.2376 0.2484 0.0146 0.3827 0.4591 0.0496 0.0536 0.2666 0.3823 0.4103 0.0086 0.2025 0.0538 0.0546 0.0014 0.13 0.1197 0.118 0.2877 0.09 0.0232 0.0584 0.3188 0.0028 0.5815 0.1219 0.4103 0.0647 0.2761 0.0285 0.2093 0.0846 0.5748 0.5455 0.0045 0.3552 0.4999 0.0857 0.1563 0.4132 0.0008 0.224 0.0739 0.2063 0.1732 0.4098 0.3023 0.3302 0.1389 0.4853 0.0025 0.0669 0.1592 0.0886 0.0851 0.0815 0.0146 0.1225 0.3285 0.2133 0.0525 0.2464 0.128 0.3058 0. 0.1928 0.015 0.0559 0.0239] 2022-08-24 17:31:53 [INFO] [EVAL] Class Precision: [0.738 0.8332 0.9612 0.8082 0.7484 0.853 0.882 0.7868 0.601 0.7505 0.6967 0.6591 0.7295 0.4595 0.4911 0.5305 0.6193 0.6905 0.7002 0.5446 0.7539 0.6387 0.7587 0.571 0.46 0.6391 0.4946 0.7676 0.6428 0.4033 0.3721 0.5818 0.4631 0.438 0.5616 0.5407 0.6285 0.7172 0.4343 0.5171 0.152 0.2388 0.5682 0.5415 0.3567 0.4788 0.2929 0.6924 0.6908 0.6097 0.5905 0.1849 0.3493 0.4753 0.7172 0.5738 0.879 0.5923 0.6699 0.4204 0.1708 0.5602 0.4205 0.6978 0.4511 0.7775 0.3388 0.5331 0.1178 0.7036 0.5323 0.5744 0.6391 0.3323 0.5857 0.4864 0.6773 0.5735 0.476 0.5447 0.7342 0.4833 0.6878 0.0625 0.6485 0.6919 0.1412 0.4351 0.3778 0.5002 0.576 0.0122 0.3359 0.2574 0.2285 0.0059 0.3471 0.7292 0.6365 0.5768 0.603 0.0361 0.651 0.5508 0.1128 0.7008 0.4457 0.8397 0.246 0.4398 0.2854 0.2972 0.3821 0.6694 0.5471 0.1178 0.8178 0.5218 0.1679 0.5642 0.798 0.1409 0.8251 0.7563 0.6171 0.5452 0.7369 0.3915 0.5377 0.7118 0.5558 0.6086 0.4514 0.6794 0.6195 0.3894 0.3146 0.1386 0.3233 0.6564 0.4355 0.0763 0.5943 0.6763 0.4655 0. 
0.8763 0.2014 0.3791 0.6127] 2022-08-24 17:31:53 [INFO] [EVAL] Class Recall: [0.8402 0.8868 0.9603 0.8376 0.8483 0.832 0.8394 0.9069 0.6977 0.7777 0.5352 0.7152 0.8514 0.4721 0.2212 0.5067 0.6944 0.458 0.6937 0.4998 0.898 0.5624 0.6584 0.6605 0.5041 0.5447 0.5941 0.4224 0.4509 0.4617 0.2741 0.583 0.2774 0.464 0.3332 0.6201 0.3899 0.5975 0.3613 0.3035 0.1454 0.0852 0.3884 0.2388 0.543 0.4609 0.3773 0.4831 0.7532 0.689 0.6005 0.6664 0.2378 0.2541 0.8397 0.6609 0.944 0.3202 0.4882 0.3602 0.1107 0.1802 0.4498 0.1261 0.6551 0.7815 0.2906 0.5415 0.0502 0.3585 0.4565 0.5039 0.4674 0.3791 0.5993 0.3741 0.342 0.2369 0.1246 0.626 0.8269 0.3186 0.2799 0.0187 0.4829 0.5771 0.071 0.0576 0.4753 0.6186 0.5878 0.0277 0.3376 0.0636 0.0669 0.0018 0.1721 0.1253 0.1265 0.3646 0.0957 0.0612 0.0603 0.4308 0.0029 0.7735 0.1436 0.4451 0.0807 0.426 0.0307 0.4146 0.098 0.8025 0.9946 0.0047 0.3857 0.9225 0.1489 0.1777 0.4614 0.0008 0.2352 0.0757 0.2366 0.2025 0.4801 0.5704 0.461 0.1472 0.7928 0.0025 0.0729 0.1721 0.0937 0.0982 0.0992 0.016 0.1647 0.3968 0.2948 0.1442 0.2962 0.1364 0.4714 0. 0.1982 0.0159 0.0615 0.0243] 2022-08-24 17:31:53 [INFO] [EVAL] The model with the best validation mIoU (0.3010) was saved at iter 60000. 2022-08-24 17:32:03 [INFO] [TRAIN] epoch: 50, iter: 62050/160000, loss: 0.8308, lr: 0.000742, batch_cost: 0.1884, reader_cost: 0.00331, ips: 42.4525 samples/sec | ETA 05:07:38 2022-08-24 17:32:13 [INFO] [TRAIN] epoch: 50, iter: 62100/160000, loss: 0.8037, lr: 0.000741, batch_cost: 0.1981, reader_cost: 0.00217, ips: 40.3777 samples/sec | ETA 05:23:16 2022-08-24 17:32:21 [INFO] [TRAIN] epoch: 50, iter: 62150/160000, loss: 0.8406, lr: 0.000741, batch_cost: 0.1644, reader_cost: 0.00304, ips: 48.6492 samples/sec | ETA 04:28:10 2022-08-24 17:32:29 [INFO] [TRAIN] epoch: 50, iter: 62200/160000, loss: 0.8146, lr: 0.000740, batch_cost: 0.1675, reader_cost: 0.00630, ips: 47.7481 samples/sec | ETA 04:33:06 2022-08-24 17:32:38 [INFO] [TRAIN] epoch: 50, iter: 62250/160000, loss: 0.8199, lr: 0.000740, batch_cost: 0.1653, reader_cost: 0.01323, ips: 48.3877 samples/sec | ETA 04:29:21 2022-08-24 17:32:46 [INFO] [TRAIN] epoch: 50, iter: 62300/160000, loss: 0.7681, lr: 0.000740, batch_cost: 0.1698, reader_cost: 0.00075, ips: 47.1182 samples/sec | ETA 04:36:28 2022-08-24 17:32:56 [INFO] [TRAIN] epoch: 50, iter: 62350/160000, loss: 0.7844, lr: 0.000739, batch_cost: 0.1923, reader_cost: 0.00051, ips: 41.6045 samples/sec | ETA 05:12:56 2022-08-24 17:33:05 [INFO] [TRAIN] epoch: 50, iter: 62400/160000, loss: 0.8299, lr: 0.000739, batch_cost: 0.1812, reader_cost: 0.00035, ips: 44.1415 samples/sec | ETA 04:54:48 2022-08-24 17:33:13 [INFO] [TRAIN] epoch: 50, iter: 62450/160000, loss: 0.8226, lr: 0.000739, batch_cost: 0.1717, reader_cost: 0.00044, ips: 46.5969 samples/sec | ETA 04:39:07 2022-08-24 17:33:23 [INFO] [TRAIN] epoch: 50, iter: 62500/160000, loss: 0.8353, lr: 0.000738, batch_cost: 0.1874, reader_cost: 0.00105, ips: 42.6947 samples/sec | ETA 05:04:29 2022-08-24 17:33:32 [INFO] [TRAIN] epoch: 50, iter: 62550/160000, loss: 0.8195, lr: 0.000738, batch_cost: 0.1930, reader_cost: 0.00040, ips: 41.4415 samples/sec | ETA 05:13:32 2022-08-24 17:33:41 [INFO] [TRAIN] epoch: 50, iter: 62600/160000, loss: 0.8004, lr: 0.000737, batch_cost: 0.1687, reader_cost: 0.00034, ips: 47.4188 samples/sec | ETA 04:33:52 2022-08-24 17:33:49 [INFO] [TRAIN] epoch: 50, iter: 62650/160000, loss: 0.7646, lr: 0.000737, batch_cost: 0.1717, reader_cost: 0.00041, ips: 46.6036 samples/sec | ETA 04:38:31 2022-08-24 17:33:58 [INFO] [TRAIN] epoch: 50, 
iter: 62700/160000, loss: 0.8002, lr: 0.000737, batch_cost: 0.1666, reader_cost: 0.01093, ips: 48.0205 samples/sec | ETA 04:30:09 2022-08-24 17:34:06 [INFO] [TRAIN] epoch: 50, iter: 62750/160000, loss: 0.8000, lr: 0.000736, batch_cost: 0.1597, reader_cost: 0.00857, ips: 50.1073 samples/sec | ETA 04:18:46 2022-08-24 17:34:14 [INFO] [TRAIN] epoch: 50, iter: 62800/160000, loss: 0.7924, lr: 0.000736, batch_cost: 0.1695, reader_cost: 0.00034, ips: 47.1945 samples/sec | ETA 04:34:36 2022-08-24 17:34:23 [INFO] [TRAIN] epoch: 50, iter: 62850/160000, loss: 0.7988, lr: 0.000736, batch_cost: 0.1686, reader_cost: 0.00030, ips: 47.4603 samples/sec | ETA 04:32:55 2022-08-24 17:34:34 [INFO] [TRAIN] epoch: 50, iter: 62900/160000, loss: 0.7937, lr: 0.000735, batch_cost: 0.2251, reader_cost: 0.00033, ips: 35.5391 samples/sec | ETA 06:04:17 2022-08-24 17:34:43 [INFO] [TRAIN] epoch: 50, iter: 62950/160000, loss: 0.8729, lr: 0.000735, batch_cost: 0.1909, reader_cost: 0.00049, ips: 41.9169 samples/sec | ETA 05:08:42 2022-08-24 17:34:52 [INFO] [TRAIN] epoch: 50, iter: 63000/160000, loss: 0.8228, lr: 0.000734, batch_cost: 0.1652, reader_cost: 0.00046, ips: 48.4131 samples/sec | ETA 04:27:08 2022-08-24 17:34:52 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 228s - batch_cost: 0.2275 - reader cost: 9.9348e-04 2022-08-24 17:38:39 [INFO] [EVAL] #Images: 2000 mIoU: 0.2983 Acc: 0.7381 Kappa: 0.7175 Dice: 0.4218 2022-08-24 17:38:39 [INFO] [EVAL] Class IoU: [0.645 0.7456 0.9253 0.6986 0.6479 0.7296 0.7569 0.7414 0.4741 0.6093 0.4384 0.4992 0.6544 0.2644 0.1955 0.3532 0.4743 0.3553 0.5173 0.3479 0.7215 0.4591 0.5325 0.4283 0.3084 0.3731 0.4025 0.3891 0.3604 0.2736 0.2325 0.4104 0.2401 0.257 0.2459 0.3588 0.3436 0.4614 0.2701 0.2766 0.0816 0.0719 0.2884 0.201 0.2759 0.2858 0.2303 0.3886 0.5813 0.4347 0.4252 0.2553 0.1636 0.1928 0.5952 0.4438 0.8217 0.2545 0.3755 0.2206 0.0812 0.1309 0.3081 0.1126 0.3882 0.5986 0.2168 0.3615 0.0335 0.3439 0.3314 0.4268 0.368 0.1984 0.3872 0.2817 0.2938 0.2173 0.1079 0.3383 0.631 0.257 0.2218 0.0125 0.3046 0.4646 0.059 0.04 0.3027 0.3954 0.4284 0. 0.0557 0.0449 0.0345 0.0058 0.1175 0.1024 0.1966 0.2529 0.1106 0.0031 0.2353 0.5322 0. 0.6434 0.1282 0.3651 0.0671 0.2807 0.0338 0.3297 0.0654 0.5219 0.7207 0.0008 0.3371 0.5172 0.0204 0.0219 0.4497 0.0044 0.2535 0.139 0.2124 0.1289 0.3848 0.2484 0.3357 0.1862 0.552 0.0075 0.1171 0.2012 0.0589 0.1024 0.0898 0.0081 0.1107 0.2055 0.2064 0.0906 0.2959 0.3593 0.276 0.0004 0.238 0.0078 0.0478 0.0255] 2022-08-24 17:38:39 [INFO] [EVAL] Class Precision: [0.7465 0.8072 0.9609 0.8023 0.7411 0.8427 0.8418 0.8182 0.5952 0.7156 0.6488 0.6887 0.7564 0.4815 0.4608 0.5362 0.5706 0.6655 0.7651 0.5431 0.8307 0.656 0.6857 0.6328 0.5057 0.6565 0.5764 0.6927 0.7083 0.4786 0.3603 0.5063 0.5584 0.4094 0.6046 0.5159 0.5996 0.6688 0.5236 0.5148 0.19 0.204 0.5177 0.5791 0.362 0.4487 0.38 0.7264 0.685 0.519 0.5364 0.3091 0.3883 0.6111 0.6942 0.5814 0.8747 0.624 0.4904 0.3884 0.1329 0.275 0.475 0.6538 0.518 0.675 0.377 0.5286 0.1044 0.6366 0.5308 0.5278 0.5957 0.2486 0.5147 0.4345 0.5393 0.4874 0.3635 0.5464 0.7426 0.6324 0.7408 0.0397 0.6466 0.6608 0.303 0.5036 0.5174 0.5623 0.6228 0. 0.2157 0.2658 0.0975 0.016 0.5199 0.4656 0.3781 0.5339 0.7139 0.0054 0.5851 0.7983 0. 
0.7435 0.5234 0.9032 0.187 0.3455 0.1775 0.6147 0.3824 0.6711 0.7269 0.0299 0.7502 0.5748 0.0636 0.3788 0.7801 0.3722 0.551 0.6864 0.5951 0.6592 0.6898 0.3406 0.5725 0.5034 0.659 0.4669 0.3938 0.5215 0.5832 0.3542 0.2403 0.2741 0.4305 0.7395 0.5879 0.1837 0.7046 0.5482 0.4015 0.0005 0.8426 0.4918 0.2777 0.5661] 2022-08-24 17:38:39 [INFO] [EVAL] Class Recall: [0.8259 0.9072 0.9615 0.8438 0.8375 0.8445 0.8825 0.8877 0.6997 0.804 0.5748 0.6447 0.8292 0.3696 0.2534 0.5086 0.7375 0.4326 0.6149 0.4918 0.8458 0.6047 0.7044 0.57 0.4415 0.4635 0.5716 0.4703 0.4232 0.3897 0.396 0.6841 0.2963 0.4084 0.293 0.5408 0.446 0.5981 0.3582 0.3742 0.1251 0.0999 0.3944 0.2354 0.5372 0.4405 0.3688 0.4553 0.7934 0.728 0.6722 0.5946 0.2203 0.2197 0.8066 0.6522 0.9313 0.3006 0.6159 0.3381 0.1727 0.2 0.4672 0.1198 0.6078 0.841 0.3378 0.5335 0.047 0.428 0.4686 0.6904 0.4905 0.4953 0.6099 0.4447 0.3922 0.2817 0.133 0.4705 0.8076 0.3021 0.2405 0.0178 0.3654 0.6101 0.0683 0.0416 0.4219 0.5713 0.5784 0. 0.0698 0.0512 0.0508 0.0091 0.1317 0.116 0.2906 0.3246 0.1157 0.0072 0.2824 0.6149 0. 0.827 0.1451 0.38 0.0948 0.5997 0.0401 0.4156 0.0732 0.7012 0.9883 0.0008 0.3797 0.8376 0.0292 0.0227 0.515 0.0044 0.3196 0.1485 0.2482 0.1381 0.4654 0.4786 0.448 0.2281 0.7726 0.0075 0.1429 0.2468 0.0615 0.1259 0.1255 0.0083 0.1296 0.2216 0.2413 0.1516 0.3378 0.5103 0.4687 0.0041 0.2491 0.0079 0.0545 0.026 ] 2022-08-24 17:38:39 [INFO] [EVAL] The model with the best validation mIoU (0.3010) was saved at iter 60000. 2022-08-24 17:38:50 [INFO] [TRAIN] epoch: 50, iter: 63050/160000, loss: 0.8009, lr: 0.000734, batch_cost: 0.2114, reader_cost: 0.00383, ips: 37.8442 samples/sec | ETA 05:41:34 2022-08-24 17:38:59 [INFO] [TRAIN] epoch: 50, iter: 63100/160000, loss: 0.8098, lr: 0.000734, batch_cost: 0.1784, reader_cost: 0.00087, ips: 44.8461 samples/sec | ETA 04:48:05 2022-08-24 17:39:09 [INFO] [TRAIN] epoch: 50, iter: 63150/160000, loss: 0.8284, lr: 0.000733, batch_cost: 0.1972, reader_cost: 0.00069, ips: 40.5653 samples/sec | ETA 05:18:20 2022-08-24 17:39:21 [INFO] [TRAIN] epoch: 51, iter: 63200/160000, loss: 0.7433, lr: 0.000733, batch_cost: 0.2333, reader_cost: 0.02745, ips: 34.2949 samples/sec | ETA 06:16:20 2022-08-24 17:39:30 [INFO] [TRAIN] epoch: 51, iter: 63250/160000, loss: 0.7917, lr: 0.000732, batch_cost: 0.1895, reader_cost: 0.00053, ips: 42.2227 samples/sec | ETA 05:05:31 2022-08-24 17:39:38 [INFO] [TRAIN] epoch: 51, iter: 63300/160000, loss: 0.8506, lr: 0.000732, batch_cost: 0.1699, reader_cost: 0.00058, ips: 47.0822 samples/sec | ETA 04:33:50 2022-08-24 17:39:49 [INFO] [TRAIN] epoch: 51, iter: 63350/160000, loss: 0.8616, lr: 0.000732, batch_cost: 0.2105, reader_cost: 0.00048, ips: 38.0081 samples/sec | ETA 05:39:03 2022-08-24 17:40:00 [INFO] [TRAIN] epoch: 51, iter: 63400/160000, loss: 0.8110, lr: 0.000731, batch_cost: 0.2154, reader_cost: 0.00082, ips: 37.1391 samples/sec | ETA 05:46:48 2022-08-24 17:40:08 [INFO] [TRAIN] epoch: 51, iter: 63450/160000, loss: 0.8324, lr: 0.000731, batch_cost: 0.1619, reader_cost: 0.00188, ips: 49.4132 samples/sec | ETA 04:20:31 2022-08-24 17:40:17 [INFO] [TRAIN] epoch: 51, iter: 63500/160000, loss: 0.8281, lr: 0.000731, batch_cost: 0.1909, reader_cost: 0.00104, ips: 41.9091 samples/sec | ETA 05:07:00 2022-08-24 17:40:26 [INFO] [TRAIN] epoch: 51, iter: 63550/160000, loss: 0.7513, lr: 0.000730, batch_cost: 0.1763, reader_cost: 0.00061, ips: 45.3665 samples/sec | ETA 04:43:28 2022-08-24 17:40:35 [INFO] [TRAIN] epoch: 51, iter: 63600/160000, loss: 0.8131, lr: 0.000730, batch_cost: 0.1723, 
reader_cost: 0.00077, ips: 46.4368 samples/sec | ETA 04:36:47 2022-08-24 17:40:43 [INFO] [TRAIN] epoch: 51, iter: 63650/160000, loss: 0.7727, lr: 0.000729, batch_cost: 0.1658, reader_cost: 0.00084, ips: 48.2378 samples/sec | ETA 04:26:19 2022-08-24 17:40:51 [INFO] [TRAIN] epoch: 51, iter: 63700/160000, loss: 0.8076, lr: 0.000729, batch_cost: 0.1495, reader_cost: 0.00052, ips: 53.4979 samples/sec | ETA 04:00:00 2022-08-24 17:40:58 [INFO] [TRAIN] epoch: 51, iter: 63750/160000, loss: 0.8097, lr: 0.000729, batch_cost: 0.1458, reader_cost: 0.00040, ips: 54.8639 samples/sec | ETA 03:53:54 2022-08-24 17:41:06 [INFO] [TRAIN] epoch: 51, iter: 63800/160000, loss: 0.8843, lr: 0.000728, batch_cost: 0.1637, reader_cost: 0.00042, ips: 48.8573 samples/sec | ETA 04:22:32 2022-08-24 17:41:16 [INFO] [TRAIN] epoch: 51, iter: 63850/160000, loss: 0.8505, lr: 0.000728, batch_cost: 0.1944, reader_cost: 0.00038, ips: 41.1428 samples/sec | ETA 05:11:35 2022-08-24 17:41:24 [INFO] [TRAIN] epoch: 51, iter: 63900/160000, loss: 0.8471, lr: 0.000728, batch_cost: 0.1718, reader_cost: 0.00046, ips: 46.5624 samples/sec | ETA 04:35:11 2022-08-24 17:41:33 [INFO] [TRAIN] epoch: 51, iter: 63950/160000, loss: 0.8047, lr: 0.000727, batch_cost: 0.1754, reader_cost: 0.00058, ips: 45.6102 samples/sec | ETA 04:40:47 2022-08-24 17:41:42 [INFO] [TRAIN] epoch: 51, iter: 64000/160000, loss: 0.8546, lr: 0.000727, batch_cost: 0.1745, reader_cost: 0.00036, ips: 45.8427 samples/sec | ETA 04:39:12 2022-08-24 17:41:42 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 237s - batch_cost: 0.2370 - reader cost: 8.6108e-04 2022-08-24 17:45:39 [INFO] [EVAL] #Images: 2000 mIoU: 0.2978 Acc: 0.7388 Kappa: 0.7182 Dice: 0.4206 2022-08-24 17:45:39 [INFO] [EVAL] Class IoU: [0.6472 0.7545 0.9235 0.6983 0.6519 0.7284 0.7526 0.7289 0.4744 0.5958 0.4505 0.5088 0.656 0.2661 0.1614 0.3655 0.5003 0.3833 0.5248 0.3525 0.7074 0.4344 0.5355 0.4422 0.3162 0.3416 0.4548 0.3867 0.3405 0.2761 0.2139 0.4184 0.2603 0.2442 0.357 0.3722 0.3325 0.3968 0.2479 0.2552 0.0934 0.0883 0.2998 0.204 0.2718 0.2598 0.1606 0.3917 0.5984 0.4312 0.4621 0.2519 0.149 0.2131 0.6298 0.4999 0.7834 0.2127 0.3548 0.1827 0.0778 0.1453 0.2837 0.1695 0.3706 0.6179 0.148 0.3619 0.0364 0.3237 0.296 0.3969 0.3775 0.2305 0.3813 0.2797 0.3128 0.2279 0.1987 0.3265 0.5853 0.2539 0.2299 0.0395 0.3521 0.4674 0.0488 0.0445 0.3418 0.4127 0.3599 0.0008 0.1802 0.0689 0.0004 0.006 0.1148 0.1159 0.0321 0.3094 0.1043 0.004 0.1227 0.4782 0.0002 0.5921 0.1032 0.3692 0.0791 0.2535 0.0557 0.3686 0.0712 0.5775 0.7333 0.0008 0.375 0.5159 0.0517 0.0607 0.4473 0.0083 0.2559 0.0814 0.2413 0.1468 0.4127 0.2874 0.357 0.0898 0.5423 0.0046 0.0745 0.1625 0.0648 0.0887 0.04 0.0092 0.1234 0.2635 0.1667 0.0353 0.3137 0.3822 0.2997 0. 
0.24 0.0264 0.0384 0.0242] 2022-08-24 17:45:39 [INFO] [EVAL] Class Precision: [0.7439 0.8164 0.9587 0.8153 0.7374 0.865 0.8457 0.8213 0.6129 0.7238 0.6316 0.6718 0.7642 0.5081 0.5065 0.517 0.6444 0.694 0.6686 0.5635 0.8045 0.6755 0.719 0.5703 0.4773 0.6256 0.5465 0.701 0.6921 0.4038 0.3752 0.5407 0.4613 0.3572 0.5232 0.5488 0.569 0.5758 0.4579 0.4917 0.156 0.2502 0.568 0.4533 0.3898 0.4842 0.3251 0.5734 0.6386 0.5174 0.7097 0.318 0.4495 0.6248 0.6995 0.6963 0.8169 0.6321 0.6506 0.3332 0.1367 0.3517 0.4028 0.5713 0.4623 0.765 0.2974 0.5179 0.1699 0.5784 0.5339 0.5344 0.6544 0.287 0.4949 0.4382 0.5539 0.4986 0.5003 0.5481 0.7187 0.4656 0.7525 0.0932 0.641 0.5989 0.2606 0.4101 0.5887 0.6512 0.4803 0.0012 0.3918 0.2693 0.0048 0.0187 0.4545 0.5211 0.2598 0.5614 0.7737 0.0055 0.6677 0.6164 0.0068 0.6842 0.255 0.83 0.1624 0.31 0.2912 0.7956 0.2168 0.6924 0.7393 0.0276 0.728 0.5422 0.116 0.562 0.8053 0.607 0.7808 0.7162 0.5255 0.6454 0.7785 0.4058 0.4826 0.996 0.632 0.6817 0.5297 0.6208 0.6722 0.3256 0.4679 0.343 0.4012 0.7257 0.2015 0.0719 0.4962 0.6367 0.524 0. 0.8615 0.3441 0.1683 0.6506] 2022-08-24 17:45:39 [INFO] [EVAL] Class Recall: [0.8328 0.9086 0.9618 0.8296 0.849 0.8218 0.8724 0.8663 0.6774 0.7711 0.6112 0.6771 0.8225 0.3584 0.1916 0.5551 0.6911 0.4613 0.7094 0.4849 0.8542 0.549 0.6772 0.6632 0.4836 0.4294 0.7307 0.4631 0.4013 0.466 0.3322 0.6491 0.374 0.4358 0.529 0.5364 0.4444 0.5607 0.3508 0.3466 0.1888 0.1201 0.3884 0.2706 0.4733 0.3592 0.2409 0.5528 0.9049 0.7214 0.5698 0.548 0.1822 0.2444 0.8633 0.6392 0.9502 0.2427 0.4383 0.2881 0.1528 0.1984 0.4897 0.1942 0.6514 0.7627 0.2276 0.5459 0.0442 0.4236 0.3992 0.6067 0.4715 0.5395 0.6243 0.4361 0.4181 0.2957 0.2479 0.4468 0.7593 0.3584 0.2488 0.0641 0.4386 0.6804 0.0567 0.0475 0.4491 0.5298 0.5895 0.0022 0.2502 0.0848 0.0004 0.0088 0.1332 0.1298 0.0353 0.408 0.1076 0.0142 0.1306 0.6808 0.0002 0.8146 0.1478 0.3994 0.1335 0.5821 0.0645 0.4071 0.0958 0.7768 0.989 0.0009 0.4362 0.9142 0.0853 0.0637 0.5016 0.0083 0.2757 0.0841 0.3086 0.1596 0.4676 0.4961 0.5784 0.0898 0.7925 0.0046 0.0797 0.1804 0.0669 0.1087 0.0419 0.0094 0.1512 0.2927 0.4915 0.065 0.4603 0.4888 0.4118 0. 0.2496 0.0278 0.0474 0.0245] 2022-08-24 17:45:40 [INFO] [EVAL] The model with the best validation mIoU (0.3010) was saved at iter 60000. 
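Note: the trailing "[EVAL] The model with the best validation mIoU (...) was saved at iter ..." line tracks a running maximum, which is why it keeps reporting iter 60000 until a later evaluation beats 0.3010. A minimal sketch of that bookkeeping, fed with the (iter, mIoU) pairs from the evaluations above; the class and method names are illustrative, not PaddleSeg's:

    # Keep-best bookkeeping implied by the repeated "best validation mIoU" lines above.
    class BestModelTracker:
        def __init__(self):
            self.best_miou = float("-inf")
            self.best_iter = None

        def update(self, iteration, miou):
            """Return True when this evaluation sets a new best (i.e. the checkpoint is kept)."""
            if miou > self.best_miou:
                self.best_miou, self.best_iter = miou, iteration
                return True
            return False

    tracker = BestModelTracker()
    evals = [(59000, 0.2955), (60000, 0.3010), (61000, 0.3002),
             (62000, 0.2943), (63000, 0.2983), (64000, 0.2978)]
    for it, miou in evals:
        tracker.update(it, miou)
        print(f"best validation mIoU ({tracker.best_miou:.4f}) saved at iter {tracker.best_iter}")
    # The last four lines all report 0.3010 at iter 60000, matching the log.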
2022-08-24 17:45:51 [INFO] [TRAIN] epoch: 51, iter: 64050/160000, loss: 0.7958, lr: 0.000726, batch_cost: 0.2202, reader_cost: 0.00305, ips: 36.3371 samples/sec | ETA 05:52:04 2022-08-24 17:46:00 [INFO] [TRAIN] epoch: 51, iter: 64100/160000, loss: 0.8014, lr: 0.000726, batch_cost: 0.1939, reader_cost: 0.00124, ips: 41.2627 samples/sec | ETA 05:09:53 2022-08-24 17:46:09 [INFO] [TRAIN] epoch: 51, iter: 64150/160000, loss: 0.8369, lr: 0.000726, batch_cost: 0.1751, reader_cost: 0.00080, ips: 45.6943 samples/sec | ETA 04:39:41 2022-08-24 17:46:19 [INFO] [TRAIN] epoch: 51, iter: 64200/160000, loss: 0.7834, lr: 0.000725, batch_cost: 0.1910, reader_cost: 0.00315, ips: 41.8831 samples/sec | ETA 05:04:58 2022-08-24 17:46:28 [INFO] [TRAIN] epoch: 51, iter: 64250/160000, loss: 0.8111, lr: 0.000725, batch_cost: 0.1916, reader_cost: 0.00050, ips: 41.7440 samples/sec | ETA 05:05:49 2022-08-24 17:46:37 [INFO] [TRAIN] epoch: 51, iter: 64300/160000, loss: 0.8244, lr: 0.000725, batch_cost: 0.1720, reader_cost: 0.00043, ips: 46.5215 samples/sec | ETA 04:34:16 2022-08-24 17:46:45 [INFO] [TRAIN] epoch: 51, iter: 64350/160000, loss: 0.8517, lr: 0.000724, batch_cost: 0.1599, reader_cost: 0.00776, ips: 50.0436 samples/sec | ETA 04:14:50 2022-08-24 17:46:54 [INFO] [TRAIN] epoch: 51, iter: 64400/160000, loss: 0.7979, lr: 0.000724, batch_cost: 0.1834, reader_cost: 0.00049, ips: 43.6303 samples/sec | ETA 04:52:09 2022-08-24 17:47:04 [INFO] [TRAIN] epoch: 52, iter: 64450/160000, loss: 0.8792, lr: 0.000723, batch_cost: 0.2039, reader_cost: 0.02610, ips: 39.2435 samples/sec | ETA 05:24:38 2022-08-24 17:47:13 [INFO] [TRAIN] epoch: 52, iter: 64500/160000, loss: 0.8138, lr: 0.000723, batch_cost: 0.1712, reader_cost: 0.00775, ips: 46.7313 samples/sec | ETA 04:32:28 2022-08-24 17:47:21 [INFO] [TRAIN] epoch: 52, iter: 64550/160000, loss: 0.8327, lr: 0.000723, batch_cost: 0.1590, reader_cost: 0.00039, ips: 50.3015 samples/sec | ETA 04:13:00 2022-08-24 17:47:29 [INFO] [TRAIN] epoch: 52, iter: 64600/160000, loss: 0.8056, lr: 0.000722, batch_cost: 0.1729, reader_cost: 0.00072, ips: 46.2800 samples/sec | ETA 04:34:50 2022-08-24 17:47:38 [INFO] [TRAIN] epoch: 52, iter: 64650/160000, loss: 0.8459, lr: 0.000722, batch_cost: 0.1756, reader_cost: 0.00049, ips: 45.5600 samples/sec | ETA 04:39:02 2022-08-24 17:47:46 [INFO] [TRAIN] epoch: 52, iter: 64700/160000, loss: 0.7961, lr: 0.000722, batch_cost: 0.1554, reader_cost: 0.00071, ips: 51.4744 samples/sec | ETA 04:06:51 2022-08-24 17:47:53 [INFO] [TRAIN] epoch: 52, iter: 64750/160000, loss: 0.7626, lr: 0.000721, batch_cost: 0.1489, reader_cost: 0.00304, ips: 53.7107 samples/sec | ETA 03:56:27 2022-08-24 17:48:03 [INFO] [TRAIN] epoch: 52, iter: 64800/160000, loss: 0.8449, lr: 0.000721, batch_cost: 0.1844, reader_cost: 0.00055, ips: 43.3726 samples/sec | ETA 04:52:39 2022-08-24 17:48:11 [INFO] [TRAIN] epoch: 52, iter: 64850/160000, loss: 0.8396, lr: 0.000720, batch_cost: 0.1728, reader_cost: 0.00055, ips: 46.2842 samples/sec | ETA 04:34:06 2022-08-24 17:48:19 [INFO] [TRAIN] epoch: 52, iter: 64900/160000, loss: 0.8202, lr: 0.000720, batch_cost: 0.1633, reader_cost: 0.00859, ips: 48.9979 samples/sec | ETA 04:18:47 2022-08-24 17:48:28 [INFO] [TRAIN] epoch: 52, iter: 64950/160000, loss: 0.8329, lr: 0.000720, batch_cost: 0.1741, reader_cost: 0.00204, ips: 45.9633 samples/sec | ETA 04:35:43 2022-08-24 17:48:36 [INFO] [TRAIN] epoch: 52, iter: 65000/160000, loss: 0.8269, lr: 0.000719, batch_cost: 0.1647, reader_cost: 0.00031, ips: 48.5714 samples/sec | ETA 04:20:47 2022-08-24 17:48:36 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 230s - batch_cost: 0.2295 - reader cost: 8.8167e-04 2022-08-24 17:52:26 [INFO] [EVAL] #Images: 2000 mIoU: 0.2997 Acc: 0.7393 Kappa: 0.7194 Dice: 0.4251 2022-08-24 17:52:26 [INFO] [EVAL] Class IoU: [0.648 0.755 0.9242 0.6963 0.6622 0.7323 0.7532 0.7383 0.4694 0.6192 0.457 0.5123 0.6355 0.257 0.1814 0.3731 0.4879 0.4005 0.5308 0.3562 0.7027 0.4147 0.5455 0.4477 0.3067 0.4447 0.4307 0.3965 0.3667 0.2671 0.1988 0.4165 0.2579 0.2459 0.2659 0.3834 0.3337 0.3942 0.2524 0.2222 0.1074 0.0765 0.2813 0.197 0.3061 0.266 0.2332 0.3619 0.6306 0.4564 0.4422 0.2264 0.158 0.2172 0.6595 0.4921 0.8109 0.2886 0.4077 0.1913 0.1307 0.1312 0.256 0.1216 0.335 0.6077 0.1887 0.3691 0.0489 0.3293 0.3016 0.3801 0.3869 0.2264 0.4048 0.242 0.3383 0.2266 0.1423 0.1867 0.5306 0.2761 0.2437 0.0223 0.4416 0.4762 0.0529 0.0517 0.2892 0.4102 0.3424 0. 0.2271 0.0572 0.1064 0.0032 0.1655 0.1217 0.1366 0.3187 0.1362 0.0074 0.1882 0.5732 0.052 0.414 0.1713 0.3305 0.0789 0.2644 0.049 0.2229 0.0659 0.5371 0.5647 0.0003 0.3602 0.5401 0.0775 0.1027 0.4212 0.0001 0.277 0.0641 0.2301 0.1494 0.4191 0.3172 0.2902 0.1969 0.4988 0.0029 0.0009 0.2106 0.0938 0.0825 0.0684 0.009 0.117 0.2914 0.2247 0.0205 0.2456 0.4536 0.2948 0. 0.2433 0.0227 0.0373 0.0149] 2022-08-24 17:52:26 [INFO] [EVAL] Class Precision: [0.7689 0.8197 0.9593 0.8018 0.7652 0.8489 0.8506 0.8065 0.614 0.7168 0.6396 0.65 0.7096 0.528 0.4929 0.5343 0.6149 0.6101 0.6824 0.5348 0.7949 0.6777 0.7257 0.5799 0.5121 0.6383 0.6008 0.689 0.7081 0.4262 0.432 0.5898 0.5449 0.3816 0.5637 0.535 0.5999 0.6952 0.5212 0.524 0.157 0.2546 0.4305 0.4373 0.4132 0.4568 0.3516 0.4702 0.7163 0.5617 0.5926 0.2656 0.3236 0.6278 0.6864 0.6738 0.8613 0.5616 0.5867 0.4396 0.1667 0.2364 0.4419 0.5413 0.3973 0.7534 0.2999 0.4786 0.1324 0.6024 0.5238 0.4908 0.6109 0.3831 0.6165 0.3613 0.6517 0.4597 0.2833 0.6826 0.583 0.5084 0.6959 0.0489 0.5204 0.6363 0.1398 0.3867 0.3722 0.5804 0.4471 0. 0.3706 0.2781 0.2292 0.0158 0.3686 0.3784 0.4222 0.4971 0.5978 0.0119 0.5749 0.618 0.484 0.494 0.3494 0.8246 0.131 0.3579 0.2564 0.3402 0.2927 0.6823 0.5665 0.0098 0.7263 0.5652 0.1288 0.5966 0.8234 0.1346 0.7468 0.7738 0.5145 0.6191 0.7328 0.4536 0.4309 0.5634 0.6337 0.3712 0.0294 0.4993 0.6267 0.2886 0.3985 0.0861 0.3922 0.7072 0.3183 0.0382 0.6646 0.5832 0.5141 0. 0.838 0.2724 0.1464 0.774 ] 2022-08-24 17:52:26 [INFO] [EVAL] Class Recall: [0.8048 0.9054 0.9619 0.8411 0.8311 0.8421 0.868 0.8972 0.666 0.8198 0.6154 0.7075 0.8588 0.3336 0.223 0.5529 0.7026 0.5383 0.705 0.5162 0.8583 0.5165 0.6872 0.6625 0.4334 0.5944 0.6033 0.483 0.432 0.417 0.2691 0.5862 0.3288 0.4089 0.3347 0.575 0.4292 0.4766 0.3287 0.2784 0.254 0.0986 0.4481 0.2639 0.5416 0.3891 0.4092 0.611 0.8404 0.7089 0.6355 0.6053 0.236 0.2493 0.944 0.646 0.9326 0.3726 0.572 0.253 0.377 0.2276 0.3784 0.1356 0.6813 0.7586 0.3371 0.6174 0.0719 0.4207 0.4156 0.6277 0.5134 0.3562 0.541 0.4229 0.413 0.3089 0.2224 0.2044 0.8551 0.3767 0.2727 0.0395 0.7446 0.6543 0.0783 0.0563 0.5645 0.5831 0.5938 0. 0.3696 0.0672 0.1657 0.004 0.231 0.1522 0.1681 0.4703 0.1499 0.0191 0.2187 0.8879 0.0551 0.719 0.2515 0.3555 0.1654 0.5032 0.0572 0.3928 0.0783 0.7162 0.9944 0.0003 0.4168 0.924 0.1628 0.1103 0.463 0.0001 0.3057 0.0653 0.2939 0.1646 0.4947 0.5135 0.4707 0.2324 0.7009 0.003 0.0009 0.2669 0.0994 0.1036 0.0762 0.01 0.1429 0.3314 0.433 0.0425 0.2803 0.6713 0.4087 0. 0.2553 0.0242 0.0478 0.015 ] 2022-08-24 17:52:26 [INFO] [EVAL] The model with the best validation mIoU (0.3010) was saved at iter 60000. 
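Note: an evaluation over the 2000 validation images runs every 1000 training iterations and takes roughly 230-250 s, which is more wall-clock than the 1000 training steps themselves (~160-190 s at the logged batch_cost). Since the [TRAIN] ETA appears to be remaining iterations times batch_cost (per the earlier check), the real time to 160k iterations is roughly twice what it prints while per-1000-iteration evaluation is enabled. The arithmetic from the timestamps above, with the per-eval durations copied from the seven evaluation blocks at iters 58000 through 64000:

    from datetime import datetime

    # Window taken from the log above: the [TRAIN] record at iter 57050 (16:57:22)
    # to the one at iter 65000 (17:48:36), i.e. 7950 training iterations.
    t0 = datetime(2022, 8, 24, 16, 57, 22)
    t1 = datetime(2022, 8, 24, 17, 48, 36)
    iters = 65_000 - 57_050

    eval_seconds = sum([249, 229, 239, 232, 245, 228, 237])  # evals at 58k..64k
    wall = (t1 - t0).total_seconds()                         # 3074 s
    train_only = wall - eval_seconds                         # ~1415 s

    print(f"{wall / iters:.3f} s/iter wall-clock vs {train_only / iters:.3f} s/iter training only")
    # -> 0.387 s/iter wall-clock vs 0.178 s/iter training only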
2022-08-24 17:52:37 [INFO] [TRAIN] epoch: 52, iter: 65050/160000, loss: 0.8451, lr: 0.000719, batch_cost: 0.2192, reader_cost: 0.00960, ips: 36.5031 samples/sec | ETA 05:46:49 2022-08-24 17:52:47 [INFO] [TRAIN] epoch: 52, iter: 65100/160000, loss: 0.7687, lr: 0.000718, batch_cost: 0.1885, reader_cost: 0.00083, ips: 42.4405 samples/sec | ETA 04:58:08 2022-08-24 17:52:55 [INFO] [TRAIN] epoch: 52, iter: 65150/160000, loss: 0.8041, lr: 0.000718, batch_cost: 0.1728, reader_cost: 0.00044, ips: 46.3047 samples/sec | ETA 04:33:07 2022-08-24 17:53:04 [INFO] [TRAIN] epoch: 52, iter: 65200/160000, loss: 0.8371, lr: 0.000718, batch_cost: 0.1804, reader_cost: 0.00044, ips: 44.3364 samples/sec | ETA 04:45:05 2022-08-24 17:53:12 [INFO] [TRAIN] epoch: 52, iter: 65250/160000, loss: 0.8555, lr: 0.000717, batch_cost: 0.1649, reader_cost: 0.00031, ips: 48.5010 samples/sec | ETA 04:20:28 2022-08-24 17:53:21 [INFO] [TRAIN] epoch: 52, iter: 65300/160000, loss: 0.8294, lr: 0.000717, batch_cost: 0.1761, reader_cost: 0.00080, ips: 45.4303 samples/sec | ETA 04:37:56 2022-08-24 17:53:31 [INFO] [TRAIN] epoch: 52, iter: 65350/160000, loss: 0.7914, lr: 0.000717, batch_cost: 0.1921, reader_cost: 0.00051, ips: 41.6354 samples/sec | ETA 05:03:06 2022-08-24 17:53:40 [INFO] [TRAIN] epoch: 52, iter: 65400/160000, loss: 0.8259, lr: 0.000716, batch_cost: 0.1830, reader_cost: 0.00172, ips: 43.7092 samples/sec | ETA 04:48:34 2022-08-24 17:53:49 [INFO] [TRAIN] epoch: 52, iter: 65450/160000, loss: 0.7654, lr: 0.000716, batch_cost: 0.1829, reader_cost: 0.00032, ips: 43.7320 samples/sec | ETA 04:48:16 2022-08-24 17:53:59 [INFO] [TRAIN] epoch: 52, iter: 65500/160000, loss: 0.7721, lr: 0.000715, batch_cost: 0.1949, reader_cost: 0.00046, ips: 41.0528 samples/sec | ETA 05:06:55 2022-08-24 17:54:08 [INFO] [TRAIN] epoch: 52, iter: 65550/160000, loss: 0.8210, lr: 0.000715, batch_cost: 0.1778, reader_cost: 0.00050, ips: 44.9934 samples/sec | ETA 04:39:53 2022-08-24 17:54:16 [INFO] [TRAIN] epoch: 52, iter: 65600/160000, loss: 0.8180, lr: 0.000715, batch_cost: 0.1570, reader_cost: 0.00085, ips: 50.9563 samples/sec | ETA 04:07:00 2022-08-24 17:54:24 [INFO] [TRAIN] epoch: 52, iter: 65650/160000, loss: 0.7871, lr: 0.000714, batch_cost: 0.1746, reader_cost: 0.00030, ips: 45.8222 samples/sec | ETA 04:34:32 2022-08-24 17:54:35 [INFO] [TRAIN] epoch: 53, iter: 65700/160000, loss: 0.7833, lr: 0.000714, batch_cost: 0.2144, reader_cost: 0.02786, ips: 37.3108 samples/sec | ETA 05:36:59 2022-08-24 17:54:44 [INFO] [TRAIN] epoch: 53, iter: 65750/160000, loss: 0.7549, lr: 0.000714, batch_cost: 0.1828, reader_cost: 0.00047, ips: 43.7717 samples/sec | ETA 04:47:05 2022-08-24 17:54:53 [INFO] [TRAIN] epoch: 53, iter: 65800/160000, loss: 0.8218, lr: 0.000713, batch_cost: 0.1680, reader_cost: 0.00120, ips: 47.6239 samples/sec | ETA 04:23:43 2022-08-24 17:55:01 [INFO] [TRAIN] epoch: 53, iter: 65850/160000, loss: 0.8095, lr: 0.000713, batch_cost: 0.1653, reader_cost: 0.00043, ips: 48.4070 samples/sec | ETA 04:19:19 2022-08-24 17:55:10 [INFO] [TRAIN] epoch: 53, iter: 65900/160000, loss: 0.8086, lr: 0.000712, batch_cost: 0.1784, reader_cost: 0.00072, ips: 44.8398 samples/sec | ETA 04:39:48 2022-08-24 17:55:20 [INFO] [TRAIN] epoch: 53, iter: 65950/160000, loss: 0.7588, lr: 0.000712, batch_cost: 0.2046, reader_cost: 0.00038, ips: 39.0942 samples/sec | ETA 05:20:45 2022-08-24 17:55:30 [INFO] [TRAIN] epoch: 53, iter: 66000/160000, loss: 0.7589, lr: 0.000712, batch_cost: 0.2083, reader_cost: 0.00101, ips: 38.4104 samples/sec | ETA 05:26:18 2022-08-24 17:55:30 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 234s - batch_cost: 0.2335 - reader cost: 7.5941e-04 2022-08-24 17:59:24 [INFO] [EVAL] #Images: 2000 mIoU: 0.2995 Acc: 0.7408 Kappa: 0.7207 Dice: 0.4223 2022-08-24 17:59:24 [INFO] [EVAL] Class IoU: [0.6487 0.7625 0.9249 0.6953 0.6625 0.7292 0.7491 0.747 0.4709 0.6072 0.4272 0.5189 0.6549 0.2724 0.2207 0.3692 0.4715 0.4306 0.5371 0.3529 0.7214 0.4313 0.542 0.4637 0.3289 0.42 0.4176 0.3666 0.3495 0.2243 0.2138 0.4087 0.2536 0.2652 0.29 0.3641 0.344 0.454 0.2514 0.2337 0.1029 0.0674 0.2954 0.2058 0.3101 0.2751 0.2034 0.3965 0.637 0.4713 0.4083 0.271 0.1737 0.1839 0.641 0.4188 0.8423 0.2747 0.3494 0.1696 0.0782 0.1417 0.3124 0.0789 0.3397 0.6105 0.2138 0.3571 0.0314 0.3223 0.3223 0.4151 0.3759 0.192 0.4209 0.2817 0.3009 0.227 0.0742 0.1407 0.5869 0.2618 0.1941 0.0129 0.4083 0.4623 0.0299 0.0552 0.2869 0.4045 0.3665 0.0025 0.1681 0.04 0.0377 0.0065 0.123 0.1252 0.1159 0.3332 0.047 0.006 0.1875 0.6393 0.0095 0.4943 0.1157 0.397 0.0791 0.2851 0.0372 0.3098 0.0862 0.5877 0.7797 0.0007 0.4149 0.5272 0.0481 0.1768 0.4021 0.0089 0.2241 0.0473 0.2196 0.154 0.3799 0.2963 0.319 0.2018 0.4949 0.0045 0.0764 0.2393 0.0802 0.1128 0.0744 0.0091 0.1225 0.2678 0.1931 0.0283 0.2681 0.3354 0.3125 0. 0.2065 0.008 0.0444 0.0167] 2022-08-24 17:59:24 [INFO] [EVAL] Class Precision: [0.754 0.8235 0.9571 0.7883 0.773 0.8599 0.8755 0.8175 0.6135 0.7024 0.6782 0.6247 0.7658 0.5334 0.4345 0.5233 0.6223 0.6739 0.6589 0.5199 0.8234 0.637 0.7121 0.6119 0.4899 0.6232 0.5424 0.7482 0.6636 0.4056 0.4182 0.5887 0.5074 0.4597 0.466 0.5018 0.5623 0.6047 0.4687 0.5651 0.1676 0.2488 0.5457 0.506 0.4193 0.4203 0.2796 0.6846 0.6805 0.5945 0.5151 0.3614 0.3889 0.689 0.7166 0.5321 0.9141 0.5739 0.6486 0.3332 0.1235 0.2795 0.4905 0.5781 0.4014 0.6963 0.4552 0.5715 0.0887 0.556 0.5057 0.5102 0.5785 0.2106 0.6969 0.4591 0.4594 0.6771 0.572 0.5419 0.6577 0.5758 0.7641 0.0494 0.6517 0.68 0.1033 0.4093 0.487 0.5399 0.4748 0.0037 0.3708 0.3201 0.1365 0.0306 0.4764 0.527 0.4329 0.534 0.7648 0.0116 0.6585 0.9449 0.215 0.698 0.3422 0.7745 0.1669 0.4222 0.334 0.5482 0.326 0.684 0.7891 0.0206 0.7624 0.5775 0.1211 0.5462 0.7981 0.86 0.9273 0.7708 0.5679 0.5889 0.7628 0.4164 0.6887 0.5664 0.5657 0.9515 0.3746 0.5906 0.6642 0.3146 0.3162 0.2483 0.3961 0.7175 0.486 0.0463 0.6217 0.6328 0.7291 0. 0.8828 0.278 0.2274 0.7816] 2022-08-24 17:59:24 [INFO] [EVAL] Class Recall: [0.8229 0.9115 0.965 0.855 0.8226 0.8275 0.8383 0.8965 0.6695 0.8176 0.5357 0.754 0.8189 0.3577 0.3096 0.5563 0.6606 0.5439 0.7439 0.5235 0.8535 0.5719 0.6941 0.6568 0.5002 0.563 0.6449 0.4182 0.4247 0.3341 0.3042 0.5721 0.3365 0.3853 0.4344 0.5703 0.4698 0.6456 0.3515 0.2849 0.2105 0.0846 0.3918 0.2575 0.5435 0.4433 0.4273 0.4851 0.9087 0.6945 0.6632 0.5202 0.239 0.2005 0.8587 0.663 0.9147 0.345 0.4311 0.2567 0.1758 0.2233 0.4625 0.0837 0.6885 0.8321 0.2873 0.4877 0.0463 0.434 0.4707 0.69 0.5177 0.6858 0.5152 0.4217 0.4659 0.2546 0.0785 0.1596 0.845 0.3244 0.2065 0.0172 0.5222 0.5909 0.0404 0.06 0.4111 0.6173 0.6164 0.0077 0.2351 0.0437 0.0495 0.0082 0.1423 0.1411 0.1367 0.4697 0.0477 0.0122 0.2077 0.6641 0.0099 0.6287 0.1488 0.4489 0.1308 0.4674 0.0402 0.4161 0.1049 0.8067 0.9848 0.0007 0.4766 0.8581 0.0738 0.2072 0.4476 0.0089 0.2281 0.048 0.2636 0.1726 0.4308 0.5067 0.3727 0.2387 0.7982 0.0045 0.0876 0.2869 0.0836 0.1496 0.0887 0.0093 0.1506 0.2994 0.2426 0.0676 0.3204 0.4164 0.3535 0. 
0.2123 0.0081 0.0523 0.0167] 2022-08-24 17:59:24 [INFO] [EVAL] The model with the best validation mIoU (0.3010) was saved at iter 60000. 2022-08-24 17:59:33 [INFO] [TRAIN] epoch: 53, iter: 66050/160000, loss: 0.7912, lr: 0.000711, batch_cost: 0.1631, reader_cost: 0.00428, ips: 49.0557 samples/sec | ETA 04:15:21 2022-08-24 17:59:41 [INFO] [TRAIN] epoch: 53, iter: 66100/160000, loss: 0.7637, lr: 0.000711, batch_cost: 0.1658, reader_cost: 0.00623, ips: 48.2513 samples/sec | ETA 04:19:28 2022-08-24 17:59:50 [INFO] [TRAIN] epoch: 53, iter: 66150/160000, loss: 0.8610, lr: 0.000711, batch_cost: 0.1833, reader_cost: 0.00089, ips: 43.6421 samples/sec | ETA 04:46:43 2022-08-24 17:59:59 [INFO] [TRAIN] epoch: 53, iter: 66200/160000, loss: 0.8895, lr: 0.000710, batch_cost: 0.1697, reader_cost: 0.00053, ips: 47.1408 samples/sec | ETA 04:25:18 2022-08-24 18:00:07 [INFO] [TRAIN] epoch: 53, iter: 66250/160000, loss: 0.7901, lr: 0.000710, batch_cost: 0.1776, reader_cost: 0.00056, ips: 45.0455 samples/sec | ETA 04:37:29 2022-08-24 18:00:17 [INFO] [TRAIN] epoch: 53, iter: 66300/160000, loss: 0.8292, lr: 0.000709, batch_cost: 0.1855, reader_cost: 0.00071, ips: 43.1182 samples/sec | ETA 04:49:44 2022-08-24 18:00:25 [INFO] [TRAIN] epoch: 53, iter: 66350/160000, loss: 0.7750, lr: 0.000709, batch_cost: 0.1689, reader_cost: 0.00060, ips: 47.3582 samples/sec | ETA 04:23:39 2022-08-24 18:00:33 [INFO] [TRAIN] epoch: 53, iter: 66400/160000, loss: 0.8022, lr: 0.000709, batch_cost: 0.1615, reader_cost: 0.00326, ips: 49.5437 samples/sec | ETA 04:11:53 2022-08-24 18:00:42 [INFO] [TRAIN] epoch: 53, iter: 66450/160000, loss: 0.8324, lr: 0.000708, batch_cost: 0.1724, reader_cost: 0.00249, ips: 46.4037 samples/sec | ETA 04:28:48 2022-08-24 18:00:51 [INFO] [TRAIN] epoch: 53, iter: 66500/160000, loss: 0.8200, lr: 0.000708, batch_cost: 0.1886, reader_cost: 0.00074, ips: 42.4209 samples/sec | ETA 04:53:52 2022-08-24 18:01:00 [INFO] [TRAIN] epoch: 53, iter: 66550/160000, loss: 0.8481, lr: 0.000708, batch_cost: 0.1659, reader_cost: 0.00046, ips: 48.2297 samples/sec | ETA 04:18:20 2022-08-24 18:01:08 [INFO] [TRAIN] epoch: 53, iter: 66600/160000, loss: 0.8231, lr: 0.000707, batch_cost: 0.1656, reader_cost: 0.00064, ips: 48.3043 samples/sec | ETA 04:17:48 2022-08-24 18:01:16 [INFO] [TRAIN] epoch: 53, iter: 66650/160000, loss: 0.7966, lr: 0.000707, batch_cost: 0.1661, reader_cost: 0.00038, ips: 48.1777 samples/sec | ETA 04:18:20 2022-08-24 18:01:25 [INFO] [TRAIN] epoch: 53, iter: 66700/160000, loss: 0.8655, lr: 0.000706, batch_cost: 0.1756, reader_cost: 0.00079, ips: 45.5621 samples/sec | ETA 04:33:02 2022-08-24 18:01:35 [INFO] [TRAIN] epoch: 53, iter: 66750/160000, loss: 0.8030, lr: 0.000706, batch_cost: 0.2019, reader_cost: 0.00034, ips: 39.6262 samples/sec | ETA 05:13:45 2022-08-24 18:01:44 [INFO] [TRAIN] epoch: 53, iter: 66800/160000, loss: 0.7855, lr: 0.000706, batch_cost: 0.1807, reader_cost: 0.00057, ips: 44.2731 samples/sec | ETA 04:40:40 2022-08-24 18:01:53 [INFO] [TRAIN] epoch: 53, iter: 66850/160000, loss: 0.7880, lr: 0.000705, batch_cost: 0.1802, reader_cost: 0.00042, ips: 44.3958 samples/sec | ETA 04:39:45 2022-08-24 18:02:04 [INFO] [TRAIN] epoch: 53, iter: 66900/160000, loss: 0.8199, lr: 0.000705, batch_cost: 0.2150, reader_cost: 0.00051, ips: 37.2076 samples/sec | ETA 05:33:37 2022-08-24 18:02:17 [INFO] [TRAIN] epoch: 54, iter: 66950/160000, loss: 0.7645, lr: 0.000704, batch_cost: 0.2561, reader_cost: 0.04162, ips: 31.2393 samples/sec | ETA 06:37:08 2022-08-24 18:02:27 [INFO] [TRAIN] epoch: 54, iter: 67000/160000, loss: 
0.8111, lr: 0.000704, batch_cost: 0.2140, reader_cost: 0.00511, ips: 37.3896 samples/sec | ETA 05:31:38 2022-08-24 18:02:27 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 239s - batch_cost: 0.2392 - reader cost: 0.0014 2022-08-24 18:06:27 [INFO] [EVAL] #Images: 2000 mIoU: 0.2986 Acc: 0.7410 Kappa: 0.7205 Dice: 0.4233 2022-08-24 18:06:27 [INFO] [EVAL] Class IoU: [0.6492 0.7547 0.9206 0.6978 0.6614 0.7285 0.7667 0.7423 0.4721 0.6279 0.4358 0.5184 0.6426 0.2865 0.2229 0.3573 0.4796 0.414 0.5397 0.3607 0.7112 0.4573 0.5488 0.4651 0.3319 0.3563 0.3714 0.3583 0.3593 0.3102 0.1801 0.3799 0.1612 0.261 0.2445 0.34 0.3389 0.4313 0.244 0.2442 0.0971 0.0486 0.298 0.2169 0.3027 0.2507 0.2343 0.3944 0.4418 0.4618 0.4384 0.2438 0.1521 0.1807 0.6502 0.4017 0.8408 0.3297 0.3598 0.2094 0.0808 0.1428 0.2896 0.0958 0.3812 0.6184 0.1867 0.3689 0.0481 0.2823 0.3163 0.4141 0.3732 0.2263 0.4156 0.2786 0.33 0.2353 0.1535 0.2347 0.6243 0.2567 0.2619 0.0117 0.3673 0.4668 0.072 0.0582 0.2831 0.442 0.346 0.0072 0.2016 0.0836 0.0282 0.0055 0.1619 0.1021 0.2012 0.3421 0.1085 0.0113 0.0968 0.387 0.0003 0.4354 0.1083 0.4892 0.0838 0.3152 0.0267 0.1208 0.0929 0.5761 0.6456 0. 0.3587 0.5692 0.1032 0.1235 0.4389 0.0039 0.252 0.0641 0.2198 0.1266 0.4323 0.3298 0.3179 0.2468 0.5668 0.0015 0.1427 0.1934 0.0622 0.107 0.0738 0.0115 0.1098 0.2646 0.1663 0.0594 0.3098 0.2731 0.3115 0. 0.2289 0.027 0.0359 0.038 ] 2022-08-24 18:06:27 [INFO] [EVAL] Class Precision: [0.7419 0.8179 0.9476 0.8086 0.7669 0.8436 0.8735 0.8202 0.6512 0.7536 0.6833 0.6739 0.728 0.4876 0.4007 0.5927 0.6015 0.6549 0.6954 0.4987 0.793 0.5829 0.7603 0.5702 0.4948 0.6913 0.5763 0.7673 0.7047 0.5306 0.3689 0.4803 0.4346 0.4051 0.4708 0.5434 0.6028 0.7103 0.4852 0.5155 0.1944 0.1984 0.5372 0.4675 0.4361 0.4688 0.3664 0.6335 0.7494 0.5737 0.5726 0.2968 0.363 0.6438 0.712 0.5013 0.888 0.5961 0.5851 0.3646 0.1384 0.2856 0.4323 0.648 0.5074 0.7579 0.3444 0.5093 0.1305 0.7394 0.5443 0.6677 0.6121 0.2886 0.6129 0.4893 0.6109 0.6589 0.4943 0.8151 0.7264 0.4383 0.7143 0.0322 0.6791 0.6929 0.3184 0.3357 0.4764 0.6901 0.4378 0.01 0.347 0.2711 0.1027 0.0196 0.6516 0.6619 0.5491 0.6158 0.6023 0.0177 0.6584 0.575 0.0023 0.5615 0.2767 0.7228 0.2277 0.4232 0.2942 0.1455 0.3232 0.6674 0.649 0. 0.736 0.6038 0.261 0.5262 0.7119 0.5453 0.8471 0.6943 0.5956 0.6197 0.7306 0.4449 0.7604 0.3956 0.7769 1. 0.6026 0.5659 0.5973 0.3097 0.4529 0.1713 0.4289 0.7256 0.1857 0.078 0.6049 0.5481 0.6971 0. 0.8613 0.3023 0.2106 0.6656] 2022-08-24 18:06:27 [INFO] [EVAL] Class Recall: [0.8385 0.9071 0.97 0.8358 0.8278 0.8422 0.8625 0.8865 0.6319 0.79 0.5461 0.6921 0.8456 0.4099 0.3344 0.4737 0.7028 0.5295 0.7067 0.5659 0.8733 0.6798 0.6637 0.7162 0.5021 0.4237 0.5109 0.402 0.423 0.4274 0.2603 0.6451 0.2039 0.4232 0.3372 0.476 0.4363 0.5234 0.3292 0.3169 0.1624 0.0604 0.4009 0.288 0.4974 0.3502 0.394 0.511 0.5183 0.703 0.6517 0.5772 0.2075 0.2008 0.8821 0.6692 0.9405 0.4245 0.4831 0.3297 0.1627 0.2222 0.4673 0.1011 0.6053 0.7706 0.2897 0.5724 0.0707 0.3135 0.4302 0.5215 0.4889 0.5119 0.5635 0.3929 0.4178 0.2679 0.1821 0.2479 0.8162 0.3826 0.2925 0.0179 0.4444 0.5886 0.0851 0.0658 0.4109 0.5515 0.6226 0.025 0.3248 0.1079 0.0374 0.0077 0.1772 0.1077 0.2411 0.4349 0.1169 0.0306 0.1019 0.542 0.0003 0.6597 0.1511 0.6021 0.1172 0.5525 0.0285 0.4161 0.1153 0.808 0.9921 0. 0.4116 0.9085 0.1459 0.139 0.5337 0.0039 0.264 0.066 0.2583 0.1373 0.5142 0.5604 0.3533 0.3962 0.677 0.0015 0.1576 0.2271 0.0649 0.1406 0.081 0.0122 0.1286 0.294 0.6136 0.1995 0.3883 0.3525 0.3603 0. 
0.2376 0.0288 0.0415 0.0388] 2022-08-24 18:06:27 [INFO] [EVAL] The model with the best validation mIoU (0.3010) was saved at iter 60000. 2022-08-24 18:06:36 [INFO] [TRAIN] epoch: 54, iter: 67050/160000, loss: 0.8222, lr: 0.000704, batch_cost: 0.1800, reader_cost: 0.00328, ips: 44.4519 samples/sec | ETA 04:38:48 2022-08-24 18:06:44 [INFO] [TRAIN] epoch: 54, iter: 67100/160000, loss: 0.8406, lr: 0.000703, batch_cost: 0.1611, reader_cost: 0.00941, ips: 49.6487 samples/sec | ETA 04:09:29 2022-08-24 18:06:53 [INFO] [TRAIN] epoch: 54, iter: 67150/160000, loss: 0.7358, lr: 0.000703, batch_cost: 0.1761, reader_cost: 0.01088, ips: 45.4298 samples/sec | ETA 04:32:30 2022-08-24 18:07:00 [INFO] [TRAIN] epoch: 54, iter: 67200/160000, loss: 0.8470, lr: 0.000703, batch_cost: 0.1484, reader_cost: 0.00448, ips: 53.8916 samples/sec | ETA 03:49:35 2022-08-24 18:07:08 [INFO] [TRAIN] epoch: 54, iter: 67250/160000, loss: 0.7763, lr: 0.000702, batch_cost: 0.1620, reader_cost: 0.00044, ips: 49.3791 samples/sec | ETA 04:10:26 2022-08-24 18:07:17 [INFO] [TRAIN] epoch: 54, iter: 67300/160000, loss: 0.7760, lr: 0.000702, batch_cost: 0.1699, reader_cost: 0.00046, ips: 47.0940 samples/sec | ETA 04:22:27 2022-08-24 18:07:26 [INFO] [TRAIN] epoch: 54, iter: 67350/160000, loss: 0.7901, lr: 0.000701, batch_cost: 0.1827, reader_cost: 0.00053, ips: 43.7763 samples/sec | ETA 04:42:11 2022-08-24 18:07:35 [INFO] [TRAIN] epoch: 54, iter: 67400/160000, loss: 0.8323, lr: 0.000701, batch_cost: 0.1747, reader_cost: 0.00097, ips: 45.7801 samples/sec | ETA 04:29:41 2022-08-24 18:07:43 [INFO] [TRAIN] epoch: 54, iter: 67450/160000, loss: 0.8363, lr: 0.000701, batch_cost: 0.1753, reader_cost: 0.00073, ips: 45.6362 samples/sec | ETA 04:30:23 2022-08-24 18:07:52 [INFO] [TRAIN] epoch: 54, iter: 67500/160000, loss: 0.7809, lr: 0.000700, batch_cost: 0.1698, reader_cost: 0.00694, ips: 47.1185 samples/sec | ETA 04:21:45 2022-08-24 18:08:00 [INFO] [TRAIN] epoch: 54, iter: 67550/160000, loss: 0.7869, lr: 0.000700, batch_cost: 0.1562, reader_cost: 0.00085, ips: 51.2221 samples/sec | ETA 04:00:39 2022-08-24 18:08:08 [INFO] [TRAIN] epoch: 54, iter: 67600/160000, loss: 0.7985, lr: 0.000700, batch_cost: 0.1716, reader_cost: 0.00389, ips: 46.6141 samples/sec | ETA 04:24:17 2022-08-24 18:08:18 [INFO] [TRAIN] epoch: 54, iter: 67650/160000, loss: 0.8543, lr: 0.000699, batch_cost: 0.1993, reader_cost: 0.00066, ips: 40.1314 samples/sec | ETA 05:06:49 2022-08-24 18:08:27 [INFO] [TRAIN] epoch: 54, iter: 67700/160000, loss: 0.7916, lr: 0.000699, batch_cost: 0.1647, reader_cost: 0.00061, ips: 48.5597 samples/sec | ETA 04:13:26 2022-08-24 18:08:34 [INFO] [TRAIN] epoch: 54, iter: 67750/160000, loss: 0.7708, lr: 0.000698, batch_cost: 0.1536, reader_cost: 0.00178, ips: 52.0744 samples/sec | ETA 03:56:12 2022-08-24 18:08:43 [INFO] [TRAIN] epoch: 54, iter: 67800/160000, loss: 0.8357, lr: 0.000698, batch_cost: 0.1705, reader_cost: 0.00418, ips: 46.9325 samples/sec | ETA 04:21:56 2022-08-24 18:08:51 [INFO] [TRAIN] epoch: 54, iter: 67850/160000, loss: 0.8223, lr: 0.000698, batch_cost: 0.1653, reader_cost: 0.00966, ips: 48.3852 samples/sec | ETA 04:13:56 2022-08-24 18:09:01 [INFO] [TRAIN] epoch: 54, iter: 67900/160000, loss: 0.8234, lr: 0.000697, batch_cost: 0.1970, reader_cost: 0.00037, ips: 40.6193 samples/sec | ETA 05:02:19 2022-08-24 18:09:12 [INFO] [TRAIN] epoch: 54, iter: 67950/160000, loss: 0.8182, lr: 0.000697, batch_cost: 0.2267, reader_cost: 0.00356, ips: 35.2956 samples/sec | ETA 05:47:43 2022-08-24 18:09:23 [INFO] [TRAIN] epoch: 54, iter: 68000/160000, loss: 
0.8111, lr: 0.000697, batch_cost: 0.2154, reader_cost: 0.00042, ips: 37.1390 samples/sec | ETA 05:30:17 2022-08-24 18:09:23 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 234s - batch_cost: 0.2342 - reader cost: 7.6033e-04 2022-08-24 18:13:17 [INFO] [EVAL] #Images: 2000 mIoU: 0.3032 Acc: 0.7398 Kappa: 0.7202 Dice: 0.4274 2022-08-24 18:13:17 [INFO] [EVAL] Class IoU: [0.6512 0.7609 0.9235 0.7011 0.6683 0.7319 0.7705 0.7354 0.4744 0.6267 0.4364 0.5227 0.6394 0.2828 0.2235 0.3622 0.4528 0.4341 0.5471 0.3553 0.7138 0.4077 0.5427 0.4434 0.3363 0.4391 0.3816 0.3899 0.3732 0.274 0.2023 0.4226 0.2464 0.2536 0.2707 0.3825 0.3398 0.4319 0.2333 0.2587 0.0785 0.0483 0.3001 0.2166 0.2937 0.2383 0.2174 0.3768 0.6303 0.4587 0.4032 0.2087 0.1975 0.1686 0.6072 0.4158 0.8376 0.3726 0.3267 0.1649 0.1081 0.1152 0.2453 0.193 0.3661 0.5965 0.2028 0.359 0.037 0.3361 0.3187 0.4158 0.3626 0.2028 0.4087 0.2569 0.3708 0.2321 0.1586 0.3675 0.5767 0.2725 0.2519 0.0103 0.4547 0.4792 0.0835 0.0574 0.3422 0.4202 0.3225 0.0008 0.1498 0.0385 0.0037 0.0038 0.1641 0.12 0.1689 0.2878 0.0734 0.0045 0.1767 0.6153 0.0087 0.5018 0.1055 0.4301 0.077 0.3064 0.0178 0.2679 0.0841 0.5898 0.6763 0.0054 0.3795 0.5167 0.052 0.1445 0.4522 0.009 0.2589 0.1348 0.2127 0.0984 0.4317 0.3081 0.3779 0.207 0.4965 0.0076 0.0476 0.1751 0.0805 0.1067 0.05 0.006 0.1248 0.2678 0.1889 0.024 0.2447 0.4516 0.3132 0.0015 0.2293 0.0182 0.0235 0.0295] 2022-08-24 18:13:17 [INFO] [EVAL] Class Precision: [0.7717 0.8378 0.9539 0.8147 0.7938 0.8505 0.8908 0.8149 0.62 0.7374 0.6727 0.6802 0.7228 0.467 0.431 0.5158 0.55 0.6174 0.7093 0.5265 0.79 0.6914 0.6922 0.5706 0.4712 0.6079 0.5174 0.6891 0.6667 0.4421 0.3685 0.5166 0.4616 0.3346 0.4208 0.493 0.5648 0.681 0.4571 0.445 0.1303 0.2315 0.5151 0.5495 0.3859 0.4349 0.3464 0.614 0.6698 0.5628 0.4921 0.2403 0.3598 0.6229 0.7138 0.7396 0.8864 0.5316 0.6799 0.309 0.1461 0.1968 0.3868 0.5748 0.4484 0.7014 0.353 0.4559 0.2262 0.602 0.4803 0.5408 0.6032 0.311 0.5798 0.4366 0.5931 0.4579 0.4429 0.6535 0.6608 0.5416 0.7545 0.0753 0.7036 0.6575 0.2487 0.3518 0.547 0.6276 0.3921 0.0011 0.2449 0.2709 0.0217 0.021 0.4867 0.6494 0.358 0.4445 0.6616 0.0079 0.7071 0.6676 0.0979 0.6325 0.2743 0.8101 0.1727 0.4883 0.2433 0.4461 0.2649 0.6799 0.6817 0.0449 0.7657 0.5352 0.1477 0.4085 0.8379 0.8162 0.7525 0.6259 0.5635 0.6535 0.7364 0.4013 0.6477 0.4206 0.5649 0.9388 0.2404 0.6866 0.5679 0.2848 0.4674 0.1241 0.3066 0.715 0.3218 0.0588 0.5664 0.5221 0.7937 0.0022 0.8422 0.4063 0.1956 0.6749] 2022-08-24 18:13:17 [INFO] [EVAL] Class Recall: [0.8066 0.8924 0.9666 0.8341 0.8087 0.84 0.8508 0.8829 0.6688 0.8067 0.554 0.6929 0.847 0.4176 0.317 0.5488 0.7191 0.5939 0.7053 0.5222 0.8809 0.4984 0.7153 0.6655 0.54 0.6125 0.5924 0.4731 0.4588 0.4189 0.3095 0.6991 0.3459 0.5118 0.4316 0.6305 0.4603 0.5414 0.3228 0.382 0.1647 0.0575 0.4183 0.2634 0.5513 0.3452 0.3685 0.4939 0.9146 0.7126 0.6906 0.6131 0.3045 0.1878 0.8026 0.4871 0.9384 0.5547 0.3861 0.2612 0.2939 0.2174 0.4012 0.2251 0.6658 0.7995 0.3228 0.6283 0.0423 0.4322 0.4865 0.6426 0.4763 0.3682 0.5807 0.3844 0.4973 0.32 0.1982 0.4564 0.8191 0.3543 0.2744 0.0118 0.5625 0.6385 0.1116 0.0642 0.4775 0.5597 0.6453 0.0028 0.2785 0.0429 0.0044 0.0046 0.1984 0.1284 0.2424 0.4495 0.0762 0.01 0.1907 0.887 0.0095 0.7083 0.1464 0.4783 0.1221 0.4513 0.0188 0.4014 0.1096 0.8165 0.9886 0.0061 0.4293 0.9373 0.0744 0.1827 0.4956 0.009 0.283 0.1466 0.2546 0.1038 0.5107 0.5702 0.4757 0.2896 0.804 0.0076 0.056 0.1903 0.0857 0.1458 0.053 0.0062 0.1738 0.2998 0.314 0.039 0.3011 
0.7699 0.341 0.0048 0.2396 0.0187 0.026 0.03 ] 2022-08-24 18:13:18 [INFO] [EVAL] The model with the best validation mIoU (0.3032) was saved at iter 68000. 2022-08-24 18:13:27 [INFO] [TRAIN] epoch: 54, iter: 68050/160000, loss: 0.8628, lr: 0.000696, batch_cost: 0.1967, reader_cost: 0.00345, ips: 40.6802 samples/sec | ETA 05:01:22 2022-08-24 18:13:36 [INFO] [TRAIN] epoch: 54, iter: 68100/160000, loss: 0.7785, lr: 0.000696, batch_cost: 0.1761, reader_cost: 0.00096, ips: 45.4346 samples/sec | ETA 04:29:41 2022-08-24 18:13:45 [INFO] [TRAIN] epoch: 54, iter: 68150/160000, loss: 0.8286, lr: 0.000695, batch_cost: 0.1817, reader_cost: 0.00054, ips: 44.0232 samples/sec | ETA 04:38:11 2022-08-24 18:13:55 [INFO] [TRAIN] epoch: 54, iter: 68200/160000, loss: 0.8193, lr: 0.000695, batch_cost: 0.1853, reader_cost: 0.00106, ips: 43.1665 samples/sec | ETA 04:43:33 2022-08-24 18:14:06 [INFO] [TRAIN] epoch: 55, iter: 68250/160000, loss: 0.8255, lr: 0.000695, batch_cost: 0.2236, reader_cost: 0.03764, ips: 35.7763 samples/sec | ETA 05:41:56 2022-08-24 18:14:15 [INFO] [TRAIN] epoch: 55, iter: 68300/160000, loss: 0.7492, lr: 0.000694, batch_cost: 0.1744, reader_cost: 0.00082, ips: 45.8651 samples/sec | ETA 04:26:34 2022-08-24 18:14:23 [INFO] [TRAIN] epoch: 55, iter: 68350/160000, loss: 0.8122, lr: 0.000694, batch_cost: 0.1776, reader_cost: 0.00034, ips: 45.0377 samples/sec | ETA 04:31:19 2022-08-24 18:14:31 [INFO] [TRAIN] epoch: 55, iter: 68400/160000, loss: 0.7618, lr: 0.000694, batch_cost: 0.1566, reader_cost: 0.00475, ips: 51.0876 samples/sec | ETA 03:59:03 2022-08-24 18:14:40 [INFO] [TRAIN] epoch: 55, iter: 68450/160000, loss: 0.8268, lr: 0.000693, batch_cost: 0.1817, reader_cost: 0.00074, ips: 44.0398 samples/sec | ETA 04:37:10 2022-08-24 18:14:49 [INFO] [TRAIN] epoch: 55, iter: 68500/160000, loss: 0.8173, lr: 0.000693, batch_cost: 0.1660, reader_cost: 0.01143, ips: 48.1994 samples/sec | ETA 04:13:06 2022-08-24 18:14:58 [INFO] [TRAIN] epoch: 55, iter: 68550/160000, loss: 0.8303, lr: 0.000692, batch_cost: 0.1843, reader_cost: 0.00090, ips: 43.4174 samples/sec | ETA 04:40:50 2022-08-24 18:15:06 [INFO] [TRAIN] epoch: 55, iter: 68600/160000, loss: 0.8259, lr: 0.000692, batch_cost: 0.1723, reader_cost: 0.00065, ips: 46.4308 samples/sec | ETA 04:22:28 2022-08-24 18:15:16 [INFO] [TRAIN] epoch: 55, iter: 68650/160000, loss: 0.8299, lr: 0.000692, batch_cost: 0.1926, reader_cost: 0.00062, ips: 41.5324 samples/sec | ETA 04:53:15 2022-08-24 18:15:25 [INFO] [TRAIN] epoch: 55, iter: 68700/160000, loss: 0.7746, lr: 0.000691, batch_cost: 0.1705, reader_cost: 0.00060, ips: 46.9283 samples/sec | ETA 04:19:24 2022-08-24 18:15:34 [INFO] [TRAIN] epoch: 55, iter: 68750/160000, loss: 0.7664, lr: 0.000691, batch_cost: 0.1797, reader_cost: 0.00378, ips: 44.5152 samples/sec | ETA 04:33:18 2022-08-24 18:15:43 [INFO] [TRAIN] epoch: 55, iter: 68800/160000, loss: 0.8295, lr: 0.000690, batch_cost: 0.1915, reader_cost: 0.00064, ips: 41.7809 samples/sec | ETA 04:51:02 2022-08-24 18:15:52 [INFO] [TRAIN] epoch: 55, iter: 68850/160000, loss: 0.8083, lr: 0.000690, batch_cost: 0.1740, reader_cost: 0.00048, ips: 45.9666 samples/sec | ETA 04:24:23 2022-08-24 18:16:03 [INFO] [TRAIN] epoch: 55, iter: 68900/160000, loss: 0.7694, lr: 0.000690, batch_cost: 0.2201, reader_cost: 0.00105, ips: 36.3413 samples/sec | ETA 05:34:14 2022-08-24 18:16:14 [INFO] [TRAIN] epoch: 55, iter: 68950/160000, loss: 0.8439, lr: 0.000689, batch_cost: 0.2139, reader_cost: 0.00118, ips: 37.3991 samples/sec | ETA 05:24:36 2022-08-24 18:16:23 [INFO] [TRAIN] epoch: 55, iter: 
69000/160000, loss: 0.8166, lr: 0.000689, batch_cost: 0.1931, reader_cost: 0.00046, ips: 41.4221 samples/sec | ETA 04:52:55 2022-08-24 18:16:23 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 229s - batch_cost: 0.2294 - reader cost: 8.3430e-04 2022-08-24 18:20:13 [INFO] [EVAL] #Images: 2000 mIoU: 0.3049 Acc: 0.7423 Kappa: 0.7220 Dice: 0.4284 2022-08-24 18:20:13 [INFO] [EVAL] Class IoU: [0.6519 0.7589 0.9259 0.6988 0.6632 0.7355 0.7585 0.731 0.4666 0.634 0.4167 0.5287 0.6526 0.2973 0.1589 0.3715 0.4904 0.3836 0.5426 0.357 0.7014 0.4312 0.5454 0.4387 0.333 0.3906 0.352 0.3611 0.3652 0.3123 0.2183 0.3992 0.2213 0.2356 0.2801 0.406 0.3317 0.4112 0.2489 0.2895 0.099 0.0716 0.2981 0.2144 0.2986 0.2617 0.2221 0.4041 0.6364 0.4976 0.4356 0.3078 0.1518 0.1701 0.6939 0.4654 0.8458 0.2285 0.2871 0.1629 0.0534 0.1489 0.2834 0.0442 0.3703 0.6232 0.1931 0.3745 0.0878 0.337 0.3053 0.3984 0.3699 0.2137 0.4269 0.2799 0.327 0.2132 0.1809 0.2846 0.6381 0.2602 0.2794 0.0186 0.4762 0.4618 0.0436 0.0623 0.3136 0.4149 0.3616 0. 0.2044 0.0462 0.002 0.0034 0.1679 0.1336 0.0398 0.3234 0.0586 0.0074 0.1452 0.7291 0.0126 0.5679 0.1542 0.4902 0.0783 0.315 0.0496 0.2949 0.0758 0.4665 0.6527 0.0012 0.3576 0.5792 0.0173 0.2878 0.4745 0.0059 0.2236 0.122 0.231 0.1911 0.4155 0.3075 0.4007 0.1612 0.5159 0.0075 0.0172 0.1783 0.108 0.1212 0.076 0.0122 0.1108 0.3492 0.211 0.0535 0.209 0.2656 0.3194 0. 0.2252 0.022 0.0286 0.0164] 2022-08-24 18:20:13 [INFO] [EVAL] Class Precision: [0.7454 0.8279 0.963 0.7996 0.7515 0.849 0.8685 0.8045 0.571 0.7526 0.693 0.6615 0.7504 0.4827 0.4698 0.5163 0.6571 0.6779 0.708 0.5391 0.7868 0.6029 0.6952 0.6212 0.4904 0.6104 0.5307 0.7729 0.6489 0.6215 0.4203 0.5873 0.5403 0.3267 0.4102 0.5423 0.5859 0.726 0.5057 0.4743 0.1807 0.2201 0.5591 0.4824 0.4149 0.4911 0.3384 0.6375 0.6972 0.6175 0.6087 0.417 0.3513 0.7453 0.7565 0.6708 0.8939 0.64 0.7476 0.315 0.0857 0.322 0.4231 0.7943 0.4722 0.7604 0.2837 0.5394 0.256 0.5975 0.522 0.485 0.6279 0.248 0.6157 0.5143 0.5692 0.6476 0.4089 0.6043 0.7611 0.6084 0.7073 0.0407 0.6127 0.7174 0.2465 0.3977 0.5099 0.5645 0.4704 0. 0.4349 0.2874 0.0102 0.0223 0.5982 0.5058 0.2733 0.6027 0.5919 0.0125 0.6024 0.883 0.2398 0.7169 0.5448 0.7871 0.1529 0.5196 0.3296 0.5066 0.3144 0.7604 0.6561 0.0217 0.6752 0.635 0.0914 0.6905 0.6394 0.7258 0.8941 0.6787 0.5894 0.6 0.7774 0.4544 0.594 0.3919 0.7047 0.8255 0.1877 0.6639 0.5598 0.2855 0.3541 0.0651 0.3521 0.6094 0.3018 0.0708 0.5834 0.5432 0.6559 0. 0.8436 0.257 0.24 0.5367] 2022-08-24 18:20:13 [INFO] [EVAL] Class Recall: [0.8386 0.901 0.9601 0.8472 0.8495 0.8461 0.8569 0.8888 0.7185 0.8009 0.5111 0.7248 0.8334 0.4363 0.1936 0.5697 0.6592 0.4691 0.6991 0.5139 0.866 0.6022 0.7169 0.5988 0.5092 0.5203 0.511 0.404 0.4551 0.3857 0.3123 0.5549 0.2726 0.458 0.469 0.6176 0.4332 0.4867 0.3289 0.4262 0.1798 0.096 0.3898 0.2784 0.5159 0.359 0.3924 0.5246 0.8795 0.7194 0.605 0.5403 0.211 0.1806 0.8935 0.6032 0.9402 0.2621 0.3179 0.2521 0.1241 0.2168 0.4619 0.0447 0.6316 0.7755 0.3767 0.5506 0.1179 0.436 0.4238 0.6904 0.4737 0.6074 0.5818 0.3805 0.4345 0.2413 0.2449 0.3498 0.798 0.3125 0.3159 0.033 0.6813 0.5645 0.0503 0.0688 0.4489 0.6102 0.6097 0. 
0.2784 0.0522 0.0024 0.004 0.1892 0.1537 0.0445 0.411 0.0611 0.0178 0.1605 0.8071 0.0131 0.7321 0.177 0.5651 0.1384 0.4444 0.0552 0.4137 0.0908 0.5468 0.9922 0.0013 0.4319 0.8682 0.0209 0.3304 0.6479 0.006 0.2297 0.1295 0.2753 0.219 0.4717 0.4877 0.5518 0.215 0.6582 0.0075 0.0186 0.196 0.118 0.1739 0.0882 0.0149 0.1391 0.45 0.4122 0.1794 0.2457 0.342 0.3837 0. 0.235 0.0235 0.0314 0.0166] 2022-08-24 18:20:13 [INFO] [EVAL] The model with the best validation mIoU (0.3049) was saved at iter 69000. 2022-08-24 18:20:24 [INFO] [TRAIN] epoch: 55, iter: 69050/160000, loss: 0.7973, lr: 0.000689, batch_cost: 0.2194, reader_cost: 0.00463, ips: 36.4588 samples/sec | ETA 05:32:36 2022-08-24 18:20:34 [INFO] [TRAIN] epoch: 55, iter: 69100/160000, loss: 0.7849, lr: 0.000688, batch_cost: 0.2006, reader_cost: 0.00124, ips: 39.8833 samples/sec | ETA 05:03:53 2022-08-24 18:20:43 [INFO] [TRAIN] epoch: 55, iter: 69150/160000, loss: 0.8095, lr: 0.000688, batch_cost: 0.1839, reader_cost: 0.00064, ips: 43.4911 samples/sec | ETA 04:38:31 2022-08-24 18:20:54 [INFO] [TRAIN] epoch: 55, iter: 69200/160000, loss: 0.8138, lr: 0.000687, batch_cost: 0.2191, reader_cost: 0.00045, ips: 36.5193 samples/sec | ETA 05:31:30 2022-08-24 18:21:05 [INFO] [TRAIN] epoch: 55, iter: 69250/160000, loss: 0.7981, lr: 0.000687, batch_cost: 0.2187, reader_cost: 0.00072, ips: 36.5748 samples/sec | ETA 05:30:49 2022-08-24 18:21:15 [INFO] [TRAIN] epoch: 55, iter: 69300/160000, loss: 0.7515, lr: 0.000687, batch_cost: 0.2040, reader_cost: 0.00065, ips: 39.2221 samples/sec | ETA 05:08:19 2022-08-24 18:21:25 [INFO] [TRAIN] epoch: 55, iter: 69350/160000, loss: 0.8061, lr: 0.000686, batch_cost: 0.1982, reader_cost: 0.00032, ips: 40.3629 samples/sec | ETA 04:59:27 2022-08-24 18:21:33 [INFO] [TRAIN] epoch: 55, iter: 69400/160000, loss: 0.7997, lr: 0.000686, batch_cost: 0.1587, reader_cost: 0.00047, ips: 50.4097 samples/sec | ETA 03:59:38 2022-08-24 18:21:42 [INFO] [TRAIN] epoch: 55, iter: 69450/160000, loss: 0.8826, lr: 0.000686, batch_cost: 0.1867, reader_cost: 0.00042, ips: 42.8544 samples/sec | ETA 04:41:43 2022-08-24 18:21:55 [INFO] [TRAIN] epoch: 56, iter: 69500/160000, loss: 0.7900, lr: 0.000685, batch_cost: 0.2424, reader_cost: 0.04493, ips: 33.0072 samples/sec | ETA 06:05:34 2022-08-24 18:22:04 [INFO] [TRAIN] epoch: 56, iter: 69550/160000, loss: 0.7803, lr: 0.000685, batch_cost: 0.1927, reader_cost: 0.00092, ips: 41.5251 samples/sec | ETA 04:50:25 2022-08-24 18:22:13 [INFO] [TRAIN] epoch: 56, iter: 69600/160000, loss: 0.7443, lr: 0.000684, batch_cost: 0.1806, reader_cost: 0.00051, ips: 44.2990 samples/sec | ETA 04:32:05 2022-08-24 18:22:22 [INFO] [TRAIN] epoch: 56, iter: 69650/160000, loss: 0.7780, lr: 0.000684, batch_cost: 0.1797, reader_cost: 0.00048, ips: 44.5082 samples/sec | ETA 04:30:39 2022-08-24 18:22:31 [INFO] [TRAIN] epoch: 56, iter: 69700/160000, loss: 0.8350, lr: 0.000684, batch_cost: 0.1839, reader_cost: 0.00060, ips: 43.4930 samples/sec | ETA 04:36:49 2022-08-24 18:22:42 [INFO] [TRAIN] epoch: 56, iter: 69750/160000, loss: 0.8043, lr: 0.000683, batch_cost: 0.2010, reader_cost: 0.00071, ips: 39.7950 samples/sec | ETA 05:02:22 2022-08-24 18:22:52 [INFO] [TRAIN] epoch: 56, iter: 69800/160000, loss: 0.7728, lr: 0.000683, batch_cost: 0.2146, reader_cost: 0.00520, ips: 37.2739 samples/sec | ETA 05:22:39 2022-08-24 18:23:04 [INFO] [TRAIN] epoch: 56, iter: 69850/160000, loss: 0.8115, lr: 0.000683, batch_cost: 0.2264, reader_cost: 0.01986, ips: 35.3428 samples/sec | ETA 05:40:05 2022-08-24 18:23:14 [INFO] [TRAIN] epoch: 56, iter: 
69900/160000, loss: 0.8004, lr: 0.000682, batch_cost: 0.2007, reader_cost: 0.00599, ips: 39.8533 samples/sec | ETA 05:01:26 2022-08-24 18:23:26 [INFO] [TRAIN] epoch: 56, iter: 69950/160000, loss: 0.8069, lr: 0.000682, batch_cost: 0.2389, reader_cost: 0.01216, ips: 33.4884 samples/sec | ETA 05:58:31 2022-08-24 18:23:36 [INFO] [TRAIN] epoch: 56, iter: 70000/160000, loss: 0.8121, lr: 0.000681, batch_cost: 0.1992, reader_cost: 0.00161, ips: 40.1674 samples/sec | ETA 04:58:44 2022-08-24 18:23:36 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 233s - batch_cost: 0.2326 - reader cost: 8.5361e-04 2022-08-24 18:27:29 [INFO] [EVAL] #Images: 2000 mIoU: 0.3006 Acc: 0.7407 Kappa: 0.7206 Dice: 0.4236 2022-08-24 18:27:29 [INFO] [EVAL] Class IoU: [0.6496 0.7591 0.9252 0.7015 0.672 0.736 0.7587 0.7398 0.4743 0.6108 0.4517 0.5198 0.6568 0.2724 0.179 0.3666 0.4286 0.4258 0.5426 0.3473 0.7043 0.4075 0.5428 0.4436 0.3162 0.4251 0.3997 0.3842 0.3687 0.2854 0.2083 0.4092 0.2519 0.2671 0.2681 0.3691 0.3365 0.45 0.2271 0.2499 0.0808 0.0829 0.2924 0.2078 0.3075 0.2518 0.1922 0.3909 0.647 0.4755 0.4386 0.2394 0.1205 0.1704 0.6681 0.4295 0.8294 0.2819 0.3416 0.2107 0.074 0.1377 0.3021 0.0543 0.3464 0.6365 0.2384 0.3763 0.0862 0.3359 0.2913 0.3798 0.3828 0.223 0.4104 0.2606 0.3337 0.2 0.3006 0.2551 0.6423 0.2641 0.2729 0.0143 0.4658 0.4825 0.0694 0.0483 0.3582 0.4168 0.3354 0.0057 0.1085 0.0729 0.0238 0.0017 0.1322 0.1345 0.0099 0.339 0.0334 0.0027 0.1436 0.4909 0.0099 0.4977 0.1542 0.4609 0.0718 0.2715 0.0432 0.2503 0.0753 0.5069 0.6545 0.0015 0.3577 0.5252 0.0144 0.169 0.4204 0.0013 0.2409 0.1407 0.2158 0.1829 0.4238 0.2891 0.3883 0.1166 0.5028 0.0077 0.0608 0.1423 0.0824 0.1016 0.0447 0.0147 0.1182 0.292 0.1526 0.017 0.2426 0.4615 0.3421 0.0049 0.2257 0.01 0.0575 0.0343] 2022-08-24 18:27:29 [INFO] [EVAL] Class Precision: [0.7477 0.8392 0.9635 0.7968 0.7874 0.871 0.8572 0.8138 0.5766 0.7453 0.7144 0.6819 0.768 0.4985 0.4765 0.5217 0.5459 0.63 0.6958 0.5593 0.7735 0.601 0.6612 0.5561 0.4786 0.5538 0.4933 0.652 0.6432 0.4817 0.3834 0.5733 0.5179 0.376 0.4042 0.5036 0.5531 0.683 0.4907 0.5297 0.1739 0.2051 0.5221 0.5877 0.4175 0.4083 0.3007 0.6047 0.7372 0.5846 0.5848 0.295 0.3903 0.7234 0.7528 0.6753 0.8755 0.5932 0.7208 0.3403 0.1257 0.2988 0.4923 0.746 0.4231 0.7768 0.3677 0.5644 0.2789 0.7206 0.5441 0.4386 0.6339 0.2823 0.5498 0.4502 0.4781 0.4381 0.7848 0.6209 0.7428 0.5793 0.7551 0.0801 0.661 0.6575 0.1728 0.4394 0.5964 0.6047 0.4168 0.0072 0.2623 0.2776 0.0632 0.0102 0.5641 0.4158 0.2118 0.493 0.5233 0.0046 0.6671 0.5902 0.1798 0.8048 0.3479 0.7583 0.1815 0.3815 0.1921 0.3927 0.2813 0.7259 0.6573 0.028 0.7448 0.5712 0.0462 0.5429 0.8185 0.0978 0.6499 0.6987 0.6303 0.6032 0.7101 0.3898 0.5499 0.402 0.5902 0.5048 0.3026 0.6073 0.5616 0.3636 0.4886 0.0544 0.4454 0.6971 0.3365 0.053 0.5163 0.5868 0.7002 0.0077 0.8673 0.3575 0.3149 0.5879] 2022-08-24 18:27:29 [INFO] [EVAL] Class Recall: [0.8319 0.8882 0.9587 0.8543 0.8209 0.826 0.8685 0.8906 0.7277 0.7719 0.5512 0.6862 0.8194 0.3752 0.2228 0.5522 0.6662 0.5677 0.7114 0.4782 0.8874 0.5586 0.7519 0.6868 0.4823 0.6464 0.6779 0.4834 0.4635 0.4119 0.3132 0.5884 0.3291 0.4798 0.4432 0.5801 0.4621 0.5687 0.2972 0.3211 0.1311 0.1222 0.3993 0.2432 0.5387 0.3965 0.3477 0.5251 0.8411 0.7182 0.6369 0.5596 0.1485 0.1823 0.8559 0.5413 0.9404 0.3494 0.3936 0.3561 0.1524 0.2034 0.4389 0.0553 0.6563 0.7789 0.4041 0.5303 0.1109 0.3862 0.3854 0.739 0.4915 0.515 0.6181 0.3821 0.5249 0.2689 0.3276 0.3022 0.826 0.3268 0.2995 0.0171 0.612 0.6445 0.104 0.0514 
0.4729 0.5729 0.6321 0.0265 0.1561 0.0899 0.0367 0.002 0.1473 0.1658 0.0103 0.5204 0.0344 0.0063 0.1547 0.7449 0.0103 0.566 0.2169 0.5403 0.1063 0.485 0.0527 0.4084 0.0932 0.6269 0.9935 0.0015 0.4077 0.867 0.0205 0.1971 0.4636 0.0013 0.2769 0.1498 0.2471 0.2079 0.5124 0.5279 0.5693 0.1411 0.7726 0.0078 0.0707 0.1568 0.0881 0.1236 0.0469 0.0198 0.1385 0.3345 0.2183 0.0244 0.314 0.6835 0.4008 0.013 0.2338 0.0102 0.0657 0.0352] 2022-08-24 18:27:29 [INFO] [EVAL] The model with the best validation mIoU (0.3049) was saved at iter 69000. 2022-08-24 18:27:37 [INFO] [TRAIN] epoch: 56, iter: 70050/160000, loss: 0.8276, lr: 0.000681, batch_cost: 0.1747, reader_cost: 0.00631, ips: 45.7903 samples/sec | ETA 04:21:55 2022-08-24 18:27:46 [INFO] [TRAIN] epoch: 56, iter: 70100/160000, loss: 0.7988, lr: 0.000681, batch_cost: 0.1673, reader_cost: 0.00129, ips: 47.8244 samples/sec | ETA 04:10:38 2022-08-24 18:27:55 [INFO] [TRAIN] epoch: 56, iter: 70150/160000, loss: 0.8100, lr: 0.000680, batch_cost: 0.1824, reader_cost: 0.00033, ips: 43.8644 samples/sec | ETA 04:33:06 2022-08-24 18:28:03 [INFO] [TRAIN] epoch: 56, iter: 70200/160000, loss: 0.7867, lr: 0.000680, batch_cost: 0.1714, reader_cost: 0.00074, ips: 46.6702 samples/sec | ETA 04:16:33 2022-08-24 18:28:12 [INFO] [TRAIN] epoch: 56, iter: 70250/160000, loss: 0.7615, lr: 0.000680, batch_cost: 0.1769, reader_cost: 0.00092, ips: 45.2331 samples/sec | ETA 04:24:33 2022-08-24 18:28:21 [INFO] [TRAIN] epoch: 56, iter: 70300/160000, loss: 0.8008, lr: 0.000679, batch_cost: 0.1667, reader_cost: 0.00045, ips: 47.9901 samples/sec | ETA 04:09:13 2022-08-24 18:28:29 [INFO] [TRAIN] epoch: 56, iter: 70350/160000, loss: 0.8120, lr: 0.000679, batch_cost: 0.1744, reader_cost: 0.00222, ips: 45.8678 samples/sec | ETA 04:20:36 2022-08-24 18:28:37 [INFO] [TRAIN] epoch: 56, iter: 70400/160000, loss: 0.8313, lr: 0.000678, batch_cost: 0.1609, reader_cost: 0.00043, ips: 49.7049 samples/sec | ETA 04:00:21 2022-08-24 18:28:46 [INFO] [TRAIN] epoch: 56, iter: 70450/160000, loss: 0.7869, lr: 0.000678, batch_cost: 0.1694, reader_cost: 0.00063, ips: 47.2168 samples/sec | ETA 04:12:52 2022-08-24 18:28:54 [INFO] [TRAIN] epoch: 56, iter: 70500/160000, loss: 0.8227, lr: 0.000678, batch_cost: 0.1702, reader_cost: 0.00048, ips: 47.0042 samples/sec | ETA 04:13:52 2022-08-24 18:29:04 [INFO] [TRAIN] epoch: 56, iter: 70550/160000, loss: 0.8185, lr: 0.000677, batch_cost: 0.1916, reader_cost: 0.00045, ips: 41.7439 samples/sec | ETA 04:45:42 2022-08-24 18:29:13 [INFO] [TRAIN] epoch: 56, iter: 70600/160000, loss: 0.7659, lr: 0.000677, batch_cost: 0.1854, reader_cost: 0.00043, ips: 43.1525 samples/sec | ETA 04:36:13 2022-08-24 18:29:22 [INFO] [TRAIN] epoch: 56, iter: 70650/160000, loss: 0.7995, lr: 0.000676, batch_cost: 0.1734, reader_cost: 0.00572, ips: 46.1372 samples/sec | ETA 04:18:12 2022-08-24 18:29:31 [INFO] [TRAIN] epoch: 56, iter: 70700/160000, loss: 0.8470, lr: 0.000676, batch_cost: 0.1844, reader_cost: 0.00041, ips: 43.3811 samples/sec | ETA 04:34:27 2022-08-24 18:29:44 [INFO] [TRAIN] epoch: 57, iter: 70750/160000, loss: 0.8084, lr: 0.000676, batch_cost: 0.2672, reader_cost: 0.04782, ips: 29.9365 samples/sec | ETA 06:37:30 2022-08-24 18:29:54 [INFO] [TRAIN] epoch: 57, iter: 70800/160000, loss: 0.8029, lr: 0.000675, batch_cost: 0.2006, reader_cost: 0.00048, ips: 39.8747 samples/sec | ETA 04:58:16 2022-08-24 18:30:05 [INFO] [TRAIN] epoch: 57, iter: 70850/160000, loss: 0.7866, lr: 0.000675, batch_cost: 0.2081, reader_cost: 0.00123, ips: 38.4439 samples/sec | ETA 05:09:11 2022-08-24 18:30:16 
[INFO] [TRAIN] epoch: 57, iter: 70900/160000, loss: 0.7941, lr: 0.000675, batch_cost: 0.2205, reader_cost: 0.00165, ips: 36.2791 samples/sec | ETA 05:27:27 2022-08-24 18:30:26 [INFO] [TRAIN] epoch: 57, iter: 70950/160000, loss: 0.7968, lr: 0.000674, batch_cost: 0.2095, reader_cost: 0.00117, ips: 38.1830 samples/sec | ETA 05:10:57 2022-08-24 18:30:38 [INFO] [TRAIN] epoch: 57, iter: 71000/160000, loss: 0.7854, lr: 0.000674, batch_cost: 0.2339, reader_cost: 0.00060, ips: 34.2068 samples/sec | ETA 05:46:54 2022-08-24 18:30:38 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 231s - batch_cost: 0.2307 - reader cost: 7.3109e-04 2022-08-24 18:34:29 [INFO] [EVAL] #Images: 2000 mIoU: 0.3037 Acc: 0.7412 Kappa: 0.7211 Dice: 0.4268 2022-08-24 18:34:29 [INFO] [EVAL] Class IoU: [0.6516 0.7615 0.9265 0.7054 0.6596 0.7379 0.7614 0.7305 0.47 0.5964 0.4499 0.5338 0.6489 0.2582 0.2116 0.3601 0.4777 0.4031 0.5441 0.3619 0.7112 0.3886 0.5452 0.4594 0.3185 0.4015 0.3539 0.4054 0.3642 0.2342 0.1838 0.379 0.2424 0.2683 0.3154 0.3803 0.3367 0.4899 0.2365 0.2462 0.0843 0.071 0.3054 0.225 0.2717 0.264 0.2399 0.4197 0.5927 0.4525 0.4217 0.2287 0.1693 0.1706 0.6322 0.487 0.8398 0.2591 0.4725 0.1801 0.0704 0.1598 0.2499 0.1202 0.355 0.5956 0.1712 0.345 0.0719 0.3374 0.3279 0.4035 0.3627 0.2397 0.4181 0.2708 0.3723 0.2311 0.3076 0.3275 0.6284 0.2653 0.3262 0.0133 0.5224 0.4624 0.0651 0.0686 0.1977 0.3953 0.3847 0.0015 0.1536 0.0397 0.0195 0.0042 0.1755 0.1334 0.0915 0.362 0.0631 0.0034 0.0926 0.7361 0.0103 0.482 0.1869 0.4952 0.0421 0.2284 0.0229 0.215 0.0898 0.5667 0.7067 0. 0.3248 0.5274 0.093 0.0852 0.4866 0.0016 0.2851 0.0937 0.199 0.1257 0.4519 0.2936 0.3467 0.2214 0.541 0.0124 0.039 0.1908 0.0559 0.1204 0.0558 0.0073 0.1271 0.3081 0.1519 0.0681 0.2375 0.1806 0.3418 0. 0.2017 0.018 0.0426 0.0292] 2022-08-24 18:34:29 [INFO] [EVAL] Class Precision: [0.7516 0.8323 0.9604 0.8151 0.7502 0.8508 0.8676 0.7896 0.6507 0.7593 0.6943 0.677 0.7326 0.5113 0.4529 0.5511 0.5968 0.6299 0.7158 0.5167 0.7866 0.5912 0.7136 0.5812 0.5529 0.6052 0.506 0.6827 0.6635 0.3179 0.4167 0.6124 0.5171 0.388 0.476 0.5119 0.6022 0.6915 0.5202 0.5087 0.1582 0.2074 0.5393 0.5078 0.3375 0.4452 0.3969 0.656 0.75 0.5484 0.5565 0.261 0.4346 0.6678 0.7294 0.7383 0.8877 0.5524 0.6027 0.3163 0.106 0.4968 0.3056 0.599 0.4513 0.6883 0.3352 0.5259 0.296 0.6525 0.4779 0.6024 0.6452 0.2977 0.6187 0.3832 0.6389 0.509 0.4584 0.4843 0.7315 0.6267 0.6579 0.0405 0.6803 0.5797 0.2215 0.3613 0.2422 0.5316 0.5471 0.0019 0.3724 0.2638 0.0467 0.0227 0.5155 0.5006 0.5625 0.5326 0.6663 0.006 0.7154 0.8858 0.0666 0.703 0.5433 0.756 0.1139 0.4853 0.2424 0.57 0.3646 0.6691 0.7134 0. 0.7984 0.5615 0.1455 0.5844 0.7219 0.2455 0.6932 0.5997 0.6091 0.6205 0.8293 0.4039 0.5666 0.4465 0.7292 0.4431 0.3861 0.5306 0.6498 0.202 0.4102 0.1884 0.3561 0.6917 0.189 0.0888 0.569 0.4088 0.6894 0. 
0.8898 0.3606 0.1791 0.6294] 2022-08-24 18:34:29 [INFO] [EVAL] Class Recall: [0.8305 0.8995 0.9633 0.8398 0.8453 0.8476 0.8616 0.9071 0.6286 0.7354 0.561 0.7162 0.8503 0.3428 0.2842 0.5095 0.7054 0.5282 0.694 0.5471 0.8813 0.5313 0.6979 0.6866 0.4289 0.544 0.5408 0.4995 0.4467 0.4707 0.2474 0.4986 0.3133 0.4652 0.4832 0.5966 0.433 0.6269 0.3024 0.3231 0.1527 0.0975 0.4132 0.2878 0.5819 0.3934 0.3774 0.5381 0.7387 0.7214 0.6351 0.6484 0.2171 0.1864 0.8258 0.5886 0.9396 0.328 0.6863 0.295 0.1735 0.1906 0.5783 0.1308 0.6246 0.8156 0.2592 0.5007 0.0867 0.4113 0.5111 0.55 0.453 0.5516 0.5632 0.48 0.4715 0.2974 0.4832 0.5027 0.8169 0.3151 0.3928 0.0195 0.6924 0.6956 0.0845 0.078 0.5181 0.6066 0.5644 0.0065 0.2072 0.0447 0.0323 0.0051 0.2101 0.1539 0.0985 0.5306 0.0651 0.0077 0.0961 0.8133 0.0121 0.6053 0.2217 0.5895 0.0627 0.3015 0.0247 0.2567 0.1065 0.7874 0.987 0. 0.3538 0.8968 0.205 0.0907 0.5988 0.0016 0.3263 0.0999 0.2282 0.1362 0.4982 0.5181 0.4717 0.3052 0.6769 0.0126 0.0415 0.2296 0.0576 0.2297 0.0606 0.0075 0.165 0.3571 0.4362 0.2258 0.2896 0.2444 0.404 0. 0.2069 0.0186 0.053 0.0297] 2022-08-24 18:34:29 [INFO] [EVAL] The model with the best validation mIoU (0.3049) was saved at iter 69000. 2022-08-24 18:34:38 [INFO] [TRAIN] epoch: 57, iter: 71050/160000, loss: 0.7646, lr: 0.000673, batch_cost: 0.1773, reader_cost: 0.00321, ips: 45.1115 samples/sec | ETA 04:22:54 2022-08-24 18:34:46 [INFO] [TRAIN] epoch: 57, iter: 71100/160000, loss: 0.8279, lr: 0.000673, batch_cost: 0.1671, reader_cost: 0.00131, ips: 47.8870 samples/sec | ETA 04:07:31 2022-08-24 18:34:55 [INFO] [TRAIN] epoch: 57, iter: 71150/160000, loss: 0.8370, lr: 0.000673, batch_cost: 0.1662, reader_cost: 0.00788, ips: 48.1403 samples/sec | ETA 04:06:05 2022-08-24 18:35:04 [INFO] [TRAIN] epoch: 57, iter: 71200/160000, loss: 0.7711, lr: 0.000672, batch_cost: 0.1895, reader_cost: 0.00341, ips: 42.2131 samples/sec | ETA 04:40:28 2022-08-24 18:35:14 [INFO] [TRAIN] epoch: 57, iter: 71250/160000, loss: 0.8072, lr: 0.000672, batch_cost: 0.1872, reader_cost: 0.00056, ips: 42.7246 samples/sec | ETA 04:36:58 2022-08-24 18:35:23 [INFO] [TRAIN] epoch: 57, iter: 71300/160000, loss: 0.8313, lr: 0.000672, batch_cost: 0.1898, reader_cost: 0.00179, ips: 42.1480 samples/sec | ETA 04:40:35 2022-08-24 18:35:32 [INFO] [TRAIN] epoch: 57, iter: 71350/160000, loss: 0.7623, lr: 0.000671, batch_cost: 0.1822, reader_cost: 0.00083, ips: 43.9169 samples/sec | ETA 04:29:08 2022-08-24 18:35:42 [INFO] [TRAIN] epoch: 57, iter: 71400/160000, loss: 0.7564, lr: 0.000671, batch_cost: 0.1999, reader_cost: 0.00069, ips: 40.0275 samples/sec | ETA 04:55:07 2022-08-24 18:35:50 [INFO] [TRAIN] epoch: 57, iter: 71450/160000, loss: 0.7863, lr: 0.000670, batch_cost: 0.1603, reader_cost: 0.00971, ips: 49.9027 samples/sec | ETA 03:56:35 2022-08-24 18:35:59 [INFO] [TRAIN] epoch: 57, iter: 71500/160000, loss: 0.8276, lr: 0.000670, batch_cost: 0.1767, reader_cost: 0.00042, ips: 45.2785 samples/sec | ETA 04:20:36 2022-08-24 18:36:08 [INFO] [TRAIN] epoch: 57, iter: 71550/160000, loss: 0.7885, lr: 0.000670, batch_cost: 0.1845, reader_cost: 0.00035, ips: 43.3519 samples/sec | ETA 04:32:02 2022-08-24 18:36:19 [INFO] [TRAIN] epoch: 57, iter: 71600/160000, loss: 0.8382, lr: 0.000669, batch_cost: 0.2101, reader_cost: 0.00038, ips: 38.0708 samples/sec | ETA 05:09:35 2022-08-24 18:36:31 [INFO] [TRAIN] epoch: 57, iter: 71650/160000, loss: 0.7520, lr: 0.000669, batch_cost: 0.2479, reader_cost: 0.00052, ips: 32.2763 samples/sec | ETA 06:04:58 2022-08-24 18:36:42 [INFO] [TRAIN] epoch: 57, iter: 
71700/160000, loss: 0.8143, lr: 0.000669, batch_cost: 0.2135, reader_cost: 0.00415, ips: 37.4790 samples/sec | ETA 05:14:07 2022-08-24 18:36:52 [INFO] [TRAIN] epoch: 57, iter: 71750/160000, loss: 0.7890, lr: 0.000668, batch_cost: 0.1980, reader_cost: 0.00049, ips: 40.4082 samples/sec | ETA 04:51:11 2022-08-24 18:37:01 [INFO] [TRAIN] epoch: 57, iter: 71800/160000, loss: 0.8389, lr: 0.000668, batch_cost: 0.1922, reader_cost: 0.00189, ips: 41.6309 samples/sec | ETA 04:42:28 2022-08-24 18:37:13 [INFO] [TRAIN] epoch: 57, iter: 71850/160000, loss: 0.7790, lr: 0.000667, batch_cost: 0.2328, reader_cost: 0.00079, ips: 34.3694 samples/sec | ETA 05:41:58 2022-08-24 18:37:25 [INFO] [TRAIN] epoch: 57, iter: 71900/160000, loss: 0.8136, lr: 0.000667, batch_cost: 0.2427, reader_cost: 0.00046, ips: 32.9620 samples/sec | ETA 05:56:22 2022-08-24 18:37:38 [INFO] [TRAIN] epoch: 57, iter: 71950/160000, loss: 0.8003, lr: 0.000667, batch_cost: 0.2497, reader_cost: 0.00082, ips: 32.0388 samples/sec | ETA 06:06:25 2022-08-24 18:37:51 [INFO] [TRAIN] epoch: 58, iter: 72000/160000, loss: 0.8231, lr: 0.000666, batch_cost: 0.2582, reader_cost: 0.04048, ips: 30.9797 samples/sec | ETA 06:18:44 2022-08-24 18:37:51 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 216s - batch_cost: 0.2160 - reader cost: 8.7205e-04 2022-08-24 18:41:27 [INFO] [EVAL] #Images: 2000 mIoU: 0.2993 Acc: 0.7415 Kappa: 0.7215 Dice: 0.4201 2022-08-24 18:41:27 [INFO] [EVAL] Class IoU: [0.6553 0.7562 0.9246 0.7012 0.6487 0.7405 0.7583 0.7244 0.4678 0.63 0.4504 0.5164 0.656 0.2799 0.2499 0.372 0.4759 0.3789 0.5367 0.3641 0.7245 0.4123 0.5596 0.4605 0.3258 0.3714 0.3585 0.3974 0.3533 0.3278 0.1951 0.3898 0.1952 0.2928 0.3746 0.3482 0.3309 0.4575 0.2376 0.2223 0.0954 0.0453 0.2924 0.1963 0.2736 0.2511 0.2243 0.3895 0.6265 0.459 0.4342 0.2948 0.1933 0.1476 0.6641 0.4264 0.842 0.3258 0.444 0.1718 0.0581 0.1564 0.2934 0.0958 0.3572 0.6346 0.1879 0.3481 0.1062 0.2821 0.3074 0.3994 0.3702 0.1973 0.4304 0.2944 0.3458 0.2137 0.1381 0.1358 0.6612 0.2661 0.2185 0.0131 0.5151 0.4818 0.0529 0.045 0.3307 0.4363 0.3787 0.0093 0.2009 0.0358 0.0172 0.0037 0.1679 0.1436 0.0173 0.3223 0.0091 0.0062 0.1578 0.6584 0.0026 0.4691 0.1223 0.4121 0.0594 0.2805 0.0671 0.2659 0.0797 0.5856 0.6651 0.0021 0.3103 0.5905 0.0467 0.0347 0.463 0.0049 0.2593 0.1163 0.2251 0.1452 0.447 0.3062 0.311 0.2124 0.5536 0.0051 0.0356 0.2095 0.0643 0.1011 0.0594 0.0157 0.1195 0.2794 0.0638 0.0599 0.2219 0.0619 0.3193 0. 
0.2423 0.0209 0.0304 0.0252] 2022-08-24 18:41:27 [INFO] [EVAL] Class Precision: [0.7637 0.8248 0.9615 0.8033 0.7295 0.8571 0.8719 0.8081 0.6355 0.7631 0.6465 0.6687 0.7436 0.5042 0.4087 0.5518 0.6287 0.6733 0.7073 0.5003 0.8218 0.5854 0.7529 0.5737 0.5204 0.4438 0.4987 0.6271 0.7138 0.4955 0.4026 0.5573 0.3918 0.4642 0.5856 0.4944 0.6088 0.7492 0.4309 0.5429 0.1865 0.2484 0.4495 0.6147 0.4091 0.4362 0.3379 0.591 0.7345 0.5815 0.5713 0.3776 0.3726 0.7708 0.7261 0.5639 0.8973 0.5752 0.6076 0.3171 0.1031 0.3151 0.3773 0.5966 0.4509 0.801 0.377 0.5728 0.2354 0.4697 0.5482 0.5298 0.62 0.2352 0.6464 0.5293 0.5602 0.5382 0.5254 0.6438 0.792 0.5765 0.7671 0.0752 0.7213 0.6861 0.2814 0.435 0.6903 0.676 0.5005 0.0135 0.431 0.2464 0.047 0.0194 0.451 0.4841 0.3324 0.4674 0.6846 0.0107 0.698 0.8229 0.116 0.6591 0.3142 0.9593 0.2877 0.4982 0.2285 0.4357 0.3391 0.7213 0.6691 0.1178 0.7752 0.69 0.1164 0.4262 0.6912 0.3738 0.9022 0.5763 0.6013 0.585 0.7609 0.4024 0.5715 0.4131 0.7309 0.515 0.3271 0.6328 0.5989 0.3227 0.3134 0.1987 0.3774 0.7023 0.1117 0.0793 0.614 0.2541 0.6756 0. 0.8014 0.4243 0.2674 0.8057] 2022-08-24 18:41:27 [INFO] [EVAL] Class Recall: [0.8219 0.9009 0.9601 0.8465 0.8541 0.8448 0.8533 0.8749 0.6393 0.7833 0.5975 0.694 0.8477 0.3862 0.3915 0.533 0.6619 0.4643 0.6899 0.5722 0.8595 0.5823 0.6856 0.7001 0.4656 0.6947 0.5607 0.5204 0.4116 0.492 0.2747 0.5646 0.2801 0.4423 0.5098 0.5407 0.4202 0.5402 0.3464 0.2736 0.1633 0.0525 0.4555 0.2239 0.4522 0.3718 0.4003 0.5331 0.8099 0.6855 0.6441 0.5735 0.2866 0.1544 0.8862 0.6363 0.9318 0.429 0.6224 0.2727 0.1175 0.2369 0.5691 0.1025 0.6322 0.7534 0.2725 0.4703 0.1621 0.4138 0.4117 0.6187 0.4788 0.5503 0.563 0.3988 0.4747 0.2617 0.1577 0.1468 0.8001 0.3308 0.234 0.0156 0.6431 0.618 0.0612 0.0478 0.3884 0.5518 0.6088 0.0289 0.2734 0.0402 0.0264 0.0045 0.2111 0.1696 0.0179 0.5093 0.0092 0.0142 0.1694 0.7671 0.0027 0.6194 0.1669 0.4195 0.0697 0.3911 0.0867 0.4055 0.0944 0.7569 0.9909 0.0021 0.341 0.8037 0.0724 0.0363 0.5838 0.0049 0.2668 0.1271 0.2646 0.1619 0.5201 0.5615 0.4055 0.3041 0.6953 0.0051 0.0385 0.2386 0.0672 0.1284 0.0683 0.0167 0.1489 0.3169 0.1293 0.1962 0.2579 0.0756 0.3772 0. 0.2578 0.0215 0.0331 0.0254] 2022-08-24 18:41:27 [INFO] [EVAL] The model with the best validation mIoU (0.3049) was saved at iter 69000. 
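(For reference: the [EVAL] summaries above are the usual aggregations of the per-class arrays the evaluator prints. mIoU is the mean of the "Class IoU" vector, Acc is overall pixel accuracy, and each class's IoU/Precision/Recall come from the same per-class pixel counts; the "best validation mIoU" line tracks the highest mIoU seen so far, which is why it stays at iter 69000 while the evaluations at 71000, 72000 and 73000 score lower. The short Python sketch below illustrates these definitions under that assumption; the function and array names are hypothetical and this is not the evaluator's actual code. It also shows the throughput relation implied by the [TRAIN] lines, where ips is approximately the batch size divided by batch_cost.)

# Minimal sketch (assumed definitions, not the evaluator's actual code) of how
# the [EVAL] summary values relate to per-class pixel counts.
import numpy as np

def eval_summary(intersect, pred_area, label_area):
    # intersect / pred_area / label_area: per-class pixel counts accumulated
    # over the validation set (150 classes for ADE20K); names are hypothetical.
    union = pred_area + label_area - intersect
    class_iou = intersect / np.maximum(union, 1)             # "Class IoU"
    class_precision = intersect / np.maximum(pred_area, 1)   # "Class Precision"
    class_recall = intersect / np.maximum(label_area, 1)     # "Class Recall"
    miou = class_iou.mean()                                   # "mIoU"
    acc = intersect.sum() / max(pred_area.sum(), 1)           # overall "Acc"
    # Kappa and Dice come from the same confusion-matrix counts (omitted here).
    return class_iou, class_precision, class_recall, miou, acc

# Throughput check using the [TRAIN] record at iter 72000 above:
batch_cost, ips = 0.2582, 30.9797
print(round(batch_cost * ips))  # -> 8, consistent with a per-device batch size of 8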
2022-08-24 18:41:35 [INFO] [TRAIN] epoch: 58, iter: 72050/160000, loss: 0.7704, lr: 0.000666, batch_cost: 0.1555, reader_cost: 0.00384, ips: 51.4573 samples/sec | ETA 03:47:53 2022-08-24 18:41:44 [INFO] [TRAIN] epoch: 58, iter: 72100/160000, loss: 0.7550, lr: 0.000665, batch_cost: 0.1836, reader_cost: 0.00442, ips: 43.5628 samples/sec | ETA 04:29:02 2022-08-24 18:41:53 [INFO] [TRAIN] epoch: 58, iter: 72150/160000, loss: 0.7499, lr: 0.000665, batch_cost: 0.1869, reader_cost: 0.00066, ips: 42.8029 samples/sec | ETA 04:33:39 2022-08-24 18:42:03 [INFO] [TRAIN] epoch: 58, iter: 72200/160000, loss: 0.7383, lr: 0.000665, batch_cost: 0.1866, reader_cost: 0.00069, ips: 42.8661 samples/sec | ETA 04:33:05 2022-08-24 18:42:11 [INFO] [TRAIN] epoch: 58, iter: 72250/160000, loss: 0.7898, lr: 0.000664, batch_cost: 0.1729, reader_cost: 0.00171, ips: 46.2757 samples/sec | ETA 04:12:49 2022-08-24 18:42:20 [INFO] [TRAIN] epoch: 58, iter: 72300/160000, loss: 0.8183, lr: 0.000664, batch_cost: 0.1845, reader_cost: 0.00032, ips: 43.3585 samples/sec | ETA 04:29:41 2022-08-24 18:42:29 [INFO] [TRAIN] epoch: 58, iter: 72350/160000, loss: 0.7722, lr: 0.000664, batch_cost: 0.1677, reader_cost: 0.00072, ips: 47.7032 samples/sec | ETA 04:04:59 2022-08-24 18:42:38 [INFO] [TRAIN] epoch: 58, iter: 72400/160000, loss: 0.8343, lr: 0.000663, batch_cost: 0.1876, reader_cost: 0.00051, ips: 42.6381 samples/sec | ETA 04:33:55 2022-08-24 18:42:47 [INFO] [TRAIN] epoch: 58, iter: 72450/160000, loss: 0.7808, lr: 0.000663, batch_cost: 0.1690, reader_cost: 0.00048, ips: 47.3361 samples/sec | ETA 04:06:36 2022-08-24 18:42:55 [INFO] [TRAIN] epoch: 58, iter: 72500/160000, loss: 0.7785, lr: 0.000662, batch_cost: 0.1761, reader_cost: 0.01807, ips: 45.4273 samples/sec | ETA 04:16:49 2022-08-24 18:43:05 [INFO] [TRAIN] epoch: 58, iter: 72550/160000, loss: 0.7996, lr: 0.000662, batch_cost: 0.2006, reader_cost: 0.00089, ips: 39.8706 samples/sec | ETA 04:52:26 2022-08-24 18:43:15 [INFO] [TRAIN] epoch: 58, iter: 72600/160000, loss: 0.7951, lr: 0.000662, batch_cost: 0.1881, reader_cost: 0.00060, ips: 42.5202 samples/sec | ETA 04:34:03 2022-08-24 18:43:26 [INFO] [TRAIN] epoch: 58, iter: 72650/160000, loss: 0.7397, lr: 0.000661, batch_cost: 0.2146, reader_cost: 0.00074, ips: 37.2713 samples/sec | ETA 05:12:29 2022-08-24 18:43:36 [INFO] [TRAIN] epoch: 58, iter: 72700/160000, loss: 0.7811, lr: 0.000661, batch_cost: 0.2161, reader_cost: 0.00076, ips: 37.0173 samples/sec | ETA 05:14:26 2022-08-24 18:43:46 [INFO] [TRAIN] epoch: 58, iter: 72750/160000, loss: 0.7376, lr: 0.000661, batch_cost: 0.1914, reader_cost: 0.00190, ips: 41.7938 samples/sec | ETA 04:38:21 2022-08-24 18:43:56 [INFO] [TRAIN] epoch: 58, iter: 72800/160000, loss: 0.7440, lr: 0.000660, batch_cost: 0.1939, reader_cost: 0.00056, ips: 41.2523 samples/sec | ETA 04:41:50 2022-08-24 18:44:07 [INFO] [TRAIN] epoch: 58, iter: 72850/160000, loss: 0.8102, lr: 0.000660, batch_cost: 0.2174, reader_cost: 0.00081, ips: 36.8051 samples/sec | ETA 05:15:43 2022-08-24 18:44:18 [INFO] [TRAIN] epoch: 58, iter: 72900/160000, loss: 0.8305, lr: 0.000659, batch_cost: 0.2218, reader_cost: 0.00430, ips: 36.0766 samples/sec | ETA 05:21:54 2022-08-24 18:44:28 [INFO] [TRAIN] epoch: 58, iter: 72950/160000, loss: 0.8257, lr: 0.000659, batch_cost: 0.2164, reader_cost: 0.00159, ips: 36.9636 samples/sec | ETA 05:14:00 2022-08-24 18:44:39 [INFO] [TRAIN] epoch: 58, iter: 73000/160000, loss: 0.8340, lr: 0.000659, batch_cost: 0.2019, reader_cost: 0.00665, ips: 39.6149 samples/sec | ETA 04:52:49 2022-08-24 18:44:39 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 215s - batch_cost: 0.2145 - reader cost: 7.7253e-04 2022-08-24 18:48:13 [INFO] [EVAL] #Images: 2000 mIoU: 0.2956 Acc: 0.7414 Kappa: 0.7215 Dice: 0.4172 2022-08-24 18:48:13 [INFO] [EVAL] Class IoU: [0.65 0.7583 0.9255 0.6988 0.6615 0.739 0.7574 0.7274 0.4715 0.6172 0.4417 0.5299 0.6471 0.3004 0.2224 0.3616 0.4846 0.4096 0.5582 0.3495 0.7109 0.4405 0.5631 0.428 0.3208 0.3584 0.4264 0.3866 0.3507 0.2462 0.2098 0.3968 0.2201 0.279 0.3625 0.3717 0.319 0.4412 0.2398 0.2854 0.0948 0.0668 0.2973 0.2232 0.2874 0.2919 0.2144 0.3929 0.6521 0.4859 0.4523 0.2225 0.1992 0.1359 0.6591 0.3839 0.8351 0.2349 0.3598 0.1972 0.0669 0.1833 0.2568 0.1208 0.3686 0.6332 0.2095 0.362 0.0703 0.2854 0.3121 0.3684 0.3586 0.2063 0.3954 0.2728 0.294 0.221 0.0856 0.2995 0.64 0.2653 0.2186 0.0095 0.5033 0.4723 0.0518 0.0421 0.2941 0.4479 0.3748 0.0107 0.1226 0.0774 0. 0.0044 0.1668 0.0861 0.0172 0.326 0.0854 0.0087 0.1397 0.299 0.0044 0.4907 0.1937 0.4653 0.029 0.2516 0.0674 0.1128 0.0612 0.5346 0.7519 0.0001 0.2531 0.469 0.0925 0.0757 0.4793 0.0069 0.2643 0.0919 0.2262 0.1042 0.4224 0.336 0.2908 0.2076 0.5534 0.0075 0.1129 0.1176 0.0612 0.1193 0.0481 0.0094 0.1152 0.2748 0.0664 0.0634 0.2499 0.239 0.2991 0.0004 0.2489 0.0209 0.0392 0.0282] 2022-08-24 18:48:13 [INFO] [EVAL] Class Precision: [0.7618 0.8175 0.9614 0.7933 0.7632 0.8429 0.8821 0.8109 0.6283 0.7293 0.6504 0.6533 0.7311 0.491 0.4224 0.5624 0.6502 0.6323 0.7122 0.5732 0.7928 0.6345 0.7407 0.638 0.4711 0.6211 0.5421 0.5801 0.6454 0.5164 0.3856 0.5121 0.4599 0.4383 0.5054 0.4789 0.6379 0.7109 0.435 0.4368 0.168 0.2129 0.5106 0.4914 0.4426 0.4911 0.3328 0.6375 0.708 0.6378 0.6414 0.2582 0.3692 0.7409 0.741 0.6883 0.8875 0.608 0.4781 0.3265 0.1912 0.3696 0.3107 0.5727 0.4674 0.8062 0.3306 0.5489 0.271 0.6395 0.5565 0.6151 0.6567 0.2628 0.5168 0.5682 0.5783 0.6189 0.4884 0.5827 0.7632 0.505 0.7645 0.0713 0.6845 0.627 0.3015 0.48 0.449 0.6605 0.4944 0.0149 0.2694 0.2753 0. 0.02 0.4812 0.6734 0.3684 0.4988 0.5741 0.0116 0.6806 0.5743 0.0649 0.6836 0.4284 0.9311 0.1151 0.355 0.2132 0.1347 0.2976 0.6489 0.7601 0.0061 0.8335 0.4784 0.1297 0.5309 0.7649 0.424 0.7429 0.5758 0.6209 0.6367 0.8607 0.4878 0.5678 0.4346 0.7088 0.3423 0.6385 0.7459 0.6433 0.3648 0.3795 0.1405 0.3397 0.7203 0.1936 0.1056 0.576 0.6363 0.6144 0.0005 0.7275 0.2539 0.3679 0.7506] 2022-08-24 18:48:13 [INFO] [EVAL] Class Recall: [0.8158 0.9129 0.9612 0.8543 0.8324 0.8571 0.8427 0.876 0.6538 0.8006 0.5792 0.7372 0.8492 0.4363 0.3196 0.5031 0.6555 0.5377 0.7208 0.4725 0.8732 0.5903 0.7014 0.5653 0.5014 0.4586 0.6663 0.5369 0.4343 0.32 0.3152 0.6381 0.2969 0.4343 0.5618 0.6242 0.3895 0.5376 0.3483 0.4516 0.1786 0.0887 0.4158 0.2903 0.4505 0.4185 0.376 0.5059 0.892 0.6711 0.6053 0.6169 0.302 0.1426 0.8564 0.4646 0.934 0.2768 0.5925 0.3324 0.0933 0.2666 0.5969 0.1328 0.6356 0.7469 0.3639 0.5153 0.0867 0.3402 0.4154 0.4789 0.4413 0.4896 0.6272 0.3441 0.3742 0.2558 0.0941 0.3812 0.7987 0.3585 0.2344 0.0109 0.6553 0.6568 0.0589 0.0441 0.4602 0.5819 0.6076 0.0372 0.1836 0.0971 0. 
0.0055 0.2033 0.0899 0.0177 0.4849 0.0911 0.0328 0.1495 0.3841 0.0047 0.6349 0.2612 0.4819 0.0373 0.4637 0.0897 0.409 0.0716 0.7521 0.9859 0.0001 0.2666 0.9599 0.2436 0.0812 0.5621 0.007 0.2909 0.0986 0.2625 0.1108 0.4534 0.5192 0.3735 0.2844 0.7163 0.0076 0.1206 0.1225 0.0634 0.1506 0.0522 0.0099 0.1485 0.3077 0.0917 0.1369 0.3063 0.2768 0.3683 0.0054 0.2745 0.0222 0.042 0.0284] 2022-08-24 18:48:13 [INFO] [EVAL] The model with the best validation mIoU (0.3049) was saved at iter 69000. 2022-08-24 18:48:24 [INFO] [TRAIN] epoch: 58, iter: 73050/160000, loss: 0.7945, lr: 0.000658, batch_cost: 0.2081, reader_cost: 0.00644, ips: 38.4452 samples/sec | ETA 05:01:33 2022-08-24 18:48:35 [INFO] [TRAIN] epoch: 58, iter: 73100/160000, loss: 0.8177, lr: 0.000658, batch_cost: 0.2256, reader_cost: 0.00113, ips: 35.4572 samples/sec | ETA 05:26:46 2022-08-24 18:48:44 [INFO] [TRAIN] epoch: 58, iter: 73150/160000, loss: 0.8187, lr: 0.000658, batch_cost: 0.1826, reader_cost: 0.00049, ips: 43.8163 samples/sec | ETA 04:24:17 2022-08-24 18:48:53 [INFO] [TRAIN] epoch: 58, iter: 73200/160000, loss: 0.7900, lr: 0.000657, batch_cost: 0.1785, reader_cost: 0.00077, ips: 44.8212 samples/sec | ETA 04:18:12 2022-08-24 18:49:02 [INFO] [TRAIN] epoch: 58, iter: 73250/160000, loss: 0.8064, lr: 0.000657, batch_cost: 0.1842, reader_cost: 0.00096, ips: 43.4340 samples/sec | ETA 04:26:18 2022-08-24 18:49:12 [INFO] [TRAIN] epoch: 59, iter: 73300/160000, loss: 0.7669, lr: 0.000656, batch_cost: 0.1989, reader_cost: 0.03070, ips: 40.2163 samples/sec | ETA 04:47:26 2022-08-24 18:49:20 [INFO] [TRAIN] epoch: 59, iter: 73350/160000, loss: 0.8169, lr: 0.000656, batch_cost: 0.1574, reader_cost: 0.00054, ips: 50.8131 samples/sec | ETA 03:47:22 2022-08-24 18:49:28 [INFO] [TRAIN] epoch: 59, iter: 73400/160000, loss: 0.7441, lr: 0.000656, batch_cost: 0.1623, reader_cost: 0.00066, ips: 49.2941 samples/sec | ETA 03:54:14 2022-08-24 18:49:37 [INFO] [TRAIN] epoch: 59, iter: 73450/160000, loss: 0.7666, lr: 0.000655, batch_cost: 0.1795, reader_cost: 0.00033, ips: 44.5685 samples/sec | ETA 04:18:55 2022-08-24 18:49:45 [INFO] [TRAIN] epoch: 59, iter: 73500/160000, loss: 0.8076, lr: 0.000655, batch_cost: 0.1543, reader_cost: 0.00032, ips: 51.8635 samples/sec | ETA 03:42:22 2022-08-24 18:49:53 [INFO] [TRAIN] epoch: 59, iter: 73550/160000, loss: 0.7594, lr: 0.000655, batch_cost: 0.1661, reader_cost: 0.00034, ips: 48.1647 samples/sec | ETA 03:59:19 2022-08-24 18:50:02 [INFO] [TRAIN] epoch: 59, iter: 73600/160000, loss: 0.7429, lr: 0.000654, batch_cost: 0.1787, reader_cost: 0.00057, ips: 44.7597 samples/sec | ETA 04:17:22 2022-08-24 18:50:13 [INFO] [TRAIN] epoch: 59, iter: 73650/160000, loss: 0.8148, lr: 0.000654, batch_cost: 0.2083, reader_cost: 0.00498, ips: 38.4003 samples/sec | ETA 04:59:49 2022-08-24 18:50:23 [INFO] [TRAIN] epoch: 59, iter: 73700/160000, loss: 0.8197, lr: 0.000653, batch_cost: 0.2017, reader_cost: 0.00444, ips: 39.6664 samples/sec | ETA 04:50:05 2022-08-24 18:50:34 [INFO] [TRAIN] epoch: 59, iter: 73750/160000, loss: 0.7694, lr: 0.000653, batch_cost: 0.2176, reader_cost: 0.01997, ips: 36.7607 samples/sec | ETA 05:12:50 2022-08-24 18:50:45 [INFO] [TRAIN] epoch: 59, iter: 73800/160000, loss: 0.7790, lr: 0.000653, batch_cost: 0.2243, reader_cost: 0.00050, ips: 35.6708 samples/sec | ETA 05:22:12 2022-08-24 18:50:56 [INFO] [TRAIN] epoch: 59, iter: 73850/160000, loss: 0.7495, lr: 0.000652, batch_cost: 0.2197, reader_cost: 0.00049, ips: 36.4140 samples/sec | ETA 05:15:26 2022-08-24 18:51:07 [INFO] [TRAIN] epoch: 59, iter: 73900/160000, loss: 
0.7879, lr: 0.000652, batch_cost: 0.2142, reader_cost: 0.00065, ips: 37.3545 samples/sec | ETA 05:07:19 2022-08-24 18:51:18 [INFO] [TRAIN] epoch: 59, iter: 73950/160000, loss: 0.7763, lr: 0.000651, batch_cost: 0.2187, reader_cost: 0.00046, ips: 36.5731 samples/sec | ETA 05:13:42 2022-08-24 18:51:28 [INFO] [TRAIN] epoch: 59, iter: 74000/160000, loss: 0.8684, lr: 0.000651, batch_cost: 0.2119, reader_cost: 0.00396, ips: 37.7582 samples/sec | ETA 05:03:41 2022-08-24 18:51:28 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 224s - batch_cost: 0.2235 - reader cost: 0.0017 2022-08-24 18:55:12 [INFO] [EVAL] #Images: 2000 mIoU: 0.3049 Acc: 0.7412 Kappa: 0.7211 Dice: 0.4303 2022-08-24 18:55:12 [INFO] [EVAL] Class IoU: [0.651 0.7599 0.9239 0.7011 0.6602 0.7372 0.7615 0.7397 0.4752 0.6058 0.4378 0.5312 0.6472 0.3217 0.1978 0.3631 0.4561 0.3806 0.5423 0.3545 0.7087 0.417 0.5567 0.46 0.2982 0.3821 0.3788 0.3711 0.3729 0.258 0.1799 0.3898 0.2522 0.2654 0.3933 0.337 0.3432 0.445 0.2377 0.2796 0.086 0.0639 0.3005 0.2228 0.2968 0.2177 0.2443 0.4083 0.6313 0.4855 0.4499 0.1807 0.192 0.1928 0.6625 0.4565 0.8007 0.3113 0.4128 0.2105 0.0552 0.1716 0.2352 0.1245 0.3796 0.6405 0.2134 0.3607 0.0586 0.3125 0.327 0.415 0.3579 0.2405 0.4212 0.2855 0.3195 0.2159 0.1409 0.2491 0.6495 0.2583 0.2347 0.0149 0.5428 0.4746 0.0754 0.0365 0.2871 0.4259 0.3657 0.0106 0.1936 0.0644 0.0219 0.0074 0.1796 0.1301 0.2114 0.3324 0.0252 0.0082 0.167 0.3004 0.0002 0.4808 0.1585 0.4989 0.0631 0.2835 0.0302 0.3204 0.067 0.5787 0.6167 0.0008 0.3025 0.5649 0.0459 0.1371 0.4674 0.0035 0.2865 0.1059 0.2159 0.1933 0.4512 0.3473 0.367 0.2217 0.4935 0.0055 0.0777 0.2053 0.0822 0.1113 0.0609 0.0405 0.1214 0.3289 0.1391 0.0753 0.2774 0.3261 0.3215 0. 0.2347 0.0129 0.0356 0.0389] 2022-08-24 18:55:12 [INFO] [EVAL] Class Precision: [0.7515 0.8356 0.9583 0.8056 0.7466 0.8496 0.901 0.8226 0.5963 0.7469 0.6709 0.6558 0.7448 0.4712 0.4566 0.5149 0.7209 0.6728 0.7034 0.5623 0.7867 0.6141 0.7241 0.5824 0.5129 0.6335 0.4813 0.5925 0.6532 0.4766 0.4084 0.5486 0.4149 0.3448 0.4988 0.4601 0.5705 0.6777 0.3991 0.4109 0.1826 0.2149 0.5639 0.5231 0.4205 0.4333 0.3935 0.6445 0.6813 0.614 0.6445 0.199 0.3481 0.6718 0.7529 0.6839 0.8436 0.5387 0.6426 0.4 0.122 0.4654 0.2904 0.6116 0.4824 0.8154 0.3111 0.5215 0.1705 0.6561 0.5242 0.5726 0.6535 0.2935 0.6083 0.477 0.5352 0.596 0.751 0.7357 0.7689 0.5898 0.7564 0.0948 0.709 0.6934 0.3063 0.4289 0.41 0.6191 0.4712 0.0154 0.331 0.2827 0.1151 0.0291 0.5553 0.5528 0.4798 0.522 0.5153 0.0161 0.6484 0.7624 0.0139 0.7037 0.5389 0.8483 0.201 0.5316 0.3023 0.5903 0.3349 0.8078 0.6183 0.0465 0.8133 0.6167 0.0929 0.5364 0.7449 0.1885 0.7742 0.6039 0.6204 0.6786 0.8189 0.5142 0.6651 0.4303 0.5496 0.7303 0.4855 0.5839 0.5893 0.354 0.374 0.1023 0.3101 0.6428 0.2448 0.1244 0.552 0.5252 0.5829 0. 
0.8951 0.2406 0.4243 0.6289] 2022-08-24 18:55:12 [INFO] [EVAL] Class Recall: [0.8295 0.8934 0.9627 0.8438 0.8508 0.8479 0.8309 0.8801 0.7006 0.7624 0.5575 0.7366 0.8317 0.5035 0.2587 0.5519 0.5538 0.4671 0.7031 0.4895 0.8774 0.5651 0.7065 0.6864 0.4161 0.4906 0.6402 0.4983 0.465 0.36 0.2433 0.5738 0.3914 0.5355 0.6504 0.5575 0.4627 0.5645 0.3703 0.4665 0.1397 0.0834 0.3915 0.2796 0.5021 0.3044 0.3919 0.5269 0.8959 0.6988 0.5983 0.6628 0.2998 0.2129 0.8466 0.5785 0.9402 0.4244 0.5358 0.3077 0.0917 0.2137 0.553 0.1352 0.6404 0.749 0.4046 0.5392 0.0819 0.3738 0.465 0.6011 0.4417 0.5714 0.578 0.4155 0.4422 0.2529 0.1478 0.2735 0.8071 0.3149 0.2538 0.0173 0.6984 0.6006 0.0909 0.0384 0.4892 0.5772 0.6204 0.033 0.3181 0.0769 0.0263 0.0098 0.2098 0.1454 0.2743 0.4778 0.0258 0.0166 0.1837 0.3314 0.0002 0.6028 0.1833 0.5478 0.0842 0.378 0.0324 0.412 0.0773 0.6711 0.996 0.0008 0.3251 0.8707 0.0833 0.1556 0.5565 0.0036 0.3126 0.1138 0.2487 0.2128 0.5012 0.5168 0.4502 0.3138 0.8286 0.0056 0.0847 0.2404 0.0871 0.1397 0.0678 0.0627 0.1663 0.4024 0.2436 0.1602 0.358 0.4624 0.4176 0. 0.2413 0.0135 0.0374 0.0398] 2022-08-24 18:55:12 [INFO] [EVAL] The model with the best validation mIoU (0.3049) was saved at iter 74000. 2022-08-24 18:55:23 [INFO] [TRAIN] epoch: 59, iter: 74050/160000, loss: 0.7821, lr: 0.000651, batch_cost: 0.2251, reader_cost: 0.00327, ips: 35.5383 samples/sec | ETA 05:22:28 2022-08-24 18:55:33 [INFO] [TRAIN] epoch: 59, iter: 74100/160000, loss: 0.7984, lr: 0.000650, batch_cost: 0.1910, reader_cost: 0.00099, ips: 41.8884 samples/sec | ETA 04:33:25 2022-08-24 18:55:43 [INFO] [TRAIN] epoch: 59, iter: 74150/160000, loss: 0.8134, lr: 0.000650, batch_cost: 0.1972, reader_cost: 0.00031, ips: 40.5699 samples/sec | ETA 04:42:08 2022-08-24 18:55:52 [INFO] [TRAIN] epoch: 59, iter: 74200/160000, loss: 0.8148, lr: 0.000650, batch_cost: 0.1765, reader_cost: 0.00056, ips: 45.3222 samples/sec | ETA 04:12:24 2022-08-24 18:56:00 [INFO] [TRAIN] epoch: 59, iter: 74250/160000, loss: 0.7872, lr: 0.000649, batch_cost: 0.1633, reader_cost: 0.00058, ips: 48.9768 samples/sec | ETA 03:53:26 2022-08-24 18:56:08 [INFO] [TRAIN] epoch: 59, iter: 74300/160000, loss: 0.7667, lr: 0.000649, batch_cost: 0.1616, reader_cost: 0.00100, ips: 49.4981 samples/sec | ETA 03:50:51 2022-08-24 18:56:15 [INFO] [TRAIN] epoch: 59, iter: 74350/160000, loss: 0.7764, lr: 0.000648, batch_cost: 0.1490, reader_cost: 0.00493, ips: 53.6802 samples/sec | ETA 03:32:44 2022-08-24 18:56:24 [INFO] [TRAIN] epoch: 59, iter: 74400/160000, loss: 0.7910, lr: 0.000648, batch_cost: 0.1703, reader_cost: 0.00172, ips: 46.9828 samples/sec | ETA 04:02:55 2022-08-24 18:56:32 [INFO] [TRAIN] epoch: 59, iter: 74450/160000, loss: 0.8474, lr: 0.000648, batch_cost: 0.1527, reader_cost: 0.00098, ips: 52.3742 samples/sec | ETA 03:37:47 2022-08-24 18:56:41 [INFO] [TRAIN] epoch: 59, iter: 74500/160000, loss: 0.8012, lr: 0.000647, batch_cost: 0.1854, reader_cost: 0.00051, ips: 43.1446 samples/sec | ETA 04:24:13 2022-08-24 18:56:53 [INFO] [TRAIN] epoch: 60, iter: 74550/160000, loss: 0.7851, lr: 0.000647, batch_cost: 0.2391, reader_cost: 0.04461, ips: 33.4530 samples/sec | ETA 05:40:34 2022-08-24 18:57:04 [INFO] [TRAIN] epoch: 60, iter: 74600/160000, loss: 0.7995, lr: 0.000647, batch_cost: 0.2176, reader_cost: 0.00134, ips: 36.7601 samples/sec | ETA 05:09:45 2022-08-24 18:57:15 [INFO] [TRAIN] epoch: 60, iter: 74650/160000, loss: 0.8230, lr: 0.000646, batch_cost: 0.2193, reader_cost: 0.00416, ips: 36.4800 samples/sec | ETA 05:11:57 2022-08-24 18:57:26 [INFO] [TRAIN] epoch: 60, 
iter: 74700/160000, loss: 0.8194, lr: 0.000646, batch_cost: 0.2299, reader_cost: 0.00042, ips: 34.7920 samples/sec | ETA 05:26:53 2022-08-24 18:57:36 [INFO] [TRAIN] epoch: 60, iter: 74750/160000, loss: 0.7998, lr: 0.000645, batch_cost: 0.2050, reader_cost: 0.00072, ips: 39.0151 samples/sec | ETA 04:51:20 2022-08-24 18:57:46 [INFO] [TRAIN] epoch: 60, iter: 74800/160000, loss: 0.7777, lr: 0.000645, batch_cost: 0.1867, reader_cost: 0.00062, ips: 42.8542 samples/sec | ETA 04:25:05 2022-08-24 18:57:55 [INFO] [TRAIN] epoch: 60, iter: 74850/160000, loss: 0.7602, lr: 0.000645, batch_cost: 0.1959, reader_cost: 0.00340, ips: 40.8362 samples/sec | ETA 04:38:01 2022-08-24 18:58:05 [INFO] [TRAIN] epoch: 60, iter: 74900/160000, loss: 0.7840, lr: 0.000644, batch_cost: 0.1925, reader_cost: 0.00058, ips: 41.5647 samples/sec | ETA 04:32:59 2022-08-24 18:58:16 [INFO] [TRAIN] epoch: 60, iter: 74950/160000, loss: 0.7708, lr: 0.000644, batch_cost: 0.2195, reader_cost: 0.00044, ips: 36.4453 samples/sec | ETA 05:11:09 2022-08-24 18:58:26 [INFO] [TRAIN] epoch: 60, iter: 75000/160000, loss: 0.7979, lr: 0.000644, batch_cost: 0.1922, reader_cost: 0.00374, ips: 41.6172 samples/sec | ETA 04:32:19 2022-08-24 18:58:26 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 205s - batch_cost: 0.2050 - reader cost: 8.7451e-04 2022-08-24 19:01:51 [INFO] [EVAL] #Images: 2000 mIoU: 0.3038 Acc: 0.7427 Kappa: 0.7231 Dice: 0.4279 2022-08-24 19:01:51 [INFO] [EVAL] Class IoU: [0.6564 0.7675 0.9254 0.7024 0.663 0.7327 0.7627 0.7435 0.4836 0.6062 0.4397 0.5311 0.6538 0.2824 0.2277 0.3711 0.4975 0.3808 0.5383 0.3615 0.7205 0.404 0.5352 0.4641 0.3325 0.4147 0.3856 0.3741 0.3536 0.2901 0.1795 0.4035 0.2406 0.273 0.3217 0.3517 0.3393 0.4607 0.2483 0.2516 0.1029 0.0534 0.2998 0.2021 0.2754 0.2939 0.1934 0.403 0.6265 0.4595 0.4437 0.2806 0.1425 0.2109 0.637 0.484 0.8232 0.3238 0.4783 0.1819 0.0638 0.1677 0.2848 0.1651 0.3424 0.6492 0.1848 0.3739 0.0872 0.3083 0.3352 0.4098 0.3731 0.1969 0.4198 0.2883 0.3097 0.2115 0.1403 0.1721 0.61 0.2787 0.2617 0.0205 0.5454 0.4756 0.0536 0.0547 0.2488 0.4295 0.3674 0.012 0.1529 0.0561 0.0273 0.0059 0.1807 0.1262 0.1707 0.3381 0.0796 0.0029 0.1796 0.5607 0.0048 0.5037 0.1559 0.4614 0.0731 0.2418 0.062 0.1961 0.0666 0.5502 0.6201 0.0025 0.3569 0.5595 0.0374 0.0546 0.4721 0.011 0.3026 0.1184 0.225 0.1598 0.4273 0.3042 0.3531 0.1922 0.5514 0.0014 0.0308 0.213 0.0583 0.1103 0.0551 0.0144 0.1213 0.3229 0.1848 0.1053 0.237 0.1323 0.3135 0. 0.2348 0.0098 0.042 0.017 ] 2022-08-24 19:01:51 [INFO] [EVAL] Class Precision: [0.7701 0.8384 0.9627 0.8013 0.7482 0.865 0.895 0.8155 0.6166 0.7088 0.6422 0.6483 0.7543 0.4933 0.4425 0.539 0.6812 0.6544 0.7221 0.5252 0.8222 0.6396 0.6942 0.5725 0.4932 0.5625 0.4868 0.736 0.6319 0.4162 0.4268 0.5241 0.4787 0.3792 0.5277 0.4559 0.5546 0.633 0.4198 0.5302 0.1765 0.2246 0.502 0.5846 0.3932 0.4618 0.2847 0.6927 0.6822 0.5851 0.5967 0.3247 0.3662 0.5391 0.7165 0.6936 0.8656 0.5737 0.6435 0.39 0.1093 0.4636 0.3548 0.5425 0.4026 0.8004 0.2782 0.5669 0.2135 0.6539 0.5691 0.5383 0.6124 0.2456 0.6399 0.5552 0.4732 0.6591 0.4852 0.3809 0.7095 0.4858 0.7216 0.0416 0.6623 0.6564 0.2198 0.3928 0.3415 0.6137 0.4624 0.0156 0.3075 0.3105 0.1254 0.0286 0.6635 0.4768 0.4194 0.5307 0.6109 0.0062 0.682 0.686 0.0667 0.6568 0.4758 0.8136 0.1867 0.345 0.1854 0.2793 0.3081 0.7534 0.6248 0.0579 0.6921 0.5777 0.1099 0.5509 0.7686 0.2019 0.7567 0.6903 0.6295 0.6919 0.6869 0.416 0.4641 0.4342 0.7261 1. 
0.3956 0.5768 0.6123 0.3137 0.4638 0.2174 0.3404 0.6682 0.2203 0.1332 0.6195 0.3711 0.5265 0. 0.9034 0.4161 0.2786 0.691 ] 2022-08-24 19:01:51 [INFO] [EVAL] Class Recall: [0.8163 0.9008 0.9598 0.8506 0.8533 0.8272 0.8376 0.8939 0.6916 0.8071 0.5823 0.746 0.8307 0.3977 0.3193 0.5436 0.6484 0.4767 0.679 0.5368 0.8535 0.523 0.7003 0.7103 0.505 0.6121 0.6498 0.432 0.4453 0.4891 0.2365 0.6369 0.326 0.4937 0.4518 0.6061 0.4664 0.6285 0.378 0.3238 0.1981 0.0654 0.4267 0.2359 0.479 0.4471 0.376 0.4907 0.8848 0.6817 0.6336 0.6739 0.1891 0.2573 0.8517 0.6156 0.9438 0.4265 0.6508 0.2543 0.133 0.2081 0.5906 0.1919 0.696 0.7747 0.355 0.5234 0.1285 0.3684 0.4492 0.6319 0.4884 0.4981 0.5497 0.375 0.4727 0.2374 0.1649 0.2389 0.8132 0.3954 0.2911 0.0389 0.7555 0.6333 0.0662 0.0598 0.4782 0.5887 0.6413 0.0491 0.2331 0.064 0.0337 0.0074 0.199 0.1465 0.2236 0.4822 0.0838 0.0053 0.196 0.7543 0.0051 0.6837 0.1882 0.516 0.1072 0.447 0.0852 0.3971 0.0782 0.671 0.988 0.0026 0.4243 0.9465 0.0536 0.0571 0.5504 0.0115 0.3353 0.125 0.2593 0.172 0.5307 0.5309 0.5962 0.2565 0.6962 0.0014 0.0323 0.2525 0.0605 0.1454 0.0589 0.0152 0.1586 0.3846 0.5336 0.3343 0.2774 0.1705 0.4365 0. 0.2409 0.0099 0.0471 0.0171] 2022-08-24 19:01:51 [INFO] [EVAL] The model with the best validation mIoU (0.3049) was saved at iter 74000. 2022-08-24 19:02:02 [INFO] [TRAIN] epoch: 60, iter: 75050/160000, loss: 0.7640, lr: 0.000643, batch_cost: 0.2140, reader_cost: 0.00476, ips: 37.3836 samples/sec | ETA 05:02:59 2022-08-24 19:02:12 [INFO] [TRAIN] epoch: 60, iter: 75100/160000, loss: 0.7524, lr: 0.000643, batch_cost: 0.2120, reader_cost: 0.00250, ips: 37.7337 samples/sec | ETA 04:59:59 2022-08-24 19:02:21 [INFO] [TRAIN] epoch: 60, iter: 75150/160000, loss: 0.7621, lr: 0.000642, batch_cost: 0.1709, reader_cost: 0.00037, ips: 46.8199 samples/sec | ETA 04:01:38 2022-08-24 19:02:31 [INFO] [TRAIN] epoch: 60, iter: 75200/160000, loss: 0.7890, lr: 0.000642, batch_cost: 0.1928, reader_cost: 0.00084, ips: 41.4978 samples/sec | ETA 04:32:27 2022-08-24 19:02:39 [INFO] [TRAIN] epoch: 60, iter: 75250/160000, loss: 0.7314, lr: 0.000642, batch_cost: 0.1691, reader_cost: 0.00035, ips: 47.3170 samples/sec | ETA 03:58:48 2022-08-24 19:02:48 [INFO] [TRAIN] epoch: 60, iter: 75300/160000, loss: 0.7882, lr: 0.000641, batch_cost: 0.1781, reader_cost: 0.00068, ips: 44.9220 samples/sec | ETA 04:11:23 2022-08-24 19:02:57 [INFO] [TRAIN] epoch: 60, iter: 75350/160000, loss: 0.7790, lr: 0.000641, batch_cost: 0.1897, reader_cost: 0.00035, ips: 42.1661 samples/sec | ETA 04:27:40 2022-08-24 19:03:06 [INFO] [TRAIN] epoch: 60, iter: 75400/160000, loss: 0.8071, lr: 0.000641, batch_cost: 0.1791, reader_cost: 0.00098, ips: 44.6666 samples/sec | ETA 04:12:32 2022-08-24 19:03:15 [INFO] [TRAIN] epoch: 60, iter: 75450/160000, loss: 0.8266, lr: 0.000640, batch_cost: 0.1738, reader_cost: 0.00828, ips: 46.0370 samples/sec | ETA 04:04:52 2022-08-24 19:03:24 [INFO] [TRAIN] epoch: 60, iter: 75500/160000, loss: 0.8046, lr: 0.000640, batch_cost: 0.1740, reader_cost: 0.01272, ips: 45.9725 samples/sec | ETA 04:05:04 2022-08-24 19:03:33 [INFO] [TRAIN] epoch: 60, iter: 75550/160000, loss: 0.7652, lr: 0.000639, batch_cost: 0.1845, reader_cost: 0.00043, ips: 43.3585 samples/sec | ETA 04:19:41 2022-08-24 19:03:43 [INFO] [TRAIN] epoch: 60, iter: 75600/160000, loss: 0.7994, lr: 0.000639, batch_cost: 0.2067, reader_cost: 0.00051, ips: 38.6955 samples/sec | ETA 04:50:49 2022-08-24 19:03:55 [INFO] [TRAIN] epoch: 60, iter: 75650/160000, loss: 0.7867, lr: 0.000639, batch_cost: 0.2299, reader_cost: 0.00664, 
ips: 34.8034 samples/sec | ETA 05:23:08 2022-08-24 19:04:05 [INFO] [TRAIN] epoch: 60, iter: 75700/160000, loss: 0.7984, lr: 0.000638, batch_cost: 0.2082, reader_cost: 0.00386, ips: 38.4241 samples/sec | ETA 04:52:31 2022-08-24 19:04:15 [INFO] [TRAIN] epoch: 60, iter: 75750/160000, loss: 0.9037, lr: 0.000638, batch_cost: 0.1922, reader_cost: 0.00152, ips: 41.6218 samples/sec | ETA 04:29:53 2022-08-24 19:04:27 [INFO] [TRAIN] epoch: 61, iter: 75800/160000, loss: 0.7874, lr: 0.000637, batch_cost: 0.2526, reader_cost: 0.04965, ips: 31.6737 samples/sec | ETA 05:54:26 2022-08-24 19:04:38 [INFO] [TRAIN] epoch: 61, iter: 75850/160000, loss: 0.7828, lr: 0.000637, batch_cost: 0.2144, reader_cost: 0.00871, ips: 37.3197 samples/sec | ETA 05:00:38 2022-08-24 19:04:49 [INFO] [TRAIN] epoch: 61, iter: 75900/160000, loss: 0.7692, lr: 0.000637, batch_cost: 0.2123, reader_cost: 0.00059, ips: 37.6884 samples/sec | ETA 04:57:31 2022-08-24 19:05:01 [INFO] [TRAIN] epoch: 61, iter: 75950/160000, loss: 0.7789, lr: 0.000636, batch_cost: 0.2357, reader_cost: 0.00750, ips: 33.9402 samples/sec | ETA 05:30:11 2022-08-24 19:05:11 [INFO] [TRAIN] epoch: 61, iter: 76000/160000, loss: 0.7014, lr: 0.000636, batch_cost: 0.2158, reader_cost: 0.00048, ips: 37.0639 samples/sec | ETA 05:02:10 2022-08-24 19:05:11 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 206s - batch_cost: 0.2057 - reader cost: 0.0014 2022-08-24 19:08:37 [INFO] [EVAL] #Images: 2000 mIoU: 0.3031 Acc: 0.7423 Kappa: 0.7222 Dice: 0.4282 2022-08-24 19:08:37 [INFO] [EVAL] Class IoU: [0.6486 0.7687 0.9228 0.6977 0.6697 0.7352 0.7509 0.7412 0.47 0.6135 0.4504 0.531 0.6492 0.2739 0.2259 0.3624 0.4621 0.4094 0.542 0.3619 0.7199 0.451 0.5724 0.4573 0.3314 0.3992 0.4035 0.3687 0.332 0.2846 0.2221 0.3777 0.2303 0.2772 0.3298 0.3522 0.336 0.4529 0.236 0.2481 0.0913 0.0729 0.3057 0.2098 0.2729 0.2507 0.2018 0.4224 0.6265 0.426 0.4394 0.2411 0.194 0.2262 0.6409 0.4301 0.8406 0.1938 0.4722 0.1833 0.0469 0.1661 0.2982 0.1789 0.3905 0.6532 0.1801 0.3783 0.0692 0.323 0.3398 0.3969 0.3612 0.1997 0.4262 0.2838 0.3598 0.1788 0.1853 0.4329 0.6255 0.2353 0.2293 0.0162 0.4707 0.4684 0.0612 0.0765 0.3387 0.4295 0.38 0.0126 0.1688 0.0614 0.0027 0.0176 0.165 0.1176 0.1849 0.3497 0.0297 0.0264 0.1848 0.1275 0.0187 0.4905 0.1548 0.4487 0.1029 0.2559 0.0588 0.2218 0.0743 0.4758 0.6763 0.0012 0.3672 0.5635 0.0144 0.1063 0.4394 0.0154 0.3047 0.1471 0.263 0.1354 0.443 0.3271 0.3063 0.1519 0.548 0.0024 0.1811 0.1544 0.0857 0.1042 0.0602 0.0145 0.1076 0.2915 0.2215 0.0553 0.2733 0.2474 0.2072 0. 
0.1993 0.0105 0.0401 0.0568] 2022-08-24 19:08:37 [INFO] [EVAL] Class Precision: [0.7473 0.8338 0.9653 0.7935 0.7698 0.8414 0.9058 0.814 0.6326 0.7468 0.6621 0.6325 0.7345 0.4791 0.4499 0.5282 0.642 0.6155 0.7339 0.5426 0.8199 0.574 0.7624 0.6196 0.5443 0.6056 0.5735 0.7407 0.6485 0.4918 0.3586 0.5517 0.4182 0.3832 0.4354 0.5157 0.6051 0.7372 0.4397 0.5129 0.1661 0.2218 0.595 0.504 0.4182 0.4729 0.3071 0.6892 0.7742 0.5086 0.5875 0.2787 0.4236 0.6248 0.703 0.5588 0.9092 0.6127 0.6898 0.3906 0.1103 0.5831 0.4556 0.5732 0.5045 0.7626 0.2732 0.5542 0.2446 0.6891 0.5888 0.4904 0.6397 0.2524 0.6592 0.4513 0.5101 0.5273 0.6367 0.6541 0.746 0.5518 0.76 0.0816 0.6814 0.6271 0.2346 0.345 0.5594 0.5943 0.5119 0.0178 0.2838 0.3008 0.0123 0.0363 0.5552 0.6061 0.464 0.5121 0.5891 0.038 0.7307 0.4235 0.1333 0.767 0.5287 0.7659 0.2471 0.371 0.2182 0.322 0.3811 0.7126 0.6794 0.0323 0.7525 0.5895 0.045 0.4636 0.806 0.3117 0.6435 0.6298 0.538 0.718 0.7704 0.4721 0.5741 0.4039 0.635 0.1972 0.518 0.6855 0.634 0.3247 0.4218 0.2697 0.3406 0.7239 0.3054 0.0784 0.5962 0.5985 0.8752 0. 0.8977 0.3415 0.2542 0.5872] 2022-08-24 19:08:37 [INFO] [EVAL] Class Recall: [0.8308 0.9079 0.9545 0.8525 0.8373 0.8534 0.8145 0.8924 0.6465 0.7746 0.5848 0.768 0.8482 0.39 0.3121 0.5358 0.6225 0.5501 0.6746 0.5208 0.8552 0.6779 0.6966 0.6358 0.4586 0.5395 0.5764 0.4233 0.4048 0.4032 0.3685 0.5449 0.3388 0.5007 0.5762 0.5262 0.4304 0.5401 0.3374 0.3246 0.1687 0.098 0.386 0.2644 0.44 0.3479 0.3705 0.5218 0.7666 0.7241 0.6356 0.6409 0.2636 0.2617 0.8789 0.6513 0.9176 0.2209 0.5994 0.2567 0.0753 0.1885 0.4631 0.2064 0.6335 0.8199 0.3458 0.5438 0.088 0.3781 0.4455 0.6756 0.4535 0.4885 0.5466 0.4333 0.5498 0.2129 0.2072 0.5614 0.7947 0.2909 0.2472 0.0198 0.6036 0.6491 0.0764 0.0895 0.4618 0.6076 0.5959 0.0409 0.2941 0.0716 0.0034 0.0331 0.1901 0.1273 0.2351 0.5244 0.0303 0.0791 0.1983 0.1543 0.0212 0.5764 0.1796 0.52 0.15 0.4521 0.0746 0.4161 0.0845 0.5887 0.9933 0.0012 0.4177 0.9274 0.0208 0.1212 0.4914 0.0159 0.3665 0.161 0.3397 0.143 0.5104 0.5158 0.3964 0.1958 0.8001 0.0025 0.2178 0.1661 0.0902 0.133 0.0657 0.0151 0.1359 0.328 0.4462 0.1578 0.3353 0.2967 0.2135 0. 0.2039 0.0107 0.0454 0.0592] 2022-08-24 19:08:38 [INFO] [EVAL] The model with the best validation mIoU (0.3049) was saved at iter 74000. 
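Note on how the [EVAL] summary relates to the per-class arrays printed with it: the reported mIoU appears to be the plain mean of the "Class IoU" vector, and the "The model with the best validation mIoU ... was saved at iter N" line tracks the running maximum over the periodic evaluations (here 0.3049 from iter 74000 still leads after the 0.3038 and 0.3031 results above). The sketch below only illustrates that bookkeeping; the arrays and the helper are made up for illustration and are not PaddleSeg source.

import numpy as np

# Minimal sketch (illustrative, not PaddleSeg code): mIoU as the mean of the
# per-class IoU vector, plus the running-best bookkeeping behind the
# "best validation mIoU ... was saved at iter N" lines. Arrays are invented.
evals = [
    (75000, np.array([0.66, 0.30, 0.10])),   # -> mIoU ~= 0.3533
    (76000, np.array([0.64, 0.29, 0.09])),   # -> mIoU ~= 0.34, not a new best
]

best_miou, best_iter = float("-inf"), None
for it, class_iou in evals:
    miou = float(np.nanmean(class_iou))      # mean IoU over all classes
    if miou > best_miou:                     # only a strictly better result moves the checkpoint
        best_miou, best_iter = miou, it
print(f"best mIoU {best_miou:.4f} at iter {best_iter}")   # best mIoU 0.3533 at iter 75000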
2022-08-24 19:08:48 [INFO] [TRAIN] epoch: 61, iter: 76050/160000, loss: 0.7644, lr: 0.000636, batch_cost: 0.1994, reader_cost: 0.01275, ips: 40.1225 samples/sec | ETA 04:38:58 2022-08-24 19:08:58 [INFO] [TRAIN] epoch: 61, iter: 76100/160000, loss: 0.7187, lr: 0.000635, batch_cost: 0.2080, reader_cost: 0.00109, ips: 38.4671 samples/sec | ETA 04:50:48 2022-08-24 19:09:06 [INFO] [TRAIN] epoch: 61, iter: 76150/160000, loss: 0.8068, lr: 0.000635, batch_cost: 0.1660, reader_cost: 0.00050, ips: 48.1957 samples/sec | ETA 03:51:58 2022-08-24 19:09:16 [INFO] [TRAIN] epoch: 61, iter: 76200/160000, loss: 0.7863, lr: 0.000634, batch_cost: 0.1921, reader_cost: 0.00043, ips: 41.6508 samples/sec | ETA 04:28:15 2022-08-24 19:09:25 [INFO] [TRAIN] epoch: 61, iter: 76250/160000, loss: 0.7792, lr: 0.000634, batch_cost: 0.1881, reader_cost: 0.00059, ips: 42.5223 samples/sec | ETA 04:22:36 2022-08-24 19:09:34 [INFO] [TRAIN] epoch: 61, iter: 76300/160000, loss: 0.7577, lr: 0.000634, batch_cost: 0.1849, reader_cost: 0.00043, ips: 43.2667 samples/sec | ETA 04:17:56 2022-08-24 19:09:42 [INFO] [TRAIN] epoch: 61, iter: 76350/160000, loss: 0.8058, lr: 0.000633, batch_cost: 0.1564, reader_cost: 0.00164, ips: 51.1548 samples/sec | ETA 03:38:01 2022-08-24 19:09:51 [INFO] [TRAIN] epoch: 61, iter: 76400/160000, loss: 0.7726, lr: 0.000633, batch_cost: 0.1681, reader_cost: 0.00618, ips: 47.5916 samples/sec | ETA 03:54:12 2022-08-24 19:10:00 [INFO] [TRAIN] epoch: 61, iter: 76450/160000, loss: 0.7524, lr: 0.000633, batch_cost: 0.1805, reader_cost: 0.00045, ips: 44.3181 samples/sec | ETA 04:11:21 2022-08-24 19:10:10 [INFO] [TRAIN] epoch: 61, iter: 76500/160000, loss: 0.7509, lr: 0.000632, batch_cost: 0.1961, reader_cost: 0.00093, ips: 40.7961 samples/sec | ETA 04:32:54 2022-08-24 19:10:19 [INFO] [TRAIN] epoch: 61, iter: 76550/160000, loss: 0.7262, lr: 0.000632, batch_cost: 0.1923, reader_cost: 0.00073, ips: 41.6043 samples/sec | ETA 04:27:26 2022-08-24 19:10:28 [INFO] [TRAIN] epoch: 61, iter: 76600/160000, loss: 0.7767, lr: 0.000631, batch_cost: 0.1811, reader_cost: 0.00041, ips: 44.1711 samples/sec | ETA 04:11:44 2022-08-24 19:10:38 [INFO] [TRAIN] epoch: 61, iter: 76650/160000, loss: 0.8396, lr: 0.000631, batch_cost: 0.1943, reader_cost: 0.00045, ips: 41.1707 samples/sec | ETA 04:29:56 2022-08-24 19:10:48 [INFO] [TRAIN] epoch: 61, iter: 76700/160000, loss: 0.7411, lr: 0.000631, batch_cost: 0.1987, reader_cost: 0.00252, ips: 40.2614 samples/sec | ETA 04:35:51 2022-08-24 19:10:59 [INFO] [TRAIN] epoch: 61, iter: 76750/160000, loss: 0.8051, lr: 0.000630, batch_cost: 0.2162, reader_cost: 0.00072, ips: 36.9966 samples/sec | ETA 05:00:01 2022-08-24 19:11:09 [INFO] [TRAIN] epoch: 61, iter: 76800/160000, loss: 0.7839, lr: 0.000630, batch_cost: 0.2022, reader_cost: 0.00055, ips: 39.5661 samples/sec | ETA 04:40:22 2022-08-24 19:11:19 [INFO] [TRAIN] epoch: 61, iter: 76850/160000, loss: 0.8082, lr: 0.000630, batch_cost: 0.1961, reader_cost: 0.01136, ips: 40.7932 samples/sec | ETA 04:31:46 2022-08-24 19:11:29 [INFO] [TRAIN] epoch: 61, iter: 76900/160000, loss: 0.7820, lr: 0.000629, batch_cost: 0.2149, reader_cost: 0.00058, ips: 37.2291 samples/sec | ETA 04:57:36 2022-08-24 19:11:41 [INFO] [TRAIN] epoch: 61, iter: 76950/160000, loss: 0.8217, lr: 0.000629, batch_cost: 0.2243, reader_cost: 0.00047, ips: 35.6711 samples/sec | ETA 05:10:25 2022-08-24 19:11:51 [INFO] [TRAIN] epoch: 61, iter: 77000/160000, loss: 0.7430, lr: 0.000628, batch_cost: 0.2009, reader_cost: 0.00919, ips: 39.8277 samples/sec | ETA 04:37:51 2022-08-24 19:11:51 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 214s - batch_cost: 0.2138 - reader cost: 9.2866e-04 2022-08-24 19:15:25 [INFO] [EVAL] #Images: 2000 mIoU: 0.3067 Acc: 0.7428 Kappa: 0.7229 Dice: 0.4316 2022-08-24 19:15:25 [INFO] [EVAL] Class IoU: [0.6498 0.7619 0.9239 0.7068 0.6601 0.7296 0.7593 0.7307 0.487 0.6022 0.4544 0.5389 0.6529 0.3148 0.2249 0.3812 0.4468 0.3812 0.5395 0.3608 0.7174 0.4031 0.5609 0.45 0.3269 0.377 0.3874 0.3653 0.3521 0.2803 0.2086 0.4187 0.2513 0.2663 0.4073 0.3412 0.3355 0.443 0.2322 0.2396 0.0751 0.0895 0.3151 0.2215 0.2793 0.2772 0.202 0.4001 0.6558 0.4818 0.4227 0.1994 0.2158 0.2266 0.6347 0.4597 0.831 0.2401 0.4386 0.2066 0.1166 0.1428 0.3095 0.1462 0.4018 0.6567 0.2151 0.3668 0.1308 0.3169 0.3295 0.4075 0.3406 0.2228 0.4162 0.3 0.3888 0.2071 0.2474 0.2416 0.6368 0.2536 0.1912 0.013 0.4772 0.4567 0.0607 0.0633 0.3213 0.4259 0.3637 0.019 0.2157 0.0674 0.0444 0.0055 0.1523 0.1309 0.1701 0.3553 0.0362 0.0094 0.1254 0.6048 0.0044 0.4888 0.1427 0.4138 0.0631 0.302 0.04 0.1011 0.0604 0.4049 0.7002 0. 0.3207 0.5553 0.0591 0.1479 0.4274 0.0157 0.3181 0.044 0.2583 0.1343 0.4295 0.3152 0.4064 0.2299 0.6178 0.0029 0.2034 0.141 0.0843 0.0788 0.0925 0.0172 0.1193 0.2969 0.0293 0.0355 0.2081 0.3923 0.3115 0. 0.2583 0.0071 0.0419 0.0319] 2022-08-24 19:15:25 [INFO] [EVAL] Class Precision: [0.7557 0.8295 0.9603 0.8089 0.7484 0.8613 0.8853 0.7863 0.6391 0.7305 0.6184 0.6685 0.7318 0.4866 0.4678 0.5142 0.6845 0.6904 0.6816 0.5398 0.8115 0.7181 0.752 0.6104 0.4724 0.6049 0.4955 0.7147 0.704 0.4576 0.3899 0.5717 0.4202 0.4482 0.5277 0.4899 0.5797 0.7074 0.4318 0.5011 0.195 0.2066 0.5768 0.4814 0.3996 0.4838 0.3159 0.6141 0.7021 0.6315 0.5779 0.2295 0.389 0.5904 0.6833 0.5898 0.8773 0.5869 0.6559 0.3802 0.1568 0.2841 0.4162 0.6059 0.5081 0.8442 0.3368 0.5765 0.3129 0.5945 0.4836 0.6135 0.671 0.3025 0.7255 0.564 0.5875 0.4742 0.4879 0.6816 0.7502 0.5321 0.7792 0.0574 0.6633 0.6817 0.2008 0.4027 0.5214 0.5726 0.483 0.0232 0.3461 0.2548 0.162 0.0271 0.4048 0.4475 0.5123 0.557 0.6207 0.0152 0.6197 0.7811 0.1586 0.6889 0.5123 0.8022 0.2174 0.5019 0.3556 0.1178 0.2234 0.7898 0.7053 0. 0.7546 0.6068 0.1054 0.8275 0.7809 0.4099 0.7911 0.6679 0.5716 0.6379 0.7464 0.4819 0.6284 0.373 0.75 0.2486 0.7558 0.6502 0.6382 0.38 0.3439 0.069 0.4542 0.7094 0.0791 0.1063 0.5823 0.5862 0.725 0. 0.8504 0.2548 0.3142 0.8194] 2022-08-24 19:15:25 [INFO] [EVAL] Class Recall: [0.8226 0.9034 0.9606 0.8484 0.8483 0.8267 0.8421 0.9117 0.6717 0.7742 0.6315 0.7354 0.8581 0.4713 0.3023 0.5958 0.5627 0.4598 0.7212 0.5211 0.8608 0.4789 0.6882 0.6313 0.5149 0.5001 0.6398 0.4276 0.4133 0.4199 0.3097 0.6101 0.3848 0.3963 0.641 0.5292 0.4433 0.5424 0.3343 0.3147 0.1087 0.1364 0.4098 0.2909 0.4813 0.3936 0.359 0.5345 0.9086 0.6702 0.6114 0.6031 0.3264 0.2689 0.8992 0.6759 0.9403 0.2889 0.5697 0.3116 0.3125 0.223 0.5468 0.1615 0.6577 0.7473 0.3733 0.5022 0.1836 0.4043 0.5083 0.5482 0.4088 0.4583 0.494 0.3906 0.5348 0.2689 0.3342 0.2724 0.8081 0.3264 0.2021 0.0166 0.6297 0.5805 0.0801 0.0699 0.4558 0.6242 0.5956 0.0963 0.364 0.0839 0.0576 0.0069 0.1962 0.1561 0.2029 0.4953 0.037 0.0241 0.1358 0.7282 0.0045 0.6273 0.1652 0.4609 0.0817 0.4311 0.0432 0.4161 0.0764 0.4538 0.9898 0. 0.358 0.8675 0.1185 0.1526 0.4857 0.016 0.3473 0.045 0.3203 0.1454 0.5029 0.4769 0.5349 0.3747 0.778 0.0029 0.2177 0.1525 0.0885 0.0904 0.1123 0.0224 0.1393 0.3381 0.0445 0.0505 0.2446 0.5426 0.3533 0. 
0.2706 0.0073 0.0461 0.0322] 2022-08-24 19:15:25 [INFO] [EVAL] The model with the best validation mIoU (0.3067) was saved at iter 77000. 2022-08-24 19:15:40 [INFO] [TRAIN] epoch: 62, iter: 77050/160000, loss: 0.8034, lr: 0.000628, batch_cost: 0.3059, reader_cost: 0.08875, ips: 26.1520 samples/sec | ETA 07:02:54 2022-08-24 19:15:50 [INFO] [TRAIN] epoch: 62, iter: 77100/160000, loss: 0.7338, lr: 0.000628, batch_cost: 0.1890, reader_cost: 0.00071, ips: 42.3266 samples/sec | ETA 04:21:08 2022-08-24 19:15:58 [INFO] [TRAIN] epoch: 62, iter: 77150/160000, loss: 0.7111, lr: 0.000627, batch_cost: 0.1781, reader_cost: 0.00045, ips: 44.9284 samples/sec | ETA 04:05:52 2022-08-24 19:16:07 [INFO] [TRAIN] epoch: 62, iter: 77200/160000, loss: 0.7368, lr: 0.000627, batch_cost: 0.1771, reader_cost: 0.00141, ips: 45.1799 samples/sec | ETA 04:04:21 2022-08-24 19:16:17 [INFO] [TRAIN] epoch: 62, iter: 77250/160000, loss: 0.8361, lr: 0.000627, batch_cost: 0.1873, reader_cost: 0.00044, ips: 42.7167 samples/sec | ETA 04:18:17 2022-08-24 19:16:26 [INFO] [TRAIN] epoch: 62, iter: 77300/160000, loss: 0.7729, lr: 0.000626, batch_cost: 0.1852, reader_cost: 0.00078, ips: 43.1934 samples/sec | ETA 04:15:17 2022-08-24 19:16:34 [INFO] [TRAIN] epoch: 62, iter: 77350/160000, loss: 0.7830, lr: 0.000626, batch_cost: 0.1644, reader_cost: 0.00138, ips: 48.6639 samples/sec | ETA 03:46:27 2022-08-24 19:16:43 [INFO] [TRAIN] epoch: 62, iter: 77400/160000, loss: 0.7351, lr: 0.000625, batch_cost: 0.1728, reader_cost: 0.00052, ips: 46.3076 samples/sec | ETA 03:57:49 2022-08-24 19:16:51 [INFO] [TRAIN] epoch: 62, iter: 77450/160000, loss: 0.7873, lr: 0.000625, batch_cost: 0.1722, reader_cost: 0.00076, ips: 46.4523 samples/sec | ETA 03:56:56 2022-08-24 19:16:59 [INFO] [TRAIN] epoch: 62, iter: 77500/160000, loss: 0.8077, lr: 0.000625, batch_cost: 0.1588, reader_cost: 0.00627, ips: 50.3643 samples/sec | ETA 03:38:24 2022-08-24 19:17:07 [INFO] [TRAIN] epoch: 62, iter: 77550/160000, loss: 0.7804, lr: 0.000624, batch_cost: 0.1493, reader_cost: 0.00040, ips: 53.5990 samples/sec | ETA 03:25:06 2022-08-24 19:17:15 [INFO] [TRAIN] epoch: 62, iter: 77600/160000, loss: 0.8356, lr: 0.000624, batch_cost: 0.1676, reader_cost: 0.00060, ips: 47.7198 samples/sec | ETA 03:50:13 2022-08-24 19:17:23 [INFO] [TRAIN] epoch: 62, iter: 77650/160000, loss: 0.7788, lr: 0.000623, batch_cost: 0.1640, reader_cost: 0.00358, ips: 48.7830 samples/sec | ETA 03:45:04 2022-08-24 19:17:32 [INFO] [TRAIN] epoch: 62, iter: 77700/160000, loss: 0.7283, lr: 0.000623, batch_cost: 0.1642, reader_cost: 0.00377, ips: 48.7218 samples/sec | ETA 03:45:13 2022-08-24 19:17:40 [INFO] [TRAIN] epoch: 62, iter: 77750/160000, loss: 0.8434, lr: 0.000623, batch_cost: 0.1733, reader_cost: 0.00047, ips: 46.1623 samples/sec | ETA 03:57:34 2022-08-24 19:17:51 [INFO] [TRAIN] epoch: 62, iter: 77800/160000, loss: 0.7656, lr: 0.000622, batch_cost: 0.2047, reader_cost: 0.00380, ips: 39.0881 samples/sec | ETA 04:40:23 2022-08-24 19:18:01 [INFO] [TRAIN] epoch: 62, iter: 77850/160000, loss: 0.7831, lr: 0.000622, batch_cost: 0.2121, reader_cost: 0.00716, ips: 37.7131 samples/sec | ETA 04:50:26 2022-08-24 19:18:11 [INFO] [TRAIN] epoch: 62, iter: 77900/160000, loss: 0.7752, lr: 0.000622, batch_cost: 0.2045, reader_cost: 0.00263, ips: 39.1192 samples/sec | ETA 04:39:49 2022-08-24 19:18:22 [INFO] [TRAIN] epoch: 62, iter: 77950/160000, loss: 0.7906, lr: 0.000621, batch_cost: 0.2158, reader_cost: 0.00055, ips: 37.0755 samples/sec | ETA 04:55:04 2022-08-24 19:18:33 [INFO] [TRAIN] epoch: 62, iter: 78000/160000, loss: 
0.8011, lr: 0.000621, batch_cost: 0.2250, reader_cost: 0.00035, ips: 35.5589 samples/sec | ETA 05:07:28 2022-08-24 19:18:33 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 222s - batch_cost: 0.2215 - reader cost: 8.6128e-04 2022-08-24 19:22:15 [INFO] [EVAL] #Images: 2000 mIoU: 0.3075 Acc: 0.7407 Kappa: 0.7208 Dice: 0.4320 2022-08-24 19:22:15 [INFO] [EVAL] Class IoU: [0.6494 0.7599 0.9235 0.7004 0.664 0.7299 0.763 0.728 0.4736 0.5963 0.4502 0.5259 0.6492 0.2497 0.2369 0.3596 0.4776 0.3885 0.5414 0.3497 0.7164 0.4559 0.5364 0.46 0.3025 0.2491 0.3823 0.3913 0.3498 0.2944 0.2123 0.3945 0.2239 0.2737 0.3467 0.3486 0.3356 0.4697 0.2328 0.2623 0.107 0.0912 0.3012 0.2304 0.2855 0.2851 0.2643 0.4189 0.6131 0.4642 0.446 0.3065 0.223 0.1787 0.6416 0.4352 0.8412 0.1879 0.4365 0.1817 0.0736 0.1281 0.3086 0.1253 0.411 0.6351 0.1883 0.382 0.0964 0.2874 0.3293 0.4101 0.3819 0.2179 0.4429 0.2887 0.3957 0.2316 0.1227 0.3603 0.6192 0.2585 0.2172 0.0123 0.5118 0.4673 0.0619 0.0724 0.3698 0.3958 0.3857 0.021 0.1664 0.0499 0.0394 0.0018 0.1482 0.1221 0.1654 0.3511 0.0681 0.0061 0.1901 0.7698 0.0092 0.5479 0.1481 0.4604 0.0772 0.3119 0.0713 0.3607 0.0626 0.5404 0.7155 0.0005 0.325 0.5124 0.0303 0.0187 0.453 0.0017 0.2932 0.0908 0.2479 0.1269 0.4427 0.3437 0.3628 0.1985 0.4748 0.0055 0.1969 0.1574 0.0698 0.0924 0.0779 0.0149 0.1298 0.2886 0.0999 0.043 0.2368 0.1793 0.3234 0. 0.2027 0.0216 0.0589 0.0154] 2022-08-24 19:22:15 [INFO] [EVAL] Class Precision: [0.7623 0.8281 0.9579 0.8003 0.7646 0.8619 0.8687 0.7875 0.5949 0.72 0.6809 0.678 0.7288 0.4997 0.4492 0.5255 0.6188 0.6738 0.7161 0.5888 0.8105 0.5908 0.6571 0.5908 0.4833 0.6195 0.5834 0.5923 0.697 0.4108 0.3839 0.4858 0.489 0.4297 0.4465 0.465 0.582 0.6697 0.416 0.5142 0.1983 0.1791 0.5278 0.5134 0.3786 0.4707 0.4176 0.6899 0.6355 0.5638 0.5662 0.4087 0.466 0.6611 0.7098 0.5652 0.8897 0.6084 0.5723 0.361 0.114 0.237 0.5368 0.5525 0.5484 0.7285 0.3027 0.5579 0.2376 0.7317 0.5412 0.556 0.5833 0.2566 0.6047 0.4889 0.5896 0.5621 0.685 0.7503 0.7165 0.614 0.7828 0.0412 0.5991 0.6192 0.2385 0.345 0.6245 0.5107 0.5274 0.0265 0.3516 0.2796 0.0933 0.0073 0.357 0.4811 0.3948 0.4916 0.6432 0.0092 0.611 0.9078 0.0827 0.8108 0.3927 0.7447 0.2223 0.435 0.2151 0.7309 0.3536 0.6614 0.7211 0.0162 0.7601 0.5443 0.0713 0.3194 0.8112 0.1963 0.8895 0.5726 0.6145 0.625 0.7175 0.5685 0.7688 0.4884 0.5098 0.3443 0.5619 0.6568 0.6292 0.3575 0.3509 0.1987 0.3154 0.709 0.2467 0.0713 0.629 0.4126 0.5512 0. 
0.9078 0.2535 0.3564 0.844 ] 2022-08-24 19:22:15 [INFO] [EVAL] Class Recall: [0.8144 0.9023 0.9625 0.8486 0.8346 0.8266 0.8625 0.9059 0.6991 0.7763 0.5706 0.7011 0.8559 0.333 0.3338 0.5325 0.6767 0.4785 0.6893 0.4626 0.8604 0.6662 0.7449 0.6751 0.447 0.294 0.5259 0.5356 0.4126 0.5095 0.322 0.6773 0.2923 0.4299 0.6081 0.5819 0.4422 0.6114 0.3459 0.3488 0.1886 0.1568 0.4123 0.2947 0.5371 0.4197 0.4187 0.5161 0.9457 0.7243 0.6775 0.5505 0.2996 0.1968 0.8698 0.6542 0.9391 0.2138 0.6478 0.2679 0.1719 0.2181 0.4206 0.1394 0.6214 0.832 0.3325 0.5478 0.1396 0.3213 0.4569 0.6097 0.5252 0.5911 0.6234 0.4136 0.5461 0.2826 0.13 0.4095 0.8202 0.3087 0.2311 0.0172 0.7784 0.6558 0.0772 0.0839 0.4755 0.6374 0.5895 0.0921 0.2401 0.0573 0.0639 0.0023 0.2022 0.1406 0.2217 0.5511 0.0708 0.018 0.2163 0.8352 0.0103 0.6282 0.1922 0.5467 0.1057 0.5244 0.0964 0.416 0.0707 0.747 0.9893 0.0005 0.3621 0.8974 0.05 0.0194 0.5063 0.0018 0.3042 0.0974 0.2935 0.1374 0.5361 0.465 0.4072 0.2507 0.8737 0.0056 0.2326 0.1715 0.0728 0.1108 0.0911 0.0159 0.1807 0.3274 0.1438 0.0975 0.2752 0.2408 0.439 0. 0.207 0.0231 0.0659 0.0154] 2022-08-24 19:22:15 [INFO] [EVAL] The model with the best validation mIoU (0.3075) was saved at iter 78000. 2022-08-24 19:22:26 [INFO] [TRAIN] epoch: 62, iter: 78050/160000, loss: 0.7253, lr: 0.000620, batch_cost: 0.2071, reader_cost: 0.00847, ips: 38.6377 samples/sec | ETA 04:42:47 2022-08-24 19:22:36 [INFO] [TRAIN] epoch: 62, iter: 78100/160000, loss: 0.7626, lr: 0.000620, batch_cost: 0.2085, reader_cost: 0.01624, ips: 38.3758 samples/sec | ETA 04:44:33 2022-08-24 19:22:47 [INFO] [TRAIN] epoch: 62, iter: 78150/160000, loss: 0.7539, lr: 0.000620, batch_cost: 0.2090, reader_cost: 0.00059, ips: 38.2811 samples/sec | ETA 04:45:05 2022-08-24 19:22:57 [INFO] [TRAIN] epoch: 62, iter: 78200/160000, loss: 0.7873, lr: 0.000619, batch_cost: 0.2030, reader_cost: 0.00038, ips: 39.4159 samples/sec | ETA 04:36:42 2022-08-24 19:23:05 [INFO] [TRAIN] epoch: 62, iter: 78250/160000, loss: 0.7903, lr: 0.000619, batch_cost: 0.1630, reader_cost: 0.00068, ips: 49.0881 samples/sec | ETA 03:42:02 2022-08-24 19:23:14 [INFO] [TRAIN] epoch: 62, iter: 78300/160000, loss: 0.7606, lr: 0.000619, batch_cost: 0.1738, reader_cost: 0.00043, ips: 46.0233 samples/sec | ETA 03:56:41 2022-08-24 19:23:24 [INFO] [TRAIN] epoch: 63, iter: 78350/160000, loss: 0.7341, lr: 0.000618, batch_cost: 0.2139, reader_cost: 0.03163, ips: 37.3937 samples/sec | ETA 04:51:08 2022-08-24 19:23:33 [INFO] [TRAIN] epoch: 63, iter: 78400/160000, loss: 0.7689, lr: 0.000618, batch_cost: 0.1680, reader_cost: 0.00385, ips: 47.6155 samples/sec | ETA 03:48:29 2022-08-24 19:23:43 [INFO] [TRAIN] epoch: 63, iter: 78450/160000, loss: 0.7324, lr: 0.000617, batch_cost: 0.2001, reader_cost: 0.00080, ips: 39.9804 samples/sec | ETA 04:31:57 2022-08-24 19:23:52 [INFO] [TRAIN] epoch: 63, iter: 78500/160000, loss: 0.7836, lr: 0.000617, batch_cost: 0.1845, reader_cost: 0.00059, ips: 43.3645 samples/sec | ETA 04:10:35 2022-08-24 19:24:01 [INFO] [TRAIN] epoch: 63, iter: 78550/160000, loss: 0.7018, lr: 0.000617, batch_cost: 0.1847, reader_cost: 0.00491, ips: 43.3058 samples/sec | ETA 04:10:46 2022-08-24 19:24:10 [INFO] [TRAIN] epoch: 63, iter: 78600/160000, loss: 0.7826, lr: 0.000616, batch_cost: 0.1671, reader_cost: 0.00261, ips: 47.8811 samples/sec | ETA 03:46:40 2022-08-24 19:24:18 [INFO] [TRAIN] epoch: 63, iter: 78650/160000, loss: 0.7720, lr: 0.000616, batch_cost: 0.1608, reader_cost: 0.00483, ips: 49.7544 samples/sec | ETA 03:38:00 2022-08-24 19:24:26 [INFO] [TRAIN] epoch: 63, 
iter: 78700/160000, loss: 0.7689, lr: 0.000616, batch_cost: 0.1749, reader_cost: 0.00061, ips: 45.7429 samples/sec | ETA 03:56:58 2022-08-24 19:24:36 [INFO] [TRAIN] epoch: 63, iter: 78750/160000, loss: 0.7965, lr: 0.000615, batch_cost: 0.1850, reader_cost: 0.00288, ips: 43.2336 samples/sec | ETA 04:10:34 2022-08-24 19:24:47 [INFO] [TRAIN] epoch: 63, iter: 78800/160000, loss: 0.7764, lr: 0.000615, batch_cost: 0.2318, reader_cost: 0.00131, ips: 34.5198 samples/sec | ETA 05:13:38 2022-08-24 19:24:58 [INFO] [TRAIN] epoch: 63, iter: 78850/160000, loss: 0.7732, lr: 0.000614, batch_cost: 0.2125, reader_cost: 0.01262, ips: 37.6397 samples/sec | ETA 04:47:27 2022-08-24 19:25:09 [INFO] [TRAIN] epoch: 63, iter: 78900/160000, loss: 0.7312, lr: 0.000614, batch_cost: 0.2239, reader_cost: 0.00168, ips: 35.7224 samples/sec | ETA 05:02:42 2022-08-24 19:25:21 [INFO] [TRAIN] epoch: 63, iter: 78950/160000, loss: 0.7652, lr: 0.000614, batch_cost: 0.2479, reader_cost: 0.00090, ips: 32.2715 samples/sec | ETA 05:34:52 2022-08-24 19:25:32 [INFO] [TRAIN] epoch: 63, iter: 79000/160000, loss: 0.7650, lr: 0.000613, batch_cost: 0.2061, reader_cost: 0.00368, ips: 38.8212 samples/sec | ETA 04:38:11 2022-08-24 19:25:32 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 222s - batch_cost: 0.2218 - reader cost: 9.2939e-04 2022-08-24 19:29:14 [INFO] [EVAL] #Images: 2000 mIoU: 0.3092 Acc: 0.7427 Kappa: 0.7228 Dice: 0.4347 2022-08-24 19:29:14 [INFO] [EVAL] Class IoU: [0.6505 0.7692 0.9246 0.7062 0.6604 0.739 0.7494 0.7443 0.4787 0.6111 0.4393 0.5274 0.6572 0.2936 0.2119 0.3646 0.5013 0.3936 0.5464 0.3525 0.7158 0.4182 0.5507 0.4001 0.3104 0.35 0.3742 0.3945 0.361 0.3564 0.2429 0.3799 0.1965 0.2898 0.3617 0.3364 0.3477 0.4601 0.2462 0.2349 0.1002 0.0815 0.2931 0.2119 0.2857 0.2766 0.2539 0.4237 0.6771 0.4643 0.4081 0.2399 0.1637 0.1783 0.6717 0.4785 0.8078 0.2602 0.4474 0.2084 0.0966 0.1583 0.2774 0.0548 0.3498 0.6313 0.2204 0.3819 0.0645 0.2954 0.3438 0.3966 0.345 0.2271 0.4237 0.2642 0.4111 0.2053 0.2548 0.3722 0.6338 0.2532 0.1983 0.0139 0.5097 0.4568 0.0616 0.0938 0.3253 0.383 0.3614 0. 0.1709 0.0511 0.0539 0.0094 0.1735 0.1085 0.1052 0.338 0.0455 0.0038 0.1634 0.6447 0.0001 0.575 0.1441 0.4216 0.0821 0.2429 0.0517 0.3272 0.0857 0.4537 0.7061 0. 0.335 0.558 0.0621 0.1159 0.4824 0.0178 0.2877 0.1273 0.2543 0.2024 0.4314 0.3096 0.3648 0.2671 0.5513 0.0048 0.2244 0.1424 0.0877 0.1129 0.0625 0.0131 0.1024 0.3135 0.0523 0.0255 0.2529 0.3516 0.3043 0. 0.2522 0.0313 0.0558 0.0221] 2022-08-24 19:29:14 [INFO] [EVAL] Class Precision: [0.7519 0.8458 0.9601 0.8118 0.7476 0.8437 0.8899 0.8285 0.6055 0.7723 0.6876 0.6057 0.7556 0.4545 0.472 0.5394 0.6547 0.6594 0.684 0.5427 0.8091 0.6211 0.7258 0.6173 0.5464 0.6158 0.5203 0.7321 0.6261 0.6286 0.312 0.5 0.4735 0.4389 0.4829 0.4355 0.5243 0.677 0.4933 0.5033 0.2014 0.2276 0.4755 0.5288 0.4743 0.5118 0.4114 0.6423 0.7475 0.5621 0.5173 0.3047 0.4683 0.6689 0.7172 0.6457 0.8443 0.5717 0.5704 0.3676 0.1316 0.3031 0.3909 0.6233 0.4099 0.7669 0.3828 0.5456 0.2434 0.6813 0.5381 0.6915 0.6353 0.2756 0.7097 0.3734 0.5022 0.561 0.6588 0.61 0.7478 0.5278 0.7794 0.0739 0.6646 0.6706 0.2808 0.256 0.5125 0.6155 0.4878 0. 0.354 0.2883 0.097 0.0379 0.5669 0.3522 0.4955 0.4426 0.4213 0.0065 0.6266 0.8041 0.0009 0.8829 0.4646 0.8718 0.2096 0.3955 0.3879 0.6049 0.3689 0.7451 0.7117 0. 
0.7288 0.6003 0.1141 0.5996 0.7278 0.3447 0.7902 0.5566 0.6815 0.6493 0.7802 0.4465 0.6577 0.4802 0.6318 0.2689 0.4457 0.7397 0.5769 0.2634 0.3892 0.1847 0.3634 0.6436 0.0889 0.0573 0.5016 0.48 0.5431 0. 0.8842 0.1692 0.3451 0.7052] 2022-08-24 19:29:14 [INFO] [EVAL] Class Recall: [0.8284 0.8947 0.9616 0.8444 0.8499 0.8562 0.826 0.8799 0.6956 0.7453 0.5489 0.8032 0.8346 0.4534 0.2777 0.5294 0.6816 0.494 0.731 0.5014 0.8613 0.5615 0.6954 0.5321 0.4182 0.4478 0.5714 0.4611 0.4602 0.4514 0.5233 0.6127 0.2514 0.4603 0.5904 0.5966 0.5079 0.5895 0.3296 0.3057 0.1662 0.1127 0.433 0.2612 0.4182 0.3757 0.3988 0.5545 0.8778 0.7274 0.6591 0.5302 0.2011 0.1956 0.9137 0.6488 0.9492 0.3232 0.6748 0.3248 0.2662 0.249 0.4885 0.0566 0.7046 0.7812 0.3418 0.56 0.0807 0.3428 0.4877 0.4819 0.4302 0.5634 0.5125 0.4747 0.6938 0.2446 0.2935 0.4884 0.8061 0.3274 0.2101 0.0169 0.6862 0.589 0.0731 0.129 0.4711 0.5034 0.5825 0. 0.2483 0.0585 0.1081 0.0123 0.2 0.1356 0.1179 0.5884 0.0486 0.0092 0.1811 0.7648 0.0001 0.6225 0.1727 0.4494 0.1189 0.3864 0.0563 0.4161 0.1005 0.537 0.9888 0. 0.3827 0.8879 0.12 0.1256 0.5886 0.0184 0.3115 0.1417 0.2886 0.2272 0.4911 0.5024 0.4503 0.3757 0.8123 0.0049 0.3114 0.1499 0.0938 0.165 0.0693 0.0139 0.1248 0.3793 0.1126 0.0441 0.3378 0.5678 0.409 0. 0.2608 0.037 0.0624 0.0223] 2022-08-24 19:29:14 [INFO] [EVAL] The model with the best validation mIoU (0.3092) was saved at iter 79000. 2022-08-24 19:29:25 [INFO] [TRAIN] epoch: 63, iter: 79050/160000, loss: 0.8164, lr: 0.000613, batch_cost: 0.2186, reader_cost: 0.00383, ips: 36.5957 samples/sec | ETA 04:54:56 2022-08-24 19:29:36 [INFO] [TRAIN] epoch: 63, iter: 79100/160000, loss: 0.8031, lr: 0.000612, batch_cost: 0.2038, reader_cost: 0.00681, ips: 39.2636 samples/sec | ETA 04:34:43 2022-08-24 19:29:47 [INFO] [TRAIN] epoch: 63, iter: 79150/160000, loss: 0.7598, lr: 0.000612, batch_cost: 0.2353, reader_cost: 0.00043, ips: 34.0040 samples/sec | ETA 05:17:01 2022-08-24 19:29:56 [INFO] [TRAIN] epoch: 63, iter: 79200/160000, loss: 0.7639, lr: 0.000612, batch_cost: 0.1769, reader_cost: 0.00098, ips: 45.2112 samples/sec | ETA 03:58:17 2022-08-24 19:30:05 [INFO] [TRAIN] epoch: 63, iter: 79250/160000, loss: 0.7712, lr: 0.000611, batch_cost: 0.1728, reader_cost: 0.00042, ips: 46.2962 samples/sec | ETA 03:52:33 2022-08-24 19:30:13 [INFO] [TRAIN] epoch: 63, iter: 79300/160000, loss: 0.8011, lr: 0.000611, batch_cost: 0.1587, reader_cost: 0.00052, ips: 50.4234 samples/sec | ETA 03:33:23 2022-08-24 19:30:21 [INFO] [TRAIN] epoch: 63, iter: 79350/160000, loss: 0.8004, lr: 0.000611, batch_cost: 0.1725, reader_cost: 0.00049, ips: 46.3813 samples/sec | ETA 03:51:50 2022-08-24 19:30:30 [INFO] [TRAIN] epoch: 63, iter: 79400/160000, loss: 0.7663, lr: 0.000610, batch_cost: 0.1695, reader_cost: 0.00074, ips: 47.2059 samples/sec | ETA 03:47:39 2022-08-24 19:30:40 [INFO] [TRAIN] epoch: 63, iter: 79450/160000, loss: 0.8416, lr: 0.000610, batch_cost: 0.1993, reader_cost: 0.00061, ips: 40.1339 samples/sec | ETA 04:27:36 2022-08-24 19:30:49 [INFO] [TRAIN] epoch: 63, iter: 79500/160000, loss: 0.8072, lr: 0.000609, batch_cost: 0.1863, reader_cost: 0.00074, ips: 42.9356 samples/sec | ETA 04:09:59 2022-08-24 19:30:57 [INFO] [TRAIN] epoch: 63, iter: 79550/160000, loss: 0.8523, lr: 0.000609, batch_cost: 0.1594, reader_cost: 0.00072, ips: 50.1882 samples/sec | ETA 03:33:43 2022-08-24 19:31:10 [INFO] [TRAIN] epoch: 64, iter: 79600/160000, loss: 0.7827, lr: 0.000609, batch_cost: 0.2476, reader_cost: 0.08868, ips: 32.3165 samples/sec | ETA 05:31:43 2022-08-24 19:31:18 [INFO] 
[TRAIN] epoch: 64, iter: 79650/160000, loss: 0.7268, lr: 0.000608, batch_cost: 0.1692, reader_cost: 0.00905, ips: 47.2853 samples/sec | ETA 03:46:34 2022-08-24 19:31:28 [INFO] [TRAIN] epoch: 64, iter: 79700/160000, loss: 0.7155, lr: 0.000608, batch_cost: 0.1985, reader_cost: 0.00072, ips: 40.2925 samples/sec | ETA 04:25:43 2022-08-24 19:31:40 [INFO] [TRAIN] epoch: 64, iter: 79750/160000, loss: 0.8057, lr: 0.000608, batch_cost: 0.2359, reader_cost: 0.00074, ips: 33.9067 samples/sec | ETA 05:15:34 2022-08-24 19:31:51 [INFO] [TRAIN] epoch: 64, iter: 79800/160000, loss: 0.7377, lr: 0.000607, batch_cost: 0.2222, reader_cost: 0.00096, ips: 36.0022 samples/sec | ETA 04:57:01 2022-08-24 19:32:02 [INFO] [TRAIN] epoch: 64, iter: 79850/160000, loss: 0.7585, lr: 0.000607, batch_cost: 0.2221, reader_cost: 0.00078, ips: 36.0172 samples/sec | ETA 04:56:42 2022-08-24 19:32:12 [INFO] [TRAIN] epoch: 64, iter: 79900/160000, loss: 0.7642, lr: 0.000606, batch_cost: 0.1991, reader_cost: 0.00147, ips: 40.1787 samples/sec | ETA 04:25:48 2022-08-24 19:32:22 [INFO] [TRAIN] epoch: 64, iter: 79950/160000, loss: 0.7737, lr: 0.000606, batch_cost: 0.2002, reader_cost: 0.00040, ips: 39.9630 samples/sec | ETA 04:27:04 2022-08-24 19:32:32 [INFO] [TRAIN] epoch: 64, iter: 80000/160000, loss: 0.8143, lr: 0.000606, batch_cost: 0.2064, reader_cost: 0.00705, ips: 38.7614 samples/sec | ETA 04:35:11 2022-08-24 19:32:32 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 217s - batch_cost: 0.2169 - reader cost: 7.0331e-04 2022-08-24 19:36:09 [INFO] [EVAL] #Images: 2000 mIoU: 0.3048 Acc: 0.7395 Kappa: 0.7200 Dice: 0.4310 2022-08-24 19:36:09 [INFO] [EVAL] Class IoU: [0.6512 0.757 0.9239 0.7 0.6626 0.7308 0.7562 0.7493 0.4748 0.6298 0.4476 0.5277 0.6444 0.2965 0.2281 0.3619 0.4757 0.4198 0.5596 0.3601 0.7073 0.4327 0.5519 0.4569 0.3087 0.3425 0.326 0.385 0.3854 0.3101 0.2068 0.4237 0.2424 0.2685 0.3254 0.3442 0.3417 0.4689 0.2421 0.2592 0.1185 0.0953 0.2997 0.21 0.2757 0.2791 0.2406 0.4099 0.6423 0.4541 0.4182 0.2369 0.2054 0.1649 0.6182 0.4401 0.8237 0.2188 0.3983 0.2008 0.1187 0.1307 0.308 0.0883 0.3919 0.6237 0.2004 0.3707 0.1045 0.2901 0.3348 0.4076 0.3619 0.1936 0.4238 0.2846 0.3444 0.195 0.0874 0.1712 0.5884 0.2639 0.248 0.0144 0.5026 0.4715 0.074 0.0451 0.3266 0.4256 0.3428 0.0214 0.131 0.0894 0.0014 0.0069 0.1496 0.1245 0.1435 0.336 0.0926 0.0061 0.1868 0.4486 0.0001 0.5005 0.1774 0.4234 0.1084 0.2215 0.0562 0.3163 0.0924 0.6244 0.6207 0.005 0.304 0.4851 0.0616 0.0107 0.3929 0.0147 0.3102 0.1457 0.2518 0.1717 0.4125 0.2866 0.3953 0.2187 0.547 0.0084 0.0436 0.2043 0.1188 0.1193 0.068 0.0167 0.1108 0.3212 0.157 0.1692 0.244 0.4528 0.2969 0.0021 0.2096 0.0126 0.0654 0.0387] 2022-08-24 19:36:09 [INFO] [EVAL] Class Precision: [0.7766 0.8303 0.9598 0.8126 0.7823 0.8724 0.889 0.8141 0.5953 0.7505 0.6306 0.6552 0.7236 0.507 0.4445 0.5394 0.587 0.6518 0.7292 0.5139 0.7952 0.6564 0.7411 0.5889 0.4389 0.463 0.4727 0.6214 0.6218 0.511 0.3856 0.5968 0.4393 0.3932 0.4773 0.4201 0.5712 0.7467 0.4283 0.4954 0.1713 0.21 0.5069 0.5437 0.4447 0.4776 0.3526 0.6462 0.6891 0.5626 0.4996 0.2817 0.349 0.5946 0.6553 0.5696 0.8616 0.645 0.5444 0.3447 0.2048 0.2214 0.4278 0.5204 0.5142 0.7455 0.3387 0.6043 0.2307 0.6565 0.5741 0.5695 0.6657 0.2265 0.6955 0.4749 0.4985 0.6452 0.3944 0.5833 0.6735 0.5249 0.7368 0.0621 0.6152 0.6617 0.2104 0.4834 0.4828 0.5823 0.4225 0.0263 0.3121 0.3057 0.0061 0.0311 0.466 0.5088 0.4398 0.4673 0.6735 0.0124 0.6115 0.4986 0.0008 0.6913 0.4305 0.8601 0.2017 0.3039 0.3224 0.5688 0.3148 0.8062 
0.6233 0.0861 0.7619 0.5358 0.0939 0.5296 0.7294 0.3596 0.8159 0.5716 0.5812 0.6561 0.6857 0.3601 0.8368 0.4871 0.6623 0.5339 0.4018 0.5124 0.5465 0.3357 0.3488 0.2117 0.3728 0.6285 0.2076 0.3564 0.4353 0.7238 0.4424 0.0024 0.8898 0.232 0.343 0.6034] 2022-08-24 19:36:09 [INFO] [EVAL] Class Recall: [0.8014 0.8955 0.961 0.8348 0.8124 0.8183 0.835 0.9041 0.701 0.7966 0.6067 0.7307 0.8549 0.4166 0.319 0.5237 0.715 0.5411 0.7064 0.5461 0.8648 0.5593 0.6837 0.6709 0.5099 0.5684 0.5124 0.5029 0.5034 0.441 0.3084 0.5936 0.3511 0.4585 0.5056 0.6556 0.4596 0.5576 0.3578 0.3522 0.2777 0.1485 0.4231 0.2549 0.4204 0.4017 0.4309 0.5285 0.9044 0.702 0.7198 0.598 0.333 0.1858 0.9161 0.6594 0.9492 0.2488 0.5974 0.3247 0.2201 0.2417 0.5237 0.0961 0.6223 0.7925 0.3291 0.4895 0.1604 0.342 0.4455 0.5891 0.4423 0.5713 0.5203 0.4152 0.5269 0.2184 0.1009 0.1951 0.8232 0.3467 0.2721 0.0184 0.7331 0.6213 0.1025 0.0474 0.5025 0.6127 0.6449 0.103 0.1842 0.1122 0.0017 0.0087 0.1806 0.1415 0.1756 0.5447 0.097 0.012 0.212 0.8174 0.0001 0.6445 0.2318 0.4547 0.1899 0.4496 0.0637 0.4161 0.1157 0.7346 0.9934 0.0053 0.336 0.8367 0.1516 0.0108 0.4599 0.0151 0.3335 0.1635 0.3077 0.1887 0.5087 0.5838 0.4283 0.2841 0.7585 0.0085 0.0466 0.2537 0.1318 0.1561 0.078 0.0178 0.1363 0.3966 0.3917 0.2437 0.357 0.5474 0.4744 0.0199 0.2152 0.0132 0.0747 0.0397] 2022-08-24 19:36:09 [INFO] [EVAL] The model with the best validation mIoU (0.3092) was saved at iter 79000. 2022-08-24 19:36:20 [INFO] [TRAIN] epoch: 64, iter: 80050/160000, loss: 0.7971, lr: 0.000605, batch_cost: 0.2117, reader_cost: 0.00395, ips: 37.7858 samples/sec | ETA 04:42:07 2022-08-24 19:36:31 [INFO] [TRAIN] epoch: 64, iter: 80100/160000, loss: 0.7336, lr: 0.000605, batch_cost: 0.2215, reader_cost: 0.00137, ips: 36.1100 samples/sec | ETA 04:55:01 2022-08-24 19:36:42 [INFO] [TRAIN] epoch: 64, iter: 80150/160000, loss: 0.7990, lr: 0.000605, batch_cost: 0.2225, reader_cost: 0.00082, ips: 35.9627 samples/sec | ETA 04:56:02 2022-08-24 19:36:51 [INFO] [TRAIN] epoch: 64, iter: 80200/160000, loss: 0.7518, lr: 0.000604, batch_cost: 0.1732, reader_cost: 0.00037, ips: 46.1887 samples/sec | ETA 03:50:21 2022-08-24 19:37:01 [INFO] [TRAIN] epoch: 64, iter: 80250/160000, loss: 0.7408, lr: 0.000604, batch_cost: 0.2089, reader_cost: 0.00053, ips: 38.2884 samples/sec | ETA 04:37:43 2022-08-24 19:37:09 [INFO] [TRAIN] epoch: 64, iter: 80300/160000, loss: 0.7430, lr: 0.000603, batch_cost: 0.1610, reader_cost: 0.00050, ips: 49.6776 samples/sec | ETA 03:33:54 2022-08-24 19:37:19 [INFO] [TRAIN] epoch: 64, iter: 80350/160000, loss: 0.7663, lr: 0.000603, batch_cost: 0.1859, reader_cost: 0.00383, ips: 43.0229 samples/sec | ETA 04:06:50 2022-08-24 19:37:28 [INFO] [TRAIN] epoch: 64, iter: 80400/160000, loss: 0.7831, lr: 0.000603, batch_cost: 0.1868, reader_cost: 0.00086, ips: 42.8243 samples/sec | ETA 04:07:50 2022-08-24 19:37:36 [INFO] [TRAIN] epoch: 64, iter: 80450/160000, loss: 0.7498, lr: 0.000602, batch_cost: 0.1647, reader_cost: 0.00080, ips: 48.5836 samples/sec | ETA 03:38:19 2022-08-24 19:37:44 [INFO] [TRAIN] epoch: 64, iter: 80500/160000, loss: 0.7192, lr: 0.000602, batch_cost: 0.1631, reader_cost: 0.00051, ips: 49.0423 samples/sec | ETA 03:36:08 2022-08-24 19:37:53 [INFO] [TRAIN] epoch: 64, iter: 80550/160000, loss: 0.8261, lr: 0.000602, batch_cost: 0.1759, reader_cost: 0.00045, ips: 45.4920 samples/sec | ETA 03:52:51 2022-08-24 19:38:04 [INFO] [TRAIN] epoch: 64, iter: 80600/160000, loss: 0.7826, lr: 0.000601, batch_cost: 0.2099, reader_cost: 0.00041, ips: 38.1166 samples/sec | ETA 04:37:44 
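The arithmetic behind the [TRAIN] lines can be checked directly from the logged fields: ips is samples per second, so ips * batch_cost recovers the global batch size (about 8 throughout this run), and the ETA is roughly the remaining iterations times the current batch_cost. The snippet below reproduces the iter-80600 line above; the batch size of 8 is inferred from the numbers, not stated anywhere in this log, and the helpers are a sketch rather than PaddleSeg source.

from datetime import timedelta

# Sketch of the relations inferred from the [TRAIN] lines (not PaddleSeg code).
# batch_cost is seconds per iteration; ips ~= batch_size / batch_cost;
# ETA ~= (total_iters - current_iter) * batch_cost.
total_iters = 160000
batch_size = 8                      # inferred: 0.2099 s/iter * 38.1166 samples/s ~= 8

def ips(batch_cost):
    return batch_size / batch_cost          # effective samples per second

def eta(cur_iter, batch_cost):
    return timedelta(seconds=round((total_iters - cur_iter) * batch_cost))

print(round(ips(0.2099), 4))        # 38.1134 vs the logged "ips: 38.1166" (batch_cost is rounded)
print(eta(80600, 0.2099))           # 4:37:46 vs the logged "ETA 04:37:44"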
2022-08-24 19:38:13 [INFO] [TRAIN] epoch: 64, iter: 80650/160000, loss: 0.7498, lr: 0.000601, batch_cost: 0.1775, reader_cost: 0.00033, ips: 45.0804 samples/sec | ETA 03:54:41 2022-08-24 19:38:21 [INFO] [TRAIN] epoch: 64, iter: 80700/160000, loss: 0.8080, lr: 0.000600, batch_cost: 0.1662, reader_cost: 0.00081, ips: 48.1299 samples/sec | ETA 03:39:40 2022-08-24 19:38:31 [INFO] [TRAIN] epoch: 64, iter: 80750/160000, loss: 0.8007, lr: 0.000600, batch_cost: 0.2108, reader_cost: 0.00045, ips: 37.9491 samples/sec | ETA 04:38:26 2022-08-24 19:38:41 [INFO] [TRAIN] epoch: 64, iter: 80800/160000, loss: 0.8559, lr: 0.000600, batch_cost: 0.1994, reader_cost: 0.00059, ips: 40.1184 samples/sec | ETA 04:23:13 2022-08-24 19:38:53 [INFO] [TRAIN] epoch: 65, iter: 80850/160000, loss: 0.7531, lr: 0.000599, batch_cost: 0.2406, reader_cost: 0.03467, ips: 33.2531 samples/sec | ETA 05:17:21 2022-08-24 19:39:05 [INFO] [TRAIN] epoch: 65, iter: 80900/160000, loss: 0.7484, lr: 0.000599, batch_cost: 0.2277, reader_cost: 0.01155, ips: 35.1338 samples/sec | ETA 05:00:11 2022-08-24 19:39:16 [INFO] [TRAIN] epoch: 65, iter: 80950/160000, loss: 0.8338, lr: 0.000598, batch_cost: 0.2185, reader_cost: 0.00777, ips: 36.6102 samples/sec | ETA 04:47:53 2022-08-24 19:39:26 [INFO] [TRAIN] epoch: 65, iter: 81000/160000, loss: 0.7989, lr: 0.000598, batch_cost: 0.1997, reader_cost: 0.00430, ips: 40.0629 samples/sec | ETA 04:22:55 2022-08-24 19:39:26 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 218s - batch_cost: 0.2179 - reader cost: 0.0011 2022-08-24 19:43:04 [INFO] [EVAL] #Images: 2000 mIoU: 0.3015 Acc: 0.7434 Kappa: 0.7231 Dice: 0.4242 2022-08-24 19:43:04 [INFO] [EVAL] Class IoU: [0.6506 0.7672 0.9241 0.7011 0.66 0.7333 0.7567 0.7462 0.4778 0.6335 0.4557 0.5136 0.6442 0.2827 0.2045 0.3743 0.4504 0.4111 0.5413 0.3631 0.7081 0.4064 0.5497 0.4562 0.3232 0.3602 0.3188 0.38 0.3188 0.2702 0.1854 0.3933 0.2252 0.2602 0.3094 0.3744 0.3439 0.4355 0.2393 0.2394 0.0928 0.0683 0.3066 0.2036 0.2869 0.2559 0.2197 0.4014 0.6587 0.4658 0.4346 0.2656 0.1773 0.2544 0.6377 0.4765 0.8328 0.2893 0.4664 0.1525 0.0558 0.1278 0.3076 0.0388 0.3937 0.6372 0.2164 0.3648 0.0439 0.3237 0.3375 0.4285 0.3737 0.2289 0.4174 0.2692 0.4017 0.2039 0.156 0.4514 0.6029 0.2412 0.2787 0.0145 0.5085 0.4724 0.0467 0.049 0.3301 0.4149 0.3406 0.0039 0.1815 0.0685 0.0359 0.0094 0.1814 0.1254 0.0447 0.3605 0.0082 0.0033 0.1671 0.4729 0.0013 0.5007 0.1599 0.4511 0.1005 0.2727 0.0418 0.1258 0.094 0.6391 0.5706 0.0012 0.3429 0.5499 0.0332 0.0951 0.423 0.0052 0.2693 0.077 0.2669 0.1065 0.3953 0.3234 0.3227 0.2319 0.5427 0.0044 0.0699 0.1597 0.0949 0.1267 0.095 0.0132 0.1203 0.3326 0.092 0.0565 0.2397 0.1461 0.3464 0. 
0.2464 0.0168 0.0444 0.0021] 2022-08-24 19:43:04 [INFO] [EVAL] Class Precision: [0.7406 0.8346 0.9563 0.8011 0.7731 0.8697 0.8498 0.827 0.5989 0.74 0.6868 0.6789 0.7264 0.505 0.4871 0.5226 0.5679 0.658 0.7342 0.5277 0.7838 0.5836 0.6999 0.5937 0.4841 0.6034 0.4428 0.7254 0.7181 0.5778 0.3943 0.5561 0.4373 0.4725 0.5074 0.5255 0.5707 0.7568 0.4657 0.5236 0.1954 0.2181 0.581 0.5813 0.4407 0.4707 0.3858 0.5527 0.7719 0.6062 0.5299 0.3318 0.4693 0.7097 0.6924 0.609 0.8746 0.5778 0.6094 0.4013 0.1186 0.2193 0.4262 0.6034 0.5094 0.806 0.3354 0.6021 0.2464 0.6142 0.5567 0.6318 0.6558 0.2973 0.5638 0.3898 0.5602 0.4344 0.5727 0.6851 0.6989 0.541 0.7399 0.0703 0.6855 0.6835 0.3421 0.4224 0.5474 0.5562 0.4238 0.0052 0.3578 0.3196 0.0907 0.0379 0.6809 0.6277 0.4381 0.6161 0.7691 0.0063 0.503 0.621 0.0212 0.7243 0.4342 0.7062 0.2047 0.4358 0.3234 0.1528 0.3247 0.7846 0.572 0.0414 0.7069 0.5796 0.088 0.5106 0.7193 0.4234 0.6817 0.5931 0.6025 0.6402 0.8224 0.4761 0.508 0.4247 0.6411 0.766 0.4734 0.701 0.5859 0.2715 0.2752 0.1296 0.3813 0.6577 0.148 0.084 0.4389 0.4404 0.6225 0. 0.8732 0.261 0.3742 0.434 ] 2022-08-24 19:43:04 [INFO] [EVAL] Class Recall: [0.8426 0.9048 0.9649 0.8489 0.8185 0.8238 0.8736 0.8841 0.7027 0.8148 0.5752 0.6784 0.8506 0.3911 0.2606 0.5688 0.6851 0.5229 0.6733 0.5379 0.8799 0.5723 0.7192 0.6632 0.4929 0.4719 0.5323 0.4438 0.3644 0.3367 0.2591 0.5732 0.3171 0.3668 0.4423 0.5656 0.4639 0.5064 0.3299 0.3061 0.1503 0.0904 0.3936 0.2386 0.451 0.3593 0.3378 0.5947 0.8179 0.668 0.7072 0.571 0.2218 0.284 0.8899 0.6865 0.9457 0.3669 0.6653 0.1975 0.0954 0.2345 0.525 0.0398 0.6342 0.7527 0.3789 0.4806 0.0507 0.4064 0.4615 0.5711 0.4648 0.4987 0.6164 0.4653 0.5868 0.2777 0.1766 0.5696 0.8144 0.3033 0.309 0.0179 0.6631 0.6047 0.0513 0.0525 0.454 0.6204 0.6345 0.0148 0.2691 0.0803 0.0561 0.0124 0.1983 0.1355 0.0474 0.4649 0.0083 0.007 0.2002 0.6647 0.0013 0.6186 0.202 0.5553 0.165 0.4214 0.0457 0.4161 0.1169 0.7751 0.9956 0.0013 0.3998 0.9148 0.0505 0.1046 0.5066 0.0052 0.308 0.0813 0.324 0.1133 0.4322 0.5022 0.4693 0.338 0.7796 0.0044 0.0758 0.1713 0.1017 0.1919 0.1266 0.0144 0.1495 0.4022 0.1958 0.147 0.3456 0.1794 0.4385 0. 0.2555 0.0176 0.048 0.0021] 2022-08-24 19:43:04 [INFO] [EVAL] The model with the best validation mIoU (0.3092) was saved at iter 79000. 
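For post-processing a log in this format, the evaluation summaries are easy to recover mechanically. The helper below is hypothetical (not part of PaddleSeg or this run's scripts); its regular expression matches the "[EVAL] #Images: ... mIoU: ... Acc: ... Kappa: ... Dice: ..." lines exactly as they appear above, which is enough to track mIoU across evaluations.

import re

# Hypothetical log-parsing helper, assuming only the "[EVAL] #Images: ..." format shown above.
EVAL_RE = re.compile(
    r"\[EVAL\] #Images: (\d+) mIoU: ([\d.]+) Acc: ([\d.]+) Kappa: ([\d.]+) Dice: ([\d.]+)"
)

def parse_eval_summaries(log_text):
    """Return one dict per evaluation, in the order they occur in the log."""
    return [
        {
            "images": int(m.group(1)),
            "miou": float(m.group(2)),
            "acc": float(m.group(3)),
            "kappa": float(m.group(4)),
            "dice": float(m.group(5)),
        }
        for m in EVAL_RE.finditer(log_text)
    ]

snippet = ("2022-08-24 19:43:04 [INFO] [EVAL] #Images: 2000 "
           "mIoU: 0.3015 Acc: 0.7434 Kappa: 0.7231 Dice: 0.4242")
print(parse_eval_summaries(snippet))  # [{'images': 2000, 'miou': 0.3015, ...}]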
2022-08-24 19:43:15 [INFO] [TRAIN] epoch: 65, iter: 81050/160000, loss: 0.7624, lr: 0.000598, batch_cost: 0.2153, reader_cost: 0.01058, ips: 37.1523 samples/sec | ETA 04:43:20 2022-08-24 19:43:25 [INFO] [TRAIN] epoch: 65, iter: 81100/160000, loss: 0.7958, lr: 0.000597, batch_cost: 0.2086, reader_cost: 0.00350, ips: 38.3506 samples/sec | ETA 04:34:18 2022-08-24 19:43:35 [INFO] [TRAIN] epoch: 65, iter: 81150/160000, loss: 0.7135, lr: 0.000597, batch_cost: 0.2002, reader_cost: 0.00443, ips: 39.9698 samples/sec | ETA 04:23:01 2022-08-24 19:43:45 [INFO] [TRAIN] epoch: 65, iter: 81200/160000, loss: 0.7510, lr: 0.000597, batch_cost: 0.1962, reader_cost: 0.00080, ips: 40.7684 samples/sec | ETA 04:17:42 2022-08-24 19:43:54 [INFO] [TRAIN] epoch: 65, iter: 81250/160000, loss: 0.7981, lr: 0.000596, batch_cost: 0.1668, reader_cost: 0.00080, ips: 47.9559 samples/sec | ETA 03:38:57 2022-08-24 19:44:03 [INFO] [TRAIN] epoch: 65, iter: 81300/160000, loss: 0.7675, lr: 0.000596, batch_cost: 0.1937, reader_cost: 0.00046, ips: 41.2905 samples/sec | ETA 04:14:08 2022-08-24 19:44:13 [INFO] [TRAIN] epoch: 65, iter: 81350/160000, loss: 0.7923, lr: 0.000595, batch_cost: 0.1861, reader_cost: 0.00041, ips: 42.9976 samples/sec | ETA 04:03:53 2022-08-24 19:44:22 [INFO] [TRAIN] epoch: 65, iter: 81400/160000, loss: 0.8023, lr: 0.000595, batch_cost: 0.1851, reader_cost: 0.00083, ips: 43.2083 samples/sec | ETA 04:02:32 2022-08-24 19:44:31 [INFO] [TRAIN] epoch: 65, iter: 81450/160000, loss: 0.7736, lr: 0.000595, batch_cost: 0.1854, reader_cost: 0.00037, ips: 43.1568 samples/sec | ETA 04:02:40 2022-08-24 19:44:39 [INFO] [TRAIN] epoch: 65, iter: 81500/160000, loss: 0.8055, lr: 0.000594, batch_cost: 0.1639, reader_cost: 0.00630, ips: 48.8024 samples/sec | ETA 03:34:28 2022-08-24 19:44:47 [INFO] [TRAIN] epoch: 65, iter: 81550/160000, loss: 0.8301, lr: 0.000594, batch_cost: 0.1636, reader_cost: 0.00081, ips: 48.8880 samples/sec | ETA 03:33:57 2022-08-24 19:44:58 [INFO] [TRAIN] epoch: 65, iter: 81600/160000, loss: 0.8332, lr: 0.000594, batch_cost: 0.2011, reader_cost: 0.00047, ips: 39.7908 samples/sec | ETA 04:22:42 2022-08-24 19:45:06 [INFO] [TRAIN] epoch: 65, iter: 81650/160000, loss: 0.7503, lr: 0.000593, batch_cost: 0.1734, reader_cost: 0.00043, ips: 46.1382 samples/sec | ETA 03:46:25 2022-08-24 19:45:15 [INFO] [TRAIN] epoch: 65, iter: 81700/160000, loss: 0.7010, lr: 0.000593, batch_cost: 0.1841, reader_cost: 0.00049, ips: 43.4545 samples/sec | ETA 04:00:15 2022-08-24 19:45:26 [INFO] [TRAIN] epoch: 65, iter: 81750/160000, loss: 0.7951, lr: 0.000592, batch_cost: 0.2031, reader_cost: 0.00492, ips: 39.3798 samples/sec | ETA 04:24:56 2022-08-24 19:45:36 [INFO] [TRAIN] epoch: 65, iter: 81800/160000, loss: 0.7104, lr: 0.000592, batch_cost: 0.2147, reader_cost: 0.01594, ips: 37.2685 samples/sec | ETA 04:39:46 2022-08-24 19:45:48 [INFO] [TRAIN] epoch: 65, iter: 81850/160000, loss: 0.7950, lr: 0.000592, batch_cost: 0.2240, reader_cost: 0.00064, ips: 35.7111 samples/sec | ETA 04:51:47 2022-08-24 19:45:57 [INFO] [TRAIN] epoch: 65, iter: 81900/160000, loss: 0.7278, lr: 0.000591, batch_cost: 0.1959, reader_cost: 0.00187, ips: 40.8331 samples/sec | ETA 04:15:01 2022-08-24 19:46:07 [INFO] [TRAIN] epoch: 65, iter: 81950/160000, loss: 0.7905, lr: 0.000591, batch_cost: 0.1992, reader_cost: 0.00045, ips: 40.1701 samples/sec | ETA 04:19:03 2022-08-24 19:46:16 [INFO] [TRAIN] epoch: 65, iter: 82000/160000, loss: 0.7954, lr: 0.000591, batch_cost: 0.1830, reader_cost: 0.00123, ips: 43.7263 samples/sec | ETA 03:57:50 2022-08-24 19:46:16 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 213s - batch_cost: 0.2126 - reader cost: 7.8207e-04 2022-08-24 19:49:49 [INFO] [EVAL] #Images: 2000 mIoU: 0.3069 Acc: 0.7419 Kappa: 0.7223 Dice: 0.4323 2022-08-24 19:49:49 [INFO] [EVAL] Class IoU: [0.6519 0.7673 0.9249 0.6981 0.6592 0.7345 0.7516 0.7487 0.4757 0.5849 0.4471 0.5252 0.6517 0.2851 0.2143 0.3742 0.4743 0.438 0.5474 0.3506 0.7106 0.4275 0.5639 0.4289 0.3153 0.3621 0.3528 0.3896 0.3477 0.2605 0.2498 0.427 0.2234 0.2827 0.3729 0.344 0.3435 0.4284 0.238 0.2793 0.1125 0.0839 0.2945 0.2021 0.293 0.2913 0.2445 0.4007 0.6116 0.4768 0.4362 0.2854 0.2279 0.2257 0.6688 0.4347 0.8427 0.2453 0.4846 0.2183 0.0569 0.17 0.2957 0.1189 0.3757 0.6362 0.2167 0.378 0.0279 0.3258 0.3315 0.3956 0.3737 0.2241 0.4024 0.2767 0.357 0.1964 0.311 0.1612 0.628 0.2665 0.1865 0.0151 0.4709 0.4744 0.0621 0.0537 0.2511 0.44 0.3933 0.0045 0.183 0.0694 0.0239 0.0053 0.1757 0.1128 0.095 0.3702 0.0677 0.0034 0.079 0.527 0.0078 0.3736 0.1726 0.4672 0.0721 0.2865 0.0463 0.1061 0.0938 0.6278 0.6755 0.0002 0.3972 0.5218 0.0716 0.0923 0.4275 0.0151 0.2761 0.1092 0.2509 0.2241 0.4373 0.32 0.3684 0.2344 0.5406 0.0093 0.0828 0.1964 0.0908 0.1274 0.06 0.0152 0.12 0.3292 0.1346 0.0967 0.2769 0.3188 0.3123 0. 0.2243 0.019 0.0393 0.0437] 2022-08-24 19:49:49 [INFO] [EVAL] Class Precision: [0.7694 0.8453 0.9567 0.7886 0.7722 0.8331 0.8904 0.8214 0.6007 0.7599 0.6471 0.6325 0.7295 0.4889 0.4593 0.5289 0.5914 0.6272 0.6909 0.5729 0.7988 0.6073 0.7349 0.6342 0.5437 0.5462 0.474 0.6393 0.6492 0.3951 0.3647 0.5991 0.4218 0.4259 0.517 0.4902 0.5648 0.7246 0.399 0.4727 0.1937 0.2457 0.4847 0.5461 0.4551 0.4747 0.4187 0.5602 0.6796 0.5885 0.5355 0.3633 0.43 0.5225 0.7173 0.5621 0.9064 0.602 0.6056 0.4469 0.0951 0.3712 0.5281 0.5605 0.4653 0.8334 0.3777 0.5359 0.14 0.5565 0.5509 0.6221 0.5761 0.2698 0.6323 0.4477 0.5169 0.546 0.6615 0.6203 0.7385 0.5732 0.7793 0.065 0.6846 0.607 0.1944 0.3847 0.3377 0.6692 0.5564 0.006 0.3441 0.2889 0.0853 0.0207 0.7005 0.4629 0.5877 0.488 0.5048 0.0055 0.654 0.7452 0.0684 0.4641 0.4143 0.7813 0.1967 0.3885 0.3182 0.1247 0.3669 0.8084 0.6797 0.013 0.6843 0.574 0.1159 0.5702 0.6848 0.506 0.5881 0.6196 0.6081 0.6579 0.7397 0.4414 0.6824 0.3822 0.6551 0.4634 0.3405 0.6882 0.6203 0.2669 0.3315 0.2235 0.3136 0.626 0.4766 0.1426 0.4103 0.4709 0.6836 0. 0.9363 0.3243 0.3478 0.675 ] 2022-08-24 19:49:49 [INFO] [EVAL] Class Recall: [0.8102 0.8927 0.9653 0.8588 0.8183 0.8612 0.8282 0.8942 0.6955 0.7175 0.5912 0.7559 0.8593 0.4062 0.2866 0.5611 0.7055 0.5922 0.7249 0.4746 0.8655 0.5909 0.7079 0.5699 0.4288 0.5179 0.5797 0.4993 0.4282 0.4333 0.4425 0.5977 0.3219 0.4568 0.5724 0.5357 0.4671 0.5118 0.3711 0.4058 0.2118 0.113 0.4287 0.2429 0.4513 0.4298 0.3702 0.5846 0.8595 0.7153 0.7016 0.5709 0.3266 0.2844 0.9081 0.6573 0.923 0.2928 0.7081 0.2991 0.1241 0.2387 0.4019 0.1311 0.661 0.7289 0.337 0.562 0.0337 0.44 0.4542 0.5207 0.5154 0.5694 0.5253 0.42 0.5359 0.2347 0.3699 0.1789 0.8076 0.3325 0.1969 0.0193 0.6014 0.6847 0.0836 0.0587 0.4949 0.5623 0.573 0.0168 0.2809 0.0837 0.0322 0.007 0.19 0.1297 0.1018 0.6052 0.0725 0.0087 0.0825 0.6429 0.0087 0.6571 0.2283 0.5375 0.1023 0.522 0.0514 0.4161 0.112 0.7375 0.991 0.0002 0.4863 0.8516 0.1575 0.0991 0.5322 0.0153 0.3422 0.1171 0.2993 0.2537 0.5169 0.538 0.4446 0.3775 0.7557 0.0094 0.0986 0.2156 0.0962 0.1959 0.0682 0.016 0.1627 0.4098 0.158 0.2313 0.4599 0.4967 0.3651 0. 0.2278 0.0198 0.0424 0.0447] 2022-08-24 19:49:49 [INFO] [EVAL] The model with the best validation mIoU (0.3092) was saved at iter 79000. 
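The three per-class arrays printed by each evaluation are not independent: IoU, precision P, and recall R for a class all derive from the same TP/FP/FN counts, so IoU = 1 / (1/P + 1/R - 1). This is the standard identity, not anything PaddleSeg-specific, and it can be checked against the numbers above, e.g. class 0 of the iter-82000 evaluation:

# Standard identity relating the printed arrays: IoU = TP/(TP+FP+FN),
# P = TP/(TP+FP), R = TP/(TP+FN)  =>  1/IoU = 1/P + 1/R - 1.
# Class 0 of the evaluation above: IoU 0.6519, Precision 0.7694, Recall 0.8102.
P, R = 0.7694, 0.8102
iou = 1.0 / (1.0 / P + 1.0 / R - 1.0)
print(round(iou, 4))   # 0.6519, matching the printed Class IoU for class 0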
2022-08-24 19:50:00 [INFO] [TRAIN] epoch: 65, iter: 82050/160000, loss: 0.8238, lr: 0.000590, batch_cost: 0.2039, reader_cost: 0.00283, ips: 39.2395 samples/sec | ETA 04:24:52 2022-08-24 19:50:13 [INFO] [TRAIN] epoch: 66, iter: 82100/160000, loss: 0.7565, lr: 0.000590, batch_cost: 0.2598, reader_cost: 0.04171, ips: 30.7873 samples/sec | ETA 05:37:22 2022-08-24 19:50:23 [INFO] [TRAIN] epoch: 66, iter: 82150/160000, loss: 0.7017, lr: 0.000589, batch_cost: 0.2073, reader_cost: 0.00778, ips: 38.5841 samples/sec | ETA 04:29:01 2022-08-24 19:50:33 [INFO] [TRAIN] epoch: 66, iter: 82200/160000, loss: 0.7728, lr: 0.000589, batch_cost: 0.1952, reader_cost: 0.00092, ips: 40.9836 samples/sec | ETA 04:13:06 2022-08-24 19:50:42 [INFO] [TRAIN] epoch: 66, iter: 82250/160000, loss: 0.7252, lr: 0.000589, batch_cost: 0.1896, reader_cost: 0.00384, ips: 42.1954 samples/sec | ETA 04:05:40 2022-08-24 19:50:50 [INFO] [TRAIN] epoch: 66, iter: 82300/160000, loss: 0.7645, lr: 0.000588, batch_cost: 0.1631, reader_cost: 0.00031, ips: 49.0458 samples/sec | ETA 03:31:13 2022-08-24 19:50:58 [INFO] [TRAIN] epoch: 66, iter: 82350/160000, loss: 0.7860, lr: 0.000588, batch_cost: 0.1527, reader_cost: 0.00036, ips: 52.3878 samples/sec | ETA 03:17:37 2022-08-24 19:51:06 [INFO] [TRAIN] epoch: 66, iter: 82400/160000, loss: 0.8046, lr: 0.000588, batch_cost: 0.1591, reader_cost: 0.00332, ips: 50.2757 samples/sec | ETA 03:25:47 2022-08-24 19:51:14 [INFO] [TRAIN] epoch: 66, iter: 82450/160000, loss: 0.7320, lr: 0.000587, batch_cost: 0.1521, reader_cost: 0.00135, ips: 52.6034 samples/sec | ETA 03:16:33 2022-08-24 19:51:22 [INFO] [TRAIN] epoch: 66, iter: 82500/160000, loss: 0.7833, lr: 0.000587, batch_cost: 0.1711, reader_cost: 0.00042, ips: 46.7533 samples/sec | ETA 03:41:01 2022-08-24 19:51:31 [INFO] [TRAIN] epoch: 66, iter: 82550/160000, loss: 0.7903, lr: 0.000586, batch_cost: 0.1786, reader_cost: 0.00030, ips: 44.7914 samples/sec | ETA 03:50:33 2022-08-24 19:51:39 [INFO] [TRAIN] epoch: 66, iter: 82600/160000, loss: 0.7697, lr: 0.000586, batch_cost: 0.1676, reader_cost: 0.00035, ips: 47.7214 samples/sec | ETA 03:36:15 2022-08-24 19:51:47 [INFO] [TRAIN] epoch: 66, iter: 82650/160000, loss: 0.7886, lr: 0.000586, batch_cost: 0.1530, reader_cost: 0.00049, ips: 52.3006 samples/sec | ETA 03:17:11 2022-08-24 19:51:55 [INFO] [TRAIN] epoch: 66, iter: 82700/160000, loss: 0.8215, lr: 0.000585, batch_cost: 0.1655, reader_cost: 0.00038, ips: 48.3468 samples/sec | ETA 03:33:10 2022-08-24 19:52:04 [INFO] [TRAIN] epoch: 66, iter: 82750/160000, loss: 0.7385, lr: 0.000585, batch_cost: 0.1826, reader_cost: 0.00391, ips: 43.8130 samples/sec | ETA 03:55:05 2022-08-24 19:52:16 [INFO] [TRAIN] epoch: 66, iter: 82800/160000, loss: 0.7245, lr: 0.000584, batch_cost: 0.2283, reader_cost: 0.00067, ips: 35.0490 samples/sec | ETA 04:53:41 2022-08-24 19:52:26 [INFO] [TRAIN] epoch: 66, iter: 82850/160000, loss: 0.7313, lr: 0.000584, batch_cost: 0.2120, reader_cost: 0.00276, ips: 37.7318 samples/sec | ETA 04:32:37 2022-08-24 19:52:37 [INFO] [TRAIN] epoch: 66, iter: 82900/160000, loss: 0.7573, lr: 0.000584, batch_cost: 0.2040, reader_cost: 0.00830, ips: 39.2189 samples/sec | ETA 04:22:07 2022-08-24 19:52:47 [INFO] [TRAIN] epoch: 66, iter: 82950/160000, loss: 0.7849, lr: 0.000583, batch_cost: 0.2030, reader_cost: 0.02701, ips: 39.4127 samples/sec | ETA 04:20:39 2022-08-24 19:52:58 [INFO] [TRAIN] epoch: 66, iter: 83000/160000, loss: 0.7217, lr: 0.000583, batch_cost: 0.2247, reader_cost: 0.00063, ips: 35.6014 samples/sec | ETA 04:48:22 2022-08-24 19:52:58 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 220s - batch_cost: 0.2203 - reader cost: 8.9157e-04 2022-08-24 19:56:39 [INFO] [EVAL] #Images: 2000 mIoU: 0.3080 Acc: 0.7429 Kappa: 0.7230 Dice: 0.4335 2022-08-24 19:56:39 [INFO] [EVAL] Class IoU: [0.6537 0.765 0.9233 0.7011 0.6531 0.7333 0.7486 0.7528 0.4785 0.6169 0.4442 0.5282 0.6463 0.2571 0.1961 0.3784 0.4752 0.4005 0.5495 0.3681 0.7109 0.4324 0.5461 0.4438 0.3249 0.3393 0.3348 0.3988 0.347 0.2919 0.2404 0.3728 0.181 0.2772 0.3675 0.3624 0.3545 0.4198 0.248 0.2499 0.0994 0.0628 0.297 0.1972 0.2966 0.2661 0.2272 0.4101 0.6707 0.4709 0.4571 0.2319 0.2128 0.2257 0.6218 0.426 0.8129 0.2463 0.4901 0.207 0.0917 0.1541 0.2819 0.167 0.3565 0.659 0.2144 0.3657 0.0739 0.3281 0.3402 0.3927 0.3543 0.2144 0.4001 0.2831 0.3391 0.1951 0.1614 0.3326 0.5863 0.2628 0.2278 0.0127 0.4572 0.4486 0.0643 0.0484 0.2614 0.4078 0.2858 0.0068 0.2165 0.0901 0.0143 0.0077 0.1893 0.1258 0.1692 0.3834 0.0128 0.0085 0.1355 0.6291 0.0043 0.553 0.1797 0.4565 0.0396 0.3007 0.042 0.2449 0.0899 0.631 0.6803 0.004 0.3681 0.5672 0.0425 0.0478 0.4277 0.0115 0.28 0.1118 0.2618 0.1835 0.4369 0.3056 0.3462 0.227 0.5503 0.0112 0.1086 0.2226 0.1198 0.1162 0.0778 0.0167 0.1204 0.3062 0.0961 0.0373 0.2329 0.4271 0.3007 0. 0.2585 0.0225 0.094 0.036 ] 2022-08-24 19:56:39 [INFO] [EVAL] Class Precision: [0.756 0.8432 0.9557 0.7954 0.7311 0.8267 0.8955 0.826 0.6311 0.7536 0.6944 0.6325 0.731 0.5119 0.4839 0.5628 0.605 0.6319 0.7239 0.5302 0.7934 0.6062 0.7084 0.6123 0.5272 0.558 0.5032 0.6583 0.657 0.4289 0.3947 0.6251 0.4808 0.4077 0.5536 0.4856 0.5436 0.7086 0.45 0.5011 0.1891 0.303 0.5012 0.6044 0.3832 0.4512 0.3383 0.6552 0.7459 0.573 0.6046 0.2796 0.3562 0.5617 0.7041 0.6342 0.8507 0.6145 0.6745 0.3751 0.1581 0.2988 0.3939 0.5796 0.42 0.7871 0.361 0.5005 0.2124 0.5823 0.506 0.5338 0.6194 0.2655 0.5916 0.5671 0.5498 0.4985 0.7393 0.51 0.6765 0.5476 0.7638 0.0619 0.5196 0.6641 0.2507 0.3937 0.3584 0.5735 0.3351 0.0088 0.3825 0.3281 0.074 0.0253 0.6149 0.452 0.5756 0.5689 0.1683 0.0169 0.5928 0.7691 0.0698 0.7787 0.5012 0.874 0.1891 0.47 0.288 0.3732 0.5243 0.7774 0.6831 0.1198 0.6979 0.6014 0.0878 0.5482 0.787 0.3116 0.7688 0.6183 0.5529 0.702 0.8335 0.436 0.5656 0.4139 0.6539 0.5264 0.4568 0.6598 0.6343 0.2178 0.288 0.1506 0.345 0.6812 0.1929 0.1507 0.4548 0.5938 0.6276 0. 0.8883 0.2131 0.4144 0.8699] 2022-08-24 19:56:39 [INFO] [EVAL] Class Recall: [0.8285 0.8918 0.9646 0.8553 0.8595 0.8665 0.8202 0.8947 0.6643 0.7728 0.552 0.762 0.8479 0.3405 0.248 0.5359 0.6889 0.5224 0.6953 0.5462 0.8723 0.6014 0.7044 0.6172 0.4584 0.464 0.5 0.5029 0.4237 0.4775 0.3809 0.4802 0.2249 0.4642 0.5223 0.5882 0.5047 0.5074 0.356 0.3326 0.1733 0.0734 0.4216 0.2265 0.5674 0.3934 0.4089 0.523 0.8694 0.7255 0.652 0.5761 0.3457 0.2739 0.8418 0.5648 0.9481 0.2913 0.642 0.3159 0.1791 0.2415 0.4979 0.1901 0.7022 0.8019 0.3455 0.5758 0.1018 0.4291 0.5094 0.5976 0.4528 0.527 0.5529 0.3612 0.4693 0.2427 0.1711 0.4887 0.8148 0.3357 0.2451 0.0157 0.7918 0.5803 0.0795 0.0523 0.4912 0.5852 0.6602 0.0294 0.3328 0.1106 0.0174 0.0109 0.2148 0.1484 0.1933 0.5404 0.0137 0.017 0.1495 0.7756 0.0045 0.6562 0.2188 0.4887 0.0478 0.4549 0.0469 0.4161 0.0979 0.7702 0.9939 0.0041 0.4379 0.9089 0.076 0.0497 0.4837 0.0118 0.3057 0.1201 0.3321 0.199 0.4787 0.5053 0.4715 0.3344 0.7765 0.0114 0.1247 0.2515 0.1287 0.1993 0.0963 0.0184 0.1561 0.3575 0.1608 0.0473 0.3231 0.6033 0.3661 0. 
0.2672 0.0246 0.1084 0.0362] 2022-08-24 19:56:39 [INFO] [EVAL] The model with the best validation mIoU (0.3092) was saved at iter 79000. 2022-08-24 19:56:49 [INFO] [TRAIN] epoch: 66, iter: 83050/160000, loss: 0.7538, lr: 0.000583, batch_cost: 0.2002, reader_cost: 0.00249, ips: 39.9551 samples/sec | ETA 04:16:47 2022-08-24 19:57:00 [INFO] [TRAIN] epoch: 66, iter: 83100/160000, loss: 0.7710, lr: 0.000582, batch_cost: 0.2236, reader_cost: 0.00165, ips: 35.7745 samples/sec | ETA 04:46:36 2022-08-24 19:57:11 [INFO] [TRAIN] epoch: 66, iter: 83150/160000, loss: 0.7452, lr: 0.000582, batch_cost: 0.2282, reader_cost: 0.00095, ips: 35.0553 samples/sec | ETA 04:52:18 2022-08-24 19:57:21 [INFO] [TRAIN] epoch: 66, iter: 83200/160000, loss: 0.6992, lr: 0.000581, batch_cost: 0.1941, reader_cost: 0.01002, ips: 41.2131 samples/sec | ETA 04:08:27 2022-08-24 19:57:32 [INFO] [TRAIN] epoch: 66, iter: 83250/160000, loss: 0.7524, lr: 0.000581, batch_cost: 0.2210, reader_cost: 0.00069, ips: 36.2068 samples/sec | ETA 04:42:38 2022-08-24 19:57:43 [INFO] [TRAIN] epoch: 66, iter: 83300/160000, loss: 0.7686, lr: 0.000581, batch_cost: 0.2093, reader_cost: 0.00185, ips: 38.2174 samples/sec | ETA 04:27:35 2022-08-24 19:57:52 [INFO] [TRAIN] epoch: 66, iter: 83350/160000, loss: 0.7604, lr: 0.000580, batch_cost: 0.1950, reader_cost: 0.00054, ips: 41.0157 samples/sec | ETA 04:09:10 2022-08-24 19:58:02 [INFO] [TRAIN] epoch: 67, iter: 83400/160000, loss: 0.8155, lr: 0.000580, batch_cost: 0.1929, reader_cost: 0.02878, ips: 41.4658 samples/sec | ETA 04:06:18 2022-08-24 19:58:10 [INFO] [TRAIN] epoch: 67, iter: 83450/160000, loss: 0.7373, lr: 0.000580, batch_cost: 0.1646, reader_cost: 0.00617, ips: 48.6022 samples/sec | ETA 03:30:00 2022-08-24 19:58:20 [INFO] [TRAIN] epoch: 67, iter: 83500/160000, loss: 0.7874, lr: 0.000579, batch_cost: 0.1868, reader_cost: 0.00052, ips: 42.8295 samples/sec | ETA 03:58:09 2022-08-24 19:58:29 [INFO] [TRAIN] epoch: 67, iter: 83550/160000, loss: 0.7841, lr: 0.000579, batch_cost: 0.1889, reader_cost: 0.00042, ips: 42.3440 samples/sec | ETA 04:00:43 2022-08-24 19:58:39 [INFO] [TRAIN] epoch: 67, iter: 83600/160000, loss: 0.7716, lr: 0.000578, batch_cost: 0.1998, reader_cost: 0.00053, ips: 40.0460 samples/sec | ETA 04:14:22 2022-08-24 19:58:48 [INFO] [TRAIN] epoch: 67, iter: 83650/160000, loss: 0.7862, lr: 0.000578, batch_cost: 0.1720, reader_cost: 0.00080, ips: 46.5074 samples/sec | ETA 03:38:53 2022-08-24 19:58:58 [INFO] [TRAIN] epoch: 67, iter: 83700/160000, loss: 0.7312, lr: 0.000578, batch_cost: 0.2124, reader_cost: 0.00074, ips: 37.6650 samples/sec | ETA 04:30:06 2022-08-24 19:59:09 [INFO] [TRAIN] epoch: 67, iter: 83750/160000, loss: 0.7867, lr: 0.000577, batch_cost: 0.2135, reader_cost: 0.00070, ips: 37.4772 samples/sec | ETA 04:31:16 2022-08-24 19:59:19 [INFO] [TRAIN] epoch: 67, iter: 83800/160000, loss: 0.7302, lr: 0.000577, batch_cost: 0.2068, reader_cost: 0.00867, ips: 38.6811 samples/sec | ETA 04:22:39 2022-08-24 19:59:29 [INFO] [TRAIN] epoch: 67, iter: 83850/160000, loss: 0.7460, lr: 0.000577, batch_cost: 0.1989, reader_cost: 0.00339, ips: 40.2121 samples/sec | ETA 04:12:29 2022-08-24 19:59:40 [INFO] [TRAIN] epoch: 67, iter: 83900/160000, loss: 0.7723, lr: 0.000576, batch_cost: 0.2041, reader_cost: 0.01116, ips: 39.1883 samples/sec | ETA 04:18:55 2022-08-24 19:59:50 [INFO] [TRAIN] epoch: 67, iter: 83950/160000, loss: 0.8048, lr: 0.000576, batch_cost: 0.2051, reader_cost: 0.00090, ips: 39.0034 samples/sec | ETA 04:19:58 2022-08-24 19:59:59 [INFO] [TRAIN] epoch: 67, iter: 84000/160000, loss: 
0.7529, lr: 0.000575, batch_cost: 0.1943, reader_cost: 0.00589, ips: 41.1816 samples/sec | ETA 04:06:03 2022-08-24 19:59:59 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 206s - batch_cost: 0.2057 - reader cost: 9.3461e-04 2022-08-24 20:03:26 [INFO] [EVAL] #Images: 2000 mIoU: 0.3081 Acc: 0.7428 Kappa: 0.7231 Dice: 0.4334 2022-08-24 20:03:26 [INFO] [EVAL] Class IoU: [0.6515 0.76 0.9245 0.7043 0.6644 0.7321 0.7579 0.7462 0.4837 0.6112 0.4473 0.5156 0.6396 0.2944 0.2025 0.3728 0.4835 0.3997 0.5551 0.3649 0.7121 0.4101 0.5489 0.4435 0.3249 0.397 0.3516 0.3932 0.3498 0.3197 0.2181 0.3801 0.2716 0.2777 0.3728 0.3685 0.3512 0.4587 0.2608 0.2705 0.0926 0.0709 0.2896 0.2062 0.3026 0.2942 0.2414 0.4199 0.6337 0.4628 0.4504 0.2124 0.1372 0.2074 0.6194 0.4812 0.8185 0.2693 0.4615 0.2023 0.0316 0.1395 0.2927 0.0909 0.357 0.6582 0.1624 0.372 0.1014 0.3165 0.3323 0.4176 0.373 0.2236 0.409 0.2879 0.3466 0.2106 0.2164 0.3249 0.6439 0.2643 0.2253 0.0128 0.4607 0.4741 0.0935 0.0499 0.3424 0.432 0.365 0.0106 0.1978 0.083 0.0248 0.0122 0.1873 0.1302 0.155 0.3406 0.0418 0.0045 0.0948 0.4529 0.0026 0.51 0.2142 0.4496 0.0734 0.2769 0.0528 0.2242 0.0882 0.6165 0.6022 0.0009 0.4144 0.5241 0.0222 0.0573 0.464 0.0048 0.2938 0.1025 0.2384 0.1375 0.3955 0.307 0.3168 0.2557 0.5787 0.0135 0.0552 0.3011 0.0711 0.0866 0.063 0.0146 0.1042 0.3475 0.058 0.1598 0.2477 0.4105 0.2852 0.0001 0.2334 0.0127 0.059 0.0355] 2022-08-24 20:03:26 [INFO] [EVAL] Class Precision: [0.7589 0.8358 0.9606 0.8079 0.7861 0.8388 0.8771 0.8181 0.6146 0.7431 0.681 0.6413 0.7149 0.4622 0.5022 0.5177 0.6241 0.6733 0.7364 0.5345 0.7979 0.5877 0.7113 0.5741 0.5168 0.5398 0.4643 0.6553 0.6612 0.4834 0.3738 0.531 0.496 0.3992 0.552 0.5064 0.5838 0.7533 0.4905 0.4627 0.1847 0.19 0.4226 0.5427 0.4238 0.556 0.3757 0.6433 0.711 0.5547 0.6559 0.2501 0.3998 0.5237 0.6523 0.6677 0.8683 0.6336 0.7205 0.4104 0.0559 0.3721 0.518 0.6021 0.4202 0.7645 0.3291 0.4933 0.1985 0.5523 0.5467 0.6004 0.5954 0.3039 0.5355 0.4866 0.5046 0.7072 0.5131 0.7185 0.815 0.4636 0.7774 0.0567 0.5936 0.6603 0.2363 0.4018 0.5483 0.6621 0.4858 0.0148 0.3081 0.3154 0.1177 0.0789 0.6207 0.5107 0.5341 0.6073 0.7809 0.0105 0.4695 0.5601 0.0679 0.7506 0.5261 0.7923 0.2384 0.4618 0.3491 0.3272 0.3577 0.7161 0.6047 0.0345 0.7028 0.5486 0.0773 0.5318 0.6667 0.422 0.8036 0.7013 0.5484 0.6708 0.6452 0.4229 0.7967 0.4598 0.7273 0.5727 0.3465 0.5887 0.7563 0.2874 0.3969 0.0676 0.3362 0.6168 0.1213 0.2757 0.4272 0.6039 0.4216 0.0002 0.9172 0.2018 0.4155 0.645 ] 2022-08-24 20:03:26 [INFO] [EVAL] Class Recall: [0.8215 0.8933 0.961 0.8459 0.8111 0.8519 0.848 0.8945 0.6943 0.775 0.5659 0.7247 0.8586 0.4478 0.2533 0.5711 0.6822 0.4959 0.6928 0.5349 0.8687 0.5757 0.7062 0.6609 0.4667 0.6002 0.5917 0.4957 0.4263 0.4857 0.3436 0.5722 0.3751 0.4772 0.5346 0.575 0.4686 0.5398 0.3577 0.3943 0.1565 0.1016 0.4791 0.2496 0.5141 0.3845 0.4032 0.5473 0.8535 0.7363 0.5897 0.5848 0.1728 0.2556 0.9247 0.6327 0.9345 0.319 0.5621 0.2852 0.0679 0.1825 0.4023 0.0967 0.7036 0.8257 0.2427 0.602 0.1716 0.4258 0.4586 0.5783 0.4995 0.4584 0.6338 0.4134 0.5254 0.2307 0.2723 0.3722 0.7542 0.3806 0.2408 0.0163 0.6729 0.6271 0.134 0.0539 0.4769 0.5541 0.5947 0.0354 0.3558 0.1013 0.0305 0.0143 0.2115 0.1488 0.1793 0.4369 0.0423 0.0078 0.1062 0.7029 0.0027 0.614 0.2655 0.5097 0.0959 0.4089 0.0586 0.4161 0.1048 0.8159 0.9933 0.0009 0.5025 0.9212 0.0303 0.0604 0.6042 0.0049 0.3166 0.1071 0.2967 0.1474 0.5055 0.5282 0.3447 0.3654 0.7391 0.0136 0.0616 0.3813 0.0728 0.1102 0.0697 0.0183 0.1312 0.4432 0.0999 
0.2754 0.3708 0.5618 0.4685 0.0002 0.2384 0.0134 0.0643 0.0362] 2022-08-24 20:03:26 [INFO] [EVAL] The model with the best validation mIoU (0.3092) was saved at iter 79000. 2022-08-24 20:03:38 [INFO] [TRAIN] epoch: 67, iter: 84050/160000, loss: 0.7571, lr: 0.000575, batch_cost: 0.2500, reader_cost: 0.00466, ips: 32.0063 samples/sec | ETA 05:16:23 2022-08-24 20:03:48 [INFO] [TRAIN] epoch: 67, iter: 84100/160000, loss: 0.7844, lr: 0.000575, batch_cost: 0.2026, reader_cost: 0.00063, ips: 39.4877 samples/sec | ETA 04:16:16 2022-08-24 20:03:58 [INFO] [TRAIN] epoch: 67, iter: 84150/160000, loss: 0.7816, lr: 0.000574, batch_cost: 0.1967, reader_cost: 0.01216, ips: 40.6645 samples/sec | ETA 04:08:42 2022-08-24 20:04:09 [INFO] [TRAIN] epoch: 67, iter: 84200/160000, loss: 0.7649, lr: 0.000574, batch_cost: 0.2152, reader_cost: 0.01139, ips: 37.1678 samples/sec | ETA 04:31:55 2022-08-24 20:04:18 [INFO] [TRAIN] epoch: 67, iter: 84250/160000, loss: 0.7617, lr: 0.000574, batch_cost: 0.1752, reader_cost: 0.00048, ips: 45.6726 samples/sec | ETA 03:41:08 2022-08-24 20:04:27 [INFO] [TRAIN] epoch: 67, iter: 84300/160000, loss: 0.7643, lr: 0.000573, batch_cost: 0.1909, reader_cost: 0.01194, ips: 41.9095 samples/sec | ETA 04:00:50 2022-08-24 20:04:38 [INFO] [TRAIN] epoch: 67, iter: 84350/160000, loss: 0.7325, lr: 0.000573, batch_cost: 0.2045, reader_cost: 0.00256, ips: 39.1168 samples/sec | ETA 04:17:51 2022-08-24 20:04:49 [INFO] [TRAIN] epoch: 67, iter: 84400/160000, loss: 0.8302, lr: 0.000572, batch_cost: 0.2202, reader_cost: 0.00159, ips: 36.3288 samples/sec | ETA 04:37:27 2022-08-24 20:04:59 [INFO] [TRAIN] epoch: 67, iter: 84450/160000, loss: 0.7671, lr: 0.000572, batch_cost: 0.2160, reader_cost: 0.00048, ips: 37.0295 samples/sec | ETA 04:32:02 2022-08-24 20:05:10 [INFO] [TRAIN] epoch: 67, iter: 84500/160000, loss: 0.7383, lr: 0.000572, batch_cost: 0.2038, reader_cost: 0.00053, ips: 39.2573 samples/sec | ETA 04:16:25 2022-08-24 20:05:19 [INFO] [TRAIN] epoch: 67, iter: 84550/160000, loss: 0.7331, lr: 0.000571, batch_cost: 0.1804, reader_cost: 0.00063, ips: 44.3561 samples/sec | ETA 03:46:48 2022-08-24 20:05:27 [INFO] [TRAIN] epoch: 67, iter: 84600/160000, loss: 0.7826, lr: 0.000571, batch_cost: 0.1782, reader_cost: 0.00064, ips: 44.9019 samples/sec | ETA 03:43:53 2022-08-24 20:05:38 [INFO] [TRAIN] epoch: 68, iter: 84650/160000, loss: 0.7606, lr: 0.000570, batch_cost: 0.2053, reader_cost: 0.04384, ips: 38.9586 samples/sec | ETA 04:17:52 2022-08-24 20:05:47 [INFO] [TRAIN] epoch: 68, iter: 84700/160000, loss: 0.7593, lr: 0.000570, batch_cost: 0.1798, reader_cost: 0.00796, ips: 44.5001 samples/sec | ETA 03:45:37 2022-08-24 20:05:57 [INFO] [TRAIN] epoch: 68, iter: 84750/160000, loss: 0.7532, lr: 0.000570, batch_cost: 0.1996, reader_cost: 0.01193, ips: 40.0796 samples/sec | ETA 04:10:20 2022-08-24 20:06:07 [INFO] [TRAIN] epoch: 68, iter: 84800/160000, loss: 0.7353, lr: 0.000569, batch_cost: 0.1996, reader_cost: 0.00266, ips: 40.0708 samples/sec | ETA 04:10:13 2022-08-24 20:06:17 [INFO] [TRAIN] epoch: 68, iter: 84850/160000, loss: 0.7482, lr: 0.000569, batch_cost: 0.2086, reader_cost: 0.00053, ips: 38.3550 samples/sec | ETA 04:21:14 2022-08-24 20:06:28 [INFO] [TRAIN] epoch: 68, iter: 84900/160000, loss: 0.7610, lr: 0.000569, batch_cost: 0.2244, reader_cost: 0.00053, ips: 35.6545 samples/sec | ETA 04:40:50 2022-08-24 20:06:38 [INFO] [TRAIN] epoch: 68, iter: 84950/160000, loss: 0.7724, lr: 0.000568, batch_cost: 0.1903, reader_cost: 0.00187, ips: 42.0491 samples/sec | ETA 03:57:58 2022-08-24 20:06:48 [INFO] [TRAIN] 
epoch: 68, iter: 85000/160000, loss: 0.8577, lr: 0.000568, batch_cost: 0.2103, reader_cost: 0.00558, ips: 38.0416 samples/sec | ETA 04:22:52 2022-08-24 20:06:48 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 213s - batch_cost: 0.2134 - reader cost: 0.0011 2022-08-24 20:10:22 [INFO] [EVAL] #Images: 2000 mIoU: 0.3121 Acc: 0.7424 Kappa: 0.7225 Dice: 0.4391 2022-08-24 20:10:22 [INFO] [EVAL] Class IoU: [0.6507 0.7645 0.9224 0.7014 0.6498 0.7346 0.7652 0.7456 0.4754 0.6226 0.4521 0.5154 0.6545 0.3003 0.2174 0.3732 0.4606 0.4092 0.5282 0.3734 0.7198 0.3783 0.5569 0.4533 0.2966 0.3631 0.3613 0.3742 0.3577 0.3293 0.1836 0.4074 0.2394 0.2685 0.2784 0.3411 0.3436 0.4286 0.2455 0.2321 0.0896 0.0855 0.297 0.2134 0.2867 0.2722 0.2709 0.4201 0.6539 0.493 0.4421 0.2571 0.2138 0.1813 0.6396 0.3667 0.8159 0.2552 0.3721 0.221 0.0538 0.3042 0.3009 0.1066 0.4035 0.6161 0.181 0.3729 0.0716 0.3471 0.3387 0.4121 0.3736 0.2177 0.4036 0.3005 0.2869 0.2095 0.3005 0.3107 0.6018 0.2643 0.2918 0.0083 0.4845 0.4697 0.0558 0.0419 0.3641 0.4405 0.3947 0.0123 0.155 0.0666 0.0006 0.0129 0.1685 0.1245 0.2169 0.3356 0.0666 0.0006 0.1759 0.2981 0.0576 0.5684 0.2158 0.5009 0.0679 0.2483 0.0575 0.2828 0.0882 0.6144 0.8427 0.001 0.3416 0.5143 0.0375 0.0654 0.476 0.0035 0.2899 0.1576 0.2203 0.2145 0.4295 0.3222 0.3981 0.2082 0.6183 0.0097 0.1946 0.1788 0.0899 0.1162 0.0855 0.016 0.1273 0.311 0.1376 0.0957 0.2743 0.3451 0.2977 0.0017 0.3079 0.0144 0.0421 0.0397] 2022-08-24 20:10:22 [INFO] [EVAL] Class Precision: [0.7561 0.8282 0.9539 0.8182 0.75 0.862 0.864 0.831 0.6204 0.7789 0.6867 0.6849 0.7411 0.5052 0.4347 0.5316 0.6033 0.6384 0.6344 0.5212 0.8101 0.5334 0.7087 0.5834 0.4881 0.6248 0.5092 0.6963 0.5903 0.4816 0.3975 0.5787 0.4614 0.3851 0.4587 0.4669 0.5853 0.651 0.424 0.5181 0.1656 0.1914 0.4952 0.4608 0.4417 0.402 0.5238 0.612 0.7151 0.6275 0.5714 0.3129 0.4165 0.6722 0.7171 0.6193 0.8655 0.5896 0.6809 0.3727 0.0979 0.4651 0.4171 0.5798 0.5563 0.6977 0.3718 0.574 0.3581 0.6766 0.5025 0.5871 0.6493 0.2922 0.5708 0.4651 0.5146 0.5613 0.6097 0.44 0.6839 0.5971 0.7235 0.0275 0.6386 0.6789 0.4093 0.4891 0.545 0.6778 0.537 0.0172 0.3229 0.2782 0.0035 0.0411 0.6161 0.6299 0.4353 0.4618 0.6135 0.0012 0.6751 0.617 0.4776 0.8181 0.5848 0.655 0.1608 0.3793 0.3449 0.6867 0.3664 0.7506 0.857 0.0361 0.729 0.5267 0.0933 0.4505 0.6784 0.455 0.7165 0.6438 0.6088 0.6605 0.8217 0.5024 0.6065 0.4806 0.7685 0.5163 0.6336 0.7777 0.6147 0.2786 0.2374 0.2274 0.354 0.6918 0.2707 0.161 0.6163 0.5532 0.5034 0.0024 0.885 0.2989 0.352 0.7375] 2022-08-24 20:10:22 [INFO] [EVAL] Class Recall: [0.8235 0.9087 0.9654 0.8309 0.8295 0.8325 0.87 0.8789 0.6703 0.7563 0.5696 0.6757 0.8485 0.4254 0.3031 0.556 0.6608 0.5328 0.7592 0.5683 0.8659 0.5653 0.7222 0.6703 0.4304 0.4644 0.5544 0.4471 0.4758 0.5101 0.2544 0.5792 0.3321 0.47 0.4145 0.5588 0.4543 0.5564 0.3684 0.296 0.1633 0.1339 0.426 0.2845 0.4497 0.4575 0.3594 0.5726 0.8843 0.697 0.6615 0.5903 0.3053 0.1989 0.8553 0.4734 0.9343 0.3103 0.4507 0.3518 0.1068 0.4679 0.5192 0.1156 0.5951 0.8404 0.2607 0.5155 0.0822 0.4161 0.5095 0.5802 0.4681 0.4608 0.5794 0.4592 0.3934 0.2505 0.3721 0.514 0.8338 0.3216 0.3284 0.0117 0.6674 0.6039 0.0607 0.0438 0.5231 0.5572 0.5984 0.0417 0.2297 0.0805 0.0007 0.0185 0.1883 0.1343 0.3017 0.5513 0.0695 0.0014 0.1921 0.3659 0.0614 0.6507 0.2549 0.6805 0.1051 0.4183 0.0645 0.3247 0.104 0.772 0.9805 0.001 0.3912 0.9563 0.059 0.0711 0.6147 0.0035 0.3275 0.1727 0.2566 0.241 0.4736 0.4733 0.5367 0.2686 0.7598 0.0098 0.2193 0.1885 0.0953 0.1662 0.1179 
0.0169 0.1658 0.3611 0.2187 0.1908 0.3308 0.4785 0.4214 0.0055 0.3208 0.0149 0.0456 0.0403] 2022-08-24 20:10:22 [INFO] [EVAL] The model with the best validation mIoU (0.3121) was saved at iter 85000. 2022-08-24 20:10:33 [INFO] [TRAIN] epoch: 68, iter: 85050/160000, loss: 0.7674, lr: 0.000567, batch_cost: 0.2131, reader_cost: 0.00361, ips: 37.5336 samples/sec | ETA 04:26:15 2022-08-24 20:10:43 [INFO] [TRAIN] epoch: 68, iter: 85100/160000, loss: 0.7802, lr: 0.000567, batch_cost: 0.1960, reader_cost: 0.00324, ips: 40.8174 samples/sec | ETA 04:04:40 2022-08-24 20:10:53 [INFO] [TRAIN] epoch: 68, iter: 85150/160000, loss: 0.7190, lr: 0.000567, batch_cost: 0.2160, reader_cost: 0.00069, ips: 37.0409 samples/sec | ETA 04:29:25 2022-08-24 20:11:03 [INFO] [TRAIN] epoch: 68, iter: 85200/160000, loss: 0.7509, lr: 0.000566, batch_cost: 0.1924, reader_cost: 0.00093, ips: 41.5766 samples/sec | ETA 03:59:52 2022-08-24 20:11:13 [INFO] [TRAIN] epoch: 68, iter: 85250/160000, loss: 0.7520, lr: 0.000566, batch_cost: 0.2040, reader_cost: 0.00168, ips: 39.2074 samples/sec | ETA 04:14:12 2022-08-24 20:11:23 [INFO] [TRAIN] epoch: 68, iter: 85300/160000, loss: 0.7411, lr: 0.000566, batch_cost: 0.2054, reader_cost: 0.03385, ips: 38.9444 samples/sec | ETA 04:15:44 2022-08-24 20:11:33 [INFO] [TRAIN] epoch: 68, iter: 85350/160000, loss: 0.7923, lr: 0.000565, batch_cost: 0.1943, reader_cost: 0.00226, ips: 41.1666 samples/sec | ETA 04:01:46 2022-08-24 20:11:45 [INFO] [TRAIN] epoch: 68, iter: 85400/160000, loss: 0.8057, lr: 0.000565, batch_cost: 0.2269, reader_cost: 0.00034, ips: 35.2504 samples/sec | ETA 04:42:10 2022-08-24 20:11:54 [INFO] [TRAIN] epoch: 68, iter: 85450/160000, loss: 0.7763, lr: 0.000564, batch_cost: 0.1845, reader_cost: 0.00066, ips: 43.3522 samples/sec | ETA 03:49:17 2022-08-24 20:12:03 [INFO] [TRAIN] epoch: 68, iter: 85500/160000, loss: 0.7871, lr: 0.000564, batch_cost: 0.1865, reader_cost: 0.00033, ips: 42.8917 samples/sec | ETA 03:51:35 2022-08-24 20:12:12 [INFO] [TRAIN] epoch: 68, iter: 85550/160000, loss: 0.7341, lr: 0.000564, batch_cost: 0.1717, reader_cost: 0.00033, ips: 46.5835 samples/sec | ETA 03:33:05 2022-08-24 20:12:20 [INFO] [TRAIN] epoch: 68, iter: 85600/160000, loss: 0.7919, lr: 0.000563, batch_cost: 0.1680, reader_cost: 0.00372, ips: 47.6308 samples/sec | ETA 03:28:16 2022-08-24 20:12:28 [INFO] [TRAIN] epoch: 68, iter: 85650/160000, loss: 0.7957, lr: 0.000563, batch_cost: 0.1581, reader_cost: 0.00078, ips: 50.5965 samples/sec | ETA 03:15:55 2022-08-24 20:12:37 [INFO] [TRAIN] epoch: 68, iter: 85700/160000, loss: 0.7746, lr: 0.000563, batch_cost: 0.1849, reader_cost: 0.00058, ips: 43.2760 samples/sec | ETA 03:48:55 2022-08-24 20:12:47 [INFO] [TRAIN] epoch: 68, iter: 85750/160000, loss: 0.7213, lr: 0.000562, batch_cost: 0.2045, reader_cost: 0.02024, ips: 39.1124 samples/sec | ETA 04:13:07 2022-08-24 20:12:58 [INFO] [TRAIN] epoch: 68, iter: 85800/160000, loss: 0.7517, lr: 0.000562, batch_cost: 0.2187, reader_cost: 0.00646, ips: 36.5765 samples/sec | ETA 04:30:29 2022-08-24 20:13:09 [INFO] [TRAIN] epoch: 68, iter: 85850/160000, loss: 0.7946, lr: 0.000561, batch_cost: 0.2212, reader_cost: 0.00044, ips: 36.1648 samples/sec | ETA 04:33:22 2022-08-24 20:13:22 [INFO] [TRAIN] epoch: 69, iter: 85900/160000, loss: 0.7433, lr: 0.000561, batch_cost: 0.2556, reader_cost: 0.02947, ips: 31.2938 samples/sec | ETA 05:15:43 2022-08-24 20:13:34 [INFO] [TRAIN] epoch: 69, iter: 85950/160000, loss: 0.7794, lr: 0.000561, batch_cost: 0.2293, reader_cost: 0.00084, ips: 34.8941 samples/sec | ETA 04:42:57 2022-08-24 
20:13:44 [INFO] [TRAIN] epoch: 69, iter: 86000/160000, loss: 0.7338, lr: 0.000560, batch_cost: 0.2137, reader_cost: 0.00710, ips: 37.4379 samples/sec | ETA 04:23:32 2022-08-24 20:13:44 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 227s - batch_cost: 0.2273 - reader cost: 9.9282e-04 2022-08-24 20:17:32 [INFO] [EVAL] #Images: 2000 mIoU: 0.3137 Acc: 0.7449 Kappa: 0.7248 Dice: 0.4404 2022-08-24 20:17:32 [INFO] [EVAL] Class IoU: [0.6525 0.7696 0.9225 0.7018 0.664 0.7265 0.7578 0.7446 0.4805 0.6138 0.449 0.5051 0.6534 0.2843 0.2173 0.3719 0.4898 0.4127 0.5384 0.3637 0.719 0.3952 0.5525 0.4639 0.2917 0.368 0.3373 0.3768 0.3259 0.3235 0.1996 0.3877 0.2618 0.2515 0.3354 0.3672 0.3341 0.4877 0.2605 0.2618 0.0891 0.1048 0.3024 0.214 0.2915 0.2682 0.2687 0.4135 0.6602 0.4647 0.4585 0.3152 0.2189 0.1558 0.6332 0.4839 0.8214 0.1857 0.4852 0.2132 0.0459 0.1632 0.3185 0.1653 0.3745 0.6727 0.2072 0.3636 0.0572 0.325 0.3089 0.4003 0.3595 0.2021 0.4042 0.273 0.3232 0.2083 0.2387 0.2711 0.656 0.268 0.2524 0.0154 0.4635 0.4698 0.0703 0.0748 0.3757 0.4178 0.4018 0. 0.1729 0.0661 0.0476 0.0091 0.1771 0.1471 0.2367 0.3313 0.1271 0.009 0.1463 0.7743 0.1002 0.5284 0.1534 0.5152 0.0814 0.2862 0.0606 0.2379 0.0884 0.6074 0.6948 0.0021 0.3761 0.5358 0.0632 0.0977 0.4645 0.0053 0.321 0.0806 0.244 0.2063 0.4167 0.2945 0.2899 0.2337 0.575 0.0106 0.0834 0.2266 0.0934 0.1034 0.0574 0.0139 0.1085 0.3036 0.228 0.0786 0.2796 0.1714 0.3219 0. 0.2676 0.009 0.0359 0.0347] 2022-08-24 20:17:32 [INFO] [EVAL] Class Precision: [0.7465 0.8311 0.959 0.7983 0.7645 0.854 0.8489 0.8177 0.6525 0.7563 0.6456 0.6692 0.7362 0.5192 0.4682 0.5334 0.6389 0.6266 0.7209 0.5727 0.8031 0.5923 0.7496 0.6023 0.5014 0.6153 0.4855 0.6253 0.6658 0.4837 0.4061 0.5407 0.5337 0.3491 0.4712 0.5162 0.623 0.7429 0.4819 0.5043 0.1793 0.1938 0.5366 0.5048 0.383 0.4518 0.4977 0.6626 0.7205 0.5684 0.6565 0.4376 0.4437 0.6574 0.6916 0.6783 0.8629 0.6524 0.5986 0.3568 0.0747 0.3603 0.4542 0.5628 0.4825 0.8131 0.3608 0.6092 0.2936 0.637 0.5947 0.6024 0.6541 0.2476 0.6595 0.6011 0.4304 0.5304 0.7833 0.5707 0.7825 0.5379 0.7268 0.0539 0.7056 0.6761 0.2481 0.3169 0.6934 0.5697 0.5703 0. 0.3833 0.2701 0.1354 0.0469 0.5858 0.5097 0.4397 0.5374 0.5019 0.015 0.6264 0.879 0.7528 0.6863 0.4056 0.6696 0.2474 0.4982 0.2724 0.3582 0.4642 0.7178 0.6993 0.063 0.6637 0.5711 0.1257 0.6253 0.7607 0.3663 0.8514 0.7426 0.5736 0.6938 0.7691 0.4209 0.5301 0.5427 0.6892 0.5555 0.4732 0.7267 0.6071 0.2938 0.3493 0.1299 0.3406 0.6909 0.356 0.1219 0.5212 0.4041 0.658 0. 0.8982 0.3595 0.4082 0.6516] 2022-08-24 20:17:32 [INFO] [EVAL] Class Recall: [0.8382 0.9123 0.9604 0.853 0.8348 0.8295 0.8759 0.8929 0.6458 0.7651 0.5959 0.6732 0.853 0.3858 0.2885 0.5511 0.6774 0.5473 0.6801 0.4992 0.8729 0.5429 0.6775 0.6687 0.4108 0.478 0.5248 0.4867 0.3896 0.4942 0.2819 0.578 0.3395 0.4736 0.5378 0.5598 0.4187 0.5868 0.3618 0.3526 0.1503 0.1859 0.4093 0.2708 0.5496 0.3977 0.3686 0.5239 0.8875 0.7181 0.6032 0.5297 0.3018 0.1695 0.8824 0.628 0.9447 0.206 0.7192 0.3463 0.1061 0.2297 0.5159 0.1896 0.6258 0.7956 0.3274 0.4742 0.0663 0.3988 0.3912 0.5441 0.4438 0.5237 0.5107 0.3333 0.5647 0.2554 0.2556 0.3405 0.8023 0.3481 0.2788 0.0211 0.5747 0.6063 0.0893 0.0892 0.4505 0.6103 0.5762 0. 
0.2394 0.0805 0.0685 0.0112 0.2025 0.1714 0.3389 0.4634 0.1455 0.0221 0.1603 0.8667 0.1036 0.6967 0.1979 0.6909 0.1082 0.4021 0.0723 0.4147 0.0985 0.7979 0.9908 0.0022 0.4646 0.8964 0.1127 0.1037 0.5439 0.0054 0.3401 0.0829 0.298 0.2269 0.4763 0.4952 0.3902 0.2909 0.7763 0.0107 0.092 0.2477 0.0994 0.1375 0.0642 0.0153 0.1373 0.3513 0.388 0.1812 0.3762 0.2293 0.3866 0. 0.2759 0.0091 0.0379 0.0354] 2022-08-24 20:17:32 [INFO] [EVAL] The model with the best validation mIoU (0.3137) was saved at iter 86000. 2022-08-24 20:17:43 [INFO] [TRAIN] epoch: 69, iter: 86050/160000, loss: 0.7883, lr: 0.000560, batch_cost: 0.2082, reader_cost: 0.00330, ips: 38.4298 samples/sec | ETA 04:16:34 2022-08-24 20:17:54 [INFO] [TRAIN] epoch: 69, iter: 86100/160000, loss: 0.7336, lr: 0.000560, batch_cost: 0.2307, reader_cost: 0.00179, ips: 34.6844 samples/sec | ETA 04:44:05 2022-08-24 20:18:05 [INFO] [TRAIN] epoch: 69, iter: 86150/160000, loss: 0.7642, lr: 0.000559, batch_cost: 0.2249, reader_cost: 0.00034, ips: 35.5765 samples/sec | ETA 04:36:46 2022-08-24 20:18:15 [INFO] [TRAIN] epoch: 69, iter: 86200/160000, loss: 0.7892, lr: 0.000559, batch_cost: 0.1933, reader_cost: 0.00672, ips: 41.3952 samples/sec | ETA 03:57:42 2022-08-24 20:18:26 [INFO] [TRAIN] epoch: 69, iter: 86250/160000, loss: 0.7540, lr: 0.000558, batch_cost: 0.2146, reader_cost: 0.01913, ips: 37.2737 samples/sec | ETA 04:23:48 2022-08-24 20:18:36 [INFO] [TRAIN] epoch: 69, iter: 86300/160000, loss: 0.8089, lr: 0.000558, batch_cost: 0.2069, reader_cost: 0.00705, ips: 38.6680 samples/sec | ETA 04:14:07 2022-08-24 20:18:45 [INFO] [TRAIN] epoch: 69, iter: 86350/160000, loss: 0.7414, lr: 0.000558, batch_cost: 0.1872, reader_cost: 0.00056, ips: 42.7389 samples/sec | ETA 03:49:46 2022-08-24 20:18:54 [INFO] [TRAIN] epoch: 69, iter: 86400/160000, loss: 0.7615, lr: 0.000557, batch_cost: 0.1611, reader_cost: 0.00033, ips: 49.6722 samples/sec | ETA 03:17:33 2022-08-24 20:19:03 [INFO] [TRAIN] epoch: 69, iter: 86450/160000, loss: 0.7181, lr: 0.000557, batch_cost: 0.1858, reader_cost: 0.00625, ips: 43.0619 samples/sec | ETA 03:47:44 2022-08-24 20:19:12 [INFO] [TRAIN] epoch: 69, iter: 86500/160000, loss: 0.7877, lr: 0.000556, batch_cost: 0.1877, reader_cost: 0.00331, ips: 42.6103 samples/sec | ETA 03:49:59 2022-08-24 20:19:20 [INFO] [TRAIN] epoch: 69, iter: 86550/160000, loss: 0.7882, lr: 0.000556, batch_cost: 0.1636, reader_cost: 0.01445, ips: 48.9106 samples/sec | ETA 03:20:13 2022-08-24 20:19:31 [INFO] [TRAIN] epoch: 69, iter: 86600/160000, loss: 0.7569, lr: 0.000556, batch_cost: 0.2184, reader_cost: 0.00058, ips: 36.6345 samples/sec | ETA 04:27:08 2022-08-24 20:19:42 [INFO] [TRAIN] epoch: 69, iter: 86650/160000, loss: 0.7716, lr: 0.000555, batch_cost: 0.2046, reader_cost: 0.01480, ips: 39.0933 samples/sec | ETA 04:10:10 2022-08-24 20:19:53 [INFO] [TRAIN] epoch: 69, iter: 86700/160000, loss: 0.7882, lr: 0.000555, batch_cost: 0.2389, reader_cost: 0.00065, ips: 33.4854 samples/sec | ETA 04:51:52 2022-08-24 20:20:04 [INFO] [TRAIN] epoch: 69, iter: 86750/160000, loss: 0.7751, lr: 0.000555, batch_cost: 0.2166, reader_cost: 0.00083, ips: 36.9305 samples/sec | ETA 04:24:27 2022-08-24 20:20:15 [INFO] [TRAIN] epoch: 69, iter: 86800/160000, loss: 0.7286, lr: 0.000554, batch_cost: 0.2065, reader_cost: 0.00061, ips: 38.7464 samples/sec | ETA 04:11:53 2022-08-24 20:20:25 [INFO] [TRAIN] epoch: 69, iter: 86850/160000, loss: 0.7906, lr: 0.000554, batch_cost: 0.2101, reader_cost: 0.00497, ips: 38.0796 samples/sec | ETA 04:16:07 2022-08-24 20:20:36 [INFO] [TRAIN] epoch: 69, iter: 
86900/160000, loss: 0.7402, lr: 0.000553, batch_cost: 0.2143, reader_cost: 0.00074, ips: 37.3323 samples/sec | ETA 04:21:04 2022-08-24 20:20:45 [INFO] [TRAIN] epoch: 69, iter: 86950/160000, loss: 0.7514, lr: 0.000553, batch_cost: 0.1916, reader_cost: 0.00393, ips: 41.7628 samples/sec | ETA 03:53:13 2022-08-24 20:20:55 [INFO] [TRAIN] epoch: 69, iter: 87000/160000, loss: 0.7964, lr: 0.000553, batch_cost: 0.1921, reader_cost: 0.00122, ips: 41.6436 samples/sec | ETA 03:53:43 2022-08-24 20:20:55 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 200s - batch_cost: 0.2002 - reader cost: 8.0059e-04 2022-08-24 20:24:16 [INFO] [EVAL] #Images: 2000 mIoU: 0.3060 Acc: 0.7427 Kappa: 0.7230 Dice: 0.4326 2022-08-24 20:24:16 [INFO] [EVAL] Class IoU: [0.6497 0.7704 0.9228 0.6981 0.6701 0.7334 0.7617 0.7474 0.4822 0.6208 0.452 0.5131 0.6528 0.2888 0.2215 0.3763 0.4729 0.4226 0.5425 0.3615 0.7216 0.3707 0.5587 0.4529 0.3083 0.3956 0.3554 0.3798 0.3313 0.2679 0.1981 0.3701 0.2639 0.278 0.3058 0.3521 0.3434 0.4609 0.2386 0.2357 0.1192 0.0581 0.2915 0.2149 0.3065 0.2451 0.2444 0.4186 0.6484 0.476 0.4929 0.2769 0.2222 0.223 0.6697 0.4284 0.8218 0.2366 0.3135 0.2569 0.079 0.1703 0.2979 0.0606 0.3631 0.6477 0.2262 0.3284 0.1093 0.3447 0.311 0.4396 0.3879 0.2152 0.416 0.29 0.3371 0.2461 0.0814 0.1649 0.655 0.2806 0.2394 0.0076 0.4374 0.4534 0.0491 0.0599 0.3029 0.4469 0.3977 0.0178 0.1785 0.0634 0.1144 0.004 0.1762 0.1297 0.2269 0.3654 0.0668 0.0042 0.1722 0.5945 0.0341 0.4029 0.1646 0.3884 0.0488 0.268 0.0797 0.1342 0.0925 0.5139 0.7075 0.0006 0.3365 0.4747 0.0204 0.1752 0.4221 0.0121 0.3271 0.0845 0.2463 0.2031 0.4104 0.2991 0.3388 0.252 0.532 0.0083 0.0958 0.2097 0.0749 0.1108 0.0657 0.0165 0.1209 0.3077 0.1456 0.1145 0.2449 0.2534 0.2826 0.0052 0.2609 0.0145 0.0763 0.0515] 2022-08-24 20:24:16 [INFO] [EVAL] Class Precision: [0.7648 0.8363 0.9581 0.8002 0.7636 0.8343 0.882 0.8276 0.6097 0.7216 0.6681 0.6546 0.7376 0.4799 0.4446 0.5328 0.6115 0.6559 0.7183 0.5419 0.8073 0.5308 0.7113 0.5885 0.5181 0.6317 0.5047 0.6907 0.6335 0.5032 0.3864 0.4866 0.4921 0.4191 0.535 0.5278 0.6134 0.6689 0.3881 0.5179 0.1789 0.2811 0.4674 0.4954 0.4476 0.4622 0.3823 0.6441 0.7395 0.5791 0.6622 0.3515 0.4133 0.6471 0.75 0.5281 0.8686 0.6085 0.698 0.4603 0.1293 0.3818 0.3683 0.6519 0.446 0.7724 0.342 0.5863 0.3075 0.6485 0.5964 0.6495 0.6486 0.2707 0.6481 0.5545 0.4962 0.5421 0.7464 0.7008 0.7835 0.5863 0.7559 0.0513 0.7122 0.5791 0.2489 0.394 0.4229 0.6506 0.5528 0.0216 0.2881 0.2697 0.2068 0.0258 0.3201 0.4289 0.5717 0.5169 0.5616 0.0069 0.6174 0.7882 0.3155 0.4482 0.3026 0.8818 0.1797 0.3798 0.2428 0.1655 0.3883 0.7669 0.7121 0.0222 0.7127 0.5045 0.0825 0.5299 0.7771 0.4574 0.8068 0.6245 0.5282 0.6409 0.7558 0.4374 0.4304 0.4056 0.6381 0.5888 0.3976 0.7926 0.6291 0.2414 0.3165 0.1796 0.4037 0.6963 0.236 0.1694 0.522 0.474 0.4648 0.0056 0.853 0.2162 0.4332 0.6765] 2022-08-24 20:24:16 [INFO] [EVAL] Class Recall: [0.8119 0.9072 0.9616 0.8454 0.8455 0.8584 0.8481 0.8853 0.6975 0.8163 0.5829 0.7036 0.8502 0.4205 0.3062 0.5616 0.676 0.5429 0.689 0.5207 0.8718 0.5514 0.7226 0.6627 0.4323 0.5142 0.5458 0.4577 0.4099 0.3642 0.2891 0.607 0.3627 0.4523 0.4164 0.5141 0.4383 0.5972 0.3825 0.3019 0.263 0.0683 0.4366 0.2751 0.4929 0.343 0.404 0.5446 0.8404 0.7277 0.6586 0.5661 0.3246 0.2538 0.8623 0.694 0.9385 0.2791 0.3626 0.3677 0.1687 0.2351 0.6093 0.0627 0.6615 0.8005 0.4006 0.4274 0.145 0.4239 0.3939 0.5763 0.491 0.5121 0.5374 0.3781 0.5124 0.3107 0.0837 0.1774 0.7998 0.3498 0.2594 0.0089 0.5313 0.6761 0.0576 
0.0659 0.5164 0.588 0.5863 0.0922 0.3194 0.0766 0.2039 0.0048 0.2815 0.1568 0.2733 0.555 0.0704 0.0103 0.1928 0.7075 0.0368 0.7994 0.2653 0.4097 0.0628 0.4764 0.1062 0.4152 0.1083 0.6091 0.9908 0.0006 0.3893 0.8892 0.0264 0.2074 0.4802 0.0123 0.355 0.089 0.3157 0.2292 0.4732 0.486 0.614 0.3995 0.7619 0.0083 0.112 0.2219 0.0784 0.17 0.0766 0.0178 0.1472 0.3555 0.2755 0.261 0.3157 0.3524 0.4189 0.0802 0.2731 0.0153 0.0848 0.0528] 2022-08-24 20:24:16 [INFO] [EVAL] The model with the best validation mIoU (0.3137) was saved at iter 86000. 2022-08-24 20:24:25 [INFO] [TRAIN] epoch: 69, iter: 87050/160000, loss: 0.7568, lr: 0.000552, batch_cost: 0.1946, reader_cost: 0.00348, ips: 41.1146 samples/sec | ETA 03:56:34 2022-08-24 20:24:36 [INFO] [TRAIN] epoch: 69, iter: 87100/160000, loss: 0.7383, lr: 0.000552, batch_cost: 0.2102, reader_cost: 0.00249, ips: 38.0660 samples/sec | ETA 04:15:20 2022-08-24 20:24:48 [INFO] [TRAIN] epoch: 70, iter: 87150/160000, loss: 0.7713, lr: 0.000552, batch_cost: 0.2466, reader_cost: 0.03149, ips: 32.4412 samples/sec | ETA 04:59:24 2022-08-24 20:25:00 [INFO] [TRAIN] epoch: 70, iter: 87200/160000, loss: 0.7036, lr: 0.000551, batch_cost: 0.2253, reader_cost: 0.00449, ips: 35.5032 samples/sec | ETA 04:33:24 2022-08-24 20:25:10 [INFO] [TRAIN] epoch: 70, iter: 87250/160000, loss: 0.7524, lr: 0.000551, batch_cost: 0.2035, reader_cost: 0.00109, ips: 39.3134 samples/sec | ETA 04:06:44 2022-08-24 20:25:22 [INFO] [TRAIN] epoch: 70, iter: 87300/160000, loss: 0.7204, lr: 0.000550, batch_cost: 0.2381, reader_cost: 0.00060, ips: 33.5927 samples/sec | ETA 04:48:33 2022-08-24 20:25:32 [INFO] [TRAIN] epoch: 70, iter: 87350/160000, loss: 0.7656, lr: 0.000550, batch_cost: 0.2141, reader_cost: 0.00083, ips: 37.3691 samples/sec | ETA 04:19:12 2022-08-24 20:25:42 [INFO] [TRAIN] epoch: 70, iter: 87400/160000, loss: 0.7601, lr: 0.000550, batch_cost: 0.1926, reader_cost: 0.00576, ips: 41.5363 samples/sec | ETA 03:53:02 2022-08-24 20:25:51 [INFO] [TRAIN] epoch: 70, iter: 87450/160000, loss: 0.6669, lr: 0.000549, batch_cost: 0.1838, reader_cost: 0.00083, ips: 43.5163 samples/sec | ETA 03:42:17 2022-08-24 20:25:59 [INFO] [TRAIN] epoch: 70, iter: 87500/160000, loss: 0.7056, lr: 0.000549, batch_cost: 0.1618, reader_cost: 0.00169, ips: 49.4477 samples/sec | ETA 03:15:29 2022-08-24 20:26:08 [INFO] [TRAIN] epoch: 70, iter: 87550/160000, loss: 0.7870, lr: 0.000549, batch_cost: 0.1765, reader_cost: 0.00443, ips: 45.3327 samples/sec | ETA 03:33:05 2022-08-24 20:26:18 [INFO] [TRAIN] epoch: 70, iter: 87600/160000, loss: 0.7986, lr: 0.000548, batch_cost: 0.2017, reader_cost: 0.00103, ips: 39.6632 samples/sec | ETA 04:03:22 2022-08-24 20:26:29 [INFO] [TRAIN] epoch: 70, iter: 87650/160000, loss: 0.7065, lr: 0.000548, batch_cost: 0.2064, reader_cost: 0.00358, ips: 38.7683 samples/sec | ETA 04:08:49 2022-08-24 20:26:40 [INFO] [TRAIN] epoch: 70, iter: 87700/160000, loss: 0.7361, lr: 0.000547, batch_cost: 0.2230, reader_cost: 0.00049, ips: 35.8729 samples/sec | ETA 04:28:43 2022-08-24 20:26:51 [INFO] [TRAIN] epoch: 70, iter: 87750/160000, loss: 0.7858, lr: 0.000547, batch_cost: 0.2246, reader_cost: 0.00056, ips: 35.6167 samples/sec | ETA 04:30:28 2022-08-24 20:27:01 [INFO] [TRAIN] epoch: 70, iter: 87800/160000, loss: 0.7766, lr: 0.000547, batch_cost: 0.2055, reader_cost: 0.00929, ips: 38.9313 samples/sec | ETA 04:07:16 2022-08-24 20:27:12 [INFO] [TRAIN] epoch: 70, iter: 87850/160000, loss: 0.7042, lr: 0.000546, batch_cost: 0.2220, reader_cost: 0.01027, ips: 36.0396 samples/sec | ETA 04:26:55 2022-08-24 20:27:22 
[INFO] [TRAIN] epoch: 70, iter: 87900/160000, loss: 0.7158, lr: 0.000546, batch_cost: 0.1998, reader_cost: 0.00306, ips: 40.0433 samples/sec | ETA 04:00:04 2022-08-24 20:27:33 [INFO] [TRAIN] epoch: 70, iter: 87950/160000, loss: 0.7572, lr: 0.000545, batch_cost: 0.2069, reader_cost: 0.00045, ips: 38.6623 samples/sec | ETA 04:08:28 2022-08-24 20:27:46 [INFO] [TRAIN] epoch: 70, iter: 88000/160000, loss: 0.7459, lr: 0.000545, batch_cost: 0.2647, reader_cost: 0.00036, ips: 30.2228 samples/sec | ETA 05:17:38 2022-08-24 20:27:46 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 210s - batch_cost: 0.2101 - reader cost: 8.5359e-04 2022-08-24 20:31:16 [INFO] [EVAL] #Images: 2000 mIoU: 0.3092 Acc: 0.7437 Kappa: 0.7239 Dice: 0.4349 2022-08-24 20:31:16 [INFO] [EVAL] Class IoU: [0.6535 0.7666 0.9231 0.7022 0.6617 0.7403 0.7591 0.7487 0.48 0.6068 0.4574 0.5159 0.6564 0.2824 0.2091 0.369 0.4969 0.3994 0.5344 0.3624 0.7155 0.3657 0.5548 0.4439 0.313 0.3553 0.3417 0.3976 0.3706 0.3236 0.184 0.3877 0.2186 0.2794 0.3821 0.3956 0.3503 0.4532 0.2552 0.2482 0.1188 0.0706 0.3049 0.2135 0.2892 0.2806 0.2396 0.416 0.6518 0.465 0.4884 0.2606 0.1943 0.2007 0.5997 0.4823 0.8073 0.3311 0.4694 0.222 0.0514 0.204 0.3211 0.0685 0.3666 0.6563 0.1878 0.3666 0.1053 0.3546 0.3405 0.417 0.3834 0.2364 0.4042 0.2994 0.3153 0.2083 0.0746 0.3126 0.6261 0.2844 0.2205 0.015 0.4296 0.4664 0.054 0.0457 0.3574 0.4242 0.3942 0.0022 0.1638 0.0683 0.0742 0.0042 0.175 0.1402 0.2535 0.3518 0.0618 0.009 0.1172 0.6491 0.0176 0.5903 0.1606 0.3683 0.0811 0.3141 0.0808 0.3157 0.0859 0.6132 0.6506 0.0006 0.3307 0.5329 0.0628 0.0765 0.4643 0.0114 0.2182 0.1205 0.2319 0.1335 0.4068 0.2703 0.0964 0.2243 0.5337 0.0111 0.1057 0.2591 0.0987 0.1117 0.0334 0.0145 0.1018 0.3144 0.1567 0.1136 0.2559 0.391 0.2721 0. 0.2345 0.0108 0.0457 0.0327] 2022-08-24 20:31:16 [INFO] [EVAL] Class Precision: [0.7575 0.8272 0.9593 0.8044 0.7674 0.8454 0.9055 0.8225 0.6272 0.7772 0.6921 0.6172 0.7573 0.4745 0.4969 0.5391 0.646 0.6515 0.6631 0.5423 0.802 0.5031 0.7179 0.5887 0.5303 0.5681 0.4672 0.6835 0.6245 0.4834 0.4146 0.4843 0.6073 0.4067 0.4698 0.5288 0.5975 0.7341 0.4034 0.4863 0.1744 0.2612 0.5232 0.5414 0.5089 0.454 0.3557 0.6338 0.7272 0.5719 0.6155 0.3297 0.3646 0.6639 0.6711 0.6852 0.8511 0.5807 0.6275 0.4185 0.084 0.4181 0.4644 0.5732 0.4599 0.7808 0.3188 0.6065 0.2727 0.6398 0.5807 0.6006 0.557 0.2917 0.6692 0.4927 0.4256 0.5963 0.7859 0.6043 0.7351 0.5158 0.7827 0.0441 0.716 0.6495 0.1656 0.4064 0.7567 0.6613 0.521 0.0032 0.2569 0.3031 0.2224 0.0168 0.5716 0.5276 0.467 0.5598 0.4812 0.0158 0.6424 0.7108 0.2748 0.7913 0.3967 0.8163 0.2646 0.4522 0.3146 0.567 0.4009 0.7075 0.6518 0.0226 0.8213 0.564 0.1823 0.5477 0.7571 0.5399 0.8206 0.6283 0.6267 0.6441 0.7177 0.3412 0.2952 0.3428 0.6275 0.3335 0.4795 0.5982 0.6458 0.2577 0.5488 0.3282 0.3646 0.6645 0.2012 0.5255 0.5417 0.7408 0.4372 0. 
0.9061 0.3005 0.4883 0.8247] 2022-08-24 20:31:16 [INFO] [EVAL] Class Recall: [0.8264 0.9128 0.9607 0.8467 0.8278 0.8562 0.8245 0.893 0.6715 0.7346 0.5742 0.7587 0.8314 0.4109 0.2653 0.5392 0.6828 0.5079 0.7335 0.5221 0.869 0.5726 0.7094 0.6435 0.433 0.4868 0.5598 0.4873 0.4769 0.4947 0.2485 0.6603 0.2546 0.4715 0.6719 0.6109 0.4585 0.5422 0.41 0.3365 0.2716 0.0882 0.4223 0.2606 0.4012 0.4234 0.4233 0.5476 0.8628 0.7133 0.7029 0.5544 0.2938 0.2234 0.8494 0.6195 0.9401 0.4351 0.6507 0.3211 0.1172 0.2849 0.51 0.0722 0.6436 0.8046 0.3136 0.481 0.1465 0.4431 0.4514 0.577 0.5516 0.5549 0.5052 0.4328 0.5489 0.2425 0.0761 0.393 0.8085 0.3879 0.2349 0.0222 0.5178 0.6232 0.0742 0.049 0.4038 0.5419 0.6184 0.0074 0.3113 0.081 0.1002 0.0055 0.2014 0.1604 0.3568 0.4863 0.0662 0.0202 0.1254 0.8821 0.0185 0.6991 0.2126 0.4016 0.1047 0.507 0.098 0.4161 0.0985 0.8213 0.997 0.0006 0.3564 0.9063 0.0874 0.0816 0.5456 0.0116 0.2292 0.1297 0.2691 0.1441 0.4843 0.5656 0.1251 0.3934 0.7811 0.0114 0.1194 0.3137 0.1044 0.1647 0.0343 0.0149 0.1238 0.3737 0.4143 0.1266 0.3266 0.453 0.4187 0. 0.2404 0.0111 0.048 0.0329] 2022-08-24 20:31:16 [INFO] [EVAL] The model with the best validation mIoU (0.3137) was saved at iter 86000. 2022-08-24 20:31:27 [INFO] [TRAIN] epoch: 70, iter: 88050/160000, loss: 0.7089, lr: 0.000545, batch_cost: 0.2110, reader_cost: 0.00222, ips: 37.9058 samples/sec | ETA 04:13:04 2022-08-24 20:31:37 [INFO] [TRAIN] epoch: 70, iter: 88100/160000, loss: 0.7515, lr: 0.000544, batch_cost: 0.2022, reader_cost: 0.02117, ips: 39.5592 samples/sec | ETA 04:02:20 2022-08-24 20:31:49 [INFO] [TRAIN] epoch: 70, iter: 88150/160000, loss: 0.7387, lr: 0.000544, batch_cost: 0.2387, reader_cost: 0.00102, ips: 33.5207 samples/sec | ETA 04:45:47 2022-08-24 20:31:59 [INFO] [TRAIN] epoch: 70, iter: 88200/160000, loss: 0.8689, lr: 0.000544, batch_cost: 0.1970, reader_cost: 0.00037, ips: 40.6104 samples/sec | ETA 03:55:44 2022-08-24 20:32:09 [INFO] [TRAIN] epoch: 70, iter: 88250/160000, loss: 0.8173, lr: 0.000543, batch_cost: 0.2119, reader_cost: 0.00084, ips: 37.7496 samples/sec | ETA 04:13:25 2022-08-24 20:32:20 [INFO] [TRAIN] epoch: 70, iter: 88300/160000, loss: 0.8243, lr: 0.000543, batch_cost: 0.2204, reader_cost: 0.00035, ips: 36.2926 samples/sec | ETA 04:23:24 2022-08-24 20:32:31 [INFO] [TRAIN] epoch: 70, iter: 88350/160000, loss: 0.7237, lr: 0.000542, batch_cost: 0.2074, reader_cost: 0.00159, ips: 38.5732 samples/sec | ETA 04:07:40 2022-08-24 20:32:42 [INFO] [TRAIN] epoch: 70, iter: 88400/160000, loss: 0.7657, lr: 0.000542, batch_cost: 0.2249, reader_cost: 0.01649, ips: 35.5779 samples/sec | ETA 04:28:19 2022-08-24 20:32:52 [INFO] [TRAIN] epoch: 71, iter: 88450/160000, loss: 0.6952, lr: 0.000542, batch_cost: 0.1949, reader_cost: 0.03387, ips: 41.0511 samples/sec | ETA 03:52:23 2022-08-24 20:33:01 [INFO] [TRAIN] epoch: 71, iter: 88500/160000, loss: 0.7865, lr: 0.000541, batch_cost: 0.1901, reader_cost: 0.00298, ips: 42.0761 samples/sec | ETA 03:46:34 2022-08-24 20:33:12 [INFO] [TRAIN] epoch: 71, iter: 88550/160000, loss: 0.7397, lr: 0.000541, batch_cost: 0.2190, reader_cost: 0.00080, ips: 36.5291 samples/sec | ETA 04:20:47 2022-08-24 20:33:23 [INFO] [TRAIN] epoch: 71, iter: 88600/160000, loss: 0.7633, lr: 0.000541, batch_cost: 0.2220, reader_cost: 0.00055, ips: 36.0306 samples/sec | ETA 04:24:13 2022-08-24 20:33:35 [INFO] [TRAIN] epoch: 71, iter: 88650/160000, loss: 0.7456, lr: 0.000540, batch_cost: 0.2358, reader_cost: 0.00062, ips: 33.9291 samples/sec | ETA 04:40:23 2022-08-24 20:33:45 [INFO] [TRAIN] epoch: 71, 
iter: 88700/160000, loss: 0.7697, lr: 0.000540, batch_cost: 0.1967, reader_cost: 0.00187, ips: 40.6688 samples/sec | ETA 03:53:45 2022-08-24 20:33:55 [INFO] [TRAIN] epoch: 71, iter: 88750/160000, loss: 0.7910, lr: 0.000539, batch_cost: 0.2062, reader_cost: 0.00165, ips: 38.7941 samples/sec | ETA 04:04:52 2022-08-24 20:34:07 [INFO] [TRAIN] epoch: 71, iter: 88800/160000, loss: 0.7046, lr: 0.000539, batch_cost: 0.2334, reader_cost: 0.00592, ips: 34.2828 samples/sec | ETA 04:36:54 2022-08-24 20:34:18 [INFO] [TRAIN] epoch: 71, iter: 88850/160000, loss: 0.7679, lr: 0.000539, batch_cost: 0.2150, reader_cost: 0.01066, ips: 37.2113 samples/sec | ETA 04:14:56 2022-08-24 20:34:28 [INFO] [TRAIN] epoch: 71, iter: 88900/160000, loss: 0.7536, lr: 0.000538, batch_cost: 0.2055, reader_cost: 0.00840, ips: 38.9283 samples/sec | ETA 04:03:31 2022-08-24 20:34:38 [INFO] [TRAIN] epoch: 71, iter: 88950/160000, loss: 0.7574, lr: 0.000538, batch_cost: 0.1990, reader_cost: 0.00518, ips: 40.1925 samples/sec | ETA 03:55:41 2022-08-24 20:34:49 [INFO] [TRAIN] epoch: 71, iter: 89000/160000, loss: 0.7579, lr: 0.000538, batch_cost: 0.2149, reader_cost: 0.00546, ips: 37.2287 samples/sec | ETA 04:14:17 2022-08-24 20:34:49 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 202s - batch_cost: 0.2017 - reader cost: 7.8402e-04 2022-08-24 20:38:11 [INFO] [EVAL] #Images: 2000 mIoU: 0.3092 Acc: 0.7414 Kappa: 0.7215 Dice: 0.4351 2022-08-24 20:38:11 [INFO] [EVAL] Class IoU: [0.6508 0.7636 0.9235 0.7031 0.665 0.7307 0.7693 0.7472 0.4798 0.6071 0.4378 0.523 0.6401 0.2286 0.2022 0.3711 0.4604 0.3957 0.5503 0.3698 0.7179 0.3918 0.5194 0.4462 0.3017 0.4342 0.3552 0.4074 0.3625 0.2803 0.1747 0.394 0.2646 0.2748 0.3029 0.3963 0.3394 0.4599 0.251 0.2332 0.0971 0.1122 0.2972 0.2028 0.27 0.2522 0.2513 0.4105 0.6312 0.4853 0.4802 0.1978 0.2149 0.2401 0.5941 0.4815 0.8316 0.3368 0.4533 0.1841 0.0504 0.3455 0.3081 0.1366 0.3589 0.6605 0.1933 0.3437 0.0924 0.3648 0.3353 0.3883 0.3836 0.2429 0.4072 0.2817 0.3203 0.2241 0.1481 0.4096 0.6182 0.2758 0.2345 0.0124 0.4424 0.4763 0.0722 0.0503 0.344 0.396 0.3984 0.007 0.1698 0.0487 0.0089 0.0147 0.188 0.1382 0.1551 0.3193 0.0706 0.0212 0.1653 0.4912 0.0623 0.4861 0.1523 0.4435 0.0559 0.3097 0.0482 0.1355 0.1162 0.6274 0.6938 0.008 0.4076 0.528 0.0397 0.2267 0.5032 0.0193 0.1787 0.0916 0.2273 0.1958 0.4293 0.2641 0.1657 0.225 0.5253 0.0129 0.0523 0.1784 0.0585 0.0924 0.0697 0.0201 0.1233 0.3114 0.1657 0. 0.2758 0.4951 0.3041 0. 0.2704 0.0259 0.0679 0.0262] 2022-08-24 20:38:11 [INFO] [EVAL] Class Precision: [0.7554 0.8366 0.9548 0.8092 0.7695 0.8407 0.8916 0.8266 0.6282 0.6959 0.6923 0.6682 0.7104 0.4714 0.4963 0.5354 0.5816 0.6816 0.7269 0.5499 0.8024 0.552 0.6688 0.5689 0.4939 0.7183 0.5126 0.6848 0.6458 0.4196 0.393 0.5231 0.435 0.43 0.4893 0.4846 0.6083 0.7726 0.504 0.5369 0.1469 0.22 0.5003 0.56 0.3323 0.4973 0.4297 0.6344 0.709 0.6068 0.5866 0.2232 0.3978 0.6203 0.7111 0.7727 0.8883 0.5703 0.6765 0.4168 0.0847 0.5671 0.5361 0.5905 0.4347 0.7769 0.3205 0.5538 0.1511 0.6527 0.4678 0.5095 0.6244 0.2932 0.6113 0.5381 0.6149 0.4567 0.7867 0.6323 0.7179 0.5602 0.7663 0.0459 0.5004 0.6663 0.2906 0.4414 0.6121 0.5563 0.5788 0.0089 0.3209 0.2986 0.0394 0.0538 0.6044 0.4653 0.2883 0.5815 0.4919 0.0278 0.6855 0.8256 0.3599 0.6729 0.3849 0.8801 0.3011 0.5436 0.2299 0.1674 0.4291 0.7389 0.6975 0.0861 0.7175 0.5561 0.0867 0.8483 0.6863 0.3501 0.6676 0.6873 0.6377 0.663 0.7195 0.3589 0.3468 0.3548 0.5989 0.3164 0.4693 0.7022 0.6074 0.2655 0.2716 0.0589 0.4605 0.6933 0.2759 0. 
0.4679 0.6461 0.5858 0. 0.8057 0.1708 0.6162 0.4673] 2022-08-24 20:38:11 [INFO] [EVAL] Class Recall: [0.8246 0.8974 0.9657 0.8429 0.8303 0.8482 0.8487 0.886 0.67 0.8264 0.5435 0.7064 0.8662 0.3073 0.2545 0.5475 0.6884 0.4854 0.6937 0.5304 0.872 0.5745 0.6992 0.674 0.4367 0.5232 0.5363 0.5014 0.4525 0.4579 0.2393 0.6149 0.4032 0.4322 0.443 0.6849 0.4343 0.5319 0.3333 0.2919 0.2225 0.1862 0.4227 0.2412 0.5901 0.3386 0.3771 0.5376 0.852 0.7079 0.726 0.6351 0.3186 0.2814 0.783 0.561 0.9287 0.4513 0.5787 0.2479 0.1107 0.4692 0.42 0.1508 0.6729 0.815 0.3275 0.4754 0.1922 0.4526 0.5422 0.6202 0.4986 0.5865 0.5494 0.3716 0.4007 0.3055 0.1543 0.5377 0.8167 0.352 0.2526 0.0168 0.7924 0.6255 0.0876 0.0537 0.44 0.5789 0.5611 0.0327 0.2652 0.055 0.0113 0.0198 0.2144 0.1642 0.2513 0.4146 0.0761 0.0815 0.1789 0.548 0.07 0.6366 0.2013 0.472 0.0643 0.4185 0.0575 0.4161 0.1375 0.8062 0.9925 0.0087 0.4856 0.9126 0.0681 0.2363 0.6535 0.02 0.1962 0.0956 0.261 0.2175 0.5156 0.5001 0.2409 0.3808 0.8104 0.0133 0.0556 0.193 0.0608 0.1242 0.0857 0.0295 0.1441 0.3612 0.2932 0. 0.4019 0.6794 0.3873 0. 0.2893 0.0296 0.0709 0.027 ] 2022-08-24 20:38:11 [INFO] [EVAL] The model with the best validation mIoU (0.3137) was saved at iter 86000. 2022-08-24 20:38:21 [INFO] [TRAIN] epoch: 71, iter: 89050/160000, loss: 0.7242, lr: 0.000537, batch_cost: 0.2094, reader_cost: 0.01078, ips: 38.2035 samples/sec | ETA 04:07:37 2022-08-24 20:38:32 [INFO] [TRAIN] epoch: 71, iter: 89100/160000, loss: 0.7139, lr: 0.000537, batch_cost: 0.2200, reader_cost: 0.00531, ips: 36.3590 samples/sec | ETA 04:19:59 2022-08-24 20:38:43 [INFO] [TRAIN] epoch: 71, iter: 89150/160000, loss: 0.7284, lr: 0.000536, batch_cost: 0.2090, reader_cost: 0.00180, ips: 38.2720 samples/sec | ETA 04:06:49 2022-08-24 20:38:53 [INFO] [TRAIN] epoch: 71, iter: 89200/160000, loss: 0.8024, lr: 0.000536, batch_cost: 0.2001, reader_cost: 0.00046, ips: 39.9747 samples/sec | ETA 03:56:08 2022-08-24 20:39:03 [INFO] [TRAIN] epoch: 71, iter: 89250/160000, loss: 0.7860, lr: 0.000536, batch_cost: 0.2001, reader_cost: 0.00818, ips: 39.9828 samples/sec | ETA 03:55:56 2022-08-24 20:39:13 [INFO] [TRAIN] epoch: 71, iter: 89300/160000, loss: 0.7548, lr: 0.000535, batch_cost: 0.2007, reader_cost: 0.01028, ips: 39.8687 samples/sec | ETA 03:56:26 2022-08-24 20:39:25 [INFO] [TRAIN] epoch: 71, iter: 89350/160000, loss: 0.7313, lr: 0.000535, batch_cost: 0.2478, reader_cost: 0.00047, ips: 32.2801 samples/sec | ETA 04:51:49 2022-08-24 20:39:36 [INFO] [TRAIN] epoch: 71, iter: 89400/160000, loss: 0.7671, lr: 0.000535, batch_cost: 0.2236, reader_cost: 0.00493, ips: 35.7859 samples/sec | ETA 04:23:02 2022-08-24 20:39:46 [INFO] [TRAIN] epoch: 71, iter: 89450/160000, loss: 0.7735, lr: 0.000534, batch_cost: 0.1929, reader_cost: 0.00340, ips: 41.4815 samples/sec | ETA 03:46:46 2022-08-24 20:39:55 [INFO] [TRAIN] epoch: 71, iter: 89500/160000, loss: 0.7449, lr: 0.000534, batch_cost: 0.1898, reader_cost: 0.00071, ips: 42.1390 samples/sec | ETA 03:43:04 2022-08-24 20:40:04 [INFO] [TRAIN] epoch: 71, iter: 89550/160000, loss: 0.7565, lr: 0.000533, batch_cost: 0.1780, reader_cost: 0.00063, ips: 44.9483 samples/sec | ETA 03:28:58 2022-08-24 20:40:14 [INFO] [TRAIN] epoch: 71, iter: 89600/160000, loss: 0.7187, lr: 0.000533, batch_cost: 0.1978, reader_cost: 0.00047, ips: 40.4501 samples/sec | ETA 03:52:03 2022-08-24 20:40:24 [INFO] [TRAIN] epoch: 71, iter: 89650/160000, loss: 0.7511, lr: 0.000533, batch_cost: 0.1899, reader_cost: 0.00772, ips: 42.1236 samples/sec | ETA 03:42:40 2022-08-24 20:40:36 [INFO] [TRAIN] 
epoch: 72, iter: 89700/160000, loss: 0.7549, lr: 0.000532, batch_cost: 0.2358, reader_cost: 0.05034, ips: 33.9294 samples/sec | ETA 04:36:15 2022-08-24 20:40:45 [INFO] [TRAIN] epoch: 72, iter: 89750/160000, loss: 0.7473, lr: 0.000532, batch_cost: 0.1979, reader_cost: 0.00401, ips: 40.4229 samples/sec | ETA 03:51:43 2022-08-24 20:40:57 [INFO] [TRAIN] epoch: 72, iter: 89800/160000, loss: 0.7570, lr: 0.000531, batch_cost: 0.2225, reader_cost: 0.00105, ips: 35.9470 samples/sec | ETA 04:20:23 2022-08-24 20:41:07 [INFO] [TRAIN] epoch: 72, iter: 89850/160000, loss: 0.7521, lr: 0.000531, batch_cost: 0.2168, reader_cost: 0.00081, ips: 36.8986 samples/sec | ETA 04:13:29 2022-08-24 20:41:18 [INFO] [TRAIN] epoch: 72, iter: 89900/160000, loss: 0.6952, lr: 0.000531, batch_cost: 0.2065, reader_cost: 0.00064, ips: 38.7366 samples/sec | ETA 04:01:17 2022-08-24 20:41:28 [INFO] [TRAIN] epoch: 72, iter: 89950/160000, loss: 0.7243, lr: 0.000530, batch_cost: 0.2152, reader_cost: 0.00060, ips: 37.1689 samples/sec | ETA 04:11:17 2022-08-24 20:41:39 [INFO] [TRAIN] epoch: 72, iter: 90000/160000, loss: 0.7210, lr: 0.000530, batch_cost: 0.2162, reader_cost: 0.00046, ips: 37.0011 samples/sec | ETA 04:12:14 2022-08-24 20:41:39 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 213s - batch_cost: 0.2125 - reader cost: 7.1938e-04 2022-08-24 20:45:12 [INFO] [EVAL] #Images: 2000 mIoU: 0.3102 Acc: 0.7443 Kappa: 0.7246 Dice: 0.4368 2022-08-24 20:45:12 [INFO] [EVAL] Class IoU: [0.6536 0.7646 0.9238 0.7047 0.6584 0.737 0.7676 0.7499 0.4804 0.6234 0.4473 0.5277 0.6449 0.3033 0.2072 0.3674 0.4958 0.3969 0.5475 0.3704 0.7083 0.4399 0.5265 0.4496 0.3077 0.3283 0.3506 0.3906 0.3659 0.3005 0.1649 0.397 0.2537 0.2658 0.347 0.3851 0.3513 0.4789 0.2478 0.2172 0.0949 0.0986 0.3072 0.2133 0.2922 0.2926 0.2458 0.3972 0.6405 0.49 0.4415 0.2741 0.2088 0.206 0.6162 0.4667 0.839 0.3196 0.4728 0.2153 0.0433 0.2446 0.3063 0.1474 0.3743 0.6491 0.2071 0.3818 0.0936 0.3459 0.32 0.3843 0.3821 0.2137 0.4153 0.2946 0.3041 0.1986 0.1631 0.313 0.6193 0.2687 0.2745 0.0114 0.5227 0.4688 0.072 0.0799 0.3425 0.3973 0.403 0.0041 0.136 0.052 0.0417 0.0059 0.1944 0.1191 0.1865 0.246 0.0937 0.0037 0.1595 0.5116 0.0613 0.4581 0.1563 0.4198 0.0781 0.3387 0.0641 0.2004 0.0915 0.5723 0.7247 0.0027 0.4282 0.5394 0.0343 0.0752 0.4877 0.0095 0.2424 0.1684 0.2383 0.2036 0.4273 0.2708 0.2448 0.2011 0.5249 0.0082 0.0364 0.2013 0.0603 0.09 0.0433 0.012 0.13 0.2983 0.1619 0.1118 0.2915 0.3366 0.2999 0. 
0.2866 0.026 0.0574 0.0307] 2022-08-24 20:45:12 [INFO] [EVAL] Class Precision: [0.7628 0.8271 0.9584 0.8161 0.7612 0.8512 0.8924 0.829 0.6287 0.7786 0.6289 0.6387 0.7285 0.4834 0.463 0.552 0.6755 0.6412 0.6876 0.5413 0.7959 0.6004 0.66 0.5686 0.5369 0.6348 0.5017 0.6651 0.6181 0.4479 0.4094 0.5027 0.4378 0.4115 0.5361 0.5063 0.6031 0.7521 0.4113 0.5483 0.1648 0.2319 0.565 0.5154 0.393 0.5007 0.3858 0.5589 0.7239 0.6397 0.5638 0.3333 0.3909 0.6234 0.7053 0.6337 0.8964 0.5928 0.58 0.4231 0.0846 0.4739 0.463 0.6166 0.4708 0.772 0.3649 0.5225 0.201 0.6107 0.5547 0.4666 0.6496 0.2482 0.5975 0.4902 0.4774 0.6183 0.5644 0.6053 0.7193 0.5576 0.753 0.0289 0.6831 0.6798 0.4685 0.3351 0.6556 0.5497 0.5949 0.0058 0.2613 0.3262 0.268 0.0236 0.5291 0.4493 0.4367 0.5456 0.4653 0.0061 0.703 0.6217 0.6568 0.6305 0.4285 0.8967 0.2572 0.4968 0.3118 0.2787 0.4403 0.6599 0.7318 0.1528 0.7636 0.5759 0.06 0.7839 0.7152 0.2866 0.6785 0.6201 0.5907 0.6601 0.7209 0.3637 0.4445 0.4247 0.6706 0.5961 0.3249 0.7314 0.6524 0.3143 0.4864 0.0902 0.3724 0.7245 0.238 0.138 0.5088 0.679 0.6028 0. 0.8238 0.2226 0.3119 0.5727] 2022-08-24 20:45:12 [INFO] [EVAL] Class Recall: [0.8204 0.91 0.9623 0.8377 0.8297 0.846 0.8459 0.887 0.6707 0.7577 0.6076 0.7523 0.849 0.4488 0.2727 0.5235 0.6507 0.5103 0.7287 0.5399 0.8656 0.622 0.7224 0.6823 0.4188 0.4047 0.5379 0.4863 0.4727 0.4773 0.2165 0.6537 0.3763 0.4289 0.4959 0.6166 0.4569 0.5686 0.384 0.2646 0.1828 0.1464 0.4023 0.2668 0.5327 0.4131 0.4037 0.5785 0.8476 0.6767 0.6705 0.6067 0.3095 0.2352 0.8299 0.6391 0.929 0.4094 0.7189 0.3047 0.0816 0.3357 0.4751 0.1623 0.6462 0.8031 0.3238 0.5865 0.149 0.4437 0.4306 0.6855 0.4812 0.6056 0.5766 0.4247 0.4557 0.2263 0.1866 0.3932 0.8166 0.3415 0.3016 0.0185 0.6901 0.6017 0.0784 0.0949 0.4176 0.5889 0.5553 0.0143 0.2209 0.0583 0.0471 0.0078 0.235 0.1395 0.2456 0.3094 0.105 0.0093 0.171 0.7428 0.0633 0.6262 0.1974 0.4412 0.1009 0.5157 0.0747 0.4161 0.1036 0.8118 0.9868 0.0027 0.4936 0.895 0.0742 0.0768 0.6052 0.0097 0.2739 0.1878 0.2855 0.2274 0.512 0.5144 0.3527 0.2764 0.7073 0.0082 0.0393 0.2174 0.0623 0.112 0.0454 0.0137 0.1665 0.3365 0.3363 0.3706 0.4057 0.4003 0.3737 0. 0.3054 0.0286 0.0658 0.0314] 2022-08-24 20:45:12 [INFO] [EVAL] The model with the best validation mIoU (0.3137) was saved at iter 86000. 
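Each [TRAIN] line that follows prints the averaged loss, the current learning rate, batch_cost (seconds per training iteration), reader_cost (seconds spent waiting on the data loader), ips (samples processed per second), and an ETA. The printed numbers are internally consistent: ips × batch_cost recovers the effective batch size (about 8 samples per iteration here, an inference from the numbers rather than anything the log states), and the ETA is simply the remaining iterations times batch_cost. A quick check against the iter-90050 line below:

    import datetime

    # Figures copied from the first [TRAIN] line below (iter 90050 of 160000).
    iter_now, total_iters = 90050, 160000
    batch_cost = 0.2178   # seconds per training iteration
    ips = 36.7304         # samples processed per second

    # ips * batch_cost recovers the effective batch size (inferred from the
    # printed numbers; the log itself never states the batch size).
    batch_size = ips * batch_cost          # ~8.00

    # ETA is the remaining iterations times the per-iteration cost.
    eta_seconds = (total_iters - iter_now) * batch_cost
    eta = datetime.timedelta(seconds=round(eta_seconds))
    print(f"batch_size~{batch_size:.2f}  ETA~{eta}")
    # -> batch_size~8.00  ETA~4:13:55, matching the logged "ETA 04:13:55"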
2022-08-24 20:45:23 [INFO] [TRAIN] epoch: 72, iter: 90050/160000, loss: 0.7396, lr: 0.000530, batch_cost: 0.2178, reader_cost: 0.00316, ips: 36.7304 samples/sec | ETA 04:13:55 2022-08-24 20:45:34 [INFO] [TRAIN] epoch: 72, iter: 90100/160000, loss: 0.7572, lr: 0.000529, batch_cost: 0.2232, reader_cost: 0.00035, ips: 35.8457 samples/sec | ETA 04:20:00 2022-08-24 20:45:45 [INFO] [TRAIN] epoch: 72, iter: 90150/160000, loss: 0.7649, lr: 0.000529, batch_cost: 0.2079, reader_cost: 0.00047, ips: 38.4840 samples/sec | ETA 04:02:00 2022-08-24 20:45:55 [INFO] [TRAIN] epoch: 72, iter: 90200/160000, loss: 0.7250, lr: 0.000528, batch_cost: 0.1975, reader_cost: 0.00221, ips: 40.5012 samples/sec | ETA 03:49:47 2022-08-24 20:46:06 [INFO] [TRAIN] epoch: 72, iter: 90250/160000, loss: 0.7137, lr: 0.000528, batch_cost: 0.2216, reader_cost: 0.00097, ips: 36.0971 samples/sec | ETA 04:17:38 2022-08-24 20:46:16 [INFO] [TRAIN] epoch: 72, iter: 90300/160000, loss: 0.7016, lr: 0.000528, batch_cost: 0.2112, reader_cost: 0.00074, ips: 37.8808 samples/sec | ETA 04:05:19 2022-08-24 20:46:27 [INFO] [TRAIN] epoch: 72, iter: 90350/160000, loss: 0.8126, lr: 0.000527, batch_cost: 0.2193, reader_cost: 0.00082, ips: 36.4742 samples/sec | ETA 04:14:36 2022-08-24 20:46:37 [INFO] [TRAIN] epoch: 72, iter: 90400/160000, loss: 0.7298, lr: 0.000527, batch_cost: 0.1965, reader_cost: 0.00050, ips: 40.7068 samples/sec | ETA 03:47:58 2022-08-24 20:46:48 [INFO] [TRAIN] epoch: 72, iter: 90450/160000, loss: 0.7825, lr: 0.000527, batch_cost: 0.2083, reader_cost: 0.00587, ips: 38.4059 samples/sec | ETA 04:01:27 2022-08-24 20:46:56 [INFO] [TRAIN] epoch: 72, iter: 90500/160000, loss: 0.7598, lr: 0.000526, batch_cost: 0.1663, reader_cost: 0.00072, ips: 48.1070 samples/sec | ETA 03:12:37 2022-08-24 20:47:05 [INFO] [TRAIN] epoch: 72, iter: 90550/160000, loss: 0.7542, lr: 0.000526, batch_cost: 0.1798, reader_cost: 0.00056, ips: 44.4984 samples/sec | ETA 03:28:05 2022-08-24 20:47:15 [INFO] [TRAIN] epoch: 72, iter: 90600/160000, loss: 0.7591, lr: 0.000525, batch_cost: 0.2091, reader_cost: 0.00913, ips: 38.2531 samples/sec | ETA 04:01:53 2022-08-24 20:47:27 [INFO] [TRAIN] epoch: 72, iter: 90650/160000, loss: 0.6890, lr: 0.000525, batch_cost: 0.2375, reader_cost: 0.00065, ips: 33.6912 samples/sec | ETA 04:34:27 2022-08-24 20:47:38 [INFO] [TRAIN] epoch: 72, iter: 90700/160000, loss: 0.7595, lr: 0.000525, batch_cost: 0.2117, reader_cost: 0.00968, ips: 37.7823 samples/sec | ETA 04:04:33 2022-08-24 20:47:48 [INFO] [TRAIN] epoch: 72, iter: 90750/160000, loss: 0.7763, lr: 0.000524, batch_cost: 0.2120, reader_cost: 0.00445, ips: 37.7334 samples/sec | ETA 04:04:41 2022-08-24 20:47:58 [INFO] [TRAIN] epoch: 72, iter: 90800/160000, loss: 0.7550, lr: 0.000524, batch_cost: 0.2016, reader_cost: 0.01841, ips: 39.6738 samples/sec | ETA 03:52:33 2022-08-24 20:48:10 [INFO] [TRAIN] epoch: 72, iter: 90850/160000, loss: 0.8015, lr: 0.000524, batch_cost: 0.2224, reader_cost: 0.00046, ips: 35.9665 samples/sec | ETA 04:16:20 2022-08-24 20:48:21 [INFO] [TRAIN] epoch: 72, iter: 90900/160000, loss: 0.7221, lr: 0.000523, batch_cost: 0.2225, reader_cost: 0.00081, ips: 35.9594 samples/sec | ETA 04:16:12 2022-08-24 20:48:34 [INFO] [TRAIN] epoch: 73, iter: 90950/160000, loss: 0.7163, lr: 0.000523, batch_cost: 0.2711, reader_cost: 0.02849, ips: 29.5117 samples/sec | ETA 05:11:57 2022-08-24 20:48:44 [INFO] [TRAIN] epoch: 73, iter: 91000/160000, loss: 0.7854, lr: 0.000522, batch_cost: 0.2039, reader_cost: 0.00166, ips: 39.2309 samples/sec | ETA 03:54:30 2022-08-24 20:48:44 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 200s - batch_cost: 0.2002 - reader cost: 7.0274e-04 2022-08-24 20:52:05 [INFO] [EVAL] #Images: 2000 mIoU: 0.3132 Acc: 0.7451 Kappa: 0.7255 Dice: 0.4407 2022-08-24 20:52:05 [INFO] [EVAL] Class IoU: [0.6523 0.7741 0.925 0.7048 0.6649 0.7261 0.759 0.7345 0.4897 0.6177 0.4591 0.5145 0.6491 0.2867 0.2126 0.369 0.4973 0.3846 0.5476 0.3642 0.712 0.418 0.5207 0.4454 0.3283 0.4356 0.3702 0.4092 0.3603 0.3107 0.1911 0.3973 0.2554 0.2878 0.3486 0.3999 0.3518 0.4911 0.2412 0.2216 0.0885 0.066 0.2998 0.2011 0.2832 0.2843 0.2658 0.4071 0.651 0.437 0.4791 0.2672 0.2012 0.2127 0.5731 0.4859 0.8165 0.2417 0.4457 0.2095 0.0672 0.1761 0.2936 0.0794 0.3578 0.6247 0.1983 0.3532 0.1187 0.333 0.3397 0.405 0.38 0.2222 0.4148 0.28 0.3392 0.2225 0.1375 0.4421 0.6199 0.2865 0.2419 0.0118 0.426 0.4702 0.0827 0.0501 0.3401 0.4134 0.3921 0.0006 0.2039 0.0756 0.0047 0.0135 0.2295 0.1665 0.1899 0.3368 0.0974 0.0074 0.2107 0.371 0.1068 0.5097 0.1856 0.493 0.0715 0.3336 0.0586 0.1784 0.0785 0.604 0.7393 0.0002 0.3464 0.5525 0.0474 0.218 0.4545 0.0098 0.2199 0.0834 0.2225 0.1403 0.3956 0.2935 0.4047 0.2601 0.5198 0.009 0.0954 0.3091 0.0823 0.0876 0.065 0.0165 0.1256 0.3211 0.1165 0.0616 0.2791 0.441 0.3147 0.0008 0.2709 0.0106 0.0583 0.0142] 2022-08-24 20:52:05 [INFO] [EVAL] Class Precision: [0.7604 0.8406 0.9655 0.8069 0.7519 0.8801 0.8489 0.791 0.6082 0.7032 0.6739 0.6937 0.7341 0.4843 0.4803 0.5312 0.6749 0.6536 0.7082 0.561 0.7941 0.607 0.6591 0.5483 0.4837 0.6749 0.549 0.6803 0.6327 0.5485 0.3846 0.5147 0.4421 0.4147 0.4809 0.5827 0.5939 0.8003 0.4826 0.554 0.1448 0.2359 0.4783 0.5797 0.4411 0.4641 0.4167 0.707 0.757 0.5082 0.661 0.3184 0.4345 0.5293 0.6705 0.6556 0.8637 0.5882 0.5395 0.3893 0.0993 0.3387 0.3661 0.5332 0.4509 0.7198 0.3805 0.5757 0.3303 0.5497 0.5119 0.5909 0.6137 0.2574 0.6228 0.4501 0.6027 0.5367 0.5378 0.6368 0.7236 0.546 0.7744 0.0369 0.7011 0.6611 0.371 0.3988 0.5247 0.566 0.5686 0.0008 0.3669 0.2817 0.0381 0.0392 0.6105 0.4806 0.4875 0.5323 0.6207 0.0114 0.6173 0.7662 0.5236 0.7687 0.5139 0.8841 0.3094 0.5333 0.2868 0.2379 0.5314 0.7132 0.7459 0.0236 0.7061 0.5983 0.1066 0.6094 0.8264 0.2466 0.633 0.7 0.6262 0.6423 0.7463 0.3951 0.7464 0.4369 0.6165 0.3892 0.2461 0.5331 0.619 0.3416 0.3885 0.2179 0.2754 0.6543 0.2237 0.1824 0.5985 0.589 0.6759 0.0012 0.8276 0.3019 0.5207 0.7509] 2022-08-24 20:52:05 [INFO] [EVAL] Class Recall: [0.821 0.9073 0.9565 0.8478 0.8517 0.8058 0.8776 0.9114 0.7154 0.8355 0.5902 0.6658 0.8486 0.4128 0.2761 0.5472 0.6539 0.483 0.7072 0.5093 0.8732 0.573 0.7127 0.7035 0.5055 0.5513 0.532 0.5067 0.4556 0.4175 0.2752 0.6351 0.3768 0.4845 0.5589 0.5605 0.4631 0.5597 0.3254 0.2697 0.1853 0.084 0.4455 0.2354 0.4417 0.4233 0.4233 0.4897 0.823 0.757 0.6351 0.6244 0.2727 0.2623 0.7977 0.6525 0.9372 0.2909 0.7193 0.3121 0.172 0.2683 0.5971 0.0853 0.634 0.8254 0.2929 0.4774 0.1563 0.4579 0.5024 0.5627 0.4995 0.6192 0.554 0.4255 0.4368 0.2753 0.1559 0.5911 0.8123 0.3762 0.2602 0.017 0.5206 0.6195 0.0961 0.0542 0.4914 0.6054 0.5581 0.0026 0.3146 0.0937 0.0053 0.0202 0.2688 0.203 0.2373 0.4784 0.1036 0.0202 0.2423 0.4184 0.1183 0.602 0.2251 0.5271 0.085 0.4712 0.0687 0.4161 0.0843 0.7977 0.9882 0.0002 0.4048 0.8783 0.0785 0.2535 0.5025 0.0101 0.2521 0.0865 0.2566 0.1522 0.457 0.5328 0.4693 0.3912 0.7682 0.0091 0.1348 0.4238 0.0867 0.1053 0.0724 0.0175 0.1877 0.3868 0.1955 0.0852 0.3434 0.637 0.3707 0.0023 0.2872 0.0109 0.0616 0.0142] 2022-08-24 20:52:05 [INFO] [EVAL] The model with the best validation mIoU (0.3137) was 
saved at iter 86000. 2022-08-24 20:52:15 [INFO] [TRAIN] epoch: 73, iter: 91050/160000, loss: 0.7529, lr: 0.000522, batch_cost: 0.2062, reader_cost: 0.00410, ips: 38.7883 samples/sec | ETA 03:57:00 2022-08-24 20:52:26 [INFO] [TRAIN] epoch: 73, iter: 91100/160000, loss: 0.7470, lr: 0.000522, batch_cost: 0.2143, reader_cost: 0.00091, ips: 37.3242 samples/sec | ETA 04:06:07 2022-08-24 20:52:38 [INFO] [TRAIN] epoch: 73, iter: 91150/160000, loss: 0.7250, lr: 0.000521, batch_cost: 0.2389, reader_cost: 0.00066, ips: 33.4933 samples/sec | ETA 04:34:05 2022-08-24 20:52:49 [INFO] [TRAIN] epoch: 73, iter: 91200/160000, loss: 0.7269, lr: 0.000521, batch_cost: 0.2252, reader_cost: 0.00078, ips: 35.5315 samples/sec | ETA 04:18:10 2022-08-24 20:53:00 [INFO] [TRAIN] epoch: 73, iter: 91250/160000, loss: 0.7279, lr: 0.000521, batch_cost: 0.2095, reader_cost: 0.00061, ips: 38.1792 samples/sec | ETA 04:00:05 2022-08-24 20:53:10 [INFO] [TRAIN] epoch: 73, iter: 91300/160000, loss: 0.7644, lr: 0.000520, batch_cost: 0.2068, reader_cost: 0.00061, ips: 38.6834 samples/sec | ETA 03:56:47 2022-08-24 20:53:19 [INFO] [TRAIN] epoch: 73, iter: 91350/160000, loss: 0.6974, lr: 0.000520, batch_cost: 0.1816, reader_cost: 0.00107, ips: 44.0614 samples/sec | ETA 03:27:44 2022-08-24 20:53:31 [INFO] [TRAIN] epoch: 73, iter: 91400/160000, loss: 0.7279, lr: 0.000519, batch_cost: 0.2383, reader_cost: 0.00037, ips: 33.5765 samples/sec | ETA 04:32:24 2022-08-24 20:53:42 [INFO] [TRAIN] epoch: 73, iter: 91450/160000, loss: 0.7725, lr: 0.000519, batch_cost: 0.2100, reader_cost: 0.00080, ips: 38.0982 samples/sec | ETA 03:59:54 2022-08-24 20:53:52 [INFO] [TRAIN] epoch: 73, iter: 91500/160000, loss: 0.7832, lr: 0.000519, batch_cost: 0.1991, reader_cost: 0.00114, ips: 40.1905 samples/sec | ETA 03:47:15 2022-08-24 20:53:59 [INFO] [TRAIN] epoch: 73, iter: 91550/160000, loss: 0.7658, lr: 0.000518, batch_cost: 0.1572, reader_cost: 0.00042, ips: 50.8836 samples/sec | ETA 02:59:21 2022-08-24 20:54:10 [INFO] [TRAIN] epoch: 73, iter: 91600/160000, loss: 0.7410, lr: 0.000518, batch_cost: 0.2160, reader_cost: 0.00298, ips: 37.0351 samples/sec | ETA 04:06:15 2022-08-24 20:54:20 [INFO] [TRAIN] epoch: 73, iter: 91650/160000, loss: 0.7265, lr: 0.000517, batch_cost: 0.1970, reader_cost: 0.01606, ips: 40.6058 samples/sec | ETA 03:44:26 2022-08-24 20:54:31 [INFO] [TRAIN] epoch: 73, iter: 91700/160000, loss: 0.7421, lr: 0.000517, batch_cost: 0.2199, reader_cost: 0.00124, ips: 36.3725 samples/sec | ETA 04:10:22 2022-08-24 20:54:42 [INFO] [TRAIN] epoch: 73, iter: 91750/160000, loss: 0.7875, lr: 0.000517, batch_cost: 0.2172, reader_cost: 0.00056, ips: 36.8358 samples/sec | ETA 04:07:02 2022-08-24 20:54:53 [INFO] [TRAIN] epoch: 73, iter: 91800/160000, loss: 0.7025, lr: 0.000516, batch_cost: 0.2143, reader_cost: 0.00571, ips: 37.3301 samples/sec | ETA 04:03:35 2022-08-24 20:55:03 [INFO] [TRAIN] epoch: 73, iter: 91850/160000, loss: 0.7326, lr: 0.000516, batch_cost: 0.2022, reader_cost: 0.00711, ips: 39.5642 samples/sec | ETA 03:49:40 2022-08-24 20:55:13 [INFO] [TRAIN] epoch: 73, iter: 91900/160000, loss: 0.7630, lr: 0.000516, batch_cost: 0.2073, reader_cost: 0.00372, ips: 38.5999 samples/sec | ETA 03:55:14 2022-08-24 20:55:24 [INFO] [TRAIN] epoch: 73, iter: 91950/160000, loss: 0.7431, lr: 0.000515, batch_cost: 0.2066, reader_cost: 0.01191, ips: 38.7134 samples/sec | ETA 03:54:22 2022-08-24 20:55:34 [INFO] [TRAIN] epoch: 73, iter: 92000/160000, loss: 0.7051, lr: 0.000515, batch_cost: 0.2082, reader_cost: 0.00590, ips: 38.4313 samples/sec | ETA 03:55:55 2022-08-24 
20:55:34 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 192s - batch_cost: 0.1921 - reader cost: 9.2878e-04 2022-08-24 20:58:46 [INFO] [EVAL] #Images: 2000 mIoU: 0.3115 Acc: 0.7469 Kappa: 0.7274 Dice: 0.4381 2022-08-24 20:58:46 [INFO] [EVAL] Class IoU: [0.657 0.7703 0.9258 0.7011 0.6647 0.736 0.7577 0.7525 0.4821 0.6419 0.4579 0.5261 0.6592 0.2981 0.2143 0.3695 0.4901 0.4178 0.5421 0.3623 0.7142 0.4119 0.5388 0.4383 0.3338 0.4005 0.3908 0.3888 0.3465 0.3285 0.2207 0.3946 0.2706 0.2648 0.3512 0.3899 0.3602 0.4662 0.2579 0.25 0.0671 0.0989 0.2932 0.21 0.2847 0.2836 0.2315 0.4101 0.6388 0.4902 0.443 0.2748 0.1678 0.2508 0.6258 0.4867 0.8153 0.308 0.4886 0.1572 0.0504 0.2291 0.2898 0.1421 0.3913 0.6574 0.2211 0.3743 0.1063 0.3432 0.3359 0.4117 0.3543 0.23 0.408 0.2833 0.3764 0.246 0.1373 0.3071 0.6137 0.2689 0.2106 0.0141 0.4963 0.4728 0.0775 0.0633 0.3055 0.4133 0.422 0.003 0.203 0.0572 0.0928 0.0073 0.1773 0.1418 0.2313 0.3234 0.0971 0.023 0.2125 0.5169 0.0858 0.4175 0.0996 0.4683 0.1008 0.4031 0.0444 0.0638 0.0928 0.6051 0.6856 0.0015 0.3482 0.5419 0.0343 0.161 0.4579 0.0146 0.2839 0.1102 0.2497 0.1995 0.4206 0.3141 0.0677 0.228 0.5027 0.0092 0.0658 0.161 0.0859 0.0894 0.0673 0.0108 0.1197 0.2702 0.192 0.0343 0.2594 0.4013 0.297 0. 0.2774 0.0101 0.0404 0.027 ] 2022-08-24 20:58:46 [INFO] [EVAL] Class Precision: [0.7633 0.8349 0.9635 0.7979 0.7718 0.837 0.8852 0.8319 0.6072 0.7947 0.6599 0.644 0.7581 0.4727 0.4629 0.539 0.6728 0.6101 0.7427 0.5699 0.7943 0.5784 0.6613 0.5925 0.5272 0.6226 0.5413 0.712 0.6562 0.5266 0.3855 0.5101 0.4477 0.3569 0.4878 0.5982 0.5881 0.7222 0.5204 0.5297 0.151 0.2393 0.5289 0.541 0.3867 0.4716 0.3487 0.6211 0.6905 0.6325 0.5649 0.3406 0.3979 0.5593 0.6924 0.6624 0.8627 0.5977 0.6519 0.4857 0.0853 0.4739 0.4552 0.6221 0.4949 0.8053 0.4043 0.5012 0.348 0.6674 0.5123 0.6103 0.6621 0.2879 0.5724 0.4254 0.5335 0.6211 0.5702 0.618 0.7005 0.5346 0.7841 0.0868 0.6908 0.6618 0.2495 0.3758 0.4856 0.5738 0.6483 0.0042 0.375 0.2857 0.3078 0.0204 0.6731 0.5582 0.4554 0.5112 0.5435 0.037 0.598 0.7481 0.5577 0.5538 0.4536 0.8758 0.2328 0.5602 0.2203 0.0701 0.3723 0.7077 0.6905 0.0171 0.7094 0.5734 0.1026 0.6194 0.7009 0.5318 0.5701 0.6232 0.5779 0.5979 0.7391 0.4661 0.2248 0.3191 0.5743 0.8036 0.5641 0.8349 0.5716 0.3974 0.33 0.1423 0.2868 0.7252 0.3321 0.0934 0.3698 0.5037 0.5364 0. 0.8919 0.2525 0.5074 0.5513] 2022-08-24 20:58:46 [INFO] [EVAL] Class Recall: [0.825 0.9088 0.9594 0.8525 0.8272 0.8591 0.8403 0.8875 0.7006 0.7696 0.5994 0.7418 0.8347 0.4466 0.2851 0.5402 0.6436 0.57 0.6675 0.4987 0.8763 0.5886 0.7442 0.6274 0.4763 0.5288 0.5842 0.4613 0.4233 0.4662 0.3405 0.6352 0.4062 0.5064 0.5564 0.5282 0.4817 0.5681 0.3382 0.3213 0.1077 0.1443 0.3968 0.2556 0.5191 0.4157 0.4079 0.547 0.8951 0.6855 0.6726 0.5874 0.2249 0.3125 0.8667 0.6472 0.9368 0.3886 0.6611 0.1886 0.1099 0.3072 0.4438 0.1555 0.6514 0.7816 0.328 0.5966 0.1328 0.414 0.4938 0.5584 0.4325 0.5336 0.5868 0.4588 0.561 0.2894 0.1531 0.3791 0.8319 0.351 0.2236 0.0165 0.638 0.6233 0.1011 0.0708 0.4516 0.5964 0.5473 0.0103 0.3069 0.0667 0.1172 0.0112 0.194 0.1598 0.3198 0.4682 0.1058 0.0572 0.2479 0.6259 0.092 0.629 0.1131 0.5015 0.1509 0.5897 0.0527 0.4161 0.11 0.8068 0.9897 0.0016 0.4062 0.908 0.0491 0.1786 0.569 0.0148 0.3612 0.118 0.3054 0.2304 0.4939 0.4906 0.0883 0.4439 0.8011 0.0092 0.0693 0.1663 0.0919 0.1035 0.0779 0.0116 0.1705 0.3011 0.3127 0.0514 0.4649 0.6638 0.3997 0. 
0.2871 0.0104 0.042 0.0276] 2022-08-24 20:58:46 [INFO] [EVAL] The model with the best validation mIoU (0.3137) was saved at iter 86000. 2022-08-24 20:58:56 [INFO] [TRAIN] epoch: 73, iter: 92050/160000, loss: 0.7226, lr: 0.000514, batch_cost: 0.1994, reader_cost: 0.00326, ips: 40.1146 samples/sec | ETA 03:45:51 2022-08-24 20:59:06 [INFO] [TRAIN] epoch: 73, iter: 92100/160000, loss: 0.7315, lr: 0.000514, batch_cost: 0.1999, reader_cost: 0.00068, ips: 40.0181 samples/sec | ETA 03:46:13 2022-08-24 20:59:15 [INFO] [TRAIN] epoch: 73, iter: 92150/160000, loss: 0.7777, lr: 0.000514, batch_cost: 0.1803, reader_cost: 0.00034, ips: 44.3713 samples/sec | ETA 03:23:53 2022-08-24 20:59:28 [INFO] [TRAIN] epoch: 74, iter: 92200/160000, loss: 0.7221, lr: 0.000513, batch_cost: 0.2548, reader_cost: 0.04917, ips: 31.3921 samples/sec | ETA 04:47:58 2022-08-24 20:59:39 [INFO] [TRAIN] epoch: 74, iter: 92250/160000, loss: 0.7408, lr: 0.000513, batch_cost: 0.2275, reader_cost: 0.02846, ips: 35.1584 samples/sec | ETA 04:16:55 2022-08-24 20:59:50 [INFO] [TRAIN] epoch: 74, iter: 92300/160000, loss: 0.7228, lr: 0.000513, batch_cost: 0.2093, reader_cost: 0.00929, ips: 38.2235 samples/sec | ETA 03:56:09 2022-08-24 21:00:00 [INFO] [TRAIN] epoch: 74, iter: 92350/160000, loss: 0.7399, lr: 0.000512, batch_cost: 0.1949, reader_cost: 0.00738, ips: 41.0476 samples/sec | ETA 03:39:44 2022-08-24 21:00:10 [INFO] [TRAIN] epoch: 74, iter: 92400/160000, loss: 0.7177, lr: 0.000512, batch_cost: 0.2090, reader_cost: 0.00178, ips: 38.2838 samples/sec | ETA 03:55:26 2022-08-24 21:00:22 [INFO] [TRAIN] epoch: 74, iter: 92450/160000, loss: 0.7850, lr: 0.000511, batch_cost: 0.2280, reader_cost: 0.00160, ips: 35.0872 samples/sec | ETA 04:16:41 2022-08-24 21:00:31 [INFO] [TRAIN] epoch: 74, iter: 92500/160000, loss: 0.7434, lr: 0.000511, batch_cost: 0.1812, reader_cost: 0.00068, ips: 44.1382 samples/sec | ETA 03:23:54 2022-08-24 21:00:41 [INFO] [TRAIN] epoch: 74, iter: 92550/160000, loss: 0.7340, lr: 0.000511, batch_cost: 0.2139, reader_cost: 0.00650, ips: 37.4052 samples/sec | ETA 04:00:25 2022-08-24 21:00:50 [INFO] [TRAIN] epoch: 74, iter: 92600/160000, loss: 0.7845, lr: 0.000510, batch_cost: 0.1804, reader_cost: 0.00612, ips: 44.3531 samples/sec | ETA 03:22:36 2022-08-24 21:01:00 [INFO] [TRAIN] epoch: 74, iter: 92650/160000, loss: 0.7094, lr: 0.000510, batch_cost: 0.1924, reader_cost: 0.00462, ips: 41.5890 samples/sec | ETA 03:35:55 2022-08-24 21:01:10 [INFO] [TRAIN] epoch: 74, iter: 92700/160000, loss: 0.6561, lr: 0.000510, batch_cost: 0.1986, reader_cost: 0.00134, ips: 40.2802 samples/sec | ETA 03:42:46 2022-08-24 21:01:19 [INFO] [TRAIN] epoch: 74, iter: 92750/160000, loss: 0.7508, lr: 0.000509, batch_cost: 0.1903, reader_cost: 0.00714, ips: 42.0331 samples/sec | ETA 03:33:19 2022-08-24 21:01:29 [INFO] [TRAIN] epoch: 74, iter: 92800/160000, loss: 0.7319, lr: 0.000509, batch_cost: 0.1952, reader_cost: 0.00053, ips: 40.9925 samples/sec | ETA 03:38:34 2022-08-24 21:01:40 [INFO] [TRAIN] epoch: 74, iter: 92850/160000, loss: 0.7414, lr: 0.000508, batch_cost: 0.2078, reader_cost: 0.00445, ips: 38.4954 samples/sec | ETA 03:52:34 2022-08-24 21:01:50 [INFO] [TRAIN] epoch: 74, iter: 92900/160000, loss: 0.7707, lr: 0.000508, batch_cost: 0.2015, reader_cost: 0.00253, ips: 39.6961 samples/sec | ETA 03:45:22 2022-08-24 21:01:59 [INFO] [TRAIN] epoch: 74, iter: 92950/160000, loss: 0.7677, lr: 0.000508, batch_cost: 0.1903, reader_cost: 0.00582, ips: 42.0325 samples/sec | ETA 03:32:41 2022-08-24 21:02:10 [INFO] [TRAIN] epoch: 74, iter: 93000/160000, loss: 
0.7359, lr: 0.000507, batch_cost: 0.2074, reader_cost: 0.00043, ips: 38.5796 samples/sec | ETA 03:51:33 2022-08-24 21:02:10 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 208s - batch_cost: 0.2082 - reader cost: 6.7131e-04 2022-08-24 21:05:38 [INFO] [EVAL] #Images: 2000 mIoU: 0.3147 Acc: 0.7458 Kappa: 0.7261 Dice: 0.4413 2022-08-24 21:05:38 [INFO] [EVAL] Class IoU: [0.6525 0.7666 0.9226 0.6989 0.6726 0.7404 0.7552 0.7387 0.4821 0.6044 0.4574 0.5259 0.6431 0.2855 0.2111 0.3624 0.478 0.4167 0.5501 0.3713 0.7248 0.4499 0.5523 0.4439 0.3233 0.4229 0.3844 0.3891 0.3265 0.2961 0.2029 0.3892 0.2717 0.2851 0.3778 0.3816 0.3533 0.4805 0.2721 0.2549 0.1076 0.09 0.2759 0.2083 0.2987 0.2687 0.2492 0.4124 0.6514 0.5074 0.4722 0.2305 0.2203 0.2243 0.633 0.4974 0.8153 0.3108 0.4854 0.1873 0.0401 0.4146 0.336 0.0792 0.3699 0.6449 0.2273 0.3509 0.0911 0.3364 0.3578 0.4107 0.3883 0.2535 0.393 0.2978 0.3177 0.25 0.0427 0.3071 0.6357 0.2945 0.1901 0.0104 0.5309 0.4655 0.0729 0.0369 0.2384 0.3956 0.406 0.0024 0.2075 0.0874 0.0509 0.0065 0.1746 0.1745 0.1101 0.3966 0.0733 0.0078 0.2159 0.2827 0.0399 0.5121 0.1923 0.4816 0.0966 0.3905 0.0926 0.0705 0.0863 0.5816 0.7288 0.0029 0.3864 0.548 0.0291 0.3398 0.4252 0.0061 0.2499 0.0953 0.1871 0.2052 0.4231 0.333 0.3642 0.2627 0.494 0.0072 0.1685 0.2692 0.0551 0.0891 0.0389 0.0155 0.1174 0.3234 0.1251 0.05 0.2396 0.4271 0.3146 0. 0.2228 0.0071 0.0493 0.0249] 2022-08-24 21:05:38 [INFO] [EVAL] Class Precision: [0.757 0.8346 0.9516 0.7995 0.7806 0.8505 0.8948 0.7993 0.624 0.7089 0.7165 0.6585 0.7132 0.486 0.4433 0.5502 0.6595 0.6511 0.6878 0.5386 0.8285 0.6084 0.7208 0.5893 0.56 0.6332 0.5573 0.7315 0.7075 0.4712 0.3681 0.5047 0.4598 0.398 0.4633 0.5238 0.6174 0.7062 0.5312 0.5103 0.148 0.2557 0.4055 0.5222 0.4232 0.476 0.4358 0.7103 0.7365 0.6496 0.6221 0.2671 0.3923 0.7386 0.6945 0.6883 0.8644 0.6076 0.7161 0.4548 0.0826 0.6824 0.4613 0.5738 0.4657 0.7647 0.3587 0.577 0.2268 0.6206 0.5671 0.5823 0.6455 0.3404 0.5851 0.4695 0.5172 0.5457 0.8716 0.6797 0.7407 0.5297 0.7923 0.0304 0.7237 0.6956 0.3748 0.4959 0.367 0.5465 0.5825 0.0031 0.3328 0.2519 0.2365 0.025 0.7438 0.5097 0.5236 0.5453 0.7665 0.0184 0.6039 0.7759 0.3792 0.7984 0.4614 0.9057 0.2946 0.566 0.2673 0.0782 0.4115 0.8221 0.7346 0.049 0.7591 0.6023 0.0967 0.6643 0.7993 0.3984 0.6826 0.6618 0.6664 0.7198 0.7636 0.4856 0.6297 0.5238 0.5894 0.6329 0.5408 0.6629 0.6496 0.3699 0.3635 0.2227 0.2936 0.6349 0.2628 0.1061 0.6429 0.5099 0.7561 0. 
0.921 0.3808 0.3694 0.5811] 2022-08-24 21:05:38 [INFO] [EVAL] Class Recall: [0.8253 0.904 0.9681 0.8475 0.8293 0.8511 0.8287 0.9069 0.6794 0.8039 0.5584 0.723 0.8673 0.4091 0.2873 0.5151 0.6347 0.5365 0.7331 0.5445 0.8528 0.6333 0.7026 0.6428 0.4335 0.56 0.5533 0.454 0.3774 0.4435 0.3113 0.6297 0.399 0.5013 0.6716 0.5844 0.4524 0.6005 0.358 0.3374 0.2829 0.122 0.4634 0.2574 0.5039 0.3816 0.3678 0.4958 0.8493 0.6987 0.6621 0.6275 0.3344 0.2436 0.8772 0.6421 0.9348 0.3889 0.6011 0.2415 0.0722 0.5138 0.5531 0.0842 0.6425 0.8046 0.3829 0.4725 0.1321 0.4235 0.4923 0.5822 0.4936 0.4985 0.5448 0.4489 0.4516 0.3157 0.043 0.3591 0.8178 0.3987 0.2001 0.0155 0.6659 0.5846 0.0831 0.0383 0.4048 0.5888 0.5727 0.0097 0.3553 0.118 0.0609 0.0087 0.1858 0.2097 0.1223 0.5925 0.075 0.0133 0.2515 0.3078 0.0427 0.5881 0.248 0.507 0.1256 0.5573 0.1241 0.4161 0.0985 0.6654 0.9894 0.0031 0.4404 0.8588 0.0399 0.4102 0.476 0.0062 0.2828 0.1002 0.2064 0.223 0.4869 0.5146 0.4635 0.3451 0.7533 0.0072 0.1966 0.3119 0.0568 0.105 0.0418 0.0163 0.1636 0.3973 0.1927 0.0865 0.2764 0.7244 0.3501 0. 0.2272 0.0072 0.0539 0.0254] 2022-08-24 21:05:38 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 2022-08-24 21:05:49 [INFO] [TRAIN] epoch: 74, iter: 93050/160000, loss: 0.7708, lr: 0.000507, batch_cost: 0.2252, reader_cost: 0.00502, ips: 35.5314 samples/sec | ETA 04:11:13 2022-08-24 21:06:01 [INFO] [TRAIN] epoch: 74, iter: 93100/160000, loss: 0.7283, lr: 0.000507, batch_cost: 0.2289, reader_cost: 0.00238, ips: 34.9449 samples/sec | ETA 04:15:15 2022-08-24 21:06:13 [INFO] [TRAIN] epoch: 74, iter: 93150/160000, loss: 0.7797, lr: 0.000506, batch_cost: 0.2376, reader_cost: 0.00047, ips: 33.6641 samples/sec | ETA 04:24:46 2022-08-24 21:06:24 [INFO] [TRAIN] epoch: 74, iter: 93200/160000, loss: 0.7658, lr: 0.000506, batch_cost: 0.2186, reader_cost: 0.00068, ips: 36.5893 samples/sec | ETA 04:03:25 2022-08-24 21:06:34 [INFO] [TRAIN] epoch: 74, iter: 93250/160000, loss: 0.7579, lr: 0.000505, batch_cost: 0.2122, reader_cost: 0.00058, ips: 37.6999 samples/sec | ETA 03:56:04 2022-08-24 21:06:44 [INFO] [TRAIN] epoch: 74, iter: 93300/160000, loss: 0.7320, lr: 0.000505, batch_cost: 0.1832, reader_cost: 0.00109, ips: 43.6724 samples/sec | ETA 03:23:38 2022-08-24 21:06:54 [INFO] [TRAIN] epoch: 74, iter: 93350/160000, loss: 0.7324, lr: 0.000505, batch_cost: 0.2106, reader_cost: 0.00091, ips: 37.9810 samples/sec | ETA 03:53:58 2022-08-24 21:07:05 [INFO] [TRAIN] epoch: 74, iter: 93400/160000, loss: 0.7986, lr: 0.000504, batch_cost: 0.2107, reader_cost: 0.00040, ips: 37.9641 samples/sec | ETA 03:53:54 2022-08-24 21:07:15 [INFO] [TRAIN] epoch: 74, iter: 93450/160000, loss: 0.7727, lr: 0.000504, batch_cost: 0.2036, reader_cost: 0.00068, ips: 39.2967 samples/sec | ETA 03:45:48 2022-08-24 21:07:26 [INFO] [TRAIN] epoch: 75, iter: 93500/160000, loss: 0.7327, lr: 0.000503, batch_cost: 0.2343, reader_cost: 0.04027, ips: 34.1473 samples/sec | ETA 04:19:39 2022-08-24 21:07:37 [INFO] [TRAIN] epoch: 75, iter: 93550/160000, loss: 0.7581, lr: 0.000503, batch_cost: 0.2138, reader_cost: 0.00056, ips: 37.4122 samples/sec | ETA 03:56:49 2022-08-24 21:07:48 [INFO] [TRAIN] epoch: 75, iter: 93600/160000, loss: 0.7664, lr: 0.000503, batch_cost: 0.2130, reader_cost: 0.00239, ips: 37.5504 samples/sec | ETA 03:55:46 2022-08-24 21:07:59 [INFO] [TRAIN] epoch: 75, iter: 93650/160000, loss: 0.7232, lr: 0.000502, batch_cost: 0.2304, reader_cost: 0.00058, ips: 34.7272 samples/sec | ETA 04:14:44 2022-08-24 21:08:11 [INFO] [TRAIN] epoch: 75, 
iter: 93700/160000, loss: 0.7600, lr: 0.000502, batch_cost: 0.2306, reader_cost: 0.00046, ips: 34.6967 samples/sec | ETA 04:14:46 2022-08-24 21:08:21 [INFO] [TRAIN] epoch: 75, iter: 93750/160000, loss: 0.6990, lr: 0.000502, batch_cost: 0.2076, reader_cost: 0.00058, ips: 38.5419 samples/sec | ETA 03:49:11 2022-08-24 21:08:32 [INFO] [TRAIN] epoch: 75, iter: 93800/160000, loss: 0.7107, lr: 0.000501, batch_cost: 0.2088, reader_cost: 0.00544, ips: 38.3080 samples/sec | ETA 03:50:24 2022-08-24 21:08:42 [INFO] [TRAIN] epoch: 75, iter: 93850/160000, loss: 0.7114, lr: 0.000501, batch_cost: 0.2045, reader_cost: 0.00082, ips: 39.1120 samples/sec | ETA 03:45:30 2022-08-24 21:08:52 [INFO] [TRAIN] epoch: 75, iter: 93900/160000, loss: 0.7585, lr: 0.000500, batch_cost: 0.2020, reader_cost: 0.02683, ips: 39.5990 samples/sec | ETA 03:42:33 2022-08-24 21:09:03 [INFO] [TRAIN] epoch: 75, iter: 93950/160000, loss: 0.6942, lr: 0.000500, batch_cost: 0.2249, reader_cost: 0.00071, ips: 35.5748 samples/sec | ETA 04:07:33 2022-08-24 21:09:14 [INFO] [TRAIN] epoch: 75, iter: 94000/160000, loss: 0.7027, lr: 0.000500, batch_cost: 0.2173, reader_cost: 0.00049, ips: 36.8112 samples/sec | ETA 03:59:03 2022-08-24 21:09:14 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 202s - batch_cost: 0.2017 - reader cost: 0.0012 2022-08-24 21:12:36 [INFO] [EVAL] #Images: 2000 mIoU: 0.3131 Acc: 0.7457 Kappa: 0.7258 Dice: 0.4398 2022-08-24 21:12:36 [INFO] [EVAL] Class IoU: [0.6516 0.7669 0.9247 0.7018 0.6631 0.7404 0.7585 0.7417 0.4872 0.6078 0.4547 0.5169 0.6588 0.2838 0.2216 0.37 0.4912 0.3986 0.5533 0.3617 0.722 0.434 0.541 0.4523 0.3225 0.3453 0.373 0.3978 0.3679 0.2774 0.1906 0.4086 0.2851 0.2779 0.4211 0.3805 0.3497 0.488 0.2566 0.217 0.1106 0.0686 0.2921 0.2107 0.2913 0.2811 0.2131 0.4232 0.6878 0.4913 0.4522 0.2977 0.187 0.1784 0.6168 0.4512 0.8058 0.3368 0.5049 0.2179 0.0618 0.1575 0.3068 0.1033 0.3705 0.6501 0.1825 0.3433 0.0796 0.3213 0.3407 0.4327 0.3896 0.2313 0.4006 0.2966 0.3175 0.2626 0.2414 0.2794 0.6455 0.2537 0.2751 0.0126 0.5325 0.4469 0.0678 0.0549 0.3553 0.4006 0.3767 0.0135 0.1891 0.0656 0.0264 0.015 0.1802 0.1764 0.1683 0.3854 0.086 0.0134 0.1797 0.4988 0.0375 0.4257 0.1446 0.4411 0.0985 0.3255 0.08 0.1394 0.0808 0.5368 0.7569 0.0046 0.3582 0.5744 0.0379 0.0602 0.4194 0.0075 0.2596 0.0659 0.2412 0.2097 0.4199 0.3433 0.3965 0.253 0.5068 0.0157 0.1398 0.225 0.0561 0.0995 0.0701 0.0126 0.1004 0.3154 0.1578 0.0318 0.2498 0.4215 0.3255 0. 
0.236 0.0168 0.0691 0.0341] 2022-08-24 21:12:36 [INFO] [EVAL] Class Precision: [0.7493 0.8269 0.962 0.8014 0.7606 0.8568 0.8861 0.8109 0.6298 0.7074 0.6692 0.6743 0.7551 0.4723 0.4805 0.5539 0.7189 0.6691 0.7397 0.5772 0.822 0.604 0.6778 0.5582 0.579 0.6362 0.5339 0.6252 0.6481 0.4397 0.3958 0.5585 0.5395 0.3851 0.5338 0.5204 0.5672 0.821 0.5286 0.5527 0.1965 0.2209 0.4653 0.5149 0.4239 0.4512 0.3118 0.686 0.7538 0.6144 0.5587 0.3828 0.2847 0.6571 0.7128 0.5968 0.8483 0.5823 0.7237 0.3689 0.1075 0.3675 0.4157 0.6348 0.454 0.8074 0.3352 0.5506 0.1806 0.7033 0.6005 0.5804 0.6312 0.3038 0.575 0.5498 0.5043 0.5474 0.6215 0.7469 0.7434 0.499 0.7369 0.0388 0.7143 0.7011 0.4118 0.4134 0.6088 0.5288 0.4877 0.016 0.3373 0.2758 0.1576 0.0308 0.6097 0.4694 0.6062 0.5586 0.7982 0.026 0.6516 0.7219 0.3847 0.5444 0.3185 0.9461 0.2693 0.5367 0.2829 0.1733 0.3904 0.8037 0.7658 0.0605 0.8046 0.6052 0.1151 0.487 0.7499 0.4944 0.7938 0.6596 0.5991 0.605 0.664 0.4919 0.6938 0.4706 0.6303 0.3288 0.5738 0.6775 0.7046 0.2799 0.2522 0.14 0.4972 0.6683 0.2616 0.06 0.5106 0.6227 0.5826 0. 0.9174 0.3451 0.5201 0.5902] 2022-08-24 21:12:36 [INFO] [EVAL] Class Recall: [0.8333 0.9136 0.9598 0.8496 0.838 0.845 0.8404 0.8968 0.6827 0.812 0.5865 0.6889 0.8378 0.4156 0.2914 0.527 0.6081 0.4965 0.6871 0.4921 0.8558 0.6066 0.7283 0.7045 0.4213 0.4302 0.5531 0.5224 0.4597 0.4291 0.2688 0.6036 0.3768 0.4996 0.6662 0.586 0.4769 0.5461 0.3328 0.2632 0.2018 0.0905 0.4396 0.2629 0.4821 0.4271 0.4024 0.5249 0.8871 0.7102 0.7035 0.5726 0.3525 0.1967 0.8207 0.6491 0.9415 0.444 0.6255 0.3474 0.127 0.2161 0.5394 0.1098 0.6682 0.7694 0.2861 0.477 0.1246 0.3717 0.4405 0.6297 0.5044 0.4923 0.5692 0.3918 0.4616 0.3355 0.283 0.3087 0.8306 0.3404 0.3051 0.0182 0.6766 0.552 0.0751 0.0596 0.4603 0.6229 0.6234 0.0792 0.301 0.0792 0.0307 0.0283 0.2037 0.2203 0.1889 0.5542 0.0879 0.0271 0.1989 0.6175 0.0399 0.6613 0.2093 0.4525 0.1344 0.4526 0.1004 0.4161 0.0925 0.6178 0.9848 0.0049 0.3923 0.9185 0.0534 0.0643 0.4876 0.0075 0.2784 0.0682 0.2876 0.243 0.5332 0.5321 0.4806 0.3536 0.7213 0.0162 0.156 0.2521 0.0575 0.1337 0.0885 0.0137 0.1118 0.3739 0.2845 0.0636 0.3284 0.5661 0.4246 0. 0.2411 0.0173 0.0738 0.0349] 2022-08-24 21:12:36 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 
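The [TRAIN] lines carry enough information to check the throughput and ETA arithmetic: ips is samples per second, so batch_cost * ips recovers the per-step batch size (about 8 throughout this log), and the printed ETA is consistent with (total_iters - iter) * batch_cost. A small sketch follows, using one line quoted from this section; the regular expression and variable names are assumptions for illustration, not part of PaddleSeg's tooling.

import re
from datetime import timedelta

# One [TRAIN] line quoted verbatim from this log.
line = ("2022-08-24 20:45:23 [INFO] [TRAIN] epoch: 72, iter: 90050/160000, "
        "loss: 0.7396, lr: 0.000530, batch_cost: 0.2178, reader_cost: 0.00316, "
        "ips: 36.7304 samples/sec | ETA 04:13:55")

m = re.search(r"iter: (\d+)/(\d+).*batch_cost: ([\d.]+).*ips: ([\d.]+)", line)
it, total = int(m.group(1)), int(m.group(2))
batch_cost, ips = float(m.group(3)), float(m.group(4))

batch_size = round(batch_cost * ips)          # 0.2178 s/step * 36.7304 samples/s -> 8
eta = timedelta(seconds=int((total - it) * batch_cost))
print(batch_size, eta)                        # 8 4:13:55, matching the printed ETA 04:13:55

Because batch_cost is printed rounded to four decimals, the reconstructed ETA can differ from the logged value by a few seconds on other lines.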
2022-08-24 21:12:46 [INFO] [TRAIN] epoch: 75, iter: 94050/160000, loss: 0.7245, lr: 0.000499, batch_cost: 0.2000, reader_cost: 0.00298, ips: 40.0089 samples/sec | ETA 03:39:47 2022-08-24 21:12:57 [INFO] [TRAIN] epoch: 75, iter: 94100/160000, loss: 0.7546, lr: 0.000499, batch_cost: 0.2194, reader_cost: 0.00393, ips: 36.4625 samples/sec | ETA 04:00:58 2022-08-24 21:13:07 [INFO] [TRAIN] epoch: 75, iter: 94150/160000, loss: 0.7995, lr: 0.000499, batch_cost: 0.1937, reader_cost: 0.00533, ips: 41.3066 samples/sec | ETA 03:32:33 2022-08-24 21:13:17 [INFO] [TRAIN] epoch: 75, iter: 94200/160000, loss: 0.7314, lr: 0.000498, batch_cost: 0.2064, reader_cost: 0.00064, ips: 38.7533 samples/sec | ETA 03:46:23 2022-08-24 21:13:27 [INFO] [TRAIN] epoch: 75, iter: 94250/160000, loss: 0.6757, lr: 0.000498, batch_cost: 0.1943, reader_cost: 0.02759, ips: 41.1754 samples/sec | ETA 03:32:54 2022-08-24 21:13:38 [INFO] [TRAIN] epoch: 75, iter: 94300/160000, loss: 0.6950, lr: 0.000497, batch_cost: 0.2241, reader_cost: 0.00045, ips: 35.7029 samples/sec | ETA 04:05:21 2022-08-24 21:13:49 [INFO] [TRAIN] epoch: 75, iter: 94350/160000, loss: 0.7626, lr: 0.000497, batch_cost: 0.2100, reader_cost: 0.00041, ips: 38.0884 samples/sec | ETA 03:49:48 2022-08-24 21:14:00 [INFO] [TRAIN] epoch: 75, iter: 94400/160000, loss: 0.7383, lr: 0.000497, batch_cost: 0.2189, reader_cost: 0.00055, ips: 36.5521 samples/sec | ETA 03:59:17 2022-08-24 21:14:10 [INFO] [TRAIN] epoch: 75, iter: 94450/160000, loss: 0.7499, lr: 0.000496, batch_cost: 0.2190, reader_cost: 0.00040, ips: 36.5280 samples/sec | ETA 03:59:16 2022-08-24 21:14:22 [INFO] [TRAIN] epoch: 75, iter: 94500/160000, loss: 0.6722, lr: 0.000496, batch_cost: 0.2211, reader_cost: 0.00488, ips: 36.1838 samples/sec | ETA 04:01:21 2022-08-24 21:14:34 [INFO] [TRAIN] epoch: 75, iter: 94550/160000, loss: 0.7293, lr: 0.000496, batch_cost: 0.2476, reader_cost: 0.00168, ips: 32.3110 samples/sec | ETA 04:30:05 2022-08-24 21:14:46 [INFO] [TRAIN] epoch: 75, iter: 94600/160000, loss: 0.7729, lr: 0.000495, batch_cost: 0.2513, reader_cost: 0.00093, ips: 31.8346 samples/sec | ETA 04:33:54 2022-08-24 21:14:56 [INFO] [TRAIN] epoch: 75, iter: 94650/160000, loss: 0.7427, lr: 0.000495, batch_cost: 0.1988, reader_cost: 0.00036, ips: 40.2512 samples/sec | ETA 03:36:28 2022-08-24 21:15:08 [INFO] [TRAIN] epoch: 75, iter: 94700/160000, loss: 0.7628, lr: 0.000494, batch_cost: 0.2402, reader_cost: 0.00053, ips: 33.3018 samples/sec | ETA 04:21:26 2022-08-24 21:15:21 [INFO] [TRAIN] epoch: 76, iter: 94750/160000, loss: 0.6960, lr: 0.000494, batch_cost: 0.2531, reader_cost: 0.03826, ips: 31.6036 samples/sec | ETA 04:35:17 2022-08-24 21:15:33 [INFO] [TRAIN] epoch: 76, iter: 94800/160000, loss: 0.7220, lr: 0.000494, batch_cost: 0.2360, reader_cost: 0.00107, ips: 33.8955 samples/sec | ETA 04:16:28 2022-08-24 21:15:44 [INFO] [TRAIN] epoch: 76, iter: 94850/160000, loss: 0.7258, lr: 0.000493, batch_cost: 0.2156, reader_cost: 0.00061, ips: 37.1060 samples/sec | ETA 03:54:06 2022-08-24 21:15:55 [INFO] [TRAIN] epoch: 76, iter: 94900/160000, loss: 0.7316, lr: 0.000493, batch_cost: 0.2351, reader_cost: 0.00057, ips: 34.0244 samples/sec | ETA 04:15:06 2022-08-24 21:16:07 [INFO] [TRAIN] epoch: 76, iter: 94950/160000, loss: 0.7216, lr: 0.000492, batch_cost: 0.2380, reader_cost: 0.00041, ips: 33.6184 samples/sec | ETA 04:17:59 2022-08-24 21:16:19 [INFO] [TRAIN] epoch: 76, iter: 95000/160000, loss: 0.7329, lr: 0.000492, batch_cost: 0.2391, reader_cost: 0.00038, ips: 33.4557 samples/sec | ETA 04:19:02 2022-08-24 21:16:19 [INFO] Start 
evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 199s - batch_cost: 0.1990 - reader cost: 0.0013 2022-08-24 21:19:39 [INFO] [EVAL] #Images: 2000 mIoU: 0.3092 Acc: 0.7446 Kappa: 0.7250 Dice: 0.4364 2022-08-24 21:19:39 [INFO] [EVAL] Class IoU: [0.6536 0.7674 0.9252 0.7015 0.6692 0.7399 0.7598 0.7439 0.4875 0.6134 0.4469 0.5148 0.6527 0.2894 0.1997 0.3726 0.4934 0.3966 0.5423 0.3407 0.6982 0.4125 0.5474 0.4373 0.3025 0.4054 0.3991 0.3991 0.3831 0.3328 0.2343 0.3882 0.2805 0.2816 0.3311 0.3718 0.3518 0.4872 0.2548 0.2372 0.0984 0.0964 0.2896 0.209 0.2775 0.2126 0.2394 0.425 0.6554 0.4875 0.4797 0.2486 0.1927 0.1725 0.5733 0.4764 0.7992 0.2888 0.5017 0.184 0.074 0.1529 0.317 0.1817 0.3834 0.6495 0.2145 0.3541 0.1008 0.3351 0.352 0.4158 0.3585 0.2426 0.4122 0.2692 0.3302 0.2428 0.0791 0.2307 0.6494 0.2717 0.2614 0.0123 0.5027 0.4615 0.0801 0.0566 0.335 0.3989 0.375 0.0044 0.2 0.0646 0.0027 0.0041 0.249 0.1414 0.2618 0.3727 0.0484 0.0091 0.121 0.2925 0.028 0.466 0.158 0.3735 0.1169 0.3329 0.0862 0.0609 0.0887 0.496 0.6282 0.002 0.3274 0.5016 0.0329 0.1452 0.4653 0.0151 0.243 0.1829 0.2359 0.1392 0.4121 0.3383 0.4471 0.2535 0.5031 0.0089 0.1198 0.2219 0.0846 0.1076 0.0767 0.016 0.1285 0.3148 0.1211 0.0473 0.2528 0.4891 0.317 0. 0.2475 0.0136 0.0559 0.0575] 2022-08-24 21:19:39 [INFO] [EVAL] Class Precision: [0.7619 0.8468 0.9588 0.8046 0.7684 0.8406 0.8452 0.8071 0.6562 0.7467 0.68 0.6705 0.7441 0.4595 0.4979 0.5357 0.6519 0.6533 0.6865 0.5914 0.7799 0.6483 0.6805 0.6011 0.5706 0.5235 0.5423 0.7081 0.5842 0.4663 0.3487 0.5201 0.4491 0.3903 0.4755 0.4811 0.571 0.6448 0.4973 0.5375 0.1707 0.2158 0.4381 0.5809 0.3586 0.4659 0.3822 0.6618 0.7211 0.5959 0.6584 0.2935 0.4213 0.6828 0.7236 0.6486 0.8289 0.6147 0.6511 0.3644 0.1193 0.3275 0.5028 0.5691 0.4899 0.8014 0.3381 0.551 0.2112 0.583 0.5276 0.6002 0.6477 0.3045 0.5887 0.3847 0.6676 0.4457 0.5984 0.5801 0.773 0.5188 0.7519 0.0641 0.6958 0.6604 0.3162 0.3516 0.6863 0.6208 0.4965 0.0059 0.3354 0.2771 0.0171 0.0344 0.6035 0.4988 0.5149 0.5302 0.6147 0.0227 0.7039 0.7305 0.475 0.6546 0.3523 0.9259 0.2819 0.4291 0.2499 0.0665 0.3746 0.7689 0.6318 0.277 0.7213 0.5129 0.0748 0.6331 0.785 0.2576 0.574 0.5758 0.646 0.6458 0.7288 0.4989 0.5013 0.4366 0.5642 0.4606 0.5507 0.7175 0.6393 0.3256 0.3417 0.144 0.4089 0.6849 0.2755 0.1891 0.5077 0.6749 0.5529 0. 0.8466 0.4305 0.4082 0.5544] 2022-08-24 21:19:39 [INFO] [EVAL] Class Recall: [0.8214 0.8911 0.9635 0.8456 0.8384 0.8606 0.8826 0.9047 0.6547 0.7746 0.5659 0.689 0.8415 0.4388 0.25 0.5504 0.6699 0.5023 0.7207 0.4456 0.8695 0.5315 0.7367 0.6161 0.3917 0.6424 0.6018 0.4777 0.5268 0.5377 0.4166 0.6049 0.4276 0.5028 0.5216 0.6206 0.4782 0.666 0.3431 0.2979 0.1886 0.1483 0.4607 0.2461 0.5509 0.2811 0.3905 0.5428 0.878 0.7283 0.6386 0.619 0.262 0.1875 0.734 0.6421 0.9571 0.3526 0.6862 0.271 0.1632 0.2229 0.4616 0.2107 0.6383 0.774 0.3698 0.4977 0.1616 0.4406 0.514 0.5751 0.4453 0.544 0.5788 0.4729 0.3952 0.3478 0.0836 0.2769 0.8024 0.3633 0.2861 0.015 0.6442 0.6052 0.0969 0.0631 0.3955 0.5274 0.6052 0.0164 0.3313 0.0777 0.0033 0.0046 0.2977 0.1648 0.3475 0.5565 0.0499 0.015 0.1275 0.3279 0.0288 0.618 0.2227 0.3851 0.1665 0.5974 0.1162 0.4161 0.1041 0.5829 0.9911 0.002 0.3748 0.9581 0.0554 0.1585 0.5332 0.0158 0.2965 0.2113 0.2709 0.1507 0.4867 0.5124 0.8053 0.3768 0.8228 0.009 0.1328 0.2432 0.0888 0.1384 0.09 0.0177 0.1579 0.3681 0.1777 0.0593 0.3349 0.6399 0.4263 0. 
0.2592 0.0138 0.0608 0.0602] 2022-08-24 21:19:39 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 2022-08-24 21:19:49 [INFO] [TRAIN] epoch: 76, iter: 95050/160000, loss: 0.7294, lr: 0.000492, batch_cost: 0.2072, reader_cost: 0.00395, ips: 38.6054 samples/sec | ETA 03:44:19 2022-08-24 21:19:59 [INFO] [TRAIN] epoch: 76, iter: 95100/160000, loss: 0.7497, lr: 0.000491, batch_cost: 0.1990, reader_cost: 0.00503, ips: 40.1949 samples/sec | ETA 03:35:17 2022-08-24 21:20:11 [INFO] [TRAIN] epoch: 76, iter: 95150/160000, loss: 0.7598, lr: 0.000491, batch_cost: 0.2356, reader_cost: 0.00722, ips: 33.9496 samples/sec | ETA 04:14:41 2022-08-24 21:20:21 [INFO] [TRAIN] epoch: 76, iter: 95200/160000, loss: 0.7439, lr: 0.000491, batch_cost: 0.2124, reader_cost: 0.00265, ips: 37.6601 samples/sec | ETA 03:49:25 2022-08-24 21:20:33 [INFO] [TRAIN] epoch: 76, iter: 95250/160000, loss: 0.7667, lr: 0.000490, batch_cost: 0.2256, reader_cost: 0.00050, ips: 35.4615 samples/sec | ETA 04:03:27 2022-08-24 21:20:42 [INFO] [TRAIN] epoch: 76, iter: 95300/160000, loss: 0.7349, lr: 0.000490, batch_cost: 0.1940, reader_cost: 0.01199, ips: 41.2271 samples/sec | ETA 03:29:14 2022-08-24 21:20:53 [INFO] [TRAIN] epoch: 76, iter: 95350/160000, loss: 0.7071, lr: 0.000489, batch_cost: 0.2083, reader_cost: 0.00107, ips: 38.4096 samples/sec | ETA 03:44:25 2022-08-24 21:21:05 [INFO] [TRAIN] epoch: 76, iter: 95400/160000, loss: 0.7446, lr: 0.000489, batch_cost: 0.2358, reader_cost: 0.00059, ips: 33.9266 samples/sec | ETA 04:13:52 2022-08-24 21:21:16 [INFO] [TRAIN] epoch: 76, iter: 95450/160000, loss: 0.7175, lr: 0.000489, batch_cost: 0.2304, reader_cost: 0.00066, ips: 34.7205 samples/sec | ETA 04:07:53 2022-08-24 21:21:26 [INFO] [TRAIN] epoch: 76, iter: 95500/160000, loss: 0.7260, lr: 0.000488, batch_cost: 0.1970, reader_cost: 0.00279, ips: 40.6142 samples/sec | ETA 03:31:44 2022-08-24 21:21:37 [INFO] [TRAIN] epoch: 76, iter: 95550/160000, loss: 0.7038, lr: 0.000488, batch_cost: 0.2128, reader_cost: 0.00113, ips: 37.6000 samples/sec | ETA 03:48:32 2022-08-24 21:21:46 [INFO] [TRAIN] epoch: 76, iter: 95600/160000, loss: 0.7728, lr: 0.000488, batch_cost: 0.1941, reader_cost: 0.00112, ips: 41.2176 samples/sec | ETA 03:28:19 2022-08-24 21:21:58 [INFO] [TRAIN] epoch: 76, iter: 95650/160000, loss: 0.7040, lr: 0.000487, batch_cost: 0.2223, reader_cost: 0.01350, ips: 35.9870 samples/sec | ETA 03:58:25 2022-08-24 21:22:09 [INFO] [TRAIN] epoch: 76, iter: 95700/160000, loss: 0.7711, lr: 0.000487, batch_cost: 0.2306, reader_cost: 0.00034, ips: 34.6947 samples/sec | ETA 04:07:06 2022-08-24 21:22:20 [INFO] [TRAIN] epoch: 76, iter: 95750/160000, loss: 0.7042, lr: 0.000486, batch_cost: 0.2125, reader_cost: 0.00039, ips: 37.6502 samples/sec | ETA 03:47:31 2022-08-24 21:22:30 [INFO] [TRAIN] epoch: 76, iter: 95800/160000, loss: 0.7390, lr: 0.000486, batch_cost: 0.2121, reader_cost: 0.00036, ips: 37.7125 samples/sec | ETA 03:46:58 2022-08-24 21:22:42 [INFO] [TRAIN] epoch: 76, iter: 95850/160000, loss: 0.7591, lr: 0.000486, batch_cost: 0.2342, reader_cost: 0.00037, ips: 34.1604 samples/sec | ETA 04:10:23 2022-08-24 21:22:53 [INFO] [TRAIN] epoch: 76, iter: 95900/160000, loss: 0.7204, lr: 0.000485, batch_cost: 0.2298, reader_cost: 0.00040, ips: 34.8078 samples/sec | ETA 04:05:32 2022-08-24 21:23:04 [INFO] [TRAIN] epoch: 76, iter: 95950/160000, loss: 0.7062, lr: 0.000485, batch_cost: 0.2193, reader_cost: 0.01551, ips: 36.4790 samples/sec | ETA 03:54:06 2022-08-24 21:23:16 [INFO] [TRAIN] epoch: 77, iter: 96000/160000, loss: 
0.7115, lr: 0.000485, batch_cost: 0.2336, reader_cost: 0.03454, ips: 34.2431 samples/sec | ETA 04:09:11 2022-08-24 21:23:16 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 200s - batch_cost: 0.2002 - reader cost: 8.8514e-04 2022-08-24 21:26:37 [INFO] [EVAL] #Images: 2000 mIoU: 0.3116 Acc: 0.7454 Kappa: 0.7255 Dice: 0.4377 2022-08-24 21:26:37 [INFO] [EVAL] Class IoU: [0.656 0.7679 0.9242 0.7056 0.6661 0.7369 0.7628 0.7369 0.478 0.645 0.4574 0.5257 0.6401 0.2762 0.2179 0.3685 0.4736 0.3967 0.5394 0.367 0.7207 0.3859 0.5631 0.4437 0.3337 0.3337 0.3517 0.3897 0.3354 0.3811 0.197 0.3857 0.2812 0.2752 0.3476 0.3672 0.3465 0.4813 0.2412 0.271 0.0935 0.078 0.2854 0.2079 0.2968 0.2638 0.2373 0.4178 0.656 0.4809 0.4507 0.2688 0.1977 0.1804 0.5979 0.4768 0.8338 0.2501 0.43 0.1991 0.0712 0.1589 0.3112 0.0613 0.3768 0.652 0.2062 0.373 0.084 0.3389 0.3202 0.3882 0.3483 0.2034 0.4031 0.2914 0.3399 0.2308 0.1181 0.1962 0.6426 0.2711 0.2621 0.0171 0.4691 0.4567 0.0641 0.0567 0.336 0.4111 0.3982 0.0035 0.2226 0.0654 0.0581 0.0074 0.1856 0.1537 0.2097 0.3484 0.0788 0.0055 0.2331 0.6415 0.0364 0.5331 0.153 0.3987 0.063 0.359 0.0631 0.0705 0.086 0.6001 0.7251 0.0047 0.3338 0.5877 0.0875 0.1247 0.4206 0.0142 0.2485 0.1429 0.2519 0.2433 0.4055 0.2965 0.448 0.2479 0.4892 0.0095 0.0318 0.2437 0.0827 0.0953 0.0618 0.0142 0.1429 0.3152 0.1323 0.0149 0.2824 0.4332 0.2762 0.0079 0.2594 0.0173 0.0887 0.0402] 2022-08-24 21:26:37 [INFO] [EVAL] Class Precision: [0.753 0.8343 0.9587 0.8049 0.776 0.8682 0.8714 0.8046 0.6028 0.7467 0.6569 0.6655 0.7109 0.48 0.4761 0.5596 0.621 0.6189 0.6938 0.566 0.819 0.5802 0.7296 0.5837 0.5342 0.6065 0.5132 0.6887 0.7191 0.577 0.4049 0.5259 0.4596 0.4508 0.4796 0.5121 0.59 0.7692 0.5406 0.5019 0.1597 0.2366 0.5035 0.5746 0.4741 0.5418 0.3776 0.6444 0.7194 0.6161 0.6011 0.3193 0.3094 0.5469 0.7071 0.6363 0.8808 0.6378 0.7384 0.3713 0.0979 0.357 0.4677 0.6433 0.4773 0.8034 0.2813 0.5429 0.2578 0.609 0.5865 0.6377 0.6471 0.2505 0.5972 0.461 0.5608 0.5845 0.8305 0.5284 0.7276 0.5118 0.7502 0.0551 0.5954 0.6745 0.3807 0.4037 0.5525 0.607 0.555 0.0048 0.3663 0.2854 0.1829 0.0294 0.6233 0.5238 0.4306 0.5812 0.7617 0.0175 0.6647 0.8782 0.3582 0.8398 0.3273 0.8349 0.2838 0.47 0.2743 0.0782 0.3393 0.7513 0.7305 0.086 0.7764 0.6266 0.1459 0.661 0.8107 0.2502 0.7234 0.5657 0.5659 0.5934 0.7617 0.4327 0.6904 0.5652 0.5609 0.579 0.3897 0.7432 0.6362 0.2645 0.3661 0.1383 0.3761 0.6909 0.2283 0.05 0.6134 0.5639 0.377 0.0093 0.8792 0.4805 0.4994 0.7068] 2022-08-24 21:26:37 [INFO] [EVAL] Class Recall: [0.8359 0.9061 0.9625 0.8512 0.8247 0.8297 0.8595 0.8975 0.6978 0.8257 0.6009 0.7144 0.8653 0.3942 0.2866 0.5189 0.666 0.525 0.7079 0.5108 0.8573 0.5354 0.7115 0.6491 0.4707 0.426 0.5277 0.4731 0.3859 0.5288 0.2773 0.5913 0.4201 0.414 0.5581 0.5647 0.4564 0.5625 0.3034 0.3708 0.1841 0.1042 0.3972 0.2457 0.4424 0.3396 0.3898 0.543 0.8815 0.6866 0.643 0.6298 0.3538 0.2121 0.7947 0.6554 0.9399 0.2915 0.5073 0.3004 0.2071 0.2225 0.4818 0.0635 0.6416 0.7758 0.436 0.5438 0.1108 0.4331 0.4136 0.498 0.43 0.5192 0.5536 0.4421 0.4631 0.2762 0.121 0.2379 0.8462 0.3656 0.2871 0.0241 0.6886 0.5857 0.0715 0.0619 0.4617 0.5602 0.5849 0.0131 0.3622 0.0782 0.0785 0.0098 0.209 0.1786 0.2901 0.4653 0.0808 0.0081 0.2642 0.7041 0.0389 0.5934 0.2233 0.4329 0.075 0.6034 0.0757 0.4161 0.1033 0.7488 0.99 0.0049 0.3693 0.9045 0.1794 0.1332 0.4664 0.0148 0.2745 0.1605 0.3122 0.292 0.4644 0.4852 0.5606 0.3063 0.7928 0.0096 0.0335 0.2661 0.0868 0.1296 0.0692 0.0155 0.1874 0.3669 0.2392 0.0208 0.3435 0.6516 
0.5081 0.0528 0.269 0.0176 0.0973 0.0409] 2022-08-24 21:26:37 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 2022-08-24 21:26:48 [INFO] [TRAIN] epoch: 77, iter: 96050/160000, loss: 0.7362, lr: 0.000484, batch_cost: 0.2209, reader_cost: 0.00277, ips: 36.2101 samples/sec | ETA 03:55:28 2022-08-24 21:26:58 [INFO] [TRAIN] epoch: 77, iter: 96100/160000, loss: 0.8011, lr: 0.000484, batch_cost: 0.1985, reader_cost: 0.01072, ips: 40.3110 samples/sec | ETA 03:31:21 2022-08-24 21:27:10 [INFO] [TRAIN] epoch: 77, iter: 96150/160000, loss: 0.7089, lr: 0.000483, batch_cost: 0.2387, reader_cost: 0.00579, ips: 33.5100 samples/sec | ETA 04:14:03 2022-08-24 21:27:21 [INFO] [TRAIN] epoch: 77, iter: 96200/160000, loss: 0.7039, lr: 0.000483, batch_cost: 0.2151, reader_cost: 0.00121, ips: 37.1904 samples/sec | ETA 03:48:43 2022-08-24 21:27:32 [INFO] [TRAIN] epoch: 77, iter: 96250/160000, loss: 0.7809, lr: 0.000483, batch_cost: 0.2229, reader_cost: 0.00757, ips: 35.8891 samples/sec | ETA 03:56:50 2022-08-24 21:27:42 [INFO] [TRAIN] epoch: 77, iter: 96300/160000, loss: 0.7308, lr: 0.000482, batch_cost: 0.2065, reader_cost: 0.00044, ips: 38.7390 samples/sec | ETA 03:39:14 2022-08-24 21:27:53 [INFO] [TRAIN] epoch: 77, iter: 96350/160000, loss: 0.7402, lr: 0.000482, batch_cost: 0.2124, reader_cost: 0.00864, ips: 37.6633 samples/sec | ETA 03:45:19 2022-08-24 21:28:03 [INFO] [TRAIN] epoch: 77, iter: 96400/160000, loss: 0.7456, lr: 0.000482, batch_cost: 0.1960, reader_cost: 0.02263, ips: 40.8247 samples/sec | ETA 03:27:43 2022-08-24 21:28:13 [INFO] [TRAIN] epoch: 77, iter: 96450/160000, loss: 0.7489, lr: 0.000481, batch_cost: 0.2161, reader_cost: 0.00649, ips: 37.0223 samples/sec | ETA 03:48:52 2022-08-24 21:28:24 [INFO] [TRAIN] epoch: 77, iter: 96500/160000, loss: 0.7265, lr: 0.000481, batch_cost: 0.2216, reader_cost: 0.01099, ips: 36.1087 samples/sec | ETA 03:54:28 2022-08-24 21:28:37 [INFO] [TRAIN] epoch: 77, iter: 96550/160000, loss: 0.6983, lr: 0.000480, batch_cost: 0.2523, reader_cost: 0.00587, ips: 31.7039 samples/sec | ETA 04:26:50 2022-08-24 21:28:51 [INFO] [TRAIN] epoch: 77, iter: 96600/160000, loss: 0.7103, lr: 0.000480, batch_cost: 0.2706, reader_cost: 0.00133, ips: 29.5690 samples/sec | ETA 04:45:53 2022-08-24 21:29:02 [INFO] [TRAIN] epoch: 77, iter: 96650/160000, loss: 0.7633, lr: 0.000480, batch_cost: 0.2182, reader_cost: 0.00541, ips: 36.6572 samples/sec | ETA 03:50:25 2022-08-24 21:29:13 [INFO] [TRAIN] epoch: 77, iter: 96700/160000, loss: 0.7948, lr: 0.000479, batch_cost: 0.2275, reader_cost: 0.00059, ips: 35.1614 samples/sec | ETA 04:00:02 2022-08-24 21:29:24 [INFO] [TRAIN] epoch: 77, iter: 96750/160000, loss: 0.7044, lr: 0.000479, batch_cost: 0.2305, reader_cost: 0.00098, ips: 34.7104 samples/sec | ETA 04:02:57 2022-08-24 21:29:34 [INFO] [TRAIN] epoch: 77, iter: 96800/160000, loss: 0.7416, lr: 0.000478, batch_cost: 0.2006, reader_cost: 0.00181, ips: 39.8853 samples/sec | ETA 03:31:16 2022-08-24 21:29:45 [INFO] [TRAIN] epoch: 77, iter: 96850/160000, loss: 0.7736, lr: 0.000478, batch_cost: 0.2070, reader_cost: 0.00800, ips: 38.6395 samples/sec | ETA 03:37:54 2022-08-24 21:29:56 [INFO] [TRAIN] epoch: 77, iter: 96900/160000, loss: 0.7166, lr: 0.000478, batch_cost: 0.2234, reader_cost: 0.00149, ips: 35.8044 samples/sec | ETA 03:54:58 2022-08-24 21:30:07 [INFO] [TRAIN] epoch: 77, iter: 96950/160000, loss: 0.7625, lr: 0.000477, batch_cost: 0.2120, reader_cost: 0.00097, ips: 37.7415 samples/sec | ETA 03:42:44 2022-08-24 21:30:17 [INFO] [TRAIN] epoch: 77, iter: 
97000/160000, loss: 0.7365, lr: 0.000477, batch_cost: 0.2077, reader_cost: 0.00633, ips: 38.5238 samples/sec | ETA 03:38:02 2022-08-24 21:30:17 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 184s - batch_cost: 0.1838 - reader cost: 9.4043e-04 2022-08-24 21:33:21 [INFO] [EVAL] #Images: 2000 mIoU: 0.3142 Acc: 0.7466 Kappa: 0.7270 Dice: 0.4418 2022-08-24 21:33:21 [INFO] [EVAL] Class IoU: [0.6595 0.7736 0.9257 0.7011 0.6695 0.7382 0.7622 0.7425 0.4783 0.633 0.4512 0.5316 0.655 0.2928 0.2323 0.3679 0.5001 0.3945 0.545 0.3518 0.7121 0.4233 0.5585 0.455 0.3227 0.381 0.3904 0.3895 0.3761 0.3062 0.196 0.3691 0.272 0.2859 0.354 0.3655 0.3454 0.4776 0.2612 0.2446 0.0945 0.0702 0.2988 0.221 0.3 0.2257 0.2407 0.4207 0.6723 0.4504 0.4482 0.2337 0.2118 0.1984 0.6132 0.41 0.8106 0.3138 0.4573 0.2121 0.0724 0.2156 0.2881 0.1988 0.3634 0.6558 0.1923 0.3621 0.0941 0.3291 0.348 0.4029 0.3992 0.2103 0.4172 0.2955 0.3309 0.2214 0.1403 0.2054 0.6595 0.3011 0.2755 0.0186 0.4996 0.466 0.0639 0.05 0.3854 0.4414 0.3854 0.0103 0.2212 0.0809 0.0043 0.0081 0.183 0.1328 0.2286 0.3748 0.0842 0.0074 0.1686 0.1595 0.0645 0.54 0.1452 0.4264 0.0886 0.3809 0.0824 0.1476 0.0831 0.5596 0.7968 0.0006 0.3284 0.5702 0.0489 0.3408 0.352 0.0202 0.211 0.1271 0.2311 0.1405 0.4121 0.3183 0.4203 0.2027 0.5345 0.0101 0.0523 0.2726 0.0647 0.108 0.0721 0.0158 0.1184 0.2981 0.1657 0.0921 0.2455 0.4837 0.3477 0.0121 0.2657 0.0104 0.1227 0.0485] 2022-08-24 21:33:21 [INFO] [EVAL] Class Precision: [0.7603 0.8395 0.9629 0.7925 0.7682 0.8579 0.8901 0.8149 0.6279 0.7385 0.6866 0.6592 0.7511 0.4697 0.4549 0.4949 0.6648 0.6842 0.732 0.574 0.8025 0.631 0.7554 0.571 0.5268 0.5718 0.547 0.7245 0.647 0.4853 0.4092 0.5593 0.5362 0.4188 0.5639 0.5775 0.6257 0.6683 0.4535 0.5566 0.1417 0.251 0.4746 0.5112 0.4136 0.4702 0.3754 0.7104 0.7549 0.5383 0.5574 0.2741 0.3658 0.6243 0.7147 0.5045 0.8443 0.6142 0.7187 0.4049 0.1095 0.3809 0.5422 0.5277 0.4509 0.8182 0.294 0.553 0.2058 0.5447 0.512 0.5162 0.6246 0.2469 0.5891 0.4881 0.5514 0.5877 0.8038 0.5217 0.7752 0.5846 0.7228 0.0567 0.6149 0.6715 0.3489 0.448 0.6067 0.6813 0.5287 0.0134 0.3712 0.2962 0.0187 0.0279 0.6951 0.6009 0.45 0.5113 0.7327 0.0114 0.6508 0.4494 0.4904 0.8549 0.3757 0.9506 0.2633 0.4918 0.2978 0.1863 0.3339 0.6411 0.8074 0.0304 0.7122 0.6059 0.1002 0.6957 0.7986 0.1609 0.7278 0.5916 0.5987 0.6468 0.7224 0.4298 0.5579 0.4086 0.692 0.596 0.3709 0.6268 0.6727 0.261 0.3419 0.2196 0.3822 0.6781 0.3177 0.1776 0.5488 0.7003 0.6614 0.0165 0.8843 0.2718 0.4605 0.6047] 2022-08-24 21:33:21 [INFO] [EVAL] Class Recall: [0.8325 0.9078 0.96 0.8588 0.839 0.841 0.8414 0.8931 0.6675 0.816 0.5683 0.7331 0.8366 0.4375 0.322 0.5891 0.6687 0.4823 0.681 0.4762 0.8634 0.5625 0.6818 0.6912 0.4544 0.5331 0.5769 0.4571 0.4732 0.4535 0.2734 0.5205 0.3557 0.4739 0.4875 0.4989 0.4354 0.6261 0.3811 0.3037 0.2211 0.0888 0.4465 0.2803 0.5219 0.3026 0.4014 0.5077 0.8601 0.7338 0.696 0.6133 0.3347 0.2253 0.812 0.6863 0.953 0.3909 0.5569 0.3082 0.176 0.3319 0.3808 0.2419 0.652 0.7677 0.3573 0.512 0.1479 0.4539 0.5208 0.6474 0.5252 0.5863 0.5884 0.4281 0.4527 0.2621 0.1453 0.253 0.8154 0.383 0.3081 0.027 0.7271 0.6037 0.0726 0.0533 0.5139 0.5563 0.587 0.0432 0.3539 0.1002 0.0055 0.0113 0.1989 0.1456 0.3171 0.584 0.0869 0.0205 0.1854 0.1983 0.0691 0.5945 0.1914 0.436 0.1178 0.6282 0.1023 0.4155 0.0996 0.8149 0.9837 0.0006 0.3786 0.9063 0.0873 0.4005 0.3863 0.0226 0.229 0.1393 0.2734 0.1522 0.4897 0.551 0.6302 0.2868 0.7013 0.0102 0.0574 0.3254 0.0668 0.1556 0.0838 0.0168 0.1464 0.3472 0.2573 
0.1606 0.3076 0.61 0.423 0.0429 0.2753 0.0107 0.1433 0.0501] 2022-08-24 21:33:21 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 2022-08-24 21:33:32 [INFO] [TRAIN] epoch: 77, iter: 97050/160000, loss: 0.7347, lr: 0.000477, batch_cost: 0.2131, reader_cost: 0.01101, ips: 37.5409 samples/sec | ETA 03:43:34 2022-08-24 21:33:42 [INFO] [TRAIN] epoch: 77, iter: 97100/160000, loss: 0.6913, lr: 0.000476, batch_cost: 0.2024, reader_cost: 0.01927, ips: 39.5164 samples/sec | ETA 03:32:13 2022-08-24 21:33:52 [INFO] [TRAIN] epoch: 77, iter: 97150/160000, loss: 0.7332, lr: 0.000476, batch_cost: 0.1943, reader_cost: 0.00748, ips: 41.1655 samples/sec | ETA 03:23:34 2022-08-24 21:34:03 [INFO] [TRAIN] epoch: 77, iter: 97200/160000, loss: 0.7584, lr: 0.000475, batch_cost: 0.2188, reader_cost: 0.00045, ips: 36.5619 samples/sec | ETA 03:49:01 2022-08-24 21:34:13 [INFO] [TRAIN] epoch: 77, iter: 97250/160000, loss: 0.7680, lr: 0.000475, batch_cost: 0.2099, reader_cost: 0.00799, ips: 38.1092 samples/sec | ETA 03:39:32 2022-08-24 21:34:26 [INFO] [TRAIN] epoch: 78, iter: 97300/160000, loss: 0.6501, lr: 0.000475, batch_cost: 0.2517, reader_cost: 0.03402, ips: 31.7805 samples/sec | ETA 04:23:03 2022-08-24 21:34:37 [INFO] [TRAIN] epoch: 78, iter: 97350/160000, loss: 0.7195, lr: 0.000474, batch_cost: 0.2174, reader_cost: 0.00111, ips: 36.7935 samples/sec | ETA 03:47:01 2022-08-24 21:34:47 [INFO] [TRAIN] epoch: 78, iter: 97400/160000, loss: 0.7381, lr: 0.000474, batch_cost: 0.2031, reader_cost: 0.00606, ips: 39.3908 samples/sec | ETA 03:31:53 2022-08-24 21:34:58 [INFO] [TRAIN] epoch: 78, iter: 97450/160000, loss: 0.7366, lr: 0.000474, batch_cost: 0.2296, reader_cost: 0.00044, ips: 34.8441 samples/sec | ETA 03:59:21 2022-08-24 21:35:09 [INFO] [TRAIN] epoch: 78, iter: 97500/160000, loss: 0.6889, lr: 0.000473, batch_cost: 0.2122, reader_cost: 0.00049, ips: 37.6981 samples/sec | ETA 03:41:03 2022-08-24 21:35:19 [INFO] [TRAIN] epoch: 78, iter: 97550/160000, loss: 0.7226, lr: 0.000473, batch_cost: 0.1939, reader_cost: 0.00047, ips: 41.2608 samples/sec | ETA 03:21:48 2022-08-24 21:35:30 [INFO] [TRAIN] epoch: 78, iter: 97600/160000, loss: 0.7282, lr: 0.000472, batch_cost: 0.2253, reader_cost: 0.00037, ips: 35.5106 samples/sec | ETA 03:54:17 2022-08-24 21:35:43 [INFO] [TRAIN] epoch: 78, iter: 97650/160000, loss: 0.7176, lr: 0.000472, batch_cost: 0.2720, reader_cost: 0.01188, ips: 29.4166 samples/sec | ETA 04:42:36 2022-08-24 21:35:56 [INFO] [TRAIN] epoch: 78, iter: 97700/160000, loss: 0.7289, lr: 0.000472, batch_cost: 0.2592, reader_cost: 0.00068, ips: 30.8601 samples/sec | ETA 04:29:10 2022-08-24 21:36:08 [INFO] [TRAIN] epoch: 78, iter: 97750/160000, loss: 0.7027, lr: 0.000471, batch_cost: 0.2254, reader_cost: 0.00035, ips: 35.4910 samples/sec | ETA 03:53:51 2022-08-24 21:36:19 [INFO] [TRAIN] epoch: 78, iter: 97800/160000, loss: 0.7386, lr: 0.000471, batch_cost: 0.2240, reader_cost: 0.00062, ips: 35.7162 samples/sec | ETA 03:52:12 2022-08-24 21:36:31 [INFO] [TRAIN] epoch: 78, iter: 97850/160000, loss: 0.6957, lr: 0.000471, batch_cost: 0.2343, reader_cost: 0.00060, ips: 34.1437 samples/sec | ETA 04:02:41 2022-08-24 21:36:42 [INFO] [TRAIN] epoch: 78, iter: 97900/160000, loss: 0.7364, lr: 0.000470, batch_cost: 0.2379, reader_cost: 0.00097, ips: 33.6213 samples/sec | ETA 04:06:16 2022-08-24 21:36:52 [INFO] [TRAIN] epoch: 78, iter: 97950/160000, loss: 0.7534, lr: 0.000470, batch_cost: 0.1895, reader_cost: 0.01151, ips: 42.2164 samples/sec | ETA 03:15:58 2022-08-24 21:37:02 [INFO] [TRAIN] epoch: 
78, iter: 98000/160000, loss: 0.7366, lr: 0.000469, batch_cost: 0.2049, reader_cost: 0.00330, ips: 39.0410 samples/sec | ETA 03:31:44 2022-08-24 21:37:02 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 193s - batch_cost: 0.1926 - reader cost: 9.8196e-04 2022-08-24 21:40:15 [INFO] [EVAL] #Images: 2000 mIoU: 0.3097 Acc: 0.7463 Kappa: 0.7267 Dice: 0.4360 2022-08-24 21:40:15 [INFO] [EVAL] Class IoU: [0.6568 0.7656 0.9244 0.7053 0.6759 0.7432 0.7708 0.7359 0.4791 0.6044 0.4495 0.518 0.6518 0.294 0.2233 0.3693 0.5004 0.4276 0.5416 0.3756 0.7111 0.411 0.556 0.4518 0.3271 0.3357 0.3654 0.3944 0.3507 0.298 0.1835 0.3902 0.2745 0.283 0.3677 0.3853 0.3433 0.4944 0.2563 0.2586 0.0986 0.0496 0.2904 0.2021 0.2871 0.2668 0.2537 0.4261 0.686 0.4735 0.4706 0.284 0.2111 0.1735 0.5947 0.4165 0.8397 0.2832 0.4517 0.1969 0.0505 0.1743 0.3043 0.0833 0.3657 0.6633 0.1966 0.3568 0.096 0.3341 0.3048 0.3753 0.3672 0.2242 0.4143 0.2983 0.3057 0.229 0.1312 0.2351 0.6439 0.2803 0.2474 0.0069 0.4816 0.475 0.0865 0.0514 0.3827 0.431 0.3944 0.0009 0.1923 0.0783 0.026 0.0072 0.1855 0.17 0.2263 0.336 0.1075 0.0238 0.152 0.433 0.0036 0.5062 0.1656 0.3575 0.0982 0.3375 0.0635 0.1018 0.084 0.544 0.6928 0.0039 0.311 0.5645 0.0807 0.1818 0.3981 0.0179 0.2815 0.0846 0.2195 0.1393 0.4311 0.3091 0.4182 0.2247 0.5128 0.0105 0.0514 0.222 0.0582 0.1083 0.0747 0.0171 0.1283 0.2821 0.1785 0.0038 0.2632 0.4822 0.2716 0.0073 0.2715 0.0163 0.0934 0.0429] 2022-08-24 21:40:15 [INFO] [EVAL] Class Precision: [0.7616 0.8368 0.9549 0.8047 0.7881 0.853 0.857 0.7891 0.6272 0.7077 0.6447 0.7146 0.734 0.5018 0.4619 0.5617 0.6692 0.6705 0.6911 0.5551 0.7985 0.6073 0.7462 0.588 0.5391 0.4841 0.5092 0.6446 0.7193 0.468 0.3971 0.495 0.5391 0.4032 0.4871 0.4993 0.6176 0.7655 0.4915 0.5172 0.1528 0.262 0.4499 0.5803 0.4096 0.4541 0.3543 0.653 0.7675 0.569 0.6582 0.3523 0.3692 0.6409 0.7011 0.5039 0.8891 0.6092 0.7321 0.3472 0.0868 0.4713 0.4428 0.4545 0.4714 0.8074 0.3235 0.5771 0.1711 0.5824 0.5834 0.5918 0.6179 0.2887 0.6644 0.5421 0.4633 0.6465 0.7517 0.516 0.7572 0.5644 0.7578 0.0407 0.6673 0.6314 0.3429 0.4356 0.6342 0.6293 0.5518 0.0011 0.3755 0.2823 0.0826 0.0303 0.6633 0.4915 0.4645 0.5662 0.6507 0.0447 0.6765 0.7313 0.0959 0.737 0.3806 0.9562 0.3316 0.5109 0.3673 0.1188 0.3848 0.7552 0.6993 0.14 0.7073 0.5848 0.1266 0.5191 0.8361 0.1295 0.7032 0.638 0.6695 0.6158 0.6929 0.424 0.6319 0.523 0.5763 0.5025 0.2668 0.7209 0.7146 0.319 0.3668 0.242 0.3447 0.7154 0.3523 0.0183 0.6392 0.5755 0.3902 0.0133 0.866 0.3535 0.5038 0.5395] 2022-08-24 21:40:15 [INFO] [EVAL] Class Recall: [0.8267 0.9 0.9666 0.8509 0.8259 0.8524 0.8846 0.9161 0.6699 0.8055 0.5976 0.6531 0.8533 0.4153 0.3018 0.5188 0.6648 0.5415 0.7146 0.5373 0.8666 0.5598 0.6856 0.661 0.4541 0.5227 0.564 0.5039 0.4063 0.4507 0.2544 0.6481 0.3586 0.4869 0.6001 0.6279 0.436 0.5827 0.3488 0.3409 0.2175 0.0577 0.4502 0.2367 0.4899 0.3926 0.4717 0.5508 0.8659 0.7384 0.6228 0.5946 0.3302 0.1922 0.7967 0.7061 0.9379 0.346 0.5411 0.3126 0.1075 0.2167 0.4933 0.0925 0.6199 0.788 0.3339 0.4832 0.1793 0.4394 0.3896 0.5064 0.4751 0.5009 0.524 0.3989 0.4733 0.2617 0.1372 0.3015 0.8115 0.3577 0.2686 0.0083 0.6338 0.6573 0.1036 0.055 0.4912 0.5777 0.5802 0.0049 0.2827 0.0977 0.0365 0.0094 0.2047 0.2063 0.3063 0.4524 0.1141 0.0484 0.1639 0.5149 0.0037 0.6178 0.2267 0.3634 0.1224 0.4987 0.0713 0.4161 0.0971 0.6605 0.9869 0.0039 0.3569 0.9421 0.1818 0.2186 0.4318 0.0204 0.3194 0.0889 0.2461 0.1526 0.5329 0.5329 0.5529 0.2826 0.8233 0.0106 0.0599 0.2429 0.0596 0.1408 0.0858 0.018 
0.1697 0.3178 0.2657 0.0048 0.3091 0.7483 0.4719 0.016 0.2834 0.0168 0.1028 0.0445] 2022-08-24 21:40:15 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 2022-08-24 21:40:26 [INFO] [TRAIN] epoch: 78, iter: 98050/160000, loss: 0.7102, lr: 0.000469, batch_cost: 0.2214, reader_cost: 0.00388, ips: 36.1310 samples/sec | ETA 03:48:36 2022-08-24 21:40:37 [INFO] [TRAIN] epoch: 78, iter: 98100/160000, loss: 0.7232, lr: 0.000469, batch_cost: 0.2097, reader_cost: 0.00140, ips: 38.1482 samples/sec | ETA 03:36:20 2022-08-24 21:40:48 [INFO] [TRAIN] epoch: 78, iter: 98150/160000, loss: 0.7170, lr: 0.000468, batch_cost: 0.2192, reader_cost: 0.00068, ips: 36.4974 samples/sec | ETA 03:45:57 2022-08-24 21:40:58 [INFO] [TRAIN] epoch: 78, iter: 98200/160000, loss: 0.7461, lr: 0.000468, batch_cost: 0.1992, reader_cost: 0.00034, ips: 40.1657 samples/sec | ETA 03:25:09 2022-08-24 21:41:08 [INFO] [TRAIN] epoch: 78, iter: 98250/160000, loss: 0.7435, lr: 0.000468, batch_cost: 0.2173, reader_cost: 0.00044, ips: 36.8208 samples/sec | ETA 03:43:36 2022-08-24 21:41:19 [INFO] [TRAIN] epoch: 78, iter: 98300/160000, loss: 0.7530, lr: 0.000467, batch_cost: 0.2025, reader_cost: 0.02905, ips: 39.5038 samples/sec | ETA 03:28:15 2022-08-24 21:41:28 [INFO] [TRAIN] epoch: 78, iter: 98350/160000, loss: 0.7621, lr: 0.000467, batch_cost: 0.1954, reader_cost: 0.00303, ips: 40.9493 samples/sec | ETA 03:20:44 2022-08-24 21:41:39 [INFO] [TRAIN] epoch: 78, iter: 98400/160000, loss: 0.7358, lr: 0.000466, batch_cost: 0.2074, reader_cost: 0.00725, ips: 38.5669 samples/sec | ETA 03:32:57 2022-08-24 21:41:49 [INFO] [TRAIN] epoch: 78, iter: 98450/160000, loss: 0.7537, lr: 0.000466, batch_cost: 0.2063, reader_cost: 0.00498, ips: 38.7751 samples/sec | ETA 03:31:38 2022-08-24 21:42:00 [INFO] [TRAIN] epoch: 78, iter: 98500/160000, loss: 0.7498, lr: 0.000466, batch_cost: 0.2123, reader_cost: 0.00063, ips: 37.6897 samples/sec | ETA 03:37:33 2022-08-24 21:42:12 [INFO] [TRAIN] epoch: 79, iter: 98550/160000, loss: 0.7027, lr: 0.000465, batch_cost: 0.2517, reader_cost: 0.04385, ips: 31.7854 samples/sec | ETA 04:17:46 2022-08-24 21:42:22 [INFO] [TRAIN] epoch: 79, iter: 98600/160000, loss: 0.6867, lr: 0.000465, batch_cost: 0.1957, reader_cost: 0.00419, ips: 40.8800 samples/sec | ETA 03:20:15 2022-08-24 21:42:34 [INFO] [TRAIN] epoch: 79, iter: 98650/160000, loss: 0.7272, lr: 0.000464, batch_cost: 0.2407, reader_cost: 0.02338, ips: 33.2320 samples/sec | ETA 04:06:08 2022-08-24 21:42:49 [INFO] [TRAIN] epoch: 79, iter: 98700/160000, loss: 0.7410, lr: 0.000464, batch_cost: 0.2915, reader_cost: 0.03418, ips: 27.4454 samples/sec | ETA 04:57:48 2022-08-24 21:43:01 [INFO] [TRAIN] epoch: 79, iter: 98750/160000, loss: 0.7040, lr: 0.000464, batch_cost: 0.2496, reader_cost: 0.00823, ips: 32.0467 samples/sec | ETA 04:14:50 2022-08-24 21:43:14 [INFO] [TRAIN] epoch: 79, iter: 98800/160000, loss: 0.7926, lr: 0.000463, batch_cost: 0.2510, reader_cost: 0.00512, ips: 31.8727 samples/sec | ETA 04:16:01 2022-08-24 21:43:25 [INFO] [TRAIN] epoch: 79, iter: 98850/160000, loss: 0.7350, lr: 0.000463, batch_cost: 0.2239, reader_cost: 0.00035, ips: 35.7359 samples/sec | ETA 03:48:09 2022-08-24 21:43:36 [INFO] [TRAIN] epoch: 79, iter: 98900/160000, loss: 0.6741, lr: 0.000463, batch_cost: 0.2152, reader_cost: 0.00072, ips: 37.1703 samples/sec | ETA 03:39:10 2022-08-24 21:43:46 [INFO] [TRAIN] epoch: 79, iter: 98950/160000, loss: 0.7356, lr: 0.000462, batch_cost: 0.2014, reader_cost: 0.01063, ips: 39.7181 samples/sec | ETA 03:24:56 2022-08-24 
21:43:58 [INFO] [TRAIN] epoch: 79, iter: 99000/160000, loss: 0.7889, lr: 0.000462, batch_cost: 0.2378, reader_cost: 0.00121, ips: 33.6482 samples/sec | ETA 04:01:43 2022-08-24 21:43:58 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 198s - batch_cost: 0.1977 - reader cost: 8.5913e-04 2022-08-24 21:47:16 [INFO] [EVAL] #Images: 2000 mIoU: 0.3118 Acc: 0.7441 Kappa: 0.7246 Dice: 0.4385 2022-08-24 21:47:16 [INFO] [EVAL] Class IoU: [0.6561 0.7701 0.925 0.7006 0.6684 0.735 0.7626 0.7432 0.4807 0.6225 0.4405 0.5134 0.6564 0.2614 0.2269 0.377 0.4573 0.4157 0.54 0.3548 0.7106 0.3933 0.5558 0.4432 0.3389 0.4185 0.3759 0.3972 0.3717 0.3176 0.2083 0.3876 0.2649 0.285 0.3548 0.3993 0.3477 0.4825 0.2379 0.2415 0.0632 0.0581 0.286 0.2219 0.2648 0.2467 0.2048 0.4291 0.6743 0.4944 0.4328 0.2634 0.2154 0.1989 0.6038 0.4823 0.8258 0.2873 0.3814 0.2074 0.0381 0.175 0.2966 0.1284 0.3857 0.6649 0.2072 0.3454 0.0319 0.3514 0.3326 0.4227 0.3454 0.2292 0.4257 0.2906 0.345 0.213 0.1263 0.2408 0.6448 0.2677 0.2824 0.0128 0.5073 0.4751 0.0696 0.052 0.3816 0.4197 0.3873 0. 0.1933 0.0749 0.0131 0.005 0.2216 0.1117 0.1757 0.3853 0.0875 0.0045 0.1401 0.5599 0.0485 0.4631 0.1859 0.4647 0.1078 0.3733 0.0809 0.2279 0.0707 0.4384 0.7221 0.0023 0.3768 0.5255 0.025 0.2238 0.4478 0.0141 0.2319 0.0527 0.2529 0.1698 0.4207 0.2704 0.334 0.2235 0.5273 0.0134 0.1004 0.2258 0.0927 0.1095 0.0561 0.0189 0.1558 0.3094 0.136 0.056 0.2911 0.4223 0.333 0.0042 0.2803 0.0095 0.0767 0.0518] 2022-08-24 21:47:16 [INFO] [EVAL] Class Precision: [0.7668 0.846 0.959 0.7966 0.7716 0.868 0.8523 0.8102 0.61 0.7399 0.7105 0.6576 0.7513 0.4984 0.4644 0.5097 0.5731 0.6556 0.6983 0.5646 0.8086 0.5897 0.7262 0.5739 0.5088 0.5452 0.4979 0.6474 0.6902 0.5434 0.385 0.5408 0.4664 0.3981 0.481 0.5355 0.579 0.7264 0.4157 0.5377 0.1146 0.2157 0.4611 0.4911 0.3336 0.5139 0.3216 0.6671 0.735 0.6137 0.5068 0.3185 0.4254 0.6947 0.6493 0.6467 0.8656 0.6107 0.6257 0.3698 0.0605 0.4585 0.3851 0.5217 0.5082 0.8009 0.3534 0.5438 0.1058 0.6515 0.5158 0.5587 0.6533 0.2888 0.6455 0.4956 0.5123 0.5156 0.673 0.5959 0.7544 0.5037 0.7326 0.0395 0.6408 0.6367 0.2418 0.4409 0.6309 0.568 0.5294 0. 0.2937 0.2974 0.0318 0.0259 0.6087 0.5869 0.2643 0.5009 0.5788 0.0095 0.7349 0.634 0.7672 0.6364 0.4692 0.8689 0.2586 0.5679 0.282 0.3352 0.3235 0.8285 0.7275 0.0249 0.6823 0.5466 0.0471 0.6597 0.7989 0.2991 0.8732 0.7035 0.5785 0.6221 0.7602 0.3605 0.5103 0.5321 0.6228 0.3809 0.6029 0.7574 0.5464 0.3512 0.3316 0.0994 0.4669 0.6804 0.1913 0.2552 0.5251 0.7459 0.6008 0.0045 0.8738 0.3702 0.588 0.5895] 2022-08-24 21:47:16 [INFO] [EVAL] Class Recall: [0.8197 0.8957 0.9631 0.8532 0.8332 0.8274 0.8788 0.8999 0.6939 0.7968 0.5368 0.7007 0.8385 0.3547 0.3074 0.5915 0.6937 0.5319 0.7042 0.4884 0.8543 0.5415 0.7031 0.6604 0.5036 0.6431 0.6054 0.5069 0.4461 0.4332 0.3121 0.5777 0.38 0.501 0.5751 0.6108 0.4653 0.5897 0.3574 0.3048 0.1235 0.0736 0.4297 0.2882 0.5619 0.3218 0.3606 0.546 0.8908 0.7178 0.7477 0.6034 0.3038 0.218 0.8962 0.655 0.9473 0.3517 0.4941 0.3208 0.0931 0.2206 0.5634 0.1456 0.6154 0.7966 0.3338 0.4862 0.0438 0.4327 0.4835 0.6346 0.4229 0.5264 0.5556 0.4126 0.5137 0.2664 0.1346 0.2877 0.8161 0.3636 0.3148 0.0187 0.709 0.6518 0.089 0.0557 0.4914 0.6165 0.5907 0. 
0.3611 0.0911 0.0218 0.0062 0.2584 0.1213 0.3439 0.6255 0.0934 0.0084 0.1476 0.8274 0.0492 0.6297 0.2355 0.4997 0.1561 0.5213 0.1019 0.4161 0.0829 0.4822 0.9898 0.0025 0.457 0.9315 0.0507 0.253 0.5047 0.0146 0.2399 0.0539 0.3101 0.1894 0.485 0.5199 0.4914 0.2782 0.7746 0.0137 0.1075 0.2434 0.1004 0.1372 0.0632 0.0228 0.1896 0.362 0.3199 0.0669 0.3952 0.4933 0.4276 0.0675 0.2922 0.0096 0.081 0.0538] 2022-08-24 21:47:16 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 2022-08-24 21:47:27 [INFO] [TRAIN] epoch: 79, iter: 99050/160000, loss: 0.7237, lr: 0.000461, batch_cost: 0.2145, reader_cost: 0.00451, ips: 37.3041 samples/sec | ETA 03:37:50 2022-08-24 21:47:37 [INFO] [TRAIN] epoch: 79, iter: 99100/160000, loss: 0.6978, lr: 0.000461, batch_cost: 0.2147, reader_cost: 0.00219, ips: 37.2573 samples/sec | ETA 03:37:56 2022-08-24 21:47:48 [INFO] [TRAIN] epoch: 79, iter: 99150/160000, loss: 0.6901, lr: 0.000461, batch_cost: 0.2180, reader_cost: 0.00442, ips: 36.7009 samples/sec | ETA 03:41:03 2022-08-24 21:47:59 [INFO] [TRAIN] epoch: 79, iter: 99200/160000, loss: 0.6985, lr: 0.000460, batch_cost: 0.2258, reader_cost: 0.00094, ips: 35.4262 samples/sec | ETA 03:48:49 2022-08-24 21:48:10 [INFO] [TRAIN] epoch: 79, iter: 99250/160000, loss: 0.7391, lr: 0.000460, batch_cost: 0.2160, reader_cost: 0.00069, ips: 37.0302 samples/sec | ETA 03:38:44 2022-08-24 21:48:21 [INFO] [TRAIN] epoch: 79, iter: 99300/160000, loss: 0.7410, lr: 0.000460, batch_cost: 0.2192, reader_cost: 0.00350, ips: 36.4914 samples/sec | ETA 03:41:47 2022-08-24 21:48:31 [INFO] [TRAIN] epoch: 79, iter: 99350/160000, loss: 0.7350, lr: 0.000459, batch_cost: 0.1968, reader_cost: 0.00146, ips: 40.6490 samples/sec | ETA 03:18:56 2022-08-24 21:48:41 [INFO] [TRAIN] epoch: 79, iter: 99400/160000, loss: 0.7773, lr: 0.000459, batch_cost: 0.2077, reader_cost: 0.00702, ips: 38.5106 samples/sec | ETA 03:29:48 2022-08-24 21:48:52 [INFO] [TRAIN] epoch: 79, iter: 99450/160000, loss: 0.6925, lr: 0.000458, batch_cost: 0.2156, reader_cost: 0.00041, ips: 37.1016 samples/sec | ETA 03:37:36 2022-08-24 21:49:03 [INFO] [TRAIN] epoch: 79, iter: 99500/160000, loss: 0.7125, lr: 0.000458, batch_cost: 0.2156, reader_cost: 0.00036, ips: 37.1125 samples/sec | ETA 03:37:21 2022-08-24 21:49:13 [INFO] [TRAIN] epoch: 79, iter: 99550/160000, loss: 0.7829, lr: 0.000458, batch_cost: 0.2012, reader_cost: 0.00392, ips: 39.7657 samples/sec | ETA 03:22:41 2022-08-24 21:49:24 [INFO] [TRAIN] epoch: 79, iter: 99600/160000, loss: 0.7085, lr: 0.000457, batch_cost: 0.2284, reader_cost: 0.00037, ips: 35.0230 samples/sec | ETA 03:49:56 2022-08-24 21:49:38 [INFO] [TRAIN] epoch: 79, iter: 99650/160000, loss: 0.7336, lr: 0.000457, batch_cost: 0.2624, reader_cost: 0.00106, ips: 30.4893 samples/sec | ETA 04:23:55 2022-08-24 21:49:51 [INFO] [TRAIN] epoch: 79, iter: 99700/160000, loss: 0.7040, lr: 0.000457, batch_cost: 0.2697, reader_cost: 0.00687, ips: 29.6600 samples/sec | ETA 04:31:04 2022-08-24 21:50:01 [INFO] [TRAIN] epoch: 79, iter: 99750/160000, loss: 0.7350, lr: 0.000456, batch_cost: 0.2040, reader_cost: 0.00946, ips: 39.2156 samples/sec | ETA 03:24:51 2022-08-24 21:50:14 [INFO] [TRAIN] epoch: 80, iter: 99800/160000, loss: 0.6905, lr: 0.000456, batch_cost: 0.2632, reader_cost: 0.03478, ips: 30.3956 samples/sec | ETA 04:24:04 2022-08-24 21:50:25 [INFO] [TRAIN] epoch: 80, iter: 99850/160000, loss: 0.7164, lr: 0.000455, batch_cost: 0.2045, reader_cost: 0.00426, ips: 39.1266 samples/sec | ETA 03:24:58 2022-08-24 21:50:37 [INFO] [TRAIN] epoch: 80, iter: 
99900/160000, loss: 0.7250, lr: 0.000455, batch_cost: 0.2384, reader_cost: 0.00236, ips: 33.5536 samples/sec | ETA 03:58:49 2022-08-24 21:50:48 [INFO] [TRAIN] epoch: 80, iter: 99950/160000, loss: 0.6624, lr: 0.000455, batch_cost: 0.2247, reader_cost: 0.00049, ips: 35.5979 samples/sec | ETA 03:44:55 2022-08-24 21:50:59 [INFO] [TRAIN] epoch: 80, iter: 100000/160000, loss: 0.6912, lr: 0.000454, batch_cost: 0.2276, reader_cost: 0.00039, ips: 35.1443 samples/sec | ETA 03:47:37 2022-08-24 21:50:59 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 209s - batch_cost: 0.2089 - reader cost: 0.0010 2022-08-24 21:54:28 [INFO] [EVAL] #Images: 2000 mIoU: 0.3101 Acc: 0.7445 Kappa: 0.7251 Dice: 0.4360 2022-08-24 21:54:28 [INFO] [EVAL] Class IoU: [0.6526 0.7665 0.9249 0.7047 0.6688 0.7423 0.759 0.7499 0.4769 0.6242 0.4479 0.5163 0.6552 0.2931 0.2249 0.3731 0.494 0.4131 0.5457 0.3391 0.7208 0.3958 0.5552 0.4488 0.3266 0.3609 0.3636 0.3865 0.3069 0.3549 0.2233 0.3824 0.261 0.2781 0.3847 0.3798 0.3531 0.5051 0.2778 0.2579 0.0843 0.0836 0.285 0.2067 0.2806 0.238 0.2499 0.4159 0.6766 0.454 0.4302 0.2551 0.2167 0.2202 0.6007 0.4823 0.8253 0.2409 0.4888 0.1997 0.0331 0.188 0.3109 0.0831 0.3743 0.6635 0.1862 0.3575 0.0414 0.3354 0.3452 0.4226 0.3843 0.2196 0.4117 0.2956 0.3392 0.2023 0.1311 0.2114 0.6251 0.2772 0.2199 0.0135 0.4928 0.4542 0.0502 0.0452 0.4386 0.4141 0.3441 0.0004 0.1959 0.0697 0.0011 0.0039 0.1983 0.1403 0.2095 0.3571 0.1353 0.0128 0.1592 0.5049 0.0001 0.5024 0.1507 0.4118 0.0856 0.3761 0.0777 0.1069 0.08 0.466 0.6301 0.0008 0.438 0.5296 0.0199 0.3287 0.4228 0.0104 0.2257 0.0684 0.256 0.1843 0.4003 0.3199 0.4167 0.2126 0.5115 0.0173 0.057 0.2752 0.0743 0.0911 0.0742 0.017 0.1248 0.2703 0.0908 0.0844 0.259 0.4147 0.3063 0. 0.3025 0.0092 0.0598 0.0318] 2022-08-24 21:54:28 [INFO] [EVAL] Class Precision: [0.7736 0.8373 0.9616 0.7926 0.7688 0.8424 0.8667 0.8224 0.5945 0.7511 0.7049 0.6476 0.7438 0.4965 0.4866 0.5039 0.648 0.6368 0.6812 0.5893 0.8132 0.597 0.7213 0.571 0.5335 0.5043 0.5168 0.5708 0.6764 0.5316 0.3595 0.4786 0.4686 0.4626 0.5012 0.4928 0.5856 0.7345 0.5216 0.5018 0.1331 0.2228 0.4576 0.5192 0.4156 0.4597 0.4056 0.7018 0.7387 0.5514 0.532 0.3166 0.3997 0.6231 0.7124 0.6011 0.8693 0.6389 0.6452 0.4067 0.0535 0.3665 0.4959 0.6179 0.4753 0.8044 0.3279 0.5768 0.1089 0.7106 0.5722 0.5302 0.6644 0.3074 0.5517 0.4672 0.5217 0.5462 0.8126 0.5444 0.7208 0.4914 0.7905 0.0697 0.6881 0.6499 0.2435 0.4817 0.6599 0.579 0.4339 0.0006 0.2936 0.2815 0.005 0.0151 0.5918 0.5326 0.3443 0.5488 0.551 0.0237 0.6667 0.6389 0.0017 0.721 0.4295 0.9166 0.2187 0.4784 0.2648 0.1258 0.4213 0.7405 0.6335 0.007 0.7399 0.5573 0.0626 0.7168 0.7511 0.5301 0.7947 0.731 0.5467 0.5845 0.7248 0.463 0.5809 0.4341 0.6075 0.4042 0.3266 0.6639 0.5719 0.2966 0.282 0.1255 0.3791 0.722 0.1937 0.1344 0.5957 0.7754 0.468 0. 
0.8198 0.3608 0.4892 0.4964] 2022-08-24 21:54:28 [INFO] [EVAL] Class Recall: [0.8067 0.9006 0.9604 0.8641 0.8371 0.8621 0.8593 0.8948 0.7068 0.787 0.5513 0.718 0.8461 0.417 0.2948 0.5896 0.6753 0.5404 0.7328 0.444 0.8638 0.5401 0.7069 0.6772 0.4572 0.5594 0.5508 0.5447 0.3597 0.5163 0.3707 0.6556 0.3708 0.4107 0.6235 0.6236 0.4708 0.6179 0.3728 0.3466 0.1867 0.1181 0.4305 0.2556 0.4635 0.3304 0.3942 0.5051 0.8895 0.7199 0.6923 0.5674 0.3213 0.2541 0.793 0.7093 0.9422 0.2789 0.6685 0.2818 0.0797 0.2785 0.4545 0.0876 0.6379 0.7911 0.3012 0.4845 0.0625 0.3885 0.4653 0.6756 0.4769 0.4346 0.6186 0.446 0.4923 0.2431 0.1352 0.2569 0.8248 0.3888 0.2335 0.0164 0.6346 0.6013 0.0595 0.0475 0.5667 0.5925 0.6244 0.0017 0.3706 0.0848 0.0015 0.0051 0.2297 0.16 0.3486 0.5055 0.152 0.0271 0.1729 0.7065 0.0001 0.6236 0.1884 0.4278 0.1233 0.6374 0.0991 0.4161 0.0899 0.557 0.9914 0.0009 0.5177 0.9143 0.0284 0.3777 0.4917 0.0105 0.2397 0.0701 0.325 0.2121 0.472 0.5087 0.5959 0.2942 0.7639 0.0177 0.0645 0.3197 0.0787 0.1162 0.0914 0.0193 0.1568 0.3017 0.1461 0.1851 0.3142 0.4713 0.47 0. 0.3241 0.0094 0.0637 0.0329] 2022-08-24 21:54:28 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 2022-08-24 21:54:38 [INFO] [TRAIN] epoch: 80, iter: 100050/160000, loss: 0.7615, lr: 0.000454, batch_cost: 0.1997, reader_cost: 0.00316, ips: 40.0532 samples/sec | ETA 03:19:34 2022-08-24 21:54:49 [INFO] [TRAIN] epoch: 80, iter: 100100/160000, loss: 0.6800, lr: 0.000454, batch_cost: 0.2047, reader_cost: 0.01119, ips: 39.0733 samples/sec | ETA 03:24:24 2022-08-24 21:54:58 [INFO] [TRAIN] epoch: 80, iter: 100150/160000, loss: 0.7599, lr: 0.000453, batch_cost: 0.1854, reader_cost: 0.00211, ips: 43.1508 samples/sec | ETA 03:04:55 2022-08-24 21:55:09 [INFO] [TRAIN] epoch: 80, iter: 100200/160000, loss: 0.7453, lr: 0.000453, batch_cost: 0.2235, reader_cost: 0.00133, ips: 35.7916 samples/sec | ETA 03:42:46 2022-08-24 21:55:19 [INFO] [TRAIN] epoch: 80, iter: 100250/160000, loss: 0.6841, lr: 0.000452, batch_cost: 0.1919, reader_cost: 0.00296, ips: 41.6801 samples/sec | ETA 03:11:08 2022-08-24 21:55:30 [INFO] [TRAIN] epoch: 80, iter: 100300/160000, loss: 0.7456, lr: 0.000452, batch_cost: 0.2266, reader_cost: 0.00048, ips: 35.3091 samples/sec | ETA 03:45:26 2022-08-24 21:55:40 [INFO] [TRAIN] epoch: 80, iter: 100350/160000, loss: 0.7502, lr: 0.000452, batch_cost: 0.2027, reader_cost: 0.00185, ips: 39.4727 samples/sec | ETA 03:21:29 2022-08-24 21:55:51 [INFO] [TRAIN] epoch: 80, iter: 100400/160000, loss: 0.7265, lr: 0.000451, batch_cost: 0.2241, reader_cost: 0.00061, ips: 35.7010 samples/sec | ETA 03:42:35 2022-08-24 21:56:01 [INFO] [TRAIN] epoch: 80, iter: 100450/160000, loss: 0.7674, lr: 0.000451, batch_cost: 0.1992, reader_cost: 0.00073, ips: 40.1567 samples/sec | ETA 03:17:43 2022-08-24 21:56:12 [INFO] [TRAIN] epoch: 80, iter: 100500/160000, loss: 0.7197, lr: 0.000450, batch_cost: 0.2077, reader_cost: 0.00088, ips: 38.5179 samples/sec | ETA 03:25:57 2022-08-24 21:56:22 [INFO] [TRAIN] epoch: 80, iter: 100550/160000, loss: 0.6959, lr: 0.000450, batch_cost: 0.1987, reader_cost: 0.00759, ips: 40.2716 samples/sec | ETA 03:16:49 2022-08-24 21:56:31 [INFO] [TRAIN] epoch: 80, iter: 100600/160000, loss: 0.7118, lr: 0.000450, batch_cost: 0.1913, reader_cost: 0.00039, ips: 41.8263 samples/sec | ETA 03:09:21 2022-08-24 21:56:42 [INFO] [TRAIN] epoch: 80, iter: 100650/160000, loss: 0.7436, lr: 0.000449, batch_cost: 0.2084, reader_cost: 0.01162, ips: 38.3803 samples/sec | ETA 03:26:10 2022-08-24 21:56:53 [INFO] [TRAIN] 
epoch: 80, iter: 100700/160000, loss: 0.7388, lr: 0.000449, batch_cost: 0.2248, reader_cost: 0.00035, ips: 35.5874 samples/sec | ETA 03:42:10 2022-08-24 21:57:04 [INFO] [TRAIN] epoch: 80, iter: 100750/160000, loss: 0.7361, lr: 0.000449, batch_cost: 0.2118, reader_cost: 0.00407, ips: 37.7687 samples/sec | ETA 03:29:10 2022-08-24 21:57:14 [INFO] [TRAIN] epoch: 80, iter: 100800/160000, loss: 0.7381, lr: 0.000448, batch_cost: 0.2132, reader_cost: 0.00724, ips: 37.5219 samples/sec | ETA 03:30:21 2022-08-24 21:57:24 [INFO] [TRAIN] epoch: 80, iter: 100850/160000, loss: 0.7137, lr: 0.000448, batch_cost: 0.2046, reader_cost: 0.00046, ips: 39.1077 samples/sec | ETA 03:21:39 2022-08-24 21:57:36 [INFO] [TRAIN] epoch: 80, iter: 100900/160000, loss: 0.7714, lr: 0.000447, batch_cost: 0.2252, reader_cost: 0.00096, ips: 35.5260 samples/sec | ETA 03:41:48 2022-08-24 21:57:46 [INFO] [TRAIN] epoch: 80, iter: 100950/160000, loss: 0.7426, lr: 0.000447, batch_cost: 0.2139, reader_cost: 0.00069, ips: 37.4044 samples/sec | ETA 03:30:29 2022-08-24 21:57:57 [INFO] [TRAIN] epoch: 80, iter: 101000/160000, loss: 0.7428, lr: 0.000447, batch_cost: 0.2206, reader_cost: 0.00547, ips: 36.2679 samples/sec | ETA 03:36:54 2022-08-24 21:57:57 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 173s - batch_cost: 0.1730 - reader cost: 7.1246e-04 2022-08-24 22:00:51 [INFO] [EVAL] #Images: 2000 mIoU: 0.3106 Acc: 0.7466 Kappa: 0.7270 Dice: 0.4359 2022-08-24 22:00:51 [INFO] [EVAL] Class IoU: [0.6539 0.7684 0.925 0.7026 0.6687 0.744 0.7638 0.7507 0.4877 0.622 0.4567 0.5217 0.6655 0.2937 0.1983 0.3683 0.4888 0.4291 0.5445 0.3647 0.7193 0.4177 0.5638 0.4555 0.3303 0.3983 0.3795 0.3843 0.3866 0.3103 0.1926 0.3978 0.2578 0.2863 0.3367 0.3726 0.3348 0.4974 0.2362 0.2452 0.0935 0.0548 0.3048 0.204 0.278 0.2678 0.2506 0.4172 0.6491 0.4736 0.4601 0.2742 0.2401 0.1724 0.6234 0.381 0.807 0.279 0.4938 0.189 0.0481 0.2009 0.3151 0.1356 0.3539 0.6552 0.1871 0.3469 0.0824 0.3278 0.3328 0.4 0.3582 0.2055 0.4041 0.2899 0.3128 0.188 0.2489 0.2008 0.6613 0.2567 0.2635 0.0174 0.4189 0.4545 0.0708 0.0451 0.3899 0.4255 0.3933 0. 0.1598 0.063 0.0008 0.0118 0.0629 0.1364 0.2389 0.3302 0.06 0.0119 0.1387 0.6723 0.0038 0.4777 0.1717 0.417 0.0978 0.3676 0.0575 0.1885 0.0771 0.4822 0.7437 0. 0.3902 0.531 0.0656 0.2697 0.4689 0.0133 0.198 0.1334 0.2342 0.1157 0.4431 0.3221 0.3576 0.1905 0.5096 0.0142 0.0606 0.2228 0.0648 0.1143 0.057 0.0333 0.1455 0.3086 0.0844 0.0667 0.2391 0.4407 0.3354 0. 0.2765 0.0122 0.0439 0.0355] 2022-08-24 22:00:51 [INFO] [EVAL] Class Precision: [0.7549 0.8424 0.9645 0.7961 0.7621 0.8551 0.872 0.8362 0.6241 0.7302 0.6347 0.673 0.7646 0.4783 0.4946 0.5554 0.6469 0.6322 0.754 0.5639 0.8059 0.6125 0.7438 0.5671 0.5048 0.502 0.5225 0.5708 0.6432 0.4997 0.4065 0.5869 0.5135 0.439 0.4815 0.4939 0.635 0.7906 0.5551 0.5161 0.1476 0.2063 0.5072 0.5565 0.4261 0.4163 0.4069 0.6437 0.7893 0.576 0.5893 0.3409 0.4936 0.7227 0.7035 0.6285 0.8429 0.6195 0.6446 0.3538 0.0794 0.4144 0.4651 0.5125 0.424 0.7765 0.4031 0.5867 0.189 0.7178 0.5215 0.5228 0.6741 0.2808 0.5494 0.4411 0.4883 0.6434 0.7696 0.5175 0.7691 0.512 0.7587 0.0547 0.6673 0.6796 0.2852 0.4462 0.6186 0.607 0.5363 0. 0.3236 0.2869 0.0049 0.05 0.6302 0.6039 0.4817 0.5146 0.7551 0.0204 0.6958 0.7749 0.0562 0.6334 0.4615 0.9672 0.2267 0.5531 0.2965 0.2563 0.4092 0.7812 0.7512 0. 
0.6859 0.5444 0.0941 0.6596 0.7939 0.7287 0.8794 0.5912 0.6358 0.6412 0.7491 0.4737 0.5147 0.6047 0.6274 0.4885 0.4362 0.715 0.5662 0.3133 0.3439 0.1308 0.4084 0.6599 0.1545 0.2935 0.5848 0.801 0.6411 0. 0.7683 0.3747 0.4685 0.6578] 2022-08-24 22:00:51 [INFO] [EVAL] Class Recall: [0.83 0.8974 0.9576 0.8567 0.8452 0.8513 0.8603 0.8801 0.6905 0.8076 0.6196 0.6989 0.837 0.4321 0.2486 0.5224 0.6667 0.5719 0.6622 0.5079 0.8701 0.5678 0.6997 0.6983 0.4887 0.6585 0.581 0.5404 0.4922 0.4503 0.268 0.5525 0.3411 0.4516 0.5282 0.6026 0.4145 0.5728 0.2914 0.3184 0.2034 0.0694 0.433 0.2437 0.4445 0.4288 0.3949 0.5424 0.7852 0.7271 0.6772 0.5837 0.3186 0.1846 0.8455 0.4918 0.9498 0.3367 0.6785 0.2887 0.1086 0.2805 0.4942 0.1557 0.6818 0.8076 0.2589 0.4591 0.1275 0.3763 0.4791 0.6301 0.4333 0.4338 0.6044 0.4582 0.4654 0.2099 0.269 0.2471 0.825 0.3398 0.2876 0.0249 0.5295 0.5784 0.0861 0.0477 0.5133 0.5872 0.596 0. 0.2399 0.0746 0.001 0.0152 0.0653 0.1497 0.3215 0.4795 0.0612 0.0275 0.1477 0.8355 0.0041 0.6603 0.2148 0.423 0.1468 0.5229 0.0666 0.4161 0.0868 0.5575 0.9868 0. 0.4751 0.9556 0.1776 0.3133 0.5339 0.0134 0.2036 0.1469 0.2705 0.1237 0.5203 0.5016 0.5394 0.2176 0.7307 0.0144 0.0658 0.2445 0.0682 0.1524 0.064 0.0427 0.1844 0.367 0.1569 0.0795 0.288 0.4949 0.4129 0. 0.3016 0.0125 0.0462 0.0362] 2022-08-24 22:00:51 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 2022-08-24 22:01:02 [INFO] [TRAIN] epoch: 81, iter: 101050/160000, loss: 0.7217, lr: 0.000446, batch_cost: 0.2336, reader_cost: 0.03514, ips: 34.2416 samples/sec | ETA 03:49:32 2022-08-24 22:01:14 [INFO] [TRAIN] epoch: 81, iter: 101100/160000, loss: 0.7250, lr: 0.000446, batch_cost: 0.2237, reader_cost: 0.00060, ips: 35.7603 samples/sec | ETA 03:39:36 2022-08-24 22:01:24 [INFO] [TRAIN] epoch: 81, iter: 101150/160000, loss: 0.7368, lr: 0.000446, batch_cost: 0.1992, reader_cost: 0.00710, ips: 40.1514 samples/sec | ETA 03:15:25 2022-08-24 22:01:34 [INFO] [TRAIN] epoch: 81, iter: 101200/160000, loss: 0.6517, lr: 0.000445, batch_cost: 0.1986, reader_cost: 0.00098, ips: 40.2845 samples/sec | ETA 03:14:36 2022-08-24 22:01:44 [INFO] [TRAIN] epoch: 81, iter: 101250/160000, loss: 0.7361, lr: 0.000445, batch_cost: 0.2111, reader_cost: 0.00170, ips: 37.8974 samples/sec | ETA 03:26:41 2022-08-24 22:01:53 [INFO] [TRAIN] epoch: 81, iter: 101300/160000, loss: 0.7040, lr: 0.000444, batch_cost: 0.1824, reader_cost: 0.00129, ips: 43.8613 samples/sec | ETA 02:58:26 2022-08-24 22:02:05 [INFO] [TRAIN] epoch: 81, iter: 101350/160000, loss: 0.7313, lr: 0.000444, batch_cost: 0.2268, reader_cost: 0.00202, ips: 35.2660 samples/sec | ETA 03:41:44 2022-08-24 22:02:15 [INFO] [TRAIN] epoch: 81, iter: 101400/160000, loss: 0.7052, lr: 0.000444, batch_cost: 0.2064, reader_cost: 0.00303, ips: 38.7623 samples/sec | ETA 03:21:34 2022-08-24 22:02:27 [INFO] [TRAIN] epoch: 81, iter: 101450/160000, loss: 0.7353, lr: 0.000443, batch_cost: 0.2403, reader_cost: 0.00067, ips: 33.2961 samples/sec | ETA 03:54:27 2022-08-24 22:02:38 [INFO] [TRAIN] epoch: 81, iter: 101500/160000, loss: 0.7035, lr: 0.000443, batch_cost: 0.2315, reader_cost: 0.00044, ips: 34.5604 samples/sec | ETA 03:45:41 2022-08-24 22:02:49 [INFO] [TRAIN] epoch: 81, iter: 101550/160000, loss: 0.7630, lr: 0.000443, batch_cost: 0.2129, reader_cost: 0.00054, ips: 37.5786 samples/sec | ETA 03:27:23 2022-08-24 22:03:01 [INFO] [TRAIN] epoch: 81, iter: 101600/160000, loss: 0.7471, lr: 0.000442, batch_cost: 0.2412, reader_cost: 0.00100, ips: 33.1609 samples/sec | ETA 03:54:48 2022-08-24 
22:03:12 [INFO] [TRAIN] epoch: 81, iter: 101650/160000, loss: 0.7184, lr: 0.000442, batch_cost: 0.2089, reader_cost: 0.00497, ips: 38.2999 samples/sec | ETA 03:23:08 2022-08-24 22:03:23 [INFO] [TRAIN] epoch: 81, iter: 101700/160000, loss: 0.6957, lr: 0.000441, batch_cost: 0.2248, reader_cost: 0.00145, ips: 35.5947 samples/sec | ETA 03:38:23 2022-08-24 22:03:33 [INFO] [TRAIN] epoch: 81, iter: 101750/160000, loss: 0.7220, lr: 0.000441, batch_cost: 0.2044, reader_cost: 0.00045, ips: 39.1326 samples/sec | ETA 03:18:28 2022-08-24 22:03:45 [INFO] [TRAIN] epoch: 81, iter: 101800/160000, loss: 0.7273, lr: 0.000441, batch_cost: 0.2424, reader_cost: 0.00112, ips: 32.9991 samples/sec | ETA 03:55:09 2022-08-24 22:03:56 [INFO] [TRAIN] epoch: 81, iter: 101850/160000, loss: 0.7675, lr: 0.000440, batch_cost: 0.2159, reader_cost: 0.00086, ips: 37.0511 samples/sec | ETA 03:29:15 2022-08-24 22:04:05 [INFO] [TRAIN] epoch: 81, iter: 101900/160000, loss: 0.7329, lr: 0.000440, batch_cost: 0.1895, reader_cost: 0.01371, ips: 42.2064 samples/sec | ETA 03:03:32 2022-08-24 22:04:16 [INFO] [TRAIN] epoch: 81, iter: 101950/160000, loss: 0.7430, lr: 0.000440, batch_cost: 0.2073, reader_cost: 0.01120, ips: 38.5989 samples/sec | ETA 03:20:31 2022-08-24 22:04:26 [INFO] [TRAIN] epoch: 81, iter: 102000/160000, loss: 0.6885, lr: 0.000439, batch_cost: 0.1949, reader_cost: 0.00203, ips: 41.0421 samples/sec | ETA 03:08:25 2022-08-24 22:04:26 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 204s - batch_cost: 0.2039 - reader cost: 7.8257e-04 2022-08-24 22:07:50 [INFO] [EVAL] #Images: 2000 mIoU: 0.3122 Acc: 0.7452 Kappa: 0.7256 Dice: 0.4386 2022-08-24 22:07:50 [INFO] [EVAL] Class IoU: [0.6587 0.7655 0.9257 0.7087 0.6637 0.7388 0.7616 0.7436 0.4809 0.6203 0.4629 0.5251 0.6612 0.2934 0.2038 0.3725 0.4848 0.3935 0.5436 0.3529 0.7065 0.4035 0.5654 0.4526 0.3314 0.3822 0.3545 0.3783 0.3836 0.3098 0.2116 0.3738 0.269 0.2892 0.3055 0.3867 0.3541 0.486 0.2374 0.2255 0.0915 0.0521 0.291 0.2234 0.2876 0.2828 0.2433 0.4069 0.6944 0.4746 0.4645 0.2383 0.2162 0.2035 0.612 0.4472 0.8204 0.2511 0.4507 0.231 0.102 0.1837 0.3019 0.1027 0.3674 0.6522 0.1923 0.3842 0.0928 0.2765 0.3204 0.399 0.3669 0.2049 0.4188 0.2835 0.3071 0.1973 0.2954 0.2678 0.6174 0.2806 0.2476 0.0117 0.4824 0.4558 0.0851 0.0521 0.3953 0.3726 0.4082 0.0021 0.1356 0.0782 0.001 0.0051 0.1687 0.1243 0.1986 0.3549 0.1365 0.0116 0.0537 0.5375 0.0006 0.5203 0.2068 0.4813 0.0996 0.3672 0.0614 0.2838 0.0876 0.5539 0.7262 0. 
0.3153 0.5738 0.0553 0.1496 0.4726 0.0146 0.2125 0.0592 0.2315 0.1795 0.4392 0.2798 0.3377 0.2534 0.5034 0.0138 0.1017 0.2459 0.0727 0.1211 0.0451 0.0174 0.1407 0.2876 0.0791 0.0551 0.266 0.4968 0.3239 0.0001 0.258 0.0076 0.1009 0.0465] 2022-08-24 22:07:50 [INFO] [EVAL] Class Precision: [0.761 0.8453 0.9633 0.8093 0.7497 0.8694 0.875 0.8081 0.6041 0.7588 0.6511 0.6614 0.7542 0.4684 0.4887 0.5514 0.6345 0.69 0.7025 0.5594 0.7853 0.6109 0.726 0.6062 0.524 0.5193 0.5065 0.7028 0.6154 0.4356 0.372 0.4764 0.4625 0.4069 0.5217 0.5106 0.5863 0.7662 0.4439 0.5593 0.1568 0.1984 0.4889 0.5249 0.4306 0.5221 0.3726 0.5773 0.7487 0.5864 0.6184 0.2789 0.4701 0.458 0.7244 0.604 0.8607 0.619 0.6416 0.4461 0.1365 0.424 0.5137 0.5042 0.4517 0.7794 0.3781 0.5357 0.3011 0.6824 0.5576 0.4916 0.659 0.2655 0.6077 0.4013 0.5269 0.6043 0.5236 0.5473 0.7178 0.5291 0.775 0.0384 0.6552 0.6788 0.2952 0.4451 0.5849 0.4735 0.554 0.0028 0.3322 0.2713 0.0062 0.0215 0.5492 0.5411 0.4045 0.5444 0.5636 0.0209 0.6665 0.5849 0.0103 0.7516 0.5915 0.9052 0.2676 0.5054 0.258 0.4719 0.4116 0.7736 0.7328 0. 0.7326 0.6175 0.1282 0.5945 0.7958 0.1491 0.8107 0.5817 0.6376 0.6623 0.7146 0.3646 0.545 0.5224 0.5706 0.5531 0.4081 0.7552 0.5963 0.2804 0.3248 0.1269 0.4253 0.7083 0.1687 0.116 0.5645 0.6245 0.5983 0.0001 0.8653 0.2897 0.4192 0.7261] 2022-08-24 22:07:50 [INFO] [EVAL] Class Recall: [0.8306 0.8902 0.9595 0.8508 0.8526 0.831 0.8545 0.9031 0.7022 0.7727 0.6156 0.7182 0.8428 0.4399 0.2591 0.5345 0.6728 0.478 0.7061 0.4887 0.8755 0.5432 0.7188 0.6412 0.4742 0.5915 0.5417 0.4504 0.5045 0.5175 0.3292 0.6345 0.3914 0.4998 0.4244 0.6144 0.472 0.5706 0.338 0.2743 0.1802 0.066 0.4183 0.28 0.464 0.3815 0.4123 0.5796 0.9056 0.7133 0.6512 0.6207 0.2858 0.2681 0.7977 0.6327 0.946 0.297 0.6023 0.324 0.2873 0.2448 0.4227 0.1142 0.663 0.7998 0.2813 0.576 0.1182 0.3173 0.4296 0.6794 0.4528 0.4731 0.574 0.4912 0.4241 0.2266 0.404 0.344 0.8153 0.3741 0.2668 0.0165 0.6465 0.5811 0.1067 0.0557 0.5494 0.6362 0.6081 0.0087 0.1863 0.099 0.0012 0.0066 0.1959 0.1389 0.2808 0.5049 0.1527 0.0251 0.0552 0.8688 0.0006 0.6283 0.2412 0.5069 0.1369 0.5732 0.0745 0.4159 0.1002 0.661 0.9878 0. 0.3563 0.8901 0.0886 0.1666 0.5378 0.0159 0.2236 0.0619 0.2666 0.1976 0.5327 0.5462 0.4703 0.3297 0.8102 0.014 0.1193 0.2672 0.0764 0.1756 0.0498 0.0198 0.1737 0.3262 0.1295 0.0951 0.3347 0.7085 0.4139 0.0007 0.2688 0.0078 0.1173 0.0473] 2022-08-24 22:07:50 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 
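A minimal sketch, assuming the [TRAIN]/[EVAL] line format shown above stays fixed for the whole run: the per-iteration and evaluation summaries can be parsed to track loss, throughput, and validation mIoU over time. The reported mIoU appears to be the unweighted mean of the per-class IoU vector; ips times batch_cost roughly recovers the per-step batch size (about 8 samples here, e.g. 36.1310 * 0.2214 ≈ 8); and ETA is approximately (160000 - iter) * batch_cost seconds (e.g. 61950 * 0.2214 ≈ 13716 s ≈ 03:48:36). The regular expressions and helper names below are illustrative assumptions inferred from this log, not an official PaddleSeg API.

import re

# Illustrative patterns inferred from the log format above (assumptions, not a PaddleSeg API).
TRAIN_RE = re.compile(
    r"\[TRAIN\] epoch: (\d+), iter: (\d+)/(\d+), loss: ([\d.]+), lr: ([\d.]+), "
    r"batch_cost: ([\d.]+), reader_cost: ([\d.]+), ips: ([\d.]+)"
)
EVAL_RE = re.compile(
    r"\[EVAL\] #Images: (\d+) mIoU: ([\d.]+) Acc: ([\d.]+) Kappa: ([\d.]+) Dice: ([\d.]+)"
)

def parse_log(text):
    """Extract (train_points, eval_points) from a raw log string in the format above."""
    train, evals = [], []
    for m in TRAIN_RE.finditer(text):
        it, total = int(m.group(2)), int(m.group(3))
        batch_cost, ips = float(m.group(6)), float(m.group(8))
        train.append({
            "iter": it,
            "loss": float(m.group(4)),
            "lr": float(m.group(5)),
            "batch_cost": batch_cost,
            # ips * batch_cost ~ samples per step; (total - it) * batch_cost ~ ETA in seconds
            "approx_batch_size": round(ips * batch_cost),
            "eta_sec": (total - it) * batch_cost,
        })
    for m in EVAL_RE.finditer(text):
        evals.append({
            "images": int(m.group(1)),
            "miou": float(m.group(2)),
            "acc": float(m.group(3)),
            "kappa": float(m.group(4)),
            "dice": float(m.group(5)),
        })
    return train, evals

Run over this section, parse_log would show the training loss fluctuating around 0.7 and the validation mIoU hovering around 0.31, while the best checkpoint remains the one saved at iter 93000 (mIoU 0.3147).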
2022-08-24 22:08:00 [INFO] [TRAIN] epoch: 81, iter: 102050/160000, loss: 0.7407, lr: 0.000439, batch_cost: 0.2022, reader_cost: 0.00875, ips: 39.5587 samples/sec | ETA 03:15:19 2022-08-24 22:08:09 [INFO] [TRAIN] epoch: 81, iter: 102100/160000, loss: 0.6963, lr: 0.000438, batch_cost: 0.1786, reader_cost: 0.00852, ips: 44.7837 samples/sec | ETA 02:52:23 2022-08-24 22:08:19 [INFO] [TRAIN] epoch: 81, iter: 102150/160000, loss: 0.7379, lr: 0.000438, batch_cost: 0.2042, reader_cost: 0.00282, ips: 39.1859 samples/sec | ETA 03:16:50 2022-08-24 22:08:30 [INFO] [TRAIN] epoch: 81, iter: 102200/160000, loss: 0.7355, lr: 0.000438, batch_cost: 0.2184, reader_cost: 0.00037, ips: 36.6325 samples/sec | ETA 03:30:22 2022-08-24 22:08:40 [INFO] [TRAIN] epoch: 81, iter: 102250/160000, loss: 0.7096, lr: 0.000437, batch_cost: 0.2052, reader_cost: 0.00053, ips: 38.9940 samples/sec | ETA 03:17:27 2022-08-24 22:08:50 [INFO] [TRAIN] epoch: 81, iter: 102300/160000, loss: 0.7160, lr: 0.000437, batch_cost: 0.2012, reader_cost: 0.00082, ips: 39.7575 samples/sec | ETA 03:13:30 2022-08-24 22:09:03 [INFO] [TRAIN] epoch: 82, iter: 102350/160000, loss: 0.7494, lr: 0.000436, batch_cost: 0.2607, reader_cost: 0.03347, ips: 30.6904 samples/sec | ETA 04:10:27 2022-08-24 22:09:14 [INFO] [TRAIN] epoch: 82, iter: 102400/160000, loss: 0.8125, lr: 0.000436, batch_cost: 0.2067, reader_cost: 0.00139, ips: 38.7075 samples/sec | ETA 03:18:24 2022-08-24 22:09:23 [INFO] [TRAIN] epoch: 82, iter: 102450/160000, loss: 0.7505, lr: 0.000436, batch_cost: 0.1853, reader_cost: 0.00387, ips: 43.1683 samples/sec | ETA 02:57:45 2022-08-24 22:09:33 [INFO] [TRAIN] epoch: 82, iter: 102500/160000, loss: 0.7081, lr: 0.000435, batch_cost: 0.2047, reader_cost: 0.00561, ips: 39.0802 samples/sec | ETA 03:16:10 2022-08-24 22:09:45 [INFO] [TRAIN] epoch: 82, iter: 102550/160000, loss: 0.7214, lr: 0.000435, batch_cost: 0.2355, reader_cost: 0.00054, ips: 33.9767 samples/sec | ETA 03:45:26 2022-08-24 22:09:56 [INFO] [TRAIN] epoch: 82, iter: 102600/160000, loss: 0.7566, lr: 0.000435, batch_cost: 0.2179, reader_cost: 0.00071, ips: 36.7188 samples/sec | ETA 03:28:25 2022-08-24 22:10:07 [INFO] [TRAIN] epoch: 82, iter: 102650/160000, loss: 0.7207, lr: 0.000434, batch_cost: 0.2201, reader_cost: 0.00077, ips: 36.3429 samples/sec | ETA 03:30:24 2022-08-24 22:10:17 [INFO] [TRAIN] epoch: 82, iter: 102700/160000, loss: 0.7210, lr: 0.000434, batch_cost: 0.1954, reader_cost: 0.00045, ips: 40.9459 samples/sec | ETA 03:06:35 2022-08-24 22:10:28 [INFO] [TRAIN] epoch: 82, iter: 102750/160000, loss: 0.7078, lr: 0.000433, batch_cost: 0.2242, reader_cost: 0.00079, ips: 35.6779 samples/sec | ETA 03:33:57 2022-08-24 22:10:37 [INFO] [TRAIN] epoch: 82, iter: 102800/160000, loss: 0.7274, lr: 0.000433, batch_cost: 0.1746, reader_cost: 0.00065, ips: 45.8118 samples/sec | ETA 02:46:28 2022-08-24 22:10:47 [INFO] [TRAIN] epoch: 82, iter: 102850/160000, loss: 0.7088, lr: 0.000433, batch_cost: 0.2145, reader_cost: 0.00079, ips: 37.3000 samples/sec | ETA 03:24:17 2022-08-24 22:10:58 [INFO] [TRAIN] epoch: 82, iter: 102900/160000, loss: 0.6646, lr: 0.000432, batch_cost: 0.2016, reader_cost: 0.00050, ips: 39.6739 samples/sec | ETA 03:11:53 2022-08-24 22:11:09 [INFO] [TRAIN] epoch: 82, iter: 102950/160000, loss: 0.7091, lr: 0.000432, batch_cost: 0.2250, reader_cost: 0.00558, ips: 35.5569 samples/sec | ETA 03:33:55 2022-08-24 22:11:19 [INFO] [TRAIN] epoch: 82, iter: 103000/160000, loss: 0.7443, lr: 0.000432, batch_cost: 0.1947, reader_cost: 0.00103, ips: 41.0873 samples/sec | ETA 03:04:58 2022-08-24 
22:11:19 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 192s - batch_cost: 0.1915 - reader cost: 9.6506e-04 2022-08-24 22:14:30 [INFO] [EVAL] #Images: 2000 mIoU: 0.3115 Acc: 0.7473 Kappa: 0.7280 Dice: 0.4363 2022-08-24 22:14:30 [INFO] [EVAL] Class IoU: [0.6602 0.7696 0.9255 0.7047 0.6668 0.7413 0.7652 0.7326 0.4818 0.6211 0.4665 0.5261 0.6604 0.2883 0.219 0.3691 0.4933 0.4031 0.5486 0.3622 0.7082 0.4153 0.5626 0.4386 0.3391 0.3855 0.3538 0.3908 0.3857 0.2975 0.2214 0.3571 0.2792 0.2823 0.3955 0.3836 0.3563 0.4901 0.2754 0.2617 0.0627 0.05 0.2924 0.2166 0.2844 0.2778 0.2475 0.4136 0.6656 0.479 0.4743 0.2734 0.1925 0.1982 0.6821 0.4613 0.8116 0.2759 0.4898 0.2035 0.0386 0.1952 0.2945 0.1398 0.3613 0.6362 0.2309 0.3872 0.0929 0.3462 0.3259 0.3884 0.3732 0.2234 0.4113 0.305 0.2712 0.2097 0.1406 0.1943 0.6261 0.2922 0.2426 0.0123 0.4455 0.4803 0.0755 0.0676 0.3937 0.3902 0.419 0. 0.1783 0.058 0.0023 0.0038 0.1338 0.1404 0.1472 0.3797 0.0365 0.0169 0.0333 0.7449 0.0052 0.5339 0.1525 0.4189 0.1098 0.3156 0.0475 0.0848 0.0871 0.4501 0.7401 0.0001 0.344 0.5558 0.0863 0.2223 0.4363 0.0137 0.1433 0.131 0.2367 0.1523 0.431 0.3288 0.2876 0.253 0.5012 0.0114 0.0392 0.2554 0.0884 0.0993 0.066 0.0109 0.1272 0.325 0.1008 0.1404 0.2688 0.4754 0.3043 0. 0.255 0.0069 0.1115 0.04 ] 2022-08-24 22:14:30 [INFO] [EVAL] Class Precision: [0.7717 0.8364 0.9581 0.8123 0.7641 0.8716 0.8926 0.7841 0.599 0.7364 0.6467 0.6406 0.7504 0.5172 0.5116 0.539 0.7026 0.6385 0.7085 0.5493 0.7819 0.5657 0.7145 0.5977 0.4869 0.5525 0.5052 0.6649 0.6103 0.4502 0.3746 0.4708 0.4883 0.39 0.4773 0.5148 0.5513 0.7939 0.4932 0.4948 0.1417 0.1883 0.5511 0.5592 0.4095 0.4683 0.3804 0.6605 0.7093 0.5801 0.5965 0.3265 0.4675 0.5751 0.7642 0.5848 0.8468 0.6262 0.6446 0.4231 0.078 0.3886 0.4695 0.5632 0.4341 0.7453 0.3473 0.5312 0.2181 0.7693 0.5469 0.6368 0.6318 0.2706 0.6486 0.5268 0.4356 0.5622 0.6179 0.5635 0.7329 0.5184 0.7799 0.0554 0.6837 0.647 0.2789 0.3862 0.6421 0.5453 0.5867 0. 0.3128 0.3123 0.0119 0.0207 0.5492 0.4793 0.4243 0.5496 0.6553 0.0288 0.4765 0.8102 0.1617 0.7765 0.3653 0.9298 0.2646 0.4519 0.2499 0.0963 0.3707 0.7873 0.7477 0.0015 0.7214 0.586 0.148 0.6264 0.7603 0.2135 0.8562 0.5619 0.6605 0.5894 0.7353 0.5032 0.4771 0.3918 0.6161 0.4101 0.3069 0.6743 0.5773 0.3214 0.2641 0.1916 0.4656 0.6651 0.1825 0.2613 0.528 0.6551 0.4952 0. 0.8671 0.2361 0.511 0.6764] 2022-08-24 22:14:30 [INFO] [EVAL] Class Recall: [0.8205 0.906 0.9645 0.8417 0.8396 0.8322 0.8427 0.9176 0.7113 0.7986 0.6261 0.7464 0.8462 0.3945 0.2768 0.5394 0.6236 0.5223 0.7085 0.5152 0.8826 0.6096 0.7259 0.6223 0.5277 0.5604 0.5415 0.4867 0.5117 0.4672 0.3512 0.5967 0.3946 0.5054 0.6978 0.6008 0.5018 0.5616 0.3841 0.3572 0.101 0.0638 0.3838 0.2612 0.4823 0.4057 0.4147 0.5252 0.9153 0.7332 0.6985 0.6272 0.2466 0.2321 0.8639 0.6859 0.9513 0.3303 0.6709 0.2815 0.071 0.2817 0.4414 0.1568 0.683 0.8129 0.4077 0.5882 0.1394 0.3863 0.4464 0.4988 0.4769 0.5614 0.5293 0.4201 0.4182 0.2506 0.154 0.2288 0.8112 0.4011 0.2604 0.0156 0.5611 0.6508 0.0938 0.0757 0.5044 0.5784 0.5944 0. 0.2932 0.0665 0.0028 0.0046 0.1503 0.1657 0.1839 0.5513 0.0372 0.0395 0.0346 0.9024 0.0053 0.6309 0.2074 0.4326 0.158 0.5112 0.0553 0.4161 0.1022 0.5124 0.9865 0.0001 0.3967 0.9152 0.1714 0.2562 0.5059 0.0144 0.1469 0.1459 0.2696 0.1704 0.5101 0.4869 0.4199 0.4166 0.7289 0.0116 0.0431 0.2914 0.0946 0.1256 0.0809 0.0114 0.149 0.3886 0.1837 0.2329 0.3538 0.6342 0.4412 0. 
0.2654 0.007 0.1249 0.0408] 2022-08-24 22:14:30 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 2022-08-24 22:14:40 [INFO] [TRAIN] epoch: 82, iter: 103050/160000, loss: 0.7942, lr: 0.000431, batch_cost: 0.1944, reader_cost: 0.00213, ips: 41.1475 samples/sec | ETA 03:04:32 2022-08-24 22:14:49 [INFO] [TRAIN] epoch: 82, iter: 103100/160000, loss: 0.7415, lr: 0.000431, batch_cost: 0.1852, reader_cost: 0.00901, ips: 43.1935 samples/sec | ETA 02:55:38 2022-08-24 22:14:59 [INFO] [TRAIN] epoch: 82, iter: 103150/160000, loss: 0.7398, lr: 0.000430, batch_cost: 0.1994, reader_cost: 0.00171, ips: 40.1193 samples/sec | ETA 03:08:56 2022-08-24 22:15:10 [INFO] [TRAIN] epoch: 82, iter: 103200/160000, loss: 0.6939, lr: 0.000430, batch_cost: 0.2200, reader_cost: 0.00424, ips: 36.3606 samples/sec | ETA 03:28:17 2022-08-24 22:15:21 [INFO] [TRAIN] epoch: 82, iter: 103250/160000, loss: 0.7573, lr: 0.000430, batch_cost: 0.2116, reader_cost: 0.00035, ips: 37.8134 samples/sec | ETA 03:20:06 2022-08-24 22:15:30 [INFO] [TRAIN] epoch: 82, iter: 103300/160000, loss: 0.6901, lr: 0.000429, batch_cost: 0.1810, reader_cost: 0.01296, ips: 44.1873 samples/sec | ETA 02:51:05 2022-08-24 22:15:40 [INFO] [TRAIN] epoch: 82, iter: 103350/160000, loss: 0.6897, lr: 0.000429, batch_cost: 0.1968, reader_cost: 0.00595, ips: 40.6479 samples/sec | ETA 03:05:49 2022-08-24 22:15:50 [INFO] [TRAIN] epoch: 82, iter: 103400/160000, loss: 0.7396, lr: 0.000429, batch_cost: 0.2002, reader_cost: 0.00454, ips: 39.9602 samples/sec | ETA 03:08:51 2022-08-24 22:16:01 [INFO] [TRAIN] epoch: 82, iter: 103450/160000, loss: 0.7486, lr: 0.000428, batch_cost: 0.2276, reader_cost: 0.00054, ips: 35.1557 samples/sec | ETA 03:34:28 2022-08-24 22:16:12 [INFO] [TRAIN] epoch: 82, iter: 103500/160000, loss: 0.7105, lr: 0.000428, batch_cost: 0.2199, reader_cost: 0.00045, ips: 36.3844 samples/sec | ETA 03:27:02 2022-08-24 22:16:22 [INFO] [TRAIN] epoch: 82, iter: 103550/160000, loss: 0.6922, lr: 0.000427, batch_cost: 0.1945, reader_cost: 0.00072, ips: 41.1225 samples/sec | ETA 03:03:01 2022-08-24 22:16:36 [INFO] [TRAIN] epoch: 83, iter: 103600/160000, loss: 0.7047, lr: 0.000427, batch_cost: 0.2811, reader_cost: 0.07448, ips: 28.4579 samples/sec | ETA 04:24:15 2022-08-24 22:16:47 [INFO] [TRAIN] epoch: 83, iter: 103650/160000, loss: 0.7846, lr: 0.000427, batch_cost: 0.2149, reader_cost: 0.00093, ips: 37.2319 samples/sec | ETA 03:21:47 2022-08-24 22:16:57 [INFO] [TRAIN] epoch: 83, iter: 103700/160000, loss: 0.7086, lr: 0.000426, batch_cost: 0.2037, reader_cost: 0.01239, ips: 39.2774 samples/sec | ETA 03:11:07 2022-08-24 22:17:07 [INFO] [TRAIN] epoch: 83, iter: 103750/160000, loss: 0.6899, lr: 0.000426, batch_cost: 0.1984, reader_cost: 0.00698, ips: 40.3322 samples/sec | ETA 03:05:57 2022-08-24 22:17:18 [INFO] [TRAIN] epoch: 83, iter: 103800/160000, loss: 0.7118, lr: 0.000425, batch_cost: 0.2278, reader_cost: 0.00067, ips: 35.1188 samples/sec | ETA 03:33:22 2022-08-24 22:17:28 [INFO] [TRAIN] epoch: 83, iter: 103850/160000, loss: 0.7412, lr: 0.000425, batch_cost: 0.2030, reader_cost: 0.00111, ips: 39.4038 samples/sec | ETA 03:09:59 2022-08-24 22:17:39 [INFO] [TRAIN] epoch: 83, iter: 103900/160000, loss: 0.7606, lr: 0.000425, batch_cost: 0.2044, reader_cost: 0.00148, ips: 39.1371 samples/sec | ETA 03:11:07 2022-08-24 22:17:50 [INFO] [TRAIN] epoch: 83, iter: 103950/160000, loss: 0.6877, lr: 0.000424, batch_cost: 0.2166, reader_cost: 0.00597, ips: 36.9406 samples/sec | ETA 03:22:18 2022-08-24 22:18:01 [INFO] [TRAIN] epoch: 83, iter: 
104000/160000, loss: 0.7187, lr: 0.000424, batch_cost: 0.2317, reader_cost: 0.00094, ips: 34.5242 samples/sec | ETA 03:36:16 2022-08-24 22:18:01 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 201s - batch_cost: 0.2007 - reader cost: 8.7145e-04 2022-08-24 22:21:22 [INFO] [EVAL] #Images: 2000 mIoU: 0.3070 Acc: 0.7430 Kappa: 0.7233 Dice: 0.4323 2022-08-24 22:21:22 [INFO] [EVAL] Class IoU: [0.6578 0.772 0.9254 0.6967 0.6632 0.74 0.7653 0.7404 0.4787 0.5407 0.4573 0.5288 0.6557 0.2955 0.2455 0.3716 0.4786 0.3984 0.5484 0.3562 0.7188 0.4036 0.5536 0.4461 0.3128 0.4005 0.3791 0.3782 0.3543 0.239 0.2105 0.3753 0.2774 0.2909 0.3137 0.3456 0.3505 0.479 0.2678 0.2605 0.0799 0.0639 0.2864 0.2116 0.3034 0.2564 0.2545 0.4344 0.6658 0.4976 0.4446 0.2695 0.1991 0.1756 0.5755 0.4457 0.7996 0.2875 0.495 0.1862 0.0551 0.1651 0.2595 0.1295 0.3575 0.6558 0.1939 0.3512 0.0956 0.3524 0.3181 0.3938 0.371 0.2173 0.4218 0.2936 0.3566 0.1871 0.1248 0.2392 0.62 0.2833 0.2006 0.0109 0.4217 0.468 0.0878 0.061 0.3615 0.3652 0.429 0.0034 0.1044 0.0698 0.0026 0.0114 0.1781 0.1405 0.1569 0.3747 0.032 0.0106 0.1489 0.5275 0.0012 0.5257 0.1979 0.4845 0.1185 0.3791 0.0399 0.2827 0.0994 0.5124 0.6594 0. 0.3523 0.5384 0.0459 0.3043 0.4018 0.012 0.0569 0.1343 0.2428 0.1463 0.4107 0.2975 0.4271 0.2649 0.5209 0.01 0.0202 0.2249 0.0965 0.1047 0.0627 0.0159 0.1336 0.3168 0.053 0.047 0.2365 0.3055 0.3582 0. 0.2647 0.0098 0.0538 0.0609] 2022-08-24 22:21:22 [INFO] [EVAL] Class Precision: [0.7682 0.84 0.96 0.7803 0.771 0.8607 0.8757 0.802 0.6056 0.7445 0.6731 0.6606 0.7484 0.4914 0.4546 0.5303 0.6332 0.6517 0.7373 0.568 0.8022 0.5802 0.6958 0.5836 0.5152 0.6179 0.5279 0.7212 0.6923 0.301 0.3943 0.5009 0.4787 0.3976 0.4822 0.467 0.5878 0.7616 0.4643 0.5042 0.1405 0.1864 0.4961 0.5346 0.4455 0.473 0.3806 0.6485 0.741 0.6328 0.586 0.329 0.4643 0.5748 0.6998 0.5891 0.8232 0.6213 0.6737 0.3763 0.0886 0.5037 0.4033 0.562 0.453 0.7838 0.3089 0.4573 0.2224 0.5977 0.5332 0.5885 0.619 0.2771 0.6127 0.4367 0.5513 0.6283 0.8731 0.5549 0.7201 0.5311 0.7741 0.0365 0.7015 0.6499 0.3609 0.4476 0.709 0.4817 0.6657 0.0047 0.2533 0.2994 0.0125 0.0362 0.6501 0.5778 0.4296 0.5662 0.5601 0.0189 0.6421 0.554 0.0435 0.7488 0.5812 0.8806 0.2543 0.5571 0.4599 0.4685 0.3235 0.7145 0.6631 0. 0.7089 0.5698 0.1046 0.6264 0.7822 0.4817 0.7396 0.5906 0.5792 0.628 0.6554 0.4218 0.5321 0.4271 0.5844 0.474 0.2157 0.7505 0.5633 0.2598 0.3442 0.0758 0.3335 0.6711 0.1561 0.0713 0.5321 0.6887 0.6587 0. 0.8529 0.2142 0.4278 0.6604] 2022-08-24 22:21:22 [INFO] [EVAL] Class Recall: [0.8206 0.905 0.9625 0.8667 0.8259 0.8407 0.8585 0.906 0.6955 0.664 0.5879 0.726 0.8411 0.4257 0.3479 0.5538 0.6623 0.5061 0.6815 0.4886 0.8737 0.57 0.7303 0.6543 0.4433 0.5323 0.5735 0.4429 0.4206 0.5373 0.3111 0.5993 0.3975 0.5203 0.4731 0.5707 0.4647 0.5635 0.3876 0.3502 0.1562 0.0886 0.4038 0.2593 0.4874 0.359 0.4344 0.5682 0.8678 0.6997 0.6481 0.5982 0.2585 0.2018 0.7643 0.6469 0.9655 0.3486 0.6511 0.2694 0.1273 0.1972 0.4212 0.1441 0.629 0.8006 0.3426 0.6022 0.1435 0.462 0.4409 0.5434 0.4809 0.5014 0.5753 0.4726 0.5024 0.2104 0.1271 0.296 0.8169 0.3778 0.2131 0.0154 0.5138 0.6257 0.1039 0.066 0.4245 0.6016 0.5468 0.0125 0.1508 0.0834 0.0033 0.0163 0.1969 0.1565 0.1981 0.5257 0.0329 0.0235 0.1623 0.9168 0.0012 0.6382 0.2308 0.5186 0.1816 0.5427 0.0419 0.4161 0.1255 0.6444 0.9916 0. 
0.4119 0.9072 0.0757 0.3718 0.4524 0.0122 0.0581 0.1481 0.2947 0.1602 0.5238 0.5024 0.6839 0.411 0.8275 0.0101 0.0218 0.2431 0.1043 0.1492 0.0712 0.0198 0.1822 0.3751 0.0742 0.1212 0.2986 0.3544 0.4398 0. 0.2774 0.0102 0.058 0.0629] 2022-08-24 22:21:22 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 2022-08-24 22:21:33 [INFO] [TRAIN] epoch: 83, iter: 104050/160000, loss: 0.7656, lr: 0.000424, batch_cost: 0.2139, reader_cost: 0.01010, ips: 37.4026 samples/sec | ETA 03:19:27 2022-08-24 22:21:43 [INFO] [TRAIN] epoch: 83, iter: 104100/160000, loss: 0.7134, lr: 0.000423, batch_cost: 0.2024, reader_cost: 0.00095, ips: 39.5332 samples/sec | ETA 03:08:31 2022-08-24 22:21:54 [INFO] [TRAIN] epoch: 83, iter: 104150/160000, loss: 0.7215, lr: 0.000423, batch_cost: 0.2167, reader_cost: 0.00082, ips: 36.9177 samples/sec | ETA 03:21:42 2022-08-24 22:22:04 [INFO] [TRAIN] epoch: 83, iter: 104200/160000, loss: 0.6907, lr: 0.000422, batch_cost: 0.2111, reader_cost: 0.00437, ips: 37.8974 samples/sec | ETA 03:16:19 2022-08-24 22:22:16 [INFO] [TRAIN] epoch: 83, iter: 104250/160000, loss: 0.7395, lr: 0.000422, batch_cost: 0.2387, reader_cost: 0.00059, ips: 33.5151 samples/sec | ETA 03:41:47 2022-08-24 22:22:27 [INFO] [TRAIN] epoch: 83, iter: 104300/160000, loss: 0.7741, lr: 0.000422, batch_cost: 0.2087, reader_cost: 0.00034, ips: 38.3242 samples/sec | ETA 03:13:47 2022-08-24 22:22:38 [INFO] [TRAIN] epoch: 83, iter: 104350/160000, loss: 0.7245, lr: 0.000421, batch_cost: 0.2187, reader_cost: 0.00174, ips: 36.5816 samples/sec | ETA 03:22:50 2022-08-24 22:22:49 [INFO] [TRAIN] epoch: 83, iter: 104400/160000, loss: 0.6806, lr: 0.000421, batch_cost: 0.2214, reader_cost: 0.00069, ips: 36.1376 samples/sec | ETA 03:25:08 2022-08-24 22:23:00 [INFO] [TRAIN] epoch: 83, iter: 104450/160000, loss: 0.6941, lr: 0.000421, batch_cost: 0.2164, reader_cost: 0.00034, ips: 36.9677 samples/sec | ETA 03:20:21 2022-08-24 22:23:11 [INFO] [TRAIN] epoch: 83, iter: 104500/160000, loss: 0.7344, lr: 0.000420, batch_cost: 0.2221, reader_cost: 0.00063, ips: 36.0128 samples/sec | ETA 03:25:28 2022-08-24 22:23:21 [INFO] [TRAIN] epoch: 83, iter: 104550/160000, loss: 0.7506, lr: 0.000420, batch_cost: 0.2151, reader_cost: 0.00127, ips: 37.1888 samples/sec | ETA 03:18:48 2022-08-24 22:23:33 [INFO] [TRAIN] epoch: 83, iter: 104600/160000, loss: 0.7011, lr: 0.000419, batch_cost: 0.2237, reader_cost: 0.00070, ips: 35.7585 samples/sec | ETA 03:26:34 2022-08-24 22:23:43 [INFO] [TRAIN] epoch: 83, iter: 104650/160000, loss: 0.7079, lr: 0.000419, batch_cost: 0.1999, reader_cost: 0.02265, ips: 40.0229 samples/sec | ETA 03:04:23 2022-08-24 22:23:53 [INFO] [TRAIN] epoch: 83, iter: 104700/160000, loss: 0.7220, lr: 0.000419, batch_cost: 0.2085, reader_cost: 0.00102, ips: 38.3666 samples/sec | ETA 03:12:10 2022-08-24 22:24:03 [INFO] [TRAIN] epoch: 83, iter: 104750/160000, loss: 0.7639, lr: 0.000418, batch_cost: 0.2045, reader_cost: 0.00596, ips: 39.1152 samples/sec | ETA 03:08:19 2022-08-24 22:24:12 [INFO] [TRAIN] epoch: 83, iter: 104800/160000, loss: 0.7330, lr: 0.000418, batch_cost: 0.1701, reader_cost: 0.00092, ips: 47.0222 samples/sec | ETA 02:36:31 2022-08-24 22:24:22 [INFO] [TRAIN] epoch: 84, iter: 104850/160000, loss: 0.7295, lr: 0.000418, batch_cost: 0.2054, reader_cost: 0.03075, ips: 38.9502 samples/sec | ETA 03:08:47 2022-08-24 22:24:34 [INFO] [TRAIN] epoch: 84, iter: 104900/160000, loss: 0.7022, lr: 0.000417, batch_cost: 0.2306, reader_cost: 0.00070, ips: 34.6938 samples/sec | ETA 03:31:45 2022-08-24 22:24:45 [INFO] 
[TRAIN] epoch: 84, iter: 104950/160000, loss: 0.7210, lr: 0.000417, batch_cost: 0.2248, reader_cost: 0.00034, ips: 35.5944 samples/sec | ETA 03:26:12 2022-08-24 22:24:57 [INFO] [TRAIN] epoch: 84, iter: 105000/160000, loss: 0.7615, lr: 0.000416, batch_cost: 0.2354, reader_cost: 0.00082, ips: 33.9786 samples/sec | ETA 03:35:49 2022-08-24 22:24:57 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 187s - batch_cost: 0.1870 - reader cost: 9.5776e-04 2022-08-24 22:28:04 [INFO] [EVAL] #Images: 2000 mIoU: 0.3133 Acc: 0.7440 Kappa: 0.7244 Dice: 0.4411 2022-08-24 22:28:04 [INFO] [EVAL] Class IoU: [0.6555 0.7616 0.925 0.7033 0.6615 0.7421 0.7623 0.7439 0.482 0.6357 0.4633 0.5029 0.6504 0.2664 0.1976 0.3662 0.4564 0.4101 0.5556 0.3636 0.7129 0.4027 0.5535 0.4436 0.3427 0.3635 0.3578 0.389 0.3495 0.2891 0.2318 0.3778 0.28 0.2827 0.3091 0.4027 0.3587 0.4754 0.25 0.247 0.097 0.0598 0.2877 0.2086 0.2937 0.2503 0.2451 0.398 0.6521 0.495 0.476 0.2588 0.2104 0.2135 0.5824 0.4552 0.8233 0.2394 0.4776 0.1923 0.0473 0.1652 0.3175 0.1739 0.3689 0.6543 0.2131 0.3624 0.0686 0.3544 0.2987 0.4071 0.3542 0.205 0.4239 0.2935 0.332 0.2203 0.2202 0.3218 0.625 0.2823 0.2302 0.0166 0.4506 0.4758 0.0883 0.072 0.3149 0.3538 0.3527 0.0026 0.2387 0.0727 0.0023 0.0151 0.1428 0.1381 0.1633 0.2703 0.1033 0.0292 0.2029 0.5376 0.0037 0.5167 0.2163 0.5081 0.0854 0.4024 0.0567 0.2645 0.1044 0.5608 0.7162 0. 0.3889 0.5531 0.0352 0.258 0.4673 0.0049 0.2119 0.1051 0.2356 0.2174 0.4328 0.286 0.3965 0.2283 0.5578 0.0233 0.0396 0.2872 0.1072 0.1112 0.0525 0.0182 0.1356 0.2754 0.1765 0.0823 0.2988 0.1882 0.3283 0. 0.3076 0.0145 0.0778 0.0327] 2022-08-24 22:28:04 [INFO] [EVAL] Class Precision: [0.7706 0.8282 0.9604 0.808 0.7555 0.8596 0.8425 0.8208 0.6117 0.7721 0.6586 0.7099 0.73 0.4542 0.515 0.5105 0.583 0.6367 0.7299 0.5544 0.797 0.5666 0.6938 0.591 0.5397 0.6274 0.5126 0.5901 0.686 0.5023 0.407 0.5322 0.4488 0.4385 0.4265 0.5815 0.5844 0.7478 0.5557 0.5122 0.1696 0.2121 0.4458 0.494 0.4131 0.4305 0.3753 0.5455 0.6977 0.6471 0.5899 0.3108 0.3886 0.6053 0.6698 0.5925 0.8635 0.6265 0.5672 0.3942 0.0789 0.4001 0.4604 0.533 0.468 0.8035 0.3866 0.5605 0.2709 0.7401 0.5315 0.5312 0.6443 0.2521 0.6103 0.4723 0.5219 0.4947 0.662 0.5265 0.7363 0.5567 0.7602 0.0556 0.5585 0.677 0.3331 0.3458 0.4554 0.4588 0.454 0.0031 0.3732 0.2554 0.0098 0.0424 0.5536 0.5115 0.441 0.5785 0.5853 0.0387 0.7008 0.6072 0.054 0.7526 0.5048 0.8546 0.2631 0.5513 0.2574 0.4207 0.4129 0.7081 0.7229 0.0011 0.6546 0.6009 0.0853 0.6648 0.6972 0.4004 0.7348 0.6026 0.5844 0.5899 0.8068 0.402 0.5769 0.5025 0.6459 0.4052 0.3119 0.6478 0.5313 0.3158 0.3405 0.1255 0.3686 0.7301 0.3305 0.1511 0.5099 0.7202 0.5541 0. 
0.8254 0.2927 0.646 0.8472] 2022-08-24 22:28:04 [INFO] [EVAL] Class Recall: [0.8144 0.9045 0.9617 0.8445 0.8416 0.8445 0.889 0.8881 0.6945 0.7825 0.6098 0.6329 0.8565 0.3918 0.2428 0.5644 0.6775 0.5355 0.6995 0.5137 0.871 0.582 0.7324 0.6401 0.4842 0.4637 0.5424 0.533 0.416 0.4052 0.35 0.5656 0.4267 0.443 0.5291 0.5671 0.4816 0.5662 0.3124 0.3229 0.1849 0.0769 0.4479 0.2653 0.5038 0.3741 0.414 0.5954 0.9088 0.678 0.7114 0.6072 0.3145 0.248 0.8169 0.6627 0.9466 0.2792 0.7513 0.2729 0.1059 0.2195 0.5057 0.2051 0.6353 0.7788 0.322 0.5062 0.0842 0.4047 0.4055 0.6353 0.4402 0.5231 0.5812 0.4367 0.477 0.2842 0.248 0.4528 0.8052 0.3642 0.2483 0.0231 0.6998 0.6156 0.1073 0.0834 0.5051 0.6073 0.6126 0.0158 0.3984 0.0923 0.0029 0.0229 0.1614 0.1591 0.206 0.3366 0.1114 0.1066 0.2222 0.8243 0.004 0.6225 0.2746 0.5562 0.1123 0.5983 0.0678 0.4161 0.1226 0.7294 0.9873 0. 0.4893 0.8744 0.0565 0.2965 0.5864 0.0049 0.2295 0.1129 0.283 0.2561 0.4829 0.4977 0.559 0.295 0.8036 0.0241 0.0434 0.3404 0.1184 0.1466 0.0585 0.0208 0.1766 0.3066 0.2748 0.153 0.4192 0.2031 0.4461 0. 0.329 0.015 0.0813 0.0329] 2022-08-24 22:28:04 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 2022-08-24 22:28:15 [INFO] [TRAIN] epoch: 84, iter: 105050/160000, loss: 0.7998, lr: 0.000416, batch_cost: 0.2312, reader_cost: 0.00421, ips: 34.6013 samples/sec | ETA 03:31:44 2022-08-24 22:28:27 [INFO] [TRAIN] epoch: 84, iter: 105100/160000, loss: 0.7360, lr: 0.000416, batch_cost: 0.2296, reader_cost: 0.00042, ips: 34.8466 samples/sec | ETA 03:30:03 2022-08-24 22:28:38 [INFO] [TRAIN] epoch: 84, iter: 105150/160000, loss: 0.7211, lr: 0.000415, batch_cost: 0.2137, reader_cost: 0.00570, ips: 37.4385 samples/sec | ETA 03:15:20 2022-08-24 22:28:49 [INFO] [TRAIN] epoch: 84, iter: 105200/160000, loss: 0.6985, lr: 0.000415, batch_cost: 0.2263, reader_cost: 0.00078, ips: 35.3448 samples/sec | ETA 03:26:43 2022-08-24 22:29:01 [INFO] [TRAIN] epoch: 84, iter: 105250/160000, loss: 0.7434, lr: 0.000415, batch_cost: 0.2493, reader_cost: 0.00051, ips: 32.0850 samples/sec | ETA 03:47:31 2022-08-24 22:29:12 [INFO] [TRAIN] epoch: 84, iter: 105300/160000, loss: 0.7552, lr: 0.000414, batch_cost: 0.2039, reader_cost: 0.00710, ips: 39.2390 samples/sec | ETA 03:05:52 2022-08-24 22:29:21 [INFO] [TRAIN] epoch: 84, iter: 105350/160000, loss: 0.7241, lr: 0.000414, batch_cost: 0.1855, reader_cost: 0.00085, ips: 43.1218 samples/sec | ETA 02:48:58 2022-08-24 22:29:32 [INFO] [TRAIN] epoch: 84, iter: 105400/160000, loss: 0.7205, lr: 0.000413, batch_cost: 0.2210, reader_cost: 0.00172, ips: 36.1910 samples/sec | ETA 03:21:09 2022-08-24 22:29:44 [INFO] [TRAIN] epoch: 84, iter: 105450/160000, loss: 0.7017, lr: 0.000413, batch_cost: 0.2331, reader_cost: 0.00100, ips: 34.3192 samples/sec | ETA 03:31:55 2022-08-24 22:29:56 [INFO] [TRAIN] epoch: 84, iter: 105500/160000, loss: 0.7054, lr: 0.000413, batch_cost: 0.2460, reader_cost: 0.00039, ips: 32.5146 samples/sec | ETA 03:43:29 2022-08-24 22:30:06 [INFO] [TRAIN] epoch: 84, iter: 105550/160000, loss: 0.7324, lr: 0.000412, batch_cost: 0.2086, reader_cost: 0.00056, ips: 38.3593 samples/sec | ETA 03:09:15 2022-08-24 22:30:18 [INFO] [TRAIN] epoch: 84, iter: 105600/160000, loss: 0.7239, lr: 0.000412, batch_cost: 0.2374, reader_cost: 0.00671, ips: 33.6920 samples/sec | ETA 03:35:17 2022-08-24 22:30:29 [INFO] [TRAIN] epoch: 84, iter: 105650/160000, loss: 0.7173, lr: 0.000411, batch_cost: 0.2151, reader_cost: 0.00062, ips: 37.1970 samples/sec | ETA 03:14:49 2022-08-24 22:30:39 [INFO] [TRAIN] epoch: 84, 
iter: 105700/160000, loss: 0.6837, lr: 0.000411, batch_cost: 0.2055, reader_cost: 0.00334, ips: 38.9248 samples/sec | ETA 03:05:59 2022-08-24 22:30:50 [INFO] [TRAIN] epoch: 84, iter: 105750/160000, loss: 0.7280, lr: 0.000411, batch_cost: 0.2072, reader_cost: 0.00033, ips: 38.6167 samples/sec | ETA 03:07:18 2022-08-24 22:31:01 [INFO] [TRAIN] epoch: 84, iter: 105800/160000, loss: 0.7382, lr: 0.000410, batch_cost: 0.2373, reader_cost: 0.00049, ips: 33.7169 samples/sec | ETA 03:34:20 2022-08-24 22:31:10 [INFO] [TRAIN] epoch: 84, iter: 105850/160000, loss: 0.7456, lr: 0.000410, batch_cost: 0.1758, reader_cost: 0.00185, ips: 45.5103 samples/sec | ETA 02:38:38 2022-08-24 22:31:21 [INFO] [TRAIN] epoch: 84, iter: 105900/160000, loss: 0.7269, lr: 0.000410, batch_cost: 0.2077, reader_cost: 0.00041, ips: 38.5152 samples/sec | ETA 03:07:17 2022-08-24 22:31:31 [INFO] [TRAIN] epoch: 84, iter: 105950/160000, loss: 0.6939, lr: 0.000409, batch_cost: 0.2081, reader_cost: 0.00064, ips: 38.4374 samples/sec | ETA 03:07:29 2022-08-24 22:31:41 [INFO] [TRAIN] epoch: 84, iter: 106000/160000, loss: 0.7646, lr: 0.000409, batch_cost: 0.1986, reader_cost: 0.00166, ips: 40.2799 samples/sec | ETA 02:58:44 2022-08-24 22:31:41 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 207s - batch_cost: 0.2066 - reader cost: 9.1599e-04 2022-08-24 22:35:08 [INFO] [EVAL] #Images: 2000 mIoU: 0.3137 Acc: 0.7454 Kappa: 0.7258 Dice: 0.4404 2022-08-24 22:35:08 [INFO] [EVAL] Class IoU: [0.6579 0.7685 0.9245 0.704 0.6635 0.7439 0.7598 0.7379 0.4744 0.6233 0.4588 0.5309 0.6478 0.3002 0.2292 0.3706 0.4905 0.4154 0.5398 0.3629 0.7213 0.4186 0.5602 0.4498 0.3428 0.3666 0.3759 0.3816 0.3536 0.2965 0.202 0.3719 0.2692 0.2809 0.3196 0.3699 0.3508 0.4445 0.2463 0.2822 0.0877 0.0576 0.2946 0.2238 0.2952 0.2421 0.2455 0.4073 0.6556 0.4848 0.4576 0.2134 0.2141 0.1904 0.589 0.4599 0.8277 0.256 0.4461 0.2125 0.07 0.1724 0.3101 0.0967 0.3732 0.6658 0.2405 0.3237 0.1225 0.342 0.3256 0.3987 0.3682 0.2105 0.3969 0.3044 0.3277 0.2074 0.254 0.2638 0.66 0.2743 0.2262 0.0182 0.4811 0.4752 0.0729 0.0577 0.3443 0.3672 0.385 0.0029 0.23 0.0634 0.0006 0.0127 0.1713 0.1427 0.1795 0.3552 0.0711 0.0367 0.1832 0.6353 0.0011 0.4831 0.1259 0.5366 0.0809 0.446 0.0646 0.182 0.0933 0.4955 0.6784 0.0023 0.3366 0.5733 0.0628 0.2959 0.5033 0.0089 0.1884 0.0998 0.215 0.22 0.4408 0.2619 0.4064 0.1899 0.544 0.014 0.0157 0.2557 0.085 0.1233 0.0758 0.0209 0.1397 0.3046 0.0749 0.0807 0.2898 0.4306 0.3305 0.0064 0.2791 0.0111 0.0637 0.0237] 2022-08-24 22:35:08 [INFO] [EVAL] Class Precision: [0.7596 0.8428 0.9673 0.8157 0.7512 0.8538 0.884 0.8141 0.6088 0.7693 0.7016 0.6571 0.7197 0.4959 0.4588 0.5054 0.6793 0.6158 0.6973 0.543 0.8081 0.6088 0.748 0.5912 0.5126 0.5537 0.5206 0.6946 0.726 0.4457 0.3897 0.4998 0.5444 0.3942 0.4374 0.5425 0.63 0.6754 0.5406 0.4705 0.1472 0.2041 0.4887 0.4905 0.3939 0.4047 0.3745 0.6519 0.7197 0.5979 0.6055 0.2481 0.4853 0.6631 0.6764 0.6766 0.8645 0.6382 0.642 0.355 0.1213 0.4346 0.4002 0.6484 0.5006 0.7793 0.3677 0.5889 0.2813 0.6177 0.5088 0.5242 0.6507 0.2651 0.6385 0.5168 0.4963 0.5547 0.5851 0.5686 0.7766 0.527 0.7927 0.0736 0.6702 0.6075 0.3066 0.5203 0.5674 0.4901 0.5088 0.0038 0.3557 0.2798 0.0039 0.0314 0.6789 0.5937 0.5282 0.5953 0.7848 0.0572 0.7068 0.6915 0.0492 0.6879 0.4634 0.7986 0.2727 0.6115 0.2544 0.2444 0.4472 0.7761 0.6803 0.0517 0.7189 0.6001 0.1252 0.6491 0.7962 0.5607 0.6921 0.5587 0.646 0.6562 0.6999 0.3352 0.5292 0.5485 0.6344 0.4525 0.1159 0.6572 0.5603 0.2669 0.2704 0.1726 0.3521 0.6984 0.2009 
0.1399 0.5057 0.6432 0.5686 0.0078 0.8924 0.312 0.5333 0.8831] 2022-08-24 22:35:08 [INFO] [EVAL] Class Recall: [0.8309 0.8971 0.9544 0.8371 0.8504 0.8526 0.8439 0.8874 0.6826 0.7666 0.5701 0.7343 0.8664 0.4321 0.314 0.5815 0.6383 0.5607 0.705 0.5224 0.8704 0.5726 0.6906 0.653 0.5086 0.5203 0.5748 0.4586 0.4081 0.4698 0.2956 0.5923 0.3475 0.4943 0.5428 0.5376 0.4418 0.5652 0.3116 0.4135 0.1785 0.0743 0.4259 0.2916 0.5408 0.3759 0.4161 0.5205 0.8804 0.7193 0.6519 0.6043 0.277 0.2108 0.8201 0.5894 0.9511 0.2995 0.5939 0.3462 0.142 0.2223 0.5795 0.1021 0.5945 0.8204 0.4101 0.4182 0.1782 0.4339 0.475 0.6248 0.4589 0.5057 0.512 0.4255 0.491 0.2489 0.3098 0.3299 0.8148 0.3639 0.2404 0.0237 0.6303 0.6857 0.0873 0.0609 0.4668 0.5942 0.6128 0.0122 0.3943 0.0758 0.0008 0.0208 0.1865 0.1581 0.2138 0.4682 0.0725 0.0926 0.1983 0.8866 0.0012 0.6187 0.1474 0.6206 0.1032 0.6223 0.0797 0.4161 0.1055 0.5781 0.9959 0.0024 0.3876 0.9279 0.1119 0.3523 0.5778 0.0089 0.2056 0.1084 0.2437 0.2486 0.5435 0.5451 0.6365 0.225 0.7925 0.0143 0.0178 0.295 0.0911 0.1865 0.0953 0.0232 0.1881 0.3508 0.1066 0.1604 0.4044 0.5658 0.4411 0.034 0.2889 0.0114 0.0675 0.0238] 2022-08-24 22:35:08 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 2022-08-24 22:35:20 [INFO] [TRAIN] epoch: 84, iter: 106050/160000, loss: 0.7591, lr: 0.000408, batch_cost: 0.2329, reader_cost: 0.00442, ips: 34.3543 samples/sec | ETA 03:29:23 2022-08-24 22:35:32 [INFO] [TRAIN] epoch: 85, iter: 106100/160000, loss: 0.6819, lr: 0.000408, batch_cost: 0.2498, reader_cost: 0.03088, ips: 32.0217 samples/sec | ETA 03:44:25 2022-08-24 22:35:42 [INFO] [TRAIN] epoch: 85, iter: 106150/160000, loss: 0.7282, lr: 0.000408, batch_cost: 0.2037, reader_cost: 0.00042, ips: 39.2777 samples/sec | ETA 03:02:48 2022-08-24 22:35:53 [INFO] [TRAIN] epoch: 85, iter: 106200/160000, loss: 0.7095, lr: 0.000407, batch_cost: 0.2203, reader_cost: 0.00128, ips: 36.3130 samples/sec | ETA 03:17:32 2022-08-24 22:36:04 [INFO] [TRAIN] epoch: 85, iter: 106250/160000, loss: 0.7636, lr: 0.000407, batch_cost: 0.2147, reader_cost: 0.00422, ips: 37.2597 samples/sec | ETA 03:12:20 2022-08-24 22:36:16 [INFO] [TRAIN] epoch: 85, iter: 106300/160000, loss: 0.7289, lr: 0.000407, batch_cost: 0.2415, reader_cost: 0.00043, ips: 33.1307 samples/sec | ETA 03:36:06 2022-08-24 22:36:27 [INFO] [TRAIN] epoch: 85, iter: 106350/160000, loss: 0.7148, lr: 0.000406, batch_cost: 0.2145, reader_cost: 0.01223, ips: 37.3034 samples/sec | ETA 03:11:45 2022-08-24 22:36:37 [INFO] [TRAIN] epoch: 85, iter: 106400/160000, loss: 0.6842, lr: 0.000406, batch_cost: 0.2060, reader_cost: 0.00047, ips: 38.8308 samples/sec | ETA 03:04:02 2022-08-24 22:36:47 [INFO] [TRAIN] epoch: 85, iter: 106450/160000, loss: 0.7298, lr: 0.000405, batch_cost: 0.1978, reader_cost: 0.00044, ips: 40.4411 samples/sec | ETA 02:56:33 2022-08-24 22:36:58 [INFO] [TRAIN] epoch: 85, iter: 106500/160000, loss: 0.6972, lr: 0.000405, batch_cost: 0.2124, reader_cost: 0.01064, ips: 37.6615 samples/sec | ETA 03:09:24 2022-08-24 22:37:10 [INFO] [TRAIN] epoch: 85, iter: 106550/160000, loss: 0.7144, lr: 0.000405, batch_cost: 0.2402, reader_cost: 0.00068, ips: 33.3105 samples/sec | ETA 03:33:56 2022-08-24 22:37:21 [INFO] [TRAIN] epoch: 85, iter: 106600/160000, loss: 0.7101, lr: 0.000404, batch_cost: 0.2208, reader_cost: 0.00051, ips: 36.2386 samples/sec | ETA 03:16:28 2022-08-24 22:37:32 [INFO] [TRAIN] epoch: 85, iter: 106650/160000, loss: 0.7625, lr: 0.000404, batch_cost: 0.2177, reader_cost: 0.00057, ips: 36.7438 samples/sec | ETA 
03:13:35 2022-08-24 22:37:42 [INFO] [TRAIN] epoch: 85, iter: 106700/160000, loss: 0.7347, lr: 0.000404, batch_cost: 0.2043, reader_cost: 0.00065, ips: 39.1507 samples/sec | ETA 03:01:31 2022-08-24 22:37:54 [INFO] [TRAIN] epoch: 85, iter: 106750/160000, loss: 0.7429, lr: 0.000403, batch_cost: 0.2367, reader_cost: 0.00673, ips: 33.7976 samples/sec | ETA 03:30:04 2022-08-24 22:38:09 [INFO] [TRAIN] epoch: 85, iter: 106800/160000, loss: 0.7389, lr: 0.000403, batch_cost: 0.3021, reader_cost: 0.00188, ips: 26.4827 samples/sec | ETA 04:27:50 2022-08-24 22:38:20 [INFO] [TRAIN] epoch: 85, iter: 106850/160000, loss: 0.6707, lr: 0.000402, batch_cost: 0.2227, reader_cost: 0.00053, ips: 35.9157 samples/sec | ETA 03:17:18 2022-08-24 22:38:30 [INFO] [TRAIN] epoch: 85, iter: 106900/160000, loss: 0.7041, lr: 0.000402, batch_cost: 0.2032, reader_cost: 0.00317, ips: 39.3792 samples/sec | ETA 02:59:47 2022-08-24 22:38:41 [INFO] [TRAIN] epoch: 85, iter: 106950/160000, loss: 0.7864, lr: 0.000402, batch_cost: 0.2193, reader_cost: 0.00078, ips: 36.4733 samples/sec | ETA 03:13:55 2022-08-24 22:38:52 [INFO] [TRAIN] epoch: 85, iter: 107000/160000, loss: 0.7031, lr: 0.000401, batch_cost: 0.2177, reader_cost: 0.00042, ips: 36.7493 samples/sec | ETA 03:12:17 2022-08-24 22:38:52 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 179s - batch_cost: 0.1791 - reader cost: 7.2227e-04 2022-08-24 22:41:52 [INFO] [EVAL] #Images: 2000 mIoU: 0.3135 Acc: 0.7460 Kappa: 0.7265 Dice: 0.4411 2022-08-24 22:41:52 [INFO] [EVAL] Class IoU: [0.6582 0.7636 0.9253 0.703 0.6633 0.7485 0.7576 0.7462 0.4807 0.6376 0.459 0.5293 0.658 0.3021 0.2087 0.367 0.4702 0.3938 0.5544 0.3529 0.7216 0.439 0.5645 0.4554 0.3211 0.3726 0.3687 0.3913 0.3476 0.3287 0.2132 0.381 0.2775 0.286 0.3047 0.3867 0.3646 0.4542 0.2366 0.2204 0.0826 0.0891 0.2986 0.2243 0.2886 0.2648 0.2007 0.4171 0.6411 0.4693 0.4654 0.243 0.2076 0.1773 0.5737 0.47 0.8097 0.3266 0.4462 0.2159 0.0684 0.1645 0.3085 0.119 0.3894 0.6457 0.1886 0.3582 0.0952 0.3468 0.3369 0.4165 0.3849 0.2196 0.4085 0.2893 0.3172 0.1972 0.1199 0.332 0.6398 0.2842 0.2447 0.0106 0.4214 0.4459 0.0861 0.0766 0.3469 0.372 0.3871 0.0006 0.2125 0.056 0.0035 0.0059 0.2362 0.1324 0.1969 0.3457 0.1262 0.0113 0.2229 0.5778 0.0005 0.4763 0.1679 0.5151 0.0952 0.3514 0.0645 0.1524 0.0905 0.5691 0.6615 0.0026 0.3641 0.5774 0.0524 0.2842 0.4515 0.0111 0.1859 0.1461 0.2454 0.1825 0.4213 0.2804 0.4026 0.2491 0.5157 0.0174 0.0931 0.2809 0.0801 0.115 0.0546 0.0212 0.1346 0.3344 0.0779 0.0078 0.2765 0.3998 0.3244 0.0069 0.2758 0.0173 0.0741 0.0473] 2022-08-24 22:41:52 [INFO] [EVAL] Class Precision: [0.7658 0.8329 0.965 0.8038 0.7589 0.8491 0.8723 0.8133 0.5961 0.7484 0.6681 0.6597 0.7599 0.5008 0.4748 0.5571 0.6295 0.6172 0.7257 0.5874 0.8157 0.6339 0.7453 0.5733 0.519 0.5915 0.5344 0.6394 0.7181 0.5063 0.3573 0.5341 0.5009 0.4202 0.4487 0.5345 0.5702 0.7555 0.4227 0.5566 0.1448 0.2132 0.491 0.4882 0.4232 0.4196 0.2715 0.669 0.677 0.5761 0.6016 0.2901 0.432 0.5976 0.6955 0.6047 0.8408 0.5769 0.6639 0.3665 0.1158 0.511 0.4661 0.5931 0.4926 0.7629 0.3094 0.5272 0.2085 0.7174 0.5245 0.5879 0.6175 0.272 0.5919 0.4465 0.5499 0.5831 0.8744 0.5751 0.7433 0.5226 0.7784 0.0669 0.6384 0.6874 0.3787 0.3668 0.5679 0.5079 0.5175 0.0008 0.3882 0.2897 0.0138 0.02 0.4829 0.6514 0.4755 0.5855 0.647 0.0309 0.6922 0.7553 0.0372 0.6502 0.3909 0.8537 0.3352 0.528 0.2199 0.1938 0.3452 0.7466 0.6644 0.0822 0.6432 0.6001 0.1263 0.7866 0.7059 0.4147 0.7717 0.531 0.6162 0.6284 0.7774 0.3832 0.5402 0.4595 0.5755 0.5409 0.3649 
0.5806 0.582 0.2716 0.2784 0.1521 0.4043 0.5828 0.1535 0.0296 0.5644 0.6636 0.5562 0.008 0.8167 0.3328 0.4074 0.6188] 2022-08-24 22:41:52 [INFO] [EVAL] Class Recall: [0.8241 0.9018 0.9575 0.8487 0.8405 0.8634 0.8522 0.9004 0.7129 0.8115 0.5946 0.7282 0.8308 0.4324 0.2713 0.5182 0.65 0.5211 0.7013 0.4692 0.8622 0.5881 0.6994 0.689 0.457 0.5017 0.5432 0.502 0.4025 0.4837 0.3457 0.5707 0.3836 0.4726 0.4871 0.5831 0.5027 0.5324 0.3495 0.2674 0.1614 0.1329 0.4324 0.2933 0.4757 0.4179 0.435 0.5255 0.9236 0.7167 0.6726 0.5994 0.2856 0.2014 0.7662 0.6784 0.9564 0.4295 0.5764 0.3443 0.1434 0.1952 0.4769 0.1296 0.6502 0.8077 0.3257 0.5276 0.149 0.4017 0.4851 0.5881 0.5054 0.5329 0.5686 0.451 0.4285 0.2296 0.122 0.4399 0.8212 0.3838 0.2631 0.0125 0.5535 0.5594 0.1003 0.0883 0.4712 0.5816 0.6058 0.0021 0.3195 0.0649 0.0047 0.0084 0.3161 0.1425 0.2515 0.4577 0.1355 0.0174 0.2474 0.7109 0.0005 0.6404 0.2274 0.565 0.1174 0.5124 0.0837 0.4161 0.1093 0.7054 0.9934 0.0027 0.4562 0.9385 0.0823 0.3079 0.5561 0.0113 0.1967 0.1677 0.2897 0.2045 0.479 0.511 0.6125 0.3524 0.8323 0.0176 0.1112 0.3524 0.085 0.1663 0.0636 0.0241 0.1679 0.4396 0.1365 0.0106 0.3516 0.5014 0.4377 0.0469 0.2939 0.018 0.083 0.0487] 2022-08-24 22:41:52 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 2022-08-24 22:42:03 [INFO] [TRAIN] epoch: 85, iter: 107050/160000, loss: 0.7209, lr: 0.000401, batch_cost: 0.2260, reader_cost: 0.00461, ips: 35.4034 samples/sec | ETA 03:19:24 2022-08-24 22:42:14 [INFO] [TRAIN] epoch: 85, iter: 107100/160000, loss: 0.7463, lr: 0.000401, batch_cost: 0.2236, reader_cost: 0.00097, ips: 35.7782 samples/sec | ETA 03:17:08 2022-08-24 22:42:26 [INFO] [TRAIN] epoch: 85, iter: 107150/160000, loss: 0.7588, lr: 0.000400, batch_cost: 0.2288, reader_cost: 0.00059, ips: 34.9705 samples/sec | ETA 03:21:30 2022-08-24 22:42:36 [INFO] [TRAIN] epoch: 85, iter: 107200/160000, loss: 0.6598, lr: 0.000400, batch_cost: 0.2156, reader_cost: 0.00062, ips: 37.1017 samples/sec | ETA 03:09:44 2022-08-24 22:42:48 [INFO] [TRAIN] epoch: 85, iter: 107250/160000, loss: 0.7275, lr: 0.000399, batch_cost: 0.2230, reader_cost: 0.00110, ips: 35.8754 samples/sec | ETA 03:16:02 2022-08-24 22:42:58 [INFO] [TRAIN] epoch: 85, iter: 107300/160000, loss: 0.7346, lr: 0.000399, batch_cost: 0.2079, reader_cost: 0.00053, ips: 38.4834 samples/sec | ETA 03:02:35 2022-08-24 22:43:08 [INFO] [TRAIN] epoch: 85, iter: 107350/160000, loss: 0.7067, lr: 0.000399, batch_cost: 0.2034, reader_cost: 0.00694, ips: 39.3258 samples/sec | ETA 02:58:30 2022-08-24 22:43:20 [INFO] [TRAIN] epoch: 86, iter: 107400/160000, loss: 0.7084, lr: 0.000398, batch_cost: 0.2379, reader_cost: 0.03552, ips: 33.6228 samples/sec | ETA 03:28:35 2022-08-24 22:43:31 [INFO] [TRAIN] epoch: 86, iter: 107450/160000, loss: 0.7405, lr: 0.000398, batch_cost: 0.2205, reader_cost: 0.00936, ips: 36.2733 samples/sec | ETA 03:13:09 2022-08-24 22:43:43 [INFO] [TRAIN] epoch: 86, iter: 107500/160000, loss: 0.6833, lr: 0.000397, batch_cost: 0.2277, reader_cost: 0.00036, ips: 35.1404 samples/sec | ETA 03:19:12 2022-08-24 22:43:52 [INFO] [TRAIN] epoch: 86, iter: 107550/160000, loss: 0.7400, lr: 0.000397, batch_cost: 0.1956, reader_cost: 0.00365, ips: 40.8895 samples/sec | ETA 02:51:01 2022-08-24 22:44:03 [INFO] [TRAIN] epoch: 86, iter: 107600/160000, loss: 0.7075, lr: 0.000397, batch_cost: 0.2193, reader_cost: 0.00201, ips: 36.4733 samples/sec | ETA 03:11:33 2022-08-24 22:44:13 [INFO] [TRAIN] epoch: 86, iter: 107650/160000, loss: 0.7385, lr: 0.000396, batch_cost: 0.2015, 
reader_cost: 0.01493, ips: 39.7108 samples/sec | ETA 02:55:46 2022-08-24 22:44:24 [INFO] [TRAIN] epoch: 86, iter: 107700/160000, loss: 0.7254, lr: 0.000396, batch_cost: 0.2123, reader_cost: 0.00744, ips: 37.6864 samples/sec | ETA 03:05:02 2022-08-24 22:44:36 [INFO] [TRAIN] epoch: 86, iter: 107750/160000, loss: 0.7393, lr: 0.000396, batch_cost: 0.2333, reader_cost: 0.00746, ips: 34.2961 samples/sec | ETA 03:23:07 2022-08-24 22:44:47 [INFO] [TRAIN] epoch: 86, iter: 107800/160000, loss: 0.7341, lr: 0.000395, batch_cost: 0.2247, reader_cost: 0.00060, ips: 35.6039 samples/sec | ETA 03:15:29 2022-08-24 22:44:58 [INFO] [TRAIN] epoch: 86, iter: 107850/160000, loss: 0.7014, lr: 0.000395, batch_cost: 0.2204, reader_cost: 0.00687, ips: 36.2937 samples/sec | ETA 03:11:35 2022-08-24 22:45:12 [INFO] [TRAIN] epoch: 86, iter: 107900/160000, loss: 0.7337, lr: 0.000394, batch_cost: 0.2723, reader_cost: 0.02745, ips: 29.3760 samples/sec | ETA 03:56:28 2022-08-24 22:45:25 [INFO] [TRAIN] epoch: 86, iter: 107950/160000, loss: 0.6994, lr: 0.000394, batch_cost: 0.2620, reader_cost: 0.01767, ips: 30.5368 samples/sec | ETA 03:47:15 2022-08-24 22:45:37 [INFO] [TRAIN] epoch: 86, iter: 108000/160000, loss: 0.7281, lr: 0.000394, batch_cost: 0.2378, reader_cost: 0.00121, ips: 33.6401 samples/sec | ETA 03:26:06 2022-08-24 22:45:37 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 203s - batch_cost: 0.2033 - reader cost: 7.7235e-04 2022-08-24 22:49:00 [INFO] [EVAL] #Images: 2000 mIoU: 0.3100 Acc: 0.7446 Kappa: 0.7248 Dice: 0.4356 2022-08-24 22:49:00 [INFO] [EVAL] Class IoU: [0.6578 0.7556 0.9252 0.7037 0.6596 0.7453 0.7597 0.745 0.4717 0.6203 0.4568 0.529 0.6579 0.3044 0.2206 0.3763 0.4854 0.4113 0.528 0.3684 0.7268 0.4434 0.537 0.4562 0.3389 0.31 0.3585 0.3958 0.3783 0.2964 0.2022 0.3792 0.2683 0.27 0.3229 0.3849 0.3593 0.471 0.2441 0.2643 0.0774 0.0813 0.3023 0.2219 0.2734 0.2761 0.2406 0.411 0.6493 0.4787 0.4789 0.2143 0.2017 0.1643 0.6173 0.4544 0.8433 0.2786 0.4737 0.223 0.034 0.157 0.3012 0.1498 0.3841 0.6562 0.2057 0.3603 0.0943 0.3222 0.3168 0.3908 0.3805 0.1911 0.4037 0.3071 0.3189 0.2051 0.2145 0.2937 0.6083 0.2646 0.2694 0.0136 0.4284 0.4692 0.0698 0.0763 0.378 0.3704 0.3469 0. 0.1911 0.0704 0.0001 0.0054 0.1367 0.1023 0.1861 0.4128 0.0713 0.004 0.2318 0.5198 0.0043 0.4534 0.1739 0.5201 0.0916 0.3463 0.0613 0.0709 0.0795 0.5119 0.7311 0.001 0.3896 0.5628 0.0458 0.2534 0.4205 0.0083 0.1873 0.1093 0.2594 0.1373 0.4135 0.3256 0.4575 0.1814 0.4945 0.0215 0.0133 0.2553 0.0663 0.1124 0.0597 0.0197 0.138 0.2835 0.1308 0.0111 0.3067 0.4002 0.3283 0. 0.2762 0.0174 0.0831 0.0256] 2022-08-24 22:49:00 [INFO] [EVAL] Class Precision: [0.765 0.8133 0.9595 0.8072 0.7924 0.8637 0.8668 0.8268 0.5973 0.7612 0.6701 0.6715 0.7494 0.4921 0.4621 0.5407 0.6672 0.6355 0.7574 0.552 0.8265 0.6301 0.6739 0.5865 0.4883 0.5903 0.5167 0.7029 0.6393 0.4408 0.4219 0.5072 0.4187 0.3665 0.4478 0.5124 0.5885 0.8281 0.4129 0.4818 0.1614 0.2137 0.4916 0.4658 0.3586 0.469 0.3645 0.5951 0.692 0.6035 0.6403 0.249 0.4533 0.5815 0.7001 0.5896 0.879 0.6374 0.7091 0.3709 0.0705 0.3497 0.4809 0.5391 0.5072 0.842 0.3554 0.5285 0.186 0.733 0.5582 0.5143 0.6601 0.251 0.5723 0.5829 0.5294 0.6114 0.7267 0.7419 0.6917 0.5264 0.7468 0.0648 0.6078 0.6499 0.3385 0.4187 0.6062 0.5148 0.4372 0. 
0.4037 0.2898 0.0013 0.0166 0.6221 0.5545 0.3919 0.5782 0.5191 0.0095 0.734 0.5854 0.1577 0.6062 0.4362 0.8183 0.245 0.4791 0.2392 0.0787 0.2586 0.7908 0.7374 0.0384 0.663 0.5812 0.1069 0.7589 0.6811 0.6407 0.7936 0.6463 0.5376 0.685 0.8009 0.4948 0.573 0.4032 0.5637 0.451 0.1254 0.6299 0.6154 0.3526 0.2694 0.1935 0.3987 0.736 0.2377 0.0227 0.5622 0.5534 0.5804 0. 0.8585 0.2889 0.5124 0.3343] 2022-08-24 22:49:00 [INFO] [EVAL] Class Recall: [0.8243 0.9142 0.9628 0.846 0.7973 0.8446 0.86 0.8829 0.6916 0.7701 0.5894 0.7137 0.8435 0.4439 0.2968 0.553 0.6405 0.5383 0.6355 0.5254 0.8576 0.5995 0.7255 0.6724 0.5256 0.3951 0.5394 0.4754 0.481 0.4749 0.2797 0.6003 0.4276 0.5063 0.5366 0.6074 0.4799 0.522 0.3739 0.3692 0.1295 0.116 0.4398 0.2976 0.5351 0.4016 0.4146 0.5706 0.9132 0.6985 0.6552 0.6065 0.2665 0.1863 0.8393 0.6646 0.9539 0.331 0.588 0.3587 0.0616 0.2218 0.4464 0.1718 0.613 0.7484 0.3282 0.531 0.1606 0.365 0.4229 0.6192 0.4732 0.4444 0.5782 0.3936 0.4451 0.2359 0.2333 0.3271 0.8346 0.3474 0.2965 0.017 0.5922 0.6279 0.0808 0.0854 0.5011 0.5691 0.6267 0. 0.2662 0.0851 0.0002 0.0079 0.1491 0.1115 0.2617 0.5907 0.0764 0.0067 0.2531 0.8228 0.0044 0.6428 0.2244 0.5881 0.1276 0.5553 0.0761 0.4161 0.103 0.5921 0.9884 0.0011 0.4859 0.9468 0.0742 0.2756 0.5236 0.0083 0.1969 0.1162 0.334 0.1465 0.4608 0.4878 0.6942 0.248 0.8013 0.0221 0.0147 0.3003 0.0691 0.1416 0.0712 0.0214 0.1743 0.3156 0.2252 0.0211 0.4029 0.591 0.4304 0. 0.2894 0.0181 0.0903 0.027 ] 2022-08-24 22:49:00 [INFO] [EVAL] The model with the best validation mIoU (0.3147) was saved at iter 93000. 2022-08-24 22:49:10 [INFO] [TRAIN] epoch: 86, iter: 108050/160000, loss: 0.6985, lr: 0.000393, batch_cost: 0.1925, reader_cost: 0.01585, ips: 41.5563 samples/sec | ETA 02:46:40 2022-08-24 22:49:23 [INFO] [TRAIN] epoch: 86, iter: 108100/160000, loss: 0.7626, lr: 0.000393, batch_cost: 0.2548, reader_cost: 0.00194, ips: 31.3971 samples/sec | ETA 03:40:24 2022-08-24 22:49:33 [INFO] [TRAIN] epoch: 86, iter: 108150/160000, loss: 0.7500, lr: 0.000393, batch_cost: 0.1988, reader_cost: 0.00051, ips: 40.2341 samples/sec | ETA 02:51:49 2022-08-24 22:49:43 [INFO] [TRAIN] epoch: 86, iter: 108200/160000, loss: 0.7188, lr: 0.000392, batch_cost: 0.2085, reader_cost: 0.00038, ips: 38.3650 samples/sec | ETA 03:00:01 2022-08-24 22:49:54 [INFO] [TRAIN] epoch: 86, iter: 108250/160000, loss: 0.7601, lr: 0.000392, batch_cost: 0.2146, reader_cost: 0.00843, ips: 37.2842 samples/sec | ETA 03:05:03 2022-08-24 22:50:04 [INFO] [TRAIN] epoch: 86, iter: 108300/160000, loss: 0.7334, lr: 0.000391, batch_cost: 0.2078, reader_cost: 0.00412, ips: 38.4975 samples/sec | ETA 02:59:03 2022-08-24 22:50:15 [INFO] [TRAIN] epoch: 86, iter: 108350/160000, loss: 0.7348, lr: 0.000391, batch_cost: 0.2101, reader_cost: 0.00040, ips: 38.0796 samples/sec | ETA 03:00:50 2022-08-24 22:50:25 [INFO] [TRAIN] epoch: 86, iter: 108400/160000, loss: 0.7072, lr: 0.000391, batch_cost: 0.2055, reader_cost: 0.00367, ips: 38.9337 samples/sec | ETA 02:56:42 2022-08-24 22:50:35 [INFO] [TRAIN] epoch: 86, iter: 108450/160000, loss: 0.6641, lr: 0.000390, batch_cost: 0.2042, reader_cost: 0.00113, ips: 39.1862 samples/sec | ETA 02:55:24 2022-08-24 22:50:47 [INFO] [TRAIN] epoch: 86, iter: 108500/160000, loss: 0.7301, lr: 0.000390, batch_cost: 0.2386, reader_cost: 0.00162, ips: 33.5291 samples/sec | ETA 03:24:47 2022-08-24 22:50:57 [INFO] [TRAIN] epoch: 86, iter: 108550/160000, loss: 0.7110, lr: 0.000390, batch_cost: 0.1991, reader_cost: 0.00058, ips: 40.1795 samples/sec | ETA 02:50:44 2022-08-24 22:51:08 [INFO] 
[TRAIN] epoch: 86, iter: 108600/160000, loss: 0.7259, lr: 0.000389, batch_cost: 0.2149, reader_cost: 0.00053, ips: 37.2263 samples/sec | ETA 03:04:05 2022-08-24 22:51:20 [INFO] [TRAIN] epoch: 87, iter: 108650/160000, loss: 0.6597, lr: 0.000389, batch_cost: 0.2524, reader_cost: 0.04521, ips: 31.6960 samples/sec | ETA 03:36:00 2022-08-24 22:51:32 [INFO] [TRAIN] epoch: 87, iter: 108700/160000, loss: 0.7284, lr: 0.000388, batch_cost: 0.2251, reader_cost: 0.00137, ips: 35.5380 samples/sec | ETA 03:12:28 2022-08-24 22:51:42 [INFO] [TRAIN] epoch: 87, iter: 108750/160000, loss: 0.7491, lr: 0.000388, batch_cost: 0.2070, reader_cost: 0.00045, ips: 38.6543 samples/sec | ETA 02:56:46 2022-08-24 22:51:52 [INFO] [TRAIN] epoch: 87, iter: 108800/160000, loss: 0.6995, lr: 0.000388, batch_cost: 0.1995, reader_cost: 0.01090, ips: 40.1078 samples/sec | ETA 02:50:12 2022-08-24 22:52:02 [INFO] [TRAIN] epoch: 87, iter: 108850/160000, loss: 0.7401, lr: 0.000387, batch_cost: 0.2077, reader_cost: 0.00741, ips: 38.5208 samples/sec | ETA 02:57:02 2022-08-24 22:52:16 [INFO] [TRAIN] epoch: 87, iter: 108900/160000, loss: 0.7191, lr: 0.000387, batch_cost: 0.2729, reader_cost: 0.00038, ips: 29.3133 samples/sec | ETA 03:52:25 2022-08-24 22:52:26 [INFO] [TRAIN] epoch: 87, iter: 108950/160000, loss: 0.7184, lr: 0.000387, batch_cost: 0.2019, reader_cost: 0.01975, ips: 39.6147 samples/sec | ETA 02:51:49 2022-08-24 22:52:37 [INFO] [TRAIN] epoch: 87, iter: 109000/160000, loss: 0.7282, lr: 0.000386, batch_cost: 0.2272, reader_cost: 0.00386, ips: 35.2158 samples/sec | ETA 03:13:05 2022-08-24 22:52:37 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 199s - batch_cost: 0.1986 - reader cost: 0.0010 2022-08-24 22:55:56 [INFO] [EVAL] #Images: 2000 mIoU: 0.3157 Acc: 0.7455 Kappa: 0.7259 Dice: 0.4423 2022-08-24 22:55:56 [INFO] [EVAL] Class IoU: [0.6549 0.7636 0.9249 0.7031 0.6652 0.7415 0.7621 0.7447 0.4754 0.6339 0.4506 0.5272 0.6573 0.2813 0.2135 0.3715 0.4693 0.4138 0.5456 0.3647 0.7256 0.4282 0.5461 0.4336 0.2743 0.3632 0.358 0.3901 0.3575 0.2895 0.2089 0.3853 0.2636 0.2714 0.3313 0.3888 0.3527 0.5011 0.238 0.2739 0.0761 0.0857 0.2819 0.2282 0.2683 0.2798 0.248 0.4287 0.6823 0.4738 0.4657 0.232 0.2357 0.1652 0.6128 0.4745 0.8422 0.2585 0.4496 0.2223 0.0531 0.176 0.2829 0.1881 0.383 0.6574 0.2268 0.3573 0.0606 0.348 0.325 0.3971 0.3809 0.2199 0.4203 0.3011 0.3042 0.1999 0.1699 0.3495 0.6003 0.2824 0.2699 0.0073 0.4349 0.4643 0.068 0.0656 0.4064 0.3697 0.3483 0.0006 0.2389 0.0809 0. 0.0102 0.1814 0.0745 0.2355 0.3954 0.0377 0.0243 0.2095 0.6384 0.0085 0.5139 0.206 0.5397 0.1191 0.3405 0.047 0.1241 0.1021 0.5136 0.7188 0.006 0.2968 0.5761 0.0576 0.1227 0.5133 0.0086 0.2135 0.1298 0.2272 0.2315 0.428 0.3053 0.5041 0.2633 0.528 0.0173 0.0748 0.2258 0.0953 0.1164 0.0544 0.0154 0.1455 0.3215 0.1508 0.0306 0.3011 0.4485 0.3122 0. 
0.2778 0.0066 0.0825 0.0237] 2022-08-24 22:55:56 [INFO] [EVAL] Class Precision: [0.7645 0.832 0.9644 0.8026 0.7579 0.8709 0.873 0.8043 0.6096 0.7344 0.6801 0.6757 0.7411 0.5082 0.4599 0.5412 0.6093 0.6042 0.6949 0.5665 0.8157 0.5983 0.6775 0.5966 0.5224 0.6411 0.5155 0.6401 0.6541 0.4735 0.3849 0.4983 0.4891 0.3789 0.4781 0.5211 0.6042 0.7791 0.4383 0.4761 0.1613 0.1925 0.4205 0.5319 0.3609 0.4927 0.3873 0.7103 0.7575 0.59 0.5796 0.2655 0.4455 0.5619 0.6925 0.7206 0.9141 0.6323 0.5267 0.3681 0.1065 0.3702 0.4086 0.5495 0.5001 0.8014 0.4402 0.4881 0.2345 0.6373 0.5368 0.5661 0.6429 0.2732 0.5928 0.5081 0.6038 0.6747 0.8162 0.6829 0.6844 0.5566 0.7661 0.0416 0.5509 0.6408 0.3268 0.4178 0.6409 0.4931 0.4516 0.0009 0.425 0.2869 0.0009 0.0328 0.5316 0.5536 0.4466 0.5637 0.4144 0.0538 0.7454 0.679 0.1153 0.7706 0.5148 0.9173 0.3148 0.5951 0.3063 0.1502 0.4666 0.7475 0.7248 0.076 0.7326 0.5983 0.1154 0.5072 0.7214 0.4963 0.5321 0.6169 0.6248 0.6648 0.8267 0.4137 0.5783 0.4122 0.6179 0.5322 0.3142 0.6835 0.5185 0.3248 0.2896 0.3875 0.3996 0.6671 0.2461 0.0604 0.5303 0.7136 0.5305 0. 0.8473 0.3418 0.5142 0.7956] 2022-08-24 22:55:56 [INFO] [EVAL] Class Recall: [0.8204 0.9028 0.9575 0.8501 0.8448 0.8331 0.8571 0.9096 0.6835 0.8226 0.5718 0.7057 0.8532 0.3866 0.285 0.5423 0.6714 0.5676 0.7174 0.5058 0.8679 0.6009 0.738 0.6135 0.3662 0.4558 0.5395 0.4996 0.4409 0.4269 0.3137 0.6294 0.3637 0.489 0.5191 0.605 0.4588 0.5841 0.3425 0.3921 0.126 0.1338 0.461 0.2856 0.511 0.393 0.4082 0.5195 0.8729 0.7064 0.7034 0.6477 0.3337 0.1897 0.842 0.5814 0.9146 0.3043 0.7545 0.3595 0.0958 0.2513 0.479 0.2224 0.6205 0.7853 0.3188 0.5714 0.0756 0.4339 0.4516 0.5708 0.483 0.5301 0.5909 0.425 0.3801 0.2212 0.1767 0.4173 0.8301 0.3644 0.2941 0.0088 0.6737 0.6276 0.0791 0.0723 0.5263 0.5963 0.6035 0.002 0.3529 0.1013 0. 0.0146 0.2159 0.0792 0.3325 0.5697 0.0399 0.0424 0.2256 0.9144 0.0091 0.6067 0.2556 0.5673 0.1608 0.4432 0.0526 0.4161 0.1156 0.6214 0.9887 0.0065 0.3329 0.9396 0.1032 0.1393 0.6402 0.0086 0.2628 0.1411 0.2631 0.2621 0.4702 0.5383 0.7972 0.4216 0.784 0.0175 0.0894 0.2522 0.1045 0.1536 0.0628 0.0157 0.1863 0.383 0.2804 0.0585 0.4106 0.547 0.4315 0. 0.2924 0.0067 0.0895 0.0238] 2022-08-24 22:55:56 [INFO] [EVAL] The model with the best validation mIoU (0.3157) was saved at iter 109000. 
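A note for reading the eval blocks above: per class, IoU, Precision and Recall are ratios of the same confusion-matrix counts (IoU = TP/(TP+FP+FN), Precision = TP/(TP+FP), Recall = TP/(TP+FN)), so the three printed arrays are mutually consistent and per-class Dice follows directly from per-class IoU. A minimal sanity-check sketch (not PaddleSeg code), verified against the first class of the iter-109000 eval just above:

    def iou_from_pr(precision: float, recall: float) -> float:
        # 1/IoU = 1/Precision + 1/Recall - 1, since all three share the same TP.
        return 1.0 / (1.0 / precision + 1.0 / recall - 1.0)

    def dice_from_iou(iou: float) -> float:
        # Per-class Dice (F1) implied by the class IoU.
        return 2.0 * iou / (1.0 + iou)

    # First class at iter 109000: Class Precision 0.7645, Class Recall 0.8204.
    print(round(iou_from_pr(0.7645, 0.8204), 4))  # 0.6549, matching Class IoU[0]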
2022-08-24 22:56:07 [INFO] [TRAIN] epoch: 87, iter: 109050/160000, loss: 0.7052, lr: 0.000386, batch_cost: 0.2068, reader_cost: 0.00337, ips: 38.6783 samples/sec | ETA 02:55:38 2022-08-24 22:56:16 [INFO] [TRAIN] epoch: 87, iter: 109100/160000, loss: 0.7145, lr: 0.000385, batch_cost: 0.1941, reader_cost: 0.00322, ips: 41.2064 samples/sec | ETA 02:44:41 2022-08-24 22:56:27 [INFO] [TRAIN] epoch: 87, iter: 109150/160000, loss: 0.7401, lr: 0.000385, batch_cost: 0.2097, reader_cost: 0.00035, ips: 38.1494 samples/sec | ETA 02:57:43 2022-08-24 22:56:39 [INFO] [TRAIN] epoch: 87, iter: 109200/160000, loss: 0.6690, lr: 0.000385, batch_cost: 0.2377, reader_cost: 0.00059, ips: 33.6599 samples/sec | ETA 03:21:13 2022-08-24 22:56:50 [INFO] [TRAIN] epoch: 87, iter: 109250/160000, loss: 0.6999, lr: 0.000384, batch_cost: 0.2217, reader_cost: 0.00074, ips: 36.0767 samples/sec | ETA 03:07:33 2022-08-24 22:57:01 [INFO] [TRAIN] epoch: 87, iter: 109300/160000, loss: 0.6832, lr: 0.000384, batch_cost: 0.2294, reader_cost: 0.00083, ips: 34.8693 samples/sec | ETA 03:13:52 2022-08-24 22:57:12 [INFO] [TRAIN] epoch: 87, iter: 109350/160000, loss: 0.6830, lr: 0.000383, batch_cost: 0.2162, reader_cost: 0.00417, ips: 37.0074 samples/sec | ETA 03:02:29 2022-08-24 22:57:24 [INFO] [TRAIN] epoch: 87, iter: 109400/160000, loss: 0.7309, lr: 0.000383, batch_cost: 0.2291, reader_cost: 0.00103, ips: 34.9197 samples/sec | ETA 03:13:12 2022-08-24 22:57:34 [INFO] [TRAIN] epoch: 87, iter: 109450/160000, loss: 0.7198, lr: 0.000383, batch_cost: 0.2162, reader_cost: 0.00071, ips: 37.0103 samples/sec | ETA 03:02:06 2022-08-24 22:57:43 [INFO] [TRAIN] epoch: 87, iter: 109500/160000, loss: 0.7396, lr: 0.000382, batch_cost: 0.1798, reader_cost: 0.01036, ips: 44.5046 samples/sec | ETA 02:31:17 2022-08-24 22:57:54 [INFO] [TRAIN] epoch: 87, iter: 109550/160000, loss: 0.7468, lr: 0.000382, batch_cost: 0.2136, reader_cost: 0.00101, ips: 37.4504 samples/sec | ETA 02:59:36 2022-08-24 22:58:04 [INFO] [TRAIN] epoch: 87, iter: 109600/160000, loss: 0.7061, lr: 0.000382, batch_cost: 0.2063, reader_cost: 0.00088, ips: 38.7713 samples/sec | ETA 02:53:19 2022-08-24 22:58:15 [INFO] [TRAIN] epoch: 87, iter: 109650/160000, loss: 0.7533, lr: 0.000381, batch_cost: 0.2114, reader_cost: 0.02504, ips: 37.8420 samples/sec | ETA 02:57:24 2022-08-24 22:58:26 [INFO] [TRAIN] epoch: 87, iter: 109700/160000, loss: 0.7376, lr: 0.000381, batch_cost: 0.2187, reader_cost: 0.00068, ips: 36.5867 samples/sec | ETA 03:03:18 2022-08-24 22:58:37 [INFO] [TRAIN] epoch: 87, iter: 109750/160000, loss: 0.7369, lr: 0.000380, batch_cost: 0.2124, reader_cost: 0.00071, ips: 37.6729 samples/sec | ETA 02:57:50 2022-08-24 22:58:49 [INFO] [TRAIN] epoch: 87, iter: 109800/160000, loss: 0.7315, lr: 0.000380, batch_cost: 0.2542, reader_cost: 0.00061, ips: 31.4774 samples/sec | ETA 03:32:38 2022-08-24 22:59:03 [INFO] [TRAIN] epoch: 87, iter: 109850/160000, loss: 0.7180, lr: 0.000380, batch_cost: 0.2739, reader_cost: 0.00067, ips: 29.2102 samples/sec | ETA 03:48:54 2022-08-24 22:59:19 [INFO] [TRAIN] epoch: 88, iter: 109900/160000, loss: 0.7127, lr: 0.000379, batch_cost: 0.3149, reader_cost: 0.03786, ips: 25.4018 samples/sec | ETA 04:22:58 2022-08-24 22:59:31 [INFO] [TRAIN] epoch: 88, iter: 109950/160000, loss: 0.6780, lr: 0.000379, batch_cost: 0.2549, reader_cost: 0.00213, ips: 31.3824 samples/sec | ETA 03:32:38 2022-08-24 22:59:41 [INFO] [TRAIN] epoch: 88, iter: 110000/160000, loss: 0.6716, lr: 0.000379, batch_cost: 0.1972, reader_cost: 0.00202, ips: 40.5583 samples/sec | ETA 02:44:22 2022-08-24 
22:59:41 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 181s - batch_cost: 0.1806 - reader cost: 6.2415e-04 2022-08-24 23:02:42 [INFO] [EVAL] #Images: 2000 mIoU: 0.3170 Acc: 0.7481 Kappa: 0.7285 Dice: 0.4441 2022-08-24 23:02:42 [INFO] [EVAL] Class IoU: [0.6578 0.7657 0.9256 0.7072 0.6702 0.7433 0.7697 0.7457 0.4823 0.6272 0.4597 0.5375 0.6621 0.3308 0.1865 0.3795 0.4738 0.4108 0.5547 0.3648 0.716 0.4279 0.545 0.4333 0.3301 0.3533 0.3718 0.3716 0.3627 0.308 0.2154 0.3977 0.2657 0.2831 0.2829 0.3777 0.3595 0.4888 0.2586 0.2546 0.0915 0.0801 0.3037 0.2133 0.291 0.2323 0.2493 0.4238 0.6667 0.4624 0.4864 0.2268 0.2346 0.196 0.6138 0.4974 0.8507 0.2861 0.4521 0.2371 0.043 0.1563 0.3007 0.1297 0.3817 0.6579 0.2385 0.3531 0.0729 0.359 0.35 0.4348 0.3737 0.2052 0.4198 0.3059 0.3156 0.2172 0.258 0.3648 0.6319 0.2756 0.2579 0.0164 0.3957 0.4555 0.0706 0.072 0.3635 0.4122 0.377 0.0006 0.1829 0.0857 0.0055 0.007 0.1864 0.0833 0.1975 0.2929 0.0475 0.0066 0.1637 0.6711 0.0077 0.5016 0.2167 0.5172 0.1063 0.3698 0.0542 0.2861 0.0789 0.5175 0.6785 0.0048 0.3458 0.5559 0.0462 0.1484 0.4892 0.0086 0.243 0.1222 0.2227 0.2008 0.4073 0.2937 0.4551 0.2271 0.5479 0.0134 0.0937 0.2463 0.078 0.1082 0.0583 0.0179 0.1364 0.2985 0.1698 0.0578 0.2816 0.4498 0.3429 0. 0.268 0.0137 0.0786 0.0305] 2022-08-24 23:02:42 [INFO] [EVAL] Class Precision: [0.7573 0.8356 0.9608 0.8069 0.7663 0.8613 0.8908 0.8151 0.6068 0.758 0.6656 0.6655 0.7537 0.4986 0.5028 0.5587 0.6281 0.6485 0.7214 0.5723 0.8006 0.6133 0.6747 0.6001 0.5335 0.5998 0.5254 0.7434 0.6338 0.5123 0.3562 0.5312 0.4784 0.4073 0.4655 0.4704 0.615 0.7431 0.5049 0.5099 0.15 0.2324 0.4915 0.5246 0.4238 0.4915 0.3995 0.7051 0.7082 0.5724 0.633 0.2649 0.4223 0.5369 0.7034 0.6903 0.8986 0.6119 0.682 0.4212 0.0707 0.3468 0.4679 0.6336 0.4869 0.7941 0.3923 0.526 0.3292 0.629 0.5489 0.5911 0.6577 0.2733 0.5814 0.5026 0.4794 0.5186 0.7717 0.5248 0.7508 0.5462 0.7577 0.0823 0.5411 0.6835 0.383 0.3667 0.6027 0.5926 0.4871 0.0008 0.3091 0.2836 0.0377 0.0176 0.5966 0.4329 0.4559 0.5727 0.5256 0.0187 0.725 0.7503 0.0668 0.7316 0.4879 0.9419 0.2681 0.5403 0.3599 0.4789 0.4799 0.5753 0.6827 0.0908 0.657 0.5874 0.1123 0.7017 0.7691 0.3138 0.6675 0.6334 0.6409 0.6287 0.8504 0.4001 0.5642 0.5298 0.6524 0.7297 0.4179 0.6793 0.5801 0.3332 0.3044 0.065 0.443 0.7216 0.291 0.0993 0.6339 0.7152 0.743 0. 0.8148 0.3617 0.5922 0.5528] 2022-08-24 23:02:42 [INFO] [EVAL] Class Recall: [0.8335 0.9015 0.9619 0.8512 0.8424 0.8444 0.8499 0.8976 0.7016 0.7841 0.5978 0.7365 0.845 0.4957 0.2287 0.542 0.6585 0.5284 0.706 0.5016 0.8714 0.586 0.7392 0.6092 0.464 0.4623 0.5597 0.4262 0.4589 0.4358 0.3527 0.6128 0.3742 0.4814 0.4191 0.6573 0.4639 0.5883 0.3464 0.3371 0.1902 0.1089 0.4428 0.2644 0.4817 0.3059 0.3986 0.5151 0.9192 0.7065 0.6775 0.6125 0.3454 0.2358 0.8281 0.6404 0.941 0.3496 0.5728 0.3517 0.0987 0.2215 0.4569 0.1402 0.6384 0.7932 0.3784 0.5179 0.0856 0.4555 0.4913 0.6218 0.4639 0.4517 0.6017 0.4386 0.4803 0.272 0.2793 0.5448 0.7996 0.3574 0.2811 0.0201 0.5955 0.5773 0.0797 0.0823 0.478 0.5752 0.6251 0.0018 0.3093 0.1093 0.0064 0.0115 0.2133 0.0935 0.2583 0.3747 0.0496 0.0101 0.1745 0.8641 0.0087 0.6147 0.2804 0.5342 0.1497 0.5396 0.0599 0.4154 0.0863 0.8374 0.9911 0.005 0.422 0.9122 0.0728 0.1584 0.5734 0.0088 0.2765 0.1315 0.2545 0.2278 0.4388 0.525 0.7018 0.2845 0.7738 0.0135 0.1078 0.2787 0.0827 0.1381 0.0673 0.0241 0.1646 0.3373 0.2897 0.1216 0.3363 0.5479 0.3891 0. 
0.2854 0.014 0.0831 0.0313] 2022-08-24 23:02:42 [INFO] [EVAL] The model with the best validation mIoU (0.3170) was saved at iter 110000. 2022-08-24 23:02:53 [INFO] [TRAIN] epoch: 88, iter: 110050/160000, loss: 0.7464, lr: 0.000378, batch_cost: 0.2101, reader_cost: 0.00437, ips: 38.0686 samples/sec | ETA 02:54:56 2022-08-24 23:03:04 [INFO] [TRAIN] epoch: 88, iter: 110100/160000, loss: 0.7201, lr: 0.000378, batch_cost: 0.2273, reader_cost: 0.00131, ips: 35.1978 samples/sec | ETA 03:09:01 2022-08-24 23:03:15 [INFO] [TRAIN] epoch: 88, iter: 110150/160000, loss: 0.6910, lr: 0.000377, batch_cost: 0.2066, reader_cost: 0.00033, ips: 38.7215 samples/sec | ETA 02:51:39 2022-08-24 23:03:24 [INFO] [TRAIN] epoch: 88, iter: 110200/160000, loss: 0.7268, lr: 0.000377, batch_cost: 0.1816, reader_cost: 0.00068, ips: 44.0415 samples/sec | ETA 02:30:46 2022-08-24 23:03:35 [INFO] [TRAIN] epoch: 88, iter: 110250/160000, loss: 0.7170, lr: 0.000377, batch_cost: 0.2230, reader_cost: 0.00061, ips: 35.8697 samples/sec | ETA 03:04:55 2022-08-24 23:03:45 [INFO] [TRAIN] epoch: 88, iter: 110300/160000, loss: 0.7316, lr: 0.000376, batch_cost: 0.1986, reader_cost: 0.02164, ips: 40.2916 samples/sec | ETA 02:44:28 2022-08-24 23:03:55 [INFO] [TRAIN] epoch: 88, iter: 110350/160000, loss: 0.6904, lr: 0.000376, batch_cost: 0.2100, reader_cost: 0.00048, ips: 38.1015 samples/sec | ETA 02:53:44 2022-08-24 23:04:07 [INFO] [TRAIN] epoch: 88, iter: 110400/160000, loss: 0.7084, lr: 0.000376, batch_cost: 0.2432, reader_cost: 0.00061, ips: 32.8910 samples/sec | ETA 03:21:04 2022-08-24 23:04:17 [INFO] [TRAIN] epoch: 88, iter: 110450/160000, loss: 0.7020, lr: 0.000375, batch_cost: 0.1911, reader_cost: 0.00093, ips: 41.8587 samples/sec | ETA 02:37:49 2022-08-24 23:04:27 [INFO] [TRAIN] epoch: 88, iter: 110500/160000, loss: 0.7266, lr: 0.000375, batch_cost: 0.1983, reader_cost: 0.00044, ips: 40.3488 samples/sec | ETA 02:43:34 2022-08-24 23:04:37 [INFO] [TRAIN] epoch: 88, iter: 110550/160000, loss: 0.6576, lr: 0.000374, batch_cost: 0.2058, reader_cost: 0.00150, ips: 38.8688 samples/sec | ETA 02:49:37 2022-08-24 23:04:48 [INFO] [TRAIN] epoch: 88, iter: 110600/160000, loss: 0.6989, lr: 0.000374, batch_cost: 0.2112, reader_cost: 0.00041, ips: 37.8722 samples/sec | ETA 02:53:55 2022-08-24 23:05:00 [INFO] [TRAIN] epoch: 88, iter: 110650/160000, loss: 0.7427, lr: 0.000374, batch_cost: 0.2343, reader_cost: 0.00081, ips: 34.1432 samples/sec | ETA 03:12:43 2022-08-24 23:05:09 [INFO] [TRAIN] epoch: 88, iter: 110700/160000, loss: 0.6940, lr: 0.000373, batch_cost: 0.1865, reader_cost: 0.00578, ips: 42.8977 samples/sec | ETA 02:33:13 2022-08-24 23:05:19 [INFO] [TRAIN] epoch: 88, iter: 110750/160000, loss: 0.6752, lr: 0.000373, batch_cost: 0.2094, reader_cost: 0.00482, ips: 38.2132 samples/sec | ETA 02:51:50 2022-08-24 23:05:29 [INFO] [TRAIN] epoch: 88, iter: 110800/160000, loss: 0.7165, lr: 0.000372, batch_cost: 0.1978, reader_cost: 0.00074, ips: 40.4489 samples/sec | ETA 02:42:10 2022-08-24 23:05:41 [INFO] [TRAIN] epoch: 88, iter: 110850/160000, loss: 0.7139, lr: 0.000372, batch_cost: 0.2373, reader_cost: 0.00040, ips: 33.7193 samples/sec | ETA 03:14:20 2022-08-24 23:05:53 [INFO] [TRAIN] epoch: 88, iter: 110900/160000, loss: 0.6806, lr: 0.000372, batch_cost: 0.2418, reader_cost: 0.00810, ips: 33.0815 samples/sec | ETA 03:17:53 2022-08-24 23:06:06 [INFO] [TRAIN] epoch: 88, iter: 110950/160000, loss: 0.7359, lr: 0.000371, batch_cost: 0.2497, reader_cost: 0.00120, ips: 32.0385 samples/sec | ETA 03:24:07 2022-08-24 23:06:17 [INFO] [TRAIN] epoch: 88, iter: 
111000/160000, loss: 0.6717, lr: 0.000371, batch_cost: 0.2259, reader_cost: 0.00044, ips: 35.4097 samples/sec | ETA 03:04:30 2022-08-24 23:06:17 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 191s - batch_cost: 0.1911 - reader cost: 7.1525e-04 2022-08-24 23:09:28 [INFO] [EVAL] #Images: 2000 mIoU: 0.3148 Acc: 0.7465 Kappa: 0.7272 Dice: 0.4406 2022-08-24 23:09:28 [INFO] [EVAL] Class IoU: [0.6589 0.7689 0.926 0.7045 0.6684 0.7502 0.758 0.7605 0.4802 0.6167 0.457 0.5084 0.6514 0.2892 0.2128 0.3763 0.4776 0.4025 0.5513 0.3729 0.7199 0.4193 0.5471 0.4622 0.3143 0.3883 0.3667 0.3903 0.373 0.3211 0.2133 0.3504 0.2817 0.2683 0.3627 0.3712 0.3635 0.4768 0.247 0.2539 0.0914 0.0713 0.2968 0.2225 0.2951 0.2553 0.2467 0.4191 0.6892 0.4563 0.477 0.2095 0.2278 0.1764 0.5976 0.4267 0.8245 0.3688 0.4255 0.2232 0.0403 0.1585 0.2951 0.1386 0.3598 0.6541 0.2154 0.3658 0.0886 0.3374 0.3378 0.4213 0.3777 0.1982 0.4158 0.3148 0.3268 0.2065 0.1295 0.234 0.6048 0.2871 0.2527 0.0081 0.4689 0.4696 0.0509 0.0633 0.3461 0.3953 0.3897 0.0037 0.1864 0.0767 0.007 0.0114 0.1736 0.0866 0.1761 0.388 0.0644 0.008 0.2095 0.7404 0.0046 0.5419 0.192 0.4596 0.106 0.3302 0.0546 0.1808 0.1036 0.5961 0.7604 0.0082 0.4122 0.5819 0.0352 0.24 0.4659 0.0073 0.2641 0.0919 0.2431 0.2128 0.4519 0.2857 0.4713 0.262 0.5193 0.0173 0.0428 0.2122 0.0713 0.106 0.0627 0.0207 0.127 0.3241 0.1065 0.0962 0.3012 0.2164 0.2766 0.0001 0.2945 0.0073 0.0747 0.0177] 2022-08-24 23:09:28 [INFO] [EVAL] Class Precision: [0.7748 0.8376 0.9593 0.8059 0.7751 0.8416 0.8487 0.8405 0.6141 0.7611 0.6771 0.6757 0.7287 0.4708 0.4614 0.5642 0.6658 0.6168 0.7143 0.5557 0.8051 0.5904 0.6895 0.5918 0.5441 0.5477 0.5231 0.6577 0.5888 0.4734 0.4052 0.4705 0.451 0.4179 0.4739 0.4927 0.5897 0.7657 0.5359 0.5217 0.1652 0.1973 0.4858 0.52 0.3835 0.4276 0.3802 0.6364 0.8049 0.5422 0.6204 0.2389 0.4029 0.5897 0.7166 0.6236 0.8695 0.5591 0.6543 0.3863 0.089 0.381 0.4097 0.5878 0.4342 0.8175 0.3241 0.5448 0.2816 0.5678 0.5127 0.5733 0.6081 0.2703 0.5901 0.4902 0.5128 0.497 0.8043 0.5964 0.6914 0.5888 0.7793 0.0557 0.6656 0.6567 0.325 0.4185 0.5017 0.5416 0.5169 0.0049 0.3378 0.2915 0.0377 0.0464 0.6741 0.4252 0.319 0.5843 0.6415 0.0156 0.7408 0.8261 0.0887 0.8279 0.4292 0.8727 0.2297 0.4923 0.2689 0.2438 0.4209 0.7453 0.7683 0.1365 0.6481 0.614 0.0869 0.7252 0.8233 0.4877 0.7246 0.6847 0.5859 0.6967 0.7388 0.3882 0.5971 0.444 0.6235 0.6159 0.3239 0.7628 0.5931 0.2832 0.299 0.1131 0.4073 0.6813 0.2813 0.1336 0.563 0.7347 0.4365 0.0001 0.9204 0.3156 0.5549 0.8367] 2022-08-24 23:09:28 [INFO] [EVAL] Class Recall: [0.815 0.9037 0.9638 0.8486 0.8292 0.8735 0.8764 0.8887 0.6876 0.7648 0.5843 0.6724 0.86 0.4285 0.2831 0.5305 0.6283 0.5368 0.7073 0.5314 0.8719 0.5914 0.7259 0.6786 0.4266 0.5715 0.5509 0.4898 0.5045 0.4996 0.3105 0.5784 0.4288 0.4285 0.6072 0.6009 0.4866 0.5583 0.3142 0.331 0.1697 0.1003 0.4326 0.28 0.5616 0.3879 0.4126 0.551 0.8274 0.7421 0.6736 0.6301 0.344 0.2011 0.7826 0.5746 0.9409 0.5201 0.549 0.3457 0.0687 0.2135 0.5133 0.1536 0.6772 0.7658 0.391 0.5268 0.1145 0.454 0.4975 0.6137 0.4992 0.4264 0.5846 0.468 0.4739 0.2611 0.1337 0.278 0.8285 0.3591 0.2721 0.0094 0.6134 0.6224 0.0569 0.0694 0.5274 0.5941 0.6128 0.0142 0.2937 0.0942 0.0085 0.0149 0.1895 0.0981 0.2821 0.536 0.0668 0.0162 0.226 0.8772 0.0048 0.6107 0.2579 0.4927 0.1645 0.5007 0.0642 0.4117 0.1208 0.7486 0.9867 0.0087 0.5311 0.9176 0.056 0.264 0.5177 0.0073 0.2936 0.096 0.2935 0.2345 0.5378 0.5197 0.6911 0.39 0.7566 0.0175 0.047 0.2272 0.0749 0.1449 0.0735 0.0247 0.1558 
0.3821 0.1463 0.2555 0.3932 0.2348 0.4302 0.0009 0.3021 0.0074 0.0794 0.0177] 2022-08-24 23:09:28 [INFO] [EVAL] The model with the best validation mIoU (0.3170) was saved at iter 110000. 2022-08-24 23:09:39 [INFO] [TRAIN] epoch: 88, iter: 111050/160000, loss: 0.7746, lr: 0.000371, batch_cost: 0.2161, reader_cost: 0.00482, ips: 37.0174 samples/sec | ETA 02:56:18 2022-08-24 23:09:49 [INFO] [TRAIN] epoch: 88, iter: 111100/160000, loss: 0.6783, lr: 0.000370, batch_cost: 0.1971, reader_cost: 0.00581, ips: 40.5878 samples/sec | ETA 02:40:38 2022-08-24 23:10:01 [INFO] [TRAIN] epoch: 89, iter: 111150/160000, loss: 0.6973, lr: 0.000370, batch_cost: 0.2301, reader_cost: 0.03111, ips: 34.7604 samples/sec | ETA 03:07:22 2022-08-24 23:10:11 [INFO] [TRAIN] epoch: 89, iter: 111200/160000, loss: 0.7019, lr: 0.000369, batch_cost: 0.2026, reader_cost: 0.00759, ips: 39.4825 samples/sec | ETA 02:44:47 2022-08-24 23:10:21 [INFO] [TRAIN] epoch: 89, iter: 111250/160000, loss: 0.7316, lr: 0.000369, batch_cost: 0.2107, reader_cost: 0.01170, ips: 37.9713 samples/sec | ETA 02:51:10 2022-08-24 23:10:31 [INFO] [TRAIN] epoch: 89, iter: 111300/160000, loss: 0.6859, lr: 0.000369, batch_cost: 0.2004, reader_cost: 0.00059, ips: 39.9281 samples/sec | ETA 02:42:37 2022-08-24 23:10:43 [INFO] [TRAIN] epoch: 89, iter: 111350/160000, loss: 0.7634, lr: 0.000368, batch_cost: 0.2354, reader_cost: 0.00467, ips: 33.9777 samples/sec | ETA 03:10:54 2022-08-24 23:10:54 [INFO] [TRAIN] epoch: 89, iter: 111400/160000, loss: 0.7183, lr: 0.000368, batch_cost: 0.2114, reader_cost: 0.00882, ips: 37.8397 samples/sec | ETA 02:51:14 2022-08-24 23:11:04 [INFO] [TRAIN] epoch: 89, iter: 111450/160000, loss: 0.7620, lr: 0.000368, batch_cost: 0.2149, reader_cost: 0.00079, ips: 37.2326 samples/sec | ETA 02:53:51 2022-08-24 23:11:16 [INFO] [TRAIN] epoch: 89, iter: 111500/160000, loss: 0.6914, lr: 0.000367, batch_cost: 0.2231, reader_cost: 0.00035, ips: 35.8635 samples/sec | ETA 03:00:18 2022-08-24 23:11:26 [INFO] [TRAIN] epoch: 89, iter: 111550/160000, loss: 0.7499, lr: 0.000367, batch_cost: 0.2083, reader_cost: 0.00051, ips: 38.4023 samples/sec | ETA 02:48:13 2022-08-24 23:11:37 [INFO] [TRAIN] epoch: 89, iter: 111600/160000, loss: 0.6797, lr: 0.000366, batch_cost: 0.2146, reader_cost: 0.00046, ips: 37.2847 samples/sec | ETA 02:53:04 2022-08-24 23:11:47 [INFO] [TRAIN] epoch: 89, iter: 111650/160000, loss: 0.7469, lr: 0.000366, batch_cost: 0.2109, reader_cost: 0.00077, ips: 37.9300 samples/sec | ETA 02:49:57 2022-08-24 23:11:58 [INFO] [TRAIN] epoch: 89, iter: 111700/160000, loss: 0.6989, lr: 0.000366, batch_cost: 0.2080, reader_cost: 0.00081, ips: 38.4683 samples/sec | ETA 02:47:24 2022-08-24 23:12:07 [INFO] [TRAIN] epoch: 89, iter: 111750/160000, loss: 0.7341, lr: 0.000365, batch_cost: 0.1948, reader_cost: 0.00033, ips: 41.0778 samples/sec | ETA 02:36:36 2022-08-24 23:12:20 [INFO] [TRAIN] epoch: 89, iter: 111800/160000, loss: 0.7170, lr: 0.000365, batch_cost: 0.2611, reader_cost: 0.01859, ips: 30.6428 samples/sec | ETA 03:29:43 2022-08-24 23:12:32 [INFO] [TRAIN] epoch: 89, iter: 111850/160000, loss: 0.7656, lr: 0.000365, batch_cost: 0.2412, reader_cost: 0.01131, ips: 33.1642 samples/sec | ETA 03:13:34 2022-08-24 23:12:47 [INFO] [TRAIN] epoch: 89, iter: 111900/160000, loss: 0.6700, lr: 0.000364, batch_cost: 0.2858, reader_cost: 0.00054, ips: 27.9945 samples/sec | ETA 03:49:05 2022-08-24 23:13:00 [INFO] [TRAIN] epoch: 89, iter: 111950/160000, loss: 0.6978, lr: 0.000364, batch_cost: 0.2588, reader_cost: 0.02031, ips: 30.9106 samples/sec | ETA 03:27:15 
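The throughput fields in the [TRAIN] lines are internally consistent: ips is roughly batch_size / batch_cost and ETA is roughly (total_iters - iter) * batch_cost. A back-of-the-envelope sketch, assuming the per-process batch size of 8 that these numbers imply (not read from the config), checked against the iter-111950 entry above:

    import datetime

    BATCH_SIZE = 8       # assumption: implied by ips * batch_cost in this log
    TOTAL_ITERS = 160000

    def throughput(batch_cost: float) -> float:
        # samples/sec, i.e. the "ips" column
        return BATCH_SIZE / batch_cost

    def eta(cur_iter: int, batch_cost: float) -> str:
        secs = (TOTAL_ITERS - cur_iter) * batch_cost
        return str(datetime.timedelta(seconds=round(secs)))

    # iter 111950, batch_cost 0.2588 -> ips ~30.9, ETA ~3:27:15 (as logged)
    print(round(throughput(0.2588), 1), eta(111950, 0.2588))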
2022-08-24 23:13:11 [INFO] [TRAIN] epoch: 89, iter: 112000/160000, loss: 0.7134, lr: 0.000363, batch_cost: 0.2296, reader_cost: 0.01187, ips: 34.8467 samples/sec | ETA 03:03:39 2022-08-24 23:13:11 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 201s - batch_cost: 0.2014 - reader cost: 7.6295e-04 2022-08-24 23:16:33 [INFO] [EVAL] #Images: 2000 mIoU: 0.3168 Acc: 0.7459 Kappa: 0.7263 Dice: 0.4434 2022-08-24 23:16:33 [INFO] [EVAL] Class IoU: [0.6577 0.7719 0.9257 0.7045 0.6604 0.7349 0.765 0.7424 0.4802 0.6114 0.4631 0.5246 0.6617 0.2776 0.2318 0.3709 0.4845 0.399 0.5527 0.3518 0.7283 0.4228 0.5449 0.4135 0.3357 0.4134 0.3442 0.3823 0.3787 0.302 0.2405 0.3833 0.2332 0.2665 0.3748 0.3452 0.3595 0.4944 0.2661 0.2889 0.095 0.0553 0.3052 0.2243 0.298 0.2921 0.2659 0.424 0.6881 0.4647 0.4573 0.2308 0.2119 0.1813 0.6059 0.4212 0.836 0.2739 0.4711 0.2021 0.0613 0.1661 0.2931 0.1252 0.3763 0.6671 0.2372 0.3626 0.0872 0.3343 0.3503 0.4358 0.366 0.2179 0.4064 0.3153 0.3323 0.2183 0.1308 0.2266 0.6412 0.2803 0.2834 0.0244 0.4665 0.4776 0.0897 0.0611 0.3639 0.4203 0.3941 0.0003 0.177 0.0697 0.0204 0.0078 0.1773 0.0977 0.1738 0.367 0.1226 0.0184 0.1825 0.727 0.0042 0.4704 0.1852 0.4779 0.1043 0.3677 0.0807 0.0955 0.0943 0.5923 0.7122 0.0087 0.4033 0.5821 0.0492 0.0877 0.4255 0.0117 0.2472 0.1431 0.2458 0.1839 0.4468 0.3061 0.5331 0.2836 0.5207 0.0205 0.1643 0.2487 0.0755 0.1035 0.0499 0.0146 0.1258 0.3378 0.1686 0.0055 0.3306 0.2128 0.3233 0. 0.3285 0.011 0.0673 0.0238] 2022-08-24 23:16:33 [INFO] [EVAL] Class Precision: [0.76 0.8372 0.9595 0.815 0.7548 0.8785 0.8679 0.8095 0.6221 0.7559 0.6687 0.6757 0.7546 0.4673 0.4308 0.5965 0.6678 0.6324 0.7148 0.6013 0.8242 0.5897 0.6932 0.6229 0.517 0.6009 0.5173 0.7183 0.5686 0.4379 0.3219 0.4712 0.5312 0.3944 0.518 0.4758 0.5954 0.7478 0.4435 0.4403 0.1814 0.2185 0.5313 0.5563 0.3901 0.4497 0.4192 0.6786 0.754 0.5652 0.61 0.2673 0.3451 0.6388 0.6986 0.5429 0.8846 0.6671 0.6128 0.3699 0.1043 0.3749 0.4333 0.535 0.4609 0.833 0.3847 0.5428 0.2398 0.6532 0.6105 0.6489 0.6459 0.2534 0.5997 0.534 0.495 0.645 0.8097 0.4741 0.7607 0.5183 0.7545 0.1465 0.6813 0.6389 0.3288 0.4029 0.7337 0.6402 0.5384 0.0004 0.3304 0.2908 0.0656 0.0314 0.6742 0.4021 0.5302 0.5601 0.5804 0.0344 0.7082 0.8067 0.0487 0.6434 0.5331 0.8831 0.2499 0.5821 0.2488 0.1103 0.3873 0.7404 0.7164 0.08 0.6877 0.6238 0.1234 0.5209 0.6414 0.2885 0.6879 0.5818 0.6604 0.6568 0.724 0.4361 0.6498 0.4976 0.5796 0.5673 0.4415 0.6789 0.6753 0.2433 0.3121 0.1173 0.4098 0.6417 0.2927 0.0152 0.6282 0.6951 0.6535 0. 
0.8697 0.3011 0.4592 0.5579] 2022-08-24 23:16:33 [INFO] [EVAL] Class Recall: [0.8302 0.9082 0.9634 0.8386 0.8407 0.818 0.8657 0.8996 0.678 0.7619 0.6009 0.7011 0.8431 0.4061 0.334 0.4951 0.6383 0.5195 0.7091 0.4589 0.8623 0.5991 0.7181 0.5515 0.4891 0.5698 0.5071 0.4498 0.5314 0.4931 0.4873 0.6727 0.2937 0.4511 0.5756 0.5571 0.4758 0.5933 0.3995 0.4565 0.1663 0.0689 0.4178 0.2731 0.5578 0.4546 0.4209 0.5305 0.8874 0.7233 0.6463 0.6282 0.3543 0.202 0.8203 0.6526 0.9384 0.3172 0.6709 0.3082 0.1297 0.2298 0.4753 0.1405 0.6722 0.7701 0.3822 0.5221 0.1206 0.4064 0.4511 0.5703 0.4578 0.6086 0.5577 0.4349 0.5027 0.2481 0.1349 0.3026 0.8031 0.379 0.3122 0.0285 0.5967 0.6542 0.1098 0.0671 0.4193 0.5503 0.5952 0.0009 0.2759 0.084 0.0288 0.0102 0.1939 0.1143 0.2055 0.5156 0.1345 0.0382 0.1974 0.8803 0.0045 0.6362 0.221 0.5101 0.1518 0.4995 0.1067 0.4161 0.1109 0.7476 0.9919 0.0097 0.4937 0.8969 0.0757 0.0954 0.5583 0.0121 0.2784 0.1595 0.2814 0.2034 0.5385 0.5067 0.748 0.3974 0.8367 0.0208 0.2075 0.2818 0.0783 0.1527 0.0561 0.0164 0.1536 0.4163 0.2846 0.0087 0.4111 0.2347 0.3902 0. 0.3456 0.0113 0.0731 0.0243] 2022-08-24 23:16:33 [INFO] [EVAL] The model with the best validation mIoU (0.3170) was saved at iter 110000. 2022-08-24 23:16:43 [INFO] [TRAIN] epoch: 89, iter: 112050/160000, loss: 0.7045, lr: 0.000363, batch_cost: 0.1912, reader_cost: 0.00348, ips: 41.8383 samples/sec | ETA 02:32:48 2022-08-24 23:16:53 [INFO] [TRAIN] epoch: 89, iter: 112100/160000, loss: 0.7122, lr: 0.000363, batch_cost: 0.2193, reader_cost: 0.00118, ips: 36.4719 samples/sec | ETA 02:55:06 2022-08-24 23:17:03 [INFO] [TRAIN] epoch: 89, iter: 112150/160000, loss: 0.6860, lr: 0.000362, batch_cost: 0.1917, reader_cost: 0.00067, ips: 41.7214 samples/sec | ETA 02:32:55 2022-08-24 23:17:14 [INFO] [TRAIN] epoch: 89, iter: 112200/160000, loss: 0.6590, lr: 0.000362, batch_cost: 0.2092, reader_cost: 0.00066, ips: 38.2348 samples/sec | ETA 02:46:41 2022-08-24 23:17:25 [INFO] [TRAIN] epoch: 89, iter: 112250/160000, loss: 0.7294, lr: 0.000362, batch_cost: 0.2196, reader_cost: 0.00049, ips: 36.4260 samples/sec | ETA 02:54:47 2022-08-24 23:17:35 [INFO] [TRAIN] epoch: 89, iter: 112300/160000, loss: 0.7516, lr: 0.000361, batch_cost: 0.2118, reader_cost: 0.00117, ips: 37.7788 samples/sec | ETA 02:48:20 2022-08-24 23:17:46 [INFO] [TRAIN] epoch: 89, iter: 112350/160000, loss: 0.6972, lr: 0.000361, batch_cost: 0.2245, reader_cost: 0.00105, ips: 35.6297 samples/sec | ETA 02:58:18 2022-08-24 23:17:57 [INFO] [TRAIN] epoch: 89, iter: 112400/160000, loss: 0.6807, lr: 0.000360, batch_cost: 0.2047, reader_cost: 0.00050, ips: 39.0872 samples/sec | ETA 02:42:22 2022-08-24 23:18:09 [INFO] [TRAIN] epoch: 90, iter: 112450/160000, loss: 0.7630, lr: 0.000360, batch_cost: 0.2505, reader_cost: 0.03067, ips: 31.9411 samples/sec | ETA 03:18:29 2022-08-24 23:18:20 [INFO] [TRAIN] epoch: 90, iter: 112500/160000, loss: 0.7006, lr: 0.000360, batch_cost: 0.2260, reader_cost: 0.01195, ips: 35.4028 samples/sec | ETA 02:58:53 2022-08-24 23:18:30 [INFO] [TRAIN] epoch: 90, iter: 112550/160000, loss: 0.7359, lr: 0.000359, batch_cost: 0.1957, reader_cost: 0.00231, ips: 40.8834 samples/sec | ETA 02:34:44 2022-08-24 23:18:41 [INFO] [TRAIN] epoch: 90, iter: 112600/160000, loss: 0.7710, lr: 0.000359, batch_cost: 0.2224, reader_cost: 0.00544, ips: 35.9775 samples/sec | ETA 02:55:39 2022-08-24 23:18:55 [INFO] [TRAIN] epoch: 90, iter: 112650/160000, loss: 0.6817, lr: 0.000358, batch_cost: 0.2658, reader_cost: 0.00081, ips: 30.1001 samples/sec | ETA 03:29:44 2022-08-24 23:19:08 [INFO] 
[TRAIN] epoch: 90, iter: 112700/160000, loss: 0.7260, lr: 0.000358, batch_cost: 0.2622, reader_cost: 0.00935, ips: 30.5091 samples/sec | ETA 03:26:42 2022-08-24 23:19:20 [INFO] [TRAIN] epoch: 90, iter: 112750/160000, loss: 0.7292, lr: 0.000358, batch_cost: 0.2532, reader_cost: 0.00903, ips: 31.5917 samples/sec | ETA 03:19:25 2022-08-24 23:19:33 [INFO] [TRAIN] epoch: 90, iter: 112800/160000, loss: 0.7186, lr: 0.000357, batch_cost: 0.2466, reader_cost: 0.00554, ips: 32.4422 samples/sec | ETA 03:13:59 2022-08-24 23:19:46 [INFO] [TRAIN] epoch: 90, iter: 112850/160000, loss: 0.7102, lr: 0.000357, batch_cost: 0.2692, reader_cost: 0.00579, ips: 29.7149 samples/sec | ETA 03:31:33 2022-08-24 23:19:59 [INFO] [TRAIN] epoch: 90, iter: 112900/160000, loss: 0.7842, lr: 0.000357, batch_cost: 0.2607, reader_cost: 0.01303, ips: 30.6845 samples/sec | ETA 03:24:39 2022-08-24 23:20:10 [INFO] [TRAIN] epoch: 90, iter: 112950/160000, loss: 0.6715, lr: 0.000356, batch_cost: 0.2059, reader_cost: 0.00039, ips: 38.8622 samples/sec | ETA 02:41:25 2022-08-24 23:20:20 [INFO] [TRAIN] epoch: 90, iter: 113000/160000, loss: 0.7551, lr: 0.000356, batch_cost: 0.2086, reader_cost: 0.00124, ips: 38.3478 samples/sec | ETA 02:43:24 2022-08-24 23:20:20 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 173s - batch_cost: 0.1734 - reader cost: 8.4177e-04 2022-08-24 23:23:14 [INFO] [EVAL] #Images: 2000 mIoU: 0.3154 Acc: 0.7454 Kappa: 0.7257 Dice: 0.4424 2022-08-24 23:23:14 [INFO] [EVAL] Class IoU: [0.6564 0.7672 0.9251 0.7045 0.6687 0.7386 0.7653 0.7477 0.4792 0.5901 0.4575 0.5231 0.6565 0.2914 0.195 0.3739 0.4809 0.4006 0.5475 0.3816 0.718 0.4342 0.5637 0.4271 0.3446 0.3771 0.351 0.3806 0.3726 0.2956 0.2157 0.3787 0.2713 0.2788 0.309 0.3813 0.364 0.4805 0.2382 0.2654 0.0881 0.0543 0.2842 0.2122 0.3059 0.2608 0.2371 0.4158 0.6957 0.4838 0.4876 0.197 0.2238 0.2049 0.58 0.4566 0.8272 0.3189 0.4628 0.1928 0.0698 0.1549 0.308 0.0886 0.3646 0.6533 0.2091 0.3538 0.063 0.3395 0.3576 0.4156 0.3786 0.2078 0.4098 0.2953 0.2955 0.2184 0.1462 0.3979 0.6311 0.2758 0.2563 0.0138 0.4909 0.4789 0.0829 0.0702 0.3636 0.396 0.4008 0.0078 0.1864 0.0698 0.0075 0.0151 0.19 0.1126 0.1666 0.3441 0.1442 0.0254 0.197 0.7289 0.0157 0.4167 0.2296 0.5017 0.0875 0.333 0.0657 0.1377 0.0833 0.5749 0.6466 0.0018 0.3947 0.5898 0.0261 0.1733 0.4276 0.0059 0.218 0.1214 0.2424 0.1938 0.4349 0.3008 0.4714 0.2495 0.5428 0.0131 0.1478 0.2657 0.0868 0.101 0.0762 0.0163 0.126 0.2838 0.1575 0.067 0.242 0.404 0.3089 0. 
0.2576 0.0064 0.0653 0.0312] 2022-08-24 23:23:14 [INFO] [EVAL] Class Precision: [0.7568 0.8359 0.9618 0.8177 0.7637 0.8573 0.8718 0.8185 0.5874 0.7417 0.6782 0.6816 0.743 0.4796 0.4787 0.5678 0.6096 0.6829 0.7461 0.5436 0.8039 0.5923 0.7556 0.6208 0.5025 0.6619 0.5085 0.7335 0.6611 0.4053 0.3774 0.4744 0.4389 0.4054 0.4898 0.5108 0.6096 0.7095 0.5336 0.497 0.1431 0.1944 0.4319 0.5449 0.4279 0.4466 0.3854 0.6493 0.7727 0.6084 0.6219 0.2206 0.4258 0.5834 0.6928 0.6309 0.8622 0.6184 0.6999 0.3794 0.1173 0.3461 0.4883 0.5458 0.451 0.7819 0.3779 0.5317 0.1829 0.5779 0.5698 0.5199 0.6238 0.2891 0.6318 0.5062 0.4502 0.5072 0.525 0.5807 0.7434 0.5808 0.7697 0.0515 0.6721 0.6884 0.3544 0.3923 0.4572 0.5566 0.5385 0.0123 0.3106 0.3025 0.0332 0.0382 0.5246 0.4381 0.494 0.643 0.6516 0.047 0.6817 0.8076 0.1351 0.5469 0.5129 0.8379 0.2634 0.5462 0.2342 0.1708 0.4741 0.6933 0.6488 0.1052 0.6921 0.6162 0.1041 0.5355 0.7997 0.239 0.6481 0.6376 0.6044 0.6698 0.7932 0.428 0.6187 0.4041 0.6236 0.7399 0.481 0.6793 0.6457 0.2773 0.2903 0.2515 0.3737 0.7139 0.2335 0.1392 0.657 0.6974 0.5356 0. 0.9254 0.4104 0.4687 0.6562] 2022-08-24 23:23:14 [INFO] [EVAL] Class Recall: [0.8317 0.9033 0.9604 0.8358 0.8431 0.8421 0.8623 0.8963 0.7224 0.7427 0.5844 0.6923 0.8494 0.4262 0.2476 0.5228 0.6949 0.4921 0.6728 0.5615 0.8704 0.6194 0.6893 0.5779 0.5232 0.4671 0.5312 0.4417 0.4606 0.5221 0.3349 0.6525 0.4154 0.4717 0.4556 0.6005 0.4747 0.5982 0.3008 0.3628 0.1863 0.0701 0.4539 0.2579 0.5174 0.3852 0.3814 0.5362 0.8748 0.7027 0.693 0.6482 0.3206 0.24 0.7808 0.623 0.9532 0.397 0.5775 0.2815 0.1472 0.219 0.4548 0.0957 0.6555 0.7989 0.3189 0.514 0.0876 0.4515 0.4899 0.6746 0.4906 0.4251 0.5384 0.4147 0.4624 0.2772 0.1685 0.5584 0.807 0.3443 0.2775 0.0185 0.6455 0.6114 0.0977 0.0787 0.6397 0.5784 0.6105 0.0208 0.318 0.0831 0.0095 0.0245 0.2294 0.1316 0.2009 0.4254 0.1563 0.0523 0.217 0.8821 0.0175 0.6365 0.2936 0.5556 0.1159 0.4604 0.0836 0.4161 0.0918 0.7709 0.9947 0.0018 0.4788 0.9323 0.0336 0.204 0.479 0.006 0.2473 0.1304 0.2882 0.2143 0.4906 0.5031 0.6644 0.3947 0.8073 0.0132 0.1759 0.3038 0.0911 0.1371 0.0936 0.0171 0.1598 0.3203 0.3262 0.1144 0.2771 0.4898 0.4219 0. 0.263 0.0065 0.0705 0.0317] 2022-08-24 23:23:14 [INFO] [EVAL] The model with the best validation mIoU (0.3170) was saved at iter 110000. 
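To track the validation curve from a saved copy of this output, the (iter, mIoU) pairs can be pulled out with two regexes. A small helper sketch, not part of the repo; "trainer.log" is a hypothetical filename, and it assumes the original one-entry-per-line console output (the paste above is line-wrapped):

    import re

    TRAIN_RE = re.compile(r"\[TRAIN\] epoch: \d+, iter: (\d+)/\d+")
    EVAL_RE = re.compile(r"\[EVAL\] #Images: \d+ mIoU: ([0-9.]+)")

    def val_curve(path="trainer.log"):
        points, last_iter = [], None
        with open(path) as f:
            for line in f:
                m = TRAIN_RE.search(line)
                if m:
                    last_iter = int(m.group(1))
                m = EVAL_RE.search(line)
                if m and last_iter is not None:
                    points.append((last_iter, float(m.group(1))))
        return points  # e.g. [(107000, 0.3135), (108000, 0.3100), (109000, 0.3157), ...]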
2022-08-24 23:23:23 [INFO] [TRAIN] epoch: 90, iter: 113050/160000, loss: 0.6777, lr: 0.000355, batch_cost: 0.1835, reader_cost: 0.00531, ips: 43.5982 samples/sec | ETA 02:23:35 2022-08-24 23:23:32 [INFO] [TRAIN] epoch: 90, iter: 113100/160000, loss: 0.6791, lr: 0.000355, batch_cost: 0.1857, reader_cost: 0.00064, ips: 43.0806 samples/sec | ETA 02:25:09 2022-08-24 23:23:44 [INFO] [TRAIN] epoch: 90, iter: 113150/160000, loss: 0.7012, lr: 0.000355, batch_cost: 0.2340, reader_cost: 0.00139, ips: 34.1843 samples/sec | ETA 03:02:44 2022-08-24 23:23:54 [INFO] [TRAIN] epoch: 90, iter: 113200/160000, loss: 0.6827, lr: 0.000354, batch_cost: 0.1970, reader_cost: 0.00051, ips: 40.6050 samples/sec | ETA 02:33:40 2022-08-24 23:24:03 [INFO] [TRAIN] epoch: 90, iter: 113250/160000, loss: 0.7310, lr: 0.000354, batch_cost: 0.1940, reader_cost: 0.00041, ips: 41.2453 samples/sec | ETA 02:31:07 2022-08-24 23:24:15 [INFO] [TRAIN] epoch: 90, iter: 113300/160000, loss: 0.7366, lr: 0.000354, batch_cost: 0.2246, reader_cost: 0.00049, ips: 35.6255 samples/sec | ETA 02:54:46 2022-08-24 23:24:24 [INFO] [TRAIN] epoch: 90, iter: 113350/160000, loss: 0.7132, lr: 0.000353, batch_cost: 0.1944, reader_cost: 0.00044, ips: 41.1548 samples/sec | ETA 02:31:08 2022-08-24 23:24:35 [INFO] [TRAIN] epoch: 90, iter: 113400/160000, loss: 0.7082, lr: 0.000353, batch_cost: 0.2055, reader_cost: 0.00035, ips: 38.9257 samples/sec | ETA 02:39:37 2022-08-24 23:24:45 [INFO] [TRAIN] epoch: 90, iter: 113450/160000, loss: 0.7075, lr: 0.000352, batch_cost: 0.2053, reader_cost: 0.00154, ips: 38.9627 samples/sec | ETA 02:39:17 2022-08-24 23:24:56 [INFO] [TRAIN] epoch: 90, iter: 113500/160000, loss: 0.7226, lr: 0.000352, batch_cost: 0.2256, reader_cost: 0.00625, ips: 35.4584 samples/sec | ETA 02:54:51 2022-08-24 23:25:06 [INFO] [TRAIN] epoch: 90, iter: 113550/160000, loss: 0.7110, lr: 0.000352, batch_cost: 0.1908, reader_cost: 0.00974, ips: 41.9225 samples/sec | ETA 02:27:43 2022-08-24 23:25:15 [INFO] [TRAIN] epoch: 90, iter: 113600/160000, loss: 0.7711, lr: 0.000351, batch_cost: 0.1911, reader_cost: 0.00057, ips: 41.8686 samples/sec | ETA 02:27:45 2022-08-24 23:25:28 [INFO] [TRAIN] epoch: 90, iter: 113650/160000, loss: 0.6900, lr: 0.000351, batch_cost: 0.2552, reader_cost: 0.00391, ips: 31.3424 samples/sec | ETA 03:17:10 2022-08-24 23:25:43 [INFO] [TRAIN] epoch: 91, iter: 113700/160000, loss: 0.7439, lr: 0.000351, batch_cost: 0.3058, reader_cost: 0.06063, ips: 26.1609 samples/sec | ETA 03:55:58 2022-08-24 23:25:57 [INFO] [TRAIN] epoch: 91, iter: 113750/160000, loss: 0.7175, lr: 0.000350, batch_cost: 0.2722, reader_cost: 0.01397, ips: 29.3923 samples/sec | ETA 03:29:48 2022-08-24 23:26:11 [INFO] [TRAIN] epoch: 91, iter: 113800/160000, loss: 0.7335, lr: 0.000350, batch_cost: 0.2770, reader_cost: 0.00999, ips: 28.8820 samples/sec | ETA 03:33:16 2022-08-24 23:26:24 [INFO] [TRAIN] epoch: 91, iter: 113850/160000, loss: 0.7466, lr: 0.000349, batch_cost: 0.2617, reader_cost: 0.00067, ips: 30.5714 samples/sec | ETA 03:21:16 2022-08-24 23:26:36 [INFO] [TRAIN] epoch: 91, iter: 113900/160000, loss: 0.7000, lr: 0.000349, batch_cost: 0.2354, reader_cost: 0.00075, ips: 33.9838 samples/sec | ETA 03:00:52 2022-08-24 23:26:48 [INFO] [TRAIN] epoch: 91, iter: 113950/160000, loss: 0.6758, lr: 0.000349, batch_cost: 0.2505, reader_cost: 0.00552, ips: 31.9377 samples/sec | ETA 03:12:14 2022-08-24 23:26:59 [INFO] [TRAIN] epoch: 91, iter: 114000/160000, loss: 0.7133, lr: 0.000348, batch_cost: 0.2093, reader_cost: 0.00042, ips: 38.2208 samples/sec | ETA 02:40:28 2022-08-24 
23:26:59 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 211s - batch_cost: 0.2111 - reader cost: 7.4902e-04 2022-08-24 23:30:30 [INFO] [EVAL] #Images: 2000 mIoU: 0.3120 Acc: 0.7466 Kappa: 0.7271 Dice: 0.4369 2022-08-24 23:30:30 [INFO] [EVAL] Class IoU: [0.6602 0.7673 0.9256 0.7051 0.6663 0.743 0.7683 0.7484 0.482 0.6193 0.4649 0.5303 0.6493 0.2961 0.219 0.3744 0.4765 0.3967 0.5406 0.3696 0.7205 0.4337 0.5495 0.4416 0.3242 0.3661 0.3536 0.3881 0.3975 0.2894 0.2158 0.3986 0.2573 0.2614 0.3013 0.3796 0.3596 0.4597 0.2535 0.2725 0.0805 0.075 0.3082 0.2156 0.3091 0.2896 0.2384 0.4186 0.663 0.4935 0.4716 0.1978 0.2119 0.1526 0.5394 0.4084 0.8233 0.3291 0.4736 0.206 0.0388 0.1517 0.2764 0.1072 0.3868 0.6621 0.2203 0.3686 0.083 0.32 0.3479 0.4234 0.3818 0.2044 0.4018 0.2856 0.3413 0.2214 0.155 0.3504 0.6071 0.2945 0.2461 0.0601 0.4424 0.4749 0.0697 0.0531 0.3236 0.4242 0.4186 0.0308 0.2096 0.065 0.0048 0.0042 0.1723 0.1184 0.2096 0.325 0.0675 0.0158 0.2001 0.7157 0.0007 0.4716 0.2089 0.4601 0.0941 0.3789 0.0436 0.1676 0.0948 0.5988 0.728 0.0027 0.3377 0.5751 0.0381 0.0943 0.4809 0.0057 0.1601 0.096 0.2444 0.1164 0.4134 0.3068 0.4833 0.2485 0.567 0.017 0.0322 0.2638 0.0924 0.0979 0.0589 0.0168 0.132 0.3205 0.0674 0.0434 0.234 0.3388 0.2917 0. 0.2646 0.0069 0.0513 0.033 ] 2022-08-24 23:30:30 [INFO] [EVAL] Class Precision: [0.7648 0.828 0.9608 0.8284 0.7725 0.8629 0.8735 0.8139 0.6076 0.7321 0.6758 0.6654 0.7302 0.4817 0.4702 0.5335 0.6237 0.6531 0.7039 0.5486 0.8082 0.6185 0.6847 0.6158 0.5248 0.6318 0.5106 0.6691 0.6061 0.4405 0.3823 0.5366 0.4707 0.3346 0.5134 0.5043 0.6031 0.7236 0.4601 0.4925 0.1465 0.2052 0.5295 0.5258 0.4655 0.4766 0.3633 0.6688 0.796 0.6251 0.5971 0.2202 0.3833 0.6711 0.6717 0.6378 0.8582 0.6264 0.6804 0.3323 0.0811 0.3174 0.4151 0.6152 0.514 0.8318 0.3629 0.4861 0.2128 0.7019 0.5926 0.5919 0.5857 0.29 0.5734 0.4194 0.5216 0.6096 0.6993 0.5729 0.6892 0.5921 0.7686 0.2354 0.6301 0.6539 0.3236 0.4544 0.4318 0.5852 0.609 0.0637 0.3958 0.2831 0.0268 0.0158 0.638 0.5835 0.4044 0.4735 0.4344 0.0428 0.7646 0.7972 0.024 0.6653 0.4934 0.9219 0.2057 0.5507 0.2682 0.2192 0.4259 0.7294 0.7331 0.0924 0.7154 0.6058 0.1232 0.4957 0.7274 0.2203 0.7883 0.5479 0.6313 0.712 0.7736 0.4305 0.6285 0.5277 0.6606 0.4668 0.2841 0.731 0.5866 0.2275 0.309 0.1631 0.3685 0.6862 0.2395 0.1202 0.5549 0.7769 0.4499 0. 0.9307 0.2802 0.524 0.5511] 2022-08-24 23:30:30 [INFO] [EVAL] Class Recall: [0.8284 0.9128 0.9619 0.8258 0.8289 0.8424 0.8645 0.9028 0.6999 0.8007 0.5983 0.7232 0.8544 0.4344 0.2907 0.5567 0.6688 0.5027 0.6998 0.5311 0.8691 0.5922 0.7355 0.6095 0.4589 0.4655 0.535 0.4803 0.536 0.4576 0.3312 0.6077 0.362 0.5444 0.4218 0.6056 0.4711 0.5576 0.3608 0.3789 0.1515 0.1058 0.4244 0.2677 0.4793 0.4247 0.4095 0.5281 0.7988 0.701 0.6918 0.6604 0.3214 0.1649 0.7325 0.5317 0.9529 0.4094 0.6091 0.3516 0.0691 0.2252 0.4527 0.1149 0.6099 0.7644 0.3591 0.6039 0.1197 0.3704 0.4572 0.598 0.523 0.4091 0.5731 0.4724 0.4967 0.258 0.1661 0.4743 0.8361 0.3694 0.2658 0.0746 0.5977 0.6343 0.0815 0.0567 0.5635 0.6066 0.5725 0.0564 0.3082 0.0778 0.0058 0.0058 0.191 0.1294 0.3032 0.5089 0.074 0.0245 0.2132 0.875 0.0008 0.6182 0.266 0.4788 0.1478 0.5484 0.0496 0.4161 0.1087 0.7699 0.9904 0.0028 0.3901 0.9189 0.0523 0.1044 0.5867 0.0058 0.1673 0.1042 0.2851 0.1221 0.4703 0.5164 0.6765 0.3195 0.8 0.0173 0.0351 0.2922 0.0988 0.1467 0.0679 0.0184 0.1705 0.3755 0.0857 0.0637 0.2881 0.3753 0.4536 0. 
0.27 0.0071 0.0538 0.0339] 2022-08-24 23:30:30 [INFO] [EVAL] The model with the best validation mIoU (0.3170) was saved at iter 110000. 2022-08-24 23:30:41 [INFO] [TRAIN] epoch: 91, iter: 114050/160000, loss: 0.7666, lr: 0.000348, batch_cost: 0.2105, reader_cost: 0.00403, ips: 38.0109 samples/sec | ETA 02:41:10 2022-08-24 23:30:52 [INFO] [TRAIN] epoch: 91, iter: 114100/160000, loss: 0.6808, lr: 0.000348, batch_cost: 0.2270, reader_cost: 0.00117, ips: 35.2375 samples/sec | ETA 02:53:40 2022-08-24 23:31:01 [INFO] [TRAIN] epoch: 91, iter: 114150/160000, loss: 0.7366, lr: 0.000347, batch_cost: 0.1815, reader_cost: 0.00360, ips: 44.0693 samples/sec | ETA 02:18:43 2022-08-24 23:31:13 [INFO] [TRAIN] epoch: 91, iter: 114200/160000, loss: 0.6914, lr: 0.000347, batch_cost: 0.2317, reader_cost: 0.00051, ips: 34.5316 samples/sec | ETA 02:56:50 2022-08-24 23:31:23 [INFO] [TRAIN] epoch: 91, iter: 114250/160000, loss: 0.6659, lr: 0.000346, batch_cost: 0.2086, reader_cost: 0.00445, ips: 38.3533 samples/sec | ETA 02:39:02 2022-08-24 23:31:32 [INFO] [TRAIN] epoch: 91, iter: 114300/160000, loss: 0.6902, lr: 0.000346, batch_cost: 0.1845, reader_cost: 0.00607, ips: 43.3520 samples/sec | ETA 02:20:33 2022-08-24 23:31:43 [INFO] [TRAIN] epoch: 91, iter: 114350/160000, loss: 0.7713, lr: 0.000346, batch_cost: 0.2133, reader_cost: 0.00103, ips: 37.5138 samples/sec | ETA 02:42:15 2022-08-24 23:31:53 [INFO] [TRAIN] epoch: 91, iter: 114400/160000, loss: 0.7477, lr: 0.000345, batch_cost: 0.2022, reader_cost: 0.00715, ips: 39.5693 samples/sec | ETA 02:33:39 2022-08-24 23:32:03 [INFO] [TRAIN] epoch: 91, iter: 114450/160000, loss: 0.6681, lr: 0.000345, batch_cost: 0.2031, reader_cost: 0.00052, ips: 39.3899 samples/sec | ETA 02:34:11 2022-08-24 23:32:14 [INFO] [TRAIN] epoch: 91, iter: 114500/160000, loss: 0.7083, lr: 0.000344, batch_cost: 0.2183, reader_cost: 0.00034, ips: 36.6449 samples/sec | ETA 02:45:33 2022-08-24 23:32:26 [INFO] [TRAIN] epoch: 91, iter: 114550/160000, loss: 0.7388, lr: 0.000344, batch_cost: 0.2294, reader_cost: 0.00061, ips: 34.8727 samples/sec | ETA 02:53:46 2022-08-24 23:32:38 [INFO] [TRAIN] epoch: 91, iter: 114600/160000, loss: 0.7100, lr: 0.000344, batch_cost: 0.2467, reader_cost: 0.00585, ips: 32.4270 samples/sec | ETA 03:06:40 2022-08-24 23:32:51 [INFO] [TRAIN] epoch: 91, iter: 114650/160000, loss: 0.6661, lr: 0.000343, batch_cost: 0.2595, reader_cost: 0.00066, ips: 30.8344 samples/sec | ETA 03:16:06 2022-08-24 23:33:04 [INFO] [TRAIN] epoch: 91, iter: 114700/160000, loss: 0.7432, lr: 0.000343, batch_cost: 0.2487, reader_cost: 0.00563, ips: 32.1723 samples/sec | ETA 03:07:44 2022-08-24 23:33:16 [INFO] [TRAIN] epoch: 91, iter: 114750/160000, loss: 0.6888, lr: 0.000343, batch_cost: 0.2420, reader_cost: 0.00116, ips: 33.0597 samples/sec | ETA 03:02:29 2022-08-24 23:33:30 [INFO] [TRAIN] epoch: 91, iter: 114800/160000, loss: 0.7190, lr: 0.000342, batch_cost: 0.2815, reader_cost: 0.01011, ips: 28.4232 samples/sec | ETA 03:32:02 2022-08-24 23:33:44 [INFO] [TRAIN] epoch: 91, iter: 114850/160000, loss: 0.7443, lr: 0.000342, batch_cost: 0.2847, reader_cost: 0.00036, ips: 28.1042 samples/sec | ETA 03:34:12 2022-08-24 23:33:55 [INFO] [TRAIN] epoch: 91, iter: 114900/160000, loss: 0.7096, lr: 0.000341, batch_cost: 0.2213, reader_cost: 0.00070, ips: 36.1507 samples/sec | ETA 02:46:20 2022-08-24 23:34:07 [INFO] [TRAIN] epoch: 92, iter: 114950/160000, loss: 0.7075, lr: 0.000341, batch_cost: 0.2338, reader_cost: 0.03748, ips: 34.2181 samples/sec | ETA 02:55:32 2022-08-24 23:34:18 [INFO] [TRAIN] epoch: 92, iter: 
115000/160000, loss: 0.7553, lr: 0.000341, batch_cost: 0.2305, reader_cost: 0.00083, ips: 34.7114 samples/sec | ETA 02:52:51 2022-08-24 23:34:18 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 175s - batch_cost: 0.1746 - reader cost: 6.4933e-04 2022-08-24 23:37:13 [INFO] [EVAL] #Images: 2000 mIoU: 0.3162 Acc: 0.7452 Kappa: 0.7257 Dice: 0.4431 2022-08-24 23:37:13 [INFO] [EVAL] Class IoU: [0.658 0.7675 0.9261 0.7051 0.6633 0.746 0.7611 0.7503 0.4838 0.6097 0.4539 0.5215 0.6601 0.2781 0.2177 0.3746 0.4722 0.3771 0.5478 0.3729 0.7369 0.4357 0.5534 0.4493 0.338 0.3789 0.3841 0.4036 0.3525 0.2816 0.2182 0.4009 0.2517 0.2844 0.2794 0.3579 0.3626 0.4873 0.2412 0.253 0.0915 0.064 0.3053 0.2014 0.3 0.2417 0.2202 0.4232 0.6767 0.4989 0.4642 0.1899 0.2233 0.2018 0.5824 0.42 0.8212 0.3265 0.4868 0.2023 0.0686 0.1575 0.3079 0.0971 0.3764 0.6377 0.2112 0.3649 0.078 0.2992 0.3563 0.4065 0.354 0.2039 0.4244 0.2999 0.3515 0.2182 0.1105 0.4056 0.6562 0.2703 0.2442 0.0154 0.4288 0.4713 0.0944 0.053 0.3342 0.427 0.3862 0.006 0.2166 0.074 0.0063 0.006 0.1439 0.064 0.1955 0.3423 0.0525 0.0298 0.2261 0.6738 0.0181 0.4572 0.2132 0.4839 0.0792 0.3742 0.039 0.2239 0.0759 0.579 0.7265 0.007 0.4462 0.5891 0.0633 0.2792 0.4201 0.0045 0.2317 0.1456 0.2352 0.2371 0.4293 0.3193 0.4262 0.2529 0.5658 0.0139 0.1474 0.1982 0.1034 0.0955 0.0638 0.019 0.1306 0.2842 0.1149 0.0763 0.2861 0.345 0.3078 0. 0.3033 0.0136 0.0642 0.0479] 2022-08-24 23:37:13 [INFO] [EVAL] Class Precision: [0.7673 0.8338 0.9599 0.804 0.7638 0.8531 0.8806 0.8194 0.5997 0.7594 0.6624 0.6714 0.7497 0.4988 0.4553 0.5482 0.5931 0.6158 0.6844 0.5603 0.8411 0.6644 0.7354 0.6062 0.5206 0.5584 0.5265 0.6907 0.6776 0.3688 0.394 0.5333 0.4018 0.4088 0.4704 0.5437 0.6084 0.7625 0.4982 0.4913 0.1498 0.2394 0.5225 0.5828 0.4213 0.3947 0.3168 0.6488 0.7809 0.634 0.608 0.2125 0.3736 0.5197 0.7314 0.5401 0.8607 0.6034 0.6111 0.3716 0.1131 0.3513 0.4756 0.6044 0.4972 0.7412 0.2918 0.5621 0.2138 0.6186 0.5347 0.6438 0.6364 0.2573 0.6423 0.5662 0.6329 0.5348 0.786 0.5949 0.7783 0.5939 0.7387 0.0736 0.681 0.6952 0.3332 0.4495 0.5431 0.5826 0.4965 0.0081 0.3362 0.2917 0.0529 0.0215 0.4731 0.6055 0.4347 0.5549 0.3969 0.0503 0.7068 0.7253 0.1539 0.6253 0.5921 0.8563 0.2583 0.5144 0.249 0.3265 0.4848 0.7059 0.7304 0.0838 0.666 0.6114 0.1314 0.712 0.7119 0.1525 0.6551 0.6011 0.6622 0.6162 0.7592 0.4409 0.5722 0.385 0.6666 0.6934 0.3976 0.7994 0.6092 0.2674 0.325 0.3148 0.4413 0.7213 0.2967 0.258 0.566 0.7958 0.5542 0. 
0.8528 0.2148 0.5636 0.5938] 2022-08-24 23:37:13 [INFO] [EVAL] Class Recall: [0.8219 0.9061 0.9634 0.8515 0.8345 0.8559 0.8487 0.899 0.7146 0.7557 0.5904 0.7002 0.8467 0.3859 0.2943 0.5418 0.6986 0.4931 0.7329 0.5271 0.8561 0.5587 0.691 0.6345 0.4907 0.5411 0.5868 0.4927 0.4236 0.5434 0.3284 0.6176 0.4026 0.4831 0.4076 0.5115 0.4729 0.5745 0.3186 0.3429 0.1903 0.0803 0.4235 0.2353 0.5103 0.3839 0.4191 0.549 0.8353 0.7008 0.6625 0.6417 0.3568 0.2481 0.7408 0.654 0.9471 0.4156 0.7053 0.3074 0.1483 0.2222 0.4661 0.1037 0.6078 0.8203 0.433 0.5098 0.1093 0.3669 0.5164 0.5245 0.4438 0.4956 0.5557 0.3893 0.4414 0.2694 0.1139 0.5605 0.807 0.3315 0.2673 0.0191 0.5366 0.594 0.1164 0.0567 0.4649 0.6152 0.6349 0.0222 0.3785 0.0901 0.0071 0.0082 0.1714 0.0668 0.2621 0.4718 0.057 0.0683 0.2495 0.9047 0.0201 0.6297 0.2498 0.5267 0.1025 0.5786 0.0442 0.4161 0.0825 0.7632 0.9926 0.0076 0.5749 0.9416 0.109 0.3147 0.5062 0.0047 0.2639 0.1612 0.2673 0.2782 0.4969 0.5364 0.6255 0.4244 0.7892 0.014 0.1898 0.2085 0.1107 0.1294 0.0735 0.0198 0.1565 0.3192 0.158 0.0978 0.3665 0.3785 0.409 0. 0.32 0.0143 0.0675 0.0495] 2022-08-24 23:37:13 [INFO] [EVAL] The model with the best validation mIoU (0.3170) was saved at iter 110000. 2022-08-24 23:37:24 [INFO] [TRAIN] epoch: 92, iter: 115050/160000, loss: 0.7191, lr: 0.000340, batch_cost: 0.2075, reader_cost: 0.00387, ips: 38.5602 samples/sec | ETA 02:35:25 2022-08-24 23:37:33 [INFO] [TRAIN] epoch: 92, iter: 115100/160000, loss: 0.6805, lr: 0.000340, batch_cost: 0.1925, reader_cost: 0.00818, ips: 41.5585 samples/sec | ETA 02:24:03 2022-08-24 23:37:43 [INFO] [TRAIN] epoch: 92, iter: 115150/160000, loss: 0.7317, lr: 0.000340, batch_cost: 0.2033, reader_cost: 0.01477, ips: 39.3445 samples/sec | ETA 02:31:59 2022-08-24 23:37:54 [INFO] [TRAIN] epoch: 92, iter: 115200/160000, loss: 0.7328, lr: 0.000339, batch_cost: 0.2146, reader_cost: 0.00043, ips: 37.2831 samples/sec | ETA 02:40:12 2022-08-24 23:38:05 [INFO] [TRAIN] epoch: 92, iter: 115250/160000, loss: 0.6964, lr: 0.000339, batch_cost: 0.2207, reader_cost: 0.00503, ips: 36.2480 samples/sec | ETA 02:44:36 2022-08-24 23:38:17 [INFO] [TRAIN] epoch: 92, iter: 115300/160000, loss: 0.7053, lr: 0.000338, batch_cost: 0.2318, reader_cost: 0.00038, ips: 34.5110 samples/sec | ETA 02:52:41 2022-08-24 23:38:28 [INFO] [TRAIN] epoch: 92, iter: 115350/160000, loss: 0.7104, lr: 0.000338, batch_cost: 0.2180, reader_cost: 0.00076, ips: 36.6911 samples/sec | ETA 02:42:15 2022-08-24 23:38:38 [INFO] [TRAIN] epoch: 92, iter: 115400/160000, loss: 0.7305, lr: 0.000338, batch_cost: 0.2099, reader_cost: 0.00295, ips: 38.1140 samples/sec | ETA 02:36:01 2022-08-24 23:38:48 [INFO] [TRAIN] epoch: 92, iter: 115450/160000, loss: 0.7408, lr: 0.000337, batch_cost: 0.1941, reader_cost: 0.00343, ips: 41.2093 samples/sec | ETA 02:24:08 2022-08-24 23:39:00 [INFO] [TRAIN] epoch: 92, iter: 115500/160000, loss: 0.7322, lr: 0.000337, batch_cost: 0.2387, reader_cost: 0.00089, ips: 33.5217 samples/sec | ETA 02:56:59 2022-08-24 23:39:13 [INFO] [TRAIN] epoch: 92, iter: 115550/160000, loss: 0.6787, lr: 0.000337, batch_cost: 0.2661, reader_cost: 0.00035, ips: 30.0620 samples/sec | ETA 03:17:08 2022-08-24 23:39:27 [INFO] [TRAIN] epoch: 92, iter: 115600/160000, loss: 0.7197, lr: 0.000336, batch_cost: 0.2696, reader_cost: 0.03211, ips: 29.6781 samples/sec | ETA 03:19:28 2022-08-24 23:39:41 [INFO] [TRAIN] epoch: 92, iter: 115650/160000, loss: 0.7152, lr: 0.000336, batch_cost: 0.2790, reader_cost: 0.00063, ips: 28.6725 samples/sec | ETA 03:26:14 2022-08-24 23:39:53 [INFO] 
[TRAIN] epoch: 92, iter: 115700/160000, loss: 0.6936, lr: 0.000335, batch_cost: 0.2553, reader_cost: 0.01108, ips: 31.3388 samples/sec | ETA 03:08:28 2022-08-24 23:40:06 [INFO] [TRAIN] epoch: 92, iter: 115750/160000, loss: 0.7167, lr: 0.000335, batch_cost: 0.2603, reader_cost: 0.05892, ips: 30.7354 samples/sec | ETA 03:11:57 2022-08-24 23:40:20 [INFO] [TRAIN] epoch: 92, iter: 115800/160000, loss: 0.7297, lr: 0.000335, batch_cost: 0.2638, reader_cost: 0.00498, ips: 30.3310 samples/sec | ETA 03:14:18 2022-08-24 23:40:32 [INFO] [TRAIN] epoch: 92, iter: 115850/160000, loss: 0.6959, lr: 0.000334, batch_cost: 0.2413, reader_cost: 0.04975, ips: 33.1469 samples/sec | ETA 02:57:35 2022-08-24 23:40:42 [INFO] [TRAIN] epoch: 92, iter: 115900/160000, loss: 0.6881, lr: 0.000334, batch_cost: 0.2003, reader_cost: 0.00473, ips: 39.9398 samples/sec | ETA 02:27:13 2022-08-24 23:40:51 [INFO] [TRAIN] epoch: 92, iter: 115950/160000, loss: 0.7000, lr: 0.000334, batch_cost: 0.1878, reader_cost: 0.00247, ips: 42.6080 samples/sec | ETA 02:17:50 2022-08-24 23:41:01 [INFO] [TRAIN] epoch: 92, iter: 116000/160000, loss: 0.7267, lr: 0.000333, batch_cost: 0.2010, reader_cost: 0.00054, ips: 39.8088 samples/sec | ETA 02:27:22 2022-08-24 23:41:01 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 164s - batch_cost: 0.1635 - reader cost: 8.4042e-04 2022-08-24 23:43:45 [INFO] [EVAL] #Images: 2000 mIoU: 0.3205 Acc: 0.7478 Kappa: 0.7282 Dice: 0.4482 2022-08-24 23:43:45 [INFO] [EVAL] Class IoU: [0.6596 0.7689 0.9253 0.706 0.67 0.7431 0.7577 0.7451 0.479 0.619 0.4576 0.5272 0.6602 0.2988 0.2217 0.3688 0.496 0.4192 0.5443 0.3572 0.7297 0.4104 0.5653 0.4507 0.3316 0.3452 0.378 0.402 0.3954 0.2793 0.2308 0.3873 0.2735 0.2698 0.4147 0.3766 0.3562 0.4637 0.2575 0.2389 0.0841 0.0778 0.3097 0.2143 0.2774 0.2609 0.2247 0.4272 0.6586 0.4896 0.4817 0.2563 0.2176 0.1972 0.5949 0.3472 0.818 0.356 0.3849 0.1771 0.047 0.1689 0.3217 0.1584 0.3685 0.6593 0.2277 0.3611 0.0874 0.3139 0.3588 0.4095 0.3665 0.2063 0.4165 0.2874 0.3721 0.2036 0.1645 0.434 0.6502 0.2722 0.2708 0.0164 0.4878 0.4735 0.0937 0.0555 0.3412 0.4397 0.4008 0.0017 0.2094 0.0745 0.0189 0.011 0.1929 0.138 0.2029 0.3272 0.0352 0.0218 0.2264 0.7724 0.0194 0.4873 0.1772 0.5005 0.068 0.3356 0.0665 0.25 0.0989 0.571 0.7742 0.005 0.4139 0.5358 0.0605 0.2266 0.4846 0.0101 0.2642 0.1685 0.2538 0.2192 0.4399 0.2898 0.3965 0.2806 0.5455 0.0202 0.1253 0.2612 0.094 0.0901 0.0608 0.0226 0.1272 0.3229 0.0803 0.0414 0.2813 0.4067 0.321 0. 
0.2836 0.0147 0.0518 0.0265] 2022-08-24 23:43:45 [INFO] [EVAL] Class Precision: [0.7535 0.8359 0.9634 0.8186 0.7723 0.8626 0.8871 0.816 0.6129 0.765 0.6567 0.6344 0.7428 0.5105 0.4979 0.514 0.6589 0.625 0.7481 0.5985 0.8219 0.6143 0.7527 0.6137 0.5304 0.6068 0.5247 0.6932 0.6273 0.4026 0.398 0.5009 0.5223 0.402 0.5697 0.5333 0.6282 0.7091 0.4737 0.5482 0.1442 0.1899 0.5444 0.6003 0.3526 0.3933 0.3315 0.647 0.6986 0.6102 0.6054 0.3045 0.3498 0.5561 0.6892 0.6623 0.8575 0.5859 0.7766 0.3132 0.0817 0.3977 0.4895 0.5524 0.4532 0.7906 0.3588 0.5557 0.2402 0.5464 0.5377 0.6007 0.6056 0.2819 0.6592 0.4531 0.6109 0.4859 0.6843 0.6615 0.7703 0.5694 0.7628 0.072 0.6598 0.6706 0.3356 0.4347 0.5528 0.6635 0.5499 0.0024 0.3478 0.313 0.0826 0.0397 0.5591 0.4765 0.5138 0.4845 0.6524 0.0394 0.6029 0.8669 0.2163 0.7043 0.6276 0.8815 0.2497 0.5894 0.2125 0.3866 0.3881 0.7873 0.7809 0.0719 0.7105 0.5573 0.1658 0.6857 0.743 0.3987 0.8567 0.5719 0.6223 0.5883 0.7797 0.4002 0.5358 0.561 0.6304 0.5651 0.4544 0.718 0.6239 0.3041 0.2899 0.1565 0.3718 0.6629 0.2369 0.0956 0.5795 0.698 0.5404 0. 0.9056 0.2892 0.5106 0.6384] 2022-08-24 23:43:45 [INFO] [EVAL] Class Recall: [0.841 0.9055 0.959 0.837 0.8349 0.843 0.8385 0.8956 0.6868 0.7644 0.6015 0.7573 0.8559 0.4188 0.2856 0.5662 0.6674 0.56 0.6664 0.4698 0.8667 0.5529 0.6942 0.6291 0.4695 0.4446 0.5749 0.489 0.5168 0.4771 0.3546 0.6306 0.3647 0.4507 0.6038 0.5618 0.4513 0.5726 0.3607 0.2975 0.1677 0.1164 0.4179 0.2499 0.5653 0.4365 0.4109 0.557 0.92 0.7123 0.7021 0.6183 0.3653 0.2341 0.813 0.422 0.9467 0.4756 0.4329 0.2896 0.0997 0.2269 0.4841 0.1817 0.6635 0.7988 0.3839 0.5077 0.1209 0.4244 0.5189 0.5628 0.4814 0.4347 0.5308 0.44 0.4876 0.2594 0.178 0.5579 0.8066 0.3427 0.2957 0.0208 0.6517 0.6171 0.115 0.0598 0.4713 0.5659 0.5966 0.0058 0.3447 0.0891 0.024 0.0149 0.2275 0.1626 0.2511 0.502 0.0359 0.0468 0.266 0.8763 0.0208 0.6127 0.1981 0.5366 0.0854 0.4381 0.0883 0.4144 0.1171 0.6751 0.9892 0.0054 0.4979 0.933 0.0871 0.2529 0.5822 0.0103 0.2764 0.1928 0.3001 0.2589 0.5023 0.5123 0.6039 0.3596 0.8018 0.0206 0.1474 0.291 0.0997 0.1135 0.0714 0.0257 0.162 0.3863 0.1083 0.0681 0.3534 0.4935 0.4416 0. 0.2923 0.0152 0.0545 0.0269] 2022-08-24 23:43:45 [INFO] [EVAL] The model with the best validation mIoU (0.3205) was saved at iter 116000. 
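Each [EVAL] block above prints four scalar metrics (mIoU, Acc, Kappa, Dice) followed by three per-class vectors over the 150 ADE20K classes: IoU, Precision and Recall. Per class these are tied together exactly by 1/IoU = 1/Precision + 1/Recall - 1 (and Dice = 2*IoU/(1+IoU)), and the scalar mIoU appears to be the unweighted mean of the Class IoU vector, zero-IoU classes included. The sketch below is not PaddleSeg's metric code, just a check of that identity against the first class of the iter-116000 evaluation:

    # Minimal sketch (not PaddleSeg's metric code); the constants are copied
    # from the iter-116000 [EVAL] block above.
    def iou_from_precision_recall(p: float, r: float) -> float:
        # IoU = TP / (TP + FP + FN)  =>  1/IoU = 1/P + 1/R - 1
        return 1.0 / (1.0 / p + 1.0 / r - 1.0)

    # Class index 0 at iter 116000: Precision 0.7535, Recall 0.8410, logged IoU 0.6596.
    print(round(iou_from_precision_recall(0.7535, 0.8410), 4))  # 0.6595, matches up to print rounding

The same identity is why several classes combine a decent precision with a near-zero IoU: once recall drops to a few percent, the IoU collapses no matter how clean the predictions are.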
2022-08-24 23:43:55 [INFO] [TRAIN] epoch: 92, iter: 116050/160000, loss: 0.7109, lr: 0.000333, batch_cost: 0.2049, reader_cost: 0.00462, ips: 39.0371 samples/sec | ETA 02:30:06 2022-08-24 23:44:06 [INFO] [TRAIN] epoch: 92, iter: 116100/160000, loss: 0.6987, lr: 0.000332, batch_cost: 0.2061, reader_cost: 0.00116, ips: 38.8151 samples/sec | ETA 02:30:48 2022-08-24 23:44:16 [INFO] [TRAIN] epoch: 92, iter: 116150/160000, loss: 0.7092, lr: 0.000332, batch_cost: 0.2017, reader_cost: 0.00198, ips: 39.6598 samples/sec | ETA 02:27:25 2022-08-24 23:44:28 [INFO] [TRAIN] epoch: 93, iter: 116200/160000, loss: 0.6772, lr: 0.000332, batch_cost: 0.2433, reader_cost: 0.03343, ips: 32.8799 samples/sec | ETA 02:57:36 2022-08-24 23:44:38 [INFO] [TRAIN] epoch: 93, iter: 116250/160000, loss: 0.6915, lr: 0.000331, batch_cost: 0.2084, reader_cost: 0.00365, ips: 38.3803 samples/sec | ETA 02:31:59 2022-08-24 23:44:48 [INFO] [TRAIN] epoch: 93, iter: 116300/160000, loss: 0.7536, lr: 0.000331, batch_cost: 0.1988, reader_cost: 0.00221, ips: 40.2437 samples/sec | ETA 02:24:47 2022-08-24 23:44:59 [INFO] [TRAIN] epoch: 93, iter: 116350/160000, loss: 0.6330, lr: 0.000330, batch_cost: 0.2155, reader_cost: 0.00812, ips: 37.1236 samples/sec | ETA 02:36:46 2022-08-24 23:45:09 [INFO] [TRAIN] epoch: 93, iter: 116400/160000, loss: 0.7351, lr: 0.000330, batch_cost: 0.1942, reader_cost: 0.00046, ips: 41.2011 samples/sec | ETA 02:21:05 2022-08-24 23:45:19 [INFO] [TRAIN] epoch: 93, iter: 116450/160000, loss: 0.7209, lr: 0.000330, batch_cost: 0.2049, reader_cost: 0.00044, ips: 39.0427 samples/sec | ETA 02:28:43 2022-08-24 23:45:28 [INFO] [TRAIN] epoch: 93, iter: 116500/160000, loss: 0.7206, lr: 0.000329, batch_cost: 0.1765, reader_cost: 0.00236, ips: 45.3255 samples/sec | ETA 02:07:57 2022-08-24 23:45:38 [INFO] [TRAIN] epoch: 93, iter: 116550/160000, loss: 0.7393, lr: 0.000329, batch_cost: 0.2024, reader_cost: 0.00706, ips: 39.5230 samples/sec | ETA 02:26:34 2022-08-24 23:45:51 [INFO] [TRAIN] epoch: 93, iter: 116600/160000, loss: 0.6765, lr: 0.000329, batch_cost: 0.2522, reader_cost: 0.00852, ips: 31.7168 samples/sec | ETA 03:02:26 2022-08-24 23:46:04 [INFO] [TRAIN] epoch: 93, iter: 116650/160000, loss: 0.6769, lr: 0.000328, batch_cost: 0.2738, reader_cost: 0.00531, ips: 29.2185 samples/sec | ETA 03:17:49 2022-08-24 23:46:16 [INFO] [TRAIN] epoch: 93, iter: 116700/160000, loss: 0.7158, lr: 0.000328, batch_cost: 0.2332, reader_cost: 0.01404, ips: 34.3026 samples/sec | ETA 02:48:18 2022-08-24 23:46:28 [INFO] [TRAIN] epoch: 93, iter: 116750/160000, loss: 0.6754, lr: 0.000327, batch_cost: 0.2306, reader_cost: 0.01491, ips: 34.6958 samples/sec | ETA 02:46:12 2022-08-24 23:46:40 [INFO] [TRAIN] epoch: 93, iter: 116800/160000, loss: 0.6999, lr: 0.000327, batch_cost: 0.2515, reader_cost: 0.00387, ips: 31.8043 samples/sec | ETA 03:01:06 2022-08-24 23:46:53 [INFO] [TRAIN] epoch: 93, iter: 116850/160000, loss: 0.7123, lr: 0.000327, batch_cost: 0.2540, reader_cost: 0.00722, ips: 31.4910 samples/sec | ETA 03:02:41 2022-08-24 23:47:06 [INFO] [TRAIN] epoch: 93, iter: 116900/160000, loss: 0.7306, lr: 0.000326, batch_cost: 0.2561, reader_cost: 0.00541, ips: 31.2368 samples/sec | ETA 03:03:58 2022-08-24 23:47:18 [INFO] [TRAIN] epoch: 93, iter: 116950/160000, loss: 0.6971, lr: 0.000326, batch_cost: 0.2377, reader_cost: 0.00189, ips: 33.6531 samples/sec | ETA 02:50:33 2022-08-24 23:47:27 [INFO] [TRAIN] epoch: 93, iter: 117000/160000, loss: 0.6922, lr: 0.000326, batch_cost: 0.1966, reader_cost: 0.00039, ips: 40.6931 samples/sec | ETA 02:20:53 2022-08-24 
23:47:27 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 176s - batch_cost: 0.1760 - reader cost: 7.5299e-04 2022-08-24 23:50:24 [INFO] [EVAL] #Images: 2000 mIoU: 0.3158 Acc: 0.7470 Kappa: 0.7277 Dice: 0.4422 2022-08-24 23:50:24 [INFO] [EVAL] Class IoU: [0.6593 0.7713 0.9252 0.7068 0.6681 0.7413 0.7639 0.7575 0.4836 0.6154 0.456 0.5135 0.6609 0.2907 0.2275 0.3739 0.4722 0.4188 0.5466 0.3738 0.7238 0.4418 0.5551 0.4522 0.3502 0.3961 0.3592 0.3998 0.3782 0.2842 0.2038 0.3924 0.2748 0.2731 0.3837 0.3693 0.3593 0.4786 0.2579 0.277 0.0763 0.059 0.3071 0.2103 0.3074 0.2515 0.2437 0.4221 0.6638 0.4643 0.4638 0.2928 0.2155 0.1682 0.5968 0.4462 0.8272 0.3721 0.4915 0.1931 0.0447 0.1684 0.3044 0.1356 0.3675 0.6611 0.2318 0.2847 0.0932 0.3411 0.3562 0.3978 0.3706 0.2082 0.402 0.3099 0.3298 0.2223 0.1576 0.3198 0.6637 0.2875 0.2595 0.0156 0.4689 0.4729 0.0955 0.0592 0.3392 0.4382 0.3337 0.0037 0.1487 0.0791 0.0035 0.0085 0.188 0.099 0.1811 0.3516 0.0981 0.0149 0.2014 0.5955 0.0045 0.4503 0.2381 0.5067 0.1055 0.3297 0.0323 0.211 0.0973 0.6085 0.7531 0.0019 0.3649 0.5605 0.0946 0.1383 0.4602 0.0098 0.1658 0.0892 0.2456 0.1895 0.4555 0.3033 0.4325 0.2613 0.5748 0.0175 0.0281 0.2319 0.0824 0.0986 0.0512 0.0187 0.1137 0.3086 0.1419 0.0841 0.2675 0.3446 0.3068 0. 0.2769 0.009 0.0261 0.0558] 2022-08-24 23:50:24 [INFO] [EVAL] Class Precision: [0.7678 0.8418 0.9635 0.814 0.7677 0.8686 0.8883 0.8222 0.6104 0.7549 0.6698 0.6624 0.742 0.4781 0.4493 0.5498 0.6548 0.6303 0.7126 0.5464 0.8119 0.617 0.7119 0.5956 0.511 0.5156 0.5235 0.7068 0.6063 0.3911 0.4089 0.529 0.4753 0.3961 0.5211 0.468 0.5926 0.7361 0.4375 0.4747 0.1436 0.2215 0.5344 0.5134 0.4952 0.4317 0.3447 0.6507 0.7445 0.5562 0.5898 0.3648 0.3288 0.6507 0.7017 0.6572 0.8707 0.5679 0.6595 0.3405 0.0782 0.4141 0.3772 0.5623 0.4498 0.8187 0.4096 0.6073 0.241 0.7047 0.5574 0.6471 0.6109 0.26 0.5627 0.5321 0.5584 0.5626 0.7612 0.612 0.7849 0.5012 0.7519 0.1156 0.6818 0.6778 0.2829 0.3962 0.5072 0.6556 0.4072 0.0058 0.2733 0.3082 0.0173 0.0367 0.5589 0.7438 0.4483 0.5308 0.5485 0.0285 0.6798 0.6441 0.0715 0.6221 0.5192 0.8901 0.2118 0.5758 0.2219 0.3002 0.3925 0.7522 0.7588 0.0833 0.7139 0.5941 0.2082 0.633 0.7741 0.4968 0.7092 0.6548 0.608 0.6764 0.7765 0.4148 0.6011 0.4134 0.696 0.6285 0.2096 0.7845 0.6385 0.2788 0.3952 0.1046 0.3646 0.691 0.2042 0.1383 0.4827 0.6239 0.4774 0. 0.9021 0.336 0.3561 0.5982] 2022-08-24 23:50:24 [INFO] [EVAL] Class Recall: [0.8235 0.902 0.9589 0.8429 0.8374 0.8349 0.8451 0.9058 0.6994 0.7691 0.5882 0.6956 0.8581 0.4259 0.3155 0.539 0.6287 0.5551 0.7011 0.5419 0.8697 0.6089 0.716 0.6527 0.5267 0.6307 0.5337 0.4793 0.5013 0.5098 0.2888 0.6032 0.3945 0.468 0.5926 0.6366 0.4772 0.5777 0.3857 0.3994 0.14 0.0745 0.4193 0.2626 0.4477 0.376 0.4541 0.5458 0.8597 0.7376 0.6845 0.5975 0.3847 0.1849 0.7998 0.5815 0.943 0.519 0.6586 0.3083 0.0946 0.2211 0.6118 0.1516 0.6674 0.7744 0.3481 0.3489 0.132 0.398 0.4967 0.508 0.4851 0.5112 0.5845 0.426 0.4461 0.2687 0.1658 0.4011 0.8112 0.4027 0.2838 0.0178 0.6002 0.61 0.126 0.0651 0.506 0.5693 0.6492 0.0106 0.246 0.0962 0.0043 0.0109 0.2208 0.1025 0.233 0.5102 0.1067 0.0302 0.2225 0.8876 0.0048 0.6198 0.3055 0.5406 0.1737 0.4354 0.0364 0.4151 0.1145 0.761 0.9902 0.002 0.4275 0.9084 0.1478 0.1504 0.5316 0.0099 0.1779 0.0936 0.2918 0.2084 0.5243 0.53 0.6066 0.4153 0.7674 0.0177 0.0314 0.2476 0.0865 0.1324 0.0556 0.0222 0.1419 0.358 0.3177 0.1765 0.375 0.4349 0.4619 0. 
0.2855 0.0091 0.0274 0.0579] 2022-08-24 23:50:24 [INFO] [EVAL] The model with the best validation mIoU (0.3205) was saved at iter 116000. 2022-08-24 23:50:34 [INFO] [TRAIN] epoch: 93, iter: 117050/160000, loss: 0.7248, lr: 0.000325, batch_cost: 0.2040, reader_cost: 0.00688, ips: 39.2185 samples/sec | ETA 02:26:01 2022-08-24 23:50:44 [INFO] [TRAIN] epoch: 93, iter: 117100/160000, loss: 0.7183, lr: 0.000325, batch_cost: 0.2084, reader_cost: 0.00369, ips: 38.3862 samples/sec | ETA 02:29:00 2022-08-24 23:50:54 [INFO] [TRAIN] epoch: 93, iter: 117150/160000, loss: 0.6859, lr: 0.000324, batch_cost: 0.2020, reader_cost: 0.00273, ips: 39.6067 samples/sec | ETA 02:24:15 2022-08-24 23:51:05 [INFO] [TRAIN] epoch: 93, iter: 117200/160000, loss: 0.7512, lr: 0.000324, batch_cost: 0.2126, reader_cost: 0.00043, ips: 37.6233 samples/sec | ETA 02:31:40 2022-08-24 23:51:15 [INFO] [TRAIN] epoch: 93, iter: 117250/160000, loss: 0.7280, lr: 0.000324, batch_cost: 0.1913, reader_cost: 0.01232, ips: 41.8263 samples/sec | ETA 02:16:16 2022-08-24 23:51:25 [INFO] [TRAIN] epoch: 93, iter: 117300/160000, loss: 0.7603, lr: 0.000323, batch_cost: 0.2047, reader_cost: 0.01117, ips: 39.0859 samples/sec | ETA 02:25:39 2022-08-24 23:51:36 [INFO] [TRAIN] epoch: 93, iter: 117350/160000, loss: 0.7085, lr: 0.000323, batch_cost: 0.2314, reader_cost: 0.00070, ips: 34.5718 samples/sec | ETA 02:44:29 2022-08-24 23:51:46 [INFO] [TRAIN] epoch: 93, iter: 117400/160000, loss: 0.7025, lr: 0.000323, batch_cost: 0.1940, reader_cost: 0.00088, ips: 41.2400 samples/sec | ETA 02:17:43 2022-08-24 23:51:57 [INFO] [TRAIN] epoch: 93, iter: 117450/160000, loss: 0.7017, lr: 0.000322, batch_cost: 0.2226, reader_cost: 0.00392, ips: 35.9388 samples/sec | ETA 02:37:51 2022-08-24 23:52:14 [INFO] [TRAIN] epoch: 94, iter: 117500/160000, loss: 0.7317, lr: 0.000322, batch_cost: 0.3260, reader_cost: 0.05497, ips: 24.5385 samples/sec | ETA 03:50:55 2022-08-24 23:52:25 [INFO] [TRAIN] epoch: 94, iter: 117550/160000, loss: 0.7245, lr: 0.000321, batch_cost: 0.2305, reader_cost: 0.01491, ips: 34.7051 samples/sec | ETA 02:43:05 2022-08-24 23:52:37 [INFO] [TRAIN] epoch: 94, iter: 117600/160000, loss: 0.7418, lr: 0.000321, batch_cost: 0.2439, reader_cost: 0.00143, ips: 32.8044 samples/sec | ETA 02:52:20 2022-08-24 23:52:50 [INFO] [TRAIN] epoch: 94, iter: 117650/160000, loss: 0.6684, lr: 0.000321, batch_cost: 0.2628, reader_cost: 0.01409, ips: 30.4434 samples/sec | ETA 03:05:28 2022-08-24 23:53:03 [INFO] [TRAIN] epoch: 94, iter: 117700/160000, loss: 0.6874, lr: 0.000320, batch_cost: 0.2575, reader_cost: 0.01910, ips: 31.0631 samples/sec | ETA 03:01:33 2022-08-24 23:53:16 [INFO] [TRAIN] epoch: 94, iter: 117750/160000, loss: 0.6771, lr: 0.000320, batch_cost: 0.2623, reader_cost: 0.00078, ips: 30.4947 samples/sec | ETA 03:04:43 2022-08-24 23:53:30 [INFO] [TRAIN] epoch: 94, iter: 117800/160000, loss: 0.7021, lr: 0.000320, batch_cost: 0.2609, reader_cost: 0.00069, ips: 30.6589 samples/sec | ETA 03:03:31 2022-08-24 23:53:43 [INFO] [TRAIN] epoch: 94, iter: 117850/160000, loss: 0.6999, lr: 0.000319, batch_cost: 0.2767, reader_cost: 0.00115, ips: 28.9077 samples/sec | ETA 03:14:24 2022-08-24 23:53:57 [INFO] [TRAIN] epoch: 94, iter: 117900/160000, loss: 0.6516, lr: 0.000319, batch_cost: 0.2633, reader_cost: 0.00474, ips: 30.3869 samples/sec | ETA 03:04:43 2022-08-24 23:54:08 [INFO] [TRAIN] epoch: 94, iter: 117950/160000, loss: 0.6691, lr: 0.000318, batch_cost: 0.2371, reader_cost: 0.00093, ips: 33.7394 samples/sec | ETA 02:46:10 2022-08-24 23:54:18 [INFO] [TRAIN] epoch: 94, iter: 
118000/160000, loss: 0.6229, lr: 0.000318, batch_cost: 0.1949, reader_cost: 0.00034, ips: 41.0553 samples/sec | ETA 02:16:24 2022-08-24 23:54:18 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 188s - batch_cost: 0.1880 - reader cost: 8.8430e-04 2022-08-24 23:57:26 [INFO] [EVAL] #Images: 2000 mIoU: 0.3160 Acc: 0.7472 Kappa: 0.7277 Dice: 0.4420 2022-08-24 23:57:26 [INFO] [EVAL] Class IoU: [0.6579 0.7638 0.9257 0.7017 0.6721 0.7453 0.7591 0.7513 0.4868 0.6151 0.4557 0.5171 0.6641 0.2784 0.2133 0.3747 0.4837 0.412 0.5508 0.3645 0.722 0.4447 0.5551 0.455 0.3362 0.3772 0.351 0.3875 0.3365 0.3076 0.1995 0.3984 0.2689 0.2723 0.3508 0.3634 0.3571 0.5166 0.2582 0.2843 0.0896 0.0839 0.3058 0.2037 0.3007 0.2653 0.2225 0.429 0.6967 0.4641 0.4756 0.2254 0.2082 0.2122 0.5858 0.4575 0.8471 0.2285 0.484 0.1837 0.0479 0.1646 0.314 0.1446 0.37 0.6617 0.2306 0.3272 0.0958 0.3119 0.3359 0.414 0.3607 0.2268 0.4217 0.3003 0.3974 0.2311 0.0681 0.426 0.6636 0.2712 0.2145 0.0191 0.4436 0.4738 0.0628 0.0556 0.3575 0.4512 0.4158 0.0005 0.1919 0.0706 0.0068 0.012 0.1715 0.1334 0.1824 0.3112 0.0461 0.011 0.1981 0.6162 0.0071 0.4834 0.2349 0.4605 0.0677 0.3291 0.0523 0.1167 0.0949 0.5649 0.7631 0.0011 0.3583 0.589 0.065 0.1337 0.4465 0.0145 0.2675 0.1546 0.2437 0.1574 0.435 0.2957 0.4297 0.2698 0.5619 0.0157 0.0993 0.2617 0.072 0.0879 0.0633 0.0207 0.1173 0.3046 0.1416 0.0907 0.2218 0.452 0.3134 0.0001 0.2805 0.0053 0.0555 0.0593] 2022-08-24 23:57:26 [INFO] [EVAL] Class Precision: [0.7654 0.8277 0.9636 0.7885 0.7652 0.851 0.8709 0.8244 0.6346 0.7278 0.6697 0.6596 0.7469 0.5079 0.4486 0.546 0.6529 0.6668 0.7172 0.5919 0.8066 0.5966 0.7234 0.5927 0.5333 0.6307 0.5071 0.6922 0.6564 0.4713 0.3909 0.5346 0.4683 0.3889 0.514 0.4935 0.6203 0.7031 0.4733 0.4749 0.1383 0.2479 0.4747 0.5726 0.4082 0.4211 0.3528 0.6722 0.7779 0.5583 0.6508 0.2541 0.3934 0.5368 0.6686 0.6071 0.908 0.6726 0.674 0.3505 0.0983 0.3567 0.4108 0.5776 0.4666 0.8286 0.3678 0.6079 0.1999 0.5804 0.5998 0.6005 0.6141 0.2875 0.6643 0.5674 0.5836 0.5967 0.7938 0.6439 0.7793 0.5327 0.752 0.0718 0.6662 0.6725 0.321 0.4445 0.5688 0.6603 0.5874 0.0008 0.3194 0.3078 0.0312 0.0406 0.6598 0.5531 0.4581 0.5655 0.5483 0.0204 0.6379 0.6733 0.1222 0.6834 0.6285 0.9054 0.3245 0.6194 0.2244 0.1396 0.3815 0.6504 0.7685 0.0207 0.7598 0.6216 0.1301 0.6672 0.728 0.3203 0.601 0.6145 0.6503 0.6398 0.7518 0.3987 0.702 0.4114 0.6596 0.6528 0.4249 0.7169 0.6408 0.3038 0.2691 0.097 0.3663 0.6945 0.2201 0.2014 0.5572 0.7197 0.5262 0.0001 0.8772 0.3291 0.373 0.5822] 2022-08-24 23:57:26 [INFO] [EVAL] Class Recall: [0.8242 0.9082 0.9592 0.8643 0.8467 0.8572 0.8553 0.8944 0.6764 0.799 0.5878 0.7054 0.8569 0.3812 0.2891 0.5443 0.6513 0.5189 0.7035 0.4868 0.8731 0.6359 0.7046 0.662 0.4763 0.4841 0.5329 0.4682 0.4085 0.4696 0.2895 0.6099 0.387 0.4759 0.525 0.5796 0.457 0.6608 0.3624 0.4147 0.203 0.1126 0.4623 0.2403 0.533 0.4177 0.3761 0.5424 0.8696 0.7335 0.6386 0.6659 0.3066 0.2598 0.8256 0.65 0.9267 0.2571 0.6319 0.2785 0.0855 0.2342 0.5711 0.1617 0.6411 0.7666 0.3819 0.4147 0.1553 0.4027 0.433 0.5714 0.4664 0.518 0.5358 0.3894 0.5546 0.2739 0.0693 0.5574 0.8172 0.3558 0.2308 0.0253 0.5703 0.6159 0.0724 0.0598 0.4904 0.5876 0.5874 0.0014 0.3247 0.0839 0.0087 0.0167 0.1882 0.1495 0.2327 0.409 0.0479 0.0234 0.2232 0.8792 0.0075 0.6229 0.2728 0.4837 0.0788 0.4124 0.0638 0.4153 0.1121 0.8113 0.9909 0.0012 0.4041 0.9183 0.115 0.1433 0.5359 0.015 0.3253 0.1713 0.2804 0.1726 0.508 0.5338 0.5256 0.4394 0.7914 0.0159 0.1147 0.2918 0.075 0.1101 0.0764 0.0256 0.1471 
0.3517 0.2842 0.1416 0.2693 0.5486 0.4366 0.0006 0.292 0.0053 0.0612 0.062 ] 2022-08-24 23:57:27 [INFO] [EVAL] The model with the best validation mIoU (0.3205) was saved at iter 116000. 2022-08-24 23:57:36 [INFO] [TRAIN] epoch: 94, iter: 118050/160000, loss: 0.6965, lr: 0.000318, batch_cost: 0.1950, reader_cost: 0.00384, ips: 41.0250 samples/sec | ETA 02:16:20 2022-08-24 23:57:45 [INFO] [TRAIN] epoch: 94, iter: 118100/160000, loss: 0.6785, lr: 0.000317, batch_cost: 0.1829, reader_cost: 0.00081, ips: 43.7332 samples/sec | ETA 02:07:44 2022-08-24 23:57:56 [INFO] [TRAIN] epoch: 94, iter: 118150/160000, loss: 0.7130, lr: 0.000317, batch_cost: 0.2077, reader_cost: 0.00512, ips: 38.5100 samples/sec | ETA 02:24:53 2022-08-24 23:58:06 [INFO] [TRAIN] epoch: 94, iter: 118200/160000, loss: 0.7116, lr: 0.000316, batch_cost: 0.2124, reader_cost: 0.00036, ips: 37.6713 samples/sec | ETA 02:27:56 2022-08-24 23:58:16 [INFO] [TRAIN] epoch: 94, iter: 118250/160000, loss: 0.6924, lr: 0.000316, batch_cost: 0.1866, reader_cost: 0.00515, ips: 42.8677 samples/sec | ETA 02:09:51 2022-08-24 23:58:27 [INFO] [TRAIN] epoch: 94, iter: 118300/160000, loss: 0.6984, lr: 0.000316, batch_cost: 0.2322, reader_cost: 0.00053, ips: 34.4479 samples/sec | ETA 02:41:24 2022-08-24 23:58:40 [INFO] [TRAIN] epoch: 94, iter: 118350/160000, loss: 0.6988, lr: 0.000315, batch_cost: 0.2485, reader_cost: 0.00161, ips: 32.1918 samples/sec | ETA 02:52:30 2022-08-24 23:58:52 [INFO] [TRAIN] epoch: 94, iter: 118400/160000, loss: 0.6591, lr: 0.000315, batch_cost: 0.2484, reader_cost: 0.00056, ips: 32.2117 samples/sec | ETA 02:52:11 2022-08-24 23:59:05 [INFO] [TRAIN] epoch: 94, iter: 118450/160000, loss: 0.7562, lr: 0.000315, batch_cost: 0.2533, reader_cost: 0.00116, ips: 31.5818 samples/sec | ETA 02:55:25 2022-08-24 23:59:17 [INFO] [TRAIN] epoch: 94, iter: 118500/160000, loss: 0.7154, lr: 0.000314, batch_cost: 0.2486, reader_cost: 0.00733, ips: 32.1833 samples/sec | ETA 02:51:55 2022-08-24 23:59:30 [INFO] [TRAIN] epoch: 94, iter: 118550/160000, loss: 0.7100, lr: 0.000314, batch_cost: 0.2508, reader_cost: 0.00435, ips: 31.9022 samples/sec | ETA 02:53:14 2022-08-24 23:59:42 [INFO] [TRAIN] epoch: 94, iter: 118600/160000, loss: 0.7384, lr: 0.000313, batch_cost: 0.2508, reader_cost: 0.01449, ips: 31.8925 samples/sec | ETA 02:53:04 2022-08-24 23:59:55 [INFO] [TRAIN] epoch: 94, iter: 118650/160000, loss: 0.7143, lr: 0.000313, batch_cost: 0.2567, reader_cost: 0.01700, ips: 31.1641 samples/sec | ETA 02:56:54 2022-08-25 00:00:07 [INFO] [TRAIN] epoch: 94, iter: 118700/160000, loss: 0.7179, lr: 0.000313, batch_cost: 0.2384, reader_cost: 0.00452, ips: 33.5562 samples/sec | ETA 02:44:06 2022-08-25 00:00:23 [INFO] [TRAIN] epoch: 95, iter: 118750/160000, loss: 0.7102, lr: 0.000312, batch_cost: 0.3233, reader_cost: 0.03542, ips: 24.7427 samples/sec | ETA 03:42:17 2022-08-25 00:00:35 [INFO] [TRAIN] epoch: 95, iter: 118800/160000, loss: 0.7053, lr: 0.000312, batch_cost: 0.2361, reader_cost: 0.00686, ips: 33.8902 samples/sec | ETA 02:42:05 2022-08-25 00:00:48 [INFO] [TRAIN] epoch: 95, iter: 118850/160000, loss: 0.7317, lr: 0.000312, batch_cost: 0.2601, reader_cost: 0.01063, ips: 30.7611 samples/sec | ETA 02:58:21 2022-08-25 00:01:00 [INFO] [TRAIN] epoch: 95, iter: 118900/160000, loss: 0.6454, lr: 0.000311, batch_cost: 0.2342, reader_cost: 0.00033, ips: 34.1631 samples/sec | ETA 02:40:24 2022-08-25 00:01:10 [INFO] [TRAIN] epoch: 95, iter: 118950/160000, loss: 0.6717, lr: 0.000311, batch_cost: 0.1958, reader_cost: 0.00214, ips: 40.8539 samples/sec | ETA 02:13:58 
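In the [TRAIN] lines, batch_cost is seconds per iteration, reader_cost is the share of that spent waiting on the data loader, and ips is samples per second. The logged numbers are consistent with ips = samples_per_step / batch_cost at about 8 samples per step, and with ETA = (160000 - iter) * batch_cost; a small sanity-check sketch using the iter-118950 line directly above (the step size of 8 is inferred from the log, not read from the config):

    # Sanity-check of the throughput fields in one [TRAIN] line; the batch size
    # is inferred from ips * batch_cost, not taken from the training config.
    import datetime

    total_iters = 160_000
    cur_iter    = 118_950
    batch_cost  = 0.1958     # s/iter, from the log line
    ips         = 40.8539    # samples/s, from the log line

    print(round(ips * batch_cost, 2))              # 8.0 -> about 8 samples per logged step
    eta = (total_iters - cur_iter) * batch_cost
    print(datetime.timedelta(seconds=round(eta)))  # 2:13:58, matches "ETA 02:13:58"

When reader_cost jumps (for example 0.05497 s against a 0.3260 s batch_cost at iter 117500 above), batch_cost rises with it, so a sustained high reader_cost points at the input pipeline rather than the model.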
2022-08-25 00:01:20 [INFO] [TRAIN] epoch: 95, iter: 119000/160000, loss: 0.6931, lr: 0.000310, batch_cost: 0.2049, reader_cost: 0.00061, ips: 39.0452 samples/sec | ETA 02:20:00 2022-08-25 00:01:20 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 168s - batch_cost: 0.1683 - reader cost: 0.0011 2022-08-25 00:04:08 [INFO] [EVAL] #Images: 2000 mIoU: 0.3164 Acc: 0.7447 Kappa: 0.7252 Dice: 0.4438 2022-08-25 00:04:08 [INFO] [EVAL] Class IoU: [0.6582 0.7712 0.9253 0.7071 0.6744 0.7433 0.7629 0.7525 0.4769 0.5833 0.449 0.5133 0.6633 0.2805 0.2401 0.3723 0.4826 0.4139 0.5415 0.3651 0.7202 0.4133 0.5679 0.4515 0.3137 0.3973 0.3734 0.381 0.3657 0.2858 0.2024 0.4035 0.2662 0.2657 0.3575 0.3692 0.3616 0.4797 0.2398 0.2575 0.0724 0.0831 0.2896 0.2261 0.2989 0.2253 0.2439 0.4167 0.6375 0.4905 0.4736 0.2473 0.2167 0.2223 0.5912 0.4519 0.8469 0.2384 0.5106 0.1896 0.0515 0.1631 0.2985 0.1114 0.3501 0.6679 0.2186 0.3736 0.0908 0.3284 0.3103 0.4197 0.3664 0.213 0.4096 0.3121 0.4011 0.2246 0.1501 0.3173 0.6446 0.2856 0.2719 0.0163 0.4758 0.4657 0.0764 0.0561 0.3505 0.4266 0.3768 0.0062 0.2002 0.0816 0.0067 0.0115 0.1725 0.1344 0.1599 0.3741 0.1234 0.0198 0.1786 0.6525 0.0005 0.4907 0.1843 0.4643 0.0952 0.3157 0.0603 0.3161 0.0764 0.5589 0.6676 0.0025 0.4142 0.5693 0.0302 0.2877 0.434 0.013 0.2492 0.0892 0.259 0.2 0.4341 0.3007 0.4071 0.2597 0.556 0.0169 0.1392 0.2362 0.0671 0.1106 0.0441 0.0201 0.1216 0.3053 0.0875 0.1044 0.2553 0.3747 0.2879 0. 0.2916 0.0097 0.0603 0.0292] 2022-08-25 00:04:08 [INFO] [EVAL] Class Precision: [0.7625 0.8547 0.9618 0.8115 0.7701 0.865 0.8731 0.8295 0.5986 0.7327 0.6475 0.6556 0.7529 0.4765 0.456 0.5383 0.6867 0.6327 0.7335 0.5542 0.796 0.5932 0.7553 0.5797 0.5338 0.551 0.5239 0.6997 0.6391 0.3666 0.4125 0.5351 0.474 0.365 0.4683 0.5254 0.5944 0.7117 0.4721 0.5312 0.1308 0.2074 0.4581 0.4907 0.3806 0.4074 0.3784 0.6097 0.6705 0.6199 0.6303 0.2874 0.3681 0.6736 0.6844 0.5952 0.8949 0.6355 0.702 0.3848 0.0855 0.3626 0.436 0.6097 0.4244 0.8527 0.3411 0.5595 0.1562 0.6325 0.6195 0.5725 0.5977 0.2643 0.6051 0.5307 0.5409 0.4578 0.8065 0.6835 0.743 0.5732 0.7232 0.0648 0.6806 0.6608 0.2305 0.4858 0.6292 0.6024 0.4888 0.0079 0.3014 0.2908 0.0277 0.0331 0.5667 0.5399 0.283 0.5842 0.6122 0.0326 0.713 0.7507 0.0073 0.6955 0.5651 0.8743 0.2474 0.5388 0.2362 0.568 0.3478 0.7393 0.6715 0.0613 0.6947 0.5919 0.0735 0.7216 0.6837 0.2856 0.6293 0.6884 0.6162 0.6591 0.7771 0.4438 0.5815 0.4853 0.6424 0.57 0.506 0.7072 0.6597 0.2477 0.3895 0.1513 0.3217 0.6865 0.1323 0.2491 0.5123 0.735 0.4603 0. 
0.8972 0.2538 0.437 0.5189] 2022-08-25 00:04:08 [INFO] [EVAL] Class Recall: [0.8279 0.8876 0.9606 0.8461 0.8444 0.8408 0.8581 0.8902 0.701 0.7411 0.5943 0.7028 0.8478 0.4055 0.3365 0.547 0.6189 0.5448 0.6742 0.5169 0.8831 0.5768 0.696 0.6712 0.4321 0.5875 0.5652 0.4555 0.4609 0.5644 0.2844 0.6215 0.3778 0.4939 0.6018 0.5538 0.4801 0.5954 0.3276 0.3333 0.1397 0.1219 0.4405 0.2955 0.5818 0.3353 0.4069 0.5684 0.9283 0.7015 0.6558 0.6391 0.3451 0.2492 0.8126 0.6523 0.9404 0.2762 0.6518 0.2721 0.1145 0.2287 0.4862 0.1199 0.6666 0.755 0.3782 0.5293 0.1782 0.4059 0.3834 0.6112 0.4863 0.5233 0.559 0.4311 0.6082 0.3059 0.1557 0.372 0.8295 0.3627 0.3035 0.0212 0.6125 0.612 0.1026 0.0596 0.4418 0.5938 0.6217 0.0266 0.3736 0.1018 0.0088 0.0173 0.1987 0.1518 0.2688 0.5099 0.1339 0.0481 0.1924 0.8329 0.0005 0.6249 0.2147 0.4975 0.1339 0.4325 0.075 0.4161 0.0892 0.6961 0.9913 0.0026 0.5064 0.9373 0.0487 0.3236 0.5431 0.0135 0.2921 0.0929 0.3089 0.2231 0.4959 0.4825 0.5757 0.3584 0.8053 0.0171 0.1611 0.2618 0.0696 0.1665 0.0474 0.0226 0.1635 0.3547 0.2051 0.1524 0.3372 0.4333 0.4345 0. 0.3017 0.01 0.0654 0.03 ] 2022-08-25 00:04:09 [INFO] [EVAL] The model with the best validation mIoU (0.3205) was saved at iter 116000. 2022-08-25 00:04:18 [INFO] [TRAIN] epoch: 95, iter: 119050/160000, loss: 0.6427, lr: 0.000310, batch_cost: 0.1884, reader_cost: 0.00957, ips: 42.4692 samples/sec | ETA 02:08:33 2022-08-25 00:04:28 [INFO] [TRAIN] epoch: 95, iter: 119100/160000, loss: 0.6790, lr: 0.000310, batch_cost: 0.2064, reader_cost: 0.00111, ips: 38.7524 samples/sec | ETA 02:20:43 2022-08-25 00:04:38 [INFO] [TRAIN] epoch: 95, iter: 119150/160000, loss: 0.6539, lr: 0.000309, batch_cost: 0.1958, reader_cost: 0.00085, ips: 40.8589 samples/sec | ETA 02:13:18 2022-08-25 00:04:48 [INFO] [TRAIN] epoch: 95, iter: 119200/160000, loss: 0.6752, lr: 0.000309, batch_cost: 0.2010, reader_cost: 0.00661, ips: 39.7930 samples/sec | ETA 02:16:42 2022-08-25 00:04:59 [INFO] [TRAIN] epoch: 95, iter: 119250/160000, loss: 0.7351, lr: 0.000309, batch_cost: 0.2207, reader_cost: 0.00132, ips: 36.2455 samples/sec | ETA 02:29:54 2022-08-25 00:05:09 [INFO] [TRAIN] epoch: 95, iter: 119300/160000, loss: 0.7647, lr: 0.000308, batch_cost: 0.1947, reader_cost: 0.00463, ips: 41.0982 samples/sec | ETA 02:12:02 2022-08-25 00:05:18 [INFO] [TRAIN] epoch: 95, iter: 119350/160000, loss: 0.7159, lr: 0.000308, batch_cost: 0.1881, reader_cost: 0.01816, ips: 42.5334 samples/sec | ETA 02:07:25 2022-08-25 00:05:30 [INFO] [TRAIN] epoch: 95, iter: 119400/160000, loss: 0.7370, lr: 0.000307, batch_cost: 0.2333, reader_cost: 0.00216, ips: 34.2902 samples/sec | ETA 02:37:52 2022-08-25 00:05:44 [INFO] [TRAIN] epoch: 95, iter: 119450/160000, loss: 0.7400, lr: 0.000307, batch_cost: 0.2781, reader_cost: 0.00154, ips: 28.7636 samples/sec | ETA 03:07:58 2022-08-25 00:05:57 [INFO] [TRAIN] epoch: 95, iter: 119500/160000, loss: 0.6864, lr: 0.000307, batch_cost: 0.2606, reader_cost: 0.00133, ips: 30.6947 samples/sec | ETA 02:55:55 2022-08-25 00:06:09 [INFO] [TRAIN] epoch: 95, iter: 119550/160000, loss: 0.6605, lr: 0.000306, batch_cost: 0.2415, reader_cost: 0.00177, ips: 33.1299 samples/sec | ETA 02:42:47 2022-08-25 00:06:22 [INFO] [TRAIN] epoch: 95, iter: 119600/160000, loss: 0.6520, lr: 0.000306, batch_cost: 0.2592, reader_cost: 0.00066, ips: 30.8646 samples/sec | ETA 02:54:31 2022-08-25 00:06:34 [INFO] [TRAIN] epoch: 95, iter: 119650/160000, loss: 0.7166, lr: 0.000305, batch_cost: 0.2424, reader_cost: 0.00221, ips: 33.0000 samples/sec | ETA 02:43:01 2022-08-25 00:06:46 [INFO] 
[TRAIN] epoch: 95, iter: 119700/160000, loss: 0.6685, lr: 0.000305, batch_cost: 0.2312, reader_cost: 0.02062, ips: 34.6051 samples/sec | ETA 02:35:16 2022-08-25 00:06:59 [INFO] [TRAIN] epoch: 95, iter: 119750/160000, loss: 0.6614, lr: 0.000305, batch_cost: 0.2598, reader_cost: 0.00089, ips: 30.7870 samples/sec | ETA 02:54:18 2022-08-25 00:07:12 [INFO] [TRAIN] epoch: 95, iter: 119800/160000, loss: 0.6848, lr: 0.000304, batch_cost: 0.2633, reader_cost: 0.00042, ips: 30.3876 samples/sec | ETA 02:56:23 2022-08-25 00:07:24 [INFO] [TRAIN] epoch: 95, iter: 119850/160000, loss: 0.7212, lr: 0.000304, batch_cost: 0.2482, reader_cost: 0.00042, ips: 32.2366 samples/sec | ETA 02:46:03 2022-08-25 00:07:37 [INFO] [TRAIN] epoch: 95, iter: 119900/160000, loss: 0.7017, lr: 0.000304, batch_cost: 0.2521, reader_cost: 0.00051, ips: 31.7303 samples/sec | ETA 02:48:30 2022-08-25 00:07:50 [INFO] [TRAIN] epoch: 95, iter: 119950/160000, loss: 0.7330, lr: 0.000303, batch_cost: 0.2632, reader_cost: 0.00042, ips: 30.3910 samples/sec | ETA 02:55:42 2022-08-25 00:08:04 [INFO] [TRAIN] epoch: 96, iter: 120000/160000, loss: 0.7106, lr: 0.000303, batch_cost: 0.2738, reader_cost: 0.04594, ips: 29.2190 samples/sec | ETA 03:02:31 2022-08-25 00:08:04 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 165s - batch_cost: 0.1649 - reader cost: 9.5812e-04 2022-08-25 00:10:49 [INFO] [EVAL] #Images: 2000 mIoU: 0.3162 Acc: 0.7466 Kappa: 0.7271 Dice: 0.4437 2022-08-25 00:10:49 [INFO] [EVAL] Class IoU: [0.6583 0.769 0.9253 0.7039 0.6761 0.747 0.7556 0.7539 0.4834 0.6075 0.4634 0.5193 0.6651 0.2961 0.2152 0.3767 0.4663 0.4138 0.5465 0.3642 0.7227 0.412 0.5627 0.4483 0.3178 0.3803 0.3586 0.395 0.3627 0.2776 0.222 0.4068 0.2646 0.2775 0.3037 0.3874 0.3647 0.4788 0.2395 0.268 0.0869 0.0835 0.3062 0.2218 0.315 0.231 0.2385 0.4207 0.6674 0.4813 0.4771 0.2473 0.2253 0.206 0.6159 0.4438 0.844 0.2517 0.4839 0.1992 0.0603 0.1724 0.3058 0.1872 0.3782 0.6603 0.2311 0.3767 0.0348 0.3173 0.3816 0.4283 0.3616 0.2218 0.4127 0.3068 0.3177 0.2203 0.1984 0.2314 0.638 0.3039 0.2654 0.0276 0.4312 0.4675 0.0825 0.0513 0.322 0.4441 0.404 0.0072 0.1945 0.0984 0.0055 0.0126 0.1731 0.1493 0.1888 0.3226 0.0961 0.0077 0.1727 0.5984 0.0003 0.5278 0.2064 0.4586 0.0691 0.3119 0.0428 0.2244 0.0821 0.5139 0.6739 0. 0.3492 0.5723 0.0444 0.3144 0.4598 0.0112 0.2487 0.125 0.2378 0.1731 0.4334 0.2935 0.4168 0.2961 0.5453 0.0143 0.1152 0.2877 0.0801 0.1052 0.0537 0.0202 0.1337 0.3241 0.0703 0.0744 0.2279 0.4019 0.3066 0. 
0.3017 0.0085 0.0468 0.0493] 2022-08-25 00:10:49 [INFO] [EVAL] Class Precision: [0.764 0.8389 0.9625 0.8084 0.7701 0.8637 0.8857 0.817 0.6093 0.7496 0.671 0.6304 0.7583 0.4915 0.4479 0.5352 0.609 0.6474 0.7154 0.5685 0.811 0.5701 0.7173 0.5908 0.5185 0.5795 0.5107 0.6643 0.622 0.3923 0.3966 0.5381 0.4329 0.4169 0.4654 0.5397 0.5987 0.726 0.5204 0.5155 0.1358 0.2461 0.5034 0.5278 0.4501 0.4376 0.3764 0.6799 0.7511 0.5854 0.6234 0.287 0.392 0.6122 0.7265 0.6226 0.8876 0.6365 0.5966 0.3509 0.0977 0.4396 0.4266 0.5912 0.4788 0.8141 0.3461 0.5801 0.1232 0.6464 0.5816 0.621 0.6319 0.2725 0.6075 0.5097 0.512 0.6005 0.6793 0.6795 0.7368 0.5307 0.7393 0.1161 0.6534 0.6528 0.4259 0.4367 0.5259 0.6414 0.555 0.0113 0.3452 0.2881 0.0287 0.0483 0.6696 0.4229 0.3752 0.5596 0.5192 0.0177 0.7313 0.7907 0.0089 0.7859 0.5741 0.9436 0.1959 0.5581 0.2878 0.3276 0.288 0.7853 0.677 0.0001 0.7211 0.594 0.1183 0.708 0.7935 0.3018 0.7774 0.6548 0.683 0.6184 0.7781 0.4133 0.6779 0.5157 0.6135 0.4857 0.3491 0.6972 0.6314 0.2803 0.393 0.0922 0.4162 0.6536 0.1298 0.4732 0.5371 0.6633 0.5496 0. 0.8615 0.2375 0.4939 0.6835] 2022-08-25 00:10:49 [INFO] [EVAL] Class Recall: [0.8264 0.9024 0.9599 0.8448 0.8472 0.8468 0.8373 0.907 0.7006 0.7622 0.5997 0.7466 0.8441 0.4268 0.2928 0.5599 0.6656 0.5342 0.6984 0.5033 0.8691 0.5976 0.723 0.6501 0.451 0.5253 0.5464 0.4935 0.4653 0.4869 0.3353 0.6251 0.4049 0.4536 0.4664 0.5785 0.4827 0.5843 0.3074 0.3583 0.1945 0.1122 0.4388 0.2767 0.512 0.3285 0.3944 0.5246 0.8569 0.7302 0.6702 0.6414 0.3463 0.2369 0.8017 0.6071 0.945 0.294 0.7192 0.3154 0.1363 0.2209 0.5193 0.215 0.6429 0.7775 0.4101 0.5178 0.0463 0.384 0.5259 0.5799 0.458 0.5438 0.5627 0.4353 0.4557 0.2582 0.2189 0.2598 0.8264 0.4156 0.2928 0.035 0.5591 0.6223 0.0928 0.055 0.4536 0.5909 0.5975 0.0199 0.3083 0.1299 0.0068 0.0167 0.1892 0.1874 0.2754 0.4323 0.1054 0.0136 0.1844 0.711 0.0003 0.6163 0.2437 0.4716 0.0965 0.4141 0.0479 0.4161 0.103 0.5979 0.9933 0. 0.4038 0.9401 0.0663 0.3612 0.5223 0.0115 0.2678 0.1338 0.2674 0.1938 0.4945 0.5033 0.5198 0.41 0.8307 0.0145 0.1467 0.3288 0.0841 0.1441 0.0586 0.0251 0.1646 0.3913 0.1331 0.0811 0.2836 0.5048 0.4095 0. 0.3171 0.0087 0.0491 0.0505] 2022-08-25 00:10:49 [INFO] [EVAL] The model with the best validation mIoU (0.3205) was saved at iter 116000. 
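The run evaluates every 1000 iterations, and the closing line of each [EVAL] block shows the best validation mIoU has stayed at 0.3205 (iter 116000) through the evaluations above. Rather than scanning the blocks by eye, the evaluation history can be pulled out of the raw trainer log; the sketch below is a hedged helper, the path is a placeholder, and the regexes only cover the [TRAIN]/[EVAL] formats visible in this log:

    # Extract (iter, mIoU, Acc, Kappa, Dice) for every evaluation in the log.
    # LOG_PATH is hypothetical; point it at the actual trainer log file.
    import re

    LOG_PATH = "path/to/workerlog.0"

    train_re = re.compile(r"\[TRAIN\] epoch: \d+, iter: (\d+)/\d+")
    eval_re  = re.compile(r"mIoU: ([\d.]+) Acc: ([\d.]+) Kappa: ([\d.]+) Dice: ([\d.]+)")

    last_iter, history = None, []
    with open(LOG_PATH) as f:
        for line in f:
            if (m := train_re.search(line)):
                last_iter = int(m.group(1))   # iteration of the checkpoint evaluated next
            if (m := eval_re.search(line)) and last_iter is not None:
                history.append((last_iter, *map(float, m.groups())))

    best_iter = max(history, key=lambda row: row[1])[0]
    for it, miou, acc, kappa, dice in history:
        mark = "  <-- best" if it == best_iter else ""
        print(f"iter {it:6d}  mIoU {miou:.4f}  Acc {acc:.4f}  Dice {dice:.4f}{mark}")

Over the window shown here that table hovers between roughly 0.316 and 0.320 mIoU, which is easier to see in a short listing than in the per-class dumps it is buried in.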
2022-08-25 00:10:59 [INFO] [TRAIN] epoch: 96, iter: 120050/160000, loss: 0.6595, lr: 0.000302, batch_cost: 0.1939, reader_cost: 0.00357, ips: 41.2588 samples/sec | ETA 02:09:06 2022-08-25 00:11:09 [INFO] [TRAIN] epoch: 96, iter: 120100/160000, loss: 0.6688, lr: 0.000302, batch_cost: 0.1975, reader_cost: 0.00255, ips: 40.4975 samples/sec | ETA 02:11:21 2022-08-25 00:11:19 [INFO] [TRAIN] epoch: 96, iter: 120150/160000, loss: 0.6972, lr: 0.000302, batch_cost: 0.2081, reader_cost: 0.00081, ips: 38.4381 samples/sec | ETA 02:18:13 2022-08-25 00:11:30 [INFO] [TRAIN] epoch: 96, iter: 120200/160000, loss: 0.6813, lr: 0.000301, batch_cost: 0.2149, reader_cost: 0.00133, ips: 37.2207 samples/sec | ETA 02:22:34 2022-08-25 00:11:41 [INFO] [TRAIN] epoch: 96, iter: 120250/160000, loss: 0.6824, lr: 0.000301, batch_cost: 0.2168, reader_cost: 0.00088, ips: 36.8968 samples/sec | ETA 02:23:38 2022-08-25 00:11:51 [INFO] [TRAIN] epoch: 96, iter: 120300/160000, loss: 0.6897, lr: 0.000301, batch_cost: 0.2152, reader_cost: 0.00069, ips: 37.1804 samples/sec | ETA 02:22:22 2022-08-25 00:12:02 [INFO] [TRAIN] epoch: 96, iter: 120350/160000, loss: 0.7602, lr: 0.000300, batch_cost: 0.2167, reader_cost: 0.00036, ips: 36.9161 samples/sec | ETA 02:23:12 2022-08-25 00:12:12 [INFO] [TRAIN] epoch: 96, iter: 120400/160000, loss: 0.6786, lr: 0.000300, batch_cost: 0.1947, reader_cost: 0.01032, ips: 41.0953 samples/sec | ETA 02:08:28 2022-08-25 00:12:24 [INFO] [TRAIN] epoch: 96, iter: 120450/160000, loss: 0.7249, lr: 0.000299, batch_cost: 0.2357, reader_cost: 0.00036, ips: 33.9399 samples/sec | ETA 02:35:22 2022-08-25 00:12:36 [INFO] [TRAIN] epoch: 96, iter: 120500/160000, loss: 0.6764, lr: 0.000299, batch_cost: 0.2514, reader_cost: 0.01352, ips: 31.8281 samples/sec | ETA 02:45:28 2022-08-25 00:12:50 [INFO] [TRAIN] epoch: 96, iter: 120550/160000, loss: 0.7108, lr: 0.000299, batch_cost: 0.2769, reader_cost: 0.00065, ips: 28.8937 samples/sec | ETA 03:02:02 2022-08-25 00:13:02 [INFO] [TRAIN] epoch: 96, iter: 120600/160000, loss: 0.7378, lr: 0.000298, batch_cost: 0.2408, reader_cost: 0.01161, ips: 33.2268 samples/sec | ETA 02:38:06 2022-08-25 00:13:14 [INFO] [TRAIN] epoch: 96, iter: 120650/160000, loss: 0.6775, lr: 0.000298, batch_cost: 0.2456, reader_cost: 0.01620, ips: 32.5719 samples/sec | ETA 02:41:04 2022-08-25 00:13:26 [INFO] [TRAIN] epoch: 96, iter: 120700/160000, loss: 0.7350, lr: 0.000298, batch_cost: 0.2379, reader_cost: 0.02491, ips: 33.6283 samples/sec | ETA 02:35:49 2022-08-25 00:13:39 [INFO] [TRAIN] epoch: 96, iter: 120750/160000, loss: 0.6543, lr: 0.000297, batch_cost: 0.2585, reader_cost: 0.00095, ips: 30.9529 samples/sec | ETA 02:49:04 2022-08-25 00:13:51 [INFO] [TRAIN] epoch: 96, iter: 120800/160000, loss: 0.6770, lr: 0.000297, batch_cost: 0.2446, reader_cost: 0.00243, ips: 32.7080 samples/sec | ETA 02:39:47 2022-08-25 00:14:04 [INFO] [TRAIN] epoch: 96, iter: 120850/160000, loss: 0.7078, lr: 0.000296, batch_cost: 0.2483, reader_cost: 0.00242, ips: 32.2170 samples/sec | ETA 02:42:01 2022-08-25 00:14:16 [INFO] [TRAIN] epoch: 96, iter: 120900/160000, loss: 0.6566, lr: 0.000296, batch_cost: 0.2408, reader_cost: 0.00135, ips: 33.2243 samples/sec | ETA 02:36:54 2022-08-25 00:14:29 [INFO] [TRAIN] epoch: 96, iter: 120950/160000, loss: 0.7540, lr: 0.000296, batch_cost: 0.2532, reader_cost: 0.02776, ips: 31.5896 samples/sec | ETA 02:44:49 2022-08-25 00:14:42 [INFO] [TRAIN] epoch: 96, iter: 121000/160000, loss: 0.6737, lr: 0.000295, batch_cost: 0.2641, reader_cost: 0.00064, ips: 30.2897 samples/sec | ETA 02:51:40 2022-08-25 
00:14:42 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 153s - batch_cost: 0.1532 - reader cost: 0.0015 2022-08-25 00:17:15 [INFO] [EVAL] #Images: 2000 mIoU: 0.3191 Acc: 0.7479 Kappa: 0.7282 Dice: 0.4466 2022-08-25 00:17:15 [INFO] [EVAL] Class IoU: [0.6561 0.7674 0.9257 0.7074 0.6734 0.7401 0.758 0.7592 0.486 0.6166 0.466 0.5223 0.6626 0.2656 0.2129 0.3763 0.4801 0.4011 0.5502 0.3646 0.7219 0.4295 0.5518 0.4512 0.3315 0.365 0.3519 0.3938 0.3798 0.2832 0.224 0.4218 0.2355 0.2788 0.3202 0.3918 0.3672 0.5284 0.2603 0.2722 0.0785 0.067 0.3058 0.2239 0.3013 0.263 0.2502 0.433 0.6667 0.4681 0.471 0.2922 0.2213 0.2097 0.6443 0.4456 0.8469 0.3113 0.4964 0.1949 0.0628 0.1782 0.3162 0.1049 0.3668 0.6663 0.2076 0.3759 0.0681 0.3075 0.3778 0.4303 0.3409 0.2147 0.412 0.2989 0.33 0.2011 0.0796 0.2857 0.6533 0.3004 0.2617 0.0325 0.4643 0.4611 0.0798 0.0615 0.3496 0.4321 0.3845 0.0021 0.2127 0.0726 0.0004 0.0142 0.1822 0.1485 0.2098 0.3912 0.099 0.0283 0.1824 0.7477 0.0102 0.5324 0.2064 0.4599 0.0868 0.2744 0.0532 0.3244 0.0926 0.5998 0.5929 0.0068 0.3415 0.5685 0.0701 0.1846 0.495 0.0089 0.258 0.1461 0.2347 0.1891 0.4294 0.3037 0.3896 0.2631 0.5461 0.0167 0.0496 0.2524 0.07 0.1054 0.0672 0.0215 0.1189 0.3017 0.1085 0.1093 0.2745 0.4215 0.3051 0. 0.3004 0.0151 0.073 0.0513] 2022-08-25 00:17:15 [INFO] [EVAL] Class Precision: [0.7528 0.8302 0.9637 0.8141 0.7696 0.8763 0.8936 0.8313 0.6046 0.7194 0.6899 0.6346 0.7486 0.49 0.4807 0.5718 0.6082 0.6516 0.709 0.5794 0.8135 0.5889 0.7112 0.5772 0.5235 0.554 0.5364 0.6345 0.6077 0.4516 0.4066 0.5367 0.5455 0.4517 0.4501 0.5378 0.5718 0.7366 0.5017 0.4979 0.142 0.2168 0.4824 0.5393 0.4411 0.4415 0.4501 0.6949 0.749 0.5555 0.6257 0.3773 0.4006 0.621 0.7239 0.6759 0.8877 0.6274 0.6573 0.3717 0.1102 0.4501 0.4376 0.6437 0.4421 0.8112 0.3305 0.585 0.1571 0.7067 0.6217 0.6087 0.6013 0.2699 0.6152 0.4399 0.4983 0.5013 0.8656 0.5472 0.7696 0.5599 0.7226 0.1059 0.6486 0.6793 0.3228 0.4207 0.6863 0.6361 0.4983 0.0039 0.3455 0.2898 0.003 0.0478 0.7257 0.4087 0.4141 0.5612 0.526 0.0469 0.6734 0.8433 0.1392 0.7973 0.4565 0.8598 0.2018 0.5194 0.1957 0.6079 0.4452 0.718 0.5945 0.1122 0.6897 0.5839 0.1634 0.655 0.7399 0.2345 0.8268 0.6229 0.6666 0.6411 0.7087 0.4838 0.6126 0.4403 0.6268 0.5805 0.1847 0.7223 0.5911 0.2491 0.3163 0.2132 0.4718 0.6938 0.1906 0.3374 0.5675 0.7477 0.5377 0. 0.9021 0.283 0.5274 0.6494] 2022-08-25 00:17:15 [INFO] [EVAL] Class Recall: [0.8363 0.9103 0.9592 0.8436 0.8434 0.8264 0.8332 0.8974 0.7125 0.8118 0.5894 0.747 0.8523 0.367 0.2765 0.524 0.6951 0.5106 0.7106 0.4959 0.865 0.6134 0.7111 0.6738 0.4748 0.5169 0.5057 0.5093 0.5032 0.4316 0.3329 0.6635 0.2929 0.4213 0.526 0.5906 0.5064 0.6515 0.3512 0.3753 0.1495 0.0884 0.455 0.2769 0.4874 0.3942 0.3603 0.5347 0.8584 0.7484 0.6558 0.5646 0.3309 0.2405 0.8543 0.5667 0.9485 0.3818 0.6698 0.2908 0.1274 0.2278 0.5325 0.1114 0.6827 0.7886 0.3583 0.5125 0.1073 0.3524 0.4906 0.5949 0.4404 0.5124 0.555 0.4827 0.4943 0.2514 0.0806 0.3741 0.8121 0.3932 0.2909 0.0448 0.6202 0.5895 0.0958 0.0672 0.4161 0.5739 0.6275 0.0046 0.3562 0.0883 0.0005 0.0199 0.1957 0.1891 0.2984 0.5637 0.1087 0.0665 0.2001 0.8683 0.0109 0.6158 0.2737 0.4972 0.1321 0.3678 0.0681 0.4102 0.1047 0.7847 0.9956 0.0072 0.4035 0.9558 0.1092 0.2044 0.5992 0.0091 0.2728 0.1603 0.2659 0.2115 0.5214 0.4492 0.5169 0.3953 0.809 0.0169 0.0636 0.2796 0.0736 0.1545 0.0786 0.0234 0.1372 0.348 0.2013 0.1391 0.347 0.4913 0.4136 0. 
0.3105 0.0157 0.0781 0.0527] 2022-08-25 00:17:15 [INFO] [EVAL] The model with the best validation mIoU (0.3205) was saved at iter 116000. 2022-08-25 00:17:24 [INFO] [TRAIN] epoch: 96, iter: 121050/160000, loss: 0.7636, lr: 0.000295, batch_cost: 0.1798, reader_cost: 0.00423, ips: 44.4898 samples/sec | ETA 01:56:43 2022-08-25 00:17:34 [INFO] [TRAIN] epoch: 96, iter: 121100/160000, loss: 0.6784, lr: 0.000295, batch_cost: 0.1868, reader_cost: 0.00101, ips: 42.8332 samples/sec | ETA 02:01:05 2022-08-25 00:17:43 [INFO] [TRAIN] epoch: 96, iter: 121150/160000, loss: 0.6602, lr: 0.000294, batch_cost: 0.1926, reader_cost: 0.00057, ips: 41.5410 samples/sec | ETA 02:04:41 2022-08-25 00:17:53 [INFO] [TRAIN] epoch: 96, iter: 121200/160000, loss: 0.6885, lr: 0.000294, batch_cost: 0.1938, reader_cost: 0.00125, ips: 41.2852 samples/sec | ETA 02:05:18 2022-08-25 00:18:05 [INFO] [TRAIN] epoch: 97, iter: 121250/160000, loss: 0.6934, lr: 0.000293, batch_cost: 0.2342, reader_cost: 0.03332, ips: 34.1580 samples/sec | ETA 02:31:15 2022-08-25 00:18:16 [INFO] [TRAIN] epoch: 97, iter: 121300/160000, loss: 0.6979, lr: 0.000293, batch_cost: 0.2275, reader_cost: 0.00073, ips: 35.1586 samples/sec | ETA 02:26:45 2022-08-25 00:18:25 [INFO] [TRAIN] epoch: 97, iter: 121350/160000, loss: 0.7045, lr: 0.000293, batch_cost: 0.1849, reader_cost: 0.00712, ips: 43.2633 samples/sec | ETA 01:59:06 2022-08-25 00:18:35 [INFO] [TRAIN] epoch: 97, iter: 121400/160000, loss: 0.6933, lr: 0.000292, batch_cost: 0.1989, reader_cost: 0.00695, ips: 40.2162 samples/sec | ETA 02:07:58 2022-08-25 00:18:45 [INFO] [TRAIN] epoch: 97, iter: 121450/160000, loss: 0.6959, lr: 0.000292, batch_cost: 0.1861, reader_cost: 0.00324, ips: 42.9989 samples/sec | ETA 01:59:32 2022-08-25 00:18:55 [INFO] [TRAIN] epoch: 97, iter: 121500/160000, loss: 0.7416, lr: 0.000291, batch_cost: 0.1987, reader_cost: 0.00063, ips: 40.2613 samples/sec | ETA 02:07:30 2022-08-25 00:19:05 [INFO] [TRAIN] epoch: 97, iter: 121550/160000, loss: 0.7061, lr: 0.000291, batch_cost: 0.2031, reader_cost: 0.00057, ips: 39.3831 samples/sec | ETA 02:10:10 2022-08-25 00:19:17 [INFO] [TRAIN] epoch: 97, iter: 121600/160000, loss: 0.7324, lr: 0.000291, batch_cost: 0.2529, reader_cost: 0.00119, ips: 31.6274 samples/sec | ETA 02:41:53 2022-08-25 00:19:30 [INFO] [TRAIN] epoch: 97, iter: 121650/160000, loss: 0.7179, lr: 0.000290, batch_cost: 0.2498, reader_cost: 0.01876, ips: 32.0239 samples/sec | ETA 02:39:40 2022-08-25 00:19:43 [INFO] [TRAIN] epoch: 97, iter: 121700/160000, loss: 0.7579, lr: 0.000290, batch_cost: 0.2550, reader_cost: 0.00595, ips: 31.3758 samples/sec | ETA 02:42:45 2022-08-25 00:19:55 [INFO] [TRAIN] epoch: 97, iter: 121750/160000, loss: 0.7092, lr: 0.000290, batch_cost: 0.2535, reader_cost: 0.00090, ips: 31.5624 samples/sec | ETA 02:41:35 2022-08-25 00:20:08 [INFO] [TRAIN] epoch: 97, iter: 121800/160000, loss: 0.6442, lr: 0.000289, batch_cost: 0.2568, reader_cost: 0.00082, ips: 31.1487 samples/sec | ETA 02:43:30 2022-08-25 00:20:22 [INFO] [TRAIN] epoch: 97, iter: 121850/160000, loss: 0.6444, lr: 0.000289, batch_cost: 0.2826, reader_cost: 0.00098, ips: 28.3065 samples/sec | ETA 02:59:41 2022-08-25 00:20:35 [INFO] [TRAIN] epoch: 97, iter: 121900/160000, loss: 0.6862, lr: 0.000288, batch_cost: 0.2468, reader_cost: 0.00035, ips: 32.4119 samples/sec | ETA 02:36:43 2022-08-25 00:20:47 [INFO] [TRAIN] epoch: 97, iter: 121950/160000, loss: 0.7614, lr: 0.000288, batch_cost: 0.2459, reader_cost: 0.01042, ips: 32.5369 samples/sec | ETA 02:35:55 2022-08-25 00:21:01 [INFO] [TRAIN] epoch: 97, iter: 
122000/160000, loss: 0.7041, lr: 0.000288, batch_cost: 0.2774, reader_cost: 0.00059, ips: 28.8382 samples/sec | ETA 02:55:41 2022-08-25 00:21:01 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 171s - batch_cost: 0.1708 - reader cost: 9.6075e-04 2022-08-25 00:23:52 [INFO] [EVAL] #Images: 2000 mIoU: 0.3191 Acc: 0.7482 Kappa: 0.7287 Dice: 0.4465 2022-08-25 00:23:52 [INFO] [EVAL] Class IoU: [0.6552 0.7694 0.925 0.7063 0.6699 0.7449 0.7641 0.7507 0.4844 0.6157 0.4638 0.5264 0.6691 0.2975 0.2256 0.3738 0.4724 0.4124 0.5524 0.3714 0.7215 0.428 0.5684 0.4542 0.3206 0.3675 0.3557 0.3976 0.3774 0.2979 0.1951 0.414 0.2587 0.2749 0.3608 0.3943 0.3592 0.487 0.2419 0.245 0.0787 0.0717 0.3016 0.2177 0.2977 0.2576 0.2413 0.4244 0.6616 0.4743 0.481 0.2463 0.2252 0.178 0.63 0.4602 0.8422 0.293 0.4815 0.1912 0.0429 0.2199 0.2911 0.1379 0.3737 0.6616 0.2051 0.3785 0.051 0.3273 0.3538 0.4298 0.3659 0.2248 0.4184 0.2999 0.3505 0.2092 0.1347 0.3622 0.6581 0.3034 0.2348 0.0167 0.4959 0.458 0.0753 0.0658 0.3543 0.4566 0.3881 0.0093 0.2062 0.0616 0.0004 0.0124 0.1701 0.142 0.2042 0.3345 0.1026 0.019 0.1686 0.6452 0.0033 0.5025 0.201 0.4712 0.0725 0.3296 0.0528 0.3122 0.0961 0.5903 0.6863 0. 0.3305 0.5262 0.0394 0.2593 0.448 0.0088 0.2174 0.0721 0.2395 0.1728 0.4333 0.3211 0.4624 0.2543 0.5545 0.0146 0.1085 0.275 0.0841 0.1148 0.0731 0.0215 0.1283 0.3058 0.1196 0.1387 0.2534 0.4236 0.3173 0. 0.2736 0.0133 0.0706 0.0306] 2022-08-25 00:23:52 [INFO] [EVAL] Class Precision: [0.7578 0.8455 0.9595 0.8161 0.761 0.8462 0.889 0.8072 0.6256 0.7438 0.6683 0.6567 0.7706 0.4804 0.4677 0.538 0.6494 0.6447 0.7095 0.5493 0.8126 0.6097 0.7357 0.5961 0.5091 0.5531 0.5211 0.6744 0.6053 0.4646 0.4348 0.5113 0.4562 0.3975 0.4577 0.5677 0.6208 0.7165 0.4649 0.5476 0.1504 0.2708 0.4981 0.5382 0.3936 0.4256 0.3886 0.6571 0.7077 0.5677 0.6453 0.304 0.3895 0.5976 0.6782 0.6235 0.8914 0.6058 0.6791 0.3406 0.0866 0.4969 0.3941 0.5405 0.4554 0.7939 0.4312 0.5457 0.1691 0.6394 0.6264 0.61 0.6085 0.2735 0.628 0.4829 0.7159 0.5298 0.7189 0.6102 0.7721 0.5719 0.7515 0.0696 0.6547 0.6939 0.4841 0.3744 0.5129 0.6461 0.5305 0.0125 0.3416 0.2925 0.0035 0.0418 0.7055 0.55 0.4371 0.5441 0.4354 0.0378 0.6749 0.7228 0.0557 0.7152 0.5452 0.8728 0.1898 0.5528 0.2432 0.558 0.4279 0.74 0.6909 0.0001 0.7498 0.5483 0.126 0.6685 0.7645 0.5621 0.8226 0.6188 0.6395 0.6068 0.7436 0.447 0.5306 0.5566 0.6534 0.5861 0.3454 0.7053 0.6271 0.2257 0.3196 0.2425 0.458 0.71 0.2409 0.3252 0.544 0.7112 0.5953 0. 0.9249 0.2871 0.5163 0.5969] 2022-08-25 00:23:52 [INFO] [EVAL] Class Recall: [0.8287 0.8953 0.9626 0.84 0.8484 0.8616 0.8447 0.9147 0.6821 0.7814 0.6025 0.7262 0.8355 0.4387 0.3035 0.5504 0.6342 0.5337 0.7139 0.5341 0.8655 0.5895 0.7141 0.6561 0.4642 0.5228 0.5284 0.492 0.5006 0.4538 0.2615 0.6851 0.374 0.4713 0.6301 0.5634 0.4601 0.6031 0.3352 0.3072 0.1417 0.0888 0.4333 0.2677 0.5498 0.395 0.389 0.5452 0.9103 0.7424 0.6539 0.5645 0.348 0.2022 0.8987 0.6373 0.9386 0.362 0.6234 0.3037 0.0782 0.2828 0.5269 0.1562 0.6757 0.7989 0.2813 0.5526 0.068 0.4014 0.4485 0.5926 0.4786 0.5582 0.5563 0.4418 0.4071 0.2569 0.1422 0.4713 0.8167 0.3925 0.2545 0.0215 0.6715 0.5741 0.0819 0.0739 0.5339 0.6089 0.5912 0.0347 0.3423 0.0724 0.0005 0.0174 0.1831 0.1607 0.277 0.4648 0.1184 0.0369 0.1835 0.8572 0.0035 0.6282 0.2415 0.506 0.1049 0.4494 0.0631 0.4148 0.1103 0.7448 0.9903 0. 
0.3715 0.9288 0.0543 0.2975 0.5197 0.0088 0.2281 0.0754 0.2768 0.1946 0.5094 0.5329 0.7825 0.3189 0.7856 0.0147 0.1366 0.3107 0.0886 0.1895 0.0866 0.0231 0.1513 0.3494 0.1919 0.1948 0.3217 0.5116 0.4046 0. 0.2798 0.0138 0.0756 0.0312] 2022-08-25 00:23:52 [INFO] [EVAL] The model with the best validation mIoU (0.3205) was saved at iter 116000. 2022-08-25 00:24:00 [INFO] [TRAIN] epoch: 97, iter: 122050/160000, loss: 0.6784, lr: 0.000287, batch_cost: 0.1702, reader_cost: 0.00421, ips: 46.9904 samples/sec | ETA 01:47:40 2022-08-25 00:24:08 [INFO] [TRAIN] epoch: 97, iter: 122100/160000, loss: 0.6727, lr: 0.000287, batch_cost: 0.1549, reader_cost: 0.00075, ips: 51.6343 samples/sec | ETA 01:37:52 2022-08-25 00:24:16 [INFO] [TRAIN] epoch: 97, iter: 122150/160000, loss: 0.7250, lr: 0.000287, batch_cost: 0.1555, reader_cost: 0.00537, ips: 51.4415 samples/sec | ETA 01:38:06 2022-08-25 00:24:25 [INFO] [TRAIN] epoch: 97, iter: 122200/160000, loss: 0.6649, lr: 0.000286, batch_cost: 0.1724, reader_cost: 0.00035, ips: 46.4103 samples/sec | ETA 01:48:35 2022-08-25 00:24:33 [INFO] [TRAIN] epoch: 97, iter: 122250/160000, loss: 0.6924, lr: 0.000286, batch_cost: 0.1640, reader_cost: 0.00048, ips: 48.7785 samples/sec | ETA 01:43:11 2022-08-25 00:24:41 [INFO] [TRAIN] epoch: 97, iter: 122300/160000, loss: 0.6779, lr: 0.000285, batch_cost: 0.1707, reader_cost: 0.00074, ips: 46.8587 samples/sec | ETA 01:47:16 2022-08-25 00:24:50 [INFO] [TRAIN] epoch: 97, iter: 122350/160000, loss: 0.6321, lr: 0.000285, batch_cost: 0.1642, reader_cost: 0.01643, ips: 48.7191 samples/sec | ETA 01:43:02 2022-08-25 00:24:58 [INFO] [TRAIN] epoch: 97, iter: 122400/160000, loss: 0.6802, lr: 0.000285, batch_cost: 0.1757, reader_cost: 0.00123, ips: 45.5261 samples/sec | ETA 01:50:07 2022-08-25 00:25:09 [INFO] [TRAIN] epoch: 97, iter: 122450/160000, loss: 0.7546, lr: 0.000284, batch_cost: 0.2118, reader_cost: 0.00033, ips: 37.7666 samples/sec | ETA 02:12:34 2022-08-25 00:25:19 [INFO] [TRAIN] epoch: 97, iter: 122500/160000, loss: 0.6404, lr: 0.000284, batch_cost: 0.2007, reader_cost: 0.00097, ips: 39.8623 samples/sec | ETA 02:05:25 2022-08-25 00:25:32 [INFO] [TRAIN] epoch: 98, iter: 122550/160000, loss: 0.7042, lr: 0.000284, batch_cost: 0.2570, reader_cost: 0.03304, ips: 31.1324 samples/sec | ETA 02:40:23 2022-08-25 00:25:41 [INFO] [TRAIN] epoch: 98, iter: 122600/160000, loss: 0.6926, lr: 0.000283, batch_cost: 0.1890, reader_cost: 0.00682, ips: 42.3245 samples/sec | ETA 01:57:49 2022-08-25 00:25:52 [INFO] [TRAIN] epoch: 98, iter: 122650/160000, loss: 0.6880, lr: 0.000283, batch_cost: 0.2091, reader_cost: 0.01387, ips: 38.2619 samples/sec | ETA 02:10:09 2022-08-25 00:26:04 [INFO] [TRAIN] epoch: 98, iter: 122700/160000, loss: 0.7277, lr: 0.000282, batch_cost: 0.2429, reader_cost: 0.01823, ips: 32.9324 samples/sec | ETA 02:31:00 2022-08-25 00:26:16 [INFO] [TRAIN] epoch: 98, iter: 122750/160000, loss: 0.6543, lr: 0.000282, batch_cost: 0.2436, reader_cost: 0.03050, ips: 32.8342 samples/sec | ETA 02:31:15 2022-08-25 00:26:28 [INFO] [TRAIN] epoch: 98, iter: 122800/160000, loss: 0.7046, lr: 0.000282, batch_cost: 0.2381, reader_cost: 0.01146, ips: 33.6053 samples/sec | ETA 02:27:35 2022-08-25 00:26:40 [INFO] [TRAIN] epoch: 98, iter: 122850/160000, loss: 0.6660, lr: 0.000281, batch_cost: 0.2385, reader_cost: 0.00916, ips: 33.5416 samples/sec | ETA 02:27:40 2022-08-25 00:26:52 [INFO] [TRAIN] epoch: 98, iter: 122900/160000, loss: 0.6872, lr: 0.000281, batch_cost: 0.2471, reader_cost: 0.01848, ips: 32.3791 samples/sec | ETA 02:32:46 2022-08-25 00:27:06 
[INFO] [TRAIN] epoch: 98, iter: 122950/160000, loss: 0.7188, lr: 0.000281, batch_cost: 0.2668, reader_cost: 0.00150, ips: 29.9797 samples/sec | ETA 02:44:46 2022-08-25 00:27:18 [INFO] [TRAIN] epoch: 98, iter: 123000/160000, loss: 0.6961, lr: 0.000280, batch_cost: 0.2567, reader_cost: 0.01687, ips: 31.1707 samples/sec | ETA 02:38:16 2022-08-25 00:27:18 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 203s - batch_cost: 0.2028 - reader cost: 0.0011 2022-08-25 00:30:41 [INFO] [EVAL] #Images: 2000 mIoU: 0.3160 Acc: 0.7470 Kappa: 0.7275 Dice: 0.4424 2022-08-25 00:30:41 [INFO] [EVAL] Class IoU: [0.6553 0.7672 0.9251 0.704 0.6637 0.7461 0.7626 0.7645 0.494 0.6283 0.4618 0.5138 0.6616 0.2912 0.2204 0.3827 0.4877 0.3922 0.5472 0.3734 0.7251 0.437 0.5593 0.4565 0.3097 0.3656 0.345 0.394 0.3536 0.3123 0.2144 0.399 0.2649 0.2684 0.3474 0.3683 0.3614 0.49 0.2348 0.2541 0.0817 0.0725 0.297 0.2079 0.3025 0.2544 0.2343 0.4231 0.685 0.4683 0.469 0.2517 0.2215 0.1653 0.6216 0.4214 0.8368 0.3553 0.5058 0.2079 0.0432 0.1824 0.2729 0.1212 0.3878 0.6548 0.1975 0.374 0.0602 0.3092 0.3806 0.4404 0.374 0.2311 0.4037 0.2946 0.3501 0.2199 0.1432 0.4616 0.6677 0.2898 0.2569 0.0136 0.4629 0.4683 0.0858 0.0465 0.3333 0.4332 0.3291 0.003 0.1692 0.0688 0.0041 0.0152 0.1855 0.1256 0.2492 0.3759 0.0322 0.009 0.162 0.6405 0.0035 0.5288 0.2341 0.4704 0.0835 0.3355 0.0398 0.3063 0.084 0.5418 0.6352 0.0002 0.375 0.5599 0.0346 0.1188 0.427 0.0114 0.2215 0.1338 0.2542 0.2147 0.4214 0.2953 0.3906 0.2771 0.5541 0.0197 0.121 0.2433 0.0753 0.0862 0.0639 0.0206 0.109 0.3146 0.0859 0.0893 0.2708 0.2804 0.3235 0. 0.2879 0.0189 0.0734 0.0403] 2022-08-25 00:30:41 [INFO] [EVAL] Class Precision: [0.7611 0.8346 0.9603 0.8084 0.7525 0.8443 0.883 0.851 0.635 0.7664 0.6878 0.6639 0.7539 0.4476 0.4455 0.5665 0.642 0.6571 0.7295 0.5404 0.8151 0.601 0.7138 0.6036 0.4803 0.5405 0.4953 0.6991 0.6199 0.5246 0.3927 0.5307 0.4728 0.356 0.528 0.4942 0.6254 0.7137 0.5019 0.5502 0.1465 0.2333 0.4708 0.5922 0.4582 0.4536 0.3834 0.6713 0.7519 0.5617 0.6436 0.2979 0.4089 0.4963 0.7005 0.5558 0.8781 0.5862 0.7265 0.3324 0.0888 0.4294 0.3463 0.601 0.4939 0.8022 0.3469 0.5433 0.1808 0.6615 0.6321 0.6349 0.6325 0.2853 0.63 0.473 0.6003 0.5843 0.726 0.6139 0.8059 0.595 0.736 0.0685 0.6753 0.6925 0.3335 0.4764 0.4473 0.5984 0.3993 0.0041 0.256 0.2983 0.0175 0.0442 0.7903 0.5417 0.4352 0.5148 0.4846 0.022 0.6511 0.6951 0.0548 0.7618 0.5617 0.8748 0.2314 0.5407 0.3809 0.5371 0.3899 0.6194 0.6374 0.0065 0.7078 0.5771 0.1117 0.6202 0.8192 0.2859 0.8041 0.6774 0.5973 0.6321 0.7388 0.4453 0.5195 0.4794 0.6821 0.5007 0.4117 0.7821 0.6293 0.2625 0.3211 0.1636 0.4325 0.703 0.1747 0.1912 0.5677 0.7475 0.6194 0. 
0.8931 0.3225 0.4082 0.6873] 2022-08-25 00:30:41 [INFO] [EVAL] Class Recall: [0.8251 0.9048 0.9619 0.845 0.849 0.8652 0.8483 0.8826 0.6899 0.7772 0.5842 0.6944 0.8438 0.4545 0.3037 0.5412 0.6698 0.493 0.6865 0.5472 0.8678 0.6156 0.7211 0.652 0.4659 0.5305 0.5321 0.4745 0.4515 0.4355 0.3207 0.6165 0.376 0.5215 0.5039 0.5913 0.4612 0.6099 0.3062 0.3208 0.1558 0.0952 0.4459 0.2426 0.471 0.3669 0.3758 0.5336 0.8851 0.738 0.6336 0.6186 0.3259 0.1986 0.8466 0.6353 0.9468 0.4743 0.6247 0.3569 0.0777 0.2407 0.5628 0.1318 0.6437 0.7809 0.3144 0.5454 0.0828 0.3673 0.489 0.5898 0.4778 0.5489 0.5292 0.4385 0.4564 0.2607 0.1514 0.6505 0.7957 0.361 0.283 0.0167 0.5955 0.5913 0.1035 0.049 0.5665 0.6108 0.6516 0.0114 0.3328 0.0821 0.0053 0.0227 0.1951 0.1405 0.3684 0.5822 0.0333 0.0149 0.1774 0.8907 0.0037 0.6335 0.2864 0.5044 0.1155 0.4693 0.0425 0.4161 0.0967 0.8122 0.9946 0.0002 0.4437 0.9495 0.0477 0.1281 0.4715 0.0117 0.2341 0.1429 0.3068 0.2454 0.4952 0.467 0.6115 0.3964 0.7471 0.0201 0.1462 0.261 0.0788 0.1137 0.0738 0.023 0.1272 0.3628 0.1447 0.1435 0.3411 0.3098 0.4038 0. 0.2981 0.0197 0.0822 0.0411] 2022-08-25 00:30:42 [INFO] [EVAL] The model with the best validation mIoU (0.3205) was saved at iter 116000. 2022-08-25 00:30:49 [INFO] [TRAIN] epoch: 98, iter: 123050/160000, loss: 0.6583, lr: 0.000280, batch_cost: 0.1502, reader_cost: 0.00758, ips: 53.2795 samples/sec | ETA 01:32:28 2022-08-25 00:30:57 [INFO] [TRAIN] epoch: 98, iter: 123100/160000, loss: 0.7022, lr: 0.000279, batch_cost: 0.1680, reader_cost: 0.00216, ips: 47.6285 samples/sec | ETA 01:43:17 2022-08-25 00:31:05 [INFO] [TRAIN] epoch: 98, iter: 123150/160000, loss: 0.6852, lr: 0.000279, batch_cost: 0.1529, reader_cost: 0.00047, ips: 52.3212 samples/sec | ETA 01:33:54 2022-08-25 00:31:14 [INFO] [TRAIN] epoch: 98, iter: 123200/160000, loss: 0.7058, lr: 0.000279, batch_cost: 0.1725, reader_cost: 0.00070, ips: 46.3670 samples/sec | ETA 01:45:49 2022-08-25 00:31:23 [INFO] [TRAIN] epoch: 98, iter: 123250/160000, loss: 0.6681, lr: 0.000278, batch_cost: 0.1776, reader_cost: 0.00031, ips: 45.0419 samples/sec | ETA 01:48:47 2022-08-25 00:31:31 [INFO] [TRAIN] epoch: 98, iter: 123300/160000, loss: 0.6898, lr: 0.000278, batch_cost: 0.1725, reader_cost: 0.00057, ips: 46.3810 samples/sec | ETA 01:45:30 2022-08-25 00:31:41 [INFO] [TRAIN] epoch: 98, iter: 123350/160000, loss: 0.6644, lr: 0.000277, batch_cost: 0.2006, reader_cost: 0.00048, ips: 39.8860 samples/sec | ETA 02:02:30 2022-08-25 00:31:52 [INFO] [TRAIN] epoch: 98, iter: 123400/160000, loss: 0.7085, lr: 0.000277, batch_cost: 0.2084, reader_cost: 0.00086, ips: 38.3795 samples/sec | ETA 02:07:09 2022-08-25 00:32:04 [INFO] [TRAIN] epoch: 98, iter: 123450/160000, loss: 0.6768, lr: 0.000277, batch_cost: 0.2557, reader_cost: 0.00062, ips: 31.2888 samples/sec | ETA 02:35:45 2022-08-25 00:32:17 [INFO] [TRAIN] epoch: 98, iter: 123500/160000, loss: 0.6838, lr: 0.000276, batch_cost: 0.2451, reader_cost: 0.00116, ips: 32.6458 samples/sec | ETA 02:29:04 2022-08-25 00:32:29 [INFO] [TRAIN] epoch: 98, iter: 123550/160000, loss: 0.7056, lr: 0.000276, batch_cost: 0.2439, reader_cost: 0.00664, ips: 32.8011 samples/sec | ETA 02:28:09 2022-08-25 00:32:41 [INFO] [TRAIN] epoch: 98, iter: 123600/160000, loss: 0.6779, lr: 0.000276, batch_cost: 0.2320, reader_cost: 0.00639, ips: 34.4835 samples/sec | ETA 02:20:44 2022-08-25 00:32:52 [INFO] [TRAIN] epoch: 98, iter: 123650/160000, loss: 0.6774, lr: 0.000275, batch_cost: 0.2213, reader_cost: 0.01329, ips: 36.1487 samples/sec | ETA 02:14:04 2022-08-25 00:33:04 [INFO] 
[TRAIN] epoch: 98, iter: 123700/160000, loss: 0.6715, lr: 0.000275, batch_cost: 0.2509, reader_cost: 0.00566, ips: 31.8848 samples/sec | ETA 02:31:47 2022-08-25 00:33:15 [INFO] [TRAIN] epoch: 98, iter: 123750/160000, loss: 0.6514, lr: 0.000274, batch_cost: 0.2195, reader_cost: 0.00035, ips: 36.4475 samples/sec | ETA 02:12:36 2022-08-25 00:33:30 [INFO] [TRAIN] epoch: 99, iter: 123800/160000, loss: 0.6709, lr: 0.000274, batch_cost: 0.2931, reader_cost: 0.03755, ips: 27.2910 samples/sec | ETA 02:56:51 2022-08-25 00:33:42 [INFO] [TRAIN] epoch: 99, iter: 123850/160000, loss: 0.6846, lr: 0.000274, batch_cost: 0.2504, reader_cost: 0.00665, ips: 31.9526 samples/sec | ETA 02:30:50 2022-08-25 00:33:54 [INFO] [TRAIN] epoch: 99, iter: 123900/160000, loss: 0.6220, lr: 0.000273, batch_cost: 0.2405, reader_cost: 0.00051, ips: 33.2697 samples/sec | ETA 02:24:40 2022-08-25 00:34:06 [INFO] [TRAIN] epoch: 99, iter: 123950/160000, loss: 0.7277, lr: 0.000273, batch_cost: 0.2417, reader_cost: 0.00362, ips: 33.1032 samples/sec | ETA 02:25:12 2022-08-25 00:34:18 [INFO] [TRAIN] epoch: 99, iter: 124000/160000, loss: 0.7062, lr: 0.000273, batch_cost: 0.2248, reader_cost: 0.00160, ips: 35.5875 samples/sec | ETA 02:14:52 2022-08-25 00:34:18 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 173s - batch_cost: 0.1725 - reader cost: 0.0011 2022-08-25 00:37:10 [INFO] [EVAL] #Images: 2000 mIoU: 0.3169 Acc: 0.7461 Kappa: 0.7266 Dice: 0.4449 2022-08-25 00:37:10 [INFO] [EVAL] Class IoU: [0.6585 0.7653 0.9255 0.7058 0.6793 0.7444 0.7608 0.7595 0.4897 0.6085 0.4592 0.5237 0.659 0.2992 0.2092 0.3789 0.4799 0.4129 0.5499 0.3685 0.7146 0.4412 0.5532 0.4451 0.3095 0.3475 0.3531 0.405 0.3643 0.2852 0.2087 0.3914 0.2746 0.2738 0.3184 0.401 0.3638 0.4828 0.2436 0.2982 0.0748 0.0728 0.3023 0.2277 0.2996 0.2309 0.2253 0.4072 0.65 0.4991 0.4281 0.2102 0.2167 0.1556 0.6226 0.469 0.8533 0.3037 0.4192 0.1916 0.0535 0.1733 0.2897 0.0859 0.3566 0.6466 0.2138 0.3675 0.0836 0.3416 0.348 0.4155 0.3655 0.2231 0.4107 0.3048 0.3877 0.2015 0.219 0.4029 0.6338 0.3065 0.2519 0.0129 0.4981 0.474 0.0923 0.0546 0.3148 0.4379 0.3954 0.0018 0.2181 0.0922 0.0049 0.0033 0.1804 0.1146 0.2077 0.3245 0.1297 0.0098 0.1789 0.6002 0.0018 0.504 0.2096 0.4315 0.0981 0.3896 0.0468 0.292 0.1045 0.5582 0.6632 0. 0.3302 0.5458 0.0646 0.2893 0.4068 0.0085 0.2129 0.1127 0.2396 0.1897 0.4076 0.307 0.4052 0.2376 0.5473 0.0229 0.1007 0.2599 0.0911 0.0984 0.0612 0.0195 0.1146 0.3279 0.0905 0.0736 0.2622 0.4499 0.2882 0. 0.3022 0.0134 0.1006 0.053 ] 2022-08-25 00:37:10 [INFO] [EVAL] Class Precision: [0.7637 0.8365 0.9595 0.8134 0.7879 0.8656 0.8881 0.8339 0.6165 0.7471 0.67 0.65 0.7434 0.4724 0.4589 0.5305 0.6425 0.6335 0.7269 0.5415 0.7922 0.5878 0.7163 0.6186 0.4868 0.5722 0.5172 0.6758 0.6017 0.4383 0.3822 0.4978 0.458 0.4124 0.516 0.5681 0.6133 0.7788 0.4348 0.4875 0.1274 0.2069 0.5429 0.5019 0.4286 0.4422 0.334 0.5833 0.7107 0.6437 0.5015 0.2439 0.3779 0.5664 0.6934 0.66 0.9085 0.6345 0.8035 0.3082 0.1123 0.4518 0.4239 0.5918 0.4265 0.7749 0.3475 0.5104 0.1662 0.6734 0.5769 0.5569 0.6228 0.2872 0.6353 0.4629 0.5523 0.5106 0.816 0.6925 0.7304 0.5812 0.7458 0.0731 0.6381 0.6646 0.4016 0.434 0.4885 0.5953 0.5439 0.0025 0.3386 0.2919 0.0236 0.0179 0.5232 0.4476 0.3878 0.535 0.5152 0.0166 0.6949 0.6512 0.0477 0.7068 0.5808 0.8887 0.192 0.5707 0.4732 0.4947 0.3327 0.6828 0.6683 0. 
0.738 0.5667 0.1391 0.6897 0.8357 0.4297 0.7381 0.6193 0.5724 0.6046 0.768 0.4268 0.632 0.3547 0.6266 0.4591 0.4562 0.6753 0.623 0.2994 0.3051 0.1575 0.3391 0.6505 0.213 0.1907 0.5243 0.6405 0.4506 0. 0.8851 0.3676 0.4617 0.5815] 2022-08-25 00:37:10 [INFO] [EVAL] Class Recall: [0.827 0.8999 0.9631 0.8422 0.8314 0.8417 0.8415 0.8948 0.7043 0.7664 0.5935 0.7295 0.853 0.4494 0.2778 0.5701 0.6546 0.5424 0.6931 0.5355 0.8794 0.6388 0.7084 0.6135 0.4594 0.4695 0.5266 0.5027 0.48 0.4495 0.315 0.6467 0.4068 0.4491 0.454 0.5768 0.4721 0.5595 0.3566 0.4344 0.1535 0.1009 0.4054 0.2941 0.4989 0.3258 0.4093 0.5743 0.8838 0.6895 0.7452 0.6034 0.3368 0.1766 0.8592 0.6184 0.9335 0.3681 0.4671 0.3362 0.0927 0.2194 0.4779 0.0913 0.6851 0.7961 0.3572 0.5674 0.144 0.4095 0.4672 0.6207 0.4695 0.4998 0.5375 0.4715 0.5654 0.2498 0.2304 0.4906 0.8273 0.3934 0.2756 0.0155 0.6942 0.6231 0.1071 0.0588 0.4695 0.6235 0.5914 0.0057 0.38 0.1188 0.0062 0.004 0.216 0.1335 0.3089 0.4519 0.1478 0.0232 0.1941 0.8846 0.0018 0.6373 0.247 0.4562 0.167 0.5512 0.0493 0.4161 0.1323 0.7537 0.9886 0. 0.3741 0.9366 0.1076 0.3327 0.4422 0.0086 0.2303 0.1211 0.2918 0.2166 0.4649 0.5225 0.5304 0.4186 0.812 0.0235 0.1144 0.297 0.0964 0.1279 0.0711 0.0218 0.1476 0.398 0.1359 0.1071 0.344 0.6019 0.4443 0. 0.3145 0.0137 0.1139 0.0551] 2022-08-25 00:37:11 [INFO] [EVAL] The model with the best validation mIoU (0.3205) was saved at iter 116000. 2022-08-25 00:37:18 [INFO] [TRAIN] epoch: 99, iter: 124050/160000, loss: 0.6940, lr: 0.000272, batch_cost: 0.1490, reader_cost: 0.00743, ips: 53.6889 samples/sec | ETA 01:29:16 2022-08-25 00:37:29 [INFO] [TRAIN] epoch: 99, iter: 124100/160000, loss: 0.6835, lr: 0.000272, batch_cost: 0.2144, reader_cost: 0.00078, ips: 37.3050 samples/sec | ETA 02:08:18 2022-08-25 00:37:38 [INFO] [TRAIN] epoch: 99, iter: 124150/160000, loss: 0.7442, lr: 0.000271, batch_cost: 0.1891, reader_cost: 0.00103, ips: 42.3162 samples/sec | ETA 01:52:57 2022-08-25 00:37:48 [INFO] [TRAIN] epoch: 99, iter: 124200/160000, loss: 0.6927, lr: 0.000271, batch_cost: 0.1962, reader_cost: 0.00070, ips: 40.7742 samples/sec | ETA 01:57:04 2022-08-25 00:37:57 [INFO] [TRAIN] epoch: 99, iter: 124250/160000, loss: 0.6642, lr: 0.000271, batch_cost: 0.1695, reader_cost: 0.00066, ips: 47.2001 samples/sec | ETA 01:40:59 2022-08-25 00:38:05 [INFO] [TRAIN] epoch: 99, iter: 124300/160000, loss: 0.6326, lr: 0.000270, batch_cost: 0.1724, reader_cost: 0.00031, ips: 46.4141 samples/sec | ETA 01:42:33 2022-08-25 00:38:12 [INFO] [TRAIN] epoch: 99, iter: 124350/160000, loss: 0.6873, lr: 0.000270, batch_cost: 0.1455, reader_cost: 0.00070, ips: 54.9687 samples/sec | ETA 01:26:28 2022-08-25 00:38:21 [INFO] [TRAIN] epoch: 99, iter: 124400/160000, loss: 0.6668, lr: 0.000270, batch_cost: 0.1703, reader_cost: 0.00049, ips: 46.9848 samples/sec | ETA 01:41:01 2022-08-25 00:38:30 [INFO] [TRAIN] epoch: 99, iter: 124450/160000, loss: 0.7123, lr: 0.000269, batch_cost: 0.1764, reader_cost: 0.00065, ips: 45.3449 samples/sec | ETA 01:44:31 2022-08-25 00:38:40 [INFO] [TRAIN] epoch: 99, iter: 124500/160000, loss: 0.6949, lr: 0.000269, batch_cost: 0.2146, reader_cost: 0.00059, ips: 37.2730 samples/sec | ETA 02:06:59 2022-08-25 00:38:52 [INFO] [TRAIN] epoch: 99, iter: 124550/160000, loss: 0.6818, lr: 0.000268, batch_cost: 0.2269, reader_cost: 0.00468, ips: 35.2558 samples/sec | ETA 02:14:04 2022-08-25 00:39:03 [INFO] [TRAIN] epoch: 99, iter: 124600/160000, loss: 0.6807, lr: 0.000268, batch_cost: 0.2185, reader_cost: 0.00078, ips: 36.6116 samples/sec | ETA 02:08:55 2022-08-25 
00:39:15 [INFO] [TRAIN] epoch: 99, iter: 124650/160000, loss: 0.7301, lr: 0.000268, batch_cost: 0.2370, reader_cost: 0.00058, ips: 33.7619 samples/sec | ETA 02:19:36 2022-08-25 00:39:26 [INFO] [TRAIN] epoch: 99, iter: 124700/160000, loss: 0.6665, lr: 0.000267, batch_cost: 0.2259, reader_cost: 0.00629, ips: 35.4091 samples/sec | ETA 02:12:55 2022-08-25 00:39:39 [INFO] [TRAIN] epoch: 99, iter: 124750/160000, loss: 0.6795, lr: 0.000267, batch_cost: 0.2520, reader_cost: 0.00439, ips: 31.7430 samples/sec | ETA 02:28:03 2022-08-25 00:39:52 [INFO] [TRAIN] epoch: 99, iter: 124800/160000, loss: 0.6962, lr: 0.000267, batch_cost: 0.2648, reader_cost: 0.00092, ips: 30.2076 samples/sec | ETA 02:35:22 2022-08-25 00:40:04 [INFO] [TRAIN] epoch: 99, iter: 124850/160000, loss: 0.6825, lr: 0.000266, batch_cost: 0.2409, reader_cost: 0.00616, ips: 33.2086 samples/sec | ETA 02:21:07 2022-08-25 00:40:16 [INFO] [TRAIN] epoch: 99, iter: 124900/160000, loss: 0.7020, lr: 0.000266, batch_cost: 0.2506, reader_cost: 0.00043, ips: 31.9175 samples/sec | ETA 02:26:37 2022-08-25 00:40:29 [INFO] [TRAIN] epoch: 99, iter: 124950/160000, loss: 0.7128, lr: 0.000265, batch_cost: 0.2631, reader_cost: 0.00101, ips: 30.4051 samples/sec | ETA 02:33:42 2022-08-25 00:40:43 [INFO] [TRAIN] epoch: 99, iter: 125000/160000, loss: 0.6475, lr: 0.000265, batch_cost: 0.2607, reader_cost: 0.00033, ips: 30.6866 samples/sec | ETA 02:32:04 2022-08-25 00:40:43 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 180s - batch_cost: 0.1804 - reader cost: 0.0011 2022-08-25 00:43:43 [INFO] [EVAL] #Images: 2000 mIoU: 0.3191 Acc: 0.7476 Kappa: 0.7282 Dice: 0.4467 2022-08-25 00:43:43 [INFO] [EVAL] Class IoU: [0.6586 0.7662 0.9241 0.7093 0.6673 0.7418 0.7704 0.7632 0.4851 0.6194 0.4637 0.5324 0.6626 0.2806 0.2141 0.3788 0.4835 0.4118 0.5492 0.3703 0.7165 0.4352 0.5652 0.4571 0.3108 0.3499 0.3877 0.3969 0.3468 0.289 0.2188 0.4114 0.2696 0.2796 0.2988 0.3751 0.3564 0.4758 0.2396 0.2878 0.0809 0.0772 0.3035 0.2174 0.2993 0.252 0.2368 0.4052 0.669 0.4929 0.4614 0.2219 0.218 0.2301 0.6159 0.4524 0.8432 0.2752 0.5028 0.2126 0.0778 0.1692 0.2879 0.0901 0.4032 0.6551 0.2382 0.3776 0.0678 0.3375 0.344 0.4176 0.3663 0.2139 0.4117 0.3056 0.3882 0.2036 0.2846 0.2328 0.642 0.2899 0.2599 0.0497 0.4924 0.4676 0.0627 0.0596 0.3367 0.4388 0.3606 0.014 0.1615 0.0794 0.0033 0.0075 0.1718 0.1129 0.219 0.3457 0.1263 0.0277 0.1708 0.5905 0.0006 0.5194 0.2108 0.5066 0.0785 0.3984 0.0539 0.2311 0.077 0.5827 0.7429 0.0044 0.3292 0.5607 0.0741 0.2713 0.4546 0.0143 0.2033 0.1004 0.2554 0.1747 0.442 0.3137 0.4182 0.2664 0.5529 0.0227 0.1379 0.243 0.0797 0.0906 0.0723 0.0154 0.1084 0.308 0.0271 0.1132 0.2689 0.4305 0.3188 0. 
0.3092 0.0111 0.0782 0.0468] 2022-08-25 00:43:43 [INFO] [EVAL] Class Precision: [0.7642 0.84 0.9603 0.817 0.7512 0.8629 0.8859 0.8344 0.599 0.7335 0.6495 0.6691 0.7593 0.4976 0.47 0.5488 0.6317 0.6438 0.7408 0.5558 0.8021 0.6248 0.7603 0.5914 0.4971 0.4777 0.5415 0.6579 0.6384 0.445 0.4021 0.5422 0.4137 0.4141 0.4798 0.5012 0.6186 0.7275 0.4681 0.5031 0.143 0.2083 0.4951 0.514 0.4028 0.4322 0.3696 0.5994 0.719 0.6121 0.5877 0.2552 0.3985 0.6364 0.6922 0.6262 0.8842 0.6483 0.7027 0.4299 0.1297 0.3734 0.4324 0.5985 0.5212 0.8126 0.4073 0.5781 0.2412 0.6144 0.5955 0.5951 0.6028 0.2926 0.6206 0.5363 0.6124 0.5113 0.7328 0.5775 0.7484 0.6362 0.7273 0.1668 0.6553 0.6626 0.4836 0.42 0.4941 0.6076 0.4552 0.0194 0.3192 0.2865 0.019 0.0288 0.5888 0.5695 0.4572 0.5189 0.487 0.0505 0.7391 0.6622 0.0181 0.7137 0.592 0.8316 0.1816 0.6002 0.2865 0.342 0.3764 0.7026 0.7521 0.107 0.7124 0.5894 0.1517 0.7874 0.7663 0.2171 0.6216 0.6503 0.6099 0.7004 0.8233 0.4765 0.5301 0.5094 0.6774 0.309 0.4346 0.728 0.6322 0.236 0.3079 0.1104 0.4183 0.7066 0.0605 0.232 0.5681 0.6061 0.5577 0. 0.9149 0.3104 0.573 0.8027] 2022-08-25 00:43:43 [INFO] [EVAL] Class Recall: [0.8267 0.8971 0.9608 0.8432 0.8567 0.841 0.8552 0.8994 0.7183 0.7993 0.6184 0.7226 0.8388 0.3915 0.2823 0.5501 0.6733 0.5333 0.6798 0.5259 0.8703 0.5892 0.6878 0.6681 0.4534 0.5666 0.5772 0.5001 0.4316 0.452 0.3244 0.6303 0.4362 0.4628 0.4421 0.5985 0.4568 0.579 0.3292 0.4022 0.1572 0.1093 0.4395 0.2736 0.538 0.3768 0.3972 0.5557 0.9058 0.7167 0.6823 0.63 0.3249 0.2649 0.8481 0.6199 0.9478 0.3235 0.6386 0.2961 0.163 0.2363 0.4628 0.0959 0.6403 0.7717 0.3647 0.5211 0.0863 0.4281 0.4489 0.5834 0.4829 0.4431 0.5502 0.4154 0.5147 0.2527 0.3176 0.2806 0.8187 0.3475 0.288 0.0662 0.6645 0.6137 0.0672 0.065 0.5139 0.6123 0.6344 0.0478 0.2463 0.099 0.0039 0.01 0.1952 0.1234 0.2959 0.5088 0.1457 0.0579 0.1817 0.845 0.0006 0.6561 0.2466 0.5646 0.1214 0.5423 0.0622 0.4161 0.0883 0.7734 0.9838 0.0046 0.3797 0.9201 0.1267 0.2927 0.5277 0.0151 0.2321 0.1062 0.3052 0.1888 0.4883 0.4786 0.6645 0.3584 0.7505 0.0239 0.168 0.2672 0.0836 0.1283 0.0863 0.0176 0.1276 0.3531 0.0468 0.181 0.338 0.5977 0.4266 0. 0.3184 0.0114 0.083 0.0474] 2022-08-25 00:43:43 [INFO] [EVAL] The model with the best validation mIoU (0.3205) was saved at iter 116000. 
2022-08-25 00:43:53 [INFO] [TRAIN] epoch: 100, iter: 125050/160000, loss: 0.6577, lr: 0.000265, batch_cost: 0.1906, reader_cost: 0.02519, ips: 41.9830 samples/sec | ETA 01:50:59 2022-08-25 00:44:02 [INFO] [TRAIN] epoch: 100, iter: 125100/160000, loss: 0.7108, lr: 0.000264, batch_cost: 0.1771, reader_cost: 0.00075, ips: 45.1642 samples/sec | ETA 01:43:01 2022-08-25 00:44:10 [INFO] [TRAIN] epoch: 100, iter: 125150/160000, loss: 0.7001, lr: 0.000264, batch_cost: 0.1735, reader_cost: 0.00062, ips: 46.1188 samples/sec | ETA 01:40:45 2022-08-25 00:44:18 [INFO] [TRAIN] epoch: 100, iter: 125200/160000, loss: 0.6837, lr: 0.000263, batch_cost: 0.1631, reader_cost: 0.00062, ips: 49.0427 samples/sec | ETA 01:34:36 2022-08-25 00:44:28 [INFO] [TRAIN] epoch: 100, iter: 125250/160000, loss: 0.6753, lr: 0.000263, batch_cost: 0.1874, reader_cost: 0.00035, ips: 42.6896 samples/sec | ETA 01:48:32 2022-08-25 00:44:35 [INFO] [TRAIN] epoch: 100, iter: 125300/160000, loss: 0.7278, lr: 0.000263, batch_cost: 0.1483, reader_cost: 0.00055, ips: 53.9475 samples/sec | ETA 01:25:45 2022-08-25 00:44:45 [INFO] [TRAIN] epoch: 100, iter: 125350/160000, loss: 0.7168, lr: 0.000262, batch_cost: 0.1913, reader_cost: 0.00040, ips: 41.8138 samples/sec | ETA 01:50:29 2022-08-25 00:44:55 [INFO] [TRAIN] epoch: 100, iter: 125400/160000, loss: 0.6844, lr: 0.000262, batch_cost: 0.2101, reader_cost: 0.00087, ips: 38.0782 samples/sec | ETA 02:01:09 2022-08-25 00:45:05 [INFO] [TRAIN] epoch: 100, iter: 125450/160000, loss: 0.7076, lr: 0.000262, batch_cost: 0.1968, reader_cost: 0.00088, ips: 40.6533 samples/sec | ETA 01:53:18 2022-08-25 00:45:16 [INFO] [TRAIN] epoch: 100, iter: 125500/160000, loss: 0.7365, lr: 0.000261, batch_cost: 0.2252, reader_cost: 0.00035, ips: 35.5257 samples/sec | ETA 02:09:29 2022-08-25 00:45:29 [INFO] [TRAIN] epoch: 100, iter: 125550/160000, loss: 0.7073, lr: 0.000261, batch_cost: 0.2462, reader_cost: 0.00072, ips: 32.4953 samples/sec | ETA 02:21:21 2022-08-25 00:45:41 [INFO] [TRAIN] epoch: 100, iter: 125600/160000, loss: 0.7318, lr: 0.000260, batch_cost: 0.2468, reader_cost: 0.00163, ips: 32.4207 samples/sec | ETA 02:21:28 2022-08-25 00:45:52 [INFO] [TRAIN] epoch: 100, iter: 125650/160000, loss: 0.6477, lr: 0.000260, batch_cost: 0.2251, reader_cost: 0.00788, ips: 35.5404 samples/sec | ETA 02:08:52 2022-08-25 00:46:04 [INFO] [TRAIN] epoch: 100, iter: 125700/160000, loss: 0.7225, lr: 0.000260, batch_cost: 0.2391, reader_cost: 0.02060, ips: 33.4546 samples/sec | ETA 02:16:42 2022-08-25 00:46:16 [INFO] [TRAIN] epoch: 100, iter: 125750/160000, loss: 0.6629, lr: 0.000259, batch_cost: 0.2394, reader_cost: 0.00084, ips: 33.4192 samples/sec | ETA 02:16:38 2022-08-25 00:46:29 [INFO] [TRAIN] epoch: 100, iter: 125800/160000, loss: 0.6990, lr: 0.000259, batch_cost: 0.2499, reader_cost: 0.00087, ips: 32.0116 samples/sec | ETA 02:22:26 2022-08-25 00:46:42 [INFO] [TRAIN] epoch: 100, iter: 125850/160000, loss: 0.6599, lr: 0.000259, batch_cost: 0.2711, reader_cost: 0.00060, ips: 29.5146 samples/sec | ETA 02:34:16 2022-08-25 00:46:54 [INFO] [TRAIN] epoch: 100, iter: 125900/160000, loss: 0.6711, lr: 0.000258, batch_cost: 0.2351, reader_cost: 0.00095, ips: 34.0247 samples/sec | ETA 02:13:37 2022-08-25 00:47:07 [INFO] [TRAIN] epoch: 100, iter: 125950/160000, loss: 0.7053, lr: 0.000258, batch_cost: 0.2499, reader_cost: 0.02063, ips: 32.0103 samples/sec | ETA 02:21:49 2022-08-25 00:47:20 [INFO] [TRAIN] epoch: 100, iter: 126000/160000, loss: 0.6649, lr: 0.000257, batch_cost: 0.2620, reader_cost: 0.00280, ips: 30.5319 samples/sec | ETA 
02:28:28 2022-08-25 00:47:20 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 200s - batch_cost: 0.2002 - reader cost: 0.0011 2022-08-25 00:50:40 [INFO] [EVAL] #Images: 2000 mIoU: 0.3206 Acc: 0.7493 Kappa: 0.7299 Dice: 0.4480 2022-08-25 00:50:40 [INFO] [EVAL] Class IoU: [0.6586 0.7686 0.9251 0.7063 0.6661 0.7442 0.7662 0.7542 0.4893 0.6254 0.4657 0.5236 0.6647 0.2739 0.2152 0.3836 0.4967 0.4083 0.5507 0.3751 0.7312 0.422 0.5589 0.4532 0.3267 0.3496 0.3799 0.4143 0.3473 0.3196 0.2192 0.4043 0.2514 0.2892 0.3724 0.389 0.3653 0.462 0.2553 0.2793 0.0841 0.076 0.2953 0.2331 0.3025 0.2465 0.2361 0.4258 0.6636 0.468 0.463 0.2558 0.2174 0.2055 0.619 0.4508 0.8612 0.3005 0.5159 0.2 0.0732 0.1601 0.3025 0.1433 0.3858 0.6589 0.244 0.3666 0.0449 0.343 0.3661 0.4229 0.3659 0.2476 0.4219 0.3065 0.3749 0.2074 0.1435 0.2776 0.6546 0.3096 0.2302 0.0275 0.5142 0.4775 0.095 0.065 0.3322 0.4387 0.433 0.0052 0.2247 0.0676 0.0277 0.0043 0.1772 0.1182 0.1982 0.3425 0.1382 0.0168 0.2344 0.6817 0.0025 0.5467 0.2251 0.3773 0.0933 0.4038 0.0455 0.3277 0.0922 0.5187 0.7386 0.0008 0.3753 0.6049 0.0537 0.2437 0.386 0.0056 0.2337 0.0985 0.2274 0.2024 0.4182 0.3163 0.3667 0.2965 0.5341 0.0142 0.1144 0.2384 0.0688 0.0931 0.0727 0.0177 0.1182 0.3326 0.0089 0.0702 0.2553 0.4319 0.2612 0. 0.2682 0.0104 0.0703 0.0588] 2022-08-25 00:50:40 [INFO] [EVAL] Class Precision: [0.761 0.8375 0.9603 0.8055 0.7534 0.8683 0.8789 0.8223 0.6181 0.7265 0.696 0.6684 0.7566 0.5077 0.4749 0.5344 0.6382 0.6177 0.708 0.5487 0.8327 0.5731 0.7239 0.5915 0.5385 0.5689 0.5213 0.6774 0.6297 0.5292 0.3965 0.5216 0.4826 0.4471 0.5265 0.5268 0.5842 0.7477 0.5124 0.5073 0.1364 0.2243 0.4413 0.492 0.4238 0.4213 0.4005 0.6252 0.7308 0.5553 0.6541 0.3225 0.4366 0.6909 0.7085 0.5937 0.9173 0.6285 0.7059 0.3612 0.1454 0.3176 0.4233 0.5726 0.477 0.7975 0.4133 0.565 0.2053 0.7076 0.6061 0.6224 0.6204 0.3208 0.6418 0.5435 0.5706 0.5606 0.6834 0.607 0.7466 0.5776 0.772 0.0836 0.6357 0.6653 0.3064 0.3948 0.5446 0.6348 0.6009 0.0095 0.4366 0.295 0.1153 0.0206 0.6532 0.499 0.4549 0.5367 0.4508 0.0302 0.6585 0.7807 0.0346 0.8907 0.5411 0.873 0.2628 0.583 0.2837 0.6308 0.376 0.711 0.7461 0.0336 0.7632 0.6285 0.1443 0.6701 0.7958 0.1566 0.8443 0.6282 0.6506 0.6508 0.7454 0.445 0.5041 0.5464 0.6081 0.3238 0.3002 0.6967 0.7158 0.2805 0.305 0.1765 0.3382 0.6476 0.0372 0.1416 0.5115 0.6558 0.3647 0. 0.9172 0.3544 0.4347 0.7713] 2022-08-25 00:50:40 [INFO] [EVAL] Class Recall: [0.8304 0.9034 0.9619 0.8516 0.8518 0.8389 0.8566 0.9011 0.7013 0.8181 0.5847 0.7074 0.8455 0.3728 0.2824 0.5761 0.6914 0.5464 0.7125 0.5425 0.8571 0.6155 0.7103 0.6597 0.4538 0.4756 0.5835 0.5161 0.4364 0.4465 0.3289 0.6425 0.3441 0.4502 0.56 0.5979 0.4936 0.5474 0.3372 0.3833 0.1801 0.103 0.4717 0.307 0.514 0.3727 0.3651 0.5718 0.8782 0.7485 0.6132 0.5529 0.3022 0.2264 0.8306 0.6521 0.9337 0.3655 0.6572 0.3094 0.1284 0.244 0.5146 0.1604 0.6687 0.7913 0.3734 0.5107 0.0543 0.3997 0.4803 0.5689 0.4714 0.5203 0.5518 0.4127 0.5223 0.2476 0.1537 0.3383 0.8415 0.4002 0.247 0.0394 0.7289 0.6285 0.1211 0.0721 0.4599 0.5869 0.6078 0.0116 0.3164 0.0806 0.0352 0.0053 0.1956 0.1341 0.26 0.4862 0.1662 0.0363 0.2668 0.8431 0.0027 0.586 0.2782 0.3992 0.1263 0.5678 0.0514 0.4055 0.1088 0.6574 0.9865 0.0008 0.4247 0.9416 0.0789 0.277 0.4284 0.0058 0.2442 0.1047 0.2591 0.2271 0.4879 0.5223 0.5737 0.3933 0.8145 0.0146 0.1559 0.266 0.0707 0.1223 0.0871 0.0193 0.1539 0.4061 0.0115 0.1221 0.3376 0.5586 0.4795 0. 
0.2748 0.0106 0.0774 0.0599] 2022-08-25 00:50:40 [INFO] [EVAL] The model with the best validation mIoU (0.3206) was saved at iter 126000. 2022-08-25 00:50:48 [INFO] [TRAIN] epoch: 100, iter: 126050/160000, loss: 0.6923, lr: 0.000257, batch_cost: 0.1631, reader_cost: 0.00291, ips: 49.0578 samples/sec | ETA 01:32:16 2022-08-25 00:50:57 [INFO] [TRAIN] epoch: 100, iter: 126100/160000, loss: 0.7010, lr: 0.000257, batch_cost: 0.1636, reader_cost: 0.00123, ips: 48.9081 samples/sec | ETA 01:32:25 2022-08-25 00:51:05 [INFO] [TRAIN] epoch: 100, iter: 126150/160000, loss: 0.6900, lr: 0.000256, batch_cost: 0.1787, reader_cost: 0.00073, ips: 44.7768 samples/sec | ETA 01:40:47 2022-08-25 00:51:15 [INFO] [TRAIN] epoch: 100, iter: 126200/160000, loss: 0.6789, lr: 0.000256, batch_cost: 0.1897, reader_cost: 0.00043, ips: 42.1696 samples/sec | ETA 01:46:52 2022-08-25 00:51:24 [INFO] [TRAIN] epoch: 100, iter: 126250/160000, loss: 0.7147, lr: 0.000256, batch_cost: 0.1794, reader_cost: 0.00288, ips: 44.5813 samples/sec | ETA 01:40:56 2022-08-25 00:51:33 [INFO] [TRAIN] epoch: 100, iter: 126300/160000, loss: 0.6535, lr: 0.000255, batch_cost: 0.1808, reader_cost: 0.00042, ips: 44.2497 samples/sec | ETA 01:41:32 2022-08-25 00:51:46 [INFO] [TRAIN] epoch: 101, iter: 126350/160000, loss: 0.6583, lr: 0.000255, batch_cost: 0.2577, reader_cost: 0.03840, ips: 31.0436 samples/sec | ETA 02:24:31 2022-08-25 00:51:58 [INFO] [TRAIN] epoch: 101, iter: 126400/160000, loss: 0.6595, lr: 0.000254, batch_cost: 0.2376, reader_cost: 0.00987, ips: 33.6649 samples/sec | ETA 02:13:04 2022-08-25 00:52:10 [INFO] [TRAIN] epoch: 101, iter: 126450/160000, loss: 0.7135, lr: 0.000254, batch_cost: 0.2494, reader_cost: 0.00086, ips: 32.0752 samples/sec | ETA 02:19:27 2022-08-25 00:52:22 [INFO] [TRAIN] epoch: 101, iter: 126500/160000, loss: 0.6552, lr: 0.000254, batch_cost: 0.2413, reader_cost: 0.02156, ips: 33.1496 samples/sec | ETA 02:14:44 2022-08-25 00:52:35 [INFO] [TRAIN] epoch: 101, iter: 126550/160000, loss: 0.6831, lr: 0.000253, batch_cost: 0.2508, reader_cost: 0.00078, ips: 31.9005 samples/sec | ETA 02:19:48 2022-08-25 00:52:47 [INFO] [TRAIN] epoch: 101, iter: 126600/160000, loss: 0.6670, lr: 0.000253, batch_cost: 0.2465, reader_cost: 0.01049, ips: 32.4559 samples/sec | ETA 02:17:12 2022-08-25 00:52:59 [INFO] [TRAIN] epoch: 101, iter: 126650/160000, loss: 0.6740, lr: 0.000252, batch_cost: 0.2432, reader_cost: 0.00412, ips: 32.8888 samples/sec | ETA 02:15:12 2022-08-25 00:53:12 [INFO] [TRAIN] epoch: 101, iter: 126700/160000, loss: 0.7063, lr: 0.000252, batch_cost: 0.2467, reader_cost: 0.00131, ips: 32.4248 samples/sec | ETA 02:16:55 2022-08-25 00:53:25 [INFO] [TRAIN] epoch: 101, iter: 126750/160000, loss: 0.7137, lr: 0.000252, batch_cost: 0.2707, reader_cost: 0.00180, ips: 29.5515 samples/sec | ETA 02:30:01 2022-08-25 00:53:37 [INFO] [TRAIN] epoch: 101, iter: 126800/160000, loss: 0.7093, lr: 0.000251, batch_cost: 0.2335, reader_cost: 0.00049, ips: 34.2584 samples/sec | ETA 02:09:12 2022-08-25 00:53:49 [INFO] [TRAIN] epoch: 101, iter: 126850/160000, loss: 0.6677, lr: 0.000251, batch_cost: 0.2340, reader_cost: 0.00077, ips: 34.1942 samples/sec | ETA 02:09:15 2022-08-25 00:54:00 [INFO] [TRAIN] epoch: 101, iter: 126900/160000, loss: 0.6658, lr: 0.000251, batch_cost: 0.2318, reader_cost: 0.00057, ips: 34.5133 samples/sec | ETA 02:07:52 2022-08-25 00:54:13 [INFO] [TRAIN] epoch: 101, iter: 126950/160000, loss: 0.6951, lr: 0.000250, batch_cost: 0.2473, reader_cost: 0.00260, ips: 32.3450 samples/sec | ETA 02:16:14 2022-08-25 00:54:25 [INFO] [TRAIN] 
epoch: 101, iter: 127000/160000, loss: 0.6690, lr: 0.000250, batch_cost: 0.2472, reader_cost: 0.00083, ips: 32.3621 samples/sec | ETA 02:15:57 2022-08-25 00:54:25 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 180s - batch_cost: 0.1796 - reader cost: 9.5204e-04 2022-08-25 00:57:25 [INFO] [EVAL] #Images: 2000 mIoU: 0.3222 Acc: 0.7500 Kappa: 0.7306 Dice: 0.4500 2022-08-25 00:57:25 [INFO] [EVAL] Class IoU: [0.6594 0.7709 0.9256 0.707 0.6755 0.7425 0.7721 0.7602 0.4882 0.6198 0.4626 0.5237 0.6618 0.3045 0.2285 0.3807 0.4975 0.4261 0.5501 0.3607 0.7171 0.4154 0.5671 0.4524 0.3233 0.3492 0.3793 0.4084 0.3655 0.3167 0.2124 0.408 0.2795 0.2787 0.3219 0.3891 0.3524 0.4723 0.2459 0.2745 0.0717 0.0672 0.3076 0.2266 0.2988 0.2405 0.241 0.4108 0.6757 0.5019 0.4397 0.2534 0.2286 0.2016 0.607 0.4545 0.8328 0.312 0.4966 0.2043 0.0618 0.1555 0.3025 0.1 0.3932 0.6565 0.239 0.3827 0.0664 0.3351 0.3465 0.4196 0.3618 0.227 0.4059 0.3161 0.4167 0.2164 0.1166 0.4728 0.6303 0.301 0.2661 0.0371 0.4929 0.4702 0.0871 0.0565 0.3436 0.4512 0.3711 0.0092 0.2003 0.0905 0.0133 0.0087 0.17 0.1155 0.203 0.3269 0.104 0.0163 0.1705 0.5858 0.0001 0.522 0.2117 0.483 0.0864 0.3776 0.0472 0.3508 0.098 0.5947 0.7335 0.0014 0.3838 0.5792 0.058 0.286 0.3765 0.0078 0.2239 0.0779 0.2571 0.1998 0.4162 0.306 0.3892 0.2834 0.5501 0.0201 0.1375 0.2479 0.1023 0.0942 0.057 0.0118 0.1084 0.3198 0.0581 0.0691 0.2683 0.5112 0.3195 0. 0.2992 0.0212 0.0822 0.0698] 2022-08-25 00:57:25 [INFO] [EVAL] Class Precision: [0.7628 0.8307 0.9592 0.8107 0.7719 0.8677 0.876 0.8432 0.6138 0.7521 0.6381 0.6797 0.7621 0.4978 0.4639 0.5391 0.6555 0.6387 0.7224 0.5828 0.7952 0.598 0.7323 0.609 0.4951 0.7061 0.5357 0.6421 0.6035 0.4983 0.3978 0.5331 0.4639 0.429 0.496 0.5519 0.6283 0.7491 0.5578 0.5173 0.1174 0.2178 0.5243 0.5153 0.4324 0.4075 0.3877 0.5728 0.7338 0.6274 0.5754 0.2921 0.4432 0.6723 0.6908 0.6105 0.8647 0.6173 0.6655 0.3896 0.1017 0.3093 0.4392 0.6528 0.5091 0.8075 0.3871 0.5618 0.1994 0.6505 0.5716 0.5947 0.6228 0.3064 0.5948 0.5472 0.587 0.4817 0.6879 0.6521 0.718 0.5555 0.7276 0.1305 0.6612 0.6495 0.3342 0.4164 0.5434 0.6335 0.4744 0.0268 0.3276 0.3202 0.0592 0.0355 0.4896 0.4744 0.4075 0.5155 0.5339 0.0365 0.7751 0.6386 0.0011 0.66 0.6605 0.8873 0.2273 0.5963 0.2034 0.7231 0.3233 0.7184 0.7414 0.0367 0.6974 0.5976 0.1086 0.7643 0.7216 0.2947 0.6142 0.6709 0.6122 0.6223 0.7678 0.4691 0.5354 0.4933 0.6488 0.4404 0.3858 0.7123 0.5765 0.2606 0.3776 0.302 0.381 0.6726 0.1388 0.207 0.468 0.7189 0.5265 0. 
0.9107 0.1997 0.478 0.7329] 2022-08-25 00:57:25 [INFO] [EVAL] Class Recall: [0.8295 0.9145 0.9635 0.8468 0.844 0.8372 0.8668 0.8853 0.7046 0.7789 0.6272 0.6953 0.8341 0.4395 0.3105 0.5644 0.6737 0.5615 0.6976 0.4862 0.8796 0.5764 0.7154 0.6376 0.4823 0.4086 0.5651 0.5288 0.481 0.465 0.313 0.6349 0.4129 0.4431 0.4783 0.5687 0.4452 0.5611 0.3054 0.3689 0.1557 0.0885 0.4266 0.2879 0.4917 0.3699 0.3891 0.5923 0.8952 0.7151 0.6509 0.6566 0.3208 0.2235 0.8335 0.6401 0.9576 0.3869 0.6617 0.3006 0.1364 0.2383 0.4929 0.1056 0.6333 0.7783 0.3845 0.5456 0.0905 0.4086 0.468 0.5876 0.4632 0.4669 0.5611 0.428 0.5897 0.282 0.1231 0.6324 0.8376 0.3965 0.2956 0.0492 0.6594 0.6301 0.1054 0.0613 0.4832 0.6106 0.6303 0.0139 0.3401 0.1121 0.0169 0.0114 0.2066 0.1324 0.2881 0.4719 0.1144 0.0286 0.1794 0.8763 0.0001 0.714 0.2376 0.5145 0.1223 0.5073 0.0579 0.4053 0.1234 0.7755 0.9856 0.0015 0.4605 0.9496 0.1106 0.3137 0.4406 0.008 0.2605 0.081 0.3071 0.2273 0.4762 0.4682 0.5877 0.3998 0.7833 0.0206 0.176 0.2754 0.1106 0.1285 0.0629 0.0121 0.1315 0.3788 0.0909 0.0939 0.3861 0.6388 0.4483 0. 0.3082 0.0231 0.0903 0.0717] 2022-08-25 00:57:25 [INFO] [EVAL] The model with the best validation mIoU (0.3222) was saved at iter 127000. 2022-08-25 00:57:35 [INFO] [TRAIN] epoch: 101, iter: 127050/160000, loss: 0.7036, lr: 0.000249, batch_cost: 0.1980, reader_cost: 0.00443, ips: 40.3977 samples/sec | ETA 01:48:45 2022-08-25 00:57:45 [INFO] [TRAIN] epoch: 101, iter: 127100/160000, loss: 0.7158, lr: 0.000249, batch_cost: 0.2090, reader_cost: 0.00128, ips: 38.2709 samples/sec | ETA 01:54:37 2022-08-25 00:57:55 [INFO] [TRAIN] epoch: 101, iter: 127150/160000, loss: 0.6884, lr: 0.000249, batch_cost: 0.1974, reader_cost: 0.00067, ips: 40.5370 samples/sec | ETA 01:48:02 2022-08-25 00:58:04 [INFO] [TRAIN] epoch: 101, iter: 127200/160000, loss: 0.7317, lr: 0.000248, batch_cost: 0.1762, reader_cost: 0.00130, ips: 45.3985 samples/sec | ETA 01:36:19 2022-08-25 00:58:13 [INFO] [TRAIN] epoch: 101, iter: 127250/160000, loss: 0.7055, lr: 0.000248, batch_cost: 0.1869, reader_cost: 0.00033, ips: 42.8119 samples/sec | ETA 01:41:59 2022-08-25 00:58:23 [INFO] [TRAIN] epoch: 101, iter: 127300/160000, loss: 0.6639, lr: 0.000248, batch_cost: 0.1948, reader_cost: 0.00570, ips: 41.0594 samples/sec | ETA 01:46:11 2022-08-25 00:58:35 [INFO] [TRAIN] epoch: 101, iter: 127350/160000, loss: 0.6672, lr: 0.000247, batch_cost: 0.2468, reader_cost: 0.00077, ips: 32.4116 samples/sec | ETA 02:14:18 2022-08-25 00:58:47 [INFO] [TRAIN] epoch: 101, iter: 127400/160000, loss: 0.6902, lr: 0.000247, batch_cost: 0.2248, reader_cost: 0.03313, ips: 35.5875 samples/sec | ETA 02:02:08 2022-08-25 00:58:58 [INFO] [TRAIN] epoch: 101, iter: 127450/160000, loss: 0.7230, lr: 0.000246, batch_cost: 0.2371, reader_cost: 0.01725, ips: 33.7410 samples/sec | ETA 02:08:37 2022-08-25 00:59:11 [INFO] [TRAIN] epoch: 101, iter: 127500/160000, loss: 0.6697, lr: 0.000246, batch_cost: 0.2558, reader_cost: 0.00041, ips: 31.2747 samples/sec | ETA 02:18:33 2022-08-25 00:59:24 [INFO] [TRAIN] epoch: 101, iter: 127550/160000, loss: 0.6985, lr: 0.000246, batch_cost: 0.2554, reader_cost: 0.00125, ips: 31.3213 samples/sec | ETA 02:18:08 2022-08-25 00:59:39 [INFO] [TRAIN] epoch: 102, iter: 127600/160000, loss: 0.6580, lr: 0.000245, batch_cost: 0.2984, reader_cost: 0.04155, ips: 26.8061 samples/sec | ETA 02:41:09 2022-08-25 00:59:50 [INFO] [TRAIN] epoch: 102, iter: 127650/160000, loss: 0.7072, lr: 0.000245, batch_cost: 0.2285, reader_cost: 0.00302, ips: 35.0102 samples/sec | ETA 02:03:12 2022-08-25 
01:00:02 [INFO] [TRAIN] epoch: 102, iter: 127700/160000, loss: 0.6639, lr: 0.000245, batch_cost: 0.2279, reader_cost: 0.02027, ips: 35.1062 samples/sec | ETA 02:02:40 2022-08-25 01:00:14 [INFO] [TRAIN] epoch: 102, iter: 127750/160000, loss: 0.6582, lr: 0.000244, batch_cost: 0.2494, reader_cost: 0.00121, ips: 32.0721 samples/sec | ETA 02:14:04 2022-08-25 01:00:27 [INFO] [TRAIN] epoch: 102, iter: 127800/160000, loss: 0.7272, lr: 0.000244, batch_cost: 0.2548, reader_cost: 0.00262, ips: 31.3938 samples/sec | ETA 02:16:45 2022-08-25 01:00:40 [INFO] [TRAIN] epoch: 102, iter: 127850/160000, loss: 0.6897, lr: 0.000243, batch_cost: 0.2605, reader_cost: 0.00313, ips: 30.7160 samples/sec | ETA 02:19:33 2022-08-25 01:00:52 [INFO] [TRAIN] epoch: 102, iter: 127900/160000, loss: 0.6812, lr: 0.000243, batch_cost: 0.2419, reader_cost: 0.00040, ips: 33.0739 samples/sec | ETA 02:09:24 2022-08-25 01:01:04 [INFO] [TRAIN] epoch: 102, iter: 127950/160000, loss: 0.6756, lr: 0.000243, batch_cost: 0.2426, reader_cost: 0.01911, ips: 32.9763 samples/sec | ETA 02:09:35 2022-08-25 01:01:15 [INFO] [TRAIN] epoch: 102, iter: 128000/160000, loss: 0.6695, lr: 0.000242, batch_cost: 0.2160, reader_cost: 0.00059, ips: 37.0409 samples/sec | ETA 01:55:11 2022-08-25 01:01:15 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 156s - batch_cost: 0.1558 - reader cost: 9.3303e-04 2022-08-25 01:03:51 [INFO] [EVAL] #Images: 2000 mIoU: 0.3197 Acc: 0.7472 Kappa: 0.7279 Dice: 0.4466 2022-08-25 01:03:51 [INFO] [EVAL] Class IoU: [0.659 0.7681 0.9248 0.7044 0.6756 0.7431 0.7632 0.7618 0.4756 0.6043 0.4587 0.5211 0.6599 0.3063 0.2034 0.3829 0.4862 0.4217 0.5522 0.3674 0.7307 0.424 0.5635 0.4515 0.3322 0.3824 0.3594 0.3932 0.3635 0.2733 0.2051 0.387 0.2647 0.2847 0.319 0.3815 0.3636 0.4857 0.2531 0.2722 0.089 0.0757 0.3053 0.236 0.3135 0.2275 0.2354 0.4242 0.6726 0.4826 0.4536 0.2394 0.2165 0.2145 0.5755 0.4612 0.8529 0.3137 0.4927 0.2087 0.0386 0.1669 0.2998 0.109 0.3835 0.6691 0.2099 0.3589 0.0868 0.3381 0.3574 0.4173 0.3642 0.2285 0.4132 0.2938 0.4168 0.2062 0.107 0.1889 0.6413 0.287 0.2921 0.0139 0.4621 0.4792 0.0867 0.0679 0.3274 0.4147 0.3752 0.0118 0.2002 0.0876 0.0128 0.0093 0.1524 0.1416 0.1124 0.3227 0.1356 0.0268 0.2116 0.7018 0.0015 0.5217 0.1965 0.4986 0.1067 0.3901 0.0473 0.3015 0.0825 0.6053 0.7002 0.0001 0.376 0.5974 0.0612 0.2915 0.4391 0.0103 0.2069 0.1004 0.2573 0.1694 0.4357 0.2889 0.4627 0.2771 0.5342 0.0231 0.0816 0.2525 0.0799 0.111 0.0715 0.0202 0.1055 0.3395 0.1124 0.0512 0.2646 0.4993 0.3072 0. 
0.2931 0.0064 0.0816 0.0406] 2022-08-25 01:03:51 [INFO] [EVAL] Class Precision: [0.7651 0.8495 0.9597 0.8013 0.7701 0.8721 0.8733 0.8434 0.5791 0.7298 0.674 0.6872 0.7402 0.4766 0.4747 0.5567 0.6418 0.6524 0.7097 0.5287 0.822 0.6277 0.7268 0.6006 0.5246 0.504 0.5353 0.6186 0.6339 0.4237 0.3943 0.4998 0.4492 0.3995 0.4719 0.5076 0.608 0.7334 0.5188 0.5162 0.1558 0.2199 0.5395 0.518 0.4406 0.4401 0.3919 0.652 0.7578 0.5884 0.5754 0.2734 0.3851 0.5103 0.6412 0.6057 0.895 0.6209 0.6326 0.4513 0.0797 0.4146 0.3948 0.6243 0.487 0.8437 0.3412 0.5744 0.231 0.5907 0.6158 0.5622 0.6034 0.2913 0.6303 0.4455 0.617 0.5224 0.6431 0.612 0.7466 0.6057 0.7033 0.0622 0.6215 0.6733 0.4883 0.3895 0.6297 0.546 0.4844 0.0161 0.3289 0.304 0.051 0.0334 0.6253 0.3964 0.5503 0.5601 0.5857 0.0474 0.6761 0.8286 0.0293 0.6983 0.5114 0.8821 0.2174 0.6219 0.2564 0.5281 0.3083 0.7245 0.7065 0.0029 0.6803 0.6172 0.1486 0.7355 0.8097 0.475 0.7392 0.6477 0.5738 0.6637 0.7873 0.4228 0.5897 0.5168 0.5877 0.4628 0.4423 0.7027 0.6602 0.2886 0.3184 0.1517 0.3719 0.6577 0.3201 0.125 0.4862 0.6615 0.4863 0. 0.9191 0.3224 0.4445 0.7446] 2022-08-25 01:03:51 [INFO] [EVAL] Class Recall: [0.8261 0.8891 0.9621 0.8535 0.8462 0.834 0.8583 0.8873 0.7269 0.7784 0.5895 0.6831 0.8587 0.4616 0.2625 0.5509 0.6672 0.5439 0.7133 0.5464 0.8681 0.5665 0.715 0.6452 0.4753 0.6131 0.5223 0.519 0.4601 0.4349 0.2995 0.6317 0.3919 0.4978 0.496 0.6058 0.475 0.5899 0.3307 0.3654 0.1718 0.1034 0.4129 0.3024 0.5208 0.3202 0.371 0.5484 0.8568 0.7285 0.6819 0.6579 0.3309 0.2702 0.8489 0.6591 0.9477 0.388 0.6902 0.2796 0.0697 0.2184 0.5548 0.1167 0.6434 0.7638 0.353 0.4889 0.122 0.4414 0.4599 0.6183 0.4788 0.5145 0.5453 0.4632 0.5623 0.2542 0.1138 0.2145 0.8197 0.353 0.3332 0.0175 0.643 0.6244 0.0954 0.076 0.4055 0.6328 0.6247 0.0428 0.3384 0.1096 0.0167 0.0127 0.1677 0.1806 0.1238 0.4323 0.1499 0.058 0.2354 0.8209 0.0015 0.6735 0.2419 0.5342 0.1731 0.5114 0.0548 0.4127 0.1013 0.7863 0.9874 0.0001 0.4567 0.949 0.0942 0.3256 0.4896 0.0104 0.2231 0.1062 0.3181 0.1853 0.4938 0.4772 0.6824 0.3739 0.8545 0.0237 0.091 0.2827 0.0833 0.1528 0.0844 0.0228 0.1284 0.4123 0.1476 0.0798 0.3674 0.6706 0.4548 0. 0.3009 0.0065 0.0909 0.0411] 2022-08-25 01:03:51 [INFO] [EVAL] The model with the best validation mIoU (0.3222) was saved at iter 127000. 
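Note on the TRAIN lines: the ips and ETA fields follow directly from batch_cost. The global batch size is not printed, but ips × batch_cost is consistently ≈ 8 in these logs, so ips ≈ 8 / batch_cost and ETA ≈ (160000 − iter) × batch_cost rendered as HH:MM:SS. A small sketch under that inferred batch size (taken from the printed numbers, not from the config file):

import datetime

TOTAL_ITERS = 160000
BATCH_SIZE = 8   # inferred: ips * batch_cost is ~8 throughout this log

def ips(batch_cost):
    """Samples per second for one logged step."""
    return BATCH_SIZE / batch_cost

def eta(cur_iter, batch_cost):
    """Remaining wall-clock time at the current per-iteration cost."""
    remaining_s = (TOTAL_ITERS - cur_iter) * batch_cost
    return str(datetime.timedelta(seconds=int(remaining_s)))

# Spot check against the first TRAIN line after this evaluation
# (iter 128050, batch_cost 0.1798, printed ips 44.4869, printed ETA 01:35:45):
print(round(ips(0.1798), 2))   # ~44.49; the small gap to 44.4869 is rounding of the printed batch_cost
print(eta(128050, 0.1798))     # 1:35:44, matching the printed ETA up to rounding (the log zero-pads hours)

This also explains why the ETA swings between roughly 1.5 h and 2.5 h from line to line: it is extrapolated from the most recent 50-iteration batch_cost, which itself varies with reader_cost spikes.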
2022-08-25 01:04:00 [INFO] [TRAIN] epoch: 102, iter: 128050/160000, loss: 0.7177, lr: 0.000242, batch_cost: 0.1798, reader_cost: 0.00379, ips: 44.4869 samples/sec | ETA 01:35:45 2022-08-25 01:04:11 [INFO] [TRAIN] epoch: 102, iter: 128100/160000, loss: 0.7244, lr: 0.000242, batch_cost: 0.2139, reader_cost: 0.00128, ips: 37.3981 samples/sec | ETA 01:53:43 2022-08-25 01:04:22 [INFO] [TRAIN] epoch: 102, iter: 128150/160000, loss: 0.6652, lr: 0.000241, batch_cost: 0.2142, reader_cost: 0.00036, ips: 37.3463 samples/sec | ETA 01:53:42 2022-08-25 01:04:31 [INFO] [TRAIN] epoch: 102, iter: 128200/160000, loss: 0.7279, lr: 0.000241, batch_cost: 0.1905, reader_cost: 0.00097, ips: 42.0012 samples/sec | ETA 01:40:56 2022-08-25 01:04:41 [INFO] [TRAIN] epoch: 102, iter: 128250/160000, loss: 0.6562, lr: 0.000240, batch_cost: 0.2024, reader_cost: 0.00057, ips: 39.5229 samples/sec | ETA 01:47:06 2022-08-25 01:04:51 [INFO] [TRAIN] epoch: 102, iter: 128300/160000, loss: 0.6619, lr: 0.000240, batch_cost: 0.2012, reader_cost: 0.00035, ips: 39.7558 samples/sec | ETA 01:46:18 2022-08-25 01:05:02 [INFO] [TRAIN] epoch: 102, iter: 128350/160000, loss: 0.6924, lr: 0.000240, batch_cost: 0.2229, reader_cost: 0.00120, ips: 35.8967 samples/sec | ETA 01:57:33 2022-08-25 01:05:14 [INFO] [TRAIN] epoch: 102, iter: 128400/160000, loss: 0.7073, lr: 0.000239, batch_cost: 0.2349, reader_cost: 0.01837, ips: 34.0533 samples/sec | ETA 02:03:43 2022-08-25 01:05:26 [INFO] [TRAIN] epoch: 102, iter: 128450/160000, loss: 0.6794, lr: 0.000239, batch_cost: 0.2327, reader_cost: 0.00950, ips: 34.3852 samples/sec | ETA 02:02:20 2022-08-25 01:05:38 [INFO] [TRAIN] epoch: 102, iter: 128500/160000, loss: 0.7276, lr: 0.000238, batch_cost: 0.2354, reader_cost: 0.00259, ips: 33.9895 samples/sec | ETA 02:03:34 2022-08-25 01:05:50 [INFO] [TRAIN] epoch: 102, iter: 128550/160000, loss: 0.6646, lr: 0.000238, batch_cost: 0.2410, reader_cost: 0.00266, ips: 33.1925 samples/sec | ETA 02:06:20 2022-08-25 01:06:01 [INFO] [TRAIN] epoch: 102, iter: 128600/160000, loss: 0.7107, lr: 0.000238, batch_cost: 0.2335, reader_cost: 0.00658, ips: 34.2557 samples/sec | ETA 02:02:13 2022-08-25 01:06:14 [INFO] [TRAIN] epoch: 102, iter: 128650/160000, loss: 0.7148, lr: 0.000237, batch_cost: 0.2447, reader_cost: 0.00101, ips: 32.6990 samples/sec | ETA 02:07:49 2022-08-25 01:06:26 [INFO] [TRAIN] epoch: 102, iter: 128700/160000, loss: 0.6583, lr: 0.000237, batch_cost: 0.2478, reader_cost: 0.00092, ips: 32.2823 samples/sec | ETA 02:09:16 2022-08-25 01:06:38 [INFO] [TRAIN] epoch: 102, iter: 128750/160000, loss: 0.6961, lr: 0.000237, batch_cost: 0.2388, reader_cost: 0.00273, ips: 33.4952 samples/sec | ETA 02:04:23 2022-08-25 01:06:51 [INFO] [TRAIN] epoch: 102, iter: 128800/160000, loss: 0.6531, lr: 0.000236, batch_cost: 0.2626, reader_cost: 0.00052, ips: 30.4695 samples/sec | ETA 02:16:31 2022-08-25 01:07:06 [INFO] [TRAIN] epoch: 103, iter: 128850/160000, loss: 0.7550, lr: 0.000236, batch_cost: 0.2902, reader_cost: 0.05700, ips: 27.5649 samples/sec | ETA 02:30:40 2022-08-25 01:07:19 [INFO] [TRAIN] epoch: 103, iter: 128900/160000, loss: 0.7003, lr: 0.000235, batch_cost: 0.2628, reader_cost: 0.00273, ips: 30.4414 samples/sec | ETA 02:16:13 2022-08-25 01:07:31 [INFO] [TRAIN] epoch: 103, iter: 128950/160000, loss: 0.6856, lr: 0.000235, batch_cost: 0.2384, reader_cost: 0.00326, ips: 33.5550 samples/sec | ETA 02:03:22 2022-08-25 01:07:42 [INFO] [TRAIN] epoch: 103, iter: 129000/160000, loss: 0.7124, lr: 0.000235, batch_cost: 0.2378, reader_cost: 0.00314, ips: 33.6437 samples/sec | ETA 
02:02:51 2022-08-25 01:07:42 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 156s - batch_cost: 0.1557 - reader cost: 8.6182e-04 2022-08-25 01:10:18 [INFO] [EVAL] #Images: 2000 mIoU: 0.3207 Acc: 0.7494 Kappa: 0.7299 Dice: 0.4480 2022-08-25 01:10:18 [INFO] [EVAL] Class IoU: [0.6593 0.7663 0.9249 0.7033 0.6708 0.7486 0.7536 0.7599 0.4864 0.6219 0.4555 0.5279 0.6665 0.306 0.2187 0.3861 0.4943 0.4055 0.5479 0.3715 0.7272 0.4515 0.5452 0.458 0.3348 0.354 0.3621 0.39 0.3522 0.3182 0.1864 0.388 0.2633 0.2858 0.3384 0.3895 0.3584 0.5087 0.2619 0.2618 0.0753 0.0648 0.3066 0.2092 0.2977 0.2469 0.262 0.4183 0.651 0.5022 0.464 0.2668 0.2148 0.1999 0.5711 0.4263 0.856 0.2596 0.463 0.198 0.0513 0.1809 0.3041 0.1187 0.3702 0.6694 0.2431 0.3852 0.0858 0.3392 0.3451 0.4305 0.3655 0.2344 0.4093 0.3056 0.4059 0.2021 0.1867 0.2522 0.6601 0.2683 0.2724 0.0344 0.4384 0.4717 0.0834 0.0668 0.3678 0.438 0.3943 0.0022 0.1821 0.0764 0.002 0.008 0.1625 0.1337 0.1813 0.2915 0.0854 0.0149 0.1796 0.6914 0.0244 0.526 0.1891 0.4684 0.1006 0.3792 0.0498 0.3395 0.0917 0.5484 0.7533 0. 0.414 0.5444 0.0405 0.2965 0.4316 0.0103 0.1849 0.125 0.2498 0.136 0.4227 0.2958 0.5061 0.2752 0.5472 0.0146 0.0517 0.2479 0.0849 0.1028 0.0725 0.021 0.1132 0.3278 0.106 0.0862 0.2966 0.4997 0.3385 0. 0.3097 0.0111 0.074 0.0466] 2022-08-25 01:10:18 [INFO] [EVAL] Class Precision: [0.7619 0.8286 0.9627 0.7912 0.7667 0.8641 0.8903 0.8389 0.6091 0.7312 0.6603 0.6358 0.7768 0.5006 0.4939 0.5739 0.6587 0.6756 0.7224 0.5482 0.8152 0.6065 0.6824 0.5615 0.4812 0.6127 0.5335 0.6884 0.645 0.541 0.4254 0.5269 0.4755 0.3938 0.5104 0.4876 0.6189 0.7431 0.505 0.5311 0.1461 0.178 0.544 0.6066 0.433 0.4208 0.4705 0.6181 0.6967 0.6371 0.6058 0.3347 0.4546 0.6589 0.6435 0.5873 0.9076 0.6416 0.6748 0.3739 0.1124 0.413 0.4783 0.5584 0.4574 0.8269 0.3771 0.5511 0.2689 0.6198 0.5391 0.5639 0.6343 0.3036 0.6326 0.4969 0.6189 0.5113 0.7647 0.6037 0.7805 0.5718 0.7291 0.1831 0.5266 0.6753 0.3854 0.4627 0.5969 0.6256 0.5211 0.0032 0.3171 0.2928 0.0131 0.0311 0.6317 0.605 0.4907 0.5347 0.6347 0.0314 0.6647 0.807 0.2671 0.7259 0.6333 0.9438 0.2288 0.6024 0.2638 0.6487 0.3096 0.7593 0.7604 0.0004 0.6485 0.5591 0.1046 0.7381 0.7207 0.4235 0.7955 0.6057 0.6461 0.5605 0.7019 0.4383 0.6475 0.4745 0.6668 0.4877 0.2683 0.7107 0.6368 0.2642 0.3233 0.1213 0.3056 0.6935 0.2379 0.1497 0.473 0.6645 0.56 0. 0.8865 0.2853 0.3082 0.8044] 2022-08-25 01:10:18 [INFO] [EVAL] Class Recall: [0.8304 0.9106 0.9593 0.8635 0.8428 0.8484 0.8308 0.8896 0.7071 0.8062 0.5949 0.7567 0.8243 0.4404 0.2819 0.5413 0.6645 0.5035 0.694 0.5354 0.8708 0.6386 0.7306 0.7131 0.5238 0.4561 0.5298 0.4736 0.4369 0.4358 0.2491 0.5953 0.3711 0.5102 0.5009 0.6594 0.46 0.6172 0.3523 0.3405 0.1346 0.0924 0.4127 0.242 0.488 0.3741 0.3716 0.564 0.9084 0.7034 0.6646 0.5683 0.2894 0.2229 0.8354 0.6086 0.9377 0.3036 0.596 0.2961 0.0862 0.2434 0.4551 0.131 0.6601 0.7785 0.4061 0.5614 0.1119 0.4282 0.4895 0.6453 0.463 0.5072 0.537 0.4425 0.5411 0.2505 0.1981 0.3022 0.8107 0.3358 0.3031 0.0406 0.7235 0.6101 0.0962 0.0724 0.4894 0.5935 0.6184 0.0064 0.2995 0.0937 0.0023 0.0107 0.1794 0.1465 0.2233 0.3906 0.0898 0.0275 0.1975 0.8283 0.0262 0.6563 0.2123 0.4819 0.1522 0.5058 0.0578 0.416 0.1152 0.6638 0.9877 0. 0.5338 0.9539 0.0621 0.3313 0.5183 0.0105 0.1942 0.1361 0.2894 0.1522 0.5152 0.4765 0.6987 0.3958 0.7531 0.0148 0.0602 0.2757 0.0892 0.1441 0.0855 0.0247 0.1524 0.3833 0.1605 0.1689 0.4429 0.6682 0.4612 0. 
0.3225 0.0114 0.0887 0.0471] 2022-08-25 01:10:18 [INFO] [EVAL] The model with the best validation mIoU (0.3222) was saved at iter 127000. 2022-08-25 01:10:28 [INFO] [TRAIN] epoch: 103, iter: 129050/160000, loss: 0.7110, lr: 0.000234, batch_cost: 0.1868, reader_cost: 0.00327, ips: 42.8230 samples/sec | ETA 01:36:21 2022-08-25 01:10:37 [INFO] [TRAIN] epoch: 103, iter: 129100/160000, loss: 0.6667, lr: 0.000234, batch_cost: 0.1830, reader_cost: 0.00074, ips: 43.7206 samples/sec | ETA 01:34:14 2022-08-25 01:10:47 [INFO] [TRAIN] epoch: 103, iter: 129150/160000, loss: 0.7071, lr: 0.000234, batch_cost: 0.2079, reader_cost: 0.00057, ips: 38.4865 samples/sec | ETA 01:46:52 2022-08-25 01:10:58 [INFO] [TRAIN] epoch: 103, iter: 129200/160000, loss: 0.6623, lr: 0.000233, batch_cost: 0.2064, reader_cost: 0.00068, ips: 38.7685 samples/sec | ETA 01:45:55 2022-08-25 01:11:10 [INFO] [TRAIN] epoch: 103, iter: 129250/160000, loss: 0.6284, lr: 0.000233, batch_cost: 0.2402, reader_cost: 0.00106, ips: 33.3085 samples/sec | ETA 02:03:05 2022-08-25 01:11:23 [INFO] [TRAIN] epoch: 103, iter: 129300/160000, loss: 0.6941, lr: 0.000232, batch_cost: 0.2745, reader_cost: 0.00069, ips: 29.1491 samples/sec | ETA 02:20:25 2022-08-25 01:11:35 [INFO] [TRAIN] epoch: 103, iter: 129350/160000, loss: 0.6264, lr: 0.000232, batch_cost: 0.2394, reader_cost: 0.00478, ips: 33.4221 samples/sec | ETA 02:02:16 2022-08-25 01:11:47 [INFO] [TRAIN] epoch: 103, iter: 129400/160000, loss: 0.7135, lr: 0.000232, batch_cost: 0.2397, reader_cost: 0.00079, ips: 33.3766 samples/sec | ETA 02:02:14 2022-08-25 01:12:00 [INFO] [TRAIN] epoch: 103, iter: 129450/160000, loss: 0.6933, lr: 0.000231, batch_cost: 0.2523, reader_cost: 0.01055, ips: 31.7041 samples/sec | ETA 02:08:28 2022-08-25 01:12:13 [INFO] [TRAIN] epoch: 103, iter: 129500/160000, loss: 0.7764, lr: 0.000231, batch_cost: 0.2582, reader_cost: 0.00199, ips: 30.9820 samples/sec | ETA 02:11:15 2022-08-25 01:12:25 [INFO] [TRAIN] epoch: 103, iter: 129550/160000, loss: 0.6797, lr: 0.000231, batch_cost: 0.2334, reader_cost: 0.00602, ips: 34.2779 samples/sec | ETA 01:58:26 2022-08-25 01:12:37 [INFO] [TRAIN] epoch: 103, iter: 129600/160000, loss: 0.6633, lr: 0.000230, batch_cost: 0.2395, reader_cost: 0.00036, ips: 33.4073 samples/sec | ETA 02:01:19 2022-08-25 01:12:50 [INFO] [TRAIN] epoch: 103, iter: 129650/160000, loss: 0.6719, lr: 0.000230, batch_cost: 0.2616, reader_cost: 0.00054, ips: 30.5845 samples/sec | ETA 02:12:18 2022-08-25 01:13:02 [INFO] [TRAIN] epoch: 103, iter: 129700/160000, loss: 0.6794, lr: 0.000229, batch_cost: 0.2482, reader_cost: 0.02067, ips: 32.2321 samples/sec | ETA 02:05:20 2022-08-25 01:13:13 [INFO] [TRAIN] epoch: 103, iter: 129750/160000, loss: 0.7258, lr: 0.000229, batch_cost: 0.2275, reader_cost: 0.00040, ips: 35.1606 samples/sec | ETA 01:54:42 2022-08-25 01:13:26 [INFO] [TRAIN] epoch: 103, iter: 129800/160000, loss: 0.7018, lr: 0.000229, batch_cost: 0.2559, reader_cost: 0.00067, ips: 31.2680 samples/sec | ETA 02:08:46 2022-08-25 01:13:39 [INFO] [TRAIN] epoch: 103, iter: 129850/160000, loss: 0.6956, lr: 0.000228, batch_cost: 0.2591, reader_cost: 0.00048, ips: 30.8783 samples/sec | ETA 02:10:11 2022-08-25 01:13:52 [INFO] [TRAIN] epoch: 103, iter: 129900/160000, loss: 0.6788, lr: 0.000228, batch_cost: 0.2519, reader_cost: 0.00872, ips: 31.7567 samples/sec | ETA 02:06:22 2022-08-25 01:14:03 [INFO] [TRAIN] epoch: 103, iter: 129950/160000, loss: 0.6705, lr: 0.000228, batch_cost: 0.2311, reader_cost: 0.01422, ips: 34.6103 samples/sec | ETA 01:55:45 2022-08-25 01:14:16 [INFO] [TRAIN] 
epoch: 103, iter: 130000/160000, loss: 0.7153, lr: 0.000227, batch_cost: 0.2581, reader_cost: 0.00076, ips: 30.9925 samples/sec | ETA 02:09:03 2022-08-25 01:14:16 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 149s - batch_cost: 0.1488 - reader cost: 0.0011 2022-08-25 01:16:45 [INFO] [EVAL] #Images: 2000 mIoU: 0.3205 Acc: 0.7511 Kappa: 0.7316 Dice: 0.4470 2022-08-25 01:16:45 [INFO] [EVAL] Class IoU: [0.6592 0.7695 0.9247 0.7047 0.6743 0.7464 0.7704 0.7567 0.4855 0.6336 0.4638 0.5227 0.665 0.3105 0.1926 0.3851 0.4886 0.4108 0.5472 0.3617 0.7255 0.4114 0.5626 0.4573 0.3369 0.4164 0.3688 0.4094 0.3751 0.3015 0.2198 0.3768 0.2732 0.2881 0.3242 0.4206 0.3674 0.5252 0.2489 0.2538 0.0741 0.0697 0.3114 0.2199 0.2971 0.2631 0.2368 0.4161 0.6597 0.4876 0.4739 0.2716 0.2185 0.2085 0.6269 0.4736 0.8467 0.3202 0.4974 0.1948 0.0698 0.232 0.3154 0.1184 0.3945 0.6675 0.2068 0.3548 0.0541 0.3487 0.3626 0.4269 0.3669 0.2275 0.4153 0.3047 0.3639 0.2023 0.1024 0.1885 0.6514 0.2913 0.2737 0.0235 0.4451 0.4741 0.0783 0.0659 0.3564 0.4223 0.4146 0.0035 0.2071 0.0552 0.002 0.0172 0.1868 0.1394 0.1854 0.3014 0.0725 0.0173 0.1979 0.5931 0.011 0.5362 0.2278 0.4605 0.063 0.3917 0.0603 0.2657 0.0942 0.5591 0.7647 0.0007 0.3125 0.6026 0.0686 0.2544 0.4494 0.0089 0.1833 0.0943 0.2453 0.1217 0.4098 0.2804 0.5098 0.2751 0.5499 0.0165 0.1062 0.2792 0.099 0.1062 0.0543 0.0209 0.1235 0.316 0.0649 0.0621 0.2736 0.4965 0.3109 0. 0.2926 0.0053 0.0937 0.0432] 2022-08-25 01:16:45 [INFO] [EVAL] Class Precision: [0.7537 0.835 0.9604 0.8034 0.7679 0.871 0.8644 0.8387 0.6102 0.7446 0.6558 0.6812 0.7534 0.4914 0.5216 0.5344 0.6461 0.6703 0.7374 0.57 0.8205 0.5765 0.7433 0.5896 0.5251 0.5976 0.549 0.6922 0.6304 0.5055 0.4002 0.5357 0.4827 0.4621 0.5065 0.5448 0.6075 0.7681 0.4659 0.5274 0.1429 0.2026 0.5497 0.5157 0.4247 0.4344 0.4291 0.6805 0.6975 0.5941 0.6316 0.3241 0.4095 0.5866 0.743 0.6685 0.8833 0.6159 0.6132 0.3564 0.1132 0.4207 0.4363 0.5685 0.528 0.7955 0.3669 0.617 0.1864 0.6723 0.5993 0.6149 0.6297 0.2977 0.6701 0.505 0.6403 0.5668 0.7124 0.5935 0.7608 0.5756 0.7526 0.0835 0.6724 0.6643 0.5391 0.3984 0.67 0.5833 0.5681 0.0051 0.3758 0.29 0.0106 0.0554 0.6062 0.4645 0.4838 0.4991 0.5269 0.043 0.7271 0.6709 0.1735 0.8092 0.5118 0.9207 0.2344 0.6406 0.2705 0.4251 0.3081 0.7132 0.7729 0.0726 0.7609 0.6277 0.1752 0.7793 0.7298 0.214 0.6164 0.692 0.6492 0.5936 0.7957 0.3778 0.6292 0.5023 0.6267 0.4534 0.4545 0.6999 0.6653 0.2714 0.3456 0.2294 0.3896 0.7151 0.1408 0.1709 0.5433 0.678 0.5206 0. 
0.897 0.2956 0.3677 0.7983] 2022-08-25 01:16:45 [INFO] [EVAL] Class Recall: [0.8401 0.9075 0.9614 0.8516 0.8469 0.8392 0.8763 0.8857 0.7039 0.8096 0.6131 0.6921 0.85 0.4576 0.2339 0.5795 0.6672 0.5148 0.6797 0.4975 0.8623 0.5895 0.6983 0.6709 0.4845 0.5787 0.5292 0.5004 0.4808 0.4276 0.3278 0.5594 0.3864 0.4335 0.4739 0.6484 0.4817 0.6242 0.3482 0.3285 0.1334 0.0961 0.4181 0.2771 0.4971 0.4002 0.3458 0.5171 0.9241 0.7311 0.655 0.6261 0.3191 0.2444 0.8005 0.6189 0.9533 0.4002 0.7248 0.3005 0.1539 0.3409 0.5322 0.1301 0.6094 0.8057 0.3216 0.455 0.0708 0.42 0.4787 0.5826 0.4678 0.4911 0.5221 0.4344 0.4575 0.2392 0.1068 0.2164 0.8192 0.371 0.3007 0.0317 0.5684 0.6235 0.0839 0.0732 0.4323 0.6049 0.6054 0.0109 0.3157 0.0639 0.0025 0.0243 0.2126 0.1661 0.2312 0.432 0.0776 0.0281 0.2137 0.8364 0.0116 0.6139 0.291 0.4795 0.0794 0.502 0.0721 0.4146 0.1195 0.7212 0.9862 0.0007 0.3465 0.9377 0.1013 0.2741 0.5391 0.0092 0.2069 0.0984 0.2828 0.1328 0.458 0.521 0.7289 0.3781 0.8177 0.0169 0.1218 0.3172 0.1042 0.1486 0.0605 0.0225 0.1531 0.3615 0.1076 0.0889 0.3552 0.6497 0.4356 0. 0.3028 0.0054 0.1117 0.0437] 2022-08-25 01:16:45 [INFO] [EVAL] The model with the best validation mIoU (0.3222) was saved at iter 127000. 2022-08-25 01:16:56 [INFO] [TRAIN] epoch: 103, iter: 130050/160000, loss: 0.6940, lr: 0.000227, batch_cost: 0.2056, reader_cost: 0.00518, ips: 38.9139 samples/sec | ETA 01:42:37 2022-08-25 01:17:07 [INFO] [TRAIN] epoch: 104, iter: 130100/160000, loss: 0.7140, lr: 0.000226, batch_cost: 0.2288, reader_cost: 0.03251, ips: 34.9697 samples/sec | ETA 01:54:00 2022-08-25 01:17:18 [INFO] [TRAIN] epoch: 104, iter: 130150/160000, loss: 0.6619, lr: 0.000226, batch_cost: 0.2070, reader_cost: 0.00041, ips: 38.6392 samples/sec | ETA 01:43:00 2022-08-25 01:17:28 [INFO] [TRAIN] epoch: 104, iter: 130200/160000, loss: 0.7125, lr: 0.000226, batch_cost: 0.1998, reader_cost: 0.00045, ips: 40.0303 samples/sec | ETA 01:39:15 2022-08-25 01:17:39 [INFO] [TRAIN] epoch: 104, iter: 130250/160000, loss: 0.6585, lr: 0.000225, batch_cost: 0.2362, reader_cost: 0.00518, ips: 33.8747 samples/sec | ETA 01:57:05 2022-08-25 01:17:51 [INFO] [TRAIN] epoch: 104, iter: 130300/160000, loss: 0.6357, lr: 0.000225, batch_cost: 0.2248, reader_cost: 0.00629, ips: 35.5859 samples/sec | ETA 01:51:16 2022-08-25 01:18:03 [INFO] [TRAIN] epoch: 104, iter: 130350/160000, loss: 0.6419, lr: 0.000224, batch_cost: 0.2548, reader_cost: 0.00958, ips: 31.4008 samples/sec | ETA 02:05:53 2022-08-25 01:18:15 [INFO] [TRAIN] epoch: 104, iter: 130400/160000, loss: 0.6893, lr: 0.000224, batch_cost: 0.2300, reader_cost: 0.00423, ips: 34.7864 samples/sec | ETA 01:53:27 2022-08-25 01:18:26 [INFO] [TRAIN] epoch: 104, iter: 130450/160000, loss: 0.7074, lr: 0.000224, batch_cost: 0.2244, reader_cost: 0.01817, ips: 35.6457 samples/sec | ETA 01:50:31 2022-08-25 01:18:38 [INFO] [TRAIN] epoch: 104, iter: 130500/160000, loss: 0.6607, lr: 0.000223, batch_cost: 0.2462, reader_cost: 0.00855, ips: 32.4888 samples/sec | ETA 02:01:04 2022-08-25 01:18:50 [INFO] [TRAIN] epoch: 104, iter: 130550/160000, loss: 0.7434, lr: 0.000223, batch_cost: 0.2310, reader_cost: 0.01352, ips: 34.6332 samples/sec | ETA 01:53:22 2022-08-25 01:19:02 [INFO] [TRAIN] epoch: 104, iter: 130600/160000, loss: 0.7116, lr: 0.000223, batch_cost: 0.2468, reader_cost: 0.00449, ips: 32.4114 samples/sec | ETA 02:00:56 2022-08-25 01:19:15 [INFO] [TRAIN] epoch: 104, iter: 130650/160000, loss: 0.6639, lr: 0.000222, batch_cost: 0.2472, reader_cost: 0.00134, ips: 32.3654 samples/sec | ETA 02:00:54 2022-08-25 01:19:25 
[INFO] [TRAIN] epoch: 104, iter: 130700/160000, loss: 0.6699, lr: 0.000222, batch_cost: 0.2119, reader_cost: 0.00537, ips: 37.7602 samples/sec | ETA 01:43:27 2022-08-25 01:19:39 [INFO] [TRAIN] epoch: 104, iter: 130750/160000, loss: 0.6448, lr: 0.000221, batch_cost: 0.2668, reader_cost: 0.01592, ips: 29.9828 samples/sec | ETA 02:10:04 2022-08-25 01:19:50 [INFO] [TRAIN] epoch: 104, iter: 130800/160000, loss: 0.6434, lr: 0.000221, batch_cost: 0.2221, reader_cost: 0.00321, ips: 36.0166 samples/sec | ETA 01:48:05 2022-08-25 01:20:03 [INFO] [TRAIN] epoch: 104, iter: 130850/160000, loss: 0.7125, lr: 0.000221, batch_cost: 0.2614, reader_cost: 0.00871, ips: 30.6090 samples/sec | ETA 02:06:58 2022-08-25 01:20:15 [INFO] [TRAIN] epoch: 104, iter: 130900/160000, loss: 0.7103, lr: 0.000220, batch_cost: 0.2442, reader_cost: 0.00062, ips: 32.7541 samples/sec | ETA 01:58:27 2022-08-25 01:20:28 [INFO] [TRAIN] epoch: 104, iter: 130950/160000, loss: 0.6543, lr: 0.000220, batch_cost: 0.2526, reader_cost: 0.00051, ips: 31.6710 samples/sec | ETA 02:02:17 2022-08-25 01:20:40 [INFO] [TRAIN] epoch: 104, iter: 131000/160000, loss: 0.7121, lr: 0.000220, batch_cost: 0.2459, reader_cost: 0.00652, ips: 32.5276 samples/sec | ETA 01:58:52 2022-08-25 01:20:40 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 162s - batch_cost: 0.1617 - reader cost: 6.5280e-04 2022-08-25 01:23:22 [INFO] [EVAL] #Images: 2000 mIoU: 0.3217 Acc: 0.7491 Kappa: 0.7298 Dice: 0.4488 2022-08-25 01:23:22 [INFO] [EVAL] Class IoU: [0.66 0.7696 0.9244 0.7063 0.6731 0.7455 0.7587 0.7595 0.4806 0.6216 0.463 0.5212 0.6582 0.279 0.2285 0.385 0.5039 0.4228 0.5478 0.3646 0.7229 0.4222 0.5535 0.4669 0.3358 0.3863 0.3624 0.4079 0.371 0.3152 0.211 0.3731 0.2784 0.2877 0.3716 0.3777 0.3641 0.5001 0.2576 0.2517 0.078 0.0644 0.3082 0.2242 0.2854 0.2474 0.2008 0.4169 0.6663 0.505 0.4647 0.2384 0.216 0.1941 0.6299 0.4809 0.8458 0.3165 0.4825 0.197 0.0559 0.1839 0.3095 0.1248 0.3946 0.6719 0.2227 0.3666 0.0853 0.3514 0.3629 0.4075 0.3609 0.2195 0.4109 0.304 0.3795 0.2066 0.1524 0.2788 0.6446 0.2902 0.2592 0.0153 0.4762 0.4743 0.0887 0.0594 0.3512 0.4171 0.3957 0.0047 0.2174 0.07 0.0092 0.0126 0.1792 0.1184 0.1923 0.3431 0.1259 0.0081 0.1955 0.7046 0.0012 0.5274 0.1594 0.4952 0.0974 0.4045 0.0526 0.2606 0.084 0.5848 0.75 0.0026 0.338 0.5657 0.0471 0.3119 0.4358 0.0116 0.2249 0.1316 0.2513 0.1742 0.4045 0.2788 0.5129 0.2243 0.5267 0.0195 0.045 0.2963 0.0898 0.0927 0.0808 0.0197 0.1182 0.3191 0.089 0.0612 0.277 0.4939 0.2989 0. 
0.297 0.0078 0.0921 0.0473] 2022-08-25 01:23:22 [INFO] [EVAL] Class Precision: [0.7667 0.8374 0.9586 0.8013 0.7842 0.8629 0.8806 0.8305 0.6004 0.7338 0.6627 0.6394 0.74 0.4906 0.4605 0.555 0.6909 0.6201 0.7535 0.5465 0.8055 0.6297 0.7085 0.5907 0.5197 0.5804 0.5269 0.6592 0.6543 0.4565 0.3957 0.5057 0.4474 0.4205 0.5166 0.5635 0.6215 0.792 0.4626 0.5294 0.1444 0.2068 0.5349 0.5563 0.41 0.4268 0.3467 0.6227 0.7602 0.6331 0.5941 0.2912 0.3907 0.5 0.7268 0.6575 0.8806 0.6122 0.6712 0.3541 0.1076 0.3839 0.43 0.5476 0.5213 0.7973 0.3119 0.5718 0.1978 0.6212 0.5876 0.5538 0.6411 0.2925 0.5682 0.5155 0.591 0.5447 0.6965 0.5769 0.7567 0.5868 0.7497 0.0612 0.6332 0.6784 0.4702 0.4621 0.5282 0.5685 0.5295 0.0064 0.3295 0.2783 0.0466 0.0401 0.6096 0.5024 0.4661 0.5057 0.5563 0.0168 0.6435 0.8038 0.0402 0.7781 0.5002 0.8857 0.2108 0.5951 0.2759 0.4108 0.3891 0.7131 0.7586 0.1974 0.7515 0.5829 0.1095 0.759 0.6979 0.3688 0.6321 0.6994 0.6207 0.6247 0.8023 0.3748 0.6 0.4594 0.5863 0.4184 0.3303 0.66 0.6108 0.273 0.2705 0.1627 0.3439 0.7065 0.2289 0.1527 0.4661 0.6776 0.4654 0. 0.8778 0.3746 0.4086 0.5847] 2022-08-25 01:23:22 [INFO] [EVAL] Class Recall: [0.8259 0.9048 0.9628 0.8562 0.8261 0.8457 0.8458 0.8989 0.7066 0.8025 0.6057 0.7383 0.8563 0.3928 0.3121 0.557 0.6506 0.5706 0.6674 0.5226 0.8758 0.5617 0.7167 0.6902 0.4869 0.536 0.5371 0.5169 0.4614 0.5046 0.3114 0.5874 0.4242 0.4765 0.5696 0.534 0.4678 0.5757 0.3676 0.3243 0.145 0.0855 0.4211 0.273 0.4843 0.3704 0.323 0.5578 0.8436 0.714 0.6809 0.5684 0.3256 0.2409 0.8253 0.6416 0.9554 0.3959 0.6319 0.3075 0.1042 0.2608 0.5249 0.1392 0.6189 0.8103 0.4379 0.5053 0.1304 0.4473 0.487 0.6066 0.4523 0.4679 0.5975 0.4256 0.5146 0.2498 0.1632 0.3504 0.8132 0.3647 0.2837 0.02 0.6576 0.6119 0.0986 0.0638 0.5117 0.6104 0.6103 0.0175 0.3899 0.0855 0.0113 0.018 0.2025 0.1341 0.2466 0.5162 0.14 0.0153 0.2193 0.8509 0.0013 0.6208 0.1896 0.529 0.1534 0.5581 0.061 0.4161 0.0968 0.7647 0.985 0.0026 0.3806 0.9504 0.0764 0.3462 0.5372 0.0119 0.2587 0.1395 0.2968 0.1945 0.4493 0.5212 0.7793 0.3047 0.8384 0.0201 0.0496 0.3497 0.0952 0.1231 0.1033 0.0219 0.1526 0.3679 0.1271 0.0926 0.4056 0.6456 0.4553 0. 0.3098 0.0079 0.1063 0.049 ] 2022-08-25 01:23:22 [INFO] [EVAL] The model with the best validation mIoU (0.3222) was saved at iter 127000. 
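Note on the repeated "best validation mIoU (0.3222) was saved at iter 127000" messages: the evaluations at iters 128000–131000 all score slightly below 0.3222, so the trainer keeps pointing at the iter-127000 checkpoint and only refreshes the best-model directory when a new evaluation beats the stored best. A minimal sketch of that bookkeeping, with hypothetical names (not PaddleSeg's actual implementation), replaying the mIoU values logged around this point:

class BestCheckpointTracker:
    """Keep only the checkpoint with the highest validation mIoU seen so far."""

    def __init__(self):
        self.best_miou = -1.0
        self.best_iter = -1

    def update(self, cur_iter, miou):
        """Return True if this evaluation sets a new best and should be saved."""
        if miou > self.best_miou:
            self.best_miou, self.best_iter = miou, cur_iter
            return True
        return False

tracker = BestCheckpointTracker()
# mIoU values from the evaluations logged above:
for it, miou in [(126000, 0.3206), (127000, 0.3222), (128000, 0.3197),
                 (129000, 0.3207), (130000, 0.3205), (131000, 0.3217)]:
    saved = tracker.update(it, miou)
    print(f"iter {it}: mIoU {miou:.4f} "
          f"(best {tracker.best_miou:.4f} @ iter {tracker.best_iter}, new save: {saved})")
# After iter 127000 the best stays at 0.3222, which is why every later EVAL block
# in this log keeps reporting "saved at iter 127000".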
2022-08-25 01:23:30 [INFO] [TRAIN] epoch: 104, iter: 131050/160000, loss: 0.7132, lr: 0.000219, batch_cost: 0.1559, reader_cost: 0.00339, ips: 51.3111 samples/sec | ETA 01:15:13 2022-08-25 01:23:41 [INFO] [TRAIN] epoch: 104, iter: 131100/160000, loss: 0.7068, lr: 0.000219, batch_cost: 0.2184, reader_cost: 0.00206, ips: 36.6255 samples/sec | ETA 01:45:12 2022-08-25 01:23:50 [INFO] [TRAIN] epoch: 104, iter: 131150/160000, loss: 0.7074, lr: 0.000218, batch_cost: 0.1934, reader_cost: 0.00046, ips: 41.3708 samples/sec | ETA 01:32:58 2022-08-25 01:24:00 [INFO] [TRAIN] epoch: 104, iter: 131200/160000, loss: 0.7173, lr: 0.000218, batch_cost: 0.1965, reader_cost: 0.00055, ips: 40.7026 samples/sec | ETA 01:34:20 2022-08-25 01:24:14 [INFO] [TRAIN] epoch: 104, iter: 131250/160000, loss: 0.6272, lr: 0.000218, batch_cost: 0.2736, reader_cost: 0.00059, ips: 29.2406 samples/sec | ETA 02:11:05 2022-08-25 01:24:27 [INFO] [TRAIN] epoch: 104, iter: 131300/160000, loss: 0.6888, lr: 0.000217, batch_cost: 0.2554, reader_cost: 0.00067, ips: 31.3266 samples/sec | ETA 02:02:09 2022-08-25 01:24:38 [INFO] [TRAIN] epoch: 104, iter: 131350/160000, loss: 0.6889, lr: 0.000217, batch_cost: 0.2276, reader_cost: 0.01577, ips: 35.1509 samples/sec | ETA 01:48:40 2022-08-25 01:24:54 [INFO] [TRAIN] epoch: 105, iter: 131400/160000, loss: 0.6994, lr: 0.000217, batch_cost: 0.3267, reader_cost: 0.03688, ips: 24.4866 samples/sec | ETA 02:35:43 2022-08-25 01:25:06 [INFO] [TRAIN] epoch: 105, iter: 131450/160000, loss: 0.6734, lr: 0.000216, batch_cost: 0.2390, reader_cost: 0.00196, ips: 33.4699 samples/sec | ETA 01:53:44 2022-08-25 01:25:18 [INFO] [TRAIN] epoch: 105, iter: 131500/160000, loss: 0.6894, lr: 0.000216, batch_cost: 0.2358, reader_cost: 0.00040, ips: 33.9314 samples/sec | ETA 01:51:59 2022-08-25 01:25:30 [INFO] [TRAIN] epoch: 105, iter: 131550/160000, loss: 0.6752, lr: 0.000215, batch_cost: 0.2431, reader_cost: 0.00361, ips: 32.9070 samples/sec | ETA 01:55:16 2022-08-25 01:25:42 [INFO] [TRAIN] epoch: 105, iter: 131600/160000, loss: 0.6552, lr: 0.000215, batch_cost: 0.2426, reader_cost: 0.00102, ips: 32.9779 samples/sec | ETA 01:54:49 2022-08-25 01:25:54 [INFO] [TRAIN] epoch: 105, iter: 131650/160000, loss: 0.6754, lr: 0.000215, batch_cost: 0.2396, reader_cost: 0.00045, ips: 33.3891 samples/sec | ETA 01:53:12 2022-08-25 01:26:06 [INFO] [TRAIN] epoch: 105, iter: 131700/160000, loss: 0.6877, lr: 0.000214, batch_cost: 0.2329, reader_cost: 0.00127, ips: 34.3490 samples/sec | ETA 01:49:51 2022-08-25 01:26:19 [INFO] [TRAIN] epoch: 105, iter: 131750/160000, loss: 0.7112, lr: 0.000214, batch_cost: 0.2504, reader_cost: 0.00040, ips: 31.9529 samples/sec | ETA 01:57:52 2022-08-25 01:26:31 [INFO] [TRAIN] epoch: 105, iter: 131800/160000, loss: 0.6633, lr: 0.000214, batch_cost: 0.2420, reader_cost: 0.00779, ips: 33.0639 samples/sec | ETA 01:53:43 2022-08-25 01:26:43 [INFO] [TRAIN] epoch: 105, iter: 131850/160000, loss: 0.7226, lr: 0.000213, batch_cost: 0.2463, reader_cost: 0.00115, ips: 32.4815 samples/sec | ETA 01:55:33 2022-08-25 01:26:55 [INFO] [TRAIN] epoch: 105, iter: 131900/160000, loss: 0.7172, lr: 0.000213, batch_cost: 0.2392, reader_cost: 0.00057, ips: 33.4508 samples/sec | ETA 01:52:00 2022-08-25 01:27:07 [INFO] [TRAIN] epoch: 105, iter: 131950/160000, loss: 0.7129, lr: 0.000212, batch_cost: 0.2402, reader_cost: 0.00071, ips: 33.2994 samples/sec | ETA 01:52:18 2022-08-25 01:27:19 [INFO] [TRAIN] epoch: 105, iter: 132000/160000, loss: 0.6874, lr: 0.000212, batch_cost: 0.2349, reader_cost: 0.00048, ips: 34.0522 samples/sec | ETA 
01:49:38 2022-08-25 01:27:19 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 148s - batch_cost: 0.1480 - reader cost: 8.5413e-04 2022-08-25 01:29:47 [INFO] [EVAL] #Images: 2000 mIoU: 0.3191 Acc: 0.7493 Kappa: 0.7299 Dice: 0.4457 2022-08-25 01:29:47 [INFO] [EVAL] Class IoU: [0.6629 0.7686 0.9245 0.703 0.6684 0.7444 0.7631 0.756 0.4886 0.6252 0.4596 0.5202 0.658 0.3027 0.2182 0.3841 0.4934 0.3908 0.5512 0.3697 0.7174 0.4115 0.5603 0.4613 0.3249 0.3528 0.3747 0.396 0.3509 0.3241 0.2159 0.3939 0.2819 0.2922 0.3671 0.4044 0.3594 0.5175 0.2474 0.2456 0.0914 0.0701 0.3097 0.217 0.2946 0.2588 0.247 0.4241 0.6776 0.4994 0.4595 0.2435 0.2184 0.1741 0.6206 0.4574 0.8502 0.3102 0.4734 0.1816 0.0401 0.1719 0.3069 0.0912 0.3779 0.6724 0.213 0.3844 0.0912 0.3533 0.3349 0.4319 0.3646 0.2119 0.4295 0.2909 0.4078 0.2076 0.1085 0.2841 0.6084 0.2964 0.2676 0.0124 0.4807 0.4712 0.0981 0.0678 0.3609 0.4154 0.3798 0.0107 0.2056 0.0646 0.0083 0.0077 0.1809 0.1269 0.181 0.3178 0.0672 0.0235 0.1996 0.6073 0.0018 0.5184 0.2098 0.4638 0.1012 0.3886 0.051 0.3024 0.0787 0.5455 0.7581 0.0043 0.3675 0.5878 0.0156 0.2667 0.4218 0.0093 0.1975 0.1174 0.2451 0.1809 0.4086 0.2743 0.5372 0.231 0.5652 0.0186 0.0448 0.2694 0.0912 0.1087 0.0696 0.0201 0.1228 0.3061 0.1034 0.0476 0.2303 0.4634 0.3133 0. 0.2902 0.0083 0.0942 0.0525] 2022-08-25 01:29:47 [INFO] [EVAL] Class Precision: [0.7634 0.8375 0.9579 0.7937 0.7601 0.862 0.8834 0.8247 0.6293 0.7714 0.6688 0.6417 0.7403 0.4663 0.4871 0.5514 0.6808 0.6498 0.7022 0.5452 0.7986 0.6027 0.72 0.586 0.5285 0.5796 0.5285 0.7346 0.6545 0.4683 0.4445 0.5732 0.507 0.4384 0.5267 0.5443 0.6259 0.7343 0.4709 0.5558 0.1467 0.2442 0.5784 0.526 0.3934 0.4435 0.4108 0.6152 0.74 0.6225 0.6036 0.2906 0.3911 0.5679 0.7218 0.6326 0.8955 0.6105 0.6912 0.3107 0.0654 0.4198 0.4675 0.6071 0.4687 0.8139 0.3127 0.577 0.2237 0.6562 0.6133 0.5458 0.6233 0.2899 0.6538 0.4639 0.6218 0.6108 0.7527 0.5742 0.6935 0.5989 0.7449 0.0576 0.6818 0.6468 0.3141 0.4071 0.5367 0.5691 0.4952 0.0138 0.3545 0.268 0.0441 0.0469 0.6819 0.5806 0.4567 0.5612 0.5329 0.0509 0.7071 0.6595 0.0502 0.7552 0.5576 0.9054 0.2593 0.5938 0.3641 0.5266 0.4188 0.7393 0.765 0.0686 0.6872 0.6102 0.0476 0.7292 0.755 0.3934 0.7289 0.6747 0.6354 0.597 0.7311 0.3988 0.6411 0.5162 0.6549 0.3167 0.3597 0.7047 0.6479 0.271 0.2974 0.1973 0.4128 0.724 0.2321 0.2478 0.567 0.6616 0.6143 0. 0.9008 0.2742 0.5245 0.6747] 2022-08-25 01:29:47 [INFO] [EVAL] Class Recall: [0.8344 0.9033 0.9637 0.8602 0.8472 0.8451 0.8486 0.9008 0.686 0.7674 0.595 0.7331 0.8554 0.4631 0.2834 0.5587 0.6419 0.495 0.7194 0.5344 0.8758 0.5646 0.7164 0.6844 0.4575 0.4741 0.5628 0.4621 0.4307 0.5128 0.2957 0.5574 0.3884 0.4669 0.5477 0.6115 0.4578 0.6367 0.3427 0.3056 0.1952 0.0895 0.4 0.2697 0.5399 0.3831 0.3826 0.5772 0.8894 0.7162 0.658 0.6006 0.331 0.2006 0.8157 0.6228 0.9439 0.3868 0.6004 0.3042 0.0941 0.2254 0.4719 0.0969 0.6613 0.7945 0.4004 0.5353 0.1334 0.4336 0.4245 0.6743 0.4676 0.4405 0.556 0.4383 0.5422 0.2393 0.1125 0.3599 0.832 0.3699 0.2946 0.0155 0.6198 0.6345 0.1248 0.0752 0.5243 0.6061 0.6197 0.0458 0.3285 0.0784 0.0101 0.0091 0.1976 0.1397 0.2307 0.4228 0.0714 0.0418 0.2175 0.8847 0.0018 0.6231 0.2517 0.4874 0.1424 0.5293 0.056 0.4153 0.0883 0.6754 0.9883 0.0045 0.4414 0.9413 0.0227 0.296 0.4887 0.0094 0.2131 0.1245 0.2853 0.2061 0.4809 0.4676 0.7682 0.2947 0.8049 0.0193 0.0487 0.3036 0.0959 0.1536 0.0833 0.0219 0.1488 0.3465 0.1572 0.0557 0.2794 0.6075 0.3901 0. 
0.2997 0.0085 0.103 0.0539] 2022-08-25 01:29:47 [INFO] [EVAL] The model with the best validation mIoU (0.3222) was saved at iter 127000. 2022-08-25 01:29:56 [INFO] [TRAIN] epoch: 105, iter: 132050/160000, loss: 0.6436, lr: 0.000212, batch_cost: 0.1722, reader_cost: 0.00406, ips: 46.4649 samples/sec | ETA 01:20:12 2022-08-25 01:30:03 [INFO] [TRAIN] epoch: 105, iter: 132100/160000, loss: 0.7609, lr: 0.000211, batch_cost: 0.1513, reader_cost: 0.00113, ips: 52.8915 samples/sec | ETA 01:10:19 2022-08-25 01:30:13 [INFO] [TRAIN] epoch: 105, iter: 132150/160000, loss: 0.6448, lr: 0.000211, batch_cost: 0.2065, reader_cost: 0.00044, ips: 38.7448 samples/sec | ETA 01:35:50 2022-08-25 01:30:24 [INFO] [TRAIN] epoch: 105, iter: 132200/160000, loss: 0.6679, lr: 0.000210, batch_cost: 0.2088, reader_cost: 0.00095, ips: 38.3108 samples/sec | ETA 01:36:45 2022-08-25 01:30:34 [INFO] [TRAIN] epoch: 105, iter: 132250/160000, loss: 0.6960, lr: 0.000210, batch_cost: 0.2064, reader_cost: 0.00447, ips: 38.7516 samples/sec | ETA 01:35:28 2022-08-25 01:30:47 [INFO] [TRAIN] epoch: 105, iter: 132300/160000, loss: 0.6808, lr: 0.000210, batch_cost: 0.2599, reader_cost: 0.00050, ips: 30.7757 samples/sec | ETA 02:00:00 2022-08-25 01:30:59 [INFO] [TRAIN] epoch: 105, iter: 132350/160000, loss: 0.7065, lr: 0.000209, batch_cost: 0.2384, reader_cost: 0.00142, ips: 33.5549 samples/sec | ETA 01:49:52 2022-08-25 01:31:12 [INFO] [TRAIN] epoch: 105, iter: 132400/160000, loss: 0.6847, lr: 0.000209, batch_cost: 0.2508, reader_cost: 0.00115, ips: 31.9024 samples/sec | ETA 01:55:21 2022-08-25 01:31:23 [INFO] [TRAIN] epoch: 105, iter: 132450/160000, loss: 0.6767, lr: 0.000209, batch_cost: 0.2311, reader_cost: 0.00057, ips: 34.6241 samples/sec | ETA 01:46:05 2022-08-25 01:31:36 [INFO] [TRAIN] epoch: 105, iter: 132500/160000, loss: 0.7000, lr: 0.000208, batch_cost: 0.2478, reader_cost: 0.00103, ips: 32.2799 samples/sec | ETA 01:53:35 2022-08-25 01:31:49 [INFO] [TRAIN] epoch: 105, iter: 132550/160000, loss: 0.6800, lr: 0.000208, batch_cost: 0.2578, reader_cost: 0.00412, ips: 31.0283 samples/sec | ETA 01:57:57 2022-08-25 01:32:01 [INFO] [TRAIN] epoch: 105, iter: 132600/160000, loss: 0.6750, lr: 0.000207, batch_cost: 0.2485, reader_cost: 0.00060, ips: 32.1907 samples/sec | ETA 01:53:29 2022-08-25 01:32:15 [INFO] [TRAIN] epoch: 106, iter: 132650/160000, loss: 0.7352, lr: 0.000207, batch_cost: 0.2711, reader_cost: 0.05150, ips: 29.5128 samples/sec | ETA 02:03:33 2022-08-25 01:32:26 [INFO] [TRAIN] epoch: 106, iter: 132700/160000, loss: 0.6928, lr: 0.000207, batch_cost: 0.2239, reader_cost: 0.00577, ips: 35.7238 samples/sec | ETA 01:41:53 2022-08-25 01:32:38 [INFO] [TRAIN] epoch: 106, iter: 132750/160000, loss: 0.7140, lr: 0.000206, batch_cost: 0.2382, reader_cost: 0.00431, ips: 33.5859 samples/sec | ETA 01:48:10 2022-08-25 01:32:50 [INFO] [TRAIN] epoch: 106, iter: 132800/160000, loss: 0.6746, lr: 0.000206, batch_cost: 0.2463, reader_cost: 0.01429, ips: 32.4871 samples/sec | ETA 01:51:38 2022-08-25 01:33:02 [INFO] [TRAIN] epoch: 106, iter: 132850/160000, loss: 0.6988, lr: 0.000206, batch_cost: 0.2416, reader_cost: 0.00052, ips: 33.1115 samples/sec | ETA 01:49:19 2022-08-25 01:33:15 [INFO] [TRAIN] epoch: 106, iter: 132900/160000, loss: 0.6946, lr: 0.000205, batch_cost: 0.2542, reader_cost: 0.00123, ips: 31.4664 samples/sec | ETA 01:54:49 2022-08-25 01:33:28 [INFO] [TRAIN] epoch: 106, iter: 132950/160000, loss: 0.7393, lr: 0.000205, batch_cost: 0.2558, reader_cost: 0.00081, ips: 31.2700 samples/sec | ETA 01:55:20 2022-08-25 01:33:40 [INFO] [TRAIN] 
epoch: 106, iter: 133000/160000, loss: 0.6662, lr: 0.000204, batch_cost: 0.2560, reader_cost: 0.00141, ips: 31.2512 samples/sec | ETA 01:55:11 2022-08-25 01:33:40 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1587 - reader cost: 0.0011 2022-08-25 01:36:19 [INFO] [EVAL] #Images: 2000 mIoU: 0.3207 Acc: 0.7480 Kappa: 0.7287 Dice: 0.4479 2022-08-25 01:36:19 [INFO] [EVAL] Class IoU: [0.6586 0.7675 0.9251 0.7035 0.673 0.741 0.757 0.7542 0.4793 0.6281 0.4584 0.5181 0.6609 0.2985 0.2217 0.3818 0.4903 0.4026 0.5468 0.3666 0.7157 0.4363 0.5636 0.4625 0.3145 0.3777 0.3678 0.4111 0.354 0.3255 0.2304 0.3896 0.2765 0.283 0.3563 0.3807 0.3654 0.5456 0.261 0.2707 0.0812 0.0783 0.2923 0.2206 0.2945 0.2359 0.2344 0.4219 0.6805 0.5008 0.4616 0.2575 0.214 0.1686 0.6329 0.4524 0.8514 0.3174 0.4758 0.1635 0.0549 0.1703 0.3004 0.1315 0.3744 0.6815 0.1961 0.3793 0.0897 0.3483 0.3514 0.4196 0.3678 0.2351 0.4204 0.3057 0.3733 0.2132 0.0645 0.4005 0.6285 0.3091 0.248 0.0152 0.4859 0.4739 0.0959 0.0676 0.351 0.4022 0.3685 0.0051 0.2091 0.0707 0.0393 0.0179 0.0678 0.1177 0.1813 0.3299 0.1 0.0362 0.1741 0.6211 0.0026 0.5017 0.2069 0.4763 0.1102 0.4213 0.0774 0.238 0.0901 0.5781 0.7324 0.0036 0.3461 0.5981 0.04 0.2602 0.4093 0.0124 0.2274 0.0898 0.2441 0.179 0.4183 0.2634 0.4608 0.2694 0.5498 0.0181 0.1006 0.2666 0.0775 0.106 0.068 0.0169 0.1147 0.3369 0.0915 0.1172 0.2537 0.5165 0.3079 0. 0.2985 0.0144 0.0842 0.0546] 2022-08-25 01:36:19 [INFO] [EVAL] Class Precision: [0.7661 0.8341 0.9611 0.7989 0.7733 0.8659 0.8936 0.823 0.6053 0.7477 0.6772 0.6327 0.7541 0.4946 0.4652 0.5442 0.6624 0.6468 0.6915 0.5565 0.7998 0.5959 0.7322 0.5975 0.5292 0.5667 0.5369 0.669 0.6514 0.5111 0.407 0.4964 0.4778 0.4125 0.5056 0.542 0.6107 0.7314 0.5082 0.5193 0.1334 0.2244 0.4659 0.534 0.3784 0.4389 0.3934 0.6461 0.7667 0.6247 0.5916 0.3157 0.4652 0.502 0.7195 0.5969 0.9049 0.6199 0.5646 0.301 0.113 0.4142 0.4356 0.5639 0.4719 0.8414 0.3211 0.5337 0.2117 0.6513 0.6187 0.5521 0.6184 0.31 0.6189 0.517 0.62 0.5133 0.6491 0.6241 0.7379 0.5535 0.7662 0.0598 0.6304 0.664 0.3217 0.4284 0.5468 0.5591 0.4695 0.0076 0.3573 0.288 0.1641 0.0586 0.4656 0.5205 0.4157 0.5191 0.5824 0.0611 0.7169 0.7258 0.0422 0.7182 0.5816 0.9091 0.252 0.6049 0.2804 0.3574 0.3741 0.7357 0.7377 0.1193 0.6837 0.6188 0.0901 0.7197 0.7398 0.3239 0.7438 0.6935 0.6445 0.6215 0.7296 0.3679 0.5858 0.5768 0.6677 0.4114 0.3713 0.7082 0.6591 0.2121 0.2896 0.0762 0.37 0.6435 0.1553 0.2364 0.5244 0.7083 0.4831 0. 
0.8899 0.353 0.3739 0.641 ] 2022-08-25 01:36:19 [INFO] [EVAL] Class Recall: [0.8244 0.9057 0.9611 0.8549 0.8385 0.8371 0.832 0.9002 0.6972 0.7969 0.5866 0.741 0.8425 0.4295 0.2975 0.5613 0.6536 0.516 0.7233 0.5179 0.8719 0.6197 0.71 0.6717 0.4367 0.5311 0.5387 0.516 0.4368 0.4728 0.3469 0.6443 0.3962 0.4741 0.5467 0.5612 0.4764 0.6824 0.3493 0.3612 0.1718 0.1074 0.4397 0.2733 0.5706 0.3379 0.3672 0.5487 0.8581 0.7163 0.6775 0.5826 0.2839 0.2025 0.8403 0.6514 0.935 0.394 0.7517 0.2636 0.0966 0.2243 0.4918 0.1464 0.6445 0.7819 0.335 0.5674 0.1347 0.4282 0.4486 0.6362 0.4757 0.4932 0.5673 0.4278 0.4842 0.2672 0.0669 0.5278 0.8092 0.4118 0.2683 0.02 0.6795 0.6234 0.1202 0.0743 0.4949 0.5891 0.6315 0.0154 0.3351 0.0857 0.0492 0.0251 0.0736 0.132 0.2433 0.4751 0.1078 0.0815 0.1869 0.8115 0.0028 0.6248 0.243 0.5002 0.1638 0.5813 0.0966 0.4161 0.1061 0.7297 0.9903 0.0037 0.4121 0.9468 0.0671 0.2895 0.4781 0.0128 0.2467 0.0935 0.2821 0.2009 0.4951 0.4813 0.6836 0.3358 0.7568 0.0186 0.1213 0.2995 0.0808 0.1748 0.0816 0.0212 0.1425 0.4142 0.1822 0.1887 0.3296 0.6561 0.4592 0. 0.31 0.0148 0.098 0.0563] 2022-08-25 01:36:19 [INFO] [EVAL] The model with the best validation mIoU (0.3222) was saved at iter 127000. 2022-08-25 01:36:28 [INFO] [TRAIN] epoch: 106, iter: 133050/160000, loss: 0.6734, lr: 0.000204, batch_cost: 0.1644, reader_cost: 0.00350, ips: 48.6653 samples/sec | ETA 01:13:50 2022-08-25 01:36:38 [INFO] [TRAIN] epoch: 106, iter: 133100/160000, loss: 0.6317, lr: 0.000204, batch_cost: 0.1983, reader_cost: 0.00123, ips: 40.3366 samples/sec | ETA 01:28:55 2022-08-25 01:36:47 [INFO] [TRAIN] epoch: 106, iter: 133150/160000, loss: 0.7038, lr: 0.000203, batch_cost: 0.1982, reader_cost: 0.00043, ips: 40.3585 samples/sec | ETA 01:28:42 2022-08-25 01:36:58 [INFO] [TRAIN] epoch: 106, iter: 133200/160000, loss: 0.6957, lr: 0.000203, batch_cost: 0.2008, reader_cost: 0.00055, ips: 39.8497 samples/sec | ETA 01:29:40 2022-08-25 01:37:08 [INFO] [TRAIN] epoch: 106, iter: 133250/160000, loss: 0.6572, lr: 0.000203, batch_cost: 0.2145, reader_cost: 0.00312, ips: 37.2888 samples/sec | ETA 01:35:38 2022-08-25 01:37:22 [INFO] [TRAIN] epoch: 106, iter: 133300/160000, loss: 0.7115, lr: 0.000202, batch_cost: 0.2780, reader_cost: 0.00075, ips: 28.7744 samples/sec | ETA 02:03:43 2022-08-25 01:37:34 [INFO] [TRAIN] epoch: 106, iter: 133350/160000, loss: 0.7277, lr: 0.000202, batch_cost: 0.2412, reader_cost: 0.02280, ips: 33.1622 samples/sec | ETA 01:47:08 2022-08-25 01:37:46 [INFO] [TRAIN] epoch: 106, iter: 133400/160000, loss: 0.7288, lr: 0.000201, batch_cost: 0.2372, reader_cost: 0.00690, ips: 33.7270 samples/sec | ETA 01:45:09 2022-08-25 01:37:58 [INFO] [TRAIN] epoch: 106, iter: 133450/160000, loss: 0.6864, lr: 0.000201, batch_cost: 0.2444, reader_cost: 0.00261, ips: 32.7285 samples/sec | ETA 01:48:09 2022-08-25 01:38:11 [INFO] [TRAIN] epoch: 106, iter: 133500/160000, loss: 0.6532, lr: 0.000201, batch_cost: 0.2493, reader_cost: 0.00702, ips: 32.0864 samples/sec | ETA 01:50:07 2022-08-25 01:38:23 [INFO] [TRAIN] epoch: 106, iter: 133550/160000, loss: 0.6767, lr: 0.000200, batch_cost: 0.2488, reader_cost: 0.00408, ips: 32.1596 samples/sec | ETA 01:49:39 2022-08-25 01:38:36 [INFO] [TRAIN] epoch: 106, iter: 133600/160000, loss: 0.6513, lr: 0.000200, batch_cost: 0.2563, reader_cost: 0.00077, ips: 31.2130 samples/sec | ETA 01:52:46 2022-08-25 01:38:48 [INFO] [TRAIN] epoch: 106, iter: 133650/160000, loss: 0.7021, lr: 0.000200, batch_cost: 0.2442, reader_cost: 0.00427, ips: 32.7667 samples/sec | ETA 01:47:13 2022-08-25 01:38:59 
[INFO] [TRAIN] epoch: 106, iter: 133700/160000, loss: 0.7094, lr: 0.000199, batch_cost: 0.2225, reader_cost: 0.00948, ips: 35.9622 samples/sec | ETA 01:37:30 2022-08-25 01:39:11 [INFO] [TRAIN] epoch: 106, iter: 133750/160000, loss: 0.6319, lr: 0.000199, batch_cost: 0.2389, reader_cost: 0.01080, ips: 33.4841 samples/sec | ETA 01:44:31 2022-08-25 01:39:23 [INFO] [TRAIN] epoch: 106, iter: 133800/160000, loss: 0.7171, lr: 0.000198, batch_cost: 0.2345, reader_cost: 0.02613, ips: 34.1169 samples/sec | ETA 01:42:23 2022-08-25 01:39:35 [INFO] [TRAIN] epoch: 106, iter: 133850/160000, loss: 0.7322, lr: 0.000198, batch_cost: 0.2427, reader_cost: 0.02243, ips: 32.9668 samples/sec | ETA 01:45:45 2022-08-25 01:39:48 [INFO] [TRAIN] epoch: 107, iter: 133900/160000, loss: 0.6774, lr: 0.000198, batch_cost: 0.2574, reader_cost: 0.05534, ips: 31.0787 samples/sec | ETA 01:51:58 2022-08-25 01:40:00 [INFO] [TRAIN] epoch: 107, iter: 133950/160000, loss: 0.6869, lr: 0.000197, batch_cost: 0.2434, reader_cost: 0.01191, ips: 32.8668 samples/sec | ETA 01:45:40 2022-08-25 01:40:12 [INFO] [TRAIN] epoch: 107, iter: 134000/160000, loss: 0.6991, lr: 0.000197, batch_cost: 0.2381, reader_cost: 0.00355, ips: 33.6021 samples/sec | ETA 01:43:10 2022-08-25 01:40:12 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 145s - batch_cost: 0.1454 - reader cost: 8.6260e-04 2022-08-25 01:42:38 [INFO] [EVAL] #Images: 2000 mIoU: 0.3215 Acc: 0.7495 Kappa: 0.7305 Dice: 0.4502 2022-08-25 01:42:38 [INFO] [EVAL] Class IoU: [0.6588 0.7755 0.9253 0.7031 0.6708 0.7484 0.7577 0.7613 0.4854 0.6246 0.4632 0.5304 0.6576 0.2942 0.2296 0.3831 0.4899 0.4235 0.5537 0.371 0.7284 0.4096 0.55 0.4633 0.3218 0.4004 0.4085 0.4074 0.3788 0.3297 0.2237 0.3906 0.2703 0.2757 0.3343 0.3938 0.3693 0.4991 0.264 0.2543 0.0673 0.0705 0.2878 0.2128 0.2863 0.2579 0.2303 0.4188 0.6595 0.5095 0.4636 0.2713 0.235 0.1932 0.6134 0.4492 0.8536 0.3452 0.4311 0.1674 0.0509 0.1829 0.3135 0.1307 0.3998 0.6704 0.2207 0.3772 0.0717 0.3412 0.3574 0.4184 0.3661 0.2368 0.4123 0.3033 0.3529 0.2173 0.1108 0.3992 0.6439 0.3056 0.2702 0.0136 0.4928 0.4699 0.0969 0.0666 0.3641 0.4278 0.398 0.0104 0.1902 0.0677 0.0051 0.0139 0.1618 0.1348 0.2378 0.3181 0.1523 0.0567 0.2143 0.4213 0.0002 0.5258 0.2203 0.4433 0.0908 0.3706 0.0662 0.2028 0.0875 0.5728 0.7161 0.0029 0.3416 0.5773 0.0974 0.2405 0.3571 0.0097 0.1831 0.1279 0.254 0.2204 0.4311 0.3003 0.5126 0.2597 0.5347 0.0306 0.0922 0.2593 0.0881 0.1081 0.0578 0.0201 0.1181 0.3175 0.0885 0.076 0.2518 0.5155 0.3011 0. 
0.3091 0.0234 0.0795 0.0681] 2022-08-25 01:42:38 [INFO] [EVAL] Class Precision: [0.7707 0.8486 0.9582 0.8027 0.7593 0.8499 0.8994 0.8326 0.6006 0.7494 0.6727 0.6329 0.7439 0.488 0.4781 0.5432 0.6587 0.6444 0.6971 0.5619 0.8245 0.5798 0.7099 0.5889 0.5118 0.6763 0.562 0.7073 0.6219 0.4891 0.4226 0.5086 0.4642 0.3826 0.4825 0.5184 0.5642 0.7997 0.5173 0.5294 0.1176 0.2264 0.4427 0.5613 0.3962 0.4336 0.3924 0.6009 0.6984 0.6783 0.5783 0.3422 0.4365 0.5563 0.7046 0.5944 0.9015 0.5964 0.6672 0.3463 0.0865 0.4712 0.4691 0.5898 0.5373 0.8154 0.3674 0.5611 0.1747 0.6202 0.5799 0.6434 0.6626 0.2877 0.666 0.4876 0.6598 0.6202 0.7025 0.595 0.7515 0.5054 0.7392 0.0618 0.631 0.6801 0.4013 0.4131 0.5601 0.6148 0.5303 0.0152 0.3122 0.305 0.027 0.0633 0.6609 0.5871 0.4319 0.5613 0.512 0.0897 0.7106 0.6887 0.0037 0.7111 0.5469 0.8208 0.2069 0.5978 0.2889 0.2835 0.3742 0.7193 0.7208 0.1344 0.735 0.5945 0.1983 0.6897 0.7582 0.2923 0.5687 0.7007 0.6399 0.642 0.7759 0.4449 0.6268 0.4312 0.6235 0.395 0.4053 0.7654 0.6137 0.2777 0.3119 0.1642 0.3739 0.7206 0.176 0.1881 0.4778 0.7445 0.4737 0. 0.8932 0.1954 0.4488 0.6149] 2022-08-25 01:42:38 [INFO] [EVAL] Class Recall: [0.8194 0.9 0.9643 0.85 0.852 0.8624 0.8279 0.8989 0.7168 0.7896 0.598 0.7661 0.85 0.4256 0.3064 0.5653 0.6565 0.5527 0.7291 0.522 0.8621 0.5826 0.7095 0.6848 0.4644 0.4953 0.5993 0.49 0.4921 0.5028 0.3221 0.6273 0.3929 0.4966 0.5212 0.6209 0.5168 0.5705 0.3502 0.3286 0.1358 0.0928 0.4512 0.2553 0.5079 0.3891 0.358 0.5802 0.9222 0.6719 0.7002 0.5668 0.3373 0.2283 0.8257 0.6478 0.9414 0.4504 0.5492 0.2447 0.1102 0.2301 0.4858 0.1438 0.6096 0.7903 0.3559 0.535 0.1083 0.4313 0.4822 0.5447 0.45 0.5727 0.5198 0.4453 0.4314 0.2507 0.1162 0.5481 0.8181 0.4361 0.2987 0.0171 0.6923 0.6032 0.1132 0.0735 0.5098 0.5844 0.6148 0.0318 0.3275 0.0801 0.0063 0.0174 0.1764 0.1489 0.346 0.4233 0.1782 0.1333 0.2348 0.5204 0.0002 0.6687 0.2695 0.4908 0.1392 0.4936 0.0791 0.4161 0.1025 0.7376 0.9908 0.0029 0.3896 0.9524 0.1606 0.2697 0.403 0.01 0.2127 0.1353 0.2963 0.2513 0.4925 0.4803 0.7378 0.3949 0.7897 0.0321 0.1067 0.2816 0.0933 0.1504 0.0662 0.0224 0.1472 0.362 0.1511 0.1132 0.3475 0.6264 0.4524 0.0002 0.321 0.0259 0.0881 0.0711] 2022-08-25 01:42:38 [INFO] [EVAL] The model with the best validation mIoU (0.3222) was saved at iter 127000. 
2022-08-25 01:42:46 [INFO] [TRAIN] epoch: 107, iter: 134050/160000, loss: 0.6938, lr: 0.000196, batch_cost: 0.1640, reader_cost: 0.00442, ips: 48.7700 samples/sec | ETA 01:10:56 2022-08-25 01:42:56 [INFO] [TRAIN] epoch: 107, iter: 134100/160000, loss: 0.6497, lr: 0.000196, batch_cost: 0.1902, reader_cost: 0.00113, ips: 42.0622 samples/sec | ETA 01:22:06 2022-08-25 01:43:04 [INFO] [TRAIN] epoch: 107, iter: 134150/160000, loss: 0.6841, lr: 0.000196, batch_cost: 0.1729, reader_cost: 0.00199, ips: 46.2805 samples/sec | ETA 01:14:28 2022-08-25 01:43:14 [INFO] [TRAIN] epoch: 107, iter: 134200/160000, loss: 0.6368, lr: 0.000195, batch_cost: 0.2045, reader_cost: 0.00072, ips: 39.1291 samples/sec | ETA 01:27:54 2022-08-25 01:43:24 [INFO] [TRAIN] epoch: 107, iter: 134250/160000, loss: 0.6723, lr: 0.000195, batch_cost: 0.1926, reader_cost: 0.00055, ips: 41.5409 samples/sec | ETA 01:22:38 2022-08-25 01:43:35 [INFO] [TRAIN] epoch: 107, iter: 134300/160000, loss: 0.6863, lr: 0.000195, batch_cost: 0.2149, reader_cost: 0.00071, ips: 37.2241 samples/sec | ETA 01:32:03 2022-08-25 01:43:48 [INFO] [TRAIN] epoch: 107, iter: 134350/160000, loss: 0.6991, lr: 0.000194, batch_cost: 0.2611, reader_cost: 0.00105, ips: 30.6393 samples/sec | ETA 01:51:37 2022-08-25 01:44:00 [INFO] [TRAIN] epoch: 107, iter: 134400/160000, loss: 0.6989, lr: 0.000194, batch_cost: 0.2489, reader_cost: 0.00328, ips: 32.1353 samples/sec | ETA 01:46:13 2022-08-25 01:44:12 [INFO] [TRAIN] epoch: 107, iter: 134450/160000, loss: 0.7493, lr: 0.000193, batch_cost: 0.2297, reader_cost: 0.00074, ips: 34.8250 samples/sec | ETA 01:37:49 2022-08-25 01:44:24 [INFO] [TRAIN] epoch: 107, iter: 134500/160000, loss: 0.6436, lr: 0.000193, batch_cost: 0.2400, reader_cost: 0.04235, ips: 33.3335 samples/sec | ETA 01:41:59 2022-08-25 01:44:37 [INFO] [TRAIN] epoch: 107, iter: 134550/160000, loss: 0.6237, lr: 0.000193, batch_cost: 0.2589, reader_cost: 0.00486, ips: 30.8947 samples/sec | ETA 01:49:50 2022-08-25 01:44:50 [INFO] [TRAIN] epoch: 107, iter: 134600/160000, loss: 0.7544, lr: 0.000192, batch_cost: 0.2577, reader_cost: 0.00191, ips: 31.0383 samples/sec | ETA 01:49:06 2022-08-25 01:45:02 [INFO] [TRAIN] epoch: 107, iter: 134650/160000, loss: 0.7012, lr: 0.000192, batch_cost: 0.2384, reader_cost: 0.00832, ips: 33.5546 samples/sec | ETA 01:40:43 2022-08-25 01:45:15 [INFO] [TRAIN] epoch: 107, iter: 134700/160000, loss: 0.6966, lr: 0.000192, batch_cost: 0.2792, reader_cost: 0.00635, ips: 28.6540 samples/sec | ETA 01:57:43 2022-08-25 01:45:27 [INFO] [TRAIN] epoch: 107, iter: 134750/160000, loss: 0.6372, lr: 0.000191, batch_cost: 0.2378, reader_cost: 0.01189, ips: 33.6374 samples/sec | ETA 01:40:05 2022-08-25 01:45:40 [INFO] [TRAIN] epoch: 107, iter: 134800/160000, loss: 0.6633, lr: 0.000191, batch_cost: 0.2475, reader_cost: 0.00465, ips: 32.3256 samples/sec | ETA 01:43:56 2022-08-25 01:45:53 [INFO] [TRAIN] epoch: 107, iter: 134850/160000, loss: 0.6419, lr: 0.000190, batch_cost: 0.2713, reader_cost: 0.00096, ips: 29.4929 samples/sec | ETA 01:53:41 2022-08-25 01:46:05 [INFO] [TRAIN] epoch: 107, iter: 134900/160000, loss: 0.7009, lr: 0.000190, batch_cost: 0.2286, reader_cost: 0.00379, ips: 34.9893 samples/sec | ETA 01:35:38 2022-08-25 01:46:17 [INFO] [TRAIN] epoch: 107, iter: 134950/160000, loss: 0.7218, lr: 0.000190, batch_cost: 0.2398, reader_cost: 0.00160, ips: 33.3556 samples/sec | ETA 01:40:07 2022-08-25 01:46:29 [INFO] [TRAIN] epoch: 107, iter: 135000/160000, loss: 0.7270, lr: 0.000189, batch_cost: 0.2421, reader_cost: 0.00053, ips: 33.0507 samples/sec | ETA 
01:40:51 2022-08-25 01:46:29 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 164s - batch_cost: 0.1641 - reader cost: 8.1088e-04 2022-08-25 01:49:13 [INFO] [EVAL] #Images: 2000 mIoU: 0.3195 Acc: 0.7500 Kappa: 0.7307 Dice: 0.4479 2022-08-25 01:49:13 [INFO] [EVAL] Class IoU: [0.6579 0.7737 0.9237 0.7079 0.6674 0.7464 0.7695 0.7574 0.4901 0.6301 0.4596 0.5265 0.6597 0.2962 0.226 0.3841 0.4894 0.4077 0.5489 0.3703 0.7089 0.429 0.5581 0.465 0.3265 0.3691 0.3632 0.4162 0.348 0.3207 0.216 0.3998 0.2872 0.2742 0.3284 0.3859 0.3716 0.5205 0.2643 0.2539 0.0576 0.0703 0.2952 0.2246 0.2764 0.2758 0.1999 0.4247 0.6866 0.4801 0.4644 0.2592 0.2247 0.165 0.5824 0.4653 0.8554 0.3005 0.4978 0.2067 0.0474 0.2031 0.327 0.1409 0.4008 0.666 0.2205 0.3488 0.0929 0.3347 0.3615 0.4279 0.3727 0.2358 0.4041 0.3122 0.3486 0.2134 0.3115 0.2289 0.6205 0.3032 0.2413 0.0426 0.502 0.475 0.0787 0.0654 0.3499 0.408 0.3971 0.0046 0.2355 0.0628 0.0078 0.0124 0.0815 0.1248 0.2287 0.312 0.1319 0.0433 0.1899 0.5844 0.0019 0.5091 0.2418 0.4289 0.1149 0.3864 0.0576 0.2004 0.0872 0.5486 0.6392 0.0038 0.3238 0.5335 0.0768 0.2394 0.4155 0.0088 0.2108 0.1364 0.2532 0.1958 0.3982 0.2877 0.4938 0.266 0.5394 0.0212 0.125 0.3011 0.0994 0.1007 0.0531 0.0184 0.1194 0.326 0.0998 0.0295 0.2435 0.4877 0.2397 0. 0.3048 0.0101 0.0938 0.0381] 2022-08-25 01:49:13 [INFO] [EVAL] Class Precision: [0.7625 0.8322 0.9564 0.8123 0.7692 0.8552 0.8633 0.8309 0.6189 0.7646 0.6666 0.683 0.7464 0.489 0.4608 0.5289 0.6425 0.6525 0.7585 0.5599 0.7946 0.6094 0.7291 0.5737 0.5448 0.5943 0.5394 0.624 0.6832 0.4974 0.4285 0.5368 0.5447 0.4244 0.4752 0.6003 0.5883 0.8084 0.5429 0.5424 0.1239 0.2158 0.4716 0.5212 0.378 0.4226 0.4057 0.6495 0.7655 0.5776 0.5944 0.3178 0.4229 0.4982 0.6983 0.6414 0.8969 0.621 0.6362 0.3663 0.0866 0.4163 0.4437 0.5953 0.5209 0.7971 0.3382 0.5976 0.2502 0.6146 0.5782 0.5921 0.63 0.2936 0.5769 0.4941 0.5738 0.4996 0.8047 0.5435 0.7246 0.575 0.7644 0.1158 0.6176 0.6536 0.4243 0.4107 0.5446 0.5543 0.5284 0.0065 0.4119 0.2783 0.0502 0.0585 0.4789 0.61 0.3688 0.5126 0.6284 0.0754 0.6729 0.7277 0.0312 0.6824 0.5761 0.8346 0.2559 0.6071 0.2171 0.2787 0.4263 0.7679 0.6419 0.1065 0.6935 0.5458 0.173 0.7275 0.7825 0.2904 0.595 0.6477 0.6387 0.619 0.757 0.4212 0.6251 0.4907 0.6197 0.4288 0.3307 0.635 0.6099 0.2749 0.2721 0.2165 0.3588 0.6995 0.2569 0.0891 0.5348 0.6111 0.3259 0. 0.8859 0.2648 0.3952 0.6493] 2022-08-25 01:49:13 [INFO] [EVAL] Class Recall: [0.8274 0.9167 0.9643 0.8464 0.8344 0.8543 0.8763 0.8954 0.7019 0.7817 0.5968 0.6968 0.8502 0.429 0.3073 0.5839 0.6725 0.5209 0.6652 0.5224 0.8679 0.5917 0.7042 0.7105 0.449 0.4934 0.5266 0.5554 0.415 0.4745 0.3034 0.6104 0.378 0.4365 0.5153 0.5193 0.5022 0.5937 0.3399 0.3231 0.0973 0.0944 0.441 0.283 0.507 0.4425 0.2826 0.5509 0.8694 0.7398 0.6798 0.5845 0.3241 0.1978 0.7783 0.6289 0.9486 0.368 0.6958 0.3218 0.095 0.2839 0.5543 0.1558 0.6349 0.802 0.3879 0.4558 0.1288 0.4237 0.491 0.6069 0.4771 0.5448 0.5744 0.4588 0.4703 0.2714 0.3369 0.2834 0.8121 0.3907 0.2607 0.0631 0.7285 0.6348 0.0881 0.0722 0.4946 0.6072 0.6151 0.0153 0.3549 0.075 0.0091 0.0156 0.0894 0.1356 0.3756 0.4437 0.143 0.0924 0.2093 0.7479 0.002 0.6671 0.2941 0.4688 0.1727 0.5152 0.0728 0.4161 0.0988 0.6577 0.9934 0.0039 0.3779 0.9594 0.1213 0.263 0.4698 0.009 0.2462 0.1474 0.2955 0.2226 0.4565 0.4758 0.7017 0.3674 0.8062 0.0218 0.1673 0.3641 0.1061 0.1372 0.0619 0.0197 0.1519 0.3791 0.1403 0.0421 0.3088 0.7071 0.4753 0. 
0.3172 0.0104 0.1095 0.0389] 2022-08-25 01:49:13 [INFO] [EVAL] The model with the best validation mIoU (0.3222) was saved at iter 127000. 2022-08-25 01:49:23 [INFO] [TRAIN] epoch: 107, iter: 135050/160000, loss: 0.6964, lr: 0.000189, batch_cost: 0.1949, reader_cost: 0.00366, ips: 41.0519 samples/sec | ETA 01:21:02 2022-08-25 01:49:32 [INFO] [TRAIN] epoch: 107, iter: 135100/160000, loss: 0.7169, lr: 0.000189, batch_cost: 0.1866, reader_cost: 0.00111, ips: 42.8718 samples/sec | ETA 01:17:26 2022-08-25 01:49:44 [INFO] [TRAIN] epoch: 108, iter: 135150/160000, loss: 0.6951, lr: 0.000188, batch_cost: 0.2393, reader_cost: 0.03810, ips: 33.4270 samples/sec | ETA 01:39:07 2022-08-25 01:49:58 [INFO] [TRAIN] epoch: 108, iter: 135200/160000, loss: 0.6733, lr: 0.000188, batch_cost: 0.2728, reader_cost: 0.00069, ips: 29.3267 samples/sec | ETA 01:52:45 2022-08-25 01:50:10 [INFO] [TRAIN] epoch: 108, iter: 135250/160000, loss: 0.6588, lr: 0.000187, batch_cost: 0.2407, reader_cost: 0.01053, ips: 33.2324 samples/sec | ETA 01:39:18 2022-08-25 01:50:23 [INFO] [TRAIN] epoch: 108, iter: 135300/160000, loss: 0.6835, lr: 0.000187, batch_cost: 0.2575, reader_cost: 0.00592, ips: 31.0710 samples/sec | ETA 01:45:59 2022-08-25 01:50:35 [INFO] [TRAIN] epoch: 108, iter: 135350/160000, loss: 0.7137, lr: 0.000187, batch_cost: 0.2526, reader_cost: 0.00065, ips: 31.6702 samples/sec | ETA 01:43:46 2022-08-25 01:50:48 [INFO] [TRAIN] epoch: 108, iter: 135400/160000, loss: 0.6585, lr: 0.000186, batch_cost: 0.2534, reader_cost: 0.00100, ips: 31.5682 samples/sec | ETA 01:43:54 2022-08-25 01:50:59 [INFO] [TRAIN] epoch: 108, iter: 135450/160000, loss: 0.6529, lr: 0.000186, batch_cost: 0.2255, reader_cost: 0.00059, ips: 35.4742 samples/sec | ETA 01:32:16 2022-08-25 01:51:11 [INFO] [TRAIN] epoch: 108, iter: 135500/160000, loss: 0.6523, lr: 0.000185, batch_cost: 0.2339, reader_cost: 0.00376, ips: 34.1976 samples/sec | ETA 01:35:31 2022-08-25 01:51:24 [INFO] [TRAIN] epoch: 108, iter: 135550/160000, loss: 0.6729, lr: 0.000185, batch_cost: 0.2486, reader_cost: 0.00510, ips: 32.1783 samples/sec | ETA 01:41:18 2022-08-25 01:51:36 [INFO] [TRAIN] epoch: 108, iter: 135600/160000, loss: 0.7038, lr: 0.000185, batch_cost: 0.2570, reader_cost: 0.00353, ips: 31.1258 samples/sec | ETA 01:44:31 2022-08-25 01:51:48 [INFO] [TRAIN] epoch: 108, iter: 135650/160000, loss: 0.6670, lr: 0.000184, batch_cost: 0.2387, reader_cost: 0.00214, ips: 33.5205 samples/sec | ETA 01:36:51 2022-08-25 01:52:00 [INFO] [TRAIN] epoch: 108, iter: 135700/160000, loss: 0.6995, lr: 0.000184, batch_cost: 0.2347, reader_cost: 0.00135, ips: 34.0837 samples/sec | ETA 01:35:03 2022-08-25 01:52:13 [INFO] [TRAIN] epoch: 108, iter: 135750/160000, loss: 0.6667, lr: 0.000184, batch_cost: 0.2613, reader_cost: 0.00040, ips: 30.6107 samples/sec | ETA 01:45:37 2022-08-25 01:52:24 [INFO] [TRAIN] epoch: 108, iter: 135800/160000, loss: 0.7102, lr: 0.000183, batch_cost: 0.2156, reader_cost: 0.01503, ips: 37.1126 samples/sec | ETA 01:26:56 2022-08-25 01:52:37 [INFO] [TRAIN] epoch: 108, iter: 135850/160000, loss: 0.6417, lr: 0.000183, batch_cost: 0.2535, reader_cost: 0.01385, ips: 31.5589 samples/sec | ETA 01:42:01 2022-08-25 01:52:48 [INFO] [TRAIN] epoch: 108, iter: 135900/160000, loss: 0.6864, lr: 0.000182, batch_cost: 0.2210, reader_cost: 0.00769, ips: 36.1952 samples/sec | ETA 01:28:46 2022-08-25 01:52:59 [INFO] [TRAIN] epoch: 108, iter: 135950/160000, loss: 0.7181, lr: 0.000182, batch_cost: 0.2296, reader_cost: 0.00506, ips: 34.8415 samples/sec | ETA 01:32:02 2022-08-25 01:53:11 [INFO] [TRAIN] 
epoch: 108, iter: 136000/160000, loss: 0.6642, lr: 0.000182, batch_cost: 0.2411, reader_cost: 0.01045, ips: 33.1759 samples/sec | ETA 01:36:27 2022-08-25 01:53:11 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 173s - batch_cost: 0.1733 - reader cost: 0.0011 2022-08-25 01:56:05 [INFO] [EVAL] #Images: 2000 mIoU: 0.3193 Acc: 0.7485 Kappa: 0.7293 Dice: 0.4478 2022-08-25 01:56:05 [INFO] [EVAL] Class IoU: [0.6572 0.7735 0.9256 0.7061 0.6707 0.7488 0.7696 0.756 0.4807 0.61 0.4662 0.5283 0.6607 0.2927 0.2263 0.3803 0.4888 0.4139 0.5438 0.3668 0.7113 0.407 0.559 0.4486 0.3143 0.3818 0.3619 0.4068 0.3722 0.3087 0.2199 0.4014 0.2752 0.2726 0.3358 0.3948 0.3672 0.4952 0.2547 0.2705 0.0756 0.0705 0.3027 0.2274 0.2862 0.2814 0.2313 0.4131 0.6963 0.4874 0.472 0.2544 0.2171 0.2152 0.6277 0.459 0.8449 0.2759 0.5014 0.1963 0.0681 0.1751 0.307 0.137 0.3784 0.6563 0.2085 0.3631 0.1029 0.337 0.3534 0.416 0.367 0.2348 0.4027 0.3056 0.3809 0.2039 0.1111 0.199 0.6352 0.2923 0.2455 0.0137 0.5166 0.4817 0.0938 0.0588 0.3387 0.441 0.3798 0.0035 0.2097 0.0757 0.0287 0.0153 0.1025 0.1223 0.1962 0.3604 0.0887 0.035 0.1892 0.5437 0.002 0.5289 0.2099 0.4336 0.1017 0.3548 0.0639 0.2519 0.0926 0.5523 0.6554 0.0043 0.3546 0.5791 0.0762 0.2468 0.4162 0.0124 0.237 0.1463 0.2576 0.2404 0.4072 0.3017 0.422 0.2827 0.54 0.0259 0.119 0.2604 0.0789 0.1006 0.0461 0.0183 0.1286 0.341 0.1212 0.057 0.2464 0.522 0.2707 0. 0.2777 0.0124 0.0931 0.0738] 2022-08-25 01:56:05 [INFO] [EVAL] Class Precision: [0.7688 0.8443 0.9589 0.8074 0.7605 0.856 0.8748 0.8205 0.5966 0.7513 0.6811 0.6753 0.7441 0.4866 0.4466 0.5532 0.6514 0.673 0.6998 0.5433 0.7945 0.5916 0.7328 0.5988 0.5156 0.5856 0.5086 0.6515 0.635 0.4535 0.3914 0.5433 0.4654 0.3861 0.4923 0.5489 0.6046 0.7621 0.5199 0.5112 0.1398 0.2239 0.496 0.5487 0.4078 0.4572 0.3688 0.6362 0.7524 0.59 0.6344 0.3081 0.3923 0.5124 0.7003 0.6131 0.8808 0.6442 0.6497 0.378 0.1082 0.3944 0.4078 0.611 0.4724 0.7707 0.3651 0.573 0.2707 0.6633 0.579 0.5496 0.6374 0.2829 0.613 0.5043 0.5632 0.6145 0.7554 0.545 0.7361 0.5481 0.7563 0.0523 0.64 0.6779 0.3162 0.4616 0.4847 0.6263 0.4908 0.0047 0.4041 0.3019 0.0815 0.0617 0.581 0.5043 0.4193 0.5506 0.5757 0.0585 0.6579 0.7045 0.035 0.764 0.557 0.8506 0.2136 0.5797 0.2663 0.3909 0.3919 0.7662 0.6582 0.1564 0.7668 0.5988 0.1584 0.6868 0.7615 0.1982 0.7958 0.6129 0.6316 0.6501 0.7743 0.4324 0.5972 0.5013 0.6192 0.6212 0.4543 0.7433 0.6365 0.3151 0.3013 0.1203 0.33 0.6856 0.3145 0.1774 0.4939 0.6874 0.3945 0. 
0.9017 0.2857 0.4312 0.6746] 2022-08-25 01:56:05 [INFO] [EVAL] Class Recall: [0.8191 0.9022 0.9638 0.8491 0.8504 0.8567 0.865 0.9059 0.7122 0.7643 0.5963 0.7082 0.855 0.4236 0.3145 0.5488 0.6618 0.5181 0.7092 0.5303 0.8716 0.566 0.7021 0.6415 0.446 0.5232 0.5565 0.52 0.4736 0.4917 0.334 0.6059 0.4024 0.4811 0.5137 0.5845 0.4833 0.5858 0.333 0.3648 0.1414 0.0933 0.4372 0.2797 0.4899 0.4224 0.3827 0.541 0.9033 0.737 0.6484 0.5933 0.3271 0.2706 0.8582 0.6461 0.954 0.3255 0.6872 0.29 0.1555 0.2395 0.554 0.1501 0.6553 0.8155 0.3271 0.4978 0.1424 0.4066 0.4757 0.6312 0.4638 0.5804 0.54 0.4368 0.5407 0.2339 0.1153 0.2386 0.8224 0.385 0.2666 0.0182 0.7282 0.6247 0.1176 0.0631 0.5291 0.5984 0.6267 0.0139 0.3035 0.0918 0.0424 0.0199 0.1106 0.1391 0.2695 0.5105 0.0949 0.0803 0.2099 0.7042 0.0021 0.6323 0.252 0.4693 0.1625 0.4776 0.0775 0.4146 0.1082 0.6643 0.9938 0.0044 0.3974 0.9461 0.1282 0.2782 0.4785 0.0131 0.2523 0.1612 0.3031 0.2762 0.462 0.4996 0.5899 0.3933 0.8084 0.0263 0.1388 0.2861 0.0826 0.1288 0.0517 0.0211 0.174 0.4042 0.1647 0.0775 0.3296 0.6845 0.4631 0. 0.2864 0.0128 0.1061 0.0765] 2022-08-25 01:56:05 [INFO] [EVAL] The model with the best validation mIoU (0.3222) was saved at iter 127000. 2022-08-25 01:56:14 [INFO] [TRAIN] epoch: 108, iter: 136050/160000, loss: 0.6760, lr: 0.000181, batch_cost: 0.1911, reader_cost: 0.00407, ips: 41.8629 samples/sec | ETA 01:16:16 2022-08-25 01:56:24 [INFO] [TRAIN] epoch: 108, iter: 136100/160000, loss: 0.6686, lr: 0.000181, batch_cost: 0.1958, reader_cost: 0.00113, ips: 40.8634 samples/sec | ETA 01:17:59 2022-08-25 01:56:35 [INFO] [TRAIN] epoch: 108, iter: 136150/160000, loss: 0.6743, lr: 0.000181, batch_cost: 0.2202, reader_cost: 0.00037, ips: 36.3308 samples/sec | ETA 01:27:31 2022-08-25 01:56:47 [INFO] [TRAIN] epoch: 108, iter: 136200/160000, loss: 0.7182, lr: 0.000180, batch_cost: 0.2362, reader_cost: 0.01407, ips: 33.8727 samples/sec | ETA 01:33:41 2022-08-25 01:56:58 [INFO] [TRAIN] epoch: 108, iter: 136250/160000, loss: 0.6802, lr: 0.000180, batch_cost: 0.2261, reader_cost: 0.00946, ips: 35.3880 samples/sec | ETA 01:29:29 2022-08-25 01:57:09 [INFO] [TRAIN] epoch: 108, iter: 136300/160000, loss: 0.7102, lr: 0.000179, batch_cost: 0.2202, reader_cost: 0.00094, ips: 36.3248 samples/sec | ETA 01:26:59 2022-08-25 01:57:21 [INFO] [TRAIN] epoch: 108, iter: 136350/160000, loss: 0.6867, lr: 0.000179, batch_cost: 0.2376, reader_cost: 0.00541, ips: 33.6636 samples/sec | ETA 01:33:40 2022-08-25 01:57:32 [INFO] [TRAIN] epoch: 108, iter: 136400/160000, loss: 0.7201, lr: 0.000179, batch_cost: 0.2246, reader_cost: 0.01962, ips: 35.6265 samples/sec | ETA 01:28:19 2022-08-25 01:57:47 [INFO] [TRAIN] epoch: 109, iter: 136450/160000, loss: 0.6169, lr: 0.000178, batch_cost: 0.2864, reader_cost: 0.05257, ips: 27.9284 samples/sec | ETA 01:52:25 2022-08-25 01:57:59 [INFO] [TRAIN] epoch: 109, iter: 136500/160000, loss: 0.6504, lr: 0.000178, batch_cost: 0.2510, reader_cost: 0.00115, ips: 31.8769 samples/sec | ETA 01:38:17 2022-08-25 01:58:11 [INFO] [TRAIN] epoch: 109, iter: 136550/160000, loss: 0.6510, lr: 0.000178, batch_cost: 0.2263, reader_cost: 0.00764, ips: 35.3531 samples/sec | ETA 01:28:26 2022-08-25 01:58:23 [INFO] [TRAIN] epoch: 109, iter: 136600/160000, loss: 0.6900, lr: 0.000177, batch_cost: 0.2554, reader_cost: 0.00061, ips: 31.3260 samples/sec | ETA 01:39:35 2022-08-25 01:58:36 [INFO] [TRAIN] epoch: 109, iter: 136650/160000, loss: 0.6477, lr: 0.000177, batch_cost: 0.2494, reader_cost: 0.00046, ips: 32.0721 samples/sec | ETA 01:37:04 2022-08-25 01:58:47 [INFO] 
[TRAIN] epoch: 109, iter: 136700/160000, loss: 0.7061, lr: 0.000176, batch_cost: 0.2215, reader_cost: 0.03233, ips: 36.1128 samples/sec | ETA 01:26:01 2022-08-25 01:58:59 [INFO] [TRAIN] epoch: 109, iter: 136750/160000, loss: 0.6458, lr: 0.000176, batch_cost: 0.2312, reader_cost: 0.01551, ips: 34.6046 samples/sec | ETA 01:29:35 2022-08-25 01:59:11 [INFO] [TRAIN] epoch: 109, iter: 136800/160000, loss: 0.6743, lr: 0.000176, batch_cost: 0.2419, reader_cost: 0.00120, ips: 33.0730 samples/sec | ETA 01:33:31 2022-08-25 01:59:24 [INFO] [TRAIN] epoch: 109, iter: 136850/160000, loss: 0.6683, lr: 0.000175, batch_cost: 0.2603, reader_cost: 0.00671, ips: 30.7376 samples/sec | ETA 01:40:25 2022-08-25 01:59:34 [INFO] [TRAIN] epoch: 109, iter: 136900/160000, loss: 0.6914, lr: 0.000175, batch_cost: 0.2081, reader_cost: 0.00679, ips: 38.4350 samples/sec | ETA 01:20:08 2022-08-25 01:59:43 [INFO] [TRAIN] epoch: 109, iter: 136950/160000, loss: 0.6511, lr: 0.000175, batch_cost: 0.1843, reader_cost: 0.00058, ips: 43.4034 samples/sec | ETA 01:10:48 2022-08-25 01:59:53 [INFO] [TRAIN] epoch: 109, iter: 137000/160000, loss: 0.7340, lr: 0.000174, batch_cost: 0.1856, reader_cost: 0.00197, ips: 43.0944 samples/sec | ETA 01:11:09 2022-08-25 01:59:53 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1590 - reader cost: 8.1481e-04 2022-08-25 02:02:32 [INFO] [EVAL] #Images: 2000 mIoU: 0.3201 Acc: 0.7488 Kappa: 0.7292 Dice: 0.4481 2022-08-25 02:02:32 [INFO] [EVAL] Class IoU: [0.6566 0.77 0.9258 0.7083 0.6683 0.7412 0.769 0.7582 0.4929 0.6152 0.467 0.5323 0.663 0.2904 0.2211 0.3837 0.4938 0.4236 0.5348 0.3633 0.7182 0.4201 0.5697 0.4527 0.3307 0.3448 0.3608 0.4092 0.3835 0.3426 0.2265 0.3989 0.2812 0.2736 0.3439 0.3814 0.3688 0.5241 0.2609 0.2691 0.0684 0.0783 0.3081 0.2223 0.2757 0.2806 0.218 0.4241 0.7105 0.476 0.4661 0.2437 0.2229 0.2084 0.6186 0.4586 0.8527 0.2898 0.4554 0.1737 0.0548 0.1923 0.3103 0.1126 0.3834 0.6531 0.2169 0.3717 0.0906 0.3249 0.3445 0.4324 0.3694 0.2284 0.4019 0.3106 0.3664 0.2095 0.0758 0.2351 0.6387 0.2841 0.2731 0.0129 0.5131 0.4841 0.0842 0.0747 0.3394 0.4301 0.3715 0.0052 0.1855 0.0774 0.0175 0.0113 0.1401 0.1408 0.1712 0.3431 0.0993 0.0323 0.1925 0.5038 0.0002 0.5244 0.2178 0.4371 0.1038 0.3612 0.064 0.1902 0.0893 0.5749 0.6683 0.004 0.4145 0.5443 0.0651 0.3307 0.4573 0.0103 0.2012 0.1434 0.2576 0.2291 0.4209 0.277 0.4678 0.2771 0.5316 0.0262 0.1518 0.2417 0.1111 0.1008 0.0601 0.0167 0.1306 0.3417 0.0839 0.0486 0.2708 0.4823 0.3089 0. 
0.3363 0.0089 0.075 0.0497] 2022-08-25 02:02:32 [INFO] [EVAL] Class Precision: [0.7539 0.8298 0.9616 0.8192 0.7747 0.8819 0.8875 0.8306 0.6176 0.7437 0.651 0.6674 0.7543 0.4852 0.4815 0.5552 0.6672 0.6457 0.7469 0.5649 0.8055 0.6163 0.7245 0.608 0.4845 0.5868 0.5239 0.6542 0.6098 0.5155 0.3838 0.5478 0.4626 0.399 0.5087 0.5109 0.6097 0.7688 0.5331 0.5225 0.1376 0.2146 0.5364 0.5644 0.3696 0.456 0.3314 0.6349 0.78 0.5772 0.6059 0.2887 0.3993 0.6452 0.6998 0.6492 0.8983 0.6438 0.6608 0.3986 0.0865 0.3498 0.4459 0.6581 0.49 0.7593 0.3459 0.5708 0.2023 0.6 0.6656 0.6191 0.6211 0.2862 0.6339 0.5364 0.6045 0.6125 0.6748 0.569 0.7475 0.5469 0.7497 0.052 0.6525 0.6593 0.3365 0.4144 0.4996 0.6273 0.4715 0.007 0.3328 0.2994 0.0915 0.0455 0.5658 0.505 0.3288 0.5637 0.7011 0.0546 0.6672 0.6158 0.0111 0.7739 0.5865 0.8692 0.1972 0.5911 0.2623 0.2595 0.3795 0.7437 0.6719 0.0868 0.6854 0.5578 0.1245 0.7252 0.7389 0.3685 0.8232 0.6766 0.6413 0.6444 0.7386 0.423 0.6836 0.4969 0.6022 0.4831 0.4667 0.7696 0.633 0.2399 0.2753 0.1114 0.34 0.6856 0.1853 0.138 0.5119 0.707 0.507 0. 0.8912 0.518 0.3754 0.5215] 2022-08-25 02:02:32 [INFO] [EVAL] Class Recall: [0.8358 0.9143 0.9613 0.8396 0.8295 0.8228 0.852 0.8968 0.7095 0.7808 0.6231 0.7245 0.8456 0.4197 0.2901 0.5541 0.6551 0.5519 0.6532 0.5045 0.8689 0.5688 0.7272 0.6394 0.5102 0.4554 0.5368 0.5221 0.5082 0.5054 0.3559 0.5946 0.4177 0.4653 0.5149 0.6006 0.4828 0.6222 0.3382 0.3569 0.1198 0.1098 0.4199 0.2683 0.5203 0.4219 0.389 0.5608 0.8885 0.7309 0.6689 0.6096 0.3353 0.2354 0.8422 0.6096 0.9438 0.3451 0.5943 0.2354 0.1302 0.2993 0.505 0.1196 0.6378 0.8236 0.3678 0.5159 0.1409 0.4147 0.4166 0.5892 0.4768 0.5304 0.5234 0.4246 0.482 0.2415 0.0786 0.286 0.8144 0.3715 0.3005 0.0169 0.706 0.6457 0.101 0.0835 0.5143 0.5776 0.6365 0.0195 0.2953 0.0946 0.0212 0.0148 0.157 0.1633 0.2632 0.4672 0.1037 0.0732 0.2129 0.7347 0.0002 0.6194 0.2574 0.4679 0.1799 0.4815 0.0781 0.4161 0.1045 0.7169 0.9921 0.0042 0.5118 0.9573 0.1201 0.378 0.5454 0.0105 0.2103 0.1539 0.301 0.2622 0.4946 0.4452 0.5971 0.3852 0.8194 0.0269 0.1837 0.2606 0.1187 0.148 0.0714 0.0193 0.175 0.4052 0.1329 0.0697 0.3651 0.6029 0.4415 0. 0.3507 0.009 0.0857 0.052 ] 2022-08-25 02:02:32 [INFO] [EVAL] The model with the best validation mIoU (0.3222) was saved at iter 127000. 
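Validation mIoU has been oscillating between roughly 0.319 and 0.322 for several evaluations while the best checkpoint stays at iter 127000. A small, hypothetical parser for a saved copy of this log that recovers the (iteration, mIoU) curve; the file name is illustrative and the regexes are matched to the line formats visible above ("iter: 135000/160000" and "[EVAL] #Images: 2000 mIoU: 0.3195").

import re

ITER_RE = re.compile(r"iter: (\d+)/\d+")
MIOU_RE = re.compile(r"\[EVAL\] #Images: \d+ mIoU: ([0-9.]+)")

def eval_curve(log_path="trainer.log"):  # hypothetical file name
    """Return (iteration, mIoU) pairs in the order they appear in the log."""
    text = open(log_path, encoding="utf-8").read()
    # Merge both kinds of matches in file order, then pair each mIoU with
    # the most recent training iteration seen before it.
    events = sorted(
        [(m.start(), "iter", int(m.group(1))) for m in ITER_RE.finditer(text)]
        + [(m.start(), "miou", float(m.group(1))) for m in MIOU_RE.finditer(text)]
    )
    points, last_iter = [], None
    for _, kind, value in events:
        if kind == "iter":
            last_iter = value
        elif last_iter is not None:
            points.append((last_iter, value))
    return points

# Expected shape of the output for the stretch of log shown here:
# [(132000, 0.3191), (133000, 0.3207), (134000, 0.3215), (135000, 0.3195), ...]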
2022-08-25 02:02:41 [INFO] [TRAIN] epoch: 109, iter: 137050/160000, loss: 0.6389, lr: 0.000174, batch_cost: 0.1785, reader_cost: 0.00950, ips: 44.8108 samples/sec | ETA 01:08:17 2022-08-25 02:02:50 [INFO] [TRAIN] epoch: 109, iter: 137100/160000, loss: 0.7239, lr: 0.000173, batch_cost: 0.1875, reader_cost: 0.00112, ips: 42.6753 samples/sec | ETA 01:11:32 2022-08-25 02:03:01 [INFO] [TRAIN] epoch: 109, iter: 137150/160000, loss: 0.6517, lr: 0.000173, batch_cost: 0.2113, reader_cost: 0.00065, ips: 37.8526 samples/sec | ETA 01:20:29 2022-08-25 02:03:11 [INFO] [TRAIN] epoch: 109, iter: 137200/160000, loss: 0.6874, lr: 0.000173, batch_cost: 0.1946, reader_cost: 0.00052, ips: 41.1014 samples/sec | ETA 01:13:57 2022-08-25 02:03:23 [INFO] [TRAIN] epoch: 109, iter: 137250/160000, loss: 0.7127, lr: 0.000172, batch_cost: 0.2519, reader_cost: 0.00040, ips: 31.7601 samples/sec | ETA 01:35:30 2022-08-25 02:03:34 [INFO] [TRAIN] epoch: 109, iter: 137300/160000, loss: 0.6863, lr: 0.000172, batch_cost: 0.2202, reader_cost: 0.00069, ips: 36.3235 samples/sec | ETA 01:23:19 2022-08-25 02:03:47 [INFO] [TRAIN] epoch: 109, iter: 137350/160000, loss: 0.6970, lr: 0.000171, batch_cost: 0.2528, reader_cost: 0.00403, ips: 31.6400 samples/sec | ETA 01:35:26 2022-08-25 02:03:59 [INFO] [TRAIN] epoch: 109, iter: 137400/160000, loss: 0.7073, lr: 0.000171, batch_cost: 0.2336, reader_cost: 0.01974, ips: 34.2496 samples/sec | ETA 01:27:58 2022-08-25 02:04:11 [INFO] [TRAIN] epoch: 109, iter: 137450/160000, loss: 0.6639, lr: 0.000171, batch_cost: 0.2450, reader_cost: 0.00051, ips: 32.6496 samples/sec | ETA 01:32:05 2022-08-25 02:04:24 [INFO] [TRAIN] epoch: 109, iter: 137500/160000, loss: 0.6642, lr: 0.000170, batch_cost: 0.2555, reader_cost: 0.00039, ips: 31.3083 samples/sec | ETA 01:35:49 2022-08-25 02:04:36 [INFO] [TRAIN] epoch: 109, iter: 137550/160000, loss: 0.6464, lr: 0.000170, batch_cost: 0.2410, reader_cost: 0.00483, ips: 33.1993 samples/sec | ETA 01:30:09 2022-08-25 02:04:48 [INFO] [TRAIN] epoch: 109, iter: 137600/160000, loss: 0.7135, lr: 0.000170, batch_cost: 0.2454, reader_cost: 0.00216, ips: 32.6056 samples/sec | ETA 01:31:35 2022-08-25 02:05:00 [INFO] [TRAIN] epoch: 109, iter: 137650/160000, loss: 0.6674, lr: 0.000169, batch_cost: 0.2434, reader_cost: 0.00773, ips: 32.8705 samples/sec | ETA 01:30:39 2022-08-25 02:05:14 [INFO] [TRAIN] epoch: 110, iter: 137700/160000, loss: 0.6272, lr: 0.000169, batch_cost: 0.2788, reader_cost: 0.04352, ips: 28.6946 samples/sec | ETA 01:43:37 2022-08-25 02:05:25 [INFO] [TRAIN] epoch: 110, iter: 137750/160000, loss: 0.6521, lr: 0.000168, batch_cost: 0.2137, reader_cost: 0.00553, ips: 37.4297 samples/sec | ETA 01:19:15 2022-08-25 02:05:37 [INFO] [TRAIN] epoch: 110, iter: 137800/160000, loss: 0.7384, lr: 0.000168, batch_cost: 0.2515, reader_cost: 0.00622, ips: 31.8073 samples/sec | ETA 01:33:03 2022-08-25 02:05:49 [INFO] [TRAIN] epoch: 110, iter: 137850/160000, loss: 0.7046, lr: 0.000168, batch_cost: 0.2392, reader_cost: 0.00078, ips: 33.4440 samples/sec | ETA 01:28:18 2022-08-25 02:06:01 [INFO] [TRAIN] epoch: 110, iter: 137900/160000, loss: 0.6680, lr: 0.000167, batch_cost: 0.2331, reader_cost: 0.00731, ips: 34.3130 samples/sec | ETA 01:25:52 2022-08-25 02:06:11 [INFO] [TRAIN] epoch: 110, iter: 137950/160000, loss: 0.7008, lr: 0.000167, batch_cost: 0.2089, reader_cost: 0.02274, ips: 38.3010 samples/sec | ETA 01:16:45 2022-08-25 02:06:22 [INFO] [TRAIN] epoch: 110, iter: 138000/160000, loss: 0.7271, lr: 0.000167, batch_cost: 0.2118, reader_cost: 0.00040, ips: 37.7670 samples/sec | ETA 
01:17:40 2022-08-25 02:06:22 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 139s - batch_cost: 0.1388 - reader cost: 6.6567e-04 2022-08-25 02:08:41 [INFO] [EVAL] #Images: 2000 mIoU: 0.3194 Acc: 0.7489 Kappa: 0.7295 Dice: 0.4470 2022-08-25 02:08:41 [INFO] [EVAL] Class IoU: [0.6592 0.7705 0.9258 0.7081 0.6725 0.7421 0.765 0.7571 0.4876 0.6044 0.4653 0.5268 0.6631 0.304 0.2086 0.3823 0.4992 0.4137 0.5482 0.3689 0.7173 0.4089 0.5564 0.4582 0.3306 0.3605 0.3653 0.4126 0.3784 0.2808 0.2026 0.3962 0.2739 0.2891 0.3758 0.3885 0.3628 0.5024 0.263 0.285 0.0729 0.0785 0.3041 0.2112 0.2818 0.2569 0.2028 0.4244 0.6884 0.5005 0.4706 0.2341 0.2285 0.207 0.609 0.4642 0.8543 0.2925 0.4777 0.1851 0.049 0.1763 0.31 0.171 0.3644 0.6651 0.2246 0.3707 0.1049 0.3266 0.3683 0.4199 0.368 0.2429 0.4235 0.3076 0.3352 0.2219 0.1039 0.352 0.643 0.3034 0.2618 0.0195 0.495 0.4878 0.069 0.0555 0.3173 0.4447 0.3944 0.0038 0.2189 0.0821 0.0096 0.0102 0.1185 0.1421 0.1548 0.3389 0.1182 0.0279 0.2024 0.5563 0.001 0.5087 0.1999 0.4263 0.1081 0.397 0.0608 0.1203 0.096 0.5601 0.7027 0.0056 0.3812 0.5771 0.055 0.2459 0.4637 0.013 0.2228 0.0772 0.2577 0.2069 0.4333 0.2774 0.4065 0.2418 0.5457 0.0199 0.0953 0.2238 0.1023 0.1164 0.0496 0.0182 0.1233 0.3425 0.1317 0.0572 0.267 0.4718 0.3098 0. 0.322 0.0054 0.0901 0.0664] 2022-08-25 02:08:41 [INFO] [EVAL] Class Precision: [0.7613 0.839 0.9611 0.8051 0.7737 0.8679 0.8785 0.8278 0.6112 0.7351 0.675 0.6723 0.7604 0.4626 0.479 0.5507 0.6954 0.6649 0.7113 0.5649 0.8121 0.5794 0.7293 0.5827 0.4939 0.6111 0.5237 0.6874 0.6309 0.4585 0.4026 0.4993 0.4653 0.4462 0.4751 0.5407 0.631 0.7298 0.5076 0.487 0.1375 0.211 0.5275 0.5679 0.3799 0.4547 0.347 0.6412 0.7315 0.6244 0.6184 0.278 0.3975 0.5767 0.6802 0.6532 0.9029 0.6417 0.6512 0.3684 0.086 0.3921 0.4728 0.5932 0.4305 0.8047 0.3606 0.5728 0.2455 0.5449 0.6054 0.5612 0.6283 0.3063 0.6314 0.5209 0.5377 0.6035 0.7936 0.6609 0.7391 0.5212 0.7389 0.0615 0.619 0.6865 0.405 0.46 0.5966 0.6344 0.5256 0.0054 0.3618 0.3192 0.0391 0.0521 0.6123 0.5188 0.3346 0.5301 0.5575 0.0525 0.7017 0.7145 0.0165 0.7178 0.595 0.894 0.2277 0.6003 0.3408 0.1447 0.3733 0.7716 0.7065 0.062 0.6767 0.5938 0.1269 0.7149 0.7529 0.2052 0.7315 0.7682 0.6233 0.6817 0.7442 0.4253 0.5557 0.6683 0.6287 0.465 0.4743 0.7701 0.6472 0.2984 0.3266 0.0705 0.3796 0.6896 0.2083 0.272 0.5316 0.7787 0.5517 0. 0.8593 0.3247 0.3902 0.6565] 2022-08-25 02:08:41 [INFO] [EVAL] Class Recall: [0.8309 0.9043 0.9619 0.8545 0.8373 0.8366 0.8555 0.8987 0.7068 0.7727 0.5997 0.7088 0.8382 0.47 0.2699 0.5557 0.6388 0.5227 0.7051 0.5152 0.8601 0.5815 0.7013 0.682 0.4999 0.4678 0.547 0.5079 0.486 0.4202 0.2897 0.6573 0.3996 0.4508 0.6427 0.5799 0.4606 0.6173 0.3531 0.4073 0.1344 0.1112 0.4179 0.2516 0.5219 0.3713 0.328 0.5566 0.9211 0.7162 0.6632 0.597 0.3495 0.244 0.8533 0.616 0.9407 0.3496 0.6419 0.2712 0.1022 0.2426 0.4738 0.1937 0.7037 0.793 0.3734 0.5124 0.1547 0.4491 0.4846 0.6252 0.4704 0.5401 0.5626 0.4289 0.471 0.2598 0.1068 0.4296 0.8318 0.4206 0.2885 0.0277 0.7119 0.6275 0.0768 0.0593 0.404 0.5979 0.6125 0.0123 0.3566 0.0995 0.0125 0.0125 0.1282 0.1636 0.2236 0.4845 0.1305 0.0561 0.2215 0.7152 0.0011 0.6359 0.2314 0.449 0.1707 0.5396 0.0688 0.4161 0.1144 0.6714 0.9923 0.0061 0.4662 0.9536 0.0884 0.2726 0.5469 0.0137 0.2427 0.079 0.3053 0.2291 0.5091 0.4439 0.6022 0.2747 0.8051 0.0204 0.1065 0.2398 0.1084 0.1603 0.0552 0.0239 0.1544 0.4049 0.2639 0.0676 0.3491 0.5449 0.4141 0. 
0.3399 0.0055 0.1049 0.0687] 2022-08-25 02:08:41 [INFO] [EVAL] The model with the best validation mIoU (0.3222) was saved at iter 127000. 2022-08-25 02:08:51 [INFO] [TRAIN] epoch: 110, iter: 138050/160000, loss: 0.7082, lr: 0.000166, batch_cost: 0.1971, reader_cost: 0.00434, ips: 40.5817 samples/sec | ETA 01:12:07 2022-08-25 02:09:00 [INFO] [TRAIN] epoch: 110, iter: 138100/160000, loss: 0.6467, lr: 0.000166, batch_cost: 0.1913, reader_cost: 0.00139, ips: 41.8228 samples/sec | ETA 01:09:49 2022-08-25 02:09:11 [INFO] [TRAIN] epoch: 110, iter: 138150/160000, loss: 0.6492, lr: 0.000165, batch_cost: 0.2211, reader_cost: 0.00036, ips: 36.1786 samples/sec | ETA 01:20:31 2022-08-25 02:09:22 [INFO] [TRAIN] epoch: 110, iter: 138200/160000, loss: 0.6709, lr: 0.000165, batch_cost: 0.2098, reader_cost: 0.00106, ips: 38.1332 samples/sec | ETA 01:16:13 2022-08-25 02:09:32 [INFO] [TRAIN] epoch: 110, iter: 138250/160000, loss: 0.6624, lr: 0.000165, batch_cost: 0.1904, reader_cost: 0.00045, ips: 42.0170 samples/sec | ETA 01:09:01 2022-08-25 02:09:41 [INFO] [TRAIN] epoch: 110, iter: 138300/160000, loss: 0.7149, lr: 0.000164, batch_cost: 0.1813, reader_cost: 0.00049, ips: 44.1242 samples/sec | ETA 01:05:34 2022-08-25 02:09:52 [INFO] [TRAIN] epoch: 110, iter: 138350/160000, loss: 0.6775, lr: 0.000164, batch_cost: 0.2327, reader_cost: 0.00058, ips: 34.3750 samples/sec | ETA 01:23:58 2022-08-25 02:10:04 [INFO] [TRAIN] epoch: 110, iter: 138400/160000, loss: 0.7103, lr: 0.000164, batch_cost: 0.2382, reader_cost: 0.00051, ips: 33.5784 samples/sec | ETA 01:25:46 2022-08-25 02:10:16 [INFO] [TRAIN] epoch: 110, iter: 138450/160000, loss: 0.6767, lr: 0.000163, batch_cost: 0.2302, reader_cost: 0.01807, ips: 34.7548 samples/sec | ETA 01:22:40 2022-08-25 02:10:28 [INFO] [TRAIN] epoch: 110, iter: 138500/160000, loss: 0.6675, lr: 0.000163, batch_cost: 0.2383, reader_cost: 0.00873, ips: 33.5668 samples/sec | ETA 01:25:24 2022-08-25 02:10:39 [INFO] [TRAIN] epoch: 110, iter: 138550/160000, loss: 0.7083, lr: 0.000162, batch_cost: 0.2378, reader_cost: 0.00054, ips: 33.6388 samples/sec | ETA 01:25:01 2022-08-25 02:10:52 [INFO] [TRAIN] epoch: 110, iter: 138600/160000, loss: 0.7398, lr: 0.000162, batch_cost: 0.2597, reader_cost: 0.00075, ips: 30.8014 samples/sec | ETA 01:32:38 2022-08-25 02:11:04 [INFO] [TRAIN] epoch: 110, iter: 138650/160000, loss: 0.7019, lr: 0.000162, batch_cost: 0.2264, reader_cost: 0.00192, ips: 35.3424 samples/sec | ETA 01:20:32 2022-08-25 02:11:15 [INFO] [TRAIN] epoch: 110, iter: 138700/160000, loss: 0.7075, lr: 0.000161, batch_cost: 0.2340, reader_cost: 0.02273, ips: 34.1937 samples/sec | ETA 01:23:03 2022-08-25 02:11:28 [INFO] [TRAIN] epoch: 110, iter: 138750/160000, loss: 0.6864, lr: 0.000161, batch_cost: 0.2501, reader_cost: 0.00036, ips: 31.9879 samples/sec | ETA 01:28:34 2022-08-25 02:11:39 [INFO] [TRAIN] epoch: 110, iter: 138800/160000, loss: 0.7083, lr: 0.000161, batch_cost: 0.2233, reader_cost: 0.00329, ips: 35.8184 samples/sec | ETA 01:18:54 2022-08-25 02:11:51 [INFO] [TRAIN] epoch: 110, iter: 138850/160000, loss: 0.6297, lr: 0.000160, batch_cost: 0.2319, reader_cost: 0.00431, ips: 34.4962 samples/sec | ETA 01:21:44 2022-08-25 02:12:02 [INFO] [TRAIN] epoch: 110, iter: 138900/160000, loss: 0.6903, lr: 0.000160, batch_cost: 0.2333, reader_cost: 0.00531, ips: 34.2934 samples/sec | ETA 01:22:02 2022-08-25 02:12:17 [INFO] [TRAIN] epoch: 111, iter: 138950/160000, loss: 0.6511, lr: 0.000159, batch_cost: 0.2879, reader_cost: 0.07198, ips: 27.7828 samples/sec | ETA 01:41:01 2022-08-25 02:12:29 [INFO] [TRAIN] 
epoch: 111, iter: 139000/160000, loss: 0.6730, lr: 0.000159, batch_cost: 0.2409, reader_cost: 0.00556, ips: 33.2125 samples/sec | ETA 01:24:18 2022-08-25 02:12:29 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 179s - batch_cost: 0.1784 - reader cost: 8.8163e-04 2022-08-25 02:15:28 [INFO] [EVAL] #Images: 2000 mIoU: 0.3233 Acc: 0.7490 Kappa: 0.7296 Dice: 0.4520 2022-08-25 02:15:28 [INFO] [EVAL] Class IoU: [0.6583 0.7703 0.9254 0.7081 0.6763 0.7425 0.7594 0.7606 0.4886 0.6019 0.4676 0.5213 0.66 0.2959 0.2193 0.3846 0.4982 0.4295 0.5404 0.3607 0.7285 0.4059 0.5481 0.46 0.3221 0.3684 0.3662 0.4129 0.3793 0.2722 0.2018 0.3879 0.2835 0.2854 0.3527 0.3995 0.3663 0.5045 0.259 0.2787 0.0898 0.0745 0.3075 0.2306 0.2859 0.2599 0.2191 0.4239 0.6704 0.4917 0.4718 0.2801 0.2209 0.2045 0.6227 0.4239 0.8516 0.2878 0.461 0.2119 0.0542 0.1846 0.3028 0.1441 0.4019 0.6712 0.2331 0.3777 0.0907 0.3306 0.3717 0.4376 0.3633 0.2355 0.4162 0.2885 0.3344 0.2217 0.1394 0.3459 0.6459 0.3008 0.2711 0.0235 0.5099 0.4897 0.0804 0.0688 0.3075 0.4288 0.4131 0.0038 0.199 0.071 0.028 0.0123 0.1813 0.1347 0.1815 0.3318 0.1139 0.0175 0.1851 0.6672 0.0001 0.5083 0.1919 0.4403 0.1047 0.3853 0.0583 0.2406 0.0856 0.4758 0.7445 0.0031 0.4065 0.5841 0.037 0.3293 0.4495 0.0132 0.2306 0.1225 0.2527 0.2177 0.4238 0.2912 0.4916 0.2925 0.5403 0.0205 0.1419 0.2289 0.1095 0.1171 0.0521 0.0185 0.1077 0.3325 0.1283 0.0548 0.2685 0.4961 0.2724 0. 0.2909 0.0119 0.1003 0.0693] 2022-08-25 02:15:28 [INFO] [EVAL] Class Precision: [0.76 0.8372 0.9621 0.8104 0.7752 0.8675 0.8808 0.8266 0.6108 0.7189 0.6724 0.6495 0.7455 0.4983 0.4701 0.5476 0.6717 0.6469 0.7465 0.5625 0.8186 0.5784 0.6928 0.5921 0.4913 0.6107 0.5391 0.6911 0.6168 0.4098 0.4058 0.4947 0.4725 0.4214 0.5211 0.5507 0.6172 0.7812 0.5201 0.496 0.1407 0.2224 0.5306 0.5056 0.3996 0.4477 0.3569 0.6705 0.7088 0.6083 0.6369 0.3434 0.3821 0.6161 0.7389 0.6278 0.8986 0.635 0.6914 0.3737 0.091 0.537 0.4361 0.6106 0.5195 0.803 0.3795 0.5311 0.1975 0.6229 0.5975 0.5848 0.6457 0.3162 0.6376 0.4415 0.5765 0.5517 0.7379 0.6164 0.7588 0.5866 0.7387 0.0856 0.669 0.6604 0.4726 0.4232 0.5154 0.6004 0.5614 0.0049 0.347 0.3118 0.0847 0.045 0.6223 0.5863 0.4026 0.5285 0.5617 0.0362 0.6442 0.8214 0.0034 0.7191 0.5584 0.8649 0.2681 0.5944 0.279 0.3633 0.3575 0.7401 0.7503 0.0722 0.6546 0.6055 0.0968 0.7373 0.7823 0.2763 0.8125 0.645 0.653 0.6859 0.7251 0.3971 0.6044 0.5103 0.6575 0.56 0.5354 0.771 0.6657 0.2706 0.326 0.1065 0.3793 0.6897 0.28 0.1457 0.4856 0.6904 0.4228 0. 
0.8998 0.256 0.4431 0.6854] 2022-08-25 02:15:28 [INFO] [EVAL] Class Recall: [0.8311 0.9061 0.9604 0.8487 0.8413 0.8374 0.8463 0.905 0.7096 0.7872 0.6056 0.7254 0.8519 0.4214 0.2913 0.5637 0.6587 0.561 0.6619 0.5013 0.8687 0.5765 0.7241 0.6734 0.4834 0.4814 0.533 0.5064 0.4963 0.4477 0.2865 0.6424 0.4148 0.4692 0.5219 0.5926 0.474 0.5876 0.3404 0.3887 0.1987 0.1008 0.4224 0.2978 0.5012 0.3826 0.3619 0.5355 0.9252 0.7196 0.6453 0.6029 0.3437 0.2344 0.7984 0.5662 0.9422 0.3448 0.5805 0.3286 0.118 0.2196 0.4975 0.1587 0.6398 0.8035 0.3765 0.5667 0.1437 0.4134 0.496 0.6348 0.4537 0.48 0.5452 0.4544 0.4434 0.2704 0.1467 0.4408 0.8128 0.3816 0.2999 0.0313 0.682 0.6546 0.0883 0.076 0.4325 0.6 0.6099 0.0161 0.3182 0.0842 0.0402 0.0166 0.2037 0.1489 0.2485 0.4713 0.1251 0.0328 0.2062 0.7805 0.0001 0.6342 0.2262 0.4728 0.1466 0.5227 0.0686 0.4161 0.1012 0.5713 0.9897 0.0032 0.5175 0.9429 0.0566 0.3731 0.5138 0.0136 0.2436 0.1314 0.2919 0.2419 0.505 0.5221 0.7249 0.4066 0.752 0.0208 0.1619 0.2456 0.1158 0.1712 0.0584 0.0219 0.1307 0.391 0.1915 0.0806 0.3753 0.6381 0.4338 0. 0.3006 0.0123 0.1148 0.0716] 2022-08-25 02:15:28 [INFO] [EVAL] The model with the best validation mIoU (0.3233) was saved at iter 139000. 2022-08-25 02:15:38 [INFO] [TRAIN] epoch: 111, iter: 139050/160000, loss: 0.6672, lr: 0.000159, batch_cost: 0.2156, reader_cost: 0.00369, ips: 37.1017 samples/sec | ETA 01:15:17 2022-08-25 02:15:48 [INFO] [TRAIN] epoch: 111, iter: 139100/160000, loss: 0.7252, lr: 0.000158, batch_cost: 0.1962, reader_cost: 0.00066, ips: 40.7795 samples/sec | ETA 01:08:20 2022-08-25 02:15:57 [INFO] [TRAIN] epoch: 111, iter: 139150/160000, loss: 0.7083, lr: 0.000158, batch_cost: 0.1821, reader_cost: 0.00061, ips: 43.9374 samples/sec | ETA 01:03:16 2022-08-25 02:16:08 [INFO] [TRAIN] epoch: 111, iter: 139200/160000, loss: 0.6869, lr: 0.000157, batch_cost: 0.2033, reader_cost: 0.00064, ips: 39.3522 samples/sec | ETA 01:10:28 2022-08-25 02:16:17 [INFO] [TRAIN] epoch: 111, iter: 139250/160000, loss: 0.6611, lr: 0.000157, batch_cost: 0.1962, reader_cost: 0.00304, ips: 40.7825 samples/sec | ETA 01:07:50 2022-08-25 02:16:31 [INFO] [TRAIN] epoch: 111, iter: 139300/160000, loss: 0.6887, lr: 0.000157, batch_cost: 0.2681, reader_cost: 0.00062, ips: 29.8434 samples/sec | ETA 01:32:28 2022-08-25 02:16:43 [INFO] [TRAIN] epoch: 111, iter: 139350/160000, loss: 0.6797, lr: 0.000156, batch_cost: 0.2366, reader_cost: 0.00096, ips: 33.8083 samples/sec | ETA 01:21:26 2022-08-25 02:16:55 [INFO] [TRAIN] epoch: 111, iter: 139400/160000, loss: 0.6583, lr: 0.000156, batch_cost: 0.2448, reader_cost: 0.00059, ips: 32.6767 samples/sec | ETA 01:24:03 2022-08-25 02:17:08 [INFO] [TRAIN] epoch: 111, iter: 139450/160000, loss: 0.6512, lr: 0.000156, batch_cost: 0.2583, reader_cost: 0.00048, ips: 30.9751 samples/sec | ETA 01:28:27 2022-08-25 02:17:20 [INFO] [TRAIN] epoch: 111, iter: 139500/160000, loss: 0.6537, lr: 0.000155, batch_cost: 0.2377, reader_cost: 0.02287, ips: 33.6526 samples/sec | ETA 01:21:13 2022-08-25 02:17:32 [INFO] [TRAIN] epoch: 111, iter: 139550/160000, loss: 0.6296, lr: 0.000155, batch_cost: 0.2512, reader_cost: 0.00871, ips: 31.8489 samples/sec | ETA 01:25:36 2022-08-25 02:17:44 [INFO] [TRAIN] epoch: 111, iter: 139600/160000, loss: 0.6935, lr: 0.000154, batch_cost: 0.2276, reader_cost: 0.00124, ips: 35.1532 samples/sec | ETA 01:17:22 2022-08-25 02:17:56 [INFO] [TRAIN] epoch: 111, iter: 139650/160000, loss: 0.6620, lr: 0.000154, batch_cost: 0.2491, reader_cost: 0.05002, ips: 32.1127 samples/sec | ETA 01:24:29 2022-08-25 02:18:09 
[INFO] [TRAIN] epoch: 111, iter: 139700/160000, loss: 0.6222, lr: 0.000154, batch_cost: 0.2565, reader_cost: 0.00062, ips: 31.1862 samples/sec | ETA 01:26:47 2022-08-25 02:18:22 [INFO] [TRAIN] epoch: 111, iter: 139750/160000, loss: 0.6889, lr: 0.000153, batch_cost: 0.2671, reader_cost: 0.00383, ips: 29.9467 samples/sec | ETA 01:30:09 2022-08-25 02:18:36 [INFO] [TRAIN] epoch: 111, iter: 139800/160000, loss: 0.6552, lr: 0.000153, batch_cost: 0.2726, reader_cost: 0.01209, ips: 29.3461 samples/sec | ETA 01:31:46 2022-08-25 02:18:48 [INFO] [TRAIN] epoch: 111, iter: 139850/160000, loss: 0.6894, lr: 0.000153, batch_cost: 0.2367, reader_cost: 0.00035, ips: 33.7968 samples/sec | ETA 01:19:29 2022-08-25 02:18:58 [INFO] [TRAIN] epoch: 111, iter: 139900/160000, loss: 0.7071, lr: 0.000152, batch_cost: 0.2115, reader_cost: 0.00049, ips: 37.8278 samples/sec | ETA 01:10:50 2022-08-25 02:19:07 [INFO] [TRAIN] epoch: 111, iter: 139950/160000, loss: 0.6454, lr: 0.000152, batch_cost: 0.1840, reader_cost: 0.00325, ips: 43.4846 samples/sec | ETA 01:01:28 2022-08-25 02:19:17 [INFO] [TRAIN] epoch: 111, iter: 140000/160000, loss: 0.7043, lr: 0.000151, batch_cost: 0.1865, reader_cost: 0.00208, ips: 42.8946 samples/sec | ETA 01:02:10 2022-08-25 02:19:17 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 158s - batch_cost: 0.1575 - reader cost: 6.5507e-04 2022-08-25 02:21:55 [INFO] [EVAL] #Images: 2000 mIoU: 0.3218 Acc: 0.7483 Kappa: 0.7288 Dice: 0.4494 2022-08-25 02:21:55 [INFO] [EVAL] Class IoU: [0.6566 0.7705 0.925 0.708 0.6675 0.7386 0.7626 0.7627 0.4841 0.6161 0.4616 0.5287 0.6602 0.2968 0.2103 0.382 0.4844 0.4111 0.5459 0.3663 0.7339 0.4124 0.5588 0.4632 0.3148 0.37 0.3727 0.41 0.3678 0.3204 0.2118 0.3966 0.2859 0.2872 0.3214 0.3842 0.3687 0.4951 0.2554 0.2611 0.0854 0.0647 0.2957 0.2235 0.2843 0.2576 0.1929 0.4257 0.6905 0.4857 0.4693 0.2725 0.2286 0.1961 0.6102 0.4251 0.852 0.3277 0.4684 0.2125 0.0601 0.1667 0.287 0.1625 0.3884 0.6667 0.2255 0.377 0.0864 0.3518 0.3806 0.4299 0.3724 0.2338 0.4102 0.3051 0.3232 0.2251 0.0306 0.3079 0.6578 0.2969 0.2645 0.0151 0.5125 0.4835 0.0802 0.0562 0.3316 0.436 0.3915 0.0047 0.1929 0.0861 0.004 0.0106 0.1771 0.1447 0.1782 0.3493 0.107 0.0288 0.2209 0.6264 0.0023 0.5106 0.2239 0.505 0.0949 0.3899 0.0491 0.2834 0.0812 0.5678 0.7128 0.0042 0.418 0.5932 0.0415 0.2988 0.4547 0.0153 0.1957 0.1157 0.2553 0.2506 0.4249 0.3119 0.4432 0.2394 0.5455 0.0205 0.1413 0.2132 0.1093 0.1061 0.0693 0.0189 0.1074 0.338 0.0619 0.0453 0.2663 0.4603 0.3009 0. 
0.2836 0.0133 0.0805 0.0647] 2022-08-25 02:21:55 [INFO] [EVAL] Class Precision: [0.7588 0.8325 0.9595 0.8145 0.7592 0.8795 0.8871 0.8304 0.6099 0.7628 0.6804 0.6546 0.7442 0.4723 0.4746 0.5413 0.6392 0.6606 0.7286 0.5512 0.8367 0.5799 0.7156 0.6081 0.5021 0.5777 0.5321 0.6222 0.626 0.4639 0.4278 0.532 0.4847 0.412 0.5112 0.5448 0.6096 0.7472 0.4992 0.5144 0.1565 0.2201 0.4663 0.5629 0.3932 0.4262 0.3412 0.6496 0.7422 0.5931 0.613 0.3337 0.3941 0.5928 0.7102 0.5597 0.9014 0.6118 0.6707 0.3874 0.0955 0.3718 0.3828 0.6156 0.5167 0.8186 0.3937 0.5455 0.2415 0.6695 0.6006 0.5676 0.6 0.2865 0.609 0.5105 0.5466 0.5715 0.6557 0.5751 0.7644 0.5462 0.7514 0.0593 0.6497 0.6572 0.4097 0.438 0.5376 0.5967 0.5236 0.0066 0.3242 0.2904 0.0185 0.0374 0.6185 0.4481 0.383 0.5317 0.6019 0.0494 0.6967 0.7378 0.0387 0.7389 0.565 0.8925 0.2534 0.5469 0.2735 0.4706 0.4136 0.7211 0.7175 0.0666 0.7051 0.6162 0.0983 0.7215 0.7539 0.2246 0.7223 0.6779 0.6593 0.6368 0.7808 0.4415 0.6347 0.4881 0.634 0.6251 0.4338 0.7484 0.6229 0.2265 0.3437 0.0752 0.3529 0.6741 0.1961 0.2043 0.5252 0.727 0.4707 0. 0.9052 0.3388 0.4711 0.6828] 2022-08-25 02:21:55 [INFO] [EVAL] Class Recall: [0.8298 0.9119 0.9626 0.8442 0.8468 0.8217 0.8445 0.9035 0.7013 0.7621 0.5893 0.7334 0.854 0.444 0.274 0.5649 0.6666 0.5212 0.6852 0.5219 0.8566 0.5881 0.7183 0.6604 0.4576 0.5073 0.5543 0.5459 0.4714 0.5087 0.2954 0.6091 0.4108 0.4868 0.4639 0.5659 0.4827 0.5948 0.3434 0.3464 0.1581 0.084 0.447 0.2704 0.5065 0.3943 0.3074 0.5526 0.9084 0.7284 0.6669 0.5977 0.3524 0.2266 0.8124 0.6388 0.9396 0.4138 0.6084 0.3201 0.1397 0.232 0.534 0.1808 0.6101 0.7823 0.3454 0.5497 0.1186 0.4257 0.5096 0.6393 0.4953 0.5594 0.5569 0.4312 0.4415 0.2708 0.0311 0.3986 0.825 0.3942 0.2899 0.0198 0.7082 0.6466 0.0907 0.0606 0.464 0.6182 0.608 0.0165 0.3226 0.109 0.0051 0.0146 0.1989 0.1761 0.2499 0.5045 0.1152 0.0643 0.2444 0.8058 0.0024 0.6229 0.2706 0.5377 0.1317 0.5759 0.0565 0.4161 0.0918 0.7275 0.9909 0.0045 0.5066 0.941 0.0669 0.3377 0.5339 0.0161 0.2117 0.1224 0.2941 0.2924 0.4825 0.5151 0.5949 0.3197 0.7962 0.0208 0.1732 0.2297 0.117 0.1663 0.0799 0.0246 0.1337 0.404 0.0829 0.0551 0.3508 0.5565 0.4549 0. 0.2923 0.0136 0.0885 0.0667] 2022-08-25 02:21:55 [INFO] [EVAL] The model with the best validation mIoU (0.3233) was saved at iter 139000. 
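Note: the per-class numbers in the iter-140000 [EVAL] report above are internally consistent. The summary mIoU is (to rounding) the mean of the 150-entry Class IoU vector (ADE20K has 150 classes), and each class IoU can be recovered from its precision and recall via IoU = P*R / (P + R - P*R), since all three come from the same confusion matrix. A minimal NumPy sketch of that check, using the first two values copied from the report above (the array names are illustrative, not part of the log format):

```python
import numpy as np

# First two entries of Class Precision / Class Recall from the iter-140000 [EVAL] report.
precision = np.array([0.7588, 0.8325])
recall    = np.array([0.8298, 0.9119])

# IoU = TP / (TP + FP + FN) = P * R / (P + R - P * R)
iou = precision * recall / (precision + recall - precision * recall)
print(iou)   # ~= [0.6566, 0.7705], matching the first two Class IoU entries

# The summary line's mIoU is just the mean over the full per-class vector:
# miou = np.nanmean(class_iou)   # ~= 0.3218 for the report above
```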
2022-08-25 02:22:05 [INFO] [TRAIN] epoch: 111, iter: 140050/160000, loss: 0.6812, lr: 0.000151, batch_cost: 0.2030, reader_cost: 0.00260, ips: 39.4121 samples/sec | ETA 01:07:29 2022-08-25 02:22:15 [INFO] [TRAIN] epoch: 111, iter: 140100/160000, loss: 0.6463, lr: 0.000151, batch_cost: 0.2029, reader_cost: 0.00117, ips: 39.4192 samples/sec | ETA 01:07:18 2022-08-25 02:22:24 [INFO] [TRAIN] epoch: 111, iter: 140150/160000, loss: 0.6787, lr: 0.000150, batch_cost: 0.1871, reader_cost: 0.00768, ips: 42.7579 samples/sec | ETA 01:01:53 2022-08-25 02:22:36 [INFO] [TRAIN] epoch: 112, iter: 140200/160000, loss: 0.6860, lr: 0.000150, batch_cost: 0.2377, reader_cost: 0.03386, ips: 33.6526 samples/sec | ETA 01:18:26 2022-08-25 02:22:46 [INFO] [TRAIN] epoch: 112, iter: 140250/160000, loss: 0.6861, lr: 0.000150, batch_cost: 0.1916, reader_cost: 0.00095, ips: 41.7462 samples/sec | ETA 01:03:04 2022-08-25 02:22:56 [INFO] [TRAIN] epoch: 112, iter: 140300/160000, loss: 0.7258, lr: 0.000149, batch_cost: 0.2115, reader_cost: 0.00058, ips: 37.8325 samples/sec | ETA 01:09:25 2022-08-25 02:23:09 [INFO] [TRAIN] epoch: 112, iter: 140350/160000, loss: 0.6958, lr: 0.000149, batch_cost: 0.2555, reader_cost: 0.00162, ips: 31.3129 samples/sec | ETA 01:23:40 2022-08-25 02:23:21 [INFO] [TRAIN] epoch: 112, iter: 140400/160000, loss: 0.6399, lr: 0.000148, batch_cost: 0.2426, reader_cost: 0.00339, ips: 32.9747 samples/sec | ETA 01:19:15 2022-08-25 02:23:33 [INFO] [TRAIN] epoch: 112, iter: 140450/160000, loss: 0.6776, lr: 0.000148, batch_cost: 0.2407, reader_cost: 0.03663, ips: 33.2321 samples/sec | ETA 01:18:26 2022-08-25 02:23:45 [INFO] [TRAIN] epoch: 112, iter: 140500/160000, loss: 0.6435, lr: 0.000148, batch_cost: 0.2407, reader_cost: 0.00080, ips: 33.2426 samples/sec | ETA 01:18:12 2022-08-25 02:23:58 [INFO] [TRAIN] epoch: 112, iter: 140550/160000, loss: 0.6970, lr: 0.000147, batch_cost: 0.2430, reader_cost: 0.01634, ips: 32.9196 samples/sec | ETA 01:18:46 2022-08-25 02:24:09 [INFO] [TRAIN] epoch: 112, iter: 140600/160000, loss: 0.6204, lr: 0.000147, batch_cost: 0.2299, reader_cost: 0.00156, ips: 34.7905 samples/sec | ETA 01:14:20 2022-08-25 02:24:22 [INFO] [TRAIN] epoch: 112, iter: 140650/160000, loss: 0.6420, lr: 0.000147, batch_cost: 0.2560, reader_cost: 0.00856, ips: 31.2446 samples/sec | ETA 01:22:34 2022-08-25 02:24:35 [INFO] [TRAIN] epoch: 112, iter: 140700/160000, loss: 0.6660, lr: 0.000146, batch_cost: 0.2613, reader_cost: 0.00087, ips: 30.6136 samples/sec | ETA 01:24:03 2022-08-25 02:24:47 [INFO] [TRAIN] epoch: 112, iter: 140750/160000, loss: 0.6386, lr: 0.000146, batch_cost: 0.2447, reader_cost: 0.00048, ips: 32.6988 samples/sec | ETA 01:18:29 2022-08-25 02:24:59 [INFO] [TRAIN] epoch: 112, iter: 140800/160000, loss: 0.6562, lr: 0.000145, batch_cost: 0.2389, reader_cost: 0.01083, ips: 33.4798 samples/sec | ETA 01:16:27 2022-08-25 02:25:10 [INFO] [TRAIN] epoch: 112, iter: 140850/160000, loss: 0.6421, lr: 0.000145, batch_cost: 0.2247, reader_cost: 0.00163, ips: 35.6055 samples/sec | ETA 01:11:42 2022-08-25 02:25:21 [INFO] [TRAIN] epoch: 112, iter: 140900/160000, loss: 0.6317, lr: 0.000145, batch_cost: 0.2122, reader_cost: 0.00853, ips: 37.7019 samples/sec | ETA 01:07:32 2022-08-25 02:25:32 [INFO] [TRAIN] epoch: 112, iter: 140950/160000, loss: 0.6829, lr: 0.000144, batch_cost: 0.2144, reader_cost: 0.00054, ips: 37.3078 samples/sec | ETA 01:08:04 2022-08-25 02:25:41 [INFO] [TRAIN] epoch: 112, iter: 141000/160000, loss: 0.6852, lr: 0.000144, batch_cost: 0.1920, reader_cost: 0.00089, ips: 41.6716 samples/sec | ETA 
01:00:47 2022-08-25 02:25:41 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 150s - batch_cost: 0.1497 - reader cost: 7.8572e-04 2022-08-25 02:28:11 [INFO] [EVAL] #Images: 2000 mIoU: 0.3223 Acc: 0.7504 Kappa: 0.7311 Dice: 0.4507 2022-08-25 02:28:11 [INFO] [EVAL] Class IoU: [0.6623 0.7734 0.925 0.7052 0.6658 0.7454 0.7633 0.7637 0.4906 0.6273 0.4655 0.5295 0.6652 0.2924 0.2161 0.3895 0.494 0.4096 0.5562 0.3675 0.7245 0.401 0.5677 0.4655 0.3193 0.3749 0.3708 0.4114 0.3737 0.3567 0.2266 0.3947 0.2734 0.2869 0.377 0.38 0.3691 0.4889 0.2716 0.263 0.0879 0.0591 0.3024 0.2274 0.2783 0.2631 0.2329 0.4305 0.667 0.4956 0.4748 0.2615 0.2223 0.2091 0.5997 0.4286 0.8352 0.3387 0.4284 0.1823 0.0584 0.184 0.2885 0.1236 0.3813 0.6612 0.228 0.3701 0.0982 0.3424 0.3763 0.4366 0.3623 0.2311 0.417 0.298 0.331 0.2223 0.0815 0.2253 0.6492 0.2878 0.255 0.0158 0.5104 0.4797 0.0904 0.0697 0.3315 0.4643 0.3794 0.0052 0.2253 0.0792 0.0034 0.0115 0.1557 0.1514 0.1912 0.3194 0.0855 0.0432 0.1834 0.6225 0.0019 0.5132 0.2304 0.4891 0.0968 0.3628 0.0651 0.179 0.0876 0.4943 0.7225 0.0022 0.4337 0.5542 0.0559 0.3504 0.4359 0.0137 0.2249 0.1634 0.2393 0.2007 0.4205 0.2793 0.472 0.2762 0.561 0.0221 0.1462 0.2726 0.0935 0.1078 0.0551 0.0202 0.1178 0.3459 0.116 0.0455 0.2584 0.492 0.3232 0. 0.2819 0.0097 0.0891 0.0644] 2022-08-25 02:28:11 [INFO] [EVAL] Class Precision: [0.7628 0.8424 0.9606 0.8023 0.7549 0.8605 0.8895 0.839 0.6253 0.7373 0.676 0.6528 0.7585 0.4932 0.4556 0.5478 0.6885 0.6497 0.7286 0.568 0.8128 0.5667 0.7407 0.6143 0.5338 0.5784 0.5463 0.6712 0.6163 0.5279 0.4019 0.5219 0.5142 0.3904 0.5286 0.5527 0.6065 0.7198 0.499 0.5162 0.136 0.2477 0.5094 0.5422 0.3974 0.4493 0.3811 0.6451 0.7078 0.6103 0.6516 0.3261 0.388 0.6321 0.6893 0.6126 0.8721 0.6136 0.677 0.35 0.0938 0.4819 0.3983 0.6355 0.4818 0.8114 0.3408 0.5392 0.2297 0.6032 0.632 0.6064 0.6138 0.2947 0.6413 0.4765 0.4857 0.6332 0.7507 0.4934 0.7525 0.595 0.7573 0.058 0.678 0.6686 0.3769 0.4323 0.5698 0.7083 0.4873 0.0073 0.3778 0.308 0.0175 0.0412 0.7033 0.5556 0.4359 0.4843 0.6257 0.0686 0.6925 0.7198 0.02 0.7265 0.5329 0.9011 0.1878 0.5961 0.2891 0.239 0.3628 0.7656 0.7267 0.0433 0.6706 0.569 0.1439 0.6983 0.7827 0.196 0.6844 0.6336 0.6669 0.6415 0.7672 0.3712 0.5985 0.5624 0.6573 0.6879 0.4244 0.7155 0.6503 0.246 0.3344 0.0985 0.3241 0.6803 0.231 0.1473 0.5571 0.754 0.6019 0. 0.9061 0.3464 0.4268 0.7721] 2022-08-25 02:28:11 [INFO] [EVAL] Class Recall: [0.8341 0.9043 0.9615 0.8535 0.8493 0.8478 0.8432 0.8949 0.6948 0.8079 0.5992 0.7371 0.844 0.4179 0.2913 0.5742 0.6362 0.5257 0.7015 0.5101 0.8695 0.5784 0.7086 0.6577 0.4429 0.5158 0.5359 0.5152 0.487 0.5238 0.3419 0.6182 0.3687 0.5199 0.568 0.5487 0.4853 0.6038 0.3734 0.3491 0.1992 0.072 0.4267 0.2815 0.4815 0.3883 0.3745 0.5641 0.9204 0.7251 0.6363 0.5692 0.3424 0.238 0.8219 0.5879 0.9519 0.4306 0.5385 0.2756 0.1339 0.2294 0.5112 0.1331 0.6463 0.7813 0.4081 0.5414 0.1464 0.442 0.4819 0.6091 0.4693 0.5171 0.5438 0.4432 0.5097 0.2551 0.0838 0.2931 0.8255 0.3579 0.2778 0.0213 0.6737 0.6294 0.1062 0.0767 0.4422 0.574 0.6316 0.0176 0.3582 0.0963 0.0041 0.0157 0.1666 0.1723 0.254 0.484 0.0902 0.1045 0.1997 0.8216 0.0021 0.636 0.2887 0.5168 0.1665 0.4811 0.0775 0.4161 0.1035 0.5824 0.992 0.0023 0.5511 0.9551 0.0836 0.4129 0.496 0.0146 0.2509 0.1805 0.2718 0.2261 0.482 0.5299 0.6908 0.3519 0.793 0.0224 0.1823 0.3058 0.0985 0.1609 0.0619 0.0247 0.1562 0.413 0.1889 0.0618 0.3252 0.5861 0.411 0. 
0.2904 0.0099 0.1012 0.0657] 2022-08-25 02:28:11 [INFO] [EVAL] The model with the best validation mIoU (0.3233) was saved at iter 139000. 2022-08-25 02:28:21 [INFO] [TRAIN] epoch: 112, iter: 141050/160000, loss: 0.6662, lr: 0.000143, batch_cost: 0.1853, reader_cost: 0.00284, ips: 43.1771 samples/sec | ETA 00:58:31 2022-08-25 02:28:30 [INFO] [TRAIN] epoch: 112, iter: 141100/160000, loss: 0.6846, lr: 0.000143, batch_cost: 0.1921, reader_cost: 0.00156, ips: 41.6380 samples/sec | ETA 01:00:31 2022-08-25 02:28:41 [INFO] [TRAIN] epoch: 112, iter: 141150/160000, loss: 0.6750, lr: 0.000143, batch_cost: 0.2039, reader_cost: 0.00346, ips: 39.2432 samples/sec | ETA 01:04:02 2022-08-25 02:28:49 [INFO] [TRAIN] epoch: 112, iter: 141200/160000, loss: 0.6500, lr: 0.000142, batch_cost: 0.1693, reader_cost: 0.00197, ips: 47.2614 samples/sec | ETA 00:53:02 2022-08-25 02:28:58 [INFO] [TRAIN] epoch: 112, iter: 141250/160000, loss: 0.6478, lr: 0.000142, batch_cost: 0.1860, reader_cost: 0.01636, ips: 42.9997 samples/sec | ETA 00:58:08 2022-08-25 02:29:07 [INFO] [TRAIN] epoch: 112, iter: 141300/160000, loss: 0.7228, lr: 0.000142, batch_cost: 0.1839, reader_cost: 0.00609, ips: 43.4972 samples/sec | ETA 00:57:19 2022-08-25 02:29:19 [INFO] [TRAIN] epoch: 112, iter: 141350/160000, loss: 0.6740, lr: 0.000141, batch_cost: 0.2233, reader_cost: 0.00137, ips: 35.8323 samples/sec | ETA 01:09:23 2022-08-25 02:29:30 [INFO] [TRAIN] epoch: 112, iter: 141400/160000, loss: 0.7071, lr: 0.000141, batch_cost: 0.2295, reader_cost: 0.01922, ips: 34.8562 samples/sec | ETA 01:11:08 2022-08-25 02:29:42 [INFO] [TRAIN] epoch: 112, iter: 141450/160000, loss: 0.6775, lr: 0.000140, batch_cost: 0.2333, reader_cost: 0.00151, ips: 34.2902 samples/sec | ETA 01:12:07 2022-08-25 02:29:57 [INFO] [TRAIN] epoch: 113, iter: 141500/160000, loss: 0.6449, lr: 0.000140, batch_cost: 0.3088, reader_cost: 0.04964, ips: 25.9028 samples/sec | ETA 01:35:13 2022-08-25 02:30:10 [INFO] [TRAIN] epoch: 113, iter: 141550/160000, loss: 0.6661, lr: 0.000140, batch_cost: 0.2595, reader_cost: 0.00784, ips: 30.8339 samples/sec | ETA 01:19:46 2022-08-25 02:30:21 [INFO] [TRAIN] epoch: 113, iter: 141600/160000, loss: 0.7300, lr: 0.000139, batch_cost: 0.2190, reader_cost: 0.02495, ips: 36.5286 samples/sec | ETA 01:07:09 2022-08-25 02:30:33 [INFO] [TRAIN] epoch: 113, iter: 141650/160000, loss: 0.6855, lr: 0.000139, batch_cost: 0.2450, reader_cost: 0.01814, ips: 32.6517 samples/sec | ETA 01:14:55 2022-08-25 02:30:45 [INFO] [TRAIN] epoch: 113, iter: 141700/160000, loss: 0.6821, lr: 0.000139, batch_cost: 0.2239, reader_cost: 0.00505, ips: 35.7382 samples/sec | ETA 01:08:16 2022-08-25 02:30:57 [INFO] [TRAIN] epoch: 113, iter: 141750/160000, loss: 0.7217, lr: 0.000138, batch_cost: 0.2497, reader_cost: 0.00065, ips: 32.0416 samples/sec | ETA 01:15:56 2022-08-25 02:31:09 [INFO] [TRAIN] epoch: 113, iter: 141800/160000, loss: 0.7074, lr: 0.000138, batch_cost: 0.2458, reader_cost: 0.00041, ips: 32.5430 samples/sec | ETA 01:14:34 2022-08-25 02:31:22 [INFO] [TRAIN] epoch: 113, iter: 141850/160000, loss: 0.6692, lr: 0.000137, batch_cost: 0.2529, reader_cost: 0.00169, ips: 31.6327 samples/sec | ETA 01:16:30 2022-08-25 02:31:35 [INFO] [TRAIN] epoch: 113, iter: 141900/160000, loss: 0.6679, lr: 0.000137, batch_cost: 0.2549, reader_cost: 0.01859, ips: 31.3832 samples/sec | ETA 01:16:53 2022-08-25 02:31:47 [INFO] [TRAIN] epoch: 113, iter: 141950/160000, loss: 0.6703, lr: 0.000137, batch_cost: 0.2525, reader_cost: 0.00132, ips: 31.6863 samples/sec | ETA 01:15:57 2022-08-25 02:32:02 [INFO] [TRAIN] 
epoch: 113, iter: 142000/160000, loss: 0.6970, lr: 0.000136, batch_cost: 0.2853, reader_cost: 0.00141, ips: 28.0442 samples/sec | ETA 01:25:34 2022-08-25 02:32:02 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 152s - batch_cost: 0.1524 - reader cost: 0.0010 2022-08-25 02:34:34 [INFO] [EVAL] #Images: 2000 mIoU: 0.3238 Acc: 0.7505 Kappa: 0.7312 Dice: 0.4520 2022-08-25 02:34:34 [INFO] [EVAL] Class IoU: [0.6618 0.7713 0.9259 0.7066 0.6701 0.748 0.7643 0.7557 0.4882 0.6158 0.4638 0.523 0.6637 0.288 0.2071 0.3784 0.4838 0.4243 0.5483 0.3709 0.724 0.4244 0.5567 0.4652 0.3389 0.3812 0.3622 0.4143 0.3642 0.3477 0.184 0.4136 0.2678 0.2842 0.3451 0.3991 0.3735 0.5208 0.2689 0.2636 0.0934 0.0651 0.3049 0.2288 0.2793 0.2865 0.2203 0.4335 0.6875 0.4959 0.476 0.2723 0.2194 0.2171 0.6037 0.4448 0.8495 0.2744 0.4536 0.1781 0.0519 0.1797 0.3136 0.1531 0.3813 0.6592 0.2385 0.3695 0.0971 0.3398 0.3677 0.4341 0.3732 0.2259 0.409 0.3071 0.33 0.2047 0.0916 0.2773 0.641 0.2895 0.2476 0.02 0.5091 0.4859 0.0869 0.0614 0.328 0.4173 0.4071 0.006 0.2079 0.0881 0.0046 0.0116 0.1674 0.1547 0.1828 0.3227 0.1179 0.0203 0.1869 0.6796 0.0007 0.5213 0.2204 0.4902 0.0951 0.3859 0.0612 0.1443 0.0864 0.5569 0.6633 0.0053 0.4413 0.5689 0.0523 0.2797 0.4556 0.0112 0.2287 0.1272 0.2442 0.2484 0.4366 0.2867 0.5145 0.2806 0.5357 0.0275 0.1423 0.2644 0.0905 0.1087 0.0562 0.0185 0.1163 0.3356 0.1239 0.0495 0.28 0.5197 0.3293 0. 0.2985 0.0147 0.0859 0.0685] 2022-08-25 02:34:34 [INFO] [EVAL] Class Precision: [0.7662 0.8418 0.9601 0.8025 0.7657 0.8619 0.8792 0.822 0.609 0.75 0.6547 0.6674 0.7515 0.4763 0.4797 0.5488 0.6526 0.6502 0.723 0.5675 0.8127 0.5694 0.7136 0.5725 0.5259 0.5521 0.5362 0.6826 0.6381 0.5164 0.44 0.5544 0.4917 0.3938 0.4868 0.5352 0.5904 0.7596 0.5102 0.5177 0.1568 0.2385 0.5167 0.5685 0.3972 0.4528 0.3667 0.6602 0.7344 0.6107 0.6353 0.343 0.4081 0.6215 0.7061 0.6053 0.8956 0.6507 0.7146 0.3693 0.0926 0.4657 0.452 0.5943 0.4747 0.83 0.3729 0.5556 0.3099 0.6121 0.6171 0.5798 0.6117 0.2727 0.624 0.5127 0.5174 0.6597 0.7446 0.5528 0.7432 0.5647 0.7508 0.0701 0.657 0.6847 0.3159 0.4476 0.5652 0.5644 0.5522 0.0083 0.325 0.3091 0.0242 0.0443 0.6633 0.4787 0.4211 0.538 0.6648 0.0432 0.704 0.8554 0.0112 0.7353 0.5303 0.8682 0.2082 0.5549 0.2631 0.1809 0.3872 0.7551 0.6659 0.0913 0.7002 0.5859 0.1281 0.7193 0.7423 0.3524 0.8021 0.6838 0.631 0.6657 0.767 0.4059 0.7161 0.4578 0.6216 0.412 0.428 0.7218 0.6656 0.2483 0.3047 0.0785 0.3225 0.6851 0.2586 0.1773 0.5758 0.7319 0.5574 0. 
0.9162 0.3213 0.503 0.7937] 2022-08-25 02:34:34 [INFO] [EVAL] Class Recall: [0.8293 0.9021 0.9629 0.8554 0.8429 0.8498 0.8539 0.9035 0.7112 0.7749 0.6141 0.7073 0.8503 0.4214 0.2671 0.5493 0.6516 0.5498 0.6941 0.517 0.8689 0.6249 0.7169 0.7129 0.488 0.5518 0.5274 0.5131 0.4589 0.5156 0.2402 0.6196 0.3703 0.5052 0.5424 0.6107 0.5041 0.6236 0.3624 0.3495 0.1875 0.0822 0.4265 0.277 0.4849 0.4382 0.3555 0.5579 0.915 0.7252 0.655 0.569 0.3219 0.2501 0.8064 0.6265 0.9429 0.3218 0.5539 0.256 0.1057 0.2263 0.5061 0.1709 0.6596 0.7621 0.3983 0.5245 0.124 0.4331 0.4765 0.6335 0.4891 0.5683 0.5428 0.4337 0.4769 0.2289 0.0946 0.3575 0.8234 0.3727 0.2698 0.0272 0.6934 0.6259 0.1071 0.0664 0.4386 0.6156 0.6077 0.0214 0.3657 0.1098 0.0056 0.0155 0.183 0.1861 0.2442 0.4464 0.1253 0.0368 0.2029 0.7679 0.0007 0.6417 0.2738 0.5296 0.149 0.5589 0.0739 0.4161 0.1001 0.6796 0.9941 0.0056 0.5441 0.9515 0.0813 0.314 0.5412 0.0114 0.2423 0.1351 0.2848 0.2838 0.5033 0.4939 0.6463 0.4203 0.7949 0.0286 0.1757 0.2945 0.0948 0.1621 0.0645 0.0236 0.1539 0.3968 0.1922 0.0643 0.3528 0.6419 0.4459 0. 0.3068 0.0152 0.0939 0.0698] 2022-08-25 02:34:34 [INFO] [EVAL] The model with the best validation mIoU (0.3238) was saved at iter 142000. 2022-08-25 02:34:42 [INFO] [TRAIN] epoch: 113, iter: 142050/160000, loss: 0.7129, lr: 0.000136, batch_cost: 0.1478, reader_cost: 0.00199, ips: 54.1341 samples/sec | ETA 00:44:12 2022-08-25 02:34:49 [INFO] [TRAIN] epoch: 113, iter: 142100/160000, loss: 0.6424, lr: 0.000136, batch_cost: 0.1439, reader_cost: 0.00140, ips: 55.5765 samples/sec | ETA 00:42:56 2022-08-25 02:34:57 [INFO] [TRAIN] epoch: 113, iter: 142150/160000, loss: 0.6834, lr: 0.000135, batch_cost: 0.1649, reader_cost: 0.00030, ips: 48.5160 samples/sec | ETA 00:49:03 2022-08-25 02:35:06 [INFO] [TRAIN] epoch: 113, iter: 142200/160000, loss: 0.6747, lr: 0.000135, batch_cost: 0.1788, reader_cost: 0.00104, ips: 44.7411 samples/sec | ETA 00:53:02 2022-08-25 02:35:16 [INFO] [TRAIN] epoch: 113, iter: 142250/160000, loss: 0.6585, lr: 0.000134, batch_cost: 0.1922, reader_cost: 0.00074, ips: 41.6257 samples/sec | ETA 00:56:51 2022-08-25 02:35:26 [INFO] [TRAIN] epoch: 113, iter: 142300/160000, loss: 0.6827, lr: 0.000134, batch_cost: 0.2038, reader_cost: 0.00093, ips: 39.2544 samples/sec | ETA 01:00:07 2022-08-25 02:35:36 [INFO] [TRAIN] epoch: 113, iter: 142350/160000, loss: 0.6927, lr: 0.000134, batch_cost: 0.1967, reader_cost: 0.00071, ips: 40.6662 samples/sec | ETA 00:57:52 2022-08-25 02:35:47 [INFO] [TRAIN] epoch: 113, iter: 142400/160000, loss: 0.6686, lr: 0.000133, batch_cost: 0.2257, reader_cost: 0.00034, ips: 35.4395 samples/sec | ETA 01:06:12 2022-08-25 02:35:59 [INFO] [TRAIN] epoch: 113, iter: 142450/160000, loss: 0.6658, lr: 0.000133, batch_cost: 0.2297, reader_cost: 0.00557, ips: 34.8236 samples/sec | ETA 01:07:11 2022-08-25 02:36:10 [INFO] [TRAIN] epoch: 113, iter: 142500/160000, loss: 0.7062, lr: 0.000132, batch_cost: 0.2293, reader_cost: 0.00100, ips: 34.8964 samples/sec | ETA 01:06:51 2022-08-25 02:36:22 [INFO] [TRAIN] epoch: 113, iter: 142550/160000, loss: 0.6874, lr: 0.000132, batch_cost: 0.2364, reader_cost: 0.01222, ips: 33.8359 samples/sec | ETA 01:08:45 2022-08-25 02:36:35 [INFO] [TRAIN] epoch: 113, iter: 142600/160000, loss: 0.6414, lr: 0.000132, batch_cost: 0.2549, reader_cost: 0.00076, ips: 31.3821 samples/sec | ETA 01:13:55 2022-08-25 02:36:47 [INFO] [TRAIN] epoch: 113, iter: 142650/160000, loss: 0.6692, lr: 0.000131, batch_cost: 0.2435, reader_cost: 0.01513, ips: 32.8556 samples/sec | ETA 01:10:24 2022-08-25 
02:36:59 [INFO] [TRAIN] epoch: 113, iter: 142700/160000, loss: 0.6873, lr: 0.000131, batch_cost: 0.2374, reader_cost: 0.00107, ips: 33.7046 samples/sec | ETA 01:08:26 2022-08-25 02:37:13 [INFO] [TRAIN] epoch: 114, iter: 142750/160000, loss: 0.6271, lr: 0.000131, batch_cost: 0.2949, reader_cost: 0.03881, ips: 27.1289 samples/sec | ETA 01:24:46 2022-08-25 02:37:25 [INFO] [TRAIN] epoch: 114, iter: 142800/160000, loss: 0.7061, lr: 0.000130, batch_cost: 0.2352, reader_cost: 0.00727, ips: 34.0187 samples/sec | ETA 01:07:24 2022-08-25 02:37:38 [INFO] [TRAIN] epoch: 114, iter: 142850/160000, loss: 0.6467, lr: 0.000130, batch_cost: 0.2486, reader_cost: 0.02666, ips: 32.1748 samples/sec | ETA 01:11:04 2022-08-25 02:37:49 [INFO] [TRAIN] epoch: 114, iter: 142900/160000, loss: 0.6487, lr: 0.000129, batch_cost: 0.2286, reader_cost: 0.00035, ips: 35.0022 samples/sec | ETA 01:05:08 2022-08-25 02:38:02 [INFO] [TRAIN] epoch: 114, iter: 142950/160000, loss: 0.6757, lr: 0.000129, batch_cost: 0.2653, reader_cost: 0.01029, ips: 30.1523 samples/sec | ETA 01:15:23 2022-08-25 02:38:14 [INFO] [TRAIN] epoch: 114, iter: 143000/160000, loss: 0.6586, lr: 0.000129, batch_cost: 0.2393, reader_cost: 0.02339, ips: 33.4300 samples/sec | ETA 01:07:48 2022-08-25 02:38:14 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 180s - batch_cost: 0.1796 - reader cost: 9.4758e-04 2022-08-25 02:41:14 [INFO] [EVAL] #Images: 2000 mIoU: 0.3199 Acc: 0.7489 Kappa: 0.7296 Dice: 0.4479 2022-08-25 02:41:14 [INFO] [EVAL] Class IoU: [0.6581 0.7741 0.9234 0.7109 0.6696 0.7429 0.7617 0.755 0.4825 0.6145 0.468 0.5281 0.6635 0.2849 0.2284 0.3774 0.4805 0.4131 0.545 0.3622 0.7194 0.4327 0.5573 0.4608 0.3174 0.3533 0.3615 0.4055 0.3709 0.3292 0.2205 0.4023 0.2779 0.2863 0.3211 0.4036 0.3705 0.5366 0.2666 0.2719 0.086 0.0809 0.2981 0.2292 0.2967 0.2577 0.1675 0.4268 0.7098 0.5031 0.4689 0.2899 0.2229 0.2119 0.6066 0.4479 0.826 0.3256 0.4726 0.1908 0.036 0.1997 0.2926 0.1602 0.3733 0.6618 0.2377 0.3668 0.0964 0.3367 0.3666 0.4255 0.3589 0.215 0.4023 0.304 0.3206 0.2128 0.1532 0.2379 0.6322 0.2826 0.2721 0.0279 0.4411 0.4882 0.0788 0.0644 0.31 0.4263 0.3787 0.0048 0.1934 0.0758 0.0007 0.013 0.109 0.1327 0.2042 0.3355 0.114 0.0338 0.2171 0.6055 0.0028 0.5201 0.197 0.4774 0.0795 0.3777 0.0626 0.188 0.0928 0.5152 0.7342 0.0049 0.3753 0.5744 0.0593 0.1753 0.438 0.0113 0.2279 0.1404 0.2599 0.2529 0.4412 0.3054 0.4396 0.2631 0.5319 0.025 0.1349 0.2506 0.1014 0.1063 0.0549 0.0167 0.1096 0.3417 0.0948 0.0453 0.2996 0.5066 0.2849 0. 
0.299 0.0191 0.0702 0.0493] 2022-08-25 02:41:14 [INFO] [EVAL] Class Precision: [0.7661 0.8356 0.9609 0.8219 0.7648 0.8667 0.8739 0.8146 0.5958 0.7471 0.6681 0.6672 0.7573 0.4701 0.4542 0.5456 0.6339 0.6418 0.7298 0.5777 0.8089 0.5979 0.7266 0.6022 0.509 0.5765 0.5177 0.6611 0.6358 0.4724 0.4189 0.5147 0.4695 0.4128 0.4992 0.5452 0.5979 0.7417 0.4943 0.5025 0.1561 0.2127 0.4968 0.5392 0.4464 0.436 0.2903 0.6622 0.7683 0.6347 0.6014 0.3603 0.4095 0.5396 0.6951 0.5865 0.86 0.6227 0.681 0.4115 0.0789 0.4197 0.3818 0.5966 0.4557 0.7903 0.379 0.5551 0.2582 0.6419 0.5935 0.5948 0.6314 0.2725 0.5849 0.4891 0.565 0.5939 0.6925 0.6178 0.7219 0.5443 0.7399 0.0814 0.7035 0.6945 0.4429 0.4064 0.5166 0.5954 0.4834 0.007 0.3513 0.3258 0.0051 0.0599 0.6812 0.4438 0.4269 0.5289 0.6213 0.0565 0.7629 0.6922 0.0347 0.7521 0.6027 0.853 0.2233 0.556 0.2635 0.2553 0.3577 0.6918 0.7388 0.1519 0.7123 0.5973 0.1362 0.6982 0.7476 0.2277 0.7206 0.6835 0.597 0.6469 0.7873 0.4307 0.5421 0.534 0.6192 0.584 0.412 0.7329 0.6403 0.2631 0.3146 0.07 0.373 0.6738 0.2589 0.1316 0.5067 0.6621 0.4397 0. 0.8947 0.3359 0.4388 0.8001] 2022-08-25 02:41:14 [INFO] [EVAL] Class Recall: [0.8236 0.9133 0.9594 0.8403 0.8432 0.8387 0.8558 0.9117 0.7171 0.7759 0.6098 0.7169 0.8427 0.4197 0.3148 0.5503 0.6651 0.5369 0.6827 0.4927 0.8667 0.6102 0.7051 0.6625 0.4574 0.4771 0.545 0.5119 0.4709 0.5207 0.3177 0.6482 0.4051 0.4831 0.4738 0.6085 0.4935 0.66 0.3667 0.3721 0.1608 0.1154 0.427 0.285 0.4695 0.3866 0.2838 0.5455 0.9032 0.7083 0.6805 0.5976 0.3286 0.2586 0.8265 0.6547 0.9543 0.4056 0.6071 0.2625 0.0619 0.2758 0.5562 0.1796 0.6737 0.8028 0.3893 0.5195 0.1334 0.4146 0.4895 0.5992 0.454 0.5049 0.563 0.4456 0.4256 0.249 0.1644 0.279 0.8359 0.3702 0.3009 0.0407 0.5418 0.6217 0.0875 0.0711 0.4367 0.6002 0.636 0.0151 0.3009 0.0899 0.0008 0.0164 0.1149 0.1592 0.2812 0.4786 0.1225 0.0776 0.2328 0.8286 0.003 0.6278 0.2264 0.5202 0.1099 0.5409 0.0758 0.4161 0.1114 0.6687 0.9915 0.005 0.4424 0.9374 0.095 0.1897 0.514 0.0118 0.25 0.1502 0.3152 0.2935 0.5009 0.5121 0.6991 0.3416 0.7905 0.0254 0.1671 0.2758 0.1075 0.1514 0.0624 0.0214 0.1344 0.4095 0.1302 0.0645 0.4228 0.6832 0.4472 0. 0.3099 0.0198 0.0771 0.05 ] 2022-08-25 02:41:14 [INFO] [EVAL] The model with the best validation mIoU (0.3238) was saved at iter 142000. 
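Note: the bookkeeping fields on each [TRAIN] line are related by simple arithmetic: ips is samples processed per second (global batch size divided by batch_cost), and ETA is the remaining iterations times the current batch_cost. A quick sanity check against the "iter: 143050/160000" line that follows; the batch size of 8 is inferred from the numbers, not stated in this excerpt:

```python
# Figures copied from the iter-143050 [TRAIN] line below.
batch_cost = 0.1893          # seconds per iteration
ips        = 42.2573         # samples per second
total_iters, cur_iter = 160_000, 143_050

# Global batch size is not printed here; it falls out of ips * batch_cost.
batch_size = ips * batch_cost                         # ~= 8.0 samples/iter (inferred)

# ETA = remaining iterations at the current per-iteration cost.
eta_seconds = (total_iters - cur_iter) * batch_cost   # ~= 3208.6 s
h, rem = divmod(int(eta_seconds), 3600)
m, s = divmod(rem, 60)
print(f"{h:02d}:{m:02d}:{s:02d}")                     # -> 00:53:28, matching the logged ETA
```

The evaluation progress lines follow the same arithmetic: 1000 eval iterations at ~0.18 s/iter gives the ~180 s wall time reported for the iter-143000 evaluation.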
2022-08-25 02:41:24 [INFO] [TRAIN] epoch: 114, iter: 143050/160000, loss: 0.6783, lr: 0.000128, batch_cost: 0.1893, reader_cost: 0.00438, ips: 42.2573 samples/sec | ETA 00:53:28 2022-08-25 02:41:34 [INFO] [TRAIN] epoch: 114, iter: 143100/160000, loss: 0.6804, lr: 0.000128, batch_cost: 0.2021, reader_cost: 0.00108, ips: 39.5820 samples/sec | ETA 00:56:55 2022-08-25 02:41:44 [INFO] [TRAIN] epoch: 114, iter: 143150/160000, loss: 0.6800, lr: 0.000128, batch_cost: 0.2054, reader_cost: 0.00043, ips: 38.9465 samples/sec | ETA 00:57:41 2022-08-25 02:41:54 [INFO] [TRAIN] epoch: 114, iter: 143200/160000, loss: 0.6897, lr: 0.000127, batch_cost: 0.2039, reader_cost: 0.00832, ips: 39.2312 samples/sec | ETA 00:57:05 2022-08-25 02:42:07 [INFO] [TRAIN] epoch: 114, iter: 143250/160000, loss: 0.6629, lr: 0.000127, batch_cost: 0.2473, reader_cost: 0.00700, ips: 32.3507 samples/sec | ETA 01:09:02 2022-08-25 02:42:19 [INFO] [TRAIN] epoch: 114, iter: 143300/160000, loss: 0.6598, lr: 0.000126, batch_cost: 0.2523, reader_cost: 0.00057, ips: 31.7068 samples/sec | ETA 01:10:13 2022-08-25 02:42:34 [INFO] [TRAIN] epoch: 114, iter: 143350/160000, loss: 0.6720, lr: 0.000126, batch_cost: 0.2883, reader_cost: 0.00040, ips: 27.7533 samples/sec | ETA 01:19:59 2022-08-25 02:42:47 [INFO] [TRAIN] epoch: 114, iter: 143400/160000, loss: 0.6713, lr: 0.000126, batch_cost: 0.2584, reader_cost: 0.00089, ips: 30.9643 samples/sec | ETA 01:11:28 2022-08-25 02:42:59 [INFO] [TRAIN] epoch: 114, iter: 143450/160000, loss: 0.6363, lr: 0.000125, batch_cost: 0.2456, reader_cost: 0.00824, ips: 32.5725 samples/sec | ETA 01:07:44 2022-08-25 02:43:12 [INFO] [TRAIN] epoch: 114, iter: 143500/160000, loss: 0.6566, lr: 0.000125, batch_cost: 0.2535, reader_cost: 0.01560, ips: 31.5585 samples/sec | ETA 01:09:42 2022-08-25 02:43:24 [INFO] [TRAIN] epoch: 114, iter: 143550/160000, loss: 0.6688, lr: 0.000125, batch_cost: 0.2418, reader_cost: 0.01278, ips: 33.0789 samples/sec | ETA 01:06:18 2022-08-25 02:43:36 [INFO] [TRAIN] epoch: 114, iter: 143600/160000, loss: 0.6849, lr: 0.000124, batch_cost: 0.2386, reader_cost: 0.00079, ips: 33.5269 samples/sec | ETA 01:05:13 2022-08-25 02:43:47 [INFO] [TRAIN] epoch: 114, iter: 143650/160000, loss: 0.7292, lr: 0.000124, batch_cost: 0.2342, reader_cost: 0.00168, ips: 34.1638 samples/sec | ETA 01:03:48 2022-08-25 02:44:00 [INFO] [TRAIN] epoch: 114, iter: 143700/160000, loss: 0.6619, lr: 0.000123, batch_cost: 0.2518, reader_cost: 0.02073, ips: 31.7709 samples/sec | ETA 01:08:24 2022-08-25 02:44:13 [INFO] [TRAIN] epoch: 114, iter: 143750/160000, loss: 0.6329, lr: 0.000123, batch_cost: 0.2632, reader_cost: 0.00103, ips: 30.3990 samples/sec | ETA 01:11:16 2022-08-25 02:44:24 [INFO] [TRAIN] epoch: 114, iter: 143800/160000, loss: 0.6896, lr: 0.000123, batch_cost: 0.2146, reader_cost: 0.00087, ips: 37.2854 samples/sec | ETA 00:57:55 2022-08-25 02:44:36 [INFO] [TRAIN] epoch: 114, iter: 143850/160000, loss: 0.5888, lr: 0.000122, batch_cost: 0.2376, reader_cost: 0.00755, ips: 33.6652 samples/sec | ETA 01:03:57 2022-08-25 02:44:47 [INFO] [TRAIN] epoch: 114, iter: 143900/160000, loss: 0.7402, lr: 0.000122, batch_cost: 0.2325, reader_cost: 0.00468, ips: 34.4149 samples/sec | ETA 01:02:22 2022-08-25 02:44:59 [INFO] [TRAIN] epoch: 114, iter: 143950/160000, loss: 0.6967, lr: 0.000122, batch_cost: 0.2410, reader_cost: 0.00123, ips: 33.1960 samples/sec | ETA 01:04:27 2022-08-25 02:45:13 [INFO] [TRAIN] epoch: 115, iter: 144000/160000, loss: 0.6862, lr: 0.000121, batch_cost: 0.2747, reader_cost: 0.03800, ips: 29.1222 samples/sec | ETA 
01:13:15 2022-08-25 02:45:13 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 155s - batch_cost: 0.1551 - reader cost: 8.2469e-04 2022-08-25 02:47:48 [INFO] [EVAL] #Images: 2000 mIoU: 0.3199 Acc: 0.7493 Kappa: 0.7300 Dice: 0.4483 2022-08-25 02:47:48 [INFO] [EVAL] Class IoU: [0.6596 0.7725 0.9251 0.7108 0.6664 0.7479 0.7675 0.764 0.4831 0.6335 0.4632 0.5278 0.6551 0.296 0.2228 0.3805 0.4783 0.4247 0.545 0.3707 0.7058 0.3967 0.5603 0.4627 0.3128 0.3583 0.3576 0.4073 0.367 0.3388 0.2104 0.3857 0.2736 0.2854 0.3312 0.4112 0.3671 0.4793 0.2575 0.2744 0.084 0.0849 0.3048 0.2272 0.2814 0.2677 0.2248 0.4252 0.7116 0.5132 0.4685 0.2369 0.2253 0.209 0.6406 0.4458 0.823 0.3331 0.451 0.1861 0.0416 0.1778 0.3024 0.1059 0.39 0.6702 0.2395 0.3669 0.0835 0.3444 0.3641 0.4241 0.3671 0.2283 0.4179 0.2951 0.3468 0.2203 0.075 0.2587 0.6005 0.2793 0.2324 0.0228 0.4734 0.4849 0.0914 0.0639 0.3161 0.4378 0.3893 0.0059 0.2168 0.0807 0.0051 0.0099 0.137 0.1394 0.1846 0.3614 0.1662 0.0271 0.2102 0.5256 0. 0.5385 0.181 0.4439 0.0993 0.3858 0.0644 0.1959 0.0885 0.5568 0.7239 0.0051 0.3209 0.5583 0.0536 0.2607 0.4626 0.0112 0.2167 0.1224 0.252 0.2493 0.4086 0.278 0.4664 0.267 0.5217 0.0221 0.1617 0.2778 0.0949 0.1038 0.0628 0.0175 0.1225 0.3261 0.0764 0.0886 0.2818 0.4814 0.2929 0. 0.2981 0.018 0.0854 0.075 ] 2022-08-25 02:47:48 [INFO] [EVAL] Class Precision: [0.7676 0.8362 0.9591 0.8152 0.7647 0.8565 0.872 0.8362 0.6087 0.7722 0.6829 0.6784 0.7421 0.4873 0.4783 0.5349 0.6135 0.6642 0.7113 0.5554 0.787 0.5719 0.7332 0.6012 0.5083 0.5647 0.5146 0.6493 0.6332 0.5243 0.431 0.5158 0.4818 0.3937 0.5022 0.5583 0.6136 0.7293 0.4949 0.5022 0.1471 0.1933 0.522 0.5139 0.4122 0.4407 0.3929 0.6475 0.7877 0.6673 0.5979 0.2833 0.4147 0.6271 0.7368 0.5932 0.8537 0.6171 0.6649 0.3641 0.0685 0.3296 0.4202 0.6538 0.5006 0.8147 0.4027 0.5483 0.3121 0.6502 0.616 0.5808 0.6309 0.2785 0.6113 0.4473 0.4951 0.6482 0.6224 0.6179 0.6787 0.5519 0.7671 0.0587 0.7119 0.6965 0.3351 0.413 0.4811 0.6076 0.5228 0.0078 0.3387 0.3175 0.0189 0.0346 0.6215 0.5268 0.4205 0.5153 0.5915 0.0471 0.6943 0.605 0.0004 0.7774 0.5719 0.9053 0.1959 0.5539 0.2693 0.2703 0.3528 0.6941 0.7289 0.0694 0.7139 0.5746 0.1404 0.7395 0.7443 0.3375 0.6399 0.6832 0.5967 0.6487 0.8125 0.4042 0.5783 0.473 0.6417 0.5901 0.5016 0.7247 0.6302 0.2739 0.2849 0.0961 0.3811 0.7116 0.1983 0.1434 0.5294 0.7306 0.5009 0. 0.895 0.3238 0.3501 0.7265] 2022-08-25 02:47:49 [INFO] [EVAL] Class Recall: [0.8243 0.9102 0.963 0.8474 0.8382 0.855 0.8649 0.8986 0.7006 0.7791 0.5902 0.7039 0.8482 0.4298 0.2944 0.5686 0.6847 0.5409 0.6997 0.527 0.8725 0.5642 0.7038 0.6677 0.4485 0.495 0.5395 0.5222 0.466 0.4891 0.2913 0.6046 0.3877 0.5092 0.4932 0.6096 0.4774 0.5831 0.3493 0.3769 0.1639 0.1314 0.4228 0.2894 0.4699 0.4054 0.3445 0.5534 0.8804 0.6898 0.6841 0.591 0.3304 0.2387 0.8307 0.6421 0.9582 0.4199 0.5836 0.2757 0.0959 0.2786 0.5189 0.1122 0.6385 0.7907 0.3714 0.5259 0.1023 0.4228 0.471 0.6112 0.4674 0.5587 0.5692 0.4645 0.5364 0.2502 0.0785 0.308 0.8391 0.3611 0.25 0.036 0.5856 0.6148 0.1117 0.0703 0.4797 0.6103 0.604 0.0236 0.3759 0.0976 0.0069 0.0136 0.1495 0.1593 0.2476 0.5476 0.1878 0.0597 0.2317 0.8003 0. 0.6367 0.2094 0.4656 0.1676 0.5597 0.078 0.4161 0.1056 0.7379 0.9906 0.0054 0.3683 0.9518 0.0798 0.287 0.55 0.0114 0.2468 0.1298 0.3037 0.2883 0.4511 0.471 0.7068 0.38 0.7362 0.0224 0.1926 0.3106 0.1005 0.1432 0.0745 0.0209 0.153 0.3758 0.1106 0.1881 0.3761 0.5853 0.4137 0. 
0.3089 0.0187 0.1015 0.0772] 2022-08-25 02:47:49 [INFO] [EVAL] The model with the best validation mIoU (0.3238) was saved at iter 142000. 2022-08-25 02:47:59 [INFO] [TRAIN] epoch: 115, iter: 144050/160000, loss: 0.7069, lr: 0.000121, batch_cost: 0.2005, reader_cost: 0.00415, ips: 39.9064 samples/sec | ETA 00:53:17 2022-08-25 02:48:08 [INFO] [TRAIN] epoch: 115, iter: 144100/160000, loss: 0.7063, lr: 0.000120, batch_cost: 0.1915, reader_cost: 0.00081, ips: 41.7799 samples/sec | ETA 00:50:44 2022-08-25 02:48:18 [INFO] [TRAIN] epoch: 115, iter: 144150/160000, loss: 0.6908, lr: 0.000120, batch_cost: 0.1944, reader_cost: 0.00097, ips: 41.1599 samples/sec | ETA 00:51:20 2022-08-25 02:48:27 [INFO] [TRAIN] epoch: 115, iter: 144200/160000, loss: 0.6921, lr: 0.000120, batch_cost: 0.1825, reader_cost: 0.00967, ips: 43.8258 samples/sec | ETA 00:48:04 2022-08-25 02:48:38 [INFO] [TRAIN] epoch: 115, iter: 144250/160000, loss: 0.6810, lr: 0.000119, batch_cost: 0.2148, reader_cost: 0.00057, ips: 37.2508 samples/sec | ETA 00:56:22 2022-08-25 02:48:49 [INFO] [TRAIN] epoch: 115, iter: 144300/160000, loss: 0.6396, lr: 0.000119, batch_cost: 0.2267, reader_cost: 0.00085, ips: 35.2836 samples/sec | ETA 00:59:19 2022-08-25 02:49:01 [INFO] [TRAIN] epoch: 115, iter: 144350/160000, loss: 0.7148, lr: 0.000118, batch_cost: 0.2318, reader_cost: 0.00582, ips: 34.5085 samples/sec | ETA 01:00:28 2022-08-25 02:49:13 [INFO] [TRAIN] epoch: 115, iter: 144400/160000, loss: 0.6354, lr: 0.000118, batch_cost: 0.2378, reader_cost: 0.00863, ips: 33.6445 samples/sec | ETA 01:01:49 2022-08-25 02:49:26 [INFO] [TRAIN] epoch: 115, iter: 144450/160000, loss: 0.7360, lr: 0.000118, batch_cost: 0.2590, reader_cost: 0.00337, ips: 30.8873 samples/sec | ETA 01:07:07 2022-08-25 02:49:38 [INFO] [TRAIN] epoch: 115, iter: 144500/160000, loss: 0.7335, lr: 0.000117, batch_cost: 0.2425, reader_cost: 0.01239, ips: 32.9950 samples/sec | ETA 01:02:38 2022-08-25 02:49:49 [INFO] [TRAIN] epoch: 115, iter: 144550/160000, loss: 0.6600, lr: 0.000117, batch_cost: 0.2324, reader_cost: 0.01459, ips: 34.4250 samples/sec | ETA 00:59:50 2022-08-25 02:50:02 [INFO] [TRAIN] epoch: 115, iter: 144600/160000, loss: 0.6629, lr: 0.000117, batch_cost: 0.2563, reader_cost: 0.00334, ips: 31.2120 samples/sec | ETA 01:05:47 2022-08-25 02:50:15 [INFO] [TRAIN] epoch: 115, iter: 144650/160000, loss: 0.6872, lr: 0.000116, batch_cost: 0.2601, reader_cost: 0.00350, ips: 30.7601 samples/sec | ETA 01:06:32 2022-08-25 02:50:25 [INFO] [TRAIN] epoch: 115, iter: 144700/160000, loss: 0.6386, lr: 0.000116, batch_cost: 0.2055, reader_cost: 0.00309, ips: 38.9376 samples/sec | ETA 00:52:23 2022-08-25 02:50:37 [INFO] [TRAIN] epoch: 115, iter: 144750/160000, loss: 0.6696, lr: 0.000115, batch_cost: 0.2326, reader_cost: 0.00798, ips: 34.3910 samples/sec | ETA 00:59:07 2022-08-25 02:50:51 [INFO] [TRAIN] epoch: 115, iter: 144800/160000, loss: 0.7147, lr: 0.000115, batch_cost: 0.2727, reader_cost: 0.00265, ips: 29.3412 samples/sec | ETA 01:09:04 2022-08-25 02:51:03 [INFO] [TRAIN] epoch: 115, iter: 144850/160000, loss: 0.6568, lr: 0.000115, batch_cost: 0.2438, reader_cost: 0.00037, ips: 32.8199 samples/sec | ETA 01:01:32 2022-08-25 02:51:14 [INFO] [TRAIN] epoch: 115, iter: 144900/160000, loss: 0.6757, lr: 0.000114, batch_cost: 0.2281, reader_cost: 0.00639, ips: 35.0758 samples/sec | ETA 00:57:23 2022-08-25 02:51:27 [INFO] [TRAIN] epoch: 115, iter: 144950/160000, loss: 0.7005, lr: 0.000114, batch_cost: 0.2513, reader_cost: 0.00054, ips: 31.8352 samples/sec | ETA 01:03:01 2022-08-25 02:51:37 [INFO] [TRAIN] 
epoch: 115, iter: 145000/160000, loss: 0.6855, lr: 0.000114, batch_cost: 0.2016, reader_cost: 0.00080, ips: 39.6867 samples/sec | ETA 00:50:23 2022-08-25 02:51:37 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1590 - reader cost: 0.0014 2022-08-25 02:54:16 [INFO] [EVAL] #Images: 2000 mIoU: 0.3193 Acc: 0.7489 Kappa: 0.7295 Dice: 0.4473 2022-08-25 02:54:16 [INFO] [EVAL] Class IoU: [0.6569 0.7712 0.9248 0.7077 0.6667 0.7475 0.762 0.7619 0.4911 0.6209 0.4624 0.5295 0.6615 0.3024 0.2156 0.3882 0.4866 0.4044 0.537 0.369 0.7246 0.3972 0.562 0.4631 0.3227 0.3629 0.3578 0.4004 0.3725 0.3339 0.2191 0.3945 0.2777 0.2899 0.348 0.3987 0.3714 0.4963 0.2517 0.2493 0.0812 0.067 0.3033 0.2252 0.2808 0.2505 0.2588 0.4102 0.6874 0.5007 0.494 0.2436 0.2194 0.1926 0.6043 0.4392 0.8475 0.2924 0.4911 0.1658 0.054 0.182 0.307 0.101 0.3848 0.6392 0.2164 0.3558 0.107 0.3398 0.3834 0.43 0.3611 0.2262 0.4191 0.2997 0.3178 0.2208 0.1953 0.3221 0.6272 0.2754 0.2457 0.0162 0.5232 0.4765 0.0921 0.0727 0.3367 0.4317 0.3945 0.0043 0.1818 0.0688 0.0015 0.008 0.1805 0.136 0.1877 0.3602 0.1593 0.0095 0.1998 0.5419 0.0005 0.4978 0.2255 0.4709 0.0997 0.3411 0.0691 0.1876 0.0815 0.5226 0.7125 0.0013 0.4006 0.5668 0.0471 0.3237 0.4371 0.0153 0.2042 0.1163 0.2464 0.2187 0.4096 0.2829 0.4214 0.2608 0.5361 0.0256 0.1624 0.256 0.0879 0.0924 0.0571 0.0193 0.1095 0.3384 0.0661 0.0272 0.2549 0.4491 0.2949 0. 0.2948 0.0126 0.0713 0.0626] 2022-08-25 02:54:16 [INFO] [EVAL] Class Precision: [0.76 0.8377 0.9624 0.8086 0.7592 0.8506 0.8839 0.8334 0.6223 0.7433 0.686 0.668 0.7572 0.4728 0.4884 0.5548 0.6453 0.6607 0.7396 0.5654 0.8186 0.5734 0.7278 0.5869 0.4877 0.6012 0.5205 0.6514 0.6215 0.4862 0.4157 0.5235 0.4562 0.4033 0.5144 0.5049 0.6135 0.7884 0.5349 0.5394 0.1375 0.2071 0.5274 0.5389 0.4065 0.417 0.436 0.6742 0.7417 0.6236 0.646 0.2879 0.388 0.727 0.7076 0.595 0.8936 0.638 0.6438 0.329 0.0874 0.3939 0.4427 0.6482 0.4811 0.7416 0.3714 0.5728 0.274 0.5889 0.5686 0.5675 0.6165 0.3102 0.6248 0.4678 0.573 0.5855 0.7425 0.5854 0.7193 0.6 0.7609 0.0459 0.7002 0.7028 0.4693 0.4058 0.5125 0.6023 0.5384 0.0059 0.3054 0.2925 0.0095 0.0322 0.6157 0.6318 0.3879 0.5357 0.6018 0.0223 0.7128 0.6905 0.0093 0.6695 0.5669 0.8387 0.239 0.5613 0.3022 0.2547 0.3669 0.7289 0.7184 0.0319 0.709 0.5867 0.1634 0.7378 0.8056 0.3341 0.6169 0.6565 0.5931 0.6349 0.8014 0.3926 0.5818 0.4479 0.6502 0.6806 0.4833 0.7227 0.6431 0.2665 0.2865 0.1354 0.3332 0.6984 0.1345 0.0952 0.5389 0.6303 0.4809 0. 
0.9084 0.2602 0.3998 0.7403] 2022-08-25 02:54:16 [INFO] [EVAL] Class Recall: [0.8289 0.9067 0.9594 0.8502 0.8455 0.8604 0.8467 0.8988 0.6998 0.7904 0.5866 0.7186 0.8397 0.4563 0.2784 0.564 0.6644 0.5103 0.6622 0.5152 0.8632 0.5639 0.7115 0.687 0.4882 0.478 0.5338 0.5096 0.4818 0.5159 0.3166 0.6154 0.4151 0.5078 0.5183 0.6545 0.4848 0.5727 0.3222 0.3167 0.1654 0.0901 0.4165 0.2789 0.4758 0.3855 0.3891 0.5116 0.9038 0.7176 0.6773 0.6128 0.3354 0.2076 0.8054 0.6265 0.9427 0.3506 0.6743 0.2505 0.1238 0.2528 0.5004 0.1068 0.6578 0.8225 0.3415 0.4843 0.1493 0.4455 0.5407 0.6397 0.4657 0.4553 0.5601 0.4548 0.4163 0.2617 0.2095 0.4173 0.8304 0.3373 0.2662 0.0244 0.6742 0.5967 0.1028 0.0814 0.4954 0.6037 0.596 0.0155 0.31 0.0825 0.0018 0.0105 0.2035 0.1477 0.2667 0.5237 0.1781 0.0162 0.2173 0.7157 0.0005 0.6599 0.2724 0.5178 0.146 0.4652 0.0823 0.4161 0.0949 0.6487 0.9885 0.0013 0.4795 0.9434 0.062 0.3657 0.4886 0.0157 0.2339 0.1239 0.2965 0.2501 0.4559 0.5029 0.6044 0.3844 0.7534 0.0259 0.1965 0.2838 0.0924 0.124 0.0666 0.022 0.1403 0.3963 0.1149 0.0367 0.3261 0.6097 0.4325 0. 0.3038 0.0131 0.0799 0.064 ] 2022-08-25 02:54:16 [INFO] [EVAL] The model with the best validation mIoU (0.3238) was saved at iter 142000. 2022-08-25 02:54:26 [INFO] [TRAIN] epoch: 115, iter: 145050/160000, loss: 0.6646, lr: 0.000113, batch_cost: 0.1887, reader_cost: 0.00387, ips: 42.4046 samples/sec | ETA 00:47:00 2022-08-25 02:54:36 [INFO] [TRAIN] epoch: 115, iter: 145100/160000, loss: 0.6357, lr: 0.000113, batch_cost: 0.2007, reader_cost: 0.00102, ips: 39.8601 samples/sec | ETA 00:49:50 2022-08-25 02:54:45 [INFO] [TRAIN] epoch: 115, iter: 145150/160000, loss: 0.6722, lr: 0.000112, batch_cost: 0.1809, reader_cost: 0.00038, ips: 44.2218 samples/sec | ETA 00:44:46 2022-08-25 02:54:54 [INFO] [TRAIN] epoch: 115, iter: 145200/160000, loss: 0.6711, lr: 0.000112, batch_cost: 0.1744, reader_cost: 0.00583, ips: 45.8707 samples/sec | ETA 00:43:01 2022-08-25 02:55:06 [INFO] [TRAIN] epoch: 116, iter: 145250/160000, loss: 0.6419, lr: 0.000112, batch_cost: 0.2415, reader_cost: 0.04838, ips: 33.1268 samples/sec | ETA 00:59:22 2022-08-25 02:55:17 [INFO] [TRAIN] epoch: 116, iter: 145300/160000, loss: 0.6356, lr: 0.000111, batch_cost: 0.2339, reader_cost: 0.00102, ips: 34.1969 samples/sec | ETA 00:57:18 2022-08-25 02:55:30 [INFO] [TRAIN] epoch: 116, iter: 145350/160000, loss: 0.7166, lr: 0.000111, batch_cost: 0.2626, reader_cost: 0.00069, ips: 30.4683 samples/sec | ETA 01:04:06 2022-08-25 02:55:43 [INFO] [TRAIN] epoch: 116, iter: 145400/160000, loss: 0.7014, lr: 0.000111, batch_cost: 0.2460, reader_cost: 0.00594, ips: 32.5174 samples/sec | ETA 00:59:51 2022-08-25 02:55:55 [INFO] [TRAIN] epoch: 116, iter: 145450/160000, loss: 0.6783, lr: 0.000110, batch_cost: 0.2515, reader_cost: 0.00603, ips: 31.8058 samples/sec | ETA 01:00:59 2022-08-25 02:56:07 [INFO] [TRAIN] epoch: 116, iter: 145500/160000, loss: 0.6468, lr: 0.000110, batch_cost: 0.2401, reader_cost: 0.00484, ips: 33.3136 samples/sec | ETA 00:58:02 2022-08-25 02:56:19 [INFO] [TRAIN] epoch: 116, iter: 145550/160000, loss: 0.7178, lr: 0.000109, batch_cost: 0.2422, reader_cost: 0.00581, ips: 33.0342 samples/sec | ETA 00:58:19 2022-08-25 02:56:32 [INFO] [TRAIN] epoch: 116, iter: 145600/160000, loss: 0.6469, lr: 0.000109, batch_cost: 0.2416, reader_cost: 0.00811, ips: 33.1165 samples/sec | ETA 00:57:58 2022-08-25 02:56:43 [INFO] [TRAIN] epoch: 116, iter: 145650/160000, loss: 0.6169, lr: 0.000109, batch_cost: 0.2307, reader_cost: 0.01554, ips: 34.6736 samples/sec | ETA 00:55:10 2022-08-25 
02:56:55 [INFO] [TRAIN] epoch: 116, iter: 145700/160000, loss: 0.6341, lr: 0.000108, batch_cost: 0.2304, reader_cost: 0.00228, ips: 34.7265 samples/sec | ETA 00:54:54 2022-08-25 02:57:07 [INFO] [TRAIN] epoch: 116, iter: 145750/160000, loss: 0.7118, lr: 0.000108, batch_cost: 0.2527, reader_cost: 0.00193, ips: 31.6597 samples/sec | ETA 01:00:00 2022-08-25 02:57:21 [INFO] [TRAIN] epoch: 116, iter: 145800/160000, loss: 0.6642, lr: 0.000108, batch_cost: 0.2689, reader_cost: 0.00035, ips: 29.7467 samples/sec | ETA 01:03:38 2022-08-25 02:57:34 [INFO] [TRAIN] epoch: 116, iter: 145850/160000, loss: 0.6854, lr: 0.000107, batch_cost: 0.2584, reader_cost: 0.00062, ips: 30.9619 samples/sec | ETA 01:00:56 2022-08-25 02:57:46 [INFO] [TRAIN] epoch: 116, iter: 145900/160000, loss: 0.6871, lr: 0.000107, batch_cost: 0.2389, reader_cost: 0.00641, ips: 33.4825 samples/sec | ETA 00:56:08 2022-08-25 02:57:59 [INFO] [TRAIN] epoch: 116, iter: 145950/160000, loss: 0.6617, lr: 0.000106, batch_cost: 0.2617, reader_cost: 0.01075, ips: 30.5741 samples/sec | ETA 01:01:16 2022-08-25 02:58:11 [INFO] [TRAIN] epoch: 116, iter: 146000/160000, loss: 0.6927, lr: 0.000106, batch_cost: 0.2447, reader_cost: 0.00086, ips: 32.6886 samples/sec | ETA 00:57:06 2022-08-25 02:58:11 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 171s - batch_cost: 0.1709 - reader cost: 8.9659e-04 2022-08-25 03:01:02 [INFO] [EVAL] #Images: 2000 mIoU: 0.3201 Acc: 0.7488 Kappa: 0.7294 Dice: 0.4485 2022-08-25 03:01:02 [INFO] [EVAL] Class IoU: [0.6576 0.7695 0.9249 0.7083 0.6702 0.7448 0.7667 0.761 0.4865 0.6212 0.4594 0.5281 0.6602 0.2958 0.214 0.3836 0.4865 0.4194 0.5364 0.3733 0.7262 0.4196 0.5561 0.4595 0.3242 0.3552 0.3634 0.401 0.3709 0.3445 0.2185 0.3914 0.2736 0.2816 0.3398 0.4026 0.3673 0.4915 0.25 0.2642 0.0773 0.0661 0.2948 0.2191 0.2831 0.2542 0.2225 0.4279 0.6981 0.4944 0.476 0.2556 0.2235 0.2018 0.6126 0.4455 0.8412 0.3155 0.4951 0.2127 0.0577 0.173 0.3069 0.1343 0.3932 0.6587 0.2289 0.3684 0.0811 0.3344 0.3801 0.4223 0.3653 0.2398 0.4132 0.2978 0.3347 0.2161 0.1881 0.2253 0.6493 0.2878 0.2537 0.0157 0.5126 0.4916 0.0805 0.0612 0.3284 0.401 0.4103 0.007 0.2102 0.0854 0.0022 0.0127 0.1735 0.132 0.2112 0.3612 0.1243 0.0259 0.2202 0.5245 0.001 0.5323 0.2189 0.435 0.097 0.3352 0.0824 0.1626 0.0863 0.4348 0.7501 0.0019 0.385 0.5757 0.0747 0.2756 0.4641 0.0159 0.2205 0.0865 0.2493 0.2279 0.4206 0.3033 0.4358 0.2628 0.5398 0.0251 0.1362 0.2504 0.1012 0.1003 0.0526 0.0187 0.118 0.31 0.074 0.0679 0.2483 0.4905 0.2986 0. 
0.2934 0.0092 0.091 0.0628] 2022-08-25 03:01:02 [INFO] [EVAL] Class Precision: [0.7619 0.8302 0.9623 0.8187 0.7717 0.8666 0.8785 0.8343 0.599 0.7413 0.6838 0.6763 0.7479 0.4952 0.4776 0.5471 0.6547 0.639 0.7382 0.5517 0.8252 0.5759 0.7269 0.6022 0.5051 0.531 0.5166 0.6317 0.6185 0.5101 0.4221 0.507 0.4743 0.4103 0.5136 0.5575 0.5959 0.7401 0.4628 0.5142 0.1429 0.1894 0.4738 0.5137 0.3747 0.4276 0.3715 0.6415 0.7662 0.6088 0.6105 0.3027 0.3762 0.611 0.7054 0.6048 0.8795 0.629 0.6497 0.4024 0.1061 0.3342 0.4553 0.6133 0.5046 0.8027 0.3541 0.578 0.2123 0.626 0.6315 0.5814 0.6298 0.2994 0.6363 0.5188 0.5277 0.5526 0.6858 0.5282 0.7515 0.5696 0.7576 0.0491 0.6724 0.6628 0.4002 0.4047 0.5627 0.5571 0.5651 0.0099 0.3764 0.301 0.0118 0.0421 0.5823 0.5036 0.4148 0.5368 0.5873 0.0478 0.7515 0.6705 0.0212 0.73 0.5293 0.9319 0.2477 0.55 0.27 0.2107 0.3506 0.7619 0.7562 0.0385 0.7027 0.6017 0.1753 0.7173 0.7621 0.2686 0.6113 0.738 0.6303 0.6275 0.7922 0.4475 0.5941 0.4263 0.6437 0.4719 0.4316 0.6943 0.6682 0.2847 0.3326 0.131 0.3721 0.7156 0.1422 0.1408 0.5574 0.7499 0.499 0. 0.8989 0.3445 0.4339 0.7284] 2022-08-25 03:01:02 [INFO] [EVAL] Class Recall: [0.8277 0.9133 0.9597 0.8401 0.8359 0.8413 0.8577 0.8966 0.7216 0.7931 0.5832 0.7068 0.8492 0.4235 0.2794 0.5622 0.6545 0.5497 0.6625 0.5359 0.8582 0.6073 0.703 0.6597 0.4752 0.5176 0.5507 0.5233 0.481 0.5149 0.3118 0.632 0.3927 0.4731 0.5011 0.5916 0.4892 0.594 0.3523 0.352 0.1441 0.0922 0.4383 0.2764 0.5365 0.3854 0.3569 0.5624 0.887 0.7246 0.6837 0.6216 0.3551 0.2316 0.8232 0.6285 0.9507 0.3877 0.6754 0.3109 0.1123 0.264 0.4848 0.1468 0.6403 0.786 0.3931 0.5039 0.116 0.4179 0.4885 0.6068 0.4652 0.5461 0.5409 0.4114 0.4778 0.2619 0.2058 0.2821 0.8269 0.3678 0.2761 0.0226 0.6832 0.6555 0.0915 0.0672 0.441 0.5886 0.5995 0.0237 0.3224 0.1065 0.0028 0.0178 0.1982 0.1518 0.3008 0.5248 0.1362 0.0535 0.2375 0.7067 0.0011 0.6628 0.2719 0.4493 0.1374 0.4618 0.106 0.4161 0.1028 0.5031 0.9893 0.002 0.4599 0.9302 0.1151 0.3092 0.5427 0.0166 0.2564 0.0893 0.292 0.2635 0.4728 0.4849 0.6205 0.4066 0.7698 0.0258 0.166 0.2815 0.1066 0.1342 0.0588 0.0214 0.1473 0.3535 0.1338 0.1158 0.3093 0.5865 0.4265 0. 0.3034 0.0094 0.1033 0.0643] 2022-08-25 03:01:02 [INFO] [EVAL] The model with the best validation mIoU (0.3238) was saved at iter 142000. 
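Note: the repeated "The model with the best validation mIoU (...) was saved at iter ..." line reflects standard best-checkpoint bookkeeping: after each periodic evaluation the new mIoU is compared with the best seen so far, and the weights are only re-saved when it improves, which is why the message keeps pointing at iter 142000 while the later evaluations score lower. A minimal sketch of that logic, offered as an assumption about the loop's behavior rather than the actual PaddleSeg code; the names below are placeholders, not PaddleSeg APIs:

```python
class BestCheckpointTracker:
    """Hypothetical sketch of best-mIoU bookkeeping (not the actual PaddleSeg code)."""

    def __init__(self):
        self.best_miou = -1.0
        self.best_iter = -1

    def update(self, cur_iter, miou, save_fn):
        # Only re-save when validation mIoU improves on the best seen so far.
        if miou > self.best_miou:
            self.best_miou, self.best_iter = miou, cur_iter
            save_fn(cur_iter)   # e.g. copy weights to <save_dir>/best_model
        print(f"[EVAL] The model with the best validation mIoU "
              f"({self.best_miou:.4f}) was saved at iter {self.best_iter}.")


tracker = BestCheckpointTracker()
tracker.update(142_000, 0.3238, save_fn=lambda it: None)  # new best -> saved
tracker.update(143_000, 0.3199, save_fn=lambda it: None)  # lower -> still reports iter 142000
```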
2022-08-25 03:01:11 [INFO] [TRAIN] epoch: 116, iter: 146050/160000, loss: 0.6758, lr: 0.000106, batch_cost: 0.1829, reader_cost: 0.00324, ips: 43.7329 samples/sec | ETA 00:42:31 2022-08-25 03:01:21 [INFO] [TRAIN] epoch: 116, iter: 146100/160000, loss: 0.7361, lr: 0.000105, batch_cost: 0.1974, reader_cost: 0.00081, ips: 40.5271 samples/sec | ETA 00:45:43 2022-08-25 03:01:31 [INFO] [TRAIN] epoch: 116, iter: 146150/160000, loss: 0.6489, lr: 0.000105, batch_cost: 0.2010, reader_cost: 0.00066, ips: 39.8102 samples/sec | ETA 00:46:23 2022-08-25 03:01:41 [INFO] [TRAIN] epoch: 116, iter: 146200/160000, loss: 0.6819, lr: 0.000104, batch_cost: 0.2038, reader_cost: 0.00067, ips: 39.2559 samples/sec | ETA 00:46:52 2022-08-25 03:01:51 [INFO] [TRAIN] epoch: 116, iter: 146250/160000, loss: 0.6143, lr: 0.000104, batch_cost: 0.2023, reader_cost: 0.01625, ips: 39.5527 samples/sec | ETA 00:46:21 2022-08-25 03:02:03 [INFO] [TRAIN] epoch: 116, iter: 146300/160000, loss: 0.6537, lr: 0.000104, batch_cost: 0.2248, reader_cost: 0.00820, ips: 35.5866 samples/sec | ETA 00:51:19 2022-08-25 03:02:14 [INFO] [TRAIN] epoch: 116, iter: 146350/160000, loss: 0.6914, lr: 0.000103, batch_cost: 0.2344, reader_cost: 0.00040, ips: 34.1275 samples/sec | ETA 00:53:19 2022-08-25 03:02:27 [INFO] [TRAIN] epoch: 116, iter: 146400/160000, loss: 0.6809, lr: 0.000103, batch_cost: 0.2581, reader_cost: 0.00892, ips: 30.9957 samples/sec | ETA 00:58:30 2022-08-25 03:02:38 [INFO] [TRAIN] epoch: 116, iter: 146450/160000, loss: 0.6452, lr: 0.000103, batch_cost: 0.2225, reader_cost: 0.00480, ips: 35.9504 samples/sec | ETA 00:50:15 2022-08-25 03:02:51 [INFO] [TRAIN] epoch: 116, iter: 146500/160000, loss: 0.6449, lr: 0.000102, batch_cost: 0.2508, reader_cost: 0.00194, ips: 31.8923 samples/sec | ETA 00:56:26 2022-08-25 03:03:06 [INFO] [TRAIN] epoch: 117, iter: 146550/160000, loss: 0.6490, lr: 0.000102, batch_cost: 0.3001, reader_cost: 0.04099, ips: 26.6587 samples/sec | ETA 01:07:16 2022-08-25 03:03:19 [INFO] [TRAIN] epoch: 117, iter: 146600/160000, loss: 0.6851, lr: 0.000101, batch_cost: 0.2629, reader_cost: 0.01986, ips: 30.4250 samples/sec | ETA 00:58:43 2022-08-25 03:03:31 [INFO] [TRAIN] epoch: 117, iter: 146650/160000, loss: 0.6418, lr: 0.000101, batch_cost: 0.2329, reader_cost: 0.00332, ips: 34.3499 samples/sec | ETA 00:51:49 2022-08-25 03:03:44 [INFO] [TRAIN] epoch: 117, iter: 146700/160000, loss: 0.6761, lr: 0.000101, batch_cost: 0.2586, reader_cost: 0.00121, ips: 30.9358 samples/sec | ETA 00:57:19 2022-08-25 03:03:55 [INFO] [TRAIN] epoch: 117, iter: 146750/160000, loss: 0.6652, lr: 0.000100, batch_cost: 0.2322, reader_cost: 0.00560, ips: 34.4555 samples/sec | ETA 00:51:16 2022-08-25 03:04:07 [INFO] [TRAIN] epoch: 117, iter: 146800/160000, loss: 0.6698, lr: 0.000100, batch_cost: 0.2422, reader_cost: 0.00132, ips: 33.0325 samples/sec | ETA 00:53:16 2022-08-25 03:04:19 [INFO] [TRAIN] epoch: 117, iter: 146850/160000, loss: 0.6799, lr: 0.000100, batch_cost: 0.2298, reader_cost: 0.01483, ips: 34.8102 samples/sec | ETA 00:50:22 2022-08-25 03:04:31 [INFO] [TRAIN] epoch: 117, iter: 146900/160000, loss: 0.7158, lr: 0.000099, batch_cost: 0.2411, reader_cost: 0.00158, ips: 33.1855 samples/sec | ETA 00:52:38 2022-08-25 03:04:43 [INFO] [TRAIN] epoch: 117, iter: 146950/160000, loss: 0.6438, lr: 0.000099, batch_cost: 0.2427, reader_cost: 0.01504, ips: 32.9583 samples/sec | ETA 00:52:47 2022-08-25 03:04:54 [INFO] [TRAIN] epoch: 117, iter: 147000/160000, loss: 0.7006, lr: 0.000098, batch_cost: 0.2242, reader_cost: 0.00986, ips: 35.6853 samples/sec | ETA 
00:48:34 2022-08-25 03:04:54 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 166s - batch_cost: 0.1662 - reader cost: 5.9486e-04 2022-08-25 03:07:41 [INFO] [EVAL] #Images: 2000 mIoU: 0.3218 Acc: 0.7495 Kappa: 0.7300 Dice: 0.4499 2022-08-25 03:07:41 [INFO] [EVAL] Class IoU: [0.6582 0.7723 0.9248 0.7056 0.6682 0.7478 0.7625 0.7613 0.4887 0.6084 0.4624 0.5263 0.6613 0.3066 0.2132 0.3803 0.4778 0.4107 0.5403 0.366 0.7252 0.4053 0.5506 0.4657 0.3322 0.3599 0.3664 0.4089 0.3645 0.3284 0.2264 0.3877 0.2843 0.2866 0.3231 0.3957 0.3729 0.5066 0.2559 0.2647 0.0679 0.0581 0.296 0.2257 0.2815 0.2724 0.2346 0.4297 0.6952 0.4849 0.4748 0.2764 0.2245 0.1954 0.6114 0.464 0.8463 0.3162 0.5103 0.1898 0.0505 0.1879 0.302 0.143 0.3906 0.654 0.2327 0.3618 0.097 0.3313 0.3807 0.4285 0.3726 0.2232 0.4198 0.2913 0.3156 0.224 0.1397 0.2455 0.6573 0.2974 0.2774 0.0141 0.5341 0.4934 0.0835 0.0613 0.3297 0.4226 0.4167 0.0043 0.2151 0.065 0.0009 0.014 0.1734 0.1305 0.2193 0.339 0.0712 0.0302 0.1955 0.5343 0.0013 0.5236 0.1764 0.4823 0.0943 0.3243 0.0748 0.2397 0.0805 0.5312 0.7133 0.0025 0.3801 0.602 0.0506 0.3432 0.4633 0.016 0.2108 0.1027 0.2407 0.2348 0.4243 0.2905 0.448 0.2679 0.5391 0.0196 0.1401 0.2739 0.1116 0.0989 0.0599 0.0203 0.1199 0.3334 0.0936 0.0558 0.2463 0.5114 0.3084 0. 0.2981 0.0125 0.0812 0.053 ] 2022-08-25 03:07:41 [INFO] [EVAL] Class Precision: [0.7564 0.8335 0.9636 0.8062 0.7734 0.8596 0.8791 0.8337 0.6153 0.757 0.6955 0.664 0.7499 0.4986 0.4791 0.5427 0.6224 0.6258 0.723 0.5657 0.8186 0.5814 0.6948 0.5837 0.5079 0.5883 0.5106 0.6815 0.6262 0.4608 0.4117 0.5051 0.5089 0.4033 0.5009 0.5843 0.5805 0.7828 0.5007 0.541 0.1349 0.2015 0.4771 0.527 0.44 0.4471 0.3787 0.6796 0.7605 0.5932 0.6087 0.3388 0.4172 0.656 0.7037 0.6444 0.8955 0.617 0.6892 0.3355 0.0856 0.546 0.4262 0.6396 0.4942 0.7819 0.3851 0.5691 0.2797 0.6324 0.6282 0.5749 0.6145 0.2962 0.6718 0.4443 0.5076 0.62 0.6505 0.5779 0.7724 0.5719 0.7392 0.0582 0.6696 0.6889 0.4084 0.4272 0.5673 0.5931 0.5814 0.0059 0.3641 0.3085 0.005 0.0509 0.597 0.5331 0.4473 0.5839 0.555 0.0559 0.7399 0.6866 0.036 0.7463 0.5268 0.874 0.2449 0.5487 0.265 0.3611 0.4001 0.7326 0.7174 0.0699 0.6998 0.6254 0.14 0.7148 0.7639 0.2062 0.6077 0.7453 0.6366 0.6402 0.7649 0.4223 0.6077 0.5041 0.6136 0.514 0.4455 0.705 0.6443 0.2869 0.2876 0.1823 0.3757 0.6945 0.214 0.1879 0.5445 0.7055 0.5361 0. 0.8944 0.421 0.4646 0.7726] 2022-08-25 03:07:41 [INFO] [EVAL] Class Recall: [0.8353 0.9132 0.9583 0.8498 0.8309 0.8518 0.8517 0.8976 0.7037 0.7561 0.5798 0.7174 0.8483 0.4433 0.2776 0.5595 0.6728 0.5444 0.6813 0.509 0.8641 0.5724 0.7262 0.6973 0.4899 0.481 0.5647 0.5055 0.4659 0.5333 0.3346 0.6252 0.3917 0.4977 0.4766 0.5508 0.5104 0.5894 0.3435 0.3413 0.1203 0.0754 0.4381 0.283 0.4386 0.4107 0.3815 0.5389 0.89 0.7265 0.6833 0.6003 0.327 0.2177 0.8234 0.6238 0.939 0.3935 0.6629 0.304 0.1096 0.2228 0.5089 0.1555 0.6507 0.7999 0.3703 0.4983 0.1292 0.4103 0.4914 0.6272 0.4862 0.4754 0.5282 0.4583 0.4548 0.2596 0.1511 0.2991 0.8151 0.3826 0.3075 0.0183 0.7253 0.6349 0.095 0.0668 0.4405 0.5952 0.5953 0.0149 0.3445 0.0761 0.0011 0.0189 0.1964 0.1473 0.3008 0.447 0.0755 0.0616 0.2099 0.7067 0.0013 0.6369 0.2095 0.5183 0.1331 0.4422 0.0944 0.4161 0.0916 0.659 0.9922 0.0025 0.4542 0.9413 0.0734 0.3976 0.5407 0.0171 0.244 0.1065 0.279 0.2704 0.4879 0.4821 0.6304 0.3637 0.8164 0.02 0.1696 0.3094 0.1189 0.1312 0.0703 0.0223 0.1497 0.3907 0.1426 0.0735 0.3103 0.6502 0.4206 0. 
0.309 0.0127 0.0896 0.0538] 2022-08-25 03:07:41 [INFO] [EVAL] The model with the best validation mIoU (0.3238) was saved at iter 142000. 2022-08-25 03:07:51 [INFO] [TRAIN] epoch: 117, iter: 147050/160000, loss: 0.6952, lr: 0.000098, batch_cost: 0.2051, reader_cost: 0.00373, ips: 38.9959 samples/sec | ETA 00:44:16 2022-08-25 03:08:02 [INFO] [TRAIN] epoch: 117, iter: 147100/160000, loss: 0.7015, lr: 0.000098, batch_cost: 0.2138, reader_cost: 0.00100, ips: 37.4155 samples/sec | ETA 00:45:58 2022-08-25 03:08:11 [INFO] [TRAIN] epoch: 117, iter: 147150/160000, loss: 0.6091, lr: 0.000097, batch_cost: 0.1878, reader_cost: 0.00034, ips: 42.5965 samples/sec | ETA 00:40:13 2022-08-25 03:08:22 [INFO] [TRAIN] epoch: 117, iter: 147200/160000, loss: 0.6775, lr: 0.000097, batch_cost: 0.2050, reader_cost: 0.00043, ips: 39.0227 samples/sec | ETA 00:43:44 2022-08-25 03:08:33 [INFO] [TRAIN] epoch: 117, iter: 147250/160000, loss: 0.6956, lr: 0.000097, batch_cost: 0.2352, reader_cost: 0.01072, ips: 34.0166 samples/sec | ETA 00:49:58 2022-08-25 03:08:46 [INFO] [TRAIN] epoch: 117, iter: 147300/160000, loss: 0.6900, lr: 0.000096, batch_cost: 0.2570, reader_cost: 0.00066, ips: 31.1268 samples/sec | ETA 00:54:24 2022-08-25 03:08:58 [INFO] [TRAIN] epoch: 117, iter: 147350/160000, loss: 0.7045, lr: 0.000096, batch_cost: 0.2321, reader_cost: 0.00998, ips: 34.4711 samples/sec | ETA 00:48:55 2022-08-25 03:09:10 [INFO] [TRAIN] epoch: 117, iter: 147400/160000, loss: 0.6608, lr: 0.000095, batch_cost: 0.2503, reader_cost: 0.00743, ips: 31.9614 samples/sec | ETA 00:52:33 2022-08-25 03:09:21 [INFO] [TRAIN] epoch: 117, iter: 147450/160000, loss: 0.6236, lr: 0.000095, batch_cost: 0.2102, reader_cost: 0.00935, ips: 38.0576 samples/sec | ETA 00:43:58 2022-08-25 03:09:32 [INFO] [TRAIN] epoch: 117, iter: 147500/160000, loss: 0.6384, lr: 0.000095, batch_cost: 0.2302, reader_cost: 0.01404, ips: 34.7558 samples/sec | ETA 00:47:57 2022-08-25 03:09:45 [INFO] [TRAIN] epoch: 117, iter: 147550/160000, loss: 0.6537, lr: 0.000094, batch_cost: 0.2459, reader_cost: 0.00064, ips: 32.5319 samples/sec | ETA 00:51:01 2022-08-25 03:09:56 [INFO] [TRAIN] epoch: 117, iter: 147600/160000, loss: 0.6426, lr: 0.000094, batch_cost: 0.2362, reader_cost: 0.00916, ips: 33.8661 samples/sec | ETA 00:48:49 2022-08-25 03:10:09 [INFO] [TRAIN] epoch: 117, iter: 147650/160000, loss: 0.6477, lr: 0.000094, batch_cost: 0.2491, reader_cost: 0.00109, ips: 32.1103 samples/sec | ETA 00:51:16 2022-08-25 03:10:20 [INFO] [TRAIN] epoch: 117, iter: 147700/160000, loss: 0.6600, lr: 0.000093, batch_cost: 0.2271, reader_cost: 0.00614, ips: 35.2258 samples/sec | ETA 00:46:33 2022-08-25 03:10:32 [INFO] [TRAIN] epoch: 117, iter: 147750/160000, loss: 0.6889, lr: 0.000093, batch_cost: 0.2392, reader_cost: 0.00260, ips: 33.4422 samples/sec | ETA 00:48:50 2022-08-25 03:10:47 [INFO] [TRAIN] epoch: 118, iter: 147800/160000, loss: 0.6400, lr: 0.000092, batch_cost: 0.2984, reader_cost: 0.06190, ips: 26.8053 samples/sec | ETA 01:00:41 2022-08-25 03:11:00 [INFO] [TRAIN] epoch: 118, iter: 147850/160000, loss: 0.7054, lr: 0.000092, batch_cost: 0.2518, reader_cost: 0.00977, ips: 31.7747 samples/sec | ETA 00:50:59 2022-08-25 03:11:11 [INFO] [TRAIN] epoch: 118, iter: 147900/160000, loss: 0.6740, lr: 0.000092, batch_cost: 0.2286, reader_cost: 0.00396, ips: 34.9924 samples/sec | ETA 00:46:06 2022-08-25 03:11:21 [INFO] [TRAIN] epoch: 118, iter: 147950/160000, loss: 0.6809, lr: 0.000091, batch_cost: 0.1884, reader_cost: 0.00042, ips: 42.4549 samples/sec | ETA 00:37:50 2022-08-25 03:11:32 [INFO] [TRAIN] 
epoch: 118, iter: 148000/160000, loss: 0.6781, lr: 0.000091, batch_cost: 0.2213, reader_cost: 0.00061, ips: 36.1488 samples/sec | ETA 00:44:15 2022-08-25 03:11:32 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 156s - batch_cost: 0.1555 - reader cost: 6.1340e-04 2022-08-25 03:14:07 [INFO] [EVAL] #Images: 2000 mIoU: 0.3219 Acc: 0.7503 Kappa: 0.7311 Dice: 0.4497 2022-08-25 03:14:07 [INFO] [EVAL] Class IoU: [0.6585 0.7741 0.9255 0.7073 0.6673 0.747 0.7641 0.7626 0.4867 0.6239 0.4663 0.5279 0.6596 0.3012 0.2329 0.3872 0.4893 0.4124 0.5492 0.366 0.7259 0.4126 0.5575 0.4564 0.3343 0.3755 0.3888 0.402 0.3767 0.3439 0.2189 0.3989 0.2781 0.2934 0.3368 0.3838 0.3692 0.5105 0.269 0.2569 0.0942 0.0629 0.292 0.2168 0.2894 0.2551 0.261 0.4209 0.685 0.4982 0.4686 0.2648 0.224 0.2156 0.6224 0.4513 0.8394 0.2987 0.4856 0.1891 0.0489 0.1793 0.3063 0.0913 0.3881 0.6624 0.2289 0.3575 0.0916 0.3458 0.3622 0.4312 0.3678 0.2277 0.415 0.3006 0.3617 0.2145 0.1205 0.2233 0.6429 0.2841 0.2321 0.0194 0.5287 0.4939 0.082 0.0629 0.3561 0.4298 0.3951 0.0048 0.2124 0.0825 0.0046 0.0066 0.1699 0.1315 0.2041 0.3438 0.1091 0.023 0.2038 0.5564 0.0011 0.5156 0.1989 0.4753 0.0907 0.373 0.0581 0.2707 0.0877 0.5464 0.6893 0.0023 0.3996 0.5839 0.0417 0.3193 0.4643 0.0132 0.2477 0.0856 0.2576 0.2225 0.4205 0.2867 0.4387 0.2497 0.5381 0.0212 0.1428 0.2436 0.0826 0.0985 0.0571 0.0182 0.1153 0.3237 0.0898 0.0664 0.2348 0.5183 0.314 0. 0.3026 0.0149 0.0667 0.0611] 2022-08-25 03:14:07 [INFO] [EVAL] Class Precision: [0.7653 0.8411 0.9595 0.8116 0.7654 0.8559 0.8795 0.8311 0.619 0.7538 0.6715 0.6589 0.7445 0.4772 0.4456 0.5347 0.6493 0.6532 0.7078 0.5533 0.8182 0.5768 0.7138 0.5847 0.5403 0.5853 0.5343 0.6693 0.6067 0.5237 0.4202 0.523 0.4845 0.4616 0.4909 0.5579 0.6028 0.743 0.5254 0.5284 0.1766 0.1968 0.4699 0.5273 0.4285 0.4356 0.4514 0.5977 0.7706 0.6127 0.5948 0.33 0.4045 0.6767 0.702 0.622 0.8834 0.6313 0.6647 0.3795 0.0881 0.4283 0.4306 0.6941 0.4956 0.8031 0.3897 0.5804 0.3233 0.6336 0.6006 0.5753 0.6307 0.2911 0.6218 0.4819 0.5823 0.5623 0.7461 0.5431 0.7388 0.5532 0.7736 0.0645 0.6866 0.686 0.3415 0.4337 0.5518 0.6372 0.5213 0.0067 0.4057 0.3015 0.028 0.0303 0.5524 0.5023 0.3798 0.5666 0.7233 0.0388 0.7264 0.6769 0.0184 0.7076 0.5801 0.9385 0.2448 0.5577 0.2747 0.4366 0.3311 0.7294 0.6936 0.0632 0.703 0.6053 0.0834 0.7345 0.7517 0.2826 0.6878 0.7202 0.5995 0.6122 0.7578 0.4315 0.5669 0.5319 0.6196 0.5682 0.3749 0.7508 0.7083 0.2533 0.315 0.1342 0.3674 0.6945 0.1843 0.2092 0.5397 0.7255 0.5084 0. 
0.8791 0.3356 0.3734 0.7061] 2022-08-25 03:14:07 [INFO] [EVAL] Class Recall: [0.8251 0.9068 0.9631 0.8462 0.8389 0.8545 0.8534 0.9025 0.6948 0.7835 0.6042 0.7264 0.8525 0.4495 0.3279 0.584 0.665 0.5281 0.7102 0.5194 0.8656 0.5917 0.7179 0.6753 0.4672 0.5115 0.5882 0.5017 0.4984 0.5005 0.3135 0.6269 0.395 0.4461 0.5175 0.5517 0.4878 0.62 0.3553 0.3333 0.168 0.0846 0.4355 0.2691 0.4713 0.3811 0.3824 0.5873 0.8605 0.7272 0.6884 0.5726 0.3342 0.2403 0.846 0.6218 0.9439 0.3619 0.6432 0.2737 0.0989 0.2356 0.5147 0.0951 0.6416 0.7908 0.3567 0.4822 0.1134 0.4321 0.477 0.6325 0.4687 0.5108 0.5551 0.4442 0.4884 0.2575 0.1256 0.275 0.8321 0.3687 0.2491 0.0269 0.6969 0.6381 0.0973 0.0685 0.5009 0.569 0.62 0.0162 0.3082 0.102 0.0055 0.0083 0.197 0.1512 0.3061 0.4664 0.1139 0.0533 0.2208 0.7577 0.0012 0.6552 0.2323 0.4906 0.1259 0.5296 0.0686 0.4161 0.1065 0.6854 0.991 0.0024 0.4808 0.9429 0.0769 0.361 0.5484 0.0136 0.279 0.0886 0.3112 0.2591 0.4857 0.4608 0.6599 0.3201 0.8036 0.0215 0.1874 0.2651 0.0855 0.1388 0.0652 0.0206 0.1439 0.3774 0.1491 0.0887 0.2935 0.6448 0.4509 0. 0.3158 0.0153 0.0751 0.0627] 2022-08-25 03:14:08 [INFO] [EVAL] The model with the best validation mIoU (0.3238) was saved at iter 142000. 2022-08-25 03:14:17 [INFO] [TRAIN] epoch: 118, iter: 148050/160000, loss: 0.6570, lr: 0.000090, batch_cost: 0.1972, reader_cost: 0.00429, ips: 40.5760 samples/sec | ETA 00:39:16 2022-08-25 03:14:27 [INFO] [TRAIN] epoch: 118, iter: 148100/160000, loss: 0.6617, lr: 0.000090, batch_cost: 0.1827, reader_cost: 0.00074, ips: 43.7806 samples/sec | ETA 00:36:14 2022-08-25 03:14:35 [INFO] [TRAIN] epoch: 118, iter: 148150/160000, loss: 0.6166, lr: 0.000090, batch_cost: 0.1756, reader_cost: 0.00532, ips: 45.5646 samples/sec | ETA 00:34:40 2022-08-25 03:14:44 [INFO] [TRAIN] epoch: 118, iter: 148200/160000, loss: 0.6746, lr: 0.000089, batch_cost: 0.1823, reader_cost: 0.00044, ips: 43.8879 samples/sec | ETA 00:35:50 2022-08-25 03:14:54 [INFO] [TRAIN] epoch: 118, iter: 148250/160000, loss: 0.6825, lr: 0.000089, batch_cost: 0.1917, reader_cost: 0.00056, ips: 41.7298 samples/sec | ETA 00:37:32 2022-08-25 03:15:05 [INFO] [TRAIN] epoch: 118, iter: 148300/160000, loss: 0.6919, lr: 0.000089, batch_cost: 0.2221, reader_cost: 0.00053, ips: 36.0220 samples/sec | ETA 00:43:18 2022-08-25 03:15:17 [INFO] [TRAIN] epoch: 118, iter: 148350/160000, loss: 0.6688, lr: 0.000088, batch_cost: 0.2384, reader_cost: 0.00747, ips: 33.5585 samples/sec | ETA 00:46:17 2022-08-25 03:15:29 [INFO] [TRAIN] epoch: 118, iter: 148400/160000, loss: 0.6590, lr: 0.000088, batch_cost: 0.2445, reader_cost: 0.00068, ips: 32.7198 samples/sec | ETA 00:47:16 2022-08-25 03:15:41 [INFO] [TRAIN] epoch: 118, iter: 148450/160000, loss: 0.6439, lr: 0.000087, batch_cost: 0.2283, reader_cost: 0.00061, ips: 35.0404 samples/sec | ETA 00:43:56 2022-08-25 03:15:54 [INFO] [TRAIN] epoch: 118, iter: 148500/160000, loss: 0.6874, lr: 0.000087, batch_cost: 0.2584, reader_cost: 0.01484, ips: 30.9580 samples/sec | ETA 00:49:31 2022-08-25 03:16:06 [INFO] [TRAIN] epoch: 118, iter: 148550/160000, loss: 0.6519, lr: 0.000087, batch_cost: 0.2451, reader_cost: 0.00623, ips: 32.6396 samples/sec | ETA 00:46:46 2022-08-25 03:16:18 [INFO] [TRAIN] epoch: 118, iter: 148600/160000, loss: 0.6760, lr: 0.000086, batch_cost: 0.2375, reader_cost: 0.00110, ips: 33.6912 samples/sec | ETA 00:45:06 2022-08-25 03:16:29 [INFO] [TRAIN] epoch: 118, iter: 148650/160000, loss: 0.6330, lr: 0.000086, batch_cost: 0.2323, reader_cost: 0.00080, ips: 34.4350 samples/sec | ETA 00:43:56 2022-08-25 03:16:41 
[INFO] [TRAIN] epoch: 118, iter: 148700/160000, loss: 0.6739, lr: 0.000086, batch_cost: 0.2397, reader_cost: 0.00046, ips: 33.3790 samples/sec | ETA 00:45:08 2022-08-25 03:16:53 [INFO] [TRAIN] epoch: 118, iter: 148750/160000, loss: 0.6929, lr: 0.000085, batch_cost: 0.2409, reader_cost: 0.00080, ips: 33.2050 samples/sec | ETA 00:45:10 2022-08-25 03:17:05 [INFO] [TRAIN] epoch: 118, iter: 148800/160000, loss: 0.6990, lr: 0.000085, batch_cost: 0.2422, reader_cost: 0.00425, ips: 33.0300 samples/sec | ETA 00:45:12 2022-08-25 03:17:17 [INFO] [TRAIN] epoch: 118, iter: 148850/160000, loss: 0.6640, lr: 0.000084, batch_cost: 0.2390, reader_cost: 0.01550, ips: 33.4661 samples/sec | ETA 00:44:25 2022-08-25 03:17:29 [INFO] [TRAIN] epoch: 118, iter: 148900/160000, loss: 0.6731, lr: 0.000084, batch_cost: 0.2276, reader_cost: 0.00227, ips: 35.1475 samples/sec | ETA 00:42:06 2022-08-25 03:17:39 [INFO] [TRAIN] epoch: 118, iter: 148950/160000, loss: 0.6654, lr: 0.000084, batch_cost: 0.2064, reader_cost: 0.00069, ips: 38.7646 samples/sec | ETA 00:38:00 2022-08-25 03:17:48 [INFO] [TRAIN] epoch: 118, iter: 149000/160000, loss: 0.6517, lr: 0.000083, batch_cost: 0.1836, reader_cost: 0.00046, ips: 43.5647 samples/sec | ETA 00:33:39 2022-08-25 03:17:48 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1588 - reader cost: 5.7805e-04 2022-08-25 03:20:27 [INFO] [EVAL] #Images: 2000 mIoU: 0.3234 Acc: 0.7498 Kappa: 0.7305 Dice: 0.4514 2022-08-25 03:20:27 [INFO] [EVAL] Class IoU: [0.6598 0.7753 0.9248 0.7041 0.6692 0.7488 0.7666 0.7579 0.4895 0.6222 0.468 0.5328 0.6644 0.2926 0.2255 0.3836 0.4795 0.4198 0.5504 0.3649 0.7293 0.3909 0.5729 0.4485 0.3252 0.3817 0.3676 0.4056 0.3512 0.3337 0.2292 0.4094 0.2727 0.2921 0.3236 0.3815 0.3705 0.5113 0.2609 0.2742 0.0909 0.067 0.2941 0.2218 0.2934 0.2769 0.2399 0.4313 0.6813 0.4867 0.4731 0.2756 0.2133 0.2142 0.6121 0.4551 0.8523 0.3105 0.4779 0.1766 0.0572 0.1788 0.2904 0.1359 0.3837 0.6691 0.2345 0.3579 0.1072 0.3485 0.3824 0.4373 0.3727 0.2253 0.4242 0.2928 0.3217 0.2287 0.069 0.185 0.6547 0.3002 0.2801 0.0139 0.5376 0.501 0.0916 0.0623 0.3513 0.4273 0.3919 0.0062 0.1798 0.078 0.0014 0.0102 0.1729 0.1444 0.1995 0.3614 0.1358 0.0332 0.211 0.555 0.0026 0.5611 0.2339 0.4946 0.101 0.3953 0.0656 0.1814 0.0922 0.5389 0.7355 0.0016 0.4374 0.5742 0.0526 0.2656 0.4775 0.013 0.2108 0.1164 0.245 0.2363 0.4255 0.3001 0.4444 0.2677 0.5432 0.0227 0.153 0.2354 0.0896 0.104 0.0662 0.0185 0.1179 0.3326 0.0992 0.0562 0.2902 0.4957 0.3135 0. 
0.2972 0.01 0.0721 0.0469] 2022-08-25 03:20:27 [INFO] [EVAL] Class Precision: [0.7639 0.8488 0.9632 0.7945 0.7669 0.8636 0.882 0.8249 0.6382 0.7353 0.6718 0.6779 0.752 0.4954 0.4543 0.558 0.6368 0.6392 0.7156 0.5655 0.8327 0.5619 0.7588 0.6197 0.5209 0.5447 0.5248 0.6432 0.6498 0.4857 0.378 0.5462 0.4575 0.421 0.4898 0.5086 0.6142 0.7418 0.5169 0.5142 0.1666 0.211 0.4738 0.5407 0.4052 0.448 0.3809 0.6595 0.7227 0.6034 0.6201 0.3365 0.3618 0.6011 0.7051 0.6264 0.9066 0.6304 0.6328 0.3635 0.0859 0.397 0.3868 0.5882 0.4803 0.7993 0.3756 0.5599 0.2796 0.6442 0.6007 0.6143 0.6196 0.2789 0.6562 0.4643 0.5723 0.6238 0.7246 0.5094 0.7575 0.5616 0.7546 0.0421 0.6773 0.6843 0.3853 0.4441 0.5642 0.6289 0.5123 0.0087 0.2972 0.2909 0.0071 0.0366 0.645 0.4357 0.3589 0.5485 0.6335 0.0628 0.7435 0.6483 0.0315 0.8129 0.5583 0.9298 0.2305 0.568 0.2549 0.2433 0.3963 0.6857 0.7423 0.0265 0.6945 0.5936 0.1277 0.7097 0.7595 0.194 0.6765 0.6763 0.6482 0.639 0.7926 0.4478 0.5846 0.5567 0.6218 0.5246 0.4699 0.7735 0.6511 0.2547 0.2872 0.1401 0.3532 0.6945 0.208 0.2261 0.5676 0.7689 0.58 0. 0.8921 0.4696 0.4503 0.7448] 2022-08-25 03:20:27 [INFO] [EVAL] Class Recall: [0.8289 0.8996 0.9587 0.8608 0.8401 0.8493 0.8542 0.9032 0.6776 0.8018 0.6067 0.7134 0.8508 0.4168 0.3093 0.551 0.66 0.5502 0.7045 0.5071 0.8545 0.5622 0.7004 0.6189 0.464 0.5605 0.551 0.5234 0.4332 0.5161 0.3679 0.6205 0.4031 0.4882 0.4882 0.6042 0.4828 0.622 0.3451 0.37 0.1666 0.0893 0.4367 0.2733 0.5152 0.4203 0.3933 0.5549 0.9224 0.7157 0.6663 0.6036 0.3418 0.2497 0.8227 0.6247 0.9343 0.3796 0.6612 0.2556 0.1466 0.2454 0.5382 0.1501 0.6561 0.8042 0.3842 0.4981 0.1481 0.4315 0.5127 0.6028 0.4832 0.5395 0.5454 0.4421 0.4235 0.2653 0.0709 0.2251 0.8284 0.392 0.3082 0.0203 0.7227 0.6516 0.1073 0.0675 0.4821 0.5714 0.6249 0.0205 0.3128 0.0963 0.0017 0.014 0.191 0.1776 0.3101 0.5144 0.1474 0.066 0.2276 0.7941 0.0028 0.6444 0.287 0.5138 0.1524 0.5652 0.0812 0.4161 0.1073 0.7157 0.9877 0.0017 0.5415 0.9461 0.0821 0.298 0.5626 0.0138 0.2345 0.1233 0.2826 0.2727 0.4788 0.4765 0.6495 0.3402 0.8114 0.0232 0.185 0.2528 0.0941 0.1495 0.0793 0.0209 0.1503 0.3896 0.1594 0.0695 0.3725 0.5825 0.4055 0. 0.3083 0.0101 0.0791 0.0476] 2022-08-25 03:20:28 [INFO] [EVAL] The model with the best validation mIoU (0.3238) was saved at iter 142000. 
2022-08-25 03:20:38 [INFO] [TRAIN] epoch: 119, iter: 149050/160000, loss: 0.6687, lr: 0.000083, batch_cost: 0.2169, reader_cost: 0.03018, ips: 36.8777 samples/sec | ETA 00:39:35 2022-08-25 03:20:49 [INFO] [TRAIN] epoch: 119, iter: 149100/160000, loss: 0.7028, lr: 0.000083, batch_cost: 0.2190, reader_cost: 0.00039, ips: 36.5280 samples/sec | ETA 00:39:47 2022-08-25 03:21:00 [INFO] [TRAIN] epoch: 119, iter: 149150/160000, loss: 0.6921, lr: 0.000082, batch_cost: 0.2133, reader_cost: 0.00057, ips: 37.5085 samples/sec | ETA 00:38:34 2022-08-25 03:21:10 [INFO] [TRAIN] epoch: 119, iter: 149200/160000, loss: 0.6712, lr: 0.000082, batch_cost: 0.1920, reader_cost: 0.00059, ips: 41.6698 samples/sec | ETA 00:34:33 2022-08-25 03:21:19 [INFO] [TRAIN] epoch: 119, iter: 149250/160000, loss: 0.6850, lr: 0.000081, batch_cost: 0.1930, reader_cost: 0.00052, ips: 41.4582 samples/sec | ETA 00:34:34 2022-08-25 03:21:28 [INFO] [TRAIN] epoch: 119, iter: 149300/160000, loss: 0.6862, lr: 0.000081, batch_cost: 0.1783, reader_cost: 0.00073, ips: 44.8620 samples/sec | ETA 00:31:48 2022-08-25 03:21:39 [INFO] [TRAIN] epoch: 119, iter: 149350/160000, loss: 0.6883, lr: 0.000081, batch_cost: 0.2202, reader_cost: 0.01849, ips: 36.3240 samples/sec | ETA 00:39:05 2022-08-25 03:21:51 [INFO] [TRAIN] epoch: 119, iter: 149400/160000, loss: 0.6965, lr: 0.000080, batch_cost: 0.2291, reader_cost: 0.00103, ips: 34.9244 samples/sec | ETA 00:40:28 2022-08-25 03:22:01 [INFO] [TRAIN] epoch: 119, iter: 149450/160000, loss: 0.7052, lr: 0.000080, batch_cost: 0.2167, reader_cost: 0.00065, ips: 36.9245 samples/sec | ETA 00:38:05 2022-08-25 03:22:12 [INFO] [TRAIN] epoch: 119, iter: 149500/160000, loss: 0.6611, lr: 0.000080, batch_cost: 0.2138, reader_cost: 0.00054, ips: 37.4266 samples/sec | ETA 00:37:24 2022-08-25 03:22:24 [INFO] [TRAIN] epoch: 119, iter: 149550/160000, loss: 0.7028, lr: 0.000079, batch_cost: 0.2340, reader_cost: 0.01386, ips: 34.1913 samples/sec | ETA 00:40:45 2022-08-25 03:22:35 [INFO] [TRAIN] epoch: 119, iter: 149600/160000, loss: 0.7000, lr: 0.000079, batch_cost: 0.2306, reader_cost: 0.00601, ips: 34.6937 samples/sec | ETA 00:39:58 2022-08-25 03:22:48 [INFO] [TRAIN] epoch: 119, iter: 149650/160000, loss: 0.6619, lr: 0.000078, batch_cost: 0.2437, reader_cost: 0.00094, ips: 32.8337 samples/sec | ETA 00:42:01 2022-08-25 03:23:00 [INFO] [TRAIN] epoch: 119, iter: 149700/160000, loss: 0.6410, lr: 0.000078, batch_cost: 0.2506, reader_cost: 0.00078, ips: 31.9189 samples/sec | ETA 00:43:01 2022-08-25 03:23:11 [INFO] [TRAIN] epoch: 119, iter: 149750/160000, loss: 0.6915, lr: 0.000078, batch_cost: 0.2230, reader_cost: 0.00479, ips: 35.8719 samples/sec | ETA 00:38:05 2022-08-25 03:23:23 [INFO] [TRAIN] epoch: 119, iter: 149800/160000, loss: 0.6511, lr: 0.000077, batch_cost: 0.2276, reader_cost: 0.00320, ips: 35.1518 samples/sec | ETA 00:38:41 2022-08-25 03:23:35 [INFO] [TRAIN] epoch: 119, iter: 149850/160000, loss: 0.6866, lr: 0.000077, batch_cost: 0.2566, reader_cost: 0.00040, ips: 31.1809 samples/sec | ETA 00:43:24 2022-08-25 03:23:48 [INFO] [TRAIN] epoch: 119, iter: 149900/160000, loss: 0.7045, lr: 0.000076, batch_cost: 0.2452, reader_cost: 0.00603, ips: 32.6314 samples/sec | ETA 00:41:16 2022-08-25 03:23:58 [INFO] [TRAIN] epoch: 119, iter: 149950/160000, loss: 0.6421, lr: 0.000076, batch_cost: 0.1966, reader_cost: 0.00042, ips: 40.6904 samples/sec | ETA 00:32:55 2022-08-25 03:24:07 [INFO] [TRAIN] epoch: 119, iter: 150000/160000, loss: 0.6490, lr: 0.000076, batch_cost: 0.1868, reader_cost: 0.00040, ips: 42.8180 samples/sec | ETA 
00:31:08 2022-08-25 03:24:07 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 167s - batch_cost: 0.1666 - reader cost: 7.6946e-04 2022-08-25 03:26:54 [INFO] [EVAL] #Images: 2000 mIoU: 0.3247 Acc: 0.7503 Kappa: 0.7309 Dice: 0.4534 2022-08-25 03:26:54 [INFO] [EVAL] Class IoU: [0.6598 0.77 0.9248 0.7054 0.6691 0.7467 0.7632 0.7616 0.4891 0.6176 0.4654 0.523 0.6635 0.3006 0.2158 0.3844 0.4884 0.4165 0.5546 0.3662 0.7239 0.4147 0.5658 0.4569 0.3315 0.3558 0.3648 0.4045 0.3741 0.3256 0.2266 0.4006 0.2778 0.2831 0.3284 0.396 0.3717 0.5106 0.2641 0.2726 0.0857 0.0763 0.2969 0.2297 0.2932 0.2675 0.2262 0.4285 0.7045 0.506 0.4896 0.2812 0.2192 0.2169 0.6277 0.4223 0.8517 0.3259 0.5024 0.1712 0.0502 0.2069 0.3008 0.1356 0.395 0.6707 0.2341 0.364 0.0972 0.344 0.3918 0.433 0.3808 0.2179 0.4148 0.2982 0.3089 0.2257 0.1057 0.2382 0.6454 0.295 0.2696 0.0171 0.5268 0.5001 0.0943 0.0676 0.3594 0.4124 0.4042 0.0042 0.2102 0.0759 0.0012 0.01 0.1814 0.1398 0.2289 0.352 0.16 0.0334 0.2067 0.5652 0.0038 0.5376 0.2036 0.4526 0.0963 0.3462 0.062 0.2787 0.0837 0.5207 0.7233 0.0025 0.4097 0.5858 0.0432 0.3132 0.4787 0.0117 0.2005 0.1176 0.2509 0.2407 0.4178 0.3061 0.4546 0.267 0.5326 0.0243 0.1439 0.2578 0.1009 0.0955 0.0611 0.02 0.1199 0.3197 0.0949 0.0527 0.2739 0.5299 0.2925 0. 0.3003 0.0071 0.081 0.0593] 2022-08-25 03:26:54 [INFO] [EVAL] Class Precision: [0.7625 0.8302 0.9604 0.8049 0.7785 0.8647 0.8741 0.8313 0.6172 0.7451 0.68 0.6683 0.7554 0.4996 0.4834 0.5327 0.6453 0.6451 0.7346 0.5611 0.8146 0.5825 0.7437 0.5946 0.5167 0.5709 0.5244 0.6528 0.622 0.497 0.4048 0.5187 0.4829 0.3864 0.5275 0.5377 0.5932 0.7723 0.5032 0.5071 0.1478 0.2074 0.4989 0.5028 0.4035 0.4519 0.3719 0.6635 0.781 0.6357 0.6316 0.3453 0.4027 0.6177 0.7069 0.6416 0.8984 0.6182 0.6729 0.3582 0.089 0.4651 0.4325 0.5894 0.5108 0.7983 0.3765 0.5707 0.2262 0.6049 0.5975 0.5728 0.6366 0.2911 0.6058 0.4676 0.5502 0.6092 0.7795 0.5616 0.7502 0.5859 0.7515 0.053 0.6688 0.6951 0.3752 0.4147 0.5906 0.6038 0.5405 0.0058 0.377 0.3011 0.01 0.0388 0.6089 0.572 0.4345 0.5734 0.6051 0.0597 0.6805 0.7515 0.0886 0.7995 0.544 0.9452 0.2686 0.5714 0.2927 0.4576 0.3954 0.7116 0.7286 0.0777 0.6867 0.6067 0.1195 0.7209 0.7678 0.2892 0.6509 0.6904 0.6192 0.6578 0.7617 0.4494 0.6095 0.5319 0.6044 0.5868 0.4876 0.7427 0.632 0.2739 0.2928 0.1488 0.373 0.7074 0.2394 0.1694 0.5446 0.7149 0.4878 0. 0.8922 0.3476 0.4112 0.6428] 2022-08-25 03:26:54 [INFO] [EVAL] Class Recall: [0.8305 0.9139 0.9615 0.851 0.8264 0.8455 0.8575 0.9008 0.7021 0.7831 0.5959 0.7063 0.845 0.4301 0.2805 0.5799 0.6675 0.5403 0.6936 0.5133 0.8666 0.59 0.7027 0.6636 0.4805 0.4857 0.5453 0.5154 0.4841 0.4856 0.3399 0.6376 0.3954 0.5141 0.4654 0.6006 0.4988 0.6011 0.3573 0.3709 0.1694 0.1076 0.423 0.2973 0.5175 0.3959 0.366 0.5475 0.878 0.7125 0.6852 0.6025 0.3249 0.2505 0.8485 0.5527 0.9424 0.408 0.6648 0.2471 0.1033 0.2714 0.4969 0.1498 0.6352 0.8076 0.3824 0.5012 0.1457 0.4436 0.5322 0.6397 0.4866 0.4642 0.5682 0.4516 0.4132 0.2639 0.1089 0.2925 0.8221 0.3727 0.296 0.0245 0.7127 0.6407 0.1118 0.0747 0.4786 0.5655 0.6158 0.0143 0.322 0.0922 0.0014 0.0133 0.2053 0.1562 0.3261 0.4768 0.1787 0.0705 0.229 0.6951 0.004 0.6214 0.2455 0.4648 0.1304 0.4676 0.0729 0.4161 0.0959 0.66 0.9902 0.0025 0.5039 0.9446 0.0634 0.3565 0.5597 0.012 0.2247 0.1241 0.2967 0.2752 0.4806 0.4898 0.6415 0.349 0.8177 0.0248 0.1696 0.2831 0.1072 0.1278 0.0716 0.0226 0.1501 0.3684 0.1358 0.071 0.3553 0.6718 0.4221 0. 
0.3116 0.0072 0.0916 0.0613] 2022-08-25 03:26:54 [INFO] [EVAL] The model with the best validation mIoU (0.3247) was saved at iter 150000. 2022-08-25 03:27:03 [INFO] [TRAIN] epoch: 119, iter: 150050/160000, loss: 0.6709, lr: 0.000075, batch_cost: 0.1764, reader_cost: 0.01559, ips: 45.3402 samples/sec | ETA 00:29:15 2022-08-25 03:27:12 [INFO] [TRAIN] epoch: 119, iter: 150100/160000, loss: 0.6774, lr: 0.000075, batch_cost: 0.1930, reader_cost: 0.00722, ips: 41.4457 samples/sec | ETA 00:31:50 2022-08-25 03:27:22 [INFO] [TRAIN] epoch: 119, iter: 150150/160000, loss: 0.6363, lr: 0.000075, batch_cost: 0.1842, reader_cost: 0.00062, ips: 43.4238 samples/sec | ETA 00:30:14 2022-08-25 03:27:30 [INFO] [TRAIN] epoch: 119, iter: 150200/160000, loss: 0.6731, lr: 0.000074, batch_cost: 0.1702, reader_cost: 0.00032, ips: 47.0162 samples/sec | ETA 00:27:47 2022-08-25 03:27:40 [INFO] [TRAIN] epoch: 119, iter: 150250/160000, loss: 0.6568, lr: 0.000074, batch_cost: 0.1887, reader_cost: 0.00037, ips: 42.4004 samples/sec | ETA 00:30:39 2022-08-25 03:27:51 [INFO] [TRAIN] epoch: 120, iter: 150300/160000, loss: 0.6753, lr: 0.000073, batch_cost: 0.2361, reader_cost: 0.03362, ips: 33.8846 samples/sec | ETA 00:38:10 2022-08-25 03:28:01 [INFO] [TRAIN] epoch: 120, iter: 150350/160000, loss: 0.6811, lr: 0.000073, batch_cost: 0.1986, reader_cost: 0.00302, ips: 40.2760 samples/sec | ETA 00:31:56 2022-08-25 03:28:10 [INFO] [TRAIN] epoch: 120, iter: 150400/160000, loss: 0.7093, lr: 0.000073, batch_cost: 0.1831, reader_cost: 0.00058, ips: 43.6940 samples/sec | ETA 00:29:17 2022-08-25 03:28:22 [INFO] [TRAIN] epoch: 120, iter: 150450/160000, loss: 0.6727, lr: 0.000072, batch_cost: 0.2361, reader_cost: 0.00037, ips: 33.8780 samples/sec | ETA 00:37:35 2022-08-25 03:28:36 [INFO] [TRAIN] epoch: 120, iter: 150500/160000, loss: 0.6734, lr: 0.000072, batch_cost: 0.2673, reader_cost: 0.00510, ips: 29.9333 samples/sec | ETA 00:42:18 2022-08-25 03:28:50 [INFO] [TRAIN] epoch: 120, iter: 150550/160000, loss: 0.6318, lr: 0.000072, batch_cost: 0.2776, reader_cost: 0.00056, ips: 28.8160 samples/sec | ETA 00:43:43 2022-08-25 03:29:01 [INFO] [TRAIN] epoch: 120, iter: 150600/160000, loss: 0.6195, lr: 0.000071, batch_cost: 0.2388, reader_cost: 0.00087, ips: 33.5036 samples/sec | ETA 00:37:24 2022-08-25 03:29:12 [INFO] [TRAIN] epoch: 120, iter: 150650/160000, loss: 0.6420, lr: 0.000071, batch_cost: 0.2144, reader_cost: 0.00818, ips: 37.3145 samples/sec | ETA 00:33:24 2022-08-25 03:29:25 [INFO] [TRAIN] epoch: 120, iter: 150700/160000, loss: 0.6550, lr: 0.000070, batch_cost: 0.2515, reader_cost: 0.00046, ips: 31.8039 samples/sec | ETA 00:38:59 2022-08-25 03:29:38 [INFO] [TRAIN] epoch: 120, iter: 150750/160000, loss: 0.6277, lr: 0.000070, batch_cost: 0.2620, reader_cost: 0.00068, ips: 30.5316 samples/sec | ETA 00:40:23 2022-08-25 03:29:50 [INFO] [TRAIN] epoch: 120, iter: 150800/160000, loss: 0.6485, lr: 0.000070, batch_cost: 0.2465, reader_cost: 0.00577, ips: 32.4505 samples/sec | ETA 00:37:48 2022-08-25 03:30:03 [INFO] [TRAIN] epoch: 120, iter: 150850/160000, loss: 0.6431, lr: 0.000069, batch_cost: 0.2556, reader_cost: 0.02965, ips: 31.3026 samples/sec | ETA 00:38:58 2022-08-25 03:30:15 [INFO] [TRAIN] epoch: 120, iter: 150900/160000, loss: 0.7022, lr: 0.000069, batch_cost: 0.2371, reader_cost: 0.00585, ips: 33.7386 samples/sec | ETA 00:35:57 2022-08-25 03:30:25 [INFO] [TRAIN] epoch: 120, iter: 150950/160000, loss: 0.7057, lr: 0.000069, batch_cost: 0.1946, reader_cost: 0.00616, ips: 41.1062 samples/sec | ETA 00:29:21 2022-08-25 03:30:35 [INFO] [TRAIN] 
epoch: 120, iter: 151000/160000, loss: 0.6875, lr: 0.000068, batch_cost: 0.2041, reader_cost: 0.00089, ips: 39.2043 samples/sec | ETA 00:30:36 2022-08-25 03:30:35 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 174s - batch_cost: 0.1741 - reader cost: 5.9165e-04 2022-08-25 03:33:29 [INFO] [EVAL] #Images: 2000 mIoU: 0.3243 Acc: 0.7498 Kappa: 0.7307 Dice: 0.4529 2022-08-25 03:33:29 [INFO] [EVAL] Class IoU: [0.661 0.775 0.9235 0.7081 0.6705 0.7461 0.7617 0.7575 0.4886 0.6126 0.4679 0.5304 0.6598 0.2965 0.2322 0.3837 0.4914 0.4226 0.5541 0.3684 0.7277 0.3932 0.5599 0.4495 0.3319 0.3777 0.3602 0.401 0.3725 0.323 0.2261 0.3975 0.2795 0.2875 0.3646 0.3863 0.3696 0.4891 0.2683 0.2741 0.0915 0.0709 0.2939 0.2302 0.2995 0.2619 0.2414 0.4278 0.6926 0.4748 0.4783 0.2753 0.2204 0.2075 0.62 0.4471 0.8506 0.305 0.4961 0.1634 0.0448 0.178 0.3045 0.1253 0.3822 0.6593 0.235 0.3582 0.1097 0.3403 0.3761 0.433 0.3734 0.2243 0.4178 0.2992 0.3207 0.223 0.1007 0.176 0.6414 0.2944 0.2938 0.025 0.5511 0.486 0.0857 0.0692 0.3376 0.4132 0.397 0.0081 0.1971 0.0752 0.0054 0.0077 0.1731 0.1409 0.2223 0.3796 0.1426 0.0372 0.2023 0.6261 0.0038 0.5379 0.2114 0.4655 0.0995 0.3609 0.0549 0.291 0.0923 0.4774 0.7372 0.0007 0.424 0.5609 0.0462 0.3775 0.4531 0.0119 0.2084 0.1237 0.2385 0.2439 0.4391 0.3031 0.4461 0.2684 0.5477 0.0222 0.1378 0.2536 0.1003 0.1099 0.065 0.0182 0.1187 0.3435 0.1157 0.0294 0.2842 0.5083 0.2863 0. 0.2936 0.0079 0.0825 0.0534] 2022-08-25 03:33:29 [INFO] [EVAL] Class Precision: [0.7702 0.8396 0.9598 0.8083 0.7739 0.8652 0.8832 0.8244 0.6209 0.7393 0.6867 0.6592 0.7475 0.4946 0.4386 0.5417 0.6474 0.6338 0.7268 0.5405 0.8209 0.562 0.7176 0.606 0.5245 0.5668 0.521 0.6949 0.6105 0.4688 0.3876 0.5281 0.4838 0.4136 0.5276 0.5399 0.6046 0.7467 0.494 0.5298 0.164 0.2137 0.4717 0.5203 0.4083 0.4551 0.3884 0.6373 0.7663 0.5702 0.6182 0.3388 0.3738 0.6244 0.7015 0.6095 0.9041 0.6264 0.6635 0.3434 0.0741 0.4343 0.4171 0.6268 0.475 0.8078 0.378 0.5733 0.2941 0.5933 0.6049 0.5794 0.6313 0.2899 0.6048 0.4825 0.5533 0.6112 0.7755 0.534 0.7464 0.5971 0.7299 0.0737 0.6904 0.7101 0.3213 0.4189 0.538 0.5815 0.5166 0.0109 0.3589 0.2955 0.0238 0.0323 0.5936 0.5365 0.4631 0.5408 0.6585 0.0718 0.6858 0.7949 0.0711 0.7637 0.5389 0.9182 0.2247 0.5783 0.2927 0.4919 0.3807 0.7329 0.7435 0.0168 0.6965 0.5747 0.114 0.7097 0.7529 0.1727 0.6926 0.6956 0.6294 0.6642 0.7709 0.4398 0.5456 0.5249 0.626 0.4531 0.4482 0.7562 0.6614 0.2648 0.3026 0.1328 0.366 0.6765 0.2406 0.1161 0.55 0.7223 0.4407 0. 
0.8932 0.3067 0.428 0.7596] 2022-08-25 03:33:29 [INFO] [EVAL] Class Recall: [0.8234 0.9097 0.9607 0.8511 0.834 0.8443 0.8471 0.9032 0.6964 0.7814 0.5949 0.7309 0.8491 0.4254 0.3305 0.568 0.6709 0.5592 0.6999 0.5364 0.8651 0.5669 0.7182 0.6351 0.4747 0.5309 0.5385 0.4867 0.4885 0.5094 0.3518 0.6163 0.3982 0.4853 0.5414 0.5759 0.4875 0.5864 0.37 0.3622 0.1715 0.0959 0.4381 0.2922 0.5293 0.3815 0.3894 0.5655 0.8781 0.7395 0.6788 0.5949 0.3493 0.2371 0.8423 0.6266 0.9351 0.3729 0.6628 0.2376 0.1018 0.2318 0.53 0.1354 0.6619 0.7819 0.383 0.4884 0.1489 0.4438 0.4986 0.6315 0.4775 0.4977 0.5746 0.4406 0.4328 0.2598 0.1038 0.208 0.8201 0.3674 0.3296 0.0364 0.732 0.6063 0.1046 0.0766 0.4754 0.5881 0.6316 0.0306 0.3043 0.0916 0.007 0.0101 0.1964 0.1604 0.2995 0.5601 0.154 0.0718 0.2229 0.7467 0.004 0.6453 0.2581 0.4856 0.1514 0.4899 0.0634 0.4161 0.1086 0.578 0.9887 0.0007 0.5201 0.9591 0.0721 0.4464 0.5323 0.0126 0.2296 0.1308 0.2774 0.2782 0.5049 0.4936 0.7099 0.3545 0.8141 0.0228 0.166 0.2762 0.1058 0.1582 0.0765 0.0206 0.1494 0.411 0.1824 0.0378 0.3703 0.6318 0.4496 0. 0.3043 0.008 0.0928 0.0543] 2022-08-25 03:33:29 [INFO] [EVAL] The model with the best validation mIoU (0.3247) was saved at iter 150000. 2022-08-25 03:33:39 [INFO] [TRAIN] epoch: 120, iter: 151050/160000, loss: 0.6947, lr: 0.000068, batch_cost: 0.1887, reader_cost: 0.00523, ips: 42.3920 samples/sec | ETA 00:28:08 2022-08-25 03:33:48 [INFO] [TRAIN] epoch: 120, iter: 151100/160000, loss: 0.6744, lr: 0.000067, batch_cost: 0.1936, reader_cost: 0.00094, ips: 41.3240 samples/sec | ETA 00:28:42 2022-08-25 03:33:58 [INFO] [TRAIN] epoch: 120, iter: 151150/160000, loss: 0.6354, lr: 0.000067, batch_cost: 0.1899, reader_cost: 0.00412, ips: 42.1277 samples/sec | ETA 00:28:00 2022-08-25 03:34:07 [INFO] [TRAIN] epoch: 120, iter: 151200/160000, loss: 0.6614, lr: 0.000067, batch_cost: 0.1895, reader_cost: 0.00085, ips: 42.2173 samples/sec | ETA 00:27:47 2022-08-25 03:34:17 [INFO] [TRAIN] epoch: 120, iter: 151250/160000, loss: 0.6458, lr: 0.000066, batch_cost: 0.1909, reader_cost: 0.00389, ips: 41.9091 samples/sec | ETA 00:27:50 2022-08-25 03:34:26 [INFO] [TRAIN] epoch: 120, iter: 151300/160000, loss: 0.6866, lr: 0.000066, batch_cost: 0.1803, reader_cost: 0.00581, ips: 44.3661 samples/sec | ETA 00:26:08 2022-08-25 03:34:36 [INFO] [TRAIN] epoch: 120, iter: 151350/160000, loss: 0.6637, lr: 0.000065, batch_cost: 0.2050, reader_cost: 0.00522, ips: 39.0277 samples/sec | ETA 00:29:33 2022-08-25 03:34:48 [INFO] [TRAIN] epoch: 120, iter: 151400/160000, loss: 0.7051, lr: 0.000065, batch_cost: 0.2449, reader_cost: 0.00076, ips: 32.6664 samples/sec | ETA 00:35:06 2022-08-25 03:35:01 [INFO] [TRAIN] epoch: 120, iter: 151450/160000, loss: 0.6812, lr: 0.000065, batch_cost: 0.2556, reader_cost: 0.00052, ips: 31.3021 samples/sec | ETA 00:36:25 2022-08-25 03:35:13 [INFO] [TRAIN] epoch: 120, iter: 151500/160000, loss: 0.6731, lr: 0.000064, batch_cost: 0.2327, reader_cost: 0.00077, ips: 34.3773 samples/sec | ETA 00:32:58 2022-08-25 03:35:25 [INFO] [TRAIN] epoch: 120, iter: 151550/160000, loss: 0.7077, lr: 0.000064, batch_cost: 0.2513, reader_cost: 0.00051, ips: 31.8369 samples/sec | ETA 00:35:23 2022-08-25 03:35:40 [INFO] [TRAIN] epoch: 121, iter: 151600/160000, loss: 0.6768, lr: 0.000064, batch_cost: 0.2880, reader_cost: 0.04022, ips: 27.7777 samples/sec | ETA 00:40:19 2022-08-25 03:35:52 [INFO] [TRAIN] epoch: 121, iter: 151650/160000, loss: 0.6707, lr: 0.000063, batch_cost: 0.2457, reader_cost: 0.01077, ips: 32.5653 samples/sec | ETA 00:34:11 2022-08-25 03:36:05 
[INFO] [TRAIN] epoch: 121, iter: 151700/160000, loss: 0.7041, lr: 0.000063, batch_cost: 0.2522, reader_cost: 0.00072, ips: 31.7180 samples/sec | ETA 00:34:53 2022-08-25 03:36:18 [INFO] [TRAIN] epoch: 121, iter: 151750/160000, loss: 0.6928, lr: 0.000062, batch_cost: 0.2694, reader_cost: 0.00065, ips: 29.6972 samples/sec | ETA 00:37:02 2022-08-25 03:36:29 [INFO] [TRAIN] epoch: 121, iter: 151800/160000, loss: 0.6700, lr: 0.000062, batch_cost: 0.2178, reader_cost: 0.00640, ips: 36.7238 samples/sec | ETA 00:29:46 2022-08-25 03:36:39 [INFO] [TRAIN] epoch: 121, iter: 151850/160000, loss: 0.6818, lr: 0.000062, batch_cost: 0.1896, reader_cost: 0.00045, ips: 42.2026 samples/sec | ETA 00:25:44 2022-08-25 03:36:48 [INFO] [TRAIN] epoch: 121, iter: 151900/160000, loss: 0.6662, lr: 0.000061, batch_cost: 0.1946, reader_cost: 0.00060, ips: 41.1173 samples/sec | ETA 00:26:15 2022-08-25 03:36:59 [INFO] [TRAIN] epoch: 121, iter: 151950/160000, loss: 0.6653, lr: 0.000061, batch_cost: 0.2178, reader_cost: 0.00057, ips: 36.7226 samples/sec | ETA 00:29:13 2022-08-25 03:37:09 [INFO] [TRAIN] epoch: 121, iter: 152000/160000, loss: 0.6507, lr: 0.000061, batch_cost: 0.1970, reader_cost: 0.00054, ips: 40.6149 samples/sec | ETA 00:26:15 2022-08-25 03:37:09 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 170s - batch_cost: 0.1699 - reader cost: 9.1610e-04 2022-08-25 03:39:59 [INFO] [EVAL] #Images: 2000 mIoU: 0.3231 Acc: 0.7503 Kappa: 0.7309 Dice: 0.4513 2022-08-25 03:39:59 [INFO] [EVAL] Class IoU: [0.6586 0.772 0.925 0.7072 0.6687 0.746 0.7631 0.7564 0.4918 0.6191 0.4669 0.5254 0.6599 0.3021 0.2194 0.3855 0.4885 0.4107 0.548 0.3681 0.7254 0.4127 0.5633 0.4572 0.3208 0.4115 0.3669 0.4009 0.3617 0.3478 0.212 0.3962 0.2725 0.2854 0.3119 0.393 0.3706 0.4956 0.2604 0.267 0.0744 0.074 0.2963 0.2223 0.2969 0.2689 0.2399 0.4274 0.7049 0.5003 0.4732 0.2652 0.226 0.21 0.6195 0.4464 0.8496 0.3258 0.4923 0.1873 0.0568 0.1763 0.2865 0.1223 0.3834 0.6576 0.2236 0.3686 0.0917 0.3334 0.3764 0.4348 0.37 0.2243 0.4118 0.2967 0.3377 0.2249 0.0974 0.2818 0.6473 0.2967 0.2471 0.0139 0.5289 0.487 0.0723 0.0669 0.3482 0.4126 0.4014 0.0039 0.186 0.0726 0.0059 0.0075 0.1825 0.1387 0.1963 0.3333 0.0831 0.0314 0.2121 0.5613 0.0027 0.5232 0.2161 0.5174 0.0922 0.3925 0.0648 0.2932 0.0927 0.5346 0.6982 0.0063 0.3812 0.5779 0.057 0.3325 0.4513 0.0125 0.2268 0.1186 0.2559 0.1937 0.3824 0.2937 0.4739 0.2681 0.5475 0.0215 0.1412 0.2483 0.1002 0.1054 0.0589 0.0185 0.1342 0.319 0.1078 0.0538 0.2791 0.5117 0.3008 0. 
0.3157 0.0137 0.0635 0.0554] 2022-08-25 03:39:59 [INFO] [EVAL] Class Precision: [0.7596 0.8337 0.96 0.8093 0.7652 0.8608 0.8782 0.822 0.6344 0.7448 0.6854 0.6644 0.7475 0.4855 0.4808 0.5586 0.6454 0.6724 0.7342 0.5673 0.8153 0.5786 0.7204 0.6042 0.523 0.5906 0.5247 0.6689 0.6306 0.5323 0.4095 0.5321 0.4709 0.3991 0.5132 0.52 0.5931 0.7533 0.5247 0.5408 0.146 0.2175 0.4903 0.5183 0.4149 0.4679 0.386 0.6694 0.781 0.6222 0.5986 0.3218 0.4204 0.6087 0.7005 0.6444 0.8905 0.6162 0.6602 0.376 0.0914 0.3578 0.396 0.6108 0.4804 0.7738 0.3776 0.544 0.2054 0.5846 0.6218 0.6009 0.6394 0.2926 0.6229 0.4544 0.539 0.5538 0.7616 0.5315 0.7551 0.6391 0.7634 0.0521 0.676 0.6937 0.3522 0.4196 0.5599 0.5804 0.5347 0.0055 0.3485 0.2921 0.032 0.0308 0.5707 0.4953 0.4364 0.5468 0.6074 0.0575 0.6872 0.7009 0.0379 0.7603 0.5655 0.8971 0.2732 0.5588 0.2649 0.4982 0.3795 0.7139 0.7024 0.1207 0.7194 0.595 0.1245 0.7166 0.7839 0.1868 0.6415 0.7035 0.5881 0.6477 0.817 0.4281 0.6647 0.4453 0.628 0.46 0.4358 0.748 0.6565 0.2525 0.319 0.1359 0.3758 0.7224 0.2339 0.1446 0.5409 0.7085 0.5022 0. 0.8833 0.398 0.42 0.704 ] 2022-08-25 03:39:59 [INFO] [EVAL] Class Recall: [0.832 0.9125 0.9621 0.8486 0.8413 0.8484 0.8534 0.9045 0.6863 0.7857 0.5943 0.7153 0.8492 0.4444 0.2875 0.5544 0.6676 0.5135 0.6835 0.5119 0.868 0.59 0.7209 0.6528 0.4535 0.5757 0.5496 0.5001 0.459 0.5008 0.3054 0.6079 0.3928 0.5004 0.443 0.6167 0.4969 0.5916 0.3408 0.3453 0.1317 0.1008 0.4282 0.2801 0.5107 0.3873 0.388 0.5417 0.8785 0.7186 0.6932 0.6012 0.3283 0.2428 0.8426 0.5923 0.9487 0.4087 0.6594 0.2717 0.1303 0.2579 0.5088 0.1326 0.6549 0.8141 0.354 0.5333 0.1422 0.4369 0.4882 0.6113 0.4675 0.4901 0.5485 0.4608 0.4749 0.2747 0.1005 0.375 0.8194 0.3564 0.2675 0.0187 0.7084 0.6204 0.0833 0.0738 0.4795 0.588 0.6168 0.0132 0.2851 0.0882 0.0072 0.0099 0.2115 0.1616 0.263 0.4604 0.0878 0.0649 0.2347 0.7382 0.0029 0.6266 0.2591 0.55 0.1221 0.5687 0.079 0.4161 0.1092 0.6804 0.9914 0.0066 0.4478 0.9526 0.0951 0.3829 0.5154 0.0132 0.2597 0.1249 0.3118 0.2165 0.4182 0.4833 0.6227 0.4025 0.8102 0.0221 0.1728 0.271 0.1057 0.1533 0.0673 0.0209 0.1728 0.3636 0.1666 0.0789 0.3658 0.6482 0.4286 0. 0.3294 0.014 0.0696 0.0567] 2022-08-25 03:39:59 [INFO] [EVAL] The model with the best validation mIoU (0.3247) was saved at iter 150000. 
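For the [EVAL] blocks, the Class IoU / Class Precision / Class Recall vectors hold one entry per ADE20K class (150 classes, evaluated on the 2000 validation images), and the headline mIoU appears to be the plain mean of the Class IoU vector. Per class, IoU, precision and recall are tied together by IoU = TP / (TP + FP + FN), which can be rewritten as 1 / (1/P + 1/R - 1); class 0 of the iter-152000 evaluation above has P = 0.7596 and R = 0.832, giving 0.6586, exactly the logged Class IoU[0]. A short sketch of both checks (iou_from_pr is a name chosen here for illustration; only the first few IoU entries are repeated, the full vector is in the log above):

import numpy as np

def iou_from_pr(precision, recall):
    # IoU = TP / (TP + FP + FN), expressed through precision and recall
    return 1.0 / (1.0 / precision + 1.0 / recall - 1.0)

print(round(iou_from_pr(0.7596, 0.832), 4))        # 0.6586, matches Class IoU[0] above

class_iou = np.array([0.6586, 0.772, 0.925])       # first three of the 150 logged values
print(class_iou.mean())                            # over the full vector this lands near the logged mIoU of 0.3231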
2022-08-25 03:40:09 [INFO] [TRAIN] epoch: 121, iter: 152050/160000, loss: 0.6325, lr: 0.000060, batch_cost: 0.1858, reader_cost: 0.01968, ips: 43.0551 samples/sec | ETA 00:24:37 2022-08-25 03:40:18 [INFO] [TRAIN] epoch: 121, iter: 152100/160000, loss: 0.7065, lr: 0.000060, batch_cost: 0.1911, reader_cost: 0.00079, ips: 41.8665 samples/sec | ETA 00:25:09 2022-08-25 03:40:28 [INFO] [TRAIN] epoch: 121, iter: 152150/160000, loss: 0.7511, lr: 0.000059, batch_cost: 0.2046, reader_cost: 0.00101, ips: 39.1032 samples/sec | ETA 00:26:46 2022-08-25 03:40:37 [INFO] [TRAIN] epoch: 121, iter: 152200/160000, loss: 0.6740, lr: 0.000059, batch_cost: 0.1792, reader_cost: 0.00046, ips: 44.6470 samples/sec | ETA 00:23:17 2022-08-25 03:40:47 [INFO] [TRAIN] epoch: 121, iter: 152250/160000, loss: 0.6909, lr: 0.000059, batch_cost: 0.1934, reader_cost: 0.00053, ips: 41.3656 samples/sec | ETA 00:24:58 2022-08-25 03:40:57 [INFO] [TRAIN] epoch: 121, iter: 152300/160000, loss: 0.6769, lr: 0.000058, batch_cost: 0.1942, reader_cost: 0.00051, ips: 41.1952 samples/sec | ETA 00:24:55 2022-08-25 03:41:07 [INFO] [TRAIN] epoch: 121, iter: 152350/160000, loss: 0.6232, lr: 0.000058, batch_cost: 0.2026, reader_cost: 0.00090, ips: 39.4946 samples/sec | ETA 00:25:49 2022-08-25 03:41:16 [INFO] [TRAIN] epoch: 121, iter: 152400/160000, loss: 0.6785, lr: 0.000058, batch_cost: 0.1860, reader_cost: 0.00047, ips: 43.0023 samples/sec | ETA 00:23:33 2022-08-25 03:41:26 [INFO] [TRAIN] epoch: 121, iter: 152450/160000, loss: 0.6729, lr: 0.000057, batch_cost: 0.1866, reader_cost: 0.00046, ips: 42.8629 samples/sec | ETA 00:23:29 2022-08-25 03:41:36 [INFO] [TRAIN] epoch: 121, iter: 152500/160000, loss: 0.6261, lr: 0.000057, batch_cost: 0.2140, reader_cost: 0.00057, ips: 37.3790 samples/sec | ETA 00:26:45 2022-08-25 03:41:48 [INFO] [TRAIN] epoch: 121, iter: 152550/160000, loss: 0.6549, lr: 0.000056, batch_cost: 0.2391, reader_cost: 0.00053, ips: 33.4580 samples/sec | ETA 00:29:41 2022-08-25 03:41:59 [INFO] [TRAIN] epoch: 121, iter: 152600/160000, loss: 0.6447, lr: 0.000056, batch_cost: 0.2170, reader_cost: 0.01819, ips: 36.8658 samples/sec | ETA 00:26:45 2022-08-25 03:42:11 [INFO] [TRAIN] epoch: 121, iter: 152650/160000, loss: 0.6407, lr: 0.000056, batch_cost: 0.2399, reader_cost: 0.00052, ips: 33.3506 samples/sec | ETA 00:29:23 2022-08-25 03:42:24 [INFO] [TRAIN] epoch: 121, iter: 152700/160000, loss: 0.6675, lr: 0.000055, batch_cost: 0.2669, reader_cost: 0.00263, ips: 29.9792 samples/sec | ETA 00:32:28 2022-08-25 03:42:36 [INFO] [TRAIN] epoch: 121, iter: 152750/160000, loss: 0.6923, lr: 0.000055, batch_cost: 0.2362, reader_cost: 0.00921, ips: 33.8627 samples/sec | ETA 00:28:32 2022-08-25 03:42:48 [INFO] [TRAIN] epoch: 121, iter: 152800/160000, loss: 0.6429, lr: 0.000055, batch_cost: 0.2435, reader_cost: 0.00119, ips: 32.8551 samples/sec | ETA 00:29:13 2022-08-25 03:43:01 [INFO] [TRAIN] epoch: 122, iter: 152850/160000, loss: 0.7135, lr: 0.000054, batch_cost: 0.2426, reader_cost: 0.05347, ips: 32.9743 samples/sec | ETA 00:28:54 2022-08-25 03:43:11 [INFO] [TRAIN] epoch: 122, iter: 152900/160000, loss: 0.6455, lr: 0.000054, batch_cost: 0.2032, reader_cost: 0.00087, ips: 39.3736 samples/sec | ETA 00:24:02 2022-08-25 03:43:21 [INFO] [TRAIN] epoch: 122, iter: 152950/160000, loss: 0.6757, lr: 0.000053, batch_cost: 0.2044, reader_cost: 0.00344, ips: 39.1415 samples/sec | ETA 00:24:00 2022-08-25 03:43:30 [INFO] [TRAIN] epoch: 122, iter: 153000/160000, loss: 0.6651, lr: 0.000053, batch_cost: 0.1906, reader_cost: 0.00035, ips: 41.9730 samples/sec | ETA 
00:22:14 2022-08-25 03:43:30 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 176s - batch_cost: 0.1757 - reader cost: 7.8975e-04 2022-08-25 03:46:27 [INFO] [EVAL] #Images: 2000 mIoU: 0.3236 Acc: 0.7507 Kappa: 0.7315 Dice: 0.4520 2022-08-25 03:46:27 [INFO] [EVAL] Class IoU: [0.6599 0.7737 0.925 0.7085 0.6723 0.7439 0.7682 0.757 0.486 0.6226 0.4671 0.5264 0.6661 0.3065 0.2227 0.3872 0.4885 0.4242 0.5537 0.3632 0.7239 0.4143 0.5604 0.4584 0.3238 0.3815 0.3611 0.4073 0.3739 0.3339 0.2093 0.4041 0.279 0.2865 0.3061 0.3878 0.368 0.5152 0.2612 0.2685 0.0935 0.0747 0.2949 0.2248 0.2933 0.2592 0.1956 0.4296 0.7024 0.4958 0.481 0.2703 0.2203 0.2187 0.6198 0.4328 0.8485 0.3262 0.4839 0.17 0.0497 0.2362 0.2948 0.1233 0.3793 0.6707 0.2384 0.3672 0.089 0.3484 0.3824 0.4343 0.3756 0.2268 0.4145 0.2954 0.326 0.2283 0.1078 0.2745 0.6434 0.2949 0.242 0.0231 0.517 0.4912 0.0816 0.0664 0.3382 0.4122 0.4085 0.0053 0.2041 0.0804 0.0021 0.0096 0.1613 0.1428 0.2029 0.3514 0.086 0.0336 0.2168 0.4984 0.0018 0.5333 0.1834 0.518 0.0944 0.3848 0.0596 0.3227 0.0906 0.5367 0.7186 0.0051 0.3856 0.575 0.0425 0.2672 0.4487 0.0126 0.2372 0.1352 0.2504 0.2568 0.4232 0.3028 0.4708 0.2737 0.5275 0.0204 0.136 0.2695 0.1015 0.108 0.0578 0.0185 0.1223 0.3162 0.0861 0.0551 0.2713 0.5197 0.3165 0. 0.3003 0.0184 0.0652 0.0477] 2022-08-25 03:46:27 [INFO] [EVAL] Class Precision: [0.767 0.8371 0.9602 0.8107 0.7774 0.8602 0.8726 0.8236 0.608 0.7508 0.6816 0.6768 0.7574 0.4812 0.4719 0.5493 0.6329 0.6507 0.7207 0.5709 0.8121 0.5698 0.7138 0.5826 0.5314 0.5833 0.5336 0.6474 0.6219 0.4969 0.4239 0.5367 0.4823 0.4093 0.478 0.5371 0.6123 0.7428 0.5573 0.5215 0.1582 0.2138 0.4638 0.534 0.4314 0.4257 0.3702 0.65 0.7705 0.611 0.6221 0.3267 0.4105 0.6435 0.7138 0.5771 0.8919 0.6153 0.6578 0.3601 0.0854 0.4943 0.4391 0.6193 0.47 0.8012 0.4053 0.5478 0.2458 0.6357 0.5869 0.5893 0.6499 0.2771 0.5869 0.4419 0.5438 0.6056 0.7733 0.5724 0.7475 0.5954 0.7638 0.0843 0.6939 0.694 0.3663 0.4376 0.5373 0.5601 0.5451 0.0074 0.3372 0.3012 0.0147 0.038 0.6384 0.5003 0.4184 0.5642 0.6812 0.0601 0.684 0.6953 0.0265 0.7419 0.5852 0.9206 0.2481 0.5786 0.2431 0.5898 0.3997 0.7169 0.724 0.0788 0.6949 0.6041 0.1022 0.7198 0.7328 0.2772 0.6806 0.6846 0.5983 0.6736 0.8008 0.4658 0.7176 0.429 0.5999 0.5103 0.4111 0.7385 0.6366 0.2562 0.3109 0.1449 0.3535 0.7106 0.2024 0.1255 0.53 0.7004 0.542 0. 0.8842 0.3428 0.3736 0.6946] 2022-08-25 03:46:27 [INFO] [EVAL] Class Recall: [0.8253 0.9109 0.9619 0.8489 0.8325 0.8463 0.8653 0.9035 0.7079 0.7847 0.5975 0.7031 0.8469 0.4576 0.2966 0.5675 0.6817 0.5493 0.705 0.4995 0.8695 0.6029 0.7228 0.6826 0.4531 0.5245 0.5276 0.5234 0.4839 0.5044 0.2926 0.6206 0.3983 0.4884 0.4599 0.5825 0.4798 0.6271 0.3296 0.3562 0.1861 0.103 0.4474 0.2797 0.4781 0.3985 0.2931 0.5588 0.8883 0.7246 0.6796 0.6102 0.3222 0.2488 0.8247 0.6339 0.9457 0.4098 0.6467 0.2436 0.106 0.3115 0.4728 0.1335 0.6627 0.8046 0.3667 0.527 0.1223 0.4353 0.5233 0.6229 0.4708 0.5556 0.5852 0.4713 0.4487 0.2682 0.1114 0.3453 0.8221 0.3688 0.2616 0.0309 0.6697 0.627 0.095 0.0726 0.4772 0.6095 0.6197 0.0186 0.3407 0.0988 0.0024 0.0127 0.1776 0.1666 0.2825 0.4823 0.0896 0.0708 0.2409 0.6378 0.0019 0.6548 0.2108 0.5422 0.1322 0.5347 0.0732 0.4161 0.1048 0.6811 0.9898 0.0054 0.4642 0.9226 0.0678 0.2982 0.5365 0.0131 0.2669 0.1442 0.3009 0.2933 0.4729 0.4638 0.5779 0.4305 0.8138 0.0208 0.1688 0.298 0.1078 0.1573 0.0663 0.0208 0.1575 0.3629 0.1303 0.0893 0.3572 0.6682 0.4321 0. 
0.3126 0.0191 0.0732 0.0487] 2022-08-25 03:46:27 [INFO] [EVAL] The model with the best validation mIoU (0.3247) was saved at iter 150000. 2022-08-25 03:46:37 [INFO] [TRAIN] epoch: 122, iter: 153050/160000, loss: 0.6437, lr: 0.000053, batch_cost: 0.2089, reader_cost: 0.00397, ips: 38.2870 samples/sec | ETA 00:24:12 2022-08-25 03:46:46 [INFO] [TRAIN] epoch: 122, iter: 153100/160000, loss: 0.6765, lr: 0.000052, batch_cost: 0.1778, reader_cost: 0.00087, ips: 44.9851 samples/sec | ETA 00:20:27 2022-08-25 03:46:54 [INFO] [TRAIN] epoch: 122, iter: 153150/160000, loss: 0.6796, lr: 0.000052, batch_cost: 0.1684, reader_cost: 0.00672, ips: 47.5084 samples/sec | ETA 00:19:13 2022-08-25 03:47:04 [INFO] [TRAIN] epoch: 122, iter: 153200/160000, loss: 0.6574, lr: 0.000051, batch_cost: 0.1926, reader_cost: 0.01262, ips: 41.5349 samples/sec | ETA 00:21:49 2022-08-25 03:47:13 [INFO] [TRAIN] epoch: 122, iter: 153250/160000, loss: 0.6269, lr: 0.000051, batch_cost: 0.1799, reader_cost: 0.00032, ips: 44.4741 samples/sec | ETA 00:20:14 2022-08-25 03:47:23 [INFO] [TRAIN] epoch: 122, iter: 153300/160000, loss: 0.6873, lr: 0.000051, batch_cost: 0.2056, reader_cost: 0.00039, ips: 38.9045 samples/sec | ETA 00:22:57 2022-08-25 03:47:33 [INFO] [TRAIN] epoch: 122, iter: 153350/160000, loss: 0.6573, lr: 0.000050, batch_cost: 0.1966, reader_cost: 0.00043, ips: 40.6854 samples/sec | ETA 00:21:47 2022-08-25 03:47:44 [INFO] [TRAIN] epoch: 122, iter: 153400/160000, loss: 0.6439, lr: 0.000050, batch_cost: 0.2207, reader_cost: 0.00062, ips: 36.2511 samples/sec | ETA 00:24:16 2022-08-25 03:47:54 [INFO] [TRAIN] epoch: 122, iter: 153450/160000, loss: 0.6463, lr: 0.000050, batch_cost: 0.2043, reader_cost: 0.00034, ips: 39.1607 samples/sec | ETA 00:22:18 2022-08-25 03:48:06 [INFO] [TRAIN] epoch: 122, iter: 153500/160000, loss: 0.6335, lr: 0.000049, batch_cost: 0.2372, reader_cost: 0.00121, ips: 33.7269 samples/sec | ETA 00:25:41 2022-08-25 03:48:18 [INFO] [TRAIN] epoch: 122, iter: 153550/160000, loss: 0.6673, lr: 0.000049, batch_cost: 0.2395, reader_cost: 0.00079, ips: 33.4070 samples/sec | ETA 00:25:44 2022-08-25 03:48:30 [INFO] [TRAIN] epoch: 122, iter: 153600/160000, loss: 0.6898, lr: 0.000048, batch_cost: 0.2354, reader_cost: 0.00102, ips: 33.9775 samples/sec | ETA 00:25:06 2022-08-25 03:48:43 [INFO] [TRAIN] epoch: 122, iter: 153650/160000, loss: 0.6715, lr: 0.000048, batch_cost: 0.2624, reader_cost: 0.00531, ips: 30.4826 samples/sec | ETA 00:27:46 2022-08-25 03:48:55 [INFO] [TRAIN] epoch: 122, iter: 153700/160000, loss: 0.6530, lr: 0.000048, batch_cost: 0.2296, reader_cost: 0.00323, ips: 34.8453 samples/sec | ETA 00:24:06 2022-08-25 03:49:06 [INFO] [TRAIN] epoch: 122, iter: 153750/160000, loss: 0.6481, lr: 0.000047, batch_cost: 0.2314, reader_cost: 0.00064, ips: 34.5686 samples/sec | ETA 00:24:06 2022-08-25 03:49:17 [INFO] [TRAIN] epoch: 122, iter: 153800/160000, loss: 0.6695, lr: 0.000047, batch_cost: 0.2223, reader_cost: 0.01754, ips: 35.9893 samples/sec | ETA 00:22:58 2022-08-25 03:49:27 [INFO] [TRAIN] epoch: 122, iter: 153850/160000, loss: 0.6529, lr: 0.000047, batch_cost: 0.1845, reader_cost: 0.02104, ips: 43.3600 samples/sec | ETA 00:18:54 2022-08-25 03:49:36 [INFO] [TRAIN] epoch: 122, iter: 153900/160000, loss: 0.7393, lr: 0.000046, batch_cost: 0.1969, reader_cost: 0.00033, ips: 40.6326 samples/sec | ETA 00:20:01 2022-08-25 03:49:46 [INFO] [TRAIN] epoch: 122, iter: 153950/160000, loss: 0.6328, lr: 0.000046, batch_cost: 0.2019, reader_cost: 0.00035, ips: 39.6213 samples/sec | ETA 00:20:21 2022-08-25 03:49:56 [INFO] [TRAIN] 
epoch: 122, iter: 154000/160000, loss: 0.6755, lr: 0.000045, batch_cost: 0.1969, reader_cost: 0.00061, ips: 40.6321 samples/sec | ETA 00:19:41 2022-08-25 03:49:56 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 173s - batch_cost: 0.1727 - reader cost: 7.4143e-04 2022-08-25 03:52:49 [INFO] [EVAL] #Images: 2000 mIoU: 0.3223 Acc: 0.7492 Kappa: 0.7299 Dice: 0.4511 2022-08-25 03:52:49 [INFO] [EVAL] Class IoU: [0.6593 0.7746 0.9241 0.7072 0.6683 0.7433 0.7635 0.7574 0.4889 0.6153 0.4695 0.53 0.6627 0.2975 0.2214 0.3826 0.4879 0.4184 0.5593 0.3668 0.726 0.3994 0.5497 0.4603 0.3254 0.3663 0.3619 0.4026 0.3689 0.3345 0.2114 0.3882 0.2701 0.291 0.3249 0.3935 0.3683 0.4988 0.2626 0.2636 0.088 0.0772 0.299 0.2329 0.2902 0.2605 0.2354 0.4298 0.6932 0.4947 0.4705 0.2534 0.2255 0.1955 0.612 0.4346 0.8417 0.3284 0.4953 0.1824 0.0393 0.1754 0.3035 0.1301 0.3925 0.6663 0.243 0.359 0.1004 0.3484 0.3721 0.4362 0.3701 0.23 0.4162 0.2967 0.3308 0.2353 0.1109 0.2607 0.6294 0.2931 0.2678 0.0191 0.5099 0.4883 0.0792 0.066 0.3193 0.4357 0.4004 0.0038 0.1979 0.0834 0.0009 0.0067 0.1664 0.138 0.1985 0.3333 0.1258 0.0264 0.209 0.6017 0.0017 0.5602 0.2145 0.4667 0.0882 0.3841 0.0603 0.2656 0.0869 0.5116 0.6741 0.0079 0.397 0.557 0.044 0.3574 0.3859 0.0112 0.194 0.1461 0.2453 0.252 0.4075 0.294 0.4902 0.2719 0.5415 0.026 0.1546 0.2564 0.1059 0.1051 0.0609 0.0205 0.1329 0.3171 0.0802 0.0701 0.2707 0.4722 0.2892 0. 0.2967 0.0144 0.068 0.0659] 2022-08-25 03:52:49 [INFO] [EVAL] Class Precision: [0.7659 0.8377 0.9622 0.8128 0.7668 0.8654 0.8856 0.8285 0.6085 0.7497 0.6743 0.6512 0.7601 0.4821 0.4683 0.5508 0.6393 0.6504 0.7256 0.5632 0.8137 0.5693 0.6905 0.5975 0.5117 0.5833 0.5088 0.6517 0.631 0.4887 0.4099 0.4997 0.4508 0.4051 0.5191 0.5338 0.6105 0.7513 0.5067 0.5252 0.1383 0.2311 0.503 0.5317 0.4231 0.4347 0.3997 0.6672 0.7391 0.6165 0.6132 0.3026 0.4204 0.6369 0.7047 0.5997 0.8761 0.6183 0.654 0.3394 0.066 0.4044 0.42 0.6374 0.5012 0.7937 0.3812 0.571 0.2683 0.6265 0.6098 0.6166 0.6389 0.2956 0.6155 0.4738 0.5269 0.5721 0.6164 0.541 0.7261 0.5669 0.7624 0.0703 0.7076 0.6757 0.344 0.4425 0.4779 0.6329 0.5414 0.0052 0.3036 0.3039 0.0068 0.032 0.5675 0.6006 0.4154 0.5206 0.6101 0.0503 0.6852 0.7478 0.0306 0.802 0.4932 0.8955 0.2461 0.5575 0.2241 0.4235 0.4316 0.6962 0.6778 0.1155 0.6935 0.5726 0.127 0.7129 0.7842 0.2641 0.7565 0.649 0.6375 0.674 0.7697 0.4319 0.608 0.4656 0.6176 0.6086 0.4636 0.7512 0.6646 0.2527 0.2997 0.2325 0.3645 0.7068 0.1667 0.1647 0.5613 0.7482 0.4452 0. 
0.8953 0.2938 0.4235 0.7095] 2022-08-25 03:52:49 [INFO] [EVAL] Class Recall: [0.8257 0.9115 0.9589 0.8448 0.8387 0.8405 0.8471 0.8982 0.7132 0.7743 0.6072 0.7402 0.8379 0.4372 0.2958 0.5561 0.6732 0.5398 0.7093 0.5126 0.8707 0.5723 0.7294 0.6671 0.4719 0.4962 0.5564 0.513 0.4704 0.5146 0.3038 0.6349 0.4026 0.5083 0.4648 0.5995 0.4814 0.5974 0.3528 0.3461 0.1948 0.1038 0.4243 0.293 0.4801 0.394 0.3642 0.5471 0.9177 0.7147 0.6691 0.609 0.3272 0.2201 0.8231 0.6122 0.9555 0.4119 0.6712 0.2829 0.0885 0.2364 0.5226 0.1405 0.644 0.8058 0.4014 0.4917 0.1383 0.4397 0.4884 0.5986 0.468 0.5087 0.5625 0.4424 0.4705 0.2855 0.1191 0.3347 0.8253 0.3777 0.2922 0.0255 0.646 0.6378 0.0933 0.072 0.4904 0.5831 0.606 0.0135 0.3624 0.1031 0.001 0.0085 0.1906 0.1519 0.2754 0.481 0.1368 0.0525 0.2312 0.7549 0.0018 0.6501 0.2751 0.4936 0.1209 0.5524 0.0762 0.4161 0.0981 0.6587 0.992 0.0084 0.4814 0.9536 0.0632 0.4175 0.4317 0.0116 0.207 0.1587 0.2851 0.287 0.4641 0.4794 0.7168 0.3953 0.8146 0.0264 0.1883 0.2802 0.1119 0.1525 0.0711 0.022 0.173 0.3651 0.1338 0.1088 0.3433 0.5615 0.4521 0. 0.3074 0.0149 0.0749 0.0678] 2022-08-25 03:52:49 [INFO] [EVAL] The model with the best validation mIoU (0.3247) was saved at iter 150000. 2022-08-25 03:53:00 [INFO] [TRAIN] epoch: 122, iter: 154050/160000, loss: 0.6715, lr: 0.000045, batch_cost: 0.2123, reader_cost: 0.00480, ips: 37.6752 samples/sec | ETA 00:21:03 2022-08-25 03:53:12 [INFO] [TRAIN] epoch: 123, iter: 154100/160000, loss: 0.6630, lr: 0.000045, batch_cost: 0.2352, reader_cost: 0.03021, ips: 34.0191 samples/sec | ETA 00:23:07 2022-08-25 03:53:22 [INFO] [TRAIN] epoch: 123, iter: 154150/160000, loss: 0.6470, lr: 0.000044, batch_cost: 0.2002, reader_cost: 0.00046, ips: 39.9669 samples/sec | ETA 00:19:30 2022-08-25 03:53:31 [INFO] [TRAIN] epoch: 123, iter: 154200/160000, loss: 0.6536, lr: 0.000044, batch_cost: 0.1862, reader_cost: 0.00300, ips: 42.9696 samples/sec | ETA 00:17:59 2022-08-25 03:53:42 [INFO] [TRAIN] epoch: 123, iter: 154250/160000, loss: 0.6453, lr: 0.000044, batch_cost: 0.2092, reader_cost: 0.00092, ips: 38.2462 samples/sec | ETA 00:20:02 2022-08-25 03:53:53 [INFO] [TRAIN] epoch: 123, iter: 154300/160000, loss: 0.6772, lr: 0.000043, batch_cost: 0.2193, reader_cost: 0.00068, ips: 36.4877 samples/sec | ETA 00:20:49 2022-08-25 03:54:03 [INFO] [TRAIN] epoch: 123, iter: 154350/160000, loss: 0.6770, lr: 0.000043, batch_cost: 0.2078, reader_cost: 0.00063, ips: 38.4898 samples/sec | ETA 00:19:34 2022-08-25 03:54:12 [INFO] [TRAIN] epoch: 123, iter: 154400/160000, loss: 0.6565, lr: 0.000042, batch_cost: 0.1902, reader_cost: 0.00045, ips: 42.0595 samples/sec | ETA 00:17:45 2022-08-25 03:54:23 [INFO] [TRAIN] epoch: 123, iter: 154450/160000, loss: 0.6857, lr: 0.000042, batch_cost: 0.2136, reader_cost: 0.00061, ips: 37.4555 samples/sec | ETA 00:19:45 2022-08-25 03:54:34 [INFO] [TRAIN] epoch: 123, iter: 154500/160000, loss: 0.6443, lr: 0.000042, batch_cost: 0.2246, reader_cost: 0.00088, ips: 35.6205 samples/sec | ETA 00:20:35 2022-08-25 03:54:46 [INFO] [TRAIN] epoch: 123, iter: 154550/160000, loss: 0.6973, lr: 0.000041, batch_cost: 0.2272, reader_cost: 0.01516, ips: 35.2116 samples/sec | ETA 00:20:38 2022-08-25 03:54:58 [INFO] [TRAIN] epoch: 123, iter: 154600/160000, loss: 0.6694, lr: 0.000041, batch_cost: 0.2460, reader_cost: 0.00454, ips: 32.5246 samples/sec | ETA 00:22:08 2022-08-25 03:55:09 [INFO] [TRAIN] epoch: 123, iter: 154650/160000, loss: 0.6531, lr: 0.000041, batch_cost: 0.2166, reader_cost: 0.00053, ips: 36.9261 samples/sec | ETA 00:19:19 2022-08-25 03:55:20 
[INFO] [TRAIN] epoch: 123, iter: 154700/160000, loss: 0.6433, lr: 0.000040, batch_cost: 0.2209, reader_cost: 0.00699, ips: 36.2152 samples/sec | ETA 00:19:30 2022-08-25 03:55:30 [INFO] [TRAIN] epoch: 123, iter: 154750/160000, loss: 0.6505, lr: 0.000040, batch_cost: 0.2010, reader_cost: 0.00163, ips: 39.8078 samples/sec | ETA 00:17:35 2022-08-25 03:55:40 [INFO] [TRAIN] epoch: 123, iter: 154800/160000, loss: 0.6735, lr: 0.000039, batch_cost: 0.2034, reader_cost: 0.00460, ips: 39.3351 samples/sec | ETA 00:17:37 2022-08-25 03:55:49 [INFO] [TRAIN] epoch: 123, iter: 154850/160000, loss: 0.6613, lr: 0.000039, batch_cost: 0.1779, reader_cost: 0.00424, ips: 44.9758 samples/sec | ETA 00:15:16 2022-08-25 03:55:58 [INFO] [TRAIN] epoch: 123, iter: 154900/160000, loss: 0.6781, lr: 0.000039, batch_cost: 0.1751, reader_cost: 0.00372, ips: 45.6783 samples/sec | ETA 00:14:53 2022-08-25 03:56:07 [INFO] [TRAIN] epoch: 123, iter: 154950/160000, loss: 0.6536, lr: 0.000038, batch_cost: 0.1836, reader_cost: 0.00043, ips: 43.5790 samples/sec | ETA 00:15:27 2022-08-25 03:56:18 [INFO] [TRAIN] epoch: 123, iter: 155000/160000, loss: 0.6640, lr: 0.000038, batch_cost: 0.2115, reader_cost: 0.00057, ips: 37.8268 samples/sec | ETA 00:17:37 2022-08-25 03:56:18 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 165s - batch_cost: 0.1648 - reader cost: 7.7004e-04 2022-08-25 03:59:03 [INFO] [EVAL] #Images: 2000 mIoU: 0.3230 Acc: 0.7504 Kappa: 0.7312 Dice: 0.4509 2022-08-25 03:59:03 [INFO] [EVAL] Class IoU: [0.6602 0.7697 0.9255 0.7069 0.6693 0.7483 0.7654 0.7629 0.4882 0.6265 0.4639 0.5267 0.6609 0.2961 0.2244 0.3855 0.4878 0.4194 0.5555 0.3667 0.7217 0.4144 0.5647 0.4643 0.3289 0.3733 0.3908 0.4044 0.3609 0.3553 0.2216 0.3957 0.2695 0.2883 0.3204 0.3973 0.3695 0.5169 0.2564 0.2752 0.0987 0.0712 0.2972 0.2274 0.2961 0.2504 0.1962 0.4254 0.7017 0.5017 0.4844 0.253 0.2176 0.2136 0.6164 0.4508 0.8449 0.3153 0.5034 0.1857 0.0551 0.1753 0.3067 0.1272 0.3906 0.6547 0.2297 0.3598 0.0944 0.3441 0.3773 0.4357 0.3688 0.232 0.4156 0.3001 0.3315 0.2257 0.1026 0.2216 0.64 0.29 0.2677 0.0183 0.4941 0.4918 0.0823 0.0673 0.3328 0.4105 0.4161 0.0041 0.197 0.0784 0.0036 0.008 0.1647 0.1383 0.1943 0.3394 0.124 0.0251 0.2146 0.6255 0.0011 0.5223 0.1902 0.485 0.0989 0.3783 0.072 0.2252 0.0888 0.5364 0.7308 0.0034 0.3675 0.5829 0.0341 0.3035 0.4581 0.0111 0.192 0.1265 0.248 0.2382 0.4116 0.2895 0.4829 0.2818 0.5414 0.024 0.1158 0.2604 0.0985 0.1068 0.0586 0.0185 0.1318 0.3224 0.1019 0.0533 0.2585 0.5092 0.3069 0. 
0.2959 0.0161 0.0812 0.0573] 2022-08-25 03:59:03 [INFO] [EVAL] Class Precision: [0.7646 0.8406 0.9623 0.8026 0.7613 0.8623 0.87 0.8369 0.6177 0.7467 0.6757 0.6688 0.7479 0.4757 0.4693 0.5513 0.6373 0.6668 0.7206 0.5752 0.8107 0.5913 0.738 0.5875 0.526 0.5566 0.5546 0.6701 0.6383 0.5389 0.4087 0.5071 0.4441 0.4156 0.5085 0.5448 0.6091 0.7448 0.5329 0.5177 0.1576 0.2097 0.4979 0.5306 0.4097 0.424 0.3462 0.6579 0.7846 0.6279 0.6332 0.3026 0.389 0.6098 0.7061 0.6318 0.8826 0.6317 0.6709 0.3589 0.093 0.4159 0.4242 0.6268 0.5096 0.7643 0.4081 0.5681 0.2576 0.6164 0.5926 0.5683 0.6506 0.299 0.6094 0.5043 0.5762 0.5822 0.6571 0.5472 0.7406 0.5993 0.7535 0.0568 0.6918 0.6921 0.4142 0.4387 0.5543 0.5757 0.565 0.0056 0.3304 0.2959 0.0237 0.0364 0.6166 0.5498 0.4609 0.6062 0.5819 0.047 0.6647 0.8347 0.0134 0.7563 0.5715 0.8695 0.2519 0.567 0.267 0.3292 0.3985 0.7289 0.7366 0.0547 0.6562 0.6023 0.0964 0.7281 0.7851 0.2623 0.7142 0.6777 0.6296 0.6752 0.7807 0.4296 0.6207 0.4856 0.63 0.5953 0.4982 0.7252 0.6631 0.2676 0.3208 0.1521 0.3689 0.7022 0.2493 0.138 0.5577 0.7133 0.5077 0. 0.8999 0.3241 0.4777 0.684 ] 2022-08-25 03:59:03 [INFO] [EVAL] Class Recall: [0.8286 0.9012 0.9603 0.8557 0.8471 0.8499 0.8642 0.8961 0.6997 0.7956 0.5967 0.7127 0.8503 0.4394 0.3007 0.5617 0.6754 0.5306 0.7081 0.503 0.868 0.5807 0.7063 0.6889 0.4674 0.5312 0.5695 0.5049 0.4536 0.5105 0.3262 0.6431 0.4068 0.4849 0.4642 0.5947 0.4844 0.6281 0.3308 0.3701 0.2088 0.0974 0.4243 0.2846 0.5162 0.3795 0.3116 0.5463 0.8692 0.7139 0.6733 0.6067 0.3306 0.2475 0.8291 0.6115 0.9519 0.3863 0.6685 0.2778 0.1192 0.2325 0.5254 0.1376 0.626 0.8204 0.3444 0.4954 0.1297 0.4379 0.5095 0.6513 0.4599 0.5086 0.5666 0.4256 0.4383 0.2694 0.1084 0.2714 0.8248 0.3597 0.2934 0.0264 0.6335 0.6296 0.0932 0.0737 0.4545 0.5885 0.6122 0.0149 0.3281 0.0964 0.0042 0.0101 0.1835 0.156 0.2514 0.4353 0.1361 0.051 0.2407 0.7139 0.0012 0.6279 0.2218 0.5231 0.1401 0.532 0.0897 0.4161 0.1025 0.6701 0.9893 0.0036 0.4551 0.9476 0.05 0.3423 0.5238 0.0115 0.208 0.1346 0.2903 0.269 0.4654 0.4703 0.6851 0.4018 0.7938 0.0244 0.1311 0.2889 0.1037 0.1509 0.0669 0.0206 0.1702 0.3735 0.1471 0.08 0.3252 0.6403 0.4369 0. 0.306 0.0166 0.0891 0.0589] 2022-08-25 03:59:03 [INFO] [EVAL] The model with the best validation mIoU (0.3247) was saved at iter 150000. 
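The trailing line of each [EVAL] block tracks a running best checkpoint: validation mIoU peaked at 0.3247 at iter 150000, and none of the evaluations since (0.3243, 0.3231, 0.3236, 0.3223, 0.3230) has beaten it, so the reported best iteration stays at 150000. A minimal sketch of that bookkeeping, assuming the trainer simply keeps the checkpoint with the highest validation mIoU seen so far (a real run would also persist the weights at each update):

best_miou, best_iter = 0.0, None
evals = [(150000, 0.3247), (151000, 0.3243), (152000, 0.3231),
         (153000, 0.3236), (154000, 0.3223), (155000, 0.3230)]
for it, miou in evals:
    if miou > best_miou:                    # strictly better; otherwise the earlier iter is kept
        best_miou, best_iter = miou, it     # a real trainer would save the weights here
    print(f"[EVAL] The model with the best validation mIoU ({best_miou:.4f}) "
          f"was saved at iter {best_iter}.")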
2022-08-25 03:59:13 [INFO] [TRAIN] epoch: 123, iter: 155050/160000, loss: 0.6504, lr: 0.000037, batch_cost: 0.2114, reader_cost: 0.00476, ips: 37.8345 samples/sec | ETA 00:17:26 2022-08-25 03:59:24 [INFO] [TRAIN] epoch: 123, iter: 155100/160000, loss: 0.6655, lr: 0.000037, batch_cost: 0.2127, reader_cost: 0.00729, ips: 37.6107 samples/sec | ETA 00:17:22 2022-08-25 03:59:34 [INFO] [TRAIN] epoch: 123, iter: 155150/160000, loss: 0.6380, lr: 0.000037, batch_cost: 0.1949, reader_cost: 0.00073, ips: 41.0377 samples/sec | ETA 00:15:45 2022-08-25 03:59:44 [INFO] [TRAIN] epoch: 123, iter: 155200/160000, loss: 0.6911, lr: 0.000036, batch_cost: 0.2004, reader_cost: 0.00034, ips: 39.9297 samples/sec | ETA 00:16:01 2022-08-25 03:59:55 [INFO] [TRAIN] epoch: 123, iter: 155250/160000, loss: 0.6678, lr: 0.000036, batch_cost: 0.2331, reader_cost: 0.00032, ips: 34.3255 samples/sec | ETA 00:18:27 2022-08-25 04:00:05 [INFO] [TRAIN] epoch: 123, iter: 155300/160000, loss: 0.6638, lr: 0.000036, batch_cost: 0.1808, reader_cost: 0.00167, ips: 44.2520 samples/sec | ETA 00:14:09 2022-08-25 04:00:16 [INFO] [TRAIN] epoch: 124, iter: 155350/160000, loss: 0.6666, lr: 0.000035, batch_cost: 0.2229, reader_cost: 0.03930, ips: 35.8876 samples/sec | ETA 00:17:16 2022-08-25 04:00:26 [INFO] [TRAIN] epoch: 124, iter: 155400/160000, loss: 0.6556, lr: 0.000035, batch_cost: 0.2083, reader_cost: 0.00613, ips: 38.4090 samples/sec | ETA 00:15:58 2022-08-25 04:00:36 [INFO] [TRAIN] epoch: 124, iter: 155450/160000, loss: 0.6524, lr: 0.000034, batch_cost: 0.1922, reader_cost: 0.00299, ips: 41.6239 samples/sec | ETA 00:14:34 2022-08-25 04:00:46 [INFO] [TRAIN] epoch: 124, iter: 155500/160000, loss: 0.6449, lr: 0.000034, batch_cost: 0.2070, reader_cost: 0.00041, ips: 38.6474 samples/sec | ETA 00:15:31 2022-08-25 04:00:55 [INFO] [TRAIN] epoch: 124, iter: 155550/160000, loss: 0.6458, lr: 0.000034, batch_cost: 0.1784, reader_cost: 0.00046, ips: 44.8369 samples/sec | ETA 00:13:13 2022-08-25 04:01:06 [INFO] [TRAIN] epoch: 124, iter: 155600/160000, loss: 0.6897, lr: 0.000033, batch_cost: 0.2253, reader_cost: 0.00035, ips: 35.5105 samples/sec | ETA 00:16:31 2022-08-25 04:01:18 [INFO] [TRAIN] epoch: 124, iter: 155650/160000, loss: 0.6602, lr: 0.000033, batch_cost: 0.2376, reader_cost: 0.00335, ips: 33.6666 samples/sec | ETA 00:17:13 2022-08-25 04:01:32 [INFO] [TRAIN] epoch: 124, iter: 155700/160000, loss: 0.6398, lr: 0.000033, batch_cost: 0.2694, reader_cost: 0.00049, ips: 29.6950 samples/sec | ETA 00:19:18 2022-08-25 04:01:44 [INFO] [TRAIN] epoch: 124, iter: 155750/160000, loss: 0.6860, lr: 0.000032, batch_cost: 0.2552, reader_cost: 0.01304, ips: 31.3522 samples/sec | ETA 00:18:04 2022-08-25 04:01:56 [INFO] [TRAIN] epoch: 124, iter: 155800/160000, loss: 0.6672, lr: 0.000032, batch_cost: 0.2320, reader_cost: 0.00045, ips: 34.4841 samples/sec | ETA 00:16:14 2022-08-25 04:02:06 [INFO] [TRAIN] epoch: 124, iter: 155850/160000, loss: 0.6545, lr: 0.000031, batch_cost: 0.1998, reader_cost: 0.00047, ips: 40.0482 samples/sec | ETA 00:13:49 2022-08-25 04:02:16 [INFO] [TRAIN] epoch: 124, iter: 155900/160000, loss: 0.6547, lr: 0.000031, batch_cost: 0.2110, reader_cost: 0.00053, ips: 37.9200 samples/sec | ETA 00:14:24 2022-08-25 04:02:27 [INFO] [TRAIN] epoch: 124, iter: 155950/160000, loss: 0.6520, lr: 0.000031, batch_cost: 0.2061, reader_cost: 0.00059, ips: 38.8097 samples/sec | ETA 00:13:54 2022-08-25 04:02:37 [INFO] [TRAIN] epoch: 124, iter: 156000/160000, loss: 0.6502, lr: 0.000030, batch_cost: 0.2015, reader_cost: 0.00070, ips: 39.7104 samples/sec | ETA 
00:13:25 2022-08-25 04:02:37 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1585 - reader cost: 5.5935e-04 2022-08-25 04:05:16 [INFO] [EVAL] #Images: 2000 mIoU: 0.3226 Acc: 0.7493 Kappa: 0.7301 Dice: 0.4511 2022-08-25 04:05:16 [INFO] [EVAL] Class IoU: [0.6597 0.7682 0.9247 0.7062 0.6719 0.7436 0.7639 0.7628 0.4853 0.6192 0.4665 0.5273 0.6581 0.3002 0.2232 0.3836 0.4886 0.4147 0.5557 0.3678 0.7222 0.4018 0.5635 0.4635 0.3199 0.3682 0.3759 0.4003 0.3556 0.3446 0.2204 0.3954 0.2729 0.2866 0.3534 0.407 0.3712 0.4912 0.2478 0.2671 0.0963 0.0696 0.2981 0.2214 0.2934 0.2568 0.2097 0.4308 0.7041 0.4908 0.4762 0.2628 0.2249 0.2123 0.6135 0.4357 0.8515 0.3211 0.4806 0.1683 0.0513 0.1801 0.2871 0.1418 0.3868 0.6623 0.2188 0.3553 0.0953 0.3449 0.3749 0.4347 0.3741 0.2288 0.4149 0.2959 0.3405 0.242 0.1313 0.247 0.6311 0.2987 0.2666 0.024 0.5165 0.4845 0.0849 0.068 0.3337 0.4107 0.413 0.0033 0.2087 0.0804 0.0142 0.008 0.1729 0.1354 0.1842 0.3399 0.1094 0.0356 0.2041 0.5833 0.0003 0.5121 0.1644 0.4943 0.0934 0.3899 0.0618 0.3305 0.0846 0.5216 0.7473 0.0044 0.3569 0.5872 0.0433 0.304 0.4429 0.0132 0.2303 0.125 0.2458 0.2217 0.4177 0.2786 0.4553 0.2747 0.5382 0.0244 0.1152 0.2701 0.104 0.1048 0.0644 0.0194 0.1287 0.3192 0.0901 0.0619 0.2536 0.4826 0.2975 0. 0.305 0.0144 0.0807 0.0616] 2022-08-25 04:05:16 [INFO] [EVAL] Class Precision: [0.7679 0.8342 0.9599 0.805 0.7679 0.8624 0.8787 0.8367 0.6047 0.7511 0.6809 0.661 0.7409 0.4817 0.4814 0.552 0.6438 0.6767 0.7236 0.5676 0.8075 0.5888 0.7279 0.5962 0.5199 0.5525 0.5242 0.6693 0.6345 0.5137 0.3938 0.5052 0.4583 0.4079 0.5106 0.548 0.5885 0.7393 0.5174 0.5399 0.1503 0.2327 0.4837 0.5292 0.4137 0.4362 0.3692 0.6459 0.7761 0.6033 0.6208 0.3156 0.4148 0.6316 0.6956 0.6013 0.8908 0.6238 0.6699 0.3297 0.0839 0.408 0.3783 0.6074 0.4923 0.8029 0.3628 0.5565 0.2408 0.6235 0.6061 0.5877 0.6292 0.2902 0.5933 0.4839 0.5328 0.6085 0.6578 0.5366 0.728 0.5955 0.7532 0.0726 0.6441 0.6979 0.3866 0.4241 0.5265 0.5598 0.5547 0.0045 0.3524 0.2915 0.0717 0.0339 0.5498 0.595 0.4778 0.5849 0.551 0.0645 0.7161 0.6965 0.0042 0.7282 0.521 0.9226 0.2551 0.5704 0.2668 0.6163 0.3903 0.7653 0.7533 0.1295 0.6504 0.6058 0.1192 0.7057 0.7865 0.2945 0.6801 0.7019 0.6383 0.6733 0.7594 0.3934 0.5632 0.4673 0.6145 0.5154 0.416 0.707 0.7062 0.2573 0.3108 0.1424 0.4032 0.7014 0.1809 0.1594 0.546 0.7256 0.4912 0. 0.8891 0.3968 0.4418 0.7147] 2022-08-25 04:05:16 [INFO] [EVAL] Class Recall: [0.824 0.9066 0.9618 0.8519 0.843 0.8437 0.8539 0.8963 0.7108 0.7791 0.597 0.7228 0.8548 0.4435 0.2939 0.5571 0.6696 0.5172 0.7055 0.5109 0.8723 0.5586 0.7139 0.6756 0.454 0.5248 0.5706 0.499 0.4472 0.5114 0.3337 0.6454 0.4028 0.4908 0.5343 0.6126 0.5013 0.5941 0.3223 0.3457 0.2116 0.0904 0.4373 0.2758 0.5023 0.3842 0.3267 0.564 0.8836 0.7248 0.6716 0.6108 0.3294 0.2423 0.8386 0.6128 0.9507 0.3982 0.6297 0.2559 0.1169 0.2438 0.5437 0.1561 0.6434 0.7908 0.3554 0.4957 0.1363 0.4357 0.4957 0.6254 0.4799 0.5196 0.5798 0.4323 0.4854 0.2866 0.141 0.3139 0.8259 0.3748 0.2922 0.0346 0.7228 0.6131 0.0981 0.0749 0.4768 0.6066 0.6178 0.0116 0.3385 0.0999 0.0174 0.0104 0.2014 0.1492 0.2307 0.4479 0.1202 0.0738 0.222 0.7821 0.0003 0.6332 0.1937 0.5157 0.1284 0.552 0.0744 0.4161 0.0975 0.6209 0.9894 0.0045 0.4416 0.9503 0.0636 0.3481 0.5035 0.0136 0.2582 0.1321 0.2856 0.2484 0.4815 0.4885 0.7037 0.4 0.8125 0.025 0.1374 0.3041 0.1087 0.1503 0.0751 0.0219 0.159 0.3694 0.1522 0.0919 0.3213 0.5904 0.4301 0. 
0.3171 0.0147 0.0899 0.0632] 2022-08-25 04:05:16 [INFO] [EVAL] The model with the best validation mIoU (0.3247) was saved at iter 150000. 2022-08-25 04:05:24 [INFO] [TRAIN] epoch: 124, iter: 156050/160000, loss: 0.6875, lr: 0.000030, batch_cost: 0.1656, reader_cost: 0.00403, ips: 48.2964 samples/sec | ETA 00:10:54 2022-08-25 04:05:32 [INFO] [TRAIN] epoch: 124, iter: 156100/160000, loss: 0.7023, lr: 0.000030, batch_cost: 0.1520, reader_cost: 0.00140, ips: 52.6297 samples/sec | ETA 00:09:52 2022-08-25 04:05:39 [INFO] [TRAIN] epoch: 124, iter: 156150/160000, loss: 0.6489, lr: 0.000029, batch_cost: 0.1403, reader_cost: 0.00044, ips: 57.0101 samples/sec | ETA 00:09:00 2022-08-25 04:05:46 [INFO] [TRAIN] epoch: 124, iter: 156200/160000, loss: 0.6790, lr: 0.000029, batch_cost: 0.1447, reader_cost: 0.00048, ips: 55.2994 samples/sec | ETA 00:09:09 2022-08-25 04:05:53 [INFO] [TRAIN] epoch: 124, iter: 156250/160000, loss: 0.6589, lr: 0.000028, batch_cost: 0.1474, reader_cost: 0.00059, ips: 54.2786 samples/sec | ETA 00:09:12 2022-08-25 04:06:02 [INFO] [TRAIN] epoch: 124, iter: 156300/160000, loss: 0.7132, lr: 0.000028, batch_cost: 0.1833, reader_cost: 0.00054, ips: 43.6425 samples/sec | ETA 00:11:18 2022-08-25 04:06:10 [INFO] [TRAIN] epoch: 124, iter: 156350/160000, loss: 0.6759, lr: 0.000028, batch_cost: 0.1595, reader_cost: 0.00082, ips: 50.1724 samples/sec | ETA 00:09:41 2022-08-25 04:06:20 [INFO] [TRAIN] epoch: 124, iter: 156400/160000, loss: 0.6655, lr: 0.000027, batch_cost: 0.1985, reader_cost: 0.00031, ips: 40.2928 samples/sec | ETA 00:11:54 2022-08-25 04:06:28 [INFO] [TRAIN] epoch: 124, iter: 156450/160000, loss: 0.6528, lr: 0.000027, batch_cost: 0.1612, reader_cost: 0.00098, ips: 49.6326 samples/sec | ETA 00:09:32 2022-08-25 04:06:36 [INFO] [TRAIN] epoch: 124, iter: 156500/160000, loss: 0.6817, lr: 0.000027, batch_cost: 0.1526, reader_cost: 0.00076, ips: 52.4353 samples/sec | ETA 00:08:53 2022-08-25 04:06:44 [INFO] [TRAIN] epoch: 124, iter: 156550/160000, loss: 0.6118, lr: 0.000026, batch_cost: 0.1655, reader_cost: 0.00071, ips: 48.3419 samples/sec | ETA 00:09:30 2022-08-25 04:06:53 [INFO] [TRAIN] epoch: 124, iter: 156600/160000, loss: 0.6365, lr: 0.000026, batch_cost: 0.1820, reader_cost: 0.00056, ips: 43.9575 samples/sec | ETA 00:10:18 2022-08-25 04:07:05 [INFO] [TRAIN] epoch: 125, iter: 156650/160000, loss: 0.6473, lr: 0.000025, batch_cost: 0.2383, reader_cost: 0.02759, ips: 33.5680 samples/sec | ETA 00:13:18 2022-08-25 04:07:14 [INFO] [TRAIN] epoch: 125, iter: 156700/160000, loss: 0.6457, lr: 0.000025, batch_cost: 0.1709, reader_cost: 0.00070, ips: 46.8041 samples/sec | ETA 00:09:24 2022-08-25 04:07:24 [INFO] [TRAIN] epoch: 125, iter: 156750/160000, loss: 0.6641, lr: 0.000025, batch_cost: 0.2023, reader_cost: 0.00035, ips: 39.5360 samples/sec | ETA 00:10:57 2022-08-25 04:07:34 [INFO] [TRAIN] epoch: 125, iter: 156800/160000, loss: 0.6712, lr: 0.000024, batch_cost: 0.1922, reader_cost: 0.00067, ips: 41.6211 samples/sec | ETA 00:10:15 2022-08-25 04:07:42 [INFO] [TRAIN] epoch: 125, iter: 156850/160000, loss: 0.6666, lr: 0.000024, batch_cost: 0.1753, reader_cost: 0.00246, ips: 45.6232 samples/sec | ETA 00:09:12 2022-08-25 04:07:52 [INFO] [TRAIN] epoch: 125, iter: 156900/160000, loss: 0.6774, lr: 0.000023, batch_cost: 0.1896, reader_cost: 0.00054, ips: 42.1833 samples/sec | ETA 00:09:47 2022-08-25 04:08:02 [INFO] [TRAIN] epoch: 125, iter: 156950/160000, loss: 0.6618, lr: 0.000023, batch_cost: 0.2016, reader_cost: 0.00069, ips: 39.6804 samples/sec | ETA 00:10:14 2022-08-25 04:08:12 [INFO] [TRAIN] 
epoch: 125, iter: 157000/160000, loss: 0.6869, lr: 0.000023, batch_cost: 0.1938, reader_cost: 0.00032, ips: 41.2729 samples/sec | ETA 00:09:41 2022-08-25 04:08:12 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 159s - batch_cost: 0.1594 - reader cost: 6.9498e-04 2022-08-25 04:10:51 [INFO] [EVAL] #Images: 2000 mIoU: 0.3249 Acc: 0.7502 Kappa: 0.7310 Dice: 0.4537 2022-08-25 04:10:51 [INFO] [EVAL] Class IoU: [0.6592 0.771 0.9243 0.7066 0.6691 0.7453 0.7628 0.7588 0.4898 0.6223 0.4684 0.5305 0.6597 0.2877 0.2191 0.3847 0.4837 0.4225 0.5567 0.3711 0.7247 0.412 0.5576 0.4624 0.3259 0.3643 0.388 0.4044 0.3691 0.3413 0.2187 0.4026 0.273 0.2953 0.3555 0.4 0.3722 0.5046 0.2555 0.2724 0.0952 0.0693 0.3009 0.222 0.2996 0.2601 0.1807 0.4309 0.7005 0.4968 0.4774 0.2812 0.2278 0.2205 0.6101 0.4289 0.8385 0.3154 0.4745 0.1722 0.0563 0.1765 0.2938 0.1378 0.3865 0.6628 0.2381 0.3582 0.0992 0.3422 0.3848 0.4425 0.3759 0.2255 0.4183 0.3041 0.3063 0.2286 0.1138 0.2696 0.6511 0.2961 0.2711 0.0135 0.5212 0.4821 0.0817 0.0719 0.3168 0.4415 0.404 0.0052 0.1896 0.0671 0.0075 0.0084 0.1826 0.1412 0.193 0.3455 0.1196 0.0337 0.2062 0.6032 0.0054 0.5326 0.1905 0.5033 0.1082 0.3653 0.0541 0.3438 0.0894 0.5199 0.7471 0.0038 0.3853 0.5868 0.0638 0.2911 0.4373 0.0104 0.202 0.1448 0.2469 0.2553 0.4244 0.2871 0.4565 0.2786 0.5558 0.0216 0.1274 0.2708 0.1148 0.1118 0.0691 0.019 0.1281 0.3257 0.0941 0.0745 0.2561 0.4984 0.3056 0. 0.2932 0.0129 0.0812 0.0649] 2022-08-25 04:10:51 [INFO] [EVAL] Class Precision: [0.7669 0.8329 0.9614 0.8107 0.7624 0.8605 0.8839 0.8302 0.6092 0.7397 0.6755 0.6637 0.7437 0.4996 0.4837 0.5598 0.6251 0.6476 0.721 0.5655 0.8148 0.5905 0.7143 0.5943 0.5084 0.5757 0.5456 0.6408 0.6234 0.5108 0.4033 0.5159 0.4834 0.4351 0.5065 0.5181 0.5968 0.7554 0.5413 0.5139 0.1468 0.2339 0.489 0.5522 0.4197 0.4278 0.3323 0.6595 0.7971 0.6148 0.6245 0.3477 0.3986 0.6542 0.7066 0.6005 0.8746 0.6245 0.6994 0.3473 0.0924 0.4129 0.3953 0.6442 0.4828 0.8113 0.3911 0.5561 0.2672 0.6105 0.5958 0.5902 0.6154 0.2908 0.621 0.496 0.5008 0.6195 0.7456 0.5355 0.7562 0.585 0.7571 0.0501 0.7038 0.7117 0.3711 0.4052 0.513 0.634 0.5331 0.0075 0.3264 0.2891 0.0439 0.0353 0.6219 0.5809 0.4835 0.5701 0.5794 0.0624 0.7132 0.7823 0.0578 0.7638 0.5743 0.9204 0.2351 0.5517 0.2579 0.6644 0.4512 0.6959 0.7531 0.0803 0.6572 0.6046 0.1859 0.6949 0.81 0.2269 0.7887 0.6706 0.6487 0.6591 0.7681 0.4119 0.5934 0.4924 0.6397 0.5429 0.406 0.7184 0.6773 0.2714 0.3151 0.2 0.3807 0.6878 0.2099 0.2926 0.5357 0.7483 0.4882 0. 
0.9012 0.3506 0.4508 0.7617] 2022-08-25 04:10:51 [INFO] [EVAL] Class Recall: [0.8243 0.912 0.96 0.8463 0.8454 0.8478 0.8478 0.8982 0.7142 0.7968 0.6044 0.7254 0.8539 0.4042 0.2861 0.5516 0.6814 0.5487 0.7095 0.5191 0.8676 0.5767 0.7178 0.6757 0.4758 0.498 0.5732 0.523 0.475 0.5071 0.3234 0.6471 0.3854 0.479 0.5438 0.637 0.4972 0.6032 0.3262 0.3669 0.2132 0.0897 0.4388 0.2707 0.5117 0.3988 0.2837 0.5542 0.8526 0.7212 0.6696 0.595 0.3472 0.2496 0.817 0.6002 0.9531 0.3893 0.596 0.2546 0.126 0.2357 0.5335 0.1491 0.6597 0.7837 0.3785 0.5017 0.1364 0.4378 0.5208 0.6387 0.4913 0.5011 0.5617 0.4401 0.441 0.2659 0.1183 0.3518 0.8242 0.3748 0.2969 0.0181 0.6677 0.5991 0.0948 0.0803 0.453 0.5926 0.6253 0.0171 0.3115 0.0804 0.009 0.0109 0.2054 0.1572 0.2432 0.4672 0.1309 0.0681 0.2249 0.7248 0.0059 0.6376 0.2219 0.5262 0.1669 0.5195 0.0641 0.4161 0.1002 0.6728 0.9896 0.004 0.4821 0.9524 0.0884 0.3338 0.4873 0.0108 0.2136 0.1559 0.285 0.2941 0.4868 0.4866 0.6642 0.3908 0.809 0.022 0.1566 0.303 0.1214 0.1597 0.0814 0.0206 0.1619 0.3823 0.1456 0.0909 0.3292 0.5988 0.4496 0. 0.303 0.0133 0.0901 0.0662] 2022-08-25 04:10:51 [INFO] [EVAL] The model with the best validation mIoU (0.3249) was saved at iter 157000. 2022-08-25 04:11:01 [INFO] [TRAIN] epoch: 125, iter: 157050/160000, loss: 0.6511, lr: 0.000022, batch_cost: 0.1848, reader_cost: 0.00521, ips: 43.2790 samples/sec | ETA 00:09:05 2022-08-25 04:11:09 [INFO] [TRAIN] epoch: 125, iter: 157100/160000, loss: 0.6917, lr: 0.000022, batch_cost: 0.1610, reader_cost: 0.00092, ips: 49.6891 samples/sec | ETA 00:07:46 2022-08-25 04:11:16 [INFO] [TRAIN] epoch: 125, iter: 157150/160000, loss: 0.7371, lr: 0.000022, batch_cost: 0.1471, reader_cost: 0.00047, ips: 54.3861 samples/sec | ETA 00:06:59 2022-08-25 04:11:23 [INFO] [TRAIN] epoch: 125, iter: 157200/160000, loss: 0.6614, lr: 0.000021, batch_cost: 0.1487, reader_cost: 0.00100, ips: 53.7976 samples/sec | ETA 00:06:56 2022-08-25 04:11:31 [INFO] [TRAIN] epoch: 125, iter: 157250/160000, loss: 0.6916, lr: 0.000021, batch_cost: 0.1554, reader_cost: 0.00074, ips: 51.4819 samples/sec | ETA 00:07:07 2022-08-25 04:11:39 [INFO] [TRAIN] epoch: 125, iter: 157300/160000, loss: 0.6660, lr: 0.000020, batch_cost: 0.1615, reader_cost: 0.00069, ips: 49.5223 samples/sec | ETA 00:07:16 2022-08-25 04:11:47 [INFO] [TRAIN] epoch: 125, iter: 157350/160000, loss: 0.6829, lr: 0.000020, batch_cost: 0.1575, reader_cost: 0.00061, ips: 50.8032 samples/sec | ETA 00:06:57 2022-08-25 04:11:55 [INFO] [TRAIN] epoch: 125, iter: 157400/160000, loss: 0.6696, lr: 0.000020, batch_cost: 0.1537, reader_cost: 0.00062, ips: 52.0372 samples/sec | ETA 00:06:39 2022-08-25 04:12:02 [INFO] [TRAIN] epoch: 125, iter: 157450/160000, loss: 0.6252, lr: 0.000019, batch_cost: 0.1517, reader_cost: 0.00067, ips: 52.7342 samples/sec | ETA 00:06:26 2022-08-25 04:12:11 [INFO] [TRAIN] epoch: 125, iter: 157500/160000, loss: 0.6852, lr: 0.000019, batch_cost: 0.1643, reader_cost: 0.00265, ips: 48.7024 samples/sec | ETA 00:06:50 2022-08-25 04:12:18 [INFO] [TRAIN] epoch: 125, iter: 157550/160000, loss: 0.6683, lr: 0.000019, batch_cost: 0.1576, reader_cost: 0.00037, ips: 50.7745 samples/sec | ETA 00:06:26 2022-08-25 04:12:26 [INFO] [TRAIN] epoch: 125, iter: 157600/160000, loss: 0.6553, lr: 0.000018, batch_cost: 0.1571, reader_cost: 0.00031, ips: 50.9105 samples/sec | ETA 00:06:17 2022-08-25 04:12:36 [INFO] [TRAIN] epoch: 125, iter: 157650/160000, loss: 0.6610, lr: 0.000018, batch_cost: 0.1909, reader_cost: 0.00096, ips: 41.9173 samples/sec | ETA 00:07:28 2022-08-25 04:12:47 [INFO] 
[TRAIN] epoch: 125, iter: 157700/160000, loss: 0.6741, lr: 0.000017, batch_cost: 0.2229, reader_cost: 0.00063, ips: 35.8970 samples/sec | ETA 00:08:32 2022-08-25 04:12:56 [INFO] [TRAIN] epoch: 125, iter: 157750/160000, loss: 0.6190, lr: 0.000017, batch_cost: 0.1756, reader_cost: 0.00503, ips: 45.5627 samples/sec | ETA 00:06:35 2022-08-25 04:13:05 [INFO] [TRAIN] epoch: 125, iter: 157800/160000, loss: 0.7096, lr: 0.000017, batch_cost: 0.1773, reader_cost: 0.00053, ips: 45.1262 samples/sec | ETA 00:06:30 2022-08-25 04:13:14 [INFO] [TRAIN] epoch: 125, iter: 157850/160000, loss: 0.6065, lr: 0.000016, batch_cost: 0.1809, reader_cost: 0.00283, ips: 44.2248 samples/sec | ETA 00:06:28 2022-08-25 04:13:24 [INFO] [TRAIN] epoch: 126, iter: 157900/160000, loss: 0.6467, lr: 0.000016, batch_cost: 0.2041, reader_cost: 0.02785, ips: 39.2050 samples/sec | ETA 00:07:08 2022-08-25 04:13:33 [INFO] [TRAIN] epoch: 126, iter: 157950/160000, loss: 0.6701, lr: 0.000016, batch_cost: 0.1806, reader_cost: 0.00071, ips: 44.2884 samples/sec | ETA 00:06:10 2022-08-25 04:13:42 [INFO] [TRAIN] epoch: 126, iter: 158000/160000, loss: 0.6732, lr: 0.000015, batch_cost: 0.1733, reader_cost: 0.00134, ips: 46.1530 samples/sec | ETA 00:05:46 2022-08-25 04:13:42 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 154s - batch_cost: 0.1538 - reader cost: 6.1255e-04 2022-08-25 04:16:16 [INFO] [EVAL] #Images: 2000 mIoU: 0.3243 Acc: 0.7508 Kappa: 0.7315 Dice: 0.4529 2022-08-25 04:16:16 [INFO] [EVAL] Class IoU: [0.6609 0.7701 0.9257 0.7081 0.6695 0.747 0.7631 0.7611 0.4868 0.6274 0.467 0.5277 0.6624 0.3025 0.2255 0.3845 0.4975 0.42 0.5534 0.365 0.7248 0.4002 0.5673 0.4586 0.3339 0.3632 0.3647 0.4065 0.3637 0.3523 0.2183 0.3973 0.277 0.2848 0.3709 0.4031 0.3725 0.4865 0.2631 0.2705 0.0897 0.0738 0.2956 0.2242 0.2958 0.2601 0.2058 0.4305 0.7011 0.4961 0.4958 0.273 0.2255 0.2174 0.6115 0.442 0.8444 0.3116 0.4797 0.1649 0.0576 0.1731 0.3034 0.13 0.3717 0.6703 0.2449 0.3615 0.0956 0.3382 0.3814 0.4336 0.3754 0.2241 0.4191 0.3023 0.3365 0.2238 0.1203 0.1968 0.6454 0.296 0.2678 0.0241 0.5109 0.4818 0.0828 0.0656 0.3588 0.4159 0.4083 0.0061 0.2092 0.0705 0.0069 0.007 0.1581 0.1413 0.1975 0.3405 0.1241 0.0187 0.2147 0.5352 0.0025 0.5214 0.1862 0.4958 0.1031 0.3777 0.0708 0.3026 0.0947 0.534 0.7544 0.005 0.3645 0.5886 0.042 0.3604 0.4706 0.0122 0.2171 0.1347 0.2515 0.2241 0.421 0.2936 0.4541 0.2803 0.5372 0.0252 0.1343 0.2608 0.0992 0.1078 0.0567 0.0179 0.1302 0.3341 0.0937 0.0578 0.2697 0.5023 0.3113 0. 
0.314 0.014 0.0769 0.0566] 2022-08-25 04:16:16 [INFO] [EVAL] Class Precision: [0.766 0.8339 0.9611 0.8072 0.7651 0.8669 0.8781 0.8309 0.6054 0.7525 0.6672 0.6631 0.7515 0.4923 0.4955 0.544 0.6685 0.6461 0.7346 0.565 0.8174 0.5697 0.7347 0.6016 0.5146 0.5735 0.5239 0.6684 0.6296 0.5289 0.404 0.5239 0.4855 0.4112 0.518 0.5405 0.6027 0.7401 0.5411 0.5324 0.1584 0.2074 0.4778 0.5397 0.418 0.4485 0.3703 0.6475 0.7921 0.6151 0.6266 0.3386 0.4025 0.6371 0.7019 0.6336 0.8818 0.6317 0.6756 0.333 0.0938 0.3747 0.4385 0.5942 0.4557 0.811 0.3885 0.5705 0.2456 0.5927 0.5887 0.5625 0.6253 0.288 0.615 0.4727 0.543 0.601 0.7937 0.5459 0.7423 0.5706 0.7578 0.0736 0.6833 0.702 0.3865 0.4141 0.6305 0.5701 0.5465 0.0085 0.3559 0.2925 0.0376 0.0288 0.6318 0.5123 0.4182 0.5948 0.5604 0.0386 0.7134 0.7259 0.0292 0.7323 0.5894 0.9241 0.2414 0.5545 0.2937 0.5258 0.3709 0.7361 0.7606 0.0791 0.6582 0.6116 0.1149 0.7224 0.7773 0.2354 0.6614 0.6757 0.6136 0.6271 0.7797 0.4329 0.5635 0.4727 0.619 0.4915 0.4212 0.7336 0.6714 0.238 0.288 0.1133 0.3757 0.6858 0.1836 0.165 0.5586 0.726 0.5527 0. 0.8923 0.4022 0.4052 0.6745] 2022-08-25 04:16:16 [INFO] [EVAL] Class Recall: [0.8281 0.9096 0.9617 0.8522 0.8427 0.8437 0.8535 0.9006 0.713 0.7906 0.6087 0.721 0.8482 0.4397 0.2928 0.5674 0.6603 0.5455 0.6916 0.5077 0.8647 0.5735 0.7134 0.6587 0.4874 0.4976 0.5455 0.5092 0.4627 0.5134 0.322 0.6217 0.3922 0.4811 0.5663 0.6132 0.4937 0.5867 0.3387 0.3547 0.1715 0.1027 0.4366 0.2773 0.5031 0.3823 0.3166 0.5623 0.8592 0.7194 0.7036 0.5851 0.339 0.2481 0.826 0.5937 0.9521 0.3808 0.6233 0.2463 0.1298 0.2434 0.496 0.1426 0.6686 0.7943 0.3985 0.4968 0.1353 0.4406 0.52 0.6543 0.4844 0.5028 0.5681 0.4562 0.4695 0.2629 0.1242 0.2353 0.8319 0.3809 0.2928 0.0345 0.6694 0.6056 0.0953 0.0723 0.4543 0.6058 0.6174 0.0218 0.3368 0.0851 0.0084 0.0091 0.1742 0.1632 0.2722 0.4433 0.1375 0.035 0.2349 0.6708 0.0028 0.6442 0.214 0.5169 0.1525 0.5422 0.0853 0.4161 0.1128 0.6605 0.9894 0.0053 0.4496 0.9401 0.0621 0.4184 0.5439 0.0127 0.2442 0.144 0.2989 0.2585 0.4778 0.4772 0.7006 0.4078 0.8026 0.0258 0.1646 0.288 0.1043 0.1645 0.066 0.0208 0.1661 0.3945 0.1607 0.0818 0.3428 0.6197 0.4162 0. 0.3263 0.0143 0.0867 0.0582] 2022-08-25 04:16:16 [INFO] [EVAL] The model with the best validation mIoU (0.3249) was saved at iter 157000. 
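A note on how these [EVAL] fields fit together: each eval block reports one IoU, precision and recall value per ADE20K class, the scalar mIoU is the mean of the full Class IoU vector, and every per-class IoU is tied to its precision/recall pair through 1/IoU = 1/P + 1/R - 1. A minimal sanity-check sketch in Python (assumes NumPy; it uses only the first four classes of the 04:16:16 eval above, with the rounded values as printed in the log):

import numpy as np

# First four entries of the iter-158000 [EVAL] arrays above (log-rounded values).
class_iou  = np.array([0.6609, 0.7701, 0.9257, 0.7081])
class_prec = np.array([0.7660, 0.8339, 0.9611, 0.8072])
class_rec  = np.array([0.8281, 0.9096, 0.9617, 0.8522])

# With per-class TP/FP/FN pixel counts: IoU = TP/(TP+FP+FN), P = TP/(TP+FP),
# R = TP/(TP+FN), which rearranges to IoU = P*R / (P + R - P*R).
iou_from_pr = class_prec * class_rec / (class_prec + class_rec - class_prec * class_rec)
print(np.round(iou_from_pr, 4))  # [0.6609 0.7701 0.9257 0.7081], matches Class IoU up to rounding

# The scalar mIoU printed above (0.3243 here) is the mean over all class IoUs;
# this 4-class subset is only for illustration.
print(class_iou.mean())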
2022-08-25 04:16:24 [INFO] [TRAIN] epoch: 126, iter: 158050/160000, loss: 0.6347, lr: 0.000015, batch_cost: 0.1588, reader_cost: 0.00393, ips: 50.3916 samples/sec | ETA 00:05:09 2022-08-25 04:16:31 [INFO] [TRAIN] epoch: 126, iter: 158100/160000, loss: 0.6755, lr: 0.000014, batch_cost: 0.1489, reader_cost: 0.00119, ips: 53.7413 samples/sec | ETA 00:04:42 2022-08-25 04:16:39 [INFO] [TRAIN] epoch: 126, iter: 158150/160000, loss: 0.7221, lr: 0.000014, batch_cost: 0.1495, reader_cost: 0.00070, ips: 53.5141 samples/sec | ETA 00:04:36 2022-08-25 04:16:46 [INFO] [TRAIN] epoch: 126, iter: 158200/160000, loss: 0.6488, lr: 0.000014, batch_cost: 0.1484, reader_cost: 0.00044, ips: 53.8981 samples/sec | ETA 00:04:27 2022-08-25 04:16:54 [INFO] [TRAIN] epoch: 126, iter: 158250/160000, loss: 0.6706, lr: 0.000013, batch_cost: 0.1512, reader_cost: 0.00100, ips: 52.9207 samples/sec | ETA 00:04:24 2022-08-25 04:17:01 [INFO] [TRAIN] epoch: 126, iter: 158300/160000, loss: 0.6418, lr: 0.000013, batch_cost: 0.1514, reader_cost: 0.00042, ips: 52.8340 samples/sec | ETA 00:04:17 2022-08-25 04:17:09 [INFO] [TRAIN] epoch: 126, iter: 158350/160000, loss: 0.6893, lr: 0.000012, batch_cost: 0.1521, reader_cost: 0.00801, ips: 52.5842 samples/sec | ETA 00:04:11 2022-08-25 04:17:18 [INFO] [TRAIN] epoch: 126, iter: 158400/160000, loss: 0.6502, lr: 0.000012, batch_cost: 0.1923, reader_cost: 0.00032, ips: 41.6082 samples/sec | ETA 00:05:07 2022-08-25 04:17:27 [INFO] [TRAIN] epoch: 126, iter: 158450/160000, loss: 0.6590, lr: 0.000012, batch_cost: 0.1668, reader_cost: 0.00053, ips: 47.9708 samples/sec | ETA 00:04:18 2022-08-25 04:17:35 [INFO] [TRAIN] epoch: 126, iter: 158500/160000, loss: 0.6772, lr: 0.000011, batch_cost: 0.1609, reader_cost: 0.00053, ips: 49.7067 samples/sec | ETA 00:04:01 2022-08-25 04:17:43 [INFO] [TRAIN] epoch: 126, iter: 158550/160000, loss: 0.6869, lr: 0.000011, batch_cost: 0.1694, reader_cost: 0.00041, ips: 47.2256 samples/sec | ETA 00:04:05 2022-08-25 04:17:51 [INFO] [TRAIN] epoch: 126, iter: 158600/160000, loss: 0.6788, lr: 0.000011, batch_cost: 0.1645, reader_cost: 0.00064, ips: 48.6400 samples/sec | ETA 00:03:50 2022-08-25 04:18:00 [INFO] [TRAIN] epoch: 126, iter: 158650/160000, loss: 0.7055, lr: 0.000010, batch_cost: 0.1604, reader_cost: 0.00540, ips: 49.8608 samples/sec | ETA 00:03:36 2022-08-25 04:18:08 [INFO] [TRAIN] epoch: 126, iter: 158700/160000, loss: 0.6563, lr: 0.000010, batch_cost: 0.1615, reader_cost: 0.00043, ips: 49.5466 samples/sec | ETA 00:03:29 2022-08-25 04:18:16 [INFO] [TRAIN] epoch: 126, iter: 158750/160000, loss: 0.6601, lr: 0.000009, batch_cost: 0.1659, reader_cost: 0.00061, ips: 48.2150 samples/sec | ETA 00:03:27 2022-08-25 04:18:24 [INFO] [TRAIN] epoch: 126, iter: 158800/160000, loss: 0.6535, lr: 0.000009, batch_cost: 0.1620, reader_cost: 0.00050, ips: 49.3968 samples/sec | ETA 00:03:14 2022-08-25 04:18:32 [INFO] [TRAIN] epoch: 126, iter: 158850/160000, loss: 0.6784, lr: 0.000009, batch_cost: 0.1676, reader_cost: 0.00190, ips: 47.7297 samples/sec | ETA 00:03:12 2022-08-25 04:18:42 [INFO] [TRAIN] epoch: 126, iter: 158900/160000, loss: 0.6350, lr: 0.000008, batch_cost: 0.1891, reader_cost: 0.00188, ips: 42.3048 samples/sec | ETA 00:03:28 2022-08-25 04:18:51 [INFO] [TRAIN] epoch: 126, iter: 158950/160000, loss: 0.6830, lr: 0.000008, batch_cost: 0.1887, reader_cost: 0.00064, ips: 42.4050 samples/sec | ETA 00:03:18 2022-08-25 04:19:01 [INFO] [TRAIN] epoch: 126, iter: 159000/160000, loss: 0.6703, lr: 0.000008, batch_cost: 0.1959, reader_cost: 0.00047, ips: 40.8475 samples/sec | ETA 
00:03:15 2022-08-25 04:19:01 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 168s - batch_cost: 0.1684 - reader cost: 5.8651e-04 2022-08-25 04:21:50 [INFO] [EVAL] #Images: 2000 mIoU: 0.3241 Acc: 0.7506 Kappa: 0.7314 Dice: 0.4526 2022-08-25 04:21:50 [INFO] [EVAL] Class IoU: [0.6601 0.7709 0.9248 0.7064 0.6689 0.7459 0.7624 0.7628 0.4895 0.6266 0.4679 0.5286 0.6622 0.2983 0.2233 0.3857 0.4885 0.4192 0.5555 0.3701 0.7273 0.4059 0.5623 0.4545 0.3333 0.3707 0.371 0.4046 0.369 0.3473 0.2214 0.3999 0.2808 0.2949 0.3335 0.3923 0.3723 0.495 0.2619 0.2766 0.0876 0.0657 0.3014 0.2231 0.2907 0.269 0.2135 0.4261 0.6975 0.4927 0.4744 0.2802 0.2349 0.2221 0.613 0.4322 0.8475 0.3097 0.4813 0.1657 0.0481 0.1855 0.3087 0.1335 0.376 0.6675 0.2378 0.3517 0.1011 0.3429 0.3868 0.4337 0.3674 0.2273 0.4229 0.3061 0.3183 0.227 0.0988 0.2735 0.6634 0.2982 0.2774 0.0141 0.5295 0.4855 0.0726 0.0676 0.3304 0.4334 0.4124 0.0026 0.2085 0.0699 0.0093 0.0059 0.1793 0.1428 0.2207 0.3296 0.1279 0.0305 0.2092 0.5878 0.0028 0.5397 0.1923 0.4916 0.0961 0.3687 0.0709 0.302 0.0937 0.5108 0.7623 0.0065 0.3895 0.5963 0.0433 0.2197 0.4098 0.0115 0.21 0.1472 0.2512 0.2569 0.4192 0.2814 0.4613 0.2721 0.5484 0.0254 0.1266 0.2803 0.098 0.1117 0.0587 0.0185 0.1249 0.3293 0.092 0.0547 0.2506 0.4815 0.3101 0. 0.2928 0.0114 0.0853 0.0613] 2022-08-25 04:21:50 [INFO] [EVAL] Class Precision: [0.764 0.8336 0.9603 0.8037 0.7677 0.8667 0.8814 0.8391 0.6169 0.7422 0.6752 0.659 0.75 0.4991 0.4715 0.555 0.6371 0.6425 0.7189 0.5582 0.8231 0.5799 0.7155 0.6019 0.5055 0.5801 0.523 0.6801 0.618 0.5429 0.3878 0.5494 0.4966 0.4314 0.5055 0.5414 0.6041 0.7668 0.5467 0.5058 0.1413 0.2326 0.484 0.5605 0.4224 0.4538 0.3799 0.6338 0.7704 0.6103 0.5951 0.3471 0.4141 0.6403 0.7085 0.6134 0.8933 0.6251 0.6883 0.3424 0.0805 0.4599 0.4211 0.6349 0.4648 0.797 0.4002 0.5598 0.2611 0.6079 0.6068 0.5686 0.6246 0.2927 0.6373 0.4892 0.5243 0.6425 0.6895 0.5772 0.776 0.5684 0.7508 0.0459 0.6883 0.7009 0.3589 0.4283 0.5979 0.6198 0.5517 0.0036 0.3397 0.3005 0.0582 0.0252 0.6445 0.5239 0.4402 0.5529 0.575 0.0554 0.6885 0.7677 0.0354 0.7672 0.5624 0.9145 0.2337 0.5552 0.2807 0.5242 0.4442 0.7257 0.769 0.0965 0.6424 0.6173 0.1197 0.6778 0.8112 0.2576 0.7384 0.6681 0.6234 0.6551 0.7913 0.3965 0.6065 0.4821 0.6349 0.5836 0.3704 0.7005 0.6726 0.2745 0.3217 0.1791 0.3553 0.6933 0.1971 0.1995 0.51 0.7485 0.5202 0. 0.9023 0.3563 0.4217 0.696 ] 2022-08-25 04:21:50 [INFO] [EVAL] Class Recall: [0.8291 0.911 0.9615 0.8537 0.8386 0.8425 0.8495 0.8935 0.7032 0.801 0.6038 0.7277 0.8497 0.4258 0.2979 0.5584 0.6769 0.5467 0.7096 0.5233 0.862 0.5749 0.7243 0.6499 0.4946 0.5066 0.5607 0.4997 0.478 0.4909 0.3404 0.5951 0.3925 0.4823 0.495 0.5875 0.4924 0.5828 0.3346 0.379 0.1875 0.0839 0.444 0.2704 0.4824 0.3978 0.3276 0.5653 0.8806 0.7189 0.7006 0.5926 0.3518 0.2538 0.8197 0.5941 0.943 0.3803 0.6155 0.243 0.1066 0.2371 0.5364 0.1446 0.663 0.8042 0.3696 0.4861 0.1417 0.4403 0.5162 0.6464 0.4715 0.5043 0.5569 0.4498 0.4475 0.2598 0.1034 0.342 0.8205 0.3854 0.3055 0.0199 0.6965 0.6124 0.0834 0.0743 0.4248 0.5904 0.6203 0.0097 0.3505 0.0834 0.011 0.0076 0.199 0.164 0.3067 0.4494 0.1413 0.0637 0.2311 0.715 0.003 0.6454 0.2261 0.5152 0.1403 0.5232 0.0866 0.4161 0.1062 0.6331 0.9886 0.0069 0.4974 0.946 0.0636 0.2454 0.453 0.0119 0.2269 0.1588 0.2962 0.297 0.4714 0.4923 0.6584 0.3845 0.8011 0.0259 0.1613 0.3184 0.1029 0.1584 0.067 0.0203 0.1615 0.3855 0.1471 0.0701 0.33 0.5744 0.4343 0. 
0.3023 0.0117 0.0966 0.063 ] 2022-08-25 04:21:50 [INFO] [EVAL] The model with the best validation mIoU (0.3249) was saved at iter 157000. 2022-08-25 04:21:58 [INFO] [TRAIN] epoch: 126, iter: 159050/160000, loss: 0.6416, lr: 0.000007, batch_cost: 0.1661, reader_cost: 0.00568, ips: 48.1617 samples/sec | ETA 00:02:37 2022-08-25 04:22:07 [INFO] [TRAIN] epoch: 126, iter: 159100/160000, loss: 0.6244, lr: 0.000007, batch_cost: 0.1762, reader_cost: 0.00097, ips: 45.4089 samples/sec | ETA 00:02:38 2022-08-25 04:22:16 [INFO] [TRAIN] epoch: 127, iter: 159150/160000, loss: 0.6506, lr: 0.000006, batch_cost: 0.1835, reader_cost: 0.02665, ips: 43.5915 samples/sec | ETA 00:02:35 2022-08-25 04:22:25 [INFO] [TRAIN] epoch: 127, iter: 159200/160000, loss: 0.6869, lr: 0.000006, batch_cost: 0.1715, reader_cost: 0.00051, ips: 46.6382 samples/sec | ETA 00:02:17 2022-08-25 04:22:33 [INFO] [TRAIN] epoch: 127, iter: 159250/160000, loss: 0.6625, lr: 0.000006, batch_cost: 0.1684, reader_cost: 0.00172, ips: 47.5017 samples/sec | ETA 00:02:06 2022-08-25 04:22:41 [INFO] [TRAIN] epoch: 127, iter: 159300/160000, loss: 0.6250, lr: 0.000005, batch_cost: 0.1607, reader_cost: 0.00057, ips: 49.7703 samples/sec | ETA 00:01:52 2022-08-25 04:22:49 [INFO] [TRAIN] epoch: 127, iter: 159350/160000, loss: 0.7070, lr: 0.000005, batch_cost: 0.1646, reader_cost: 0.00059, ips: 48.6172 samples/sec | ETA 00:01:46 2022-08-25 04:22:57 [INFO] [TRAIN] epoch: 127, iter: 159400/160000, loss: 0.6723, lr: 0.000005, batch_cost: 0.1623, reader_cost: 0.00033, ips: 49.2973 samples/sec | ETA 00:01:37 2022-08-25 04:23:05 [INFO] [TRAIN] epoch: 127, iter: 159450/160000, loss: 0.6896, lr: 0.000004, batch_cost: 0.1546, reader_cost: 0.00082, ips: 51.7341 samples/sec | ETA 00:01:25 2022-08-25 04:23:15 [INFO] [TRAIN] epoch: 127, iter: 159500/160000, loss: 0.6929, lr: 0.000004, batch_cost: 0.1919, reader_cost: 0.00054, ips: 41.6978 samples/sec | ETA 00:01:35 2022-08-25 04:23:24 [INFO] [TRAIN] epoch: 127, iter: 159550/160000, loss: 0.6614, lr: 0.000003, batch_cost: 0.1881, reader_cost: 0.00058, ips: 42.5284 samples/sec | ETA 00:01:24 2022-08-25 04:23:33 [INFO] [TRAIN] epoch: 127, iter: 159600/160000, loss: 0.6990, lr: 0.000003, batch_cost: 0.1788, reader_cost: 0.00077, ips: 44.7408 samples/sec | ETA 00:01:11 2022-08-25 04:23:42 [INFO] [TRAIN] epoch: 127, iter: 159650/160000, loss: 0.6743, lr: 0.000003, batch_cost: 0.1761, reader_cost: 0.00028, ips: 45.4318 samples/sec | ETA 00:01:01 2022-08-25 04:23:50 [INFO] [TRAIN] epoch: 127, iter: 159700/160000, loss: 0.6554, lr: 0.000002, batch_cost: 0.1541, reader_cost: 0.00068, ips: 51.9099 samples/sec | ETA 00:00:46 2022-08-25 04:23:58 [INFO] [TRAIN] epoch: 127, iter: 159750/160000, loss: 0.6670, lr: 0.000002, batch_cost: 0.1613, reader_cost: 0.00110, ips: 49.6047 samples/sec | ETA 00:00:40 2022-08-25 04:24:06 [INFO] [TRAIN] epoch: 127, iter: 159800/160000, loss: 0.6346, lr: 0.000002, batch_cost: 0.1687, reader_cost: 0.00031, ips: 47.4278 samples/sec | ETA 00:00:33 2022-08-25 04:24:14 [INFO] [TRAIN] epoch: 127, iter: 159850/160000, loss: 0.6567, lr: 0.000001, batch_cost: 0.1624, reader_cost: 0.00063, ips: 49.2666 samples/sec | ETA 00:00:24 2022-08-25 04:24:23 [INFO] [TRAIN] epoch: 127, iter: 159900/160000, loss: 0.6636, lr: 0.000001, batch_cost: 0.1736, reader_cost: 0.01419, ips: 46.0805 samples/sec | ETA 00:00:17 2022-08-25 04:24:32 [INFO] [TRAIN] epoch: 127, iter: 159950/160000, loss: 0.6494, lr: 0.000000, batch_cost: 0.1832, reader_cost: 0.00057, ips: 43.6594 samples/sec | ETA 00:00:09 2022-08-25 04:24:42 [INFO] [TRAIN] 
epoch: 127, iter: 160000/160000, loss: 0.6412, lr: 0.000000, batch_cost: 0.1963, reader_cost: 0.00057, ips: 40.7502 samples/sec | ETA 00:00:00 2022-08-25 04:24:42 [INFO] Start evaluating (total_samples: 2000, total_iters: 1000)... 1000/1000 - 181s - batch_cost: 0.1805 - reader cost: 0.0014 2022-08-25 04:27:43 [INFO] [EVAL] #Images: 2000 mIoU: 0.3236 Acc: 0.7510 Kappa: 0.7318 Dice: 0.4521 2022-08-25 04:27:43 [INFO] [EVAL] Class IoU: [0.6614 0.7712 0.9251 0.7079 0.6698 0.746 0.7642 0.7602 0.487 0.6354 0.4651 0.5323 0.6615 0.2905 0.2259 0.3856 0.4848 0.4252 0.5581 0.3685 0.7255 0.4097 0.5637 0.4654 0.328 0.3591 0.3777 0.4093 0.3752 0.3481 0.2183 0.3922 0.2821 0.283 0.351 0.402 0.3733 0.5186 0.2587 0.271 0.0831 0.0681 0.2967 0.2283 0.2847 0.2594 0.1787 0.4318 0.6882 0.4873 0.4871 0.2769 0.2262 0.2153 0.6353 0.4497 0.846 0.3217 0.4742 0.1828 0.0535 0.1797 0.3098 0.1293 0.3793 0.6719 0.2496 0.3566 0.0944 0.3427 0.3761 0.4391 0.3727 0.2277 0.4122 0.301 0.3295 0.226 0.1285 0.2125 0.6376 0.3009 0.2753 0.019 0.5306 0.4803 0.0794 0.0623 0.3254 0.4256 0.4149 0.0016 0.2094 0.0718 0.0038 0.0096 0.157 0.1453 0.2203 0.3521 0.1309 0.0244 0.2126 0.6497 0.0017 0.5357 0.1837 0.5014 0.0921 0.3492 0.0783 0.2948 0.0943 0.4669 0.7046 0.0083 0.348 0.5666 0.0461 0.2957 0.4546 0.0122 0.2098 0.1341 0.2485 0.2287 0.4175 0.2913 0.4116 0.2718 0.5532 0.0252 0.1265 0.2571 0.093 0.1093 0.0583 0.0176 0.1319 0.3288 0.1025 0.0654 0.2749 0.5099 0.2925 0. 0.3064 0.0127 0.0754 0.0628] 2022-08-25 04:27:43 [INFO] [EVAL] Class Precision: [0.766 0.8363 0.9605 0.8081 0.7681 0.8604 0.8776 0.8353 0.6095 0.7466 0.6725 0.663 0.7469 0.4947 0.4838 0.5619 0.6453 0.6485 0.7329 0.5764 0.8192 0.5842 0.736 0.5998 0.5209 0.5884 0.5407 0.6831 0.6259 0.5265 0.4041 0.5037 0.47 0.3991 0.5144 0.5459 0.6083 0.7601 0.5224 0.5055 0.1368 0.2042 0.4878 0.5477 0.4001 0.4375 0.3259 0.6604 0.7508 0.5931 0.6287 0.3456 0.3989 0.5826 0.7012 0.624 0.8899 0.6195 0.6648 0.3512 0.0897 0.4043 0.4321 0.5998 0.4671 0.7967 0.4033 0.5592 0.2568 0.6135 0.5872 0.5875 0.6445 0.2973 0.5864 0.4951 0.543 0.6121 0.6458 0.5609 0.729 0.5735 0.7582 0.0589 0.6956 0.7163 0.3221 0.4166 0.4986 0.5926 0.5614 0.0023 0.3454 0.2983 0.0227 0.0398 0.6213 0.5466 0.445 0.5966 0.6009 0.0462 0.7065 0.7838 0.0258 0.7636 0.5544 0.878 0.215 0.5593 0.2717 0.5029 0.3871 0.721 0.7102 0.1183 0.672 0.5819 0.1165 0.729 0.7768 0.2659 0.6599 0.6906 0.6246 0.647 0.7726 0.4378 0.5559 0.4465 0.6471 0.523 0.435 0.732 0.6549 0.2786 0.2998 0.1538 0.367 0.7007 0.2035 0.1814 0.5591 0.7136 0.4584 0. 0.8814 0.4144 0.4435 0.6884] 2022-08-25 04:27:43 [INFO] [EVAL] Class Recall: [0.8288 0.9084 0.9616 0.8509 0.8396 0.8488 0.8553 0.8942 0.7079 0.8102 0.6012 0.7297 0.8526 0.4131 0.2977 0.5514 0.6609 0.5525 0.7007 0.5053 LAUNCH INFO 2022-08-25 04:27:47,083 Pod failed INFO 2022-08-25 04:27:47,083 controller.py:99] Pod failed LAUNCH ERROR 2022-08-25 04:27:47,083 Container failed !!! 
Container rank 0 status failed cmd ['/ssd3/pengjuncai/anaconda3/bin/python', '-u', 'train.py', '--config', 'configs/topformer/topformer_tiny_ade20k_512x512_160k.yml', '--save_dir', 'output/topformer/topformer_tiny_ade20k_512x512_160k/test_0', '--num_workers', '3', '--do_eval', '--use_vdl', '--log_iters', '50'] code 1 log output/topformer/topformer_tiny_ade20k_512x512_160k/test_0/log_dir/default.rrrvdl.0.log env {'XDG_SESSION_ID': '5', 'HOSTNAME': 'instance-mqcyj27y-2', 'SHELL': '/bin/bash', 'TERM': 'screen', 'HISTSIZE': '50000', 'SSH_CLIENT': '172.31.22.20 26694 22', 'CONDA_SHLVL': '1', 'CONDA_PROMPT_MODIFIER': '(base) ', 'QTDIR': '/usr/lib64/qt-3.3', 'QTINC': '/usr/lib64/qt-3.3/include', 'SSH_TTY': '/dev/pts/2', 'ZSH': '/ssd3/pengjuncai/.oh-my-zsh', 'USER': 'pengjuncai', 'LS_COLORS': 'rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=01;05;37;41:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=01;36:*.au=01;36:*.flac=01;36:*.mid=01;36:*.midi=01;36:*.mka=01;36:*.mp3=01;36:*.mpc=01;36:*.ogg=01;36:*.ra=01;36:*.wav=01;36:*.axa=01;36:*.oga=01;36:*.spx=01;36:*.xspf=01;36:', 'LD_LIBRARY_PATH': '/usr/local/cuda/lib64', 'CONDA_EXE': '/ssd3/pengjuncai/anaconda3/bin/conda', 'TMOUT': '172800', 'base_model': 'topformer', 'PAGER': 'less', 'TMUX': '/tmp/tmux-1032/default,17077,0', 'LSCOLORS': 'Gxfxcxdxbxegedabagacad', '_CE_CONDA': '', 'MAIL': '/var/spool/mail/pengjuncai', 'PATH': '/ssd3/pengjuncai/.BCloud/bin:/usr/local/cuda/bin:/ssd3/pengjuncai/anaconda3/bin:/ssd3/pengjuncai/anaconda3/condabin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/opt/bin:/home/opt/bin:/ssd3/pengjuncai/.local/bin:/ssd3/pengjuncai/bin:/opt/bin:/home/opt/bin', 'tag': 'test_0', 'CONDA_PREFIX': '/ssd3/pengjuncai/anaconda3', 'PWD': '/ssd3/pengjuncai/PaddleSeg', 'CUDA_VISIBLE_DEVICES': '3,4', 'LANG': 'en_US.UTF-8', 'TMUX_PANE': '%8', 'HISTCONTROL': 'ignoredups', '_CE_M': '', 'HOME': '/ssd3/pengjuncai', 'SHLVL': '8', 'CONDA_PYTHON_EXE': '/ssd3/pengjuncai/anaconda3/bin/python', 'LESS': '-R', 'LOGNAME': 'pengjuncai', 'QTLIB': '/usr/lib64/qt-3.3/lib', 'SSH_CONNECTION': '172.31.43.62 55146 10.9.189.6 22', 'XDG_DATA_DIRS': '/ssd3/pengjuncai/.local/share/flatpak/exports/share:/var/lib/flatpak/exports/share:/usr/local/share:/usr/share', 'CONDA_DEFAULT_ENV': 'base', 'LESSOPEN': '||/usr/bin/lesspipe.sh %s', 
'XDG_RUNTIME_DIR': '/run/user/1032', 'HISTTIMEFORMAT': '%Y-%m-%d %H:%M:%S ', 'model': 'topformer_tiny_ade20k_512x512_160k', '_': '/usr/bin/nohup', 'OLDPWD': '/ssd3/pengjuncai/PaddleSeg/output/topformer/topformer_tiny_ade20k_512x512_160k/test_0', 'CUSTOM_DEVICE_ROOT': '', 'OMP_NUM_THREADS': '1', 'QT_QPA_PLATFORM_PLUGIN_PATH': '/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/cv2/qt/plugins', 'QT_QPA_FONTDIR': '/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/cv2/qt/fonts', 'PADDLE_MASTER': '10.9.189.6:49915', 'PADDLE_GLOBAL_SIZE': '2', 'PADDLE_LOCAL_SIZE': '2', 'PADDLE_GLOBAL_RANK': '0', 'PADDLE_LOCAL_RANK': '0', 'PADDLE_TRAINER_ENDPOINTS': '10.9.189.6:58800,10.9.189.6:55347', 'PADDLE_CURRENT_ENDPOINT': '10.9.189.6:58800', 'PADDLE_TRAINER_ID': '0', 'PADDLE_TRAINERS_NUM': '2', 'PADDLE_RANK_IN_NODE': '0', 'FLAGS_selected_gpus': '0'}
ERROR 2022-08-25 04:27:47,083 controller.py:100] Container failed !!!
LAUNCH INFO 2022-08-25 04:27:47,084 Exit code 1
INFO 2022-08-25 04:27:47,084 controller.py:124] Exit code 1
0.8639 0.5784 0.7066 0.6751 0.4697 0.4795 0.5561 0.5052 0.4837 0.5068 0.322 0.6392 0.4137 0.4933 0.5251 0.604 0.4914 0.6201 0.3389 0.3687 0.1747 0.0927 0.431 0.2813 0.4969 0.3892 0.2835 0.5551 0.8918 0.7321 0.6838 0.5819 0.3433 0.2546 0.8712 0.6168 0.9449 0.4009 0.6232 0.2759 0.1169 0.2444 0.5227 0.1415 0.6686 0.8109 0.3958 0.496 0.1299 0.4371 0.5113 0.6348 0.4691 0.4929 0.5812 0.4343 0.456 0.2637 0.1383 0.2548 0.8358 0.3876 0.3018 0.0273 0.6911 0.5932 0.0954 0.0683 0.4837 0.6016 0.614 0.0052 0.3472 0.0863 0.0045 0.0125 0.1736 0.1652 0.3038 0.462 0.1434 0.0491 0.2332 0.7915 0.0019 0.6423 0.2155 0.539 0.1387 0.4818 0.099 0.4161 0.1108 0.5699 0.9888 0.0088 0.4192 0.9555 0.0709 0.3322 0.5229 0.0126 0.2353 0.1427 0.2922 0.2613 0.476 0.4655 0.6133 0.41 0.7921 0.0258 0.1514 0.2838 0.0978 0.1524 0.0675 0.0195 0.1708 0.3826 0.1713 0.0928 0.351 0.6411 0.4469 0. 0.3196 0.013 0.0833 0.0647]
2022-08-25 04:27:43 [INFO] [EVAL] The model with the best validation mIoU (0.3249) was saved at iter 157000.
's flops has been counted Customize Function has been applied to 's flops has been counted Cannot find suitable count function for . Treat it as zero FLOPs. 's flops has been counted Cannot find suitable count function for . Treat it as zero FLOPs. 's flops has been counted Cannot find suitable count function for . Treat it as zero FLOPs.
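The repeated "Cannot find suitable count function for . Treat it as zero FLOPs." lines come from paddle.flops; the layer class names that normally appear between angle brackets seem to have been stripped when this log was captured. Any layer type paddle has no built-in counter for contributes zero FLOPs unless a counter is supplied through the custom_ops argument, and the "Customize Function has been applied to" line suggests the training script already passes one for at least one layer type. A minimal sketch of the mechanism, with a made-up layer class DummyGate and an illustrative FLOP formula (neither is PaddleSeg's actual code):

import paddle
import paddle.nn as nn

class DummyGate(nn.Layer):
    # Hypothetical custom layer standing in for the unnamed layers in the warnings above.
    def __init__(self, ch):
        super().__init__()
        self.fc = nn.Linear(ch, ch)

    def forward(self, x):                        # x: [N, C, H, W]
        s = x.mean(axis=[2, 3])                  # global average pool -> [N, C]
        return x * self.fc(s).unsqueeze(-1).unsqueeze(-1)

def count_dummy_gate(layer, inputs, output):
    # paddle.flops custom counters accumulate into layer.total_ops.
    n, c, h, w = inputs[0].shape
    layer.total_ops += int(n * c * h * w + n * c * c)  # illustrative formula only

net = nn.Sequential(nn.Conv2D(3, 8, 3, padding=1), DummyGate(8))
# Without custom_ops, DummyGate would be "treated as zero FLOPs"; with it, it is counted.
paddle.flops(net, [1, 3, 64, 64], custom_ops={DummyGate: count_dummy_gate}, print_detail=True)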
Traceback (most recent call last):
  File "/ssd3/pengjuncai/PaddleSeg/train.py", line 240, in <module>
    main(args)
  File "/ssd3/pengjuncai/PaddleSeg/train.py", line 216, in main
    train(
  File "/ssd3/pengjuncai/PaddleSeg/paddleseg/core/train.py", line 327, in train
    _ = paddle.flops(
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/hapi/dynamic_flops.py", line 109, in flops
    return dynamic_flops(
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/hapi/dynamic_flops.py", line 257, in dynamic_flops
    model(inputs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 930, in __call__
    return self._dygraph_call_func(*inputs, **kwargs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 915, in _dygraph_call_func
    outputs = self.forward(*inputs, **kwargs)
  File "/ssd3/pengjuncai/PaddleSeg/paddleseg/models/topformer.py", line 76, in forward
    x = self.backbone(x) # len=3, 1/8,1/16,1/32
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 930, in __call__
    return self._dygraph_call_func(*inputs, **kwargs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 915, in _dygraph_call_func
    outputs = self.forward(*inputs, **kwargs)
  File "/ssd3/pengjuncai/PaddleSeg/paddleseg/models/backbones/top_transformer.py", line 573, in forward
    out = self.ppa(ouputs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 930, in __call__
    return self._dygraph_call_func(*inputs, **kwargs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/fluid/dygraph/layers.py", line 918, in _dygraph_call_func
    hook_result = forward_post_hook(self, inputs, outputs)
  File "/ssd3/pengjuncai/anaconda3/lib/python3.9/site-packages/paddle/hapi/dynamic_flops.py", line 183, in count_io_info
    m.register_buffer('input_shape', paddle.to_tensor(x[0].shape))
AttributeError: 'list' object has no attribute 'shape'
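What finally takes the job down with exit code 1 is not training itself but this FLOPs report: paddleseg/core/train.py calls paddle.flops after the last iteration, and paddle's count_io_info forward-post-hook assumes each sub-layer's first input is a tensor (x[0].shape). TopFormer's backbone passes a list of multi-scale feature maps into self.ppa, so x[0] is a list and has no .shape. The final eval and the best checkpoint (mIoU 0.3249 at iter 157000) were already written before the crash, so only the complexity report is lost. A minimal defensive sketch, assuming one simply wants the FLOPs step to be skippable rather than fatal (the model and input-shape names below are placeholders, not PaddleSeg's actual call site):

import paddle

def safe_flops(model, input_shape=(1, 3, 512, 512)):
    # Hedged sketch: run paddle.flops for the complexity report, but never let a
    # counting failure (e.g. a hook hitting a list-of-tensors input, as above)
    # turn an otherwise finished training run into a non-zero exit code.
    try:
        return paddle.flops(model, list(input_shape), print_detail=False)
    except Exception as exc:
        print(f"FLOPs counting skipped: {exc!r}")
        return None

Patching count_io_info to skip non-tensor inputs, or moving to a Paddle/PaddleSeg release where the hook tolerates list inputs if one exists, would be the more permanent alternatives.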