Dev Builds » 20210724-1604

You are viewing an old NCM Stockfish dev build test; the most recent dev build tests use Stockfish 15 as the baseline.


NCM plays each Stockfish dev build 20,000 times against Stockfish 14. This yields an approximate Elo difference and establishes confidence in the strength of the dev builds.
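The "Standard Elo" figures below follow from the win/loss/draw totals via the usual logistic Elo model. A minimal sketch (the function name is mine, not NCM code):

```python
import math

def elo_from_wld(wins, losses, draws):
    """Elo difference implied by a score fraction under the logistic
    model: score = 1 / (1 + 10^(-elo/400))."""
    n = wins + losses + draws
    score = (wins + 0.5 * draws) / n
    return -400.0 * math.log10(1.0 / score - 1.0)

# Overall result from the summary below: 4973 wins, 4720 losses,
# 10307 draws over 20,000 games
print(f"{elo_from_wld(4973, 4720, 10307):+.2f}")  # +4.40, matching the summary
```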

Summary

Host  Duration  Avg Base NPS  Games  W L D  Standard Elo  Ptnml(0-2)  Gamepair Elo
ncm-dbt-01 09:34:51 1119219 3166 795 734 1637 +6.69 +/- 5.68 0 316 893 371 3 +12.74 +/- 11.3
ncm-dbt-02 10:21:13 1329517 3408 833 815 1760 +1.84 +/- 5.62 1 384 915 404 0 +3.87 +/- 11.23
ncm-dbt-03 10:24:43 1329284 3424 836 802 1786 +3.45 +/- 5.75 0 397 888 423 4 +6.09 +/- 11.42
ncm-dbt-04 10:25:52 1316941 3410 830 805 1775 +2.55 +/- 5.74 0 396 893 411 5 +4.08 +/- 11.38
ncm-dbt-05 10:17:56 1325198 3402 861 782 1759 +8.07 +/- 5.76 3 364 888 443 3 +16.15 +/- 11.42
ncm-dbt-06 09:35:06 1223241 3190 818 782 1590 +3.92 +/- 5.91 2 356 844 390 3 +7.63 +/- 11.7
Total 20000 4973 4720 10307 +4.4 +/- 2.35 6 2213 5321 2442 18 +8.37 +/- 4.66
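The Ptnml(0-2) columns count game-pair outcomes (each opening is played with both colors), which gives correlation-aware error bars that raw per-game W/L/D statistics cannot. NCM's exact formulas are not shown on this page, but a standard pentanomial calculation in the fishtest style reproduces the summary row's "+4.4 +/- 2.35":

```python
import math

# Pentanomial totals from the summary row: counts of game pairs scoring
# 0, 0.5, 1, 1.5 and 2 points out of 2 (the Ptnml(0-2) column).
ptnml = [6, 2213, 5321, 2442, 18]

pairs = sum(ptnml)
scores = [0.0, 0.25, 0.5, 0.75, 1.0]  # per-game score of each pair outcome
mean = sum(c * s for c, s in zip(ptnml, scores)) / pairs
# Variance taken over pairs, so the correlation between the two games
# of a pair is accounted for.
var = sum(c * (s - mean) ** 2 for c, s in zip(ptnml, scores)) / pairs
stderr = math.sqrt(var / pairs)

elo = -400.0 * math.log10(1.0 / mean - 1.0)
# Propagate the score error to Elo via the derivative of the logistic map
elo_err = 1.96 * stderr * 400.0 / (math.log(10) * mean * (1.0 - mean))
print(f"{elo:+.2f} +/- {elo_err:.2f}")  # +4.40 +/- 2.35
```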

Test Detail

ID  Host  Base NPS  Games  W L D  Standard Elo  Ptnml(0-2)  Gamepair Elo
189018 ncm-dbt-05 1234222 152 40 32 80 +18.3 +/- 29.59 0 17 35 23 1 +32.09 +/- 57.95
189017 ncm-dbt-04 1231079 160 43 44 73 -2.17 +/- 28.61 0 23 35 22 0 -4.34 +/- 57.62
189016 ncm-dbt-02 1236258 158 40 32 86 +17.61 +/- 25.68 0 14 43 22 0 +35.3 +/- 52.06
189015 ncm-dbt-01 1094437 166 34 33 99 +2.09 +/- 24.31 0 17 48 18 0 +4.19 +/- 48.86
189014 ncm-dbt-03 1238458 174 49 47 78 +3.99 +/- 26.0 0 20 46 20 1 +3.99 +/- 50.47
189013 ncm-dbt-06 1224007 190 53 48 89 +9.14 +/- 23.48 0 19 52 24 0 +18.3 +/- 47.28
189012 ncm-dbt-05 1227833 500 129 108 263 +14.6 +/- 14.65 1 45 136 68 0 +30.65 +/- 29.13
189011 ncm-dbt-02 1226767 500 128 134 238 -4.17 +/- 15.05 0 64 128 58 0 -8.34 +/- 30.16
189010 ncm-dbt-04 1215888 500 125 111 264 +9.73 +/- 14.64 0 50 137 62 1 +18.08 +/- 29.01
189009 ncm-dbt-01 1118605 500 129 123 248 +4.17 +/- 13.89 0 48 149 52 1 +6.95 +/- 27.43
189008 ncm-dbt-03 1244022 500 110 117 273 -4.86 +/- 15.35 0 67 123 60 0 -9.73 +/- 30.78
189007 ncm-dbt-06 1216224 500 137 134 229 +2.08 +/- 15.95 0 66 116 67 1 +2.78 +/- 31.62
189006 ncm-dbt-05 1233120 500 124 121 255 +2.08 +/- 14.86 0 57 134 58 1 +2.78 +/- 29.41
189005 ncm-dbt-02 1239065 500 122 130 248 -5.56 +/- 14.54 0 61 136 53 0 -11.12 +/- 29.15
189004 ncm-dbt-04 1227097 500 120 120 260 -0.0 +/- 15.05 0 61 128 61 0 -0.0 +/- 30.16
189003 ncm-dbt-03 1250794 500 122 119 259 +2.08 +/- 14.86 0 58 131 61 0 +4.17 +/- 29.79
189002 ncm-dbt-01 1131569 500 136 125 239 +7.64 +/- 14.59 0 51 138 60 1 +13.9 +/- 28.89
189001 ncm-dbt-06 1221583 500 128 123 249 +3.47 +/- 14.74 0 56 133 61 0 +6.95 +/- 29.53
189000 ncm-dbt-05 1237435 500 145 111 244 +23.66 +/- 14.96 0 45 126 79 0 +47.55 +/- 30.4
188999 ncm-dbt-04 1239395 500 117 118 265 -0.69 +/- 14.87 0 60 131 59 0 -1.39 +/- 29.79
188998 ncm-dbt-02 1224014 500 113 112 275 +0.69 +/- 14.1 0 53 143 54 0 +1.39 +/- 28.24
188997 ncm-dbt-03 1238249 500 134 117 249 +11.82 +/- 14.68 0 50 133 67 0 +23.66 +/- 29.53
188996 ncm-dbt-06 1227146 500 139 112 249 +18.78 +/- 14.98 1 45 130 74 0 +39.08 +/- 29.9
188995 ncm-dbt-01 1127373 500 121 111 268 +6.95 +/- 14.4 0 51 138 61 0 +13.9 +/- 28.89
188994 ncm-dbt-05 1222789 500 129 116 255 +9.04 +/- 14.71 1 49 136 64 0 +19.48 +/- 29.14
188993 ncm-dbt-02 1235998 500 115 119 266 -2.78 +/- 14.42 0 58 138 54 0 -5.56 +/- 28.89
188992 ncm-dbt-04 1221192 500 121 115 264 +4.17 +/- 14.8 0 54 138 56 2 +5.56 +/- 28.89
188991 ncm-dbt-03 1233812 500 127 117 256 +6.95 +/- 15.28 0 57 127 65 1 +12.51 +/- 30.29
188990 ncm-dbt-01 1105651 500 128 114 258 +9.73 +/- 14.51 0 49 139 61 1 +18.08 +/- 28.75
188989 ncm-dbt-06 1214427 500 113 118 269 -3.47 +/- 14.48 1 55 143 50 1 -6.95 +/- 28.24
188988 ncm-dbt-05 1242611 500 118 116 266 +1.39 +/- 16.24 1 66 114 68 1 +2.78 +/- 31.85
188987 ncm-dbt-02 1237590 500 116 110 274 +4.17 +/- 14.41 0 53 138 59 0 +8.34 +/- 28.89
188986 ncm-dbt-01 1131690 500 119 120 261 -0.69 +/- 14.36 0 56 139 55 0 -1.39 +/- 28.76
188985 ncm-dbt-03 1209606 500 118 118 264 0.0 +/- 15.3 0 62 127 60 1 -1.39 +/- 30.29
188984 ncm-dbt-04 1224069 500 119 112 269 +4.86 +/- 14.6 0 54 135 61 0 +9.73 +/- 29.28
188983 ncm-dbt-06 1221112 500 130 122 248 +5.56 +/- 14.28 0 50 143 56 1 +9.73 +/- 28.23
188982 ncm-dbt-05 1225485 500 117 118 265 -0.69 +/- 13.83 0 52 147 51 0 -1.39 +/- 27.7
188981 ncm-dbt-02 1246496 500 128 109 263 +13.21 +/- 15.65 1 54 120 75 0 +27.85 +/- 31.15
188980 ncm-dbt-06 1238190 500 118 125 257 -4.86 +/- 15.11 0 65 127 58 0 -9.73 +/- 30.29
188979 ncm-dbt-03 1250486 500 114 109 277 +3.47 +/- 14.48 0 54 137 59 0 +6.95 +/- 29.02
188978 ncm-dbt-01 1125208 500 128 108 264 +13.9 +/- 14.08 0 44 142 64 0 +27.85 +/- 28.34
188977 ncm-dbt-04 1223453 500 117 124 259 -4.86 +/- 15.11 0 65 127 58 0 -9.73 +/- 30.29
174545 ncm-dbt-02 1462580 50 12 14 24 -13.9 +/- 47.24 0 7 13 5 0 -27.85 +/- 96.81
174544 ncm-dbt-04 1441084 50 17 14 19 +20.87 +/- 40.37 0 3 16 6 0 +41.89 +/- 82.84
174543 ncm-dbt-03 1457961 50 8 11 31 -20.87 +/- 40.37 0 6 16 3 0 -41.89 +/- 82.84
174542 ncm-dbt-05 1454568 50 11 17 22 -41.89 +/- 53.17 0 11 9 5 0 -85.04 +/- 115.03
174540 ncm-dbt-02 1459480 50 16 11 23 +34.86 +/- 47.98 0 4 12 9 0 +70.44 +/- 101.42
174539 ncm-dbt-04 1447638 50 13 12 25 +6.95 +/- 49.39 0 6 12 7 0 +13.9 +/- 100.98
174538 ncm-dbt-03 1455737 50 14 14 22 -0.01 +/- 58.33 0 8 10 6 1 -13.9 +/- 108.98
174536 ncm-dbt-05 1453251 50 10 8 32 +13.9 +/- 38.37 0 3 17 5 0 +27.85 +/- 78.07
174531 ncm-dbt-02 1463927 50 15 13 22 +13.9 +/- 43.02 0 4 15 6 0 +27.85 +/- 87.86
174530 ncm-dbt-03 1460496 50 13 9 28 +27.85 +/- 42.16 0 3 15 7 0 +56.07 +/- 87.47
174529 ncm-dbt-04 1441953 50 13 14 23 -6.96 +/- 59.92 0 9 9 6 1 -27.85 +/- 113.01
174528 ncm-dbt-05 1458406 50 13 13 24 0.0 +/- 47.48 0 6 13 6 0 -0.0 +/- 96.78
174527 ncm-dbt-03 1455893 50 13 13 24 0.0 +/- 47.48 0 6 13 6 0 -0.0 +/- 96.78
174526 ncm-dbt-02 1460640 50 16 15 19 +6.95 +/- 49.39 0 6 12 7 0 +13.9 +/- 100.98
174525 ncm-dbt-04 1445008 50 13 8 29 +34.86 +/- 43.75 0 3 14 8 0 +70.44 +/- 92.08
174524 ncm-dbt-05 1456625 50 14 10 26 +27.85 +/- 50.47 0 5 11 9 0 +56.07 +/- 105.56
174523 ncm-dbt-03 1455900 50 14 11 25 +20.87 +/- 52.71 0 6 10 9 0 +41.89 +/- 109.37
174522 ncm-dbt-02 1461390 50 12 16 22 -27.85 +/- 37.35 0 6 17 2 0 -56.07 +/- 77.18
174521 ncm-dbt-05 1456035 50 11 12 27 -6.95 +/- 53.12 0 8 10 7 0 -13.9 +/- 108.98
174520 ncm-dbt-04 1445447 50 12 13 25 -6.96 +/- 56.61 0 8 11 5 1 -27.85 +/- 105.14

Commit

Commit ID b939c805139e4b37f04fbf177f580c35ebe9f130
Author MichaelB7
Date 2021-07-24 16:04:59 UTC
Update the default net to nn-76a8a7ffb820.nnue.

Combined work by Sergio Vieri, Michael Byrne, and Jonathan D (aka SFisGod), built on top of previous developments by restarting from good nets.

Sergio generated the net https://tests.stockfishchess.org/api/nn/nn-d8609abe8caf.nnue: the initial net nn-d8609abe8caf.nnue was trained by generating around 16B of training data from the last master net nn-9e3c6298299a.nnue, then training further, continuing from the master net, with lambda=0.2 and a sampling ratio of 1. Starting with LR=2e-3, the LR was dropped by a factor of 0.5 until it reached LR=5e-4. in_scaling was set to 361. No other significant changes were made to the pytorch trainer.

Training data generation command (generates in chunks of 200k positions):

    generate_training_data min_depth 9 max_depth 11 count 200000 random_move_count 10 random_move_max_ply 80 random_multi_pv 12 random_multi_pv_diff 100 random_multi_pv_depth 8 write_min_ply 10 eval_limit 1500 book noob_3moves.epd output_file_name gendata/$(date +"%Y%m%d-%H%M")_${HOSTNAME}.binpack

PyTorch trainer command (note that this trains for only 20 epochs at a time; training is repeated until convergence):

    python train.py --features "HalfKAv2^" --max_epochs 20 --smart-fen-skipping --random-fen-skipping 500 --batch-size 8192 --default_root_dir $dir --seed $RANDOM --threads 4 --num-workers 32 --gpus $gpuids --track_grad_norm 2 --gradient_clip_val 0.05 --lambda 0.2 --log_every_n_steps 50 $resumeopt $data $val

See https://github.com/sergiovieri/Stockfish/tree/tools_mod/rl for the scripts used to generate data.
Based on that, Michael generated nn-76a8a7ffb820.nnue in the following way. The submitted net was trained with the pytorch trainer (https://github.com/glinscott/nnue-pytorch):

    python train.py i:/bin/all.binpack i:/bin/all.binpack --gpus 1 --threads 4 --num-workers 30 --batch-size 16384 --progress_bar_refresh_rate 30 --smart-fen-skipping --random-fen-skipping 3 --features=HalfKAv2^ --auto_lr_find True --lambda=1.0 --max_epochs=240 --seed %random%%random% --default_root_dir exp/run_109 --resume-from-model ./pt/nn-d8609abe8caf.pt

This run was thus started from Sergio Vieri's net nn-d8609abe8caf.nnue.

all.binpack consisted of four parts of Wrong_NNUE_2.binpack (https://drive.google.com/file/d/1seGNOqcVdvK_vPNq98j-zV3XPE5zWAeq/view?usp=sharing) plus two parts of Training_Data.binpack (https://drive.google.com/file/d/1RFkQES3DpsiJqsOtUshENtzPfFgUmEff/view?usp=sharing). Each set was concatenated together, making one large Wrong_NNUE_2 binpack and one large Training_Data binpack of approximately equal size, and the two were then interleaved. The idea was to give Wrong_NNUE_2.binpack closer to equal weighting with the Training_Data binpack.

model.py modifications:

    loss = torch.pow(torch.abs(p - q), 2.6).mean()
    LR = 8.0e-5  # calculated as 1.5e-3 * (0.992 ** 360)
    optimizer = ranger.Ranger(train_params, betas=(.90, 0.999), eps=1.0e-7, gc_loc=False, use_gc=False)
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.992)

The idea behind the LR was to take a highly trained net and use all.binpack only as a finishing micro-refinement touch for the last 2 Elo or so. This net was discovered on the 59th epoch. For this micro-optimization, the checkpoint period in train.py was set to 5, so that every 5th checkpoint file is created. The final touches were to adjust the NNUE scale, as was done by Jonathan in tests running at the same time.
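As a quick sanity check of the LR arithmetic quoted in the commit message (1.5e-3 * 0.992^360, i.e. a StepLR decay of gamma=0.992 applied once per epoch for 360 epochs):

```python
# Sanity check of the commit's LR arithmetic: a StepLR schedule with
# gamma=0.992 stepped once per epoch takes an initial LR of 1.5e-3 down to
# 1.5e-3 * 0.992**360 after 360 epochs.
lr = 1.5e-3 * 0.992 ** 360
print(f"{lr:.2e}")  # 8.32e-05, which the commit rounds to 8.0e-5
```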
Passed LTC: https://tests.stockfishchess.org/tests/view/60fa45aed8a6b65b2f3a77a4
LLR: 2.94 (-2.94,2.94) <0.50,3.50> Total: 53040 W: 1732 L: 1575 D: 49733 Ptnml(0-2): 14, 1432, 23474, 1583, 17

Passed STC: https://tests.stockfishchess.org/tests/view/60f9fee2d8a6b65b2f3a7775
LLR: 2.94 (-2.94,2.94) <-0.50,2.50> Total: 37928 W: 3178 L: 3001 D: 31749 Ptnml(0-2): 100, 2446, 13695, 2623, 100

Closes https://github.com/official-stockfish/Stockfish/pull/3626

Bench: 5169957
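The LLR bounds (-2.94, 2.94) in these fishtest results come from Wald's sequential probability ratio test. Assuming error rates of alpha = beta = 0.05 (fishtest's usual choice, not stated on this page), the stopping thresholds work out as:

```python
import math

# Wald SPRT stopping thresholds: accept H1 (the patch is stronger) when the
# log-likelihood ratio rises above `upper`, accept H0 when it falls below
# `lower`. alpha = beta = 0.05 is assumed here; it reproduces the
# (-2.94, 2.94) bounds shown in the test results above.
alpha = beta = 0.05
upper = math.log((1 - beta) / alpha)
lower = math.log(beta / (1 - alpha))
print(f"({lower:.2f}, {upper:.2f})")  # (-2.94, 2.94)
```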
Copyright 2011–2024 Next Chess Move LLC