Dev Builds » 20210724-1604

NCM plays 20,000 games between each Stockfish dev build and Stockfish 7. This yields an approximate Elo difference for the build and establishes confidence in its strength.
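
The Elo figures in the tables below can be cross-checked from the raw game counts. The following is a minimal Python sketch assuming the standard logistic Elo model with a normal-approximation error margin; NCM does not document its exact formula here, so treat the function as an illustration rather than the site's actual code. Applied to the totals row of the Summary table it reproduces roughly +422 +/- 6.

    import math

    def elo_estimate(wins, losses, draws, z=1.96):
        """Approximate Elo difference and ~95% margin from game counts.

        Assumes the standard logistic Elo model; NCM's exact method may differ.
        """
        games = wins + losses + draws
        score = (wins + 0.5 * draws) / games           # fraction of points scored
        elo = -400 * math.log10(1 / score - 1)         # logistic Elo difference

        # Normal-approximation error margin on the score, propagated to Elo.
        w, l, d = wins / games, losses / games, draws / games
        var = w * (1 - score) ** 2 + l * (0 - score) ** 2 + d * (0.5 - score) ** 2
        sigma = math.sqrt(var / games)
        lo = -400 * math.log10(1 / (score - z * sigma) - 1)
        hi = -400 * math.log10(1 / (score + z * sigma) - 1)
        return elo, (hi - lo) / 2

    # Totals row of the Summary table: 20000 games, 16807 wins, 48 losses, 3145 draws
    print(elo_estimate(16807, 48, 3145))  # roughly (+421.9, +/- 6.1)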

Summary

Host      | Duration | Avg Base NPS | Games | Wins  | Losses | Draws | Elo
ncm-et-3  | 08:18:50 | 1960617      | 3339  | 2834  | 7      | 498   | +432.29 +/- 15.27
ncm-et-4  | 08:18:36 | 1963125      | 3352  | 2810  | 4      | 538   | +420.9 +/- 14.67
ncm-et-9  | 08:18:12 | 1956941      | 3340  | 2830  | 9      | 501   | +429.79 +/- 15.23
ncm-et-10 | 08:18:11 | 1952919      | 3290  | 2772  | 9      | 509   | +424.06 +/- 15.1
ncm-et-13 | 08:18:20 | 1960005      | 3346  | 2769  | 12     | 565   | +406.17 +/- 14.33
ncm-et-15 | 08:18:11 | 1958424      | 3333  | 2792  | 7      | 534   | +419.13 +/- 14.73
Total     |          |              | 20000 | 16807 | 48     | 3145  | +421.87 +/- 6.06

Test Detail

ID     | Host      | Started (UTC)    | Duration | Base NPS | Games | Wins | Losses | Draws | Elo
150260 | ncm-et-10 | 2021-07-25 01:49 | 00:44:21 | 1954197  | 290   | 243  | 1      | 46    | +417.87 +/- 51.27
150259 | ncm-et-15 | 2021-07-25 01:44 | 00:48:58 | 1969474  | 333   | 284  | 1      | 48    | +436.24 +/- 50.19
150258 | ncm-et-3  | 2021-07-25 01:44 | 00:49:55 | 1965383  | 339   | 293  | 0      | 46    | +455.18 +/- 51.24
150257 | ncm-et-13 | 2021-07-25 01:42 | 00:50:54 | 1959239  | 346   | 282  | 3      | 61    | +387.92 +/- 44.29
150256 | ncm-et-9  | 2021-07-25 01:42 | 00:51:24 | 1951667  | 340   | 281  | 1      | 58    | +405.7 +/- 45.4
150255 | ncm-et-4  | 2021-07-25 01:39 | 00:54:07 | 1952705  | 352   | 303  | 0      | 49    | +450.41 +/- 49.56
150254 | ncm-et-10 | 2021-07-25 00:32 | 01:16:18 | 1949502  | 500   | 424  | 1      | 75    | +431.48 +/- 39.8
150253 | ncm-et-15 | 2021-07-25 00:29 | 01:14:11 | 1962958  | 500   | 426  | 3      | 71    | +431.48 +/- 41.0
150252 | ncm-et-9  | 2021-07-25 00:27 | 01:13:30 | 1964153  | 500   | 435  | 1      | 64    | +460.32 +/- 43.24
150251 | ncm-et-3  | 2021-07-25 00:27 | 01:16:07 | 1952217  | 500   | 416  | 2      | 82    | +410.58 +/- 38.02
150250 | ncm-et-13 | 2021-07-25 00:25 | 01:16:29 | 1948434  | 500   | 417  | 0      | 83    | +417.31 +/- 37.67
150249 | ncm-et-4  | 2021-07-25 00:24 | 01:13:58 | 1966466  | 500   | 415  | 1      | 84    | +410.58 +/- 37.5
150248 | ncm-et-10 | 2021-07-24 23:15 | 01:15:54 | 1954059  | 500   | 405  | 1      | 94    | +389.56 +/- 35.35
150247 | ncm-et-15 | 2021-07-24 23:13 | 01:15:17 | 1952991  | 500   | 420  | 0      | 80    | +424.27 +/- 38.4
150246 | ncm-et-9  | 2021-07-24 23:11 | 01:15:25 | 1956027  | 500   | 401  | 1      | 98    | +381.7 +/- 34.59
150245 | ncm-et-3  | 2021-07-24 23:11 | 01:15:46 | 1957938  | 500   | 425  | 0      | 75    | +436.43 +/- 39.73
150244 | ncm-et-13 | 2021-07-24 23:11 | 01:13:49 | 1967388  | 500   | 424  | 3      | 73    | +426.65 +/- 40.41
150243 | ncm-et-4  | 2021-07-24 23:10 | 01:13:43 | 1970960  | 500   | 410  | 0      | 90    | +401.92 +/- 36.1
150242 | ncm-et-10 | 2021-07-24 21:59 | 01:15:34 | 1952224  | 500   | 419  | 3      | 78    | +415.05 +/- 39.05
150241 | ncm-et-15 | 2021-07-24 21:58 | 01:14:24 | 1963554  | 500   | 407  | 1      | 92    | +393.6 +/- 35.75
150240 | ncm-et-13 | 2021-07-24 21:57 | 01:13:16 | 1965073  | 500   | 401  | 1      | 98    | +381.7 +/- 34.59
150239 | ncm-et-3  | 2021-07-24 21:56 | 01:14:00 | 1963691  | 500   | 415  | 4      | 81    | +404.05 +/- 38.29
150238 | ncm-et-9  | 2021-07-24 21:55 | 01:16:11 | 1928992  | 500   | 422  | 4      | 74    | +419.61 +/- 40.12
150237 | ncm-et-4  | 2021-07-24 21:53 | 01:16:03 | 1946341  | 500   | 422  | 1      | 77    | +426.65 +/- 39.25
150236 | ncm-et-10 | 2021-07-24 20:43 | 01:15:16 | 1953280  | 500   | 429  | 1      | 70    | +444.09 +/- 41.26
150235 | ncm-et-15 | 2021-07-24 20:42 | 01:15:41 | 1946362  | 500   | 403  | 0      | 97    | +387.56 +/- 34.71
150234 | ncm-et-13 | 2021-07-24 20:41 | 01:14:39 | 1964938  | 500   | 412  | 1      | 87    | +404.05 +/- 36.82
150233 | ncm-et-9  | 2021-07-24 20:40 | 01:14:20 | 1960023  | 500   | 435  | 1      | 64    | +460.32 +/- 43.24
150232 | ncm-et-3  | 2021-07-24 20:39 | 01:16:35 | 1956401  | 500   | 421  | 0      | 79    | +426.65 +/- 38.66
150231 | ncm-et-4  | 2021-07-24 20:38 | 01:14:21 | 1960065  | 500   | 436  | 0      | 64    | +466.03 +/- 43.18
150230 | ncm-et-10 | 2021-07-24 19:27 | 01:15:02 | 1953129  | 500   | 435  | 1      | 64    | +460.32 +/- 43.24
150229 | ncm-et-15 | 2021-07-24 19:27 | 01:14:52 | 1958036  | 500   | 421  | 1      | 78    | +424.28 +/- 38.99
150228 | ncm-et-9  | 2021-07-24 19:26 | 01:13:19 | 1971418  | 500   | 427  | 1      | 72    | +438.95 +/- 40.66
150227 | ncm-et-13 | 2021-07-24 19:25 | 01:15:38 | 1955110  | 500   | 412  | 1      | 87    | +404.05 +/- 36.82
150226 | ncm-et-3  | 2021-07-24 19:25 | 01:13:28 | 1969135  | 500   | 420  | 0      | 80    | +424.27 +/- 38.4
150225 | ncm-et-4  | 2021-07-24 19:24 | 01:14:01 | 1970354  | 500   | 407  | 2      | 91    | +391.57 +/- 36.01
150224 | ncm-et-15 | 2021-07-24 18:11 | 01:14:48 | 1955593  | 500   | 431  | 1      | 68    | +449.35 +/- 41.89
150223 | ncm-et-10 | 2021-07-24 18:11 | 01:15:46 | 1954043  | 500   | 417  | 1      | 82    | +415.05 +/- 37.98
150222 | ncm-et-9  | 2021-07-24 18:11 | 01:14:03 | 1966309  | 500   | 429  | 0      | 71    | +446.7 +/- 40.89
150221 | ncm-et-3  | 2021-07-24 18:11 | 01:12:59 | 1959560  | 500   | 444  | 1      | 55    | +487.45 +/- 46.83
150220 | ncm-et-13 | 2021-07-24 18:11 | 01:13:35 | 1959857  | 500   | 421  | 3      | 76    | +419.61 +/- 39.58
150219 | ncm-et-4  | 2021-07-24 18:11 | 01:12:23 | 1974987  | 500   | 417  | 0      | 83    | +417.31 +/- 37.67

Commit

Commit ID: b939c805139e4b37f04fbf177f580c35ebe9f130
Author: MichaelB7
Date: 2021-07-24 16:04:59 UTC
Update the default net to nn-76a8a7ffb820.nnue.

Combined work by Sergio Vieri, Michael Byrne, and Jonathan D (aka SFisGod), based on top of previous developments, by restarts from good nets.

Sergio generated the net https://tests.stockfishchess.org/api/nn/nn-d8609abe8caf.nnue:

The initial net nn-d8609abe8caf.nnue is trained by generating around 16B of training data from the last master net nn-9e3c6298299a.nnue, then trained, continuing from the master net, with lambda=0.2 and a sampling ratio of 1. Starting with LR=2e-3, dropping LR with a factor of 0.5 until it reaches LR=5e-4. in_scaling is set to 361. No other significant changes were made to the pytorch trainer.

Training data gen command (generates in chunks of 200k positions):

    generate_training_data min_depth 9 max_depth 11 count 200000 random_move_count 10 random_move_max_ply 80 random_multi_pv 12 random_multi_pv_diff 100 random_multi_pv_depth 8 write_min_ply 10 eval_limit 1500 book noob_3moves.epd output_file_name gendata/$(date +"%Y%m%d-%H%M")_${HOSTNAME}.binpack

PyTorch trainer command (note that this only trains for 20 epochs; train repeatedly until convergence):

    python train.py --features "HalfKAv2^" --max_epochs 20 --smart-fen-skipping --random-fen-skipping 500 --batch-size 8192 --default_root_dir $dir --seed $RANDOM --threads 4 --num-workers 32 --gpus $gpuids --track_grad_norm 2 --gradient_clip_val 0.05 --lambda 0.2 --log_every_n_steps 50 $resumeopt $data $val

See https://github.com/sergiovieri/Stockfish/tree/tools_mod/rl for the scripts used to generate data.

Based on that, Michael generated nn-76a8a7ffb820.nnue in the following way. The net being submitted was trained with the pytorch trainer (https://github.com/glinscott/nnue-pytorch):

    python train.py i:/bin/all.binpack i:/bin/all.binpack --gpus 1 --threads 4 --num-workers 30 --batch-size 16384 --progress_bar_refresh_rate 30 --smart-fen-skipping --random-fen-skipping 3 --features=HalfKAv2^ --auto_lr_find True --lambda=1.0 --max_epochs=240 --seed %random%%random% --default_root_dir exp/run_109 --resume-from-model ./pt/nn-d8609abe8caf.pt

This run is thus started from Sergio Vieri's net nn-d8609abe8caf.nnue.

all.binpack equaled 4 parts Wrong_NNUE_2.binpack (https://drive.google.com/file/d/1seGNOqcVdvK_vPNq98j-zV3XPE5zWAeq/view?usp=sharing) plus two parts of Training_Data.binpack (https://drive.google.com/file/d/1RFkQES3DpsiJqsOtUshENtzPfFgUmEff/view?usp=sharing). Each set was concatenated together, making one large Wrong_NNUE_2 binpack and one large Training_Data binpack, so they were approximately equal in size. They were then interleaved together. The idea was to give Wrong_NNUE_2.binpack closer to equal weighting with the Training_Data binpack.

model.py modifications:

    loss = torch.pow(torch.abs(p - q), 2.6).mean()
    LR = 8.0e-5
    optimizer = ranger.Ranger(train_params, betas=(.90, 0.999), eps=1.0e-7, gc_loc=False, use_gc=False)
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.992)

LR = 8.0e-5 was calculated as 1.5e-3*(.992^360); the idea here was to take a highly trained net and just use all.binpack as a finishing micro-refinement touch for the last 2 Elo or so. This net was discovered on the 59th epoch.

For this micro optimization, I had set the period to "5" in train.py, which changes the checkpoint output so that every 5th checkpoint file is created.

The final touches were to adjust the NNUE scale, as was done by Jonathan in tests running at the same time.
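
For context, the quoted model.py lines and the LR arithmetic fit together as sketched below. This is illustrative only: p and q are the two tensors compared in the nnue-pytorch loss (the commit does not redefine them), the 360-step decay is an interpretation of "1.5e-3*(.992^360)", and torch.optim.SGD stands in for ranger.Ranger, which lives in the nnue-pytorch repository.

    import torch

    # The commit's replacement loss: mean of |p - q|**2.6, with p and q the
    # two score tensors compared in the nnue-pytorch loss.
    def loss_fn(p, q):
        return torch.pow(torch.abs(p - q), 2.6).mean()

    # LR = 8.0e-5 "calculated as 1.5e-3*(.992^360)": presumably the starting LR
    # of the earlier long run decayed through 360 StepLR steps at gamma=0.992.
    start_lr, gamma, steps = 1.5e-3, 0.992, 360
    resume_lr = start_lr * gamma ** steps
    print(f"{resume_lr:.2e}")  # ~8.3e-05, close to the 8.0e-5 quoted in the commit

    # Matching scheduler setup; torch.optim.SGD stands in for ranger.Ranger.
    params = [torch.nn.Parameter(torch.zeros(1))]
    optimizer = torch.optim.SGD(params, lr=resume_lr)
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=gamma)
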
Passed LTC: https://tests.stockfishchess.org/tests/view/60fa45aed8a6b65b2f3a77a4
LLR: 2.94 (-2.94,2.94) <0.50,3.50>
Total: 53040 W: 1732 L: 1575 D: 49733
Ptnml(0-2): 14, 1432, 23474, 1583, 17

Passed STC: https://tests.stockfishchess.org/tests/view/60f9fee2d8a6b65b2f3a7775
LLR: 2.94 (-2.94,2.94) <-0.50,2.50>
Total: 37928 W: 3178 L: 3001 D: 31749
Ptnml(0-2): 100, 2446, 13695, 2623, 100

Closes https://github.com/official-stockfish/Stockfish/pull/3626

Bench: 5169957
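
The Ptnml(0-2) lines report pentanomial counts: how many game pairs scored 0, 0.5, 1, 1.5 or 2 points for the patched build. A rough classical Elo estimate can be recovered from those counts as sketched below; fishtest's SPRT bounds may use a different (normalized) Elo scaling, so this is only an informal cross-check.

    import math

    def pentanomial_elo(ptnml):
        """Naive logistic Elo from pentanomial game-pair counts (informal only;
        fishtest's SPRT may use a different Elo scaling)."""
        pairs = sum(ptnml)
        points = sum(0.5 * i * n for i, n in enumerate(ptnml))  # pair scores 0..2
        score = points / (2 * pairs)                            # per-game score
        return 400 * math.log10(score / (1 - score))

    print(pentanomial_elo([14, 1432, 23474, 1583, 17]))    # LTC: about +1.0 Elo
    print(pentanomial_elo([100, 2446, 13695, 2623, 100]))  # STC: about +1.6 Elo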