Dev Builds » 20230606-1917

NCM plays each Stockfish dev build 20,000 times against Stockfish 14. This yields an approximate Elo difference and establishes confidence in the strength of the dev builds.
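As a rough illustration (not NCM's actual code), the Standard Elo and Gamepair Elo figures in the tables below can be reproduced from the raw counts with the usual logistic model: Standard Elo from the win/loss/draw totals, and Gamepair Elo by scoring each game pair from the pentanomial Ptnml(0-2) counts as a single loss/draw/win result. The function names are illustrative, and the `+/-` error bounds involve variance estimates not shown here.

```python
import math

def elo_from_score(score):
    """Convert an expected score in (0, 1) to an Elo difference
    using the standard logistic model."""
    return -400 * math.log10(1 / score - 1)

def standard_elo(wins, losses, draws):
    """Elo difference estimated from raw win/loss/draw counts."""
    games = wins + losses + draws
    return elo_from_score((wins + 0.5 * draws) / games)

def gamepair_elo(ptnml):
    """Elo difference from pentanomial game-pair counts
    Ptnml(0-2) = [LL, LD, DD or WL, WD, WW], treating each pair
    as one trinomial result (pair lost / drawn / won)."""
    ll, ld, dd, wd, ww = ptnml
    pairs = sum(ptnml)
    return elo_from_score((ww + wd + 0.5 * dd) / pairs)

# Overall totals from the summary table below:
print(round(standard_elo(8359, 1913, 9728), 2))           # ~ +116.12
print(round(gamepair_elo([2, 335, 3126, 6289, 248]), 2))  # ~ +251.89
```

Both results agree with the totals row of the summary table, which suggests this is essentially the model NCM's reporting uses.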

Summary

| Host | Duration | Avg Base NPS | Games | W / L / D | Standard Elo | Ptnml(0-2) | Gamepair Elo |
| --- | --- | --- | --- | --- | --- | --- | --- |
| ncm-dbt-01 | 09:56:16 | 1118733 | 3358 | 1392 / 326 / 1640 | +114.24 +/- 5.47 | 1, 60, 526, 1056, 36 | +248.56 +/- 14.86 |
| ncm-dbt-02 | 09:50:47 | 1230206 | 3292 | 1339 / 295 / 1658 | +114.12 +/- 5.45 | 0, 53, 533, 1023, 37 | +247.29 +/- 14.76 |
| ncm-dbt-03 | 09:56:53 | 1234870 | 3338 | 1400 / 306 / 1632 | +118.23 +/- 5.35 | 0, 47, 519, 1065, 38 | +259.16 +/- 14.96 |
| ncm-dbt-04 | 09:56:05 | 1225238 | 3356 | 1422 / 305 / 1629 | +120.22 +/- 5.64 | 0, 60, 500, 1059, 59 | +257.89 +/- 15.25 |
| ncm-dbt-05 | 09:53:17 | 1231540 | 3320 | 1410 / 336 / 1574 | +116.58 +/- 5.43 | 0, 54, 515, 1054, 37 | +254.56 +/- 15.02 |
| ncm-dbt-06 | 09:56:48 | 1234441 | 3336 | 1396 / 345 / 1595 | +113.31 +/- 5.55 | 1, 61, 533, 1032, 41 | +244.16 +/- 14.77 |
| Total | | | 20000 | 8359 / 1913 / 9728 | +116.12 +/- 2.24 | 2, 335, 3126, 6289, 248 | +251.89 +/- 6.09 |

Test Detail

| ID | Host | Base NPS | Games | W / L / D | Standard Elo | Ptnml(0-2) | Gamepair Elo |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 191244 | ncm-dbt-02 | 1210593 | 292 | 116 / 24 / 152 | +113.32 +/- 17.96 | 0, 6, 43, 96, 1 | +253.75 +/- 52.75 |
| 191243 | ncm-dbt-05 | 1240319 | 320 | 134 / 29 / 157 | +118.38 +/- 17.56 | 0, 6, 46, 105, 3 | +261.95 +/- 50.98 |
| 191242 | ncm-dbt-03 | 1241879 | 338 | 149 / 29 / 160 | +128.96 +/- 15.79 | 0, 2, 49, 114, 4 | +292.23 +/- 49.2 |
| 191241 | ncm-dbt-06 | 1236453 | 336 | 146 / 37 / 153 | +116.93 +/- 17.9 | 0, 6, 53, 103, 6 | +248.02 +/- 47.38 |
| 191240 | ncm-dbt-04 | 1221250 | 356 | 153 / 31 / 172 | +124.08 +/- 18.72 | 0, 8, 50, 110, 10 | +257.14 +/- 48.72 |
| 191239 | ncm-dbt-01 | 1120507 | 358 | 146 / 40 / 172 | +106.05 +/- 17.62 | 1, 7, 60, 107, 4 | +227.77 +/- 44.43 |
| 191232 | ncm-dbt-02 | 1226966 | 500 | 206 / 50 / 244 | +112.14 +/- 14.22 | 0, 9, 82, 153, 6 | +240.82 +/- 37.9 |
| 191231 | ncm-dbt-05 | 1229166 | 500 | 209 / 52 / 239 | +112.91 +/- 14.38 | 0, 9, 82, 152, 7 | +240.82 +/- 37.9 |
| 191229 | ncm-dbt-06 | 1219141 | 500 | 203 / 53 / 244 | +107.54 +/- 14.37 | 0, 9, 89, 145, 7 | +226.0 +/- 36.29 |
| 191228 | ncm-dbt-04 | 1214800 | 500 | 211 / 40 / 249 | +123.81 +/- 13.46 | 0, 5, 75, 164, 6 | +275.45 +/- 39.63 |
| 191226 | ncm-dbt-03 | 1238697 | 500 | 208 / 44 / 248 | +118.33 +/- 15.0 | 0, 13, 67, 163, 7 | +256.44 +/- 41.76 |
| 191224 | ncm-dbt-01 | 1101231 | 500 | 205 / 50 / 245 | +111.37 +/- 14.22 | 0, 10, 80, 155, 5 | +240.82 +/- 38.39 |
| 191220 | ncm-dbt-02 | 1230915 | 500 | 193 / 43 / 264 | +107.54 +/- 13.56 | 0, 9, 84, 155, 2 | +236.51 +/- 37.42 |
| 191216 | ncm-dbt-05 | 1216080 | 500 | 223 / 56 / 221 | +120.67 +/- 14.01 | 0, 8, 73, 163, 6 | +265.78 +/- 40.25 |
| 191213 | ncm-dbt-06 | 1239407 | 500 | 216 / 48 / 236 | +121.45 +/- 14.67 | 0, 9, 73, 159, 9 | +261.07 +/- 40.24 |
| 191211 | ncm-dbt-04 | 1226533 | 500 | 207 / 50 / 243 | +112.91 +/- 14.85 | 0, 10, 82, 149, 9 | +236.51 +/- 37.9 |
| 191210 | ncm-dbt-01 | 1122021 | 500 | 207 / 44 / 249 | +117.55 +/- 14.2 | 0, 11, 69, 166, 4 | +261.07 +/- 41.31 |
| 191209 | ncm-dbt-03 | 1232552 | 500 | 208 / 42 / 250 | +119.89 +/- 13.68 | 0, 5, 81, 157, 7 | +261.07 +/- 38.02 |
| 191202 | ncm-dbt-02 | 1238335 | 500 | 205 / 44 / 251 | +116.0 +/- 15.9 | 0, 15, 70, 154, 11 | +240.82 +/- 40.76 |
| 191201 | ncm-dbt-05 | 1215829 | 500 | 202 / 54 / 244 | +106.01 +/- 14.51 | 0, 13, 80, 153, 4 | +228.08 +/- 38.34 |
| 191200 | ncm-dbt-06 | 1242765 | 500 | 203 / 54 / 243 | +106.77 +/- 14.36 | 0, 13, 78, 156, 3 | +232.26 +/- 38.82 |
| 191199 | ncm-dbt-04 | 1243539 | 500 | 209 / 45 / 246 | +118.33 +/- 16.05 | 0, 14, 71, 152, 13 | +243.0 +/- 40.56 |
| 191198 | ncm-dbt-03 | 1250461 | 500 | 208 / 45 / 247 | +117.55 +/- 12.46 | 0, 4, 80, 165, 1 | +268.17 +/- 38.21 |
| 191197 | ncm-dbt-01 | 1137741 | 500 | 206 / 59 / 235 | +105.25 +/- 14.81 | 0, 9, 95, 136, 10 | +213.85 +/- 35.03 |
| 191190 | ncm-dbt-02 | 1235001 | 500 | 202 / 43 / 255 | +114.45 +/- 13.55 | 0, 5, 87, 152, 6 | +247.41 +/- 36.56 |
| 191189 | ncm-dbt-04 | 1227842 | 500 | 210 / 41 / 249 | +122.24 +/- 12.76 | 0, 3, 79, 164, 4 | +275.45 +/- 38.39 |
| 191188 | ncm-dbt-06 | 1220902 | 500 | 214 / 53 / 233 | +116.0 +/- 14.37 | 1, 8, 75, 161, 5 | +256.44 +/- 39.69 |
| 191187 | ncm-dbt-05 | 1241667 | 500 | 225 / 49 / 226 | +127.76 +/- 13.75 | 0, 5, 72, 165, 8 | +282.94 +/- 40.5 |
| 191186 | ncm-dbt-03 | 1228262 | 500 | 217 / 60 / 223 | +112.91 +/- 13.89 | 0, 7, 85, 152, 6 | +243.0 +/- 37.13 |
| 191185 | ncm-dbt-01 | 1108552 | 500 | 208 / 52 / 240 | +112.14 +/- 14.06 | 0, 10, 78, 158, 4 | +245.2 +/- 38.89 |
| 191178 | ncm-dbt-04 | 1229239 | 500 | 218 / 46 / 236 | +124.6 +/- 14.96 | 0, 10, 68, 162, 10 | +268.17 +/- 41.66 |
| 191177 | ncm-dbt-02 | 1245946 | 500 | 197 / 43 / 260 | +110.6 +/- 13.39 | 0, 5, 91, 149, 5 | +238.66 +/- 35.65 |
| 191176 | ncm-dbt-06 | 1250273 | 500 | 205 / 53 / 242 | +109.07 +/- 13.73 | 0, 9, 83, 155, 3 | +238.66 +/- 37.66 |
| 191175 | ncm-dbt-01 | 1130811 | 500 | 207 / 44 / 249 | +117.55 +/- 13.36 | 0, 6, 79, 161, 4 | +261.07 +/- 38.58 |
| 191174 | ncm-dbt-03 | 1222901 | 500 | 202 / 41 / 257 | +116.0 +/- 14.05 | 0, 8, 79, 157, 6 | +251.89 +/- 38.63 |
| 191173 | ncm-dbt-05 | 1226276 | 500 | 210 / 49 / 241 | +116.0 +/- 13.54 | 0, 7, 79, 160, 4 | +256.44 +/- 38.62 |
| 191166 | ncm-dbt-02 | 1223691 | 500 | 220 / 48 / 232 | +124.6 +/- 13.27 | 0, 4, 76, 164, 6 | +277.93 +/- 39.29 |
| 191165 | ncm-dbt-04 | 1213463 | 500 | 214 / 52 / 234 | +116.77 +/- 14.53 | 0, 10, 75, 158, 7 | +251.89 +/- 39.67 |
| 191164 | ncm-dbt-05 | 1251448 | 500 | 207 / 47 / 246 | +115.22 +/- 13.55 | 0, 6, 83, 156, 5 | +251.89 +/- 37.57 |
| 191163 | ncm-dbt-01 | 1110270 | 500 | 213 / 37 / 250 | +127.76 +/- 13.57 | 0, 7, 65, 173, 5 | +290.66 +/- 42.73 |
| 191162 | ncm-dbt-06 | 1232147 | 500 | 209 / 47 / 244 | +116.77 +/- 14.21 | 0, 7, 82, 153, 8 | +249.64 +/- 37.86 |
| 191161 | ncm-dbt-03 | 1229341 | 500 | 208 / 45 / 247 | +117.55 +/- 14.2 | 0, 8, 78, 157, 7 | +254.16 +/- 38.89 |

Commit

Commit ID 373359b44d0947cce2628a9a8c9b432a458615a8
Author Linmiao Xu
Date 2023-06-06 19:17:36 UTC
Update default net to nn-0dd1cebea573.nnue

Created by retraining an earlier epoch of the experiment leading to the first SFNNv6 net on a more-randomized version of the nn-e1fb1ade4432.nnue dataset mixed with unfiltered T80 apr2023 data. Trained using early-fen-skipping 28 and max-epoch 960.

The trainer settings and epochs used in the 5-step training sequence leading here were:

1. train from scratch for 400 epochs, lambda 1.0, constant LR 9.75e-4, T79T77-filter-v6-dd.min.binpack
2. retrain ep379, max-epoch 800, end-lambda 0.75, T60T70wIsRightFarseerT60T74T75T76.binpack
3. retrain ep679, max-epoch 800, end-lambda 0.75, skip 28, nn-e1fb1ade4432 dataset
4. retrain ep799, max-epoch 800, end-lambda 0.7, skip 28, nn-e1fb1ade4432 dataset
5. retrain ep439, max-epoch 960, end-lambda 0.7, skip 28, shuffled nn-e1fb1ade4432 + T80 apr2023

This net was epoch 559 of the final (step 5) retraining:

```bash
python3 easy_train.py \
  --experiment-name L1-1536-Re4-leela96-dfrc99-T60novdec-v2-T80juntonovjanfebT79aprmayT78jantosepT77dec-v6dd-T80apr-shuffled-sk28 \
  --training-dataset /data/leela96-dfrc99-T60novdec-v2-T80juntonovjanfebT79aprmayT78jantosepT77dec-v6dd-T80apr.binpack \
  --nnue-pytorch-branch linrock/nnue-pytorch/misc-fixes-L1-1536 \
  --early-fen-skipping 28 \
  --start-lambda 1.0 \
  --end-lambda 0.7 \
  --max_epoch 960 \
  --start-from-engine-test-net False \
  --start-from-model /data/L1-1536-Re3-nn-epoch439.nnue \
  --engine-test-branch linrock/Stockfish/L1-1536 \
  --lr 4.375e-4 \
  --gamma 0.995 \
  --tui False \
  --seed $RANDOM \
  --gpus "0,"
```

During data preparation, most binpacks were unminimized by removing positions with score 32002 (`VALUE_NONE`). This trades a larger dataset filesize on disk for greater randomness of positions in interleaved datasets.

The code used for unminimizing is at: https://github.com/linrock/Stockfish/tree/tools-unminify

For preparing the dataset used in this experiment:

```bash
python3 interleave_binpacks.py \
  leela96-filt-v2.binpack \
  dfrc99-16tb7p-eval-filt-v2.binpack \
  filt-v6-dd-min/test60-novdec2021-12tb7p-filter-v6-dd.min-mar2023.unmin.binpack \
  filt-v6-dd-min/test80-aug2022-16tb7p-filter-v6-dd.min-mar2023.unmin.binpack \
  filt-v6-dd-min/test80-sep2022-16tb7p-filter-v6-dd.min-mar2023.unmin.binpack \
  filt-v6-dd-min/test80-jun2022-16tb7p-filter-v6-dd.min-mar2023.unmin.binpack \
  filt-v6-dd/test80-jul2022-16tb7p-filter-v6-dd.binpack \
  filt-v6-dd/test80-oct2022-16tb7p-filter-v6-dd.binpack \
  filt-v6-dd/test80-nov2022-16tb7p-filter-v6-dd.binpack \
  filt-v6-dd-min/test80-jan2023-3of3-16tb7p-filter-v6-dd.min-mar2023.unmin.binpack \
  filt-v6-dd-min/test80-feb2023-16tb7p-filter-v6-dd.min-mar2023.unmin.binpack \
  filt-v6-dd/test79-apr2022-16tb7p-filter-v6-dd.binpack \
  filt-v6-dd/test79-may2022-16tb7p-filter-v6-dd.binpack \
  filt-v6-dd-min/test78-jantomay2022-16tb7p-filter-v6-dd.min-mar2023.unmin.binpack \
  filt-v6-dd/test78-juntosep2022-16tb7p-filter-v6-dd.binpack \
  filt-v6-dd/test77-dec2021-16tb7p-filter-v6-dd.binpack \
  test80-apr2023-2tb7p.binpack \
  /data/leela96-dfrc99-T60novdec-v2-T80juntonovjanfebT79aprmayT78jantosepT77dec-v6dd-T80apr.binpack
```

T80 apr2023 data was converted using lc0-rescorer with ~2tb of tablebases and can be found at: https://robotmoon.com/nnue-training-data/

Local elo at 25k nodes per move vs. nn-e1fb1ade4432.nnue (L1 size 1024):
nn-epoch559.nnue : 25.7 +/- 1.6

Passed STC: https://tests.stockfishchess.org/tests/view/647cd3b87cf638f0f53f9cbb
LLR: 2.95 (-2.94,2.94) <0.00,2.00>
Total: 59200 W: 16000 L: 15660 D: 27540
Ptnml(0-2): 159, 6488, 15996, 6768, 189

Passed LTC: https://tests.stockfishchess.org/tests/view/647d58de726f6b400e4085d8
LLR: 2.95 (-2.94,2.94) <0.50,2.50>
Total: 58800 W: 16002 L: 15657 D: 27141
Ptnml(0-2): 44, 5607, 17748, 5962, 39

closes https://github.com/official-stockfish/Stockfish/pull/4606

bench 2141197
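The real unminify tool referenced in the commit operates on the binary binpack format (linrock/Stockfish, branch tools-unminify) and is not reproduced here. Purely as a schematic sketch of the filtering rule it describes, dropping `VALUE_NONE`-scored positions from an already-decoded stream of training records might look like this; the record layout and function name are illustrative assumptions:

```python
# Schematic sketch only: assumes training data already decoded into
# (fen, score) tuples rather than the real binary binpack format.
VALUE_NONE = 32002  # Stockfish's sentinel score for "no evaluation stored"

def drop_unscored(records):
    """Keep only positions that carry a real evaluation score."""
    return [(fen, score) for fen, score in records if score != VALUE_NONE]

sample = [
    ("fen-a", 31),
    ("fen-b", VALUE_NONE),  # removed: no real score attached
    ("fen-c", -207),
]
print(drop_unscored(sample))  # [('fen-a', 31), ('fen-c', -207)]
```

Positions stripped this way carry no training signal of their own, which is why removing them lets the interleaving step mix the remaining scored positions more randomly.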
Copyright 2011–2024 Next Chess Move LLC