1. Use RMarkdown's `child` parameter to stitch documents together.
2. Notes stitched this way are convenient to review.
3. Please file related questions as Issues.

1 Baseline

True
Stored 'sub' (DataFrame)

1.1 Get Regular Data

Stored 'regular_data' (DataFrame)
---------------------------------------------------------------------------

TypeError                                 Traceback (most recent call last)

<ipython-input-14-eb0174fe6336> in <module>
      7 funcs = [np.mean]
      8 fold = 0
----> 9 for train_index, test_index in time_kfold(regular_data):
     10     cv_train, cv_test = regular_data.iloc[train_index], regular_data.iloc[test_index]
     11 


TypeError: 'TimeSeriesSplit' object is not callable
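The fix for this traceback: `TimeSeriesSplit` is a splitter object, not a callable, so the folds come from its `.split()` method rather than from calling the object itself. A runnable sketch (the toy `regular_data` below stands in for the real frame):

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import TimeSeriesSplit

# Toy stand-in for the real time-ordered DataFrame
regular_data = pd.DataFrame({"x": np.arange(10), "y": np.arange(10) * 2.0})

time_kfold = TimeSeriesSplit(n_splits=5)

# Iterate over the generator returned by .split(), not the splitter itself
for fold, (train_index, test_index) in enumerate(time_kfold.split(regular_data)):
    cv_train = regular_data.iloc[train_index]
    cv_test = regular_data.iloc[test_index]
    # Training folds always precede the test fold in time
    assert train_index.max() < test_index.min()
```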

1.7 Merge Data as Train_df

1.9 Model Training

1.10 Submission

1.10.1 Offline Evaluation

| fileName | date | description | status | publicScore | privateScore |
|---|---|---|---|---|---|
| submission-20200306-180624.csv | 2020-03-06 10:13:33 | change n fold to 100 | complete | 0.11217 | None |
| submission-20200306-172122.csv | 2020-03-06 09:36:21 | change n fold to 50 | complete | 0.11676 | None |
| submission-20200306-170449.csv | 2020-03-06 09:05:40 | change n fold to 40 | complete | 0.11711 | None |
| submission-20200306-164630.csv | 2020-03-06 08:50:29 | change n fold to 30 | complete | 0.12090 | None |
| submission-20200306-162932.csv | 2020-03-06 08:31:06 | change n fold to 20. | complete | 0.12624 | None |
| submission-20200306-160828.csv | 2020-03-06 08:09:55 | add id embedding. | complete | 0.41290 | None |
| submission-20200306-154807.csv | 2020-03-06 07:49:37 | baseline with ratan123/march-madness-2020-ncaam-simple-lightgbm-on-kfold | complete | 0.14907 | None |
| submission-20200301-195712.csv | 2020-03-06 07:21:23 | baseline with original hyperparameters | complete | 1.54431 | None |

2 XGBoost Hyperparameter Tuning


The hyperparameters above are the current defaults.

[0] train-mae:11.2447+0.085538  test-mae:11.2587+0.16325
[50]    train-mae:7.43714+0.0840341 test-mae:7.93709+0.137656
[100]   train-mae:6.99125+0.0716425 test-mae:7.91611+0.143741
Untuned rmse: 7.899204

2.0.3 learning rate

{'eval_metric': 'mae', 'booster': 'gbtree', 'eta': 0.05, 'subsample': 0.35, 'colsample_bytree': 0.7, 'num_parallel_tree': 3, 'min_child_weight': 40, 'gamma': 10, 'max_depth': 3, 'silent': 1}

| | eta | best_mae |
|---|---|---|
| 0 | 0.001 | 10.841469 |
| 1 | 0.010 | 8.241155 |
| 2 | 0.100 | 7.941816 |
| 3 | 0.200 | 7.941957 |
| 4 | 0.300 | 7.995030 |

2.0.6 subsample

| | subsample | best_mae |
|---|---|---|
| 0 | 0.10 | 8.181179 |
| 1 | 0.50 | 7.829253 |
| 2 | 0.80 | 7.781617 |
| 3 | 0.90 | 7.745327 |
| 4 | 0.95 | 7.747116 |
| 5 | 1.00 | 7.761571 |

The error has come down, so the best hyperparameters are:

{'eval_metric': 'mae', 'booster': 'gbtree', 'eta': 0.1, 'subsample': 0.9, 'colsample_bytree': 0.8, 'num_parallel_tree': 3, 'min_child_weight': 40, 'gamma': 10, 'max_depth': 5, 'silent': 1}

3 Target encoding

(175008, 35)

Fold: 0 CV train shape: (140006, 35)
Fold: 1 CV train shape: (140006, 35)
Fold: 2 CV train shape: (140006, 35)
Fold: 3 CV train shape: (140007, 35)
Fold: 4 CV train shape: (140007, 35)

5

pandas.core.frame.DataFrame

# 0 <class 'pandas.core.frame.DataFrame'> (35002, 73)
# 1 <class 'pandas.core.frame.DataFrame'> (35002, 73)
# 2 <class 'pandas.core.frame.DataFrame'> (35002, 73)
# 3 <class 'pandas.core.frame.DataFrame'> (35001, 73)
# 4 <class 'pandas.core.frame.DataFrame'> (35001, 73)

| | Season | DayNum | T1_TeamID | T1_Score | T2_TeamID | T2_Score | location | NumOT | T1_FGM | T1_FGA | T2_opponent_FGMmean | T2_opponent_FGAmean | T2_opponent_FGM3mean | T2_opponent_FGA3mean | T2_opponent_ORmean | T2_opponent_Astmean | T2_opponent_TOmean | T2_opponent_Stlmean | T2_opponent_Blkmean | T2_PointDiffmean |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2003 | 13 | 1166 | 106 | 1426 | 50 | 1 | 0 | 41 | 69 | 23.708333 | 54.708333 | 5.708333 | 17.458333 | 10.583333 | 11.416667 | 14.041667 | 7.625000 | 3.125000 | -0.583333 |
| 1 | 2003 | 14 | 1353 | 60 | 1162 | 36 | 1 | 0 | 23 | 57 | 21.894737 | 50.842105 | 6.789474 | 18.315789 | 10.315789 | 12.947368 | 12.578947 | 9.263158 | 3.947368 | -15.368421 |
| 2 | 2003 | 14 | 1390 | 61 | 1131 | 57 | 1 | 0 | 20 | 53 | 20.880000 | 51.680000 | 4.880000 | 16.600000 | 10.640000 | 11.920000 | 13.240000 | 6.840000 | 3.120000 | 5.840000 |
| 3 | 2003 | 14 | 1426 | 59 | 1106 | 47 | 0 | 0 | 25 | 53 | 22.043478 | 52.608696 | 4.869565 | 15.652174 | 11.130435 | 12.434783 | 14.782609 | 8.521739 | 3.043478 | -1.130435 |
| 4 | 2003 | 18 | 1113 | 59 | 1287 | 56 | 1 | 0 | 22 | 54 | 28.157895 | 58.894737 | 7.052632 | 19.631579 | 12.000000 | 16.263158 | 16.052632 | 5.789474 | 3.473684 | 2.578947 |

5 rows × 73 columns

(175008, 73)
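The fold loop above is out-of-fold target encoding: the opponent-aggregate means are computed on the training folds only and then joined onto the held-out fold, which avoids target leakage. A minimal sketch (the column names `T2_TeamID`/`T1_Score` stand in for the full set of aggregation columns):

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import KFold

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "T2_TeamID": rng.integers(1100, 1110, size=200),
    "T1_Score": rng.integers(40, 100, size=200),
})

kf = KFold(n_splits=5, shuffle=True, random_state=0)
encoded_parts = []
for fold, (train_idx, test_idx) in enumerate(kf.split(df)):
    cv_train, cv_test = df.iloc[train_idx], df.iloc[test_idx].copy()
    # Means computed on the training folds only, to avoid target leakage
    means = cv_train.groupby("T2_TeamID")["T1_Score"].mean()
    cv_test["T2_opponent_scoremean"] = cv_test["T2_TeamID"].map(means)
    encoded_parts.append(cv_test)

# Reassemble the encoded held-out folds in the original row order
encoded = pd.concat(encoded_parts).sort_index()
```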

4 ID embedding


Tensorflow version: 2.0.0 Keras version: 2.2.4-tf

<class 'pandas.core.frame.DataFrame'>
RangeIndex: 4716704 entries, 0 to 4716703
Data columns (total 7 columns):
Season        int64
T1_TeamID     int64
T2_TeamID     int64
Score_Diff    int64
win           int64
index_t1      int64
index_t2      int64
dtypes: int64(7)
memory usage: 251.9 MB

| | Season | T1_TeamID | T2_TeamID | Score_Diff | win | index_t1 | index_t2 |
|---|---|---|---|---|---|---|---|
| 0 | 2003 | 1421 | 1411 | 8 | 1 | 214 | 206 |
| 1 | 2003 | 1421 | 1411 | 8 | 1 | 214 | 206 |
| 2 | 2003 | 1421 | 1411 | 8 | 1 | 214 | 206 |
| 3 | 2003 | 1421 | 1411 | 8 | 1 | 214 | 206 |
| 4 | 2003 | 1421 | 1411 | 8 | 1 | 214 | 206 |

min    2003
max    2019
Name: Season, dtype: int64
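The `index_t1`/`index_t2` columns remap raw TeamIDs onto a dense 0..n-1 range suitable for an embedding layer. The notebook's exact mapping code is not shown; one straightforward way to build such a mapping (the toy `games` frame is illustrative):

```python
import pandas as pd

games = pd.DataFrame({
    "T1_TeamID": [1421, 1112, 1421],
    "T2_TeamID": [1411, 1421, 1112],
})

# One shared index space covering every team seen in either column
team_ids = sorted(set(games["T1_TeamID"]) | set(games["T2_TeamID"]))
team_to_index = {tid: i for i, tid in enumerate(team_ids)}

games["index_t1"] = games["T1_TeamID"].map(team_to_index)
games["index_t2"] = games["T2_TeamID"].map(team_to_index)
```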

Model: "model"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
input_2 (InputLayer)            [(None, 1)]          0
__________________________________________________________________________________________________
input_3 (InputLayer)            [(None, 1)]          0
__________________________________________________________________________________________________
Team-Strength-Model (Model)     (None, 1)            249         input_2[0][0]
                                                                 input_3[0][0]
__________________________________________________________________________________________________
subtract (Subtract)             (None, 1)            0           Team-Strength-Model[1][0]
                                                                 Team-Strength-Model[2][0]
==================================================================================================
Total params: 249
Trainable params: 249
Non-trainable params: 0
__________________________________________________________________________________________________
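The summary shows a shared "Team-Strength" sub-model applied to both team inputs, whose outputs are subtracted to predict the score difference. A minimal tf.keras sketch of the same shape (the vocabulary size here is illustrative, chosen so a 1-dim embedding roughly matches the ~249 parameters above; the real sub-model evidently has a couple more weights):

```python
import tensorflow as tf
from tensorflow.keras.layers import Embedding, Flatten, Input, Subtract
from tensorflow.keras.models import Model

n_teams = 247  # illustrative vocabulary size

# Shared sub-model: team index -> scalar "strength"
team_in = Input(shape=(1,))
strength = Flatten()(Embedding(n_teams, 1)(team_in))
team_strength_model = Model(team_in, strength, name="Team-Strength-Model")

# Both inputs run through the SAME sub-model, so weights are shared;
# the strength difference is the predicted score difference.
in_t1 = Input(shape=(1,))
in_t2 = Input(shape=(1,))
diff = Subtract()([team_strength_model(in_t1), team_strength_model(in_t2)])
model = Model([in_t1, in_t2], diff)
model.compile(optimizer="adam", loss="mae")
```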


(4716704, 7)

0 ../model/fold_id_0.pkl saved.
1 ../model/fold_id_1.pkl saved.
2 ../model/fold_id_2.pkl saved.
3 ../model/fold_id_3.pkl saved.
4 ../model/fold_id_4.pkl saved.
5 ../model/fold_id_5.pkl saved.
6 ../model/fold_id_6.pkl saved.
7 ../model/fold_id_7.pkl saved.
8 ../model/fold_id_8.pkl saved.
9 ../model/fold_id_9.pkl saved.
10 ../model/fold_id_10.pkl saved.
11 ../model/fold_id_11.pkl saved.
12 ../model/fold_id_12.pkl saved.
13 ../model/fold_id_13.pkl saved.
14 ../model/fold_id_14.pkl saved.
15 ../model/fold_id_15.pkl saved.
16 ../model/fold_id_16.pkl saved.
17 ../model/fold_id_17.pkl saved.
18 ../model/fold_id_18.pkl saved.
19 ../model/fold_id_19.pkl saved.

WARNING:tensorflow:Falling back from v2 loop because of error: Failed to find data adapter that can handle input: (<class 'list'> containing values of types {"<class 'pandas.core.series.Series'>"}), <class 'NoneType'>
Train on 4032781 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2729 - val_loss: 9.5765
…
Epoch 10/10 - loss: 8.2709 - val_loss: 9.5550
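The fallback warning above is triggered by passing a plain Python list of pandas Series to `model.fit`; TF 2.0's v2 training loop has no data adapter for that combination. Converting each Series to a NumPy array first keeps Keras on the fast path. A sketch (the `train` frame and column names are stand-ins mirroring the ones used here):

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for the real training frame
train = pd.DataFrame({"index_t1": [0, 1, 2], "index_t2": [1, 2, 0],
                      "Score_Diff": [8, -3, 5]})

# Passing [Series, Series] to model.fit() trips the fallback warning;
# a list of NumPy arrays does not.
x = [train["index_t1"].to_numpy(), train["index_t2"].to_numpy()]
y = train["Score_Diff"].to_numpy()
# model.fit(x, y, epochs=10, validation_split=0.1)  # no data-adapter warning
```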


Fold: 0 CV train shape: (4480868,) and (4480868,)
Train on 4032781 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2682 - val_loss: 9.5451
…
Epoch 10/10 - loss: 8.2674 - val_loss: 9.5266


Fold: 1 CV train shape: (4480868,) and (4480868,)
Train on 4032781 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2674 - val_loss: 9.5286
…
Epoch 10/10 - loss: 8.2669 - val_loss: 9.5217


Fold: 2 CV train shape: (4480868,) and (4480868,)
Train on 4032781 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2676 - val_loss: 9.5230
…
Epoch 10/10 - loss: 8.2671 - val_loss: 9.5201


Fold: 3 CV train shape: (4480868,) and (4480868,)
Train on 4032782 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2674 - val_loss: 9.5166
…
Epoch 10/10 - loss: 8.2672 - val_loss: 9.5138


Fold: 4 CV train shape: (4480869,) and (4480869,)
Train on 4032782 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2661 - val_loss: 9.5129
…
Epoch 10/10 - loss: 8.2658 - val_loss: 9.5106


Fold: 5 CV train shape: (4480869,) and (4480869,)
Train on 4032782 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2645 - val_loss: 9.5089
…
Epoch 10/10 - loss: 8.2644 - val_loss: 9.5096


Fold: 6 CV train shape: (4480869,) and (4480869,)
Train on 4032782 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2649 - val_loss: 9.5149
…
Epoch 10/10 - loss: 8.2648 - val_loss: 9.5126


Fold: 7 CV train shape: (4480869,) and (4480869,)
Train on 4032782 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2668 - val_loss: 9.5089
…
Epoch 10/10 - loss: 8.2667 - val_loss: 9.5084


Fold: 8 CV train shape: (4480869,) and (4480869,)
Train on 4032782 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2660 - val_loss: 9.5080
…
Epoch 10/10 - loss: 8.2660 - val_loss: 9.5068


Fold: 9 CV train shape: (4480869,) and (4480869,)
Train on 4032782 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2675 - val_loss: 9.5049
…
Epoch 10/10 - loss: 8.2675 - val_loss: 9.5061


Fold: 10 CV train shape: (4480869,) and (4480869,)
Train on 4032782 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2653 - val_loss: 9.5058
…
Epoch 10/10 - loss: 8.2653 - val_loss: 9.5055


Fold: 11 CV train shape: (4480869,) and (4480869,)
Train on 4032782 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2665 - val_loss: 9.5064
…
Epoch 10/10 - loss: 8.2665 - val_loss: 9.5090


Fold: 12 CV train shape: (4480869,) and (4480869,)
Train on 4032782 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2646 - val_loss: 9.5069
…
Epoch 10/10 - loss: 8.2645 - val_loss: 9.5095


Fold: 13 CV train shape: (4480869,) and (4480869,)
Train on 4032782 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2671 - val_loss: 9.5111
…
Epoch 10/10 - loss: 8.2670 - val_loss: 9.5114


Fold: 14 CV train shape: (4480869,) and (4480869,)
Train on 4032782 samples, validate on 448087 samples
Epoch 1/10 - loss: 8.2658 - val_loss: 9.5103
…
Epoch 10/10 - loss: 8.2658 - val_loss: 9.5091

png

png

Fold: 15 CV train shape: (4480869,) and (4480869,) WARNING:tensorflow:Falling back from v2 loop because of error: Failed to find data adapter that can handle input: (<class ‘list’> containing values of types {“<class ‘pandas.core.series.Series’>”}), <class ‘NoneType’> Train on 4032782 samples, validate on 448087 samples Epoch 1/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2669 - val_loss: 9.5113 Epoch 2/10 4032782/4032782 [==============================] - 7s 2us/sample - loss: 8.2669 - val_loss: 9.5127 Epoch 3/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2669 - val_loss: 9.5117 Epoch 4/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2669 - val_loss: 9.5116 Epoch 5/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2669 - val_loss: 9.5117 Epoch 6/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2669 - val_loss: 9.5115 Epoch 7/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2669 - val_loss: 9.5117 Epoch 8/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2669 - val_loss: 9.5120 Epoch 9/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2669 - val_loss: 9.5114 Epoch 10/10 4032782/4032782 [==============================] - 5s 1us/sample - loss: 8.2669 - val_loss: 9.5114 WARNING:tensorflow:Falling back from v2 loop because of error: Failed to find data adapter that can handle input: (<class ‘list’> containing values of types {“<class ‘pandas.core.series.Series’>”}), <class ‘NoneType’>

png

png

Fold: 16 CV train shape: (4480869,) and (4480869,) WARNING:tensorflow:Falling back from v2 loop because of error: Failed to find data adapter that can handle input: (<class ‘list’> containing values of types {“<class ‘pandas.core.series.Series’>”}), <class ‘NoneType’> Train on 4032782 samples, validate on 448087 samples Epoch 1/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2660 - val_loss: 9.5148 Epoch 2/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2660 - val_loss: 9.5152 Epoch 3/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2660 - val_loss: 9.5156 Epoch 4/10 4032782/4032782 [==============================] - 6s 2us/sample - loss: 8.2660 - val_loss: 9.5152 Epoch 5/10 4032782/4032782 [==============================] - 5s 1us/sample - loss: 8.2660 - val_loss: 9.5153 Epoch 6/10 4032782/4032782 [==============================] - 6s 2us/sample - loss: 8.2660 - val_loss: 9.5151 Epoch 7/10 4032782/4032782 [==============================] - 6s 2us/sample - loss: 8.2660 - val_loss: 9.5155 Epoch 8/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2660 - val_loss: 9.5154 Epoch 9/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2660 - val_loss: 9.5152 Epoch 10/10 4032782/4032782 [==============================] - 6s 2us/sample - loss: 8.2660 - val_loss: 9.5157 WARNING:tensorflow:Falling back from v2 loop because of error: Failed to find data adapter that can handle input: (<class ‘list’> containing values of types {“<class ‘pandas.core.series.Series’>”}), <class ‘NoneType’>

png

png

Fold: 17 CV train shape: (4480869,) and (4480869,) WARNING:tensorflow:Falling back from v2 loop because of error: Failed to find data adapter that can handle input: (<class ‘list’> containing values of types {“<class ‘pandas.core.series.Series’>”}), <class ‘NoneType’> Train on 4032782 samples, validate on 448087 samples Epoch 1/10 4032782/4032782 [==============================] - 6s 2us/sample - loss: 8.2652 - val_loss: 9.5141 Epoch 2/10 4032782/4032782 [==============================] - 7s 2us/sample - loss: 8.2652 - val_loss: 9.5134 Epoch 3/10 4032782/4032782 [==============================] - 6s 2us/sample - loss: 8.2652 - val_loss: 9.5145 Epoch 4/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2652 - val_loss: 9.5137 Epoch 5/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2652 - val_loss: 9.5143 Epoch 6/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2652 - val_loss: 9.5145 Epoch 7/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2652 - val_loss: 9.5138 Epoch 8/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2652 - val_loss: 9.5130 Epoch 9/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2652 - val_loss: 9.5141 Epoch 10/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2652 - val_loss: 9.5138 WARNING:tensorflow:Falling back from v2 loop because of error: Failed to find data adapter that can handle input: (<class ‘list’> containing values of types {“<class ‘pandas.core.series.Series’>”}), <class ‘NoneType’>

png

png

Fold: 18 CV train shape: (4480869,) and (4480869,) WARNING:tensorflow:Falling back from v2 loop because of error: Failed to find data adapter that can handle input: (<class ‘list’> containing values of types {“<class ‘pandas.core.series.Series’>”}), <class ‘NoneType’> Train on 4032782 samples, validate on 448087 samples Epoch 1/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2655 - val_loss: 9.5130 Epoch 2/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2655 - val_loss: 9.5133 Epoch 3/10 4032782/4032782 [==============================] - 5s 1us/sample - loss: 8.2655 - val_loss: 9.5135 Epoch 4/10 4032782/4032782 [==============================] - 6s 1us/sample - loss: 8.2655 - val_loss: 9.5133 Epoch 5/10 4032782/4032782 [==============================] - 6s 2us/sample - loss: 8.2655 - val_loss: 9.5137 Epoch 6/10 4032782/4032782 [==============================] - 6s 2us/sample - loss: 8.2655 - val_loss: 9.5140 Epoch 7/10 4032782/4032782 [==============================] - 5s 1us/sample - loss: 8.2655 - val_loss: 9.5121 Epoch 8/10 4032782/4032782 [==============================] - 5s 1us/sample - loss: 8.2655 - val_loss: 9.5140 Epoch 9/10 4032782/4032782 [==============================] - 5s 1us/sample - loss: 8.2655 - val_loss: 9.5136 Epoch 10/10 4032782/4032782 [==============================] - 5s 1us/sample - loss: 8.2655 - val_loss: 9.5142 WARNING:tensorflow:Falling back from v2 loop because of error: Failed to find data adapter that can handle input: (<class ‘list’> containing values of types {“<class ‘pandas.core.series.Series’>”}), <class ‘NoneType’>

png

png

Fold: 19 CV train shape: (4480869,) and (4480869,)
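Each fold above emits the same "Failed to find data adapter" warning because `model.fit` receives a plain Python list of pandas Series, which forces tf.keras to fall back to the v1 training loop. A minimal sketch of the usual fix, converting each column to a NumPy array first (the helper name and columns are illustrative, not from the notebook):

```python
import numpy as np
import pandas as pd

def to_model_inputs(df, columns):
    """Convert selected DataFrame columns into a list of NumPy arrays,
    the multi-input format tf.keras' v2 loop handles natively."""
    return [df[col].to_numpy() for col in columns]

# Illustrative frame standing in for a fold's training data.
df = pd.DataFrame({"T1_TeamID": [1421, 1411], "T2_TeamID": [1411, 1421]})
inputs = to_model_inputs(df, ["T1_TeamID", "T2_TeamID"])
# model.fit(inputs, y, ...)  # each element is now an ndarray, not a Series
```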

(235836, 2) (235836, 2) (235836, 2) (235836, 2) (235835, 2) (235835, 2) (235835, 2) (235835, 2) (235835, 2) (235835, 2) (235835, 2) (235835, 2) (235835, 2) (235835, 2) (235835, 2) (235835, 2) (235835, 2) (235835, 2) (235835, 2) (235835, 2)

(4716704, 1)

(4716704, 7)

(4716704, 8)

|   | Season | T1_TeamID | T2_TeamID | Score_Diff | win | index_t1 | index_t2 | strength |
|---|--------|-----------|-----------|------------|-----|----------|----------|----------|
| 0 | 2003 | 1421 | 1411 | 8 | 1 | 214 | 206 | -2.002989 |
| 1 | 2003 | 1421 | 1411 | 8 | 1 | 214 | 206 | -2.003113 |
| 2 | 2003 | 1421 | 1411 | 8 | 1 | 214 | 206 | -2.004044 |
| 3 | 2003 | 1421 | 1411 | 8 | 1 | 214 | 206 | -2.010653 |
| 4 | 2003 | 1421 | 1411 | 8 | 1 | 214 | 206 | -1.999414 |

20

../model/id2vec_model_0.h5 ../model/id2vec_model_1.h5 ../model/id2vec_model_2.h5 ../model/id2vec_model_3.h5 ../model/id2vec_model_4.h5 ../model/id2vec_model_5.h5 ../model/id2vec_model_6.h5 ../model/id2vec_model_7.h5 ../model/id2vec_model_8.h5 ../model/id2vec_model_9.h5 ../model/id2vec_model_10.h5 ../model/id2vec_model_11.h5 ../model/id2vec_model_12.h5 ../model/id2vec_model_13.h5 ../model/id2vec_model_14.h5 ../model/id2vec_model_15.h5 ../model/id2vec_model_16.h5 ../model/id2vec_model_17.h5 ../model/id2vec_model_18.h5 ../model/id2vec_model_19.h5
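The 20 per-fold models saved above are typically used as an ensemble: each is loaded, run on the test data, and the per-fold predictions averaged. A minimal NumPy sketch of the path construction and the averaging step (loading/predicting with Keras is elided; all names here are illustrative):

```python
import numpy as np

N_FOLDS = 20
model_paths = [f"../model/id2vec_model_{i}.h5" for i in range(N_FOLDS)]

def average_fold_predictions(per_fold_preds):
    """Average the prediction vectors produced by each fold's model."""
    return np.mean(np.stack(per_fold_preds, axis=0), axis=0)

# Illustrative: three folds' predictions for two samples.
preds = [np.array([0.1, 0.9]), np.array([0.3, 0.7]), np.array([0.2, 0.8])]
avg = average_fold_predictions(preds)  # ≈ [0.2, 0.8]
```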

5 GBDT + LR

Start training…

D:\...\site-packages\lightgbm\engine.py:148: UserWarning: Found `num_trees` in params. Will use it instead of argument
  warnings.warn("Found `{}` in params. Will use it instead of argument".format(alias))

Save model…

<lightgbm.basic.Booster at 0x164fbf98>

Start predicting… End.

After feature engineering, the sample shape is (1427, 100) and the target length is (1427,).

Writing transformed training data

Writing transformed evaluating data

Writing transformed testing data

|    | index | gain |
|----|-------|------------|
| 43 | 164 | 2166253.83 |
| 42 | 25 | 806088.40 |
| 45 | 116 | 695291.18 |
| 62 | 82 | 642513.68 |
| 44 | 66 | 450929.10 |

After feature engineering, the training set has dimensions (1427, 6300) and the validation set (357, 6300).

Load data…

| alpha | train MSE | validation MSE | gap |
|-------|-----------|----------------|--------|
| 1.0 | 125.885 | 133.949 | 8.064 |
| 0.5 | 104.691 | 116.132 | 11.440 |
| 0.1 | 79.780 | 102.444 | 22.664 |
| 0.05 | 73.633 | 103.006 | 29.374 |
| 0.01 | 67.224 | 108.307 | 41.083 |

D:\...\site-packages\sklearn\linear_model\coordinate_descent.py:475: ConvergenceWarning: Objective did not converge. You might want to increase the number of iterations. Duality gap: 137.94698587385938, tolerance: 29.66780868955851
  positive)

| alpha | train MSE | validation MSE | gap |
|-------|-----------|----------------|--------|
| 0.005 | 65.593 | 112.783 | 47.190 |
| 0.001 | 63.949 | 119.180 | 55.232 |

D:\...\site-packages\sklearn\linear_model\coordinate_descent.py:475: ConvergenceWarning: Objective did not converge. You might want to increase the number of iterations. Duality gap: 15371.277124878361, tolerance: 29.66780868955851
  positive)

Selected alpha = 1.
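The sweep above can be reproduced with a simple loop over candidate alphas, fitting a Lasso at each and recording train MSE, validation MSE, and their gap. A minimal sketch on synthetic data (scikit-learn; all data and names are illustrative):

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.RandomState(0)
X = rng.randn(300, 20)
y = X[:, 0] * 5 + rng.randn(300)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

results = {}
for alpha in [1.0, 0.5, 0.1, 0.05, 0.01]:
    model = Lasso(alpha=alpha, max_iter=10000).fit(X_tr, y_tr)
    tr = mean_squared_error(y_tr, model.predict(X_tr))
    va = mean_squared_error(y_va, model.predict(X_va))
    results[alpha] = (tr, va, va - tr)  # train MSE, valid MSE, gap

best_alpha = min(results, key=lambda a: results[a][1])  # lowest valid MSE
```

Whether to pick the alpha with the lowest validation MSE or (as above in the notes) a larger alpha with a smaller train/validation gap is a judgment call about how much to trade accuracy for stability.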

D:\...\site-packages\sklearn\linear_model\coordinate_descent.py:475: ConvergenceWarning: Objective did not converge. You might want to increase the number of iterations. Duality gap: 15371.277124878361, tolerance: 29.66780868955851
  positive)

Train using LightGBM with MSE: 118.36963814343773
Train using GBDT+LR with MSE: 116.92075519964115

6 GBDT + LR k-fold

The blackcellmagic extension is already loaded. To reload it, use: %reload_ext blackcellmagic

Index(['T1_FGMmean', 'T1_FGAmean', 'T1_FGM3mean', 'T1_FGA3mean', 'T1_ORmean', 'T1_Astmean', 'T1_TOmean', 'T1_Stlmean', 'T1_PFmean', 'T1_opponent_FGMmean', 'T1_opponent_FGAmean', 'T1_opponent_FGM3mean', 'T1_opponent_FGA3mean', 'T1_opponent_ORmean', 'T1_opponent_Astmean', 'T1_opponent_TOmean', 'T1_opponent_Stlmean', 'T1_opponent_Blkmean', 'T1_PointDiffmean', 'T2_FGMmean', 'T2_FGAmean', 'T2_FGM3mean', 'T2_FGA3mean', 'T2_ORmean', 'T2_Astmean', 'T2_TOmean', 'T2_Stlmean', 'T2_PFmean', 'T2_opponent_FGMmean', 'T2_opponent_FGAmean', 'T2_opponent_FGM3mean', 'T2_opponent_FGA3mean', 'T2_opponent_ORmean', 'T2_opponent_Astmean', 'T2_opponent_TOmean', 'T2_opponent_Stlmean', 'T2_opponent_Blkmean', 'T2_PointDiffmean', 'T1_seed', 'T2_seed', 'T1_win_ratio_14d', 'T2_win_ratio_14d', 'Seed_diff', 'strength', 'T1_rank', 'T1_adj_em', 'T1_adj_o', 'T1_adj_o_rank', 'T1_adj_d', 'T1_adj_d_rank', 'T1_adj_tempo', 'T1_adj_tempo_rank', 'T1_luck', 'T1_luck_rank', 'T1_sos_adj_em', 'T1_sos_adj_em_rank', 'T1_sos_adj_o', 'T1_sos_adj_o_rank', 'T1_sos_adj_d', 'T1_sos_adj_d_rank', 'T1_nc_sos_adj_em', 'T1_nc_sos_adj_em_rank', 'T2_rank', 'T2_adj_em', 'T2_adj_o', 'T2_adj_o_rank', 'T2_adj_d', 'T2_adj_d_rank', 'T2_adj_tempo', 'T2_adj_tempo_rank', 'T2_luck', 'T2_luck_rank', 'T2_sos_adj_em', 'T2_sos_adj_em_rank', 'T2_sos_adj_o', 'T2_sos_adj_o_rank', 'T2_sos_adj_d', 'T2_sos_adj_d_rank', 'T2_nc_sos_adj_em', 'T2_nc_sos_adj_em_rank'], dtype='object')

(2230, 80)

(2230,)

6.0.1 Split out the holdout set


KeyError                                  Traceback (most recent call last)

D:\...\site-packages\pandas\core\indexes\base.py in get_loc(self, key, method, tolerance)

pandas\_libs\index.pyx in pandas._libs.index.IndexEngine.get_loc()

pandas\_libs\hashtable_class_helper.pxi in pandas._libs.hashtable.PyObjectHashTable.get_item()

KeyError: 'Season'

During handling of the above exception, another exception occurred:

KeyError                                  Traceback (most recent call last)

<ipython-input-…> in <module>
----> 1 y_holdout = y[X["Season"] == 2019]
      2 y = y[X["Season"] < 2019]
      3 X_holdout = X[X["Season"] == 2019]
      4 X = X[X["Season"] < 2019]

D:\...\site-packages\pandas\core\frame.py in __getitem__(self, key)

D:\...\site-packages\pandas\core\indexes\base.py in get_loc(self, key, method, tolerance)

KeyError: 'Season'
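The KeyError: 'Season' above means X no longer carries a Season column at this point (it was presumably dropped or moved to the index during feature engineering). A defensive sketch of the intended season-based holdout split, keeping the season values in a separate Series so the split survives column drops (the function and variable names are illustrative):

```python
import pandas as pd

def season_holdout(X, y, seasons, holdout_season=2019):
    """Split (X, y) into train/holdout rows using a per-row season Series,
    so the split works even after 'Season' is dropped from X itself."""
    mask = seasons == holdout_season
    train = ~mask & (seasons < holdout_season)
    return X[train], y[train], X[mask], y[mask]

# Illustrative data.
X = pd.DataFrame({"feat": [1.0, 2.0, 3.0, 4.0]})
y = pd.Series([10, 20, 30, 40])
seasons = pd.Series([2017, 2018, 2019, 2019])
X_tr, y_tr, X_ho, y_ho = season_holdout(X, y, seasons)
```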

6.0.2 Start training

6.0.3 Tune parameters with CV

The same five folds are retrained with increasing `num_boost_round` (3, 5, 7, 10, 30, 50, …); the per-iteration logs condense to the final `valid_1` l2 of each run:

| num_boost_round | fold 1 | fold 2 | fold 3 | fold 4 | fold 5 |
|-----------------|---------|---------|---------|---------|---------|
| 3 | 260.795 | 221.028 | 166.759 | 240.087 | 237.496 |
| 5 | 235.198 | 200.968 | 151.473 | 216.639 | 239.729 |
| 7 | 213.036 | 186.268 | 149.502 | 199.479 | 246.362 |
| 10 | 207.352 | 168.657 | 152.426 | 179.091 | 267.586 |
| 30 | 218.177 | 142.539 | 189.147 | 150.809 | 250.542 |
| 50 | 214.046 | 158.459 | 199.593 | 168.599 | |

[1] training's l2: 181.575 valid_1's l2: 263.094 [2] training's l2: 169.925 valid_1's l2: 250.017 [3] training's l2: 160.469 valid_1's l2: 237.496 [4] training's l2: 151.835 valid_1's l2: 226.981 [5] training's l2: 144.223 valid_1's l2: 239.729 [6] training's l2: 137.229 valid_1's l2: 255.652 [7] training's l2: 131.585 valid_1's l2: 246.362 [8] training's l2: 126.692 valid_1's l2: 238.548 [9] training's l2: 122.05 valid_1's l2: 253.49 [10] training's l2: 118.011 valid_1's l2: 267.586 [11] training's
l2: 114.012 valid_1’s l2: 259.26 [12] training’s l2: 110.599 valid_1’s l2: 268.502 [13] training’s l2: 107.466 valid_1’s l2: 261.291 [14] training’s l2: 104.701 valid_1’s l2: 254.963 [15] training’s l2: 102.282 valid_1’s l2: 249.327 [16] training’s l2: 100.293 valid_1’s l2: 243.619 [17] training’s l2: 98.4398 valid_1’s l2: 238.206 [18] training’s l2: 96.7813 valid_1’s l2: 233.614 [19] training’s l2: 95.4195 valid_1’s l2: 229.749 [20] training’s l2: 93.8965 valid_1’s l2: 227.953 [21] training’s l2: 92.4006 valid_1’s l2: 236.485 [22] training’s l2: 91.2743 valid_1’s l2: 233.261 [23] training’s l2: 90.0754 valid_1’s l2: 243.75 [24] training’s l2: 89.117 valid_1’s l2: 253.566 [25] training’s l2: 88.2277 valid_1’s l2: 257.643 [26] training’s l2: 87.2302 valid_1’s l2: 253.659 [27] training’s l2: 86.4847 valid_1’s l2: 251.105 [28] training’s l2: 85.7754 valid_1’s l2: 248.987 [29] training’s l2: 85.1998 valid_1’s l2: 246.66 [30] training’s l2: 84.6967 valid_1’s l2: 250.542 [31] training’s l2: 84.0718 valid_1’s l2: 248.02 [32] training’s l2: 83.4119 valid_1’s l2: 259.022 [33] training’s l2: 82.9172 valid_1’s l2: 264.104 [34] training’s l2: 82.5544 valid_1’s l2: 263.383 [35] training’s l2: 82.2256 valid_1’s l2: 260.906 [36] training’s l2: 81.8384 valid_1’s l2: 260.967 [37] training’s l2: 81.4423 valid_1’s l2: 260.448 [38] training’s l2: 81.091 valid_1’s l2: 259.701 [39] training’s l2: 80.6978 valid_1’s l2: 258.846 [40] training’s l2: 80.3647 valid_1’s l2: 258.979 [41] training’s l2: 80.0704 valid_1’s l2: 257.698 [42] training’s l2: 79.7841 valid_1’s l2: 256.787 [43] training’s l2: 79.548 valid_1’s l2: 257.321 [44] training’s l2: 79.3817 valid_1’s l2: 257.261 [45] training’s l2: 79.0728 valid_1’s l2: 256.842 [46] training’s l2: 78.7786 valid_1’s l2: 257.665 [47] training’s l2: 78.5309 valid_1’s l2: 258.698 [48] training’s l2: 78.1787 valid_1’s l2: 259.331 [49] training’s l2: 78.0073 valid_1’s l2: 260.036 [50] training’s l2: 77.8439 valid_1’s l2: 260.908 [1] training’s l2: 
181.976 valid_1’s l2: 264.947 [2] training’s l2: 171.173 valid_1’s l2: 275.578 [3] training’s l2: 161.926 valid_1’s l2: 260.795 [4] training’s l2: 154.458 valid_1’s l2: 248.335 [5] training’s l2: 147.61 valid_1’s l2: 235.198 [6] training’s l2: 140.945 valid_1’s l2: 223.666 [7] training’s l2: 135.092 valid_1’s l2: 213.036 [8] training’s l2: 130.519 valid_1’s l2: 203.637 [9] training’s l2: 125.933 valid_1’s l2: 215.403 [10] training’s l2: 121.637 valid_1’s l2: 207.352 [11] training’s l2: 118.118 valid_1’s l2: 217.934 [12] training’s l2: 114.93 valid_1’s l2: 227.138 [13] training’s l2: 112.1 valid_1’s l2: 219.833 [14] training’s l2: 109.558 valid_1’s l2: 229.61 [15] training’s l2: 107.288 valid_1’s l2: 223.213 [16] training’s l2: 104.99 valid_1’s l2: 218.353 [17] training’s l2: 103.056 valid_1’s l2: 213.066 [18] training’s l2: 101.368 valid_1’s l2: 209.502 [19] training’s l2: 99.9598 valid_1’s l2: 205.871 [20] training’s l2: 98.5693 valid_1’s l2: 203.931 [21] training’s l2: 97.2565 valid_1’s l2: 211.485 [22] training’s l2: 96.2457 valid_1’s l2: 208.971 [23] training’s l2: 95.2098 valid_1’s l2: 218.984 [24] training’s l2: 94.3621 valid_1’s l2: 225.915 [25] training’s l2: 93.5341 valid_1’s l2: 222.766 [26] training’s l2: 92.7745 valid_1’s l2: 221.437 [27] training’s l2: 92.0241 valid_1’s l2: 220.383 [28] training’s l2: 91.2186 valid_1’s l2: 219.604 [29] training’s l2: 90.6907 valid_1’s l2: 218.704 [30] training’s l2: 90.1588 valid_1’s l2: 218.177 [31] training’s l2: 89.5516 valid_1’s l2: 223.377 [32] training’s l2: 88.8837 valid_1’s l2: 220.679 [33] training’s l2: 88.3643 valid_1’s l2: 218.387 [34] training’s l2: 88.0358 valid_1’s l2: 216.503 [35] training’s l2: 87.6762 valid_1’s l2: 215.715 [36] training’s l2: 87.2646 valid_1’s l2: 216.161 [37] training’s l2: 86.919 valid_1’s l2: 215.409 [38] training’s l2: 86.4544 valid_1’s l2: 215.502 [39] training’s l2: 86.0242 valid_1’s l2: 218.742 [40] training’s l2: 85.6701 valid_1’s l2: 219.645 [41] training’s l2: 85.2952 
valid_1’s l2: 218.215 [42] training’s l2: 85.0124 valid_1’s l2: 216.836 [43] training’s l2: 84.7854 valid_1’s l2: 215.599 [44] training’s l2: 84.4793 valid_1’s l2: 215.598 [45] training’s l2: 84.0256 valid_1’s l2: 215.239 [46] training’s l2: 83.7852 valid_1’s l2: 214.256 [47] training’s l2: 83.3949 valid_1’s l2: 213.944 [48] training’s l2: 83.0255 valid_1’s l2: 213.777 [49] training’s l2: 82.767 valid_1’s l2: 213.897 [50] training’s l2: 82.43 valid_1’s l2: 214.046 [51] training’s l2: 82.0641 valid_1’s l2: 212.458 [52] training’s l2: 81.7886 valid_1’s l2: 212.319 [53] training’s l2: 81.4989 valid_1’s l2: 211.561 [54] training’s l2: 81.3257 valid_1’s l2: 212.333 [55] training’s l2: 81.1936 valid_1’s l2: 213.048 [56] training’s l2: 80.915 valid_1’s l2: 214.463 [57] training’s l2: 80.6962 valid_1’s l2: 215.24 [58] training’s l2: 80.5801 valid_1’s l2: 216.456 [59] training’s l2: 80.3643 valid_1’s l2: 219.741 [60] training’s l2: 80.2357 valid_1’s l2: 220.257 [61] training’s l2: 79.9677 valid_1’s l2: 220.522 [62] training’s l2: 79.6561 valid_1’s l2: 220.378 [63] training’s l2: 79.4465 valid_1’s l2: 221.233 [64] training’s l2: 79.2254 valid_1’s l2: 220.866 [65] training’s l2: 79.08 valid_1’s l2: 220.395 [66] training’s l2: 78.9408 valid_1’s l2: 219.842 [67] training’s l2: 78.6085 valid_1’s l2: 220.914 [68] training’s l2: 78.4286 valid_1’s l2: 221.175 [69] training’s l2: 78.1928 valid_1’s l2: 220.842 [70] training’s l2: 77.9342 valid_1’s l2: 220.521 [1] training’s l2: 185.572 valid_1’s l2: 248.598 [2] training’s l2: 173.821 valid_1’s l2: 232.524 [3] training’s l2: 164.299 valid_1’s l2: 221.028 [4] training’s l2: 156.081 valid_1’s l2: 210.026 [5] training’s l2: 148.668 valid_1’s l2: 200.968 [6] training’s l2: 141.933 valid_1’s l2: 193.409 [7] training’s l2: 136.153 valid_1’s l2: 186.268 [8] training’s l2: 130.873 valid_1’s l2: 179.258 [9] training’s l2: 126.122 valid_1’s l2: 173.123 [10] training’s l2: 122.4 valid_1’s l2: 168.657 [11] training’s l2: 118.726 valid_1’s l2: 
163.772 [12] training’s l2: 115.242 valid_1’s l2: 159.829 [13] training’s l2: 112.208 valid_1’s l2: 155.973 [14] training’s l2: 109.399 valid_1’s l2: 152.519 [15] training’s l2: 107.127 valid_1’s l2: 149.713 [16] training’s l2: 105.069 valid_1’s l2: 147.82 [17] training’s l2: 103.371 valid_1’s l2: 145.525 [18] training’s l2: 101.532 valid_1’s l2: 143.049 [19] training’s l2: 100.215 valid_1’s l2: 142.137 [20] training’s l2: 98.9011 valid_1’s l2: 140.583 [21] training’s l2: 97.66 valid_1’s l2: 139.857 [22] training’s l2: 96.63 valid_1’s l2: 139.269 [23] training’s l2: 95.6133 valid_1’s l2: 138.65 [24] training’s l2: 94.6918 valid_1’s l2: 139.455 [25] training’s l2: 93.6657 valid_1’s l2: 138.847 [26] training’s l2: 92.7317 valid_1’s l2: 137.827 [27] training’s l2: 91.9199 valid_1’s l2: 137.44 [28] training’s l2: 91.305 valid_1’s l2: 136.742 [29] training’s l2: 90.0209 valid_1’s l2: 142.351 [30] training’s l2: 89.3433 valid_1’s l2: 142.539 [31] training’s l2: 88.3146 valid_1’s l2: 148.405 [32] training’s l2: 87.3156 valid_1’s l2: 154.517 [33] training’s l2: 86.8501 valid_1’s l2: 156.153 [34] training’s l2: 86.3668 valid_1’s l2: 157.577 [35] training’s l2: 85.9572 valid_1’s l2: 157.379 [36] training’s l2: 85.6386 valid_1’s l2: 157.73 [37] training’s l2: 85.3331 valid_1’s l2: 157.637 [38] training’s l2: 84.7324 valid_1’s l2: 157.236 [39] training’s l2: 84.2202 valid_1’s l2: 156.696 [40] training’s l2: 83.7973 valid_1’s l2: 157.479 [41] training’s l2: 83.3949 valid_1’s l2: 156.364 [42] training’s l2: 82.9546 valid_1’s l2: 155.778 [43] training’s l2: 82.567 valid_1’s l2: 154.96 [44] training’s l2: 82.0161 valid_1’s l2: 159.416 [45] training’s l2: 81.6496 valid_1’s l2: 159.083 [46] training’s l2: 81.3859 valid_1’s l2: 159.622 [47] training’s l2: 81.144 valid_1’s l2: 159.95 [48] training’s l2: 80.8722 valid_1’s l2: 159.696 [49] training’s l2: 80.7166 valid_1’s l2: 159.082 [50] training’s l2: 80.557 valid_1’s l2: 158.459 [51] training’s l2: 80.3325 valid_1’s l2: 158.177 [52] 
training’s l2: 80.1574 valid_1’s l2: 157.977 [53] training’s l2: 79.9992 valid_1’s l2: 157.964 [54] training’s l2: 79.7614 valid_1’s l2: 157.856 [55] training’s l2: 79.5948 valid_1’s l2: 157.542 [56] training’s l2: 79.4393 valid_1’s l2: 157.697 [57] training’s l2: 79.2512 valid_1’s l2: 157.83 [58] training’s l2: 79.0895 valid_1’s l2: 158.008 [59] training’s l2: 78.7997 valid_1’s l2: 158.7 [60] training’s l2: 78.6677 valid_1’s l2: 158.534 [61] training’s l2: 78.3948 valid_1’s l2: 159.081 [62] training’s l2: 78.1873 valid_1’s l2: 159.613 [63] training’s l2: 78.0211 valid_1’s l2: 160.136 [64] training’s l2: 77.7863 valid_1’s l2: 159.924 [65] training’s l2: 77.6018 valid_1’s l2: 160.046 [66] training’s l2: 77.3692 valid_1’s l2: 159.906 [67] training’s l2: 77.1181 valid_1’s l2: 159.808 [68] training’s l2: 76.8823 valid_1’s l2: 159.96 [69] training’s l2: 76.773 valid_1’s l2: 159.984 [70] training’s l2: 76.5827 valid_1’s l2: 160.455 [1] training’s l2: 191.603 valid_1’s l2: 186.57 [2] training’s l2: 179.615 valid_1’s l2: 175.476 [3] training’s l2: 169.601 valid_1’s l2: 166.759 [4] training’s l2: 160.718 valid_1’s l2: 159.195 [5] training’s l2: 152.532 valid_1’s l2: 151.473 [6] training’s l2: 144.97 valid_1’s l2: 144.765 [7] training’s l2: 139.145 valid_1’s l2: 149.502 [8] training’s l2: 133.297 valid_1’s l2: 143.823 [9] training’s l2: 128.477 valid_1’s l2: 146.819 [10] training’s l2: 124.103 valid_1’s l2: 152.426 [11] training’s l2: 120.169 valid_1’s l2: 159.291 [12] training’s l2: 116.836 valid_1’s l2: 155.257 [13] training’s l2: 113.655 valid_1’s l2: 152.465 [14] training’s l2: 110.832 valid_1’s l2: 148.946 [15] training’s l2: 108.382 valid_1’s l2: 152.708 [16] training’s l2: 106.045 valid_1’s l2: 156.915 [17] training’s l2: 103.801 valid_1’s l2: 154.791 [18] training’s l2: 102.084 valid_1’s l2: 153.993 [19] training’s l2: 100.437 valid_1’s l2: 152.838 [20] training’s l2: 98.8454 valid_1’s l2: 156.344 [21] training’s l2: 97.2941 valid_1’s l2: 162.9 [22] training’s l2: 
96.1575 valid_1’s l2: 161.947 [23] training’s l2: 94.892 valid_1’s l2: 169.783 [24] training’s l2: 93.8264 valid_1’s l2: 175.086 [25] training’s l2: 92.8634 valid_1’s l2: 179.95 [26] training’s l2: 91.9237 valid_1’s l2: 184.238 [27] training’s l2: 91.019 valid_1’s l2: 182.562 [28] training’s l2: 90.2524 valid_1’s l2: 186.535 [29] training’s l2: 89.4469 valid_1’s l2: 185.531 [30] training’s l2: 88.6954 valid_1’s l2: 189.147 [31] training’s l2: 88.1107 valid_1’s l2: 188.142 [32] training’s l2: 87.5525 valid_1’s l2: 190.458 [33] training’s l2: 87.1031 valid_1’s l2: 190.978 [34] training’s l2: 86.5261 valid_1’s l2: 197.331 [35] training’s l2: 86.1056 valid_1’s l2: 196.396 [36] training’s l2: 85.5373 valid_1’s l2: 199.09 [37] training’s l2: 85.0395 valid_1’s l2: 200.081 [38] training’s l2: 84.6891 valid_1’s l2: 200.874 [39] training’s l2: 84.4386 valid_1’s l2: 201.743 [40] training’s l2: 84.1133 valid_1’s l2: 201.242 [41] training’s l2: 83.6846 valid_1’s l2: 199.915 [42] training’s l2: 83.3243 valid_1’s l2: 199.493 [43] training’s l2: 83.0083 valid_1’s l2: 198.727 [44] training’s l2: 82.7126 valid_1’s l2: 198.309 [45] training’s l2: 82.5443 valid_1’s l2: 199.148 [46] training’s l2: 82.2186 valid_1’s l2: 198.995 [47] training’s l2: 81.9292 valid_1’s l2: 198.71 [48] training’s l2: 81.7027 valid_1’s l2: 198.122 [49] training’s l2: 81.4932 valid_1’s l2: 198.712 [50] training’s l2: 81.2892 valid_1’s l2: 199.593 [51] training’s l2: 81.0045 valid_1’s l2: 200.175 [52] training’s l2: 80.7583 valid_1’s l2: 200.864 [53] training’s l2: 80.5343 valid_1’s l2: 201.11 [54] training’s l2: 80.3948 valid_1’s l2: 200.587 [55] training’s l2: 80.1765 valid_1’s l2: 200.516 [56] training’s l2: 79.9549 valid_1’s l2: 200.342 [57] training’s l2: 79.7704 valid_1’s l2: 199.926 [58] training’s l2: 79.5933 valid_1’s l2: 199.757 [59] training’s l2: 79.3517 valid_1’s l2: 200.224 [60] training’s l2: 79.1845 valid_1’s l2: 200.674 [61] training’s l2: 78.9526 valid_1’s l2: 199.458 [62] training’s l2: 
78.7603 valid_1’s l2: 198.594 [63] training’s l2: 78.5888 valid_1’s l2: 198.084 [64] training’s l2: 78.3818 valid_1’s l2: 197.717 [65] training’s l2: 78.1224 valid_1’s l2: 197.111 [66] training’s l2: 77.9479 valid_1’s l2: 197.754 [67] training’s l2: 77.7393 valid_1’s l2: 197.864 [68] training’s l2: 77.4902 valid_1’s l2: 197.682 [69] training’s l2: 77.4181 valid_1’s l2: 198.213 [70] training’s l2: 77.1666 valid_1’s l2: 199.113 [1] training’s l2: 182.13 valid_1’s l2: 268.925 [2] training’s l2: 171.234 valid_1’s l2: 253.895 [3] training’s l2: 161.188 valid_1’s l2: 240.087 [4] training’s l2: 152.438 valid_1’s l2: 227.371 [5] training’s l2: 144.973 valid_1’s l2: 216.639 [6] training’s l2: 138.774 valid_1’s l2: 207.305 [7] training’s l2: 133.715 valid_1’s l2: 199.479 [8] training’s l2: 128.522 valid_1’s l2: 190.15 [9] training’s l2: 124.924 valid_1’s l2: 185.098 [10] training’s l2: 120.902 valid_1’s l2: 179.091 [11] training’s l2: 117.485 valid_1’s l2: 174.062 [12] training’s l2: 114.242 valid_1’s l2: 168.602 [13] training’s l2: 111.475 valid_1’s l2: 165.225 [14] training’s l2: 109.119 valid_1’s l2: 161.608 [15] training’s l2: 106.835 valid_1’s l2: 157.535 [16] training’s l2: 104.872 valid_1’s l2: 154.116 [17] training’s l2: 103.3 valid_1’s l2: 151.361 [18] training’s l2: 101.794 valid_1’s l2: 148.737 [19] training’s l2: 100.442 valid_1’s l2: 146.963 [20] training’s l2: 99.3297 valid_1’s l2: 145.469 [21] training’s l2: 98.0576 valid_1’s l2: 143.953 [22] training’s l2: 96.9952 valid_1’s l2: 143.582 [23] training’s l2: 96.1551 valid_1’s l2: 142.703 [24] training’s l2: 95.2429 valid_1’s l2: 141.321 [25] training’s l2: 94.4858 valid_1’s l2: 139.902 [26] training’s l2: 93.4971 valid_1’s l2: 138.747 [27] training’s l2: 92.7743 valid_1’s l2: 137.533 [28] training’s l2: 92.0786 valid_1’s l2: 137.394 [29] training’s l2: 90.7768 valid_1’s l2: 144.579 [30] training’s l2: 89.6475 valid_1’s l2: 150.809 [31] training’s l2: 89.013 valid_1’s l2: 149.92 [32] training’s l2: 88.4401 
valid_1’s l2: 148.834 [33] training’s l2: 87.9473 valid_1’s l2: 147.572 [34] training’s l2: 87.4929 valid_1’s l2: 146.491 [35] training’s l2: 87.0853 valid_1’s l2: 145.65 [36] training’s l2: 86.5945 valid_1’s l2: 146.168 [37] training’s l2: 86.1343 valid_1’s l2: 147.654 [38] training’s l2: 85.785 valid_1’s l2: 148.441 [39] training’s l2: 85.4071 valid_1’s l2: 148.517 [40] training’s l2: 84.6404 valid_1’s l2: 153.813 [41] training’s l2: 83.9097 valid_1’s l2: 159.55 [42] training’s l2: 83.2289 valid_1’s l2: 165.15 [43] training’s l2: 82.9138 valid_1’s l2: 164.142 [44] training’s l2: 82.5748 valid_1’s l2: 163.977 [45] training’s l2: 82.0764 valid_1’s l2: 168.908 [46] training’s l2: 81.8109 valid_1’s l2: 169.097 [47] training’s l2: 81.4568 valid_1’s l2: 168.659 [48] training’s l2: 81.2605 valid_1’s l2: 169.237 [49] training’s l2: 80.9983 valid_1’s l2: 168.894 [50] training’s l2: 80.6754 valid_1’s l2: 168.599 [51] training’s l2: 80.366 valid_1’s l2: 168.284 [52] training’s l2: 80.1175 valid_1’s l2: 167.832 [53] training’s l2: 79.8311 valid_1’s l2: 167.52 [54] training’s l2: 79.5507 valid_1’s l2: 168.844 [55] training’s l2: 79.3371 valid_1’s l2: 168.814 [56] training’s l2: 79.0848 valid_1’s l2: 169.275 [57] training’s l2: 78.8929 valid_1’s l2: 169.257 [58] training’s l2: 78.6591 valid_1’s l2: 168.669 [59] training’s l2: 78.4492 valid_1’s l2: 168.299 [60] training’s l2: 78.2873 valid_1’s l2: 168.636 [61] training’s l2: 78.0389 valid_1’s l2: 168.809 [62] training’s l2: 77.9196 valid_1’s l2: 169.109 [63] training’s l2: 77.706 valid_1’s l2: 168.718 [64] training’s l2: 77.5516 valid_1’s l2: 169.026 [65] training’s l2: 77.3462 valid_1’s l2: 169.956 [66] training’s l2: 77.1325 valid_1’s l2: 170.166 [67] training’s l2: 76.9509 valid_1’s l2: 170.644 [68] training’s l2: 76.7989 valid_1’s l2: 170.923 [69] training’s l2: 76.6742 valid_1’s l2: 171.397 [70] training’s l2: 76.5455 valid_1’s l2: 172.087 [1] training’s l2: 181.575 valid_1’s l2: 263.094 [2] training’s l2: 169.925 
valid_1’s l2: 250.017 [3] training’s l2: 160.469 valid_1’s l2: 237.496 [4] training’s l2: 151.835 valid_1’s l2: 226.981 [5] training’s l2: 144.223 valid_1’s l2: 239.729 [6] training’s l2: 137.229 valid_1’s l2: 255.652 [7] training’s l2: 131.585 valid_1’s l2: 246.362 [8] training’s l2: 126.692 valid_1’s l2: 238.548 [9] training’s l2: 122.05 valid_1’s l2: 253.49 [10] training’s l2: 118.011 valid_1’s l2: 267.586 [11] training’s l2: 114.012 valid_1’s l2: 259.26 [12] training’s l2: 110.599 valid_1’s l2: 268.502 [13] training’s l2: 107.466 valid_1’s l2: 261.291 [14] training’s l2: 104.701 valid_1’s l2: 254.963 [15] training’s l2: 102.282 valid_1’s l2: 249.327 [16] training’s l2: 100.293 valid_1’s l2: 243.619 [17] training’s l2: 98.4398 valid_1’s l2: 238.206 [18] training’s l2: 96.7813 valid_1’s l2: 233.614 [19] training’s l2: 95.4195 valid_1’s l2: 229.749 [20] training’s l2: 93.8965 valid_1’s l2: 227.953 [21] training’s l2: 92.4006 valid_1’s l2: 236.485 [22] training’s l2: 91.2743 valid_1’s l2: 233.261 [23] training’s l2: 90.0754 valid_1’s l2: 243.75 [24] training’s l2: 89.117 valid_1’s l2: 253.566 [25] training’s l2: 88.2277 valid_1’s l2: 257.643 [26] training’s l2: 87.2302 valid_1’s l2: 253.659 [27] training’s l2: 86.4847 valid_1’s l2: 251.105 [28] training’s l2: 85.7754 valid_1’s l2: 248.987 [29] training’s l2: 85.1998 valid_1’s l2: 246.66 [30] training’s l2: 84.6967 valid_1’s l2: 250.542 [31] training’s l2: 84.0718 valid_1’s l2: 248.02 [32] training’s l2: 83.4119 valid_1’s l2: 259.022 [33] training’s l2: 82.9172 valid_1’s l2: 264.104 [34] training’s l2: 82.5544 valid_1’s l2: 263.383 [35] training’s l2: 82.2256 valid_1’s l2: 260.906 [36] training’s l2: 81.8384 valid_1’s l2: 260.967 [37] training’s l2: 81.4423 valid_1’s l2: 260.448 [38] training’s l2: 81.091 valid_1’s l2: 259.701 [39] training’s l2: 80.6978 valid_1’s l2: 258.846 [40] training’s l2: 80.3647 valid_1’s l2: 258.979 [41] training’s l2: 80.0704 valid_1’s l2: 257.698 [42] training’s l2: 79.7841 valid_1’s l2: 
256.787 [43] training’s l2: 79.548 valid_1’s l2: 257.321 [44] training’s l2: 79.3817 valid_1’s l2: 257.261 [45] training’s l2: 79.0728 valid_1’s l2: 256.842 [46] training’s l2: 78.7786 valid_1’s l2: 257.665 [47] training’s l2: 78.5309 valid_1’s l2: 258.698 [48] training’s l2: 78.1787 valid_1’s l2: 259.331 [49] training’s l2: 78.0073 valid_1’s l2: 260.036 [50] training’s l2: 77.8439 valid_1’s l2: 260.908 [51] training’s l2: 77.5918 valid_1’s l2: 259.721 [52] training’s l2: 77.3803 valid_1’s l2: 259.082 [53] training’s l2: 77.178 valid_1’s l2: 257.41 [54] training’s l2: 76.965 valid_1’s l2: 257.029 [55] training’s l2: 76.7343 valid_1’s l2: 255.517 [56] training’s l2: 76.4189 valid_1’s l2: 255.792 [57] training’s l2: 76.0997 valid_1’s l2: 255.582 [58] training’s l2: 75.9178 valid_1’s l2: 256.721 [59] training’s l2: 75.7499 valid_1’s l2: 257.773 [60] training’s l2: 75.6012 valid_1’s l2: 257.739 [61] training’s l2: 75.4726 valid_1’s l2: 257.361 [62] training’s l2: 75.3769 valid_1’s l2: 256.184 [63] training’s l2: 75.1815 valid_1’s l2: 256.603 [64] training’s l2: 75.1169 valid_1’s l2: 257.47 [65] training’s l2: 74.9534 valid_1’s l2: 256.77 [66] training’s l2: 74.7485 valid_1’s l2: 256.268 [67] training’s l2: 74.6057 valid_1’s l2: 255.777 [68] training’s l2: 74.4203 valid_1’s l2: 254.845 [69] training’s l2: 74.2608 valid_1’s l2: 255.01 [70] training’s l2: 74.1225 valid_1’s l2: 254.157 [1] training’s l2: 181.976 valid_1’s l2: 264.947 [2] training’s l2: 171.173 valid_1’s l2: 275.578 [3] training’s l2: 161.926 valid_1’s l2: 260.795 [4] training’s l2: 154.458 valid_1’s l2: 248.335 [5] training’s l2: 147.61 valid_1’s l2: 235.198 [6] training’s l2: 140.945 valid_1’s l2: 223.666 [7] training’s l2: 135.092 valid_1’s l2: 213.036 [8] training’s l2: 130.519 valid_1’s l2: 203.637 [9] training’s l2: 125.933 valid_1’s l2: 215.403 [10] training’s l2: 121.637 valid_1’s l2: 207.352 [11] training’s l2: 118.118 valid_1’s l2: 217.934 [12] training’s l2: 114.93 valid_1’s l2: 227.138 [13] 
training’s l2: 112.1 valid_1’s l2: 219.833 [14] training’s l2: 109.558 valid_1’s l2: 229.61 [15] training’s l2: 107.288 valid_1’s l2: 223.213 [16] training’s l2: 104.99 valid_1’s l2: 218.353 [17] training’s l2: 103.056 valid_1’s l2: 213.066 [18] training’s l2: 101.368 valid_1’s l2: 209.502 [19] training’s l2: 99.9598 valid_1’s l2: 205.871 [20] training’s l2: 98.5693 valid_1’s l2: 203.931 [21] training’s l2: 97.2565 valid_1’s l2: 211.485 [22] training’s l2: 96.2457 valid_1’s l2: 208.971 [23] training’s l2: 95.2098 valid_1’s l2: 218.984 [24] training’s l2: 94.3621 valid_1’s l2: 225.915 [25] training’s l2: 93.5341 valid_1’s l2: 222.766 [26] training’s l2: 92.7745 valid_1’s l2: 221.437 [27] training’s l2: 92.0241 valid_1’s l2: 220.383 [28] training’s l2: 91.2186 valid_1’s l2: 219.604 [29] training’s l2: 90.6907 valid_1’s l2: 218.704 [30] training’s l2: 90.1588 valid_1’s l2: 218.177 [31] training’s l2: 89.5516 valid_1’s l2: 223.377 [32] training’s l2: 88.8837 valid_1’s l2: 220.679 [33] training’s l2: 88.3643 valid_1’s l2: 218.387 [34] training’s l2: 88.0358 valid_1’s l2: 216.503 [35] training’s l2: 87.6762 valid_1’s l2: 215.715 [36] training’s l2: 87.2646 valid_1’s l2: 216.161 [37] training’s l2: 86.919 valid_1’s l2: 215.409 [38] training’s l2: 86.4544 valid_1’s l2: 215.502 [39] training’s l2: 86.0242 valid_1’s l2: 218.742 [40] training’s l2: 85.6701 valid_1’s l2: 219.645 [41] training’s l2: 85.2952 valid_1’s l2: 218.215 [42] training’s l2: 85.0124 valid_1’s l2: 216.836 [43] training’s l2: 84.7854 valid_1’s l2: 215.599 [44] training’s l2: 84.4793 valid_1’s l2: 215.598 [45] training’s l2: 84.0256 valid_1’s l2: 215.239 [46] training’s l2: 83.7852 valid_1’s l2: 214.256 [47] training’s l2: 83.3949 valid_1’s l2: 213.944 [48] training’s l2: 83.0255 valid_1’s l2: 213.777 [49] training’s l2: 82.767 valid_1’s l2: 213.897 [50] training’s l2: 82.43 valid_1’s l2: 214.046 [51] training’s l2: 82.0641 valid_1’s l2: 212.458 [52] training’s l2: 81.7886 valid_1’s l2: 212.319 [53] 
training’s l2: 81.4989 valid_1’s l2: 211.561 [54] training’s l2: 81.3257 valid_1’s l2: 212.333 [55] training’s l2: 81.1936 valid_1’s l2: 213.048 [56] training’s l2: 80.915 valid_1’s l2: 214.463 [57] training’s l2: 80.6962 valid_1’s l2: 215.24 [58] training’s l2: 80.5801 valid_1’s l2: 216.456 [59] training’s l2: 80.3643 valid_1’s l2: 219.741 [60] training’s l2: 80.2357 valid_1’s l2: 220.257 [61] training’s l2: 79.9677 valid_1’s l2: 220.522 [62] training’s l2: 79.6561 valid_1’s l2: 220.378 [63] training’s l2: 79.4465 valid_1’s l2: 221.233 [64] training’s l2: 79.2254 valid_1’s l2: 220.866 [65] training’s l2: 79.08 valid_1’s l2: 220.395 [66] training’s l2: 78.9408 valid_1’s l2: 219.842 [67] training’s l2: 78.6085 valid_1’s l2: 220.914 [68] training’s l2: 78.4286 valid_1’s l2: 221.175 [69] training’s l2: 78.1928 valid_1’s l2: 220.842 [70] training’s l2: 77.9342 valid_1’s l2: 220.521 [71] training’s l2: 77.7903 valid_1’s l2: 220.228 [72] training’s l2: 77.6553 valid_1’s l2: 219.738 [73] training’s l2: 77.5304 valid_1’s l2: 218.981 [74] training’s l2: 77.3058 valid_1’s l2: 218.487 [75] training’s l2: 77.1282 valid_1’s l2: 218.492 [76] training’s l2: 76.9462 valid_1’s l2: 219.049 [77] training’s l2: 76.7116 valid_1’s l2: 220.044 [78] training’s l2: 76.6249 valid_1’s l2: 219.668 [79] training’s l2: 76.5202 valid_1’s l2: 219.29 [80] training’s l2: 76.304 valid_1’s l2: 219.204 [81] training’s l2: 76.0367 valid_1’s l2: 218.327 [82] training’s l2: 75.8774 valid_1’s l2: 218.16 [83] training’s l2: 75.6845 valid_1’s l2: 217.262 [84] training’s l2: 75.4574 valid_1’s l2: 216.369 [85] training’s l2: 75.2471 valid_1’s l2: 215.945 [86] training’s l2: 75.1309 valid_1’s l2: 216.104 [87] training’s l2: 74.9042 valid_1’s l2: 215.791 [88] training’s l2: 74.7636 valid_1’s l2: 216.589 [89] training’s l2: 74.6492 valid_1’s l2: 217.23 [90] training’s l2: 74.3943 valid_1’s l2: 217.412 [91] training’s l2: 74.1933 valid_1’s l2: 217.229 [92] training’s l2: 73.9776 valid_1’s l2: 216.808 [93] 
training’s l2: 73.734 valid_1’s l2: 216.409 [94] training’s l2: 73.6212 valid_1’s l2: 215.92 [95] training’s l2: 73.4973 valid_1’s l2: 215.754 [96] training’s l2: 73.2653 valid_1’s l2: 214.922 [97] training’s l2: 73.1166 valid_1’s l2: 214.322 [98] training’s l2: 72.9421 valid_1’s l2: 213.556 [99] training’s l2: 72.7465 valid_1’s l2: 213.427 [100] training’s l2: 72.6943 valid_1’s l2: 213.622 [1] training’s l2: 185.572 valid_1’s l2: 248.598 [2] training’s l2: 173.821 valid_1’s l2: 232.524 [3] training’s l2: 164.299 valid_1’s l2: 221.028 [4] training’s l2: 156.081 valid_1’s l2: 210.026 [5] training’s l2: 148.668 valid_1’s l2: 200.968 [6] training’s l2: 141.933 valid_1’s l2: 193.409 [7] training’s l2: 136.153 valid_1’s l2: 186.268 [8] training’s l2: 130.873 valid_1’s l2: 179.258 [9] training’s l2: 126.122 valid_1’s l2: 173.123 [10] training’s l2: 122.4 valid_1’s l2: 168.657 [11] training’s l2: 118.726 valid_1’s l2: 163.772 [12] training’s l2: 115.242 valid_1’s l2: 159.829 [13] training’s l2: 112.208 valid_1’s l2: 155.973 [14] training’s l2: 109.399 valid_1’s l2: 152.519 [15] training’s l2: 107.127 valid_1’s l2: 149.713 [16] training’s l2: 105.069 valid_1’s l2: 147.82 [17] training’s l2: 103.371 valid_1’s l2: 145.525 [18] training’s l2: 101.532 valid_1’s l2: 143.049 [19] training’s l2: 100.215 valid_1’s l2: 142.137 [20] training’s l2: 98.9011 valid_1’s l2: 140.583 [21] training’s l2: 97.66 valid_1’s l2: 139.857 [22] training’s l2: 96.63 valid_1’s l2: 139.269 [23] training’s l2: 95.6133 valid_1’s l2: 138.65 [24] training’s l2: 94.6918 valid_1’s l2: 139.455 [25] training’s l2: 93.6657 valid_1’s l2: 138.847 [26] training’s l2: 92.7317 valid_1’s l2: 137.827 [27] training’s l2: 91.9199 valid_1’s l2: 137.44 [28] training’s l2: 91.305 valid_1’s l2: 136.742 [29] training’s l2: 90.0209 valid_1’s l2: 142.351 [30] training’s l2: 89.3433 valid_1’s l2: 142.539 [31] training’s l2: 88.3146 valid_1’s l2: 148.405 [32] training’s l2: 87.3156 valid_1’s l2: 154.517 [33] training’s l2: 
[LightGBM per-iteration training log, truncated. For each CV fold, over 100 boosting rounds the training l2 falls steadily (e.g. from ~192 down to ~70) while the valid_1 l2 bottoms out within the first few dozen rounds and then drifts upward — the classic overfitting pattern that the round-count scan below is meant to catch.]

| num_boosting_rounds | train mse | eval mse | test mse |
|---:|---:|---:|---:|
| 3   | 1.434403 | 1.622983 | 1.174889 |
| 5   | 0.930536 | 1.552662 | 0.613107 |
| 7   | 0.605738 | 1.152197 | 0.594836 |
| 10  | 0.572752 | 1.195924 | 0.607897 |
| 30  | 0.393149 | 1.265725 | 0.746060 |
| 50  | 0.373485 | 1.272142 | 0.869008 |
| 70  | 0.361333 | 1.281454 | 0.874337 |
| 100 | 0.346081 | 1.292908 | 0.842149 |

num_boosting_rounds=50
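The scan behind the table above reduces to a small loop: run cross-validation at each candidate round count and keep the one with the lowest held-out mse. A minimal sketch of that procedure — the `cv_eval_mse` helper here is a synthetic stand-in (a toy U-shaped curve mimicking under-/over-fitting), not the notebook's actual k-fold LightGBM run:

```python
# Scan candidate boosting-round counts and keep the lowest eval-mse setting.
candidate_rounds = [3, 5, 7, 10, 30, 50, 70, 100]

def cv_eval_mse(n_rounds):
    """Synthetic stand-in for a k-fold LightGBM run at a given round count.
    The toy U-shaped curve mimics under-fitting (too few rounds) and
    over-fitting (too many)."""
    return (n_rounds - 50) ** 2 / 1000 + 1.0

scores = {n: cv_eval_mse(n) for n in candidate_rounds}
best = min(scores, key=scores.get)  # round count with the lowest eval mse
```

In the real notebook, `cv_eval_mse` would be the mean valid-fold l2 from the fold loop, and `best` would be compared against the test mse column before committing to a setting.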

6.0.4 Early Stopping

[LightGBM early-stopping log, truncated. With `early_stopping_rounds=5`, three folds run all 50 rounds ("Did not meet early stopping. Best iteration is: [50]"), one fold stops at round 40 with best iteration [35] (valid_1 l2: 183.917), and one stops at round 49 with best iteration [44] (valid_1 l2: 263.122). Rerunning with `early_stopping_rounds=10`, the extra patience lets the fold that previously stopped at [35] keep improving to best iteration [50] (valid_1 l2: 180.307); the log for the remaining folds is cut off here.]
l2: 247.202 [39] training’s l2: 166.246 valid_1’s l2: 246.514 [40] training’s l2: 165.656 valid_1’s l2: 245.832 [41] training’s l2: 165.022 valid_1’s l2: 244.976 [42] training’s l2: 164.396 valid_1’s l2: 244.098 [43] training’s l2: 163.775 valid_1’s l2: 243.258 [44] training’s l2: 163.216 valid_1’s l2: 242.419 [45] training’s l2: 162.651 valid_1’s l2: 241.665 [46] training’s l2: 162.062 valid_1’s l2: 240.807 [47] training’s l2: 161.485 valid_1’s l2: 240.036 [48] training’s l2: 160.903 valid_1’s l2: 239.226 [49] training’s l2: 160.328 valid_1’s l2: 238.423 [50] training’s l2: 159.758 valid_1’s l2: 237.628 Did not meet early stopping. Best iteration is: [50] training’s l2: 159.758 valid_1’s l2: 237.628 [1] training’s l2: 194.307 valid_1’s l2: 279.622 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 193.445 valid_1’s l2: 279.663 [3] training’s l2: 192.594 valid_1’s l2: 278.589 [4] training’s l2: 191.711 valid_1’s l2: 278.55 [5] training’s l2: 190.871 valid_1’s l2: 278.603 [6] training’s l2: 190.025 valid_1’s l2: 279.488 [7] training’s l2: 189.234 valid_1’s l2: 278.579 [8] training’s l2: 188.455 valid_1’s l2: 277.645 [9] training’s l2: 187.63 valid_1’s l2: 278.526 [10] training’s l2: 186.815 valid_1’s l2: 279.383 [11] training’s l2: 185.988 valid_1’s l2: 278.639 [12] training’s l2: 185.183 valid_1’s l2: 277.974 [13] training’s l2: 184.384 valid_1’s l2: 277.316 [14] training’s l2: 183.593 valid_1’s l2: 276.683 [15] training’s l2: 182.86 valid_1’s l2: 275.718 [16] training’s l2: 182.069 valid_1’s l2: 274.74 [17] training’s l2: 181.286 valid_1’s l2: 273.772 [18] training’s l2: 180.509 valid_1’s l2: 272.839 [19] training’s l2: 179.751 valid_1’s l2: 271.816 [20] training’s l2: 178.988 valid_1’s l2: 270.9 [21] training’s l2: 178.213 valid_1’s l2: 270.528 [22] training’s l2: 177.488 valid_1’s l2: 269.642 [23] training’s l2: 176.725 valid_1’s l2: 269.265 [24] training’s l2: 176.006 valid_1’s l2: 270.131 [25] training’s l2: 175.259 valid_1’s l2: 
269.781 [26] training’s l2: 174.525 valid_1’s l2: 268.812 [27] training’s l2: 173.814 valid_1’s l2: 267.91 [28] training’s l2: 173.088 valid_1’s l2: 266.877 [29] training’s l2: 172.34 valid_1’s l2: 266.481 [30] training’s l2: 171.599 valid_1’s l2: 266.01 [31] training’s l2: 170.933 valid_1’s l2: 266.011 [32] training’s l2: 170.226 valid_1’s l2: 265.964 [33] training’s l2: 169.528 valid_1’s l2: 266.016 [34] training’s l2: 168.841 valid_1’s l2: 266.016 [35] training’s l2: 168.155 valid_1’s l2: 265.984 [36] training’s l2: 167.483 valid_1’s l2: 265.37 [37] training’s l2: 166.824 valid_1’s l2: 264.803 [38] training’s l2: 166.156 valid_1’s l2: 264.272 [39] training’s l2: 165.508 valid_1’s l2: 263.719 [40] training’s l2: 164.864 valid_1’s l2: 263.463 [41] training’s l2: 164.229 valid_1’s l2: 263.367 [42] training’s l2: 163.601 valid_1’s l2: 264.253 [43] training’s l2: 163.013 valid_1’s l2: 263.47 [44] training’s l2: 162.394 valid_1’s l2: 263.122 [45] training’s l2: 161.783 valid_1’s l2: 264.005 [46] training’s l2: 161.181 valid_1’s l2: 263.596 [47] training’s l2: 160.55 valid_1’s l2: 264.048 [48] training’s l2: 159.959 valid_1’s l2: 263.621 [49] training’s l2: 159.385 valid_1’s l2: 263.515 [50] training’s l2: 158.805 valid_1’s l2: 263.099 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 158.805 valid_1’s l2: 263.099 [1] training’s l2: 193.625 valid_1’s l2: 283.865 Training until validation scores don’t improve for 15 rounds [2] training’s l2: 192.834 valid_1’s l2: 283.578 [3] training’s l2: 192.01 valid_1’s l2: 282.389 [4] training’s l2: 191.158 valid_1’s l2: 281.972 [5] training’s l2: 190.387 valid_1’s l2: 281.703 [6] training’s l2: 189.58 valid_1’s l2: 280.991 [7] training’s l2: 188.82 valid_1’s l2: 279.919 [8] training’s l2: 188.067 valid_1’s l2: 278.835 [9] training’s l2: 187.282 valid_1’s l2: 278.151 [10] training’s l2: 186.505 valid_1’s l2: 277.474 [11] training’s l2: 185.732 valid_1’s l2: 276.417 [12] training’s l2: 184.986 valid_1’s l2: 277.021 [13] training’s l2: 184.243 valid_1’s l2: 277.628 [14] training’s l2: 183.518 valid_1’s l2: 278.172 [15] training’s l2: 182.812 valid_1’s l2: 277.093 [16] training’s l2: 182.079 valid_1’s l2: 276.084 [17] training’s l2: 181.354 valid_1’s l2: 275.085 [18] training’s l2: 180.637 valid_1’s l2: 274.094 [19] training’s l2: 179.91 valid_1’s l2: 272.942 [20] training’s l2: 179.205 valid_1’s l2: 271.968 [21] training’s l2: 178.496 valid_1’s l2: 271.67 [22] training’s l2: 177.846 valid_1’s l2: 270.901 [23] training’s l2: 177.151 valid_1’s l2: 270.617 [24] training’s l2: 176.409 valid_1’s l2: 270.023 [25] training’s l2: 175.726 valid_1’s l2: 269.75 [26] training’s l2: 175.091 valid_1’s l2: 268.806 [27] training’s l2: 174.48 valid_1’s l2: 267.878 [28] training’s l2: 173.811 valid_1’s l2: 266.845 [29] training’s l2: 173.17 valid_1’s l2: 266.611 [30] training’s l2: 172.537 valid_1’s l2: 266.383 [31] training’s l2: 171.922 valid_1’s l2: 267.006 [32] training’s l2: 171.261 valid_1’s l2: 266.787 [33] training’s l2: 170.607 valid_1’s l2: 266.574 [34] training’s l2: 169.96 valid_1’s l2: 266.367 [35] training’s l2: 169.333 valid_1’s l2: 266.913 [36] training’s l2: 168.705 valid_1’s l2: 266.423 [37] training’s l2: 168.084 valid_1’s l2: 265.939 [38] training’s l2: 167.445 valid_1’s 
l2: 266.445 [39] training’s l2: 166.834 valid_1’s l2: 265.969 [40] training’s l2: 166.232 valid_1’s l2: 265.483 [41] training’s l2: 165.627 valid_1’s l2: 265.23 [42] training’s l2: 165.031 valid_1’s l2: 265.713 [43] training’s l2: 164.476 valid_1’s l2: 264.817 [44] training’s l2: 163.888 valid_1’s l2: 264.576 [45] training’s l2: 163.308 valid_1’s l2: 265.057 [46] training’s l2: 162.736 valid_1’s l2: 264.428 [47] training’s l2: 162.132 valid_1’s l2: 263.63 [48] training’s l2: 161.57 valid_1’s l2: 263 [49] training’s l2: 161.015 valid_1’s l2: 262.377 [50] training’s l2: 160.465 valid_1’s l2: 261.76 Did not meet early stopping. Best iteration is: [50] training’s l2: 160.465 valid_1’s l2: 261.76 [1] training’s l2: 197.617 valid_1’s l2: 264.133 Training until validation scores don’t improve for 15 rounds [2] training’s l2: 196.824 valid_1’s l2: 263.054 [3] training’s l2: 195.994 valid_1’s l2: 261.969 [4] training’s l2: 195.17 valid_1’s l2: 260.922 [5] training’s l2: 194.398 valid_1’s l2: 259.873 [6] training’s l2: 193.557 valid_1’s l2: 258.938 [7] training’s l2: 192.729 valid_1’s l2: 258.016 [8] training’s l2: 191.893 valid_1’s l2: 257.089 [9] training’s l2: 191.077 valid_1’s l2: 256.181 [10] training’s l2: 190.275 valid_1’s l2: 255.28 [11] training’s l2: 189.483 valid_1’s l2: 254.394 [12] training’s l2: 188.709 valid_1’s l2: 253.386 [13] training’s l2: 187.944 valid_1’s l2: 252.449 [14] training’s l2: 187.186 valid_1’s l2: 251.521 [15] training’s l2: 186.438 valid_1’s l2: 250.607 [16] training’s l2: 185.669 valid_1’s l2: 249.589 [17] training’s l2: 184.944 valid_1’s l2: 248.652 [18] training’s l2: 184.189 valid_1’s l2: 247.654 [19] training’s l2: 183.445 valid_1’s l2: 246.74 [20] training’s l2: 182.738 valid_1’s l2: 245.806 [21] training’s l2: 182.019 valid_1’s l2: 244.928 [22] training’s l2: 181.315 valid_1’s l2: 244.104 [23] training’s l2: 180.607 valid_1’s l2: 243.236 [24] training’s l2: 179.942 valid_1’s l2: 242.465 [25] training’s l2: 179.246 valid_1’s l2: 241.611 
[26] training’s l2: 178.542 valid_1’s l2: 240.708 [27] training’s l2: 177.852 valid_1’s l2: 239.788 [28] training’s l2: 177.179 valid_1’s l2: 238.962 [29] training’s l2: 176.494 valid_1’s l2: 238.083 [30] training’s l2: 175.807 valid_1’s l2: 237.213 [31] training’s l2: 175.186 valid_1’s l2: 236.452 [32] training’s l2: 174.513 valid_1’s l2: 235.719 [33] training’s l2: 173.847 valid_1’s l2: 234.993 [34] training’s l2: 173.19 valid_1’s l2: 234.317 [35] training’s l2: 172.54 valid_1’s l2: 233.604 [36] training’s l2: 171.866 valid_1’s l2: 232.853 [37] training’s l2: 171.199 valid_1’s l2: 232.109 [38] training’s l2: 170.58 valid_1’s l2: 231.423 [39] training’s l2: 169.954 valid_1’s l2: 230.648 [40] training’s l2: 169.334 valid_1’s l2: 229.868 [41] training’s l2: 168.705 valid_1’s l2: 229.042 [42] training’s l2: 168.014 valid_1’s l2: 228.095 [43] training’s l2: 167.396 valid_1’s l2: 227.277 [44] training’s l2: 166.786 valid_1’s l2: 226.471 [45] training’s l2: 166.114 valid_1’s l2: 225.523 [46] training’s l2: 165.535 valid_1’s l2: 224.763 [47] training’s l2: 164.952 valid_1’s l2: 224.065 [48] training’s l2: 164.384 valid_1’s l2: 223.319 [49] training’s l2: 163.822 valid_1’s l2: 222.569 [50] training’s l2: 163.272 valid_1’s l2: 221.882 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 163.272 valid_1’s l2: 221.882 [1] training’s l2: 204.481 valid_1’s l2: 198.34 Training until validation scores don’t improve for 15 rounds [2] training’s l2: 203.591 valid_1’s l2: 198.34 [3] training’s l2: 202.701 valid_1’s l2: 197.909 [4] training’s l2: 201.82 valid_1’s l2: 197.496 [5] training’s l2: 200.952 valid_1’s l2: 197.468 [6] training’s l2: 200.046 valid_1’s l2: 197.114 [7] training’s l2: 199.158 valid_1’s l2: 196.54 [8] training’s l2: 198.268 valid_1’s l2: 195.978 [9] training’s l2: 197.383 valid_1’s l2: 195.472 [10] training’s l2: 196.502 valid_1’s l2: 195.041 [11] training’s l2: 195.679 valid_1’s l2: 194.322 [12] training’s l2: 194.866 valid_1’s l2: 193.87 [13] training’s l2: 194.063 valid_1’s l2: 193.426 [14] training’s l2: 193.267 valid_1’s l2: 192.988 [15] training’s l2: 192.487 valid_1’s l2: 192.53 [16] training’s l2: 191.67 valid_1’s l2: 191.981 [17] training’s l2: 190.863 valid_1’s l2: 191.449 [18] training’s l2: 190.069 valid_1’s l2: 190.779 [19] training’s l2: 189.281 valid_1’s l2: 190.122 [20] training’s l2: 188.492 valid_1’s l2: 189.574 [21] training’s l2: 187.719 valid_1’s l2: 188.927 [22] training’s l2: 186.948 valid_1’s l2: 188.292 [23] training’s l2: 186.188 valid_1’s l2: 187.655 [24] training’s l2: 185.453 valid_1’s l2: 187.647 [25] training’s l2: 184.704 valid_1’s l2: 187.035 [26] training’s l2: 183.935 valid_1’s l2: 186.467 [27] training’s l2: 183.196 valid_1’s l2: 186.299 [28] training’s l2: 182.491 valid_1’s l2: 185.653 [29] training’s l2: 181.732 valid_1’s l2: 185.238 [30] training’s l2: 180.986 valid_1’s l2: 184.709 [31] training’s l2: 180.298 valid_1’s l2: 184.74 [32] training’s l2: 179.584 valid_1’s l2: 184.428 [33] training’s l2: 178.857 valid_1’s l2: 184.367 [34] training’s l2: 178.167 valid_1’s l2: 184.027 [35] training’s l2: 177.456 valid_1’s l2: 183.917 [36] training’s l2: 176.757 valid_1’s l2: 184.187 [37] training’s l2: 176.064 valid_1’s l2: 184.459 [38] training’s l2: 175.371 
valid_1’s l2: 183.918 [39] training’s l2: 174.69 valid_1’s l2: 184.192 [40] training’s l2: 174.016 valid_1’s l2: 184.395 [41] training’s l2: 173.353 valid_1’s l2: 184.1 [42] training’s l2: 172.685 valid_1’s l2: 183.764 [43] training’s l2: 172.04 valid_1’s l2: 183.219 [44] training’s l2: 171.399 valid_1’s l2: 182.935 [45] training’s l2: 170.74 valid_1’s l2: 182.599 [46] training’s l2: 170.106 valid_1’s l2: 182.392 [47] training’s l2: 169.455 valid_1’s l2: 181.879 [48] training’s l2: 168.833 valid_1’s l2: 181.35 [49] training’s l2: 168.207 valid_1’s l2: 180.81 [50] training’s l2: 167.598 valid_1’s l2: 180.307 Did not meet early stopping. Best iteration is: [50] training’s l2: 167.598 valid_1’s l2: 180.307 [1] training’s l2: 193.21 valid_1’s l2: 285.277 Training until validation scores don’t improve for 15 rounds [2] training’s l2: 192.378 valid_1’s l2: 284.126 [3] training’s l2: 191.579 valid_1’s l2: 283.015 [4] training’s l2: 190.788 valid_1’s l2: 281.955 [5] training’s l2: 189.976 valid_1’s l2: 280.831 [6] training’s l2: 189.197 valid_1’s l2: 279.758 [7] training’s l2: 188.431 valid_1’s l2: 278.668 [8] training’s l2: 187.666 valid_1’s l2: 277.612 [9] training’s l2: 186.91 valid_1’s l2: 276.566 [10] training’s l2: 186.161 valid_1’s l2: 275.529 [11] training’s l2: 185.366 valid_1’s l2: 274.421 [12] training’s l2: 184.559 valid_1’s l2: 273.235 [13] training’s l2: 183.76 valid_1’s l2: 272.053 [14] training’s l2: 182.966 valid_1’s l2: 270.865 [15] training’s l2: 182.182 valid_1’s l2: 269.727 [16] training’s l2: 181.442 valid_1’s l2: 268.621 [17] training’s l2: 180.71 valid_1’s l2: 267.525 [18] training’s l2: 179.986 valid_1’s l2: 266.441 [19] training’s l2: 179.239 valid_1’s l2: 265.355 [20] training’s l2: 178.528 valid_1’s l2: 264.289 [21] training’s l2: 177.848 valid_1’s l2: 263.182 [22] training’s l2: 177.15 valid_1’s l2: 262.205 [23] training’s l2: 176.46 valid_1’s l2: 261.239 [24] training’s l2: 175.79 valid_1’s l2: 260.292 [25] training’s l2: 175.11 valid_1’s l2: 
259.197 [26] training’s l2: 174.416 valid_1’s l2: 258.177 [27] training’s l2: 173.755 valid_1’s l2: 257.174 [28] training’s l2: 173.092 valid_1’s l2: 256.191 [29] training’s l2: 172.413 valid_1’s l2: 255.198 [30] training’s l2: 171.742 valid_1’s l2: 254.214 [31] training’s l2: 171.08 valid_1’s l2: 253.284 [32] training’s l2: 170.46 valid_1’s l2: 252.373 [33] training’s l2: 169.845 valid_1’s l2: 251.479 [34] training’s l2: 169.238 valid_1’s l2: 250.585 [35] training’s l2: 168.636 valid_1’s l2: 249.696 [36] training’s l2: 168.029 valid_1’s l2: 248.878 [37] training’s l2: 167.429 valid_1’s l2: 248.067 [38] training’s l2: 166.835 valid_1’s l2: 247.202 [39] training’s l2: 166.246 valid_1’s l2: 246.514 [40] training’s l2: 165.656 valid_1’s l2: 245.832 [41] training’s l2: 165.022 valid_1’s l2: 244.976 [42] training’s l2: 164.396 valid_1’s l2: 244.098 [43] training’s l2: 163.775 valid_1’s l2: 243.258 [44] training’s l2: 163.216 valid_1’s l2: 242.419 [45] training’s l2: 162.651 valid_1’s l2: 241.665 [46] training’s l2: 162.062 valid_1’s l2: 240.807 [47] training’s l2: 161.485 valid_1’s l2: 240.036 [48] training’s l2: 160.903 valid_1’s l2: 239.226 [49] training’s l2: 160.328 valid_1’s l2: 238.423 [50] training’s l2: 159.758 valid_1’s l2: 237.628 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 159.758 valid_1’s l2: 237.628 [1] training’s l2: 194.307 valid_1’s l2: 279.622 Training until validation scores don’t improve for 15 rounds [2] training’s l2: 193.445 valid_1’s l2: 279.663 [3] training’s l2: 192.594 valid_1’s l2: 278.589 [4] training’s l2: 191.711 valid_1’s l2: 278.55 [5] training’s l2: 190.871 valid_1’s l2: 278.603 [6] training’s l2: 190.025 valid_1’s l2: 279.488 [7] training’s l2: 189.234 valid_1’s l2: 278.579 [8] training’s l2: 188.455 valid_1’s l2: 277.645 [9] training’s l2: 187.63 valid_1’s l2: 278.526 [10] training’s l2: 186.815 valid_1’s l2: 279.383 [11] training’s l2: 185.988 valid_1’s l2: 278.639 [12] training’s l2: 185.183 valid_1’s l2: 277.974 [13] training’s l2: 184.384 valid_1’s l2: 277.316 [14] training’s l2: 183.593 valid_1’s l2: 276.683 [15] training’s l2: 182.86 valid_1’s l2: 275.718 [16] training’s l2: 182.069 valid_1’s l2: 274.74 [17] training’s l2: 181.286 valid_1’s l2: 273.772 [18] training’s l2: 180.509 valid_1’s l2: 272.839 [19] training’s l2: 179.751 valid_1’s l2: 271.816 [20] training’s l2: 178.988 valid_1’s l2: 270.9 [21] training’s l2: 178.213 valid_1’s l2: 270.528 [22] training’s l2: 177.488 valid_1’s l2: 269.642 [23] training’s l2: 176.725 valid_1’s l2: 269.265 [24] training’s l2: 176.006 valid_1’s l2: 270.131 [25] training’s l2: 175.259 valid_1’s l2: 269.781 [26] training’s l2: 174.525 valid_1’s l2: 268.812 [27] training’s l2: 173.814 valid_1’s l2: 267.91 [28] training’s l2: 173.088 valid_1’s l2: 266.877 [29] training’s l2: 172.34 valid_1’s l2: 266.481 [30] training’s l2: 171.599 valid_1’s l2: 266.01 [31] training’s l2: 170.933 valid_1’s l2: 266.011 [32] training’s l2: 170.226 valid_1’s l2: 265.964 [33] training’s l2: 169.528 valid_1’s l2: 266.016 [34] training’s l2: 168.841 valid_1’s l2: 266.016 [35] training’s l2: 168.155 valid_1’s l2: 265.984 [36] training’s l2: 167.483 valid_1’s l2: 265.37 [37] training’s l2: 166.824 valid_1’s l2: 264.803 [38] training’s l2: 166.156 valid_1’s 
l2: 264.272 [39] training’s l2: 165.508 valid_1’s l2: 263.719 [40] training’s l2: 164.864 valid_1’s l2: 263.463 [41] training’s l2: 164.229 valid_1’s l2: 263.367 [42] training’s l2: 163.601 valid_1’s l2: 264.253 [43] training’s l2: 163.013 valid_1’s l2: 263.47 [44] training’s l2: 162.394 valid_1’s l2: 263.122 [45] training’s l2: 161.783 valid_1’s l2: 264.005 [46] training’s l2: 161.181 valid_1’s l2: 263.596 [47] training’s l2: 160.55 valid_1’s l2: 264.048 [48] training’s l2: 159.959 valid_1’s l2: 263.621 [49] training’s l2: 159.385 valid_1’s l2: 263.515 [50] training’s l2: 158.805 valid_1’s l2: 263.099 Did not meet early stopping. Best iteration is: [50] training’s l2: 158.805 valid_1’s l2: 263.099 [1] training’s l2: 193.625 valid_1’s l2: 283.865 Training until validation scores don’t improve for 20 rounds [2] training’s l2: 192.834 valid_1’s l2: 283.578 [3] training’s l2: 192.01 valid_1’s l2: 282.389 [4] training’s l2: 191.158 valid_1’s l2: 281.972 [5] training’s l2: 190.387 valid_1’s l2: 281.703 [6] training’s l2: 189.58 valid_1’s l2: 280.991 [7] training’s l2: 188.82 valid_1’s l2: 279.919 [8] training’s l2: 188.067 valid_1’s l2: 278.835 [9] training’s l2: 187.282 valid_1’s l2: 278.151 [10] training’s l2: 186.505 valid_1’s l2: 277.474 [11] training’s l2: 185.732 valid_1’s l2: 276.417 [12] training’s l2: 184.986 valid_1’s l2: 277.021 [13] training’s l2: 184.243 valid_1’s l2: 277.628 [14] training’s l2: 183.518 valid_1’s l2: 278.172 [15] training’s l2: 182.812 valid_1’s l2: 277.093 [16] training’s l2: 182.079 valid_1’s l2: 276.084 [17] training’s l2: 181.354 valid_1’s l2: 275.085 [18] training’s l2: 180.637 valid_1’s l2: 274.094 [19] training’s l2: 179.91 valid_1’s l2: 272.942 [20] training’s l2: 179.205 valid_1’s l2: 271.968 [21] training’s l2: 178.496 valid_1’s l2: 271.67 [22] training’s l2: 177.846 valid_1’s l2: 270.901 [23] training’s l2: 177.151 valid_1’s l2: 270.617 [24] training’s l2: 176.409 valid_1’s l2: 270.023 [25] training’s l2: 175.726 valid_1’s l2: 
269.75 [26] training’s l2: 175.091 valid_1’s l2: 268.806 [27] training’s l2: 174.48 valid_1’s l2: 267.878 [28] training’s l2: 173.811 valid_1’s l2: 266.845 [29] training’s l2: 173.17 valid_1’s l2: 266.611 [30] training’s l2: 172.537 valid_1’s l2: 266.383 [31] training’s l2: 171.922 valid_1’s l2: 267.006 [32] training’s l2: 171.261 valid_1’s l2: 266.787 [33] training’s l2: 170.607 valid_1’s l2: 266.574 [34] training’s l2: 169.96 valid_1’s l2: 266.367 [35] training’s l2: 169.333 valid_1’s l2: 266.913 [36] training’s l2: 168.705 valid_1’s l2: 266.423 [37] training’s l2: 168.084 valid_1’s l2: 265.939 [38] training’s l2: 167.445 valid_1’s l2: 266.445 [39] training’s l2: 166.834 valid_1’s l2: 265.969 [40] training’s l2: 166.232 valid_1’s l2: 265.483 [41] training’s l2: 165.627 valid_1’s l2: 265.23 [42] training’s l2: 165.031 valid_1’s l2: 265.713 [43] training’s l2: 164.476 valid_1’s l2: 264.817 [44] training’s l2: 163.888 valid_1’s l2: 264.576 [45] training’s l2: 163.308 valid_1’s l2: 265.057 [46] training’s l2: 162.736 valid_1’s l2: 264.428 [47] training’s l2: 162.132 valid_1’s l2: 263.63 [48] training’s l2: 161.57 valid_1’s l2: 263 [49] training’s l2: 161.015 valid_1’s l2: 262.377 [50] training’s l2: 160.465 valid_1’s l2: 261.76 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 160.465 valid_1’s l2: 261.76 [1] training’s l2: 197.617 valid_1’s l2: 264.133 Training until validation scores don’t improve for 20 rounds [2] training’s l2: 196.824 valid_1’s l2: 263.054 [3] training’s l2: 195.994 valid_1’s l2: 261.969 [4] training’s l2: 195.17 valid_1’s l2: 260.922 [5] training’s l2: 194.398 valid_1’s l2: 259.873 [6] training’s l2: 193.557 valid_1’s l2: 258.938 [7] training’s l2: 192.729 valid_1’s l2: 258.016 [8] training’s l2: 191.893 valid_1’s l2: 257.089 [9] training’s l2: 191.077 valid_1’s l2: 256.181 [10] training’s l2: 190.275 valid_1’s l2: 255.28 [11] training’s l2: 189.483 valid_1’s l2: 254.394 [12] training’s l2: 188.709 valid_1’s l2: 253.386 [13] training’s l2: 187.944 valid_1’s l2: 252.449 [14] training’s l2: 187.186 valid_1’s l2: 251.521 [15] training’s l2: 186.438 valid_1’s l2: 250.607 [16] training’s l2: 185.669 valid_1’s l2: 249.589 [17] training’s l2: 184.944 valid_1’s l2: 248.652 [18] training’s l2: 184.189 valid_1’s l2: 247.654 [19] training’s l2: 183.445 valid_1’s l2: 246.74 [20] training’s l2: 182.738 valid_1’s l2: 245.806 [21] training’s l2: 182.019 valid_1’s l2: 244.928 [22] training’s l2: 181.315 valid_1’s l2: 244.104 [23] training’s l2: 180.607 valid_1’s l2: 243.236 [24] training’s l2: 179.942 valid_1’s l2: 242.465 [25] training’s l2: 179.246 valid_1’s l2: 241.611 [26] training’s l2: 178.542 valid_1’s l2: 240.708 [27] training’s l2: 177.852 valid_1’s l2: 239.788 [28] training’s l2: 177.179 valid_1’s l2: 238.962 [29] training’s l2: 176.494 valid_1’s l2: 238.083 [30] training’s l2: 175.807 valid_1’s l2: 237.213 [31] training’s l2: 175.186 valid_1’s l2: 236.452 [32] training’s l2: 174.513 valid_1’s l2: 235.719 [33] training’s l2: 173.847 valid_1’s l2: 234.993 [34] training’s l2: 173.19 valid_1’s l2: 234.317 [35] training’s l2: 172.54 valid_1’s l2: 233.604 [36] training’s l2: 171.866 valid_1’s l2: 232.853 [37] training’s l2: 171.199 valid_1’s l2: 232.109 [38] training’s l2: 170.58 
valid_1’s l2: 231.423 [39] training’s l2: 169.954 valid_1’s l2: 230.648 [40] training’s l2: 169.334 valid_1’s l2: 229.868 [41] training’s l2: 168.705 valid_1’s l2: 229.042 [42] training’s l2: 168.014 valid_1’s l2: 228.095 [43] training’s l2: 167.396 valid_1’s l2: 227.277 [44] training’s l2: 166.786 valid_1’s l2: 226.471 [45] training’s l2: 166.114 valid_1’s l2: 225.523 [46] training’s l2: 165.535 valid_1’s l2: 224.763 [47] training’s l2: 164.952 valid_1’s l2: 224.065 [48] training’s l2: 164.384 valid_1’s l2: 223.319 [49] training’s l2: 163.822 valid_1’s l2: 222.569 [50] training’s l2: 163.272 valid_1’s l2: 221.882 Did not meet early stopping. Best iteration is: [50] training’s l2: 163.272 valid_1’s l2: 221.882 [1] training’s l2: 204.481 valid_1’s l2: 198.34 Training until validation scores don’t improve for 20 rounds [2] training’s l2: 203.591 valid_1’s l2: 198.34 [3] training’s l2: 202.701 valid_1’s l2: 197.909 [4] training’s l2: 201.82 valid_1’s l2: 197.496 [5] training’s l2: 200.952 valid_1’s l2: 197.468 [6] training’s l2: 200.046 valid_1’s l2: 197.114 [7] training’s l2: 199.158 valid_1’s l2: 196.54 [8] training’s l2: 198.268 valid_1’s l2: 195.978 [9] training’s l2: 197.383 valid_1’s l2: 195.472 [10] training’s l2: 196.502 valid_1’s l2: 195.041 [11] training’s l2: 195.679 valid_1’s l2: 194.322 [12] training’s l2: 194.866 valid_1’s l2: 193.87 [13] training’s l2: 194.063 valid_1’s l2: 193.426 [14] training’s l2: 193.267 valid_1’s l2: 192.988 [15] training’s l2: 192.487 valid_1’s l2: 192.53 [16] training’s l2: 191.67 valid_1’s l2: 191.981 [17] training’s l2: 190.863 valid_1’s l2: 191.449 [18] training’s l2: 190.069 valid_1’s l2: 190.779 [19] training’s l2: 189.281 valid_1’s l2: 190.122 [20] training’s l2: 188.492 valid_1’s l2: 189.574 [21] training’s l2: 187.719 valid_1’s l2: 188.927 [22] training’s l2: 186.948 valid_1’s l2: 188.292 [23] training’s l2: 186.188 valid_1’s l2: 187.655 [24] training’s l2: 185.453 valid_1’s l2: 187.647 [25] training’s l2: 184.704 
valid_1’s l2: 187.035 [26] training’s l2: 183.935 valid_1’s l2: 186.467 [27] training’s l2: 183.196 valid_1’s l2: 186.299 [28] training’s l2: 182.491 valid_1’s l2: 185.653 [29] training’s l2: 181.732 valid_1’s l2: 185.238 [30] training’s l2: 180.986 valid_1’s l2: 184.709 [31] training’s l2: 180.298 valid_1’s l2: 184.74 [32] training’s l2: 179.584 valid_1’s l2: 184.428 [33] training’s l2: 178.857 valid_1’s l2: 184.367 [34] training’s l2: 178.167 valid_1’s l2: 184.027 [35] training’s l2: 177.456 valid_1’s l2: 183.917 [36] training’s l2: 176.757 valid_1’s l2: 184.187 [37] training’s l2: 176.064 valid_1’s l2: 184.459 [38] training’s l2: 175.371 valid_1’s l2: 183.918 [39] training’s l2: 174.69 valid_1’s l2: 184.192 [40] training’s l2: 174.016 valid_1’s l2: 184.395 [41] training’s l2: 173.353 valid_1’s l2: 184.1 [42] training’s l2: 172.685 valid_1’s l2: 183.764 [43] training’s l2: 172.04 valid_1’s l2: 183.219 [44] training’s l2: 171.399 valid_1’s l2: 182.935 [45] training’s l2: 170.74 valid_1’s l2: 182.599 [46] training’s l2: 170.106 valid_1’s l2: 182.392 [47] training’s l2: 169.455 valid_1’s l2: 181.879 [48] training’s l2: 168.833 valid_1’s l2: 181.35 [49] training’s l2: 168.207 valid_1’s l2: 180.81 [50] training’s l2: 167.598 valid_1’s l2: 180.307 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 167.598 valid_1’s l2: 180.307 [1] training’s l2: 193.21 valid_1’s l2: 285.277 Training until validation scores don’t improve for 20 rounds [2] training’s l2: 192.378 valid_1’s l2: 284.126 [3] training’s l2: 191.579 valid_1’s l2: 283.015 [4] training’s l2: 190.788 valid_1’s l2: 281.955 [5] training’s l2: 189.976 valid_1’s l2: 280.831 [6] training’s l2: 189.197 valid_1’s l2: 279.758 [7] training’s l2: 188.431 valid_1’s l2: 278.668 [8] training’s l2: 187.666 valid_1’s l2: 277.612 [9] training’s l2: 186.91 valid_1’s l2: 276.566 [10] training’s l2: 186.161 valid_1’s l2: 275.529 [11] training’s l2: 185.366 valid_1’s l2: 274.421 [12] training’s l2: 184.559 valid_1’s l2: 273.235 [13] training’s l2: 183.76 valid_1’s l2: 272.053 [14] training’s l2: 182.966 valid_1’s l2: 270.865 [15] training’s l2: 182.182 valid_1’s l2: 269.727 [16] training’s l2: 181.442 valid_1’s l2: 268.621 [17] training’s l2: 180.71 valid_1’s l2: 267.525 [18] training’s l2: 179.986 valid_1’s l2: 266.441 [19] training’s l2: 179.239 valid_1’s l2: 265.355 [20] training’s l2: 178.528 valid_1’s l2: 264.289 [21] training’s l2: 177.848 valid_1’s l2: 263.182 [22] training’s l2: 177.15 valid_1’s l2: 262.205 [23] training’s l2: 176.46 valid_1’s l2: 261.239 [24] training’s l2: 175.79 valid_1’s l2: 260.292 [25] training’s l2: 175.11 valid_1’s l2: 259.197 [26] training’s l2: 174.416 valid_1’s l2: 258.177 [27] training’s l2: 173.755 valid_1’s l2: 257.174 [28] training’s l2: 173.092 valid_1’s l2: 256.191 [29] training’s l2: 172.413 valid_1’s l2: 255.198 [30] training’s l2: 171.742 valid_1’s l2: 254.214 [31] training’s l2: 171.08 valid_1’s l2: 253.284 [32] training’s l2: 170.46 valid_1’s l2: 252.373 [33] training’s l2: 169.845 valid_1’s l2: 251.479 [34] training’s l2: 169.238 valid_1’s l2: 250.585 [35] training’s l2: 168.636 valid_1’s l2: 249.696 [36] training’s l2: 168.029 valid_1’s l2: 248.878 [37] training’s l2: 167.429 valid_1’s l2: 248.067 [38] training’s l2: 166.835 valid_1’s 
(LightGBM training logs condensed.) The sweep over `early_stopping_rounds` ∈ {20, 25, 30, 35} produced identical results for every setting: in each fold, training ran for the full 50 boosting rounds, reported "Did not meet early stopping", and selected iteration [50] as the best. Because `num_boost_round = 50` and the validation l2 was still decreasing at round 50, early stopping never triggered, so the patience value had no effect on this run. Final valid_1 l2 per fold: 237.628, 263.099, 261.76, 221.882, 180.307.
259.197 [26] training’s l2: 174.416 valid_1’s l2: 258.177 [27] training’s l2: 173.755 valid_1’s l2: 257.174 [28] training’s l2: 173.092 valid_1’s l2: 256.191 [29] training’s l2: 172.413 valid_1’s l2: 255.198 [30] training’s l2: 171.742 valid_1’s l2: 254.214 [31] training’s l2: 171.08 valid_1’s l2: 253.284 [32] training’s l2: 170.46 valid_1’s l2: 252.373 [33] training’s l2: 169.845 valid_1’s l2: 251.479 [34] training’s l2: 169.238 valid_1’s l2: 250.585 [35] training’s l2: 168.636 valid_1’s l2: 249.696 [36] training’s l2: 168.029 valid_1’s l2: 248.878 [37] training’s l2: 167.429 valid_1’s l2: 248.067 [38] training’s l2: 166.835 valid_1’s l2: 247.202 [39] training’s l2: 166.246 valid_1’s l2: 246.514 [40] training’s l2: 165.656 valid_1’s l2: 245.832 [41] training’s l2: 165.022 valid_1’s l2: 244.976 [42] training’s l2: 164.396 valid_1’s l2: 244.098 [43] training’s l2: 163.775 valid_1’s l2: 243.258 [44] training’s l2: 163.216 valid_1’s l2: 242.419 [45] training’s l2: 162.651 valid_1’s l2: 241.665 [46] training’s l2: 162.062 valid_1’s l2: 240.807 [47] training’s l2: 161.485 valid_1’s l2: 240.036 [48] training’s l2: 160.903 valid_1’s l2: 239.226 [49] training’s l2: 160.328 valid_1’s l2: 238.423 [50] training’s l2: 159.758 valid_1’s l2: 237.628 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 159.758 valid_1’s l2: 237.628 [1] training’s l2: 194.307 valid_1’s l2: 279.622 Training until validation scores don’t improve for 35 rounds [2] training’s l2: 193.445 valid_1’s l2: 279.663 [3] training’s l2: 192.594 valid_1’s l2: 278.589 [4] training’s l2: 191.711 valid_1’s l2: 278.55 [5] training’s l2: 190.871 valid_1’s l2: 278.603 [6] training’s l2: 190.025 valid_1’s l2: 279.488 [7] training’s l2: 189.234 valid_1’s l2: 278.579 [8] training’s l2: 188.455 valid_1’s l2: 277.645 [9] training’s l2: 187.63 valid_1’s l2: 278.526 [10] training’s l2: 186.815 valid_1’s l2: 279.383 [11] training’s l2: 185.988 valid_1’s l2: 278.639 [12] training’s l2: 185.183 valid_1’s l2: 277.974 [13] training’s l2: 184.384 valid_1’s l2: 277.316 [14] training’s l2: 183.593 valid_1’s l2: 276.683 [15] training’s l2: 182.86 valid_1’s l2: 275.718 [16] training’s l2: 182.069 valid_1’s l2: 274.74 [17] training’s l2: 181.286 valid_1’s l2: 273.772 [18] training’s l2: 180.509 valid_1’s l2: 272.839 [19] training’s l2: 179.751 valid_1’s l2: 271.816 [20] training’s l2: 178.988 valid_1’s l2: 270.9 [21] training’s l2: 178.213 valid_1’s l2: 270.528 [22] training’s l2: 177.488 valid_1’s l2: 269.642 [23] training’s l2: 176.725 valid_1’s l2: 269.265 [24] training’s l2: 176.006 valid_1’s l2: 270.131 [25] training’s l2: 175.259 valid_1’s l2: 269.781 [26] training’s l2: 174.525 valid_1’s l2: 268.812 [27] training’s l2: 173.814 valid_1’s l2: 267.91 [28] training’s l2: 173.088 valid_1’s l2: 266.877 [29] training’s l2: 172.34 valid_1’s l2: 266.481 [30] training’s l2: 171.599 valid_1’s l2: 266.01 [31] training’s l2: 170.933 valid_1’s l2: 266.011 [32] training’s l2: 170.226 valid_1’s l2: 265.964 [33] training’s l2: 169.528 valid_1’s l2: 266.016 [34] training’s l2: 168.841 valid_1’s l2: 266.016 [35] training’s l2: 168.155 valid_1’s l2: 265.984 [36] training’s l2: 167.483 valid_1’s l2: 265.37 [37] training’s l2: 166.824 valid_1’s l2: 264.803 [38] training’s l2: 166.156 valid_1’s 
l2: 264.272 [39] training’s l2: 165.508 valid_1’s l2: 263.719 [40] training’s l2: 164.864 valid_1’s l2: 263.463 [41] training’s l2: 164.229 valid_1’s l2: 263.367 [42] training’s l2: 163.601 valid_1’s l2: 264.253 [43] training’s l2: 163.013 valid_1’s l2: 263.47 [44] training’s l2: 162.394 valid_1’s l2: 263.122 [45] training’s l2: 161.783 valid_1’s l2: 264.005 [46] training’s l2: 161.181 valid_1’s l2: 263.596 [47] training’s l2: 160.55 valid_1’s l2: 264.048 [48] training’s l2: 159.959 valid_1’s l2: 263.621 [49] training’s l2: 159.385 valid_1’s l2: 263.515 [50] training’s l2: 158.805 valid_1’s l2: 263.099 Did not meet early stopping. Best iteration is: [50] training’s l2: 158.805 valid_1’s l2: 263.099 [1] training’s l2: 193.625 valid_1’s l2: 283.865 Training until validation scores don’t improve for 40 rounds [2] training’s l2: 192.834 valid_1’s l2: 283.578 [3] training’s l2: 192.01 valid_1’s l2: 282.389 [4] training’s l2: 191.158 valid_1’s l2: 281.972 [5] training’s l2: 190.387 valid_1’s l2: 281.703 [6] training’s l2: 189.58 valid_1’s l2: 280.991 [7] training’s l2: 188.82 valid_1’s l2: 279.919 [8] training’s l2: 188.067 valid_1’s l2: 278.835 [9] training’s l2: 187.282 valid_1’s l2: 278.151 [10] training’s l2: 186.505 valid_1’s l2: 277.474 [11] training’s l2: 185.732 valid_1’s l2: 276.417 [12] training’s l2: 184.986 valid_1’s l2: 277.021 [13] training’s l2: 184.243 valid_1’s l2: 277.628 [14] training’s l2: 183.518 valid_1’s l2: 278.172 [15] training’s l2: 182.812 valid_1’s l2: 277.093 [16] training’s l2: 182.079 valid_1’s l2: 276.084 [17] training’s l2: 181.354 valid_1’s l2: 275.085 [18] training’s l2: 180.637 valid_1’s l2: 274.094 [19] training’s l2: 179.91 valid_1’s l2: 272.942 [20] training’s l2: 179.205 valid_1’s l2: 271.968 [21] training’s l2: 178.496 valid_1’s l2: 271.67 [22] training’s l2: 177.846 valid_1’s l2: 270.901 [23] training’s l2: 177.151 valid_1’s l2: 270.617 [24] training’s l2: 176.409 valid_1’s l2: 270.023 [25] training’s l2: 175.726 valid_1’s l2: 
269.75 [26] training’s l2: 175.091 valid_1’s l2: 268.806 [27] training’s l2: 174.48 valid_1’s l2: 267.878 [28] training’s l2: 173.811 valid_1’s l2: 266.845 [29] training’s l2: 173.17 valid_1’s l2: 266.611 [30] training’s l2: 172.537 valid_1’s l2: 266.383 [31] training’s l2: 171.922 valid_1’s l2: 267.006 [32] training’s l2: 171.261 valid_1’s l2: 266.787 [33] training’s l2: 170.607 valid_1’s l2: 266.574 [34] training’s l2: 169.96 valid_1’s l2: 266.367 [35] training’s l2: 169.333 valid_1’s l2: 266.913 [36] training’s l2: 168.705 valid_1’s l2: 266.423 [37] training’s l2: 168.084 valid_1’s l2: 265.939 [38] training’s l2: 167.445 valid_1’s l2: 266.445 [39] training’s l2: 166.834 valid_1’s l2: 265.969 [40] training’s l2: 166.232 valid_1’s l2: 265.483 [41] training’s l2: 165.627 valid_1’s l2: 265.23 [42] training’s l2: 165.031 valid_1’s l2: 265.713 [43] training’s l2: 164.476 valid_1’s l2: 264.817 [44] training’s l2: 163.888 valid_1’s l2: 264.576 [45] training’s l2: 163.308 valid_1’s l2: 265.057 [46] training’s l2: 162.736 valid_1’s l2: 264.428 [47] training’s l2: 162.132 valid_1’s l2: 263.63 [48] training’s l2: 161.57 valid_1’s l2: 263 [49] training’s l2: 161.015 valid_1’s l2: 262.377 [50] training’s l2: 160.465 valid_1’s l2: 261.76 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 160.465 valid_1’s l2: 261.76 [1] training’s l2: 197.617 valid_1’s l2: 264.133 Training until validation scores don’t improve for 40 rounds [2] training’s l2: 196.824 valid_1’s l2: 263.054 [3] training’s l2: 195.994 valid_1’s l2: 261.969 [4] training’s l2: 195.17 valid_1’s l2: 260.922 [5] training’s l2: 194.398 valid_1’s l2: 259.873 [6] training’s l2: 193.557 valid_1’s l2: 258.938 [7] training’s l2: 192.729 valid_1’s l2: 258.016 [8] training’s l2: 191.893 valid_1’s l2: 257.089 [9] training’s l2: 191.077 valid_1’s l2: 256.181 [10] training’s l2: 190.275 valid_1’s l2: 255.28 [11] training’s l2: 189.483 valid_1’s l2: 254.394 [12] training’s l2: 188.709 valid_1’s l2: 253.386 [13] training’s l2: 187.944 valid_1’s l2: 252.449 [14] training’s l2: 187.186 valid_1’s l2: 251.521 [15] training’s l2: 186.438 valid_1’s l2: 250.607 [16] training’s l2: 185.669 valid_1’s l2: 249.589 [17] training’s l2: 184.944 valid_1’s l2: 248.652 [18] training’s l2: 184.189 valid_1’s l2: 247.654 [19] training’s l2: 183.445 valid_1’s l2: 246.74 [20] training’s l2: 182.738 valid_1’s l2: 245.806 [21] training’s l2: 182.019 valid_1’s l2: 244.928 [22] training’s l2: 181.315 valid_1’s l2: 244.104 [23] training’s l2: 180.607 valid_1’s l2: 243.236 [24] training’s l2: 179.942 valid_1’s l2: 242.465 [25] training’s l2: 179.246 valid_1’s l2: 241.611 [26] training’s l2: 178.542 valid_1’s l2: 240.708 [27] training’s l2: 177.852 valid_1’s l2: 239.788 [28] training’s l2: 177.179 valid_1’s l2: 238.962 [29] training’s l2: 176.494 valid_1’s l2: 238.083 [30] training’s l2: 175.807 valid_1’s l2: 237.213 [31] training’s l2: 175.186 valid_1’s l2: 236.452 [32] training’s l2: 174.513 valid_1’s l2: 235.719 [33] training’s l2: 173.847 valid_1’s l2: 234.993 [34] training’s l2: 173.19 valid_1’s l2: 234.317 [35] training’s l2: 172.54 valid_1’s l2: 233.604 [36] training’s l2: 171.866 valid_1’s l2: 232.853 [37] training’s l2: 171.199 valid_1’s l2: 232.109 [38] training’s l2: 170.58 
valid_1’s l2: 231.423 [39] training’s l2: 169.954 valid_1’s l2: 230.648 [40] training’s l2: 169.334 valid_1’s l2: 229.868 [41] training’s l2: 168.705 valid_1’s l2: 229.042 [42] training’s l2: 168.014 valid_1’s l2: 228.095 [43] training’s l2: 167.396 valid_1’s l2: 227.277 [44] training’s l2: 166.786 valid_1’s l2: 226.471 [45] training’s l2: 166.114 valid_1’s l2: 225.523 [46] training’s l2: 165.535 valid_1’s l2: 224.763 [47] training’s l2: 164.952 valid_1’s l2: 224.065 [48] training’s l2: 164.384 valid_1’s l2: 223.319 [49] training’s l2: 163.822 valid_1’s l2: 222.569 [50] training’s l2: 163.272 valid_1’s l2: 221.882 Did not meet early stopping. Best iteration is: [50] training’s l2: 163.272 valid_1’s l2: 221.882 [1] training’s l2: 204.481 valid_1’s l2: 198.34 Training until validation scores don’t improve for 40 rounds [2] training’s l2: 203.591 valid_1’s l2: 198.34 [3] training’s l2: 202.701 valid_1’s l2: 197.909 [4] training’s l2: 201.82 valid_1’s l2: 197.496 [5] training’s l2: 200.952 valid_1’s l2: 197.468 [6] training’s l2: 200.046 valid_1’s l2: 197.114 [7] training’s l2: 199.158 valid_1’s l2: 196.54 [8] training’s l2: 198.268 valid_1’s l2: 195.978 [9] training’s l2: 197.383 valid_1’s l2: 195.472 [10] training’s l2: 196.502 valid_1’s l2: 195.041 [11] training’s l2: 195.679 valid_1’s l2: 194.322 [12] training’s l2: 194.866 valid_1’s l2: 193.87 [13] training’s l2: 194.063 valid_1’s l2: 193.426 [14] training’s l2: 193.267 valid_1’s l2: 192.988 [15] training’s l2: 192.487 valid_1’s l2: 192.53 [16] training’s l2: 191.67 valid_1’s l2: 191.981 [17] training’s l2: 190.863 valid_1’s l2: 191.449 [18] training’s l2: 190.069 valid_1’s l2: 190.779 [19] training’s l2: 189.281 valid_1’s l2: 190.122 [20] training’s l2: 188.492 valid_1’s l2: 189.574 [21] training’s l2: 187.719 valid_1’s l2: 188.927 [22] training’s l2: 186.948 valid_1’s l2: 188.292 [23] training’s l2: 186.188 valid_1’s l2: 187.655 [24] training’s l2: 185.453 valid_1’s l2: 187.647 [25] training’s l2: 184.704 
valid_1’s l2: 187.035 [26] training’s l2: 183.935 valid_1’s l2: 186.467 [27] training’s l2: 183.196 valid_1’s l2: 186.299 [28] training’s l2: 182.491 valid_1’s l2: 185.653 [29] training’s l2: 181.732 valid_1’s l2: 185.238 [30] training’s l2: 180.986 valid_1’s l2: 184.709 [31] training’s l2: 180.298 valid_1’s l2: 184.74 [32] training’s l2: 179.584 valid_1’s l2: 184.428 [33] training’s l2: 178.857 valid_1’s l2: 184.367 [34] training’s l2: 178.167 valid_1’s l2: 184.027 [35] training’s l2: 177.456 valid_1’s l2: 183.917 [36] training’s l2: 176.757 valid_1’s l2: 184.187 [37] training’s l2: 176.064 valid_1’s l2: 184.459 [38] training’s l2: 175.371 valid_1’s l2: 183.918 [39] training’s l2: 174.69 valid_1’s l2: 184.192 [40] training’s l2: 174.016 valid_1’s l2: 184.395 [41] training’s l2: 173.353 valid_1’s l2: 184.1 [42] training’s l2: 172.685 valid_1’s l2: 183.764 [43] training’s l2: 172.04 valid_1’s l2: 183.219 [44] training’s l2: 171.399 valid_1’s l2: 182.935 [45] training’s l2: 170.74 valid_1’s l2: 182.599 [46] training’s l2: 170.106 valid_1’s l2: 182.392 [47] training’s l2: 169.455 valid_1’s l2: 181.879 [48] training’s l2: 168.833 valid_1’s l2: 181.35 [49] training’s l2: 168.207 valid_1’s l2: 180.81 [50] training’s l2: 167.598 valid_1’s l2: 180.307 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 167.598 valid_1’s l2: 180.307 [1] training’s l2: 193.21 valid_1’s l2: 285.277 Training until validation scores don’t improve for 40 rounds [2] training’s l2: 192.378 valid_1’s l2: 284.126 [3] training’s l2: 191.579 valid_1’s l2: 283.015 [4] training’s l2: 190.788 valid_1’s l2: 281.955 [5] training’s l2: 189.976 valid_1’s l2: 280.831 [6] training’s l2: 189.197 valid_1’s l2: 279.758 [7] training’s l2: 188.431 valid_1’s l2: 278.668 [8] training’s l2: 187.666 valid_1’s l2: 277.612 [9] training’s l2: 186.91 valid_1’s l2: 276.566 [10] training’s l2: 186.161 valid_1’s l2: 275.529 [11] training’s l2: 185.366 valid_1’s l2: 274.421 [12] training’s l2: 184.559 valid_1’s l2: 273.235 [13] training’s l2: 183.76 valid_1’s l2: 272.053 [14] training’s l2: 182.966 valid_1’s l2: 270.865 [15] training’s l2: 182.182 valid_1’s l2: 269.727 [16] training’s l2: 181.442 valid_1’s l2: 268.621 [17] training’s l2: 180.71 valid_1’s l2: 267.525 [18] training’s l2: 179.986 valid_1’s l2: 266.441 [19] training’s l2: 179.239 valid_1’s l2: 265.355 [20] training’s l2: 178.528 valid_1’s l2: 264.289 [21] training’s l2: 177.848 valid_1’s l2: 263.182 [22] training’s l2: 177.15 valid_1’s l2: 262.205 [23] training’s l2: 176.46 valid_1’s l2: 261.239 [24] training’s l2: 175.79 valid_1’s l2: 260.292 [25] training’s l2: 175.11 valid_1’s l2: 259.197 [26] training’s l2: 174.416 valid_1’s l2: 258.177 [27] training’s l2: 173.755 valid_1’s l2: 257.174 [28] training’s l2: 173.092 valid_1’s l2: 256.191 [29] training’s l2: 172.413 valid_1’s l2: 255.198 [30] training’s l2: 171.742 valid_1’s l2: 254.214 [31] training’s l2: 171.08 valid_1’s l2: 253.284 [32] training’s l2: 170.46 valid_1’s l2: 252.373 [33] training’s l2: 169.845 valid_1’s l2: 251.479 [34] training’s l2: 169.238 valid_1’s l2: 250.585 [35] training’s l2: 168.636 valid_1’s l2: 249.696 [36] training’s l2: 168.029 valid_1’s l2: 248.878 [37] training’s l2: 167.429 valid_1’s l2: 248.067 [38] training’s l2: 166.835 valid_1’s 
l2: 247.202 [39] training’s l2: 166.246 valid_1’s l2: 246.514 [40] training’s l2: 165.656 valid_1’s l2: 245.832 [41] training’s l2: 165.022 valid_1’s l2: 244.976 [42] training’s l2: 164.396 valid_1’s l2: 244.098 [43] training’s l2: 163.775 valid_1’s l2: 243.258 [44] training’s l2: 163.216 valid_1’s l2: 242.419 [45] training’s l2: 162.651 valid_1’s l2: 241.665 [46] training’s l2: 162.062 valid_1’s l2: 240.807 [47] training’s l2: 161.485 valid_1’s l2: 240.036 [48] training’s l2: 160.903 valid_1’s l2: 239.226 [49] training’s l2: 160.328 valid_1’s l2: 238.423 [50] training’s l2: 159.758 valid_1’s l2: 237.628 Did not meet early stopping. Best iteration is: [50] training’s l2: 159.758 valid_1’s l2: 237.628 [1] training’s l2: 194.307 valid_1’s l2: 279.622 Training until validation scores don’t improve for 40 rounds [2] training’s l2: 193.445 valid_1’s l2: 279.663 [3] training’s l2: 192.594 valid_1’s l2: 278.589 [4] training’s l2: 191.711 valid_1’s l2: 278.55 [5] training’s l2: 190.871 valid_1’s l2: 278.603 [6] training’s l2: 190.025 valid_1’s l2: 279.488 [7] training’s l2: 189.234 valid_1’s l2: 278.579 [8] training’s l2: 188.455 valid_1’s l2: 277.645 [9] training’s l2: 187.63 valid_1’s l2: 278.526 [10] training’s l2: 186.815 valid_1’s l2: 279.383 [11] training’s l2: 185.988 valid_1’s l2: 278.639 [12] training’s l2: 185.183 valid_1’s l2: 277.974 [13] training’s l2: 184.384 valid_1’s l2: 277.316 [14] training’s l2: 183.593 valid_1’s l2: 276.683 [15] training’s l2: 182.86 valid_1’s l2: 275.718 [16] training’s l2: 182.069 valid_1’s l2: 274.74 [17] training’s l2: 181.286 valid_1’s l2: 273.772 [18] training’s l2: 180.509 valid_1’s l2: 272.839 [19] training’s l2: 179.751 valid_1’s l2: 271.816 [20] training’s l2: 178.988 valid_1’s l2: 270.9 [21] training’s l2: 178.213 valid_1’s l2: 270.528 [22] training’s l2: 177.488 valid_1’s l2: 269.642 [23] training’s l2: 176.725 valid_1’s l2: 269.265 [24] training’s l2: 176.006 valid_1’s l2: 270.131 [25] training’s l2: 175.259 valid_1’s l2: 
269.781 [26] training’s l2: 174.525 valid_1’s l2: 268.812 [27] training’s l2: 173.814 valid_1’s l2: 267.91 [28] training’s l2: 173.088 valid_1’s l2: 266.877 [29] training’s l2: 172.34 valid_1’s l2: 266.481 [30] training’s l2: 171.599 valid_1’s l2: 266.01 [31] training’s l2: 170.933 valid_1’s l2: 266.011 [32] training’s l2: 170.226 valid_1’s l2: 265.964 [33] training’s l2: 169.528 valid_1’s l2: 266.016 [34] training’s l2: 168.841 valid_1’s l2: 266.016 [35] training’s l2: 168.155 valid_1’s l2: 265.984 [36] training’s l2: 167.483 valid_1’s l2: 265.37 [37] training’s l2: 166.824 valid_1’s l2: 264.803 [38] training’s l2: 166.156 valid_1’s l2: 264.272 [39] training’s l2: 165.508 valid_1’s l2: 263.719 [40] training’s l2: 164.864 valid_1’s l2: 263.463 [41] training’s l2: 164.229 valid_1’s l2: 263.367 [42] training’s l2: 163.601 valid_1’s l2: 264.253 [43] training’s l2: 163.013 valid_1’s l2: 263.47 [44] training’s l2: 162.394 valid_1’s l2: 263.122 [45] training’s l2: 161.783 valid_1’s l2: 264.005 [46] training’s l2: 161.181 valid_1’s l2: 263.596 [47] training’s l2: 160.55 valid_1’s l2: 264.048 [48] training’s l2: 159.959 valid_1’s l2: 263.621 [49] training’s l2: 159.385 valid_1’s l2: 263.515 [50] training’s l2: 158.805 valid_1’s l2: 263.099 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 158.805 valid_1’s l2: 263.099 [1] training’s l2: 193.625 valid_1’s l2: 283.865 Training until validation scores don’t improve for 45 rounds [2] training’s l2: 192.834 valid_1’s l2: 283.578 [3] training’s l2: 192.01 valid_1’s l2: 282.389 [4] training’s l2: 191.158 valid_1’s l2: 281.972 [5] training’s l2: 190.387 valid_1’s l2: 281.703 [6] training’s l2: 189.58 valid_1’s l2: 280.991 [7] training’s l2: 188.82 valid_1’s l2: 279.919 [8] training’s l2: 188.067 valid_1’s l2: 278.835 [9] training’s l2: 187.282 valid_1’s l2: 278.151 [10] training’s l2: 186.505 valid_1’s l2: 277.474 [11] training’s l2: 185.732 valid_1’s l2: 276.417 [12] training’s l2: 184.986 valid_1’s l2: 277.021 [13] training’s l2: 184.243 valid_1’s l2: 277.628 [14] training’s l2: 183.518 valid_1’s l2: 278.172 [15] training’s l2: 182.812 valid_1’s l2: 277.093 [16] training’s l2: 182.079 valid_1’s l2: 276.084 [17] training’s l2: 181.354 valid_1’s l2: 275.085 [18] training’s l2: 180.637 valid_1’s l2: 274.094 [19] training’s l2: 179.91 valid_1’s l2: 272.942 [20] training’s l2: 179.205 valid_1’s l2: 271.968 [21] training’s l2: 178.496 valid_1’s l2: 271.67 [22] training’s l2: 177.846 valid_1’s l2: 270.901 [23] training’s l2: 177.151 valid_1’s l2: 270.617 [24] training’s l2: 176.409 valid_1’s l2: 270.023 [25] training’s l2: 175.726 valid_1’s l2: 269.75 [26] training’s l2: 175.091 valid_1’s l2: 268.806 [27] training’s l2: 174.48 valid_1’s l2: 267.878 [28] training’s l2: 173.811 valid_1’s l2: 266.845 [29] training’s l2: 173.17 valid_1’s l2: 266.611 [30] training’s l2: 172.537 valid_1’s l2: 266.383 [31] training’s l2: 171.922 valid_1’s l2: 267.006 [32] training’s l2: 171.261 valid_1’s l2: 266.787 [33] training’s l2: 170.607 valid_1’s l2: 266.574 [34] training’s l2: 169.96 valid_1’s l2: 266.367 [35] training’s l2: 169.333 valid_1’s l2: 266.913 [36] training’s l2: 168.705 valid_1’s l2: 266.423 [37] training’s l2: 168.084 valid_1’s l2: 265.939 [38] training’s l2: 167.445 valid_1’s 
l2: 266.445 [39] training’s l2: 166.834 valid_1’s l2: 265.969 [40] training’s l2: 166.232 valid_1’s l2: 265.483 [41] training’s l2: 165.627 valid_1’s l2: 265.23 [42] training’s l2: 165.031 valid_1’s l2: 265.713 [43] training’s l2: 164.476 valid_1’s l2: 264.817 [44] training’s l2: 163.888 valid_1’s l2: 264.576 [45] training’s l2: 163.308 valid_1’s l2: 265.057 [46] training’s l2: 162.736 valid_1’s l2: 264.428 [47] training’s l2: 162.132 valid_1’s l2: 263.63 [48] training’s l2: 161.57 valid_1’s l2: 263 [49] training’s l2: 161.015 valid_1’s l2: 262.377 [50] training’s l2: 160.465 valid_1’s l2: 261.76 Did not meet early stopping. Best iteration is: [50] training’s l2: 160.465 valid_1’s l2: 261.76 [1] training’s l2: 197.617 valid_1’s l2: 264.133 Training until validation scores don’t improve for 45 rounds [2] training’s l2: 196.824 valid_1’s l2: 263.054 [3] training’s l2: 195.994 valid_1’s l2: 261.969 [4] training’s l2: 195.17 valid_1’s l2: 260.922 [5] training’s l2: 194.398 valid_1’s l2: 259.873 [6] training’s l2: 193.557 valid_1’s l2: 258.938 [7] training’s l2: 192.729 valid_1’s l2: 258.016 [8] training’s l2: 191.893 valid_1’s l2: 257.089 [9] training’s l2: 191.077 valid_1’s l2: 256.181 [10] training’s l2: 190.275 valid_1’s l2: 255.28 [11] training’s l2: 189.483 valid_1’s l2: 254.394 [12] training’s l2: 188.709 valid_1’s l2: 253.386 [13] training’s l2: 187.944 valid_1’s l2: 252.449 [14] training’s l2: 187.186 valid_1’s l2: 251.521 [15] training’s l2: 186.438 valid_1’s l2: 250.607 [16] training’s l2: 185.669 valid_1’s l2: 249.589 [17] training’s l2: 184.944 valid_1’s l2: 248.652 [18] training’s l2: 184.189 valid_1’s l2: 247.654 [19] training’s l2: 183.445 valid_1’s l2: 246.74 [20] training’s l2: 182.738 valid_1’s l2: 245.806 [21] training’s l2: 182.019 valid_1’s l2: 244.928 [22] training’s l2: 181.315 valid_1’s l2: 244.104 [23] training’s l2: 180.607 valid_1’s l2: 243.236 [24] training’s l2: 179.942 valid_1’s l2: 242.465 [25] training’s l2: 179.246 valid_1’s l2: 241.611 
[26] training’s l2: 178.542 valid_1’s l2: 240.708 [27] training’s l2: 177.852 valid_1’s l2: 239.788 [28] training’s l2: 177.179 valid_1’s l2: 238.962 [29] training’s l2: 176.494 valid_1’s l2: 238.083 [30] training’s l2: 175.807 valid_1’s l2: 237.213 [31] training’s l2: 175.186 valid_1’s l2: 236.452 [32] training’s l2: 174.513 valid_1’s l2: 235.719 [33] training’s l2: 173.847 valid_1’s l2: 234.993 [34] training’s l2: 173.19 valid_1’s l2: 234.317 [35] training’s l2: 172.54 valid_1’s l2: 233.604 [36] training’s l2: 171.866 valid_1’s l2: 232.853 [37] training’s l2: 171.199 valid_1’s l2: 232.109 [38] training’s l2: 170.58 valid_1’s l2: 231.423 [39] training’s l2: 169.954 valid_1’s l2: 230.648 [40] training’s l2: 169.334 valid_1’s l2: 229.868 [41] training’s l2: 168.705 valid_1’s l2: 229.042 [42] training’s l2: 168.014 valid_1’s l2: 228.095 [43] training’s l2: 167.396 valid_1’s l2: 227.277 [44] training’s l2: 166.786 valid_1’s l2: 226.471 [45] training’s l2: 166.114 valid_1’s l2: 225.523 [46] training’s l2: 165.535 valid_1’s l2: 224.763 [47] training’s l2: 164.952 valid_1’s l2: 224.065 [48] training’s l2: 164.384 valid_1’s l2: 223.319 [49] training’s l2: 163.822 valid_1’s l2: 222.569 [50] training’s l2: 163.272 valid_1’s l2: 221.882 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 163.272 valid_1’s l2: 221.882 [1] training’s l2: 204.481 valid_1’s l2: 198.34 Training until validation scores don’t improve for 45 rounds [2] training’s l2: 203.591 valid_1’s l2: 198.34 [3] training’s l2: 202.701 valid_1’s l2: 197.909 [4] training’s l2: 201.82 valid_1’s l2: 197.496 [5] training’s l2: 200.952 valid_1’s l2: 197.468 [6] training’s l2: 200.046 valid_1’s l2: 197.114 [7] training’s l2: 199.158 valid_1’s l2: 196.54 [8] training’s l2: 198.268 valid_1’s l2: 195.978 [9] training’s l2: 197.383 valid_1’s l2: 195.472 [10] training’s l2: 196.502 valid_1’s l2: 195.041 [11] training’s l2: 195.679 valid_1’s l2: 194.322 [12] training’s l2: 194.866 valid_1’s l2: 193.87 [13] training’s l2: 194.063 valid_1’s l2: 193.426 [14] training’s l2: 193.267 valid_1’s l2: 192.988 [15] training’s l2: 192.487 valid_1’s l2: 192.53 [16] training’s l2: 191.67 valid_1’s l2: 191.981 [17] training’s l2: 190.863 valid_1’s l2: 191.449 [18] training’s l2: 190.069 valid_1’s l2: 190.779 [19] training’s l2: 189.281 valid_1’s l2: 190.122 [20] training’s l2: 188.492 valid_1’s l2: 189.574 [21] training’s l2: 187.719 valid_1’s l2: 188.927 [22] training’s l2: 186.948 valid_1’s l2: 188.292 [23] training’s l2: 186.188 valid_1’s l2: 187.655 [24] training’s l2: 185.453 valid_1’s l2: 187.647 [25] training’s l2: 184.704 valid_1’s l2: 187.035 [26] training’s l2: 183.935 valid_1’s l2: 186.467 [27] training’s l2: 183.196 valid_1’s l2: 186.299 [28] training’s l2: 182.491 valid_1’s l2: 185.653 [29] training’s l2: 181.732 valid_1’s l2: 185.238 [30] training’s l2: 180.986 valid_1’s l2: 184.709 [31] training’s l2: 180.298 valid_1’s l2: 184.74 [32] training’s l2: 179.584 valid_1’s l2: 184.428 [33] training’s l2: 178.857 valid_1’s l2: 184.367 [34] training’s l2: 178.167 valid_1’s l2: 184.027 [35] training’s l2: 177.456 valid_1’s l2: 183.917 [36] training’s l2: 176.757 valid_1’s l2: 184.187 [37] training’s l2: 176.064 valid_1’s l2: 184.459 [38] training’s l2: 175.371 
valid_1’s l2: 183.918 [39] training’s l2: 174.69 valid_1’s l2: 184.192 [40] training’s l2: 174.016 valid_1’s l2: 184.395 [41] training’s l2: 173.353 valid_1’s l2: 184.1 [42] training’s l2: 172.685 valid_1’s l2: 183.764 [43] training’s l2: 172.04 valid_1’s l2: 183.219 [44] training’s l2: 171.399 valid_1’s l2: 182.935 [45] training’s l2: 170.74 valid_1’s l2: 182.599 [46] training’s l2: 170.106 valid_1’s l2: 182.392 [47] training’s l2: 169.455 valid_1’s l2: 181.879 [48] training’s l2: 168.833 valid_1’s l2: 181.35 [49] training’s l2: 168.207 valid_1’s l2: 180.81 [50] training’s l2: 167.598 valid_1’s l2: 180.307 Did not meet early stopping. Best iteration is: [50] training’s l2: 167.598 valid_1’s l2: 180.307 [1] training’s l2: 193.21 valid_1’s l2: 285.277 Training until validation scores don’t improve for 45 rounds [2] training’s l2: 192.378 valid_1’s l2: 284.126 [3] training’s l2: 191.579 valid_1’s l2: 283.015 [4] training’s l2: 190.788 valid_1’s l2: 281.955 [5] training’s l2: 189.976 valid_1’s l2: 280.831 [6] training’s l2: 189.197 valid_1’s l2: 279.758 [7] training’s l2: 188.431 valid_1’s l2: 278.668 [8] training’s l2: 187.666 valid_1’s l2: 277.612 [9] training’s l2: 186.91 valid_1’s l2: 276.566 [10] training’s l2: 186.161 valid_1’s l2: 275.529 [11] training’s l2: 185.366 valid_1’s l2: 274.421 [12] training’s l2: 184.559 valid_1’s l2: 273.235 [13] training’s l2: 183.76 valid_1’s l2: 272.053 [14] training’s l2: 182.966 valid_1’s l2: 270.865 [15] training’s l2: 182.182 valid_1’s l2: 269.727 [16] training’s l2: 181.442 valid_1’s l2: 268.621 [17] training’s l2: 180.71 valid_1’s l2: 267.525 [18] training’s l2: 179.986 valid_1’s l2: 266.441 [19] training’s l2: 179.239 valid_1’s l2: 265.355 [20] training’s l2: 178.528 valid_1’s l2: 264.289 [21] training’s l2: 177.848 valid_1’s l2: 263.182 [22] training’s l2: 177.15 valid_1’s l2: 262.205 [23] training’s l2: 176.46 valid_1’s l2: 261.239 [24] training’s l2: 175.79 valid_1’s l2: 260.292 [25] training’s l2: 175.11 valid_1’s l2: 
259.197 [26] training’s l2: 174.416 valid_1’s l2: 258.177 [27] training’s l2: 173.755 valid_1’s l2: 257.174 [28] training’s l2: 173.092 valid_1’s l2: 256.191 [29] training’s l2: 172.413 valid_1’s l2: 255.198 [30] training’s l2: 171.742 valid_1’s l2: 254.214 [31] training’s l2: 171.08 valid_1’s l2: 253.284 [32] training’s l2: 170.46 valid_1’s l2: 252.373 [33] training’s l2: 169.845 valid_1’s l2: 251.479 [34] training’s l2: 169.238 valid_1’s l2: 250.585 [35] training’s l2: 168.636 valid_1’s l2: 249.696 [36] training’s l2: 168.029 valid_1’s l2: 248.878 [37] training’s l2: 167.429 valid_1’s l2: 248.067 [38] training’s l2: 166.835 valid_1’s l2: 247.202 [39] training’s l2: 166.246 valid_1’s l2: 246.514 [40] training’s l2: 165.656 valid_1’s l2: 245.832 [41] training’s l2: 165.022 valid_1’s l2: 244.976 [42] training’s l2: 164.396 valid_1’s l2: 244.098 [43] training’s l2: 163.775 valid_1’s l2: 243.258 [44] training’s l2: 163.216 valid_1’s l2: 242.419 [45] training’s l2: 162.651 valid_1’s l2: 241.665 [46] training’s l2: 162.062 valid_1’s l2: 240.807 [47] training’s l2: 161.485 valid_1’s l2: 240.036 [48] training’s l2: 160.903 valid_1’s l2: 239.226 [49] training’s l2: 160.328 valid_1’s l2: 238.423 [50] training’s l2: 159.758 valid_1’s l2: 237.628 Did not meet early stopping. 
(LightGBM cross-validation log, condensed from the verbatim per-round output. Every fold trained for the full 50 boosting rounds and logged "Did not meet early stopping", so the sweep over `early_stopping_rounds` = 45 / 50 / 55 / 60 produced identical fold results; the best iteration was [50] in every case.)

| fold | training's l2 @ [50] | valid_1's l2 @ [50] |
|------|----------------------|---------------------|
| 1    | 159.758              | 237.628             |
| 2    | 158.805              | 263.099             |
| 3    | 160.465              | 261.760             |
| 4    | 163.272              | 221.882             |
| 5    | 167.598              | 180.307             |
l2: 247.202 [39] training’s l2: 166.246 valid_1’s l2: 246.514 [40] training’s l2: 165.656 valid_1’s l2: 245.832 [41] training’s l2: 165.022 valid_1’s l2: 244.976 [42] training’s l2: 164.396 valid_1’s l2: 244.098 [43] training’s l2: 163.775 valid_1’s l2: 243.258 [44] training’s l2: 163.216 valid_1’s l2: 242.419 [45] training’s l2: 162.651 valid_1’s l2: 241.665 [46] training’s l2: 162.062 valid_1’s l2: 240.807 [47] training’s l2: 161.485 valid_1’s l2: 240.036 [48] training’s l2: 160.903 valid_1’s l2: 239.226 [49] training’s l2: 160.328 valid_1’s l2: 238.423 [50] training’s l2: 159.758 valid_1’s l2: 237.628 Did not meet early stopping. Best iteration is: [50] training’s l2: 159.758 valid_1’s l2: 237.628 [1] training’s l2: 194.307 valid_1’s l2: 279.622 Training until validation scores don’t improve for 60 rounds [2] training’s l2: 193.445 valid_1’s l2: 279.663 [3] training’s l2: 192.594 valid_1’s l2: 278.589 [4] training’s l2: 191.711 valid_1’s l2: 278.55 [5] training’s l2: 190.871 valid_1’s l2: 278.603 [6] training’s l2: 190.025 valid_1’s l2: 279.488 [7] training’s l2: 189.234 valid_1’s l2: 278.579 [8] training’s l2: 188.455 valid_1’s l2: 277.645 [9] training’s l2: 187.63 valid_1’s l2: 278.526 [10] training’s l2: 186.815 valid_1’s l2: 279.383 [11] training’s l2: 185.988 valid_1’s l2: 278.639 [12] training’s l2: 185.183 valid_1’s l2: 277.974 [13] training’s l2: 184.384 valid_1’s l2: 277.316 [14] training’s l2: 183.593 valid_1’s l2: 276.683 [15] training’s l2: 182.86 valid_1’s l2: 275.718 [16] training’s l2: 182.069 valid_1’s l2: 274.74 [17] training’s l2: 181.286 valid_1’s l2: 273.772 [18] training’s l2: 180.509 valid_1’s l2: 272.839 [19] training’s l2: 179.751 valid_1’s l2: 271.816 [20] training’s l2: 178.988 valid_1’s l2: 270.9 [21] training’s l2: 178.213 valid_1’s l2: 270.528 [22] training’s l2: 177.488 valid_1’s l2: 269.642 [23] training’s l2: 176.725 valid_1’s l2: 269.265 [24] training’s l2: 176.006 valid_1’s l2: 270.131 [25] training’s l2: 175.259 valid_1’s l2: 
269.781 [26] training’s l2: 174.525 valid_1’s l2: 268.812 [27] training’s l2: 173.814 valid_1’s l2: 267.91 [28] training’s l2: 173.088 valid_1’s l2: 266.877 [29] training’s l2: 172.34 valid_1’s l2: 266.481 [30] training’s l2: 171.599 valid_1’s l2: 266.01 [31] training’s l2: 170.933 valid_1’s l2: 266.011 [32] training’s l2: 170.226 valid_1’s l2: 265.964 [33] training’s l2: 169.528 valid_1’s l2: 266.016 [34] training’s l2: 168.841 valid_1’s l2: 266.016 [35] training’s l2: 168.155 valid_1’s l2: 265.984 [36] training’s l2: 167.483 valid_1’s l2: 265.37 [37] training’s l2: 166.824 valid_1’s l2: 264.803 [38] training’s l2: 166.156 valid_1’s l2: 264.272 [39] training’s l2: 165.508 valid_1’s l2: 263.719 [40] training’s l2: 164.864 valid_1’s l2: 263.463 [41] training’s l2: 164.229 valid_1’s l2: 263.367 [42] training’s l2: 163.601 valid_1’s l2: 264.253 [43] training’s l2: 163.013 valid_1’s l2: 263.47 [44] training’s l2: 162.394 valid_1’s l2: 263.122 [45] training’s l2: 161.783 valid_1’s l2: 264.005 [46] training’s l2: 161.181 valid_1’s l2: 263.596 [47] training’s l2: 160.55 valid_1’s l2: 264.048 [48] training’s l2: 159.959 valid_1’s l2: 263.621 [49] training’s l2: 159.385 valid_1’s l2: 263.515 [50] training’s l2: 158.805 valid_1’s l2: 263.099 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 158.805 valid_1’s l2: 263.099 [1] training’s l2: 193.625 valid_1’s l2: 283.865 Training until validation scores don’t improve for 65 rounds [2] training’s l2: 192.834 valid_1’s l2: 283.578 [3] training’s l2: 192.01 valid_1’s l2: 282.389 [4] training’s l2: 191.158 valid_1’s l2: 281.972 [5] training’s l2: 190.387 valid_1’s l2: 281.703 [6] training’s l2: 189.58 valid_1’s l2: 280.991 [7] training’s l2: 188.82 valid_1’s l2: 279.919 [8] training’s l2: 188.067 valid_1’s l2: 278.835 [9] training’s l2: 187.282 valid_1’s l2: 278.151 [10] training’s l2: 186.505 valid_1’s l2: 277.474 [11] training’s l2: 185.732 valid_1’s l2: 276.417 [12] training’s l2: 184.986 valid_1’s l2: 277.021 [13] training’s l2: 184.243 valid_1’s l2: 277.628 [14] training’s l2: 183.518 valid_1’s l2: 278.172 [15] training’s l2: 182.812 valid_1’s l2: 277.093 [16] training’s l2: 182.079 valid_1’s l2: 276.084 [17] training’s l2: 181.354 valid_1’s l2: 275.085 [18] training’s l2: 180.637 valid_1’s l2: 274.094 [19] training’s l2: 179.91 valid_1’s l2: 272.942 [20] training’s l2: 179.205 valid_1’s l2: 271.968 [21] training’s l2: 178.496 valid_1’s l2: 271.67 [22] training’s l2: 177.846 valid_1’s l2: 270.901 [23] training’s l2: 177.151 valid_1’s l2: 270.617 [24] training’s l2: 176.409 valid_1’s l2: 270.023 [25] training’s l2: 175.726 valid_1’s l2: 269.75 [26] training’s l2: 175.091 valid_1’s l2: 268.806 [27] training’s l2: 174.48 valid_1’s l2: 267.878 [28] training’s l2: 173.811 valid_1’s l2: 266.845 [29] training’s l2: 173.17 valid_1’s l2: 266.611 [30] training’s l2: 172.537 valid_1’s l2: 266.383 [31] training’s l2: 171.922 valid_1’s l2: 267.006 [32] training’s l2: 171.261 valid_1’s l2: 266.787 [33] training’s l2: 170.607 valid_1’s l2: 266.574 [34] training’s l2: 169.96 valid_1’s l2: 266.367 [35] training’s l2: 169.333 valid_1’s l2: 266.913 [36] training’s l2: 168.705 valid_1’s l2: 266.423 [37] training’s l2: 168.084 valid_1’s l2: 265.939 [38] training’s l2: 167.445 valid_1’s 
l2: 266.445 [39] training’s l2: 166.834 valid_1’s l2: 265.969 [40] training’s l2: 166.232 valid_1’s l2: 265.483 [41] training’s l2: 165.627 valid_1’s l2: 265.23 [42] training’s l2: 165.031 valid_1’s l2: 265.713 [43] training’s l2: 164.476 valid_1’s l2: 264.817 [44] training’s l2: 163.888 valid_1’s l2: 264.576 [45] training’s l2: 163.308 valid_1’s l2: 265.057 [46] training’s l2: 162.736 valid_1’s l2: 264.428 [47] training’s l2: 162.132 valid_1’s l2: 263.63 [48] training’s l2: 161.57 valid_1’s l2: 263 [49] training’s l2: 161.015 valid_1’s l2: 262.377 [50] training’s l2: 160.465 valid_1’s l2: 261.76 Did not meet early stopping. Best iteration is: [50] training’s l2: 160.465 valid_1’s l2: 261.76 [1] training’s l2: 197.617 valid_1’s l2: 264.133 Training until validation scores don’t improve for 65 rounds [2] training’s l2: 196.824 valid_1’s l2: 263.054 [3] training’s l2: 195.994 valid_1’s l2: 261.969 [4] training’s l2: 195.17 valid_1’s l2: 260.922 [5] training’s l2: 194.398 valid_1’s l2: 259.873 [6] training’s l2: 193.557 valid_1’s l2: 258.938 [7] training’s l2: 192.729 valid_1’s l2: 258.016 [8] training’s l2: 191.893 valid_1’s l2: 257.089 [9] training’s l2: 191.077 valid_1’s l2: 256.181 [10] training’s l2: 190.275 valid_1’s l2: 255.28 [11] training’s l2: 189.483 valid_1’s l2: 254.394 [12] training’s l2: 188.709 valid_1’s l2: 253.386 [13] training’s l2: 187.944 valid_1’s l2: 252.449 [14] training’s l2: 187.186 valid_1’s l2: 251.521 [15] training’s l2: 186.438 valid_1’s l2: 250.607 [16] training’s l2: 185.669 valid_1’s l2: 249.589 [17] training’s l2: 184.944 valid_1’s l2: 248.652 [18] training’s l2: 184.189 valid_1’s l2: 247.654 [19] training’s l2: 183.445 valid_1’s l2: 246.74 [20] training’s l2: 182.738 valid_1’s l2: 245.806 [21] training’s l2: 182.019 valid_1’s l2: 244.928 [22] training’s l2: 181.315 valid_1’s l2: 244.104 [23] training’s l2: 180.607 valid_1’s l2: 243.236 [24] training’s l2: 179.942 valid_1’s l2: 242.465 [25] training’s l2: 179.246 valid_1’s l2: 241.611 
[26] training’s l2: 178.542 valid_1’s l2: 240.708 [27] training’s l2: 177.852 valid_1’s l2: 239.788 [28] training’s l2: 177.179 valid_1’s l2: 238.962 [29] training’s l2: 176.494 valid_1’s l2: 238.083 [30] training’s l2: 175.807 valid_1’s l2: 237.213 [31] training’s l2: 175.186 valid_1’s l2: 236.452 [32] training’s l2: 174.513 valid_1’s l2: 235.719 [33] training’s l2: 173.847 valid_1’s l2: 234.993 [34] training’s l2: 173.19 valid_1’s l2: 234.317 [35] training’s l2: 172.54 valid_1’s l2: 233.604 [36] training’s l2: 171.866 valid_1’s l2: 232.853 [37] training’s l2: 171.199 valid_1’s l2: 232.109 [38] training’s l2: 170.58 valid_1’s l2: 231.423 [39] training’s l2: 169.954 valid_1’s l2: 230.648 [40] training’s l2: 169.334 valid_1’s l2: 229.868 [41] training’s l2: 168.705 valid_1’s l2: 229.042 [42] training’s l2: 168.014 valid_1’s l2: 228.095 [43] training’s l2: 167.396 valid_1’s l2: 227.277 [44] training’s l2: 166.786 valid_1’s l2: 226.471 [45] training’s l2: 166.114 valid_1’s l2: 225.523 [46] training’s l2: 165.535 valid_1’s l2: 224.763 [47] training’s l2: 164.952 valid_1’s l2: 224.065 [48] training’s l2: 164.384 valid_1’s l2: 223.319 [49] training’s l2: 163.822 valid_1’s l2: 222.569 [50] training’s l2: 163.272 valid_1’s l2: 221.882 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 163.272 valid_1’s l2: 221.882 [1] training’s l2: 204.481 valid_1’s l2: 198.34 Training until validation scores don’t improve for 65 rounds [2] training’s l2: 203.591 valid_1’s l2: 198.34 [3] training’s l2: 202.701 valid_1’s l2: 197.909 [4] training’s l2: 201.82 valid_1’s l2: 197.496 [5] training’s l2: 200.952 valid_1’s l2: 197.468 [6] training’s l2: 200.046 valid_1’s l2: 197.114 [7] training’s l2: 199.158 valid_1’s l2: 196.54 [8] training’s l2: 198.268 valid_1’s l2: 195.978 [9] training’s l2: 197.383 valid_1’s l2: 195.472 [10] training’s l2: 196.502 valid_1’s l2: 195.041 [11] training’s l2: 195.679 valid_1’s l2: 194.322 [12] training’s l2: 194.866 valid_1’s l2: 193.87 [13] training’s l2: 194.063 valid_1’s l2: 193.426 [14] training’s l2: 193.267 valid_1’s l2: 192.988 [15] training’s l2: 192.487 valid_1’s l2: 192.53 [16] training’s l2: 191.67 valid_1’s l2: 191.981 [17] training’s l2: 190.863 valid_1’s l2: 191.449 [18] training’s l2: 190.069 valid_1’s l2: 190.779 [19] training’s l2: 189.281 valid_1’s l2: 190.122 [20] training’s l2: 188.492 valid_1’s l2: 189.574 [21] training’s l2: 187.719 valid_1’s l2: 188.927 [22] training’s l2: 186.948 valid_1’s l2: 188.292 [23] training’s l2: 186.188 valid_1’s l2: 187.655 [24] training’s l2: 185.453 valid_1’s l2: 187.647 [25] training’s l2: 184.704 valid_1’s l2: 187.035 [26] training’s l2: 183.935 valid_1’s l2: 186.467 [27] training’s l2: 183.196 valid_1’s l2: 186.299 [28] training’s l2: 182.491 valid_1’s l2: 185.653 [29] training’s l2: 181.732 valid_1’s l2: 185.238 [30] training’s l2: 180.986 valid_1’s l2: 184.709 [31] training’s l2: 180.298 valid_1’s l2: 184.74 [32] training’s l2: 179.584 valid_1’s l2: 184.428 [33] training’s l2: 178.857 valid_1’s l2: 184.367 [34] training’s l2: 178.167 valid_1’s l2: 184.027 [35] training’s l2: 177.456 valid_1’s l2: 183.917 [36] training’s l2: 176.757 valid_1’s l2: 184.187 [37] training’s l2: 176.064 valid_1’s l2: 184.459 [38] training’s l2: 175.371 
valid_1’s l2: 183.918 [39] training’s l2: 174.69 valid_1’s l2: 184.192 [40] training’s l2: 174.016 valid_1’s l2: 184.395 [41] training’s l2: 173.353 valid_1’s l2: 184.1 [42] training’s l2: 172.685 valid_1’s l2: 183.764 [43] training’s l2: 172.04 valid_1’s l2: 183.219 [44] training’s l2: 171.399 valid_1’s l2: 182.935 [45] training’s l2: 170.74 valid_1’s l2: 182.599 [46] training’s l2: 170.106 valid_1’s l2: 182.392 [47] training’s l2: 169.455 valid_1’s l2: 181.879 [48] training’s l2: 168.833 valid_1’s l2: 181.35 [49] training’s l2: 168.207 valid_1’s l2: 180.81 [50] training’s l2: 167.598 valid_1’s l2: 180.307 Did not meet early stopping. Best iteration is: [50] training’s l2: 167.598 valid_1’s l2: 180.307 [1] training’s l2: 193.21 valid_1’s l2: 285.277 Training until validation scores don’t improve for 65 rounds [2] training’s l2: 192.378 valid_1’s l2: 284.126 [3] training’s l2: 191.579 valid_1’s l2: 283.015 [4] training’s l2: 190.788 valid_1’s l2: 281.955 [5] training’s l2: 189.976 valid_1’s l2: 280.831 [6] training’s l2: 189.197 valid_1’s l2: 279.758 [7] training’s l2: 188.431 valid_1’s l2: 278.668 [8] training’s l2: 187.666 valid_1’s l2: 277.612 [9] training’s l2: 186.91 valid_1’s l2: 276.566 [10] training’s l2: 186.161 valid_1’s l2: 275.529 [11] training’s l2: 185.366 valid_1’s l2: 274.421 [12] training’s l2: 184.559 valid_1’s l2: 273.235 [13] training’s l2: 183.76 valid_1’s l2: 272.053 [14] training’s l2: 182.966 valid_1’s l2: 270.865 [15] training’s l2: 182.182 valid_1’s l2: 269.727 [16] training’s l2: 181.442 valid_1’s l2: 268.621 [17] training’s l2: 180.71 valid_1’s l2: 267.525 [18] training’s l2: 179.986 valid_1’s l2: 266.441 [19] training’s l2: 179.239 valid_1’s l2: 265.355 [20] training’s l2: 178.528 valid_1’s l2: 264.289 [21] training’s l2: 177.848 valid_1’s l2: 263.182 [22] training’s l2: 177.15 valid_1’s l2: 262.205 [23] training’s l2: 176.46 valid_1’s l2: 261.239 [24] training’s l2: 175.79 valid_1’s l2: 260.292 [25] training’s l2: 175.11 valid_1’s l2: 
259.197 [26] training’s l2: 174.416 valid_1’s l2: 258.177 [27] training’s l2: 173.755 valid_1’s l2: 257.174 [28] training’s l2: 173.092 valid_1’s l2: 256.191 [29] training’s l2: 172.413 valid_1’s l2: 255.198 [30] training’s l2: 171.742 valid_1’s l2: 254.214 [31] training’s l2: 171.08 valid_1’s l2: 253.284 [32] training’s l2: 170.46 valid_1’s l2: 252.373 [33] training’s l2: 169.845 valid_1’s l2: 251.479 [34] training’s l2: 169.238 valid_1’s l2: 250.585 [35] training’s l2: 168.636 valid_1’s l2: 249.696 [36] training’s l2: 168.029 valid_1’s l2: 248.878 [37] training’s l2: 167.429 valid_1’s l2: 248.067 [38] training’s l2: 166.835 valid_1’s l2: 247.202 [39] training’s l2: 166.246 valid_1’s l2: 246.514 [40] training’s l2: 165.656 valid_1’s l2: 245.832 [41] training’s l2: 165.022 valid_1’s l2: 244.976 [42] training’s l2: 164.396 valid_1’s l2: 244.098 [43] training’s l2: 163.775 valid_1’s l2: 243.258 [44] training’s l2: 163.216 valid_1’s l2: 242.419 [45] training’s l2: 162.651 valid_1’s l2: 241.665 [46] training’s l2: 162.062 valid_1’s l2: 240.807 [47] training’s l2: 161.485 valid_1’s l2: 240.036 [48] training’s l2: 160.903 valid_1’s l2: 239.226 [49] training’s l2: 160.328 valid_1’s l2: 238.423 [50] training’s l2: 159.758 valid_1’s l2: 237.628 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 159.758 valid_1’s l2: 237.628 [1] training’s l2: 194.307 valid_1’s l2: 279.622 Training until validation scores don’t improve for 65 rounds [2] training’s l2: 193.445 valid_1’s l2: 279.663 [3] training’s l2: 192.594 valid_1’s l2: 278.589 [4] training’s l2: 191.711 valid_1’s l2: 278.55 [5] training’s l2: 190.871 valid_1’s l2: 278.603 [6] training’s l2: 190.025 valid_1’s l2: 279.488 [7] training’s l2: 189.234 valid_1’s l2: 278.579 [8] training’s l2: 188.455 valid_1’s l2: 277.645 [9] training’s l2: 187.63 valid_1’s l2: 278.526 [10] training’s l2: 186.815 valid_1’s l2: 279.383 [11] training’s l2: 185.988 valid_1’s l2: 278.639 [12] training’s l2: 185.183 valid_1’s l2: 277.974 [13] training’s l2: 184.384 valid_1’s l2: 277.316 [14] training’s l2: 183.593 valid_1’s l2: 276.683 [15] training’s l2: 182.86 valid_1’s l2: 275.718 [16] training’s l2: 182.069 valid_1’s l2: 274.74 [17] training’s l2: 181.286 valid_1’s l2: 273.772 [18] training’s l2: 180.509 valid_1’s l2: 272.839 [19] training’s l2: 179.751 valid_1’s l2: 271.816 [20] training’s l2: 178.988 valid_1’s l2: 270.9 [21] training’s l2: 178.213 valid_1’s l2: 270.528 [22] training’s l2: 177.488 valid_1’s l2: 269.642 [23] training’s l2: 176.725 valid_1’s l2: 269.265 [24] training’s l2: 176.006 valid_1’s l2: 270.131 [25] training’s l2: 175.259 valid_1’s l2: 269.781 [26] training’s l2: 174.525 valid_1’s l2: 268.812 [27] training’s l2: 173.814 valid_1’s l2: 267.91 [28] training’s l2: 173.088 valid_1’s l2: 266.877 [29] training’s l2: 172.34 valid_1’s l2: 266.481 [30] training’s l2: 171.599 valid_1’s l2: 266.01 [31] training’s l2: 170.933 valid_1’s l2: 266.011 [32] training’s l2: 170.226 valid_1’s l2: 265.964 [33] training’s l2: 169.528 valid_1’s l2: 266.016 [34] training’s l2: 168.841 valid_1’s l2: 266.016 [35] training’s l2: 168.155 valid_1’s l2: 265.984 [36] training’s l2: 167.483 valid_1’s l2: 265.37 [37] training’s l2: 166.824 valid_1’s l2: 264.803 [38] training’s l2: 166.156 valid_1’s 
l2: 264.272 [39] training’s l2: 165.508 valid_1’s l2: 263.719 [40] training’s l2: 164.864 valid_1’s l2: 263.463 [41] training’s l2: 164.229 valid_1’s l2: 263.367 [42] training’s l2: 163.601 valid_1’s l2: 264.253 [43] training’s l2: 163.013 valid_1’s l2: 263.47 [44] training’s l2: 162.394 valid_1’s l2: 263.122 [45] training’s l2: 161.783 valid_1’s l2: 264.005 [46] training’s l2: 161.181 valid_1’s l2: 263.596 [47] training’s l2: 160.55 valid_1’s l2: 264.048 [48] training’s l2: 159.959 valid_1’s l2: 263.621 [49] training’s l2: 159.385 valid_1’s l2: 263.515 [50] training’s l2: 158.805 valid_1’s l2: 263.099 Did not meet early stopping. Best iteration is: [50] training’s l2: 158.805 valid_1’s l2: 263.099 [1] training’s l2: 193.625 valid_1’s l2: 283.865 Training until validation scores don’t improve for 70 rounds [2] training’s l2: 192.834 valid_1’s l2: 283.578 [3] training’s l2: 192.01 valid_1’s l2: 282.389 [4] training’s l2: 191.158 valid_1’s l2: 281.972 [5] training’s l2: 190.387 valid_1’s l2: 281.703 [6] training’s l2: 189.58 valid_1’s l2: 280.991 [7] training’s l2: 188.82 valid_1’s l2: 279.919 [8] training’s l2: 188.067 valid_1’s l2: 278.835 [9] training’s l2: 187.282 valid_1’s l2: 278.151 [10] training’s l2: 186.505 valid_1’s l2: 277.474 [11] training’s l2: 185.732 valid_1’s l2: 276.417 [12] training’s l2: 184.986 valid_1’s l2: 277.021 [13] training’s l2: 184.243 valid_1’s l2: 277.628 [14] training’s l2: 183.518 valid_1’s l2: 278.172 [15] training’s l2: 182.812 valid_1’s l2: 277.093 [16] training’s l2: 182.079 valid_1’s l2: 276.084 [17] training’s l2: 181.354 valid_1’s l2: 275.085 [18] training’s l2: 180.637 valid_1’s l2: 274.094 [19] training’s l2: 179.91 valid_1’s l2: 272.942 [20] training’s l2: 179.205 valid_1’s l2: 271.968 [21] training’s l2: 178.496 valid_1’s l2: 271.67 [22] training’s l2: 177.846 valid_1’s l2: 270.901 [23] training’s l2: 177.151 valid_1’s l2: 270.617 [24] training’s l2: 176.409 valid_1’s l2: 270.023 [25] training’s l2: 175.726 valid_1’s l2: 
269.75 [26] training’s l2: 175.091 valid_1’s l2: 268.806 [27] training’s l2: 174.48 valid_1’s l2: 267.878 [28] training’s l2: 173.811 valid_1’s l2: 266.845 [29] training’s l2: 173.17 valid_1’s l2: 266.611 [30] training’s l2: 172.537 valid_1’s l2: 266.383 [31] training’s l2: 171.922 valid_1’s l2: 267.006 [32] training’s l2: 171.261 valid_1’s l2: 266.787 [33] training’s l2: 170.607 valid_1’s l2: 266.574 [34] training’s l2: 169.96 valid_1’s l2: 266.367 [35] training’s l2: 169.333 valid_1’s l2: 266.913 [36] training’s l2: 168.705 valid_1’s l2: 266.423 [37] training’s l2: 168.084 valid_1’s l2: 265.939 [38] training’s l2: 167.445 valid_1’s l2: 266.445 [39] training’s l2: 166.834 valid_1’s l2: 265.969 [40] training’s l2: 166.232 valid_1’s l2: 265.483 [41] training’s l2: 165.627 valid_1’s l2: 265.23 [42] training’s l2: 165.031 valid_1’s l2: 265.713 [43] training’s l2: 164.476 valid_1’s l2: 264.817 [44] training’s l2: 163.888 valid_1’s l2: 264.576 [45] training’s l2: 163.308 valid_1’s l2: 265.057 [46] training’s l2: 162.736 valid_1’s l2: 264.428 [47] training’s l2: 162.132 valid_1’s l2: 263.63 [48] training’s l2: 161.57 valid_1’s l2: 263 [49] training’s l2: 161.015 valid_1’s l2: 262.377 [50] training’s l2: 160.465 valid_1’s l2: 261.76 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 160.465 valid_1’s l2: 261.76 [1] training’s l2: 197.617 valid_1’s l2: 264.133 Training until validation scores don’t improve for 70 rounds [2] training’s l2: 196.824 valid_1’s l2: 263.054 [3] training’s l2: 195.994 valid_1’s l2: 261.969 [4] training’s l2: 195.17 valid_1’s l2: 260.922 [5] training’s l2: 194.398 valid_1’s l2: 259.873 [6] training’s l2: 193.557 valid_1’s l2: 258.938 [7] training’s l2: 192.729 valid_1’s l2: 258.016 [8] training’s l2: 191.893 valid_1’s l2: 257.089 [9] training’s l2: 191.077 valid_1’s l2: 256.181 [10] training’s l2: 190.275 valid_1’s l2: 255.28 [11] training’s l2: 189.483 valid_1’s l2: 254.394 [12] training’s l2: 188.709 valid_1’s l2: 253.386 [13] training’s l2: 187.944 valid_1’s l2: 252.449 [14] training’s l2: 187.186 valid_1’s l2: 251.521 [15] training’s l2: 186.438 valid_1’s l2: 250.607 [16] training’s l2: 185.669 valid_1’s l2: 249.589 [17] training’s l2: 184.944 valid_1’s l2: 248.652 [18] training’s l2: 184.189 valid_1’s l2: 247.654 [19] training’s l2: 183.445 valid_1’s l2: 246.74 [20] training’s l2: 182.738 valid_1’s l2: 245.806 [21] training’s l2: 182.019 valid_1’s l2: 244.928 [22] training’s l2: 181.315 valid_1’s l2: 244.104 [23] training’s l2: 180.607 valid_1’s l2: 243.236 [24] training’s l2: 179.942 valid_1’s l2: 242.465 [25] training’s l2: 179.246 valid_1’s l2: 241.611 [26] training’s l2: 178.542 valid_1’s l2: 240.708 [27] training’s l2: 177.852 valid_1’s l2: 239.788 [28] training’s l2: 177.179 valid_1’s l2: 238.962 [29] training’s l2: 176.494 valid_1’s l2: 238.083 [30] training’s l2: 175.807 valid_1’s l2: 237.213 [31] training’s l2: 175.186 valid_1’s l2: 236.452 [32] training’s l2: 174.513 valid_1’s l2: 235.719 [33] training’s l2: 173.847 valid_1’s l2: 234.993 [34] training’s l2: 173.19 valid_1’s l2: 234.317 [35] training’s l2: 172.54 valid_1’s l2: 233.604 [36] training’s l2: 171.866 valid_1’s l2: 232.853 [37] training’s l2: 171.199 valid_1’s l2: 232.109 [38] training’s l2: 170.58 
valid_1’s l2: 231.423 [39] training’s l2: 169.954 valid_1’s l2: 230.648 [40] training’s l2: 169.334 valid_1’s l2: 229.868 [41] training’s l2: 168.705 valid_1’s l2: 229.042 [42] training’s l2: 168.014 valid_1’s l2: 228.095 [43] training’s l2: 167.396 valid_1’s l2: 227.277 [44] training’s l2: 166.786 valid_1’s l2: 226.471 [45] training’s l2: 166.114 valid_1’s l2: 225.523 [46] training’s l2: 165.535 valid_1’s l2: 224.763 [47] training’s l2: 164.952 valid_1’s l2: 224.065 [48] training’s l2: 164.384 valid_1’s l2: 223.319 [49] training’s l2: 163.822 valid_1’s l2: 222.569 [50] training’s l2: 163.272 valid_1’s l2: 221.882 Did not meet early stopping. Best iteration is: [50] training’s l2: 163.272 valid_1’s l2: 221.882 [1] training’s l2: 204.481 valid_1’s l2: 198.34 Training until validation scores don’t improve for 70 rounds [2] training’s l2: 203.591 valid_1’s l2: 198.34 [3] training’s l2: 202.701 valid_1’s l2: 197.909 [4] training’s l2: 201.82 valid_1’s l2: 197.496 [5] training’s l2: 200.952 valid_1’s l2: 197.468 [6] training’s l2: 200.046 valid_1’s l2: 197.114 [7] training’s l2: 199.158 valid_1’s l2: 196.54 [8] training’s l2: 198.268 valid_1’s l2: 195.978 [9] training’s l2: 197.383 valid_1’s l2: 195.472 [10] training’s l2: 196.502 valid_1’s l2: 195.041 [11] training’s l2: 195.679 valid_1’s l2: 194.322 [12] training’s l2: 194.866 valid_1’s l2: 193.87 [13] training’s l2: 194.063 valid_1’s l2: 193.426 [14] training’s l2: 193.267 valid_1’s l2: 192.988 [15] training’s l2: 192.487 valid_1’s l2: 192.53 [16] training’s l2: 191.67 valid_1’s l2: 191.981 [17] training’s l2: 190.863 valid_1’s l2: 191.449 [18] training’s l2: 190.069 valid_1’s l2: 190.779 [19] training’s l2: 189.281 valid_1’s l2: 190.122 [20] training’s l2: 188.492 valid_1’s l2: 189.574 [21] training’s l2: 187.719 valid_1’s l2: 188.927 [22] training’s l2: 186.948 valid_1’s l2: 188.292 [23] training’s l2: 186.188 valid_1’s l2: 187.655 [24] training’s l2: 185.453 valid_1’s l2: 187.647 [25] training’s l2: 184.704 
valid_1’s l2: 187.035 [26] training’s l2: 183.935 valid_1’s l2: 186.467 [27] training’s l2: 183.196 valid_1’s l2: 186.299 [28] training’s l2: 182.491 valid_1’s l2: 185.653 [29] training’s l2: 181.732 valid_1’s l2: 185.238 [30] training’s l2: 180.986 valid_1’s l2: 184.709 [31] training’s l2: 180.298 valid_1’s l2: 184.74 [32] training’s l2: 179.584 valid_1’s l2: 184.428 [33] training’s l2: 178.857 valid_1’s l2: 184.367 [34] training’s l2: 178.167 valid_1’s l2: 184.027 [35] training’s l2: 177.456 valid_1’s l2: 183.917 [36] training’s l2: 176.757 valid_1’s l2: 184.187 [37] training’s l2: 176.064 valid_1’s l2: 184.459 [38] training’s l2: 175.371 valid_1’s l2: 183.918 [39] training’s l2: 174.69 valid_1’s l2: 184.192 [40] training’s l2: 174.016 valid_1’s l2: 184.395 [41] training’s l2: 173.353 valid_1’s l2: 184.1 [42] training’s l2: 172.685 valid_1’s l2: 183.764 [43] training’s l2: 172.04 valid_1’s l2: 183.219 [44] training’s l2: 171.399 valid_1’s l2: 182.935 [45] training’s l2: 170.74 valid_1’s l2: 182.599 [46] training’s l2: 170.106 valid_1’s l2: 182.392 [47] training’s l2: 169.455 valid_1’s l2: 181.879 [48] training’s l2: 168.833 valid_1’s l2: 181.35 [49] training’s l2: 168.207 valid_1’s l2: 180.81 [50] training’s l2: 167.598 valid_1’s l2: 180.307 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 167.598 valid_1’s l2: 180.307 [1] training’s l2: 193.21 valid_1’s l2: 285.277 Training until validation scores don’t improve for 70 rounds [2] training’s l2: 192.378 valid_1’s l2: 284.126 [3] training’s l2: 191.579 valid_1’s l2: 283.015 [4] training’s l2: 190.788 valid_1’s l2: 281.955 [5] training’s l2: 189.976 valid_1’s l2: 280.831 [6] training’s l2: 189.197 valid_1’s l2: 279.758 [7] training’s l2: 188.431 valid_1’s l2: 278.668 [8] training’s l2: 187.666 valid_1’s l2: 277.612 [9] training’s l2: 186.91 valid_1’s l2: 276.566 [10] training’s l2: 186.161 valid_1’s l2: 275.529 [11] training’s l2: 185.366 valid_1’s l2: 274.421 [12] training’s l2: 184.559 valid_1’s l2: 273.235 [13] training’s l2: 183.76 valid_1’s l2: 272.053 [14] training’s l2: 182.966 valid_1’s l2: 270.865 [15] training’s l2: 182.182 valid_1’s l2: 269.727 [16] training’s l2: 181.442 valid_1’s l2: 268.621 [17] training’s l2: 180.71 valid_1’s l2: 267.525 [18] training’s l2: 179.986 valid_1’s l2: 266.441 [19] training’s l2: 179.239 valid_1’s l2: 265.355 [20] training’s l2: 178.528 valid_1’s l2: 264.289 [21] training’s l2: 177.848 valid_1’s l2: 263.182 [22] training’s l2: 177.15 valid_1’s l2: 262.205 [23] training’s l2: 176.46 valid_1’s l2: 261.239 [24] training’s l2: 175.79 valid_1’s l2: 260.292 [25] training’s l2: 175.11 valid_1’s l2: 259.197 [26] training’s l2: 174.416 valid_1’s l2: 258.177 [27] training’s l2: 173.755 valid_1’s l2: 257.174 [28] training’s l2: 173.092 valid_1’s l2: 256.191 [29] training’s l2: 172.413 valid_1’s l2: 255.198 [30] training’s l2: 171.742 valid_1’s l2: 254.214 [31] training’s l2: 171.08 valid_1’s l2: 253.284 [32] training’s l2: 170.46 valid_1’s l2: 252.373 [33] training’s l2: 169.845 valid_1’s l2: 251.479 [34] training’s l2: 169.238 valid_1’s l2: 250.585 [35] training’s l2: 168.636 valid_1’s l2: 249.696 [36] training’s l2: 168.029 valid_1’s l2: 248.878 [37] training’s l2: 167.429 valid_1’s l2: 248.067 [38] training’s l2: 166.835 valid_1’s 
l2: 247.202 [39] training’s l2: 166.246 valid_1’s l2: 246.514 [40] training’s l2: 165.656 valid_1’s l2: 245.832 [41] training’s l2: 165.022 valid_1’s l2: 244.976 [42] training’s l2: 164.396 valid_1’s l2: 244.098 [43] training’s l2: 163.775 valid_1’s l2: 243.258 [44] training’s l2: 163.216 valid_1’s l2: 242.419 [45] training’s l2: 162.651 valid_1’s l2: 241.665 [46] training’s l2: 162.062 valid_1’s l2: 240.807 [47] training’s l2: 161.485 valid_1’s l2: 240.036 [48] training’s l2: 160.903 valid_1’s l2: 239.226 [49] training’s l2: 160.328 valid_1’s l2: 238.423 [50] training’s l2: 159.758 valid_1’s l2: 237.628 Did not meet early stopping. Best iteration is: [50] training’s l2: 159.758 valid_1’s l2: 237.628 [1] training’s l2: 194.307 valid_1’s l2: 279.622 Training until validation scores don’t improve for 70 rounds [2] training’s l2: 193.445 valid_1’s l2: 279.663 [3] training’s l2: 192.594 valid_1’s l2: 278.589 [4] training’s l2: 191.711 valid_1’s l2: 278.55 [5] training’s l2: 190.871 valid_1’s l2: 278.603 [6] training’s l2: 190.025 valid_1’s l2: 279.488 [7] training’s l2: 189.234 valid_1’s l2: 278.579 [8] training’s l2: 188.455 valid_1’s l2: 277.645 [9] training’s l2: 187.63 valid_1’s l2: 278.526 [10] training’s l2: 186.815 valid_1’s l2: 279.383 [11] training’s l2: 185.988 valid_1’s l2: 278.639 [12] training’s l2: 185.183 valid_1’s l2: 277.974 [13] training’s l2: 184.384 valid_1’s l2: 277.316 [14] training’s l2: 183.593 valid_1’s l2: 276.683 [15] training’s l2: 182.86 valid_1’s l2: 275.718 [16] training’s l2: 182.069 valid_1’s l2: 274.74 [17] training’s l2: 181.286 valid_1’s l2: 273.772 [18] training’s l2: 180.509 valid_1’s l2: 272.839 [19] training’s l2: 179.751 valid_1’s l2: 271.816 [20] training’s l2: 178.988 valid_1’s l2: 270.9 [21] training’s l2: 178.213 valid_1’s l2: 270.528 [22] training’s l2: 177.488 valid_1’s l2: 269.642 [23] training’s l2: 176.725 valid_1’s l2: 269.265 [24] training’s l2: 176.006 valid_1’s l2: 270.131 [25] training’s l2: 175.259 valid_1’s l2: 
269.781 [26] training’s l2: 174.525 valid_1’s l2: 268.812 [27] training’s l2: 173.814 valid_1’s l2: 267.91 [28] training’s l2: 173.088 valid_1’s l2: 266.877 [29] training’s l2: 172.34 valid_1’s l2: 266.481 [30] training’s l2: 171.599 valid_1’s l2: 266.01 [31] training’s l2: 170.933 valid_1’s l2: 266.011 [32] training’s l2: 170.226 valid_1’s l2: 265.964 [33] training’s l2: 169.528 valid_1’s l2: 266.016 [34] training’s l2: 168.841 valid_1’s l2: 266.016 [35] training’s l2: 168.155 valid_1’s l2: 265.984 [36] training’s l2: 167.483 valid_1’s l2: 265.37 [37] training’s l2: 166.824 valid_1’s l2: 264.803 [38] training’s l2: 166.156 valid_1’s l2: 264.272 [39] training’s l2: 165.508 valid_1’s l2: 263.719 [40] training’s l2: 164.864 valid_1’s l2: 263.463 [41] training’s l2: 164.229 valid_1’s l2: 263.367 [42] training’s l2: 163.601 valid_1’s l2: 264.253 [43] training’s l2: 163.013 valid_1’s l2: 263.47 [44] training’s l2: 162.394 valid_1’s l2: 263.122 [45] training’s l2: 161.783 valid_1’s l2: 264.005 [46] training’s l2: 161.181 valid_1’s l2: 263.596 [47] training’s l2: 160.55 valid_1’s l2: 264.048 [48] training’s l2: 159.959 valid_1’s l2: 263.621 [49] training’s l2: 159.385 valid_1’s l2: 263.515 [50] training’s l2: 158.805 valid_1’s l2: 263.099 Did not meet early stopping. 
[Verbose LightGBM per-iteration log omitted. Five-fold cross-validation was repeated with `early_stopping_rounds` set to 75, 80, and 85. Every run reports "Did not meet early stopping" with best iteration [50]: training uses only 50 boosting rounds, so a patience larger than 50 can never be exhausted, and all three settings yield identical per-fold results:]

| fold | training l2 (iter 1 → 50) | valid_1 l2 (iter 1 → 50) | best iteration |
|------|---------------------------|--------------------------|----------------|
| 1    | 193.625 → 160.465         | 283.865 → 261.760        | 50             |
| 2    | 197.617 → 163.272         | 264.133 → 221.882        | 50             |
| 3    | 204.481 → 167.598         | 198.340 → 180.307        | 50             |
| 4    | 193.210 → 159.758         | 285.277 → 237.628        | 50             |
| 5    | 194.307 → 158.805         | 279.622 → 263.099        | 50             |
l2: 264.272 [39] training’s l2: 165.508 valid_1’s l2: 263.719 [40] training’s l2: 164.864 valid_1’s l2: 263.463 [41] training’s l2: 164.229 valid_1’s l2: 263.367 [42] training’s l2: 163.601 valid_1’s l2: 264.253 [43] training’s l2: 163.013 valid_1’s l2: 263.47 [44] training’s l2: 162.394 valid_1’s l2: 263.122 [45] training’s l2: 161.783 valid_1’s l2: 264.005 [46] training’s l2: 161.181 valid_1’s l2: 263.596 [47] training’s l2: 160.55 valid_1’s l2: 264.048 [48] training’s l2: 159.959 valid_1’s l2: 263.621 [49] training’s l2: 159.385 valid_1’s l2: 263.515 [50] training’s l2: 158.805 valid_1’s l2: 263.099 Did not meet early stopping. Best iteration is: [50] training’s l2: 158.805 valid_1’s l2: 263.099 [1] training’s l2: 193.625 valid_1’s l2: 283.865 Training until validation scores don’t improve for 90 rounds [2] training’s l2: 192.834 valid_1’s l2: 283.578 [3] training’s l2: 192.01 valid_1’s l2: 282.389 [4] training’s l2: 191.158 valid_1’s l2: 281.972 [5] training’s l2: 190.387 valid_1’s l2: 281.703 [6] training’s l2: 189.58 valid_1’s l2: 280.991 [7] training’s l2: 188.82 valid_1’s l2: 279.919 [8] training’s l2: 188.067 valid_1’s l2: 278.835 [9] training’s l2: 187.282 valid_1’s l2: 278.151 [10] training’s l2: 186.505 valid_1’s l2: 277.474 [11] training’s l2: 185.732 valid_1’s l2: 276.417 [12] training’s l2: 184.986 valid_1’s l2: 277.021 [13] training’s l2: 184.243 valid_1’s l2: 277.628 [14] training’s l2: 183.518 valid_1’s l2: 278.172 [15] training’s l2: 182.812 valid_1’s l2: 277.093 [16] training’s l2: 182.079 valid_1’s l2: 276.084 [17] training’s l2: 181.354 valid_1’s l2: 275.085 [18] training’s l2: 180.637 valid_1’s l2: 274.094 [19] training’s l2: 179.91 valid_1’s l2: 272.942 [20] training’s l2: 179.205 valid_1’s l2: 271.968 [21] training’s l2: 178.496 valid_1’s l2: 271.67 [22] training’s l2: 177.846 valid_1’s l2: 270.901 [23] training’s l2: 177.151 valid_1’s l2: 270.617 [24] training’s l2: 176.409 valid_1’s l2: 270.023 [25] training’s l2: 175.726 valid_1’s l2: 
269.75 [26] training’s l2: 175.091 valid_1’s l2: 268.806 [27] training’s l2: 174.48 valid_1’s l2: 267.878 [28] training’s l2: 173.811 valid_1’s l2: 266.845 [29] training’s l2: 173.17 valid_1’s l2: 266.611 [30] training’s l2: 172.537 valid_1’s l2: 266.383 [31] training’s l2: 171.922 valid_1’s l2: 267.006 [32] training’s l2: 171.261 valid_1’s l2: 266.787 [33] training’s l2: 170.607 valid_1’s l2: 266.574 [34] training’s l2: 169.96 valid_1’s l2: 266.367 [35] training’s l2: 169.333 valid_1’s l2: 266.913 [36] training’s l2: 168.705 valid_1’s l2: 266.423 [37] training’s l2: 168.084 valid_1’s l2: 265.939 [38] training’s l2: 167.445 valid_1’s l2: 266.445 [39] training’s l2: 166.834 valid_1’s l2: 265.969 [40] training’s l2: 166.232 valid_1’s l2: 265.483 [41] training’s l2: 165.627 valid_1’s l2: 265.23 [42] training’s l2: 165.031 valid_1’s l2: 265.713 [43] training’s l2: 164.476 valid_1’s l2: 264.817 [44] training’s l2: 163.888 valid_1’s l2: 264.576 [45] training’s l2: 163.308 valid_1’s l2: 265.057 [46] training’s l2: 162.736 valid_1’s l2: 264.428 [47] training’s l2: 162.132 valid_1’s l2: 263.63 [48] training’s l2: 161.57 valid_1’s l2: 263 [49] training’s l2: 161.015 valid_1’s l2: 262.377 [50] training’s l2: 160.465 valid_1’s l2: 261.76 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 160.465 valid_1’s l2: 261.76 [1] training’s l2: 197.617 valid_1’s l2: 264.133 Training until validation scores don’t improve for 90 rounds [2] training’s l2: 196.824 valid_1’s l2: 263.054 [3] training’s l2: 195.994 valid_1’s l2: 261.969 [4] training’s l2: 195.17 valid_1’s l2: 260.922 [5] training’s l2: 194.398 valid_1’s l2: 259.873 [6] training’s l2: 193.557 valid_1’s l2: 258.938 [7] training’s l2: 192.729 valid_1’s l2: 258.016 [8] training’s l2: 191.893 valid_1’s l2: 257.089 [9] training’s l2: 191.077 valid_1’s l2: 256.181 [10] training’s l2: 190.275 valid_1’s l2: 255.28 [11] training’s l2: 189.483 valid_1’s l2: 254.394 [12] training’s l2: 188.709 valid_1’s l2: 253.386 [13] training’s l2: 187.944 valid_1’s l2: 252.449 [14] training’s l2: 187.186 valid_1’s l2: 251.521 [15] training’s l2: 186.438 valid_1’s l2: 250.607 [16] training’s l2: 185.669 valid_1’s l2: 249.589 [17] training’s l2: 184.944 valid_1’s l2: 248.652 [18] training’s l2: 184.189 valid_1’s l2: 247.654 [19] training’s l2: 183.445 valid_1’s l2: 246.74 [20] training’s l2: 182.738 valid_1’s l2: 245.806 [21] training’s l2: 182.019 valid_1’s l2: 244.928 [22] training’s l2: 181.315 valid_1’s l2: 244.104 [23] training’s l2: 180.607 valid_1’s l2: 243.236 [24] training’s l2: 179.942 valid_1’s l2: 242.465 [25] training’s l2: 179.246 valid_1’s l2: 241.611 [26] training’s l2: 178.542 valid_1’s l2: 240.708 [27] training’s l2: 177.852 valid_1’s l2: 239.788 [28] training’s l2: 177.179 valid_1’s l2: 238.962 [29] training’s l2: 176.494 valid_1’s l2: 238.083 [30] training’s l2: 175.807 valid_1’s l2: 237.213 [31] training’s l2: 175.186 valid_1’s l2: 236.452 [32] training’s l2: 174.513 valid_1’s l2: 235.719 [33] training’s l2: 173.847 valid_1’s l2: 234.993 [34] training’s l2: 173.19 valid_1’s l2: 234.317 [35] training’s l2: 172.54 valid_1’s l2: 233.604 [36] training’s l2: 171.866 valid_1’s l2: 232.853 [37] training’s l2: 171.199 valid_1’s l2: 232.109 [38] training’s l2: 170.58 
valid_1’s l2: 231.423 [39] training’s l2: 169.954 valid_1’s l2: 230.648 [40] training’s l2: 169.334 valid_1’s l2: 229.868 [41] training’s l2: 168.705 valid_1’s l2: 229.042 [42] training’s l2: 168.014 valid_1’s l2: 228.095 [43] training’s l2: 167.396 valid_1’s l2: 227.277 [44] training’s l2: 166.786 valid_1’s l2: 226.471 [45] training’s l2: 166.114 valid_1’s l2: 225.523 [46] training’s l2: 165.535 valid_1’s l2: 224.763 [47] training’s l2: 164.952 valid_1’s l2: 224.065 [48] training’s l2: 164.384 valid_1’s l2: 223.319 [49] training’s l2: 163.822 valid_1’s l2: 222.569 [50] training’s l2: 163.272 valid_1’s l2: 221.882 Did not meet early stopping. Best iteration is: [50] training’s l2: 163.272 valid_1’s l2: 221.882 [1] training’s l2: 204.481 valid_1’s l2: 198.34 Training until validation scores don’t improve for 90 rounds [2] training’s l2: 203.591 valid_1’s l2: 198.34 [3] training’s l2: 202.701 valid_1’s l2: 197.909 [4] training’s l2: 201.82 valid_1’s l2: 197.496 [5] training’s l2: 200.952 valid_1’s l2: 197.468 [6] training’s l2: 200.046 valid_1’s l2: 197.114 [7] training’s l2: 199.158 valid_1’s l2: 196.54 [8] training’s l2: 198.268 valid_1’s l2: 195.978 [9] training’s l2: 197.383 valid_1’s l2: 195.472 [10] training’s l2: 196.502 valid_1’s l2: 195.041 [11] training’s l2: 195.679 valid_1’s l2: 194.322 [12] training’s l2: 194.866 valid_1’s l2: 193.87 [13] training’s l2: 194.063 valid_1’s l2: 193.426 [14] training’s l2: 193.267 valid_1’s l2: 192.988 [15] training’s l2: 192.487 valid_1’s l2: 192.53 [16] training’s l2: 191.67 valid_1’s l2: 191.981 [17] training’s l2: 190.863 valid_1’s l2: 191.449 [18] training’s l2: 190.069 valid_1’s l2: 190.779 [19] training’s l2: 189.281 valid_1’s l2: 190.122 [20] training’s l2: 188.492 valid_1’s l2: 189.574 [21] training’s l2: 187.719 valid_1’s l2: 188.927 [22] training’s l2: 186.948 valid_1’s l2: 188.292 [23] training’s l2: 186.188 valid_1’s l2: 187.655 [24] training’s l2: 185.453 valid_1’s l2: 187.647 [25] training’s l2: 184.704 
valid_1’s l2: 187.035 [26] training’s l2: 183.935 valid_1’s l2: 186.467 [27] training’s l2: 183.196 valid_1’s l2: 186.299 [28] training’s l2: 182.491 valid_1’s l2: 185.653 [29] training’s l2: 181.732 valid_1’s l2: 185.238 [30] training’s l2: 180.986 valid_1’s l2: 184.709 [31] training’s l2: 180.298 valid_1’s l2: 184.74 [32] training’s l2: 179.584 valid_1’s l2: 184.428 [33] training’s l2: 178.857 valid_1’s l2: 184.367 [34] training’s l2: 178.167 valid_1’s l2: 184.027 [35] training’s l2: 177.456 valid_1’s l2: 183.917 [36] training’s l2: 176.757 valid_1’s l2: 184.187 [37] training’s l2: 176.064 valid_1’s l2: 184.459 [38] training’s l2: 175.371 valid_1’s l2: 183.918 [39] training’s l2: 174.69 valid_1’s l2: 184.192 [40] training’s l2: 174.016 valid_1’s l2: 184.395 [41] training’s l2: 173.353 valid_1’s l2: 184.1 [42] training’s l2: 172.685 valid_1’s l2: 183.764 [43] training’s l2: 172.04 valid_1’s l2: 183.219 [44] training’s l2: 171.399 valid_1’s l2: 182.935 [45] training’s l2: 170.74 valid_1’s l2: 182.599 [46] training’s l2: 170.106 valid_1’s l2: 182.392 [47] training’s l2: 169.455 valid_1’s l2: 181.879 [48] training’s l2: 168.833 valid_1’s l2: 181.35 [49] training’s l2: 168.207 valid_1’s l2: 180.81 [50] training’s l2: 167.598 valid_1’s l2: 180.307 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 167.598 valid_1’s l2: 180.307 [1] training’s l2: 193.21 valid_1’s l2: 285.277 Training until validation scores don’t improve for 90 rounds [2] training’s l2: 192.378 valid_1’s l2: 284.126 [3] training’s l2: 191.579 valid_1’s l2: 283.015 [4] training’s l2: 190.788 valid_1’s l2: 281.955 [5] training’s l2: 189.976 valid_1’s l2: 280.831 [6] training’s l2: 189.197 valid_1’s l2: 279.758 [7] training’s l2: 188.431 valid_1’s l2: 278.668 [8] training’s l2: 187.666 valid_1’s l2: 277.612 [9] training’s l2: 186.91 valid_1’s l2: 276.566 [10] training’s l2: 186.161 valid_1’s l2: 275.529 [11] training’s l2: 185.366 valid_1’s l2: 274.421 [12] training’s l2: 184.559 valid_1’s l2: 273.235 [13] training’s l2: 183.76 valid_1’s l2: 272.053 [14] training’s l2: 182.966 valid_1’s l2: 270.865 [15] training’s l2: 182.182 valid_1’s l2: 269.727 [16] training’s l2: 181.442 valid_1’s l2: 268.621 [17] training’s l2: 180.71 valid_1’s l2: 267.525 [18] training’s l2: 179.986 valid_1’s l2: 266.441 [19] training’s l2: 179.239 valid_1’s l2: 265.355 [20] training’s l2: 178.528 valid_1’s l2: 264.289 [21] training’s l2: 177.848 valid_1’s l2: 263.182 [22] training’s l2: 177.15 valid_1’s l2: 262.205 [23] training’s l2: 176.46 valid_1’s l2: 261.239 [24] training’s l2: 175.79 valid_1’s l2: 260.292 [25] training’s l2: 175.11 valid_1’s l2: 259.197 [26] training’s l2: 174.416 valid_1’s l2: 258.177 [27] training’s l2: 173.755 valid_1’s l2: 257.174 [28] training’s l2: 173.092 valid_1’s l2: 256.191 [29] training’s l2: 172.413 valid_1’s l2: 255.198 [30] training’s l2: 171.742 valid_1’s l2: 254.214 [31] training’s l2: 171.08 valid_1’s l2: 253.284 [32] training’s l2: 170.46 valid_1’s l2: 252.373 [33] training’s l2: 169.845 valid_1’s l2: 251.479 [34] training’s l2: 169.238 valid_1’s l2: 250.585 [35] training’s l2: 168.636 valid_1’s l2: 249.696 [36] training’s l2: 168.029 valid_1’s l2: 248.878 [37] training’s l2: 167.429 valid_1’s l2: 248.067 [38] training’s l2: 166.835 valid_1’s 
l2: 247.202 [39] training’s l2: 166.246 valid_1’s l2: 246.514 [40] training’s l2: 165.656 valid_1’s l2: 245.832 [41] training’s l2: 165.022 valid_1’s l2: 244.976 [42] training’s l2: 164.396 valid_1’s l2: 244.098 [43] training’s l2: 163.775 valid_1’s l2: 243.258 [44] training’s l2: 163.216 valid_1’s l2: 242.419 [45] training’s l2: 162.651 valid_1’s l2: 241.665 [46] training’s l2: 162.062 valid_1’s l2: 240.807 [47] training’s l2: 161.485 valid_1’s l2: 240.036 [48] training’s l2: 160.903 valid_1’s l2: 239.226 [49] training’s l2: 160.328 valid_1’s l2: 238.423 [50] training’s l2: 159.758 valid_1’s l2: 237.628 Did not meet early stopping. Best iteration is: [50] training’s l2: 159.758 valid_1’s l2: 237.628 [1] training’s l2: 194.307 valid_1’s l2: 279.622 Training until validation scores don’t improve for 90 rounds [2] training’s l2: 193.445 valid_1’s l2: 279.663 [3] training’s l2: 192.594 valid_1’s l2: 278.589 [4] training’s l2: 191.711 valid_1’s l2: 278.55 [5] training’s l2: 190.871 valid_1’s l2: 278.603 [6] training’s l2: 190.025 valid_1’s l2: 279.488 [7] training’s l2: 189.234 valid_1’s l2: 278.579 [8] training’s l2: 188.455 valid_1’s l2: 277.645 [9] training’s l2: 187.63 valid_1’s l2: 278.526 [10] training’s l2: 186.815 valid_1’s l2: 279.383 [11] training’s l2: 185.988 valid_1’s l2: 278.639 [12] training’s l2: 185.183 valid_1’s l2: 277.974 [13] training’s l2: 184.384 valid_1’s l2: 277.316 [14] training’s l2: 183.593 valid_1’s l2: 276.683 [15] training’s l2: 182.86 valid_1’s l2: 275.718 [16] training’s l2: 182.069 valid_1’s l2: 274.74 [17] training’s l2: 181.286 valid_1’s l2: 273.772 [18] training’s l2: 180.509 valid_1’s l2: 272.839 [19] training’s l2: 179.751 valid_1’s l2: 271.816 [20] training’s l2: 178.988 valid_1’s l2: 270.9 [21] training’s l2: 178.213 valid_1’s l2: 270.528 [22] training’s l2: 177.488 valid_1’s l2: 269.642 [23] training’s l2: 176.725 valid_1’s l2: 269.265 [24] training’s l2: 176.006 valid_1’s l2: 270.131 [25] training’s l2: 175.259 valid_1’s l2: 
269.781 [26] training’s l2: 174.525 valid_1’s l2: 268.812 [27] training’s l2: 173.814 valid_1’s l2: 267.91 [28] training’s l2: 173.088 valid_1’s l2: 266.877 [29] training’s l2: 172.34 valid_1’s l2: 266.481 [30] training’s l2: 171.599 valid_1’s l2: 266.01 [31] training’s l2: 170.933 valid_1’s l2: 266.011 [32] training’s l2: 170.226 valid_1’s l2: 265.964 [33] training’s l2: 169.528 valid_1’s l2: 266.016 [34] training’s l2: 168.841 valid_1’s l2: 266.016 [35] training’s l2: 168.155 valid_1’s l2: 265.984 [36] training’s l2: 167.483 valid_1’s l2: 265.37 [37] training’s l2: 166.824 valid_1’s l2: 264.803 [38] training’s l2: 166.156 valid_1’s l2: 264.272 [39] training’s l2: 165.508 valid_1’s l2: 263.719 [40] training’s l2: 164.864 valid_1’s l2: 263.463 [41] training’s l2: 164.229 valid_1’s l2: 263.367 [42] training’s l2: 163.601 valid_1’s l2: 264.253 [43] training’s l2: 163.013 valid_1’s l2: 263.47 [44] training’s l2: 162.394 valid_1’s l2: 263.122 [45] training’s l2: 161.783 valid_1’s l2: 264.005 [46] training’s l2: 161.181 valid_1’s l2: 263.596 [47] training’s l2: 160.55 valid_1’s l2: 264.048 [48] training’s l2: 159.959 valid_1’s l2: 263.621 [49] training’s l2: 159.385 valid_1’s l2: 263.515 [50] training’s l2: 158.805 valid_1’s l2: 263.099 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 158.805 valid_1’s l2: 263.099 [1] training’s l2: 193.625 valid_1’s l2: 283.865 Training until validation scores don’t improve for 95 rounds [2] training’s l2: 192.834 valid_1’s l2: 283.578 [3] training’s l2: 192.01 valid_1’s l2: 282.389 [4] training’s l2: 191.158 valid_1’s l2: 281.972 [5] training’s l2: 190.387 valid_1’s l2: 281.703 [6] training’s l2: 189.58 valid_1’s l2: 280.991 [7] training’s l2: 188.82 valid_1’s l2: 279.919 [8] training’s l2: 188.067 valid_1’s l2: 278.835 [9] training’s l2: 187.282 valid_1’s l2: 278.151 [10] training’s l2: 186.505 valid_1’s l2: 277.474 [11] training’s l2: 185.732 valid_1’s l2: 276.417 [12] training’s l2: 184.986 valid_1’s l2: 277.021 [13] training’s l2: 184.243 valid_1’s l2: 277.628 [14] training’s l2: 183.518 valid_1’s l2: 278.172 [15] training’s l2: 182.812 valid_1’s l2: 277.093 [16] training’s l2: 182.079 valid_1’s l2: 276.084 [17] training’s l2: 181.354 valid_1’s l2: 275.085 [18] training’s l2: 180.637 valid_1’s l2: 274.094 [19] training’s l2: 179.91 valid_1’s l2: 272.942 [20] training’s l2: 179.205 valid_1’s l2: 271.968 [21] training’s l2: 178.496 valid_1’s l2: 271.67 [22] training’s l2: 177.846 valid_1’s l2: 270.901 [23] training’s l2: 177.151 valid_1’s l2: 270.617 [24] training’s l2: 176.409 valid_1’s l2: 270.023 [25] training’s l2: 175.726 valid_1’s l2: 269.75 [26] training’s l2: 175.091 valid_1’s l2: 268.806 [27] training’s l2: 174.48 valid_1’s l2: 267.878 [28] training’s l2: 173.811 valid_1’s l2: 266.845 [29] training’s l2: 173.17 valid_1’s l2: 266.611 [30] training’s l2: 172.537 valid_1’s l2: 266.383 [31] training’s l2: 171.922 valid_1’s l2: 267.006 [32] training’s l2: 171.261 valid_1’s l2: 266.787 [33] training’s l2: 170.607 valid_1’s l2: 266.574 [34] training’s l2: 169.96 valid_1’s l2: 266.367 [35] training’s l2: 169.333 valid_1’s l2: 266.913 [36] training’s l2: 168.705 valid_1’s l2: 266.423 [37] training’s l2: 168.084 valid_1’s l2: 265.939 [38] training’s l2: 167.445 valid_1’s 
l2: 266.445 [39] training’s l2: 166.834 valid_1’s l2: 265.969 [40] training’s l2: 166.232 valid_1’s l2: 265.483 [41] training’s l2: 165.627 valid_1’s l2: 265.23 [42] training’s l2: 165.031 valid_1’s l2: 265.713 [43] training’s l2: 164.476 valid_1’s l2: 264.817 [44] training’s l2: 163.888 valid_1’s l2: 264.576 [45] training’s l2: 163.308 valid_1’s l2: 265.057 [46] training’s l2: 162.736 valid_1’s l2: 264.428 [47] training’s l2: 162.132 valid_1’s l2: 263.63 [48] training’s l2: 161.57 valid_1’s l2: 263 [49] training’s l2: 161.015 valid_1’s l2: 262.377 [50] training’s l2: 160.465 valid_1’s l2: 261.76 Did not meet early stopping. Best iteration is: [50] training’s l2: 160.465 valid_1’s l2: 261.76 [1] training’s l2: 197.617 valid_1’s l2: 264.133 Training until validation scores don’t improve for 95 rounds [2] training’s l2: 196.824 valid_1’s l2: 263.054 [3] training’s l2: 195.994 valid_1’s l2: 261.969 [4] training’s l2: 195.17 valid_1’s l2: 260.922 [5] training’s l2: 194.398 valid_1’s l2: 259.873 [6] training’s l2: 193.557 valid_1’s l2: 258.938 [7] training’s l2: 192.729 valid_1’s l2: 258.016 [8] training’s l2: 191.893 valid_1’s l2: 257.089 [9] training’s l2: 191.077 valid_1’s l2: 256.181 [10] training’s l2: 190.275 valid_1’s l2: 255.28 [11] training’s l2: 189.483 valid_1’s l2: 254.394 [12] training’s l2: 188.709 valid_1’s l2: 253.386 [13] training’s l2: 187.944 valid_1’s l2: 252.449 [14] training’s l2: 187.186 valid_1’s l2: 251.521 [15] training’s l2: 186.438 valid_1’s l2: 250.607 [16] training’s l2: 185.669 valid_1’s l2: 249.589 [17] training’s l2: 184.944 valid_1’s l2: 248.652 [18] training’s l2: 184.189 valid_1’s l2: 247.654 [19] training’s l2: 183.445 valid_1’s l2: 246.74 [20] training’s l2: 182.738 valid_1’s l2: 245.806 [21] training’s l2: 182.019 valid_1’s l2: 244.928 [22] training’s l2: 181.315 valid_1’s l2: 244.104 [23] training’s l2: 180.607 valid_1’s l2: 243.236 [24] training’s l2: 179.942 valid_1’s l2: 242.465 [25] training’s l2: 179.246 valid_1’s l2: 241.611 
[26] training’s l2: 178.542 valid_1’s l2: 240.708 [27] training’s l2: 177.852 valid_1’s l2: 239.788 [28] training’s l2: 177.179 valid_1’s l2: 238.962 [29] training’s l2: 176.494 valid_1’s l2: 238.083 [30] training’s l2: 175.807 valid_1’s l2: 237.213 [31] training’s l2: 175.186 valid_1’s l2: 236.452 [32] training’s l2: 174.513 valid_1’s l2: 235.719 [33] training’s l2: 173.847 valid_1’s l2: 234.993 [34] training’s l2: 173.19 valid_1’s l2: 234.317 [35] training’s l2: 172.54 valid_1’s l2: 233.604 [36] training’s l2: 171.866 valid_1’s l2: 232.853 [37] training’s l2: 171.199 valid_1’s l2: 232.109 [38] training’s l2: 170.58 valid_1’s l2: 231.423 [39] training’s l2: 169.954 valid_1’s l2: 230.648 [40] training’s l2: 169.334 valid_1’s l2: 229.868 [41] training’s l2: 168.705 valid_1’s l2: 229.042 [42] training’s l2: 168.014 valid_1’s l2: 228.095 [43] training’s l2: 167.396 valid_1’s l2: 227.277 [44] training’s l2: 166.786 valid_1’s l2: 226.471 [45] training’s l2: 166.114 valid_1’s l2: 225.523 [46] training’s l2: 165.535 valid_1’s l2: 224.763 [47] training’s l2: 164.952 valid_1’s l2: 224.065 [48] training’s l2: 164.384 valid_1’s l2: 223.319 [49] training’s l2: 163.822 valid_1’s l2: 222.569 [50] training’s l2: 163.272 valid_1’s l2: 221.882 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 163.272 valid_1’s l2: 221.882 [1] training’s l2: 204.481 valid_1’s l2: 198.34 Training until validation scores don’t improve for 95 rounds [2] training’s l2: 203.591 valid_1’s l2: 198.34 [3] training’s l2: 202.701 valid_1’s l2: 197.909 [4] training’s l2: 201.82 valid_1’s l2: 197.496 [5] training’s l2: 200.952 valid_1’s l2: 197.468 [6] training’s l2: 200.046 valid_1’s l2: 197.114 [7] training’s l2: 199.158 valid_1’s l2: 196.54 [8] training’s l2: 198.268 valid_1’s l2: 195.978 [9] training’s l2: 197.383 valid_1’s l2: 195.472 [10] training’s l2: 196.502 valid_1’s l2: 195.041 [11] training’s l2: 195.679 valid_1’s l2: 194.322 [12] training’s l2: 194.866 valid_1’s l2: 193.87 [13] training’s l2: 194.063 valid_1’s l2: 193.426 [14] training’s l2: 193.267 valid_1’s l2: 192.988 [15] training’s l2: 192.487 valid_1’s l2: 192.53 [16] training’s l2: 191.67 valid_1’s l2: 191.981 [17] training’s l2: 190.863 valid_1’s l2: 191.449 [18] training’s l2: 190.069 valid_1’s l2: 190.779 [19] training’s l2: 189.281 valid_1’s l2: 190.122 [20] training’s l2: 188.492 valid_1’s l2: 189.574 [21] training’s l2: 187.719 valid_1’s l2: 188.927 [22] training’s l2: 186.948 valid_1’s l2: 188.292 [23] training’s l2: 186.188 valid_1’s l2: 187.655 [24] training’s l2: 185.453 valid_1’s l2: 187.647 [25] training’s l2: 184.704 valid_1’s l2: 187.035 [26] training’s l2: 183.935 valid_1’s l2: 186.467 [27] training’s l2: 183.196 valid_1’s l2: 186.299 [28] training’s l2: 182.491 valid_1’s l2: 185.653 [29] training’s l2: 181.732 valid_1’s l2: 185.238 [30] training’s l2: 180.986 valid_1’s l2: 184.709 [31] training’s l2: 180.298 valid_1’s l2: 184.74 [32] training’s l2: 179.584 valid_1’s l2: 184.428 [33] training’s l2: 178.857 valid_1’s l2: 184.367 [34] training’s l2: 178.167 valid_1’s l2: 184.027 [35] training’s l2: 177.456 valid_1’s l2: 183.917 [36] training’s l2: 176.757 valid_1’s l2: 184.187 [37] training’s l2: 176.064 valid_1’s l2: 184.459 [38] training’s l2: 175.371 
valid_1’s l2: 183.918 [39] training’s l2: 174.69 valid_1’s l2: 184.192 [40] training’s l2: 174.016 valid_1’s l2: 184.395 [41] training’s l2: 173.353 valid_1’s l2: 184.1 [42] training’s l2: 172.685 valid_1’s l2: 183.764 [43] training’s l2: 172.04 valid_1’s l2: 183.219 [44] training’s l2: 171.399 valid_1’s l2: 182.935 [45] training’s l2: 170.74 valid_1’s l2: 182.599 [46] training’s l2: 170.106 valid_1’s l2: 182.392 [47] training’s l2: 169.455 valid_1’s l2: 181.879 [48] training’s l2: 168.833 valid_1’s l2: 181.35 [49] training’s l2: 168.207 valid_1’s l2: 180.81 [50] training’s l2: 167.598 valid_1’s l2: 180.307 Did not meet early stopping. Best iteration is: [50] training’s l2: 167.598 valid_1’s l2: 180.307 [1] training’s l2: 193.21 valid_1’s l2: 285.277 Training until validation scores don’t improve for 95 rounds [2] training’s l2: 192.378 valid_1’s l2: 284.126 [3] training’s l2: 191.579 valid_1’s l2: 283.015 [4] training’s l2: 190.788 valid_1’s l2: 281.955 [5] training’s l2: 189.976 valid_1’s l2: 280.831 [6] training’s l2: 189.197 valid_1’s l2: 279.758 [7] training’s l2: 188.431 valid_1’s l2: 278.668 [8] training’s l2: 187.666 valid_1’s l2: 277.612 [9] training’s l2: 186.91 valid_1’s l2: 276.566 [10] training’s l2: 186.161 valid_1’s l2: 275.529 [11] training’s l2: 185.366 valid_1’s l2: 274.421 [12] training’s l2: 184.559 valid_1’s l2: 273.235 [13] training’s l2: 183.76 valid_1’s l2: 272.053 [14] training’s l2: 182.966 valid_1’s l2: 270.865 [15] training’s l2: 182.182 valid_1’s l2: 269.727 [16] training’s l2: 181.442 valid_1’s l2: 268.621 [17] training’s l2: 180.71 valid_1’s l2: 267.525 [18] training’s l2: 179.986 valid_1’s l2: 266.441 [19] training’s l2: 179.239 valid_1’s l2: 265.355 [20] training’s l2: 178.528 valid_1’s l2: 264.289 [21] training’s l2: 177.848 valid_1’s l2: 263.182 [22] training’s l2: 177.15 valid_1’s l2: 262.205 [23] training’s l2: 176.46 valid_1’s l2: 261.239 [24] training’s l2: 175.79 valid_1’s l2: 260.292 [25] training’s l2: 175.11 valid_1’s l2: 
259.197 [26] training’s l2: 174.416 valid_1’s l2: 258.177 [27] training’s l2: 173.755 valid_1’s l2: 257.174 [28] training’s l2: 173.092 valid_1’s l2: 256.191 [29] training’s l2: 172.413 valid_1’s l2: 255.198 [30] training’s l2: 171.742 valid_1’s l2: 254.214 [31] training’s l2: 171.08 valid_1’s l2: 253.284 [32] training’s l2: 170.46 valid_1’s l2: 252.373 [33] training’s l2: 169.845 valid_1’s l2: 251.479 [34] training’s l2: 169.238 valid_1’s l2: 250.585 [35] training’s l2: 168.636 valid_1’s l2: 249.696 [36] training’s l2: 168.029 valid_1’s l2: 248.878 [37] training’s l2: 167.429 valid_1’s l2: 248.067 [38] training’s l2: 166.835 valid_1’s l2: 247.202 [39] training’s l2: 166.246 valid_1’s l2: 246.514 [40] training’s l2: 165.656 valid_1’s l2: 245.832 [41] training’s l2: 165.022 valid_1’s l2: 244.976 [42] training’s l2: 164.396 valid_1’s l2: 244.098 [43] training’s l2: 163.775 valid_1’s l2: 243.258 [44] training’s l2: 163.216 valid_1’s l2: 242.419 [45] training’s l2: 162.651 valid_1’s l2: 241.665 [46] training’s l2: 162.062 valid_1’s l2: 240.807 [47] training’s l2: 161.485 valid_1’s l2: 240.036 [48] training’s l2: 160.903 valid_1’s l2: 239.226 [49] training’s l2: 160.328 valid_1’s l2: 238.423 [50] training’s l2: 159.758 valid_1’s l2: 237.628 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 159.758 valid_1’s l2: 237.628 [1] training’s l2: 194.307 valid_1’s l2: 279.622 Training until validation scores don’t improve for 95 rounds [2] training’s l2: 193.445 valid_1’s l2: 279.663 [3] training’s l2: 192.594 valid_1’s l2: 278.589 [4] training’s l2: 191.711 valid_1’s l2: 278.55 [5] training’s l2: 190.871 valid_1’s l2: 278.603 [6] training’s l2: 190.025 valid_1’s l2: 279.488 [7] training’s l2: 189.234 valid_1’s l2: 278.579 [8] training’s l2: 188.455 valid_1’s l2: 277.645 [9] training’s l2: 187.63 valid_1’s l2: 278.526 [10] training’s l2: 186.815 valid_1’s l2: 279.383 [11] training’s l2: 185.988 valid_1’s l2: 278.639 [12] training’s l2: 185.183 valid_1’s l2: 277.974 [13] training’s l2: 184.384 valid_1’s l2: 277.316 [14] training’s l2: 183.593 valid_1’s l2: 276.683 [15] training’s l2: 182.86 valid_1’s l2: 275.718 [16] training’s l2: 182.069 valid_1’s l2: 274.74 [17] training’s l2: 181.286 valid_1’s l2: 273.772 [18] training’s l2: 180.509 valid_1’s l2: 272.839 [19] training’s l2: 179.751 valid_1’s l2: 271.816 [20] training’s l2: 178.988 valid_1’s l2: 270.9 [21] training’s l2: 178.213 valid_1’s l2: 270.528 [22] training’s l2: 177.488 valid_1’s l2: 269.642 [23] training’s l2: 176.725 valid_1’s l2: 269.265 [24] training’s l2: 176.006 valid_1’s l2: 270.131 [25] training’s l2: 175.259 valid_1’s l2: 269.781 [26] training’s l2: 174.525 valid_1’s l2: 268.812 [27] training’s l2: 173.814 valid_1’s l2: 267.91 [28] training’s l2: 173.088 valid_1’s l2: 266.877 [29] training’s l2: 172.34 valid_1’s l2: 266.481 [30] training’s l2: 171.599 valid_1’s l2: 266.01 [31] training’s l2: 170.933 valid_1’s l2: 266.011 [32] training’s l2: 170.226 valid_1’s l2: 265.964 [33] training’s l2: 169.528 valid_1’s l2: 266.016 [34] training’s l2: 168.841 valid_1’s l2: 266.016 [35] training’s l2: 168.155 valid_1’s l2: 265.984 [36] training’s l2: 167.483 valid_1’s l2: 265.37 [37] training’s l2: 166.824 valid_1’s l2: 264.803 [38] training’s l2: 166.156 valid_1’s 
l2: 264.272 [39] training’s l2: 165.508 valid_1’s l2: 263.719 [40] training’s l2: 164.864 valid_1’s l2: 263.463 [41] training’s l2: 164.229 valid_1’s l2: 263.367 [42] training’s l2: 163.601 valid_1’s l2: 264.253 [43] training’s l2: 163.013 valid_1’s l2: 263.47 [44] training’s l2: 162.394 valid_1’s l2: 263.122 [45] training’s l2: 161.783 valid_1’s l2: 264.005 [46] training’s l2: 161.181 valid_1’s l2: 263.596 [47] training’s l2: 160.55 valid_1’s l2: 264.048 [48] training’s l2: 159.959 valid_1’s l2: 263.621 [49] training’s l2: 159.385 valid_1’s l2: 263.515 [50] training’s l2: 158.805 valid_1’s l2: 263.099 Did not meet early stopping. Best iteration is: [50] training’s l2: 158.805 valid_1’s l2: 263.099

| early stopping round | train mse | eval mse | test mse |
|---:|---:|---:|---:|
| 5  | 0.416999 | 1.187050 | 0.725257 |
| 10 | 0.413938 | 1.192593 | 0.716515 |
| 15 | 0.413938 | 1.192593 | 0.716515 |
| 20 | 0.413938 | 1.192593 | 0.716515 |
| 25 | 0.413938 | 1.192593 | 0.716515 |
| 30 | 0.413938 | 1.192593 | 0.716515 |
| 35 | 0.413938 | 1.192593 | 0.716515 |
| 40 | 0.413938 | 1.192593 | 0.716515 |
| 45 | 0.413938 | 1.192593 | 0.716515 |
| 50 | 0.413938 | 1.192593 | 0.716515 |
| 55 | 0.413938 | 1.192593 | 0.716515 |
| 60 | 0.413938 | 1.192593 | 0.716515 |
| 65 | 0.413938 | 1.192593 | 0.716515 |
| 70 | 0.413938 | 1.192593 | 0.716515 |
| 75 | 0.413938 | 1.192593 | 0.716515 |
| 80 | 0.413938 | 1.192593 | 0.716515 |
| 85 | 0.413938 | 1.192593 | 0.716515 |
| 90 | 0.413938 | 1.192593 | 0.716515 |
| 95 | 0.413938 | 1.192593 | 0.716515 |

Since every setting from 10 upward yields identical train/eval/test MSE, choose early_stopping_rounds = 10.

6.0.5 Learning rate

(Training logs truncated. With a much smaller learning rate the l2 barely moves: on the first fold, training l2 drops only from 194.414 to 190.468 and valid_1 l2 from 284.26 to 281.186 over the full 50 rounds; on the second fold, training l2 drops from 198.369 to 194.297 and valid_1 l2 from 265.118 to 260.069. Neither run triggers early stopping.)
Best iteration is: [50] training’s l2: 194.297 valid_1’s l2: 260.069 [1] training’s l2: 205.298 valid_1’s l2: 198.724 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 205.208 valid_1’s l2: 198.723 [3] training’s l2: 205.117 valid_1’s l2: 198.679 [4] training’s l2: 205.027 valid_1’s l2: 198.636 [5] training’s l2: 204.937 valid_1’s l2: 198.633 [6] training’s l2: 204.843 valid_1’s l2: 198.596 [7] training’s l2: 204.75 valid_1’s l2: 198.535 [8] training’s l2: 204.656 valid_1’s l2: 198.475 [9] training’s l2: 204.562 valid_1’s l2: 198.414 [10] training’s l2: 204.468 valid_1’s l2: 198.377 [11] training’s l2: 204.38 valid_1’s l2: 198.3 [12] training’s l2: 204.292 valid_1’s l2: 198.249 [13] training’s l2: 204.204 valid_1’s l2: 198.199 [14] training’s l2: 204.116 valid_1’s l2: 198.148 [15] training’s l2: 204.029 valid_1’s l2: 198.095 [16] training’s l2: 203.938 valid_1’s l2: 198.033 [17] training’s l2: 203.847 valid_1’s l2: 197.971 [18] training’s l2: 203.757 valid_1’s l2: 197.892 [19] training’s l2: 203.667 valid_1’s l2: 197.817 [20] training’s l2: 203.577 valid_1’s l2: 197.755 [21] training’s l2: 203.487 valid_1’s l2: 197.68 [22] training’s l2: 203.397 valid_1’s l2: 197.605 [23] training’s l2: 203.307 valid_1’s l2: 197.529 [24] training’s l2: 203.221 valid_1’s l2: 197.521 [25] training’s l2: 203.131 valid_1’s l2: 197.445 [26] training’s l2: 203.044 valid_1’s l2: 197.368 [27] training’s l2: 202.956 valid_1’s l2: 197.315 [28] training’s l2: 202.871 valid_1’s l2: 197.236 [29] training’s l2: 202.782 valid_1’s l2: 197.162 [30] training’s l2: 202.693 valid_1’s l2: 197.088 [31] training’s l2: 202.608 valid_1’s l2: 197.082 [32] training’s l2: 202.518 valid_1’s l2: 197.044 [33] training’s l2: 202.428 valid_1’s l2: 197.006 [34] training’s l2: 202.339 valid_1’s l2: 196.968 [35] training’s l2: 202.249 valid_1’s l2: 196.934 [36] training’s l2: 202.158 valid_1’s l2: 196.878 [37] training’s l2: 202.066 valid_1’s l2: 196.822 [38] training’s l2: 201.977 
valid_1’s l2: 196.752 [39] training’s l2: 201.886 valid_1’s l2: 196.696 [40] training’s l2: 201.795 valid_1’s l2: 196.641 [41] training’s l2: 201.704 valid_1’s l2: 196.59 [42] training’s l2: 201.614 valid_1’s l2: 196.536 [43] training’s l2: 201.524 valid_1’s l2: 196.459 [44] training’s l2: 201.437 valid_1’s l2: 196.414 [45] training’s l2: 201.347 valid_1’s l2: 196.36 [46] training’s l2: 201.259 valid_1’s l2: 196.324 [47] training’s l2: 201.171 valid_1’s l2: 196.258 [48] training’s l2: 201.084 valid_1’s l2: 196.186 [49] training’s l2: 200.996 valid_1’s l2: 196.111 [50] training’s l2: 200.909 valid_1’s l2: 196.041 Did not meet early stopping. Best iteration is: [50] training’s l2: 200.909 valid_1’s l2: 196.041 [1] training’s l2: 193.949 valid_1’s l2: 286.276 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 193.864 valid_1’s l2: 286.16 [3] training’s l2: 193.783 valid_1’s l2: 286.047 [4] training’s l2: 193.702 valid_1’s l2: 285.939 [5] training’s l2: 193.618 valid_1’s l2: 285.823 [6] training’s l2: 193.537 valid_1’s l2: 285.712 [7] training’s l2: 193.457 valid_1’s l2: 285.598 [8] training’s l2: 193.376 valid_1’s l2: 285.487 [9] training’s l2: 193.295 valid_1’s l2: 285.376 [10] training’s l2: 193.215 valid_1’s l2: 285.265 [11] training’s l2: 193.129 valid_1’s l2: 285.147 [12] training’s l2: 193.042 valid_1’s l2: 285.019 [13] training’s l2: 192.954 valid_1’s l2: 284.89 [14] training’s l2: 192.867 valid_1’s l2: 284.762 [15] training’s l2: 192.779 valid_1’s l2: 284.633 [16] training’s l2: 192.697 valid_1’s l2: 284.511 [17] training’s l2: 192.614 valid_1’s l2: 284.388 [18] training’s l2: 192.532 valid_1’s l2: 284.265 [19] training’s l2: 192.447 valid_1’s l2: 284.141 [20] training’s l2: 192.364 valid_1’s l2: 284.018 [21] training’s l2: 192.283 valid_1’s l2: 283.901 [22] training’s l2: 192.201 valid_1’s l2: 283.785 [23] training’s l2: 192.119 valid_1’s l2: 283.671 [24] training’s l2: 192.04 valid_1’s l2: 283.555 [25] training’s l2: 191.959 
valid_1’s l2: 283.44 [26] training’s l2: 191.876 valid_1’s l2: 283.317 [27] training’s l2: 191.796 valid_1’s l2: 283.196 [28] training’s l2: 191.715 valid_1’s l2: 283.077 [29] training’s l2: 191.632 valid_1’s l2: 282.955 [30] training’s l2: 191.548 valid_1’s l2: 282.833 [31] training’s l2: 191.479 valid_1’s l2: 282.726 [32] training’s l2: 191.401 valid_1’s l2: 282.613 [33] training’s l2: 191.323 valid_1’s l2: 282.501 [34] training’s l2: 191.246 valid_1’s l2: 282.387 [35] training’s l2: 191.168 valid_1’s l2: 282.274 [36] training’s l2: 191.09 valid_1’s l2: 282.168 [37] training’s l2: 191.012 valid_1’s l2: 282.063 [38] training’s l2: 190.931 valid_1’s l2: 281.951 [39] training’s l2: 190.854 valid_1’s l2: 281.84 [40] training’s l2: 190.776 valid_1’s l2: 281.728 [41] training’s l2: 190.691 valid_1’s l2: 281.614 [42] training’s l2: 190.606 valid_1’s l2: 281.5 [43] training’s l2: 190.522 valid_1’s l2: 281.387 [44] training’s l2: 190.445 valid_1’s l2: 281.277 [45] training’s l2: 190.365 valid_1’s l2: 281.191 [46] training’s l2: 190.284 valid_1’s l2: 281.074 [47] training’s l2: 190.205 valid_1’s l2: 280.968 [48] training’s l2: 190.124 valid_1’s l2: 280.855 [49] training’s l2: 190.043 valid_1’s l2: 280.743 [50] training’s l2: 189.962 valid_1’s l2: 280.63 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 189.962 valid_1’s l2: 280.63 [1] training’s l2: 195.095 valid_1’s l2: 280.188 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 195.008 valid_1’s l2: 280.191 [3] training’s l2: 194.924 valid_1’s l2: 280.079 [4] training’s l2: 194.834 valid_1’s l2: 280.07 [5] training’s l2: 194.747 valid_1’s l2: 280.073 [6] training’s l2: 194.66 valid_1’s l2: 280.163 [7] training’s l2: 194.577 valid_1’s l2: 280.069 [8] training’s l2: 194.495 valid_1’s l2: 279.972 [9] training’s l2: 194.407 valid_1’s l2: 280.062 [10] training’s l2: 194.32 valid_1’s l2: 280.149 [11] training’s l2: 194.231 valid_1’s l2: 280.07 [12] training’s l2: 194.144 valid_1’s l2: 280 [13] training’s l2: 194.056 valid_1’s l2: 279.93 [14] training’s l2: 193.969 valid_1’s l2: 279.859 [15] training’s l2: 193.885 valid_1’s l2: 279.75 [16] training’s l2: 193.796 valid_1’s l2: 279.642 [17] training’s l2: 193.708 valid_1’s l2: 279.535 [18] training’s l2: 193.62 valid_1’s l2: 279.428 [19] training’s l2: 193.531 valid_1’s l2: 279.309 [20] training’s l2: 193.443 valid_1’s l2: 279.202 [21] training’s l2: 193.353 valid_1’s l2: 279.154 [22] training’s l2: 193.268 valid_1’s l2: 279.052 [23] training’s l2: 193.177 valid_1’s l2: 279.002 [24] training’s l2: 193.095 valid_1’s l2: 279.004 [25] training’s l2: 193.005 valid_1’s l2: 278.956 [26] training’s l2: 192.921 valid_1’s l2: 278.84 [27] training’s l2: 192.835 valid_1’s l2: 278.733 [28] training’s l2: 192.749 valid_1’s l2: 278.615 [29] training’s l2: 192.656 valid_1’s l2: 278.603 [30] training’s l2: 192.564 valid_1’s l2: 278.591 [31] training’s l2: 192.48 valid_1’s l2: 278.574 [32] training’s l2: 192.391 valid_1’s l2: 278.557 [33] training’s l2: 192.302 valid_1’s l2: 278.551 [34] training’s l2: 192.213 valid_1’s l2: 278.534 [35] training’s l2: 192.124 valid_1’s l2: 278.517 [36] training’s l2: 192.037 valid_1’s l2: 278.439 [37] training’s l2: 191.95 valid_1’s l2: 278.394 [38] training’s l2: 191.863 valid_1’s l2: 
278.313 [39] training’s l2: 191.775 valid_1’s l2: 278.267 [40] training’s l2: 191.688 valid_1’s l2: 278.222 [41] training’s l2: 191.603 valid_1’s l2: 278.2 [42] training’s l2: 191.518 valid_1’s l2: 278.131 [43] training’s l2: 191.438 valid_1’s l2: 278.03 [44] training’s l2: 191.354 valid_1’s l2: 278.008 [45] training’s l2: 191.272 valid_1’s l2: 277.939 [46] training’s l2: 191.186 valid_1’s l2: 277.873 [47] training’s l2: 191.103 valid_1’s l2: 277.82 [48] training’s l2: 191.017 valid_1’s l2: 277.753 [49] training’s l2: 190.932 valid_1’s l2: 277.687 [50] training’s l2: 190.846 valid_1’s l2: 277.621 Did not meet early stopping. Best iteration is: [50] training’s l2: 190.846 valid_1’s l2: 277.621 [1] training’s l2: 194.326 valid_1’s l2: 284.216 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 194.166 valid_1’s l2: 284.157 [3] training’s l2: 193.999 valid_1’s l2: 283.916 [4] training’s l2: 193.824 valid_1’s l2: 283.829 [5] training’s l2: 193.665 valid_1’s l2: 283.771 [6] training’s l2: 193.498 valid_1’s l2: 283.623 [7] training’s l2: 193.34 valid_1’s l2: 283.4 [8] training’s l2: 193.181 valid_1’s l2: 283.174 [9] training’s l2: 193.015 valid_1’s l2: 283.028 [10] training’s l2: 192.849 valid_1’s l2: 282.881 [11] training’s l2: 192.687 valid_1’s l2: 282.781 [12] training’s l2: 192.526 valid_1’s l2: 282.904 [13] training’s l2: 192.365 valid_1’s l2: 283.028 [14] training’s l2: 192.206 valid_1’s l2: 283.139 [15] training’s l2: 192.049 valid_1’s l2: 282.905 [16] training’s l2: 191.893 valid_1’s l2: 282.684 [17] training’s l2: 191.736 valid_1’s l2: 282.464 [18] training’s l2: 191.58 valid_1’s l2: 282.244 [19] training’s l2: 191.418 valid_1’s l2: 281.992 [20] training’s l2: 191.262 valid_1’s l2: 281.773 [21] training’s l2: 191.1 valid_1’s l2: 281.696 [22] training’s l2: 190.95 valid_1’s l2: 281.52 [23] training’s l2: 190.788 valid_1’s l2: 281.443 [24] training’s l2: 190.621 valid_1’s l2: 281.302 [25] training’s l2: 190.46 valid_1’s l2: 281.226 [26] 
training’s l2: 190.31 valid_1’s l2: 281.01 [27] training’s l2: 190.164 valid_1’s l2: 280.795 [28] training’s l2: 190.006 valid_1’s l2: 280.566 [29] training’s l2: 189.852 valid_1’s l2: 280.496 [30] training’s l2: 189.697 valid_1’s l2: 280.426 [31] training’s l2: 189.548 valid_1’s l2: 280.555 [32] training’s l2: 189.389 valid_1’s l2: 280.49 [33] training’s l2: 189.23 valid_1’s l2: 280.426 [34] training’s l2: 189.072 valid_1’s l2: 280.362 [35] training’s l2: 188.915 valid_1’s l2: 280.295 [36] training’s l2: 188.763 valid_1’s l2: 280.417 [37] training’s l2: 188.612 valid_1’s l2: 280.54 [38] training’s l2: 188.452 valid_1’s l2: 280.395 [39] training’s l2: 188.301 valid_1’s l2: 280.518 [40] training’s l2: 188.15 valid_1’s l2: 280.635 [41] training’s l2: 187.993 valid_1’s l2: 280.554 [42] training’s l2: 187.83 valid_1’s l2: 280.486 [43] training’s l2: 187.684 valid_1’s l2: 280.253 [44] training’s l2: 187.527 valid_1’s l2: 280.174 [45] training’s l2: 187.366 valid_1’s l2: 280.108 [46] training’s l2: 187.216 valid_1’s l2: 280.037 [47] training’s l2: 187.059 valid_1’s l2: 279.834 [48] training’s l2: 186.91 valid_1’s l2: 279.761 [49] training’s l2: 186.761 valid_1’s l2: 279.685 [50] training’s l2: 186.613 valid_1’s l2: 279.612 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 186.613 valid_1’s l2: 279.612 [1] training’s l2: 198.285 valid_1’s l2: 265.008 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 198.125 valid_1’s l2: 264.791 [3] training’s l2: 197.957 valid_1’s l2: 264.57 [4] training’s l2: 197.788 valid_1’s l2: 264.355 [5] training’s l2: 197.629 valid_1’s l2: 264.139 [6] training’s l2: 197.455 valid_1’s l2: 263.945 [7] training’s l2: 197.283 valid_1’s l2: 263.753 [8] training’s l2: 197.108 valid_1’s l2: 263.559 [9] training’s l2: 196.935 valid_1’s l2: 263.366 [10] training’s l2: 196.764 valid_1’s l2: 263.174 [11] training’s l2: 196.595 valid_1’s l2: 262.968 [12] training’s l2: 196.424 valid_1’s l2: 262.77 [13] training’s l2: 196.258 valid_1’s l2: 262.567 [14] training’s l2: 196.092 valid_1’s l2: 262.365 [15] training’s l2: 195.927 valid_1’s l2: 262.163 [16] training’s l2: 195.765 valid_1’s l2: 261.954 [17] training’s l2: 195.604 valid_1’s l2: 261.745 [18] training’s l2: 195.443 valid_1’s l2: 261.537 [19] training’s l2: 195.276 valid_1’s l2: 261.331 [20] training’s l2: 195.115 valid_1’s l2: 261.124 [21] training’s l2: 194.952 valid_1’s l2: 260.924 [22] training’s l2: 194.791 valid_1’s l2: 260.735 [23] training’s l2: 194.627 valid_1’s l2: 260.534 [24] training’s l2: 194.47 valid_1’s l2: 260.349 [25] training’s l2: 194.307 valid_1’s l2: 260.149 [26] training’s l2: 194.141 valid_1’s l2: 259.934 [27] training’s l2: 193.978 valid_1’s l2: 259.718 [28] training’s l2: 193.818 valid_1’s l2: 259.52 [29] training’s l2: 193.653 valid_1’s l2: 259.306 [30] training’s l2: 193.486 valid_1’s l2: 259.096 [31] training’s l2: 193.335 valid_1’s l2: 258.911 [32] training’s l2: 193.172 valid_1’s l2: 258.728 [33] training’s l2: 193.008 valid_1’s l2: 258.548 [34] training’s l2: 192.845 valid_1’s l2: 258.369 [35] training’s l2: 192.684 valid_1’s l2: 258.19 [36] training’s l2: 192.522 valid_1’s l2: 257.988 [37] training’s l2: 192.361 valid_1’s l2: 257.785 [38] training’s l2: 192.205 
valid_1’s l2: 257.608 [39] training’s l2: 192.045 valid_1’s l2: 257.41 [40] training’s l2: 191.885 valid_1’s l2: 257.204 [41] training’s l2: 191.724 valid_1’s l2: 256.993 [42] training’s l2: 191.561 valid_1’s l2: 256.771 [43] training’s l2: 191.4 valid_1’s l2: 256.558 [44] training’s l2: 191.238 valid_1’s l2: 256.345 [45] training’s l2: 191.077 valid_1’s l2: 256.125 [46] training’s l2: 190.924 valid_1’s l2: 255.93 [47] training’s l2: 190.768 valid_1’s l2: 255.74 [48] training’s l2: 190.616 valid_1’s l2: 255.545 [49] training’s l2: 190.464 valid_1’s l2: 255.347 [50] training’s l2: 190.313 valid_1’s l2: 255.153 Did not meet early stopping. Best iteration is: [50] training’s l2: 190.313 valid_1’s l2: 255.153 [1] training’s l2: 205.207 valid_1’s l2: 198.681 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 205.027 valid_1’s l2: 198.68 [3] training’s l2: 204.846 valid_1’s l2: 198.591 [4] training’s l2: 204.666 valid_1’s l2: 198.506 [5] training’s l2: 204.487 valid_1’s l2: 198.501 [6] training’s l2: 204.3 valid_1’s l2: 198.426 [7] training’s l2: 204.115 valid_1’s l2: 198.306 [8] training’s l2: 203.927 valid_1’s l2: 198.186 [9] training’s l2: 203.741 valid_1’s l2: 198.067 [10] training’s l2: 203.555 valid_1’s l2: 197.992 [11] training’s l2: 203.38 valid_1’s l2: 197.839 [12] training’s l2: 203.205 valid_1’s l2: 197.739 [13] training’s l2: 203.031 valid_1’s l2: 197.639 [14] training’s l2: 202.857 valid_1’s l2: 197.54 [15] training’s l2: 202.685 valid_1’s l2: 197.435 [16] training’s l2: 202.506 valid_1’s l2: 197.318 [17] training’s l2: 202.327 valid_1’s l2: 197.203 [18] training’s l2: 202.149 valid_1’s l2: 197.052 [19] training’s l2: 201.973 valid_1’s l2: 196.905 [20] training’s l2: 201.794 valid_1’s l2: 196.784 [21] training’s l2: 201.618 valid_1’s l2: 196.636 [22] training’s l2: 201.441 valid_1’s l2: 196.487 [23] training’s l2: 201.265 valid_1’s l2: 196.339 [24] training’s l2: 201.095 valid_1’s l2: 196.324 [25] training’s l2: 200.92 valid_1’s 
l2: 196.176 [26] training’s l2: 200.748 valid_1’s l2: 196.024 [27] training’s l2: 200.576 valid_1’s l2: 195.921 [28] training’s l2: 200.409 valid_1’s l2: 195.768 [29] training’s l2: 200.235 valid_1’s l2: 195.623 [30] training’s l2: 200.061 valid_1’s l2: 195.479 [31] training’s l2: 199.895 valid_1’s l2: 195.469 [32] training’s l2: 199.72 valid_1’s l2: 195.396 [33] training’s l2: 199.546 valid_1’s l2: 195.324 [34] training’s l2: 199.372 valid_1’s l2: 195.252 [35] training’s l2: 199.198 valid_1’s l2: 195.187 [36] training’s l2: 199.023 valid_1’s l2: 195.142 [37] training’s l2: 198.849 valid_1’s l2: 195.097 [38] training’s l2: 198.676 valid_1’s l2: 194.96 [39] training’s l2: 198.498 valid_1’s l2: 194.854 [40] training’s l2: 198.322 valid_1’s l2: 194.749 [41] training’s l2: 198.147 valid_1’s l2: 194.65 [42] training’s l2: 197.971 valid_1’s l2: 194.548 [43] training’s l2: 197.802 valid_1’s l2: 194.406 [44] training’s l2: 197.634 valid_1’s l2: 194.321 [45] training’s l2: 197.46 valid_1’s l2: 194.219 [46] training’s l2: 197.291 valid_1’s l2: 194.151 [47] training’s l2: 197.12 valid_1’s l2: 194.025 [48] training’s l2: 196.952 valid_1’s l2: 193.886 [49] training’s l2: 196.783 valid_1’s l2: 193.741 [50] training’s l2: 196.616 valid_1’s l2: 193.606 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 196.616 valid_1’s l2: 193.606 [1] training’s l2: 193.866 valid_1’s l2: 286.165 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 193.698 valid_1’s l2: 285.933 [3] training’s l2: 193.536 valid_1’s l2: 285.708 [4] training’s l2: 193.374 valid_1’s l2: 285.491 [5] training’s l2: 193.207 valid_1’s l2: 285.26 [6] training’s l2: 193.046 valid_1’s l2: 285.039 [7] training’s l2: 192.886 valid_1’s l2: 284.813 [8] training’s l2: 192.725 valid_1’s l2: 284.592 [9] training’s l2: 192.565 valid_1’s l2: 284.371 [10] training’s l2: 192.405 valid_1’s l2: 284.151 [11] training’s l2: 192.235 valid_1’s l2: 283.916 [12] training’s l2: 192.062 valid_1’s l2: 283.662 [13] training’s l2: 191.889 valid_1’s l2: 283.407 [14] training’s l2: 191.716 valid_1’s l2: 283.153 [15] training’s l2: 191.543 valid_1’s l2: 282.899 [16] training’s l2: 191.38 valid_1’s l2: 282.656 [17] training’s l2: 191.217 valid_1’s l2: 282.414 [18] training’s l2: 191.055 valid_1’s l2: 282.172 [19] training’s l2: 190.887 valid_1’s l2: 281.926 [20] training’s l2: 190.725 valid_1’s l2: 281.685 [21] training’s l2: 190.566 valid_1’s l2: 281.505 [22] training’s l2: 190.405 valid_1’s l2: 281.282 [23] training’s l2: 190.245 valid_1’s l2: 281.058 [24] training’s l2: 190.09 valid_1’s l2: 280.831 [25] training’s l2: 189.93 valid_1’s l2: 280.604 [26] training’s l2: 189.767 valid_1’s l2: 280.365 [27] training’s l2: 189.61 valid_1’s l2: 280.126 [28] training’s l2: 189.452 valid_1’s l2: 279.893 [29] training’s l2: 189.289 valid_1’s l2: 279.655 [30] training’s l2: 189.127 valid_1’s l2: 279.417 [31] training’s l2: 188.981 valid_1’s l2: 279.18 [32] training’s l2: 188.829 valid_1’s l2: 278.959 [33] training’s l2: 188.678 valid_1’s l2: 278.74 [34] training’s l2: 188.527 valid_1’s l2: 278.52 [35] training’s l2: 188.377 valid_1’s l2: 278.299 [36] training’s l2: 188.225 valid_1’s l2: 278.094 [37] training’s l2: 188.073 valid_1’s l2: 277.889 [38] training’s l2: 187.916 
valid_1’s l2: 277.672 [39] training’s l2: 187.767 valid_1’s l2: 277.457 [40] training’s l2: 187.615 valid_1’s l2: 277.239 [41] training’s l2: 187.451 valid_1’s l2: 277.019 [42] training’s l2: 187.288 valid_1’s l2: 276.799 [43] training’s l2: 187.124 valid_1’s l2: 276.579 [44] training’s l2: 186.976 valid_1’s l2: 276.366 [45] training’s l2: 186.823 valid_1’s l2: 276.202 [46] training’s l2: 186.667 valid_1’s l2: 275.975 [47] training’s l2: 186.514 valid_1’s l2: 275.771 [48] training’s l2: 186.357 valid_1’s l2: 275.553 [49] training’s l2: 186.201 valid_1’s l2: 275.336 [50] training’s l2: 186.046 valid_1’s l2: 275.12 Did not meet early stopping. Best iteration is: [50] training’s l2: 186.046 valid_1’s l2: 275.12 [1] training’s l2: 195.007 valid_1’s l2: 280.125 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 194.833 valid_1’s l2: 280.132 [3] training’s l2: 194.66 valid_1’s l2: 279.914 [4] training’s l2: 194.486 valid_1’s l2: 279.789 [5] training’s l2: 194.313 valid_1’s l2: 279.797 [6] training’s l2: 194.138 valid_1’s l2: 279.976 [7] training’s l2: 193.973 valid_1’s l2: 279.789 [8] training’s l2: 193.81 valid_1’s l2: 279.595 [9] training’s l2: 193.636 valid_1’s l2: 279.775 [10] training’s l2: 193.463 valid_1’s l2: 279.949 [11] training’s l2: 193.287 valid_1’s l2: 279.793 [12] training’s l2: 193.114 valid_1’s l2: 279.653 [13] training’s l2: 192.94 valid_1’s l2: 279.514 [14] training’s l2: 192.767 valid_1’s l2: 279.375 [15] training’s l2: 192.601 valid_1’s l2: 279.158 [16] training’s l2: 192.427 valid_1’s l2: 278.946 [17] training’s l2: 192.252 valid_1’s l2: 278.734 [18] training’s l2: 192.078 valid_1’s l2: 278.523 [19] training’s l2: 191.903 valid_1’s l2: 278.288 [20] training’s l2: 191.73 valid_1’s l2: 278.077 [21] training’s l2: 191.553 valid_1’s l2: 277.983 [22] training’s l2: 191.385 valid_1’s l2: 277.782 [23] training’s l2: 191.208 valid_1’s l2: 277.685 [24] training’s l2: 191.046 valid_1’s l2: 277.691 [25] training’s l2: 190.87 
valid_1’s l2: 277.599 [26] training’s l2: 190.706 valid_1’s l2: 277.371 [27] training’s l2: 190.536 valid_1’s l2: 277.161 [28] training’s l2: 190.368 valid_1’s l2: 276.929 [29] training’s l2: 190.187 valid_1’s l2: 276.907 [30] training’s l2: 190.007 valid_1’s l2: 276.886 [31] training’s l2: 189.843 valid_1’s l2: 276.856 [32] training’s l2: 189.67 valid_1’s l2: 276.825 [33] training’s l2: 189.497 valid_1’s l2: 276.816 [34] training’s l2: 189.324 valid_1’s l2: 276.786 [35] training’s l2: 189.151 valid_1’s l2: 276.756 [36] training’s l2: 188.982 valid_1’s l2: 276.605 [37] training’s l2: 188.813 valid_1’s l2: 276.455 [38] training’s l2: 188.644 valid_1’s l2: 276.299 [39] training’s l2: 188.476 valid_1’s l2: 276.15 [40] training’s l2: 188.308 valid_1’s l2: 276.064 [41] training’s l2: 188.143 valid_1’s l2: 276.023 [42] training’s l2: 187.979 valid_1’s l2: 275.891 [43] training’s l2: 187.823 valid_1’s l2: 275.695 [44] training’s l2: 187.662 valid_1’s l2: 275.654 [45] training’s l2: 187.503 valid_1’s l2: 275.523 [46] training’s l2: 187.338 valid_1’s l2: 275.396 [47] training’s l2: 187.175 valid_1’s l2: 275.22 [48] training’s l2: 187.011 valid_1’s l2: 275.095 [49] training’s l2: 186.846 valid_1’s l2: 274.97 [50] training’s l2: 186.682 valid_1’s l2: 274.845 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 186.682 valid_1’s l2: 274.845 [1] training’s l2: 192.754 valid_1’s l2: 283.432 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 191.188 valid_1’s l2: 282.874 [3] training’s l2: 189.574 valid_1’s l2: 280.536 [4] training’s l2: 187.919 valid_1’s l2: 279.751 [5] training’s l2: 186.436 valid_1’s l2: 279.268 [6] training’s l2: 184.89 valid_1’s l2: 277.909 [7] training’s l2: 183.449 valid_1’s l2: 275.858 [8] training’s l2: 182.036 valid_1’s l2: 273.804 [9] training’s l2: 180.575 valid_1’s l2: 272.55 [10] training’s l2: 179.163 valid_1’s l2: 271.331 [11] training’s l2: 177.735 valid_1’s l2: 269.405 [12] training’s l2: 176.375 valid_1’s l2: 270.587 [13] training’s l2: 175.033 valid_1’s l2: 271.769 [14] training’s l2: 173.738 valid_1’s l2: 272.828 [15] training’s l2: 172.449 valid_1’s l2: 270.808 [16] training’s l2: 171.146 valid_1’s l2: 268.958 [17] training’s l2: 169.868 valid_1’s l2: 267.141 [18] training’s l2: 168.617 valid_1’s l2: 265.357 [19] training’s l2: 167.384 valid_1’s l2: 263.504 [20] training’s l2: 166.176 valid_1’s l2: 261.776 [21] training’s l2: 164.981 valid_1’s l2: 261.533 [22] training’s l2: 163.868 valid_1’s l2: 260.367 [23] training’s l2: 162.719 valid_1’s l2: 260.183 [24] training’s l2: 161.461 valid_1’s l2: 259.235 [25] training’s l2: 160.218 valid_1’s l2: 257.786 [26] training’s l2: 159.216 valid_1’s l2: 256.288 [27] training’s l2: 158.27 valid_1’s l2: 254.915 [28] training’s l2: 157.174 valid_1’s l2: 253.169 [29] training’s l2: 156.16 valid_1’s l2: 254.265 [30] training’s l2: 155.181 valid_1’s l2: 255.356 [31] training’s l2: 154.077 valid_1’s l2: 254.013 [32] training’s l2: 153.05 valid_1’s l2: 254.344 [33] training’s l2: 152.038 valid_1’s l2: 255.487 [34] training’s l2: 151.04 valid_1’s l2: 255.263 [35] training’s l2: 150.067 valid_1’s l2: 255.574 [36] training’s l2: 149.127 valid_1’s l2: 254.926 [37] training’s l2: 148.187 valid_1’s l2: 254.225 [38] training’s l2: 147.208 
valid_1’s l2: 255.215 Early stopping, best iteration is: [28] training’s l2: 157.174 valid_1’s l2: 253.169 [1] training’s l2: 196.785 valid_1’s l2: 263.043 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 195.216 valid_1’s l2: 260.911 [3] training’s l2: 193.59 valid_1’s l2: 258.786 [4] training’s l2: 191.991 valid_1’s l2: 256.757 [5] training’s l2: 190.35 valid_1’s l2: 254.591 [6] training’s l2: 188.735 valid_1’s l2: 252.797 [7] training’s l2: 187.16 valid_1’s l2: 251.046 [8] training’s l2: 185.586 valid_1’s l2: 249.304 [9] training’s l2: 184.065 valid_1’s l2: 247.612 [10] training’s l2: 182.555 valid_1’s l2: 245.522 [11] training’s l2: 181.095 valid_1’s l2: 243.895 [12] training’s l2: 179.678 valid_1’s l2: 242.132 [13] training’s l2: 178.319 valid_1’s l2: 240.541 [14] training’s l2: 176.964 valid_1’s l2: 238.876 [15] training’s l2: 175.667 valid_1’s l2: 237.349 [16] training’s l2: 174.296 valid_1’s l2: 235.588 [17] training’s l2: 172.958 valid_1’s l2: 233.822 [18] training’s l2: 171.646 valid_1’s l2: 232.088 [19] training’s l2: 170.375 valid_1’s l2: 230.507 [20] training’s l2: 169.125 valid_1’s l2: 228.863 [21] training’s l2: 167.9 valid_1’s l2: 227.368 [22] training’s l2: 166.709 valid_1’s l2: 225.974 [23] training’s l2: 165.526 valid_1’s l2: 224.525 [24] training’s l2: 164.451 valid_1’s l2: 223.079 [25] training’s l2: 163.308 valid_1’s l2: 221.68 [26] training’s l2: 162.17 valid_1’s l2: 220.253 [27] training’s l2: 161.046 valid_1’s l2: 218.752 [28] training’s l2: 159.954 valid_1’s l2: 217.366 [29] training’s l2: 158.862 valid_1’s l2: 215.953 [30] training’s l2: 157.784 valid_1’s l2: 214.568 [31] training’s l2: 156.818 valid_1’s l2: 213.856 [32] training’s l2: 155.761 valid_1’s l2: 212.83 [33] training’s l2: 154.749 valid_1’s l2: 211.613 [34] training’s l2: 153.745 valid_1’s l2: 210.418 [35] training’s l2: 152.748 valid_1’s l2: 209.37 [36] training’s l2: 151.714 valid_1’s l2: 208.228 [37] training’s l2: 150.701 valid_1’s l2: 207.109 
[38] training’s l2: 149.769 valid_1’s l2: 206.084 [39] training’s l2: 148.765 valid_1’s l2: 204.719 [40] training’s l2: 147.793 valid_1’s l2: 203.399 [41] training’s l2: 146.773 valid_1’s l2: 202.177 [42] training’s l2: 145.76 valid_1’s l2: 200.753 [43] training’s l2: 144.781 valid_1’s l2: 199.458 [44] training’s l2: 143.916 valid_1’s l2: 198.327 [45] training’s l2: 142.959 valid_1’s l2: 196.933 [46] training’s l2: 142.155 valid_1’s l2: 195.851 [47] training’s l2: 141.342 valid_1’s l2: 194.89 [48] training’s l2: 140.532 valid_1’s l2: 193.997 [49] training’s l2: 139.752 valid_1’s l2: 193.223 [50] training’s l2: 138.966 valid_1’s l2: 192.373 Did not meet early stopping. Best iteration is: [50] training’s l2: 138.966 valid_1’s l2: 192.373 [1] training’s l2: 203.579 valid_1’s l2: 197.918 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 201.818 valid_1’s l2: 197.934 [3] training’s l2: 200.076 valid_1’s l2: 197.092 [4] training’s l2: 198.322 valid_1’s l2: 195.46 [5] training’s l2: 196.648 valid_1’s l2: 195.451 [6] training’s l2: 194.902 valid_1’s l2: 194.796 [7] training’s l2: 193.181 valid_1’s l2: 193.8 [8] training’s l2: 191.5 valid_1’s l2: 192.746 [9] training’s l2: 189.852 valid_1’s l2: 191.839 [10] training’s l2: 188.234 valid_1’s l2: 191.094 [11] training’s l2: 186.714 valid_1’s l2: 189.765 [12] training’s l2: 185.228 valid_1’s l2: 188.973 [13] training’s l2: 183.773 valid_1’s l2: 188.207 [14] training’s l2: 182.355 valid_1’s l2: 187.474 [15] training’s l2: 180.87 valid_1’s l2: 186.844 [16] training’s l2: 179.427 valid_1’s l2: 185.933 [17] training’s l2: 178.024 valid_1’s l2: 184.751 [18] training’s l2: 176.648 valid_1’s l2: 183.554 [19] training’s l2: 175.216 valid_1’s l2: 182.394 [20] training’s l2: 173.856 valid_1’s l2: 182.126 [21] training’s l2: 172.541 valid_1’s l2: 181.045 [22] training’s l2: 171.254 valid_1’s l2: 179.969 [23] training’s l2: 169.988 valid_1’s l2: 178.928 [24] training’s l2: 168.76 valid_1’s l2: 179.051 [25] 
(Condensed: per-iteration LightGBM logs for 18 training runs. Each run trained for up to 50 rounds with early stopping once the validation l2 failed to improve for 10 consecutive rounds; the first row below is the conclusion of a log whose opening iterations appear above. Best iteration per run:)

run   best_iteration   training l2   valid_1 l2
1     30               161.454       175.156
2     50*              136.623       204.318
3     50*              134.393       266.052
4     10               108.297       210.908
5     22               85.4879       139.784
6     10               108.847       162.794
7     22               86.1811       142.965
8     20               83.8677       237.792
9     8                93.4926       204.666
10    9                92.1601       146.182
11    5                110.112       169.551
12    8                93.3258       149.033
13    3                125.879       220.453
14    3                112.312       199.986
15    5                98.4102       148.176
16    4                102.236       167.861
17    5                95.9211       149.98
18    3                108.488       228.815

* reached the 50-round cap without triggering early stopping.
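The logs above are produced by LightGBM's early-stopping rule: training halts once the validation l2 has not improved for 10 consecutive rounds, and the reported best iteration is the round with the lowest validation score seen so far. The rule itself is simple; below is a minimal pure-Python sketch of it (the function name and signature are mine for illustration, not LightGBM's API):

```python
def best_iteration(valid_scores, patience=10):
    """Return (best_iter, best_score) over a sequence of per-round
    validation scores, scanning until `patience` rounds pass without
    improvement -- mirroring early stopping for a minimizing metric."""
    best_score = float("inf")
    best_iter = 0
    for i, score in enumerate(valid_scores, start=1):
        if score < best_score:
            best_score, best_iter = score, i
        elif i - best_iter >= patience:
            break  # early stop: no improvement for `patience` rounds
    return best_iter, best_score
```

For example, the first run above improved through round 30 and then stalled, so training stopped at round 40 and round 30 was reported as the best iteration.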

learning_rate   train_logloss   eval_logloss   test_logloss
0 0.0005 0.417459 1.168713 0.691839
1 0.0010 0.416468 1.167130 0.670094
2 0.0100 0.414676 1.224881 0.741908
3 0.1000 0.405200 1.291950 0.800208
4 0.2000 0.475938 1.022849 0.502928
5 0.3000 0.549879 1.165234 0.513813

Conclusion: set learning rate = 0.001.
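A table like the one above comes from a plain sweep: retrain once per candidate learning rate and record the resulting loglosses. A sketch of that loop, where `train_and_score` is a hypothetical stand-in for the actual XGBoost training-and-evaluation code:

```python
def sweep(param_name, values, train_and_score):
    """Call `train_and_score(params)` once per candidate value and
    collect the returned metrics into one row (dict) per value."""
    rows = []
    for v in values:
        metrics = train_and_score({param_name: v})  # e.g. trains a model, returns loglosses
        rows.append({param_name: v, **metrics})
    return rows

# Dummy scorer for illustration only -- real code would train XGBoost here.
results = sweep(
    "learning_rate",
    [0.0005, 0.001, 0.01],
    lambda p: {"eval_logloss": abs(p["learning_rate"] - 0.001)},
)
best = min(results, key=lambda r: r["eval_logloss"])
```

The rows can then be loaded into a DataFrame to produce exactly the kind of table shown above.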

6.0.6 Max Depth

(Condensed: per-iteration LightGBM logs for the max_depth runs. None of the five completed runs triggered early stopping within the 50-round cap; a sixth run's log is cut off at iteration 25, where training l2 was 190.46 and valid_1 l2 was 281.226.)

run   best_iteration   training l2   valid_1 l2
1     50               188.287       278.143
2     50               191.755       256.755
3     50               198.351       192.563
4     50               187.743       277.08
5     50               188.447       272.294
[26] training’s l2: 190.31 valid_1’s l2: 281.01 [27] training’s l2: 190.164 valid_1’s l2: 280.795 [28] training’s l2: 190.006 valid_1’s l2: 280.566 [29] training’s l2: 189.852 valid_1’s l2: 280.496 [30] training’s l2: 189.697 valid_1’s l2: 280.426 [31] training’s l2: 189.548 valid_1’s l2: 280.555 [32] training’s l2: 189.389 valid_1’s l2: 280.49 [33] training’s l2: 189.23 valid_1’s l2: 280.426 [34] training’s l2: 189.072 valid_1’s l2: 280.362 [35] training’s l2: 188.915 valid_1’s l2: 280.295 [36] training’s l2: 188.763 valid_1’s l2: 280.417 [37] training’s l2: 188.612 valid_1’s l2: 280.54 [38] training’s l2: 188.452 valid_1’s l2: 280.395 [39] training’s l2: 188.301 valid_1’s l2: 280.518 [40] training’s l2: 188.15 valid_1’s l2: 280.635 [41] training’s l2: 187.993 valid_1’s l2: 280.554 [42] training’s l2: 187.83 valid_1’s l2: 280.486 [43] training’s l2: 187.684 valid_1’s l2: 280.253 [44] training’s l2: 187.527 valid_1’s l2: 280.174 [45] training’s l2: 187.366 valid_1’s l2: 280.108 [46] training’s l2: 187.216 valid_1’s l2: 280.037 [47] training’s l2: 187.059 valid_1’s l2: 279.834 [48] training’s l2: 186.91 valid_1’s l2: 279.761 [49] training’s l2: 186.761 valid_1’s l2: 279.685 [50] training’s l2: 186.613 valid_1’s l2: 279.612 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 186.613 valid_1’s l2: 279.612 [1] training’s l2: 198.285 valid_1’s l2: 265.008 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 198.125 valid_1’s l2: 264.791 [3] training’s l2: 197.957 valid_1’s l2: 264.57 [4] training’s l2: 197.788 valid_1’s l2: 264.355 [5] training’s l2: 197.629 valid_1’s l2: 264.139 [6] training’s l2: 197.455 valid_1’s l2: 263.945 [7] training’s l2: 197.283 valid_1’s l2: 263.753 [8] training’s l2: 197.108 valid_1’s l2: 263.559 [9] training’s l2: 196.935 valid_1’s l2: 263.366 [10] training’s l2: 196.764 valid_1’s l2: 263.174 [11] training’s l2: 196.595 valid_1’s l2: 262.968 [12] training’s l2: 196.424 valid_1’s l2: 262.77 [13] training’s l2: 196.258 valid_1’s l2: 262.567 [14] training’s l2: 196.092 valid_1’s l2: 262.365 [15] training’s l2: 195.927 valid_1’s l2: 262.163 [16] training’s l2: 195.765 valid_1’s l2: 261.954 [17] training’s l2: 195.604 valid_1’s l2: 261.745 [18] training’s l2: 195.443 valid_1’s l2: 261.537 [19] training’s l2: 195.276 valid_1’s l2: 261.331 [20] training’s l2: 195.115 valid_1’s l2: 261.124 [21] training’s l2: 194.952 valid_1’s l2: 260.924 [22] training’s l2: 194.791 valid_1’s l2: 260.735 [23] training’s l2: 194.627 valid_1’s l2: 260.534 [24] training’s l2: 194.47 valid_1’s l2: 260.349 [25] training’s l2: 194.307 valid_1’s l2: 260.149 [26] training’s l2: 194.141 valid_1’s l2: 259.934 [27] training’s l2: 193.978 valid_1’s l2: 259.718 [28] training’s l2: 193.818 valid_1’s l2: 259.52 [29] training’s l2: 193.653 valid_1’s l2: 259.306 [30] training’s l2: 193.486 valid_1’s l2: 259.096 [31] training’s l2: 193.335 valid_1’s l2: 258.911 [32] training’s l2: 193.172 valid_1’s l2: 258.728 [33] training’s l2: 193.008 valid_1’s l2: 258.548 [34] training’s l2: 192.845 valid_1’s l2: 258.369 [35] training’s l2: 192.684 valid_1’s l2: 258.19 [36] training’s l2: 192.522 valid_1’s l2: 257.988 [37] training’s l2: 192.361 valid_1’s l2: 257.785 [38] training’s l2: 192.205 
valid_1’s l2: 257.608 [39] training’s l2: 192.045 valid_1’s l2: 257.41 [40] training’s l2: 191.885 valid_1’s l2: 257.204 [41] training’s l2: 191.724 valid_1’s l2: 256.993 [42] training’s l2: 191.561 valid_1’s l2: 256.771 [43] training’s l2: 191.4 valid_1’s l2: 256.558 [44] training’s l2: 191.238 valid_1’s l2: 256.345 [45] training’s l2: 191.077 valid_1’s l2: 256.125 [46] training’s l2: 190.924 valid_1’s l2: 255.93 [47] training’s l2: 190.768 valid_1’s l2: 255.74 [48] training’s l2: 190.616 valid_1’s l2: 255.545 [49] training’s l2: 190.464 valid_1’s l2: 255.347 [50] training’s l2: 190.313 valid_1’s l2: 255.153 Did not meet early stopping. Best iteration is: [50] training’s l2: 190.313 valid_1’s l2: 255.153 [1] training’s l2: 205.207 valid_1’s l2: 198.681 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 205.027 valid_1’s l2: 198.68 [3] training’s l2: 204.846 valid_1’s l2: 198.591 [4] training’s l2: 204.666 valid_1’s l2: 198.506 [5] training’s l2: 204.487 valid_1’s l2: 198.501 [6] training’s l2: 204.3 valid_1’s l2: 198.426 [7] training’s l2: 204.115 valid_1’s l2: 198.306 [8] training’s l2: 203.927 valid_1’s l2: 198.186 [9] training’s l2: 203.741 valid_1’s l2: 198.067 [10] training’s l2: 203.555 valid_1’s l2: 197.992 [11] training’s l2: 203.38 valid_1’s l2: 197.839 [12] training’s l2: 203.205 valid_1’s l2: 197.739 [13] training’s l2: 203.031 valid_1’s l2: 197.639 [14] training’s l2: 202.857 valid_1’s l2: 197.54 [15] training’s l2: 202.685 valid_1’s l2: 197.435 [16] training’s l2: 202.506 valid_1’s l2: 197.318 [17] training’s l2: 202.327 valid_1’s l2: 197.203 [18] training’s l2: 202.149 valid_1’s l2: 197.052 [19] training’s l2: 201.973 valid_1’s l2: 196.905 [20] training’s l2: 201.794 valid_1’s l2: 196.784 [21] training’s l2: 201.618 valid_1’s l2: 196.636 [22] training’s l2: 201.441 valid_1’s l2: 196.487 [23] training’s l2: 201.265 valid_1’s l2: 196.339 [24] training’s l2: 201.095 valid_1’s l2: 196.324 [25] training’s l2: 200.92 valid_1’s 
l2: 196.176 [26] training’s l2: 200.748 valid_1’s l2: 196.024 [27] training’s l2: 200.576 valid_1’s l2: 195.921 [28] training’s l2: 200.409 valid_1’s l2: 195.768 [29] training’s l2: 200.235 valid_1’s l2: 195.623 [30] training’s l2: 200.061 valid_1’s l2: 195.479 [31] training’s l2: 199.895 valid_1’s l2: 195.469 [32] training’s l2: 199.72 valid_1’s l2: 195.396 [33] training’s l2: 199.546 valid_1’s l2: 195.324 [34] training’s l2: 199.372 valid_1’s l2: 195.252 [35] training’s l2: 199.198 valid_1’s l2: 195.187 [36] training’s l2: 199.023 valid_1’s l2: 195.142 [37] training’s l2: 198.849 valid_1’s l2: 195.097 [38] training’s l2: 198.676 valid_1’s l2: 194.96 [39] training’s l2: 198.498 valid_1’s l2: 194.854 [40] training’s l2: 198.322 valid_1’s l2: 194.749 [41] training’s l2: 198.147 valid_1’s l2: 194.65 [42] training’s l2: 197.971 valid_1’s l2: 194.548 [43] training’s l2: 197.802 valid_1’s l2: 194.406 [44] training’s l2: 197.634 valid_1’s l2: 194.321 [45] training’s l2: 197.46 valid_1’s l2: 194.219 [46] training’s l2: 197.291 valid_1’s l2: 194.151 [47] training’s l2: 197.12 valid_1’s l2: 194.025 [48] training’s l2: 196.952 valid_1’s l2: 193.886 [49] training’s l2: 196.783 valid_1’s l2: 193.741 [50] training’s l2: 196.616 valid_1’s l2: 193.606 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 196.616 valid_1’s l2: 193.606 [1] training’s l2: 193.866 valid_1’s l2: 286.165 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 193.698 valid_1’s l2: 285.933 [3] training’s l2: 193.536 valid_1’s l2: 285.708 [4] training’s l2: 193.374 valid_1’s l2: 285.491 [5] training’s l2: 193.207 valid_1’s l2: 285.26 [6] training’s l2: 193.046 valid_1’s l2: 285.039 [7] training’s l2: 192.886 valid_1’s l2: 284.813 [8] training’s l2: 192.725 valid_1’s l2: 284.592 [9] training’s l2: 192.565 valid_1’s l2: 284.371 [10] training’s l2: 192.405 valid_1’s l2: 284.151 [11] training’s l2: 192.235 valid_1’s l2: 283.916 [12] training’s l2: 192.062 valid_1’s l2: 283.662 [13] training’s l2: 191.889 valid_1’s l2: 283.407 [14] training’s l2: 191.716 valid_1’s l2: 283.153 [15] training’s l2: 191.543 valid_1’s l2: 282.899 [16] training’s l2: 191.38 valid_1’s l2: 282.656 [17] training’s l2: 191.217 valid_1’s l2: 282.414 [18] training’s l2: 191.055 valid_1’s l2: 282.172 [19] training’s l2: 190.887 valid_1’s l2: 281.926 [20] training’s l2: 190.725 valid_1’s l2: 281.685 [21] training’s l2: 190.566 valid_1’s l2: 281.505 [22] training’s l2: 190.405 valid_1’s l2: 281.282 [23] training’s l2: 190.245 valid_1’s l2: 281.058 [24] training’s l2: 190.09 valid_1’s l2: 280.831 [25] training’s l2: 189.93 valid_1’s l2: 280.604 [26] training’s l2: 189.767 valid_1’s l2: 280.365 [27] training’s l2: 189.61 valid_1’s l2: 280.126 [28] training’s l2: 189.452 valid_1’s l2: 279.893 [29] training’s l2: 189.289 valid_1’s l2: 279.655 [30] training’s l2: 189.127 valid_1’s l2: 279.417 [31] training’s l2: 188.981 valid_1’s l2: 279.18 [32] training’s l2: 188.829 valid_1’s l2: 278.959 [33] training’s l2: 188.678 valid_1’s l2: 278.74 [34] training’s l2: 188.527 valid_1’s l2: 278.52 [35] training’s l2: 188.377 valid_1’s l2: 278.299 [36] training’s l2: 188.225 valid_1’s l2: 278.094 [37] training’s l2: 188.073 valid_1’s l2: 277.889 [38] training’s l2: 187.916 
valid_1’s l2: 277.672 [39] training’s l2: 187.767 valid_1’s l2: 277.457 [40] training’s l2: 187.615 valid_1’s l2: 277.239 [41] training’s l2: 187.451 valid_1’s l2: 277.019 [42] training’s l2: 187.288 valid_1’s l2: 276.799 [43] training’s l2: 187.124 valid_1’s l2: 276.579 [44] training’s l2: 186.976 valid_1’s l2: 276.366 [45] training’s l2: 186.823 valid_1’s l2: 276.202 [46] training’s l2: 186.667 valid_1’s l2: 275.975 [47] training’s l2: 186.514 valid_1’s l2: 275.771 [48] training’s l2: 186.357 valid_1’s l2: 275.553 [49] training’s l2: 186.201 valid_1’s l2: 275.336 [50] training’s l2: 186.046 valid_1’s l2: 275.12 Did not meet early stopping. Best iteration is: [50] training’s l2: 186.046 valid_1’s l2: 275.12 [1] training’s l2: 195.007 valid_1’s l2: 280.125 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 194.833 valid_1’s l2: 280.132 [3] training’s l2: 194.66 valid_1’s l2: 279.914 [4] training’s l2: 194.486 valid_1’s l2: 279.789 [5] training’s l2: 194.313 valid_1’s l2: 279.797 [6] training’s l2: 194.138 valid_1’s l2: 279.976 [7] training’s l2: 193.973 valid_1’s l2: 279.789 [8] training’s l2: 193.81 valid_1’s l2: 279.595 [9] training’s l2: 193.636 valid_1’s l2: 279.775 [10] training’s l2: 193.463 valid_1’s l2: 279.949 [11] training’s l2: 193.287 valid_1’s l2: 279.793 [12] training’s l2: 193.114 valid_1’s l2: 279.653 [13] training’s l2: 192.94 valid_1’s l2: 279.514 [14] training’s l2: 192.767 valid_1’s l2: 279.375 [15] training’s l2: 192.601 valid_1’s l2: 279.158 [16] training’s l2: 192.427 valid_1’s l2: 278.946 [17] training’s l2: 192.252 valid_1’s l2: 278.734 [18] training’s l2: 192.078 valid_1’s l2: 278.523 [19] training’s l2: 191.903 valid_1’s l2: 278.288 [20] training’s l2: 191.73 valid_1’s l2: 278.077 [21] training’s l2: 191.553 valid_1’s l2: 277.983 [22] training’s l2: 191.385 valid_1’s l2: 277.782 [23] training’s l2: 191.208 valid_1’s l2: 277.685 [24] training’s l2: 191.046 valid_1’s l2: 277.691 [25] training’s l2: 190.87 
valid_1’s l2: 277.599 [26] training’s l2: 190.706 valid_1’s l2: 277.371 [27] training’s l2: 190.536 valid_1’s l2: 277.161 [28] training’s l2: 190.368 valid_1’s l2: 276.929 [29] training’s l2: 190.187 valid_1’s l2: 276.907 [30] training’s l2: 190.007 valid_1’s l2: 276.886 [31] training’s l2: 189.843 valid_1’s l2: 276.856 [32] training’s l2: 189.67 valid_1’s l2: 276.825 [33] training’s l2: 189.497 valid_1’s l2: 276.816 [34] training’s l2: 189.324 valid_1’s l2: 276.786 [35] training’s l2: 189.151 valid_1’s l2: 276.756 [36] training’s l2: 188.982 valid_1’s l2: 276.605 [37] training’s l2: 188.813 valid_1’s l2: 276.455 [38] training’s l2: 188.644 valid_1’s l2: 276.299 [39] training’s l2: 188.476 valid_1’s l2: 276.15 [40] training’s l2: 188.308 valid_1’s l2: 276.064 [41] training’s l2: 188.143 valid_1’s l2: 276.023 [42] training’s l2: 187.979 valid_1’s l2: 275.891 [43] training’s l2: 187.823 valid_1’s l2: 275.695 [44] training’s l2: 187.662 valid_1’s l2: 275.654 [45] training’s l2: 187.503 valid_1’s l2: 275.523 [46] training’s l2: 187.338 valid_1’s l2: 275.396 [47] training’s l2: 187.175 valid_1’s l2: 275.22 [48] training’s l2: 187.011 valid_1’s l2: 275.095 [49] training’s l2: 186.846 valid_1’s l2: 274.97 [50] training’s l2: 186.682 valid_1’s l2: 274.845 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 186.682 valid_1’s l2: 274.845 [1] training’s l2: 194.294 valid_1’s l2: 284.226 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 194.1 valid_1’s l2: 284.167 [3] training’s l2: 193.9 valid_1’s l2: 283.915 [4] training’s l2: 193.694 valid_1’s l2: 283.841 [5] training’s l2: 193.502 valid_1’s l2: 283.783 [6] training’s l2: 193.3 valid_1’s l2: 283.694 [7] training’s l2: 193.108 valid_1’s l2: 283.459 [8] training’s l2: 192.915 valid_1’s l2: 283.211 [9] training’s l2: 192.711 valid_1’s l2: 283.122 [10] training’s l2: 192.507 valid_1’s l2: 283.033 [11] training’s l2: 192.312 valid_1’s l2: 282.933 [12] training’s l2: 192.111 valid_1’s l2: 283.044 [13] training’s l2: 191.908 valid_1’s l2: 283.159 [14] training’s l2: 191.702 valid_1’s l2: 283.258 [15] training’s l2: 191.503 valid_1’s l2: 283.046 [16] training’s l2: 191.303 valid_1’s l2: 282.782 [17] training’s l2: 191.106 valid_1’s l2: 282.531 [18] training’s l2: 190.909 valid_1’s l2: 282.281 [19] training’s l2: 190.718 valid_1’s l2: 282.057 [20] training’s l2: 190.521 valid_1’s l2: 281.791 [21] training’s l2: 190.324 valid_1’s l2: 281.709 [22] training’s l2: 190.138 valid_1’s l2: 281.538 [23] training’s l2: 189.942 valid_1’s l2: 281.458 [24] training’s l2: 189.741 valid_1’s l2: 281.327 [25] training’s l2: 189.543 valid_1’s l2: 281.243 [26] training’s l2: 189.356 valid_1’s l2: 281.003 [27] training’s l2: 189.174 valid_1’s l2: 280.749 [28] training’s l2: 188.992 valid_1’s l2: 280.538 [29] training’s l2: 188.805 valid_1’s l2: 280.467 [30] training’s l2: 188.616 valid_1’s l2: 280.4 [31] training’s l2: 188.429 valid_1’s l2: 280.339 [32] training’s l2: 188.237 valid_1’s l2: 280.251 [33] training’s l2: 188.045 valid_1’s l2: 280.159 [34] training’s l2: 187.854 valid_1’s l2: 280.071 [35] training’s l2: 187.663 valid_1’s l2: 279.982 [36] training’s l2: 187.471 valid_1’s l2: 280.102 [37] training’s l2: 187.279 valid_1’s l2: 280.237 [38] training’s l2: 187.088 
valid_1’s l2: 280.19 [39] training’s l2: 186.897 valid_1’s l2: 280.31 [40] training’s l2: 186.711 valid_1’s l2: 280.448 [41] training’s l2: 186.522 valid_1’s l2: 280.363 [42] training’s l2: 186.328 valid_1’s l2: 280.28 [43] training’s l2: 186.149 valid_1’s l2: 280.05 [44] training’s l2: 185.962 valid_1’s l2: 279.97 [45] training’s l2: 185.769 valid_1’s l2: 279.882 [46] training’s l2: 185.582 valid_1’s l2: 279.808 [47] training’s l2: 185.394 valid_1’s l2: 279.685 [48] training’s l2: 185.208 valid_1’s l2: 279.598 [49] training’s l2: 185.024 valid_1’s l2: 279.521 [50] training’s l2: 184.84 valid_1’s l2: 279.445 Did not meet early stopping. Best iteration is: [50] training’s l2: 184.84 valid_1’s l2: 279.445 [1] training’s l2: 198.249 valid_1’s l2: 265.046 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 198.054 valid_1’s l2: 264.834 [3] training’s l2: 197.849 valid_1’s l2: 264.587 [4] training’s l2: 197.642 valid_1’s l2: 264.415 [5] training’s l2: 197.448 valid_1’s l2: 264.204 [6] training’s l2: 197.239 valid_1’s l2: 263.99 [7] training’s l2: 197.027 valid_1’s l2: 263.776 [8] training’s l2: 196.812 valid_1’s l2: 263.56 [9] training’s l2: 196.603 valid_1’s l2: 263.355 [10] training’s l2: 196.392 valid_1’s l2: 263.135 [11] training’s l2: 196.194 valid_1’s l2: 262.905 [12] training’s l2: 195.984 valid_1’s l2: 262.703 [13] training’s l2: 195.783 valid_1’s l2: 262.469 [14] training’s l2: 195.582 valid_1’s l2: 262.256 [15] training’s l2: 195.381 valid_1’s l2: 262.021 [16] training’s l2: 195.181 valid_1’s l2: 261.779 [17] training’s l2: 194.983 valid_1’s l2: 261.537 [18] training’s l2: 194.785 valid_1’s l2: 261.298 [19] training’s l2: 194.589 valid_1’s l2: 261.102 [20] training’s l2: 194.391 valid_1’s l2: 260.861 [21] training’s l2: 194.188 valid_1’s l2: 260.62 [22] training’s l2: 193.984 valid_1’s l2: 260.401 [23] training’s l2: 193.777 valid_1’s l2: 260.16 [24] training’s l2: 193.58 valid_1’s l2: 259.943 [25] training’s l2: 193.373 valid_1’s 
l2: 259.701 [26] training’s l2: 193.171 valid_1’s l2: 259.471 [27] training’s l2: 192.973 valid_1’s l2: 259.247 [28] training’s l2: 192.782 valid_1’s l2: 259.021 [29] training’s l2: 192.582 valid_1’s l2: 258.802 [30] training’s l2: 192.379 valid_1’s l2: 258.614 [31] training’s l2: 192.193 valid_1’s l2: 258.458 [32] training’s l2: 191.997 valid_1’s l2: 258.29 [33] training’s l2: 191.796 valid_1’s l2: 258.097 [34] training’s l2: 191.598 valid_1’s l2: 257.911 [35] training’s l2: 191.391 valid_1’s l2: 257.713 [36] training’s l2: 191.193 valid_1’s l2: 257.484 [37] training’s l2: 190.996 valid_1’s l2: 257.255 [38] training’s l2: 190.802 valid_1’s l2: 257.069 [39] training’s l2: 190.608 valid_1’s l2: 256.864 [40] training’s l2: 190.411 valid_1’s l2: 256.671 [41] training’s l2: 190.211 valid_1’s l2: 256.446 [42] training’s l2: 190.011 valid_1’s l2: 256.204 [43] training’s l2: 189.814 valid_1’s l2: 255.967 [44] training’s l2: 189.62 valid_1’s l2: 255.779 [45] training’s l2: 189.423 valid_1’s l2: 255.54 [46] training’s l2: 189.233 valid_1’s l2: 255.332 [47] training’s l2: 189.046 valid_1’s l2: 255.142 [48] training’s l2: 188.857 valid_1’s l2: 254.925 [49] training’s l2: 188.669 valid_1’s l2: 254.698 [50] training’s l2: 188.479 valid_1’s l2: 254.484 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 188.479 valid_1’s l2: 254.484 [1] training’s l2: 205.17 valid_1’s l2: 198.675 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 204.958 valid_1’s l2: 198.667 [3] training’s l2: 204.741 valid_1’s l2: 198.571 [4] training’s l2: 204.525 valid_1’s l2: 198.479 [5] training’s l2: 204.312 valid_1’s l2: 198.467 [6] training’s l2: 204.082 valid_1’s l2: 198.38 [7] training’s l2: 203.856 valid_1’s l2: 198.262 [8] training’s l2: 203.627 valid_1’s l2: 198.138 [9] training’s l2: 203.402 valid_1’s l2: 198.004 [10] training’s l2: 203.173 valid_1’s l2: 197.914 [11] training’s l2: 202.96 valid_1’s l2: 197.822 [12] training’s l2: 202.747 valid_1’s l2: 197.733 [13] training’s l2: 202.532 valid_1’s l2: 197.642 [14] training’s l2: 202.32 valid_1’s l2: 197.544 [15] training’s l2: 202.108 valid_1’s l2: 197.413 [16] training’s l2: 201.892 valid_1’s l2: 197.348 [17] training’s l2: 201.678 valid_1’s l2: 197.22 [18] training’s l2: 201.463 valid_1’s l2: 197.068 [19] training’s l2: 201.256 valid_1’s l2: 196.973 [20] training’s l2: 201.042 valid_1’s l2: 196.9 [21] training’s l2: 200.827 valid_1’s l2: 196.816 [22] training’s l2: 200.619 valid_1’s l2: 196.67 [23] training’s l2: 200.405 valid_1’s l2: 196.581 [24] training’s l2: 200.208 valid_1’s l2: 196.579 [25] training’s l2: 199.996 valid_1’s l2: 196.494 [26] training’s l2: 199.781 valid_1’s l2: 196.393 [27] training’s l2: 199.572 valid_1’s l2: 196.304 [28] training’s l2: 199.369 valid_1’s l2: 196.235 [29] training’s l2: 199.155 valid_1’s l2: 196.145 [30] training’s l2: 198.942 valid_1’s l2: 196.059 [31] training’s l2: 198.748 valid_1’s l2: 196.043 [32] training’s l2: 198.539 valid_1’s l2: 195.971 [33] training’s l2: 198.331 valid_1’s l2: 195.894 [34] training’s l2: 198.123 valid_1’s l2: 195.818 [35] training’s l2: 197.915 valid_1’s l2: 195.755 [36] training’s l2: 197.705 valid_1’s l2: 195.69 [37] training’s l2: 197.493 valid_1’s l2: 195.633 [38] training’s l2: 197.286 valid_1’s 
l2: 195.516 [39] training’s l2: 197.074 valid_1’s l2: 195.455 [40] training’s l2: 196.866 valid_1’s l2: 195.388 [41] training’s l2: 196.658 valid_1’s l2: 195.317 [42] training’s l2: 196.449 valid_1’s l2: 195.25 [43] training’s l2: 196.244 valid_1’s l2: 195.129 [44] training’s l2: 196.042 valid_1’s l2: 195.082 [45] training’s l2: 195.835 valid_1’s l2: 195.018 [46] training’s l2: 195.63 valid_1’s l2: 194.941 [47] training’s l2: 195.431 valid_1’s l2: 194.885 [48] training’s l2: 195.229 valid_1’s l2: 194.794 [49] training’s l2: 195.026 valid_1’s l2: 194.691 [50] training’s l2: 194.824 valid_1’s l2: 194.603 Did not meet early stopping. Best iteration is: [50] training’s l2: 194.824 valid_1’s l2: 194.603 [1] training’s l2: 193.826 valid_1’s l2: 286.136 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 193.627 valid_1’s l2: 285.89 [3] training’s l2: 193.425 valid_1’s l2: 285.642 [4] training’s l2: 193.218 valid_1’s l2: 285.383 [5] training’s l2: 193.02 valid_1’s l2: 285.137 [6] training’s l2: 192.816 valid_1’s l2: 284.87 [7] training’s l2: 192.614 valid_1’s l2: 284.599 [8] training’s l2: 192.412 valid_1’s l2: 284.34 [9] training’s l2: 192.208 valid_1’s l2: 284.075 [10] training’s l2: 192.003 valid_1’s l2: 283.812 [11] training’s l2: 191.808 valid_1’s l2: 283.573 [12] training’s l2: 191.599 valid_1’s l2: 283.327 [13] training’s l2: 191.393 valid_1’s l2: 283.079 [14] training’s l2: 191.189 valid_1’s l2: 282.828 [15] training’s l2: 190.986 valid_1’s l2: 282.568 [16] training’s l2: 190.786 valid_1’s l2: 282.293 [17] training’s l2: 190.586 valid_1’s l2: 282.019 [18] training’s l2: 190.387 valid_1’s l2: 281.743 [19] training’s l2: 190.193 valid_1’s l2: 281.483 [20] training’s l2: 189.996 valid_1’s l2: 281.208 [21] training’s l2: 189.798 valid_1’s l2: 280.957 [22] training’s l2: 189.6 valid_1’s l2: 280.697 [23] training’s l2: 189.397 valid_1’s l2: 280.446 [24] training’s l2: 189.207 valid_1’s l2: 280.185 [25] training’s l2: 189.008 valid_1’s l2: 
279.928 [26] training’s l2: 188.81 valid_1’s l2: 279.672 [27] training’s l2: 188.621 valid_1’s l2: 279.388 [28] training’s l2: 188.432 valid_1’s l2: 279.153 [29] training’s l2: 188.236 valid_1’s l2: 278.894 [30] training’s l2: 188.039 valid_1’s l2: 278.64 [31] training’s l2: 187.861 valid_1’s l2: 278.417 [32] training’s l2: 187.673 valid_1’s l2: 278.167 [33] training’s l2: 187.487 valid_1’s l2: 277.924 [34] training’s l2: 187.299 valid_1’s l2: 277.676 [35] training’s l2: 187.11 valid_1’s l2: 277.419 [36] training’s l2: 186.923 valid_1’s l2: 277.188 [37] training’s l2: 186.737 valid_1’s l2: 276.951 [38] training’s l2: 186.547 valid_1’s l2: 276.732 [39] training’s l2: 186.364 valid_1’s l2: 276.478 [40] training’s l2: 186.176 valid_1’s l2: 276.222 [41] training’s l2: 185.98 valid_1’s l2: 275.972 [42] training’s l2: 185.786 valid_1’s l2: 275.723 [43] training’s l2: 185.592 valid_1’s l2: 275.473 [44] training’s l2: 185.409 valid_1’s l2: 275.232 [45] training’s l2: 185.217 valid_1’s l2: 275.075 [46] training’s l2: 185.032 valid_1’s l2: 274.835 [47] training’s l2: 184.847 valid_1’s l2: 274.614 [48] training’s l2: 184.662 valid_1’s l2: 274.371 [49] training’s l2: 184.476 valid_1’s l2: 274.127 [50] training’s l2: 184.291 valid_1’s l2: 273.885 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 184.291 valid_1’s l2: 273.885 [1] training’s l2: 194.973 valid_1’s l2: 280.1 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 194.768 valid_1’s l2: 280.106 [3] training’s l2: 194.568 valid_1’s l2: 279.886 [4] training’s l2: 194.356 valid_1’s l2: 279.867 [5] training’s l2: 194.151 valid_1’s l2: 279.862 [6] training’s l2: 193.941 valid_1’s l2: 280.029 [7] training’s l2: 193.741 valid_1’s l2: 279.827 [8] training’s l2: 193.539 valid_1’s l2: 279.618 [9] training’s l2: 193.322 valid_1’s l2: 279.781 [10] training’s l2: 193.108 valid_1’s l2: 279.932 [11] training’s l2: 192.9 valid_1’s l2: 279.798 [12] training’s l2: 192.688 valid_1’s l2: 279.625 [13] training’s l2: 192.476 valid_1’s l2: 279.49 [14] training’s l2: 192.262 valid_1’s l2: 279.343 [15] training’s l2: 192.056 valid_1’s l2: 279.093 [16] training’s l2: 191.852 valid_1’s l2: 278.866 [17] training’s l2: 191.649 valid_1’s l2: 278.635 [18] training’s l2: 191.447 valid_1’s l2: 278.415 [19] training’s l2: 191.242 valid_1’s l2: 278.203 [20] training’s l2: 191.04 valid_1’s l2: 277.982 [21] training’s l2: 190.838 valid_1’s l2: 277.923 [22] training’s l2: 190.646 valid_1’s l2: 277.742 [23] training’s l2: 190.443 valid_1’s l2: 277.679 [24] training’s l2: 190.255 valid_1’s l2: 277.694 [25] training’s l2: 190.054 valid_1’s l2: 277.633 [26] training’s l2: 189.846 valid_1’s l2: 277.391 [27] training’s l2: 189.644 valid_1’s l2: 277.19 [28] training’s l2: 189.439 valid_1’s l2: 276.98 [29] training’s l2: 189.226 valid_1’s l2: 276.946 [30] training’s l2: 189.01 valid_1’s l2: 276.913 [31] training’s l2: 188.822 valid_1’s l2: 276.851 [32] training’s l2: 188.621 valid_1’s l2: 276.788 [33] training’s l2: 188.42 valid_1’s l2: 276.75 [34] training’s l2: 188.22 valid_1’s l2: 276.69 [35] training’s l2: 188.02 valid_1’s l2: 276.632 [36] training’s l2: 187.816 valid_1’s l2: 276.475 [37] training’s l2: 187.611 valid_1’s l2: 276.313 [38] training’s l2: 187.411 valid_1’s l2: 
276.161 [39] training’s l2: 187.206 valid_1’s l2: 276.061 [40] training’s l2: 187.005 valid_1’s l2: 275.96 [41] training’s l2: 186.81 valid_1’s l2: 275.893 [42] training’s l2: 186.613 valid_1’s l2: 275.738 [43] training’s l2: 186.429 valid_1’s l2: 275.525 [44] training’s l2: 186.236 valid_1’s l2: 275.463 [45] training’s l2: 186.04 valid_1’s l2: 275.307 [46] training’s l2: 185.841 valid_1’s l2: 275.166 [47] training’s l2: 185.647 valid_1’s l2: 275.11 [48] training’s l2: 185.449 valid_1’s l2: 274.971 [49] training’s l2: 185.251 valid_1’s l2: 274.832 [50] training’s l2: 185.054 valid_1’s l2: 274.702 Did not meet early stopping. Best iteration is: [50] training’s l2: 185.054 valid_1’s l2: 274.702 [1] training’s l2: 194.28 valid_1’s l2: 284.224 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 194.077 valid_1’s l2: 284.168 [3] training’s l2: 193.864 valid_1’s l2: 283.928 [4] training’s l2: 193.644 valid_1’s l2: 283.85 [5] training’s l2: 193.44 valid_1’s l2: 283.792 [6] training’s l2: 193.229 valid_1’s l2: 283.704 [7] training’s l2: 193.028 valid_1’s l2: 283.464 [8] training’s l2: 192.827 valid_1’s l2: 283.23 [9] training’s l2: 192.614 valid_1’s l2: 283.139 [10] training’s l2: 192.402 valid_1’s l2: 283.053 [11] training’s l2: 192.197 valid_1’s l2: 282.953 [12] training’s l2: 191.984 valid_1’s l2: 283.065 [13] training’s l2: 191.77 valid_1’s l2: 283.183 [14] training’s l2: 191.551 valid_1’s l2: 283.276 [15] training’s l2: 191.343 valid_1’s l2: 283.068 [16] training’s l2: 191.132 valid_1’s l2: 282.808 [17] training’s l2: 190.924 valid_1’s l2: 282.56 [18] training’s l2: 190.716 valid_1’s l2: 282.309 [19] training’s l2: 190.515 valid_1’s l2: 282.1 [20] training’s l2: 190.307 valid_1’s l2: 281.842 [21] training’s l2: 190.098 valid_1’s l2: 281.759 [22] training’s l2: 189.899 valid_1’s l2: 281.58 [23] training’s l2: 189.691 valid_1’s l2: 281.491 [24] training’s l2: 189.482 valid_1’s l2: 281.362 [25] training’s l2: 189.272 valid_1’s l2: 281.272 [26] 
training’s l2: 189.073 valid_1’s l2: 281.03 [27] training’s l2: 188.877 valid_1’s l2: 280.783 [28] training’s l2: 188.686 valid_1’s l2: 280.572 [29] training’s l2: 188.484 valid_1’s l2: 280.494 [30] training’s l2: 188.28 valid_1’s l2: 280.42 [31] training’s l2: 188.082 valid_1’s l2: 280.356 [32] training’s l2: 187.879 valid_1’s l2: 280.266 [33] training’s l2: 187.676 valid_1’s l2: 280.17 [34] training’s l2: 187.475 valid_1’s l2: 280.079 [35] training’s l2: 187.274 valid_1’s l2: 279.987 [36] training’s l2: 187.071 valid_1’s l2: 280.103 [37] training’s l2: 186.868 valid_1’s l2: 280.236 [38] training’s l2: 186.673 valid_1’s l2: 280.19 [39] training’s l2: 186.48 valid_1’s l2: 280.13 [40] training’s l2: 186.291 valid_1’s l2: 280.077 [41] training’s l2: 186.09 valid_1’s l2: 279.983 [42] training’s l2: 185.885 valid_1’s l2: 279.892 [43] training’s l2: 185.695 valid_1’s l2: 279.656 [44] training’s l2: 185.495 valid_1’s l2: 279.567 [45] training’s l2: 185.293 valid_1’s l2: 279.47 [46] training’s l2: 185.097 valid_1’s l2: 279.392 [47] training’s l2: 184.903 valid_1’s l2: 279.261 [48] training’s l2: 184.708 valid_1’s l2: 279.166 [49] training’s l2: 184.512 valid_1’s l2: 279.084 [50] training’s l2: 184.318 valid_1’s l2: 279.001 Did not meet early stopping. 
(Condensed LightGBM training log. Every fold ran the full 50 boosting rounds without triggering early stopping ("Did not meet early stopping"), so the best iteration was [50] in each fold. The final valid_1 l2 per fold, in order: 279.001, 254.559, 194.615, 273.997, 274.821, 278.564, 254.625, 194.699, 274.203, 274.938, 278.564, 254.626, 194.694, 274.211, 274.938.)

max_depth  train logloss  eval logloss  test logloss
2          0.514985       0.992908      0.605194
3          0.416468       1.167130      0.670094
5          0.349214       1.269051      0.905083
6          0.324668       1.280065      0.885121
10         0.304745       1.316455      0.868913
20         0.304890       1.314212      0.868913

The evaluation logloss is lowest at max_depth = 2, so we fix max_depth = 2.
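The selection step behind this conclusion is mechanical: take the depth whose evaluation logloss is smallest. A minimal sketch, with the values copied from the table above:

```python
# Pick the max_depth with the smallest evaluation logloss
# (values taken from the results table above).
eval_logloss = {
    2: 0.992908,
    3: 1.167130,
    5: 1.269051,
    6: 1.280065,
    10: 1.316455,
    20: 1.314212,
}
best_depth = min(eval_logloss, key=eval_logloss.get)
print(best_depth)  # -> 2
```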

6.0.7 colsample_bytree

[Training log truncated. Three colsample_bytree candidates were each trained on the same five folds; no run triggered early stopping within 50 rounds, so the best iteration was always [50]. Final valid_1 l2 per fold:

candidate 1: 274.283, 256.256, 192.838, 276.586, 271.700
candidate 2: 276.628, 256.797, 193.257, 276.380, 272.317
candidate 3: 275.868, 256.627, 192.401, 277.060, …]
Best iteration is: [50] training’s l2: 187.674 valid_1’s l2: 277.06 [1] training’s l2: 195.035 valid_1’s l2: 280.058 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 194.89 valid_1’s l2: 279.874 [3] training’s l2: 194.744 valid_1’s l2: 279.682 [4] training’s l2: 194.597 valid_1’s l2: 279.49 [5] training’s l2: 194.453 valid_1’s l2: 279.307 [6] training’s l2: 194.316 valid_1’s l2: 279.469 [7] training’s l2: 194.183 valid_1’s l2: 279.291 [8] training’s l2: 194.05 valid_1’s l2: 279.113 [9] training’s l2: 193.913 valid_1’s l2: 279.275 [10] training’s l2: 193.777 valid_1’s l2: 279.437 [11] training’s l2: 193.629 valid_1’s l2: 279.246 [12] training’s l2: 193.484 valid_1’s l2: 279.033 [13] training’s l2: 193.338 valid_1’s l2: 278.82 [14] training’s l2: 193.193 valid_1’s l2: 278.607 [15] training’s l2: 193.048 valid_1’s l2: 278.395 [16] training’s l2: 192.911 valid_1’s l2: 278.201 [17] training’s l2: 192.775 valid_1’s l2: 278.007 [18] training’s l2: 192.64 valid_1’s l2: 277.813 [19] training’s l2: 192.503 valid_1’s l2: 277.632 [20] training’s l2: 192.368 valid_1’s l2: 277.439 [21] training’s l2: 192.232 valid_1’s l2: 277.579 [22] training’s l2: 192.094 valid_1’s l2: 277.404 [23] training’s l2: 191.959 valid_1’s l2: 277.544 [24] training’s l2: 191.829 valid_1’s l2: 277.694 [25] training’s l2: 191.694 valid_1’s l2: 277.834 [26] training’s l2: 191.552 valid_1’s l2: 277.663 [27] training’s l2: 191.41 valid_1’s l2: 277.461 [28] training’s l2: 191.271 valid_1’s l2: 277.292 [29] training’s l2: 191.13 valid_1’s l2: 277.088 [30] training’s l2: 190.989 valid_1’s l2: 276.885 [31] training’s l2: 190.863 valid_1’s l2: 277.034 [32] training’s l2: 190.731 valid_1’s l2: 276.846 [33] training’s l2: 190.599 valid_1’s l2: 276.659 [34] training’s l2: 190.468 valid_1’s l2: 276.471 [35] training’s l2: 190.337 valid_1’s l2: 276.287 [36] training’s l2: 190.194 valid_1’s l2: 276.109 [37] training’s l2: 190.05 valid_1’s l2: 275.932 [38] training’s l2: 189.915 valid_1’s 
l2: 275.769 [39] training’s l2: 189.772 valid_1’s l2: 275.592 [40] training’s l2: 189.63 valid_1’s l2: 275.416 [41] training’s l2: 189.505 valid_1’s l2: 275.257 [42] training’s l2: 189.38 valid_1’s l2: 275.084 [43] training’s l2: 189.256 valid_1’s l2: 274.926 [44] training’s l2: 189.132 valid_1’s l2: 274.768 [45] training’s l2: 189.007 valid_1’s l2: 274.596 [46] training’s l2: 188.871 valid_1’s l2: 274.419 [47] training’s l2: 188.742 valid_1’s l2: 274.25 [48] training’s l2: 188.607 valid_1’s l2: 274.073 [49] training’s l2: 188.472 valid_1’s l2: 273.897 [50] training’s l2: 188.337 valid_1’s l2: 273.72 Did not meet early stopping. Best iteration is: [50] training’s l2: 188.337 valid_1’s l2: 273.72 [1] training’s l2: 194.371 valid_1’s l2: 284.102 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 194.242 valid_1’s l2: 283.896 [3] training’s l2: 194.112 valid_1’s l2: 283.695 [4] training’s l2: 193.982 valid_1’s l2: 283.494 [5] training’s l2: 193.853 valid_1’s l2: 283.289 [6] training’s l2: 193.727 valid_1’s l2: 283.095 [7] training’s l2: 193.601 valid_1’s l2: 282.902 [8] training’s l2: 193.475 valid_1’s l2: 282.71 [9] training’s l2: 193.349 valid_1’s l2: 282.518 [10] training’s l2: 193.224 valid_1’s l2: 282.326 [11] training’s l2: 193.102 valid_1’s l2: 282.149 [12] training’s l2: 192.978 valid_1’s l2: 282.286 [13] training’s l2: 192.852 valid_1’s l2: 282.089 [14] training’s l2: 192.727 valid_1’s l2: 281.892 [15] training’s l2: 192.602 valid_1’s l2: 281.695 [16] training’s l2: 192.477 valid_1’s l2: 281.529 [17] training’s l2: 192.353 valid_1’s l2: 281.364 [18] training’s l2: 192.229 valid_1’s l2: 281.199 [19] training’s l2: 192.098 valid_1’s l2: 281.003 [20] training’s l2: 191.974 valid_1’s l2: 280.838 [21] training’s l2: 191.846 valid_1’s l2: 280.645 [22] training’s l2: 191.718 valid_1’s l2: 280.452 [23] training’s l2: 191.59 valid_1’s l2: 280.26 [24] training’s l2: 191.465 valid_1’s l2: 280.058 [25] training’s l2: 191.337 valid_1’s l2: 
279.866 [26] training’s l2: 191.219 valid_1’s l2: 279.703 [27] training’s l2: 191.1 valid_1’s l2: 279.53 [28] training’s l2: 190.976 valid_1’s l2: 279.35 [29] training’s l2: 190.857 valid_1’s l2: 279.177 [30] training’s l2: 190.738 valid_1’s l2: 279.004 [31] training’s l2: 190.617 valid_1’s l2: 279.132 [32] training’s l2: 190.488 valid_1’s l2: 278.934 [33] training’s l2: 190.359 valid_1’s l2: 278.737 [34] training’s l2: 190.231 valid_1’s l2: 278.54 [35] training’s l2: 190.102 valid_1’s l2: 278.344 [36] training’s l2: 189.982 valid_1’s l2: 278.481 [37] training’s l2: 189.863 valid_1’s l2: 278.618 [38] training’s l2: 189.737 valid_1’s l2: 278.41 [39] training’s l2: 189.617 valid_1’s l2: 278.547 [40] training’s l2: 189.498 valid_1’s l2: 278.683 [41] training’s l2: 189.376 valid_1’s l2: 278.808 [42] training’s l2: 189.253 valid_1’s l2: 278.933 [43] training’s l2: 189.131 valid_1’s l2: 278.75 [44] training’s l2: 189.009 valid_1’s l2: 278.875 [45] training’s l2: 188.888 valid_1’s l2: 278.999 Early stopping, best iteration is: [35] training’s l2: 190.102 valid_1’s l2: 278.344 [1] training’s l2: 198.317 valid_1’s l2: 265.052 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 198.184 valid_1’s l2: 264.87 [3] training’s l2: 198.049 valid_1’s l2: 264.694 [4] training’s l2: 197.914 valid_1’s l2: 264.519 [5] training’s l2: 197.781 valid_1’s l2: 264.338 [6] training’s l2: 197.645 valid_1’s l2: 264.152 [7] training’s l2: 197.509 valid_1’s l2: 263.965 [8] training’s l2: 197.373 valid_1’s l2: 263.779 [9] training’s l2: 197.238 valid_1’s l2: 263.594 [10] training’s l2: 197.103 valid_1’s l2: 263.41 [11] training’s l2: 196.965 valid_1’s l2: 263.248 [12] training’s l2: 196.829 valid_1’s l2: 263.103 [13] training’s l2: 196.692 valid_1’s l2: 262.941 [14] training’s l2: 196.555 valid_1’s l2: 262.78 [15] training’s l2: 196.419 valid_1’s l2: 262.619 [16] training’s l2: 196.278 valid_1’s l2: 262.426 [17] training’s l2: 196.137 valid_1’s l2: 262.233 [18] 
training’s l2: 195.997 valid_1’s l2: 262.04 [19] training’s l2: 195.86 valid_1’s l2: 261.878 [20] training’s l2: 195.72 valid_1’s l2: 261.686 [21] training’s l2: 195.585 valid_1’s l2: 261.522 [22] training’s l2: 195.451 valid_1’s l2: 261.357 [23] training’s l2: 195.317 valid_1’s l2: 261.192 [24] training’s l2: 195.186 valid_1’s l2: 261.029 [25] training’s l2: 195.053 valid_1’s l2: 260.866 [26] training’s l2: 194.922 valid_1’s l2: 260.699 [27] training’s l2: 194.793 valid_1’s l2: 260.523 [28] training’s l2: 194.663 valid_1’s l2: 260.352 [29] training’s l2: 194.534 valid_1’s l2: 260.186 [30] training’s l2: 194.405 valid_1’s l2: 260.011 [31] training’s l2: 194.274 valid_1’s l2: 259.842 [32] training’s l2: 194.138 valid_1’s l2: 259.659 [33] training’s l2: 194.003 valid_1’s l2: 259.484 [34] training’s l2: 193.875 valid_1’s l2: 259.345 [35] training’s l2: 193.74 valid_1’s l2: 259.171 [36] training’s l2: 193.61 valid_1’s l2: 259.002 [37] training’s l2: 193.48 valid_1’s l2: 258.834 [38] training’s l2: 193.351 valid_1’s l2: 258.68 [39] training’s l2: 193.222 valid_1’s l2: 258.513 [40] training’s l2: 193.093 valid_1’s l2: 258.346 [41] training’s l2: 192.956 valid_1’s l2: 258.17 [42] training’s l2: 192.828 valid_1’s l2: 258.001 [43] training’s l2: 192.692 valid_1’s l2: 257.825 [44] training’s l2: 192.557 valid_1’s l2: 257.65 [45] training’s l2: 192.429 valid_1’s l2: 257.482 [46] training’s l2: 192.304 valid_1’s l2: 257.346 [47] training’s l2: 192.18 valid_1’s l2: 257.205 [48] training’s l2: 192.055 valid_1’s l2: 257.069 [49] training’s l2: 191.931 valid_1’s l2: 256.934 [50] training’s l2: 191.807 valid_1’s l2: 256.799 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 191.807 valid_1’s l2: 256.799 [1] training’s l2: 205.246 valid_1’s l2: 198.645 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 205.103 valid_1’s l2: 198.51 [3] training’s l2: 204.961 valid_1’s l2: 198.388 [4] training’s l2: 204.819 valid_1’s l2: 198.267 [5] training’s l2: 204.677 valid_1’s l2: 198.133 [6] training’s l2: 204.533 valid_1’s l2: 197.996 [7] training’s l2: 204.388 valid_1’s l2: 197.868 [8] training’s l2: 204.244 valid_1’s l2: 197.739 [9] training’s l2: 204.101 valid_1’s l2: 197.611 [10] training’s l2: 203.957 valid_1’s l2: 197.483 [11] training’s l2: 203.81 valid_1’s l2: 197.347 [12] training’s l2: 203.668 valid_1’s l2: 197.217 [13] training’s l2: 203.517 valid_1’s l2: 197.089 [14] training’s l2: 203.367 valid_1’s l2: 196.962 [15] training’s l2: 203.217 valid_1’s l2: 196.835 [16] training’s l2: 203.069 valid_1’s l2: 196.702 [17] training’s l2: 202.92 valid_1’s l2: 196.57 [18] training’s l2: 202.772 valid_1’s l2: 196.438 [19] training’s l2: 202.628 valid_1’s l2: 196.312 [20] training’s l2: 202.48 valid_1’s l2: 196.18 [21] training’s l2: 202.33 valid_1’s l2: 196.059 [22] training’s l2: 202.18 valid_1’s l2: 195.937 [23] training’s l2: 202.031 valid_1’s l2: 195.807 [24] training’s l2: 201.891 valid_1’s l2: 195.675 [25] training’s l2: 201.743 valid_1’s l2: 195.554 [26] training’s l2: 201.606 valid_1’s l2: 195.436 [27] training’s l2: 201.469 valid_1’s l2: 195.323 [28] training’s l2: 201.329 valid_1’s l2: 195.191 [29] training’s l2: 201.192 valid_1’s l2: 195.077 [30] training’s l2: 201.055 valid_1’s l2: 194.963 [31] training’s l2: 200.92 valid_1’s l2: 194.834 [32] training’s l2: 200.782 valid_1’s l2: 194.713 [33] training’s l2: 200.645 valid_1’s l2: 194.592 [34] training’s l2: 200.508 valid_1’s l2: 194.472 [35] training’s l2: 200.371 valid_1’s l2: 194.352 [36] training’s l2: 200.235 valid_1’s l2: 194.232 [37] training’s l2: 200.1 valid_1’s l2: 194.113 [38] training’s l2: 199.966 valid_1’s 
l2: 194.002 [39] training’s l2: 199.831 valid_1’s l2: 193.883 [40] training’s l2: 199.696 valid_1’s l2: 193.762 [41] training’s l2: 199.564 valid_1’s l2: 193.644 [42] training’s l2: 199.432 valid_1’s l2: 193.526 [43] training’s l2: 199.301 valid_1’s l2: 193.409 [44] training’s l2: 199.168 valid_1’s l2: 193.298 [45] training’s l2: 199.037 valid_1’s l2: 193.18 [46] training’s l2: 198.902 valid_1’s l2: 193.07 [47] training’s l2: 198.763 valid_1’s l2: 192.946 [48] training’s l2: 198.628 valid_1’s l2: 192.837 [49] training’s l2: 198.494 valid_1’s l2: 192.729 [50] training’s l2: 198.36 valid_1’s l2: 192.621 Did not meet early stopping. Best iteration is: [50] training’s l2: 198.36 valid_1’s l2: 192.621 [1] training’s l2: 193.899 valid_1’s l2: 286.209 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 193.763 valid_1’s l2: 286.027 [3] training’s l2: 193.632 valid_1’s l2: 285.849 [4] training’s l2: 193.503 valid_1’s l2: 285.669 [5] training’s l2: 193.368 valid_1’s l2: 285.488 [6] training’s l2: 193.241 valid_1’s l2: 285.315 [7] training’s l2: 193.115 valid_1’s l2: 285.143 [8] training’s l2: 192.989 valid_1’s l2: 284.971 [9] training’s l2: 192.863 valid_1’s l2: 284.799 [10] training’s l2: 192.737 valid_1’s l2: 284.627 [11] training’s l2: 192.602 valid_1’s l2: 284.417 [12] training’s l2: 192.475 valid_1’s l2: 284.235 [13] training’s l2: 192.35 valid_1’s l2: 284.012 [14] training’s l2: 192.225 valid_1’s l2: 283.79 [15] training’s l2: 192.1 valid_1’s l2: 283.568 [16] training’s l2: 191.972 valid_1’s l2: 283.38 [17] training’s l2: 191.844 valid_1’s l2: 283.191 [18] training’s l2: 191.717 valid_1’s l2: 283.003 [19] training’s l2: 191.588 valid_1’s l2: 282.813 [20] training’s l2: 191.461 valid_1’s l2: 282.626 [21] training’s l2: 191.334 valid_1’s l2: 282.439 [22] training’s l2: 191.208 valid_1’s l2: 282.252 [23] training’s l2: 191.082 valid_1’s l2: 282.066 [24] training’s l2: 190.963 valid_1’s l2: 281.87 [25] training’s l2: 190.838 valid_1’s l2: 
281.684 [26] training’s l2: 190.709 valid_1’s l2: 281.505 [27] training’s l2: 190.585 valid_1’s l2: 281.315 [28] training’s l2: 190.464 valid_1’s l2: 281.138 [29] training’s l2: 190.336 valid_1’s l2: 280.959 [30] training’s l2: 190.209 valid_1’s l2: 280.781 [31] training’s l2: 190.088 valid_1’s l2: 280.59 [32] training’s l2: 189.968 valid_1’s l2: 280.4 [33] training’s l2: 189.848 valid_1’s l2: 280.21 [34] training’s l2: 189.728 valid_1’s l2: 280.02 [35] training’s l2: 189.609 valid_1’s l2: 279.831 [36] training’s l2: 189.489 valid_1’s l2: 279.661 [37] training’s l2: 189.369 valid_1’s l2: 279.492 [38] training’s l2: 189.242 valid_1’s l2: 279.304 [39] training’s l2: 189.123 valid_1’s l2: 279.136 [40] training’s l2: 189.003 valid_1’s l2: 278.967 [41] training’s l2: 188.89 valid_1’s l2: 278.806 [42] training’s l2: 188.777 valid_1’s l2: 278.641 [43] training’s l2: 188.665 valid_1’s l2: 278.48 [44] training’s l2: 188.552 valid_1’s l2: 278.32 [45] training’s l2: 188.44 valid_1’s l2: 278.156 [46] training’s l2: 188.312 valid_1’s l2: 277.969 [47] training’s l2: 188.187 valid_1’s l2: 277.792 [48] training’s l2: 188.058 valid_1’s l2: 277.605 [49] training’s l2: 187.93 valid_1’s l2: 277.419 [50] training’s l2: 187.801 valid_1’s l2: 277.232 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 187.801 valid_1’s l2: 277.232 [1] training’s l2: 195.041 valid_1’s l2: 280.064 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 194.905 valid_1’s l2: 279.874 [3] training’s l2: 194.764 valid_1’s l2: 279.687 [4] training’s l2: 194.624 valid_1’s l2: 279.502 [5] training’s l2: 194.488 valid_1’s l2: 279.312 [6] training’s l2: 194.346 valid_1’s l2: 279.132 [7] training’s l2: 194.204 valid_1’s l2: 278.952 [8] training’s l2: 194.063 valid_1’s l2: 278.773 [9] training’s l2: 193.922 valid_1’s l2: 278.594 [10] training’s l2: 193.781 valid_1’s l2: 278.415 [11] training’s l2: 193.643 valid_1’s l2: 278.221 [12] training’s l2: 193.506 valid_1’s l2: 278.34 [13] training’s l2: 193.369 valid_1’s l2: 278.459 [14] training’s l2: 193.233 valid_1’s l2: 278.578 [15] training’s l2: 193.099 valid_1’s l2: 278.396 [16] training’s l2: 192.958 valid_1’s l2: 278.206 [17] training’s l2: 192.817 valid_1’s l2: 278.017 [18] training’s l2: 192.677 valid_1’s l2: 277.829 [19] training’s l2: 192.535 valid_1’s l2: 277.656 [20] training’s l2: 192.395 valid_1’s l2: 277.468 [21] training’s l2: 192.262 valid_1’s l2: 277.274 [22] training’s l2: 192.13 valid_1’s l2: 277.093 [23] training’s l2: 191.998 valid_1’s l2: 276.902 [24] training’s l2: 191.866 valid_1’s l2: 276.717 [25] training’s l2: 191.736 valid_1’s l2: 276.525 [26] training’s l2: 191.605 valid_1’s l2: 276.346 [27] training’s l2: 191.465 valid_1’s l2: 276.164 [28] training’s l2: 191.332 valid_1’s l2: 275.991 [29] training’s l2: 191.193 valid_1’s l2: 275.81 [30] training’s l2: 191.054 valid_1’s l2: 275.63 [31] training’s l2: 190.923 valid_1’s l2: 275.45 [32] training’s l2: 190.787 valid_1’s l2: 275.265 [33] training’s l2: 190.652 valid_1’s l2: 275.083 [34] training’s l2: 190.517 valid_1’s l2: 274.898 [35] training’s l2: 190.381 valid_1’s l2: 274.714 [36] training’s l2: 190.243 valid_1’s l2: 274.538 [37] training’s l2: 190.105 valid_1’s l2: 274.362 [38] training’s l2: 189.972 
valid_1’s l2: 274.195 [39] training’s l2: 189.835 valid_1’s l2: 274.02 [40] training’s l2: 189.697 valid_1’s l2: 273.845 [41] training’s l2: 189.574 valid_1’s l2: 273.703 [42] training’s l2: 189.452 valid_1’s l2: 273.55 [43] training’s l2: 189.329 valid_1’s l2: 273.408 [44] training’s l2: 189.206 valid_1’s l2: 273.267 [45] training’s l2: 189.085 valid_1’s l2: 273.115 [46] training’s l2: 188.956 valid_1’s l2: 272.95 [47] training’s l2: 188.824 valid_1’s l2: 272.8 [48] training’s l2: 188.696 valid_1’s l2: 272.636 [49] training’s l2: 188.568 valid_1’s l2: 272.472 [50] training’s l2: 188.439 valid_1’s l2: 272.308 Did not meet early stopping. Best iteration is: [50] training’s l2: 188.439 valid_1’s l2: 272.308 [1] training’s l2: 194.373 valid_1’s l2: 284.124 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 194.245 valid_1’s l2: 283.921 [3] training’s l2: 194.117 valid_1’s l2: 283.741 [4] training’s l2: 193.989 valid_1’s l2: 283.561 [5] training’s l2: 193.862 valid_1’s l2: 283.36 [6] training’s l2: 193.735 valid_1’s l2: 283.166 [7] training’s l2: 193.608 valid_1’s l2: 282.973 [8] training’s l2: 193.481 valid_1’s l2: 282.781 [9] training’s l2: 193.354 valid_1’s l2: 282.588 [10] training’s l2: 193.228 valid_1’s l2: 282.396 [11] training’s l2: 193.094 valid_1’s l2: 282.2 [12] training’s l2: 192.97 valid_1’s l2: 282.331 [13] training’s l2: 192.846 valid_1’s l2: 282.138 [14] training’s l2: 192.722 valid_1’s l2: 281.946 [15] training’s l2: 192.599 valid_1’s l2: 281.754 [16] training’s l2: 192.474 valid_1’s l2: 281.57 [17] training’s l2: 192.35 valid_1’s l2: 281.387 [18] training’s l2: 192.226 valid_1’s l2: 281.204 [19] training’s l2: 192.095 valid_1’s l2: 281.014 [20] training’s l2: 191.972 valid_1’s l2: 280.848 [21] training’s l2: 191.844 valid_1’s l2: 280.655 [22] training’s l2: 191.716 valid_1’s l2: 280.462 [23] training’s l2: 191.588 valid_1’s l2: 280.269 [24] training’s l2: 191.463 valid_1’s l2: 280.064 [25] training’s l2: 191.336 valid_1’s 
l2: 279.872 [26] training’s l2: 191.216 valid_1’s l2: 279.703 [27] training’s l2: 191.094 valid_1’s l2: 279.522 [28] training’s l2: 190.97 valid_1’s l2: 279.338 [29] training’s l2: 190.848 valid_1’s l2: 279.478 [30] training’s l2: 190.727 valid_1’s l2: 279.617 [31] training’s l2: 190.609 valid_1’s l2: 279.749 [32] training’s l2: 190.488 valid_1’s l2: 279.88 [33] training’s l2: 190.367 valid_1’s l2: 280.011 [34] training’s l2: 190.246 valid_1’s l2: 280.142 [35] training’s l2: 190.126 valid_1’s l2: 280.273 [36] training’s l2: 190.007 valid_1’s l2: 280.1 [37] training’s l2: 189.889 valid_1’s l2: 279.928 [38] training’s l2: 189.773 valid_1’s l2: 279.75 Early stopping, best iteration is: [28] training’s l2: 190.97 valid_1’s l2: 279.338 [1] training’s l2: 198.32 valid_1’s l2: 265.07 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 198.184 valid_1’s l2: 264.891 [3] training’s l2: 198.05 valid_1’s l2: 264.736 [4] training’s l2: 197.918 valid_1’s l2: 264.578 [5] training’s l2: 197.783 valid_1’s l2: 264.4 [6] training’s l2: 197.645 valid_1’s l2: 264.214 [7] training’s l2: 197.507 valid_1’s l2: 264.028 [8] training’s l2: 197.369 valid_1’s l2: 263.843 [9] training’s l2: 197.232 valid_1’s l2: 263.658 [10] training’s l2: 197.095 valid_1’s l2: 263.473 [11] training’s l2: 196.958 valid_1’s l2: 263.315 [12] training’s l2: 196.824 valid_1’s l2: 263.162 [13] training’s l2: 196.687 valid_1’s l2: 262.981 [14] training’s l2: 196.55 valid_1’s l2: 262.799 [15] training’s l2: 196.413 valid_1’s l2: 262.618 [16] training’s l2: 196.273 valid_1’s l2: 262.428 [17] training’s l2: 196.135 valid_1’s l2: 262.247 [18] training’s l2: 195.995 valid_1’s l2: 262.058 [19] training’s l2: 195.86 valid_1’s l2: 261.891 [20] training’s l2: 195.723 valid_1’s l2: 261.71 [21] training’s l2: 195.59 valid_1’s l2: 261.545 [22] training’s l2: 195.456 valid_1’s l2: 261.379 [23] training’s l2: 195.323 valid_1’s l2: 261.214 [24] training’s l2: 195.192 valid_1’s l2: 261.048 [25] training’s 
l2: 195.06 valid_1’s l2: 260.876 [26] training’s l2: 194.928 valid_1’s l2: 260.705 [27] training’s l2: 194.796 valid_1’s l2: 260.536 [28] training’s l2: 194.662 valid_1’s l2: 260.356 [29] training’s l2: 194.531 valid_1’s l2: 260.187 [30] training’s l2: 194.4 valid_1’s l2: 260.019 [31] training’s l2: 194.267 valid_1’s l2: 259.845 [32] training’s l2: 194.136 valid_1’s l2: 259.705 [33] training’s l2: 194.005 valid_1’s l2: 259.559 [34] training’s l2: 193.875 valid_1’s l2: 259.42 [35] training’s l2: 193.745 valid_1’s l2: 259.274 [36] training’s l2: 193.612 valid_1’s l2: 259.092 [37] training’s l2: 193.479 valid_1’s l2: 258.909 [38] training’s l2: 193.352 valid_1’s l2: 258.763 [39] training’s l2: 193.22 valid_1’s l2: 258.581 [40] training’s l2: 193.088 valid_1’s l2: 258.4 [41] training’s l2: 192.955 valid_1’s l2: 258.222 [42] training’s l2: 192.83 valid_1’s l2: 258.086 [43] training’s l2: 192.698 valid_1’s l2: 257.909 [44] training’s l2: 192.566 valid_1’s l2: 257.733 [45] training’s l2: 192.442 valid_1’s l2: 257.597 [46] training’s l2: 192.313 valid_1’s l2: 257.432 [47] training’s l2: 192.182 valid_1’s l2: 257.271 [48] training’s l2: 192.053 valid_1’s l2: 257.106 [49] training’s l2: 191.925 valid_1’s l2: 256.942 [50] training’s l2: 191.797 valid_1’s l2: 256.778 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 191.797 valid_1’s l2: 256.778 [1] training’s l2: 205.242 valid_1’s l2: 198.637 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 205.097 valid_1’s l2: 198.501 [3] training’s l2: 204.951 valid_1’s l2: 198.371 [4] training’s l2: 204.805 valid_1’s l2: 198.241 [5] training’s l2: 204.66 valid_1’s l2: 198.106 [6] training’s l2: 204.516 valid_1’s l2: 197.973 [7] training’s l2: 204.371 valid_1’s l2: 197.845 [8] training’s l2: 204.227 valid_1’s l2: 197.717 [9] training’s l2: 204.083 valid_1’s l2: 197.589 [10] training’s l2: 203.94 valid_1’s l2: 197.462 [11] training’s l2: 203.796 valid_1’s l2: 197.334 [12] training’s l2: 203.653 valid_1’s l2: 197.204 [13] training’s l2: 203.503 valid_1’s l2: 197.074 [14] training’s l2: 203.36 valid_1’s l2: 196.946 [15] training’s l2: 203.21 valid_1’s l2: 196.816 [16] training’s l2: 203.064 valid_1’s l2: 196.68 [17] training’s l2: 202.917 valid_1’s l2: 196.543 [18] training’s l2: 202.771 valid_1’s l2: 196.407 [19] training’s l2: 202.63 valid_1’s l2: 196.285 [20] training’s l2: 202.484 valid_1’s l2: 196.15 [21] training’s l2: 202.342 valid_1’s l2: 196.024 [22] training’s l2: 202.2 valid_1’s l2: 195.899 [23] training’s l2: 202.058 valid_1’s l2: 195.775 [24] training’s l2: 201.92 valid_1’s l2: 195.643 [25] training’s l2: 201.778 valid_1’s l2: 195.518 [26] training’s l2: 201.642 valid_1’s l2: 195.403 [27] training’s l2: 201.504 valid_1’s l2: 195.292 [28] training’s l2: 201.363 valid_1’s l2: 195.165 [29] training’s l2: 201.227 valid_1’s l2: 195.053 [30] training’s l2: 201.091 valid_1’s l2: 194.942 [31] training’s l2: 200.952 valid_1’s l2: 194.814 [32] training’s l2: 200.814 valid_1’s l2: 194.692 [33] training’s l2: 200.676 valid_1’s l2: 194.571 [34] training’s l2: 200.538 valid_1’s l2: 194.45 [35] training’s l2: 200.401 valid_1’s l2: 194.329 [36] training’s l2: 200.256 valid_1’s l2: 194.196 [37] training’s l2: 200.112 valid_1’s l2: 194.064 [38] training’s l2: 199.972 valid_1’s 
l2: 193.933 [39] training’s l2: 199.828 valid_1’s l2: 193.802 [40] training’s l2: 199.685 valid_1’s l2: 193.674 [41] training’s l2: 199.548 valid_1’s l2: 193.554 [42] training’s l2: 199.412 valid_1’s l2: 193.433 [43] training’s l2: 199.276 valid_1’s l2: 193.314 [44] training’s l2: 199.14 valid_1’s l2: 193.195 [45] training’s l2: 199.004 valid_1’s l2: 193.075 [46] training’s l2: 198.869 valid_1’s l2: 192.966 [47] training’s l2: 198.727 valid_1’s l2: 192.842 [48] training’s l2: 198.592 valid_1’s l2: 192.733 [49] training’s l2: 198.457 valid_1’s l2: 192.624 [50] training’s l2: 198.323 valid_1’s l2: 192.515 Did not meet early stopping. Best iteration is: [50] training’s l2: 198.323 valid_1’s l2: 192.515 [1] training’s l2: 193.896 valid_1’s l2: 286.204 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 193.758 valid_1’s l2: 286.015 [3] training’s l2: 193.623 valid_1’s l2: 285.832 [4] training’s l2: 193.491 valid_1’s l2: 285.646 [5] training’s l2: 193.353 valid_1’s l2: 285.458 [6] training’s l2: 193.223 valid_1’s l2: 285.281 [7] training’s l2: 193.096 valid_1’s l2: 285.103 [8] training’s l2: 192.966 valid_1’s l2: 284.927 [9] training’s l2: 192.837 valid_1’s l2: 284.751 [10] training’s l2: 192.708 valid_1’s l2: 284.575 [11] training’s l2: 192.572 valid_1’s l2: 284.368 [12] training’s l2: 192.445 valid_1’s l2: 284.187 [13] training’s l2: 192.317 valid_1’s l2: 284.007 [14] training’s l2: 192.19 valid_1’s l2: 283.828 [15] training’s l2: 192.063 valid_1’s l2: 283.648 [16] training’s l2: 191.936 valid_1’s l2: 283.467 [17] training’s l2: 191.809 valid_1’s l2: 283.287 [18] training’s l2: 191.683 valid_1’s l2: 283.107 [19] training’s l2: 191.547 valid_1’s l2: 282.906 [20] training’s l2: 191.419 valid_1’s l2: 282.726 [21] training’s l2: 191.294 valid_1’s l2: 282.538 [22] training’s l2: 191.169 valid_1’s l2: 282.352 [23] training’s l2: 191.045 valid_1’s l2: 282.165 [24] training’s l2: 190.926 valid_1’s l2: 281.972 [25] training’s l2: 190.802 valid_1’s 
l2: 281.786 [26] training’s l2: 190.681 valid_1’s l2: 281.617 [27] training’s l2: 190.561 valid_1’s l2: 281.448 [28] training’s l2: 190.437 valid_1’s l2: 281.267 [29] training’s l2: 190.317 valid_1’s l2: 281.099 [30] training’s l2: 190.198 valid_1’s l2: 280.928 [31] training’s l2: 190.079 valid_1’s l2: 280.734 [32] training’s l2: 189.952 valid_1’s l2: 280.558 [33] training’s l2: 189.826 valid_1’s l2: 280.383 [34] training’s l2: 189.699 valid_1’s l2: 280.208 [35] training’s l2: 189.573 valid_1’s l2: 280.034 [36] training’s l2: 189.452 valid_1’s l2: 279.863 [37] training’s l2: 189.331 valid_1’s l2: 279.692 [38] training’s l2: 189.199 valid_1’s l2: 279.493 [39] training’s l2: 189.079 valid_1’s l2: 279.323 [40] training’s l2: 188.96 valid_1’s l2: 279.14 [41] training’s l2: 188.838 valid_1’s l2: 278.968 [42] training’s l2: 188.716 valid_1’s l2: 278.796 [43] training’s l2: 188.594 valid_1’s l2: 278.624 [44] training’s l2: 188.472 valid_1’s l2: 278.453 [45] training’s l2: 188.352 valid_1’s l2: 278.277 [46] training’s l2: 188.227 valid_1’s l2: 278.099 [47] training’s l2: 188.096 valid_1’s l2: 277.917 [48] training’s l2: 187.969 valid_1’s l2: 277.743 [49] training’s l2: 187.842 valid_1’s l2: 277.57 [50] training’s l2: 187.716 valid_1’s l2: 277.397 Did not meet early stopping. 
(LightGBM per-iteration training log condensed. Each fold trained for at most 50 rounds with early-stopping patience of 10 rounds on valid_1’s l2; only the best iteration per fold is kept.)

Best iteration [50]: training’s l2: 187.716, valid_1’s l2: 277.397
Best iteration [50]: training’s l2: 188.358, valid_1’s l2: 272.014
Best iteration [50]: training’s l2: 188.329, valid_1’s l2: 278.756
Best iteration [50]: training’s l2: 191.775, valid_1’s l2: 256.685
Best iteration [50]: training’s l2: 198.309, valid_1’s l2: 192.554
Best iteration [50]: training’s l2: 187.767, valid_1’s l2: 277.455
Best iteration [50]: training’s l2: 188.335, valid_1’s l2: 272.009
Early stopping, best iteration [28]: training’s l2: 190.98, valid_1’s l2: 279.9
Best iteration [50]: training’s l2: 191.769, valid_1’s l2: 256.451
Best iteration [50]: training’s l2: 198.377, valid_1’s l2: 192.52
Best iteration [50]: training’s l2: 187.701, valid_1’s l2: 277.531
Best iteration [50]: training’s l2: 188.312, valid_1’s l2: 272.202
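The log above reflects LightGBM's early-stopping rule: training runs until the validation score has not improved for 10 consecutive rounds, then reports the best iteration (as in the fold that stopped at round 28). A minimal pure-Python sketch of that mechanism, replaying a hypothetical sequence of validation l2 scores rather than calling LightGBM itself:

```python
def train_with_early_stopping(valid_scores, patience=10, max_rounds=50):
    """Replay a sequence of per-round validation l2 scores and report
    where training would stop under LightGBM-style early stopping:
    halt once the score has not improved for `patience` consecutive
    rounds (lower l2 is better), and return the best round seen."""
    best_score = float("inf")
    best_round = 0
    for round_no, score in enumerate(valid_scores[:max_rounds], start=1):
        if score < best_score:
            best_score, best_round = score, round_no
        elif round_no - best_round >= patience:
            return best_round, best_score, True    # early-stopped
    return best_round, best_score, False           # ran to max_rounds

# A steadily improving score never triggers the patience counter,
# matching the many folds above that "Did not meet early stopping".
best_round, best_score, stopped = train_with_early_stopping(
    [300.0 - 0.5 * i for i in range(50)]
)
print(best_round, stopped)  # 50 False
```

The `valid_scores` sequence here is synthetic; in the actual runs these values come from `valid_1`'s l2 at each boosting round.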

colsample_bytree train logloss eval logloss test logloss
0 0.10 0.460455 0.805872 0.617258
1 0.30 0.487318 1.000109 0.548485
2 0.50 0.518791 0.961162 0.607704
3 0.80 0.568898 1.029488 0.535118
4 0.90 0.626881 1.067931 0.620248
5 0.95 0.572878 1.116754 0.608845
6 1.00 0.644596 1.085327 0.562080
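Reading the table above, one would pick `colsample_bytree` by the held-out logloss (lower is better). A small sketch that selects the best candidate, with the values copied directly from the table:

```python
# (colsample_bytree, train logloss, eval logloss, test logloss),
# copied from the sweep results above
results = [
    (0.10, 0.460455, 0.805872, 0.617258),
    (0.30, 0.487318, 1.000109, 0.548485),
    (0.50, 0.518791, 0.961162, 0.607704),
    (0.80, 0.568898, 1.029488, 0.535118),
    (0.90, 0.626881, 1.067931, 0.620248),
    (0.95, 0.572878, 1.116754, 0.608845),
    (1.00, 0.644596, 1.085327, 0.562080),
]

# lower logloss is better; pick the candidate with the lowest test logloss
best = min(results, key=lambda row: row[3])
print(best[0])  # 0.8
```

Note the eval and test columns disagree on the winner (eval logloss favors 0.10), so which column to trust depends on how representative each split is.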

6.0.8 subsample

(LightGBM per-iteration training log condensed. Each fold trained for at most 50 rounds with early-stopping patience of 10 rounds on valid_1’s l2; only the best iteration per fold is kept.)

Best iteration [50]: training’s l2: 189.815, valid_1’s l2: 278.237
Best iteration [50]: training’s l2: 193.419, valid_1’s l2: 259.386
Best iteration [50]: training’s l2: 199.965, valid_1’s l2: 194.853
Best iteration [50]: training’s l2: 189.4, valid_1’s l2: 279.41
Best iteration is: [50] training’s l2: 189.4 valid_1’s l2: 279.41 [1] training’s l2: 195.11 valid_1’s l2: 280.235 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 194.975 valid_1’s l2: 280.069 [3] training’s l2: 194.877 valid_1’s l2: 279.943 [4] training’s l2: 194.778 valid_1’s l2: 279.79 [5] training’s l2: 194.675 valid_1’s l2: 279.641 [6] training’s l2: 194.558 valid_1’s l2: 279.492 [7] training’s l2: 194.45 valid_1’s l2: 279.365 [8] training’s l2: 194.35 valid_1’s l2: 279.249 [9] training’s l2: 194.26 valid_1’s l2: 279.134 [10] training’s l2: 194.155 valid_1’s l2: 279.047 [11] training’s l2: 194.046 valid_1’s l2: 278.883 [12] training’s l2: 193.981 valid_1’s l2: 278.842 [13] training’s l2: 193.877 valid_1’s l2: 278.717 [14] training’s l2: 193.748 valid_1’s l2: 278.559 [15] training’s l2: 193.676 valid_1’s l2: 278.404 [16] training’s l2: 193.598 valid_1’s l2: 278.299 [17] training’s l2: 193.477 valid_1’s l2: 278.148 [18] training’s l2: 193.374 valid_1’s l2: 278.014 [19] training’s l2: 193.276 valid_1’s l2: 277.882 [20] training’s l2: 193.141 valid_1’s l2: 277.689 [21] training’s l2: 193.007 valid_1’s l2: 277.505 [22] training’s l2: 192.905 valid_1’s l2: 277.373 [23] training’s l2: 192.804 valid_1’s l2: 277.238 [24] training’s l2: 192.703 valid_1’s l2: 277.143 [25] training’s l2: 192.666 valid_1’s l2: 277.124 [26] training’s l2: 192.528 valid_1’s l2: 276.913 [27] training’s l2: 192.461 valid_1’s l2: 276.869 [28] training’s l2: 192.355 valid_1’s l2: 276.732 [29] training’s l2: 192.254 valid_1’s l2: 276.602 [30] training’s l2: 192.146 valid_1’s l2: 276.456 [31] training’s l2: 192.073 valid_1’s l2: 276.422 [32] training’s l2: 191.949 valid_1’s l2: 276.267 [33] training’s l2: 191.885 valid_1’s l2: 276.221 [34] training’s l2: 191.8 valid_1’s l2: 276.156 [35] training’s l2: 191.677 valid_1’s l2: 276.001 [36] training’s l2: 191.612 valid_1’s l2: 275.945 [37] training’s l2: 191.508 valid_1’s l2: 275.797 [38] training’s l2: 191.383 valid_1’s 
l2: 275.625 [39] training’s l2: 191.27 valid_1’s l2: 275.7 [40] training’s l2: 191.195 valid_1’s l2: 275.583 [41] training’s l2: 191.064 valid_1’s l2: 275.428 [42] training’s l2: 190.955 valid_1’s l2: 275.29 [43] training’s l2: 190.863 valid_1’s l2: 275.154 [44] training’s l2: 190.761 valid_1’s l2: 275.046 [45] training’s l2: 190.702 valid_1’s l2: 274.988 [46] training’s l2: 190.594 valid_1’s l2: 274.862 [47] training’s l2: 190.479 valid_1’s l2: 274.718 [48] training’s l2: 190.388 valid_1’s l2: 274.578 [49] training’s l2: 190.294 valid_1’s l2: 274.456 [50] training’s l2: 190.157 valid_1’s l2: 274.288 Did not meet early stopping. Best iteration is: [50] training’s l2: 190.157 valid_1’s l2: 274.288 [1] training’s l2: 194.37 valid_1’s l2: 284.102 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 194.246 valid_1’s l2: 283.9 [3] training’s l2: 194.114 valid_1’s l2: 283.698 [4] training’s l2: 193.983 valid_1’s l2: 283.493 [5] training’s l2: 193.852 valid_1’s l2: 283.289 [6] training’s l2: 193.726 valid_1’s l2: 283.115 [7] training’s l2: 193.6 valid_1’s l2: 282.941 [8] training’s l2: 193.475 valid_1’s l2: 282.751 [9] training’s l2: 193.349 valid_1’s l2: 282.578 [10] training’s l2: 193.231 valid_1’s l2: 282.38 [11] training’s l2: 193.108 valid_1’s l2: 282.174 [12] training’s l2: 192.985 valid_1’s l2: 282.31 [13] training’s l2: 192.862 valid_1’s l2: 282.135 [14] training’s l2: 192.738 valid_1’s l2: 282.266 [15] training’s l2: 192.616 valid_1’s l2: 282.092 [16] training’s l2: 192.488 valid_1’s l2: 281.922 [17] training’s l2: 192.361 valid_1’s l2: 281.753 [18] training’s l2: 192.234 valid_1’s l2: 281.584 [19] training’s l2: 192.125 valid_1’s l2: 281.438 [20] training’s l2: 192.011 valid_1’s l2: 281.263 [21] training’s l2: 191.897 valid_1’s l2: 281.053 [22] training’s l2: 191.775 valid_1’s l2: 280.87 [23] training’s l2: 191.651 valid_1’s l2: 281.013 [24] training’s l2: 191.521 valid_1’s l2: 280.814 [25] training’s l2: 191.41 valid_1’s l2: 280.739 
[26] training’s l2: 191.291 valid_1’s l2: 280.578 [27] training’s l2: 191.169 valid_1’s l2: 280.397 [28] training’s l2: 191.046 valid_1’s l2: 280.204 [29] training’s l2: 190.92 valid_1’s l2: 280.007 [30] training’s l2: 190.802 valid_1’s l2: 279.846 [31] training’s l2: 190.679 valid_1’s l2: 279.973 [32] training’s l2: 190.556 valid_1’s l2: 279.779 [33] training’s l2: 190.432 valid_1’s l2: 279.586 [34] training’s l2: 190.309 valid_1’s l2: 279.393 [35] training’s l2: 190.186 valid_1’s l2: 279.201 [36] training’s l2: 190.067 valid_1’s l2: 279.344 [37] training’s l2: 189.946 valid_1’s l2: 279.486 [38] training’s l2: 189.824 valid_1’s l2: 279.302 [39] training’s l2: 189.704 valid_1’s l2: 279.109 [40] training’s l2: 189.586 valid_1’s l2: 279.252 [41] training’s l2: 189.458 valid_1’s l2: 279.061 [42] training’s l2: 189.344 valid_1’s l2: 278.971 [43] training’s l2: 189.222 valid_1’s l2: 278.802 [44] training’s l2: 189.098 valid_1’s l2: 278.613 [45] training’s l2: 188.97 valid_1’s l2: 278.423 [46] training’s l2: 188.844 valid_1’s l2: 278.204 [47] training’s l2: 188.719 valid_1’s l2: 277.985 [48] training’s l2: 188.6 valid_1’s l2: 277.811 [49] training’s l2: 188.487 valid_1’s l2: 277.661 [50] training’s l2: 188.373 valid_1’s l2: 277.512 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 188.373 valid_1’s l2: 277.512 [1] training’s l2: 198.317 valid_1’s l2: 265.053 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 198.183 valid_1’s l2: 264.872 [3] training’s l2: 198.048 valid_1’s l2: 264.698 [4] training’s l2: 197.919 valid_1’s l2: 264.527 [5] training’s l2: 197.791 valid_1’s l2: 264.357 [6] training’s l2: 197.653 valid_1’s l2: 264.212 [7] training’s l2: 197.514 valid_1’s l2: 264.059 [8] training’s l2: 197.378 valid_1’s l2: 263.916 [9] training’s l2: 197.239 valid_1’s l2: 263.763 [10] training’s l2: 197.106 valid_1’s l2: 263.59 [11] training’s l2: 196.972 valid_1’s l2: 263.423 [12] training’s l2: 196.837 valid_1’s l2: 263.275 [13] training’s l2: 196.7 valid_1’s l2: 263.108 [14] training’s l2: 196.566 valid_1’s l2: 262.961 [15] training’s l2: 196.432 valid_1’s l2: 262.814 [16] training’s l2: 196.293 valid_1’s l2: 262.625 [17] training’s l2: 196.155 valid_1’s l2: 262.437 [18] training’s l2: 196.016 valid_1’s l2: 262.25 [19] training’s l2: 195.891 valid_1’s l2: 262.105 [20] training’s l2: 195.761 valid_1’s l2: 261.961 [21] training’s l2: 195.627 valid_1’s l2: 261.8 [22] training’s l2: 195.496 valid_1’s l2: 261.654 [23] training’s l2: 195.366 valid_1’s l2: 261.507 [24] training’s l2: 195.237 valid_1’s l2: 261.345 [25] training’s l2: 195.122 valid_1’s l2: 261.205 [26] training’s l2: 194.987 valid_1’s l2: 261.032 [27] training’s l2: 194.855 valid_1’s l2: 260.867 [28] training’s l2: 194.726 valid_1’s l2: 260.713 [29] training’s l2: 194.597 valid_1’s l2: 260.558 [30] training’s l2: 194.464 valid_1’s l2: 260.388 [31] training’s l2: 194.333 valid_1’s l2: 260.223 [32] training’s l2: 194.206 valid_1’s l2: 260.064 [33] training’s l2: 194.08 valid_1’s l2: 259.905 [34] training’s l2: 193.954 valid_1’s l2: 259.747 [35] training’s l2: 193.819 valid_1’s l2: 259.569 [36] training’s l2: 193.691 valid_1’s l2: 259.419 [37] training’s l2: 193.564 valid_1’s l2: 259.238 [38] training’s l2: 193.438 
valid_1’s l2: 259.081 [39] training’s l2: 193.319 valid_1’s l2: 258.929 [40] training’s l2: 193.201 valid_1’s l2: 258.778 [41] training’s l2: 193.067 valid_1’s l2: 258.589 [42] training’s l2: 192.951 valid_1’s l2: 258.422 [43] training’s l2: 192.82 valid_1’s l2: 258.226 [44] training’s l2: 192.682 valid_1’s l2: 258.032 [45] training’s l2: 192.548 valid_1’s l2: 257.852 [46] training’s l2: 192.416 valid_1’s l2: 257.682 [47] training’s l2: 192.285 valid_1’s l2: 257.512 [48] training’s l2: 192.156 valid_1’s l2: 257.324 [49] training’s l2: 192.028 valid_1’s l2: 257.138 [50] training’s l2: 191.9 valid_1’s l2: 256.95 Did not meet early stopping. Best iteration is: [50] training’s l2: 191.9 valid_1’s l2: 256.95 [1] training’s l2: 205.243 valid_1’s l2: 198.638 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 205.096 valid_1’s l2: 198.5 [3] training’s l2: 204.952 valid_1’s l2: 198.371 [4] training’s l2: 204.812 valid_1’s l2: 198.245 [5] training’s l2: 204.672 valid_1’s l2: 198.119 [6] training’s l2: 204.528 valid_1’s l2: 197.982 [7] training’s l2: 204.384 valid_1’s l2: 197.846 [8] training’s l2: 204.242 valid_1’s l2: 197.707 [9] training’s l2: 204.099 valid_1’s l2: 197.571 [10] training’s l2: 203.961 valid_1’s l2: 197.443 [11] training’s l2: 203.817 valid_1’s l2: 197.3 [12] training’s l2: 203.675 valid_1’s l2: 197.167 [13] training’s l2: 203.533 valid_1’s l2: 197.033 [14] training’s l2: 203.392 valid_1’s l2: 196.9 [15] training’s l2: 203.251 valid_1’s l2: 196.767 [16] training’s l2: 203.105 valid_1’s l2: 196.636 [17] training’s l2: 202.959 valid_1’s l2: 196.504 [18] training’s l2: 202.813 valid_1’s l2: 196.373 [19] training’s l2: 202.684 valid_1’s l2: 196.255 [20] training’s l2: 202.546 valid_1’s l2: 196.132 [21] training’s l2: 202.41 valid_1’s l2: 196.011 [22] training’s l2: 202.269 valid_1’s l2: 195.891 [23] training’s l2: 202.126 valid_1’s l2: 195.772 [24] training’s l2: 201.982 valid_1’s l2: 195.639 [25] training’s l2: 201.856 valid_1’s l2: 
195.524 [26] training’s l2: 201.717 valid_1’s l2: 195.403 [27] training’s l2: 201.577 valid_1’s l2: 195.288 [28] training’s l2: 201.441 valid_1’s l2: 195.169 [29] training’s l2: 201.306 valid_1’s l2: 195.047 [30] training’s l2: 201.167 valid_1’s l2: 194.927 [31] training’s l2: 201.032 valid_1’s l2: 194.802 [32] training’s l2: 200.898 valid_1’s l2: 194.682 [33] training’s l2: 200.764 valid_1’s l2: 194.562 [34] training’s l2: 200.631 valid_1’s l2: 194.442 [35] training’s l2: 200.494 valid_1’s l2: 194.328 [36] training’s l2: 200.358 valid_1’s l2: 194.202 [37] training’s l2: 200.223 valid_1’s l2: 194.076 [38] training’s l2: 200.082 valid_1’s l2: 193.933 [39] training’s l2: 199.951 valid_1’s l2: 193.815 [40] training’s l2: 199.821 valid_1’s l2: 193.699 [41] training’s l2: 199.684 valid_1’s l2: 193.571 [42] training’s l2: 199.56 valid_1’s l2: 193.456 [43] training’s l2: 199.427 valid_1’s l2: 193.336 [44] training’s l2: 199.297 valid_1’s l2: 193.224 [45] training’s l2: 199.158 valid_1’s l2: 193.083 [46] training’s l2: 199.022 valid_1’s l2: 192.948 [47] training’s l2: 198.884 valid_1’s l2: 192.823 [48] training’s l2: 198.745 valid_1’s l2: 192.709 [49] training’s l2: 198.607 valid_1’s l2: 192.591 [50] training’s l2: 198.469 valid_1’s l2: 192.477 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 198.469 valid_1’s l2: 192.477 [1] training’s l2: 193.905 valid_1’s l2: 286.204 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 193.782 valid_1’s l2: 286.004 [3] training’s l2: 193.657 valid_1’s l2: 285.821 [4] training’s l2: 193.532 valid_1’s l2: 285.626 [5] training’s l2: 193.407 valid_1’s l2: 285.432 [6] training’s l2: 193.283 valid_1’s l2: 285.253 [7] training’s l2: 193.158 valid_1’s l2: 285.076 [8] training’s l2: 193.032 valid_1’s l2: 284.902 [9] training’s l2: 192.908 valid_1’s l2: 284.724 [10] training’s l2: 192.79 valid_1’s l2: 284.542 [11] training’s l2: 192.656 valid_1’s l2: 284.33 [12] training’s l2: 192.519 valid_1’s l2: 284.113 [13] training’s l2: 192.382 valid_1’s l2: 283.897 [14] training’s l2: 192.245 valid_1’s l2: 283.678 [15] training’s l2: 192.108 valid_1’s l2: 283.463 [16] training’s l2: 191.979 valid_1’s l2: 283.269 [17] training’s l2: 191.849 valid_1’s l2: 283.076 [18] training’s l2: 191.72 valid_1’s l2: 282.883 [19] training’s l2: 191.607 valid_1’s l2: 282.709 [20] training’s l2: 191.486 valid_1’s l2: 282.515 [21] training’s l2: 191.366 valid_1’s l2: 282.319 [22] training’s l2: 191.239 valid_1’s l2: 282.121 [23] training’s l2: 191.114 valid_1’s l2: 281.919 [24] training’s l2: 190.989 valid_1’s l2: 281.732 [25] training’s l2: 190.88 valid_1’s l2: 281.573 [26] training’s l2: 190.747 valid_1’s l2: 281.387 [27] training’s l2: 190.621 valid_1’s l2: 281.189 [28] training’s l2: 190.492 valid_1’s l2: 280.988 [29] training’s l2: 190.367 valid_1’s l2: 280.811 [30] training’s l2: 190.236 valid_1’s l2: 280.627 [31] training’s l2: 190.12 valid_1’s l2: 280.433 [32] training’s l2: 190.004 valid_1’s l2: 280.279 [33] training’s l2: 189.889 valid_1’s l2: 280.125 [34] training’s l2: 189.775 valid_1’s l2: 279.971 [35] training’s l2: 189.657 valid_1’s l2: 279.774 [36] training’s l2: 189.542 valid_1’s l2: 279.582 [37] training’s l2: 189.421 valid_1’s l2: 279.407 [38] training’s l2: 189.294 
valid_1’s l2: 279.218 [39] training’s l2: 189.185 valid_1’s l2: 279.027 [40] training’s l2: 189.075 valid_1’s l2: 278.839 [41] training’s l2: 188.965 valid_1’s l2: 278.676 [42] training’s l2: 188.861 valid_1’s l2: 278.55 [43] training’s l2: 188.746 valid_1’s l2: 278.385 [44] training’s l2: 188.635 valid_1’s l2: 278.243 [45] training’s l2: 188.516 valid_1’s l2: 278.084 [46] training’s l2: 188.386 valid_1’s l2: 277.892 [47] training’s l2: 188.264 valid_1’s l2: 277.731 [48] training’s l2: 188.135 valid_1’s l2: 277.538 [49] training’s l2: 188.012 valid_1’s l2: 277.343 [50] training’s l2: 187.889 valid_1’s l2: 277.15 Did not meet early stopping. Best iteration is: [50] training’s l2: 187.889 valid_1’s l2: 277.15 [1] training’s l2: 195.039 valid_1’s l2: 280.07 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 194.904 valid_1’s l2: 279.904 [3] training’s l2: 194.761 valid_1’s l2: 279.728 [4] training’s l2: 194.623 valid_1’s l2: 279.542 [5] training’s l2: 194.486 valid_1’s l2: 279.357 [6] training’s l2: 194.358 valid_1’s l2: 279.542 [7] training’s l2: 194.228 valid_1’s l2: 279.39 [8] training’s l2: 194.098 valid_1’s l2: 279.239 [9] training’s l2: 193.971 valid_1’s l2: 279.424 [10] training’s l2: 193.841 valid_1’s l2: 279.577 [11] training’s l2: 193.702 valid_1’s l2: 279.385 [12] training’s l2: 193.565 valid_1’s l2: 279.186 [13] training’s l2: 193.427 valid_1’s l2: 278.988 [14] training’s l2: 193.29 valid_1’s l2: 278.79 [15] training’s l2: 193.155 valid_1’s l2: 278.586 [16] training’s l2: 193.013 valid_1’s l2: 278.389 [17] training’s l2: 192.872 valid_1’s l2: 278.192 [18] training’s l2: 192.73 valid_1’s l2: 277.995 [19] training’s l2: 192.603 valid_1’s l2: 277.857 [20] training’s l2: 192.469 valid_1’s l2: 277.663 [21] training’s l2: 192.339 valid_1’s l2: 277.802 [22] training’s l2: 192.208 valid_1’s l2: 277.616 [23] training’s l2: 192.076 valid_1’s l2: 277.435 [24] training’s l2: 191.944 valid_1’s l2: 277.279 [25] training’s l2: 191.831 
valid_1’s l2: 277.128 [26] training’s l2: 191.695 valid_1’s l2: 276.926 [27] training’s l2: 191.553 valid_1’s l2: 276.737 [28] training’s l2: 191.417 valid_1’s l2: 276.557 [29] training’s l2: 191.28 valid_1’s l2: 276.377 [30] training’s l2: 191.145 valid_1’s l2: 276.176 [31] training’s l2: 191.016 valid_1’s l2: 275.989 [32] training’s l2: 190.896 valid_1’s l2: 276.166 [33] training’s l2: 190.776 valid_1’s l2: 276.343 [34] training’s l2: 190.657 valid_1’s l2: 276.52 [35] training’s l2: 190.521 valid_1’s l2: 276.327 [36] training’s l2: 190.391 valid_1’s l2: 276.45 [37] training’s l2: 190.26 valid_1’s l2: 276.268 [38] training’s l2: 190.13 valid_1’s l2: 276.08 [39] training’s l2: 190.002 valid_1’s l2: 275.884 [40] training’s l2: 189.873 valid_1’s l2: 276.007 [41] training’s l2: 189.746 valid_1’s l2: 275.831 [42] training’s l2: 189.629 valid_1’s l2: 275.687 [43] training’s l2: 189.506 valid_1’s l2: 275.527 [44] training’s l2: 189.38 valid_1’s l2: 275.375 [45] training’s l2: 189.247 valid_1’s l2: 275.219 [46] training’s l2: 189.111 valid_1’s l2: 275.056 [47] training’s l2: 188.975 valid_1’s l2: 274.893 [48] training’s l2: 188.843 valid_1’s l2: 274.712 [49] training’s l2: 188.713 valid_1’s l2: 274.541 [50] training’s l2: 188.582 valid_1’s l2: 274.631 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 188.582 valid_1’s l2: 274.631 [1] training’s l2: 194.37 valid_1’s l2: 284.102 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 194.238 valid_1’s l2: 283.897 [3] training’s l2: 194.106 valid_1’s l2: 283.695 [4] training’s l2: 193.975 valid_1’s l2: 283.493 [5] training’s l2: 193.844 valid_1’s l2: 283.289 [6] training’s l2: 193.718 valid_1’s l2: 283.115 [7] training’s l2: 193.592 valid_1’s l2: 282.941 [8] training’s l2: 193.466 valid_1’s l2: 282.767 [9] training’s l2: 193.34 valid_1’s l2: 282.594 [10] training’s l2: 193.215 valid_1’s l2: 282.421 [11] training’s l2: 193.09 valid_1’s l2: 282.204 [12] training’s l2: 192.967 valid_1’s l2: 282.34 [13] training’s l2: 192.843 valid_1’s l2: 282.471 [14] training’s l2: 192.72 valid_1’s l2: 282.602 [15] training’s l2: 192.597 valid_1’s l2: 282.428 [16] training’s l2: 192.47 valid_1’s l2: 282.258 [17] training’s l2: 192.343 valid_1’s l2: 282.089 [18] training’s l2: 192.216 valid_1’s l2: 281.92 [19] training’s l2: 192.083 valid_1’s l2: 281.723 [20] training’s l2: 191.957 valid_1’s l2: 281.555 [21] training’s l2: 191.83 valid_1’s l2: 281.358 [22] training’s l2: 191.703 valid_1’s l2: 281.161 [23] training’s l2: 191.577 valid_1’s l2: 280.964 [24] training’s l2: 191.447 valid_1’s l2: 280.766 [25] training’s l2: 191.321 valid_1’s l2: 280.57 [26] training’s l2: 191.202 valid_1’s l2: 280.408 [27] training’s l2: 191.081 valid_1’s l2: 280.228 [28] training’s l2: 190.957 valid_1’s l2: 280.035 [29] training’s l2: 190.836 valid_1’s l2: 279.855 [30] training’s l2: 190.715 valid_1’s l2: 279.676 [31] training’s l2: 190.592 valid_1’s l2: 279.803 [32] training’s l2: 190.469 valid_1’s l2: 279.609 [33] training’s l2: 190.345 valid_1’s l2: 279.417 [34] training’s l2: 190.223 valid_1’s l2: 279.224 [35] training’s l2: 190.1 valid_1’s l2: 279.032 [36] training’s l2: 189.979 valid_1’s l2: 279.175 [37] training’s l2: 189.859 valid_1’s l2: 279.318 [38] training’s l2: 189.735 valid_1’s 
l2: 279.131 [39] training’s l2: 189.614 valid_1’s l2: 279.274 [40] training’s l2: 189.494 valid_1’s l2: 279.416 [41] training’s l2: 189.372 valid_1’s l2: 279.226 [42] training’s l2: 189.249 valid_1’s l2: 279.354 [43] training’s l2: 189.127 valid_1’s l2: 279.165 [44] training’s l2: 189.006 valid_1’s l2: 278.976 [45] training’s l2: 188.884 valid_1’s l2: 279.104 [46] training’s l2: 188.766 valid_1’s l2: 278.919 [47] training’s l2: 188.64 valid_1’s l2: 278.695 [48] training’s l2: 188.522 valid_1’s l2: 278.51 [49] training’s l2: 188.405 valid_1’s l2: 278.326 [50] training’s l2: 188.287 valid_1’s l2: 278.143 Did not meet early stopping. Best iteration is: [50] training’s l2: 188.287 valid_1’s l2: 278.143 [1] training’s l2: 198.317 valid_1’s l2: 265.053 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 198.181 valid_1’s l2: 264.867 [3] training’s l2: 198.046 valid_1’s l2: 264.692 [4] training’s l2: 197.911 valid_1’s l2: 264.517 [5] training’s l2: 197.775 valid_1’s l2: 264.332 [6] training’s l2: 197.636 valid_1’s l2: 264.179 [7] training’s l2: 197.497 valid_1’s l2: 264.026 [8] training’s l2: 197.361 valid_1’s l2: 263.882 [9] training’s l2: 197.222 valid_1’s l2: 263.73 [10] training’s l2: 197.085 valid_1’s l2: 263.578 [11] training’s l2: 196.945 valid_1’s l2: 263.415 [12] training’s l2: 196.811 valid_1’s l2: 263.267 [13] training’s l2: 196.674 valid_1’s l2: 263.1 [14] training’s l2: 196.538 valid_1’s l2: 262.934 [15] training’s l2: 196.401 valid_1’s l2: 262.768 [16] training’s l2: 196.263 valid_1’s l2: 262.58 [17] training’s l2: 196.124 valid_1’s l2: 262.392 [18] training’s l2: 195.986 valid_1’s l2: 262.204 [19] training’s l2: 195.85 valid_1’s l2: 262.044 [20] training’s l2: 195.712 valid_1’s l2: 261.857 [21] training’s l2: 195.581 valid_1’s l2: 261.711 [22] training’s l2: 195.45 valid_1’s l2: 261.564 [23] training’s l2: 195.32 valid_1’s l2: 261.418 [24] training’s l2: 195.186 valid_1’s l2: 261.258 [25] training’s l2: 195.056 valid_1’s l2: 
261.113 [26] training’s l2: 194.921 valid_1’s l2: 260.939 [27] training’s l2: 194.788 valid_1’s l2: 260.758 [28] training’s l2: 194.659 valid_1’s l2: 260.604 [29] training’s l2: 194.525 valid_1’s l2: 260.432 [30] training’s l2: 194.392 valid_1’s l2: 260.263 [31] training’s l2: 194.261 valid_1’s l2: 260.097 [32] training’s l2: 194.126 valid_1’s l2: 259.918 [33] training’s l2: 193.991 valid_1’s l2: 259.739 [34] training’s l2: 193.857 valid_1’s l2: 259.561 [35] training’s l2: 193.722 valid_1’s l2: 259.383 [36] training’s l2: 193.593 valid_1’s l2: 259.212 [37] training’s l2: 193.465 valid_1’s l2: 259.041 [38] training’s l2: 193.336 valid_1’s l2: 258.889 [39] training’s l2: 193.208 valid_1’s l2: 258.719 [40] training’s l2: 193.079 valid_1’s l2: 258.549 [41] training’s l2: 192.941 valid_1’s l2: 258.354 [42] training’s l2: 192.81 valid_1’s l2: 258.159 [43] training’s l2: 192.672 valid_1’s l2: 257.964 [44] training’s l2: 192.534 valid_1’s l2: 257.77 [45] training’s l2: 192.404 valid_1’s l2: 257.576 [46] training’s l2: 192.274 valid_1’s l2: 257.408 [47] training’s l2: 192.144 valid_1’s l2: 257.257 [48] training’s l2: 192.014 valid_1’s l2: 257.089 [49] training’s l2: 191.884 valid_1’s l2: 256.922 [50] training’s l2: 191.755 valid_1’s l2: 256.755 Did not meet early stopping. 
Best iteration is: [50] training’s l2: 191.755 valid_1’s l2: 256.755 [1] training’s l2: 205.243 valid_1’s l2: 198.638 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 205.096 valid_1’s l2: 198.5 [3] training’s l2: 204.952 valid_1’s l2: 198.371 [4] training’s l2: 204.807 valid_1’s l2: 198.243 [5] training’s l2: 204.661 valid_1’s l2: 198.105 [6] training’s l2: 204.516 valid_1’s l2: 197.969 [7] training’s l2: 204.373 valid_1’s l2: 197.832 [8] training’s l2: 204.229 valid_1’s l2: 197.696 [9] training’s l2: 204.086 valid_1’s l2: 197.56 [10] training’s l2: 203.943 valid_1’s l2: 197.424 [11] training’s l2: 203.799 valid_1’s l2: 197.3 [12] training’s l2: 203.658 valid_1’s l2: 197.167 [13] training’s l2: 203.516 valid_1’s l2: 197.033 [14] training’s l2: 203.375 valid_1’s l2: 196.9 [15] training’s l2: 203.234 valid_1’s l2: 196.768 [16] training’s l2: 203.088 valid_1’s l2: 196.636 [17] training’s l2: 202.942 valid_1’s l2: 196.505 [18] training’s l2: 202.797 valid_1’s l2: 196.374 [19] training’s l2: 202.655 valid_1’s l2: 196.251 [20] training’s l2: 202.51 valid_1’s l2: 196.12 [21] training’s l2: 202.366 valid_1’s l2: 196.005 [22] training’s l2: 202.222 valid_1’s l2: 195.89 [23] training’s l2: 202.078 valid_1’s l2: 195.775 [24] training’s l2: 201.938 valid_1’s l2: 195.642 [25] training’s l2: 201.795 valid_1’s l2: 195.528 [26] training’s l2: 201.656 valid_1’s l2: 195.407 [27] training’s l2: 201.517 valid_1’s l2: 195.293 [28] training’s l2: 201.381 valid_1’s l2: 195.174 [29] training’s l2: 201.242 valid_1’s l2: 195.06 [30] training’s l2: 201.103 valid_1’s l2: 194.946 [31] training’s l2: 200.967 valid_1’s l2: 194.82 [32] training’s l2: 200.83 valid_1’s l2: 194.705 [33] training’s l2: 200.693 valid_1’s l2: 194.591 [34] training’s l2: 200.557 valid_1’s l2: 194.476 [35] training’s l2: 200.421 valid_1’s l2: 194.362 [36] training’s l2: 200.276 valid_1’s l2: 194.231 [37] training’s l2: 200.131 valid_1’s l2: 194.1 [38] training’s l2: 199.99 valid_1’s l2: 
193.965 [39] training’s l2: 199.846 valid_1’s l2: 193.834 [40] training’s l2: 199.702 valid_1’s l2: 193.704 [41] training’s l2: 199.57 valid_1’s l2: 193.591 [42] training’s l2: 199.436 valid_1’s l2: 193.473 [43] training’s l2: 199.304 valid_1’s l2: 193.36 [44] training’s l2: 199.176 valid_1’s l2: 193.257 [45] training’s l2: 199.042 valid_1’s l2: 193.139 [46] training’s l2: 198.904 valid_1’s l2: 193.025 [47] training’s l2: 198.765 valid_1’s l2: 192.908 [48] training’s l2: 198.627 valid_1’s l2: 192.794 [49] training’s l2: 198.489 valid_1’s l2: 192.676 [50] training’s l2: 198.351 valid_1’s l2: 192.563 Did not meet early stopping. Best iteration is: [50] training’s l2: 198.351 valid_1’s l2: 192.563 [1] training’s l2: 193.905 valid_1’s l2: 286.206 Training until validation scores don’t improve for 10 rounds [2] training’s l2: 193.769 valid_1’s l2: 286.029 [3] training’s l2: 193.644 valid_1’s l2: 285.846 [4] training’s l2: 193.519 valid_1’s l2: 285.666 [5] training’s l2: 193.383 valid_1’s l2: 285.489 [6] training’s l2: 193.258 valid_1’s l2: 285.31 [7] training’s l2: 193.133 valid_1’s l2: 285.133 [8] training’s l2: 193.009 valid_1’s l2: 284.955 [9] training’s l2: 192.885 valid_1’s l2: 284.778 [10] training’s l2: 192.761 valid_1’s l2: 284.601 [11] training’s l2: 192.623 valid_1’s l2: 284.387 [12] training’s l2: 192.486 valid_1’s l2: 284.17 [13] training’s l2: 192.349 valid_1’s l2: 283.951 [14] training’s l2: 192.211 valid_1’s l2: 283.732 [15] training’s l2: 192.074 valid_1’s l2: 283.514 [16] training’s l2: 191.945 valid_1’s l2: 283.32 [17] training’s l2: 191.816 valid_1’s l2: 283.127 [18] training’s l2: 191.687 valid_1’s l2: 282.934 [19] training’s l2: 191.552 valid_1’s l2: 282.729 [20] training’s l2: 191.424 valid_1’s l2: 282.536 [21] training’s l2: 191.297 valid_1’s l2: 282.338 [22] training’s l2: 191.17 valid_1’s l2: 282.141 [23] training’s l2: 191.044 valid_1’s l2: 281.943 [24] training’s l2: 190.919 valid_1’s l2: 281.756 [25] training’s l2: 190.793 valid_1’s l2: 
281.561 [26] training’s l2: 190.66 valid_1’s l2: 281.376 [27] training’s l2: 190.534 valid_1’s l2: 281.178 [28] training’s l2: 190.405 valid_1’s l2: 280.977 [29] training’s l2: 190.273 valid_1’s l2: 280.793 [30] training’s l2: 190.142 valid_1’s l2: 280.609 [31] training’s l2: 190.022 valid_1’s l2: 280.414 [32] training’s l2: 189.904 valid_1’s l2: 280.216 [33] training’s l2: 189.785 valid_1’s l2: 280.018 [34] training’s l2: 189.667 valid_1’s l2: 279.821 [35] training’s l2: 189.55 valid_1’s l2: 279.624 [36] training’s l2: 189.429 valid_1’s l2: 279.449 [37] training’s l2: 189.308 valid_1’s l2: 279.275 [38] training’s l2: 189.181 valid_1’s l2: 279.094 [39] training’s l2: 189.061 valid_1’s l2: 278.921 [40] training’s l2: 188.941 valid_1’s l2: 278.747 [41] training’s l2: 188.829 valid_1’s l2: 278.601 [42] training’s l2: 188.717 valid_1’s l2: 278.455 [43] training’s l2: 188.605 valid_1’s l2: 278.309 [44] training’s l2: 188.495 valid_1’s l2: 278.168 [45] training’s l2: 188.381 valid_1’s l2: 278.004 [46] training’s l2: 188.252 valid_1’s l2: 277.81 [47] training’s l2: 188.129 valid_1’s l2: 277.648 [48] training’s l2: 188 valid_1’s l2: 277.458 [49] training’s l2: 187.871 valid_1’s l2: 277.269 [50] training’s l2: 187.743 valid_1’s l2: 277.08 Did not meet early stopping. 
Best iteration is: [50] training's l2: 187.743 valid_1's l2: 277.08

(Per-iteration LightGBM logs condensed. Every fold in this excerpt printed "Training until validation scores don't improve for 10 rounds", trained the full 50 boosting rounds, ended with "Did not meet early stopping", and reported "Best iteration is: [50]". The final `valid_1` l2 at iteration [50], grouped by 5-fold repeat, was:)

| repeat | fold 1  | fold 2  | fold 3  | fold 4  | fold 5  |
|--------|---------|---------|---------|---------|---------|
| 1      | 277.080 | 272.294 | 278.218 | 256.803 | 192.608 |
| 2      | 277.063 | 273.249 | 278.246 | 256.770 | 192.628 |
| 3      | 277.102 | 272.982 | 278.246 | 256.770 | 192.634 |
| 4      | 277.106 | (log continues past this excerpt) | | | |

(Training-set l2 at iteration [50] ranged from 187.74 to 198.36 across these folds.)
valid_1’s l2: 277.357 [26] training’s l2: 191.604 valid_1’s l2: 277.168 [27] training’s l2: 191.462 valid_1’s l2: 276.979 [28] training’s l2: 191.322 valid_1’s l2: 276.791 [29] training’s l2: 191.181 valid_1’s l2: 276.603 [30] training’s l2: 191.041 valid_1’s l2: 276.415 [31] training’s l2: 190.904 valid_1’s l2: 276.222 [32] training’s l2: 190.768 valid_1’s l2: 276.029 [33] training’s l2: 190.632 valid_1’s l2: 275.837 [34] training’s l2: 190.497 valid_1’s l2: 275.645 [35] training’s l2: 190.361 valid_1’s l2: 275.453 [36] training’s l2: 190.228 valid_1’s l2: 275.276 [37] training’s l2: 190.094 valid_1’s l2: 275.099 [38] training’s l2: 189.961 valid_1’s l2: 274.922 [39] training’s l2: 189.829 valid_1’s l2: 274.746 [40] training’s l2: 189.696 valid_1’s l2: 274.57 [41] training’s l2: 189.57 valid_1’s l2: 274.418 [42] training’s l2: 189.444 valid_1’s l2: 274.267 [43] training’s l2: 189.318 valid_1’s l2: 274.116 [44] training’s l2: 189.193 valid_1’s l2: 273.965 [45] training’s l2: 189.068 valid_1’s l2: 273.814 [46] training’s l2: 188.938 valid_1’s l2: 273.644 [47] training’s l2: 188.808 valid_1’s l2: 273.479 [48] training’s l2: 188.678 valid_1’s l2: 273.31 [49] training’s l2: 188.55 valid_1’s l2: 273.145 [50] training’s l2: 188.42 valid_1’s l2: 272.976 Did not meet early stopping. Best iteration is: [50] training’s l2: 188.42 valid_1’s l2: 272.976
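The repeated "Did not meet early stopping" messages come from LightGBM's stopping rule: training halts only after 10 consecutive rounds without a new best validation score, and since every fold here keeps improving through round 50, the rule never fires. A minimal sketch of that logic in plain Python (a hypothetical `early_stopping` helper, not LightGBM's actual API):

```python
def early_stopping(valid_scores, patience=10):
    """Return the index of the best (lowest) validation score, stopping
    once `patience` consecutive rounds fail to improve on the best so far."""
    best_idx, best = 0, float("inf")
    for i, score in enumerate(valid_scores):
        if score < best:
            best_idx, best = i, score
        elif i - best_idx >= patience:
            break  # early stop: no improvement for `patience` rounds
    return best_idx

# Scores that keep improving (as in the logs above) never trigger the stop,
# so the best iteration is simply the last one.
scores = [258.0 - 0.2 * i for i in range(50)]
print(early_stopping(scores))  # → 49
```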

   feature_fraction  train logloss  eval logloss  test logloss
0              0.10       0.434106      0.880916      0.519100
1              0.50       0.489516      0.997952      0.647324
2              0.80       0.514985      0.992908      0.605194
3              0.90       0.611081      1.119046      0.651779
4              0.95       0.789516      1.239830      0.783058
5              1.00       0.840671      1.271269      0.782180
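The sweep above suggests that aggressive column subsampling (feature_fraction = 0.10) generalizes best here. A hedged sketch of the same experiment — using scikit-learn's GradientBoostingClassifier with `max_features` as a stand-in for LightGBM's `feature_fraction`, and toy data rather than the tournament features:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split

# Toy stand-in for train_df; the real notebook uses the merged NCAA features.
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

rows = []
for frac in [0.1, 0.5, 0.8, 1.0]:  # analogue of the feature_fraction grid above
    model = GradientBoostingClassifier(max_features=frac, random_state=0)
    model.fit(X_tr, y_tr)
    rows.append((frac,
                 log_loss(y_tr, model.predict_proba(X_tr)),
                 log_loss(y_te, model.predict_proba(X_te))))

for frac, tr, te in rows:
    print(f"max_features={frac:.2f}  train logloss={tr:.3f}  test logloss={te:.3f}")
```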

6.0.11 Local Validation


NameError                                 Traceback (most recent call last)

in ----> 1 log_loss(answer_loc_valid['Pred'], sub['Pred'])

NameError: name 'answer_loc_valid' is not defined
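The cell fails simply because `answer_loc_valid` was never created in this session. Once the local-validation answers are built, the check itself is a single sklearn call; a minimal sketch with hypothetical stand-in frames (note that `log_loss` expects the true labels as its first argument):

```python
import pandas as pd
from sklearn.metrics import log_loss

# Hypothetical stand-ins: `answer_loc_valid` holds the true outcomes (0/1)
# for the local-validation games, `sub` the predicted win probabilities.
answer_loc_valid = pd.DataFrame({"Pred": [1, 0, 1, 1, 0]})
sub = pd.DataFrame({"Pred": [0.9, 0.2, 0.7, 0.6, 0.1]})

# log_loss(y_true, y_pred): true labels first, probabilities second.
score = log_loss(answer_loc_valid["Pred"], sub["Pred"])
print(round(score, 4))  # → 0.2603
```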

Warning: Your Kaggle API key is readable by other users on this system! To fix this, you can run 'chmod 600 /Users/zhipengliang/.kaggle/kaggle.json'

2020-03-09 21:20:25,746 WARNING Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError(SSLError("bad handshake: SysCallError(54, 'ECONNRESET')"))': /api/v1/competitions/google-cloud-ncaa-march-madness-2020-division-1-mens-tournament/submissions/url/368658/1583759441
(two further retries, at 21:23:52 and 21:27:04, failed the same way)

Traceback (most recent call last): … OpenSSL.SSL.SysCallError: (54, 'ECONNRESET')

During handling of the above exception, another exception occurred:

Traceback (most recent call last): … ssl.SSLError: ("bad handshake: SysCallError(54, 'ECONNRESET')",)

During handling of the above exception, another exception occurred:

Traceback (most recent call last): … urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='www.kaggle.com', port=443): Max retries exceeded with url: /api/v1/competitions/google-cloud-ncaa-march-madness-2020-division-1-mens-tournament/submissions/url/368658/1583759441 (Caused by SSLError(SSLError("bad handshake: SysCallError(54, 'ECONNRESET')")))

In short, the kaggle CLI submission was aborted by repeated SSL handshake failures (ECONNRESET) before it ever reached Kaggle.

7 Feature Importance

pandas.core.series.Series

pandas.core.frame.DataFrame

D:\install\miniconda\lib\site-packages\xgboost\core.py:587: FutureWarning: Series.base is deprecated and will be removed in a future version
  if getattr(data, 'base', None) is not None and \
D:\install\miniconda\lib\site-packages\xgboost\core.py:588: FutureWarning: Series.base is deprecated and will be removed in a future version
  data.base is not None and isinstance(data, np.ndarray)

[20:33:07] WARNING: C:/Jenkins/workspace/xgboost-win64_release_0.90/src/objective/regression_obj.cu:152: reg:linear is now deprecated in favor of reg:squarederror.

XGBRegressor(base_score=0.5, booster='gbtree', colsample_bylevel=0.4, colsample_bynode=0.4, colsample_bytree=0.4, gamma=5, importance_type='gain', learning_rate=0.1, max_delta_step=0, max_depth=2, min_child_weight=10, missing=None, n_estimators=100, n_jobs=1, nthread=None, objective='reg:linear', random_state=0, reg_alpha=10, reg_lambda=10, scale_pos_weight=1, seed=123, silent=None, subsample=1, verbosity=1)

Train error: 9.151611486497108
Test error: 10.886869227642492
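The train/test gap above stays modest thanks to the heavily regularized settings (max_depth=2, min_child_weight=10, reg_alpha=reg_lambda=10). A rough sketch of measuring that gap — using scikit-learn's GradientBoostingRegressor on toy data as a stand-in for the XGBRegressor:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=15, noise=15.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Shallow trees + row subsampling, in the spirit of max_depth=2 / subsample above.
model = GradientBoostingRegressor(max_depth=2, subsample=0.8,
                                  learning_rate=0.1, n_estimators=100,
                                  random_state=0)
model.fit(X_tr, y_tr)
print("Train error:", mean_absolute_error(y_tr, model.predict(X_tr)))
print("Test error:", mean_absolute_error(y_te, model.predict(X_te)))
```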

(feature-importance plots)

We can see that strength ranks first.

[0.01222955 0.01099528 0.0051984 0.00428317 0.01178154 0.01417081 0.00612846 0.00996721 0.01794786 0.00857869 0.0096215 0.00591545 0.00695141 0.0034778 0.0129878 0.00573123 0.01062892 0.00319808 0.10284305 0.02298457 0.01307057 0.00425704 0.00468724 0.00815761 0.00833656 0.01476676 0.00987337 0.01617342 0.00750422 0.00495538 0.00469722 0.00778599 0.00603842 0.01360503 0.00415565 0.00488834 0.00912852 0.06675322 0.05344637 0.14447303 0.02388838 0.0275108 0.10994774 0.14627835]
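The raw `feature_importances_` array above is hard to read on its own; pairing it with column names and sorting makes the ranking obvious. A small sketch with hypothetical names and values (the real notebook has 44 engineered features):

```python
import numpy as np
import pandas as pd

# Hypothetical feature names and importances, for illustration only.
importances = np.array([0.012, 0.005, 0.146, 0.103, 0.067, 0.144])
names = ["seed_diff", "fgm_mean", "strength", "win_rate", "score_mean", "rank_diff"]

# Attach names to values, then sort descending to read off the top features.
ranked = pd.Series(importances, index=names).sort_values(ascending=False)
print(ranked.head(3))
```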

8 Linear vs. Tree linear?

The blackcellmagic extension is already loaded. To reload it, use: %reload_ext blackcellmagic

D:\install\miniconda\lib\site-packages\xgboost\core.py:587: FutureWarning: Series.base is deprecated and will be removed in a future version
  if getattr(data, 'base', None) is not None and \
D:\install\miniconda\lib\site-packages\xgboost\core.py:588: FutureWarning: Series.base is deprecated and will be removed in a future version
  data.base is not None and isinstance(data, np.ndarray)

[0]   train-mae:11.2448+0.0855384   test-mae:11.2587+0.163249
[50]  train-mae:7.43713+0.0840374   test-mae:7.93709+0.137658
[100] train-mae:6.99125+0.0716434   test-mae:7.91611+0.143742

Untuned mae: 7.899208

[0]   train-mae:10.3218+0.0774403   test-mae:10.3303+0.153266
[50]  train-mae:7.84504+0.0685376   test-mae:7.90051+0.139972
[100] train-mae:7.74014+0.0662995   test-mae:7.81881+0.150041
[150] train-mae:7.6965+0.0657856    test-mae:7.79258+0.157027
[199] train-mae:7.6694+0.065349     test-mae:7.77965+0.162464

Untuned mae: 7.779647
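The two `xgb.cv` runs above compare boosters under the same MAE metric. The same linear-vs-tree comparison can be sketched with scikit-learn's `cross_val_score` on toy data (Ridge standing in for the linear model, GradientBoostingRegressor for the trees):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=400, n_features=10, noise=10.0, random_state=0)

for name, model in [("linear", Ridge()),
                    ("tree", GradientBoostingRegressor(random_state=0))]:
    # Negated MAE (as with xgb.cv's test-mae above), averaged over 5 folds.
    mae = -cross_val_score(model, X, y,
                           scoring="neg_mean_absolute_error", cv=5).mean()
    print(f"{name}: CV mae = {mae:.3f}")
```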

9 Finding Outliers with an Auto-encoder

9.2 Define and train model
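The idea in this section: train a network to reconstruct its own input through a narrow bottleneck, then flag rows with large reconstruction error as outliers. A compact sketch of the technique, using scikit-learn's MLPRegressor fit X→X as a stand-in for the Keras model of §9.2:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))
X[:5] += 8.0  # plant a few obvious outliers

Xs = StandardScaler().fit_transform(X)

# A 3-unit hidden layer forces a compressed representation (the bottleneck).
ae = MLPRegressor(hidden_layer_sizes=(3,), max_iter=2000, random_state=0)
ae.fit(Xs, Xs)  # auto-encoder: target equals input

# Rows the bottleneck cannot reconstruct well are outlier candidates.
errors = ((ae.predict(Xs) - Xs) ** 2).mean(axis=1)
outliers = np.argsort(errors)[-5:]  # five highest-error rows
print(sorted(outliers.tolist()))
```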

Train on 1605 samples, validate on 402 samples
Epoch 1/1000: loss: 5254.8319 - accuracy: 0.0623 - val_loss: 5181.2046 - val_accuracy: 0.1517
(per-epoch Keras logs condensed: val_loss falls steadily — ≈ 1373 by epoch 13, ≈ 704 by epoch 50, ≈ 530 by epoch 105, and ≈ 471.3 with val_accuracy 0.4826 by epoch 156)
Epoch 157/1000 1605/1605
[==============================] - 0s 74us/step - loss: 505.6395 - accuracy: 0.4474 - val_loss: 471.4018 - val_accuracy: 0.4776 Epoch 158/1000 1605/1605 [==============================] - 0s 73us/step - loss: 502.0096 - accuracy: 0.4374 - val_loss: 470.3154 - val_accuracy: 0.4801 Epoch 159/1000 1605/1605 [==============================] - 0s 68us/step - loss: 502.9656 - accuracy: 0.4417 - val_loss: 478.2964 - val_accuracy: 0.4776 Epoch 160/1000 1605/1605 [==============================] - 0s 65us/step - loss: 500.3986 - accuracy: 0.4461 - val_loss: 469.0633 - val_accuracy: 0.4776 Epoch 161/1000 1605/1605 [==============================] - 0s 67us/step - loss: 495.1654 - accuracy: 0.4486 - val_loss: 467.7262 - val_accuracy: 0.4751 Epoch 162/1000 1605/1605 [==============================] - 0s 65us/step - loss: 495.4741 - accuracy: 0.4461 - val_loss: 470.4753 - val_accuracy: 0.4751 Epoch 163/1000 1605/1605 [==============================] - 0s 66us/step - loss: 501.4175 - accuracy: 0.4461 - val_loss: 468.4079 - val_accuracy: 0.4826 Epoch 164/1000 1605/1605 [==============================] - 0s 68us/step - loss: 498.5407 - accuracy: 0.4424 - val_loss: 467.4620 - val_accuracy: 0.4826 Epoch 165/1000 1605/1605 [==============================] - 0s 71us/step - loss: 493.9946 - accuracy: 0.4449 - val_loss: 465.8642 - val_accuracy: 0.4776 Epoch 166/1000 1605/1605 [==============================] - 0s 65us/step - loss: 498.9944 - accuracy: 0.4498 - val_loss: 467.4276 - val_accuracy: 0.4876 Epoch 167/1000 1605/1605 [==============================] - 0s 66us/step - loss: 502.7626 - accuracy: 0.4424 - val_loss: 465.9680 - val_accuracy: 0.4826 Epoch 168/1000 1605/1605 [==============================] - 0s 67us/step - loss: 497.2715 - accuracy: 0.4523 - val_loss: 465.0828 - val_accuracy: 0.4826 Epoch 169/1000 1605/1605 [==============================] - 0s 65us/step - loss: 504.0778 - accuracy: 0.4424 - val_loss: 465.0407 - val_accuracy: 0.4776 Epoch 170/1000 1605/1605 
[==============================] - 0s 65us/step - loss: 495.7306 - accuracy: 0.4380 - val_loss: 466.8593 - val_accuracy: 0.4776 Epoch 171/1000 1605/1605 [==============================] - 0s 67us/step - loss: 490.9914 - accuracy: 0.4486 - val_loss: 464.4773 - val_accuracy: 0.4801 Epoch 172/1000 1605/1605 [==============================] - 0s 64us/step - loss: 500.7666 - accuracy: 0.4511 - val_loss: 465.3403 - val_accuracy: 0.4801 Epoch 173/1000 1605/1605 [==============================] - 0s 66us/step - loss: 496.7647 - accuracy: 0.4505 - val_loss: 468.7081 - val_accuracy: 0.4851 Epoch 174/1000 1605/1605 [==============================] - 0s 66us/step - loss: 495.7143 - accuracy: 0.4505 - val_loss: 466.3542 - val_accuracy: 0.4876 Epoch 175/1000 1605/1605 [==============================] - 0s 100us/step - loss: 503.1656 - accuracy: 0.4492 - val_loss: 466.2004 - val_accuracy: 0.4876 Epoch 176/1000 1605/1605 [==============================] - 0s 83us/step - loss: 499.9817 - accuracy: 0.4455 - val_loss: 478.9427 - val_accuracy: 0.4826 Epoch 177/1000 1605/1605 [==============================] - 0s 90us/step - loss: 499.9938 - accuracy: 0.4492 - val_loss: 476.1168 - val_accuracy: 0.4876 Epoch 178/1000 1605/1605 [==============================] - 0s 91us/step - loss: 490.9068 - accuracy: 0.4498 - val_loss: 468.1833 - val_accuracy: 0.4876 Epoch 179/1000 1605/1605 [==============================] - 0s 70us/step - loss: 499.2529 - accuracy: 0.4405 - val_loss: 463.2941 - val_accuracy: 0.4851 Epoch 180/1000 1605/1605 [==============================] - 0s 68us/step - loss: 494.2119 - accuracy: 0.4361 - val_loss: 462.7210 - val_accuracy: 0.4751 Epoch 181/1000 1605/1605 [==============================] - 0s 73us/step - loss: 494.6881 - accuracy: 0.4393 - val_loss: 472.4657 - val_accuracy: 0.4876 Epoch 182/1000 1605/1605 [==============================] - 0s 89us/step - loss: 495.7019 - accuracy: 0.4498 - val_loss: 467.5540 - val_accuracy: 0.4876 Epoch 183/1000 1605/1605 
[==============================] - 0s 72us/step - loss: 494.6410 - accuracy: 0.4523 - val_loss: 462.4488 - val_accuracy: 0.4876 Epoch 184/1000 1605/1605 [==============================] - 0s 79us/step - loss: 492.7602 - accuracy: 0.4449 - val_loss: 462.7790 - val_accuracy: 0.4925 Epoch 185/1000 1605/1605 [==============================] - 0s 95us/step - loss: 494.7000 - accuracy: 0.4542 - val_loss: 463.2587 - val_accuracy: 0.4851 Epoch 186/1000 1605/1605 [==============================] - 0s 66us/step - loss: 499.9839 - accuracy: 0.4442 - val_loss: 464.6529 - val_accuracy: 0.4776 Epoch 187/1000 1605/1605 [==============================] - 0s 66us/step - loss: 494.5829 - accuracy: 0.4511 - val_loss: 462.1811 - val_accuracy: 0.4776 Epoch 188/1000 1605/1605 [==============================] - 0s 66us/step - loss: 497.2548 - accuracy: 0.4430 - val_loss: 462.1846 - val_accuracy: 0.4751 Epoch 189/1000 1605/1605 [==============================] - 0s 67us/step - loss: 496.0609 - accuracy: 0.4498 - val_loss: 463.3975 - val_accuracy: 0.4801 Epoch 190/1000 1605/1605 [==============================] - 0s 65us/step - loss: 494.3317 - accuracy: 0.4555 - val_loss: 459.8298 - val_accuracy: 0.4851 Epoch 191/1000 1605/1605 [==============================] - 0s 94us/step - loss: 495.8768 - accuracy: 0.4536 - val_loss: 463.2614 - val_accuracy: 0.4900 Epoch 192/1000 1605/1605 [==============================] - 0s 92us/step - loss: 488.7550 - accuracy: 0.4467 - val_loss: 457.2756 - val_accuracy: 0.4826 Epoch 193/1000 1605/1605 [==============================] - 0s 67us/step - loss: 489.8687 - accuracy: 0.4498 - val_loss: 456.1486 - val_accuracy: 0.4801 Epoch 194/1000 1605/1605 [==============================] - 0s 66us/step - loss: 486.3947 - accuracy: 0.4411 - val_loss: 456.8721 - val_accuracy: 0.4776 Epoch 195/1000 1605/1605 [==============================] - 0s 80us/step - loss: 492.1610 - accuracy: 0.4411 - val_loss: 458.7757 - val_accuracy: 0.4701 Epoch 196/1000 1605/1605 
[==============================] - 0s 90us/step - loss: 487.6861 - accuracy: 0.4505 - val_loss: 453.8355 - val_accuracy: 0.4900 Epoch 197/1000 1605/1605 [==============================] - 0s 73us/step - loss: 482.2239 - accuracy: 0.4436 - val_loss: 451.4695 - val_accuracy: 0.4851 Epoch 198/1000 1605/1605 [==============================] - 0s 66us/step - loss: 478.4485 - accuracy: 0.4536 - val_loss: 447.2115 - val_accuracy: 0.4801 Epoch 199/1000 1605/1605 [==============================] - 0s 68us/step - loss: 480.9605 - accuracy: 0.4511 - val_loss: 446.3257 - val_accuracy: 0.4776 Epoch 200/1000 1605/1605 [==============================] - 0s 66us/step - loss: 480.6099 - accuracy: 0.4548 - val_loss: 447.6130 - val_accuracy: 0.4851 Epoch 201/1000 1605/1605 [==============================] - 0s 67us/step - loss: 484.9316 - accuracy: 0.4498 - val_loss: 450.6756 - val_accuracy: 0.4801 Epoch 202/1000 1605/1605 [==============================] - 0s 67us/step - loss: 475.1847 - accuracy: 0.4505 - val_loss: 450.1184 - val_accuracy: 0.4851 Epoch 203/1000 1605/1605 [==============================] - 0s 66us/step - loss: 475.1830 - accuracy: 0.4492 - val_loss: 441.5646 - val_accuracy: 0.4801 Epoch 204/1000 1605/1605 [==============================] - 0s 66us/step - loss: 471.5694 - accuracy: 0.4517 - val_loss: 435.7072 - val_accuracy: 0.4851 Epoch 205/1000 1605/1605 [==============================] - 0s 65us/step - loss: 469.6004 - accuracy: 0.4492 - val_loss: 433.2795 - val_accuracy: 0.4751 Epoch 206/1000 1605/1605 [==============================] - 0s 66us/step - loss: 468.8019 - accuracy: 0.4555 - val_loss: 427.8190 - val_accuracy: 0.4776 Epoch 207/1000 1605/1605 [==============================] - 0s 67us/step - loss: 465.6645 - accuracy: 0.4567 - val_loss: 430.6283 - val_accuracy: 0.4801 Epoch 208/1000 1605/1605 [==============================] - 0s 66us/step - loss: 473.2208 - accuracy: 0.4474 - val_loss: 426.0095 - val_accuracy: 0.4726 Epoch 209/1000 1605/1605 
[==============================] - 0s 67us/step - loss: 460.9574 - accuracy: 0.4561 - val_loss: 426.7435 - val_accuracy: 0.4776 Epoch 210/1000 1605/1605 [==============================] - 0s 65us/step - loss: 463.7407 - accuracy: 0.4492 - val_loss: 422.0280 - val_accuracy: 0.4751 Epoch 211/1000 1605/1605 [==============================] - 0s 77us/step - loss: 453.5911 - accuracy: 0.4623 - val_loss: 422.4576 - val_accuracy: 0.4701 Epoch 212/1000 1605/1605 [==============================] - 0s 78us/step - loss: 461.4087 - accuracy: 0.4567 - val_loss: 422.4147 - val_accuracy: 0.4801 Epoch 213/1000 1605/1605 [==============================] - 0s 70us/step - loss: 456.0566 - accuracy: 0.4586 - val_loss: 419.9747 - val_accuracy: 0.4826 Epoch 214/1000 1605/1605 [==============================] - 0s 86us/step - loss: 468.2213 - accuracy: 0.4555 - val_loss: 427.2501 - val_accuracy: 0.4801 Epoch 215/1000 1605/1605 [==============================] - 0s 67us/step - loss: 460.7927 - accuracy: 0.4461 - val_loss: 420.6979 - val_accuracy: 0.4900 Epoch 216/1000 1605/1605 [==============================] - 0s 67us/step - loss: 455.7750 - accuracy: 0.4523 - val_loss: 423.8638 - val_accuracy: 0.4876 Epoch 217/1000 1605/1605 [==============================] - 0s 65us/step - loss: 458.4937 - accuracy: 0.4505 - val_loss: 425.3529 - val_accuracy: 0.4851 Epoch 218/1000 1605/1605 [==============================] - 0s 69us/step - loss: 456.9364 - accuracy: 0.4498 - val_loss: 420.5183 - val_accuracy: 0.4701 Epoch 219/1000 1605/1605 [==============================] - 0s 72us/step - loss: 463.9514 - accuracy: 0.4480 - val_loss: 422.4641 - val_accuracy: 0.4851 Epoch 220/1000 1605/1605 [==============================] - 0s 72us/step - loss: 463.1153 - accuracy: 0.4586 - val_loss: 420.9244 - val_accuracy: 0.4876 Epoch 221/1000 1605/1605 [==============================] - 0s 72us/step - loss: 459.8391 - accuracy: 0.4598 - val_loss: 420.0485 - val_accuracy: 0.4701 Epoch 222/1000 1605/1605 
[==============================] - 0s 80us/step - loss: 458.9651 - accuracy: 0.4573 - val_loss: 422.9332 - val_accuracy: 0.4801 Epoch 223/1000 1605/1605 [==============================] - 0s 79us/step - loss: 455.3613 - accuracy: 0.4517 - val_loss: 418.9547 - val_accuracy: 0.4751 Epoch 224/1000 1605/1605 [==============================] - 0s 69us/step - loss: 457.2214 - accuracy: 0.4579 - val_loss: 421.1668 - val_accuracy: 0.4751 Epoch 225/1000 1605/1605 [==============================] - 0s 68us/step - loss: 452.7036 - accuracy: 0.4536 - val_loss: 426.2106 - val_accuracy: 0.4776 Epoch 226/1000 1605/1605 [==============================] - 0s 69us/step - loss: 457.8949 - accuracy: 0.4511 - val_loss: 420.9654 - val_accuracy: 0.4751 Epoch 227/1000 1605/1605 [==============================] - 0s 66us/step - loss: 462.8947 - accuracy: 0.4461 - val_loss: 418.4921 - val_accuracy: 0.4801 Epoch 228/1000 1605/1605 [==============================] - 0s 65us/step - loss: 457.3072 - accuracy: 0.4480 - val_loss: 416.6498 - val_accuracy: 0.4751 Epoch 229/1000 1605/1605 [==============================] - 0s 66us/step - loss: 451.1638 - accuracy: 0.4592 - val_loss: 425.7111 - val_accuracy: 0.4776 Epoch 230/1000 1605/1605 [==============================] - 0s 67us/step - loss: 458.0404 - accuracy: 0.4561 - val_loss: 415.2072 - val_accuracy: 0.4776 Epoch 231/1000 1605/1605 [==============================] - 0s 67us/step - loss: 455.8623 - accuracy: 0.4536 - val_loss: 416.8936 - val_accuracy: 0.4876 Epoch 232/1000 1605/1605 [==============================] - 0s 99us/step - loss: 455.8067 - accuracy: 0.4642 - val_loss: 418.8144 - val_accuracy: 0.4851 Epoch 233/1000 1605/1605 [==============================] - 0s 117us/step - loss: 455.0675 - accuracy: 0.4505 - val_loss: 415.4091 - val_accuracy: 0.4776 Epoch 234/1000 1605/1605 [==============================] - 0s 132us/step - loss: 451.2636 - accuracy: 0.4604 - val_loss: 415.9759 - val_accuracy: 0.4876 Epoch 235/1000 1605/1605 
[==============================] - 0s 136us/step - loss: 453.8551 - accuracy: 0.4523 - val_loss: 417.6592 - val_accuracy: 0.4677 Epoch 236/1000 1605/1605 [==============================] - 0s 75us/step - loss: 452.5871 - accuracy: 0.4555 - val_loss: 417.6244 - val_accuracy: 0.4577 Epoch 237/1000 1605/1605 [==============================] - 0s 82us/step - loss: 453.9523 - accuracy: 0.4517 - val_loss: 415.5420 - val_accuracy: 0.4776 Epoch 238/1000 1605/1605 [==============================] - 0s 69us/step - loss: 463.7722 - accuracy: 0.4555 - val_loss: 414.9966 - val_accuracy: 0.4826 Epoch 239/1000 1605/1605 [==============================] - 0s 77us/step - loss: 462.3225 - accuracy: 0.4604 - val_loss: 419.4596 - val_accuracy: 0.4677 Epoch 240/1000 1605/1605 [==============================] - 0s 85us/step - loss: 454.7461 - accuracy: 0.4629 - val_loss: 416.8614 - val_accuracy: 0.4701 Epoch 241/1000 1605/1605 [==============================] - 0s 83us/step - loss: 453.1627 - accuracy: 0.4523 - val_loss: 416.1142 - val_accuracy: 0.4826 Epoch 242/1000 1605/1605 [==============================] - 0s 74us/step - loss: 451.2090 - accuracy: 0.4604 - val_loss: 416.3743 - val_accuracy: 0.4776 Epoch 243/1000 1605/1605 [==============================] - 0s 72us/step - loss: 456.2119 - accuracy: 0.4617 - val_loss: 414.3959 - val_accuracy: 0.4776 Epoch 244/1000 1605/1605 [==============================] - 0s 80us/step - loss: 455.7502 - accuracy: 0.4517 - val_loss: 415.8906 - val_accuracy: 0.4726 Epoch 245/1000 1605/1605 [==============================] - ETA: 0s - loss: 451.7363 - accuracy: 0.45 - 0s 95us/step - loss: 454.8591 - accuracy: 0.4492 - val_loss: 417.4744 - val_accuracy: 0.4701 Epoch 246/1000 1605/1605 [==============================] - 0s 85us/step - loss: 452.3653 - accuracy: 0.4498 - val_loss: 414.9141 - val_accuracy: 0.4701 Epoch 247/1000 1605/1605 [==============================] - 0s 80us/step - loss: 456.7787 - accuracy: 0.4530 - val_loss: 424.1859 - 
val_accuracy: 0.4701 Epoch 248/1000 1605/1605 [==============================] - 0s 85us/step - loss: 456.6260 - accuracy: 0.4498 - val_loss: 420.0568 - val_accuracy: 0.4751 Epoch 249/1000 1605/1605 [==============================] - 0s 85us/step - loss: 451.2747 - accuracy: 0.4492 - val_loss: 416.7389 - val_accuracy: 0.4701 Epoch 250/1000 1605/1605 [==============================] - 0s 87us/step - loss: 459.7842 - accuracy: 0.4511 - val_loss: 416.8747 - val_accuracy: 0.4826 Epoch 251/1000 1605/1605 [==============================] - 0s 90us/step - loss: 452.6808 - accuracy: 0.4579 - val_loss: 417.1073 - val_accuracy: 0.4776 Epoch 252/1000 1605/1605 [==============================] - 0s 87us/step - loss: 454.7951 - accuracy: 0.4455 - val_loss: 417.2205 - val_accuracy: 0.4801 Epoch 253/1000 1605/1605 [==============================] - 0s 84us/step - loss: 451.6221 - accuracy: 0.4548 - val_loss: 415.2676 - val_accuracy: 0.4677 Epoch 254/1000 1605/1605 [==============================] - 0s 93us/step - loss: 448.2887 - accuracy: 0.4517 - val_loss: 413.7201 - val_accuracy: 0.4677 Epoch 255/1000 1605/1605 [==============================] - 0s 78us/step - loss: 450.0806 - accuracy: 0.4673 - val_loss: 413.7530 - val_accuracy: 0.4726 Epoch 256/1000 1605/1605 [==============================] - 0s 68us/step - loss: 454.7849 - accuracy: 0.4617 - val_loss: 416.7661 - val_accuracy: 0.4826 Epoch 257/1000 1605/1605 [==============================] - 0s 82us/step - loss: 451.8953 - accuracy: 0.4648 - val_loss: 413.9668 - val_accuracy: 0.4776 Epoch 258/1000 1605/1605 [==============================] - 0s 70us/step - loss: 452.2717 - accuracy: 0.4530 - val_loss: 417.3743 - val_accuracy: 0.4801 Epoch 259/1000 1605/1605 [==============================] - 0s 67us/step - loss: 449.3397 - accuracy: 0.4629 - val_loss: 414.3532 - val_accuracy: 0.4751 Epoch 260/1000 1605/1605 [==============================] - 0s 67us/step - loss: 455.7187 - accuracy: 0.4467 - val_loss: 415.3222 - 
val_accuracy: 0.4801 Epoch 261/1000 1605/1605 [==============================] - 0s 67us/step - loss: 443.2780 - accuracy: 0.4511 - val_loss: 417.1261 - val_accuracy: 0.4726 Epoch 262/1000 1605/1605 [==============================] - 0s 72us/step - loss: 455.0112 - accuracy: 0.4636 - val_loss: 417.9013 - val_accuracy: 0.4751 Epoch 263/1000 1605/1605 [==============================] - 0s 66us/step - loss: 448.3009 - accuracy: 0.4542 - val_loss: 414.1860 - val_accuracy: 0.4726 Epoch 264/1000 1605/1605 [==============================] - 0s 66us/step - loss: 447.8752 - accuracy: 0.4523 - val_loss: 413.3951 - val_accuracy: 0.4726 Epoch 265/1000 1605/1605 [==============================] - 0s 69us/step - loss: 445.9304 - accuracy: 0.4598 - val_loss: 412.6443 - val_accuracy: 0.4627 Epoch 266/1000 1605/1605 [==============================] - 0s 73us/step - loss: 448.4829 - accuracy: 0.4579 - val_loss: 412.9984 - val_accuracy: 0.4701 Epoch 267/1000 1605/1605 [==============================] - 0s 83us/step - loss: 448.5972 - accuracy: 0.4511 - val_loss: 412.4637 - val_accuracy: 0.4701 Epoch 268/1000 1605/1605 [==============================] - 0s 86us/step - loss: 455.2660 - accuracy: 0.4617 - val_loss: 414.2492 - val_accuracy: 0.4677 Epoch 269/1000 1605/1605 [==============================] - 0s 83us/step - loss: 448.2193 - accuracy: 0.4654 - val_loss: 413.1502 - val_accuracy: 0.4776 Epoch 270/1000 1605/1605 [==============================] - 0s 80us/step - loss: 454.3129 - accuracy: 0.4586 - val_loss: 421.0350 - val_accuracy: 0.4776 Epoch 271/1000 1605/1605 [==============================] - 0s 85us/step - loss: 448.4072 - accuracy: 0.4629 - val_loss: 420.0949 - val_accuracy: 0.4826 Epoch 272/1000 1605/1605 [==============================] - 0s 80us/step - loss: 451.4334 - accuracy: 0.4548 - val_loss: 413.2422 - val_accuracy: 0.4776 Epoch 273/1000 1605/1605 [==============================] - 0s 83us/step - loss: 447.1188 - accuracy: 0.4654 - val_loss: 413.5325 - 
val_accuracy: 0.4751 Epoch 274/1000 1605/1605 [==============================] - 0s 83us/step - loss: 448.2762 - accuracy: 0.4654 - val_loss: 411.1920 - val_accuracy: 0.4751 Epoch 275/1000 1605/1605 [==============================] - 0s 81us/step - loss: 454.9935 - accuracy: 0.4573 - val_loss: 411.9754 - val_accuracy: 0.4751 Epoch 276/1000 1605/1605 [==============================] - 0s 72us/step - loss: 444.6398 - accuracy: 0.4561 - val_loss: 416.5145 - val_accuracy: 0.4751 Epoch 277/1000 1605/1605 [==============================] - 0s 72us/step - loss: 450.5246 - accuracy: 0.4586 - val_loss: 414.0053 - val_accuracy: 0.4826 Epoch 278/1000 1605/1605 [==============================] - 0s 69us/step - loss: 437.5878 - accuracy: 0.4717 - val_loss: 415.1577 - val_accuracy: 0.4726 Epoch 279/1000 1605/1605 [==============================] - 0s 65us/step - loss: 446.9173 - accuracy: 0.4654 - val_loss: 412.4361 - val_accuracy: 0.4726 Epoch 280/1000 1605/1605 [==============================] - 0s 65us/step - loss: 449.4632 - accuracy: 0.4561 - val_loss: 411.2443 - val_accuracy: 0.4726 Epoch 281/1000 1605/1605 [==============================] - 0s 66us/step - loss: 447.0214 - accuracy: 0.4660 - val_loss: 411.8214 - val_accuracy: 0.4652 Epoch 282/1000 1605/1605 [==============================] - 0s 65us/step - loss: 444.3986 - accuracy: 0.4654 - val_loss: 410.4124 - val_accuracy: 0.4751 Epoch 283/1000 1605/1605 [==============================] - 0s 65us/step - loss: 444.1592 - accuracy: 0.4648 - val_loss: 411.2850 - val_accuracy: 0.4701 Epoch 284/1000 1605/1605 [==============================] - 0s 70us/step - loss: 450.4511 - accuracy: 0.4673 - val_loss: 412.3560 - val_accuracy: 0.4602 Epoch 285/1000 1605/1605 [==============================] - 0s 66us/step - loss: 447.0355 - accuracy: 0.4642 - val_loss: 412.1450 - val_accuracy: 0.4726 Epoch 286/1000 1605/1605 [==============================] - 0s 65us/step - loss: 450.0987 - accuracy: 0.4623 - val_loss: 410.7634 - 
val_accuracy: 0.4826 Epoch 287/1000 1605/1605 [==============================] - 0s 66us/step - loss: 444.0867 - accuracy: 0.4729 - val_loss: 416.7777 - val_accuracy: 0.4776 Epoch 288/1000 1605/1605 [==============================] - 0s 65us/step - loss: 436.9965 - accuracy: 0.4654 - val_loss: 411.4326 - val_accuracy: 0.4851 Epoch 289/1000 1605/1605 [==============================] - 0s 67us/step - loss: 443.4293 - accuracy: 0.4735 - val_loss: 409.1313 - val_accuracy: 0.4851 Epoch 290/1000 1605/1605 [==============================] - 0s 90us/step - loss: 445.4124 - accuracy: 0.4685 - val_loss: 407.8509 - val_accuracy: 0.4776 Epoch 291/1000 1605/1605 [==============================] - 0s 98us/step - loss: 445.1302 - accuracy: 0.4835 - val_loss: 410.7843 - val_accuracy: 0.4900 Epoch 292/1000 1605/1605 [==============================] - 0s 98us/step - loss: 442.6606 - accuracy: 0.4773 - val_loss: 410.4735 - val_accuracy: 0.4801 Epoch 293/1000 1605/1605 [==============================] - 0s 67us/step - loss: 445.7154 - accuracy: 0.4779 - val_loss: 413.1785 - val_accuracy: 0.4900 Epoch 294/1000 1605/1605 [==============================] - 0s 80us/step - loss: 446.2073 - accuracy: 0.4816 - val_loss: 406.6968 - val_accuracy: 0.4876 Epoch 295/1000 1605/1605 [==============================] - 0s 95us/step - loss: 442.8812 - accuracy: 0.4735 - val_loss: 414.2261 - val_accuracy: 0.4925 Epoch 296/1000 1605/1605 [==============================] - 0s 79us/step - loss: 443.7380 - accuracy: 0.4735 - val_loss: 412.7382 - val_accuracy: 0.4950 Epoch 297/1000 1605/1605 [==============================] - 0s 69us/step - loss: 441.0721 - accuracy: 0.4903 - val_loss: 406.7226 - val_accuracy: 0.4925 Epoch 298/1000 1605/1605 [==============================] - 0s 87us/step - loss: 436.1705 - accuracy: 0.4847 - val_loss: 410.7954 - val_accuracy: 0.4925 Epoch 299/1000 1605/1605 [==============================] - 0s 90us/step - loss: 436.9377 - accuracy: 0.4872 - val_loss: 406.0079 - 
val_accuracy: 0.4851 Epoch 300/1000 1605/1605 [==============================] - 0s 70us/step - loss: 436.2925 - accuracy: 0.4960 - val_loss: 403.8766 - val_accuracy: 0.5000 Epoch 301/1000 1605/1605 [==============================] - 0s 67us/step - loss: 435.0086 - accuracy: 0.5040 - val_loss: 401.3185 - val_accuracy: 0.4950 Epoch 302/1000 1605/1605 [==============================] - 0s 65us/step - loss: 438.8698 - accuracy: 0.5016 - val_loss: 401.0585 - val_accuracy: 0.5025 Epoch 303/1000 1605/1605 [==============================] - 0s 78us/step - loss: 433.5411 - accuracy: 0.4966 - val_loss: 399.8087 - val_accuracy: 0.4950 Epoch 304/1000 1605/1605 [==============================] - 0s 90us/step - loss: 437.9256 - accuracy: 0.5022 - val_loss: 399.0248 - val_accuracy: 0.4975 Epoch 305/1000 1605/1605 [==============================] - 0s 68us/step - loss: 432.2497 - accuracy: 0.5047 - val_loss: 401.9267 - val_accuracy: 0.5025 Epoch 306/1000 1605/1605 [==============================] - 0s 67us/step - loss: 431.9640 - accuracy: 0.5159 - val_loss: 402.9565 - val_accuracy: 0.5149 Epoch 307/1000 1605/1605 [==============================] - 0s 98us/step - loss: 432.1565 - accuracy: 0.5153 - val_loss: 406.0639 - val_accuracy: 0.5174 Epoch 308/1000 1605/1605 [==============================] - 0s 86us/step - loss: 422.6678 - accuracy: 0.5159 - val_loss: 395.7047 - val_accuracy: 0.5100 Epoch 309/1000 1605/1605 [==============================] - 0s 71us/step - loss: 426.7914 - accuracy: 0.5134 - val_loss: 401.4526 - val_accuracy: 0.5124 Epoch 310/1000 1605/1605 [==============================] - 0s 67us/step - loss: 425.1197 - accuracy: 0.5315 - val_loss: 393.6887 - val_accuracy: 0.5149 Epoch 311/1000 1605/1605 [==============================] - 0s 77us/step - loss: 427.6170 - accuracy: 0.5190 - val_loss: 389.1072 - val_accuracy: 0.5224 Epoch 312/1000 1605/1605 [==============================] - 0s 70us/step - loss: 421.6905 - accuracy: 0.5202 - val_loss: 396.0892 - 
val_accuracy: 0.5274 Epoch 313/1000 1605/1605 [==============================] - 0s 68us/step - loss: 417.8348 - accuracy: 0.5333 - val_loss: 385.6999 - val_accuracy: 0.5249 Epoch 314/1000 1605/1605 [==============================] - 0s 69us/step - loss: 416.4870 - accuracy: 0.5252 - val_loss: 386.3887 - val_accuracy: 0.5274 Epoch 315/1000 1605/1605 [==============================] - 0s 67us/step - loss: 412.8965 - accuracy: 0.5340 - val_loss: 383.2580 - val_accuracy: 0.5274 Epoch 316/1000 1605/1605 [==============================] - 0s 86us/step - loss: 413.1180 - accuracy: 0.5396 - val_loss: 381.4930 - val_accuracy: 0.5323 Epoch 317/1000 1605/1605 [==============================] - 0s 70us/step - loss: 407.6116 - accuracy: 0.5371 - val_loss: 381.9825 - val_accuracy: 0.5398 Epoch 318/1000 1605/1605 [==============================] - 0s 84us/step - loss: 408.2413 - accuracy: 0.5495 - val_loss: 376.9777 - val_accuracy: 0.5398 Epoch 319/1000 1605/1605 [==============================] - 0s 72us/step - loss: 411.5412 - accuracy: 0.5421 - val_loss: 376.3911 - val_accuracy: 0.5448 Epoch 320/1000 1605/1605 [==============================] - 0s 70us/step - loss: 406.4916 - accuracy: 0.5526 - val_loss: 373.7316 - val_accuracy: 0.5348 Epoch 321/1000 1605/1605 [==============================] - 0s 73us/step - loss: 409.7494 - accuracy: 0.5383 - val_loss: 375.0918 - val_accuracy: 0.5498 Epoch 322/1000 1605/1605 [==============================] - 0s 85us/step - loss: 406.8619 - accuracy: 0.5470 - val_loss: 373.8885 - val_accuracy: 0.5373 Epoch 323/1000 1605/1605 [==============================] - 0s 82us/step - loss: 414.7112 - accuracy: 0.5514 - val_loss: 371.5836 - val_accuracy: 0.5498 Epoch 324/1000 1605/1605 [==============================] - 0s 89us/step - loss: 404.9592 - accuracy: 0.5551 - val_loss: 392.6752 - val_accuracy: 0.5622 Epoch 325/1000 1605/1605 [==============================] - 0s 87us/step - loss: 409.8441 - accuracy: 0.5439 - val_loss: 385.7968 - 
[Keras training log truncated: Epochs 326–585 of 1000 shown, 1605 samples/epoch. Training loss declines from ~406 to ~352 and val_loss from ~373 to ~316, while val_accuracy fluctuates between roughly 0.55 and 0.59; both validation metrics largely plateau after about epoch 450.]
val_accuracy: 0.5746 Epoch 586/1000 1605/1605 [==============================] - 0s 69us/step - loss: 355.1950 - accuracy: 0.5645 - val_loss: 317.0500 - val_accuracy: 0.5771 Epoch 587/1000 1605/1605 [==============================] - 0s 74us/step - loss: 363.4286 - accuracy: 0.5651 - val_loss: 316.4810 - val_accuracy: 0.5871 Epoch 588/1000 1605/1605 [==============================] - 0s 81us/step - loss: 360.5669 - accuracy: 0.5688 - val_loss: 315.7491 - val_accuracy: 0.5871 Epoch 589/1000 1605/1605 [==============================] - 0s 69us/step - loss: 357.2021 - accuracy: 0.5682 - val_loss: 316.4370 - val_accuracy: 0.5796 Epoch 590/1000 1605/1605 [==============================] - 0s 71us/step - loss: 357.4923 - accuracy: 0.5614 - val_loss: 321.2016 - val_accuracy: 0.5796 Epoch 591/1000 1605/1605 [==============================] - 0s 67us/step - loss: 353.4757 - accuracy: 0.5682 - val_loss: 318.2928 - val_accuracy: 0.5771 Epoch 592/1000 1605/1605 [==============================] - 0s 88us/step - loss: 353.7564 - accuracy: 0.5726 - val_loss: 315.5663 - val_accuracy: 0.5846 Epoch 593/1000 1605/1605 [==============================] - 0s 67us/step - loss: 352.6568 - accuracy: 0.5869 - val_loss: 315.9488 - val_accuracy: 0.5871 Epoch 594/1000 1605/1605 [==============================] - 0s 67us/step - loss: 353.7536 - accuracy: 0.5688 - val_loss: 316.3017 - val_accuracy: 0.5846 Epoch 595/1000 1605/1605 [==============================] - 0s 67us/step - loss: 356.8335 - accuracy: 0.5769 - val_loss: 318.1628 - val_accuracy: 0.5871 Epoch 596/1000 1605/1605 [==============================] - 0s 85us/step - loss: 356.1505 - accuracy: 0.5701 - val_loss: 319.0683 - val_accuracy: 0.5896 Epoch 597/1000 1605/1605 [==============================] - 0s 69us/step - loss: 362.9130 - accuracy: 0.5738 - val_loss: 318.9610 - val_accuracy: 0.5771 Epoch 598/1000 1605/1605 [==============================] - 0s 70us/step - loss: 352.4548 - accuracy: 0.5882 - val_loss: 317.1051 - 
val_accuracy: 0.5871 Epoch 599/1000 1605/1605 [==============================] - 0s 79us/step - loss: 362.6213 - accuracy: 0.5626 - val_loss: 315.5367 - val_accuracy: 0.5771 Epoch 600/1000 1605/1605 [==============================] - 0s 82us/step - loss: 353.7526 - accuracy: 0.5769 - val_loss: 316.6482 - val_accuracy: 0.5821 Epoch 601/1000 1605/1605 [==============================] - 0s 85us/step - loss: 349.9212 - accuracy: 0.5757 - val_loss: 316.0941 - val_accuracy: 0.5846 Epoch 602/1000 1605/1605 [==============================] - 0s 78us/step - loss: 359.8098 - accuracy: 0.5732 - val_loss: 322.5079 - val_accuracy: 0.5896 Epoch 603/1000 1605/1605 [==============================] - 0s 67us/step - loss: 353.8756 - accuracy: 0.5670 - val_loss: 315.8469 - val_accuracy: 0.5896 Epoch 604/1000 1605/1605 [==============================] - 0s 69us/step - loss: 353.6048 - accuracy: 0.5701 - val_loss: 319.1924 - val_accuracy: 0.5995 Epoch 605/1000 1605/1605 [==============================] - 0s 83us/step - loss: 349.8872 - accuracy: 0.5664 - val_loss: 317.6613 - val_accuracy: 0.5846 Epoch 606/1000 1605/1605 [==============================] - 0s 87us/step - loss: 355.8584 - accuracy: 0.5701 - val_loss: 319.2544 - val_accuracy: 0.5920 Epoch 607/1000 1605/1605 [==============================] - 0s 82us/step - loss: 350.3650 - accuracy: 0.5807 - val_loss: 319.8824 - val_accuracy: 0.5796 Epoch 608/1000 1605/1605 [==============================] - 0s 80us/step - loss: 353.9872 - accuracy: 0.5738 - val_loss: 319.9191 - val_accuracy: 0.5821 Epoch 609/1000 1605/1605 [==============================] - 0s 80us/step - loss: 359.6244 - accuracy: 0.5651 - val_loss: 319.1586 - val_accuracy: 0.5821 Epoch 610/1000 1605/1605 [==============================] - 0s 83us/step - loss: 360.3410 - accuracy: 0.5657 - val_loss: 321.4079 - val_accuracy: 0.5871 Epoch 611/1000 1605/1605 [==============================] - 0s 97us/step - loss: 355.3938 - accuracy: 0.5794 - val_loss: 318.5953 - 
val_accuracy: 0.5821 Epoch 612/1000 1605/1605 [==============================] - 0s 74us/step - loss: 351.0549 - accuracy: 0.5695 - val_loss: 327.7261 - val_accuracy: 0.5871 Epoch 613/1000 1605/1605 [==============================] - 0s 108us/step - loss: 348.1513 - accuracy: 0.5639 - val_loss: 316.6057 - val_accuracy: 0.5771 Epoch 614/1000 1605/1605 [==============================] - 0s 164us/step - loss: 352.3768 - accuracy: 0.5757 - val_loss: 321.3395 - val_accuracy: 0.5920 Epoch 615/1000 1605/1605 [==============================] - 0s 118us/step - loss: 361.1397 - accuracy: 0.5788 - val_loss: 317.4450 - val_accuracy: 0.5846 Epoch 616/1000 1605/1605 [==============================] - 0s 124us/step - loss: 359.9956 - accuracy: 0.5813 - val_loss: 315.2409 - val_accuracy: 0.5871 Epoch 617/1000 1605/1605 [==============================] - 0s 126us/step - loss: 359.1320 - accuracy: 0.5776 - val_loss: 323.6633 - val_accuracy: 0.5846 Epoch 618/1000 1605/1605 [==============================] - 0s 82us/step - loss: 359.1542 - accuracy: 0.5558 - val_loss: 315.1902 - val_accuracy: 0.5796 Epoch 619/1000 1605/1605 [==============================] - 0s 103us/step - loss: 355.9102 - accuracy: 0.5682 - val_loss: 314.2998 - val_accuracy: 0.5896 Epoch 620/1000 1605/1605 [==============================] - 0s 93us/step - loss: 350.3985 - accuracy: 0.5645 - val_loss: 314.3796 - val_accuracy: 0.5796 Epoch 621/1000 1605/1605 [==============================] - 0s 83us/step - loss: 358.2738 - accuracy: 0.5726 - val_loss: 316.3066 - val_accuracy: 0.5821 Epoch 622/1000 1605/1605 [==============================] - 0s 84us/step - loss: 359.7160 - accuracy: 0.5769 - val_loss: 316.0949 - val_accuracy: 0.5771 Epoch 623/1000 1605/1605 [==============================] - 0s 83us/step - loss: 360.3361 - accuracy: 0.5707 - val_loss: 315.7060 - val_accuracy: 0.5846 Epoch 624/1000 1605/1605 [==============================] - 0s 84us/step - loss: 355.5458 - accuracy: 0.5763 - val_loss: 314.6203 - 
val_accuracy: 0.5796 Epoch 625/1000 1605/1605 [==============================] - 0s 81us/step - loss: 358.3831 - accuracy: 0.5676 - val_loss: 315.2036 - val_accuracy: 0.5846 Epoch 626/1000 1605/1605 [==============================] - 0s 116us/step - loss: 353.8912 - accuracy: 0.5801 - val_loss: 314.6031 - val_accuracy: 0.5821 Epoch 627/1000 1605/1605 [==============================] - 0s 78us/step - loss: 360.6506 - accuracy: 0.5495 - val_loss: 318.8918 - val_accuracy: 0.5796 Epoch 628/1000 1605/1605 [==============================] - 0s 81us/step - loss: 350.8100 - accuracy: 0.5745 - val_loss: 317.8811 - val_accuracy: 0.5746 Epoch 629/1000 1605/1605 [==============================] - 0s 102us/step - loss: 360.4264 - accuracy: 0.5788 - val_loss: 324.9055 - val_accuracy: 0.5771 Epoch 630/1000 1605/1605 [==============================] - 0s 79us/step - loss: 357.4546 - accuracy: 0.5720 - val_loss: 326.5755 - val_accuracy: 0.5871 Epoch 631/1000 1605/1605 [==============================] - 0s 80us/step - loss: 355.3235 - accuracy: 0.5632 - val_loss: 318.6683 - val_accuracy: 0.5821 Epoch 632/1000 1605/1605 [==============================] - 0s 78us/step - loss: 353.9878 - accuracy: 0.5670 - val_loss: 313.3333 - val_accuracy: 0.5896 Epoch 633/1000 1605/1605 [==============================] - 0s 78us/step - loss: 353.8526 - accuracy: 0.5732 - val_loss: 313.7588 - val_accuracy: 0.5821 Epoch 634/1000 1605/1605 [==============================] - 0s 79us/step - loss: 354.0412 - accuracy: 0.5657 - val_loss: 315.6941 - val_accuracy: 0.5871 Epoch 635/1000 1605/1605 [==============================] - 0s 121us/step - loss: 359.1418 - accuracy: 0.5738 - val_loss: 318.3917 - val_accuracy: 0.5746 Epoch 636/1000 1605/1605 [==============================] - 0s 78us/step - loss: 357.4367 - accuracy: 0.5819 - val_loss: 315.5242 - val_accuracy: 0.5821 Epoch 637/1000 1605/1605 [==============================] - 0s 78us/step - loss: 359.5908 - accuracy: 0.5763 - val_loss: 315.9841 - 
val_accuracy: 0.5821 Epoch 638/1000 1605/1605 [==============================] - 0s 87us/step - loss: 361.0942 - accuracy: 0.5794 - val_loss: 321.1807 - val_accuracy: 0.5846 Epoch 639/1000 1605/1605 [==============================] - 0s 80us/step - loss: 358.1811 - accuracy: 0.5651 - val_loss: 313.3150 - val_accuracy: 0.5647 Epoch 640/1000 1605/1605 [==============================] - 0s 88us/step - loss: 354.0593 - accuracy: 0.5682 - val_loss: 314.5508 - val_accuracy: 0.5647 Epoch 641/1000 1605/1605 [==============================] - 0s 79us/step - loss: 355.9617 - accuracy: 0.5857 - val_loss: 313.8126 - val_accuracy: 0.5697 Epoch 642/1000 1605/1605 [==============================] - 0s 77us/step - loss: 349.5806 - accuracy: 0.5807 - val_loss: 317.6831 - val_accuracy: 0.5821 Epoch 643/1000 1605/1605 [==============================] - 0s 75us/step - loss: 354.2379 - accuracy: 0.5726 - val_loss: 322.4917 - val_accuracy: 0.5896 Epoch 644/1000 1605/1605 [==============================] - 0s 79us/step - loss: 357.1172 - accuracy: 0.5788 - val_loss: 315.2159 - val_accuracy: 0.5846 Epoch 645/1000 1605/1605 [==============================] - 0s 79us/step - loss: 354.1373 - accuracy: 0.5819 - val_loss: 314.0079 - val_accuracy: 0.5896 Epoch 646/1000 1605/1605 [==============================] - 0s 79us/step - loss: 352.3393 - accuracy: 0.5726 - val_loss: 313.6063 - val_accuracy: 0.5672 Epoch 647/1000 1605/1605 [==============================] - 0s 82us/step - loss: 361.2355 - accuracy: 0.5738 - val_loss: 317.2799 - val_accuracy: 0.5846 Epoch 648/1000 1605/1605 [==============================] - 0s 80us/step - loss: 360.7557 - accuracy: 0.5794 - val_loss: 315.2485 - val_accuracy: 0.5821 Epoch 649/1000 1605/1605 [==============================] - 0s 89us/step - loss: 351.8239 - accuracy: 0.5826 - val_loss: 317.3078 - val_accuracy: 0.5871 Epoch 650/1000 1605/1605 [==============================] - 0s 80us/step - loss: 352.1366 - accuracy: 0.5763 - val_loss: 313.1367 - 
val_accuracy: 0.5796 Epoch 651/1000 1605/1605 [==============================] - 0s 80us/step - loss: 359.6444 - accuracy: 0.5838 - val_loss: 320.8750 - val_accuracy: 0.5796 Epoch 652/1000 1605/1605 [==============================] - 0s 80us/step - loss: 354.7856 - accuracy: 0.5763 - val_loss: 314.4543 - val_accuracy: 0.5746 Epoch 653/1000 1605/1605 [==============================] - 0s 79us/step - loss: 351.1910 - accuracy: 0.5819 - val_loss: 313.8374 - val_accuracy: 0.5796 Epoch 654/1000 1605/1605 [==============================] - 0s 77us/step - loss: 362.4453 - accuracy: 0.5695 - val_loss: 314.0481 - val_accuracy: 0.5821 Epoch 655/1000 1605/1605 [==============================] - 0s 75us/step - loss: 353.0006 - accuracy: 0.5726 - val_loss: 316.0645 - val_accuracy: 0.5871 Epoch 656/1000 1605/1605 [==============================] - 0s 77us/step - loss: 352.5944 - accuracy: 0.5776 - val_loss: 312.9293 - val_accuracy: 0.5796 Epoch 657/1000 1605/1605 [==============================] - 0s 110us/step - loss: 356.2374 - accuracy: 0.5757 - val_loss: 315.9970 - val_accuracy: 0.5871 Epoch 658/1000 1605/1605 [==============================] - 0s 111us/step - loss: 356.5740 - accuracy: 0.5732 - val_loss: 313.1701 - val_accuracy: 0.5896 Epoch 659/1000 1605/1605 [==============================] - 0s 82us/step - loss: 351.0197 - accuracy: 0.5726 - val_loss: 317.1705 - val_accuracy: 0.5871 Epoch 660/1000 1605/1605 [==============================] - 0s 79us/step - loss: 355.5014 - accuracy: 0.5738 - val_loss: 317.9778 - val_accuracy: 0.5871 Epoch 661/1000 1605/1605 [==============================] - 0s 79us/step - loss: 361.6734 - accuracy: 0.5732 - val_loss: 314.5307 - val_accuracy: 0.5846 Epoch 662/1000 1605/1605 [==============================] - 0s 78us/step - loss: 351.4740 - accuracy: 0.5776 - val_loss: 319.7127 - val_accuracy: 0.5871 Epoch 663/1000 1605/1605 [==============================] - 0s 80us/step - loss: 354.1597 - accuracy: 0.5788 - val_loss: 312.8621 - 
val_accuracy: 0.5796 Epoch 664/1000 1605/1605 [==============================] - 0s 93us/step - loss: 349.8764 - accuracy: 0.5645 - val_loss: 313.4726 - val_accuracy: 0.5796 Epoch 665/1000 1605/1605 [==============================] - 0s 78us/step - loss: 353.1092 - accuracy: 0.5738 - val_loss: 316.6814 - val_accuracy: 0.5746 Epoch 666/1000 1605/1605 [==============================] - 0s 107us/step - loss: 350.6483 - accuracy: 0.5732 - val_loss: 316.2563 - val_accuracy: 0.5746 Epoch 667/1000 1605/1605 [==============================] - 0s 134us/step - loss: 353.5413 - accuracy: 0.5832 - val_loss: 315.1772 - val_accuracy: 0.5771 Epoch 668/1000 1605/1605 [==============================] - 0s 97us/step - loss: 361.1901 - accuracy: 0.5776 - val_loss: 313.0952 - val_accuracy: 0.5771 Epoch 669/1000 1605/1605 [==============================] - 0s 113us/step - loss: 358.8909 - accuracy: 0.5738 - val_loss: 312.4817 - val_accuracy: 0.5771 Epoch 670/1000 1605/1605 [==============================] - 0s 142us/step - loss: 353.4093 - accuracy: 0.5726 - val_loss: 314.0183 - val_accuracy: 0.5896 Epoch 671/1000 1605/1605 [==============================] - 0s 82us/step - loss: 355.0183 - accuracy: 0.5813 - val_loss: 313.4150 - val_accuracy: 0.5771 Epoch 672/1000 1605/1605 [==============================] - 0s 81us/step - loss: 349.7119 - accuracy: 0.5925 - val_loss: 316.7944 - val_accuracy: 0.5796 Epoch 673/1000 1605/1605 [==============================] - 0s 101us/step - loss: 351.1887 - accuracy: 0.5751 - val_loss: 328.3146 - val_accuracy: 0.5746 Epoch 674/1000 1605/1605 [==============================] - 0s 82us/step - loss: 356.4131 - accuracy: 0.5788 - val_loss: 317.2790 - val_accuracy: 0.5796 Epoch 675/1000 1605/1605 [==============================] - 0s 80us/step - loss: 351.1047 - accuracy: 0.5745 - val_loss: 315.5905 - val_accuracy: 0.5821 Epoch 676/1000 1605/1605 [==============================] - 0s 84us/step - loss: 355.0466 - accuracy: 0.5720 - val_loss: 315.0762 - 
val_accuracy: 0.5970 Epoch 677/1000 1605/1605 [==============================] - 0s 80us/step - loss: 355.2065 - accuracy: 0.5751 - val_loss: 315.0661 - val_accuracy: 0.5896 Epoch 678/1000 1605/1605 [==============================] - 0s 79us/step - loss: 351.6500 - accuracy: 0.5826 - val_loss: 314.3147 - val_accuracy: 0.5920 Epoch 679/1000 1605/1605 [==============================] - 0s 79us/step - loss: 355.5333 - accuracy: 0.5670 - val_loss: 316.3615 - val_accuracy: 0.5846 Epoch 680/1000 1605/1605 [==============================] - 0s 80us/step - loss: 347.0162 - accuracy: 0.5745 - val_loss: 319.1150 - val_accuracy: 0.5945 Epoch 681/1000 1605/1605 [==============================] - 0s 90us/step - loss: 349.0797 - accuracy: 0.5757 - val_loss: 314.7193 - val_accuracy: 0.5846 Epoch 682/1000 1605/1605 [==============================] - 0s 79us/step - loss: 351.2030 - accuracy: 0.5720 - val_loss: 315.4110 - val_accuracy: 0.5796 Epoch 683/1000 1605/1605 [==============================] - 0s 86us/step - loss: 356.1538 - accuracy: 0.5819 - val_loss: 312.4839 - val_accuracy: 0.5796 Epoch 684/1000 1605/1605 [==============================] - 0s 80us/step - loss: 352.5059 - accuracy: 0.5807 - val_loss: 317.9671 - val_accuracy: 0.5846 Epoch 685/1000 1605/1605 [==============================] - 0s 75us/step - loss: 355.5637 - accuracy: 0.5931 - val_loss: 313.1179 - val_accuracy: 0.5970 Epoch 686/1000 1605/1605 [==============================] - 0s 76us/step - loss: 355.9519 - accuracy: 0.5720 - val_loss: 312.4806 - val_accuracy: 0.5821 Epoch 687/1000 1605/1605 [==============================] - 0s 71us/step - loss: 352.2929 - accuracy: 0.5894 - val_loss: 314.9818 - val_accuracy: 0.5945 Epoch 688/1000 1605/1605 [==============================] - 0s 90us/step - loss: 350.8479 - accuracy: 0.5838 - val_loss: 314.6010 - val_accuracy: 0.5945 Epoch 689/1000 1605/1605 [==============================] - 0s 83us/step - loss: 356.4306 - accuracy: 0.5963 - val_loss: 314.2952 - 
val_accuracy: 0.5896 Epoch 690/1000 1605/1605 [==============================] - 0s 87us/step - loss: 354.1205 - accuracy: 0.5819 - val_loss: 315.0560 - val_accuracy: 0.5920 Epoch 691/1000 1605/1605 [==============================] - 0s 72us/step - loss: 357.8318 - accuracy: 0.5919 - val_loss: 318.1375 - val_accuracy: 0.5920 Epoch 692/1000 1605/1605 [==============================] - 0s 80us/step - loss: 355.1317 - accuracy: 0.5813 - val_loss: 316.3871 - val_accuracy: 0.5995 Epoch 693/1000 1605/1605 [==============================] - 0s 92us/step - loss: 350.8207 - accuracy: 0.5863 - val_loss: 318.5172 - val_accuracy: 0.5995 Epoch 694/1000 1605/1605 [==============================] - 0s 72us/step - loss: 351.0373 - accuracy: 0.5788 - val_loss: 315.1001 - val_accuracy: 0.5920 Epoch 695/1000 1605/1605 [==============================] - 0s 75us/step - loss: 350.5435 - accuracy: 0.5807 - val_loss: 312.4245 - val_accuracy: 0.5920 Epoch 696/1000 1605/1605 [==============================] - 0s 81us/step - loss: 359.0578 - accuracy: 0.5819 - val_loss: 314.6877 - val_accuracy: 0.5846 Epoch 697/1000 1605/1605 [==============================] - 0s 96us/step - loss: 355.0185 - accuracy: 0.5807 - val_loss: 313.1332 - val_accuracy: 0.5945 Epoch 698/1000 1605/1605 [==============================] - 0s 72us/step - loss: 356.4837 - accuracy: 0.5807 - val_loss: 312.2264 - val_accuracy: 0.5846 Epoch 699/1000 1605/1605 [==============================] - 0s 75us/step - loss: 355.5445 - accuracy: 0.5956 - val_loss: 312.8540 - val_accuracy: 0.5945 Epoch 700/1000 1605/1605 [==============================] - 0s 74us/step - loss: 358.0487 - accuracy: 0.5713 - val_loss: 312.9963 - val_accuracy: 0.5846 Epoch 701/1000 1605/1605 [==============================] - 0s 77us/step - loss: 354.3096 - accuracy: 0.5919 - val_loss: 315.1711 - val_accuracy: 0.5896 Epoch 702/1000 1605/1605 [==============================] - 0s 83us/step - loss: 348.7238 - accuracy: 0.5713 - val_loss: 312.3718 - 
val_accuracy: 0.5871 Epoch 703/1000 1605/1605 [==============================] - 0s 72us/step - loss: 360.0903 - accuracy: 0.5726 - val_loss: 315.2516 - val_accuracy: 0.5821 Epoch 704/1000 1605/1605 [==============================] - 0s 72us/step - loss: 352.4150 - accuracy: 0.5894 - val_loss: 313.1117 - val_accuracy: 0.5896 Epoch 705/1000 1605/1605 [==============================] - 0s 95us/step - loss: 358.6771 - accuracy: 0.5763 - val_loss: 314.9968 - val_accuracy: 0.5846 Epoch 706/1000 1605/1605 [==============================] - 0s 107us/step - loss: 354.1983 - accuracy: 0.5769 - val_loss: 312.4877 - val_accuracy: 0.5821 Epoch 707/1000 1605/1605 [==============================] - 0s 83us/step - loss: 352.5099 - accuracy: 0.5907 - val_loss: 316.3025 - val_accuracy: 0.5871 Epoch 708/1000 1605/1605 [==============================] - 0s 73us/step - loss: 347.0948 - accuracy: 0.5869 - val_loss: 316.9199 - val_accuracy: 0.5796 Epoch 709/1000 1605/1605 [==============================] - 0s 72us/step - loss: 354.7508 - accuracy: 0.5857 - val_loss: 313.3248 - val_accuracy: 0.5821 Epoch 710/1000 1605/1605 [==============================] - 0s 73us/step - loss: 359.7194 - accuracy: 0.5813 - val_loss: 314.3006 - val_accuracy: 0.5871 Epoch 711/1000 1605/1605 [==============================] - 0s 72us/step - loss: 351.4083 - accuracy: 0.5956 - val_loss: 311.6031 - val_accuracy: 0.5846 Epoch 712/1000 1605/1605 [==============================] - 0s 79us/step - loss: 355.0294 - accuracy: 0.5913 - val_loss: 311.5759 - val_accuracy: 0.5821 Epoch 713/1000 1605/1605 [==============================] - 0s 72us/step - loss: 352.7596 - accuracy: 0.5838 - val_loss: 312.3249 - val_accuracy: 0.5871 Epoch 714/1000 1605/1605 [==============================] - 0s 74us/step - loss: 350.8389 - accuracy: 0.5931 - val_loss: 313.0376 - val_accuracy: 0.5871 Epoch 715/1000 1605/1605 [==============================] - 0s 82us/step - loss: 353.8159 - accuracy: 0.5869 - val_loss: 311.9798 - 
val_accuracy: 0.5846 Epoch 716/1000 1605/1605 [==============================] - 0s 97us/step - loss: 352.8164 - accuracy: 0.5907 - val_loss: 312.1045 - val_accuracy: 0.5821 Epoch 717/1000 1605/1605 [==============================] - 0s 72us/step - loss: 357.3981 - accuracy: 0.5813 - val_loss: 323.5384 - val_accuracy: 0.5771 Epoch 718/1000 1605/1605 [==============================] - 0s 78us/step - loss: 351.4398 - accuracy: 0.5956 - val_loss: 313.2628 - val_accuracy: 0.5846 Epoch 719/1000 1605/1605 [==============================] - 0s 77us/step - loss: 353.8875 - accuracy: 0.5944 - val_loss: 318.3437 - val_accuracy: 0.5821 Epoch 720/1000 1605/1605 [==============================] - 0s 74us/step - loss: 352.1439 - accuracy: 0.5813 - val_loss: 316.5812 - val_accuracy: 0.5896 Epoch 721/1000 1605/1605 [==============================] - 0s 77us/step - loss: 351.8015 - accuracy: 0.5944 - val_loss: 312.5821 - val_accuracy: 0.5945 Epoch 722/1000 1605/1605 [==============================] - 0s 74us/step - loss: 356.3525 - accuracy: 0.5863 - val_loss: 321.2502 - val_accuracy: 0.5995 Epoch 723/1000 1605/1605 [==============================] - 0s 81us/step - loss: 353.7410 - accuracy: 0.5801 - val_loss: 316.7987 - val_accuracy: 0.5871 Epoch 724/1000 1605/1605 [==============================] - 0s 72us/step - loss: 356.3448 - accuracy: 0.5776 - val_loss: 311.7713 - val_accuracy: 0.5796 Epoch 725/1000 1605/1605 [==============================] - 0s 77us/step - loss: 351.1210 - accuracy: 0.5925 - val_loss: 317.3112 - val_accuracy: 0.5920 Epoch 726/1000 1605/1605 [==============================] - 0s 75us/step - loss: 355.0901 - accuracy: 0.5913 - val_loss: 311.3552 - val_accuracy: 0.5945 Epoch 727/1000 1605/1605 [==============================] - 0s 72us/step - loss: 355.4919 - accuracy: 0.5969 - val_loss: 313.9129 - val_accuracy: 0.6045 Epoch 728/1000 1605/1605 [==============================] - 0s 74us/step - loss: 349.2570 - accuracy: 0.5944 - val_loss: 312.0845 - 
val_accuracy: 0.5970 Epoch 729/1000 1605/1605 [==============================] - 0s 77us/step - loss: 348.2254 - accuracy: 0.6037 - val_loss: 310.7969 - val_accuracy: 0.5920 Epoch 730/1000 1605/1605 [==============================] - 0s 72us/step - loss: 349.8590 - accuracy: 0.5956 - val_loss: 312.2975 - val_accuracy: 0.5970 Epoch 731/1000 1605/1605 [==============================] - 0s 72us/step - loss: 357.3714 - accuracy: 0.5863 - val_loss: 311.7412 - val_accuracy: 0.5846 Epoch 732/1000 1605/1605 [==============================] - 0s 98us/step - loss: 352.1350 - accuracy: 0.5832 - val_loss: 312.8198 - val_accuracy: 0.5970 Epoch 733/1000 1605/1605 [==============================] - 0s 177us/step - loss: 350.6403 - accuracy: 0.5950 - val_loss: 311.2607 - val_accuracy: 0.5970 Epoch 734/1000 1605/1605 [==============================] - 0s 157us/step - loss: 351.9814 - accuracy: 0.5875 - val_loss: 312.3072 - val_accuracy: 0.5945 Epoch 735/1000 1605/1605 [==============================] - 0s 193us/step - loss: 357.0056 - accuracy: 0.5850 - val_loss: 315.1222 - val_accuracy: 0.5920 Epoch 736/1000 1605/1605 [==============================] - 0s 174us/step - loss: 352.2483 - accuracy: 0.5738 - val_loss: 327.5570 - val_accuracy: 0.5821 Epoch 737/1000 1605/1605 [==============================] - 0s 103us/step - loss: 350.6960 - accuracy: 0.5963 - val_loss: 328.7640 - val_accuracy: 0.5871 Epoch 738/1000 1605/1605 [==============================] - 0s 80us/step - loss: 348.6498 - accuracy: 0.5919 - val_loss: 318.5779 - val_accuracy: 0.5871 Epoch 739/1000 1605/1605 [==============================] - 0s 118us/step - loss: 350.2985 - accuracy: 0.5994 - val_loss: 321.3559 - val_accuracy: 0.5945 Epoch 740/1000 1605/1605 [==============================] - 0s 103us/step - loss: 351.2505 - accuracy: 0.5863 - val_loss: 323.4211 - val_accuracy: 0.5896 Epoch 741/1000 1605/1605 [==============================] - 0s 197us/step - loss: 351.9565 - accuracy: 0.5925 - val_loss: 315.6762 - 
val_accuracy: 0.5970 Epoch 742/1000 1605/1605 [==============================] - 0s 174us/step - loss: 352.3652 - accuracy: 0.5894 - val_loss: 313.9963 - val_accuracy: 0.5871 Epoch 743/1000 1605/1605 [==============================] - 0s 148us/step - loss: 351.6906 - accuracy: 0.5981 - val_loss: 312.1972 - val_accuracy: 0.6070 Epoch 744/1000 1605/1605 [==============================] - 0s 169us/step - loss: 352.4489 - accuracy: 0.5788 - val_loss: 316.5323 - val_accuracy: 0.6070 Epoch 745/1000 1605/1605 [==============================] - 0s 105us/step - loss: 347.7739 - accuracy: 0.5994 - val_loss: 311.8232 - val_accuracy: 0.6144 Epoch 746/1000 1605/1605 [==============================] - 0s 127us/step - loss: 355.0923 - accuracy: 0.5944 - val_loss: 313.6388 - val_accuracy: 0.6045 Epoch 747/1000 1605/1605 [==============================] - 0s 135us/step - loss: 358.6220 - accuracy: 0.5956 - val_loss: 318.4617 - val_accuracy: 0.6020 Epoch 748/1000 1605/1605 [==============================] - 0s 125us/step - loss: 351.6705 - accuracy: 0.5988 - val_loss: 313.7923 - val_accuracy: 0.5945 Epoch 749/1000 1605/1605 [==============================] - 0s 84us/step - loss: 348.1054 - accuracy: 0.5950 - val_loss: 311.4936 - val_accuracy: 0.5896 Epoch 750/1000 1605/1605 [==============================] - 0s 80us/step - loss: 358.0902 - accuracy: 0.5869 - val_loss: 316.2030 - val_accuracy: 0.5920 Epoch 751/1000 1605/1605 [==============================] - 0s 80us/step - loss: 350.9002 - accuracy: 0.5969 - val_loss: 312.5339 - val_accuracy: 0.6095 Epoch 752/1000 1605/1605 [==============================] - 0s 85us/step - loss: 348.3896 - accuracy: 0.5919 - val_loss: 313.3131 - val_accuracy: 0.6244 Epoch 753/1000 1605/1605 [==============================] - 0s 136us/step - loss: 348.3556 - accuracy: 0.5938 - val_loss: 312.8311 - val_accuracy: 0.6169 Epoch 754/1000 1605/1605 [==============================] - 0s 82us/step - loss: 358.2827 - accuracy: 0.6012 - val_loss: 310.4286 - 
val_accuracy: 0.6119 Epoch 755/1000 1605/1605 [==============================] - 0s 81us/step - loss: 350.9497 - accuracy: 0.6019 - val_loss: 311.4579 - val_accuracy: 0.6194 Epoch 756/1000 1605/1605 [==============================] - 0s 117us/step - loss: 350.7590 - accuracy: 0.5894 - val_loss: 314.4956 - val_accuracy: 0.6194 Epoch 757/1000 1605/1605 [==============================] - 0s 80us/step - loss: 355.1387 - accuracy: 0.5975 - val_loss: 314.9977 - val_accuracy: 0.6070 Epoch 758/1000 1605/1605 [==============================] - 0s 87us/step - loss: 354.4466 - accuracy: 0.6131 - val_loss: 311.3715 - val_accuracy: 0.6219 Epoch 759/1000 1605/1605 [==============================] - 0s 111us/step - loss: 353.9934 - accuracy: 0.6112 - val_loss: 311.8427 - val_accuracy: 0.6144 Epoch 760/1000 1605/1605 [==============================] - 0s 145us/step - loss: 347.4272 - accuracy: 0.5975 - val_loss: 315.7053 - val_accuracy: 0.5945 Epoch 761/1000 1605/1605 [==============================] - 0s 97us/step - loss: 350.3649 - accuracy: 0.5994 - val_loss: 311.2919 - val_accuracy: 0.6169 Epoch 762/1000 1605/1605 [==============================] - 0s 257us/step - loss: 349.5032 - accuracy: 0.5969 - val_loss: 311.1790 - val_accuracy: 0.6219 Epoch 763/1000 1605/1605 [==============================] - 0s 161us/step - loss: 345.1139 - accuracy: 0.5907 - val_loss: 311.1968 - val_accuracy: 0.6020 Epoch 764/1000 1605/1605 [==============================] - 0s 253us/step - loss: 353.7695 - accuracy: 0.6069 - val_loss: 314.6137 - val_accuracy: 0.6144 Epoch 765/1000 1605/1605 [==============================] - 0s 140us/step - loss: 349.4148 - accuracy: 0.5963 - val_loss: 314.0014 - val_accuracy: 0.5970 Epoch 766/1000 1605/1605 [==============================] - 0s 92us/step - loss: 349.4741 - accuracy: 0.6012 - val_loss: 311.7456 - val_accuracy: 0.6144 Epoch 767/1000 1605/1605 [==============================] - 0s 115us/step - loss: 351.9121 - accuracy: 0.6100 - val_loss: 311.0975 - 
[Training log, epochs 768-1000 of 1000 (1605 samples per epoch), truncated: training loss hovers around 335-355 with accuracy roughly 0.60-0.64; validation loss stays around 310-325 with val_accuracy plateauing near 0.62-0.67. Final epoch 1000/1000: loss: 345.7612 - accuracy: 0.6318 - val_loss: 316.2685 - val_accuracy: 0.6343]

(four output figures)

keras.engine.training.Model

((223, 80), (223, 80))

outlier_strength is used as the outlier-strength score.
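The notebook does not show how outlier_strength is defined, but given the Keras model and the matched (223, 80) input/output shapes above, one plausible construction (an assumption on my part, not the author's confirmed method) is the per-row reconstruction error of an autoencoder: rows the model reconstructs poorly get a larger score.

```python
import numpy as np

def outlier_strength(X, X_reconstructed):
    """Per-row mean squared reconstruction error as an outlier score.
    Both arrays are (n_samples, n_features), e.g. (223, 80) as above."""
    return ((X - X_reconstructed) ** 2).mean(axis=1)

# Toy example: the second row is reconstructed badly, so it scores higher.
X = np.array([[0.0, 0.0], [1.0, 1.0]])
X_hat = np.array([[0.1, 0.1], [0.0, 0.0]])
print(outlier_strength(X, X_hat))
```

With a trained autoencoder `model`, `X_reconstructed` would simply be `model.predict(X)`.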


(two output figures)

10 Python Package Notes

getLast14 get_seeds log_loss np prepare_data stats_data sum_dict test

References