
What is this Excel notebook about?

This notebook explores different flavors/variations of Stochastic Gradient Descent, the key optimization method used to train NNs. To keep things simple, instead of working with neural networks here, we will use a simple linear model (y=ax+b).
NB: Click File->Make a Copy before using this spreadsheet.

Goal: We want to create a model so that we can predict y when we're given x.
To build the model, we are given training data x, y and we want to find an optimal a, b to create a model of the form y = ax + b.

The key idea of SGD:

1. Randomly choose some weights to start.
2. Use those weights to calculate a prediction, calculate the error/loss, and then update those weights.
3. Repeat step 2 lots of times. Eventually we end up with some decent weights. (A short code sketch of this loop follows below.)
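To make the loop concrete, here is a minimal Python sketch (not part of the spreadsheet) using the same setup as the sheets: data generated from y = 2x + 30, starting guesses a = b = 1, and the 'basic SGD' learning rate of 0.0001.

```python
import random

# Generate training data the way the 'data' sheet does: y = 2x + 30.
random.seed(0)
xs = [random.randint(0, 100) for _ in range(30)]
ys = [2 * x + 30 for x in xs]

a, b = 1.0, 1.0   # step 1: start from some initial weights
lr = 0.0001       # learning rate from the 'basic SGD' sheet

for epoch in range(200):          # step 3: repeat lots of times
    for x, y in zip(xs, ys):      # step 2, one example at a time
        y_pred = a * x + b        # prediction
        de_db = 2 * (y_pred - y)  # de/db = 2(ax + b - y)
        de_da = x * de_db         # de/da = x * 2(ax + b - y)
        a -= lr * de_da           # update the weights
        b -= lr * de_db

print(a, b)  # drifts toward the true values a = 2, b = 30
```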

How does this relate to NNs? To make an analogy with CNNs: x would be our images, y would be the dog/cat label.
We are trying to find the optimal weights (a, b). We use the same optimization methods to find weights for our NN.

Overview of the sheets in this notebook:

data: We create our data by arbitrarily choosing a and b (a=2, b=30), randomly generating x, and then calculating y. This gives us training data x, y to use on the other sheets. In a real NN the data would come from another source; in our CNN analogy, x is the image data and y is the category (dog or cat). Since we know the true values of a and b, we can get a sense of how good our answers really are, and see whether our optimization algorithms help us find them.

basic SGD: The most basic, vanilla form of stochastic gradient descent. If we were doing plain gradient descent, we would evaluate the error on all our data and then update the weights a, b. What makes it stochastic is that we evaluate the error on just a subset of our data (in this case, a single example as our "mini-batch") and then update the weights.

momentum: Accelerates SGD in the relevant direction and dampens side-to-side oscillations. It does this by combining a decaying average of previous gradients with the current gradient.

adagrad: Adapts the learning rate by dividing it by the sqrt of the average of the previous gradients squared.

rmsprop: Adapts the learning rate by dividing it by an exponentially decaying average of squared gradients.

adam: rmsprop + momentum.

adagrad_ann: adagrad with learning rate annealing.

adam_ann: adam with learning rate annealing.

(Short code sketches of each update rule appear after the corresponding tables below.)

On each sheet, click 'Tools->Macros->reset_coeffs' to return the a and b guesses to 1.
Click 'Tools->Macros->run_some' to copy the updated a and b guesses to the top row and recalculate, running the optimization for another pass. Currently the macros only work on the 'basic SGD' sheet.

Some comments have been added to the other sheets in this notebook.
Comments are indicated by a small red triangle in the corner of a cell. Hover your mouse over that cell to read them.
data sheet — the true weights used to generate the training data:
b (const) 30
a (slope) 2

x y=a*x+b
36 102
55 140
23 76
91 212
84 198
61 152
95 220
91 212
84 198
6 42
30 90
46 122
36 102
44 118
26 82
97 224
61 152
35 100
64 158
74 178
78 186
64 158
14 58
4 38
23 76
78 186
63 156
46 122
82 194
basic SGD sheet — starting guesses intercept (b) = 1, slope (a) = 1; learn (learning rate) = 0.0001
Loss: e = (ax+b-y)^2, with de/db = 2(ax+b-y).
Columns: errb1 and erra1 are the errors recomputed after nudging b or a by a small step (0.01, judging from the numbers), giving the finite-difference estimates 'est de/db' and 'est de/da'; the final 'de/db' is the analytic derivative.
x y intercept slope y_pred err^2 errb1 est de/db erra1 est de/da de/db
14 58 1 1 15 1,849 1,848.14 -85.99 1,836.98 -1,202.04 -86.00
86 202 1.01 1.12 97.36 10,949 10,946.81 -209.26 10,769.67 -17,923.60 -209.27
28 86 1.03 2.92 82.79 10 10.22 -6.40 8.56 -171.70 -6.41
51 132 1.03 2.94 150.87 356 356.60 37.76 375.73 1,951.14 37.75
28 86 1.03 2.75 77.90 66 65.40 -16.18 61.10 -445.58 -16.19
29 88 1.03 2.79 81.97 36 36.30 -12.06 33.00 -341.60 -12.07
72 174 1.03 2.83 204.50 930 930.68 61.00 974.50 4,443.41 60.99
62 154 1.02 2.39 149.00 25 24.86 -9.98 19.15 -581.09 -9.99
84 198 1.02 2.45 206.72 76 76.18 17.45 91.36 1,535.20 17.44
15 60 1.02 2.30 35.56 597 597.00 -48.88 590.17 -731.06 -48.89
42 114 1.03 2.38 100.80 174 173.91 -26.38 163.26 -1,090.94 -26.39
62 154 1.03 2.49 155.19 1 1.44 2.39 3.28 186.07 2.38
47 124 1.03 2.47 117.20 46 46.11 -13.59 40.07 -617.15 -13.60
35 100 1.03 2.54 89.78 104 104.29 -20.43 97.46 -703.30 -20.44
9 48 1.03 2.61 24.50 552 551.89 -46.99 548.14 -422.23 -47.00
38 106 1.04 2.65 101.72 18 18.25 -8.55 15.22 -310.98 -8.56
44 118 1.04 2.68 119.05 1 1.12 2.11 2.21 111.56 2.10
99 228 1.04 2.67 265.65 1,417 1,417.98 75.30 1,492.75 7,551.94 75.29
13 56 1.03 1.93 26.09 895 894.17 -59.82 887.01 -776.04 -59.83
21 72 1.04 2.01 43.15 833 831.99 -57.70 820.49 -1,207.47 -57.71
28 86 1.04 2.13 60.58 646 645.61 -50.83 631.97 -1,415.62 -50.84
20 70 1.05 2.27 46.42 556 555.45 -47.15 546.53 -939.12 -47.16
8 46 1.05 2.36 19.96 678 677.73 -52.08 674.09 -416.05 -52.09
64 158 1.06 2.40 154.96 9 9.19 -6.07 5.77 -348.36 -6.08
99 228 1.06 2.44 242.98 224 224.63 29.97 254.97 3,063.62 29.96
70 170 1.06 2.15 151.35 348 347.44 -37.29 322.19 -2,561.96 -37.30
27 84 1.06 2.41 66.08 321 320.79 -35.83 311.54 -960.42 -35.84
17 64 1.06 2.50 43.65 414 413.86 -40.70 407.37 -689.13 -40.71
8 46 1.07 2.57 21.66 592 591.96 -48.67 588.56 -388.80 -48.68
rmse 151 301.51

Paste error here:

RMSE

Gradient formulas: de/da = x*2(ax+b-y), de/db = 2(ax+b-y)

Continuation of the table above (remaining columns for each row):
de/da new a new b
-1,204.00 1.12 1.01
-17,997.56 2.92 1.03
-179.54 2.94 1.03
1,925.13 2.75 1.03
-453.42 2.79 1.03
-350.01 2.83 1.03
4,391.57 2.39 1.02
-619.53 2.45 1.02
1,464.64 2.30 1.02
-733.31 2.38 1.03
-1,108.58 2.49 1.03
147.63 2.47 1.03
-639.24 2.54 1.03
-715.55 2.61 1.03
-423.04 2.65 1.04
-325.42 2.68 1.04
92.20 2.67 1.04
7,453.93 1.93 1.03
-777.73 2.01 1.04
-1,211.88 2.13 1.04
-1,423.46 2.27 1.05
-943.12 2.36 1.05
-416.69 2.40 1.06
-389.32 2.44 1.06
2,965.61 2.15 1.06
-2,610.96 2.41 1.06
-967.71 2.50 1.06
-692.02 2.57 1.07
-389.44 2.61 1.07
20,791.83
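The 'errb1'/'est de/db' and 'erra1'/'est de/da' columns check the analytic derivatives numerically: the error is recomputed with b (or a) nudged by 0.01, and the change in error divided by 0.01 approximates the derivative. A small Python sketch of that check, reproducing the first row of the table:

```python
def loss(a, b, x, y):
    # Squared error, as on the sheet: e = (ax + b - y)^2
    return (a * x + b - y) ** 2

def analytic_grads(a, b, x, y):
    # de/db = 2(ax + b - y), de/da = x * de/db
    de_db = 2 * (a * x + b - y)
    return x * de_db, de_db

def finite_diff_grads(a, b, x, y, eps=0.01):
    # Nudge each weight by eps and divide the change in loss by eps,
    # like the sheet's 'est de/da' and 'est de/db' columns.
    base = loss(a, b, x, y)
    de_da = (loss(a + eps, b, x, y) - base) / eps
    de_db = (loss(a, b + eps, x, y) - base) / eps
    return de_da, de_db

# First row of the table: x = 14, y = 58, a = 1, b = 1.
print(analytic_grads(1, 1, 14, 58))     # (-1204.0, -86.0)
print(finite_diff_grads(1, 1, 14, 58))  # (-1202.04, -85.99), the 'est' columns
```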
momentum sheet — current guesses b = 1.2611, a = 2.3580; learn = 0.0001; beta = 0.9 (so the current gradient gets weight 0.1)
The last two columns are the momentum terms: exponentially decaying averages of de/db and de/da, carried in from the previous pass as -19.28 and 162.84.
x y b a pred de/db de/da new b new a avg de/db avg de/da
14 58 1.2611 2.3580 34.2736 -47.45 -664.34 1.26 2.35 -22.1 80.122
86 202 1.2633 2.3500 203.3652 2.73 234.82 1.27 2.34 -19.61 95.591
28 86 1.2652 2.3405 66.7982 -38.40 -1075.30 1.27 2.34 -21.49 -21.5
51 132 1.2674 2.3426 120.7407 -22.52 -1148.45 1.27 2.36 -21.6 -134.2
28 86 1.2696 2.3560 67.2385 -37.52 -1050.65 1.27 2.38 -23.19 -225.8
29 88 1.2719 2.3786 70.2518 -35.50 -1029.40 1.27 2.41 -24.42 -306.2
72 174 1.2743 2.4092 174.7393 1.48 106.46 1.28 2.44 -21.83 -264.9
62 154 1.2765 2.4357 152.2917 -3.42 -211.83 1.28 2.46 -19.99 -259.6
84 198 1.2785 2.4617 208.0605 20.12 1690.17 1.28 2.47 -15.98 -64.64
15 60 1.2801 2.4682 38.3024 -43.40 -650.93 1.28 2.48 -18.72 -123.3
42 114 1.2820 2.4805 105.4622 -17.08 -717.18 1.28 2.50 -18.55 -182.7
62 154 1.2838 2.4987 156.2062 4.41 273.57 1.29 2.51 -16.26 -137
47 124 1.2855 2.5125 119.3707 -9.26 -435.16 1.29 2.53 -15.56 -166.8
35 100 1.2870 2.5291 89.8068 -20.39 -713.53 1.29 2.55 -16.04 -221.5
9 48 1.2886 2.5513 24.2502 -47.50 -427.50 1.29 2.58 -19.19 -242.1
38 106 1.2905 2.5755 99.1595 -13.68 -519.88 1.29 2.60 -18.64 -269.9
44 118 1.2924 2.6025 115.8019 -4.40 -193.43 1.29 2.63 -17.21 -262.2
99 228 1.2941 2.6287 261.5367 67.07 6640.27 1.29 2.59 -8.784 428.01
13 56 1.2950 2.5859 34.9119 -42.18 -548.29 1.30 2.55 -12.12 330.38
21 72 1.2962 2.5529 54.9066 -34.19 -717.92 1.30 2.53 -14.33 225.55
28 86 1.2976 2.5303 72.1466 -27.71 -775.79 1.30 2.52 -15.67 125.41
20 70 1.2992 2.5178 51.6548 -36.69 -733.81 1.30 2.51 -17.77 39.491
8 46 1.3010 2.5138 21.4116 -49.18 -393.41 1.30 2.51 -20.91 -3.8
64 158 1.3031 2.5142 162.2125 8.43 539.20 1.30 2.51 -17.98 50.5
99 228 1.3049 2.5092 249.7117 43.42 4298.92 1.31 2.46 -11.84 475.34
70 170 1.3061 2.4616 173.6199 7.24 506.78 1.31 2.41 -9.929 478.49
27 84 1.3070 2.4138 66.4790 -35.04 -946.13 1.31 2.38 -12.44 336.02
17 64 1.3083 2.3802 41.7713 -44.46 -755.78 1.31 2.36 -15.64 226.84
8 46 1.3099 2.3575 20.1698 -51.66 -413.28 1.31 2.34 -19.24 162.83
-19.28 162.84
Paste error here:
RMSE
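A minimal sketch of the update this sheet is computing: rather than stepping along the raw gradient, each step follows an exponentially decaying average of the gradients (beta = 0.9, learn = 0.0001, as in the header above):

```python
lr, beta = 0.0001, 0.9

def momentum_step(w, grad, avg):
    # Blend the running average of past gradients with the current one,
    # then step along the blended value.
    avg = beta * avg + (1 - beta) * grad
    return w - lr * avg, avg

# First row of the momentum sheet: b = 1.2611, de/db = -47.45,
# carried-in average = -19.28.
b, avg_b = momentum_step(1.2611, -47.45, -19.28)
print(b, avg_b)  # ~1.2633 and ~-22.10, matching the next row of the table
```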
adagrad sheet — starting guesses b = 1.0000, a = 1.0000; learn = 0.001
The base rate is divided by the sqrt of the average squared gradient from the previous pass (28.09 for b, 1918.71 for a), giving the per-weight rates 3.56E-05 and 5.21E-07.
x y b a pred de/db de/da new b new a
14 58 1.0000 1.0000 15.0000 -86.00 -1204.00 1.00 1.00
86 202 1.0031 1.0006 87.0570 -229.89 -19770.19 1.01 1.01
28 86 1.0112 1.0109 29.3173 -113.37 -3174.23 1.02 1.01
51 132 1.0153 1.0126 52.6572 -158.69 -8092.97 1.02 1.02
28 86 1.0209 1.0168 29.4914 -113.02 -3164.48 1.02 1.02
29 88 1.0250 1.0185 30.5601 -114.88 -3331.51 1.03 1.02
72 174 1.0290 1.0202 74.4827 -199.03 -14330.50 1.04 1.03
62 154 1.0361 1.0277 64.7509 -178.50 -11066.88 1.04 1.03
84 198 1.0425 1.0334 87.8503 -220.30 -18505.16 1.05 1.04
15 60 1.0503 1.0431 16.6964 -86.61 -1299.11 1.05 1.04
42 114 1.0534 1.0437 44.8908 -138.22 -5805.17 1.06 1.05
62 154 1.0583 1.0468 65.9583 -176.08 -10917.18 1.06 1.05
47 124 1.0646 1.0525 50.5304 -146.94 -6906.15 1.07 1.06
35 100 1.0698 1.0561 38.0320 -123.94 -4337.76 1.07 1.06
9 48 1.0742 1.0583 10.5991 -74.80 -673.22 1.08 1.06
38 106 1.0769 1.0587 41.3065 -129.39 -4916.71 1.08 1.06
44 118 1.0815 1.0612 47.7759 -140.45 -6179.72 1.09 1.06
99 228 1.0865 1.0645 106.4678 -243.06 -24063.38 1.10 1.08
13 56 1.0952 1.0770 15.0961 -81.81 -1063.50 1.10 1.08
21 72 1.0981 1.0776 23.7267 -96.55 -2027.48 1.10 1.08
28 86 1.1015 1.0786 31.3026 -109.39 -3063.06 1.11 1.08
20 70 1.1054 1.0802 22.7095 -94.58 -1891.62 1.11 1.08
8 46 1.1088 1.0812 9.7583 -72.48 -579.87 1.11 1.08
64 158 1.1113 1.0815 70.3270 -175.35 -11222.15 1.12 1.09
99 228 1.1176 1.0873 108.7645 -238.47 -23608.62 1.13 1.10
70 170 1.1261 1.0996 78.1014 -183.80 -12865.80 1.13 1.11
27 84 1.1326 1.1064 31.0041 -105.99 -2861.78 1.14 1.11
17 64 1.1364 1.1078 19.9697 -88.06 -1497.03 1.14 1.11
8 46 1.1395 1.1086 10.0085 -71.98 -575.86 1.14 1.11
27.32 1857.07
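A sketch of what the adagrad sheet does: the base rate is divided, per weight, by the sqrt of the average squared gradient. On this sheet those divisors come from the previous pass and stay fixed for all 29 rows:

```python
import math

base_lr = 0.001

def adagrad_rate(base_lr, past_grads):
    # Divide the base rate by the sqrt of the average squared gradient.
    avg_sq = sum(g * g for g in past_grads) / len(past_grads)
    return base_lr / math.sqrt(avg_sq)

# The sheet carries in divisors of 28.09 (for b) and 1918.71 (for a),
# giving the per-weight rates shown in the header; each row then does
# b -= rate_b * de_db and a -= rate_a * de_da.
print(base_lr / 28.09, base_lr / 1918.71)  # ~3.56e-05, ~5.21e-07
```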
rmsprop sheet — starting guesses b = 1.0000, a = 1.0000; learn = 0.02; decay 0.9 (the current squared gradient gets weight 0.1)
The last two columns are exponentially decaying averages of the squared gradients, carried in as 9309.63 and 38847505.0494841; each update divides the learning rate by the sqrt of this average.
x y b a pred de/db de/da new b new a avg (de/db)^2 avg (de/da)^2
14 58 1.0000 1.0000 15.0000 -86.00 -1204.00 1.02 1.00 9118.268 35107716.1445357
86 202 1.0178 1.0039 87.3501 -229.30 -19719.79 1.07 1.07 13464.28 70483939.5994261
28 86 1.0659 1.0704 31.0378 -109.92 -3077.88 1.08 1.08 13326.19 64382882.6752002
51 132 1.0848 1.0778 56.0505 -151.90 -7746.85 1.11 1.10 14300.91 63945964.9811008
28 86 1.1111 1.0971 31.8290 -108.34 -3033.58 1.13 1.10 14044.61 58471626.2905753
29 88 1.1292 1.1047 33.1642 -109.67 -3180.47 1.15 1.11 13842.94 53636005.5711338
72 174 1.1477 1.1130 81.2818 -185.44 -13351.41 1.18 1.15 15897.31 66098433.1021482
62 154 1.1793 1.1494 72.4442 -163.11 -10112.92 1.21 1.17 16968.11 69715700.411518
84 198 1.2051 1.1743 99.8474 -196.31 -16489.64 1.24 1.21 19124.88 89934955.7390569
15 60 1.2353 1.2138 19.4424 -81.12 -1216.73 1.25 1.22 17870.36 81089502.6100171
42 114 1.2470 1.2164 52.3348 -123.33 -5179.88 1.27 1.23 17604.36 75663662.9620357
62 154 1.2655 1.2279 77.3941 -153.21 -9499.13 1.29 1.25 18191.31 77120651.9371483
47 124 1.2886 1.2497 60.0255 -127.95 -6013.60 1.31 1.26 18009.27 73024930.9992305
35 100 1.3075 1.2634 45.5271 -108.95 -3813.10 1.32 1.27 17395.26 67176411.3947884
9 48 1.3238 1.2723 12.7748 -70.45 -634.05 1.33 1.27 16152.06 60498972.5601813
38 106 1.3344 1.2739 49.7422 -112.52 -4275.59 1.35 1.28 15802.83 56277143.112958
44 118 1.3522 1.2849 57.8870 -120.23 -5289.94 1.37 1.30 15668 53447779.976595
99 228 1.3713 1.2990 129.9709 -196.06 -19409.77 1.40 1.35 17945.06 85776901.0769887
13 56 1.4026 1.3521 18.9797 -74.04 -962.53 1.41 1.35 16698.76 77291856.8927518
21 72 1.4137 1.3542 29.8511 -84.30 -1770.25 1.43 1.36 15739.49 69876051.1733122
28 86 1.4267 1.3582 39.4560 -93.09 -2606.46 1.44 1.36 15032.08 63567810.280457
20 70 1.4415 1.3644 28.7301 -82.54 -1650.80 1.46 1.37 14210.15 57483542.2371786
8 46 1.4550 1.3686 12.4036 -67.19 -537.54 1.47 1.37 13240.63 51764083.2743327
64 158 1.4663 1.3700 89.1454 -137.71 -8813.39 1.49 1.39 13812.95 54355265.1535157
99 228 1.4902 1.3945 139.5443 -176.91 -17514.24 1.52 1.44 15561.42 79594590.2120739
70 170 1.5203 1.4420 102.4601 -135.08 -9455.58 1.54 1.46 15829.93 80575939.7122253
27 84 1.5420 1.4632 41.0482 -85.90 -2319.40 1.56 1.47 14984.88 73056305.5273374
17 64 1.5556 1.4684 26.5178 -74.96 -1274.40 1.57 1.47 14048.36 65913083.2759256
8 46 1.5679 1.4713 13.3386 -65.32 -522.58 1.58 1.47 13070.23 59349084.1217769
9309.63 38847505.0494841
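A sketch of the rmsprop update on this sheet. The numbers indicate the step uses the carried-in average of squared gradients, which is then decayed toward the current squared gradient (learn = 0.02, decay = 0.9):

```python
import math

lr, decay = 0.02, 0.9

def rmsprop_step(w, grad, sq_avg):
    # Step with the learning rate divided by the sqrt of the running
    # average of squared gradients, then update that average.
    w_new = w - lr * grad / math.sqrt(sq_avg)
    sq_avg = decay * sq_avg + (1 - decay) * grad ** 2
    return w_new, sq_avg

# First row of the rmsprop sheet: b = 1.0, de/db = -86, carried-in avg = 9309.63.
b, sq_b = rmsprop_step(1.0, -86.0, 9309.63)
print(b, sq_b)  # ~1.0178 and ~9118.27, matching the next row of the table
```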
adam sheet — starting guesses b = 1.0000, a = 1.0000; learn = 1; betas 0.7/0.3 (gradient average) and 0.9/0.1 (squared-gradient average)
The last four columns are the decaying averages of de/db and de/da (carried in as -19.37 and -637) and of their squares (carried in as 5490.00 and 27555653.26).
x y b a pred de/db de/da new b new a avg de/db avg de/da avg (de/db)^2 avg (de/da)^2
14 58 1.0000 1.0000 15.0000 -86.00 -1204.00 1.52 1.16 -39.36 -807.1 5680.599 24945049.534
86 202 1.5222 1.1616 101.4192 -201.16 -17299.90 2.44 1.96 -87.9 -5755 9159.142 52379213.396
28 86 2.4407 1.9568 57.2300 -57.54 -1611.12 3.29 2.61 -78.79 -4512 8574.312 47400862.323
51 132 3.2916 2.6121 136.5079 9.02 459.81 3.89 3.07 -52.45 -3020 7725.009 42681918.295
28 86 3.8883 3.0744 89.9713 7.94 222.39 4.30 3.40 -34.33 -2047 6958.817 38418672.216
29 88 4.2999 3.4047 103.0369 30.07 872.14 4.49 3.60 -15.01 -1172 6353.378 34652867.479
72 174 4.4882 3.6037 263.9582 179.92 12953.98 4.03 3.16 43.468 3066.1 8955.032 47968146.799
62 154 4.0289 3.1611 200.0142 92.03 5705.76 3.41 2.59 58.036 3858 8906.452 46426905.059
84 198 3.4139 2.5948 221.3811 46.76 3928.02 2.81 2.01 54.654 3879 8234.477 43327151.606
15 60 2.8116 2.0055 32.8948 -54.21 -813.16 2.56 1.61 21.995 2471.3 7704.906 39060558.742
42 114 2.5611 1.6101 70.1860 -87.63 -3680.37 2.69 1.51 -10.89 625.83 7702.281 36509016.712
62 154 2.6852 1.5065 96.0908 -115.82 -7180.74 3.15 1.78 -42.37 -1716 8273.441 38014410.983
47 124 3.1510 1.7849 87.0406 -73.92 -3474.19 3.73 2.16 -51.83 -2244 7992.496 35419966.864
35 100 3.7308 2.1619 79.3959 -41.21 -1442.29 4.30 2.52 -48.65 -2003 7363.059 32085989.939
9 48 4.2977 2.5155 26.9372 -42.13 -379.13 4.86 2.80 -46.69 -1516 6804.209 28891764.924
38 106 4.8637 2.7975 111.1700 10.34 392.92 5.24 2.98 -29.58 -943.3 6134.48 26018026.954
44 118 5.2414 2.9825 136.4699 36.94 1625.35 5.37 3.02 -9.625 -172.7 5657.486 23680399.66
99 228 5.3694 3.0180 304.1469 152.29 15077.08 4.92 2.35 38.951 4402.2 7411.076 44044197.472
13 56 4.9169 2.3546 35.5271 -40.95 -532.30 4.74 1.89 14.982 2921.9 6837.625 39668111.633
21 72 4.7358 1.8907 44.4406 -55.12 -1157.49 4.81 1.61 -6.048 1698.1 6457.67 35835279.51
28 86 4.8110 1.6070 49.8084 -72.38 -2026.73 5.14 1.51 -25.95 580.63 6335.836 32662515.323
20 70 5.1370 1.5055 35.2461 -69.51 -1390.16 5.63 1.51 -39.02 -10.61 6185.386 29589517.193
8 46 5.6331 1.5074 17.6924 -56.62 -452.92 6.21 1.54 -44.3 -143.3 5887.377 26651079.341
64 158 6.2104 1.5352 104.4608 -107.08 -6853.01 7.00 1.94 -63.13 -2156 6445.216 28682350.263
99 228 6.9968 1.9378 198.8363 -58.33 -5774.41 7.78 2.54 -61.69 -3242 6140.902 29148496.065
70 170 7.7840 2.5382 185.4581 30.92 2164.14 8.24 2.85 -33.91 -1620 5622.394 26701994.871
27 84 8.2362 2.8517 85.2319 2.46 66.52 8.56 3.08 -23 -1114 5060.761 24032237.917
17 64 8.5595 3.0789 60.9013 -6.20 -105.35 8.83 3.25 -17.96 -811.4 4558.526 21630124.081
8 46 8.8254 3.2534 34.8526 -22.29 -178.36 9.12 3.39 -19.26 -621.5 4152.379 19470292.834
-19.37 -637 5490.00 27555653.26
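A sketch of the adam update as this sheet computes it: a momentum-style average of the gradients divided by the sqrt of an rmsprop-style average of squared gradients (learn = 1, betas 0.7 and 0.9; note the sheet's numbers show no bias-correction terms, unlike full Adam):

```python
import math

lr, beta1, beta2 = 1.0, 0.7, 0.9

def adam_step(w, grad, mom, sq_avg):
    # momentum piece: decaying average of gradients
    mom = beta1 * mom + (1 - beta1) * grad
    # rmsprop piece: decaying average of squared gradients
    sq_avg = beta2 * sq_avg + (1 - beta2) * grad ** 2
    return w - lr * mom / math.sqrt(sq_avg), mom, sq_avg

# First row of the adam sheet: b = 1.0, de/db = -86,
# carried-in averages -19.37 and 5490.00.
b, mom_b, sq_b = adam_step(1.0, -86.0, -19.37, 5490.00)
print(b, mom_b, sq_b)  # ~1.522, ~-39.36, ~5680.6, matching the next row
```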
adagrad_ann sheet — current guesses b = 29.4062, a = 2.0217; annealed learn = 0.0039063 (other header cells: 0.275855, 16.186122587, 100)
The per-weight rates 0.014161 (= 0.0039063/0.275855) and 0.0002413333 (= 0.0039063/16.186122587) divide the annealed rate by the sqrt-average-squared gradients from the previous pass.
x y b a pred de/db de/da new a new b
14 58 29.4062 2.0217 57.7102 -0.58 -8.11 2.02 29.41
86 202 29.4145 2.0237 203.4500 2.90 249.40 1.96 29.37
28 86 29.3734 1.9635 84.3508 -3.30 -92.35 1.99 29.42
51 132 29.4201 1.9858 130.6943 -2.61 -133.18 2.02 29.46
28 86 29.4571 2.0179 85.9586 -0.08 -2.32 2.02 29.46
29 88 29.4582 2.0185 87.9939 -0.01 -0.35 2.02 29.46
72 174 29.4584 2.0186 174.7944 1.59 114.40 1.99 29.44
62 154 29.4359 1.9909 152.8747 -2.25 -139.54 2.02 29.47
84 198 29.4678 2.0246 199.5362 3.07 258.08 1.96 29.42
15 60 29.4243 1.9623 58.8594 -2.28 -34.22 1.97 29.46
42 114 29.4566 1.9706 112.2218 -3.56 -149.37 2.01 29.51
62 154 29.5070 2.0066 153.9191 -0.16 -10.03 2.01 29.51
47 124 29.5092 2.0091 123.9355 -0.13 -6.07 2.01 29.51
35 100 29.5111 2.0105 99.8797 -0.24 -8.42 2.01 29.51
9 48 29.5145 2.0126 47.6276 -0.74 -6.70 2.01 29.53
38 106 29.5250 2.0142 106.0640 0.13 4.86 2.01 29.52
44 118 29.5232 2.0130 118.0956 0.19 8.42 2.01 29.52
99 228 29.5205 2.0110 228.6074 1.21 120.26 1.98 29.50
13 56 29.5033 1.9820 55.2687 -1.46 -19.01 1.99 29.52
21 72 29.5240 1.9865 71.2414 -1.52 -31.86 1.99 29.55
28 86 29.5455 1.9942 85.3840 -1.23 -34.50 2.00 29.56
20 70 29.5629 2.0026 69.6141 -0.77 -15.44 2.01 29.57
8 46 29.5739 2.0063 45.6241 -0.75 -6.01 2.01 29.58
64 158 29.5845 2.0077 158.0795 0.16 10.18 2.01 29.58
99 228 29.5823 2.0053 228.1048 0.21 20.75 2.00 29.58
70 170 29.5793 2.0003 169.5982 -0.80 -56.25 2.01 29.59
27 84 29.5907 2.0138 83.9645 -0.07 -1.92 2.01 29.59
17 64 29.5917 2.0143 63.8349 -0.33 -5.61 2.02 29.60
8 46 29.5964 2.0157 45.7217 -0.56 -4.45 2.02 29.60
0.29 16.63 0.01
0.00815
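The two _ann sheets add learning-rate annealing: the base rate is shrunk as training proceeds, so steps get smaller once the weights are close (here b ≈ 29.4, a ≈ 2.02, already near the true 30 and 2). The rate 0.0039063 ≈ 1/256 looks like the result of repeated halving; the exact schedule isn't visible in this export, so the halving interval below is a hypothetical placeholder:

```python
def annealed_lr(base_lr, n_passes, halve_every=100):
    # Halve the learning rate every `halve_every` passes
    # (hypothetical schedule; the sheet's actual trigger isn't shown).
    return base_lr * 0.5 ** (n_passes // halve_every)

print(annealed_lr(1.0, 0))    # 1.0
print(annealed_lr(1.0, 800))  # 0.00390625 = 1/256, the rate seen on these sheets
```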
adam_ann sheet — current guesses b = 29.9909, a = 2.0042; annealed learn = 0.00390625; betas 0.9/0.1 (gradient average) and 0.95/0.05 (squared-gradient average)
The trailing columns hold the decaying averages of the gradients and of the squared gradients carried between rows (carried in as -0.048, -4.064, 0.03, 161.90, plus a further header cell 10.32628).
x y b a pred de/db de/da new b new a avg de/db avg de/da avg (de/db)^2 avg (de/da)^2
14 58 29.99 2.00 58.05 0.10 1.38 29.99 2.01 -0.034 -3.52 0.024497 153.89633477
86 202 29.99 2.01 202.45 0.89 76.56 29.99 2.00 -0.03 4.4885 0.062903 439.30955681
28 86 29.99 2.00 86.12 0.23 6.52 29.99 2.00 -0.004 4.6917 0.062469 419.46997186
51 132 29.99 2.00 132.17 0.35 17.63 29.99 2.00 0.0309 5.9859 0.065323 414.04277507
28 86 29.99 2.00 86.06 0.12 3.29 29.99 2.00 0.0396 5.7165 0.062748 393.88274096
29 88 29.99 2.00 88.03 0.06 1.62 29.99 2.00 0.0412 5.307 0.059767 374.31993803
72 174 29.99 2.00 174.00 0.01 0.66 29.99 2.00 0.038 4.8422 0.056783 355.62565959
62 154 29.99 2.00 153.94 -0.12 -7.46 29.99 2.00 0.0222 3.6117 0.054668 340.628522665
84 198 29.99 2.00 197.86 -0.29 -23.95 29.99 2.00 -0.009 0.8558 0.055998 352.27217923
15 60 29.99 2.00 59.96 -0.07 -1.10 29.99 2.00 -0.015 0.6606 0.053465 334.71867901
42 114 29.99 2.00 113.91 -0.18 -7.52 29.99 2.00 -0.031 -0.157 0.052394 320.80844935
62 154 29.99 2.00 153.88 -0.25 -15.46 29.99 2.00 -0.053 -1.687 0.052883 316.71788185
47 124 29.99 2.00 123.92 -0.16 -7.38 29.99 2.00 -0.064 -2.257 0.051471 303.60415205
35 100 29.99 2.00 99.96 -0.08 -2.93 29.99 2.00 -0.066 -2.324 0.049248 288.85296765
9 48 29.99 2.00 47.99 -0.02 -0.19 29.99 2.00 -0.061 -2.11 0.046807 274.41206035
38 106 29.99 2.00 106.00 -0.01 -0.25 30.00 2.00 -0.056 -1.924 0.044469 260.69465709
44 118 30.00 2.00 118.02 0.04 1.63 30.00 2.00 -0.046 -1.569 0.042314 247.79228973
99 228 30.00 2.00 228.09 0.17 17.09 30.00 2.00 -0.024 0.2962 0.041687 249.99903149
13 56 30.00 2.00 56.01 0.02 0.21 30.00 2.00 -0.02 0.2873 0.039616 237.5012213
21 72 30.00 2.00 72.01 0.03 0.57 30.00 2.00 -0.016 0.3152 0.037671 225.64220609
28 86 30.00 2.00 86.02 0.03 0.94 30.00 2.00 -0.011 0.3779 0.035844 214.40447435
20 70 30.00 2.00 70.01 0.02 0.38 30.00 2.00 -0.008 0.3786 0.034071 203.69164094
8 46 30.00 2.00 46.00 0.00 0.03 30.00 2.00 -0.007 0.344 0.032368 193.50711164
64 158 30.00 2.00 158.02 0.04 2.88 30.00 2.00 -0.001 0.5972 0.03085 184.24531624
99 228 30.00 2.00 228.02 0.04 3.69 30.00 2.00 0.0024 0.9068 0.029378 175.71519254
70 170 30.00 2.00 169.99 -0.01 -0.84 30.00 2.00 0.001 0.7317 0.027916 166.96510155
27 84 30.00 2.00 83.99 -0.02 -0.50 30.00 2.00 -0.001 0.6081 0.026538 158.62952504
17 64 30.00 2.00 63.99 -0.02 -0.33 30.00 2.00 -0.003 0.5145 0.025229 150.70345102
8 46 30.00 2.00 45.99 -0.01 -0.11 30.00 2.00 -0.004 0.4524 0.023977 143.16884522
0.04 2.99 -0.048 -4.064 0.03 161.90 12.88
-0.246 39.56338 1.247577
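adam_ann combines the two previous ideas: the adam update run with an annealed learning rate. A minimal sketch (betas 0.9 and 0.95 from the header above; the carried-in state is rounded in this export, so the output matches the table only approximately):

```python
import math

beta1, beta2 = 0.9, 0.95

def adam_step(w, grad, mom, sq_avg, lr):
    # Same update as the adam sheet, but the learning rate is passed in
    # so it can be annealed between passes.
    mom = beta1 * mom + (1 - beta1) * grad
    sq_avg = beta2 * sq_avg + (1 - beta2) * grad ** 2
    return w - lr * mom / math.sqrt(sq_avg), mom, sq_avg

# First row of the adam_ann sheet: b = 29.9909, de/db = 0.10, with the
# annealed rate 0.00390625. Near the optimum the steps are tiny, so the
# weights barely move — which is the point of annealing.
print(adam_step(29.9909, 0.10, -0.048, 0.03, 0.00390625))
```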
