HPVD using error = rating2 <> prediction2, rounds 61 through 166. Columns a-t hold the 20 user features, columns 1-8 the 8 movie features, and the last column the error. The round-by-round table is condensed here to its first and last rows:

round | a b c d e f g h i j k l m n o p q r s t | 1 2 3 4 5 6 7 8 | error
61 | 1 1.006 1 1 1.006 1 1.003 1 1 1 1.006 1 1.003 1.006 1 1 1 1.003 1.006 1.003 | 3.000 3.000 3.000 3.000 3 3.000 3.000 3.000 | 0.4
... | ... | ... | ...
166 | 1 1.322 1 1 1.325 1 1.319 1 1 1 1.325 1 1.309 1.325 1 1 1 1.314 1.325 1.319 | 3.031 3.031 3.025 3.059 3 3.045 3.031 3.031 | 0

Columns b, e, k, n and s reach about 1.32 by round 114 and stay there; g, m and r climb more slowly, reaching about 1.31 by round 163. The movie features settle near 3.03-3.06. The error column holds at 0.4 through round 112, drops to about 0.114 at round 114, and reaches 0 at round 166.

The actual (rating - prediction) error ended up at 1.126438081, having started at 1.9142857143. How close is this ending feature vector to the one we get when we learn with error = rating - prediction? The next slide shows the same number of rounds using that actual error. The ending actual error after the same number of rounds is 0.3887226594, and that ending vector is -->

166 | 0 1.422 0 0 1.420 0 1.404 0 0 0 1.419 0 1.196 1.560 0 0 0 1.141 1.557 1.155 | 3.055 3.025 3.054 3.071 2 2.979 3.029 3.003 | 0.3887226594
error = rating2 <> prediction2, continued past round 166 (VPHD, L = .001). The feature vector (users a-t, then movies 1-8):

round 61:   all user features 1.00, all movie features 1.00
round 1031: 1.46 2.61 .95 1.6 2.64 1.56 2.65 1.17 1.66 1.07 2.66 1.66 2.06 3.01 .95 1.76 1.17 2.14 2.87 2.68 | 1.86 1.71 1.76 1.91 .93 1.54 1.7 1.51
round 1032: (essentially unchanged from round 1031)
round 2000: 1.48 2.54 .90 1.6 2.60 1.55 2.68 1.15 1.73 1.08 2.67 1.73 2.06 3.08 .90 1.76 1.16 2.07 2.90 2.75 | 1.86 1.74 1.71 1.94 .87 1.61 1.7 1.50

The actual (rating - prediction) mse for this run falls from 6.68 at the start to about 0.2056 at rounds 1031-1032 and 0.1999 at round 2000.

For comparison, the two ending vectors at round 166:
error = rating - prediction:   .755 1.422 .573 .993 1.42 .99 1.404 .711 .99 .577 1.419 .992 1.2 1.56 .573 .99 .79 1.14 1.557 1.155 | 3.06 3.03 3.054 3.07 2.91 2.98 3.03 3
error = rating2 <> prediction2: 1 1.322 1 1 1.325 1 1.319 1 1 1 1.325 1 1.3 1.33 1 1 1 1.3 1.325 1.319 | 3.03 3.03 3.025 3.06 3 3.05 3.03 3

HPVD with error = rating2 <> prediction2 and L = .01: rounds 61 through 177 start at the all-1s vector and end at (1 2.61 1 1 2.77 1 2.51 1 1 1 2.77 1 2.23 2.66 1 1 1 2.35 2.66 2.51 | 1.58 1.58 1.51 1.80 1 1.71 1.6 1.57), with error 0.0571428571 and actual error 0.8165226434.

A VPHD run initialized with all user features = 5 and all movie features = 1 (L = .005) is also shown; its user features fan out (e.g., 3.48 6.27 2.28 3.87 ... at round 1032) while its movie features shrink below 1, and its mse falls from 4.177 to about 0.2039 by round 1032.

Finally, a run with error = r2 <> p2 at a fixed LRATE = .001 (rounds 60 through 142): the rounded feature vector freezes at (0 1 0 0 1 0 1 0 0 0 1 0 1 1 0 0 0 1 1 1 | 3 3 3 3 2 2 3 3) from round 62 on, while the MSE creeps from 1.9142857 (round 60) and 1.8589104 (round 61) down to only 0.4439214 by round 142. Using HPVD and error = r2 <> p2 is not fruitful!
Back to err = r - p, with a fixed-increment line search each round (using a .005 increment). Next slide: continuing this line search idea for more rounds.

Starting from the all-1s/all-3s vector (MSE 1.9142857 at LRATE .001), the rounded feature vector after one round stays near (0 1 0 1 1 1 1 0 1 0 1 1 1 1 0 1 0 1 1 1 | 3 3 3 3 2 2 3 3) for every LRATE tried (for LRATE >= .045 the sixth movie feature rounds to 3 instead of 2). The MSE as a function of the LRATE tried for this one round:

LRATE  MSE
0.001  1.8589104
0.005  1.6479359
0.010  1.4078947
0.015  1.1941850
0.020  1.0068821
0.025  0.8461148
0.030  0.7120650
0.035  0.6049681
0.040  0.5251139
0.045  0.4728471
0.050  0.4485671
0.055  0.4527298  (past the minimum)
0.0525 0.4470620

Error = .4470620 in 12 steps. A binary search could take even fewer steps, and these evaluations could be done in parallel too! (A code sketch of this per-round line search follows.) For comparison, Funk's method with a fixed L = .001 needs 82 steps; see the next slide.
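Here is a minimal sketch of that fixed-increment line search in Python with numpy. The names (ratings, update_direction, line_search_round) and the data layout (a dict mapping (user, movie) to rating) are my own assumptions for illustration, not the spreadsheet's code.

import numpy as np

def mse(uF, mF, ratings):
    # mean squared error over the known (user, movie) -> rating entries only
    return float(np.mean([(r - uF[u] * mF[m]) ** 2 for (u, m), r in ratings.items()]))

def update_direction(uF, mF, ratings):
    # per-user and per-movie step directions A and B (error times the other side's feature, summed)
    A, B = np.zeros_like(uF), np.zeros_like(mF)
    for (u, m), r in ratings.items():
        e = r - uF[u] * mF[m]
        A[u] += e * mF[m]
        B[m] += e * uF[u]
    return A, B

def line_search_round(uF, mF, ratings, start=0.001, incr=0.005, tol=1e-5):
    # try L = start, then keep adding incr while the mse keeps improving by more than tol
    A, B = update_direction(uF, mF, ratings)
    L, best_L = start, start
    best = mse(uF + L * A, mF + L * B, ratings)
    while True:
        L += incr
        m_ = mse(uF + L * A, mF + L * B, ratings)
        if m_ < best - tol:
            best, best_L = m_, L
        else:
            break
    return uF + best_L * A, mF + best_L * B, best_L, best

Because A and B are fixed within a round, all the candidate LRATEs share the same error matrix and could indeed be evaluated in parallel.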
a b c d e f g h i j k l m n o p q r s t | 1 2 3 4 5 6 7 8 | LRATE MSE

Here is the result after 1 round when using a fixed-increment line search to minimize the mse with respect to the LRATE used:
.52 1.47 .52 1 1.47 1 1.31 .68 1 .52 1.47 1 1.15 1.63 .52 1 .84 1.15 1.63 1.15 | 3.04 3.02 3.06 3.07 2.95 3.00 3.05 3.04 | .0525 0.4470620

Without the line search, using LRATE = .001, it takes 81 rounds to arrive at nearly the same mse (and a nearly identical feature vector):
.76 1.38 .61 .99 1.38 .99 1.34 .74 0.99 .61 1.38 .99 1.16 1.50 .61 .99 .82 1.12 1.50 1.13 | 3.04 3.01 3.04 3.06 2.92 2.98 3.02 3 | .001 0.44721854

Going from the round-1 result (LRATE = .0525) shown above, we do a second round and again do a fixed-increment line search:
.92 1.48 .50 .99 1.47 .98 1.46 .66 .98 0.50 1.47 .98 1.22 1.63 .50 0.98 .75 1.15 1.63 1.17 | 3.07 3.03 3.07 3.09 2.92 2.99 3.04 3.03 | .050 0.387166
.84 1.47 .50 .99 1.47 .99 1.43 .66 .99 0.50 1.47 .99 1.21 1.63 .50 0.98 .77 1.15 1.63 1.17 | 3.06 3.03 3.07 3.08 2.93 3.00 3.04 3.03 | .040 0.371007
.76 1.47 .51 .99 1.47 .99 1.40 .67 .99 0.51 1.47 .99 1.19 1.63 .51 0.99 .79 1.15 1.63 1.16 | 3.06 3.03 3.07 3.08 2.93 3.00 3.04 3.03 | .030 0.368960
.76 1.47 .51 .99 1.47 .99 1.40 .67 .99 0.51 1.47 .99 1.19 1.63 .51 0.99 .79 1.15 1.63 1.16 | 3.06 3.03 3.07 3.08 2.93 3.00 3.04 3.03 | .020 0.380975
Note that we come up with an approximately minimized mse at LRATE = .030.

Going from this line-search result (LRATE = .03), we do another round:
.75 1.47 .50 .99 1.47 .98 1.44 .66 .99 0.51 1.47 .99 1.21 1.63 .50 0.98 .76 1.15 1.63 1.17 | 3.06 3.03 3.07 3.08 2.92 2.99 3.04 3.03 | .020 0.351217
.75 1.47 .50 .99 1.47 .99 1.42 .66 .99 0.51 1.47 .99 1.20 1.63 .50 0.99 .77 1.15 1.63 1.17 | 3.06 3.03 3.07 3.08 2.92 3.00 3.04 3.03 | .010 0.362428
Going from that line-search result (LRATE = .02), we do the same for the next round:
.74 1.47 .50 .99 1.47 .98 1.46 .66 .98 0.50 1.47 .98 1.22 1.63 .50 0.98 .75 1.15 1.63 1.17 | 3.07 3.04 3.07 3.09 2.91 2.99 3.04 3.02 | .010 0.351899

At this point we might conclude that LRATE = .02 is a stable, near-optimal learning rate and just use it for the duration (no further line search). After 200 rounds at LRATE = .02 we arrive at the following (note that it took ~2000 rounds without line search versus ~219 with line search):
.83 1.39 .48 .91 1.40 .86 1.52 .61 .98 0.60 1.51 .98 1.15 1.74 .48 0.99 .64 1.10 1.62 1.45 | 3.28 3.28 3.04 3.48 1.69 2.98 3.12 2.65 | .020 0.199358

Comparing this resulting feature vector to the one we got from ~2000 rounds at LRATE = .001 (without line search), we see that we arrive at a very different feature vector:

a b c d e f g h i j k l m n o p q r s t | 1 2 3 4 5 6 7 8 | LRATE
1.48 2.54 .90 1.6 2.6 1.55 2.68 1.15 1.73 1.08 2.67 1.73 2.06 3.08 .90 1.76 1.16 2.07 2.90 2.75 | 1.86 1.74 1.71 1.94 .87 1.61 1.7 1.5 | .001, no line search
.83 1.39 .48 .91 1.40 .86 1.52 .61 .98 .60 1.51 .98 1.15 1.74 .48 0.99 .64 1.10 1.62 1.45 | 3.28 3.28 3.04 3.48 1.69 2.98 3.12 2.65 | .020, with line search

However, an interesting observation is that the UserFeatureVector portions differ by a constant multiplier, and the MovieFeatureVector portions differ by a different constant.
If we divide the LR=.001 vector by the LR=.020 vector, we get the following multiplier vector (one is not a dilation of the other, but if we split the user portion from the movie portion, they are!!! What does that mean!?!?!):

1.77 1.81 1.85 1.75 1.84 1.79 1.76 1.86 1.75 1.78 1.76 1.75 1.79 1.76 1.85 1.76 1.81 1.86 1.78 1.89 | .56 .53 .56 .55 .51 .54 .54 .56   ".001/.020"
user portion: 1.80 avg, 0.04 std;  movie portion: 0.54 avg, 0.01 std

Another interesting observation is that 1 / 1.8 = .55, that is, 1 / AVGufv = AVGmfv. They are reciprocals of one another!!! This makes some sense: if you double the ufv you have to halve the mfv to get the same predictions. The bottom line is that the predictions are the same! What is the nature of the set of vectors that [nearly] minimize the mse? It is not a subspace (not closed under scalar multiplication), but it is clearly closed under "reciprocal scalar multiplication" (multiplying the mfv's by the reciprocal of the ufv's multiplier). What else can we say about it? So, we get an order-of-magnitude speedup by doing line search. In fact it may be more than that, since we may be able to do all the LRATE calculations in parallel (without recalculating the error matrix or feature vectors????). Or there may be a better search mechanism than fixed-increment search. A binary-type search? Others?
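A quick numeric check of that reciprocal-scaling observation, in Python with numpy. It assumes the single-feature prediction p(u,m) = uf[u]*mf[m]; the vector values below are illustrative stand-ins, not the ones on the slide.

import numpy as np

uf = np.array([0.83, 1.39, 0.48, 0.91])   # some user feature values (illustrative)
mf = np.array([3.28, 3.28, 3.04, 3.48])   # some movie feature values (illustrative)

P = np.outer(uf, mf)                  # predictions p(u,m) = uf[u]*mf[m]
c = 1.8                               # scale the users by c and the movies by 1/c
P_scaled = np.outer(c * uf, mf / c)

print(np.allclose(P, P_scaled))       # True: predictions, and hence the mse, are unchanged

So the near-minimizers found at LRATE = .001 and LRATE = .020 can make the same predictions even though the raw vectors look very different.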
Here we pause for a minute to see if there is a "calculus-based" closed-form formula that will immediately give us the LRATE producing the minimum mse. If so, that would clearly save time! Below, as you can see, I failed to produce a closed-form formula. There may be ways to cheat and get close enough? Go for it!

Let e(u,m) = r(u,m) - f(u)f(m), where f is the feature vector (its dimensions are all users a-t followed by all movies 1-8), and

  mse = (1/rc) SUM over all m, u in sup(m) of ( r(u,m) - f(u)f(m) )^2.

The round's updates are f(u) += L*A with A = SUM over n in sup(u) of e(u,n)f(n), and f(m) += L*B with B = SUM over v in sup(m) of e(v,m)f(v). Then

  mse(L) = (1/rc) SUM { r(u,m) - (f(u)+LA)(f(m)+LB) }^2
         = (1/rc) SUM { e(u,m) - (f(u)B + f(m)A)L - AB L^2 }^2.

Writing C = f(u)B + f(m)A for each training pair,

  d(mse)/dL = (2/rc) SUM { e(u,m) - CL - AB L^2 } { -C - 2AB L } = 0

(looks hopeless to find a nice formula for L; the squared and cross terms do not reduce to anything tidy). So the line search will have to be a simple (maybe binary) search for the heuristically optimal L. On the next slide I show some success with a fixed-increment line search (increment = .005), always starting at .001 (the LRATE that Funk uses throughout).
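One possible "cheat", an observation of mine rather than something on the slide: for fixed step directions A and B, the mse above is just a quartic polynomial in L, so the best L for a round solves a cubic whose coefficients are four sums over the training pairs (one pass), and that cubic can be handled in closed form or with any 1-D root finder. In LaTeX:

\mathrm{mse}(L)=\frac{1}{rc}\sum_{m,\;u\in sup(m)}\bigl(e_{u,m}-C_{u,m}L-A_uB_mL^2\bigr)^2
              =c_0+c_1L+c_2L^2+c_3L^3+c_4L^4,\qquad C_{u,m}=f(u)B_m+f(m)A_u,

c_0=\tfrac{1}{rc}\sum e_{u,m}^2,\quad
c_1=-\tfrac{2}{rc}\sum e_{u,m}C_{u,m},\quad
c_2=\tfrac{1}{rc}\sum\bigl(C_{u,m}^2-2e_{u,m}A_uB_m\bigr),\quad
c_3=\tfrac{2}{rc}\sum C_{u,m}A_uB_m,\quad
c_4=\tfrac{1}{rc}\sum A_u^2B_m^2,

\frac{d\,\mathrm{mse}}{dL}=c_1+2c_2L+3c_3L^2+4c_4L^3=0 .

Evaluating mse at the (at most three) real roots of that cubic would replace the fixed-increment stepping entirely.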
A spreadsheet implementation of the per-round line search (cell references are to the worksheet). Rows 2-9 hold the training ratings for movies 1-8 (the blank-cell positions did not survive this text rendering):
movie 1: 3 3 5 2 5 3 3   movie 2: 2 5 1 2 3 5 3   movie 3: 3 3 3 5 5 2   movie 4: 5 3 4 3
movie 5: 2 1 2 1         movie 6: 4 1 1 4 3       movie 7: 4 3 2 5 3     movie 8: 1 4 5 3 2

Row 10 is the current feature vector, fv (users a-t, movies 1-8, then LRATE and omse):
0 1 0 0 1 0 1 0 1 0 1 1 1 1 0 1 0 1 1 1 | 3 3 2 3 1 3 3 2 | 0.001 0.1952090

The macro \a (at Z2):
/rvnfv~fv~{goto}L~{edit}+.005~/XImse<omse-.00001~/xg\a~
.001~{goto}se~/rvfv~{end}{down}{down}~
/xg\a~

Representative cell formulas:
A22: +A2-A$10*$U2 /* error for u=a, m=1 */
A30: +A10+$L*(A$22*$U$2+A$24*$U$4+A$26*$U$6+A$29*$U$9) /* updates f(u=a) */
U29: +U9+$L*(($A29*$A$30+$K29*$K$30+$N29*$N$30+$P29*$P$30)/4) /* updates f(m=8) */
AB30: +U29 /* copies the f(m=8) update into the new feature vector, nfv */
W22: @COUNT(A22..T22) /* counts the number of actual ratings (users) for m=1 */
X22: [W3] @SUM(W22..W29) /* adds the rating counts for all 8 movies = training count (35) */
AD30: [W9] @SUM(SE)/X22 /* averages the se's, giving the mse */
A52: +A22^2 /* squares each individual error */

Rows 22-30 hold the working error cells and the new feature vector (nfv), with the per-movie rating counts in column W; rows 52-59 hold the squared errors (SE). Each round, the line search writes the resulting fv, the L it settled on, and the mse to the output list (rows 61 on):
61: 0 1 0 0 1 0 1 0 0 0 1 0 1 1 0 0 0 1 1 1 | 3 3 3 3 2 2 3 2 | 0.125 0.225073
62: 0 1 0 0 1 0 1 0 0 0 1 0 1 1 0 1 0 1 1 1 | 3 3 3 3 1 2 3 2 | 0.141 0.200424
63: 0 1 0 0 1 0 1 0 0 0 1 0 1 1 0 1 0 1 1 1 | 3 3 3 3 1 3 3 2 | 0.151 0.197564
64: 0 1 0 0 1 0 1 0 1 0 1 1 1 1 0 1 0 1 1 1 | 3 3 2 3 1 3 3 2 | 0.151 0.196165
65: (same vector) | 0.151 0.195222
66: (same vector) | 0.001 0.195232
...
72: (same vector) | 0.001 0.195211
{goto}se~/rvfv~{end}{down}{down}~ "value copy" fv to the output list

Notes: In 2 rounds the mse is as low as Funk gets it in 2000 rounds. After 5 rounds the mse is lower than ever before (and appears to be bottoming out). I know I shouldn't hardcode parameters! Experiments should be done to optimize this line search (e.g., with some binary search for a low mse). Since we have the resulting individual square errors for each training pair, we could run this, then mask the pairs with se(u,m) > Threshold (masking out those that have already achieved a low se) and do it again on just those. But what do I do with the two resulting feature vectors? Do I treat it like a two-feature SVD, or do I use some linear combination of the resulting predictions of the two (or of more than two)? We need to test out which works best (or other modifications) on Netflix data.
Maybe on those test pairs for which the training row and column have some high errors, we apply the second feature vector instead of the first? Maybe we invoke CkNN for test pairs in this case (or use all 3 and a linear combo?). This is powerful! We need to optimize the calculations using pTrees!!!

The macro, line by line:
/rvnfv~fv~ value-copies nfv onto fv (the new feature vector becomes the current one)
{goto}L~{edit}+.005~ increments L by .005
/XImse<omse-.00001~/xg\a~ IF the mse is still decreasing, loop back to \a and recalculate with the new L
.001~ otherwise reset L = .001 for the next round
{goto}se~/rvfv~{end}{down}{down}~ value-copies fv to the bottom of the output list
/xg\a~ start over with the next round

A small Python rendering of the round computation these cells perform follows.
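Below is a minimal Python/numpy sketch of what the cell formulas above compute for the 8-movie x 20-user toy matrix. The names (R, fu, fm, one_round) are mine, and since the blank-cell positions did not survive the text rendering, the example fills R with placeholder random ratings rather than the slide's actual layout. It mirrors the sheet's choices: the error cells use the current fv, the movie update uses the already-updated user features and divides by the movie's rating count, and the mse averages the squared errors over the nonblank cells. The macro then repeats this with a growing L while the mse keeps improving.

import numpy as np

# 8 movies (rows) x 20 users (columns); np.nan marks a blank cell.
# Placeholder random ratings stand in for the slide's actual training matrix.
rng = np.random.default_rng(0)
R = np.full((8, 20), np.nan)
for m in range(8):
    for u in rng.choice(20, size=5, replace=False):
        R[m, u] = rng.integers(1, 6)
known = ~np.isnan(R)

fu = np.ones(20)        # user features a..t (row 10 of the sheet)
fm = 3 * np.ones(8)     # movie features 1..8 (row 10 of the sheet)

def one_round(fu, fm, L):
    # error cells (A22 style): e(m,u) = r(m,u) - f(u)*f(m), nonblank cells only
    E = np.where(known, R - np.outer(fm, fu), 0.0)
    # user update (A30 style): f(u) += L * sum over the movies u rated of e(m,u)*f(m)
    fu_new = fu + L * (E * fm[:, None]).sum(axis=0)
    # movie update (U29 style): uses the *new* user features and divides by the rating count
    fm_new = fm + L * (E * fu_new[None, :]).sum(axis=1) / known.sum(axis=1)
    # mse (A52/AD30 style): average of the squared errors over the nonblank cells
    return fu_new, fm_new, (E[known] ** 2).mean()

fu, fm, mse = one_round(fu, fm, L=0.001)
print(round(mse, 6))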
(Same worksheet as the previous slide, but now showing the squared errors to 2 decimal places.) The SE block, rows 52-59, columns A-N:

0.04 0 0 0.00 0 0 0.00 0 0 0.00 0.00 0 0 0
0 0 0.21 0 0.14 0 0 1.04 0 0 0 0 0 0
0.18 0 0 0 0 0 0 0 0.00 0 0 0.00 0 0.10
0 0.00 0 0 0 0.00 0 0 0 0 0 0 0.00 0
0.38 0 0 0 0 0 0 0 0 0.00 0 0 0 0
0 0.00 0.23 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0.15 0 0 1.16 0 0 0 0 0 0
1.52 0 0 0 0 0 0 0 0 0 0.00 0 0 0.13

The O-T columns hold a few more small nonzero SEs (0.08, 0.21, 0.30, 0.03, 0.13, 0.23, 0.33, 0.12). If we mask off all SEs > 1.5 times our mse = .195211 (i.e., > 0.292817), then do it again on the next slide.
(Same worksheet again, now trained on only the 4 masked-in ratings.) The per-round line search settles near LRATE .12 every round, and the masked mse falls steadily over rounds 61-72:

round 61: LRATE 0.120, mse 0.1180635
round 62: LRATE 0.126, mse 0.1014195
...
round 72: LRATE 0.116, mse 0.0390410

Notice: for these 4 ratings, the mse was driven down from about 1 to .039. That's got to be good!!! A better method? Take each training rating as a separate mask and combine the resulting predictions in a weighted average?
Take each training rating as a separate mask and combine the resulting [3] predictions in a weighted average? (One way to do the combining is sketched after these results.) Five single-rating masks are run here; the blocks are labeled rating=5, rating=4, rating=2, rating=2 and rating=1. In each case the per-round line search drives that mask's mse down quickly; reading off the LRATE/mse columns of the five blocks:

one block falls from mse 0.6937 to 0.0991 in three rounds (LRATEs 0.135, 0.181, 0.196);
one falls from 0.5522 to 0.0169 over eleven rounds (LRATEs around 0.12);
one falls from 0.4479 to 0.1318 over nine rounds (LRATEs around 0.14-0.15);
one falls from 0.2998 to 0.1755 over twelve rounds (LRATEs drifting from 0.12 down to 0.10);
one falls from 0.0771 to 0.0717 over sixteen rounds (the line search reverts to LRATE .001 after round 65).
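Here is a sketch of one way to do the weighted combining asked about above; it is my guess at the idea, not something specified on the slide. Each mask's run yields a feature-vector pair and a final training mse; the per-mask prediction matrices are averaged with weights inversely proportional to those mses. Python/numpy, with hypothetical inputs:

import numpy as np

def combine_masked_models(models):
    # models: list of (uF, mF, final_mse) from the per-mask runs
    weights = np.array([1.0 / max(m, 1e-9) for _, _, m in models])
    weights /= weights.sum()                         # normalize the weights to sum to 1
    return sum(w * np.outer(uF, mF) for w, (uF, mF, _) in zip(weights, models))

# hypothetical example: three masks' results for 20 users and 8 movies
rng = np.random.default_rng(1)
models = [(rng.uniform(0.5, 1.5, 20), rng.uniform(2.5, 3.5, 8), m)
          for m in (0.099, 0.017, 0.072)]
P = combine_masked_models(models)                    # P[u, m] = combined predicted rating
print(P.shape)                                       # (20, 8)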
SVD: follow gradients to minimize mse over the TrainSet.

Taxonomy of classifications, starting with the number of entities in the Training Set:
1-Entity TrainingSet (e.g., IRISs, Concrete, Wines, Seeds, ...): use FAUST or CkNN or ?
2-Entity TrainingSet (e.g., Netflix Cinematch (users, movies), MBR (users, items), Text Mining (docs, terms))
3-Entity TrainingSet (e.g., Document Recommenders (users, docs, terms))

Recommender Taxonomy. Two-Entity Recommenders (2Es), e.g., Users and Items:
User-ID-Yes (UY): users give their identity at check-out (e.g., Sam's, Amazon? ...)
User-ID-No (UN): users do not give their identity at check-out (e.g., Sunmart ...)
User-ID-Maybe (UM): users may give their identity (e.g., card carriers ...)
Quantity-Yes (QY): matrix cells contain the quantity of that item bought by that user
Quantity-No (QN): matrix cells do not contain the quantity of the item
Ratings-Yes (RY): matrix cells always contain a qualitative rating
Ratings-No (RN): matrix cells never contain a qualitative rating (0 blanks; in RN, buy means rating=1, don't buy means rating=0)
Ratings-Maybe (RM): matrix cells may contain a rating of that item by that user
Netflix (2E, 5-star, 85% blanks) is UY, QN, RM. pTreeSVD works best for RY. RN?

Three-Entity Recommenders (3Es), e.g., Document Recommenders (Users, Documents, Terms). Matrices: Document-Term (tfIDFs?), User-Document (a 2E with Doc = Item), User-Term (the user's liking of each term). These form a cyclic 3-hop rolodex (DT, DU, UT); small example DT, DU and UT matrices are shown on the slide. For a Document Recommender there are no blanks!

Let uc = user_count, ic = item_count, fc = feature_count, bp = blank_%, with uc = 500K, ic = 17K, fc = 40, bp = 85%. HorizSVD converts the 100M non-blanks into 2 matrices of 500K x 40 = 20M and 17K x 40 = 700K entries (total = 21M). In RM, ignore the blanks!

Offsets ARE the way of pTrees. No alternative! Level-1 predicate (>50% 1s) pTrees may work. pTree-mask the non-blanks, or eliminate the blanks and re-pTree-ize?

SVD loop:
1. Calc the SqErrs for each nonblank trainer: e(m,u) = r(m,u) - p(m,u) = r(m,u) - UF(u) o MF(m).
2. Update the feature vectors (nonblanks only): UF(u) += Lrate * SUM over n in sup(u) of e(n,u)*MF(n); MF(m) += Lrate * SUM over v in sup(m) of e(m,v)*UF(v). Re-create the pTrees (eliminating the blanks).

The nonblank trainers are kept as (m, u, r) triples with working columns, e.g.:
m u | r r2 p2 e | pi MFi UFi | po MFo UFo
1 a | 3 0 0 1 | 1 3 0.05 | 1.2 0.9 1.4
1 d | 3 0 0 1 | 0 3 0.05 | 0.8 0.9 0.9
1 g | 5 1 0 0 | 0 3 0.05 | 0 0 0.0
2 e | 5 1 0 1 | 0 3 0.05 | 1.3 0.9 1.5
3 n | 5 1 0 1 | 0 3 0.05 | 0.6 0.7 0.8
...
4 f | 3 0 0 1 | 0 3 0.05 | 0.2 0.4 0.5
Comparison of pTree calculation (HPVD) versus VPHD on the toy ratings matrix, with L = .01 and B = 0.

HPVD: using pTree calculations of the errors (~900 iterations). The rounded feature vector moves from (0 1 0 0 1 0 1 0 0 0 1 0 1 1 0 0 0 1 1 0 | 3 2 3 3 2 2 3 3) at round 62 to (0 1 0 0 1 0 1 0 0 0 1 0 1 1 0 0 0 1 1 1 | 3 2 3 3 1 2 3 2) by round 604, while the square error falls 1.3075 (round 61), 0.9775 (62), 0.7551 (63), ... 0.2574 (80), and levels off near 0.2090 over rounds 604-621.

VPHD: using horizontal-data calculations of the errors (~900 iterations). The rounded vector moves from (0 1 0 0 1 0 1 0 0 0 1 0 1 1 0 0 0 1 1 1 | 3 3 3 3 2 2 3 3) at round 62 to (0 1 0 0 1 0 1 0 0 0 1 0 1 1 0 1 0 1 1 1 | 3 3 3 3 1 3 3 2) by round 604, while the square error falls 1.4079 (round 61), 1.0724 (62), 0.8471 (63), ... 0.3280 (80), and levels off near 0.1977 over rounds 604-621.

There should be no difference. The difference here is an artifact of the way I set the two programs up (additional steps in HPVD, with more approximations).
Simon Funk: Netflix provided a database of 100M ratings (1 to 5) of 17K movies by 500K users, each given as a triplet of numbers: (User, Movie, Rating). The challenge: for (User, Movie, ?) not in the database, predict how the given User would rate the given Movie.

Think of the data as a big, sparsely filled matrix, with userIDs across the top and movieIDs down the side (or vice versa, then transpose everything). Each cell contains an observed rating (1-5) for that movie (row) by that user (column), or is blank, meaning you don't know. This matrix would have 8.5B entries, but you are only given values for 1/85th of those 8.5B cells (100M of them). The rest are all blank. Netflix posed a "quiz": a bunch of question marks plopped into previously blank slots, and your job is to fill in best-guess ratings in their place. Squared error (se) measures accuracy (you guess 1.5, the actual is 2, you get docked (2-1.5)^2 = 0.25). They use root mean squared error (rmse), but if we minimize mse, we minimize rmse. Each rating and question mark also carries a date (so a cell can potentially hold more than one rating).

Any movie can be described in terms of some features (or aspects) such as quality, action, comedy, stars (e.g., Pitt), producer, etc. A user's preferences can be described in terms of how they rate the same features (quality/action/comedy/star/producer/etc.). Then ratings ought to be explainable by a lot less than 8.5 billion numbers (e.g., a single number specifying how much action a particular movie has may help explain why a few million action-buffs like that movie).

SVD: Assume 40 features. A movie, m, is described by mF[40] = how much that movie exemplifies each aspect. A user, u, is described by uF[40] = how much he likes each aspect. The prediction, its error, and the mse are
P_{u,m} = uF \circ mF = \sum_{k=1}^{40} uF_k \, mF_k, \qquad err_{u,m} = r_{u,m} - P_{u,m}
mse = \frac{1}{8.5\mathrm{B}} \sum_{m=1}^{17\mathrm{K}} \sum_{u=1}^{500\mathrm{K}} (err_{u,m})^2
\frac{\partial\, mse}{\partial uF_h} = -\frac{2}{8.5\mathrm{B}} \sum_{m,u} err_{u,m} \, mF_h, \qquad \frac{\partial\, mse}{\partial mF_h} = -\frac{2}{8.5\mathrm{B}} \sum_{m,u} err_{u,m} \, uF_h
In matrix form, P (17K movies x 500K users) = M \circ U^T, where M is 17K x 40 (row m holds mF for movie m) and U^T is 40 x 500K (column u holds uF for user u), so that P_{u,m} = \sum_{k=1}^{40} uF_k \, mF_k.

So, for each training sample we move against the gradient: increment uF_h += 2 err_{u,m} mF_h and mF_h += 2 err_{u,m} uF_h. This is a big move and may overshoot the minimum, so the 2 is replaced by a smaller learning rate, lrate (e.g., Funk takes lrate = 0.001).

SVD is a trick which finds U^T and M that minimize mse(k), one k at a time. So the rank-40 SVD of the 8.5B training matrix is the best (least-error) approximation we can get within the limits of our user-movie-rating model; i.e., the SVD has found the "best" feature generalizations. To get the SVD matrices we take the gradient of mse(k) and follow it. This has a bonus: we can ignore the unknown error on the 8.4B empty slots. Take the gradient of mse(k) over just the given values, not the empties, one k at a time.

With horizontal data, the code is evaluated for each rating. So, to train for one sample:
real *userValue = userFeature[featureBeingTrained];
real *movieValue = movieFeature[featureBeingTrained];
real lrate = 0.001;
userValue[user] += lrate * err * movieValue[movie];
movieValue[movie] += lrate * err * userValue[user];
More correctly, the second update should use the user value from before the first update:
real uv = userValue[user];
userValue[user] += lrate * err * movieValue[movie];
movieValue[movie] += lrate * err * uv;
Training a feature this way finds the most prominent feature remaining (the one that most reduces the error). When it's good, shift it onto the done features and start a new one (caching the residuals of the 100M ratings). "What does that mean for us???" This gradient descent has no local minima, which means it doesn't really matter how it's initialized.
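Putting the pieces together, here is a minimal, self-contained C sketch of this per-rating training loop. The array layout, toy sizes, made-up ratings, and constant initialization are illustrative assumptions, not Funk's actual code (which streams over the 100M ratings, caches residuals of already-trained features, and adds the baseline prediction discussed below):

#include <stdio.h>

typedef double real;

#define NUM_USERS     4        /* toy sizes; the real data has 500K users, 17K movies */
#define NUM_MOVIES    3
#define NUM_FEATURES 40

static real userFeature[NUM_FEATURES][NUM_USERS];
static real movieFeature[NUM_FEATURES][NUM_MOVIES];

typedef struct { int user; int movie; real rating; } Rating;   /* one (User,Movie,Rating) triplet */

static real predictRating(int movie, int user)
{
    real sum = 0;
    for (int f = 0; f < NUM_FEATURES; f++)
        sum += userFeature[f][user] * movieFeature[f][movie];
    return sum;
}

/* Train one feature on one sample: step both factors along the negative
   gradient of the squared error for this single rating. */
static void trainOne(int f, int user, int movie, real rating, real lrate)
{
    real err = rating - predictRating(movie, user);   /* actual minus predicted */
    real uv  = userFeature[f][user];                   /* cache the pre-update user value */
    userFeature[f][user]   += lrate * err * movieFeature[f][movie];
    movieFeature[f][movie] += lrate * err * uv;
}

int main(void)
{
    Rating train[] = { {0,0,5}, {0,1,3}, {1,0,4}, {2,2,1}, {3,1,2} };   /* made-up ratings */
    int n = sizeof train / sizeof train[0];
    real lrate = 0.001;

    /* Initialization barely matters (no local minima); use a small constant. */
    for (int f = 0; f < NUM_FEATURES; f++) {
        for (int u = 0; u < NUM_USERS;  u++) userFeature[f][u]  = 0.1;
        for (int m = 0; m < NUM_MOVIES; m++) movieFeature[f][m] = 0.1;
    }

    /* Train features one at a time, many passes over the ratings for each. */
    for (int f = 0; f < NUM_FEATURES; f++)
        for (int epoch = 0; epoch < 100; epoch++)
            for (int i = 0; i < n; i++)
                trainOne(f, train[i].user, train[i].movie, train[i].rating, lrate);

    printf("prediction for (user 0, movie 2): %f\n", predictRating(2, 0));
    return 0;
}

For simplicity this sketch sums all 40 factors in every prediction; Funk's setup instead predicts from the already-trained features plus the feature currently being trained.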
In per-feature vector notation, the same update (including the regularization constant, written \lambda here and K below) is
u_a += lrate \, (\epsilon_{u,i} \, i_a - \lambda \, u_a), \qquad where \epsilon_{u,i} = r_{u,i} - p_{u,i} and r_{u,i} = the actual rating.
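Read component-wise, this touches every feature of the user and item vectors for a single observed rating. A hedged C sketch under that reading (the function name and the idea of passing lrate and lambda as parameters are mine):

typedef double real;

/* One regularized update in the vector form above: adjust all k components of
   the user vector u and the item vector i for a single observed rating. */
void updateVectors(real *u, real *i, int k, real rating, real lrate, real lambda)
{
    real pred = 0;                      /* predicted rating: dot product of u and i */
    for (int a = 0; a < k; a++)
        pred += u[a] * i[a];

    real err = rating - pred;           /* epsilon(u,i): actual minus predicted */

    for (int a = 0; a < k; a++) {
        real ua = u[a];                                /* keep the old u_a for the i update */
        u[a] += lrate * (err * i[a] - lambda * ua);
        i[a] += lrate * (err * ua   - lambda * i[a]);
    }
}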
Refinements: Prior to starting SVD, compute AvgRating(movie) for every movie and AvgOffset(UserRating, MovieAvgRating) for every user. I.e.:
static inline real predictRating_Baseline(int movie, int user) {return averageRating[movie] + averageOffset[user];}
That is what predictRating returns before the 1st SVD feature starts training.

You'd think a movie's average rating is just ... its average rating! But if m appears only once, with r(m,u)=1 say, is AvgRating(m)=1? Probably not! View r(m,u)=1 as a draw from a true probability distribution whose average you want, and view that true average as itself a draw from a probability distribution of averages (the histogram of average movie ratings). Assume both distributions are Gaussian; then the best-guess mean is a linear combination of the observed mean and the apriori mean, with a blending ratio equal to the ratio of their variances. If Ra and Va are the mean and variance (squared standard deviation) of all of the movies' average ratings (which defines your prior expectation for a new movie's average rating before you've observed any actual ratings), and Vb is the average variance of individual movie ratings (which tells you how indicative each new observation is of the true mean; e.g., if the average variance is low, then ratings tend to be near the movie's true mean, whereas if the average variance is high, ratings tend to be more random and less indicative), then:
BogusMean = sum(ObservedRatings) / count(ObservedRatings)
K = Vb/Va
BetterMean = [GlobalAverage*K + sum(ObservedRatings)] / [K + count(ObservedRatings)]
(see the code sketch below). The point here is simply that any time you're averaging a small number of examples, the true average is most likely nearer the apriori average than the sparsely observed average. Note that if the number of observed ratings for a particular movie is zero, the BetterMean (best guess) above defaults to the global average movie rating, as one would expect.

Moving on: 20M free parameters is a lot for a 100M training set. It seems neat to just ignore all the blanks, but we have expectations about them. As-is, this modified SVD algorithm tends to make a mess of sparsely observed movies or users. Say a user has rated only 1 movie, American Beauty = 2 while its average is 4.5, and further their offset is only -1; then prior to SVD we'd expect them to rate it 3.5. So the error handed to the SVD is -1.5 (the true rating is 1.5 less than we expect). Suppose m(Action) is training up to measure the amount of Action, say 0.01 for American Beauty (just slightly more than average). The SVD optimizes predictions, which it can do by eventually setting our user's preference for Action to a huge -150. I.e., the algorithm naively looks at the only example it has of this user's preferences and, in the context of only the one feature it knows about so far (Action), determines that our user so hates action movies that even the tiniest bit of action in American Beauty makes it suck a lot more than it otherwise might. This is not a problem for users we have lots of observations for, because those random apparent correlations average out and the true trends dominate. We need to account for priors. As with the average movie ratings, blend our sparse observations in with some sort of prior, but it's a little less clear how to do that with this incremental algorithm.
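The BetterMean blend above is easy to compute directly. A hedged C sketch (the function and argument names are mine; GlobalAverage, Va, and Vb would be measured from the training set):

typedef double real;

/* Blend a movie's observed ratings with the global average, weighted by
   K = Vb/Va as described above.  With count == 0 this returns the global
   average, as expected. */
real betterMean(const real *observedRatings, int count,
                real globalAverage, real Va, real Vb)
{
    real K = Vb / Va;
    real sum = 0;
    for (int i = 0; i < count; i++)
        sum += observedRatings[i];
    return (globalAverage * K + sum) / (K + count);
}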
But if you look at where the incremental algorithm theoretically converges, you get:
userValue[user] = [sum residual[user,movie]*movieValue[movie]] / [sum (movieValue[movie]^2)]
The numerator there will fall in a roughly zero-mean Gaussian distribution when charted over all users, which through various gyrations leads to:
userValue[user] = [sum residual[user,movie]*movieValue[movie]] / [sum (movieValue[movie]^2 + K)]
And finally back to the incremental form:
userValue[user] += lrate * (err * movieValue[movie] - K * userValue[user]);
movieValue[movie] += lrate * (err * userValue[user] - K * movieValue[movie]);
This is equivalent to penalizing the magnitude of the features, which cuts over-fitting and allows the use of more features.

Moving on: Add non-linear outputs, so that instead of predicting with:
sum (userFeature[f][user] * movieFeature[f][movie]) for f from 1 to 40
we can use:
sum G(userFeature[f][user] * movieFeature[f][movie]) for f from 1 to 40
1. Clip the prediction to 1-5 after each component is added.
2. Introduce some functional non-linearity such as a sigmoid, i.e., G(x) = sigmoid(x).
(Both options are sketched in code below.)

Moving on: Despite the regularization term in the final incremental law above, over-fitting remains a problem. Plotting the progress over time, the probe rmse eventually turns upward and starts getting worse (even though the training error is still inching down). We found that simply choosing a fixed number of training epochs appropriate to the learning rate and regularization constant gives the best overall performance. The plots showed the probe and training rmse for the first few features with and without the regularization term ("decay") enabled, and then just the probe-set rmse further along, where you can see the regularized version pulling ahead.
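Going back to the two non-linear output options above, here is a hedged C sketch of both (the function names and the plain, unscaled sigmoid are illustrative choices, not from the source):

#include <math.h>

typedef double real;

static real clip(real x)            /* keep the running prediction in the 1-5 range */
{
    if (x < 1.0) return 1.0;
    if (x > 5.0) return 5.0;
    return x;
}

static real G(real x)               /* one possible non-linearity: a plain sigmoid */
{
    return 1.0 / (1.0 + exp(-x));
}

/* Option 1: clip the prediction to 1-5 after each component is added. */
real predictClipped(const real *uF, const real *mF, int k)
{
    real sum = 0;
    for (int f = 0; f < k; f++)
        sum = clip(sum + uF[f] * mF[f]);
    return sum;
}

/* Option 2: pass each component through G before summing. */
real predictWithG(const real *uF, const real *mF, int k)
{
    real sum = 0;
    for (int f = 0; f < k; f++)
        sum += G(uF[f] * mF[f]);
    return sum;
}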