tommy/libpostal
Commit history at dbe801fa08fd4e95d7bd1ea2db73343da5e6fcc3

7 Commits

commit 46cd725c13
Author: Al
Date:   2016-08-06 00:40:01 -04:00

    [math] Generic dense matrix implementation using BLAS calls for
    matrix-matrix multiplication if available
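This commit describes dispatching dense matrix-matrix multiplication to BLAS when it is available. A minimal sketch of that pattern in C, assuming a CBLAS interface behind a `HAVE_CBLAS` guard; the function name and guard macro are illustrative, not libpostal's actual API:

```c
#include <stddef.h>

#ifdef HAVE_CBLAS
#include <cblas.h>
#endif

/* Illustrative wrapper: C = A (m x k) * B (k x n), row-major dense.
 * Uses BLAS dgemm when available, otherwise a naive triple loop. */
static void matrix_dot_matrix(const double *A, const double *B, double *C,
                              size_t m, size_t k, size_t n) {
#ifdef HAVE_CBLAS
    cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                (int)m, (int)n, (int)k,
                1.0, A, (int)k,
                B, (int)n,
                0.0, C, (int)n);
#else
    for (size_t i = 0; i < m; i++) {
        for (size_t j = 0; j < n; j++) {
            double sum = 0.0;
            for (size_t l = 0; l < k; l++) {
                sum += A[i * k + l] * B[l * n + j];
            }
            C[i * n + j] = sum;
        }
    }
#endif
}
```

Both paths compute the same row-major product; the dgemm path simply supplies the cache-blocked, vectorized implementation when a BLAS library is linked.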
commit 7d727fc8f0
Author: Al
Date:   2016-01-17 20:59:47 -05:00

    [optimization] Using adapted learning rate in stochastic gradient
    descent (if lambda > 0)
commit 622dc354e7
Author: Al
Date:   2016-01-12 11:04:16 -05:00

    [optimization] Adding learning rate to lazy sparse update in
    stochastic gradient descent
commit 7cc201dec3
Author: Al
Date:   2016-01-11 16:40:50 -05:00

    [optimization] Moving gamma_t calculation to the header in SGD
commit b85e454a58
Author: Al
Date:   2016-01-09 03:43:53 -05:00

    [fix] var
commit 62017fd33d
Author: Al
Date:   2016-01-09 03:37:31 -05:00

    [optimization] Using sparse updates in stochastic gradient descent.
    Decomposing the updates into the gradient of the loss function (zero
    for features not observed in the current batch) and the gradient of
    the regularization term. The derivative of the regularization term
    in L2-regularized models is equivalent to an exponential decay
    function. Before computing the gradient for the current batch, we
    bring the weights up to date only for the features observed in that
    batch, and update only those values.
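This commit message outlines the core trick: the L2 gradient step w -= gamma_t * lambda * w is a multiplicative decay, so a weight untouched for several batches can be caught up with a single multiplication by the decay it skipped. A sketch of that lazy scheme in C, assuming a running product of per-step decay factors; the struct and function names are hypothetical, not libpostal's actual code:

```c
#include <stddef.h>

/* Illustrative lazy L2-regularized SGD state (not libpostal's structs).
 * Initialize with scale = 1.0 and scale_at[j] = 1.0 for all j. */
typedef struct {
    double *w;        /* weights */
    double *scale_at; /* value of `scale` when w[j] was last touched */
    double scale;     /* running product of (1 - gamma_t * lambda) */
    double lambda;    /* L2 regularization strength */
    size_t n;         /* number of features */
} sparse_sgd_t;

/* Bring one weight up to date: the decay steps it skipped since it was
 * last observed collapse into a single ratio of the running products. */
static void sgd_regularize_weight(sparse_sgd_t *s, size_t j) {
    s->w[j] *= s->scale / s->scale_at[j];
    s->scale_at[j] = s->scale;
}

/* One minibatch step: only features observed in the batch are touched. */
static void sgd_sparse_step(sparse_sgd_t *s, const size_t *feature_ids,
                            const double *loss_grad, size_t nnz,
                            double gamma_t) {
    for (size_t i = 0; i < nnz; i++) {
        size_t j = feature_ids[i];
        sgd_regularize_weight(s, j);       /* catch up on skipped decay */
        s->w[j] -= gamma_t * loss_grad[i]; /* sparse loss-gradient update */
    }
    /* Record this step's decay; untouched weights absorb it lazily. */
    s->scale *= (1.0 - gamma_t * s->lambda);
}
```

One practical caveat: the running product shrinks toward zero over many steps, so real implementations periodically fold `scale` back into the weights to avoid numerical underflow.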
commit 8b70529711
Author: Al
Date:   2016-01-08 00:54:17 -05:00

    [optimization] Stochastic gradient descent with gain schedule a la
    Leon Bottou
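The gain schedule attributed here to Leon Bottou is commonly written gamma_t = gamma_0 / (1 + gamma_0 * lambda * t) for L2 strength lambda, which is consistent with the later commits above: gamma_t computed as a small inline function in a header, and the rate adapted only when lambda > 0. A sketch under those assumptions; the choice of gamma_0 and the lambda == 0 fallback are illustrative, not libpostal's exact values:

```c
#include <math.h>

/* Bottou-style gain schedule for SGD with an L2 penalty:
 *     gamma_t = gamma_0 / (1 + gamma_0 * lambda * t)
 * Defined static inline so it can live in a header, matching the
 * "moving gamma_t calculation to the header" commit. */
static inline double sgd_gamma_t(double gamma_0, double lambda,
                                 unsigned long t) {
    if (lambda > 0.0) {
        return gamma_0 / (1.0 + gamma_0 * lambda * (double)t);
    }
    /* Assumed fallback without regularization: a 1/sqrt(t) decay. */
    return gamma_0 / sqrt((double)t + 1.0);
}
```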