Efficient Algorithms for Learning Sparse Models from Large Amounts of Data
Yoram Singer (Google Inc.)
We review the design, analysis, and implementation of several sparsity-promoting learning algorithms. We start with an efficient algorithm for projecting gradient updates onto the L1 ball. We then describe a forward-backward splitting (Fobos) method that incorporates L1 and mixed-norm regularization. We next present adaptive-gradient versions of these methods that generalize well-studied subgradient methods. We conclude with a description of a recent approach to "sparse counting" that facilitates compact yet accurate language modeling.
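The two core primitives mentioned in the abstract can be sketched compactly. Below is a minimal NumPy illustration, assuming the standard sort-and-threshold algorithm for Euclidean projection onto the L1 ball and the closed-form soft-thresholding update that Fobos yields for L1 regularization; function names and the step/regularization parameters (`eta`, `lam`) are illustrative, not from the talk.

```python
import numpy as np

def project_l1_ball(v, z=1.0):
    """Euclidean projection of v onto the L1 ball {w : ||w||_1 <= z}.

    Sorts the magnitudes once (O(d log d)), finds the largest index rho
    for which the shifted magnitude stays positive, and soft-thresholds.
    """
    if np.abs(v).sum() <= z:
        return v.copy()                     # already inside the ball
    u = np.sort(np.abs(v))[::-1]            # magnitudes, descending
    css = np.cumsum(u)
    j = np.arange(1, len(u) + 1)
    rho = np.nonzero(u - (css - z) / j > 0)[0][-1]
    theta = (css[rho] - z) / (rho + 1.0)    # threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def fobos_l1_step(w, grad, eta, lam):
    """One Fobos iteration with L1 regularization.

    A plain gradient step followed by the proximal step for lam*||w||_1,
    which reduces to soft-thresholding at level eta*lam.
    """
    v = w - eta * grad
    return np.sign(v) * np.maximum(np.abs(v) - eta * lam, 0.0)
```

Both operations zero out small coordinates exactly, which is what makes the resulting models sparse rather than merely small-weighted.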