[mlpack] Project about Low rank/sparse optimization using Frank-Wolfe

Shangtong Zhang zhangshangtong.cpp at gmail.com
Thu Mar 16 11:02:19 EDT 2017


Haha, maybe we don’t know each other; we’re from three different departments (CS, ECE, and Math).
But I think it’s a good chance to get to know each other.

Shangtong Zhang,
First year graduate student,
Department of Computing Science,
University of Alberta
Github <https://github.com/ShangtongZhang> | Stackoverflow <http://stackoverflow.com/users/3650053/slardar-zhang>
> On Mar 16, 2017, at 08:11, Ryan Curtin <ryan at ratml.org> wrote:
> 
> On Wed, Mar 15, 2017 at 05:33:15AM -0600, Chenzhe Diao wrote:
>> Hello everyone,
>> 
>> My name is Chenzhe. I am a 4th year Ph.D. student in Applied Mathematics
>> from University of Alberta in Canada. Part of my research is about image
>> recovery using over-complete systems (wavelet frames), which involves some
>> machine learning techniques, and uses sparse optimization techniques as one
>> of the key steps. So I am quite interested in the project about "Low
>> rank/sparse optimization using Frank-Wolfe".
>> 
>> I checked the mailing list archives from last year. It seems that one
>> student from GSoC 2016 was interested in a similar project. Was it left
>> unfinished because of some particular difficulty? I took a brief look at
>> the Martin Jaggi paper, and the algorithm itself does not seem complicated.
>> So I guess most of the time for the project would go into implementing the
>> algorithm in the desired form and testing it extensively? What kinds of
>> tests are we expecting?
>> 
>> Also, I checked src/mlpack/core/optimizers/ and I saw the GradientDescent
>> class implemented. I guess I need to write a new class with a similar
>> structure?
> 
> Hi Chenzhe,
> 
> Do you know Shangtong Zhang?  He is a first-year MSc student who also
> attends the University of Alberta.  Or Bang Liu?  He also is a PhD
> student at UofA and was a part of mlpack GSoC last year.  Maybe you guys
> all know each other?  It seems like it's a big university though (nearly
> 40k students) so maybe the chances are small. :)
> 
> Nobody implemented the Frank-Wolfe optimizer from last year, so the
> project (and related projects) are still open.  Anything you find in
> src/mlpack/core/optimizers/ is what we have, although there are a few
> open PRs related to this issue:
> 
> https://github.com/mlpack/mlpack/issues/893
> 
> But those are not F-W; they are basically other optimizers related to
> SGD.
> 
> Essentially you are right: the idea of the project would be to provide
> an implementation of the algorithm in Jaggi's paper.  In your case, given
> your background and expertise, this will probably be a relatively
> straightforward task.  Testing the algorithm has some difficulty but
> honestly I suspect it can be tested mostly like the other optimizers:
> come up with some easy and hard problems to optimize, and make sure that
> the implemented F-W algorithm can successfully find the minimum.  You
> can take a look at the existing tests for other optimizers in
> src/mlpack/tests/ to get some kind of an idea for how to do that.
> 
> Building on top of that, there are many further places you could go with
> the project:
> 
> * you could modify the various mlpack programs like
>   mlpack_logistic_regression and mlpack_softmax_regression and so
>   forth to expand the list of available optimizers
> 
> * you could benchmark the F-W optimizer against other optimizers on
>   various problems and possibly (depending on the results) assemble
>   something that could be published
> 
> * you could try implementing some new ideas based on the stock F-W
>   optimizer and see if they give improvement
> 
> * you could implement an additional optimizer
> 
> * you could implement an algorithm that is meant to use the F-W
>   optimizer, like maybe some of the F-W SVM work that Jaggi also did?
>   That might be too much for a single summer though...
> 
> In any case, the choice is up to you---the project idea is there as
> kind of a boilerplate starting point for whatever ideas you would find
> most interesting.
> 
> Thanks,
> 
> Ryan
> 
> -- 
> Ryan Curtin    | "Avoid the planet Earth at all costs."
> ryan at ratml.org |   - The President
> _______________________________________________
> mlpack mailing list
> mlpack at lists.mlpack.org
> http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack
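To make the discussion above concrete, here is a minimal sketch of the Frank-Wolfe algorithm from Jaggi's paper, and of the kind of "easy problem" test Ryan describes. Everything here is illustrative, not mlpack's actual API: the function name, the choice of objective (a quadratic minimized over the probability simplex, where the linear minimization oracle is trivial), and the step-size rule are my own assumptions.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Sketch of Frank-Wolfe: minimize f(x) = sum_i (x_i - b_i)^2 over the
// probability simplex.  Over the simplex, the linear minimization oracle
// simply returns the vertex e_i for the smallest gradient coordinate.
std::vector<double> FrankWolfeSimplex(const std::vector<double>& b,
                                      std::size_t maxIterations)
{
  const std::size_t n = b.size();
  std::vector<double> x(n, 1.0 / n);  // Start at the simplex barycenter.
  std::vector<double> g(n);
  for (std::size_t t = 0; t < maxIterations; ++t)
  {
    // Gradient of f at x: g_i = 2 * (x_i - b_i).
    for (std::size_t i = 0; i < n; ++i)
      g[i] = 2.0 * (x[i] - b[i]);

    // LMO step: the vertex minimizing <g, s> picks the smallest g_i.
    const std::size_t best = static_cast<std::size_t>(
        std::min_element(g.begin(), g.end()) - g.begin());

    // Standard diminishing step size from Jaggi's analysis.
    const double gamma = 2.0 / (t + 2.0);

    // Convex combination update: x <- (1 - gamma) * x + gamma * e_best.
    for (std::size_t i = 0; i < n; ++i)
      x[i] *= (1.0 - gamma);
    x[best] += gamma;
  }
  return x;
}
```

A test in the style of the existing ones in src/mlpack/tests/ would then just check that, for a b inside the simplex (so the minimizer is b itself), the returned x is close to b after enough iterations.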