[mlpack] Deep Learning modules idea: Discussion

Vivek Pal vivekpal.dtu at gmail.com
Thu Feb 9 09:42:14 EST 2017


Hi Marcus,

> There are many really good resources; also, I often hear that the Coursera
> machine learning courses are worth a look.

Yes, indeed. In fact, that course and another one, "Neural Networks for
Machine Learning" by G. Hinton, are on my watch list for this month. I will
definitely try to cover the major concepts related to this project.

> Absolutely, I was going to open an issue for the ELU function, but it
> sounds like I don't have to do that anymore. If you are going to implement
> one or both functions (don't feel obligated), please open a separate PR
> for each function and also include some tests.

Well, it looks like I'm in luck. I'd prefer to start with softplus and will
try to open a PR (complete with tests) over the weekend. I've put rough
sketches below of how I picture both functions.
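
To check my understanding of the pattern before diving in, here is a rough
sketch of what I have in mind for softplus, following the static
Fn()/Deriv() style of the existing activation functions (the exact
interface may differ; class and method names here are just illustrative):

// Sketch only: softplus f(x) = log(1 + e^x), with f'(x) = 1 / (1 + e^-x).
#include <cmath>

class SoftplusFunction
{
 public:
  // Forward pass; for large x, log(1 + e^x) is effectively x, and the
  // branch keeps std::exp() from overflowing.
  static double Fn(const double x)
  {
    return (x > 30.0) ? x : std::log(1.0 + std::exp(x));
  }

  // The derivative of softplus is the logistic sigmoid.
  static double Deriv(const double x)
  {
    return 1.0 / (1.0 + std::exp(-x));
  }
};

And since ELU carries a tunable alpha parameter, I picture it as a small
layer-like class along the lines of the existing Leaky ReLU layer rather
than a stateless function (again, only a sketch):

// Sketch only: ELU (https://arxiv.org/abs/1511.07289),
// f(x) = x for x > 0 and alpha * (e^x - 1) otherwise, with alpha > 0.
#include <cmath>

class ELU
{
 public:
  explicit ELU(const double alpha = 1.0) : alpha(alpha) { }

  double Fn(const double x) const
  {
    return (x > 0.0) ? x : alpha * (std::exp(x) - 1.0);
  }

  // For x <= 0, f'(x) = alpha * e^x, which equals Fn(x) + alpha.
  double Deriv(const double x) const
  {
    return (x > 0.0) ? 1.0 : Fn(x) + alpha;
  }

 private:
  // Controls the saturation value for negative inputs.
  double alpha;
};

Does that roughly match what you had in mind for the interfaces?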

Thanks again,
Vivek Pal


On Thu, Feb 9, 2017 at 7:33 PM, Marcus Edel <marcus.edel at fu-berlin.de>
wrote:

> Hello Vivek,
>
> I went through a lot of text on deep learning in the past few days,
> starting with chapter 20 of the Deep Learning book by Ian Goodfellow and
> following up with the papers I mentioned earlier.
>
>
> There are many really good resources; also, I often hear that the Coursera
> machine learning courses are worth a look.
>
> Also, while going through the text and looking at the codebase
> simultaneously, I discovered that there's scope for implementing at least
> a couple more activation functions, e.g. softplus and a more recent one,
> ELU (https://arxiv.org/abs/1511.07289).
>
> I believe ELU can be implemented similarly to the existing Leaky ReLU
> layer, and softplus can simply be added as a new activation function in
> the ann/activation_functions directory.
>
>
> Absolutely, I was going to open an issue for the ELU function, but it
> sounds like I don't have to do that anymore. If you are going to implement
> one or both functions (don't feel obligated), please open a separate PR
> for each function and also include some tests.
>
> Thanks,
> Marcus
>
> On 9 Feb 2017, at 13:19, Vivek Pal <vivekpal.dtu at gmail.com> wrote:
>
> Hi Marcus,
>
> I went through a lot of text on deep learning in the past few days,
> starting with chapter 20 of the Deep Learning book by Ian Goodfellow and
> following up with the papers I mentioned earlier.
>
> Also, while going through the text and looking at the codebase
> simultaneously, I discovered that there's scope for implementing at least
> a couple more activation functions, e.g. softplus and a more recent one,
> ELU (https://arxiv.org/abs/1511.07289).
>
> I believe ELU can be implemented similarly to the existing Leaky ReLU
> layer, and softplus can simply be added as a new activation function in
> the ann/activation_functions directory.
>
> Would that be a useful first contribution?
>
> It'll help me gain first-hand experience with the mlpack development
> process and a better understanding of the ann module's internals.
>
> > Also, I've added some papers to the Essential Deep Learning Modules
> > project idea that might be interesting too.
>
> Yeah, I feel I could use some more ideas and details on certain
> architectures from the other papers you have added there. Thanks; I will
> get started once I'm done reading all the papers at hand.
>
> Thanks,
> Vivek Pal
> _______________________________________________
> mlpack mailing list
> mlpack at lists.mlpack.org
> http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack
>
>
>

