[mlpack] Deep Learning modules idea: Discussion

Vivek Pal vivekpal.dtu at gmail.com
Thu Feb 9 07:19:52 EST 2017


Hi Marcus,

I went through a good amount of material on deep learning over the past
few days, starting with Chapter 20 of the Deep Learning book by Ian
Goodfellow et al. and following up with the papers I mentioned earlier.

Also, while going through the text and looking at the codebase
simultaneously, I discovered that there's scope for implementing at
least a couple more activation functions, e.g. softplus and a more
recent one, ELU (https://arxiv.org/abs/1511.07289).

I believe ELU can be implemented similarly to the existing Leaky ReLU
layer, and softplus can simply be added as a new activation function in
the ann/activation_functions directory; a rough sketch of what both
functions compute is below.
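
To make the idea concrete, here is a small, self-contained C++ sketch of
the forward values and the derivatives a backward pass would need. The
names and signatures are purely illustrative and are not taken from
mlpack's actual activation-function interface:

    #include <cmath>
    #include <cstdio>

    // Softplus: f(x) = log(1 + exp(x)); its derivative is the logistic
    // sigmoid.
    double Softplus(const double x) { return std::log1p(std::exp(x)); }
    double SoftplusDeriv(const double x) { return 1.0 / (1.0 + std::exp(-x)); }

    // ELU (Clevert et al., 2015): f(x) = x for x > 0 and
    // alpha * (exp(x) - 1) otherwise.  The derivative is 1 for x > 0 and
    // f(x) + alpha otherwise, which is why it can follow the same pattern
    // as the existing Leaky ReLU layer.
    double Elu(const double x, const double alpha = 1.0)
    {
      return (x > 0.0) ? x : alpha * (std::exp(x) - 1.0);
    }

    double EluDeriv(const double x, const double alpha = 1.0)
    {
      return (x > 0.0) ? 1.0 : Elu(x, alpha) + alpha;
    }

    int main()
    {
      for (const double x : { -2.0, -0.5, 0.0, 1.5 })
        std::printf("x = %5.2f  softplus = %6.3f  elu = %6.3f\n",
                    x, Softplus(x), Elu(x));
    }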

Would that be a useful first contribution?

It would help me gain first-hand experience with the mlpack development
process and a better understanding of the ann module internals.

> also I've added some papers to the Essential Deep Learning Modules
> project idea, that might be interesting too.

Yes, I feel I could use some more ideas and details on certain
architectures from the other papers you have added there. Thanks, I
will get started once I'm done reading the papers at hand.

Thanks,
Vivek Pal