I implemented another overload of the Forward() function in the FFN class, along with a test. This makes it very easy to run a forward pass through just the encoder or just the decoder of a saved model.
The convolutional model was not working with the Sequential object because, inside it, reset was being hard-set to true. Hence, it was setting the inputWidth and inputHeight of the TransposedConvolutional layer both to 0. I trained that convolutional model and observed the total loss go much lower than it did with a dense layered model, as expected. However, the KL divergence was higher than with the dense layered model, and as a result sampling from the prior didn't generate well-defined results.
To put this in the models repository, I had to work with some CMake, so I learned its basics for this task.
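The CMake involved is roughly of this shape; the project and file names below are placeholders, not the actual build files of the models repository:

```cmake
# Hypothetical minimal CMakeLists.txt for building one model executable
# that links against mlpack and Armadillo.
cmake_minimum_required(VERSION 3.5)
project(vae_mnist)

# Armadillo ships a CMake find module; mlpack is linked by name here.
find_package(Armadillo REQUIRED)
include_directories(${ARMADILLO_INCLUDE_DIRS})

add_executable(vae_mnist vae_mnist.cpp)
target_link_libraries(vae_mnist ${ARMADILLO_LIBRARIES} mlpack)
```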
I added a BernoulliDistribution to ann/dists. It will be needed for generating binary MNIST. I tried training a model with it and it did seem to work. I also added support for beta-VAEs, which was a very simple task.