Building on our work from last week on optimizing our ANN framework, we went ahead and implemented the EvaluateWithGradient() function for the RNN class as well.
Though we had initially implemented the function with the aim of reducing code duplication, we found that it also gave us at least a 30% speedup in the case of simple FFN networks. For the RNN class, the speedup was slightly lower, at around 25%, primarily because of the heavier gradient computation routines involved. We have also put the function to use inside our GAN::EvaluateWithGradient() function, so a certain amount of speedup is expected there as well!
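To make the source of the speedup concrete, here is a minimal toy sketch of the idea: when an optimizer calls Evaluate() and Gradient() separately, each call has to run its own forward pass, while a combined EvaluateWithGradient() runs the forward pass once and reuses the result for both the objective and the gradient. This is not mlpack's actual implementation; ToyModel, Forward(), Loss(), and the function signatures below are all made-up names for illustration.

```c++
#include <iostream>
#include <utility>
#include <vector>

// Toy model whose Forward() stands in for the expensive forward pass
// of a real network.
class ToyModel
{
 public:
  ToyModel(std::vector<double> inputs, std::vector<double> targets)
      : inputs(std::move(inputs)), targets(std::move(targets)),
        forwardPasses(0) { }

  // Objective only: requires one forward pass.
  double Evaluate(const double w)
  {
    return Loss(Forward(w));
  }

  // Gradient only: also requires a forward pass to get the residuals.
  double Gradient(const double w)
  {
    return LossGradient(Forward(w));
  }

  // Combined: a single forward pass serves both computations.
  double EvaluateWithGradient(const double w, double& gradient)
  {
    const std::vector<double> preds = Forward(w);
    gradient = LossGradient(preds);
    return Loss(preds);
  }

  size_t ForwardPasses() const { return forwardPasses; }

 private:
  // The expensive step we want to avoid repeating.
  std::vector<double> Forward(const double w)
  {
    ++forwardPasses;
    std::vector<double> preds(inputs.size());
    for (size_t i = 0; i < inputs.size(); ++i)
      preds[i] = w * inputs[i];  // toy linear "network"
    return preds;
  }

  // Mean squared error over the dataset.
  double Loss(const std::vector<double>& preds) const
  {
    double loss = 0.0;
    for (size_t i = 0; i < preds.size(); ++i)
      loss += (preds[i] - targets[i]) * (preds[i] - targets[i]);
    return loss / preds.size();
  }

  // d(MSE)/dw for the toy linear model.
  double LossGradient(const std::vector<double>& preds) const
  {
    double grad = 0.0;
    for (size_t i = 0; i < preds.size(); ++i)
      grad += 2.0 * (preds[i] - targets[i]) * inputs[i];
    return grad / preds.size();
  }

  std::vector<double> inputs, targets;
  size_t forwardPasses;
};

int main()
{
  ToyModel model({1.0, 2.0, 3.0}, {2.0, 4.0, 6.0});

  // Separate calls: two forward passes for one optimization step.
  model.Evaluate(1.5);
  model.Gradient(1.5);
  std::cout << "Separate calls: " << model.ForwardPasses()
            << " forward passes\n";

  // Combined call: only one more forward pass for the next step.
  double gradient;
  model.EvaluateWithGradient(1.5, gradient);
  std::cout << "After combined call: " << model.ForwardPasses()
            << " forward passes total\n";
}
```

In a real network the forward pass dominates the cost of an optimization step, so halving the number of forward passes per step is roughly where the observed speedup comes from.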
I also received my Phase II evaluations this week, and I'm glad that Marcus is satisfied with the effort we have put in. I will continue building on my work on RBMs, and hopefully we can merge them as well before the end of this month.