mlpack IRC logs, 2017-07-23
Logs for the day 2017-07-23 (starts at 0:00 UTC) are shown below.
--- Log opened Sun Jul 23 00:00:12 2017
01:27 -!- deep-book-gk_ [~firstname.lastname@example.org] has joined #mlpack
01:29 -!- deep-book-gk_ [~email@example.com] has left #mlpack 
03:22 -!- govg [~govg@unaffiliated/govg] has quit [Ping timeout: 240 seconds]
05:16 -!- govg [~govg@unaffiliated/govg] has joined #mlpack
05:44 -!- govg [~govg@unaffiliated/govg] has quit [Ping timeout: 260 seconds]
06:00 -!- govg [~govg@unaffiliated/govg] has joined #mlpack
07:00 -!- sumedhghaisas [~firstname.lastname@example.org] has quit [Ping timeout: 268 seconds]
07:02 -!- kris1 [~email@example.com] has joined #mlpack
09:07 < kris1> lozhnikov: hi, I have updated the GAN PR, could you have a look?
09:08 < kris1> With a low number of iterations I am getting very bad results, but I am now testing with a high number of iterations (takes a lot of time); let's see how it performs.
09:09 < kris1> zoq: if you have the time, could you also have a look, particularly at the Evaluate and Gradient functions of the GAN? That would be very helpful.
09:25 < lozhnikov> kris1: what about this comment https://github.com/mlpack/mlpack/pull/1066#discussion_r128708271 ?
09:34 -!- kris1 [~firstname.lastname@example.org] has quit [Quit: kris1]
09:37 -!- kris1 [~email@example.com] has joined #mlpack
09:37 < kris1> lozhnikov: I actually reset the parameters first and then do generator.Parameters() = arma::mat()
09:38 < kris1> but it works now….
09:38 < lozhnikov> did you test that with valgrind?
09:39 < lozhnikov> I think this is incorrect since the layers still use the previous pointer
09:39 < kris1> you mean for the memory leak.
09:40 < lozhnikov> No, I mean invalid pointer
09:40 < lozhnikov> And I've got a second comment: do you think it is possible to move the code of Train() into the Gradient() function?
09:42 < kris1> Can you please elaborate? I don't understand the comment.
09:43 < lozhnikov> the first or the second?
09:43 < kris1> second
09:44 < lozhnikov> I mean is it possible to refactor the gradient function in such a way that Train() contains only "optimizer.Optimize()"?
09:45 < kris1> For comment 1, I think the check is: I will see where generator.Parameters() is pointing using memptr(); if it is the same as parameters.memptr(), that would be sufficient, I guess.
09:46 < kris1> I would have to think about the refactoring. Initially I was thinking of creating two separate functions, trainGenerator and trainDiscriminator, and calling them from the Train function. But having just optimizer.Optimize() would require some thinking. I will see.
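A minimal sketch of the refactoring discussed above, with hypothetical names (this is not the PR's code): the alternation that would otherwise live in Train() is folded into Gradient(), which picks the network to update from an iteration counter (k discriminator steps, then one generator step), so Train() reduces to a single optimizer loop.

```cpp
#include <cstddef>

// Hypothetical sketch: Gradient() decides which network a call updates,
// so Train() contains nothing but the optimizer loop.
class GANSketch {
 public:
  explicit GANSketch(std::size_t k)
      : discriminatorSteps(0), generatorSteps(0), k(k), iteration(0) {}

  // Stand-in for Gradient(parameters, i, gradient); real mlpack code
  // would fill an arma::mat with the gradient of the chosen network.
  void Gradient() {
    if (iteration % (k + 1) < k)
      ++discriminatorSteps;  // gradient w.r.t. discriminator parameters
    else
      ++generatorSteps;      // gradient w.r.t. generator parameters
    ++iteration;
  }

  // Train() now only drives the loop; an mlpack optimizer's Optimize()
  // would call Gradient() the same way.
  void Train(std::size_t maxIterations) {
    for (std::size_t i = 0; i < maxIterations; ++i)
      Gradient();
  }

  std::size_t discriminatorSteps, generatorSteps;

 private:
  std::size_t k, iteration;
};
```

With k = 3, eight calls give six discriminator updates and two generator updates: three discriminator steps, one generator step, repeated.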
09:47 < kris1> The results for 1000 iterations of the GAN just came in. I will have to see where I am going wrong.
09:48 < kris1> lozhnikov: could you comment on whether the Train and Gradient functions are correctly implemented?
09:49 < lozhnikov> regarding the first comment: pointers are different since you use the following:
09:49 < lozhnikov> generator.Parameters() = arma::mat(parameter.memptr().....
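The hazard being described: each layer caches a pointer into the parameter storage, while `arma::mat(parameter.memptr(), ...)` copies the auxiliary memory by default (and the assignment to generator.Parameters() copies again), so the cached pointers no longer track the live parameters. A plain-C++ analogy without Armadillo, with hypothetical names:

```cpp
#include <vector>

// Analogy for the invalid-pointer hazard: a "layer" caches a raw pointer
// into the parameter storage; rebuilding the parameters from raw memory
// copies the data, so the cached pointer no longer tracks it.
bool CachedPointerGoesStale() {
  std::vector<double> parameters(4, 1.0);          // parameter matrix
  const double* layerWeights = parameters.data();  // layer's cached view

  // Constructing a new buffer from the old raw memory copies, just as
  // the arma::mat aux-memory constructor does by default.
  std::vector<double> rebuilt(parameters.data(), parameters.data() + 4);

  // The copy lives in different storage; the cached view did not move.
  const bool differentStorage = (rebuilt.data() != layerWeights);

  // Updates through the copy never reach the cached view (and if the
  // original buffer were freed, the view would dangle outright).
  rebuilt[0] = 5.0;
  const bool viewUnchanged = (layerWeights[0] == 1.0);

  return differentStorage && viewUnchanged;
}
```

This is why a memptr() comparison, or a run under valgrind as suggested earlier, is a reasonable check.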
09:51 < lozhnikov> okay, I'll take a quick look now. I'll look through the code in detail as soon as I finish with the ssRBM
09:51 < kris1> Okay, sure… I will check comment 1.
09:57 -!- kris1 [~firstname.lastname@example.org] has quit [Quit: kris1]
10:02 -!- kris1 [~email@example.com] has joined #mlpack
10:08 < lozhnikov> kris1: It seems the formulas of gradients do not match the formulas in the GAN paper
10:08 < lozhnikov> see https://arxiv.org/pdf/1406.2661.pdf page 4
10:08 -!- kris1 [~firstname.lastname@example.org] has quit [Quit: kris1]
10:11 -!- kris1 [~email@example.com] has joined #mlpack
10:14 < kris1> How do you mean? For the gradient calculations of the discriminator, I just call discriminator.Gradient().
10:15 < kris1> For the generator, I am passing the error from discriminator.network.front() to generator.Error(), and then I am basically calling the Gradient of the generator.
10:16 < kris1> I have set fakeLabels to 0 and the real labels to 1 for the discriminator calculation, and fakeLabels = 1 for the generator calculation, by the way.
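For reference, that label convention maps onto the objectives on page 4 of the paper linked above (a summary of the paper, not the PR's code):

```latex
% Minimax objective of Goodfellow et al. (2014), eq. (1):
\min_G \max_D V(D, G) =
    \mathbb{E}_{x \sim p_{\mathrm{data}}}\left[\log D(x)\right]
  + \mathbb{E}_{z \sim p_z}\left[\log\bigl(1 - D(G(z))\bigr)\right]
% Labels 1 (real) / 0 (fake) under cross-entropy train D to ascend this.
% Setting the fake labels to 1 when training G maximizes \log D(G(z))
% instead of minimizing \log(1 - D(G(z))): the non-saturating variant
% the paper recommends early in training.
```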
10:27 < lozhnikov> Okay, it seems I have understood that. Right now I haven't got any comments except:
10:27 < lozhnikov> Optimizer.MaxIterations() = numFunctions;
10:28 < lozhnikov> I think you should replace that by "Optimizer.MaxIterations() = k"
10:29 -!- kris1 [~firstname.lastname@example.org] has quit [Quit: kris1]