mlpack IRC logs, 2018-07-05

Logs for the day 2018-07-05 (starts at 0:00 UTC) are shown below.

--- Log opened Thu Jul 05 00:00:33 2018
01:56 -!- manish7294 [8ba77a0f@gateway/web/freenode/ip.139.167.122.15] has joined #mlpack
02:31 -!- cjlcarvalho [~caio@200-165-26-228.user.veloxzone.com.br] has joined #mlpack
02:41 < manish7294> rcurtin: The BINDING_TYPE macro and its value are defined in mlpack_main.hpp, which always defines BINDING_TYPE https://github.com/mlpack/mlpack/blob/fd59d030ca31f51cc9a4864eb8f892266bfd1807/src/mlpack/core/util/mlpack_main.hpp#L26 no matter the situation. The actual error is in the order in which the random.hpp and mlpack_main.hpp files are included in each method's main.cpp file.
02:42 < manish7294> In the main.cpp files, random.hpp (which uses BINDING_TYPE) is included well before mlpack_main.hpp (which defines BINDING_TYPE), and this is the case in almost all the methods.
03:35 -!- manish7294 [8ba77a0f@gateway/web/freenode/ip.139.167.122.15] has quit [Ping timeout: 252 seconds]
06:22 -!- cjlcarvalho [~caio@200-165-26-228.user.veloxzone.com.br] has quit [Ping timeout: 248 seconds]
06:51 -!- lozhnikov [~mikhail@lozhnikov.static.corbina.ru] has quit [Ping timeout: 268 seconds]
06:51 -!- lozhnikov [~mikhail@lozhnikov.static.corbina.ru] has joined #mlpack
09:10 -!- vpal [~vivek@unaffiliated/vivekp] has joined #mlpack
09:10 -!- vivekp [~vivek@unaffiliated/vivekp] has quit [Ping timeout: 260 seconds]
09:10 -!- vpal is now known as vivekp
10:00 < jenkins-mlpack> Yippee, build fixed!
10:00 < jenkins-mlpack> Project docker mlpack nightly build build #370: FIXED in 2 hr 46 min: http://masterblaster.mlpack.org/job/docker%20mlpack%20nightly%20build/370/
12:45 < Atharva> zoq: The `Backward()` function in the FFN class doesn't go over the first layer of the network, but when the first layer is a Sequential layer, it should. Otherwise, the layers inside the Sequential object are left with empty errors.
12:49 -!- vivekp [~vivek@unaffiliated/vivekp] has quit [Ping timeout: 240 seconds]
12:50 -!- vivekp [~vivek@unaffiliated/vivekp] has joined #mlpack
12:50 -!- travis-ci [~travis-ci@ec2-54-166-159-252.compute-1.amazonaws.com] has joined #mlpack
12:50 < travis-ci> manish7294/mlpack#54 (evalBounds - 35af793 : Manish): The build was fixed.
12:50 < travis-ci> Change view : https://github.com/manish7294/mlpack/compare/0e7f82a5628d...35af7935bcee
12:50 < travis-ci> Build details : https://travis-ci.com/manish7294/mlpack/builds/78172568
12:50 -!- travis-ci [~travis-ci@ec2-54-166-159-252.compute-1.amazonaws.com] has left #mlpack []
12:52 < zoq> Right, in most cases the backward call of the first layer isn't needed, the seq layer is an exception, and I think there are two solutions here, either we add an identity layer or we check inside the FFN class if the layer implements the Model function.
12:55 < Atharva> zoq: Even if we add an identity layer, we will have to check if the layer implements the Model function. Instead, in that case, we can just call the BackwardVisitor once more. Am I right here?
13:05 < zoq> If we add an Identity layer before the seq layer, we will call the backward of the seq layer since it's the second layer and not the first. Perhaps I missed something?
13:10 < zoq> If we check for the Model function, which acts as an indicator, we don't have to insert an extra identity layer.
13:18 < Atharva> zoq: Yes, you are right. My doubt is, do we ask the users to add the identity layer before the seq layer, or do we add it ourselves? In the latter case, we would have to check for the Model function anyway, right?
13:20 < zoq> Atharva: Right, I guess the second idea might be the way to go, less user interaction, what do you think?
13:26 < Atharva> zoq: I think that's better too. So, while adding a layer, we would have to check if it has the Model() function and whether it is the first layer of the network. If yes, we add an Identity layer before it.
13:28 < Atharva> Or, another option could be to check if the first layer has the Model() function and just run the BackwardVisitor() on it if it does. In this case, we don't have to add an extra layer, as only the backward function is concerned with it.
13:28 < zoq> Agreed, that's easier.
13:32 < Atharva> zoq: Okay then, I will make these changes in one of my PRs.
13:33 < zoq> Great but don't feel obligated, we could use the identity solution for now, if you like.
13:55 < Atharva> zoq: It's not a problem, I have already made a lot of changes locally and they are not much.
14:22 -!- cjlcarvalho [~caio@177-177-182-238.user.veloxzone.com.br] has joined #mlpack
14:28 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has joined #mlpack
15:53 -!- cjlcarvalho [~caio@177-177-182-238.user.veloxzone.com.br] has quit [Remote host closed the connection]
16:10 -!- cjlcarvalho [~caio@177-177-182-238.user.veloxzone.com.br] has joined #mlpack
19:31 -!- cjlcarvalho [~caio@177-177-182-238.user.veloxzone.com.br] has quit [Ping timeout: 248 seconds]
19:49 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has quit [Ping timeout: 245 seconds]
19:57 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has joined #mlpack
20:29 -!- ImQ009 [~ImQ009@unaffiliated/imq009] has quit [Quit: Leaving]
--- Log closed Fri Jul 06 00:00:34 2018