BinaryByter
But shouldn't it work under regular conditions as well?
Mat
Usually for deep learning you give something like 3 or 4
Mat
I'll try to figure something :)
Kelvin
OK it's a joke
Kelvin
Mat
😆
Mat
Don't be sad
Kelvin
Kelvin
2:35am
Kelvin
Good night people!
klimi
Good night people!
Good night :3 sleep tight
MᏫᎻᎯᎷᎷᎬᎠ
.
klimi
.
Hmmm
MᏫᎻᎯᎷᎷᎬᎠ
..
klimi
You forgot cd
MᏫᎻᎯᎷᎷᎬᎠ
Hhhh
BinaryByter
Usually for deep learning you give something like 3 or 4
Well yeah, but even with 12 the NaN shouldn't occur. Besides, with layers like 5000-200-500 it occurs as well
BinaryByter
Yep
Mat
Maybe the network is too big
Mat
I didn't have time to read the code
Mat
But those are really big numbers for a NN
BinaryByter
But it shouldn't output 'not a number', should it?
Mat
You should study the evolution of the nn
Mat
Probably it goes out of control
Defragmented
But it shouldn't output 'not a number', should it?
"Or rather: when solving for exp(x) = 0, x is undefined" Do you have access to the activation function / backprop function? If yes, add a condition: if NaN, stop and dump the whole NN. I had such a problem, but in my case it was because I used a bad activation function and didn't normalize into 0...1, and the value exploded
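The stop-and-dump idea above can be sketched roughly like this. This is not BinaryByter's actual code: the layer layout and the names `dumpNetwork`/`checkAndDump` are assumptions, with each layer as a flat `std::vector` of weights.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Hypothetical network state: one flat weight vector per layer.
using Layer = std::vector<double>;

// Dump every weight so the blown-up region can be inspected later.
void dumpNetwork(const std::vector<Layer>& layers) {
    for (std::size_t l = 0; l < layers.size(); ++l)
        for (std::size_t i = 0; i < layers[l].size(); ++i)
            std::printf("layer %zu, weight %zu = %f\n", l, i, layers[l][i]);
}

// Returns true (after dumping) as soon as any value is NaN or infinite.
bool checkAndDump(const std::vector<Layer>& layers) {
    for (const Layer& layer : layers)
        for (double w : layer)
            if (!std::isfinite(w)) {  // catches NaN and +/-inf alike
                dumpNetwork(layers);
                return true;
            }
    return false;
}
```

Calling `checkAndDump` after every backprop step freezes the network state at the first non-finite value, which is exactly the "stop and dump" condition Defragmented describes.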
BinaryByter
The normalization function is a sigmoid
Defragmented
That's good, then you can add a stop-and-save condition to the activation/backprop functions
Defragmented
When you have a dump you will see where it happened and can analyze the nearby neurons
BinaryByter
When you have a dump you will see where it happened and can analyze the nearby neurons
It happens in the function "calculateDerivatives"
BinaryByter
Between the first and second layer
Defragmented
with what input data?
BinaryByter
0,1,1
BinaryByter
Or 1,1,1
BinaryByter
Or 1,0,1
BinaryByter
Or
BinaryByter
0,0,1
Defragmented
What are those 0s and 1s?
BinaryByter
It's an XOR NN
Defragmented
Sigmoid and 1 layer can't do XOR, as I remember (you need 2 layers)
BinaryByter
Thats not the problem
BinaryByter
Besides, I've got 12 layers lol
Defragmented
Okay. And the error is in the second layer? Did you check that all the links your neurons read from lead to the correct neurons in the previous layer?
Defragmented
If you use just arrays as neurons, it's easy to go out of bounds
BinaryByter
Besides, I'm not a crackpot, I know what I'm doing, kinda lol
BinaryByter
So tl;dr: no out-of-bounds action
BinaryByter
Besides, an out-of-bounds access would throw a SIGSEGV
BinaryByter
Well, except for a vector
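The vector caveat is worth spelling out: assuming `std::vector`, `operator[]` out of range is undefined behaviour and often reads garbage silently instead of segfaulting, while `.at()` does a bounds check and throws. A small sketch (the `safeRead` helper is illustrative, not from the actual code):

```cpp
#include <stdexcept>
#include <vector>

// Reads v[i] with bounds checking; returns false instead of
// invoking undefined behaviour when i is out of range.
// v[i] with operator[] would be UB here -- it may not raise
// SIGSEGV at all, which is how such a bug can hide.
bool safeRead(const std::vector<double>& v, std::size_t i, double& out) {
    try {
        out = v.at(i);  // .at() throws std::out_of_range, unlike operator[]
        return true;
    } catch (const std::out_of_range&) {
        return false;
    }
}
```

So if the weight indices were ever wrong, a vector indexed with `[]` could feed garbage values into the maths without any crash.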
Defragmented
Okay, try to do an if(nan){save}
Defragmented
If you have full access this should be easy
BinaryByter
Okay, try to do an if(nan){save}
I've already done that with the VS Code debugger
BinaryByter
The question is not where it happens, it's about how I can make it stop
Defragmented
You can't stop it if you don't know what leads to it
BinaryByter
You can't stop it if you don't know what leads to it
A derivative gets calculated and becomes nan
BinaryByter
Except I can't change the calculation without breaking the maths
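For what it's worth, the standard sigmoid derivative can't produce NaN on its own from a finite activation: σ'(x) = σ(x)·(1 − σ(x)) is bounded in [0, 0.25], so a NaN coming out of a function like `calculateDerivatives` usually means a NaN or infinity was already present in the activations or weights upstream. A sketch, assuming the usual formulation (these names are illustrative):

```cpp
#include <cmath>

// Standard logistic sigmoid.
double sigmoid(double x) {
    return 1.0 / (1.0 + std::exp(-x));
}

// Derivative expressed via the activation a = sigmoid(x):
// sigma'(x) = a * (1 - a), bounded in [0, 0.25] for finite a in [0, 1].
// It only yields NaN if the activation fed in is already NaN.
double sigmoidDerivative(double activation) {
    return activation * (1.0 - activation);
}
```

Under this assumption, hunting for where the first non-finite activation or weight appears (rather than inside the derivative itself) is the likelier path to the cause.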
Defragmented
What's the input data to the derivative when it produces NaN?
BinaryByter
What's the input data to the derivative when it produces NaN?
The derivative doesn't calculate over input data
BinaryByter
Or does it?
BinaryByter
Wait
BinaryByter
I'm tired, I'll check tomorrow in front of a PC again
BinaryByter
Or in two, depending on how nice my dad is
Dima
BinaryByter
needs more BOND
Bangladeshi's Opera Naughty Dancer
Defragmented
I'm tired, I'll check tomorrow in front of a PC again
Anyway, if (x==0){x=0.00001} might work as a last resort
Defragmented
where x is what?
derivative input data
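Defragmented's last-resort clamp could look like the sketch below. The epsilon 1e-5 is just the magic number from the suggestion above, and extending the check from exactly-zero to near-zero values is my own tweak, not part of the original idea:

```cpp
#include <cmath>

// Last-resort guard: nudge a zero (or near-zero) value away from
// zero so downstream maths can't divide by it. Preserves the sign
// so the gradient direction isn't flipped.
double clampAwayFromZero(double x, double eps = 1e-5) {
    if (std::fabs(x) < eps)
        return (x < 0.0) ? -eps : eps;
    return x;
}
```

As Defragmented says, this is a band-aid: it hides the symptom rather than explaining why the value hit zero in the first place.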
BinaryByter
I'm 100% sure that the weighted sum of the inputs isn't 0
Defragmented
if it reacts to 0 this way
BinaryByter
derivative input data
I have no division that takes the derivative as dividend
Defragmented
I have no division that takes the derivative as dividend
The derivative is a difference between the real signal and the preferred signal, right?
BinaryByter
difference may be 0
But that would make the error pop up in the last layer