# Search NKS | Online

1 - 2 of 2 for Tanh

Then given a list s[i] of the values of one set of neurons, one finds the values of another set using s[i + 1] = u[w . s[i]], where in early models u = Sign was usually chosen, and now u = Tanh is more common, and w is a rectangular matrix which gives weights—normally assumed to be continuous numbers, often between -1 and +1—for the synaptic connections between the neurons in each set.
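The update rule above can be sketched numerically. This is a minimal illustration, assuming NumPy; the layer sizes and random weights are arbitrary choices, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

def step(w, s):
    """One update s[i + 1] = u[w . s[i]] with the activation u = Tanh."""
    return np.tanh(w @ s)

s0 = rng.uniform(-1, 1, size=5)       # values of one set of 5 neurons
w = rng.uniform(-1, 1, size=(3, 5))   # rectangular weight matrix, entries in [-1, 1]
s1 = step(w, s0)                      # values of the next set of 3 neurons
print(s1.shape)                       # (3,)
```

Replacing `np.tanh` with `np.sign` recovers the choice u = Sign used in early models.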

But for smaller e[s] one can show that
Abs[m[s]] == (1 - Sinh[2 β]^-4)^(1/8)
where β can be deduced from
e[s] == -(Coth[2 β] (1 + 2 EllipticK[4 Sech[2 β]^2 Tanh[2 β]^2] (-1 + 2 Tanh[2 β]^2)/π))
This implies that just below the critical point e0 = -√2 (which corresponds to β = Log[1 + √2]/2), Abs[m[s]] ~ (e0 - e[s])^(1/8), where here 1/8 is a so-called critical exponent.
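The two formulas above can be checked numerically. This is a sketch assuming SciPy, whose `scipy.special.ellipk` uses the same parameter convention (the argument is m = k^2) as Mathematica's EllipticK; the particular β values probed are arbitrary:

```python
import numpy as np
from scipy.special import ellipk  # complete elliptic integral K(m), parameter convention

def energy(beta):
    """e[s] == -(Coth[2 β] (1 + 2 EllipticK[4 Sech[2 β]^2 Tanh[2 β]^2] (-1 + 2 Tanh[2 β]^2)/π))."""
    t = np.tanh(2 * beta)
    m = 4 * (1 - t**2) * t**2   # Sech[2 β]^2 Tanh[2 β]^2, using Sech^2 == 1 - Tanh^2
    return -(1 / t) * (1 + 2 * ellipk(m) * (2 * t**2 - 1) / np.pi)

def magnetization(beta):
    """Abs[m[s]] == (1 - Sinh[2 β]^-4)^(1/8); zero at and above the critical temperature."""
    s = np.sinh(2 * beta)
    return (1 - s**-4) ** 0.125 if s > 1 else 0.0

beta_c = np.log(1 + np.sqrt(2)) / 2    # critical point: Sinh[2 β] == 1, e == -Sqrt[2]
print(energy(0.42), energy(0.46))      # values bracketing the critical energy -Sqrt[2]
print(magnetization(beta_c), magnetization(0.5))
```

At β = beta_c itself the elliptic-integral term is an indeterminate 0·∞ product numerically (the prefactor -1 + 2 Tanh[2 β]^2 vanishes exactly where EllipticK diverges), which reflects the logarithmic singularity in the specific heat at the critical point.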