nt.stax.LeakyRelu()
Leaky ReLU nonlinearity, i.e. alpha * min(x, 0) + max(x, 0).

Parameters:
    alpha (float) – slope for x < 0.
    do_stabilize (bool) – set to True for very deep networks.

Returns:
    tuple[InitFn, ApplyFn, LayerKernelFn] – the (init_fn, apply_fn, kernel_fn) triple.
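As a quick check of the formula above, here is a minimal NumPy sketch of the pointwise nonlinearity. The function name `leaky_relu` and the example inputs are illustrative only, not part of the nt.stax API; the layer itself returns the (init_fn, apply_fn, kernel_fn) triple rather than a bare function.

```python
import numpy as np

def leaky_relu(x, alpha=0.1):
    # alpha * min(x, 0) + max(x, 0):
    # identity for x >= 0, slope alpha for x < 0.
    return alpha * np.minimum(x, 0) + np.maximum(x, 0)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x))  # [-0.2  -0.05  0.    1.    3.  ]
```

Note that alpha = 1 recovers the identity and alpha = 0 recovers the plain ReLU, which is why the slope is the only parameter the kernel computation needs.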