Leaky ReLU nonlinearity, i.e. alpha * min(x, 0) + max(x, 0).

Parameters:
  alpha (float) – slope for x < 0.
  do_stabilize (bool) – set to True for very deep networks.

Return type:
  Tuple[InitFn, ApplyFn, LayerKernelFn]

Returns:
  (init_fn, apply_fn, kernel_fn).
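As a minimal sketch of the elementwise formula above (plain NumPy, not the library's layer constructor; the `leaky_relu` helper name is hypothetical):

```python
import numpy as np

def leaky_relu(x, alpha=0.1):
    # alpha * min(x, 0) + max(x, 0): identity for x >= 0,
    # slope alpha for x < 0.
    return alpha * np.minimum(x, 0) + np.maximum(x, 0)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x, alpha=0.1))  # [-0.2  -0.05  0.    1.    3.  ]
```

Note that negative inputs are scaled by alpha while non-negative inputs pass through unchanged, which is what makes the unit "leaky" compared to a standard ReLU.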