Arraymancer

Module nnp_sigmoid_cross_entropy

Procs

proc sigmoid_cross_entropy[T](input, target: Tensor[T]): T
Sigmoid function + Cross-Entropy loss fused in one layer.
Input:
  • A tensor of raw predictions (logits)
  • The target values
Returns:
  • The cross-entropy loss computed after applying a sigmoid activation.
Shape:
  • Both the input and target shapes should be [batch_size, features], i.e. the number of samples is the first dimension
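Fusing the sigmoid with the cross-entropy loss allows a numerically stable formulation that never exponentiates a large positive logit. Below is a minimal pure-Python sketch of that math, not the Arraymancer Nim API; the mean-over-batch reduction is an assumption of this sketch.

```python
import math

def sigmoid_cross_entropy(input, target):
    # Numerically stable fused sigmoid + cross-entropy per element:
    #   loss(x, t) = max(x, 0) - x*t + ln(1 + exp(-|x|))
    # which equals -t*ln(sigmoid(x)) - (1-t)*ln(1 - sigmoid(x))
    # without ever computing exp of a large positive x.
    # Averaging over batch_size is an assumption of this sketch.
    batch_size = len(input)
    total = 0.0
    for row_x, row_t in zip(input, target):
        for x, t in zip(row_x, row_t):
            total += max(x, 0.0) - x * t + math.log1p(math.exp(-abs(x)))
    return total / batch_size
```

For logits of moderate magnitude this agrees with the naive two-step computation; for large logits the naive form overflows or returns `inf` while the fused form stays finite.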
proc sigmoid_cross_entropy_backward[T](gradient: Tensor[T] or T;
                                       cached_tensor: Tensor[T];
                                       target: Tensor[T]): Tensor[T] {.noInit.}
Derivative of sigmoid_cross_entropy with respect to its input.
Input:
  • The input gradient as a scalar or a Tensor
  • A cache tensor containing the input saved during the forward pass
  • The target values
Shape:
  • Both the cache and target shapes should be [batch_size, features], i.e. the number of samples is the first dimension
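The backward pass is simple because the sigmoid and cross-entropy derivatives cancel: the gradient with respect to a logit x is just sigmoid(x) - t. A minimal pure-Python sketch of that math follows, not the Arraymancer Nim API; the 1/batch_size scaling assumes a mean-reduced forward loss, and only the scalar-gradient case is shown.

```python
import math

def sigmoid_cross_entropy_backward(gradient, cached_tensor, target):
    # Gradient wrt the logits x: dL/dx = sigmoid(x) - t,
    # scaled by the incoming (scalar) gradient and, as an assumption
    # matching a mean-reduced forward loss, by 1/batch_size.
    batch_size = len(cached_tensor)
    out = []
    for row_x, row_t in zip(cached_tensor, target):
        out.append([gradient * (1.0 / (1.0 + math.exp(-x)) - t) / batch_size
                    for x, t in zip(row_x, row_t)])
    return out
```

A finite-difference check of the forward loss confirms this closed form, which is why fused sigmoid + cross-entropy layers are both faster and more stable than composing the two operations separately.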