src/arraymancer/nn_primitives/nnp_sigmoid_cross_entropy


Procs

proc sigmoid_cross_entropy[T](input, target: Tensor[T]): T

Sigmoid function + Cross-Entropy loss fused in one layer.

Input:

  • The input Tensor of raw, pre-sigmoid values (logits)
  • The target values

Returns:

  • The cross-entropy loss computed after applying a sigmoid activation to the input, as a scalar

Shape:

  • Both the input and target should have shape [batch_size, features], i.e. the number of samples as the first dimension
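
A minimal usage sketch follows, assuming the proc is reachable through the top-level arraymancer import (as in current releases); the logits and targets below are made up for illustration. A common motivation for fusing the two operations is that the loss can then be evaluated in the numerically stable form max(x, 0) - x*t + ln(1 + exp(-|x|)) rather than computing ln(sigmoid(x)) directly.

  import arraymancer

  let
    # Raw logits, shape [batch_size = 2, features = 3]
    input = [[ 0.5'f32, -1.2,  2.0],
             [ 1.5'f32,  0.3, -0.7]].toTensor
    # Binary targets in {0, 1}, same shape as the input
    target = [[1.0'f32, 0.0, 1.0],
              [0.0'f32, 1.0, 0.0]].toTensor

  # The result is a single scalar of type float32
  let loss = sigmoid_cross_entropy(input, target)
  echo loss
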
proc sigmoid_cross_entropy_backward[T](gradient: Tensor[T] or T;
                                       cached_tensor: Tensor[T];
                                       target: Tensor[T]): Tensor[T] {.noinit.}
Derivative of sigmoid_cross_entropy with respect to its input.

Input:

  • The incoming gradient, as a scalar or a Tensor
  • The cached tensor, i.e. the input saved during the forward pass
  • The target values

Shape:

  • Both the cached tensor and the target should have shape [batch_size, features], i.e. the number of samples as the first dimension
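
A sketch of the backward call, continuing the forward example above (the sample values are illustrative). For the fused sigmoid + cross-entropy loss, the per-element gradient is the well-known sigmoid(x) - target, scaled by the upstream gradient; any batch normalization applied in the forward pass is mirrored here.

  import arraymancer

  let
    # The same logits that were fed to the forward pass
    cached = [[ 0.5'f32, -1.2,  2.0],
              [ 1.5'f32,  0.3, -0.7]].toTensor
    target = [[1.0'f32, 0.0, 1.0],
              [0.0'f32, 1.0, 0.0]].toTensor
    # Upstream gradient: 1 when the loss is the final node of the graph
    grad = 1.0'f32

  # Gradient w.r.t. the input, same shape as `cached`
  let dinput = sigmoid_cross_entropy_backward(grad, cached, target)
  echo dinput.shape  # [2, 3]
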