
src/arraymancer/nn/layers/gru


Types

GRUGate[TT] {.final.} = ref object of Gate[TT]
For now, the GRU layer only supports a fixed-size GRU stack and a fixed number of timesteps.
GRULayer[T] = object
  w3s0*, w3sN*: Variable[Tensor[T]]
  u3s*: Variable[Tensor[T]]
  bW3s*, bU3s*: Variable[Tensor[T]]

Procs

proc forward[T](self: GRULayer[T]; input, hidden0: Variable): tuple[
    output, hiddenN: Variable]
Inputs:

- ``input``: the input sequence
- ``hidden0``: the initial hidden state

Outputs:

- ``output``: the output features for each timestep
- ``hiddenN``: the final hidden state
proc gru[TT](input, hidden0: Variable[TT]; W3s0, W3sN, U3s: Variable[TT];
             bW3s, bU3s: Variable[TT]): tuple[output, hiddenN: Variable[TT]]

⚠️ API subject to change to match CuDNN's

Bidirectional support is not implemented.

Inputs:

- ``input``, ``hidden0``: the input sequence and the initial hidden state
- ``W3s0``, ``W3sN``: the input weights for the first and subsequent layers
- ``U3s``: the recurrent (hidden-to-hidden) weights
- ``bW3s``, ``bU3s``: the input and recurrent biases

Outputs:

- ``output``: the output features for each timestep
- ``hiddenN``: the final hidden state
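As a rough illustration of what a fused GRU forward pass computes — not Arraymancer's implementation — the single-timestep update in the CuDNN formulation (where the reset gate is applied to the recurrent part of the candidate) can be sketched in NumPy. All names and shape conventions below are illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, W3, U3, bW3, bU3):
    """One GRU timestep (CuDNN-style equations, illustrative only).

    x:   input at this timestep,        shape [batch, features]
    h:   previous hidden state,         shape [batch, hidden]
    W3:  input weights for r, z, n,     shape [3*hidden, features]
    U3:  recurrent weights for r, z, n, shape [3*hidden, hidden]
    bW3, bU3: biases,                   shape [3*hidden]
    """
    H = h.shape[1]
    Wx = x @ W3.T + bW3                            # [batch, 3*hidden]
    Uh = h @ U3.T + bU3                            # [batch, 3*hidden]
    r = sigmoid(Wx[:, :H] + Uh[:, :H])             # reset gate
    z = sigmoid(Wx[:, H:2*H] + Uh[:, H:2*H])       # update gate
    n = np.tanh(Wx[:, 2*H:] + r * Uh[:, 2*H:])     # candidate state
    return (1.0 - z) * n + z * h                   # next hidden state
```

The full `gru` proc would apply such an update across every timestep and every stacked layer, with separate weight tensors for the first layer (fed by the input features) and the remaining layers (fed by the layer below).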
proc init[T](ctx: Context[Tensor[T]]; layerType: typedesc[GRULayer[T]];
             numInputFeatures, hiddenSize, layers: int): GRULayer[T]
Creates a gated recurrent unit (GRU) layer. Input:

- ``numInputFeatures``: number of features of the input
- ``hiddenSize``: size of the hidden layer(s)
- ``layers``: number of stacked layers

Returns the created GRULayer.

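The field names of `GRULayer` (`w3s0`, `w3sN`, `u3s`, `bW3s`, `bU3s`) suggest the three gate matrices are stored concatenated. Under the standard shape assumptions for a stacked GRU — which may differ from Arraymancer's actual layouts — the total parameter count implied by `init`'s arguments can be sketched as:

```python
def gru_param_count(num_input_features, hidden_size, layers):
    """Parameter count for a stacked GRU, assuming standard shapes
    (the three gate matrices concatenated along the first axis):
      first layer input weights:      [3*hidden, features]
      input weights of later layers:  [3*hidden, hidden] each
      recurrent weights, all layers:  [3*hidden, hidden] each
      two bias vectors per layer:     [3*hidden] each
    Illustrative assumption only, not Arraymancer's storage scheme.
    """
    g = 3 * hidden_size
    w_first = g * num_input_features          # w3s0
    w_rest = (layers - 1) * g * hidden_size   # w3sN
    u_all = layers * g * hidden_size          # u3s
    biases = layers * 2 * g                   # bW3s and bU3s
    return w_first + w_rest + u_all + biases
```

For example, under these assumptions a 2-layer GRU with 10 input features and hidden size 20 would hold 4440 parameters.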