Types
Backward[TT] = proc (self: Gate[TT]; payload: Payload[TT]): SmallDiffs[TT] {.nimcall.}
- ⚠️ Warning: make sure the identifier is not overloaded (https://github.com/nim-lang/Nim/issues/9997)
Context[TT] = ref object
-
An autograd context is a record of operations or layers. It holds the following fields:
- nodes: records the list of operations (Node) applied in the context
- no_grad: disables tracing of operations altogether. This is useful to save memory when you don't need the gradient (for validation or prediction, for example)
A context is also called a tape or a Wengert list.
Note: backpropagation empties the list of operations.
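For instance, a fresh context for float32 tensors can be created with newContext. A minimal sketch; the arraymancer import name and the Tensor[float32] instantiation are assumptions about the surrounding library, not stated in this section:

```nim
import arraymancer

# Create a fresh autograd context (the "tape") that will record
# every traced operation applied to its variables.
let ctx = newContext Tensor[float32]
```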
Gate[TT] = ref object of RootObj
- Base operator or layer. You can describe your custom operations or layers by inheriting from Gate and adding a forward and optionally a backward method. Each operation should set the number of gradients produced during backpropagation. Additional fields specific to the operation, like weights or an input cache, should be added too.
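As a sketch, a custom gate might look like the following. IdentityGate and identityBackward are hypothetical names for illustration; the backward proc matches the Backward[TT] signature above and simply passes the incoming gradient through unchanged:

```nim
import arraymancer

type
  # Hypothetical gate: inherit from Gate and add operation-specific
  # fields (weights, input cache, ...) as needed.
  IdentityGate[TT] = ref object of Gate[TT]

proc identityBackward[TT](self: Gate[TT], payload: Payload[TT]): SmallDiffs[TT] {.nimcall.} =
  # One parent, hence one gradient; the identity op forwards it as-is.
  result = newDiffs[TT](1)
  result[0] = payload.variable.grad
```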
Payload[TT] = object
  case kind*: PayloadKind
  of pkVar: variable*: Variable[TT]
  of pkSeq: sequence*: seq[Variable[TT]]
PayloadKind = enum pkVar, pkSeq
SmallDiffs[TT] = seq[TT]
Variable[TT] = ref object
  context*: Context[TT]
  value*: TT
  grad*: TT
  requires_grad*: bool
-
A variable is a wrapper for Tensors that tracks operations applied to it. It consists of:
- context: a weak reference to the record of operations
- value: the tensor being tracked
- grad: the gradient of the tensor
- requires_grad: a flag that indicates if the gradient is needed
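Putting the pieces together, a Variable is typically obtained by wrapping a tensor in a context. A sketch assuming the library exports a variable constructor on Context (not documented in this section):

```nim
import arraymancer

let ctx = newContext Tensor[float32]

# Wrap a tensor in the context; requires_grad asks the context
# to trace operations applied to it.
let x = ctx.variable([[1.0'f32, 2], [3, 4]].toTensor, requires_grad = true)
```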
Procs
func is_grad_needed(v: Variable): bool {.inline.}
- Depending on the input variable's requires_grad flag and its context's no_grad_mode, returns true if gradient computation is needed and false otherwise
func newContext(TT: typedesc): Context[TT]
- Initialize a context
func newDiffs[TT](num: Natural): SmallDiffs[TT] {.inline.}
func newParents[TT](num: Natural): Parents[TT] {.inline.}
Templates
template no_grad_mode(ctx: Context; body: untyped): untyped
-
Within this block, the context will not track the operations applied to each Variable.
This should be used for validation or prediction to optimize memory.
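Typical usage, sketched under the same assumptions as the examples above (the variable constructor is assumed; the work inside the block is illustrative):

```nim
import arraymancer

let ctx = newContext Tensor[float32]
let x = ctx.variable([1.0'f32, 2, 3].toTensor, requires_grad = true)

ctx.no_grad_mode:
  # Operations in this block are not recorded on the tape,
  # which saves memory during validation or prediction.
  discard x.value.sum
```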