layers#
Parent layers#
- StatefulLayer: Base class that instantiates buffers/states which update at every time step, and provides helper methods to manage those states.
- SqueezeMixin: Utility mixin class that wraps the __init__ and forward calls to flatten the input to, and the output from, a child class.
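The stateful pattern described above can be sketched without the library. This is a minimal, framework-free illustration (the class and method names here are illustrative, not the library API): a layer keeps a state buffer, updates it at every time step, and exposes a helper that resets it between sequences.

```python
# Hypothetical toy example of the stateful-layer pattern; not the library code.
class ToyStatefulLayer:
    def __init__(self):
        self.v_mem = 0.0  # state buffer, updated at every time step

    def forward(self, x):
        self.v_mem += x   # integrate the input into the state
        return self.v_mem

    def reset_states(self):
        self.v_mem = 0.0  # helper method that manages (clears) the state


layer = ToyStatefulLayer()
outputs = [layer.forward(x) for x in [1.0, 2.0, 0.5]]  # state accumulates
layer.reset_states()                                   # fresh state for the next sequence
```

Because the state persists across calls, `outputs` is the running sum `[1.0, 3.0, 3.5]` rather than the raw inputs.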
Spiking#
All spiking layers are subclasses of StatefulLayer, and Squeeze layers additionally inherit from SqueezeMixin.
- IAF: Integrate and Fire neuron layer, designed as a special case of LIF.
- IAFSqueeze: IAF layer with 4-dimensional input (Batch*Time, Channel, Height, Width).
- IAFRecurrent: Integrate and Fire neuron layer with recurrent connections, which inherits from IAF.
- LIF: Leaky Integrate and Fire neuron layer, which inherits from StatefulLayer.
- LIFRecurrent: Leaky Integrate and Fire neuron layer with recurrent connections, which inherits from LIF.
- ALIF: Adaptive Leaky Integrate and Fire neuron layer, which inherits from LIF.
- ALIFRecurrent: Adaptive Leaky Integrate and Fire neuron layer with recurrent connections, which inherits from ALIF.
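The dynamics these layers implement can be sketched in a few lines of plain Python. This is an illustrative, framework-free sketch of one neuron (function names and the reset-by-subtraction choice are assumptions for illustration, not the library API): a LIF step leaks the membrane, integrates the input, and fires when a threshold is crossed; IAF is the special case with no leak.

```python
def lif_step(v_mem, x, alpha, threshold=1.0):
    """One LIF time step: leak, integrate, fire, then reset by subtraction."""
    v_mem = alpha * v_mem + x            # leak (0 < alpha <= 1), then integrate
    spike = 1 if v_mem >= threshold else 0
    v_mem -= spike * threshold           # membrane reset by subtraction on spike
    return spike, v_mem


def iaf_step(v_mem, x, threshold=1.0):
    """IAF: the special case of LIF with no leak (alpha = 1)."""
    return lif_step(v_mem, x, alpha=1.0, threshold=threshold)


# Drive an IAF neuron with a constant input of 0.5: it fires every 2nd step.
v, spikes = 0.0, []
for _ in range(6):
    s, v = iaf_step(v, 0.5)
    spikes.append(s)
# spikes == [0, 1, 0, 1, 0, 1]
```

The recurrent variants follow the same per-step dynamics but additionally feed the layer's previous output spikes back into its input.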
Non-spiking#
These layers are special cases of LIF layers.
- ExpLeak: Leaky integrator layer, which is a special case of LIF.
- ExpLeakSqueeze: ExpLeak layer with 4-dimensional input (Batch*Time, Channel, Height, Width).
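A leaky integrator uses the same leak-and-integrate update as LIF, but returns the membrane value directly and never fires or resets. A framework-free sketch (the function name is illustrative, not the library API):

```python
def exp_leak_step(v_mem, x, alpha):
    """One leaky-integrator step: decay the membrane, add the input, no spiking."""
    return alpha * v_mem + x


# A single input pulse decays exponentially over the following steps.
v, trace = 0.0, []
for x in [1.0, 0.0, 0.0]:
    v = exp_leak_step(v, x, alpha=0.5)
    trace.append(v)
# trace == [1.0, 0.5, 0.25]
```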
Pooling#
- SpikingMaxPooling2dLayer: Torch implementation of SpikingMaxPooling.
- SumPool2d: Non-spiking sum-pooling layer to be used in analogue Torch models.
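Sum pooling adds the values in each window instead of averaging them, which preserves total spike counts across the layer. A minimal sketch on nested lists, assuming a 2x2 window and stride 2 (illustrative only; the library layer operates on tensors):

```python
def sum_pool_2x2(fmap):
    """2x2 sum pooling with stride 2 on a single-channel feature map."""
    h, w = len(fmap), len(fmap[0])
    return [
        [sum(fmap[i + di][j + dj] for di in (0, 1) for dj in (0, 1))
         for j in range(0, w, 2)]
        for i in range(0, h, 2)
    ]


fmap = [[1, 0, 0, 2],
        [0, 1, 3, 0],
        [2, 2, 1, 1],
        [0, 0, 1, 1]]
# sum_pool_2x2(fmap) == [[2, 5], [4, 4]]
```

Unlike average pooling, no division takes place, so if the inputs are spike counts the outputs are spike counts too.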
Conversion from images / analog signals#
- Img2SpikeLayer: Layer to convert images to spikes.
- Sig2SpikeLayer: Layer to convert analogue signals to spikes.
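One common idea behind such conversions is rate coding: treat each normalised pixel or signal value as a per-time-step firing probability. The sketch below is an illustration of that idea only (function name and scaling are assumptions, not the library API, and the library layers handle normalisation and scaling themselves):

```python
import random


def rate_code(values, n_steps, rng):
    """Sample a (n_steps x len(values)) binary spike train, where each value
    in [0, 1] is the probability of a spike at every time step."""
    return [[1 if rng.random() < v else 0 for v in values]
            for _ in range(n_steps)]


rng = random.Random(0)                       # seeded for reproducibility
spikes = rate_code([0.0, 0.5, 1.0], n_steps=100, rng=rng)
rates = [sum(col) / 100 for col in zip(*spikes)]
# rates[0] == 0.0, rates[2] == 1.0, and rates[1] is close to 0.5
```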
Auxiliary#
- Cropping2dLayer: Crop the input image by a specified number of pixels along each edge.
- Utility layer which wraps any nn.Module.
- FlattenTime: Utility layer which always flattens the first two dimensions (batch and time) and is a special case of torch.nn.Flatten().
- UnflattenTime: Utility layer which always unflattens (expands) the first dimension into two separate ones (batch and time).
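What the time-flattening utilities do can be shown on nested lists: squeeze (batch, time) into one leading dimension so that a layer which knows nothing about time can process every step, then expand it back. The function names below are illustrative (the library layers do the equivalent with tensor reshapes):

```python
def flatten_time(x):
    """(batch, time, ...) -> (batch*time, ...): merge the first two dimensions."""
    return [frame for sample in x for frame in sample]


def unflatten_time(x, batch_size):
    """(batch*time, ...) -> (batch, time, ...): split the first dimension back."""
    t = len(x) // batch_size
    return [x[b * t:(b + 1) * t] for b in range(batch_size)]


x = [["b0t0", "b0t1"], ["b1t0", "b1t1"]]   # batch of 2 samples, 2 time steps each
flat = flatten_time(x)                     # ["b0t0", "b0t1", "b1t0", "b1t1"]
# unflatten_time(flat, batch_size=2) == x
```

This round trip is also why the Squeeze layer variants above take 4-dimensional (Batch*Time, Channel, Height, Width) input: time has already been folded into the batch dimension.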
ANN layers#
- NeuromorphicReLU: ReLU layer with optional quantization of its output.
- QuantizeLayer: Layer that quantizes the input, i.e. returns floor(input).
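The two operations can be sketched directly; quantization here is exactly floor(input), and the rectify-then-quantize composition is one plausible reading of the ReLU variant (function names are illustrative, not the library API):

```python
import math


def quantize(x):
    """Quantize the input, i.e. return floor(input)."""
    return math.floor(x)


def neuromorphic_relu(x):
    """Rectify, then quantize, so the activation reads as a non-negative
    integer (e.g. a spike count)."""
    return max(quantize(x), 0)


# quantize(2.7) == 2; neuromorphic_relu(-1.2) == 0; neuromorphic_relu(2.7) == 2
```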