Supported Layers and Activation Functions

The PEtab SciML NN model YAML format supports numerous standard neural network layers and activation functions. Layer names and associated keyword arguments follow the PyTorch naming scheme. PyTorch is used because it is currently the most popular machine learning framework, and its comprehensive documentation makes it easy to look up details for any specific layer or activation function.

If support is lacking for a layer or activation function you would like to see, please file an issue on GitHub.

The table below lists the supported and tested neural network layers; keyword arguments and further details for each layer can be found in the PyTorch documentation. The table also indicates which tools support each layer.

| Layer | PEtab.jl | AMICI |
| --- | --- | --- |
| Linear | ✔️ | |
| Bilinear | ✔️ | |
| Flatten | ✔️ | |
| Dropout | ✔️ | |
| Dropout1d | ✔️ | |
| Dropout2d | ✔️ | |
| Dropout3d | ✔️ | |
| AlphaDropout | ✔️ | |
| Conv1d | ✔️ | |
| Conv2d | ✔️ | |
| Conv3d | ✔️ | |
| ConvTranspose1d | ✔️ | |
| ConvTranspose2d | ✔️ | |
| ConvTranspose3d | ✔️ | |
| MaxPool1d | ✔️ | |
| MaxPool2d | ✔️ | |
| MaxPool3d | ✔️ | |
| AvgPool1d | ✔️ | |
| AvgPool2d | ✔️ | |
| AvgPool3d | ✔️ | |
| LPPool1d | ✔️ | |
| LPPool2d | ✔️ | |
| LPPool3d | ✔️ | |
| AdaptiveMaxPool1d | ✔️ | |
| AdaptiveMaxPool2d | ✔️ | |
| AdaptiveMaxPool3d | ✔️ | |
| AdaptiveAvgPool1d | ✔️ | |
| AdaptiveAvgPool2d | ✔️ | |
| AdaptiveAvgPool3d | ✔️ | |

Supported Activation Functions

The table below lists the supported and tested activation functions; details for each function can be found in the PyTorch documentation. The table also indicates which tools support each function.

| Function | PEtab.jl | AMICI |
| --- | --- | --- |
| relu | ✔️ | |
| relu6 | ✔️ | |
| hardtanh | ✔️ | |
| hardswish | ✔️ | |
| selu | ✔️ | |
| leaky_relu | ✔️ | |
| gelu | ✔️ | |
| tanhshrink | ✔️ | |
| softsign | ✔️ | |
| softplus | ✔️ | |
| tanh | ✔️ | |
| sigmoid | ✔️ | |
| hardsigmoid | ✔️ | |
| mish | ✔️ | |
| elu | ✔️ | |
| celu | ✔️ | |
| softmax | ✔️ | |
| log_softmax | ✔️ | |
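As a sketch, a small feed-forward network built only from layers and activation functions in the tables above might be declared as follows. The field names (`layers`, `forward`, `id`, `type`, `args`) are illustrative assumptions for this example, not the normative schema; consult the PEtab SciML specification for the exact YAML format. Layer types and keyword arguments follow the PyTorch naming scheme, as described above.

```yaml
# Hypothetical sketch of a PEtab SciML NN model YAML file
# (field names are illustrative assumptions, not the normative schema).
layers:
  - id: layer1
    type: Linear          # torch.nn.Linear
    args:
      in_features: 4
      out_features: 16
  - id: layer2
    type: Dropout         # torch.nn.Dropout
    args:
      p: 0.2
  - id: layer3
    type: Linear
    args:
      in_features: 16
      out_features: 1
forward:
  - layer1
  - tanh                  # activation function from the table above
  - layer2
  - layer3
```

Because layer names and keyword arguments mirror PyTorch, the PyTorch documentation for each layer (e.g. `torch.nn.Linear`) defines the meaning and defaults of every argument used in such a file.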