Python/neural_network/activation_functions
Latest commit: Enable ruff INP001 rule () by Maxim Smolskiy (f437f92279), 2024-04-02 21:13:56 +02:00
File                                     Last commit                                                      Date
__init__.py                              Enable ruff INP001 rule ()                                       2024-04-02 21:13:56 +02:00
binary_step.py                           [pre-commit.ci] pre-commit autoupdate ()                         2024-03-13 07:52:41 +01:00
exponential_linear_unit.py               The ELU activation is added ()                                   2023-05-02 16:36:28 +02:00
gaussian_error_linear_unit.py            File moved to neural_network/activation_functions ()            2023-12-27 03:35:29 -05:00
leaky_rectified_linear_unit.py           Added Leaky ReLU Activation Function ()                          2023-08-16 18:22:15 -07:00
mish.py                                  Enable ruff INP001 rule ()                                       2024-04-02 21:13:56 +02:00
rectified_linear_unit.py                 [pre-commit.ci] pre-commit autoupdate ()                         2024-03-13 07:52:41 +01:00
scaled_exponential_linear_unit.py        Added Scaled Exponential Linear Unit Activation Function ()     2023-09-06 15:16:51 -04:00
soboleva_modified_hyperbolic_tangent.py  [pre-commit.ci] pre-commit autoupdate ()                         2024-03-13 07:52:41 +01:00
softplus.py                              Added Softplus activation function ()                            2023-10-06 16:26:09 -04:00
squareplus.py                            Added Squareplus Activation Function ()                          2023-10-08 12:08:37 -04:00
swish.py                                 Added A General Swish Activation Function in Neural Networks () 2023-10-18 10:50:18 -04:00
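
Each file above implements a single activation function. For orientation, the piecewise family (binary step, ReLU, leaky ReLU, ELU, SELU) follows one pattern: pass positive inputs through (or threshold them) and treat negative inputs specially. Below is a minimal vectorized sketch assuming NumPy arrays; the formulas are the standard ones, but the function names and the alpha defaults here are illustrative and may differ from the exact signatures in the files above.

```python
import numpy as np


def binary_step(x: np.ndarray) -> np.ndarray:
    # 1 for x >= 0, otherwise 0: a hard threshold.
    return np.where(x >= 0, 1, 0)


def relu(x: np.ndarray) -> np.ndarray:
    # max(0, x) elementwise: negatives are zeroed out.
    return np.maximum(0, x)


def leaky_relu(x: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    # Small slope alpha on the negative side instead of a hard zero.
    return np.where(x >= 0, x, alpha * x)


def elu(x: np.ndarray, alpha: float = 1.0) -> np.ndarray:
    # Smooth exponential saturation alpha * (e^x - 1) for negatives.
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1))


def selu(x: np.ndarray) -> np.ndarray:
    # ELU scaled by the fixed constants from the SELU paper, chosen so
    # activations self-normalize toward zero mean and unit variance.
    alpha, scale = 1.6732632423543772, 1.0507009873554805
    return scale * np.where(x >= 0, x, alpha * (np.exp(x) - 1))
```

For example, `leaky_relu(np.array([-2.0, 3.0]))` gives `array([-0.02, 3.])`: the positive entry passes through while the negative one is scaled by alpha rather than clipped to zero.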
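The smooth variants (softplus, squareplus, swish, mish) replace the ReLU kink with a differentiable curve. Another hedged sketch under the same NumPy assumption; softplus is written with `np.logaddexp` for numerical stability, and the `beta` and `b` defaults are conventional choices, not necessarily the ones used in the files above.

```python
import numpy as np


def softplus(x: np.ndarray) -> np.ndarray:
    # log(1 + e^x), computed as logaddexp(0, x) to avoid overflow.
    return np.logaddexp(0.0, x)


def squareplus(x: np.ndarray, b: float = 4.0) -> np.ndarray:
    # (x + sqrt(x^2 + b)) / 2: an algebraic, softplus-like curve.
    return (x + np.sqrt(x * x + b)) / 2.0


def swish(x: np.ndarray, beta: float = 1.0) -> np.ndarray:
    # x * sigmoid(beta * x); beta = 1 is the SiLU special case.
    return x / (1.0 + np.exp(-beta * x))


def mish(x: np.ndarray) -> np.ndarray:
    # x * tanh(softplus(x)), using the stable softplus form.
    return x * np.tanh(np.logaddexp(0.0, x))
```

All four behave like the identity for large positive inputs and decay toward zero (or a bounded value) for negative ones, which is what makes them drop-in replacements for ReLU.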
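GELU and the Soboleva modified hyperbolic tangent round out the list. The GELU below uses the widely cited tanh approximation rather than the exact erf form, and the Soboleva parameters `a`, `b`, `c`, `d` default to 1 (which reduces it to plain tanh); both choices are for illustration and need not match the files above.

```python
import numpy as np


def gelu(x: np.ndarray) -> np.ndarray:
    # Tanh approximation of x * Phi(x); the exact definition uses erf.
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))


def soboleva_modified_tanh(
    x: np.ndarray, a: float = 1.0, b: float = 1.0, c: float = 1.0, d: float = 1.0
) -> np.ndarray:
    # (e^(ax) - e^(-bx)) / (e^(cx) + e^(-dx)); a = b = c = d = 1 is tanh.
    return (np.exp(a * x) - np.exp(-b * x)) / (np.exp(c * x) + np.exp(-d * x))
```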