Python/neural_network/activation_functions
Adarsh Acharya 153c35eac0
Added Scaled Exponential Linear Unit Activation Function (#9027)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2023-09-06 15:16:51 -04:00
exponential_linear_unit.py The ELU activation is added (#8699) 2023-05-02 16:36:28 +02:00
leaky_rectified_linear_unit.py Added Leaky ReLU Activation Function (#8962) 2023-08-16 18:22:15 -07:00
scaled_exponential_linear_unit.py Added Scaled Exponential Linear Unit Activation Function (#9027) 2023-09-06 15:16:51 -04:00
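
For orientation, a minimal sketch of the SELU activation that the commit above adds (constants λ ≈ 1.0507 and α ≈ 1.6732 come from the self-normalizing-networks formulation; the function name and NumPy-based signature here are assumptions, and the actual `scaled_exponential_linear_unit.py` in the repository may differ):

```python
import numpy as np


def scaled_exponential_linear_unit(
    vector: np.ndarray, alpha: float = 1.6732, lambda_: float = 1.0507
) -> np.ndarray:
    """
    SELU(x) = lambda_ * x                      for x > 0
              lambda_ * alpha * (exp(x) - 1)   for x <= 0

    Applied element-wise to the input array.
    """
    return lambda_ * np.where(vector > 0, vector, alpha * (np.exp(vector) - 1))
```

Like the sibling `exponential_linear_unit.py` and `leaky_rectified_linear_unit.py`, it keeps a non-zero gradient for negative inputs; the extra λ scaling is what distinguishes SELU from plain ELU.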