Python/neural_network/activation_functions
Saahil Mahato ed19b1cf0c
Add binary step activation function (#10030)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Tianyi Zheng <tianyizheng02@gmail.com>
2023-10-08 19:34:50 -04:00
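The binary step function added by this commit maps every non-negative input to 1 and every negative input to 0. A minimal NumPy sketch of that behavior (illustrative only; the file's actual signature and docstring may differ):

```python
import numpy as np


def binary_step(vector: np.ndarray) -> np.ndarray:
    """Binary step activation: 1 where the input is >= 0, else 0.

    The function is non-differentiable at 0 and has zero gradient
    elsewhere, so it is mainly of pedagogical interest rather than
    something used in gradient-based training.
    """
    return np.where(vector >= 0, 1, 0)
```

For example, `binary_step(np.array([-1.2, 0.0, 2.0]))` yields `[1, 1]` for the non-negative entries and `0` for the negative one, i.e. `[0, 1, 1]`.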
binary_step.py Add binary step activation function (#10030) 2023-10-08 19:34:50 -04:00
exponential_linear_unit.py The ELU activation is added (#8699) 2023-05-02 16:36:28 +02:00
leaky_rectified_linear_unit.py Added Leaky ReLU Activation Function (#8962) 2023-08-16 18:22:15 -07:00
mish.py Changed Mish Activation Function to use Softplus (#10111) 2023-10-08 11:48:22 -04:00
rectified_linear_unit.py Moved relu.py from maths/ to neural_network/activation_functions (#9753) 2023-10-04 16:28:19 -04:00
scaled_exponential_linear_unit.py Added Scaled Exponential Linear Unit Activation Function (#9027) 2023-09-06 15:16:51 -04:00
sigmoid_linear_unit.py Changing the directory of sigmoid_linear_unit.py (#9824) 2023-10-05 10:07:44 -04:00
soboleva_modified_hyperbolic_tangent.py Add Soboleva Modified Hyperbolic Tangent function (#10043) 2023-10-08 19:19:28 -04:00
softplus.py Added Softplus activation function (#9944) 2023-10-06 16:26:09 -04:00
squareplus.py Added Squareplus Activation Function (#9977) 2023-10-08 12:08:37 -04:00
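Most of the activation functions listed above can be sketched in a few lines of NumPy using their conventional textbook formulas. The definitions below are illustrative, not the repository's implementations: parameter names and defaults (`alpha`, the SELU constants, the squareplus `b`) are assumptions, and the expressions are not numerically stabilized for large inputs.

```python
import numpy as np


def relu(x: np.ndarray) -> np.ndarray:
    # Rectified linear unit: max(0, x)
    return np.maximum(0, x)


def leaky_relu(x: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    # Leaky ReLU: small slope alpha for negative inputs
    return np.where(x > 0, x, alpha * x)


def elu(x: np.ndarray, alpha: float = 1.0) -> np.ndarray:
    # Exponential linear unit: smooth negative tail alpha * (e^x - 1)
    return np.where(x > 0, x, alpha * (np.exp(x) - 1))


def selu(
    x: np.ndarray,
    alpha: float = 1.6732632423543772,
    scale: float = 1.0507009873554805,
) -> np.ndarray:
    # Scaled ELU with the standard self-normalizing constants
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1))


def softplus(x: np.ndarray) -> np.ndarray:
    # Smooth approximation of ReLU: ln(1 + e^x)
    return np.log(1 + np.exp(x))


def sigmoid(x: np.ndarray) -> np.ndarray:
    return 1 / (1 + np.exp(-x))


def silu(x: np.ndarray) -> np.ndarray:
    # Sigmoid linear unit (SiLU / swish): x * sigmoid(x)
    return x * sigmoid(x)


def mish(x: np.ndarray) -> np.ndarray:
    # Mish expressed via softplus, matching the refactor in #10111
    return x * np.tanh(softplus(x))


def squareplus(x: np.ndarray, b: float = 4.0) -> np.ndarray:
    # Squareplus: (x + sqrt(x^2 + b)) / 2; reduces to ReLU when b = 0
    return (x + np.sqrt(x**2 + b)) / 2
```

With `b = 0`, `squareplus` coincides with `relu`, and `mish(0) == silu(0) == 0`, which is a quick sanity check for any of the smooth variants. The binary step and Soboleva modified hyperbolic tangent files are omitted here for brevity.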