* Add binary step activation function (see the sketch after this PR's trailer)
* fix: ruff line too long error
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* refactor: add link to directory
* revert: add link to directory
* fix: algorithm bug and docs
* Update neural_network/activation_functions/binary_step.py
* fix: ruff line too long error
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Tianyi Zheng <tianyizheng02@gmail.com>
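For context, a minimal sketch of the binary step activation this PR introduces, in the NumPy-vectorized style used across neural_network/activation_functions; the function name and signature are illustrative assumptions, not necessarily the merged code:

```python
import numpy as np


def binary_step(vector: np.ndarray) -> np.ndarray:
    # Threshold activation: 1 where the input is >= 0, else 0.
    return np.where(vector >= 0, 1, 0)


print(binary_step(np.array([-1.2, 0.0, 2.0])))  # [0 1 1]
```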
* Added Squareplus Activation Function (see the sketch after this PR's trailer)
* Added parameter beta to the function
* Fixed Squareplus Function
* Update neural_network/activation_functions/squareplus.py
---------
Co-authored-by: Tianyi Zheng <tianyizheng02@gmail.com>
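The squareplus PR adds a beta parameter, per the commit above. A hedged sketch of the standard formulation squareplus(x, beta) = (x + sqrt(x^2 + beta)) / 2, with the signature assumed rather than taken from the merged file:

```python
import numpy as np


def squareplus(vector: np.ndarray, beta: float) -> np.ndarray:
    # Smooth, ReLU-like curve: (x + sqrt(x**2 + beta)) / 2.
    # A larger beta gives a softer knee around x = 0.
    return (vector + np.sqrt(vector**2 + beta)) / 2


print(squareplus(np.array([-2.0, 0.0, 2.0]), beta=2.0))
```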
* Moved file relu.py from maths/ to neural_network/activation_functions (see the sketch below)
* Renamed relu.py to rectified_linear_unit.py
* Renamed relu.py to rectified_linear_unit.py in DIRECTORY.md
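The move implies no behavior change; for reference, ReLU in the same vectorized style (the exact contents of rectified_linear_unit.py are assumed here):

```python
import numpy as np


def relu(vector: np.ndarray) -> np.ndarray:
    # Clamp negative values to zero, pass positives through unchanged.
    return np.maximum(0, vector)


print(relu(np.array([-5.0, 0.0, 3.0])))  # [0. 0. 3.]
```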
* Added Scaled Exponential Linear Unit Activation Function (see the sketch after this PR's trailer)
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Update scaled_exponential_linear_unit.py
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Update scaled_exponential_linear_unit.py
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Update scaled_exponential_linear_unit.py
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
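A sketch of SELU as this PR describes it, using the standard self-normalizing constants; the parameter names, defaults, and signature are assumptions for illustration:

```python
import numpy as np


def scaled_exponential_linear_unit(
    vector: np.ndarray, alpha: float = 1.6732, lambda_: float = 1.0507
) -> np.ndarray:
    # SELU: lambda * x for x > 0, lambda * alpha * (exp(x) - 1) otherwise.
    # The default alpha/lambda are the self-normalizing constants from
    # Klambauer et al. (2017), truncated to four decimals.
    return lambda_ * np.where(vector > 0, vector, alpha * (np.exp(vector) - 1.0))


print(scaled_exponential_linear_unit(np.array([-1.0, 0.0, 1.0])))
```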
* tanh function has been added (see the sketch below)
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* tanh function is added
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* tanh function added
* Apply suggestions from code review
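A sketch of the tanh activation these commits iterate on. The hand-rolled sigmoid form below is only illustrative; np.tanh is the numerically safer built-in for large-magnitude inputs:

```python
import numpy as np


def tangent_hyperbolic(vector: np.ndarray) -> np.ndarray:
    # tanh(x) written as 2 / (1 + exp(-2x)) - 1; equivalent to np.tanh(vector).
    return (2 / (1 + np.exp(-2 * vector))) - 1


print(tangent_hyperbolic(np.array([-1.0, 0.0, 1.0])))  # approx [-0.7616 0. 0.7616]
```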
* ELU activation function is added (see the sketch after this PR's trailer)
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* ELU activation is added
* Update maths/elu_activation.py
Co-authored-by: Christian Clauss <cclauss@me.com>
* Exponential_linear_unit activation is added
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Christian Clauss <cclauss@me.com>
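A sketch of the ELU added here, assuming an explicit alpha parameter as in the usual formulation; the signature is an assumption, not the merged file's exact code:

```python
import numpy as np


def exponential_linear_unit(vector: np.ndarray, alpha: float) -> np.ndarray:
    # ELU: identity for positive inputs, alpha * (exp(x) - 1) for the rest,
    # giving a smooth curve that saturates at -alpha for large negative x.
    return np.where(vector > 0, vector, alpha * (np.exp(vector) - 1.0))


print(exponential_linear_unit(np.array([-2.0, 0.0, 3.0]), alpha=1.0))
```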