* Moved relu.py from maths/ to neural_network/activation_functions/ (see the ReLU sketch below)
* Renamed relu.py to rectified_linear_unit.py
* Renamed relu.py to rectified_linear_unit.py in DIRECTORY.md
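For reference, here is a minimal sketch of what the relocated rectified_linear_unit.py computes. The numpy-based signature and doctest below are illustrative assumptions, not the exact contents of the file:

```python
import numpy as np


def rectified_linear_unit(vector: np.ndarray) -> np.ndarray:
    """
    ReLU maps every input to max(0, x): negatives become 0,
    non-negative values pass through unchanged.

    >>> rectified_linear_unit(np.array([-2.0, 0.0, 3.0]))
    array([0., 0., 3.])
    """
    return np.maximum(0, vector)
```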
* Added scaled exponential linear unit (SELU) activation function (see the sketch after this section)
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Update scaled_exponential_linear_unit.py
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
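A minimal sketch of the SELU activation added in the commits above, assuming the same numpy vector interface as the other activation functions in this directory; the constants are truncated here, and scaled_exponential_linear_unit.py may use fuller precision:

```python
import numpy as np


def scaled_exponential_linear_unit(
    vector: np.ndarray, alpha: float = 1.6732, lambda_: float = 1.0507
) -> np.ndarray:
    """
    SELU(x) = lambda_ * x                     if x > 0
    SELU(x) = lambda_ * alpha * (exp(x) - 1)  if x <= 0

    >>> scaled_exponential_linear_unit(np.array([1.0, 2.0]))
    array([1.0507, 2.1014])
    """
    return lambda_ * np.where(vector > 0, vector, alpha * np.expm1(vector))
```

SELU's self-normalizing property depends on this specific pairing of alpha and lambda_, which is why both appear as fixed defaults rather than free hyperparameters.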
* Added tanh activation function (see the sketch after this section)
* Apply suggestions from code review
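A minimal sketch of the tanh activation these commits iterate on, again assuming a numpy vector interface; the function name tangent_hyperbolic is an assumption, and the rewritten form 2 / (1 + e^(-2x)) - 1 is mathematically equivalent to (e^x - e^-x) / (e^x + e^-x):

```python
import numpy as np


def tangent_hyperbolic(vector: np.ndarray) -> np.ndarray:
    """
    tanh squashes each input into the open interval (-1, 1).

    >>> tangent_hyperbolic(np.array([0.0]))
    array([0.])
    """
    return (2 / (1 + np.exp(-2 * vector))) - 1
```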
* Added ELU activation function (see the sketch after this section)
* Update maths/elu_activation.py
Co-authored-by: Christian Clauss <cclauss@me.com>
* Added exponential_linear_unit activation
---------
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Christian Clauss <cclauss@me.com>
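Finally, a minimal sketch of the ELU activation from the commits above: it behaves like the identity for positive inputs and saturates smoothly toward -alpha for negative ones. The signature is an illustrative assumption and may differ from the repository file:

```python
import numpy as np


def exponential_linear_unit(vector: np.ndarray, alpha: float = 1.0) -> np.ndarray:
    """
    ELU(x) = x                     if x > 0
    ELU(x) = alpha * (exp(x) - 1)  if x <= 0

    >>> exponential_linear_unit(np.array([2.0, -1.0]), alpha=1.0)
    array([ 2.        , -0.63212056])
    """
    return np.where(vector > 0, vector, alpha * np.expm1(vector))
```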