Mirror of https://github.com/TheAlgorithms/Python.git (synced 2024-12-18 01:00:15 +00:00).
Latest commit 51c5c87b9a: added GELU activation function file; applied pre-commit.ci auto-fixes (for more information, see https://pre-commit.ci); updated gaussian_error_linear_unit.py; renamed maths/gaussian_error_linear_unit.py to neural_network/activation_functions/gaussian_error_linear_unit.py. Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
- binary_step.py
- exponential_linear_unit.py
- gaussian_error_linear_unit.py
- leaky_rectified_linear_unit.py
- mish.py
- rectified_linear_unit.py
- scaled_exponential_linear_unit.py
- soboleva_modified_hyperbolic_tangent.py
- softplus.py
- squareplus.py
- swish.py
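The commit above adds a GELU (Gaussian Error Linear Unit) file to this directory. As an illustration of what such a module contains, here is a minimal sketch of GELU in both its exact form and the common tanh approximation; the function names and structure here are assumptions for this example, not necessarily what the repository's gaussian_error_linear_unit.py actually defines.

```python
import math


def gaussian_error_linear_unit(x: float) -> float:
    """Exact GELU: x * Phi(x), where Phi is the standard normal CDF.

    Phi(x) = 0.5 * (1 + erf(x / sqrt(2))).
    """
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))


def gelu_tanh_approximation(x: float) -> float:
    """Widely used tanh-based approximation of GELU.

    0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    """
    return 0.5 * x * (
        1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3))
    )
```

The exact form uses the error function from the standard library; the tanh approximation avoids `erf` and is the variant many deep-learning frameworks default to for speed.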