Python/neural_network/activation_functions

Latest commit: Added Mish Activation Function (#9942) by Kausthub Kannan, co-authored by Tianyi Zheng <tianyizheng02@gmail.com>, 2023-10-06 15:23:05 -04:00
exponential_linear_unit.py The ELU activation is added (#8699) 2023-05-02 16:36:28 +02:00
leaky_rectified_linear_unit.py Added Leaky ReLU Activation Function (#8962) 2023-08-16 18:22:15 -07:00
mish.py Added Mish Activation Function (#9942) 2023-10-06 15:23:05 -04:00
rectified_linear_unit.py Moved relu.py from maths/ to neural_network/activation_functions (#9753) 2023-10-04 16:28:19 -04:00
scaled_exponential_linear_unit.py Added Scaled Exponential Linear Unit Activation Function (#9027) 2023-09-06 15:16:51 -04:00
sigmoid_linear_unit.py Changing the directory of sigmoid_linear_unit.py (#9824) 2023-10-05 10:07:44 -04:00
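The files above implement standard activation functions. As a quick reference, here are minimal NumPy sketches of the usual textbook formulas for each one; the repository's own implementations may differ in signatures, docstrings, and style, so treat this only as a summary of the math:

```python
import numpy as np

def relu(x):
    # Rectified Linear Unit: max(0, x)
    return np.maximum(0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: small slope alpha for negative inputs
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Exponential Linear Unit: alpha * (e^x - 1) for negative inputs
    return np.where(x > 0, x, alpha * (np.exp(x) - 1))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # Scaled ELU with the standard self-normalizing constants
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1))

def silu(x):
    # Sigmoid Linear Unit (a.k.a. swish): x * sigmoid(x)
    return x / (1 + np.exp(-x))

def mish(x):
    # Mish: x * tanh(softplus(x)), using log1p(exp(x)) as softplus
    return x * np.tanh(np.log1p(np.exp(x)))
```

All six are element-wise, so they accept scalars or NumPy arrays alike, e.g. `relu(np.array([-1.0, 2.0]))`.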