Commit Graph

15 Commits

Author · SHA1 · Message · Date
Maxim Smolskiy
f437f92279
Enable ruff INP001 rule (#11346)
* Enable ruff INP001 rule

* Fix

* Fix

* Fix

* Fix

* Fix
2024-04-02 21:13:56 +02:00
pre-commit-ci[bot]
bc8df6de31
[pre-commit.ci] pre-commit autoupdate (#11322)
* [pre-commit.ci] pre-commit autoupdate

updates:
- [github.com/astral-sh/ruff-pre-commit: v0.2.2 → v0.3.2](https://github.com/astral-sh/ruff-pre-commit/compare/v0.2.2...v0.3.2)
- [github.com/pre-commit/mirrors-mypy: v1.8.0 → v1.9.0](https://github.com/pre-commit/mirrors-mypy/compare/v1.8.0...v1.9.0)

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-03-13 07:52:41 +01:00
Param Thakkar
51c5c87b9a
File moved to neural_network/activation_functions (#11216)
* added GELU activation functions file

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Update gaussian_error_linear_unit.py

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Update gaussian_error_linear_unit.py

* Delete neural_network/activation_functions/gaussian_error_linear_unit.py

* Rename maths/gaussian_error_linear_unit.py to neural_network/activation_functions/gaussian_error_linear_unit.py

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2023-12-27 03:35:29 -05:00
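
A minimal sketch of the GELU that gaussian_error_linear_unit.py provides, using the common tanh approximation (illustrative; the file's exact code may differ):

```python
import math

def gelu(x: float) -> float:
    """Gaussian Error Linear Unit, tanh approximation:
    GELU(x) ~ 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x**3)))
    """
    inner = math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)
    return 0.5 * x * (1.0 + math.tanh(inner))

print(gelu(1.0))  # ~0.841
```
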
Shivansh Bhatnagar
572de4f15e
Added A General Swish Activation Function in Neural Networks (#10415)
* Added A General Swish Activation Function in Neural Networks

* Added the general swish function in the SiLU function and renamed it as swish.py

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: Shivansh Bhatnagar <shivansh.bhatnagar.mat22@iitbhu.ac.in>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2023-10-18 10:50:18 -04:00
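
A minimal sketch of the general swish that this PR folds into swish.py, where beta = 1 recovers the sigmoid linear unit (SiLU); the parameter name and default here are illustrative:

```python
import math

def swish(x: float, beta: float = 1.0) -> float:
    """General swish: x * sigmoid(beta * x).
    beta = 1 gives SiLU; large beta approaches ReLU."""
    return x / (1.0 + math.exp(-beta * x))

print(swish(1.0))       # SiLU(1) ~0.731
print(swish(1.0, 2.0))  # ~0.881
```
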
Saahil Mahato
ed19b1cf0c
Add binary step activation function (#10030)
* Add binary step activation function

* fix: ruff line too long error

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* refactor: add link to directory

* revert: add link to directory

* fix: algorithm bug and docs

* Update neural_network/activation_functions/binary_step.py

* fix: ruff line too long error

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Tianyi Zheng <tianyizheng02@gmail.com>
2023-10-08 19:34:50 -04:00
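
A sketch of the binary step added in #10030; the threshold convention (1 for x >= 0) is an assumption and may differ from binary_step.py:

```python
def binary_step(x: float) -> int:
    """Binary step: fires (1) for non-negative inputs, else 0."""
    return 1 if x >= 0 else 0

print(binary_step(-1.2), binary_step(0.0), binary_step(3.0))  # 0 1 1
```
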
Saahil Mahato
2260961a80
Add Soboleva Modified Hyperbolic Tangent function (#10043)
* Add Sobovela Modified Hyperbolic Tangent function

* fix: typo

* Update and rename sobovela_modified_hyperbolic_tangent.py to soboleva_modified_hyperbolic_tangent.py

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix: typo

* Apply suggestions from code review

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Tianyi Zheng <tianyizheng02@gmail.com>
2023-10-08 19:19:28 -04:00
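
The Soboleva modified hyperbolic tangent generalizes tanh with four exponent parameters; a sketch using the standard a, b, c, d naming (the repository file may differ):

```python
import math

def soboleva_modified_tanh(x: float, a: float, b: float, c: float, d: float) -> float:
    """(e**(a*x) - e**(-b*x)) / (e**(c*x) + e**(-d*x));
    reduces to tanh(x) when a = b = c = d = 1."""
    return (math.exp(a * x) - math.exp(-b * x)) / (math.exp(c * x) + math.exp(-d * x))

print(soboleva_modified_tanh(0.5, 1, 1, 1, 1))  # ~0.462, i.e. tanh(0.5)
```
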
Kausthub Kannan
a12b07f352
Added Squareplus Activation Function (#9977)
* Added Squareplus Activation Function

* Added parameter beta to the function

* Fixed Squareplus Function

* Update neural_network/activation_functions/squareplus.py

---------

Co-authored-by: Tianyi Zheng <tianyizheng02@gmail.com>
2023-10-08 12:08:37 -04:00
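
Squareplus is a smooth algebraic relative of softplus, and per the second bullet above the PR added a beta parameter; a sketch (the beta value below is only for illustration):

```python
import math

def squareplus(x: float, beta: float) -> float:
    """squareplus(x) = (x + sqrt(x**2 + beta)) / 2; tends to ReLU as beta -> 0."""
    return (x + math.sqrt(x * x + beta)) / 2.0

print(squareplus(-3.0, 4.0))  # ~0.303
```
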
Kausthub Kannan
08d394126c
Changed Mish Activation Function to use Softplus (#10111)
2023-10-08 11:48:22 -04:00
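
A sketch of the relationship this refactor exploits, assuming the standard definitions of both functions:

```python
import math

def softplus(x: float) -> float:
    """softplus(x) = ln(1 + e**x), a smooth approximation of ReLU."""
    return math.log1p(math.exp(x))

def mish(x: float) -> float:
    """mish(x) = x * tanh(softplus(x)), as restated in #10111."""
    return x * math.tanh(softplus(x))

print(softplus(0.0))  # ln 2 ~0.693
print(mish(1.0))      # ~0.865
```
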
Kausthub Kannan
80a2087e0a
Added Softplus activation function (#9944)
2023-10-06 16:26:09 -04:00
Kausthub Kannan
c6ec99d571
Added Mish Activation Function (#9942)
* Added Mish Activation Function

* Apply suggestions from code review

---------

Co-authored-by: Tianyi Zheng <tianyizheng02@gmail.com>
2023-10-06 15:23:05 -04:00
Aasheesh
deb0480b3a
Changing the directory of sigmoid_linear_unit.py (#9824)
* Changing the directory of sigmoid_linear_unit.py

* Delete neural_network/activation_functions/__init__.py

---------

Co-authored-by: Tianyi Zheng <tianyizheng02@gmail.com>
2023-10-05 10:07:44 -04:00
piyush-poddar
26d650ec28
Moved relu.py from maths/ to neural_network/activation_functions (#9753)
* Moved file relu.py from maths/ to neural_network/activation_functions

* Renamed relu.py to rectified_linear_unit.py

* Renamed relu.py to rectified_linear_unit.py in DIRECTORY.md
2023-10-04 16:28:19 -04:00
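
The relocated rectified_linear_unit.py computes the standard ReLU; a one-line sketch for comparison with the leaky and exponential variants further down:

```python
def rectified_linear_unit(x: float) -> float:
    """ReLU(x) = max(0, x): identity for positive inputs, zero elsewhere."""
    return max(0.0, x)

print(rectified_linear_unit(-2.0), rectified_linear_unit(2.0))  # 0.0 2.0
```
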
Adarsh Acharya
153c35eac0
Added Scaled Exponential Linear Unit Activation Function (#9027)
* Added Scaled Exponential Linear Unit Activation Function

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Update scaled_exponential_linear_unit.py

* Update scaled_exponential_linear_unit.py

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Update scaled_exponential_linear_unit.py

* Update scaled_exponential_linear_unit.py

* Update scaled_exponential_linear_unit.py

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Update scaled_exponential_linear_unit.py

* Update scaled_exponential_linear_unit.py

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2023-09-06 15:16:51 -04:00
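
A sketch of the scaled exponential linear unit from #9027, using the constants published by Klambauer et al. (2017); the file's exact values may differ in precision:

```python
import math

ALPHA = 1.6732632423543772   # alpha from "Self-Normalizing Neural Networks"
SCALE = 1.0507009873554805   # lambda (scale) from the same paper

def scaled_exponential_linear_unit(x: float) -> float:
    """SELU(x) = scale * x if x > 0, else scale * alpha * (e**x - 1)."""
    return SCALE * x if x > 0 else SCALE * ALPHA * (math.exp(x) - 1.0)

print(scaled_exponential_linear_unit(-1.0))  # ~-1.111
```
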
Kausthub Kannan
f6b12420ce
Added Leaky ReLU Activation Function (#8962)
* Added Leaky ReLU activation function

* Added Leaky ReLU activation function

* Added Leaky ReLU activation function

* Formatting and spelling fixes done
2023-08-16 18:22:15 -07:00
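
A sketch of the leaky ReLU added in #8962; the negative-side slope alpha = 0.01 is the common default and is assumed here:

```python
def leaky_rectified_linear_unit(x: float, alpha: float = 0.01) -> float:
    """Leaky ReLU: x for positive inputs, alpha * x otherwise,
    so negative inputs keep a small, nonzero gradient."""
    return x if x > 0 else alpha * x

print(leaky_rectified_linear_unit(-5.0), leaky_rectified_linear_unit(5.0))  # -0.05 5.0
```
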
Dipankar Mitra
7310514509
The ELU activation is added (#8699)
* tanh function has been added

* tanh function has been added

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* tanh function is added

* tanh function is added

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* tanh function added

* tanh function added

* tanh function is added

* Apply suggestions from code review

* ELU activation function is added

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* elu activation is added

* ELU activation is added

* Update maths/elu_activation.py

Co-authored-by: Christian Clauss <cclauss@me.com>

* Exponential_linear_unit activation is added

* Exponential_linear_unit activation is added

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Christian Clauss <cclauss@me.com>
2023-05-02 16:36:28 +02:00
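
The PR's history shows tanh drafts before settling on ELU; a sketch of the exponential linear unit, assuming the usual alpha parameter:

```python
import math

def exponential_linear_unit(x: float, alpha: float = 1.0) -> float:
    """ELU(x) = x for x > 0, else alpha * (e**x - 1);
    continuous at the origin (smooth for alpha = 1) and
    saturating to -alpha for very negative x."""
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

print(exponential_linear_unit(-1.0))  # ~-0.632
```
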