leakyrelu to leaky_relu (#9270)

Author: Francis Lata
Date: 2025-02-26 13:22:08 -05:00
Committed by: GitHub
Parent: cd822bbe11
Commit: 86b737a120

11 changed files with 30 additions and 30 deletions
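For context, a minimal sketch of what this rename means for user code, assuming the rest of the `Tensor` API is unchanged (the negative-slope argument is assumed from the pre-rename signature):

```python
from tinygrad import Tensor

x = Tensor([[-1.0, 0.5], [2.0, -0.25]])

# Before this commit:
#   y = x.leakyrelu()
# After the rename, the snake_case name applies:
y = x.leaky_relu()       # default negative slope
z = x.leaky_relu(0.2)    # explicit slope (argument assumed positional)
print(y.numpy(), z.numpy())
```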


@@ -110,7 +110,7 @@ class TinyNet:
  def __call__(self, x):
    x = self.l1(x)
-   x = x.leakyrelu()
+   x = x.leaky_relu()
    x = self.l2(x)
    return x
@@ -118,7 +118,7 @@ net = TinyNet()
```
We can see that the forward pass of our neural network is just the sequence of operations performed on the input tensor `x`.
-We can also see that functional operations like `leakyrelu` are not defined as classes and instead are just methods we can just call.
+We can also see that functional operations like `leaky_relu` are not defined as classes and instead are just methods we can just call.
Finally, we just initialize an instance of our neural network, and we are ready to start training it.
## Training

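The quickstart excerpt above only shows the forward pass. A fuller sketch of the surrounding network, assuming tinygrad's `nn.Linear` layer and illustrative layer sizes (not copied from the repo), looks roughly like this:

```python
from tinygrad import Tensor
from tinygrad.nn import Linear

class TinyNet:
  def __init__(self):
    # layer sizes are assumed for illustration (MNIST-shaped input)
    self.l1 = Linear(784, 128, bias=False)
    self.l2 = Linear(128, 10, bias=False)

  def __call__(self, x: Tensor) -> Tensor:
    x = self.l1(x)
    x = x.leaky_relu()   # functional op: a Tensor method, not a layer class
    x = self.l2(x)
    return x

net = TinyNet()
out = net(Tensor.randn(1, 784))  # forward pass on a dummy batch
```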

@@ -52,7 +52,7 @@ Elementwise ops operate on a per element basis. They don't change the shape of t
::: tinygrad.Tensor.erf
::: tinygrad.Tensor.gelu
::: tinygrad.Tensor.quick_gelu
-::: tinygrad.Tensor.leakyrelu
+::: tinygrad.Tensor.leaky_relu
::: tinygrad.Tensor.mish
::: tinygrad.Tensor.softplus
::: tinygrad.Tensor.softsign
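Since the hunk header notes that elementwise ops don't change the shape of the tensor, a small sanity check of that property with the renamed op (a sketch, assuming the `Tensor.randn` and `shape` APIs behave as usual):

```python
from tinygrad import Tensor

t = Tensor.randn(4, 8)
out = t.leaky_relu()
# elementwise ops preserve shape: each output element depends only on the
# corresponding input element
assert out.shape == t.shape == (4, 8)
```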