
ReLU op #401

Answered by ricardoV94
finncatling asked this question in Q&A
Jul 27, 2023 · 1 comment · 1 reply

You can create a ReLU with pytensor.tensor.switch:

import pytensor.tensor as pt

def relu(x):
    # Element-wise: select 0 where x < 0, otherwise keep x
    return pt.switch(pt.lt(x, 0), 0, x)

No need for a custom Op.
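
For completeness, a minimal usage sketch showing the graph compiled and evaluated; the symbolic vector name and the sample inputs are illustrative, not from the thread:

import numpy as np
import pytensor
import pytensor.tensor as pt

def relu(x):
    # Element-wise: select 0 where x < 0, otherwise keep x
    return pt.switch(pt.lt(x, 0), 0, x)

x = pt.vector("x")                        # symbolic input vector
f = pytensor.function([x], relu(x))       # compile the graph once
print(f(np.array([-2.0, 0.0, 3.0])))      # -> [0. 0. 3.]

Once compiled, calling f applies the ReLU element-wise to concrete NumPy arrays.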

Replies: 1 comment · 1 reply

@finncatling

Answer selected by finncatling