Revision history of "ReLU"


Legend: (cur) = difference with latest revision, (prev) = difference with preceding revision, m = minor edit.

  • (cur | prev) 15:52, 17 February 2023 Xinreality (talk | contribs) . . (50 bytes) (-1,768) . . (Replaced content with "{{#externalredirect: https://aiwiki.ai/wiki/ReLU}}")
  • (cur | prev) 09:54, 24 January 2023 Xinreality (talk | contribs) . . (1,818 bytes) (+1,818) . . (Created page with "==Introduction== ReLU, or rectified line unit, is a type activation function that is used in artificial neural network. It can be described mathematically as f(x) = max(0,x)...")