ReLU as a switch

Discussion in 'ML-Agents' started by S6Regen, Apr 24, 2020.

S6Regen
Joined: Apr 24, 2020
Posts: 1
    You can view the ReLU activation function in artificial neural networks as a switch.
On: f(x) = x (connect).
Off: f(x) = 0 (disconnect).
Of course, with an ordinary household switch there is a fixed voltage on one side, so the more general analog/linear behavior here may cause some cognitive dissonance at first.
A ReLU neural network, then, is a switched composition of dot products (weighted sums).
And a dot product of dot products is still a dot product, since composing linear maps yields a linear map; see the sketch below.
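Here is a minimal Python sketch of that claim (my own illustration, not from the post): for a given input, the ReLU on/off decisions freeze into a gate pattern, and the whole network then collapses into a single effective dot product with the input.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer ReLU network: y = w2 . relu(W1 @ x)
W1 = rng.standard_normal((8, 4))
w2 = rng.standard_normal(8)
x = rng.standard_normal(4)

h = W1 @ x
gates = (h > 0).astype(float)     # the switch decisions ReLU makes for this x
y = w2 @ (gates * h)              # ordinary forward pass

# With the switches frozen, the whole network is one dot product:
w_eff = (w2 * gates) @ W1         # effective weight vector for this input
print(np.allclose(y, w_eff @ x))  # True
```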
That has a number of implications; the most important is the possibility of rearranging neural networks into something more efficient: moving from adjustable weights with a fixed activation function to fixed weights (using a fast transform such as the FFT) with individually parametrized, adjustable activation functions.
Such networks require fewer parameters and are faster than conventional neural networks. They can be used for ALife, ML-Agents, etc. A sketch follows.
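As one possible illustration of the idea (my own sketch, not necessarily the exact construction described at the link below): the fast Walsh-Hadamard transform can serve as the fixed-weight stage, with a two-slope activation per element supplying the adjustable parameters. Such a layer costs O(n log n) operations and 2n parameters, versus the n² of a dense layer.

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform, O(n log n); len(x) must be a power of two."""
    x = x.copy()
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, h * 2):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b
            x[i + h:i + 2 * h] = a - b
        h *= 2
    return x / np.sqrt(n)  # normalized, so the transform is orthonormal

class FixedTransformLayer:
    """Fixed weights (the WHT) + per-element two-slope activations (the adjustable part)."""
    def __init__(self, n, rng):
        self.pos = rng.standard_normal(n)  # slope applied where the element is positive
        self.neg = rng.standard_normal(n)  # slope applied where the element is negative

    def __call__(self, x):
        z = fwht(x)  # fixed "weighted sums" at O(n log n) cost
        return np.where(z >= 0, self.pos * z, self.neg * z)

rng = np.random.default_rng(0)
layer = FixedTransformLayer(8, rng)
print(layer(rng.standard_normal(8)))
```

Note how this inverts the usual arrangement: the dot products are frozen into the transform, and all the learnable parameters live in the activation functions.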
    https://ai462qqq.blogspot.com/2019/11/artificial-neural-networks.html
Also at that link there is information on creating associative memory, to give your ALife or ML-Agents creature an external memory bank (i.e., a Neural Turing Machine).
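For reference, here is a generic outer-product (Hebbian) associative memory sketch; the actual scheme described at the link may differ. Recall is a single matrix-vector product, and it is clean as long as the stored keys are roughly orthogonal, as random high-dimensional vectors tend to be.

```python
import numpy as np

class LinearAssociativeMemory:
    """Generic outer-product associative memory storing (key, value) pairs in one matrix."""
    def __init__(self, key_dim, value_dim):
        self.M = np.zeros((value_dim, key_dim))

    def store(self, key, value):
        k = key / np.linalg.norm(key)   # unit-normalize so recall recovers value exactly
        self.M += np.outer(value, k)

    def recall(self, key):
        k = key / np.linalg.norm(key)
        return self.M @ k               # one matrix-vector product

rng = np.random.default_rng(1)
mem = LinearAssociativeMemory(key_dim=256, value_dim=8)
key, value = rng.standard_normal(256), rng.standard_normal(8)
mem.store(key, value)
print(np.allclose(mem.recall(key), value, atol=1e-6))  # True for a single stored pair
```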