
ReLU as a switch

Discussion in 'ML-Agents' started by S6Regen, Apr 24, 2020.

  1. S6Regen

     Joined: Apr 24, 2020
     Posts: 1
    You can view the ReLU activation function in artificial neural networks as a switch.
    On: f(x) = x (connect).
    Off: f(x) = 0 (disconnect).
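    As a minimal sketch of that switch view (plain NumPy; the function names are mine, not from the post):

```python
import numpy as np

def relu(x):
    # Standard ReLU: f(x) = max(0, x).
    return np.maximum(0.0, x)

def relu_as_switch(x):
    # Equivalent view: a binary switch s in {0, 1} per element that
    # either connects the input through unchanged (s = 1) or
    # disconnects it (s = 0).
    s = (x > 0).astype(x.dtype)
    return s * x

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
assert np.allclose(relu(x), relu_as_switch(x))
```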
    Of course, in the home and in ordinary experience there is a fixed voltage on one side of a switch, so the more general analog/linear aspect here (the switch passing through a variable signal) may cause some cognitive dissonance.
    A ReLU neural network, then, is a switched composition of dot products (weighted sums).
    Also, a weighted sum of weighted sums is still a weighted sum: once every switch state is decided, each output of the network is a single dot product with the input vector.
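    A small demonstration of that collapse, assuming a two-layer toy network (NumPy, names hypothetical): for a given input, the switch states can be folded into the weights, leaving one matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
W1 = rng.standard_normal((8, 8))
W2 = rng.standard_normal((8, 8))

# Ordinary forward pass with ReLU between the two layers.
h = np.maximum(0.0, W1 @ x)
y = W2 @ h

# Record the switch decisions for this particular input, then fold
# them into the weights: the whole network becomes one matrix, i.e.
# each output is a single dot product with x.
s = (W1 @ x > 0).astype(float)        # switch states, 0 or 1
W_effective = W2 @ (s[:, None] * W1)  # diag(s) folded into W1
assert np.allclose(y, W_effective @ x)
```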
    That has a number of implications; the most important is the possibility of rearranging neural networks into something more efficient: moving from adjustable weights with a fixed activation function to fixed weights (supplied by a fast transform such as the FFT) with individually parameterized, adjustable activation functions.
    Such networks require fewer parameters and are faster than conventional neural networks. They can be used for ALife, ML-Agents, etc.
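    A minimal sketch of one such layer, assuming a fast Walsh-Hadamard transform as the fixed "weight matrix" and a two-sided parameterized ReLU as the per-element activation (both choices are my assumptions; the linked post describes its own variants):

```python
import numpy as np

def fwht(x):
    # Fast Walsh-Hadamard transform, O(n log n); it plays the role
    # of a fixed, unadjustable weight matrix.
    x = x.copy()
    h = 1
    while h < len(x):
        for i in range(0, len(x), h * 2):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x

def layer(x, a, b):
    # Fixed transform followed by individually parameterized
    # activations: element i gets its own slope a[i] when positive
    # and b[i] when negative. Only a and b are trainable.
    t = fwht(x)
    return np.where(t > 0, a * t, b * t)

n = 8                        # must be a power of two for the FWHT
rng = np.random.default_rng(1)
x = rng.standard_normal(n)
a = rng.standard_normal(n)   # per-element positive-side slopes
b = rng.standard_normal(n)   # per-element negative-side slopes
y = layer(x, a, b)
```

    In this sketch each layer trains only 2n activation parameters instead of the n^2 weights of a dense layer, and the fixed transform costs O(n log n) operations per pass.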
    https://ai462qqq.blogspot.com/2019/11/artificial-neural-networks.html
    There is also information at that link on building associative memory, which can give your ALife or ML-Agents agent an external memory bank (akin to a Neural Turing Machine).
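    For orientation only, here is a generic one-shot linear associative memory (outer-product storage); the linked post builds its version on fast transforms, which this sketch does not reproduce.

```python
import numpy as np

n = 64
rng = np.random.default_rng(2)
keys = rng.standard_normal((4, n)) / np.sqrt(n)  # near-orthogonal keys
values = rng.standard_normal((4, n))

# Write: superimpose the outer product of each (value, key) pair.
W = np.zeros((n, n))
for k, v in zip(keys, values):
    W += np.outer(v, k)

# Read: multiplying by a stored key approximately recovers its value,
# with crosstalk noise from the other stored pairs.
recalled = W @ keys[0]
print(np.corrcoef(recalled, values[0])[0, 1])  # close to 1.0
```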