
Can Pytorch handle custom nonlinear activation functions that are not 1-to-1 functions?

Question


I am interested in making a neural network with custom nonlinear activation functions that are not 1-to-1 functions.

I see that it is possible to add custom nonlinear activation functions to Pytorch, but the only functions considered are 1-to-1 functions. That is, there is a linear layer which performs a dot product, and the result is fed to a nonlinear function, which takes a single input and returns an output.


Is it possible to have a custom nonlinear activation function that depends on multiple input arguments of the previous layer?

So instead of taking a single number, the output depends on all of the inputs of the input layer. In general it would be a function f(x, A) of the inputs and tunable weights that cannot be expressed as f(x dot A). One such function, for example, might look like:

[figure omitted: an example of such a function f(x, A)]

Is it possible to use such a complex activation layer in a NN in pytorch?
Or is this too unconventional?

Answer 1

Score: 1


You can write absolutely any logic in a PyTorch module; there is no need to be limited to dot products or pointwise nonlinearities:

    import torch
    import torch.nn as nn

    class CustomLayer(nn.Module):

        def __init__(self, size_in):
            super().__init__()
            # torch.Tensor(size_in, size_in) would leave the weights
            # uninitialized, so initialize them explicitly
            self.A = nn.Parameter(torch.randn(size_in, size_in))

        def forward(self, x):
            # x is a matrix of shape (batch, hidden_size)
            # just write any logic you wish here using self.A
            return ...
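
As a concrete sketch of the idea (the layer name and gating logic below are my own illustration, not part of the original answer), here is a forward pass in which every output coordinate depends on the whole input vector, so the mapping is a genuine f(x, A) rather than a pointwise nonlinearity applied to x @ A. Because it is built from standard tensor operations, autograd differentiates through it automatically:

```python
import torch
import torch.nn as nn

class GatedMixLayer(nn.Module):
    # Hypothetical example: each output entry is gated by a sigmoid of a
    # full matrix-vector product, so every output depends on all inputs.
    def __init__(self, size_in):
        super().__init__()
        self.A = nn.Parameter(torch.randn(size_in, size_in) * 0.1)

    def forward(self, x):
        # x: (batch, size_in); each gate entry sees every input coordinate
        gate = torch.sigmoid(x @ self.A)
        return x * gate

layer = GatedMixLayer(4)
x = torch.randn(2, 4)
y = layer(x)
y.sum().backward()               # autograd handles the custom logic
print(y.shape)                   # torch.Size([2, 4])
print(layer.A.grad is not None)  # True
```

Such a layer can be dropped into nn.Sequential like any built-in module, and optimizers will update A because it is registered as an nn.Parameter.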

huangapple
  • Posted on August 8, 2023 22:38:45
  • Please retain this link when reposting: https://go.coder-hub.com/76860612.html