How to omit an axis in NumPy's outer subtract

Question

I am computing a table of Euclidean (or other) distances. It is a lookup table for distances in a space. For 2-D points the lookup is done as table[x1, y1, x2, y2]. For instance, table[0, 0, 0, 10] should equal 10, and table[0, 0, 10, 10] should be 14.142135623730951.
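
For concreteness, here is a small sketch of those two quoted lookups computed point-to-point with np.linalg.norm (values only; building the full table is what follows below):

import numpy as np

# Sketch only: the two example distances quoted above, computed point-to-point.
p0 = np.array([0, 0])
print(np.linalg.norm(p0 - np.array([0, 10])))    # 10.0
print(np.linalg.norm(p0 - np.array([10, 10])))   # 14.142135623730951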

I have a working example that I would like to rewrite so that it works on points with an arbitrary number of dimensions. This code works only for two dimensions.

import numpy as np

to = np.indices((5, 5)).T                              # grid of 2-D integer points, shape (5, 5, 2)
x_shape, y_shape, n_dim = to.shape
x_sub = np.subtract.outer(to[..., 0], to[..., 0])      # pairwise x differences, shape (5, 5, 5, 5)
y_sub = np.subtract.outer(to[..., 1], to[..., 1])      # pairwise y differences, shape (5, 5, 5, 5)
distances = np.sqrt(x_sub ** 2 + y_sub ** 2)           # Euclidean distance lookup table

n_test = 5
for i in np.random.choice(x_shape, n_test):
    for j in np.random.choice(y_shape, n_test):
        for k in np.random.choice(x_shape, n_test):
            for l in np.random.choice(y_shape, n_test):
                d = np.sqrt((i - k) ** 2 + (j - l) ** 2)
                assert (distances[i, j, k, l] == d)
print('\033[92m TEST PASSES')

Now, I could simply run a loop and do np.subtract.outer on each dimension, but I would like to know whether there is a way to do the outer subtraction on the indices directly, as in np.subtract.outer(to, to). When I try that, the result has shape (5, 5, 2, 5, 5, 2), while what I need is (5, 5, 5, 5, 2). Does anyone know how to do this?
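
For reference, a minimal sketch of that per-dimension loop, assuming the same to = np.indices((5, 5)).T grid as above; it generalizes to any number of dimensions but still needs one np.subtract.outer call per axis:

import numpy as np

to = np.indices((5, 5)).T
n_dim = to.shape[-1]
sq = np.zeros(to.shape[:-1] * 2)                 # accumulator, shape (5, 5, 5, 5)
for d in range(n_dim):                           # one outer subtraction per coordinate axis
    sq += np.subtract.outer(to[..., d], to[..., d]) ** 2
distances = np.sqrt(sq)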

Solution derived from the accepted answer:

import numpy as np 
to = np.indices((5, 5, 5)).T                     # 3-D grid of points, shape (5, 5, 5, 3)
sh = list(to.shape)
n_dims = sh[-1]
# Broadcast grid against grid: (5, 5, 5, 1, 1, 1, 3) - (1, 1, 1, 5, 5, 5, 3)
t = to.reshape(sh[:-1] + [1] * n_dims + [n_dims]) - to.reshape([1] * n_dims + sh[:-1] + [n_dims])
distances = np.sqrt(np.sum(np.power(t, 2), axis=-1))   # shape (5, 5, 5, 5, 5, 5)

n_test = 100
idxs = np.indices(distances.shape).T.reshape(-1, n_dims * 2)
for idx in np.random.permutation(idxs)[:n_test]:
    assert distances[tuple(idx)] == np.linalg.norm(idx[:n_dims] - idx[n_dims:])

print('\033[92m TEST PASSES')
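
As an extra sanity check (continuing from the variables defined in the snippet above, and assuming SciPy is available), the table can be compared against scipy.spatial.distance.cdist on the flattened list of grid points:

import numpy as np
from scipy.spatial.distance import cdist

# Sketch only: cross-check the lookup table against SciPy's pairwise distances.
pts = to.reshape(-1, n_dims)                         # (125, 3) list of grid points
ref = cdist(pts, pts).reshape(distances.shape)       # pairwise Euclidean distances
assert np.allclose(distances, ref)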

Answer 1

Score: 0

I'll illustrate with a (3,4) array:

In [9]: x = np.arange(12).reshape(3,4)

In [10]: np.subtract.outer(x,x).shape
Out[10]: (3, 4, 3, 4)

But with broadcasting, we can do an 'outer' on the first dimensions while "sharing" the second. That can be generalized.

In [11]: (x[:,None,:]-x[None,:,:]).shape
Out[11]: (3, 3, 4)

The full 'outer' can also be done with broadcasting:

In [12]: np.allclose(np.subtract.outer(x,x), x[:,:,None,None]-x[None,None,:,:])
Out[12]: True
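
To connect this back to the question, the same broadcasting trick can be generalized to any number of grid axes by building the index tuples programmatically. A rough sketch, assuming the point grid has shape grid_shape + (n_dim,) as in the question (the helper name pairwise_diffs is only for illustration):

import numpy as np

def pairwise_diffs(to):
    # to has shape grid_shape + (n_dim,); result has shape grid_shape + grid_shape + (n_dim,)
    n = to.ndim - 1                                   # number of grid axes
    a = to[(...,) + (None,) * n + (slice(None),)]     # grid_shape + (1,)*n + (n_dim,)
    b = to[(None,) * n + (...,)]                      # (1,)*n + grid_shape + (n_dim,)
    return a - b

to = np.indices((5, 5)).T
distances = np.sqrt((pairwise_diffs(to) ** 2).sum(axis=-1))   # shape (5, 5, 5, 5)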

