Applying np.linspace to Multi-Dimensional Array

Question
I have a multi-dimensional Numpy array of the following size:

(1200, 2600, 200)

At each point i, j, there is an assortment of unordered data which varies from point to point. I'm performing some analyses which require 200 evenly spaced values at each point. My current approach is the following:
import numpy as np

x = np.empty(shape=(array.shape[0], array.shape[1], 200))

def compiler(i, j):
    # fill point (i, j) with 200 evenly spaced values from 0.1 to that point's max
    x[i, j] = np.linspace(0.1, np.max(array[i, j]), 200)

[[compiler(i, j) for i in range(array.shape[0])] for j in range(array.shape[1])]
Constructing this in a list comprehension seems potentially very inefficient given Numpy's capabilities; surely there is a faster way to execute this process?

Edit: An example of the result I'm looking for at each point i, j with np.linspace is as follows. I'd expect the resulting ndarray to be of shape (1200, 2600, 200):
[ 0.1 0.31248513 0.52497025 0.73745538 0.94994051 1.16242564
1.37491076 1.58739589 1.79988102 2.01236615 2.22485127 2.4373364
2.64982153 2.86230665 3.07479178 3.28727691 3.49976204 3.71224716
3.92473229 4.13721742 4.34970255 4.56218767 4.7746728 4.98715793
5.19964305 5.41212818 5.62461331 5.83709844 6.04958356 6.26206869
6.47455382 6.68703895 6.89952407 7.1120092 7.32449433 7.53697945
7.74946458 7.96194971 8.17443484 8.38691996 8.59940509 8.81189022
9.02437535 9.23686047 9.4493456 9.66183073 9.87431585 10.08680098
10.29928611 10.51177124 10.72425636 10.93674149 11.14922662 11.36171175
11.57419687 11.786682 11.99916713 12.21165225 12.42413738 12.63662251
12.84910764 13.06159276 13.27407789 13.48656302 13.69904815 13.91153327
14.1240184 14.33650353 14.54898865 14.76147378 14.97395891 15.18644404
15.39892916 15.61141429 15.82389942 16.03638455 16.24886967 16.4613548
16.67383993 16.88632505 17.09881018 17.31129531 17.52378044 17.73626556
17.94875069 18.16123582 18.37372095 18.58620607 18.7986912 19.01117633
19.22366145 19.43614658 19.64863171 19.86111684 20.07360196 20.28608709
20.49857222 20.71105735 20.92354247 21.1360276 21.34851273 21.56099785
21.77348298 21.98596811 22.19845324 22.41093836 22.62342349 22.83590862
23.04839375 23.26087887 23.473364 23.68584913 23.89833425 24.11081938
24.32330451 24.53578964 24.74827476 24.96075989 25.17324502 25.38573015
25.59821527 25.8107004 26.02318553 26.23567066 26.44815578 26.66064091
26.87312604 27.08561116 27.29809629 27.51058142 27.72306655 27.93555167
28.1480368 28.36052193 28.57300706 28.78549218 28.99797731 29.21046244
29.42294756 29.63543269 29.84791782 30.06040295 30.27288807 30.4853732
30.69785833 30.91034346 31.12282858 31.33531371 31.54779884 31.76028396
31.97276909 32.18525422 32.39773935 32.61022447 32.8227096 33.03519473
33.24767986 33.46016498 33.67265011 33.88513524 34.09762036 34.31010549
34.52259062 34.73507575 34.94756087 35.160046 35.37253113 35.58501626
35.79750138 36.00998651 36.22247164 36.43495676 36.64744189 36.85992702
37.07241215 37.28489727 37.4973824 37.70986753 37.92235266 38.13483778
38.34732291 38.55980804 38.77229316 38.98477829 39.19726342 39.40974855
39.62223367 39.8347188 40.04720393 40.25968906 40.47217418 40.68465931
40.89714444 41.10962956 41.32211469 41.53459982 41.74708495 41.95957007
42.1720552 42.38454033 42.59702546 42.80951058 43.02199571 43.23448084
43.44696596 43.65945109 43.87193622 44.08442135 44.29690647 44.5093916
44.72187673 44.93436186 45.14684698 45.35933211 45.57181724 45.78430236
45.99678749 46.20927262 46.42175775 46.63424287 46.846728 47.05921313
47.27169826 47.48418338 47.69666851 47.90915364 48.12163876 48.33412389
48.54660902 48.75909415 48.97157927 49.1840644 49.39654953 49.60903466
49.82151978 50.03400491 50.24649004 50.45897516 50.67146029 50.88394542
51.09643055 51.30891567 51.5214008 51.73388593 51.94637106 52.15885618
52.37134131 52.58382644 52.79631156 53.00879669]
Answer 1

Score: 2
So for each i, j, you want 200 equally spaced values that differ only in the end point. Iteratively you use:

np.max(array[i, j])

np.max(array, axis=2) should give you that max for all i, j points at once.

If that's right, try:

M = np.linspace(0.1, np.max(array, axis=2), 200)

Check the shape of M. I don't remember whether the 200 will be leading or trailing; it may need a transpose/swapaxes.
In [3]: arr = np.arange(4, 28).reshape(2, 3, 4)

In [4]: np.max(arr, axis=2)
Out[4]:
array([[ 7, 11, 15],
       [19, 23, 27]])

In [5]: M = np.linspace(0, np.max(arr, axis=2), 4)

In [6]: M.shape
Out[6]: (4, 2, 3)

In [7]: M.transpose(1, 2, 0).shape
Out[7]: (2, 3, 4)

In [9]: np.linspace(0, np.max(arr, axis=2), 5).transpose(1, 2, 0)
Out[9]:
array([[[ 0.  ,  1.75,  3.5 ,  5.25,  7.  ],
        [ 0.  ,  2.75,  5.5 ,  8.25, 11.  ],
        [ 0.  ,  3.75,  7.5 , 11.25, 15.  ]],

       [[ 0.  ,  4.75,  9.5 , 14.25, 19.  ],
        [ 0.  ,  5.75, 11.5 , 17.25, 23.  ],
        [ 0.  ,  6.75, 13.5 , 20.25, 27.  ]]])
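For reference, np.linspace also accepts an axis keyword (NumPy 1.16+), which places the sample axis directly where you want it and avoids the transpose. A minimal sketch, assuming a small random stand-in array (the names array and x come from the question; the shape is shrunk so the example runs quickly):

import numpy as np

rng = np.random.default_rng(0)
array = rng.random((12, 26, 20))  # small stand-in for the (1200, 2600, 200) array

# Broadcast linspace over the per-point maxima and put the 200 samples
# on the last axis, matching the x[i, j] layout the question asks for.
x = np.linspace(0.1, np.max(array, axis=2), 200, axis=-1)

print(x.shape)  # (12, 26, 200)

This builds the whole result in one vectorized call instead of looping over 1200 × 2600 separate np.linspace calls.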