Is there a better way to refactor map and filter code

Question
I have an array of data. The array contains over 500k items. I want to map the items in the array and then filter them. This is what I have done.
I create a function that filters the array to get unique values.
const getUniqueValues = (array: string[]): string[] => {
  return array.filter((item, index, _array) => _array.indexOf(item) === index);
};
Then I pass the mapped data into the function.
const uniqueValues = getUniqueValues(
  editedData.map((bodyItem: any) => bodyItem[index])
).filter(Boolean);
This worked well and quickly when the array contained fewer items. Now it sometimes takes five to ten minutes to perform the action, which isn't good for the user experience. uniqueValues now returns approximately 210,000 items.
Is there a better way to perform this in less time?
I have tried array.reduce, but I'm not sure about my code because it doesn't seem to solve the problem. If someone could check it out, I'd appreciate it.
const uniqueValues = editedData.reduce(
  (accumulator, bodyItem) => {
    const item = bodyItem[index];
    if (!accumulator.includes(item)) {
      accumulator.push(item);
    }
    return accumulator;
  },
  []
);
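For reference, the reduce attempt above is still quadratic: accumulator.includes scans the whole accumulator for every element, just like indexOf in the filter version. A minimal sketch of the same reduction with a Set carried alongside for constant-time membership checks (the sample rows and key 0 are illustrative stand-ins for editedData and index):

```typescript
// Sketch: dedupe while reducing, using a Set for O(1) membership tests.
const rows: string[][] = [["a"], ["b"], ["a"], ["c"], ["b"]];

const seen = new Set<string>();
const uniqueValues = rows.reduce<string[]>((accumulator, bodyItem) => {
  const item = bodyItem[0];
  if (!seen.has(item)) { // O(1), unlike accumulator.includes (O(n))
    seen.add(item);
    accumulator.push(item);
  }
  return accumulator;
}, []);
// uniqueValues is ["a", "b", "c"]
```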
Answer 1

Score: 1
With that many items, you should use the built-in Set class, which gets rid of duplicates efficiently. Just replace getUniqueValues with the code below:

const getUniqueValues = (array: string[]): string[] => {
  return [...new Set(array)];
};
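A quick usage check (the sample data is made up): a Set iterates in insertion order, so first occurrences win and the result matches what the filter/indexOf version produced, while construction is a single linear pass:

```typescript
const getUniqueValues = (array: string[]): string[] => {
  return [...new Set(array)];
};

// Set keeps insertion order, so the first occurrence of each value survives.
const sample = ["x", "y", "x", "", "y", "z"];
const unique = getUniqueValues(sample).filter(Boolean); // drop falsy values, as in the question
// unique is ["x", "y", "z"]
```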
Answer 2

Score: -1
Can you try this:

const uniqueValues = Array.from(new Set(editedData.map((bodyItem: any) => bodyItem[index]))).filter(Boolean);

- new Set(editedData.map((bodyItem: any) => bodyItem[index])) creates a Set, automatically removing any duplicate values.
- Array.from() then converts this Set back into an array.
- .filter(Boolean) filters out any falsy values.
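One small variation on the above, if many of the mapped values are falsy: filtering before building the Set means the Set never stores values you are going to discard anyway (the sample editedData and index are illustrative):

```typescript
// Illustrative stand-ins for the question's editedData and index.
const editedData: Array<Record<number, string>> = [
  { 0: "a" }, { 0: "" }, { 0: "b" }, { 0: "a" },
];
const index = 0;

// Filter falsy values first so the Set only ever holds the values we keep.
const uniqueValues = Array.from(
  new Set(editedData.map((bodyItem) => bodyItem[index]).filter(Boolean))
);
// uniqueValues is ["a", "b"]
```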