How to filter out duplicate subarrays in an array of arrays in JavaScript
When working with arrays of arrays in JavaScript, you may need to remove duplicate subarrays and keep only unique ones. This is useful for data deduplication and filtering operations.
The Problem
Suppose we have an array of arrays with duplicate subarrays:
```javascript
const arr = [
    ["Serta", "Black Friday"],
    ["Serta", "Black Friday"],
    ["Simmons", "Black Friday"],
    ["Simmons", "Black Friday"],
    ["Simmons", "Black Friday"],
    ["Simmons", "Black Friday"]
];

console.log("Original array length:", arr.length);
```
Original array length: 6
Using JSON.stringify() with Filter
The most straightforward approach is to convert each subarray to a JSON string for comparison:
```javascript
const filterCommon = arr => {
    // Plain object (no prototype) used as a lookup of serialized subarrays
    const map = Object.create(null);
    return arr.filter(el => {
        const str = JSON.stringify(el);
        const isNew = !map[str];
        map[str] = true;
        // Keep the element only the first time its serialized form appears
        return isNew;
    });
};

const arr = [
    ["Serta", "Black Friday"],
    ["Serta", "Black Friday"],
    ["Simmons", "Black Friday"],
    ["Simmons", "Black Friday"],
    ["Simmons", "Black Friday"],
    ["Simmons", "Black Friday"]
];

console.log(filterCommon(arr));
```
[ [ 'Serta', 'Black Friday' ], [ 'Simmons', 'Black Friday' ] ]
Using Set with JSON.stringify()
A more concise approach using Set for automatic uniqueness:
```javascript
const filterUniqueArrays = arr => {
    const seen = new Set();
    return arr.filter(subArray => {
        const str = JSON.stringify(subArray);
        if (seen.has(str)) {
            return false;
        }
        seen.add(str);
        return true;
    });
};

const testArray = [
    ["A", "B"],
    ["A", "B"],
    ["C", "D"],
    ["A", "B"],
    ["E", "F"]
];

console.log(filterUniqueArrays(testArray));
```
[ [ 'A', 'B' ], [ 'C', 'D' ], [ 'E', 'F' ] ]
Using Map as the Lookup Table
A Map offers the same constant-time lookups as a Set and is handy if you later want to associate data (such as a count or the original subarray) with each key; here it simply stores true:
```javascript
const filterWithMap = arr => {
    const uniqueMap = new Map();
    return arr.filter(subArray => {
        const key = JSON.stringify(subArray);
        if (uniqueMap.has(key)) {
            return false;
        }
        uniqueMap.set(key, true);
        return true;
    });
};

const duplicateArray = [
    [1, 2, 3],
    [4, 5, 6],
    [1, 2, 3],
    [7, 8, 9],
    [4, 5, 6]
];

console.log(filterWithMap(duplicateArray));
```
[ [ 1, 2, 3 ], [ 4, 5, 6 ], [ 7, 8, 9 ] ]
Comparison of Methods
| Method | Performance | Code Length | Readability |
|---|---|---|---|
| Object with filter() | Good | Medium | Medium |
| Set with filter() | Good | Short | High |
| Map with filter() | Good | Medium | High |
Key Points
- JSON.stringify() converts arrays to strings for comparison
- All methods preserve the original order of first occurrences
- Set and Map provide cleaner syntax than a plain object lookup
- These approaches work for nested arrays of any depth, as long as the elements are JSON-serializable
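One caveat worth keeping in mind: JSON.stringify() compares the exact serialized form, so element order matters, and values JSON cannot represent may collide. A small sketch:

```javascript
// Subarrays that differ only in element order serialize differently,
// so they are treated as distinct (not duplicates).
console.log(JSON.stringify(["A", "B"]) === JSON.stringify(["B", "A"])); // false

// Values that JSON cannot represent inside an array (like undefined)
// serialize to null, so different subarrays can collide.
console.log(JSON.stringify([undefined]) === JSON.stringify([null])); // true
```

If your subarrays contain non-JSON values, consider a custom key function instead of JSON.stringify().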
Conclusion
Use JSON.stringify() with Set or Map to filter duplicate subarrays efficiently. The Set approach offers the cleanest syntax, while Map provides better performance for large datasets.