How to make your code faster using JavaScript Sets

While writing code, we always try to make it easier to read, less complex, more efficient, and smaller in size, and we follow several methodologies to achieve that. In this article, we shall focus on techniques based on JavaScript sets that make certain array-like, collection-based operations run faster while keeping the code concise. Let us look at a few use cases.

How sets differ from arrays and the benefits of using sets over arrays

Arrays are indexed collections where each element is associated with a specific index. Sets, on the other hand, are keyed collections: each element acts as its own key, elements are iterated in insertion order, and duplicates are not allowed, so every element in a set is unique. These properties give sets a few performance benefits over arrays.

  • To check whether an element is present in an array, we use the indexOf() or includes() methods, which scan the array linearly. This is a slow operation compared to the has() method of a set, which looks the value up by key.

  • We can delete an element from a set directly by its value using delete(). In an array, we first have to find the element's index and then call splice(), so deletion is an index-based, and therefore slower, process.

  • Similarly, inserting an element into an array with push() or unshift() is generally slower than a set's add() operation; unshift() in particular has to shift every existing element.

  • Searching for NaN in an array with indexOf() does not work, because NaN is never equal to itself. Sets use the SameValueZero comparison, so has(NaN) correctly finds a stored NaN.

  • Since sets do not allow duplicate elements, we can use a set to remove duplicates from an array. The approach is simple: insert every element of the array into a set without any condition check, then read the elements back out. The duplicates are discarded automatically and only the unique elements remain.

Why sets are faster than arrays

Most operations on JavaScript arrays, like insert, delete, and search, are linear-time operations: they need O(n) time to complete, where n is the size of the array. Since sets store elements by key, these operations typically take constant time, O(1), so the size of a set does not noticeably affect its performance. Now let us look at a few examples to analyze how fast sets are in JavaScript.

Initial setup for extreme performance test on sets and arrays

We create an array and a set, each with 1000000 elements numbered from 0 to 999999. Then we shall perform different tests on these two data structures.


let A = [], S = new Set(), size = 1000000;
for (let i = 0; i < size; i++) {
   A.push(i);
   S.add(i);
}

Performance test on item searching

Let us search for an element, say 56420, in both the array and the set, and display the time each operation takes. From the timings we can easily see which one finishes sooner and therefore performs better.

Note − The results of these code examples cannot be shown on an HTML page. To see the output, run the scripts from a local file and check the JavaScript console of your browser.


let A = [], S = new Set(), size = 1000000;
for (let i = 0; i < size; i++) {
   A.push(i);
   S.add(i);
}

let res;
let toSearch = 56420;

console.time('ArrTime');
res = A.indexOf(toSearch) !== -1;
console.timeEnd('ArrTime');

console.time('SetTime');
res = S.has(toSearch);
console.timeEnd('SetTime');

The exact times vary across systems and browsers, but the array will consistently take longer than the set. In this run, the array search took 0.172ms while the set took 0.008ms, which is nearly 21 times faster.

Performance test on item insertion

Let us see another similar example, where we insert an element into both the set and the array and compare their performance based on execution time.


let A = [], S = new Set(), size = 1000000;
for (let i = 0; i < size; i++) {
   A.push(i);
   S.add(i);
}

console.time('ArrTimeInsert');
A.push(size);
console.timeEnd('ArrTimeInsert');

console.time('SetTimeInsert');
S.add(size);
console.timeEnd('SetTimeInsert');

Insertion takes 0.140ms for the array and 0.009ms for the set. Here too the set performs better, about 15.5 times faster than the array.

Performance test on item deletion

Let us run a similar test for element deletion. Deleting an element from a JavaScript array is not a straightforward process: it takes a few steps to remove the element at a given index. For a set, we can simply call the delete() method with the value to remove. Let us see the code for a better understanding.


let A = [], S = new Set(), size = 1000000;
for (let i = 0; i < size; i++) {
   A.push(i);
   S.add(i);
}

function deleteFromArray(array, element) {
   let idx = array.indexOf(element);
   return idx !== -1 && array.splice(idx, 1);
}

let res;
let toDelete = 56420;

console.time('ArrTimeDelete');
res = deleteFromArray(A, toDelete);
console.timeEnd('ArrTimeDelete');

console.time('SetTimeDelete');
res = S.delete(toDelete);
console.timeEnd('SetTimeDelete');

Array deletion takes 1.09ms, but this is not purely the cost of deletion: two operations are combined here. We first find the index of the given element, then perform the splice. The set takes almost the same time as its other operations, which confirms that set operations run in constant time.

Conclusion

While developing a large-scale application, the system should perform efficiently in every area, and writing faster, more efficient code is always good practice for developers. This article covered the benefits of using the set data structure over the array data structure in JavaScript to improve performance. JavaScript array operations like insert, delete, and search take linear time, which grows with the number of elements in the array; sets, on the other hand, perform these operations in constant time. Sets store data by key, which makes searching, inserting, and deleting efficient, whereas arrays use index-based approaches that slow down these operations in practice.