Dot Product of Two Sparse Vectors - Problem

Given two sparse vectors, compute their dot product. A sparse vector is a vector that has mostly zero values - think of it like a long list where only a few positions have meaningful data.

You need to implement a SparseVector class with:

  • SparseVector(nums) - Initializes the object with the vector nums
  • dotProduct(vec) - Computes the dot product between this instance and vec

The dot product is calculated by multiplying corresponding elements and summing the results: v1[0]*v2[0] + v1[1]*v2[1] + ... + v1[n-1]*v2[n-1]

Key Challenge: Store the sparse vector efficiently and compute the dot product optimally. Since most values are zero, we don't want to waste time or space on them!
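
A minimal sketch of one way to do this in Python, storing only the non-zero entries in a dictionary keyed by index (this mirrors the hash-map idea in the visualization below; it is one reasonable approach, not the only one):

    class SparseVector:
        def __init__(self, nums):
            # Keep only the non-zero entries: index -> value.
            self.nonzero = {i: v for i, v in enumerate(nums) if v != 0}

        def dotProduct(self, vec):
            # Iterate over the smaller map and look up each index in the larger one,
            # so the work is O(min(a, b)) in the two non-zero counts.
            small, large = self.nonzero, vec.nonzero
            if len(small) > len(large):
                small, large = large, small
            return sum(v * large[i] for i, v in small.items() if i in large)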

Follow-up: What if only one of the vectors is sparse?
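
One common answer to the follow-up, sketched under the assumption that the dense vector is still available as a plain list (the helper name dot_sparse_dense is made up for illustration): iterate over the sparse vector's few non-zero entries and index directly into the dense array, so the cost stays proportional to the sparse vector's non-zero count.

    def dot_sparse_dense(sparse_entries, dense_nums):
        # sparse_entries: dict of index -> non-zero value from the sparse vector.
        # dense_nums: the other vector as an ordinary list.
        # Runs in O(k), where k is the number of non-zero entries in the sparse vector.
        return sum(v * dense_nums[i] for i, v in sparse_entries.items())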

Input & Output

example_1.py — Basic sparse vectors
$ Input: nums1 = [1,0,0,2,3], nums2 = [0,3,0,4,0]
› Output: 8
💡 Note: v1 = [1,0,0,2,3], v2 = [0,3,0,4,0]. Dot product = 1×0 + 0×3 + 0×0 + 2×4 + 3×0 = 0 + 0 + 0 + 8 + 0 = 8
example_2.py — Highly sparse vectors
$ Input: nums1 = [0,1,0,0,0], nums2 = [0,0,0,0,2]
› Output: 0
💡 Note: No overlapping non-zero positions. All products are 0, so dot product = 0
example_3.py — Single overlapping position
$ Input: nums1 = [0,1,0,0,2,0,0], nums2 = [1,0,0,0,3,0,4]
› Output: 6
💡 Note: Only position 4 has non-zero values in both: 2×3 = 6

Constraints

  • n == nums1.length == nums2.length
  • 1 ≤ n ≤ 10^5
  • -100 ≤ nums1[i], nums2[i] ≤ 100
  • At most 1000 calls will be made to dotProduct

Visualization

Sparse Vector Dot Product Optimization

  ❌ Brute Force
    Store: [1,0,0,2,3] ← full array
    Compute: check all 5 positions
    Result: 1×0 + 0×3 + 0×0 + 2×4 + 3×0
    Time: O(n), Space: O(n)

  ✅ Hash Map
    Store: {0:1, 3:2, 4:3} ← non-zeros only
    Compute: find matching indices
    Result: index 3 matches → 2×4 = 8
    Time: O(min(a,b)), Space: O(k)

  🎯 Key Insight: Sparse vectors have mostly zeros.
    • Zero × anything = zero
    • Only non-zero pairs contribute to the dot product
    • A hash map gives O(1) lookup for index matching
  💡 Skip all zero multiplications! Performance scales with the sparsity level.
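
For contrast, the brute-force column above corresponds to something like the following sketch, which touches every position regardless of sparsity (an assumed illustration, not part of the required interface):

    def dot_brute_force(nums1, nums2):
        # Multiplies every aligned pair, including all the zero terms.
        return sum(a * b for a, b in zip(nums1, nums2))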
Understanding the Visualization

  1. Identify the Problem: most vector elements are zero, wasting computation time.
  2. Store Smart: use a hash map to store only the meaningful (non-zero) values.
  3. Compute Efficiently: only process indices that have values in both vectors.

Key Takeaway

🎯 Key Insight: For sparse data, only store and process the meaningful elements. This transforms an O(n) problem into O(k), where k << n, providing massive speedups for highly sparse vectors.
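
As a quick sanity check against example 1, using the SparseVector sketch from earlier:

    v1 = SparseVector([1, 0, 0, 2, 3])
    v2 = SparseVector([0, 3, 0, 4, 0])
    print(v1.dotProduct(v2))  # 8: only index 3 is non-zero in both (2 * 4)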