Algorithm for matrix multiplication in JavaScript

We are required to write a JavaScript function that takes in two 2-D arrays of numbers and returns their matrix product.

Let’s write the code for this function −

Example

The code for this will be −

const multiplyMatrices = (a, b) => {
   if (!Array.isArray(a) || !Array.isArray(b) || !a.length || !b.length) {
      throw new Error('arguments should be in 2-dimensional array format');
   }
   let x = a.length,
   z = a[0].length,
   y = b[0].length;
   if (b.length !== z) {
      // XxZ & ZxY => XxY
      throw new Error('number of columns in the first matrix should be the same as the number of rows in the second');
   }
   // initialise the x-by-y product matrix with zeroes
   let productRow = Array.apply(null, new Array(y)).map(Number.prototype.valueOf, 0);
   let product = new Array(x);
   for (let p = 0; p < x; p++) {
      product[p] = productRow.slice();
   }
   // each cell is the dot product of row i of a and column j of b
   for (let i = 0; i < x; i++) {
      for (let j = 0; j < y; j++) {
         for (let k = 0; k < z; k++) {
            product[i][j] += a[i][k] * b[k][j];
         }
      }
   }
   return product;
};

Output

The output in the console −

[
   [ 15, 33, 51, 37, 34, 40 ],
   [ 33, 78, 123, 85, 76, 88 ],
   [ 51, 123, 195, 133, 118, 136 ],
   [ 7, 16, 25, 18, 17, 22 ],
   [ 31, 73, 115, 88, 73, 96 ]
]
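To sanity-check the function, you can call it with small matrices whose product is easy to verify by hand. The sketch below uses a compact functional rewrite of the same algorithm (without the validation checks) so that it runs on its own; it is an illustration, not a replacement for the full function above −

```javascript
// Compact equivalent of multiplyMatrices, repeated here so the snippet is standalone;
// each cell is the dot product of a row of a and a column of b
const multiplyMatrices = (a, b) =>
   a.map(row => b[0].map((_, j) =>
      row.reduce((sum, val, k) => sum + val * b[k][j], 0)));

// 2x2 example, easy to verify by hand
const result = multiplyMatrices([[1, 2], [3, 4]], [[5, 6], [7, 8]]);
console.log(result); // [ [ 19, 22 ], [ 43, 50 ] ]
```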
Updated on: 2020-10-15T09:34:04+05:30
