# Multiplying two matrices in JavaScript with different dimensions


We are required to write a JavaScript function that takes in two 2-D arrays of numbers and returns their matrix multiplication result. The product is defined only when the number of columns in the first matrix equals the number of rows in the second; multiplying an (x × z) matrix by a (z × y) matrix yields an (x × y) matrix.

Let’s say the following are our two matrices −

// 5 x 4
let a = [
   [1, 2, 3, 1],
   [4, 5, 6, 1],
   [7, 8, 9, 1],
   [1, 1, 1, 1],
   [5, 7, 2, 6]
];
// 4 x 6
let b = [
   [1, 4, 7, 3, 4, 6],
   [2, 5, 8, 7, 3, 2],
   [3, 6, 9, 6, 7, 8],
   [1, 1, 1, 2, 3, 6]
];
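
Each entry of the product is the dot product of a row of the first matrix with a column of the second. As a quick sanity check (a minimal sketch using the a and b above, not part of the final function), this is how the top-left entry of the product would be computed −

// product[0][0] = (row 0 of a) · (column 0 of b)
let cell = 0;
for (let k = 0; k < b.length; k++) {
   cell += a[0][k] * b[k][0];
}
console.log(cell); // 1*1 + 2*2 + 3*3 + 1*1 = 15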

## Example

Let’s write the code for this function −

const multiplyMatrices = (a, b) => {
   if (!Array.isArray(a) || !Array.isArray(b) || !a.length || !b.length) {
      throw new Error('arguments should be in 2-dimensional array format');
   }
   let x = a.length,
   z = a[0].length,
   y = b[0].length;
   // an (x × z) matrix times a (z × y) matrix yields an (x × y) matrix,
   // so the row count of b must equal the column count of a
   if (b.length !== z) {
      throw new Error('number of columns in the first matrix should be the same as the number of rows in the second');
   }
   // build an x × y result matrix filled with zeroes
   let product = new Array(x);
   for (let p = 0; p < x; p++) {
      product[p] = new Array(y).fill(0);
   }
   // product[i][j] is the dot product of row i of a and column j of b
   for (let i = 0; i < x; i++) {
      for (let j = 0; j < y; j++) {
         for (let k = 0; k < z; k++) {
            product[i][j] += a[i][k] * b[k][j];
         }
      }
   }
   return product;
};
// 5 x 4
let a = [
   [1, 2, 3, 1],
   [4, 5, 6, 1],
   [7, 8, 9, 1],
   [1, 1, 1, 1],
   [5, 7, 2, 6]
];
// 4 x 6
let b = [
   [1, 4, 7, 3, 4, 6],
   [2, 5, 8, 7, 3, 2],
   [3, 6, 9, 6, 7, 8],
   [1, 1, 1, 2, 3, 6]
];
// should result in a 5 x 6 matrix
console.log(multiplyMatrices(a, b));
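
If the inner dimensions do not match, the guard inside the function throws instead of returning a result. For instance, swapping the arguments would fail here, since b has 6 columns while a has only 5 rows −

// multiplyMatrices(b, a);
// Error: number of columns in the first matrix should be the same as the number of rows in the second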

## Output

The output in the console will be −

[
[ 15, 33, 51, 37, 34, 40 ],
[ 33, 78, 123, 85, 76, 88 ],
[ 51, 123, 195, 133, 118, 136 ],
[ 7, 16, 25, 18, 17, 22 ],
[ 31, 73, 115, 88, 73, 96 ]
]
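
For reference, the same triple loop can be written more tersely with Array.from() and reduce(). This is just an equivalent sketch of the function above (the name multiplyMatricesTerse is ours, purely for illustration), and note that it omits the dimension check −

const multiplyMatricesTerse = (a, b) =>
   Array.from({ length: a.length }, (_, i) =>
      Array.from({ length: b[0].length }, (_, j) =>
         // dot product of row i of a and column j of b
         b.reduce((sum, row, k) => sum + a[i][k] * row[j], 0)
      )
   );
console.log(multiplyMatricesTerse(a, b)); // same 5 x 6 matrix as above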