Counting largest numbers in row and column in 2-D array in JavaScript


We are required to write a JavaScript function that takes in a two-dimensional array of integers as the only argument.

The task of our function is to count all the integers in the array that are the greatest both within their row and within their column.

The function should then return that count.

For example −

If the input array is −

const arr = [
   [21, 23, 22],
   [26, 26, 25],
   [21, 25, 27]
];

Then the output should be −

const output = 3;

because three elements — 26, 26, and 27 — are each the greatest in both their row and their column.

Example

Following is the code −

const arr = [
   [21, 23, 22],
   [26, 26, 25],
   [21, 25, 27]
];
const countGreatest = (matrix = []) => {
   const rows = matrix.length;
   if (rows === 0) {
      return 0;
   }
   const cols = matrix[0].length;
   // Initialize with -Infinity so the code also works for negative numbers
   const rowMax = new Array(rows).fill(-Infinity);
   const colMax = new Array(cols).fill(-Infinity);
   let res = 0;
   // First pass: record the maximum of every row and of every column
   for (let r = 0; r < rows; r++) {
      for (let c = 0; c < cols; c++) {
         rowMax[r] = Math.max(rowMax[r], matrix[r][c]);
         colMax[c] = Math.max(colMax[c], matrix[r][c]);
      }
   }
   // Second pass: count elements equal to both their row and column maximum
   for (let r = 0; r < rows; r++) {
      for (let c = 0; c < cols; c++) {
         if (matrix[r][c] === rowMax[r] && matrix[r][c] === colMax[c]) {
            res++;
         }
      }
   }
   return res;
};
console.log(countGreatest(arr));

Output

Following is the console output −

3
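As a cross-check, the same two-step idea — precompute row and column maxima, then count matches — can be sketched more declaratively with Math.max and the spread operator. The name countGreatestAlt is introduced here for illustration; it is not part of the original code.

```javascript
// Alternative sketch: derive rowMax and colMax with Math.max(...arr),
// then count matrix cells that equal both maxima
const countGreatestAlt = (matrix = []) => {
   if (matrix.length === 0) {
      return 0;
   }
   // Maximum of each row
   const rowMax = matrix.map(row => Math.max(...row));
   // Maximum of each column: pick column c out of every row
   const colMax = matrix[0].map((_, c) =>
      Math.max(...matrix.map(row => row[c]))
   );
   // Count cells equal to both their row maximum and column maximum
   return matrix.reduce((count, row, r) =>
      count + row.filter((val, c) =>
         val === rowMax[r] && val === colMax[c]
      ).length, 0);
};
console.log(countGreatestAlt([
   [21, 23, 22],
   [26, 26, 25],
   [21, 25, 27]
])); // → 3
```

This version trades the explicit nested loops for array methods; it produces the same count, and because Math.max() over an actual row or column never needs a seed value, it avoids the negative-number pitfall entirely.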

Updated on: 23-Jan-2021
