Calculating excluded average - JavaScript


Suppose we have an array of objects like this −

const arr = [
   {val: 56, canUse: true},
   {val: 16, canUse: true},
   {val: 45, canUse: true},
   {val: 76, canUse: false},
   {val: 45, canUse: true},
   {val: 23, canUse: false},
   {val: 23, canUse: false},
   {val: 87, canUse: true},
];

We are required to write a JavaScript function that calculates the average of the val property over all those objects whose canUse flag is set to true.

Example

Following is the code −

const arr = [
   {val: 56, canUse: true},
   {val: 16, canUse: true},
   {val: 45, canUse: true},
   {val: 76, canUse: false},
   {val: 45, canUse: true},
   {val: 23, canUse: false},
   {val: 23, canUse: false},
   {val: 87, canUse: true},
];
const excludedAverage = arr => {
   let sum = 0, count = 0;
   for(let i = 0; i < arr.length; i++){
      // skip objects whose canUse flag is false
      if(!arr[i].canUse){
         continue;
      }
      count++;
      sum += arr[i].val;
   }
   // guard against division by zero when no object is usable
   return sum / (count || 1);
};
console.log(excludedAverage(arr));
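Equivalently, the same result can be obtained with the built-in Array.prototype.filter() and Array.prototype.reduce() methods. The following is a sketch, not part of the original solution; the names data, usable and average are illustrative −

```javascript
const data = [
   {val: 56, canUse: true},
   {val: 16, canUse: true},
   {val: 45, canUse: true},
   {val: 76, canUse: false},
   {val: 45, canUse: true},
   {val: 23, canUse: false},
   {val: 23, canUse: false},
   {val: 87, canUse: true},
];
// keep only the usable objects, then average their val properties
const usable = data.filter(({ canUse }) => canUse);
const average = usable.length
   ? usable.reduce((sum, { val }) => sum + val, 0) / usable.length
   : 0;
console.log(average); // 49.8
```

This version avoids manual index bookkeeping, at the cost of one extra pass over the array.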

Output

This will produce the following output in the console −

49.8

Only the five objects with canUse set to true contribute (56, 16, 45, 45 and 87), so the average is 249 / 5 = 49.8.

Updated on: 30-Sep-2020
