On each iteration, check whether the name currently being iterated over is already in the Set (and if so, return true), then add it to the Set, for an overall complexity of O(n):
const names = [
  {name: 'aa'},
  {name: 'bb'},
  {name: 'cc'},
  {name: 'aa'}
];
const namesSet = new Set();
const isDuplicatedName = names.some(({name}) => {
  if (namesSet.has(name)) {
    return true;
  }
  namesSet.add(name);
  return false;
});
console.log(isDuplicatedName); // true
(In contrast, calling .filter inside .some re-scans the whole array on every iteration, for O(n^2) complexity.)
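For comparison, the quadratic pattern being contrasted looks like this (a minimal sketch using the same sample data):

```javascript
const names = [
  {name: 'aa'},
  {name: 'bb'},
  {name: 'cc'},
  {name: 'aa'}
];

// For each element, .filter scans the entire array again looking for
// another row with the same name at a different index, so this is
// O(n) work inside an O(n) loop: O(n^2) overall.
const isDuplicatedName = names.some(({name}, index) =>
  names.filter((row, idx) => row.name === name && index !== idx).length > 0
);

console.log(isDuplicatedName); // true
```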
Looking at your original code: as a general rule, whenever you see code like arr.filter(callback).length, you can improve it by using reduce instead, which avoids constructing an unnecessary intermediate array:
return names.filter((row, idx) => row.name === name && index !== idx).length;
can be changed to
return names.reduce((countSoFar, row, idx) => countSoFar + (row.name === name && index !== idx), 0);
(which would probably be more appropriate, but would still be less efficient than the Set solution, since it remains an O(n) scan for every name checked)
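Put together, the reduce version might look like this; hasDuplicateAt is a hypothetical helper name standing in for wherever the original return statement lived, and the check still runs once per element as in the original code:

```javascript
const names = [
  {name: 'aa'},
  {name: 'bb'},
  {name: 'cc'},
  {name: 'aa'}
];

// Counts how many OTHER rows share this row's name; the boolean
// expression is coerced to 0 or 1 when added to countSoFar, so no
// intermediate filtered array is built.
const hasDuplicateAt = (name, index) =>
  names.reduce(
    (countSoFar, row, idx) =>
      countSoFar + (row.name === name && index !== idx),
    0
  ) > 0;

const isDuplicatedName = names.some(({name}, index) => hasDuplicateAt(name, index));
console.log(isDuplicatedName); // true
```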