33

I have an array of objects

list = [{x:1,y:2}, {x:3,y:4}, {x:5,y:6}, {x:1,y:2}]

And I'm looking for an efficient way (if possible O(log(n))) to remove duplicates and to end up with

list = [{x:1,y:2}, {x:3,y:4}, {x:5,y:6}]

I've tried _.uniq or even _.contains but couldn't find a satisfying solution.

Thanks!

Edit: The question has been identified as a duplicate of another one. I saw that question before posting, but it didn't answer mine, since this is an array of objects and not a 2-dimensional array (thanks Aaron) — or at least the solutions on the other question weren't working in my case.

2 Comments
  • 1
    Possible duplicate of Remove Duplicates from JavaScript Array Commented Mar 16, 2016 at 9:56
  • 3
    Note that this is not a 2-dimensional array but rather an array of objects. Commented Mar 16, 2016 at 9:57

10 Answers

40

Plain JavaScript (ES2015), using Set:

const list = [{ x: 1, y: 2 }, { x: 3, y: 4 }, { x: 5, y: 6 }, { x: 1, y: 2 }];

const uniq = new Set(list.map(e => JSON.stringify(e)));

const res = Array.from(uniq).map(e => JSON.parse(e));

document.write(JSON.stringify(res));


4 Comments

stringify and parse don't look like the most performant way to achieve this.
@AminJafari There's nothing wrong with var usage even in 2020. As long as scoping is taken care of, var is absolutely fine. While this answer is not the most performant, it works as per the OP's question
@SeaWarrior404 I know, that's why I upvoted :) It's just not a good practice to use var where you can use const
JSON.stringify is a correct, generic way to identify distinct objects with equal contents. However, most answers that use JSON.stringify, including this one, only work if the keys are always specified in the same order. See my answer on how to use a replacer function so that the solution even works when the order of the keys varies: stackoverflow.com/a/71102751/1166087.
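To illustrate the order-independent variant mentioned in the last comment in plain JS, here is a sketch (the helper name `stableStringify` is my own) that sorts keys before stringifying:

```javascript
// Hedged sketch: a stringify that sorts object keys, so {x:1, y:2} and
// {y:2, x:1} produce the same string and dedupe to a single entry.
function stableStringify(obj) {
  return JSON.stringify(obj, (key, value) =>
    value && typeof value === 'object' && !Array.isArray(value)
      ? Object.keys(value).sort().reduce((acc, k) => (acc[k] = value[k], acc), {})
      : value
  );
}

const list = [{ x: 1, y: 2 }, { y: 2, x: 1 }, { x: 3, y: 4 }];
const unique = [...new Set(list.map(stableStringify))].map(s => JSON.parse(s));
// unique holds two objects: {x:1, y:2} and {x:3, y:4}
```

The replacer runs on every nested object as well, so nested key order is also normalized.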
32

Try using the following:

list = list.filter((elem, index, self) =>
  self.findIndex((t) => t.x === elem.x && t.y === elem.y) === index
)

2 Comments

Just a question, can't you just use Object.is() for your filter's comparison? developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/…
Hello! I've never seen self used before, as in .filter((elem, index, self) =>. What does self refer to? I get that elem refers to the element value and index to its position in the array. If someone could point me in the direction of docs, that'd be great!
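Regarding the comment above: the third callback argument is simply the array that filter was called on (documented on MDN under Array.prototype.filter). A minimal sketch:

```javascript
// The filter callback receives (element, index, array); `self` in the answer
// above is that third argument, i.e. the array being filtered.
const nums = [1, 2, 2, 3, 1];
const firstOccurrences = nums.filter((elem, index, self) => self.indexOf(elem) === index);
// firstOccurrences is [1, 2, 3]: an element is kept only at its first index
```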
16

Vanilla JS version:

const list = [{x:1,y:2}, {x:3,y:4}, {x:5,y:6}, {x:1,y:2}];

function dedupe(arr) {
  return arr.reduce(function(p, c) {

    // create an identifying id from the object values
    var id = [c.x, c.y].join('|');

    // if the id is not found in the temp array
    // add the object to the output array
    // and add the key to the temp array
    if (p.temp.indexOf(id) === -1) {
      p.out.push(c);
      p.temp.push(id);
    }
    return p;

    // return the deduped array
  }, {
    temp: [],
    out: []
  }).out;
}

console.log(dedupe(list));

Comments

14

I would use a combination of the Array.prototype.reduce and Array.prototype.some methods with the spread operator.

1. Explicit solution. Based on complete knowledge of the objects the array contains.

list = list.reduce((r, i) => 
  !r.some(j => i.x === j.x && i.y === j.y) ? [...r, i] : r
, [])

Here we have a strict limitation on the structure of the compared objects: {x: N, y: M}. And [{x:1, y:2}, {x:1, y:2, z:3}] will be filtered to [{x:1, y:2}].

2. Generic solution, JSON.stringify(). The compared objects can have any number of properties.

list = list.reduce((r, i) => 
  !r.some(j => JSON.stringify(i) === JSON.stringify(j)) ? [...r, i] : r
, [])

This approach is sensitive to property order, so [{x:1, y:2}, {y:2, x:1}] won't be filtered.

3. Generic solution, Object.keys(). The order doesn't matter.

list = list.reduce((r, i) => 
  !r.some(j => !Object.keys(i).some(k => i[k] !== j[k])) ? [...r, i] : r
, [])

This approach has another limitation: compared objects must have the same list of keys. So [{x:1, y:2}, {x:1}] would be filtered despite the obvious difference.

4. Generic solution, Object.keys() + .length.

list = list.reduce((r, i) => 
  !r.some(j => Object.keys(i).length === Object.keys(j).length 
    && !Object.keys(i).some(k => i[k] !== j[k])) ? [...r, i] : r
, [])

With the last approach, objects are compared by their number of keys, by the keys themselves, and by the key values.

I created a Plunker to play with it.
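To see how the last approach behaves on mixed key counts, here is a small self-check (the example data and the `dedupe` wrapper are mine):

```javascript
// Approach 4 wrapped as a function: an object is a duplicate only when some
// already-kept object has the same number of keys AND no differing key value.
const dedupe = arr => arr.reduce((r, i) =>
  !r.some(j => Object.keys(i).length === Object.keys(j).length
    && !Object.keys(i).some(k => i[k] !== j[k])) ? [...r, i] : r
, []);

const result = dedupe([{ x: 1, y: 2 }, { x: 1 }, { y: 2, x: 1 }]);
// {x:1} is kept (different key count); {y:2, x:1} is dropped as a duplicate
```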

4 Comments

Can you please explain what's happening in the 4th approach, especially with the array part and the some comparison? Its a bit dense to understand
@SeaWarrior404 The 4th approach combines the key/value comparison from approach 3 (the second "some") with a length comparison (the first "some"): the key count of each candidate object is compared with that of the objects already collected, to ensure both have the same number of fields. If a candidate has the same number of fields as an already-kept object, it goes on to the key/value comparison; if the key counts differ, the objects are treated as distinct and the candidate is kept.
You don't need to recreate an array at each iteration. you can use r.push(i) instead. if (!r.some(j => JSON.stringify(i) === JSON.stringify(j))) r.push(i); return r
thumbs up for the generic solution, just what I needed!
6

One-liners for ES6+

If you want to find uniq by x and y:

arr.filter((v,i,a)=>a.findIndex(t=>(t.x === v.x && t.y===v.y))===i)

If you want to find uniques by all properties:

arr.filter((v,i,a)=>a.findIndex(t=>(JSON.stringify(t) === JSON.stringify(v)))===i)

Comments

4

The following will work:

var a = [{x:1,y:2}, {x:3,y:4}, {x:5,y:6}, {x:1,y:2}];

var b = _.uniq(a, function(v) { 
    return v.x && v.y;
})

console.log(b);  // [ { x: 1, y: 2 }, { x: 3, y: 4 }, { x: 5, y: 6 } ]

10 Comments

Thanks! It works for me — is it faster than Andy's solution?
I have no idea. It's less code to write and underscore seems to be included either way... @kwn
Unless you're dealing with 10s of 1000s of elements, performance shouldn't be an issue. I guess you have to weigh up the cost of loading a separate library (if lodash isn't in your stack already) or just using a slightly more verbose vanilla function.
I don't think this works properly, e.g. 1 && 2 == 2 && 2
If the input was [{x:1, y: 2}, {x: 2. y:2}] then the result would be [{x:1, y:2}] - the second object would be removed even though it's not a duplicate
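As the comments note, v.x && v.y evaluates to just one truthy value, so distinct pairs can collide. A hedged fix (sketch) is an iteratee that builds a composite string key instead:

```javascript
// Composite-key iteratee: distinct (x, y) pairs map to distinct strings,
// unlike `v.x && v.y`, which only evaluates to the last truthy operand.
var key = function (v) { return v.x + '|' + v.y; };

// With Underscore loaded: var b = _.uniq(a, key);
```

With this iteratee, [{x:1, y:2}, {x:2, y:2}] is no longer collapsed.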
4

Filter the array in O(n) by checking whether each element's key is already present in a temporary lookup object.

var list = [{ x: 1, y: 2 }, { x: 3, y: 4 }, { x: 5, y: 6 }, { x: 1, y: 2 }],
    filtered = function (array) {
        var o = {};
        return array.filter(function (a) {
            var k = a.x + '|' + a.y;
            if (!o[k]) {
                o[k] = true;
                return true;
            }
        });
    }(list);

document.write('<pre>' + JSON.stringify(filtered, 0, 4) + '</pre>');

Comments

0

No libraries, and it works with any depth.

Limitation:

  • The hash callback must return only String or Number values; otherwise you'll get inconsistent results
/** 
 * Implementation, you can convert this function to the prototype pattern to allow
 * usage like `myArray.unique(...)`
 */ 
function unique(array, f) {
  return Object.values(
    array.reduce((acc, item) => ({ ...acc, [f(item).join(``)]: item }), {})
  );
}

const list = [{ x: 1, y: 2}, {x: 3, y: 4}, { x: 5, y: 6}, { x: 1, y: 2}];

// Usage
const result = unique(list, item => [item.x, item.y]);

// Output: [{ x: 1, y: 2}, {x: 3, y: 4}, { x: 5, y: 6}]
console.log(result); 


Comments

0

With Underscore's _.uniq and the standard JSON.stringify, it is a one-liner:

var list = [{x:1,y:2}, {x:3,y:4}, {x:5,y:6}, {x:1,y:2}];

var deduped = _.uniq(list, JSON.stringify);

console.log(deduped);
<script src="https://underscorejs.org/underscore-umd-min.js"></script>

However, this presumes that the keys are always specified in the same order. By making the iteratee more sophisticated, we can make the solution work even when the order of the keys varies. This problem, as well as the solution, also applies to other answers that involve JSON.stringify.

var list = [{x:1,y:2}, {x:3,y:4}, {x:5,y:6}, {y:2, x:1}];

// Ensure that objects are always stringified
// with the keys in alphabetical order.
function replacer(key, value) {
    if (!_.isObject(value)) return value;
    var sortedKeys = _.keys(value).sort();
    return _.pick(value, sortedKeys);
}

// Create a modified JSON.stringify that always
// uses the above replacer.
var stringify = _.partial(JSON.stringify, _, replacer, null);

var deduped = _.uniq(list, stringify);

console.log(deduped);
<script src="https://underscorejs.org/underscore-umd-min.js"></script>

For Lodash 4, use _.uniqBy instead of _.uniq.

Comments

-2

Using lodash you can use this one-liner:

 _.uniqBy(list, e => { return e.x && e.y })

Comments
