41

In my CouchDB reduce function I need to reduce a list of items to the unique ones.

Note: in this case it's OK to return a list; it will be a small number of items, all strings.

My current approach is to set the keys of an object and then return that object's keys, since in this context the code can't use helpers like _.uniq, for example.

I'd like to find a more elegant way to spell it than this.

function(keys, values, rereduce) {
  // values is an Array of Arrays; flatten it into a single array
  values = [].concat.apply([], values);
  // collect unique strings as object keys
  var uniq = {};
  values.forEach(function(item) { uniq[item] = true; });
  return Object.keys(uniq);
}
4 Comments
  • depending on what you define as elegant, you could look up the source of underscore's unique on GitHub Commented Jul 27, 2012 at 15:35
  • underscore is more expensive for the string-only case, and less elegant here because it has to handle the general case Commented Jul 27, 2012 at 22:34
  • Do you need a reduce function? If you only need the unique values you can use the group=true option when requesting the view (a sketch of that follows these comments). For more info on that see the CouchDB Wiki Commented Jul 31, 2012 at 19:20
  • 1 Possible duplicate of Get all unique values in an array (remove duplicates): stackoverflow.com/questions/1960473/… Commented Nov 18, 2017 at 20:44
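
For the group=true suggestion above, a minimal sketch of what that could look like (the database, design-document, and field names here are hypothetical): the map function emits each string as a key, the built-in _count reduce is used, and querying with group=true returns each distinct key exactly once.

// Hypothetical design document _design/items in a database named "db"
{
  "views": {
    "unique_values": {
      "map": "function(doc) { (doc.values || []).forEach(function(v) { emit(v, null); }); }",
      "reduce": "_count"
    }
  }
}

// Query: GET /db/_design/items/_view/unique_values?group=true
// Returns one row per distinct key, e.g.
// {"rows":[{"key":"a","value":3},{"key":"b","value":1}]}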

9 Answers

74

The best method seems to be using ES6 and Set: a single line, and faster* than the above according to the fiddle.

const myList = [1, 4, 5, 1, 2, 4, 5, 6, 7];
const unique = [...new Set(myList)];

console.log(unique);

*Tested in Safari.


3 Comments

you can also just do new Set([1,2,4,6]) which is less ugly imo
new Set(list) will only create a Set.
yeah you're right, I meant Array.from(new Set([1,2,4,6])), see my answer
38

2021 answer:

const unique = (arr) => [...new Set(arr)];
unique([1, 2, 2, 3, 4, 4, 5, 1]); // [1, 2, 3, 4, 5]

Here you just create a Set from the given array and then convert it back to an array. I measured performance and it's now almost twice as fast as the approach proposed in the old answer below. It's also just a one-liner.

Updated fiddle
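
A rough way to reproduce that comparison locally (a sketch; the labels, data, and timings are illustrative and will vary by engine):

const data = Array.from({ length: 100000 }, () => Math.floor(Math.random() * 1000));

console.time('Set');
const bySet = [...new Set(data)];
console.timeEnd('Set');

console.time('hash object');
const byHash = (function (arr) {
  var u = {}, a = [];
  for (var i = 0, l = arr.length; i < l; ++i) {
    if (!u.hasOwnProperty(arr[i])) {
      a.push(arr[i]);
      u[arr[i]] = 1;
    }
  }
  return a;
})(data);
console.timeEnd('hash object');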

Old answer just for the record:

Generally, the approach you used is a good idea, but I can propose a solution that makes the algorithm a lot faster.

function unique(arr) {
    var u = {}, a = [];
    for(var i = 0, l = arr.length; i < l; ++i){
        if(!u.hasOwnProperty(arr[i])) {
            a.push(arr[i]);
            u[arr[i]] = 1;
        }
    }
    return a;
}

As you can see we have only one loop here.

I've made an example that tests both your solution and mine. Try playing with it.

8 Comments

The speed of the algorithm is always important when you are dealing with collections and loops.
If I were to use for(var i=0; i<arr.length; ++i) would it recount the array's length on every loop? Or why create a separate l variable?
@MarkusMeskanen It depends on the particular JavaScript engine implementation. Also, don't forget about the property lookup time (both loop forms are sketched after these comments).
Opening the testing link through Chrome today shows little to no difference in the testing speeds. Is this answer still up to date, or has the performance difference been optimized away by browsers? I've updated a jsfiddle along with my indexOf proposition.
@FélixAdriyelGagnon-Grenier Strange. I still get a ratio of around 0.7-0.8 in Chrome on Linux, but sometimes it's >1. So it looks like the JS engine works differently nowadays.
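
For reference, the two loop forms discussed in these comments (a sketch; whether caching the length matters is engine-dependent, as noted above):

var arr = ['a', 'b', 'c'];

// Reads arr.length on every iteration (modern engines usually optimize this).
for (var i = 0; i < arr.length; ++i) {
  console.log(arr[i]);
}

// Caches the length in l once, avoiding the repeated property lookup.
for (var j = 0, l = arr.length; j < l; ++j) {
  console.log(arr[j]);
}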
13

An alternative that's suitable for small lists would be to ape the Unix command line approach of sort | uniq:

    function unique(a) {
        return a.sort().filter(function(value, index, array) {
            return (index === 0) || (value !== array[index-1]);
        });
    }

This function sorts the argument, and then filters the result to omit any items that are equal to their predecessor.

The keys-based approach is fine, and will have better performance characteristics for large numbers of items (O(n) for inserting n items into a hash table, versus O(n log n) for sorting the array). However, this is unlikely to be noticeable on small lists. Moreover, you could modify this version to use a different sorting or equality function if necessary; with hash keys you're stuck with JavaScript's notion of key equality.
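
For example, a variation of the sort-then-filter function that accepts a comparator and an equality test (a sketch; the case-insensitive comparison is just an illustration):

function uniqueBy(a, compare, equals) {
  return a.slice().sort(compare).filter(function(value, index, array) {
    return (index === 0) || !equals(value, array[index - 1]);
  });
}

// Example: treat strings as equal regardless of case.
var lower = function(s) { return s.toLowerCase(); };
uniqueBy(
  ['Apple', 'apple', 'Banana'],
  function(x, y) { return lower(x) < lower(y) ? -1 : lower(x) > lower(y) ? 1 : 0; },
  function(x, y) { return lower(x) === lower(y); }
); // ['Apple', 'Banana']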

1 Comment

Beautiful, was able to use this in ember.js to filter a recordarray with Arrayproxy's "filter" function
8

This should work with anything, not just strings:

export const getUniqueList = (a: Array<any>): Array<any> => {
  const set = new Set<any>();
  for (const v of a) {
    set.add(v);
  }
  return Array.from(set);
};

The above can just be reduced to:

export const getUniqueValues = (a: Array<any>) => {
   return Array.from(new Set(a));
};

:)

Comments

2

To get unique objects, you can use JSON.stringify and JSON.parse:

const arr = [{test: "a"}, {test: "a"}];
const unique = Array.from(new Set(arr.map(JSON.stringify))).map(JSON.parse);
console.log(unique);

1 Comment

This is the only one that worked with a list of lists and found the unique lists
0

Using Object.keys will give you strings if you put in integer arguments (uniq([1,2,3]) => ['1','2','3']). Here's one with Array.reduce:

function uniq(list) {
    return list.reduce((acc, d) => acc.includes(d) ? acc : acc.concat(d), []);
}

Comments

0

This is an old question, I know. However, it is at the top of some google searches, so I wanted to add that you can combine the answers from @RobHague and @EugeneNaydenov using the following:

function unique(arr) {
  const u = {};
  return arr.filter((v) => {
    return u[v] = !u.hasOwnProperty(v);
  });
};

You can also ignore undefined values (often handy) by adding:

function unique(arr) {
  const u = {};
  return arr.filter((v) => {
    return u[v] = (v !== undefined && !u.hasOwnProperty(v));
  });
};

You can play with this solution here: https://jsfiddle.net/s8d14v5n/

1 Comment

maybe use new Set for more fun
0

I find the other answers to be rather complicated for no gain that I can see.

We can use the indexOf method of the Array to verify if an item exists in it before pushing:

const duplicated_values = ['one', 'one', 'one', 'one', 'two', 'three', 'three', 'four'];
const unique_list = [];

duplicated_values.forEach(value => {
  if (unique_list.indexOf(value) === -1) {
    unique_list.push(value);
  }
});

console.log(unique_list);

That will work with any type of variable as well, even objects (provided the identifiers actually reference the same entity; merely equivalent objects are not treated as the same).
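
To illustrate that caveat (a sketch; the variable names are arbitrary):

const a = { id: 1 };
const b = { id: 1 }; // equivalent to a, but a distinct object
const values = [a, a, b];

const unique_list = [];
values.forEach(value => {
  if (unique_list.indexOf(value) === -1) {
    unique_list.push(value);
  }
});

console.log(unique_list); // [a, b]: the repeated reference to a is dropped, but the equivalent object b is kept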

Comments

-1

What about:

    function unique(list) {
      for (i = 0; i<list.length; i++) {
        for (j=i+1; j<list.length; j++) {
          if (list[i] == list[j]) {
            list.splice(j, 1);
          }
        }
      }
    }

1 Comment

You need to decrement j after running list.splice(). This kind of O(N^2) solution will work on small arrays, but I wouldn't use it once the arrays get bigger.
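
For the record, a corrected sketch of that in-place approach (still O(N^2)): it declares the loop variables and decrements j after the splice so the element that shifts into position j is not skipped.

function unique(list) {
  for (var i = 0; i < list.length; i++) {
    for (var j = i + 1; j < list.length; j++) {
      if (list[i] == list[j]) {
        list.splice(j, 1); // remove the duplicate in place
        j--;               // re-check the index the next element shifted into
      }
    }
  }
  return list;
}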