Since the other responses are either vague about execution or suggest writing a new collection and other "unsafe" operations, a response you can actually follow seems to be in order.
It's actually not a simple problem to solve ( though others have dismissed it as such ) once you take into account considerations like "keeping the order of elements" and "not possibly overwriting other changes to the document", both of which matter in a production system.
The only really "safe(ish)" way I can see of doing this, without possibly (mostly) blowing away your whole array and losing any "active" writes, is with a construct like this ( using Bulk Operations for efficiency ):
```javascript
var bulk = db.collection.initializeOrderedBulkOp(),
    count = 0;

db.collection.find({ "l": { "$type": 18 } }).forEach(function(doc) {
    // First remove every existing NumberLong element from the array
    bulk.find({ "_id": doc._id }).updateOne({
        "$pull": { "l": { "$in": doc.l } }
    });
    count++;

    if ( count % 1000 == 0 ) {
        bulk.execute();
        bulk = db.collection.initializeOrderedBulkOp();
    }

    // Then re-insert each value, converted to a double, at its original index
    doc.l.map(function(l, idx) {
        return { "l": parseFloat(l.valueOf()), "idx": idx };
    }).forEach(function(l) {
        bulk.find({ "_id": doc._id }).updateOne({
            "$push": { "l": { "$each": [l.l], "$position": l.idx } }
        });
        count++;

        if ( count % 1000 == 0 ) {
            bulk.execute();
            bulk = db.collection.initializeOrderedBulkOp();
        }
    });
});

// Flush any operations still queued after the loop
if ( count % 1000 != 0 )
    bulk.execute();
```
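To make the mechanics concrete, here is a plain-JavaScript model ( not the shell code above, and `NumberLongLike` is just a stand-in I've made up for the shell's `NumberLong` type ) of what the two updates do to a single document's array: the `$pull` stage removes the existing values, then the `$push` with `$each`/`$position` stage re-inserts the converted doubles at their original indexes:

```javascript
// Stand-in for the shell's NumberLong: valueOf() yields the numeric value.
function NumberLongLike(v) { this.v = v; }
NumberLongLike.prototype.valueOf = function () { return this.v; };

// Model of the $pull stage: remove every element whose value is in `values`.
function pullIn(arr, values) {
  var plain = values.map(Number);
  return arr.filter(function (el) { return plain.indexOf(Number(el)) === -1; });
}

// Model of the $push/$each/$position stage: insert `value` at index `idx`.
function pushAt(arr, value, idx) {
  var copy = arr.slice();
  copy.splice(idx, 0, value);
  return copy;
}

var doc = { _id: 1, l: [ new NumberLongLike(114770288670819),
                         new NumberLongLike(10150097174480584) ] };

// Stage 1: $pull { "l": { "$in": doc.l } } empties the array.
var result = pullIn(doc.l, doc.l);

// Stage 2: one $push per element, at its original position.
doc.l.map(function (l, idx) {
  return { l: parseFloat(l.valueOf()), idx: idx };
}).forEach(function (l) {
  result = pushAt(result, l.l, l.idx);
});

console.log(result); // [ 114770288670819, 10150097174480584 ] as plain doubles
```

The order of the array survives because each converted value is pushed back at the exact index it was read from.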
This is meant to "bulk" rewrite the data in your entire collection where needed, but on your single sample the result is:
{ "_id" : 1, "l" : [ 114770288670819, 10150097174480584 ] }
So the basics are that this will find any document in your collection where the array contains elements matching the BSON $type for NumberLong ( 18, the 64-bit integer type ) and process the results in the following way.
First remove all the elements from the array with the values of the elements already there.
Convert all elements and add them back into the array at the same position they were found in.
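For the single sample document, the update documents the loop queues would look like this ( a plain-JavaScript sketch that only builds the update documents, without sending them; in the real shell the input values would be `NumberLong` instances ):

```javascript
var doc = { _id: 1, l: [114770288670819, 10150097174480584] };

// One $pull for the whole array, then one positional $push per element.
var updates = [ { "$pull": { "l": { "$in": doc.l } } } ];
doc.l.forEach(function (l, idx) {
  updates.push({ "$push": { "l": { "$each": [parseFloat(l)], "$position": idx } } });
});

console.log(JSON.stringify(updates, null, 1));
```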
Now I say "safe(ish)" because, while this is mostly "safe" even if your documents are being updated, having new items appended to the array, or undergoing other alterations while it runs, there is one possible problem here.
If your array "values" here are not truly "unique", then there is no real way to "pull" the items that need to be changed and also re-insert them at the same positions where they originally occurred. Not in an "atomic" and "safe" way at any rate.
So the "risk" this process runs is that "if" you happen to append a new array element with the same value as an existing element ( and specifically if this happens between the cursor read for the document and the bulk execution ), then that new item can be removed from the array entirely.
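To see why duplicates are dangerous, here is a plain-JavaScript model of the `$pull` stage with hypothetical values: `$pull` with `$in` removes every element matching any listed value, so a duplicate appended by a concurrent writer between the cursor read and the bulk execution is removed along with the original and never re-inserted:

```javascript
// Model of $pull { "l": { "$in": values } }: removes ALL matching elements.
function pullIn(arr, values) {
  return arr.filter(function (el) { return values.indexOf(el) === -1; });
}

// Snapshot of the array as read by the cursor (hypothetical values).
var snapshot = [5, 7];

// A concurrent writer appends a duplicate of an existing value
// after the cursor read but before the bulk execution.
var live = snapshot.concat([7]); // [5, 7, 7]

// The $pull stage removes every 5 and every 7, including the new one.
var afterPull = pullIn(live, snapshot); // []

// Re-insertion restores only the two snapshot elements, so the
// concurrently appended 7 is silently lost.
var restored = afterPull.slice();
snapshot.forEach(function (v, idx) { restored.splice(idx, 0, v); });

console.log(restored); // [5, 7] — the appended duplicate is gone
```

With unique values this cannot happen, since a concurrent append would by definition carry a value not in the snapshot and so would survive the `$pull`.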
If however all values in your array are completely "unique" per document, then this will work "perfectly", and just transpose everything back into place with converted types, and no other "risk" to missing other document updates or important actions.
It's not a simple statement, because the logic demands these precautions. But within the constraints already mentioned it cannot fail, and it runs on the collection you have now, without producing a new one.