Is there any chance for a change in this behavior in future ECMAScript versions?
I would say the chances are very slim. And there are several reasons:
We already know what ES5 and ES6 look like
The upcoming ES versions are already finished or in draft, and neither one, as far as I know, changes this behavior. Also keep in mind that it will take years for these standards to be established in browsers, in the sense that you can write applications against them without relying on tools that compile your code down to the JavaScript browsers actually run today.
Just try to estimate the duration: not even ES5 is fully supported by the majority of browsers out there, and it will probably take another few years. ES6 is not even fully specified yet. Off the top of my head, we are looking at at least another five years.
Browsers do their own things
Browsers are known to make their own decisions on certain topics. You don't know whether all browsers will fully support this feature in exactly the same way. Of course you would know once it is part of the standard, but as of now, even if it were announced to become part of ES7, relying on it would be speculation at best.
And browsers may make their own decision here especially because:
This change is breaking
One of the most important properties of standards is that they usually try to be backwards compatible. This is especially true for the web, where the same code has to run in all kinds of environments.
If the standard introduces a new feature and it's not supported in old browsers, that's one thing: tell your client to update their browser to use the site. But if you update your browser and suddenly half the internet breaks for you, that's a big no-no.
Sure, this particular change is unlikely to break a lot of scripts. But that's usually a poor argument, because a standard is universal and has to take every case into account. Just consider
"use strict";
as the instruction to switch to strict mode. It goes to show how much effort a standard puts into keeping everything compatible: they could have made strict mode the default (or even the only) mode, but with this opt-in instruction, old code runs unchanged while new code can still take advantage of the stricter semantics.
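A minimal sketch of how that opt-in preserves old code (the variable names here are made up for illustration):

```javascript
function legacy() {
  oops = 1;    // sloppy mode: silently creates a global variable
}

function modern() {
  "use strict";
  oops2 = 1;   // strict mode: assigning to an undeclared variable throws
}

legacy();      // still runs exactly as it did before ES5

try {
  modern();
} catch (e) {
  console.log(e instanceof ReferenceError); // true
}
```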
Another example of backwards compatibility: the === operator. == is fundamentally flawed (though some people disagree), and the standard could simply have changed its meaning. Instead, === was introduced, so old code still runs without breaking while new programs can be written using the stricter check.
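A few of the coercion rules == applies, contrasted with ===:

```javascript
// == coerces the operands to a common type before comparing
console.log(0 == "");           // true  — both coerce to the number 0
console.log(0 == "0");          // true
console.log("" == "0");         // false — coercion is not even transitive
console.log(null == undefined); // true

// === compares without coercion; different types are never equal
console.log(0 === "");           // false
console.log(0 === "0");          // false
console.log(null === undefined); // false
```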
And for a standard to break compatibility, there has to be a very good reason. Which brings us to
There is just no good reason
Yes, it bugs you. That's understandable. But ultimately, it is nothing that can't be solved very easily: use ||, write a function – whatever. You can make it work at almost no cost, as the sketch below shows. So what really is the benefit of investing all the time and effort into specifying and analyzing a change which, as argued above, is breaking anyway? I just don't see the point.
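To make that concrete, here is a minimal sketch of both workarounds, assuming the behavior in question is null and undefined being stringified to "null"/"undefined" during concatenation (str is just an illustrative helper name):

```javascript
var maybeNull = null;

// The default behavior being complained about:
console.log("value: " + maybeNull);         // "value: null"

// Workaround 1: || falls back to "" for any falsy value
console.log("value: " + (maybeNull || "")); // "value: "

// Workaround 2: a tiny helper, if falsy-but-valid values like 0 matter
function str(x) {
  return x == null ? "" : String(x); // == null catches both null and undefined
}
console.log("value: " + str(0));            // "value: 0" (|| would drop the 0)
```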
JavaScript has several weak points in its design, and they have become a bigger issue as the language has grown more important and powerful. But while there are very good reasons to change a lot of its design, some things just aren't meant to be changed.
Disclaimer: This answer is partly opinion-based.
Coming from an R background, in R you get what you said: paste0("a", NULL) == "a". JavaScript instead stringifies the operands (for objects, via their toString or valueOf methods). I would imagine there are as many circumstances where you would want them to show as null or undefined as there are where you would want them to show as an empty string. Fortunately, I hardly ever run into this problem, since I often concatenate strings with join (it solves other issues too, such as doing concatenations inside ternary (?) expressions).
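For the record, Array.prototype.join converts null and undefined elements to empty strings per the spec, which is why it sidesteps the problem; a small sketch:

```javascript
// null and undefined elements become "" when joined
console.log(["a", null].join(""));            // "a"
console.log(["a", undefined, "b"].join("-")); // "a--b"

// It also keeps ternary-based concatenation readable
var user = null;
console.log(["Hello", user ? ", " + user : "", "!"].join("")); // "Hello!"
```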