In Scala, the Int type corresponds to Java's primitive int type.
The Scala type system is divided into the AnyVal and AnyRef hierarchies, both of which are subtypes of Any.
The AnyVal hierarchy contains Int, Long, etc., corresponding to the Java primitive types. Each of these AnyVals can be boxed into its corresponding AnyRef type (which is java.lang.Integer in the case of Int).
Casting from Int to Integer (called boxing) happens via an Int.box call, whose implementation comes from the runtime method scala.runtime.BoxesRunTime.boxToInteger, defined as:
public static java.lang.Integer boxToInteger(int i) {
    return java.lang.Integer.valueOf(i);
}
So this first adapts the Scala i: Int to a Java int, which is then converted to an Integer using java.lang.Integer.valueOf(i).
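The effect of boxToInteger can be observed directly in plain Java. One detail worth knowing: java.lang.Integer.valueOf caches instances for values in the range -128 to 127 (the upper bound is tunable via -XX:AutoBoxCacheMax), so small boxed values are reference-equal while larger ones generally are not:

```java
public class BoxingDemo {
    public static void main(String[] args) {
        // Integer.valueOf is what boxToInteger delegates to.
        Integer a = Integer.valueOf(42);
        Integer b = Integer.valueOf(42);
        // Values in [-128, 127] come from a shared cache, so both
        // references point at the same object.
        System.out.println(a == b);       // true

        Integer c = Integer.valueOf(1000);
        Integer d = Integer.valueOf(1000);
        // Outside the cached range (with default JVM settings) a
        // fresh object is allocated each time.
        System.out.println(c == d);       // false
        System.out.println(c.equals(d));  // true
    }
}
```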
Similarly, casting from Integer to Int (called unboxing) is done by Int.unbox, and its implementation comes from the runtime method scala.runtime.BoxesRunTime.unboxToInt, defined as:
public static int unboxToInt(Object i) {
    return i == null ? 0 : ((java.lang.Integer)i).intValue();
}
Notice the null check in i == null ? 0 : ((java.lang.Integer)i).intValue(). This unboxing has a little "improvement" to "increase null safety", yet it can still throw a ClassCastException, since the input can be any Object.
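Both behaviours are easy to reproduce by copying that method into plain Java:

```java
public class UnboxDemo {
    // Same logic as scala.runtime.BoxesRunTime.unboxToInt.
    static int unboxToInt(Object i) {
        return i == null ? 0 : ((java.lang.Integer) i).intValue();
    }

    public static void main(String[] args) {
        // null is silently turned into 0 instead of failing.
        System.out.println(unboxToInt(null));   // 0
        System.out.println(unboxToInt(7));      // 7
        try {
            // Any non-Integer argument blows up at the cast.
            unboxToInt("not a number");
        } catch (ClassCastException e) {
            System.out.println("ClassCastException");
        }
    }
}
```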
Scala's Predef has the following two implicits to do this casting implicitly whenever required.
implicit def int2Integer(x: Int): java.lang.Integer = x.asInstanceOf[java.lang.Integer]
implicit def Integer2int(x: java.lang.Integer): Int = x.asInstanceOf[Int]
Notice that, starting with Scala 2.12.0, these methods do a type cast and do not call any method like x.intValue. (Up to Scala 2.11.x they did, so you would see the same NullPointerException as in Java; but that was inconsistent with explicit Int.box and Int.unbox calls, which might have been the reason for changing the implicit definitions.)
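For comparison, plain Java auto-unboxing of a null Integer always fails with a NullPointerException — the behaviour the pre-2.12 implicits mirrored:

```java
public class AutoUnboxDemo {
    public static void main(String[] args) {
        Integer boxed = null;
        try {
            // Java desugars this assignment to boxed.intValue(),
            // so a null reference fails immediately.
            int i = boxed;
            System.out.println(i);
        } catch (NullPointerException e) {
            System.out.println("NullPointerException");
        }
    }
}
```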
Scala-Java interop has many more "unexpected" cases, so you need to be very careful when using Scala with almost any Java library (and, interestingly, we use Java libraries for almost everything).
Here are some surprises just with int, Int and Integer.
public class JavaIntMagic {
    public static int toPrimitiveInt(Integer integer) {
        return integer;
    }

    public static Integer toInteger(int i) {
        return i;
    }
}
object ScalaIntMagic {
  def toPrimitiveInt(integer: java.lang.Integer): Int = integer
  def toInteger(i: Int): java.lang.Integer = i
}
These implementations behave differently depending on whether they are called from Scala or from Java code.
public class JavaMain {
    public static void main(String[] args) {
        Integer integer = null;

        int i1 = ScalaIntMagic.toPrimitiveInt(integer);
        // 0

        int i2 = JavaIntMagic.toPrimitiveInt(integer);
        // fails with NullPointerException

        Integer integer1 = ScalaIntMagic.toInteger(integer);
        // fails with NullPointerException

        Integer integer2 = JavaIntMagic.toInteger(integer);
        // fails with NullPointerException
    }
}
object ScalaMain {
  def main(args: Array[String]): Unit = {
    val integer: java.lang.Integer = null

    val i1: Int = ScalaIntMagic.toPrimitiveInt(integer)
    // 0

    val i2: Int = JavaIntMagic.toPrimitiveInt(integer)
    // fails with NullPointerException

    val integer1: java.lang.Integer = ScalaIntMagic.toInteger(integer)
    // 0
    // why? because the "null safe" unboxing strikes again

    val integer2: java.lang.Integer = JavaIntMagic.toInteger(integer)
    // 0
    // why? because the "null safe" unboxing strikes again
  }
}
When using JDBC or other Java drivers, don't be surprised when your database tables magically end up with 0 where you expected null (or when writes fail). Add more and more unit tests, even for things that look very trivial.
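One defensive pattern (a sketch, not tied to any particular driver) is to keep the boxed Integer and make the missing-value case explicit before ever converting to a primitive, e.g. via java.util.OptionalInt. With JDBC specifically, remember that ResultSet.getInt returns 0 for NULL columns and must be paired with ResultSet.wasNull() to tell the two apart.

```java
import java.util.OptionalInt;

public class NullSafeUnbox {
    // Makes the "no value" case explicit instead of silently
    // collapsing it to 0 the way BoxesRunTime.unboxToInt does.
    static OptionalInt toOptionalInt(Integer boxed) {
        return boxed == null ? OptionalInt.empty() : OptionalInt.of(boxed);
    }

    public static void main(String[] args) {
        System.out.println(toOptionalInt(null)); // OptionalInt.empty
        System.out.println(toOptionalInt(42));   // OptionalInt[42]
    }
}
```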
It's weird that, on the one hand, we don't have "pure" Scala libraries for fundamental things like networking and databases and instead rely heavily on Java libraries wrapped in Scala APIs, yet on the other hand we don't put any emphasis on consistent and predictable Java interop (even for very fundamental things like int, boolean and bytes) for reliable use of those libraries in the Scala ecosystem.