This question is almost a duplicate of: Scala can't multiply java Doubles? - you can look at my answer there as well, since the idea is similar.
As Eastsun already hinted, the answer is an implicit conversion from a java.lang.Integer (basically a boxed int primitive) to a scala.Int, which is the Scala way of representing the JVM's primitive integers:
implicit def javaToScalaInt(d: java.lang.Integer): Int = d.intValue // unbox to a scala.Int
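For example, with that conversion in scope, a boxed java.lang.Integer can be passed wherever a scala.Int is expected (a minimal sketch; bar and boxed are made-up names, and the conversion is repeated so the snippet is self-contained):

import scala.language.implicitConversions

implicit def javaToScalaInt(d: java.lang.Integer): Int = d.intValue

def bar(n: scala.Int) = println(n * 2)
val boxed: java.lang.Integer = 21
bar(boxed) // boxed is implicitly unboxed to a scala.Int; prints 42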
With that conversion in scope, interoperability is achieved - the code snippet you've given should compile just fine. The opposite direction, code that uses a scala.Int where a java.lang.Integer is needed, works out of the box thanks to autoboxing. So the following works:
def foo(d: java.lang.Integer) = println(d)
val z: scala.Int = 1
foo(z)
Also, as michaelkebe said, do not use the Integer type, which is actually shorthand for scala.Predef.Integer: it is deprecated and will most probably be removed in Scala 2.8.
EDIT: Oops... I forgot to answer the why. The error you get is probably because scala.Predef.Integer tries to mimic Java's syntactic sugar, where a + "my string" means string concatenation when a is an int. The + method on scala.Predef.Integer therefore only does string concatenation (it expects a String argument) and provides no natural integer addition.
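To see the shape of the problem, here is a minimal stand-in (MyInteger is a made-up class, not the real scala.Predef.Integer) whose only + method does string concatenation, exactly as described above:

// A made-up stand-in for the old scala.Predef.Integer:
// its only + method expects a String, so there is no integer addition.
class MyInteger(val value: Int) {
  def +(s: String): String = value.toString + s
}

val a = new MyInteger(1)
val b = new MyInteger(2)

println(a + " apples") // compiles: string concatenation, prints "1 apples"
// println(a + b)      // does not compile: + expects a String, not a MyInteger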
-- Flaviu Cipcigan