
I am new to Scala. I want to parse JSON data in Scala.

I want to loop over this data and, in each iteration, extract id, v, q, and t from values.

I am using the code below to parse the JSON:

import scala.util.parsing.json._

val data =
  """
{
  "timestamp": 1518501114949,
  "values": [
    { "id": "abc", "v": 0,  "q": true, "t": 1518501114487 },
    { "id": "xyz", "v": 15, "q": true, "t": 1518501114494 }
  ]
}
"""

val parsed = JSON.parseFull(data)

I am getting the output below:

 Some(Map(timestamp -> 1.518501114949E12, values -> List(Map(id -> abc, v -> 0.0, q -> true, t -> 1.518501114487E12), Map(id -> xyz, v -> 15.0, q -> true, t -> 1.518501114494E12), Map(id -> klm, v -> 12.6999998, q -> true, t -> 1.518501114487E12), Map(id -> 901.Hotmelt.PSA.0759_PSAM01_Vac, v -> 1.0, q -> true, t -> 1.518501114494E12))))

but I don't know how to loop over the result and fetch the values after that,

and I don't understand why timestamp is being converted to E12 values.


3 Answers


The problem is that parseFull returns an Option with an Any inside, so you first need to unwrap that.

The code below keeps the values entry:

val listAsAny = parsed match {
  case Some(e: Map[String, Any] @unchecked) => e("values") // pull out the "values" entry
  case None => sys.error("Parsing failed.")                // fail fast instead of returning Unit
}

But it is still typed as Any, so you can cast it as follows:

val values = listAsAny.asInstanceOf[List[Map[String, Any]]]

Now values is a List of maps with the contents below, and you can access the fields inside as you would with a regular List:

List(Map(id -> abc, v -> 0.0, q -> true, t -> 1.518501114487E12), Map(id -> xyz, v -> 15.0, q -> true, t -> 1.518501114494E12))

For instance, to retrieve the ids you can do:

values.map(_("id"))

And the result will be:

List(abc, xyz)
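
Going one step further, here is a minimal sketch (assuming the same casts as above, and that the JSON always has this exact shape) that loops over every entry and pulls out all four fields:

// Loop over each map in values and extract all four fields.
// The asInstanceOf casts assume the shape shown in the question.
values.foreach { m =>
  val id = m("id").asInstanceOf[String]
  val v  = m("v").asInstanceOf[Double]
  val q  = m("q").asInstanceOf[Boolean]
  val t  = m("t").asInstanceOf[Double].toLong // recover a plain Long timestamp
  println(s"id=$id, v=$v, q=$q, t=$t")
}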

5 Comments

Is there any other method to parse the data besides JSON.parseFull()?
And why is timestamp getting converted to E12?
Because it considers it to be a number. Check this answer, it may help: stackoverflow.com/questions/4170949/…
As a provisional solution, you can retrieve your timestamps as follows: values.map(_("t").toString.replaceAll("\\.", "").replaceAll("E12", ""))
@SCouto No, for Christ's sake, don't use string replace to manipulate numbers! Just cast it to Long with .toLong. Anyway, this accepted answer didn't solve the problem that JSON.parseFull, for some cryptic reason, turns all numbers into Double, which is silly in the case of large integers, where one can easily lose precision.

The upickle library allows for a robust, elegant solution.

val parsed = ujson.read(data)         // ujson ships as part of the upickle library
parsed("values").arr.map(_("id").str) // ArrayBuffer("abc", "xyz")

See here for a more detailed discussion on why upickle / ujson is the best Scala library for parsing JSON.
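
As a sketch of going further, under the same assumptions about the data's shape, you can pull out all four fields in one pass; note that .num returns a Double, so timestamps need an explicit conversion:

// Walk the "values" array and extract each field with ujson's typed accessors.
val readings = parsed("values").arr.map { v =>
  (v("id").str, v("v").num, v("q").bool, v("t").num.toLong)
}
// ArrayBuffer((abc,0.0,true,1518501114487), (xyz,15.0,true,1518501114494))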

1 Comment

Is there any way that it could be used for checking the schema and datatypes?

Regarding your second question

and I am not understanding why timestamp is getting converted to E12 values

this particular JSON parser treats all numbers as Double fractions, which is why you get scientific notation with fraction × 10¹² (the E12 suffix). There's an answer here on how to change this parser's default behaviour for numbers; namely, one can implement one's own number parser which returns Long instead of the default Double.
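
A minimal sketch of that approach, using the parser's globalNumberParser hook (note that this mutates global parser state, and the integral-number check below is my own simplistic assumption):

import scala.util.parsing.json.JSON

// Keep integral numbers as Long; fall back to Double only for real fractions.
JSON.globalNumberParser = { input: String =>
  if (input.exists(c => c == '.' || c == 'e' || c == 'E')) input.toDouble
  else input.toLong
}

val parsed = JSON.parseFull(data) // timestamps now come back as Long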

It matters more if you have very large integer timestamps to parse, because you could start losing precision: integers above 2^53 ≈ 9.0×10¹⁵, the largest integer a Double can represent exactly, get rounded. However, in your case numbers like 1518501114949 are several thousand times smaller, so there's still a safe margin, and simply casting the resulting Double to Long with .toLong should be enough.
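
For instance, a one-line sketch reusing the values list from the accepted answer (the cast assumes t always parses as a number):

val timestamps = values.map(_("t").asInstanceOf[Double].toLong)
// List(1518501114487, 1518501114494)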

