I have an array of ~10,000 strings and integers that I want to serialize into a Rails web page. The eventual goal is to get the array into a JavaScript array, but I'm happy to just have it as a text blob that I then parse client side.

I can't cache the string, the values change every request.

This takes ~25ms on my VPS:

arr = ["ABCD", 1] * 10000 # always in string, number, string, number order
start = Time.now
arr.to_s
duration = (Time.now - start)*1000
puts "took #{duration}ms"

Can we do better?
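For the embedding half of the question, here is a minimal sketch using plain ERB outside Rails; `safe_json` is a hypothetical helper that escapes `</` so the blob can't prematurely close the `<script>` tag (inside Rails you'd likely reach for `json_escape` or similar instead):

```ruby
require "erb"
require "json"

# Hypothetical helper: serialize and escape "</" so the JSON
# can be dropped inside a <script> tag safely.
def safe_json(obj)
  JSON.generate(obj).gsub("</", "<\\/")
end

arr = ["ABCD", 1] * 3 # small array for illustration

template = ERB.new(<<~HTML)
  <script>
    var data = <%= safe_json(arr) %>;
  </script>
HTML

puts template.result(binding)
# The rendered page contains: var data = ["ABCD",1,"ABCD",1,"ABCD",1];
```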

Edit:

@sawa's answer is correct: to_json is fast and a good way to do this. I was getting thrown off because to_json in a Rails environment is overridden (by ActiveSupport). Use JSON.generate(arr) instead.
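For reference, a small sketch of that fix: JSON.generate calls the stdlib generator directly, so it sidesteps any to_json override and round-trips cleanly.

```ruby
require "json"

arr = ["ABCD", 1] * 10_000 # string, number, string, number order

# JSON.generate uses the stdlib generator directly, bypassing any
# to_json override (ActiveSupport redefines to_json in Rails).
json = JSON.generate(arr)

JSON.parse(json) == arr # => true
```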

1 Answer

JSON gets relatively faster as the array gets longer. Testing with ["ABCD", 1] * n, to_s is faster when approximately n < 50, but to_json is faster when n > 50.

arr = ["ABCD", 1] * 10000 # always in string, number, string, number order

start = Time.now
arr.to_s
duration = (Time.now - start)*1000
puts "took #{duration}ms"

require "json"
start = Time.now
arr.to_json
duration = (Time.now - start)*1000
puts "took #{duration}ms"

# =>
# took 7.546628ms # to_s
# took 4.684186ms # to_json
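As a side note, the standard library's Benchmark.realtime is a slightly tidier way to take these timings than subtracting Time.now values (a sketch of the same comparison; exact numbers will vary by machine):

```ruby
require "json"
require "benchmark"

arr = ["ABCD", 1] * 10_000 # always in string, number, string, number order

# Benchmark.realtime returns elapsed wall-clock time in seconds.
to_s_time = Benchmark.realtime { arr.to_s }
json_time = Benchmark.realtime { arr.to_json }

puts "to_s:    #{(to_s_time * 1000).round(3)}ms"
puts "to_json: #{(json_time * 1000).round(3)}ms"
```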
1 Comment

Aha! I was too tied up trying things in my Rails environment. I believe Rails overrides to_json to do some sanitization; switching to JSON.generate(arr) gives timing similar to what we get in the contrived example. Thanks!
