
So it's possible that there's no way to fix this, but I thought I would ask:

I have a PostgreSQL DB set up. The first table contains replays, each of which has 10 unique players (a has_many relationship). I want to render JSON from my search that contains the replays with their players. However, I am pulling a large number of replays (2,000), each with 10 players, which means fetching 20,000 players.
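
For reference, here is a minimal sketch of the two models as described; the class names and the replay_id foreign key come from the question and the SQL posted in the comments below, while everything else is assumed:

# app/models/replay.rb
class Replay < ApplicationRecord
  has_many :players   # each replay has 10 players
end

# app/models/player.rb
class Player < ApplicationRecord
  belongs_to :replay  # players.replay_id, as seen in the logged SQL
end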

This is what the index search currently looks like:

def index
  @replays = Replay.where(map_id: params['map_id'].to_i)
  @replays = @replays.reverse # loads every matching replay into memory and reverses the array
  render json: @replays[0..2000].to_json(include: [:players]) # 0..2000 is 2,001 replays; each one's players load with a separate query
end

The search takes this long:

Completed 200 OK in 639596ms (Views: 42.1ms | ActiveRecord: 329252.3ms)

Is there a better way I can search and render the JSON that won't take this long? It's worth noting that searching for just the 2k replays, the 20k players, or even a single specific replay with its players takes only a couple of seconds (the first search itself also finishes in a couple of seconds), so I assume it's a volume issue.

  • Can you post the actual SQL query that is generated? Commented Nov 24, 2017 at 14:55
  • The first search: SELECT "players".* FROM "players" WHERE "players"."replay_id" = $1. Then when searching the players: SELECT "players".* FROM "players" WHERE "players"."replay_id" = $1 [["replay_id", 117217]] (per replay). Commented Nov 24, 2017 at 15:06
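
That per-replay SELECT is the classic N + 1 pattern: one query fetches the replays, then to_json(include: [:players]) fires a separate players query for every replay it serializes. A minimal sketch of what that loop amounts to, reusing the identifiers from the question:

# Roughly what the serializer does for ~2,000 replays: ~2,000 extra queries.
@replays[0..2000].each do |replay|
  replay.players.to_a  # SELECT "players".* FROM "players" WHERE "players"."replay_id" = $1
end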

1 Answer


Try eager loading your replays:

# in the controller
replays = Replay.last_two_thousands_by_map_id(params[:map_id])

render json: replays.to_json(include: [:players])

# replay.rb model
def self.last_two_thousands_by_map_id(map_id)
  # includes(:players) preloads the association up front instead of querying once per replay
  includes(:players).where(map_id: map_id).order(id: :desc).limit(2000)
end
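
For context, includes(:players) preloads the association in one extra query, so the request runs two SQL statements instead of roughly 2,001, which is where the ActiveRecord time was going. A rough sketch, using a placeholder map_id of 1 (the commented-out SQL is an approximation):

replays = Replay.includes(:players).where(map_id: 1).order(id: :desc).limit(2000)
replays.each { |r| r.players.size }  # no extra queries; players are already in memory

# Approximate SQL:
#   SELECT "replays".* FROM "replays" WHERE "replays"."map_id" = 1 ORDER BY "replays"."id" DESC LIMIT 2000
#   SELECT "players".* FROM "players" WHERE "players"."replay_id" IN (...)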

6 Comments

This works; the time was cut drastically. Completed 200 OK in 254032ms (Views: 34.1ms | ActiveRecord: 20682.4ms)
Just the ticket - nice answer. I'd add that it'd be worth dropping at least some elements of that request into a model scope to clean up the controller a little. @GustavMauler - for future reference, it was an N + 1 problem slowing you down, docs here.
the query still seems pretty huge. Open another question with the new bottleneck if you want. Thanks @SRack
Yeah I see that now. Perhaps it's greedy but is it possible to get even better performance?
Yeah, I'm pretty sure. But we'd need more info than what's in this question. Open another one.
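
If the remaining time still matters, one common next step (an assumption here, the thread does not specify it) is to shrink the serialized payload by whitelisting columns; the column names below are placeholders:

replays = Replay.includes(:players).where(map_id: params[:map_id]).order(id: :desc).limit(2000)
render json: replays.to_json(
  only: [:id, :map_id],                          # placeholder replay columns
  include: { players: { only: [:id, :name] } }   # placeholder player columns
)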
