
I'm working on a sports app that lets users edit roughly 500 rows of player stats at a time. It's similar to an editable spreadsheet with 500 rows, and each user can potentially save multiple 'sheets'. I'm worried this could put a lot of strain on the app.

I come from a LAMP background and am a little new to Rails. In a LAMP environment I would set up a hybrid system: a relational DB (MySQL) for my general tables (users' email/password, players, teams, etc.), plus another table holding the user_id and a path to a JSON file on the server containing that user's custom stats. That way, when a user is editing a particular sheet, I would just load and edit that one file.
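In Rails terms I imagine that would look roughly like this (StatSheet, its columns, and the accessors are placeholders I'm making up to illustrate the idea):

# Hypothetical model: one DB row per saved sheet, with the 500-row payload
# kept in a JSON file on disk and only its path stored in MySQL.
class StatSheet < ActiveRecord::Base
  # columns: user_id:integer, name:string, file_path:string

  # Load the sheet's rows from its JSON file (empty sheet if none written yet).
  def rows
    File.exist?(file_path) ? JSON.parse(File.read(file_path)) : []
  end

  # Overwrite the JSON file with the edited rows.
  def rows=(data)
    File.write(file_path, JSON.generate(data))
  end
end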

Am I thinking about this the right way? I don't have much experience with NoSQL, and I would prefer not to have the complexity of two databases. Is there some hybrid best practice in Rails for storing a file path, or for serializing JSON into a field, with a friendly way to manipulate it through ActiveRecord?

I'm running Rails 4 and MySQL.

1 Answer


Well, it sounds like you already have two databases: MySQL and the JSON-file one you invented.

You could just as easily encode the data (JSON, gzipped, or however you like) and save it in a BLOB field in your MySQL database.

$ rails g migration add_player_data_to_users player_data:binary
$ rake db:migrate

In your User model

class User < ActiveRecord::Base

  # Decompress and decode the stored blob back into Ruby data.
  def player_data
    return nil unless self[:player_data].present?
    ActiveSupport::JSON.decode(ActiveSupport::Gzip.decompress(self[:player_data]))
  end

  # JSON-encode and gzip the data before writing it to the column.
  def player_data=(data)
    self[:player_data] = ActiveSupport::Gzip.compress(ActiveSupport::JSON.encode(data))
  end

end
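For what it's worth, usage would look something like this (the id and the row hashes are just made-up sample data):

user = User.find(1)
user.player_data = [
  { "player" => "Jane Doe",   "points" => 31 },
  { "player" => "John Smith", "points" => 18 }
]
user.save!

user.reload.player_data
# => [{"player"=>"Jane Doe", "points"=>31}, {"player"=>"John Smith", "points"=>18}]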

With MySQL, you might have to edit the generated migration and set the :limit modifier on the binary column to accommodate the size of the resulting data (MySQL's default BLOB type only holds up to 64 KB).
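For example, a hand-edited migration might look like this (the 10-megabyte limit is just an assumed ceiling; size it to your own data):

class AddPlayerDataToUsers < ActiveRecord::Migration
  def change
    # MySQL's default binary column is a 64 KB BLOB; a larger :limit makes
    # Rails pick a bigger blob type (MEDIUMBLOB at this size).
    add_column :users, :player_data, :binary, limit: 10.megabytes
  end
end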


5 Comments

I haven't implemented anything yet; that was just my first instinct. I'm just wondering if this is the most efficient way to go about it.
Sincerely, no offense to you, but I don't think your JSON-file database would be considered the most efficient. If you don't want to store binary data in MySQL, you might want to consider MongoDB.
Would you recommend something like MongoDB, sticking with MySQL, or both, then?
There'd be no need for both. MongoDB would mean adding another dependency (and its dependencies) to your app, and its additional complexity might outweigh its benefits if you only plan on using it for one thing. Most importantly, don't pre-optimize. I'd try to get it working with your existing MySQL database. If it doesn't perform to your requirements, consider an alternative.
I like the "don't pre-optimize" suggestion; always good advice for any project. I just wanted to make sure I was on the right track before I went any further.
