One thing that I can't figure out is how to set up the database host when we dockerize a Rails app. For example, a Postgres DB is supposed to run on localhost on a dev machine. But in a docker-compose file the database service has its own name, and it's on that hostname that the database is accessible to the other containers, for example:
version: '3'
services:
  web:
    build: .
    ports:
      - "3000:3000"
    volumes:
      - .:/usr/src/app
    env_file:
      - .env/development/database
      - .env/development/web
  redis:
    image: redis
  database:
    image: postgres
    env_file:
      - .env/development/database
    volumes:
      - db-data:/var/lib/postgresql/data
volumes:
  db-data:
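For reference, the shared env file might look like this (a sketch on my part: the `POSTGRES_*` names are the ones the official postgres image reads, while `DATABASE_HOST` is a name I'm inventing for the sake of the example):

```
# .env/development/database (hypothetical contents)
POSTGRES_USER=postgres
POSTGRES_PASSWORD=some-password
# the compose service name, reachable as a hostname from other containers
DATABASE_HOST=database
```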
Most examples assume that all commands related to developing the Rails app (creating models, migrations, etc.) are executed from inside the container, e.g.
docker-compose exec web rails g scaffold User first_name:string last_name:string
And to run the above migration I'd have to run
docker-compose exec web rails db:migrate
This way it works. But why do I need to run Docker locally during development just to be able to access the app?
So I come back to my original, essential question: when the app was generated, database.yml had the settings below (for Postgres):
default: &default
  adapter: postgresql
  encoding: unicode
  pool: <%= ENV.fetch("RAILS_MAX_THREADS") { 5 } %>

development:
  <<: *default
  database: rails5-ember_development
This way, anyone could clone the project and continue developing, as long as they had a Postgres DB running on localhost. Now that the app is dockerized, how do I change/adapt the host value (localhost:5432 being the default) so that the application can run both in a Docker container and directly on the dev machine?
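The `pool` line above already shows the mechanism I suspect applies to the host as well: database.yml is rendered through ERB, and `ENV.fetch` with a block falls back to a default when the variable is unset. A minimal stand-in for a hypothetical `host` line (the `DATABASE_HOST` variable name is my assumption, not something Rails defines):

```ruby
require "erb"

# A stand-in for a single line of database.yml: read the host from an
# environment variable, falling back to localhost when it is unset.
template = ERB.new('host: <%= ENV.fetch("DATABASE_HOST") { "localhost" } %>')

ENV.delete("DATABASE_HOST")
puts template.result              # host: localhost  (on the dev machine)

ENV["DATABASE_HOST"] = "database" # as docker-compose could set via env_file
puts template.result              # host: database   (inside the container)
```

With something like this, the same database.yml could serve both setups: docker-compose sets `DATABASE_HOST` through the env file, and a plain local checkout keeps the localhost fallback.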
So, to sum up, my question is:
To get the same behaviour in a dockerized Rails app, is the only solution to run it in a special environment other than development? In that case, I'd add that environment to database.yml and set the same DB values there as in the docker-compose.yml file (username, host, etc.).
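Concretely, the special-environment alternative I have in mind would look something like this in database.yml (the environment name and values are placeholders of mine, matching the compose file above):

```yaml
docker_development:
  <<: *default
  host: database          # the service name from docker-compose.yml
  username: postgres
  database: rails5-ember_development
```

and the container would then start the app with `RAILS_ENV=docker_development`, while local development keeps using the plain `development` environment.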
Thank you.