In this post I’ll share an example Docker Compose configuration to integrate Rails logs with Elasticsearch, Logstash, and Kibana.
I installed the Docker dependencies via Homebrew on OS X.
Initial Rails setup
Update the Rails configuration to use environment variables so Docker Compose can pass in container hostnames and credentials.
1) Edit file: Gemfile, add gem 'dotenv-rails', then execute bundle install.
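A minimal sketch of both steps:

```ruby
# Gemfile
gem 'dotenv-rails'
```

```sh
bundle install
```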
2) Create new dotenv file:
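The contents are a sketch, assuming a Postgres database; the variable names and values are placeholders, reused by the files below:

```sh
# .env -- read by dotenv-rails locally and passed through by Docker Compose
POSTGRES_HOST=db
POSTGRES_USER=postgres
POSTGRES_PASSWORD=password
LOGSTASH_HOST=logstash
```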
3) Edit file: config/database.yml to use ENV variables:
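A sketch, assuming the Postgres adapter and the variable names from the dotenv file above:

```yaml
# config/database.yml
default: &default
  adapter: postgresql
  encoding: unicode
  host: <%= ENV['POSTGRES_HOST'] %>
  username: <%= ENV['POSTGRES_USER'] %>
  password: <%= ENV['POSTGRES_PASSWORD'] %>

development:
  <<: *default
  database: app_development

production:
  <<: *default
  database: app_production
```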
Next I added some basic controller routes, edit file: config/routes.rb
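A sketch, assuming index and show actions under an api namespace to match the controller path below:

```ruby
# config/routes.rb
Rails.application.routes.draw do
  namespace :api do
    resources :pages, only: [:index, :show]
  end
end
```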
And the corresponding controller, new file:
app/controllers/api/pages_controller.rb. The controller simply responds in JSON with the passed params.
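A minimal sketch of such a controller:

```ruby
# app/controllers/api/pages_controller.rb
class Api::PagesController < ApplicationController
  def index
    render_params
  end

  def show
    render_params
  end

  private

  # Echo the request params back as JSON. to_unsafe_h bypasses
  # strong-parameter filtering, which is fine for a demo endpoint.
  def render_params
    render json: params.to_unsafe_h
  end
end
```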
Rails logging configuration
Add gems to output JSON logs and integrate with Logstash, edit file: Gemfile
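One common combination for this (an assumption on my part) is lograge with logstash-event and logstash-logger:

```ruby
# Gemfile
gem 'lograge'          # condense each request into a single log line
gem 'logstash-event'   # Logstash JSON event formatting for lograge
gem 'logstash-logger'  # ship log events to Logstash over TCP/UDP
```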
Run bundle install to install the new gems.
Configure the logging gems in the application's environment config:
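A sketch, assuming the configuration lives in config/environments/production.rb and that Logstash listens on TCP port 5228 (both assumptions, matching the Logstash config later in this post):

```ruby
# config/environments/production.rb (hypothetical location)
Rails.application.configure do
  # Replace the default logger with one that ships events to Logstash
  config.logger = LogStashLogger.new(
    type: :tcp,
    host: ENV['LOGSTASH_HOST'],
    port: 5228
  )

  # Collapse each request into a single Logstash-formatted JSON event
  config.lograge.enabled = true
  config.lograge.formatter = Lograge::Formatters::Logstash.new

  # Include the request params in the event so they show up in Kibana
  config.lograge.custom_options = lambda do |event|
    { params: event.payload[:params].except('controller', 'action') }
  end
end
```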
Docker Compose integration
First I created a simple Dockerfile based on the latest official Ruby image, new file: Dockerfile
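A minimal sketch:

```dockerfile
# Dockerfile
FROM ruby:latest

WORKDIR /app

# Copy the gem manifests first so the bundle layer caches between builds
COPY Gemfile Gemfile.lock ./
RUN bundle install

COPY . .
```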
Next I defined the Docker Compose file to include all the services, expose ports, copy configuration files, and define persistent volumes, new file: docker-compose.yml
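A sketch of the cluster. Image tags, ports, and volume names are assumptions; the rails service name matches the docker exec command used later:

```yaml
# docker-compose.yml
version: '2'

services:
  db:
    image: postgres:latest
    env_file: .env
    volumes:
      - db-data:/var/lib/postgresql/data

  rails:
    build: .
    command: bin/docker-start-rails
    env_file: .env
    depends_on:
      - db
      - logstash

  nginx:
    image: nginx:latest
    command: /app/bin/docker-start-nginx
    environment:
      - RAILS_HOST=rails
    volumes:
      - .:/app
    ports:
      - "80:80"
    depends_on:
      - rails

  elasticsearch:
    image: elasticsearch:latest
    volumes:
      - es-data:/usr/share/elasticsearch/data

  logstash:
    image: logstash:latest
    command: logstash -f /app/config/docker-logstash.conf
    volumes:
      - .:/app
    depends_on:
      - elasticsearch

  kibana:
    image: kibana:latest
    environment:
      - ELASTICSEARCH_URL=http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch

volumes:
  db-data:
  es-data:
```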
Docker bin script used to start nginx, new file: bin/docker-start-nginx
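A sketch, assuming the template approach used below: envsubst renders the nginx template, the static Rails proxy config is copied alongside it, and nginx runs in the foreground:

```sh
#!/bin/sh
# bin/docker-start-nginx
set -e

# Render the config template, substituting the hypothetical $RAILS_HOST
envsubst '$RAILS_HOST' < /app/config/docker-nginx.conf.template \
  > /etc/nginx/conf.d/00-upstream.conf

# Install the static reverse-proxy config next to it
cp /app/config/docker-nginx.rails.conf /etc/nginx/conf.d/

# Run nginx in the foreground so the container stays alive
exec nginx -g 'daemon off;'
```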
Docker bin script used to start rails, new file: bin/docker-start-rails
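A sketch: prepare the database, then start Puma bound to all interfaces so the nginx container can reach it:

```sh
#!/bin/sh
# bin/docker-start-rails
set -e

bundle exec rake db:create db:migrate

# Bind to 0.0.0.0 so nginx can proxy to this container
exec bundle exec puma -b tcp://0.0.0.0:3000
```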
Docker Logstash config, new file: config/docker-logstash.conf
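A sketch that pairs with the TCP logger assumed earlier: accept JSON events on port 5228 and index them into Elasticsearch:

```conf
# config/docker-logstash.conf
input {
  # JSON events shipped by logstash-logger from the Rails app
  tcp {
    port  => 5228
    codec => json_lines
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}
```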
Docker nginx config, new file: config/docker-nginx.conf.template
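A sketch of the template; ${RAILS_HOST} is a hypothetical variable filled in by envsubst in bin/docker-start-nginx:

```nginx
# config/docker-nginx.conf.template
# Rendered by envsubst at container start
upstream rails_app {
  server ${RAILS_HOST}:3000;
}
```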
Docker nginx config to reverse proxy to Rails/Puma, new file: config/docker-nginx.rails.conf
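A sketch of the proxy, assuming the rails_app upstream defined by the rendered template above:

```nginx
# config/docker-nginx.rails.conf
server {
  listen 80;

  location / {
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_pass http://rails_app;
  }
}
```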
Build and start the Docker Compose cluster:
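```sh
docker-compose build
docker-compose up
```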
The app should now be accessible on port 80 through nginx.
I created a simple Rake task to simulate traffic, new file: lib/tasks/simulate.rake
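A sketch, assuming the api/pages endpoint defined earlier, with requests routed through the nginx service so they exercise the full stack:

```ruby
# lib/tasks/simulate.rake
require 'net/http'
require 'securerandom'

namespace :simulate do
  desc 'Send a steady stream of requests through nginx'
  task :traffic do
    loop do
      id  = rand(1..100)
      # Random query param so each log event is distinct
      uri = URI("http://nginx/api/pages/#{id}?q=#{SecureRandom.hex(4)}")
      Net::HTTP.get(uri)
      sleep 0.5
    end
  end
end
```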
I executed the rake task from my host system for a while via:
```sh
docker exec -it dockerrailslogstash_rails_1 rake simulate:traffic
```
Then I opened Kibana to view the logs.
Source code on GitHub