Another simple task that’s often hard for beginners is importing and exporting MySQL dumps. Here is a quick rundown on how to do it.

To export data you need to use mysqldump:

mysqldump -u db_user -p db_name > dump_name.sql

Options given to mysqldump are:

  • -u db_user - connect to the database as user db_user
  • -p - use a password; mysqldump will prompt you to enter it
  • db_name - the name of the MySQL database you want to dump
  • > dump_name.sql - by default mysqldump prints the dump to the terminal, but a simple output redirect with > writes it to the given file instead, in this case dump_name.sql

Now that you have a dump_name.sql file with all the SQL queries needed to replicate your database, you can import it using the general-purpose mysql client:

mysql -u db_user -p db_name < dump_name.sql

The user, password, and database name options are the same as for mysqldump. Since mysql reads its input from the terminal, this time we use < to read input from the given file instead.
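One thing worth noting: mysql will not create the database for you, so on a fresh server you need to create it before importing. A minimal sketch, assuming db_user has the CREATE privilege:

```shell
# Create the (empty) target database first; -p prompts for the password
mysqladmin -u db_user -p create db_name

# Then replay the dump into it
mysql -u db_user -p db_name < dump_name.sql
```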

As always, for more information you can consult the manuals using man mysqldump and man mysql.

One of the simplest tasks is creating and extracting files using tar and gzip. Yet for most new developers this is a daunting task. These days tar is mostly used to simply combine a few files into a single file and then gzip is used to compress that file.

Here is a quick overview of how to use tar and gzip to create and compress an archive:

# archive individual files
tar -cvzf myarchive.tar.gz /path/to/file1 /path/to/file2

# archive whole directory
tar -cvzf myarchive.tar.gz /path/to/dir

# archive whole directory but don't store full path
tar -cvzf myarchive.tar.gz -C /path/to/dir ./

Options given to tar are: c to create a new archive, v to be verbose, z to compress the resulting archive with gzip, and f to write the archive to the specified file. After the options you can list the files and dirs you want to archive.

In all the examples we provide a full path to the file or dir we want to archive. In this case tar stores the files in the archive using that full path, which means that once you extract them you’ll get the complete directory structure from the root dir onwards.

The way to avoid this is either to manually cd to the dir in which the files are stored, or to tell tar with the -C option to change dir before archiving the files.

Finally to extract an archive:

tar -xvzf myarchive.tar.gz

The x option tells tar to extract the archive into the current directory.
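The -C option works for extraction too: combined with x it unpacks the archive into a directory of your choice instead of the current one. A small end-to-end sketch (the /tmp paths are made up for illustration):

```shell
# Create a scratch directory with one file to archive
mkdir -p /tmp/tar_demo/src /tmp/tar_demo/dst
echo "hello" > /tmp/tar_demo/src/file1.txt

# Archive the contents of src without storing the full path
tar -czf /tmp/tar_demo/myarchive.tar.gz -C /tmp/tar_demo/src ./

# Extract into dst instead of the current directory
tar -xzf /tmp/tar_demo/myarchive.tar.gz -C /tmp/tar_demo/dst
```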

For more information you can consult the manual using man tar.

A gem is a simple way to distribute functionality, it can be a small plugin, a Ruby library or sometimes a whole program. Thanks to RubyGems, a gem hosting service, developers have a wide range of gems at their disposal allowing them to easily add functionality to their applications.

But what if there is no gem available that will suit the functionality you need, and you find yourself writing the same code over and over again for different projects? Well, in that case you should consider making your own gem.

It’s considered good practice to extract a gem out of an existing application, since that way you will have a better understanding of all the requirements as well as how the gem will be used. This blog post will illustrate just that with a real-life example, and will take you through the process of creating a slug_converter gem.
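One common way to scaffold a new gem’s skeleton (gemspec, lib/ layout, version file) is Bundler’s gem generator; a sketch below, assuming Bundler is installed:

```shell
# Scaffold a new gem skeleton named slug_converter
# (creates a slug_converter/ directory with a gemspec, lib/ and a version file)
bundle gem slug_converter
```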

For our new project it was necessary to modify the starting id of our database. This could be handled in the migration that creates the table, but we decided to create a rake task that handles it for us.

The rake task that we created detects which database is being used and executes the appropriate changes accordingly. You can create a rake task using the rails generate command.
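The generator invocation might look like this (the namespace and task names here are hypothetical examples; the generated stub lands in lib/tasks/):

```shell
# Generate an empty rake task stub; this creates lib/tasks/database.rake
# with a db namespace-style task you can then fill in
rails generate task database change_start_id
```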

If you are using the Devise gem for authentication and you have added custom fields to your model, you’ll get in trouble when you try to create a new instance or update an existing one: all of your added fields will be treated as unpermitted. The solution is to customise Devise’s configure_permitted_parameters method. All you need to do is add this method to your ApplicationController and push the parameters that need to be permitted into the devise_parameter_sanitizer list for that action. So let’s say you have a User model and you have added company_name and website fields to your users table; to permit these parameters on sign_up, add this to your ApplicationController:

def configure_permitted_parameters
  devise_parameter_sanitizer.for(:sign_up).push(:company_name, :website)
end

It is the same principle for :sign_in and :account_update. You can find the default permitted parameters in Devise’s documentation.

Devise has a very useful Trackable module used to track a user’s sign-in count, timestamps, and IP address. On some occasions you need to disable tracking: for example, for API requests where the user signs in on every request, or for instances where an admin might sign in as a user.

To disable the Devise Trackable module you need to set request.env["devise.skip_trackable"] = true. You have to do that before trying to authenticate the user, so you’ll want to put it in a before_filter, or even better a prepend_before_filter, to make sure it runs before authentication.

Add this to your controller in which you want to disable tracking:

prepend_before_filter :disable_devise_trackable

  def disable_devise_trackable
    request.env["devise.skip_trackable"] = true
  end

Note to self: here is how to upgrade Ubuntu 8.04 LTS (or any other release that is no longer supported) to newer Ubuntu release.

When you are upgrading an unsupported release of Ubuntu, the usual sudo apt-get update will most likely fail because… well, it’s unsupported. The simple fix is to edit /etc/apt/sources.list and replace the repository URLs with the ones that serve old, unsupported releases (the standard archives no longer carry their packages).

After that you should be able to follow the normal upgrade procedure (use sudo if you are not root):

apt-get update
apt-get install update-manager-core
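With the repository list fixed and update-manager-core installed, the upgrade itself is normally started with do-release-upgrade:

```shell
# do-release-upgrade is provided by update-manager-core and walks
# you through upgrading to the next supported release
do-release-upgrade
```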


Here is a quick way to set up VirtualBox with a Heroku-like box on a Mac, using Vagrant.

  1. Install VirtualBox

  2. Install Vagrant

  3. Create a Vagrantfile for the Heroku-like box that looks something like:

Vagrant.configure("2") do |config| = "heroku"
  config.vm.box_url = ""
  config.vm.synced_folder ".", "/vagrant", :nfs => true :private_network, ip: ""  # required for NFS
end

Besides telling Vagrant to use the Heroku-like box, it also sets up a shared dir between the host and the VM: it mounts the Vagrantfile’s dir (.) to /vagrant in the VM.

Running vagrant up will set up the VM and start it.

Now you can use vagrant ssh to log in to the VM.

The Vagrant Heroku-like box comes with PostgreSQL, but if you want you can easily set up SQLite:

sudo apt-get install libsqlite3-dev

Bonus tip: when you are working on multiple projects you can sometimes forget which VMs are running. You can list all running VMs with:

VBoxManage list runningvms

RailsDiff is a very useful site when upgrading Rails versions (for example, from Rails 3.2 to Rails 4). It generates a default Rails app using two different Rails versions and compares them, so you can see all the configuration changes (like in application.rb) along with everything else that changed – really useful when upgrading to a new Rails version.

Assume that you have the usual setup with a model (MyFile) using a simple Carrierwave uploader (MyFileUploader):

# app/models/my_file.rb
class MyFile < ActiveRecord::Base
  mount_uploader :file, MyFileUploader
end

To be able to test Carrierwave uploaders with RSpec using FactoryGirl factories you need to:

  • define a factory with an uploaded file
  • modify the test environment storage so test file uploads are separated from other uploads
  • turn off image processing to speed up tests
  • perform cleanup after each test

Define factory

# spec/factories/my_files.rb
FactoryGirl.define do
  factory :my_file do
    file {, '/spec/fixtures/myfiles/myfile.jpg')) }
  end
end

Set up Carrierwave

First we need to make sure Carrierwave uses the local file system for storage and that file processing is disabled in the test environments. Disabling file processing will speed up tests considerably. We can do that by adding the following to the Carrierwave initializer:

if Rails.env.test? || Rails.env.cucumber?
  CarrierWave.configure do |config| = :file
    config.enable_processing = false
  end
end

Next we should separate test uploads from any other uploads. We can do that by overriding the cache_dir and store_dir methods for all Carrierwave uploaders (i.e. all descendants of CarrierWave::Uploader::Base). The whole Carrierwave initializer then looks something like:

# config/initializers/carrierwave.rb
if Rails.env.test? || Rails.env.cucumber?
  CarrierWave.configure do |config| = :file
    config.enable_processing = false
  end

  # make sure our uploader is auto-loaded

  # use different dirs when testing
  # (the exact paths below are illustrative; everything lives under
  # spec/support/uploads so it can be deleted in one go after each test)
  CarrierWave::Uploader::Base.descendants.each do |klass|
    next if klass.anonymous?
    klass.class_eval do
      def cache_dir
        "#{Rails.root}/spec/support/uploads/tmp"
      end

      def store_dir
        "#{Rails.root}/spec/support/uploads/#{model.class.to_s.underscore}/#{mounted_as}"
      end
    end
  end
end
Clean up uploaded files

Using the factory defined above will create uploaded files in cache_dir and store_dir. These are just temporary files and should be removed after each test, so each test has a clean slate. By adding an after(:each) hook to the RSpec configuration block we can remove them simply by deleting the spec/support/uploads dir.

# spec_helper.rb
RSpec.configure do |config|
  config.after(:each) do
    if Rails.env.test? || Rails.env.cucumber?
      FileUtils.rm_rf(Dir["#{Rails.root}/spec/support/uploads"])
    end
  end
end