Cron Jobs on Heroku

Heroku is an App Engine for Ruby on Rails and other languages and frameworks. Heroku runs your application so you don't have to care about hardware or IT infrastructure. It is one abstraction layer above Amazon's EC2 cloud.

First of all, there are no cron jobs on Heroku. Because it is an App Engine, you don't have access to the Linux OS and the native cron daemon or crontab. Forget it! You have to use Heroku worker dynos instead. Keep reading.

Usually you push your rails app via git to Heroku like this:

git push heroku master

and then it will be deployed on x web dynos. There are different kinds of dynos; the default is "web". If you want to run background jobs you need a "worker" dyno. You can add worker dynos with this command:

heroku ps:scale worker=1

Now you have to create a Procfile in the root of your application. That can look like this:

web: bundle exec rails server -p $PORT
worker: bundle exec rake do_work

The Procfile defines the different types of dynos that are available for your application. With "heroku ps:scale" you can scale each dyno type. If you want to have 7 web dynos and 1 worker, just execute this:

heroku ps:scale web=7 worker=1

And Heroku will immediately deploy your web application on 7 web dynos and your background job on 1 worker dyno. That is fucking awesome!

All right, we added one worker dyno and we defined in the Procfile that "bundle exec rake do_work" should be executed on the worker dyno. Now we just have to define the task "do_work" in the Rakefile.

There are different gems for scheduling jobs, for example clockwork, rufus-scheduler and Qu. I prefer a simpler, more native way. With a couple lines of code you can write your own scheduler. This rake task here executes a job at a given time, in this example every day at 07:00 AM.

task :do_work => :environment do
  puts "START"

  start_hour = 7
  start_minute = 0

  loop do
    now = Time.now
    hour = now.hour
    minute = now.min
    if hour == start_hour && minute == start_minute

      puts " do work #{now}"

      # execute code here !!!

      # sleep past the current minute so the job is not triggered twice
      sleep 60
    end
    sleep 10
  end
end

Just replace the comment with your code and push it to Heroku:

heroku maintenance:on
git push heroku master
heroku maintenance:off

That’s it.
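One refinement of the loop above: instead of polling every few seconds, you can compute how long to sleep until the next run. This is just a sketch; the helper name `seconds_until` is my own invention, not part of Rails, so adapt it to your Rakefile as you see fit.

```ruby
# Sketch: seconds until the next occurrence of hour:minute.
# The helper name is hypothetical, not a Rails API.
def seconds_until(hour, minute, now = Time.now)
  target = Time.new(now.year, now.month, now.day, hour, minute, 0)
  target += 24 * 60 * 60 if target <= now # already passed today, so aim for tomorrow
  (target - now).to_i
end

# In the rake task you would then write:
#   sleep seconds_until(7, 0)
#   ... do the work ...
```

This way the worker dyno wakes up exactly once per day instead of checking the clock in a busy loop.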

Amazon EC2 API Tools

To deal with instances on Amazon EC2 you need the Amazon EC2 API Tools. You can download them here:

These command-line tools are used to access your EC2 instances. To use them you need Java 1.5 or higher, and you need to set the environment variable "JAVA_HOME". How to set up the tools with the certificates is described in the User Guide.
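For example, the setup in your shell profile could look like this (the paths below are just placeholders for wherever your JDK and the unpacked tools actually live):

```shell
# Example environment setup for the EC2 API Tools (paths are placeholders)
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk
export EC2_HOME="$HOME/ec2-api-tools"
export PATH="$PATH:$EC2_HOME/bin"
```

After that, commands like ec2-describe-instances should be available on your PATH.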

HTML Upload to Amazon S3 with Ruby on Rails

This blog post will demonstrate how to build an HTML file upload for images with Ruby on Rails and store the images on Amazon S3 cloud storage. I assume that you are familiar with Ruby on Rails and RubyGems.

For the connection to Amazon's S3 storage system we will use the gem "aws-s3". To use it you have to add this line to your Gemfile:

gem 'aws-s3', '0.6.2', :require => 'aws/s3'

After you added the line to your Gemfile, execute:

bundle install

to load the gem. Using aws-s3 is pretty easy. Just add these lines to your "application.rb" file:

AWS::S3::Base.establish_connection!(
  :access_key_id     => 'your_access_key_id',
  :secret_access_key => 'your_secret_access_key'
)

Just customize the above lines with the access_key_id and secret_access_key from your S3 account. Every time you start your Ruby app, AWS will establish a connection to your S3 storage.

OK. Let’s start with the HTML upload form. Here it is:

<%= form_for @image, :html => {:multipart => true, :accept => "image/gif,image/png,image/jpg"} do |f| %>

  <div style="width: 400px;" >
    <%= file_field 'upload', 'datafile' %>
  </div>

  <%= f.submit "Upload image", :class => "button" %>

<% end %>
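For form_for @image to generate the right URL, a matching route has to exist. Assuming a standard RESTful setup (the application name "MyApp" is just a placeholder), the route definition could look like this:

```ruby
# config/routes.rb -- assumed standard RESTful routes for ImagesController
MyApp::Application.routes.draw do
  resources :images, :only => [:new, :create, :destroy]
end
```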

And the corresponding Ruby Controller looks like this:

class ImagesController < ApplicationController

  @@BUCKET = "my_image_bucket"

  def create
    fileUp = params[:upload]
    orig_filename = fileUp['datafile'].original_filename
    filename = sanitize_filename(orig_filename)
    AWS::S3::S3Object.store(filename, fileUp['datafile'].read, @@BUCKET, :access => :public_read)
    url = AWS::S3::S3Object.url_for(filename, @@BUCKET, :authenticated => false)
    @image = Image.new(params[:image])
    @image.user = current_user
    @image.filename = filename
    @image.url = url
    if @image.save
      flash[:success] = "Image saved!"
      render '/home'
    else
      render '/users/new_image'
    end
  end

  def destroy
    @image = Image.find(params[:id])
    AWS::S3::S3Object.find(@image.filename, @@BUCKET).delete
    @image.destroy
    render '/home'
  end

  private

  def sanitize_filename(file_name)
    just_filename = File.basename(file_name)
    # replace characters that are not safe in an S3 key
    just_filename.gsub(/[^\w\.\-]/, '_')
  end

end

That's it. Just customize "@@BUCKET" with your Amazon bucket name.
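To see what a filename sanitizer for S3 keys does, here is a quick standalone sketch of the same idea. The regexp is my assumption of what such a helper typically keeps: word characters, dots and dashes.

```ruby
# Standalone sketch of a filename sanitizer for S3 keys.
# The regexp is an assumption: keep word characters, dots and dashes,
# replace everything else with an underscore.
def sanitize_filename(file_name)
  just_filename = File.basename(file_name)
  just_filename.gsub(/[^\w\.\-]/, '_')
end

sanitize_filename("my holiday photo (1).jpg") # => "my_holiday_photo__1_.jpg"
```

Stripping the directory part with File.basename also prevents a client from smuggling a path like "../../etc/passwd" into your bucket keys.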

Fukushima of Cloud Computing

Since 5 a.m. ET Thursday morning, April 21, the EC2 cloud service from Amazon in Virginia has been down. Amazon is one of the biggest cloud service providers worldwide. Even Barack Obama propagated cloud computing services as the new green, safe and reliable IT.

Amazon invested a lot of money to make the world believe that their cloud service EC2 is highly scalable and highly available. Many companies decided to go with Amazon EC2 because they thought the Amazon Cloud would be more reliable than their own IT department. They thought the Cloud would never be down. They thought the Cloud would always scale and always be up. Unfortunately this is not true.

The Amazon EC2 cloud service has now been down for over 12 hours. Many sites are affected, among them Reddit, HootSuite, Foursquare, Quora and Heroku. This is the Fukushima of Cloud Computing! Nobody thought that this could happen. But it happened.

Amazon is down. Foursquare is sad.

SCVNGR tweeted: “The sky is falling! Amazon’s cloud seems to be down (raining?) so we’re experiencing some issues too. Be back soon!”

Also the Ruby on Rails App Engine Heroku is down, because it runs on Amazon. Heroku is well known as a rock-solid platform for RoR apps. Many Facebook apps are hosted on Heroku and now they are all down. Here is the recovery status page from Heroku:

Here you can watch the status of the Amazon Clouds:

And here is a list of sites that are affected:

This is a black day for cloud providers. Many managers will have to re-think their decisions.