Persistent Ghost Platform on Cloud

Some tweaks and tricks on how to persist uploaded images on the Ghost platform when it is hosted in the cloud.

This is a follow-up to the article Hosting Ghost platform on Cloud.

The Ghost blogging platform is a very simple blogging solution for those who like simplicity. Markdown editing keeps writing as simple as possible, and attaching images to a post is very easy.

This article tries to solve one simple problem associated with hosting the Ghost platform in the cloud. If your Ghost-powered blog is hosted in the cloud, your uploaded images get deleted after pushing new updates, and you want a simple solution, you have come to the right place.

With cloud hosting, your repository gets recreated every time you push new code. For storing user-generated files, some providers offer a separate storage location outside of your repository.

So we will modify the upload handler in Ghost to copy uploaded images to the data folder provided by the cloud instead of /content/images. In my case, the data folder is two levels up from the repo directory (../../data).

The code snippet I changed in the upload handler:
// TODO: this could be a separate module
function getUniqueFileName(dir, name, ext, i, done) {
    var filename,
        filenamewo,
        append = '';

    if (i) {
        append = '-' + i;
    }

    filename = path.join(dir, name + append + ext);
    filenamewo = name + append + ext;
    fs.exists(filename, function (exists) {
        if (exists) {
            setImmediate(function () {
                i = i + 1;
                return getUniqueFileName(dir, name, ext, i, done);
            });
        } else {
            return done(filename, filenamewo);
        }
    });
}
adminControllers = {
    'uploader': function (req, res) {

        var currentDate = moment(),
            month = currentDate.format('MMM'),
            year = currentDate.format('YYYY'),
            tmp_path = req.files.uploadimage.path,
            dir = path.join('../../data', path.join('content/images', year, month)),
            dir2 = path.join('/content/images', year, month),
            ext = path.extname(req.files.uploadimage.name),
            type = req.files.uploadimage.type,
            basename = path.basename(req.files.uploadimage.name, ext).replace(/[\W]/gi, '_');

        function renameFile(target_path, filenamewo) {
            // adds directories recursively
            fs.mkdirs(dir, function (err) {
                if (err) {
                    return errors.logError(err);
                }

                fs.copy(tmp_path, target_path, function (err) {
                    if (err) {
                        return errors.logError(err);
                    }

                    fs.unlink(tmp_path, function (err) {
                        if (err) {
                            return errors.logError(err);
                        }

                        // respond with the browser-facing path, not the data-dir path
                        var src = path.join(dir2, filenamewo);
                        return res.send(src);
                    });
                });
            });
        }

        // limit uploads to type && extension
        if ((type === 'image/jpeg' || type === 'image/png' || type === 'image/gif')
                && (ext === '.jpg' || ext === '.jpeg' || ext === '.png' || ext === '.gif')) {
            getUniqueFileName(dir, basename, ext, null, function (filename, filenamewo) {
                renameFile(filename, filenamewo);
            });
        } else {
            res.send(403, 'Invalid file type');
        }
    }
};

Above, we changed the upload handler to store uploaded files in separate storage instead of the /content/images folder. This way, your images are always persistent.

But these images are not directly accessible from the browser, since they are outside the application folder. On Linux, you can create a symlink to solve this problem.

Just fire up an SSH terminal to your cloud and create a symlink. Remember, every time you push new changes to your server, you need to recreate this symlink.

Creating a symlink using SSH commands

cd webroot/content/
ln -s realpath/to/uploadedfolder/content/images  images

Commands on OpenShift cloud

cd app-root/repo/content
ln -s $OPENSHIFT_DATA_DIR/content/images images