articles tagged with dreamhost

The poor man's backup, backup plan


My primary backup is Time Machine, set to auto-backup my entire machine to a single 1TB drive (including all my iTunes media and applications). This way I can do a full restore if I ever need to. BUT, what happens when that drive fails? I needed a remote, off-site backup for my essential files. The stuff I just couldn't afford to lose if Time Machine died.

I wanted a solution that was simple, fast, efficient (incremental, and deletes redundant files), secure and wouldn't cost the earth. After researching different options (MobileMe, Dropbox, S3, iDrive, Mozy) I found that I could simply use rsync over SSH to my Dreamhost account. Here's how …

I already use Dreamhost to host some legacy websites, and they offer unlimited storage space on even their cheapest hosting plan. Per GB it's way more cost-effective than Amazon S3. I set up the following bash script to run these rsync commands on a daily cron interval (where mydhserver.dreamhost.com is my remote Dreamhost server):

#!/bin/bash

# send via rsync using SSH
rsync -azP --delete --delete-excluded --exclude-from=/Users/matt/.rsync_exclude.txt /Users/matt/Documents mydhusername@mydhserver.dreamhost.com:~/rsync_backup
rsync -azP --delete --delete-excluded --exclude-from=/Users/matt/.rsync_exclude.txt /Users/matt/Sites mydhusername@mydhserver.dreamhost.com:~/rsync_backup
rsync -azP --delete --delete-excluded --exclude-from=/Users/matt/.rsync_exclude.txt /Users/matt/work mydhusername@mydhserver.dreamhost.com:~/rsync_backup
rsync -azP --delete --delete-excluded --exclude-from=/Users/matt/.rsync_exclude.txt /Users/matt/resources mydhusername@mydhserver.dreamhost.com:~/rsync_backup
rsync -azP --delete --delete-excluded --exclude-from=/Users/matt/.rsync_exclude.txt /Users/matt/workbench mydhusername@mydhserver.dreamhost.com:~/rsync_backup
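Since the five commands are identical apart from the directory, they can be collapsed into a loop. This is just a sketch assuming the same paths and remote as above; I've wrapped it in a function so it can be called from the rest of the script:

```shell
#!/bin/bash
# Same backup as the script above, collapsed into a loop.
# DEST and EXCLUDES match the values used in the original commands.
DEST="mydhusername@mydhserver.dreamhost.com:~/rsync_backup"
EXCLUDES="/Users/matt/.rsync_exclude.txt"

backup_all() {
  local dir
  for dir in Documents Sites work resources workbench; do
    rsync -azP --delete --delete-excluded \
      --exclude-from="$EXCLUDES" \
      "/Users/matt/$dir" "$DEST"
  done
}
```

Add a `backup_all` call at the bottom and the script stays cron-friendly; adding a new directory to the backup is then a one-word change.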

I have a separate rsync_backup user set up on my Dreamhost account using an SSH key pair without a passphrase. Make sure that ~/rsync_backup exists on your remote server before running this. The .rsync_exclude.txt file simply contains a list of file patterns to exclude from backups; mine looks like this:

Steam Content
.svn
.DS_Store
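For reference, the passphrase-less key pair mentioned above can be generated like this (the dh_backup_key file name is my own example, not something from the original setup):

```shell
# generate a key with an empty passphrase, kept separate from your main key
mkdir -p "$HOME/.ssh"
ssh-keygen -q -t rsa -b 2048 -N "" -f "$HOME/.ssh/dh_backup_key"

# one-time: install the public key on the Dreamhost account, e.g.
#   ssh-copy-id -i ~/.ssh/dh_backup_key.pub mydhusername@mydhserver.dreamhost.com
# then point rsync at it:
#   rsync -e 'ssh -i ~/.ssh/dh_backup_key' ...
```

Keeping a dedicated key (rather than reusing your main one) means you can revoke the backup job's access without touching anything else.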

rsync will use SSH by default when sending your data (so it's encrypted over the wire). However, you should remember that your backed-up data (on the remote machine) is NOT encrypted; it is only as secure as the server it resides on.

So rsync is cheap, simple, fast (you can tweak SSH options to make it faster or slower), incremental and secure.
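The daily cron interval mentioned above would be wired up with a crontab entry; the script path and log location here are my own placeholders:

```shell
# run the backup script at 03:30 every day, appending output to a log
30 3 * * * /Users/matt/bin/rsync_backup.sh >> /Users/matt/logs/rsync_backup.log 2>&1
```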

Installing your own Ruby Gems on Dreamhost


I came across this problem when trying to set up Capistrano on my Dreamhost box.

Capistrano (originally SwitchTower) is a standalone deployment utility that integrates nicely with Rails. It allows you to deploy your apps across multiple servers from a Subversion repository. It's handy for any shared environment (such as Dreamhost), since you can use it to migrate databases and reset running fcgi processes.

Checking the Dreamhost gem list (or gem list --local) I found that Capistrano wasn't installed, so I had to go about setting up my shared box so I could install any gem I liked in my home directory.

It's easier than I thought, but I had some trouble searching Google to find an answer, so I'm posting it up here; share the knowledge and all that …

First up, create a new .gems folder in your home directory:

mkdir ~/.gems

Next open up your ~/.bashrc and ~/.bash_profile files and add the following environment variables to both:

export GEM_HOME=$HOME/.gems
export GEM_PATH=/usr/lib/ruby/gems/1.8:$GEM_HOME

Also adjust your PATH variable to include the new ~/.gems/bin folder:

export PATH=~/bin:~/.gems/bin:$PATH

That's basically it! For any gems you want to install you'll need to grab them from somewhere online; I picked up Capistrano from RubyForge with:

wget 'http://rubyforge.org/frs/?group_id=1420&release_id=4528'

Then I ran this command to install the gem (from my home dir):

gem install ~/capistrano-1.1.0.gem

Since we added ~/.gems/bin to the PATH variable in your bash files, you can simply type cap -V to check that Capistrano is installed.

Although this is not necessary for Capistrano, in order to get your Rails app to use other gems installed in your home directory, you first have to unpack them into RAILS_ROOT/vendor. To be able to require them in your code, enter the RAILS_ROOT/vendor directory and run:

gem unpack gem_name

Multiple SVN users on Dreamhost


I recently received an email from Jonathan Arthur of OpenWebDesign, asking how I had managed to set up Subversion on Dreamhost with multiple SVN users. The Dreamhost SVN wiki page goes some way to explain the process, but here's what I did (with success):

The process is a little tricky, due to folder permissions and the way Dreamhost sets up users and groups.

This mini-guide assumes you already have one repository set up and are able to access it with a single Subversion user, as described in the Dreamhost SVN wiki page.

To start, create new users (with shell accounts) on your server; I created usernames like mattsvn, etc. Do this in your Dreamhost web control panel, under Users → Manage Users → Add New User. Each of these users will get their own home directory on the Dreamhost box.

Now (still in the web panel) create a new group (Users → Groups → Add New Custom Group); call it something like devteamsvn, and add all your new SVN users to this group (also add the user you already had SVN working for).

Next, follow Dreamhost's instructions to create SSH public/private key pairs for each of these users in turn (doing this in each user's new home directory).

Take a note of all the passphrases you use, and keep the public keys you generate for each user, so they can be handed out for use with PuTTY's Pageant later.

Now, terminal into your Dreamhost box (as the original SSH user who created the repository to begin with) and change the group ownership (recursively) on the Subversion repository folder, i.e.:

chgrp -R devteamsvn ~/path/to/your/svn/repository

And make sure that this group has full read/write/execute access to all files inside with:

chmod -R 775 ~/path/to/your/svn/repository

Also, if you have a live web-accessible folder online that you want everyone to be able to check out into, you'll need to change the group there as well:

chgrp -R devteamsvn ~/path/to/your/web/folder/you/plan/to/checkout/to
chmod -R 775 ~/path/to/your/web/folder/you/plan/to/checkout/to
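Since the same chgrp/chmod pair is needed for both the repository and the web folder, it can be wrapped in a small helper. The setgid step here is my own addition (not part of the original recipe): it makes files created inside later inherit the group automatically, saving you from re-running chgrp as the team commits.

```shell
# make DIR fully group-writable for GROUP, and set the setgid bit on
# directories so new files created inside inherit the group
make_group_writable() {
  local dir="$1" group="$2"
  chgrp -R "$group" "$dir"
  chmod -R 775 "$dir"
  find "$dir" -type d -exec chmod g+s {} +
}

# usage:
#   make_group_writable ~/path/to/your/svn/repository devteamsvn
#   make_group_writable ~/path/to/your/web/folder/you/plan/to/checkout/to devteamsvn
```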

You're now ready to try it all: use PuTTY / TortoiseSVN and Pageant with one of the new users' details, and try connecting to the repository and committing a simple change or add.

For this user to check out/update from the repository to the Dreamhost web folder, simply terminal into the Dreamhost box using the new SVN user's details and run the command:

svn up /home/original_ssh_users_name/path/to/your/web/folder/you/checkout/to
April 24, 2006 05:56 by

PHP pushups


I recently had the rather unpleasant task of writing some PHP to compare a CSV file (with some 22,000+ entries) with a MySQL database. With the CSV file holding the master copy of the data, the script would update/insert and delete from MySQL. It needed to run as a daily cron on my (shared) Dreamhost box.

This would normally be simple enough, using a status field on the CSV file to indicate fields that had been updated. Unfortunately there was no status field, and none could be added. In fact the CSV file could not be modified at all. The only way to check if a row had been modified was to do a field by field comparison on every row.

I started off with a single script that imported the CSV into an array, and also extracted all rows from the db table. With some looping to search through all rows and all fields in each row, I got the script to work. Great! (I thought)

But with 22,000 rows each looping over ~22,000 other rows (22,000 × 22,000 = 484 million iterations), the script took minutes to execute, and if left long enough it ate up 100% CPU (through PHP). Even using an exponential back-off search took too long.

On Dreamhost, if any script you run nears 100% CPU usage, it's killed automatically. A major rethink was required, so I decided to split the script in two.

  • script 1 – would create a temporary table in the database and simply import the CSV file into it, row by row.
  • script 2 – running a few minutes later, would then compare the two tables using MySQL queries (rather than a PHP loop search); after performing all updates/inserts and deletes, the temporary table would be destroyed.
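The insert/delete half of script 2 can be sketched in SQL. The table and column names (csv_tmp, live, id, name) are hypothetical since I'm not showing my actual schema here, and I'm using sqlite3 just to keep the example self-contained and runnable; the same NOT IN shape works as MySQL queries:

```shell
db="$(mktemp)"
sqlite3 "$db" <<'SQL'
CREATE TABLE live    (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE csv_tmp (id INTEGER PRIMARY KEY, name TEXT);
INSERT INTO live    VALUES (1,'keep'), (2,'stale');
INSERT INTO csv_tmp VALUES (1,'keep'), (3,'new');

-- rows in the CSV import with no id match in the live table: INSERT
INSERT INTO live SELECT * FROM csv_tmp
  WHERE id NOT IN (SELECT id FROM live);

-- rows in the live table with no id match in the CSV import: DELETE
DELETE FROM live WHERE id NOT IN (SELECT id FROM csv_tmp);

-- (rows whose ids match would get a field-by-field check and UPDATE here)

DROP TABLE csv_tmp;
SELECT id, name FROM live ORDER BY id;
SQL
```

The final SELECT shows the reconciled table: the stale row is gone and the new row is in.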

The comparison script (2) works by looking for id matches between the two tables and marking any rows found. When a match is found, both rows are fetched and a field-by-field comparison is made to check whether an UPDATE statement is needed.

Finally any rows not marked as found in the master CSV file were added, and any rows not marked as found in the DB were deleted.

Using two tables for the comparison rather than looping and searching in PHP meant that the strain was now on MySQL (rather than PHP). Dreamhost seems to tolerate this, and the PHP script's execution time is reduced from minutes to seconds.

And why am I explaining all this, you ask?

1. So I can remember what on earth I did; and
2. I'm curious to know if anyone can think of a better way to do this, bearing in mind the limiting factors: the CSV file CANNOT be altered in any way, it has to execute in seconds, and CPU usage cannot approach 100%.

December 08, 2005 03:18 by

Dreamhost FCGI saga


After reading on nubyOnRails about stray Ruby processes, I have set up an hourly cron job to kill all defunct Ruby processes on my Dreamhost box. You may see some slight improvements in speed on this site. Now if we could only convince Dreamhost to run lighttpd, we'd be sorted.

Thanks, nubyOnRails!

the cron

# hourly - kill defunct ruby procs
01 * * * * /home/hiddenloop/crontabs/killrubyprocs

the script

#!/bin/sh
# collect the parent PIDs of defunct (zombie) ruby processes
procs="`ps x -o ppid,comm | grep "<defunct>" | sed -e "s/^[ \t]*\([0-9]*\)   ruby.*/\1/"`"
# kill -9 each parent; init then reaps the zombie children
for i in $procs; do
  kill -9 $i
done

November 18, 2005 06:03 by