Keep a clean backstage

There are a lot of great web designers in the world. There are also a lot of great web developers in the world. These folks mostly work on their own or at awe-inspiring agencies. Many of these exquisite folks do client work, which eventually leaves them with a product or service that needs to be deployed to a web server.

The sum total is a list of FTP sites which – when opened – give access to the backstage of that specific place on the web. And if your experience is anything like mine, you will more often than not find a situation that isn’t all that tidy (no pun intended). Let’s try to change that.

The Situation

In the eight years I have actively worked on the web, I have been given access to many messy web servers. Even when a server started out clean, people easily found a way to kill the structure and put random files in random locations – files whose URLs were communicated in random ways and whose expected lifetime was unknown even to the person who put them there.

Net result: 40+ files and 20+ folders in the public root of the web server (mostly seen as public_html, but sometimes as httproot on those pesky IIS servers).

On a few sites I have taken a different approach, with the sole purpose of keeping the public root folder as clean as possible. In effect this means that you put the whole website in a subfolder – be it website, site or even gowallawallabingbang – after which you use URL rewriting to reference the site correctly.
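For illustration, a root set up this way might look something like the layout below. The folder and file names (other than website, which the later rewrite rules use) are just examples:

public_html/
    .htaccess
    index.php
    website/
        css/
        img/
        js/
        index.php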

The Basics

First off, we need to keep the server from sending back a 404 HTTP response for the root URL. To do this, put one of the files the server will try to return by default in the root, like an index.php or index.html. (If you forget this, there is a good chance your page will still load, but with an invisible 404 status attached, which will stop search engines from crawling your content.)
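If your host allows it, you can also make explicit which default files Apache should look for. This DirectoryIndex directive is an assumption on my part – it is standard Apache (mod_dir), but not something the setup strictly requires:

DirectoryIndex index.php index.html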

The Code

Now we need to rewrite all site-related requests to the site’s folder. I’m not the biggest .htaccess junkie, and I know this is not the only – or most flexible – way to write it. The following example is from one of the websites I created, which uses a page parameter and up to two additional parameters.

The subfolder in this example is called website:

RewriteEngine On

# Rewrite requests for your asset subfolders into the site folder, so the server doesn't return 404's.
# The conditions make sure that existing files and directories get served as they are, and not treated as parameters.

RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(css|files|img|js|swf|xml)/(.*) website/$1/$2 [L,QSA]

RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([A-Za-z0-9-_]+)/?$ website/index.php?page=$1 [L,QSA]

RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([A-Za-z0-9-_]+)/([A-Za-z0-9-_]+)/?$ website/index.php?page=$1&param1=$2 [L,QSA]

RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([A-Za-z0-9-_]+)/([A-Za-z0-9-_]+)/([A-Za-z0-9-_]+)/?$ website/index.php?page=$1&param1=$2&param2=$3 [L,QSA]

RewriteRule ^index\.([A-Za-z]+)$ website/index.$1 [L,QSA]
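With these rules in place, and assuming none of the requested paths exist as real files or folders in the root, requests map roughly as follows (the page names are hypothetical):

/css/style.css    ->  website/css/style.css
/about            ->  website/index.php?page=about
/about/team       ->  website/index.php?page=about&param1=team
/about/team/2009  ->  website/index.php?page=about&param1=team&param2=2009
/index.php        ->  website/index.php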

The Result

Now you should have a clean root folder, and no longer get the feeling you’ve entered a war zone every time you connect to your FTP server.