Darren Mothersele

Software Developer

Warning: You are viewing old, legacy content. Kept for posterity. Information is out of date. Code samples probably don't work. My opinions have probably changed. Browse at your own risk.

8 Drupal Admin Tasks to Complete Before Putting Your Site Live

Nov 21, 2008


There are loads of configuration options in Drupal core, and no doubt you'll have modules installed that add plenty more. Here's a mixture of configuration and admin tasks that you should make sure are completed before putting your Drupal website live: from caching and CSS aggregation to improve the performance of the site, to SEO tweaks to your .htaccess and robots.txt files.

1. Turn on Caching

Use as much caching as you can get away with. There's no point wasting server power recreating blocks for every visitor: turn on caching and have them stored and reused. Many modules, like Views and Panels, come with their own cache options, and the main performance options are under "Site Configuration" -> "Performance".
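If you'd rather pin these settings in code than rely on the admin form, you can override the relevant variables in settings.php. A sketch for Drupal 6 — cache, block_cache and cache_lifetime are the core variable names, but double-check them against your version:

```php
<?php
// settings.php overrides (Drupal 6) -- values set here win over the
// "Site Configuration" -> "Performance" form.

// Page cache: 0 = disabled, 1 = normal, 2 = aggressive. Aggressive mode
// skips some module hooks, so test it against your contrib modules first.
$conf['cache'] = 1;

// Cache the output of blocks for anonymous visitors.
$conf['block_cache'] = 1;

// Minimum cache lifetime in seconds (0 = expire whenever content changes).
$conf['cache_lifetime'] = 0;
```

The advantage of doing it this way is that a stray click on the performance form can't quietly switch caching off on the live site.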

2. Aggregate CSS and Javascript

Turn on this option in your performance settings and Drupal will combine all CSS files into one cacheable CSS file, and all JavaScript files into one cacheable JS file. This reduces the number of HTTP requests needed to download each page, and gives the browser one nice big lump of data it can cache as the user explores the website.
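As with caching, these can be forced on from settings.php so they stay on in production (a sketch for Drupal 6; preprocess_css and preprocess_js are the core variable names behind the aggregation checkboxes):

```php
<?php
// settings.php overrides (Drupal 6): force CSS/JS aggregation on.
$conf['preprocess_css'] = 1;  // combine CSS files into one cached file
$conf['preprocess_js'] = 1;   // combine JavaScript files into one cached file
```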

3. Send errors to the logs only

Make sure that your error reporting is set up so that any messages are logged to the database and not shown on screen. You don't want to bamboozle your visitors should something go wrong, and there's always a chance an error message reveals information about your server configuration that could be exploited by a hacker or spammer. These options are under "Site Configuration" -> "Error Reporting".
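This too can be nailed down in settings.php. A sketch for Drupal 6 — error_level is the core variable behind the error reporting form, and the ini_set line is an extra belt-and-braces measure at the PHP level:

```php
<?php
// settings.php override (Drupal 6): 0 = write errors to the log only,
// 1 = write errors to the log and the screen. Keep this at 0 in production.
$conf['error_level'] = 0;

// Belt and braces: tell PHP itself never to print errors to the browser.
ini_set('display_errors', 0);
```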

4. Give your permissions a once-over

Go down your permissions list and sanity-check your configuration. Make sure you've not accidentally given registered users more permissions than you meant to, and check that the user roles that do need permissions have them. For example, you need to grant anonymous visitors permission to access the site-wide contact form. You may also want to check the configuration of any node access modules you are using, especially if certain parts of the site are meant to be hidden from unregistered users.

5. Make sure cron is running

Make sure that your server is successfully calling the cron system so you can be sure that any routine tasks get taken care of. This would also be a good time (while you're messing around in cron settings) to set up your automated rsync and mysqldump backups. There's more information on setting up cron here, and I wrote something previously about Drupal backups.
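A typical crontab covering both jobs might look something like this — a sketch only: the URL, database name, paths and backup host are all placeholders you'd adapt, and fetching cron.php with wget is the usual way to trigger Drupal's cron:

```
# Trigger Drupal's cron tasks every hour, on the hour.
0 * * * * wget -O - -q -t 1 http://www.example.com/cron.php

# Nightly database dump at 2am (assumes credentials in ~/.my.cnf).
0 2 * * * mysqldump --single-transaction drupal_db | gzip > /var/backups/drupal_db.sql.gz

# Nightly rsync of the files directory to a backup host at 3am.
0 3 * * * rsync -az /var/www/drupal/sites/default/files/ backup-host:/backups/drupal-files/
```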

6. Check file permissions

If you're using modules that manipulate files, such as imagecache, upload, or any of the filefield modules, then make sure that the permissions are set correctly on the relevant folders on the server. If not, users will have trouble uploading files such as avatar images or attachments.
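From the Drupal root, something like this is a reasonable starting point — a sketch only: sites/default/files is the default location, and www-data is just one common name for the web server user, so adapt both to your setup:

```shell
# Create the public files directory if it doesn't exist yet.
mkdir -p sites/default/files

# Let the web server's group write to it; everyone else can only read.
# Uncomment and adjust if your web server runs as a different user/group:
# chgrp -R www-data sites/default/files
chmod -R u+rwX,g+rwX,o+rX sites/default/files

# Quick sanity check: is the directory writable?
test -w sites/default/files && echo "files directory is writable"
```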

7. Check htaccess

Check that .htaccess is working and restricting access to any parts of the server you don't want people accessing directly. By default, Drupal's .htaccess file restricts direct access to folders and to module, include, and .info files. If Apache is not configured to allow overrides (AllowOverride) for the Drupal directory, the .htaccess file will be silently ignored.

While you're in .htaccess, have a look at the redirects section. If your site is accessible at www.example.com as well as example.com (without the www), you should pick one of them and forward all requests from the other to the main URL. The configuration to do this is already in the .htaccess file; you just need to adapt it to your domain. This prevents your site being published at two locations, which could otherwise be problematic for search engines like Google that penalise duplicate content.
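The stock Drupal .htaccess ships with this block commented out; uncommenting one variant and substituting your own domain is all that's needed. For example, to send everything to the www version (example.com is a placeholder):

```
RewriteEngine on

# Redirect example.com to www.example.com with a permanent (301) redirect.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
```

The 301 status matters: it tells search engines the move is permanent, so they transfer any ranking to the canonical URL.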

8. Tweak the robots.txt

You can tell search engines to leave parts of your site out of their indexes. Again, this is useful for avoiding duplicate content, especially if you are using path aliases. By default, every piece of content is available via its node address, e.g. "node/1", as well as its alias, which could be anything set by you or by pathauto. You simply add "Disallow: /node/" to the robots.txt file. This page on blamcast has more information on this.
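Assuming your aliases are in place, the relevant lines in robots.txt would look something like this. Note the trailing slash: "Disallow: /node/" blocks node/1, node/2 and so on, while a bare "Disallow: /node" would also block any alias that happens to start with "node":

```
User-agent: *
Disallow: /node/
```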