Is the Googlebot getting lazy?

After finishing off a new site, installing & configuring servers etc, the SEO ‘expert’ comes along with a list of things that need to be done.

Create a robots.txt

Even though Googlebot can already see everything, we create one anyway?
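For the record, the permissive version of the file is tiny – a sketch, with example.com standing in for the real domain:

```
# Allow every crawler to fetch everything,
# and point them at the sitemap while we're at it
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

An empty Disallow line means nothing is blocked; the only real value here is the Sitemap hint.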

Add ‘ping’ support

This seems useful and is utilised by most WordPress blogs.  The idea is: you add a new article, news post etc and notify a few sources, letting them know to crawl for new content.  I can see the benefit of this if you want your pages appearing instantly, however…surely the search engine will come to find it on its own. The more you update, the more frequently a crawler visits.
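The mechanics behind a ping are simple enough – here’s a minimal Python sketch (blog name, URL and endpoint are placeholders) of the XML-RPC weblogUpdates.ping request that WordPress-style blogs send out:

```python
import xmlrpc.client

def build_ping(blog_name, blog_url):
    """Build the XML-RPC request body for a weblogUpdates.ping call,
    the method WordPress-style blogs use to notify ping services."""
    return xmlrpc.client.dumps((blog_name, blog_url),
                               methodname="weblogUpdates.ping")

body = build_ping("My Blog", "https://www.example.com/")
# Actually notifying a service is just an HTTP POST of `body` to its
# endpoint (e.g. Ping-O-Matic) – omitted here to keep the sketch offline.
print(body)
```

The payload is nothing more than the method name plus the blog’s title and URL; the ping service does the rest.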

Create a sitemap.xml file

What now?  I’ve known about sitemap files for a while, and from the beginning this has seemed really pointless.

Create a big list of URLs on your website – so Google doesn’t have to?  I am aware that sitemap files apparently do not affect a search engine’s ability to crawl your site…but how long before they do?
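For all that, generating one is hardly any work – a minimal Python sketch (the URLs are placeholders) of the sitemap.org format:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Serialise a list of page URLs as a minimal sitemap.xml document."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page  # one <loc> per page
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["https://www.example.com/",
                     "https://www.example.com/news/"])
print(xml)
```

Each URL becomes a `<url><loc>…</loc></url>` entry; optional fields like lastmod and changefreq can be bolted on the same way.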

According to Wikipedia:

Sitemaps are particularly beneficial on websites where:

  • some areas of the website are not available through the browsable interface, or
  • webmasters use rich Ajax, Silverlight, or Flash content that is not normally processed by search engines.

If content isn’t browsable on your website, surely you don’t want it publicly visible or indexed anyway…and if you do, why is it not somewhere within your navigation?  If it can’t be found by a person browsing your site, Google shouldn’t see it either.

Ajax/Flash/Silverlight…these are all easily sorted as well; I personally don’t see what the problem is.

Anyways, that’s my rant over.

Goodbye – hello Linode

After seeing yet another failure from that camp, this one being FAR worse than anything I’ve seen before, I decided to jump ship.

What was this failure?  Well…

It has to do with the licensing, the licenses stopped pinging, which made replication stop, which caused some services to stop, they then logged in and rebooted it, and thought it was back up, but upon mounting VMs, it came to the point where it was clear that there was corruption. We then were able to restore all the logical volumes, but even then, the VMs wouldn’t boot…that brings us to where we’re at now

From my point of view, my server died, wouldn’t come up, and after 24 hours it still wasn’t available.  There seems to be a high chance some data is missing…that’s to be checked later.

All very far from reassuring.

So, over to Linode. Similar pricing for the basic stuff.

  • The initial setup was very much the same on both providers, under 5 minutes and a new machine was ready to go.
  • Installing the software I need was pretty quick – then again, I’ve done it several times now.
  • Restoring backups from Amazon AWS – it costs less than £3 per month to run what I require.
  • Less than 4 hours after setting up the nodes, I am back live with only emails delayed.

I must admit, I’m pretty impressed with the flexibility of Linode’s control panel. In addition to what the old provider offers, Linode has:

  • A job queue
  • Root password reset
  • Rescue mode
  • Kernel switch
  • Custom disk sizing
  • Custom drive mounting
  • Alerts on CPU usage, Disk IO, Traffic and Transfer Quota
  • Auto-reboots – though this hasn’t been a problem yet anyway
  • Additional RAM/Storage/Data Transfer all separately configurable

There may well be more. All this is within 24 hours of signup, which included sleep, work & setting up servers, so much tinkering is still to be done.

My sites also seem more responsive – faster disks, maybe?

Standalone DNS server – Almost there

A while ago I mentioned a Standalone DNS server.  Stuff got in the way and it got set back.

Well, it’s almost done.  It just needs some final testing and a quick import of the current records, and we’re set to go.
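Once it’s live, the import largely boils down to writing the records out in zone-file form – a BIND-style sketch (the domain, hostnames and addresses are all placeholders; I’m not naming the actual server software here):

```
$TTL 3600
@       IN  SOA ns1.example.com. hostmaster.example.com. (
                1          ; serial
                7200       ; refresh
                900        ; retry
                1209600    ; expire
                3600 )     ; negative-cache TTL
        IN  NS  ns1.example.com.
www     IN  A   203.0.113.10
```

One file like this per zone, bump the serial on every change, and the server picks the rest up on reload.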