One of the things I do on a regular basis is run on-site search engine optimisation campaigns for several clients. The process goes something like this: (i) we discuss the objectives for the campaign; (ii) we identify keywords or search terms; (iii) I make website changes based on the keywords; and (iv) I spend ages checking and tracking search engine positions for each of our identified terms on Google, Yahoo and so on over several months.
Luckily, I’ve managed to get some decent results, but monitoring search engine positions in step (iv) can be time-consuming [and less than exciting] without automating the process. That’s where the Rank Checker add-on for Firefox comes in. In short, it enables you to search several search engines for your chosen keywords/search terms and track the position of your website/domain in the search results, which means you can measure the effectiveness of your website optimisation work. Best of all, you can export the history of your results to a .csv file to see how your search engine rankings improve for each of your keywords/terms over time.
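As a rough illustration of what you can do with that exported history, here’s a short Python sketch that reads a rank-history .csv and reports how each keyword’s position has changed between the first and last check. The column names (date, keyword, position) and the sample rows are my own assumptions for the sake of the example, not Rank Checker’s actual export format, so adjust them to match your file:

```python
# Sketch: summarise a rank-history CSV export from a tool like Rank Checker.
# The column layout below (date, keyword, position) is an assumption --
# change the field names to match your real export.
import csv
import io

# Illustrative sample data standing in for a real .csv export.
sample_export = """date,keyword,position
2008-01-01,sussex web design,18
2008-02-01,sussex web design,11
2008-01-01,seo consultant,25
2008-02-01,seo consultant,27
"""

def rank_changes(csv_text):
    """Return {keyword: (first_position, last_position)} from the export.

    Assumes each keyword's rows appear in date order, as exports usually do.
    """
    history = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        history.setdefault(row["keyword"], []).append(int(row["position"]))
    return {kw: (positions[0], positions[-1]) for kw, positions in history.items()}

for keyword, (first, last) in rank_changes(sample_export).items():
    direction = "up" if last < first else "down"
    print(f"{keyword}: {first} -> {last} ({direction})")
```

With a real export you would read the file from disk instead of a string; the point is simply that a few lines of scripting turn months of position checks into an at-a-glance summary.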
There’s more. Rank Checker also has a range of configurable options. For example, it enables you to choose which search engines to use, which is great for me since I like to search Google UK. You can even graph your results with the Site Rank Reporter add-on.
Rank Checker must be the most useful Firefox add-on I have discovered this year!
Note: Monitoring search engine positions like this is only one way of assessing the effectiveness of your work. You’ll probably want to measure website visitor numbers and other website metrics as well.
I was a bit lukewarm about Google Chrome, the new browser from the Googleplex, when it was first announced. Another browser for testing websites. I have a few of those already on my system. However, with a heavy heart [just kidding], I downloaded it and my first impressions are pretty good. Firstly, it does seem fast…
These days, as someone who manages a variety of websites for myself and clients, I find that one of the most valuable online resources is Google Webmaster Tools. This Google service allows you to see statistics and error analysis and enables the management of Google’s indexing of your website. It also includes Sitemap submission and reporting. In my opinion, it’s pretty much essential if you want to know more about your website and how Google sees it.
Google is providing much more link data if you are signed up with Google Webmaster Tools. The amount of information is much greater than that returned by the link: operator in normal Google searches (in my experience, this was often a gross underestimate). The new system provides more information to anyone who has verified their site with Google. The data can also be filtered and downloaded (which will make it useful for sending to clients, ahem).
The information looks much more comprehensive in terms of numbers but it's important to say that how Google uses the information to rank your site is not revealed. That would be asking too much!
Searchmash is a low-profile site by Google. It's a search engine with some different features, and I'd speculate that some of these may make it into Google itself in the future. You might call it Google's test area, perhaps?
Like me, you may have heard a song and searched for the song lyric and artist on Google. What was the name of that song and who sang it?! Well, here's something I didn't know.
In fact, Google has a music search facility that usually provides better results than the regular search for music-related items. Simply enter "music:" before your search term to get results from the music search database. Google Operating System has the details.
The companies are adopting Google's Sitemaps protocol, available since June 2005, which enables Web site owners to manually feed their pages to Google and to check whether their sites have been crawled.
The new official Sitemaps website has the full technical details and an FAQs page. It looks like Sitemaps are becoming more important for search engine submissions and a common standard would seem to make sense.
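For reference, a minimal Sitemap following the protocol is just a short XML file listing the pages you want crawled; the URL and date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap: one <url> entry per page. example.com is a placeholder. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-11-16</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

You then submit the file's location to the search engines, and they report back on whether your pages have been crawled.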
Is it me or has Google 'lost it' recently? Google always prides itself on the relevance and accuracy of its search results. However, these days, Google's search results for specific keywords sometimes make no sense at all.
Recently, we have been attempting to get some back links for a client's website in order to improve search engine positions. The client is based in Sussex and the website is about 'subject x'. Although we were not involved in its creation, the website has unique page titles and all its pages are listed by Google in a site:domain name search.
Naturally, we added links to the client's website from several Sussex-based websites that we know. Result: Google lists one of the link 'donor' websites much higher than the client's website for the specific search term, even though the 'donor' website only mentions the term in a paragraph on a links page. The 'donor' website is only related by locality; its main subject is not 'subject x'.
Surely, this is not an accurate search result?
There’s a lot of discussion in web designer/developer circles about valid HTML and search engine rankings. If your site has valid HTML or XHTML, will it have better search engine rankings than a website with HTML errors and warnings? My answer would be ‘Yes’.
It’s interesting to see more and more online applications like this but how will it integrate with other Google tools and services? Perhaps it will be used to manage AdWords accounts?
Expect to see Ads in the full version when it goes beyond the limited trial stage…
» CSS3 Foundations is a new book that combines practicality with inspiration to show you how to create modern websites.
Sync files between computers. Share files with your clients, friends, and family using Dropbox. It's great!