Dynamic Web Page Optimization - Problems with Dynamically Generated Web Sites


Dynamic URLs vs. static URLs
There are two types of URLs: dynamic and static. A dynamic URL is the address of a page that results from a query against a database-driven web site, or of a page generated by a script.

In contrast to static URLs, in which the contents of the web page stay the same unless the changes are hard-coded into the HTML, dynamic URLs are generated from specific queries to a site's database. The dynamic page is basically only a template in which to display the results of the database query. Instead of changing information in the HTML code, the data is changed in the database.

Issues:
A search engine wants to list only unique pages in its index. Some search engines combat the duplicate-content problem by truncating URLs at special characters such as ?, &, and =.
For example, let's look at three URLs:


http://www.somesites.com/forums/thread.php?threadid=12345&sort=date
http://www.somesites.com/forums/thread.php?threadid=67890&sort=date
http://www.somesites.com/forums/thread.php?threadid=13579&sort=date

All three of these URLs point to three different pages. But if the search engine cuts off everything after the first offending character, the question mark (?), all three pages look the same:

http://www.somesites.com/forums/thread.php
http://www.somesites.com/forums/thread.php
http://www.somesites.com/forums/thread.php

Now, you don't have unique pages, and consequently, the duplicate URLs won't be indexed.

Another issue is that dynamic pages generally do not have any keywords in the URL. Keyword-rich URLs matter: highly relevant keywords should appear in the domain name or the page URL. This became clear in a recent study of how the top three search engines, Google, Yahoo, and MSN, rank websites.


The Solution

So what can you do about this difficult problem? You certainly don't want to have to go back and recode every single dynamic URL into a static URL. This would be too much work for any website owner.

If you are hosted on a Linux server, you will want to make the most of Apache's mod_rewrite module, which gives you the ability to transparently redirect one URL to another, without the user's (or a search engine's) knowledge. You will need to have this module installed in Apache; for more information, see the module's documentation at http://httpd.apache.org/docs/1.3/mod/mod_rewrite.html. This module saves you from having to hand-code a static version of every dynamic URL.
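As a rough sketch, enabling the module in httpd.conf typically looks like the following; the module path and the directory shown are assumptions that depend on your build and site layout:

# httpd.conf: load mod_rewrite (the exact module path varies by distribution)
LoadModule rewrite_module modules/mod_rewrite.so

# Let .htaccess files in your site's directory use rewrite directives
<Directory "/var/www/somesites.com/forums">
    AllowOverride FileInfo
</Directory>

Many hosts already have this configured; if RewriteRule directives in .htaccess trigger a 500 error, a missing AllowOverride can be the culprit.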

How does this module work? When a request comes in to the server for the new static URL, Apache rewrites the URL internally to the old dynamic one, while the address still looks like the new static URL. The web server compares the URL requested by the client against the search pattern in each rewrite rule.

For example, when someone requests this URL:
http://www.somesites.com/forums/thread-12345-the-challenges-of-dynamic-urls.html

The server compares this static-looking URL against the rules listed in the .htaccess file, such as:

RewriteEngine on
# Capture the numeric thread ID from the friendly URL and hand it to the script
RewriteRule ^thread-([0-9]+)-.*\.html$ thread.php?threadid=$1 [L]


It then converts the static URL to the old dynamic URL that looks like this, with no one the wiser:
http://www.somesites.com/forums/thread.php?threadid=12345

You now have a URL that not only can rank better in the search engines, but that also lets your end-users understand at a glance what the page will be about, while Apache's mod_rewrite handles the conversion for you and your dynamic URL stays in place behind the scenes.

Another thing you must remember to do is change all of the links on your website to the new static URLs, to avoid search engine penalties for duplicate URLs. You could even add your dynamic URLs to your Robots Exclusion Standard file (robots.txt) to keep search engines from spidering the duplicates.
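For example, a minimal robots.txt sketch, assuming your forum script lives at /forums/thread.php as in the URLs above, would look like this:

User-agent: *
# Disallow matches by URL prefix, so this blocks thread.php
# together with every query-string variant of it
Disallow: /forums/thread.php

Because the rewritten .html URLs do not start with that path, they remain free for search engines to crawl.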

You have multiple reasons to utilize static URLs on your website whenever possible. When that's not possible and you need to keep your database-driven content at the old dynamic URLs, you can still give end-users and search engines a static URL to navigate, while all along they are really your dynamic URLs in disguise. When a search engine engineer was asked whether this method was considered "cloaking", he responded that it was not, and that search engines in fact prefer you do it this way.

Why Optimize?

Take, for example, http://www.dannytalk.com/?p=230. When you go to that link, it actually rewrites the URL to http://www.dannytalk.com/2008/10/23/the-8020-seo-rule-in-web-page-optimisation/, which is organized in a directory structure. It clearly shows the site's architecture, with the date of the post and its title, and it helps people remember the URL far more easily than a string of cryptic dynamic parameters.

Which can Googlebot read better, static or dynamic URLs?

This question is based on the presumption that search engines have issues with crawling and analyzing URLs that include session IDs or source trackers. In fact, we at Google have made progress in both areas. While static URLs might have a slight advantage in terms of click-through rates, because users can easily read them, the decision to use a database-driven website does not imply a significant disadvantage in terms of indexing and ranking. Providing search engines with dynamic URLs should be favored over hiding parameters to make them look static.

Brief Answer:

If your server sends content to Googlebot when it requests a URL, then it doesn't matter how that content is generated. With dynamic content, there are many pitfalls to be aware of, but the fact that the content is not a static file on the server doesn't matter.

* Add code to rewrite (not redirect) the new friendly URLs to the old unfriendly ones needed by your script(s); see the sketch after this list.
* Change the links on your pages to use those new friendly URLs.
* Get your responsive linking partners to link to the new friendly URLs.
* Let this sit awhile, until you see the new URLs appear consistently in the SERPs for important pages.
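The distinction between rewriting and redirecting in that first step matters. Here is a minimal sketch reusing the hypothetical forum rule from above: without a redirect flag, mod_rewrite maps the friendly URL to the script internally, so the visitor's address bar never changes; adding [R=301] would instead send the browser back out to the dynamic URL, defeating the purpose.

RewriteEngine on

# Internal rewrite: a request for /forums/thread-12345-some-title.html
# is served by thread.php?threadid=12345, and the friendly URL stays visible
RewriteRule ^thread-([0-9]+)-.*\.html$ thread.php?threadid=$1 [L]

# External redirect (NOT what you want here): [R=301] tells the browser
# to fetch the dynamic URL itself, exposing it again
# RewriteRule ^thread-([0-9]+)-.*\.html$ thread.php?threadid=$1 [R=301,L]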


Conclusion:


Dynamic URLs with a large number of parameters may be problematic for search engine crawlers in general, so rewriting dynamic URLs into user-friendly versions is a good practice when that option is available to you. If you cannot rewrite them, keeping the number of URL parameters down to one or two makes it more likely that search engines will crawl your dynamic URLs.
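For instance, of the following two hypothetical URLs, the trimmed second form is the friendlier one for a crawler:

http://www.somesites.com/forums/thread.php?threadid=12345&sort=date&lang=en&sessionid=a1b2c3d4
http://www.somesites.com/forums/thread.php?threadid=12345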

Reference URLs:

http://www.webconfs.com/dynamic-urls-vs-static-urls-article-3.php
http://www.dannytalk.com/tag/google/
http://www.dannytalk.com/2008/11/09/googles-opinion-on-crawling-dynamic-urls-vs-static-urls/#more-260
http://www.webmasterworld.com/google/3660314.htm
http://googlewebmastercentral.blogspot.com/2008/09/dynamic-urls-vs-static-urls.html
http://www.quickonlinetips.com/archives/2006/10/google-now-indexes-dynamic-pages-with-id-urls/
