dynamic sites and search engines
- 10 Responses
- v3nt
ok - so we always create dynamic sites with sql and php. how do we make the content detectable by search engines?
i hear creating xml files is one way, but what is the format etc?
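(For the xml files mentioned: the usual format is the Sitemap protocol from sitemaps.org - a plain list of your URLs that you submit to the search engines. A minimal sketch, with a made-up example URL:)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- one <url> entry per page; the loc below is just an example -->
    <loc>http://www.example.com/product/widget/645353</loc>
    <lastmod>2006-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Only `<loc>` is required per entry; `<lastmod>` and `<changefreq>` are optional hints.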
- acescence
use mod_rewrite to make your URLs clean.
no more than 2 parameters in a URL, or most spiders will skip.
- UndoUndo
clean urls are good - security and SEO - but as yr output is html the spiders will be able to digest the output. using correct tags h1, h2, p etc will also help - as will page titles
checkout
http://www.webmasterworld.com
http://www.webproworld.com
for more info
- v3nt
well i don't really have any urls as they come from the database and php-made pages!
- kinetic
i've been wondering this same thing as it has come up for a project recently
i'd be interested to hear how it works out for pages that come straight out of the db
- acescence
you have URLs, they are just dynamically generated. if you didn't have URLs, you couldn't navigate anywhere!
think of the spider as a regular visitor to your site. the content is dynamically generated just like for any other visitor, except when a spider sees all the links with ??s it ignores them. use mod_rewrite to clean those chars out of the URLs and the spiders will follow!
- UndoUndo
seo friendly urls are just a matter of rewriting a url like this
productPage.php?id=645353
to something like this
/product/productname/645353
this cuts the crap and makes a lot more sense to the SE. when you use htaccess to rewrite the page you pull the bits between the slashes and assign them to vars in yr script so you can get info
ie yr script might look like this
$productId = $_GET['id'];
$productname = $_GET['productname'];
where you have used htaccess to assign the values to data posted as GET
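(A minimal .htaccess sketch of the rewrite described above - the filename productPage.php and the /product/productname/id URL shape are taken from this post; the exact regex is just one way to do it:)

```apache
RewriteEngine On
# map /product/productname/645353 -> productPage.php?productname=...&id=...
RewriteRule ^product/([^/]+)/([0-9]+)$ productPage.php?productname=$1&id=$2 [L,QSA]
```

productPage.php then reads $_GET['productname'] and $_GET['id'] as usual - the browser and the spiders only ever see the clean URL.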
- unfittoprint
Undo > here's a good tutorial on "human readable" paths.
http://www.sitepoint.com/article…
note: access to Apache's mod_rewrite required
- v3nt
cheers undo - was just reading about that here and trying to figure out how to get it to work!
http://www.searchtools.com/robot…
my main problem is most of our sites are flash so i still need to get the content written to a file somehow!
- UndoUndo
good link unfit - better than I can describe it in this thread!
- UndoUndo
probably the best way with full flash sites driven from a db is to have a html version of the site parallel to the flash, offer a link to it from the homepage so the SE's can spider it and get the content. Macromedia's SEO tool they released (beta) a couple of years back did just this but for content held in the swf.
I would keep the html link smaller than the link to the flash site or out of the way so most ppl go for the flash
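(A minimal sketch of what that homepage choice could look like - the filenames and paths here are assumptions, not from the thread:)

```html
<!-- homepage offering both versions; spiders crawl the html one -->
<a href="/flash/">Enter site</a>
<a href="/html/" style="font-size: smaller;">HTML version</a>
```

The point is just that the html link exists somewhere a spider can reach from the homepage, even if it's visually de-emphasised for human visitors.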