Matt Cutts #4: Static vs. Dynamic URLs
Here’s the fourth in the series of videos posted by Google’s Matt Cutts to Google Video over the past year. These are important for every web developer to see. Please see Matt’s first!
Transcription
Alright, here we go again! I am learning something every time I do one of these. For example, it is probably smart to mention that today is Sunday, July 30th, 2006.
Alright! Jeremy writes in:
He says, “Does Google rank dynamic pages differently than static pages? My company writes Perl, and pages are dynamically created using arguments in the URLs, yada yada.”
Good question.
To a first approximation, we treat static and dynamic pages in similar ways in ranking. So, let me explain that in a little more detail. PageRank flows to dynamic URLs in the same way it flows to static URLs. And so, if you’ve got the New York Times linking to a dynamic URL, it will still get the PageRank benefit, and it will pass that PageRank benefit along.
There are other search engines which, in the past, have said, “OK, we go one level deep from the static URLs. So we are not going to crawl from a dynamic URL, but we are willing to go into the dynamic URL space from a static URL.”
So, the short answer is: PageRank flows just the same between static and dynamic URLs.
Let’s go into the more detailed answer. The example you gave actually has five parameters, and one of them is a product ID, something like 2725. You definitely can use too many parameters. I would absolutely opt for two or three at the most, if you have any choice whatsoever, and try to avoid long numbers, because we can mistake them for session IDs. Any extra parameters that you can get rid of are always a good idea. And remember, Google is not the only search engine out there. So if you have the ability to basically say, “I am going to use a little bit of mod_rewrite and make it look like a static URL,” that can often be a very good way to tackle the problem.
So, PageRank still flows, but experiment. If you don’t see any URLs in the index with the same structure or the same number of parameters as you are thinking of using, then it’s probably better if you can either cut back on the number of parameters, shorten them somehow, or use mod_rewrite.
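The mod_rewrite approach Matt mentions can be sketched as an Apache rewrite rule. This is a hypothetical example, not from the video: the paths, the `product.pl` script name (chosen to match the Perl setup in the question), and the parameter names are all assumptions.

```apache
# Hypothetical .htaccess sketch: expose a static-looking URL like
# /products/2725/blue-widget and internally rewrite it to the
# dynamic Perl script with its query-string parameters.
RewriteEngine On
RewriteRule ^products/([0-9]+)/([a-z-]+)/?$ /cgi-bin/product.pl?id=$1&name=$2 [L]
```

The visitor and the crawler both see only the clean URL; the script still receives `id` and `name` exactly as it would from the raw dynamic URL.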
Alright. Mark writes in. This is an interesting question. He had a friend whose site was hacked, and the friend did not know about it for a couple of months, because Google had taken the site out of the index or something like that. So he asks:
“Can Google inform the webmaster of this occurrence within Sitemaps, basically when your site gets hacked? Inform them that maybe inappropriate pages were crawled.”
That’s a really good question!
My guess is we don’t have the resources to do something like that right now. In general, when somebody is hacked, if they are monitoring a small number of sites, they usually notice it pretty quickly, or else the web host will alert them to it. So, the Sitemaps team is always willing to work on new things, but my guess is this would be at the lower end of the priority list.
OK. James M. writes in.
He says, “Matt, in the fullness of time, I would like to use geotargeting software to deliver different marketing messages to people in different parts of the world, for example, a discounted pricing structure. Are we safe to run with this plain-vanilla use of geotargeting software? Clearly, we want to avoid any suspicion of cloaking.”
That’s a really neat question!
Let’s talk about it a little bit. The way that Google defines cloaking is very specific: showing different content to users than you show to search engines. Now, geotargeting by itself is not cloaking under Google’s guidelines, because all you are doing is saying, “Take the IP address; oh, you are from Canada, so we will show you this particular page,” or, “Take the IP address; you are from Germany, so we will show you this particular page.”
The thing that will get you in trouble is if you treat Googlebot in some special way. So, if you are geotargeting by country, don’t make a special country just for Googlebot, a “Googlebotistan” or anything like that. Instead, what you want to do is treat Googlebot just like a regular user. So if you geotarget by country, we are coming from an IP address in the United States, so just give Googlebot whatever United States users would see.
So, Google, for example, does geotargeting, and we don’t consider that cloaking. I think I’ve explained the subtleties pretty well. But, again, cloaking is showing different content to users than to search engines. In this case, you should treat Googlebot like you would treat any other user based on its IP address, and you should be totally fine.
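The safe pattern Matt describes can be sketched in a few lines: choose the page from the visitor’s IP-derived country, and never branch on the user agent. This is an illustrative sketch only; the country table, page names, and hard-coded IP lookup are assumptions standing in for a real GeoIP database.

```python
# Hypothetical geotargeting sketch: content is selected by country
# (derived from the IP address), and Googlebot gets no special case.

# Country-specific pages, keyed by two-letter country code (assumed names).
PAGES = {
    "US": "pricing_us.html",
    "CA": "pricing_ca.html",
    "DE": "pricing_de.html",
}
DEFAULT_PAGE = "pricing_us.html"

def country_from_ip(ip):
    # Stand-in for a real IP-to-country lookup (e.g. a GeoIP database).
    # Hard-coded here so the sketch stays self-contained.
    sample = {"8.8.8.8": "US", "129.128.5.233": "CA"}
    return sample.get(ip, "US")

def page_for_request(ip, user_agent):
    # Note: user_agent is deliberately ignored. Googlebot crawls from
    # US IP addresses, so it simply receives whatever US visitors see;
    # branching on the user agent here is exactly what risks cloaking.
    return PAGES.get(country_from_ip(ip), DEFAULT_PAGE)
```

With this shape, `page_for_request("8.8.8.8", "Googlebot/2.1")` and `page_for_request("8.8.8.8", "Mozilla/5.0")` return the same page, which is the whole point.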
Alright! Let’s take another break.
Transcription thanks to Peter T. Davis