Dynamic Linking & Nofollow – Practical Examples, Diagrams, + FAQs

In response to all the questions and comments I’ve received after my recent post on using nofollow with internal links, I’ve put together a few practical examples and a couple diagrams to better illustrate the concepts.

As I mentioned last time, slapping a nofollow on some of your internal links is not intended to remove pages from the index. In fact, what we’re trying to do is to get more pages indexed, by reducing the share of your site’s PageRank that flows to less important pages. I’ll begin with an example that illustrates this technique.

Moving “Overhead” Pages To The “Third Tier”

Close your eyes if it helps, but try to picture a typical eCommerce shopping cart site. You have a home page, several product categories, and products in each category. Your home page is the first tier in your site’s linking structure, the category pages are the second tier, and the product pages are in the third tier.

You also have several of what I call “overhead” pages on the second tier, like privacy policies, terms & conditions, shipping information, guarantees, contacts, price match promises, etc. It’s not unusual, in fact, to have more of these pages than you have product categories.

To make your users’ experience the best it can be, you probably have “run of site” links (on every page) pointing to all of the second tier pages.

The effect of this on the flow of PageRank should be obvious – the overhead pages on your second tier receive as much PageRank as your product category pages… and far more than the actual product pages. This is clearly an upside-down arrangement from an SEO perspective.

The diagram below illustrates a simple modification that moves your overhead pages down to the third tier – this will drive more PageRank to your product categories, which pass it along to your product pages.

[Diagram: prflow1.png – run-of-site links to the overhead pages are nofollowed; a single site map page, linked from the home page, passes PageRank on to the overhead pages]

As you can see, we’ve “nofollowed” the run of site links to the overhead pages. We have a single site map page (directly linked from the home page) that passes PageRank on to the overhead pages.

In this simplified diagram, I’m not showing you “one way” and “two way” links… and I’m ignoring the third tier, which would also have “nofollow” on links to the overhead pages.

This structure allows you to get your overhead pages indexed (so they can appear in site: searches) without giving them as much weight as your product category pages.
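
In markup terms, the change is small. Here's a minimal sketch of what the template fragment might look like (the file names and URLs are invented for illustration):

```php
<?php $is_home = ($_SERVER['REQUEST_URI'] == '/'); ?>

<!-- Run-of-site navigation, rendered on every page -->
<a href="/category-sprinklers.htm">Sprinklers</a>          <!-- category links stay followed -->
<a href="/privacy.htm" rel="nofollow">Privacy Policy</a>   <!-- overhead links nofollowed -->
<a href="/shipping.htm" rel="nofollow">Shipping Info</a>   <!-- overhead links nofollowed -->

<?php if ($is_home): ?>
  <!-- Home page only: a single followed link to the site map, which in
       turn links to the overhead pages with plain, followed links -->
  <a href="/site-map.htm">Site Map</a>
<?php endif; ?>
```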

What If The Overhead Pages Are A “Quality Signal?”

Some folks have expressed a concern that if search engines don’t find a “followed” link to these important overhead pages, they might consider your site to be of lower quality and rank your pages lower. I’ve never seen this actually happen, but I can’t say that this isn’t a legitimate concern.

If you are concerned about that, you can make some modifications, as shown in the diagram below. This time, I’ve shown the two-way linking relationship between the tiers, and added the third tier pages to the diagram.

[Diagram: prflow2.png – the two-way linking relationship between the tiers, with the third tier pages added]

In this structure, the home page passes PageRank to the second tier, which in a shopping cart site consists of overhead pages and category pages.

The overhead pages link back to the home page, passing some of the PageRank back, and they also link to the category pages on the second tier, passing some PageRank across.

The category pages don’t pass PageRank back to the home page (the links are there for users, but nofollowed), so more PageRank passes down into the third tier. The third tier pages link back to the category pages (and may crosslink, see below).
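
On a category page, that might look like this (a sketch, with invented URLs):

```php
<!-- Second-tier category page: the link home is there for users,
     but nofollowed, so more PageRank flows down to the third tier -->
<a href="/" rel="nofollow">Home</a>

<!-- Links down to the third-tier product pages stay followed -->
<a href="/sprinklers/impact-sprinkler.htm">Impact Sprinkler</a>
<a href="/sprinklers/drip-irrigation-kit.htm">Drip Irrigation Kit</a>
```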

Mix & Match As You See Fit

Neither of the approaches I’ve illustrated so far is designed to remove pages from the index. The intent is to conserve the total amount of PageRank within the site, and simply redistribute it to the pages that matter more to us. The goal is to get more of our important pages indexed. By doing so, we can actually add to the total PageRank within the site, because every page has an intrinsic value.

Neither of these approaches is a “recommendation” for what you should do with a given web site. Every situation is different. Sometimes nofollow helps, but it’s not the only tool at your disposal, and not the only tool you can or should use.

Imaginary Real Life Scenario – PageRank Misses The Point

Imaginary real life. Sorry. Best I could do – this is based on a true story, but sanitized for your protection.

Let’s say that Joe runs an e-commerce store selling gardening equipment and supplies. Joe has 10 categories of products in his store. Most of the categories have a couple dozen products, but the “sprinklers” category has 85 products. Why? Because that’s what he needs to have in order to meet his customers’ needs.

Unfortunately, Joe’s having indexing problems. Google doesn’t have a problem picking up his other product pages, but he’s scratching his head over why they don’t want to index his sprinkler product pages. So he goes out to the forums for advice, and in no particular order, is told to:

  1. Rewrite the product descriptions in case they’ve been filtered as duplicate content. Done. No change.
  2. Submit an XML sitemap. Done. No change.
  3. Add content to the category page, in case it’s been filtered as duplicate content. Done. No change.

Experienced SEOs will already have spotted Joe’s real problem – it’s structural. The 85 product links (plus 10 category links plus 15 overhead links) add up to 110 links on his “Sprinklers” category page. The PageRank is just being sliced too thin, and the sprinkler product pages barely make the supplemental index, if that.
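
To make the arithmetic concrete, here's a back-of-the-envelope sketch. The numbers are invented, and it assumes the classic model where a page's passable PageRank splits evenly across its followed links and nofollowed links pass nothing:

```php
<?php
// Invented numbers, classic even-split PageRank model.
$passable = 0.85 * 1.0;  // damping factor x the category page's PageRank

$per_link_now = $passable / 110;  // 85 products + 10 categories + 15 overhead
$per_link_cut = $passable / 95;   // after nofollowing the 15 overhead links

printf("Each product page gets:     %.5f\n", $per_link_now); // ~0.00773
printf("After nofollowing overhead: %.5f\n", $per_link_cut); // ~0.00895
```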

So what can Joe do? Sprinklers are his most important product category – irrigation is the linchpin of good gardening after all. (OK, I made that up).

What if Joe “cut” the links between his product categories (2nd tier pages) by nofollowing the cross-links between his category pages – except for the link to the Sprinklers category, which is left in place? The result: more PageRank for the Sprinklers page, borrowed a little at a time from the other pages.

Would this fix Joe’s problem? Well, it might, but it might also borrow too much from the other pages and create indexing problems elsewhere. That’s why you have to do the math, or do this stuff a little bit at a time.
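
Here's roughly what that math looks like, with the same even-split assumption and more invented numbers:

```php
<?php
// Invented numbers: say each sibling category page carries ~35 followed
// links (25 products + 9 category cross-links + 1 home link).
$passable = 0.85 * 0.5;  // hypothetical PageRank a sibling category passes

$to_sprinklers_before = $passable / 35;        // Sprinklers is 1 of 35 links
$to_sprinklers_after  = $passable / (35 - 8);  // nofollow the 8 other cross-links

printf("From each sibling, before: %.5f\n", $to_sprinklers_before); // ~0.01214
printf("From each sibling, after:  %.5f\n", $to_sprinklers_after);  // ~0.01574
```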

Joe could also “fix the structure” (without using nofollows) by splitting his Sprinkler category into 3 or 4 categories, and if this actually increases sales, I’m all for it. That’s another option, but if it doesn’t add something to conversion / usability, Joe would be insane to do it simply for SEO reasons…

Don’t “Break” Your Site Over SEO!

In my experience (which goes back to the birth of Netscape as a browser), you never need to harm usability & conversion in order to accomplish an SEO goal. There’s always another way. Nofollow is a tool that can help.

We’ll talk more soon.

89 thoughts on “Dynamic Linking & Nofollow – Practical Examples, Diagrams, + FAQs”

  1. Thanks for the diagrams Dan. I have an e-commerce site with a lot of overhead pages, categories, etc… that will be perfect to test this out on over the next month. I’ll test and let you know the results.

  2. Pretty interesting, but still somewhat confusing to me. I have a somewhat small site for now and I am getting pages indexed each day, so I am not going to worry about this right now. Thanks for the article.

  3. Awesome article. I’ve learned more about genuine, no gimmick, SEO in these past few months reading the book and the site than I have in the past 3 or so years I’ve been trying to learn it!! Does this mean I can safely delete those “I’ve found the loophole in Google and can get your site listed #1 for all your keywords in 10 minutes…just pay me $1000” emails?

    :-) This is a great rounded edge to this whole discussion of structure.

    I’ve got the fix for the PR Bot and now I’m off to tinker!

    Thanks Again!
    Paul

  4. I still haven’t worked out what I plan to do with overhead pages on my Sandcastles structure.
    The simplest method would probably be to create a small sitemap purely for the overhead pages, in the form of an about page.
    With WordPress it is possible to add logic specific to a page slug, so the links can automatically be nofollowed on all pages other than the about page – see the sketch at the end of this comment.
    In this way the overhead pages would be on the 3rd tier, but the page linking to them wouldn’t be splitting the juice anywhere near as much as a sitemap.
    It could also be constructed such that the overhead pages only link back to the home page, thus turning them into a spider circle.
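
    Something like this in the theme would do it – a rough, untested sketch using WordPress’s is_page() conditional:

    ```php
    <?php
    // Nofollow the overhead links everywhere except on the About page.
    function overhead_link( $url, $text ) {
        $rel = is_page( 'about' ) ? '' : ' rel="nofollow"';
        return '<a href="' . $url . '"' . $rel . '>' . $text . '</a>';
    }
    echo overhead_link( '/privacy/', 'Privacy Policy' );
    echo overhead_link( '/terms/', 'Terms &amp; Conditions' );
    ```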

  5. Dan, the diagrams really helped me understand nofollow and dynamic linking. I’m starting to put some of this into practice, along with internal links, and I’ve seen an improvement in my rankings.

    Now I have my homepage doing well, I need to concentrate on the 2nd tier pages.

  6. Great explanation, clean and simple. Now I have a couple of questions.

    If I understand your previous comments correctly, this affects pagerank by only a tiny fraction, but when multiplied by thousands of pages on a large site, comes out to something significant. For a small site, a single external link to each internal page would pass more pagerank, so this isn’t worth bothering with. At around 300 pages or so, Joe’s EGarden Shop is just barely large enough to benefit, right?

    And the purpose – this is for getting those deep pages just enough pagerank to sit in the main index so that they can show up for long-tail searches (and rank based upon their content), not about creating a ‘trap’ to get enough pagerank to compete for head keywords nor about being able to compete on the strength of internal linking with thin content, right?

    And finally, is there an effect on link rep? For example, if Joe has both a “Home” link in his header and a “Joe’s Gardening Supplies” link in his footer pointing at the home page, would nofollowing the header link (leaving ~300 links instead of ~600, but all containing ‘gardening supplies’ instead of only half) have a positive effect on ranking? Presumably only one of the links to page A on page B is counted, and this should make sure that the one with the good anchor text is always the one chosen. But is the ratio more important than the quantity?

  7. @ Paul, I hear ya on the email hype… it’s a real letdown to follow a 2 month email sequence and discover that their big secret is expired domains. :D

    @ Andy, an “about” page is what we used to use to contain all those links – you have a vertical stack that says About \ Contact \ etc. with one anchor tag wrapped around it pointing to that one page. Wasn’t ideal for usability, but we did put summaries of privacy policies etc. on it to try to limit the impact. Looking forward to seeing the next iteration of Sandcastles.

    @ Ray, even for small sites, you can see some benefit, but the effects are greater on large sites. I don’t know what happens with link rep; I know some folks who do that without ill effects, but I don’t know if there’s any significant benefit. One of my concerns about it is that the weight of anchor text in the footer may be reduced, but if it’s the only way you can figure out to do it then I guess you do what you can. Breadcrumbs are another good target for anchor text, and that can actually help with usability and conversion.

  8. Ray, we’re not really expecting to move so much PageRank around that it would have a big ranking impact. I think that your use of anchor text is going to weigh far more heavily, and that isn’t a “Google only” effect.

  9. Very nice, scientific quality presentation, Dan.

    I see so many SEO pros focused on unproven or “secret” elements of page and search optimization.

    Great to see another kindred spirit covering the nuances of page rank and spider activity.

    Much appreciated. I’d love to chat in the coming weeks about how we like to manage some of the new and vitally important elements of Google, MSN, Yahoo! and ASK.com’s search. I believe you’ll find it very worthwhile.

    Have a great weekend.

    Mark Alan Effinger
    RichContent.tv

  10. Pingback: Search Engine Land: News About Search Engines & Search Marketing

  11. Dan,

    I’ve drawn a similar diagram to yours and realised I actually have four tiers:

    Home [links to 2nd tier]
    Second Tier [country] [links to 3rd tier]
    Third Tier [region] [links to 4th and 3rd tier]
    Fourth Tier [property] [links to 2nd, 3rd tier]

    I have a blog which links to home and all tiers within blog posts. On the footer of the blog I have links to the 2nd tier.

    Does this sound all about right?

  12. Hey Dan…

    Great stuff as always…we’ve produced some great results based on this model (although can be a tad more complex with 500k of pages and multiple layers of sub-categories)…for anyone doubting it “out there” it really works…

    What do you think about the influence of “anchor text” using this methodology?

    KP

  13. Hey Dan,

    Indeed great stuff, thanks for sharing! It’s always fun to try new things, especially when they work :) I tried to rearrange the internal site structure using nofollow on a small site of mine (30 pages) and within two weeks after reindexing I got these results:

    Total Visitors: +163%
    Organic Search: +47%
    Image Search: +260%

    Note that my ‘product pages’ are images. That explains the increase in image searches. Whether these results are stable, I still have to see in the future. I still prefer to build sites in pyramid / directory style without nofollow, but in some cases this is not possible. Nofollow can offer a solution here.

    The moral of the story to me is not so much the increase of traffic or higher rankings (always fun), but the fact that Google makes EVERY link on your pages count and that information architecture is becoming more and more important to get deeper pages indexed and bring the right pages into the spotlight. This is also true for small sites. For larger sites the results can be even more dramatic: nofollowing only the disclaimer, privacy policy, affiliate links etc. without touching the overall internal site structure can already have a substantial impact on rankings and traffic.

    Until now rankings seem stable, but before I use this on client sites, I want to make sure this is not just a temporary flux in rankings. Time will tell.

  14. Hi Dan,

    Awesome article. But then, as with everything I read from you, I come away feeling warm and fuzzy and confident that what I have just learned is the real deal, presented with ZERO hype and BS. In the truest sense of the phrase, “You are a breath of fresh air”.
    Thanks Dan.

    Tom

  15. We have an ecommerce-style site, and realizing that I was wasting pagerank by bouncing it back and forth between pages, I made some changes.

    I didn’t go for your complete solution above, though, as it would have required more time than I currently have available, but I did this.

    I use PHP, and I made a variable, $index_page, that is set only on the index page. I then put PHP code in my site template so all the links that appear on every single page of the site get a nofollow tag, but not on the index page.

    This means categories, guarantees, external links etc. are only followed from the main page and nofollowed on all other pages.

    I added one more fix to collect the pagerank from the bottom pages, like product pages, which after the fix didn’t have any links back to the site.

    I added one link at the bottom of every page back to the index page with my main keywords in the anchor text. This way pagerank from over 300 product pages is passed back to the index page instead of being lost in the ether.
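
    In rough outline, the code looks like this (file and variable names are just illustrative):

    ```php
    <?php
    // index.php (home page only): flag the page before pulling in the template.
    $index_page = true;
    include 'template.php';

    // --- template.php ---
    // Run-of-site links are followed only when rendered from the index page.
    $rel = empty( $index_page ) ? ' rel="nofollow"' : '';
    echo '<a href="/guarantee.htm"' . $rel . '>Guarantee</a>';
    echo '<a href="/shipping.htm"' . $rel . '>Shipping</a>';

    // On every page: one plain, followed link back to the home page,
    // with the main keywords in the anchor text.
    echo '<a href="/">main keyword anchor text</a>';
    ```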

    Thanks for giving some insight into this. I’ve heard about it before but never at a detail level that would make me believe that it really matters.

    Simon

  16. @Mark, if you just reply to the newsletter emails that will get forwarded to me.

    @Darren, I don’t have a clear understanding of what you’re asking about – it sounds right (pushing more PageRank to the country pages) to treat the country pages as the first tier of several sub-sites.

    @Keith, if you cut links between pages, you’re removing any anchor text reputation that came through the nofollowed links. As an example, cutting links between your second tier pages would also cut off that anchor text.

    That doesn’t mean you can’t replace it if you get more pages indexed, but you do need to understand the implications of changes that you make. A third level push is often used by affiliate marketers who don’t expect to rank for second-tier (category) search terms.

    @ Everyone, thanks for your contributions to the community!

  17. Why have a third tier?
    A page isn’t optimal for a keyword until you get to about 6 pages in length. And there is no penalty for more relevant text. 2 tiers is enough, and doesn’t need a site map. Spiders can figure it out just fine.

  18. Why have a third tier?

    That’s fine if your site is small enough to link to every page from the home page, Chuck. But it does limit the size of the site.

    A page isn’t optimal for a keyword until you get to about 6 pages in length.

    Huh?

  19. You can add as much (relevant) text as you wish to a page with no penalty. When a page comes up in the serps there is a number “3K” or “57K” or whatever. Just look up a few thousand and you’ll see a trend of some ranking higher than others. It’s another one of the statistically valid observations. If you put (6) 500 word articles on the same page, on the same topic… Google ranks that page well. But don’t take my word for it. Try it and see.

  20. So you’re recommending that I should use a separate website for sales and conversion? And I should have multivariate testing running on that page so that it’s optimized for conversion instead of ranking well? And because my content sites are limited in size… I should set up a number of them for different segments of my traffic and funnel all the traffic to the sales page? Hmmm, I’ll have to think that through.

  21. No Chuck, if what you’re doing is working for you, keep doing it. But if you’re going to drive traffic to a page, you may as well drive traffic to the best performing page.

    So are you saying that your statistics prove that the ideal page length is 3000 words? You’re killing me…

  22. Great stuff Dan! We actually were using this technique last week on a client’s site that we combined with a social media campaign in order to maximize the link juice flowing deep into the site. I’ll keep you posted on how it works out.
    Toren

  23. Great stuff as usual Dan.

    I just took a look at GWT and what they’re reporting on internal linking. I’ve got more links going to my privacy pages than to my product pages – DOH!

    Anyway, just wanted to add that using GWT and the internal links report is a super easy way to get a “view” of your internal linking structure.

  24. Thanks,

    Is there a good alternative that will pass back pagerank? Right now I’m having to link to all the category pages on each product page (which is actually helping the overall PR flow to the category pages – thanks to the PR Bot Tool, I can see that) and it’s spreading the little PR that my product pages have too thin.

    I would like to keep the links on each page for usability, but implement a nofollow on the ones that the product is not linked from. ex. http://www.seefred.com/cgi-bin/shop.pl/page=tequilaworm.htm

    This can only be done dynamically and my site is straight HTML.

    Paul

  25. The easiest way is to have categories listed twice

    When you list all the category items, have them nofollowed

    Then have something like “See more items from XYZ category” and have that followed
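
    In markup, something like this (just a sketch):

    ```php
    <!-- Item links in the category listing: nofollowed -->
    <a href="/widgets/blue-widget.htm" rel="nofollow">Blue Widget</a>
    <a href="/widgets/red-widget.htm" rel="nofollow">Red Widget</a>

    <!-- One followed link into the category, with descriptive anchor text -->
    <a href="/widgets/">See more items from the Widgets category</a>
    ```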

    Also your sitewide anchortext is “Home / Product List” in the header and in the footer it is “Take me back to the product listing”

    I am sure those are not primary keywords

  26. Well, why not just have them all nofollowed, use that javascript and just make sure you have one decent home link on each page as the only link for product pages.

    That will then create a site built from multiple spider circles

    Looking at the front page you should nofollow duplicate links that do not contain keywords.

    [SEE ALL ITEMS IN THE CATEGORY]

    I am not sure what to do with the I II III IV links but I would try to make them pass useful anchor text

    Be warned that the more you move away from a ball linking structure, the easier it is to mess things up if you have external links or links off to pages that shouldn’t get so much juice.

  27. Using the PR BOT tool I’m noticing that when I add the category menu on the product pages it’s having a substantial influence on the category pages’ PR, which in turn is helping the products under that category.

    The tool is showing me that before I started adding the category menu on each product page it had a PR score of .061 [I’ve removed all the leading 0s] (I already had some menus on pages at this point, probably about 120 or so).

    Now I’ve gone through and have about 245 IBLs and it’s bumped the category pages up to .121 and I have about 350 more products to go, so I’m anticipating a much better flow through. Not to mention, some of these older products don’t have any crosslinks whatsoever.

    So, I’m a little concerned about not having ANY followed links on the product pages. What I’m probably going to do is take out the [I II III IV] links and just link those secondary pages from the actual category and shove some less important products in those pages so it doesn’t hurt too bad.

    Also, when I nofollowed duplicate links from the home page it seemed to dump them into the supplemental index. My Toys&Novelties 1 category was dumped! So I figured it really doesn’t hurt anything by having duplicate followed links because the same amount of PR flows through, whether it’s 2 or 1.

    Unless, of course, I’m missing something completely!! Which is a possibility, this is still pretty new to me.

    Also, thanks for the heads up on the anchor text, I’ll have to figure something out.

    Paul

  28. Dan,

    You keep telling us we want to have as many pages as possible indexed. I’ve been wrestling with search engines for quite some time to reduce the listed size of my site from many thousand pages down to about 300 meaningful ones. I succeeded, but now, after reading SEO Fast Start and your posts here, I started questioning whether it was the right move.

    Let me tell you the story in more detail. I run a php-nuke site. This is an open source CMS, in case you don’t know. One of the add-ons I have on it is an Amazon affiliate module. This module generates literally thousands and thousands of pages. Basically, every product in Amazon’s inventory has a dedicated page on my site. On top of that there are various groupings and searches. Problem is, the only person who ever used my affiliate module to buy from Amazon is my wife :).

    I had literally thousands of pages indexed on my site. Problem was, those Amazon pages generated quite an amount of weird, untargeted traffic. And no sales, as I said already. So, I decided to get them de-indexed. It took several months, and they are long gone from the engines.

    Now it looks like I was completely wrong, and having those pages indexed is good for boosting the rankings of the important ones? Do I understand you correctly? All of those pages are tier three and deeper, the majority probably being tier four. Does it make any sense for me to try to get them indexed now? Could you advise, please?

    Thanks, Misha

  29. Misha, it sounds like those Amazon pages are just duplicate content to what’s already on Amazon. I wouldn’t be working to get those indexed, I’d be working to get your most important content indexed.

  30. Instead of building pages, I’d get text links with good relevant anchor text. One of my sites with 119 links and no text on the site to speak of is currently ranking higher than other sites:
    #10 | PR: 3 | Google Cache Date: Sep 12 2007 | Age: 05-2007 | Y! Links: 118 | Y! Page Links: 119 | Alexa: 198

    #11 | PR: 4 | Google Cache Date: Sep 13 2007 | Age: 04-2003 | Y! Links: 6,020 | Y! Page Links: 0 | Alexa: 48

    Though I just noticed that “Alexa” may be pulling some weight here. And I have no clue how they count links.

  31. Chuck,

    Right now, you can fix pretty much anything you want to with more links and anchor text. :D

    Alexa’s link data is based on the Alexa/Amazon crawl. Depending on which Alexa (now Amazon Web Search) API call you use, you can get the total # of links, or the # of *sites* that are linking.

    The count in your comment looks like # of sites to me, and I bet you a donut that it correlates strongly to the age of the domain when you look at enough sites.

  32. Not sure if I agree with your “Sprinklers” example. For every example like this you can find websites that look exactly the same as the “Sprinklers” website and don’t have any issues getting product level pages indexed.

    Just to be clear Dan, you’re saying that websites can have TOO MUCH internal linking and you should nofollow some of this internal linking, correct?

  33. Jaan, enough links coming in can “fix” structural problems.

    Just to be clear Dan, you’re saying that websites can have TOO MUCH internal linking and you should nofollow some of this internal linking, correct?

    No, that’s not correct.

    I’m saying that it’s worth actually understanding how PageRank flows inside your site, and using the tools that are available to make modifications when doing so can help.

  34. Sure Dan, that is what I thought you meant. So I will ask it this way:

    Within your concept of how page rank works, within a website, the idea of PR hoarding or PR leakage DOES exist?

    My feelings are simple. Linking out to contact us pages or shopping cart pages is not going to hurt your individual doc’s PR enough to make that big of a deal. I see docs rank fine every day that have substantial links out to such pages. This must be where this statement makes sense:

    “Jaan, enough links coming in can “fix” structural problems.”

    Now, when it comes to internal architecture, responsible internal linking just makes sense and should be part of everyone’s SEO campaign. I agree here.

  35. Dan, I definitely understand how PageRank flows inside my site, and additionally I installed the PageRank Bot tool for testing.

    Just for my understanding, do you mean that using the “nofollow” attribute in some specific cases is not OK?

    For example, if I have a page for “Tell A Friend” or any page with too little content, can’t I use the “nofollow” attribute in the links to those pages?

  36. @Jaan, I’ve written a few very detailed posts on this, so I won’t try to rehash it all. If you want to understand how things work, it’s easy enough. It’s not about “leaking” PageRank, it’s about managing it. If document A links out to a bunch of overhead pages, that’s not going to hurt document A at all – but it does mean that document A is passing less PageRank than it could, into other pages that may be more important.

  37. @John, “tell a friend” pages would be a perfect example of an overhead page that doesn’t even need to be indexed. Nofollowing links to that page would leave more for your other pages.

  38. Depending on how recursive calculations really work with PageRank and other factors these days, if Page A is passing on juice to lots of external sources rather than internal documents that ultimately flow juice back to page A, then linking out could potentially hurt the ranking of Page A.

    But you can always compensate: on pages where you are linking out a lot, you can also increase the internal linking dynamically.

    This can get a little bit extreme. As an example, I have a page on my blog with 82 comments and 143 trackbacks/pingbacks … and all those links are followed to related sites.

    Yahoo site explorer reports 3590 links to the page

    To compensate I have a growing tag cloud on the bottom of every page which works on a formula of 10+(4 x # links)

    In the past the page ranked 3rd for the primary term; these days it hangs around 6th or 7th. Positions 1 and 2 are untouchable, as they have 2 years of links for the same topic just pouring in, without lots of external links on the page.
    Yahoo don’t seem to mind, ranking 5th, and MSN on page 7 with a different page. MSN really don’t like it.

    I think one of the biggest problems is that all those tag links really dilute the topic of the page, and there is no way (unlike Yahoo) to tell Google not to assign value to the content, but to still pass juice.

  39. Hi Dan,

    Great post!

    I always thank you for giving me a lot of practical SEO techniques.
    I’ve read this article over and over again so that I could understand it fully, since English is my second language.
    (Yeah, I’m a Japanese guy who commented on proxy hacking before.)

    I’m going to apply your technique to my clients’ sites.
    Thank you very much again.

  40. Pingback: To link out or not to link out? Thats is the question - WebProWorld

  41. Dan,

    This is again about my Amazon pages :). After giving it the second and third and N-th thought I decided I want to give it a try.

    All my important pages have been indexed for a long time, and I’m working on SERPs improvement mostly. Trying to get the Amazon pages indexed is not something that requires any significant effort – I just removed the corresponding Disallow line from my robots.txt yesterday. At the same time, if successful, it should give a significant boost to my other pages’ rankings – if I understand correctly how this works. Based on the results Halfdeck’s tool is showing me so far, we are talking at least 15000 extra pages (and growing) for a site that has about 300 pages indexed now.

    Duplicate content may be an issue, or may not be. My pages do not repeat Amazon’s exactly; they have a noticeable part of their own content and overhead, and they do not use all the Amazon content. If you are interested, you can see for yourself at http://www.funandsafedriving.com/amazon-store.html .

    Is it enough? IDK. At least half a year ago I had quite a few of those pages indexed, and I did receive noticeable traffic on them – which means they were showing up in good places for some long tail searches. Granted, things and algorithms have changed since then. Only testing will show if it still works.

    So far I’ve had great success applying your teachings with the help of Halfdeck’s tool. We’ll see how it goes this time, and I’ll keep you posted.

    Thanks, Misha

  42. Hi Dan,

    You say, “third tier pages link back to the category pages.”

    I’m not able to understand the reason though I’ve read the following section.

    Would you mind explaining it to me briefly?

    Thank you very much.

  43. Pingback: WordPress: How to Nofollow Sitemap Link Everywhere Except on Home Page

  44. Pingback: Aaron Wall's SEO Book.com

  45. Here is an update on my experiment. Currently Halfdeck’s tool found about 340,000 pages and crawled 91,000 of them. Search engines found some, too, but the results are mixed so far. I definitely got a serious boost on MSN across the board. And I got a serious slap on Yahoo across the board, too. Google seems to be indifferent so far, with just a few noticeable changes both up and down.

    Yesterday I realized that all of my Amazon pages have identical titles. I had to spend some time on coding trying to figure out the solution, and now they are getting unique titles with the product name and my site name. I hope this will push things in the direction I want. Will keep you updated :)

    Regards, Misha

  46. I’ve read through your site with great interest Dan. Most of the techniques you describe seem to make sense and I would love to integrate them into my site.
    The question I have, though, is that the techniques you describe are all well and good for a site where the content is fairly static, or at least only updated by a single person or group of people, and can be applied by hand on each page. How would I incorporate these techniques into a community site that uses a content management system like Drupal or Joomla?

  47. I’ve completely restructured my site (actually since September), but Google doesn’t seem to be seeing (or updating) the changes I’ve made.

    When I go to Webmaster tools and look at the internal links table it shows I only have 170 links pointing to the home page when in reality I have 600+…any ideas on how long it takes for these things to kick in??

    Paul

  48. @Aidan, to do the same with a CMS, you would want to make changes in the templates to affect run-of-site links. To do anything more sophisticated than that, you’d need some code changes.

    I know there’s a WordPress plug-in somewhere that lets you change nofollow on a link by link basis, but I don’t use Drupal or Joomla and haven’t looked. One of our Stompernet projects coming up will probably involve Drupal, so we may have to tackle that soon.

    @Paul, The data at Webmaster Tools appears to be 1-2 months behind.

  49. With my internal linking…would it matter if I used the full path: “http://www.mydomain.com/category/page.htm” or “/category/page.htm”. Is there a benefit (or problem) of using one or the other?

    Thanks!
    Paul

  50. The primary reason never to use relative links is that if you get scraped, you don’t get as many backlinks with relative links.

    Syndication of any kind really requires everything to use permalinks.

  51. Never really thought along those lines, Andy. But when I give it a thought – yep, I bet you are right. So, one needs to give up good programming practices in favor of good SEO practices :)

    Coming back to my experiment with generating a massive number of internal links. Currently Google shows more than 5000 pages indexed, with more than 800 being displayed. Lots of my low ranked keywords got a serious boost of several hundred positions, but it does not seem to affect first page ranked keywords in any way, nor did it propel any keyword to the first page. A similar picture is on Yahoo and MSN.

  52. Dan,

    I did not really watch referrals, cause they are only about 10% of my total traffic – 85% comes from search engines. I looked at the GA chart – nothing that I can attribute to those internal links…

  53. It’s been really volatile lately. But based on what I see on forums, we just have another google dance, so I can’t attribute this to my effort :)

    On average traffic did not change really for the last three months probably…

  54. With my internal linking…would it matter if I used the full path: “http://www.mydomain.com/category/page.htm” or “/category/page.htm”. Is there a benefit (or problem) of using one or the other?

    The Best SEO results come from
    http://www.mydomain.com/xxx/
    with xxx being random.

    This avoids “coloring” of every word on the page with the category name. Like if it was a painting with one color crayon scribbled on the whole surface.

  55. There are some other things going on that are weird – some people have emailed to suggest a different PR for me in their toolbar, and data center tools such as digpagerank are returning nothing but errors.

  56. I like the idea of using NOFOLLOW on internal links to funnel PageRank, but while researching it I found a few discussions on WMW where the concept gets panned by some senior members and moderators, e.g.:
    “I would never use nofollow on internal links.”

    Source: http://www.webmasterworld.com/google/3392958.htm

    The objections seem to stem from the initial meaning of NOFOLLOW, namely, “As the owner of this page I can’t vouch for that offsite page I’m linking to.” There’s an implied statement that the link points to a low-quality or untrustworthy page — which, if local, would tend to hurt the rankings of one’s own site. Your blog post mentions this possibility as well.

    Of course, HTML tables were never intended for layout grids, but were used as such for a very long time to good effect. (By which I mean, perhaps using NOFOLLOW as you propose leads to a useful end even if it was never intended to be used that way.)

    So, my question is whether anybody is having good results with this approach. Most commenters that I saw on WMW suggest that their rankings dropped after deploying NOFOLLOW on local links, and recovered when the NOFOLLOWs were removed.

    My personal experience is not conclusive, but not positive either: the bio page on my blog dropped out of the SERPs completely when I inadvertently NOFOLLOWed the link to it from the sidebar. Removing the NOFOLLOW restored my bio to page 1 of the SERPs in a search for my name.

  57. That thread was from July. If anyone still believes that now, they haven’t been paying attention, or aren’t willing to change their mind.

    A lot of people felt that way before we got clear confirmation from Google that they don’t put any special meaning behind nofollow other than “don’t follow this link.”

    If you’re uncomfortable with it, you don’t have to use it.

    It sounds like you got exactly what you should have expected when you nofollowed all the links to your bio page. “It works as advertised.” :D

  58. From what I understand, NOFOLLOW isn’t a vote AGAINST a link, it’s just not a vote FOR a link. So I wouldn’t imagine (though I’m not an expert by any means) that a spider looks at a NOFOLLOW link and flags it as “bad”. If this were the case then you could abuse it to cause damage on other sites.

    As far as your bio page problem, I think that just goes to show that it works, because it didn’t penalize or completely remove your page from the index. The second you removed it, the page was included. Sounds like it works like a charm.

    The trick is using it wisely and not shooting yourself in the foot.

    My 2 cents (which happen to be on sale today for half off!)

    Paul

  59. Quick question.

    If I already use Google’s webmaster tools to submit a sitemap.xml file, how much more value would there be in linking to an HTML sitemap from my index page?

    When I use seo4fun’s tool I see a drastic dip in my PageRank.

    For measurement’s sake:

    Index page w/o sitemap link: .385
    -category page – .303 (2nd tier)
    -product page – .019 (3rd tier)

    Index page w/ sitemap link: .192
    -category page – .155
    -product page – .014

    At first glance there doesn’t seem to be any reason why this amount of PageRank bleed would be worth it. Am I missing something??

    Thanks
    Paul

  60. Pingback: Nofollow for pages on your own site? - WebProWorld

  61. Pingback: What Is PR Sculpting And Does It Work? - Search Marketing Blog from Cincinnati, Ohio

  62. Pingback: Architecting Web sites - Design from the SEO perspective | SEO Theory - SEO Theory and Analysis Blog

  63. Someone please help me understand how “Dynamic Linking” is related to this article. Are we talking about the content and links being generated during page load, or does this mean something entirely different that I am not understanding? In what way has dynamic linking been used to reduce the number of overhead pages being indexed? Maybe someone just needs to define “dynamic linking” so clueless me can understand it.

  64. Dan,
    The information you provide is fabulous and priceless. One little challenge though: when I clicked through from the SEOFASTSTART structure page and then clicked through to this page, I got a couple of scripting messages asking me if I wanted to cancel or continue. This happened twice, to which I clicked the “continue” button. It brought me to this page, but I can’t see any diagrams, just some text saying prflow1.png, and that’s the case for the other diagrams as well.
    Is the problem on my browser or on the page?
    I would love to see the diagrams as I am a very visual learner.
    Again thanks for a fab site and your generosity.

    regards
    Bryan

  65. Absurd question… but????

    What if I only have a single page, and include tabs for categories and sub-categories on that same page – meaning one long, long page? So what’s the use of linking, or more precisely dynamic linking, when I don’t even have 2 pages????
    Please answer – I’m new to SEO.
