129 thoughts on “Site Structure Questions”

  1. It was my understanding that if you were building an authority site, links to other relevant sites were natural and therefore a good thing. Supposedly any PageRank lost was made up for by the outbound links fitting what an authority site should look like.
    And yet you suggest using nofollow tags for these links. Am I missing something?

  2. David, from the home page, unless you want to pass along some link love, I would use nofollow. Past the home page, it’s not such a big deal.

  3. I had just heard that Google is again changing their algorithm and PageRank will be much less of an issue for rankings. So if this is true (and I don’t know if it is or isn’t), then linking to lower PR sites wouldn’t negatively affect rankings, would it?

    Thanks,

  4. Great resource, Dan. I am interested in using the nofollow approach for channelling PageRank, but as a non-technical person using a CMS, how easy is it to add nofollow elements to global navigation that is automatically generated by the system? Would this be a coding issue?

  5. Ron, linking to lower PR sites should not be an issue at all.

    Greg, it would be a coding issue and could be a challenge depending on the CMS. There are other ways to accomplish some of this – for example by focusing on adding more links to your most important pages instead of cutting off links to your overhead pages. “Addition by addition” instead of “addition by subtraction.” :D
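    For reference, the markup change itself is just a rel attribute on the anchor tags in the navigation template – a minimal sketch (the page names here are only placeholders):

      <ul id="nav">
        <li><a href="/widgets/">Widgets</a></li>                  <!-- normal, followed link -->
        <li><a href="/privacy/" rel="nofollow">Privacy</a></li>   <!-- overhead page, nofollowed -->
        <li><a href="/contact/" rel="nofollow">Contact</a></li>
      </ul>

    Whether you can get at that template easily is the CMS-specific part.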

  6. Dan,

    great document – thanks.

    I have a PR5 4-tier website (home – categories – subcategories – products). I can see that some of the product pages are not indexed, and am not sure if this is because their parent category pages have little content or because, being a 4-tier site, it’s harder to get them indexed.

    Question – is a 4-tier site a complete no-no? What creative ways are there to re-organise the site into a 3-tier site (or give the SEs the impression that it is a 3-tier site) but still give the users a meaningful menu structure to follow? Would a site map help here?

    many thanks! Alex

  7. Ron, PageRank is “only one factor,” but it’s a big one, and one of the easiest for us to optimize. A lot of the factors nowadays involve actual user behavior on your site.

    Alex C, a 4-tier site isn’t a complete no-no, and from a usability perspective that category organization probably makes sense. I would think about a site map page (you’re already on it) to push some PR into the sub-cats.

    You might also consider just leveling it out to 3 tiers. You can make a functional navigation system that actually links to all of your categories and subcategories, but still maintains a good visual separation for users – see this article:
    http://alistapart.com/articles/domtricks2/

  8. Dan, I have a 3-tier linking system working pretty well. The long tail is rocking on Yahoo & MSN but not Google. I know, get more links… But I am wondering about American Bridal’s 2nd & 3rd tiers, with the ugly footer with 50+ links pointing to categories. Is this a good tactic as long as you stay below the 150-link total for Google? And is it helping with their internal links and pushing PageRank?

    Thanks,
    Jeff

  9. Hi, it’s me again… Sorry, I can’t get enough of this place.

    Wow! The structure chapter is awesome! I didn’t think of using nofollow on anything other than my non-money pages.

    On a side note… I tried helping someone on the high rankings forum. I brought up using no-follow on privacy policy pages and discussed a little about page rank bleed, etc. To make a long story short, a moderator (Randy The Root) gave me a lot of grief. Anyway, I trust you and Leslie. I am just going to hang out here and Stomper (:

    Thanks,
    Jeff

  10. Hi Dan,

    We have a site with franchises in various cities. Each franchise uses the one main site for their homepage. Is there any way to give them each their own site by city and get those pages ranked by search term and city – keeping in mind the main site content is always the same?

    Thanks,

    George

  11. Hey Jeff, there’s been more than a little noise about the whole nofollow/structure thing, so I am writing one response to all of ‘em this week. It’s not about “leaking” PageRank; it’s about the problems PageRank creates when the search engines try to apply it inside of a site. It works great for analyzing the web, but it’s not so good at identifying the most important pages on a site.

    George, have you thought about giving them a single page or a small set of local pages, maybe a subdomain? The big challenge is creating unique content for those pages, but even a “light” content management system could make it possible for the franchisees to add their own content.

  12. George, I echo what Dan says. Depending on the volume of searches per month, you should be able to get each search term and city ranked without their own site. You will have to do a rewrite for each page though.

    1st – do some keyword research and check the volume for each. Then set up a subdomain for each franchise: http://www.franchise.site.com or just a new page for each: http://www.site.com/franchise

    Make sure each franchise is linked from the homepage and has its own unique title with the search term and city you are trying to rank for. Be sure to incorporate those core keywords in your copy on the page.
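    Just to illustrate, the titles might end up looking something like this (the service and cities are made up – plug in your own):

      <title>Carpet Cleaning in Denver | Acme Franchise</title>
      <title>Carpet Cleaning in Boulder | Acme Franchise</title>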

    You can also get more page rank and internal links by linking back and forth to each franchise (more on that later).

    Jeff

  13. I’m building a new commerce site for personalized jewelry (i.e. jewelry with names, dates, messages, etc.). The jewelry is all original, so the actual product names are meaningless as search terms. The search traffic will come in at the home page level (personalized jewelry) and the category and subcategory levels (i.e. personalized rings, name rings, mothers rings, etc.), not the product level.

    So, it seems I would want to concentrate my PR at the home page, category and subcategory levels and not send any on to the product level (or to some meaningless subcategories such as “Siena Collection Name Rings”).

    Am I on the right track with this, Dan? If so, could you speak to how I would modify the 3rd-level push to concentrate on levels 1, 2 and 3, and not send any PR further down the line? I have an idea of how to do this, but since it’s easy to get wrong and so detrimental if you do, I’d love to get some expert advice.

    Thanks so much Beth

  14. Beth, I’d work on a 3-tier strategy:

    1) Home

    2) Categories & Sub Categories

    3) Product Pages

    If you write detailed product descriptions and use the most important features of each piece in page titles, I wouldn’t be surprised to see a lot of very targeted traffic coming into the product pages.

    Go Google some long tail searches and see how little real competition there is: women’s+gold+ring+with+opal+inlay

    BTW, do not overlook the huge opportunity to get into the Google Product Search by submitting a feed. It’s easy and it works.

  15. Thanks for the response, Dan. I may be a bit dense cause I’m a bit confused. With the three tier structure you suggest, you have Categories and Subcategories as one level. Are you recommending getting rid of subcategories?

    Limiting to only three tiers (meaning no subcategories) seems to be a tough challenge without ending up presenting the visitor with an awful lot of products on the Category page. Do you think it’s better to have only three tiers and have dozens of products presented on one page (or use pagination) rather than using subcategories to narrow it down?

    Another option would obviously be to create more main categories. But what I’ve chosen to do is use main categories to create multiple ways to navigate the products. So, on my main nav, I have 4 navs – Nav by Product, Nav by Personalization (name, date, etc.), Nav by Theme (mothers, couples, etc.) and Nav by Collection.

    Sorry to take up so much space, but it’s dealing with the detail of real examples that I really need. Hopefully it helps others. Thanks for your help. Beth

    P.S. I’d be happy to share my url with you, but I’m reluctant to post it here as the site is still in development.

  16. I’m not sure how many categories and sub-categories you’re planning for, Beth. That’d be more helpful at this point than the URL.

    Let me give an example for now. Let’s say you have 10 categories with 4 sub categories in each. That’s a total of 50 pages.

    In your navigation, you link to all of these pages. That doesn’t mean you have to present users with 50 choices – you can use an unordered list to present users with an expanding menu, as described in this article:
    http://alistapart.com/articles/domtricks2/

    This would effectively put all of your categories and subcategories in the second tier, and your products into the 3rd, while giving users a nice navigation menu.
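    A bare-bones version of that markup is just a nested unordered list (the category names here are only placeholders), with CSS or a little JavaScript hiding the sub-lists until the user expands them:

      <ul id="nav">
        <li><a href="/rings/">Rings</a>
          <ul>
            <li><a href="/rings/name-rings/">Name Rings</a></li>
            <li><a href="/rings/mothers-rings/">Mothers Rings</a></li>
          </ul>
        </li>
        <li><a href="/necklaces/">Necklaces</a>
          <ul>
            <li><a href="/necklaces/name-necklaces/">Name Necklaces</a></li>
          </ul>
        </li>
      </ul>

    With CSS turned off – which is roughly what a spider sees – this is nothing more than a plain list of links to every category and subcategory.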

  17. Checked out the article and I see what you mean now. Unfortunately my ecommerce platform (Volusion) doesn’t have that menu option. In addition to the straight text links, they offer Rollover popout (which wouldn’t be spiderable, right?) and Tree, which would make my nav reallllly long.

    I think the expanding menu you suggest would work well for my site because I don’t have a terribly long list of subcategories – five would be the most for a particular category. And in each of my 4 nav options (By Product, By Personalization, By Theme, By Collection), I only have 3-5 Categories. So totals are 16 categories and 45 subcategories.

    Maybe I’ll check to see how much it would cost to do custom development of an expanded nav menu for the Volusion platform.

    Thanks. Beth

  18. Dan,
    Could you please tell me what you think of the nav at http://www.buy.com? Is this in keeping with what you were thinking? I’m told they’re html menus controlled by javascript. Kosher, right?

    I notice that the main category is still clickable, where I believe in the example you sent me to it wasn’t. Only the sub cats were. Is it better to not make the main cats clickable since you wouldn’t then be truly eliminating a level?

    Really appreciate your help with this. Hope I’m not wearing out my welcome. Beth

  19. Yes Beth, it’s the same thing, just a different way of implementing it. If you look at the page with CSS disabled you can see how it looks to a spider – just a nice long list of links.

  20. The site structure chapter was a very interesting read, because I hadn’t thought about this until now.

    I have search > product > enquiry about product

    The enquiry-about-product page is different for each product, so there are tons of enquiry-about-product pages which are pretty much the same. Would you recommend excluding these pages from the search engines?

  21. Jeff, I’d rather not give my URL out publicly, so do you have any objection to me emailing you via your site?

  22. Dan,
    Following your article on Shirley Tan, I am unsure about linking on my product pages. Right now I have 3-4 directly related products below each product, and below that, 3-4 loosely related products. Should I have links back to the sub-category headings? Should I lose the 3-4 loosely related products? I’m not quite sure where to put the link focus.

    Thanks,
    Ben W

  23. Ben, I don’t see any reason why you would need to remove those links. Adding more links to help boost pages that aren’t getting indexed and/or ranked is generally a pretty safe approach.

  24. Question from the Shirley Tan case study:

    “added some internal links for each page to help boost her reputation for each search term.”

    Could you elaborate a bit on this, or tell me where to find it in your materials?

    By this do you mean rather than always linking to the “bath and soap wedding favors” page with “bath and soap wedding favors”, to add (or change some) internal links pointing to that page with varying text, like “bath salts favors”, “pillow sachet favors” for example?

    Does that dilute the relevance for the core keyword?

  25. Hello Dan, I’ve got a question not about site structure, but about the structure of a network of websites. I don’t know which way of doing this is better. My current concept is shown here: http://img442.imageshack.us/img442/8849/networkingcf9.gif
    There is one big, most important website (the HUB), with secondary pages on different themes. The network’s sites (relevant content) are linked from the hub’s secondary pages, while those websites link back to the main page of the hub.

  26. James, Chapter 4, start around page 46. Varying anchor text is one part of it, yes – bath wedding favors, soap wedding favors, etc.

    As far as I’ve been able to tell, you can’t “dilute” real content that way, but I haven’t done much testing with empty pages.
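    To picture it, the varied internal links might look something like this (the URL is made up) – different anchor text from different pages, all pointing at the same category page:

      <a href="/wedding-favors/bath-and-soap/">bath and soap wedding favors</a>
      <a href="/wedding-favors/bath-and-soap/">soap wedding favors</a>
      <a href="/wedding-favors/bath-and-soap/">bath salts favors</a>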

    Shimon, the easiest way to look at it is to think of it as one big web site. The fact that it’s spread across more than one domain isn’t really relevant in terms of structure.

  27. Yes, but in terms of the network? I’m creating an automotive website, so I will have, for example, sports cars and racing cars sections. So my plan is to create a smaller website about sports cars, put a link to it from the sports cars subpage of my main website, and put a link back to the main domain. I know, Dan, that’s not about website structure, but I think it could help others working on competitive phrases.

  28. Here’s a wrinkle for you to consider…

    If you build a hub site and a dozen baby sites… and there are 2-way links between them, you’re sort of swimming against the current there, because everything we know says that search engines want to discount 2-way links.

    It might end up being harder than having one big site, no? Or it might go along great until one day one of the search engines decides to handle 2-way links a bit more harshly.

    Most of what I see is people going the other way. Sitepoint consolidated their network of sites (ecommercebase.com, webmasterbase.com) into one big site a few years ago, and they’re doing great.

  29. In my ‘inexperienced’ opinion, I would go for the same domain. It sounds like an absolute minefield going with different domain names linking to a hub. Let me give you an example of this going badly wrong, if not done correctly.

    ebookers are a large travel portal; they had domains all over the world, i.e. ebookers.ie, ebookers.es – the list goes on. Each domain linked to the hub, in this case the .com, and all of the content was unique – well, it was unique in that it was in a different language.

    Anyway, one big mistake they made was having the sites linking into each other while they were all on the same IP class, so guess what – Google dumped the whole lot, apart from the .com.

    Traffic zonked. I’m not sure what the situation is now with them, but it’s a lesson to be learnt.

  30. I get it. Typical 2-way linking is http://www.domain1.com <-> http://www.domain2.com

    But I want to create http://www.domain1.com/sportscars/ -> http://www.sportscarsrelated.com -> http://www.domain1.com

    All these baby sites would be valuable sites with different whois information, hosted in completely different places. I know that it works for sure when there is no loop (no links going from domain1.com to the babies), but what about the scenario I showed above?

  31. Yesterday, I read that sites should always have absolute instead of relative links, so spiders don’t have trouble crawling them.

    For a simple site structure do you think there is a problem with using relative links?

  32. Hi Dan

    I have just finished reading “fast start” and am thinking about applying some of your ideas on site structure. My property directory site currently has 4 levels of navigation, but as it expands to cover the whole of the UK that will need to increase to 6: home, city (e.g. London), district (e.g. North London), postcode (e.g. N14), area (e.g. Southgate), services (e.g. Surveyors).

    Even with the current structure, many of the lowest-level pages (and they are the most important ones, being the landing pages) have been put in the supplemental index. I am working hard to build up the number of incoming links before adding any more levels of navigation, but do you think your “third level push” idea can help with the fourth, fifth and sixth levels?

  33. Say I have a link to a page from two different places on the same page – does that mean link juice will be passed twice to the destination page, or does the search engine just neglect the second link to the same page? In simple words, do you need to add nofollow to a page link if it’s linked two or more times from a page, or do you think it’s fine to link more than once to a page from a single page?

  34. Justin, you’d need a lot of incoming links to really make that work. If you put some effort into bringing in links to your second tier pages, then that sort of pushes things one level deeper. As I mentioned in the link building course, it’s not so much “how many clicks from the home page” as it is “clicks from the web.”

    Maninder, we don’t know exactly how that case is handled, but if I had to bet, I’d say it’s probably treated as one link in terms of PageRank (the technical reason is that the graph can’t have parallel edges), but that the text you use in those links all passes to the page as reputation. So I wouldn’t use nofollow.

  35. Do you know of a snippet of code that would deal with multiple browsers and variable screen resolutions? We are using a static 800 x 600 resolution and it is starting to look like a grandmother site from the 80’s. What can we do to resolve this situation?

  36. Brian, I think you mean a site from the 90’s. :D Which is about how old the idea of using JavaScript to deal with different screen resolutions is. What you really want is a fluid layout that resizes well to accommodate larger and smaller screens.

    Amazon.com is one of my favorite examples of this. You can resize your browser window all kinds of ways and you have to go pretty small to “break” the site. Larger screen sizes allow you to see more of the sales copy and offers, but it’s still rock solid at 800×600.

    If you have to go with a fixed-width layout, you’re probably not sacrificing anything by designing for 1024×768 at this point. I have a couple students who did that, with at least slight gains in conversion.

  37. A quick note to follow up, thanks for the email clarifying things, Brian:

    I’m not the best person to give you a design tutorial, but that’s OK because there are plenty of people who can. I’d start with these two articles.

    The first one explains the concept of liquid layouts:
    http://www.maxdesign.com.au/presentation/liquid/

    The second one describes exactly how to do what Amazon has done – fixed width on the left and right columns, and the center column resizing with the screen:
    http://www.alistapart.com/articles/holygrail

    Lots of great stuff for designers on AListApart. I feel like I should pay them, but all I can give them is a link. :D

  38. Hi Dan,
    Here are a few lines from your book. After your answer to my question above, and after reading these lines, I have become more confused.

    From Page 48 of SEO FAST START

    “However, you can add a global link that contains important keywords for the home
    page in the footer or elsewhere on the page, and use nofollow on the “Home” link
    that you create for your human visitors.”

    While earlier, in answer to my question about two links to the same page from one page, you said it might be fine if I don’t add nofollow on any of the links.

    And the lines above from your book, I think, say to add nofollow to the link to the home page with the text “HOME” and not to add nofollow on the other link to home which has keywords in it. If link juice doesn’t go to a page twice when we link to it twice from a single page, then I think there’s no need to add nofollow on the first link to the home page.

    Sorry if something is not clear. Let me know and I can explain it more. I just want to understand how it works and how it doesn’t.

  39. Maninder, what you quoted from the book there is about the keywords in the links. Anchor text.

    If “home” is not an important keyword for your home page, then there’s not much reason to point a followed link at the home page using that word.
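    So, roughly, the pattern from the book looks like this (with “blue widgets” standing in for whatever your real keyword is):

      <!-- the keyword-rich link to the home page, left followed -->
      <a href="/">blue widgets</a>

      <!-- the link your human visitors expect, nofollowed -->
      <a href="/" rel="nofollow">Home</a>

    And if “home” actually is a keyword you care about, there’s no reason to bother with the nofollow at all.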

  40. Dan,
    I am restructuring my dating site to allow using the methods of SEO Fast Start. It was originally built as a highly dynamic site with PHP, using lots of parameters on the URL to track sessions. I am replacing all of the URL parameters with other session-tracking methods to get safer treatment by SE bots.
    Here is my question: my site is a dating site. The third-level pages are member profiles. I do not anticipate anyone searching for a specific member from a SE, so I am in a different situation from a typical catalog-sales site where you want people to land on the third-level pages. I really want them to find and land on either the home page or a second-level page. But I am thinking that the third-level pages (profiles) are still a valuable resource to pass link juice back up to the important second-level pages. So I am thinking I should seek to get the profiles indexed so they can support the second-level pages, even though I do not anticipate them being searched for. Is this right?
    Second question: if the profiles of a dating site are to support the second-level target pages, can they be dynamically generated from the member database in PHP and actually be only one dynamic PHP page (as I currently have), or do I need to store a page for each member so that they can be separately indexed and pass up link juice? I think the answer is that they have to be separate stored pages for this purpose, but I am unsure. I have read that Google can index dynamic pages, but I think this probably means one index entry for the dynamic page, not one per flavor based on parameters. But if there are links with the various parameters, maybe Google will produce an index entry for each generated page, like site.com?member=5 and separately for site.com?member=10 and so on. Please help me sort this out.
    Any further thoughts on the special situation of a dating or social network site are appreciated.

  41. Gary, it doesn’t matter that the profile pages have variables in the URL – the important thing is that they have unique content, and the #1 thing would be to create a unique TITLE tag on each one.
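    As a rough sketch of what that looks like in the single PHP template – the field names and the get_member() lookup are just placeholders for whatever your member database actually provides:

      <?php
        // profile.php?member=5 — one script, but every member gets unique output
        $member = get_member((int) $_GET['member']);   // hypothetical lookup against your member table
        $title  = $member['name'] . ', ' . $member['age'] . ' - ' . $member['city'];
      ?>
      <title><?php echo htmlspecialchars($title); ?></title>
      <h1><?php echo htmlspecialchars($title); ?></h1>

    One dynamic script is fine, as long as each URL (member=5, member=10, …) produces its own unique title and content.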

  42. My other comment on this, since you mentioned the social network concept. If the members can also blog on their profile, or post links they find useful, or stuff like that, then you have an additional reason why there might be links pointing to those pages from outside.

  43. Hi Dan, I am rebuilding my membership site: http://www.uspropertybuyers.com It is being rebuilt on http://www.uspropertybuyers.net at VisionGate.

    The owner Mark Braunstein had me on the phone for an hour and a half and invited me to Vegas. Free VIP ticket! Whoo hoo! VTribes is gonna rock!

    I am trying to figure out how to accomplish two things on my home page: 1) appeal to home sellers to fill out a form to sell their house, and 2) appeal to investors to sign up as a member. Investors get an optimized webpage for the town they live in, content, downloads, etc.

    Any ideas?

    Thanks,
    Jeff

  44. I was reading something a few days ago about the concept of using robots.txt to keep Google from indexing certain pages (product pages in my case) that may not be that important to you, to better maximize the PR flow-through. Instead of spreading it real thin, just direct it specifically in a way that will benefit the pages you want.

    What do you think about this, is it safe, worth it, etc?

    Thanks,
    Paul

  45. Hi Dan,
    My present site structure is a dynamic PHP structure:

    http://www.website.com/product.php?id=1

    I’ve decided to create a more SE and user friendly URL structure that will become:

    http://www.website.com/product/productname

    Right now, I get quite a lot of traffic through the search engines (as we do have good & unique page titles) but I’m afraid that once I switch to the new structure I might lose my traffic from the search engines until it indexes all the new URL structures.

    Although the old dynamic links will not change and will still exist as they do presently, since the new URLs are merely serving up the old dynamic pages – it’s just that the URLs are more user-friendly – I can’t do a 301 redirect since it’ll cause a loop(?).

    What would be your suggestions in terms of implementing the new URL structures?

    Thanks a lot!
    BTW, the SEO Fast Start book is a great resource.

  46. Hi and Thanks for the Info!

    I have a very modest site about 185 pages I believe. I have 3 sections that contain about 75 item pages that sell various types of widgets.

    I also carry the most important accessory for these widgets. The last line of text, on each list of features, for any of these widgets reads “Check our Fitting Guides for the correct accessory for your widget.” “Fitting Guides” is the anchor text for each of these 75 or 80 internal links pointing to the Fitting Guides with an absolute link.

    I am concerned that search engines, especially the big “G”, may not like this constant internal link pointing at the “guides page”, and wondered if nofollow would be the right thing to put in these repetitious internal links.

    Thank you kindly for a reply!

  47. @Vincent, I’m not quite sure what you’re asking. With 185 pages you may be able to flatten the structure a bit.

    @Chuck, what you need to decide is whether that fitting guides page would be a good fit for any relevant search terms – if so, then you may want to modify your anchor text. If not, and there’s no reason why you’d want the page to show up in search results, then why not nofollow most of those links?

  48. Hi Dan,

  48. I have a site, http://www.seductiontuition.com, which has around 300 links on every page. Basically every page links to every page. It seems almost all my pages are indexed by Google. However, as you can see, my homepage has PR4 and the rest are PR2 at most (PR0 or PR2). The thing that makes my situation unique is that the links are presented as a tree and my visitors really like it. The thing is, even though my pages are indexed, they are outranked even on long-tail searches. Do you think I should use nofollow so that Google only follows up to 150 links?

  49. Dan,
    Great information, and now I must rethink everything! My clients typically are small site owners; frequently the 2nd tier is the lowest tier. Is it still a good idea to have the “site map” links in the footer or not? Even with a 3-tier site, should there be a site map in the footer that is basically a duplicate of the primary navigation? Most of my clients want one, and it’s recommended for usability; what can be gained from using a footer site map from an SEO perspective?
    Thanks,
    Scott

  50. Scott, for a 2-tier site, there’s not as much to be gained. I don’t know why you’d need a site map when everything is linked from the home page already – can’t the home page serve as a site map?

  51. Hi There

    Google have apparently just recently removed the ‘supplemental index’ tag, which was useful for finding poorly optimised pages.

    How do we now find poorly optimised pages that the supplemental tag has been removed from?

    Cheers
    Al

  52. Al,

    The tag is gone but supplementals can still (courtesy of Google) be found with the * searches!
    Check the link on my name, as I had a short post explaining how to do this.

    Guess it’ll help.

  53. Anytime Al :)

    LEO,
    Your site has PHP errors in the header of the page:
    Add this line before any PHP code:

    error_reporting(E_ALL ^ (E_WARNING|E_NOTICE)); // report everything except warnings and notices

    to get rid of those notices / warnings. Then reconsider your directory structure and make it a tree. Place pages on levels by importance. You dilute the link juice too much and waste it all. It’s not link juice anymore … it’s water!

  54. Hi Al

    Thanks. I am working on sorting out the PHP errors. It’s to do with the star rating script. It just occurred yesterday evening.

    I already have the links as a tree, can you elaborate please?

  55. First page … link to the most important pages … then from the most important pages link to the rest. They all must link to the homepage, and each 3rd-level page must link to its upper level, or even all upper levels. It’s like categories in a store. (On the homepage you can put some links to the most important 3rd-level pages – it’s your game how you spread them across.)

    Do as Google says: less than 100 links per page. If you have more, make sure you have a s**tload of content too.
    Keep the URL structure, just change the linking relationships.

  56. Hi Noobliminal,

    I just realised what you meant about the error messages. By adding your code I can make them disappear while I spend some time fixing the PHP. I will put the code in tonight!

    I think I get what you mean about the links. I will figure it out and add another comment with my thoughts on what I could change. Hope to run it by you to see what you think.

    Thanks Noobliminal!

  57. Check back on my site. By Monday I’ll add a tutorial about site structure as I see it – maybe you’ll figure it out from there.
    I always write from a coder’s point of view, so it might help you more.
    Those problems in PHP you have are not errors. The script might still work, but tell PHP to hide them. Any of my scripts generates notices as I write them … pushing the envelope. Those notices are for beginners.
    Regards.

  58. Hi There

  58. I am trying to optimise a relatively small site (less than 40 pages). The site has about 30% of its pages in the supplemental index (most likely due to the addition of an unnecessary tier).

    Is dynamic linking not so necessary for smaller sites? I was planning to avoid dynamic linking due to:

    1) The size of the site.
    2) The fact that I believe I could capitalise on optimising the contact page.

    Cheers
    Al

  59. Hey Noobliminal

    I will wait till you have posted your tutorial on Monday. Bookmarking your site :).

    The thing about my links: the way it is, it’s great for getting indexed. Almost all my pages are indexed except maybe a few (single figures), but as you said, it’s not making much use of the link juice. Checking Google Webmaster Tools, when I click on internal links, each page has a maximum of 200 pages linking to it. The good thing is that almost all my pages have this statistic; the bad news is that it should be over 200, i.e. 300+, which is how many of my pages link to each other. This tells me Googlebot is stopping after crawling 200 links on a page.

  60. Leo,

    Maybe Google Webmaster Tools won’t show them ;) Webmaster Tools shows IBLs with nofollow and so on…
    So don’t take their word for it. Get a general idea, but don’t live by their statistics. Better to use Yahoo! Site Explorer to get linking info. It’s a bit better.

  61. I don’t know, it seems that when I implemented nofollow, Google picked up on it nicely and started showing my links the way I wanted them. For example I had hundreds of links pointing to my shopping cart page, contact page, etc. Once I corrected it with nofollow, the Webmaster Tool presented them correctly.

    Just a note.

  62. Nofollow to pages like privacy policy, shopping cart, sorting pages or other pages with no real value, meant only for getting the user around the site, is … mandatory!

    I meant that Google shows links in the Links section that do not pass value but point to you. I expected it to only show the links that pass PR, or at least make them distinct. So do not assume all the links shown to you are worth the link juice.

    And words in a link’s text are limited to 8. I have more than 8 words in link text many times. So maybe the links shown are limited to about 100 for internals. Who knows?

    And do not believe everything in GWT. It’s a cool feature, but it’s not updated as often as it should be, and it shows more/less than it should.

  63. Hi There

    Is there a limit to how many internal links should be created targeting core search terms? Is it sufficient to create a few internal links targeting your key search terms, or should you create as many as possible?

    Thanks
    Al

  64. I have a page, linked from the homepage, which is indexed fairly regularly [last cached on 13 Sept] that hasn’t been updated in the Google cache since 1 August.

    Is this normal?

    I have a sitemap.xml which is updated manually via the Google Webmaster panel.

    I just wanted to thank Dan and Jeff, and everyone else who has helped me out. I now have 1st page rankings for all my targeted keywords on the homepage.

    Time to move on and get some of them 1st tier pages ranking better! :)

  65. Sorry, I should check my comments before posting.

    1st paragraph might be confusing so to confirm.

    I have 1st tier page that hasn’t been cached since 1 August. The page is linked from the homepage which is cached regularly.

  66. My homepage stayed uncached from the 9th of August until the 10th of September. All inner pages were cached regularly.

    Something was messed up completely in that period of time.
    Wait and see, and give it more links from other pages in the site.

  67. If you do a “site:www.mysite.com” search in Google, does the order in which the results appear reflect how Google sees the hierarchy of pages for that site? For example, I have structured my site to have a particular page be the most important page after the homepage, but when I do the “site:” search on my domain, my #2 page is ranking number 5 in Googs. Thanks Dan!

  68. Matt Cutts said this:

    “Identical link text looks like (and 9 out of 10 times is) a sign of manipulation of our Google PageRank algorithm.”

    I have product thumbnails that are used throughout the site in crosslinks, all of these have the product name as the anchor text. Should I be worried about a penalty at all? I’ve always had this on a site and it doesn’t seem to be a problem, but a client sent me an email about duplicate internal link text.

    So what do I need to know, be worried about, and fix?

    Thanks
    Paul

  69. I am looking to have an image gallery for each product page in a website, and want to know whether there is a worthwhile SEO advantage in having a separate thumbnails page and then separate large-image pages with breadcrumb-style links pointing back to the product page. Would this be a worthwhile way of getting more PageRank to flow to the product page (i.e. PageRank flowing from the large-image pages to the product page)?

    I want to know because my inclination is to have an AJAX run image gallery as part of the product page if there is no advantage in structuring the image gallery as outlined above.

  70. Dan,
    I’ve launched a new site, http://www.cheapscrubsanduniforms.com, and it’s taking a while to get pages indexed. I verified my site, submitted an XML sitemap, and have posted here, on Brad Fallon’s blog, and on some nursing-related blogs. I only have 2 pages indexed since I got the site ready to be indexed last week. Could you look at the site and see if you spot any problems? I put a noindex, nofollow on the privacy and about-us pages. Is that the problem?
    Thanks,
    Brent

  71. Hi Dan,
    Great site and I really appreciate all the helpful information. I’m also an avid reader of the High Rankings forum though and I’m a little hung up on whether or not to use nofollow in the SEOFS 2008 sense. The question was posed here: http://www.highrankings.com/forum/index.php?showtopic=16956&hl=need%20a%20sitemap&st=45.

    Maybe I’m misinterpreting, but it seems like the active members/moderators of High Rankings think using nofollow on internal links is kinda pointless. I’m curious how you respond to those claims. In that particular thread, you made a few posts, but none of them seemed to defend your site structuring strategy or the idea that it’s very important to control the flow of PR. Their views seem to me to be the opposite.

    And I don’t mean to call you out at all, but I see two opposing strategies from two very reputable sources. Maybe I’m just misunderstanding something here. Perhaps you could clear things up…

    Thanks,
    Sam

  72. Awesome and hilarious reading. It’s amazing how worked up people get over this stuff.

    My sites are pretty small (mostly small businesses that don’t actually sell anything online) so this probably won’t help me too much. But I’m going to run some tests anyway and I’ll keep no follow in mind for when I start working on bigger sites.

    Thanks Dan.

  73. Hi Dan

    I wonder if you can share with me your wisdom on our current problem.

    We are building a website in the automotive market, aimed at helping customers find the exact car they are looking for. The idea is to have a very focused site that is optimized for one make of car, e.g. Mercedes. We would start with one make and then add more makes as we go.

    The question is whether we should have these optimized sections on one site, e.g. http://www.ourdomain.com/mercedes and http://www.ourdomain.com/bmw, or have a separate site for each make on a subdomain, e.g. bmw.ourdomain.com

    The second option will give us a cluster of highly optimised sites, but I guess it will be more work to get all the sites well linked.

    So the key to my question is: in the long run, which is going to give us the best SEO result vs time/money spent, in your opinion?

    Cheers

  74. Hi Dan, I’d read in the past that it’s best to have your content as close to the top as possible in your HTML so it can be easily spidered. Does this still hold true?

    If you have a 3-column template with the center column being the content, is it worth building the site so the center column appears first in the HTML but still appears in the center to the user?

    We’re working on a new template, so I’m wondering if this makes much or any difference. Thanks

  75. Shiraz,

    It does help to place relevant content closer to the top of the page within the HTML. This is especially true with a long page, where search engine spiders may abandon the page after roughly 800+ words. There are several CSS layouts available, and I recommend “CSS: The Missing Manual” as a reference on how to properly construct a search-engine-friendly layout.
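    One minimal way to get the content-first source order – strictly a sketch, with arbitrary widths, not a full production layout – is to put the content div first in the code and position the sidebars around it:

      <div id="wrap">
        <div id="content">Main content comes first in the source…</div>
        <div id="left">Left navigation…</div>
        <div id="right">Right column…</div>
      </div>

      #wrap  { position: relative; padding: 0 200px; }
      #left  { position: absolute; top: 0; left: 0; width: 180px; }
      #right { position: absolute; top: 0; right: 0; width: 180px; }

    The absolutely positioned sidebars will spill past short content, which is the sort of thing the “holy grail” article Dan links above handles more robustly.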

    I hope this information is helpful,

    Scott

  76. Dan,
    I’ve been searching all over the web – but no joy regarding this question: How do search engines view the use of index.htm within several different folders on the same site? Example: ../foldera/index.htm and ../folderb/index.htm.

    One of my clients has hundreds of these index.htm pages. The content is not identical; however, there is little content on these pages except for the usual menu navigation. These index.htm pages are content pages, not an index of the pages within the folder.

    Is this OK? Is it considered “not best practices?” Do search engines, Google in particular, take a negative view of this page naming system?

    Thanks,
    Scott

  77. Scott,

    Are the /foldera/index.htm pages the same page as /foldera/ etc.?

    If so, you’ve got a “canonical URL” issue, but it’s fairly easy to resolve. If not, it’s an odd choice for a file name, but search engines would probably find & index those pages just fine.

  78. Hi Dan,

    Before I knew anything about SEO, I purchased a domain name (classicalscholar) which is really my “brand” but certainly not my primary keyword. I’ve been writing content under this URL for a little over a year, and am now going back and changing the permalinks to include the primary keyword. Here’s an example:

    http://classicalscholar.com/homeschool-curriculum-road-map-reading-skills/

    In this example the keyword is “homeschool curriculum.” Here’s my question. Will I rank better if I purchase a new domain name that includes “homeschool curriculum” and point my old domain (classicalscholar) to the new domain and copy all the old content to the new domain? Or should I just stick with the old name and always add the keyword to the permalink?

    According to Google, I’ve got about 700 incoming links to my current pages, but these were mostly obtained through bookmarking sites. I know I would have to start getting links again.

    Thanks for your input!

  79. Hello Dan,
    I recently started to read your book, and on page 35 I saw the following:
    “If you do nothing but design a web site that is structured well for human consumption, and don’t do silly stuff with JavaScript and image maps…”
    and I didn’t understand why JavaScript and image maps are “silly stuff”. I’m currently working on a browser game site and I really need image maps (for the “cities” in the game) and JavaScript… So are they OK in SEO terms?

  80. Hi Dan

    I have a blog, and I’ve been doing some of your third-tier push techniques. My blog has a sidebar where I list the most recent posts and the all-time favorite posts. Would it fit your paradigm best if I dofollow those links on the main page, then nofollow them on individual posts? I noticed on your own blog you don’t do that technique with the sidebar navigation. My concern if I don’t do it is that the “hot” key phrases in the titles of those favorite and recent posts end up as live links in non-related posts (i.e. a martini recipe with a live link to dirty text messages isn’t good, and looks like keyword stuffing).

    Second – I have Navigation Tabs at the top of every page for 5 categories. (I also have cats and tags.) In your book, you mention the most aggressive third-level push would have the third level only point back to the category it came from. This happens with cats and tags for free in my blog, but the Navigation Tabs (5 categories) at the top – that would take some work. Is it fine to not do that work, and let the third-level send link juice up to all 5 Navigation Tabs? I also have relevant links to other posts, so there is juice going where I want it.

    thanks very much! Love your concepts and all your work.
    David

  81. @Ido, Javascript and image maps aren’t silly by themselves, but you can do lots of silly stuff with them. Image map links will be followed by spiders, but there is no anchor text associated with those links.

    @David, blogs are pretty well structured already.

    I do not “recommend” a third-tier push, it’s there in the book to illustrate what’s possible. Sounds like you have very little reason to screw around with the navigation sidebars, so I wouldn’t do it.

  82. Thanks Dan. A related question on keyword density – as people comment on a blog post, whatever keyword density the author built into the post gets diluted, since more words are being added to the page. (At least this is true on WordPress blogs. Blogspot puts comments on a separate page.) Will this keyword density reduction make a page drop in search rank over time? If so, are there techniques to prevent this?

    thanks much – I’m really enjoying your insights.

  83. Comments “might” make the on page stuff slightly less attractive for a specific query, but you can probably overcome it with one more link… It’s also a small tradeoff for the added breadth of content and all the long tail presence you get by having the comments there.

  84. Dan,

    My client offers a high level online marketing platform in a specific niche. As part of the deal, customers get a customized profile page which is optimized for their area of expertise together with local modifiers and for their name.

    What people are asking is: if they sign up and get a profile and then someone in their same city/metro area signs up with the same keywords (their area of expertise within that niche), who will rank?

    To give everyone who signs up a good ranking, I’m thinking of subdomains like name.domain.com for every profile instead of the current structure domain.com/dir/name, so that Google can display several profiles.

    But: the domain has high quality links from Crunchbase, NYT and others and has a Page Rank of 5. With subdomains, my fear is that I’d have to start from scratch and do more link building for each profile than I’d have to if it was all on one domain.

    What would you advise?

    Thanks a lot,
    Jonathan

  85. Hello,

    I am reading your book SEO Fast Start and have purchased some of your publications via SitePoint – firstly, a big thank you; everything we have read of yours has been excellent and has provided sound results.

    Second a query:
    With regard to blocking robots – is it better to use the robots.txt file that you can generate via Google Webmaster Tools, or should the noindex/nofollow code be in the HTML of each webpage – are there any advantages to each?
    I understand that using the HTML in each page allows increased flexibility, but if the website is not huge and there are clearly universal no-index interior pages, will the robots.txt file suffice?

    Thank you in advance

    Hannah

  86. Thanks, Hannah!

    If you can accomplish it with robots.txt, that’s a far better solution than using meta tags on every page. The search engines read robots.txt first, and that saves them having to fetch a bunch of pages only to find noindex in the robots meta tag.
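    For example, a robots.txt along these lines (the folder names are just placeholders for your own “overhead” sections) keeps compliant crawlers out of those areas without touching a single page:

      User-agent: *
      Disallow: /cart/
      Disallow: /search-results/
      Disallow: /printable/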

  87. A) The #1 reason to do this is to avoid duplicate content. The #2 reason close behind is to make the URL friendlier to users.

    If my URLs looked like that, I’d want to do something about it. :D

    B) URL-rewriting is possible on IIS – there are at least two solutions available for older versions – IIS Rewrite and ISAPI Rewrite.

    For IIS 7 Microsoft has a URL Rewrite module:
    http://www.iis.net/extensions/URLRewrite
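    And for the PHP-on-Apache folks asking the same kind of question, the usual tool is mod_rewrite in .htaccess. Strictly a sketch – a numeric id stands in for whatever friendly slug your application really maps:

      RewriteEngine On

      # 301 the old dynamic URL to the friendly one, but only for real external requests
      # (checking THE_REQUEST keeps the internal rewrite below from creating a loop)
      RewriteCond %{THE_REQUEST} \?id=([0-9]+)
      RewriteRule ^product\.php$ /product/%1? [R=301,L]

      # quietly map the friendly URL back onto the real script
      RewriteRule ^product/([0-9]+)$ product.php?id=$1 [L]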

  88. Wow, Dan, thanks.

    No, really. I didn’t expect a) a quick, b) specific, c) “good” (if I am qualified to judge) response.

    You really exceeded my expectations. And this is not a phony reply. It’s 6:30 pm on a Saturday and I’m supposed to be at dinner!

    :)

  89. Dan, in SEO Fast Start you explain how important it is, when linking to your 2nd-tier pages, to use the keywords you want those pages to get ranked for. If all the sites in the top 10 on Google for your top 10 or even 20 keywords are homepages, not 2nd-tier pages, is this still critical? Does it then affect the rankings for your homepage what terms you use to link to the 2nd-tier pages?

  90. Dan,

    Would you agree with this assumption?

    If you are trying to rank your home page for “Blue Widgets” and on your inner pages you have three links pointing to the home page – 1. the logo, 2. the “Home” link in the global navigation, 3. the “Home” link in the breadcrumb trail – then you would benefit if you put in a 4th link that displays first in your source order and contains the anchor text “Blue Widgets”.

  91. Ok, then I’m confused about two paragraphs in SEO Fast Start.

    1. On page 49 you wrote: Because of the way that search engines handle link reputation (anchor text), I recommend using a format that allows you to include some keywords in every page’s FIRST link to the home page, but also having a “Home” link in your navigation somewhere.

    2. On page 53 you wrote – If you are using multiple links from Page A to Page B, the only link that will predictably become part of the anchor text / link reputation for Page B is the link which appears first in the code. This means that you may wish to modify your site’s code, so that you can put the keyword-rich contextual links into the code first. An SEO-friendly design would use tables or CSS to make sure that the “content” section of your pages appears first in the code, before your navigation elements.

    In my previous post I thought the example was following these recommendations. Can you clarify the best practice for handling the “blue widgets” anchor text example listed in my previous comment?

  92. James, the first link is what you want to focus on for anchor text. In your example, the first link is an image. Adding a 4th link won’t do what you want it to do.

    Dan

  93. Ok, cool, that clears things up! :)

    If you have a website with 500 pages – is it better to use “blue widgets” in all of your anchor text or should you mix it up with different variations of blue widgets anchor text?

  94. When you link to page A from page B, does it help your homepage (page C) to get ranked for that keyword as well, or will it only help page A? The top 10 listings in Google for the keywords I had in mind are all homepages…

    • That kind of extra redirect was passing anchor text (with Google) the last time we tested it, but that looks like a bug to me and I wouldn’t want to rely on it working forever, or how Google might interpret your intent.

  95. QUOTE: “That kind of extra redirect was passing anchor text (with Google) the last time we tested it, but that looks like a bug to me and I wouldn’t want to rely on it working forever, or how Google might interpret your intent.”

    Do you mean the third link in my example would pass anchor text then – the link that isn’t pointing to something that is redirected? My homepage is index.php, so it won’t work, as it creates a loop. I could point to index.html, which is already set up as a 301 to the homepage?? If one were to do this, would you advise doing only the homepage and not the entire site?

  96. Or would you say using nofollow on those two links would be better? But does the PR / link juice just get filtered back into Google’s index?

  97. So as of today, the only way to get Google to use the anchor text in the 3rd URL in the code is to use that redirect I mentioned above – nofollow doesn’t work like that?

    Is this true for links on other people’s pages? If they have several links pointing to you on one page with various keywords, because they like you, will only the first one in the code count?

    Are you positive on this, done extensive testing?

    • Yeah, we’ve tested this stupidity to death. It’s probably somewhat more complex than “first link always wins” but that’s the most reliable behavior. Where it gets especially stupid is when they link to you with something like: “Curtis wrote a really great post about the actual subject of your post including keywords.” Because all that passes through from that is “Curtis” – and not the useful information that could help Google find your great post when someone searches for the thing you actually wrote about.

  98. Just to make sure I’m completely understanding you: if you have a high-PR site linking to you with several links throughout the content of the page (and these are free links), only the first one will help you rank for that term. All the others will do nothing for you?

  99. So, am I assuming correctly on this? “So as of today, the only way to get google to use the anchor text in the 3rd url in the code is to use that redirect I mentioned above, no follow doesn’t work like that?”

  100. 1) Yes – if all those links point to the same page, may as well be just one link.
    2) Yes – and I wouldn’t try to use redirects to stuff a bunch of extra keywords in.

  101. 1) What if you have those additional links pointing to other sub-pages – will that help your homepage get ranked some, even if they point to a sub-page?
    2) I was only thinking about doing the homepage: the 1st link is now gone, the 2nd link is a redirect, and the 3rd is at the bottom of the site, which is one of my keyword phrases. OK?

    3) Why is this the first time I have heard of this? It wasn’t in the SEO book, nor do I remember it in STSE 2.0???

  102. 1) Anchor text flows to the target page, home page would gain a share of the PageRank.
    2) Sounds good.
    3) It is in STSE2 (Module 4C) – and it’s in the book. STSE2 has a number of solutions around this issue.

  103. 1. So the sub-page would get the help with ranking, and the homepage would also get a share of PR? Do you advise doing this when the top-ranking sites for those keywords are all homepages and not sub-pages? If all the links point to the homepage, do the additional links just give more PR?
    3. Maybe I missed this. I read your book front to back & slowly, but I guess I missed it…

    Thanks!

  104. 1) On your homepage links, if the first one is an image & not a text link, does it count as the one that would pass anchor text?

    2) Do I assume correctly, that only the first link to any given page counts for anchor text, off AND on page links?

    3) If you are able to work around this, would it help a lot? Have one link on various content pages, in the body, pointing to the homepage with different keywords?

  105. I looked at mod 4c again. I have CSS and learned something about it. I can easily do this now without any tricky redirects! Woo hoo!

    My question: is it OK if you vary the keywords that point to your homepage – one link per page of course, but not the same on every page? What does this do? Powers in quantity?

  106. Hey Dan…

    This is a great resource you have going here… many thanks for the valuable content. I have two questions I would like to ask…

    1. Are .info domains OK to use for sites you want to get ranked highly in the SERPs?

    2. I recently changed one of my sites’ internal linking to cater better to my visitors’ experience on the site, but also to distribute PageRank to my “money pages”, as they were ranking very poorly. I am going to write an article and point a couple of links at the newly optimized site. I would like to know… how long does it take for Google to crawl and re-rank my newly optimized pages on the site?

    Thanks Dan!!
    Sean

  107. Dan the Man, I have a quick question that I hope you can answer for me.

    I set up a WordPress site as a static site which has subpages under the root page, as seen below. Do you think the URLs aren’t as SEO-friendly?

    http://www.example.com/equipment/
    http://www.example.com/equipment/supplies/

    I wanted to have the page under example.com/equipment to be more like example.com/supplies.

    Do you think this is a big deal for the SE’s?

    Thanks
    -Eric
